Context Sensitive Real-Time Data-Driven Music Algorithms
For Information, Contact:
Jay Bjerke
Commercialization Manager, Engineering
Web Published:
Researchers at Iowa State University have developed a set of algorithms for creating music based on real-time data. This approach allows aural information to be combined with visual cues to enhance data analysis.

Development Stage:
A simulation is available for demonstration, and ISU is seeking partners interested in commercializing this technology.

We live in a world filled with sound and receive a wide range of information aurally. By adding this information to our visual cueing, we understand our environment more fully: sound directs our viewing and adds essential contextual information. For this reason, there have been numerous efforts to sonify data, that is, to represent data with sound. However, these efforts have typically mapped data directly to individual aspects of sound, producing results that are difficult to understand or irritating to listen to.

To overcome this drawback, ISU researchers have developed a musical approach to the sonification of data. Because music can convey a large amount of information, it enables users to perceive more facets of the data. The method combines context-sensitive grammars, fractal algorithms, and atonal compositional techniques, so the resulting music is listenable and flexible enough to apply to different types of data without external intervention by a composer. The approach also provides a connection between the micro- and macro-scales of the data, allowing the user to fully experience its intricacies and interrelationships.

Potential applications of this technique include ambient awareness; exploration of large, complex data sets for scientific research and engineering design; augmented remote control of tractors and other working machinery; enhanced viewing of websites or museum displays; use as a composition tool for creating music for performance; and an additional information channel during crowd surveillance or other visual targeting and surveillance activities.
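To give a flavor of the idea, the sketch below maps a stream of data values to MIDI note numbers through a twelve-tone row, one of the atonal compositional devices mentioned above. This is a minimal illustration only: the tone row, the `sonify` function, and the rank-based mapping are hypothetical assumptions for demonstration, not the patented ISU algorithms.

```python
# Illustrative sketch (not the ISU method): map data values to pitches
# by ranking each value within the stream's observed range and using
# that rank to select a pitch class from a twelve-tone row.

TONE_ROW = [0, 11, 3, 4, 8, 7, 9, 5, 6, 1, 2, 10]  # an arbitrary tone row

def sonify(values, base_note=60):
    """Return one MIDI note per data value (base_note 60 = middle C)."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1.0          # avoid division by zero on flat data
    notes = []
    for v in values:
        # Normalize the value into [0, 1], then pick a row position.
        idx = int((v - lo) / span * (len(TONE_ROW) - 1))
        notes.append(base_note + TONE_ROW[idx])
    return notes

print(sonify([0.0, 0.5, 1.0]))  # → [60, 67, 70]
```

A real-time variant would update the observed range incrementally and emit notes as samples arrive; a musical grammar would then constrain which successive notes are permitted, which is where the context sensitivity of the ISU approach comes in.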

• Versatile (can be used with diverse types of data sets)
• Flexible (use of atonal compositional techniques requires less rigid syntax than tonal music and enables real-time sonification)
• Listenable (musical approach creates pleasing aural results)
Sonification of data for enhanced analysis and understanding.

Patent Information:
*To see the full version of the patent(s), follow the link below, then click on "Images" button.
Country Serial No. Patent No. Issued Date
United States 10/985,301 7,304,228* 12/4/2007

Direct Link: