NASA’s data sonification project converts astronomical observations into sound. By assigning different frequencies or instruments to distinct wavelengths of light (X-ray, optical, infrared), cosmic phenomena such as the Bullet Cluster, Crab Nebula, and Supernova 1987A can be “heard”. These audio interpretations highlight features like dark matter, spinning neutron stars, and supernova shockwaves, providing a new, immersive way to experience and understand the universe.
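A minimal Python sketch of this kind of band-to-sound mapping, assuming synthetic brightness values and an arbitrary choice of base frequency per band rather than NASA’s actual pipeline: each wavelength band gets its own pitch register, and brightness drives loudness.

# Hypothetical sonification sketch: each wavelength band gets a pitch
# register, and per-pixel brightness controls loudness. The band names,
# frequencies, and data below are illustrative, not NASA's pipeline.
import numpy as np
import wave

SAMPLE_RATE = 44100
NOTE_SECONDS = 0.25
BAND_FREQS = {"x-ray": 880.0, "optical": 440.0, "infrared": 220.0}  # assumed registers

def tone(freq_hz, amplitude, seconds=NOTE_SECONDS):
    """One sine-wave note scaled by a 0..1 amplitude."""
    t = np.linspace(0, seconds, int(SAMPLE_RATE * seconds), endpoint=False)
    return amplitude * np.sin(2 * np.pi * freq_hz * t)

# Synthetic "brightness along a scan line" for each band, normalized to 0..1.
rng = np.random.default_rng(0)
scan = {band: rng.random(16) for band in BAND_FREQS}

# Play the three bands simultaneously by summing them note by note.
mix = sum(
    np.concatenate([tone(BAND_FREQS[band], b) for b in scan[band]])
    for band in BAND_FREQS
)
mix /= np.abs(mix).max()  # normalize to avoid clipping

with wave.open("sonification_sketch.wav", "wb") as f:
    f.setnchannels(1)
    f.setsampwidth(2)
    f.setframerate(SAMPLE_RATE)
    f.writeframes((mix * 32767).astype(np.int16).tobytes())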

https://hydrologicalsoundscapes.github.io
Hydrological Soundscapes | Ivan Horner and Benjamin Renard (2023)
This web app presents river hydrology data from thousands of hydrometric stations worldwide as both bar charts and music. Each of the four hydrological variables (average flow, monthly flows, monthly frequency of annual daily maxima, and monthly frequency of annual 30-day-averaged minima) controls a different musical element, such as tempo, pitch, volume, or instrument choice. Users are encouraged to wear headphones for the best experience and can either follow a brief tutorial or start exploring right away.
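As a rough illustration of that variable-to-music mapping, the Python sketch below turns four made-up station variables into tempo, a monthly melody, accent volumes, and instrument choices; the formulas, ranges, and instrument names are assumptions for the example, not the app’s actual rules.

# Toy parameter mapping in the spirit of Hydrological Soundscapes.
# The variable-to-music assignments and ranges are assumptions, not the app's rules.

def station_to_music(mean_flow_m3s, monthly_flows_m3s, max_month_freq, min_month_freq):
    """Translate four hydrological variables into musical parameters."""
    # Average flow -> tempo: bigger rivers play faster (clamped to 60-180 BPM).
    tempo_bpm = min(180, max(60, 60 + 20 * mean_flow_m3s ** 0.25))

    # Monthly flows -> pitch: one MIDI note per month, higher flow = higher note.
    lo, hi = min(monthly_flows_m3s), max(monthly_flows_m3s)
    span = (hi - lo) or 1.0
    melody = [48 + round(24 * (q - lo) / span) for q in monthly_flows_m3s]

    # Frequency of annual daily maxima per month -> volume of an accent track (0..1).
    accent_volume = [min(1.0, f) for f in max_month_freq]

    # Frequency of annual 30-day low-flow minima per month -> instrument choice.
    instruments = ["marimba" if f < 0.1 else "bass" for f in min_month_freq]

    return {"tempo_bpm": tempo_bpm, "melody_midi": melody,
            "accent_volume": accent_volume, "instruments": instruments}

# Example with invented monthly data for a snowmelt-driven river.
print(station_to_music(
    mean_flow_m3s=120.0,
    monthly_flows_m3s=[40, 45, 90, 220, 310, 250, 140, 90, 70, 60, 50, 45],
    max_month_freq=[0, 0, 0.05, 0.3, 0.45, 0.15, 0.05, 0, 0, 0, 0, 0],
    min_month_freq=[0.2, 0.3, 0.1, 0, 0, 0, 0, 0.05, 0.1, 0.1, 0.05, 0.1],
))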
A musical representation of library traffic patterns throughout the year: each “row” of notes corresponds to a different part of the daily and weekly schedule (weeks, mornings, afternoons/evenings, and nights) and sits in a progressively higher pitch range. School breaks, weekends, and term times appear as gaps or surges in the music, illustrating how library hours and visitor numbers change across the summer, fall, winter, and spring quarters. Nights are represented only during school terms, marked by two-note arpeggios in the highest pitch range.
Reference
https://mlaetsc.hcommons.org/2023/01/18/data-sonification-for-beginners
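As a rough sketch of that row-per-time-slot idea, the Python example below maps synthetic weekly visitor counts to MIDI notes, giving each slot its own register and leaving rests where the library is closed; the counts, registers, and term weeks are invented, and single notes stand in for the post’s two-note night arpeggios.

# Sketch of the row-per-time-slot mapping: higher registers for later slots,
# rests for closures, night rows only during term. All data below is invented.
import random

SLOT_BASE_NOTE = {"mornings": 48, "afternoons": 60, "nights": 72}  # rising registers
TERM_WEEKS = set(range(2, 12)) | set(range(14, 24))                # assumed teaching weeks

random.seed(1)
weeks = range(26)
counts = {slot: [random.randint(0, 400) for _ in weeks] for slot in SLOT_BASE_NOTE}

def to_notes(slot, weekly_counts):
    """Map one row of weekly visitor counts to MIDI pitches; None = rest (closed)."""
    notes = []
    for week, count in enumerate(weekly_counts):
        closed = slot == "nights" and week not in TERM_WEEKS   # nights only in term
        if closed or count == 0:
            notes.append(None)                                  # a gap in the music
        else:
            notes.append(SLOT_BASE_NOTE[slot] + min(12, count // 40))  # busier = higher
    return notes

for slot in SLOT_BASE_NOTE:
    print(slot, to_notes(slot, counts[slot]))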
Multisensory Data: Insights from “DATA AND DASEIN”
In the dissertation “Data and Dasein – A Phenomenology of Human-Data Relations” by T. Hogan, a review of 154 data representations revealed that most rely on sight (151) and touch (144) to interpret data (Figure 1, B). A smaller subset (22) also incorporated sound, and even fewer tapped into taste or smell. Moreover, 139 examples combined sight and touch, while only 11 used more than two sensory channels (Figure 1, A).

One standout example is Tac-tiles [1] (Figure 2), designed for visually impaired users. By combining tactile feedback (vibrotactile cues delivered through a stylus) with audio (pitch played through speakers), Tac-tiles shows how multiple modalities can enable richer, more inclusive data exploration. The concept extends beyond assistive technology: artist Ryoji Ikeda’s data.anatomy [civic] [2] merges audio and dynamic graphics to immerse audiences in the intricate data behind the design of the Honda Civic. Perpetual (Tropical) SUNSHINE [3] (Figure 3) by fabric | ch uses infrared light bulbs to convey real-time temperature data from stations around the Tropic of Capricorn, translating environmental data directly into heat and light. And in more experimental territory, Data Cuisine by Moritz Stefaner [4] draws on taste, smell, and sight to turn data into “edible diagrams.”
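To make the idea of encoding one value across two modalities concrete, here is a small Python sketch in the spirit of Tac-tiles [1]: each pie-chart slice is mapped both to an audio pitch and to a vibration amplitude for a vibrotactile stylus. The ranges and the linear mapping are assumptions for the sketch, not the published system.

# Illustrative multimodal encoding: each slice's share drives both a pitch
# (for speakers) and a vibration amplitude (for a stylus). Ranges are assumed.

def encode_slice(share, pitch_range_hz=(220.0, 880.0), vib_range=(0.1, 1.0)):
    """Map a slice's share (0..1 of the whole pie) to pitch and vibration."""
    share = max(0.0, min(1.0, share))
    lo_p, hi_p = pitch_range_hz
    lo_v, hi_v = vib_range
    return {
        "pitch_hz": lo_p + share * (hi_p - lo_p),   # larger slice = higher pitch
        "vibration": lo_v + share * (hi_v - lo_v),  # larger slice = stronger buzz
    }

# A user sweeping a stylus across three slices would hear and feel each one.
pie = {"fiction": 0.5, "non-fiction": 0.3, "periodicals": 0.2}
for label, share in pie.items():
    print(label, encode_slice(share))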


These examples underscore the creative possibilities of thinking beyond purely visual representations. When designers and researchers integrate multiple sensory channels, they can unlock new forms of engagement, accessibility, and emotional resonance.
References
T. Hogan, Data and Dasein – A Phenomenology of Human-Data Relations, Ph.D. dissertation, Bauhaus-Universität Weimar, Weimar, Germany, 2016, sect. 5.5.1.1 (Sensory Modalities).
[1] Steven A. Wall and Stephen A. Brewster. “Tac-tiles: Multimodal Pie Charts for Visually Impaired Users.” In: Proceedings of the 4th Nordic Conference on Human-Computer Interaction: Changing Roles (NordiCHI ’06). Association for Computing Machinery (ACM), 2006. doi: 10.1145/1182475.1182477.
[2] Ryoji Ikeda. data.anatomy.civic. website. 2012. url: http://dataanatomy.net/.
[3] fabric | ch. Perpetual (Tropical) SUNSHINE. website. 2006. url: http://www.fabric.ch/pts/.
[4] Moritz Stefaner. Data Cuisine. website. 2014. url: http://data-cuisine.net/.