#01 Multisensory Data Visualisation

Introduction to Multisensory Data Visualisation

Multisensory data visualization refers to the use of multiple sensory modalities—such as sight, hearing, and touch—to represent complex data sets in more intuitive and accessible ways. While conventional visualization techniques rely on graphs, charts, and maps, these predominantly visual methods can become overwhelming or fail to convey subtle patterns, especially when dealing with high-dimensional or time-sensitive data. Beyond auditory cues (e.g., sonification), incorporating tactile feedback (e.g., haptic vibrations) and other sensory channels has the potential to significantly enhance data interpretation by distributing cognitive load and addressing diverse user needs.


Background and Inspiration

During my bachelor's degree and bachelor's project, I initially explored "traditional" forms of data representation, which led me to examine various approaches to accessibility in design. This exploration was further enriched by the talk "Lessons Learned From Our Accessibility-First Approach to Data Visualisation" by Kent Eisenhuth at the Usability Congress in Graz. There I first consciously encountered the sonification of data and was instantly intrigued.


Why Consider a Multisensory Approach?

  1. Reduced Cognitive Overload
    Representing data through multiple senses can distribute the processing demands across different sensory channels. For instance, tactile cues (such as haptic vibrations) and auditory cues (such as high- or low-pitched tones) can indicate threshold crossings or significant deviations in data, relieving some of the burden placed solely on visual elements.
  2. Enhanced Engagement and Emotional Resonance
    Research indicates that incorporating different sensory modalities—particularly auditory and tactile—may intensify user engagement. Whether through auditory signals highlighting sudden shifts or vibrations indicating key events, users often develop deeper cognitive and emotional connections when more than one sense is involved.
  3. Expanded Accessibility
    For users with visual impairments, sonification and tactile feedback can serve as vital tools for understanding data trends and outliers. Similarly, for users with hearing impairments, strategic use of visual and tactile elements can ensure equal access to critical insights. A truly multisensory system can be configured to accommodate a broad range of abilities.
  4. Detection of Subtle or Transient Patterns
    Time-sensitive or multi-dimensional data (e.g., financial fluctuations, climate patterns, or sensor readings) can be challenging to track visually. By adding non-visual modalities, patterns that might be overlooked in a purely visual chart can become more apparent through changes in pitch, rhythm, or tactile pulses.
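To make the idea of auditory threshold cues more concrete, the following is a minimal sketch of one possible sensory mapping: data values are mapped linearly onto a pitch range, and threshold crossings are flagged so that a secondary cue (e.g. a haptic pulse) could be triggered. All function names, frequency ranges, and data values here are illustrative assumptions, not part of any established sonification library.

```python
# Minimal sonification sketch (illustrative, not a standard API):
# each data value becomes a pitch, and threshold crossings are
# flagged as candidates for an additional auditory or tactile cue.

def value_to_pitch(value, vmin, vmax, f_low=220.0, f_high=880.0):
    """Map a data value linearly onto a frequency range (Hz)."""
    t = (value - vmin) / (vmax - vmin)
    return f_low + t * (f_high - f_low)

def sonify(series, threshold):
    """Return (frequency_hz, crossed_threshold) pairs per data point."""
    vmin, vmax = min(series), max(series)
    events = []
    prev = series[0]
    for v in series:
        # A crossing occurs when the series passes the threshold
        # between the previous point and this one, in either direction.
        crossed = (prev < threshold <= v) or (v <= threshold < prev)
        events.append((value_to_pitch(v, vmin, vmax), crossed))
        prev = v
    return events

if __name__ == "__main__":
    data = [1.0, 2.5, 4.0, 3.0, 5.0]
    for freq, crossed in sonify(data, threshold=3.5):
        print(f"{freq:6.1f} Hz{'  * threshold crossing' if crossed else ''}")
```

In an actual prototype, the resulting frequencies could drive an oscillator (e.g. via the Web Audio API) while the crossing flags trigger vibration motors, keeping the two cue types on separate sensory channels as described above.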

Next Steps

My next steps will focus on gathering and analyzing data on how combining visual, auditory, and potentially tactile elements can influence user comprehension, retention, and emotional engagement with complex information. This research will involve reviewing existing literature, examining various sensory-mapping strategies, and identifying critical factors (e.g., cognitive load, accessibility requirements, and user preferences) that shape effective multisensory data representations. Comparative studies and expert interviews may inform which modalities are most beneficial for certain data types or user groups. These insights will guide the theoretical framework for understanding multisensory design principles, culminating in recommendations for inclusive and impactful data visualization practices.


Keywords for my Research

An AI-generated list of keywords to guide my research.

  1. Sonification
  2. Tactile Feedback / Haptic Interfaces
  3. Data Accessibility
  4. Inclusive Design
  5. Universal Design
  6. Cognitive Load
  7. Sensory Mapping
  8. Multimodal Interaction
  9. Cross-Modal Perception
  10. User Experience (UX) Testing
  11. Threshold Detection
  12. Emotional Resonance
  13. Accessibility Guidelines (e.g., WCAG)
  14. Alt Text and Descriptive Metadata
  15. Adaptive/Assistive Technologies
  16. Perceptual Illusions in Multisensory Design
  17. Pattern Recognition in Data
  18. Interaction Design Principles
  19. Context-Aware Computing
  20. Sensory Substitution

Literature

T. Hogan and E. Hornecker, “Towards a Design Space for Multisensory Data Representation,” Interacting with Computers, vol. 29, no. 2, pp. 147–167, Mar. 2017, doi: 10.1093/iwc/iww015.

S. Tak and L. Toet, “Towards Interactive Multisensory Data Representations,” in Proceedings of the International Conference on Computer Graphics Theory and Applications and International Conference on Information Visualization Theory and Applications (IVAPP-2013), 2013, pp. 558–561. doi: 10.5220/0004346405580561.

A. Storto, “Using Data Visualisations in a Participatory Approach to Multilingualism: ‘I Feel What You Don’t Feel’,” 2024. doi: 10.2307/jj.20558241.11.
