Impulse #3: Nadieh Bremer, WebExpo 2025

This blog post is a reflection inspired by Nadieh Bremer's WebExpo 2025 talk Creating an effective & beautiful data visualisation from scratch with d3.js. Bremer demonstrates how visual interfaces can be designed to convey information clearly and emotionally. She outlines a design process that begins with understanding the data's story and ends with polishing details such as visual hierarchy, color, and interaction. Her approach emphasizes that visuals should not only communicate facts but also evoke engagement and a sense of discovery. I rewatched the recorded documentation of her talk to recap its content.

Bremer presents visualization as a communication medium, where design choices directly impact user comprehension and emotional experience. Clarity reduces frustration, while appealing design increases motivation to explore. This perspective positions data visualization as a critical component of user experience, not merely a decorative or aesthetic layer.

Learning about new technologies for data visualization

When I encountered Nadieh Bremer's work, I was already familiar with data visualization, but mostly through print media and a little experience with Processing. Designing layouts for magazines or static posters taught me how much data visuals can influence perception and guide a narrative. Around the time we went to WebExpo, I was getting into JavaScript coding but wasn't aware of the possibilities it offers for data visualization. Her projects demonstrated what I had been missing in print: interactivity and adaptivity.

Why adaptive data visualization matters for a good user experience

During my deeper dive into adaptive data visualization literature, I explored a research paper focusing on real-time decision support in complex systems. It argues that static dashboards are no longer enough to support organizations facing rapidly changing data environments. Instead, visualizations must adapt to:

  • Incoming data streams
  • User interactions
  • Context shifts
  • Multivariate complexity

Adaptive systems combine machine learning, real-time processing, and flexible visualization layers to support faster and more informed decision-making. This means that the visualization is not just displaying data; it is interpreting and reacting to it. The paper specifically highlights D3.js as one of the technologies capable of creating these highly flexible and dynamic interfaces. Unlike pre-built dashboards, D3 allows developers to adapt interactions, transitions, and representations directly to user needs and situational changes.
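To make this concrete, here is a small, dependency-free JavaScript sketch of the adaptive pattern the paper attributes to D3: the visual mapping (the scale) is re-derived every time new data arrives, so the representation adapts to the incoming stream instead of being fixed at build time. D3's actual API would use d3.scaleLinear() and data joins; the function names below are my own illustration, not D3's.

```javascript
// Sketch of the adaptive-visualization idea: the scale is rebuilt from
// the current data on every update, so the chart reacts to the stream.

function makeLinearScale([d0, d1], [r0, r1]) {
  // Maps a value from the data domain onto the pixel range,
  // analogous to d3.scaleLinear().domain([d0, d1]).range([r0, r1]).
  return (v) => r0 + ((v - d0) / (d1 - d0)) * (r1 - r0);
}

function renderBars(data, chartWidth = 100) {
  // Re-derive the scale from the current data extent (the adaptive step).
  const max = Math.max(...data);
  const x = makeLinearScale([0, max], [0, chartWidth]);
  return data.map((v) => ({ value: v, width: x(v) }));
}

// Simulated incoming data stream: each update changes the data extent,
// and the bars re-scale accordingly.
let bars = renderBars([10, 20, 40]);     // the widest bar fills the chart
bars = renderBars([10, 20, 40, 80]);     // a new maximum arrives: earlier bars shrink
```

In a real D3 chart the same principle plays out through selections, data joins, and transitions, which is what lets the representation follow user interactions and context shifts rather than staying static.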

In my earlier blog posts I wrote about affective computing. Combining the knowledge gained there with Bremer's talk, I arrived at a question: if a system can visually adapt not only to the dataset but also to the emotional state of the user, could that generate a better user experience?

Sources:

https://slideslive.com/39043157/creating-an-effective-beautiful-data-visualisation-from-scratch

https://www.researchgate.net/publication/387471439_ADAPTIVE_DATA_VISUALIZATION_TECHNIQUES_FOR_REAL-TIME_DECISION_SUPPORT_IN_COMPLEX_SYSTEMS

Impulse #1: Affective Computing, Rosalind W. Picard

The work Affective Computing by Rosalind W. Picard from the year 2000 proposes a fundamental paradigm shift in computer science, challenging the traditional view that intelligent machines must operate only on logic and rationality. Picard’s work provides a comprehensive framework for the design of computational systems that relate to, arise from, or influence human emotions.

In Interaction Design we want interfaces that are easy to use and look good. While working on projects, we spend our time thinking about usability, efficiency, and aesthetics. For us in design, this means a functional interface isn't enough anymore. If a system doesn't register that a user is confused or frustrated, it's not truly successful. Picard essentially launched a new field dedicated to building technology that can sense, interpret, and respond to human emotional states.

Adaptive Interfaces enhanced by Computer Vision Systems

A central connection between affective computing and my work in emotion detection for computer vision lies in the development of adaptive user interfaces. Picard emphasizes that computers often ignore users’ frustration or confusion, continuing to operate rigidly without awareness of emotional signals. By equipping systems with the ability to recognize facial expressions, stress indicators, or declining engagement, interfaces can dynamically adjust elements such as difficulty level, information density, feedback style, or interaction pacing. This emotional awareness transforms an interface from a static tool into an intelligent communication partner that responds supportively to users’ needs. In learning environments, for example, a tutor system could detect when a student becomes overwhelmed and automatically provide hints or slow down the content. In safety-critical settings, such as driver monitoring, emotion recognition can alert systems when attention or alertness drops. Thus, integrating affect recognition directly contributes to more human-centered, flexible, and effective interfaces, aligning with Picard’s vision of computers that interact with intelligence and sensitivity toward humans.
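The adjustment logic described above can be sketched in a few lines. This is a hypothetical illustration, not Picard's method or any real library's API: the state names, score range, and thresholds are my own assumptions, and the emotion scores are presumed to come from some external recognition component.

```javascript
// Hypothetical sketch: mapping detected emotional state to interface
// adjustments (hints, information density, pacing). Scores are assumed
// to come from an emotion-recognition component and to lie in [0, 1];
// the thresholds here are illustrative, not empirically grounded.

function adaptInterface({ frustration = 0, engagement = 1 }) {
  const settings = {
    showHints: false,
    informationDensity: "normal",
    pacing: "normal",
  };
  if (frustration > 0.7) {
    settings.showHints = true;   // offer help when the user struggles
    settings.pacing = "slow";    // slow down content delivery
  }
  if (engagement < 0.3) {
    settings.informationDensity = "reduced"; // lighten the cognitive load
  }
  return settings;
}

// Example: an overwhelmed student in a tutoring scenario.
adaptInterface({ frustration: 0.9, engagement: 0.2 });
// → { showHints: true, informationDensity: "reduced", pacing: "slow" }
```

Even in this toy form, the point is visible: the interface stops being a static tool and starts responding to the user's state, which is exactly the shift Picard argues for.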

Computer Vision in UX-Testing

Computer vision–based emotion recognition can significantly enhance UX testing by providing objective insights into users’ emotional responses during interaction. Rather than relying solely on post-task questionnaires or self-reporting, facial expression analysis and behavioral monitoring enable systems to detect in real time when a user experiences frustration, confusion, satisfaction, or engagement. Picard highlights that current computers are affect-blind, unable to notice when users express negative emotions toward the system, and therefore cannot adjust their behavior accordingly. Integrating affective sensing into UX evaluation allows designers to pinpoint problematic interface moments, identify cognitive overload, and validate usability improvements based on measurable affective reactions.
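As a sketch of how such affective UX evaluation could pinpoint problematic interface moments, the function below scans a session's per-second frustration scores and flags the spans where frustration stays above a threshold. The data shape, threshold, and function name are my own assumptions for illustration; a real pipeline would obtain these scores from a facial-expression analysis component.

```javascript
// Illustrative sketch: given per-second frustration scores from an
// (assumed) facial-expression analysis step, flag the moments where
// frustration stays above a threshold for a minimum duration, so
// designers can review those spans of the recorded session.

function findFrustrationSpans(scores, threshold = 0.6, minLength = 2) {
  const spans = [];
  let start = null;
  scores.forEach((score, t) => {
    if (score >= threshold && start === null) start = t; // span begins
    if ((score < threshold || t === scores.length - 1) && start !== null) {
      const end = score >= threshold ? t : t - 1;        // span ends
      if (end - start + 1 >= minLength) spans.push({ start, end });
      start = null;
    }
  });
  return spans;
}

// A session where frustration peaks around seconds 3–5:
findFrustrationSpans([0.1, 0.2, 0.4, 0.7, 0.8, 0.9, 0.3, 0.2]);
// → [{ start: 3, end: 5 }]
```

The flagged spans could then be aligned with screen recordings or interaction logs to see exactly which interface element triggered the negative reaction, replacing guesswork from post-task questionnaires with measurable affective evidence.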

In summary, the intersection of affective computing, computer vision, and adaptive interfaces offers a potential research path for my master thesis. By enabling systems to detect emotional reactions through facial expressions and behavioral cues, UX testing can become more insightful and responsive, leading to interface designs that better support the users' needs. Building on Picard's foundational ideas of emotional intelligence in computing, my research could contribute to developing affect-aware evaluation tools that automatically identify usability breakdowns and adapt interactions in real time.