What is Embodied Resonance?
Embodied Resonance is an experimental audio performance that investigates the interplay between trauma, physiological responses, and immersive sound. By integrating biofeedback sensors with spatial sound, this project translates the body’s real-time emotional states into an evolving sonic landscape. Through this process, Embodied Resonance aims to create an intimate and immersive experience that bridges personal narrative with universal themes of emotional resilience and healing.
Reference Works
Inspiration for this project draws heavily from groundbreaking works in biofeedback art. For instance, Tobias Grewenig’s Emotion’s Defibrillator (2005) inspired me to explore how visual images can serve as emotional triggers, sparking physiological responses that drive sound. Grewenig’s project combines sensory input with dynamic visual feedback, using breathing, pulse, and skin sensors to create a powerful interactive experience. His exploration of binaural beats and synchronized visuals provided a foundation for my use of AR imagery and biofeedback systems.
Another profound influence is the project BODY ECHOES, which integrates EMG sensors, breathing monitors, and sound design to capture inner bodily movements and translate them into a spatialized audio experience. This project highlights how subtle physiological states, such as changes in muscle tension or breathing rhythms, can form the basis of a compelling sonic narrative. It has inspired my approach to using EMG and respiratory sensors as key components for translating physical states into sound.
How Does It Work?
The performance involves the use of biofeedback sensors to capture physiological data such as:
- Electromyography (EMG) to measure muscle tension
- Electrodermal Activity (EDA/GSR) to track stress levels via skin conductivity
- Heart Rate (ECG/PPG) to monitor pulse fluctuations and emotional arousal
- Respiratory Sensors to analyze breath patterns
This real-time data is processed using software like Max/MSP and Ableton Live, which maps physiological changes to dynamic sound elements. Emotional triggers, such as augmented reality (AR) images chosen by the audience, influence the performer’s physiological responses, which in turn shape the sonic environment.
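As a minimal sketch of this mapping stage, the snippet below smooths a noisy sensor stream and rescales it to a sound parameter. The sensor ranges, the smoothing coefficient, and the target of a low-pass filter cutoff are all illustrative assumptions, not measured values; in the actual piece this mapping would live inside Max/MSP or an Ableton Live device.

```python
def smooth(prev: float, raw: float, alpha: float = 0.1) -> float:
    """Exponential moving average to tame sensor noise before mapping."""
    return alpha * raw + (1 - alpha) * prev

def map_range(value: float, in_min: float, in_max: float,
              out_min: float, out_max: float) -> float:
    """Linearly rescale a physiological reading to a sound-parameter range."""
    value = max(in_min, min(in_max, value))  # clamp to the expected range
    t = (value - in_min) / (in_max - in_min)
    return out_min + t * (out_max - out_min)

# Hypothetical example: map skin conductance (assumed 0-20 microsiemens)
# to a low-pass filter cutoff between 200 Hz and 8 kHz.
gsr = 0.0
for raw in [4.2, 5.1, 6.8, 9.0]:  # simulated GSR readings
    gsr = smooth(gsr, raw, alpha=0.3)
cutoff_hz = map_range(gsr, 0.0, 20.0, 200.0, 8000.0)
```

The same two-step pattern (smooth, then rescale) applies to any of the sensors listed above; only the input range and the destination parameter change.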

Core Components of the Project
- Emotional Triggers and Biofeedback: The audience plays an active role by selecting AR-displayed imagery, which elicits emotional and physiological responses from the performer.
- Sound Mapping and Generation: Physiological changes dynamically alter elements of the soundscape.
- Spatial Audio and Immersion: An Ambisonic sound system enhances the experience, surrounding the audience in a three-dimensional sonic space.
- Interactive Performance Structure: The performer’s emotional and physical state directly influences the performance, creating a unique, real-time interaction between artist and audience.
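To illustrate the spatial audio component, here is a small sketch of first-order Ambisonic encoding, which is the kind of operation an Ambisonic system performs before decoding to a loudspeaker array. It assumes the common ACN channel order with SN3D normalization (the AmbiX convention); the actual performance would rely on tools such as Envelop for Live rather than hand-rolled code.

```python
import math

def encode_foa(sample: float, azimuth_deg: float, elevation_deg: float):
    """Encode one mono sample into first-order Ambisonics (B-format),
    ACN order W, Y, Z, X with SN3D normalization. Azimuth is measured
    counter-clockwise from straight ahead, elevation up from horizontal."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    w = sample                                 # omnidirectional component
    y = sample * math.sin(az) * math.cos(el)   # left-right axis
    z = sample * math.sin(el)                  # up-down axis
    x = sample * math.cos(az) * math.cos(el)   # front-back axis
    return w, y, z, x
```

For a source directly in front of the listener (azimuth 0, elevation 0), all of the signal lands in W and X; panning the source around the listener redistributes energy into Y and Z.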
Why is This Project Important?
Embodied Resonance offers an innovative approach to understanding how trauma manifests in the body and how it can be externalized through sound. This project:
- Explores the intersection of biofeedback technology, music, and performance art
- Provides a new medium for emotional processing and healing through immersive sound
- Pushes the boundaries of interactive performance, inviting the audience into a participatory experience
- Challenges conventional notions of musical composition by integrating the human body as an instrument
Why Do I Want to Work on It?
As a sound producer, performer, and music editor, I have always been fascinated by the connections between sound, emotion, and the body. My personal journey with trauma and healing has shaped my artistic explorations, driving me to create a performance that not only expresses these experiences but also fosters a shared space for reflection and empathy. By combining my technical skills with deep personal storytelling, I aim to push the boundaries of sonic expression.
How Will I Realize This Project?
Methods & Techniques
- Research: Studying trauma, somatic therapy, and the physiological markers of emotional states.
- Technology: Utilizing biofeedback sensors and signal processing tools to create real-time sound mapping.
- Performance Development: Experimenting with gesture analysis and embodied interaction.
- Audience Engagement: Exploring ways to integrate audience input via AR-triggered imagery.
Necessary Skills & Resources
- Sound Design & Synthesis: Proficiency in Ableton Live, Max/MSP, and Envelop for Live.
- Sensor Technology: Understanding EMG, ECG, and GSR sensor integration.
- Spatial Audio Engineering: Knowledge of Ambisonic techniques for immersive soundscapes.
- Programming: Implementing interactive elements, e.g. patching and scripting in Max/MSP.
- Theoretical Research: Studying literature on biofeedback art, music therapy, and embodied cognition.
Challenges and Anticipated Difficulties
- Spatial Audio Optimization: Achieving an immersive sound experience that maintains clarity and emotional depth.
- Technical Complexity: Ensuring seamless integration of biofeedback data into real-time sound processing, which requires rigorous calibration and testing.
- Emotional Vulnerability: Managing the deeply personal nature of the performance, which requires careful preparation.
- Audience Interaction: Designing a system that effectively incorporates audience input without disrupting the emotional flow.
Bibliography
- ‘BITalino’. https://www.pluxbiosignals.com/collections/bitalino.
- ‘Body Echoes (2014) — SSI’. https://spatialsoundinstitute.com/Body-Echoes-2014.
- ‘Emotion’s Defibrillator’. https://www.khm.de/studentische_arbeiten/id.13071.emotion-s-defibrillator/.
- ‘Free Tools for Live Unlock 3D Spatial Audio, VR, AR | Ableton’. https://www.ableton.com/en/blog/free-tools-live-unlock-3d-spatial-audio-vr-ar/.
- ‘P_Virtual Reality in Audio Visual Arts Using Sensors and Biofeedback – SSI’. https://spatialsoundinstitute.com/P_Virtual-Reality-in-Audio-Visual-Arts-using-Sensors-and-Biofeedback.