The heart of the project is a custom Swift-based iOS application I developed: a tool to record impulse responses, estimate acoustic parameters like RT60, and apply spatial convolution in real time. The app consists of several modular components:
- A mic selector that supports mono, stereo, and (planned) Ambisonics input.
- A recording module that captures signals from sweep tones or balloon pops.
- A deconvolution processor that transforms recorded responses into usable IRs.
- A convolution engine that allows users to load external sounds and place them in the captured space.
- A visual interface that shows waveforms, energy decay, and export options.
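The RT60 estimation mentioned above is commonly done by Schroeder backward integration of the impulse response's energy, then reading the decay slope from the resulting energy decay curve. The sketch below illustrates that standard technique in Swift; it is not the app's actual implementation, and the function name and T30 variant are my own choices for the example.

```swift
import Foundation

/// Illustrative sketch: estimate RT60 from a mono impulse response
/// using Schroeder backward integration and the T30 method.
/// `ir` is the impulse response; `sampleRate` is in Hz.
func estimateRT60(ir: [Double], sampleRate: Double) -> Double? {
    guard !ir.isEmpty, sampleRate > 0 else { return nil }

    // Energy decay curve (EDC): backward-integrated squared IR.
    var cumulative = 0.0
    var edc = [Double](repeating: 0, count: ir.count)
    for i in stride(from: ir.count - 1, through: 0, by: -1) {
        cumulative += ir[i] * ir[i]
        edc[i] = cumulative
    }
    guard edc[0] > 0 else { return nil }

    // Normalize and convert to dB.
    let edcDB = edc.map { 10 * log10($0 / edc[0]) }

    // T30: measure the time from -5 dB to -35 dB,
    // then extrapolate the decay rate to 60 dB.
    guard let i5 = edcDB.firstIndex(where: { $0 <= -5 }),
          let i35 = edcDB.firstIndex(where: { $0 <= -35 }),
          i35 > i5 else { return nil }
    let elapsed = Double(i35 - i5) / sampleRate
    let decayDB = edcDB[i5] - edcDB[i35]   // roughly 30 dB
    return 60.0 * elapsed / decayDB
}
```

The T30 range (-5 to -35 dB) is used instead of the full 60 dB span because recorded IRs rarely have enough signal-to-noise ratio to observe a clean 60 dB decay directly.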
Built with AVAudioEngine and SwiftUI, the app runs entirely on-device, making spatial acoustic capture accessible to artists, researchers, and designers.
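The convolution engine's core operation, placing a dry sound in the captured space, amounts to convolving the source with the measured IR. The direct-form sketch below shows the math; it is a simplified illustration, not the engine itself, which would need a partitioned FFT convolution (e.g. via Accelerate's vDSP) to run in real time at interactive latencies.

```swift
import Foundation

/// Illustrative sketch: offline convolution of a dry signal with an
/// impulse response. Direct form is O(n * m) — fine for short IRs
/// offline, too slow for real-time use with long room responses.
func convolve(_ signal: [Double], with ir: [Double]) -> [Double] {
    guard !signal.isEmpty, !ir.isEmpty else { return [] }
    // Output length of a full linear convolution.
    var out = [Double](repeating: 0, count: signal.count + ir.count - 1)
    for (i, x) in signal.enumerated() {
        for (j, h) in ir.enumerated() {
            out[i + j] += x * h   // accumulate each shifted, scaled IR copy
        }
    }
    return out
}
```

Convolving with a unit impulse `[1.0]` returns the signal unchanged, which is a quick sanity check for any convolution routine.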
