Alongside the app, I prototyped a web-based soundmap that displays recorded locations and lets users hear the acoustics of real-world spaces. Built with Leaflet.js, it shows a marker at each spot where an impulse response (IR) was captured. Clicking a marker reveals:
- Metadata (location, date, mic type)
- A photo of the space
- An audio preview of a dry sound convolved with that space’s IR
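The marker-and-popup flow above can be sketched in a few lines of Leaflet. This is a minimal illustration, not the prototype's actual code: the capture schema (`name`, `lat`, `lng`, `mic`, `photoUrl`, `previewUrl`), the sample location, and the file paths are all made up for the example.

```javascript
// Hypothetical capture record; field names and values are assumptions,
// not the prototype's actual data model.
const captures = [
  {
    name: "Stone chapel",
    lat: 51.5007,
    lng: -0.1246,
    date: "2024-03-12",
    mic: "Omni pair, AB",
    photoUrl: "photos/chapel.jpg",
    previewUrl: "audio/chapel-convolved.mp3",
  },
];

// Build the popup body shown on click: metadata, a photo of the space,
// and an audio preview of a dry sound convolved with that space's IR.
function popupHtml(c) {
  return [
    `<strong>${c.name}</strong>`,
    `<p>${c.date} · ${c.mic}</p>`,
    `<img src="${c.photoUrl}" alt="${c.name}" width="240">`,
    `<audio controls src="${c.previewUrl}"></audio>`,
  ].join("\n");
}

// In the browser, attach each capture to the map as a Leaflet marker
// (guarded so the sketch also runs outside a page, where L is undefined).
if (typeof L !== "undefined") {
  const map = L.map("map").setView([51.5, -0.12], 13);
  L.tileLayer("https://tile.openstreetmap.org/{z}/{x}/{y}.png").addTo(map);
  for (const c of captures) {
    L.marker([c.lat, c.lng]).addTo(map).bindPopup(popupHtml(c));
  }
}
```

Serving the preview as a pre-rendered file (rather than convolving live in the browser) keeps the popup lightweight and works on mobile without Web Audio support.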
In the future, this could evolve into a public archive: a platform where users all over the world can contribute and explore acoustic identities. Think of it as Google Street View for sound — an acoustic memory atlas, built one snapshot at a time.
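The audio previews above rest on convolution reverb: the dry signal is convolved with the captured IR, so the recording's reflections are imprinted onto the new sound. A toy direct-form sketch shows the operation itself; a real player would use Web Audio's `ConvolverNode` or render the file offline rather than loop over samples like this.

```javascript
// Direct convolution: y[n] = sum over k of dry[k] * ir[n - k].
// Illustrative only; O(N*M) is far too slow for real IRs, which are
// convolved via FFT or a ConvolverNode in practice.
function convolve(dry, ir) {
  const out = new Float32Array(dry.length + ir.length - 1);
  for (let n = 0; n < dry.length; n++) {
    for (let k = 0; k < ir.length; k++) {
      out[n + k] += dry[n] * ir[k];
    }
  }
  return out;
}

// A delayed impulse in the IR produces an echo: the output contains the
// dry signal plus a half-amplitude copy delayed by three samples.
const dry = Float32Array.from([1, 0.5, -0.25]);
const echoIr = Float32Array.from([1, 0, 0, 0.5]);
const wet = convolve(dry, echoIr);
// wet → [1, 0.5, -0.25, 0.5, 0.25, -0.125]
```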
