IRCAM Forum 2025 – RIOT v3: A Real-Time Embedded System for Interactive Sound and Music

When you think of motion tracking, you might imagine a dancer in a suit covered with reflective dots, or a game controller measuring hand gestures. But at this year’s IRCAM Forum in Paris, Emmanuel Fléty and Marc Sirguy introduced R-IoT v3 (written RIOT v3 below), the latest evolution of a platform developed at IRCAM for real-time interactive audio applications. For students and professionals working in sound design, physical computing, or musical interaction, RIOT represents a refreshing alternative to more mainstream tools like Arduino, Raspberry Pi, or Bela—especially when tight timing, stability, and integration with software environments like Max/MSP or Pure Data are key.

What is it, exactly?

RIOT v3 is a tiny device—about the size of a USB stick—that can be attached to your hand, your foot, a drumstick, a dancer’s back, or even a shoe. Once it’s in place, it starts capturing your movements: tilts, spins, jumps, shakes. All of that motion is sent wirelessly to your computer in real time.

What you do with that data is up to you. You could trigger a sound sample every time you raise your arm, filter a sound based on how fast you’re turning, or control lights based on the intensity of your movements. It’s like turning your body into a musical instrument or a controller for your sound environment.
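To make those two mappings concrete, here is a minimal C++ sketch of what "trigger on a raised arm" and "filter by turning speed" might look like. The function names, units (g for acceleration, degrees per second for rotation), and thresholds are our own illustration, not part of the RIOT API:

```cpp
#include <cmath>

// Hypothetical mapping helpers (illustrative names, not RIOT's API).

// Trigger a sample when overall acceleration exceeds a threshold.
// At rest the accelerometer reads about 1 g (gravity), so anything
// clearly above that indicates a deliberate movement.
bool raiseTriggered(float ax, float ay, float az, float thresholdG = 1.5f) {
    return std::sqrt(ax * ax + ay * ay + az * az) > thresholdG;
}

// Map turning speed (0..maxRate deg/s) linearly onto a filter cutoff
// range in Hz: slow turns give a dark sound, fast turns a bright one.
float rateToCutoffHz(float rateDps, float maxRate = 360.0f,
                     float loHz = 200.0f, float hiHz = 4000.0f) {
    float t = rateDps / maxRate;
    if (t < 0.0f) t = 0.0f;
    if (t > 1.0f) t = 1.0f;
    return loHz + t * (hiHz - loHz);
}
```

In practice you would receive the sensor values over OSC in Max/MSP or Pure Data and do this mapping in the patch; the point is only that the logic itself is a few lines of arithmetic.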

What’s special about version 3?

Unlike Raspberry Pi, which runs a full operating system, or Arduino, which can have unpredictable latency depending on how it’s programmed, RIOT runs bare metal. This means there’s no operating system, no background tasks, no scheduler—nothing between your code and the hardware. The result: extremely low latency, deterministic timing, and stable performance—ideal for live scenarios where glitches aren’t an option.

In other words, RIOT acts like a musical instrument: when you trigger something, it responds immediately and predictably.
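The timing pattern behind such a bare-metal design can be sketched in a few lines. This is our illustration of the general technique, not RIOT's actual firmware: with no operating system, the main loop busy-waits until a fixed timer deadline, and each new deadline is computed from the previous one rather than from "now", so scheduling error never accumulates. The loop rate and helper names are assumptions:

```cpp
#include <cstdint>

// Assumed sensor-loop rate for this sketch: 2 kHz, i.e. 500 us per tick.
constexpr uint32_t kPeriodUs = 500;

// Advance the deadline by exactly one period. Because the next deadline
// is derived from the previous deadline (not from the current time),
// any lateness in one iteration does not drift into the next.
uint32_t nextDeadlineUs(uint32_t deadlineUs) {
    return deadlineUs + kPeriodUs;
}

// On hardware, the loop body would look roughly like this
// (timer_now_us, readSensors, and sendOsc are hypothetical):
//
//   uint32_t deadline = timer_now_us();
//   for (;;) {
//       while (timer_now_us() < deadline) { /* busy-wait */ }
//       readSensors();
//       sendOsc();
//       deadline = nextDeadlineUs(deadline);
//   }
```

Compare this with calling something like `sleep(period)` at the end of each iteration: there, the time spent doing work is added on top of the sleep, so the loop slowly drifts. Fixed deadlines are what make the timing deterministic.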

The third generation of RIOT introduces some important updates:

  • Single-board design: The previous versions required two boards—the main board and an extension board—but v3 integrates everything into a single PCB, making it more compact and easier to work with.
  • RP2040 support: This version is based on the RP2040 chip, the same microcontroller used in the Raspberry Pi Pico. It’s powerful, fast, and has a growing ecosystem.
  • Modular expansion: For more complex setups, add-ons are coming soon—including boards for audio I/O and Bluetooth/WiFi connectivity.
  • USB programming via riot-builder: The new software tool lets you write C++ code, compile it, and upload it to the RIOT board via USB—no need for external programmers. You can even keep your Max or Pure Data patch running while uploading new code.

Why this matters for sound designers

We often talk about interactivity in sound design—whether for installations, theatre, or music—but many tools still assume that the computer is the main performer. RIOT flips that. It gives you a way to move, breathe, and act—and have the sound respond naturally. It’s especially exciting if you’re working in spatial sound, live performance, or experimental formats.

And even if you’ve never touched an Arduino or built your own electronics, RIOT v3 is approachable. Everything happens over WiFi or USB, and it speaks OSC (Open Sound Control), a protocol used in many creative platforms like Max/MSP, Pure Data, Unity, and SuperCollider. It also works with tools some of you might already know, like CataRT or Comote.
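If you are curious what "speaking OSC" means on the wire, here is a minimal encoder sketch for a message like `/accel 1.0 0.0 0.0`. The address `/accel` is our assumption, not RIOT's documented address scheme, and in real projects you would normally use an OSC library (such as oscpack or liblo) rather than hand-rolling this. The OSC format itself is simple: a NUL-terminated address padded to 4 bytes, a type tag string like `,fff` padded the same way, then each float32 argument in big-endian order:

```cpp
#include <cstdint>
#include <cstring>
#include <string>
#include <vector>

namespace osc_sketch {

// Append a string plus its terminating NUL, padded to a multiple
// of 4 bytes as the OSC 1.0 specification requires.
void appendPadded(std::vector<uint8_t>& out, const std::string& s) {
    out.insert(out.end(), s.begin(), s.end());
    out.push_back(0);
    while (out.size() % 4 != 0) out.push_back(0);
}

// Append a float32 in big-endian (network) byte order.
void appendFloat(std::vector<uint8_t>& out, float v) {
    uint32_t bits;
    std::memcpy(&bits, &v, sizeof(bits));
    for (int shift = 24; shift >= 0; shift -= 8)
        out.push_back(static_cast<uint8_t>(bits >> shift));
}

// Encode a "/accel" message carrying three floats (",fff").
std::vector<uint8_t> encodeAccel(float x, float y, float z) {
    std::vector<uint8_t> msg;
    appendPadded(msg, "/accel");  // address pattern
    appendPadded(msg, ",fff");    // type tag string
    appendFloat(msg, x);
    appendFloat(msg, y);
    appendFloat(msg, z);
    return msg;
}

}  // namespace osc_sketch
```

The resulting byte buffer would go into a single UDP packet; on the receiving side, `udpreceive` in Max or `netreceive` in Pure Data unpacks it for you.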

Under the hood, it’s fast—really fast. It can sense, process, and send your movement data in under 2 milliseconds, so you won’t notice any lag between your action and the response. It can also timestamp data precisely, which is great if you’re recording or syncing with other systems.

The device is rechargeable via USB-C, works with or without a battery, and includes onboard storage. You can edit configuration files just like text. There’s even a little LED you can customize to give visual feedback. All of this fits into a board the size of a chewing gum pack.
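To give a feel for what "editing configuration files just like text" could look like, here is a purely hypothetical example. The key names and values below are our invention for illustration; RIOT's actual configuration schema may differ:

```
# Hypothetical RIOT config sketch (illustrative keys, not the real schema)
wifi_ssid = MyStudioNetwork
dest_ip   = 192.168.1.40    # the computer running Max/MSP or Pure Data
dest_port = 8000            # OSC destination port
rate_hz   = 500             # sensor transmission rate
```

The appeal of this approach is that changing the destination computer or the sensor rate needs no recompilation: plug the board in, edit a text file, unplug, done.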

And yes—it’s open source. That means if you want to tinker later on, or work with developers, you can.

https://github.com/Ircam-R-IoT

A tool made for experimentation

Whether you’re interested in gesture-controlled sound, building interactive costumes, or mapping motion to filters and samples in real time, RIOT v3 is designed to help you get there faster and more reliably. It’s flexible enough for advanced setups but friendly enough for students or artists trying this for the first time.

At FH Joanneum, where design and sound design meet across disciplines, a tool like this opens up new ways of thinking about interaction, performance, and embodiment. You don’t need to master sensors to start exploring your own body as a controller. RIOT v3 gives you just enough access to be dangerous—in the best possible way.

Alina Volkova is a Ukrainian singer, sound producer, and DJ performing under the name Nina Eba. Her musical journey was shaped by her education at a music school, playing in rock bands, composing music for stock audio libraries, and working in television. In August 2024, she released her debut multi-genre mini-album MORPHO, followed by the remix compilation RE:MORPHIX, created in collaboration with 10 producers from different countries. She is now a master's student in the FH Joanneum/KUG Sound Design program, where she is working on the project Embodied Echoes.