The paper introduces spinCycle, an interactive music performance system by Spencer Kiser. Colored plexiglass disks are placed on a spinning turntable; a camera tracks their color and position and transforms those visual patterns into sound in real time. The system creates a distinctive mix of visual design, sound, and interactivity.
On the technical side, spinCycle uses a webcam to capture the rotating disks, and a patch built in Max/MSP/Jitter processes the video feed. The system applies edge detection to identify when and where each color appears and triggers the sound assigned to that color. A live visual feed is projected during the performance, so the audience can see directly how the visual patterns control the music and grasp the connection between color and sound.
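The paper doesn’t show the patch itself, so here is a minimal sketch of how that color-tracking-and-triggering loop might look, written in Python with OpenCV rather than Max/MSP/Jitter just to make the logic readable. The HSV ranges, the area threshold, and the trigger function are my own placeholder assumptions, not details from the paper.

```python
# Minimal sketch (my assumption, not Kiser's actual Max/MSP/Jitter patch):
# find colored regions in each webcam frame and fire a trigger per color.
import cv2
import numpy as np

# Rough HSV ranges for three disk colors -- placeholder values, not taken from the paper.
COLOR_RANGES = {
    "red":   ((0, 120, 80),   (10, 255, 255)),
    "green": ((45, 120, 80),  (75, 255, 255)),
    "blue":  ((100, 120, 80), (130, 255, 255)),
}

def trigger(color, cx, cy):
    """Stand-in for sending a note or sample trigger (e.g. via MIDI or OSC)."""
    print(f"trigger {color} at ({cx}, {cy})")

cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    for color, (lo, hi) in COLOR_RANGES.items():
        mask = cv2.inRange(hsv, np.array(lo), np.array(hi))
        if cv2.countNonZero(mask) > 500:           # enough of this color is visible
            m = cv2.moments(mask)
            cx, cy = int(m["m10"] / m["m00"]), int(m["m01"] / m["m00"])
            trigger(color, cx, cy)                 # fire the sound mapped to this color
    cv2.imshow("spinCycle sketch", frame)          # stands in for the projected live feed
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

In the real patch, the triggering step would presumably feed MSP audio objects or MIDI rather than a print statement.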


The system can function as a drum machine or as a sine wave generator. In the drum machine mode, each color triggers a different percussion sound (kick, snare, hi-hat). In the sine wave mode, each color is hard-coded to a sine wave, and overlapping disks can form secondary colors that produce harmonies.
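To make that second mode concrete, here is a small sketch of how a color-to-sine-wave mapping with secondary-color harmonies could work. The specific primaries, pitches, and the sounddevice playback are my assumptions for illustration, not values from the paper.

```python
# Sketch of the sine wave mode as I imagine it; frequencies and color pairings are assumed.
import numpy as np
import sounddevice as sd  # assumed audio backend, not part of spinCycle

SAMPLE_RATE = 44100

# Primary colors mapped to arbitrary example pitches (C4, E4, G4).
PRIMARY_FREQS = {"red": 261.63, "yellow": 329.63, "blue": 392.00}

# Overlapping two disks yields a secondary color, which sounds both parent tones at once.
SECONDARY = {
    "orange": ("red", "yellow"),
    "green":  ("yellow", "blue"),
    "purple": ("red", "blue"),
}

def tone(freq, duration=0.5):
    """One sine wave at the given frequency."""
    t = np.linspace(0, duration, int(SAMPLE_RATE * duration), endpoint=False)
    return 0.3 * np.sin(2 * np.pi * freq * t)

def play_color(color):
    if color in PRIMARY_FREQS:
        wave = tone(PRIMARY_FREQS[color])
    else:
        # Secondary color: sum the two parent sine waves into a simple harmony.
        a, b = SECONDARY[color]
        wave = tone(PRIMARY_FREQS[a]) + tone(PRIMARY_FREQS[b])
    sd.play(wave, SAMPLE_RATE)
    sd.wait()

play_color("green")  # yellow + blue sounding together
```

The point is simply that an overlap of two disks sounds both parent tones at once, so mixing colors literally mixes pitches.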
I found it fascinating that the idea of connecting color and sound goes back to ancient cultures, including the Chinese and Persians.
In the West, Sir Isaac Newton tried to map colors to musical tones in his Opticks (1704), drawing a connection between the colors’ mathematical relationships to one another and the relationships between the notes of the musical scale.

My thoughts on spinCycle
I think spinCycle is a fun concept that nicely blends visual art, physical interaction, and sound design. Coming from a visual design background, I sometimes find it challenging to fully grasp the logic behind sound design. That’s why I like this approach of using visual patterns to generate sound, which creates a direct and intuitive connection between what you see and what you hear.
What I especially like is that the interface seems very playful and intuitive. It invites you to experiment with colors and spatial arrangements, making sound creation feel more like visual composition. For me, this is a fun and experimental way to make sound design more approachable for visual designers. I also find it fascinating to consider how each color can take on a mood or character through its associated sound. Many people naturally associate colors with certain emotions, and layering sound onto color adds a new emotional dimension. For instance, a soft sine wave could enhance the calmness often associated with blue, while a sharp snare might amplify the energy or urgency linked to red. This creates an opportunity to explore how visual and auditory elements can work together to express emotion in a multisensory way.
However, after reading the paper, I still have some technical questions. While the concept is clear, I feel the technical implementation — especially the way video input is converted into sound — isn’t described in much detail. Since I’ve worked with Max/MSP before, I’m really curious to see how the patch is built. It would be helpful to see how the video tracking, color detection, and sound triggering are structured within the patch.
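Since the patch isn’t documented, here is my own rough guess at how those three stages might chain together, again sketched in Python for readability rather than as a Max patch. The idea of a fixed “playhead” region that the colors pass under as the turntable spins, and the simple change-based debouncing, are purely assumptions on my part.

```python
# Rough structural guess only -- not Kiser's patch. One fixed detection region acts as a
# "playhead"; whatever color passes under it as the turntable spins fires a trigger.
import cv2
import numpy as np

PLAYHEAD = (240, 320, 20)   # (y, x, half-size) of the sampled region -- assumed values
COLOR_RANGES = {
    "red":  ((0, 120, 80),   (10, 255, 255)),
    "blue": ((100, 120, 80), (130, 255, 255)),
}

def classify(patch_hsv):
    """Return the dominant disk color inside the playhead region, or None."""
    for color, (lo, hi) in COLOR_RANGES.items():
        mask = cv2.inRange(patch_hsv, np.array(lo), np.array(hi))
        if mask.mean() > 128:   # more than half of the patch matches this color
            return color
    return None

cap = cv2.VideoCapture(0)
last = None
while cap.isOpened():           # stop with Ctrl+C; kept minimal for the sketch
    ok, frame = cap.read()
    if not ok:
        break
    y, x, r = PLAYHEAD
    patch = cv2.cvtColor(frame[y - r:y + r, x - r:x + r], cv2.COLOR_BGR2HSV)
    color = classify(patch)
    if color and color != last:      # debounce: only fire when the color changes
        print("trigger", color)      # stand-in for the sound-triggering stage
    last = color
cap.release()
```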
Overall, I think spinCycle is a very fun and creative tool, and I would love to try it myself.