Using x-IMU3 Python API for Live USB Data

In addition to decoding files with the x-IMU3 GUI, this project also makes use of the Python library provided by x-io Technologies, which allows sensor data to be streamed directly from the device over a USB connection. After installing the ximu3 package with pip3 install ximu3, the example scripts from the Examples/Python directory of the GitHub repository (https://github.com/xioTechnologies/x-IMU3-Software) were used, in particular usb_connection.py. The script was run from the external SSD directory /Volumes/Extreme SSD/surfboard/usbConnection.

To execute the script, the following terminal command was used:
   python3 /Volumes/Extreme\ SSD/surfboard/usbConnection/usb_connection.py

This step succeeds once the x-IMU3 is detected. The user is then prompted whether to print data messages; after enabling this, the terminal displays live sensor data, including quaternions, Euler angles, and gyroscope and accelerometer readings. Notably, this method bypasses the GUI and provides direct access to the sensor streams, enabling more flexible integration and a more advanced data mapping setup.
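To illustrate this access pathway, the following condensed sketch follows the connection pattern of the repository's Python examples. It is a simplified approximation rather than the exact project code: the port name is a placeholder, and the class, constant, and callback names (UsbConnectionInfo, RESULT_OK, add_euler_angles_callback, and so on) may vary between library versions.

   import time
   import ximu3

   PORT_NAME = "/dev/tty.usbmodemXXXX"  # placeholder serial port (macOS naming)

   def euler_callback(message):
       # Print each incoming Euler angles message (roll, pitch, yaw)
       print(message.to_string())

   def inertial_callback(message):
       # Print each incoming gyroscope/accelerometer message
       print(message.to_string())

   connection = ximu3.Connection(ximu3.UsbConnectionInfo(PORT_NAME))

   if connection.open() != ximu3.RESULT_OK:
       raise RuntimeError("Unable to open connection")

   connection.add_euler_angles_callback(euler_callback)
   connection.add_inertial_callback(inertial_callback)

   time.sleep(10)  # stream live data for ten seconds
   connection.close()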

The full Python architecture includes modular scripts such as connection.py, usb_connection.py, and helpers.py, which handle the low-level serial communication and parsing. This additional access pathway expands the project's versatility and opens the door to a more experimental workflow (x-io Technologies, 2024).
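Since the following section interprets the sensor values as OSC messages in Pure Data, one way to use this pathway is to forward the incoming values as OSC directly from Python. The sketch below is a hypothetical bridge built on the third-party python-osc package; the OSC addresses mirror those used in the patch, while the ximu3 message attribute names (roll, gyroscope_x, and so on) are assumptions based on the library's message fields.

   import time
   import ximu3
   from pythonosc.udp_client import SimpleUDPClient

   client = SimpleUDPClient("127.0.0.1", 9000)  # assumed address and port of the Pd patch

   def euler_callback(message):
       # Forward roll, pitch, yaw as a three-float OSC message
       client.send_message("/imu3/euler", [message.roll, message.pitch, message.yaw])

   def inertial_callback(message):
       # Forward gyroscope and accelerometer readings on separate addresses
       client.send_message("/imu3/gyroscope",
                           [message.gyroscope_x, message.gyroscope_y, message.gyroscope_z])
       client.send_message("/imu3/acceleration",
                           [message.accelerometer_x, message.accelerometer_y, message.accelerometer_z])

   connection = ximu3.Connection(ximu3.UsbConnectionInfo("/dev/tty.usbmodemXXXX"))
   if connection.open() != ximu3.RESULT_OK:
       raise RuntimeError("Unable to open connection")

   connection.add_euler_angles_callback(euler_callback)
   connection.add_inertial_callback(inertial_callback)
   time.sleep(60)  # forward data for one minute
   connection.close()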

1. OSC Data Interpretation in Pure Data

The received OSC data is interpreted using a custom Pure Data patch (imu3neuerversuch22.04..pd), which serves as a bridge between the sensor data and its visual representation. The patch listens for incoming OSC messages via the [udpreceive] and [unpackOSC] objects, parsing them into sub-addresses such as /imu3/euler, /imu3/acceleration, and /imu3/gyroscope.
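For testing this message structure outside of Pure Data, a minimal Python receiver can mimic the patch's routing. The sketch below uses the python-osc package; the port number 9000 is a placeholder and must match whatever port the patch actually listens on.

   from pythonosc.dispatcher import Dispatcher
   from pythonosc.osc_server import BlockingOSCUDPServer

   dispatcher = Dispatcher()

   def print_handler(address, *args):
       # Each /imu3/* message carries a list of floats (e.g. x, y, z)
       print(address, args)

   dispatcher.map("/imu3/*", print_handler)

   server = BlockingOSCUDPServer(("127.0.0.1", 9000), dispatcher)
   server.serve_forever()  # print every incoming OSC message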

Each of these OSC paths carries a list of float values, which are unpacked using [unpack f f f] objects. The resulting individual sensor dimensions (e.g., x, y, z) are then routed to various subpatches or modules. Inside these subpatches, the values are scaled and normalized to fit the intended modulation range, as the sketch after the following list illustrates. For example:

  • Euler angles are converted into degrees and used to modulate stereo panning or spatial delay.
  • Z-axis acceleration is used as a trigger threshold to initiate playback or synthesis grains.
  • Gyroscope rotation values modulate parameters like filter cutoff or reverb depth.
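The Python functions below sketch equivalents of these three mappings. The scaling constants are arbitrary illustrations; the actual ranges live inside the Pd subpatches.

   def pan_from_yaw(yaw_degrees):
       # Map yaw (-180..180 degrees) to a stereo pan position (0..1)
       return (yaw_degrees + 180.0) / 360.0

   def grain_trigger(accel_z, threshold=1.5):
       # Fire a trigger when z-axis acceleration exceeds a threshold (in g)
       return accel_z > threshold

   def cutoff_from_gyro(gyro_y, max_rate=500.0, low=200.0, high=8000.0):
       # Map absolute rotation rate (deg/s) to a filter cutoff in Hz
       rate = min(abs(gyro_y), max_rate) / max_rate
       return low + rate * (high - low)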

Additionally, [select] and [expr] objects are used to create logic conditions, such as identifying sudden peaks or transitions. This setup allows the system to treat physical gestures on the surfboard—like standing, carving, or jumping—as expressive control inputs for audio transformation.
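As a rough Python analogue of such a logic condition, a hypothetical peak detector might compare successive sensor values against a jump threshold:

   class PeakDetector:
       """Detect sudden jumps between successive sensor values."""

       def __init__(self, jump_threshold=2.0):
           self.jump_threshold = jump_threshold
           self.previous = None

       def update(self, value):
           # Return True when the change since the last sample exceeds the threshold
           peak = self.previous is not None and abs(value - self.previous) > self.jump_threshold
           self.previous = value
           return peak

In the patch itself, the same decision is expressed with [expr] comparing the current and delayed value, so a detected peak can be routed (e.g. via [select]) to trigger a sound event.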

The modular structure of the patch enables quick expansion. New OSC paths can be added, and new sound modules can be integrated without rewriting the core logic. By structuring the patch in this way, it remains both maintainable and flexible, supporting future extensions such as machine learning-based gesture classification or live improvisation scenarios.

This technical design reflects a broader trend in contemporary media art, where real-world data is used not just for visualization but as a means to dynamically sculpt immersive audio experiences (Puckette, 2007).