Playback and Visualization of x-IMU3 Sensor Data Using Python and Pure Data

This section documents the workflow used to play back recorded x-IMU3 motion sensor data and visualize it as a dynamic graph in Pure Data. The goal was to analyze the movement of two specific flips along the X-axis. A few seconds of this rotation were recorded, read by a Python script, sent to Pure Data, and plotted as a graph there. The accuracy of the result was confirmed through multiple validation layers.

First, data was captured using the x-IMU3 inertial measurement unit. During the recording session, the sensor was physically maneuvered to perform two flips along its X-axis. The sensor saved this data internally in a binary format with the extension .ximu3. So that the file could be found again later, it was named XIMUA_0005.ximu3 and stored on an external drive.

The second step was to decode and transmit the recorded motion data. For this, I used a Python script named ximu2osc.py, written to read both live and recorded data and transmit it via the Open Sound Control (OSC) protocol. The script uses the official ximu3 Python library for file decoding and the python-osc library for sending OSC messages.

The Python script was executed using the following command in the terminal:
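The exact command is missing from the draft; based on the flags described in the next paragraph, it presumably looked something like this (the argument order and the positional file argument are assumptions):

```shell
python ximu2osc.py XIMUA_0005.ximu3 -p 9000 -H 127.0.0.1
```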

This command initializes playback of the sensor recording by passing the .ximu3 file as input. The -p argument sets the OSC port to 9000, and the -H argument specifies the destination IP address, in this case 127.0.0.1. The Python script then reads and decodes the binary sensor data in real time and sends the formatted OSC messages along a clearly defined address path.
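The sending side can be sketched with only the Python standard library. This is an illustrative stand-in, not the actual ximu2osc.py (which uses python-osc), and the OSC address "/inertial/gyroscope/xyz" is an assumption based on the routing chain described further below:

```python
import socket
import struct

def osc_message(address, floats):
    """Build a minimal OSC message: padded address, type tags, big-endian floats."""
    def pad(b):
        b += b"\x00"                          # OSC strings are null-terminated...
        return b + b"\x00" * ((-len(b)) % 4)  # ...and padded to a 4-byte boundary
    tags = ("," + "f" * len(floats)).encode()
    payload = b"".join(struct.pack(">f", v) for v in floats)
    return pad(address.encode()) + pad(tags) + payload

# Send one gyroscope sample to Pd listening on 127.0.0.1:9000 (UDP).
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.sendto(osc_message("/inertial/gyroscope/xyz", [1.0, 2.0, 3.0]),
            ("127.0.0.1", 9000))
```

In practice the python-osc library handles this encoding; the sketch only shows what travels over the wire.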

On the receiving end, a Pure Data (Pd) patch was created to receive and interpret the data. The patch listens on port 9000 and processes the incoming OSC messages with the [netreceive -u -b 9000] object, which receives UDP packets in binary format. The output of [netreceive] was then connected to the [oscparse] object, which decodes incoming OSC messages into usable Pd lists.

[list trim] was introduced in the patch to strip any remaining selectors. Next, a set of [route] objects was placed to filter out the gyroscope data, specifically the values of the X, Y, and Z axes, using a hierarchical routing structure: first [route inertial], followed by [route gyroscope], and finally [route xyz]. The resulting values were then split into three floats (X, Y, Z) using [unpack f f f]. For this test, only the X-axis values were needed.
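For reference, the receive-side chain can be mirrored in Python. This is a minimal sketch that decodes a single OSC message with float arguments (no bundles); the address "/inertial/gyroscope/xyz" is assumed from the [route] chain above:

```python
import struct

def parse_osc(packet):
    """Decode one OSC message into (address, float args); the rough equivalent of [oscparse]."""
    def read_string(buf, i):
        end = buf.index(b"\x00", i)
        s = buf[i:end].decode()
        i = end + 1
        i += (-i) % 4              # skip padding to the next 4-byte boundary
        return s, i

    address, i = read_string(packet, 0)
    tags, i = read_string(packet, i)
    values = []
    for t in tags[1:]:             # the first tag character is ','
        if t == "f":
            values.append(struct.unpack_from(">f", packet, i)[0])
            i += 4
    return address, values

# Mimic [route inertial] -> [route gyroscope] -> [route xyz] -> [unpack f f f]:
packet = (b"/inertial/gyroscope/xyz\x00" + b",fff\x00\x00\x00\x00"
          + struct.pack(">fff", 10.0, -2.5, 0.0))
address, values = parse_osc(packet)
if address == "/inertial/gyroscope/xyz":
    x, y, z = values               # only x is used in this test
```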

To visualize the X-axis values in real time, an array named array1 was created, functioning as a scrolling plot of the incoming rotation data. Each X value is written to a new index in the array with [tabwrite array1]. A simple counter system built from [metro], [+ 1], and [mod 500] advances the write position in the array. The [metro] object fires at a 500 ms interval, which in this case serves as the sampling rate of the graph. The counter loops over a fixed range of 500 steps, which is how the circular buffer was built. Each new value is stored in a float object [f] and sent via [s x-index] to its matching [r x-index].
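The counter and circular-buffer logic of the patch can be sketched in Python (the buffer size of 500 and the 500 ms tick are taken from the patch; all names are illustrative):

```python
SIZE = 500
buffer = [0.0] * SIZE
index = 0                          # the value held in [f] and sent via [s x-index]

def write_sample(value):
    """Store one X value at the current position, then advance and wrap ([+ 1], [mod 500])."""
    global index
    buffer[index] = value          # the [tabwrite array1] step
    index = (index + 1) % SIZE

# In the patch, [metro 500] would trigger this every 500 ms with the latest X value.
for i in range(502):
    write_sample(float(i))
```

After 502 writes the index has wrapped around, and the oldest entries at the start of the buffer have been overwritten, exactly like the scrolling plot in Pd.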

[Figure: screenshot]

With this setup, the continuous stream of X-axis values can be plotted into the array, resulting in a dynamic visualization of the sensor's movement over time. During playback of the recorded .ximu3 file, the two flips performed on the X-axis show up clearly as spikes in the plotted graph, providing a faithful representation of the motion along the X-axis. In addition, all values were printed to the Pd console for verification and debugging purposes.

To ensure the accuracy of the visualization, I compared the received values in three ways. First, I monitored the terminal output of the Python script, which printed every OSC message being sent, including its path and values. Second, I checked the values printed inside Pure Data and compared them with those from the terminal. Third, I opened the .ximu3 file in the official x-IMU3 GUI and exported the data as a CSV file. In the resulting file Inertial.csv, the “Gyroscope X (deg/s)” column contained the same values as those printed in the terminal, in Pure Data, and on the graph. This confirms that the sensor data was transmitted consistently across all three layers: the original file, the terminal stream, and the Pd visualization.

In conclusion, this test demonstrates a successful connection between recorded sensor movement and its visual representation using an OSC streaming data pipeline. A clearly structured, repeatable method was used to analyze a specific gesture or physical event recorded by the sensor. Furthermore, the system is adaptable and can easily be adjusted to visualize different values. It also lays the groundwork for further possibilities in sound design and audio processing later in the process.