SURF SKATE SIMULATION AND TEST RECORDINGS

Purpose of the Simulation

Before deploying the system in ocean conditions, a controlled test was performed on land using a surf skate, in order to work out the synchronization of the different media in advance. The simulation served multiple purposes:

  • To test the stability and functionality of the hardware setup under strong movements
  • To collect and analyze motion data from surfing-like movements, such as the cutback, using the x-IMU3 sensor
  • To test and evaluate the contact microphone’s responsiveness to board interaction and different movement patterns
  • To practice audiovisual synchronization between the footage from an external camera, the Zoom H4n recorder, the contact microphone, and the x-IMU3 motion data

The surf skate was chosen because it closely reproduces the body movement and board rotation of surfing. The cutback in particular can be imitated on a skate ramp.

This testing setup consists of the following tools:

  • A Carver-style surf skateboard
  • The x-IMU3 sensor mounted on the bottom of the board to capture movement dynamics
  • The piezo contact microphone, taped next to the motion sensor on the bottom of the board. After testing, the microphone was placed in the middle of the skateboard deck so that movement around both axes of the board is captured at equal loudness; placing the microphone closer to the wheels resulted in much more noise in the recording due to the internal rotation of the axles.
  • The Zoom H4n recorder, held in the skater’s hand and connected to closed over-ear headphones.
  • The external film camera, a Sony Alpha 7 III, which captured the whole test. This additional recording proved helpful later during synchronization.

The board was ridden in a skate ramp to simulate the shape of a wave; at the top of the ramp, the cutback movement can be executed.


At the start of the recording session, all devices were synchronized with a short impulse sound (a hit on the board) recorded on all three devices: Zoom, GoPro, and x-IMU3. The individual surf skate takes lasted approximately two minutes each and were repeated multiple times.
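The same impulse trick can also be checked in software after the session: each recording's offset is simply the time of its loudest early sample. A minimal sketch, assuming 16-bit mono WAV exports and hypothetical file names:

```python
import wave
import struct

def impulse_time(path, search_seconds=10.0):
    """Return the time (s) of the loudest sample near the start of a
    16-bit mono WAV file -- assumed to be the sync hit on the board."""
    with wave.open(path, "rb") as w:
        rate = w.getframerate()
        n = min(w.getnframes(), int(search_seconds * rate))
        samples = struct.unpack("<%dh" % n, w.readframes(n))
    peak_index = max(range(n), key=lambda i: abs(samples[i]))
    return peak_index / rate

# Hypothetical usage: offsets of the other tracks relative to the Zoom.
# zoom_t = impulse_time("zoom_take1.wav")
# gopro_offset = zoom_t - impulse_time("gopro_take1.wav")
```

The offset can then be applied as a track delay in REAPER or subtracted from the motion-data timestamps.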
The data recorded consists of:

  • Accelerometer, gyroscope, and orientation data from the x-IMU3
  • Mono WAV audio from the contact microphone
  • 1080p video footage from the external camera

The files were transferred and loaded into the respective analysis environments:

  • The x-IMU3 data was decoded using the official GUI and exported as CSV files;
  • The WAV audio was imported into REAPER and cross-referenced with the GoPro’s audio to align the sync impulse;
  • Motion data was plotted using Python and matched frame-by-frame to movement events in the video.
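The plotting-and-matching step could be sketched roughly like this. The column names are assumptions about the CSV layout of the x-IMU3 GUI export and would need to be adjusted to the actual files:

```python
import csv
import math

def load_gyro(path):
    """Read timestamps and gyroscope magnitude (deg/s) from an exported CSV.
    Column names are assumed, not taken from a real export."""
    times, mags = [], []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            times.append(float(row["Timestamp (s)"]))
            mags.append(math.sqrt(
                float(row["Gyroscope X (deg/s)"]) ** 2 +
                float(row["Gyroscope Y (deg/s)"]) ** 2 +
                float(row["Gyroscope Z (deg/s)"]) ** 2))
    return times, mags

def find_turn_peaks(times, mags, threshold=150.0):
    """Times of local maxima above a threshold -- candidate cutbacks/turns,
    to be matched against movement events in the video."""
    return [times[i] for i in range(1, len(mags) - 1)
            if mags[i] > threshold
            and mags[i] >= mags[i - 1] and mags[i] > mags[i + 1]]
```

Each returned peak time (shifted by the sync offset) can then be compared against the corresponding video frame.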

The result was a perfectly aligned audio-motion-video composite, usable both for analysis and composition.

Observations and Results

  • The contact mic successfully captured vibrational data including surface noise, carving intensity, and road texture;
  • The x-IMU3 data revealed clear peaks in angular velocity during simulated cutbacks and sharp turns;
  • The GoPro footage confirmed that movement gestures correlated well with sonic and motion data markers;
  • The Pelican case and foam provided sufficient shock insulation, and no overheating or component failure occurred;
  • The synchronization method using a single impulse sound proved highly reliable.

The surf skate test validated the concept and highlighted important considerations:

  • Movement-based sonic gestures are highly expressive and usable for composition;
  • Vibration sensitivity of the contact mic is sufficient for detailed sound capture;
  • The sync strategy will work equally well in ocean sessions with minor adjustments;
  • Battery and storage life are adequate for short-to-medium-length surf sessions;
  • Cable insulation and structural mounting are durable under stress.

This test confirmed the system’s readiness for its full application in Morocco, where ocean sessions will build upon the structure and learnings of this simulation.

#14 Preparing to build a prototype

After outlining a national-level idea for a tactile map of Austria in my last post, I quickly realized: starting small is smarter. Not only because of time and material constraints, but because detail matters and working at a regional scale allows me to dive deeper into how elevation, infrastructure, and flood risk actually intersect.

So for my next prototype, I’m focusing on the region around Tulln, and potentially Vienna if time allows. This area offers a compelling intersection of topography, hydrology, and urban development, all wrapped around the Danube, Austria’s largest and most flood-prone river.


Why Tulln?

  • It’s a mid-sized town with both urban and rural textures, making it ideal for mixed-surface representation.
  • It lies directly along the Danube, with several documented flood events in recent years.
  • Its relatively flat terrain offers subtle elevation changes—challenging but manageable for tactile representation.
  • Data is available: flood risk maps, land use info, and elevation contours are easier to source at this scale.

Plus: living close by, I have a personal reference point for it, which helps in imagining scale and interpretation.


What the Data Says: A Quick WISA Deep-Dive

I spent an evening inside the WISA (WasserInformationsSystem Austria) portal, specifically the second-cycle hazard and risk maps:

https://maps.wisa.bmluk.gv.at/gefahren-und-risikokarten-zweiter-zyklus

Key takeaways:

  1. Three flood scenarios dominate planning: HQ30, HQ100, HQ300 (30-, 100-, 300-year events).
  2. Each scenario maps expected water depth and flow velocity—crucial for picking tactile textures.
  3. In the Tulln/Vienna stretch, HQ100 zones hug both banks, widening dramatically at meander bends.
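Those return periods are easy to misread as "once every N years." A quick sketch of the standard exceedance-probability arithmetic makes them more concrete (the 30-year horizon is my own illustrative choice, not from WISA):

```python
def flood_risk(return_period_years, horizon_years):
    """Probability of at least one event of this size within the horizon.
    An HQ-T event has an annual exceedance probability of 1/T."""
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years

for t in (30, 100, 300):
    print(f"HQ{t}: {flood_risk(t, 30):.0%} chance within 30 years")
```

Even the "rare" HQ100 flood has roughly a one-in-four chance of occurring during any given 30-year window, which is part of why these zones matter for a tactile risk map.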

Material Scouting (aka “Foam Feel-Up” Day)

I’ve been to a few shops looking at different materials to see what would work best.

Prototype Blueprint (Version 0.1)

Layer | Data Source | Tactile Encoding
Elevation (4 bands) | data.gv.at | cardboard (stacked)
HQ100 flood zone | WISA hazard map | scrub sponge (Reinigungsschwamm)
Sealed land | data.gv.at | compressed cork board
Green space | data.gv.at | felt fabric
Danube + major tributaries | WISA hazard map | smooth craft foam

The aesthetic goal isn’t prettiness; it’s readability by hand. Every texture must scream its meaning in under two seconds of fingertip contact.

Scope Check

  • Board size: A4 fits on a lap, lowers material costs, and is easy for a first prototype.
  • Layers: 3–4 elevation steps + 1 sponge overlay = max 5 tactile heights.
  • Geography: Tulln centre + ~5 km buffer on each side; Vienna only if the first build behaves.

Next Up: Cutting, Gluing, (Re-)Cursing

In my next blog post I’ll document the messy middle:

  1. Printing and tracing simplified contours.
  2. Foam-board surgery (scalpel + podcasts).
  3. Flood-sponge wrestling: how do you glue something that’s meant to feel like water?
  4. First blindfold test: can a friend locate “safe ground” by touch alone?

Fingers crossed (and hopefully uncut).

WebExpo Talk #1: Nadieh Bremer

Creating an effective & beautiful data visualisation from scratch

The field trip to Prague is over, and I’ve been thinking about the really interesting talk by Nadieh Bremer. Nadieh is a freelance data visualization designer from the Netherlands, and her work focuses on turning raw data into interactive and static visual art. It was fascinating to see how she approaches data, especially since my interest in data visualization started a few years ago during my bachelor’s in graphic and information design. This talk made me think in new ways about the potential of visualizing data, and I’m excited to dive deeper into it.

One of the things that stood out to me the most during the talk was how Nadieh works with D3.js, a JavaScript library for creating (interactive) data visualizations. I was amazed by how quickly she could take raw data—just numbers—and turn them into beautiful, meaningful visualizations. She made it look so easy, and the fact that she could transform the data into something visually stunning in such a short amount of time really caught my attention. I had heard about D3.js before and had been meaning to check it out, but like most people, I never had the time. So, this talk came at the perfect moment for me, and it made me realize just how powerful and useful this tool is for working with data.

As someone who has mainly worked with data in print media, I’ve always focused on static visualizations. Most of the techniques I’ve learned are for creating things like printed charts, posters, or other fixed formats. But seeing how Nadieh used D3.js to create interactive, dynamic visualizations opened up a whole new world for me. The idea that data can be more than just something to look at on paper—that it can be experienced and interacted with—was something I hadn’t fully considered before. With D3.js, the data is not just displayed; it’s alive and engaging. You can hover over elements to get more information, zoom in to explore trends, and see the data change in real-time. This is something you simply can’t do with traditional print media, and I’m excited to explore how I could bring this kind of interactivity to my own work.

What I also found really interesting was how data can be art. Nadieh’s visualizations weren’t just about presenting data clearly; they were also about making the data visually appealing and impactful. She showed that data visualization doesn’t have to be cold or purely functional—it can be something beautiful. This idea was a bit of an eye-opener for me, as I’d always thought of data as something to be communicated in a straightforward, no-frills way. But seeing her work made me realize that data can be both informative and artistic, and it’s something I want to try in my own designs.

The talk really showed me the potential of D3.js and how it can take data visualization to a whole new level. It’s not just about making a chart or graph anymore. It’s about telling a story through data, using color, motion, and interactivity to make the information more engaging and easier to understand. This is something that I think would take much longer to achieve using traditional print techniques, and it’s a huge opportunity for people like me who are interested in graphic design and information design.

Overall, I’m really glad I got to experience Nadieh’s talk. It made me realize just how much more there is to data visualization and how powerful tools like D3.js can be for creating engaging, interactive, and even artistic visualizations. I’m excited to start experimenting with D3.js myself and see where it takes me. I’ve learned that data doesn’t have to be static and technical—it can be creative and expressive, even be used in an artistic sense. And that’s a new perspective I learned and will keep in mind as I continue to work with data.

WebExpo Conference Talk #2 – Digital Intimacy: Feeling Human in an Artificial World

I have chosen “Digital Intimacy: Feeling Human in an Artificial World” as the second talk to discuss here because I previously worked on two projects during my bachelor’s degree that dealt with the same topic and similar questions to the ones Lutz Schmitt presented at the Expo. In one of my projects, about long-distance relationships, my team and I asked ourselves how we could create a sense of closeness through media and technology. By closeness we meant above all emotional intimacy, built through rituals, shared experiences, and time spent doing things together; but we also asked ourselves whether we should mimic physical intimacy and proximity in some way and, more importantly, how to do that with technology.



Lutz Schmitt’s talk investigates how feelings of closeness and connection can be created in digital and artificial contexts (through robots, AI-driven systems, or designed experiences). He explores whether digital interactions can offer a genuine sense of intimacy and how we can distinguish meaningful connection from simulation. He brings up key questions: Can people form real emotional bonds with non-human objects? What role do trust and vulnerability play in creating such connections? And what ethical responsibilities arise when we design digital interactions?


From a UX and interaction design perspective, this talk is very relevant. In both projects I worked on, we looked into creating interfaces that go beyond typical communication tools: ones that encourage presence and emotional involvement. For example, instead of simply allowing users to send messages, we explored designing rituals, synchronized activities, and interfaces that created a sense of “co-being” rather than just back-and-forth communication. These approaches align with Schmitt’s idea that intimacy is not just about frequency of contact, but about quality of interaction and emotional context.

He also challenges the trend of creating frictionless, overly polished digital experiences. In reality, human relationships are full of imperfection and effort. Transferring that to UI/UX means intentionally designing for slowness and emotional nuance, something we often avoid in tech but which is deeply ingrained in us and an inherent part of the human experience. For example, what if the interface was affected by emotional tone? Or what if moments of silence or waiting became part of the interaction, signaling care or presence instead of emptiness?

Another really interesting and relevant aspect he brought up in his talk was the consideration of privacy. Privacy is much harder to maintain once a technological component or product is introduced into a situation, since it’s almost impossible not to have a third party involved. This raises the ethical question of how to responsibly handle the very private data that is collected. As someone who designs these kinds of products, this is something I hadn’t given much thought before but really need to take into consideration.

In conclusion, the talk reminded me that designing for emotional intimacy is not just a question of which technology to use but a much deeper emotional and ethical problem, one that requires understanding the essence of human intimacy and how technology can support it instead of substituting or mimicking it. It’s a complex but deeply relevant area for interaction design, one that requires sensitivity, creativity, and critical thinking.