IMPULSE #4: Lunch with Prof. Baumann (with some good Kebap!)

This impulse is a bit different from the others because it is not a book or a talk, but a lunch meeting with Prof. Konrad Baumann that helped me put much sharper edges around my thesis idea. The conversation was essentially my first “real” check-in with someone I hope will supervise my thesis, and it forced me to articulate my motivations and what I actually want to achieve with “effective ethical design” and digital footprints. Instead of staying in my own head, I had to explain why this topic matters to me and where I see it sitting inside UX practice and the wider industry. That alone made this meeting feel like an important impulse.

We started by reconnecting threads from a previous class discussion, where we had talked about our interests in the UX field and the kinds of industry problems we care about. For me, those questions brought back the same themes: ethical design, dark patterns, privacy, and how users are often left in the dark about their data trails. This lunch was like a continuation of that exercise, but one-on-one and more honest. Saying my thesis topic out loud and contextualising it in front of someone with experience in this area made my intentions feel more “real”, and it also exposed where my thinking was still a bit vague or too broad.

I really liked how he brought up concrete cases and pointed me toward resources, including earlier advice I had heard about noyb (short for “none of your business”), a European privacy organisation that regularly takes companies to court over data protection violations. These cases are basically “real-life stories” of where digital products and services crossed lines in how they handled user data. That was a helpful reminder that my thesis is not just theoretical; it sits in a landscape where regulators, NGOs, and companies are already fighting over what is acceptable, from tracking to dark patterns to consent models.

Afterwards, Prof. Baumann shared an interesting ORF article that discusses current tensions and developments around privacy and digital rights in Austria and Europe. Even without quoting it directly, the article makes it clear how much is at stake: from weak enforcement to high-profile cases against platforms and tech companies, it shows that “privacy by design” is not just a slogan but something that either happens in concrete interfaces or does not. For my thesis, this is a useful anchor, because it links my academic work to a living context of laws being tested, companies being challenged, and users being affected.

What I take from this impulse is both emotional and structural. Emotionally, it reassures me that I am not chasing a “nice sounding topic” but something that sits at the intersection of UX, law, and real harms users are experiencing. Structurally, it pushes me to frame my thesis more clearly around a few core questions: How can interaction design make digital footprints visible and manageable in everyday interfaces? How can ethical constraints and legal requirements be translated into practical patterns instead of abstract guidelines? And how can designers avoid repeating the kinds of behaviours that end up in complaints, lawsuits, or investigative articles about privacy abuses?

For my next steps, this meeting gives me three concrete moves. First, to keep mapping real cases (like those collected by noyb and highlighted in media coverage) as examples of what “unethical design” looks like in practice, and why better interaction patterns are needed. Second, to use those cases as boundary markers when I prototype: if a pattern smells like something that has already led to a complaint or enforcement, it is a red flag. Third, to stay in close conversation with Prof. Baumann as a supervisor, so that my thesis stays grounded in both design practice and the evolving legal and ethical landscape.
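
To make the second move, the “boundary marker” idea, more tangible for myself, here is a minimal sketch of how I imagine checking a prototype against documented cases. The pattern names and case notes below are illustrative placeholders I made up for this post, not a verified dataset:

```python
# Hypothetical sketch: checking a prototype against "boundary marker" patterns
# drawn from documented complaints and enforcement. Pattern names and notes
# are illustrative placeholders, not a verified dataset.

RED_FLAG_PATTERNS = {
    "preselected_consent": "consent boxes ticked by default have been challenged under the GDPR",
    "hidden_reject_option": "cookie banners without an equally visible reject option have drawn complaints",
    "forced_account_creation": "requiring an account for basic use is a classic data-maximising move",
}

def review_prototype(patterns_used: list[str]) -> list[str]:
    """Return the red-flag patterns that a prototype uses, if any."""
    return [p for p in patterns_used if p in RED_FLAG_PATTERNS]

# Example: a prototype that uses one harmless and one flagged pattern.
for flag in review_prototype(["progress_indicator", "hidden_reject_option"]):
    print(f"RED FLAG: {flag} - {RED_FLAG_PATTERNS[flag]}")
```

Even in this toy form, the exercise forces me to name patterns explicitly instead of relying on a vague sense that something “smells” wrong.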

Link to the ORF article Prof. Baumann shared (in German), which anchors this impulse in current debates about privacy and data protection:
https://orf.at/stories/3410746/

For broader context on enforcement and complaints concerning privacy violations in Europe, especially involving companies like Clearview AI, these pieces from Reuters and noyb help show how data misuse is being challenged at a legal level:
https://www.reuters.com/sustainability/society-equity/clearview-ai-faces-criminal-complaint-austria-suspected-privacy-violations
https://noyb.eu/en/criminal-complaint-against-facial-recognition-company-clearview-ai

Finally, this Austrian consumer-focused article on dark patterns and manipulative web design provides a very concrete list of deceptive practices and explains how new regulations like the Digital Services Act aim to limit them, which connects directly back to my thesis interest in ethical interfaces and user autonomy:
https://www.konsumentenfragen.at/konsumentenfragen/Kommunikation_und_Medien/Kommunikation_und_Medien_1/Vorsicht-vor-Dark-Patterns-im-Internet.html

Disclaimer: This blog post was developed with AI assistance (Perplexity) to help with structuring and phrasing my reflections.

IMPULSE #2: Reflecting on the panel discussion “Privacy design, dark patterns, and speculative data futures” – What if we designed for better data futures on purpose?

The panel at CPDP 2022 on “Privacy design, dark patterns, and speculative data futures” brought together researchers, regulators, and designers to talk about how current interfaces manipulate people, and how speculative design and foresight could help us imagine and build better data futures. The panel was moderated by Cristiana Santos (Utrecht University, Netherlands) and featured speakers including Régis Chatellier, Stefano Leucci, Dusan Pavlovic, Arianna Rossi, and Cennydd Bowles.

The core themes of this panel are very close to my thesis: on one side, dark patterns and privacy-invasive mechanisms quietly exploit users; on the other, there is a growing push for transparency-enhancing technologies and privacy-by-design approaches that could give people more control over their digital footprints.

One of the clear threads in the discussion is that dark patterns are not accidents; they result from deliberate choices, business pressures, and a lack of ethical guardrails in the design process. Panelists talk about building description schemas and datasets to systematically identify and classify deceptive patterns in interfaces, especially around privacy choices and access to personal data. For my thesis, this reinforces the idea that “ethical design” cannot stay abstract. If I want to help people manage their digital footprints, I need to treat dark patterns and their opposites as concrete, nameable design patterns and counter-patterns that can be recognised, tested, and avoided.
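
To anchor this for my own work, here is a rough sketch of what such a description schema could look like as a data structure. The panel does not prescribe a concrete format, so every field name here is my own assumption:

```python
# Hypothetical sketch of a description schema for deceptive interface patterns,
# in the spirit of the panel's idea of making dark patterns nameable and
# classifiable. All field names are my own assumptions, not the panel's schema.

from dataclasses import dataclass, field

@dataclass
class PatternRecord:
    name: str                  # e.g. "confirmshaming"
    category: str              # e.g. "emotional manipulation", "obstruction"
    affected_choice: str       # which privacy decision the pattern distorts
    mechanism: str             # how the interface steers the user
    counter_pattern: str       # a respectful alternative for the same flow
    examples: list[str] = field(default_factory=list)  # documented sightings

record = PatternRecord(
    name="confirmshaming",
    category="emotional manipulation",
    affected_choice="opting out of marketing emails",
    mechanism="the opt-out link is worded to make refusal feel embarrassing",
    counter_pattern="neutral, equally weighted accept/decline wording",
)
print(record.name, "->", record.counter_pattern)
```

The counter_pattern field is my attempt to capture the “counter-patterns” idea from the panel: every deceptive pattern should be paired with a respectful alternative that can be tested against it.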

Another important topic is how law, design, and foresight can work together. Several speakers stress that legal tools and enforcement alone are too slow and reactive to address fast-moving interface manipulation. They argue that designers and product managers hold a lot of power over whether an interface is deceptive or respectful, and that speculative methods can be used to anticipate future harms and design for better outcomes before those harms become normal. This fits directly with my research interest in “effective” ethical design: effectiveness here means not just compliance, but the ability of interfaces to prevent foreseeable harm to users’ data and autonomy.

Speculative design appears in the panel as a practical method, not just an art-school exercise. One example the discussion connects to is the use of speculative enactments and design fiction to help designers explore tensions between business goals and privacy rights. By staging hypothetical interfaces and futures, designers can see how certain patterns might feel manipulative or disloyal before they are deployed at scale. For my thesis, this suggests a concrete technique: using speculative prototypes to make digital footprints and their consequences visible, then inviting users or stakeholders to react to these “what if” scenarios.

The panel also raises a warning: speculative design can become trendy and superficial if it is done without a clear purpose or connection to actual decision-making. For ethical design, this means that speculative scenarios should feed into real processes like data protection impact assessments, design reviews, or pattern libraries, instead of staying as cool concept visuals. This is a useful constraint for my own work: any speculative interface I use in my thesis should be clearly tied to decisions about what data is collected, how consent is handled, and how users see and control their footprints.

For my research, this impulse does three things. First, it nudges me to explicitly frame dark patterns as “disloyal” design choices that work against users’ interests, especially in how their data is captured and used. Second, it shows that privacy-by-design and speculative design can be combined: speculative futures can help define the guardrails and desirable directions for ethical interaction patterns around digital footprints. Third, it highlights that designers and product teams must be at the centre of this work, not just lawyers and regulators, which strengthens my argument that interaction design is a key lever for meaningful digital autonomy.

Some accompanying links:

Here is a link to the full panel video, which serves as the core resource for this impulse and gives the complete discussion on privacy design, dark patterns, and data futures:
https://www.youtube.com/watch?v=BbP_SjtGdkk

This conference program entry and description provide context on how the panel fits into a broader event on privacy and data protection, including its goals and questions around law, design, and foresight:
https://researchportal.vub.be/files/97144098/2022.05.22_CPDP2022.pdf

Finally, this related article on “Rationalizing Dark Patterns” explores how designers themselves rationalize or reproduce dark patterns in privacy UX, and proposes speculative enactments as a tool for more critical, privacy-aware design practice, which aligns well with the panel’s themes and my thesis:
http://www.ijdesign.org/index.php/IJDesign/article/view/4117/972

Disclaimer: This blog post was developed with AI assistance (Perplexity) to help with structuring and phrasing my reflections.