IMPULSE #4: Lunch with Prof. Baumann (with some good Kebap!)

This impulse is a bit different from the others because it is not a book or a talk, but a lunch meeting with Prof. Konrad Baumann that helped me put much sharper edges around my thesis idea. The conversation was essentially my first “real” check-in with someone I hope will supervise my thesis, and it forced me to articulate my motivations and what I actually want to achieve with “effective ethical design” and digital footprints. Instead of staying in my own head, I had to explain why this topic matters to me and where I see it sitting inside UX practice and the wider industry. That alone made this meeting feel like an important impulse.

We started by reconnecting threads from a previous class discussion, where we had talked about our interests in the UX field and the kinds of industry problems we care about. For me, those questions brought back the same themes: ethical design, dark patterns, privacy, and how users are often left in the dark about their data trails. This lunch was like a continuation of that exercise, but one-on-one and more honest. Saying my thesis topic out loud and contextualising it in front of someone with experience in this area made my intentions feel more “real”, and it also exposed where my thinking was still a bit vague or too broad.

I really liked how he brought up concrete cases and pointed me toward resources, including earlier advice I had heard about noyb (short for “none of your business”), a Vienna-based privacy organisation that regularly takes companies to court over data protection violations. These cases are basically “real-life stories” of where digital products and services crossed lines in how they handled user data. That was a helpful reminder that my thesis is not just theoretical; it sits in a landscape where regulators, NGOs, and companies are already fighting over what is acceptable, from tracking to dark patterns to consent models.

Afterwards, Prof. Baumann shared an interesting ORF article that discusses current tensions and developments around privacy and digital rights in Austria and Europe. Even without quoting it directly, the article makes it clear how much is at stake: from weak enforcement to high-profile cases against platforms and tech companies, it shows that “privacy by design” is not just a slogan but something that either happens in concrete interfaces or does not. For my thesis, this is a useful anchor, because it links my academic work to a living context of laws being tested, companies being challenged, and users being affected.

What I take from this impulse is both emotional and structural. Emotionally, it reassures me that I am not chasing a “nice sounding topic” but something that sits at the intersection of UX, law, and real harms users are experiencing. Structurally, it pushes me to frame my thesis more clearly around a few core questions: How can interaction design make digital footprints visible and manageable in everyday interfaces? How can ethical constraints and legal requirements be translated into practical patterns instead of abstract guidelines? And how can designers avoid repeating the kinds of behaviours that end up in complaints, lawsuits, or investigative articles about privacy abuses?

For my next steps, this meeting gives me three concrete moves. First, to keep mapping real cases (like those collected by noyb and highlighted in media coverage) as examples of what “unethical design” looks like in practice, and why better interaction patterns are needed. Second, to use those cases as boundary markers when I prototype: if a pattern smells like something that has already led to a complaint or enforcement, it is a red flag. Third, to stay in close conversation with Prof. Baumann as a supervisor, so that my thesis stays grounded in both design practice and the evolving legal and ethical landscape.

Link to the ORF article Prof. Baumann shared (in German), which anchors this impulse in current debates about privacy and data protection:
https://orf.at/stories/3410746/

For broader context on enforcement and complaints concerning privacy violations in Europe, especially involving companies like Clearview AI, these overviews from Reuters and noyb help show how data misuse is being challenged at a legal level:
https://www.reuters.com/sustainability/society-equity/clearview-ai-faces-criminal-complaint-austria-suspected-privacy-violations
https://noyb.eu/en/criminal-complaint-against-facial-recognition-company-clearview-ai

Finally, this Austrian consumer-focused article on dark patterns and manipulative web design provides a very concrete list of deceptive practices and explains how new regulations like the Digital Services Act aim to limit them, which connects directly back to my thesis interest in ethical interfaces and user autonomy:
https://www.konsumentenfragen.at/konsumentenfragen/Kommunikation_und_Medien/Kommunikation_und_Medien_1/Vorsicht-vor-Dark-Patterns-im-Internet.html

Disclaimer: This blog post was developed with AI assistance (Perplexity) to help with structuring and phrasing my reflections.

IMPULSE #3: Interesting read from the chapter “The Need for Ethics in Design” from The Ethical Design Handbook and how we can effectively implement ethics in our work

I started reading “The Ethical Design Handbook” by Trine Falbe, Martin Michael Frederiksen, and Kim Andersen (one of the very first resources I noted down while gathering material for my thesis topic), and I now treat it as an ongoing, dip-in resource rather than a straight-through textbook. It is framed as a practical guide for leaving dark patterns behind and making ethical design part of everyday digital product work, not just a side note. For my thesis on helping people manage their digital footprints, this book feels like a toolkit I can slowly mine: I can pick the chapters that match my current questions, use them, and then come back later when a new angle opens up.

Right now, I’m chewing a lot on the second chapter, “The Need for Ethics in Design”, because it sets up why ethical design has to be more than simple legal compliance. The authors walk through the consequences of unethical design and show how dark patterns, aggressive tracking, and manipulative interfaces damage trust and harm users. They also introduce ethical principles like non-instrumentalism, self-determination, responsibility, and fairness, and connect them to familiar frameworks such as Privacy by Design. Reading this as preparation for my thesis work is helping me sharpen the language for what bothers me about many current products and services: when they treat people purely as data sources or conversion targets, they break those core principles and undermine users’ ability to understand and shape their digital footprints.

What feels especially useful is how concrete the book tries to be. It is not just “be nice to users” as an abstract value statement; it tries to build an actual working framework, including tools like the Ethical Design Scorecard and “ethical blueprints” for real design processes. The scorecard is meant to assess how a product performs on different ethical dimensions, with weighted criteria. For my thesis, this sparks a very practical idea: I could adapt or extend such a scorecard specifically around footprint-related questions like what data is collected, how transparent the flows are, how easy it is to revoke or change consent, and whether users can see or manage their historical data in meaningful ways.
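To make this more tangible for myself, here is a minimal sketch of how such a footprint-focused scorecard could work in code. Everything in it is my own illustrative assumption: the criteria, weights, and scores are placeholders I invented, not the handbook’s actual scorecard.

```typescript
// Illustrative footprint-focused scorecard, loosely inspired by the book's
// weighted-criteria idea. Criteria, weights, and scores are my own
// placeholder assumptions, not the handbook's actual scorecard.

interface Criterion {
  name: string;
  weight: number; // relative importance; weights should sum to 1.0
  score: number;  // 0 (fails) to 5 (exemplary), assigned by a reviewer
}

const footprintCriteria: Criterion[] = [
  { name: "Data minimisation (only necessary data collected)", weight: 0.3, score: 2 },
  { name: "Transparency of data flows", weight: 0.25, score: 3 },
  { name: "Ease of revoking or changing consent", weight: 0.25, score: 1 },
  { name: "Visibility and control of historical data", weight: 0.2, score: 2 },
];

// Weighted average, normalised to a 0–100 scale.
function scorecardTotal(criteria: Criterion[]): number {
  const weighted = criteria.reduce((sum, c) => sum + c.weight * c.score, 0);
  return (weighted / 5) * 100;
}

console.log(`Footprint score: ${scorecardTotal(footprintCriteria).toFixed(0)}/100`);
```

The weighted-average structure is the interesting part for me: it forces an explicit decision about how much each ethical dimension matters relative to the others, instead of leaving that trade-off implicit.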

This chapter also acknowledges that change has to happen inside teams and businesses, not just in individual designers’ heads. Later parts of the book (which I plan to read next) focus on “creating positive change” and “the business of ethical design”, arguing that ethical practices can be aligned with sustainable business models instead of being framed as a cost. That connects well with my thesis constraint of balancing business needs with user autonomy: if I can borrow some of the arguments and models from these chapters, I can show how ethical digital footprint management is not just “good for users” but also part of a long-term, trust-based product strategy.

As an ongoing read, I see myself using this book in two ways. First, as a language and framework source: the principles and scorecard approach help me structure the “ethical requirements” part of my thesis more clearly. Second, as a bridge to practice: the blueprints and case studies can inform how I approach projects throughout my career, so that my work genuinely supports user agency instead of nudging people into oversharing without giving them effective ways to manage what has already been shared.

Here is the official site for The Ethical Design Handbook, which includes the table of contents, the ethical design scorecard, and downloadable blueprints that expand on the tools discussed in the book:
https://ethicaldesignhandbook.com

Smashing Magazine’s book page gives a good high-level overview of the book’s goals, including how it aims to help teams replace dark patterns with honest patterns while still supporting business KPIs:
https://www.smashingmagazine.com/printed-books/ethical-design-handbook/

Finally, this Smashing Magazine article announcing the handbook’s release explains why the book was written and emphasizes the need for practical, long-lasting solutions to move companies away from manipulative design and towards sustainable, ethical practices:
https://www.smashingmagazine.com/2020/03/ethical-design-handbook-release/

Disclaimer: This blog post was developed with AI assistance (Perplexity) to help with structuring and phrasing my reflections.

11 – Quick Concept Prototype and Speed Dating Session

Early Prototype: Designing the Home Screen for an Information Scrubbing and Management Tool

From Idea to Prototype

For some context, I’m planning to write my thesis on effectively managing our digital footprints on the internet. As part of that, for my latest project work I started sketching out the home screen/dashboard for an information scrubbing tool: a possible mobile app designed to help users find and remove their personal data from the internet with ease. Since privacy management can often feel overwhelming, my goal was to make the interface simple, clean, and user-friendly right from the start.

I created a prototype, exploring the ways users could interact with the tool. Since this is meant to be a mobile app, I focused on layouts that would feel intuitive on a phone screen. The main elements I worked on included:

  • A clear status overview (showing how much data has been found and removed).
  • A quick action button for immediate scanning.
  • Navigation tabs for different privacy tools and settings.

I focused on the layout, content structure, and information hierarchy to see what felt the most natural.
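To pin down what the home screen actually has to display, here is a rough TypeScript model of its state. All names and fields are hypothetical, invented for this sketch rather than taken from the actual prototype.

```typescript
// Hypothetical state model for the home screen; names and fields are
// illustrative, not the actual prototype's data structure.

type ScanStatus = "idle" | "scanning" | "complete";

interface DashboardState {
  status: ScanStatus;
  exposuresFound: number;   // data traces discovered (status overview)
  exposuresRemoved: number; // traces successfully scrubbed
  lastScan: Date | null;
}

// The three navigation areas from the sketch.
type Tab = "home" | "privacyTools" | "settings";

// What the quick action button does: kick off an immediate scan.
function startScan(state: DashboardState): DashboardState {
  return { ...state, status: "scanning", lastScan: new Date() };
}
```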

What I Learned from Testing

After creating the prototype, I brought it to class for testing. The feedback was reassuring—most people understood the purpose of the app right away, with very little explanation. That was a good sign that the design was intuitive. There was also curiosity about what additional features could be included in future iterations, which gave me ideas for expanding its functionality.

Speed Dating and Unexpected Insights

During class, we did a fun rapid feedback session where we shared our prototypes in short, fast-paced rounds. Each person I spoke with provided different perspectives, and I got some valuable insights:

  • People grasped the concept quickly, meaning the layout and flow were already on the right track.
  • They were excited about seeing more features, suggesting that users would appreciate a more in-depth look at what the tool could do beyond just scrubbing data.
  • If my project had a “dating personality,” it would be ‘careful’—which makes sense, given that the app is all about privacy and cautious data management!
  • We were asked to share the most unexpected feedback on our prototypes, and one “date” said the “scan now” button felt like it would launch the camera for a QR code scanner (which means the icon definitely needs some work🤣🤣)

This session helped me validate the direction I was going while also giving me fresh ideas to improve the user experience. Next, I’ll refine the prototype based on this feedback and start thinking about more detailed interactions.

01 – Understanding Data Collection: What’s Tracked and Why It Matters

Introduction: The Hidden World of Data Collection

Every time you shop online, scroll social media, or use apps, you leave behind a trail of data. But what’s being collected, and why does it matter? In this post, we’ll break down the basics of data collection, why companies track you, and how ethical design can help make these processes more transparent.

What Data Is Being Collected?

The types of information collected about you often fall into two main categories:

  1. Personal Information:
    • Data you actively provide, like your name, email, or payment details.
    • Example: Signing up for a streaming service.
  2. Behavioral Data:
    • Your browsing habits, app usage, and even geolocation.
    • Example: An online store tracking your clicks to recommend products.
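As a rough illustration of the difference, the two categories could be modeled as data shapes like the following; the field names are examples I chose, not an authoritative schema:

```typescript
// Personal information: data you actively hand over, e.g. at sign-up.
interface PersonalInformation {
  name: string;
  email: string;
  paymentDetails?: string;
}

// Behavioral data: collected passively as you use a product.
interface BehavioralData {
  pagesVisited: string[];                      // browsing habits
  appSessionsPerDay: number;                   // app usage
  geolocation?: { lat: number; lon: number };  // often gathered in the background
}
```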

How Is Your Data Collected?

Cookies:

  • First-party cookies personalize experiences (e.g., saving your cart).
  • Third-party cookies track you across sites for advertising.
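For a concrete sense of the mechanics, here is what setting a first-party cookie looks like from the browser’s side (the cookie name and value are made up for illustration):

```typescript
// A first-party cookie set by the site itself, e.g. to remember a cart.
document.cookie = "cart_id=abc123; Path=/; SameSite=Lax; Max-Age=604800";

// Third-party cookies work differently: they are set by embedded content
// from another domain (an ad iframe, a tracking pixel), so the same
// tracker can recognise you across every site that embeds it.
```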

Device Fingerprinting:

  • Websites identify your device through unique configurations like screen resolution or browser settings.
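To show the principle, here is a naive fingerprinting sketch that combines a few browser signals and hashes them. Real fingerprinting scripts use far more signals (canvas rendering, installed fonts, audio processing), but the idea is the same:

```typescript
// Naive device fingerprint: join a handful of stable browser signals
// and hash them with SHA-256 (Web Crypto API). The result is often
// unique enough to recognise a device across visits, without cookies.
async function naiveFingerprint(): Promise<string> {
  const signals = [
    screen.width,
    screen.height,
    navigator.userAgent,
    navigator.language,
    new Date().getTimezoneOffset(),
  ].join("|");

  const bytes = new TextEncoder().encode(signals);
  const digest = await crypto.subtle.digest("SHA-256", bytes);
  return Array.from(new Uint8Array(digest))
    .map((b) => b.toString(16).padStart(2, "0"))
    .join("");
}
```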

Social Media Plugins:

  • Share buttons on websites let platforms like Facebook track your activity.

Why Do Companies Collect This Data?

While companies use data for personalization, advertising, and product development, this raises ethical concerns:

  • Lack of Transparency: Most users don’t understand what’s collected or how.
  • Security Risks: Data breaches can expose sensitive information.
  • Informed Consent: Legal jargon in terms of service often obscures what users are agreeing to.

How can ethical design make data collection more transparent and user-friendly?

For designers, creating transparent and user-friendly interfaces is critical to addressing privacy concerns. Ethical design not only builds trust but also aligns with global privacy laws like GDPR and CCPA.
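One small way to think about this in design terms: consent could be modeled per purpose, explained in plain language, and made revocable with the same ease as granting it. The sketch below is purely my own illustration; the purpose names and fields are assumptions, not anything mandated by GDPR or CCPA.

```typescript
// Illustrative consent model: one record per purpose, each individually
// grantable and revocable, with a plain-language summary instead of
// legal jargon. Purpose names and fields are my own examples.
type Purpose = "analytics" | "personalisation" | "advertising";

interface ConsentRecord {
  purpose: Purpose;
  granted: boolean;
  updatedAt: Date;
  plainLanguageSummary: string; // what this purpose means, in one sentence
}

// Revoking should be as easy as granting: one call, immediate effect.
function setConsent(records: ConsentRecord[], purpose: Purpose, granted: boolean): ConsentRecord[] {
  return records.map((r) =>
    r.purpose === purpose ? { ...r, granted, updatedAt: new Date() } : r
  );
}
```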

Challenges and Motivation

Challenges:

  • Understanding complex data flows between companies.
  • Educating users who lack technical knowledge.
  • Encouraging companies to adopt ethical practices.

Motivation: My frustrations with unclear privacy settings inspire me to design solutions that prioritize user control and clarity.

Next steps could include:

  • Prototyping solutions for clearer data collection practices.
  • Analyzing current privacy tools and consent mechanisms.
  • Conducting user interviews to understand behaviors and perceptions.

In the next post, I will dive deeper into what a Digital Footprint is.

Cheers!