WebExpo Conference Day 2: Designing for Security in Crypto – Markéta’s Winning Formula

On Day 2, I listened to a really interesting session by Markéta Kaizlerová called “High Stakes Flows: Designing for Security and Crypto’s Unique Challenges.” The talk focused on how to help people protect their crypto using better onboarding, especially when it comes to something as important as setting up a passphrase.

Her team’s main idea was to build an onboarding process that teaches users how serious and important their passphrase is. They started by using clear content and simple words to explain why it matters, then added visuals later to make things feel smoother and more friendly.

While that approach helped them communicate the message, I personally think it could be a problem for users who have low vision or struggle with reading. Depending mostly on written content might leave some people behind, especially when visual support comes too late in the process.

Another thing they ran into was confusion around the terms they used. In the crypto space, a lot of words already sound complicated, and trying to explain them during onboarding made things even more confusing. It also didn’t help that the team was trying to do too many things at once. They had to simplify their goals and guide people step by step, like a wizard-style flow.

One lesson I found really useful was how they set clear educational goals. They knew exactly what they wanted users to learn at each stage, which made the whole process easier to test and improve. It also helped them stay focused during development. Kaizlerová even said that you don’t always need a dedicated content writer if you keep your goals simple and test your designs regularly.

She also talked about how not everyone will finish the onboarding flow. That’s totally normal, and instead of seeing it as a failure, they planned for it. They designed clear ways for people to exit the flow if they weren’t ready to go through with it. I liked that idea a lot because it shows respect for users and avoids pushing them too hard.

The biggest takeaway for me was how they tried to balance two important things: making the experience easy to use while still being secure. In crypto, that’s a real challenge. You want to teach users without overwhelming them, and you want to build trust without making it all feel too technical.

WebExpo Conference Day 1 – Understanding Users Through the Jobs to Be Done Framework by Martina Klimešová

On Day 1 I attended the session by Martina Klimešová, and it focused on the Jobs to Be Done (JTBD) framework. This session was a solid introduction to a tool that helps designers and product teams understand what users are really trying to achieve when they use a product.

The key idea behind JTBD is pretty straightforward: people don’t care that much about the tool itself. What they care about is getting something done. In other words, people “hire” products to complete specific jobs in their lives. If the product does the job well, they keep using it. If it doesn’t, they “fire” it and move on to something else.

She walked us through the process of using JTBD in a real design workflow. It usually starts by defining a clear focus. After that, you conduct interviews with users to find out what jobs they’re trying to get done. From there, you analyze the interviews, cluster the insights, define the jobs clearly, and then create a final “Job Map.”

Job Maps were one of the most interesting parts of the talk for me. A Job Map shows all the steps a user goes through to complete a task. This helps designers figure out where features are actually needed, instead of guessing. It’s also a great way to build empathy with users because it shows you how they really think and feel while trying to get something done.
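
To make the idea concrete for myself, here is a rough sketch of how a Job Map could be captured as a simple data structure. The names and fields are my own interpretation of what she described, not anything shown in the talk.

```ts
// My own rough interpretation of a Job Map, not a format from the talk.
// Each step records what the user is trying to achieve and where it hurts,
// which is where feature opportunities tend to show up.
interface JobStep {
  name: string;            // e.g. "Collect spending data"
  userGoal: string;        // what the user wants out of this step
  painPoints: string[];    // friction observed in real interviews
  opportunities: string[]; // feature ideas tied to real pain, not guesses
}

interface JobMap {
  job: string;             // the "job" the user hires the product for
  steps: JobStep[];
}

const budgetingJobMap: JobMap = {
  job: "Keep my monthly spending under control",
  steps: [
    {
      name: "Collect spending data",
      userGoal: "See everything I spent this month in one place",
      painPoints: ["Exports from two banks use different formats"],
      opportunities: ["Automatic import that normalizes both formats"],
    },
  ],
};
```

Writing it down this way also makes it obvious why Job Maps age well: nothing in the structure depends on a specific tool or platform, which matches her point about Job Maps staying relevant as technology changes.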

One thing she also pointed out was how Job Maps often work better than personas. She explained that personas are not always based on real people. Sometimes, teams spend time designing for a “user” that doesn’t actually exist. You can build a great product for a made-up person, but that doesn’t help real users. Job Maps avoid this problem by focusing on real tasks and real pain points.

Some other strengths of Job Maps she mentioned:

  • They are more flexible than personas.
  • They are based on real behavior, not guesses or stereotypes.
  • They don’t depend on specific tools or platforms.
  • They stay relevant over time, even if technology changes.

Overall, this talk gave me a better way to think about user needs. Instead of just asking who the user is, JTBD asks what the user is trying to achieve. That small shift in thinking can change everything — from the way we design features to how we test and prioritize them.

If you’re working on a product and want to make sure you’re solving real problems, not just designing for made-up characters, the Jobs to Be Done framework is a great place to start. This was a great session that reminded me why listening to users and focusing on their goals is always the right move.

12 Tapping into the Beat: Thoughts on the dB Drummer Bot (a project by Çağrı Erdem and Carsten Griwodz)

This is a review of dB: A Web-based Drummer Bot for Finger-Tapping, a project by Çağrı Erdem and Carsten Griwodz. You can find more info about the project here, and a link to the paper can be found here.

This paper introduces dB, a really cool web-based tool that lets you create drum grooves just by tapping on your computer keyboard. Think of it as a drummer bot powered by artificial intelligence that takes your simple finger taps and turns them into more complex rhythms. The idea is to make music creation more accessible to everyone, even if you don’t have a musical background.

What I find particularly interesting about dB is its focus on how our bodies are involved in music. The researchers recognize that music isn’t just in our heads; it’s something we feel and move to. By using finger-tapping as the main way to interact with the AI, they’re exploring this connection in a simple way. The paper also highlights the importance of “groove,” that irresistible urge to move with the music, and how dB tries to tap into that.
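
Purely to illustrate the kind of interaction involved, here is a minimal browser-side sketch of capturing spacebar tap timings. This is my own simplification, not the authors' code, and the names are hypothetical.

```ts
// Minimal illustration (not the authors' code): record the times of
// spacebar taps so they can later be turned into a rhythmic pattern.
const tapTimesMs: number[] = [];

window.addEventListener("keydown", (event) => {
  // Ignore key auto-repeat so holding the spacebar doesn't flood the list.
  if (event.code === "Space" && !event.repeat) {
    tapTimesMs.push(performance.now());
  }
});

// The gaps between consecutive taps (inter-onset intervals) carry the rhythm.
function interOnsetIntervals(times: number[]): number[] {
  return times.slice(1).map((t, i) => t - times[i]);
}
```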

Another great aspect of this project is the effort put into understanding how people actually use and feel about the system. The researchers conducted a user study to see if people felt bored, happy, in control, or tired while using dB. They found that when the AI introduced more randomness and variation into the drum patterns, users tended to be more engaged and less bored. This suggests that a bit of surprise can make the music-making experience more fun. Plus, the fact that they’ve made the code and the music data they used publicly available is a big win for open research.

However, like any project, there are some areas that could be looked at more closely. One thing that stands out is the reliance on just finger-tapping on a computer keyboard. While this makes it very accessible, one participant in the study mentioned the lack of “high-resolution” in the interaction. You can imagine that tapping a spacebar might not give you the same nuanced control as playing actual drums or even a more specialized musical interface. The paper itself acknowledges this “bottleneck” and its potential impact on the feeling of control.

Also, the AI model was trained on a specific type of music: eighth-note beats common in rock and heavy metal, in a 4/4 time signature. While this was a deliberate choice for the study, it might mean that dB is better at generating certain kinds of grooves than others. It would be interesting to see how it performs with different musical styles and time signatures.
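
Since the system works with eighth-note patterns in 4/4, one way to picture the preprocessing is snapping raw tap times onto an eighth-note grid. Again, this is only my own sketch of the general idea, with assumed tempo handling, not the paper's implementation.

```ts
// Rough sketch: snap tap times (measured from the start of a bar) onto an
// eighth-note grid in 4/4, i.e. eight slots per bar at a known tempo.
function quantizeToEighthGrid(tapTimesMs: number[], bpm: number): number[] {
  const beatMs = 60000 / bpm;  // duration of one quarter note
  const slotMs = beatMs / 2;   // an eighth note is half a beat
  const slotsPerBar = 8;
  const grid = new Array<number>(slotsPerBar).fill(0);

  for (const t of tapTimesMs) {
    const slot = Math.round(t / slotMs) % slotsPerBar; // nearest eighth, wrapped into one bar
    grid[slot] = 1;                                    // mark the slot as a hit
  }
  return grid; // e.g. [1, 0, 1, 0, 1, 0, 1, 0] for taps on each quarter-note beat
}
```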

The paper also mentions that there aren’t great ways to measure how “good” AI-generated music is in terms of how humans actually perceive it. They used mathematical calculations to train the model, but understanding how these calculations relate to what sounds good to our ears is still a challenge in AI music research.

Finally, the study found that many users didn’t feel particularly “skillful” while using dB. This might point to a need to find a better balance between the AI’s surprises and the user’s sense of ownership and control over the musical output.

Overall, the dB project is a fascinating exploration into making music creation more accessible through AI and simple bodily interactions. The user study provides valuable insights into what makes these kinds of interfaces engaging. While there are limitations, particularly in the interaction method and the scope of musical styles, dB lays a solid foundation for future research in human-AI musical collaboration. It makes you think about how even simple actions can be transformed into something musically interesting with the help of intelligent systems.

11 Quick Concept Prototype and Speed Dating Session

Early Prototype: Designing the Home Screen for an Information Scrubbing and Management Tool

From Idea to Prototype

For my latest project work, I started sketching out the home screen/dashboard for an information scrubbing tool: a possible mobile app designed to help users find and remove their personal data from the internet with ease. For some context, I’m planning to work on a thesis about effectively managing our digital footprints online, and this prototype is part of that. Since privacy management can often feel overwhelming, my goal was to make the interface simple, clean, and user-friendly right from the start.

I created a prototype, exploring the ways users could interact with the tool. Since this is meant to be a mobile app, I focused on layouts that would feel intuitive on a phone screen. The main elements I worked on included:

  • A clear status overview (showing how much data has been found and removed).
  • A quick action button for immediate scanning.
  • Navigation tabs for different privacy tools and settings.

I focused on the layout, content structure, and information hierarchy to see what felt the most natural.
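
As a rough way to pin down the structure from the list above, here is a small sketch of the home screen’s information hierarchy. All names and numbers are hypothetical placeholders, not final design decisions.

```ts
// Hypothetical sketch of the home screen's information hierarchy.
interface HomeScreen {
  statusOverview: {
    itemsFound: number;    // pieces of personal data discovered online
    itemsRemoved: number;  // pieces already scrubbed
  };
  quickAction: { label: string; onPress: () => void }; // the "Scan now" button
  tabs: Array<"Home" | "Privacy Tools" | "Settings">;  // bottom navigation
}

// Placeholder for whatever the real scan flow would eventually be.
function startScan(): void {
  console.log("Starting a new scan…");
}

const homeScreen: HomeScreen = {
  statusOverview: { itemsFound: 23, itemsRemoved: 17 }, // placeholder numbers
  quickAction: { label: "Scan now", onPress: startScan },
  tabs: ["Home", "Privacy Tools", "Settings"],
};
```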

What I Learned from Testing

After creating the prototype, I brought it to class for testing. The feedback was reassuring—most people understood the purpose of the app right away, with very little explanation. That was a good sign that the design was intuitive. There was also curiosity about what additional features could be included in future iterations, which gave me ideas for expanding its functionality.

Speed Dating and Unexpected Insights

During class, we did a fun rapid feedback session where we shared our prototypes in short, fast-paced rounds. Each person I spoke with provided different perspectives, and I got some valuable insights:

  • People grasped the concept quickly, meaning the layout and flow were already on the right track.
  • They were excited about seeing more features, suggesting that users would appreciate a more in-depth look at what the tool could do beyond just scrubbing data.
  • If my project had a “dating personality,” it would be ‘careful’—which makes sense, given that the app is all about privacy and cautious data management!
  • We were asked to share the most unexpected feedback on our prototypes, and one date mentioned that the “scan now” button felt like it would launch the camera for a QR code scanner (which means the icon definitely needs some work🤣🤣)

This session helped me validate the direction I was going while also giving me fresh ideas to improve the user experience. Next, I’ll refine the prototype based on this feedback and start thinking about more detailed interactions.

10 The Future of Ethical Design: Creating a Privacy-First Culture

Introduction

We’ve come to the final post in this series, but the journey toward ethical design and better privacy practices is far from over. Throughout these posts, we’ve explored the challenges, strategies, and opportunities involved in helping users manage their digital footprints. Now, it’s time to reflect on the lessons learned and outline a vision for building a privacy-first culture—one where ethical design becomes the standard, not the exception.

Building a Privacy-First Culture

Creating a privacy-first culture requires effort from both users and companies. For users, education and tools are key to reclaiming control. For companies, ethical design and compliance must be woven into every interaction. The goal is to align user empowerment with business practices, ensuring trust is a central feature of every platform.

Key Principles for the Future of Privacy Design

  1. Transparency: Companies must clearly communicate how and why data is collected. Example: Platforms that display real-time data usage dashboards, as discussed in earlier posts, make data practices visible and actionable.
  2. Simplicity: Privacy controls should be easy to find and use, especially for vulnerable populations. Example: Large, well-labeled toggles for key permissions, like tracking or sharing.
  3. User Empowerment: Tools that simplify complex privacy tasks are essential. Example: The proposed scrubbing tool could automate data removal, making it easier for users to reduce their digital footprint.

The Role of the Proposed Solutions

Whether it’s a framework that guides companies toward ethical practices or a tool that helps individuals scrub their data from the internet, the real power of these solutions lies in their ability to make privacy accessible. These ideas aren’t about adding extra steps—they’re about creating thoughtful designs that integrate privacy into the user experience seamlessly.

Challenges and Opportunities Ahead

  1. Balancing Business and Privacy: Companies may hesitate to adopt privacy-first practices if they feel it conflicts with profit. However, studies show users are more loyal to brands they trust.
  2. Technological Complexity: Emerging tools like AI and blockchain offer solutions but also bring steep learning curves for developers and users.
  3. Global Alignment: With privacy laws differing across regions, creating solutions that work globally will require careful planning.

Why This Journey Matters

For me, this project has been about more than privacy settings or policies—it’s been about empowering people and aiming to solve a problem I have personally encountered. In a digital world where users often feel powerless, ethical design can restore agency and rebuild trust. It’s not just a technical challenge; it’s a moral responsibility for designers, developers, and companies.

A Call to Action

To companies: Commit to transparency and respect for user data. To users: Stay informed and advocate for your rights. And to designers like myself: Keep pushing for solutions that prioritize people over profits. Together, we can make privacy-first a global standard.

Closing Thoughts

This series has been an exploration of how we can design tools and systems that respect and protect digital footprints. From the history of data privacy to actionable strategies and emerging trends, the path forward is clear: ethical design must guide the future of digital interactions. Thank you for following along on this journey. Let’s continue building a world where privacy isn’t a luxury but a fundamental right.

It’s been utterly enjoyable working on this!

09 Educating and Empowering Users: Privacy Beyond Settings

Introduction

Privacy tools and settings are only part of the solution. True empowerment comes when users understand their digital footprints and feel confident managing them. This post focuses on how education and design can work together to make privacy accessible, building trust and encouraging better digital habits. These insights are key to shaping my proposed solutions.

Why Education is Essential

Privacy issues are often complex, and many users feel overwhelmed or unaware of their choices. Example: A survey by Pew Research Center found that nearly 45% of users don’t fully understand how their data is collected or used online. Without education, even the best tools or settings can go unused. Education bridges the gap between awareness and action, giving users the confidence to take control of their data.

Strategies to Educate and Empower

  1. Visual Explanations: Use infographics, videos, or step-by-step guides to simplify privacy concepts. Example: A short animation explaining what cookies are and how to manage them effectively.
  2. Gamification: Encourage users to engage with privacy education through interactive challenges. Example: A quiz-style game where users learn to identify risky behaviors or optimize their privacy settings.
  3. Embedded Guidance: Integrate tips and tutorials directly into platforms. Example: A pop-up tip explaining how to adjust data-sharing preferences when a user sets up a new app.
  4. Feedback Mechanisms: Let users see the impact of their actions. Example: A dashboard showing how many tracking cookies have been blocked or deleted after activating a privacy tool.
[Image: Cookie banners and pop-ups should have clearer, more explicitly explained information]
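
To make the feedback-mechanism idea (point 4 above) a bit more tangible, here is a tiny sketch of how a dashboard counter could reflect blocked trackers. The element ID and function name are hypothetical.

```ts
// Sketch of a feedback mechanism: count blocked trackers and show the
// running total so users can see the impact of their privacy choices.
let blockedTrackerCount = 0;

function recordBlockedTracker(): void {
  blockedTrackerCount += 1;
  const counter = document.getElementById("blocked-counter"); // hypothetical element
  if (counter) {
    counter.textContent = `${blockedTrackerCount} tracking requests blocked this week`;
  }
}
```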

How can design and education work together to empower users in managing their digital footprints?

  • What educational tools or techniques are most effective for teaching privacy concepts?
  • How can platforms encourage proactive behavior without overwhelming users?

Design Implications for User Education

  1. Clarity and Simplicity: Educational content should avoid jargon and focus on actionable advice. Example: Using plain language like “This setting stops apps from tracking your location” instead of legal terminology.
  2. Interactive Elements: Users are more likely to retain information through hands-on interaction. Example: An interactive tutorial that guides users through their privacy settings step-by-step.
  3. Personalization: Tailor educational content to user needs. Example: For casual users, focus on the basics; for tech-savvy users, provide advanced privacy tips.
[Image: Introducing a “Privacy Setup” step as part of onboarding in different apps]

Challenges in Privacy Education

  1. Low Engagement: Users often skip educational content because they find it boring or unnecessary.
  2. Complexity of Concepts: Explaining technical topics like data encryption or cookies in simple terms is challenging.
  3. Skepticism: Some users may distrust educational efforts if they perceive them as self-serving or overly complicated.

Tying It to the Proposed Solutions

The idea of embedding education into digital tools aligns directly with the proposed scrubbing tool. For example, the tool could feature a built-in tutorial explaining what happens when personal data is shared online and how removing it impacts privacy. Similarly, the privacy framework could include guidelines for platforms to provide educational prompts during key interactions, such as account setup or when sharing sensitive information.

08 Emerging Trends in Privacy and Data Management: What’s Next?

Introduction

The landscape of privacy and data management is constantly evolving. From the rise of decentralized data systems to advancements in AI-powered privacy tools, staying informed about emerging trends is essential for creating future-ready designs. In this post, I’ll explore these trends and their implications for ethical design.

Emerging Trends Shaping the Future of Privacy

  1. Decentralized Data Management: Technologies like blockchain are enabling users to store and control their data independently, reducing reliance on centralized platforms. Example: Platforms like Solid (by Sir Tim Berners-Lee) give users full control over their data pods, allowing selective sharing.
  2. AI-Powered Privacy Tools: Artificial intelligence is being used to identify privacy risks and automate data management. Example: Jumbo Privacy App scans your accounts, recommends privacy settings, and automates actions like deleting old posts.
  3. Privacy-Enhancing Technologies (PETs): Tools such as differential privacy, homomorphic encryption, and federated learning enable data analysis without compromising user privacy. Example: Apple’s differential privacy techniques allow them to collect usage trends without identifying individual users (a small sketch of this idea follows this list).
  4. Legislative Momentum: New laws like Brazil’s LGPD and India’s DPDP Act are expanding global privacy standards, pushing companies to prioritize user data protection.
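
To make the PETs point above a little more concrete, here is a simplified, textbook-style sketch of the core trick behind differential privacy: adding calibrated random noise to a count before it leaves the device, so aggregate trends survive while individual answers stay hidden. This is my own illustration, not Apple’s actual mechanism.

```ts
// Textbook-style sketch of differential privacy: add Laplace noise,
// scaled by sensitivity / epsilon, to a count before sharing it.
function laplaceNoise(scale: number): number {
  const u = Math.random() - 0.5; // uniform in (-0.5, 0.5)
  return -scale * Math.sign(u) * Math.log(1 - 2 * Math.abs(u));
}

function privatizedCount(trueCount: number, epsilon: number): number {
  const sensitivity = 1; // one person changes a count by at most 1
  return trueCount + laplaceNoise(sensitivity / epsilon);
}

// Smaller epsilon means more noise: stronger privacy, less accuracy.
console.log(privatizedCount(1024, 0.5));
```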

How can emerging privacy trends shape the design of tools and frameworks that empower users to manage their digital footprints?

  • How can decentralized technologies and PETs be integrated into practical user tools?
  • What role does legislation play in influencing design practices?

Design Implications of These Trends

  1. Incorporating Decentralization: Designers should consider how platforms can allow users to store data locally or use blockchain to share information securely. Example: A prototype privacy tool could use a decentralized network to manage opt-out requests without relying on third parties.
  2. Leveraging AI for User Empowerment: AI can simplify complex tasks like identifying where personal data exists or automating deletion requests. Example: An AI-driven privacy dashboard that highlights vulnerabilities and recommends actionable steps.
  3. Embedding PETs into Design: Designers can use privacy-enhancing technologies to build trust. Example: A visualization tool showing anonymized data usage in real time.
  4. Adapting to Laws: Incorporating compliance into the user experience ensures platforms meet legal standards while simplifying the process for users. Example: Pre-designed templates for GDPR-compliant consent forms.

Challenges in Adopting Emerging Trends

  1. Technical Complexity: Decentralization and PETs often require advanced infrastructure, making adoption challenging for smaller organizations.
  2. User Education: Explaining complex concepts like differential privacy or blockchain to users in simple terms can be difficult.
  3. Corporate Resistance: Companies may resist adopting PETs or decentralized models due to reduced control over user data.

Relevance to Thesis

Integrating these trends ensures that platforms are forward-looking and adaptable to future technologies and regulations. For instance, the data scrubbing tool I’m proposing could use AI to automate data deletion or blockchain to enhance data security, aligning with global privacy standards.

07 Designing for Vulnerable Populations: Privacy for Everyone

Introduction

Not all users interact with digital platforms in the same way. Vulnerable populations, such as children, the elderly, and those with limited technological literacy, often face unique challenges in managing their digital footprints. This post explores how privacy tools and frameworks can address these diverse needs, ensuring inclusivity and accessibility. These considerations will play a significant role in refining my thesis goals of creating possible solutions that work for everyone.

Why Vulnerable Populations Need Special Consideration

Certain groups are more susceptible to privacy risks due to limited understanding or access to tools:

  1. Children: Often unaware of data tracking, making them targets for ads or manipulative designs. Example: Gaming apps that collect location data without parental consent.
  2. Elderly Users: Many find privacy tools overwhelming or confusing, leaving them exposed to scams or data misuse.
  3. Low-Literacy or Non-Tech-Savvy Users: Struggle with complex settings or opaque terms of service, leading to accidental oversharing.

How can privacy tools and frameworks be designed to accommodate the unique needs of vulnerable populations?

  • What barriers prevent vulnerable groups from effectively managing their digital footprints?
  • How can accessibility principles improve privacy tool design?

Design Strategies for Inclusive Privacy Tools

  1. Simplified Interfaces: Prioritize clean layouts and clear labels. Example: A single dashboard with large buttons for enabling/disabling permissions (e.g., “Stop Location Sharing”).
  2. Parental Controls: Design features that empower parents to manage their children’s digital activity. Example: Tools that notify parents about apps collecting sensitive data.
  3. Educational Content: Embed tutorials or interactive guides that explain privacy concepts in simple terms. Example: A short video explaining what cookies are and how to manage them.
  4. Localization and Accessibility: Ensure tools are available in multiple languages and compatible with assistive technologies. Example: Text-to-speech options for visually impaired users.

These strategies will be incorporated into the possible solutions I will develop to ensure inclusivity is a core component.

Challenges in Addressing Vulnerable Populations’ Needs

  1. Diverse Requirements: Balancing simplicity with functionality to meet varied user needs.
  2. Awareness Gaps: Educating users about privacy risks without overwhelming them.
  3. Compliance with Laws: Adhering to regulations like COPPA (Children’s Online Privacy Protection Act) and ADA (Americans with Disabilities Act).

Relevance to My Thesis Goals

Inclusivity is central to my work. By addressing the needs of vulnerable users, I can ensure the solutions I propose are effective for a wider audience. These insights will help shape design guidelines that prioritize accessibility and equity, making privacy tools genuinely universal.

06 Transparency in Data Use: Building Trust Through Clear Communication

Introduction

Trust is the foundation of any user-platform relationship, and transparency is the key to earning it. Users need to know what data is being collected, why, and how it’s being used. In this post, I’ll explore how clear communication about data use can strengthen user trust and discuss practical design strategies for achieving transparency. These insights will inform my thesis objectives: creating a Privacy Framework for companies and prototyping a tool for managing personal data online.

Why Transparency Matters

Transparency transforms uncertainty into trust. When users understand how their data is used, they’re more likely to engage with a platform. Without it, users feel manipulated, leading to distrust and disengagement. Example: Many users became wary of Facebook after the Cambridge Analytica scandal because the platform failed to communicate how user data was being shared and exploited.

Key Elements of Transparent Data Use

  1. Clarity: Use plain language to explain data practices. Example: Replace “We may collect certain information to enhance services” with “We use your email to send weekly updates.”
  2. Visibility: Make privacy policies and settings easy to find. Example: A single-click link labeled “Your Data Settings” at the top of a webpage.
  3. Real-Time Feedback: Show users how their data is being used in real time. Example: A privacy dashboard that displays which apps or services are currently accessing your location.
[Image: Possible transparency settings that could be introduced by companies]

Case Studies of Transparency in Action

  1. Apple’s Privacy Nutrition Labels: These labels show, at a glance, what data an app collects and how it is used, simplifying complex privacy policies into digestible bits of information.
  2. Google’s My Activity Dashboard: Google allows users to view and manage their activity data, offering options to delete or limit collection.
  3. noyb.eu’s Advocacy Work: By challenging platforms that obscure their data use, noyb has pushed for greater clarity and compliance with GDPR.

These examples demonstrate how transparency fosters trust and aligns with ethical design principles.

[Image: Apple lets you know what data is being used. Source: Adjust]
[Image: Google’s “My Activity” section shows relevant info]

How can design effectively communicate data use to build trust and ensure transparency?

  • What visual and interactive elements improve users’ understanding of data use?
  • How can transparency features integrate seamlessly into existing platforms?

Designing for Transparency

To achieve transparency, platforms can:

  1. Integrate Visual Feedback: Use graphics, charts, or icons to explain data use. Example: A pie chart showing how much of your data is used for ads vs. analytics.
  2. Streamline Privacy Policies: Provide short, bulleted summaries of key data practices. Example: “We collect: your email for updates, your location for recommendations, and your browsing history for ads.”
  3. Offer Customization: Allow users to adjust permissions directly. Example: Toggles for enabling/disabling specific data categories like tracking or personalization.
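
Tying these three points together, here is a small sketch of the state that could sit behind such a settings screen: one toggle per data category, each paired with a plain-language explanation. The category names and wording are hypothetical.

```ts
// Hypothetical sketch of per-category consent behind a privacy settings screen.
type DataCategory = "ads" | "analytics" | "location" | "personalization";

interface ConsentSetting {
  category: DataCategory;
  enabled: boolean;
  plainLanguage: string; // shown next to the toggle instead of legal text
}

const consent: ConsentSetting[] = [
  { category: "ads", enabled: false, plainLanguage: "Use my data to choose which ads I see" },
  { category: "location", enabled: false, plainLanguage: "Use my location for recommendations" },
];

function setConsent(category: DataCategory, enabled: boolean): void {
  const entry = consent.find((c) => c.category === category);
  if (entry) {
    entry.enabled = enabled; // the UI can re-render from this state for real-time feedback
  }
}
```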

These approaches will also inform the Privacy Framework I’m developing, ensuring it includes actionable guidelines for platforms to improve data transparency.

Challenges and Personal Motivation

Transparency isn’t always easy to achieve. Challenges include balancing clarity with detail, overcoming user distrust, and addressing corporate reluctance to reveal data practices. However, I’m motivated by the potential to create tools and frameworks that make transparency accessible and actionable for users and companies alike.

05 Designing Privacy-Centric User Experiences: Case Studies and Practical Insights

Introduction

Creating platforms that respect user privacy isn’t just a moral obligation; it’s a necessity in today’s data-driven world. In this post, I focus on designing privacy-centric user experiences, showcasing real-world case studies and exploring actionable design strategies. These insights will directly inform my thesis goals of developing a framework for companies and prototyping a simple privacy tool to empower users to manage their digital footprints more effectively.

What Makes a Privacy-Centric User Experience?

A privacy-centric experience ensures that users are informed, in control, and confident about their data. It prioritizes transparency, simplicity, and respect for user consent while avoiding deceptive practices. This means:

  1. Clarity: Clear communication about what data is collected and why.
  2. Control: Tools that allow users to customize their privacy preferences easily.
  3. Trust: Ethical practices that build long-term confidence.

Example: Apple’s App Tracking Transparency feature asks users if they want to allow tracking, giving them a clear choice with simple language.

Case Studies of Privacy-Centric Platforms

  1. Signal (Messaging App): Signal prioritizes privacy by offering end-to-end encryption and collecting minimal metadata. Users trust Signal because it’s transparent about its data collection policies—essentially none—and offers simple privacy controls.
  2. DuckDuckGo (Search Engine): Unlike Google, DuckDuckGo doesn’t track users or store personal information. Its clean interface and privacy-first branding make it a favorite for those seeking anonymity.
  3. Joindeleteme.com (Data Removal Tool): This tool simplifies the process of removing personal data from online platforms, offering a user-friendly experience with automated data removal requests.

How Do These Examples Inform Design Practices?

These platforms succeed by embedding privacy into the user experience, demonstrating best practices for designers:

  1. Default Privacy: Assume users want to opt out of tracking. Signal doesn’t track by default, removing the burden of choice.
  2. Simplified Consent: Make choices clear and accessible. DuckDuckGo eliminates tracking entirely, so users don’t need to worry about settings.
  3. Automation: Joindeleteme.com automates repetitive tasks, minimizing user effort while maintaining control.

How can design principles from successful privacy-centric platforms be adapted into frameworks or tools for broader use?

  • What features of these platforms are most effective at fostering user trust?
  • How can automation and default settings simplify privacy management for users?

Designing a Framework for Companies

To guide companies in creating privacy-centric platforms, a framework should include:

  1. Transparency Guidelines: Require clear communication about data collection. Example: Dashboards showing what data is collected in real time.
  2. User Empowerment: Include tools that allow users to opt out of tracking with a single click. Example: Privacy toggles for ads, location tracking, and analytics.
  3. Ethical Compliance Checks: Provide a checklist for meeting GDPR and other privacy laws. Example: Assessing whether consent dialogs meet “informed consent” criteria.
[Image: Possible permissions users could control if companies implement Privacy Transparency Guidelines]

Designing a Prototype Privacy Tool

Inspired by joindeleteme.com, the proposed tool could:

  1. Identify Data Sources: Help users find where their personal information exists online.
  2. Simplify Requests: Automate data deletion requests based on privacy laws like GDPR (a rough sketch of such a request appears below).
  3. Track Progress: Provide real-time updates on data removal processes for transparency.
[Image: Mockup of a possible view of the tool]
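
As a very rough sketch of the “Simplify Requests” step, the tool could assemble an erasure request from a template like the one below. This is my own illustration; real requests would need legal review and broker-specific formats and delivery channels.

```ts
// Rough sketch only: build a GDPR Article 17 ("right to erasure") request
// from a template. Real requests would need legal review and per-broker handling.
interface ErasureRequest {
  brokerName: string;
  userFullName: string;
  userEmail: string;
}

function buildErasureRequest(req: ErasureRequest): string {
  return [
    `To: ${req.brokerName}`,
    "Subject: Request for erasure of personal data (GDPR Article 17)",
    "",
    `I, ${req.userFullName}, request the erasure of all personal data you hold`,
    `about me, associated with the email address ${req.userEmail}.`,
    "Please confirm completion within the statutory time limit.",
  ].join("\n");
}

console.log(buildErasureRequest({
  brokerName: "Example Data Broker",
  userFullName: "Jane Doe",
  userEmail: "jane@example.com",
}));
```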

Challenges in Execution

  1. Data Mapping Complexity: Identifying all the platforms where a user’s data exists is a significant technical hurdle.
  2. User Trust: Convincing users to trust the tool with sensitive data requires a flawless UX and a proven ethical stance.
  3. Corporate Pushback: Companies reliant on data monetization may resist the widespread adoption of privacy frameworks.

My Personal Motivation

The more I explore privacy-first platforms, the more I realize how empowering ethical design can be. I’m motivated to create solutions that reduce friction for users while making privacy the norm rather than the exception.