👩🏽‍💻 WebExpo Conference: Survival kit for advertising jungle by Kateřina Huňová & Vladimír Zikmund

This talk stood out with its metaphorical yet practical framework: surviving the modern advertising jungle. Kateřina Huňová and Vladimír Zikmund offered 10 sharp, memorable tips that went beyond theory, highlighting real campaigns, missteps, and surprisingly simple creative ideas.

Here’s the survival kit they proposed:

1. Get your survival kit
Know your brand, your product or service, and, most importantly, your audience. Messaging only works when it aligns with identity. Ryanair’s chaotic memes and British Airways’ premium tone couldn’t be swapped. Know who you are, and stay in your lane.

2. Enter with courage
Courage in advertising can mean budget bravery (like Lay’s going all-in on football and music) or daring to be different. Kaufland’s idea to hand out ice packs of carrots to hockey players is absurd, and that’s exactly why it worked. It was cheap, simple, and memorable.

3. Hunt one animal
Focus on one thing: one product, one message, one feeling. Trying to do everything results in nothing. Klarna’s “smooth fish” campaign was absurd but effective. The Ordinary, too, owns its scientific tone with brutally plain product names. Simplicity is power.

4. Stay on the path
Consistency and integration are critical. Skoda’s visual metaphor using plus and minus signs doubled campaign awareness. Long-term consistency, like Snickers’ “You’re not you when you’re hungry,” builds recognition and emotional memory.

5. Take a buddy
Mascots work, whether they’re cute (like Duolingo’s owl), absurd (like the panda who gets mad if you say no to milk), or even annoying. If your tone is strong and consistent, your audience will remember it.

6. Climb the tree for better perspective
Think differently. IBM made clever physical installations to demonstrate smart ideas. Jeep used unexpected ad placements (like bizarre parking spots). Dog food brands made frisbees shaped like gym weights. Unexpected formats create attention.

7. Follow the river flow
Trend moments are fast and short-lived, like “brat summer” or meme formats. You can ride them, but don’t rely on them. Heineken’s flippable phone device for distraction-free cheering was a brilliant trend-relevant product.

8. Cooperate with indigenous people
Influencers can help, but only when they truly fit. Jeremy Allen White for Calvin Klein worked. But Kendall Jenner for Pepsi? A disaster. Influencers are not the idea; they’re just one tool, and they must align with your brand values.

9. Obstacles can’t stop you
Barriers can spark creativity. Legal restrictions in Brazil banned beer logos on football jerseys, so Brahma used hair dye to create beer-colored hairstyles. Penny fought consumer price-blindness by printing huge prices on product packaging.

10. Celebrate at the end
After surviving the jungle, don’t forget to appreciate your wins. Analyze your results, celebrate your team, and enjoy the moment. Every good campaign is a journey.

This was one of the most engaging and creatively structured talks at WebExpo. Even if advertising isn’t your main field, these lessons about clarity, creativity, and boldness are easily applied across all types of design and communication.

👩🏽‍💻 WebExpo Conference: 12 core design skills by Jan Řezáč

At this year’s WebExpo Conference, Jan Řezáč delivered one of the most insightful and practical talks I’ve heard in a long time. His talk, titled “12 Core Design Skills,” focused not on tools or trends, but on the essential skills that make a designer truly effective. Instead of obsessing over Figma or pixel perfection, he urged us to zoom out and look at the broader responsibilities of a designer.

One of his most striking points was that Figma is not design; it’s documentation. This might sound surprising at first, especially since many of us use Figma daily. But his message was clear: design happens before the tool. Real design is about solving problems, not just arranging rectangles on a screen. Figma, like CorelDRAW or Photoshop before it, is just one of many tools to express an idea, but it’s the thinking behind the idea that matters most.

Jan criticized the tendency to focus only on the last phase of the double diamond process: execution. By doing so, we ignore the equally important stages of discovery, definition, and ideation. This is where his list of 12 core skills came in, but rather than listing them all, I want to highlight the ones that stood out the most to me:

  • Design Thinking: Jan called this “creative problem-solving.” He emphasized being intentional with whichever design process we choose. What matters is not the method itself, but how we use it to explore and solve problems.
  • Business Thinking: Designers need to understand business goals. Learning to speak the language of strategy, money, and spreadsheets allows us to have better conversations with managers and stakeholders. Without this skill, good design ideas often fail to get implemented.
  • Workshop Facilitation: This was a key point. While junior designers may come in with strong ideas and enthusiasm, experienced designers know how to guide a team through a process. Good facilitation involves tactical empathy, structure, and the ability to improve outcomes by leading people, not just projects.
  • Customer Research: Jan talked about using both qualitative and quantitative methods: interviews, surveys, testing, analytics. The takeaway: good designers don’t just guess; they listen, observe, and test. Senior designers carry this mindset with them all the time, not just during research phases.
  • Testing Business Ideas: A great reminder that ideas need to be tested early and often. Jan suggested testing 20–100 ideas per week. It sounds intense, but it shifts the mindset from perfection to learning.

Throughout the talk, Jan returned to one core message: the most important tool we have is our brain. Tools change. Trends come and go. But the ability to think critically, communicate clearly, and collaborate strategically is what defines a strong designer.

This talk encouraged me to step back from the screen and refocus on the bigger picture: problem-solving, strategy, and working with people. It was a refreshing and important reminder of what design is really about.

2.2 Exploring Analog Tools for Stress Relief and Focus

In my previous exploration, I developed the prototype for Breathing Circle: a tactile, screen-free tool designed to guide users toward calmness. Building upon this, I’ve delved into existing analog relaxation devices to understand how current innovations align with or diverge from the principles of intuitive, low-effort emotional regulation. This journey aims to highlight the value of physical, non-digital tools in promoting well-being.

Breathing and Mindfulness Aids

Komuso Breathing Necklace: a sleek pendant that slows exhalation when breathed through, promoting relaxation and reducing anxiety. Its discreet design makes it suitable for use in various settings.

Tibetan Singing Bowls: traditional instruments producing resonant tones that aid in meditation and stress relief. Their use underscores the enduring value of simple, auditory tools in promoting mental well-being.

Expandable Breathing Ball: a colorful, collapsible sphere that expands and contracts, visually guiding deep breathing exercises. Its engaging design makes it a popular tool for both children and adults seeking mindfulness and stress relief.

Focus and Productivity Enhancers

Morphée Meditation Box: a screen-free device offering guided meditation sessions through a tactile interface. Its design encourages users to engage in mindfulness without digital distractions.

Analog Productivity System by Ugmonk: a physical task management system using cards to prioritize daily activities. It emphasizes focus and intentionality in task execution.

Tactile Stress Relievers

Baoding Balls: traditional Chinese stress-relief tools that promote relaxation and hand dexterity through rhythmic movement.

Acupressure Mats: mats embedded with spikes that stimulate pressure points, helping to relieve tension and improve circulation.

Fidget Cube: a compact, six-sided device featuring buttons, switches, and dials designed to keep hands engaged and minds focused. Each side offers a different tactile experience, catering to various sensory preferences.

Fidget Spinner: a small, ball-bearing device that spins between the fingers, providing a soothing sensory experience. Fidget spinners have been popularized as tools to aid focus and relieve stress, especially for individuals with ADHD or autism spectrum disorders. While scientific evidence is limited, many users find the repetitive motion calming and helpful in managing anxiety.

Additional Fidget Devices and Toys: beyond the Fidget Cube and Spinner, a variety of tactile tools offer sensory engagement and stress relief.

Tangle Toys: interconnected, twistable segments that can be manipulated into various shapes, providing continuous, quiet movement to aid concentration and reduce anxiety.

Infinity Cubes: handheld devices made of smaller interconnected cubes that can be folded and unfolded endlessly, offering a repetitive motion that has a calming effect and helps maintain focus.

Pop Its: silicone-based toys with bubble-like protrusions that can be pushed in and out, mimicking the sensation of popping bubble wrap; they offer tactile stimulation and are popular for stress relief.

Stretchy Strings: elastic, colorful strings that can be stretched, twisted, and squeezed, providing sensory input useful for calming and focusing the mind.

Wacky Tracks: interlocking, snap-together links that can be twisted and shaped into various forms, offering tactile feedback beneficial for fine motor skills and stress relief.

Weighted Sensory Pillows: small, weighted pillows that provide deep pressure stimulation, promoting relaxation and reducing anxiety; often used in sensory integration therapy.

Reflections and Future Directions

The exploration of these analog devices reveals a shared commitment to facilitating emotional regulation through intuitive, tactile means. Their simplicity and portability make them accessible tools for individuals seeking screen-free methods to manage stress and anxiety.

In the upcoming blog posts, I will focus on refining the Breathing Circle prototype. This will involve enhancing its design and functionality, followed by user testing to assess its effectiveness in promoting relaxation and emotional well-being. Through this process, I aim to gather insights that will inform further development and potential applications of the Breathing Circle.

🦖 Dinosaur Choir: Designing for Scientific Exploration, Outreach, and Experimental Music 🎶

Today, I dove into the quirky and ambitious world of Dinosaur Choir, a NIME 2023 paper by Brown, Dudgeon, and Gajewski. Yes – you read that right. It’s about playing music with dinosaur skulls. Well, replicas, but still! The idea? Reconstruct hadrosaur skulls (those duck-billed dinosaurs with dramatic nasal crests) to recreate their vocalizations through breath-powered instruments. It’s part speculative science, part interactive sound art, and part paleo-fan dream.

First impressions? It’s wild – in a good way. The concept of turning ancient anatomy into playable sound interfaces is not just fascinating but also incredibly poetic. You’re literally breathing life into extinct creatures. The goal isn’t only musical performance – it’s also science communication and education. As someone interested in design for mental well-being, I’m always drawn to tactile, embodied experiences. This feels like an emotional connection to the distant past, which is unexpectedly calming and awe-inducing.

Some things I really appreciated:

  • The use of CT scan data and iterative digital modelling (with tools like Blender and 3D Slicer) shows a commitment to scientific integrity.
  • They address accessibility and hygiene, especially post-COVID, by swapping out direct breath tubes for breath-activated microphones – smart move! (A rough sketch of this idea follows after the list.)
  • The project is also intentionally speculative, acknowledging that no one can truly know how a hadrosaur sounded, but instead allowing users to explore different hypotheses through interactive sound.
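
As a thought experiment, here is a minimal sketch (in Python) of how such a breath-activated trigger might work: track the microphone signal’s amplitude envelope and fire a synthesized call when it crosses a threshold. The frame size, threshold, and demo signal are all hypothetical; the paper’s actual implementation may differ entirely.

```python
import numpy as np

FRAME_SIZE = 1024        # samples per analysis frame (hypothetical)
BREATH_THRESHOLD = 0.05  # RMS level treated as a breath (hypothetical)

def rms(frame: np.ndarray) -> float:
    """Root-mean-square amplitude of one audio frame."""
    return float(np.sqrt(np.mean(frame ** 2)))

def breath_onsets(samples: np.ndarray):
    """Yield (frame_index, intensity) each time the envelope crosses the threshold."""
    breathing = False
    for start in range(0, len(samples) - FRAME_SIZE, FRAME_SIZE):
        level = rms(samples[start:start + FRAME_SIZE])
        if level > BREATH_THRESHOLD and not breathing:
            breathing = True
            yield start // FRAME_SIZE, level  # onset: trigger the skull's voice here
        elif level <= BREATH_THRESHOLD:
            breathing = False                 # breath ended; re-arm the trigger

# Demo with a synthetic signal: silence, a noisy "breath", silence.
signal = np.concatenate([np.zeros(4096), 0.2 * np.random.randn(8192), np.zeros(4096)])
for frame, intensity in breath_onsets(signal):
    print(f"breath onset at frame {frame}, intensity {intensity:.3f}")
```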

But here’s where my inner critic perks up. While the project is undeniably cool, it feels like it’s trying to be everything at once: a scientific model, an artistic instrument, a museum exhibit, and an educational tool. That multiplicity is exciting, but also a bit scattered. I wonder if it might benefit from more intentional “mode-switching”, like a toggle between “science mode” (where only plausible vocalizations are allowed) and “experimental mode” (go wild with dino-DJing). Right now, the boundaries seem a bit blurry.

Also, one half of my brain (the one I made up for this blog post 😄) was thinking about how this might connect with more emotional, inner experiences. What if, instead of performing music, someone used this as a way to reflect on loss? Extinction isn’t just scientific – it’s emotional. Could the dinosaur choir be part of a meditative installation about disappearance, transformation, and the long arc of time?

All in all, I love it. It’s weird, fun, surprisingly moving, and technically impressive. The Dinosaur Choir might not be the most conventional music interface, but it’s got soul. Or at least… breath.

You can find the whole article here.

2.1 Documentation & Reflection: Speed-Dating My Lo-Fi Prototype

Last semester, my research focused on how UX/UI design can make mental health apps more calming and accessible, and how AI can provide personalized, empathetic support. I explored micro-interactions, color psychology, and AI-driven emotional intelligence to understand what makes digital mental health tools effective.

This time, I wanted to explore physical, tangible interactions for well-being—something that doesn’t require a screen or notifications but still guides users toward emotional regulation.

The Three Prototypes

For this exercise, I created three lo-fi prototypes:

Weekly Mood Tracker: A simple, analog way to log emotions over the week using color-coded entries for easy reflection.

Self-Reflection Cards: A deck of prompts designed to encourage mindful self-exploration and emotional processing.

Breathing Circle: A guided breathing tool made of paper, where users rotate a circular element to synchronize their breath with a visual cue.

Choosing the Breathing Circle

While all three prototypes engage users in self-awareness and well-being, I chose to bring the breathing circle to class because it best embodies my research goal: designing interfaces that guide users toward calmness in a simple, intuitive way. Unlike mood trackers or reflection tools, the breathing circle introduces a hands-on, meditative experience that requires minimal effort—ideal for moments of stress.

Speed-Dating My Prototype

In class, we shared our prototypes in a Speed-Dating/Sharing session, presenting them to different classmates in quick succession. This was an exciting way to gather feedback and refine ideas. Some of the key insights from my classmates included:

  • Great for children in schools – One student noted that the breathing circle could be useful in classrooms, similar to a fidget gadget, helping children focus while also providing a calming mechanism.
  • Ideal for bedtime – Another student said they would love to use it before bed to relax, which sparked the idea of making the prototype more tactile with textures and even usable in the dark.
  • A minimalist, portable tool – Someone pointed out that, since it’s thin and can be small, it’s perfect for carrying on public transport or while traveling. Its minimalistic design keeps the focus solely on breathing, without distractions.
  • A sensory experience – A classmate suggested adding resistance to the movement (like a soft fabric hinge) to make turning it feel more grounding.

What is My Prototype Trying to Address?

The breathing circle is designed to address a key challenge in mental health support: how to create intuitive, low-effort tools for emotional regulation. Unlike mood-tracking apps or chat-based AI support, this tool is immediate and physical—it doesn’t require users to think, analyze, or type, just breathe. This prototype is particularly suited for:

  • Commuters and travelers – Its thin, compact design makes it easy to use on the go, whether on public transport, at an airport, or in a waiting room.
  • Children and adults needing focus – It can function as a calming fidget tool, helping with concentration in schools, workplaces, or even at home.
  • People looking for a screen-free relaxation method – No notifications, no distractions—just a simple, intuitive breathing guide.

Potential Features & Future Iterations

Based on the feedback, I’d love to explore:

  • Tactile Elements – Soft materials, textured surfaces, or raised patterns to enhance sensory engagement.
  • Glow-in-the-Dark or Low-Light Adaptation – So it can be used before bed without needing external light.
  • Personalization – Adjustable speed settings, so users can customize their breathing pace.
  • Elastic Resistance – Adding a slight resistance to the movement to make it more grounding and engaging.

If My Prototype Had a Dating Profile …

“Looking for a mindful moment? I’m a simple, no-fuss tool that helps you slow down and just breathe. I work best in quiet moments, whether you’re feeling stressed, anxious, or just need to unwind. Small, discreet, and always ready to help—swipe right for relaxation!”

Final Thoughts

This session reinforced how valuable it is to test even the simplest ideas. The breathing circle started as a basic paper prototype, but through conversation and iteration, it could evolve into something more immersive and widely useful. The feedback also reminded me that not all mental health tools need to be digital—sometimes, the most powerful solutions are the simplest, most tangible ones.

1.10 AI Companions vs. Traditional Therapy

Can Technology Replace Human Connection?

The rise of AI companions has sparked a significant debate: can technology truly replace human therapists in addressing mental health issues? AI-driven systems like Woebot and Wysa offer cognitive-behavioral therapy (CBT) techniques, providing instant support to users. However, while these AI companions are effective in alleviating feelings of loneliness and offering immediate assistance, they still fall short in replicating the depth of human connection provided by traditional therapy.

Image Source: Vice

AI as a Complementary Tool

AI companions offer several advantages, such as accessibility, 24/7 availability, and anonymity, making them valuable tools for individuals who may not have immediate access to human therapists. For instance, 48% of people in the U.S. reported experiencing some form of mental health issue, and AI solutions could help bridge the gap where human therapists are unavailable or overwhelmed by demand. However, they lack the nuanced empathy and relational depth that human therapists bring to therapeutic conversations. Research indicates that while AI companions can provide immediate relief, they do not guarantee substantial long-term improvements in mental health.

The Future of Mental Health Care

Rather than replacing human therapists, AI companions could become part of a hybrid model. AI can handle initial assessments and offer support between therapy sessions, while human therapists provide ongoing treatment for deeper emotional and psychological issues. This collaborative approach can provide a more comprehensive mental health support system, blending the best of both worlds. For example, AI companions have been shown to reduce loneliness among seniors, enhancing their overall well-being.

Effectiveness of AI in Addressing Mental Health Issues

AI companions have demonstrated effectiveness in managing certain mental health conditions:

Anxiety and Depression: AI-driven applications can provide immediate support and coping strategies for individuals experiencing anxiety and depression. They offer tools like mood tracking, mindfulness exercises, and cognitive-behavioral techniques to help users manage symptoms.

Stress Management: AI companions can assist in stress reduction by guiding users through relaxation techniques, meditation, and providing real-time feedback on stress levels.

However, AI companions are less effective in addressing:

Severe Mental Health Disorders: Conditions such as schizophrenia, bipolar disorder, and severe personality disorders require comprehensive treatment plans that include medication management and intensive psychotherapy, areas where AI companions currently fall short.

Crisis Situations: In cases of acute mental health crises, such as suicidal ideation or severe self-harm, immediate human intervention is crucial. AI companions are not equipped to handle such emergencies and may not provide the necessary support.

Sources

  1. “AI In Mental Health: Opportunities And Challenges In Developing Intelligent Digital Therapies.” Forbes. Accessed: Jan. 25, 2025. [Online.] Available: https://www.forbes.com/sites/bernardmarr/2023/07/06/ai-in-mental-health-opportunities-and-challenges-in-developing-intelligent-digital-therapies/
  2. “AI Therapists vs. Human Therapists: Complementary Roles in Mental Health.” mindpeace.ai. Accessed: Jan. 25, 2025. [Online.] Available: https://mindpeace.ai/blog/ai-therapists-vs-human-therapists
  3. “Artificial intelligence in mental health care.” American Psychological Association. Accessed: Jan. 25, 2025. [Online.] Available: https://www.apa.org/practice/artificial-intelligence-mental-health-care
  4. “Exploring the Pros and Cons of AI in Mental Health Care.” Active Minds. Accessed: Jan. 25, 2025. [Online.] Available: https://www.activeminds.org/blog/exploring-the-pros-and-cons-of-ai-in-mental-health-care/
  5. “Can AI Companions Help Heal Loneliness? | Eugenia Kuyda | TED.” YouTube. Accessed: Jan. 25, 2025. [Online.] Available: https://www.youtube.com/watch?v=-w4JrIxFZRA
  6. Lee, E. E., Torous, J., De Choudhury, M., Depp, C. A., Graham, S. A., Kim, H. C., Paulus, M. P., Krystal, J. H., & Jeste, D. V. (2021). Artificial Intelligence for Mental Health Care: Clinical Applications, Barriers, Facilitators, and Artificial Wisdom. Biological Psychiatry: Cognitive Neuroscience and Neuroimaging, 6(9), 856–864. https://doi.org/10.1016/j.bpsc.2021.02.001
  7. “Mental Health Apps and the Role of AI in Emotional Wellbeing.” Mya Care. Accessed: Jan. 25, 2025. [Online.] Available: https://myacare.com/blog/mental-health-apps-and-the-role-of-ai-in-emotional-wellbeing
  8. Thakkar, A., Gupta, A., & De Sousa, A. (2024). Artificial Intelligence in Positive Mental Health: A Narrative Review. Frontiers in Digital Health, 6, 1280235. https://doi.org/10.3389/fdgth.2024.1280235
  9. “‘They thought they were doing good but it made people worse’: why mental health apps are under scrutiny.” The Guardian. Accessed: Jan. 25, 2025. [Online.] Available: https://www.theguardian.com/society/2024/feb/04/they-thought-they-were-doing-good-but-it-made-people-worse-why-mental-health-apps-are-under-scrutiny
  10. “Why Some Mental Health Apps Aren’t Helpful?” Greater Good Magazine. Accessed: Jan. 25, 2025. [Online.] Available: https://greatergood.berkeley.edu/article/item/why_some_mental_health_apps_arent_helpful

1.9 The Emotional Intelligence of AI: Can Chatbots Truly Understand Us?

As AI technology advances, chatbots are evolving to recognize emotional cues, providing support in mental health, companionship, and conversational interfaces. By integrating techniques such as natural language processing (NLP), sentiment analysis, and machine learning, these systems aim to simulate empathy and create meaningful interactions. However, the development of empathetic AI comes with challenges, including technological limitations, ethical concerns, and potential risks of over-dependence.

Advancements in Empathetic Algorithms

Empathetic algorithms are designed to detect, interpret, and respond to human emotions using methods such as NLP, voice tone recognition, and facial expression analysis. For example, Woebot employs cognitive-behavioral therapy (CBT) techniques to guide users through stress and anxiety management, leveraging emotional cues from conversations, while Wysa uses sentiment analysis to provide customized mindfulness exercises and mood tracking tools for emotional resilience.

Beyond mental health, empathetic algorithms are being integrated into other sectors like education and customer service, tailoring interactions based on emotional cues to improve engagement and satisfaction.
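
To make “sentiment analysis” concrete, here is a minimal sketch of how a chatbot might route its reply based on a detected sentiment label, using Hugging Face’s transformers pipeline. The response templates and routing rule are hypothetical illustrations, not how Woebot or Wysa actually work.

```python
from transformers import pipeline  # pip install transformers

# Off-the-shelf sentiment classifier (downloads a default model on first use).
classifier = pipeline("sentiment-analysis")

# Hypothetical response templates keyed by the detected sentiment label.
RESPONSES = {
    "NEGATIVE": "That sounds really hard. Would a short breathing exercise help?",
    "POSITIVE": "I'm glad to hear that! Want to note down what went well today?",
}

def reply(user_message: str) -> str:
    """Pick a reply template based on the message's dominant sentiment."""
    result = classifier(user_message)[0]  # e.g. {'label': 'NEGATIVE', 'score': 0.98}
    return RESPONSES.get(result["label"], "Tell me more about that.")

print(reply("I couldn't sleep again and everything feels overwhelming."))
```

A real empathetic system would go far beyond this single label, but the pattern of classifying emotional cues and adapting the response is the same.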

Chatbots as Relationship Simulators

LLMs such as GPT power chatbots like Replika AI and Character AI, which simulate human-like relationships. Replika AI enables users to design virtual companions for friendship, mentorship, or even romantic connections, raising questions about emotional reliance and blurred boundaries between humans and machines. Character AI allows users to interact with AI representations of fictional or historical figures, blending entertainment with relationship simulation.

Replika, Image Source: Every

These developments reflect themes from the movie Her, where an AI operating system becomes a deeply personal companion. While such systems offer emotional support, they highlight risks like over-dependence, which could potentially hinder real-life emotional interactions.

Movie Her, Image Source: IMDb

The Role of Empathy in AI

Empathetic AI is transforming human-AI interactions by making them more intuitive and emotionally aligned. However, achieving true emotional intelligence in machines remains a significant challenge:

  • Complex Emotions: Emotions are shaped by individual, cultural, and situational factors, making them difficult for AI to interpret consistently.
  • Simulated Empathy: Current AI systems simulate empathy by mimicking human responses rather than genuinely understanding emotions.
  • Ethical Concerns: Privacy risks arise from AI’s reliance on sensitive emotional data, making transparency and data security essential.

Applications and Insights from Research

Recent studies emphasize how empathetic algorithms can enhance human emotional intelligence by fostering emotional awareness and resilience. For instance:

  • Educational AI systems: Tailor learning environments to students’ emotional states, adapting content based on signs of frustration or confusion.
  • Healthcare applications: Use empathetic AI to assess patients’ emotional needs and deliver personalized support, improving outcomes for individuals with anxiety or depression.

Despite these advancements, challenges such as cultural biases in emotion recognition and the need for interdisciplinary collaboration remain key areas for growth.

Sources

  1. “Character.ai: Young people turning to AI therapist bots.” BBC. Accessed: Jan. 24, 2025. [Online.] Available: https://www.bbc.com/news/technology-67872693?utm_source=chatgpt.com
  2. “‘Maybe we can role-play something fun’: When an AI companion wants something more.” BBC. Accessed: Jan. 24, 2025. [Online.] Available: https://www.bbc.com/future/article/20241008-the-troubling-future-of-ai-relationships?utm_source=chatgpt.com
  3. “Replika CEO Eugenia Kuyda says it’s okay if we end up marrying AI chatbots.” The Verge. Accessed: Jan. 24, 2025. [Online.] Available: https://www.theverge.com/24216748/replika-ceo-eugenia-kuyda-ai-companion-chatbots-dating-friendship-decoder-podcast-interview?utm_source=chatgpt.com
  4. Velagaleit, S. B., Choukaier, D., Nuthakki, R., Lamba, V., Sharma, V., & Rahul, S. (2024). Empathetic Algorithms: The Role of AI in Understanding and Enhancing Human Emotional Intelligence. Journal of Electrical Systems, 20-3s, 2051–2060. https://doi.org/10.52783/jes.1806
  5. “Woebot Health – Mental Health Chatbot.” Woebot Health. Accessed: Jan. 24, 2025. [Online.] Available: https://woebothealth.com/
  6. “Wysa – Everyday Mental Health.” Wysa. Accessed: Jan. 24, 2025. [Online.] Available: https://www.wysa.com/

1.8 Gamification in Mental Health Apps: Engagement or Overload?

Gamification, the integration of game-like elements into non-gaming contexts, has emerged as a popular strategy in mental health apps to boost user engagement and foster positive behavioral change. By using rewards, progress tracking, and interactive challenges, gamification helps users stay motivated in working toward their mental health goals. The concept leverages the human tendency to seek immediate gratification, transforming otherwise mundane or challenging tasks into engaging, rewarding experiences and encouraging users to stick with their mental health practices over time.

The Framework of Gamification: Mechanics, Dynamics, and Aesthetics

Gamification in mental health apps often revolves around the Mechanics-Dynamics-Aesthetics (MDA) framework:

Mechanics are the visible, interactive elements that users directly engage with, such as progress bars, badges, leaderboards, and daily check-ins. Apps like SuperBetter allow users to adopt secret identities, complete challenges, and invite allies for support. Similarly, Finch lets users nurture a virtual bird as they complete self-care tasks, turning progress into a tangible reward.

Dynamics focus on processes like goal-setting, progress tracking, and feedback mechanisms that sustain user engagement. For example, I Am Sober allows users to track their sobriety, showing tangible benefits like money and calories saved over time, which reinforces their commitment. Apps like Happify use adaptive challenges to maintain motivation, rewarding users with points for completing in-app activities tailored to their goals.

Gamification in Finch, Image source: App Store
Gamification in I Am Sober, Image source: App Store

Aesthetics evoke emotions such as motivation and connection through design and storytelling. eQuoo, for instance, uses fantasy storytelling and interactive challenges to teach emotional resilience. Meanwhile, SuperBetter embraces bold visuals and empowering language to encourage users to tackle “bad guys” like self-criticism.

Storytelling in eQuoo, Image source: One Mind Psyber Guide
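
To make the mechanics and dynamics above concrete, here is a minimal sketch of the kind of streak-and-badge logic such apps build on; the badge names and thresholds are hypothetical, not taken from any specific app.

```python
from dataclasses import dataclass, field
from datetime import date, timedelta

# Hypothetical badge thresholds: streak length in days -> badge name.
BADGES = {3: "Spark", 7: "One-Week Wonder", 30: "Habit Hero"}

@dataclass
class StreakTracker:
    last_check_in: date | None = None
    streak: int = 0
    badges: list[str] = field(default_factory=list)

    def check_in(self, today: date) -> list[str]:
        """Register today's self-care task; return any newly earned badges."""
        if self.last_check_in == today:
            return []                            # already counted today
        if self.last_check_in == today - timedelta(days=1):
            self.streak += 1                     # streak continues
        else:
            self.streak = 1                      # first check-in or broken streak
        self.last_check_in = today
        earned = [name for days, name in BADGES.items()
                  if self.streak == days and name not in self.badges]
        self.badges.extend(earned)
        return earned

tracker = StreakTracker()
for offset in range(7):                          # a week of daily check-ins
    earned = tracker.check_in(date(2025, 1, 1) + timedelta(days=offset))
    if earned:
        print(f"Day {offset + 1}: earned {earned}")
```

The mechanic (badges) is trivial to implement; the design challenge is the dynamics and aesthetics layered on top of it.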

Benefits of Gamification

Enhanced Engagement

Gamification makes mental health routines more enjoyable and accessible. Features like badges, progress tracking, and leaderboards reward users for their efforts, fostering a sense of accomplishment. For instance, Happify uses positive psychology techniques to help users reduce anxiety and loneliness while promoting emotional well-being.

Encouraging Healthy Habits

Apps like Finch and Rootd help users form consistent routines by rewarding daily actions, such as journaling or practicing mindfulness. These small, gamified nudges support users in developing healthier habits over time.

Gamification in Rootd, Image source: New Ventures BC

Challenges of Gamification

App Fatigue

Over-reliance on repetitive tasks and extrinsic motivators like badges can lead to disengagement. When users feel overwhelmed by excessive prompts or redundant activities, the risk of app fatigue increases.

Balancing Game and Therapy

Adding too many game elements can dilute the therapeutic value of an app. Research shows that increasing gamified features doesn’t always enhance outcomes, underscoring the need for thoughtful design.

Ethical and Practical Considerations

Customization and Personalization

Personalized experiences are key to keeping users engaged. Apps like Headspace offer tailored meditation tracks based on user input, while Rootd adapts its activities to help users manage anxiety and panic attacks effectively.

Onboarding Screens in Headspace, Image Source: UI Sources

Meaningful Interactions

Apps should prioritize outcomes over screen time. For instance, Headspace ensures users benefit from its programs without feeling pressured to overuse the app. Its studies show that completing at least 10 meditation sessions in eight weeks significantly reduces symptoms of depression.

Conclusion

Gamification has great potential to make mental health apps more engaging and effective. By thoughtfully combining game elements with therapeutic goals, these apps can support users on their well-being journeys. However, careful design is crucial to ensure they remain meaningful, balanced, and beneficial.

Sources

  1. Cheng, V. W. S., Davenport, T., Johnson, D., Vella, K., & Hickie, I. B. (2019). Gamification in apps and technologies for improving mental health and well-being: Systematic review. JMIR Mental Health, 6(6), e13717. https://doi.org/10.2196/13717
  2. Hamdoun, S., Monteleone, R., Bookman, T., & Michael, K. (2023). AI-based and digital mental health apps: Balancing need and risk. IEEE Technology and Society Magazine, 42(1), 25–36. https://doi.org/10.1109/MTS.2023.3241309
  3. “How To (and Why You Should) Incorporate Gamification Into Your Mental Health Care App.” SF AppWorks. Accessed: Jan. 19, 2025. [Online.] Available: https://www.sfappworks.com/blogs/incorporating-gamification-into-your-mental-health-care-app
  4. Santoso, I. S., Ferdinansyah, A., Sensuse, D. I., Suryono, R. R., Kautsarina, & Hidayanto, A. N. (2021). Effectiveness of gamification in mHealth apps designed for mental illness. Proceedings of the 2nd International Conference on ICT for Rural Development (IC-ICTRuDev), Jogjakarta, Indonesia, 1–6. https://doi.org/10.1109/IC-ICTRuDev50538.2021.9655706
  5. “The Power of Gamification in Mental Health Apps – And how they benefit well-being.” MedPage Today. Accessed: Jan. 19, 2025. [Online.] Available: https://www.medpagetoday.com/opinion/kevinmd/106239
  6. Valentine, L., D’Alfonso, S., & Lederman, R. (2023). Recommender systems for mental health apps: Advantages and ethical challenges. AI & Society, 38(4), 1627–1638. https://doi.org/10.1007/s00146-021-01322-w

1.7 Privacy vs. Personalization: Navigating Ethical Challenges in AI Mental Health Apps

AI-driven mental health apps offer a remarkable combination of personalization and accessibility, providing users with tailored experiences based on their unique needs. For example, apps like Talkspace utilize AI to detect crisis moments and recommend immediate interventions, while platforms such as Wysa offer personalized exercises based on user interactions. However, these benefits come with significant privacy and ethical challenges. To deliver personalized support, these tools rely on sensitive data: user emotions, behavioral patterns, and mental health histories. This raises critical questions about how that data is collected, stored, and used.

Image Source: Government Technology Insider

Ensuring privacy in these apps requires robust safeguards, including encryption, secure data storage, and compliance with regulations like GDPR in Europe and HIPAA in the United States. These laws mandate transparency, requiring developers to clearly explain how user data is handled. Companies like Headspace exemplify these practices by encrypting user data, limiting employee access, and providing users with the option to control data-sharing settings. Headspace also rigorously tests its AI for safety, particularly in detecting high-risk situations, and connects users to appropriate resources when needed.
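
As a concrete illustration of what encryption at rest can look like, here is a minimal sketch using the Python cryptography library’s Fernet recipe to encrypt a journal entry before it is stored. A production app would layer key management, access controls, and auditing on top of this, and nothing here reflects Headspace’s actual implementation.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# In production the key would come from a secrets manager, never from code.
key = Fernet.generate_key()
fernet = Fernet(key)

entry = "Felt anxious before the presentation; the breathing exercise helped."

# Encrypt before persisting: only ciphertext ever reaches the database.
ciphertext = fernet.encrypt(entry.encode("utf-8"))

# Decrypt only when the authenticated user requests their own data.
plaintext = fernet.decrypt(ciphertext).decode("utf-8")
assert plaintext == entry
```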

Beyond privacy, ethical concerns about fairness and inclusivity in AI algorithms are prominent. If the data used to train these algorithms isn’t diverse, the resulting tools may be less effective, or even harmful, for underrepresented groups. For example, biases in language or cultural context can lead to misunderstandings or inappropriate recommendations, potentially alienating users. To address this, platforms must ensure their datasets are diverse and representative, integrate cultural sensitivity into their development processes, and conduct ongoing audits to identify and rectify biases. Headspace’s AI Council, a group of clinical and diversity experts, serves as a model for embedding equity and inclusivity in AI tools.

Transparency is another key pillar for ethical AI in mental health. Users must be informed about how the AI works, the types of data it collects, and its limitations. For example, AI is not a replacement for human empathy, and users should be made aware of when to seek professional help. Clear communication builds trust and empowers users to make informed choices about their mental health.

While AI-driven mental health apps can enhance engagement and outcomes through personalization, the trade-off between privacy and functionality must be carefully managed. Ethical design practices, such as secure data handling, bias mitigation, and transparent user communication, are essential for balancing these priorities. By addressing these challenges proactively, developers can ensure that these tools support mental health effectively while respecting users’ rights and diversity.

Sources

  1. “AI principles at Headspace.” Headspace. Accessed: Jan. 14, 2025. [Online.] Available: https://www.headspace.com/ai
  2. Basu, A., Samanta, S., Sur, S., & Roy, A. Digital Is the New Mainstream. Kolkata, India: Sister Nivedita University, 2023.
  3. “Can AI help with mental health? Here’s what you need to know.” Calm. Accessed: Jan. 14, 2025. [Online.] Available: https://www.calm.com/blog/ai-mental-health
  4. Coghlan, S., Leins, K., Sheldrick, S., Cheong, M., Gooding, P., & D’Alfonso, S. (2023). To chat or bot to chat: Ethical issues with using chatbots in mental health. Digital Health, 9, 1–11. https://doi.org/10.1177/20552076231183542
  5. Hamdoun, S., Monteleone, R., Bookman, T., & Michael, K. (2023). AI-based and digital mental health apps: Balancing need and risk. IEEE Technology and Society Magazine, 42(1), 25–36. https://doi.org/10.1109/MTS.2023.3241309
  6. Valentine, L., D’Alfonso, S., & Lederman, R. (2023). Recommender systems for mental health apps: Advantages and ethical challenges. AI & Society, 38(4), 1627–1638. https://doi.org/10.1007/s00146-021-01322-w

1.6 How AI Is Reshaping Mental Health Support

Artificial intelligence is revolutionizing mental health care by breaking down barriers like cost, stigma, and accessibility. With features like chatbots, biofeedback, and voice analysis, AI offers innovative solutions for mental health support. While AI can’t replace human therapists, its ability to complement traditional care makes it a valuable tool.

Venture capital reports reveal that mental health is the fastest-growing marketplace category, with a growth rate exceeding 200% in 2023. This surge reflects a rising demand for accessible mental health solutions as AI continues to play a critical role in meeting that need.

How AI Powers Mental Health Apps

AI-Driven Chatbots

AI chatbots provide immediate, tailored support for users in need:

  • Wysa offers CBT-based exercises and mindfulness prompts, creating a safe space for users to manage stress and anxiety.
  • Woebot adapts its conversations to users’ emotions, providing tools for real-time mental health management.
  • Cass combines emotional support and psychoeducation, offering adaptive responses that cater to individual needs.

In May 2023, Inflection AI launched Pi, a bot designed for emotional support and conversational companionship. Unlike other chatbots, Pi openly acknowledges its limitations, avoiding the pretense of being human while focusing on honest and straightforward interactions.

Wearables and Biofeedback

Wearable devices enhance AI’s ability to provide real-time insights into users’ mental states:

  • Moodfit and Spring Health use wearable data, like heart rate and stress levels, to deliver personalized mental health strategies.
  • Kintsugi analyzes vocal biomarkers to detect signs of anxiety or depression, offering users actionable insights based on their voice patterns (a simplified sketch of this kind of analysis follows below).
Image Source: 9to5Mac

These integrations bridge the gap between physical and emotional health, empowering users to take control of their well-being.
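
To ground the idea of vocal biomarkers, here is a simplified sketch that extracts two basic acoustic features (frame energy and zero-crossing rate) from a mono WAV recording. The features, the placeholder file path, and the flagging heuristic are illustrative stand-ins, not Kintsugi’s actual pipeline.

```python
import numpy as np
from scipy.io import wavfile  # pip install scipy

def voice_features(path: str, frame_ms: int = 30):
    """Per-frame RMS energy and zero-crossing rate for a mono WAV file."""
    rate, samples = wavfile.read(path)
    samples = samples.astype(np.float64)
    peak = np.max(np.abs(samples))
    samples = samples / peak if peak > 0 else samples  # normalize to [-1, 1]
    frame_len = int(rate * frame_ms / 1000)
    frames = [samples[i:i + frame_len]
              for i in range(0, len(samples) - frame_len, frame_len)]
    energy = np.array([np.sqrt(np.mean(f ** 2)) for f in frames])
    zcr = np.array([np.mean(np.abs(np.diff(np.sign(f)))) / 2 for f in frames])
    return energy, zcr

# Hypothetical heuristic: persistently flat, low-energy speech gets flagged.
energy, zcr = voice_features("sample.wav")  # path is a placeholder
if energy.mean() < 0.05 and energy.std() < 0.02:
    print("Low vocal energy and variability - worth a gentle check-in prompt.")
```

Real systems train models on many such features across large labeled datasets; the point here is only that “voice patterns” ultimately reduce to measurable signal statistics.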

Opportunities in AI Mental Health Care

AI’s advantages lie in its ability to make mental health support more accessible, personalized, and inclusive:

  • Immediate and affordable: tools like Headspace’s Ebb and Wysa provide around-the-clock support at a fraction of the cost of traditional therapy.
  • Engagement and effectiveness: a 2022 review found that AI tools could improve engagement and reduce symptoms of anxiety and depression. However, experts emphasize that AI works best as a supplement, not a substitute, for traditional therapy. As Dr. Chris Mosunic of Calm explains, “Having a human in the driver’s seat with improved therapy AI tools might be just the right blend to maximize engagement, efficacy, and safety.”
  • Personalized support: apps like Woebot and Youper adapt their recommendations to the user’s changing emotional needs, creating a more tailored experience.
Image Source: Business Wire

Challenges and Ethical Considerations

While AI offers promising solutions, it also presents challenges:

  • Limited empathy: AI tools often lack the emotional depth of human therapists, which can leave users feeling unsupported in complex situations.
  • Bias and inclusivity: non-diverse training data can lead to biased responses, potentially failing marginalized communities that rely more heavily on these tools due to systemic barriers.
  • Privacy concerns: AI tools require access to sensitive data. Apps like Talkspace use encryption to protect user information, but trust in data security remains a significant hurdle.

As these tools evolve, balancing innovation with ethical responsibility will be critical – a topic that will be explored further in upcoming articles.

Sources

  1. Fiske, A., Henningsen, P., & Buyx, A. (2019). Your robot therapist will see you now: Ethical implications of embodied artificial intelligence in psychiatry, psychology, and psychotherapy. Journal of Medical Internet Research, 21(5), e13216. https://doi.org/10.2196/13216
  2. Thakkar, A., Gupta, A., & De Sousa, A. (2024). Artificial intelligence in positive mental health: A narrative review. Frontiers in Digital Health, 6. https://doi.org/10.3389/fdgth.2024.1280235
  3. “Can AI help with mental health? Here’s what you need to know.” Calm. Accessed: Jan. 4, 2025. [Online.] Available: https://www.calm.com/blog/ai-mental-health
  4. “Meet Ebb | AI Mental Health Companion.” Headspace. Accessed: Jan. 4, 2025. [Online.] Available: https://www.headspace.com/ai-mental-health-companion
  5. Gual-Montolio, P., Jaén, I., Martínez-Borba, V., Castilla, D., & Suso-Ribera, C. (2022). Using artificial intelligence to enhance ongoing psychological interventions for emotional problems in real- or close to real-time: A systematic review. International Journal of Environmental Research and Public Health, 19(13), 7737. https://doi.org/10.3390/ijerph19137737
  6. “Rise of AI therapists.” VML. Accessed: Jan. 4, 2025. [Online.] Available: https://www.vml.com/insight/rise-of-ai-therapists