At WebExpo Prague, one of my favorite talks was Lutz Schmitt’s “Digital Intimacy: Feeling Human in an Artificial World”. I left the compelling session with plenty to ponder about how we connect online. Schmitt opened by reminding us that we “easily recognise the people we’re closest to—our partners, friends, family—the ones we seek true intimacy with,” but that the internet, while meant to help us “stay in touch” across distances, often leaves us wondering whether the person—or thing—on the other side of the screen is even real. He posed a provocative question: can an interaction with a robot ever feel as intimate as a conversation with a loved one, and do the trust-building challenges in UX design mirror those in human relationships?
Schmitt traced how, in today’s digital landscape, authenticity has become a scarce commodity. He described how social feeds, chat interfaces, and even AI-driven assistants can “raise doubts about authenticity” and make us second-guess whether the person typing back is genuine. Drawing on examples of deep-fake profiles and automated chatbots, he emphasized that when users log in, they crave reassurance that there’s a human—or at least a convincingly human-like algorithm—behind the responses. This quest for authenticity mirrors the early days of the internet, when seeing a picture or avatar wasn’t enough; we’ve since demanded richer cues—voice, video, and now emotional response—to sustain digital intimacy.
A particularly striking point in Schmitt’s talk concerned the parallels between building intimacy in human relationships and designing for trust in digital products. He noted that just as partners rely on nonverbal cues—tone of voice, eye contact, subtle facial expressions—digital experiences need their own “signals” that assure users they’re understood and valued. For instance, a well-timed microanimation or a contextually relevant message can mimic the feeling of being “seen,” akin to a friend nodding in agreement. Schmitt argued that these design choices are not mere bells and whistles but foundational to forging a sense of closeness, especially when the “other” could be an AI agent.
Throughout his presentation, Schmitt highlighted real-world examples where companies have successfully crafted digital intimacy. He spoke about chatbot initiatives that go beyond scripted replies to offer genuinely empathetic interactions, referencing recent research into how social chatbots can mirror emotional patterns almost like human companions. One case study involved a mental-health app whose AI check-ins used tailored language based on prior user responses, offering a sense of “being remembered” that’s critical for emotional connection. Schmitt stressed that, from a UX perspective, transparency is key: when users understand how algorithms adapt to their behavior, it fosters trust rather than alienation.
Ethical considerations formed a core undercurrent of the session. Schmitt pointed out that as digital intimacy deepens—through the mirroring of our speech patterns, personalized suggestions, and even voice-based AI companions—we risk blurring the line between authentic human relationships and artificial ones. He cautioned that, without guardrails, we could inadvertently encourage parasocial dependencies, a phenomenon in which users form one-sided emotional bonds with AI entities. Recent studies warn of these “illusions of intimacy,” showing patterns where users—often vulnerable—may substitute human connection with AI that consistently affirms them. Schmitt urged designers to build feedback loops that encourage healthy real-world interactions alongside digital touchpoints rather than replacing them entirely.
In closing, Schmitt challenged us to consider how to maintain our humanity as technology becomes ever more adept at simulating it. He reminded us that “trust” in a product isn’t just about security or privacy—though those matter—it’s also about emotional reliability. Can we create digital partners, assistants, or communities that respect users’ need for genuine connection? Schmitt proposed that the future of UX lies in crafting experiences that feel “alive” in the right ways: consistent yet transparent, adaptive yet accountable. As I walked away from the Lucerna Great Hall, I couldn’t help but reflect on my own screen interactions—whether I’m truly “seen” by the apps I use daily, or simply speaking into an echo chamber of code. Festivals like WebExpo remind us: while AI can simulate intimacy, it’s up to us as designers and users to preserve the authentic spark of human connection.