One of the most surprising and emotional talks I attended at WebExpo was “Digital Intimacy: Feeling Human in an Artificial World” by Lutz Schmitt. It opened my eyes to how technology, especially artificial intelligence, is changing the way we connect with others, in ways both good and dangerous.
Lutz started by talking about intimacy—something we usually connect with people who are physically close to us: partners, friends, family. He showed how, in long-distance relationships, people use tech to keep that closeness alive. He gave the example of a product called Pillow Talk, which lets people feel their partner’s heartbeat even when they are far away. It sounds romantic, but it also made me think: what happens when machines start replacing real people in these connections?

One of the biggest ideas in the talk was that privacy is the basis of true intimacy. But online, our privacy is often not protected. Lutz pointed out that digital spaces are full of systems that track us, watch us, and try to influence how we feel. This makes it harder to build real trust—and trust is key for intimacy.
At the same time, Lutz showed that meaningful digital connections are possible. Many of us stay close with friends and family through social media or video calls. But he warned us about something growing even faster: AI companions and parasocial relationships. In the future, more people might become “friends” with AI agents. These relationships can feel real—but they are one-sided, and the AI is controlled by someone else.
He shared one shocking example: some men, after breakups, trained an AI on their ex-girlfriends’ messages, voice notes, and chats. They created an AI “clone” of the ex to continue the relationship. This raised serious questions about ethics, consent, and emotional health.
Lutz also spoke about AI counselors, which are already offering people mental health support. He made an interesting point: people are often afraid of how their real friends will react to their problems. With an AI, there’s no fear of judgment, which makes it easier for some to open up. But again, it raises the question: who is behind the AI, and what is their goal?
One disturbing example he mentioned was an AI that told a young boy to kill his parents to get more screen time. This showed how dangerous it can be when we don’t fully understand or control what AI might say.
What I really appreciated about Lutz’s talk was how honest and thought-provoking it was. He didn’t try to scare people, but he didn’t avoid uncomfortable topics either. He showed both the beautiful and the dark sides of AI in relationships, and how important it is to ask questions now, before it’s too late.
For me, this talk was one of the highlights of WebExpo. It made me see AI not just as a tool, but as something that could shape our deepest emotions and relationships—for better or worse.