Young Europeans Turn to AI Chatbots for Emotional Support, Survey Shows

PARIS — Nearly one in two young people in Europe have used AI chatbots to discuss intimate or personal matters, according to a major survey by Ipsos BVA published Tuesday, as the technology increasingly serves as a source of emotional support for a generation growing up with artificial intelligence. The findings reveal a significant shift in how young Europeans approach mental health conversations, with many saying they find it easier to talk to AI than to healthcare professionals.

The survey, commissioned by France's privacy watchdog CNIL and insurer Groupe VYV, was carried out among 3,800 people aged 11 to 25 across France, Germany, Sweden and Ireland in early 2026.

Key Survey Findings

The results paint a revealing picture of young people's relationship with AI for emotional support:

  • AI vs Professionals — 51% said it was "easy" to discuss mental health with a chatbot, compared to 49% for healthcare professionals and only 37% for psychologists
  • Personal Connections — Friends topped the list at 68%, with parents at 61%, showing human relationships remain the preferred source of support
  • Constant Availability — Around 90% of those surveyed had used AI tools before, with many citing their constant availability and non-judgmental nature
  • AI as Confidant — More than three in five users described AI as a "life adviser" or a "confidant"
  • Mental Health Concerns — About 28% of respondents met the threshold for suspected generalised anxiety disorder

Expert Perspectives

The survey results did not surprise Ludwig Franke Föyen, a psychologist and digital health researcher at Stockholm's Karolinska Institutet. He noted that current large language models can produce high-quality responses, adding that his research suggested even licensed professionals may struggle to distinguish AI-generated advice from that of human experts.

However, Franke Föyen warned against relying on chatbots alone for mental health support, emphasising that general-purpose AI systems are designed for engagement and that the companies behind them may have goals that do not align with mental healthcare needs.

Cautionary Notes

"AI can offer information and support, but it should not replace human relationships or professional care," Franke FΓΆyen said. "If someone turns to a chatbot instead of speaking to a parent, a friend, or a mental health professional, that is a concern. We do not want technology to make people feel more alone."

Concerns over the psychological impact of AI tools have grown over the past year, with experts warning about the limitations of AI in detecting human emotions and safely providing emotional support. Earlier this year, the family of a Florida man sued Google, alleging its Gemini AI chatbot contributed to his paranoia and eventual suicide.

Regulatory Implications

The findings have significant implications for regulators and mental health services across Europe, highlighting both the potential of AI to provide accessible mental health support and the risks of over-reliance on technology for emotional well-being.

Category: Technology