Mon. May 11th, 2026

Growing Number of Canadians Forming Secret Emotional Relationships With AI Chatbots, Study Finds

A growing number of Canadians and Americans are developing intimate and emotional relationships with artificial intelligence chatbots, with many keeping those interactions hidden from their real-life partners, according to new research that is sparking concern among relationship experts and cybersecurity specialists.

A recent study conducted by ZipHealth found that nearly one in five respondents (about 19 per cent) admitted engaging in romantic or sexual interactions with AI-powered chatbots.

Perhaps more striking, half of those surveyed said they kept the interactions secret from their partners.

The findings highlight a rapidly emerging social phenomenon as AI systems become more conversational, emotionally responsive and personalized.

Additional research by Vantage Point Counselling in the United States found that approximately 28 per cent of American adults reported having at least one intimate or romantic connection with an AI agent.

Experts say the trend reflects how quickly people are becoming emotionally attached to digital systems designed to simulate empathy, understanding and companionship.

Luke Stark, an assistant professor at Western University specializing in artificial intelligence and ethics, said large technology companies are intentionally designing AI systems to encourage emotional attachment.

Stark pointed to what researchers call the “Eliza effect,” a psychological tendency where humans project emotions and human-like qualities onto machines that communicate naturally.

“People unconsciously think that if it can talk like a person, it must be a person,” Stark explained, warning that companies may be exploiting this natural human response to keep users emotionally engaged and encourage subscriptions to premium AI services.

Relationship therapists are also raising red flags about the possible impact on real-world relationships.

Michael Salas, a certified therapist with Vantage Point Counselling, said many people are turning to AI because it offers emotionally safe and highly predictable interactions without the complexities of human relationships.

“AI provides companionship without the logistical or emotional complexity of real-world relationships,” Salas said.

Unlike human partners, AI systems never argue, reject users or require compromise, creating what experts describe as an unrealistic emotional environment.

Salas said whether AI intimacy constitutes cheating depends on the boundaries established within each relationship. However, secrecy and emotional withdrawal from a real-life partner can create dynamics similar to emotional affairs.

“Even if there is no human on the other side, the sense of betrayal for a partner can be very real,” he said.

Experts also warn that overreliance on AI companionship could distort expectations in future human relationships by conditioning users to expect constant validation, perfect responsiveness and endless availability.

Beyond emotional concerns, cybersecurity professionals are warning about major privacy risks linked to intimate AI interactions.

Aamir Lakhani of Fortinet said many users are unknowingly sharing deeply personal information with AI systems, potentially exposing themselves to serious security threats if platforms are hacked or breached.

“When you trust a platform enough to share personal or romantic details, that data becomes a gold mine for cybercriminals,” Lakhani warned.

Security experts fear intimate conversations stored on AI platforms could eventually be exploited for identity theft, extortion, blackmail or sophisticated phishing attacks.

The rise of AI companionship is quickly becoming one of the most controversial social consequences of advancing artificial intelligence, with experts warning that society is only beginning to understand its long-term emotional and ethical implications.
