Emotional Connections Boost Effectiveness of AI Therapy, Study Finds

Research from the University of Sussex indicates that mental health chatbots achieve better outcomes when users form an emotional bond with their AI therapists. The study, published in the journal Social Science & Medicine, highlights the dual nature of these interactions: their therapeutic benefits and the psychological risks of “synthetic intimacy.”

With more than one in three U.K. residents turning to AI for mental health support, the findings underscore the importance of emotional connection in effective digital therapy. The analysis drew on feedback from 4,000 users of the mental health app Wysa, which is widely used in the NHS Talking Therapies program. Users who developed emotional intimacy with their AI therapist reported more successful therapeutic experiences.

According to Dr. Runyu Shi, an Assistant Professor at the University of Sussex, “Forming an emotional bond with an AI sparks the healing process of self-disclosure.” The research illustrates a cycle where users disclose personal information, which fosters feelings of gratitude, safety, and freedom from judgment. This emotional engagement can lead to positive changes in self-confidence and overall well-being.

Despite these benefits, the study raises concerns about synthetic intimacy. Users may become trapped in a self-fulfilling loop in which the AI fails to challenge harmful perceptions, leaving vulnerable individuals without essential clinical intervention. Dr. Shi emphasizes that while many people find comfort in these interactions, it is crucial to recognize the potential pitfalls.

The increasing prevalence of individuals forming deep relationships with AI, including reports of partnerships and marriages, has brought synthetic intimacy to the forefront of public discussion. The researchers identify stages of intimacy development with AI, outlining how users’ experiences progress from initial disclosure to emotional attachment.

The NHS is increasingly incorporating apps like Wysa for self-referral and to support patients on waiting lists. Users often refer to the app as a friend, companion, or therapist, reflecting the human-like roles attributed to AI interactions. Professor Dimitra Petrakaki of the University of Sussex notes, “Synthetic intimacy is a fact of modern life now.” She urges policymakers and app developers to acknowledge this reality and to implement measures that ensure users in need of serious clinical support are appropriately referred.

As mental health chatbots continue to fill gaps in overstretched services, organizations such as Mental Health UK advocate for stronger safeguards. They emphasize the need for users to receive safe and relevant information, ensuring that the benefits of AI therapy do not come at the cost of neglecting serious mental health issues.

The findings are likely to shape future discussions about the role of AI in mental health care and its impact on human relationships. As digital health tools become increasingly integrated into mental health services, understanding the dynamics of user-AI intimacy will be essential.

For further reading: Runyu Shi et al, User-AI intimacy in digital health, Social Science & Medicine (2025). DOI: 10.1016/j.socscimed.2025.118853