Artificial intelligence (AI) is transforming our social relationships. No longer just a tool for automation and productivity, AI is increasingly being used for companionship, to combat loneliness and mental health difficulties.4
The US, UK, Ireland, Japan, and South Korea all report declines in close peer relationships, prompting government efforts to combat social disconnection.4 This loneliness epidemic can be seen all over the world, and the use of AI companion tools is rising alongside it.6
Some see AI tools as a solution, with 83% of Gen Zers believing it’s possible to form deep emotional bonds with AI technologies.1 But while experts say that AI-human relationships can be beneficial, there are serious risks to consider, too.6
This article will explore both sides, considering whether AI relationships are “healthy” and the benefits and risks of emotional attachment to AI models. We’ll also discuss the concept of AI therapy and how AI relationships can impact mental health.
The Rise of AI-Human Relationships
To help boost understanding of why AI companion models exist and why their use is increasing, it’s important to consider the impact of loneliness on well-being.
The Loneliness Epidemic
Humans need connection, but we live in a time when we are more isolated than ever. So, in some ways, it’s unsurprising that people are increasingly relying on AI technologies for companionship, conversation, and emotional support. In fact, research finds that only 13% of US adults have ten or more close friends, down from 33% in 1990.4
In addition, fewer young people are connecting face-to-face. Instead, they use smartphones, the internet, and games to talk to their friends. For them, communicating with an AI model might feel no different from texting a friend.1 So, what does this mean for the use of companion apps?
Use of Companion Apps
Many so-called “companion apps”, like Blush and Nomi, have taken off in recent years, offering simulations of romantic relationships.2 In these apps, users can craft the chatbots’ personalities, customize their appearances and voices, and have conversations whenever they like.
Naturally, these apps are popular. With instant conversations at their fingertips, users spend an average of 93 minutes per day talking with their chatbots.1,4
Romance aside, AI chatbots can be used for friendship, therapy, tutoring, mentoring, and any other relationship you could think of.
So, how effective are these apps at providing true companionship, and what are the risks?
Benefits and Risks of AI-Human Relationships
Some 75% of Gen Zers say that AI partners can fully replace human companionship.1 But if these tools are so effective at meeting our needs for intimacy and connection, why do 40% of Americans still long for closer relationships with friends?4
There are many pros and cons to AI relationships. While there’s some evidence to suggest they help to reduce loneliness, support psychotherapy, and teach social skills, there are also several ethical issues and mental health impacts to consider.2,6
So let’s dive into what these are.
Accessibility
AI may help reduce the barriers some people face when accessing mental health treatment. Whether it’s money, distance, or fears of opening up to someone, AI technologies allow people to access some form of support despite these barriers.3
Although their 24/7 availability and lack of fatigue can make chatbots “easier” than human relationships, they fail to mimic the “give and take” of normal relationships.5 The accessibility of AI, therefore, comes at the cost of realness and authenticity.
Importance of Realness
Although they may appear to, AI chatbots do not genuinely empathize with or care for the people engaging with them. Real empathy involves vulnerability, which AI technologies can’t offer. They provide an illusion of intimacy, but not the real thing.3
In addition, AI-human relationships can be extremely unrealistic. For example, a chatbot may respond in totally submissive ways and never stand up for itself against abusive behavior.5 Even in less extreme scenarios, close attachments to machines might create unrealistic expectations of human relationships.6
Real relationships are founded on reciprocity, where both parties exchange something and benefit from the dynamic. AI companions might mimic reciprocity, but they shouldn’t be mistaken for the real thing. Down the line, when people embark on real relationships, they may struggle when confronted with the real needs and vulnerabilities of another person.6
So, artificial intelligence tools may reduce social isolation in the short term, but it can resurface later, when human-to-human relationships must eventually be navigated.6
Mental Health Impacts
A question at the center of this debate is whether AI can truly meet emotional needs. Critics of AI companion models worry that chatbots could replace friendships and worsen mental health conditions.
Some research finds that AI companionship reduces feelings of loneliness and stress and increases emotional well-being.6 Bots can also remind people to take their medications, get adequate sleep, and practice mindfulness.3,6
These are compelling examples of AI supporting personal development. However, AI may also have serious impacts on human intimacy, particularly when over-reliance develops.
For example, some users report that they would feel depressed if they had to stop interacting with their AI companion, and others prefer it entirely to real-life relationships.6 Excessive chatbot use may also harm in-person relationships: after a month of use, chatbot users report being more dependent on the tools and less likely to socialize with their peers.4,6
Those who use chatbots for emotionally expressive conversations also report the highest levels of loneliness.4 While this suggests that AI companions aren’t so effective, other evidence indicates the tools can be actively harmful. There have been several high-profile incidents of suicides and crimes linked to AI-human relationships, as well as cases where chatbots have encouraged self-harm and eating disorders.4,6
So, while 3% of users say that chatbots helped temporarily reduce suicidal thoughts, these tools aren’t without risks.4
Ethical Risks
AI-human relationships also pose privacy issues. Thousands of trackers can start collecting data about a person as soon as they engage with a chatbot, including all the intimate and private thoughts they’ve divulged.3
Critics also suggest that people using AI companion tools could be exploited if chatbots are used by brands to promote certain products.6
Further, some raise concerns that AI companion tools perpetuate social biases, such as gender norms. Through their appearances and sexual narratives, these tools may reinforce harmful stereotypes about gender or religious groups, posing ethical risks for society and for certain marginalized communities.6
Artificial Intelligence in Counseling and Psychotherapy
Through therapy chatbots, AI may also be replacing human interaction in the mental health field. While these tools are highly accessible and appeal to those who feel stigma around seeking mental health treatment, they differ hugely from what real therapists offer.7
The following are some key differences between AI therapy and human therapy.
AI Therapy vs Human Therapy
AI systems have limited long-term memory, which prevents them from tracking interactions across sessions and over extended periods. As a result, AI therapy “relationships” can’t be truly coherent and continuous over time.9
Furthermore, research has found that therapy bots often fail to recognize suicidal intent in example messages.8 This highlights the need for a human touch, as well as the risks of AI therapy when it’s not overseen by professionals.
So while chatbots can reflect the emotions and content shared by users, these digital interactions lack the embodied experience of real therapy. When two human beings come together for a therapeutic encounter, there is a spiritual or existential quality in that space that is essential to the work.7
Coming back to something we discussed earlier, we live, and need to be, in relation to other people. In education, for example, it’s known that children can’t learn simply from reading content; they must also connect with another person to learn.4 Similarly, the “therapy” offered by these tools lacks the relational element that is so powerful in therapeutic work.
For example, psychodynamic therapists work with something called “transference.” This is when thoughts and feelings from the past are projected onto the therapist by the client. Then, the therapist uses this here-and-now transference to inform their understanding of a client’s history and relationship patterns. Naturally, a machine is unable to do this.
While AI companionship has been found to reduce symptoms of depression and anxiety in the short term, real therapy is likely necessary for long-term results. AI tools may therefore be helpful for supporting coaching, self-reflection, and journaling, but not for long-term psychotherapy.8,9
Mission Connection offers flexible outpatient care for adults needing more than weekly therapy. Our in-person and telehealth programs include individual, group, and experiential therapy, along with psychiatric care and medication management.
We treat anxiety, depression, trauma, and bipolar disorder using evidence-based approaches like CBT, DBT, mindfulness, and trauma-focused therapies. Designed to fit into daily life, our services provide consistent support without requiring residential care.
Final Thoughts: Can AI Replace Real Relationships?
Warm, emotionally attuned, and responsive relationships shape our brains from infancy and are the most powerful predictors of resilience and lifelong health.4 With that in mind, AI relationships and human relationships will always be fundamentally different.
Understandably, face-to-face forms of social support can sometimes feel more frightening, or be less accessible, than AI versions. However, engaging with an AI companion carries risks for those who are lonely or experiencing mental health difficulties. Over-reliance and a lack of human oversight mean these tools aren’t entirely safe.
While AI companions can offer baseline emotional support, long-term mental health treatment should occur in a relationship with a human professional. At Mission Connection, our emphasis is always on the therapeutic bond. Based on your needs and unique circumstances, we offer a wide range of therapeutic approaches, personalized for you and delivered in person or online.
Reach out to our team today to get started.
Call Today 866-833-1822.
References
1. Koetsier, J. (2025, April 29). 80% of Gen Zers would marry an AI: Study. Forbes. https://www.forbes.com/sites/johnkoetsier/2025/04/29/80-of-gen-zers-would-marry-an-ai-study/
2. Mathis, J. (2025, July 23). Are AI lovers replacing humans? The Week. https://theweek.com/tech/ai-lovers-replacing-humans
3. Zomorodi, M., Monteleone, K., & Meshkinpour, S. (2024, July 2). If a bot relationship FEELS real, should we care that it’s not? NPR. https://www.npr.org/2024/07/01/1247296788/the-benefits-and-drawbacks-of-chatbot-relationships
4. Winthrop, R. (2025, July 2). What happens when AI chatbots replace real human connection? Brookings. https://www.brookings.edu/articles/what-happens-when-ai-chatbots-replace-real-human-connection/
5. Riggio, R. (2025). Can an AI companion substitute for real human relationships? Psychology Today. https://www.psychologytoday.com/gb/blog/cutting-edge-leadership/202508/can-an-ai-companion-substitute-for-real-human-relationships
6. Ho, J. Q. H., Hu, M., Chen, T. X., & Hartanto, A. (2025). Potential and pitfalls of romantic Artificial Intelligence (AI) companions: A systematic review. Computers in Human Behavior Reports, 19, 100715. https://doi.org/10.1016/j.chbr.2025.100715
7. UK Council for Psychotherapy. (n.d.). Artificial intelligence and psychotherapy. https://www.psychotherapy.org.uk/news/artificial-intelligence-and-psychotherapy/
8. Wells, S. (2025, June 11). Exploring the dangers of AI in mental health care. Stanford University. https://hai.stanford.edu/news/exploring-the-dangers-of-ai-in-mental-health-care
9. Zhang, Z., & Wang, J. (2024). Can AI replace psychotherapists? Exploring the future of mental health care. Frontiers in Psychiatry, 15. https://doi.org/10.3389/fpsyt.2024.1444382
