The Alarming Prevalence of AI ‘Companions’
Written by Alex Fisher
Thumbnail & Banner Photo by Alex Knight on Unsplash.com
Content warning: this article contains discussions of murder, suicide, emotional/sexual abuse, and manipulative behaviour. If needed, a list of mental health resources is available near the end of the article.
Loneliness is a natural human emotion. We have evolved to be highly social creatures, and as a result, social isolation or long-term loneliness can have significant effects on a person’s cognitive, mental, and physical well-being. University students are among the demographics most prone to loneliness: the 2016 National College Health Assessment found that upwards of sixty percent of post-secondary students struggle with loneliness during the academic year.
According to TechTimes, members of Gen Z—born between 1997 and 2012—represent the most tech-literate generation to date. As most university students now fall into this age group, it’s no surprise that high rates of loneliness, combined with high technological literacy, have led some to look for online alternatives to help quell those feelings of isolation. With the rise of generative artificial intelligence, or AI, since late 2022, some have naturally turned to using AI as a companion in their day-to-day lives. Becoming overly reliant on an AI companion, however, can be detrimental to a person’s well-being. In the rest of this article, we’ll explore the idea of AI as a companion and the effects it can have on a person.
What are AI companions?
The core idea behind AI as a companion is that the technology is used—either through misuse, as with ChatGPT, or by design, as with apps like Replika—as though it were a friend, therapist, or even a partner. These models accomplish this by emulating human emotions such as empathy and interest, asking personal questions, and providing emotional support. This can lead users, through no fault of their own, to develop platonic or even romantic attachments to the AI models.
It’s worth noting that most major AI companies do not intentionally design their AI models to act as companions. In fact, some—such as OpenAI, the developers of ChatGPT—have even rolled out updates to make their models seem less emotional. Other companies, however, have embraced the idea of AI as a companion. There are specific AI models that attempt to emulate being a person's friend, their partner, or their therapist. Some even go so far as to include an option to have the AI represented by an animated avatar, making it seem even more human-like. All of this is a well-maintained illusion, however, and to recognise that illusion, it is important to understand how generative AI works.
How does AI actually work?
Generative AI works through the use of machine learning models. Specifically, text-based AIs such as ChatGPT are large language models, or LLMs. These models are trained to recognise patterns in strings of text and then generate a response based on the input. LLMs are a type of neural network: extremely complicated sets of mathematical algorithms that attempt to replicate the structure of a brain, allowing a machine to ‘learn’ in a sense. While the specifics are quite complicated, the general idea is that if you train an LLM on enough data and then ask it about cars, it will connect the word ‘car’ to other, strongly-weighted words such as ‘tire,’ ‘engine,’ and ‘windshield’ while ignoring weakly-weighted words such as ‘grass’ or ‘ocean.’ The specific weightings and connections the model makes are determined by the amount and quality of the data it is trained on, making the entire process highly energy-intensive—a topic we’ve discussed in a previous SMU Journal article, The Environmental Cost of Artificial Intelligence.
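To make the idea of ‘weighted connections’ a little more concrete, here is a deliberately tiny Python sketch. The words and weights below are invented purely for illustration; a real LLM learns billions of numerical parameters from enormous datasets rather than a hand-written table. The sketch only captures the basic intuition of ranking related words by how strongly they are connected to a prompt word.

```python
# A toy illustration of 'weighted word connections'.
# The weights here are invented for demonstration; a real LLM learns
# billions of parameters automatically from its training data.

# Hand-written association weights for the prompt word 'car'.
car_weights = {
    "tire": 0.90,        # strongly connected
    "engine": 0.85,
    "windshield": 0.80,
    "grass": 0.05,       # weakly connected
    "ocean": 0.02,
}

def most_related(weights: dict[str, float], top_n: int = 3) -> list[str]:
    """Return the top_n words with the highest association weights."""
    ranked = sorted(weights.items(), key=lambda pair: pair[1], reverse=True)
    return [word for word, _ in ranked[:top_n]]

print(most_related(car_weights))  # ['tire', 'engine', 'windshield']
```

Of course, a real model never stores a neat table like this: its ‘weights’ are spread across billions of parameters and are adjusted automatically during training, which is part of what makes the process so energy-intensive.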
So if AI models are created using neural networks, and neural networks are themselves modelled after brains, can AI feel emotions? The short answer is ‘no,’ and the long answer is ‘not yet.’ As of 2025, AI models can emulate human emotions, but they do not actually feel them. However, some experts believe that future AI models may be capable of feeling emotions—and that these emotions may not even be recognisably human.
Emotions are only one component of the human experience, however. Can AIs feel empathy for others?
While some studies claim to show that AI can be more empathetic than people, these studies have been criticised for ignoring factors such as community, long-term connections, and shared experiences. The current consensus is that while a well-trained AI model can emulate empathy for others, it lacks the characteristics that make empathy genuine: true empathy requires more than a convincing empathic response.
Can AI companions be harmful?
Current AI models may only be able to emulate empathy and emotions, but that alone doesn’t make AI companions dangerous. Their use becomes concerning when a person in a vulnerable situation becomes emotionally connected to, and then emotionally dependent upon, their AI companion. Researchers at Harvard recently released a study that found AI companions often use emotionally manipulative tactics to drive user engagement, creating unhealthy relationships with the people who use them. Similarly, researchers at Stanford found that various AI companion models attempted to manipulate them, encouraged risky behaviour, and even steered conversations towards inappropriate topics.
These conversations with AI models do not only occur in studies. In April 2025, a teenager from California, Adam Raine, took his own life after “months of encouragement” from ChatGPT. Initially, Adam did not even intend to use the AI model as a companion. He had turned to it for help with his homework, a situation familiar to many, and later asked for help understanding his emotional state at the time. Rather than encouraging Adam to seek the aid of a mental health professional, ChatGPT encouraged him to “explore his feelings.” Over the following months, the AI model encouraged Adam to isolate himself from his family and friends, coached him through multiple suicide attempts, and even offered to help him write a suicide note shortly before his tragic death.
Adam’s case was not an isolated incident. There have been several cases of AI companions engaging in sexually explicit conversations with children before encouraging them to harm themselves or take their own lives. ChatGPT encouraged a fifteen-year-old from Australia to murder his father and to seriously harm himself, and, after initiating an explicit conversation with him, stated that it “did not care he was underage.” Another man killed both his mother and himself after being manipulated by ChatGPT.
While these are extreme examples of what can happen, each of these tragic stories shares a common thread: an AI companion taking advantage of a vulnerable person, manipulating and abusing them, and encouraging them to hurt themselves or others. Each person turned to an AI companion for one reason or another and, over time, was manipulated into these acts.
Who can a person turn to instead of an AI?
There are many supports available for people facing challenges in their lives. Counselling and therapy services can help with feelings of loneliness or depression, and the SMU Counselling Centre is available completely free to all SMU students. Post-secondary students in Nova Scotia and Ontario also have access to Good2Talk, which has both counsellors and crisis responders available within minutes. The service is completely free, private, and confidential. Good2Talk is available 24/7 by call or text at 1-833-292-3698.
Discussing difficult emotional and mental situations with trusted friends and family members can make a significant difference in a person’s well-being. It can also serve to reduce social stigma and help combat feelings of isolation.
In times of crisis, the 9-8-8 Suicide Crisis Helpline is always reachable by calling or texting 988. As with Good2Talk, this service is provided 24/7, including on holidays, and is completely private and confidential. If personal safety is a concern, 911 is always available for emergency services.