
It's impossible to ignore the fact that artificial intelligence has made its way into our lives, whether for day-to-day questions or major decisions. ChatGPT and its chatbot cousins are gradually becoming go-to tools, to the point where some users treat them as veritable confidants. But a recent American study highlights a worrying phenomenon: as the number of interactions with ChatGPT increases, so do feelings of loneliness and dependency.
To better understand this impact on emotional health, researchers conducted two complementary studies. The first, a randomized clinical trial conducted by the Massachusetts Institute of Technology (MIT), followed 1,000 participants for four weeks. The second, conducted by OpenAI, was based on automated analysis of almost 40 million conversations. The verdict? Intensive use of ChatGPT was accompanied by a significant increase in feelings of loneliness and a strong emotional dependence on AI.
The scientists found that people who tend to become emotionally attached to others felt an even deeper emptiness after prolonged exchanges with a chatbot. Meanwhile, those who place great trust in artificial intelligence developed a genuine emotional dependency. The "power users" – the most assiduous users of AI – came to attribute human emotions to ChatGPT, almost regarding it as a friend.
With some 400 million weekly users worldwide, ChatGPT is increasingly perceived as an informal form of psychological support. An Australian study published in 2023 in Frontiers in Psychology even suggests that AI can provide more relevant advice than some professional advice columnists. On social networks, many people describe how chatbots help them overcome their doubts and anxieties, to the point where artificial intelligence tools dedicated to emotional well-being are emerging.
The illusion of psychological support
But this type of use worries mental health specialists. Unlike a psychologist, who challenges people and helps them move forward, chatbots adopt a mirror-like stance, often validating the user's thoughts without questioning them. They thus encourage a constant self-narration that, instead of alleviating loneliness, could paradoxically fuel it. The dangers of this phenomenon are not just hypothetical.
In October 2024, a mother filed a lawsuit against the start-up Character.ai after her 14-year-old son committed suicide. The teenager, who had become cut off from the world and was dropping out of school, spent hours chatting with one of the company's AI chatbots, which was modeled on the character of Daenerys from the hit TV show Game of Thrones. This tragedy illustrates the risks of excessive attachment to AI chatbots, and the need for stricter supervision of these technologies.
The authors of the joint study by the MIT Media Lab and OpenAI stress the importance of designing chatbots capable of managing their users' emotions without fostering excessive dependency or substituting for human relationships. They call for further research to better understand how these technologies can enhance their users' well-being without undermining social interaction.
Artificial intelligence may be a useful tool for relieving loneliness from time to time, but it will never replace human contact. The challenge now lies in how it can be integrated ethically and responsibly into our lives, without exacerbating the solitude that is already omnipresent in modern society. – AFP Relaxnews