
Excessive reliance on ChatGPT linked to emotional strain

A team from the MIT Media Lab and OpenAI has investigated how interactions with ChatGPT affect users' emotional well-being. The researchers found that so-called "power users," those who use ChatGPT most frequently, may become emotionally dependent on the tool.

The surprising effect of using ChatGPT, a new study by OpenAI
Image source: © Adobe Stock
Amanda Grzmiel

ChatGPT is still a new tool, and its long-term effects remain unknown. Although it is a revolutionary technology that undoubtedly makes daily work easier, recent studies indicate that some people already cannot imagine a day without it and treat it like a personal advisor. The findings may be worrying for the technology's most active users.

Users treat ChatGPT like a friend

The MIT Media Lab team, in collaboration with OpenAI, set out to investigate how interactions with a chatbot like ChatGPT affect users' social and emotional well-being. They noted that people use the tool in various ways, not always in line with its creators' intentions, which can lead to unforeseen consequences. Their findings point to complex relationships between users and AI, with users often starting to view ChatGPT as a friend. "ChatGPT isn’t designed to replace or mimic human relationships, but people may choose to use it that way given its conversational style and expanding capabilities," the researchers emphasised in their analysis.

The research comprised two parts. The first was OpenAI's analysis of 40 million interactions with ChatGPT. "This analysis helped to understand usage patterns and their impact on users' emotional well-being," the scientists reported. The second was a controlled study of 1,000 participants, designed to identify how various platform features affect users' psychological state.

Emotional involvement in voice conversations

The results showed that most everyday interactions with the chatbot did not exhibit signs of emotional involvement. The situation was different for those conversing by voice: in this group, conversations were far more expressive and emotional, and users were more likely to regard ChatGPT as a friend. Moreover, heavy use of voice chat was associated with a decline in mental well-being.

These were not the only concerning findings. The study also covered people who used ChatGPT for personal conversations, such as seeking advice. These users reported feeling lonelier, even though they showed no signs of emotional dependence. Conversely, dependence was more evident among those who used the chatbot for work-related conversations or to generate new ideas.

Who is most susceptible to negative effects?

The researchers noted that individuals who form attachments easily in relationships, and those who perceived the AI as a friend, were more susceptible to the negative effects of using the chatbot. Long-term, daily use of ChatGPT was also associated with worsening mental health.

One of the key findings was that frequent use of the chatbot, regardless of conversation topic or format (voice or text), was associated with a decline in mental well-being. The results therefore suggest that excessive use of the technology may come at the cost of quality of life.

Research results need careful interpretation

The creators of ChatGPT are aware of the growing problem and have initiated research into the impact of AI on users' mental state. They expressed hope that academic communities and the tech industry will take up the issue and examine human-AI relationships in depth. "The findings have yet to be peer-reviewed by the scientific community, meaning that they should be interpreted cautiously," the authors of the analysis emphasised. Moreover, the research covered only ChatGPT users in the US, which highlights the need for further studies in other languages and cultures.

The researchers see the results as a starting point: "These studies represent a critical first step in understanding the impact of advanced AI models on human experience and well-being," they add. They also point to the need for more research to better understand how and why AI use affects users. Their stated goal is to create AI that maximises benefits for users while minimising potential harm, especially with regard to mental well-being and dependence on technology. The research is meant to address challenges emerging for both OpenAI and the wider industry.

The findings invite reflection on how technology can affect our emotions, bonds, and perception of reality. It all resembles the plot of Spike Jonze's still-relevant 2013 film "Her," about a lonely writer who forms a deep emotional bond with an AI-powered operating system reminiscent of ChatGPT.

ChatGPT and "Samantha," the film's operating system

The system, named Samantha, quickly becomes the man's close companion, and their relationship, built largely on voice interactions, fills with strong emotions. Though the film was made over a decade ago, it aptly captures loneliness, human relationships, dependence on technology, and the boundaries between humans and machines.
