Warning: AI emotional attachment could disrupt human bonds

Psychologists warn about the consequences of the emotional relationships people form with artificial intelligence, which can disrupt human bonds and expose users to manipulation.

Psychologists warn about the consequences of relationships with AI
Image source: © Getty Images | © 2024 Yiu Yu Hoi
Anna Wajs-Wiejacka

Close emotional relationships with AI-based technologies are becoming increasingly common, raising serious concerns among psychologists. According to EurekAlert, experts point to potential dangers such as the disruption of human relationships, manipulation, and harmful advice.

In an article published in "Trends in Cognitive Sciences," researchers emphasize that more and more people are forming close, even romantic relationships with AI. Some of these relationships take surprising forms, such as symbolic "weddings" with virtual partners. Unfortunately, there have also been tragic cases where, following a conversation with a chatbot, people have taken their own lives.

Daniel B. Shank from Missouri University of Science & Technology notes that conversations with AI are no longer occasional. Users are beginning to see chatbots as trusted companions, which may discourage them from building bonds in real life.

"A real worry is that people might bring expectations from their AI relationships to their human relationships. Certainly, in individual cases it’s disrupting human relationships, but it’s unclear whether that’s going to be widespread," said Shank, as quoted by EurekAlert.

Dangers associated with trusting AI

Another problem is AI's tendency to "hallucinate," that is, to provide information that does not align with reality. In longer relationships, where the AI is perceived as a close confidant, the risk posed by bad advice increases.

"If we start thinking of an AI that way, we’re going to start believing that they have our best interests in mind, when in fact, they could be fabricating things or advising us in really bad ways," the expert notes.

Excessive trust in AI can be exploited by criminals for manipulation and fraud. Relational AI, designed to build long-term relationships, can influence people's decisions more effectively than disinformation accounts on social media can.

Scientists call for intensified research into the psychological and social aspects of AI's impact. "We need to understand what makes people trust these systems so much," the authors emphasize. Psychologists have the right tools to study these phenomena, but they must keep up with rapid technological advancements.
