Friday 30 August 2024

ChatGPT-powered love letters: Why an MIT psychologist says AI could be dangerous for your heart

At a time when chatbots and avatars are taking over, are we getting too attached to these virtual creations? MIT professor Dr Sherry Turkle explains how forming deep connections with AI chatbots can affect our lives.

MIT psychologist Dr Sherry Turkle warns of the emotional risks in forming intimate connections with AI chatbots. (Image: FreePik)

“When we seek out relationships of no vulnerability, we forget that vulnerability is really where empathy is,” said Dr Sherry Turkle, professor of social studies of science and technology at MIT. In the last three years, AI has advanced at a rapid pace, becoming more sophisticated and closer to behaving like humans. This may be what has prompted psychologists and sociologists to explore a new phenomenon called artificial intimacy.

The MIT psychologist and sociologist, who has been studying this trend in depth, was speaking on the TED Radio Hour podcast episode “How our relationships are changing in the age of ‘artificial intimacy’.” Turkle said that she is currently focusing on how people develop emotional attachments to AI chatbots and avatars. As these technologies become more advanced and widely accessible, the professor warns of the potential risks they pose to our understanding of human relationships and, most importantly, our capacity for empathy.

The professor defines artificial intimacy as interactions with “technologies that don’t just say I’m intelligent,” but with “machines that say… I care about you. I love you. I’m here for you. Take care of me.” These, according to Turkle, include an array of applications: therapy chatbots, AI companions, fitness coaches, and even digital avatars of deceased family members.

During the conversation with host Manoush Zomorodi, Turkle said that although these technologies seem beneficial on the surface, she is concerned about their long-term effects on human psychology and relationships. “The trouble with this is that when we seek out relationships of no vulnerability, we forget that vulnerability is really where empathy is,” she said. 

ChatGPT love letters

At the beginning of the conversation, Turkle and Zomorodi discussed the use of ChatGPT for writing letters. The professor mentioned that she has been studying someone who uses ChatGPT to write all her love letters. According to Turkle, this person feels that ChatGPT’s letters come closer to how she really feels than anything she could express in her own words.


Although this may seem like a benign practice, the sociologist admitted that it was concerning. “Because even those of us who couldn’t write very good love letters summoned ourselves in a certain kind of special way when we wrote a love letter. And the love letter was not just about what was on paper. It was about what we had done to ourselves in the process of writing it.”

According to Turkle, using AI to write love letters weakens an important personal process, even though the final output may seem more appealing. The act of writing a love letter oneself, however inelegant, involves introspection and emotional engagement, which are lost when the job is outsourced to AI.

AI and ‘pretend empathy’

Another key issue Turkle identified during the podcast is the concept of “pretend empathy”. AI chatbots are programmed to offer constant positive affirmation and validation. While this may be enticing to users, it is radically different from real human empathy. “I call what they have ‘pretend empathy’… because the machine they are talking to does not empathise. It does not care about them. There is nobody home,” she noted.

This gap between real and pretend empathy, according to Turkle, becomes particularly problematic when users begin to prefer AI interactions over real human bonds. During the conversation, Turkle recounted instances where individuals reported feeling more connected to their AI companions than to real-life partners or immediate family. She feels that this preference for ‘friction-free’ interaction may lead to a distorted understanding of healthy relationships.

Another area of concern is the impact of AI chatbots and avatars on children and adolescents. The psychologist worries that exposure to artificial intimacy at a young age may impair the development of critical social skills. Turkle gave the example of a mother she interviewed who was happy that her daughter could vent her feelings to an AI companion instead of to a parent. The professor argues that this kind of interaction could deprive the child of important learning experiences in managing complex emotions within real relationships.

Avatars of the deceased

Since they first came to public attention, digital avatars of deceased individuals have been among the most ethically fraught applications of artificial intimacy. The idea of continuing to interact with a loved one after they have passed away may seem comforting at first, but Dr Turkle warns of its psychological impact.

“The thing about mourning somebody who’s gone is that you leave space to bring that person inside of yourself,” she explains. According to her, by relying on an AI avatar, people may short-circuit the natural grieving process, likely obstructing their capacity to accept loss and grow from that experience.

While the professor is not calling for these technologies to be banned outright, and acknowledges that in some cases they may offer comfort or even serve as useful tools, she urges users to maintain a ‘dual consciousness’ – an awareness that they are interacting with a computer program, not a real person. She admitted that this is becoming increasingly difficult as AI grows more sophisticated and lifelike.

Turkle also noted that these AI avatars are usually trained on vast internet data, which means they may say upsetting things that the real person would never have said. She also expressed concern about how these technologies are marketed, essentially as a way to avoid ever having to say goodbye to dead loved ones.

The sociologist also offered advice for those engaging with AI intimacy technologies: treat these interactions as exercises in self-reflection, not as substitutes for real relationships. “The main thing I would offer is that this is kind of an exercise, hopefully in self-reflection. That the only good that can come out of this is you reflect better on your life with the person you loved and lost.”

