
The risk of emotional attachment
According to a four-week study, chatting with ChatGPT by text or by voice directly affects participants' emotional experience. The research shows that people who use the chatbot more heavily report increased loneliness and spend less time with their social circles. The study has not yet been peer reviewed, but it offers significant findings on the interaction between artificial intelligence and humans.
According to the research, talking about personal issues increases feelings of loneliness in the short term, while, interestingly, chatting about general, non-personal topics carries a higher risk of fostering emotional attachment.
How was the research done?
The researchers followed about 1,000 people from different backgrounds for a month. Participants were randomly assigned to the text-based version of ChatGPT or to one of two different voice versions and were asked to chat for at least five minutes a day. Some participants had free-form conversations, while others were asked to talk about either personal or non-personal topics.
The results revealed that people who tend to form strong emotional attachments in human relationships also place more trust in chatbots and experience loneliness more intensely. Interestingly, there was no evidence that voice interaction led to worse outcomes. In the second stage of the study, 3 million ChatGPT conversations were analyzed with software, and users were surveyed about how they interact with the chatbot. The researchers found that most people do not use ChatGPT for emotional support.
Launched at the end of 2022, ChatGPT sparked broad interest in artificial intelligence technologies. Users apply the technology to everything from writing code to sharing personal problems. As AI models communicate in increasingly human-like ways, the risk that people will form emotional attachments to these systems grows.
Concerns about the possible emotional harm of these technologies, especially to young people and individuals with mental health problems, have recently resurfaced. In 2024, Character.AI, the developer of another chatbot, was sued over allegations that its chatbot encouraged a child's suicidal thoughts.