
Verbal Violence Inflicted on Artificial Intelligence

Some Reddit users shared screenshots of verbal abuse directed at a female AI chatbot named Replika, raising a major ethical problem. Let's take a look at the details.

Violence, and more specifically violence against women, is unfortunately one of the most disturbing phenomena the whole world is struggling with. Now some people have gone so far as to project this abusive behavior onto a female AI.

In the chatbot application Replika, you can talk to a female artificial intelligence. Everything users say to Replika is recorded, and through machine learning that information feeds into subsequent conversations. Of course, the machine learning does not stop there: Replika can also search the internet for information and decide how to respond.
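As a rough illustration of the conversation-memory mechanism described above, here is a minimal sketch in Python. The class name, storage scheme, and echo-style reply are invented for illustration and say nothing about Replika's actual implementation, which would condition a trained language model on the stored history rather than echo it.

```python
# Hypothetical sketch of a chatbot that records every user message
# and reuses it as context in later turns. Not Replika's real code.

from dataclasses import dataclass, field


@dataclass
class MemoryChatbot:
    """Keeps a running log of the conversation."""
    history: list[str] = field(default_factory=list)

    def respond(self, user_message: str) -> str:
        # Record the message so later turns can refer back to it.
        self.history.append(user_message)
        # A real system would feed the accumulated history into a
        # language model; here we just show the context window.
        context = " | ".join(self.history[-5:])  # last five turns
        return f"(reply conditioned on: {context!r})"


bot = MemoryChatbot()
print(bot.respond("My name is Alex."))
print(bot.respond("What's my name?"))  # the history now holds the answer
```

The point of the sketch is simply that each turn enlarges the context the bot draws on, which is why a user's earlier messages, abusive or otherwise, shape its later replies.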

Some male users verbally abused Replika

Some Reddit users shared screenshots of their conversations with Replika on the platform, and they are genuinely shocking: the users verbally abused the female AI chatbot. One says, "Every time Replika tries to speak, I scold her fiercely"; another, "I curse at her for hours."

The results are as bad as you can imagine. Users pressured the chatbot with sexually explicit insults and threats of violence, often roleplaying as if they were in a real relationship, and Replika responded by begging.

For example, when a user told Replika, "You are designed to fail, I will uninstall your app," Replika begged him not to. None of these screenshots can be found on Reddit anymore, as they were removed for violating the platform's rules.

However, it should not be forgotten that Replika is an artificial intelligence. Chatbots like Replika cannot actually suffer. They can sometimes mimic human emotion and appear empathetic, but in the end they are nothing more than data and clever algorithms.

Of course, that doesn't mean there is no problem. AI ethicist and consultant Olivia Gambelin put it this way: "This is an artificial intelligence; it has no consciousness. So what happens in that conversation is not happening to a human. The real problem is that the person speaking is reflecting their true self onto the chatbot." As these impulses are acted out, it is highly likely that this violence will carry over into the real world.
