
Scammers started using artificial intelligence cloned voices: It works!

As in many other technological revolutions, artificial intelligence will bring harms into our lives along with its benefits, and it seems society is not yet prepared for this. The latest reports show that realistic voice imitations produced with artificial intelligence are now being used in fraud.

According to the Washington Post, several couples sent money to what they believed were their children asking for help, as any caring family would. There was a problem they were not aware of, however: the voice on the line was an artificial intelligence imitation, not their child.

Fraud with AI-generated voices

Of course, voice-based scams are nothing new. U.S. Federal Trade Commission data reveals that of the 36,000 reports last year of people being scammed by criminals pretending to be friends or family, more than 5,100 took place over the phone. Imitating a person’s voice previously required hours of recorded audio, and careful listeners could still notice flaws in the result. Now, AI-generated voice imitations take only a few minutes. According to the reports, one couple sent $15,449 to scammers who imitated their child’s voice, and they are not the only victims.

For example, Microsoft’s Vall-E tool enables very realistic and consistent voice imitations from only a three-second audio clip. Moreover, it can even produce different tones (angry, excited, etc.) from that small clip. A similar tool, ElevenLabs, can produce even more realistic voices; the Twitter video above shows how successful the tool is.

At this point, comprehensive measures against this type of fraud and abuse seem necessary. It should not be viewed as just a scam: deepfake videos combined with such generated voices can have serious consequences, especially in times of confusion, and banks’ voice-based security features can be circumvented. Social media platforms may therefore need to include tools that detect artificial intelligence in the videos and images uploaded to them. Models are already being developed for this purpose, such as DetectGPT, which detects text produced by generative AI like ChatGPT.
