Is There A Way To Distinguish The Fake Voices From The Real Voices?

The growing popularity of artificial intelligence technology can also be exploited by malicious actors for various purposes. Fake songs created using the voices of well-known artists have become especially popular. So, is there a way to distinguish voices cloned with artificial intelligence from real ones?

The range of uses for artificial intelligence is steadily expanding, and political leaders and world-famous figures are among those most affected. AI-generated images of former US President Donald Trump and Pope Francis, for example, made a significant impact on society.

Although songs and videos produced with fake voices may seem amusing, there is also a dark side to the matter: fraud. In one past case, a ransom was demanded from a mother for her supposedly kidnapped daughter, and a fake clone of the girl's voice was used to make the situation seem real. In fact, the girl had never been abducted.

The same fraud method becomes more realistic with artificial intelligence

Artificial intelligence scams are becoming more and more common, and they can be frightening because the voice on the other end of the phone sounds like that of a loved one. In many parts of the world, fraud has effectively become a profession for some, and as technology develops, these schemes become ever more believable. Phone scams are, of course, the easiest and most common kind; now they are scarier than ever.

In April 2023, we saw examples of this next generation of AI-powered scams. In one, the call spoofing method made a loved one appear as the caller on the victim's phone. In another, an AI voice clone was used to try to extort ransom money from a mother to free her supposedly abducted daughter.

Security company McAfee, observing this abuse of artificial intelligence, published an in-depth report on the subject. In the report, the company explains how to recognize fake voices and protect yourself against fraud.

Before going over those methods, let’s first take a look at how AI voice cloning works.

How does AI voice cloning work?

First of all, AI voice fraud is essentially a new, far more convincing version of a long-standing scam. Typically, the voice of a loved one is used either to ask for money for an emergency or to tell the victim that a loved one has been kidnapped and a ransom is demanded.

In 2019, scammers impersonating the CEO of a UK-based energy firm demanded $243,000. In Hong Kong, a bank manager was tricked by someone using voice cloning technology, resulting in a large wire transfer in early 2020. In Canada, at least eight senior citizens lost a total of $200,000 to voice cloning scams earlier this year.

Because AI voice cloning tools are so cheap and widely available, creating a voice clone is actually quite simple for malicious parties. Social media is, of course, the easiest place to obtain sample audio: the more you share your voice online, the easier it is for scammers to find and clone it.

According to survey data on this point, 26% of users say they share their voice on social media once or twice a week.

How common are AI voice scams?

Although some stories about artificial intelligence voice scams are just starting to appear in the news, research reveals that this type of scam is becoming quite common.

In global surveys, 25% of people say that they or someone they know has encountered an AI voice scam. According to the survey data, the country most affected by AI voice fraud is India, at 47%.

Voice cloning tools reach 95% accuracy

The AI tools used for voice cloning are so realistic that nearly all victims say the fake voice sounded exactly like the person they know. In the ransom case mentioned above, the victim mother stated that not only the voice but even her daughter’s manner of speaking was identical.

Because AI-generated fake voices are so similar to the real thing, people can lose thousands of dollars in a short time. According to research, 77% of victims of AI voice scams lose money, and the data show that a total of $2.6 billion was stolen through this method over the course of 2022.

So, how can you distinguish fake voices from real ones?

For now, there is no surefire and quick way to prevent AI voice scams. However, there are some precautions you can take on your own.

  1. Limit how much audio and/or video of yourself you share online, and/or set your social media accounts to private instead of public.
  2. If you receive a suspicious call in which a loved one needs immediate financial help, ask a question or two that only that person could answer (never one whose answer can be found online).
  3. Cybercriminals rely on the emotional connection between you and the person they are impersonating to pressure the victim. Be skeptical and don’t be afraid to ask questions. If your doubts are not resolved, hang up and call that person directly.
