Scammers are now using AI to create fake voices that sound convincingly real. Whenever a powerful new technology like AI arrives before the rules to govern it, some people will use it to deceive others, and AI makes it far easier for scammers to sound believable.
We’ve long seen scams in which callers play recordings of children sounding upset to make families believe their own child is in trouble. Now, with AI voice generators, scammers can mimic your child’s actual voice, making the scam even more convincing.
What is an AI voice generator?
Voicemod, a company from Spain that makes sound effects software, explains AI voice generator technology like this:
AI voice is a fake voice that sounds like a human. It is made using artificial intelligence and deep learning. You can use these voices to turn text into speech, like Voicemod’s Text to Song feature, or to change one spoken voice into another. This is what Voicemod’s AI voice collection does.
AI voice generator technology is now being used in what the Federal Trade Commission calls “imposter scams.” In these scams, tricksters pretend to be a family member or friend to steal money, often from older people. In 2022, people reported more than 5,100 of these phone scams, with losses totaling $11 million, according to the Washington Post.
Companies like Voicemod offer AI voice-creation services with little oversight. Microsoft’s AI model, VALL-E, can mimic someone’s voice from just three seconds of their speech, Ars Technica reports.
Hany Farid, a digital forensics professor at the University of California at Berkeley, explained to the Washington Post how it works. The AI needs only a short audio clip, which could come from YouTube, a podcast, an ad, or a video on TikTok, Instagram, or Facebook. It then analyzes the voice’s pitch and timbre and generates speech that closely resembles that person.
How does the AI voice generator scam work?
Scammers are using AI voice generator technology to copy voices, often of young children, to trick relatives into thinking the child has been kidnapped. They ask for money to release the child safely.
An NBC News report shows how easy it is to take voice samples from social media and use them to say anything. The result is so convincing that a reporter fooled her coworkers into thinking they were talking to her when it was actually the AI. One coworker even offered to lend the voice a company card to make purchases.
This scam is highly believable, and it catches people off guard. It also comes in many variations. A couple in Canada, for example, lost $21,000 to a fake call from a “lawyer” who claimed to represent their son, who had supposedly caused a fatal car accident. The caller demanded money for “legal fees” while the son was “in jail,” the Washington Post reported.
What can you do to avoid falling for the AI voice generator scam?
The best way to protect yourself from scams is to be aware of them. If you know how a scam works, you’re more likely to spot it if it happens to you or someone you love.
If you get a phone call claiming someone has been kidnapped, the best thing to do is immediately call or video chat the supposed victim yourself. It might feel wrong not to call the police or pay the ransom right away, but reaching the person directly can expose the scam: if it’s fake, you’ll see or hear that they’re perfectly fine, going about their day.
Unfortunately, there aren’t many other ways to avoid being targeted by such scams. Technology that can detect AI-generated video, images, audio, and text does exist, but it isn’t yet available to the public for situations like this. With luck, the same technology causing these problems will soon help solve them too.