Posted Mar 17, 2023 5:02pm ET
Artificial intelligence expert Marie Haynes says AI tools will soon make it difficult to distinguish an AI-generated voice from a real person’s. (Dave Charbonneau/CTV News Ottawa)
As artificial intelligence technology continues to advance, scammers are finding new ways to exploit it.
Voice cloning has become a particularly dangerous tool, with scammers using it to impersonate the voices of people their victims know and trust in order to trick them into handing over money.
“People will soon be able to use tools like ChatGPT or even Bing and eventually Google to create voices that sound very similar to their own voice by using their cadence,” said Marie Haynes, an expert on artificial intelligence. “And it’s going to be very, very hard to tell apart from a real living person.”
She warns that voice cloning will be the new tool for scammers pretending to be someone else.
Carmi Levy, a technology analyst, explains that scammers can even spoof family and friends’ phone numbers to make it appear as if the call is actually coming from the person they are impersonating.
“Scammers are using increasingly sophisticated tools to convince us that the phone ringing is actually from that family member or important person. This person we know,” he says.
Levy advises people who receive suspicious calls to hang up and directly call the person who is supposedly on the line.
“If you get a call and it sounds just a little wrong, the first thing you should do is say, ‘Okay, thanks for letting me know. I will call my grandson, my granddaughter, whoever it is that you are telling me directly that you are in trouble.’ Then stop and call her,” he advises.
Haynes also warns that voice cloning is just the start, as AI is powerful enough to clone someone’s face as well.
“If I get a FaceTime call anytime soon, how will I know it’s really someone I know?” she says. “Maybe it’s someone pretending to be that person.”
As this technology becomes more widespread, experts are urging people to be vigilant and screen calls from friends and family before sending money.
“There are all kinds of tools that can take the written word and turn it into a voice,” says Haynes. “We’ll soon find that fraudulent calls are really, really on the rise.”