Attackers are using artificial intelligence (AI) to enhance conversational scams, such as the so-called “pig butchering” social engineering scam, delivered via mobile devices. Rather than reusing the same attack with the same image, AI lets scammers quickly create thousands of attacks that look different for each target, making it harder for victims to spot the scam.
“It’s really a big numbers game; the attackers want to send those initial messages to as many people as possible in hopes of getting a response,” said Stuart Jones, director of the Cloudmark division at Proofpoint, who blogged about the issue on April 18. “Once they get a response, they use AI to maintain that realism with the target that responded. They can change clothes or backgrounds over the course of a defined campaign, and they now have the ability to run thousands of different-looking attacks.”
Proofpoint researchers also pointed out that as the technology advances, AI bots trained to understand complex tax codes and investment tools could, for example, be used to scam even the most sophisticated victims.
Jones said the vast majority of these conversational attacks target mobile devices, primarily smartphones. He said Proofpoint had seen a twelvefold increase in these attacks over the past year, noting that the company observed about 500,000 of them over a period of a month or two.
Many of the attacks revolve around romance, and some target job seekers. The attackers try to lure victims onto alternative platforms such as WhatsApp and Facebook Messenger to carry out the final transaction. The preferred payment method is Bitcoin.
Enterprise security professionals need to remain vigilant: although the pandemic has receded, most organizations still operate in hybrid mode, and the line between work and home has blurred noticeably as people alternate between their personal and work devices.
“We’ve seen a business-oriented variant of this abuse in which a threat actor, pretending to be a manager, supervisor, or executive at a company, starts a conversation and, posing as that co-worker, tries to convince the person to give up private information or money,” Jones said.
The term “pig butchering” refers to a family of related scams in which the victim “is fattened up and then slaughtered,” explained Mike Parkin, senior technical engineer at Vulcan Cyber. Parkin said the name derives from a Chinese term used in the field, but the concept is nothing new. Today, he said, these scams are tied to the crypto market, with cryptocurrency serving as cybercrime’s currency of choice.
“These scams target individuals and follow a different business model than the better-known ransomware and extortion schemes that target organizations,” Parkin said. “Ultimately, a stalled crypto market will be barely an inconvenience to these scammers. They may have to change some details and fall back on older techniques to fleece their victims, but that won’t stop them. Criminals were active long before cryptocurrency existed, and they will continue to operate even if crypto is banned.”
Krishna Vishnubhotla, Zimperium’s vice president of product strategy, added that conversational fraud is difficult to prevent because the victim has no way of verifying the other party’s trustworthiness. When a scammer gives the victim a wallet address, wallet providers offer no way for the victim to check the person’s reputation, location, or other details to confirm the story they are being sold, Vishnubhotla said.
“AI tools will reduce the operational costs of running these scams to almost nothing,” Vishnubhotla said. “It’s not just about responding like a human and generating photos. The tools can give conversations a distinct personality tailored to the victim’s demographic and socioeconomic background. When these technologies are misused, the result is personalized fraud at scale. And they will only get better as the models improve. We are already seeing these scams show up as seemingly legitimate profiles on LinkedIn, Instagram, and TikTok.”