Grandparent Scams Go High-Tech With AI Voices and Deepfakes

“Grandparent” scams are getting a dangerous new makeover with AI-generated voices. One of the most profitable scams just became even more lucrative.

You pick up the phone and hear a relative in distress. In a panicked, hurried voice, they tell you they were arrested, robbed, taken to jail, or fell seriously ill. They ask for money, give an urgent reason why, and beg you not to tell anyone because they are ashamed or don’t want to alarm others. They add a few believable personal details scraped from social media, and may even put someone else on the line, like a doctor or a police officer, to explain the situation. You are puzzled, but the urgency and drama override your logic and make you send them money. This is a grandparent scam, also known as an impersonation scam, and it disproportionately affects elderly people.

$9,000 is the median cash amount sent to perpetrators of a grandparent scam by victims aged 70 and over. Considering that many people do not report this type of scam out of shame, the real numbers are likely higher. The Federal Trade Commission reported $2.6 billion in losses from imposter scams in 2022, with 36,000 victims.

The rise of artificial intelligence has made grandparent scams significantly worse. Victims now hear convincing imitations of their loved ones’ voices asking for help, making it easier than ever to lose life savings. Current technology requires only a few audible sentences, which can easily be taken from social media, and the tools are simple to use and readily available online.

AI tools analyze a sample of a voice, search a database for a similar one, and re-create an almost identical synthetic voice. Lyrebird, a voice cloning start-up, requires only a minute of recording to make a clone. The cloned voice and a few personal details taken from social media are enough to dupe a victim, and any imperfections in the AI-generated voice can be explained away as a bad connection or background noise. The fraudsters can be located anywhere in the world, which makes them almost impossible to identify and trace.
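To give investigators a sense of how low the barrier to entry has become, here is a minimal sketch using Coqui TTS, an open-source voice-cloning toolkit (not one of the commercial tools named in this article). The file names are placeholders; the point is that a few seconds of reference audio and a handful of lines of code are all that is needed.

```python
# pip install TTS   (Coqui TTS, an open-source text-to-speech toolkit)
import torch
from TTS.api import TTS

# Load the multilingual XTTS v2 voice-cloning model
# (weights are downloaded on first run).
device = "cuda" if torch.cuda.is_available() else "cpu"
tts = TTS("tts_models/multilingual/multi-dataset/xtts_v2").to(device)

# Generate speech that imitates the voice in a short reference recording.
tts.tts_to_file(
    text="Hi, it's me. Something happened and I need your help.",
    speaker_wav="reference_clip.wav",  # placeholder: a few seconds of the target voice
    language="en",
    file_path="cloned_voice.wav",
)
```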

Voice deepfakes have been used in other scams as well. A UK-based energy company lost $243,000 in 2019 after its CEO received a call from someone impersonating the chief executive of the firm’s parent company, who asked him to urgently transfer money to a supplier in another country. The CEO recognized what he believed was his boss’s voice, in reality generated by AI, and transferred the money. Symantec, a cyber security company, reported three more cases where an executive’s voice was impersonated.

In another case, a bank manager received a call from a client asking him to authorize a few large transfers, supposedly needed to pay for the acquisition of a new company. Recognizing the client’s voice, the manager authorized them. He fell victim to an AI voice-cloning fraud and lost $400,000. The fraud ring involved 17 individuals around the globe and required a long, extensive investigation.

An example of a deepfake AI voice used in a scam, as reported by Vice.

ElevenLabs, a synthetic speech start-up, saw its tool used to manipulate the voices of celebrities into making racist, transphobic, homophobic, and violent remarks. The company promises to synthesize a voice from a one-minute recording, and it also sells professional services that help companies add or remove an accent. VALL-E, a Microsoft AI tool, promises to simulate a voice after hearing only three seconds of audio.

The Federal Trade Commission reminds us that people who have lost money in grandparent scams should contact law enforcement as soon as possible and report scam calls on the FTC’s website.

Oxana Korzun

Oxana Korzun is the voice behind the Investigator blog. She is a Certified Fraud Examiner and a professional investigator with more than eight years of experience at companies like Meta, AIG, and Transparency International.
