Criminals are using AI technology to clone victims’ voices and impersonate them to their banks, setting up direct debits over the phone.
National Trading Standards warned that this new wave of AI-assisted fraud specifically targets older people and is aimed at deceiving legitimate banks and businesses by impersonating the victim.
The scammer begins by calling the victim and conducting a ‘lifestyle survey’, during which they gather personal, health, and financial information.
They also record the call and feed the sound file through an AI tool that clones the victim’s voice, allowing the scammer to impersonate them over the phone.
The criminal then contacts the victim’s bank or other businesses they transact with and uses the cloned voice to simulate consent for direct debits.
This allows them to deceive legitimate banks into setting up payments from the victim’s account without the victim’s knowledge. In many cases, victims do not realise that payments are being taken from their account.
“What we’re seeing is a deeply disturbing combination of old and new: traditional phone scams supported by disturbing new techniques,” said Louise Baxter, Head of the National Trading Standards Scams Team.
“Criminals are using AI not just to deceive victims, but to trick legitimate systems into processing fraudulent payments.”
“This is no longer just a nuisance – it’s a coordinated, sophisticated operation targeting some of the most situationally vulnerable consumers in society,” Baxter said.
In its analysis of scam calls conducted last year, NTS found that scammers were also using AI avatar software to mask their original Indian accents with AI-modulated British accents to better deceive victims.
A long-running operation by National Trading Standards has blocked nearly 21 million scam phone calls and shut down 2,000 numbers in the last six months, disrupting scammers who cold called UK consumers and coerced them into providing financial and personal details.