AI deepfakes hit UK execs as fraud losses reach £5.5 billion
Key Points
- UK reported fraud surged 900% to £5.5 billion in 2025, driven by five high-value cases including a £4.8 billion crypto money laundering scheme, per BDO's FraudTrack report.
- BDO flags AI-generated deepfakes as the next major UK fraud threat, citing executive impersonation and authorisation fraud as central concerns.
- 1 in 14 UK adults was a fraud victim in the latest Crime Survey of England and Wales, with 1 in 4 UK businesses affected.
- The UK Online Crime Centre, launched April 2026, brings together police, banks, mobile networks and technology firms under a £250 million three-year fraud strategy.
- London and the South East accounted for over 90% of reported fraud value in 2025, with Yorkshire rising to second place at nearly £200 million.
UK reported fraud and economic crime surged to £5.5 billion in 2025 as accountancy firm BDO flagged AI-generated deepfakes targeting senior executives as the next major threat to British businesses.
The figure marks a 900% rise on the £550 million recorded in 2024, though the jump comes largely from five exceptionally high-value cases, including a single £4.8 billion cryptocurrency money laundering scheme.
Money laundering accounted for £4.9 billion of the 2025 total, fraud perpetrated by individuals contributed a further £207 million, and management fraud rose from £11 million to £195 million.
Together these three categories accounted for 97% of total reported fraud value across the period from 1 December 2024 to 30 November 2025.
Stephen Peters, BDO Partner and Head of Investigations, said fraud is becoming more technologically enabled, more adaptive and more resilient.
He called for modernised enforcement tools and cross-sector collaboration, warning that without sustained reform, the financial and societal costs will continue to rise.
Deepfake threat to UK businesses
The report identifies AI-generated audio, video, images and text that closely mimic real individuals as a key emerging risk for UK organisations. Criminals deploy these tools for executive impersonation, authorisation fraud and misleading communications, posing convincingly as chief executives or finance directors to authorise fraudulent payments or instruct staff.
UK engineering group Arup lost around $25 million in 2024 after a finance worker in its Hong Kong office attended a video call populated by deepfaked versions of the chief financial officer and other senior staff, and authorised the transfer.
The same playbook now scales across smaller UK businesses, where finance teams field convincing video and voice impersonations of their own leadership.
The finance and insurance sector recorded £171 million in reported losses for 2025, public administration reported £138 million, and the professional, scientific and technical services sector logged £68 million.
Individuals remained the largest victim group at £4.9 billion in reported losses, reflecting the scale of the single crypto money laundering case.
1 in 14 adults now victims of fraud
The latest Crime Survey of England and Wales found 1 in 14 UK adults had been a victim of fraud, while the Economic Crime Survey reported that 1 in 4 UK businesses had been affected.
Criminals increasingly use deepfake voice clones in authorised push payment scams, impersonating a relative, bank official or company contact to direct victims to transfer funds.
The same techniques targeting corporate finance teams now scale down to individuals via WhatsApp, phone calls and social media.
Banks have already warned that voice cloning technology can replicate a person’s speech from a few seconds of audio captured from a video call, voicemail or social media clip.
The BDO findings follow the government’s £250 million three-year strategy to combat fraud, anchored by the new Online Crime Centre that launched in April 2026.
The disruption unit combines specialists from government, police, intelligence agencies, banks, mobile networks and major technology firms under a single remit to tackle fraud collaboratively.
The Economic Crime and Corporate Transparency Act 2023 also introduced a failure to prevent fraud offence, which took effect in September 2025 and makes large organisations criminally liable where employees commit fraud and the organisation lacks reasonable prevention procedures.
Peters argued that legal and regulatory frameworks must evolve at pace if they are to remain effective against AI assisted fraud.
The Online Crime Centre’s coordination role marks one response, but BDO’s findings suggest the underlying threat is accelerating faster than enforcement capacity.
For UK businesses, the data shows that authentication of voice and video communications can no longer rely on recognition alone.
Finance teams, payment authorisers and corporate boards are introducing callback protocols, multi-factor approval steps and out-of-band verification into payment authorisation processes, with banks rolling out additional consumer warnings on impersonation scams.
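The controls described above follow a common pattern: a payment is released only after an out-of-band callback check and a sufficient number of independent approvals. A minimal Python sketch of that workflow is below; the class names, the two-approver threshold and the one-time callback code are illustrative assumptions, not taken from the BDO report or any specific bank's procedures.

```python
import hmac
import secrets

# Hypothetical threshold: payments above this need two approvers.
APPROVAL_THRESHOLD = 10_000


class PaymentRequest:
    """Illustrative payment request requiring out-of-band verification
    and multi-party approval before release."""

    def __init__(self, amount, payee, requested_by):
        self.amount = amount
        self.payee = payee
        self.requested_by = requested_by
        self.approvals = set()
        self.callback_verified = False
        # One-time code the approver expects to hear on a callback made
        # to a known-good number (never a number supplied in the request).
        self.callback_code = secrets.token_hex(4)

    def verify_callback(self, code_heard_on_call):
        # Out-of-band check: compare the code read out on the separately
        # dialled call against the code generated with this request.
        if hmac.compare_digest(code_heard_on_call, self.callback_code):
            self.callback_verified = True
        return self.callback_verified

    def approve(self, approver):
        # Separation of duties: the requester cannot approve their own payment.
        if approver == self.requested_by:
            raise ValueError("requester cannot approve their own payment")
        self.approvals.add(approver)

    def can_release(self):
        # Release requires the callback check plus enough distinct approvers.
        if not self.callback_verified:
            return False
        needed = 2 if self.amount > APPROVAL_THRESHOLD else 1
        return len(self.approvals) >= needed
```

The key design point is that a convincing deepfake on the original call defeats recognition but not this workflow: the fraudster cannot intercept the callback to a number already on file, and cannot supply the second, independent approver.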