UK fines tech bosses £70,000 for letting kids see knife content

Ryan Brothwell
Key Points

  • The Crime and Policing Act 2026, in law from 29 April, lets UK police issue Content Removal Notices to platforms and named executives, with 48 hours to take down illegal knife content.
  • Combined fines of up to £70,000 per offence apply: £60,000 to the company, £10,000 to the designated executive, with a separate £60,000 penalty for failing to nominate an executive.
  • Scope covers social media platforms, online marketplaces and search services, including Meta, TikTok, X, YouTube, Snapchat, Reddit, Google, Amazon and eBay.
  • The regime is the first UK law making individual platform executives personally liable for content offences, going beyond the Online Safety Act's corporate enforcement model.
  • The measure forms part of Ronan's Law, named after Ronan Kanda, the 16-year-old killed in Wolverhampton in 2022 with a ninja sword bought online without ID checks.

Tech executives at Meta, TikTok, X, YouTube and other major platforms face personal fines and criminal liability if they fail to remove illegal knife content within 48 hours of being notified, under powers that became law on Wednesday (29 April).

The new regime, contained in the Crime and Policing Act 2026, allows police to issue Content Removal Notices to online platforms and to a named executive at each company. A notice requires takedown within 48 hours of any content marketing prohibited weapons such as zombie knives, ninja swords or knuckledusters, or any content that markets a knife as suitable for violence or is likely to encourage its use as a weapon.

Failure to comply triggers Civil Penalty Notices of up to £60,000 against the company and up to £10,000 against the designated executive per offence, a combined maximum of £70,000 for each piece of content left up.

Platforms that refuse to nominate a senior executive when requested face a separate £60,000 fine.

The measure represents a significant expansion of UK platform liability. The Online Safety Act 2023 placed duties of care on platforms but routed enforcement through Ofcom against the corporate entity, not individual executives.

The Crime and Policing Act creates the first UK regime making a named individual at each platform personally liable for content-related offences. The Home Office consultation response showed that 74 respondents broadly supported the proposals.

For UK parents, the practical change is a faster takedown route for knife content their children see. Police can act on individual posts rather than waiting for systemic regulatory intervention, with the 48-hour clock applying to specific items of content rather than overall platform safety standards.

The scope covers social media platforms, online marketplaces and search services, capturing Meta, TikTok, X, YouTube, Snapchat, Reddit, Google Search, Amazon and eBay alongside smaller platforms.

The legislation responds directly to evidence that knife content drives real-world violence.

Patrick Green, Chief Executive of the Ben Kinsella Trust and member of the government’s Coalition to Tackle Knife Crime, said that the portrayal of knife crime on social media has hindered efforts to reduce it by normalising and glamorising violence to young people, and that companies have repeatedly failed to address it.

The measure forms part of “Ronan’s Law,” named after 16-year-old Ronan Kanda who was killed in Wolverhampton in 2022 with a ninja sword bought online by a teenager who used his mother’s ID and had previously bought more than 20 knives via online platforms.

Crime and policing minister Diana Johnson said when the executive liability measure was announced in April 2025 that the kind of content young people scroll through online every day is sickening and that the government would not accept the argument that restricting access is too difficult.

The minister pointed to a particular concern about content reaching young boys, echoing wider government and Ofcom focus on algorithmic amplification of violent material to teenage male audiences.

The Act builds on Ofcom’s recently published children’s safety codes under the Online Safety Act, which require platforms to suppress content promoting violence in feeds served to under-18s.

The Crime and Policing Act sits alongside that framework rather than replacing it, with police able to act on specific content while Ofcom continues to enforce systemic duties.

What this means for UK families and platforms

Parents and guardians who report knife content to a platform now have a backstop if the platform fails to act.

Where content meets the threshold of marketing prohibited weapons or encouraging violent use of a knife, police can be asked to consider issuing a Content Removal Notice, with the platform and a named executive on the hook financially if takedown does not happen within 48 hours.

For UK-facing platforms, each company in scope must designate an executive with responsibility for compliance, and that individual is personally exposed to fines whenever content slips through moderation systems and is not removed within the statutory window.

Platforms with limited UK staffing presence will need to identify an accountable executive or risk separate £60,000 fines for failing to nominate one.

The wider question is likely to be enforcement bandwidth.

The Home Office has not published projections on how many Content Removal Notices police are likely to issue, or which force will lead enforcement, and the 48-hour deadline puts a meaningful operational burden on platform trust and safety teams that have been scaled back across the industry since 2023.

Whether the £70,000 ceiling is sufficient to alter the behaviour of trillion-dollar companies, or whether police will treat the new powers as a backstop to wider Online Safety Act enforcement, will determine whether the law materially changes what UK children see online.
