Stock Markets March 11, 2026

UK Regulators Demand Stronger Age Checks from Major Social Platforms

Ofcom and the ICO give Meta, TikTok, Snap, YouTube and others a deadline to tighten protections for children

By Derek Hwang

Britain’s communications and data-protection authorities have told leading social media companies to strengthen age verification, restrict contact from strangers, and limit algorithmic feeds that expose children to harmful or addictive content. Firms have until April 30 to outline steps under the Online Safety Act’s implementation phase, and face the prospect of significant fines if they fail to act.

Key Points

  • Ofcom and the ICO have pressed Facebook, Instagram, Roblox, Snapchat, TikTok and YouTube to tighten age verification, limit stranger contact with children and make algorithmic feeds safer.
  • Platforms must provide implementation plans by April 30 under the Online Safety Act, with Ofcom able to fine up to 10% of qualifying global revenue and the ICO up to 4% of global annual turnover.
  • Actions affect social media and technology sectors, with implications for platform compliance costs, product development practices and data-processing operations.

Britain’s media and privacy regulators have escalated demands on major social networks to do more to prevent children from accessing their services and from being exposed to potentially harmful content. Ofcom and the Information Commissioner’s Office (ICO) said they are increasingly alarmed that algorithm-driven feeds can push damaging or addictive material to minors and that platforms are not adequately enforcing their own minimum age rules.

In a coordinated move under the latest implementation stage of the Online Safety Act, Ofcom instructed Facebook and Instagram - both owned by Meta - as well as Roblox, Snapchat, TikTok (owned by ByteDance) and YouTube (part of Alphabet) to present firm plans by April 30. The regulator specified several areas for action: stronger age checks, tighter controls to prevent strangers contacting children, safer content feeds and a halt to testing new products on minors.

Ofcom’s chief executive, Melanie Dawes, was quoted directly on the issue, saying: "These online services are household names, but they’re failing to put children’s safety at the heart of their products. That must now change quickly, or Ofcom will act." The watchdog can impose penalties of up to 10% of a company’s qualifying global revenue if it finds breaches of the rules.

Separately, the ICO issued an open letter to the same set of platforms urging adoption of "modern, viable" age-assurance technologies to prevent children under 13 from accessing services not designed for them. Paul Arnold, the ICO’s chief executive, commented: "There’s now modern technology at your fingertips, so there is no excuse." The ICO has the power to levy fines amounting to as much as 4% of a company’s global annual turnover.

The action by British regulators arrives amid wider policy debate in the UK about restricting children’s use of social media. The government has been considering tougher measures, including a potential ban on under-16s using such platforms - a proposal that echoes policy steps taken elsewhere.

The privacy watchdog recently demonstrated its willingness to enforce age-checking rules when it fined Reddit nearly 14.5 million pounds (about $19.5 million, at $1 = 0.7439 pounds) for failing to implement meaningful age verification and for processing children’s data unlawfully.


Context and next steps

Platforms targeted by Ofcom and the ICO must submit plans by April 30 explaining how they will meet the specified protections. Regulators have outlined clear enforcement tools and maximum penalty levels to be applied if companies do not comply.

Risks

  • Regulators may impose substantial fines - up to 10% of qualifying global revenue from Ofcom and up to 4% of global annual turnover from the ICO - on platforms that fail to meet requirements, presenting financial and compliance risk to social media companies.
  • Continued exposure of minors to harmful or addictive content via algorithmic feeds represents a reputational and legal risk for platform operators, particularly those that do not strengthen age checks or restrict contact from strangers.
  • Potential legislative moves to bar under-16s from social media could create operational and market-access uncertainty for platforms operating in the UK.
