The recent U.S. jury verdicts against Meta have intensified a debate that is no longer confined to Silicon Valley. On March 24, 2026, a New Mexico jury found that Meta had knowingly harmed children’s mental health and concealed what it knew about child sexual exploitation on its platforms. One day later, a California jury held Meta and YouTube liable in a landmark addiction case, after a young plaintiff argued that the services had been designed to hook children. Taken together, the rulings strengthen a powerful new idea: the danger may lie not only in harmful posts, but in the architecture of the platforms themselves. (apnews.com)
Governments are responding in strikingly different ways, but the direction is the same: stronger intervention. In Australia, children under 16 have been banned from holding social media accounts since December 2025, with the compliance burden placed on platforms rather than on families. Indonesia went further in March 2026, phasing in a rule that bars children under 16 from holding accounts on high-risk platforms including YouTube, TikTok, Facebook, Instagram, Threads, X, Bigo Live, and Roblox. France had already adopted a “digital majority” at 15, requiring social networks operating there to refuse registration to under-15s unless a parent authorizes it, while also mandating age checks and time-control tools. (pm.gov.au)
In Europe, the trend is not just age limits, but design regulation. The European Commission opened formal proceedings against Meta under the Digital Services Act over concerns that Facebook and Instagram’s systems may encourage behavioural addiction in minors and that Meta’s age-assurance methods may be inadequate. In July 2025, the Commission also issued guidance recommending private-by-default accounts for minors, safer recommender systems, and the default disabling of features such as autoplay, streaks, read receipts, and push notifications. In the UK, Ofcom has taken a similar approach, pressing platforms to use robust age checks and to reconfigure algorithms so harmful material is filtered or downgraded for children. The era of “trust us, we’re trying” is ending; child safety is becoming a matter of enforceable design, legal liability, and national policy. (digital-strategy.ec.europa.eu)