How far should governments go in controlling children’s social media use? In the United States, the sharpest battle is now happening in court, not in Congress. In Los Angeles, jurors have heard closing arguments in a landmark trial over whether social media companies can be held responsible for harm to child users. Meta and Google’s YouTube are the remaining defendants after TikTok and Snap settled, and Meta’s Instagram chief, Adam Mosseri, testified that he does not believe social media addiction is a clinical reality. The case turns a familiar family argument into a historic legal one: when does clever app design become a dangerous product? (apnews.com)
Elsewhere, lawmakers are moving faster than judges. Australia’s rules, in force since December 10, 2025, require major platforms such as Facebook, Instagram, TikTok, Snapchat, X and YouTube to take “reasonable steps” to stop under-16s from holding accounts, with penalties of up to A$49.5 million. At the same time, the government has said platforms cannot make government-issued ID the sole method of age verification, a sign of how privacy concerns have become part of the debate. Malaysia plans to ban social media accounts for people under 16 from 2026, and Indonesia announced in March 2026 that children under 16 will be barred from accounts on high-risk platforms. In Europe, the political mood is also shifting: the European Parliament has called for a harmonised digital minimum age of 16, while still allowing access for 13- to 16-year-olds with parental consent. (esafety.gov.au)
And yet, age limits alone may not solve the problem. UNICEF has warned that blanket bans can backfire if children simply migrate to less regulated spaces, and the OECD found in 2025 that only two of 50 online services it studied systematically checked age at account creation. That finding reveals the real dilemma: society wants to protect children from bullying, exploitation and manipulative design, but enforcement remains technically difficult and politically sensitive. The strongest answer may be neither total freedom nor total prohibition, but a tougher mix of age limits, privacy-respecting age checks, safer product design and clearer accountability for the companies that profit from young users’ attention. (unicef.org)