This Week: Instagram Exposed; Bans Fail; Australia Closes Millions

Happy Thursday.

This week, Mark Zuckerberg was in court — and the internal documents that came out tell you more about how Instagram actually works than anything Meta has ever published voluntarily.

You'll also find out what 640 teenagers' phones actually recorded during school hours (it's not what schools think), and where Australia's under-16 ban stands two months in.

Five minutes. Everything you need to know.

—Heidi

Share Wired Parents with a parent who'd want to know this. Subscribe here

COUNTRY TRACKER

Our weekly round-up of what's happening around the world as more countries move on children's social media restrictions. This week, the momentum is in Europe, where governments are pushing to raise the minimum age for social media.

🇩🇪 GERMANY — CDU ADOPTS AGE 14 SOCIAL MEDIA BAN

German Chancellor Friedrich Merz's conservative CDU party adopted a motion at its annual convention on Saturday calling for a legal minimum age of 14 for social media use, with additional protections for users up to 16. The move follows Merz warning that "the influence of algorithms, artificial intelligence, as well as targeted and controlled influence" was previously underestimated. The proposal mirrors calls from the CDU's coalition partner, the SPD. Europe's largest economy joining the ban movement adds significant momentum to European regulatory efforts, particularly as the country pushes for EU-wide harmonised standards rather than fragmented national approaches.

Find out more over at Wired-Parents.com

First a word from our sponsor:

Wake up to better business news

Some business news reads like a lullaby.

Morning Brew is the opposite.

A free daily newsletter that breaks down what’s happening in business and culture — clearly, quickly, and with enough personality to keep things interesting.

Each morning brings a sharp, easy-to-read rundown of what matters, why it matters, and what it means to you. Plus, there are daily brain games everyone's playing.

Business news, minus the snooze. Read by over 4 million people every morning.

NEED TO KNOW

What The Meta Trials Reveal About How Instagram Actually Works

Two trials currently underway in Los Angeles and New Mexico have exposed internal Meta documents that show how the company makes decisions about children's safety on Instagram. The picture that's emerged is more revealing than anything a regulatory investigation has produced.

What the documents show

Meta knew it had approximately 4 million Instagram users under age 13 in the United States despite the platform's stated policy requiring users to be 13 or older. The company didn't require users to enter their birthdates when creating accounts until late 2019, citing "privacy concerns" in internal communications.

One case involves KGM, who started using Instagram at age 9 and spent more than 16 hours daily on the platform. Internal documents revealed that Margaret Stewart, then Meta's head of product design, warned in 2019 against introducing plastic surgery filters because they would harm teenage girls' body image. She was overruled.

The New Mexico trial centres on an undercover investigation where state officials posed as children on Instagram and documented the sexual solicitations they received. Internal Meta documents show the company's systems detected approximately 500,000 child exploitation cases daily.

The encryption question

Meta's shift to end-to-end encryption across its platforms will eliminate the company's ability to detect and report child sexual abuse material in messages. Internal documents estimate this will result in Meta losing the ability to report 7.5 million cases of child sexual abuse material annually to the National Center for Missing and Exploited Children.

An internal employee message from December 14, 2023, acknowledged: "There goes our CSER numbers next year," referring to Child Safety Escalation Reporting. A 2019 internal communication stated: "We will never find all of the potential harm we do today on Messenger when our security systems can see the messages themselves."

The addiction question

The trials have also exposed the careful language platforms use around addiction. Instagram head Adam Mosseri testified last week using the term "problematic use" rather than "addiction," describing it as "when someone's spending more time on Instagram than they feel good about." However, internal Meta documents show a senior data scientist with a PhD in neuroscience writing that "some of our users are addicted to our products" and noting that features like intermittent rewards "become especially hard to extinguish." The legal question is whether platforms can be held liable for addiction when social media addiction isn't yet classified in the DSM-5. That gap may say more about lag than about the science: gambling disorder took 33 years to receive its classification.

Why this matters

These trials are providing the documentation that regulatory investigations typically struggle to obtain. The internal communications show decision-making processes where safety concerns were weighed against business priorities, and in multiple documented cases, business priorities prevailed.

What this means for parents

The Meta documents show internal decision-making favoured business priorities over child safety. If you're deciding whether to allow Instagram:

  • Delay: These trials show the platform knew about problems and chose not to fix them. That's evidence for waiting.

  • Manage: If your teen already uses it, the addiction research (their own scientist confirmed it) suggests checking actual usage patterns, not just asking "how much do you use it?"

  • Allow with conditions: Private accounts only, following only people they know in real life, and regular check-ins about what they're seeing.

  • Refuse: The 4 million under-13 users weren't an accident. Meta chose not to verify ages for years.

NEED TO KNOW

Teens Use Phones For 70 Minutes Daily

New research has quantified the gap between school phone policies and actual student behaviour. A study published in the Journal of the American Medical Association found that American teenagers spend an average of 70 minutes daily on their phones during school hours, despite 99.7% of schools having policies restricting phone use.

The study tracked 640 adolescents aged 13-18 using passive monitoring rather than self-reporting. Usage broke down to approximately 30 minutes on social media platforms including Instagram, TikTok and Snapchat, 15 minutes on YouTube, and 15 minutes on gaming.

Older teenagers aged 16-18 and students from lower-income backgrounds showed higher usage rates. The research found that 70 minutes represents roughly 20% of instructional time, equivalent to one full class period lost daily to phone use.

Ireland calls for primary school bans

The enforcement gap is prompting calls for stronger action beyond the US. In Ireland, parents and experts are urging smartphone bans in primary schools, pointing to positive effects of phone-free environments observed elsewhere. Advocates argue for unified rules across families and schools to reduce pressure on individual parents to enforce restrictions their children's peers aren't following.

At least 32 US states plus the District of Columbia now require school districts to ban or restrict phones, with 71.3% of American adults supporting such measures according to recent polling.

The research underscores a consistent pattern: policies exist, but enforcement remains weak. Schools report difficulty monitoring compliance, students find workarounds, and the result is rules that exist on paper but have limited practical effect on daily behaviour.

What this means for parents

Schools have policies but can't enforce them. So if your school says "no phones":

  • Talk to teachers: Ask how they actually handle it. 70 minutes daily means it's happening.

  • Home rules matter more: School policies won't protect your child if they're unrestricted at home

  • Consider: If enforcement doesn't work at school, is giving them a smartphone for "emergencies" worth the 70 minutes daily trade-off?

NEED TO KNOW

Australia’s Social Media Ban: Two Months In

Australia's under-16 social media ban has reached its two-month mark since implementation on December 10, 2025. The data reveals both what's working and what isn't.

Platforms have closed 4.7 million accounts since the ban took effect. Meta reported closing 550,000 accounts, while Snapchat removed 415,000. Research suggests Australian teenagers aged 10-16 maintained approximately two accounts each across different platforms, explaining the high closure numbers.

Alternative platforms saw initial surges that didn't last. ByteDance's Lemon8 and photo-sharing app Yope both gained users as the ban approached, but the eSafety Commissioner noted "a spike in downloads but not a spike in usage," suggesting teenagers downloaded alternatives but didn't commit to using them long-term.

Legal challenges are proceeding. The Digital Freedom Project and Reddit have filed challenges in the High Court arguing the ban violates constitutional protections. Snap CEO Evan Spiegel wrote in the Financial Times on February 18 that age verification technology is "highly imperfect" and warned the ban risks pushing teenagers to less regulated platforms without proper safety protections.

The implementation question

The government claims the closure numbers demonstrate the ban is working. Platforms counter that account closures don't prove the ban is reducing harm, only that it's creating enforcement activity. The question parents are asking is whether removing access to mainstream platforms with established safety teams and reporting systems actually makes teenagers safer, or simply pushes them to platforms with fewer protections and less oversight.

Early research tracking Australian teenagers' mental health and behaviour over at least two years will provide data other countries are watching closely.

What Australia teaches other parents

  • Bans create workarounds (VPNs, alternative apps)

  • Account closures ≠ reduced harm

  • If your strategy relies on platforms enforcing restrictions, Australia shows that's optimistic

WORTH KNOWING

UK fines Reddit £14.5 million for failing to protect children

Britain's Information Commissioner's Office has fined Reddit £14.5 million for unlawfully processing children's personal data and failing to implement adequate age verification, allowing users under 13 to access potentially harmful content. Reddit plans to appeal. The fine is part of escalating UK enforcement under the Online Safety Act. Separately, the UK's National Crime Agency called this week for a "whole-system approach" to technology-enabled abuse, urging regulators to make fuller use of their powers to tackle online grooming and exploitation. The agency said tech companies must be held to account and that policing alone isn't enough without robust platform safeguards. The coordinated pressure shows the UK moving from regulatory warnings to enforcement backed by substantial financial penalties.

Police reveal 1,500 Roblox-related crimes in England and Wales

A Times investigation found over 1,500 crimes recorded in England and Wales from 2020-24 involved Roblox, including serious sexual offences against children under 13. Predators use in-game currency and chat features to groom and coerce minors. The platform has 80 million daily active users, with half under age 13. The findings raise questions about whether current platform protections are sufficient given the scale of predatory behaviour documented by police.

California governor backs under-16 social media ban

California Governor Gavin Newsom backed legislation to restrict social media use for under-16s on Thursday, calling it "long overdue." He recalled confronting his daughter and her friends at a birthday party: "I literally stopped everybody because there were seven of them together on their cellphone at the birthday party, not one of them talking to each other." Elon Musk publicly criticised the proposal, highlighting growing political division over youth tech regulation in the US. Unlike the bipartisan approaches in Europe and Australia, American proposals are increasingly splitting along political lines, though California's size and its position as home to Silicon Valley make Newsom's backing particularly significant.

📩 From the inbox

"My 15-year-old asked for a VPN 'for privacy.' I said yes because it sounded responsible, but now I'm reading the UK wants to limit children's VPN access. Did I just help him bypass the rules I'm trying to enforce?"

Possibly, yes. VPNs mask location and can bypass age restrictions, school filters and parental controls, which is exactly why the UK announcement included them. The problem is VPNs also have legitimate privacy uses, so this isn't straightforward. If your son is using it to watch region-locked content or avoid tracking, that's different from using it to access platforms you've said no to. Worth a direct conversation about what he's actually using it for and whether you're comfortable with that. If the answer makes you uneasy, most routers let you block VPN traffic entirely.

Got something you've been wondering about? Hit reply.

As always, please get in touch with us at [email protected] with any feedback, thoughts, suggestions.

Know a parent who would find this useful?

Forward this email

Or share this link: https://wired-parents.com


📚 NEW TO WIRED PARENTS?

Get the free 103-page Age-by-Age Tech Guide: See what parents worldwide are deciding about phones, social media, screen time and gaming at ages 8-17.

Download Free Guide →

Worth a Read: A free weekly newsletter for parents of athletes. Real tools. Clear strategies. No guesswork. Proven ways to lead your athlete with confidence through the ups, downs, and in-betweens.

The Game Plan


Stay ahead. Every Thursday.

Were you forwarded this email? Sign up here
