This Week: WhatsApp Under-13s; UK Names Six Platforms; Trial Verdict Looms

Happy Thursday everyone.

I watched Louis Theroux’s manosphere documentary on Netflix this week and by the end I felt genuinely uncomfortable. Not because of what I learned, but because of what was missing.

The manosphere is the loose network of influencers, podcasters and content creators who promote hyper-masculine, often misogynistic content to millions of young men online.

The documentary shows how these men operate, up close, and it does that well. The get-rich-quick logic of social media runs through every frame: be more controversial, get more views, make more money. It is a business model built on provocation and it is clearly working.

But the trouble is, it just ends. There is no conversation about what parents can do. There is almost nothing from women about how they are being affected by this. The women you do see are mostly the girlfriends and partners of these men, and for the most part they say they are happy with their lot. "What do you like most about men? Money." There is nothing about what to tell your daughter when she encounters boys who think like this. I did, however, mentally high-five one of the girlfriends who, you learn in the closing credits, left her manosphere boyfriend.

It is worth watching if you have sons or daughters. Not for answers, because it does not give you any, but because you need to see what is competing for your sons' attention and what your daughters will encounter from boys who consume this content. In a world where likes and subscribers count more than ever, the worse a human being you are, the more successful these guys feel. "Losers work for other people" is not a punchline to them. It is a recruitment line, and it is aimed at teenage boys.

I am working on a guide to the platforms and channels where manosphere content actually reaches boys, what to look for, and what you can do about it. Keep an eye out for that.

OK, on that note, coffee. Time to see what else has been happening.

Catch up with you next week.

— Heidi

Share Wired Parents with a parent who'd want to know this. Subscribe here

NEED TO KNOW

WhatsApp Just Opened the Door to Under-13s

On 11 March, WhatsApp launched parent-managed accounts for children under 13. For the first time, Meta is officially offering a way for pre-teens to use the platform, with parents controlling who can contact them, which groups they can join, and how their privacy settings are configured.

Until now, WhatsApp's minimum age was 13. In practice, millions of younger children already use it to message family, join school groups, and coordinate after-school plans. This feature acknowledges that reality and tries to make it safer. But it also normalises younger children on a messaging platform, which is a decision worth thinking through before you set one up.

What parent-managed accounts actually do:

  • The account is restricted to messaging and calling only. No Meta AI, no Channels, no location sharing, no disappearing messages.

  • All controls are locked behind a parent PIN. You decide who can contact your child and which groups they can join.

  • Message requests from unknown contacts go to a separate folder that only you can unlock.

  • Images from unknown contacts are blurred by default.

  • Setup requires both your phone and your child's phone side by side, linked via QR code.

What to think about before setting one up:

This is not a decision you need to make this week just because the feature exists. If your child does not yet have WhatsApp, the question is whether they need it now or whether the family can wait. If they already use WhatsApp on an account registered as 13 or older, it is worth considering whether switching to a managed account gives you more control than you currently have.

If you do set one up, three things to do immediately:

  • Review the default contact settings and restrict messaging to known contacts only.

  • Check which groups your child is in and decide whether they all need to stay.

  • Turn on the activity alerts so you are notified when your child adds, blocks or reports a contact, or when a group enables disappearing messages.

The feature is rolling out gradually and may not be available in your region yet.

NEED TO KNOW

UK Regulators Told Six Platforms: You're Not Doing Enough

On 12 March, Ofcom and the ICO wrote directly to Facebook, Instagram, Roblox, Snapchat, TikTok, and YouTube. The message was blunt. Ofcom's chief executive, Dame Melanie Dawes, said these platforms are household names, but they are failing to put children's safety at the heart of their products.

The numbers back that up. Ofcom's own research shows that 72% of children aged 8 to 12 are currently accessing platforms that have a minimum age of 13. Self-declared birthdays are not working. The ICO's open letter said platforms must move beyond self-declaration and use technologies like facial age estimation or digital ID.

The regulators set out four demands: enforce minimum ages with real age checks, implement grooming protections so strangers cannot contact children, make recommendation algorithms safer for younger users, and stop testing AI products on children without safety assessments first.

The platforms have until 30 April to respond. Ofcom will report publicly on their responses in May and has said it will take enforcement action if it is not satisfied.

What this means for you right now: the protections these platforms claim to offer depend on knowing your child's age. If your child's account was set up with a fake birthday, the teen or child safety settings may not be active. Check every platform your child uses and confirm the date of birth is correct. On Instagram, an incorrect birthday means Teen Account protections are off. On TikTok, it means the default 60-minute time limit and Family Pairing restrictions are not applied. On YouTube, restricted mode and supervision features may not be triggered.

NEED TO KNOW

A Jury Is Deciding Whether Social Media Is a Defective Product

In a Los Angeles courtroom, 12 jurors are deliberating on a question that could reshape how every major platform treats children. The case is KGM v Meta and YouTube, the first social media addiction lawsuit to reach a jury.

The plaintiff, a 20-year-old woman referred to as Kaley, says she began using YouTube at age 6 and Instagram at age 9. Her lawyers argue that features like infinite scrolling, autoplay, notification systems and visible like counts were deliberately designed to maximise engagement and that these features caused her depression, anxiety, body dysmorphia and suicidal thoughts as a child.

Meta CEO Mark Zuckerberg testified in person on 18 February, his first time before a jury on this issue. Meta's defence is that Kaley's mental health challenges existed before she used social media. YouTube has argued that it is not a social media platform and that its features are not addictive.

TikTok and Snapchat were originally named as defendants but both settled before the trial began. The terms have not been disclosed.

This is a bellwether trial. Over 1,600 similar cases from families and school districts are pending across the US, and the outcome will influence whether those cases settle or go to trial. Legal analysts have compared it to the tobacco litigation of the 1990s, and if the jury finds Meta and YouTube liable, it could force changes to how platforms design their products for younger users. A verdict could come any day. We'll keep you posted.

WHAT THE WORLD DECIDED

Mexico has launched consultations on restricting children's access to social media, with regulatory proposals expected by June.

New York Governor Hochul proposed new legislation expanding age verification to gaming platforms including Roblox, setting children's accounts to the highest privacy settings by default, and expanding teen mental health first aid training statewide.

The UK's "Growing Up in the Online World" consultation remains open until 26 May. It includes a dedicated survey for parents and carers. If you want your voice in the policy conversation, the form is at gov.uk.

If you want to see what countries around the world are doing when it comes to children and social media, don’t forget to check out our Country Tracker.

LAST WEEK'S POLL

Last week I asked how you feel about governments banning social media for under-16s.

  • 60% said it's the right call

  • The rest were split between "good intention but it won't work", "too heavy-handed", and "still undecided"

  • Interesting to note that nobody felt strongly it's the wrong thing to do

This week's poll question is below. We’ll let you know the results next week.

THE DOWNLOAD

Have you got your copy of The Download? Our free guide to the eight technology decisions parents ask about most, from first phones to Instagram, gaming and AI. If you haven't grabbed it yet, it's here.

NEXT WEEK

Well, that's it for this week.

Next week we're looking at Instagram's parental controls. What Teen Accounts actually do, how to set up supervision, and where the gaps are. If your child is on Instagram, or asking to be, that one's for you.

I'm also keeping a close eye on the KGM verdict in Los Angeles. If it lands before next Thursday, expect a special update.

Until next Thursday!

Know a parent figuring out phones and social media?

SEND TO A FRIEND

Or send them The Download — our free guide to the eight technology conversations every parent will have. It's the quickest way to get up to speed.


📚 New here?

The Download is our free guide to the eight technology conversations every parent will have — from first phones to Instagram, gaming and AI.

Get The Download →

Stay ahead. Every Thursday.

Were you forwarded this email? Sign up here
