TL;DR: Enforcement Works; Europe Moves; Nostalgia Strikes.

Australia has just proved that large-scale enforcement is possible: 4.7 million accounts removed in two weeks, nearly double the country's youth population aged 8–15. Four European countries are already moving forward with their own versions. Meanwhile, everyone's posting photos from 2016, nostalgic for when social media felt fun and unengineered.

Here’s what you need to know.

Enjoying Plugged In? Forward this to a parent friend who's also trying to navigate digital parenting decisions. They can subscribe here.

NEED TO KNOW

Australia's Under-16 Social Media Ban Update

So, we finally have official numbers from Australia. The eSafety Commissioner announced Friday that platforms removed access to 4.7 million accounts in the first two weeks of enforcement. That's nearly double Australia's entire youth population aged 8–15. All ten platforms—Facebook, Instagram, Snapchat, Threads, TikTok, Twitch, X, YouTube, Kick and Reddit—reported on time and were deemed compliant. The government didn't provide a platform-by-platform breakdown, though Meta self-reported removing 544,000 accounts across its three platforms.

If you've been watching this as a test case for what's possible, or if you're weighing when to let your child create accounts, these numbers show what can be achieved: large-scale enforcement is technically possible when platforms are legally required to act. The "it's impossible to enforce" argument no longer holds. Denmark has announced similar plans for mid-2026, and France, Malaysia, and Indonesia are all exploring the same approach.

As for the complications? VPN usage from Australia surged 170% the day the ban took effect, and some underage accounts remain active. The eSafety Commissioner acknowledged it's too early to declare full compliance. The 4.7 million is a starting number, not necessarily the final outcome.

Full breakdown: Wired Parents

One Simple Scoop For Better Health

The best healthy habits aren't complicated. AG1 Next Gen helps support gut health and fill common nutrient gaps with one daily scoop. It's one easy routine that fits into real life and keeps your health on track all day long. Start your mornings with AG1 and keep momentum on your side.

Europe Following Australia’s Lead

Australia’s under-16 social media ban is already influencing policy elsewhere. Across Europe, governments are moving in the same direction, but with different ages, mechanisms and levels of parental control.

What’s happening

  • Denmark: Under-15s will need parental consent. A government-run app, MitID Underage, will verify age without requiring platforms to store sensitive data.

  • France: President Macron is pushing for a full under-15 ban, with fines for platforms that fail to comply. The proposal covers social media, messaging apps and similar services.

  • UK: Exploring an under-16 ban. Ofcom already has age-verification powers and the government is closely watching Australia’s rollout.

  • Spain: Under-16s would need parental consent for social media, forums and generative-AI platforms. App stores would also give parents more control over downloads.

At EU level, the European Parliament has backed a non-binding recommendation for a minimum age of 16, parental consent for 13- to 15-year-olds, and restrictions on features such as infinite scroll and autoplay.

Why it matters
The idea that platforms and parents alone can manage risk is losing ground. Governments are stepping in, but not in a coordinated way. Some focus on age bans, others on parental consent or platform design, and each approach will likely have different consequences.

What this means for your decisions

  • Holding the line until 15–16 is increasingly supported by policy, not just parental instinct.

  • If your child already has access, these moves don’t mean you’ve “done it wrong”, but they may validate why the decision felt uncomfortable.

  • Details matter: age verification methods, default settings and enforcement will shape whether these policies help or simply push children elsewhere.

What to watch
Denmark’s MitID Underage rollout, France’s legislative progress and whether bans reduce harm or shift behaviour underground.

Policy moves slowly, but parenting decisions don’t have to.

What parents need to know: Wired Parents

UAE Is Taking A Different Approach

While Australia focused on who can access social media, the UAE has focused on how platforms must behave.

From January 1, 2026, any digital platform accessible to children in the UAE must meet strict safety requirements regardless of whether it markets itself “for kids.”

What’s covered
Social media, gaming, live-streaming, messaging, video platforms and even e-commerce sites. Platforms without a physical UAE presence are still included.

Key requirements

  • Under-13s: No personal data collection without verified parental consent

  • All children:

    • Parental controls and content filters

    • Safe search by default

    • Clear account deletion tools

    • Harmful content removal requests handled within 24 hours

Non-compliant platforms risk fines or service blocking.

Why it matters
This law treats digital safety as a design problem, not just an age problem. Platforms must build in protections rather than relying on parents to configure everything manually.

What this means for your decisions

  • In the UAE: You now have legal backing when demanding safer defaults and usable parental controls.

  • Elsewhere: Watch whether platforms roll out similar features globally or restrict them to regulated markets.

The complications
Enforcement is untested. Platforms may comply unevenly, and features like infinite scroll or autoplay are not automatically disabled.

What to watch
Early enforcement action and whether other countries adopt this “earn access through safety” model.

Read more here: Wired Parents

PLATFORM WATCH

TikTok Adds New Parental Controls — But They’re Off By Default

TikTok now lets parents pause the app entirely, mute notifications during specific hours and set different daily limits for weekdays and weekends. But every feature must be switched on manually.

What’s new

  • Pause: Block TikTok completely during set hours

  • Flexible limits: Different daily screen time limits for weekdays vs weekends

  • Mute notifications: Silence alerts during homework or family time

All of these sit inside Family Pairing, which requires linking your account to your child’s and configuring each setting individually.

The catch

Defaults haven’t changed. Infinite scroll, algorithmic recommendations and constant notifications remain on unless you intervene. TikTok still assumes unlimited access unless parents actively enable controls.

What this means

  • Already using TikTok? You now have more control, particularly with scheduled pauses and flexible limits.

  • Considering allowing TikTok? Expect to configure safety settings yourself — the app won’t do it for you.

  • Comparing platforms? TikTok’s controls are now more granular than many competitors’, but none default to limited, age-appropriate experiences.

The complications

  • Family Pairing requires both parent and child to have accounts

  • Controls don’t apply across other devices or additional accounts

  • The app’s engagement-driven design still encourages repeated use

What to watch

Will Instagram and Snapchat follow with similar tools? And will platforms make parental controls easier to find or keep them buried in settings as regulation increases?

The takeaway

TikTok’s new tools offer more ways to manage access, but they don’t remove the need for an active parental decision.

PERSPECTIVE

Why Everyone Is Posting Photos From 2016 – And Why It Matters

Over the past few weeks, people have been sharing photos from 2016, often with captions about nostalgia, simplicity, or how things felt different back then.

On the surface, it looks like harmless reminiscing, but for many parents, it taps into something deeper.

What people seem to miss is not just a younger version of themselves, but a different digital environment. Social media then was quieter, less optimised, and less aggressive. Feeds felt more social, less engineered.

People posted without worrying whether an image had been filtered, altered, or artificially enhanced. You didn’t need to ask whether an image, a voice, a video, a face, a body, or even a moment had been generated or manipulated. There were no endless layers of filters smoothing, correcting, or upgrading reality.

Social media felt closer to connection than performance. People were not curating identities as carefully, and many were there for just one thing: to have fun.

Since then, platforms have shifted from communication tools to attention systems: infinite scroll, autoplay, algorithmic amplification and constant recommendations are now core features. These changes happened gradually, but they have fundamentally altered how digital spaces behave.

Why this matters
The decisions facing parents today are not the same as those faced ten years ago. Giving a child a phone or social media access now means introducing them to a fully optimised attention economy, not just a neutral tool.

This helps explain why governments are acting now, with regulation increasingly targeting design features, not just content or age.

What this means for you
If you are feeling sentimental about 2016, it may be less about turning back time and more about returning to an era when tech seemed so much easier and… just nicer.

The question is no longer just when to allow access, but what kind of environment you want to allow into your family life.

We can’t redesign platforms, but we can decide whether access is automatic or deliberate, rushed or considered.

Technology decisions shape childhoods. Seeing the environment for what it is, and understanding what access actually lets in, is the first step in making those decisions deliberately.

WORTH KNOWING

Screen Time Limits Aren’t Enough

The American Academy of Pediatrics has updated its guidance, saying screen time limits by themselves no longer protect children’s wellbeing.

After reviewing hundreds of studies across two decades, the AAP now emphasises quality of use over raw time. Passive scrolling, autoplay, and constant notifications are linked to sleep and attention issues. High-quality content can support learning and connection.

Crucially, the guidance shifts responsibility outward. The AAP calls on platforms and policymakers to limit targeted advertising to minors, strengthen privacy protections and improve age verification.

What this means
This isn’t “just set better limits.” It’s an acknowledgment that platform design matters as much as parental rules.

US Lawmakers Debate Banning Children From Social Media

The US Senate Commerce Committee recently held hearings on the Kids Off Social Media Act (KOSMA), which would ban under-13s from platforms and restrict algorithmic recommendations for under-17s.

Privacy advocates, including the Electronic Frontier Foundation, argue the bill could shift control to tech companies by requiring widespread age verification, creating new privacy risks for children and families.

What this means
The US is moving toward age-based restrictions, but there’s no consensus yet on how to enforce them without unintended consequences.

For more articles from the week, head over to Wired-Parents.com

CLOSING THOUGHT

One theme runs through this week’s stories: governments are no longer assuming that platforms will self-regulate, or that parents should shoulder the full burden alone. Whether through bans, age verification, or design requirements, the message is slowly becoming clear: it’s the environment that matters.

That doesn’t remove the need for personal judgement. Laws move slowly, differ by country, and rarely map neatly onto individual children.

If you've been questioning whether to allow access just because "everyone else is doing it", this week offers reassurance. Your instincts aren't out of step with the direction of government policy.

  • You don’t need to decide everything at once.

  • You don’t need to match another family’s timeline.

  • And you don’t need to wait for perfect clarity.

Technology decisions shape childhoods. Making them deliberately often starts with noticing when the defaults no longer feel right for your children and your family.

Know a parent who would find this useful?

Sharing is Caring

Or share this link: https://wired-parents.com


📚 NEW TO PLUGGED IN?

Get the free 103-page Age-by-Age Tech Guide: See what parents worldwide are deciding about phones, social media, screen time and gaming at ages 8–17.

Download Free Guide →

Worth Reading: TKSST

Fun and interesting science, art and nature videos for curious minds.

Daily Curated Videos & News from TKSST

Curated educational videos and news stories that make science, art, and nature accessible to curious minds, since 2011.


Technology decisions shape childhoods. Make yours deliberately.

Helping parents act deliberately rather than cross their fingers.

Were you forwarded this email? Sign up here
