In partnership with

TL;DR: First Data Drops; Gaming Gets Caught; CEOs Restrict Their Own Kids

Happy New Year to you all! Welcome to our many new subscribers, and thank you for caring enough to be a part of our community. I hope 2026 is a good one for you, and free of technology drama for you and your children.

2025 was the year we argued about whether governments should restrict children's social media access. 2026 is the year we find out if it actually works—and I'll be honest, I'm fascinated to see what the data shows.

Australia's eSafety Commissioner will release the first official compliance data from the world's first social media ban "in the coming days, most likely this week." Four weeks in, we're about to get hard evidence: not opinions or predictions, but actual data on whether platforms can enforce age restrictions at scale.

Meanwhile, the regulatory net is widening beyond social media, tech executives are making revealing choices about their own families, and the implementation phase has well and truly begun.

Here's what I'm watching this week. As always, please get in touch and let me know what your biggest issues are in dealing with your children and tech.

Heidi

Enjoying Plugged In? Forward this to a parent friend who's also trying to navigate digital parenting decisions. They can subscribe here.

The New Year Ritual That Sets the Tone for Energy and Glow

January calls for rituals that actually make you feel amazing—and Pique’s Sun Goddess Matcha is mine. It delivers clean, focused energy with zero jitters, supports glowing skin and gentle detox, and feels deeply grounding. Smooth, ceremonial-grade, and crave-worthy, it’s the easiest way to start your day clear, energized, and glowing from the inside out.

NEED TO KNOW

Australia’s Under-16 Ban: What’s Happened Since Christmas

[If you haven't read our comprehensive coverage of the ban's first month, including how enforcement actually works, workarounds being used, and concerns about vulnerable children, start here]

It's been four weeks since Australia's social media ban took effect, and the story keeps developing.

The Compliance Data Is Coming

The eSafety Commissioner promised to publish compliance findings "in the New Year" after issuing information notices to all 10 platforms in mid-December. When contacted by Plugged In, the eSafety office confirmed the initial compliance data will be released "in the coming days, most likely this week."

This will be the first official data on how many accounts were closed, whether platforms are successfully preventing new under-16 sign-ups, and how many teens are circumventing the restrictions.

Court Cases and Public Opinion

Legal challenges from Reddit and the Digital Freedom Project are proceeding, with the next court date set for April 13, 2026. Reddit argues the law breaches the implied constitutional freedom of political communication and that it shouldn't qualify as a "social media platform" in the first place.

Public support remains high—a Monash University poll found 79% of Australians support the ban. However, a December poll found only 29% of parents plan full compliance, with 53% planning to allow selective platform access.

New Zealand Joins In

On January 5, NZ Prime Minister Christopher Luxon announced social media restrictions for under-16s will become part of his government's official work programme, following Australia's model.

Bloomberg's analysis suggests other countries are watching Australia's enforcement data closely before committing to similar bans.

More Countries Move to Restrict Children's Social Media Access in 2026

Following Australia's under-16 ban, multiple countries have announced or implemented their own restrictions on children's social media use.

What Took Effect January 1, 2026

Virginia, USA: First US state to impose daily time limits. Under-16s are automatically capped at one hour per day per platform (Instagram, TikTok, YouTube, Snapchat, etc.) unless parents override the setting. Platforms must verify users' ages and can face civil penalties for violations. The law is already facing First Amendment challenges from NetChoice.

Florida, USA: Complete ban for under-14s. Children aged 14-15 require parental consent to create or maintain social media accounts. The law is not being actively enforced while legal challenges proceed, but it's technically on the books.

LOOKING AHEAD

France (September 2026 target): President Macron is pushing for parliamentary debate in January on an under-15 ban, timed to coincide with the 2026-2027 school year. The proposal also includes prohibiting mobile phones in secondary schools. Draft legislation cites risks including inappropriate content exposure, cyber-harassment, and disrupted sleep patterns.

Denmark (mid-2026): Planning an under-15 ban with one exception: parents can grant permission for 13-14 year-olds. A "digital evidence" app launching spring 2026 will handle age verification. Officials report that despite current restrictions, 94% of Danish children under 13 have profiles on at least one social media platform.

Malaysia (2026): Confirmed plans for an under-16 ban as part of a broader Online Safety Act taking effect January 1, 2026. The country plans to implement eKYC (electronic Know Your Customer) checks using official government IDs.

Norway: Prime Minister Jonas Gahr Støre announced plans in October 2025 to ban social media for under-15s. Implementation timeline and details remain under development.

The Enforcement Challenge

Every jurisdiction implementing age restrictions faces the same obstacle: verification. Current options include self-reported birthdates (easily defeated), ID uploads (privacy concerns), biometric age estimation (accuracy issues, especially for teenagers), and device-level parental controls.

Meta announced in December 2025 it will roll out AgeKey, a cross-platform age verification system, in the UK, Australia, and Brazil in 2026. The system uses cryptographic credentials stored on users' devices, similar to passkeys, allowing one-time verification that can be reused across multiple platforms.
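The "verify once, reuse everywhere" idea is the interesting part. AgeKey's internals haven't been published, so as a purely hypothetical sketch (all names here are illustrative, and an HMAC shared secret stands in for the passkey-style public-key credentials a real system would use): a trusted verifier checks a user's age once, issues a signed attestation bound to the user's device, and any platform that trusts the verifier can later check the signature without re-verifying the age itself.

```python
import hashlib
import hmac
import json
import time

# Hypothetical demo secret; a real passkey-style system would use asymmetric
# keys so platforms could verify without being able to forge attestations.
SERVICE_KEY = b"demo-secret-shared-with-platforms"

def issue_attestation(device_id: str, over_16: bool) -> dict:
    """One-time step: the verification service signs an age claim
    bound to the user's device."""
    claim = {"device": device_id, "over_16": over_16, "iat": int(time.time())}
    payload = json.dumps(claim, sort_keys=True).encode()  # canonical form
    sig = hmac.new(SERVICE_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "sig": sig}

def platform_verify(attestation: dict) -> bool:
    """Any trusting platform re-derives the signature; no re-check of
    the user's actual age is needed."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(SERVICE_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, attestation["sig"])

token = issue_attestation("device-abc", over_16=True)
print(platform_verify(token))   # the same token works on every trusting platform

# Any tampering with the claim invalidates the signature.
forged = {"claim": {**token["claim"], "device": "other"}, "sig": token["sig"]}
print(platform_verify(forged))
```

The design trade-off this illustrates: the platforms never see the user's ID document, only a yes/no attestation, which is why regulators view credential-based approaches as less privacy-invasive than per-platform ID uploads.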

The US Patchwork Problem

At least nine US state social media laws have been temporarily blocked or enjoined on First Amendment grounds: Arkansas, California, Georgia, Louisiana, Mississippi, Ohio, Tennessee, Texas, and Utah. Virginia's and Florida's laws are in effect but face similar constitutional challenges.

The core legal question: do these restrictions unconstitutionally limit minors' free speech rights? The Supreme Court's June 2025 decision upholding age verification for adult websites may provide precedent, but the court has not directly addressed social media restrictions for minors.

What This Means

If you're in Virginia or Florida, these restrictions already apply to your family. If you're elsewhere, similar laws may be coming. Even without legislation in your area, platforms are implementing age verification systems, time limits, and enhanced safety features globally to comply with the growing regulatory patchwork.

The shift is clear: after years of debate, governments are implementing restrictions on children's social media access. Whether these measures improve children's wellbeing, whether they're enforceable at scale, and whether they create unintended consequences remains to be seen.

Nine in Ten Parents Argue With Their Children Over Technology Weekly

If you've had a screen time argument this week, you're in good company. New research reveals 90% of American parents argue with their children over technology use, and for half, these disputes happen weekly.

Top flashpoints: too much screen time (46%), bedtime phone use (40%), gaming (28%), and devices during meals (28%).

The shocking stat: In the past year, 59% of children saw online videos depicting extreme violence, serious injury, or death. Parents' concerns aren't abstract; they're responding to what children are actually encountering.

Despite 95% of parents having internet rules at home, the arguments persist. Technology disputes now top the list of parent-child conflicts (28%), ahead of chores (25%) and homework (21%).

I suspect most of us recognise this pattern. The negotiation never really ends; it just evolves.

Read more here: Wired Parents

YOUTUBE

YouTube's CEO Limits His Own Children's Social Media. Should You?

Neal Mohan runs YouTube, where more than two billion users watch videos and 500 hours of content are uploaded every minute. This week, the newly crowned TIME 2025 CEO of the Year revealed something striking: at home, he and his wife place "controlled and restricted" limits on their three children's access to YouTube and other platforms.

Mohan joins a remarkable list of tech executives who privately constrain what their own children can access. Susan Wojcicki, his predecessor at YouTube, barred her children from the main app. Bill Gates didn't let his children have phones until age 14. Steve Jobs restricted iPad use at home despite leading the company that created it.

The uncomfortable question: if the people building these products won't let their own children use them freely, what does that tell us?

A recent Pediatrics study of over 10,000 American adolescents found that owning a smartphone by age 12 was associated with increased risks of depression, anxiety, insufficient sleep and obesity. The study didn't even examine what children did on their phones; simply having one correlated with worse health outcomes.

The pattern is striking: those who build our digital world are notably cautious about how much of it they allow into their own homes.

Read the full analysis: Wired Parents

After all this coverage of restrictions, here's what's missing from the conversation:

OPINION

The Gaming Platform Loophole Is Closing

Social media platforms face intense regulatory scrutiny. Age verification requirements. Privacy-by-default settings. Bans in multiple countries. Gaming platforms? They've largely escaped.

This week changed that equation.

Roblox made facial age verification mandatory globally yesterday (January 7). New York announced legislation specifically targeting gaming platforms alongside social media, making it the first state to acknowledge that the distinction is increasingly meaningless.

Here's what's odd about the current landscape. Australia banned under-16s from social media in December. Virginia requires parental consent and time limits. Denmark, France, Malaysia, and Norway are all pursuing similar restrictions. The definition of "social media platform" in these laws consistently excludes gaming platforms.

Yet Roblox has 80 million daily users. A significant portion are under 13. They chat with strangers. They're exposed to user-generated content that's impossible to moderate completely. They spend real money on virtual items. They form friendships and social bonds. The primary activity might be gaming, but the social features mirror everything regulators are concerned about on Instagram or TikTok.

The classification matters because regulation follows categories. If you're "social media," you face age verification requirements, content moderation standards, and increasingly, outright bans for children. If you're "gaming," you don't—even when children are doing functionally identical activities.

Roblox argues it's a gaming platform where social features are secondary. But ask any parent whose child uses Roblox what the platform is actually for. The answer isn't "gaming." It's socialising through games.

New York spotted the problem. Governor Hochul's proposals announced this week specifically name Roblox. They extend age verification, privacy-by-default settings, and parental controls to gaming platforms. Not because gaming is inherently harmful, but because the gaming/social media distinction no longer reflects how children actually use these platforms.

Roblox's face scanning mandate—launched yesterday—is a response to this pressure. Three US states are suing over child safety. The lawsuits forced action.

The regulatory landscape is catching up. Whether that's good policy or government overreach depends on your perspective. But the days of gaming platforms flying under the radar appear to be ending.

Perhaps that's progress. Or perhaps it's just evidence that we're regulating symptoms rather than addressing why platforms serving millions of children weren't designed with safety as the default in the first place.

IN THE KNOW

For more articles from the week, head over to Wired-Parents.com

TECH TRIVIA

  • The First Webcam Was Created to Watch a Coffee Pot - In 1991, Cambridge University researchers set up a camera pointed at their coffee pot so they could check if it was full without leaving their desks. This "Trojan Room coffee pot" became one of the internet's first live-streaming webcams, running until 2001. (Source: BBC)

  • TikTok's Original Name Was Musical.ly - The app teenagers use today started in 2014, focused on lip-syncing videos. Chinese company ByteDance bought it for $1 billion in 2017, merged it with their own app, and rebranded everything as TikTok. Most users have no idea they're using a rebranded app. (Source: The Verge)

  • Snapchat Turned Down $3 Billion from Facebook - In 2013, Mark Zuckerberg offered to buy Snapchat for $3 billion. Founder Evan Spiegel, then 23 years old, declined. Today, Snap's market value fluctuates around $20 billion, though Meta has successfully copied many of Snapchat's features into Instagram and WhatsApp. (Source: Forbes)

Know a parent who would find this useful?

Sharing is Caring

Or share this link: https://wired-parents.com


📚 NEW TO PLUGGED IN?

Get the free 103-page Age-by-Age Tech Guide: See what parents worldwide are deciding about phones, social media, screen time and gaming at ages 8-17.

Download Free Guide →

Get Plugged In with Wired Parents

We track what's happening with children and technology so you can make informed decisions for your family. Every Thursday: safety updates, new research, and what's happening worldwide.

What every parent in today's digital world needs to know.

Were you forwarded this email? Sign up here
