In this week’s Plugged In by Wired Parents, safety by design takes centre stage. The world’s first AI-powered child smartphone, the HMD Fuse, has launched with real-time content blocking that stops nude images from ever being seen or shared, marking a potential industry shift in how child devices are built.

At the same time, Ofcom research shows a third of children aged 5–7 are already on social media without supervision, revealing just how far behind current safeguards are in practice.

And in the policy space, the UK’s Safeguarding Minister Jess Phillips is pressing tech giants to follow suit or face tougher regulation, as pressure mounts on the industry to take greater responsibility for children’s online safety.

TL;DR: World’s first AI kids’ phone, 5-year-olds scrolling social apps & UK warns tech giants to clean up.

First time reading? Join other parents looking to keep themselves informed. Sign up here.

Need To Know

📱 World's First AI-Powered Child Smartphone Launches

  • New HMD Fuse smartphone for children features world's first technology capable of blocking explicit content

  • Uses built-in AI to detect and prevent nude content from being viewed or shared

  • Device specifically designed for children with parental oversight capabilities

  • Represents major breakthrough in proactive child protection technology

  • Available now, addressing growing parental concerns about smartphone safety for young users

The launch of the HMD Fuse comes at a critical time, as parents struggle to balance their children's need for communication technology with legitimate concerns about exposure to inappropriate content. Its hardware-based approach to content protection could influence how other manufacturers design child-friendly devices, potentially setting new industry standards for proactive safety features. Rather than relying on reactive filtering, the phone's built-in AI prevents nude content from being viewed or shared in real time.

The smartphone's AI-powered protection system operates directly on the device, providing immediate content blocking without relying on network-based filters that can be bypassed or may have delays. This approach offers parents a new level of confidence in their children's smartphone safety, addressing one of the most significant concerns about connected devices.

🪬 UK Safeguarding Minister Demands Tech Builders Do More for Child Protection

The UK’s Safeguarding Minister, Jess Phillips, has delivered a pointed message to Silicon Valley: device makers and operating system developers need to do more to protect children. Speaking this week, Phillips stressed that while regulators are moving ahead with the Online Safety Act, the companies designing the phones in children’s pockets must also carry responsibility for keeping them safe.

Her main concern? Easy access to online pornography. Despite age restrictions and parental controls, many young children still stumble across explicit content, and parents remain deeply anxious about what their children are exposed to when unsupervised. Phillips praised the new HMD Fuse handset for its built-in child safety features: tools that go beyond apps and filters to create protections at the device level.

But her message was clear: unless manufacturers and software providers take a more active role in child protection, they may face tougher legislation or even backlash from consumers demanding safer tech.

This intervention highlights a crucial shift in the conversation. For years, the spotlight has been on social media giants, but Phillips’ comments bring attention to the hardware and software foundations children use every day. Phones are not neutral tools, she argued. Their settings, defaults, and guardrails shape the online experience just as much as apps and platforms.

Why it matters for parents

  • Safety must be built-in, not bolted on. Filters and apps help, but Phillips wants protections embedded at the core of devices.

  • Market pressure is growing. Parents may soon have more choice between “family-first” devices and standard smartphones.

  • Regulation is tightening. Firms that drag their feet could find stricter rules imposed.

As the school year begins, parents may want to watch how this debate develops: the next generation of devices could either make safeguarding easier or leave families fighting the same battles on their own.

🧒 Shocking Data: Young Children Using Social Media Unsupervised

New figures from Ofcom have shed light on a stark reality: one in three children aged just five to seven are using social media platforms, despite most sites requiring users to be at least 13 years old. Even more concerning, many of these children are accessing platforms like TikTok, Instagram, and Snapchat without parental supervision, placing them in environments designed for teenagers and adults.

This data points to a growing gap between platform rules and the realities of family life. While social media companies set minimum age limits, enforcement remains weak, leaving the responsibility to parents, many of whom struggle to monitor usage across multiple devices.

The Ofcom findings suggest that traditional safeguards are falling short. Age verification remains rudimentary, often based on self-declared birthdates that children can easily bypass. At the same time, parental controls are inconsistently applied, and digital literacy education for both children and parents lags behind the pace of change.

Experts warn that early exposure to social media can carry risks, from encountering inappropriate content and online predators to the impact on attention spans, sleep, and emotional wellbeing. However, they also note that these platforms can provide opportunities for creativity, connection, and learning—if used responsibly.

The findings may influence future regulatory approaches in the UK, with policymakers under pressure to push for stricter age verification tools and more transparent parental control systems. For families, it raises urgent questions: should children under 10 have any access to social media at all, and if so, what balance of supervision and education is realistic in today’s digital household?

As the debate intensifies, one point is clear—parents, schools, and regulators will all need to play a part in closing the gap between policy and reality.

🔑 Key Points

  • One in three children aged 5–7 are already using social media.

  • Most platforms (TikTok, Instagram, Snapchat) require users to be 13+.

  • Many children use these platforms without any adult supervision.

  • Age checks remain weak and easy to bypass with false birthdays.

  • Regulators may push for stricter verification and better parental controls.

If your child is under 10 and online, assume they will find ways around platform rules. The best defence isn’t just relying on age limits but combining hands-on involvement, honest conversations, and consistent supervision. Consider co-using apps together, setting clear household rules, and teaching children early how to handle inappropriate content.

In The Know

For more articles from the week, head over to Wired-Parents.com

App Decoder

💬 App Alert: What Parents Need to Know About Discord Right Now

Your child says it's "just for gaming chat" – but Discord has become ground zero for a disturbing trend that every parent needs to understand. With 150 million users worldwide, this platform your kids use to coordinate Minecraft builds and Fortnite matches just became the centre of major legal action.

What's Happening

New Jersey's Attorney General sued Discord in April 2025 for deceptive practices that allegedly expose children to predators. But the real wake-up call came just last week: a family filed a lawsuit claiming Discord failed to protect their 10-year-old daughter from an abductor who used the platform to groom her before attempting a real-world kidnapping.

Discord isn't just where predators contact children. It's where they build relationships over months, gaining trust through shared gaming interests before escalating to private messages, phone calls, and eventually meeting attempts.

The Reality Check

Unlike social media platforms your kids might use, Discord operates more like a massive collection of private chat rooms. Children join "servers" (group chats) based on their interests – gaming, anime, art, music – where they interact with complete strangers of all ages. The platform's design makes it incredibly difficult for parents to monitor these interactions.

The concerning pattern: adults join servers popular with children, identify vulnerable kids through casual conversation, then move them to private direct messages where grooming escalates rapidly. Law enforcement reports show this exact sequence in multiple recent cases.

What This Means for Your Family

The Tricky Part: Your child probably uses Discord legitimately – coordinating with gaming friends, joining hobby communities, participating in school project groups.

The Dangerous Part: The same features that make Discord great for gaming coordination make it perfect for predators seeking private access to children.

The Parent Challenge: Discord's privacy-focused design means traditional monitoring approaches often fail.

Immediate Action Steps

  1. Ask to see their Discord – check what servers they're in and who they're talking to

  2. Look for private message conversations with people they don't know in real life

  3. Check if they're sharing personal information – real name, school, location, photos

  4. Set up friend request restrictions in Discord's privacy settings

  5. Enable the explicit media filter and restrict direct messages from strangers

The Bigger Picture

The New Jersey lawsuit alleges Discord markets itself as safe for children while knowing predators actively use the platform to target minors. Whether or not the legal case succeeds, the pattern of abuse is clear: Discord's design facilitates the exact type of gradual relationship-building that makes online predation successful.

Image credit: Discord

🆘 Wired Extra: This Week's Quick Hits

Word Your Kid Probably Knows: “Rizz”
Slang for charisma or flirting skills. “He’s got rizz.”

🔧 Tech Tip for Tired Parents: Night Shift Mode
Turn on Night Shift to reduce blue light in the evenings.

📴 Offline Challenge of the Week: Make Something
Do a craft, cook a meal or build something together. No phones allowed!

Et cetera

Courtesy of Mattel

♟️ Youngest chess player to defeat a grandmaster

🕰️ Getting kids back on sleep schedule for the school year

👧🏿 Barbie celebrates Venus Williams’ equal pay advocacy with new doll

🍖 Carnivore babies and their parents

🐩 Why the origin of the word 'dog' remains a mystery

PUZZLES & TIPS

Answers To Last Week’s Brain Teasers

How far can a squirrel run into the woods?

Halfway. After that, he’s running back out of the woods.

What is 3/7 chicken, 2/3 cat and 2/4 goat?

Chicago

Congrats to Carey, UK for the correct answers!

This Week’s Brain Teasers

What 4-letter word can be written forward, backward or upside down, and can still be read from left to right?

What is black when it’s clean and white when it’s dirty?

Answers next week!

SHARING IS CARING

No Parent Left Behind

If you think this newsletter could help another parent in their own digital journey with their children, please forward this email so they can subscribe. It would mean a lot to us.

Or copy and paste this link to others:

Get Plugged In with Wired Parents.

Our mission at Wired Parents is to help you critically evaluate your children’s digital access with insight, inquiry and options from news and opinions along with tools, tips and strategies. Because they’ll either thank you later, or wish you’d fought harder.

Were you forwarded this email? Sign up
