Australia Demands Roblox, Minecraft, Fortnite, and Steam Explain Child Safety

Australia's eSafety office has issued legally enforceable transparency notices to Roblox, Microsoft, Epic, and Valve over child grooming and radicalisation concerns on their platforms.

Eliza Crichton-Stuart


"Gaming platforms are amongst the online spaces most heavily used by Australian children, functioning not only as places to play, but also as places to socialise and communicate," said eSafety Commissioner Julie Inman Grant in a published statement. That quote sums up exactly why Australia's government just put four of the biggest names in gaming on notice.

The Australian eSafety office, an independent government agency established in 2015, has issued legally enforceable transparency notices to Roblox, Microsoft (Minecraft), Epic Games (Fortnite), and Valve (Steam). The notices formally demand that each company explain, in specific detail, what systems they have in place to prevent child grooming and the spread of violent extremist content on their platforms.

What the eSafety office actually found

The concerns aren't vague. Inman Grant cited documented examples across all four platforms, including Islamic State-inspired games and recreations of mass shootings appearing in Roblox, far-right groups recreating fascist imagery in Minecraft, Fortnite games built around World War II concentration camps, and recreations of the January 6, 2021 US Capitol Building riot. Steam, meanwhile, was flagged as "reportedly a hub for a number of extreme-right communities," a claim that tracks with previous scrutiny the platform has faced over hosting groups that amplify Nazi and other hate-based content.

The eSafety office's own research found that around 9 in 10 children aged 8 to 17 in Australia had played online games. That's an enormous audience, and according to Inman Grant, predatory adults know it.

The grooming pipeline that starts in-game

Here's the thing that makes gaming platforms specifically concerning in this context: the games themselves aren't always where the harm happens. Inman Grant specifically called out a pattern where offenders make initial contact with children inside game environments, then move them to private messaging services to continue grooming away from any platform moderation.

That two-step approach makes detection harder. A game's chat filter might catch obvious red flags, but once a conversation shifts to a private Discord server or a direct messaging app, the original platform has no visibility. The eSafety office's demand for transparency is partly aimed at understanding whether companies are even tracking this migration, let alone trying to interrupt it.

Roblox responds, others stay quiet

Of the four companies named, only Roblox had provided a response at the time of reporting. The company said it "welcomes engagement with eSafety" and outlined several existing measures: strict policies against content that promotes terrorist or extremist organisations, AI-based review of all images, text, and avatar items before publishing, and active cooperation with law enforcement and civil society groups.

Roblox also pointed to a recently announced update, confirming that age-based accounts for users under 16 will soon be introduced. These accounts will tie content access, communication settings, and parental controls directly to a user's verified age. The company acknowledged the limits of any system, saying "no system is perfect," but committed to continued collaboration with eSafety.

Microsoft, Epic Games, and Valve had not publicly commented at the time this story was published.

What this means beyond Australia

Australia has become one of the more aggressive regulatory environments for online platforms used by children. The eSafety office's expanded mandate and its willingness to issue mandatory notices backed by significant financial penalties signal that, at least in this jurisdiction, self-regulation is no longer considered sufficient on its own.

What most players miss is that these platforms already have moderation systems in place. The real question the eSafety office is asking isn't whether safeguards exist, but whether they're working at the scale required, and whether companies can actually prove it. The transparency notices mean these companies will have to answer that question with specifics, not press statements.

The responses from Microsoft, Epic, and Valve, when they arrive, will be worth watching closely. Australia's approach has a history of influencing policy conversations in other markets, and if these transparency reports surface significant gaps, the fallout could reach well beyond one country's borders.


Posted April 26th 2026 · Updated April 26th 2026
