Australia Demands Big Gaming Platforms Prove They're Stopping Child Predators

Australia's eSafety Commissioner has issued legally binding demands to Roblox, Microsoft, Epic Games, and Valve, requiring them to detail their defenses against child grooming and extremist recruitment on their platforms. The enforcement notices carry teeth: companies face penalties of up to AU$825,000 per day for non-compliance.

Commissioner Julie Inman Grant said the agency has documented a pattern of predators using these gaming spaces as hunting grounds. "What we often see is offenders make contact with children in online game environments, then move them to private messaging services," she stated. The risk is acute because gaming platforms have become central social hubs for Australian youth, with roughly 9 in 10 children aged 8 to 17 playing online games.

The eSafety office cited specific examples of how platforms have been weaponized. Roblox has hosted Islamic State-inspired games and recreations of mass shootings. Minecraft users have created far-right imagery and fascist recreations. Fortnite saw World War II concentration camp recreations and January 6 Capitol riot reenactments. Steam, investigators found, harbors an extensive extreme-right presence, including tens of thousands of hate-based groups.

Predators and extremists exploit these spaces deliberately. "Predatory adults target children through grooming or embed terrorist and violent extremist narratives directly into gameplay," Inman Grant explained. Once initial contact is made in the game environment, offenders frequently shift victims to encrypted messaging apps where monitoring becomes harder.

The eSafety office was originally created in 2015 to combat cyberbullying and the distribution of child sexual abuse material, but its mandate has since expanded significantly to cover broader online harms affecting all Australians.

Roblox responded to the inquiry with a statement outlining current safety measures. The company said it prohibits content promoting terrorism or extremism and uses AI to scan images, text, and avatar items before publication to block extremist symbols. It also encourages users to report concerning content. Roblox announced plans to introduce age-restricted accounts for users under 16, with tighter parental controls and age-appropriate content filtering. "Our commitment to safety never ends," a spokesperson said.

The company framed its approach as collaborative, noting it works with law enforcement and civil society groups on violent extremism prevention.

Author Emily Chen: "Australia is forcing the industry's hand on a problem that's been obvious for years, but these transparency notices only matter if the platforms actually answer honestly and change behavior."
