Australia’s eSafety Commissioner has given eight social media and messaging platforms 30 days to disclose how many children use their services and how they detect and block underage users, as the country weighs protection options that could include imposing age limits.

Requests for information were sent to Google’s YouTube, Meta Platforms’ Facebook and Instagram, TikTok, Snap, Reddit, Discord and Twitch.

In a statement, eSafety Commissioner Julie Inman Grant noted that while most of the platforms have age limits in place, commonly 13, “we also want to know how they are detecting and removing under-aged users and how effective this age enforcement is”.

Inman Grant suggested age limits could be on the table, explaining that the regulator needs better information to understand which measures might be effective and what unintended consequences they could carry.

“A key aspect of this conversation is having some solid data on just how many kids are on these platforms today and the range of their ages, which is a key focus of these requests,” she said.

The agency also wants to assess the platforms’ readiness for age assurance: how accurately they determine users’ ages, both to keep out children below the permitted age and to ensure appropriate protections for those old enough to use the services.

Inman Grant said the regulator’s research showed almost a quarter of eight- to ten-year-olds reported using social media weekly or more often, and nearly half of those aged 11 to 13 did the same.