Australia’s internet regulator has raised fresh concerns that major social media companies are not doing enough to enforce the country’s under-16 ban, even though the law came into effect late last year.
The legislation bars users under 16 from accessing several major platforms, including Meta’s Facebook and Instagram, as well as Snapchat, TikTok, and YouTube. However, the eSafety regulator says it has “significant concerns” about how effectively these companies are complying.
Introduced in December, the ban was designed to shield children from harmful content and addictive platform algorithms. While widely supported by parents and closely watched by other countries such as the UK, critics argue the policy is difficult to enforce and may not address the root of the problem.
In its first report since the rollout, the regulator highlighted several shortcomings. These include allowing users previously identified as underage to re-verify their age, letting minors make repeated attempts at age checks, and failing to properly block new under-16 users from creating accounts. The report also noted a lack of effective tools for parents trying to report underage activity.
Early data shows some progress: around 4.7 million accounts were restricted or removed in the first month. Yet many underage users appear to retain access, and reports from schools suggest that most teens who used social media before the ban continue to do so, often without encountering strict age verification.
eSafety Commissioner Julie Inman Grant said that while some steps have been taken, platforms must go further to meet legal requirements. Authorities will now shift from monitoring to enforcement, gathering evidence to determine whether companies have implemented adequate systems to prevent underage access.
Social media firms, meanwhile, argue that accurate age verification remains a major industry challenge. Meta has suggested that stronger controls at the app store level—such as age verification and parental approval—could be a more effective solution. Snapchat’s parent company, Snap, says it has already locked hundreds of thousands of accounts and continues to take action daily.
Despite the law, many experts remain skeptical. Critics argue that education about online risks may be more effective than outright bans, and warn that restrictions could disproportionately impact vulnerable groups—such as rural youth, disabled teens, and LGBTQ+ communities—who often rely on online spaces for connection.
Still, regulators insist the policy marks a necessary shift. Describing it as an effort to undo decades of entrenched social media habits, officials say meaningful cultural change will take time—but maintain that platforms already have the tools needed to comply.






































