Snapchat Disables Over 415,000 Australian Accounts Under Social Media Ban
Snapchat has locked or disabled more than 415,000 user accounts in Australia since the country's under-16s social media ban took effect in December. The platform disclosed the figure in a blog post on Monday, saying that as of the end of January it had taken action against accounts whose users either declared an age under 16 or were identified as under 16 by age detection technology.
Compliance with Australian Regulations
Snapchat was among ten platforms required to ban people aged under 16 from accessing their services under Australian law. Prime Minister Anthony Albanese heralded the initial success of the ban in January, announcing that 4.7 million accounts across these platforms had been disabled or removed in the first days of implementation. However, Snapchat's disclosure provides the most detailed account-specific data to date from any of the affected platforms.
The company emphasised that it continues to lock more accounts daily as part of its ongoing compliance efforts, even as it navigates the practical difficulties of digital age verification.
Technical Limitations and Implementation Gaps
Despite these efforts, Snapchat has warned of significant gaps in the implementation of the ban that could undermine its effectiveness. The company pointed to real technical limitations on accurate and dependable age verification, referencing last year's age assurance technology trial, which found facial age estimation technology was accurate only to within two or three years of a person's actual age.
This technological limitation creates practical challenges:
- Some young people under 16 may be able to bypass protections, potentially leaving them with reduced safeguards
- Others over 16 may incorrectly lose access to their accounts
- The accuracy margin creates implementation difficulties for platforms trying to comply precisely with the law
Broader Industry Concerns and Regulatory Focus
Snapchat has raised additional concerns about the ban's broader impact, noting that other apps where users communicate have escaped the regulation. This creates a risk that teens might simply shift to alternative, less regulated messaging platforms. "While we don't yet have data to quantify this shift, it's a risk that deserves serious consideration as policymakers evaluate whether the law is achieving its intended outcomes," the company stated in its blog post.
The eSafety commissioner, Julie Inman Grant, has acknowledged the ongoing challenges, telling reporters last month that regulatory focus has been on the initial ten platforms where most young people congregate. "We're a small team, by necessity we are going to focus where the preponderance of young people are - where there are more than 250,000 for instance, is one measure," she explained, adding that compliance efforts would continue to evolve as a work in progress.
Industry Calls for Systemic Solutions
Snapchat, like Meta, has called for app-store-level age verification as a more comprehensive solution to the age verification challenge. This approach would shift responsibility from individual platforms to the distribution channels through which apps are downloaded and accessed.
Inman Grant has noted that eSafety would be sending notices to companies regarding their compliance with the ban, while also observing that Snapchat had been using facial age estimation without a liveness test, which checks whether an image represents a real person. "What's really important is that these companies are deploying them in the right way. And if they don't have the right settings or they're setting the calibrations too high, that is where they're going to likely have false positives," she cautioned.
While the total number of account deactivations across all ten platforms stands at 4.7 million, that figure includes not just accounts identified as belonging to under-16s, but also historical, inactive and duplicate accounts that have been removed. Aside from Meta and Snapchat, none of the other platforms have disclosed how many accounts they have deactivated, and the eSafety commissioner has declined to provide a detailed breakdown.
The ongoing implementation of Australia's social media ban continues to reveal both the scale of compliance efforts and the technical challenges facing platforms attempting to verify user ages accurately while maintaining service accessibility for legitimate users.