The passing of the federal government’s social media ban before the completion of an age assurance trial has raised questions about whether the legislation will adequately protect children.

According to the Department of Infrastructure, Transport, Regional Development, Communications and the Arts, age assurance is an overarching term that encompasses a number of different technologies that include age verification, age estimation, age inference, parental certification or controls, and technology stack deployments.

All these methods will be tested in an age assurance trial run by a consortium headed by the UK-based Age Check Certification Scheme (ACCS). Its results are due in mid-2025.

Internationally, no country has implemented an age mandate without issue, The Guardian reports, citing government documents accessed through FOI.

In the UK, the Online Safety Act attempts to restrict young people’s access to adult websites, not social media, relying on uploads of government-issued documents or the use of biometrics.

Communications minister Michelle Rowland has confirmed that Australians will not be required to hand over personal identification, including government IDs, to social media companies.

Australian National University academic Dr Faith Gordon said that aside from personal identification, age assurance for social media could look like: “using biometrics to visually classify someone’s age based on their visual appearance; nomination of an adult to confirm that a young user is of age to access the platform; or having a landing page set up asking users to verify their age before moving onto the main page.”

The online harms bill, known informally as the social media ban, was passed on Thursday night alongside an amendment to the Privacy Act 1988. The latter gives the privacy commissioner oversight of the age assurance trial and responsibility for working out privacy-preserving methods for the technology.

“What we’re seeing in privacy reforms is a positive push towards the way our data will be managed, including an obligation for platforms to delete data collected for age assurance purposes.

“However, there’s still a long way to go in actioning this. We don’t have the granular details, and will have to wait to see how it plays out, practically,” Dr Gordon said.

However, the increasing clarity on age assurance doesn’t quell concerns that the federal ban could still exacerbate harms for children.

“It’s going to potentially leave children that get around the age assurance and verification technologies even more vulnerable, because the companies are going to assume that children aren’t on the platforms,” she added.

The Convention on the Rights of the Child is one of the world’s most ratified human rights treaties. Australia has been a signatory since 1990.

The Convention speaks to the importance of children’s participation in decisions that could affect them, and Dr Gordon said these perspectives are not being heard by the Australian government.

“The federal social media ban is under a description of protecting children, but in this case, protection seems to be eclipsing their participation rights,” Dr Gordon said.

Given social media’s popularity, many young users are likely to try to find ways around the ban. Twelve-year-old Angus Lydom told AFP that he’d “like to keep using it” and that, like his other friends, he’ll “find a way.”

Similarly, 11-year-old Elsie Arkinstall said there was still a place for social media, particularly for children wanting to watch tutorials about baking or art.

“Kids and teens should be able to explore those techniques because you can’t learn all those things from books,” she added.

Better education and empowerment of children and their families are crucial parts of developing a rights-based approach to using age assurance technologies, Dr Gordon said.

“The basic online safety expectations and enforceability that forces the companies to drive up safety standards is also key. Removing children is not going to address the issue of online harms on social platforms, and the manipulation of children’s data won’t either,” she added.