The report revealed that 80 per cent of surveyed children aged eight to 12 and 95 per cent of children aged 13 to 15 used one or more social media platforms — Facebook, Instagram, Reddit, Snapchat, TikTok, Twitch, YouTube and Discord — last year.
Of those children, 54 per cent accessed social media through their parent or carer’s account, while 36 per cent had their own account on at least one of those services.
Instagram and Snapchat reported having more than one million users aged 13 to 17; YouTube had more than 643,600; TikTok more than 522,800; Facebook 455,000; Discord 222,100; and Twitch 24,400.
Reddit was unable to say how many underage users it had on its platform.
This is despite the federal government passing a social media ban for children under 16 in November — a move criticised as rushed and ill-considered by some politicians, experts and the industry.
The ban relies on truthful age declaration so that social media platforms can block users under 16 from creating an account.
The eSafety Commissioner noted the rules have been “insufficient in preventing many under-13 users from creating an account for services that are not suitable for them”.
In addition to age declarations, Snapchat, TikTok and Twitch deployed language analysis technology to detect signals that a user may be under 16.
TikTok and Twitch also used artificial intelligence-driven age estimation to detect underage users, YouTube used classifiers, and Facebook, Instagram and Discord had age estimation models in place.
Reddit collected no age-related data during the sign-up process and, instead, relied on users to truthfully declare their age.
According to the report, there was no baseline or consistency across the industry, revealing a disconnect between platforms’ terms of use and how they are enforced.
“Some services are doing a lot and investing in tools, technology and/or processes, while others are doing very little,” the report read.
eSafety Commissioner Julie Inman Grant, in the report’s foreword, called on social media service providers to comply with the Basic Online Safety Expectations, the social media minimum age law and emerging international requirements.
“While social media services must do more to implement age assurance measures and prioritise the best interests of children, we cannot expect them to act alone,” she said.
“The responsibility for child safety, including appropriate age assurance, must be shared across the digital ecosystem, including devices and their operating systems, app stores, search engines, and other services.
“Parents and carers, educators, policymakers, and technology developers all have a role to play in fostering safer digital spaces.”