Advocates have welcomed a move to establish new online safety codes aimed at protecting Australian children from exposure to pornography and other harmful content.
The eSafety Commissioner, Julie Inman Grant, said her office will register three industry-prepared codes designed to limit children’s access to high-impact, harmful material such as pornography, violent content, and themes of suicide, self-harm and disordered eating.
The codes cover enterprise hosting services, internet carriage services, and search engines.
The commissioner has also requested additional safety commitments and stronger protections more broadly online, including from app stores, device manufacturers, and social media and messaging services, to prevent things such as “declothing apps” and other generative AI services from becoming a risk to children.
“Just as AI has brought us much promise, it has also created much peril. And these harms aren’t just hypothetical—they are taking hold right now,” said Inman Grant in a 24 June speech at the National Press Club in Canberra.
“In February, eSafety put out its first Online Safety Advisory because we were so concerned with how rapidly children as young as 10 were being captivated by AI companions—in some instances, spending up to five hours per day conversing with sexualised chatbots.
“I plan to make my final determination by the end of next month. If I am not satisfied these industry codes meet appropriate community safeguards, I will move to developing mandatory standards.”
Inman Grant also revealed results from a survey of more than 2,600 children aged 10 to 15 on online harms they had faced.
Nearly all of the children reported having used at least one social media platform. Around seven in 10 said they had encountered content associated with harm, including exposure to misogynistic or hateful material, dangerous online challenges, violent fight videos, and content promoting disordered eating.
One in seven children surveyed reported experiencing online grooming-like behaviour from adults or other children at least four years older, including asking inappropriate questions or requesting they share nude images.
“Big Tech mega corporations have commercially mediated the exploitation of children and have failed to be accountable or transparent about this,” said Melinda Tankard Reist, a leading campaigner against pornography and the sexualisation of girls, in comments to The Catholic Weekly.
“That’s why we called for the age verification trial, which has so far shown promising results in protecting children from explicit content. It’s why we backed laws to raise the age of access to social media to 16, to take effect at the end of the year.
“It’s also why we support the eSafety Commissioner in demanding the online industry design codes with appropriate community safeguards and contributed submissions on what we believed needed to be in these codes, with child-safeguarding at the forefront.
“However, we are cynical about industry’s willingness to comply with voluntary codes. History shows they rarely work, and benefit vested interests over community safety. We note the Commissioner had to go back to industry to demand stronger codes as a number of earlier drafts were too weak.
“Our experience of industry’s failure to put the welfare of the most vulnerable before profits suggests these mandatory codes may need to be developed and enforced sooner rather than later.”
Liberal MLC Susan Carter welcomed the recognition “that access to pornography has to be limited.”
“As a member of the [NSW] inquiry into the impact of harmful pornography, I have heard alarming evidence about the impact of online pornography on children – who are accessing it from as young as eight,” Carter said.
“Online pornography is warping teenagers’ views of intimate relationships and shaping their views about themselves and how they relate to others.
“This is a good start. It recognises the harm to relationships caused by pornography. We will need to watch to ensure that it works, and is a sufficient response. We can’t talk seriously about building a society which respects women while allowing our preteens to access all types of pornography where nothing is off limits – except respect and consent.”
Heads Up Alliance founder Dany Elachi welcomed the new codes as a “step in the right direction” but said they were a case of “too little, too slowly.”
“While governments and regulators navel-gaze, wondering whether we should move toward voluntary codes or enforce mandatory standards, many more Australian children will today be traumatised by imagery that was illegal to sell to adults just a generation or two ago,” he said.
“I hope that on the back of new legislation limiting social media access to children under the age of 16, we can finally institute similar laws, with teeth, for minors accessing online pornography.”
Michael Jaksic, Life, Marriage & Family Officer at the Sydney Centre for Evangelisation, also welcomed the commissioner’s comments.
“Given the rise in mental illness among youth in recent times, and well-documented links to harmful content posted online, I’m inclined to think her proposal is a step in the right direction, looking to create the necessary boundaries to protect children from the adverse effects of heavy online use,” he said.
“However, a crucial factor will be its implementation.”