The consultation comes as the eSafety Commissioner is embroiled in a legal battle with X over the platform’s failure to outline how it can combat child sexual abuse content.
Last month, eSafety Commissioner Julie Inman Grant imposed a $610,500 fine on the company for failing to respond to questions on how it tackles sensitive issues on the platform.
The draft standards cover Designated Internet Services, including apps, websites, and file and photo storage services; and Relevant Electronic Services, covering a range of messaging services as well as online dating services and gaming.
The standards address the production, distribution, and storage of “synthetic” child sexual abuse and pro-terror material, created using open-source software and generative AI.
Industry associations were tasked with drafting enforceable codes covering eight sectors of the online industry under Australia’s Online Safety Act.
These include social media services, websites, search engines, app stores, internet service providers, device manufacturers, hosting services, and services such as email, messaging, gaming and dating services.
Earlier this year, the eSafety Commissioner reported that only six of the draft codes contained appropriate community safeguards; the remaining two failed to provide sufficient measures.
The current consultation process follows more than two years of work by industry, including earlier public consultation, to develop draft codes that would meet these community safeguards.
Each draft standard will require companies to detect and deter unlawful content, put in place systems and processes to deal with reports and complaints, and provide tools and information that increase awareness among users, reducing the risk of such content surfacing and being shared online.
Inman Grant invited interested stakeholders to participate in the consultation process.
“I encourage everyone to have their say on the draft standards because we all want these provisions to be as robust as possible, ensuring online services take meaningful steps to address the risk of illegal content online and protect the community,” Inman Grant said.
With the eSafety Commissioner receiving more reports of child abuse content, Inman Grant noted that Microsoft’s PhotoDNA, used by over 200 organisations including most large companies, automatically matches child sexual abuse images against databases of “known” and verified material.
“PhotoDNA is not only extremely accurate, with a false positive rate of one in 50 billion, but is also privacy protecting as it only matches and flags known child sexual abuse imagery.”
“It’s important to emphasise this point: PhotoDNA is limited to fingerprinting images to compare with known, previously hashed, child abuse material. The technology doesn’t scan text in emails or messages, or analyse language, syntax, or meaning.”
“Many large companies providing online services take similar steps in other contexts, processing webmail traffic using natural language processing techniques to filter out spam, or apply other categorisation rules.”
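The matching approach Inman Grant describes can be illustrated as a hash-set lookup. PhotoDNA itself is proprietary and uses a perceptual hash robust to resizing and re-encoding; the sketch below substitutes an exact cryptographic hash (SHA-256) purely to show the principle that only fingerprints of known material are compared, and the content itself is never otherwise analysed. All names and data here are hypothetical.

```python
# Simplified sketch of fingerprint matching against a database of
# known material. NOTE: this is NOT PhotoDNA, which uses a perceptual
# hash; SHA-256 is used here only to illustrate the lookup model.
import hashlib

def fingerprint(data: bytes) -> str:
    """Return a hex digest acting as the content's 'fingerprint'."""
    return hashlib.sha256(data).hexdigest()

# Hypothetical database of fingerprints of known, verified material.
known_hashes = {fingerprint(b"example-known-file")}

def is_known(data: bytes) -> bool:
    """Flag content only when its fingerprint matches a known entry.
    Nothing about the content is inspected beyond computing the hash."""
    return fingerprint(data) in known_hashes

print(is_known(b"example-known-file"))   # matches the known fingerprint
print(is_known(b"unrelated-content"))    # everything else passes through
```

The design point this illustrates is the one Inman Grant emphasises: a match-only system flags previously hashed material and, unlike text scanning or language analysis, learns nothing about content that is not already in the database.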
However, she clarified, “eSafety is not requiring companies to break end-to-end encryption through these standards nor do we expect companies to design systematic vulnerabilities or weaknesses into any of their end-to-end encrypted services.”
“But operating an end-to-end encrypted service does not absolve companies of responsibility and cannot serve as a free pass to do nothing about these criminal acts.”
“Our focus is on ensuring industry takes meaningful steps to prevent the proliferation of seriously harmful content like child sexual abuse material. Many in industry, including encrypted services, are already taking such steps to achieve these important outcomes.”
eSafety will consider the submissions when preparing the final versions of the standards to be tabled in Parliament.
It proposes that the two standards come into effect six months after they are registered.