A new online safety code in Australia requires search engines to take measures to reduce the risk of child sexual abuse material appearing in their results and to ensure AI is not misused to create deepfake versions of such harmful content.

In a statement, eSafety Commissioner Julie Inman Grant noted the search code is a significant step in the protection of children online and puts more responsibility on the tech industry to play a part in restricting the growing global trade in “worst-of-the-worst” online content. 

The new code, which came into effect yesterday (12 March), brings the number of industry codes in operation to six; between them, they cover multiple sections of the online industry, including social media, app stores, internet service providers, hosting providers, and device manufacturers and suppliers.

Inman Grant said the eSafety Commissioner will be able to seek significant penalties if search engines don’t comply with the code.

The rapid adoption of generative AI and moves by Google and Bing to incorporate AI functionality into their search engines meant the originally proposed code would have been out of date before it even took effect, the commissioner added.

She thanked the industry associations and the search engine providers for their work and willingness to go back and redraft the code to make it fit for purpose. “What we’ve ended up with is a robust code that delivers broad protections for Australians,” she stated.