Australia urges tech companies to show how they stop online child sexual exploitation – POLITICO

Tech companies like Meta, Apple and Microsoft must disclose within the next 28 days how they monitor online child sexual exploitation materials or face potentially large fines, according to demands from Australia’s eSafety Commissioner published on Monday.

The requirements are part of updated rules that went into effect last year and give the country's online content regulator more powers to force social media companies to disclose what steps they are taking to protect people, and especially children, online.

“Any company should have a zero-tolerance policy about weaponizing their platforms in this way, whether for the distribution, hosting or live streaming of this material,” said Julie Inman Grant, Australia's eSafety Commissioner, when asked by POLITICO why her agency was demanding more details on how these companies monitor such content. “If they aren’t doing enough to proactively detect, prevent and remove this (content), what else are they allowing to happen on their platforms?”

As part of the legal notices served to Meta, Microsoft, Apple, Snap and Omegle — a niche anonymous online chat service — the companies must provide, within the next 28 days, detailed answers on how they find and remove child sexual exploitation material and what steps they are taking to keep children safe online. Companies that fail to comply face daily penalties of up to AUD 550,000 (€383,000).

Almost all of these companies publish granular information about these processes in regular transparency reports. But Inman Grant said those documents often did not address how Australians fell victim to often internationally organized gangs spread across countries like the Philippines or Nigeria. Existing reports also did not provide enough detail on what steps the companies were taking to track the problem, or how many cases of online child sexual exploitation took place on their platforms.

“We don’t really know the extent of the child sexual exploitation material,” said Inman Grant, a former Microsoft executive. “Part of the problem is that no one has held their feet to the fire or had any tools to say, ‘Do you have any real knowledge of how your platforms are being weaponized?'”

Representatives from Apple, Microsoft, Snap, and Omegle did not immediately respond to requests for comment. Meta confirmed receiving the legal notice.

Australia’s efforts are part of a broader push in the West to force companies to take more responsibility for how their platforms are used to disseminate child sexual exploitation material online. Jurisdictions including the European Union, Canada, and the United Kingdom are all seeking to enact new rules aimed at getting these companies to do more, including potentially scanning their users’ encrypted messages for such illegal content.

Those plans have pitted children’s advocacy groups, who want companies to crack down on such abuses, against privacy activists, who are urging companies not to weaken so-called end-to-end encryption, a technology that makes it impossible for platforms to read messages sent between individuals.

Inman Grant said she was not in favor of diluting encryption. However, she added that these companies already scan encrypted messages for malicious code and malware, and should therefore take further steps to protect children from online exploitation.

“I see it as the responsibility of the platforms using this technology to also develop some of the tools that can help detect illegal activities when they take place, while maintaining privacy and security,” she added.

