
Australia’s eSafety Commissioner relaxes child abuse detection rules in online safety standards | Technology

Australia’s internet safety regulator has relaxed new rules that would force tech companies to detect child exploitation and terrorist content on encrypted messaging and cloud storage services, after some of the world’s biggest tech companies warned the rules could lead to mass government surveillance.

In November, the eSafety Commissioner announced draft standards that would require operators of cloud and messaging services to detect and remove known child exploitation and terrorism material “where technically feasible”, as well as to disrupt and deter new material of that nature.

The draft did not specify how companies would have to meet the requirements technically, but in an associated discussion paper the regulator said it “does not recommend creating vulnerabilities or backdoors to undermine the privacy and security of end-to-end encrypted services.”

However, because this was not explicitly stated in the standards themselves, tech companies and privacy advocates raised concerns that end-to-end encryption would not be protected. Apple warned the standards would expose the communications of everyone using these services to mass surveillance, and expressed concern that “technical feasibility” would be judged on whether implementation was financially viable for companies, rather than on whether it would break encryption.

But the final online safety standards, tabled in parliament on Friday, make clear that companies will not be required to break encryption or to take actions that are not technically feasible or reasonably practical.

This includes cases where compliance would require the provider to “implement or build a systemic weakness or vulnerability into the service” or, “with respect to an end-to-end encrypted service, implement or build a new decryption capability into the service, or render the encryption methods used on the service less effective.”

If companies invoke these exceptions, the standards will require them to “take reasonable alternative actions,” and eSafety may require companies to provide information about those alternatives.

“We understand that different services may require different interventions, but the clear message from these standards is that no company or service can simply absolve itself of responsibility for taking clear and measurable action against child sexual abuse material and terrorist content on its services,” eSafety Commissioner Julie Inman Grant said in a statement.

Despite the compromise in the final standards, Inman Grant addressed criticism of the proposals in an opinion piece published in The Australian ahead of the standards’ publication on Friday, writing that tech companies had claimed the standards “represent a step too far, potentially unleashing a dystopian future of widespread government surveillance”.

A truly dystopian future, she said, would be one in which “adults fail to protect children from terrible forms of torture and sexual abuse, and then allow their trauma to be freely shared with predators on a global scale”.

“This is the world we live in today.”

The climbdown is a victory for technology companies providing end-to-end encrypted messaging services, including Apple, Proton and Signal, which had raised concerns about the proposal.

Proton had threatened to challenge the standards in court if they were adopted as drafted.

Encrypted messaging company Signal this week pushed back against a similar proposal in the European Union that would force tech companies to carry out “upload moderation” to detect content shared in encrypted messages before it is encrypted.

Signal’s president, Meredith Whittaker, told Guardian Australia that regardless of how regulators phrased it, it was just another way of calling for mass scanning of private communications.

“We consistently strive to explain the technical reality and the stakes of these proposals,” she said, adding that Signal pushes back against mass surveillance proposals and against claims that such scanning would not violate privacy.

“What we are talking about is a kind of contradiction in terms. You can’t conduct mass surveillance in private, period.”

The standards will come into force six months after the 15-sitting-day disallowance period in parliament.