UK could force messaging apps to search for child sex abuse images

Heavily encrypted messaging services such as WhatsApp could be required to adopt cutting-edge technology to spot child sexual abuse material, or face the threat of steep fines, under new changes to UK law.
The amendment to the Online Safety Bill would require tech companies to do their best to deploy new technology that identifies and removes child sexual abuse and exploitation (CSAE) content.
It comes as Mark Zuckerberg’s Facebook Messenger and Instagram apps prepare to introduce end-to-end encryption, amid strong opposition from the UK government, which has called the plans “not acceptable”.
Priti Patel, a longtime critic of Zuckerberg’s plans, said the law change balances the need to protect children with ensuring user privacy online.
The Home Secretary said: “The sexual abuse of children is a sickening crime. We all need to ensure that criminals are not allowed to run rampant online and tech companies need to play their part and take responsibility for keeping our children safe.
“Privacy and security are not mutually exclusive – we need both, and we can have both and that’s what this amendment delivers.”
Child safety campaigners have warned that heavy encryption will prevent law enforcement and tech platforms from seeing illegal messages by ensuring only the sender and recipient can see their content – a process known as end-to-end encryption. However, officials said the amendment was not an attempt to stop the rollout of such services, and that any technology deployed should be effective and proportionate.
Zuckerberg’s Meta business, which also owns the encrypted messaging service WhatsApp, has delayed the introduction of encryption on Messenger and Instagram until 2023.
Checking private messages for child sexual abuse material has proven controversial, with campaigners warning of negative consequences for users’ privacy. One controversial method that could be considered by the communications watchdog, which oversees the bill’s implementation, is client-side scanning. Apple has delayed plans to introduce the technology, which would compare users’ photos with known child abuse images when they chose to upload them to the cloud.
Under the proposed amendment, the watchdog Ofcom will be able to require tech companies to deploy or develop new technologies that can help find abusive material as well as stop its spread. The amendment reinforces an existing clause in the bill that already gives Ofcom the power to require the deployment of “accredited technology”. The change will now require companies to use their “best efforts” to deploy or develop “new” technology if existing technology is not suitable for their platform.
If a company fails to adopt this technology, Ofcom has the power to impose fines of up to £18 million or 10% of a company’s annual worldwide turnover, whichever is greater. The Online Safety Bill returns to Parliament next week after being considered by a committee of MPs and is expected to become law towards the end of the year or early 2023.
There are between 550,000 and 850,000 people in the UK who pose a sexual risk to children, according to the National Crime Agency. “We need tech companies to be there on the front lines with us and these new measures will ensure that,” said Rob Jones, the NCA’s lead for child sexual abuse.