The UK acknowledged possible technical hurdles in its planned crackdown on illegal online content after encrypted messaging companies including WhatsApp threatened to pull their service from the country.
Regulator Ofcom can only compel tech companies to scan platforms for illegal content such as images of child sexual abuse if it’s “technically feasible,” culture minister Stephen Parkinson told the House of Lords on Wednesday, as the chamber debated the government’s Online Safety Bill. He said the watchdog will work closely with businesses to develop and source new solutions.
“If appropriate technology does not exist which meets these requirements, Ofcom cannot require its use,” Parkinson said. Ofcom “cannot require companies to use proactive technology on private communications in order to comply” with the bill’s safety duties.
The remarks aim to allay tech companies’ concerns that scanning their platforms for illegal content could compromise the privacy and encryption of user data, giving hackers and spies a back door into private communications. In March, Meta Platforms Inc.’s WhatsApp threatened to pull out of the UK over the issue.
“Today really looks to be a case of the Department for Science, Innovation and Technology offering some wording to the messaging companies to enable them to save face and avoid the embarrassment of having to row back from their threats to leave the UK, their second largest market in the G7,” said Andy Burrows, a tech accountability campaigner who previously worked for the National Society for the Prevention of Cruelty to Children.
The sweeping legislation — which aims to make the web safer — is in its final stages in Parliament after six years of development. Parkinson said that Ofcom would, nevertheless, be able to require companies to “develop or source a new solution” to allow them to comply with the bill.
“It is right that Ofcom should be able to require technology companies to use their considerable resources and their expertise to develop the best possible protections for children in encrypted environments,” he said.
Meredith Whittaker, president of encrypted messaging app Signal, earlier welcomed a Financial Times report suggesting the government was pulling back from its standoff with technology companies, citing anonymous officials as saying there isn’t a service today that can scan messages without undermining privacy.
However, security minister Tom Tugendhat and a government spokesman said it was wrong to suggest the policy had changed.
“As has always been the case, as a last resort, on a case-by-case basis and only when stringent privacy safeguards have been met, it will enable Ofcom to direct companies to either use or make best efforts to develop or source, technology to identify and remove illegal child sexual abuse content – which we know can be developed,” the spokesman said.
Ministers met big tech companies including TikTok and Meta in Westminster on Tuesday.
The government has used language around technical feasibility before. In July, Parkinson told Parliament that “Ofcom can require the use of technology on an end-to-end encrypted service only when it is technically feasible.”
The NSPCC, a major advocate of the UK crackdown, said the government’s statement “reinforces the status quo in the bill and the legal requirements on tech companies remain the same.”
Ultimately, the legislation’s wording leaves it up to the government to decide what is technically feasible.
Once the bill comes into force, Ofcom can serve a company with a notice requiring it to “use accredited technology” to identify and prevent child sexual abuse or terrorist content, or face fines, according to July’s published draft of the legislation. There is currently no accredited technology because the process of identifying and approving services only begins once the bill becomes law.
Previous attempts to solve the dilemma have revolved around so-called client-side or device-side scanning. But in 2021 Apple Inc. delayed such a system, which would have searched photos on devices for signs of child sex abuse, after fierce criticism from privacy advocates, who feared it could set the stage for other forms of tracking.
Andy Yen, founder and CEO of privacy-focused VPN and messaging company Proton, said “As it stands, the bill still permits the imposition of a legally binding obligation to ban end-to-end encryption in the UK, undermining citizens’ fundamental rights to privacy, and leaves the government defining what is ‘technically feasible.’”
“For all the good intentions of today’s statement, without additional safeguards in the Online Safety Bill, all it takes is for a future government to change its mind and we’re right back where we started,” he said.
© 2023 Bloomberg LP