
WhatsApp and Signal are among the messaging apps that use end-to-end encryption (Alamy)
Human rights activists have warned that their work in the UK could be under threat, as ministers press ahead with plans that could allow online platforms to scan private encrypted messages.
Regulators equipped with powers from the Online Safety Act could require platforms to use scanning technologies to analyse or intercept content that would otherwise be protected by encryption.
End-to-end encryption is a common security feature for emails, messaging services and internet banking, which ensures that only the sender and receiver can access the contents of messages and files. Even the service provider cannot read encrypted content.
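The principle can be illustrated in a few lines of code. The sketch below is purely demonstrative and assumes the open-source PyNaCl library; it is not how Signal or WhatsApp are implemented (both use the more elaborate Signal Protocol, with rotating keys and forward secrecy). The point it shows is the one above: private keys stay on the users' devices, so the service relaying the message only ever handles unreadable ciphertext.

```python
# Minimal sketch of public-key, end-to-end style encryption using PyNaCl
# (libsodium bindings). Illustrative only, not any real app's protocol.
from nacl.public import PrivateKey, Box

# Each party generates a key pair; the private keys never leave their devices.
sender_key = PrivateKey.generate()
receiver_key = PrivateKey.generate()

# The sender encrypts with their private key and the receiver's public key.
sending_box = Box(sender_key, receiver_key.public_key)
ciphertext = sending_box.encrypt(b"Meet at the usual place at 8pm.")

# In transit, the service provider sees only 'ciphertext'.
# Only the receiver, holding the matching private key, can decrypt it.
receiving_box = Box(receiver_key, sender_key.public_key)
plaintext = receiving_box.decrypt(ciphertext)
print(plaintext.decode())
```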
Prominent campaigners, including Russian human rights activist Olga Borisova, exiled Afghan journalist Zahra Joya and Uyghur activist Rahima Mamut, fear that weakening encryption could put the safety of activists at risk.
Borisova, who now lives in London after fleeing Russia, said weakening encryption would have immediate and serious consequences for activists working with people in authoritarian states.
“If this new regulation starts working, then all our work will be paralysed,” she told PoliticsHome.
“I speak a lot with people from Russia and Belarus… just the fact that these people could send some information abroad could be considered high treason, and there could be a criminal charge, and people could go to jail for it.
“Encryption helps us to learn about the human rights situation in those regions, and it’s crucial to our work. And in our case, encryption saves lives because Signal is used by people, for example, in Russia, contacting human rights defenders to help them leave the country, leave persecution, leave from being conscripted to the war. And it is crucial.”
Major tech firms, including Signal and WhatsApp, have threatened to leave the UK market or remove services rather than weaken end-to-end encryption to comply with the Online Safety Act.
“Human rights defenders based here in the UK will lose one of the few secure ways they have to communicate with people living under authoritarian surveillance,” Borisova said.
“The UK is home to many exile activists and journalists like me, and if secure tools disappear here, the UK becomes a less safe place to do human rights work, not by intention, but just by technical design.”
While serious crimes, including child sexual exploitation, do take place in private, encrypted messaging spaces, Borisova said she believed other methods, including targeted investigations, intelligence-led operations and lawful hacking, would be more effective at tackling them than “blanket access to everyone’s private communications”.
“I don’t think the UK government is following an authoritarian tendency, I just think it’s a lazy solution,” she said.
Conservative MP John Whittingdale, who chairs the All-Party Parliamentary Group on media freedom, told PoliticsHome that weakened encryption in the UK could be exploited by hostile states. He said the push to access encrypted material was being driven primarily by the Home Office as part of efforts to crack down on serious crime.
“Whilst we have assurances that it would only be used in extreme circumstances to detect terrorists, people distributing child pornography, or the worst kind of crimes, the problem is that as soon as you create the means by which that can be done, you weaken the security of the communication system,” he said.
“The ability to hack into it or to somehow break through the encryption is not restricted to UK law enforcement.”
Elements of the Online Safety Act have already been coming into force over the past year, including duties on platforms to tackle illegal content such as fraud, terrorism and child sexual exploitation.
Groups including Open Rights Group, Big Brother Watch, the Electronic Frontier Foundation and Index on Censorship have been collaborating to highlight the risks of weakening encryption – including hosting an event in Parliament earlier this month, which was attended by cross-party MPs.
Ofcom is currently developing draft guidance on message-scanning requirements and the “minimum standards of accuracy” for such technology, which it will present to the Secretary of State, Liz Kendall, in April.
However, Ofcom’s CEO told the Lords Communications and Digital Select Committee in October that it is not clear whether technology to scan messages without infringing on freedom of expression will exist, and conceded that these powers remain speculative at this point.
A government spokesperson said: “When carrying out its regulatory responsibilities, Ofcom must take account of users’ rights to privacy and freedom of expression. Services must also consider this when implementing safety measures.
“Ofcom requires services to take steps which are both proportionate and technically feasible to tackle child abuse material on their service. It is only right that we raise the expectations on online services to deal with this horrific illegal content.”
Ofcom’s rules do not require private messages, whether encrypted or not, to be automatically scanned. But in serious cases where it is “necessary and proportionate”, the Online Safety Act gives Ofcom the power to tell a company to use or build specific technology to deal with that content.
Before those powers can be used, the government would need to approve and publish minimum standards of accuracy, following advice from Ofcom, and any technology would need to be formally accredited.



