WhatsApp, Signal, and other messaging services have penned an open letter to the UK government urging it to urgently reconsider the Online Safety Bill (OSB), legislation that would allow the regulator to require platforms to monitor users' messages in order to identify images of child abuse.
Under the bill, the government could force chat services to apply content moderation policies that could only be met by scanning private messages to see or hear what is being shared.
“Around the world, businesses, individuals and governments face ongoing threats from online fraud, scams and data theft,” the letter said. “Malicious actors and hostile states routinely challenge the security of our critical infrastructure. End-to-end encryption is one of the strongest lines of defense against these threats, and as vital institutions become more dependent on internet technologies to conduct core operations, the stakes have never been higher.”
As it stands, the letter argued, the bill could break end-to-end encryption and open the door to routine, general, and indiscriminate surveillance of the personal messages of friends, family members, co-workers, executives, journalists, human rights activists, and even politicians themselves, fundamentally undermining everyone's ability to communicate securely.
The draft law offers no explicit protections for encryption and, if implemented as written, could empower OFCOM to attempt to force the proactive scanning of private messages on end-to-end encrypted communication services, nullifying the purpose of end-to-end encryption and compromising the privacy of all users.
In short, the signatories said, the bill poses an unprecedented threat to the privacy and security of every UK citizen and the people they communicate with around the world, while emboldening hostile governments that may seek to draft copycat legislation.
The open letter is signed by Matthew Hodgson, CEO of Element; Alex Linton, Director of the Oxen Privacy Tech Foundation and Session; Signal President Meredith Whittaker; Martin Blatter, CEO of Threema; Ofir Eyal, CEO of Viber; Will Cathcart, Head of WhatsApp; and Alan Duric, Chief Technology Officer of Wire.
Last year, Apple abandoned similar controversial plans to detect known child sexual abuse material (CSAM) stored in iCloud Photos. Apple planned to flag iCloud accounts containing photos that matched the hashes of known CSAM images and report them to the National Center for Missing and Exploited Children (NCMEC), a nonprofit organization that works with US law enforcement agencies.
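For readers curious about the mechanics, the core of any such system is comparing an image's fingerprint against a database of known hashes. The sketch below is a simplified, hypothetical illustration of that idea in Python; it uses an ordinary SHA-256 hash rather than Apple's on-device NeuralHash perceptual hashing and matching protocol, and the hash value and directory name are placeholders rather than anything from a real system.

```python
import hashlib
from pathlib import Path

# Hypothetical set of known-image hashes (placeholder value, not real data).
# Apple's actual design used NeuralHash, a perceptual hash computed on-device;
# this sketch only illustrates the basic "match against a known-hash set" idea.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def sha256_of_file(path: Path) -> str:
    """Return the SHA-256 hex digest of a file's contents."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_matches(photo_dir: Path) -> list[Path]:
    """Return the photos whose hashes appear in the known-hash set."""
    return [
        photo
        for photo in photo_dir.glob("*.jpg")
        if sha256_of_file(photo) in KNOWN_HASHES
    ]

if __name__ == "__main__":
    # "photos" is a hypothetical directory name used purely for illustration.
    for match in find_matches(Path("photos")):
        print(f"Match found: {match}")
```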
The plans drew criticism from a variety of individuals and organizations, and Apple eventually dropped the proposal. “Children can be protected without companies combing through personal data,” Apple said at the time. “We will continue to work with governments, child advocates, and other companies to help protect young people, preserve their right to privacy, and make the internet a safer place for children and for us all.”
Under the bill, if a messaging service refuses to apply the required content moderation policies, it could face fines of up to 4 percent of its annual revenue. WhatsApp, Signal, and Proton have already said they would discontinue their encrypted services in the UK and withdraw from the market if the bill required them to scan user content.
The UK Government’s Online Safety Bill is expected to be brought back to Parliament this summer.
Note: Due to the political or social nature of the discussion on this topic, the discussion thread is located in our political news forum. All forum members and website visitors are welcome to read and follow the thread, but posting is limited to forum members with at least 100 posts.