The FBI, Interpol and the UK’s National Crime Agency have accused Meta of making a “purposeful” decision to expand end-to-end encryption in a way that in effect “blindfolds” them to child sexual abuse.
The Virtual Global Taskforce, made up of 15 law enforcement agencies, issued a joint statement saying that plans by Meta, the parent of Facebook and Instagram, to expand the use of end-to-end encryption across its platforms were “a purposeful design choice that degrades safety systems”, including with regards to protecting children.
The law enforcement agencies also warned technology companies more broadly about the need to balance safeguarding children online with protecting users’ privacy.
“The VGT calls for all industry partners to fully appreciate the impact of implementing system design decisions that result in blindfolding themselves to CSA [child sexual abuse] occurring on their platforms or reduces their capacity to identify CSA and keep children safe,” the statement said.
“The abuse will not stop just because companies decide to stop looking.”
Meta-owned messaging app WhatsApp already offers end-to-end encryption by default. Meta also offers optional end-to-end encryption for messages and calls on photo-sharing app Instagram and in Facebook’s Messenger, and plans to make this the default globally.
“The overwhelming majority of Brits already rely on apps that use encryption. We don’t think people want us reading their private messages, so have developed safety measures that prevent, detect and allow us to take action against this heinous abuse, while maintaining online privacy and security,” a Meta spokesperson said.
“As we continue to roll out our end-to-end encryption plans, we remain committed to working with law enforcement and child safety experts to ensure that our platforms are safe for young people.”
The intervention comes at a politically charged moment as the UK government pushes through its new Online Safety Bill. Proponents argue the legislation is designed to make the internet safer, but Silicon Valley companies warn it could undermine users’ privacy.
End-to-end encryption provides robust security for messaging, ensuring that only the sender and intended recipient can read a message; not even the platform relaying it can decrypt the contents.
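The mutual-secret idea behind that guarantee can be sketched with a toy Diffie-Hellman key agreement. The parameters below are illustrative only, not a real deployment: production systems such as WhatsApp’s Signal protocol use elliptic-curve variants with many additional safeguards. The point the sketch shows is that the two endpoints derive an identical secret while the relay only ever sees public values.

```python
# Toy Diffie-Hellman key agreement, illustrating the core idea of
# end-to-end encryption: the two endpoints derive a shared secret,
# while the server relaying their messages sees only public values.
# Parameters are deliberately small/simple for demonstration; real
# protocols use elliptic curves and authenticated key exchanges.
import hashlib
import secrets

P = 2**127 - 1  # a Mersenne prime, fine for a toy demonstration
G = 3           # toy generator

def keypair():
    """Generate a private exponent and its public counterpart."""
    private = secrets.randbelow(P - 3) + 2
    public = pow(G, private, P)
    return private, public

# Alice and Bob each generate a keypair; only the public halves
# travel over the network (and through the relaying server).
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

# Each side combines its own private key with the other's public key.
alice_secret = pow(b_pub, a_priv, P)
bob_secret = pow(a_pub, b_priv, P)

assert alice_secret == bob_secret  # both sides hold the same secret

# Derive a symmetric message key from the shared secret; the server,
# which never saw either private exponent, cannot compute this.
key = hashlib.sha256(str(alice_secret).encode()).digest()
print("shared key derived:", key.hex()[:16], "...")
```

The server is the blind spot by construction: since `(G**a)**b` and `(G**b)**a` are equal modulo `P`, the endpoints agree on a key without ever transmitting it, which is why scanning message content centrally requires changing the design itself.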
The British legislation, which is making its way through the House of Lords, would enable media regulator Ofcom to require companies to scan messages with “approved technology” to identify child sexual abuse material in some instances.
Earlier on Tuesday, Meta-owned WhatsApp, Signal and other messaging services urged the UK government to rethink the legislation, arguing in an open letter that it could open the door to “routine, general and indiscriminate surveillance” of personal messages.
While the government has argued that there are technological ways to scan messages without undermining the privacy of end-to-end encryption, the companies insisted “this is not possible”.
WhatsApp head Will Cathcart last month suggested that the app would refuse to comply with the bill in its current form, risking the service being blocked. The UK government’s stance could embolden authoritarian regimes to make similar requests, leaving users vulnerable to surveillance, he said.
But Tom Tugendhat, UK minister for security, said Meta’s plans to roll out end-to-end encryption could deprive law enforcement authorities of their ability to identify and disrupt predatory behaviour.
“The UK is pro-privacy, pro-innovation and pro-security,” he said. “But with no indication from Meta that any new safety systems implemented following the rollout of end-to-end encryption will effectively match or improve their current detection, we are urging them to implement robust safety systems that maintain or protect child safety.”