Encryption makes some charities’ work harder, but it also allows them to protect data. We take a look at the dilemma that encryption poses for charities.
Encryption is one of the most powerful tools that charities can use to ensure that data about service users, donors, and other constituents doesn’t fall into the hands of cyber criminals. But a coalition of high-profile charities including Barnardo’s, NSPCC, The Children’s Society, The Lucy Faithfull Foundation, and The Marie Collins Foundation is currently promoting a campaign called No Place to Hide.
The Home Office-backed campaign wants social media companies to suspend the introduction of end-to-end encryption on their messaging services “until they have the technology in place to ensure children will not be put at greater risk as a result”.
With end-to-end encryption (E2EE), messages are encrypted on the sender’s phone and remain encrypted until they arrive on the recipient’s phone. This means that the social media companies themselves, as well as the police and other agencies, are unable to see the contents of messages. That enables people to communicate securely and in privacy, but it also makes it harder to detect when child grooming is taking place, or when sexual images of children are being made, shared, or viewed.
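The principle can be illustrated with a short sketch. This is a deliberately simplified toy (a Diffie-Hellman-style key exchange over a small prime, with a hash-based stream cipher), not a real E2EE protocol such as the Signal protocol; the prime and cipher here are insecure and chosen only to show the idea that both phones derive the same key, while the relaying server sees only ciphertext:

```python
# Toy illustration of end-to-end encryption. INSECURE by design:
# real messaging apps use vetted protocols (e.g. the Signal protocol).
import hashlib
import secrets

P = (1 << 127) - 1  # toy prime for the demo; far too weak for real use
G = 5               # toy generator

def keypair():
    # Private key stays on the device; only the public key is sent out
    priv = secrets.randbelow(P - 2) + 2
    return priv, pow(G, priv, P)

def shared_key(priv, other_pub):
    # Both ends compute the same secret; it never crosses the wire
    secret = pow(other_pub, priv, P)
    return hashlib.sha256(str(secret).encode()).digest()

def xor_cipher(key, data):
    # Hash-based keystream XORed with the data (toy stream cipher);
    # applying it twice with the same key restores the original bytes
    stream = b""
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(4, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

# Sender and recipient each generate a keypair on their own phone
a_priv, a_pub = keypair()
b_priv, b_pub = keypair()

ciphertext = xor_cipher(shared_key(a_priv, b_pub), b"hello")
# The messaging server relays only `ciphertext` and cannot decrypt it
plaintext = xor_cipher(shared_key(b_priv, a_pub), ciphertext)
assert plaintext == b"hello"
```

Because decryption requires a private key that exists only on the two phones, neither the platform nor any third party sitting between them can recover the message contents.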
So is encryption to be encouraged, so that charities can protect their data, or is it something more sinister which makes life easier for those who seek to carry out child abuse and sexual exploitation?
Victoria Green, Chief Executive of The Marie Collins Foundation, believes that encryption has the potential for harm if its use is not controlled. “We are not against encryption, but we are against it without safeguards being put in place,” she told Charity Digital.
“Child sexual abusers do use social media platforms to contact children and share images, but without end-to-end encryption it can be detected…There are 14 million reports of (suspected) child sexual abuse online every year, and these will be lost if end-to-end encryption goes ahead without safeguards.”
Green accepts that people have a right to privacy, but she argues that unsafeguarded E2EE enables breaches of privacy, too: “Child victims need privacy. Images shared without consent are a breach of privacy.” That is why, she says, the No Place To Hide campaign is calling on social media companies to commit to finding a technological solution that allows child abuse to continue to be detected and reported before they implement E2EE on their messaging services.
In fact, billions of people around the world are already using messaging services that use E2EE, including WhatsApp and Apple’s iMessage. And last month Facebook made it possible for all Messenger users to turn on E2EE for chats (including group chats) and calls if they wish.
Does this mean that messaging services such as Messenger are now giving child abusers a free rein, safely hidden from the authorities by E2EE? Will all 14 million reports of suspected child sexual abuse really now be lost?
Not according to Antigone Davis, Global Head of Safety at Meta, the company that owns Facebook, WhatsApp, and Instagram. “We agree on the need for strong safety measures that work with encryption and are building these into our plans,” she told Charity Digital.
“We’re focused on preventing harm by banning suspicious profiles, restricting adults from messaging children they’re not connected with and defaulting under 18s to private or ‘friends only’ accounts. We’re also encouraging people to report harmful messages to us so we can see the contents, respond swiftly, and make referrals to the authorities.”
In effect the company is using metadata – data about data – and techniques used by intelligence agencies such as traffic analysis – looking at patterns of messages rather than the contents of messages – to detect signs of abuse. So, for example, if someone repeatedly sets up new profiles or messages a large number of people they don’t know, Meta can intervene to restrict or ban them.
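A metadata-only heuristic of this kind can be sketched in a few lines. The function name, the threshold, and the data shapes below are all hypothetical (Meta has not published its detection logic); the point is only that an account messaging many strangers can be flagged without reading a single message body:

```python
# Hypothetical metadata-based heuristic, for illustration only.
# It inspects who messages whom -- never the message contents.
from collections import defaultdict

UNCONNECTED_RECIPIENT_LIMIT = 3  # illustrative threshold, not Meta's

def flag_suspicious(events, connections, limit=UNCONNECTED_RECIPIENT_LIMIT):
    """events: (sender, recipient) pairs from message metadata.
    connections: mapping of each user to the set of users they know.
    Returns senders who messaged `limit` or more unconnected users."""
    unconnected = defaultdict(set)
    for sender, recipient in events:
        if recipient not in connections.get(sender, set()):
            unconnected[sender].add(recipient)
    return {s for s, seen in unconnected.items() if len(seen) >= limit}

connections = {"alice": {"bob"}}
events = [("alice", "carol"), ("alice", "dave"),
          ("alice", "erin"), ("alice", "bob")]
print(flag_suspicious(events, connections))  # -> {'alice'}
```

Messaging a friend (here, “bob”) is not counted; only the pattern of contacting strangers triggers the flag, which is the kind of signal that survives end-to-end encryption.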
The company also says that a Europol Digital Evidence Investigations Report found that of the 11 types of relevant information in a digital investigation, ‘content’ is ranked just seventh. Each of the other ten types of data is still available with end-to-end encryption, which the company says demonstrates that there are a significant number of data points, beyond the messages people send each other, available to law enforcement in identifying criminals.
An obvious question to ask, perhaps, is this: why not require all social media platforms that use E2EE to implement some sort of master key (sometimes known as a backdoor), which they could use to view the contents of encrypted messages on their networks to check for signs of illegal activity, or to give to the police to help them in their investigations?
The answer is that this approach is simply not practical, according to Jim Killock, executive director of Open Rights Group, a privacy, data, and digital rights campaigning organisation. “Any backdoor can be exploited by a third party. Nobody has been able to show how such a system could not be abused,” he explained. “If security is the goal, and we need that to stop criminals from exploiting us for everyday crime, then interfering with encryption is a simple no go.”
So, back to the original question: should charities be enthusiastic about encryption technology, because it can be used as a powerful tool to protect their data and thus the privacy of their constituents? Or should they be more wary of it, because it could be used by criminals to facilitate the type of behaviour that many charities are dedicated to protecting the vulnerable from?
Providing an answer is difficult. There is no doubt that encrypting stored data is a hugely important measure that charities should take to protect the private information they hold about their service users and donors. But whether the measures that companies like Meta say they can take to protect children from abuse despite E2EE are enough is not clear.
And even if these measures provide less protection for children than abandoning E2EE altogether, is that trade-off justified by the greater privacy and security E2EE offers people who may not be targets for groomers and abusers, but who may be vulnerable in other ways?
Ultimately these are important questions that can only be answered after considering informed and balanced arguments from child protection charities, law enforcement agencies, social media companies, and society as a whole.