Child safety and end-to-end encryption: irreconcilable or a possible match?


Donagh O’Malley, Advisor at Cyacomb, reflects in this opinion piece on the relationship between child safety and end-to-end encryption.

Legislators in the UK, EU and elsewhere appear determined to press ahead with requirements that end-to-end encryption (E2EE) platform operators implement client-side scanning (‘CSS’) to detect and act on child sexual abuse material. CSS is a technical method which can be used to analyse content on a user’s device and detect potential matches against harmful content, including child sexual abuse material.
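To illustrate the mechanics only (this is not a description of any particular vendor’s system, including our own), a minimal hash-based sketch of on-device matching might look like the following. Everything here is assumed for illustration: the digest set, the use of exact SHA-256 hashes, and the function names.

```python
import hashlib

# Hypothetical on-device set of digests of known, verified harmful files.
# A real deployment would ship a compact, privacy-preserving structure
# (for example a Bloom filter) curated by a trusted authority.
KNOWN_DIGESTS = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def file_digest(path: str) -> str:
    """SHA-256 digest of a file, read in chunks."""
    sha = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            sha.update(chunk)
    return sha.hexdigest()

def matches_known_content(path: str) -> bool:
    """Runs on the user's device, before encryption, so the check
    itself does not touch the encrypted channel."""
    return file_digest(path) in KNOWN_DIGESTS
```

In practice, CSS systems typically rely on perceptual hashes, which tolerate resizing and re-encoding, rather than exact cryptographic digests; the sketch shows only where the check sits, on the device and before encryption.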

The environment in which these policy developments are taking place presents seemingly irreconcilable realities. On the one hand, the Internet Watch Foundation reports a doubling over the past two years of online commercialisation of the most serious category of child sexual abuse material. On the other, Signal, WhatsApp and other leading E2EE platform operators say they would rather close their services than compromise user privacy in any way.

Personal privacy and child protection are both important and urgent matters. We don’t believe platforms wish to become enablers of, or havens for, child sexual abuse material. Equally, we don’t believe they wish to break their privacy promise to the vast majority of their consumers, who use these platforms for benign reasons. We agree this is an extremely challenging policy problem to solve, and we don’t offer a policy solution, which ultimately will come from legislators and the lobbyists who try to influence them.

What is client-side scanning?

We are, however, curious to see whether there may be some technical solutions which could help advance the discussion, or at least identify some clear spaces within which consensus may emerge.

The primary objections to CSS appear to be fourfold:

1. breaking the encryption promise to users;

2. proportionality (why scan everything when only a tiny percentage is likely criminal?);

3. security (adding a door increases security risks); and

4. surveillance (the door may start out being open only for child sexual abuse material detection, but inevitably we’ll face pressure for the door to be used for other purposes).

We agree these are reasonable and valid concerns, and that policy needs to address them in a way which incentivises platforms to continue their already-substantial investments in online safety. However, we also believe there are technical solutions which can solve or alleviate these objections, or which may at least influence policymakers when it comes time to write the detailed regulations governing how CSS operates in practice.

Working together towards finding a solution

What if there were a client-side scanning solution that could completely protect user privacy and preserve user anonymity? What if CSS acted by blocking content or issuing a warning to a user, rather than breaking encryption or reporting them? What if CSS operated in a way which did not disclose blocking or warnings, even in aggregate, to platform operators? What if the results of child sexual abuse material matching were never known (or knowable) outside the user’s device? What if scanning were limited to users identified as high-risk through other means? What if scanning were informed by data, with users and files selected through integrity and assurance principles, similar to how scanning occurs currently on open platforms?
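To make the first few of those hypotheticals concrete, here is a minimal sketch, again under our own assumptions, of a flow in which the match result is acted on locally and never transmitted: the client blocks the send or warns the user, and no report, log or counter reaches the platform operator. All names are illustrative, and `is_match` stands in for the outcome of an on-device check like the one sketched earlier.

```python
from enum import Enum, auto
from typing import Callable

class Action(Enum):
    SEND = auto()
    WARN = auto()   # on-device warning; the user may still choose to send
    BLOCK = auto()  # refuse to send; nothing is reported to anyone

def decide(is_match: bool, warn_only: bool = False) -> Action:
    """Purely local decision: the match result never leaves the device."""
    if not is_match:
        return Action.SEND
    return Action.WARN if warn_only else Action.BLOCK

def handle_outgoing(path: str,
                    is_match: bool,
                    send_encrypted: Callable[[str], None]) -> None:
    """Act on the decision locally. Deliberately absent: any report,
    log, or aggregate counter visible to the platform operator."""
    action = decide(is_match)
    if action is Action.SEND:
        send_encrypted(path)  # normal E2EE send path, unchanged
    elif action is Action.WARN:
        print("On-device warning shown; no report sent.")   # local UI only
    else:
        print("Blocked on device; content was not sent.")   # local UI only
```

The point of the sketch is what it omits: neither branch calls back to a server, so blocking and warning remain invisible outside the device, which is exactly the property the questions above probe.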

It’s likely that answering all of these questions negatively, or all of them positively, would simply perpetuate the current stalemate between child safety proponents and online privacy advocates. If all of the above ideas were implemented, child safety work could be impeded, since it would be impossible to measure the scale at which child sexual abuse material is shared in encrypted messages, or to determine whether it is getting worse. If all of the above ideas were rejected, the platform operators’ concerns about CSS would be validated, and their position of record remains that they will close their businesses in markets where CSS is required. A compromise needs to be found in which technical solutions and public policy are balanced.

Cyacomb has developed technology, and a range of operational hypotheses, which we believe may offer technical solutions that bring some clarity to this debate, and we welcome the opportunity to explore these with platform operators and others who may be interested.