Briefing on the future of digital tools to detect child sexual exploitation and abuse online in Europe


WeProtect Global Alliance – Briefing

January 15, 2021

What is the issue?

On 21 December 2020, the European Electronic Communications Code (EECC) came into force in the European Union (EU). The EECC expands the definition of ‘electronic communications services’ used in the 2002 Privacy and Electronic Communications Directive (ePrivacy Directive), bringing services such as private messaging and webmail within the Directive’s scope.

As a result, providers of electronic communications services operating in the EU must now comply with new confidentiality duties around communications and the processing of communications data. This has created significant ambiguity around the legality of the use of online detection tools that enable the identification and removal of suspected child sexual exploitation and abuse (CSEA).

These tools, such as Microsoft’s PhotoDNA and Google’s CSAI Match, help to detect both known and new child sexual abuse material (CSAM) and attempts to groom children, using image hashing, classifiers and anti-grooming applications. They generate the vast majority of global CSAM reports, and most of the material reported is hosted on European URLs (see previous WPGA briefing).
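To illustrate in general terms how hash-based detection of known images works, below is a minimal sketch in Python. PhotoDNA itself is a proprietary perceptual-hashing technology, so this sketch substitutes a simple cryptographic hash, a hypothetical known-hash list and a hypothetical file name; it shows the matching flow only, not any real tool’s implementation.

```python
import hashlib

# Hypothetical list of hashes of previously confirmed abuse images.
# Real deployments (e.g. PhotoDNA) use perceptual hashes maintained
# by organisations such as NCMEC, not plain SHA-256 digests.
KNOWN_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",
}

def hash_image(data: bytes) -> str:
    """Return a hex digest for an image. A cryptographic hash only
    matches byte-identical files; production tools use perceptual
    hashing so that resized or re-encoded copies still match."""
    return hashlib.sha256(data).hexdigest()

def is_known_match(data: bytes) -> bool:
    """Check an uploaded image against the known-hash list."""
    return hash_image(data) in KNOWN_HASHES

# Example: scan an upload and flag it for human review on a match.
with open("upload.jpg", "rb") as f:          # hypothetical file name
    if is_known_match(f.read()):
        print("Match against known-image list: queue for review/report")
```

New, previously unseen material cannot be found this way; that is where classifiers and anti-grooming applications come in, with correspondingly higher error rates.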

A Temporary Derogation, proposed by the European Commission in September 2020, was intended to ensure that the tools could continue to be used without interruption when the EECC came into force. However, agreement on amendments to the Temporary Derogation could not be reached, owing to concerns over privacy and the legal basis for the tools. Since 21 December 2020 there has therefore been increased legal uncertainty over the basis for using the tools in much of Europe (in the absence of any relevant national measures made under Article 15 of the ePrivacy Directive).

What is the current situation?

As things stand, despite the new rules, several companies have committed to the continued use of the tools to detect CSEA online, which is strongly welcomed. However, others have ceased to use them, leading to widespread concern about the increased risk to children and a reduced capacity for law enforcement to identify offenders.

For example, the Australian, Canadian, New Zealand, UK and US governments have released a joint statement urging European partners “to carry out their responsibility to protect not only European children, but also children around the world whose abuse will be shared among EU citizens that providers will be blind to stop”. In addition, Europol’s European Union Cybercrime Task Force (EUCTF) issued a statement in support of the original Temporary Derogation, warning that the loss of proactive detection of online abuse would result in a “significant reduction of investigations”.

There is hope that an interim solution can be found in January before further damage is done. For this to happen, the European Council, the European Commission and the European Parliament must come to an agreement at Trilogue on a final text. Trilogue negotiations to agree the final text of the Temporary Derogation of the ePrivacy Directive begin on 14 January 2021, with the final discussions due to take place on 26 January, so the timeframe for agreement and the window for influence are both short.

At the end of last year, the European Parliament’s Civil Liberties, Justice & Home Affairs (LIBE) Committee put forward a text amending the original Temporary Derogation. This caused significant concern among service providers and civil society, who felt it would impose disproportionate barriers to the voluntary detection of CSEA, leaving the Temporary Derogation unworkable and too difficult for companies to implement.

Key concerns with the European Parliament text

  • Data impact assessment requirement – introducing a condition that requires service providers to complete a Data Protection Impact Assessment before using the tools would force companies to suspend the tools while the assessment is carried out. If an assessment is required, it should be possible to complete it within a reasonable timeframe that does not interrupt the use of the tools.
  • Margins of error requirement – requiring a margin of error of 1 in 50 billion for a technology to be deployed would make many tools unusable. PhotoDNA achieves this margin of error because it matches hashes of known images and so has a very low error rate. Such a stringent threshold would de facto exclude the detection of new material, including self-generated images (see the illustrative calculation after this list).
  • Exceptions for teenagers and professionals – the proposals rest on assumptions about user attributes, such as age, that providers may not know, and establishing them could require further review of users’ accounts, which would itself raise privacy issues (for example, around teenagers’ right to explore their sexuality safely and consistently with the age of sexual consent in their country, Recital 4(a), Art 1, and the need to protect professional secrecy, Art 3.1.(1)(xiii)).
  • Requirements on transparency – if reporting requirements are too onerous, service providers, particularly smaller ones with fewer resources, may decide not to use the voluntary tools.
  • Requirement of human oversight – requiring human oversight and intervention for the processing of personal data, with no report made without human confirmation (Art. 3.a (viii)), could unnecessarily expose content reviewers to material already determined to be CSAM and make reporting at scale very difficult. The importance of human review in the CSAM identification process is not underestimated; however, content detected and removed by hash-matching technology has already been reviewed at least once by trained individuals, who can be spared from seeing the same images over and over precisely because of hash technology.
  • Requirement to report to law enforcement – Art. 3.1(ea) requires every case of reasoned and verified suspicion of online child sexual abuse to be “immediately reported to the competent national law enforcement authorities”. Not only is this difficult (establishing the geography of content is hard, and offenders and victims may be in multiple locations), it also risks creating confusion and duplication, because an effective system already exists: US-based service providers report to the National Center for Missing and Exploited Children (NCMEC), which forwards reports to national law enforcement agencies in the EU and globally.
  • Disclosure of information to the data subject – the requirement for providers to disclose information to the reported party once an investigation into that person has closed is problematic. Service providers cannot know when an investigation has concluded, and premature disclosure could jeopardise ongoing investigations (Art. 3.1.(xii)).
  • Concrete suspicion (in the original EC Temporary Derogation) – although this text appears only in the Recital, it is problematic. Recital 11 states: “[the technologies] should not be used for systematic filtering and scanning of communications containing text but only to look into specific communications in case of concrete elements of suspicion of child sexual abuse.” This wording would create legal uncertainty that may lead companies to stop using the tools. For the technology to work, service providers need to scan all communications to identify suspected abuse, just as anti-virus and spam filters do.
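To put the margins-of-error point above in perspective, here is a rough, illustrative calculation. The scanning volume and classifier error rate below are assumptions chosen for illustration, not figures from this briefing; only the 1-in-50-billion threshold comes from the Parliament text.

```python
# Expected false positives = items scanned x false-positive rate.
# The volume and the classifier rate are illustrative assumptions.
items_scanned = 10_000_000_000          # e.g. 10 billion images scanned in a year

hash_fpr = 1 / 50_000_000_000           # known-image hash matching (the proposed threshold)
classifier_fpr = 1 / 1_000_000          # a plausible rate for a new-material classifier

print(items_scanned * hash_fpr)         # 0.2 expected false positives
print(items_scanned * classifier_fpr)   # 10,000.0 expected false positives
```

Hash matching of known images can plausibly meet the threshold, but no classifier for previously unseen material comes close, which is why the requirement would de facto exclude the detection of new and self-generated imagery.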

Key calls to action

  1. The European Commission, European Parliament and the European Council should reach swift agreement to pass a Temporary Derogation that restores the status quo and clearly allows service providers (existing and new) to legally use available technology to detect CSEA online;
  2. The above elements of the European Parliament text, which risk making the existing tools unworkable or deterring service providers from using them, should be removed to ensure the tools can be deployed effectively and without interruption;
  3. A sustainable, proportionate, long-term solution and legal framework must be found that allows automated technology to be safely used to detect CSEA online.

Timeline

The following are the key dates regarding the Temporary Derogation to the ePrivacy Directive:

  • 14 January: Shadow meeting in the European Parliament: all the MEP shadow rapporteurs + MEP Sippel (Rapporteur)
  • 15 January: Technical Trilogue: meeting of the staffers (assistants + advisors) of all 3 EU Institutions (European Parliament, European Commission + EU Council)
  • 18 January: Shadow meeting in the European Parliament: all the MEP shadow rapporteurs + MEP Sippel (Rapporteur)
  • 20 January: Technical Trilogue: meeting of the staffers (assistants + advisors) of all 3 EU Institutions (European Parliament, European Commission + EU Council)
  • 26 January: Political Trilogue meeting: MEP Sippel on behalf of the European Parliament, Commissioner Ylva Johansson on behalf of the European Commission and the Portuguese Ambassador to the EU on behalf of the EU Council

In the longer term, it is hoped that the EU strategy for a more effective fight against child sexual abuse, published in July 2020, and the Digital Services Act, a draft of which was published on 15 December 2020, will create a more durable framework for the use of online tools to detect and remove harmful content.

About WePROTECT Global Alliance

WePROTECT Global Alliance brings together the people, knowledge and influence to transform the global response to Child Sexual Exploitation and Abuse Online. As of 15 January 2021, 98 countries are members, along with 44 private sector companies, 43 civil society organisations and 7 international organisations.