Modernised reporting systems

What is it?

  • A reporting system for online CSEA (also known as cybertip) promotes and enables the reporting of, and response to, images and videos of online CSEA, supporting an efficient criminal justice response. It also encourages the reporting of suspicions and other causes for concern about online CSEA (e.g. where there is no hard evidence).
  • Reporting mechanisms are set up so that users can report concerns. Once reports are received, a filtering system (often using hashing or AI) is applied to identify and remove abusive content; a minimal sketch of hash-based matching follows below.
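
To make the filtering step concrete, the sketch below shows hash-based matching in Python. It is illustrative only: KNOWN_ABUSE_HASHES is a hypothetical set of digests supplied by a trusted hotline, and real deployments generally rely on perceptual hashing (PhotoDNA-style) and vetted industry hash lists rather than plain SHA-256 comparison.

```python
# Minimal illustrative sketch of hash-based filtering of reported media.
# Assumption: KNOWN_ABUSE_HASHES is a hypothetical, externally supplied set
# of SHA-256 hex digests of known abusive material.
import hashlib

KNOWN_ABUSE_HASHES: set[str] = set()  # populated from a vetted hash list


def hash_file(path: str) -> str:
    """Return the SHA-256 hex digest of a reported media file."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def triage_media(path: str) -> str:
    """Route a reported file: known material is removed and referred on,
    anything unrecognised is queued for human review."""
    if hash_file(path) in KNOWN_ABUSE_HASHES:
        return "remove_and_refer"
    return "queue_for_human_review"
```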

Why is it important?

  • Highly visible, accessible and child-friendly reporting systems help to prevent and respond to online abuse. All individuals in society should be able to report any cause for concern, risk or disclosure of abuse that may harm themselves or others, both online and offline, with dedicated reporting categories for content suspected of being related to child sexual exploitation and abuse.
  • Modernised cybertip can ensure that systems are in place to respond immediately to individual or group cases or issues, and to stop further associated harm.
  • Modernised cybertip can increase awareness of online harms among children, general users and potential offenders. It can also increase understanding of the different forms of online abuse and harm, thereby increasing the number of reports being made.
  • Modernised cybertip enables organisations to log, analyse and respond to concerns of harm, and to specific areas of risk that their platforms or services cause or contribute to.
  • Having a modernised cybertip system in place demonstrates an organisation’s commitment to safeguarding children online and to preventing and responding to online harms.
  • Having a visible, accountable and effective reporting system with accompanying filters (modernised cybertip) in place can increase awareness of how suspicions and reports are handled, thereby increasing trust and confidence in the particular tools, software or platform, and potentially, over time, leading to a reduction in abusive use.

How can it be implemented?

  • Reporting mechanisms – or links to reporting mechanisms – that are visible, accessible (including to children with disabilities) and age-appropriate (targeted at the youngest potential user) should be put in place on every platform, tool and piece of software. They should be streamlined, offer reporting options that keep pace with changes in technology and in how people use the internet, and, where possible, include an option to remain anonymous.
  • Where possible and appropriate, the back-end architecture of the reporting mechanism and its filter systems should be aligned (at international, regional or country level as necessary) with the reporting mechanisms of other similar tools, platforms and software.
  • Aligning reporting mechanism and response procedure architecture internationally, regionally and/or per country allows for effective, coordinated response and criminal justice procedures. It can better enable trend monitoring and analysis, avoid duplication of effort, and avoid personally identifiable reports being sent unnecessarily to multiple organisations.
  • To encourage user reporting, age-appropriate education and empowerment are required on: (i) what exploitation and abuse are, (ii) the associated risks, (iii) how to report, and (iv) what happens when a report is made. (See capability 9).
  • Ensure appropriate acknowledgement of every report to foster trust in the system. Appropriate acknowledgement could range from a generic email to a personalised response (see the intake sketch after this list).
  • Where possible, all tools, services or platforms should install filters to prevent real-time activity that can lead to inadvertent or deliberate child sexual exploitation and abuse; preferably this should be done at the design stage. For example, filters can prevent users from visiting websites, including those hosted abroad, that contain images of children being sexually abused.
  • Where reports have been received and abusive content or risks of abuse have been verified, businesses should commit to redesigning and adapting tools, services or platforms to respond to the abuse and/or mitigate the risks identified, e.g. unenforced gaming terms or protections, immersive and addictive technology, incomplete filters, and inaccurate or inaccessible signposting.
  • An independent grievance redressal mechanism for victims/complainants is needed to address instances where CSAM is reported but not taken down despite court orders (per the Indian intermediary guidelines). Intermediaries need to be held accountable for their legal obligations, beyond the regularly published transparency reports.
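
As a complement to the points on acknowledgement and filtering above, the sketch below outlines a minimal report-intake flow. It is a sketch under stated assumptions: Report, URL_BLOCKLIST, acknowledge and triage are hypothetical names, the blocklist is supplied externally, and anonymous reports are supported but receive only a generic notice.

```python
# Minimal sketch of a report-intake flow for a hypothetical cybertip back end:
# every report is acknowledged, and any URLs it contains are checked against
# an externally supplied blocklist before the report is queued for analysts.
from dataclasses import dataclass, field
from datetime import datetime, timezone

URL_BLOCKLIST: set[str] = set()  # hypothetical list of known abusive URLs


@dataclass
class Report:
    category: str                         # e.g. "csea_content", "grooming", "other"
    description: str
    reporter_contact: str | None = None   # None when the reporter is anonymous
    urls: list[str] = field(default_factory=list)
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )


def acknowledge(report: Report) -> str:
    """Produce an acknowledgement; anonymous reports get a generic notice."""
    if report.reporter_contact is None:
        return "Your report has been received and will be reviewed."
    return (
        f"Thank you. Your report ({report.category}) was received at "
        f"{report.received_at:%Y-%m-%d %H:%M} UTC and is being reviewed."
    )


def triage(report: Report) -> str:
    """Route a report: blocklisted URLs are escalated, the rest reviewed."""
    if any(url in URL_BLOCKLIST for url in report.urls):
        return "escalate_to_law_enforcement"
    return "queue_for_analyst_review"
```

In practice, the acknowledgement channel and the escalation targets would follow the aligned back-end architecture described above rather than being hard-coded.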

Further resources:

The US Government, CyberTipline Modernization Act of 2018.