The scale and sophistication of technology-facilitated child sexual exploitation and abuse is outpacing global safeguards

The 2025 Global Threat Assessment details how the threat of online child sexual exploitation and abuse has accelerated and changed over the last two years, driven by the speed of evolving technology and compounding societal risks.

The report’s tailored recommendations for each sector and the Prevention Framework provide the tools and guidance needed to make the necessary shift to prevention-based approaches.


Rapidly evolving technological threats

Generative AI misuse

AI-generated child sexual abuse material, identified in our Global Threat Assessment 2023 as an emerging risk, has continued to grow at alarming speed.

Deepfake technologies, AI chatbots and generative models are being weaponised to exploit children and disseminate abuse material at scale.

This year’s report identifies emerging offender tactics, such as the use of custom AI models trained on real abuse material to create new AI-generated sexual abuse material, and the use of child-like chatbots to test online grooming strategies. AI-generated CSAM can also complicate detection and law enforcement processes, as it can be difficult to identify and trace using traditional systems such as image ‘hashing’.

Opportunities

Despite this worrying picture, AI technologies also offer opportunities to help tackle online sexual abuse. For example, AI can be used for automated detection to speed up processes, interrupt harmful or risky interactions before harm occurs, and reduce human exposure to traumatic content.

If technology can now create images and videos that never actually happened, how will we know what is real in the future, and how will that change the way we trust each other online?

15-year-old male, Ethiopia

The shift to encryption and virtual worlds

The Global Threat Assessment 2025 shows that the growing adoption of end-to-end encryption is one of the technological shifts reshaping children’s digital environments.

Encrypted digital spaces make it virtually impossible to detect grooming or child sexual abuse material and severely limit law enforcement’s ability to identify victims. Offenders are moving to encrypted platforms to avoid detection.

New and emerging technologies

The Global Threat Assessment 2025 highlights additional changes to the digital landscape with significant potential to affect children’s safety in the coming years, including quantum computing, decentralisation and extended reality (XR).

Early policy and safety by design considerations are critical to ensure these technologies are built with harm prevention and children’s safety in mind.

…with virtual reality, you’re going to have tactile touch and feel soon, and there’s going to be pads on bodies and that’s going to be a new way of perpetrators inflicting physical harm in the virtual space.

Survivor

Evolving societal dynamics and behavioural risks

Vulnerabilities

The report outlines that children who are marginalised – whether due to poverty, minority status, neglect, unstable living conditions or rural residence – are disproportionately at risk of technology-facilitated exploitation and abuse.

Prior exposure to violence, CSAM and violent pornography, and family dynamics that normalise controlling behaviours are also found to be additional risk factors.

Pre-pubescent girls remain the most frequently depicted victims in reported CSAM, though certain types of abuse such as financial sexual extortion disproportionately affect boys.

Financial sexual extortion

The report finds that financial sexual extortion, identified as an emerging issue in the Global Threat Assessment 2023, has grown into a major threat.

Financial sexual extortion (sometimes referred to as ‘sextortion’) is usually a financially motivated crime, with perpetrators most often targeting teenage boys and conducting operations internationally.

The Global Threat Assessment 2025 presents the latest global data and insights on the issue, including case studies and examples of recent campaigns and initiatives around the world to tackle this growing problem. 

The report also highlights cross-sector measures for tackling financial sexual extortion involving financial institutions, adapting surveillance tools and reforming bank secrecy laws.

Violence, extremism, and harmful sexual behaviours

The Global Threat Assessment 2025 also identifies a complex web of harms that increasingly overlap and intersect with the issue of child sexual exploitation and abuse.

These include the issues of self-harm, terrorist and violent extremist content and a growing concern around children who display harmful sexual behaviours and peer-to-peer abuse.

Since the last Global Threat Assessment, online groups promoting violence have proliferated: NCMEC reports of such groups rose by 200% from 2023 to 2024, to more than 1,300 in total. These groups encourage children to harm themselves or others, highlighting new intersections between sexual exploitation, online radicalisation, and offline harms.

There are also indications that harmful sexual behaviours among children are a growing issue, with evidence suggesting that peer-related exploration can sometimes escalate into more serious offending. Most interventions to tackle harmful sexual behaviours begin only after harm has already occurred.

Recommendations

The Global Threat Assessment 2025 shows that no sector can solve this problem alone. It outlines tailored recommendations for each sector and provides the Prevention Framework as a practical tool designed to guide action.

Cross-cutting recommendations for all stakeholders

Address technology-facilitated CSEA as an urgent public health priority and invest in prevention strategies, including those to prevent perpetration and reduce the stigma associated with help-seeking and disclosure. Recognise that children are at risk both of being harmed and of engaging in behaviours that cause harm to other children.

Generate and use evidence to inform prevention. Safely and ethically engage children and survivors to define the problem and identify barriers to the inclusion of marginalised populations.

Collaborate across sectors to coordinate prevention efforts and share lessons learned. Adopt harmonised terminology aligned with the Terminology Guidelines, standardise reporting metrics and systems, share timely data and evidence of what does and does not work, and establish sustainable systems.

For governments
  • Invest in prevention infrastructure and cross-agency coordination.
  • Strengthen and align legal frameworks and ensure enforcement capacity.
  • Support survivor services and embed youth participation in policymaking.
For technology companies
  • Apply safety-by-design across products and services.
  • Share data and collaborate on detection and prevention tools.
  • Conduct regular child impact assessments and transparency reporting.
For civil society organisations
  • Amplify survivor and youth voices.
  • Deliver frontline support and prevention education.
  • Advocate for inclusive, rights-based approaches.
For media and communications
  • Avoid sensationalism and victim-blaming.
  • Promote informed, solution-oriented coverage.
  • Support public understanding of online risks and protective strategies.
For international organisations and donors
  • Fund prevention initiatives and capacity-building.
  • Support global alignment of legislation, terminology and data-sharing.