Self-generated sexual content involving young people demands understanding and nuance


For frontline professionals, responding to self-generated sexual content involving children can raise many complexities and questions. What might reflect changing sexual behaviours and expectations among adolescents today, and what should be treated as problematic or even criminal? Researchers Dr Mark Kavenagh from Evident and Julia Durska consider these questions and present some answers for practitioners, caregivers and policymakers.

Technology is increasingly woven into the lives of young people, and most online engagements bring benefits like social connectedness, access to information and opportunities for learning. However – just as the real world has risks – young people’s online engagements do too.

It is from within this context that practitioners need to build an understanding of the ways that young people are self-generating and sharing sexual content online using a range of technologies, such as social media, chat apps and live-streaming services. Research shows that sharing sexual content by ‘sexting’ or ‘sending nudes’ is increasingly common. In a U.S. sample, 40% of respondents aged 9-17 agreed that it was ‘normal’ for people their age to share nudes with each other. Similar trends have been observed in lower- and middle-income countries, with a recent study reporting that 8% of internet-using 12-17-year-olds in Thailand said they had taken naked images or videos of themselves in the year before being surveyed.

Police are increasingly seeing self-generated sexual content involving young people being shared online. It is important to recognise that some of this content may be the result of coercion, including through extortion, though little is yet known about how many scenarios involve the non-consensual distribution of images. It also seems that content is being willingly shared (at least at first) between young people directly. As adolescents tentatively explore their own sexual development, the online environment can be an appealing place to test out behaviours with a distance and safety not possible in real life. These circumstances can be especially tempting for young people with diverse sexual and gender identities, who may use the anonymity of online interactions to seek out peers and explore their identity in secret. The limited research that amplifies young people’s perspectives indicates that they see some positives: adolescents in a Swedish study expressed that sharing sexual content via technology could provide advantages in their relationships and/or increase their self-esteem. Yet the nature of technology also means such behaviours can open them up to a range of risks, as the content exists permanently and can easily find its way out of the control of those who created it.

Privacy and sharing

The digital age has brought shifting notions of privacy and sexuality among young people as they mature. Adolescent development now includes negotiating online contexts and learning to balance opportunities like self-expression and social connection against risks like privacy loss or abuse. Yet these decisions are increasingly complex. Commercially driven platforms are structured to reward increased user engagement, and certain behaviours are rewarded accordingly: ‘sexy’ content gets more views, likes and shares. Additionally, hard-to-regulate modes of engagement like live-streaming can grow follower networks rapidly. These mechanisms influence the decision-making of developing young people who are still learning about the world and its consequences.

Body image and the normalisation of sexual content

Idealised body images perform better in the algorithms that dictate what is presented to platform audiences. When these images are heavily promoted, they reinforce broader social narratives that prioritise particular ideals. When young people constantly see these images – and witness how they are rewarded by algorithms and audiences – it subtly reinforces the idea that self-worth derives from their bodies. Such messages in fact mirror some of the manipulations used by offenders to groom and abuse young people, serving to further enable their crimes. The normalisation of these narratives may also mean that young people don’t realise that certain behaviours are abusive or exploitative.

Generational and social tensions

Online engagements characterised by a globalised ‘internet culture’ are dominated by liberal values, which may come into conflict with more conservative local family and social contexts, or lead to intergenerational conflict. Young people often use their online lives to try out versions of themselves that they feel they cannot safely express in real life. The greater the distance between young people’s online and real-world personas, the higher the stakes if things go wrong. Seeking help might require that they disclose elements of their previously unspoken online lives to those around them. The consequences of sharing these online versions of themselves with people in real life can range from discomfort to outright victim-blaming for the harms they have experienced.

Solutions for practitioners and caregivers

Don’t focus solely on avoiding risks

A flaw in some preventative programming is its focus on ‘avoiding online risks’, which fails to account for the reality that young people’s lives today largely necessitate being online. Being online means being exposed to risks, so full avoidance is unrealistic. Adult support should instead focus on helping young people understand risks and prevent them from turning into harm. Spotting, understanding and mitigating risks before harm occurs is the key.

Don’t frame messages as ‘how to protect yourself’

Messaging that charges young people with protecting themselves places responsibility for offenders’ behaviour on young people. If young people are then targeted by offenders, such framing can have the contradictory effect of creating self-blame: “I deserved this because I did not do what I was told to protect myself.” It’s a subtle difference, but messaging should instead be framed as increasing understanding, helping young people spot concerns and seek help if something bothers them.

Understand young people’s lived realities

Young people don’t necessarily expect adults to be experts on social media and online platforms, but they do need adults to be open to talking and partnering with them in negotiating challenges. It is also important to reflect on the norms that may be influencing the interpretation of online behaviours. The world is changing fast and sometimes the norms influencing us are out of date.

Help young people understand the rules of social engagements (offline and on)

Adults have always helped young people to navigate social interactions, and the online world is simply another domain in which we need to play this role. Young people need age-appropriate education about how to manage social interactions both offline and online. This includes explicit teaching about setting healthy boundaries, information about sexual development, and support in negotiating complex situations. When young people don’t have support to understand these tasks, it creates openings for offenders to take advantage.

Don’t automatically restrict internet access

A large part of young people’s social lives takes place online. When adults respond to online challenges by restricting internet access, this can feel like a punishment for speaking up. Knowing that restriction might be an adult response can discourage young people from seeking help. Restricting internet access can also cut young people off from sources of support, such as caring peers.

While breaks from technology and limits on time spent online can be part of the solution, practitioners and other adults should not default to restricting internet access without considering the full range of impacts that may follow.
