The UK online safety bill’s progress through parliament has been delayed until the autumn. Photograph: PA

Huge rise in self-generated child sexual abuse content online, report finds


Disturbing global trend should be ‘entirely preventable’, says Internet Watch Foundation head

Incidents of children aged between seven and 10 being manipulated into recording abuse of themselves have surged by two-thirds over the past six months, according to a global report.

Almost 20,000 reports of self-generated child sexual abuse content were seen by the Internet Watch Foundation (IWF) in the first six months of this year, compared with just under 12,000 in the same period last year. The disturbing global trend has grown rapidly since the initial coronavirus lockdown, with cases involving that age group up 360% since the first half of 2020.

The IWF’s chief executive, Susie Hargreaves, said self-generated abuse should be “entirely preventable”, with prevention including the education of parents, carers and children about technology use and sexual abuse within the home.

“Child sexual abuse, which is facilitated and captured by technology using an internet connection, does not require the abuser to be physically present, and most often takes place when the child is in their bedroom – a supposedly ‘safe space’ in the family home. Therefore, it should be entirely preventable,” she said.

“Only when the education of parents, carers and children comes together with efforts by tech companies, the government, police and third sector, can we hope to stem the tide of this criminal imagery.”

The IWF operates a UK-based hotline and also reports on instances of child sexual abuse material (CSAM) around the world. While the fastest increase in self-generated imagery was among the seven to 10 age group, the 11 to 13 age group generates the largest amount of such images reported by the IWF, with 56,000 images flagged in the first six months of the year. There was also an increase of 137% in self-generated images of boys aged between seven and 13.

Self-generated child sexual abuse imagery is typically created using webcams or smartphones and then shared online on a growing number of platforms. The IWF says children are groomed, deceived or extorted into producing an image or video of themselves.

It said most examples occur in bedrooms, where toys, laundry baskets and wardrobes can be seen in the background. In one case, a child can be seen apparently reading instructions on a screen, while in another the edge of a blanket is visible, implying that the victim is ready to quickly shut down or hide what they have been asked to do.

Tamsin McNally, manager of IWF’s hotline, said a number of factors could be behind the growth of self-generated abuse images since 2020. “It might be due to lockdown and children being at home more and having access to the internet, or it could be that we are uncovering more cases because our techniques for finding this sort of content have improved,” she said.

McNally added that the setting of the images and videos was shocking. She said: “This is not some alleyway or dark basement. It is in family homes … sometimes you can hear their parents outside the rooms.”

The IWF also warned in its annual report this year that children as young as three to six were becoming victims of self-generated sexual abuse. Images are distributed through online forums, having been taken from image host sites. It said the five biggest sites used to store self-generated images of seven to 10-year-olds had not been used for that purpose before.


Hargreaves added that the UK online safety bill was essential for setting a regulatory example around the world. The bill, whose progress through parliament has been delayed until the autumn, requires tech firms to limit the spread of illegal content such as child sexual abuse images.

Companies will be required to report any child abuse material on their platforms to the National Crime Agency, if they do not have an arrangement in place with another body – such as the US National Center for Missing and Exploited Children. The communications regulator, Ofcom, will have the power to fine companies up to £18m or 10% of global turnover and, in extreme cases, block websites or apps.

  • The NSPCC offers support to children on 0800 1111, and adults concerned about a child on 0808 800 5000. The National Association for People Abused in Childhood (Napac) offers support for adult survivors on 0808 801 0331.
