

The Report of the Independent Inquiry into Child Sexual Abuse

Final report

J.4: Online-facilitated child sexual abuse

23. As set out in Part B, within the lifetime of this Inquiry, the scale of online-facilitated child sexual abuse has continued to escalate, year on year. In the UK, there has been a rapid increase in the amount of self-generated child sexual imagery, the age of children at risk of online harm has decreased and, worldwide, the number of referrals to law enforcement runs into the tens of millions.

24. Given the “exponential increase in reports of abuse” to the police, it is therefore unlikely that law enforcement could ever keep pace with the scale or rapid development of the threat.[1] This uncomfortable reality led a number of witnesses to state that they believed the police “can’t simply arrest our way out” of the scale of offending.[2] This is not a problem facing England and Wales alone. The WeProtect Global Alliance’s Global Threat Assessment 2021 indicated that the “sustained growth” in the scale of child sexual exploitation and abuse online is “outstripping our global capacity to respond”.[3]

25. The IWF reported that self-generated sexual imagery of children aged 7 to 10 has increased three-fold, making this the fastest-growing age group. In 2020, there were 8,000 instances; in 2021 there were 27,000, a 235 percent increase.[4] Children have expressed concern about repeat victimisation because self-generated sexual images may remain available on the internet.[5] In summer 2021, it was announced that the IWF had partnered with Childline to launch an online tool, Report Remove, which enables children and young people to have nude videos or images of themselves removed from the internet. The tool can be used by anyone under the age of 18 to report a nude photo or video of themselves. The IWF then reviews the report and has the content deleted if it breaks the law. Throughout the process, the young person can remain anonymous if they wish, and Childline ensures that they are safeguarded and supported.[6] Given the growth in self-generated imagery, Report Remove is likely to become an increasingly useful tool to help prevent children being harmed by the knowledge that an image of them is available online to be viewed and shared with others.

The dark web

26. There are growing concerns about the increase in offending via the dark web. Statistics from the NCA show that, in 2020, 2.88 million accounts were registered globally across the most harmful child sexual abuse dark web sites, with at least 5 percent believed to be registered in the UK.[7] A recent survey of just over 1,500 individuals who used the dark web found that 42 percent of respondents reported that they had sought direct contact with children through online platforms after viewing child sexual abuse material. While such contact may not necessarily lead to the sexual abuse of a child, 58 percent of respondents described feeling afraid that viewing such material “might lead to sexual acts with a child or adult”.[8] The results of the survey also supported the researchers’ hypothesis that respondents who reported using child sexual abuse material depicting infants and toddlers (aged 0 to 3 years) were the most likely to report contacting children online.[9]

27. The dark web provides a relative ‘safe haven’ for a significant number of the most depraved and committed perpetrators. The risk of harm to children caused by, and linked to, abusive material available on the dark web must not be ignored.
