
IICSA Independent Inquiry into Child Sexual Abuse

The Internet Investigation Report

G.1: Conclusions

1. The number of indecent images of children worldwide runs to many millions. The National Society for the Prevention of Cruelty to Children (NSPCC) has estimated that approximately half a million men in the UK may have viewed indecent images of children. In 2018, the Internet Watch Foundation (IWF) received nearly 230,000 reports of suspected online child sexual abuse. UK law enforcement records more than 10 grooming offences per day and arrests between 400 and 450 people per month for offences of online-facilitated child sexual abuse and exploitation.

2. The last five years have seen improvements in the response of law enforcement, industry and government to online-facilitated child sexual abuse. There have been many technological advances designed to prevent and detect online child sexual abuse, particularly in response to the volume of indecent images of children now available on the internet. More recently, attention has turned to the response to online grooming and live streaming.

3. Despite this, there has been an explosion in online-facilitated child sexual abuse. Law enforcement is struggling to keep pace.

4. There was no evidence to suggest that the number of offenders who use the internet to facilitate their abuse of children is diminishing. It is unclear whether the increase in reporting of online-facilitated child sexual abuse is indicative of an increase in offending or an increase in detection, or both.

5. It is difficult to assess the efficacy of the industry’s response to online-facilitated child sexual abuse if the companies do not know the scale of the problem on their platforms and services. The internet companies must do more to identify the true scale of the different types of offending. Such information should be publicly available.

6. It is also difficult to gauge whether the myriad responses across all sectors are adequate if offenders’ underlying motivations and drivers are unknown. We therefore welcome the Home Office’s decision to fund the Centre of Expertise on Child Sexual Abuse and its work on the reasons why perpetrators commit child sexual abuse.

7. Most online-facilitated child sexual abuse is committed on the open web and the vast majority of sites that host indecent images of children are available on the open web.[1] By contrast, the dark web can only be accessed by means of specialist software. The abuse found on the dark web is often of the most depraved and deviant kind. While it is not illegal to access the dark web, the dark web is also used by those who have a sexual interest in children, particularly by more sophisticated offenders.

Detection and prevention

8. Since the development of PhotoDNA technology in 2009 (and PhotoDNA for Video in 2018), the detection of known child sexual abuse imagery on the internet has improved greatly. As one witness said, PhotoDNA is the “industry standard”.[2] In addition, internet companies have developed their own technology – such as crawlers that identify large volumes of child sexual abuse imagery and software that can identify child nudity – to detect newly created or previously unseen indecent images.
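
By way of illustration only, the general detect-by-hash pattern behind such tools can be sketched as follows. PhotoDNA itself is a proprietary perceptual hash, designed to survive resizing and re-encoding; the cryptographic SHA-256 digest used in this sketch is a simplified stand-in, and the hash list shown is hypothetical.

```python
# Illustration only: matching files against a database of known hashes.
# SHA-256 is a simplified stand-in for PhotoDNA's perceptual hashing.
import hashlib

# Hypothetical digests of known material, as might be supplied by a body
# such as the IWF.
KNOWN_HASHES: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def file_digest(path: str) -> str:
    """Return the SHA-256 digest of a file, read in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

def is_known_image(path: str) -> bool:
    """True if the file's digest matches the known-hash database."""
    return file_digest(path) in KNOWN_HASHES
```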

9. Such developments are invaluable but preventing access to this imagery at the outset is what is required.

10. The National Crime Agency (NCA) has asked industry to pre-screen or pre-filter material before it is uploaded to their platforms and systems to prevent a user from gaining access to child sexual abuse images. While there may be challenges before pre-screening can be implemented, no industry witness said that such a step was technologically impossible. Any argument that pre-screening at the point of upload is unnecessary (given the speed with which known child sexual abuse material can be detected) misses the point. Industry has failed to do all it can to prevent access to such imagery.
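
A minimal sketch of what pre-screening at the point of upload could look like, assuming a hash-matching check of the kind shown above, follows. The names are illustrative; in a real deployment a match would also trigger a report to the relevant authority rather than a silent rejection.

```python
import hashlib

KNOWN_HASHES: set[str] = set()  # hypothetical known-image hash database

def handle_upload(data: bytes) -> bool:
    """Screen material at the point of upload; return True if accepted."""
    digest = hashlib.sha256(data).hexdigest()
    if digest in KNOWN_HASHES:
        # Blocked before storage: the file never becomes accessible to
        # users (a real system would also file a report at this point).
        return False
    return True  # only screened material reaches the platform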

11. Indecent images of children can be accessed all too easily. Every time a child sexual abuse image is viewed, the victim is re-victimised, and the offender is potentially drawn into a search for increasingly depraved material. The time has come for the government to stop access to indecent images of children by requiring industry to pre-screen material.

12. The UK government must also continue to prompt change, not just nationally but internationally. As a result of the IWF’s work, the UK hosts a tiny proportion of child sexual abuse material (0.04 percent). The work of the IWF in removing significant amounts of child sexual abuse material is a genuine success story. The response of some other countries seemingly lags behind. It is beyond the remit of this Inquiry to make recommendations to other countries, but it is clear that more needs to be done internationally to reduce the amount of child sexual abuse content available online. The government should do all it can through the WeProtect Global Alliance to help achieve this aim.

13. Encryption makes data unreadable to unauthorised parties and, in the case of end-to-end encrypted communications such as WhatsApp, iMessage and FaceTime, the content of the communication can only be seen by the sender and recipient. Many of the techniques used to detect online offending do not work where the communication is encrypted. One consequence of encryption, in particular end-to-end encryption of messages, is that it makes it harder for law enforcement to detect and investigate offending of this kind and is likely to result in child sexual abuse offences going undetected. Encryption therefore represents a significant challenge to the detection of and response to online-facilitated child sexual abuse.
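
The difficulty can be sketched in a few lines, using symmetric encryption from the third-party Python ‘cryptography’ package as a stand-in for a full end-to-end protocol: a relaying server that never holds the key sees only ciphertext, so hash comparison against known material no longer works.

```python
# Symmetric encryption as a stand-in for the end-to-end case: only sender
# and recipient hold the key; a relaying server sees nothing but ciphertext.
import hashlib
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # known to sender and recipient only
cipher = Fernet(key)

plaintext = b"message content"
ciphertext = cipher.encrypt(plaintext)

# The server can hash only the ciphertext, which bears no relation to the
# hash of the underlying content, so matching against known material fails.
assert hashlib.sha256(ciphertext).digest() != hashlib.sha256(plaintext).digest()

# Only a key holder can recover the content.
assert cipher.decrypt(ciphertext) == plaintext
```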

14. In late 2018, the Home Secretary convened a hackathon, where engineers from the leading internet companies developed a prototype that highlights conversations that might be indicative of grooming. That technology has now been launched. The progress made in the course of two days demonstrates what can be done when government, industry and law enforcement work together. This proactive approach is to be commended and, as the Online Harms White Paper itself acknowledges, “more of these innovative and collaborative efforts are needed”.[3]

15. While developments in technology play an important role in trying to detect such offending, they are not a substitute for the internet companies investing in live moderation. The internet companies need to ensure that there are sufficient numbers of human moderators with a specific focus on online child sexual abuse and exploitation. The value of human moderation is evident from the success achieved by the social network Yubo, whose moderators interrupt live streams to tell underage users to put their clothes on.

Age verification

16. The online abuse of children continues to grow. In the first three months of 2019, the IWF found that 81 percent of the self-generated imagery on which it took action showed children aged between 11 and 13, predominantly girls. NSPCC research in 2017/18 recorded that children aged 11 and under were victims in one-quarter of offences where a child had been sent a sexual communication.

17. The majority of children own a smartphone from around the time they start secondary school. Although internet companies either prohibit or discourage children under 13 years old from accessing their platforms or services,[4] the age verification process can often be easily subverted – simply by inputting a false date of birth.
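
The weakness is easy to see in outline. Below is a minimal sketch of a typical self-declared date-of-birth check; the threshold and function name are illustrative, and the point is simply that the service has no way to verify the date the user types in.

```python
from datetime import date

MIN_AGE = 13  # minimum age commonly required by major platforms

def is_old_enough(dob: date, today: date | None = None) -> bool:
    """Compute a user's age from a self-reported date of birth."""
    today = today or date.today()
    age = today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))
    return age >= MIN_AGE

# Nothing ties the declared date to the real user: an underage child who
# enters a false date of birth passes the check.
assert is_old_enough(date(2000, 1, 1))
```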

18. While some of the internet companies know how many users have failed the current age verification requirements and how many accounts have been terminated because the user is under 13 years old, such information is not contained within transparency reports and so the true scale of underage use is not public knowledge. Increased transparency about the extent and scale of underage use is required. Transparency reports are now commonplace but, in the absence of independent and consistent reporting standards, the reports only tell the public what the organisation wants and thinks the public should know.

19. Many social media platforms and online services have parental controls. Whilst these can be set so that parents can monitor who their children communicate with and how much time they spend online, the Inquiry heard no evidence of a comprehensive plan from industry and government to address the problem of underage use.

20. Children aged under 13 years old need additional protection. The industry must do more than rely on children to supply their true age when signing up to a platform. There must be better means of ensuring compliance with the current age restrictions.

Education and awareness

21. As the ‘Learning about online sexual harm’ research revealed, education about online safety at primary school is necessary. The Inquiry welcomes the Department for Education’s decision to make ‘Relationships Education’ in primary schools compulsory from September 2020. Coupled with the introduction of compulsory ‘Relationships and Sex Education’ in secondary schools, it is anticipated that these lessons will make children more aware of the ways the internet can be misused by those intent on sexually abusing children. Teaching children about the harm caused by the taking and sharing of self-generated imagery will help to raise awareness of how quickly a child can lose control over who has access to such material.

22. Educating children about the need to stay safe online is an important part of the response to tackling online-facilitated child sexual abuse and exploitation. We heard evidence from parents and children that even those parents who were regular users of social media did not necessarily understand the realities of children’s online lives. The ‘Learning about online sexual harm’ research highlights the need for teachers and parents to convey messages about staying safe online in a variety of ways. The introduction of the new compulsory ‘Relationships Education’ and ‘Relationships and Sex Education’ is an essential step in helping to prevent children from being harmed online.

Future reform

23. While we heard evidence of positive intentions by industry to tackle online-facilitated child sexual abuse and exploitation, there is no coherent long-term strategy for how this is to be achieved. Responses by industry were varied and sometimes appeared to be reactive rather than proactive. One of the factors that prompted some companies to take action seemed to be the reputational damage caused by adverse media reporting, rather than a desire to ensure that the protection of children is given high priority within their business models.

24. The children who participated in the ‘Learning about online sexual harm’ research identified five key areas which they thought would enhance their safety online:

  • warnings and advice about online harm given to users when they first set up a device or open a social media account;
  • improved enforcement of age restrictions when accessing social media accounts and other online content;
  • improved use of privacy settings and, in particular, the use of default privacy functions when setting up an account;
  • more obvious and accessible reporting options and stronger action taken when concerns are reported; and
  • greater moderation of online activity by apps and platforms.[5]

Industry, government and law enforcement should take note of what children have suggested and take steps to give effect to these suggestions.

25. Regulation of the internet industry is now required. No witness who gave evidence to the Inquiry has argued otherwise. The December 2019 Queen’s Speech included the government’s commitment to progressing the Online Harms Bill, a matter to which the Inquiry will return in its final report.

26. The Online Harms White Paper stated that an interim code of practice for child sexual abuse and exploitation would be published by the end of 2019. This did not happen. The interim code will require companies to take reasonable steps across a wide range of areas, all of which are designed to protect children from online-facilitated sexual harm. The code is therefore invaluable and should be published without further delay.

27. The volume of online child sexual abuse and exploitation offences undoubtedly “represents a broader societal failure to protect vulnerable children”.[6] Continued and increased collaboration across all three sectors, coupled with education of children about the need to stay safe online, is what is required to protect children.

References
