IICSA published its final Report in October 2022. This website was last updated in January 2023.

IICSA Independent Inquiry into Child Sexual Abuse

The Internet Investigation Report

F.2: Online Harms White Paper

The proposals

4. The aim of the White Paper is to “tackle content or activity that harms individual users, particularly children”.[1] It outlines plans to “make companies take more responsibility for the safety of their users and tackle harm caused by content or activity on their services”.[2]

5. In support of this, the government proposes a new regulatory framework for online safety on the open web[3] with a statutory (or legal) duty of care.

6. The proposed duty of care will require companies[4] to “take reasonable steps to keep users safe, and prevent other persons coming to harm as a direct consequence of activity on their services”.[5] This will include the company preventing known child sexual abuse and exploitation content from being made available to users, taking action following a report of such content and supporting law enforcement investigations into criminal conduct.

7. Compliance with this duty of care will be overseen and enforced by an independent regulator.

8. To set out how companies can comply with the legal duty, the regulator will draft codes of practice. In relation to both child sexual abuse and exploitation and terrorism,[6] the government will have the power to direct the regulator in relation to the codes of practice and the codes must be approved by the Home Secretary. The regulator will not normally agree to companies adopting proposals which diverge from these codes.

9. In relation to child sexual abuse and exploitation, it is envisaged that the code of practice will include:[7]

  • the reasonable steps companies should take proactively to prevent known and new indecent images of children (and links to such material) being made available and to identify and act in respect of grooming and live streaming;
  • the reasonable steps companies should take to prevent searches linking to child sexual abuse and exploitation activity and content;
  • the reasonable steps companies should take to ensure services are ‘safer by design’ and to implement effective measures to identify which users are children and adopt enhanced safety measures for child users;
  • the reasonable steps companies should take to promptly inform law enforcement about a child sexual abuse and exploitation offence, including provision of sufficient information to enable victims and perpetrators to be identified;
  • the steps companies should take to ensure they continually review their efforts to tackle child sexual abuse and exploitation and remain ‘up to date’ with the scale and nature of the threat and adapt their procedures and technology in accordance with that threat; and
  • steps to ensure that users who are affected by child sexual abuse and exploitation are directed to and able to access support.

10. The White Paper stated that the government would publish interim codes of practice on child sexual abuse and exploitation by the end of 2019. This did not happen. In January 2020, the Home Office informed the Inquiry that the interim codes would be published “later this year”.[8]

11. The regulator will have enforcement powers. The potential powers include issuing an enforcement notice (requiring the company to respond to a breach of the code and provide an action plan to resolve the problem), imposing civil fines, and publishing public notices where a company fails to comply with the regulations or with the regulator.

12. The White Paper also asked consultees for their views on whether the enforcement powers should include the ability to block the companies’ platforms from being accessible in the UK and whether senior managers should be personally liable for a major breach of the statutory duty. As Mr Christian Papaleontiou (Head of the Home Office’s Tackling Exploitation and Abuse Unit) said, the power to block access to services or platforms is “very controversial”.[9] He said this would only be considered as a final step in the enforcement regime but “if there is to be a regulator, the regulator needs to have teeth. These are potentially big companies that it’s working with”.[10]

Responses

13. The National Crime Agency (NCA) considered that the White Paper was right to tackle “The piece in the middle, which is industry and the platform where all of the offending has taken place”.[11] The NCA wanted to see “a regime where it actually matters to industry”.[12]

14. Facebook said that they “welcome”[13] both input from the government in relation to online harm and the proposal for there to be codes of practice and a regulator. The Internet Watch Foundation (IWF) adopted a similar stance, adding:

We very much hope that the legislation will be flexible enough to allow growth within the internet and the changes within the internet, but also allow for different companies of different sizes to be able to engage with and take advantage of the technologies around.[14]

15. Apple said “we are generally in favour of additional regulation, but I think it depends on what that looks like, and the devil really is in the details”.[15] Microsoft considered that a regulatory framework was important to help “re-establish trust between the general public and technology … to give them the reassurance that they’re not just relying on technology companies to do what they say they’re going to do”.[16]

16. Google was still in the process of considering the White Paper and said it would be responding to the consultation.[17] BT was also in the process of formulating its response and considered the issue was “one of making sure … that there are clear legal frameworks that will enable us to enact perhaps further blocking or further content examination”.[18]

17. Chief Constable Simon Bailey, the National Police Chiefs’ Council (NPCC) Lead for Child Protection and Abuse Investigations, told us that the NPCC was still drafting its response. His personal view was that any legislation needed to be “extra-territorial”[19] given that so many of the technology companies were based outside the UK. He supported the need for sanctions to include the ability for internet service providers to block service for non-compliant companies and thought “a liability for executives is absolutely right”.[20] He added:

“the White Paper will only deliver something meaningful if the powers are given to a regulator, whereby the companies recognise that actually they have now got to do something over and above what they are currently doing”.[21]

18. The Inquiry also heard from Mr John Carr OBE, who has been working in and advising on online safety for over 20 years and is a former board member of the IWF.[22] He considered that the time had come for an end to self-regulation because, as he put it:

“everything seemed to take forever … unless there was a catastrophe, and then suddenly everything could happen very quickly, and there was no visible means of ever confirming that what the industry said they were doing they were actually doing”.[23]

19. The children spoken to as part of the ‘Learning about online sexual harm’ research “identified a clear role for the online industry to play in protecting children and young people from online sexual harm”.[24] As one 15-year-old interviewee put it:

“I think they [online companies] have a major responsibility, and they don’t do it, they don’t think about it at all. On Instagram, I’ve seen no posts about safety.”[25]

20. When the participants were asked about the steps that companies could take, five common actions were suggested:

  • embedded warnings and advice for users to read when signing up to an online platform;
  • improved enforcement of age restrictions;
  • improved privacy settings including the use of default privacy settings when setting up an account;
  • more obvious and accessible reporting options and stronger action when reports are made; and
  • enhanced moderation of online activity by apps and platforms.[26]

21. Industry, government and law enforcement should take note of these five key actions. Steps to give effect to them within the current institutional response and as part of the proposed online harms regulatory framework should be taken as soon as possible.

References
