IICSA published its final Report in October 2022. This website was last updated in January 2023.

IICSA Independent Inquiry into Child Sexual Abuse

The Internet Investigation Report

D.6: The interaction between law enforcement and industry

66. The law enforcement response to online grooming, and other forms of online-facilitated child sexual abuse, necessarily involves close and constant interaction with industry. Chief Constable Bailey, having consulted with a number of police forces, told us that “relationships between policing and some Industry platforms is good”. One industry platform was said to demonstrate “extremely good practice and support … by providing law enforcement with detailed information upon which to conduct a criminal investigation”. Another had a “very active group of moderators of its chat rooms”.[1] However, there were two particular issues on which there was a notable divergence of views between law enforcement and industry: encryption and access to data.

Encryption

67. Smartphones are not just telephones; they are also computers.[2] They enable communication between individuals but also store vast amounts of personal data, including work and social diaries, banking applications, photographs and videos of friends and family. In order to keep this information private, many of the technology companies use encryption. Encryption is the process of converting information or data into a code that makes it unreadable to unauthorised parties. Ms Melissa Polinsky, Director of the Global Security Investigations and Child Safety Team for Apple, said that Apple viewed encryption as “fundamental to the protection of our customers” from “bad actors, by hackers, by various governments around the world for different purposes”.[3]

68. The use of encryption is growing. The number of encrypted websites has increased substantially. According to Google, desktop users spend two-thirds of their time on such websites.[4] Facebook is considering applying end-to-end encryption to Facebook Messenger.[5]

69. Communications made via many common platforms – such as WhatsApp, iMessage and Facetime – are subject to end-to-end encryption. This means that the content of the communication can only be seen by the sender and recipient and not by third parties – including the providers of the platforms themselves.[6] In practice this means that if, as part of a criminal investigation, law enforcement needs to access the messages between two people, then the company running the messaging service would not be able to provide police with that information. The only way for law enforcement to ascertain what was being said in the messages would be to obtain the data from one of the devices (eg the telephone handset or computer) used or from an (un-encrypted) online backup.
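The consequence described in paragraph 69 can be illustrated with a deliberately simplified sketch (a toy one-time pad in place of the real protocols used by services such as WhatsApp or iMessage, which are far more sophisticated). The point it demonstrates is structural: the service provider relays only ciphertext and never holds the key, so it has nothing intelligible to hand over.

```python
# Toy illustration of end-to-end encryption. Real messaging services use
# protocols such as the Signal protocol, not a one-time pad; this sketch
# only shows WHY the relaying server cannot read or disclose the content.
import secrets

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR each byte with the key: unreadable without the same key.
    return bytes(k ^ p for k, p in zip(key, plaintext))

decrypt = encrypt  # XOR is its own inverse

# The sender and recipient share a key; the server never sees it.
message = b"meet at noon"
key = secrets.token_bytes(len(message))

ciphertext = encrypt(key, message)  # this is all the server ever relays
recovered = decrypt(key, ciphertext)  # only a key-holder can do this
```

Only the two endpoints, holding `key`, can recover `message`; a lawful request to the provider can yield `ciphertext` alone, which is why investigators must instead obtain the data from a device or an unencrypted backup.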

70. End-to-end encryption has significant implications for the law enforcement response to online grooming, the sharing of child abuse imagery and live streaming of abuse. The NCA acknowledged that encryption can be “a force for good” but said that if “applied without thought to platforms that could be used by this type of offender, then, quite frankly, the lights could go out for law enforcement”.[7] Many of the techniques used to detect online offending do not work where the communication in question is encrypted. For example, PhotoDNA cannot scan the content of WhatsApp messages (which are encrypted) to detect child abuse imagery.[8] BT’s Cleanfeed system, designed to prevent access to child abuse imagery, cannot operate over encrypted websites.[9] While Microsoft can monitor conversations over Xbox Live (which are not encrypted) for potential grooming,[10] Apple cannot monitor conversations over iMessage (which are encrypted)[11] or live streaming via Facetime.[12]
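Scanning tools of the kind mentioned above work by comparing content against lists of hashes of known material. A simplified analogue shows why encryption defeats them (PhotoDNA itself uses a proprietary perceptual hash that tolerates minor image edits; an ordinary cryptographic hash stands in here purely for illustration, and the file contents are placeholders):

```python
# Simplified analogue of hash-list scanning. PhotoDNA uses a proprietary
# perceptual hash; SHA-256 stands in here for illustration only.
import hashlib

# Hypothetical hash list of known prohibited files (placeholder content).
known_hashes = {hashlib.sha256(b"known-prohibited-file").hexdigest()}

def flagged(content: bytes) -> bool:
    # A scanner can only flag content whose hash appears on the list.
    return hashlib.sha256(content).hexdigest() in known_hashes

plaintext_copy = b"known-prohibited-file"
# Stand-in "encryption": any transformation of the bytes changes the hash.
encrypted_copy = bytes(b ^ 0x5A for b in plaintext_copy)

result_plain = flagged(plaintext_copy)      # detected
result_encrypted = flagged(encrypted_copy)  # evades the scan
```

Because the scanner on an end-to-end encrypted service only ever sees ciphertext, even a byte-for-byte copy of known material matches nothing on the hash list.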

71. Offenders are aware of, and take advantage of, the protection afforded by encryption. Mr Stower for the NSPCC gave evidence that groomers will often move children between platforms to “platforms which are smaller, that are more difficult for law enforcement to get into, particularly those that are encrypted”.[13] A large amount of child abuse imagery is stored in “encrypted archives on the open web, beyond the reach of scanning techniques, with the means to access such archives (the encryption keys) stored on the dark net”.[14]

72. The Home Office explained that the government’s goal was to secure “exceptional and targeted access to specific individuals’ communications”.[15] Mr Papaleontiou said that “Possible, platform-specific technical solutions exist, but these require working with individual service providers”.[16] Ms Polinsky said:

as much as I would love to have an exception that would only be an exception for child protection … the truth of the matter is that any exception to encryption is an exception for anyone and is something that can be exploited by anyone.[17]

73. On 4 October 2019, the Home Secretary and her counterparts in the US and Australia sent an open letter to Facebook asking it not to proceed with its plan to implement end-to-end encryption across all of its messaging services. The letter stated that the risks to public safety were:

exacerbated in the context of a single platform that would combine inaccessible messaging services with open profiles, providing unique routes for prospective offenders to identify and groom our children.[18]

74. When asked how the NCA envisage dealing with the consequences of end-to-end encryption, Mr Jones said that the technology companies should adopt a “range of mitigations”[19] and “use hash lists, use machine learning, use AI” on the un-encrypted areas to “make sure there are no child sexual abuse images in there”.[20] For example, if an offender downloaded a known child sexual abuse image from a website and sought to send it to a third party via a WhatsApp message, the WhatsApp message itself could not be pre-screened (because WhatsApp messages are encrypted), but if pre-screening had been deployed on the website the image would not have been available for download in the first place.

75. In closing submissions, a number of core participants challenged the growing use of end-to-end encryption and called for industry to fundamentally change its approach. The NCA said:

It is simply not good enough, therefore, for a company which chooses to operate an encrypted service to shrug its shoulders and say there is nothing it can do. The making of that choice generates a responsibility to mitigate its harmful effects.[21]

76. Submissions on behalf of IN-A1, IN-A2 and IN-A3 suggested that the technology companies’ insistence on “absolute privacy … is an excuse, a way in which platform providers, with a digital shrug, divest themselves of all responsibility”.[22] It was submitted that communications should be able to be accessed by the police.[23] In response to Apple’s evidence that an exception to encryption could not be created just for child protection,[24] counsel for the NCA asked rhetorically “how hard have you tried?”[25]

77. Encryption represents a serious challenge to the detection of, and response to, online grooming and other forms of online-facilitated child sexual abuse. The public should be under no illusion: a consequence of encryption, and in particular end-to-end encryption of messages, is that it will make it harder for law enforcement to detect and investigate offending of this kind and is likely to result in child sexual abuse offences going undetected.

Securing data

78. According to Chief Constable Bailey, the “main challenge encountered by police nationally is obtaining data from industry to support investigations into online-facilitated child sexual abuse”.[26] This is because data is typically stored by internet companies based overseas, and accessing content data typically requires the “extremely lengthy” and complex mutual legal assistance treaty (MLAT) process. This leads to “significant delays in gathering evidence to pursue offenders and protect children”.[27]

79. The NCA gave an example of an investigation which commenced in 2017. The suspect was alleged to have used Facebook, Instagram, Gmail and Snapchat to groom teenage boys into sending him indecent images and videos of themselves committing sexual acts.[28] Over 150 potential victims had been identified. The suspect was arrested in early 2018 but, as at March 2019, the NCA was still awaiting “authorisation from a US judge to release content to further the investigation towards a potential prosecution”.[29]

80. As Mr Milward said, the MLAT process is “not suitable for the digital age at all”.[30] As explained in Part B of this report, in October 2019 the Home Secretary signed a UK–US bilateral data access agreement allowing UK law enforcement to directly request communications service providers to produce communications data and content.[31] It is envisaged that the new agreement will mean that data can be accessed in weeks if not days.

81. Where law enforcement sought data (other than content data) or other types of assistance from industry, Chief Constable Bailey’s evidence was that the response was mixed.[32] There was “consensus” that, where life was at risk, industry responded well and that, in such cases, support from social media applications was “very good”.[33] Where there was no immediate risk to life – as in the vast majority of cases – there were examples of good practice (one industry platform responded within 48 hours) but the response was “generally slow”.[34] One platform was identified by two forces as having an “extremely burdensome and lengthy law enforcement request process”.[35] Forces also noted that there were disparities between platforms as to how they dealt with law enforcement requests, the threshold for when assistance would be provided, and the quality and duration of data retained.

82. It is clear that improvements can and should be made to the speed and quality of the response by industry to law enforcement requests for data. Greater collaboration between law enforcement and industry ought to be capable of resolving the problem of untimely provision of information. It may be that the government will want to consider whether, if the regulator envisaged in the Online Harms White Paper is established, there should be a protocol setting out time limits for industry to respond to law enforcement requests.
