12. The digital technologies used to facilitate child sexual abuse are continually expanding. While computer-generated imagery (CGI) is not new, it enables individuals to create artificial child sexual abuse material.
13. ‘Deepfake’ technology is a form of CGI that uses artificial intelligence (AI) to manipulate a still or moving image, replacing one person’s likeness with another’s. It is easily accessible via mobile or computer-based apps and is a harmful, emerging form of image-based sexual abuse. In 2020, sexualised ‘deepfake’ images of more than 100,000 women and apparently underage girls were generated by users of an AI bot (an automated software programme) that ‘removed’ items of clothing from non-nude photos taken from social media. The messaging app Telegram was then used to distribute the images.[1]
14. Just as it is illegal to possess indecent photographs and videos of children, it is also illegal to possess non-photographic child sexual abuse images.[2] The legislation therefore prohibits possession of material such as comics, cartoons, drawings, manga (Japanese comics) and CGI that depict a defined set of sexual acts with a child.[3] The IWF operates a non-photographic imagery URL list, which helps companies block access to website addresses that contain this kind of material.[4]
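In practical terms, a URL list of this kind is applied as a filter at the point where a request is served. The following is a minimal sketch in Python of how a company might check requests against such a list; the list contents, the normalisation rules and the function names are illustrative assumptions, not the IWF’s actual implementation.

```python
from urllib.parse import urlsplit

# Hypothetical blocklist; in practice the IWF supplies member companies
# with a regularly updated list under strict conditions of use.
BLOCKED_URLS = {
    "example.invalid/banned-page",  # placeholder entry, not a real URL
}

def normalise(url: str) -> str:
    """Reduce a URL to a comparable host/path form (lower-cased host,
    no scheme, no trailing slash) so trivial variations still match."""
    parts = urlsplit(url if "//" in url else "//" + url)
    host = (parts.hostname or "").lower()
    path = parts.path.rstrip("/")
    return host + path

NORMALISED_BLOCKLIST = {normalise(u) for u in BLOCKED_URLS}

def is_blocked(url: str) -> bool:
    """Return True if the requested URL appears on the blocklist."""
    return normalise(url) in NORMALISED_BLOCKLIST

# A filtering proxy would refuse to serve any request for which
# is_blocked(requested_url) returns True.
```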
15. While the possession of non-photographic material does not cause direct physical harm to a child, it is a powerful indicator that the individual has a sexual interest in children and may pose a risk to them. Moreover, this material is still harmful because “it fuels very real fantasies, encourages the propensity of sexual predators, and contributes to maintaining a market for child sexual abuse material” and “it creates a culture of tolerance for the sexualisation of children and thereby cultivates demand”.[5] If CGI images become so lifelike that they cannot be distinguished from real images, it may become more difficult and time-consuming for law enforcement to identify genuine child victims and take steps to protect them.
16. However, not every country has legislated against the possession of non-photographic material.[6] The absence of this legislation is not just symbolic – it hampers law enforcement’s ability to take action against those who pose a risk to children, and also creates “a sense of impunity to further fuel offending, not least because offenders have been noted to purposely target children in jurisdictions with weak provisions”.[7]
17. Interactive games in which children communicate with other players online have long been used as a means of grooming children and exposing them to other forms of sexually exploitative activity. Virtual reality (VR) games, along with augmented reality (AR) games (which superimpose digital content such as images, sounds and text on a real-world environment), can be used by offenders to exploit children. For example, AR games may encourage a child to explore their local area, and some enable the child’s location to be tracked.[8]
18. More recently, a number of companies, including Roblox (a children’s online gaming platform that allows players to build and play their own games together) and Meta (formerly known as Facebook), have developed their own virtual worlds, collectively known as the metaverse. The metaverse is a VR space where users can interact with a computer-generated environment and with other users. It also has the potential to become a ‘safe space’ for sex offenders, who could use it as a tool for the sexual exploitation and abuse of children.
18.1. Its immersive nature means that it poses an enhanced threat compared with other digital landscapes. For example, unlike social media, where the ability to initiate an interaction is mostly limited to text-based messages and emojis, the metaverse enables the sensation of physical touch. The VR company Emerge, for example, recently launched a product that enables ‘bare-hands’ tactility.[9] This would allow sexual predators to feel and be felt by their victims without the need for physical contact.
18.2. Safety policies are extremely difficult to monitor and enforce in virtual spaces. Unlike mobile phones, computers or gaming consoles, VR headsets cannot be externally monitored by parents and they do not store records of users’ interactions.[10]
18.3. It is also an environment where sexual offenders are able to hide behind anonymous avatars, which are digital representations of themselves. Users’ identities are not verified and children can access adult-only features simply by ticking a box to declare that they meet the minimum age requirements.
18.4. Meta’s virtual world, introduced in 2019, enables users to create avatars and interact with others using VR headsets. Meta’s age policies have been described as “a paper tiger”, in that once a headset is linked to an adult’s account, it can be used freely by people of any age.[11] In spring 2022, Meta added parental controls to its headsets, enabling a parent or guardian to prevent access to specific games and apps.[12]
19. Recent news stories highlight the ease with which children, particularly those under the age of 13, are exposed to the risk of sexual harm.
19.1. The VR app VRChat allows people to socialise as 3D avatars in a variety of online spaces. The app has a minimum age rating of 13 years but can be downloaded without any age verification or identity checks. Research conducted by the Centre for Countering Digital Hate (CCDH) found that users of the app were exposed to abusive behaviour every seven minutes. This included children being exposed to graphic sexual content, sexual harassment and grooming.[13] For example, a young person’s avatar was followed by “two heavily breathing men” and another male joked in front of a child that he was a “convicted sex offender”.[14]
19.2. A BBC News researcher investigated VRChat while posing as a 13-year-old girl. She visited rooms, including pole-dancing and strip clubs, in which child users could interact freely with adults. The researcher witnessed sexual harassment, rape threats and a group of avatars simulating sex. She was also shown sex paraphernalia and was propositioned by several adult men. A safety campaigner told the BBC that he had spoken to children who had been groomed and forced to take part in virtual sex.[15]
19.3. In June 2018, a seven-year-old girl’s avatar was sexually assaulted in Roblox’s virtual playground.[16] In February 2022, it was reported that Roblox had hosted games featuring rooms where children could watch avatars involved in simulated sexual acts. While the article stated that Roblox “stressed that it was very unlikely a child would stumble across these rooms unless actively looking for them and that the games were usually only up online for as little as an hour before moderators discover and take them down”, Figure J.2 provides an example of the kind of game that children could access.
Figure J.2: An example of a screenshot from Roblox
Source: The Times, 16 February 2022, Nazi sex parties hosted on children’s game Roblox
19.4. In another case, a parent discovered that graphic messages had been sent to her young son, who was groomed into sending sexually explicit images of himself.[17] Players can earn ‘Robux’, a digital currency used to purchase in-game upgrades and accessories. There have been reports of children using their avatars to exchange lap dances for Robux in virtual strip clubs.[18]
19.5. In December 2021, during trials of the VR platform Horizon Worlds, several users alleged that they were sexually assaulted or harassed while using the system. One victim was “groped” by a stranger while others watched.[19] Another user reported that her avatar was “virtually gang-raped” by a group of male avatars.[20]
20. Mr Andy Burrows, head of online child safety at the National Society for the Prevention of Cruelty to Children (NSPCC), said of the metaverse:
“We are seeing products rolled out without any suggestion that safety has been considered”.[21]
21. The Home Office’s Interim Code of Practice on Online Child Sexual Exploitation and Abuse (Interim Code, see below) makes clear that VR child sexual abuse material (along with photographs, videos and other material) should be reported.[22] Security controls and safety measures are required, however, to prevent sexually abusive behaviour from taking place in VR. These need to be implemented before the metaverse becomes widely adopted and accessible.
22. Many technological advances also bring benefits. In particular, the ability of AI to process large volumes of data means that it can be employed to create a safer online environment for children.
22.1. AI can be used to help infiltrate offender networks. In 2013, a computer-generated image of a fictional child known as the ‘Sweetie’ avatar was used to identify more than 1,000 online predators in 71 countries.[23]
22.2. ‘Chat-bots’ can carry out text-message conversations with buyers and providers of sexual services, thereby obtaining additional information with which to prosecute perpetrators.
22.3. In February 2021, Government Communications Headquarters (GCHQ) published a report outlining how AI can be used to tackle issues including child sexual abuse and trafficking.[24] This includes training AI tools to identify potential grooming behaviour.[25] A simplified sketch of such a tool is set out below.
22.4. In February 2022, Meta added a ‘Personal Boundary’ feature to its VR experiences. This prevents avatars from coming within a set distance of each other, thereby making it easier to avoid unwanted contact.[26] The distance check underlying a feature of this kind is also sketched below.
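To illustrate paragraph 22.3, the sketch below shows one standard way a grooming-detection tool might be trained: a text-classification pipeline built with scikit-learn. Everything here is a hypothetical illustration rather than a description of any real system; in particular, the training data would have to come from a vetted, lawfully held corpus labelled by expert reviewers, and a real tool would be considerably more sophisticated.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline, make_pipeline

def train_grooming_classifier(texts: list[str], labels: list[int]) -> Pipeline:
    """Train a simple text classifier. `texts` and `labels` are assumed
    to come from a vetted, lawfully held corpus in which expert reviewers
    have marked messages forming part of grooming conversations (label 1)
    versus benign messages (label 0)."""
    model = make_pipeline(
        TfidfVectorizer(ngram_range=(1, 2), min_df=2),  # word and bigram features
        LogisticRegression(max_iter=1000),
    )
    model.fit(texts, labels)
    return model

def flag_for_review(model: Pipeline, message: str, threshold: float = 0.8) -> bool:
    """Escalate a message to a human moderator when the predicted
    probability of grooming behaviour exceeds a tuned threshold;
    the tool prioritises review, it does not decide outcomes."""
    prob_grooming = model.predict_proba([message])[0][1]
    return prob_grooming >= threshold
```

The design point is that a tool of this kind assists human moderators by prioritising conversations for review, rather than acting on its own judgement.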
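The geometry behind a feature like ‘Personal Boundary’ (paragraph 22.4) can be expressed very simply. The sketch below is a minimal illustration in Python, assuming a fixed boundary radius around each avatar; the radius value and all names are assumptions for illustration and do not reflect Meta’s implementation.

```python
import math

BOUNDARY_RADIUS = 0.6  # illustrative per-avatar radius in metres

def enforce_personal_boundary(pos_a, pos_b, radius=BOUNDARY_RADIUS):
    """Given the 3D positions of two avatars, return a corrected position
    for avatar B. If B is closer than the combined boundary radii, B is
    pushed back to the boundary edge along the line between the avatars;
    otherwise B's position is returned unchanged."""
    dx, dy, dz = (b - a for a, b in zip(pos_a, pos_b))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    min_dist = 2 * radius  # each avatar contributes its own radius
    if dist >= min_dist or dist == 0.0:  # far enough apart (or exactly overlapping)
        return tuple(pos_b)
    scale = min_dist / dist
    return tuple(a + d * scale for a, d in zip(pos_a, (dx, dy, dz)))
```

In practice a check of this kind would run continuously, and a real system might halt movement or reduce the other avatar’s visibility rather than reposition it abruptly, but the core safeguard is this distance test.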