Courtney Jones

Sexual Assault in the Metaverse

Introduction


In June 2022, Refinery29 published an article about a woman who had experienced a sexual assault in the metaverse and the trauma this subsequently caused her.[1] In fact, it was not her physical body that had been assaulted; the abuse had been inflicted on her avatar, which had been designed to resemble her.


Instances such as these highlight a very complex issue that the law will have to confront in an increasingly digitised world – how do we, or rather can we, prosecute assaults that occur in the realm of virtual reality? The answer is not entirely clear, and may depend on the courts interpreting existing sexual assault laws to include offences occurring in virtual reality and the metaverse.


This leaves us with a broader, philosophical question – is an avatar, created as a representation of ourselves for the world of virtual reality, an extension of our corporeal bodies? An article by Škola and Liarokapis provides psychological evidence as to why this perhaps ought to be the predominant view.[2] They note that ‘Leveraging extension of the ownership, agency and self-location towards a non-body object… has already been proven to help in producing stronger EEG correlates of [motor imagery]’.[3] Additionally, Graber and Graber argue that avatars created for virtual reality ought to have rights ‘analogous to those of a biological body’.[4] If so, allowing the prosecution of sexual assaults and rapes committed against an avatar is a logical extension of this view.


Extending Protections to Virtual Bodies


It appears that crimes can be committed virtually. Graber and Graber note that as far back as 2003, South Korean police received 22,000 reports of crimes committed by online game characters.[5] More recently, the United States Department of Justice announced the arrest of two individuals for conspiracy to launder $4.5 billion in stolen cryptocurrency.[6] The courts have therefore been prepared to extend the law of theft beyond physical property to include virtual property. The next logical step would be to extend assault and sexual abuse laws beyond physical bodies to include virtual avatars, especially given the very real trauma that such abuse can cause the victim. It must be noted, however, that in the case of the game characters committing theft in South Korea, the characters were not committing crimes of their own accord; rather, a legal person was using the virtual characters as a tool to commit a crime. Additionally, given that many ‘gamers’ spend real-world money on these games, the loss may not have been entirely virtual.


One of the core arguments made by Graber and Graber for granting virtual avatars rights similar to those of a biological body is that ‘a physical body is not necessary for legal protection and having rights as a person’; they highlight laws against the emotional abuse of children as an example.[7] This is logical, given that non-physical abuse can have real consequences for an individual’s psyche and mental health, leaving the victim with very real trauma and the potential for PTSD. However, since the brain is part of the physical body and may be affected by emotional abuse, one could argue that a physical body is indeed necessary. Wolfendale argues that ‘victims of virtual harm can be extremely upset by the experiences’ and provides anecdotal evidence of a woman who read extremely graphic depictions of sexual acts committed by her avatar and found herself experiencing post-traumatic stress.[8] The idea that a corporeal person can be affected in the real world by something done to their digital avatar is interesting. It raises the question of whether there is a difference between the impact of viewing such an assault against an avatar created in one’s likeness, or for the purposes of self-representation, and the impact of viewing such a digital assault more generally. Given that many avatars are created to resemble the human being they represent, an assault against such an avatar would likely cause more harm than an assault against a generic avatar, though evidence is required to support this view.


Is Sexual Assault Against an Avatar Possible?


If we accept the argument that a person does not need a physical body to be granted rights under the law, then it follows that a sexual offence can be committed without the involvement of a physical body. Considering that, for example, harassment in the UK takes place where a person ‘engages in unwanted conduct of a sexual nature’,[9] the possibility remains that such unwanted sexual conduct could be committed against a digital representation of a person rather than a physical person. If the law is to treat a virtual reality avatar as an extension of the person, the extent to which this can apply must be assessed.


It is argued here that protection should extend only to avatars intended to be an accurate representation of the physical person, or to situations where a physical person, by virtue of using virtual reality and/or metaverse technologies, undergoes a first-person experience of the assault. So, for example, if someone creates an avatar that does not share their physical attributes but experiences the sexual assault from their own point of view within the virtual reality setting, justice would still be available. Likewise, someone wearing a virtual reality headset who is approached by another VR user and subjected to unwanted sexual conduct, experiencing that conduct from a first-person perspective, would be eligible to bring a complaint before the courts.


There is currently very little academic commentary on offences taking place in the metaverse, perhaps because the metaverse is not yet widely used. However, given that the law is often slow to respond to new technologies, potential occurrences such as this ought to be considered before those technologies are widely implemented.


In response to Nina Patel’s experience of sexual assault in the metaverse, whereby she experienced verbal and sexual harassment within a minute of joining,[10] Meta (previously Facebook) created a virtual bubble around a person’s avatar to prevent future instances occurring in their virtual environment.[11] Although this is a step in the right direction, it clearly highlights how women’s concerns are often forgotten in the creation and development of new and innovative technologies and arguably does not go far enough to prevent forms of virtual sexual abuse that are not ‘physical’. Specifically, the creation of a bubble does not prevent the kinds of sexual abuse that are so prevalent on the internet, particularly verbal sexual abuse and deepfake pornography (the superimposition of one person’s face onto the body of a porn actor, often without their consent)[12].


Penalties for Virtual Sexual Abuse


Having established that sexual assault and abuse can and does take place in the metaverse, and that individuals should be afforded protections for their virtual reality avatars similar to those afforded to biological bodies, one must ask whether virtual sexual abuse is as severe as abuse committed against a physical, corporeal body, and therefore merits as strong a punishment. Arguably, this ought to be left to the discretion of the sentencing judge, who would be able to consider any victim impact statements and all the evidence of the effects of the sexual abuse on the victim. However, it is argued that the physical sexual assault and rape of a corporeal body should merit the strongest punishments, given the potential for real physical harm to the victim. That is not to say that sexual abuse in the virtual world should not be taken seriously; it certainly should. However, a distinction should be drawn between the two, particularly given the physical injuries and illnesses that can arise as a result of rape.


Liability of Metaverse Platforms


Of course, the individual using their avatar to sexually abuse another person in the virtual reality world should be held accountable for their actions, whether criminally or through the civil courts. The question remains whether companies creating and implementing the metaverse should be held liable if it appears that they are not doing enough to prevent such abuse. Like Meta’s ‘avatar bubble’ described above, measures implemented to protect users from potential harm are often reactionary rather than precautionary, meaning nothing is done until some harm occurs on the platform. Given that tort law often allows someone to be held liable for harm that was reasonably foreseeable, and given the expertise of the platforms, it follows that those behind the metaverse and these virtual reality tools should be held responsible when they are negligent in implementing protective measures against the potential harms their users may face.


Regulation and Banishment


As artificial intelligence (AI) becomes more capable and our lives become increasingly digitised, regulating this new digital environment to prevent sexual offences is key, and those committing multiple violations ought to be banned from the platforms and networks. However, focusing on individual users who violate the rules or exhibit abusive behaviour is not enough. Given the widespread use of virtual technologies and their increasing presence in our lives, governments ought to consider the potential harms of these technologies and produce regulations and guidelines for the companies that make them. In creating such regulations, women’s voices ought to be front and centre. For example, governments can require that the creators of these technologies implement simple complaints systems for users who are victimised on the platforms. They can also require that companies offer sex-segregated areas within these digital spaces to ensure that women have a place online free from abuse. While these remain mere options, there are countless opportunities for platforms to protect women from abuse when their voices are centred.


Conclusion


To conclude, sexual abuse within the metaverse and virtual reality is a real problem with real consequences. Both the criminal law and civil law should offer recourse to victims who experience abuse within virtual reality. However, preventative measures are also necessary to ensure that women using these new technologies do not experience abuse while visiting the platforms. A feminist approach to the regulation of technology is the most appropriate way forward to prevent the future victimisation of women.

[1] Katherine Singh, ‘There’s Not Much We Can Legally Do About Sexual Assault In The Metaverse’ (Refinery29, 9 June 2022) <https://www.refinery29.com/en-us/2022/06/11004248/is-metaverse-sexual-assault-illegal> accessed 24 October 2022.

[2] Filip Škola and Fotis Liarokapis, ‘Embodied VR environment facilitates motor imagery brain-computer interface training’ (2018) 75 Computers & Graphics 59 <https://www-sciencedirect-com.uoelibrary.idm.oclc.org/science/article/pii/S009784931830089X?via%3Dihub> accessed 24 October 2022.

[3] Ibid.

[4] Mark Alan Graber and Abraham David Graber, ‘Get Your Paws Off of My Pixels: Personal Identity and Avatars as Self’ (2010) 12(3) Journal of Medical Internet Research 3.

[5] Ibid.

[6] Department of Justice, ‘Two Arrested for Alleged Conspiracy to Launder $4.5 Billion in Stolen Cryptocurrency’ (2022) <https://www.justice.gov/opa/pr/two-arrested-alleged-conspiracy-launder-45-billion-stolen-cryptocurrency> accessed 24 October 2022.

[7] Graber and Graber (n 4).

[8] Jessica Wolfendale, ‘My avatar, my self: Virtual harm and attachment’ (2007) 9 Ethics and Information Technology 111.

[9] Equality Act 2010, s 26(2).

[10] Nina Jane Patel, ‘Reality or Fiction?’ (Medium, 21 December 2021) <https://medium.com/kabuni/fiction-vs-non-fiction-98aa0098f3b0>.

[11] Magda Zima, ‘The Metaverse: virtual offences, real world penalties?’ (Kingsley Napley, 9 June 2022) <https://www.kingsleynapley.co.uk/insights/blogs/criminal-law-blog/the-metaverse-virtual-offences-real-world-penalties> accessed 24 October 2022.

[12] Anne Pechenik Giseke, ‘“The New Weapon of Choice”: Law’s Current Inability to Properly Address Deepfake Pornography’ (2020) 73 Vand L Rev 1479 <https://heinonline-org.uoelibrary.idm.oclc.org/HOL/Page?handle=hein.journals/vanlr73&div=39> accessed 15 December 2022.
