
Sex in the metaverse: virtual body, real sexual assault

2 Jun 2022

Author: Diana Clement

It started with a blog, and then a story in USA Today.

A London-based technology researcher, Nina Jane Patel, was taking part in the beta testing (de-bugging) of a game being hosted in the metaverse (virtual world). Within 60 seconds of joining the game, Patel says her avatar – a virtual representation of herself – was set upon by three or four avatars with male voices who, she says, “essentially but virtually gang-raped my avatar”.

The alleged perpetrators then began messaging Patel, saying “don’t pretend you didn’t love it”, “don’t be stupid, it’s not real” and “don’t choose a female avatar – it’s as simple as that”.

For Patel, it was a horrible experience. “It happened so fast and before I could even think about putting my safety barrier in place,” she says in her blog. “I froze. It was surreal. It was a nightmare.

“Virtual reality has essentially been designed so the mind and body can’t differentiate virtual/digital experiences from real. In some capacity, my physiological and psychological response was as though it happened in reality.”

The game’s platform, Horizon Worlds, was developed by Meta (formerly Facebook), whose founder, Mark Zuckerberg, hasn’t been available to comment.

But was a crime actually committed? Or could Patel make a claim for sexual harassment?

Probably not, says Joseph Jones, the president of Bosco Legal Services, a California-based firm with lawyers and investigators who delve into online mischief, including cybercrime. But Jones acknowledges that harassment in the metaverse is an “emerging space”.

A claim would depend on several factors, he says, including the specific comments that had been made and whether Patel’s avatar revealed any identifying information such as her name. The male avatars could be anonymous and hard to track down and it would be difficult to find a law enforcement agency that was willing to help. Patel could, however, file a civil restraining order to stop it happening again.

Here in New Zealand, she would also have scant chance of redress under current law, says Arran Hunt, a partner at Stace Hammond and a member of the ADLS Technology and Law committee.

Cases of virtual sexual harassment have been arising for more than a decade, Hunt says, but our law is playing catch-up with virtual and augmented reality. Patel’s case was particularly interesting to the committee, he says, because it is “always looking for the next big issue that is coming through”.

It’s yet another example where the Harmful Digital Communications Act (HDCA) may not be fit for purpose. Nor are the Human Rights Act or the Crimes Act able to accommodate a prosecution where a gamer has suffered psychological harm.

Some people have difficulty getting their heads around how the attacks on avatars could be viewed as sexual assault when no bodies are touched. But if sexual harassment can be verbal, it can also be virtual, Hunt says.

People being harmed

Sean Lyons, online safety operations centre manager at Netsafe, says it’s important to remember that behind the technology are people harming other people, who are themselves also behind technology.

“[We] absolutely accept and understand that there will be safety issues for individuals in these online platforms,” Lyons says. “And those safety issues might well be very different because of the immersiveness of [the platforms].” That was the case with online worlds even before VR (virtual reality) headsets made them even more lifelike, he says.

“The degree of investment that people make in these spaces is definitely an exaggerating factor in terms of the harm that an individual feels.”

Academics have researched the lasting psychological impact of abuse in virtual worlds. Researchers from Johannes Gutenberg University Mainz in Germany noted: “VR can create a situation in which the user’s entire environment is determined by the creators of the virtual world, including ‘social hallucinations’ induced by advanced avatar technology.”

“It Feels Real” was the title of a study by researchers from the University of Oxford, who measured real physiological responses to a stressful virtual reality environment.

In their 2019 research, healthy men were studied to see what their physiological responses were to riding a VR elevator up the outside of a tall building. When compared to a control group travelling in an inside VR elevator, those in the external elevator had increases in skin conductance, pulse and subjective stress and anxiety ratings, altered heart rate variability and a delayed rise in cortisol.

Virtual grope

“If someone was seeing attempts to grope them, then why would it matter if it was actually a physical grope or a virtual grope?” says Hunt. “They’re still seeing it, they’re still possibly hearing it. They’re seeing the environment where it’s happening to them.”

It’s only going to get worse, he says. As VR and AR become ever more realistic and we pass the “uncanny valley”, the chance of suffering psychological harm increases. (The uncanny valley is the point at which the brain fails to notice imperfections and VR and AR become indistinguishable from reality.)

Lyons says it’s been clear since early social media days that orchestrated campaigns of harm against users could have a massive impact on their mental health. AR and VR would be no different.

“We see that reported to us,” he says. “Depending on the frequency, the age, how well supported they feel, what else is going on in their life… some people harm themselves. Some people harm others.

“We see cyclic behaviour of people that are harmed harming others to make it seem more normal. Think back to your classic bullying. People used to say, when I was a young kid in school, ‘hurt people hurt people’. Psychological harm is what’s at the basis of all of this.”

Netsafe isn’t receiving reports of this in relation to VR and AR games – yet. “Customers [ask] us about the potential risks and what should somebody do,” Lyons says. “Those discussions for us are quite academic at this point.”

Questions could also be raised around child sexual abuse material on VR and AR platforms, Hunt says. The point will come when legislators realise they need to act.

The issue in New Zealand is that when complaints do start rolling in about harm on VR and AR platforms, the law doesn’t protect potential victims.

The legislation

Under the HDCA, “harm” means “serious emotional distress”, which could occur in a virtual environment, Hunt says.

The Human Rights Act in its current form isn’t the answer, either. Hunt says s 62(2) covers both visual material of a sexual nature and physical behaviour of a sexual nature, but not behaviour that is experienced only visually. “It also needs to be repeated or significant, so here it would likely need to be repeated.”

Section 62(3) sets out how the Act applies and none of the areas listed clearly covers all possible virtual environments, Hunt says.

“Section 62(3)(k), participation in the exchange of ideas and information, might [apply] in some circumstances. But it could be hard to justify a virtual act of the type discussed as an exchange of ideas or information.”

The Crimes Act covers indecent acts under s 2(1B). Physical contact isn’t required, says Hunt, but an attempt or threat of applying force is, which is unlikely to occur in a virtual environment.

In theory, the HDCA offers protections. “Digital communication is fairly broad. It means any form of digital communication,” says Hunt. “But the act is problematic and in need of reform.”

Of the HDCA’s 10 communications principles, covering what digital communication should not do, three might be relevant to a virtual assault.

“Such behaviour could fall under principle 3, not be grossly offensive to a reasonable person, principle 4, not be indecent or obscene, but more likely under 5, that a digital communication should not be used to harass an individual. However, that then raises the question as to whether it is harassment as defined under law.”

In terms of the HDCA, he adds, there are also issues with the definition of “post”, which requires it to be either information about the victim, whether truthful or untruthful, or an intimate visual recording of an individual. Gestures in an online environment would not meet either requirement.

For the police to prosecute under the HDCA, the communication would need to have been posted with the intention to cause harm. “This is where the police interest would likely lapse. The Act isn’t about people being reckless as to whether harm is caused or not. It is about intending to cause harm.

“Netsafe can liaise with a website host, attempt to resolve complaints, pass it to the police or inform victims of their options if they wish to take civil action in the District Court. That’s where it goes nowhere fast. And very expensively.”

The difficulty is that a service provider such as Meta is usually offshore and the participants are likely to be using pseudonyms, Hunt says.

And being offshore, the service provider cannot be compelled to provide names and the perpetrators are often beyond the reach of the New Zealand courts.

Hunt doubts a victim could win in court even if they managed to identify the perpetrator. “If it was actual verbal or written statements being made, then fair enough. But if it came down to groping and virtual physical actions then, it’s not going to fall under the [Harmful Digital Communications] Act.”

Hunt argues that the HDCA needs to be rewritten. “It’s already horribly dysfunctional. There needs to be changes.”

Review needed

Lyons, too, says the law is an ongoing challenge as technology moves forward. One of the challenges for Netsafe is determining at which point the law needs to change to reflect what is happening to individuals.

It’s inevitable, he says, that as technologies develop, legislation such as the HDCA will need review. He points out that it was amended earlier this year, thanks to a private member’s bill: the Harmful Digital Communications (Unauthorised Posting of Intimate Visual Recording) Amendment Act 2022. The bill criminalised the posting of intimate visual recordings without consent.

Lyons says Netsafe supports any reviews of the legislation that would ensure victims who experience harm can seek redress without time delays. “We absolutely would support any attempt at legislation to make it work for those that [suffer] harm online.”

Hunt believes the most obvious way to handle the issue of virtual assault would be an amendment to the Crimes Act to expand indecent assault to virtual environments.

“It needs to be broadened… so that ‘physical’ can also include a virtual representation of a physical act.

“However, we would be sceptical of any government drafting such changes without either making it too narrow, making it too difficult to enforce or leaving it too open to the judiciary and the police who may not understand the inclusive nature of virtual environments and the impacts they can have and therefore not enforce it.” ■
