The metaverse has a groping problem already

Katherine Cross, a researcher on online harassment at the University of Washington, says that because virtual reality is immersive and feels real, toxic behavior that occurs in that environment is real, too. “At the end of the day, the nature of virtual-reality spaces is such that they are designed to make the user believe they are physically in a certain space, that their every bodily action is taking place in a 3D environment,” she says. “It’s part of the reason why emotional reactions can be stronger in that space, and why VR triggers the same internal nervous system and psychological responses.”

That was true in the case of the woman who was groped in Horizon Worlds. According to The Verge, her post read: “Sexual harassment is no joke on the regular internet, but being in VR adds another layer that makes the incident more intense. Not only was I groped last night, but there were other people there who supported this behavior, which made me feel isolated in the Plaza [the virtual environment’s central gathering space].”

Sexual assault and harassment are nothing new in virtual worlds, and it is unrealistic to expect a world in which these issues disappear completely. As long as there are people who hide behind their computer screens to evade moral responsibility, they will continue to occur.

The real problem, perhaps, lies in the underlying assumption that when you play a game or participate in a virtual world, you enter into what Stanton describes as a “contract between the developer and the player.” “As a player, I’m agreeing that I can do what I want in the developer’s world according to their rules,” he says. “But as soon as that contract is broken and I no longer feel comfortable, the company’s obligation is to return the player to where they want to be and back to being comfortable.”

The question is: Whose responsibility is it to make sure users are comfortable? Meta, for example, says it gives users access to tools to keep themselves safe, effectively shifting the responsibility onto them.

“We want everyone in Horizon Worlds to have a positive experience with safety tools that are easy to find, and it’s never a user’s fault if they don’t use all the features we offer,” said Meta spokesperson Kristina Milian. “We will continue to improve our UI and to better understand how people use our tools so that users are able to report things easily and reliably. Our goal is to make Horizon Worlds safe, and we are committed to doing that work.”

Before joining Horizon Worlds, users must go through an onboarding process that teaches them how to launch Safe Zone, Milian said. She also said that regular reminders are loaded into screens and posters within Horizon Worlds.

Screenshot of the Safe Zone interface (courtesy of Meta/Facebook)

But the fact that the groping victim either didn’t think to use Safe Zone or couldn’t access it is exactly the problem, Cross says. “The structural question is the big issue for me,” she says. “Generally speaking, when companies address online abuse, their solution is to outsource it to the user and say, ‘Here, we give you the power to take care of yourselves.’”

And that is unfair, and it doesn’t work. Safety should be easy and accessible, and there are plenty of ideas for making this possible. For Stanton, some sort of universal signal in virtual reality, perhaps QuiVr’s V gesture, could relay to moderators that something was wrong. Fox wonders whether an automatic personal distance between avatars, unless two people mutually agree to be closer, would help. And Cross believes it would be useful for training sessions to explicitly lay out the norms that prevail in ordinary life: “In the real world, you wouldn’t randomly grope someone, and you should carry that over to the virtual world.”

Until we know who is responsible for protecting users, one major step toward a safer virtual world is disciplining aggressors, who often go scot-free and remain eligible to participate online even after their behavior becomes known. “We need deterrents,” says Fox. That means making sure bad actors are found and suspended or banned. (Milian said Meta “[doesn’t] share specifics about individual cases” when asked what had happened to the alleged groper.)

Stanton regrets not pushing harder for industry-wide adoption of the power gesture and not talking more about Belamire’s groping incident. “It was a missed opportunity,” he says. “We could have avoided that incident at Meta.”

If anything is clear, it is this: there is no body clearly responsible for the rights and safety of those who participate anywhere online, let alone in virtual worlds. Until something changes, the metaverse will remain a dangerous, problematic space.
