Meta must act quickly against risks of metaverse harassment

Virtual worlds must assure everyone's safety from their very inception

Mark Zuckerberg’s dream of a harmonious online community faces a test

When Mark Zuckerberg described the metaverse last year, he conjured an image of harmonious social connections in an immersive virtual world. But his company’s first iterations of the space have not been very harmonious. Several women have reported incidents of harassment, including a beta tester who was virtually groped by a stranger and another who alleged she was virtually gang-raped within 60 seconds of entering Facebook’s Horizon Venues platform. I myself had several uncomfortable moments with male strangers on social apps run by both Meta and Microsoft in December.

These are early days for the metaverse, but that’s the problem. If safety isn’t baked into its design early on, it’ll be much harder to secure down the line. Gaming firms like Riot Games, for instance, have faced an uphill battle trying to rescue a virtual community from toxic behaviour. Facebook also knows this problem well: It struggled to put the proverbial toothpaste back in the tube with covid vaccine misinformation, as highlighted by a whistleblower last year.

It turns out Facebook has grappled internally with building safety features into its new metaverse services. In 2016, it released Oculus Rooms, an app where anyone with its headset could hang out in a virtual apartment with friends and family. In 2017, it built Oculus Venues (now Horizon Venues), a virtual space where it would show films or sports games in the hope that visitors would mingle and make connections. It was a big shift, but also open to new risks.

The firm began holding meetings to discuss how it might design safety features into Venues, something it hadn’t done when designing Rooms, recalls Jim Purbrick, a former engineering manager at Facebook who was involved with its VR efforts. Managers did pay attention to safety, he tells me. For instance, people had to watch a safety video before entering Venues. He says he warned engineers early on that VR avatars should fade out and disappear if they got too close to another user. They liked the idea, he says, but it was never implemented. A spokeswoman for Facebook didn’t say why the firm had not implemented a fade-to-vanish feature, and instead highlighted its new ‘personal boundary’ tool, which prevents certain avatars from coming within a radius of two virtual feet of your own.
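To make the distinction concrete, here is a minimal sketch of how such proximity rules might work, purely as an illustration: the names, threshold values and fade curve are assumptions, not Meta’s actual implementation. A hard boundary simply flags (or blocks) anyone inside the radius, while the fade-out idea Purbrick describes dims an approaching avatar until it vanishes at the boundary.

```python
import math
from dataclasses import dataclass

# Illustrative values only; "two virtual feet" comes from Meta's public
# description of the personal boundary, the rest is assumed for the sketch.
PERSONAL_BOUNDARY_RADIUS = 2.0

@dataclass
class Avatar:
    x: float
    y: float
    opacity: float = 1.0  # 1.0 = fully visible, 0.0 = invisible

def distance(a: Avatar, b: Avatar) -> float:
    return math.hypot(a.x - b.x, a.y - b.y)

def inside_personal_boundary(me: Avatar, other: Avatar) -> bool:
    """Hard rule: is the other avatar within my boundary radius?
    A real system would also block the movement that caused the overlap."""
    return distance(me, other) < PERSONAL_BOUNDARY_RADIUS

def apply_fade_out(me: Avatar, other: Avatar, fade_start: float = 4.0) -> None:
    """Soft rule: the closer a stranger gets, the more transparent they
    become, disappearing entirely once they reach the boundary radius."""
    d = distance(me, other)
    if d >= fade_start:
        other.opacity = 1.0
    elif d <= PERSONAL_BOUNDARY_RADIUS:
        other.opacity = 0.0
    else:
        other.opacity = (d - PERSONAL_BOUNDARY_RADIUS) / (fade_start - PERSONAL_BOUNDARY_RADIUS)
```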

The boundary tool can backfire, Purbrick says, pointing to how similar features have been misused in gaming. “You can end up with gangs of people creating rings around others, making it difficult for them to move out," he says. “If there’s a big crowd and you have a bunch of personal boundaries, it makes navigation harder." Meta said avatars would still be able to move forward with the boundary tool. “Oculus definitely cared about people having good experiences in VR and understood that a bad first experience could put people off VR forever, but I think they underestimated the size of the problem," Purbrick adds.

He believes Meta should make safety features easier to find, like a fire extinguisher, and get volunteers to monitor behaviour. The gaming industry has some templates for this sort of governance. Until now, Meta has centralized the task of moderating content on Facebook, but it will struggle with such an approach in a new virtual world.

The company has “the most centralized decision-making structure" ever for a large company, according to one early backer, a description underscored by Zuckerberg’s control of 57% of the company’s voting shares. But virtual worlds are human communities at their core, which means people will want more of a say in how they are run. Relinquishing some of that central control could help Meta mitigate harassment.

Educating visitors about what constitutes potentially criminal behaviour would, too. Holly Powell Jones, a criminologist, has found that an alarming number of children and teenagers shrug off harassment or the sharing of indecent images because they have no idea that they are criminal offences. People have “almost certainly" been harassed at a criminal level in virtual reality already, she says. “Harassment in digital spaces is nothing new, and it’s something we and others in the industry have been working to address for years," Meta’s spokeswoman said.

With the police already stretched by cases from social media and the offline world, tech firms should try more radical solutions to address harassment in the metaverse before it’s too late. The dearth of women in the development process for virtual reality certainly isn’t helping and could be fixed.

Microsoft last week announced a more drastic move to combat harassment: It was shutting down several of its social platforms, including Campfire, muting all attendees when they joined an event and activating ‘safety bubbles’ as a default. Meta should take a cue from that. Otherwise, its metaverse dream may fade away.

Parmy Olson is a Bloomberg Opinion columnist covering technology.

Published: 22 Feb 2022, 10:44 PM IST