How safe is Meta?

Evaluation of the Metaverse

When I heard that Facebook had created a virtual world for people to simulate reality-like experiences in chat rooms, I was amazed that the idea didn't receive major pushback. Some people may genuinely need such experiences, whether for mentoring or for medical learning, such as simulated treatments to help manage medicine addiction. However, simulated in-person physical interaction can produce many negative results for the mentally disturbed, for victims of virtual abuse, and for others caught in the chain reactions that can follow from visiting virtual rooms. As for corporations, it is fair to say that profit, rather than healthy relationships, may be the determining goal here. Technology such as Meta's would require balanced governing control, which is hard to establish if the platform is built on the notion that people will always do the right thing.


The social assumption that all people are good poses a narrative that anyone can be seen as good, no matter what they do. This is a false narrative, presented by people who are reluctant to face the fact that humankind has displayed evil since the beginning of the world: holocausts, wars waged by vicious leaders, genocide for profit, and many other examples. Presenting a place for people to communicate is one thing, but presenting a place for people to touch other people, whether in reality or in life-like simulations, can produce great repercussions of evil.

One article stated:

  • A nonprofit advocacy group says a researcher’s avatar was raped in the metaverse.
  • Other Meta users have also said they were sexually harassed or abused in the metaverse.
  • Meta investors wanted a report on harms facing metaverse users, but shareholders rejected the idea.

A researcher entered the metaverse wanting to study users' behavior on Meta's social-networking platform Horizon Worlds. But within an hour of donning her Oculus virtual-reality headset, she says, her avatar was raped in the virtual space.

"Metaverse: another cesspool of toxic content," a new report published by the nonprofit advocacy group SumOfUs on Tuesday, details the researcher's violent encounter in Meta's Horizon Worlds.

According to SumOfUs' account, users invited the researcher to a private party on Horizon Worlds earlier this month. Users in the same room then asked her to disable a setting that prevented others from getting within 4 feet of her.

The report linked to a video that the group says shows what happened to the researcher’s avatar from her perspective. In the video, a male avatar is seen getting very close to her, while another male avatar stands nearby, watching. A bottle of what appears to be alcohol is then passed between the two avatars, per the 28-second video. Two male voices are heard making lewd comments in the video.

In a part of the video SumOfUs opted not to share but describe, the researcher “was led into a private room at a party where she was raped by a user who kept telling her to turn around so he could do it from behind while users outside the window could see — all while another user in the room watched and passed around a vodka bottle,” per the report.

Even though it happened in virtual reality, the incident left the researcher "disoriented," she said in the report. The researcher noted that her controller vibrated when the male avatars touched her, producing a physical sensation tied to what she was experiencing online.

“One part of my brain was like WTF is happening, the other part was like this isn’t a real body, and another part was like, this is important research,” she said in the report.

SumOfUs researchers also reported encountering homophobic and racist slurs in Horizon Worlds and said they witnessed gun violence on the platform.

Meta launched Horizon Worlds in December to users 18 and up in the US and Canada. By February, there were at least 300,000 users on the platform, according to The Verge.

Four other users also recently said their avatars were sexually assaulted or harassed on Horizon Worlds and other Meta VR platforms, according to the SumOfUs report.

In November, a beta tester reported that her avatar had been groped in Horizon Worlds.

At the time, a Meta representative, Kristina Milian, told MIT Technology Review that users should have “a positive experience with safety tools that are easy to find — and it’s never a user’s fault if they don’t use all the features we offer.” She continued: “We will continue to improve our UI and to better understand how people use our tools so that users are able to report things easily and reliably. Our goal is to make Horizon Worlds safe, and we are committed to doing that work.”

But the next month, a metaverse researcher named Nina Jane Patel said in a post on Medium that within 60 seconds of joining Horizon Worlds, three to four male-looking avatars gang-raped her avatar.

That same month, The New York Times reported that a female player’s avatar was groped on a Meta-owned shooter game. Separately, a player on the sports game Echo VR said a male player told her he had recorded her voice so he could “jerk off” to her cursing.

SumOfUs and Meta didn't immediately respond to Insider's requests for comment. In response to the SumOfUs report, a Meta representative told the Daily Mail it didn't recommend "turning off the safety feature with people you do not know."

At least two major Meta investors expressed concern over emerging details of harassment and abuse on its metaverse platforms


Meta has staked its future on building an immersive virtual-reality metaverse, plowing $10 billion into its design. CEO Mark Zuckerberg is playing the long game with his investment, recently saying the project could continue to lose money for three to five years, Insider reported.

At least two major Meta investors, however, were alarmed by emerging details of harassment and abuse on its metaverse platforms.

In December, the investors Arjuna Capital and Storebrand Asset Management, together with SumOfUs and several other advocacy organizations, co-filed a motion demanding that Meta publish a report examining any harms users could face on its metaverse platforms, they said in a press release.

“Investors need to understand the scope of these potential harms, and weigh in on whether or not this is a good idea before we throw good money after bad,” Arjuna Capital’s managing partner Natasha Lamb said in the release.

At Meta’s Wednesday shareholder meeting, a proposal was introduced to complete a third-party assessment of “potential psychological and civil and human rights harms to users that may be caused by the use and abuse of the platform” and “whether harms can be mitigated or avoided, or are unavoidable risks inherent in the technology.”

However, the proposal was voted down.

Earlier this month, Nick Clegg, the president for global affairs at Meta Platforms, said in a blog post that “the rules and safety features of the metaverse — regardless of the floor — will not be identical to the ones currently in place for social media” and “nor should they be.”

But, he continued: “In the physical world, as well as the internet, people shout and swear and do all kinds of unpleasant things that aren’t prohibited by law, and they harass and attack people in ways that are. The metaverse will be no different. People who want to misuse technologies will always find ways to do it.”

Read the original article on Business Insider

