
Meta doesn’t want to police its virtual worlds. Kids are paying the price

Zach Matheson, 28, sometimes worries about the hostility on Horizon Worlds, Meta’s virtual reality social network. When his 7-year-old son Mason explores the app, he runs into users, often other kids, shouting profanity or racist slurs.
Matheson became so concerned about his son that he now follows his every move in virtual reality on a TV connected to the Quest headset. When he believes a room has become unsafe, he tells Mason to leave. He regularly posts on internet forums advising other parents to do the same.
“A lot of parents don’t understand it at all, so they usually just let their kids play there,” he said. “If your kids have an Oculus, try spying on them and watching who they talk to.”
For years, Meta has argued that the best way to protect people in VR is to let them protect themselves, giving users tools to control their own environment, such as the ability to block or push away other users. It’s a markedly less aggressive, and less costly, stance than the one the company has taken on its social media platforms Facebook and Instagram, which are backed by automated systems and human reviewers that root out hate speech, violent content and rule-breaking misinformation.
Nick Clegg, Meta’s president of global affairs, likens the company’s metaverse strategy to owning a bar. If patrons are subjected to “unpleasantly offensive language,” they simply leave; they don’t expect the bar’s owners to police the conversation.
But experts warn that such a moderation strategy can be dangerous for the children flocking to Horizon, which users say is rife with bigotry, harassment and pornography. While Meta officially bars children under 18 from its flagship VR app, researchers and users report that kids and teens are using it en masse, either on accounts that belong to adults or by misreporting their age.
In some cases, teenage users have been unable to handle dangerous situations they encountered in the metaverse, according to researchers. Others report that young users, left without adult supervision, harass other people. Meanwhile, new research suggests that victims of harassment and bullying in virtual reality often experience psychological effects similar to those of real-world attacks.
Kids “don’t even understand that there’s no monster under the bed,” said Jessie Fox, an Ohio State University assistant professor who studies virtual reality. “How are they going to know that an avatar is being controlled by a monster?”
Despite the risks, Meta continues to market the Metaverse to an ever younger audience, drawing the ire of child protection activists and regulators. After Meta revealed it plans to open Horizon Worlds to younger users between the ages of 13 and 17, some lawmakers called on the company to drop the plan.
“Given your company’s failure to protect children and teenagers, and the growing evidence that young metaverse users are at risk, we urge you to halt this plan immediately,” Senators Richard Blumenthal (D-Conn.) and Edward J. Markey (D-Mass.) wrote last week in a letter to Meta chief executive Mark Zuckerberg.
Meta spokeswoman Kate McLaughlin said in a statement that before the company makes Horizon Worlds “accessible to teens, we will be providing additional protections and tools to help provide them with age-appropriate experiences.”
“We encourage parents and guardians to use our parental controls, including app access control, to help keep them safe,” she added.
A new study by the Center for Countering Digital Hate, an advocacy group that monitors technology companies, illustrates some of the dangerous scenarios that apparently underage users encounter in the metaverse. The researchers recorded a series of abusive, bigoted and sexually explicit conversations taking place in virtual comedy clubs, parties and mock courtrooms in front of seemingly young users.
“The metaverse appeals to young people, and children will inevitably find their way in,” said Imran Ahmed, chief executive of the Center for Countering Digital Hate. “When you are catering to kids and trying to commercialize their attention, you have a responsibility to reassure their parents that your platform is safe.”
The controversy comes as Meta seeks to change the way people interact by moving them into an immersive virtual realm known as the metaverse. Meta executives envision a future in which people work, play and shop together in digital spaces that look and feel like the real world but run on virtual and augmented reality devices.
Meta’s rules prohibit pornography, advertising for illegal drugs and extreme violence. Users can report problematic incidents to safety specialists, block other users, garble the voices of users they don’t know, or remove themselves from a world altogether.
Those tools have not stopped prohibited content from spreading through the metaverse, often in front of users who appear to be children.
Researchers from the Center for Countering Digital Hate visited the 100 most popular worlds in Horizon Worlds, as ranked by user ratings. They documented the interactions they witnessed that involved adult content or that took place between apparent minors and adults.
Two researchers determined that a user was underage if they thought the person sounded like a child or if the user explicitly stated their age.
They found users playing group sex games, asking questions such as “What category of porn do you have?” At the Soapstone comedy club, one user in the crowd, after being told to “shut up,” shot back: “I’m only 12, calm down.”
In total, the group documented 19 incidents in which apparent minors were exposed to bigoted comments, harassment or pornographic content. Of the 100 worlds they visited, 66 appeared to contain users under the age of 18.
It’s unclear how many users have bypassed Meta’s age restrictions, or how the prevalence of explicit content in Horizon Worlds compares with that in other VR apps.
“The challenge is kids getting into things they don’t necessarily want to experience,” said Jeff Haynes, senior editor for video games and websites at Common Sense, an advocacy group that evaluates entertainment content for kids.
Haley Kremer, 15, said she comes to Horizon to connect with people, especially older mentors who have helped her through life’s challenges. It has been great, she said, to meet more people who care about her.
But not all of her interactions with adults on the app have been so positive. A few months ago, a user with a gray-haired male avatar approached her in one of Horizon’s main hubs and told her she was beautiful. When she told him to stay away from her, he followed her around until she blocked him, a tactic she had learned from a mentor.
New research into virtual reality suggests that the visceral experience of being in VR makes aggression in virtual spaces feel much like aggression in the real world. Users often report that their virtual bodies feel like extensions of their real bodies, a phenomenon known in academic research as embodiment.
“When someone says they’ve been stalked, attacked, or beaten up in virtual reality, it’s because all of their biological systems react the same way they would if they were physically attacked,” said Brittan Heller, a senior fellow for democracy and technology at the Atlantic Council.
Critics say that Meta’s bar-owner approach places much of the responsibility for policing these immersive virtual spaces on ordinary users, a responsibility that is harder for younger users to shoulder. And they note that Horizon Worlds was built by a tech giant with a poor track record of cracking down on the spread of dangerous speech on its social media platforms.
“Meta doesn’t run a bar. No bar has ever been the cause of a genocide,” Ahmed said. “No bar has ever been a breeding ground for the most dangerous predators in the country. Facebook has been all of those things, and so is the metaverse.”

 

