The metaverse is set to be a more engaging, more immersive, more all-consuming space than the current online and social media app environments. On one hand, that can facilitate wholly new experiences, which will be transformative in many respects. On the other, it also means that many of the problems common within the current social media landscape will be amplified, including bullying, harassment, and other forms of abuse, which could feel even harder to escape within more enclosed digital spaces.
Meta's next step on this front is a new ‘mature audiences’ content rating process for Horizon Worlds, its VR creation engine, which will eventually become a central element in its broader metaverse development.
As reported by Upload VR:
“Meta signposted the change in an email sent out to Horizon Worlds users, indicating that creators need to apply a content rating to their worlds to show whether they are appropriate for all ages or only for mature users (age 18 and over). If creators take no action and do not update their existing worlds within the next month, then those worlds ‘will default to 18+ regardless of the content in the world.’”
Meta’s updated ‘Horizon Mature Worlds’ policy outlines exactly what is and is not acceptable within its VR environment.
Meta says that all worlds which include the following elements will now be considered ‘mature’ content:
Note that Meta also doesn’t allow sexually explicit material, content that depicts the use of illegal drugs, content that promotes criminal activity, or real life depictions of violence within its VR environment.
‘Drunkn Bar Fight’, however, is evidently okay, so Meta's rules against ‘realistic’ depictions of violence may be a little more flexible than you'd think.
It's an important step for Meta because, based on past experience with video games and other forms of immersive engagement, there are going to be significant concerns about the broader impacts of VR use, and how those may translate into real-world actions.
Grand Theft Auto, for example, has been cited as a key element in the normalization of violence, in various forms. Long before that, games like Doom were blamed for harmful impulses in youngsters who’d spend so much time trapped in these immersive 3D worlds that the lines between digital fiction and reality started to blur.
You can imagine, then, that far more realistic, far more immersive VR experiences are going to be tagged with the same concerns, and whether or not a real link is ever proven, Meta will need to tread carefully and implement protection measures proactively to limit potential harms.
Horizon Worlds enables users to create their own VR environments, with objects and tools available for building their own places and experiences. Eventually, that platform will be filled with all sorts of 3D items and options, and the view is that this will form the basis of the broader metaverse shift, establishing a home for millions, even billions, of unique experiences, created by people all around the world, that will facilitate interaction and engagement in totally new ways.
But protection is key. Meta can't ‘move fast and break things’ like it used to. We now have more knowledge of the impacts that digital experiences can have, and Meta needs to factor these in, as best it can, while also matching that with the acceleration of its metaverse push.
I’m not sure that many will have much faith in Meta taking a more measured, careful approach in this sense, but the VR experience is evolving fast, and these elements are critical considerations.
Horizon Worlds users can locate the new rating option in the ‘World’ tab in Build mode.