Metaverse App Allows Children to View Explicit Material
A researcher has discovered a design flaw in a metaverse app that allows children to view sexually explicit material.
Posing as a 13-year-old, a BBC News researcher visited the virtual reality world and witnessed avatars simulating sex.
She was able to view sex toys and condoms, and said that adult male users encouraged her to engage in virtual sex acts.
The metaverse is the name given to games and experiences viewed by people wearing virtual reality headsets. It has been the subject of renewed interest recently, after Facebook rebranded as Meta and invested billions into developing the technology.
The company’s Oculus Quest headset, now known as the Meta Quest, is thought to have as much as 75% of the market share and can be used in conjunction with third-party apps.
In this case, the researcher Jess Sherwood used VRChat, an online virtual platform that users can explore with 3D avatars.
Although the app isn’t owned by Facebook or Meta, users are required to have a Facebook account.
This process ostensibly works as an age verification system, because you must be at least 13 to have a Facebook account, but there are no further restrictions on what users can access in VRChat based on their age.
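To illustrate the gap, here's a minimal sketch in Python – with hypothetical function and field names, not anything from VRChat's actual codebase – of the difference between the account-level age gate described above and the per-content check that appears to be missing:

```python
from datetime import date

MIN_ACCOUNT_AGE = 13    # Facebook's sign-up minimum
ADULT_CONTENT_AGE = 18  # the threshold an adult-only room would need

def age_on(today: date, birth_date: date) -> int:
    """Whole years between birth_date and today."""
    return today.year - birth_date.year - (
        (today.month, today.day) < (birth_date.month, birth_date.day)
    )

def can_create_account(birth_date: date) -> bool:
    """Account-level gate: the only check the linked account enforces."""
    return age_on(date.today(), birth_date) >= MIN_ACCOUNT_AGE

def can_enter_room(user_age: int, room_is_adult_only: bool) -> bool:
    """Per-content gate: the check the researcher found to be absent.
    Without it, passing the account gate at 13 unlocks every room."""
    return user_age >= ADULT_CONTENT_AGE if room_is_adult_only else True
```

As the sketch suggests, verifying age once at sign-up does nothing to restrict what a 13-year-old can reach once inside.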
Inappropriate, harmful experiences
The metaverse is a nascent technology, but concerns about data privacy – particularly in relation to children – are ongoing.
Social media platforms have been scrutinised for the way they use people’s personal data, and this has been particularly true in relation to children’s data.
TikTok has faced numerous complaints and last year the former children’s commissioner for England, Anne Longfield, sued the video-sharing app on behalf of 3.5 million children.
Instagram has also faced a regulatory inquiry, while Facebook has been criticised for everything from sharing users’ personal data to using a ‘legal trick’ to bypass its GDPR responsibilities.
The NSPCC’s (National Society for the Prevention of Cruelty to Children) head of online child safety policy, Andy Burrows, believes that the BBC’s research demonstrates an endemic issue regarding technology giants and data privacy.
"It’s children being exposed to entirely inappropriate, really incredibly harmful experiences," he said.
What raises particular concern about the metaverse is that users aren’t merely viewing harmful content but interacting with it.
Jess Sherwood said of her experience using VRChat: “I was surprised how totally immersed in the spaces you are. I started to feel like a child again. So when grown men were asking why I wasn’t in school and encouraging me to engage in VR sex acts, it felt all the more disturbing.”
She added: “Everything about the rooms feels unnerving. There are characters simulating sex acts on the floor in big groups, speaking to one another like children play-acting at being adult couples.
“It’s very uncomfortable, and your options are to stay and watch, move on to another room where you might see something similar, or join in – which, on many occasions, I was instructed to do.”
‘Dangerous by design’
As organisations become more technologically sophisticated, they are finding innovative ways to use personal data.
The metaverse is an extreme example of this, and although Meta might deserve a degree of leniency while it explores what the technology can do, the company’s record on data privacy doesn’t leave much room for optimism.
Meta says it does have tools that enable players to block other users and is looking to improve privacy “as it learns how people interact in these spaces”.
But as Andy Burrows suggests, we shouldn’t accept organisations releasing platforms to the public if they contain major privacy issues.
He describes the metaverse as “a product that is dangerous by design, because of oversight and neglect. We are seeing products rolled out without any suggestion that safety has been considered.”
More advanced technology is going to create sophisticated privacy concerns, making it harder for organisations to bolt data protection measures on to existing products.
This is a problem anticipated by the GDPR (General Data Protection Regulation), which requires organisations to adopt privacy by design – prioritising privacy from the outset when developing processes or tools.
Before any system goes live, organisations must be confident that sensitive data is being used responsibly and that it cannot be accessed by unauthorised parties.
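As a rough illustration of what that means in practice, the Python sketch below – with invented field and role names; a sketch of the principle, not a reference implementation – shows two design-time safeguards: an allow-list that minimises what is collected, and a default-deny rule on who can read it.

```python
ALLOWED_FIELDS = {"session_length", "room_id"}  # collect only what the feature needs
AUTHORISED_ROLES = {"safety-team"}              # default deny: everyone else is refused

def collect_event(raw_event: dict) -> dict:
    """Data minimisation at the point of collection: anything outside
    the allow-list is dropped before the record is ever stored."""
    return {k: v for k, v in raw_event.items() if k in ALLOWED_FIELDS}

def read_event(event: dict, requester_role: str) -> dict:
    """Default-deny access: sensitive records cannot reach
    unauthorised parties unless a role is explicitly granted."""
    if requester_role not in AUTHORISED_ROLES:
        raise PermissionError(f"role {requester_role!r} is not authorised")
    return event
```

The point is that both safeguards exist before any data flows, rather than being bolted on after launch.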
The GDPR isn’t the only legislation that addresses this. The UK’s forthcoming Online Safety Bill, which is due before parliament later this year, sets out ‘safety by design’ requirements.
The Bill, when passed, would place a duty of care on platforms and providers to protect children from harmful content. Although it doesn’t specifically address virtual reality and the metaverse, Culture Secretary Nadine Dorries made it clear that the technology would be covered.

Download our free guide Privacy by Design – Step by step to learn more.
This eight-step guide explains in more detail what privacy by design is and how it works, and contains a walkthrough to help organisations meet their privacy by design requirements.
It takes you through the entire process of embedding privacy in your processes and tools, from creating a roadmap and selecting features to implementing, testing and launching your set-up.
What is the metaverse and what are the data privacy implications?
DQM GRC Senior Consultant Peter Galdies
Defining the metaverse takes us right to the heart of the problem – we simply cannot envisage what a working metaverse may turn out to be.
Some predict it will be a kind of virtual-reality-driven ‘alternate reality’, where people can get together using online personas in a kind of worldwide video game. The truth is that any current definition of the metaverse is only likely to be a massive oversimplification.
Today we don’t really know what the metaverse is, except as a term being hyped to describe the future of digital interaction.
A significant number of organisations of all shapes and sizes are starting to seriously consider how this future might work.
Familiar names such as Meta (previously known as Facebook) and Google are leading metaverse initiatives, and it is likely that how we work, as well as how we play, will become a commercial opportunity for such firms to embrace.
So we would appear to be at a juncture where organisations are poised for a “gold rush” into this space – but do not quite know what to do or where to go, as there is a lack of real definition and regulation.
We can anticipate that any kind of digital interaction between individuals (such as the metaverse is likely to require) will be facilitated by the use of data, and there will also be an opportunity for such interactions to be recorded (or “tracked”).
It’s also likely that much of this data will fall within the definition of “personal” – meaning that much of the processing may fall within the scope of existing privacy laws. However, existing privacy rules, structures and standards were not designed with this future in mind, and may not provide the degree of control required to protect the rights of individuals. All in all, the potential remains for another data “wild west”.
Regulation and standardisation will continue to evolve, but they will always follow innovation and commercial exploitation. Ideally, a core of those responsible for providing the access and technology necessary for the internet to exist should work closely with governments worldwide to develop a set of principles and standards to shape the future metaverse.
Metaverse meets meta-perverse
DQM GRC Consultant Mark James
When it comes to our personal data, identity, and perhaps even our soul, Facebook – or as it is now known, Meta – is not without its share of controversy.
Meta has struggled to keep personal data private or prevent users from posting harmful content, so there is rightful concern over its ambitious plan for the metaverse.
The company’s founder, Mark Zuckerberg, recently described AI as “the key to unlocking the metaverse”. That has come in the form of its ‘Builder Bot’, which Meta claims will support data privacy and transparency, but there is scepticism about the scope of the project.
For the uninitiated, the metaverse is a virtual world that you can access with a VR headset. Once in that world, you can play games, shop and visit simulated tourist attractions.
As some have found, you can even go to strip clubs or concerts, buy land, invest in Bitcoin and do a host of other things.
But the technology raises fundamental questions about privacy – with the BBC News exposé being only the latest example.
Following a recent incident in which a user’s avatar was sexually harassed, Meta released a safety feature called Personal Boundary, which keeps other avatars at a distance. The tech firm admits that more controls are necessary, but how does it propose to protect users from harm?
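Meta hasn’t published how Personal Boundary works internally. As a rough sketch of the general idea only – with hypothetical names and an illustrative radius, not Meta’s actual figures or code – a minimum-distance rule might look like this:

```python
import math

BOUNDARY_RADIUS = 1.2  # metres; an illustrative figure, not Meta's

def enforce_boundary(my_pos, other_pos, radius=BOUNDARY_RADIUS):
    """If another avatar moves inside the bubble, push it back to the
    edge. Positions are (x, y, z) tuples in world space."""
    dx, dy, dz = (o - m for o, m in zip(other_pos, my_pos))
    dist = math.sqrt(dx * dx + dy * dy + dz * dz)
    if dist == 0 or dist >= radius:
        return other_pos  # exactly overlapping or already outside: leave as-is
    scale = radius / dist
    return tuple(m + d * scale for m, d in zip(my_pos, (dx, dy, dz)))
```

A per-user bubble like this addresses proximity, but as the rest of this piece argues, it does nothing about who can enter a space in the first place.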
Regulatory frameworks such as the GDPR (General Data Protection Regulation) aim to make organisations more accountable for data protection risks.
One of the key aspects of this is privacy by design, outlined in Article 25 of the GDPR and section 57 of the UK Data Protection Act 2018. These provisions require organisations to integrate effective safeguards during the conception, design and implementation of new tools and processes, rather than bolting them on to existing technology.
The overarching aim of these rules is to minimise the risks to the rights and freedoms of data subjects.
But the legislation surrounding privacy by design doesn’t prescribe specific controls that organisations must implement. This gives organisations the flexibility to create measures appropriate to their processes.
Pseudonymisation is a clear example of this. The practice is mentioned in Article 32 and Recital 28 of the GDPR, but neither contains specific details on how or when it should be used.
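One common approach is a keyed hash, sketched below in Python. The key handling here is purely illustrative – in practice the key would be loaded from a secrets vault – and because whoever holds the key can re-link the pseudonyms, pseudonymised data still counts as personal data under the GDPR.

```python
import hashlib
import hmac

SECRET_KEY = b"example-key-loaded-from-a-vault"  # placeholder; never hard-code a real key

def pseudonymise(identifier: str) -> str:
    """Replace an identifier with a keyed SHA-256 digest. The output is
    deterministic, so records about the same person can still be joined
    for analysis without exposing the raw identifier."""
    return hmac.new(SECRET_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input always maps to the same pseudonym:
assert pseudonymise("user@example.com") == pseudonymise("user@example.com")
```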
Regulators clearly have their work cut out, and although the UK’s Online Safety Bill will provide greater guidance for protecting children’s data privacy, organisations must commit to privacy by design if they are to achieve effective results.
A version of this blog was originally published on 24 February 2022.