When you put on a VR headset and connect to the internet, you’re entering a digital wild west.
Is the device scanning your retinas? Your face? Your room?
If you enter the young metaverse, you may encounter the same creeps, trolls and sock puppets you’ll find on traditional social media. Facebook parent company Meta, which rebranded to focus on the digital environment, has developed ways to cloak yourself and report at least some bad behavior when interacting with strangers. But harassment and even assaults are already a growing issue.
When it comes down to it, though, the real safety issue in the metaverse may be Meta itself.
Meta, as owner of the Quest VR brand, currently provides the most accessible virtual reality headset on the market. And Meta, under CEO Mark Zuckerberg, has a track record of failing to protect consumers on its Facebook platform, including everything from rampant bot activity to election interference.
Other companies that offer or are developing VR platforms, including Sony, Microsoft and Apple, will not necessarily be less intrusive; Apple has a patent for tech that will scan images directly on people’s retinas, which raises its own questions.
Then again, as tech evolves, virtually every gaming, social media and app platform raises questions.
How informed are you?
Companies are supposed to inform you, the consumer, what their privacy policies are, and most do. But does a 14-page document written in heavy legal terminology that many consumers won’t fully understand really give them informed consent?
Tom Kelly, the West Coast-based president and CEO of the data privacy firm IDX, does not think so. He is on a mission to require transparency from these companies that is not only detailed, but easy to digest. A warning label for people using VR and AR platforms would basically come down to the hard truth: You are risking your personal data by using this product.
“This is one area where companies have shown no instinct for self-regulation,” Kelly told Technical.ly. “We’re creating a format that forces transparency, where people know exactly [what the risks are], just like a cigarette pack.”
Kelly said he is planning discussions over the next few months on the topic with people in various government agencies.
Chris Glanden, creator and host of BarCode, a podcast that tackles all kinds of cybersecurity topics, notes that VR is especially risky if you are very protective of your privacy. (The Wilmington, Delaware-based cybersecurity consultant’s professional experience includes working for ChristianaCare and Comcast.) VR has positive features, he said, and he would not tell people not to use it. But, like Kelly, he believes consumers need to know exactly what Meta, and any other company whose products they’re using, do with their personal data.
Your biometric data could be at risk
So, you may ask, after using platforms like Facebook, Twitter and hundreds of smartphone apps over the years, what are you really risking?
“When you think of VR, you have to think of it as data collection of biometric data,” Glanden said. “If you’re doing retina scans, fingerprinting, facial recognition, voice, any of those types of things where there’s sensitive data being collected [and] stored somewhere, there’s a significantly high risk of attacks. It’s just like stealing your credit card data, but they’re stealing your biometric data or your image to sell on the dark web for identity theft.”
So, for example, someone who has stolen your biometric data can create an avatar of you and use it to interact in the metaverse. They could then scam other users, spread disinformation and commit other cybercrimes as you.
“I do not know how much developers are thinking about data encryption when [they’re] building out these applications, but certainly you need to encrypt this data,” Glanden said. “If you’re a product designer, you need to make sure the devices that you’re designing and the software they use in those devices and headsets have proper security mechanisms in place.”
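To make Glanden’s point concrete, here is a minimal, hypothetical sketch of one such protection: never storing a raw biometric template at all, keeping only a keyed hash of it. This is an illustration using Python’s standard library, not anything Meta or any headset maker is known to do; real systems would use vetted encryption libraries and key-management services, and real biometric matching is fuzzy rather than bit-for-bit.

```python
import hashlib
import hmac
import os

# Hypothetical sketch: instead of persisting a raw biometric template
# (retina scan, face map, voiceprint), store only a keyed hash of it.
# If the database leaks, attackers get hashes, not reusable biometrics.

def protect_template(template: bytes, key: bytes) -> bytes:
    """Return an HMAC-SHA256 digest of a biometric template for storage."""
    return hmac.new(key, template, hashlib.sha256).digest()

def matches(candidate: bytes, stored: bytes, key: bytes) -> bool:
    """Compare a fresh scan against the stored digest in constant time."""
    return hmac.compare_digest(protect_template(candidate, key), stored)

# Usage sketch with placeholder data
key = os.urandom(32)               # in practice: from a key-management service
scan = b"example-retina-template"  # placeholder for real sensor output
stored = protect_template(scan, key)

assert matches(b"example-retina-template", stored, key)
assert not matches(b"someone-else", stored, key)
```

The design point, echoing the quote: sensitive data that is never stored in recoverable form cannot be stolen in recoverable form.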
You are the product
A bigger issue, per Kelly, is that companies are not putting the safety of their consumers first.
“They do not care,” he said bluntly. “They only care about gathering the data. Facebook’s business model is based on collecting data. It’s free because you’re the product.”
How much control you have over your own data depends on where you live, whether you’re purchasing an item online, using social media or gaming in VR. In Europe, there is the General Data Protection Regulation (GDPR) that deals with protecting individual data. In California, the California Consumer Privacy Act (CCPA) gives consumers control over their data, including the right to request the deletion of private data collected from them in most cases. Several other states are also taking similar steps in that direction, often facing opposition from business organizations.
“There’s a great need for those discussions, and for expanding those protections,” Kelly said.
Beyond the VR headset
And those needs are only going to increase as extended reality grows. That term encompasses VR, augmented reality and tech we may not even know of yet.
AR is commonly used in games like Pokémon GO and features like “view in your room” on Amazon, where you can place an item in your room using your phone’s camera. These may raise questions about GPS tracking or whether images of the inside of your home are being stored, but, for the most part, the risk is on the person playing the game or shopping for a new lamp.
That may not be the case with products already on the shelves like Ray-Ban Stories, the Meta/Ray-Ban collaboration in which a pair of seemingly normal, stylish Wayfarer sunglasses is covertly a headset computer, complete with a photo and video camera and phone. And this tech is just the beginning: full AR glasses that do not look like headsets are under development at Meta and other companies. These devices could bring the metaverse into your living room, allowing, for example, friends to virtually hang out together in their real spaces via avatars.
“These glasses identify you, and they can in fact start doing privacy invasion when they’re looking at you,” Kelly said. “They know [who] you are. That’s a privacy threat to the individual and also a privacy threat to the person in front of you.”
The best you can do is be aware
If you use extended reality (as this reporter does), you need to know that there is a lot that needs to be done before you can know what is happening with your data. For now, you do need to suck it up, read the 14 pages of legal jargon and decide whether the risk is worth it. If your state has consumer protection legislation in the pipeline, consider supporting it.
“You just need to be aware of the risks that you’re taking, and some of the potential scenarios that could play out from sharing any information,” Glanden said. “Malicious things can happen.”