Artificial intelligence and virtual reality together could result in a privacy nightmare.
According to experts, a machine learning algorithm could estimate people's height, weight, age, marital status, and more simply by observing how they move while wearing virtual reality headsets. The research demonstrates how artificial intelligence can infer personal information that users never deliberately disclosed.
In a study released in February, researchers at the University of California, Berkeley identified a single person from among more than 50,000 other VR users with more than 94% accuracy after analyzing only 200 seconds of motion data. In a second study published in June, the researchers used data from 1,000 players of the popular VR game Beat Saber to determine a person's height, weight, foot size, and country with more than 80% accuracy. Even private information such as marital status, employment status, and ethnicity could be determined with more than 70% accuracy.
The researchers used a machine learning model to analyze data captured by virtual reality headsets, such as eye and hand movements. “The easy ones for the model are age, gender, ethnicity, and country,” said lead researcher Vivek Nair at UC Berkeley. For instance, the model could estimate someone's age from how quickly they hit a virtual target, since faster reaction times correlate with better eyesight and youth. “But there are even things like your level of income, your disability status, health status, even things like political preference can be guessed,” he said.
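To make the idea concrete, here is a minimal, hypothetical sketch (not the Berkeley team's actual code or data) of how a simple classifier might guess an age group from two motion-derived features — average reaction time and tracked head height. The labels, numbers, and nearest-centroid approach are illustrative assumptions only; the real studies used far richer motion data and more sophisticated models.

```python
# Hypothetical sketch: nearest-centroid classifier that guesses an age
# group from motion-derived features. All training values are invented.

# Synthetic training data: (reaction_time_seconds, head_height_meters)
TRAIN = {
    "under_30": [(0.28, 1.72), (0.30, 1.68), (0.27, 1.75)],
    "over_30":  [(0.42, 1.70), (0.45, 1.66), (0.40, 1.74)],
}

def centroid(points):
    """Mean of a list of 2-D feature vectors."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(2))

CENTROIDS = {label: centroid(pts) for label, pts in TRAIN.items()}

def guess_age_group(reaction_time_s, head_height_m):
    """Return the label whose centroid is closest to the sample."""
    def dist2(c):
        return (c[0] - reaction_time_s) ** 2 + (c[1] - head_height_m) ** 2
    return min(CENTROIDS, key=lambda label: dist2(CENTROIDS[label]))

# Fast reactions place this sample near the younger group's centroid.
print(guess_age_group(0.29, 1.71))  # → under_30
```

The point of the sketch is that once a headset streams even a handful of motion statistics, ordinary off-the-shelf classification is enough to make demographic guesses; no exotic technique is required.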
Over half of the participants in both experiments used Meta Platforms Inc.'s Quest 2 headset; 16% used the Valve Index; and the rest used other headsets such as the HTC Vive or Samsung Windows Mixed Reality. According to Jay Stanley, a senior policy analyst at the American Civil Liberties Union, virtual reality headsets record information that wouldn't be accessible through a conventional website or app, such as a user's gaze, body language, body proportions, and facial expressions. “It intensifies other privacy issues and brings them all together.”
Although it's unclear how much VR data is included in the mix, Meta, which makes the majority of its revenue from advertising based on user data, has already started using machine learning to fill in the gaps in what it knows about people. In 2021, Apple changed its privacy policies, limiting the information Meta could track on iPhones and wiping out $10 billion in revenue for the social media giant. That pushed the company to invest in AI. After improving its AI to better predict which content and advertisements users find interesting, Meta returned to double-digit revenue growth this year.
Since 2021, Meta has run a limited number of advertisements on its VR headsets. At the time, the company promised not to target ads using data processed and stored on the headsets, such as images of hands. When asked for more information about its policies governing the data its headsets generate, Meta referred Bloomberg to its Quest Safety Center, where the company describes how users can set their avatar, profile image, name, and username to private, giving them some control over who else can see them. The company also states that “data sent to and stored on our servers will be disassociated from your account when we no longer need it to provide the service or improve the eye tracking feature.”
Meta's collection of sensitive user data has drawn criticism before. After facing regulatory scrutiny, Meta shut down its facial recognition system and deleted over 1 billion facial images in 2021. Biometric data such as facial images is especially sensitive because it cannot be changed and can quickly identify a specific person. According to Nair, VR headsets also record similarly critical information, but because the technology is newer, users and regulators have yet to fully grasp its potential risks.
Privacy protections are far harder to build for VR headsets than for websites or apps because the headsets must collect data such as eye and hand movements in order to function. According to Stanley, there are a few options, including limiting how much data is retained or encrypting the information VR headsets gather. However, the makers of these headsets “have incentives to gather information about people for marketing,” he added.
Researchers say users know little about privacy settings or about how much data VR headsets capture. “I don't think it's reasonable to expect consumers to defend themselves here,” Stanley said, citing the combination of powerful AI extrapolation, gaps in consumer knowledge, and fast-changing technology.