None of the de-identified data that's captured will show up in the new My Privacy Center, which means there is currently no way for users to audit what types of de-identified data are being collected. There's also no mechanism for users to see whether the sampling frequency for recording physical movements increases, and Oculus has no disclosure obligation to notify users if it does increase that frequency or start capturing new types of physical movements. If Oculus is truly committed to full transparency, then it should provide a master list of every type of data being collected, in table format, with details about the different tiers in which that data is stored and what information is shared with other Facebook-family services.
The new GDPR regulation also says that "it must be as easy to withdraw consent as it is to give it," but there is no indication that Oculus will provide ways to opt out of having any given type of data captured and recorded; that granularity of control was not shown in the initial screenshots of the new My Privacy Center.
But both the old and new privacy policies say that all data collected by Oculus can also be shared with Facebook: "Sharing Within Related Companies. Depending on which services you use, we share information within the family of related companies that are legally part of the same group of companies that Oculus is part of, or that become part of that group, such as Facebook." The policy also says that Oculus can use information to "market to you on and off our Services," which may have been intended to mean e-mail, but it can also be read to mean that Oculus data can be used to advertise to you on Facebook.
Every biometric data expert I've talked with has warned about the privacy risks of this data. Behavioral neuroscientist John Burkhardt warns that there's an unknown ethical threshold between predicting and controlling behavior once companies have access to biometric data streams like eye tracking, facial tracking and emotional detection, galvanic skin response, EEG, EMG, and ECG.
Privacy advocate Sarah Downey warns that VR could turn out to be the most powerful surveillance technology ever created if companies start recording biometric data, or it could be the last bastion of privacy if architected correctly. She also points out that the more data companies record, the weaker America's Fourth Amendment protections become, which can make people less likely to speak freely and exercise their First Amendment right to free speech.
Jim Preston warns against the dangers of performance-based marketing companies like Facebook or Google having access to biometric data, saying that we're mortgaging our rights to privacy in exchange for free services. He says that privacy is a really complicated topic, and that the entire VR industry will need to engage in these discussions.
Advanced Brain Monitoring CEO Chris Berka says that some biometric data should be considered medical information protected by HIPAA regulations, and that commercial companies will have to navigate sensitive issues around how they store and treat biometric data. Tobii's VP of Products and Integrations Johan Hellqvist says that companies should ask for explicit consent before they consider recording eye tracking data.
So I've had many conversations with biometric data experts warning about how this data from your body reveals whole new levels of unconscious information about what you value, what you're paying attention to, and perhaps even what you find interesting. Biometric data will be a gold mine for performance-based marketing companies like Google and Facebook, so it's not terribly surprising that Oculus is leaving the door open on how it will treat that data. But it's also disappointing that Oculus isn't participating more proactively in the larger conversation about biometric data, while seemingly discounting it as a concern that's far off in the future, even though we're already seeing prototype VR devices with eye tracking built in, like Qualcomm's reference design with Tobii eye tracking. I expect eye tracking and facial tracking technologies to ship in VR and AR hardware within the next one to three years, which is not so far off.
There may also be issues with recording this type of biometric data under the presumption that it's de-identified, when there could be unique biometric signatures that de-anonymize it. OpenBCI's Conor Russomanno warns that EEG data may turn out to carry unique biometric signatures, which would mean the data could never be fully anonymized.
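To make the de-anonymization concern concrete, here is a minimal toy sketch of how it can work in principle. All names and numbers are hypothetical: each user is summarized by a small feature vector (standing in for something like per-band EEG power or gaze-fixation statistics), and a nominally "anonymous" session is re-linked to a known profile by simple nearest-neighbor matching. Real re-identification attacks are far more sophisticated, but the mechanism is the same: a stable bodily signature acts as a fingerprint even after names are stripped.

```python
# Toy illustration (all data made up): re-linking "de-identified" sessions
# to known users via a stable biometric signature.
import math

# Enrolled profiles: user id -> hypothetical biometric feature vector
profiles = {
    "user_a": [0.62, 0.18, 0.44],
    "user_b": [0.31, 0.72, 0.10],
    "user_c": [0.55, 0.40, 0.81],
}

def euclidean(u, v):
    # Plain Euclidean distance between two feature vectors
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))

def reidentify(session_features):
    """Return the enrolled profile closest to an unlabeled session."""
    return min(profiles, key=lambda uid: euclidean(profiles[uid], session_features))

# A new session with no name attached, whose features resemble user_a's
anonymous_session = [0.60, 0.20, 0.45]
print(reidentify(anonymous_session))  # → user_a
```

The point is that stripping identifiers does nothing if the data itself is the identifier, which is exactly the worry Russomanno raises about EEG.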
When I asked why they removed this security section, Hall said that they're not trying to claim that data is 100% secure, but that they also didn't see the passage as necessary. It also happened to scare people. I don't think it should have been removed, because it's honest about the reality that no collected data is 100% secure, and that it can never be guaranteed to be. People should be scared, because we should be trying to limit what data are captured and recorded in the first place.