
A new law in California protects consumers’ brain data. Some think that doesn’t go far enough.

However, some mental privacy advocates are not convinced that the law does enough to protect neural data. “While it introduces important safeguards, significant ambiguity leaves room for loopholes that could undermine privacy protections, particularly with regard to inferences from neural data,” wrote Marcello Ienca, an ethicist at the Technical University of Munich, on X.

One of those ambiguities concerns the meaning of “non-neural information,” said Nita Farahany, a futurist and legal ethicist at Duke University in Durham, North Carolina. “The wording of the bill suggests that raw data [collected from a person’s brain] may be protected, but inferences drawn from that data – where the privacy risks are greatest – may not be,” Farahany wrote in a post on LinkedIn.

Ienca and Farahany are co-authors of a recent paper on mental privacy. In it, the two, along with Patrick Magee, also at Duke University, argue for expanding the definition of neural data to include what they call “cognitive biometrics.” This category could include physiological and behavioral information as well as brain data – in other words, pretty much anything that could be detected by biosensors and used to infer a person’s mental state.

After all, it’s not just your brain activity that reveals how you feel. For example, an increase in heart rate can indicate excitement or stress. Eye-tracking devices can reveal your intentions, such as a decision you are likely to make or a product you may decide to purchase. This type of data is already being used to reveal information that might otherwise be extremely private. Recent research has used EEG data to predict volunteers’ sexual orientation or whether they use recreational drugs. And others have used eye-tracking devices to infer personality traits.

With this in mind, it is important that we get it right when it comes to protecting mental privacy. As Farahany, Ienca, and Magee put it, “By deciding whether, when, and how to share their cognitive biometric data, individuals can contribute to advances in technology and medicine while maintaining control over their personal data.”


Read more from MIT Technology Review’s archive

Nita Farahany offered her thoughts on technology that aims to read our minds and explore our memories in a fascinating question and answer session last year. Targeted dream incubation, anyone?

There are many ways your brain data could be used against you (or potentially exonerate you). Law enforcement agencies have already started asking neurotech companies for data from people’s brain implants. In one case, a person was accused of assaulting a police officer, but brain data showed he was having a seizure at the time.

EEG, the technology for measuring brain waves, has been around for 100 years. Neuroscientists are wondering how it might be used to read thoughts, memories and dreams in the next 100 years.