November 22, 2024
A new consumer privacy battle is underway as tech gadgets capture our brain waves
As consumer technology capturing brain data proliferates under the radar of medical device regulation, Colorado has amended its privacy act to cover biological and neural data.


The question “What is a thought?” is no longer strictly a philosophical one. Like anything else measurable, our thoughts are subject to increasingly technical answers, drawn from data captured by tracking brain waves. That measurability also means the data can be commodified, and captured brain data is already being bought and sold by companies in the wearable consumer technology space, with few protections in place for users.

In response, Colorado recently passed a first-in-the-nation measure aimed at protecting this data. The measure amends the existing “Colorado Privacy Act,” which aims to protect “the privacy of individuals’ personal data by establishing certain requirements for entities that process personal data [and] includes additional protections for sensitive data.”

The key language in the Colorado act is the expansion of the term “sensitive data” to include “biological data” — inclusive of numerous biological, genetic, biochemical, physiological, and neural properties.

Elon Musk’s Neuralink is the most famous example of how technology is being integrated with the human brain, though it isn’t alone in the space, with Paradromics emerging as a close competitor, alongside devices that have restored speech to stroke victims and helped amputees move prosthetic limbs with their minds. All of these products are medical devices that require implantation and are protected under HIPAA’s strict privacy requirements. The Colorado law is focused on the rapidly growing consumer technology sphere, where devices don’t require medical procedures, have no analogous protections, and can be bought and used without medical oversight of any kind.

There are dozens of companies making wearable technologies that capture brain waves (aka neural data). On Amazon alone, there are pages of products, from sleep masks designed to optimize deep sleep or promote lucid dreaming, to headbands promising to promote focus, and biofeedback headsets that will take your meditation session to the next level. These products, by design and necessity, capture neural data through the use of small electrodes that produce readings of brain activity, with some deploying electric impulses to influence brain activity.

The laws governing the handling of all that brain data are virtually non-existent.

“We have entered the world of sci-fi here,” said lead sponsor of the Colorado bill, Representative Cathy Kipp. “As with any advances in science, there must be guardrails.”

‘ChatGPT-moment’ for consumer brain tech

A recent study by The NeuroRights Foundation found that of thirty companies examined that make wearable technology capable of capturing brainwaves, twenty-nine “provide no meaningful limitations to this access.”

“This revolution in consumer neurotechnology has been centered on the increasing ability to capture and interpret brainwaves,” said Dr. Sean Pauzauskie, medical director at The NeuroRights Foundation. Devices using electroencephalography, a technology readily available to consumers, represent “a multibillion-dollar market that is set to double over the next five or so years,” he said. “Over the next two to five years it is not implausible that neurotechnology might see a ChatGPT-moment.”

How much data can be collected depends upon several factors, but the technology is rapidly advancing, and could lead to an exponential increase in applications, with the tech increasingly incorporating AI. Apple has already filed patents for brain-sensing AirPods.

“Brain data are too important to be left unregulated. They reflect the inner workings of our minds,” said Rafael Yuste, professor of biological sciences and director of the NeuroTechnology Center at Columbia University, as well as chairman of the NeuroRights Foundation and a leading figure in the neurotech ethics organization the Morningside Group. “The brain is not just another organ of the body,” he added. “We need to engage private actors to ensure they adopt a responsible innovation framework, as the brain is the sanctuary of our minds.”

Pauzauskie said the value to companies comes in the interpretation, or decoding, of the brain signals collected by wearable technologies. As a hypothetical example, he said, “if you were wearing brain-sensing earbuds, not only would Nike know that you browsed for runners’ shoes from your browsing history, but it could now know how interested you were as you browsed.”

A wave of biological privacy legislation may be needed

The concerns targeted by the Colorado law may prompt a wave of similar legislation, with heightened attention to the mingling of rapidly advancing technologies and the commodification of user data. In the past, consumer rights and protections have lagged behind innovation.

“The best and most recent tech/privacy analogies might be the internet and consumer genetic revolutions, which largely went unchecked,” Pauzauskie said.

A similar arc could follow unchecked advancements in the collection and commodification of consumer brain data. Hacking, corporate profit motives, ever-changing user privacy agreements, and few to no laws covering the data are all major risks, Pauzauskie said. Under the Colorado Privacy Act, brain data is extended the same privacy rights as fingerprints.

According to Professor Farinaz Koushanfar and Associate Professor Duygu Kuzum of the Department of Electrical and Computer Engineering at UC San Diego, it is still too early to understand the limitations of the technology, or the depth of the potentially intrusive data collection.

Tracking neural data could mean tracking a broad range of cognitive processes and functions, including thoughts, intentions, and memories, they wrote in a joint statement sent via email. At one extreme, tracking neural data might mean accessing medical information directly. 

The broad range of possibilities is itself an issue. “There are too many unknowns still in this field and that’s worrisome,” they wrote.

If these laws become widespread, companies may have no choice but to overhaul their organizational structures, according to Koushanfar and Kuzum. They may need to appoint new compliance officers and implement measures such as risk assessment, third-party auditing and anonymization to meet the requirements placed on the entities involved.

On the consumer side, the Colorado law and any subsequent efforts represent important steps toward better educating users, as well as giving them the tools needed to monitor and exercise their rights should those rights be infringed.

“The privacy law [in Colorado] regarding neurotechnology might stand as a rare exception, where rights and regulations precede any widespread misuse or abuse of consumer data,” Pauzauskie said.