Bio-Ethicists Call for Regulation on Emerging "Neurotechnologies" Industry

A new report calls for tighter control over the rapidly growing market.
Jessica Miley

There has been a surge of "neurotechnologies" on the market in the last two years. These devices and apps claim to be able to diagnose mental states, improve cognition and even read emotions. 

A new study published in the journal Science suggests that much of the industry is not supported by science and that the apps and devices could pose health risks. 

The authors of the study, two bioethicists from Penn Medicine and the University of British Columbia, call for the creation of a working group to monitor the growing industry. The neuroscience technologies industry is expected to reach $3 billion by 2020. 

Public desperate for truth

"There's a real thirst for knowledge about the efficacy of these products from the public, which remains unclear because of this lack of oversight and gap in knowledge," said lead author Anna Wexler, PhD, an instructor in the department of Medical Ethics and Health Policy at the Perelman School of Medicine at the University of Pennsylvania. 

"We believe a diverse, dedicated group would help back up or refute claims made by companies, determine what's safe, better understand their use among consumers, and address possible ethical concerns." 

Equipment may be hard to make consumer-friendly

One of the problems seems to lie in the translation of research-grade equipment, such as electroencephalography (EEG) devices, into consumer products that are only loosely based on science. 

The gap between what these devices claim to do and what they can actually do is the space the authors of the paper hope to fill. 


In addition to making false claims about what they are technologically capable of, the devices may also pose physical risks such as skin burns.

More extreme examples of the dangers posed by these new devices include the potential psychological harm from consumer EEG devices that purport to "read" one's emotional state. 

"If a consumer EEG device erroneously shows that an individual is in a stressed state, this may cause him or her to become stressed or to enact this stressed state, resulting in unwarranted psychological harm," the authors wrote.

Some of the associated apps also claim the ‘ability’ to diagnose serious mental health problems such as depression, which, without proper medical support structures, has potentially drastic consequences. 

Loopholes in regulation need scrutiny

The authors say the industry has been able to grow so fast due to large loopholes in the regulations. Many of the new devices aren't required to gain FDA approval because they are categorized as "low-risk" wellness products.


Investors in such technologies have also publicly stated that companies would have less chance of receiving external funding for products that did require FDA approval. 

Currently, most of the regulatory burden for consumer neurotechnology falls to the Federal Trade Commission (FTC), which is in charge of monitoring consumer devices that make false claims. 

However, due to the sheer number of products that potentially fall under this umbrella, the study’s authors say this is an inadequate mode of regulation.

"Given that government agencies and private enterprises are actively funding research into new methods of modulating brain function," the authors wrote, "the present generation of [direct-to-consumer] neurotechnologies may be only the tip of the iceberg--making it all the more imperative to create an independent body to monitor developments in this domain,” they say.
