Ethical Consumer Neurotechnologies

Report from International Neuroethics Society annual meeting
Seimi Rurup
November 13, 2017
Neurotechnologies are revolutionizing how certain illnesses are treated and prevented. As these technologies continue to evolve, doctors and patients alike must consider the ethical quandaries that arise with the use of brain-interfacing devices.

“We are at a place where we are unlocking more and more data about peoples’ brains and behaviors, and developing more ways of affecting our brains,” neuroethicist Karola Kreitmair said in an interview with the International Neuroethics Society (INS) back in August. “It’s important that we have an ethical actor at the table to shape that future.”

Kreitmair was this year’s Rising Star Plenary Lecturer at the INS meeting, following a panel presentation on the ethics of neuroscience and neurotechnology. She addressed shared concerns brought up by the three panelists in her lecture, “The Seven Requirements for Ethical Consumer Neurotechnologies.”

Based on emerging literature, including some of her own research, she created the comprehensive set of principles to guide the development of consumer technologies that contribute to human enhancement. These include brain-computer interface (BCI) devices, neurostimulation devices, virtual reality systems, wearables, and smartphone apps. It’s important to note that Kreitmair’s focus on consumers excludes pharmacological interventions, clinical research, and military or government use of these neurotechnologies.

  1. Safety

Given the intimate nature of technology interacting with the brain, consumer neurotechnology must be safe in two regards: safe with respect to its intended use, and safe against malicious actors and cybersecurity threats. Cybersecurity vulnerabilities have the potential to cause physical harm to users, she warned, and therefore the most rigorous standards must be employed. A recent example of cybersecurity gone wrong is the recall of 465,000 pacemakers after it became known that they were vulnerable to hacking.

Another technology with potential safety concerns is transcranial direct current stimulation (tDCS), which can be used to treat ailments such as depression and chronic pain, Kreitmair said. “There are concerns about the risks of enhancing one area of the brain at the expense of another,” and there is evidence that tDCS devices can cause external physical harm, such as skin burns, she said.

She also expressed great interest in mobile and wearable mental health technology popular among consumers of all ages. These include mood-enhancing games and trackers that monitor location, sleep, heart rate, and activity. Using these apps regularly increases new media screen time, which has been linked to depression and suicide, she said.

  2. Veracity

Consumer neurotechnologies must not promise results they can’t deliver. “Ideally, if it were up to me, I would think that all consumer neurotechnologies should actually provide a valuable benefit. But I realize for consumer products that’s too high of a bar,” she quipped. “But I don’t think it’s too high of a bar to require honesty with respect to the value that technology does provide, so that the consumer can decide if this is a benefit they actually want to pursue.”

  3. Privacy

Neurotechnology devices have access to massive amounts of highly sensitive data about users: brainwave data. “This is a vast explosion of neurodata, and there really is need for regulation.” In the meantime, users have a right to keep their brain data private, she said.

  4. Epistemic Appropriateness

According to Kreitmair, wearable tracking technology that records brain activity data is coming under heavy scrutiny by some researchers because of emerging evidence that says, “focusing on the quantified experience and measurements of an experience diminishes the inherent enjoyability of that experience.” She added that tracking and focusing on external means of self-knowledge is counterproductive to being in the moment and can lead to an alienation from our sense of existence.

  5. Existential Authenticity

Existentialism is all about the idea of self-fashioning, and that happens in the context of real-life experiences, she said. For children who consistently use virtual reality devices, she asked, “What happens if we do this fashioning of ourselves on the basis of inauthentic experiences?”

  6. Just Distribution

If these neurotechnologies provide value, Kreitmair said, then they must be distributed fairly, regardless of socioeconomic status: a technology's value, not a user's means, should dictate its accessibility. She cited the example of the internet and its uneven distribution across populations, and warned that this situation must not be replicated with neurotechnology.

  7. Oversight

Consumer neurotechnology must be subject to oversight that addresses the previous six dimensions. “Of course, oversight mechanisms already exist. In the US, we have the Food and Drug Administration (FDA).” But many of these new technologies are not regulated by the FDA. Kreitmair concluded:

Stakeholders need to develop industry guidelines…[They] need to make judgment calls on where along the six dimensions…the threshold should fall. Anything below the threshold, a given consumer neurotechnology should not be made available. The stakeholders are users, parents, developers, medical experts, cybersecurity experts, and people like us—neuroethicists.