
The Fast-Moving Neuroprosthetics Frontier
Neuroethics Viewpoint

The science and technology embodied in neuroprosthetics are progressing far faster than the law can keep up. It’s not even clear whether existing law can, with suitable adjustments, handle the problems or whether we need a whole new legal approach, possibly including a bill of rights for neuroprosthetics.
That was the overriding message I took away from Jennifer Chandler, a professor of law at the University of Ottawa in Canada, who delivered the David Kopf Neuroethics lecture on November 9 at the annual meeting of the Society for Neuroscience, which was conducted virtually this year in deference to the ongoing Covid pandemic.
Her topic: How the law is evolving to deal with neuroprosthetics, a fast-moving frontier in the long-standing practice of using artificial devices to replace or enhance body parts and functions.
The idea that science and technology are progressing faster than existing law is a recurring theme in many issues I have written about, including the ethics of conducting research on brain organoids in a dish and of linking brain signals to computers, the so-called brain-computer interface.
Professor Chandler elaborated on her views in a lengthy book chapter that she made available and in emails exchanged with this columnist. Rather than answer questions about a host of evolving legal issues definitively, she wanted first to identify the central human interests at stake in the adoption of complex neuroprosthetics and the potential threats to those interests.
Although humans have a long history with prostheses of many types, neuroprostheses raise perplexing new issues. That is primarily because they stimulate the brain directly, thereby influencing its mental contents, and because their signals can be read from outside the brain, potentially allowing access to a person's thoughts.
I had long considered this a privacy issue, which it clearly is. But Professor Chandler warned that such thoughts could also provide a potentially dangerous pathway to harm an individual.
For example, she wrote, Medtronic's new deep brain stimulation device, known as Percept, captures and records brain signals through an implanted lead while simultaneously delivering therapeutic stimulation, both inside and outside the clinic.
Since many devices and prostheses are networked, they not only transmit information about the person but also offer a portal through which harm could be done. In August 2017, for example, the Food and Drug Administration warned patients that a particular implanted pacemaker had a security vulnerability that might allow an intruder to harm them through hacking. Worse yet, with wide-scale networking, an intruder could inflict such harm en masse.
It is not far-fetched to think that a neuroprosthetic device could be similarly hacked and used to harm an individual or many people at once. All it would take is one deranged malefactor bent on evil deeds. Or such harm might conceivably result from a technical malfunction, a glitch, or inadvertent human error.
Another ethical issue emerged when the initial spread of Covid-19 overwhelmed many hospitals and forced some to consider allocating scarce ventilators to the sickest patients or to those most likely to recover. In those cases, the ventilators were treated as objects that could be transferred from one patient to another. But that understandably raised alarm among chronically ill patients who relied on ventilators to breathe. As one patient explained it, “My vent is part of my body—I cannot be without it for more than an hour at the most due to my neuromuscular disability. For clinicians to take my vent away from me would be an assault on my personhood and lead to my death.”
Even before the pandemic, one bioethicist argued that a ventilator used for chronic illness was part and parcel of the person, not subject to being commandeered in a crisis.
Neuroprostheses, like ventilators, can also be critical for the physical or psychological survival of a patient. A spinal cord stimulator to ease chronic pain may be lifesaving by preventing some patients from death by suicide. And, given that memories are crucial to knowing who we are, prosthetics being explored to improve memory formation in Alzheimer’s patients would seem literally indispensable for the survival of the person.
There is no doubt that tinkering with the brain can be therapeutically valuable. Deep brain stimulation with implanted electrodes is being used to alleviate conditions such as Parkinson’s disease and to treat depression, OCD, eating disorders, aggression, and addiction.
A July 15 article in the New England Journal of Medicine describes how doctors were able to help a patient who had lost his ability to articulate words and sentences after a brain-stem stroke. They implanted a multi-electrode array over the part of the cortex that controls speech, recorded cortical activity while the participant attempted to say words—and eventually sentences—and were able to decode what he was trying to say without him actually saying it.
That raises the question of whether such imagined speech or thoughts should be made available to outsiders who might use them against a person. The possibility was raised in an Ohio case in which a man was found guilty of arson and insurance fraud after a fire in his home. He claimed to have done all sorts of vigorous things to escape the fire. Police obtained a search warrant for the electronic data stored on his pacemaker, and a cardiologist found it “highly improbable” that he could have done all the things he claimed.
Sooner or later the same issue will arise with a neuroprosthetic device. Stay tuned to see whether the law and bioethics can find a way to resolve such perplexing issues as the field of neuroscience races ahead.
—
Phil Boffey is former deputy editor of the New York Times Editorial Board and editorial page writer, primarily focusing on the impacts of science and health on society. He was also editor of Science Times and a member of two teams that won Pulitzer Prizes.
The views and opinions expressed are those of the author and do not imply endorsement by the Dana Foundation.