
Mind Over Matter: Cognitive Neuroengineering
Brain-machine interfaces—once the stuff of science fiction novels—are coming to a computer near you. The only question is: How soon? While the technology is in its infancy, it is already helping people with spinal cord injuries. Our authors examine its potential to be the ultimate game changer for any number of neurodegenerative diseases, as well as its potential impact on behavior, learning, and memory. They take the temperature of where the technology is, where it is going, and the inevitable ethical and regulatory implications.



Illustration by EGADS
Technology that is sparking an entirely new field of neuroscience will soon let us simply think about something we want our computers to do and watch it instantaneously happen. In fact, some patients with severe neurological injury or disease are already reaping the benefits of initial advances by using their thoughts to signal and control robotic limbs. This brain-computer interface (BCI) idea is spawning a new area of neuroscience called cognitive neuroengineering that holds the promise of improving the quality of life for everyone on the planet in unimaginable ways.
But the technology is not yet ready for prime time. There are three basic aspects of a BCI—recording, decoding, and operation—and progress will require refining all three. A BCI works because brain activity generates a signal—typically an electrical field—that can be recorded by a dedicated device, which feeds it to a computer whose analysis software (i.e., a decoding algorithm) "translates" the signal into a command. This command signal operates a computer or other machine. The resulting operation can be as simple as moving a cursor on a screen, for which the command need contain just X and Y coordinates, or as complex as controlling a robotic arm, which requires information about position, orientation, speed, rotation, and more.
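To make this pipeline concrete, here is a deliberately toy sketch, in Python, of the record-decode-operate loop for cursor control. Every quantity in it (the channel count, the simulated recording, the decoding weights) is invented for illustration; a real decoder is trained on each user's actual brain activity.

```python
import numpy as np

# Toy version of the record -> decode -> operate loop described above.
# The "recording" is random noise and the decoding weights W are arbitrary;
# both are stand-ins for a real recording device and a trained decoder.

rng = np.random.default_rng(0)
n_channels = 16                        # hypothetical number of recorded channels
W = rng.normal(size=(2, n_channels))   # linear decoder: features -> (x, y) velocity

cursor = np.zeros(2)                   # cursor position on the screen
for step in range(100):                # one decode step per recorded window
    window = rng.normal(size=(n_channels, 250))   # "recording": 250 samples/channel
    features = window.mean(axis=1)                # crude feature: mean amplitude
    velocity = W @ features                       # "decoding": features -> command
    cursor += 0.1 * velocity                      # "operation": command moves cursor

print("final cursor position:", cursor)
```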
Recent work from the University of Pittsburgh has shown that subjects with amyotrophic lateral sclerosis (ALS) can control a complex robot arm—having it pick up a pitcher and pour water into a glass—just by thinking about it. The downside is that the recording microelectrodes must be surgically implanted into the brain and, most importantly, that such electrodes are not reliable for more than a few years. Moreover, the computer systems needed to process the data are as yet too large to lug around. But while the technology is not quite ready for broader use, progress has been rapid, and all signs point to the ability to make miniaturized, fully implantable hardware as well as more reliable microelectrodes. When this technology is ready to aid those with serious neuropathologies outside of the clinical setting, the current risks associated with its use will have been reduced, creating a relatively safe device. This may allow a transition from restoring function to augmenting it, opening a Pandora's box of social and ethical issues that will need to be addressed.
The Hardware for Brain-Computer Control
Non-invasive versions of BCI now under development require no implantation and can control a cursor on a computer screen and perhaps operate a wheelchair. This technology utilizes electroencephalography (EEG) signals, which originate within the brain and propagate to the scalp. Since the recording electrodes are far from the source of the signal, and the signal is attenuated by the skull and scalp, the information content of the EEG signal is low compared to invasive methods. Therefore, while non-invasiveness makes EEG a good candidate for BCI, there are important tradeoffs related to signal quality that create challenges.
To compensate for this limited information, an EEG-based BCI derives its command from an indirect signal rather than from neural activity directly tied to the intention to move. These indirect signals are generated by synchronous brain activity unrelated to movement. For example, when the user of a BCI looks at a flashing stimulus on a computer screen, cells in the visual cortex will fire in synchrony with the flashes of light, generating an electrical oscillation that can be recorded at the scalp. By having the user look at different visual stimuli flashing at different frequencies, oscillations with different frequencies can be induced, which can then be used as control signals for selecting different commands (e.g., move cursor left or move cursor right).
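A minimal sketch of how this kind of frequency-tagged command selection might work, assuming two stimuli flickering at 10 Hz and 15 Hz and a simulated two-second scalp recording; an actual system would operate on real EEG with calibrated detectors.

```python
import numpy as np

# SSVEP-style selection: the scalp signal oscillates at the frequency of the
# stimulus the user attends to; comparing spectral power at the candidate
# frequencies reveals the intended command. All parameters are assumed.

fs = 250                                   # sampling rate, Hz (assumed)
t = np.arange(fs * 2) / fs                 # two seconds of signal
commands = {10.0: "move cursor left", 15.0: "move cursor right"}

# Simulate EEG: a weak 15 Hz oscillation buried in noise (user looks "right").
rng = np.random.default_rng(1)
eeg = 0.5 * np.sin(2 * np.pi * 15.0 * t) + rng.normal(scale=1.0, size=t.size)

# Decode by comparing power at the two candidate frequencies.
spectrum = np.abs(np.fft.rfft(eeg)) ** 2
fft_freqs = np.fft.rfftfreq(eeg.size, d=1 / fs)
power = {f: spectrum[np.argmin(np.abs(fft_freqs - f))] for f in commands}
chosen = max(power, key=power.get)

print(f"dominant frequency: {chosen} Hz -> command: {commands[chosen]}")
```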
While non-invasive BCI has its uses, the signals recorded through more invasive systems have far higher temporal resolution and quality, and so can carry considerably more information. Such systems, currently available only in clinical studies, consist of microelectrode arrays that, when surgically inserted into the brain, can record the activity of hundreds of neurons simultaneously. By allowing rich characterization of neuronal activity, this kind of BCI system can, for example, be used to control robot arms that require commands far more sophisticated than left or right movements.
Typically, arrays are implanted into brain areas involved in motor control, to record the activity of neurons related to the patient's intention to move. Decoding algorithms use this information to generate command signals that operate a device that carries out the intention (e.g., pick up a pitcher and pour water). The greater information content of invasive recordings makes it possible to control the device with true neural motor commands, rather than the surrogate brain signals used in non-invasive BCI.
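One classic way to read an intended movement direction out of such a population recording is the population-vector method, sketched below with simulated, cosine-tuned neurons. The tuning model and every parameter are illustrative assumptions, not the specifics of any clinical decoder.

```python
import numpy as np

# Population-vector decoding: each neuron fires most for movement in its
# "preferred direction"; summing preferred-direction unit vectors weighted
# by firing rate recovers the intended direction of movement.

rng = np.random.default_rng(2)
n_neurons = 100
preferred = rng.uniform(0, 2 * np.pi, n_neurons)   # preferred directions (radians)

intended = np.pi / 4                               # movement the subject intends
# Cosine tuning: rate peaks when intended matches the preferred direction.
rates = 10 + 8 * np.cos(intended - preferred) + rng.normal(scale=1.0, size=n_neurons)

px = np.sum(rates * np.cos(preferred))             # firing-rate-weighted sum of
py = np.sum(rates * np.sin(preferred))             # preferred-direction vectors
decoded = np.arctan2(py, px)

print(f"intended: {np.degrees(intended):.1f} deg, decoded: {np.degrees(decoded):.1f} deg")
```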
While most BCIs are used to control an external device, it is also possible to feed information back into the nervous system in a closed-loop system—a potentially powerful strategy for modulating neural activity. In one animal experiment, a monkey was trained in a reach-to-grasp task, and a decoder was developed that transformed the neural activity into command signals for reach and grasp. When the corresponding motor nerve was anesthetized, preventing limb movement, the decoder detected the neural “move” signal and stimulated the appropriate muscles directly, bypassing the anesthetized nerve and allowing the animal to reach out. This type of approach has more recently been used by researchers at the University of California, Davis, to bypass spinal cord injuries, allowing users to move limbs that had been disconnected from the brain.
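In schematic form, such a muscle-reanimation loop amounts to thresholding a decoded intention signal and stimulating when it crosses. The sketch below uses an invented stand-in for the decoder output; real systems decode intent continuously and grade the stimulation.

```python
import numpy as np

# Closed-loop reanimation sketch: when the decoded "grasp" drive crosses a
# threshold, deliver functional electrical stimulation to the muscle,
# bypassing the blocked nerve. All signals and numbers are invented.

rng = np.random.default_rng(3)

def decoded_grasp_drive(t):
    """Stand-in for a decoder output: intent ramps up after t = 50."""
    return (0.0 if t < 50 else 1.0) + rng.normal(scale=0.1)

THRESHOLD = 0.5
for t in range(100):
    drive = decoded_grasp_drive(t)
    if drive > THRESHOLD:
        amplitude = min(drive, 1.0)   # scale stimulation with decoded intent
        print(f"t={t}: stimulate muscle at amplitude {amplitude:.2f}")
        break                         # a real loop keeps stimulating while intent persists
```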
Beyond Motor Control
The application of BCI to motor problems is certainly an exciting and active area of research with numerous applications. But the implications go well beyond motor control. BCI systems record and decode neural signals and produce an effector signal; by varying the nature of that effector signal, we can greatly expand the scope of BCI. As suggested above, electrically stimulating the brain has the potential to treat diseases caused by aberrant brain activity, such as epilepsy and Parkinson's disease (PD).
In PD, deep brain stimulation (DBS) has emerged as a widely accepted treatment, helping over 80,000 patients in the US alone. In its current form, DBS is not a BCI because the implanted electrodes deliver constant stimulation without any control signal. But the same electrodes can also record deep brain activity, which might then be used to modify stimulation and improve the effectiveness of the technique. Taking this approach a few steps further opens the possibility of continuously adjusting stimulation parameters (amplitude, frequency, etc.) in accordance with ongoing brain activity, in effect closing the loop to optimize the outcome. The current practice—in which the doctor tests the effects of different stimulation parameters on disease symptoms through a slow trial-and-error process that can take months—could thus be greatly accelerated by BCI.
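The logic of closing that loop can be sketched as a simple feedback controller: measure a biomarker, compare it with a target, and nudge the stimulation amplitude accordingly. Everything below (the biomarker model, target, and gain) is invented for illustration; a clinical system would use validated biomarkers and safety limits.

```python
import numpy as np

# Cartoon of adaptive DBS: stimulation amplitude is adjusted up when the
# recorded biomarker (here, a stand-in for beta-band power, which tracks PD
# motor symptoms) is above target, and down when it is below.

rng = np.random.default_rng(4)
TARGET = 1.0       # desired biomarker level (assumed)
GAIN = 0.5         # adjustment aggressiveness (assumed)
amplitude = 0.5    # stimulation amplitude, arbitrary units

for step in range(6):
    # Stand-in measurement: stronger stimulation suppresses beta power.
    beta_power = max(0.0, 3.0 - amplitude) + rng.normal(scale=0.05)
    error = beta_power - TARGET
    amplitude += GAIN * error          # raise stimulation if biomarker too high
    print(f"step {step}: beta={beta_power:.2f}, amplitude={amplitude:.2f}")
```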
Epilepsy is another brain pathology that could greatly benefit from dedicated BCIs. It is not a movement disorder, but rather a neurological pathology that produces periods of aberrant brain activity, characterized by hypersynchronous oscillations, known as seizures. The latest clinical devices use closed-loop designs that can detect brain activity preceding an epileptic seizure and respond by triggering an electrical stimulation that aborts the developing hypersynchrony, preventing the seizure before it takes hold. This reactive strategy, in which the device is activated only when it detects changes in brain signaling, is at least as effective as continuous stimulation. Moreover, intermittent stimulation significantly extends battery life, an important practical improvement, and work is ongoing to gain additional benefits from this closed-loop approach.
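In caricature, such a responsive device keeps computing a cheap statistic over a sliding window of the recording and stimulates only when the statistic crosses a threshold. In the sketch below, windowed signal power stands in for the hypersynchrony detector; real devices use patient-specific detectors and thresholds.

```python
import numpy as np

# Responsive-stimulation sketch: 4 s of normal background activity followed
# by a large rhythmic (seizure-like) oscillation; stimulation is triggered
# only when windowed power jumps. All signals and thresholds are invented.

rng = np.random.default_rng(5)
fs = 250
background = rng.normal(scale=1.0, size=fs * 4)        # 4 s of normal activity
t = np.arange(fs * 2) / fs
seizure_like = 5 * np.sin(2 * np.pi * 8 * t)           # hypersynchronous oscillation
signal = np.concatenate([background, seizure_like])

WINDOW = fs // 2                                       # half-second analysis windows
THRESHOLD = 5.0                                        # power threshold (assumed)
for start in range(0, signal.size - WINDOW + 1, WINDOW):
    power = np.mean(signal[start:start + WINDOW] ** 2) # crude hypersynchrony biomarker
    if power > THRESHOLD:
        print(f"t={start / fs:.1f}s: hypersynchrony detected -> stimulate")
        break                                          # device then resumes monitoring
```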
Beyond motor deficits and hypersynchrony, many neurological and psychiatric disorders present cognitive symptoms, such as deficits in attention, memory, planning, and decision-making. In fact, cognitive deficits often accompany neuropathologies for which neurotechnological approaches are already being developed. In PD, for example, patients typically have cognitive as well as motor problems, including decision-making and learning deficits. This is likely because the brain structures severely affected in PD include those implicated both in motor control and in reinforcement learning and decision-making, such as the basal ganglia. A BCI for PD based on neural activity in this area may thus improve cognitive along with motor function. This concept has far-reaching applications in the new field of cognitive neuroengineering.
Epilepsy is similarly accompanied by cognitive decline. Colleagues at the University of California, Davis, have suggested that the same mechanisms that go awry in epilepsy also impair cognitive abilities. If so, stimulating some of the brain locations targeted to prevent seizures might also address the cognitive manifestations of the disorder. In animal models, stimulation of certain deep brain structures has in fact been shown to attenuate both cognitive deficits and seizure-type activity. These examples suggest the possibility of a cognitive prosthetic: neurotechnology that restores cognitive abilities through recording and stimulating within the brain.
Cognitive neuroprosthetics aimed at reducing aging-associated deficits represent another potential application of BCI technology. Plasticity in the brain declines with age, impairing the ability to learn. It has been suggested that brain stimulation at one location, controlled by activity recorded at another, could strengthen the association between the two locations, restoring learning. This approach is under development in a brain-injured rat model, with the goal of improving outcomes after stroke. Similarly, it has been suggested that auditory closed-loop stimulation, phase-locked to slow oscillations during sleep, might enhance memory.
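The activity-dependent pairing idea reduces to a simple protocol: detect an event at one site, stimulate the other after a fixed short delay. The sketch below, with invented spike times and delay, shows only that timing logic.

```python
import numpy as np

# Paired, activity-dependent stimulation: every detected spike at site A
# triggers stimulation of site B a few milliseconds later, favoring
# spike-timing-dependent strengthening of the A -> B association.
# Spike times and the delay are invented for illustration.

rng = np.random.default_rng(7)
DELAY_MS = 5                                            # A-spike to B-stim delay (assumed)

spike_times_a = np.sort(rng.uniform(0, 1000, size=20))  # A spikes over 1 s (ms)
stim_times_b = spike_times_a + DELAY_MS                 # triggered stimulation at B

for spike, stim in zip(spike_times_a[:3], stim_times_b[:3]):
    print(f"A spike at {spike:.1f} ms -> stimulate B at {stim:.1f} ms")
```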
Perhaps even more important than learning difficulties in an aging population is memory decline, another area that cognitive neuroengineering hopes to address. The vision is to bypass brain locations normally essential for storing and recalling memories and to store the memories on an implanted chip instead. Epilepsy patients are contributing to this new science, since brain implants are already used in their treatment.
Much of this early work is focused on understanding where and when to apply stimulation. For example, tests are ongoing with structures deep within the brain, including the seahorse-shaped hippocampus, which is involved in the creation of memories. Based on a small study, one group from the University of Pennsylvania suggests that they can restore the memory capacity of a 43-year-old brain to that of a 25-year-old, simply by stimulating the correct structures.
Researchers at Wake Forest University have shown similar results. While subjects memorized a list of words, the researchers recorded the subjects’ neural activity. When they replayed that signal back into the brain, the subjects could recall 35 percent more words than without stimulation.
Neuromodulation to improve attention deficits, poor decision-making (in addiction, for example), and mood is also on the horizon. The first step toward these applications will be a better understanding of how the brain performs these fundamental cognitive tasks, an area in which basic neuroscience is making great progress. Advances in these areas are sorely needed: while medications for neurological and psychiatric disorders can improve many symptoms (e.g., hallucinations), they rarely alleviate cognitive deficits. There is hope that developing technology will lead to implantable devices that, by directly interfacing with the relevant systems, will accomplish this goal.
Perceptual decision-making, in which choices are based on sensory information, is a good example. Computational models can account for the accuracy of such decisions and for how much time is needed to make up one's mind, as well as clarify decision-related activity recorded from the brain. We are beginning to understand the algorithms the brain uses to solve such cognitive problems. Inactivation studies shed light on which areas of the brain are critical for particular aspects of the decision-making process. Artificial manipulation of brain activity provides insights into which neural populations contribute to decision-making, and in what way. Even the history of individual decisions can be decoded from invasively recorded neural activity.
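A widely used model of this kind is the drift-diffusion model: noisy evidence accumulates over time toward one of two decision bounds, and the model jointly predicts both choice accuracy and decision time. A minimal simulation, with illustrative rather than fitted parameters:

```python
import numpy as np

# Drift-diffusion model of a two-choice perceptual decision: evidence drifts
# toward the correct bound (+bound) at a rate set by stimulus strength, with
# added noise; the first bound crossed determines choice and decision time.

rng = np.random.default_rng(6)

def simulate_trial(drift=0.2, bound=1.0, dt=0.01, noise=1.0):
    evidence, t = 0.0, 0.0
    while abs(evidence) < bound:
        evidence += drift * dt + noise * np.sqrt(dt) * rng.normal()
        t += dt
    return ("correct" if evidence > 0 else "error"), t

results = [simulate_trial() for _ in range(1000)]
accuracy = np.mean([choice == "correct" for choice, _ in results])
mean_rt = np.mean([t for _, t in results])
print(f"accuracy: {accuracy:.2f}, mean decision time: {mean_rt:.2f} s")
```

Stronger stimuli correspond to larger drift values, which the model correctly predicts will yield decisions that are both faster and more accurate.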
The same is true of value-based decisions, in which choices depend less on sensory information than on the assignment of value to available options (think of choosing an ice cream flavor). Computational models of choice, together with invasive recording methods, are shedding light on how these cognitive processes are implemented in the brain and opening the possibility of modulating specific aspects of decision-making. For example, neural activity in the orbitofrontal cortex distinctly represents estimates of the probability and risk of upcoming decisions, as well as learning signals indicating whether the actions taken produced a positive outcome or whether other courses of action would have been superior (i.e., regret).
It is conceivable that BCIs might modulate our choices by changing how we make decisions (e.g., making us more rational, willing to wait for a better outcome rather than choosing a suboptimal one now), the rate at which we learn from the resulting outcomes, or even the impact of decisions on our mood. As in the examples above, development of these BCIs could have therapeutic implications for mental disorders in which choice is affected, such as addiction or obsessive-compulsive disorder.
BCI to Augment Cognition
Currently, the risk/reward ratio of prototypical record-decode-stimulate BCI devices restricts them to those with severe neurological injury or disease. But as technology for neural prosthetics in general, and cognitive prosthetics in particular, becomes increasingly safe and reliable, the question will inevitably arise of when such devices will be safe enough for individuals without compelling medical needs, and what the limits of their application should be. If we can normalize pathological brain activity, can (or, indeed, should) we push beyond that, into the realm of augmented cognition?
Continued use of these devices to restore lost function will reduce the risks associated with them. If those risks can be driven sufficiently low, the benefits could soon far outweigh any that remain. If we develop the ability to rescue memory deficits in Alzheimer's patients, could we design memory prostheses that would allow the cognitively intact to learn more quickly and remember more accurately? If we can help addicts make better choices about their drug habits, should we design choice prostheses that would allow us to effortlessly go on a diet? The potential applications are far-reaching.
A reasonably compelling argument could be made for reserving these devices for those working in high-consequence environments. Astronauts would have a tremendous advantage if they could control tools and robots just by thinking, rather than having to navigate control panels in complex capsules. Soldiers and air traffic controllers could keep their hands free by controlling devices or managing workload through their thoughts.
Such intermediate applications (not therapeutic, but not indiscriminately available) may pave the way to more widespread adoption and acceptance of cognitive augmentation. As recording and stimulation methods improve, and as these devices become safer, consumers will demand access. Testimony to BCI's enormous potential is the interest it has aroused in powerhouse companies and entrepreneurs: Elon Musk's Neuralink and the start-up Kernel, as well as units within Facebook, Google, Nissan, and IBM. The goals of these companies are diverse. Nissan's Brain-to-Vehicle technology is focused on improving reaction time and reducing discomfort while driving. Facebook's efforts are aligned with the objectives of neuromarketing—measuring and optimizing consumer attention and engagement. For others, enhanced cognition is the ultimate goal.
Clearly, the race to develop widely available BCIs is on. But numerous important challenges, beyond technological barriers, just as clearly need to be addressed. Risk-benefit analyses will be necessary: we need to better understand the societal impact of allowing individuals to alter their cognitive abilities. One issue is that new devices, which are certain to be expensive at first, will likely exacerbate wealth inequality by providing new opportunities to those who can afford them.
The ability to plug our thoughts and actions into a computer also raises serious concerns about privacy. Cyber security will be of prime importance: skilled hackers might well develop the ability to read from and tamper with BCIs, targeting one's very thoughts. Adjudicating responsibility for untoward results between the user and the device (or the engineering team or manufacturer behind it) will be of great legal importance, much as with current uses of artificial intelligence (e.g., self-driving cars). The very definition of personhood may need to be revised.
To tackle these issues, it will be necessary to develop a regulatory framework that addresses ethical and legal questions, lest standards be driven solely by technological capability (witness the Facebook privacy scandal). In any case, there is little doubt that the advances achieved and underway mean we are almost certainly headed toward a BCI-wired society in which the boundaries between brain and machine are increasingly blurred. Scary? Perhaps. But if we succeed in building a regulatory framework that keeps pace with fast-moving technological change, we will have the opportunity to improve human cognitive abilities and create a better society—perhaps a society of cyborg citizens.
Financial Disclosure: The authors have no conflicts of interest to report.