
A Revolution in Brain Literacy
Anecdotes, surveys, and suggestive statistics are our only gauges of changing public perceptions, but all these instruments agree that in the past decade there has been a dramatic surge in public knowledge about the brain and support for brain research.
Neuroscientist and International Brain Bee founder Norbert Myslinski marshals the many indicators of change, asks what made such change possible, and looks at complex developments such as stem cell transplants, self-replicating robots, and DNA profiles—some already controversial, others likely to become so—that will put public “brain literacy” to the test.
The revolution may have begun one spring evening in 1990 in the president’s private quarters at the White House. Earlier that day, First Lady Barbara Bush had been the keynote speaker at a neuroscience symposium, where it came to her attention that her husband had not yet signed the proclamation declaring the 1990s as the Decade of the Brain. The next day, she returned to the symposium with the proclamation in hand, proudly announcing that she had persuaded George to sign it. Of course, a formal signing came later, but that was the way it came to pass.
The Decade of the Brain saw unprecedented advances in how we study the brain, understand its function, and treat its dysfunction and disorders. We have new imaging technology to view changes in the brain as it works. New experiments show that with brain impulses alone, we can operate robotic arms, and we implant microcircuits in the brain to take the place of dysfunctional ears and eyes. We have drugs to dissolve blood clots in the brains of stroke victims. We have vaccines to prevent an Alzheimer’s-like disease in animals. We have implanted modified cells into the brains of Parkinson’s disease patients to try to restore their motor control.
But what about the public’s knowledge of the brain and perception of brain research? While scientists and reporters tally research breakthroughs and hikes in research funding that have made them possible, what about the voters who pay for the research, the lawmakers who regulate it, the patients who benefit from it, and the children among whom are our future neuroscientists?
The answer, I believe, is that we are amid a revolution in public understanding that is as sweeping and pivotal to how we live as the explosion in scientific knowledge of the brain itself.
Navigating the New World of Neuroscience
The Decade of the Brain spoke to scientists of their obligation to communicate to the public what we do, to make our views heard regarding how our discoveries are applied, and to help society with the ethical and legal implications of our discoveries. Neuroscience is not only for the researcher or victim of a rare brain disease. Every one of us—salesclerk and senator, social worker and soldier—can benefit from knowledge of the brain. Daily we make decisions involving the brain, decisions touching on memory, sleep, depression, medications, aging parents, hearing aids, or our child’s development. To do so intelligently, we must navigate the new world of neuroscience.
Science education for Americans is an enigma. We are attracted to science, yet often feel estranged from it. The 1999 Third International Mathematics and Science Study, conducted by the National Center for Education Statistics, indicated that American students fall behind their international peers in those two subjects after the fourth grade.1 By the time we reach advanced math and science, we rank second from last, beating out only Austria.
The science information we receive after our years of formal schooling is little better. In a survey conducted at Vanderbilt University and published in a 1998 report, Worlds Apart, neither scientists nor journalists thought the media did a good job of explaining science to the public. Science articles that are based upon, or are even lightly edited versions of, the press releases that pour forth from research institutions may be misleading; press offices exist, after all, to convince the press that their institution’s news is important. Journalists often have a poor understanding of science and its implications and may be baffled by statistics, measurement of risk, and the peer-review process.2
In a sense, “news”—reporting something new—has an inherently uneasy relationship with “science”—the gradual process of attaining understanding. The brain does not yield its secrets easily. Scientists spend years on research, but only their published results make it into the news. For scientists, publication is not the end of a project; it is the beginning of retesting by many other scientists. The more often the experiments are successfully repeated, the more confidence we can have that the results are true. As Einstein said of a scientific claim, “a single experiment can, at any time, prove it wrong.”
We do not all need Ph.D. degrees in brain science, but we do need to comprehend some basic concepts and the latest discoveries so we can make intelligent choices. These real choices of today involve medical treatments, rehabilitation, child development, education, violence, addiction, stress, aging, and the environment.
We cannot allow our only sources of information and influence to be advertising, tabloids, and one-line or one-minute summaries in the news. Decisions of the future will involve increasing our life spans, creating new life forms, and even implanting computer chips in our brains. If we do not become sufficiently informed about the brain, we will lose the unprecedented opportunities and benefits this era has to offer.
Stereotypes and Suspicions
Until the mid twentieth century, neuroscience was not even a separate scientific discipline, and until recently many people had no idea that afflictions ranging from stroke to autism to blindness could be considered brain diseases. Instead, with its intrinsic complexity, its unfulfilled promises, and its image in the media of mad scientists and crazy patients, brain science tended to be seen in a negative light.
The human brain is only three pounds of wrinkled gray and white matter, but it is more complicated than anything else we know in the universe. It has one hundred billion nerve cells that communicate among themselves through a quadrillion connections and with the rest of the world through our senses and muscles. How can anyone ever understand how this translates into a mother’s love, a great symphony, or a problem in learning to read? The public has long viewed the brain as defying understanding, and regarded scientists who try to understand it with indifference, suspicion, and even hostility.
Where do such attitudes come from? Sometimes the virtues of new brain discoveries have been oversold and their drawbacks understated. Remember the frontal lobotomies of the 1950s? The birth defects caused by thalidomide? The mind-altering drugs of the 1960s? What about psychoanalysis, which, not without reason, came to be viewed as counter-intuitive, expensive, time-consuming, and even resistant to attempts to measure its results? Psychoactive drugs such as Miltown and Valium were heralded as solutions to problems—only to end up often leading to abuse and addiction. Think, too, of the mapping of the human genome and the publicity that attended it. Most of the public, whose knowledge has not gone beyond the one gene-one trait level, will be disappointed if the brain disorder that runs in their families is not cured straightaway.
Public perception is also shaped by the movies. A 1994 study analyzed more than 50 movies portraying brain science and found that “Hollywood unfairly stereotypes scientists as crazy, elitist, psychotic, egotistical, nerdy, unkempt, out-of-touch, asocial leeches.”3 According to an article in The Scientist, this results in part from the attitude of some authors and screenwriters that “popular fiction about science must necessarily be sensationalistic… Accuracy…is never the most important value.”4
Consider how science is portrayed in the 1990 movie Awakenings. What is being conveyed? That science is grandiose, ridiculously expensive, obsessed with bizarre phenomena, out of touch, and confidently impervious to common sense.
An Explosion of Interest
Public awareness of science flared up with the dropping of the atom bombs in 1945, was sustained by publicity surrounding the new polio vaccine in the 1950s, and powerfully reinvigorated by space travel in the 1960s. Looking back now at the changes during the Decade of the Brain, we see a similar explosion in communication about brain research and the public’s awareness of its excitement. And we see the effects in everything from research funding to perceptions of the mentally ill, from media coverage to interest in neuroscience as a career.
The United States is a model for the world in its public support for all aspects of biomedical research, including brain research. Because the public has become excited about the advances in science, including neuroscience, and has communicated this to Congress, there were 15 percent increases in 1999, 2000, and 2001 in the budget for the National Institutes of Health, compared to 7 percent increases in 1997 and 1998. For the fiscal year 2000, NIH spent $3.4 billion on neuroscience-related research; almost every constituent institute has some part of its program devoted to neuroscience.
Neuroscientists have also radically changed their own culture. Traditionally, scientists believed that communicating science to laymen was not only inappropriate and unnecessary, but to some degree impossible. A good scientist did not combine objectivity with excitement. For example, many scientists regarded Carl Sagan with suspicion, even with disdain, for his books and television series that brought astronomy to millions of people. He was denied membership in the prestigious National Academy of Sciences. Those who did attempt to speak out about science (Gerald Holton at Harvard University comes to mind, but there were others) did so only from the security of tenured positions toward the end of their careers.
Today, most scientists would acknowledge that we need effective spokespeople to justify research funding to politicians and taxpayers. Neal Lane, past director of the National Science Foundation, coined the phrase “civic scientist” to characterize such people. Many modern communicators of neuroscience, like Floyd Bloom, are even respected for it. His excellent reputation as an investigator helps, to be sure; winning a Nobel Prize, as Eric Kandel did, is even better. More young neuroscientists, too, are attempting to convey the excitement of their work to the public, and becoming excellent role models for aspiring science students.
Another indicator of the success of the Decade of the Brain is that the public increasingly views mental illness as a physical dysfunction of the brain, not a matter of choice, not a character defect, and not (as a few psychiatrists have argued) an arbitrary label that society puts on undesirable behavior. During the Decade of the Brain, Laurie Flynn, executive director of the National Alliance for the Mentally Ill, used colorful PET scans to make her case for laws banning discrimination against people with mental illness. When lawmakers saw the differences between the brain scans of people with and without mental illnesses, they could understand that mental illness is a physical disease of the brain, just as heart disease is a physical disease of the heart, and lung cancer is a physical disease of the lungs. In contrast, as recently as 20 years ago the main theory about the cause of schizophrenia was cold and distant mothering, a concept from Freudian psychology.
Most medical doctors, insurance companies, and lawmakers no longer consider people with these disorders second-class citizens. They realize that mental illness has a terrible impact on individuals, families, and national productivity, and that disorders expressed through our mental lives are no less real, no less painful, and no less worthy of our attention, expertise, and treatment than diseases with visible physical signs. Indeed, the borders between neuroscience, neurology, psychology, and mental health are also disappearing; some of the hottest topics at the 2000 Society for Neuroscience meeting were drug abuse, aging, pain, stress, learning, motivation, and psychiatric disorders.
There may not yet be a Super Bowl of neuroscience, viewed by millions, but Americans in increasing numbers attend fairs and exhibits related to neuroscience. The National Science Board’s Science and Engineering Indicators 2000 reports that 9 out of 10 American adults are “moderately interested” or “very interested” in new scientific discoveries, inventions, and technologies, including those related to neuroscience. Museums get the science right, make it engaging, and bring it to the public with self-paced learning experiences, such as “The Illuminated Brain” shown at the Laser Lightspeed Theater of the Maryland Science Center in Baltimore. At the California Science Center in Los Angeles, “Body Works” features a 50-foot see-through woman named Tess that teaches about the nervous system and other organs of the body. The Franklin Institute Science Museum in Philadelphia keeps a permanent hands-on neuroscience exhibit called “It’s All in Your Head”; and the Smithsonian Institution in Washington, DC, has opened a new exhibit called “BRAIN: The World Inside Your Head.” A 1995 report of the Association of Science and Technology Centers showed annual attendance at such exhibits approaching 100 million—about as many people as attended all major U.S. sporting events combined in 1994-1995.5
Media portrayal of science is also improving. As recently as the early 1990s, stories on brain science were rare, usually reporting a big research breakthrough. Now articles appear almost daily on the front page as well as in the science section. Newspapers such as the New York Times have reporters like Sandra Blakeslee who specialize in reporting on the brain, and newsweeklies run cover stories on brain issues. J. Raymond DePaulo, M.D., a psychiatrist at Johns Hopkins University School of Medicine, comments that a decade ago reporters writing stories on the brain never “got it right” and were viewed as the bane of the scientist’s existence. Today, he says, mistakes and misrepresentation are the exception.
Before the 1990s, most science books were not popular; books about the brain and brain research were still less successful. Now, brain experts such as Oliver Sacks have been able to reach the public with their understandable, compelling, and thought-provoking work about the brain. TV has also been a stimulus to changing the attitudes of the public. The realistic spark that started with Watch Mr. Wizard in the 1950s has grown into a multimillion-dollar industry involving science programs on public television and even dedicated TV cable channels, such as Discovery, that reach a wide audience.
On the Internet, there are hundreds of brain-related websites from medical schools, government agencies, professional organizations, patient advocacy groups, and independent groups. They help everyone from researchers to elementary school children access information about the brain.
Neuroscience has even become a more popular career choice since the beginning of the Decade of the Brain. The 1998 Survey of Neuroscience Graduate and Postdoctoral Programs, conducted by the Association of Neuroscience Departments and Programs, reveals that applications per Ph.D. program increased significantly.6 Membership in the Society for Neuroscience now exceeds 28,000. Women and minorities are more frequently choosing science careers; and several prominent female scientists, including Carla J. Shatz and Patricia Goldman-Rakic, have served as president of the Society for Neuroscience. Neuroscience has solidified as a distinct discipline: Most neuroscience training is now in free-standing programs, instead of being an amalgam of other programs.
The Road to Success
The road to success opened when some neuroscientists finally realized that they had to be their own publicists for brain science; they could not leave it up to Hollywood, books, or even schoolteachers. The Decade of the Brain campaign rested on the vigorous grassroots efforts of these scientists, who volunteered in classrooms, ran workshops for teachers, created exhibits at museums, and helped fashion curricula. They spoke to church groups, teacher associations, retirement communities, corporate boards, lawmakers, reporters, and even prison inmates. Sharing their passion for brain science, they talked about how their research spelled hope for future treatments and cures, and how it contributed to economic prosperity. They took hard science and made it relevant and human, applying it to everyday situations involving emotions, memory, sleep, appetite, addictions, intelligence, vision, and hearing.
Their voices and efforts have been amplified by new, growing organizations such as the Society for Neuroscience, the Dana Alliance for Brain Initiatives, the Association of Neuroscience Departments and Programs, Women in Neuroscience, and the Faculty for Undergraduate Neuroscience. The Society for Neuroscience holds annual courses on “How to Take Neuroscience into Schools” and “Hands-on Neuroscience Activities,” as well as workshops for precollege science teachers and high school students. At their annual meeting, the organization holds week-long poster sessions for neuroscience education and special sessions for the media; it also has standing committees for neuroscience literacy and education. Seventy-four percent of the neuroscientist members of the Dana Alliance for Brain Initiatives report that they are involved in outreach activities.
In 1996, the Dana Alliance launched Brain Awareness Week with just 160 partner organizations; by 2001, it had more than 1,200 partners, including scientific institutions, patient advocacy groups, government agencies, hospitals, universities, and schools—and organized outreach events in 46 countries. In addition to publicity in newspapers and on television, radio, and the Internet, Brain Awareness Week features a host of events such as brain fairs, brain art or essay competitions, and the Brain Bowl. The biggest event, the International Brain Bee,7 is a live question-and-answer competition for high school students. Winners of local competitions come from across North America to the University of Maryland in Baltimore to be quizzed on intelligence, emotions, consciousness, sensation, movement, brain imaging, and brain disorders. The goal is to motivate them to consider careers in biomedical brain research.
Neuroscience in the New Century
Judged both by its scientific accomplishments and the revolution in public support for brain research, the Decade of the Brain was a great success. The momentum must continue. It is projected that the world’s population could double or even triple in the next hundred years. In the wake of this population explosion will come increasing numbers of people suffering brain disorders related to stress and aging, such as depression and Alzheimer’s disease. How will genetic engineering, brain implants, and brain cell regeneration help treat these and other brain disorders? How will the relationships among the mind, the body, and the computer evolve? The need for the public to be knowledgeable about neuroscience can only become increasingly urgent.
Much has been accomplished, but we need to do much more to make neuroscience a priority. More college courses are offered on geology, astronomy, and oceanography than on neuroscience, yet knowledge of brain science has a potentially greater impact on our lives than knowledge of all those other disciplines combined. Perhaps the federal government should support neuroscientists who communicate with the public, providing outreach grants on a par with research grants. Corporate America must contribute by adding neuroscience to the list of fields it supports philanthropically.
Understanding and communicating new knowledge about the brain are only the beginning. We must systematically weigh the ethical uses of that knowledge. Today, many journals and forums focus on biomedical ethics in general, but we need a neuroethics as well. Decisions about how discoveries from brain research will be applied to our lives will be based not only on feasibility and economics but on judgments of value and moral principle. With success in bringing the public into the world of neuroscience comes the potential for controversy, as more people ponder the implications of new knowledge and choices, some at the core of what makes us human. Neuroethics is now a matter of public and legal concern, as evidenced by the intense debates that have already begun over topics such as stem cells, the human genome, and dangers to our brains from environmental hazards.
Consider the hot topic of fetal stem-cell research, which has the potential to cure brain disease but raises concerns about safety and potential for abuse. Stem cells implanted in the brain might trigger dangerous immune reactions or travel to other parts of the body, there to turn into tumors or the wrong type of tissue. Ethical concerns include the possibility that people may not be told that their aborted fetuses are being used for research. Some may sell the tissue for profit. Scientists have already created embryos from scratch just to harvest them for their stem cells.
The ethical arguments here are not only about how the cells are to be used, but about whether they should be used at all. The federal guidelines on stem cell research reflect the unsettled nature of this debate. The moratorium on federally funded fetal tissue research imposed under President Reagan was lifted by President Clinton, and policy shifted yet again under President George W. Bush. Bush recently allowed federal funding of research on embryonic stem cell lines already in existence, but not the creation of new cell lines. He said this was one of the most important, difficult, and agonizing decisions of his early presidency.
An Unprecedented Challenge to Our Privacy
Solving the puzzle of the human genome, with its tens of thousands of genes that code for development of the brain and nervous system, has also opened a Pandora’s box of concerns, including maintaining the privacy of one’s own genetic information, determining the importance of the genome in our behavior, and resisting the temptation to use our genetic makeup as an excuse to relinquish responsibility for our actions. Future generations will have their DNA structure mapped at birth and added to their medical records. As we improve our ability to predict who will develop certain brain diseases and disorders, we must consider laws to prevent unwarranted discrimination against people based on this information.
An employer, for example, might reject a job applicant whose genetic map indicates a predisposition to developing early Alzheimer’s disease, or a life or health insurance company might deny coverage because an applicant’s DNA indicates an increased risk of epilepsy or multiple sclerosis. Is this a legitimate use of information by parties entering into a contract? Aside from the moral question, simply having the gene for a certain neurological or psychological characteristic does not necessarily mean that you will show, or “express,” that characteristic. That likelihood depends on your experiences and interaction with the environment—factors such as nutrition, stress, toxins, parenting, schooling, exercise, trauma, and viruses.
In addition, complex human behavior involves multiple genes interacting with experience. Choosing to experiment with drugs, for example, can produce changes in the brain that inhibit the motivation needed to overcome the destructive behavior. On the other hand, the genes for intelligence and various skills will lie dormant unless coaxed into expression. Even in the elderly, mental exercise is necessary to maintain optimal brain function. Alleviating mental illnesses, addictions, and criminal behavior will still depend not only on advances in brain science but also on improving the condition of our families and society, and enriching our environment to allow the optimal genetic expression of brain function.
Some people are concerned that heightened focus on the brain as the source, or point of control, of our actions will diminish our sense of free will and responsibility, absolving us from guilt or locking us into a lifetime of suffering. Cheaters, bullies, abusers, and addicts could blame their behavior on the brain. We already hear phrases such as “the abuse excuse,” as courts ponder the biological boundaries that make some people more likely than others to slip into criminal behavior. We have a conscience, quite possibly located in our prefrontal cortex, where we make judgments and inhibit our more basic impulses and drives.
Brains are also changeable and malleable—“plastic,” as scientists say. Our actions and our experiences alter the biochemistry, anatomy, and function of our brains. Brain cells can divide and multiply throughout our lives, something thought impossible just a decade ago. Not only does the brain create behavior, but behavior can recreate the brain.
The Challenge of “Brain Literacy”
Following World War II, America’s lack of science literacy was a source of concern, but it was not fatal. Scientific progress brought longer lives but did not jeopardize our fundamental notions of what it means to be human. Brain research in the next century will both jeopardize and challenge those notions. Cloning, brain implants, genetic engineering, and designer babies are just the beginning. Soon our organic brains will merge with our inorganic brains. Electronic microchips are already being placed in our brains to repair lost functions or create new ones, and scientists are making chips that are part organic. What about computers that are part protoplasm? Are they part human? When do we stop calling self-replicating robots with human characteristics machines and start calling them life? What happens when we can increase the usefulness of animals by implanting them with genes for intelligence? These are a few of the overwhelming questions already beginning to confront us.
Researchers now have the tools to create new life forms with unique characteristics by altering their basic genetic code. They have already created the first genetically engineered monkey (named ANDi for “inserted DNA”), who carries a foreign gene for a green fluorescent protein.8 Scientists have succeeded in reprogramming bacteria to incorporate a 21st amino acid, a significant breakthrough, since until now all life has built its proteins from the same 20 amino acids. In so doing, we are recreating life from the ground up. What was once the domain of nature is now in our hands. The implications are profound. Who will decide when and how to use this newfound power?
Are we up to meeting these future challenges? Each major step in the evolution of the brain has given us higher function and greater consciousness. The reptilian brain gave us control over our vegetative functions. The mammalian brain gave us emotions. The human brain added cognition. Functions that distinguish us from other forms of life and characterize us as human, such as reason, judgment, problem solving, and creativity, are localized beneath our forehead, in the prefrontal cortex. The human brain is the peak of evolution, the flower of our humanity. It has given us the power to determine our own destiny.
References
1. Michael Conn, “Make Science Relevant, Human, and Clear,” The Scientist, July 20, 1998.
2. Charles Chappell and James Hartz, “The Challenge of Communicating Science to the Public,” The Chronicle of Higher Education, Vol. XLIV, No. 28, March 20, 1998.
3. J. S. Bakin and D. A. South, “Neuroscience and the Cultural Elite: How Hollywood Views Brain Science,” Society for Neuroscience Abstracts, Vol. 20, 1994.
4. James Kling, “Talking Science with Nonscientists: A Personal Communication,” The Scientist, March 29, 1999.
5. Richard L. Hinman, “K-12 Education and Support for Science,” Science, Vol. 270, December 15, 1995.
6. Web site of the Association of Neuroscience Departments and Programs: www.andp.org
7. Society for Neuroscience, Riding the Wave of Public Outreach: 2000 Brain Awareness Week Report.
8. Sandra Ackerman, “ANDi: The First Genetically Engineered Monkey,” NCRR Reporter, Spring 2001, p. 10.