The biology of violence : how understanding the brain, behavior, and environment can break the vicious circle of aggression
Personal Author:
Niehoff, Debra.
Publication Information:
New York, NY : Free Press, [1999]

Physical Description:
xi, 353 pages : illustrations ; 25 cm
Call Number: RC569.5.V55 N53 1999
Material Type: Adult Non-Fiction
Home Location: Central Library (Central Closed Stacks)


As scientists at the frontiers of neuroscience are discovering, violent behaviour is not simply a result of poverty or moral decline, but is also located in the way our brains work. This text therefore examines violence from a biological perspective.

Reviews (2)

Booklist Review

Getting a highly controversial topic out of the either-or rut in which most discussion of it is mired is one of the most difficult tasks an author can undertake. Niehoff does just that, though, and merits considerable praise for a book that undoubtedly will bring her more abuse than good will. She asks, Is violence the result of genes or of the environment? Practically anyone will prove able to keep the argument for one or the other going as long as listeners will bear it. Such arguers, Niehoff clearly and winningly demonstrates, are clueless, though, and not only impede everyday discussion but also, at higher academic levels, hinder research, publication, and general education. Look inside the human brain, she says. Do we really know much about the relationship between inner and outer worlds? Closely argued, thoroughly documented, and beautifully worked out, Niehoff's book shows how genes and environment modify one another and add up to uniqueness in each person. --William Beatty

Publisher's Weekly Review

In this ambitious book, Niehoff presents biology's latest findings on the development of violent behavior in an effort to answer a simple question: "Why do people hurt each other?" Aggression, she argues, like all complex behavior, "is a biological process... that begins and ends in the brain." Drawing on a wealth of research in neurobiology, biochemistry, physiology, genetics and anatomy, Niehoff explains in precise prose that the old nature/nurture debate is obsolete: innate drives do not define character from birth, nor, she contends, do environments alone determine our predilections. Rather, she believes, the chemical reactions of the brain develop in constant and complex reaction to the environment. Niehoff resists the temptation to dumb down science, but she does a fine job of elucidating difficult concepts to make them accessible to general readers. In the last chapter, she appraises our current treatment and punishment of criminals and finds that most of our nation's drug and penal policies actually elicit violent, antisocial behavior. Solitary confinement, for example, leads to increased levels of violence in laboratory animals; Niehoff believes isolation of prisoners yields similar results. Though the real world is certainly more complicated than the laboratory, her proposed methods of curbing violence (which include prenatal care, cognitive behavioral therapy and careful use of psychopharmacology) are thought-provoking, and this book is a fine contribution to a debate often clouded by emotion. 7 b&w photos; 9 tables; 24 drawings. (Jan.) (c) Copyright PWxyz, LLC. All rights reserved



Chapter One

SEEDS OF CONTROVERSY

Janice is the first to speak. She is safe now, and that makes her bold enough to tell her story to the other women and their children gathered in the courtyard. But the breathy choke in her voice recalls a time when she was afraid, a time when her daughter, not yet a year old, had already learned how to babble and coo at Dad to distract him. If the baby could get him to laugh, he stopped hitting Janice. She still has no idea what he was so angry about. "He lashed out at anything," she whispers. "But he could only reach as far as me." Kathy follows Janice. Donna follows Kathy. Sharon follows Donna. The wind rises and the October night, crisp at seven o'clock, chills to a temperature undeterred by even the heaviest sweater. Those without coats shiver from the cold as well as emotion. When the stories are over, the group lights candles, as they have done here for more than a dozen years, to shame the men who have beaten and terrorized them, to shame a society that closes its eyes. In 1996, approximately seven thousand women in this county had babies. That same year, fifteen thousand -- more than twice as many -- were victims of domestic violence. To look at the faces of these victims is to know that the mirror image of anger is not compassion but fear. When people talk about violence, they're often thinking about crime -- about murder and rape and armed robbery. But the men who hurt Janice and Kathy, Donna and Sharon are not convicts. The harshest sentence most will receive is a restraining order. People tell themselves they're safe, because violence makes its home in crumbling neighborhoods fouled by poverty. But the women in the courtyard live nearly an hour from central Philadelphia, in quiet communities that enjoy some of the highest per capita incomes in Pennsylvania. People believe that violence is the work of strangers, and so they teach their children to run away from unfamiliar faces. 
But these assailants live in the same houses as their victims; this violence, says Susan Hauser, director of the county women's shelter, "comes from a person who once said, 'I love you.'" People recognize the violent; they make headlines. But every act of aggression involves two people. People insist that the violent are evil, as if that word alone could expatriate them to some dark and unknown place. But the sad truth is that the violent are only human, and they live right among us. Behind the crowd, rows of T-shirts, hung from a clothesline that circles the courtyard, ripple in the candlelight. Some memorialize women murdered by their abusers. Others have been created by the women with candles, outward symbols of their determination to escape and survive. One shirt depicts the flashing red and blue light that tops police cars. Above the picture, the woman who marked time by the arrival of those cars has written, in bold, angry letters, "My children's night light." That same fall, another group of antiviolence protesters had gathered outside the Aspen Institute in Queenstown, a small town on Maryland's Eastern Shore, to light candles and remember the past. But it wasn't violent behavior itself that inflamed these demonstrators; it was some people's answer to the problem. Inside the institute's Wye Conference Center, some seventy scientists, philosophers, bioethicists, lawyers, and sociologists had come together to debate the moral, medical, and social implications of research on genetics and criminal behavior. The conference, organized by University of Maryland legal scholar David Wasserman, was taking place nearly three years later than originally scheduled. During that time, it had come to stand for one of the most enduring conflicts between science and society of this century, an ethical debate as bitter as the tragedy it aimed to correct. The trouble began with a speech made to the National Advisory Mental Health Council on February 11, 1992, by Dr. 
Frederick Goodwin, then director of the Alcohol, Drug Abuse, and Mental Health Administration (ADAMHA), to mark the announcement of a new federal strategy for coordinating research on the origins of violence. Commenting on the importance of this "violence initiative" in an America besieged by rising crime rates, Goodwin pointed to the deterioration of social structure in "high-impact inner-city areas," suggesting that "maybe it isn't just careless use of the word when people call certain areas of certain cities jungles." Someone in the audience called the Washington Post. A media feeding frenzy ensued, in which Goodwin's observation was reported as a comparison of black Americans to monkeys in the jungle. Prompted by an aggressive "educational" campaign engineered by Maryland psychiatrist Dr. Peter Breggin, a long-time critic of the psychiatric community, African-American political leaders concluded that the "violence initiative" was actually a euphemism for a government-backed racist policy to control crime by scapegoating the black community. Events came to a head when the National Institutes of Health (NIH) awarded a grant to Wasserman for his conference, then titled "Genetic Factors in Crime: Findings, Uses, and Implications." For critics of the violence initiative this was the ideological last straw. In a July 4, 1992, interview on Black Entertainment Television, Breggin savaged the conference, comparing it to "the kind of racist behavior we saw in Nazi Germany." Outraged members of the Congressional Black Caucus demanded cancellation of the conference, an end to the violence initiative, and a global moratorium on violence research. The NIH first refused, then conceded to the pressure, freezing Wasserman's $78,000 grant on July 20. 
Eight months later, John Diggs, NIH deputy director for extramural research, notified the University of Maryland that the agency had formally canceled support for the beleaguered conference, citing statements in the brochure that had "significantly misrepresented the objectives of the conference" and that gave "the distinct impression that there is a genetic basis for criminal behavior." Academic leaders were appalled. "Actions of this kind put a chilling effect on the conduct of science," argued Robert Rosenzweig, president of the Association of American Universities, in an interview with the influential journal Science. Over the next two years, Wasserman and the University of Maryland restructured the conference to focus specifically on the allegations of racism, in the hope that this move would persuade the NIH to reconsider its decision. Ultimately, Wasserman convinced the Program on the Ethical, Legal, and Social Implications of the Human Genome Project, a branch of the health agency's National Center for Human Genome Research, to reinstate funding for the politically corrected conference, even expanding the original award of $78,000 to $133,000. In an interview with the New York Times three days before the conference finally began on September 22, 1995, Wasserman emphasized that "the conference has changed from three years ago. We're placing a greater emphasis on the issue of race and racial tensions....We're going to give it greater prominence." He noted that every effort had been made to include not only genetic researchers but also vocal critics of such research, such as sociologist Dr. Dorothy Nelkin of New York University and Dr. Andrew Futterman, a psychology researcher from Holy Cross College in Worcester, Massachusetts. In fact, the Times suggested that some biologists declined Wasserman's invitation to participate on the ground that "the conference was so weighted toward the ethics and implications of the work that it slighted the science." 
And the critics -- despite Wasserman's overtures -- remained wary and pessimistic. "I haven't a clue what to expect, but I do have serious questions," Futterman admitted beforehand. "I think that people will probably try to set up the biology and crime relationship as a straw man. They'll attack it and come away feeling virtuous." Nelkin was blunter. "It's a silly conference. Nothing will come out of it," she insisted. Organizers of the vigil still condemned the idea of discussing genetics and crime in the same sentence as "racist pseudoscience" and berated Wasserman on the Internet as the chief advocate of a "genetic/biochemical approach to urban youth violence [that] is...sucking away resources from tackling the ROOTS of urban youth violence, including RACISM and CLASSISM." Afterward I asked Wasserman if he felt that the result had been worth the anguish, if the biologists had become sensitized and the sociologists educated. His answer was a qualified yes. "I think the conference was tremendously valuable," he replied. "I think some in the audience wondered what all the fuss was about. Some were relieved to hear that scientists weren't doing the kind of simple-minded research they thought. And there were some good dialogues between people who came away with a lot of mutual respect for each other." He paused for a minute, then added, "But I wouldn't do it again."

The Missing Link

Our frustration with the widespread violence of American society is well documented; by now, the reports are as mind numbing as the statistics. But news reports and homicide figures tell only part of the story, for much cruelty goes unrecognized and unpunished by the criminal justice system. Daily life has degenerated into a threatening and occasionally terrifying cacophony of belligerent coworkers, disgruntled patriots who issue death threats to judges and forest rangers, and school board members who threaten to tear a noisy constituent "limb from limb and enjoy the process." 
The Internet is clotted with hate mail, the roads swarm with agitators who view their cars as weapons. Deluged with facts and figures, we still understand neither violence nor the violent. The belief that many, if not most, of the individuals responsible for the lurid headlines are somehow "sick," members of a shadow society that exists outside the mainstream, is widespread, yet we cannot explain exactly what it is that makes them different. Social critics, like Dr. Nelkin, point to poverty and racism but fail to explain how these factors twist behavior, or why the incidence of domestic violence, child abuse, hate crimes, and road rage has escalated in comfortable neighborhoods far from the inner city. We can predict prison costs and project handgun sales, but we become confused when we look at a group of troubled teenagers and try to forecast who will reform and who will degenerate. Crime victims initially inspire pity, then frustration. Why can't they get on with life? Why doesn't she just move out? One of the women who cried at Janice's story came to the vigil to light a candle for her friend Sandra. Only a few weeks earlier, Sandra had told her belligerent husband she'd finally decided to move out. He tried to use force to stop her. Alarmed, the friend had asked Sandra's priest to intervene. The priest refused, maintaining that because no one had actually witnessed the alleged assault, the behavior and motives of both husband and wife remained open to question. "After all," she concluded, "we don't know what went on inside their heads." This is the heart of the problem. Our violent behavior bewilders us because we lack crucial information. Countless newspaper articles, books, and television programs chart the social dimensions of violence: poverty, racism, the breakdown of the family, the pervasive influence of television, the ready availability of guns. 
But the outer world is meaningless until it enters the inner world, the dimension governed by brain and perception, thought and emotion, nerve and tissue. Until we know as much about this inner dimension as we do about the outer one -- what goes on inside the heads of aggressors and their victims -- we are not prepared to analyze the problem of violence effectively. Columbia University neuroscientist Eric Kandel has written, "The central tenet of modern neural science is that all behavior is a reflection of brain function." Beginning in the nineteenth century, our growing knowledge of brain structure and function has fueled recognition of the neural mechanisms underlying behavior. Indeed, at the turn of the century, science seemed poised on the edge of a biological revolution to equal the practical advances of the Industrial Revolution. Why, then, has neuroscience been excluded from public debate about the problem of violence, banished as too dangerous for public consumption? One source of the controversy is the age-old conflict between behavioral science and ideology that continues to fuel widespread misunderstanding of both the character of biological science and its motives. A second answer lies in the popular misconception that scientific progress is a continuum, with discovery overlapping discovery in a steadily advancing wave that crests to one clear and perfect answer: the steam engine, the personal computer, antibiotic therapy, a cure for cancer. In reality, however, science proceeds by fits and starts, wandering into dead ends, overlooking the obvious, daydreaming, stumbling into a hidden pit. False starts and wrong turns are in fact a hallmark of newly emerging disciplines as researchers struggle to mark a path through muddy, uncharted territory. Ground rules must be legislated, new tools and techniques invented. What initially seemed solid may suddenly shift; a single question may fragment into dozens of answers. 
First efforts to widen the track by translating laboratory results into practical applications are equally likely to encounter obstacles. Delays and frustrations are inevitable. Patience and a critical eye are essential. For if impatience, poor judgment, or societal demands rush this learning process, scientists may reach conclusions before they even understand the question. The history of biology's fall from grace as an explanation for human violence is a cautionary tale of the ease with which inspired discovery can die at the crossroads of human ignorance and human need.

Form and Function

We can see and measure the physical, and so the search for the difference between those who kill and those who are content to seethe began with attempts to correlate physical features and personality traits by an ambitious physician and anatomist, Franz Joseph Gall (1758-1828). Gall began speculating about the anatomical substrates of behavior even before he completed his formal training at the medical school of Vienna in 1795. Building on observations that dated from his childhood, when he noted a correspondence between the mental abilities of his classmates and the size and shape of their eyes, Gall proposed a working relationship between verbal memory and the frontal lobe, the brain area protruding into the bony cavity overlooking the eye sockets. His ideas ultimately evolved into phrenology, a theory that assigned precise neural addresses to a wide range of behaviors, personality traits, and mental characteristics, and also proposed that the relative importance of each faculty in a particular individual could be divined from the size and shape of the bumps in the skull overlying that functional region. A skilled practitioner could readily detect an antisocial mind-set merely by running his hand over the villain's head. To the modern mind, Gall's system of correspondences appears farfetched. 
Even in his own time, critics were quick to point out that his conclusions were supported by only the barest shreds of actual clinical evidence. For example, "destructiveness" was associated with a bump located just above the ear because, "First, this is the widest part of the skull in carnivores. Second, a prominence was found here in a student who was 'so fond of torturing animals that he became a surgeon.' And third, this region was well developed in an apothecary who later became an executioner." Gall and his predictions, however ill founded they may have been, found a sympathetic audience in the Viennese public. But church and civil authorities proved less enthusiastic. Decrying phrenology as a materialistic assault on free will, they forbade Gall to teach and finally forced him to leave Vienna in 1805 for the less repressive environment of Paris. Here, he and a student, Johann Spurzheim (who would eventually conduct a worldwide public relations effort on behalf of phrenology), extended their investigations, enjoyed the admiration of Parisian society, and attracted an international following. One supporter, American physician John Bell, was so impressed by Gall that he founded the Central Phrenological Society upon his return to the United States in 1822. In one of the earliest clinical reports linking brain function to violence, Bell delivered a lecture to the society describing Spurzheim's phrenological evaluation of a group of thirty women who had killed their own children. According to Spurzheim, twenty-six of these homicidal mothers had an underdeveloped brain center for "philoprogenitiveness," or love of children. "The implication," science historian Stanley Finger notes, "was that their crimes resulted from a physically defective brain." 
Similarly, in an 1846 treatise, Rationale of Crime, Eliza Farnham provided this description of a phrenologist's opinion on the violent outbursts of one C.P., a woman jailed four times for crimes ranging from petty larceny to assaulting a prison warden with a carving knife: "In her head, destructiveness is enormously developed, with large secretiveness and caution, and very defective benevolence and moral organs generally." Interest in phrenology had waned by midcentury, weakened by the evidence of hands-on observations and increasingly scathing criticism on both sides of the Atlantic. But the idea that psychological makeup was reflected in physical characteristics resurfaced in the work of Italian criminal anthropologist Cesare Lombroso (1836-1909). Lombroso examined the physical features of prisoners, the insane, and cadavers. Based on these observations, he thought he could detect certain features -- sloping forehead, long arms, full lips, a twisted nose -- that seemed to occur more often in criminals than in law-abiding citizens. He concluded that such features, or atavisms, were evidence of regression to a more primitive stage of development. The violent, in other words, were not only different, but physically, mentally, and morally inferior, closet Neanderthals passing for civilized human beings. Moreover, they were born that way, cursed from the very beginning with a weak will and a native propensity to wreak havoc. Today, the idea that the shape of our eyes or the width of our foreheads can predict behavior seems ludicrous. But we are visual creatures, and the temptation to rely on the strongest of our five senses is great. We still judge character by what we see, and if clothes, body language, or facial expression look too much like the lead story on the evening news, we rarely hesitate to draw what may well be an unwarranted, unreasonable, or even dangerous conclusion. 
Purely Born

Phrenology was criticized in the scientific community for its failure to back up its claims with hard clinical evidence. Outside the laboratory and the clinic, Gall's theory also raised larger questions about the ethics of examining human behavior from a physical rather than a moral or spiritual perspective. Gall's most vitriolic critic, the renowned French neurophysiologist Marie-Jean-Pierre Flourens, derided Gall as a madman who "suppresses free will and wishes that there is a soul." Such reproaches did more than exclude Gall, Spurzheim, and their supporters from the elitist society of nineteenth-century academic science. Moral objections to "functional localization" -- the mapping of particular behaviors to specific brain regions -- and Lombroso's atavisms sowed the seeds of an antipathy to behavioral biology that would soon be nourished by revolutionary ideas about a predictor for human behavior far more permanent than physical features: the genome. Charles Darwin's most dangerous idea was not his well-known theory of evolution and natural selection, but the suggestion that human beings can tinker with the process. Although Darwin did not openly propose any direct meddling, concern about the evolutionary implications of the celebrated fertility of the Victorian lower classes led him to suggest cautiously that encouraging society's more exemplary members to rear larger families might have merit. It was Darwin's younger cousin, Francis Galton, who breathed life into this idea that humans should control human reproduction, arguing that mental and moral characteristics -- including a range of vices, character defects, socially undesirable habits, and "crimes of violence" -- fell under the same hereditary authority as physical traits like height and eye color. 
He concluded that the solution to rampant crime rates and the exploding population of the urban poor was to control human breeding in much the same way that farmers bred "the best to the best" among their livestock. Science historian Diane Paul sums up Galton's attitude: "Those highest in civic worth should be encouraged to have more children; the stupid and improvident, few or none." Galton called his proposition eugenics, after the Greek word eugenes, or "good in birth." Eugenic practice actually predated the Greeks; it has, in fact, been a sordid aspect of human reproduction since the dawn of time. Oxford scholar Allan Roper, in a 1913 essay on eugenics in the ancient world, observed, "The preface to a history of Eugenics may be compiled from barbarism, for the first Eugenicist was not the Spartan legislator, but the primitive savage who killed his sickly child." Roper speculated that the precarious relationship between early human clans and an often unforgiving environment must have made care of the deformed and the infirm seem a dangerous luxury. But the comforts of civilization did not end dissatisfaction with less-than-perfect infants. The classical Greek texts that may have inspired Galton extolled the selection of a healthy, vigorous marriage partner as a chief civic responsibility. Their observation that "noble children are born from noble sires, the base are like in nature to their father," seemed logical to an audience familiar with agriculture and animal husbandry, who knew that breeding a weakling to a weakling typically produced inferior offspring. And population quality control did not end with the marriage contract. When nature failed to live up to expectations, civic-minded parents, like any sensible animal breeder, might well elect to dispose of the error. As Roper put it, "Infanticide saved the Greeks from the problems of heredity." Infanticide eventually fell out of favor in most countries. 
But when the emergent science of genetics provided a mechanism for natural selection, eugenics sought to exploit this knowledge to control the threat to evolutionary progress posed by the unchecked propagation of "the reckless, the vicious, and the otherwise inferior." Galton urged such reproductive management by what Paul calls "positive eugenics": encouraging society's best and brightest to have more children. But later champions of the eugenic cause advocated an aggressive "negative eugenic" approach that focused not on the most fit but on the least. Stressing the high cost to society of caring for the poor, the infirm, the insane, and the violent, leading turn-of-the-century eugenic advocates, such as Karl Pearson in Britain, Charles Davenport in the United States, and Fritz Lenz in Germany, gathered hereditary information on hundreds of thousands of people. Based on these heredity data, they initiated "genetic hygiene" measures ranging from segregation in "work colonies" to compulsory sterilization. American support for eugenics swelled during the early decades of this century in response to the rising tide of immigrants. Alarmists saw in the huddled masses yearning to be free a more ominous urge: to outbreed and overrun "native" Americans. Academicians and politicians weighed the reputed high fertility rates of so-called low-standard immigrants from southern and eastern Europe against the modest rates among Mayflower descendants and predicted social catastrophe, blaming this torrent of the socially inferior for "increasing levels of crime, insanity, and pauperism, and thus for the financial burden of custodial care, for urban political corruption, for strikes and other forms of labor militancy, and for unemployment, among other social ills." The self-appointed eugenic guardians whipped public fear of "race suicide" into a frenzy, adding a biological urgency to Progressive pressure to curb immigration. 
Worse, they introduced race as a critical factor in human genetics. According to these advocates, superior individuals who sought to control the immigration and reproduction of Jews, blacks, and eastern Europeans were not prejudiced, but expressing "a natural antipathy which serves to retain the purity of the type." Such propaganda culminated in restrictive immigration laws, such as the Johnson-Reed Act of 1924, which severely curtailed the number of immigrants from southern and eastern Europe, and in miscegenation statutes that prohibited the interracial marriages so dreaded by the eugenicists. But it simultaneously promoted a truly dangerous and degenerate union: the link between behavioral genetics and racism. Europeans also idealized the social value of racial purity and supported eugenic methods for achieving it. Under Hitler, however, eugenics degenerated into genocide. Public education, segregation, and voluntary measures gave way to compulsory sterilization, enforced abortion, and death camps. Violent, illegal, or dissident behavior -- characterized as "social feeble-mindedness" -- emerged as a particular target of Nazi eugenicists. The postwar disclosure of Nazi atrocities constituted a death sentence for eugenics, in America as well as in Europe. Genetic researchers renounced their high-profile public face and retreated to the lab. The eugenic societies that had once dispatched armies of field workers to collect volumes of trait data withered, despite contrite efforts to disentangle themselves from their racist past. Practices such as compulsory sterilization, once accepted as not only permissible but essential, fell to public outcry. But the newfound revulsion toward eugenics did not stop geneticists from quietly taking over biology. 
In Refiguring Life, science historian and philosopher Evelyn Fox Keller proposes that the genetic annexation of biology, the conviction that "with the gene comes life," gradually confined biological research to a genetic prison. The language of biology came to reverence a DNA that dictated not only physical structure, but every facet of existence from the moment of fertilization onward, and cursed biology with an enduring and unpalatable determinism. Eugenics poisoned the study of the biological foundations of behavior with this same determinism. If gene action was the sole basis of function, as well as structure, then aberrant behavior reflected a genetic defect, a string of "bad code" that crashed the biological program. A biology that believed that life was written in the genes had to conclude that people were either "born good" or "born bad." Learning, experience, and perception were irrelevant. Character was decided in utero, and the birth certificate became a life sentence. Thanks to eugenics, Galton's legacy was not a utopia of geniuses but an unbridgeable rift between nature and nurture. Nature, discredited by eugenics, lost validity as an explanation for aggressive behavior. On the other hand, nurture explanations for behavior became not only intellectually fashionable but morally obligatory. On June 23, 1993, a small box arrived at the Tiburon, California, home of University of California geneticist Charles Epstein. Epstein failed to note the box's unfamiliar northern California return address, perhaps imagining it contained lab samples, a gift, this month's unsolicited selection from yet another record club. The oversight proved deadly. Instead of test tubes or CDs, the package contained a pipe bomb that exploded in the researcher's hands, severely injuring him. It was Charles Epstein's brutal introduction to the Unabomber. 
Beginning in May 1978, when a professor at Northwestern University turned over a suspicious package found in a university parking lot to local authorities, the elusive Unabomber mailed, delivered, or left behind sixteen of the deadly boxes in a seventeen-year killing spree that inspired the costliest manhunt in FBI history. Only one eyewitness claimed to have seen the mysterious fugitive. No one understood why the bombs targeted university classrooms, computer stores, and airlines. The arrest of Montana recluse Theodore Kaczynski in April 1996 finally put a human face on the enigmatic Unabomber. And Kaczynski himself had explained the motive for the attacks six months earlier, when the New York Times and the Washington Post printed his 35,000-word diatribe against the Industrial Revolution. Science and technology, the manifesto claimed, have ruined our social and physical environment; they are the root cause of the stress and frustration that have "destabilized society, have made life unfulfilling, have subjected human beings to indignities, have led to widespread psychological suffering (in the Third World to physical suffering as well) and have inflicted severe damage on the natural world." Presumably Epstein was selected as a target because this manifesto identified genetics as one of the evil scientific advances contributing to "disaster for the human race." Those who believe that problem behavior is the result of a damaged society, rather than damaged genes, are as capable of political excess as the rabid proponents of racial hygiene. Charles Epstein's fate is a chilling reminder that science and ideology are always a dangerous combination, regardless of whether scientists initiate or receive the first overture.

The Meaning of Aggression

Before Darwin was a geneticist, he was a naturalist. His theory of natural selection was not the result of laboratory research, but grew out of detailed observations of the natural world. 
French biologist Isidore Geoffroy Saint-Hilaire, a contemporary of Darwin, christened this "fly-on-the-wall" approach to the study of animal behavior ethology, from the Greek word for "character" or "characteristic." Ethology, like eugenics, owes more than its name to the Greeks. They were the first to describe the annual spring outbreak of avian civil war, then as now, a conflict centered around choice nesting sites. Once found and claimed, such a territory will not be ceded willingly. In my own backyard, a pair of lucky house sparrows hurries to cement their ownership of a birdhouse wedged into an ornamental pear tree. While the female snips choice bits of greenery for their new furnishings, the male stations himself at the top of the tree, where he swaggers and taunts potential competitors with an enthusiasm that puts the posturing of his human neighbors to shame. Most rivals get the hint. But one upstart decides this choice space is worth a serious confrontation. He lands on a branch just above the birdhouse and defies the first male to dislodge him. The enraged resident charges the invader, shrieking and screaming. The second bird parries and dodges. Were he just slightly larger, he might have won. But his opponent catches him off balance and he ricochets down through the tree to the ground, bounces a few steps, then retreats, the victorious resident shadowing him all the way. Bird watching also inspired the first systematic studies of aggression in the wild. From contests like the one described above, behavioral scientists following in Darwin's footsteps suggested that personal space is one critical reason why animals fight. Noting that many birds defend not only a hunting territory or nesting site, but a perching space, some researchers argued that aggression had evolved as a spacing mechanism, a way of limiting the population density. Other bird watchers contended that animals resort to aggression to defend resources and reputation rather than space. 
By comparing how many times a chicken pecked other members of the flock to the number of pecks it received in turn, Norwegian behaviorist Schjelderup-Ebbe discovered the infamous "pecking order." Subsequent researchers noted that high-ranking individuals not only had a competitive edge in quarrels, but also enjoyed special privileges: the first helping at meals, the most comfortable sleeping spot, sexual carte blanche. The underlings, like anxious middle managers threatened with downsizing, typically recognized and accepted the dominant animal's superiority, minimizing the outbreak of potentially lethal confrontations. However, the lust to displace the leader and claim the top spot for oneself could also fuel rebellion. During the 1930s and 1940s, many behavioral researchers viewed dominance hierarchies (as the pecking order came to be known) as the key factor in motivating, as well as containing, aggression. As a result, the factors that governed the outcome of power struggles figured significantly in both ethological field studies and studies of aggressive encounters between laboratory animals. For example, scientists discovered that in rodent pecking orders, an animal that won one fight often went on to win more; these victors ultimately became dominant. They also noted that a rat or mouse fighting on familiar territory enjoyed a home field advantage, an observation that linked dominance and territoriality. Still other ethologists, influenced by psychological theories of motivation, saw aggression as a response to the call of powerful inner mandates, or instincts. They differentiated appetites -- instinctively motivated behaviors aimed at getting something, such as food, water, or sex -- and aversions -- similar behaviors designed to avoid an unpleasant or painful stimulus. In a 1928 paper provocatively titled, "Why Do Animals Fight?" 
Wallace Craig, an early advocate of this explanation, classifies aggressive behavior as an aversion, arguing that "animals fight to rid themselves of the interference or thwarting of an instinct." Craig's link between hostility and disappointment was familiar to psychologists acquainted with Freudian psychoanalysis. They saw the frustrations that followed a failure to obtain anticipated rewards as a bridge between psychoanalytic theory and Pavlovian conditioning. We want something; someone stops us from getting it; we lash out. A concept that's well known to anyone who has ever refused a toddler's escalating demands in a toy store, tried to contest a telephone bill, or found themselves trapped in traffic, the idea that aggression is caused by frustration had a profound theoretical effect on aggression research in social and clinical psychology, as well as ethology. Some behavioral researchers cited the frustration-aggression association as evidence that violent behavior can be learned. They exploited this possibility to develop animal "models" of aggressive behavior that transferred the study of animal behavior from the field to the psychology laboratory. Robert Hutchinson of Southern Illinois University and his colleagues, Roger Ulrich and N. Azrin, for example, trained rats to press a bar for a food reward and then frustrated the animals by turning off food delivery. The outraged rat's attacks on the apparatus served as a measure of the intensity of the animal's aggression. The Illinois researchers showed that other aversive stimuli, including physical pain, could also provoke angry reactions. Beginning in 1939, Hutchinson and his coworkers published dozens of studies that demonstrated fierce fighting between animals after electrical shocks were applied to their feet or tails. 
Pain was so effective at provoking aggression in these studies that shocked rats and monkeys could be induced to attack a doll, a cloth-covered tennis ball, or even a rubber hose if a living victim wasn't close by. A tormented monkey biting a rubber hose may remind us of our exasperation with cars that won't start, VCRs that eat tapes, and programs that aren't compatible with Windows 98. But the aggression that most concerns society involves other human beings. Devoid of social meaning, shock- and frustration-induced simulations of animal aggression seemed better suited to measuring pain thresholds than aggressive behavior. As a result, these initial attempts to construct a laboratory model of aggression did little to improve public confidence in either the relevance or the sensitivity of aggression research. Killer Instincts and Selfish Genes If gene action was a physiological engine driving behavior from the inside, ethology was the radar system tracking its course. Behavioral biology, like other areas of biological research, placed an increasing emphasis on the impact of behavior on survival. This emphasis on issues that contemporary behaviorist John Alcock calls "why" questions -- "the 'evolutionary,' or ultimate, reasons for why an animal does something" -- was reflected in the preoccupation of aggression researchers with such survival-oriented activities as resource distribution, territory, and mating privileges. Questions about underlying mechanisms -- practical, or proximate, explanations of "how an individual manages to carry out an activity" -- as well as questions about the short-term payoff of behaving aggressively took a back seat to the bigger question of evolutionary significance. Ethology and genetics worked hand in hand to foster the idea that behavior was as hard-wired and unchangeable as height or eye color. 
Territorial birds and frustrated rats may have introduced the research community to the ethology of aggression, but Konrad Lorenz made it public. The 1963 publication of his best-selling book, On Aggression, astonished millions with its suggestion that the key to human violence lay in the behavior of animals. This was an introduction with ominous overtones. Instead of ushering in a new era of self-awareness, Lorenz's book intensified opposition to biology as a legitimate means of explaining behavior and reinforced the idea that nature is incompatible with nurture. Lorenz agreed with earlier ethologists that animals used aggression to optimize population density, accumulate and defend resources, and protect themselves and their young. But Lorenz emphasized that aggression was not simply a response to an instinct but was itself an innate, driving force: "Knowledge of the fact that the aggression drive is a true, primarily species-preserving instinct enables us to recognize its full danger: it is the spontaneity of the instinct that makes it so dangerous....The aggression drive, like many other instincts, springs 'spontaneously' from the inner human being." In Lorenz's eyes, aggression was not an aversion but an appetite -- and a ravenous one at that. Humans were not only born bad, but born helpless, at the mercy of a "killer instinct" that bubbled up from a dark corner of the mind like oil and that needed only the match of some trivial insult to ignite. Contained for too long, it would combust spontaneously. The fragile defenses of ritual, culture, and morality were barely a match for this seething flood of instinctive rage. Evolutionary biologists welcomed the challenge to the tyranny of nurture. As Harvard biologist E.O. Wilson (soon to instigate a biological revolution of his own) put it, "Lorenz has returned animal behavior to natural history." But nurture advocates were hardly ready to retreat. 
They charged that Lorenz's idea of an aggressive drive not only discounted rational and moral control over behavior but actually justified violence -- dangerous ideas for an Austrian living in the shadow of the Holocaust. In the eyes of these critics, Lorenz had merely revived, updated, and psychologized biology's fatalistic reputation. The controversy swept through universities and laboratories as well as newspapers, as biologists rushed to confirm Lorenz and social scientists rushed to prove him wrong. Other biologists began to ask fewer questions about aggression and more about its absence, for right in the teeth of the Darwinian struggle for survival of the fittest were animals risking their own lives (or renouncing their fair share) to help others. Birds who teamed up to fight off predators. Wolves who ceded their right to reproduce to the leader of the pack. Lionesses who shared their catch with malingering sisters. How could such altruism be adaptive? The answer, these researchers theorized, is that a generous spirit has rewards on earth as well as in heaven. Family members, for example, share a similar genetic makeup as well as a common history. By aiding and abetting kith and kin, a helpful relative ensures that at least some of the family genes make it to the next generation. Altruism can also benefit the good Samaritan directly. If I rescue you or share my dinner with you today, you are likely to do the same for me in the future, with a reproductive payoff for both of us. Good or bad, biologists reasoned, behavior had adaptive value and an evolutionary significance. Sociobiology -- the term E.O. Wilson used to describe this union of social theory and evolutionary biology -- thus brought even our cherished notion of moral superiority under genetic control. Wilson's book Sociobiology: The New Synthesis, like Lorenz's On Aggression, provoked a firestorm of controversy. The heart of the problem lay not in the book's premise but in its scope. 
By extending sociobiological theory to encompass the evolutionary implications of human social behavior, Wilson outraged nurture-oriented scientists and activists. To these modern-day antieugenicists, sociobiology was Galton revisited, complete with the attendant risk that socially sanctioned forms of oppression, such as racism or sexism, would be tolerated as "adaptive," while undesirable behavior, such as a penchant for murder, would be the genetic burden of the socially inferior. Wilson himself was horrified by these charges. "I had no interest in ideology," he declares in his autobiography, Naturalist. "My purpose was to celebrate diversity and to demonstrate the intellectual power of evolutionary biology." But he ruefully admits, "Perhaps I should have stopped at chimpanzees when I wrote the book." Sociobiologists after Wilson magnified the problem by characterizing individual effort as a slavish devotion of behavior to the interests of a "selfish" genome. The organism itself, human or animal, had value only as an expendable gene factory, and society was an illusion, a collective of opportunistic individualists at the mercy of genetic programs designed to maximize reproductive fitness. Empathy was not evidence of social evolution but a deceptively clever example of genetic manipulation. "Gene-centered sociobiology," as ethologist Frans de Waal calls it, seemed to revel in taunting critics, rather than striving to find a theoretical common ground between the warring camps of nature and nurture. Selfish genes and deadly instincts only reinforced the widening split between biological and social mechanisms of behavior, and, rather than broadening biology's perspective, cemented it more firmly to genetics. Aggression was inborn and immutable, driving the individual to act in socially unacceptable ways as a way of guaranteeing his or her reproductive success. 
In aggression, social behavior mutated into an asocial compulsion, as the "killer instinct" transformed ordinary people into genetic bounty hunters. "Great and Desperate Cures" Evolutionary biology sparked a debate over the genetic control of behavior. Franz Joseph Gall's theory of phrenology ignited an equally acrimonious debate over the anatomical control of behavior. The question Gall raised -- did the brain act as a whole or as the sum of its parts? -- marked the first shot in a hundred years' war that began at the dawn of the nineteenth century and was not fully resolved until well into the twentieth. Ultimately a combination of persuasive debate, forceful personalities, and methodological refinements brought an end to overt hostilities, but it was an uneasy peace. On the one hand, the mapping of behavior to specific brain regions, or functional localization, formed the basis of modern neurological diagnosis. The downside was an oversimplified anatomy that would spawn new efforts to control violence -- efforts that were as questionable as eugenics. Gall's chief antagonist, Marie-Jean-Pierre Flourens (1794-1867), was an elder statesman in the French scientific community, a member of the prestigious Académie des Sciences. In contrast to the phrenologists, he was no theorist but a hands-on researcher who stressed the importance of carefully conducted laboratory studies: "Everything in experimental researches depends upon the method; for it is the method that produces the results. A new method leads to new results; a rigorous method to precise results; an uncertain method can lead only to confused results." Based on experiments in which he surgically removed a chunk of brain tissue from a pigeon, then noted the pigeon's postoperative behavior, Flourens could accept the idea that certain areas orchestrate critical body functions, such as breathing and heart rate. But he drew the line at so-called higher mental functions -- reason, will, and emotion. 
He reported that when he removed the cerebral cortex, the folded helmet of gray tissue that forms the surface layer of the brain, his pigeons lost "all perception, judgment, memory, and will." From this result, Flourens concluded that the entire cortex was necessary for these activities, an idea that directly contradicted Gall's belief in a neural division of labor. The battle lines were drawn. Neuroscientists could side with the "globalist" view espoused by Flourens, in which the brain, or at least the cortex, functioned as a single entity in the social, mental, and emotional realms. Or they could accept Gall's idea of a geographically organized brain, with a one-to-one correspondence between location and function. Resolution lay in Flourens's maxim: the development of experimental methods that yielded precise results. His own studies, like many of the earliest anatomical experiments, were based on ablation, a technique in which part of the brain is destroyed and its functional role deduced from the resulting deficit. However, neurosurgery in the mid-nineteenth century was anything but the "rigorous method" so prized by Flourens. With little more than a superficial knowledge of the anatomy of the interior of the brain, no technique for positioning the head, no orderly way of describing location, and only the crudest surgical tools, experimenters had little control over the size and extent of the lesion, nor could they ensure that exactly the same area was removed in every subject. The end result resembled an attempt to determine the impact of Harvard on scientific research in Boston by bombing the north shore of the Charles River and charting the subsequent decline in intellectual activity. 
Trauma and postsurgical infection further biased results, and a dearth of techniques for actually measuring behavior meant that experimental "results" often consisted of little more than a cursory examination of the unfortunate subject in the brief interval between surgery and death. Fortunately, nature provided what the surgeons could not. The first convincing evidence in favor of functional localization came not from the laboratory but from the neurology clinic, in the form of autopsy data that demonstrated a clear correlation between damage to a circumscribed area of the brain and loss of a specific function. Speech was the first function mapped in this fashion. Beginning with the widely publicized case of an epileptic named "Tan," after the only intelligible word he spoke, the renowned French physician Paul Broca (1824-1880) observed a total of nine patients who had lost the ability to speak (a deficit known to neurologists as aphasia) but could still understand spoken language. In each case, postmortem examination of the patient's brain revealed damage to the same region in the left frontal lobe of the cortex, a neighborhood known today as Broca's area. The discovery of the brain's electrical properties by a self-effacing Italian physiology professor, Luigi Galvani (1737-1798), allowed neuroscientists to study the localization of function in an intact brain. Rather than cutting out a piece of brain tissue, scientists could instead apply an electrical current to excite the region of interest. As a result, the investigator could analyze behavior directly instead of guessing about function from its absence. Stimulation studies at first were nearly as crude as early lesion studies. For example, Galvani's nephew, Giovanni, having learned that he could produce movements of the face and eyes of an ox by applying an electrical current to the exposed brain, progressed to testing the freshly severed heads of criminals sentenced to the guillotine. 
He proudly reported that brief shocks passed through the ear and mouth or applied directly to the human brain excited the same facial muscles he had activated in cattle. In 1870, two German scientists, physician Eduard Hitzig (1838-1907) and anatomist Gustav Fritsch (1838-1927), constructed fine-gauge electrodes to restrict the applied current (provided by a crude battery) to a more precisely defined spot of brain tissue. Funding and laboratory space being nearly as constrained for young investigators then as they are today, Hitzig and Fritsch were forced to construct a makeshift surgical suite on a dressing table in Hitzig's bedroom. Here, in a series of painstaking studies that provided the first truly convincing experimental evidence for functional localization, the two mapped the cortical sites responsible for controlling muscle movements of the paw, leg, face, and neck of the dog. They called these addresses "centers" and speculated that "certainly some psychological functions, and perhaps all of them, in order to enter matter or originate from it, need circumscribed areas of the cortex." An American physician, Roberts Bartholow (1831-1904), extended Hitzig and Fritsch's work to the living human brain. Bartholow inserted two wire electrodes into the brain of one of his patients, a young woman with a cancerous ulcer of the scalp that had eaten away a large hole in her skull. Before the woman died two weeks later (and the outraged citizens of Cincinnati drove Bartholow out of town), he demonstrated a correspondence between electrical stimulation of discrete brain regions and specific muscle contractions of the legs, hands, arms, or neck similar to those documented by the German researchers in the dog. Experiments such as those of Fritsch, Hitzig, and Bartholow, coupled with observations of the changes in function that accompany brain injury or disease, ultimately turned the tide of neuroanatomical opinion in favor of localization. 
But the concept of "centers" implied a dangerously oversimplified one-to-one correspondence between location and function. This entry-level understanding of neuroanatomy, in which functions could be matched to patches of tissue like eggs arrayed in a carton, was not a recipe for progress, but for disaster. No other group of medical specialists was quicker to grasp the practical implications of functional localization than the neurosurgeons. The maps constructed by the localizationists allowed them to pinpoint the site of injury or disease based on the correspondence between symptoms (such as paralysis of a particular limb) and the brain region associated with the affected function. As a result, they could operate to remove tumors, abscesses, or scars with a degree of spatial confidence never before possible. Clinicians who saw that physical symptoms, such as pain, numbness, or seizures, could be eliminated or controlled by surgical removal of dysfunctional brain tissue began to wonder if surgery might also relieve intractable and debilitating psychological symptoms -- depression, extreme anxiety, or hallucinations. Some even went so far as to speculate that surgery could control the agitation and belligerence that often characterize mental illness and reduce the care of the mentally ill to a battle between patients and caretakers. A fortuitous meeting at the 1935 International Neurological Congress in London provided the critical inspiration for testing the idea that socially acceptable behavior could be controlled with brain surgery. During one of the most popular seminars of the meeting, Yale University researchers Carlyle Jacobsen and John Fulton described how they had effected a radical personality change in an unruly chimpanzee named Becky. Becky had threatened to upstage Jacobsen and Fulton's research on the anatomical basis of learning and memory by throwing violent temper tantrums during experimental sessions. 
But after surgical removal of the frontal segment of her cerebral cortex, she became a model subject, as docile and cooperative as the researchers' other chimpanzees. In Becky's surgical domestication, Portuguese neurosurgeon Egas Moniz, already renowned for his role in the development of radiologic techniques for imaging blood flow in the brain, saw both a novel solution to the problem of human neurosis and an opportunity to advance his own reputation. Following the presentation, he tried to persuade Jacobsen and Fulton to join him in determining whether surgery "should now be attempted to reduce severe anxiety and delusional states in humans." The appalled Yale scientists refused. But Moniz was undeterred. Three months later, assisted by fellow surgeon and long-time collaborator Pedro Almeida Lima, he drilled two small holes into the forehead of a severely depressed patient and injected enough alcohol to kill the exposed brain tissue. Impressed by her postoperative tranquility, they declared the operation a success. Over the next several months, Moniz and Lima refined their surgical procedure, fashioning a special knife, or "leucotome," by bending the tip of a steel needle to form a loop. With this new instrument, the surgeons removed up to four "cores" of tissue from both sides of the brain. Prefrontal leucotomy, as the procedure came to be known, ushered in a new era of medical treatment for the delusional, the suicidal, and the agitated. Only a year after the fateful meeting in London, Moniz published results from an initial group of twenty patients: fourteen had either recovered or improved substantially as a result of the surgery. Agitated or depressed patients seemed especially responsive; the surgery reduced their hostility, brightened their mood, and enhanced their tractability. Encouraged by this early success, Moniz performed about one hundred prefrontal leucotomies over the next eight years. 
Neurosurgeons in Europe, South America, and the United States also adopted the procedure, and many confirmed the Portuguese surgeon's glowing reports. But outside the medical literature, disturbing inconsistencies quietly surfaced. Moniz followed his patients closely for only a few days after surgery and ignored them after their discharge from the hospital. A colleague who did conduct long-term investigations, however, documented not only numerous relapses but many deaths. Other physicians began to question whether leucotomy was an "effort to maim and destroy the creative functions." Moniz himself fell victim to a primitive malpractice suit, shot by a dissatisfied former patient in an assault that left him partially paralyzed. To a world that had little to offer the mentally ill, Moniz was still a hero and leucotomy a humanitarian advance. The medical community chose to overlook the negative reports, awarding Moniz the 1949 Nobel Prize in Physiology or Medicine for his pioneering efforts to develop the surgical solution to violence he christened "psychosurgery." American neurologists Walter Freeman and James Watts of the George Washington University Hospital made Moniz's operation a household word. Freeman and Watts modified the procedure by opening the skull at the temple, inserting a scalpel, and swinging the knife in a wide arc to mow down both nerve cells and the fibers that connected them. Their goal was to sever completely the frontal cortical feedback loops that Freeman believed to be the anatomical basis of a "circuit of emotion" responsible for the confused thinking and agitated behavior of the mentally ill. Cutting the circuit, he argued, freed the tortured patient "from the tyranny of his own past, from the anxious self-searching that has become too terrible to endure." Watts became increasingly uneasy about Freeman's growing zealousness and self-promotion and bowed out of the partnership. 
But Freeman continued to streamline the operation he called "lobotomy" on his own. Substituting electroconvulsive shock for standard surgical anesthesia, he rapidly slashed the offending cortical pathways by hammering an ice pick through the bone surrounding the eye socket. Freeman aimed to transform lobotomy into a simple office procedure, boasting that a competent surgeon should be able to perform ten to fifteen such operations in a single morning. The return of thousands of psychologically damaged veterans after World War II stretched the limits of the already crowded mental hospitals and created a growth market for the burgeoning lobotomy industry. In fact, the Veterans Administration, foreseeing the problem, issued a memorandum in 1943 calling for "all consulting and staff neurosurgeons at its neuropsychiatric installations to receive training in prefrontal lobotomy operations." Fueled by such directives and the lack of effective alternatives, physicians and families caring for the mentally ill were only too ready to embrace what chronicler Elliot Valenstein termed "great and desperate cures." Between 1942 and the mid-1950s, lobotomy "liberated" perhaps as many as forty thousand American psychiatric patients from the tyranny of their past; over thirty-five hundred were "cured" by Freeman alone. But by the late 1950s, lobotomy had gone out of fashion. The 1957 introduction of chlorpromazine, first of the modern psychoactive medications, reduced both the risk and the cost of treatment and did much to temper enthusiasm for psychosurgery. In addition, psychosurgery had failed to live up to its initial promise. Long-term studies showed that although the surgery did relieve anxiety and agitation, it had little effect on the delusions and hallucinations that led to hospitalization in the first place. 
Worse, reports of deficits similar to those observed after Moniz's operations began to surface, suggesting that the price of the surgery was a subtle but profound loss of motivation, insight, and organization, an inability to maintain interest in all but the most rudimentary of tasks. Clinicians called it the "prefrontal syndrome." The public thought of it as surgical voodoo, a mind-control method that reduced patients to zombies and punished disruptive behavior by neural mutilation. Psychopharmacology euthanized lobotomy, but its spirit lived on. Locked into a constricted view that partitioned the brain into clearly demarcated centers, some neurologists maintained that the failures of psychosurgery were the result of poor resolution rather than a faulty premise. They insisted that more precise search-and-destroy procedures that targeted only the site of the putative damage and spared as much of the surrounding tissue as possible could control emotion safely and effectively, with benefits for both the patient and society. As a result, they worked to refine localization techniques. European neuroscientists, for example, pioneered surgical procedures for permanently implanting small-caliber electrodes, permitting repeated stimulation and observation of conscious animals. And at McGill University in Montreal, neurosurgeon Wilder Penfield produced exquisitely detailed maps of areas regulating speech, vision, hearing, memory, and even emotion in the human cortex by recording the responses of patients as he touched electrodes to dozens of sites across the surface of the brain. But his investigations were limited to the artificial environment of the operating room, his subjects to patients sedated and restrained in preparation for surgery. The invention of the transistor liberated electrical stimulation studies. 
Surgical implantation of miniature transistor radio receivers replaced cables and insulated test chambers and permitted researchers to study a freely moving animal in its natural environment. Remote-control stimulation was a boon to aggression research, thanks largely to the promotional efforts of a flamboyant Spanish expatriate, José Delgado. Radio-stimulated cats and monkeys became militant terrorists or gentle pacifists at Delgado's electrical bidding. A pulse of electricity that activated no more than a spot of tissue evoked an entire behavioral sequence. For example, a brief jolt to a point deep in the brain of a female monkey named Ludi sparked a series of events that "began with a change in facial expression, followed by her turning her head to the right, standing up, rotating to the right, walking upright, touching the walls of the cage or grasping a swing, climbing a pole, descending, uttering a low tone, threatening subordinate monkeys, and finally, changing her aggressive attitude to a peaceful one and approaching members of the colony in a friendly manner." Delgado welcomed publicity and encouraged controversy. He returned to Spain, donned a toreador's costume, and subdued a charging bull not with a cape but with a push of a button. He equipped a chimpanzee named Paddy with a "stimoceiver" that relayed electrical signals from the animal's brain to a nearby computer that radioed responses back to the brain, triumphantly announcing to the newspapers, "We are now talking to the brain without the participation of the senses." In a lecture at the American Museum of Natural History in New York, he declared, "It may certainly be predicted that the evolution of physical control of the brain...will continue at an accelerated pace, pointing hopefully toward the development of a more intelligent and peaceful mind of the species." 
By combining Delgado's state-of-the-art electrical mapping and high-resolution microsurgery, the new psychosurgeons of the 1960s sought to achieve this "intelligent and peaceful mind." Precision, or stereotaxic, psychosurgery began as an attempt to manage aggressive patients who also had intractable neurological or psychiatric conditions, particularly temporal lobe epilepsy. But it quickly came to be seen specifically as a "cure" for aggressive behavior:

    It was originally our intention to investigate the value of amygdalotomy upon patients with temporal lobe epilepsy characterized by psychomotor seizures as well as marked behavior disturbances such as hyperexcitability, assaultive behavior, or violent aggressiveness. The indications...were then extended to include patients without clinical manifestations of temporal lobe epilepsy, but with EEG [electroencephalogram] abnormalities and marked behavior disturbances. Finally, cases of behavior disorders without epileptic manifestations...were also included in the series.

The cure gathered support with the publication in 1970 of Violence and the Brain, by Harvard neurosurgeons Vernon Mark and Frank Ervin. In the book, Mark and Ervin identified a new psychiatric disorder they termed the dyscontrol syndrome, which was characterized by four symptoms: (1) a history of physical assault, especially wife and child beating; (2) the symptom of pathological intoxication -- that is, drinking even a small amount of alcohol triggers acts of senseless brutality; (3) a history of impulsive sexual behavior, at times including sexual assaults; and (4) a history (in those who drove cars) of many traffic violations and serious automobile accidents. Using the remote stimulation and recording procedures developed by Delgado, Mark and Ervin reported that they could detect local electrical abnormalities in patients with dyscontrol symptoms, sites they christened "brain triggers of violence."
They believed that by destroying these trigger sites, they could eliminate the unpredictable bursts of violent behavior. As evidence, they cited the case of a twenty-one-year-old woman with a history of seizures, panic attacks, and episodes of violent rage. During one such episode, she had stabbed another woman in the lounge of a movie theater. A second culminated in an attack on a psychiatric nurse. Exhaustive trials of "all known antiseizure medications, as well as the entire range of drugs used to help emotionally disturbed patients," years of psychotherapy, and over sixty electroshock treatments had failed to curb her unpredictable rages. Julia, surgically implanted with one of Delgado's "stimoceivers," was to become infamous in both the psychiatric literature and the popular press. Readers of Life magazine, for example, observed firsthand the graphic consequences of an electrical spark delivered by the stimoceiver to Julia's presumed "trigger site," and saw for themselves "the angry grimaces which included lip retraction and baring of the teeth, the ancient 'primate threat display,'" followed by a sudden, savage attack on the wall of her room. Bursts of abnormal brain waves churned from the second channel of her radio receiver. Could anyone doubt the wisdom of cutting this behavioral cancer out of poor Julia's brain? Mark and Ervin reported a dramatic decline in Julia's rage attacks after destruction of the trigger site on one side of her brain, a result they touted as convincing evidence of the antiaggressive effect of high-resolution psychosurgery. Close examination of some of their other patients, however, told a different story. A summary of the outcome in ten patients similar to Julia showed that in at least five, initial improvement was followed by a gradual return of their aggressive behavior. Only one -- a woman whose symptoms could be traced to a severe head injury -- was a clear success. 
The rise of social unrest during the turbulent 1960s and early 1970s activated a familiar desperation to "do something" about the problem of violence. Psychosurgeons had a ready answer. Collaborator and advocate William Sweet, in his introduction to Violence and the Brain, recommended that "knowledge gained about emotional brain function in violent persons with brain disease can be applied to combat the violence-triggering mechanisms in the brains of the nondiseased." Radio-controlled brain surgery, in other words, represented a reasonable and cost-effective alternative to long-term incarceration, argued advocates like California neurosurgeon M. Hunter Brown: "Each violent young criminal incarcerated from 20 years to life costs taxpayers perhaps $100,000. For roughly $6000, society can provide medical treatment which will transform him into a responsible well-adjusted citizen." The similarity of such statements to the economic arguments of the eugenicists did not trouble enthusiasts like Sweet and Brown. They found a receptive audience in get-tough legislators and law enforcement officials, who readily persuaded Congress to designate over $500,000 to support psychosurgery research. Opponents, however, quickly made the connection, particularly after a 1967 letter by Mark, Sweet, and Ervin to the Journal of the American Medical Association suggested that psychosurgery could solve the problem of urban riots: "We need intensive research and clinical studies of individuals committing the violence. The goal of such studies would be to pinpoint, diagnose, and treat those people with low violence thresholds before they contribute to further tragedies." 
Mark and Ervin did not overtly target any racial or ethnic group in their writing, but their comments easily resurrected the malevolent link between biology and racism forged by the eugenicists, and the suggestion that violent behavior was the result of having a "low violence threshold" reinforced the behavioral determinism that critics had read into Lorenz and Wilson. And if the sorry consequences of lobotomy were any indication, the "cure" for this illness was almost as bad as the disease. Once again, biological explanations for violence had proven hurtful, not helpful.

Explanation or Excuse?

The crowds began to arrive by midmorning. But unlike the typical outdoor gathering in this community of Philadelphia's fittest families, they hadn't come to watch a prep school field hockey final or an equestrian trial. The spectator sport that occasioned this tailgate picnic was a high-tension standoff between Delaware County police and local multimillionaire John du Pont. This live-action docudrama had actually begun two days earlier, on January 26, 1996, when du Pont shot and killed wrestler David Schultz, a former Olympic gold medalist who lived and trained at du Pont's Foxcatcher estate. After the murder, du Pont barricaded himself in the steel-lined library of his mansion and waged a forty-eight-hour war of wills with authorities. The siege finally ended when the local SWAT team thought of turning off the heat, and du Pont came out shivering. According to postarrest reports, du Pont was an accident waiting to happen. In fact, if anyone ever fit a crime novelist's portrayal of a man gone over the edge, it was John du Pont. He patrolled the 200-acre estate, which he believed to be the holy land of the Dalai Lama, in a tank. He mounted cameras in the corners to watch for ghosts, threatened to shoot his wife in the belief she was a Russian spy, and feared that the clocks in the gym were taking him back in time.
Other athletes admitted that du Pont's behavior was so bizarre that he would have been barred from the wrestling community had he not contributed so lavishly to the sport. Nonetheless, rumors that du Pont's defense would center on his mental competency provoked responses ranging from disgust to outrage. Schultz was well-liked in the wrestling community, and he left behind not only a wife but two young children. Du Pont's hasty retreat after the crime suggested to many that far from being oblivious to the implications of his actions, he had actively attempted to evade capture. And local papers hinted that with a personal fortune estimated at $200 million, du Pont could well afford to "buy" an insanity plea. Was du Pont mentally unfit even to stand trial? The decision hinged on two criteria: whether he was lucid enough to understand the charges against him and whether he was rational enough to assist his lawyers in planning his defense. By the time the legal dust had settled and a competency hearing finally got under way eight months after his arrest, nearly everyone agreed that John du Pont failed on both counts. Defense lawyers Richard Sprague and William Lamb (ultimately dismissed by du Pont because he believed they were part of a conspiracy headed by the CIA) argued that hundreds of hours of conversation with their client had failed to yield a single cogent discussion. Three defense psychiatrists, as well as two appointed by the court, were in "universal agreement...that du Pont was actively psychotic." A videotape from their interviews showed du Pont insisting that he was the Dalai Lama, describing a plane flight in which he'd commanded Bulgarian pilots, and reiterating his fears that he was the victim of a government conspiracy. On September 24, 1996, Delaware County court judge Patricia Jenkins ruled that du Pont was incompetent and ordered him committed to Norristown State Hospital for treatment. But she had no intention of leaving him there.
Jenkins ordered an update on the millionaire murderer's condition within sixty days -- and every ninety days after that until she could justify putting him on trial. On December 3, only two months after he was sent to Norristown, Jenkins was satisfied that du Pont had recovered sufficiently to meet the legal criteria for competency. John du Pont would have to answer for his actions after all. But what would really be on trial was not his behavior but his state of mind. Du Pont's guilt was a given. His new defense team knew they would never convince a jury that he was innocent; they could only hope that jurors would buy the argument that du Pont still belonged in a hospital rather than a prison. The medicalization of violence by researchers such as Delgado, Mark, and Ervin offended some because it appeared to perpetuate the earlier inequities of the eugenics movement. For others, however, the growth of biological psychiatry created a different monster: the fear that biology was the equivalent of an insanity plea writ large. Highly publicized cases of "Twinkie defenses," "abuse excuses," and collaborations between well-paid lawyers and unscrupulous psychiatrists fueled the misconception that biology was an excuse, not an explanation. Society has been undecided about the line between malice and mental illness for centuries. Clearly, some people are so delusional or incapacitated that punishing them seems worthless at best and inhumane at worst. On the other hand, unless the concept of insanity is rigorously defined, it does present an open invitation to a defendant with deep pockets or a clever defense attorney. British courts set the modern standard for the definition of insanity. 
In 1843, a woodturner named Daniel M'Naughten was acquitted of murdering a secretary to Prime Minister Sir Robert Peel on the grounds that his reasoning abilities had deteriorated to the point that he was no longer capable of understanding the "nature and quality of the act he was doing, or, if he did know it...he did not know he was doing what was wrong." This right-or-wrong yardstick, a criterion that came to be known as the M'Naughten test, was quickly adopted by the American judicial system as the definitive answer to the question of assessing mental competence. But legal, medical, and popular interpretations of moral incapacity quickly muddied the judicial waters. Some states, for example, extended the insanity defense to encompass uncontrollable impulses, or "homicidal mania," while defendants pleading mental disease and "temporary insanity" enjoyed an increasingly favorable reception from judges and juries. Eventually the link between violence and mental illness led to a turning point in the definition of competence. In a controversial 1954 decision, Washington, D.C. circuit court judge David Bazelon struck down the M'Naughten rule in favor of a new criterion that he believed better reflected current medical knowledge: "an accused is not criminally responsible if his unlawful act was the product of mental disease or mental defect." Durham v. the United States had a profound impact on the public's opinion of biological explanations for human aggression. Scandalized newspaper reports documented a surge of successful insanity pleas in some areas; in Bazelon's district, for example, such pleas rose from 0.4 percent at the time of the decision to over 14 percent by 1961. A judicial system that had once punished all but the most obviously deranged now seemed to accept every anxiety or blue mood as a legitimate excuse for murder. Sick and psycho replaced immoral and evil as the preferred terms for describing the most violent. 
Media coverage of high-profile cases, such as that of John Hinckley, Jr., declared insane and therefore acquitted of attempting to kill President Ronald Reagan, reinforced the belief that murderers and rapists routinely exploited the insanity defense to evade punishment. In fact, recent studies show that fewer than 1 percent of defendants facing felony indictments resort to the insanity defense; of these, no more than one quarter are successful. Both lawyers and psychiatrists report that juries are surprisingly skeptical of arguments attributing violent behavior to mental illness; well-known exceptions, like the jury that acquitted Hinckley, represent, in the words of one forensic psychiatrist, "aberrations that skew the public perception." Unfortunately, these aberrations have convinced the public that biology favors the violent at the expense of victims and that forensic psychiatrists are poised to throw open the floodgates of our prisons. The exceptions also foster a widespread misconception about the potential violence of the mentally ill. A recent study by the University of Pennsylvania Annenberg School of Communications, for example, discovered that the "psychotic" killer on a mad murder spree represents a common theme in television programming; in fact, 70 percent of mentally ill television characters were portrayed as violent. This "violent-maniac" stereotype translates into widespread discrimination and distrust, despite the fact that experts estimate no more than 20 percent of the mentally ill ever commit a violent act. The crime-fueled victimization of the mentally ill has not been lost on social critics of behavioral biology. They hear a familiar echo of biological determinism inherent in the insanity defense, the well-worn genetic mind-set that causes "millions to think that criminals are perhaps born that way; crime is in the blood, the genes, the bones."
As a result, biologically inspired legal initiatives have not only enraged conservative Americans but have also paradoxically fueled the worst fears of modern antieugenicists: that biological explanations for violent behavior will be used to isolate, discriminate, and persecute.

A Truce in the War Between Nature and Nurture

The miscalculation, misconduct, and misunderstanding that characterized biologically oriented studies of behavior for over a century created a rupture between nature and nurture that has often seemed as entrenched and as bitter as a civil war. The inevitable consequence of eugenics and psychosurgery, killer instincts and selfish genes, caged animals attacking dolls and murderers bartering for a hospital bed instead of a prison cell has been a profound distrust of behavioral science and the motives of biological researchers. Criticism of the biological perspective has centered on the determinism that permeated biology after Darwin. If genes rule, people are little more than genetic puppets, their behavior and their moral judgment tethered to the double helix. If they are bad, it can be blamed on an unwelcome mutation, and if they are good, they have a reproductive advantage to enjoy along with a clear conscience. While this "it's not my fault" view may bring comfort to some offenders, it brings only anguish to victims and their families. As a result, many believe that the biological perspective on aggression has little to offer victims and little to say about the lasting consequences of violence. Biological explanations, they argue, merely allow perpetrators to walk away from their actions or to reverse the charges, blaming the victim and society at large for a lack of understanding. Explanations for violence based on rigid thinking -- whether they exclude the brain or the environment -- are easy prey for ideologists.
Only a few decades ago, eugenic extremists decried those who opposed discriminatory limits on immigration and harsh laws forbidding interracial marriage as "race criminals." Today social critics like Peter Breggin charge that research on the biology of violence is part of a government-engineered conspiracy to implement "biomedical social control" of minorities, including enforced medication of children deemed to be at risk for committing violent crimes. The excesses of yesterday's advocates earned the outrage of today's detractors. But are such critics still justified? Does research on the behavioral biology of aggression constitute biomedical social control, or is it an undervalued and misunderstood option for understanding the problem of violence, not just among inner-city offenders and serial killers but in violent husbands, homicidal drivers, abusive parents, and even victims in all walks of American life? While the debate over eugenics, instincts, and sociobiology raged, neuroscience underwent a radical transformation, from a gawky scientific adolescent to a vigorous mature discipline. "Twenty-five years of progress" -- the slogan celebrating the 1995 silver anniversary of the neurobiological equivalent of the Screen Actors Guild, the Society for Neuroscience -- has not simply brought us more sophisticated skirmishes between the nature and nurture camps, but has significantly altered the way we think about brain structure and function. In a tribute to the sagacity of Flourens, new methods have yielded new data, and fresh interpretations are changing the very foundation of our understanding of the relationship between brain and behavior, gene and environment. Some observers see these advances as the symbol of a holistic biology that can finally mend the rift. Biology is not destiny. The century-old gap between nature and nurture has been bridged by the brain.
Critics who charge otherwise are trapped in an outmoded argument that ignores pivotal neuroscience discoveries and a paradigm shift in our understanding of the causes of human behavior. The critics may be fighting the same old battles, but the world itself has changed. Copyright © 1999 Debra Niehoff. All rights reserved.