Most Victorian gentlemen could not have imagined that the coming century would see a nation-state forged by Jewish pioneers and soldiers, a wave of African American public intellectuals, or a software industry in Bangalore.
Nor could they have anticipated that women would lead nations in wars, run huge corporations, or win Nobel Prizes in science.
We now know that people of both sexes and all races are capable of attaining any station in life.
This sea change included a revolution in the treatment of human nature by scientists and scholars. Academics were swept along by the changing attitudes to race and sex, but they also helped to direct the tide by holding forth on human nature in books and magazines and by lending their expertise to government agencies. The prevailing theories of mind were refashioned to make racism and sexism as untenable as possible. The doctrine of the Blank Slate became entrenched in intellectual life in a form that has been called the Standard Social Science Model or social constructionism. 5 The model is now second nature to people and few are aware of the history behind it. 6 Carl Degler, the foremost historian of this revolution, sums it up this way:
What the available evidence does seem to show is that ideology or a philosophical belief that the world could be a freer and more just place played a large part in the shift from biology to culture. Science, or at least certain scientific principles or innovative scholarship also played a role in the transformation, but only a limited one. The main impetus came from the will to establish a social order in which innate and immutable forces of biology played no role in accounting for the behavior of social groups. 7
The takeover of intellectual life by the Blank Slate followed different paths in psychology and in the other social sciences, but they were propelled by the same historical events and progressive ideology. By the second and third decades of the twentieth century, stereotypes of women and ethnic groups were starting to look silly. Waves of immigrants from southern and eastern Europe, including many Jews, were filling the cities and climbing the social ladder. African Americans had taken advantage of the new "Negro colleges," had migrated northward, and had begun the Harlem Renaissance. The graduates of flourishing women's colleges helped launch the first wave of feminism. For the first time not all professors and students were white Anglo-Saxon Protestant males. To say that this sliver of humanity was constitutionally superior had not only become offensive but went against what people could see with their own eyes. The social sciences in particular were attracting women, Jews, Asians, and African Americans, some of whom became influential thinkers.
Many of the pressing social problems of the first decades of the twentieth century concerned the less fortunate members of these groups. Should more immigrants be let in, and if so, from which countries? Once here, should they be encouraged to assimilate, and if so, how? Should women be given equal political rights and economic opportunities? Should blacks and whites be integrated? Other challenges were posed by children. 8 Education had become compulsory and a responsibility of the state. As the cities teemed and family ties loosened, troubled and troublesome children became everyone's problem, and new institutions were invented to deal with them, such as kindergartens, orphanages, reform schools, fresh-air camps, humane societies, and boys' and girls' clubs. Child development was suddenly on the front burner. These social challenges were not going to go away, and the most humane assumption was that all human beings had an equal potential to prosper if they were given the right upbringing and opportunities. Many social scientists saw it as their job to reinforce that assumption.
Modern psychological theory, as every introductory textbook makes clear, has roots in John Locke and other Enlightenment thinkers. For Locke the Blank Slate was a weapon against the church and tyrannical monarchs, but these threats had subsided in the English-speaking world by the nineteenth century. Locke's intellectual heir John Stuart Mill (1806-1873) was perhaps the first to apply his blank-slate psychology to political concerns we recognize today. He was an early supporter of women's suffrage, compulsory education, and the improvement of the conditions of the lower classes. This interacted with his stands in psychology and philosophy, as he explained in his autobiography:
I have long felt that the prevailing tendency to regard all the marked distinctions of human character as innate, and in the main indelible, and to ignore the irresistible proofs that by far the greater part of those differences, whether between individuals, races, or sexes, are such as not only might but naturally would be produced by differences in circumstances, is one of the chief hindrances to the rational treatment of great social questions, and one of the greatest stumbling blocks to human improvement. . . . [This tendency is] so agreeable to human indolence, as well as to conservative interests generally, that unless attacked at the very root, it is sure to be carried to even a greater length than is really justified by the more moderate forms of intuitional philosophy. 9
By "intuitional philosophy" Mill was referring to Continental intellectuals who maintained (among other things) that the categories of reason were innate. Mill wanted to attack their theory of psychology at the root to combat what he thought were its conservative social implications. He refined a theory of learning called associationism (previously formulated by Locke) that tried to explain human intelligence without granting it any innate organization. According to this theory, the blank slate is inscribed with sensations, which Locke called "ideas" and modern psychologists call "features. " Ideas that repeatedly appear in succession (such as the redness, roundness, and sweetness of an apple) become associated, so that any one of them can call to mind the others. And similar objects in the world activate overlapping sets of ideas in the mind. For example, after many dogs present themselves to the senses, the features that they share (fur, barking, four legs, and so on) hang together to stand for the category "dog. "
The associationism of Locke and Mill has been recognizable in psychology ever since. It became the core of most models of learning, especially in the approach called behaviorism, which dominated psychology from the 1920s to the 1960s. The founder of behaviorism, John B. Watson (1878-1958), wrote one of the century's most famous pronouncements of the Blank Slate:
Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I'll guarantee to take any one at random and train him to become any type of specialist I might select -- doctor, lawyer, artist, merchant-chief, and yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors. 10
In behaviorism, an infant's talents and abilities didn't matter because there was no such thing as a talent or an ability. Watson had banned them from psychology, together with other contents of the mind, such as ideas, beliefs, desires, and feelings. They were subjective and unmeasurable, he said, and unfit for science, which studies only objective and measurable things. To a behaviorist, the only legitimate topic for psychology is overt behavior and how it is controlled by the present and past environment. (There is an old joke in psychology: What does a behaviorist say after making love? "It was good for you; how was it for me? ")
Locke's "ideas" had been replaced by "stimuli" and "responses," but his laws of association survived as laws of conditioning. A response can be associated with a new stimulus, as when Watson presented a baby with a white rat and then clanged a hammer against an iron bar, allegedly making the baby associate fear with fur. And a response could be associated with a reward, as when a cat in a box eventually learned that pulling a string opened a door and allowed it to escape. In these cases an experimenter set up a contingency between a stimulus and another stimulus or between a response and a reward. In a natural environment, said the behaviorists, these contingencies are part of the causal texture of the world, and they inexorably shape the behavior of organisms, including humans.
Among the casualties of behaviorist minimalism was the rich psychology of William James (1842-1910). James had been inspired by Darwin's argument that perception, cognition, and emotion, like physical organs, had evolved as biological adaptations. James invoked the notion of instinct to explain the preferences of humans, not just those of animals, and he posited numerous mechanisms in his theory of mental life, including short-term and long-term memory. But with the advent of behaviorism they all joined the index of forbidden concepts. The psychologist J. R. Kantor wrote in 1923: "Brief is the answer to the question as to what is the relationship between social psychology and instincts. Plainly, there is no relationship."11 Even sexual desire was redefined as a conditioned response. The psychologist Zing Yang Kuo wrote in 1929:
Behavior is not a manifestation of hereditary factors, nor can it be expressed in terms of heredity. [It is] a passive and forced movement mechanically and solely determined by the structural pattern of the organism and the nature of environmental forces. . . . All our sexual appetites are the result of social stimulation. The organism possesses no ready-made reaction to the other sex, any more than it possesses innate ideas. 12
Behaviorists believed that behavior could be understood independently of the rest of biology, without attention to the genetic makeup of the animal or the evolutionary history of the species. Psychology came to consist of the study of learning in laboratory animals. B. F. Skinner (1904-1990), the most famous psychologist in the middle decades of the twentieth century, wrote a book called The Behavior of Organisms in which the only organisms were rats and pigeons and the only behavior was lever pressing and key pecking. It took a trip to the circus to remind psychologists that species and their instincts mattered after all. In an article called "The Misbehavior of Organisms," Skinner's students Keller and Marian Breland reported that when they tried to use his techniques to train animals to insert poker chips into vending machines, the chickens pecked the chips, the raccoons washed them, and the pigs tried to root them with their snouts. 13 And behaviorists were as hostile to the brain as they were to genetics. As late as 1974, Skinner wrote that studying the brain was just another misguided quest to find the causes of behavior inside the organism rather than out in the world. 14
Behaviorism not only took over psychology but infiltrated the public consciousness. Watson wrote an influential childrearing manual recommending that parents establish rigid feeding schedules for their children and give them a minimum of attention and love. If you comfort a crying child, he wrote, you will reward him for crying and thereby increase the frequency of crying behavior. (Benjamin Spock's Baby and Child Care, first published in 1946 and famous for recommending indulgence toward children, was in part a reaction to Watson.) Skinner wrote several bestsellers arguing that harmful behavior is neither instinctive nor freely chosen but inadvertently conditioned. If we turned society into a big Skinner box and controlled behavior deliberately rather than haphazardly, we could eliminate aggression, overpopulation, crowding, pollution, and inequality, and thereby attain Utopia. 15 The noble savage became the noble pigeon.
Strict behaviorism is pretty much dead in psychology, but many of its attitudes live on. Associationism is the learning theory assumed by many mathematical models and neural network simulations of learning. 16 Many neuroscientists equate learning with the forming of associations, and look for an associative bond in the physiology of neurons and synapses, ignoring other kinds of computation that might implement learning in the brain. 17 (For example, storing the value of a variable in the brain, as in "x = 3," is a critical computational step in navigating and foraging, which are highly developed talents of animals in the wild. But this kind of learning cannot be reduced to the formation of associations, and so it has been ignored in neuroscience. ) Psychologists and neuroscientists still treat organisms interchangeably, seldom asking whether a convenient laboratory animal (a rat, a cat, a monkey) is like or unlike humans in crucial ways. 18 Until recently, psychology ignored the content of beliefs and emotions and the possibility that the mind had evolved to treat biologically important categories in different ways. 19 Theories of memory and reasoning didn't distinguish thoughts about people from thoughts about rocks or houses. Theories of emotion didn't distinguish fear from anger, jealousy, or love. 20 Theories of social relations didn't distinguish among family, friends, enemies, and strangers. 21 Indeed, the topics in psychology that most interest laypeople -- love, hate, work, play, food, sex, status, dominance, jealousy, friendship, religion, art -- are almost completely absent from psychology textbooks.
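To make the parenthetical contrast above concrete, here is a second toy sketch, again purely illustrative; the function names are invented and the example is not drawn from any actual neuroscience model. An animal performing dead reckoning has to store and update the values of variables, such as its displacement from the nest, and no tally of stimulus-stimulus associations will do that work.

    import math

    # Dead reckoning (path integration) as the storage and updating of variables.
    # The forager's knowledge is two stored numbers, its displacement from the
    # nest, rather than a bond between any stimuli it has experienced.
    x, y = 0.0, 0.0  # current position relative to the nest ("x = 3"-style state)

    def move(heading_deg, distance):
        # Update the stored position after travelling `distance` at `heading_deg`.
        global x, y
        x += distance * math.cos(math.radians(heading_deg))
        y += distance * math.sin(math.radians(heading_deg))

    def course_home():
        # Compute a direct bearing and distance back to the nest from the stored variables.
        return math.degrees(math.atan2(-y, -x)) % 360, math.hypot(x, y)

    # A meandering outbound trip...
    move(0, 10)
    move(90, 5)
    move(180, 2)

    print(course_home())  # one straight-line course home, a path never travelled before

The forager's ability to head straight home resides in those two stored values, not in any association between the legs of the outbound journey.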
One of the major documents of late twentieth-century psychology was the two-volume Parallel Distributed Processing by David Rumelhart, James McClelland, and their collaborators, which presented a style of neural network modeling called connectionism. 22 Rumelhart and McClelland argued that generic associationist networks, subjected to massive amounts of training, could explain all of cognition. They realized that this theory left them without a good answer to the question "Why are people smarter than rats? " Here is their answer:
Given all of the above, the question does seem a bit puzzling. . . . People have much more cortex than rats do or even than other primates do; in particular they have very much more . . . brain structure not dedicated to input/output -- and presumably, this extra cortex is strategically placed in the brain to subserve just those functions that differentiate people from rats or even apes. . . .
But there must be another aspect to the difference between rats and people as well. This is that the human environment includes other people and the cultural devices that they have developed to organize their thinking processes. 23
Humans, then, are just rats with bigger blank slates, plus something called "cultural devices." And that brings us to the other half of the twentieth-century revolution in social science.
He's so unhip, when you say "Dylan,"
He thinks you're talkin' about Dylan Thomas (whoever he was). The man ain't got no culture.
-- Simon and Garfunkel
The word culture used to refer to exalted genres of entertainment, such as poetry, opera, and ballet. The other familiar sense -- "the totality of socially transmitted behavior patterns, arts, beliefs, institutions, and all other products of human work and thought" -- is only a century old. This change in the English language is just one of the legacies of the father of modern anthropology, Franz Boas (1858-1942).
The ideas of Boas, like the ideas of the major thinkers in psychology, were rooted in the empiricist philosophers of the Enlightenment, in this case George Berkeley (1685-1753). Berkeley formulated the theory of idealism, the notion that ideas, not bodies and other hunks of matter, are the ultimate constituents of reality. After twists and turns that are too convoluted to recount here, idealism became influential among nineteenth-century German thinkers. It was embraced by the young Boas, a German Jew from a secular, liberal family.
Idealism allowed Boas to lay a new intellectual foundation for egalitarianism. The differences among human races and ethnic groups, he proposed, come not from their physical constitution but from their culture, a system of ideas and values spread by language and other forms of social behavior. Peoples differ because their cultures differ. Indeed, that is how we should refer to them: the Eskimo culture or the Jewish culture, not the Eskimo race or the Jewish race. The idea that minds are shaped by culture served as a bulwark against racism and was the theory one ought to prefer on moral grounds. Boas wrote, "I claim that, unless the contrary can be proved, we must assume that all complex activities are socially determined, not hereditary. "24
Boas's case was not just a moral injunction; it was rooted in real discoveries. Boas studied native peoples, immigrants, and children in orphanages to prove that all groups of humans had equal potential. Turning Jespersen on his head, Boas showed that the languages of primitive peoples were not simpler than those of Europeans; they were just different. Eskimos' difficulty in discriminating the sounds of our language, for example, is matched by our difficulty in discriminating the sounds of theirs. True, many non-Western languages lack the means to express certain abstract concepts. They may have no words for numbers higher than three, for example, or no word for goodness in general as opposed to the goodness of a particular person. But those limitations simply reflect the daily needs of those people as they live their lives, not an infirmity in their mental abilities. As in the story of Socrates drawing abstract philosophical concepts out of a slave boy, Boas showed that he could elicit new word forms for abstract concepts like "goodness" and "pity" out of a Kwakiutl native from the Pacific Northwest. He also observed that when native peoples come into contact with civilization and acquire things that have to be counted, they quickly adopt a full-blown counting system. 25
For all his emphasis on culture, Boas was not a relativist who believed that all cultures are equivalent, nor was he an empiricist who believed in the Blank Slate. He considered European civilization superior to tribal cultures, insisting only that all peoples were capable of achieving it. He did not deny that there might be a universal human nature, or that there might be differences among people within an ethnic group. What mattered to him was the idea that all ethnic groups are endowed with the same basic mental abilities. 26 Boas was right about this, and today it is accepted by virtually all scholars and scientists.
But Boas had created a monster. His students came to dominate American social science, and each generation outdid the previous one in its sweeping pronouncements. Boas's students insisted not just that differences among ethnic groups must be explained in terms of culture but that every aspect of human existence must be explained in terms of culture. For example, Boas had favored social explanations unless they were disproven, but his student Alfred Kroeber favored them regardless of the evidence. "Heredity," he wrote, "cannot be allowed to have acted any part in history."27 Instead, the chain of events shaping a people "involves the absolute conditioning of historical events by other historical events."28
Kroeber did not just deny that social behavior could be explained by innate properties of minds. He denied that it could be explained by any properties of minds. A culture, he wrote, is superorganic -- it floats in its own universe, free of the flesh and blood of actual men and women: "Civilization is not mental action but a body or stream of products of mental exercise. . . . Mentality relates to the individual. The social or cultural, on the other hand, is in its essence non-individual. Civilization as such begins only where the individual ends. "29
These two ideas -- the denial of human nature, and the autonomy of culture from individual minds -- were also articulated by the founder of sociology, Emile Durkheim (1858-1917), who had foreshadowed Kroeber's doctrine of the superorganic mind:
Every time that a social phenomenon is directly explained by a psychological phenomenon, we may be sure that the explanation is false. . . .
The group thinks, feels, and acts quite differently from the way in which members would were they isolated. . . . If we begin with the individual in seeking to explain phenomena, we shall be able to understand nothing of what takes place in the group. . . . Individual natures are merely the indeterminate material that the social factor molds and transforms. Their contribution consists exclusively in very general attitudes, in vague and consequently plastic predispositions. 30
And he laid down a law for the social sciences that would be cited often in the century to come: "The determining cause of a social fact should be sought among the social facts preceding it and not among the states of individual consciousness. "31
Both psychology and the other social sciences, then, denied that the minds of individual people were important, but they set out in different directions from there. Psychology banished mental entities like beliefs and desires altogether and replaced them with stimuli and responses. The other social sciences located beliefs and desires in cultures and societies rather than in the heads of individual people. The different social sciences also agreed that the contents of cognition -- ideas, thoughts, plans, and so on -- were really phenomena of language, overt behavior that anyone could hear and write down. (Watson proposed that "thinking" really consisted of teensy movements of the mouth and throat. ) But most of all they shared a dislike of instincts and evolution. Prominent social scientists repeatedly declared the slate to be blank:
Instincts do not create customs; customs create instincts, for the putative instincts of human beings are always learned and never native.
-- Ellsworth Faris (1927)32
Cultural phenomena . . . are in no respect hereditary but are characteristically and without exception acquired.
-- George Murdock (1932)33
Man has no nature; what he has is history.
-- Jose Ortega y Gasset (1935)34
With the exception of the instinctoid reactions in infants to sudden withdrawals of support and to sudden loud noises, the human being is entirely instinctless. . . . Man is man because he has no instincts, because everything he is and has become he has learned, acquired, from his culture, from the man-made part of the environment, from other human beings.
-- Ashley Montagu (1973)35
True, the metaphor of choice was no longer a scraped tablet or white paper. Durkheim had spoken of "indeterminate material," some kind of blob that was molded or pounded into shape by culture. Perhaps the best modern metaphor is Silly Putty, the rubbery stuff that children use both to copy printed matter (like a blank slate) and to mold into desired shapes (like indeterminate material). The malleability metaphor resurfaced in statements by two of Boas's most famous students:
Most people are shaped to the form of their culture because of the malleability of their original endowment. . . . The great mass of individuals take quite readily the form that is presented to them.
-- Ruth Benedict (1934)36
We are forced to conclude that human nature is almost unbelievably malleable, responding accurately and contrastingly to contrasting cultural conditions.
-- Margaret Mead (1935)37
Others likened the mind to some kind of sieve:
Much of what is commonly called "human nature" is merely culture thrown against a screen of nerves, glands, sense organs, muscles, etc.
-- Leslie White (1949)38
Or to the raw materials for a factory:
Human nature is the rawest, most undifferentiated of raw material.
-- Margaret Mead (1928)39
Our ideas, our values, our acts, even our emotions, are, like our nervous system itself, cultural products -- products manufactured, indeed, out of tendencies, capacities, and dispositions with which we were born, but manufactured nonetheless.
-- Clifford Geertz (1973)40
Or to an unprogrammed computer:
Man is the animal most desperately dependent upon such extragenetic, outside-the-skin control mechanisms, such cultural programs, for ordering his behavior.
-- Clifford Geertz (1973)41
Or to some other amorphous entity that can have many things done to it:
Cultural psychology is the study of the way cultural traditions and social practices regulate, express, transform, and permute the human psyche, resulting less in psychic unity for humankind than in ethnic divergences in mind, self and emotion.
-- Richard Shweder (1990)42
The superorganic or group mind also became an article of faith in social science. Robert Lowie (another Boas student) wrote, "The principles of psychology are as incapable of accounting for the phenomena of culture as is gravitation to account for architectural styles."43 And in case you missed its full implications, the anthropologist Leslie White spelled it out:
Instead of regarding the individual as a First Cause, as a prime mover, as the initiator and determinant of the culture process, we now see him as a component part, and a tiny and relatively insignificant part at that, of a vast, socio-cultural system that embraces innumerable individuals at any one time and extends back into their remote past as well. . . . For purposes of scientific interpretation, the culture process may be regarded as a thing sui generis; culture is explainable in terms of culture. 44
In other words, we should forget about the mind of an individual person like you, that tiny and insignificant part of a vast sociocultural system. The mind that counts is the one belonging to the group, which is capable of thinking, feeling, and acting on its own.
The doctrine of the superorganism has had an impact on modern life that extends well beyond the writings of social scientists. It underlies the tendency to reify "society" as a moral agent that can be blamed for sins as if it were a person. It drives identity politics, in which civil rights and political perquisites are allocated to groups rather than to individuals. And as we shall see in later chapters, it defined some of the great divides between major political systems in the twentieth century.
The Blank Slate was not the only part of the official theory that social scientists felt compelled to prop up. They also strove to consecrate the Noble Savage. Mead painted a Gauguinesque portrait of native peoples as peaceable, egalitarian, materially satisfied, and sexually unconflicted. Her uplifting vision of who we used to be -- and therefore who we can become again -- was accepted by such otherwise skeptical writers as Bertrand Russell and H. L. Mencken. Ashley Montagu (also from the Boas circle), a prominent public intellectual from the 1950s until his recent death, tirelessly invoked the doctrine of the Noble Savage to justify the quest for brotherhood and peace and to refute anyone who might think such efforts were futile. In 1950, for example, he drafted a manifesto for the newly formed UNESCO that declared, "Biological studies lend support to the ethic of universal brotherhood, for man is born with drives toward co-operation, and unless these drives are satisfied, men and nations alike fall ill."45 With the ashes of thirty-five million victims of World War II still warm or radioactive, a reasonable person might wonder how "biological studies" could show anything of the kind. The draft was rejected, but Montagu had better luck in the decades to come, when UNESCO and many scholarly societies adopted similar resolutions. 46
More generally, social scientists saw the malleability of humans and the autonomy of culture as doctrines that might bring about the age-old dream of perfecting mankind. We are not stuck with what we don't like about our current predicament, they argued. Nothing prevents us from changing it except a lack of will and the benighted belief that we are permanently consigned to it by biology. Many social scientists have expressed the hope of a new and improved human nature:
I felt (and said so early) that the environmental explanation was preferable, whenever justified by the data, because it was more optimistic, holding out the hope of improvement.
-- Otto Klineberg (1928)47
? Modern sociology and modern anthropology are one in saying that the substance of culture, or civilization, is social tradition and that this social tradition is indefinitely modifiable by further learning on the part of men for happier and better ways of living together. . . . Thus the scientific study of institutions awakens faith in the possibility of remaking both human nature and human social life.
-- Charles Ellwood (1922)48
Barriers in many fields of knowledge are falling before the new optimism which is that anybody can learn anything. . . . We have turned away from the concept of human ability as something fixed in the physiological structure, to that of a flexible and versatile mechanism subject to great improvement.
-- Robert Faris (1961)49
Though psychology is not as politicized as some of the other social sciences, it too is sometimes driven by a Utopian vision in which changes in child-rearing and education will ameliorate social pathologies and improve human welfare. And psychological theorists sometimes try to add moral heft to arguments for connectionism or other empiricist theories with warnings about the pessimistic implications of innatist theories. They argue, for example, that innatist theories open the door to inborn differences, which could foster racism, or that the theories imply that human traits are unchangeable, which could weaken support for social programs. 50
Twentieth-century social science embraced not just the Blank Slate and the Noble Savage but the third member of the trinity, the Ghost in the Machine. The declaration that we can change what we don't like about ourselves became a watchword of social science. But that only raises the question "Who or what is the 'we'?" If the "we" doing the remaking are just other hunks of matter in the biological world, then any malleability of behavior we discover would be cold comfort, because we, the molders, would be biologically constrained and therefore might not mold people, or allow ourselves to be molded, in the most socially salutary way. A ghost in the machine is the ultimate liberator of human will -- including the will to change society -- from mechanical causation. The anthropologist Loren Eiseley made this clear when he wrote:
The mind of man, by indetermination, by the power of choice and cultural communication, is on the verge of escape from the blind control of that deterministic world with which the Darwinists had unconsciously shackled man. The inborn characteristics laid upon him by the biological extremists have crumbled away. . . . Wallace saw and saw correctly, that with the rise of man the evolution of parts was to a marked degree outmoded, that mind was now the arbiter of human destiny. 51
The "Wallace" that Eiseley is referring to is Alfred Russel Wallace (1823-1913), the co-discoverer with Darwin of natural selection. Wallace parted company from Darwin by claiming that the human mind could not be explained by evolution and must have been designed by a superior intelligence. He certainly did believe that the mind of man could escape "the blind control of a deterministic world. " Wallace became a spiritualist and spent the later years of his career searching for a way to communicate with the souls of the dead.
The social scientists who believed in an absolute separation of culture from biology may not have literally believed in a spook haunting the brain. Some used the analogy of the difference between living and nonliving matter. Kroeber wrote: "The dawn of the social. . . is not a link in any chain, not a step in a path, but a leap to another plane. . . . [It is like] the first occurrence of life in the hitherto lifeless universe. . . . From this moment on there should be two worlds in place of one."52 And Lowie insisted that it was "not mysticism, but sound scientific method" to say that culture was "sui generis" and could be explained only by culture, because everyone knows that in biology a living cell can come only from another living cell. 53
At the time that Kroeber and Lowie wrote, they had biology on their side. Many biologists still thought that living things were animated by a special essence, an elan vital, and could not be reduced to inanimate matter. A 1931 history of biology, referring to genetics as it was then understood, said, "Thus the last of the biological theories leaves us where we first started, in the presence of a power called life or psyche which is not only of its own kind but unique in each and all of its exhibitions. "54 In the next chapter we will see that the analogy between the autonomy of culture and the autonomy of life would prove to be more telling than these social scientists realized.
Chapter 3
The Last Wall to Fall
In 1755 Samuel Johnson wrote that his dictionary should not be expected to "change sublunary nature, and clear the world at once from folly, vanity, and affectation. " Few people today are familiar with the lovely word sublunary, literally "below the moon. " It alludes to the ancient belief in a strict division between the pristine, lawful, unchanging cosmos above and our grubby, chaotic, fickle Earth below. The division was already obsolete when Johnson used the word: Newton had shown that the same force that pulled an apple toward the ground kept the moon in its celestial orbit.
Newton's theory that a single set of laws governed the motions of all objects in the universe was the first event in one of the great developments in human understanding: the unification of knowledge, which the biologist E. O. Wilson has termed consilience. 1 Newton's breaching of the wall between the terrestrial and the celestial was followed by a collapse of the once equally firm (and now equally forgotten) wall between the creative past and the static present. That happened when Charles Lyell showed that the Earth was sculpted in the past by forces we see today (such as earthquakes and erosion) acting over immense spans of time.
The living and nonliving, too, no longer occupy different realms. In 1628 William Harvey showed that the human body is a machine that runs by hydraulics and other mechanical principles. In 1828 Friedrich Wohler showed that the stuff of life is not a magical, pulsating gel but ordinary compounds following the laws of chemistry. Charles Darwin showed how the astonishing diversity of life and its ubiquitous signs of design could arise from the physical process of natural selection among replicators. Gregor Mendel, and then James Watson and Francis Crick, showed how replication itself could be understood in physical terms.
The unification of our understanding of life with our understanding of matter and energy was the greatest scientific achievement of the second half of the twentieth century. One of its many consequences was to pull the rug out from under social scientists like Kroeber and Lowie who had invoked the "sound scientific method" of placing the living and nonliving in parallel universes. We now know that cells did not always come from other cells and that the emergence of life did not create a second world where before there was just one. Cells evolved from simpler replicating molecules, a nonliving part of the physical world, and may be understood as collections of molecular machinery -- fantastically complicated machinery, of course, but machinery nonetheless.
This leaves one wall standing in the landscape of knowledge, the one that twentieth-century social scientists guarded so jealously. It divides matter from mind, the material from the spiritual, the physical from the mental, biology from culture, nature from society, and the sciences from the social sciences, humanities, and arts. The division was built into each of the doctrines of the official theory: the blank slate given by biology versus the contents inscribed by experience and culture, the nobility of the savage in the state of nature versus the corruption of social institutions, the machine following inescapable laws versus the ghost that is free to choose and to improve the human condition.
But this wall, too, is falling. New ideas from four frontiers of knowledge -- the sciences of mind, brain, genes, and evolution -- are breaching the wall with a new understanding of human nature. In this chapter I will show how they are filling in the blank slate, declassing the noble savage, and exorcising the ghost in the machine. In the following chapter I will show that this new conception of human nature, connected to biology from below, can in turn be connected to the humanities and social sciences above. That new conception can give the phenomena of culture their due without segregating them into a parallel universe.
The first bridge between biology and culture is the science of mind, cognitive science. 2 The concept of mind has been perplexing for as long as people have reflected on their thoughts and feelings. The very idea has spawned paradoxes, superstitions, and bizarre theories in every period and culture. One can almost sympathize with the behaviorists and social constructionists of the first half of the twentieth century, who looked on minds as enigmas or conceptual traps that were best avoided in favor of overt behavior or the traits of a culture.
But beginning in the 1950s with the cognitive revolution, all that changed. It is now possible to make sense of mental processes and even to study them in the lab. And with a firmer grasp on the concept of mind, we can see that many tenets of the Blank Slate that once seemed appealing are now unnecessary or even incoherent. Here are five ideas from the cognitive revolution that have revamped how we think and talk about minds.
The first idea: The mental world can be grounded in the physical world by the concepts of information, computation, and feedback. A great divide between mind and matter has always seemed natural because behavior appears to have a different kind of trigger than other physical events. Ordinary events have causes, it seems, but human behavior has reasons. I once participated in a BBC television debate on whether "science can explain human behavior." Arguing against the resolution was a philosopher who asked how we might explain why someone was put in jail. Say it was for inciting racial hatred. The intention, the hatred, and even the prison, she said, cannot be described in the language of physics. There is simply no way to define "hatred" or "jail" in terms of the movements of particles.
This sea change included a revolution in the treatment of human nature by scientists and scholars. Academics were swept along by the changing attitudes to race and sex, but they also helped to direct the tide by holding forth on human nature in books and magazines and by lending their expertise to government agencies. The prevailing theories of mind were refashioned to make racism and sexism as untenable as possible. The doctrine of the Blank {17} Slate
? ? ? ? ? became entrenched in intellectual life in a form that has been called the Standard Social Science Model or social constructionism. 5 The model is now second nature to people and few are aware of the history behind it. 6 Carl Degler, the foremost historian of this revolution, sums it up this way:
What the available evidence does seem to show is that ideology or a philosophical belief that the world could be a freer and more just place played a large part in the shift from biology to culture. Science, or at least certain scientific principles or innovative scholarship also played a role in the transformation, but only a limited one. The main impetus came from the will to establish a social order in which innate and immutable forces of biology played no role in accounting for the behavior of social groups. 7
The takeover of intellectual life by the Blank Slate followed different paths in psychology and in the other social sciences, but they were propelled by the same historical events and progressive ideology. By the second and third decades of the twentieth century, stereotypes of women and ethnic groups were starting to look silly. Waves of immigrants from southern and eastern Europe, including many Jews, were filling the cities and climbing the social ladder. African Americans had taken advantage of the new "Negro colleges," had migrated northward, and had begun the Harlem Renaissance. The graduates of flourishing women's colleges helped launch the first wave of feminism. For the first time not all professors and students were white Anglo-Saxon Protestant males. To say that this sliver of humanity was constitutionally superior had not only become offensive but went against what people could see with their own eyes. The social sciences in particular were attracting women, Jews, Asians, and African Americans, some of whom became influential thinkers.
Many of the pressing social problems of the first decades of the twentieth century concerned the less fortunate members of these groups. Should more immigrants be let in, and if so, from which countries? Once here, should they be encouraged to assimilate, and if so, how? Should women be given equal political rights and economic opportunities? Should blacks and whites be integrated? Other challenges were posed by children. 8 Education had become compulsory and a responsibility of the state. As the cities teemed and family ties loosened, troubled and troublesome children became everyone's problem, and new institutions were invented to deal with them, such as kindergartens, orphanages, reform schools, fresh-air camps, humane societies, and boys' and girls' clubs. Child development was suddenly on the front burner. These social challenges were not going to go away, and the most humane assumption was that all human beings had an equal potential to prosper if they were given the {18} right upbringing and opportunities. Many social scientists saw i~t as their job to reinforce that assumption.
Modern psychological theory, as every introductory textbook makes clear, has roots in John Locke and other Enlightenment thinkers. For Locke the Blank Slate was a weapon against the church and tyrannical monarchs, but these threats had subsided in the English-speaking world by the nineteenth century. Locke's intellectual heir John Stuart Mill (1806-1873) was perhaps the first to apply his blank-slate psychology to political concerns we recognize today. He was an early supporter of women's suffrage, compulsory education, and the improvement of the conditions of the lower classes. This interacted with his stands in psychology and philosophy, as he explained in his autobiography:
I have long felt that the prevailing tendency to regard all the marked distinctions of human character as innate, and in the main indelible, and to ignore the irresistible proofs that by far the greater part of those differences, whether between individuals, races, or sexes, are such as not only might but naturally would be produced by differences in circumstances, is one of the chief hindrances to the rational treatment of great social questions, and one of the greatest stumbling blocks to human improvement. . . . [This tendency is] so agreeable to human indolence, as well as to conservative interests generally, that unless attacked at the very root, it is sure to be carried to even a greater length than is really justified by the more moderate forms of intuitional philosophy. 9
By "intuitional philosophy" Mill was referring to Continental intellectuals who maintained (among other things) that the categories of reason were innate. Mill wanted to attack their theory of psychology at the root to combat what he thought were its conservative social implications. He refined a theory of learning called associationism (previously formulated by Locke) that tried to explain human intelligence without granting it any innate organization. According to this theory, the blank slate is inscribed with sensations, which Locke called "ideas" and modern psychologists call "features. " Ideas that repeatedly appear in succession (such as the redness, roundness, and sweetness of an apple) become associated, so that any one of them can call to mind the others. And similar objects in the world activate overlapping sets of ideas in the mind. For example, after many dogs present themselves to the senses, the features that they share (fur, barking, four legs, and so on) hang together to stand for the category "dog. "
? ? ? ? ? ? The associationism of Locke and Mill has been recognizable in {19} psychology ever since. It became the core of most models of learning, especially in the approach called behaviorism, which dominated psychology from the 1920s to the 1960s. The founder of behaviorism, John B. Watson (1878-1958), wrote one of the century's most famous pronouncements of the Blank Slate:
Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I'll guarantee to take any one at random and train him to become any type of specialist I might select -- doctor, lawyer, artist, merchant-chief, and yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors. 10
In behaviorism, an infant's talents and abilities didn't matter because there was no such thing as a talent or an ability. Watson had banned them from psychology, together with other contents of the mind, such as ideas, beliefs, desires, and feelings. They were subjective and unmeasurable, he said, and unfit for science, which studies only objective and measurable things. To a behaviorist, the only legitimate topic for psychology is overt behavior and how it is controlled by the present and past environment. (There is an old joke in psychology: What does a behaviorist say after making love? "It was good for you; how was it for me? ")
Locke's "ideas" had been replaced by "stimuli" and "responses," but his laws of association survived as laws of conditioning. A response can be associated with a new stimulus, as when Watson presented a baby with a white rat and then clanged a hammer against an iron bar, allegedly making the baby associate fear with fur. And a response could be associated with a reward, as when a cat in a box eventually learned that pulling a string opened a door and allowed it to escape. In these cases an experimenter set up a contingency between a stimulus and another stimulus or between a response and a reward. In a natural environment, said the behaviorists, these contingencies are part of the causal texture of the world, and they inexorably shape the behavior of organisms, including humans.
Among the casualties of behaviorist minimalism was the rich psychology of William James (1842-1910). James had been inspired by Darwin's argument that perception, cognition, and emotion, like physical organs, had evolved as biological adaptations. James invoked the notion of instinct to explain the preferences of humans, not just those of animals, and he posited numerous mechanisms in his theory of mental life, including short-term and long-term memory. But with the advent of behaviorism they all joined the index of forbidden concepts. The psychologist J. R. Kantor wrote in 1923: "Brief is the answer to the question as to what is the relationship between {20} social psychology and instincts. Plainly, there is no relationship. "11 Even sexual desire was redefined as a conditioned response. The psychologist Zing Yang Kuo wrote in 1929:
Behavior is not a manifestation of hereditary factors, nor can it be expressed in terms of heredity. [It is] a passive and forced movement mechanically and solely determined by the structural pattern of the organism and the nature of environmental forces. . . . All our sexual appetites are the result of social stimulation. The organism possesses no ready-made reaction to the other sex, any more than it possesses innate ideas. 12
Behaviorists believed that behavior could be understood independently of the rest of biology, without attention to the genetic makeup of the animal or the evolutionary history of the species. Psychology came to consist of the study of learning in laboratory animals. B. E. Skinner (1904-1990), the most famous psychologist in the middle decades of the twentieth century, wrote a book called The Behavior of Organisms in which the only organisms were rats and pigeons and the only behavior was lever pressing and key pecking. It took a trip to the circus to remind psychologists that species and their instincts mattered after all. In an article called "The Misbehavior of Organisms," Skinner's students Keller and Marian Breland reported that when they tried to use his techniques to train animals to insert poker chips into vending machines, the chickens pecked the chips, the raccoons washed them, and the pigs tried to root them with their snouts. 13 And behaviorists were as hostile to the brain as they were to genetics. As late as 1974, Skinner wrote that studying the brain was just another misguided quest to find the causes of behavior inside the organism rather than out in the world. 14
Behaviorism not only took over psychology but infiltrated the public consciousness. Watson wrote an influential childrearing manual recommending that parents establish rigid feeding schedules for their children and give them a minimum of attention and love. If you comfort a crying child, he wrote, you will reward him for crying and thereby increase the frequency of crying behavior. (Benjamin Spock's Baby and Child Care, first published in 1946 and famous for recommending indulgence toward children, was in part a reaction to Watson. ) Skinner wrote several bestsellers arguing that harmful behavior is neither instinctive nor freely chosen but inadvertently conditioned. If we turned society into a big Skinner box and controlled behavior deliberately rather than haphazardly, we could eliminate aggression, overpopulation, crowding, pollution, and inequality, and thereby attain Utopia. 15 The noble
? ? ? ? ? ? ? ? savage became the noble pigeon. {21}
Strict behaviorism is pretty much dead in psychology, but many of its attitudes live on. Associationism is the learning theory assumed by many mathematical models and neural network simulations of learning. 16 Many neuroscientists equate learning with the forming of associations, and look for an associative bond in the physiology of neurons and synapses, ignoring other kinds of computation that might implement learning in the brain. 17 (For example, storing the value of a variable in the brain, as in "x = 3," is a critical computational step in navigating and foraging, which are highly developed talents of animals in the wild. But this kind of learning cannot be reduced to the formation of associations, and so it has been ignored in neuroscience. ) Psychologists and neuroscientists still treat organisms interchangeably, seldom asking whether a convenient laboratory animal (a rat, a cat, a monkey) is like or unlike humans in crucial ways. 18 Until recently, psychology ignored the content of beliefs and emotions and the possibility that the mind had evolved to treat biologically important categories in different ways. 19 Theories of memory and reasoning didn't distinguish thoughts about people from thoughts about rocks or houses. Theories of emotion didn't distinguish fear from anger, jealousy, or love. 20 Theories of social relations didn't distinguish among family, friends, enemies, and strangers. 21 Indeed, the topics in psychology that most interest laypeople -- love, hate, work, play, food, sex, status, dominance, jealousy, friendship, religion, art -- are almost completely absent from psychology textbooks.
One of the major documents of late twentieth-century psychology was the two-volume Parallel Distributed Processing by David Rumelhart, James McClelland, and their collaborators, which presented a style of neural network modeling called connectionism. 22 Rumelhart and McClelland argued that generic associationist networks, subjected to massive amounts of training, could explain all of cognition. They realized that this theory left them without a good answer to the question "Why are people smarter than rats? " Here is their answer:
Given all of the above, the question does seem a bit puzzling. . . . People have much more cortex than rats do or even than other primates do; in particular they have very much more . . . brain structure not dedicated to input/output -- and presumably, this extra cortex is strategically placed in the brain to subserve just those functions that differentiate people from rats or even apes. . . .
But there must be another aspect to the difference between rats and people as well. This is that the human environment includes other people and the cultural devices that they have developed to organize their thinking processes. 23 {22}
Humans, then, are just rats with bigger blank slates, plus something called "cultural devices. " And that brings us to the other half of the twentieth-century revolution in social~science.
? ? ? ? ? ? ? ? ? ? He's so unhip, when you say "Dylan,"
He thinks you're talkin' about Dylan Thomas (whoever he was). The man ain't got no culture.
-- Simon and Garfunkel
The word culture used to refer to exalted genres of entertainment, such as poetry, opera, and ballet. The other familiar sense -- "the totality of socially transmitted behavior patterns, arts, beliefs, institutions, and all other products of human work and thought" -- is only a century old. This change in the English language is just one of the legacies of the father of modern anthropology, Franz Boas (1858-1942).
The ideas of Boas, like the ideas of the major thinkers in psychology, were rooted in the empiricist philosophers of the Enlightenment, in this case George Berkeley (1685-1753). Berkeley formulated the theory of idealism, the notion that ideas, not bodies and other hunks of matter, are the ultimate constituents of reality. After twists and turns that are too convoluted to recount here, idealism became influential among nineteenth-century German thinkers. It was embraced by the young Boas, a German Jew from a secular, liberal family.
Idealism allowed Boas to lay a new intellectual foundation for egalitarianism. The differences among human races and ethnic groups, he proposed, come not from their physical constitution but from their culture, a system of ideas and values spread by language and other forms of social behavior. Peoples differ because their cultures differ. Indeed, that is how we should refer to them: the Eskimo culture or the Jewish culture, not the Eskimo race or the Jewish race. The idea that minds are shaped by culture served as a bulwark against racism and was the theory one ought to prefer on moral grounds. Boas wrote, "I claim that, unless the contrary can be proved, we must assume that all complex activities are socially determined, not hereditary. "24
Boas's case was not just a moral injunction; it was rooted in real discoveries. Boas studied native peoples,
? immigrants, and children in orphanages to prove that all groups of humans had equal potential. Turning Jespersen on his head, Boas showed that the languages of primitive peoples were not simpler than those of Europeans; they were just different. Eskimos' difficulty in discriminating the sounds of our language, for example, is matched by our difficulty in discriminating the sounds of theirs. True, many non-Western languages lack the means to express certain abstract concepts. They may have no words for numbers higher than three, for example, or no word for {23} goodness in general as opposed to the goodness of a particular person. But those limitations simply reflect the daily needs of those people as they live their lives, not an infirmity in their mental abilities. As in the story of Socrates drawing abstract philosophical concepts out of a slave boy, Boas showed that he could elicit new word forms for abstract concepts like "goodness" and "pity" out of a Kwakiutl native from the Pacific Northwest. He also observed that when native peoples come into contact with civilization and acquire things that have to be counted, they quickly adopt a full-blown counting system. 25
For all his emphasis on culture, Boas was not a relativist who believed that all cultures are equivalent, nor was he an empiricist who believed in the Blank Slate. He considered European civilization superior to tribal cultures, insisting only that all peoples were capable of achieving it. He did not deny that there might be a universal human nature, or that there might be differences among people within an ethnic group. What mattered to him was the idea that all ethnic groups are endowed with the same basic mental abilities. 26 Boas was right about this, and today it is accepted by virtually all scholars and scientists.
But Boas had created a monster. His students came to dominate American social science, and each generation outdid the previous one in its sweeping pronouncements. Boas's students insisted not just that differences among ethnic groups must be explained in terms of culture but that every aspect of human existence must be explained in terms of culture. For example, Boas had favored social explanations unless they were disproven, but his student Alfred Kroeber favored them regardless of the evidence. "Heredity," he wrote, "cannot be allowed to have acted any part in history."27 Instead, the chain of events shaping a people "involves the absolute conditioning of historical events by other historical events."28
Kroeber did not just deny that social behavior could be explained by innate properties of minds. He denied that it could be explained by any properties of minds. A culture, he wrote, is superorganic -- it floats in its own universe, free of the flesh and blood of actual men and women: "Civilization is not mental action but a body or stream of products of mental exercise. . . . Mentality relates to the individual. The social or cultural, on the other hand, is in its essence non-individual. Civilization as such begins only where the individual ends."29
These two ideas -- the denial of human nature, and the autonomy of culture from individual minds -- were also articulated by the founder of sociology, Emile Durkheim (1858-1917), who had foreshadowed Kroeber's doctrine of the superorganic mind:
Every time that a social phenomenon is directly explained by a psychological phenomenon, we may be sure that the explanation is false. . . .
The group thinks, feels, and acts quite differently from the way in which {24} members would were they isolated. . . . If we begin with the individual in seeking to explain phenomena, we shall be able to understand nothing of what takes place in the group. . . . Individual natures are merely the indeterminate material that the social factor molds and transforms. Their contribution consists exclusively in very general attitudes, in vague and consequently plastic predispositions. 30
And he laid down a law for the social sciences that would be cited often in the century to come: "The determining cause of a social fact should be sought among the social facts preceding it and not among the states of individual consciousness."31
Both psychology and the other social sciences, then, denied that the minds of individual people were important, but they set out in different directions from there. Psychology banished mental entities like beliefs and desires altogether and replaced them with stimuli and responses. The other social sciences located beliefs and desires in cultures and societies rather than in the heads of individual people. The different social sciences also agreed that the contents of cognition -- ideas, thoughts, plans, and so on -- were really phenomena of language, overt behavior that anyone could hear and write down. (Watson proposed that "thinking" really consisted of teensy movements of the mouth and throat.) But most of all they shared a dislike of instincts and evolution. Prominent social scientists repeatedly declared the slate to be blank:
Instincts do not create customs; customs create instincts, for the putative instincts of human beings are always learned and never native.
-- Ellsworth Faris (1927)32
Cultural phenomena . . . are in no respect hereditary but are characteristically and without exception acquired.
-- George Murdock (1932)33
Man has no nature; what he has is history.
-- Jose Ortega y Gasset (1935)34
With the exception of the instinctoid reactions in infants to sudden withdrawals of support and to sudden loud noises, the human being is entirely instinctless. . . . Man is man because he has no instincts, because everything he is and has become he has learned, acquired, from his culture, from the man-made part of the environment, from other human beings.
-- Ashley Montagu (1973)35 {25}
True, the metaphor of choice was no longer a scraped tablet or white paper. Durkheim had spoken of "indeterminate material," some kind of blob that was molded or pounded into shape by culture. Perhaps the best modern metaphor is Silly Putty, the rubbery stuff that children use both to copy printed matter (like a blank slate) and to mold into desired shapes (like indeterminate material). The malleability metaphor resurfaced in statements by two of Boas's most famous students:
Most people are shaped to the form of their culture because of the malleability of their original endowment. . . . The great mass of individuals take quite readily the form that is presented to them.
-- Ruth Benedict (1934)36
We are forced to conclude that human nature is almost unbelievably malleable, responding accurately and contrastingly to contrasting cultural conditions.
-- Margaret Mead (1935)37
Others likened the mind to some kind of sieve:
Much of what is commonly called "human nature" is merely culture thrown against a screen of nerves, glands, sense organs, muscles, etc.
-- Leslie White (1949)38
Or to the raw materials for a factory:
Human nature is the rawest, most undifferentiated of raw material.
-- Margaret Mead (1928)39
Our ideas, our values, our acts, even our emotions, are, like our nervous system itself, cultural products -- products manufactured, indeed, out of tendencies, capacities, and dispositions with which we were born, but manufactured nonetheless.
-- Clifford Geertz (1973)40
Or to an unprogrammed computer:
Man is the animal most desperately dependent upon such extragenetic, outside-the-skin control mechanisms, such cultural programs, for ordering his behavior.
-- Clifford Geertz (1973)41 {26}
Or to some other amorphous entity that can have many things done to it:
Cultural psychology is the study of the way cultural traditions and social practices regulate, express, transform, and permute the human psyche, resulting less in psychic unity for humankind than in ethnic divergences in mind, self and emotion.
-- Richard Shweder (1990)42
The superorganic or group mind also became an article of faith in social science. Robert Lowie (another Boas student) wrote, "The principles of psychology are as incapable of accounting for the phenomena of culture as is gravitation to account for architectural styles."43 And in case you missed its full implications, the anthropologist Leslie White spelled it out:
Instead of regarding the individual as a First Cause, as a prime mover, as the initiator and determinant
of the culture process, we now see him as a component part, and a tiny and relatively insignificant part at that, of a vast, socio-cultural system that embraces innumerable individuals at any one time and extends back into their remote past as well. . . . For purposes of scientific interpretation, the culture process may be regarded as a thing sui generis; culture is explainable in terms of culture. 44
In other words, we should forget about the mind of an individual person like you, that tiny and insignificant part of a vast sociocultural system. The mind that counts is the one belonging to the group, which is capable of thinking, feeling, and acting on its own.
The doctrine of the superorganism has had an impact on modern life that extends well beyond the writings of social scientists. It underlies the tendency to reify "society" as a moral agent that can be blamed for sins as if it were a person. It drives identity politics, in which civil rights and political perquisites are allocated to groups rather than to individuals. And as we shall see in later chapters, it defined some of the great divides between major political systems in the twentieth century.
The Blank Slate was not the only part of the official theory that social scientists felt compelled to prop up. They also strove to consecrate the Noble Savage. Mead painted a Gauguinesque portrait of native peoples as peaceable, egalitarian, materially satisfied, and sexually unconflicted. Her uplifting vision of who we used to be -- and therefore who we can become again -- was accepted by such otherwise skeptical writers as Bertrand Russell and H. L. Mencken. Ashley Montagu (also from the Boas circle), a prominent public intellectual from the 1950s until his recent death, tirelessly invoked the doctrine {27} of the Noble Savage to justify the quest for brotherhood and peace and to refute anyone who might think such efforts were futile. In 1950, for example, he drafted a manifesto for the newly formed UNESCO that declared, "Biological studies lend support to the ethic of universal brotherhood, for man is born with drives toward co-operation, and unless these drives are satisfied, men and nations alike fall ill."45 With the ashes of thirty-five million victims of World War II still warm or radioactive, a reasonable person might wonder how "biological studies" could show anything of the kind. The draft was rejected, but Montagu had better luck in the decades to come, when UNESCO and many scholarly societies adopted similar resolutions. 46
More generally, social scientists saw the malleability of humans and the autonomy of culture as doctrines that might bring about the age-old dream of perfecting mankind. We are not stuck with what we don't like about our current predicament, they argued. Nothing prevents us from changing it except a lack of will and the benighted belief that we are permanently consigned to it by biology. Many social scientists have expressed the hope of a new and improved human nature:
I felt (and said so early) that the environmental explanation was preferable, whenever justified by the
data, because it was more optimistic, holding out the hope of improvement.
-- Otto Klineberg (1928)47
? Modern sociology and modern anthropology are one in saying that the substance of culture, or civilization, is social tradition and that this social tradition is indefinitely modifiable by further learning on the part of men for happier and better ways of living together. . . . Thus the scientific study of institutions awakens faith in the possibility of remaking both human nature and human social life.
-- Charles Ellwood (1922)48
Barriers in many fields of knowledge are falling below the new optimism which is that anybody can learn anything. . . . We have turned away from the concept of human ability as something fixed in the physiological structure, to that of a flexible and versatile mechanism subject to great improvement.
-- Robert Faris (1961)49
Though psychology is not as politicized as some of the other social sciences, it too is sometimes driven by a Utopian vision in which changes in child-rearing and education will ameliorate social pathologies and improve human welfare. And psychological theorists sometimes try to add moral heft to arguments for connectionism or other empiricist theories with warnings about the {28} pessimistic implications of innatist theories. They argue, for example, that innatist theories open the door to inborn differences, which could foster racism, or that the theories imply that human traits are unchangeable, which could weaken support for social programs. 50
Twentieth-century social science embraced not just the Blank Slate and the Noble Savage but the third member of the
trinity, the Ghost in the Machine. The declaration that we can change what we don't like about ourselves became a watchword of social science. But that only raises the question "Who or what is the 'we'?" If the "we" doing the remaking are just other hunks of matter in the biological world, then any malleability of behavior we discover would be cold comfort, because we, the molders, would be biologically constrained and therefore might not mold people, or allow ourselves to be molded, in the most socially salutary way. A ghost in the machine is the ultimate liberator of human will -- including the will to change society -- from mechanical causation. The anthropologist Loren Eiseley made this clear when he wrote:
The mind of man, by indetermination, by the power of choice and cultural communication, is on the verge of escape from the blind control of that deterministic world with which the Darwinists had unconsciously shackled man. The inborn characteristics laid upon him by the biological extremists have crumbled away. . . . Wallace saw and saw correctly, that with the rise of man the evolution of parts was to a marked degree outmoded, that mind was now the arbiter of human destiny. 51
The "Wallace" that Eiseley is referring to is Alfred Russel Wallace (1823-1913), the co-discoverer with Darwin of natural selection. Wallace parted company from Darwin by claiming that the human mind could not be explained by evolution and must have been designed by a superior intelligence. He certainly did believe that the mind of man could escape "the blind control of a deterministic world. " Wallace became a spiritualist and spent the later years of his career searching for a way to communicate with the souls of the dead.
The social scientists who believed in an absolute separation of culture from biology may not have literally believed in a spook haunting the brain. Some used the analogy of the difference between living and nonliving matter. Kroeber wrote: "The dawn of the social. . . is not a link in any chain, not a step in a path, but a leap to another plane. . . . [It is like] the first occurrence of life in the hitherto lifeless universe. . . . From this moment on there should be two worlds in place of one. "52 And Lowie insisted that it was "not mysticism, but sound scientific method" to say that culture was "sui generis" and could be explained only by culture, because everyone knows that in biology a living cell can come only from another living cell. 53 {29}
At the time that Kroeber and Lowie wrote, they had biology on their side. Many biologists still thought that living things were animated by a special essence, an elan vital, and could not be reduced to inanimate matter. A 1931 history of biology, referring to genetics as it was then understood, said, "Thus the last of the biological theories leaves us where we first started, in the presence of a power called life or psyche which is not only of its own kind but unique in each and all of its exhibitions. "54 In the next chapter we will see that the analogy between the autonomy of culture and the autonomy of life would prove to be more telling than these social scientists realized.
{30} Chapter 3
The Last Wall to Fall
In 1755 Samuel Johnson wrote that his dictionary should not be expected to "change sublunary nature, and clear the world at once from folly, vanity, and affectation." Few people today are familiar with the lovely word sublunary, literally "below the moon." It alludes to the ancient belief in a strict division between the pristine, lawful, unchanging cosmos above and our grubby, chaotic, fickle Earth below. The division was already obsolete when Johnson used the word: Newton had shown that the same force that pulled an apple toward the ground kept the moon in its celestial orbit.
Newton's theory that a single set of laws governed the motions of all objects in the universe was the first event in one of the great developments in human understanding: the unification of knowledge, which the biologist E. O. Wilson has termed consilience. 1 Newton's breaching of the wall between the terrestrial and the celestial was followed by a collapse of the once equally firm (and now equally forgotten) wall between the creative past and the static present. That happened when Charles Lyell showed that the Earth was sculpted in the past by forces we see today (such as earthquakes and erosion) acting over immense spans of time.
The living and nonliving, too, no longer occupy different realms. In 1628 William Harvey showed that the human body is a machine that runs by hydraulics and other mechanical principles. In 1828 Friedrich Wohler showed that the stuff of life is not a magical, pulsating gel but ordinary compounds following the laws of chemistry. Charles Darwin showed how the astonishing diversity of life and its ubiquitous signs of design could arise from the physical process of natural selection among replicators. Gregor Mendel, and then James Watson and Francis Crick, showed how replication itself could be understood in physical terms.
The unification of our understanding of life with our understanding of matter and energy was the greatest scientific achievement of the second half of the twentieth century. One of its many consequences was to pull the rug out {31} from under social scientists like Kroeber and Lowie who had invoked the "sound scientific method" of placing the living and nonliving in parallel universes. We now know that cells did not always come from other cells and that the emergence of life did not create a second world where before there was just one. Cells evolved from simpler replicating molecules, a nonliving part of the physical world, and may be understood as collections of molecular machinery -- fantastically complicated machinery, of course, but machinery nonetheless.
This leaves one wall standing in the landscape of knowledge, the one that twentieth-century social scientists guarded so jealously. It divides matter from mind, the material from the spiritual, the physical from the mental, biology from culture, nature from society, and the sciences from the social sciences, humanities, and arts. The division was built into each of the doctrines of the official theory: the blank slate given by biology versus the contents inscribed by experience and culture, the nobility of the savage in the state of nature versus the corruption of social institutions, the machine following inescapable laws versus the ghost that is free to choose and to improve the human condition.
But this wall, too, is falling. New ideas from four frontiers of knowledge -- the sciences of mind, brain, genes, and evolution -- are breaching the wall with a new understanding of human nature. In this chapter I will show how they are filling in the blank slate, declassing the noble savage, and exorcising the ghost in the machine. In the following chapter I will show that this new conception of human nature, connected to biology from below, can in turn be connected to the humanities and social sciences above. That new conception can give the phenomena of culture their due without segregating them into a parallel universe.
The first bridge between biology and culture is the science of mind, cognitive science. 2 The concept of mind has been perplexing for as long as people have reflected on their thoughts and feelings. The very idea has spawned paradoxes, superstitions, and bizarre theories in every period and culture. One can almost sympathize with the behaviorists and social constructionists of the first half of the twentieth century, who looked on minds as enigmas or conceptual traps that were best avoided in favor of overt behavior or the traits of a culture.
But beginning in the 1950s with the cognitive revolution, all that changed. It is now possible to make sense of mental processes and even to study them in the lab. And with a firmer grasp on the concept of mind, we can see that many tenets of the Blank Slate that once seemed appealing are now unnecessary or even incoherent. Here are five ideas from the cognitive revolution that have revamped how we think and talk about minds.
The first idea: The mental world can be grounded in the physical world by the concepts of information, computation, and feedback. A great divide between {32} mind and matter has always seemed natural because behavior appears to have a different kind of trigger than other physical events. Ordinary events have causes, it seems, but human behavior has reasons. I once participated in a BBC television debate on whether "science can explain human behavior." Arguing against the resolution was a philosopher who asked how we might explain why someone was put in jail. Say it was for inciting racial hatred. The intention, the hatred, and even the prison, she said, cannot be described in the language of physics. There is simply no way to define "hatred" or "jail" in terms of the movements of particles.
