The founder of behaviorism, John B. Watson (1878-1958), wrote one of the century's most famous pronouncements of the Blank Slate:
Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I'll guarantee to take any one at random and train him to become any type of specialist I might select -- doctor, lawyer, artist, merchant-chief, and yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors. 10
In behaviorism, an infant's talents and abilities didn't matter because there was no such thing as a talent or an ability. Watson had banned them from psychology, together with other contents of the mind, such as ideas, beliefs, desires, and feelings. They were subjective and unmeasurable, he said, and unfit for science, which studies only objective and measurable things. To a behaviorist, the only legitimate topic for psychology is overt behavior and how it is controlled by the present and past environment. (There is an old joke in psychology: What does a behaviorist say after making love? "It was good for you; how was it for me? ")
Locke's "ideas" had been replaced by "stimuli" and "responses," but his laws of association survived as laws of conditioning. A response can be associated with a new stimulus, as when Watson presented a baby with a white rat and then clanged a hammer against an iron bar, allegedly making the baby associate fear with fur. And a response could be associated with a reward, as when a cat in a box eventually learned that pulling a string opened a door and allowed it to escape. In these cases an experimenter set up a contingency between a stimulus and another stimulus or between a response and a reward. In a natural environment, said the behaviorists, these contingencies are part of the causal texture of the world, and they inexorably shape the behavior of organisms, including humans.
Among the casualties of behaviorist minimalism was the rich psychology of William James (1842-1910). James had been inspired by Darwin's argument that perception, cognition, and emotion, like physical organs, had evolved as biological adaptations. James invoked the notion of instinct to explain the preferences of humans, not just those of animals, and he posited numerous mechanisms in his theory of mental life, including short-term and long-term memory. But with the advent of behaviorism they all joined the index of forbidden concepts. The psychologist J. R. Kantor wrote in 1923: "Brief is the answer to the question as to what is the relationship between {20} social psychology and instincts. Plainly, there is no relationship. "11 Even sexual desire was redefined as a conditioned response. The psychologist Zing Yang Kuo wrote in 1929:
Behavior is not a manifestation of hereditary factors, nor can it be expressed in terms of heredity. [It is] a passive and forced movement mechanically and solely determined by the structural pattern of the organism and the nature of environmental forces. . . . All our sexual appetites are the result of social stimulation. The organism possesses no ready-made reaction to the other sex, any more than it possesses innate ideas. 12
Behaviorists believed that behavior could be understood independently of the rest of biology, without attention to the genetic makeup of the animal or the evolutionary history of the species. Psychology came to consist of the study of learning in laboratory animals. B. F. Skinner (1904-1990), the most famous psychologist in the middle decades of the twentieth century, wrote a book called The Behavior of Organisms in which the only organisms were rats and pigeons and the only behavior was lever pressing and key pecking. It took a trip to the circus to remind psychologists that species and their instincts mattered after all. In an article called "The Misbehavior of Organisms," Skinner's students Keller and Marian Breland reported that when they tried to use his techniques to train animals to insert poker chips into vending machines, the chickens pecked the chips, the raccoons washed them, and the pigs tried to root them with their snouts. 13 And behaviorists were as hostile to the brain as they were to genetics. As late as 1974, Skinner wrote that studying the brain was just another misguided quest to find the causes of behavior inside the organism rather than out in the world. 14
Behaviorism not only took over psychology but infiltrated the public consciousness. Watson wrote an influential childrearing manual recommending that parents establish rigid feeding schedules for their children and give them a minimum of attention and love. If you comfort a crying child, he wrote, you will reward him for crying and thereby increase the frequency of crying behavior. (Benjamin Spock's Baby and Child Care, first published in 1946 and famous for recommending indulgence toward children, was in part a reaction to Watson.) Skinner wrote several bestsellers arguing that harmful behavior is neither instinctive nor freely chosen but inadvertently conditioned. If we turned society into a big Skinner box and controlled behavior deliberately rather than haphazardly, we could eliminate aggression, overpopulation, crowding, pollution, and inequality, and thereby attain Utopia. 15 The noble savage became the noble pigeon. {21}
Strict behaviorism is pretty much dead in psychology, but many of its attitudes live on. Associationism is the learning theory assumed by many mathematical models and neural network simulations of learning. 16 Many neuroscientists equate learning with the forming of associations, and look for an associative bond in the physiology of neurons and synapses, ignoring other kinds of computation that might implement learning in the brain. 17 (For example, storing the value of a variable in the brain, as in "x = 3," is a critical computational step in navigating and foraging, which are highly developed talents of animals in the wild. But this kind of learning cannot be reduced to the formation of associations, and so it has been ignored in neuroscience. ) Psychologists and neuroscientists still treat organisms interchangeably, seldom asking whether a convenient laboratory animal (a rat, a cat, a monkey) is like or unlike humans in crucial ways. 18 Until recently, psychology ignored the content of beliefs and emotions and the possibility that the mind had evolved to treat biologically important categories in different ways. 19 Theories of memory and reasoning didn't distinguish thoughts about people from thoughts about rocks or houses. Theories of emotion didn't distinguish fear from anger, jealousy, or love. 20 Theories of social relations didn't distinguish among family, friends, enemies, and strangers. 21 Indeed, the topics in psychology that most interest laypeople -- love, hate, work, play, food, sex, status, dominance, jealousy, friendship, religion, art -- are almost completely absent from psychology textbooks.
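To make the contrast concrete, here is a minimal Python sketch of my own (the names and numbers are illustrative, not drawn from the studies cited): dead-reckoning navigation requires storing and updating the value of a variable, whereas associative conditioning only adjusts the strength of a stimulus-response bond.

```python
# A minimal, hypothetical sketch contrasting two kinds of learning.
# All names and values are illustrative.

# 1. Dead reckoning: the animal stores and updates a variable -- its displacement
#    from the nest -- which is what "storing the value of a variable" means here.
position = [0.0, 0.0]
for dx, dy in [(1.0, 0.0), (0.0, 2.0), (-0.5, 0.5)]:   # outbound path segments
    position[0] += dx
    position[1] += dy
home_vector = (-position[0], -position[1])               # straight path back home

# 2. Associative conditioning: all that changes is the strength of a bond
#    between a stimulus and a response (a Rescorla-Wagner-style update).
bond_strength = 0.0
for _ in range(10):                                       # repeated pairings
    bond_strength += 0.1 * (1.0 - bond_strength)

print(home_vector, round(bond_strength, 2))
```

The first computation has no natural description as a strengthened association; that is the point of the contrast.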
One of the major documents of late twentieth-century psychology was the two-volume Parallel Distributed Processing by David Rumelhart, James McClelland, and their collaborators, which presented a style of neural network modeling called connectionism. 22 Rumelhart and McClelland argued that generic associationist networks, subjected to massive amounts of training, could explain all of cognition. They realized that this theory left them without a good answer to the question "Why are people smarter than rats? " Here is their answer:
Given all of the above, the question does seem a bit puzzling. . . . People have much more cortex than rats do or even than other primates do; in particular they have very much more . . . brain structure not dedicated to input/output -- and presumably, this extra cortex is strategically placed in the brain to subserve just those functions that differentiate people from rats or even apes. . . .
But there must be another aspect to the difference between rats and people as well. This is that the human environment includes other people and the cultural devices that they have developed to organize their thinking processes. 23 {22}
Humans, then, are just rats with bigger blank slates, plus something called "cultural devices. " And that brings us to the other half of the twentieth-century revolution in social science. ~
He's so unhip, when you say "Dylan,"
He thinks you're talkin' about Dylan Thomas (whoever he was).
The man ain't got no culture.
-- Simon and Garfunkel
The word culture used to refer to exalted genres of entertainment, such as poetry, opera, and ballet. The other familiar sense -- "the totality of socially transmitted behavior patterns, arts, beliefs, institutions, and all other products of human work and thought" -- is only a century old. This change in the English language is just one of the legacies of the father of modern anthropology, Franz Boas (1858-1942).
The ideas of Boas, like the ideas of the major thinkers in psychology, were rooted in the empiricist philosophers of the Enlightenment, in this case George Berkeley (1685-1753). Berkeley formulated the theory of idealism, the notion that ideas, not bodies and other hunks of matter, are the ultimate constituents of reality. After twists and turns that are too convoluted to recount here, idealism became influential among nineteenth-century German thinkers. It was embraced by the young Boas, a German Jew from a secular, liberal family.
Idealism allowed Boas to lay a new intellectual foundation for egalitarianism. The differences among human races and ethnic groups, he proposed, come not from their physical constitution but from their culture, a system of ideas and values spread by language and other forms of social behavior. Peoples differ because their cultures differ. Indeed, that is how we should refer to them: the Eskimo culture or the Jewish culture, not the Eskimo race or the Jewish race. The idea that minds are shaped by culture served as a bulwark against racism and was the theory one ought to prefer on moral grounds. Boas wrote, "I claim that, unless the contrary can be proved, we must assume that all complex activities are socially determined, not hereditary. "24
Boas's case was not just a moral injunction; it was rooted in real discoveries. Boas studied native peoples, immigrants, and children in orphanages to prove that all groups of humans had equal potential. Turning Jespersen on his head, Boas showed that the languages of primitive peoples were not simpler than those of Europeans; they were just different. Eskimos' difficulty in discriminating the sounds of our language, for example, is matched by our difficulty in discriminating the sounds of theirs. True, many non-Western languages lack the means to express certain abstract concepts. They may have no words for numbers higher than three, for example, or no word for {23} goodness in general as opposed to the goodness of a particular person. But those limitations simply reflect the daily needs of those people as they live their lives, not an infirmity in their mental abilities. As in the story of Socrates drawing abstract philosophical concepts out of a slave boy, Boas showed that he could elicit new word forms for abstract concepts like "goodness" and "pity" out of a Kwakiutl native from the Pacific Northwest. He also observed that when native peoples come into contact with civilization and acquire things that have to be counted, they quickly adopt a full-blown counting system. 25
For all his emphasis on culture, Boas was not a relativist who believed that all cultures are equivalent, nor was he an empiricist who believed in the Blank Slate. He considered European civilization superior to tribal cultures, insisting only that all peoples were capable of achieving it. He did not deny that there might be a universal human nature, or that there might be differences among people within an ethnic group. What mattered to him was the idea that all ethnic groups are endowed with the same basic mental abilities. 26 Boas was right about this, and today it is accepted by virtually all scholars and scientists.
But Boas had created a monster. His students came to dominate American social science, and each generation outdid the previous one in its sweeping pronouncements. Boas's students insisted not just that differences among ethnic groups must be explained in terms of culture but that every aspect of human existence must be explained in terms of culture. For example, Boas had favored social explanations unless they were disproven, but his student Alfred Kroeber favored them regardless of the evidence. "Heredity," he wrote, "cannot be allowed to have acted any part in history. "27 Instead, the chain of events shaping a people "involves the absolute conditioning of historical events by other historical events. "28
Kroeber did not just deny that social behavior could be explained by innate properties of minds. He denied that it could be explained by any properties of minds. A culture, he wrote, is superorganic -- it floats in its own universe, free of the flesh and blood of actual men and women: "Civilization is not mental action but a body or stream of products of mental exercise. . . . Mentality relates to the individual. The social or cultural, on the other hand, is in its essence non-individual. Civilization as such begins only where the individual ends. "29
These two ideas -- the denial of human nature, and the autonomy of culture from individual minds -- were also articulated by the founder of sociology, Emile Durkheim (1858-1917), who had foreshadowed Kroeber's doctrine of the superorganic mind:
Every time that a social phenomenon is directly explained by a psychological phenomenon, we may be sure that the explanation is false. . . . The group thinks, feels, and acts quite differently from the way in which {24} members would were they isolated. . . . If we begin with the individual in seeking to explain phenomena, we shall be able to understand nothing of what takes place in the group. . . . Individual natures are merely the indeterminate material that the social factor molds and transforms. Their contribution consists exclusively in very general attitudes, in vague and consequently plastic predispositions. 30
And he laid down a law for the social sciences that would be cited often in the century to come: "The determining cause of a social fact should be sought among the social facts preceding it and not among the states of individual consciousness. "31
Both psychology and the other social sciences, then, denied that the minds of individual people were important, but they set out in different directions from there. Psychology banished mental entities like beliefs and desires altogether and replaced them with stimuli and responses. The other social sciences located beliefs and desires in cultures and societies rather than in the heads of individual people. The different social sciences also agreed that the contents of cognition -- ideas, thoughts, plans, and so on -- were really phenomena of language, overt behavior that anyone could hear and write down. (Watson proposed that "thinking" really consisted of teensy movements of the mouth and throat. ) But most of all they shared a dislike of instincts and evolution. Prominent social scientists repeatedly declared the slate to be blank:
Instincts do not create customs; customs create instincts, for the putative instincts of human beings are always learned and never native.
-- Ellsworth Faris (1927)32

Cultural phenomena . . . are in no respect hereditary but are characteristically and without exception acquired.
-- George Murdock (1932)33

Man has no nature; what he has is history.
-- Jose Ortega y Gasset (1935)34

With the exception of the instinctoid reactions in infants to sudden withdrawals of support and to sudden loud noises, the human being is entirely instinctless. . . . Man is man because he has no instincts, because everything he is and has become he has learned, acquired, from his culture, from the man-made part of the environment, from other human beings.
-- Ashley Montagu (1973)35 {25}
True, the metaphor of choice was no longer a scraped tablet or white paper. Durkheim had spoken of "indeterminate material," some kind of blob that was molded or pounded into shape by culture. Perhaps the best modern metaphor is Silly Putty, the rubbery stuff that children use both to copy printed matter (like a blank slate) and to mold into desired shapes (like indeterminate material). The malleability metaphor resurfaced in statements by two of Boas's most famous students:
Most people are shaped to the form of their culture because of the malleability of their original endowment. . . . The great mass of individuals take quite readily the form that is presented to them.
-- Ruth Benedict (1934)36

We are forced to conclude that human nature is almost unbelievably malleable, responding accurately and contrastingly to contrasting cultural conditions.
-- Margaret Mead (1935)37

Others likened the mind to some kind of sieve:

Much of what is commonly called "human nature" is merely culture thrown against a screen of nerves, glands, sense organs, muscles, etc.
-- Leslie White (1949)38

Or to the raw materials for a factory:

Human nature is the rawest, most undifferentiated of raw material.
-- Margaret Mead (1928)39

Our ideas, our values, our acts, even our emotions, are, like our nervous system itself, cultural products -- products manufactured, indeed, out of tendencies, capacities, and dispositions with which we were born, but manufactured nonetheless.
-- Clifford Geertz (1973)40

Or to an unprogrammed computer:

Man is the animal most desperately dependent upon such extragenetic, outside-the-skin control mechanisms, such cultural programs, for ordering his behavior.
-- Clifford Geertz (1973)41 {26}

Or to some other amorphous entity that can have many things done to it:

Cultural psychology is the study of the way cultural traditions and social practices regulate, express, transform, and permute the human psyche, resulting less in psychic unity for humankind than in ethnic divergences in mind, self and emotion.
-- Richard Shweder (1990)42
The superorganic or group mind also became an article of faith in social science. Robert Lowie (another Boas student) wrote, "The principles of psychology are as incapable of accounting for the phenomena of culture as is gravitation to account for architectural styles. "43 And in case you missed its full implications, the anthropologist Leslie White spelled it out:
Instead of regarding the individual as a First Cause, as a prime mover, as the initiator and determinant of the culture process, we now see him as a component part, and a tiny and relatively insignificant part at that, of a vast, socio-cultural system that embraces innumerable individuals at any one time and extends back into their remote past as well. . . . For purposes of scientific interpretation, the culture process may be regarded as a thing sui generis; culture is explainable in terms of culture. 44
In other words, we should forget about the mind of an individual person like you, that tiny and insignificant part of a vast sociocultural system. The mind that counts is the one belonging to the group, which is capable of thinking, feeling, and acting on its own.
The doctrine of the superorganism has had an impact on modern life that extends well beyond the writings of social scientists. It underlies the tendency to reify "society" as a moral agent that can be blamed for sins as if it were a person. It drives identity politics, in which civil rights and political perquisites are allocated to groups rather than to individuals. And as we shall see in later chapters, it defined some of the great divides between major political systems in the twentieth century. ~
The Blank Slate was not the only part of the official theory that social scientists felt compelled to prop up. They also strove to consecrate the Noble Savage. Mead painted a Gauguinesque portrait of native peoples as peaceable, egalitarian, materially satisfied, and sexually unconflicted. Her uplifting vision of who we used to be -- and therefore who we can become again -- was accepted by such otherwise skeptical writers as Bertrand Russell and H. L. Mencken. Ashley Montagu (also from the Boas circle), a prominent public intellectual from the 1950s until his recent death, tirelessly invoked the doctrine {27} of the Noble Savage to justify the quest for brotherhood and peace and to refute anyone who might think such efforts were futile. In 1950, for example, he drafted a manifesto for the newly formed UNESCO that declared, "Biological studies lend support to the ethic of universal brotherhood, for man is born with drives toward co-operation, and unless these drives are satisfied, men and nations alike fall ill. "45 With the ashes of thirty-five million victims of World War II still warm or radioactive, a reasonable person might wonder how "biological studies" could show anything of the kind. The draft was rejected, but Montagu had better luck in the decades to come, when UNESCO and many scholarly societies adopted similar resolutions. 46
More generally, social scientists saw the malleability of humans and the autonomy of culture as doctrines that might bring about the age-old dream of perfecting mankind. We are not stuck with what we don't like about our current predicament, they argued. Nothing prevents us from changing it except a lack of will and the benighted belief that we are permanently consigned to it by biology. Many social scientists have expressed the hope of a new and improved human nature:
I felt (and said so early) that the environmental explanation was preferable, whenever justified by the data, because it was more optimistic, holding out the hope of improvement.
-- Otto Klineberg (1928)47
? Modern sociology and modern anthropology are one in saying that the substance of culture, or civilization, is social tradition and that this social tradition is indefinitely modifiable by further learning on the part of men for happier and better ways of living together. . . . Thus the scientific study of institutions awakens faith in the possibility of remaking both human nature and human social life.
-- Charles Ellwood (1922)48
Barriers in many fields of knowledge are falling before the new optimism which is that anybody can learn anything. . . . We have turned away from the concept of human ability as something fixed in the physiological structure, to that of a flexible and versatile mechanism subject to great improvement.
-- Robert Faris (1961)49
Though psychology is not as politicized as some of the other social sciences, it too is sometimes driven by a Utopian vision in which changes in child-rearing and education will ameliorate social pathologies and improve human welfare. And psychological theorists sometimes try to add moral heft to arguments for connectionism or other empiricist theories with warnings about the {28} pessimistic implications of innatist theories. They argue, for example, that innatist theories open the door to inborn differences, which could foster racism, or that the theories imply that human traits are unchangeable, which could weaken support for social programs. 50
Twentieth-century social science embraced not just the Blank Slate and the Noble Savage but the third member of the trinity, the Ghost in the Machine. The declaration that we can change what we don't like about ourselves became a watchword of social science. But that only raises the question "Who or what is the 'we'? " If the "we" doing the remaking are just other hunks of matter in the biological world, then any malleability of behavior we discover would be cold comfort, because we, the molders, would be biologically constrained and therefore might not mold people, or allow ourselves to be molded, in the most socially salutary way. A ghost in the machine is the ultimate liberator of human will -- including the will to change society -- from mechanical causation. The anthropologist Loren Eiseley made this clear when he wrote:
The mind of man, by indetermination, by the power of choice and cultural communication, is on the verge of escape from the blind control of that deterministic world with which the Darwinists had unconsciously shackled man. The inborn characteristics laid upon him by the biological extremists have crumbled away. . . . Wallace saw and saw correctly, that with the rise of man the evolution of parts was to a marked degree outmoded, that mind was now the arbiter of human destiny. 51
The "Wallace" that Eiseley is referring to is Alfred Russel Wallace (1823-1913), the co-discoverer with Darwin of natural selection. Wallace parted company from Darwin by claiming that the human mind could not be explained by evolution and must have been designed by a superior intelligence. He certainly did believe that the mind of man could escape "the blind control of a deterministic world. " Wallace became a spiritualist and spent the later years of his career searching for a way to communicate with the souls of the dead.
The social scientists who believed in an absolute separation of culture from biology may not have literally believed in a spook haunting the brain. Some used the analogy of the difference between living and nonliving matter. Kroeber wrote: "The dawn of the social. . . is not a link in any chain, not a step in a path, but a leap to another plane. . . . [It is like] the first occurrence of life in the hitherto lifeless universe. . . . From this moment on there should be two worlds in place of one. "52 And Lowie insisted that it was "not mysticism, but sound scientific method" to say that culture was "sui generis" and could be explained only by culture, because everyone knows that in biology a living cell can come only from another living cell. 53 {29}
At the time that Kroeber and Lowie wrote, they had biology on their side. Many biologists still thought that living things were animated by a special essence, an elan vital, and could not be reduced to inanimate matter. A 1931 history of biology, referring to genetics as it was then understood, said, "Thus the last of the biological theories leaves us where we first started, in the presence of a power called life or psyche which is not only of its own kind but unique in each and all of its exhibitions. "54 In the next chapter we will see that the analogy between the autonomy of culture and the autonomy of life would prove to be more telling than these social scientists realized.
{30} Chapter 3
The Last Wall to Fall
In 1755 Samuel Johnson wrote that his dictionary should not be expected to "change sublunary nature, and clear the world at once from folly, vanity, and affectation. " Few people today are familiar with the lovely word sublunary, literally "below the moon. " It alludes to the ancient belief in a strict division between the pristine, lawful, unchanging cosmos above and our grubby, chaotic, fickle Earth below. The division was already obsolete when Johnson used the word: Newton had shown that the same force that pulled an apple toward the ground kept the moon in its celestial orbit.
Newton's theory that a single set of laws governed the motions of all objects in the universe was the first event in one of the great developments in human understanding: the unification of knowledge, which the biologist E. O. Wilson has termed consilience. 1 Newton's breaching of the wall between the terrestrial and the celestial was followed by a collapse of the once equally firm (and now equally forgotten) wall between the creative past and the static present. That happened when Charles Lyell showed that the Earth was sculpted in the past by forces we see today (such as earthquakes and erosion) acting over immense spans of time.
The living and nonliving, too, no longer occupy different realms. In 1628 William Harvey showed that the human body is a machine that runs by hydraulics and other mechanical principles. In 1828 Friedrich Wohler showed that the stuff of life is not a magical, pulsating gel but ordinary compounds following the laws of chemistry. Charles Darwin showed how the astonishing diversity of life and its ubiquitous signs of design could arise from the physical process of natural selection among replicators. Gregor Mendel, and then James Watson and Francis Crick, showed how replication itself could be understood in physical terms.
The unification of our understanding of life with our understanding of matter and energy was the greatest scientific achievement of the second half of the twentieth century. One of its many consequences was to pull the rug out {31} from under social scientists like Kroeber and Lowie who had invoked the "sound scientific method" of placing the living and nonliving in parallel universes. We now know that cells did not always come from other cells and that the emergence of life did not create a second world where before there was just one. Cells evolved from simpler replicating molecules, a nonliving part of the physical world, and may be understood as collections of molecular machinery -- fantastically complicated machinery, of course, but machinery nonetheless.
This leaves one wall standing in the landscape of knowledge, the one that twentieth-century social scientists guarded so jealously. It divides matter from mind, the material from the spiritual, the physical from the mental, biology from culture, nature from society, and the sciences from the social sciences, humanities, and arts. The division was built into each of the doctrines of the official theory: the blank slate given by biology versus the contents inscribed by experience and culture, the nobility of the savage in the state of nature versus the corruption of social institutions, the machine following inescapable laws versus the ghost that is free to choose and to improve the human condition.
But this wall, too, is falling. New ideas from four frontiers of knowledge -- the sciences of mind, brain, genes, and evolution -- are breaching the wall with a new understanding of human nature. In this chapter I will show how they are filling in the blank slate, declassing the noble savage, and exorcising the ghost in the machine. In the following chapter I will show that this new conception of human nature, connected to biology from below, can in turn be connected to the humanities and social sciences above. That new conception can give the phenomena of culture their due without segregating them into a parallel universe. ~
The first bridge between biology and culture is the science of mind, cognitive science. 2 The concept of mind has been perplexing for as long as people have reflected on their thoughts and feelings. The very idea has spawned paradoxes, superstitions, and bizarre theories in every period and culture. One can almost sympathize with the behaviorists and social constructionists of the first half of the twentieth century, who looked on minds as enigmas or conceptual traps that were best avoided in favor of overt behavior or the traits of a culture.
But beginning in the 1950s with the cognitive revolution, all that changed. It is now possible to make sense of mental processes and even to study them in the lab. And with a firmer grasp on the concept of mind, we can see that many tenets of the Blank Slate that once seemed appealing are now unnecessary or even incoherent. Here are five ideas from the cognitive revolution that have revamped how we think and talk about minds.
The first idea: The mental world can be grounded in the physical world by the concepts of information, computation, and feedback. A great divide between {32} mind and matter has always seemed natural because behavior appears to have a different kind of trigger than other physical events. Ordinary events have causes, it seems, but human behavior has reasons. I once participated in a BBC television debate on whether "science can explain human behavior. " Arguing against the resolution was a philosopher who asked how we might explain why someone was put in jail. Say it was for inciting racial hatred. The intention, the hatred, and even the prison, she said, cannot be described in the language of physics. There is simply no way to define "hatred" or "jail" in terms of the movements of particles. Explanations of behavior are like narratives, she argued, couched in the intentions of actors -- a plane completely separate from natural science. Or take a simpler example. How might we explain why Rex just walked over to the phone? We would not say that phone-shaped stimuli caused Rex's limbs to swing in certain arcs. Rather, we might say that he wanted to speak to his friend Cecile and knew that Cecile was home. No explanation has as much predictive power as that one. If Rex was no longer on speaking terms with Cecile, or if he remembered that Cecile was out bowling that night, his body would not have risen off the couch.
For millennia the gap between physical events, on the one hand, and meaning, content, ideas, reasons, and intentions, on the other, seemed to cleave the universe in two. How can something as ethereal as "inciting hatred" or "wanting to speak to Cecile" actually cause matter to move in space? But the cognitive revolution unified the world of ideas with the world of matter using a powerful new theory: that mental life can be explained in terms of information, computation, and feedback. Beliefs and memories are collections of information -- like facts in a database, but residing in patterns of activity and structure in the brain. Thinking and planning are systematic transformations of these patterns, like the operation of a computer program. Wanting and trying are feedback loops, like the principle behind a thermostat: they receive information about the discrepancy between a goal and the current state of the world, and then they execute operations that tend to reduce the difference. The mind is connected to the world by the sense organs, which transduce physical energy into data structures in the brain, and by motor programs, by which the brain controls the muscles.
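As a concrete illustration of the feedback idea, here is a minimal sketch in Python (mine, not the author's; the thermostat setting and step size are arbitrary): a loop that senses the discrepancy between a goal and the current state of the world and executes operations that tend to reduce it.

```python
# A minimal, hypothetical feedback loop in the spirit of a thermostat.
goal = 20.0            # desired temperature (arbitrary illustrative value)
temperature = 14.0     # current state of the world

def discrepancy(current, target):
    """Information about the difference between the goal and the current state."""
    return target - current

# Keep executing operations that reduce the difference until the goal is met.
while abs(discrepancy(temperature, goal)) > 0.1:
    error = discrepancy(temperature, goal)
    temperature += 0.5 if error > 0 else -0.5   # heat or cool a little

print(f"settled at {temperature:.1f} degrees")
```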
This general idea may be called the computational theory of mind. It is not the same as the "computer metaphor" of the mind, the suggestion that the mind literally works like a human-made database, computer program, or thermostat.
It says only that we can explain minds and human-made information processors using some of the same principles. It is just like other cases in which the natural world and human engineering overlap. A physiologist might invoke the same laws of optics to explain how the eye works and how a camera works without implying that the eye is like a camera in every detail.
The computational theory of mind does more than explain the existence {33} of knowing, thinking, and trying without invoking a ghost in the machine (though that would be enough of a feat). It also explains how those processes can be intelligent -- how rationality can emerge from a mindless physical process. If a sequence of transformations of information stored in a hunk of matter (such as brain tissue or silicon) mirrors a sequence of deductions that obey the laws of logic, probability, or cause and effect in the world, they will generate correct predictions about the world. And making correct predictions in pursuit of a goal is a pretty good definition of "intelligence. "3
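Here is a toy illustration of that claim, again a hypothetical sketch of my own: a few symbols stored in ordinary computer memory, manipulated by a rule that mirrors modus ponens. Because the manipulation obeys a valid law of logic, the conclusions it generates are correct predictions about the world whenever the premises are true.

```python
# Toy forward chaining: apply modus ponens (from "P" and "if P then Q", infer "Q")
# until no new conclusions follow. Facts and rules are illustrative stand-ins
# for information stored in a hunk of matter such as brain tissue or silicon.
facts = {"it_is_raining"}
rules = [
    ("it_is_raining", "the_ground_is_wet"),
    ("the_ground_is_wet", "the_trail_is_muddy"),
]

changed = True
while changed:
    changed = False
    for antecedent, consequent in rules:
        if antecedent in facts and consequent not in facts:
            facts.add(consequent)
            changed = True

print(sorted(facts))   # a correct prediction: the trail will be muddy
```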
Of course there is no new thing under the sun, and the computational theory of mind was foreshadowed by Hobbes when he described mental activity as tiny motions and wrote that "reasoning is but reckoning. " Three and a half centuries later, science has caught up to his vision. Perception, memory, imagery, reasoning, decision making, language, and motor control are being studied in the lab and successfully modeled as computational paraphernalia such as rules, strings, matrices, pointers, lists, files, trees, arrays, loops, propositions, and networks. For example, cognitive psychologists are studying the graphics system in the head and thereby explaining how people "see" the solution to a problem in a mental image. They are studying the web of concepts in long-term memory and explaining why some facts are easier to recall than others. They are studying the processor and memory used by the language system to learn why some sentences are a pleasure to read and others a difficult slog.
And if the proof is in the computing, then the sister field of artificial intelligence is confirming that ordinary matter can perform feats that were supposedly performable by mental stuff alone. In the 1950s computers were already being called "electronic brains" because they could calculate sums, organize data, and prove theorems. Soon they could correct spelling, set type, solve equations, and simulate experts on restricted topics such as picking stocks and diagnosing diseases. For decades we psychologists preserved human bragging rights by telling our classes that no computer could read text, decipher speech, or recognize faces, but these boasts are obsolete. Today software that can recognize printed letters and spoken words comes packaged with home computers. Rudimentary programs that understand or translate sentences are available in many search engines and Help programs, and they are steadily improving. Face-recognition systems have advanced to the point that civil libertarians are concerned about possible abuse when they are used with security cameras in public places.
Human chauvinists can still write off these low-level feats. Sure, they say, the input and output processing can be fobbed off onto computational modules, but you still need a human user with the capacity for judgment, reflection, and creativity. But according to the computational theory of mind, these capacities are themselves forms of information processing and can be implemented in a computational system. In 1997 an IBM computer called Deep {34} Blue defeated the world chess champion Garry Kasparov, and unlike its predecessors, it did not just evaluate trillions of moves by brute force but was fitted with strategies that intelligently responded to patterns in the game. Newsweek called the match "The Brain's Last Stand. " Kasparov called the outcome "the end of mankind. "
You might still object that chess is an artificial world with discrete moves and a clear winner, perfectly suited to the rule-crunching of a computer. People, on the other hand, live in a messy world offering unlimited moves and nebulous goals. Surely this requires human creativity and intuition -- which is why everyone knows that computers will never compose a symphony, write a story, or paint a picture. But everyone may be wrong. Recent artificial intelligence systems have written credible short stories,4 composed convincing Mozart-like symphonies,5 drawn appealing pictures of people and landscapes,6 and conceived clever ideas for advertisements. 7
None of this is to say that the brain works like a digital computer, that artificial intelligence will ever duplicate the human mind, or that computers are conscious in the sense of having first-person subjective experience. But it does suggest that reasoning, intelligence, imagination, and creativity are forms of information processing, a well- understood physical process. Cognitive science, with the help of the computational theory of mind, has exorcised at least one ghost from the machine.
A second idea: The mind cannot be a blank slate, because blank slates don't do anything. As long as people had only the haziest concept of what a mind was or how it might work, the metaphor of a blank slate inscribed by the environment did not seem too outrageous. But as soon as one starts to think seriously about what kind of computation enables a system to see, think, speak, and plan, the problem with blank slates becomes all too obvious: they don't do anything. The inscriptions will sit there forever unless something notices patterns in them, combines them with patterns learned at other times, uses the combinations to scribble new thoughts onto the slate, and reads the results to guide behavior toward goals. Locke recognized this problem and alluded to something called "the understanding," which looked at the inscriptions on the white paper and carried out the recognizing, reflecting, and associating. But of
?
Give me a dozen healthy infants, well-formed, and my own specified world to bring them up in and I'll guarantee to take any one at random and train him to become any type of specialist I might select -- doctor, lawyer, artist, merchant-chief, and yes, even beggar-man and thief, regardless of his talents, penchants, tendencies, abilities, vocations, and race of his ancestors. 10
In behaviorism, an infant's talents and abilities didn't matter because there was no such thing as a talent or an ability. Watson had banned them from psychology, together with other contents of the mind, such as ideas, beliefs, desires, and feelings. They were subjective and unmeasurable, he said, and unfit for science, which studies only objective and measurable things. To a behaviorist, the only legitimate topic for psychology is overt behavior and how it is controlled by the present and past environment. (There is an old joke in psychology: What does a behaviorist say after making love? "It was good for you; how was it for me? ")
Locke's "ideas" had been replaced by "stimuli" and "responses," but his laws of association survived as laws of conditioning. A response can be associated with a new stimulus, as when Watson presented a baby with a white rat and then clanged a hammer against an iron bar, allegedly making the baby associate fear with fur. And a response could be associated with a reward, as when a cat in a box eventually learned that pulling a string opened a door and allowed it to escape. In these cases an experimenter set up a contingency between a stimulus and another stimulus or between a response and a reward. In a natural environment, said the behaviorists, these contingencies are part of the causal texture of the world, and they inexorably shape the behavior of organisms, including humans.
Among the casualties of behaviorist minimalism was the rich psychology of William James (1842-1910). James had been inspired by Darwin's argument that perception, cognition, and emotion, like physical organs, had evolved as biological adaptations. James invoked the notion of instinct to explain the preferences of humans, not just those of animals, and he posited numerous mechanisms in his theory of mental life, including short-term and long-term memory. But with the advent of behaviorism they all joined the index of forbidden concepts. The psychologist J. R. Kantor wrote in 1923: "Brief is the answer to the question as to what is the relationship between {20} social psychology and instincts. Plainly, there is no relationship. "11 Even sexual desire was redefined as a conditioned response. The psychologist Zing Yang Kuo wrote in 1929:
Behavior is not a manifestation of hereditary factors, nor can it be expressed in terms of heredity. [It is] a passive and forced movement mechanically and solely determined by the structural pattern of the organism and the nature of environmental forces. . . . All our sexual appetites are the result of social stimulation. The organism possesses no ready-made reaction to the other sex, any more than it possesses innate ideas. 12
Behaviorists believed that behavior could be understood independently of the rest of biology, without attention to the genetic makeup of the animal or the evolutionary history of the species. Psychology came to consist of the study of learning in laboratory animals. B. E. Skinner (1904-1990), the most famous psychologist in the middle decades of the twentieth century, wrote a book called The Behavior of Organisms in which the only organisms were rats and pigeons and the only behavior was lever pressing and key pecking. It took a trip to the circus to remind psychologists that species and their instincts mattered after all. In an article called "The Misbehavior of Organisms," Skinner's students Keller and Marian Breland reported that when they tried to use his techniques to train animals to insert poker chips into vending machines, the chickens pecked the chips, the raccoons washed them, and the pigs tried to root them with their snouts. 13 And behaviorists were as hostile to the brain as they were to genetics. As late as 1974, Skinner wrote that studying the brain was just another misguided quest to find the causes of behavior inside the organism rather than out in the world. 14
Behaviorism not only took over psychology but infiltrated the public consciousness. Watson wrote an influential childrearing manual recommending that parents establish rigid feeding schedules for their children and give them a minimum of attention and love. If you comfort a crying child, he wrote, you will reward him for crying and thereby increase the frequency of crying behavior. (Benjamin Spock's Baby and Child Care, first published in 1946 and famous for recommending indulgence toward children, was in part a reaction to Watson. ) Skinner wrote several bestsellers arguing that harmful behavior is neither instinctive nor freely chosen but inadvertently conditioned. If we turned society into a big Skinner box and controlled behavior deliberately rather than haphazardly, we could eliminate aggression, overpopulation, crowding, pollution, and inequality, and thereby attain Utopia. 15 The noble
? ? ? ? ? ? ? ? savage became the noble pigeon. {21}
Strict behaviorism is pretty much dead in psychology, but many of its attitudes live on. Associationism is the learning theory assumed by many mathematical models and neural network simulations of learning. 16 Many neuroscientists equate learning with the forming of associations, and look for an associative bond in the physiology of neurons and synapses, ignoring other kinds of computation that might implement learning in the brain. 17 (For example, storing the value of a variable in the brain, as in "x = 3," is a critical computational step in navigating and foraging, which are highly developed talents of animals in the wild. But this kind of learning cannot be reduced to the formation of associations, and so it has been ignored in neuroscience. ) Psychologists and neuroscientists still treat organisms interchangeably, seldom asking whether a convenient laboratory animal (a rat, a cat, a monkey) is like or unlike humans in crucial ways. 18 Until recently, psychology ignored the content of beliefs and emotions and the possibility that the mind had evolved to treat biologically important categories in different ways. 19 Theories of memory and reasoning didn't distinguish thoughts about people from thoughts about rocks or houses. Theories of emotion didn't distinguish fear from anger, jealousy, or love. 20 Theories of social relations didn't distinguish among family, friends, enemies, and strangers. 21 Indeed, the topics in psychology that most interest laypeople -- love, hate, work, play, food, sex, status, dominance, jealousy, friendship, religion, art -- are almost completely absent from psychology textbooks.
One of the major documents of late twentieth-century psychology was the two-volume Parallel Distributed Processing by David Rumelhart, James McClelland, and their collaborators, which presented a style of neural network modeling called connectionism. 22 Rumelhart and McClelland argued that generic associationist networks, subjected to massive amounts of training, could explain all of cognition. They realized that this theory left them without a good answer to the question "Why are people smarter than rats? " Here is their answer:
Given all of the above, the question does seem a bit puzzling. . . . People have much more cortex than rats do or even than other primates do; in particular they have very much more . . . brain structure not dedicated to input/output -- and presumably, this extra cortex is strategically placed in the brain to subserve just those functions that differentiate people from rats or even apes. . . .
But there must be another aspect to the difference between rats and people as well. This is that the human environment includes other people and the cultural devices that they have developed to organize their thinking processes. 23 {22}
Humans, then, are just rats with bigger blank slates, plus something called "cultural devices. " And that brings us to the other half of the twentieth-century revolution in social~science.
? ? ? ? ? ? ? ? ? ? He's so unhip, when you say "Dylan,"
He thinks you're talkin' about Dylan Thomas (whoever he was). The man ain't got no culture.
-- Simon and Garfunkel
The word culture used to refer to exalted genres of entertainment, such as poetry, opera, and ballet. The other familiar sense -- "the totality of socially transmitted behavior patterns, arts, beliefs, institutions, and all other products of human work and thought" -- is only a century old. This change in the English language is just one of the legacies of the father of modern anthropology, Franz Boas (1858-1942).
The ideas of Boas, like the ideas of the major thinkers in psychology, were rooted in the empiricist philosophers of the Enlightenment, in this case George Berkeley (1685-1753). Berkeley formulated the theory of idealism, the notion that ideas, not bodies and other hunks of matter, are the ultimate constituents of reality. After twists and turns that are too convoluted to recount here, idealism became influential among nineteenth-century German thinkers. It was embraced by the young Boas, a German Jew from a secular, liberal family.
Idealism allowed Boas to lay a new intellectual foundation for egalitarianism. The differences among human races and ethnic groups, he proposed, come not from their physical constitution but from their culture, a system of ideas and values spread by language and other forms of social behavior. Peoples differ because their cultures differ. Indeed, that is how we should refer to them: the Eskimo culture or the Jewish culture, not the Eskimo race or the Jewish race. The idea that minds are shaped by culture served as a bulwark against racism and was the theory one ought to prefer on moral grounds. Boas wrote, "I claim that, unless the contrary can be proved, we must assume that all complex activities are socially determined, not hereditary. "24
Boas's case was not just a moral injunction; it was rooted in real discoveries. Boas studied native peoples,
? immigrants, and children in orphanages to prove that all groups of humans had equal potential. Turning Jespersen on his head, Boas showed that the languages of primitive peoples were not simpler than those of Europeans; they were just different. Eskimos' difficulty in discriminating the sounds of our language, for example, is matched by our difficulty in discriminating the sounds of theirs. True, many non-Western languages lack the means to express certain abstract concepts. They may have no words for numbers higher than three, for example, or no word for {23} goodness in general as opposed to the goodness of a particular person. But those limitations simply reflect the daily needs of those people as they live their lives, not an infirmity in their mental abilities. As in the story of Socrates drawing abstract philosophical concepts out of a slave boy, Boas showed that he could elicit new word forms for abstract concepts like "goodness" and "pity" out of a Kwakiutl native from the Pacific Northwest. He also observed that when native peoples come into contact with civilization and acquire things that have to be counted, they quickly adopt a full-blown counting system. 25
For all his emphasis on culture, Boas was not a relativist who believed that all cultures are equivalent, nor was he an empiricist who believed in the Blank Slate. He considered European civilization superior to tribal cultures, insisting only that all peoples were capable of achieving it. He did not deny that there might be a universal human nature, or that there might be differences among people within an ethnic group. What mattered to him was the idea that all ethnic groups are endowed with the same basic mental abilities. 26 Boas was right about this, and today it is accepted by virtually all scholars and scientists.
But Boas had created a monster. His students came to dominate American social science, and each generation outdid the previous one in its sweeping pronouncements. Boas's students insisted not just that differences among ethnic groups must be explained in terms of culture but that every aspect of human existence must be explained in terms of culture. For example, Boas had favored social explanations unless they were disproven, but his student Albert Kroeber favored them regardless of the evidence. "Heredity," he wrote, "cannot be allowed to have acted any part in history. "27 Instead, the chain of events shaping a people "involves the absolute conditioning of historical events by other historical events. "28
Kroeber did not just deny that social behavior could be explained by innate properties of minds. He denied that it could be explained by any properties of minds. A culture, he wrote, is superorganic -- it floats in its own universe, free of the flesh and blood of actual men and women: "Civilization is not mental action but a body or stream of products of mental exercise. . . . Mentality relates to the individual. The social or cultural, on the other hand, is in its essence non-individual. Civilization as such begins only where the individual ends. "29
These two ideas -- the denial of human nature, and the autonomy of culture from individual minds -- were also articulated by the founder of sociology, Emile Durkheim (1858-1917), who had foreshadowed Kroeber's doctrine of the superorganic mind:
Every time that a social phenomenon is directly explained by a psychological phenomenon, we may be sure that the explanation is false. . . . The group thinks, feels, and acts quite differently from the way in which {24} members would were they isolated. . . . If we begin with the individual in seeking to explain phenomena, we shall be able to understand nothing of what takes place in the group. . . . Individual natures are merely the indeterminate material that the social factor molds and transforms. Their contribution consists exclusively in very general attitudes, in vague and consequently plastic predispositions. 30
And he laid down a law for the social sciences that would be cited often in the century to come: "The determining cause of a social fact should be sought among the social facts preceding it and not among the states of individual consciousness. "31
Both psychology and the other social sciences, then, denied that the minds of individual people were important, but they set out in different directions from there. Psychology banished mental entities like beliefs and desires altogether and replaced them with stimuli and responses. The other social sciences located beliefs and desires in cultures and societies rather than in the heads of individual people. The different social sciences also agreed that the contents of cognition -- ideas, thoughts, plans, and so on -- were really phenomena of language, overt behavior that anyone could hear and write down. (Watson proposed that "thinking" really consisted of teensy movements of the mouth and throat. ) But most of all they shared a dislike of instincts and evolution. Prominent social scientists repeatedly declared the slate to be blank:
Instincts do not create customs; customs create instincts, for the putative instincts of human beings are
always learned and never native.
? ? ? ? ? ? ? ? ? -- Ellsworth Faris (1927)32
? Cultural phenomena . . . are in no respect hereditary but are characteristically and without exception
acquired.
Man has no nature; what he has is history.
-- George Murdock (1932)33 -- Jose Ortega y Gasset (1935)34
? ? With the exception of the instinctoid reactions in infants to sudden withdrawals of support and to sudden loud noises, the human being is entirely instinctless. . . . Man is man because he has no instincts, because everything he is and has become he has learned, acquired, from his culture, from the man- made part of the environment, from other human beings.
-- Ashley Montagu (1973)35 {25}
True, the metaphor of choice was no longer a scraped tablet or white paper. Durkheim had spoken of "indeterminate material," some kind of blob that was molded or pounded into shape by culture. Perhaps the best modern metaphor is Silly Putty, the rubbery stuff that children use both to copy printed matter (like a blank slate) and to mold into desired shapes (like indeterminate material). The malleability metaphor resurfaced in statements by two of Boas's most famous students:
Most people are shaped to the form of their culture because of the malleability of their original endowment. . . . The great mass of individuals take quite readily the form that is presented to them.
-- Ruth Benedict (1934)36
We are forced to conclude that human nature is almost unbelievably malleable, responding accurately and contrastingly to contrasting cultural conditions.
-- Margaret Mead (1935)37
Others likened the mind to some kind of sieve:
Much of what is commonly called "human nature" is merely culture thrown against a screen of nerves, glands, sense organs, muscles, etc.
-- Leslie White (1949)38
Or to the raw materials for a factory:
Human nature is the rawest, most undifferentiated of raw material.
-- Margaret Mead (1928)39
Our ideas, our values, our acts, even our emotions, are, like our nervous system itself, cultural products -- products manufactured, indeed, out of tendencies, capacities, and dispositions with which we were born, but manufactured nonetheless.
-- Clifford Geertz (1973)40
Or to an unprogrammed computer:
Man is the animal most desperately dependent upon such extragenetic, outside-the-skin control mechanisms, such cultural programs, for ordering his behavior.
-- Clifford Geertz (1973)41 {26}
Or to some other amorphous entity that can have many things done to it:
Cultural psychology is the study of the way cultural traditions and social practices regulate, express, transform, and permute the human psyche, resulting less in psychic unity for humankind than in ethnic divergences in mind, self and emotion.
-- Richard Shweder (1990)42
The superorganic or group mind also became an article of faith in social science. Robert Lowie (another Boas student) wrote, "The principles of psychology are as incapable of accounting for the phenomena of culture as is gravitation to account for architectural styles. "43 And in case you missed its full implications, the anthropologist Leslie White spelled it out:
Instead of regarding the individual as a First Cause, as a prime mover, as the initiator and determinant
of the culture process, we now see him as a component part, and a tiny and relatively insignificant part at that, of a vast, socio-cultural system that embraces innumerable individuals at any one time and extends back into their remote past as well. . . . For purposes of scientific interpretation, the culture process may be regarded as a thing sui generis; culture is explainable in terms of culture. 44
In other words, we should forget about the mind of an individual person like you, that tiny and insignificant part of a vast sociocultural system. The mind that counts is the one belonging to the group, which is capable of thinking, feeling, and acting on its own.
The doctrine of the superorganism has had an impact on modern life that extends well beyond the writings of social scientists. It underlies the tendency to reify "society" as a moral agent that can be blamed for sins as if it were a person. It drives identity politics, in which civil rights and political perquisites are allocated to groups rather than to individuals. And as we shall see in later chapters, it defined some of the great divides between major political systems in the twentieth century. ~
The Blank Slate was not the only part of the official theory that social scientists felt compelled to prop up. They also strove to consecrate the Noble Savage. Mead painted a Gauguinesque portrait of native peoples as peaceable, egalitarian, materially satisfied, and sexually unconflicted. Her uplifting vision of who we used to be -- and therefore who we can become again -- was accepted by such otherwise skeptical writers as Bertrand Russell and H. L. Mencken. Ashley Montagu (also from the Boas circle), a prominent public intellectual from the 1950s until his recent death, tirelessly invoked the doctrine {27} of the Noble Savage to justify the quest for brotherhood and peace and to refute anyone who might think such efforts were futile. In 1950, for example, he drafted a manifesto for the newly formed UNESCO that declared, "Biological studies lend support to the ethic of universal brotherhood, for man is born with drives toward co-operation, and unless these drives are satisfied, men and nations alike fall ill. "45 With the ashes of thirty-five million victims of World War II still warm or radioactive, a reasonable person might wonder how "biological studies" could show anything of the kind. The draft was rejected, but Montagu had better luck in the decades to come, when UNESCO and many scholarly societies adopted similar resolutions. 46
More generally, social scientists saw the malleability of humans and the autonomy of culture as doctrines that might bring about the age-old dream of perfecting mankind. We are not stuck with what we don't like about our current predicament, they argued. Nothing prevents us from changing it except a lack of will and the benighted belief that we are permanently consigned to it by biology. Many social scientists have expressed the hope of a new and improved human nature:
I felt (and said so early) that the environmental explanation was preferable, whenever justified by the
data, because it was more optimistic, holding out the hope of improvement.
-- Otto Klineberg (1928)47
? Modern sociology and modern anthropology are one in saying that the substance of culture, or civilization, is social tradition and that this social tradition is indefinitely modifiable by further learning on the part of men for happier and better ways of living together. . . . Thus the scientific study of institutions awakens faith in the possibility of remaking both human nature and human social life.
-- Charles Ellwood (1922)48
Barriers in many fields of knowledge are falling before the new optimism which is that anybody can learn anything. . . . We have turned away from the concept of human ability as something fixed in the physiological structure, to that of a flexible and versatile mechanism subject to great improvement.
-- Robert Faris (1961)49
Though psychology is not as politicized as some of the other social sciences, it too is sometimes driven by a Utopian vision in which changes in child-rearing and education will ameliorate social pathologies and improve human welfare. And psychological theorists sometimes try to add moral heft to arguments for connectionism or other empiricist theories with warnings about the {28} pessimistic implications of innatist theories. They argue, for example, that innatist theories open the door to inborn differences, which could foster racism, or that the theories imply that human traits are unchangeable, which could weaken support for social programs. 50
Twentieth-century social science embraced not just the Blank Slate and the Noble Savage but the third member of the trinity, the Ghost in the Machine. The declaration that we can change what we don't like about ourselves became a watchword of social science. But that only raises the question "Who or what is the 'we'? " If the "we" doing the remaking are just other hunks of matter in the biological world, then any malleability of behavior we discover would be cold comfort, because we, the molders, would be biologically constrained and therefore might not mold people, or allow ourselves to be molded, in the most socially salutary way. A ghost in the machine is the ultimate liberator of human will -- including the will to change society -- from mechanical causation. The anthropologist Loren Eiseley made this clear when he wrote:
The mind of man, by indetermination, by the power of choice and cultural communication, is on the verge of escape from the blind control of that deterministic world with which the Darwinists had unconsciously shackled man. The inborn characteristics laid upon him by the biological extremists have crumbled away. . . . Wallace saw and saw correctly, that with the rise of man the evolution of parts was to a marked degree outmoded, that mind was now the arbiter of human destiny. 51
The "Wallace" that Eiseley is referring to is Alfred Russel Wallace (1823-1913), the co-discoverer with Darwin of natural selection. Wallace parted company from Darwin by claiming that the human mind could not be explained by evolution and must have been designed by a superior intelligence. He certainly did believe that the mind of man could escape "the blind control of a deterministic world. " Wallace became a spiritualist and spent the later years of his career searching for a way to communicate with the souls of the dead.
The social scientists who believed in an absolute separation of culture from biology may not have literally believed in a spook haunting the brain. Some used the analogy of the difference between living and nonliving matter. Kroeber wrote: "The dawn of the social. . . is not a link in any chain, not a step in a path, but a leap to another plane. . . . [It is like] the first occurrence of life in the hitherto lifeless universe. . . . From this moment on there should be two worlds in place of one. "52 And Lowie insisted that it was "not mysticism, but sound scientific method" to say that culture was "sui generis" and could be explained only by culture, because everyone knows that in biology a living cell can come only from another living cell. 53 {29}
At the time that Kroeber and Lowie wrote, they had biology on their side. Many biologists still thought that living things were animated by a special essence, an elan vital, and could not be reduced to inanimate matter. A 1931 history of biology, referring to genetics as it was then understood, said, "Thus the last of the biological theories leaves us where we first started, in the presence of a power called life or psyche which is not only of its own kind but unique in each and all of its exhibitions. "54 In the next chapter we will see that the analogy between the autonomy of culture and the autonomy of life would prove to be more telling than these social scientists realized.
{30} Chapter 3
The Last Wall to Fall
In 1755 Samuel Johnson wrote that his dictionary should not be expected to "change sublunary nature, and clear the world at once from folly, vanity, and affectation. " Few people today are familiar with the lovely word sublunary, literally "below the moon. " It alludes to the ancient belief in a strict division between the pristine, lawful, unchanging cosmos above and our grubby, chaotic, fickle Earth below. The division was already obsolete when Johnson used the word: Newton had shown that the same force that pulled an apple toward the ground kept the moon in its celestial orbit.
Newton's theory that a single set of laws governed the motions of all objects in the universe was the first event in one of the great developments in human understanding: the unification of knowledge, which the biologist E. O. Wilson has termed consilience. 1 Newton's breaching of the wall between the terrestrial and the celestial was followed by a collapse of the once equally firm (and now equally forgotten) wall between the creative past and the static present. That happened when Charles Lyell showed that the Earth was sculpted in the past by forces we see today (such as earthquakes and erosion) acting over immense spans of time.
The living and nonliving, too, no longer occupy different realms. In 1628 William Harvey showed that the human body is a machine that runs by hydraulics and other mechanical principles. In 1828 Friedrich Wohler showed that the stuff of life is not a magical, pulsating gel but ordinary compounds following the laws of chemistry. Charles Darwin showed how the astonishing diversity of life and its ubiquitous signs of design could arise from the physical process of natural selection among replicators. Gregor Mendel, and then James Watson and Francis Crick, showed how replication itself could be understood in physical terms.
The unification of our understanding of life with our understanding of matter and energy was the greatest scientific achievement of the second half of the twentieth century. One of its many consequences was to pull the rug out {31} from under social scientists like Kroeber and Lowie who had invoked the "sound scientific method" of placing the living and nonliving in parallel universes. We now know that cells did not always come from other cells and that the emergence of life did not create a second world where before there was just one. Cells evolved from simpler replicating molecules, a nonliving part of the physical world, and may be understood as collections of molecular machinery -- fantastically complicated machinery, of course, but machinery nonetheless.
This leaves one wall standing in the landscape of knowledge, the one that twentieth-century social scientists guarded so jealously. It divides matter from mind, the material from the spiritual, the physical from the mental, biology from culture, nature from society, and the sciences from the social sciences, humanities, and arts. The division was built into each of the doctrines of the official theory: the blank slate given by biology versus the contents inscribed by experience and culture, the nobility of the savage in the state of nature versus the corruption of social institutions, the machine following inescapable laws versus the ghost that is free to choose and to improve the human condition.
But this wall, too, is falling. New ideas from four frontiers of knowledge -- the sciences of mind, brain, genes, and evolution -- are breaching the wall with a new understanding of human nature. In this chapter I will show how they are filling in the blank slate, declassing the noble savage, and exorcising the ghost in the machine. In the following chapter I will show that this new conception of human nature, connected to biology from below, can in turn be connected to the humanities and social sciences above. That new conception can give the phenomena of culture their due without segregating them into a parallel universe. ~
The first bridge between biology and culture is the science of mind, cognitive science. 2 The concept of mind has been perplexing for as long as people have reflected on their thoughts and feelings. The very idea has spawned paradoxes, superstitions, and bizarre theories in every period and culture. One can almost sympathize with the behaviorists and social constructionists of the first half of the twentieth century, who looked on minds as enigmas or conceptual traps that were best avoided in favor of overt behavior or the traits of a culture.
But beginning in the 1950s with the cognitive revolution, all that changed. It is now possible to make sense of mental processes and even to study them in the lab. And with a firmer grasp on the concept of mind, we can see that many tenets of the Blank Slate that once seemed appealing are now unnecessary or even incoherent. Here are five ideas from the cognitive revolution that have revamped how we think and talk about minds.
The first idea: The mental world can be grounded in the physical world by the concepts of information, computation, and feedback. A great divide between {32} mind and matter has always seemed natural because behavior appears to have a different kind of trigger than other physical events. Ordinary events have causes, it seems, but human behavior has reasons. I once participated in a BBC television debate on whether "science can explain human behavior. " Arguing against the resolution was a philosopher who asked how we might explain why someone was put in jail. Say it was for inciting racial hatred. The intention, the hatred, and even the prison, she said, cannot be described in the language of physics. There is simply no way to define "hatred" or "jail" in terms of the movements of particles. Explanations of behavior are like narratives, she argued, couched in the intentions of actors -- a plane completely separate from natural science. Or take a simpler example. How might we explain why Rex just walked over to the phone? We would not say that phone-shaped stimuli caused Rex's limbs to swing in certain arcs. Rather, we might say that he wanted to speak to his friend Cecile and knew that Cecile was home. No explanation has as much predictive power as that one. If Rex was no longer on speaking terms with Cecile, or if he remembered that Cecile was out bowling that night, his body would not have risen off the couch.
For millennia the gap between physical events, on the one hand, and meaning, content, ideas, reasons, and intentions, on the other, seemed to cleave the universe in two. How can something as ethereal as "inciting hatred" or "wanting to speak to Cecile" actually cause matter to move in space? But the cognitive revolution unified the world of ideas with the world of matter using a powerful new theory: that mental life can be explained in terms of information, computation, and feedback. Beliefs and memories are collections of information -- like facts in a database, but residing in patterns of activity and structure in the brain. Thinking and planning are systematic transformations of these patterns, like the operation of a computer program. Wanting and trying are feedback loops, like the principle behind a thermostat: they receive information about the discrepancy between a goal and the current state of the world, and then they execute operations that tend to reduce the difference. The mind is connected to the world by the sense organs, which transduce physical energy into data structures in the brain, and by motor programs, by which the brain controls the muscles.
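To make the feedback-loop point concrete, here is a minimal sketch in Python -- my own illustration, not anything from the book, and every name in it is hypothetical. It does only what the thermostat analogy describes: compare a goal with the sensed state of the world, then repeatedly execute an operation that shrinks the discrepancy.

```python
# A minimal feedback loop (the thermostat principle): sense the discrepancy
# between a goal and the current state, then act to reduce it.
# Hypothetical illustration only; names are invented for this sketch.

def feedback_loop(goal, state, sense, act, tolerance=0.5, max_steps=100):
    for _ in range(max_steps):
        error = goal - sense(state)      # discrepancy between goal and current state
        if abs(error) <= tolerance:      # close enough: stop acting
            return state
        state = act(state, error)        # operation that tends to reduce the difference
    return state

# Toy "room": each step nudges the temperature a fraction of the way to the goal.
room = {"temp": 15.0}
warmer = feedback_loop(goal=20.0, state=room,
                       sense=lambda s: s["temp"],
                       act=lambda s, e: {"temp": s["temp"] + 0.2 * e})
print(warmer)  # the temperature has converged to within half a degree of 20
```

Nothing in the loop "wants" anything in a spooky sense; wanting, on this view, just is the running of such a discrepancy-reducing process.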
This general idea may be called the computational theory of mind. It is not the same as the "computer metaphor" of the mind, the suggestion that the mind literally works like a human-made database, computer program, or thermostat.
It says only that we can explain minds and human-made information processors using some of the same principles. It is just like other cases in which the natural world and human engineering overlap. A physiologist might invoke the same laws of optics to explain how the eye works and how a camera works without implying that the eye is like a camera in every detail.
The computational theory of mind does more than explain the existence {33} of knowing, thinking, and trying without invoking a ghost in the machine (though that would be enough of a feat). It also explains how those processes can be intelligent -- how rationality can emerge from a mindless physical process. If a sequence of transformations of information stored in a hunk of matter (such as brain tissue or silicon) mirrors a sequence of deductions that obey the laws of logic, probability, or cause and effect in the world, they will generate correct predictions about the world. And making correct predictions in pursuit of a goal is a pretty good definition of "intelligence. "3
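As a toy illustration of that claim -- my own sketch with invented facts and rules, not the author's example -- a few lines of Python can mechanically rewrite stored symbols according to if-then rules. Because each rewrite mirrors a valid inference, every fact the loop adds is correct whenever the starting facts and rules are.

```python
# Mindless symbol manipulation that nonetheless yields true conclusions:
# facts and rules are just stored data, and the loop below applies the rules
# with no understanding at all. Hypothetical illustration only.

facts = {("human", "Socrates")}
rules = [
    ("human", "mortal"),     # whatever is human is mortal
    ("mortal", "will_die"),  # whatever is mortal will die
]

changed = True
while changed:
    changed = False
    for premise, conclusion in rules:
        for pred, arg in list(facts):
            if pred == premise and (conclusion, arg) not in facts:
                facts.add((conclusion, arg))
                changed = True

print(sorted(facts))
# [('human', 'Socrates'), ('mortal', 'Socrates'), ('will_die', 'Socrates')]
```

The point is not that the brain runs this loop, only that correct predictions can fall out of a purely physical process whose steps happen to mirror valid reasoning.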
Of course there is no new thing under the sun, and the computational theory of mind was foreshadowed by Hobbes when he described mental activity as tiny motions and wrote that "reasoning is but reckoning. " Three and a half centuries later, science has caught up to his vision. Perception, memory, imagery, reasoning, decision making, language, and motor control are being studied in the lab and successfully modeled as computational paraphernalia such as rules, strings, matrices, pointers, lists, files, trees, arrays, loops, propositions, and networks. For example, cognitive psychologists are studying the graphics system in the head and thereby explaining how people "see" the solution to a problem in a mental image. They are studying the web of concepts in long-term memory and explaining why some facts are easier to recall than others. They are studying the processor and memory used by the language system to learn why some sentences are a pleasure to read and others a difficult slog.
And if the proof is in the computing, then the sister field of artificial intelligence is confirming that ordinary matter can perform feats that were supposedly performable by mental stuff alone. In the 1950s computers were already being called "electronic brains" because they could calculate sums, organize data, and prove theorems. Soon they could correct spelling, set type, solve equations, and simulate experts on restricted topics such as picking stocks and diagnosing diseases. For decades we psychologists preserved human bragging rights by telling our classes that no computer could read text, decipher speech, or recognize faces, but these boasts are obsolete. Today software that can recognize printed letters and spoken words comes packaged with home computers. Rudimentary programs that understand or translate sentences are available in many search engines and Help programs, and they are steadily improving. Face-recognition systems have advanced to the point that civil libertarians are concerned about possible abuse when they are used with security cameras in public places.
Human chauvinists can still write off these low-level feats. Sure, they say, the input and output processing can be fobbed off onto computational modules, but you still need a human user with the capacity for judgment, reflection, and creativity. But according to the computational theory of mind, these capacities are themselves forms of information processing and can be implemented in a computational system. In 1997 an IBM computer called Deep
{34} Blue defeated the world chess champion Garry Kasparov, and unlike its predecessors, it did not just evaluate trillions of moves by brute force but was fitted with strategies that intelligently responded to patterns in the game. Newsweek called the match "The Brain's Last Stand. " Kasparov called the outcome "the end of mankind. "
You might still object that chess is an artificial world with discrete moves and a clear winner, perfectly suited to the rule-crunching of a computer. People, on the other hand, live in a messy world offering unlimited moves and nebulous goals. Surely this requires human creativity and intuition -- which is why everyone knows that computers will never compose a symphony, write a story, or paint a picture. But everyone may be wrong. Recent artificial intelligence systems have written credible short stories,4 composed convincing Mozart-like symphonies,5 drawn appealing pictures of people and landscapes,6 and conceived clever ideas for advertisements. 7
None of this is to say that the brain works like a digital computer, that artificial intelligence will ever duplicate the human mind, or that computers are conscious in the sense of having first-person subjective experience. But it does suggest that reasoning, intelligence, imagination, and creativity are forms of information processing, a well-understood physical process. Cognitive science, with the help of the computational theory of mind, has exorcised at least one ghost from the machine.
A second idea: The mind cannot be a blank slate, because blank slates don't do anything. As long as people had only the haziest concept of what a mind was or how it might work, the metaphor of a blank slate inscribed by the environment did not seem too outrageous. But as soon as one starts to think seriously about what kind of computation enables a system to see, think, speak, and plan, the problem with blank slates becomes all too obvious: they don't do anything. The inscriptions will sit there forever unless something notices patterns in them, combines them with patterns learned at other times, uses the combinations to scribble new thoughts onto the slate, and reads the results to guide behavior toward goals. Locke recognized this problem and alluded to something called "the understanding," which looked at the inscriptions on the white paper and carried out the recognizing, reflecting, and associating. But of
