• It is a bad idea to say that discrimination is wrong only because the traits of all people are indistinguishable.
• It is a bad idea to say that violence and exploitation are wrong only because people are not naturally inclined to them.
• It is a bad idea to say that people are responsible for their actions only because the causes of those actions are mysterious.
• And it is a bad idea to say that our motives are meaningful in a personal sense only because they are inexplicable in a biological sense. {194}
These are bad ideas because they make our values hostages to fortune, implying that someday factual discoveries could make them obsolete. And they are bad ideas because they conceal the downsides of denying human nature: persecution of the successful, intrusive social engineering, the writing off of suffering in other cultures, an incomprehension of the logic of justice, and the devaluing of human life on earth.
KNOW THYSELF
Now that I have attempted to make the very idea of human nature respectable, it is time to say something about what it is and what difference it makes for our public and private lives. The chapters in Part IV present some current ideas about the design specs of the basic human
faculties. These are not just topics in a psychology curriculum but have implications for many arenas of public discourse. Ideas about the contents of cognition -- concepts, words, and images -- shed light on the roots of prejudice, on the media, and on the arts. Ideas about the capacity for reason can enter into our policies of education and applications of technology. Ideas about social relations are relevant to the family, to sexuality, to social organization, and to crime. Ideas about the moral sense inform the way we evaluate political movements and how we trade off one value against another.
In each of these arenas, people always appeal to some conception of human nature, whether they acknowledge it or not. The problem is that the conceptions are often based on gut feelings, folk theories, and archaic versions of biology. My goal is to make these conceptions explicit, to suggest what is right and wrong about them, and to spell out some of the implications. Ideas about human nature cannot, on their own, resolve perplexing controversies or determine public policy. But without such ideas we are not playing with a full deck and are vulnerable to unnecessary befuddlement. As the biologist Richard Alexander has noted, "Evolution is surely most deterministic for those still unaware of it. "1
Chapter 12
In Touch with Reality
What a piece of work is a man!
How noble in reason!
How infinite in faculty!
In form, in moving, how express and admirable! In action, how like an angel!
In apprehension, how like a god!
-- William Shakespeare
The starting point for acknowledging human nature is a sheer awe and humility in the face of the staggering complexity of its source, the brain. Organized by the three billion bases of our genome and shaped by hundreds of millions of years of evolution, the brain is a network of unimaginable intricacy: a hundred billion neurons linked by a hundred trillion connections, woven into a convoluted three-dimensional architecture. Humbling, too, is the complexity of what it does. Even the mundane talents we share with other primates -- walking, grasping, recognizing -- are solutions to engineering problems at or beyond the cutting edge of artificial intelligence. The talents that are human birthrights -- speaking and understanding, using common sense, teaching children, inferring other people's motives -- will probably not be duplicated by machines in our lifetime, if ever. All this should serve as a counterweight to the image of the mind as formless raw material and to people as insignificant atoms making up the complex being we call "society. "
The human brain equips us to thrive in a world of objects, living things, and other people. Those entities have a large impact on our well-being, and one would expect the brain to be well suited to detecting them and their powers. Failing to recognize a steep precipice or a hungry panther or a jealous spouse can have significant negative consequences for biological fitness, to put it mildly. The fantastic complexity of the brain is there in part to register consequential facts about the world around us. {198}
But this truism has been rejected by many sectors of modern intellectual life. According to the relativistic wisdom prevailing in much of academia today, reality is socially constructed by the use of language, stereotypes, and media images. The idea that people have access to facts about the world is naïve, say the proponents of social constructionism, science studies, cultural studies, critical theory, postmodernism, and deconstructionism. In their view, observations are always infected by theories, and theories are saturated with ideology and political doctrines, so anyone who claims to have the facts or know the truth is just trying to exert power over everyone else.
Relativism is entwined with the doctrine of the Blank Slate in two ways. One is that relativists have a penny-pinching theory of psychology in which the mind has no mechanisms designed to grasp reality; all it can do is passively download words, images, and stereotypes from the surrounding culture. The other is the relativists' attitude toward science. Most scientists regard their work as an extension of our everyday ability to figure out what is out there and how things work. Telescopes and microscopes amplify the visual system; theories formalize our hunches about cause and effect; experiments refine our drive to gather evidence about events we cannot witness directly. Relativist movements agree that science is perception and cognition writ large, but they draw the opposite conclusion: that scientists, like laypeople, are unequipped to grasp an objective reality. Instead, their advocates say, "Western science is only one way of describing reality, nature, and the way things work -- a very effective way, certainly, for the production of goods and profits, but unsatisfactory in most other respects. It is an imperialist arrogance which ignores the sciences and insights of most other cultures and times. "1 Nowhere is this more significant than in the scientific study of politically charged topics such as race, gender, violence, and social organization. Appealing to "facts" or "the truth" in connection with these topics is just a ruse, the relativists say, because there is no "truth" in the sense of an objective yardstick independent of cultural and political presuppositions.
Skepticism about the soundness of people's mental faculties also determines whether one should respect ordinary people's tastes and opinions (even those we don't much like) or treat the people as dupes of an insidious commercial culture. According to relativist doctrines like "false consciousness," "inauthentic preferences," and "interiorized authority," people may be mistaken about their own desires. If so, it would undermine the assumptions behind democracy, which gives ultimate authority to the preferences of the majority of a population, and the assumptions behind market economies, which treat people as the best judges of how they should allocate their own resources. Perhaps not coincidentally, it elevates the scholars and artists who analyze the use of language and images in society, because only they can unmask the ways in which such media mislead and corrupt. {199}
This chapter is about the assumptions about cognition -- in particular, concepts, words, and images -- that underlie recent relativistic movements in intellectual life. The best way to introduce the argument is with examples from the study of perception, our most immediate connection to the world. They immediately show that the question of whether reality is socially constructed or directly available has not been properly framed. Neither alternative is correct.
Relativists have a point when they say that we don't just open our eyes and apprehend reality, as if perception were a window through which the soul gazes at the world. The idea that we just see things as they are is called naïve realism, and it was refuted by skeptical philosophers thousands of years ago with the help of a simple phenomenon: visual illusions. Our visual systems can play tricks on us, and that is enough to prove they are gadgets, not pipelines to the truth. Here are two of my favorites. In Roger Shepard's "Turning the Tables"2 (right), the two parallelograms are identical in size and shape. In Edward Adelson's "Checker Shadow Illusion"3 (below) the light square in the middle of the shadow (B) is the same shade of gray as the dark squares outside the shadow (A):
But just because the world we know is a construct of our brain, that does not mean it is an arbitrary construct -- a phantasm created by expectations or the social context. Our perceptual systems are designed to register aspects of the external world that were important to our survival, like the sizes, shapes, and materials of objects. They need a complex design to accomplish this feat because the retinal image is not a replica of the world. The projection of an object on the retina grows, shrinks, and warps as the object moves around; color and brightness fluctuate as the lighting changes from sun to clouds or from indoor to outdoor light. But somehow the brain solves these maddening problems. It works as if it were reasoning backwards from the retinal image to hypotheses about reality, using {200} geometry, optics, probability theory, and assumptions about the world. Most of the time the system works: people don't usually bump into trees or bite into rocks.
But occasionally the brain is fooled. The ground stretching away from our feet projects an image from the bottom to the center of our visual field. As a result, the brain often interprets down-up in the visual field as near-far in the world, especially when reinforced by other perspective cues such as occluded parts (like the hidden table legs). Objects stretching away from the viewer get foreshortened by projection, and the brain compensates for this, so we tend to see a given distance running up-and-down in the visual field as coming from a longer object than the same distance running left-to-right. And that makes us see the lengths and widths differently in the turned tables. By similar logic, objects in shadow reflect less light onto our retinas than objects in full illumination. Our brains compensate, making us see a given shade of gray as lighter when it is in shadow than when it is in sunshine. In each case we may see the lines and patches on the page incorrectly, but that is only because our visual systems are working very hard to see them as coming from a real world. Like a policeman framing a suspect, Shepard and Adelson have planted evidence that would lead a rational but unsuspecting observer to an incorrect conclusion. If we were in a world of ordinary 3-D objects that had projected those images onto our retinas, our perceptual experience would be accurate. Adelson explains: "As with many so-called illusions, this effect really demonstrates the success rather than the failure of the visual system. The visual system is not very good at being a physical light meter, but that is not its purpose. The important task is to break the image information down into meaningful components, and thereby perceive the nature of the objects in view. "4
It's not that expectations from past experience are irrelevant to perception. But their influence is to make our perceptual systems more accurate, not more arbitrary. In the two words below, we perceive the same shape as an "H" in the first word and as an "A" in the second:5
We see the shapes that way because experience tells us -- correctly -- that the odds are high that there really is an "H" in the middle of the first word and an "A" in the middle of the second, even if that is not true in an atypical case. The mechanisms of perception go to a lot of trouble to ensure that what we see corresponds to what is usually out there.
So the demonstrations that refute naïve realism most decisively also refute the idea that the mind is disconnected from reality. There is a third alternative: {201} that the brain evolved fallible yet intelligent mechanisms that work to keep us in touch with aspects of reality that were relevant to the survival and reproduction of our ancestors. And that is true not just of our perceptual faculties but of our cognitive faculties. The fact that our cognitive faculties (like our perceptual faculties) are attuned to the real world is most obvious from their response to illusions: they recognize the possibility of a breach with reality and find a way to get at the truth behind the false impression. When we see an oar that appears to be severed at the water's surface, we know how to tell whether it really is severed or just looks that way: we can palpate the oar, slide a straight object along it, or pull on it to see if the submerged part gets left behind. The concept of truth and reality behind such tests appears to be universal. People in all cultures distinguish truth from falsity and inner mental life from overt reality, and try to deduce the presence of unobservable objects from the perceptible clues they leave behind. 6
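To make the "reasoning backwards" idea concrete, here is a toy calculation, not from the original text: treat the ambiguous middle character as evidence that is equally consistent with "H" and "A", let the surrounding letters supply a prior, and combine the two with Bayes' rule. All the probabilities are invented for illustration.

```python
# Illustrative sketch only: a Bayesian reading of the ambiguous-letter demonstration.
# All probabilities are invented; nothing here is taken from the book.

def prob_h(prior_h, likelihood_h, likelihood_a):
    """P(the character is an H | its shape and the surrounding word), by Bayes' rule."""
    evidence_h = prior_h * likelihood_h
    evidence_a = (1 - prior_h) * likelihood_a
    return evidence_h / (evidence_h + evidence_a)

# On its own, the ambiguous shape looks equally like an H or an A.
likelihood_h = 0.5
likelihood_a = 0.5

# Context "T_E": in English, "THE" is vastly more probable than "TAE".
print(prob_h(prior_h=0.99, likelihood_h=likelihood_h, likelihood_a=likelihood_a))  # ~0.99: see an H

# Context "C_T": "CAT" is vastly more probable than "CHT".
print(prob_h(prior_h=0.01, likelihood_h=likelihood_h, likelihood_a=likelihood_a))  # ~0.01: see an A
```

The same shape comes out as an "H" in one word and an "A" in the other because the context changes the prior, which is the sense in which experience makes perception more accurate rather than more arbitrary.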
~
Visual perception is the most piquant form of knowledge of the world, but relativists are less concerned with how we see objects than with how we categorize them: how we sort our experiences into conceptual categories like birds, tools, and people. The seemingly innocuous suggestion that the categories of the mind correspond to something in reality became a contentious idea in the twentieth century because some categories -- stereotypes of race, gender, ethnicity, and sexual orientation -- can be harmful when they are used to discriminate or oppress.
The word stereotype originally referred to a kind of printing plate. Its current sense as a pejorative and inaccurate image standing for a category of people was introduced in 1922 by the journalist Walter Lippmann. Lippmann was an important public intellectual who, among other things, helped to found The New Republic, influenced Woodrow Wilson's policies at the end of World War I, and wrote some of the first attacks on IQ testing. In his book Public Opinion, Lippmann fretted about the difficulty of achieving true democracy in an age in which ordinary people could no longer judge public issues rationally because they got their information in what we today call sound bites. As part of this argument, Lippmann proposed that ordinary people's concepts of social groups were stereotypes: mental pictures that are incomplete, biased, insensitive to variation, and resistant to disconfirming information.
Lippmann had an immediate influence on social science (though the subtleties and qualifications of his original argument were forgotten). Psychologists gave people lists of ethnic groups and lists of traits and asked them to pair them up. Sure enough, people linked Jews with "shrewd" and "mercenary," Germans with "efficient" and "nationalistic," Negroes with "superstitious" and "happy-go-lucky," and so on. 7 Such generalizations are pernicious when applied to individuals, and though they are still lamentably common in much of {202} the world, they are now actively avoided by educated people and by mainstream public figures.
By the 1970s, many thinkers were not content to note that stereotypes about categories of people can be inaccurate. They began to insist that the categories themselves don't exist other than in our stereotypes. An effective way to fight racism, sexism, and other kinds of prejudice, in this view, is to deny that conceptual categories about people have any claim to objective reality. It would be impossible to believe that homosexuals are effeminate, blacks superstitious, and women passive if there were no such things as categories of homosexuals, blacks, or women to begin with. For example, the philosopher Richard Rorty has written," 'The homosexual,' 'the Negro,' and 'the female' are best seen not as inevitable classifications of human beings but rather as inventions that have done more harm than good. "8
For that matter, many writers think, why stop there? Better still to insist that all categories are social constructions and therefore figments, because that would really make invidious stereotypes figments. Rorty notes with approval that many thinkers today "go on to suggest that quarks and genes probably are [inventions] too. " Postmodernists and other relativists attack truth and objectivity not so much because they are interested in philosophical problems of ontology and epistemology but because they feel it is the best way to pull the rug out from under racists, sexists, and homophobes. The philosopher Ian Hacking provides a list of almost forty categories that have recently been claimed to be "socially constructed. " The prime examples are race, gender, masculinity, nature, facts, reality, and the past.
But the list has been growing and now includes authorship, AIDS, brotherhood, choice, danger, dementia, illness, Indian forests, inequality, the Landsat satellite system, the medicalized immigrant, the nation-state, quarks, school success, serial homicide, technological systems, white-collar crime, women refugees, and Zulu nationalism. According to Hacking, the common thread is a conviction that the category is not determined by the nature of things and therefore is not inevitable. The further implication is that we would be much better off if it were done away with or radically transformed. 9
This whole enterprise is based on an unstated theory of human concept formation: that conceptual categories bear no systematic relation to things in the world but are socially constructed (and can therefore be reconstructed). Is it a correct theory? In some cases it has a grain of truth. As we saw in Chapter 4, some categories really are social constructions: they exist only because people tacitly agree to act as if they exist. Examples include money, tenure, citizenship, decorations for bravery, and the presidency of the United States. 10 But that does not mean that all conceptual categories are socially constructed. Concept formation has been studied for decades by cognitive psychologists, and they conclude that most concepts pick out categories of objects in the {203} world which had some kind of reality before we ever stopped to think about them. 11
Yes, every snowflake is unique, and no category will do complete justice to every one of its members. But intelligence depends on lumping together things that share properties, so that we are not flabbergasted by every new thing we encounter. As William James wrote, "A polyp would be a conceptual thinker if a feeling of 'Hollo! thingumbob again! ' ever flitted through its mind. " We perceive some traits of a new object, place it in a mental category, and infer that it is likely to have the other traits typical of that category, ones we cannot perceive. If it walks like a duck and quacks like a duck, it probably is a duck. If it's a duck, it's likely to swim, fly, have a back off which water rolls, and contain meat that's tasty when wrapped in a pancake with scallions and hoisin sauce.
This kind of inference works because the world really does contain ducks, which really do share properties. If we lived in a world in which walking quacking objects were no more likely to contain meat than did any other object, the category "duck" would be useless and we probably would not have evolved the ability to form it. If you were to construct a giant spreadsheet in which the rows and columns were traits that people notice and the cells were filled in by objects that possess that combination of traits, the pattern of filled cells would be lumpy. You would find lots of entries at the intersection of the "quacks" row and the "waddles" column but none at the "quacks" row and the "gallops" column. Once you specify the rows and columns, the lumpiness comes from the world, not from society or language. It is no coincidence that the same living things tend to be classified together by the words in European cultures, the words for plant and animal kinds in other cultures (including preliterate cultures), and the Linnaean taxa of professional biologists equipped with calipers, dissecting tools, and DNA sequencers. Ducks, biologists say, are several dozen species in the subfamily Anatinae, each with a distinct anatomy, an ability to interbreed with other members of their species, and a common ancestor in evolutionary history.
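The "giant spreadsheet" can be sketched directly. In the toy example below (the objects and traits are invented for illustration, not taken from the text), a tiny trait-by-object matrix is filled in and pairs of traits are counted; the lumps, and the empty cells, fall out of which creatures actually exist.

```python
# A toy version of the trait-by-object "spreadsheet" described in the text.
# The objects and their traits are invented for illustration.
from collections import Counter
from itertools import combinations

objects = {
    "mallard": {"quacks", "waddles", "swims", "has feathers"},
    "teal":    {"quacks", "waddles", "swims", "has feathers"},
    "goose":   {"honks", "waddles", "swims", "has feathers"},
    "horse":   {"gallops", "neighs", "has hooves"},
    "zebra":   {"gallops", "is striped", "has hooves"},
}

# The columns of the spreadsheet: every trait anyone noticed.
traits = sorted({t for ts in objects.values() for t in ts})
print(traits)

# Count how often each pair of traits shows up in the same object.
co_occurrence = Counter()
for ts in objects.values():
    for pair in combinations(sorted(ts), 2):
        co_occurrence[pair] += 1

print(co_occurrence[("quacks", "waddles")])   # 2 -- the "duck" lump
print(co_occurrence[("gallops", "quacks")])   # 0 -- an empty cell
```

Specifying the rows and columns is our doing; the pattern of filled cells comes from the world.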
Most cognitive psychologists believe that conceptual categories come from two mental processes. 12 One of them notices clumps of entries in the mental spreadsheet and treats them as categories with fuzzy boundaries, prototypical members, and overlapping similarities, like the members of a family. That's why our mental category "duck" can embrace odd ducks that don't match the prototypical duck, such as lame ducks, who cannot swim or fly, Muscovy ducks, which have claws and spurs on their feet, and Donald Duck, who talks and wears clothing. The other mental process looks for crisp rules and definitions and enters them into chains of reasoning. The second system can learn that true ducks molt twice a season and have overlapping scales on their legs and hence that certain birds that look like geese and are called geese really are ducks. Even when people don't know these facts from academic {204} biology, they have a strong intuition that species are defined by an internal essence or hidden trait that lawfully gives rise to its visible features. 13
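A minimal sketch of those two processes, with a made-up prototype and a made-up rule rather than anything taken from the psychological literature: the first system scores graded similarity to a prototypical duck, while the second applies a crisp, definition-like rule and can overrule it.

```python
# Illustrative two-system categorizer: fuzzy prototype similarity versus a crisp rule.
# The feature lists and the "rule" are invented for the example.

DUCK_PROTOTYPE = {"quacks", "swims", "flies", "waddles", "has webbed feet"}

def prototype_similarity(features):
    """System 1: graded overlap with the prototypical duck (fuzzy, family-resemblance)."""
    return len(features & DUCK_PROTOTYPE) / len(DUCK_PROTOTYPE)

def is_true_duck(features):
    """System 2: a crisp rule keyed to hidden, diagnostic traits."""
    return {"molts twice a season", "has overlapping leg scales"} <= features

donald = {"quacks", "swims", "talks", "wears clothing"}
goose_like_bird = {"honks", "swims", "flies",
                   "molts twice a season", "has overlapping leg scales"}

print(prototype_similarity(donald))           # 0.4 -- duck-ish, despite the sailor suit
print(is_true_duck(donald))                   # False -- fails the diagnostic rule
print(prototype_similarity(goose_like_bird))  # 0.4 -- doesn't look very duck-like
print(is_true_duck(goose_like_bird))          # True -- counts as a duck anyway
```

The two systems can disagree, which is the point: the associative system tracks what a duck typically looks like, while the rule-based system tracks what, by its hidden essence, a duck is.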
Anyone who teaches the psychology of categorization has been hit with this question from a puzzled student: "You're telling us that putting things into categories is rational and makes us smart. But we've always been taught that putting people into categories is irrational and makes us sexist and racist. If categorization is so great when we think about ducks and chairs, why is it so terrible when we think about genders and ethnic groups? " As with many ingenuous questions from students, this one uncovers a shortcoming in the literature, not a flaw in their understanding.
The idea that stereotypes are inherently irrational owes more to a condescension toward ordinary people than it does to good psychological research. Many researchers, having shown that stereotypes existed in the minds of their subjects, assumed that the stereotypes had to be irrational, because they were uncomfortable with the possibility that some trait might be statistically true of some group. They never actually checked. That began to change in the 1980s, and now a fair amount is known about the accuracy of stereotypes. 14
With some important exceptions, stereotypes are in fact not inaccurate when assessed against objective benchmarks such as census figures or the reports of the stereotyped people themselves. People who believe that African Americans are more likely to be on welfare than whites, that Jews have higher average incomes than WASPs, that business students are more conservative than students in the arts, that women are more likely than men to want to lose weight, and that men are more likely than women to swat a fly with their bare hands, are not being irrational or bigoted. Those beliefs are correct. People's stereotypes are generally consistent with the statistics, and in many cases their bias is to underestimate the real differences between sexes or ethnic groups. 15 This does not mean that the stereotyped traits are unchangeable, of course, or that people think they are unchangeable, only that people perceive the traits fairly accurately at the time.
Moreover, even when people believe that ethnic groups have characteristic traits, they are never mindless stereotypers who literally believe that each and every member of the group possesses those traits. People may think that Germans are, on average, more efficient than non-Germans, but no one believes that every last German is more efficient than every non-German. 16 And people have no trouble overriding a stereotype when they have good information about an individual. Contrary to a common accusation, teachers' impressions of their individual pupils are not contaminated by their stereotypes of race, gender, or socioeconomic status. The teachers' impressions accurately reflect the pupil's performance as measured by objective tests. 17
Now for the important exceptions. Stereotypes can be downright inaccurate when a person has few or no firsthand encounters with the stereotyped {205} group, or belongs to a group that is overtly hostile to the one being judged. During World War II, when the Russians were allies of the United States and the Germans were enemies, Americans judged Russians to have more positive traits than Germans. Soon afterward, when the alliances reversed, Americans judged Germans to have more positive traits than Russians. 18
Also, people's ability to set aside stereotypes when judging an individual is accomplished by their conscious, deliberate reasoning. When people are distracted or put under pressure to respond quickly, they are more likely to judge that a member of an ethnic group has all the stereotyped traits of the group. 19 This comes from the two-part design of the human categorization system mentioned earlier. Our network of fuzzy associations naturally reverts to a stereotype when we first encounter an individual. But our rule-based categorizer can block out those associations and make deductions based on the relevant facts about that individual. It can do so either for practical reasons, when information about a group-wide average is less diagnostic than information about the individual, or for social and moral reasons, out of respect for the imperative that one ought to ignore certain group-wide averages when judging an individual.
The upshot of this research is not that stereotypes are always accurate but that they are not always false, or even usually false. This is just what we would expect if human categorization -- like the rest of the mind -- is an adaptation that keeps track of aspects of the world that are relevant to our long-term well-being. As the social psychologist Roger Brown pointed out, the main difference between categories of people and categories of other things is that when you use a prototypical exemplar to stand for a category of things, no one takes offense. When Webster's dictionary used a sparrow to stand for all birds, "emus and ostriches and penguins and eagles did not go on the attack. " But just imagine what would have happened if Webster's had used a picture of a soccer mom to illustrate woman and a picture of a business executive to illustrate man. Brown remarks, "Of course, people would be right to take offense since a prototype can never represent the variation that exists in natural categories. It's just that birds don't care but people do. "20
What are the implications of the fact that many stereotypes are statistically accurate? One is that contemporary scientific research on sex differences cannot be dismissed just because some of the findings are consistent with traditional stereotypes of men and women. Some parts of those stereotypes may be false, but the mere fact that they are stereotypes does not prove that they are false in every respect.
The partial accuracy of many stereotypes does not, of course, mean that racism, sexism, and ethnic prejudice are acceptable. Quite apart from the democratic principle that in the public sphere people should be treated as individuals, there are good reasons to be concerned about stereotypes. {206} Stereotypes based on hostile depictions rather than on firsthand experience are bound to be inaccurate. And some stereotypes are accurate only because of self-fulfilling prophecies. Forty years ago it may have been factually correct that few women and African Americans were qualified to be chief executives or presidential candidates. But that was only because of barriers that prevented them from attaining those qualifications, such as university policies that refused them admission out of a belief that they were not qualified. The institutional barriers had to be dismantled before the facts could change. The good news is that when the facts do change, people's stereotypes can change with them.
What about policies that go farther and actively compensate for prejudicial stereotypes, such as quotas and preferences that favor underrepresented groups? Some defenders of these policies assume that gatekeepers are incurably afflicted with baseless prejudices, and that quotas must be kept in place forever to neutralize their effects. The research on stereotype accuracy refutes that argument. Nonetheless, the research might support a different argument for preferences and other gender- and color-sensitive policies. Stereotypes, even when they are accurate, might be self-fulfilling, and not just in the obvious case of institutionalized barriers like those that kept women and
African Americans out of universities and professions. Many people have heard of the Pygmalion effect, in which people perform as other people (such as teachers) expect them to perform. As it happens, the Pygmalion effect appears to be small or nonexistent, but there are more subtle forms of self-fulfilling prophecies. 21 If subjective decisions about people, such as admissions, hiring, credit, and salaries, are based in part on group-wide averages, they will conspire to make the rich richer and the poor poorer. Women are marginalized in academia, making them genuinely less influential, which increases their marginalization. African Americans are treated as poorer credit risks and denied credit, which makes them less likely to succeed, which makes them poorer credit risks. Race- and gender-sensitive policies, according to arguments by the psychologist Virginia Valian, the economist Glenn Loury, and the philosopher James Flynn, may be needed to break the vicious cycle. 22
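The vicious cycle lends itself to a small simulation. In the sketch below every number is invented; the only point is the dynamics: when a gatekeeper's decisions depend partly on a group-wide average, two groups with identical individual merit drift apart from a small initial difference.

```python
# Toy simulation of a self-fulfilling stereotype. All parameters are invented.

def run(initial_success, rounds=10, weight_on_group_average=0.3, feedback=0.1):
    """Each round, the decision blends individual merit with the group's current
    average success; the decision then feeds back into the group's success."""
    success = initial_success
    for _ in range(rounds):
        individual_merit = 0.5  # identical for both groups by construction
        decision = ((1 - weight_on_group_average) * individual_merit
                    + weight_on_group_average * success)
        success = min(1.0, max(0.0, success + feedback * (decision - 0.5)))
    return round(success, 3)

print(run(initial_success=0.55))  # the slightly favored group drifts upward
print(run(initial_success=0.45))  # the slightly disfavored group drifts downward
```

Nothing in the loop requires malice or inaccuracy; using the (accurate) group average is enough to widen the gap.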
Pushing in the other direction is the finding that stereotypes are least accurate when they pertain to a coalition that is pitted against one's own in hostile competition. This should make us nervous about identity politics, in which public institutions identify their members in terms of their race, gender, and ethnic group and weigh every policy by how it favors one group over another. In many universities, for example, minority students are earmarked for special orientation sessions and encouraged to view their entire academic experience through the lens of their group and how it has been victimized. By implicitly pitting one group against another, such policies may cause each group to brew stereotypes about the other that are more pejorative than the {207} ones they would develop in personal encounters. As with other policy issues I examine in this book, the data from the lab do not offer a thumbs-up or thumbs-down verdict on race- and gender-conscious policies. But by highlighting the features of our psychology that different policies engage, the findings can make the tradeoffs clearer and the debates better informed.
~
Of all the faculties that go into the piece of work called man, language may be the most awe-inspiring. "Remember that you are a human being with a soul and the divine gift of articulate speech," Henry Higgins implored Eliza Doolittle. Galileo's alter ego, humbled by the arts and inventions of his day, commented on language in its written form:
But surpassing all stupendous inventions, what sublimity of mind was his who dreamed of finding means to communicate his deepest thoughts to any other person, though distant by mighty intervals of place and time! Of talking with those who are in India; of speaking to those who are not yet born and will not be born for a thousand or ten thousand years; and with what facility, by the different arrangements of twenty characters upon a page! 23
But a funny thing happened to language in intellectual life. Rather than being appreciated for its ability to communicate thought, it was condemned for its power to constrain thought. Famous quotations from two philosophers capture the anxiety. "We have to cease to think if we refuse to do it in the prisonhouse of language," wrote Friedrich Nietzsche. "The limits of my language mean the limits of my world," wrote Ludwig Wittgenstein. How could language exert this stranglehold? It would if words and phrases were the medium of thought itself, an idea that falls naturally out of the Blank Slate. If there is nothing in the intellect that was not first in the senses, then words picked up by the ears are the obvious source of any abstract thought that cannot be reduced to sights, smells, or other sounds. Watson tried to explain thinking as microscopic movements of the mouth and throat; Skinner hoped his 1957 book Verbal Behavior, which explained language as a repertoire of rewarded responses, would bridge the gap between pigeons and people.
The other social sciences also tended to equate language with thought. Boas's student Edward Sapir called attention to differences in how languages carve up the world into categories, and Sapir's student Benjamin Whorf stretched those observations into the famous Linguistic Determinism hypothesis: "We cut nature up, organize it into concepts, and ascribe significances as we do, largely because we are parties to an agreement to organize it in this way -- an agreement that holds throughout our speech community and is codified in the patterns of our language. The agreement is, of course, an implicit {208} and unstated one, but its terms are absolutely obligatory! "24 More recently, the anthropologist Clifford Geertz wrote that "thinking consists not of 'happenings in the head' (though happenings there and elsewhere are necessary for it to occur) but of a traffic in what have been called . . . significant symbols -- words for the most part. "25
As with so many ideas in social science, the centrality of language is taken to extremes in deconstructionism, postmodernism, and other relativist doctrines. The writings of oracles like Jacques Derrida are studded with such aphorisms as "No escape from language is possible," "Text is self-referential," "Language is power," and "There is nothing outside the text. " Similarly, J. Hillis Miller wrote that "language is not an instrument or tool in man's hands, a submissive means of thinking. Language rather thinks man and his 'world'. . . if he will allow it to do so. "26 The prize for the most extreme statement must go to Roland Barthes, who declared, "Man does not exist prior to
language, either as a species or as an individual. "27
The ancestry of these ideas is said to be from linguistics, though most linguists believe that deconstructionists have gone off the deep end. The original observation was that many words are defined in part by their relationship to other words. For example, he is defined by its contrast with I, you, they, and she, and big makes sense only as the opposite of little. And if you look up words in a dictionary, they are defined by other words, which are defined by still other words, until the circle is completed when you get back to a definition containing the original word. Therefore, say the deconstructionists, language is a self-contained system in which words have no necessary connection to reality. And since language is an arbitrary instrument, not a medium for communicating thoughts or describing reality, the powerful can use it to manipulate and oppress others. This leads in turn to an agitation for linguistic reforms: neologisms like co or na that would serve as gender-neutral pronouns, a succession of new terms for racial minorities, and a rejection of standards of clarity in criticism and scholarship (for if language is no longer a window onto thought but the very stuff of thought, the metaphor of "clarity" no longer applies).
Like all conspiracy theories, the idea that language is a prisonhouse denigrates its subject by overestimating its power. Language is the magnificent faculty that we use to get thoughts from one head to another, and we can co-opt it in many ways to help our thoughts along. But it is not the same as thought, not the only thing that separates humans from other animals, not the basis of all culture, and not an inescapable prisonhouse, an obligatory agreement, the limits of our world, or the determiner of what is imaginable. 28
We have seen that perception and categorization provide us with concepts that keep us in touch with the world. Language extends that lifeline by connecting the concepts to words. Children hear noises coming out of a family member's mouth, use their intuitive psychology and their grasp of the context {209} to infer what the speaker is trying to say, and mentally link the words to the concepts and the grammatical rules to the relationships among them. Bowser upends a chair, Sister yells, "The dog knocked over the chair! " and Junior deduces that dog means dog, chair means chair, and the subject of the verb knock over is the agent doing the knocking over. 29 Now Junior can talk about other dogs, other chairs, and other knockings over. There is nothing self-referential or imprisoning about it. As the novelist Walker Percy quipped, a deconstructionist is an academic who claims that texts have no referents and then leaves a message on his wife's answering machine asking her to order a pepperoni pizza for dinner.
Language surely does affect our thoughts, rather than just labeling them for the sake of labeling them. Most obviously, language is the conduit through which people share their thoughts and intentions and thereby acquire the knowledge, customs, and values of those around them. In the song "Christmas" from their rock opera Tommy, The Who described the plight of a boy without language: "Tommy doesn't know what day it is; he doesn't know who Jesus was or what prayin' is. "
Language can allow us to share thoughts not just directly, by its literal content, but also indirectly, via metaphors and metonyms that nudge listeners into grasping connections they may not have noticed before. For example, many expressions treat time as if it were a valuable resource, such as waste time, spend time, valuable time, and time is money. 30 Presumably on the first occasion a person used one of these expressions, her audience wondered why she was using a word for money to refer to time; after all, you can't literally spend time the way you spend pieces of gold. Then, by assuming that the speaker was not gibbering, they figured out the ways in which time indeed has something in common with money, and assumed that that was what the speaker intended to convey. Note that even in this clear example of language affecting thought, language is not the same thing as thought. The original coiner of the metaphor had to see the analogy without the benefit of the English expressions, and the first listeners had to make sense of it using a chain of ineffable thoughts about the typical intentions of speakers and the properties shared by time and money.
Aside from its use as a medium of communication, language can be pressed into service as one of the media used by the brain for storing and manipulating information. 31 The leading theory of human working memory, from the psychologist Alan Baddeley, captures the idea nicely. 32 The mind makes use of a "phonological loop": a silent articulation of words or numbers that persists for a few seconds and can be sensed by the mind's ear. The loop acts as a "slave system" at the service of a "central executive. " By describing things to ourselves using snatches of language, we can temporarily store the result of a mental computation or retrieve chunks of data stored as verbal expressions. Mental arithmetic involving large numbers, for example, may be {210} carried out by retrieving verbal formulas such as "Seven times eight is fifty-six. "33 But as the technical terms of the theory make clear, language is serving as a slave of an executive, not as the medium of all thought.
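As a loose illustration only (the capacity limit, the stored verbal fact, and the class names below are invented, not part of Baddeley's model), the division of labor might be sketched like this: a short-lived verbal buffer holds a phrase while an executive process retrieves and uses it.

```python
# Loose illustration of a "slave" verbal buffer serving an executive process.
# The capacity limit and the stored verbal fact are invented for the example.

class PhonologicalLoop:
    """Holds a few verbal items briefly; it stores phrases, it does not think."""
    def __init__(self, capacity=4):
        self.capacity = capacity
        self.items = []

    def rehearse(self, phrase):
        self.items.append(phrase)
        self.items = self.items[-self.capacity:]  # older items fade away

VERBAL_FACTS = {(7, 8): "seven times eight is fifty-six"}
NUMBER_WORDS = {"fifty-six": 56}

def multiply_via_verbal_fact(a, b, loop):
    """The executive retrieves a memorized verbal formula, parks it in the loop,
    and reads the answer off the final word: language as scratchpad, not as thought."""
    phrase = VERBAL_FACTS[(a, b)]
    loop.rehearse(phrase)
    return NUMBER_WORDS[phrase.split()[-1]]

loop = PhonologicalLoop()
print(multiply_via_verbal_fact(7, 8, loop))  # 56
print(loop.items)                            # the rehearsed phrase, soon to decay
```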
Why do virtually all cognitive scientists and linguists believe that language is not a prisonhouse of thought? 34 First, many experiments have plumbed the minds of creatures without language, such as infants and nonhuman primates, and have found the fundamental categories of thought working away: objects, space, cause and effect, number, probability, agency (the initiation of behavior by a person or animal), and the functions of tools. 35
Second, our vast storehouse of knowledge is certainly not couched in the words and sentences in which we learned the individual facts. What did you read in the page before this one? I would like to think that you can give a reasonably accurate answer to the question. Now try to write down the exact words you read in those pages. Chances are you cannot recall a single sentence verbatim, probably not even a single phrase. What you remembered is the gist of those passages -- their content, meaning, or sense -- not the language itself. Many experiments on human memory have confirmed that what we remember over the long term is the content, not the wording, of stories and conversations. Cognitive scientists model this "semantic memory" as a web of logical propositions, images, motor programs, strings of sounds, and other data structures connected to one another in the brain. 36
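As a rough sketch of "gist, not wording" (the data structure and the stand-in lookup table are invented for illustration; this is not a model from the literature), two different wordings of the same event reduce to a single stored proposition:

```python
# Sketch: storing the gist of an utterance as a proposition, not its wording.
# "Comprehension" here is a stand-in lookup table, invented for illustration.
from dataclasses import dataclass

@dataclass(frozen=True)
class Proposition:
    predicate: str
    agent: str
    patient: str

def gist(sentence):
    """Pretend comprehension: map different wordings onto the same proposition."""
    table = {
        "The dog knocked over the chair!": Proposition("knock-over", "dog", "chair"),
        "The chair was knocked over by the dog.": Proposition("knock-over", "dog", "chair"),
    }
    return table[sentence]

semantic_memory = set()
semantic_memory.add(gist("The dog knocked over the chair!"))
semantic_memory.add(gist("The chair was knocked over by the dog."))

print(len(semantic_memory))  # 1: two wordings, one remembered fact
```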
A third way to put language in its place is to think about how we use it. Writing and speaking do not consist of transcribing an interior monologue onto paper or playing it into a microphone. Rather, we engage in a constant give-and-take between the thoughts we try to convey and the means our language offers to convey them. We often grope for words, are dissatisfied with what we write because it does not express what we wanted to say, or discover when every combination of words seems wrong that we do not really know what we want to say. And when we get frustrated by a mismatch between our language and our thoughts, we don't give up, defeated and mum, but change the language. We concoct neologisms (quark, meme, clone, deep structure), invent slang (to spam, to diss, to flame, to surf the web, a spin doctor), borrow useful words from other languages (joie de vivre, schlemiel, angst, machismo), or coin new metaphors (waste time, vote with your feet, push the outside of the envelope). That is why every language, far from being an immutable penitentiary, is constantly under renovation. Despite the lamentations of language lovers and the coercion of tongue troopers, languages change unstoppably as people need to talk about new things or convey new attitudes. 37
Finally, language itself could not function if it did not sit atop a vast infrastructure of tacit knowledge about the world and about the intentions of other people. When we understand language, we have to listen between the lines to winnow out the unintended readings of an ambiguous sentence, piece {211} together fractured utterances, glide over slips of the tongue, and fill in the countless unsaid steps in a complete train of thought. When the shampoo bottle says "Lather, rinse, repeat," we don't spend the rest of our lives in the shower; we infer that it means "repeat once. " And we know how to interpret ambiguous headlines such as "Kids Make Nutritious Snacks," "Prostitutes Appeal to Pope," and "British Left Waffles on Falkland Islands," because we effortlessly apply our background knowledge about the kinds of things that people are likely to convey in newspapers. Indeed, the very existence of ambiguous sentences, in which one string of words expresses two thoughts, proves that thoughts are not the same thing as strings of words.
~
Language often makes the news precisely because it can part company with thoughts and attitudes. In 1998 Bill Clinton exploited the expectations behind ordinary comprehension to mislead prosecutors about his affair with Monica Lewinsky. He used words like alone, sex, and is in senses that were technically defensible but which deviated from charitable guesses about what people ordinarily mean by these terms. For example, he suggested he was not "alone" with Lewinsky, even though they were the only two people in the room, because other people were in the Oval Office complex at the time. He said that he did not have "sex" with her, because they did not engage in intercourse. His words, like all words, are certainly vague at their boundaries. Exactly how far away or hidden must the nearest person be before one is considered alone? At what point in the continuum of bodily contact -- from an accidental brush in an elevator to tantric bliss -- do we say that sex has occurred? Ordinarily we resolve the vagueness by guessing how our conversational partner would interpret words in the context, and we choose our words accordingly. Clinton's ingenuity in manipulating these guesses, and the outrage that erupted when he was forced to explain what he had done, show that people have an acute understanding of the difference between words and the thoughts they are designed to convey.
Language conveys not just literal meanings but also a speaker's attitude. Think of the difference between fat and voluptuous, slender and scrawny, thrifty and stingy, articulate and slick. Racial epithets, which are laced with contempt, are justifiably off-limits among responsible people, because using them conveys the tacit message that contempt for the people referred to by the epithet is acceptable. But the drive to adopt new terms for disadvantaged groups goes much further than this basic sign of respect; it often assumes that words and attitudes are so inseparable that one can reengineer people's attitudes by tinkering with the words. In 1994 the Los Angeles Times adopted a style sheet that banned some 150 words, including birth defect, Canuck, Chinese fire drill, dark continent, divorcee, Dutch treat, handicapped, illegitimate, invalid, man-made, New World, stepchild, and to welsh. The editors assumed that words {212} register in the brain with their literal meanings, so that an invalid is understood as "someone who is not valid" and Dutch treat is understood as a slur on contemporary Netherlanders. (In fact, it is one of many idioms in which Dutch means "ersatz," such as Dutch oven, Dutch door, Dutch uncle, Dutch courage, and Dutch auction, the remnants of a long-forgotten rivalry between the English and the Dutch.)
But even the more reasonable attempts at linguistic reform are based on a dubious theory of linguistic determinism. Many people are puzzled by the replacement of formerly unexceptionable terms by new ones: Negro by black by African American, Spanish-American by Hispanic by Latino, crippled by handicapped by disabled by challenged, slum by ghetto by inner city by (according to the Times) slum once again. Occasionally the neologisms are defended with some rationale about their meaning. In the 1960s, the word Negro was replaced by the word black, because the parallel between the words black and white was meant to underscore the equality of the races. Similarly, Native American reminds us of who was here first and avoids the geographically inaccurate term Indian. But often the new terms replace ones that were perfectly congenial in their day, as we see in names for old institutions that are obviously sympathetic to the people being named: the United Negro College Fund, the National Association for the Advancement of Colored People, the Shriners Hospitals for Crippled Children. And sometimes a term can be tainted or unfashionable while a minor variant is fine: consider colored people versus people of color, Afro-American versus African American, Negro -- Spanish for "black" -- versus black. If anything, a respect for literal meaning should send us off looking for a new word for the descendants of Europeans, who are neither white nor Caucasian. Something else must be driving the replacement process.
Linguists are familiar with the phenomenon, which may be called the euphemism treadmill. People invent new words for emotionally charged referents, but soon the euphemism becomes tainted by association, and a new word must be found, which soon acquires its own connotations, and so on. Water closet becomes toilet (originally a term for any kind of body care, as in toilet kit and toilet water), which becomes bathroom, which becomes restroom, which becomes lavatory. Undertaker changes to mortician, which changes to funeral director.
? It is a bad idea to say that violence and exploitation are wrong only because people are not naturally inclined to them.
? It is a bad idea to say that people are responsible for their actions only because the causes of those actions are mysterious.
? And it is a bad idea to say that our motives are meaningful in a personal sense only because they are inexplicable in a biological sense. {194}
These are bad ideas because they make our values hostages to fortune, implying that someday factual discoveries could make them obsolete. And they are bad ideas because they conceal the downsides of denying human nature: persecution of the successful, intrusive social engineering, the writing off of suffering in other cultures, an incomprehension of the logic of justice, and the devaluing of human life on earth.
<< {195} >>
KNOW THYSELF
ow that I have attempted to make the very idea of human nature respectable, it is time to say something about what it is and what difference N it makes for our public and private lives. The chapters in Part IV present some current ideas about the design specs of the basic human
faculties. These are not just topics in a psychology curriculum but have implications for many arenas of public discourse. Ideas about the contents of cognition -- concepts, words, and images -- shed light on the roots of prejudice, on the media, and on the arts. Ideas about the capacity for reason can enter into our policies of education and applications of technology. Ideas about social relations are relevant to the family, to sexuality, to social organization, and to crime. Ideas about the moral sense inform the way we evaluate political movements and how we trade off one value against another.
In each of these arenas, people always appeal to some conception of human nature, whether they acknowledge it or not. The problem is that the conceptions are often based on gut feelings, folk theories, and archaic versions of biology. My goal is to make these conceptions explicit, to suggest what is right and wrong about them, and to spell out some of the implications. Ideas about human nature cannot, on their own, resolve perplexing controversies or determine public policy. But without such ideas we are not playing with a full deck and are vulnerable to unnecessary befuddlement. As the biologist Richard Alexander has noted, "Evolution is surely most deterministic for those still unaware of it. "1
<< {197} >> Chapter 12
In Touch with Reality
? ? ? ? ? ? ? ? ? ? ? ? ? ? What a piece of work is a man!
How noble in reason!
How infinite in faculty!
In form, in moving, how express and admirable! In action, how like an angel!
In apprehension, how like a god!
-- William Shakespeare
The starting point for acknowledging human nature is a sheer awe and humility in the face of the staggering complexity of its source, the brain. Organized by the three billion bases of our genome and shaped by hundreds of millions of years of evolution, the brain is a network of unimaginable intricacy: a hundred billion neurons linked by a hundred trillion connections, woven into a convoluted three-dimensional architecture. Humbling, too, is the complexity of what it does. Even the mundane talents we share with other primates -- walking, grasping, recognizing -- are solutions to engineering problems at or beyond the cutting edge of artificial intelligence. The talents that are human birthrights -- speaking and understanding, using common sense, teaching children, inferring other people's motives -- will probably not be duplicated by machines in our lifetime, if ever. All this should serve as a counterweight to the image of the mind as formless raw material and to people as insignificant atoms making up the complex being we call "society. "
The human brain equips us to thrive in a world of objects, living things, and other people. Those entities have a large impact on our well-being, and one would expect the brain to be well suited to detecting them and their powers. Failing to recognize a steep precipice or a hungry panther or a jealous spouse can have significant negative consequences for biological fitness, to put it mildly. The fantastic complexity of the brain is there in part to register consequential facts about the world around us. {198}
But this truism has been rejected by many sectors of modern intellectual life. According to the relativistic wisdom prevailing in much of academia today, reality is socially constructed by the use of language, stereotypes, and media images. The idea that people have access to facts about the world is nai? ve, say the proponents of social constructionism, science studies, cultural studies, critical theory, postmodernism, and deconstructionism. In their view, observations are always infected by theories, and theories are saturated with ideology and political doctrines, so anyone who claims to have the facts or know the truth is just trying to exert power over everyone else.
Relativism is entwined with the doctrine of the Blank Slate in two ways. One is that relativists have a penny-pinching theory of psychology in which the mind has no mechanisms designed to grasp reality; all it can do is passively download words, images, and stereotypes from the surrounding culture. The other is the relativists' attitude toward science. Most scientists regard their work as an extension of our everyday ability to figure out what is out there and how things work. Telescopes and microscopes amplify the visual system; theories formalize our hunches about cause and effect; experiments refine our drive to gather evidence about events we cannot witness directly. Relativist movements agree that science is perception and cognition writ large, but they draw the opposite conclusion: that scientists, like laypeople, are unequipped to grasp an objective reality. Instead, their advocates say, "Western science is only one way of describing reality, nature, and the way things work -- a very effective way, certainly, for the production of goods and profits, but unsatisfactory in most other respects. It is an imperialist arrogance which ignores the sciences and insights of most other cultures and times. "1 Nowhere is this more significant than in the scientific study of politically charged topics such as race, gender, violence, and social organization. Appealing to "facts" or "the truth" in connection with these topics is just a ruse, the relativists say, because there is no "truth" in the sense of an objective yardstick independent of cultural and political presuppositions.
Skepticism about the soundness of people's mental faculties also determines whether one should respect ordinary people's tastes and opinions (even those we don't much like) or treat the people as dupes of an insidious commercial culture. According to relativist doctrines like "false consciousness," "inauthentic preferences," and "interiorized authority," people may be mistaken about their own desires. If so, it would undermine the assumptions behind democracy, which gives ultimate authority to the preferences of the majority of a population, and the assumptions behind market economies, which treat people as the best judges of how they should allocate their own resources. Perhaps not coincidentally, it elevates the scholars and artists who analyze the use of language and images in society, because only they can unmask the ways in which such media mislead and corrupt. {199}
This chapter is about the assumptions about cognition -- in particular, concepts, words, and images -- that underlie recent relativistic movements in intellectual life. The best way to introduce the argument is with examples from the study of perception, our most immediate connection to the world. They immediately show that the question of whether reality is socially constructed or directly available has not been properly framed. Neither alternative is correct.
Relativists have a point when they say that we don't just open our eyes and apprehend reality, as if perception were a window through which the soul gazes at the world. The idea that we just see things as they are is called naïve realism, and it was refuted by skeptical philosophers thousands of years ago with the help of a simple phenomenon: visual illusions. Our visual systems can play tricks on us, and that is enough to prove they are gadgets, not pipelines to the truth. Here are two of my favorites. In Roger Shepard's "Turning the Tables,"2 the two parallelograms are identical in size and shape. In Edward Adelson's "Checker Shadow Illusion,"3 the light square in the middle of the shadow (B) is the same shade of gray as the dark squares outside the shadow (A).
But just because the world we know is a construct of our brain, that does not mean it is an arbitrary construct -- a phantasm created by expectations or the social context. Our perceptual systems are designed to register aspects of the external world that were important to our survival, like the sizes, shapes, and materials of objects. They need a complex design to accomplish this feat because the retinal image is not a replica of the world. The projection of an object on the retina grows, shrinks, and warps as the object moves around; color and brightness fluctuate as the lighting changes from sun to clouds or from indoor to outdoor light. But somehow the brain solves these maddening problems. It works as if it were reasoning backwards from the retinal image to hypotheses about reality, using {200} geometry, optics, probability theory, and assumptions about the world. Most of the time the system works: people don't usually bump into trees or bite into rocks.
But occasionally the brain is fooled. The ground stretching away from our feet projects an image from the bottom to the center of our visual field. As a result, the brain often interprets down-up in the visual field as near-far in the world, especially when reinforced by other perspective cues such as occluded parts (like the hidden table legs). Objects stretching away from the viewer get foreshortened by projection, and the brain compensates for this, so we tend to see a given distance running up-and-down in the visual field as coming from a longer object than the same distance running left-to-right. And that makes us see the lengths and widths differently in the turned tables. By similar logic, objects in shadow reflect less light onto our retinas than objects in full illumination. Our brains compensate, making us see a given shade of gray as lighter when it is in shadow than when it is in sunshine. In each case we may see the lines and patches on the page incorrectly, but that is only because our visual systems are working very hard to see them as coming from a real world. Like a policeman framing a suspect, Shepard and Adelson have planted evidence that would lead a rational but unsuspecting observer to an incorrect conclusion. If we were in a world of ordinary 3-D objects that had projected those images onto our retinas, our perceptual experience would be accurate. Adelson explains: "As with many so-called illusions, this effect really demonstrates the success rather than the failure of the visual system. The visual system is not very good at being a physical light meter, but that is not its purpose. The important task is to break the image information down into meaningful components, and thereby perceive the nature of the objects in view. "4
It's not that expectations from past experience are irrelevant to perception. But their influence is to make our perceptual systems more accurate, not more arbitrary. In the two words below, we perceive the same shape as an "H" in the first word and as an "A" in the second:5
We see the shapes that way because experience tells us -- correctly -- that the odds are high that there really is an "H" in the middle of the first word and an "A" in the middle of the second, even if that is not true in an atypical case. The mechanisms of perception go to a lot of trouble to ensure that what we see corresponds to what is usually out there.
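To see the kind of inference at work in miniature, here is a toy sketch (my own illustration, not from the text, with made-up example words standing in for the figure that accompanies this passage in the original): an ambiguous shape, halfway between "H" and "A", is read as whichever letter turns its context into a familiar word.

```python
# Toy illustration (invented example): an ambiguous letter shape is resolved
# by whichever reading makes the surrounding context a familiar word.
KNOWN_WORDS = {"THE", "CAT", "HAT", "DOG", "CHAIR"}

def read_ambiguous(context, candidates=("H", "A")):
    """context is a word with '?' where the ambiguous shape sits."""
    for letter in candidates:
        if context.replace("?", letter) in KNOWN_WORDS:
            return letter
    return None  # no familiar word fits; perception stays uncertain

print(read_ambiguous("T?E"))  # 'H': the shape is seen as an H in one context
print(read_ambiguous("C?T"))  # 'A': the very same shape is seen as an A in another
```

The sketch is only a cartoon, but it captures the point: the expectation makes the system more accurate about what is usually out there, not more arbitrary.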
So the demonstrations that refute naïve realism most decisively also refute the idea that the mind is disconnected from reality. There is a third alternative: {201} that the brain evolved fallible yet intelligent mechanisms that work to keep us in touch with aspects of reality that were relevant to the survival and reproduction of our ancestors. And that is true not just of our perceptual faculties but of our cognitive faculties. The fact that our cognitive faculties (like our perceptual faculties) are attuned to the real world is most obvious from their response to illusions: they recognize the possibility of a breach with reality and find a way to get at the truth behind the false impression. When we see an oar that appears to be severed at the water's surface, we know how to tell whether it really is severed or just looks that way: we can palpate the oar, slide a straight object along it, or pull on it to see if the submerged part gets left behind. The concept of truth and reality behind such tests appears to be universal. People in all cultures distinguish truth from falsity and inner mental life from overt reality, and try to deduce the presence of unobservable objects from the perceptible clues they leave behind. 6
~
Visual perception is the most piquant form of knowledge of the world, but relativists are less concerned with how we see objects than with how we categorize them: how we sort our experiences into conceptual categories like birds, tools, and people. The seemingly innocuous suggestion that the categories of the mind correspond to something in reality became a contentious idea in the twentieth century because some categories -- stereotypes of race, gender, ethnicity, and sexual orientation -- can be harmful when they are used to discriminate or oppress.
The word stereotype originally referred to a kind of printing plate. Its current sense as a pejorative and inaccurate image standing for a category of people was introduced in 1922 by the journalist Walter Lippmann. Lippmann was an important public intellectual who, among other things, helped to found The New Republic, influenced Woodrow Wilson's policies at the end of World War I, and wrote some of the first attacks on IQ testing. In his book Public Opinion, Lippmann fretted about the difficulty of achieving true democracy in an age in which ordinary people could no longer judge public issues rationally because they got their information in what we today call sound bites. As part of this argument, Lippmann proposed that ordinary people's concepts of social groups were stereotypes: mental pictures that are incomplete, biased, insensitive to variation, and resistant to disconfirming information.
Lippmann had an immediate influence on social science (though the subtleties and qualifications of his original argument were forgotten). Psychologists gave people lists of ethnic groups and lists of traits and asked them to pair them up. Sure enough, people linked Jews with "shrewd" and "mercenary," Germans with "efficient" and "nationalistic," Negroes with "superstitious" and "happy-go-lucky," and so on. 7 Such generalizations are pernicious when applied to individuals, and though they are still lamentably common in much of {202} the world, they are now actively avoided by educated people and by mainstream public figures.
By the 1970s, many thinkers were not content to note that stereotypes about categories of people can be inaccurate. They began to insist that the categories themselves don't exist other than in our stereotypes. An effective way to fight racism, sexism, and other kinds of prejudice, in this view, is to deny that conceptual categories about people have any claim to objective reality. It would be impossible to believe that homosexuals are effeminate, blacks superstitious, and women passive if there were no such things as categories of homosexuals, blacks, or women to begin with. For example, the philosopher Richard Rorty has written," 'The homosexual,' 'the Negro,' and 'the female' are best seen not as inevitable classifications of human beings but rather as inventions that have done more harm than good. "8
For that matter, many writers think, why stop there? Better still to insist that all categories are social constructions and therefore figments, because that would really make invidious stereotypes figments. Rorty notes with approval that many thinkers today "go on to suggest that quarks and genes probably are [inventions] too. " Postmodernists and other relativists attack truth and objectivity not so much because they are interested in philosophical problems of ontology and epistemology but because they feel it is the best way to pull the rug out from under racists, sexists, and homophobes. The philosopher Ian Hacking provides a list of almost forty categories that have recently been claimed to be "socially constructed. " The prime examples are race, gender, masculinity, nature, facts, reality, and the past.
But the list has been growing and now includes authorship, AIDS, brotherhood, choice, danger, dementia, illness, Indian forests, inequality, the Landsat satellite system, the medicalized immigrant, the nation-state, quarks, school success, serial homicide, technological systems, white-collar crime, women refugees, and Zulu nationalism. According to Hacking, the common thread is a conviction that the category is not determined by the nature of things and therefore is not inevitable. The further implication is that we would be much better off if it were done away with or radically transformed. 9
This whole enterprise is based on an unstated theory of human concept formation: that conceptual categories bear no systematic relation to things in the world but are socially constructed (and can therefore be reconstructed). Is it a correct theory? In some cases it has a grain of truth. As we saw in Chapter 4, some categories really are social constructions: they exist only because people tacitly agree to act as if they exist. Examples include money, tenure, citizenship, decorations for bravery, and the presidency of the United States. 10 But that does not mean that all conceptual categories are socially constructed. Concept formation has been studied for decades by cognitive psychologists, and they conclude that most concepts pick out categories of objects in the {203} world which had some kind of reality before we ever stopped to think about them. 11
Yes, every snowflake is unique, and no category will do complete justice to every one of its members. But intelligence depends on lumping together things that share properties, so that we are not flabbergasted by every new thing we encounter. As William James wrote, "A polyp would be a conceptual thinker if a feeling of 'Hollo! thingumbob again! ' ever flitted through its mind. " We perceive some traits of a new object, place it in a mental category, and infer that it is likely to have the other traits typical of that category, ones we cannot perceive. If it walks like a duck and quacks like a duck, it probably is a duck. If it's a duck, it's likely to swim, fly, have a back off which water rolls, and contain meat that's tasty when wrapped in a pancake with scallions and hoisin sauce.
This kind of inference works because the world really does contain ducks, which really do share properties. If we lived in a world in which walking quacking objects were no more likely to contain meat than did any other object, the category "duck" would be useless and we probably would not have evolved the ability to form it. If you were to construct a giant spreadsheet in which the rows and columns were traits that people notice and the cells were filled in by objects that possess that combination of traits, the pattern of filled cells would be lumpy. You would find lots of entries at the intersection of the "quacks" row and the "waddles" column but none at the "quacks" row and the "gallops" column. Once you specify the rows and columns, the lumpiness comes from the world, not from society or language. It is no coincidence that the same living things tend to be classified together by the words in European cultures, the words for plant and animal kinds in other cultures (including preliterate cultures), and the Linnaean taxa of professional biologists equipped with calipers, dissecting tools, and DNA sequencers. Ducks, biologists say, are several dozen species in the subfamily Anatinae, each with a distinct anatomy, an ability to interbreed with other members of their species, and a common ancestor in evolutionary history.
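To make the spreadsheet image concrete, here is a minimal sketch (my own illustration; the objects and traits are invented toy data, not real taxonomy). It fills in a tiny trait-by-object table and counts how often pairs of traits co-occur; the clumps come from the entries themselves, not from the labels we happened to choose.

```python
# Toy "spreadsheet" of traits by objects (invented data for illustration).
from itertools import combinations

objects = {
    "mallard": {"quacks", "waddles", "swims", "flies"},
    "muscovy": {"quacks", "waddles", "swims"},
    "teal":    {"quacks", "waddles", "swims", "flies"},
    "horse":   {"gallops", "neighs"},
    "zebra":   {"gallops", "neighs"},
}

# Count how often each pair of traits shows up in the same object.
all_traits = sorted(set.union(*objects.values()))
cooccurrence = {pair: 0 for pair in combinations(all_traits, 2)}
for traits in objects.values():
    for pair in combinations(sorted(traits), 2):
        cooccurrence[pair] += 1

print(cooccurrence[("quacks", "waddles")])  # 3: a dense clump of filled cells
print(cooccurrence[("gallops", "quacks")])  # 0: an empty region of the spreadsheet
```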
Most cognitive psychologists believe that conceptual categories come from two mental processes. 12 One of them notices clumps of entries in the mental spreadsheet and treats them as categories with fuzzy boundaries, prototypical members, and overlapping similarities, like the members of a family. That's why our mental category "duck" can embrace odd ducks that don't match the prototypical duck, such as lame ducks, who cannot swim or fly, Muscovy ducks, which have claws and spurs on their feet, and Donald Duck, who talks and wears clothing. The other mental process looks for crisp rules and definitions and enters them into chains of reasoning. The second system can learn that true ducks molt twice a season and have overlapping scales on their legs and hence that certain birds that look like geese and are called geese really are ducks. Even when people don't know these facts from academic {204} biology, they have a strong intuition that species are defined by an internal essence or hidden trait that lawfully gives rise to its visible features. 13
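As a toy sketch of those two processes (my own illustration; the prototype, the threshold, and the "defining" criteria are invented stand-ins rather than facts about ducks), the first function judges by family resemblance and the second by rule:

```python
# Two ways of categorizing, echoing the two processes described above
# (all traits and criteria are invented for illustration).
PROTOTYPE_DUCK = {"quacks", "waddles", "swims", "flies"}

def prototype_match(traits, threshold=0.5):
    """Fuzzy, family-resemblance categorization: overlap with the prototype."""
    return len(traits & PROTOTYPE_DUCK) / len(PROTOTYPE_DUCK) >= threshold

def rule_based_duck(traits):
    """Crisp, definition-like categorization by hidden criteria
    (stand-ins here for molting twice a season and overlapping leg scales)."""
    return {"molts_twice", "overlapping_leg_scales"} <= traits

donald = {"talks", "wears_clothes", "quacks", "waddles"}
goose_lookalike = {"honks", "molts_twice", "overlapping_leg_scales"}

print(prototype_match(donald))           # True: an odd duck, but close enough to the prototype
print(rule_based_duck(goose_lookalike))  # True: looks like a goose, but meets the defining criteria
```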
Anyone who teaches the psychology of categorization has been hit with this question from a puzzled student: "You're telling us that putting things into categories is rational and makes us smart. But we've always been taught that putting people into categories is irrational and makes us sexist and racist. If categorization is so great when we think about ducks and chairs, why is it so terrible when we think about genders and ethnic groups? " As with many ingenuous questions from students, this one uncovers a shortcoming in the literature, not a flaw in their understanding.
The idea that stereotypes are inherently irrational owes more to a condescension toward ordinary people than it does to good psychological research. Many researchers, having shown that stereotypes existed in the minds of their subjects, assumed that the stereotypes had to be irrational, because they were uncomfortable with the possibility that some trait might be statistically true of some group. They never actually checked. That began to change in the 1980s, and now a fair amount is known about the accuracy of stereotypes. 14
With some important exceptions, stereotypes are in fact not inaccurate when assessed against objective benchmarks such as census figures or the reports of the stereotyped people themselves. People who believe that African Americans are more likely to be on welfare than whites, that Jews have higher average incomes than WASPs, that
business students are more conservative than students in the arts, that women are more likely than men to want to lose weight, and that men are more likely than women to swat a fly with their bare hands, are not being irrational or bigoted. Those beliefs are correct. People's stereotypes are generally consistent with the statistics, and in many cases their bias is to underestimate the real differences between sexes or ethnic groups. 15 This does not mean that the stereotyped traits are unchangeable, of course, or that people think they are unchangeable, only that people perceive the traits fairly accurately at the time.
Moreover, even when people believe that ethnic groups have characteristic traits, they are never mindless stereotypers who literally believe that each and every member of the group possesses those traits. People may think that Germans are, on average, more efficient than non-Germans, but no one believes that every last German is more efficient than every non-German. 16 And people have no trouble overriding a stereotype when they have good information about an individual. Contrary to a common accusation, teachers' impressions of their individual pupils are not contaminated by their stereotypes of race, gender, or socioeconomic status. The teachers' impressions accurately reflect the pupil's performance as measured by objective tests. 17
Now for the important exceptions. Stereotypes can be downright inaccurate when a person has few or no firsthand encounters with the stereotyped {205} group, or belongs to a group that is overtly hostile to the one being judged. During World War II, when the Russians were allies of the United States and the Germans were enemies, Americans judged Russians to have more positive traits than Germans. Soon afterward, when the alliances reversed, Americans judged Germans to have more positive traits than Russians. 18
Also, people's ability to set aside stereotypes when judging an individual is accomplished by their conscious, deliberate reasoning. When people are distracted or put under pressure to respond quickly, they are more likely to judge that a member of an ethnic group has all the stereotyped traits of the group. 19 This comes from the two-part design of the human categorization system mentioned earlier. Our network of fuzzy associations naturally reverts to a stereotype when we first encounter an individual. But our rule-based categorizer can block out those associations and make deductions based on the relevant facts about that individual. It can do so either for practical reasons, when information about a group-wide average is less diagnostic than information about the individual, or for social and moral reasons, out of respect for the imperative that one ought to ignore certain group-wide averages when judging an individual.
The upshot of this research is not that stereotypes are always accurate but that they are not always false, or even usually false. This is just what we would expect if human categorization -- like the rest of the mind -- is an adaptation that keeps track of aspects of the world that are relevant to our long-term well-being. As the social psychologist Roger Brown pointed out, the main difference between categories of people and categories of other things is that when you use a prototypical exemplar to stand for a category of things, no one takes offense. When Webster's dictionary used a sparrow to stand for all birds, "emus and ostriches and penguins and eagles did not go on the attack. " But just imagine what would have happened if Webster's had used a picture of a soccer mom to illustrate woman and a picture of a business executive to illustrate man. Brown remarks, "Of course, people would be right to take offense since a prototype can never represent the variation that exists in natural categories. It's just that birds don't care but people do. "20
What are the implications of the fact that many stereotypes are statistically accurate? One is that contemporary scientific research on sex differences cannot be dismissed just because some of the findings are consistent with traditional stereotypes of men and women. Some parts of those stereotypes may be false, but the mere fact that they are stereotypes does not prove that they are false in every respect.
The partial accuracy of many stereotypes does not, of course, mean that racism, sexism, and ethnic prejudice are acceptable. Quite apart from the democratic principle that in the public sphere people should be treated as individuals, there are good reasons to be concerned about stereotypes. {206} Stereotypes based on hostile depictions rather than on firsthand experience are bound to be inaccurate. And some stereotypes are accurate only because of self-fulfilling prophecies. Forty years ago it may have been factually correct that few women and African Americans were qualified to be chief executives or presidential candidates. But that was only because of barriers that prevented them from attaining those qualifications, such as university policies that refused them admission out of a belief that they were not qualified. The institutional barriers had to be dismantled before the facts could change. The good news is that when the facts do change, people's stereotypes can change with them.
What about policies that go farther and actively compensate for prejudicial stereotypes, such as quotas and preferences that favor underrepresented groups? Some defenders of these policies assume that gatekeepers are incurably afflicted with baseless prejudices, and that quotas must be kept in place forever to neutralize their effects. The research on stereotype accuracy refutes that argument. Nonetheless, the research might support a different argument for preferences and other gender- and color-sensitive policies. Stereotypes, even when they are accurate, might be self-fulfilling, and not just in the obvious case of institutionalized barriers like those that kept women and
African Americans out of universities and professions. Many people have heard of the Pygmalion effect, in which people perform as other people (such as teachers) expect them to perform. As it happens, the Pygmalion effect appears to be small or nonexistent, but there are more subtle forms of self-fulfilling prophecies. 21 If subjective decisions about people, such as admissions, hiring, credit, and salaries, are based in part on group-wide averages, they will conspire to make the rich richer and the poor poorer. Women are marginalized in academia, making them genuinely less influential, which increases their marginalization. African Americans are treated as poorer credit risks and denied credit, which makes them less likely to succeed, which makes them poorer credit risks. Race- and gender-sensitive policies, according to arguments by the psychologist Virginia Valian, the economist Glenn Loury, and the philosopher James Flynn, may be needed to break the vicious cycle. 22
Pushing in the other direction is the finding that stereotypes are least accurate when they pertain to a coalition that is pitted against one's own in hostile competition. This should make us nervous about identity politics, in which public institutions identify their members in terms of their race, gender, and ethnic group and weigh every policy by how it favors one group over another. In many universities, for example, minority students are earmarked for special orientation sessions and encouraged to view their entire academic experience through the lens of their group and how it has been victimized. By implicitly pitting one group against another, such policies may cause each group to brew stereotypes about the other that are more pejorative than the {207} ones they would develop in personal encounters. As with other policy issues I examine in this book, the data from the lab do not offer a thumbs-up or thumbs-down verdict on race- and gender-conscious policies. But by highlighting the features of our psychology that different policies engage, the findings can make the tradeoffs clearer and the debates better informed.
~
Of all the faculties that go into the piece of work called man, language may be the most awe-inspiring. "Remember that you are a human being with a soul and the divine gift of articulate speech," Henry Higgins implored Eliza Doolittle. Galileo's alter ego, humbled by the arts and inventions of his day, commented on language in its written form:
But surpassing all stupendous inventions, what sublimity of mind was his who dreamed of finding means to communicate his deepest thoughts to any other person, though distant by mighty intervals of place and time! Of talking with those who are in India; of speaking to those who are not yet born and will not be born for a thousand or ten thousand years; and with what facility, by the different arrangements of twenty characters upon a page! 23
But a funny thing happened to language in intellectual life. Rather than being appreciated for its ability to communicate thought, it was condemned for its power to constrain thought. Famous quotations from two philosophers capture the anxiety. "We have to cease to think if we refuse to do it in the prisonhouse of language," wrote Friedrich Nietzsche. "The limits of my language mean the limits of my world," wrote Ludwig Wittgenstein. How could language exert this stranglehold? It would if words and phrases were the medium of thought itself, an idea that falls naturally out of the Blank Slate. If there is nothing in the intellect that was not first in the senses, then words picked up by the ears are the obvious source of any abstract thought that cannot be reduced to sights, smells, or other sounds. Watson tried to explain thinking as microscopic movements of the mouth and throat; Skinner hoped his 1957 book Verbal Behavior, which explained language as a repertoire of rewarded responses, would bridge the gap between pigeons and people.
The other social sciences also tended to equate language with thought. Boas's student Edward Sapir called attention to differences in how languages carve up the world into categories, and Sapir's student Benjamin Whorf stretched those observations into the famous Linguistic Determinism hypothesis: "We cut nature up, organize it into concepts, and ascribe significances as we do, largely because we are parties to an agreement to organize it in this way -- an agreement that holds throughout our speech community and is codified in the patterns of our language. The agreement is, of course, an implicit {208} and unstated one, but its terms are absolutely obligatory! "24 More recently, the anthropologist Clifford Geertz wrote that "thinking consists not of 'happenings in the head' (though happenings there and elsewhere are necessary for it to occur) but of a traffic in what have been called . . . significant symbols -- words for the most part. "25
As with so many ideas in social science, the centrality of language is taken to extremes in deconstructionism, postmodernism, and other relativist doctrines. The writings of oracles like Jacques Derrida are studded with such aphorisms as "No escape from language is possible," "Text is self-referential," "Language is power," and "There is nothing outside the text. " Similarly, J. Hillis Miller wrote that "language is not an instrument or tool in man's hands, a submissive means of thinking. Language rather thinks man and his 'world'. . . if he will allow it to do so. "26 The prize for the most extreme statement must go to Roland Barthes, who declared, "Man does not exist prior to
language, either as a species or as an individual. "27
The ancestry of these ideas is said to be from linguistics, though most linguists believe that deconstructionists have gone off the deep end. The original observation was that many words are defined in part by their relationship to other words. For example, he is defined by its contrast with I, you, they, and she, and big makes sense only as the opposite of little. And if you look up words in a dictionary, they are defined by other words, which are defined by still other words, until the circle is completed when you get back to a definition containing the original word. Therefore, say the deconstructionists, language is a self-contained system in which words have no necessary connection to reality. And since language is an arbitrary instrument, not a medium for communicating thoughts or describing reality, the powerful can use it to manipulate and oppress others. This leads in turn to an agitation for linguistic reforms: neologisms like co or na that would serve as gender-neutral pronouns, a succession of new terms for racial minorities, and a rejection of standards of clarity in criticism and scholarship (for if language is no longer a window onto thought but the very stuff of thought, the metaphor of "clarity" no longer applies).
Like all conspiracy theories, the idea that language is a prisonhouse denigrates its subject by overestimating its power. Language is the magnificent faculty that we use to get thoughts from one head to another, and we can co-opt it in many ways to help our thoughts along. But it is not the same as thought, not the only thing that separates humans from other animals, not the basis of all culture, and not an inescapable prisonhouse, an obligatory agreement, the limits of our world, or the determiner of what is imaginable. 28
We have seen that perception and categorization provide us with concepts that keep us in touch with the world. Language extends that lifeline by connecting the concepts to words. Children hear noises coming out of a family member's mouth, use their intuitive psychology and their grasp of the context {209} to infer what the speaker is trying to say, and mentally link the words to the concepts and the grammatical rules to the relationships among them. Bowser upends a chair, Sister yells, "The dog knocked over the chair! " and Junior deduces that dog means dog, chair means chair, and the subject of the verb knock over is the agent doing the knocking over. 29 Now Junior can talk about other dogs, other chairs, and other knockings over. There is nothing self-referential or imprisoning about it. As the novelist Walker Percy quipped, a deconstructionist is an academic who claims that texts have no referents and then leaves a message on his wife's answering machine asking her to order a pepperoni pizza for dinner.
Language surely does affect our thoughts, rather than just labeling them for the sake of labeling them. Most obviously, language is the conduit through which people share their thoughts and intentions and thereby acquire the knowledge, customs, and values of those around them. In the song "Christmas" from their rock opera, The Who described the plight of a boy without language: "Tommy doesn't know what day it is; he doesn't know who Jesus was or what prayin' is. "
Language can allow us to share thoughts not just directly, by its literal content, but also indirectly, via metaphors and metonyms that nudge listeners into grasping connections they may not have noticed before. For example, many expressions treat time as if it were a valuable resource, such as waste time, spend time, valuable time, and time is money. 30 Presumably on the first occasion a person used one of these expressions, her audience wondered why she was using a word for money to refer to time; after all, you can't literally spend time the way you spend pieces of gold. Then, by assuming that the speaker was not gibbering, they figured out the ways in which time indeed has something in common with money, and assumed that that was what the speaker intended to convey. Note that even in this clear example of language affecting thought, language is not the same thing as thought. The original coiner of the metaphor had to see the analogy without the benefit of the English expressions, and the first listeners had to make sense of it using a chain of ineffable thoughts about the typical intentions of speakers and the properties shared by time and money.
Aside from its use as a medium of communication, language can be pressed into service as one of the media used by the brain for storing and manipulating information. 31 The leading theory of human working memory, from the psychologist Alan Baddeley, captures the idea nicely. 32 The mind makes use of a "phonological loop": a silent articulation of words or numbers that persists for a few seconds and can be sensed by the mind's ear. The loop acts as a "slave system" at the service of a "central executive. " By describing things to ourselves using snatches of language, we can temporarily store the result of a mental computation or retrieve chunks of data stored as verbal expressions. Mental arithmetic involving large numbers, for example, may be {210} carried out by retrieving verbal formulas such as "Seven times eight is fifty-six. "33 But as the technical terms of the theory make clear, language is serving as a slave of an executive, not as the medium of all thought.
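Here is a deliberately crude sketch of that division of labor (my own simplification for illustration, not Baddeley's actual model): a small, limited-capacity buffer of verbal snatches that an executive process can write to and read from while it does the real work.

```python
# A simplified "phonological loop": a short verbal buffer in the service of
# a central executive (invented capacity and contents, for illustration only).
from collections import deque

class PhonologicalLoop:
    def __init__(self, capacity=4):
        self.buffer = deque(maxlen=capacity)  # older items fall off when full

    def rehearse(self, phrase):
        self.buffer.append(phrase)

    def recall(self):
        return list(self.buffer)

# The executive parks a retrieved verbal fact and a partial result in the loop.
loop = PhonologicalLoop()
loop.rehearse("seven times eight is fifty-six")
loop.rehearse("carry the five")
print(loop.recall())
# The loop holds snatches of language in the mind's ear, but the computation
# itself belongs to the executive: language as slave system, not as the medium.
```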
Why do virtually all cognitive scientists and linguists believe that language is not a prisonhouse of thought? 34 First, many experiments have plumbed the minds of creatures without language, such as infants and nonhuman primates, and have found the fundamental categories of thought working away: objects, space, cause and effect, number, probability, agency (the initiation of behavior by a person or animal), and the functions of tools. 35
Second, our vast storehouse of knowledge is certainly not couched in the words and sentences in which we learned the individual facts. What did you read in the page before this one? I would like to think that you can give a reasonably accurate answer to the question. Now try to write down the exact words you read in those pages. Chances are you cannot recall a single sentence verbatim, probably not even a single phrase. What you remembered is the gist of those passages -- their content, meaning, or sense -- not the language itself. Many experiments on human memory have confirmed that what we remember over the long term is the content, not the wording, of stories and conversations. Cognitive scientists model this "semantic memory" as a web of logical propositions, images, motor programs, strings of sounds, and other data structures connected to one another in the brain. 36
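One schematic way to picture the contrast (my own illustration, reusing the dog-and-chair sentence from earlier) is to compare what verbatim storage would have to keep with the small web of propositions that a paraphrase maps onto equally well:

```python
# Gist versus wording (schematic illustration; the "propositions" are invented
# stand-ins for whatever format semantic memory actually uses).
original   = "The dog knocked over the chair in the kitchen"
paraphrase = "In the kitchen, the chair was knocked over by the dog"

def gist(_sentence):
    # Stand-in for comprehension: both sentences express the same propositions.
    return {
        ("knock-over", "agent", "dog"),
        ("knock-over", "patient", "chair"),
        ("knock-over", "location", "kitchen"),
    }

print(original == paraphrase)              # False: the wording differs
print(gist(original) == gist(paraphrase))  # True: the content is the same
```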
A third way to put language in its place is to think about how we use it. Writing and speaking do not consist of transcribing an interior monologue onto paper or playing it into a microphone. Rather, we engage in a constant give-and-take between the thoughts we try to convey and the means our language offers to convey them. We often grope for words, are dissatisfied with what we write because it does not express what we wanted to say, or discover when every combination of words seems wrong that we do not really know what we want to say. And when we get frustrated by a mismatch between our language and our thoughts, we don't give up, defeated and mum, but change the language. We concoct neologisms (quark, meme, clone, deep structure), invent slang (to spam, to diss, to flame, to surf the web, a spin doctor), borrow useful words from other languages (joie de vivre, schlemiel, angst, machismo), or coin new metaphors (waste time, vote with your feet, push the outside of the envelope). That is why every language, far from being an immutable penitentiary, is constantly under renovation. Despite the lamentations of language lovers and the coercion of tongue troopers, languages change unstoppably as people need to talk about new things or convey new attitudes. 37
Finally, language itself could not function if it did not sit atop a vast infrastructure of tacit knowledge about the world and about the intentions of other people. When we understand language, we have to listen between the lines to winnow out the unintended readings of an ambiguous sentence, piece {211} together fractured utterances, glide over slips of the tongue, and fill in the countless unsaid steps in a complete train of thought. When the shampoo bottle says "Lather, rinse, repeat," we don't spend the rest of our lives in the shower; we infer that it means "repeat once. " And we know how to interpret ambiguous headlines such as "Kids Make Nutritious Snacks," "Prostitutes Appeal to Pope," and "British Left Waffles on Falkland Islands," because we effortlessly apply our background knowledge about the kinds of things that people are likely to convey in newspapers. Indeed, the very existence of ambiguous sentences, in which one string of words expresses two thoughts, proves that thoughts are not the same thing as strings of words.
~
Language often makes the news precisely because it can part company with thoughts and attitudes. In 1998 Bill Clinton exploited the expectations behind ordinary comprehension to mislead prosecutors about his affair with Monica Lewinsky. He used words like alone, sex, and is in senses that were technically defensible but which deviated from charitable guesses about what people ordinarily mean by these terms. For example, he suggested he was not "alone" with Lewinsky, even though they were the only two people in the room, because other people were in the Oval Office complex at the time. He said that he did not have "sex" with her, because they did not engage in intercourse. His words, like all words, are certainly vague at their boundaries. Exactly how far away or hidden must the nearest person be before one is considered alone? At what point in the continuum of bodily contact -- from an accidental brush in an elevator to tantric bliss -- do we say that sex has occurred? Ordinarily we resolve the vagueness by guessing how our conversational partner would interpret words in the context, and we choose our words accordingly. Clinton's ingenuity in manipulating these guesses, and the outrage that erupted when he was forced to explain what he had done, show that people have an acute understanding of the difference between words and the thoughts they are designed to convey.
Language conveys not just literal meanings but also a speaker's attitude. Think of the difference between fat and voluptuous, slender and scrawny, thrifty and stingy, articulate and slick. Racial epithets, which are laced with contempt, are justifiably off-limits among responsible people, because using them conveys the tacit message that contempt for the people referred to by the epithet is acceptable. But the drive to adopt new terms for disadvantaged groups goes much further than this basic sign of respect; it often assumes that words and attitudes are so inseparable that one can reengineer people's attitudes by tinkering with the words. In 1994 the Los Angeles Times adopted a style sheet that banned some 150 words, including birth defect, Canuck, Chinese fire drill, dark continent, divorcee, Dutch treat, handicapped, illegitimate, invalid, man-made, New World, stepchild, and to welsh. The editors assumed that words {212} register in the brain with their literal meanings, so that an invalid is understood as "someone who is not valid" and Dutch treat is understood as a slur on contemporary Netherlanders. (In fact, it is one of many idioms in which Dutch means "ersatz," such as Dutch oven, Dutch door, Dutch uncle, Dutch courage, and Dutch auction,
the remnants of a long-forgotten rivalry between the English and the Dutch. )
But even the more reasonable attempts at linguistic reform are based on a dubious theory of linguistic determinism. Many people are puzzled by the replacement of formerly unexceptionable terms by new ones: Negro by black by African American, Spanish-American by Hispanic by Latino, crippled by handicapped by disabled by challenged, slum by ghetto by inner city by (according to the Times) slum once again. Occasionally the neologisms are defended with some rationale about their meaning. In the 1960s, the word Negro was replaced by the word black, because the parallel between the words black and white was meant to underscore the equality of the races. Similarly, Native American reminds us of who was here first and avoids the geographically inaccurate term Indian. But often the new terms replace ones that were perfectly congenial in their day, as we see in names for old institutions that are obviously sympathetic to the people being named: the United Negro College Fund, the National Association for the Advancement of Colored People, the Shriners Hospitals for Crippled Children. And sometimes a term can be tainted or unfashionable while a minor variant is fine: consider colored people versus people of color, Afro-American versus African American, Negro -- Spanish for "black" -- versus black. If anything, a respect for literal meaning should send us off looking for a new word for the descendants of Europeans, who are neither white nor Caucasian. Something else must be driving the replacement process.
Linguists are familiar with the phenomenon, which may be called the euphemism treadmill. People invent new words for emotionally charged referents, but soon the euphemism becomes tainted by association, and a new word must be found, which soon acquires its own connotations, and so on. Water closet becomes toilet (originally a term for any kind of body care, as in toilet kit and toilet water), which becomes bathroom, which becomes restroom, which becomes lavatory. Undertaker changes to mortician, which changes to funeral director.
