But the list has been growing and now includes authorship, AIDS, brotherhood, choice, danger, dementia, illness, Indian forests, inequality, the Landsat satellite system, the medicalized immigrant, the nation-state, quarks, school success, serial homicide, technological systems, white-collar crime, women refugees, and Zulu nationalism.
According to Hacking, the common thread is a conviction that the category is not determined by the nature of things and therefore is not inevitable.
The further implication is that we would be much better off if it were done away with or radically transformed. 9
This whole enterprise is based on an unstated theory of human concept formation: that conceptual categories bear no systematic relation to things in the world but are socially constructed (and can therefore be reconstructed). Is it a correct theory? In some cases it has a grain of truth. As we saw in Chapter 4, some categories really are social constructions: they exist only because people tacitly agree to act as if they exist. Examples include money, tenure, citizenship, decorations for bravery, and the presidency of the United States. 10 But that does not mean that all conceptual categories are socially constructed. Concept formation has been studied for decades by cognitive psychologists, and they conclude that most concepts pick out categories of objects in the {203} world which had some kind of reality before we ever stopped to think about them. 11
Yes, every snowflake is unique, and no category will do complete justice to every one of its members. But intelligence depends on lumping together things that share properties, so that we are not flabbergasted by every new thing we encounter. As William James wrote, "A polyp would be a conceptual thinker if a feeling of 'Hollo! thingumbob again! ' ever flitted through its mind. " We perceive some traits of a new object, place it in a mental category, and infer that it is likely to have the other traits typical of that category, ones we cannot perceive. If it walks like a duck and quacks like a duck, it probably is a duck. If it's a duck, it's likely to swim, fly, have a back off which water rolls, and contain meat that's tasty when wrapped in a pancake with scallions and hoisin sauce.
This kind of inference works because the world really does contain ducks, which really do share properties. If we lived in a world in which walking quacking objects were no more likely to contain meat than did any other object, the category "duck" would be useless and we probably would not have evolved the ability to form it. If you were to construct a giant spreadsheet in which the rows and columns were traits that people notice and the cells were filled in by objects that possess that combination of traits, the pattern of filled cells would be lumpy. You would find lots of entries at the intersection of the "quacks" row and the "waddles" column but none at the "quacks" row and the "gallops" column. Once you specify the rows and columns, the lumpiness comes from the world, not from society or language. It is no coincidence that the same living things tend to be classified together by the words in European cultures, the words for plant and animal kinds in other cultures (including preliterate cultures), and the Linnaean taxa of professional biologists equipped with calipers, dissecting tools, and DNA sequencers. Ducks, biologists say, are several dozen species in the subfamily Anatinae, each with a distinct anatomy, an ability to interbreed with other members of their species, and a common ancestor in evolutionary history.
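The spreadsheet analogy can be made concrete with a toy program. In this minimal sketch (the objects, traits, and counts are all invented for illustration), the lumpiness shows up as large co-occurrence counts for some trait pairs and zeros for others:

```python
# Toy version of the trait "spreadsheet": each object is a row, each
# trait a column, and a cell is filled if the object has the trait.
# All objects and traits here are invented for illustration.
from itertools import combinations

objects = {
    "mallard": {"quacks", "waddles", "swims", "flies"},
    "teal":    {"quacks", "waddles", "swims", "flies"},
    "muscovy": {"quacks", "waddles", "swims"},
    "horse":   {"gallops", "neighs"},
    "zebra":   {"gallops", "neighs"},
}

traits = sorted({t for ts in objects.values() for t in ts})

# Count how many objects share each pair of traits.
co_occurrence = {
    (a, b): sum(1 for ts in objects.values() if a in ts and b in ts)
    for a, b in combinations(traits, 2)
}

for pair, n in sorted(co_occurrence.items(), key=lambda kv: -kv[1]):
    print(pair, n)

# The counts are lumpy: ("quacks", "waddles") co-occur in three objects,
# ("gallops", "quacks") in none. The clumps come from the world; the
# labels merely index them.
```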
Most cognitive psychologists believe that conceptual categories come from two mental processes. 12 One of them notices clumps of entries in the mental spreadsheet and treats them as categories with fuzzy boundaries, prototypical members, and overlapping similarities, like the members of a family. That's why our mental category "duck" can embrace odd ducks that don't match the prototypical duck, such as lame ducks, who cannot swim or fly, Muscovy ducks, which have claws and spurs on their feet, and Donald Duck, who talks and wears clothing. The other mental process looks for crisp rules and definitions and enters them into chains of reasoning. The second system can learn that true ducks molt twice a season and have overlapping scales on their legs and hence that certain birds that look like geese and are called geese really are ducks. Even when people don't know these facts from academic {204} biology, they have a strong intuition that species are defined by an internal essence or hidden trait that lawfully gives rise to its visible features. 13
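The two processes can be caricatured in a few lines of code. In this hedged sketch (the prototype, features, rule, and threshold are invented stand-ins, not empirical claims), the first system grades a new item by overlap with a prototype and tolerates odd members, while the second applies a crisp definition that can override appearances:

```python
# Two toy categorizers corresponding to the two processes described
# above. The prototype, rule, and threshold are invented stand-ins.

DUCK_PROTOTYPE = {"quacks", "waddles", "swims", "flies"}

def fuzzy_judgment(features, threshold=0.5):
    """First system: graded similarity to a prototype, tolerant of odd members."""
    overlap = len(features & DUCK_PROTOTYPE) / len(DUCK_PROTOTYPE)
    return overlap >= threshold

def rule_judgment(features):
    """Second system: a crisp definition that ignores surface appearance."""
    return {"molts twice a season", "scaly overlapping legs"} <= features

# A lame duck that can neither swim nor fly still clears the fuzzy bar:
print(fuzzy_judgment({"quacks", "waddles"}))                    # True

# A bird that looks like a goose satisfies the rule and counts as a duck:
print(rule_judgment({"looks like a goose",
                     "molts twice a season",
                     "scaly overlapping legs"}))                # True
```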
Anyone who teaches the psychology of categorization has been hit with this question from a puzzled student: "You're telling us that putting things into categories is rational and makes us smart. But we've always been taught that putting people into categories is irrational and makes us sexist and racist. If categorization is so great when we think about ducks and chairs, why is it so terrible when we think about genders and ethnic groups? " As with many ingenuous questions from students, this one uncovers a shortcoming in the literature, not a flaw in their understanding.
The idea that stereotypes are inherently irrational owes more to a condescension toward ordinary people than it does to good psychological research. Many researchers, having shown that stereotypes existed in the minds of their subjects, assumed that the stereotypes had to be irrational, because they were uncomfortable with the possibility that some trait might be statistically true of some group. They never actually checked. That began to change in the 1980s, and now a fair amount is known about the accuracy of stereotypes. 14
With some important exceptions, stereotypes are in fact not inaccurate when assessed against objective benchmarks such as census figures or the reports of the stereotyped people themselves. People who believe that African Americans are more likely to be on welfare than whites, that Jews have higher average incomes than WASPs, that business students are more conservative than students in the arts, that women are more likely than men to want to lose weight, and that men are more likely than women to swat a fly with their bare hands, are not being irrational or bigoted. Those beliefs are correct. People's stereotypes are generally consistent with the statistics, and in many cases their bias is to underestimate the real differences between sexes or ethnic groups. 15 This does not mean that the stereotyped traits are unchangeable, of course, or that people think they are unchangeable, only that people perceive the traits fairly accurately at the time.
Moreover, even when people believe that ethnic groups have characteristic traits, they are never mindless stereotypers who literally believe that each and every member of the group possesses those traits. People may think that Germans are, on average, more efficient than non-Germans, but no one believes that every last German is more efficient than every non-German. 16 And people have no trouble overriding a stereotype when they have good information about an individual. Contrary to a common accusation, teachers' impressions of their individual pupils are not contaminated by their stereotypes of race, gender, or socioeconomic status. The teachers' impressions accurately reflect the pupil's performance as measured by objective tests. 17
Now for the important exceptions. Stereotypes can be downright inaccurate when a person has few or no firsthand encounters with the stereotyped {205} group, or belongs to a group that is overtly hostile to the one being judged. During World War II, when the Russians were allies of the United States and the Germans were enemies, Americans judged Russians to have more positive traits than Germans. Soon afterward, when the alliances reversed, Americans judged Germans to have more positive traits than Russians. 18
Also, people's ability to set aside stereotypes when judging an individual is accomplished by their conscious, deliberate reasoning. When people are distracted or put under pressure to respond quickly, they are more likely to judge that a member of an ethnic group has all the stereotyped traits of the group. 19 This comes from the two-part design of the human categorization system mentioned earlier. Our network of fuzzy associations naturally reverts to a stereotype when we first encounter an individual. But our rule-based categorizer can block out those associations and make deductions based on the relevant facts about that individual. It can do so either for practical reasons, when information about a group-wide average is less diagnostic than information about the individual, or for social and moral reasons, out of respect for the imperative that one ought to ignore certain group-wide averages when judging an individual.
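The practical case, in which individual information outweighs a group-wide average, is essentially Bayesian updating. A small worked sketch (all the probabilities are invented for illustration) shows how even a strong base rate is swamped by modestly diagnostic evidence about the individual:

```python
# Toy Bayesian update: a group-wide base rate vs. diagnostic evidence
# about an individual. All probabilities are invented for illustration.

def posterior(prior, p_evidence_if_trait, p_evidence_if_no_trait):
    """P(trait | evidence) by Bayes' rule."""
    numerator = prior * p_evidence_if_trait
    return numerator / (numerator + (1 - prior) * p_evidence_if_no_trait)

base_rate = 0.7   # suppose 70% of some group has a trait

# Evidence about this individual is four times likelier if the trait
# is absent (0.8 vs. 0.2), so it overrides the base rate:
print(round(posterior(base_rate, 0.2, 0.8), 2))   # 0.37
```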
The upshot of this research is not that stereotypes are always accurate but that they are not always false, or even usually false. This is just what we would expect if human categorization -- like the rest of the mind -- is an adaptation that keeps track of aspects of the world that are relevant to our long-term well-being. As the social psychologist Roger Brown pointed out, the main difference between categories of people and categories of other things is that when you use a prototypical exemplar to stand for a category of things, no one takes offense. When Webster's dictionary used a sparrow to stand for all birds, "emus and ostriches and penguins and eagles did not go on the attack. " But just imagine what would have happened if Webster's had used a picture of a soccer mom to illustrate woman and a picture of a business executive to illustrate man. Brown remarks, "Of course, people would be right to take offense since a prototype can never represent the variation that exists in natural categories. It's just that birds don't care but people do. "20
What are the implications of the fact that many stereotypes are statistically accurate? One is that contemporary scientific research on sex differences cannot be dismissed just because some of the findings are consistent with traditional stereotypes of men and women. Some parts of those stereotypes may be false, but the mere fact that they are stereotypes does not prove that they are false in every respect.
The partial accuracy of many stereotypes does not, of course, mean that racism, sexism, and ethnic prejudice are acceptable. Quite apart from the democratic principle that in the public sphere people should be treated as individuals, there are good reasons to be concerned about stereotypes. {206} Stereotypes based on hostile depictions rather than on firsthand experience are bound to be inaccurate. And some stereotypes are accurate only because of self-fulfilling prophecies. Forty years ago it may have been factually correct that few women and African Americans were qualified to be chief executives or presidential candidates. But that was only because of barriers that prevented them from attaining those qualifications, such as university policies that refused them admission out of a belief that they were not qualified. The institutional barriers had to be dismantled before the facts could change. The good news is that when the facts do change, people's stereotypes can change with them.
What about policies that go farther and actively compensate for prejudicial stereotypes, such as quotas and preferences that favor underrepresented groups? Some defenders of these policies assume that gatekeepers are incurably afflicted with baseless prejudices, and that quotas must be kept in place forever to neutralize their effects. The research on stereotype accuracy refutes that argument. Nonetheless, the research might support a different argument for preferences and other gender- and color-sensitive policies. Stereotypes, even when they are accurate, might be self-fulfilling, and not just in the obvious case of institutionalized barriers like those that kept women and African Americans out of universities and professions. Many people have heard of the Pygmalion effect, in which people perform as other people (such as teachers) expect them to perform. As it happens, the Pygmalion effect appears to be small or nonexistent, but there are more subtle forms of self-fulfilling prophecies. 21 If subjective decisions about people, such as admissions, hiring, credit, and salaries, are based in part on group-wide averages, they will conspire to make the rich richer and the poor poorer. Women are marginalized in academia, making them genuinely less influential, which increases their marginalization. African Americans are treated as poorer credit risks and denied credit, which makes them less likely to succeed, which makes them poorer credit risks. Race- and gender-sensitive policies, according to arguments by the psychologist Virginia Valian, the economist Glenn Loury, and the philosopher James Flynn, may be needed to break the vicious cycle. 22
Pushing in the other direction is the finding that stereotypes are least accurate when they pertain to a coalition that is pitted against one's own in hostile competition. This should make us nervous about identity politics, in which public institutions identify their members in terms of their race, gender, and ethnic group and weigh every policy by how it favors one group over another. In many universities, for example, minority students are earmarked for special orientation sessions and encouraged to view their entire academic experience through the lens of their group and how it has been victimized. By implicitly pitting one group against another, such policies may cause each group to brew stereotypes about the other that are more pejorative than the {207} ones they would develop in personal encounters. As with other policy issues I examine in this book, the data from the lab do not offer a thumbs-up or thumbs-down verdict on race- and gender-conscious policies. But by highlighting the features of our psychology that different policies engage, the findings can make the tradeoffs clearer and the debates better informed.
~
Of all the faculties that go into the piece of work called man, language may be the most awe-inspiring. "Remember that you are a human being with a soul and the divine gift of articulate speech," Henry Higgins implored Eliza Doolittle. Galileo's alter ego, humbled by the arts and inventions of his day, commented on language in its written form:
But surpassing all stupendous inventions, what sublimity of mind was his who dreamed of finding means to communicate his deepest thoughts to any other person, though distant by mighty intervals of place and time! Of talking with those who are in India; of speaking to those who are not yet born and will not be born for a thousand or ten thousand years; and with what facility, by the different arrangements of twenty characters upon a page! 23
But a funny thing happened to language in intellectual life. Rather than being appreciated for its ability to communicate thought, it was condemned for its power to constrain thought. Famous quotations from two philosophers capture the anxiety. "We have to cease to think if we refuse to do it in the prisonhouse of language," wrote Friedrich Nietzsche. "The limits of my language mean the limits of my world," wrote Ludwig Wittgenstein. How could language exert this stranglehold? It would if words and phrases were the medium of thought itself, an idea that falls naturally out of the Blank Slate. If there is nothing in the intellect that was not first in the senses, then words picked up by the ears are the obvious source of any abstract thought that cannot be reduced to sights, smells, or other sounds. Watson tried to explain thinking as microscopic movements of the mouth and throat; Skinner hoped his 1957 book Verbal Behavior, which explained language as a repertoire of rewarded responses, would bridge the gap between pigeons and people.
The other social sciences also tended to equate language with thought. Boas's student Edward Sapir called attention to differences in how languages carve up the world into categories, and Sapir's student Benjamin Whorf stretched those observations into the famous Linguistic Determinism hypothesis: "We cut nature up, organize it into concepts, and ascribe significances as we do, largely because we are parties to an agreement to organize it in this way -- an agreement that holds throughout our speech community and is codified in the patterns of our language. The agreement is, of course, an implicit {208} and unstated one, but its terms are absolutely obligatory! "24 More recently, the anthropologist Clifford Geertz wrote that "thinking consists not of 'happenings in the head' (though happenings there and elsewhere are necessary for it to occur) but of a traffic in what have been called . . . significant symbols -- words for the most part. "25
As with so many ideas in social science, the centrality of language is taken to extremes in deconstructionism, postmodernism, and other relativist doctrines. The writings of oracles like Jacques Derrida are studded with such aphorisms as "No escape from language is possible," "Text is self-referential," "Language is power," and "There is nothing outside the text. " Similarly, J. Hillis Miller wrote that "language is not an instrument or tool in man's hands, a submissive means of thinking. Language rather thinks man and his 'world'. . . if he will allow it to do so. "26 The prize for the most extreme statement must go to Roland Barthes, who declared, "Man does not exist prior to language, either as a species or as an individual. "27
The ancestry of these ideas is said to be from linguistics, though most linguists believe that deconstructionists have gone off the deep end. The original observation was that many words are defined in part by their relationship to other words. For example, he is defined by its contrast with I, you, they, and she, and big makes sense only as the opposite of little. And if you look up words in a dictionary, they are defined by other words, which are defined by still other words, until the circle is completed when you get back to a definition containing the original word. Therefore, say the deconstructionists, language is a self-contained system in which words have no necessary connection to reality. And since language is an arbitrary instrument, not a medium for communicating thoughts or describing reality, the powerful can use it to manipulate and oppress others. This leads in turn to an agitation for linguistic reforms: neologisms like co or na that would serve as gender-neutral pronouns, a succession of new terms for racial minorities, and a rejection of standards of clarity in criticism and scholarship (for if language is no longer a window onto thought but the very stuff of thought, the metaphor of "clarity" no longer applies).
Like all conspiracy theories, the idea that language is a prisonhouse denigrates its subject by overestimating its power. Language is the magnificent faculty that we use to get thoughts from one head to another, and we can co-opt it in many ways to help our thoughts along. But it is not the same as thought, not the only thing that separates humans from other animals, not the basis of all culture, and not an inescapable prisonhouse, an obligatory agreement, the limits of our world, or the determiner of what is imaginable. 28
We have seen that perception and categorization provide us with concepts that keep us in touch with the world. Language extends that lifeline by connecting the concepts to words. Children hear noises coming out of a family member's mouth, use their intuitive psychology and their grasp of the context {209} to infer what the speaker is trying to say, and mentally link the words to the concepts and the grammatical rules to the relationships among them. Bowser upends a chair, Sister yells, "The dog knocked over the chair! " and Junior deduces that dog means dog, chair means chair, and the subject of the verb knock over is the agent doing the knocking over. 29 Now Junior can talk about other dogs, other chairs, and other knockings over. There is nothing self-referential or imprisoning about it. As the novelist Walker Percy quipped, a deconstructionist is an academic who claims that texts have no referents and then leaves a message on his wife's answering machine asking her to order a pepperoni pizza for dinner.
Language surely does affect our thoughts, rather than just labeling them for the sake of labeling them. Most obviously, language is the conduit through which people share their thoughts and intentions and thereby acquire the knowledge, customs, and values of those around them. In the song "Christmas" from their rock opera Tommy, The Who described the plight of a boy without language: "Tommy doesn't know what day it is; he doesn't know who Jesus was or what prayin' is. "
Language can allow us to share thoughts not just directly, by its literal content, but also indirectly, via metaphors and metonyms that nudge listeners into grasping connections they may not have noticed before. For example, many expressions treat time as if it were a valuable resource, such as waste time, spend time, valuable time, and time is money. 30 Presumably on the first occasion a person used one of these expressions, her audience wondered why she was using a word for money to refer to time; after all, you can't literally spend time the way you spend pieces of gold. Then, by assuming that the speaker was not gibbering, they figured out the ways in which time indeed has something in common with money, and assumed that that was what the speaker intended to convey. Note that even in this clear example of language affecting thought, language is not the same thing as thought. The original coiner of the metaphor had to see the analogy without the benefit of the English expressions, and the first listeners had to make sense of it using a chain of ineffable thoughts about the typical intentions of speakers and the properties shared by time and money.
Aside from its use as a medium of communication, language can be pressed into service as one of the media used by the brain for storing and manipulating information. 31 The leading theory of human working memory, from the psychologist Alan Baddeley, captures the idea nicely. 32 The mind makes use of a "phonological loop": a silent articulation of words or numbers that persists for a few seconds and can be sensed by the mind's ear. The loop acts as a "slave system" at the service of a "central executive. " By describing things to ourselves using snatches of language, we can temporarily store the result of a mental computation or retrieve chunks of data stored as verbal expressions. Mental arithmetic involving large numbers, for example, may be {210} carried out by retrieving verbal formulas such as "Seven times eight is fifty-six. "33 But as the technical terms of the theory make clear, language is serving as a slave of an executive, not as the medium of all thought.
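Baddeley's architecture lends itself to a toy simulation. In this minimal sketch (the buffer capacity and the table of rote verbal facts are invented stand-ins, not empirical parameters), the phonological loop is a small decaying buffer into which a central executive parks a retrieved verbal formula:

```python
from collections import deque

# Toy model of working memory: a small "phonological loop" serving a
# central executive. The capacity and the fact table are invented
# stand-ins, not empirical parameters.

class PhonologicalLoop:
    def __init__(self, capacity=4):
        # Items silently rehearsed; old ones fall off as new ones arrive.
        self.buffer = deque(maxlen=capacity)

    def rehearse(self, phrase):
        self.buffer.append(phrase)

    def recall(self):
        return list(self.buffer)

VERBAL_FACTS = {"seven times eight": "fifty-six"}

def executive_multiply(problem, loop):
    """Central executive: retrieve a rote verbal formula (no computation
    happens here) and park the result in the loop for a few seconds."""
    answer = VERBAL_FACTS[problem]
    loop.rehearse(f"{problem} is {answer}")
    return answer

loop = PhonologicalLoop()
executive_multiply("seven times eight", loop)
print(loop.recall())   # ['seven times eight is fifty-six']
```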
Why do virtually all cognitive scientists and linguists believe that language is not a prisonhouse of thought? 34 First, many experiments have plumbed the minds of creatures without language, such as infants and nonhuman primates, and have found the fundamental categories of thought working away: objects, space, cause and effect, number, probability, agency (the initiation of behavior by a person or animal), and the functions of tools. 35 Second, our vast storehouse of knowledge is certainly not couched in the words and sentences in which we learned the individual facts. What did you read on the pages before this one?
I would like to think that you can give a reasonably accurate answer to the question. Now try to write down the exact words you read in those pages. Chances are you cannot recall a single sentence verbatim, probably not even a single phrase. What you remembered is the gist of those passages -- their content, meaning, or sense -- not the language itself. Many experiments on human memory have confirmed that what we remember over the long term is the content, not the wording, of stories and conversations. Cognitive scientists model this "semantic memory" as a web of logical propositions, images, motor programs, strings of sounds, and other data structures connected to one another in the brain. 36
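That distinction between gist and wording suggests a storage format. In this hedged sketch (the predicate-argument notation is a common textbook convention, not a claim about neural implementation), two different wordings of the same event collapse into a single stored proposition:

```python
# Toy "semantic memory": the gist of a sentence is stored as a
# predicate-argument proposition, not as a verbatim string.

semantic_memory = set()

def encode(predicate, *arguments):
    semantic_memory.add((predicate, arguments))

# Two different wordings of the same event...
encode("knock-over", "dog", "chair")   # "The dog knocked over the chair."
encode("knock-over", "dog", "chair")   # "The chair was knocked over by the dog."

# ...leave a single proposition behind: the wording is gone, the gist survives.
print(semantic_memory)                                      # {('knock-over', ('dog', 'chair'))}
print(("knock-over", ("dog", "chair")) in semantic_memory)  # True
```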
A third way to put language in its place is to think about how we use it. Writing and speaking do not consist of transcribing an interior monologue onto paper or playing it into a microphone. Rather, we engage in a constant give-and-take between the thoughts we try to convey and the means our language offers to convey them. We often grope for words, are dissatisfied with what we write because it does not express what we wanted to say, or discover when every combination of words seems wrong that we do not really know what we want to say. And when we get frustrated by a mismatch between our language and our thoughts, we don't give up, defeated and mum, but change the language. We concoct neologisms (quark, meme, clone, deep structure), invent slang (to spam, to diss, to flame, to surf the web, a spin doctor), borrow useful words from other languages (joie de vivre, schlemiel, angst, machismo), or coin new metaphors (waste time, vote with your feet, push the outside of the envelope). That is why every language, far from being an immutable penitentiary, is constantly under renovation. Despite the lamentations of language lovers and the coercion of tongue troopers, languages change unstoppably as people need to talk about new things or convey new attitudes. 37
Finally, language itself could not function if it did not sit atop a vast infrastructure of tacit knowledge about the world and about the intentions of other people. When we understand language, we have to listen between the lines to winnow out the unintended readings of an ambiguous sentence, piece {211} together fractured utterances, glide over slips of the tongue, and fill in the countless unsaid steps in a complete train of thought. When the shampoo bottle says "Lather, rinse, repeat," we don't spend the rest of our lives in the shower; we infer that it means "repeat once. " And we know how to interpret ambiguous headlines such as "Kids Make Nutritious Snacks," "Prostitutes Appeal to Pope," and "British Left Waffles on Falkland Islands," because we effortlessly apply our background knowledge about the kinds of things that people are likely to convey in newspapers. Indeed, the very existence of ambiguous sentences, in which one string of words expresses two thoughts, proves that thoughts are not the same thing as strings of words.
~
Language often makes the news precisely because it can part company with thoughts and attitudes. In 1998 Bill Clinton exploited the expectations behind ordinary comprehension to mislead prosecutors about his affair with Monica Lewinsky. He used words like alone, sex, and is in senses that were technically defensible but which deviated from charitable guesses about what people ordinarily mean by these terms. For example, he suggested he was not "alone" with Lewinsky, even though they were the only two people in the room, because other people were in the Oval Office complex at the time. He said that he did not have "sex" with her, because they did not engage in intercourse. His words, like all words, are certainly vague at their boundaries. Exactly how far away or hidden must the nearest person be before one is considered alone? At what point in the continuum of bodily contact -- from an accidental brush in an elevator to tantric bliss -- do we say that sex has occurred? Ordinarily we resolve the vagueness by guessing how our conversational partner would interpret words in the context, and we choose our words accordingly. Clinton's ingenuity in manipulating these guesses, and the outrage that erupted when he was forced to explain what he had done, show that people have an acute understanding of the difference between words and the thoughts they are designed to convey.
Language conveys not just literal meanings but also a speaker's attitude. Think of the difference between fat and voluptuous, slender and scrawny, thrifty and stingy, articulate and slick. Racial epithets, which are laced with contempt, are justifiably off-limits among responsible people, because using them conveys the tacit message that contempt for the people referred to by the epithet is acceptable. But the drive to adopt new terms for disadvantaged groups goes much further than this basic sign of respect; it often assumes that words and attitudes are so inseparable that one can reengineer people's attitudes by tinkering with the words. In 1994 the Los Angeles Times adopted a style sheet that banned some 150 words, including birth defect, Canuck, Chinese fire drill, dark continent, divorcee, Dutch treat, handicapped, illegitimate, invalid, man-made, New World, stepchild, and to welsh. The editors assumed that words {212} register in the brain with their literal meanings, so that an invalid is understood as "someone who is not valid" and Dutch treat is understood as a slur on contemporary Netherlanders. (In fact, it is one of many idioms in which Dutch means "ersatz," such as Dutch oven, Dutch door, Dutch uncle, Dutch courage, and Dutch auction, the remnants of a long-forgotten rivalry between the English and the Dutch. )
But even the more reasonable attempts at linguistic reform are based on a dubious theory of linguistic determinism. Many people are puzzled by the replacement of formerly unexceptionable terms by new ones: Negro by black by African American, Spanish-American by Hispanic by Latino, crippled by handicapped by disabled by challenged, slum by ghetto by inner city by (according to the Times) slum once again. Occasionally the neologisms are defended with some rationale about their meaning. In the 1960s, the word Negro was replaced by the word black, because the parallel between the words black and white was meant to underscore the equality of the races. Similarly, Native American reminds us of who was here first and avoids the geographically inaccurate term Indian. But often the new terms replace ones that were perfectly congenial in their day, as we see in names for old institutions that are obviously sympathetic to the people being named: the United Negro College Fund, the National Association for the Advancement of Colored People, the Shriners Hospitals for Crippled Children. And sometimes a term can be tainted or unfashionable while a minor variant is fine: consider colored people versus people of color, Afro-American versus African American, Negro -- Spanish for "black" -- versus black. If anything, a respect for literal meaning should send us off looking for a new word for the descendants of Europeans, who are neither white nor Caucasian. Something else must be driving the replacement process.
Linguists are familiar with the phenomenon, which may be called the euphemism treadmill. People invent new words for emotionally charged referents, but soon the euphemism becomes tainted by association, and a new word must be found, which soon acquires its own connotations, and so on. Water closet becomes toilet (originally a term for any kind of body care, as in toilet kit and toilet water), which becomes bathroom, which becomes restroom, which becomes lavatory. Undertaker changes to mortician, which changes to funeral director. Garbage collection turns into sanitation, which turns into environmental services. Gym (from gymnasium, originally "high school") becomes physical education, which becomes (at Berkeley) human biodynamics. Even the word minority -- the most neutral label conceivable, referring only to relative numbers -- was banned in 2001 by the San Diego City Council (and nearly banned by the Boston City Council) because it was deemed disparaging to non-whites. "No matter how you slice it, minority means less than," said a semantically challenged official at Boston College, where the preferred term is AHANA (an acronym for African-American, Hispanic, Asian, and Native American). 38 {213}
The euphemism treadmill shows that concepts, not words, are primary in people's minds. Give a concept a new name, and the name becomes colored by the concept; the concept does not become freshened by the name, at least not for long. Names for minorities will continue to change as long as people have negative attitudes toward them. We will know that we have achieved mutual respect when the names stay put.
~
"Image is nothing. Thirst is everything," screams a soft-drink ad that tries to create a new image for its product by making fun of soft-drink ads that try to create images for their products. Like words, images are salient tokens of our mental lives. And like words, images are said to have an insidious power over our consciousness, presumably because they are inscribed directly onto a blank slate. In postmodernist and relativist thinking, images are held to shape our view of reality, or to be our view of reality, or to be reality itself. This is especially true of images representing celebrities, politicians, women, and AHANAs. And as with language, the scientific study of imagery shows that the fear is misplaced.
A good description of the standard view of images within cultural studies and related disciplines may be found in the Concise Glossary of Cultural Theory. It defines image as a "mental or visual representation of an object or event as depicted in the mind, a painting, a photograph, or film. " Having thus run together images in the world (such as paintings) with images in the mind, the entry lays out the centrality of images in postmodernism, cultural studies, and academic feminism.
First it notes, reasonably enough, that images can misrepresent reality and thereby serve the interests of an ideology. A racist caricature, presumably, is a prime example. But then it takes the concept further:
With what is called the "crisis of representation" brought about by. . . postmodernism, however, it is often questioned whether an image can be thought to simply represent, or misrepresent, a supposedly prior or external, image-free reality. Reality is seen rather as always subject to, or as the product of, modes of representation. In this view we inescapably inhabit a world of images or representations and not a "real world" and true or false images of it.
In other words, if a tree falls in a forest and there is no artist to paint it, not only did the tree make no sound, but it did not fall, and there was no tree there to begin with.
In a further move . . . we are thought to exist in a world of hyperreality, in which images are self-generating and entirely detached from any {214} supposed reality. This accords with a common view of contemporary entertainment and politics as being all a matter of "image," or appearance, rather than of substantial content.
Actually, the doctrine of hyperreality contradicts the common view of contemporary politics and entertainment as being a matter of image and appearance. The whole point of the common view is that there is a reality separate from images, and that is what allows us to decry the images that are misleading. We can, for example, criticize an old movie that shows slaves leading happy lives, or an ad that shows a corrupt politician pretending to defend the environment. If there were no such thing as substantial content, we would have no basis for preferring an accurate documentary about slavery to an apologia for it, or preferring a good exposé of a politician to a slick campaign ad.
The entry notes that images are associated with the world of publicity, advertising, and fashion, and thereby with business and profits. An image may thus be tied to "an imposed stereotype or an alternative subjective or cultural identity. " Media images become mental images: people cannot help but think that women or politicians or African Americans conform to the depictions in movies and advertisements. And this elevates cultural studies and postmodernist art into forces for personal and political liberation:
The study of "images of women" or "women's images" sees this field as one in which stereotypes of women can be reinforced, parodied, or actively contested through critical analysis, alternative histories, or creative work in writing and the media committed to the production of positive counter-images. 39
I have not hidden my view that this entire line of thinking is a conceptual mess. If we want to understand how politicians or advertisers manipulate us, the last thing we should do is blur distinctions among things in the world, our perception of those things when they are in front of our eyes, the mental images of those things that we construct from memory, and physical images such as photographs and drawings.
As we saw at the beginning of this chapter, the visual brain is an immensely complicated system that was designed by the forces of evolution to give us an accurate reading of the consequential things in front of us. The "intelligent eye," as perceptual psychologists call it, does not just compute the shapes and motions of people before us. It also guesses their thoughts and intentions by noticing how they gaze at, approach, avoid, help, or hinder other objects and people. And these guesses are then measured against everything else we know about people -- what we infer from gossip, from a person's words and deeds, and from Sherlock Holmes-style deductions. The result is {215} the knowledge base or semantic memory that also underlies our use of language.
Physical images such as photographs and paintings are devices that reflect light in patterns similar to those coming off real objects, thereby making the visual system respond as if it were really seeing those objects. Though people have long dreamed of illusions that completely fool the brain -- Descartes's evil demon, the philosopher's thought experiment in which a person does not realize he is a brain in a vat, the science-fiction writer's prophecy of perfect virtual reality, as in The Matrix -- in actuality the illusions foisted upon us by physical images are never more than partially effective. Our perceptual systems pick up on the imperfections of an image -- the brush strokes, pixels, or frame -- and our conceptual systems pick up on the fact that we are entertaining a hypothetical world that is separate from the real world. It's not that people invariably distinguish fiction from reality: they can lose themselves in fiction, or misremember something they read in a novel as something they read in the newspapers or that happened to a friend, or mistakenly believe that a stylized portrayal of a time and place is an accurate portrayal. But all of us are capable of distinguishing fictitious worlds from real ones, as we see when a two-year-old pretends that a banana is a telephone for the fun of it but at the same time understands that a banana is not literally a telephone. 40 Cognitive scientists believe that the ability to entertain propositions without necessarily believing them -- to distinguish "John believes there is a Santa Claus" from "There is a Santa Claus" -- is a fundamental ability of human cognition. 41 Many believe that a breakdown of this ability underlies the thought disorder in the syndrome called schizophrenia. 42
Finally, there are mental images, the visualizations of objects and scenes in the mind's eye. The psychologist Stephen Kosslyn has shown that the brain is equipped with a system capable of reactivating and manipulating memories of perceptual experience, a bit like Photoshop with its tools for assembling, rotating, and coloring images. 43 Like language, imagery may be used as a slave system -- a "visuospatial sketchpad" -- by the central executive of the brain, making it a valuable form of mental representation. We use mental imagery, for example, when we visualize how a chair might fit in a living room or whether a sweater would look good on a relative. Imagery is also an invaluable tool to novelists, who imagine scenes before describing them in words, and to scientists, who rotate molecules or play out forces and motions in their imagination.
Though mental images allow our experiences (including our experience of media images) to affect our thoughts and attitudes long after the original objects have gone, it is a mistake to think that raw images are downloaded into our minds and then constitute our mental lives. Images are not stored in the mind like snapshots in a shoebox; if they were, how could you ever find the one you want? Rather, they are labeled and linked to a vast database of {216} knowledge, which allows them to be evaluated and interpreted in terms of what they stand for. 44 Chess masters, for example, are famous for their ability to remember games in progress, but their mental images of the board are not raw photographs. Rather, they are saturated with abstract information about the game, such as which piece is threatening which other one and which clusters of pieces form viable defenses. We know this because when a chessboard is sprinkled with pieces at random, chess masters are no better at remembering the arrangement than amateurs are. 45 When images represent real people, not just chessmen, there are even more possibilities for organizing and annotating them with information about people's goals and motives -- for example, whether the person in an image is sincere or just acting.
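The chess result is a classic demonstration of chunking, and a toy model makes the logic explicit. In this sketch (the chunk inventory and the capacity limit are invented for illustration), a structured position compresses into a few familiar chunks while a random sprinkle of pieces does not:

```python
# Toy chunking model of the chess-memory finding. The chunk inventory
# and the capacity limit are invented for illustration.

KNOWN_CHUNKS = [
    {"Ke1", "Rh1", "Ra1"},   # a familiar castled-king structure
    {"Pf2", "Pg2", "Ph2"},   # a kingside pawn shield
]

CAPACITY = 3  # working memory holds a few chunks, of any size

def chunks_needed(position):
    pieces = set(position)
    count = 0
    for chunk in KNOWN_CHUNKS:      # greedily absorb familiar patterns
        if chunk <= pieces:
            pieces -= chunk
            count += 1
    return count + len(pieces)      # each leftover piece costs a chunk

game_position   = ["Ke1", "Rh1", "Ra1", "Pf2", "Pg2", "Ph2"]
random_sprinkle = ["Kh5", "Ra3", "Pb6", "Pe1", "Rc7", "Nd2"]

print(chunks_needed(game_position) <= CAPACITY)    # True: two chunks suffice
print(chunks_needed(random_sprinkle) <= CAPACITY)  # False: six chunks needed
```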
The reason that images cannot constitute the contents of our thoughts is that images, like words, are inherently ambiguous. An image of Lassie could stand for Lassie, collies, dogs, animals, television stars, or family values. Some other, more abstract form of information must pick out the concept that an image is taken to exemplify. Or consider the sentence Yesterday my uncle fired his lawyer (an example suggested by Dan Dennett). When understanding the sentence, Brad might visualize his own ordeals of the day before and glimpse the "uncle" slot in a family tree, then picture courthouse steps and an angry man. Irene might have no image for "yesterday" but might visualize her uncle Bob's face, a slamming door, and a power-suited woman. Yet despite these very different image sequences, both people have understood the sentence in the same way, as we could see by questioning them or asking them to paraphrase the sentence. "Imagery couldn't be the key to comprehension," Dennett points out, "because you can't draw a picture of an uncle, or of yesterday, or firing, or a lawyer. Uncles, unlike clowns and firemen, don't look different in any characteristic way that can be visually represented, and yesterdays don't look like anything at all. "46 Since images are interpreted in the context of a deeper understanding of people and their relationships, the "crisis of representation," with its paranoia about the manipulation of our mind by media images, is overblown. People are not helplessly programmed with images; they can evaluate and interpret what they see using everything else they know, such as the credibility and motives of the source.
The postmodernist equating of images with thoughts has not only made a hash of several scholarly disciplines but has laid waste to the world of contemporary art. If images are the disease, the reasoning goes, then art is the cure. Artists can neutralize the power of media images by distorting them or reproducing them in odd contexts (like the ad parodies in Mad magazine or on Saturday Night Live, only not funny). Anyone familiar with contemporary art has seen the countless works in which stereotypes of women, minorities, or gay {217} people are "reinforced, parodied, or actively contested. " A prototypical example is a 1994 exhibit at the Whitney Museum in New York called "Black Male: Representations of Masculinity in Contemporary Art. " It aimed to take apart the way that African American men are culturally constructed in demonizing and marginalizing visual stereotypes such as the sex symbol, the athlete, the Sambo, and the photograph in a Wanted poster. According to the catalogue essay, "The real struggle is over the power to control images. " The art critic Adam Gopnik (whose mother and sister are cognitive scientists) called attention to the simplistic theory of cognition behind this tedious formula:
The show is intended to be socially therapeutic: its aim is to make you face the socially constructed images of black men, so that by confronting them -- or, rather, seeing artists confront them on your behalf -- you can make them go away. The trouble is that the entire enterprise of "disassembling social images" rests on an ambiguity in the way we use the word "image. " Mental images are not really images at all, but instead consist of complicated opinions, positions, doubts, and passionately held convictions, rooted in experience and amendable by argument, by more experience, or by coercion. Our mental images of black men, white judges, the press, and so on do not take the form of pictures of the kind that you can hang up (or "deconstruct") on a museum wall. . . . Hitler did not hate Jews because there were pictures of swarthy Semites with big noses imprinted on his cerebellum; racism does not exist in America because the picture of O. J. Simpson on the cover of Time is too dark. The view that visual cliches shape beliefs is both too pessimistic, in that it supposes that people are helplessly imprisoned by received stereotypes, and too optimistic, in that it supposes that if you could change the images you could change the beliefs. 47
Recognizing that we are equipped with sophisticated faculties that keep us in touch with reality does not entail ignoring the ways in which our faculties can be turned against us. People lie, sometimes baldly, sometimes through insinuation and presupposition (as in the question "When did you stop beating your wife? "). People disseminate disinformation about ethnic groups, not just pejorative stereotypes but tales of exploitation and perfidy that serve to stoke moralistic outrage against them. People try to manipulate social realities like status (which exist in the mind of the beholder) to make themselves look good or to sell products.
But we can best protect ourselves against such manipulation by pinpointing the vulnerabilities of our faculties of categorization, language, and imagery, not by denying their complexity. The view that humans are passive {218} receptacles of stereotypes, words, and images is condescending to ordinary people and gives unearned importance to the pretensions of cultural and academic elites. And exotic pronouncements about the limitations of our faculties, such as that there is nothing outside the text or that we inhabit a world of images rather than a real world, make it impossible even to identify lies and misrepresentations, let alone to understand how they are promulgated.
{219} Chapter 13
Out of Our Depths
A man has got to know his limitations. -- Clint Eastwood in Magnum Force
Most people are familiar with the idea that some of our ordeals come from a mismatch between the source of our passions in evolutionary history and the goals we set for ourselves today. People gorge themselves in anticipation of a famine that never comes, engage in dangerous liaisons that conceive babies they don't want, and rev up their bodies in response to stressors from which they cannot run away.
What is true for the emotions may also be true for the intellect. Some of our perplexities may come from a mismatch between the purposes for which our cognitive faculties evolved and the purposes to which we put them today. This is obvious enough when it comes to raw data processing. People do not try to multiply six-digit numbers in their heads or remember the phone number of everyone they meet, because they know their minds were not designed for the job. But it is not as obvious when it comes to the way we conceptualize the world. Our minds keep us in touch with aspects of reality -- such as objects, animals, and people -- that our ancestors dealt with for millions of years. But as science and technology open up new and hidden worlds, our untutored intuitions may find themselves at sea.
What are these intuitions? Many cognitive scientists believe that human reasoning is not accomplished by a single, general-purpose computer in the head. The world is a heterogeneous place, and we are equipped with different kinds of intuitions and logics, each appropriate to one department of reality. These ways of knowing have been called systems, modules, stances, faculties, mental organs, multiple intelligences, and reasoning engines. 1 They emerge early in life, are present in every normal person, and appear to be computed in partly distinct sets of networks in the brain. They may be installed by different {220} combinations of genes, or they may emerge when brain tissue self- organizes in response to different problems to be solved and different patterns in the sensory input. Most likely they develop by some combination of these forces.
What makes our reasoning faculties different from the departments in a university is that they are not just broad areas of knowledge, analyzed with whatever tools work best. Each faculty is based on a core intuition that was suitable for analyzing the world in which we evolved. Though cognitive scientists have not agreed on a Gray's Anatomy of the mind, here is a tentative but defensible list of cognitive faculties and the core intuitions on which they are based:
• An intuitive physics, which we use to keep track of how objects fall, bounce, and bend. Its core intuition is the concept of the object, which occupies one place, exists for a continuous span of time, and follows laws of motion and force. These are not Newton's laws but something closer to the medieval conception of impetus, an "oomph" that keeps an object in motion and gradually dissipates. 2
• An intuitive version of biology or natural history, which we use to understand the living world. Its core intuition is that living things house a hidden essence that gives them their form and powers and drives their growth and bodily functions. 3
• An intuitive engineering, which we use to make and understand tools and other artifacts. Its core intuition is that a tool is an object with a purpose -- an object designed by a person to achieve a goal.
The partial accuracy of many stereotypes does not, of course, mean that racism, sexism, and ethnic prejudice are acceptable. Quite apart from the democratic principle that in the public sphere people should be treated as individuals, there are good reasons to be concerned about stereotypes. {206} Stereotypes based on hostile depictions rather than on firsthand experience are bound to be inaccurate. And some stereotypes are accurate only because of self-fulfilling prophecies. Forty years ago it may have been factually correct that few women and African Americans were qualified to be chief executives or presidential candidates. But that was only because of barriers that prevented them from attaining those qualifications, such as university policies that refused them admission out of a belief that they were not qualified. The institutional barriers had to be dismantled before the facts could change. The good news is that when the facts do change, people's stereotypes can change with them.
What about policies that go farther and actively compensate for prejudicial stereotypes, such as quotas and preferences that favor underrepresented groups? Some defenders of these policies assume that gatekeepers are incurably afflicted with baseless prejudices, and that quotas must be kept in place forever to neutralize their effects. The research on stereotype accuracy refutes that argument. Nonetheless, the research might support a different argument for preferences and other gender- and color-sensitive policies. Stereotypes, even when they are accurate, might be self-fulfilling, and not just in the obvious case of institutionalized barriers like those that kept women and African Americans out of universities and professions. Many people have heard of the Pygmalion effect, in which people perform as other people (such as teachers) expect them to perform. As it happens, the Pygmalion effect appears to be small or nonexistent, but there are more subtle forms of self-fulfilling prophecies. 21 If subjective decisions about people, such as admissions, hiring, credit, and salaries, are based in part on group-wide averages, they will conspire to make the rich richer and the poor poorer. Women are marginalized in academia, making them genuinely less influential, which increases their marginalization. African Americans are treated as poorer credit risks and denied credit, which makes them less likely to succeed, which makes them poorer credit risks. Race- and gender-sensitive policies, according to arguments by the psychologist Virginia Valian, the economist Glenn Loury, and the philosopher James Flynn, may be needed to break the vicious cycle. 22
Pushing in the other direction is the finding that stereotypes are least accurate when they pertain to a coalition that is pitted against one's own in hostile competition. This should make us nervous about identity politics, in which public institutions identify their members in terms of their race, gender, and ethnic group and weigh every policy by how it favors one group over another. In many universities, for example, minority students are earmarked for special orientation sessions and encouraged to view their entire academic experience through the lens of their group and how it has been victimized. By implicitly pitting one group against another, such policies may cause each group to brew stereotypes about the other that are more pejorative than the {207} ones they would develop in personal encounters. As with other policy issues I examine in this book, the data from the lab do not offer a thumbs-up or thumbs-down verdict on race- and gender-conscious policies. But by highlighting the features of our psychology that different policies engage, the findings can make the tradeoffs clearer and the debates better informed.
~
Of all the faculties that go into the piece of work called man, language may be the most awe-inspiring. "Remember that you are a human being with a soul and the divine gift of articulate speech," Henry Higgins implored Eliza Doolittle. Galileo's alter ego, humbled by the arts and inventions of his day, commented on language in its written form:
But surpassing all stupendous inventions, what sublimity of mind was his who dreamed of finding means to communicate his deepest thoughts to any other person, though distant by mighty intervals of place and time! Of talking with those who are in India; of speaking to those who are not yet born and will not be born for a thousand or ten thousand years; and with what facility, by the different arrangements of twenty characters upon a page! 23
But a funny thing happened to language in intellectual life. Rather than being appreciated for its ability to communicate thought, it was condemned for its power to constrain thought. Famous quotations from two philosophers capture the anxiety. "We have to cease to think if we refuse to do it in the prisonhouse of language," wrote Friedrich Nietzsche. "The limits of my language mean the limits of my world," wrote Ludwig Wittgenstein. How could language exert this stranglehold? It would if words and phrases were the medium of thought itself, an idea that falls naturally out of the Blank Slate. If there is nothing in the intellect that was not first in the senses, then words picked up by the ears are the obvious source of any abstract thought that cannot be reduced to sights, smells, or other sounds. Watson tried to explain thinking as microscopic movements of the mouth and throat; Skinner hoped his 1957 book Verbal Behavior, which explained language as a repertoire of rewarded responses, would bridge the gap between pigeons and people.
The other social sciences also tended to equate language with thought. Boas's student Edward Sapir called attention to differences in how languages carve up the world into categories, and Sapir's student Benjamin Whorf stretched those observations into the famous Linguistic Determinism hypothesis: "We cut nature up, organize it into concepts, and ascribe significances as we do, largely because we are parties to an agreement to organize it in this way -- an agreement that holds throughout our speech community and is codified in the patterns of our language. The agreement is, of course, an implicit {208} and unstated one, but its terms are absolutely obligatory! "24 More recently, the anthropologist Clifford Geertz wrote that "thinking consists not of 'happenings in the head' (though happenings there and elsewhere are necessary for it to occur) but of a traffic in what have been called . . . significant symbols -- words for the most part. "25
As with so many ideas in social science, the centrality of language is taken to extremes in deconstructionism, postmodernism, and other relativist doctrines. The writings of oracles like Jacques Derrida are studded with such aphorisms as "No escape from language is possible," "Text is self-referential," "Language is power," and "There is nothing outside the text. " Similarly, J. Hillis Miller wrote that "language is not an instrument or tool in man's hands, a submissive means of thinking. Language rather thinks man and his 'world'. . . if he will allow it to do so. "26 The prize for the most extreme statement must go to Roland Barthes, who declared, "Man does not exist prior to language, either as a species or as an individual. "27
The ancestry of these ideas is said to be from linguistics, though most linguists believe that deconstructionists have gone off the deep end. The original observation was that many words are defined in part by their relationship to other words. For example, he is defined by its contrast with I, you, they, and she, and big makes sense only as the opposite of little. And if you look up words in a dictionary, they are defined by other words, which are defined by still other words, until the circle is completed when you get back to a definition containing the original word. Therefore, say the deconstructionists, language is a self-contained system in which words have no necessary connection to reality. And since language is an arbitrary instrument, not a medium for communicating thoughts or describing reality, the powerful can use it to manipulate and oppress others. This leads in turn to an agitation for linguistic reforms: neologisms like co or na that would serve as gender-neutral pronouns, a succession of new terms for racial minorities, and a rejection of standards of clarity in criticism and scholarship (for if language is no longer a window onto thought but the very stuff of thought, the metaphor of "clarity" no longer applies).
Like all conspiracy theories, the idea that language is a prisonhouse denigrates its subject by overestimating its power. Language is the magnificent faculty that we use to get thoughts from one head to another, and we can co-opt it in many ways to help our thoughts along. But it is not the same as thought, not the only thing that separates humans from other animals, not the basis of all culture, and not an inescapable prisonhouse, an obligatory agreement, the limits of our world, or the determiner of what is imaginable. 28
We have seen that perception and categorization provide us with concepts that keep us in touch with the world. Language extends that lifeline by connecting the concepts to words. Children hear noises coming out of a family member's mouth, use their intuitive psychology and their grasp of the context {209} to infer what the speaker is trying to say, and mentally link the words to the concepts and the grammatical rules to the relationships among them. Bowser upends a chair, Sister yells, "The dog knocked over the chair! " and Junior deduces that dog means dog, chair means chair, and the subject of the verb knock over is the agent doing the knocking over. 29 Now Junior can talk about other dogs, other chairs, and other knockings over. There is nothing self-referential or imprisoning about it. As the novelist Walker Percy quipped, a deconstructionist is an academic who claims that texts have no referents and then leaves a message on his wife's answering machine asking her to order a pepperoni pizza for dinner.
Language surely does affect our thoughts, rather than just labeling them for the sake of labeling them. Most obviously, language is the conduit through which people share their thoughts and intentions and thereby acquire the knowledge, customs, and values of those around them. In the song "Christmas" from their rock opera, The Who described the plight of a boy without language: "Tommy doesn't know what day it is; he doesn't know who Jesus was or what prayin' is. "
Language can allow us to share thoughts not just directly, by its literal content, but also indirectly, via metaphors and metonyms that nudge listeners into grasping connections they may not have noticed before. For example, many expressions treat time as if it were a valuable resource, such as waste time, spend time, valuable time, and time is money. 30 Presumably on the first occasion a person used one of these expressions, her audience wondered why she was using a word for money to refer to time; after all, you can't literally spend time the way you spend pieces of gold. Then, by assuming that the speaker was not gibbering, they figured out the ways in which time indeed has something in common with money, and assumed that that was what the speaker intended to convey. Note that even in this clear example of language affecting thought, language is not the same thing as thought. The original coiner of the metaphor had to see the analogy without the benefit of the English expressions, and the first listeners had to make sense of it using a chain of ineffable thoughts about the typical intentions of speakers and the properties shared by time and money.
Aside from its use as a medium of communication, language can be pressed into service as one of the media used by the brain for storing and manipulating information. 31 The leading theory of human working memory, from the psychologist Alan Baddeley, captures the idea nicely. 32 The mind makes use of a "phonological loop": a silent articulation of words or numbers that persists for a few seconds and can be sensed by the mind's ear. The loop acts as a "slave system" at the service of a "central executive. " By describing things to ourselves using snatches of language, we can temporarily store the result of a mental computation or retrieve chunks of data stored as verbal expressions. Mental arithmetic involving large numbers, for example, may be {210} carried out by retrieving verbal formulas such as "Seven times eight is fifty-six. "33 But as the technical terms of the theory make clear, language is serving as a slave of an executive, not as the medium of all thought.
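Baddeley's proposal is an architectural one, and a toy sketch can make the division of labor concrete. Everything specific below -- the four-item capacity, the two-second decay -- is an invented placeholder for illustration, not a parameter of the actual theory.

```python
# Toy sketch of a "phonological loop" serving a central executive.
# Capacity and decay are invented numbers, not measurements.
import time
from collections import deque

class PhonologicalLoop:
    """A small buffer of silently rehearsed phrases that fade after a
    few seconds unless refreshed."""
    def __init__(self, capacity: int = 4, decay_seconds: float = 2.0):
        self.buffer = deque(maxlen=capacity)   # oldest items fall out
        self.decay = decay_seconds

    def rehearse(self, phrase: str):
        self.buffer.append((phrase, time.monotonic()))

    def read_back(self):
        now = time.monotonic()
        return [p for p, t in self.buffer if now - t < self.decay]

# The "central executive" parks an intermediate result in the loop:
loop = PhonologicalLoop()
loop.rehearse("seven times eight is fifty-six")   # retrieved verbal formula
loop.rehearse("carry the five")                    # stored mid-computation
print(loop.read_back())
```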
Why do virtually all cognitive scientists and linguists believe that language is not a prisonhouse of thought? 34 First, many experiments have plumbed the minds of creatures without language, such as infants and nonhuman primates, and have found the fundamental categories of thought working away: objects, space, cause and effect, number, probability, agency (the initiation of behavior by a person or animal), and the functions of tools. 35
Second, our vast storehouse of knowledge is certainly not couched in the words and sentences in which we learned the individual facts. What did you read on the page before this one?
I would like to think that you can give a reasonably accurate answer to the question. Now try to write down the exact words you read in those pages. Chances are you cannot recall a single sentence verbatim, probably not even a single phrase. What you remembered is the gist of those passages -- their content, meaning, or sense -- not the language itself. Many experiments on human memory have confirmed that what we remember over the long term is the content, not the wording, of stories and conversations. Cognitive scientists model this "semantic memory" as a web of logical propositions, images, motor programs, strings of sounds, and other data structures connected to one another in the brain. 36
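The claim about semantic memory is, at bottom, a claim about data structures: what persists is a web of propositions, not strings of words. A minimal sketch, with a representation invented purely for illustration:

```python
# Toy "gist" store: a web of propositions rather than verbatim sentences.
# The triple representation is invented for this illustration.
semantic_memory = {
    ("duck", "is-a", "bird"),
    ("duck", "can", "swim"),
    ("stereotype", "can-be", "statistically accurate"),
}

def recall(subject: str):
    """Retrieve what is known about a subject -- content, not wording."""
    return [(rel, obj) for subj, rel, obj in semantic_memory if subj == subject]

print(recall("duck"))  # the gist survives; the original sentences do not
```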
A third way to put language in its place is to think about how we use it. Writing and speaking do not consist of transcribing an interior monologue onto paper or playing it into a microphone. Rather, we engage in a constant give-and-take between the thoughts we try to convey and the means our language offers to convey them. We often grope for words, are dissatisfied with what we write because it does not express what we wanted to say, or discover when every combination of words seems wrong that we do not really know what we want to say. And when we get frustrated by a mismatch between our language and our thoughts, we don't give up, defeated and mum, but change the language. We concoct neologisms (quark, meme, clone, deep structure), invent slang (to spam, to diss, to flame, to surf the web, a spin doctor), borrow useful words from other languages (joie de vivre, schlemiel, angst, machismo), or coin new metaphors (waste time, vote with your feet, push the outside of the envelope). That is why every language, far from being an immutable penitentiary, is constantly under renovation. Despite the lamentations of language lovers and the coercion of tongue troopers, languages change unstoppably as people need to talk about new things or convey new attitudes. 37
Finally, language itself could not function if it did not sit atop a vast infrastructure of tacit knowledge about the world and about the intentions of other people. When we understand language, we have to listen between the lines to winnow out the unintended readings of an ambiguous sentence, piece {211} together fractured utterances, glide over slips of the tongue, and fill in the countless unsaid steps in a complete train of thought. When the shampoo bottle says "Lather, rinse, repeat," we don't spend the rest of our lives in the shower; we infer that it means "repeat once. " And we know how to interpret ambiguous headlines such as "Kids Make Nutritious Snacks," "Prostitutes Appeal to Pope," and "British Left Waffles on Falkland Islands," because we effortlessly apply our background knowledge about the kinds of things that people are likely to convey in newspapers. Indeed, the very existence of ambiguous sentences, in which one string of words expresses two thoughts, proves that thoughts are not the same thing as strings of words.
~
Language often makes the news precisely because it can part company with thoughts and attitudes. In 1998 Bill Clinton exploited the expectations behind ordinary comprehension to mislead prosecutors about his affair with Monica Lewinsky. He used words like alone, sex, and is in senses that were technically defensible but which deviated from charitable guesses about what people ordinarily mean by these terms. For example, he suggested he was not "alone" with Lewinsky, even though they were the only two people in the room, because other people were in the Oval Office complex at the time. He said that he did not have "sex" with her, because they did not engage in intercourse. His words, like all words, are certainly vague at their boundaries. Exactly how far away or hidden must the nearest person be before one is considered alone? At what point in the continuum of bodily contact -- from an accidental brush in an elevator to tantric bliss -- do we say that sex has occurred? Ordinarily we resolve the vagueness by guessing how our conversational partner would interpret words in the context, and we choose our words accordingly. Clinton's ingenuity in manipulating these guesses, and the outrage that erupted when he was forced to explain what he had done, show that people have an acute understanding of the difference between words and the thoughts they are designed to convey.
Language conveys not just literal meanings but also a speaker's attitude. Think of the difference between fat and voluptuous, slender and scrawny, thrifty and stingy, articulate and slick. Racial epithets, which are laced with contempt, are justifiably off-limits among responsible people, because using them conveys the tacit message that contempt for the people referred to by the epithet is acceptable. But the drive to adopt new terms for disadvantaged groups goes much further than this basic sign of respect; it often assumes that words and attitudes are so inseparable that one can reengineer people's attitudes by tinkering with the words. In 1994 the Los Angeles Times adopted a style sheet that banned some 150 words, including birth defect, Canuck, Chinese fire drill, dark continent, divorcee, Dutch treat, handicapped, illegitimate, invalid, man-made, New World, stepchild, and to welsh. The editors assumed that words {212} register in the brain with their literal meanings, so that an invalid is understood as "someone who is not valid" and Dutch treat is understood as a slur on contemporary Netherlanders. (In fact, it is one of many idioms in which Dutch means "ersatz," such as Dutch oven, Dutch door, Dutch uncle, Dutch courage, and Dutch auction, the remnants of a long-forgotten rivalry between the English and the Dutch. )
But even the more reasonable attempts at linguistic reform are based on a dubious theory of linguistic determinism. Many people are puzzled by the replacement of formerly unexceptionable terms by new ones: Negro by black by African American, Spanish-American by Hispanic by Latino, crippled by handicapped by disabled by challenged, slum by ghetto by inner city by (according to the Times) slum once again. Occasionally the neologisms are defended with some rationale about their meaning. In the 1960s, the word Negro was replaced by the word black, because the parallel between the words black and white was meant to underscore the equality of the races. Similarly, Native American reminds us of who was here first and avoids the geographically inaccurate term Indian. But often the new terms replace ones that were perfectly congenial in their day, as we see in names for old institutions that are obviously sympathetic to the people being named: the United Negro College Fund, the National Association for the Advancement of Colored People, the Shriners Hospitals for Crippled Children. And sometimes a term can be tainted or unfashionable while a minor variant is fine: consider colored people versus people of color, Afro-American versus African American, Negro -- Spanish for "black" -- versus black. If anything, a respect for literal meaning should send us off looking for a new word for the descendants of Europeans, who are neither white nor Caucasian. Something else must be driving the replacement process.
Linguists are familiar with the phenomenon, which may be called the euphemism treadmill. People invent new words for emotionally charged referents, but soon the euphemism becomes tainted by association, and a new word must be found, which soon acquires its own connotations, and so on. Water closet becomes toilet (originally a term for any kind of body care, as in toilet kit and toilet water), which becomes bathroom, which becomes restroom, which becomes lavatory. Undertaker changes to mortician, which changes to funeral director. Garbage collection turns into sanitation, which turns into environmental services. Gym (from gymnasium, originally "high school") becomes physical education, which becomes (at Berkeley) human biodynamics. Even the word minority -- the most neutral label conceivable, referring only to relative numbers -- was banned in 2001 by the San Diego City Council (and nearly banned by the Boston City Council) because it was deemed disparaging to non-whites. "No matter how you slice it, minority means less than," said a semantically challenged official at Boston College, where the preferred term is AHANA (an acronym for African-American, Hispanic, Asian, and Native American). 38 {213}
The euphemism treadmill shows that concepts, not words, are primary in people's minds. Give a concept a new name, and the name becomes colored by the concept; the concept does not become freshened by the name, at least not for long. Names for minorities will continue to change as long as people have negative attitudes toward them. We will know that we have achieved mutual respect when the names stay put.
~
"Image is nothing. Thirst is everything," screams a soft-drink ad that tries to create a new image for its product by making fun of soft-drink ads that try to create images for their products. Like words, images are salient tokens of our mental lives. And like words, images are said to have an insidious power over our consciousness, presumably because they are inscribed directly onto a blank slate. In postmodernist and relativist thinking, images are held to shape our view of reality, or to be our view of reality, or to be reality itself. This is especially true of images representing celebrities, politicians, women, and AHANAs. And as with language, the scientific study of imagery shows that the fear is misplaced.
A good description of the standard view of images within cultural studies and related disciplines may be found in the Concise Glossary of Cultural Theory. It defines image as a "mental or visual representation of an object or event as depicted in the mind, a painting, a photograph, or film. " Having thus run together images in the world (such as paintings) with images in the mind, the entry lays out the centrality of images in postmodernism, cultural studies, and academic feminism.
First it notes, reasonably enough, that images can misrepresent reality and thereby serve the interests of an ideology. A racist caricature, presumably, is a prime example. But then it takes the concept further:
With what is called the "crisis of representation" brought about by. . . postmodernism, however, it is often questioned whether an image can be thought to simply represent, or misrepresent, a supposedly prior or external, image-free reality. Reality is seen rather as always subject to, or as the product of, modes of representation. In this view we inescapably inhabit a world of images or representations and not a "real world" and true or false images of it.
In other words, if a tree falls in a forest and there is no artist to paint it, not only did the tree make no sound, but it did not fall, and there was no tree there to begin with.
In a further move . . . we are thought to exist in a world of hyperreality, in which images are self-generating and entirely detached from any {214} supposed reality. This accords with a common view of contemporary entertainment and politics as being all a matter of "image," or appearance, rather than of substantial content.
Actually, the doctrine of hyperreality contradicts the common view of contemporary politics and entertainment as being a matter of image and appearance. The whole point of the common view is that there is a reality separate from images, and that is what allows us to decry the images that are misleading. We can, for example, criticize an old movie that shows slaves leading happy lives, or an ad that shows a corrupt politician pretending to defend the environment. If there were no such thing as substantial content, we would have no basis for preferring an accurate documentary about slavery to an apologia for it, or preferring a good exposé of a politician to a slick campaign ad. The entry notes that images are associated with the world of publicity, advertising, and fashion, and thereby with business and profits. An image may thus be tied to "an imposed stereotype or an alternative subjective or cultural identity. " Media images become mental images: people cannot help but think that women or politicians or African Americans conform to the depictions in movies and advertisements. And this elevates cultural studies and postmodernist art into forces for personal and political liberation:
The study of "images of women" or "women's images" sees this field as one in which stereotypes of women can be reinforced, parodied, or actively contested through critical analysis, alternative histories, or creative work in writing and the media committed to the production of positive counter-images. 39
I have not hidden my view that this entire line of thinking is a conceptual mess. If we want to understand how politicians or advertisers manipulate us, the last thing we should do is blur distinctions among things in the world, our perception of those things when they are in front of our eyes, the mental images of those things that we construct from memory, and physical images such as photographs and drawings.
As we saw at the beginning of this chapter, the visual brain is an immensely complicated system that was designed by the forces of evolution to give us an accurate reading of the consequential things in front of us. The "intelligent eye," as perceptual psychologists call it, does not just compute the shapes and motions of people before us. It also guesses their thoughts and intentions by noticing how they gaze at, approach, avoid, help, or hinder other objects and people. And these guesses are then measured against everything else we know about people -- what we infer from gossip, from a person's words and deeds, and from Sherlock Holmes-style deductions. The result is {215} the knowledge base or semantic memory that also underlies our use of language.
Physical images such as photographs and paintings are devices that reflect light in patterns similar to those coming off real objects, thereby making the visual system respond as if it were really seeing those objects. Though people have long dreamed of illusions that completely fool the brain -- Descartes's evil demon, the philosopher's thought experiment in which a person does not realize he is a brain in a vat, the science-fiction writer's prophecy of a perfect virtual reality like that of The Matrix -- in actuality the illusions foisted upon us by physical images are never more than partially effective. Our perceptual systems pick up on the imperfections of an image -- the brush strokes, pixels, or frame -- and our conceptual systems pick up on the fact that we are entertaining a hypothetical world that is separate from the real world. It's not that people invariably distinguish fiction from reality: they can lose themselves in fiction, or misremember something they read in a novel as something they read in the newspapers or that happened to a friend, or mistakenly believe that a stylized portrayal of a time and place is an accurate portrayal. But all of us are capable of distinguishing fictitious worlds from real ones, as we see when a two-year-old pretends that a banana is a telephone for the fun of it but at the same time understands that a banana is not literally a telephone. 40 Cognitive scientists believe that the ability to entertain propositions without necessarily believing them -- to distinguish "John believes there is a Santa Claus" from "There is a Santa Claus" -- is a fundamental ability of human cognition. 41 Many believe that a breakdown of this ability underlies the thought disorder in the syndrome called schizophrenia. 42
Finally, there are mental images, the visualizations of objects and scenes in the mind's eye. The psychologist Stephen Kosslyn has shown that the brain is equipped with a system capable of reactivating and manipulating memories of perceptual experience, a bit like Photoshop with its tools for assembling, rotating, and coloring images. 43 Like language, imagery may be used as a slave system -- a "visuospatial sketchpad" -- by the central executive of the brain, making it a valuable form of mental representation. We use mental imagery, for example, when we visualize how a chair might fit in a living room or whether a sweater would look good on a relative. Imagery is also an invaluable tool to novelists, who imagine scenes before describing them in words, and to scientists, who rotate molecules or play out forces and motions in their imagination.
Though mental images allow our experiences (including our experience of media images) to affect our thoughts and attitudes long after the original objects have gone, it is a mistake to think that raw images are downloaded into our minds and then constitute our mental lives. Images are not stored in the mind like snapshots in a shoebox; if they were, how could you ever find the one you want? Rather, they are labeled and linked to a vast database of {216} knowledge, which allows them to be evaluated and interpreted in terms of what they stand for. 44 Chess masters, for example, are famous for their ability to remember games in progress, but their mental images of the board are not raw photographs. Rather, they are saturated with abstract information about the game, such as which piece is threatening which other one and which clusters of pieces form viable defenses. We know this because when a chessboard is sprinkled with pieces at random, chess masters are no better at remembering the arrangement than amateurs are. 45 When images represent real people, not just chessmen, there are even more possibilities for organizing and annotating them with information about people's goals and motives -- for example, whether the person in an image is sincere or just acting.
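The chess finding is a claim about encoding: positions are stored as meaningful chunks rather than as pixel grids, so memory collapses when the structure is removed. A toy sketch of that idea follows; the chunk inventory, the piece notation, and the four-slot capacity are all invented for illustration.

```python
# Toy model of chunked encoding: a board is remembered as a short list of
# familiar patterns plus leftovers. The chunk patterns are invented.

KNOWN_CHUNKS = {
    frozenset({"Kg1", "Rf1", "Pf2", "Pg2", "Ph2"}): "castled-kingside",
    frozenset({"Pd4", "Pe4"}): "classical-center",
}

def encode(position: set[str], memory_slots: int = 4) -> list[str]:
    """Store whole chunks where possible; raw squares cost one slot each."""
    remembered, remaining = [], set(position)
    for pattern, name in KNOWN_CHUNKS.items():
        if pattern <= remaining:
            remembered.append(name)          # one slot for the whole pattern
            remaining -= pattern
    remembered += sorted(remaining)          # leftovers, one slot apiece
    return remembered[:memory_slots]         # limited capacity

game_position = {"Kg1", "Rf1", "Pf2", "Pg2", "Ph2", "Pd4", "Pe4", "Nc3"}
random_position = {"Ka5", "Rb7", "Pc2", "Ph3", "Nd6", "Qf4", "Bg7", "Pe6"}
print(encode(game_position))    # chunks cover most of the board
print(encode(random_position))  # no chunks apply; most pieces are lost
```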
The reason that images cannot constitute the contents of our thoughts is that images, like words, are inherently ambiguous. An image of Lassie could stand for Lassie, collies, dogs, animals, television stars, or family values. Some other, more abstract form of information must pick out the concept that an image is taken to exemplify. Or consider the sentence Yesterday my uncle fired his lawyer (an example suggested by Dan Dennett). When understanding the sentence, Brad might visualize his own ordeals of the day before and glimpse the "uncle" slot in a family tree, then picture courthouse steps and an angry man. Irene might have no image for "yesterday" but might visualize her uncle Bob's face, a slamming door, and a power-suited woman. Yet despite these very different image sequences, both people have understood the sentence in the same way, as we could see by questioning them or asking them to paraphrase the sentence. "Imagery couldn't be the key to comprehension," Dennett points out, "because you can't draw a picture of an uncle, or of yesterday, or firing, or a lawyer. Uncles, unlike clowns and firemen, don't look different in any characteristic way that can be visually represented, and yesterdays don't look like anything at all. "46 Since images are interpreted in the context of a deeper understanding of people and their relationships, the "crisis of representation," with its paranoia about the manipulation of our mind by media images, is overblown. People are not helplessly programmed with images; they can evaluate and interpret what they see using everything else they know, such as the credibility and motives of the source.
The postmodernist equating of images with thoughts has not only made a hash of several scholarly disciplines but has laid waste to the world of contemporary art. If images are the disease, the reasoning goes, then art is the cure. Artists can neutralize the power of media images by distorting them or reproducing them in odd contexts (like the ad parodies in Mad magazine or on Saturday Night Live, only not funny). Anyone familiar with contemporary art has seen the countless works in which stereotypes of women, minorities, or gay {217} people are "reinforced, parodied, or actively contested. " A prototypical example is a 1994 exhibit at the Whitney Museum in New York called "Black Male: Representations of Masculinity in Contemporary Art. " It aimed to take apart the way that African American men are culturally constructed in demonizing and marginalizing visual stereotypes such as the sex symbol, the athlete, the Sambo, and the photograph in a Wanted poster. According to the catalogue essay, "The real struggle is over the power to control images. " The art critic Adam Gopnik (whose mother and sister are cognitive scientists) called attention to the simplistic theory of cognition behind this tedious formula:
The show is intended to be socially therapeutic: its aim is to make you face the socially constructed images of black men, so that by confronting them -- or, rather, seeing artists confront them on your behalf -- you can make them go away. The trouble is that the entire enterprise of "disassembling social images" rests on an ambiguity in the way we use the word "image. " Mental images are not really images at all, but instead consist of complicated opinions, positions, doubts, and passionately held convictions, rooted in experience and amendable by argument, by more experience, or by coercion. Our mental images of black men, white judges, the press, and so on do not take the form of pictures of the kind that you can hang up (or "deconstruct") on a museum wall. . . . Hitler did not hate Jews because there were pictures of swarthy Semites with big noses imprinted on his cerebellum; racism does not exist in America because the picture of O. J. Simpson on the cover of Time is too dark. The view that visual cliches shape beliefs is both too pessimistic, in that it supposes that people are helplessly imprisoned by received stereotypes, and too optimistic, in that it supposes that if you could change the images you could change the beliefs. 47
Recognizing that we are equipped with sophisticated faculties that keep us in touch with reality does not entail ignoring the ways in which our faculties can be turned against us. People lie, sometimes baldly, sometimes through insinuation and presupposition (as in the question "When did you stop beating your wife? "). People disseminate disinformation about ethnic groups, not just pejorative stereotypes but tales of exploitation and perfidy that serve to stoke moralistic outrage against them. People try to manipulate social realities like status (which exist in the mind of the beholder) to make themselves look good or to sell products.
But we can best protect ourselves against such manipulation by pinpointing the vulnerabilities of our faculties of categorization, language, and imagery, not by denying their complexity. The view that humans are passive {218} receptacles of stereotypes, words, and images is condescending to ordinary people and gives unearned importance to the pretensions of cultural and academic elites. And exotic pronouncements about the limitations of our faculties, such as that there is nothing outside the text or that we inhabit a world of images rather than a real world, make it impossible even to identify lies and misrepresentations, let alone to understand how they are promulgated.
{219} Chapter 13
Out of Our Depths
A man has got to know his limitations. -- Clint Eastwood in Magnum Force
Most people are familiar with the idea that some of our ordeals come from a mismatch between the source of our passions in evolutionary history and the goals we set for ourselves today. People gorge themselves in anticipation of a famine that never comes, engage in dangerous liaisons that conceive babies they don't want, and rev up their bodies in response to stressors from which they cannot run away.
What is true for the emotions may also be true for the intellect. Some of our perplexities may come from a mismatch between the purposes for which our cognitive faculties evolved and the purposes to which we put them today. This is obvious enough when it comes to raw data processing. People do not try to multiply six-digit numbers in their heads or remember the phone number of everyone they meet, because they know their minds were not designed for the job. But it is not as obvious when it comes to the way we conceptualize the world. Our minds keep us in touch with aspects of reality -- such as objects, animals, and people -- that our ancestors dealt with for millions of years. But as science and technology open up new and hidden worlds, our untutored intuitions may find themselves at sea.
What are these intuitions? Many cognitive scientists believe that human reasoning is not accomplished by a single, general-purpose computer in the head. The world is a heterogeneous place, and we are equipped with different kinds of intuitions and logics, each appropriate to one department of reality. These ways of knowing have been called systems, modules, stances, faculties, mental organs, multiple intelligences, and reasoning engines. 1 They emerge early in life, are present in every normal person, and appear to be computed in partly distinct sets of networks in the brain. They may be installed by different {220} combinations of genes, or they may emerge when brain tissue self- organizes in response to different problems to be solved and different patterns in the sensory input. Most likely they develop by some combination of these forces.
What makes our reasoning faculties different from the departments in a university is that they are not just broad areas of knowledge, analyzed with whatever tools work best. Each faculty is based on a core intuition that was suitable for analyzing the world in which we evolved. Though cognitive scientists have not agreed on a Gray's Anatomy of the mind, here is a tentative but defensible list of cognitive faculties and the core intuitions on which they are based:
• An intuitive physics, which we use to keep track of how objects fall, bounce, and bend. Its core intuition is the concept of the object, which occupies one place, exists for a continuous span of time, and follows laws of motion and force. These are not Newton's laws but something closer to the medieval conception of impetus, an "oomph" that keeps an object in motion and gradually dissipates. 2
• An intuitive version of biology or natural history, which we use to understand the living world. Its core intuition is that living things house a hidden essence that gives them their form and powers and drives their growth and bodily functions. 3
• An intuitive engineering, which we use to make and understand tools and other artifacts. Its core intuition is that a tool is an object with a purpose -- an object designed by a person to achieve a goal.
