Steven-Pinker-The-Blank-Slate 1
Rather, we engage in a constant give-and-take between the thoughts we try to convey and the means our language offers to convey them.
We often grope for words, are dissatisfied with what we write because it does not express what we wanted to say, or discover when every combination of words seems wrong that we do not really know what we want to say.
And when we get frustrated by a mismatch between our language and our thoughts, we don't give up, defeated and mum, but change the language.
We concoct neologisms (quark, meme, clone, deep structure), invent slang (to spam, to diss, to flame, to surf the web, a spin doctor), borrow useful words from other languages (joie de vivre, schlemiel, angst, machismo), or coin new metaphors (waste time, vote with your feet, push the outside of the envelope).
That is why every language, far from being an immutable penitentiary, is constantly under renovation.
Despite the lamentations of language lovers and the coercion of tongue troopers, languages change unstoppably as people need to talk about new things or convey new attitudes. 37
Finally, language itself could not function if it did not sit atop a vast infrastructure of tacit knowledge about the world and about the intentions of other people. When we understand language, we have to listen between the lines to winnow out the unintended readings of an ambiguous sentence, piece {211} together fractured utterances, glide over slips of the tongue, and fill in the countless unsaid steps in a complete train of thought. When the shampoo bottle says "Lather, rinse, repeat," we don't spend the rest of our lives in the shower; we infer that it means "repeat once. " And we know how to interpret ambiguous headlines such as "Kids Make Nutritious Snacks," "Prostitutes Appeal to Pope," and "British Left Waffles on Falkland Islands," because we effortlessly apply our background knowledge about the kinds of things that people are likely to convey in newspapers. Indeed, the very existence of ambiguous sentences, in which one string of words expresses two thoughts, proves that thoughts are not the same thing as strings of words.
~
Language often makes the news precisely because it can part company with thoughts and attitudes. In 1998 Bill Clinton exploited the expectations behind ordinary comprehension to mislead prosecutors about his affair with Monica Lewinsky. He used words like alone, sex, and is in senses that were technically defensible but which deviated from charitable guesses about what people ordinarily mean by these terms. For example, he suggested he was not "alone" with Lewinsky, even though they were the only two people in the room, because other people were in the Oval Office complex at the time. He said that he did not have "sex" with her, because they did not engage in intercourse. His words, like all words, are certainly vague at their boundaries. Exactly how far away or hidden must the nearest person be before one is considered alone? At what point in the continuum of bodily contact -- from an accidental brush in an elevator to tantric bliss -- do we say that sex has occurred? Ordinarily we resolve the vagueness by guessing how our conversational partner would interpret words in the context, and we choose our words accordingly. Clinton's ingenuity in manipulating these guesses, and the outrage that erupted when he was forced to explain what he had done, show that people have an acute understanding of the difference between words and the thoughts they are designed to convey.
Language conveys not just literal meanings but also a speaker's attitude. Think of the difference between fat and voluptuous, slender and scrawny, thrifty and stingy, articulate and slick. Racial epithets, which are laced with contempt, are justifiably off-limits among responsible people, because using them conveys the tacit message that contempt for the people referred to by the epithet is acceptable. But the drive to adopt new terms for disadvantaged groups goes much further than this basic sign of respect; it often assumes that words and attitudes are so inseparable that one can reengineer people's attitudes by tinkering with the words. In 1994 the Los Angeles Times adopted a style sheet that banned some 150 words, including birth defect, Canuck, Chinese fire drill, dark continent, divorcee, Dutch treat, handicapped, illegitimate, invalid, man-made, New World, stepchild, and to welsh. The editors assumed that words {212} register in the brain with their literal meanings, so that an invalid is understood as "someone who is not valid" and Dutch treat is understood as a slur on contemporary Netherlanders. (In fact, it is one of many idioms in which Dutch means "ersatz," such as Dutch oven, Dutch door, Dutch uncle, Dutch courage, and Dutch auction,
the remnants of a long-forgotten rivalry between the English and the Dutch.)
But even the more reasonable attempts at linguistic reform are based on a dubious theory of linguistic determinism. Many people are puzzled by the replacement of formerly unexceptionable terms by new ones: Negro by black by African American, Spanish-American by Hispanic by Latino, crippled by handicapped by disabled by challenged, slum by ghetto by inner city by (according to the Times) slum once again. Occasionally the neologisms are defended with some rationale about their meaning. In the 1960s, the word Negro was replaced by the word black, because the parallel between the words black and white was meant to underscore the equality of the races. Similarly, Native American reminds us of who was here first and avoids the geographically inaccurate term Indian. But often the new terms replace ones that were perfectly congenial in their day, as we see in names for old institutions that are obviously sympathetic to the people being named: the United Negro College Fund, the National Association for the Advancement of Colored People, the Shriners Hospitals for Crippled Children. And sometimes a term can be tainted or unfashionable while a minor variant is fine: consider colored people versus people of color, Afro-American versus African American, Negro -- Spanish for "black" -- versus black. If anything, a respect for literal meaning should send us off looking for a new word for the descendants of Europeans, who are neither white nor Caucasian. Something else must be driving the replacement process.
Linguists are familiar with the phenomenon, which may be called the euphemism treadmill. People invent new words for emotionally charged referents, but soon the euphemism becomes tainted by association, and a new word must be found, which soon acquires its own connotations, and so on. Water closet becomes toilet (originally a term for any kind of body care, as in toilet kit and toilet water), which becomes bathroom, which becomes restroom, which becomes lavatory. Undertaker changes to mortician, which changes to funeral director. Garbage collection turns into sanitation, which turns into environmental services. Gym (from gymnasium, originally "high school") becomes physical education, which becomes (at Berkeley) human biodynamics. Even the word minority -- the most neutral label conceivable, referring only to relative numbers -- was banned in 2001 by the San Diego City Council (and nearly banned by the Boston City Council) because it was deemed disparaging to non-whites. "No matter how you slice it, minority means less than," said a semantically challenged official at Boston College, where the preferred term is AHANA (an acronym for African-American, Hispanic, Asian, and Native American). 38 {213}
The euphemism treadmill shows that concepts, not words, are primary in people's minds. Give a concept a new name, and the name becomes colored by the concept; the concept does not become freshened by the name, at least not for long. Names for minorities will continue to change as long as people have negative attitudes toward them. We will know that we have achieved mutual respect when the names stay put.
~
"Image is nothing. Thirst is everything," screams a soft-drink ad that tries to create a new image for its product by making fun of soft-drink ads that try to create images for their products. Like words, images are salient tokens of our mental lives. And like words, images are said to have an insidious power over our consciousness, presumably because they are inscribed directly onto a blank slate. In postmodernist and relativist thinking, images are held to shape our view of reality, or to be our view of reality, or to be reality itself. This is especially true of images representing celebrities, politicians, women, and AHANAs. And as with language, the scientific study of imagery shows that the fear is misplaced.
A good description of the standard view of images within cultural studies and related disciplines may be found in the Concise Glossary of Cultural Theory. It defines image as a "mental or visual representation of an object or event as depicted in the mind, a painting, a photograph, or film. " Having thus run together images in the world (such as paintings) with images in the mind, the entry lays out the centrality of images in postmodernism, cultural studies, and academic feminism.
First it notes, reasonably enough, that images can misrepresent reality and thereby serve the interests of an ideology. A racist caricature, presumably, is a prime example. But then it takes the concept further:
With what is called the "crisis of representation" brought about by. . . postmodernism, however, it is often questioned whether an image can be thought to simply represent, or misrepresent, a supposedly prior or external, image-free reality. Reality is seen rather as always subject to, or as the product of, modes of representation. In this view we inescapably inhabit a world of images or representations and not a "real world" and true or false images of it.
In other words, if a tree falls in a forest and there is no artist to paint it, not only did the tree make no sound, but it did not fall, and there was no tree there to begin with.
In a further move . . . we are thought to exist in a world of hyperreality, in which images are self-generating and entirely detached from any {214} supposed reality. This accords with a common view of contemporary entertainment and politics as being all a matter of "image," or appearance, rather than of substantial content.
Actually, the doctrine of hyperreality contradicts the common view of contemporary politics and entertainment as being a matter of image and appearance. The whole point of the common view is that there is a reality separate from images, and that is what allows us to decry the images that are misleading. We can, for example, criticize an old movie that shows slaves leading happy lives, or an ad that shows a corrupt politician pretending to defend the environment. If there were no such thing as substantial content, we would have no basis for preferring an accurate documentary about slavery to an apologia for it, or preferring a good exposé of a politician to a slick campaign ad. The entry notes that images are associated with the world of publicity, advertising, and fashion, and thereby with business and profits. An image may thus be tied to "an imposed stereotype or an alternative subjective or cultural identity. " Media images become mental images: people cannot help but think that women or politicians or African Americans conform to the depictions in movies and advertisements. And this elevates cultural studies and postmodernist art into forces for personal and political liberation:
The study of "images of women" or "women's images" sees this field as one in which stereotypes of women can be reinforced, parodied, or actively contested through critical analysis, alternative histories, or creative work in writing and the media committed to the production of positive counter-images. 39
I have not hidden my view that this entire line of thinking is a conceptual mess. If we want to understand how politicians or advertisers manipulate us, the last thing we should do is blur distinctions among things in the world, our perception of those things when they are in front of our eyes, the mental images of those things that we construct from memory, and physical images such as photographs and drawings.
As we saw at the beginning of this chapter, the visual brain is an immensely complicated system that was designed by the forces of evolution to give us an accurate reading of the consequential things in front of us. The "intelligent eye," as perceptual psychologists call it, does not just compute the shapes and motions of people before us. It also guesses their thoughts and intentions by noticing how they gaze at, approach, avoid, help, or hinder other objects and people. And these guesses are then measured against everything else we know about people -- what we infer from gossip, from a person's words and deeds, and from Sherlock Holmes-style deductions. The result is {215} the knowledge base or semantic memory that also underlies our use of language.
Physical images such as photographs and paintings are devices that reflect light in patterns similar to those coming off real objects, thereby making the visual system respond as if it were really seeing those objects. Though people have long dreamed of illusions that completely fool the brain -- Descartes's evil demon, the philosopher's thought experiment in which a person does not realize he is a brain in a vat, the science-fiction writer's prophecy of perfect virtual reality like in The Matrix -- in actuality the illusions foisted upon us by physical images are never more than partially effective. Our perceptual systems pick up on the imperfections of an image -- the brush strokes, pixels, or frame -- and our conceptual systems pick up on the fact that we are entertaining a hypothetical world that is separate from the real world. It's not that people invariably distinguish fiction from reality: they can lose themselves in fiction, or misremember something they read in a novel as something they read in the newspapers or that happened to a friend, or mistakenly believe that a stylized portrayal of a time and place is an accurate portrayal. But all of us are capable of distinguishing fictitious worlds from real ones, as we see when a two-year-old pretends that a banana is a telephone for the fun of it but at the same time understands that a banana is not literally a telephone. 40 Cognitive scientists believe that the ability to entertain propositions without necessarily believing them -- to distinguish "John believes there is a Santa Claus" from "There is a Santa Claus" -- is a fundamental ability of human cognition. 41 Many believe that a breakdown of this ability underlies the thought disorder in the syndrome called schizophrenia. 42 Finally, there are mental images, the visualizations of objects and scenes in the mind's eye. 
The psychologist Stephen Kosslyn has shown that the brain is equipped with a system capable of reactivating and manipulating memories of perceptual experience, a bit like Photoshop with its tools for assembling, rotating, and coloring images. 43 Like language, imagery may be used as a slave system -- a "visuospatial sketchpad" -- by the central executive of the brain, making it a valuable form of mental representation. We use mental imagery, for example, when we visualize how a chair might fit in a living room or whether a sweater would look good on a relative. Imagery is also an invaluable tool to novelists, who imagine scenes before describing them in words, and to scientists, who rotate molecules or play out forces and motions in their imagination.
Though mental images allow our experiences (including our experience of media images) to affect our thoughts and attitudes long after the original objects have gone, it is a mistake to think that raw images are downloaded into our minds and then constitute our mental lives. Images are not stored in the mind like snapshots in a shoebox; if they
were, how could you ever find the one you want? Rather, they are labeled and linked to a vast database of {216} knowledge, which allows them to be evaluated and interpreted in terms of what they stand for. 44 Chess masters, for
example, are famous for their ability to remember games in progress, but their mental images of the board are not raw photographs. Rather, they are saturated with abstract information about the game, such as which piece is threatening which other one and which clusters of pieces form viable defenses. We know this because when a chessboard is sprinkled with pieces at random, chess masters are no better at remembering the arrangement than amateurs are. 45 When images represent real people, not just chessmen, there are even more possibilities for organizing and annotating them with information about people's goals and motives -- for example, whether the person in an image is sincere or just acting.
The reason that images cannot constitute the contents of our thoughts is that images, like words, are inherently ambiguous. An image of Lassie could stand for Lassie, collies, dogs, animals, television stars, or family values. Some other, more abstract form of information must pick out the concept that an image is taken to exemplify. Or consider the sentence Yesterday my uncle fired his lawyer (an example suggested by Dan Dennett). When understanding the sentence, Brad might visualize his own ordeals of the day before and glimpse the "uncle" slot in a family tree, then picture courthouse steps and an angry man. Irene might have no image for "yesterday" but might visualize her uncle Bob's face, a slamming door, and a power-suited woman. Yet despite these very different image sequences, both people have understood the sentence in the same way, as we could see by questioning them or asking them to paraphrase the sentence. "Imagery couldn't be the key to comprehension," Dennett points out, "because you can't draw a picture of an uncle, or of yesterday, or firing, or a lawyer. Uncles, unlike clowns and firemen, don't look different in any characteristic way that can be visually represented, and yesterdays don't look like anything at all. "46 Since images are interpreted in the context of a deeper understanding of people and their relationships, the "crisis of representation," with its paranoia about the manipulation of our mind by media images, is overblown. People are not helplessly programmed with images; they can evaluate and interpret what they see using everything else they know, such as the credibility and motives of the source.
The postmodernist equating of images with thoughts has not only made a hash of several scholarly disciplines but has laid waste to the world of contemporary art. If images are the disease, the reasoning goes, then art is the cure. Artists can neutralize the power of media images by distorting them or reproducing them in odd contexts (like the ad parodies in Mad magazine or on Saturday Night Live, only not funny). Anyone familiar with contemporary art has seen the countless works in which stereotypes of women, minorities, or gay {217} people are "reinforced, parodied, or actively contested. " A prototypical example is a 1994 exhibit at the Whitney Museum in New York called "Black Male: Representations of Masculinity in Contemporary Art. " It aimed to take apart the way that African American men are culturally constructed in demonizing and marginalizing visual stereotypes such as the sex symbol, the athlete, the Sambo, and the photograph in a Wanted poster. According to the catalogue essay, "The real struggle is over the power to control images. " The art critic Adam Gopnik (whose mother and sister are cognitive scientists) called attention to the simplistic theory of cognition behind this tedious formula:
The show is intended to be socially therapeutic: its aim is to make you face the socially constructed images of black men, so that by confronting them -- or, rather, seeing artists confront them on your behalf -- you can make them go away. The trouble is that the entire enterprise of "disassembling social images" rests on an ambiguity in the way we use the word "image. " Mental images are not really images at all, but instead consist of complicated opinions, positions, doubts, and passionately held convictions, rooted in experience and amendable by argument, by more experience, or by coercion. Our mental images of black men, white judges, the press, and so on do not take the form of pictures of the kind that you can hang up (or "deconstruct") on a museum wall. . . . Hitler did not hate Jews because there were pictures of swarthy Semites with big noses imprinted on his cerebellum; racism does not exist in America because the picture of O. J. Simpson on the cover of Time is too dark. The view that visual cliches shape beliefs is both too pessimistic, in that it supposes that people are helplessly imprisoned by received stereotypes, and too optimistic, in that it supposes that if you could change the images you could change the beliefs. 47
Recognizing that we are equipped with sophisticated faculties that keep us in touch with reality does not entail ignoring the ways in which our faculties can be turned against us. People lie, sometimes baldly, sometimes through insinuation and presupposition (as in the question "When did you stop beating your wife? "). People disseminate disinformation about ethnic groups, not just pejorative stereotypes but tales of exploitation and perfidy that serve to stoke moralistic outrage against them. People try to manipulate social realities like status (which exist in the mind of the beholder) to make themselves look good or to sell products.
But we can best protect ourselves against such manipulation by pinpointing the vulnerabilities of our faculties of
categorization, language, and imagery, not by denying their complexity. The view that humans are passive {218} receptacles of stereotypes, words, and images is condescending to ordinary people and gives unearned importance to the pretensions of cultural and academic elites. And exotic pronouncements about the limitations of our faculties, such as that there is nothing outside the text or that we inhabit a world of images rather than a real world, make it impossible even to identify lies and misrepresentations, let alone to understand how they are promulgated.
{219} Chapter 13
Out of Our Depths
A man has got to know his limitations. -- Clint Eastwood in Magnum Force
Most people are familiar with the idea that some of our ordeals come from a mismatch between the source of our passions in evolutionary history and the goals we set for ourselves today. People gorge themselves in anticipation of a famine that never comes, engage in dangerous liaisons that conceive babies they don't want, and rev up their bodies in response to stressors from which they cannot run away.
What is true for the emotions may also be true for the intellect. Some of our perplexities may come from a mismatch between the purposes for which our cognitive faculties evolved and the purposes to which we put them today. This is obvious enough when it comes to raw data processing. People do not try to multiply six-digit numbers in their heads or remember the phone number of everyone they meet, because they know their minds were not designed for the job. But it is not as obvious when it comes to the way we conceptualize the world. Our minds keep us in touch with aspects of reality -- such as objects, animals, and people -- that our ancestors dealt with for millions of years. But as science and technology open up new and hidden worlds, our untutored intuitions may find themselves at sea.
What are these intuitions? Many cognitive scientists believe that human reasoning is not accomplished by a single, general-purpose computer in the head. The world is a heterogeneous place, and we are equipped with different kinds of intuitions and logics, each appropriate to one department of reality. These ways of knowing have been called systems, modules, stances, faculties, mental organs, multiple intelligences, and reasoning engines. 1 They emerge early in life, are present in every normal person, and appear to be computed in partly distinct sets of networks in the brain. They may be installed by different {220} combinations of genes, or they may emerge when brain tissue self- organizes in response to different problems to be solved and different patterns in the sensory input. Most likely they develop by some combination of these forces.
What makes our reasoning faculties different from the departments in a university is that they are not just broad areas of knowledge, analyzed with whatever tools work best. Each faculty is based on a core intuition that was suitable for analyzing the world in which we evolved. Though cognitive scientists have not agreed on a Gray's Anatomy of the mind, here is a tentative but defensible list of cognitive faculties and the core intuitions on which they are based:
• An intuitive physics, which we use to keep track of how objects fall, bounce, and bend. Its core intuition is the concept of the object, which occupies one place, exists for a continuous span of time, and follows laws of motion and force. These are not Newton's laws but something closer to the medieval conception of impetus, an "oomph" that keeps an object in motion and gradually dissipates. 2
• An intuitive version of biology or natural history, which we use to understand the living world. Its core intuition is that living things house a hidden essence that gives them their form and powers and drives their growth and bodily functions. 3
• An intuitive engineering, which we use to make and understand tools and other artifacts. Its core intuition is that a tool is an object with a purpose -- an object designed by a person to achieve a goal. 4
• An intuitive psychology, which we use to understand other people. Its core intuition is that other people are not objects or machines but are animated by the invisible entity we call the mind or the soul. Minds contain beliefs and desires and are the immediate cause of behavior.
• A spatial sense, which we use to navigate the world and keep track of where things are. It is based on a dead reckoner, which updates coordinates of the body's location as it moves and turns, and a network of mental maps. Each map is organized by a different reference frame: the eyes, the head, the body, or salient objects and places in the world. 5
• A number sense, which we use to think about quantities and amounts. It is based on an ability to register exact quantities for small numbers of objects (one, two, and three) and to make rough relative estimates for larger numbers. 6
• A sense of probability, which we use to reason about the likelihood of uncertain events. It is based on the ability to track the relative frequencies of events, that is, the proportion of events of some kind that turn out one way or the other. 7 {221}
• An intuitive economics, which we use to exchange goods and favors. It is based on the concept of reciprocal exchange, in which one party confers a benefit on another and is entitled to an equivalent benefit in return.
• A mental database and logic, which we use to represent ideas and to infer new ideas from old ones. It is based on assertions about what's what, what's where, or who did what to whom, when, where, and why. The assertions are linked in a mind-wide web and can be recombined with logical and causal operators such as and, or, not, all, some, necessary, possible, and cause. 8
• Language, which we use to share the ideas from our mental logic. It is based on a mental dictionary of memorized words and a mental grammar of combinatorial rules. The rules organize vowels and consonants into words, words into bigger words and phrases, and phrases into sentences, in such a way that the meaning of the combination can be computed from the meanings of the parts and the way they are arranged. 9
The mind also has components for which it is hard to tell where cognition leaves off and emotion begins. These include a system for assessing danger, coupled with the emotion called fear, a system for assessing contamination, coupled with the emotion called disgust, and a moral sense, which is complex enough to deserve a chapter of its own. These ways of knowing and core intuitions are suitable for the lifestyle of small groups of illiterate, stateless people who live off the land, survive by their wits, and depend on what they can carry. Our ancestors left this lifestyle for a settled existence only a few millennia ago, too recently for evolution to have done much, if anything, to our brains. Conspicuous by their absence are faculties suited to the stunning new understanding of the world wrought by science and technology. For many domains of knowledge, the mind could not have evolved dedicated machinery, the brain and genome show no hints of specialization, and people show no spontaneous intuitive understanding either in the crib or afterward. They include modern physics, cosmology, genetics, evolution, neuroscience, embryology, economics, and mathematics.
It's not just that we have to go to school or read books to learn these subjects. It's that we have no mental tools to grasp them intuitively. We depend on analogies that press an old mental faculty into service, or on jerry-built mental contraptions that wire together bits and pieces of other faculties. Understanding in these domains is likely to be uneven, shallow, and contaminated by primitive intuitions. And that can shape debates in the border disputes in which science and technology make contact with everyday life. The point of this chapter is that together with all the moral, empirical, and political factors that go into these debates, we should add the cognitive factors: the way our
{222} minds naturally frame issues. Our own cognitive makeup is a missing piece of many puzzles, including education, bioethics, food safety, economics, and human understanding itself.
~
The most obvious arena in which we confront native ways of thinking is the schoolhouse. Any theory of education must be based on a theory of human nature, and in the twentieth century that theory was often the Blank Slate or the Noble Savage.
Traditional education is based in large part on the Blank Slate: children come to school empty and have knowledge deposited in them, to be reproduced later on tests. (Critics of traditional education call this the "savings and loan" model. ) The Blank Slate also underlies the common philosophy that the early school-age years are an opportunity zone in which social values are shaped for life. Many schools today use the early grades to instill desirable attitudes toward the environment, gender, sexuality, and ethnic diversity.
Progressive educational practice, for its part, is based on the Noble Savage. As A. S. Neill wrote in his influential book Summerhill, "A child is innately wise and realistic. If left to himself without adult suggestion of any kind, he will develop as far as he is capable of developing. "10 Neill and other progressive theorists of the 1960s and 1970s argued that schools should do away with examinations, grades, curricula, and even books. Though few schools went that far, the movement left a mark on educational practice. In the method of reading instruction known as Whole Language, children are not taught which letter goes with which sound but are immersed in a book-rich environment where reading skills are expected to blossom spontaneously. 11 In the philosophy of mathematics instruction known as constructivism, children are not drilled with arithmetic tables but are enjoined to rediscover mathematical truths themselves by solving problems in groups. 12 Both methods fare badly when students' learning is assessed objectively,
but advocates of the methods tend to disdain standardized testing.
An understanding of the mind as a complex system shaped by evolution runs against these philosophies. The alternative has emerged from the work of cognitive scientists such as Susan Carey, Howard Gardner, and David Geary.13 Education is neither writing on a blank slate nor allowing the child's nobility to come into flower. Rather, education is a technology that tries to make up for what the human mind is innately bad at. Children don't have to go to school to learn to walk, talk, recognize objects, or remember the personalities of their friends, even though these tasks are much harder than reading, adding, or remembering dates in history. They do have to go to school to learn written language, arithmetic, and science, because those bodies of knowledge and skill were invented too recently for any species-wide knack for them to have evolved. {223}
Far from being empty receptacles or universal learners, then, children are equipped with a toolbox of implements for reasoning and learning in particular ways, and those implements must be cleverly recruited to master problems for which they were not designed. That requires not just inserting new facts and skills in children's minds but debugging and disabling old ones. Students cannot learn Newtonian physics until they unlearn their intuitive impetus-based physics.14 They cannot learn modern biology until they unlearn their intuitive biology, which thinks in terms of vital essences. And they cannot learn evolution until they unlearn their intuitive engineering, which attributes design to the intentions of a designer.15
Schooling also requires pupils to expose and reinforce skills that are ordinarily buried in unconscious black boxes. When children learn to read, the vowels and consonants that are seamlessly woven together in speech must be forced into children's awareness before they can associate them with squiggles on a page.16 Effective education may also require co-opting old faculties to deal with new demands. Snatches of language can be pressed into service to do calculation, as when we recall the stanza "Five times five is twenty-five."17 The logic of grammar can be used to grasp large numbers: the expression four thousand three hundred and fifty-seven has the grammatical structure of an English noun phrase like hat, coat, and mittens. When a student parses the number phrase she can call to mind the mental operation of aggregation, which is related to the mathematical operation of addition.18 Spatial cognition is drafted into understanding mathematical relationships through the use of graphs, which turn data or equations into shapes.19 Intuitive engineering supports the learning of anatomy and physiology (organs are understood as gadgets with functions), and intuitive physics supports the learning of chemistry and biology (stuff, including living stuff, is made out of tiny, bouncy, sticky objects).20
Geary points out a final implication. Because much of the content of education is not cognitively natural, the process of mastering it may not always be easy and pleasant, notwithstanding the mantra that learning is fun. Children may be innately motivated to make friends, acquire status, hone motor skills, and explore the physical world, but they are not necessarily motivated to adapt their cognitive faculties to unnatural tasks like formal mathematics. A family, peer group, and culture that ascribe high status to school achievement may be needed to give a child the motive to persevere toward effortful feats of learning whose rewards are apparent only over the long term.21
~
The layperson's intuitive psychology or "theory of mind" is one of the brain's most striking abilities. We do not treat other people as wind-up dolls but think of them as being animated by minds: nonphysical entities we cannot see or touch but that are as real to us as bodies and objects. Aside from {224} allowing us to predict people's behavior from their beliefs and desires, our theory of mind is tied to our ability to empathize and to our conception of life and death. The difference between a dead body and a living one is that a dead body no longer contains the vital force we call a mind. Our theory of mind is the source of the concept of the soul. The ghost in the machine is deeply rooted in our way of thinking about people.
A belief in the soul, in turn, meshes with our moral convictions. The core of morality is the recognition that others have interests as we do -- that they "feel want, taste grief, need friends," as Shakespeare put it -- and therefore that they have a right to life, liberty, and the pursuit of their interests. But who are those "others"? We need a boundary that allows us to be callous to rocks and plants but forces us to treat other humans as "persons" that possess inalienable rights. Otherwise, it seems, we would place ourselves on a slippery slope that ends in the disposal of inconvenient people or in grotesque deliberations on the value of individual lives. As Pope John Paul II pointed out, the notion that every human carries infinite value by virtue of possessing a soul would seem to give us that boundary. Until recently the intuitive concept of the soul served us pretty well. Living people had souls, which come into existence at the moment of conception and leave their bodies when they die. Animals, plants, and inanimate objects do not have souls at all. But science is showing that what we call the soul -- the locus of sentience, reason, and will -- consists of the information-processing activity of the brain, an organ governed by the laws of biology. In an individual person it comes into existence gradually through the differentiation of tissues growing from a single cell.
In the species it came into existence gradually as the forces of evolution modified the brains of simpler animals. And though our concept of souls used to fit pretty well with natural phenomena -- a woman was either pregnant or not, a person was either dead or alive -- biomedical research is now presenting us with cases where the two are out of register. These cases are not just scientific curiosities but are intertwined with pressing issues such as contraception, abortion, infanticide, animal rights, cloning, euthanasia, and research involving human embryos, especially the harvesting of stem cells.
In the face of these difficult choices it is tempting to look to biology to find or ratify boundaries such as "when life begins." But that only highlights the clash between two incommensurable ways of conceiving life and mind. The intuitive and morally useful concept of an immaterial spirit simply cannot be reconciled with the scientific concept of brain activity emerging gradually in ontogeny and phylogeny. No matter where we try to draw the line between life and nonlife, or between mind and nonmind, ambiguous cases pop up to challenge our moral intuitions.
The closest event we can find to a thunderclap marking the entry of a soul {225} into the world is the moment of conception. At that instant a new human genome is determined, and we have an entity destined to develop into a unique individual. The Catholic Church and certain other Christian denominations designate conception as the moment of ensoulment and the beginning of life (which, of course, makes abortion a form of murder). But just as a microscope reveals that a straight edge is really ragged, research on human reproduction shows that the "moment of conception" is not a moment at all. Sometimes several sperm penetrate the outer membrane of the egg, and it takes time for the egg to eject the extra chromosomes. What and where is the soul during this interval? Even when a single sperm enters, its genes remain separate from those of the egg for a day or more, and it takes yet another day or so for the newly merged genome to control the cell. So the "moment" of conception is in fact a span of twenty-four to forty-eight hours.22 Nor is the conceptus destined to become a baby. Between two-thirds and three-quarters of them never implant in the uterus and are spontaneously aborted, some because they are genetically defective, others for no discernible reason.
Still, one might say that at whatever point during this interlude the new genome is formed, the specification of a unique new person has come into existence. The soul, by this reasoning, may be identified with the genome. But during the next few days, as the embryo's cells begin to divide, they can split into several embryos, which develop into identical twins, triplets, and so on. Do identical twins share a soul? Did the Dionne quintuplets make do with one-fifth of a soul each? If not, where did the four extra souls come from? Indeed, every cell in the growing embryo is capable, with the right manipulations, of becoming a new embryo that can grow into a child. Does a multicell embryo consist of one soul per cell, and if so, where do the other souls go when the cells lose that ability? And not only can one embryo become two people, but two embryos can become one person. Occasionally two fertilized eggs, which ordinarily would go on to become fraternal twins, merge into a single embryo that develops into a person who is a genetic chimera: some of her cells have one genome, others have another genome. Does her body house two souls? For that matter, if human cloning ever became possible (and there appears to be no technical obstacle), every cell in a person's body would have the special ability that is supposedly unique to a conceptus, namely developing into a human being. True, the genes in a cheek cell can become a person only with unnatural intervention, but that is just as true for an egg that is fertilized in vitro. Yet no one would deny that children conceived by IVF have souls.
The idea that ensoulment takes place at conception is not only hard to reconcile with biology but does not have the moral superiority credited to it. It implies that we should prosecute users of intrauterine contraceptive devices and the "morning-after pill" for murder, because they prevent the conceptus from implanting. It implies that we should divert medical research from {226} curing cancer and heart disease to preventing the spontaneous miscarriages of vast numbers of microscopic conceptuses. It impels us to find surrogate mothers for the large number of embryos left over from IVF that are currently sitting in fertility clinic freezers. It would outlaw research on conception and early embryonic development that promises to reduce infertility, birth defects, and pediatric cancer, and research on stem cells that could lead to treatments for Alzheimer's disease, Parkinson's disease, diabetes, and spinal-cord injuries. And it flouts the key moral intuition that other people are worthy of moral consideration because of their feelings -- their ability to love, think, plan, enjoy, and suffer -- all of which depend on a functioning nervous system.
The enormous moral costs of equating a person with a conceptus, and the cognitive gymnastics required to maintain that belief in the face of modern biology, can sometimes lead to an agonizing reconsideration of deeply held beliefs. In 2001, Senator Orrin Hatch of Utah broke with his longtime allies in the anti-abortion movement and came out in favor of stem-cell research after studying the science of reproduction and meditating on his Mormon faith. "I have searched my conscience," he said. "I just cannot equate a child living in the womb, with moving toes and fingers and a beating heart, with an embryo in a freezer."23
The belief that bodies are invested with souls is not just a product of religious doctrine but embedded in people's psychology and likely to emerge whenever they have not digested the findings of biology. The public reaction to cloning is a case in point. Some people fear that cloning would present us with the option of becoming immortal,
others that it could produce an army of obedient zombies, or a source of organs for the original person to harvest when needed. In the recent Arnold Schwarzenegger movie The Sixth Day, clones are called "blanks," and their DNA gives them only a physical form, not a mind; they acquire a mind when a neural recording of the original person is downloaded into them. When Dolly the sheep was cloned in 1997, the cover of Der Spiegel showed a parade of Claudia Schiffers, Hitlers, and Einsteins, as if being a supermodel, fascist dictator, or scientific genius could be copied along with the DNA.
Clones, in fact, are just identical twins born at different times. If Einstein had a twin, he would not have been a zombie, would not have continued Einstein's stream of consciousness if Einstein had predeceased him, would not have given up his vital organs without a struggle, and probably would have been no Einstein (since intelligence is only partly heritable). The same would be true of a person cloned from a speck of Einstein. The bizarre misconceptions of cloning can be traced to the persistent belief that the body is suffused with a soul. One conception of cloning, which sets off a fear of an army of zombies, blanks, or organ farms, imagines the process to be the duplication of a body without a soul. The other, which sets off fears of a Faustian grab at {227} immortality or of a resurrected Hitler, conceives of cloning as duplicating the body together with the soul. This conception may also underlie the longing of some bereaved parents for a dead child to be cloned, as if that would bring the child back to life. In fact, the clone would not only grow up in a different world from the one the dead sibling grew up in, but would have different brain tissue and would traverse a different line of sentient experience.
The discovery that what we call "the person" emerges piecemeal from a gradually developing brain forces us to reframe problems in bioethics. It would have been convenient if biologists had discovered a point at which the brain is fully assembled and is plugged in and turned on for the first time, but that is not how brains work. The nervous system emerges in the embryo as a simple tube and differentiates into a brain and spinal cord. The brain begins to function in the fetus, but it continues to wire itself well into childhood and even adolescence. The demand by both religious and secular ethicists that we identify the "criteria for personhood" assumes that a dividing line in brain development can be found. But any claim that such a line has been sighted leads to moral absurdities.
If we set the boundary for personhood at birth, we should be prepared to allow an abortion minutes before birth, despite the lack of any significant difference between a late-term fetus and a neonate. It seems more reasonable to draw the line at viability. But viability is a continuum that depends on the state of current biomedical technology and on the risks of impairment that parents are willing to tolerate in their child. And it invites the obvious rejoinder: if it is all right to abort a twenty-four-week fetus, then why not the barely distinguishable fetus of twenty-four weeks plus one day? And if that is permissible, why not a fetus of twenty-four weeks plus two days, or three days, and so on until birth? On the other hand, if it is impermissible to abort a fetus the day before its birth, then what about two days before, and three days, and so on, all the way back to conception?
We face the same problem in reverse when considering euthanasia and living wills at the end of life. Most people do not depart this world in a puff of smoke but suffer a gradual and uneven breakdown of the various parts of the brain and body. Many kinds and degrees of existence lie between the living and the dead, and that will become even more true as medical technology improves.
We face the problem again in grappling with demands for animal rights. Activists who grant the right to life to any sentient being must conclude that a hamburger eater is a party to murder and that a rodent exterminator is a perpetrator of mass murder. They must outlaw medical research that would sacrifice a few mice but save a million children from painful deaths (since no one would agree to drafting a few human beings for such experiments, and on this view mice have the rights we ordinarily grant to people). On the other hand, {228} an opponent of animal rights who maintains that personhood comes from being a member of Homo sapiens is just a species bigot, no more thoughtful than the race bigots who value the lives of whites more than blacks. After all, other mammals fight to stay alive, appear to experience pleasure, and undergo pain, fear, and stress when their well-being is compromised. The great apes also share our higher pleasures of curiosity and love of kin, and our deeper aches of boredom, loneliness, and grief. Why should those interests be respected for our species but not for others?
Some moral philosophers try to thread a boundary across this treacherous landscape by equating personhood with cognitive traits that humans happen to possess. These include an ability to reflect upon oneself as a continuous locus of consciousness, to form and savor plans for the future, to dread death, and to express a choice not to die.
~
Language often makes the news precisely because it can part company with thoughts and attitudes. In 1998 Bill Clinton exploited the expectations behind ordinary comprehension to mislead prosecutors about his affair with Monica Lewinsky. He used words like alone, sex, and is in senses that were technically defensible but which deviated from charitable guesses about what people ordinarily mean by these terms. For example, he suggested he was not "alone" with Lewinsky, even though they were the only two people in the room, because other people were in the Oval Office complex at the time. He said that he did not have "sex" with her, because they did not engage in intercourse. His words, like all words, are certainly vague at their boundaries. Exactly how far away or hidden must the nearest person be before one is considered alone? At what point in the continuum of bodily contact -- from an accidental brush in an elevator to tantric bliss -- do we say that sex has occurred? Ordinarily we resolve the vagueness by guessing how our conversational partner would interpret words in the context, and we choose our words accordingly. Clinton's ingenuity in manipulating these guesses, and the outrage that erupted when he was forced to explain what he had done, show that people have an acute understanding of the difference between words and the thoughts they are designed to convey.
Language conveys not just literal meanings but also a speaker's attitude. Think of the difference between fat and voluptuous, slender and scrawny, thrifty and stingy, articulate and slick. Racial epithets, which are laced with contempt, are justifiably off-limits among responsible people, because using them conveys the tacit message that contempt for the people referred to by the epithet is acceptable. But the drive to adopt new terms for disadvantaged groups goes much further than this basic sign of respect; it often assumes that words and attitudes are so inseparable that one can reengineer people's attitudes by tinkering with the words. In 1994 the Los Angeles Times adopted a style sheet that banned some 150 words, including birth defect, Canuck, Chinese fire drill, dark continent, divorcee, Dutch treat, handicapped, illegitimate, invalid, man-made, New World, stepchild, and to welsh. The editors assumed that words {212} register in the brain with their literal meanings, so that an invalid is understood as "someone who is not valid" and Dutch treat is understood as a slur on contemporary Netherlanders. (In fact, it is one of many idioms in which Dutch means "ersatz," such as Dutch oven, Dutch door, Dutch uncle, Dutch courage, and Dutch auction,
the remnants of a long-forgotten rivalry between the English and the Dutch.)
But even the more reasonable attempts at linguistic reform are based on a dubious theory of linguistic determinism. Many people are puzzled by the replacement of formerly unexceptionable terms by new ones: Negro by black by African American, Spanish-American by Hispanic by Latino, crippled by handicapped by disabled by challenged, slum by ghetto by inner city by (according to the Times) slum once again. Occasionally the neologisms are defended with some rationale about their meaning. In the 1960s, the word Negro was replaced by the word black, because the parallel between the words black and white was meant to underscore the equality of the races. Similarly, Native American reminds us of who was here first and avoids the geographically inaccurate term Indian. But often the new terms replace ones that were perfectly congenial in their day, as we see in names for old institutions that are obviously sympathetic to the people being named: the United Negro College Fund, the National Association for the Advancement of Colored People, the Shriners Hospitals for Crippled Children. And sometimes a term can be tainted or unfashionable while a minor variant is fine: consider colored people versus people of color, Afro-American versus African American, Negro -- Spanish for "black" -- versus black. If anything, a respect for literal meaning should send us off looking for a new word for the descendants of Europeans, who are neither white nor Caucasian. Something else must be driving the replacement process.
Linguists are familiar with the phenomenon, which may be called the euphemism treadmill. People invent new words for emotionally charged referents, but soon the euphemism becomes tainted by association, and a new word must be found, which soon acquires its own connotations, and so on. Water closet becomes toilet (originally a term for any kind of body care, as in toilet kit and toilet water), which becomes bathroom, which becomes restroom, which becomes lavatory. Undertaker changes to mortician, which changes to funeral director. Garbage collection turns into sanitation, which turns into environmental services. Gym (from gymnasium, originally "high school") becomes physical education, which becomes (at Berkeley) human biodynamics. Even the word minority -- the most neutral label conceivable, referring only to relative numbers -- was banned in 2001 by the San Diego City Council (and nearly banned by the Boston City Council) because it was deemed disparaging to non-whites. "No matter how you slice it, minority means less than," said a semantically challenged official at Boston College, where the preferred term is AHANA (an acronym for African-American, Hispanic, Asian, and Native American).38 {213}
The euphemism treadmill shows that concepts, not words, are primary in people's minds. Give a concept a new name, and the name becomes colored by the concept; the concept does not become freshened by the name, at least not for long. Names for minorities will continue to change as long as people have negative attitudes toward them. We will know that we have achieved mutual respect when the names stay put.
~
"Image is nothing. Thirst is everything," screams a soft-drink ad that tries to create a new image for its product by making fun of soft-drink ads that try to create images for their products. Like words, images are salient tokens of our mental lives. And like words, images are said to have an insidious power over our consciousness, presumably because they are inscribed directly onto a blank slate. In postmodernist and relativist thinking, images are held to shape our view of reality, or to be our view of reality, or to be reality itself. This is especially true of images representing celebrities, politicians, women, and AHANAs. And as with language, the scientific study of imagery shows that the fear is misplaced.
A good description of the standard view of images within cultural studies and related disciplines may be found in the Concise Glossary of Cultural Theory. It defines image as a "mental or visual representation of an object or event as depicted in the mind, a painting, a photograph, or film." Having thus run together images in the world (such as paintings) with images in the mind, the entry lays out the centrality of images in postmodernism, cultural studies, and academic feminism.
First it notes, reasonably enough, that images can misrepresent reality and thereby serve the interests of an ideology. A racist caricature, presumably, is a prime example. But then it takes the concept further:
With what is called the "crisis of representation" brought about by . . . postmodernism, however, it is often questioned whether an image can be thought to simply represent, or misrepresent, a supposedly prior or external, image-free reality. Reality is seen rather as always subject to, or as the product of, modes of representation. In this view we inescapably inhabit a world of images or representations and not a "real world" and true or false images of it.
In other words, if a tree falls in a forest and there is no artist to paint it, not only did the tree make no sound, but it did not fall, and there was no tree there to begin with.
In a further move . . . we are thought to exist in a world of hyperreality, in which images are self-generating and entirely detached from any {214} supposed reality. This accords with a common view of contemporary entertainment and politics as being all a matter of "image," or appearance, rather than of substantial content.
Actually, the doctrine of hyperreality contradicts the common view of contemporary politics and entertainment as being a matter of image and appearance. The whole point of the common view is that there is a reality separate from images, and that is what allows us to decry the images that are misleading. We can, for example, criticize an old movie that shows slaves leading happy lives, or an ad that shows a corrupt politician pretending to defend the environment. If there were no such thing as substantial content, we would have no basis for preferring an accurate documentary about slavery to an apologia for it, or preferring a good exposé of a politician to a slick campaign ad. The entry notes that images are associated with the world of publicity, advertising, and fashion, and thereby with business and profits. An image may thus be tied to "an imposed stereotype or an alternative subjective or cultural identity." Media images become mental images: people cannot help but think that women or politicians or African Americans conform to the depictions in movies and advertisements. And this elevates cultural studies and postmodernist art into forces for personal and political liberation:
The study of "images of women" or "women's images" sees this field as one in which stereotypes of women can be reinforced, parodied, or actively contested through critical analysis, alternative histories, or creative work in writing and the media committed to the production of positive counter-images.39
I have not hidden my view that this entire line of thinking is a conceptual mess. If we want to understand how politicians or advertisers manipulate us, the last thing we should do is blur distinctions among things in the world, our perception of those things when they are in front of our eyes, the mental images of those things that we construct from memory, and physical images such as photographs and drawings.
As we saw at the beginning of this chapter, the visual brain is an immensely complicated system that was designed by the forces of evolution to give us an accurate reading of the consequential things in front of us. The "intelligent eye," as perceptual psychologists call it, does not just compute the shapes and motions of people before us. It also guesses their thoughts and intentions by noticing how they gaze at, approach, avoid, help, or hinder other objects and people. And these guesses are then measured against everything else we know about people -- what we infer from gossip, from a person's words and deeds, and from Sherlock Holmes-style deductions. The result is {215} the knowledge base or semantic memory that also underlies our use of language.
Physical images such as photographs and paintings are devices that reflect light in patterns similar to those coming off real objects, thereby making the visual system respond as if it were really seeing those objects. Though people have long dreamed of illusions that completely fool the brain -- Descartes's evil demon, the philosopher's thought experiment in which a person does not realize he is a brain in a vat, the science-fiction writer's prophecy of perfect virtual reality as in The Matrix -- in actuality the illusions foisted upon us by physical images are never more than partially effective. Our perceptual systems pick up on the imperfections of an image -- the brush strokes, pixels, or frame -- and our conceptual systems pick up on the fact that we are entertaining a hypothetical world that is separate from the real world. It's not that people invariably distinguish fiction from reality: they can lose themselves in fiction, or misremember something they read in a novel as something they read in the newspapers or that happened to a friend, or mistakenly believe that a stylized portrayal of a time and place is an accurate portrayal. But all of us are capable of distinguishing fictitious worlds from real ones, as we see when a two-year-old pretends that a banana is a telephone for the fun of it but at the same time understands that a banana is not literally a telephone.40 Cognitive scientists believe that the ability to entertain propositions without necessarily believing them -- to distinguish "John believes there is a Santa Claus" from "There is a Santa Claus" -- is a fundamental ability of human cognition.41 Many believe that a breakdown of this ability underlies the thought disorder in the syndrome called schizophrenia.42 Finally, there are mental images, the visualizations of objects and scenes in the mind's eye.
The psychologist Stephen Kosslyn has shown that the brain is equipped with a system capable of reactivating and manipulating memories of perceptual experience, a bit like Photoshop with its tools for assembling, rotating, and coloring images.43 Like language, imagery may be used as a slave system -- a "visuospatial sketchpad" -- by the central executive of the brain, making it a valuable form of mental representation. We use mental imagery, for example, when we visualize how a chair might fit in a living room or whether a sweater would look good on a relative. Imagery is also an invaluable tool to novelists, who imagine scenes before describing them in words, and to scientists, who rotate molecules or play out forces and motions in their imagination.
Though mental images allow our experiences (including our experience of media images) to affect our thoughts and attitudes long after the original objects have gone, it is a mistake to think that raw images are downloaded into our minds and then constitute our mental lives. Images are not stored in the mind like snapshots in a shoebox; if they
were, how could you ever find the one you want? Rather, they are labeled and linked to a vast database of {216} knowledge, which allows them to be evaluated and interpreted in terms of what they stand for. 44 Chess masters, for
example, are famous for their ability to remember games in progress, but their mental images of the board are not raw photographs. Rather, they are saturated with abstract information about the game, such as which piece is threatening which other one and which clusters of pieces form viable defenses. We know this because when a chessboard is sprinkled with pieces at random, chess masters are no better at remembering the arrangement than amateurs are. 45 When images represent real people, not just chessmen, there are even more possibilities for organizing and annotating them with information about people's goals and motives -- for example, whether the person in an image is sincere or just acting.
The reason that images cannot constitute the contents of our thoughts is that images, like words, are inherently ambiguous. An image of Lassie could stand for Lassie, collies, dogs, animals, television stars, or family values. Some other, more abstract form of information must pick out the concept that an image is taken to exemplify. Or consider the sentence Yesterday my uncle fired his lawyer (an example suggested by Dan Dennett). When understanding the sentence, Brad might visualize his own ordeals of the day before and glimpse the "uncle" slot in a family tree, then picture courthouse steps and an angry man. Irene might have no image for "yesterday" but might visualize her uncle Bob's face, a slamming door, and a power-suited woman. Yet despite these very different image sequences, both people have understood the sentence in the same way, as we could see by questioning them or asking them to paraphrase the sentence. "Imagery couldn't be the key to comprehension," Dennett points out, "because you can't draw a picture of an uncle, or of yesterday, or firing, or a lawyer. Uncles, unlike clowns and firemen, don't look different in any characteristic way that can be visually represented, and yesterdays don't look like anything at all. "46 Since images are interpreted in the context of a deeper understanding of people and their relationships, the "crisis of representation," with its paranoia about the manipulation of our mind by media images, is overblown. People are not helplessly programmed with images; they can evaluate and interpret what they see using everything else they know, such as the credibility and motives of the source.
The postmodernist equating of images with thoughts has not only made a hash of several scholarly disciplines but has laid waste to the world of contemporary art. If images are the disease, the reasoning goes, then art is the cure. Artists can neutralize the power of media images by distorting them or reproducing them in odd contexts (like the ad parodies in Mad magazine or on Saturday Night Live, only not funny). Anyone familiar with contemporary art has seen the countless works in which stereotypes of women, minorities, or gay {217} people are "reinforced, parodied, or actively contested. " A prototypical example is a 1994 exhibit at the Whitney Museum in New York called "Black Male: Representations of Masculinity in Contemporary Art. " It aimed to take apart the way that African American men are culturally constructed in demonizing and marginalizing visual stereotypes such as the sex symbol, the athlete, the Sambo, and the photograph in a Wanted poster. According to the catalogue essay, "The real struggle is over the power to control images. " The art critic Adam Gopnik (whose mother and sister are cognitive scientists) called attention to the simplistic theory of cognition behind this tedious formula:
The show is intended to be socially therapeutic: its aim is to make you face the socially constructed images of black men, so that by confronting them -- or, rather, seeing artists confront them on your behalf -- you can make them go away. The trouble is that the entire enterprise of "disassembling social images" rests on an ambiguity in the way we use the word "image. " Mental images are not really images at all, but instead consist of complicated opinions, positions, doubts, and passionately held convictions, rooted in experience and amendable by argument, by more experience, or by coercion. Our mental images of black men, white judges, the press, and so on do not take the form of pictures of the kind that you can hang up (or "deconstruct") on a museum wall. . . . Hitler did not hate Jews because there were pictures of swarthy Semites with big noses imprinted on his cerebellum; racism does not exist in America because the picture of O. J. Simpson on the cover of Time is too dark. The view that visual cliches shape beliefs is both too pessimistic, in that it supposes that people are helplessly imprisoned by received stereotypes, and too optimistic, in that it supposes that if you could change the images you could change the beliefs. 47
Recognizing that we are equipped with sophisticated faculties that keep us in touch with reality does not entail ignoring the ways in which our faculties can be turned against us. People lie, sometimes baldly, sometimes through insinuation and presupposition (as in the question "When did you stop beating your wife? "). People disseminate disinformation about ethnic groups, not just pejorative stereotypes but tales of exploitation and perfidy that serve to stoke moralistic outrage against them. People try to manipulate social realities like status (which exist in the mind of the beholder) to make themselves look good or to sell products.
But we can best protect ourselves against such manipulation by pinpointing the vulnerabilities of our faculties of
categorization, language, and imagery, not by denying their complexity. The view that humans are passive {218} receptacles of stereotypes, words, and images is condescending to ordinary people and gives unearned importance to the pretensions of cultural and academic elites. And exotic pronouncements about the limitations of our faculties, such as that there is nothing outside the text or that we inhabit a world of images rather than a real world, make it impossible even to identify lies and misrepresentations, let alone to understand how they are promulgated.
{219} Chapter 13
Out of Our Depths
A man has got to know his limitations. -- Clint Eastwood in Magnum Force
Most people are familiar with the idea that some of our ordeals come from a mismatch between the source of our passions in evolutionary history and the goals we set for ourselves today. People gorge themselves in anticipation of a famine that never comes, engage in dangerous liaisons that conceive babies they don't want, and rev up their bodies in response to stressors from which they cannot run away.
What is true for the emotions may also be true for the intellect. Some of our perplexities may come from a mismatch between the purposes for which our cognitive faculties evolved and the purposes to which we put them today. This is obvious enough when it comes to raw data processing. People do not try to multiply six-digit numbers in their heads or remember the phone number of everyone they meet, because they know their minds were not designed for the job. But it is not as obvious when it comes to the way we conceptualize the world. Our minds keep us in touch with aspects of reality -- such as objects, animals, and people -- that our ancestors dealt with for millions of years. But as science and technology open up new and hidden worlds, our untutored intuitions may find themselves at sea.
What are these intuitions? Many cognitive scientists believe that human reasoning is not accomplished by a single, general-purpose computer in the head. The world is a heterogeneous place, and we are equipped with different kinds of intuitions and logics, each appropriate to one department of reality. These ways of knowing have been called systems, modules, stances, faculties, mental organs, multiple intelligences, and reasoning engines. 1 They emerge early in life, are present in every normal person, and appear to be computed in partly distinct sets of networks in the brain. They may be installed by different {220} combinations of genes, or they may emerge when brain tissue self- organizes in response to different problems to be solved and different patterns in the sensory input. Most likely they develop by some combination of these forces.
What makes our reasoning faculties different from the departments in a university is that they are not just broad areas of knowledge, analyzed with whatever tools work best. Each faculty is based on a core intuition that was suitable for analyzing the world in which we evolved. Though cognitive scientists have not agreed on a Gray's Anatomy of the mind, here is a tentative but defensible list of cognitive faculties and the core intuitions on which they are based:
• An intuitive physics, which we use to keep track of how objects fall, bounce, and bend. Its core intuition is the concept of the object, which occupies one place, exists for a continuous span of time, and follows laws of motion and force. These are not Newton's laws but something closer to the medieval conception of impetus, an "oomph" that keeps an object in motion and gradually dissipates. 2
• An intuitive version of biology or natural history, which we use to understand the living world. Its core intuition is that living things house a hidden essence that gives them their form and powers and drives their growth and bodily functions. 3
• An intuitive engineering, which we use to make and understand tools and other artifacts. Its core intuition is that a tool is an object with a purpose -- an object designed by a person to achieve a goal. 4
• An intuitive psychology, which we use to understand other people. Its core intuition is that other people are not objects or machines but are animated by the invisible entity we call the mind or the soul. Minds contain beliefs and desires and are the immediate cause of behavior.
• A spatial sense, which we use to navigate the world and keep track of where things are. It is based on a dead reckoner, which updates coordinates of the body's location as it moves and turns, and a network of mental maps. Each map is organized by a different reference frame: the eyes, the head, the body, or salient objects and places in the world. 5
• A number sense, which we use to think about quantities and amounts. It is based on an ability to register exact quantities for small numbers of objects (one, two, and three) and to make rough relative estimates for larger numbers. 6
• A sense of probability, which we use to reason about the likelihood of uncertain events. It is based on the ability to track the relative frequencies of events, that is, the proportion of events of some kind that turn out one way or the other. 7 {221}
• An intuitive economics, which we use to exchange goods and favors. It is based on the concept of reciprocal exchange, in which one party confers a benefit on another and is entitled to an equivalent benefit in return.
• A mental database and logic, which we use to represent ideas and to infer new ideas from old ones. It is based on assertions about what's what, what's where, or who did what to whom, when, where, and why. The assertions are linked in a mind-wide web and can be recombined with logical and causal operators such as and, or, not, all, some, necessary, possible, and cause. 8
• Language, which we use to share the ideas from our mental logic. It is based on a mental dictionary of memorized words and a mental grammar of combinatorial rules. The rules organize vowels and consonants into words, words into bigger words and phrases, and phrases into sentences, in such a way that the meaning of the combination can be computed from the meanings of the parts and the way they are arranged. 9
The mind also has components for which it is hard to tell where cognition leaves off and emotion begins. These include a system for assessing danger, coupled with the emotion called fear, a system for assessing contamination, coupled with the emotion called disgust, and a moral sense, which is complex enough to deserve a chapter of its own.

These ways of knowing and core intuitions are suitable for the lifestyle of small groups of illiterate, stateless people who live off the land, survive by their wits, and depend on what they can carry. Our ancestors left this lifestyle for a settled existence only a few millennia ago, too recently for evolution to have done much, if anything, to our brains. Conspicuous by their absence are faculties suited to the stunning new understanding of the world wrought by science and technology. For many domains of knowledge, the mind could not have evolved dedicated machinery, the brain and genome show no hints of specialization, and people show no spontaneous intuitive understanding either in the crib or afterward. They include modern physics, cosmology, genetics, evolution, neuroscience, embryology, economics, and mathematics.
It's not just that we have to go to school or read books to learn these subjects. It's that we have no mental tools to grasp them intuitively. We depend on analogies that press an old mental faculty into service, or on jerry-built mental contraptions that wire together bits and pieces of other faculties. Understanding in these domains is likely to be uneven, shallow, and contaminated by primitive intuitions. And that can shape debates in the border disputes in which science and technology make contact with everyday life. The point of this chapter is that together with all the moral, empirical, and political factors that go into these debates, we should add the cognitive factors: the way our
{222} minds naturally frame issues. Our own cognitive makeup is a missing piece of many puzzles, including education, bioethics, food safety, economics, and human understanding itself.
~
The most obvious arena in which we confront native ways of thinking is the schoolhouse. Any theory of education must be based on a theory of human nature, and in the twentieth century that theory was often the Blank Slate or the Noble Savage.
Traditional education is based in large part on the Blank Slate: children come to school empty and have knowledge deposited in them, to be reproduced later on tests. (Critics of traditional education call this the "savings and loan" model. ) The Blank Slate also underlies the common philosophy that the early school-age years are an opportunity zone in which social values are shaped for life. Many schools today use the early grades to instill desirable attitudes toward the environment, gender, sexuality, and ethnic diversity.
Progressive educational practice, for its part, is based on the Noble Savage. As A. S. Neill wrote in his influential book Summerhill, "A child is innately wise and realistic. If left to himself without adult suggestion of any kind, he will develop as far as he is capable of developing. "10 Neill and other progressive theorists of the 1960s and 1970s argued that schools should do away with examinations, grades, curricula, and even books. Though few schools went that far, the movement left a mark on educational practice. In the method of reading instruction known as Whole Language, children are not taught which letter goes with which sound but are immersed in a book-rich environment where reading skills are expected to blossom spontaneously. 11 In the philosophy of mathematics instruction known as constructivism, children are not drilled with arithmetic tables but are enjoined to rediscover mathematical truths themselves by solving problems in groups. 12 Both methods fare badly when students' learning is assessed objectively,
but advocates of the methods tend to disdain standardized testing.
An understanding of the mind as a complex system shaped by evolution runs against these philosophies. The alternative has emerged from the work of cognitive scientists such as Susan Carey, Howard Gardner, and David Geary. 13 Education is neither writing on a blank slate nor allowing the child's nobility to come into flower. Rather, education is a technology that tries to make up for what the human mind is innately bad at. Children don't have to go to school to learn to walk, talk, recognize objects, or remember the personalities of their friends, even though these tasks are much harder than reading, adding, or remembering dates in history. They do have to go to school to learn written language, arithmetic, and science, because those bodies of knowledge and skill were invented too recently for any species-wide knack for them to have evolved. {223}
Far from being empty receptacles or universal learners, then, children are equipped with a toolbox of implements for reasoning and learning in particular ways, and those implements must be cleverly recruited to master problems for which they were not designed. That requires not just inserting new facts and skills in children's minds but debugging and disabling old ones. Students cannot learn Newtonian physics until they unlearn their intuitive impetus-based physics. 14 They cannot learn modern biology until they unlearn their intuitive biology, which thinks in terms of vital essences. And they cannot learn evolution until they unlearn their intuitive engineering, which attributes design to the intentions of a designer. 15
Schooling also requires pupils to expose and reinforce skills that are ordinarily buried in unconscious black boxes. When children learn to read, the vowels and consonants that are seamlessly woven together in speech must be forced into children's awareness before they can associate them with squiggles on a page. 16 Effective education may also require co-opting old faculties to deal with new demands. Snatches of language can be pressed into service to do calculation, as when we recall the stanza "Five times five is twenty-five. "17 The logic of grammar can be used to grasp large numbers: the expression four thousand three hundred and fifty-seven has the grammatical structure of an English noun phrase like hat, coat, and mittens. When a student parses the number phrase she can call to mind the mental operation of aggregation, which is related to the mathematical operation of addition. 18 Spatial cognition is drafted into understanding mathematical relationships through the use of graphs, which turn data or equations into shapes. 19 Intuitive engineering supports the learning of anatomy and physiology (organs are understood as gadgets with functions), and intuitive physics supports the learning of chemistry and biology (stuff, including living stuff, is made out of tiny, bouncy, sticky objects). 20
Geary points out a final implication. Because much of the content of education is not cognitively natural, the process of mastering it may not always be easy and pleasant, notwithstanding the mantra that learning is fun. Children may be innately motivated to make friends, acquire status, hone motor skills, and explore the physical world, but they are not necessarily motivated to adapt their cognitive faculties to unnatural tasks like formal mathematics. A family, peer group, and culture that ascribe high status to school achievement may be needed to give a child the motive to persevere toward effortful feats of learning whose rewards are apparent only over the long term. 21
~
The layperson's intuitive psychology or "theory of mind" is one of the brain's most striking abilities. We do not treat other people as wind-up dolls but think of them as being animated by minds: nonphysical entities we cannot see or touch but that are as real to us as bodies and objects. Aside from {224} allowing us to predict people's behavior from their beliefs and desires, our theory of mind is tied to our ability to empathize and to our conception of life and death. The difference between a dead body and a living one is that a dead body no longer contains the vital force we call a mind. Our theory of mind is the source of the concept of the soul. The ghost in the machine is deeply rooted in our way of thinking about people.
A belief in the soul, in turn, meshes with our moral convictions. The core of morality is the recognition that others have interests as we do -- that they "feel want, taste grief, need friends," as Shakespeare put it -- and therefore that they have a right to life, liberty, and the pursuit of their interests. But who are those "others"? We need a boundary that allows us to be callous to rocks and plants but forces us to treat other humans as "persons" that possess inalienable rights. Otherwise, it seems, we would place ourselves on a slippery slope that ends in the disposal of inconvenient people or in grotesque deliberations on the value of individual lives. As Pope John Paul II pointed out, the notion that every human carries infinite value by virtue of possessing a soul would seem to give us that boundary. Until recently the intuitive concept of the soul served us pretty well. Living people had souls, which come into existence at the moment of conception and leave their bodies when they die. Animals, plants, and inanimate objects do not have souls at all. But science is showing that what we call the soul -- the locus of sentience, reason, and will -- consists of the information-processing activity of the brain, an organ governed by the laws of biology. In an individual person it comes into existence gradually through the differentiation of tissues growing from a single cell.
In the species it came into existence gradually as the forces of evolution modified the brains of simpler animals. And though our concept of souls used to fit pretty well with natural phenomena -- a woman was either pregnant or not, a person was either dead or alive -- biomedical research is now presenting us with cases where the two are out of register. These cases are not just scientific curiosities but are intertwined with pressing issues such as contraception, abortion, infanticide, animal rights, cloning, euthanasia, and research involving human embryos, especially the harvesting of stem cells.
In the face of these difficult choices it is tempting to look to biology to find or ratify boundaries such as "when life begins. " But that only highlights the clash between two incommensurable ways of conceiving life and mind. The intuitive and morally useful concept of an immaterial spirit simply cannot be reconciled with the scientific concept of brain activity emerging gradually in ontogeny and phylogeny. No matter where we try to draw the line between life and nonlife, or between mind and nonmind, ambiguous cases pop up to challenge our moral intuitions.
The closest event we can find to a thunderclap marking the entry of a soul {225} into the world is the moment of conception. At that instant a new human genome is determined, and we have an entity destined to develop into a unique individual. The Catholic Church and certain other Christian denominations designate conception as the moment of ensoulment and the beginning of life (which, of course, makes abortion a form of murder). But just as a microscope reveals that a straight edge is really ragged, research on human reproduction shows that the "moment of conception" is not a moment at all. Sometimes several sperm penetrate the outer membrane of the egg, and it takes time for the egg to eject the extra chromosomes. What and where is the soul during this interval? Even when a single sperm enters, its genes remain separate from those of the egg for a day or more, and it takes yet another day or so for the newly merged genome to control the cell. So the "moment" of conception is in fact a span of twenty-four to forty-eight hours. 22 Nor is the conceptus destined to become a baby. Between two-thirds and three-quarters of them never implant in the uterus and are spontaneously aborted, some because they are genetically defective, others for no discernible reason.
Still, one might say that at whatever point during this interlude the new genome is formed, the specification of a unique new person has come into existence. The soul, by this reasoning, may be identified with the genome. But during the next few days, as the embryo's cells begin to divide, they can split into several embryos, which develop into identical twins, triplets, and so on. Do identical twins share a soul? Did the Dionne quintuplets make do with one-fifth of a soul each? If not, where did the four extra souls come from? Indeed, every cell in the growing embryo is capable, with the right manipulations, of becoming a new embryo that can grow into a child. Does a multicell embryo consist of one soul per cell, and if so, where do the other souls go when the cells lose that ability? And not only can one embryo become two people, but two embryos can become one person. Occasionally two fertilized eggs, which ordinarily would go on to become fraternal twins, merge into a single embryo that develops into a person who is a genetic chimera: some of her cells have one genome, others have another genome. Does her body house two souls? For that matter, if human cloning ever became possible (and there appears to be no technical obstacle), every cell in a person's body would have the special ability that is supposedly unique to a conceptus, namely developing into a human being. True, the genes in a cheek cell can become a person only with unnatural intervention, but that is just as true for an egg that is fertilized in vitro. Yet no one would deny that children conceived by IVF have souls.
The idea that ensoulment takes place at conception is not only hard to reconcile with biology but does not have the moral superiority credited to it. It implies that we should prosecute users of intrauterine contraceptive devices and the "morning-after pill" for murder, because they prevent the conceptus from implanting. It implies that we should divert medical research from {226} curing cancer and heart disease to preventing the spontaneous miscarriages of vast numbers of microscopic conceptuses. It impels us to find surrogate mothers for the large number of embryos left over from IVF that are currently sitting in fertility clinic freezers. It would outlaw research on conception and early embryonic development that promises to reduce infertility, birth defects, and pediatric cancer, and research on stem cells that could lead to treatments for Alzheimer's disease, Parkinson's disease, diabetes, and spinal-cord injuries. And it flouts the key moral intuition that other people are worthy of moral consideration because of their feelings -- their ability to love, think, plan, enjoy, and suffer -- all of which depend on a functioning nervous system.
The enormous moral costs of equating a person with a conceptus, and the cognitive gymnastics required to maintain that belief in the face of modern biology, can sometimes lead to an agonizing reconsideration of deeply held beliefs. In 2001, Senator Orrin Hatch of Utah broke with his longtime allies in the anti-abortion movement and came out in favor of stem-cell research after studying the science of reproduction and meditating on his Mormon faith. "I have searched my conscience," he said. "I just cannot equate a child living in the womb, with moving toes and fingers and a beating heart, with an embryo in a freezer. "23
The belief that bodies are invested with souls is not just a product of religious doctrine but embedded in people's psychology and likely to emerge whenever they have not digested the findings of biology. The public reaction to cloning is a case in point. Some people fear that cloning would present us with the option of becoming immortal,
others that it could produce an army of obedient zombies, or a source of organs for the original person to harvest when needed. In the recent Arnold Schwarzenegger movie The Sixth Day, clones are called "blanks," and their DNA gives them only a physical form, not a mind; they acquire a mind when a neural recording of the original person is downloaded into them. When Dolly the sheep was cloned in 1997, the cover of Der Spiegel showed a parade of Claudia Schiffers, Hitlers, and Einsteins, as if being a supermodel, fascist dictator, or scientific genius could be copied along with the DNA.
Clones, in fact, are just identical twins born at different times. If Einstein had a twin, he would not have been a zombie, would not have continued Einstein's stream of consciousness if Einstein had predeceased him, would not have given up his vital organs without a struggle, and probably would have been no Einstein (since intelligence is only partly heritable). The same would be true of a person cloned from a speck of Einstein. The bizarre misconceptions of cloning can be traced to the persistent belief that the body is suffused with a soul. One conception of cloning, which sets off a fear of an army of zombies, blanks, or organ farms, imagines the process to be the duplication of a body without a soul. The other, which sets off fears of a Faustian grab at {227} immortality or of a resurrected Hitler, conceives of cloning as duplicating the body together with the soul. This conception may also underlie the longing of some bereaved parents for a dead child to be cloned, as if that would bring the child back to life. In fact, the clone would not only grow up in a different world from the one the dead sibling grew up in, but would have different brain tissue and would traverse a different line of sentient experience.
The discovery that what we call "the person" emerges piecemeal from a gradually developing brain forces us to reframe problems in bioethics. It would have been convenient if biologists had discovered a point at which the brain is fully assembled and is plugged in and turned on for the first time, but that is not how brains work. The nervous system emerges in the embryo as a simple tube and differentiates into a brain and spinal cord. The brain begins to function in the fetus, but it continues to wire itself well into childhood and even adolescence. The demand by both religious and secular ethicists that we identify the "criteria for personhood" assumes that a dividing line in brain development can be found. But any claim that such a line has been sighted leads to moral absurdities.
If we set the boundary for personhood at birth, we should be prepared to allow an abortion minutes before birth, despite the lack of any significant difference between a late-term fetus and a neonate. It seems more reasonable to draw the line at viability. But viability is a continuum that depends on the state of current biomedical technology and on the risks of impairment that parents are willing to tolerate in their child. And it invites the obvious rejoinder: if it is all right to abort a twenty-four-week fetus, then why not the barely distinguishable fetus of twenty-four weeks plus one day? And if that is permissible, why not a fetus of twenty-four weeks plus two days, or three days, and so on until birth? On the other hand, if it is impermissible to abort a fetus the day before its birth, then what about two days before, and three days, and so on, all the way back to conception?
We face the same problem in reverse when considering euthanasia and living wills at the end of life. Most people do not depart this world in a puff of smoke but suffer a gradual and uneven breakdown of the various parts of the brain and body. Many kinds and degrees of existence lie between the living and the dead, and that will become even more true as medical technology improves.
We face the problem again in grappling with demands for animal rights. Activists who grant the right to life to any sentient being must conclude that a hamburger eater is a party to murder and that a rodent exterminator is a perpetrator of mass murder. They must outlaw medical research that would sacrifice a few mice but save a million children from painful deaths (since no one would agree to drafting a few human beings for such experiments, and on this view mice have the rights we ordinarily grant to people). On the other hand, {228} an opponent of animal rights who maintains that personhood comes from being a member of Homo sapiens is just a species bigot, no more thoughtful than the race bigots who value the lives of whites more than blacks. After all, other mammals fight to stay alive, appear to experience pleasure, and undergo pain, fear, and stress when their well-being is compromised. The great apes also share our higher pleasures of curiosity and love of kin, and our deeper aches of boredom, loneliness, and grief. Why should those interests be respected for our species but not for others?
Some moral philosophers try to thread a boundary across this treacherous landscape by equating personhood with cognitive traits that humans happen to possess. These include an ability to reflect upon oneself as a continuous locus of consciousness, to form and savor plans for the future, to dread death, and to express a choice not to die.
