That is not what I am pursuing here. I am asking instead: what does it mean to be so built or so described (as an effect within a causal chain)?
I am building in this chapter a causal aesthetic, turning causal principles (a simplified scientific world) into an allegorical picture of building a mind within itself, but limited by its world. In many ways this model will enact an aspect of the philosophical aesthetic of the dissertation as a whole. The mock engineering in this chapter will not, therefore, be an exercise in cognitive philosophy, but in philosophical aesthetics: an allegory about how ontological limits describe ways of making sense through the process
of building. Call this thinking by making machines as fictions that excavate how an 'I' becomes an 'our' and difference becomes a time.
My machines are jugs which think of themselves as minds. Such a fiction means something within a variety of contexts (in philosophy or science fiction or cognitive science). My goal here is neither to explore that meaning nor the assumptions of those contexts and uses. I am using these machines (as fictions and as causal models) to make visible the possibilities of orienting ourselves within the causal limits these machine(s) describe. This orientation is meant to pressure the confused collapse of models of animation and mind into interpretive allegories (in order to give them a kind of ontological
force) that I described in Jakobson and Vendler and in a different way in Romantic fragments in general and in Keats's powerful image of soul-making. Soul-making, for me, operates toward the limits of person-making, not as a replacement for person-making (for science). In this sense this engineering exercise is a theological exercise meant to question how we apply our interpretations of art and texts and others as versions of minds.
If models of mind underlie many of our aesthetic models and many of the ways in which we enter, exit and interpret ourselves within metaphors, allegories, and language games, then models of the mind can make visible different kinds of aesthetics. This might mean that theories of mind should be understood as aesthetic models. While I think that this is true, for the most part, this does not diminish these theories, nor does it tell us much about what kind of aesthetics these are. A full analysis of a theory of mind as an aesthetic would require another dissertation. What my machine(s) are meant to do is to show the kind of aesthetic such theories can generate. In other words, how would we
inhabit a world described, by and large, by causal and mathematical limits as meaningful without ignoring those limits? My machines suggest a way of beginning to orient ourselves in relation to these limits in a way that will move beyond the kind of pictures of the subjunctive enacted by both Heidegger in "Das Ding" and Eliot in The Waste Land. This is why I think it describes a kind of theology.
In answering the question 'what does it mean to be built or described as an effect within a causal chain?' I have attempted to build a set of theoretical machines that perceive time by figuring themselves (in a fundamental sense) within the temporal order they perceive.1 By this process the machines construct themselves within a world of temporal surfaces: they figure themselves continually and as the succession of limits that determine their experience as anything. This is a picture of the writing at limits that I have investigated in Wittgenstein, Joyce, Eliot, Keats, and Heidegger from the underside. It is not meant to generate a theory, but to offer itself as another writing toward and at these limits.
This means I have experimented in a form of Science Fiction, in which I have attempted to slice an aspect of the logic of human temporality out of the conscious/unconscious logic of the human mind. The problem in constructing a Time-perceiving machine is in constructing a future, which means constructing a fundamental ontological belief that the future exists. I have attempted to build a theoretical machine that perceives time. Such a time machine simplifies our conscious apprehension to the basic process of determining successive differences. I will use bits of this machine in order to analyze the conceptual logic and organization which allows and constructs our phenomenal perception
of time. High-level problems in the organization of mental processes can be explored by building simple machines, or systems, not in an attempt to model what actually happens in the brain, nor to offer an explanation of how a particular agent might function in a real machine, but in order to model these conceptual problems and their solutions as functions and processes.
I have constructed these theoretical machines, or rather one evolving machine, not in order to model the brain or its evolution, nor even to posit a possible construction of a real machine or set of algorithms to be included in some artificial intelligence machine (although it has some relevance to this). My machines are manipulations, within an imaginary and impossible world, of the theoretical problems involved in the human inhabitation within time as a grammatical limit. These machines pressure the problem of time into the kind of clarity that allows the structure, and more importantly the significance, of temporality to function in relation to the more complex problems of consciousness and language. I will use my machines to explore and explain the organization and conceptual problems attending any attempt at modeling "reality" as meaningful and not just as a "thing".
In effect this means questioning the relation between the possibility of meaning and the possibility of time at the limits of sense and nonsense. How can you build yourself into a grammar using only causal mechanisms? In my attempt to build such a grammar I am trying to outline the point of conceptual confusion around which language breaks down into nonsense in Finnegans Wake, for example. I am trying to approach or write at or within the limit described by the soul. This means I am trying to figure causal mechanisms
at the limit of the interpretative relations that I have described so far. These machine masks will not disprove Wittgensteinian or Joycean grammar, but will describe a causal language game(s) evolving toward the limits these grammars describe.
I return first to a description of philosophy I introduced in the second chapter. Kurt Gödel, in explaining his philosophical work, described philosophy as the analysis of concepts, and science as the use of these same concepts. Engineering philosophy into a machine, a philosophy machine, allows us to reconceive and merge Gödel's distinctions within a model of thinking where the analysis of concepts, categories, and logics follows from their construction and use in an evolving machine.
I will begin this process of construction with the hypothesis that human language and the logic of human temporality simultaneously create each other as systems of symbolization organizing the world and one's relationship to it. Time requires a kind of syntax, a structure allowing for symbolic exchange between different representations of mental states. Instead of the pure syntax of calculus, the syntactical structure of temporality mirrors the structure of symbolic communication in human language, such that to construct a future is to construct a language. Thus, consciousness functions through or within the construction of human time (specifically of a future which mimics the structure of the present) and the simultaneous or contingent construction of human language.
Language and temporality simultaneously create each other as systems of symbolization that organize the world (our experience) and one's relationship to it. (By symbolization I mean the process by which relationships and identities are configured into particular structures and/or signs).
In my analysis of Wittgenstein I showed how the sense that the future meaning of a word is present in its form (or the sense that the future is somehow present as a possibility in the present) displayed how we inhabit our grammar. We experience change through our shifting position between language games and through our functioning within the temporal order described by the grammar of particular language games. The sense of the future arose as an interpretation of our relation within the structures of the grammars that describe our forms of life. I will use my machines to excavate this grammar. Thus in constructing a future my machine(s) will, in general, abstract form from particular contents, and in so doing articulate a symbolic and syntactical structural relationship between external and internal states that will come to stand for the future.
I list the following as the engineering assumptions and goals underlying the construction of my machine(s):
1) Temporality functions through a syntax, a structure allowing for symbolic exchange between different representations of mental states. The syntactical structure of temporality mirrors the structure of symbolic communication in human language, such that to construct a future is to construct a language.
2) One can identify two forms of temporal syntax: (a) a simple equivalence between past states and current states that allows for short-term predictions based on simple patterns or causal associations between a small set of inputs, and (b) a more complex syntax that functions as a continuous future.
3) My problem is to turn the Turing test around. Instead of attributing thinking and/or consciousness to a machine, I need to construct a model in which a machine will attribute its mental state and condition to another machine (or at some point to a human).
4) Self-consciousness arises from a model of other minds as forms of one's own mind.
5) The possibility of the future (that is, a symbolic structure defining the future) is built by using structures derived from meta-temporal identities, identities that extend over time but whose contents, except as markers of identity, are meaningless.
14.2 A time machine linguist
A Time Machine constructs a diachronic ontology, a logic organizing and determining the forms of existence by nesting them within the phenomenal field defined by the machine, as a mechanical surrogate for the individual subject. A time machine, like the one built by H. G. Wells, spatializes time, in order that the appearance of a succession of moments is determined by the movement, as on a plane, of the field of the observer. In Wells' machine a strangely immaterial metal-like bar serves as a transitive link between the machine (as the physical embodiment and protective encapsulation of the individual subject) and all other times. Consequently, the transitive bar functions as a verb which
translates the subject into the future or past, both of which serve as predicate objects, in essence defining the subject. Wells' Time Machine, therefore, defines and functions through the logic of the copula.
The disguised use of this linguistic metaphor in organizing Time highlights the fundamental cognitive dependence of the conceptual structures of time and language. Language works as a form of external memory. In this sense language and memory both transport information through time. We use this memory as a means of actively re-shaping elements of our experience, including other people, into forms and relations that allow for the success of our goals. In this sense we attempt to re-formulate, through a kind of viral infection, the internal states of other people: their descriptions of the world and the goals embodied in these descriptions (goals are often understood as ideal descriptions of self).
If language is a form of memory, then we should expect our memories to function through a kind of language (a system of symbolic relations). This is certainly true of Minsky's K-lines. K-lines wire together a set of agents active during an experience, and thus form a memory of that experience. The connections between agents vary in strength, forming a hierarchy of levels. A K-line, to the degree that it links agents within such a hierarchy of relevant levels, sketches a set of relations within a loose syntax. Minsky organizes the formation of these K-lines within "societies of memory." New K-lines are attached to the most recently active K-lines. When (Jack) (Fly) (Kite) are connected by a new K-line, predicates, that is, the K-lines defining each of these terms, e.g. (Male) (Outside) (Young), are attached to the relevant agent, e.g. (Jack). Because these older K-lines are used to describe the new elements, as predicates, they are in effect nested within
the new K-line. If K-lines are used to form "societies of memory", therefore, they build syntactical trees. Minsky never addresses why the world should be categorized in this way. Nor does he examine how such categories would be constructed so that we could recognize them. What his model of memory does demonstrate is the temptation to describe ourselves and our biology within grammars built from our language.
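To make the shape of this claim concrete, here is a minimal sketch, in Python, of how nesting recently active K-lines as predicates yields a tree-like, syntax-like structure. The class name, the attachment rule, and the tree-printing method are my own illustrative assumptions, not Minsky's specification.

```python
# A toy rendering of the K-line description above: a K-line records a set of
# active agents, and a new K-line nests the most recently active K-lines as
# "predicates" of its agents, so that older memories sit inside newer ones.
# (Names and attachment rule are illustrative assumptions, not Minsky's spec.)

class KLine:
    def __init__(self, name, agents, predicates=None):
        self.name = name                      # label for the remembered experience
        self.agents = set(agents)             # agents wired together by this K-line
        self.predicates = predicates or {}    # agent -> older K-line describing it

    def tree(self, depth=0):
        """Return the nested, syntactical-tree-like structure as text."""
        pad = "  " * depth
        lines = [f"{pad}{self.name}: {sorted(self.agents)}"]
        for agent, kline in self.predicates.items():
            lines.append(f"{pad}  {agent} is described by:")
            lines.append(kline.tree(depth + 2))
        return "\n".join(lines)


# Older K-lines, formed from earlier experiences.
jack = KLine("JACK", ["Male", "Outside", "Young"])
kite = KLine("KITE", ["Paper", "String", "Wind"])

# The new experience (Jack flies a kite) wires its agents together and
# attaches the most recently active K-lines as predicates of those agents.
jack_flies_kite = KLine(
    "JACK-FLY-KITE",
    ["Jack", "Fly", "Kite"],
    predicates={"Jack": jack, "Kite": kite},
)

print(jack_flies_kite.tree())
```

The nesting, and nothing else, is the point: a memory built this way already has the shape of a syntactical tree, which is exactly the temptation described above.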
Predication links Wells' Time Machine to all possible times (the set of possible objects). Although K-lines set up a more complicated set of relations and associations, these relationships are organized in a rough equivalent of predication. The Time Machine works not only like a parser, it reduces all temporality to the form of what we understand as memory. Our experience of time, in spite of the role of consciousness and memory in determining continuity and particular expectations, is quite different. For Wells the content of all possible moments is pre-determined and already established within the synchronic plane on (or in) which the machine travels. Our sense of time, however, and also unlike the closed temporal-spatial loops described in Gödel's solution of Einstein's
gravitational field equations in General Relativity, is of a totalizing alteration or change in the entire universe, where the content of future moments is understood as undetermined and therefore not yet real. In our experience there is a logical distinction between change (a physical process) and Time (our construction of this change into a form we can perceive). Time arises as an effect of our phenomenal observation of a succession of changes, determined in relation to the relative continuities of other objects or events, and in relation to the continuity of the phenomenal frame through which we represent reality (what we call our subjective consciousness).
14.3 Generating the present from language: "A Rose is a Rose is a Rose"
To perceive time we must represent or model it. Any such model must organize sensory input within a temporal structure. Such a structure is difficult to imagine because the word "structure" defines spatial relationships, and thus can only be used in a vague metaphorical sense when used to organize temporality. What we recognize as temporal structure is really repetition (of the sort that orders and constructs music--which is the formal art of structuring time in relation to sound).
Human attempts to re-create a continuous present rely on repetition in order to split the sound of the word (signifier) from its concept or meaning (signified and referent). In other words, repetition dissolves the symbolic content of a word by erasing semantic difference. As meaning is dissolved, the temporal dimension slows to nothing more than an awareness of the progressive loss of meaning in the sentence. Our experience of reading non-sense exposes a curious relation between our awareness of temporal succession and language.
A single image is not splendor. Dirty is yellow. A sign of more in not mentioned. A piece of coffee is not a detainer. The resemblance to yellow is dirtier and distincter. The clean mixture is whiter and not coal color, never more coal color than altogether.
The sight of a reason, the same sight slighter, the sight of a simpler negative answer, the same sore sounder, the intention to wishing, the same splendor, the same furniture. (Stein, 463)
Unlike some of Lewis Carroll's nonsense lyrics, this passage plays between sense and nonsense in the attempt to construct what Stein called a continuous present. What we apprehend as sense defines a frame, a set of expectations, which the subsequent non-sense dissolves. The failure of the frame to interpret the semantic content of the words throws us into an interpretative loop. We interpret that loop as a failure to progress beyond the immediate context.
Language is not simply structured serially, that is, it does not function through a succession of words. If it did, Stein's passage would clearly make sense. Language is structured through a series of nested domains. In a highly repetitive sentence, "Rose is a Rose . . .", these nested domains (syntactical categories) collapse into a tautological chain. In Stein's non-sense passage semantic meaning marks a stable moment (or frame) which disintegrates into indeterminacy, but still within a stable syntax. The effect is to model a mind capable of formulating short-term predictions, but incapable of organizing the facts (the words) in order to satisfy these predictions.
Stein wants to think like a chimp. D. Premack, in his attempt to evaluate the degree to which the chimpanzee has a theory of mind, formulates a distinction between simple and complex states. Simple states are "hard-wired, automatic and reflex-like, and encapsulated" (172). He defines these states as "seeing, wanting, and expecting". The pathways determining each of these (with varying degrees of complexity) are tied to direct
sensory input, such that there is little mediating control by the organism over these states. Complex states, however, are highly mediated states in which some level of decision-making is required in order to determine the state. The prototypical example is belief. Belief arises as a mediating and, some would say, conscious response to particular forms of doubt. Premack distinguishes two forms of doubt (and consequently of belief): doubt in the veracity of sensory information (epistemological doubt) and doubt in the information communicated by a member of one's group (fear of deception).
Epistemological doubt is countered by conscious attempts to construct an external means of determining the truth of sensory input: observation, experiment, analysis, reason, art, religion. Premack characterizes epistemological doubt with three questions: "Do I really see X?" "Do I really want X?" "Are my expectations well founded?" If we ignore the problem of their linguistic form, these questions describe a critical process of categorization. Epistemological doubt of this sort is predicated on (and expresses or functions through) a distinction between truth and falsity. In other words these questions, defined by their use of "really" and "well-founded", require doubt to be in place in order to be asked. They appear initially to be questions of domain. These questions are what I would call questions of syntax, concerned with what belongs where as what. In other words syntax defines a set of domains that taken together define the form and legitimacy of a particular string to function within language, that is, to be intelligible. Syntax, however and to whatever degree of specificity it is defined, constructs the possibility of determining legitimacy or, in relation to our epistemological doubt, truth. Syntax implies an established semantics, just as questioning the truth of X requires the possibility of falsity.
Syntax determines the domain of legitimacy as a means of structuring the possibility of doubt (or failure). Because a particular word might not be legitimate, syntax divorces the criterion for truth from the content of the word. The specific form of an entity is understood as variable and possibly false. This does not mean that a grammarian's strict definition of syntax determines the sense of a sentence. Syntax is itself much more fluid than commonly supposed. Strict syntactical rules simply describe the center of a syntactical domain. One can still make sense of words used within an incorrect syntax as long as they still reside within the domain defining their relationship with other words (and
thus their legitimacy as members of a meaningful string). Once a word, however, can no longer be related to other words in a meaningful way, that word has left the syntactical domain that determined its legitimacy (and thus its meaningfulness). Thus one can see why a cognitive scientist like Schank argues for semantics over syntax. He should be arguing, however, for a broader definition of syntax, a syntax that determines relationships through the activation of semantic content within a particular domain (a possible set of relations). Schank's script or frame is a syntactical structure, one that determines the legitimacy or meaningfulness of words by the establishment of a set of possible meanings, albeit possible meanings determined by previous knowledge. Belief as a form of long-range prediction depends on a syntactical elaboration of the present in order to create the possibility of a future.
The possibility of Premack's second form of doubt is dependent on what R. Seyfarth and D. Cheney call, in their study of meaning and language in Vervet monkeys, the attribution of mental states "such as knowledge, beliefs, and desires to others" (126).
In their study they conclude that Vervet monkeys, in spite of the character of three specific calls and the different responses these calls cause in those who hear them (warning of a snake as opposed to a martial eagle or a leopard), do not have a language, because they cannot determine the difference between a knowledgeable audience and an ignorant audience (127). They apparently cannot evaluate the knowledge of those around them, because their communication is not attempting to change the mental state of their audience. It is simply a signal activating a learned response. The attribution of a mental state to another seems to require the ability to choose. Making a choice (a kind of
thinking) involves constructing a world, a frame, or a domain of relations organized according to a particular grammar (a set of rules and values).
Chimps make choices. Do they then make conceptual worlds? In an experiment a female chimp was trained to add and count. Part of the training procedure required the chimp to play a game where two piles of candies were counted out. Whichever pile the chimp first pointed to was given to another chimp sitting in front of her in a cage. The object of the game was to get the most candies. Whenever the candies were counted out, the chimp invariably pointed at the larger pile, thus losing. When numbers written on cards were substituted for the candies, however, the chimp was always able to point to the smaller number, thus ensuring that she would be rewarded with the larger number of treats.
One could speculate. When the chimp saw the food an instinctual grab/eat rule (or agent) determined her choice. She played the candy game using her instinctual rules. She needed instead to nest the candy game within her instinctual game, by replacing the instinctual mechanism for achieving her desire for the most food with the mechanism
defined by the game. By encoding the "amounts" of candy within a numerical domain the relationship between the two piles can be determined without invoking the grab/eat rule. This information is no longer stored as an image within the grab/eat rule. The numerical domain indicates the value of each number in relation to the other, not in relation to the chimp or her desire to eat. The failure of any abstract system to define this kind of personal relation requires that its values be accessed and assessed from another domain and according to another grammar. In this case, the chimp can then apply the rules of the candy game to these "amounts" in order to determine the winning choice.
Any abstract system sets up a play between at least two different systems or domains. This recursive interaction creates a kind of consciousness, because it creates the illusion of free choice. This choice is not in any sense free; it is simply the mediation of one domain by another. The chimp evaluates how to get food [an algorithm defined by CHOOSE THE MOST DESIRABLE and THE MOST DESIRABLE IS THE MOST FOOD] through the algorithm of the candy game [CHOOSE THE LEAST FIRST]. The transcription of candies into number allows both of these algorithms to function together.
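A minimal sketch of the mediation just described, with the two rules written out as functions. The function names and the explicit numeric-encoding step are my own illustrative framing of the experiment, not a claim about chimpanzee cognition.

```python
# Two domains, two grammars: the instinctual grab/eat rule and the rule of
# the candy game. Encoding the piles as numbers lets the game's rule mediate
# the instinct. (Function names are illustrative assumptions.)

def grab_eat_rule(piles):
    """Instinct: CHOOSE THE MOST DESIRABLE; THE MOST DESIRABLE IS THE MOST FOOD."""
    return max(piles, key=lambda pile: pile["amount"])

def candy_game_rule(values):
    """Game: CHOOSE THE LEAST FIRST (the pile pointed to goes to the other chimp)."""
    return min(values)

def play_with_candies(piles):
    # Seeing food triggers the instinctual rule: point at the bigger pile, lose it.
    pointed_at = grab_eat_rule(piles)
    kept = [p for p in piles if p is not pointed_at][0]
    return kept["amount"]

def play_with_numbers(piles):
    # Transcribing candies into numbers creates a domain of pure value;
    # the candy game's rule can now operate without invoking grab/eat.
    values = [p["amount"] for p in piles]
    pointed_at = candy_game_rule(values)    # point at the smaller number
    kept = max(values)                      # the larger amount comes back as the reward
    return kept

piles = [{"amount": 7}, {"amount": 2}]
print(play_with_candies(piles))   # 2: the instinct gives away the bigger pile
print(play_with_numbers(piles))   # 7: the numeric encoding wins the bigger pile
```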
The three systems of relations in the previous experiment are all defined as systems of value. These systems of value are really systems of meaning because they are sets of relations. Semantic meaning within any discourse is primarily the articulation of a particular set of relations. We assume this set of relations is not arbitrary. This means they are constructed according to a particular logic. This logic is a logic of difference.
The statement "x means y" requires at some level, although not necessarily at the conceptual level Saussure suggests, that the notions "x" and "y" be understood as "not z",
? "not w". Even to say this, however, requires that the notion "x" be crystallized in such a way that such a comparison is possible. How do we know that 'x' is not the same as ly'? We know this because they sound different, or because their shapes are different. Shape and sound serve as common denominators. Thus to speak o f difference requires that "x" function within a realm of common terms or qualities, which means a realm of pre- established possible relations and configurations. Thus what I first called a logic of difference is a difference mediated by a commonality. This common context is not given, but is constructed. (On a fundamental level, however, the initial contexts would be created primarily out o f the biological structures used to interpret our interactions with our environment).
Value is determined through differences expressed within a common context. A common context means in relation to a binary descriptive system: between two similar phonemes, between more and less, and so on. Values generated out of these metaphoric matrices are re-coded into more complex systems of relations. We can easily see how this works in the chimp experiment. How can the chimp nest the rules of the candy game within its instinctual grab/eat game? Encoding the amount of candy in numbers created a domain of pure value organized around more and less. This information was embedded in the world of the candy game, where the output of the algorithm CHOOSE THE LESS selects the lesser value. This value is eliminated and the larger value is transferred into the
domain of the chimp's grab/eat rule. In each case value is determined by how the event is encoded within a system of values. The situation facing the chimp is successively translated into different conceptual worlds, where the meaning of that information is
determined by the values (in this case simple algorithms) organizing the relations within that world.
Thus, thinking is a process of world-making. A Dutch psychologist, de Groot, discovered that chess masters, through a process of implicit pruning, do not see bad moves. They act within or through a grammar or a domain defined exclusively by good moves. They construct a conceptual world of good moves. This world defines what they see: it defines the moves the chess master recognizes as possible.
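De Groot's point can be sketched as a generate-and-filter step. The predicate standing in for the master's trained grammar is a hypothetical stand-in of my own; nothing here models actual chess knowledge.

```python
# de Groot's observation rendered as a filter: the "conceptual world" of a
# chess master contains only the moves that survive an implicitly learned
# grammar of good moves. `is_good_move` is a stand-in assumption, not a
# model of real chess expertise.

def legal_moves(position):
    # Placeholder generator of all legal moves for a position.
    return position["moves"]

def is_good_move(move, grammar):
    # The learned grammar: the set of moves the master recognizes as good.
    return move in grammar

def perceived_moves(position, grammar):
    """What the master 'sees': bad moves never enter awareness at all."""
    return [m for m in legal_moves(position) if is_good_move(m, grammar)]

position = {"moves": ["Qh5", "Nf3", "a4", "Bc4", "h3"]}
masters_grammar = {"Nf3", "Bc4"}                     # the world of good moves
print(perceived_moves(position, masters_grammar))    # ['Nf3', 'Bc4']
```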
How does someone determine which of the possible moves or relations is correct, best, or relevant? This is too large a question to answer here. I can suggest, however, how we determine the relevant from the irrelevant within a frame or grammar in a general way. In an obvious way relevance is a function of understanding how something functions within a frame or domain. Frames must be extremely flexible to be able to handle the high degree of uncertainty and complexity of our experience. If we re-define determining relevance as making a choice we can make some progress. Choice arises through a kind of recursive nesting of domains and grammars. The interaction of these domains and grammars selects relationships from each that are integrated within a new domain and grammar, thus constructing an alternate world. This world defines what is possible. One assumes that at some level or in some world one is left with only a single possibility. This choice (really an algorithm) is transported to higher levels of organization where it appears as though we have made a conscious choice.
14.4 A wink in time means two
Before we can build a belief in the future, we must build awareness. We can build a basic time detector by using difference detectors, or what Minsky calls time-blinking:
Any agent that is sensitive to changes in time can also be used to detect differences. For whenever we expose such an agent, first to a situation A and then to a situation B, any output from that agent will signify some difference between A and B. (240)
While time-blinking partially removes the problem of comparing two inputs from different agents without requiring synchronous and identical frames (a trick it does largely by having a temporary memory), it does not explain how difference is translated into an ordered succession of moments. One could, however, easily input the recognition of a perceptual difference to a kind of tape, as if marking a date or a second, which is then advanced. Such a procedure would record successive differences, which could function like a clock for human beings. It could never generate something like temporal experience for a machine. By what process is temporal order perceived and organized such that consciousness understands itself as the present? Do all the inputs that make up consciousness run through a single agent (an agent dangerously close to a homunculus) in rapid succession in order that we can perceive difference and call that difference time? If not, how is the information that our brain receives, processes and constructs, all at different rates, constructed into a now? The formation of defined moments is a process of constructing meaningful information. A moment, therefore, defines those "differences that make a difference." Time is an information structure within which we find ourselves.
Thus our machine must generate, in succession, representations of the world in which it finds itself. It has to turn itself inside out.
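Here is a minimal sketch of Minsky's time-blinking agent as quoted above, together with the "tape" of successive differences mentioned in the discussion. The class and method names are my assumptions; the point of the sketch is only that such a tape records differences like a clock would, without by itself generating anything like temporal experience.

```python
# Time-blinking as a difference detector: expose the agent to situation A,
# then to situation B; any output signifies some difference between them.
# The appended "tape" of differences works like a clock, not like experience.
# (Class and method names are illustrative assumptions.)

class DifferenceDetector:
    def __init__(self):
        self.previous = None    # temporary memory of the last situation
        self.tape = []          # record of successive differences ("dates")

    def expose(self, situation):
        """Return the set of features that changed since the last exposure."""
        if self.previous is None:
            difference = set()
        else:
            difference = self.previous ^ situation    # symmetric difference
        if difference:
            self.tape.append(difference)              # advance the tape one mark
        self.previous = situation
        return difference

detector = DifferenceDetector()
detector.expose({"A"})           # first exposure: nothing to compare yet
print(detector.expose({"B"}))    # a difference between A and B, so the tape advances
print(detector.tape)             # one recorded difference: a clock tick, not a "now"
```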
14. 5 "Be! Verb umprincipiant through the trancitive spaces! "
My machine lives in a world of simple objects. As a time machine it does not recognize objects as objects, but objecthood, as that which can be noticed as something, which it perceives in what it considers a partial instant of time. It does not recognize the world or the environment, but only something that triggers its optical apparatus as a something to be noticed. Its optical apparatus has already simplified its phenomenal field into this categorical perception, which we can call the set of objects. This information is sent to the Optical (O) input agent, which registers this information within a formally defined phenomenal field. The machine can recognize separate examples of objecthood within a range from 1 to 5. This phenomenal field is defined by a set of switches (5) that represent an object by being on and no object by being off. Thus the machine's optical apparatus can immediately register up to 5 objects. An initial input, N1, once it has set these switches (once the optical information is represented as an internal state), is transferred to an Optical analyzer agent where it again sets a similar set of switches. If the next input (N2) to the Optical input agent resets the switches, it is registered as a different moment and is thus sent on to the O difference detector. The difference detector records that a difference has occurred. If its information is identical to N1, no change of state takes place. Because our machine has only one sensory input at this point, the failure to register
difference is understood as non-time; the first moment "continues" because the first state continues.
Once N1 has reset the difference detector, indicating a different moment registered as a change of state, it enters a feedback loop. The detector's switches are then immediately reset by N2 (if there is an N2). N2 is similarly sent on a feedback loop but at twice the rate. Immediately after N2 resets the difference detector, N1 resets it again.
This difference is converted into a value indicating the difference between these two states (degree of change). The second cycle resets the detector to N2 before the next signal from O input is received.
Although it is not necessary to separate the process of detecting change from that of determining the degree of change, their functional difference allows us to split the present into two phenomenal elements or interacting models: an experiential model and an evaluatory process. The difference detector, as the second element, serves as a kind of short-term memory within the phenomenal experience. It has, however, changed the nature of the information perceived by the machine. Both kinds of information determine the nature of the present. It is the detector's separation from the initial model of change
that allows the machine (to the degree that it is defined by its internal processes) to be aware of change. This means that the difference detector makes change, as defined by the O input agent, meaningful. The moment is actually not defined by any single state but by two states defined in relation to each other, embedded within the degree of difference determined by the difference detector (so that N1 and N2 will be stored together as P1 and P2). The difference detector, however, is inadequate for any short-term memory. We
need a memory that encodes the consciousness of the machine, and that requires that the information from both agents be stored. Both N1 and N2 are sent into a single short-term memory unit (they are connected by a simple K-line). This unit is organized within short-term memory (STM) according to two parameters: the order received (representing temporal succession) and the degree of difference. Short-term memory (STM) stores both our experience of change (suitably simplified) and a particular fact about these experiences. These are not identical bits of information. The analyzer determines the relevant facts for the time machine. Given the formally defined character of the phenomenal field, this information can be used to actually structure STM. (There are 10
possible values, 1 through 5 and -1 through -5, defined in relation to the second state (N2); there is no 0, of course.)
[Figure: schematic of the machine's perceptual pipeline: the Optical Input Agent (change detector) feeds the Optical Difference Detector (N2 input state), which yields a difference value (fact) and an experiential order (N1 = P1), both stored in Short Term Memory.]
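A minimal sketch of the pipeline just described and diagrammed, assuming concrete data types for what the text leaves schematic (the five switches, the N1/N2 comparison, the signed difference value, and the paired storage of states as P1 and P2 in short-term memory). The class names are mine, and the timing of the feedback loop is ignored.

```python
# The time machine's perceptual pipeline, following the description above:
# an optical input agent sets five switches, a difference detector compares
# successive states, and short-term memory stores the two states together
# (as P1, P2) along with the signed degree of difference.
# (Names and data types are illustrative assumptions; feedback timing is omitted.)

class OpticalInputAgent:
    """Registers 1-5 instances of 'objecthood' as a set of five on/off switches."""
    def register(self, object_count):
        count = max(0, min(5, object_count))
        return tuple(i < count for i in range(5))

class OpticalDifferenceDetector:
    """Detects a change of state and converts it into a signed degree of change."""
    def __init__(self):
        self.state = None
    def detect(self, switches):
        if switches == self.state:
            return None                      # no difference: "non-time", the moment continues
        previous, self.state = self.state, switches
        if previous is None:
            return None                      # first exposure: nothing to compare yet
        degree = sum(switches) - sum(previous)   # value in -5..5, never 0 here
        return previous, switches, degree

class ShortTermMemory:
    """Stores each moment as a pair of states (P1, P2) plus the difference value."""
    def __init__(self):
        self.moments = []                    # ordered by arrival: experiential order
    def store(self, p1, p2, degree):
        self.moments.append({"P1": p1, "P2": p2, "difference": degree})

optical = OpticalInputAgent()
detector = OpticalDifferenceDetector()
stm = ShortTermMemory()

for n_objects in [2, 2, 4, 1]:               # successive optical inputs N1, N2, ...
    result = detector.detect(optical.register(n_objects))
    if result:
        stm.store(*result)

print([m["difference"] for m in stm.moments])    # [2, -3]
```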
This is, of course, not a model of human perception. We have eliminated some of the critical problems in constructing a model of our sense of time. The world and the information from that world are immeasurably simplified. In my machine, the formal continuity between moments is built into the form in which the sensory input is represented (a defined set of switches). With only one kind of sensory information (number of objects), we do not have the problem of synchronizing and integrating the information from different senses (and their processing agents within the brain). The initial formation of the moment is defined by the limit inherent within the speed by which the optical apparatus can convert its sensory input into a set of objects. Changes that take place at a rate faster than the machine's physical ability to recognize a set of objects are simply not registered.
14.6 The logic of short-term prediction
If my machine can recognize patterns, such that the number of objects defined in one moment is always followed by a defined number in another moment, it can make short-term predictions. Once a pattern has been established, it can be recalled and run through a memory recall agent (another difference detector) at a rate faster than the information processed by the Optical agent. I will not spend much time constructing the mechanics of such a process. What is important in this problem is less the way in which this information is stored than the syntactical logic it necessarily generates. The logic here is simple: if x (a certain number of objects) is registered, it will be followed by y
(another number of objects). Such predictions can, of course, only happen if there is a pattern to be recognized.
Initially Short Term Memory (STM) is evaluated to determine a match with the initial optical input (N1). If a match is found (through difference detectors attached to each memory slot), then the entire temporal unit, containing two states (P1, P2; remember P1), is transferred to the Memory Recall Agent. N1 and P1 are processed as identical. The P2 state will not have an initial correlate state (it is run before N2). Because a prediction is based on understanding P2 as if it were N2, P2 must be understood as an N,
but not as the real N2. Consequently, any such predictions require a symbolic economy allowing for the conversion of Px into *Nx (where * indicates this input has an ontological claim equal in value but different in kind). The creation of an *Nx state represents the creation of a temporal syntax. In order for the correlation of P2 with *Nx to function as an expectation of N2, the machine must be able to act (or change its state, as with Pavlov's dogs) on the basis of this correlation.
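A minimal sketch of this prediction step, assuming the STM structure from the earlier sketch: when a fresh input N1 matches the stored P1 of some remembered moment, the stored P2 is recalled and re-marked as an expectation, the asterisk-tagged *N2 described above. The representation of the tag and the function name are my own illustrative choices.

```python
# Short-term prediction: match the current input N1 against the P1 slots of
# short-term memory; if a stored pattern matches, promote its P2 into an
# expected state *N2, an input with an equal ontological claim but of a
# different kind. (The representation of the '*' tag is an assumption.)

def recall_prediction(stm_moments, n1):
    """Return ('*N', expected_state) if a stored moment's P1 matches n1, else None."""
    for moment in stm_moments:
        if moment["P1"] == n1:                # N1 and P1 are processed as identical
            return ("*N", moment["P2"])       # P2 stands in for the not-yet-real N2
    return None

# A remembered moment: two objects were followed by four objects.
stm_moments = [{"P1": (True, True, False, False, False),
                "P2": (True, True, True, True, False),
                "difference": 2}]

n1 = (True, True, False, False, False)        # current optical input
expectation = recall_prediction(stm_moments, n1)
print(expectation)    # ('*N', (True, True, True, True, False)): expect four objects next
```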
In Pavlov's bell experiment a similar temporal logic is manipulated. The equation of food = salivate is re-wired to bell = salivate and is therefore based on an associated equivalence between food and bell. This equivalence, however, does not take place within a synchronic plane. It works because the sound of the bell recalls an established pattern recorded in memory. The P2 state of food following the P1 state of the bell is understood as the same as an N state. A symbolic interchange takes place between the present and the future (patterned on a previously established relationship). A structural relationship has
been established between these two diachronic states, which allows the content of one state (the bell) to stand for the content of the other (food).
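The same symbolic interchange can be sketched as a conditioned association, a schematic of the logic described here rather than a model of the physiology: once the bell-then-food pattern is stored, the bell alone recalls the P2 food state as an expected *N state and triggers the response.

```python
# Pavlov's re-wiring as the temporal logic above: food=salivate is extended
# to bell=salivate because the stored pattern (P1=bell, P2=food) lets the
# bell recall food as an expected state. (Purely schematic.)

unconditioned = {"food": "salivate"}           # hard-wired response
memory = [{"P1": "bell", "P2": "food"}]        # stored diachronic pattern

def respond(stimulus):
    if stimulus in unconditioned:
        return unconditioned[stimulus]
    for pattern in memory:
        if pattern["P1"] == stimulus:           # the bell recalls the stored pattern
            expected = pattern["P2"]            # food, taken as an expected *N state
            return unconditioned.get(expected)  # respond to the expectation itself
    return None

print(respond("food"))   # salivate (unconditioned)
print(respond("bell"))   # salivate (conditioned, via the recalled pattern)
```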
Predictions based on short-term memory are stimulus dependent. They disappear as soon as the environmental stimulus that suggested them disappears. They define only a temporary and unstable future. A machine with only this kind of memory could not engage in any long-term planning. Such plans require an ability to invoke the possibility of the future on command, as well as much more complex reasoning and creative modeling.
14.7 The continuous future: hearing of and speaking in the new world order
The Time Machine's world is now a rectangular plane, on which, at some unspecified but unreachable distance, a number of objects flash in and out of existence. This plane is divided into a series of parallel districts (wide strips). The number of objects in each district changes independently of the number of objects in other districts. No machine, of which there are now many, can see the objects (the number of objects) in a district other than the one it is in. The machines can move to other districts. To prevent any machine from becoming a part of the object field of another machine, all machines will move along the same line and can move around each other if one machine is in the way of another. (The eyes move to the side of the machine's head. The machines bump into other machines and then move around them; I will not include this perception and response within my modeling here for a variety of reasons. Think of this bumping and movement as a function of the machine's autonomic nervous system.) (See Figure A.)
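A minimal sketch of the world just described, assuming simple random dynamics for the flashing objects. The number of districts, the update rule, and the single movement line are assumptions for illustration; the bumping behaviour mentioned above is left out, as the text leaves it out.

```python
# The new world of section 14.7: a plane divided into parallel districts,
# each flashing between 0 and 5 objects independently; several machines sit
# on one movement line, and each sees only the object count of its own
# district. (District count and update rule are illustrative assumptions.)
import random

NUM_DISTRICTS = 4

def update_districts(districts):
    """Each district's object count changes independently of the others."""
    return [random.randint(0, 5) for _ in districts]

class Machine:
    def __init__(self, district):
        self.district = district               # position along the single movement line
    def observe(self, districts):
        return districts[self.district]        # sees only its own district's objects
    def move(self, step, num_districts):
        self.district = max(0, min(num_districts - 1, self.district + step))

districts = [0] * NUM_DISTRICTS
machines = [Machine(0), Machine(2)]

for tick in range(3):
    districts = update_districts(districts)
    print(tick, [m.observe(districts) for m in machines])

machines[0].move(+1, NUM_DISTRICTS)            # a machine moves to a neighboring district
```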
Reproduced with permission of the copyright owner.
I am building in this chapter a causal aesthetic, turning causal principles (a simplified scientific world) into an allegorical picture o f building a mind within itself, but limitedbyitsworld. Inmanywaysthismodelwillenactanaspectofthephilosophical aestheticofthedissertationasawhole. Themockengineeringinthischapterwillnot, therefore, be an exercise in cognitive philosophy, but in philosophical aesthetics: an allegory about how ontological limits describe ways of making sense through the process
Notes for this chapter are on page 613
Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.
571
? o f building. Call this thinking by making machines as fictions that excavate how an T becomes an 'our' and difference becomes a time.
My machines arejugs which think ofthemselves as minds. Such a fiction means something within a variety of contexts (in philosophy or science fiction or cognitive science). Mygoalhereisneithertoexplorethatmeaningnortheassumptionsofthose contexts and uses. I am using these machines (as fictions and as causal models) to make visible the possibilities of orienting ourselves within the causal limits these machine(s) describe. Thisorientationismeanttopressuretheconfusedcollapseofmodelsof animation and mind into interpretive allegories (in order to give them a kind o f ontological
force) that I described in Jakobson and Vendler and in a different way in Romantic fragmentsingeneralandinKeatspowerfulimageofsoul-making. Soul-makingforme, operates toward the limits of person-making not as a replacement for person-making (for science). In this sense this engineering exercise is a theological exercise meant to question how we apply our interpretations o f art and texts and others as versions o f minds.
If models of mind underlie many of our aesthetic models and many of the ways in which we enter, exit and interpret ourselves within metaphors, allegories, and language games,thenmodelsofthemindcanmakevisibledifferentkindsofaesthetics. Thismight meanthattheoriesofmindshouldbeunderstoodasaestheticmodels. WhileIthinkthat this is true, for the most part, this does not diminish these theories, nor does it tells us much about what kind of aesthetics these are. A full analysis of a theory of mind as an aesthetic would require another dissertation. What my machine(s) are meant to do is to showthekindofaestheticsuchtheoriescangenerate. Inotherwords,howwouldwe
with permission of the copyright owner. Further reproduction prohibited without permission.
? inhabit a world described by and large by causal and mathematical limits as meaningful without ignoring those limits? My machines suggest a way o f beginning to orient ourselves in relation to these limits in a way that will move beyond the kind of pictures of the subjunctive enacted by both Heidegger in "Das Ding" and Eliot in The Waste Land. This is why I think it describes a kind o f theology.
In answering the question cwhat does it mean to be built or described as an effect within a causal chain? ' I have attempted to build a set of theoretical machines that perceive time by figuring themselves (in a fundamental sense) within the temporal order they perceive. 1 By this process the machines construct themselves within a world of temporal surfaces: they figure themselves continually and as the succession o f limits that determine their experience as anything. This is a picture of the writing at limits that I have investigatedinWittgenstein,Joyce,Eliot,Keats,andHeideggerfromtheunderside. Itis not meant to generate a theory, but to offer itselfas another writing toward and at these limits.
This means I have experimented in a form of Science Fiction, in which I have attempted to slice an aspect o f the logic o f human temporality out o f the conscious/ unconscious logic ofthe human mind. The problem in constructing a Time-perceiving machine is in constructing a future, which means constructing a fundamental ontological belief that the future exists. I have attempted to build a theoretical machine that perceives time. Suchatimemachinesimplifiesourconsciousapprehensiontothebasicprocessof determining successive differences. I will use bits of this machine in order to analyze the conceptual logic and organization which allows and constructs our phenomenal perception
Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.
573
? o f time. High-level problems in the organization o f mental processes can be explored by building simple machines, or systems, not in an attempt to model what actually happens in the brain, nor to offer an explanation o f how a particular agent might function in a real machine, but in order to model these conceptual problems and their solutions as functions and processes.
I have constructed these theoretical machines, or rather one evolving machine, not in order to model the brain or its evolution, nor even to posit a possible construction of a real machine or set of algorithms to be included in some artificial intelligence machine (although it has some relevance to this). My machines are manipulations, within an imaginary and impossible world, o f the theoretical problems involved in the human inhabitation within time as a grammatical limit. These machines pressure the problem of time into the kind of clarity that allows the structure, and more importantly the significance, of temporality to function in relation to the more complex problems of consciousness and language. I will use my machines to explore and explain the organization and conceptual problems attending any attempt at modeling "reality" as meaningful and not just as a "thing".
In effect this means questioning the relation between the possibility o f meaning and the possibility oftime at the limits ofsense and nonsense. How can you build yourselfinto a grammar using only causal mechanisms? In my attempt to build such a grammar I am trying to outline the point of conceptual confusion around which language breaks down intononsenseinFinnegansWake,forexample. Iamtryingtoapproachorwriteator within the limit described by the soul. This means I am trying to figure causal mechanisms
Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.
574
? at the limit o f the interpretative relations that I have described so far. These machine masks will not disprove Wittgensteinian or Jocyean grammar, but will describe a causal language game(s) evolving toward the limits these grammars describe.
I return first to a description of philosophy I introduced in the second chapter. Kurt Godel, in explaining his philosophical work, described philosophy as the analysis of concepts, and science as the use of these same concepts. Engineering philosophy into a machine, a philosophy machine, allows us to reconceive and merge Godel's distinctions within a model o f thinking where the analysis o f concepts, categories, and logics follows from their construction and use in an evolving machine.
I will begin this process of construction with the hypothesis that human language and the logic of human temporality simultaneously create each other as systems of symbolization organizing the world and one's relationship to it. Time requires a kind of syntax, a structure allowing for symbolic exchange between different representations of mental states. Instead of the pure syntax of calculus, the syntactical structure of temporality mirrors the structure of symbolic communication in human language, such that to construct a future is to construct a language. Thus, consciousness functions through or within the construction of human time (specifically of a future which mimics the structure of the present) and the simultaneous or contingent construction of human language.
Language and temporality simultaneously create each other as systems of symbolization that organize the world (our experience) and one's relationship to it. (By symbolization I mean the process by which relationships and identities are configured into particular structures and/or signs).
Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.
575
? In my analysis o f Wittgenstein I showed how the sense that the future meaning o f a word is present in its form (or the sense that future is somehow present as a possibility in thepresent)displayedhowweinhabitourgrammar. Weexperiencechangethroughour shifting position between language games and through our functioning within the temporal orderdescribedbythegrammarofparticularlanguagegames. Thesenseofthefuture arose as an interpretation o f our relation within the structures o f the grammars that describe our forms o f life. I will use my machines to excavate this grammar. Thus in constructing a future my machine(s) will, in general, abstractform from particular contents, and in so doing articulate a symbolic and syntactical structural relationship between external and internal states that will come to stand for the future.
I list the following as the engineering assumptions and goals underlying the construction o f my machine(s):
1) Temporality functions through a syntax, a structure allowing for symbolic exchange between different representations o f mental states. The syntactical structure o f temporality mirrors the structure o f symbolic communication in human language, such that to construct a future is to construct a language.
2) One can identify two forms oftemporal syntax: (a) a simple equivalence between past states and current states that allow for short-term predictions based on simple patterns or causal associations between a small set of inputs and (b) a more complex syntax that functions as a continuous future.
Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.
576
? 3) My problem is to turn the Turing test around. Instead o f attributing a thinking and/or consciousness to a machine, I need to construct a model in which a machine will attribute its mental state and condition to another machine (or at some point to a human).
4) self-consciousness arises from a model o f other minds as forms o f one's own mind.
5) The possibility ofthe future (that is a symbolic structure defining the future) is built by using structures derived from meta-temporal identities, identities that extend over time but whose contents, except as a markers o f identity, are meaningless.
14. 2 A time machine linguist
A Time Machine constructs a diachronic ontology, a logic organizing and determining the forms o f existence by nesting them within the phenomenal field defined by the machine, as a mechanical surrogate for the individual subject. A time machine, like the one built by H. G. Wells, spatializes time, in order that the appearance of a succession of momentsisdeterminedbythemovement,asonaplain,ofthefieldoftheobserver. In Wells' machine a strangely immaterial metal-like bar serves as a transitive link between the machine (as the physical embodiment and protective encapsulation of the individual subject) and all other times. Consequently, the transitive bar functions as a verb which
Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.
577
? translates the subject into the future or past, both ofwhich serve as predicate objects, in essence defining the subject. Wells' Time Machine, therefore, defines and functions through the logic o f the copula.
The disguised use o f this linguistic metaphor in organizing Time highlights the fundamental cognitive dependence o f the conceptual structures o f time and language. Language works as a form o f external memory. In this sense language and memory both transport information through time. We use this memory as a means of actively re-shaping elements o f our experience, including other people, into forms and relations that allow for thesuccessofourgoals. Inthissenseweattempttore-formulatethroughakindofviral infection, the internal state, the descriptions o f the world and the goals embodied in these descriptions (goals are often understood as ideal descriptions of self) of other people.
If language is a form of memory, then we should expect our memories to function throughakindoflanguage(asystemofsymbolicrelations). Thisiscertainlytrueof Minsky's K-Iines. K-lines wire together a set of agents active during an experience, and thusformamemoryofthatexperience. Theconnectionsbetweenagentsvaryinstrength, forming a hierarchy of levels. A K-line, to the degree that it links agents within such a hierarchy of relevant levels, sketches a set of relations within a loose syntax. Minsky organizes the formation o f these K-lines within "societies o f memory. " New K-lines are attached to the most recently active K-lines. When (Jack) (Fly) (Kite) are connected by a new K-line, predicates, that is, the K-lines defining each o f these terms, e. g. (Male) (Outside) (Young), are attached to the relevant agent, e. g. (Jack). Because these older K- lines are used to describe the new elements, as predicates, they are in effect nested within
Reproduced with permission of the copyright owner. Further reproduction prohibited without permission.
578
? the new K-line. IfK-lines are used to form "societies ofmemory", therefore, they build syntactical trees. Minsky never addresses why the world should be categorized in this way. Nor does he examine how such categories would be constructed so that we could recognize them. What his model o f memory does demonstrate is the temptation to describe ourselves and our biology within grammars built from our language.
Predication links Well's Time Machine to all possible times (the set o f possible objects). Although K- lines set up a more complicated set o f relations and associations, these relationship are organized in a rough equivalent o f predication. The Time Machine works not only like parser, it reduces all temporality to the form o f what we understand as memory. Ourexperienceoftime,inspiteoftheroleofconsciousnessandmemoryin determining continuity and particular expectations, is quite different. For Well's the content of all possible moments is pre-determined and already established within the synchronicplainon(orin)whichthemachinetravels. Oursenseoftime,however,and also unlike the closed temporal-spatial loops described in Godel's solution ofEinstein's
gravitational field equations in General Relativity, is of a totalizing alteration or change in the entire universe, where the content o f future moments is understood as undetermined and therefore not yet real. In our experience there is a logical distinction between change (a physical process) and Time (our construction of this change into a form we can perceive). Time arises as an effect of our phenomenal observation of a succession of changes, determined in relation to the relative continuities of other objects or events, and in relation to the continuity of the phenomenal frame through which we represent reality (what we call our subjective consciousness).
14.3 Generating the present from language: "A Rose is a Rose is a Rose"
To perceive time we must represent or model it. Any such model must organize sensory input within a temporal structure. Such a structure is difficult to imagine because the word "structure" defines spatial relationships, and thus can only be used in a vague metaphorical sense when used to organize temporality. What we recognize as temporal structure is really repetition (of the sort that orders and constructs music, which is the formal art of structuring time in relation to sound).
Human attempts to re-create a continuous present rely on repetition in order to split the sound of the word (signifier) from its concept or meaning (signified and referent). In other words, repetition dissolves the symbolic content of a word by erasing semantic difference. As meaning is dissolved, the temporal dimension slows to nothing more than an awareness of the progressive loss of meaning in the sentence. Our experience of reading non-sense exposes a curious relation between our awareness of temporal succession and language.
A single image is not splendor. Dirty is yellow. A sign of more in not mentioned. A piece of coffee is not a detainer. The resemblance to yellow is dirtier and distincter. The clean mixture is whiter and not coal color, never more coal color than altogether.
The sight of a reason, the same sight slighter, the sight of a simpler negative answer, the same sore sounder, the intention to wishing, the same splendor, the same furniture. (Stein, 463)
Unlike some of Lewis Carroll's nonsense lyrics, this passage plays between sense and nonsense in the attempt to construct what Stein called a continuous present. What we apprehend as sense defines a frame, a set of expectations, which the subsequent non-sense dissolves. The failure of the frame to interpret the semantic content of the words throws us into an interpretative loop. We interpret that loop as a failure to progress beyond the immediate context.
Language is not simply structured serially, that is, it does not function through a succession of words. If it did, Stein's passage would clearly make sense. Language is structured through a series of nested domains. In a highly repetitive sentence, "Rose is a Rose . . . ," these nested domains (syntactical categories) collapse into a tautological chain. In Stein's non-sense passage semantic meaning marks a stable moment (or frame) which disintegrates into indeterminacy, but still within a stable syntax. The effect is to model a mind capable of formulating short-term predictions, but incapable of organizing the facts (the words) in order to satisfy these predictions.
Stein wants to think like a chimp. D. Premack, in his attempt to evaluate the degree to which the chimpanzee has a theory of mind, formulates a distinction between simple and complex states. Simple states are "hard-wired, automatic and reflex-like, and encapsulated" (172). He defines these states as "seeing, wanting, and expecting." The pathways determining each of these (with varying degrees of complexity) are tied to direct
sensory input, such that there is little mediating control by the organism over these states. Complex states, however, are highly mediated states in which some level of decision-making is required in order to determine the state. The prototypical example is belief. Belief arises as a mediating and, some would say, conscious response to particular forms of doubt. Premack distinguishes two forms of doubt (and consequently of belief): doubt in the veracity of sensory information (epistemological doubt) and doubt in the information communicated by a member of one's group (fear of deception).
Epistemological doubt is countered by conscious attempts to construct an external means of determining the truth of sensory input: observation, experiment, analysis, reason, art, religion. Premack characterizes epistemological doubt with three questions: "Do I really see X?" "Do I really want X?" "Are my expectations well founded?" If we ignore the problem of their linguistic form, these questions describe a critical process of categorization. Epistemological doubt of this sort is predicated on (and expresses or functions through) a distinction between truth and falsity. In other words these questions, defined by their use of "really" and "well-founded," require doubt to be in place in order to be asked. They appear initially to be questions of domain. These questions are what I would call questions of syntax, concerned with what belongs where as what. In other words syntax defines a set of domains that taken together define the form and legitimacy of a particular string to function within language, that is, to be intelligible. Syntax, however and to whatever degree of specificity it is defined, constructs the possibility of determining legitimacy or, in relation to our epistemological doubt, truth. Syntax implies an established semantics, just as questioning the truth of X requires the possibility of falsity.
Syntax determines the domain of legitimacy as a means of structuring the possibility of doubt (or failure). Because a particular word might not be legitimate, syntax divorces the criterion for truth from the content of the word. The specific form of an entity is understood as variable and possibly false. This does not mean that a grammarian's strict definition of syntax determines the sense of a sentence. Syntax is itself much more fluid than commonly supposed. Strict syntactical rules simply describe the center of a syntactical domain. One can still make sense of words used within an incorrect syntax as long as they still reside within the domain defining their relationship with other words (and thus their legitimacy as members of a meaningful string). Once a word, however, can no longer be related to other words in a meaningful way, that word has left the syntactical domain that determined its legitimacy (and thus its meaningfulness). Thus one can see why a cognitive scientist like Schank argues for semantics over syntax. He should be arguing, however, for a broader definition of syntax, a syntax that determines relationships through the activation of semantic content within a particular domain (a possible set of relations). Schank's script or frame is a syntactical structure, one that determines the legitimacy or meaningfulness of words by the establishment of a set of possible meanings, albeit possible meanings determined by previous knowledge. Belief as a form of long-range prediction depends on a syntactical elaboration of the present in order to create the possibility of a future.
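To make this broader sense of syntax concrete, here is a minimal sketch assuming a toy restaurant script whose slot names and vocabulary are invented for illustration (not Schank's published notation): a frame defines a domain of admissible fillers, and a word that can fill no slot has left the domain and so lost its claim to meaningfulness there.

```python
# Illustrative frame: a "restaurant" script with slots whose admissible
# fillers define a domain of legitimacy. Words that cannot fill any slot
# fall outside the domain and lose their meaningfulness within it.
# The slot names and vocabulary are invented for this sketch.

RESTAURANT_SCRIPT = {
    "agent":  {"customer", "waiter", "cook"},
    "action": {"orders", "serves", "eats", "pays"},
    "object": {"menu", "soup", "coffee", "bill"},
}

def legitimate(agent, action, obj, script=RESTAURANT_SCRIPT):
    """A string is 'legitimate' if every word stays inside its slot's domain."""
    return (agent in script["agent"]
            and action in script["action"]
            and obj in script["object"])

print(legitimate("customer", "orders", "coffee"))    # True: inside the domain
print(legitimate("customer", "orders", "detainer"))  # False: the word has left the domain
```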
The possibility of Premack's second form of doubt is dependent on what R. Seyfarth and D. Cheney call, in their study of meaning and language in Vervet monkeys, the attribution of mental states "such as knowledge, beliefs, and desires to others" (126).
In their study they conclude that Vervet monkeys, in spite of the character of three specific calls and the different responses these calls cause in those who hear them (in warning of a snake as opposed to a martial eagle or a leopard), do not have a language, because they cannot determine the difference between a knowledgeable audience and an ignorant audience (127). They apparently cannot evaluate the knowledge of those around them, because their communication is not attempting to change the mental state of their audience. It is simply a signal activating a learned response. The attribution of a mental state to another seems to require the ability to choose. Making a choice (a kind of thinking) involves constructing a world, a frame, or a domain of relations organized according to a particular grammar (a set of rules and values).
Chimps make choices. Do they then make conceptual worlds? In an experiment a female chimp was trained to add and count. Part of the training procedure required the chimp to play a game in which two piles of candies were counted out. Whichever pile the chimp first pointed to was given to another chimp sitting in front of her in a cage. The object of the game was to get the most candies. Whenever the candies were counted out, the chimp invariably pointed at the larger pile, thus losing. When numbers written on cards were substituted for the candies, however, the chimp was always able to point to the smaller number, thus ensuring that she would be rewarded with the larger number of treats.
One could speculate. When the chimp saw the food, an instinctual grab/eat rule (or agent) determined her choice. She played the candy game using her instinctual rules. She needed instead to nest the candy game within her instinctual game, by replacing the instinctual mechanism for achieving her desire for the most food with the mechanism
defined by the game. By encoding the "amounts" of candy within a numerical domain, the relationship between the two piles can be determined without invoking the grab/eat rule. This information is no longer stored as an image within the grab/eat rule. The numerical domain indicates the value of each number in relation to the other, not in relation to the chimp or her desire to eat. The failure of any abstract system to define this kind of personal relation requires that its values be accessed and assessed from another domain and according to another grammar. In this case, the chimp can then apply the rules of the candy game to these "amounts" in order to determine the winning choice.
Any abstract system sets up a play between at least two different systems or domains. This recursive interaction creates a kind of consciousness, because it creates the illusion of free choice. This choice is not in any sense free; it is simply the mediation of one domain by another. The chimp evaluates how to get food [an algorithm defined by CHOOSE THE MOST DESIRABLE and THE MOST DESIRABLE IS THE MOST FOOD] through the algorithm of the candy game [CHOOSE THE LEAST FIRST]. The transcription of candies into numbers allows both of these algorithms to function together.
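A minimal sketch of this nesting, with the rule names taken from the passage above and everything else assumed for illustration: the instinctual rule points at the most desirable pile, while encoding the piles as numerals lets the candy-game rule (point first at the lesser value) run instead.

```python
# Sketch of the two nested algorithms in the candy game. The rule names
# follow the text (CHOOSE THE MOST DESIRABLE, CHOOSE THE LEAST FIRST);
# the pile sizes and framing are invented for the sketch.

def grab_eat_rule(piles):
    """Instinctual rule: point at the most desirable (largest) pile."""
    return max(piles)

def candy_game_rule(values):
    """Game rule: point first at the lesser value, so the larger is kept."""
    return min(values)

piles = (7, 3)  # candies laid out in two piles

# Playing with the candies themselves: instinct points at the larger pile,
# which is then given away, so the chimp loses.
pointed_at = grab_eat_rule(piles)
print(f"with candies: points at {pointed_at}, keeps {min(piles)}")

# Encoding the piles as numerals moves the choice into a domain of pure
# value; the candy-game rule can now run without invoking grab/eat.
pointed_at = candy_game_rule(piles)
print(f"with numerals: points at {pointed_at}, keeps {max(piles)}")
```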
The three systems of relations in the previous experiment are all defined as systems of value. These systems of value are really systems of meaning because they are sets of relations. Semantic meaning within any discourse is primarily the articulation of a particular set of relations. We assume this set of relations is not arbitrary. This means it is constructed according to a particular logic. This logic is a logic of difference.
The statement "x means y" requires at some level, although not necessarily at the conceptual level Saussure suggests, that the notions "x" and "y" be understood as "not z",
? "not w". Even to say this, however, requires that the notion "x" be crystallized in such a way that such a comparison is possible. How do we know that 'x' is not the same as ly'? We know this because they sound different, or because their shapes are different. Shape and sound serve as common denominators. Thus to speak o f difference requires that "x" function within a realm of common terms or qualities, which means a realm of pre- established possible relations and configurations. Thus what I first called a logic of difference is a difference mediated by a commonality. This common context is not given, but is constructed. (On a fundamental level, however, the initial contexts would be created primarily out o f the biological structures used to interpret our interactions with our environment).
Value is determined through differences expressed within a common context. A common context means a relation to a binary descriptive system: between two similar phonemes, between more and less, and so on. Values generated out of these metaphoric matrices are re-coded into more complex systems of relations. We can easily see how this works in the chimp experiment. How can the chimp nest the rules of the candy game within its instinctual grab/eat game? Encoding the amount of candy in numbers created a domain of pure value organized around more and less. This information was embedded in the world of the candy game, where the output of the algorithm CHOOSE THE LEAST selects the lesser value. This value is eliminated and the larger value is transferred into the domain of the chimp's grab/eat rule. In each case value is determined by how the event is encoded within a system of values. The situation facing the chimp is successively translated into different conceptual worlds, where the meaning of that information is
determined by the values (in this case simple algorithms) organizing the relations within that world.
Thus, thinking is a process of world-making. The Dutch psychologist de Groot discovered that chess masters, through a process of implicit pruning, do not see bad moves. They act within or through a grammar or a domain defined exclusively by good moves. They construct a conceptual world of good moves. This world defines what they see: it defines the moves the chess master recognizes as possible.
How does someone determine which of the possible moves or relations is correct, best, or relevant? This is too large a question to answer here. I can suggest, however, in a general way how we determine the relevant from the irrelevant within a frame or grammar. In an obvious way relevance is a function of understanding how something functions within a frame or domain. Frames must be extremely flexible to be able to handle the high degree of uncertainty and complexity of our experience. If we re-define determining relevance as making a choice, we can make some progress. Choice arises through a kind of recursive nesting of domains and grammars. The interaction of these domains and grammars selects relationships from each that are integrated within a new domain and grammar, thus constructing an alternate world. This world defines what is possible. One assumes that at some level or in some world one is left with only a single possibility. This choice (really an algorithm) is transported to higher levels of organization, where it appears as if we have made a conscious choice.
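A rough sketch of the nesting just described, with the move list and the "good move" test invented for illustration: one domain generates possibilities, a second prunes them (as with de Groot's chess masters), and what reaches the higher level looks like a single conscious choice.

```python
# Illustrative sketch of choice as nested domains: a lower domain
# generates candidate moves, a second domain (the "grammar" of good
# moves) prunes them, and the survivor is handed upward as a "choice".
# Moves and their rough values are invented for the sketch.

candidate_moves = {"a": 0.9, "b": 0.2, "c": 0.7, "d": 0.1}  # move -> rough value

def good_move_domain(moves, threshold=0.5):
    """The chess-master's world: bad moves are simply never seen."""
    return {m: v for m, v in moves.items() if v >= threshold}

def choose(moves):
    """Within the pruned world, selection collapses toward a single possibility."""
    return max(moves, key=moves.get)

visible_world = good_move_domain(candidate_moves)
print("moves the master 'sees':", sorted(visible_world))
print("apparent conscious choice:", choose(visible_world))
```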
14.4 A wink in time means two
Before we can build a belief in the future, we must build awareness. We can build a basic time detector by using difference detectors, or what Minsky calls time-blinking:
Any agent that is sensitive to changes in time can also be used to detect differences. For whenever we expose such an agent, first to a situation A and then to a situation B, any output from that agent will signify some difference between A and B. (240)
While time-blinking partially removes the problem of comparing two inputs from different agents without requiring synchronous and identical frames (a trick it performs largely by having a temporary memory), it does not explain how difference is translated into an ordered succession of moments. One could, however, easily send the recognition of a perceptual difference to a kind of tape, as if marking a date or a second, which is then advanced. Such a procedure would record successive differences, and could function like a clock for human beings. It could never generate something like temporal experience for a machine. By what process is temporal order perceived and organized such that consciousness understands itself as the present? Do all the inputs that make up consciousness run through a single agent (an agent dangerously close to a homunculus) in rapid succession in order that we can perceive difference and call that difference time? If not, how is the information that our brain receives, processes and constructs, all at different rates, constructed into a now? The formation of defined moments is a process of constructing meaningful information. A moment, therefore, defines those "differences that make a difference." Time is an information structure within which we find ourselves.
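A minimal sketch of the two devices in this paragraph, with the class and its mechanics assumed for illustration: a difference detector in the spirit of Minsky's time-blinking (it keeps a temporary memory and fires whenever successive exposures differ), plus the tape-like counter that would record successive differences as a clock without, as noted above, amounting to temporal experience.

```python
# Sketch of a time-blinking difference detector plus a tape that merely
# counts registered differences. The detector's temporary memory holds
# the last exposure; any change produces an output. Illustrative only.

class DifferenceDetector:
    def __init__(self):
        self.last = None   # temporary memory of the previous exposure
        self.tape = 0      # tape position: a clock, not an experience

    def expose(self, situation):
        """Return True if this exposure differs from the previous one."""
        changed = self.last is not None and situation != self.last
        if changed:
            self.tape += 1  # advance the tape: one more recorded difference
        self.last = situation
        return changed

detector = DifferenceDetector()
for situation in ["A", "A", "B", "B", "C"]:
    fired = detector.expose(situation)
    print(situation, "difference!" if fired else "no change", "tape =", detector.tape)
```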
Thus our machine must generate representations of its world in the succession within which it finds itself. It has to turn itself inside out.
14. 5 "Be! Verb umprincipiant through the trancitive spaces! "
My machine lives in a world of simple objects. As a time machine it does not recognize objects as objects, but objecthood, as that which can be noticed as something, which it perceives in what it considers a partial instant of time. It does not recognize the world or the environment, but only something that triggers its optical apparatus as a something to be noticed. Its optical apparatus has already simplified its phenomenal field into this categorical perception, which we can call the set of objects. This information is sent to the Optical (O) input agent, which registers this information within a formally defined phenomenal field. The machine can recognize separate examples of objecthood within a range from 1 to 5. This phenomenal field is defined by a set of switches (5) that represent an object by being on and represent no object by being off. Thus the machine's optical apparatus can immediately register up to 5 objects. An initial input, N1, once it has set these switches (once the optical information is represented as an internal state), is transferred to an Optical analyzer agent, where it again sets a similar set of switches. If the next input (N2) to the Optical input agent resets the switches, it is registered as a different moment and is thus sent on to the O difference detector. The difference detector records that a difference has occurred. If its information is identical to N1, no change of state takes place. Because our machine has only one sensory input at this point, the failure to register
difference is understood as non-time; the first moment "continues" because the first state continues.
Once N1 has reset the difference detector, indicating a different moment registered as a change of state, it enters a feedback loop. The detector's switches are then immediately reset by N2 (if there is an N2). N2 is similarly sent on a feedback loop, but at twice the rate. Immediately after N2 resets the difference detector, N1 resets it again.
This difference is converted into a value indicating the difference between these two states (degree of change). The second cycle resets the detector to N2, before the next signal from the O input is received.
Although it is not necessary to separate the process of detecting change from that of determining the degree of change, their functional difference allows us to split the present into two phenomenal elements or interacting models: an experiential model and an evaluatory process. The difference detector, as the second element, serves as a kind of short-term memory within the phenomenal experience. It has, however, changed the nature of the information perceived by the machine. Both kinds of information determine the nature of the present. It is the detector's separation from the initial model of change that allows the machine (to the degree that it is defined by its internal processes) to be aware of change. This means that the difference detector makes change, as defined by the O input agent, meaningful. The moment is actually not defined by any single state but by two states defined in relation to each other, embedded within the degree of difference determined by the difference detector (so that N1 and N2 will be stored together as P1 and P2). The difference detector, however, is inadequate for any short-term memory. We
need a memory that encodes the consciousness of the machine, and that requires that the information from both agents be stored. Both N1 and N2 are sent into a single short-term memory unit (they are connected by a simple K-line). This unit is organized within short-term memory (STM) according to two parameters: the order received (representing temporal succession) and the degree of difference. Short-term memory (STM) stores both our experience of change (suitably simplified) and a particular fact about these experiences. These are not identical bits of information. The analyzer determines the relevant facts for the time machine. Given the formally defined character of the phenomenal field, this information can be used to actually structure STM. (There are 10 possible values, 1 through 5 and -1 through -5, defined in relation to the second state (N2); there is no 0, of course.)
[Figure: diagram of the machine's perceptual pipeline, with components labeled Optical Input Agent (change detector), Optical Difference Detector, Optical Difference Machine (N2 input state), Difference Value (fact), N1 = P1, Experiential Order, and Short Term Memory.]
This is, of course, not a model of human perception. We have eliminated some of the critical problems in constructing a model of our sense of time. The world and the information from that world are immeasurably simplified. In my machine, the formal continuity between moments is built into the form in which the sensory input is represented (a defined set of switches). With only one kind of sensory information (the number of objects), we do not have the problem of synchronizing and integrating the information from different senses (and their processing agents within the brain). The initial formation of the moment is defined by the limit inherent in the speed with which the optical apparatus can convert its sensory input into a set of objects. Changes that take place at a rate faster than the machine's physical ability to recognize a set of objects are simply not registered.
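A compressed sketch of the pipeline described in this section, with the class names, control flow, and difference encoding all assumed rather than specified by the text: an optical field of five switches, a difference detector, and a short-term memory that stores each moment as a paired state (P1, P2) together with its degree of change.

```python
# Compressed, illustrative sketch of the time machine's pipeline: a
# 5-switch optical field, a difference detector, and a short-term memory
# storing paired states with their degree of change. A reconstruction of
# the description above, not a specification.

def optical_input(n_objects):
    """Represent 1..5 noticed objects as a field of five on/off switches."""
    return tuple(1 if i < n_objects else 0 for i in range(5))

class TimeMachine:
    def __init__(self):
        self.current = None   # state held by the optical analyzer (N1)
        self.stm = []         # short-term memory: (order, (P1, P2), difference)

    def perceive(self, n_objects):
        new = optical_input(n_objects)
        if self.current is None or new == self.current:
            # no registered difference: the first moment "continues"
            self.current = new
            return None
        # difference detector: a simplified signed degree of change
        # (the text's own encoding runs from -5 to 5, excluding 0)
        difference = sum(new) - sum(self.current)
        moment = (self.current, new)          # N1 and N2 stored together as P1, P2
        self.stm.append((len(self.stm), moment, difference))
        self.current = new
        return difference

machine = TimeMachine()
for n in [2, 2, 4, 1, 1, 3]:
    print("input", n, "->", machine.perceive(n))
print("short-term memory:", machine.stm)
```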
14.6 The logic of short-term prediction
If my machine can recognize patterns, such that the number of objects defined in one moment is always followed by a defined number in another moment, it can make short-term predictions. Once a pattern has been established, it can be recalled and run through a memory recall agent (another difference detector) at a rate faster than the information processed by the Optical agent. I will not spend much time constructing the mechanics of such a process. What is important in this problem is less the way in which this information is stored than the syntactical logic it necessarily generates. The logic here is simple: if x (a certain number of objects) is registered, it will be followed by y
(another number of objects). Such predictions can, of course, only happen if there is a pattern to be recognized.
Initially Short Term Memory (STM) is evaluated to determine a match with the initial optical input (N1). If a match is found (through difference detectors attached to each memory slot), then the entire temporal unit, containing two states (P1, P2; remember P1), is transferred to the Memory Recall Agent. N1 and P1 are processed as identical. The P2 state will not have an initial correlate state (it is run before N2). Because a prediction is based on understanding P2 as if it were N2, P2 must be understood as an N, but not as the real N2. Consequently, any such predictions require a symbolic economy allowing for the conversion of Px into *Nx (where * indicates this input has an ontological claim equal in value but different in kind). The creation of an *Nx state represents the creation of a temporal syntax. In order for the correlation of P2 with *Nx to function as an expectation of N2, the machine must be able to act (or change its state, as with Pavlov's dogs) on the basis of this correlation.
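A minimal sketch of this prediction step, in which the *N notation follows the text but the storage format and matching mechanics are assumed: when a fresh input N1 matches a stored P1, the stored P2 is offered as *N2, an expectation of the next input rather than a perception of it.

```python
# Illustrative sketch of short-term prediction from stored moments.
# A stored pair (P1, P2) whose P1 matches the current input N1 yields
# an expectation *N2 = P2: equal in value, different in kind.
# The storage format here is assumed for the sketch.

stm = [((2, 4), +2), ((4, 1), -3)]   # (P1, P2) pairs with their difference values

def predict(n1, memory):
    """Return the expected next state (*N2) if N1 matches a stored P1."""
    for (p1, p2), _diff in memory:
        if p1 == n1:
            return p2   # treated as *N2: an expectation, not a perception
    return None         # no pattern recognized, so no prediction

print(predict(2, stm))   # expects 4 to follow
print(predict(3, stm))   # no stored pattern: no prediction
```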
In Pavlov's bell experiment a similar temporal logic is manipulated. The equation of food = salivate is re-wired to bell = salivate and is therefore based on an associated equivalence between food and bell. This equivalence, however, does not take place within a synchronic plane. It works because the sound of the bell recalls an established pattern recorded in memory. The P2 state of food following the P1 state of the bell is understood as the same as an N state. A symbolic interchange takes place between the present and the future (patterned on a previously established relationship). A structural relationship has
been established between these two diachronic states, which allows the content of one state (the bell) to stand for the content of the other (food).
Predictions based on short-term memory are stimulus dependent. They disappear as soon as the environmental stimulus that suggested them disappears. They define only a temporary and unstable future. A machine with only this kind of memory could not engage in any long-term planning. Such plans require an ability to invoke the possibility of the future on command, as well as much more complex reasoning and creative modeling.
14.7 The continuous future: hearing of and speaking in the new world order
The Time Machine's world is now a rectangular plane, on which, at some unspecified but unreachable distance, a number of objects flash in and out of existence. This plane is divided into a series of parallel districts (wide strips). The number of objects in each district changes independently of the number of objects in other districts. No machine, of which there are now many, can see the objects (the number of objects) in a district other than the one it is in. The machines can move to other districts. To prevent any machine from becoming a part of the object field of another machine, all machines move along the same line and can move around each other if one machine is in the way of another. (The eyes move to the side of the machine's head. The machines bump into other machines and then move around them; I will not include this perception and response within my modeling here for a variety of reasons. Think of this bumping and movement as a function of the machine's autonomic nervous system.) (See Figure A.)
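A rough sketch of this new world, in which the district count, the object-count changes, and the movement rule are all invented for illustration: each district's object count varies independently, and a machine perceives only the district it occupies while moving along the shared line.

```python
# Illustrative sketch of the districted world: each district's object
# count changes independently; a machine sees only the district it is in
# and can move along the shared line to a neighbouring district.
# The district count and random changes are invented for the sketch.

import random

class World:
    def __init__(self, n_districts=4):
        self.counts = [random.randint(1, 5) for _ in range(n_districts)]

    def tick(self):
        # the number of objects in each district changes independently
        self.counts = [random.randint(1, 5) for _ in self.counts]

class Machine:
    def __init__(self, district=0):
        self.district = district

    def see(self, world):
        # a machine perceives only the object count of its own district
        return world.counts[self.district]

    def move(self, step, world):
        # movement along the shared line, clamped to the world's edges
        self.district = max(0, min(len(world.counts) - 1, self.district + step))

world = World()
m = Machine()
for _ in range(3):
    print("district", m.district, "sees", m.see(world))
    m.move(+1, world)
    world.tick()
```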
