Nanotechnology! (or SimLifeWorld) – Sandy Baldwin

. . . the problems of knowing what is the subject of the State, of war, etc., are exactly of the same type as the problems of knowing what is the subject of perception: one will not clear up the philosophy of history except by working out the problem of perception. — Maurice Merleau-Ponty

16 July 1945. J. Robert Oppenheimer later recalled that as he watched that first atomic blast at the Trinity site ‘there came floating through his mind […] a line from the Bhagavad-Gita in which Krishna is trying to persuade the Prince that he should do his duty: “I am become death, the shatterer of worlds” ‘ (qtd. in Goodchild, 1981:162). The citation addresses the disjunction between Oppenheimer’s perceptions and responsibilities, and the atomic flash that turned the New Mexico desert to glass. Only as a god can one perceive and act in the realm where cosmic events occur, but at the same time, such experience makes the mythic that much easier to achieve. Oppenheimer’s later security hearing and loss of clearance for not being sufficiently right-wing showed already the permeability of this outer limit under Cold War conditions. In the end, Krishna’s words become the subtitle of Oppenheimer’s biography. It turns out that there is no end to the ways of living with the bomb. The ARPANET precursor to the Internet was designed around the presumed imminence of nuclear attack. The Internet’s distributed topology works precisely on the assumption that this attack has already occurred. It is only a short way from here to the field of Disaster Recovery, where instant data warehousing allows the massive archives of insurance companies, for example, to be fully recovered after the blast. This means that the astronomical is no longer that big: not at all sublime and entirely calculable. We are left with the mass of fissionable nuclear material as a mythic fossil, its radioactivity the residual index of its eternity. We exist between about 10³ and 10⁻³ meters. Beyond that we fade to motes, beneath it we separate into molecules. In ‘Change of Paradigms’, his last lecture, given days before he died, German media theorist Vilem Flusser observed that

We are forced to split up the things and processes of the world into three orders of magnitude. In the medium order of magnitude, which is measurable in our measures, that is, in centimeters and seconds, Newton is still valid. In the big order of magnitude, that is, the one measurable in light-years, the Einsteinian rules are valid. In the small one, which is measurable in micromicrons and nanoseconds, the rules of quantum mechanics are valid. In each of these three worlds we have to think differently, try to imagine differently, and act differently. (1993a:293).

Elsewhere, emphasizing the inescapable ‘gray zones between orders of magnitude that set our teeth on edge,’ Flusser calls for a ‘new humanism’ that would respect the ‘typical epistemology, ethics, and aesthetics that is effective’ for each order of magnitude (1993b:296 and 298). Such thinking encounters a “fractal” condition, in Benoit Mandelbrot’s terms, where Flusser’s “respect” for specific effects must consider displacement and degree of resolution across permeable orders of magnitude.1 Given such medial conditions, Marshall McLuhan’s pace-setting definition of media as “the extensions of man” is worth recalling. The task of understanding media, set in the title of his most famous book, defines media theory as a reflection on our fractal implication into other orders of magnitude. Friedrich Kittler’s recent upgrade of McLuhan makes exactly this point: “so-called Man” is not the subject of “aesthetics” as the privileged immediacy of perception or pattern recognition, but is ‘determined by technical standards’ (1997:133). Kittler’s program of “media materialism” elevates such standards to functions ‘in the real,’ independent ‘of any subjectivity’ (1997:164). Media theory dissolves the phenomenological priority of the human order of magnitude into the relativity of magnitude shifts. In doing so, it preserves the humanism of this order of magnitude – no doubt the implied ethics in Flusser’s “respect” and infinitely close to what J.F. Lyotard came to call “the inhuman” – as the dilemma of “understanding.” What can we know and what do we need to know in such a post-hermeneutic situation? Judith Butler prefaces Bodies that Matter by asking perhaps the most pressing instance of just such a question: ‘Which bodies come to matter – and why?’ (Butler, 1993:xii). I want to take this question in a different, but not contrary direction, to ask what matter comes to be bodies – and why.
I want to ask this in view of the limits of orders of magnitude, particularly as these limits bear on certain challenges to understanding posed by technology and art. Consider Mark Leyner’s short story ‘I Was an Infinitely Hot and Dense Dot,’ where the narrator tries to pick up a woman in a bar: ‘I can’t tell if she’s a human or fifth-generation gynemorphic android and I don’t care’, his lips ‘one angstrom unit from her lips, which is one ten-billionth of a meter.’ She replies: ‘I can’t kiss you, we’re monozygotic replicants – we share 100% of our genetic material.’ He answers:

What if I said I could change all that. . . . What if I said that I had a miniature shotgun that blasts gene fragments into the cells of living organisms, altering their genetic matrices so that a monozygotic replicant would no longer be a monozygotic replicant and she could then make love to a muscleman without transgressing the incest taboo,’ I say, opening my shirt and exposing the device which I had stuck into the waistband of my black jeans. How’d you get that thing?, she gasps, ogling its thick fiber-reinforced plastic barrel and the Uzi-Biotech logo embossed on the magazine, which held two cartridges of gelated recombinant DNA. I got it for Christmas. . . . Do you have any last words before I scramble your chromosomes, I say, taking aim. Yes, she says, you first. (Leyner, 1990:6-7)

N. Katherine Hayles notes that ‘the wit of this passage comes from the juxtaposition of folk wisdom and seduction cliches with high tech language and ideas’ (Hayles, 1999:44). The cut-up of singles scenery with Gibsonian hardware auto-destructs cultural codes by scrambling the subject of the code: there is no ‘natural body’ within the text, as Hayles has it, reflected in the lack of a natural body ‘of this text’ (Hayles, 1999:45). The medialities of human dimensions qua order of magnitude are re-programmed through what Carnegie-Mellon roboticist Hans Moravec terms a fusion of biological, microelectronic, and micromechanical techniques (Moravec, 1988:72). Nevertheless, by staging the recombinant future of natural bodies, Leyner’s text marks the interface of android and human. Hayles’ fundamental category of ‘embodiment,’ as ‘the specific instantiation generated from the noise of difference,’ maintains the articulation of the posthuman with the ‘fragility of a material world that cannot be replaced’ (1999:196 and 49). Thus, it turns out that the most up-to-date media theory is in fact the latest theory of historical dating in terms of an overarching world horizon.

Now consider the recent engineering science of nanotechnology, already in the background of Leyner’s story. For B. C. Crandall, nanotechnology, announced as ‘the Last Technological Revolution,’ is ‘simply a descriptive term for a particular state of our species’ control of materiality’ (Crandall and Lewis, 1997:vii-viii). Nanotechnology is easily distinguished from what its theorists derisively term ‘bulk technology,’ including ‘heating, beating, shaping, and fastening’ of materials, forcing of chemical catalysts, burning electronic gradients into silicon, and so on (Milbraith, 1997:301). Into this classification falls the entire metaphorics of the machine as a fuel-driven structure of expenditure or as an entropic gradient. Bulk technology also corresponds roughly to Gilles Deleuze’s and Felix Guattari’s concept of the machinic phylum, re-defined by Manuel de Landa as ‘a term that would indicate how nonlinear flows of matter and energy spontaneously generate machinelike assemblages when internal or external pressures reach a critical level’ (De Landa, 1992:136). Deleuze and Guattari speak of ‘tracking’ or ‘following’ the eruptions of the machinic phylum; by contrast, nanotechnology could be said to isolate and program the phylum itself. Nanotechnology deals with molecules in the range of a billionth of a meter – in ceaseless thermal motion at speeds on the order of hundreds of meters per second – and directly manipulates these to form complex structures, or more exactly, any structure open to specification by molecular physics. In the sense that physical laws deal with aggregates of molecules, molecular engineering structures equate exactly to the known physical properties of the universe. The machinic phylum may imply something like nanotechnology at work, while nanotechnology as a theory reflects on this implication and renders transparent the work it does.

Using the bonding and displacement of individual atoms as switches, nanocomputers faster and more powerful than the largest Cray supercomputer are conservatively predicted at the size of about a third of a square centimeter. Nanotechnology pioneer K. Eric Drexler writes of assemblers that will remake our world or destroy it. These machines, along with disassemblers for ‘prying groups of atoms free, [are] able to bond atoms together in virtually any stable pattern’, allowing us to ‘build almost anything that the laws of nature allow to exist,’ that is, ‘anything we can design’ (Drexler, 1986:14). Complexity, whether as a traditional defining attribute of living creatures,2 or in its increasingly broad application to all systems, is simply a matter of aggregates of molecules. The availability of inexpensive desktop DNA synthesizers provides a prototype of “gene machines”; however, these still utilize the bulk method of a chemical soup producing more or less statistically reliable results, rather than the projected direct manipulation of atoms. We should instead expect a “meat box” built to simulate a cow’s synthesis of proteins: just open the box and pull out the steak that has grown there. ‘Cell repair machines’ could re-assemble the ‘misarranged patterns of atoms’ that cause cancer, aging, and biostasis (also known as death), and build bodies from scratch (Drexler, 1986:98-112).

We are again close to the thought of Hans Moravec, who is serious about “downloading” the mind as a pattern of neuro-processes, easily stored in silicon, or preferably a more stable medium, and expects nanotechnology to provide the technical basis for this “transmigration”. In fact, his notoriously sanguine approach to the obsolescence of the body provides the nightmarish jumping-off point for Hayles’ argument about ‘how we became posthuman’ (Hayles, 1999:1). ‘If the process is preserved, I am preserved,’ Moravec announces. ‘The rest is mere jelly’ (Moravec, 1988:117). Moravec offers perhaps the most radical answer to Lyotard’s question of over a decade ago, a question I would like to link to Flusser’s and Butler’s questions, an “inhuman” question and ‘the sole serious question to face humanity today’: ‘Can thought go on without a body?’ (Lyotard, 1991:9). Lyotard points out a true limit to the philosophical critique of limits – as the dominant post-Enlightenment mode of thought – in the sun’s eventual destruction in four or five billion years. At this point, the cultural fascination with the finitude of thought in relation to an inaccessible nature will have self-annihilated: we will be awakened from our euphoric contemplation of the phenomenological horizon when it literally explodes. The closed loop of culture and human embodiment – and thus the utopian reflexivity of historical experience – is in this view no more than a recursive epiphenomenon within the larger, technological realm of differentiated matter and energy. We must assume ‘that the ground, Husserl’s Ur-Erde‘ is a transitory ‘arrangement of matter/energy […] lasting a few billion years more or less,’ and ‘will vanish into clouds of heat and matter’ (Lyotard, 1991:10).
The only task facing humanity is that of ‘simulating conditions of life and thought to make thinking remain materially possible after the change in the condition of matter’ (Lyotard, 1991:12) – that is, after the destruction and disappearance of the earth. Lyotard challenges us to think beyond the limits set by philosophical reflection, limits which seem to differentiate nature and technology from the first.

Keep in mind that it was David Hume’s inhuman and counter-philosophical thought of himself as ‘some strange uncouth monster […] expell’d from all human commerce’ that first prompted Edmund Husserl’s reflections on the phenomenological a priori of space (Hume, 1985:311-312). The answer to Lyotard’s question is conditioned by the possibility of computationally re-creating the self-reflexivity of a mind able to orient itself in the Husserlian lifeworld, a mind thought in terms of an unstated and non-conceptual horizon of embodiment. The viability of Moravec’s ‘process identity,’ regardless of bio-hardware platform, must include the recursive ‘meta-functions’ that made mind and body such an inseparable problem for philosophy in the first place. As an answer, nanotechnology would bionically build the lifeworld, as if to say: we have tried to orient ourselves toward the world and we have failed. My concern here is not the feasibility of nanotechnology but the challenge it poses to thought. By translating the world horizon into technical/logistical parameters, nanotechnology makes transparent the limits of thought: it brings theory to its end. Nanotechnology reflects on the conditions enabling “calls” to respect bodies, matter, or the world, convicting embodiment of its purely methodological role within the theoretical program that invokes it. The sheer consistency of normative claims can not simply be a matter of respect for embodiment or materiality; such consistency can only be rhetorical. What can be repeated is already a piece of rhetoric.3 The result is indeed truly technical – the latest result of that oldest of human techniques, rhetoric.

The rest of this essay explores some possible implications of nanotechnology for contemporary thought. The second part shows the development of nanotechnology to be a re-troping of the performativity of atoms as posited by quantum theory. The third part re-situates this shift rhetorically and in terms of Husserl’s theory of the lifeworld, with the aim of showing how nanotechnology directs us to the aporias of thinking itself rather than being merely a new stage of technical development. The final section will offer a preliminary investigation of instances of conceptual art that bear upon the philosophy of nanotechnology here presented.

The Assembler Breakthrough, or Why Bodies Aren’t Matter

Everything must wear a disguise in order to be real. – John le Carre

The prefix ‘nano’ refers to ten to the minus ninth power, or a billionth, a measurement where time/space concepts decompose. This means that whatever exists at this size and occurs in this dimension is phenomenally invisible. We are awash in light, but this self-evidence is lost on molecules too small to reflect any part of the pattern of light wave propagation, molecules existing in a constant motion and re-distribution at speeds exceeding the minimal event-time of photonic reflection. What sort of ‘space’ is this, which no amount of magnification will bring to light? What events come to pass at this order of magnitude? Drexler’s proposed assemblers are little more than the elaboration of the background intuition of an atomistic structure, but the dynamic between background and elaboration is all-important. Drexler formulates nanotechnology from existing bases for engineering in theoretical physics: no specifically new technologies are required for the possibility of nanotechnology, although actual nano-machines have yet to be built. Rather than requiring new machines, the possibility of nanotechnology is derived from a loosening of the constraints given in quantum physics. The medial relation of quantum theory to the humanly-accessible phenomenal realm is construed as a performative schema, where the instability of quantum phenomena is the potentiality actualized in technological, “bulk” outcomes. Nanotechnology re-tropes this performative schema, gambling on the regular movements of invisible strata of matter to determine the resulting technology. As a way of thinking, nanotechnology can not fail to be metaphorical: the limitation on any conceptualization of the quantum inherited from quantum theory is precisely the condition for metaphoric descriptions of what can not be described realistically. Nanotechnology is a technology of metaphorics.
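The claim that whatever exists at this scale is ‘phenomenally invisible’ can be glossed with a rough calculation: visible light spans roughly 400 to 700 nanometers, while a small molecule measures on the order of one nanometer, far below what any wavelength of visible light could resolve. The following sketch is mine, not the essay’s; the figures are standard order-of-magnitude values, and the half-wavelength resolution bound is only the conventional diffraction-limit approximation:

```python
# Rough comparison of visible-light wavelengths with molecular scale,
# glossing the claim that molecules are "phenomenally invisible."
# All numbers are standard order-of-magnitude figures.

NANO = 1e-9  # the prefix 'nano': ten to the minus ninth power, in meters

visible_light = (400 * NANO, 700 * NANO)  # approximate visible spectrum
small_molecule = 1 * NANO                 # a small molecule, ~1 nm across

# Conventional optics cannot resolve features much smaller than about
# half the shortest illuminating wavelength (the diffraction limit).
resolution_limit = visible_light[0] / 2

print(small_molecule < resolution_limit)   # → True
print(resolution_limit / small_molecule)   # → 200.0 (two hundred times too small)
```

No amount of optical magnification closes this gap, which is why the essay turns below to scanning tunneling and atomic force microscopes, instruments that image by electrical interaction rather than by reflected light.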

Remember that the Epicurean atomist tradition, made famous by Lucretius in the first century BC, postulated atoms as the invisible and utterly unknowable precipitate of nature’s sheer randomness. The atomic substrate offered a minimum commitment, leaving those in the world with a range of possible action. For this reason, Lucretius writes a poem and it is in the form of a poem, whose literal combinatory play is the metaphor of the clinamen, the minimal and unpredictable swerve at the atomic level, and it is only in the form of a poem, that a mediating instance is found for the absolute distance and indifference of the Epicurean gods (Lucretius, 1951:51, 66-68). While the history of atomism is too complex to track here, phantasmagoric inaccessibility to intuition remains central. Ernst Mach repeats that atoms ‘cannot be perceived by the senses. They can never be seen or touched, and exist only in our imagination. They are things of thought’ (qtd. in Regis, 1995:23). Nevertheless, by 1908, only a few years after Mach’s pronouncement, it was already possible to use an ultramicroscope to confirm Einstein’s prediction that Brownian motion was explained by atomic level thermal diffusion. As a result of the constancy of their kinetic energy, an apparitional manifestation of atoms was for the first time perceived by the senses. Still, ‘life was hell down in the molecules,’ that is, at the quantum level, where Brownian motion was the surface trace of ‘incessant molecular pummeling at rates of 10¹⁵ per second’ by forces of ‘thermal vibration’ (kT) (Regis, 1995:16).

The critical arguments against nanotechnology are drawn largely from quantum physics, and point again and again to the chaotic, unstable nature of quantum phenomena, the so-called “fuzziness” of quantum indeterminacy, which seems to preclude any working technical application at that order of magnitude. As Arthur I. Miller describes in his history of the theorization of quantum physics in the period 1913-1927, the theory comes about because of the discernible effects of the invisible, but insofar as these effects are characterized by a ‘typical’ or ‘essential’ discontinuity, quantum phenomena are not visualizable, unanschaulich. The coherence of the theory is predicated on its objects remaining incoherent to anything other than theory. This led Werner Heisenberg to displace the basis for the intuitive or anschauliche interpretation of such phenomena from visualization to theoretical coherence or non-contradiction. Thus, quantum indeterminacy states that the location of particles cannot be absolutely determined, but it states this reliably. In the limitation provided by this theoretical re-situation, theory itself becomes a phenomenological a priori. Theoretical coherence replaces the intuitive certainty of the visual. Heisenberg noted: ‘we couldn’t doubt that this [quantum mechanics] was the correct scheme but even then we didn’t know how to talk about it [and were left in] a state of almost complete despair’; and Niels Bohr noted that this condition arose from the ‘general difficulty in the formation of human ideas, inherent in the distinction between subject and object’ (qtd. in Miller, 1995:103, 105). Here is the resemblance to atomism: quantum phenomena are the real basis of the physical universe but remain reliably unavailable to perception – their effect is at best statistically observable – leaving humanity free to act in a world conditioned by science (Miller, 1995:105).
And here, too, quantum physics returns to the phenomenological realm accessible to my body, between 10³ and 10⁻³ meters, a realm of phenomena that are both intuitable and scientifically manipulable, that is, phenomena that can be theorized in terms of Husserl’s differentiation of technization and lifeworld. Quantum theory is the aesthetic limitation of nanotechnology. This magnitude becomes all the more stable in relation to the quantum instability of the lower order of magnitude: the implications of quantum theory leave this the only realm of intractable phenomena. The aesthetic effect of the postulate of theoretical coherence is to rhetorically strengthen the self-evidence of appearances.

Nanotechnology re-tropes the performativity of atoms posited in quantum physics. The famed nano-structures of today are only accessible through simulation devices such as scanning tunneling microscopes (STMs) and atomic force microscopes (AFMs), which typically use fluctuations in the electrical fields between the ultrafine tip of the microscope and the molecular surface to image individual atoms. But the tiny fluctuations that produce these simulations make all the difference. The imagability of quantum phenomena absorbs the burden of proof for appearances. The point is not that more sophisticated simulations of vision technologies make nanotechnology possible, but rather that these developments make the chaotic inaccessibility of the quantum no longer convincing, discovering the regularities of coherence and validity required, but not sought out, by quantum physics. The theoretical a priori of quantum theory is literalized in simulation, where the accumulation of observations under such new techniques makes the artificiality of simulation lose its secondary value. In the newly persuasive simulations of the quantum, we now see what was already visible anyhow. If molecules are not phenomenally visible, then the Scanning Tunneling Microscope is the proper way to see them. If the statistical effects predicted by quantum physics made atoms no longer simply ‘things of thought,’ as Mach had it, this meant that all that appears could be described in terms of the quantum, and, as this lower realm was brought into visibility, so too did appearances accumulate descriptive power. While the behavior of enzymes and hormones is ‘more often described in chemical terms,’ writes Drexler, as an example of statistical, bulk transformations of matter, it can also be described in “mechanical terms,” that is, in terms of the molecular technology of nanotechnical reactions (Drexler, 1986:8). Appearances are an effect of mechanically explainable molecular reactions.
Since molecular reactions are the binding (condition) for phenomena, simulated vision of such reactions takes the form of simulation as the correlate of this binding. The scanning tunneling microscope participates in a massive and pervasive system of translation. All that is seen is the coming to light of molecular effects. Covalent and repellent bonds, and other forces operating at that magnitude, become the giving-up of measure by the atom itself.

This fractal point of view regards nanotechnology as the conceptual algorithm for moving between orders of magnitude. John von Neumann’s ‘cellular automata,’ postulated in his last years as part of his theory of self-reproducing computers, supplied nanotechnology with the foundation for the mathematical combinatory whereby cells and machines may be considered equal. Crandall similarly thinks of this self-replication or self-assembly as a property of matter: the ‘bumbling, stumbling dance allows molecules to explore all possible “mating” configurations with other molecules in their local environments,’ through which ‘biological systems generate the event we call life’ and ‘may lead to the construction of complex structures designed by human engineers’ (Crandall, 1998:5). This notion is very close to the Greek concept of phusis, a concept never far from, nor easily distinguished from, techne: life is an event that generates itself from itself and gives itself in appearing. The atom is both the ratio and the matter of nanotechnology. It measures itself in measuring. What is seen through the Scanning Tunneling Microscope is not a simulation but the self-production of life.
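The combinatory whereby a uniform local rule produces copies of a pattern can be glossed with a toy example. Von Neumann’s actual self-reproducing automaton ran on a 29-state lattice; the sketch below is not his construction but the much simpler one-dimensional Rule 90 (each cell becomes the XOR of its two neighbors), an additive rule with the known property that after 2^k steps a sufficiently small initial pattern reappears as two exact, displaced copies of itself:

```python
# Rule 90 cellular automaton: each cell's next state is the XOR of its
# two neighbors. Because the rule is additive over GF(2), after 2**k
# steps a small seed pattern is replicated as two shifted copies --
# a toy image of self-replication by a uniform local rule (NOT von
# Neumann's 29-state construction, only an illustration of the idea).

def step(cells):
    """One synchronous update of a finite tape (edges fixed at 0)."""
    padded = [0] + cells + [0]
    return [padded[i - 1] ^ padded[i + 1] for i in range(1, len(padded) - 1)]

def run(cells, steps):
    for _ in range(steps):
        cells = step(cells)
    return cells

seed = [1, 0, 1]          # an arbitrary small pattern
tape = [0] * 32
tape[14:17] = seed        # place the seed mid-tape

after = run(tape, 4)      # 4 = 2**2 steps
# Two exact copies of the seed now sit 4 cells left and right of the original:
print(after[10:13] == seed and after[18:21] == seed)  # → True
```

The point of the gloss is only that ‘cells and machines may be considered equal’ here in a precise sense: the copying is not performed by any agent in the grid but falls out of the rule applied everywhere at once.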

Positional control, the second major postulate of nanotechnology (after self-replication), is derived in related fashion. The high-speed oscillation and fuzziness of molecules is re-valued as a relative indeterminacy acceptable under engineering conditions. As a result, the economy involved in bulk technology, with its accumulation of materials, its expenditure and release of energies, returns as an immanent property of matter. A given molecule has certain effects that are unstable but fall within a range that can be engineered. “Chaotic instability” is re-troped as a kind of dynamic productivity. The crazy play of forces in the unseen world that delimits this world is differentiated into a plurality of possible worlds. A realm is re-inhabited by the ‘dance’ Crandall describes. Reality is constructed out of this emergent complexity, constructed “all the way down” not as an unreality that conceals a truer reality but as the fractal accumulation of magnitude changes from which subject and object are extrapolated. Heisenberg’s equation of quantum indeterminacy with Aristotle’s potentia translates into a consistent potentiality: the Aristotelianism is Platonized, as it were.4

The fundamental nanotechnic concepts of self-replication and positional control are a translation – Umbesetzung, as in the terminology of historian and philosopher of science Hans Blumenberg – of limits set by quantum physics, while at the same time describing ‘nonintegral connections’ beyond the distinction of lifeworld and technicity. This is similar to what Mark Taylor has recently termed allelomimesis (Taylor, 1997:326-327) – literally, mutational self-emergence from otherness. This emergence is inseparable from institutions and histories, from the laboratories at MIT and Palo Alto where it is performed. Here we come to the second half of Lyotard’s “inhuman” question, a rejoinder that speaks from the proximity of thought and suffering, asking after the burden of memory on the machine: ‘Otherwise, why would they ever start thinking?’ (Lyotard, 1991:20). The acuteness of Lyotard’s question illuminates the pre-conditions of the problems of technicity and lifeworld.

Nanotechnology Art Beyond Phenomenology

Unreadability of this world / All things doubled – Paul Celan

It turns out that Husserl was a high-tech kind of guy. This should come as a surprise, given the famous opposition in The Crisis of European Sciences and Transcendental Phenomenology of ‘the tendency to superficialize […] in accord with technization [Technisierung]’ to the properly ‘infinite task’ of phenomenology (Husserl, 1970:48, 72). Focusing on measuring techniques, Husserl identified a built-in generality in ‘the essence of all method’ (1970:48) that leads away from ‘the intuited data of experience’ (1970:40) to an ongoing logistics of verification and endless refinement, a proliferation and fusion/confusion of technique and instrumentation with what it measures. Technicity characterizes the state of all things. We live in artificial environments of automatic devices and processes that have become so self-evident as to be invisible; indeed, this self-evidence is their automaticity.5 Husserl’s 1936 ‘Origin of Geometry’ states that measuring techniques belong to every culture, that culture is identical with such techniques, that such techniques are ‘always already there, always abundantly developed and pregiven to the philosopher who did not yet know geometry but who should be conceivable as its inventor’ (Husserl, 1970:376). Measuring techniques are philosophical, are philosophy, within the closure of theoria. The telos of all history is this departure from the Greek proto-geometers, and the task, the future-to-come, is the recognition of the reality of the lifeworld through reduction of the artificial self-evidence of technology. Husserl calls for European history, measured from the Greeks, to continue globally as the realization of the lifeworld. In such an account, history is the (de)programming of technology.

Despite the reception and legacy of Husserlian phenomenology as an unhistorical privileging of self-perception to the point of sheer abstraction, the technological aspects of Husserl’s late work have been at least partially developed in several important critiques.6 The disturbance to the infinite task of phenomenology offered by technization proves exactly the impetus needed to pursue the life-world. Technology is already implied in the orientation towards the life-world: technization is thematized as an intermediate obstacle but occupies a systematic function in the self-maintenance of conceptual boundaries. The lifeworld can not be conceptualized apart from the mediation of technicity.

The re-troping that so suddenly brings about the ‘last technological revolution’ (at least according to Drexler) derives its movement and terminus from the phenomenological setting of quantum theory. According to Hans Blumenberg’s terminology of ‘reality-concepts’ (Wirklichkeitsbegriffe), phenomenology assumes the epic structure of actualization of context: ‘a progressive certainty which can never reach a total, final consistency, as it always looks forward to a future that might contain elements which could shatter previous consistency and so render previous “realities” unreal’ (Blumenberg, 1979:33). The method of phenomenology is continual deferral in the name of teleology. It is not the case that the rhetorical transaction between quantum instability and nanotechnological potentiality is simply the discursive after-effect of transformations at the level of material technologies or some similar substratum.7 Instead, the alternative readings of molecular potentiality are given in terms of the schema already set out by Husserl. This is despite the apparently causal role of developments in imaging simulation in shifting the burden of proof away from instability to self-replication and positional control. That the same molecules that were unstable now work nano-wise is secured by the metaphorical potential of the lifeworld. The background metaphor of the lifeworld as the “actualization of a context” is the schema off which we read the re-troping of molecular performativity.

The metaphoricity of the lifeworld lies in the doubleness of the concept of reality it implies. The reality of the lifeworld can not simply be an additional reality, not one more measurement, or one more virtuality dispelled. To guarantee the eventual success of the phenomenological task requires making it impossible to take as real those things measured and questioned, however persuasive, so that what finally comes along as real is not simply the latest version but at last really real. This means that there must be some way of exhausting measurement through measurement. At the same time, if there is always an automatic processing of the lifeworld at work through technique, there must be some already available form of regularity in the lifeworld, something in this unmeasurable originality capable of being measured; what Husserl terms ‘reference-back as a founding of validity.’8 This maintains expectations of completing the task by establishing a relay across the sediment of history: the lifeworld is already wired. This relay crosses an infinite distance to the ‘realm of original self-evidences’ where only a ‘pure geometry’ would be in effect (Husserl, 1970:127). The nature of this lifeworldly relay is ambiguous, in that the regularities of verification or measurement involved in technization should not be possible: when it comes to the lifeworld, the pre-given [vorgegeben] needs no verification (Husserl, 1970:124). There is a reference to and validation of the lifeworld; at the same time, the lifeworld would in no way be susceptible to theory of any sort, even if only to confirm this non-susceptibility. Like the Kirghiz Light in Gravity’s Rainbow, ‘There is no other way to know It’ (Pynchon, 1973:358). Here there is a separation between the absolute requirements of the lifeworld, which remains an infinitely rich but unavailable horizon structure, and the technical/representational requirements of what shows up as reality.
The lifeworld itself, in contrast to phenomenological theory, falls under Blumenberg’s reality-concept of ‘momentary’ evidence whose ‘instantaneous’ quality is precisely the guarantee of its reality (Blumenberg, 1979:30). At this point, the requirements of the lifeworld as the aim of phenomenology are under incredible conceptual strain, but it is exactly this that maintains phenomenology as a ‘task’. The phenomenal is the crux of the lifeworld and technicity, and the functionalization of this means that phenomenology is technological from the start.

Technicity is the ongoing refinement of and in the lifeworld. Systems theory builds on the fact that the lifeworld is ‘suspended’ and is ‘never the world in which we live’ (Blumenberg, 1972:430). Here is where the challenge of nanotechnology is most intense. Phenomenology at least secured its project by maintaining its unfinished nature, but if nanotechnology is taken as a way of thinking, it becomes possible to consider what would be done if the phenomenological project were completed, leaving what is roughly a topological field of concrete relations from which we ‘extrapolate’ our intersubjectivity – a spacing – for which the terminology of embodiment is one metaphor. Nanotechnology provides the engineering for this field by forcing phenomenology to its limits. Husserl remained focused on a problematic movement from proto-geometry to logistical formulation, which in nanotechnical terms becomes a poorly formulated question derived from an earlier state of engineering. Instead of asking ‘What and where is the life-world?’ as Husserl does – still a matter of “understanding media” – the question becomes “What is it that we want in the lifeworld, supposing we did have it ready at hand?” Or, as Drexler puts it, “What is it that we want to build?”

Nanotechnology implies the constructibility of life-worlds from a massive Lego set, as it were. All that was previously immediate and self-evident is now technically available and subject to manipulation. Such a position does not, however, indicate the direction for thought: atoms are not concepts. What was once essence and is now externalized no longer presents itself as essence. Of course, the advantage of the culturally sedimented idealization of the lifeworld was its always latent, always available state. If the Ur-Erde is finally set before us, how will we recognize it? Reformulated again: the terminal overlap of nanotechnology with the phenomenological immediacy of the Ur-Erde leads us to ask what in the world it was that we wanted in the first place – a question never asked within world eco-horizons, horizons which precisely kept it from being asked. What is the point of respect for “a material world that cannot be replaced” once it is in our grasp? When complete simulation merges with realism, the proliferation and comparability of ‘realms of originary self-evidence’ determine the questions that arise.

For this we have ‘ubiquitous computing,’ announced in Scientific American by Mark Weiser of Xerox PARC: ‘The most profound technologies are those that disappear. They weave themselves into the fabric of everyday life until they are indistinguishable from it’ (Weiser, 1991:94). The animation of this ‘fabric’ leads to an actualization of context, where everydayness is delimited, made visible in its invisibility, no different but renewed in precisely this. By contrast to the theatrical showings of virtual reality, which ‘focuses an enormous apparatus on simulating the world,’ Weiser proposes ‘invisibly enhancing the world that already exists’ through an ‘embodied virtuality’ making use of hidden telematics communicating with prosthetic body-attachments via microwave (Weiser, 1991:98). After offering the example of writing (‘perhaps the first information technology’) as a disappearing technology, Weiser adds that such a ‘disappearance is a fundamental consequence not of technology but of human psychology,’ and cites the example of the phenomenological ‘horizon,’ which says ‘in essence,’ according to Weiser, ‘that only when things disappear in this way are we freed to use them without thinking and so to focus beyond them on new goals’ (Weiser, 1991:97). The automatically-cited horizon-reference cannot conceal its translation: no longer a shared precondition of appearances but now a topology for disappearances as the basis of pre-programmed environments.

As Niklas Luhmann makes clear, the systems description of teleological functions embedded in the induced and programmed interests of the simulated life-world provides the basis for free activity; however, no orientation is supplied towards these goals (Luhmann, 1995:69-71). As the cost of overcoming the phenomenological problematic, nanotechnology does not include a teleology: it discovers the binding of reality, but in doing so fulfills any need for this discovery. For this reason, both critics and proponents search hard for its purpose. While nanotechnology promises to create universal peace and health – thus the subtitle of Crandall’s 1998 book, ‘Molecular Speculations on Global Abundance’ – the sheer open-ended quality of the technique, its lack of any economy, always leads to the Gnostic rhetoric of “something more.” What comes after global abundance? The immortality promised by Moravec is only the first step, and for every nanotechnologist so far the answer has been to speculate on travel to the stars. “Here to go,” as William S. Burroughs used to say. The results are predictable: having translated and erased the ambiguities of everyday existence, the goal is supplied by a new axis mundi, the stars as our destination, capitalizing on Drexler’s early training in astronautic physics. Because it no longer counts on the truth of the proximal space of embodiment, nanotechnology can only be thought on the global level, if not on the intergalactic level. Conversely, the dangers brought by the approaching fusion/confusion of magnitudes are given most spectacularly in the “gray goo” problem: out-of-control breeding and takeover by scientifically altered molecular-level mutants, eventually turning the entire universe into an undifferentiated virus soup, returning to Flusser’s concern with “gray areas.”

In fact, Weiser’s embodied virtuality is already here on the large order of magnitude, with the Global Positioning System (GPS) using satellite tracking to pinpoint location at the distance of a handshake. “Global development” now refers to the economics of imaging, where everything is visible to a range of less than three feet. Carl Schmitt’s definition of the political as the “encounter of Friend and Enemy” is given technological proof. Here is a truly geopolitical guidance system for the lifeworld. The so-called public sphere is readily illuminated and global politics brought to its root in intersubjective or lifeworldly relations. But there is a catch: the National Security Agency (NSA), Lord of the Airwaves, distorts positioning to a range of about thirty feet unless you have a decoder. The simple topology of Friend and Enemy is replaced by a more complex condition, in short, by a war zone, and the recent reduction in signal distortion indicates simply that peacetime now equates to a certain image resolution of the world. This built-in political spacing made all the difference in 1991, when distortion was turned off for attacking forces, bringing into view the Iraqi countryside, guiding bombs down air conditioning vents, and making very clear who the Enemy was.

Concept Art, or What Still Remains After the End

. . . to see is always to see more than one sees… Maurice Merleau-Ponty

The disparate artworks grouped as Concept Art begin to explore this fractal topology, turning from mimetic issues of visibility to ‘concepts,’ directly engaging the issues raised by a nanotechnological understanding of what used to be the lifeworld. No historiography of shared names and dates links nanotechnology and Concept Art; rather, occupying a kind of shared archeological stratum, these artworks exemplify what nanotechnology promises but cannot presently deliver, displaying as art what was only desired by the theorists of technology.

The collaborative work of Arakawa and Madeline Gins, with its concern with survival beyond destinal appearances, is pertinent. The ‘mind can have no other color’ than that of these paintings, wrote Italo Calvino of The Mechanism of Meaning, an ongoing work since 1963 (Calvino, 1985:116). Not that the paintings ‘are “like” a mind, in the sense that first a mind exists and then a picture is taken that represents it as a landscape represents a place,’ but rather, after studying one of the paintings, ‘it is I who begin to feel that my mind is “like” the picture. Not only that, but I no longer remember what other image I might have ascribed to my mind before’ (Calvino, 1985:115). Arakawa and Gins present ambiguous maps and codes attempting to recreate the “mechanism” of “meaning”: not representations of certain meanings, nor depictions of meaning-making, but diagrams of meaning itself: to view the work is to complete the diagram and “make” meaning. The artwork is thought, artificial intelligence preserving not a picture of the mind but the mind itself.

The more recent Architecture: Sites of Reversible Destiny (Architectural Experiments After Auschwitz-Hiroshima) states that ‘Every instance of perception lands as a site’ and proposes to imbalance the body’s landing to recreate the process of constituting this destiny in the first place (1994:19). ‘Juggling, jumbling, and reshuffling the body with its fund of landing sites introduces a person to the process that constitutes being a person. To reverse destiny one must first re-enter destiny, repositioning oneself within the destiny of being slated to live without ever knowing exactly how and why.’ The resulting “experiments” are designed beyond any horizon for a communal dwelling in ‘a constructed (artificial?) eternity’ (1994:23).

Arakawa and Gins have received considerable attention: as the subject of a recent Guggenheim retrospective, as well as significant critical writings by Lyotard, Andrew Benjamin, and Mark Taylor. The comparatively neglected work of former Fluxus-associate Henry Flynt is equally important but perhaps more difficult to discuss. Flynt defines his work not as ‘Conceptual Art’ but as ‘Concept Art,’ that is, ‘art of which the material is concepts’ (Flynt, 1988:185).10 The 1987 work ‘The Apprehension of Plurality’ starts from a consideration of David Hilbert’s re-conception of mathematics as ‘a game played with meaningless marks.’ Flynt reviews Hilbert’s ‘stroke numerals’ as an attempt at a ‘concrete semantics’ for number: a ‘collection’ of marks with ‘no internal structure’ that ‘supposedly demands’ a count ‘thereby showing its meaning’ (Flynt, 1988:192). Flynt points out that the stroke numerals attempt to eliminate learned methodological competency, ‘to weaken the requirements of arithmetic to the point that somebody with an adult mastery of quality and abstraction can do feasible arithmetic “blindly”’ (1988:194). Flynt’s project is then to take ‘perceptually multi-stable’ images (e.g. Necker cubes) as stroke numerals, requiring minute reconstruction of intuited perceptions before they form any stable gestalt. What is “seen” is an apparition derived from the image. This allows Flynt to isolate and describe the differing hermeneutics applied to the numerals, enabling the re-creation of our ‘intuitions’ of ‘plurality’ and ‘quantity beyond the limits of the present culture’ (Flynt, 1988:193,195).

The results, the arithmetic constructions produced, are stable as apparitions but as apparitions only. The built-in invisibility of these figures haunts the formalism required to ‘count’ their aggregate state – a haunting that extends to the self, community, or culture perceiving the figures. Flynt’s written instructions and commentary name the hermeneutics brought into this formalism. Here, with these basic conceptual building blocks, the comparison to nanotechnology is most evident, but it is here that the comparison fails. To suggest the re-occupation of the lifeworld by basic units of measurement, whether atoms or concepts, requires attention to the way these atomic materials are stockpiled and deployed – attention to displaced citations of fractal strata in the topology of nanotechnology. The “image resolution” of perception, as the crucial conceptual technique of nanotechnology, is at best a metaphor for the non-conceptuality of an apparition. Flynt’s art takes concepts as material to put on view the world-apparition that we would be responsible for; the emphasis has shifted, however, from the primacy of perception to that of (political) responsibility for the cultural formalisms following on intuition.

Endnotes

I would like to thank Don Byrd for the conversation that first started me thinking about this essay; Birgit Hansen and Anthony Reynolds for the opportunity to present an earlier version to the Graduate College of the Europa-University Viadrina, Frankfurt in February 1998; finally, my thanks to Anselm Haverkamp.

1. See Mandelbrot, 1977 and Virilio 1991.

2. Following Schroedinger.

3. See the basic schema of Blumenberg’s ‘Prospects for a Theory of Nonconceptuality’ (Blumenberg, 1997:81-102).

4. See Heisenberg, 1958.

5. The best references for the historical interface to the philosophy of mathematics are Klein, 1968 and Rotman, 1993.

6. To be exact: Blumenberg (1972) and Blumenberg (1996). Also, Derrida (1989) and Derrida (1973).

7. See the related point in Blumenberg (1987:126).

8. Geltungsfundierung. See Husserl (1970:140).

9. Along with Flynt’s Web site, this collection is my main source for Flynt’s work. For the Fluxus history, see Jenkins (1993). Very much in the spirit of Fluxus, Flynt insists that he was never a part of the group, but concedes that his early works were exhibited and published in conjunction with Fluxus events, and that he was a co-participant in the well-known Fluxus ‘Art Strike’ with George Maciunas.

References

Arakawa and Gins, M. (1994) Architecture: Sites of Reversible Destiny (Architectural Experiments After Auschwitz-Hiroshima). London: Academy Editions.

Blumenberg, H. (1979) ‘The Concept of Reality and the Possibility of the Novel’, pp.1-33 in R. Amacher and V. Lange (eds) New Perspectives in German Literary Criticism. Princeton: Princeton UP.

– (1972) ‘The Life-World and the Concept of Reality’, pp.425-444 in Life-World and Consciousness: Essays for Aron Gurwitsch. Evanston, IL: Northwestern UP.

– (1987) The Genesis of the Copernican World. Cambridge, MA: The MIT Press.

– (1996) ‘Lebenswelt und Technisierung unter Aspekten der Phaenomenologie’, pp.7-54 in Wirklichkeiten in denen wir leben. Stuttgart: Reclam.

– (1997) ‘Prospects for a Theory of Nonconceptuality’, pp.81-102 in Shipwreck with Spectator: Paradigm of a Metaphor for Existence. Cambridge, MA: The MIT Press.

Butler, J. (1993) Bodies that Matter. New York: Routledge.

Calvino, I. (1985) ‘The Arrow in the Mind’, ArtForum 24/1:115-116.

Crandall, B. C. (ed.) (1998) Nanotechnology: Molecular Speculations on Global Abundance. Cambridge, MA: The MIT Press.

– (1997) ‘Preface’, pp.vii-viii in B. C. Crandall and J. Lewis (eds) Nanotechnology: Research Perspectives. Cambridge, MA: The MIT Press.

De Landa, M. (1992) ‘Non-Organic Life’ in J. Crary and S. Kwinter (eds) Incorporations. New York: Zone, pp.120-150.

Derrida, J. (1989) Edmund Husserl’s Origin of Geometry: An Introduction. Lincoln, NE: University of Nebraska Press.

– (1973) Speech and Phenomena. Evanston, IL: Northwestern UP.

Drexler, K. E. (1986) Engines of Creation: Challenges and Choices of the Last Technological Revolution. New York: Doubleday.

Flusser, V. (1993) ‘Change of Paradigms’, Yale Journal of Criticism 6/2:289-294.

– (1993a) ‘Orders of Magnitude and Humanism’, Yale Journal of Criticism 6/2:294-298.

Flynt, H. (1988) ‘Concept Art’, p.185 and ‘The Apprehension of Plurality’, pp.172-200 in Charles Stein (ed.) Being = Space X Action: Searches for Freedom of Mind through Mathematics, Art, and Mysticism. Berkeley, CA: North Atlantic Books.

Goodchild, P. J. (1981) Robert Oppenheimer: Shatterer of Worlds. Boston, MA: Houghton Mifflin Co.

Hayles, N. K. (1999) How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press.

Heisenberg, W. (1958) Physics and Philosophy. New York: Harper.

Hume, D. (1985) A Treatise of Human Nature. London: Penguin.

Husserl, E. (1970) The Crisis of European Sciences and Transcendental Phenomenology. Evanston, IL: Northwestern UP.

Kittler, F. (1997) Literature, Media, Information Systems, J. Johnston (ed.). Amsterdam: G & B Arts International.

Klein, J. (1968) Greek Mathematical Thought and the Origin of Algebra. Cambridge, MA: The MIT Press.

Leyner, M. (1990) My Cousin, My Gastroenterologist. New York: Vintage.

Lucretius (1951) On the Nature of the Universe. London: Penguin.

Luhmann, N. (1995) Social Systems. Stanford, CA: Stanford UP.

– (1985) A Sociological Theory of Law. London: Routledge.

Lyotard, J-F. (1991) The Inhuman: Reflections on Time. Stanford, CA: Stanford UP.

Mandelbrot, B. (1977) Fractals: Form, Chance and Dimension. San Francisco, CA: W. H. Freeman and Company.

Milbrath, L. W. (1997) ‘Fears and Hopes of an Environmentalist for Nanotechnology’ in B. C. Crandall and J. Lewis (eds) Nanotechnology: Research Perspectives. Cambridge, MA: The MIT Press.

Miller, A. (1996) ‘Visualization Lost and Regained: The Genesis of Quantum Theory in the Period 1913-1927’, pp.86-108 in T. Druckery (ed.) Electronic Culture: Technology and Visual Representation. New York: Aperture Press.

Moravec, H. (1988) Mind Children: The Future of Robot and Human Intelligence. Cambridge, MA: Harvard UP.

Pynchon, T. (1973) Gravity’s Rainbow. New York: Viking.

Regis, E. (1995) Nano. Boston: Little, Brown & Co.

Rotman, B. (1993) Ad Infinitum: The Ghost in Turing’s Machine. Stanford, CA: Stanford UP.

Taylor, M. (1997) Hiding. Chicago, IL: University of Chicago Press.

Theweleit, K. (1992) ‘Circles, Lines and Bits’, pp.256-261 in J. Crary and S. Kwinter (eds) Incorporations. New York: Zone Books.

Virilio, P. (1991) Lost Dimension. New York: Semiotext[e].

Weiser, M. (1991) ‘The Computer for the 21st Century’, Scientific American, September: 94-98.
