
Simulation

Context Architecture: Fundamental Concepts Between Art, Science, and Technology

Digitalization has altered architectural discourse. Today, discussions in architectural theory and design are shaped by many new ideas, including some that previously had no meaning in that context, or else very different ones. Increasingly, the conceptualizations and strategies of architectural discourse are molded by influences emerging along the interface joining scientific and cultural images of modern information technology. Against this background, the question arises: on the basis of which practical and, in particular, which theoretical concepts can architecture come to terms with these new technologies, thereby entering into a dialogue with them that is at once productive and critical? Presented for debate in Context Architecture is a selection of such ideas, all of them central to current discourses. Context Architecture is a collaboration of the Zurich University of the Arts (ZHdK) and Ludger Hovestadt, chair for Computer Aided Architectural Design at the ETH Zurich.

Available in the series Context Architecture:

Simulation. Presentation Technique and Cognitive Method, ISBN 978-3-7643-8686-3

Complexity. Design Strategy and World View, ISBN 978-3-7643-8688-7

Context Architecture
A collaboration of the Zurich University of the Arts (ZHdK) and the ETH Zurich

Simulation
Presentation Technique and Cognitive Method

CONTEXT ARCHITECTURE

Edited by Andrea Gleiniger and Georg Vrachliotis

Birkhäuser Basel · Boston · Berlin

Translation from German into English, contribution by Thomas Hänsli: Lisa Rosenblatt, Vienna
Translation from German into English, all other texts: Ian Pepper, Berlin
Copy Editing: Monica Buckland, Basel
Cover and layout design: Bringolf Irion Vögeli GmbH, Zurich
Reproductions and typesetting: weissRaum visuelle Gestaltung, Basel
This book is also available in German: Simulation. Präsentationstechnik und Erkenntnisinstrument, ISBN 978-3-7643-8685-6.

Library of Congress Control Number: 2008925218
Bibliographic information published by the German National Library: The German National Library lists this publication in the Deutsche Nationalbibliografie; detailed bibliographic data are available on the Internet at http://dnb.d-nb.de.
This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, re-use of illustrations, recitation, broadcasting, reproduction on microfilms or in other ways, and storage in data bases. For any kind of use, permission of the copyright owner must be obtained.

© 2008 Birkhäuser Verlag AG, Basel ∙ Boston ∙ Berlin
P.O. Box 133, CH-4010 Basel, Switzerland
Part of Springer Science+Business Media
Printed on acid-free paper produced from chlorine-free pulp. TCF ∞
Printed in Germany

ISBN: 978-3-7643-8686-3

987654321

www.birkhauser.ch

7 Andrea Gleiniger & Georg Vrachliotis: EDITORIAL
13 Thomas Hänsli: PARRHASIUS’S CURTAIN: VISUAL SIMULATION’S MIMESIS AND MEDIALITY
29 Andrea Gleiniger: OF MIRRORS, CLOUDS, AND PLATONIC CAVES: 20TH-CENTURY SPATIAL CONCEPTS IN EXPERIMENTAL MEDIA
51 Nils Röller: SCIENTIA MEDIA – SIMULATION BETWEEN CULTURES
63 Georg Vrachliotis: FLUSSER’S LEAP: SIMULATION AND TECHNICAL THOUGHT IN ARCHITECTURE
83 Gabriele Gramelsberger: THE EPISTEMIC TEXTURE OF SIMULATED WORLDS
93 Erich Hörl: KNOWLEDGE IN THE AGE OF SIMULATION: METATECHNICAL REFLECTIONS

Appendix:
107 Selected Literature
115 Illustration Credits
117 Biographies

EDITORIAL

“Our interest in the invisible world stems from a desire to find a form for it in the visible one, which means to prise open, to decompose, to atomize the deceptively familiar, the visible exterior appearance, before we can deal with it again. [...] We are interested in the hidden geometry of nature, in an intellectual principle, and not primarily in the external appearance of nature.”1 This is how Jacques Herzog and Pierre de Meuron describe their architectural enterprise. In the age of computer simulation, this characterization appears in a very special light. Increasingly, conceptualizations and strategies for architectural design are conditioned by influences that lie somewhere along the interface between scientific and cultural images of contemporary information technology. This raises the question: by means of which practices, and in particular which theoretical tools, can architecture productively interact with these new technologies while simultaneously engaging in a critical dialogue with them? Just how close the above characterization by Jacques Herzog and Pierre de Meuron comes to what Vilém Flusser has dubbed “calculatory thought”2 becomes clear only against the background of the computer as a “universal machine.” For the interplay between analysis and synthesis made possible by the computer points toward a level where, finally, overarching questions can be weighed in the context of reflections on architectural production. In light of the increasing dissemination of models of thought drawn from information technology, it has become necessary to re-engage in a critical discussion of the relationship between architecture and art, science, and technology. But it is not a question simply of heeding repeated demands to re-situate architecture. Rather, it is an issue of the basic concepts that make it possible to identify, reflect upon, and contextualize

1 Jacques Herzog and Pierre de Meuron: “Die verborgene Geometrie der Natur,” in: Sturm der Ruhe. What is Architecture?, ed. Architekturzentrum Wien, Salzburg 2001, p. 265 (lecture delivered by Jacques Herzog, Basel 1984, in: Herzog & de Meuron, 1978–1988 – Das Gesamtwerk, ed. Gerhard Mack, vol. 1, Basel, Boston, Berlin 1984, pp. 207–211): “Unser Interesse an der unsichtbaren Welt liegt darin, für sie in der sichtbaren Welt eine Form zu finden, das heißt das trügerisch vertraute, sichtbare, äußere Erscheinungsbild aufzubrechen, zu zerlegen, zu atomisieren, bevor wir erneut damit umgehen können. [...] Unser Interesse ist die verborgene Geometrie der Natur, ein geistiges Prinzip und nicht primär eine äußere Erscheinungsform der Natur.”
2 Vilém Flusser: “Digitaler Schein,” in: Digitaler Schein: Ästhetik der elektronischen Medien, ed. Florian Rötzer, Frankfurt 1991, pp. 152ff.


the transformative processes to which architectural design thinking has been subjected under the influence of the paradigm change triggered by information technologies. In light of the method of investigation alluded to by Herzog and de Meuron, we might also ask: “Could there be anything more natural than to start with the visible form and then gradually to penetrate into the realm of the invisible? The architect must be able to manipulate the invisible so as to render reality visible, but also capable of seeing reality in order to change it.”3 The instrument for achieving this is computer simulation, with which it has become possible not only to visualize and represent the invisible, the as-yet inexistent or inconceivable, but also to gain access to an epistemological investigation. And is this not precisely the inverted path toward knowledge harbored by computer simulation? To begin in the realm of the non-visible, advancing painstakingly towards the structures of the visible? For an increasingly computational scientific landscape, this development towards synthetic procedures opens up a broad spectrum of new paths towards knowledge production. This is also the case for architecture. We are far from fathoming in detail just what this methodological turning point offers architectural design and planning processes. In this context, the concept of simulation must be accorded a very special significance. All the more so since it can be positioned in terms of architectural history as well as in the context of the growing importance of digital design and production methods. In architecture as well, the concept of simulation plays an essential role in discussions of mediatization: as illusion and imitation, as dissimulation and mimesis. In dialogue with information technology and computer science, moreover, it has acquired a new quality for architectural discourse as well: today, we understand simulation primarily as computer simulation. While simulation once pertained to modes of presentation, it now connects architecture to the natural sciences and to a methodological and strategic instrument, a tool of knowledge. Ontologically, computer simulation must be distinguished from the spectrum of traditional concepts of simulation found in architecture. It is no longer merely a technology of visual simulation, but rather a technological instrument for acquiring knowledge in the spirit of the modern natural sciences.

3 Franz Oswald: Foreword to Pierre von Meiss: Elements of Architecture: From Form to Place, Lausanne 1991, p. xiii; “Préface,” in: Pierre von Meiss: De la forme au lieu: Une Introduction à l’étude de l’architecture, Lausanne 1986, p. 7: “Y a-t-il voie plus évidente que de partir du visible, la forme, pour pénétrer peu à peu dans l’invisible, le caché? L’architecte doit être capable de manier des choses invisibles pour rendre visible la réalité, capable aussi de voir la réalité afin de la transformer.”


This means a leap from a timeless to a time-contingent technology. It is this temporal aspect that distinguishes computer simulation from inherited concepts of simulation in architecture.

Computer simulation in architecture has a relatively brief history. As a rule, computer simulation is classified with computing and applied mathematics, yet the technical development of its visualization methods is inseparable from certain architectural concerns. Architects were involved early on in the development of computer-based methods of representation. More than 20 years ago, Horst Rittel, a former lecturer in design science at the Hochschule für Gestaltung in Ulm, referred to computer simulation as one of the most important areas in which architects interacted with the computer. With the rapid development of hardware and software, finally, numerical simulation advanced to become a new working practice in science and research, in architecture and design. Having now established itself as a ubiquitous cultural technology, it increasingly alters our interactions with the world.

“Simulation,” then, is a fundamental concept in whose definitions repeated demands for transdisciplinarity make themselves felt. This is true not only for the arts, but perhaps even more so for a dialogue between architecture, technology, and the sciences. Against this background, it becomes clear that a discussion of such basic concepts goes beyond the traditional boundaries of architectural discourse, and is characterized increasingly by an intensive dialogue with the theory and philosophy of technology and with the history of science. In order to do justice to these aims, we have chosen authors from diverse disciplines, all of whom we consider either to have had a fundamental impact on the shaping and interpretation of these conceptualizations in architecture, or to be likely to do so in the future. It has, of course, not been possible to take account of every discipline that has been preoccupied with simulative strategies, whether conceptually, technologically, or in terms of epistemology. Instead of a contribution from the perennially popular field of neuroscience, for example, we chose to focus on one of the earliest and simultaneously most advanced fields of application of numerical simulation, meteorology. The modeling and simulation of climatological scenarios, of new molecules and materials, as well as of new building types and complex geometries, all testify to the profound change triggered by the use of computer-based simulations. A better understanding of this change from an architectural perspective also requires a deepened critical confrontation with historical as well as contemporary definitions of the concept of simulation.
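What “time-contingent” means in practice can be made concrete with a minimal sketch. The following Python fragment is our own illustration, not an example from this volume: it shows the basic pattern of numerical simulation, in which a model state is advanced step by step through discrete time rather than evaluated in a single, timeless calculation. The cooling-body model and all names in it are chosen purely for demonstration.

```python
# A minimal sketch of the "time-contingent" character of numerical
# simulation; the cooling-body example is our own illustration, not
# drawn from the book. The state of the model is advanced through
# discrete time steps (explicit Euler integration of Newton's law of
# cooling, dT/dt = -k * (T - T_ambient)), standing in for the far
# larger stepwise models used in, e.g., climate simulation.

def simulate_cooling(t_initial=90.0, t_ambient=20.0, k=0.1, dt=0.5, steps=60):
    """Return the temperature trajectory over `steps` time steps of size `dt`."""
    temperature = t_initial
    trajectory = [temperature]
    for _ in range(steps):
        # Each new state is computed from the previous one: time enters
        # the procedure itself, not merely the finished result.
        temperature += dt * (-k * (temperature - t_ambient))
        trajectory.append(temperature)
    return trajectory

if __name__ == "__main__":
    history = simulate_cooling()
    print(f"temperature after {len(history) - 1} steps: {history[-1]:.2f}")
```

In this sense the simulation does not depict a finished state but enacts a temporal procedure – the “mathematics of temporal procedures” invoked below.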


Taking as his point of departure the concept of mimesis as formulated during the Renaissance and Baroque periods, with recourse to Classical Antiquity, Thomas Hänsli’s article traces art-theoretical interpretations of the idea of simulation and its artistic applications, beginning in the 15th century. Against this background, he inquires into the validity of the concept of simulation for current artistic presentations of architecture, for example, in the area of architectural photography.

The article by Andrea Gleiniger demonstrates that the adaptation of the idea of simulation in the history of 20th-century architecture spread beyond the fields of representation and visualization. Using examples drawn from 20th-century spatial conceptions in the field of experimental media, she makes it clear that the concept of simulation was reflected and transformed in modern architecture in ways closely related to the history of ideas and of the media. In an outline of architectural history leading from El Lissitzky’s “Wolkenbügel” all the way to contemporary digital work, the history of an increasingly mediatized architecture is also read as a history of simulation. Decisive here is the way in which the concept of simulation, hitherto primarily tied to modes of representation and presentation, now links architecture increasingly to knowledge acquisition and to the natural sciences at the level of strategies for modeling dynamic processes.

It is with regard to reflections on architecture from the perspective of media history that media-theoretical interpretations of the concept of simulation acquire their special significance. This is all the more true because in media studies as well, as Nils Röller elaborates, the terms “simulation as dissimulation” and “simulation as modeling” are used to distinguish two traditional conceptualizations.

Artificial intelligence developed from a branch of early computer science to become an independent and powerful scientific program, one that broke down eventually when it failed to live up to its own expectations. The desire, articulated in this context, to render complex models used in scientific investigations accessible through computer simulation is nonetheless decisive for discussions of the concept of simulation. Emerging in place of seemingly deceptive visualizations are questions of prognostication. Against this background, Georg Vrachliotis investigates the resulting impact on technical thinking in architecture. The associated question – which architectural design instruments can be generated from the historico-discursive space of computer simulation? – leads to a search for adequate theoretical access to an architectural production that is increasingly shaped by the structural sciences. Critical architectural reflections on the


methods of numerical computer simulation (which are substantially structural, and are characterized by a “mathematics of temporal procedures”4) demand more than an improved understanding of the underlying technical principles. Even more important is an improved feel for their sociocultural implications and limitations. The heightened penetration of numerical computer simulation into the diverse branches of scientific research endows it with the status of a cultural technology.5

From the perspective of the philosophy of technology, Gabriele Gramelsberger not only examines the general constitution of these mathematical worlds, but also investigates their epistemo-theoretical space of possibility on the basis of specific climatological simulations. Anyone wishing to become more closely involved with simulation should avoid remaining on the “surface of the monitor,” for only a look into the depths of the data makes it possible for these semiotic worlds to be investigated adequately.

Even earlier, it had become evident that such an investigation would necessarily refer to an additional, overarching level of discussion: to the changing relationship of interdependency between knowledge production and the mathematical logic of computer simulation. As Erich Hörl points out in his article, one prerequisite for any understanding of this transformation of knowledge is an enquiry into the epistemological and ontological situation created by the “computational turn” in the sciences. It is not only the meaning of the sciences, he argues, that is displaced by computer-supported practices of modeling and simulation, from the descriptive to the projective. More importantly, the status of the technological itself shifts: “technical objects with minority status,” once mere tools of instrumental reason, come to form a “majoritarian structure,” a “new milieu of knowledge and becoming.”

Detectable within the contours of these lines of development on various levels of knowledge production, in diverse dimensions of the aesthetic and of the technical, and in the most various disciplines, is a change in the concept of simulation – one that, considering the steadily growing potential for applications of computer simulation, has perhaps yet to be exhausted. The essays collected in this volume attempt to expose a multifaceted panorama, and to set the basic idea of simulation and its underlying cultural-historical dynamics in an architectural context.

4 Carl Friedrich von Weizsäcker: Die Einheit der Natur. Studien, Munich 1971, p. 23; English edition available: The Unity of Nature, New York 1980.
5 See Walther Zimmerli: Technologie als Kultur. Braunschweiger Texte, Hildesheim 1997.


In conjunction with an accompanying volume on the topic of “Complexity,” this essay collection on “Simulation” inaugurates the publication series Context Architecture. The series undertakes the task of opening up for discussion fundamental architectonic concepts in a realm situated between art, science, and technology. The project developed from an intensive collaboration between its two editors, and its profile came into sharper focus via constructive dialogue with its various authors. We take this opportunity to express our gratitude to all of our writers for their profound contributions to this volume. All texts were written expressly for the present publication. Our very special thanks also go to Prof. Dr. Hans-Peter Schwarz, Founding Rector of the Zurich University of the Arts, and to Prof. Dr. Ludger Hovestadt, chair of Computer Aided Architectural Design at the ETH Zurich. Their generous financial support and encouragement regarding content made it possible for our book project to assume its present form. The associated cooperation between these two institutions also reflects the aim of bringing architecture, technology, art, and science into dialogue, and has answered the ubiquitous call for transdisciplinarity in a singular fashion. Ultimately responsible for realizing this publication project was Birkhäuser Verlag. Our very special thanks go to Robert Steiger and Véronique Hilfiker Durand for their patient, competent, and consistently committed editorial efforts.

Andrea Gleiniger, Georg Vrachliotis


Thomas Hänsli*
PARRHASIUS’S CURTAIN: VISUAL SIMULATION’S MIMESIS AND MEDIALITY

“The artist deals no longer with reality, but produces a picture of a picture of reality.”
Thomas Ruff, Interview 1994

The Roman author and scholar Pliny the Elder (AD 23–79) provides us with what is certainly the first and also most meaningful description of the visual simulation of reality. In this account, Pliny reports a contest between the artist Zeuxis of Heraclea and his rival Parrhasius of Ephesus to determine which of them was the greater artist.1 Zeuxis displayed his painting of grapes, depicted in such a deceptively real way that even the birds were fooled and flew in flocks to feast on the fruits. But Zeuxis’s triumph was short-lived: when he arrived at Parrhasius’s studio, impatient to see his rival’s painting, which was supposedly covered by drapes, he invited Parrhasius to unveil the finished work. But Zeuxis had erred, for it was not a curtain covering Parrhasius’s painting; rather, the curtain was painted. While Zeuxis had been able to deceive nature, Parrhasius had deceived Zeuxis, an artist. [Fig. 1] Today, the legend’s main significance lies not in the artists’ contest, but rather in its reference to the principle of mimesis in the arts, and thereby to the fundamental relationship between art and reality.2 It was Aristotle (384–322 BC), no less, who elevated mimesis, i.e., the imitation of nature by means of art, to the defining purpose of art, and thereby to the artist’s primary and most noble task. This dictum, which would have great consequences later, can be read in his Poetics.3

* My thanks go to the editors of the present volume for their patience as well as for the very fruitful discussions with them and with the co-authors. A special thanks goes to Jens Trimpin and Magdalena Nieslony for generously providing illustration and catalogue material, as well as to Katja Lemelsen for establishing the contact with them.
1 Pliny the Elder: Naturalis Historia, Book XXXV, Chapter 36.
2 Hans Blumenberg: “Nachahmung der Natur. Zur Vorgeschichte der Idee des schöpferischen Menschen,” in: Studium generale, 1957, vol. 10, pp. 266–283. See also Nicola Suthor: “Mimesis (Bildende Kunst),” in: Historisches Wörterbuch der Rhetorik, vol. 5, ed. Gerd Ueding, Tübingen 2001, pp. 1294–1316.
3 Aristotle: Poetics, 1447a, 1448b. Aristotle grounds this in the anthropological foundation of human perception as mimesis and the human ability to imitate.


Fig. 1: Johann Jacob von Sandrart (1655–98): Zeuxis und Parrhasius, 1675 [after a painting by Joachim von Sandrart].


Although formulated with reference to epic poetry and tragedy, Aristotle’s contention is that the actual goal of all arts is to imitate reality.4 In an Aristotelian sense, mimesis does not mean simply the faithful reproduction of the outer appearance of something in the sense of a copy. Instead, Aristotle expanded “imitation” to the category of the possible and the probable.5 Aristotle’s view thus directly opposes the verdict of his teacher Plato (427–347 BC), who rejected art due to the artwork’s being “at the third remove from reality,” the mere “representation … of an apparition.”6 For him, the image itself was subject to the “deceptive illusoriness” of representations that were only “reflections … not real things,”7 a reservation that would have great consequences for later aesthetic discourse. In this, Plato differentiated between mímesis eikastiké, the faithful copying of reality, and mímesis phantastiké, the deceptive depiction of reality.8 Despite Plato’s verdict, the Aristotelian definition of the being of art, its essence, as mimesis, and the parallelization of the arts, would have far-reaching consequences for art-theoretical discourse, from the Renaissance to the Modern era. The concept of mimesis or imitatio naturae provided a fundamental formulation of the purpose and possibilities of painting, and thereby also of visual simulation. Architecture’s visual simulation, more so than any other form of depiction, rests on the principle of imitating nature.9 It aims for a depiction, as faithful as possible, of an actual or future reality by means of aesthetic illusion – or, precisely, through “pre-tense and re-production.”10 The original meaning of simulatio as the “feigned appearance of a thing” is just as central to the understanding of visual simulation as the other meanings attributed to it, such as “deception” or

4 Seneca: Letters to Lucilius, VII, 65.3, likewise refers – “Omnis ars naturae imitatio est” (All art is imitation of nature) – to the mimetic nature of the arts.
5 Aristotle: Poetics, 1451b.
6 Plato: The Republic, 598a–b.
7 Ibid., 596e.
8 Plato: Sophist, 236b, 264c; Plato: The Republic, 602c–d. In the latter, Plato gives reasons for his critique of art’s mimetic nature.


even “hypocrisy.”11 The productivity of visual simulation is found in its mimetic potential, as well as in the impact on the beholder of its illusory appearance, the illusion. Seen in this way, depictions of visual simulations must measure up to no less than Parrhasius’s curtain!

Imitatio naturae as a category in the modern understanding of art

By the Renaissance at the latest, art had devoted itself to depicting reality as faithfully as possible. A prerequisite for this was a fundamental change in the understanding of the picture in Italian painting in the transition from the Middle Ages to the Modern era. The painters of the 14th century, starting with Giotto di Bondone (1267/75?–1337) and his successors, discovered three-dimensionality in painting. The viewer was no longer intended to experience an image as an autonomous ordering system based on medieval patterns, but as an excerpt from reality, as verosimile, that is, verisimilar: the visual experience of the picture should match that of reality.12 The painter, architect, and biographer Giorgio Vasari (1511–1574) elevated imitatio naturae to the foremost challenge for artists of his era, and thereby to the main theme of early modern art theory and production. According to Vasari, the most elemental principle for the emergence of the work is disegno, which describes the depiction of the idea, as well as the form of the artwork, in the medium of the drawing, thereby attributing a substantial role to the artist’s technical drawing skills. Thus, ceaseless exercise in disegno should lead to constant improvement and consistently truer depictions, in the sense of the most faithful imitation of nature.13 The assumption of constant progress in artistic development was also fundamental to his artist biographies Vite de’ più eccellenti pittori, scultori e architettori, published in 1550 and 1568 (2nd, extended edition), with the art of Michelangelo (1475–1564) placed at the peak.

11 The Latin simulatio is derived from the adjective similis and its verb simulare/simulo and means “appearance” and “deception.” Karl Ernst Georges: Ausführliches Lateinisch-Deutsches Handwörterbuch, Darmstadt 1999, pp. 2678–2679.
12 Eloquent evidence of this is Boccaccio’s description of Giotto’s art in his commentary to the Divina Commedia; cf. Giovanni Boccaccio: Commento alla “Divina Commedia,” ed. Domenico Guerri, Bari 1918, p. 72.
13 Giorgio Vasari: Le Vite de’ più eccellenti pittori, scultori e architettori nelle redazioni del 1550 e 1568, eds. Rosanna Bettarini and Paola Barocchi, 6 vols., Florence 1966–1987, vol. IV, p. 5. See also Wolfgang Kemp: “Disegno. Beiträge zur Geschichte des Begriffs zwischen 1547 und 1607,” in: Marburger Jahrbuch für Kunstwissenschaft, 1974, vol. XIX, pp. 219–240.


Vasari’s disegno thus makes direct terminological reference to the architectural theory texts of Leon Battista Alberti and Vitruvius,14 and thereby to drawing as a medium for depicting and designing architecture.15

The rationalization of mimesis: Alberti’s “finestra aperta”

The decisive requirement for a truly faithful depiction of reality, and likewise perhaps the most important achievement of Renaissance art, was the discovery of central linear perspective.16 The Florentine architect and sculptor Filippo Brunelleschi (1377–1446) is credited with its discovery, on the basis of two visual experiments he performed at the Baptistery and the Palazzo Vecchio in Florence between 1410 and 1420. For the first experiment, Brunelleschi painted a small panel of the Baptistery, from a set point in front of the main gate of the Cathedral of Santa Maria del Fiore, in such a way that the painted picture coincided with what was seen in reality. He bored a hole through the center of the panel so that the painted picture could be viewed from behind, through this small aperture, with the help of a mirror held in front of it.17 This simple arrangement made it possible for the observer to compare the painted depiction of the Baptistery with reality, and to bring these views into correspondence. By virtue of its perspectival conception, Brunelleschi’s painted panel had a proportional relationship comparable to that of reality. As a reliable depiction of an existing reality, it thus achieved its own rational validity, as it were. Linear perspective had thus come into existence as a comprehensible geometric method for the rational depiction of reality, and with that the medieval pictorial order was superseded.18

14 Giorgio Vasari, see note 13, vol. II, p. 43, and also Giorgio Vasari: Einführung in die Künste der Architektur, Bildhauerei und Malerei, ed. Matteo Burioni, Berlin 2006, p. 10.
15 Alberti: De re aedificatoria, I, I. On this, see also Werner Oechslin: “Lineamenta,” in: archi, 1999, pp. 14–17.
16 Erwin Panofsky: “Die Perspektive als ‘Symbolische Form,’” in: Vorträge der Bibliothek Warburg (1924/25), 1927, pp. 258–330 (published in English as: Perspective as Symbolic Form, New York 1997); Richard Krautheimer: Lorenzo Ghiberti, Princeton Monographs in Art and Archaeology, vol. 31, Princeton 1970; Samuel Y. Edgerton Jr.: The Renaissance Rediscovery of Linear Perspective, New York 1975.
17 Cf. Samuel Y. Edgerton Jr., see note 16, for the reconstruction of the experiment.
18 For the meaning of proportionality in perspective construction, see Rudolf Wittkower: “Brunelleschi and ‘Proportion in Perspective,’” in: Journal of the Warburg and Courtauld Institutes, 1953, vol. 16, p. 275ff. For the context above and the construction method used by Brunelleschi, see Frank Büttner: “Rationalisierung der Mimesis. Anfänge der konstruierten Perspektive bei Brunelleschi und Alberti,” in: Mimesis und Simulation, eds. Andreas Kablitz and Gerhard Neumann (Litterae, vol. 52), Freiburg im Breisgau 1998, pp. 55–87.
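The geometry underlying Brunelleschi’s panel – and formalized a little later in Alberti’s costruzione legittima, discussed below – can be stated very compactly. The following Python sketch is our own illustration, not a historical reconstruction: it projects points of a scene onto a picture plane by similar triangles, the principle by which the visual pyramid is cut. All function and variable names are chosen for demonstration only.

```python
# A minimal sketch (our illustration, not from the text) of linear
# perspective as the intersection of the observer's visual pyramid
# with a picture plane: a point seen from an eye at the origin is
# projected onto the plane z = d by similar triangles.

def project(point, viewing_distance=1.0):
    """Project a 3D point (x, y, z), with z > 0 in front of the eye,
    onto the picture plane at z = viewing_distance."""
    x, y, z = point
    if z <= 0:
        raise ValueError("the point must lie in front of the eye (z > 0)")
    scale = viewing_distance / z          # similar triangles
    return (x * scale, y * scale)         # coordinates on the picture plane

if __name__ == "__main__":
    # Two equally tall markers at different depths: the farther one
    # projects proportionally smaller - the foreshortening that gave
    # Brunelleschi's panel its "rational validity."
    print(project((0.0, 2.0, 4.0)))  # (0.0, 0.5)
    print(project((0.0, 2.0, 8.0)))  # (0.0, 0.25)
```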


Fig. 2: Masaccio (Tommaso di Ser Giovanni di Mone Cassai, 1401–prior to 1428): Fresco of the Holy Trinity, Florence, Santa Maria Novella Church, c. 1427.


The fresco of the Holy Trinity by the painter Masaccio (1401–prior to 1428) in the Florentine church of Santa Maria Novella is perhaps the most impressive example of a depiction designed according to the new principles of linear perspective. In an inversion of the principle devised by Brunelleschi, the fresco shows fictive chapel architecture represented on the church’s side aisle wall. The crucifixion scene shown – the Holy Trinity, Mary and John the Baptist, as well as the two benefactor figures shown from the side – melds seamlessly both into the pictorial space and into the real space of the church aisle. The correct perspectival projection of the “feigned appearance” on the church wall lends the event a moment of the “possible,” so that the beholder tends to see the illusion as reality. Masaccio’s Trinity fresco takes advantage of linear perspective’s productive potential in the sense of a simulating, visual expansion of real architectural space. [Fig. 2]

The humanist, theorist, architect, and painter Leon Battista Alberti (1404–1472) was responsible for the theoretical mediation of a reliable perspective method, in his treatise on painting entitled Della Pittura.19 Based on an understanding of the image such as Giotto di Bondone’s, Alberti was now able to grasp the image literally as an excerpt from reality, describing it as a finestra aperta, an open window onto space. The space of the beholder should flow seamlessly into the pictorial space; the space of the beholder and that of the painting were thus, in geometric terms, one and the same. Alberti’s process, which was later called costruzione legittima, meant the construction of linear perspective as an orthogonal cut through the observer’s visual pyramid, and thereby the perfect analogy to the human sense of sight.20 This allowed him more than simply a precise and practical way to reproduce reality accurately in terms of perspective and proportion in a painting. It also, conversely, permitted the projection of any fictional pictorial space he wanted onto the painting surface.

19 Leon Battista Alberti: “Della Pittura (1436),” in: Opere volgari, ed. Cecil Grayson, 3 vols., Bari 1973.
20 Erwin Panofsky: “Die Erfindung der verschiedenen Distanzkonstruktionen in der malerischen Perspektive,” in: Repertorium für Kunstwissenschaften, 1925, vol. XLV, pp. 84–86; Cecil Grayson: “L. B. Alberti’s ‘costruzione legittima,’” in: Italian Studies, 1964, vol. 19, pp. 14–27.


Fig. 3: Albrecht Dürer (1471–1528): Illustration of a perspectival drawing utensil, from: Vnderweysung der Messkunst mit dem Zirckel vnd richtscheyt, Nuremberg 1538.


Alberti thereby provided the necessary requirements for the visual simulation of space, and paved the way for the creation of pictures that no longer had to be aligned solely with reality.21 Alberti’s finestra aperta would soon be developed further in a number of ways, for example, by Leonardo da Vinci and Albrecht Dürer. [Fig. 3]

The heightened degree of illusion that central perspective allowed soon influenced the visual depiction of architecture: based on the finestra aperta model, an attempt was made to dissolve the borders of space through the targeted opening of pictorial fields, guiding the observer’s gaze into the distance. The development of an illusionist depiction of architecture in the Renaissance is closely associated with the northern Italian painter and engraver Andrea Mantegna (1430–1506). In service to the Mantuan court of the Gonzaga family, he created the Camera Picta (1465–74), one of the first works of illusionist architectural depiction, cleverly employing the connection of real architecture with architectural depiction in frescoes. The closed architecture of the relatively small space is dissolved in a fictive, illusionist loggia architecture, which serves as the literal frame for the dynastic scene depicted at the court of the Gonzaga family. Mantegna’s abilities culminated in the fresco of a feigned circular ceiling opening in the center of the room. Framed by painted stucco and fruit garlands, it seems to open up a view of the heavens. Other equally famous examples from the 16th century, such as the Sala delle Prospettive (1516–17) by Baldassarre Peruzzi (1481–1536) in the Roman Villa Farnesina, and the Vatican’s Sala Clementina (1595) by Giovanni Alberti (1558–1601), use the suggestive possibilities of quadratura painting in a similar way.22

21 Panofsky, see note 16, recognized in his seminal article the meaning of perspective for Renaissance art as a “symbolic form” in the sense of Ernst Cassirer.
22 In Peruzzi’s biography, Vasari reports with a certain pleasure that even Titian had succumbed to the deception of Peruzzi’s perspectives: “… And I remember how I guided Cavaliere Titian, a highly admired and distinguished painter, to view this work, and in no way did he want to believe that it was painting …” Cf. Giorgio Vasari: Das Leben des Bramante und des Peruzzi, ed. Sabine Feser, Berlin 2007, p. 42. [author’s italics]


Rhetoricizing the artwork and deceiving the beholder: “ti fan vedere ciò che non vedi”

Baroque art, which made full use of the whole range of painting’s illusionist potential and of the effects of deceptive illusoriness on the beholder, would also provide its theoretical grounding. For example, the programmatic document Lo inganno de gl’occhi, published in 1625 by the theorist, painter, and architect Pietro Accolti (1579–1642),23 contains an introduction to the fundamentals of geometric as well as visual perspective. The primary goal of his text, as the title suggests, is the intentional deception of the beholder. One of the most influential representatives of illusionist depiction was Andrea Pozzo (1642–1709), painter, Jesuit architect, and author of a two-volume treatise on perspective.24 His main work, the ceiling fresco in the nave of the church of Sant’Ignazio in Rome, depicts the Apotheosis of Saint Ignatius (1688–1694).25 Here, the instrument of perspective-based depiction is exploited to the fullest extent: an illusory architecture spanning several stories continues the subdivision of the church space with such artifice that the transition from real to illusory architecture can be discerned only with great difficulty. The reality of the real church space continues in the “deceptive illusoriness” of the fictive architecture, giving beholders the illusion of a church room that opens at the top. Pozzo’s treatise on perspective leaves no doubt as to the intended effect of his art: “inganno,” the intentional deception of the beholder.26 As possible uses for his methods, he cited specific situations in which the means or the circumstances precluded built architecture: in other words, he named the simulation of architecture as deceptive illusion.27 [Fig. 4]

23 Pietro Accolti: Lo inganno de gl’occhi, Florence 1625.
24 Andrea Pozzo: Perspectiva pictorum et architectorum, Pars I/II, Rome 1693/1700.
25 The fresco marked both the peak and the end of a series of famous Roman ceiling paintings, such as Pietro da Cortona’s Divina providentia in the Palazzo Barberini, Guercino’s Aurora in the Casino Ludovisi, and Gaulli’s fresco Triumph of the Name of Jesus in the Gesù.
26 Andrea Pozzo: Perspectiva pictorum, Pars I, Ad lectorem: “L’arte della prospettiva con ammirabile diletto inganna … l’occhio.” (‘Perspective-based art deceives to the eye’s great pleasure.’)
27 Andrea Pozzo: Perspectiva pictorum, Pars I, Respondetur objectioni: “… essendo la prospettiva una mera fintione del vero.” (‘Perspective is a pure deception of reality.’) This is especially evident in the illusory cupola adjacent to the nave; see Andrea Pozzo, note 24, Pars II, figs. 64, 71, 91.


It is particularly significant that it was in the Baroque era that the effect of deceptive illusion became so interesting to both artists and clients.28 Under the influence of the Baroque theory of rhetoric, persuasion of the beholder – persuasio – became one of the main concerns of both art theory and artistic production of the era. To engage in this visual persuasion, art – analogous to speech – should evoke aesthetic pleasure and marvel, diletto and maraviglia. Art was also intended to evoke amazement and, where it helped mediate content, to shock.29 In the end, Conte Emanuele Tesauro (1592–1675) from Turin, a scholar of philosophy, poetry, and rhetoric, was thus able to detect evidence of an artist’s genius in his or her deception of the beholder through perspective. Tesauro equated the use of perspective with greater astuteness, mainly due to its potential for deception: perspective is, accordingly, extremely astute, because it can represent to the beholder that which cannot be represented: “… ti fan vedere ciò che non vedi.”30

28 See the seminal article by Rensselaer W. Lee: “Ut pictura poesis. The Humanistic Theory of Painting,” in: Art Bulletin, 1940, vol. 22, pp. 197–269; and for further thoughts on this, Thomas Hänsli: “‘Omnis in unum’ – Inganno, Argutezza und Ingegno als kunsttheoretische Kategorien bei Emanuele Tesauro und Andrea Pozzo,” in: Wissensformen, Stiftung Bibliothek Werner Oechslin, Zurich 2008, pp. 166–179.
29 Francesco Bocchi: “Eccellenza del San Giorgio di Donatello (1570),” in: Trattati d’arte del Cinquecento fra Manierismo e Controriforma, ed. Paola Barocchi, Bari 1962, p. 190ff., sums up the effect of the statue of Saint George by Donatello in these terms.
30 Emanuele Tesauro: Il cannocchiale aristotelico, Turin 1670, p. 89: “argutissime finalmente sono le optiche; lequali … ti fan vedere ciò che non vedi.” (‘Ultimately, the visual perspective is extremely astute, which is capable of representing that which cannot be represented.’)


Fig. 4: Andrea Pozzo (1642–1709): Fresco of the Apotheosis of St. Ignatius, Rome, Sant’Ignazio Church, nave (1688–94).


The end of mimesis or the new autonomy of visual representation

In recent years, philological research has assigned a further semantic field to the original concept of mimesis, one that goes far beyond the understanding of mimesis as pure imitation in the sense of a copy that doubles reality.31 The expanded concept of mimesis also includes categories such as depiction, representation, expression, and sensual visualization of reality, thus describing all relationships of the artwork to reality, including – primarily – the independent, productive-creative one.32 This renders the concept operable once again within today’s aesthetic and art-theoretical discourses. It proves productive – largely against the backdrop of an increasing medialization of architecture and images of architecture – as the conceptual definition of architecture’s visual simulation. The goal of visual simulation, from an artistic perspective, is no longer simply the imitative reproduction of architecture in the sense of a faithful copy of reality. Instead, the artist, faced with a flood of medial representations of architecture in film, photography, and television, is forced to create a picture of a picture of reality, as it is depicted by the media. This transformation can be recognized in the example of contemporary architectural photography. Beginning with the descriptive approach of documentary photography, as the artist Thomas Struth states, contemporary architectural photography has meanwhile “… established itself more between art and other realities than the classical disciplines.”33 Photography as a representational medium has attained an autonomy distinct from the depicted reality of the represented architecture. As the photographer Thomas Ruff has so aptly put it, the artist no longer creates pictures of reality, but now only “a picture of a picture of reality.”34

31 Thomas Metscher: “Mimesis,” in: Enzyklopädie der Philosophie, Hamburg 1999, p. 845; Valeska von Rosen: “Nachahmung,” in: Metzler Lexikon Kunstwissenschaft. Ideen, Methoden, Begriffe, ed. Ulrich Pfisterer, pp. 240–244.
32 Hans-Georg Gadamer: “Wahrheit und Methode. Grundzüge einer philosophischen Hermeneutik,” in: GW, vol. 1, 1986, suggests in this context the replacement of the concept of mimesis with the concept of “reproduction.”
33 Ludger Derenthal: “Skeptische Architekturphotographie,” in: Ansicht, Einsicht, Aussicht, ed. Monika Steinhauser, Düsseldorf 2000, pp. 19–28.
34 Willi Adam: “Der Betrachter entscheidet,” interview with Thomas Ruff, Kultur-Joker, vol. 4, March 4–17, 1994, p. 46. The same would also apply to the visual simulation of architecture as visualization of its ideas and concepts with computer-aided means of representation. The medium is no longer drawing, in the sense of Vasari’s disegno, but computer-generated design and imaging.


Fig. 5: Jens Trimpin (*1946): Wandloser Raum, 2003. Acrylic glass, 20 × 28 × 15 cm.


This applies to every form of artistic realization of architecture. On the occasion of the art exhibition “Kunst-Licht” in 2004, the sculptor Jens Trimpin presented a sculpture, Wandloser Raum, which he had based on the eponymous poem written in 1979 by the German author Ernst Meister.35 The “relationship of emptiness and fullness was apparently reversed”36 through minimal incisions in a massive, colorless, and translucent acrylic cube: although clearly the work of a sculptor and recognizable as an art object, it has the effect of a representation of architecture. The artwork gives the “appearance of an architectural actuality,” as the artist states. In terms of the expanded mimesis concept, it is a matter of a sensual – in this case, artistic – visualization of architecture. Here, as a final consequence, the visual simulation of architecture achieves its own status as an art object. [Fig. 5]

35 Ernst Meister: Wandloser Raum. Gedichte, Darmstadt, Neuwied 1979: “Ihr haltsamen | vier, ihr | Ecken der Gegend! || Ich steh | zwischen Luft, | den Atem sinnend, || indes, mir übers Haupt, | der Raum sich hebt | mit unzähligen Himmeln.” (Published in English as: Room Without Walls. Selected Poems, Los Angeles 1980, German/English, translation by Georg Gugelberger.)
36 Magdalena Nieslony: “Bei Licht besehen,” in: KUNST-LICHT, eds. Bernhard Knaus and Magdalena Nieslony, Freiburg im Breisgau 2004 [exhibition catalogue Kunst-Licht, E-Werk. Hallen für Kunst, Freiburg im Breisgau, 16.5.–27.6.2004]. In the same exhibition, among other works, Trimpin showed a painting on Plexiglas entitled Albertis Fenster, which was intended as “a programmatic counter-position to the modern illusionist pictorial model,” cf. ibid., p. 12.


Andrea Gleiniger
OF MIRRORS, CLOUDS, AND PLATONIC CAVES: 20TH-CENTURY SPATIAL CONCEPTS IN EXPERIMENTAL MEDIA

Around mid-1923, the Russian architect Lazar Markovich Lissitzky, known as El Lissitzky, began to experiment seriously with photography.1 Originating around the same time, between 1923 and 1925, was his most spectacular architectural project: the legendary “Wolkenbügel” (“Cloud Iron”). This “horizontal” vision of a high-rise for Moscow’s Ring Boulevard was the architectural result of the project El Lissitzky called Proun, which was designed “for the renewal of art,”2 and through which he had begun in 1919 to plumb a new relationship between painting and architecture, between surface and space.3 A photomontage, presumably dating from 1925, shows one of his eight “Wolkenbügel” in a realistic urban setting, in this case Moscow’s Nikitsky Square: standing out, in slightly wide-angle perspective and a monumentalizing view from below, against an airily retouched background with a pale blue sky, is the projecting high-rise sculpture, surrounded by the street traffic, which surges around it in all directions [Fig. 1]. Here, the dynamism of the new and the futuristic is effectively set in context. The medial significance of the postcard is its suggestion of “reality.” And although this reality oscillates between the specificity of an architectonic attraction and the everyday life of a metropolitan square, this mode of (re)presentation seems to have engraved the model of the “Wolkenbügel” on collective memory.

Visualizing the Future: Simulation and Photomontage

With his choice of photomontage, one of the most advanced and still unusual techniques of visualization in use at the time, El Lissitzky shows that using the concept of simulation in architecture reflects not only the history of ideas,

1 A good presentation of his works is provided by, among others: El Lissitzky 1890–1941. Retrospektive (exhib. cat., Hanover), Berlin 1988.
2 Proun = proekt utverzhdeniya novogo = project for the renewal of art: “… and we recognize that the new painting that grows out of us is no longer a picture. It describes nothing, but instead constructs extensions, surfaces, lines, with the objective of creating a system for composing the real world. We gave this new building a new name – Proun.” El Lissitzky: Die Überwindung der Kunst (1922), pp. 70–72.
3 Suprematism was established by Kasimir Malevich, its most prominent representative. The objective was total abstraction and reduction to “pure” geometric forms, realized by Malevich in celebrated and influential pictures such as his Black Square (1915).


Fig. 1: El Lissitzky: Wolkenbügel (Cloud Iron) for Moscow, photomontage, 1924/25.


but media history as well. Even before simulation, as an application of the arts of digital representation, appeared as it does today in the gravitational field of information technology and epistemology – as well as in architecture – simulation as a form of architectural representation was already particularly associated with the tradition of mimesis. Interpreted thus, the concept of simulation points back to a core preoccupation of architecture, namely the task of vividly conveying the concepts and ideas of that which is to be built. The development of architecture and of architectonic design has given rise to a spectrum of presentational forms and concepts of visualization that are concerned with communicating a vision, an experience of the prospective. It was not via the potentialities of digital visualization that this spectrum was substantially expanded for the first time, but earlier, in dialogue with new and dominant modes of perception and technological concepts. This is particularly true for the process of modernization, the inception of which dates from the late 19th century; and especially for electricity, which generated the technological foundations for the medial expansion of ways of visualizing and thematizing architecture. El Lissitzky is by no means the only individual – nor even the first – to work with the new potential for visualization offered by photomontage. Visualization practices that pushed towards ever greater levels of vividness and powers of persuasion progressively gained significance, particularly in the context of architectural competitions, which became increasingly important in the course of the 19th century.4 New perspectives were opened up, in a literal sense, by photography – which had increasingly become a working tool for architecture since the late 19th century – and photomontage in particular became a new instrument of visualization that seemed uniquely suited to examining and displaying the effects of a given design in “reality.”5

4 At the same time, since color was forbidden and perspective views regarded as undesirable in the context of a competition, the colored architectural drawing lost its significance and was replaced by “standardized, technical, mechanical ink drawing” with its greater anonymity and objectivity. Cf. Winfried Nerdinger: Die Architekturzeichnung. Vom barocken Idealplan zur Axonometrie, Munich 1986, p. 16.
5 One of the first known photomontages was executed by the German architect Friedrich von Thiersch in 1902. Ibid., pp. 142 and 143. The photomontage acquired a special significance in the works of Ludwig Mies van der Rohe. Cf. here: Andres Lepik: “Mies and Photomontage, 1910–38,” in: Mies in Berlin (exhib. cat., New York/Berlin), New York 2002.


The exceptional importance attributed by El Lissitzky to precisely such “realistic” projections of his vision of the “Wolkenbügel” into real, existing contexts – projections at least seemingly oriented toward the pedestrian’s perspective – becomes explicit when we consider his critique of the veduta-style presentations of American skyscrapers, which in the 1920s provided talismanic prototypes for European high-rise designs.6 With his “Wolkenbügel” photomontage, El Lissitzky attempted to demonstrate in an exemplary manner the objectification of architectonic novelty, of a future architecture, within contemporary scenarios of everyday life. Yet in the postcard-style visualization of the “Wolkenbügel,” he was also preoccupied with providing a vivid image of concrete perceptual experience, that of the passerby on the street. Nevertheless, this attempted visualization may have struck him as inadequate, since the polyperspectival dynamization he strove to achieve in architectural drawing, and which he continually extemporized in his Prouns, could not be satisfactorily realized in a postcard-style view. This dynamization emerged principally at the symbolic level, remaining a suggestion embodied by the circulating street traffic, which thus becomes a symbol of the acceleration of spatial perception. As advanced as photomontage was in the context of media history and the history of visualization, it nonetheless remained conventional, to the extent that, in the final analysis, it could deliver little more than a static image: one certainly capable of presenting striking visions of the future, but not of conveying the perceptual experience associated with movement. In this context, it would be film that would open up new possibilities for simulating architecture, space, and urban experience.

6 In an article in Bauindustrie that appeared in 1928, El Lissitzky wrote: “Our verdict of the American skyscraper is no doubt one-sided. [… Yet] our opinion is formed by what we see on postcards or similar illustrations. These photos generally show the building as detached, as seen by a European approaching from the ocean side, and not from the street, as though one is standing next to it and below, as actually perceived by the human eye. Precisely from a near distance, all of the fussiness of the style vanishes, and there remains the powerful impression of the foreshortened structure, which thrusts upwards.” Cited in: “Die Architektur des Stahl- und Stahlbetonrahmens,” in: El Lissitzky, Proun und Wolkenbügel. Schriften, Briefe, Dokumente, ed. Sophie Lissitzky-Küppers, VEB Verlag der Kunst, Dresden 1977, pp. 70–79, here p. 73.


Accelerated Spatial Perception: Simulation and Film

The potential of film was recognized early on,7 and investigated, in particular, by the artistic avant-gardes of the 1920s, notably in the works of László Moholy-Nagy,8 as well as in films such as Walter Ruttmann’s Berlin. Die Sinfonie der Grossstadt (Berlin, Symphony of a Metropolis; 1927) and Dziga Vertov’s Der Mann mit der Kamera (The Man with the Movie Camera; 1929).9 Dynamism (or “tactile reception,”10 as Walter Benjamin referred to it), as a genuine rapprochement of architecture and urban design, had still been attempted in the two-dimensional image, for instance by the Futurists, the Cubists, and early 20th-century painters of the metropolis such as George Grosz. New analytical potential arose out of the possibilities of cinematographic simulation. Film became a medium of dynamized urban experience: its peculiarities vis-à-vis perceptual psychology were disclosed and recorded by means of the camera eye. A new image of the city emerged within its dynamic visualization. Two perceptual events and spatial experiences were of central importance in this context: the visualization of movement as a mode for representing time, and the transformation and dematerialization of the nocturnal urban environment via the diverse manifestations of (moving) light, i.e. the invention of the metropolis in the orchestration of advertising, light, and movement [Fig. 2]. And even if a mobile, motorized experience of the urban realm still remained a privilege of the few, particularly in terms of the visual stimuli available from a moving automobile, such experiences would become a decisive source of inspiration that affected both urban planning concepts and the design of individual structures – think of contemporary designs for Berlin’s Alexanderplatz (1929).

7 H.W. Jost: “Kino und Architektur,” in: Städtebau, Monatsschrift für die künstlerische Ausgestaltung der Städte nach ihren wirtschaftlichen, gesundheitlichen und sozialen Grundsätzen, eds. Theodor Goecke and Camillo Sitte, 8/9, 1916, p. 91. For valuable references in this connection, my thanks to Martino Stierli, who was kind enough to provide an advance copy of his text “Die ‘Er-Fahrung’ der Stadt. Las Vegas, Film, Automobilität,” in: Das Auge der Architektur. Zur Frage der Bildlichkeit in der Baukunst, eds. Andreas Beyer, Matteo Burioni and Johannes Grave, Munich 2008.
8 Relevant in this connection are the following texts by László Moholy-Nagy: Malerei, Fotografie, Film, Bauhausbuch no. 8, 1925 (reprint Mainz 1967); Von Material zu Architektur, ed. Hans M. Wingler, with an article by Otto Stelzer and an article by the editor, facsimile of the 1929 edition, Mainz 1968; Vision in Motion, Chicago 1947.
9 On Vertov see also: Lev Manovich: The Language of New Media, Cambridge, Mass./London 2001.
10 Walter Benjamin: Das Kunstwerk im Zeitalter seiner technischen Reproduzierbarkeit. Drei Studien zur Kunstsoziologie, Frankfurt 1977 (10th edition), see e.g. p. 41.

Fig. 2: Erich Mendelsohn, Broadway at Night, 1925.

designs of the 1920s: individual mobility became a mass phenomenon, and in many respects, the automobile became a planning parameter, while a new esthetic – one founded on perceptual psychology – gave rise to an instrument of city and planning analysis that found its adequate forms of expression in film, and in cinematic resources generally. This was the basis for architects such as Robert Venturi and Denise Scott Brown when they investigated Las Vegas around 1970. The View from the Road11 was the title given by Kevin Lynch – at this time programmatically preoccupied with ideas related to The Image of the City – to a study published in 1964.12 And in her text “Learning from Pop,” which appeared in Casabella in 1971, Denise Scott Brown propagated video and film as investigative instruments in urbanistic analytical and planning processes: “New analytic techniques must use film and videotape to convey the dynamism of sign architecture and the sequential experience of vast landscapes.”13 Architectural concepts that thematize these simulated urban and spatial experiences in various ways emerge from the dialogue with filmic modes of perception – all the way to the design concepts of Bernard Tschumi. In his Manhattan Transcripts14 of 1981/82, for example, Tschumi – unlike Scott Brown and Venturi – does not attempt to derive a new, large-format symbolicity for architecture by drawing analogies between cinematic perception and urban experience. He is far more concerned with transferring perceptual experiences derived from the dramaturgy of editing and montage into spatial arrangements in which an architectonic expression for “time” could be won from the relationship of sequence and fragment as the foundation for spatial experience.

11 Donald Appleyard, Kevin Lynch, John R. Myer: The View from the Road, Cambridge, Mass. 1964. 12 Kevin Lynch: The Image of the City, Cambridge, Mass. 1960. 13 “In fact, space is not the most important constituent of suburban form. Communication across space is more important, and it requires a symbolic and a time element in its descriptive systems which are only slowly being devised. New analytic techniques must use film and videotape to convey the dynamism of sign architecture and the sequential experience of vast landscapes, and computers are needed to aggregate mass repeated data into comprehensible patterns,” Denise Scott Brown: “Learning from Pop,” Casabella 12/1971, pp. 359–360. 14 Bernard Tschumi: The Manhattan Transcripts, New York 1981.

Fig. 3: Robert Venturi and Denise Scott Brown: “The ‘Strip,’” double page spread from Learning From Las Vegas, 1978.

“Learning from Las Vegas” in a “Society of the Spectacle”: Simulation and the Mediatization of the Urban Realm

In the mobile, filmic urban analyses they undertook from the late 1960s onward, Scott Brown and Venturi took this challenge seriously. Yet, unlike such modernist heroes as Le Corbusier, who deployed the relationship of architecture and film in a highly artificial act of reciprocal illumination, the two Americans turned their attention toward trivial manifestations of urbanism and urban planning. They rediscovered “main street” in its utter vulgarity, and even more provocatively, its hypertrophied version in the Las Vegas Strip [Fig. 3]. Both now became serious objects of investigation, in order to fathom the possible complexity and potential contextualization of an urban space that had become alienated in its mediatized symbolicity, and in order to develop from it a new, independent symbolicity for architecture. Via “estrangement,” Pop Art contributed essentially to rendering visible the experience of “alienation” in a commercialized world. Mediatized scenarios based on technological visions of the future are ironically ruptured again and again in the pointed urban Pop Art tableaux of groups such as Archigram. With Pop, Modernism’s claim of bringing “art into life” was in some sense turned into its opposite: now, life pushed its way into art. But this was a “life” that at this point had already been diagnosed and identified as a mediated one. And mediatization – which for the avant-gardes of the 1920s was still capable of founding a new claim for art – had in the meantime become the object of a critique of civilization, or at least of a discourse that was skeptical of the media. In light of the Society of the Spectacle15 on which Guy Debord trained his sights in a highly critical fashion in 1967, perhaps nothing could have been more misleading than to have taken Las Vegas seriously in urban planning terms, let alone proclaiming it a new model.16 For here was the quintessential experience of being totally overwhelmed and alienated by an environment dominated and guided exclusively by commercial interests and mediatized artificiality. In short: total simulation. All the more illuminating, then, to remind ourselves that Lewis Mumford, the American architecture critic and historian of the city, must have been aware of the phenomenon of simulation – without ever referring to it explicitly – when

15 Guy Debord: La Société du spectacle, Paris 1967, in: Œuvres, Paris 2006. In this connection, see also: Nils Röller, “Scientia Media – Simulation between Cultures,” this volume pp. 47–56. 16 Robert Venturi, Denise Scott Brown, Steven Izenour: Learning from Las Vegas, Cambridge, Mass. 1972.

in the last chapter of his profound 1961 historical study of urbanism, The City in History: Its Origins, Its Transformations and Its Prospects, under the heading “The Shadows of Success,” he focused his pitiless gaze on the contemporary “metropolitan denizen”: “He lives, not in the real world, but in a shadow world projected around him at every moment by means of paper and celluloid and adroitly manipulated lights: a world in which he is insulated by glass, cellophane, pliofilm from the mortification of living. In short, a world of professional illusionists and their credulous victims. […] That life is an occasion for living and not a pretext… [does] not occur to the metropolitan mind. For [the members of a ‘crowd’ that watches a ‘spectacle’], the show is the reality, and the show must go on!”17 Mumford conjures up an image of the metropolis as a form of fundamental cultural (self-)deception, one that has degenerated into a world of mere appearances consisting of distracting maneuvers and substitute worlds. In descriptions of this civilizational condition into which the metropolis is said to have deteriorated in the modernity of the 20th century, Plato’s metaphor of the cave has experienced a contemporary renaissance – not least in agreement with Hans Blumenberg, who interprets “the metropolis as a repetition by means of new media and technology of the pre-civilizational cave […].”18 In this context, Japanese architect Toyo Ito has provided us with an exceptionally vivid translation of this metaphor into a mediatized spatial environment. In the design of the exhibition scenario for “Visions of Japan,” held in London in 1991, it was perhaps less a question of any visualization of a “simulated dream of the future world,” as the architect himself imagined.19 In the “interlocking of (simulated) spaces”20 that he staged, Ito displayed instead a simulation of that very simulation into which the frequently invoked metropolis – the one dematerialized by information flows and by its own “simulating” visualizations – had ultimately degenerated. In short: the virtualized “megalopolis.” [Fig. 4]

17 Lewis Mumford: The City in History: Its Origins, Its Transformations and Its Prospects, London 1961; see also Lewis Mumford: City Development – Studies in Disintegration and Renewal (1945). 18 Norbert Bolz: Die Welt als Chaos und Simulation, Munich 1992, p. 97, in relation to Hans Blumenberg: Höhlenausgänge, Frankfurt 1989. 19 Sarah Chaplin: “Cyberspace – lingering on the threshold. Architecture, Post-modernism and Difference,” in: AD 65 (11/12), 1995, pp. 32–35, here p. 35. 20 Walter Benjamin: Das Passagen-Werk, ed. Rolf Tiedemann, Frankfurt 1982, p. 666. 21 For example in: Von Material zu Architektur, ed. Hans M. Wingler, with an essay by Otto Stelzer and a contribution by the editor, facsimile of the 1929 edition, Mainz 1968, p. 167 and p. 177. Or German architect Hugo Häring in a text written for the Bauhaus newspaper in 1928: “the square as a space in the sense of the historical art of urban planning no longer exists, it has been destroyed, completely dissolved. In the afterimage, nothing corporeal exists any longer […] the light sources appear freely disposed in space, floating. […] found everywhere, then, is the complete opposite of the historical architectural square. In terms of building material as well: light instead of stone, the conquering of open space […].” (“der platz als raum im sinne der historischen stadtbaukunst existiert nicht mehr, er ist zerstört, vollkommen aufgelöst. es existiert im nachtbild nichts körperhaftes mehr […]. die lichtquellen erscheinen frei disponiert im raum, schwebend. […] es bestehen also überall vollkommene gegensätze zum historischen architekturplatz. auch im baustoff: licht gegen stein. eroberung des freien raumes […].”) Cited in: Anne Hoormann: Lichtspiele. Zur Medienreflexion der Avantgarde in der Weimarer Republik, Munich 2003, p. 253.

Platonic Caves

Plato’s metaphor of the cave is one of the founding metaphors of European thought, as well as of the discourse of simulation in Modernity. With the mediatization of space that took place in the 20th century on the basis of electrification, and the visualization technologies and strategies emerging from it (whether on the small scale of art and exhibition spaces, or on the larger scale of the urban realm), the Platonic cave underwent a new and thoroughly vivid re-interpretation. At the same time, this experience of mediatized space was described early on as an experience of dematerialization and virtualization, in particular by Moholy-Nagy.21 The city – which had been dematerialized, “virtualized,” and transformed by light and the new media into a large-scale media space – preoccupied the designers and artists of classical Modernism. In this context, it is hardly an accident that during the same period, Walter Benjamin discovered the Paris of the 19th century as the subject of his Passagenwerk.22 Forming for the first time in a higher concentration in Paris – the “city of mirrors” (“Spiegelstadt”) that Benjamin conjured up so unforgettably and with such material specificity – were the ingredients of the artificial night that transformed urban space into a magical realm of illusion. This was founded on technical developments that, while not yet functioning on the basis of (electronic) media, exploited materials such as glass and its reflective effects – and mirrors as such, which already constituted the

22 See note 20.

Fig. 4: Toyo Ito: Visions of Japan, exhibition installation in the Victoria & Albert Museum, London 1991.

experience of the virtual23: the countless reflections of the urban realm within whose “spatial interlockings” emerged an autonomous and artificial world of illusion, a new world of perception. In this way, the idea of simulation acquired its modern contours against the background of industrialization and electrification. And it was light (specifically artificial light) that opened up spaces of possibility for perception and experience, which acquired new dimensions and a new dynamism through electrification.24 László Moholy-Nagy – a tireless experimenter with the new media – had recognized this as almost no one else had done.25 In this context, he considered two aspects as being of primary importance: the artistic appropriation of the new media technologies, and the founding of an artistic aesthetic upon perceptual psychology, in which precisely these new media and their design potential would play an essential role. At least implicitly, and (it should be stressed) without ever being actually formulated as such, the concept of simulation acquired a new, multifaceted significance via the tension between the perceptual-psychological anamnesis of a technically accelerated world and its artistic visualization.

Shadows and Mirrors

As early as the 1920s, the exhibition space was a location where the new modes of perception could be thematized and staged. Exhibition design – which flourished in this period for a variety of reasons – became a space of experimentation for the new media and for the concepts of visualization associated with them. Here, it was a question of innovative strategies for communicating modern design principles, of work devoted to enlightenment and persuasion. But it was not only the new image media and forms of projection such as photography, film, and slide transparencies that strove in their spatial scenarizations toward analogies of Plato’s cave metaphor. This was also true of the forms of projection that (as archetypes of the virtual) in a sense anticipated them:

23 Cf. Elena Esposito, “Illusion und Virtualität: Kommunikative Veränderungen der Fiktion,” in: Soziologie und künstliche Intelligenz, Produkte und Probleme einer Hochtechnologie, ed. Werner Rammert, Frankfurt/New York 1995, pp. 187–216; on “mirror images and real images,” p. 191ff. 24 Marshall McLuhan: Understanding Media: The Extensions of Man, New York 1964; Marshall McLuhan with Quentin Fiore, coordinated by Jerome Agel: The Medium is the Massage: An Inventory of Effects, New York 1967. 25 On Moholy-Nagy’s programmatic activities as a “lighter” (“Lichtner”), see most recently: Hoormann 2003 (note 21).

mirroring and shadow. In his Lichtrequisit für eine elektrische Bühne (Light Prop for an Electric Stage) of the 1920s, Moholy-Nagy conceived a kinetic experimental set-up that he could use to explore the mobile dramaturgy of light and shadow, with its reflective play, as a space-generating system for the stage, and to experiment with the projection of an abstract constructivist shadow realm.26 In this context, there are two spatial configurations in particular through which a phenomenology oriented by perceptual psychology could represent an anticipatory virtual space: the shadow realm of the stage, and the “mirror space” of the city, permeated by the reflective effects of illumination, the “scenic intensifications” (“Szenenverdichtungen”) of which were made possible by the new media.27

The Pepsi-Cola Pavilion: “Beneath all those mirror images …”28

One project conceived as a programmatic continuation of the dialogue that modernism constituted between art and technology – the pavilion realized by the American artists’ group E.A.T. (Experiments in Art and Technology) for the American soft drink corporation Pepsi at the 1970 World’s Fair in Osaka – is a representative example (from today’s perspective) of the concept of simulation, which has been in a process of transformation since the middle of the last century. Among the multiplicity of artistic and technical attractions that this pavilion offered, two are of special interest in our present context: the mirrored interior of its cupola, and the fog (or perhaps cloud) that enveloped the cupola, itself clad in white plastic elements. These two elements, mirror and fog, represent two forms of simulation that the pavilion brings into a meaningful relationship with one another: in the cupola, accessible only through a tunnel-like passageway, we find an adaptation of the Platonic cave – scenarized in a way that is both spatially and technically virtuosic – and a staged and publicly effective Debordian “spectacle” [Fig. 5]. And thematized with the fog/cloud [Fig. 6] is a meteorological formation that alone had

26 Surviving, for instance, as part of a set for the 1928 production of Tales of Hoffmann at the Staatsoper in Berlin: “an attempt to allow space to emerge from light and shadow. The scenery, among other elements, is converted into props for generating shadow. Everything is transparent, and all transparent elements are adapted to a lavish yet still comprehensible spatial articulation.” Moholy-Nagy 1929/1968, see note 21, p. 219. 27 Ibid, p. 175. 28 Rolf Haubl: “Unter lauter Spiegelbildern …”: Zur Kulturgeschichte des Spiegels, Frankfurt 1991.

become the object of a methodology initially introduced by meteorology: the use of simulation in the form of dynamic modeling as an analytical instrument of knowledge. The mirroring – realized, after elaborate research and experimentation, in the form of an inflatable foil – was a spherical mirror that promised a singular optical experience: “such a mirror allows the formation of ‘real images.’” Highlighted by this emphasis on the high degree of verisimilitude made possible by mirror technology and the consummate and almost transitionless “interlocking of spaces” was the perfection of a deception whereby visitors hardly knew, in the end, where they stood: in the world in front of the mirror, or in the one behind it. This was the case because: “These images look exactly like the objects they represent, so that the real and image worlds cannot be distinguished. […] the real image world and the real physical world coexist in the same space.”29 The mirrored cupola in Osaka (which came into existence three years before the mirror world staged by Rainer Werner Fassbinder in his “Welt am Draht” (World on a Wire), and six years after its literary prototype, the science-fiction novel Simulacron Three,30 which appeared in 1964) thereby became a singular artistic and technological experimental protocol in which the dialogue between the “spectacle” and the metaphor of the Platonic cave could be plumbed in a multifaceted manner. But that was not all:

“… breathing the atmosphere”31

The counterpart of the mirrored cupola was formed by the fog or cloud sculpture commissioned from Japanese artist Fujiko Nakaya.32 Through Nakaya, the scenarization of the Pepsi-Cola Pavilion entered the gravitational field of high-profile climatology research, for as the daughter of a renowned physicist, crystal-

29 Elsa Garmire: “An Overview,” in: Pavilion: By Experiments in Art and Technology, ed. Billy Klüver, Julie Martin, and Barbara Rose, New York 1972, p. 196; in particular: “Opticals of Spherical Mirrors,” pp. 243–246. 30 See Nils Röller in this volume, p. 48 (note 15). 31 Pavilion 1972, p. 41. 32 Since then, Fujiko Nakaya has made a name for herself with various cloud sculptures. Elizabeth Diller and Ricardo Scofidio also sought her advice regarding the realization of the Cloud Pavilion at Yverdon-les-Bains on the occasion of the 2002 Schweizer Landesausstellung (Swiss National Exhibition).

Fig. 5: E.A.T., Pepsi-Cola Pavilion, Osaka World’s Fair 1970: interior view of mirrored cupola; photo: Fujiko Nakaya.

lographer, and meteorologist,33 she was not only especially interested in the artistic theme of clouds and fog, but also had the requisite connections to the relevant research disciplines of natural science, those at the forefront of the modeling technologies and scenarios used by climatology at that time. For with the fog as well, in the end, it was a question of a quality that had already distinguished the spherical mirrors of the cupola interior: a high degree of reality, and hence a heightened natural-mimetic effect.34 The new media technologies have considerably expanded simulatory possibilities in the mimetic sense. In the meantime, however, the 1960s had also seen the emergence of that concept of simulation associated with the dynamic formation of models, as generated through the dialogue between cybernetics and climatology in particular.35 But that was not all: at the same time, a preoccupation with natural prototypes (repeatedly reclaimed in the architectural context) acquired a new dimension, for reasons related to building and its history as well as for (as is well known) ideological ones. At the time when Carlo Lodoli was writing, i.e. in the 18th century, the conviction that “not the external aspect of nature becomes the initial impulse for the architectural imagination, but instead the inherent properties of the material, the structure […],” had formed a component of architectural-theoretical discourse concerning the organic.36 Architectural and constructive concepts, such as

33 Ukichiro Nakaya, then a professor at Hokkaido University. Ichiro Sunagawa: “Growth and Morphology of Crystals,” in: Forma 14, 1999, pp. 147–166, here pp. 159–160. www.scipress.org/journals/forma/pdf/1401/14010147.pdf. 34 The names cited in this connection refer to the University of California in Los Angeles (UCLA), and to the research milieu revolving around Morton Wurtele, and later in particular around Akio Arakawa, a student of Nakaya’s, in which the developmental stages of climate modeling were carried out. On this topic and on the developmental history of fog, see Fujiko Nakaya: “Making of ‘Fog’ or Low-Hanging Stratus Cloud” and Thomas R. Mee: “Notes and Comments on Clouds and Fog,” in: Pavilion 1972, pp. 207–223 and pp. 224–227. Also: Paul N. Edwards: “A Brief History of Atmospheric General Circulation Modeling,” in: David A. Randall (ed.), General Circulation Development, Past Present and Future: The Proceedings of a Symposium in Honor of Akio Arakawa, New York 2000, pp. 67–90. 35 See the article in this volume by Georg Vrachliotis, pp. 63–81. 36 Cf. Hans-Peter Schwarz: “Die Mythologie des Konstruktiven. Zerstreute Gedanken zur Naturgeschichte des Konstruktiven,” in: Vision der Moderne. Das Prinzip Konstruktion, ed. Heinrich Klotz in collaboration with Volker Fischer, Andrea Gleiniger and Hans-Peter Schwarz (exhib. cat., Deutsches Architekturmuseum, Frankfurt am Main), Munich 1986, pp. 46–55, here p. 47. On the significance of Carlo Lodoli (1690–1761), cf. also Hanno-Walter Kruft: Geschichte der Architekturtheorie. Von der Antike bis zur Gegenwart, Munich 1985, p. 221ff.

Fig. 6: E.A.T.: Pepsi-Cola Pavilion, Osaka World’s Fair 1970: view of the pavilion with fog created by Fujiko Nakaya; photo: Shunk-Kender.

those propagated and developed by Frei Otto beginning in the 1950s, for instance, not only prefigured a new processual design thinking (processual by virtue of the fact that the dynamism of an organic principle, one based on the analysis of nature, however conceived, was to be rendered effective). It also modified a concept of mimesis, as pursued in the Pepsi-Cola Pavilion cloud. To be sure, this project had made reference to advanced techniques of simulation and their experimental environments, that is to say, to meteorology and climatology – while the “cloud” realized in 2002 in Yverdon (Switzerland) by the American architectural team Elizabeth Diller and Ricardo Scofidio, in the end, achieved little more than this, albeit now under the heading of digitalization. In the meantime, Frei Otto arrived at a kind of “structural mimesis,” through which the constructive principles of nature were imitated, or “simulated,” in terms of their outer shape as well as in terms of the material’s inherent properties. At that time, it was “only” a question of the way these materials were used in building; since then, it has become a question of the processes constituting these materials. And further, a question of how they become effective in a mimetic transformation that relates to their structures. This means more than visualization. It means an invention that is effective precisely in the realm of design. For the concept of simulation, digitalization has triggered a paradigm change. To be sure, commonly accepted perceptions tend to equate the digitally generated images and concepts of visualization that have transformed the mimetic tradition into a virtual one (whether in static 2-D or dynamic 3-D presentations) with the concept of digital simulation as such. In reality, however, it is the IT application of a processual concept of simulation that is challenging and transforming architecture and architectonic design, and which has endowed the concept of computer simulation with a new horizon of meaning. The so-called Blob Architecture of the past decade attempted to imagine this in formal terms: this esthetic anticipation, however, was still by no means the direct product of nonlinear design processes. Rather, the structure of such processes became the point of departure for a thoroughly individualized visualization in the sense of traditional architectural design. Yet even with the constantly advancing development of complex numeric simulation models and the digital design toolbox they have produced, form by no means simply generates itself. New possibilities have now opened up for mastering structure as well as formal invention in their total complexity – and hence for arriving at a new process of formal invention that corresponds to the inherent tendencies of these processes. Perhaps representing

Fig. 7: Yusuke Obuchi: Wave Garden, project, 2002.

interesting perspectives in this connection are projects such as one by Japanese architect Yusuke Obuchi. In “Wave Garden,” a floating hydroelectric power station set along the California coastline [Fig. 7], the various aspects of simulation are overlaid to form a multifaceted experimental set-up: in the formal and functional dialogue between the energy-generating elements and ocean conditions, as well as in its design and configuration, which become indicators of the amounts of electricity used by the population.37 It is perhaps no coincidence that, once again, it is in the context of the sciences of origin – oceanography and meteorology – that a concept of IT simulation has supplied cues for an innovative design.

37 Concerning this project, see among others: Nature Design. From Inspiration to Innovation (exhib. cat., Zurich), ed. Angeli Sachs, with contributions by Barry Bergdoll, Dario Gamboni and Philip Ursprung, Baden 2007, pp. 66/67.
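The processual understanding of simulation invoked here – from Frei Otto’s form-finding experiments to today’s numeric models – can be made concrete in a few lines of code. The following sketch is purely illustrative: it reconstructs none of the tools or projects cited above, and its grid size, load, and iteration count are arbitrary assumptions. What it shows is form emerging as the output of a simulated process rather than as a drawn input: a square net, pinned along its edges and uniformly loaded, is relaxed step by step until the simulated forces balance – the digital analogue of a hanging physical model, whose inverted sag suggests a compression shell.

# Minimal form-finding sketch (illustrative assumptions throughout):
# a square net, pinned along its edges, settles under a uniform load.
import numpy as np

N = 21             # nodes per side (assumed)
LOAD = 0.05        # uniform vertical load per free node (assumed, arbitrary units)
ITERATIONS = 2000  # enough relaxation steps for the interior to settle

z = np.zeros((N, N))  # vertical position of every node; the edges stay fixed at 0

for _ in range(ITERATIONS):
    # each interior node drops toward the average height of its four
    # neighbors, pulled down by the load: one discrete equilibrium step
    avg = (z[:-2, 1:-1] + z[2:, 1:-1] + z[1:-1, :-2] + z[1:-1, 2:]) / 4.0
    z[1:-1, 1:-1] = avg - LOAD

print(f"maximum sag of the relaxed net: {-z.min():.2f} grid units")

The geometry printed at the end is nowhere specified in the code; it is found by the process – which is precisely the distinction drawn above between simulation as visualization and simulation as a processual design instrument.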

Nils Röller

SCIENTIA MEDIA – SIMULATION BETWEEN CULTURES

“Is the new computer really only used to carry out government research contracts?” a journalist asks the head of development of a simulation program in a scene staged by Rainer Werner Fassbinder in 1973 in his film Welt am Draht (World on a Wire).1 The head of development has no head for questions concerning the correct use of technical resources. Instead, he addresses the distinction between the computer’s hardware and that which is developed: a model world. Significantly, this conversation takes place in an underground garage. The subterranean architecture could be interpreted as referring to a specifically European relationship to simulation. This interpretation rests on the notion of the human world as a cave, on top of which rises a brighter world. Fassbinder stages this timeless, perennially influential idea of the classical philosopher Plato using the resources of contemporary film and television technology. It is one aspect of the ambivalent staging of simulation in his film. The German director displays a heightened flair for architectonic accents. He situates the entrance of the “Institute for Cybernetics and Futurology Research” in an industrial landscape that must have appeared familiar to contemporary television viewers. Views are organized according to central perspective. Gates and bridges form horizontal lines, while verticals are formed by uniformly shaped office buildings. The institute programs a “world in a nutshell.” This is how the head of development explains the simulation program on another occasion. He fosters the suspicion that the world of the institute is a simulation, one that is programmed “from above.” This is consistent with the Platonic conception: the world of the cave is the world of human existence, which only perceives shadows, but regards them as being real. These shadows correspond to real objects and real movements located in the world outside the cave. In the seventh book of The Republic, Plato uses this metaphor to depict the limited capacities of the human mind to recognize ideas. Plato represents the world of ideas as being rich, and the human world as deficient.2

1 Welt am Draht, directed by Rainer Werner Fassbinder, Federal Republic of Germany 1973. Premiere broadcast on 13 and 14 October (99 min. and 105 min.). Thanks to Otto Rössler for his discussion of this film, and to Andrea Gleiniger and Georg Vrachliotis for encouraging me to continue working on the topic of simulation. 2 Plato, The Republic, transl. R.E. Allen, New Haven 2006.

By means of mirror reflections and the deliberate use of unfocused shots, Fassbinder prepares the viewer’s perceptions in such a way that the world of the institute becomes recognizable as the programmed world of appearance. This esthetic construction allows two different aspects of simulation to be explicated: against the background of the Latin history of ideas, simulation can be understood, on the one hand, as distortion and dissimulation. It operates not with the essential, but with representational signs; these are presented as being true and are likewise regarded as such, just as the members of the institute or the residents of the Platonic cave regard their world as being true, failing to suspect that another world, a “real” world, exists above them. In the late 20th century, this conception was popularized in the context of media theory by Jean Baudrillard. He disqualified the mediatized consumer society as a world of illusory or “third-order” images, thereby linking Galouye’s novel Simulacron Three, also the source of Fassbinder’s scenario, with the cycles of the Occidental narrative of salvation. Developing independently of this discourse, on the other hand, was the conceptualization of simulation found in computer science.

Simulation as Modeling

The computer’s evolution in the United States led to an understanding of simulation as the modeling of dynamic systems by means of computer programs. It served as a method for predicting the behavior of systems. This conceptualization of simulation has largely established itself in the empirical sciences. The early protagonists of the computer sciences were unaware of the significance of simulation as distortion and dissimulation. In “Comments on the History of ‘Simulation,’” economist Herbert A. Simon explains: “An interesting use of the term ‘hand simulation’ without quotes occurs twice in our [Newell and Simon] first paper (1956) on the Logic Theorist. We stated in that paper that the program was not yet running on the computer, but that we had hand simulated its processes and thought we could reliably predict its behavior. Here we are talking of people simulating a computer rather than a computer simulating people!”3 Back then, the pioneers of artificial intelligence considered the distinction between “hand simulation” and “simulation” to be relevant; on the other hand,

3 Herbert A. Simon: Comments on the History of “Simulation,” personal communication, 13 July 1992.

no one regarded a distinction between “computer simulation” and pejorative uses of the concept “simulation” as being relevant. Simon seems to have become aware of this tradition only in 1992, and to have based his reflections on the Oxford English Dictionary. A common root for the understanding of simulation as a way of modeling system behavior, on the one hand, and simulation as distortion or dissimulation, on the other, is discovered as soon as one recalls, with Michael Friedman, that the forced emigration of German philosophers generated a split between the continental and analytic traditions.4 Before 1933, the protagonists of both developments still engaged in discussion with one another, for example Cassirer, Heidegger, and Carnap at Davos in 1929. Then, the topic was the status of mathematical symbols. Do they offer the possibility of gaining access to realms such as the infinity that exists beyond human existence? Or are mathematical symbols a medium through which humankind comes to terms with its mortality and narrowness? The background of this discussion was formed by various interpretations of Kant, which had been proposed in Heidegger’s Sein und Zeit (Being and Time; 1927) and in Ernst Cassirer’s Die Philosophie der symbolischen Formen (The Philosophy of Symbolic Forms; 1923–1929).5 The following quotation from Heinrich Hertz programmatically summarizes the Philosophie der symbolischen Formen: “We form for ourselves mental pictures or symbols of external objects; and the form that we give them is such that the necessary consequents in thought of our pictures are always the pictures of the necessary consequents in nature of the objects pictured … When on the basis of our accumulated previous experiences we have once succeeded in constructing pictures with the desired properties, we can quickly derive by means of them, as by means of models, the consequents which in the external world occur only after a comparatively long time, or as the result of our own intervention.”6 This quotation gives rise to controversial interpretations: it points the way toward an understanding of simulation as part of a dynamic relationship between

4 Michael Friedman: Carnap, Cassirer and Heidegger, Chicago 2000. 5 Ernst Cassirer: “Davoser Disputation zwischen Ernst Cassirer und Martin Heidegger” [17 March– 6 April 1929], Appendix IV, in: Martin Heidegger: Kant und das Problem der Metaphysik, Frankfurt am Main 1991, p. 278. 6 Heinrich Rudolf Hertz: A Collection of Articles and Addresses, ed. by Joseph F. Mulligan, New York 1994, p. 323.

theory, the world, and symbols, while stirring up the suspicion that the natural sciences distort our access to the world. Hertz compared illusions with models. Modeling, then, can be understood as the generation of illusions or – to use another Latin word for illusions – of simulacra. Later, after emigrating, Cassirer and Carnap spoke of the “production of simulacra,” and were listened to by students such as Herbert A. Simon at the University of Chicago.7 According to Friedman, Carnap and Cassirer promoted the mutual interpenetration of philosophy and the natural sciences in America. They were active in the sphere within which “experimental epistemology” arose.8 This developed in conjunction with computers, which offer the possibility of examining logic, and hence aspects of epistemology, mechanically, that is to say, empirically. As a consequence, forms of thought previously reserved to philosophical and mathematical speculation came under the testing scrutiny of engineers. Logical conclusions were now regarded from the aspect of the use of time and energy, as well as of the necessary hardware. The consistency of these “illusions” was now tested by computers. In the early 1970s, the products of these simulations were still abstract series of numbers. These were displayed by printers or at “data viewing stations.” At that time, they were largely image-free. It is understandable, then, that in 1973, Fassbinder avoided directing the camera’s gaze at the institute’s monitors. That representational images of the kind already visible on the screens of televisions and surveillance monitors would eventually also be displayed on computers: that was science fiction. Instead, Fassbinder shows us television monitors. The Welt am Draht is a world of screens with images assembled in rows – 25 frames per second, not programmed digital pixels. In 1973, television and film were well suited to the media-technical staging of the Platonic cave. The data appearing on computer monitors offered too little visual stimulus. At the time, media transfer was still not worth the trouble.

7 Herbert A. Simon preceded his Models of Thought with the following dedication: “To the memory of my teachers Rudolf Carnap and Henry Schultz, who insisted that philosophy should be done scientifically and science philosophically.” Herbert A. Simon: Models of Thought, Dordrecht 1977, p. v. 8 Warren McCulloch: Embodiments of Mind, Cambridge 1965.
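What “simulation as the modeling of dynamic systems by means of computer programs” amounts to can likewise be stated in a few lines. The sketch below is a minimal illustration only – the cooling model and every parameter value are assumptions chosen for clarity, not an example drawn from Simon or Hertz – yet it performs exactly the operation Hertz describes: it derives the “necessary consequents” of a picture of nature faster than nature itself delivers them. Executed with pencil and paper instead of a processor, the same stepping rule would be Simon’s “hand simulation.”

# Simulation as modeling (illustrative sketch): a "picture" of a physical
# process -- Newton's cooling law, dT/dt = -k * (T - T_ambient) -- is stepped
# forward in time so that the model's consequents run ahead of nature's.
def simulate_cooling(t_initial, t_ambient, k, dt, steps):
    """Euler integration of the cooling model; returns the temperature path."""
    path = [t_initial]
    temperature = t_initial
    for _ in range(steps):
        temperature += -k * (temperature - t_ambient) * dt
        path.append(temperature)
    return path

# All values are invented for illustration: a 90-degree object in a
# 20-degree room, cooling constant k = 0.05 per minute, one-minute steps.
path = simulate_cooling(t_initial=90.0, t_ambient=20.0, k=0.05, dt=1.0, steps=60)
print(f"predicted temperature after 60 minutes: {path[-1]:.1f} degrees")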

Simulation as Dissimulation

Jean Baudrillard viewed simulation as the climax and collapse of the Occidental tradition of imitation and mimesis. Societal image production and computer-supported combinatorial processes led to a situation in which reproductions and reproduction processes enjoyed greater relevance than what they reproduced. The reality thereby represented was so enfeebled that only a residue of it could be said to still exist at all. This was true for the age of third-order simulacra.9 Baudrillard developed his theory of simulation by speaking about cartography, about video and reality TV – about image media, not computers.10 His theory was based on the Marxist idea of alienation, which is driven to extremes in consumer society. This is a “society of the spectacle,” that is to say, a society that is dominated by visual representations.11 Baudrillard formulated his arguments from a position distanced from the natural sciences. He embodied a typical attitude found in continental philosophy, a principled skepticism concerning numbers and calculations. Characteristic of this attitude is a conception of simulation as image and distortion, one processed through technical artifacts. Such a media theory, then, positions itself in contradistinction to the natural sciences, and adopts a pejorative valuation of image production. During the interwar period, this skeptical attitude toward numbers and calculations had already found prominent expression in the thesis of the “age of the world picture.” Martin Heidegger used this theory to characterize contemporary reality around 1938. The age of the world picture stood for the age of science as an enterprise. The world was subject to its representations. In this context, the world was that which could be regarded as a picture: a “world picture […] does not mean a picture of the world, but the world conceived and grasped as an image. […] Wherever we have the world picture, an essential decision takes place regarding what is, in its entirety. The Being of whatever is, is sought and found in the representedness of the latter.”12 In modern science, this occurs in the experiment: “To set up

9 Jean Baudrillard: Symbolic Exchange and Death, Thousand Oaks 1993. (French original edition 1976). 10 With the aim of radical authenticity, the Loud family was filmed continuously for seven months. The documentary team reported triumphantly that “they lived as though we weren’t even there.” Jean Baudrillard: Agonie des Realen, Berlin 1978, p. 45. 11 Guy Debord: “La Société du spectacle,” in Œuvres, Paris 2006 (original edition, 1967). 12 Martin Heidegger, “The Age of the World Picture,” in: The Question concerning Technology and Other Essays, New York 1977, here p. 131. For the German, see “Die Zeit des Weltbilds” (1938), in Martin Heidegger: Holzwege, Frankfurt 1977, p. 89f.

an experiment means to present a condition that can be tracked according to a particular configuration of motions in the necessity of its course, i.e. its calculation can be controlled in advance.”13 “Setting up” implies predicting the future on the basis of the laws of natural science. It was the era of research as “enterprise.”14 The reduction of scientific praxis to the production of the world as picture meant: the image screened off reality. It hindered knowledge. According to Heidegger, it prevented an “understanding […] i.e., the truth of that which is not calculable.”15 Heidegger coined the concept of the world picture in such a way that calculation, experimental processes, and the subjectification of nature were subsumed in it. These were subsumed and simultaneously disqualified as dissimulation, as image, as that which hinders our access to Being. Heidegger, then, interpreted Heinrich Hertz’s reflections on the modeling and production of illusions pejoratively.16 Here, modern science as well as modern scientific prognostication is characterized as the banishing of Being, in and through the image. Against the backdrop of the theories of science of that time, Heidegger’s argumentation is astonishing. It appears as a combinatorial tour de force, one that disqualified the prevalent arguments of Niels Bohr and Ernst Cassirer as irrelevant. In a reflection on quantum mechanics, Niels Bohr spoke of a “drastic breakdown of spatio-temporal images.”17 In exile in 1938, Ernst Cassirer wrote that pictorial representations, such as maps, failed to characterize the conceptual methodology of quantum mechanics.18 Heidegger’s combinatorial praxis paved

13 Ibid, p. 81. This English translation taken from Joseph Rouse: “Heidegger’s Philosophy of Science,” in: A Companion to Heidegger, ed. by Hubert L. Dreyfus and Mark A. Wrathall, Malden/USA 2005, p. 183. 14 Ibid, p. 84f. 15 Heidegger, “The Age of the World Picture,” in: The Question concerning Technology and Other Essays, New York 1977, here p. 136: “Man will know, i.e. carefully safeguard into its truth, that which is incalculable, only in creative questioning and shaping out of the power of genuine reflection. Reflection transports the man of the future into that ‘between’ in which he belongs to Being and yet remains a stranger amid that which is.” 16 In his debate with Cassirer, Heidegger was able to take note of Hertz. “Besprechung: Ernst Cassirer, Philosophie der symbolischen Formen. 2. Teil: Das mythische Denken. Bruno Cassirer Verlag Berlin 1925” was published by Heidegger in 1928 in the Deutsche Literaturzeitung. New edition: Martin Heidegger: Kant und das Problem der Metaphysik, Frankfurt 1991, pp. 55–270, esp. p. 270. 17 Bohr, cited in Hermann Weyl: Philosophie der Mathematik und Naturwissenschaft [Munich 1926], Munich 1966, p. 238. 18 Ernst Cassirer: Determinismus und Indeterminismus in der modernen Physik – Historische Studien zum Kausalproblem [Göteborg 1937], Darmstadt 1957, p. 292, see also p. 302.

the way for the emergence in Europe of a discourse about simulation as dissimulation via images. In this context, Jean Baudrillard brought off a similar combinatorial sleight-of-hand when he conceptualized computer-supported simulations, against a background of traditional moral-theoretical debates over hypocrisy and dissimulation. The “metaphilosophy” of sociologist and urbanist Henri Lefebvre may be regarded as a theoretical-historical hinge joining Baudrillard and Heidegger.19 In 1965, Lefebvre demonstrated how social praxis could be altered by “simulations.”20 Lefebvre presented his arguments as an independent Marxist. The historical alienation of humanity from Being is a central moment in the formation of his theory. His point of departure was the assumption that in early Greek antiquity, poiesis and praxis had been one and the same. In the course of historical development, poetry and philosophy had become distanced from social praxis. Eventually, this would lead to the proclamation of the end of philosophy in Hegel’s writings. The end of philosophy coincided with a critical moment in urban development. The city had once been a second nature worth inhabiting.21 With the failure of the Commune, it became hell. It could no longer assert itself “as the law and rule of society, as the measure of the world.”22 Lefebvre describes the relationship of modernism to reality in a similarly divergent way. Researchers in mathematics and physics have torn the categories of space and time from their familiar and tangible relationships, allowing them to become infinite and multifarious. Lefebvre summarized developments in sociology, the natural sciences, and urban development with the words: “in all spheres, the last fixed ‘foundations’ have disappeared ….”23 This was a situation in which Heidegger’s meditations became relevant to Marxists, as metaphilosophy. They recalled poetic speech as the unity of poiesis and praxis: “speech is to Being what the house is to the human being.”24 Despite numerous and distinct differences, Lefebvre shared Heidegger’s yearning for an unmediated proximity to a numinous Being.

19 Henri Lefebvre: Metaphilosophie, Frankfurt am Main 1975 (French original edition 1965). 20 Ibid, p. 185, also p. 187, esp. the chapter “Annullierung der ‘Erkenntnistheorie’ – Simulierung und Simulacrum,” pp. 210–218. 21 Ibid, p. 128. 22 Ibid. 23 Ibid, p. 130. 24 Ibid, p. 129.

Lefebvre paid attention to popular writings from the realms of computer science and cybernetics of his time.25 He also adopted their understanding of simulation and interpreted it against the historical background of mimesis. His focused critique of the foundations of Occidental epistemology was diametrically opposed to his undifferentiated consideration of computerization. Because of this, and in contrast to other contemporary observers, he was incapable of conceiving of the potential failure of research into artificial intelligence, but regarded its success as guaranteed. He posed no questions concerning the capacities of storage units and processors, nor the problem of analytically grasping processes that occur in the human brain. The same questions eluded Baudrillard as well, as he enjoyed success in the wake of Lefebvre’s pioneering ideas. Both authors displayed a fondness for U.S. science-fiction writers. The play of thought cultivated by both sociologists is calibrated to the theory of the computer, not its empirical reality. The results are brilliant essays, which evade any confrontation with computers as “empirical artifacts.”

Simulation between Cultures

The understanding of simulation as an image is oriented to spatial representations. In German, it is the prefixes “über-” (above), “vor-” (before), and “ver-” (deformation, or deviation) that locate the relationship between science and reality. The tradition marked by Cassirer implies a temporal understanding of these prefixes, and hence a perspective of the relationship between simulation and world as one of dynamic and mutual exchange. This perspective makes possible a rapprochement between continental and analytic traditions (Friedman), one that is well worth working through. Today’s growing interest on the part of art and media theory in the history of science is an index of the rediscovery of this triad, and in particular its temporal dimension. Hans-Jörg Rheinberger’s theory of epistemic things draws attention toward detailed studies of temporal relations.26 His work makes it clear that scientific practices are subject to local differences and specific temporalities. Rheinberger emphasizes the coexistence of diverse temporalities, such as when he mentions the politically conditioned delays or accelerations in the

25 Ibid, p. 364f. 26 Hans-Jörg Rheinberger: Experimentalsysteme und epistemische Dinge – Eine Geschichte der Proteinsynthese im Reagenzglas, Frankfurt am Main 2006, p. 90: “Epistemic things, then, are inherently historical […].” (original edition, Göttingen 2001).

establishment of biological breeding methods.27 This historian of science pleads for a consideration of individual spatio-temporal “patches,” characterized by individual temporalities.28 They cannot be interpreted with reference to a single surveyable historical development. For this reason, the history of the scientific adaptation of nature is uncertain, and cannot be predicted clearly. Herbert A. Simon argues similarly in his Sciences of the Artificial. The growth of analytical methods and computing capacities does not improve our view of the future, he says, but conceals instabilities. The results are “temporal holes,” which make it seem improbable that the authority of rationality will continue beyond the present.29 With this background, Simon develops his thesis of the bounded rationality of economic subjects. To be sure, it will be extended via the computer’s external storage units and processors, yet the computer too is subject to the limits of calculability.30 Simon pleads for an investigation of the computer as an empirical artifact, in other words, for understanding the computer as something that, on the one hand, uses time and energy, and on the other, alters the way in which time itself is shaped.31 This extension of a bounded rationality produces growing instability, since economic subjects now have access to a greater selection of possibilities for decision-making. Planning, for example, becomes additionally destabilized.32 This emphasis on temporal discrepancies and stoppages in recent science history, as well as in Simon’s Sciences of the Artificial, could be interpreted as a plea for changes in the linguistic bias of media theory. It would no longer be oriented toward spatial categories, but instead temporal ones, perceiving technical artifacts as problems and opportunities for temporal shaping. A media theory like this would attempt to locate interchanges between natural-scientific and empirical practices of temporal shaping in the computer sciences. In the early modern period, debates about predictability were held under the label “scientia media.”33 At the time, the context was theological, and was specifically concerned with the preordination of the world by an all-knowing and

27 Hans-Jörg Rheinberger, “Ephestia: Alfred Kühns experimenteller Entwurf einer entwicklungsphysiologischen Genetik 1924–25,” in: Epistemologie des Konkreten, Frankfurt am Main 2006, pp. 114–130. 28 Hans-Jörg Rheinberger: Experimentalsysteme, p. 285ff. 29 Herbert A. Simon: The Sciences of the Artificial, Cambridge, Mass. 1981 (2nd edition), p. 185. 30 Ibid, p. 137. 31 Ibid, p. 22. 32 Ibid, p. 47. 33 Thomas P. Flint, “Omniscience,” in: Routledge Encyclopedia of Philosophy, ed. by Edward Craig, New York 1998, vol. 7, pp. 107–112.

planning deity. Today, under different conditions, such a “middle knowledge” again seems desirable. It could be devoted to exploring the degree of freedom possible in an empirical intercourse with technology. This form of knowledge would stand between the knowledge of natural laws – which apply to technical artifacts such as computers just as they do to human beings – and the freedom to conceptualize possible forms of intercourse with technology in the form of science fiction. A middle knowledge located between science and science fiction rests on the presumption that symbols and artifacts that process symbols each have their own temporalities. One such “proper time” was rendered optically invisible, yet nonetheless simultaneously perceptible by Fassbinder in Welt am Draht. On two separate evenings, he demanded altogether 204 minutes of attention from viewers. He alternated shots of screeching tires in a subterranean garage with the breakneck acceleration and gradual deceleration of camera movements, which linger in front of mirrors. Through emphatic duration and accented suddenness, the temporal perceptions of viewers were modulated in a way that compelled them to experience the programmability of the “world,” but at the same time to observe that free space nonetheless exists, to expand or compress time, for example. Even then, a preoccupation with the temporalities of simulation required a specific dramaturgy. Today, in the era of the sciences of images, we would do well to reactivate it.

Georg Vrachliotis

FLUSSER’S LEAP: SIMULATION AND TECHNICAL THOUGHT IN ARCHITECTURE

Technical thinking encompasses not only our “technical-scientific culture,” but also the “potentialities of architectonic thought and production.”1 “But this means,” according to architect Bruno Reichlin, “first of all, an inquiry into the so-called ‘techniques’ that are inherent in such thinking, and which architecture conceives and projects.”2 Based on an expanded definition of architecture, Reichlin sketches the conceptual contours of the cultural-historical biformity of “technical thinking.” In light of the current diffusion of “mathematical worlds of scientific simulation,”3 it is worth considering the interplay between computer simulation and architecture in a way that takes these contours into account. Alongside the general question of what cultural influence simulation exercises on technical thinking in architecture, this essay will also raise the question: Which architectural design instrument can be generated from the historico-discursive space of computer simulation? Computer simulation – which is well established as an independent cultural technology – is increasingly altering our interactions with the world.4 The numerical simulations of future climate scenarios, new molecular structures, optimum building forms, complex load-bearing structures, and processes of nonlinear flow mechanics are all evidence of the far-reaching changes that computer simulation has brought about. To grasp computer simulation as a cultural technique presupposes an awareness of the fact that the production of knowledge is already a kind of knowledge technique, and also that the paths of knowledge production rely upon a certain “techné.” Speaking from an epistemological perspective, German philosopher Michael Hampe refers to computer simulation as a

1 Bruno Reichlin: “Architektur und Technisches Denken,” in: Daidalos. Berlin Architectural Journal, issue 18, Berlin 1983, p. 12. 2 Ibid. 3 Gabriele Gramelsberger: “The Epistemic Texture of Simulated Worlds,” this volume, pp. 83–91. 4 Walther Zimmerli, for example, writes: “In contrast to the classical cultural techniques of speaking/hearing, calculating, and writing/reading, the ‘fourth cultural technique’ of information and communication technique involving computers and networks is the first authentic cultural technology that involves a human-machine tandem.” Walther Zimmerli: “Technologie als Kultur,” in: Braunschweiger Texte, Hildesheim 1997, p. 27.

“Vico-esque technology”: “In it, poíesis and techné, or making (which as fabrication is already itself a form of cognition) enter a new developmental phase. Giambattista Vico asserted that individuals only truly understand that which they have themselves created. [...] With this philosophical background, the knowledge procedures being realized today in robotics, neurocomputing, and computer simulation appear to redeem that which has determined the modern understanding of scientific insight since Giambattista Vico: making as such functions as a knowledge process, and is no longer oriented toward an economic or any other aim, however defined.”5 Technology ought to be understood as a kind of “societal superstructure”:6 modern society contains virtually nothing that has not been conceptualized and described in technical terms. The majority of phenomenal forms in our culture are imprinted by technology. Hartmut Böhme has described our world as “technomorphic.”7 This is confirmed by the diagnosis offered by German philosopher Max Bense in 1949 (just a year after the appearance of Norbert Wiener’s epochal Cybernetics) in his essay Technische Existenz (Technical Existence): “Technology is one reality among others. The hardest, the most irrevocable of them all.”8 Cybernetics has altered technical thinking in architecture. Its emergence was followed by a period in which architects dreamed of automated design processes, intelligent calculating machines, global telecommunications engineering, and spaceship design. This dream involved a “poetic” technology, scientific progress, and the more humane world that was to have followed the Second World War. Today, from the perspective of largely digitized architectural production, some of these technological dreams of Modernity seem to have reached fruition, to have become a technical reality in architectural praxis. But it is not only the conceptual web of architecture, technology, and the natural sciences that has changed since then. The underlying conceptualization of nature has also been subjected to changes in the course of Modernity, often coinciding with technological, natural-scientific, artistic, and other cultural-technical developments. It

5 Michael Hampe: Denken, Dichten, Machen und Handeln. Anmerkungen zum Verhältnis von Philosophie, Wissenschaft und Technik. Introductory lecture given at the ETH Zurich, 2004, p. 14. 6 Hartmut Böhme: “Kulturgeschichte der Technik,” in: Orientierung Kulturwissenschaft, Hamburg 2000, p. 164. 7 Ibid, p. 164. For a critical view, cf. Christoph Hubig: Die Kunst des Möglichen I. Technikphilosophie als Reflexion der Medialität, Bielefeld 2006. 8 Max Bense: Technische Existenz. Essays, Stuttgart 1949, p. 123.

is not the original, undisturbed image of nature that has served architecture as a referential system since the advent of classical Modernism, but the analytical and constructive model of the natural sciences, an “anthropogenetic nature.”9 Given the increasing dependence of the sciences on calculation, this image seems self-evident. Since the age of cybernetics began, attempts have been made to use computer models to convert almost every natural dimension, every natural standard, every natural process into a mathematical model. In architecture, attempts have been made to formalize, automate, and numerically simulate every creative conceptual train of thought that takes place in the design process. Even today, architects not infrequently seem prone to an unreflective enthusiasm for digital formalism. The issue of whether a digital design approach to generating forms is also appropriate in the constructional and technical sense is generally shuffled aside. When this problem is regarded from an overarching perspective, the question arises: Should we do everything we can do, just because we can? This “technical imperative,” which implies the rationale “we must, because we can,” seems remarkable, even strange.10 In terms of architectural production, it could be argued that “the multiplicity of connotations accompanying ‘can’ and ‘may’ [render] this question dubious.”11 More interesting in the context of critical discussion, then, is the question: “Are we really capable of doing something at all if we are apparently unable to come to terms with its consequences?”12 Today, we have come far enough that we can attribute, in cultural terms, a metatechnical significance to computer simulation in architecture.13 With the rapid development of increasingly high-performance computers, numerical simulation is establishing itself as a universal working practice – not only in science and research, but in architecture and design as well. Today, virtually no technical field – and this includes architecture – could manage without computer simulation [Figs. 1 and 2].
9 Cf. Gernot Böhme, Engelbert Schramm (eds.): Soziale Naturwissenschaft. Wege zu einer Erweiterung der Ökologie, Frankfurt 1985. 10 Christoph Hubig: Die Kunst des Möglichen II. Ethik der Technik als provisorische Moral, Bielefeld 2007, p. 9. See also Christoph Hubig: Die Kunst des Möglichen I. Technikphilosophie als Reflexion der Medialität, Bielefeld 2006. 11 Ibid, p. 9. 12 Ibid. 13 Cf. Max Bense: “Kybernetik oder Die Metatechnik einer Maschine,” in: Ausgewählte Schriften, vol. 2: Philosophie der Mathematik, Naturwissenschaft und Technik, ed. Elisabeth Walther, Stuttgart 1998, pp. 429–475.


Figs. 1 and 2: Computer simulation aimed at optimizing the building project “Monte Rosa.” The real loading on the structure was reproduced via simulation. Not unlike an aerodynamic test in a wind tunnel, the form of the building and the pre-existing topography were subjected to a flow test. Simulation: Hovestadt group (CAAD), ETH Zurich. Design: Deplazes group, ETH Zurich.
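What such a numerical test involves can be suggested, in highly schematic form, by a minimal sketch. The following lines are a purely illustrative toy with invented values – not the far more elaborate, three-dimensional computation behind Figs. 1 and 2 – but they exhibit the basic pattern: a physical field represented on a discrete grid is advanced step by step according to a fixed rule.

```python
# Purely illustrative toy: explicit finite-difference relaxation of a
# one-dimensional diffusion equation. All values are invented; real flow
# tests solve far more complex, three-dimensional equations.

n_cells, dx, dt, diffusivity = 50, 1.0, 0.1, 1.0
u = [0.0] * n_cells
u[0] = 100.0  # fixed boundary value, e.g. an imposed inflow

for step in range(1000):              # march the field forward in time
    u_new = u[:]
    for i in range(1, n_cells - 1):
        # each new value emerges explicitly from its grid neighbors
        u_new[i] = u[i] + diffusivity * dt / dx**2 * (u[i+1] - 2*u[i] + u[i-1])
    u = u_new

print(u[:5])  # the computed field near the boundary
```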

In his essay Digitaler Schein (Digital Appearance), which appeared just a few years after his Ins Universum der technischen Bilder (Into the Universe of Technical Images),14 Czech media philosopher Vilém Flusser diagnosed not only the progressive digitization of the visual media, but also the mathematicization of the technical-scientific world. Both aspects seem to virtually fuse at the level of the modeling and visualization of numerical simulation: “On the theoretical level, calculatory thought has penetrated more deeply into appearances. It has analyzed them and, as a consequence, the phenomena have increasingly taken on the structure of calculatory thought. Not only in physics do phenomena disintegrate into particles, but also in biology with the gene, in neurophysiology, for example, with the punctual stimulus, in linguistics with the phoneme, in ethnology with the cultureme, and in psychology with the acteme. There is no question any longer of the ‘extended object’; today, it is instead a question of structured fields of swarming particles. [...] The world has thus assumed the structure of the universe of numbers, which poses bewildering epistemological problems now that computers have demonstrated that calculatory thinking is capable not only of decomposing (analyzing) the world into particles, but also of reassembling (synthesizing) it.”15 Flusser’s “leap into calculative consciousness”16 has long since occurred in architecture as well.17 This has become obvious not least in the observation that the space of possibilities formed by the dialogue between architecture and computers has been fundamentally expanded. Whereas the computer previously served in the design process as a tool for visualizing concepts that had already been elaborated, it is now becoming increasingly possible for architects to develop software specifically for the respective design or production task:18 the computer becomes an open system of architectural production.
14 Vilém Flusser: Ins Universum der technischen Bilder (1985), Göttingen 1992. 15 Vilém Flusser: “Digitaler Schein,” in: Florian Rötzer (ed.), Digitaler Schein: Ästhetik der elektronischen Medien, Frankfurt 1991, pp. 152 and 154. 16 Ibid, p. 151. 17 Cf. Georg Vrachliotis, “Der Sprung ins kalkulatorische Bewusstsein. Evolutionäre Denkmodelle und Architektur,” in: Ákos Moravánszky and Ole W. Fischer (eds.): Precisions. Architecture between Art and Sciences – Architektur zwischen Kunst und Wissenschaft, Berlin 2007, pp. 232–262. 18 For one of the earlier texts on this, see Ludger Hovestadt, “CAD im Selbstbau,” in: Arch+, Zeitschrift für Architektur und Städtebau, no. 83, 1985, pp. 32–37.


Fig. 3: Konrad Wachsmann, technical drawings of his constructive joint. “The result of nearly 2 years of development was a universal node surrounding a main tube ring-fashion in such a way that subsidiary tubes are capable of radiating outward in any desired combination and at any angle.”

“The discovery of new technological resources has directly allowed the all-too-human need to take on the whole, to construct the entire world anew, to emerge,”19 writes Heinrich Klotz in Vision der Moderne – Das Prinzip Konstruktion (Vision of Modernism: The Principle of Construction) concerning the mid-20th-century structural-analytical exchanges between architecture and the science of engineering. Buckminster Fuller, Max Mengeringhausen, Konrad Wachsmann, and Fritz Haller formed the nucleus of these “other roots of Modernism.”20 The talk now was of building systems, communications systems, infrastructure systems, and other abstract-technical ordering systems. The cell, the node, and the capsule became biological metaphors in both architecture and technology. Thinking around design and construction was determined by the modularization of architectural space and the systematization of production processes [Fig. 3]. In 1967, in his Myth of the Machine, and with an eye toward the ongoing rise of artificial intelligence, Lewis Mumford sketched an unsettling image of the future: that of the “great brain”21 and of the imminent “triumph of automation.”22 Less than ten years earlier, in 1959, three texts had appeared almost simultaneously whose titles already convey an impression of the scientific-technical research landscape of their epoch. In the title Atom und Automation (Atom and Automation), which he chose for his Enzyklopädie des technischen Jahrhunderts (Encyclopedia of the Technical Century),23 Abraham Moles used two core concepts from the history of science and technology to fashion a conceptual label based on the natural sciences and cybernetics. In Architektur, Automation, Atom (Architecture, Automation, Atom),24 Kurt Auckenthaler expanded on Moles’s conceptual duality of science and technology by adding architecture. In his epochal book Wendepunkt im Bauen (Turning Point in Building),25 finally, Konrad Wachsmann heralded the end of traditional building and the inception of a new industrial-scientific architecture [Fig. 4].
19 Heinrich Klotz: “Vision der Moderne,” in: Heinrich Klotz (ed.): Vision der Moderne – Das Prinzip Konstruktion, Frankfurt am Main, Munich, New York 1986, p. 21. 20 Ibid, p. 14. 21 Lewis Mumford, “The Triumph of Automation,” in: The Myth of the Machine. Volume I: Technics and Human Development (1967); Volume II: The Pentagon of Power (1970). 22 Ibid, p. 535. 23 Abraham Moles, Epoche Atom und Automation. Enzyklopädie des technischen Jahrhunderts, Geneva 1959. 24 Kurt Auckenthaler, Architektur, Automation, Atom, Wels 1959. 25 Konrad Wachsmann, Wendepunkt im Bauen, Wiesbaden 1959.


Fig. 4: Title page of Konrad Wachsmann’s Wendepunkt im Bauen (Turning Point in Building), 1962.

Wachsmann’s conception of architecture was permeated by faith in scientific-technical progress: “Modular systems of coordination, scientific testing methods, the laws of automation and precision exert an influence on creative thinking. [...] The concepts of traditional architecture are no longer precise enough to permit the ideas of our era to be interpreted through them.”26 Wachsmann’s exacting objective was to derive the maximum possible number of variations of spatial structures by means of the smallest conceivable number of identical structural elements. “The precondition for this was the development of a single, universal structural element,”27 which entailed a search for the basic element of an imaginary, all-encompassing generative system: the universal node, the conceptual elementary particle of construction [Fig. 5]. On the conceptual level of these generative architectural systems, the boundaries were blurred between model and actuality, design and form generation; on the methodological level, between architecture, engineering, and the natural sciences. The consequence of this was a highly technological approach to architecture. “These elements will converge at articulations which, conceptualized as points in space, are bound together by imaginary lines [...],”28 as Wachsmann described his technical thinking, adding in conclusion: “Every statement will thus necessarily be restricted initially to point, line, plane, and volume. [...] The building, now divested of its mysteries, is exposed without protection to critical scrutiny.”29 In this context, the structure of a building increasingly becomes a geometric point cloud of generic elements. The overall geometry of the construction is decomposed into an atomic structure, into the abstract basic elements of a generative system. The conceptual relevance of Wachsmann’s thinking for contemporary digital architectural production becomes evident only against the backdrop of the discursive production of early computer science, and hence only at second glance. For an attempt was made around the same time by Stanislaw Ulam and John von Neumann to mathematically model the basic element of an entirely different and far more abstract generative system. In the mid-1950s, von Neumann, a Hungarian émigré mathematician by then living in the United States, began pursuing the idea of “mathematically modeling” the behavior of nonlinear systems.

26 Ibid, pp. 107 ff. 27 Ibid, p. 96. 28 Ibid, p. 113. 29 Ibid.

Fig. 5: In 1968, Fritz Haller extended the idea of the node to the level of a global infrastructural system: Fritz Haller, totale stadt – ein globales modell (1968): a tertiary order unit for 300,000 residents in an existing landscape. Example: the Grosses Moos in the northwest of the Swiss Mittelland. “This plan is […] not an actual development proposal for the selected area. It is intended only to show how an arrangement of nodes and of the linear network of a tertiary order unit can be varied via pre-existing geographical features […]. Along the lengthwise axis lie the hardware and the center of the tertiary order, while the software of the tertiary order lies on parallel axes to either side.”

The reciprocal coordination of respectively different systemic principles was decisive in this context: the intention was to regard “organisms as computers and computers as organisms.”30 Ulam, a Polish mathematician who had also emigrated to the United States, inspired von Neumann to conceptualize his vision of a “biological automaton” as a model of an infinite chessboard grid upon which individual basic elements – “cells” – would behave according to a simple system of rules [Fig. 6]. The rules of behavior of this cell-grid model, finally, became the foundation of later “cellular automata.” Von Neumann’s fourth model was a self-reproducing organism in a universe based on mathematical logic.31 In a way not unlike Norbert Wiener’s approach, von Neumann attempted to extend the influence of mathematics and the natural sciences, of machine and organism, on one another; and not only on a metaphorical level.32 In his worldview, “epistemology and technology reciprocally defined and encompassed one another.”33 It was more than a decade later that von Neumann’s self-replicating automata entered the design context. British mathematician John Conway’s playful simulation Life attempted to simplify the highly complex logic of von Neumann’s automaton. Another momentous leap in such evolutionary concepts in architecture was the emergence of the research field that would be christened Artificial Life in 1987 by American mathematician Christopher Langton [Figs. 7 and 8].34 Since then, Langton’s dictum “to have a theory of the actual, it is necessary to understand the possible”35 has become a maxim. What this means for architecture from the perspective of the philosophy of technology is that “on the whole, it is a question of art, of designing the space of possibilities of our theoretical and practical access to the world.”36 In recent decades, thinking in the digital design, construction, and production of architecture has been heavily influenced by such considerations, which relate to technical possibilities.
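The logic of such cell-grid models is easily made concrete. The following minimal sketch – an editorial illustration, not taken from the sources discussed here – implements Conway’s Life, the radically simplified descendant of von Neumann’s automaton, in a few lines of Python; the “glider” it propagates is the textbook example of a pattern that reproduces its own shape elsewhere on the grid.

```python
from collections import Counter

def step(live_cells):
    """Advance one generation; live_cells is a set of (x, y) pairs."""
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live_cells
        for dx in (-1, 0, 1) for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    # simple system of rules: a cell lives in the next generation if it
    # has exactly 3 neighbors, or 2 neighbors and is already alive
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live_cells)}

cells = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}  # a "glider"
for generation in range(4):
    cells = step(cells)
print(sorted(cells))  # the same shape, displaced one cell diagonally
```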

30 Lily Kay: Das Buch des Lebens. Wer schrieb den genetischen Code, Frankfurt 2000, p. 157. 31 Cf. John von Neumann, Theory of Self-Reproducing Automata, ed. by A.W. Burks, Urbana 1966. 32 Cf. “John von Neumanns genetische Simulakren,” in: Lily Kay, 2000, pp. 146–163. 33 Ibid, p. 159. 34 Cf. Christopher Langton: Artificial Life: An Overview, Cambridge 1995. 35 Ibid, p. ix. 36 Christoph Hubig: Die Kunst des Möglichen I. Technikphilosophie als Reflexion der Medialität, Bielefeld 2006, p. 13.


Fig. 6: In 1947, with his “Universal Constructor,” still without using computers, John von Neumann conceived a universal cellular automaton with the capacity for self-replication. This automaton was capable of reproducing arbitrary models, including itself.

One reason for this, certainly, is the methodological affinity between the architectural design process and the synthetic approach of computer-supported branches of the sciences. Only a limited degree of success should be expected from experimental numerical modeling as an epistemological tool of knowledge production. With regard to von Neumann’s metaphors, as well as to the formation of analogies in biological and technological evolution, Argentine design theoretician Tomás Maldonado’s Noch einmal die Frage nach der Technik (Again the Question Concerning Technology) emphasizes that “an instrument [can] fulfill the function of an organ, surely evidence for similarity (and moreover, similarity with regard to performance), yet it would be erroneous to regard the two as being equivalent in every respect.”37 If we want to describe the computer, in its technical application, as an instrument, then what this instrument is used to simulate has changed over time. In terms of Maldonado’s demonstration, this means: the concept of function has been supplanted by that of behavior, the object-like organ by the concept of the network, and the linear by the nonlinear. A series of comparable biotechnical metaphors and design concepts based on modularity, growth, and variability were already present in the architecture of the first half of the 20th century, to be sure. Yet in architecture, it has now become possible for the computer’s synthetic milieu to use numerical digital simulation to turn natural-scientific images into components of a common technical reality. The search for and notion of a universal element – however constituted – is part of the cultural history of architecture, science, and technology. Von Neumann, Conway, and Langton aspired to decipher and model the “logical form of living systems.”38 To be sure, the degree to which the technical reality of simulating growing structures in architecture should be distinguished from simulating growing structures in biology calls for additional explanation. From the perspective of architecture, however, the search for a universal elementary particle has already entered the age of information technology.

37 At this point, Maldonado refers to the writings of French philosopher of technology Gilbert Simondon, in particular his 1958 dissertation Du Mode d’existence des objets techniques. Tomás Maldonado: “Noch einmal die Frage nach der Technik,” in: Tomás Maldonado: Digitale Welt und Gestaltung. Ausgewählte Schriften, ed. and trans. by Gui Bonsiepe, Zurich and Basel, Boston, Berlin 2007, p. 224. 38 Christopher Langton, Artificial Life: An Overview, Cambridge 1995, p. 113. 39 Peter Weibel, “Intelligente Wesen in einem intelligenten Universum,” in: Peter Weibel: Gamma und Amplitude. Medien- und Kunsttheoretische Schriften, ed. and with commentary and a foreword by Rolf Sachsse, Berlin 2004, p. 298.


Fig. 7: Growth simulation of a gametophyte of the fern Microsorium linguiforme by means of “artificial life.” Seen in the lower image is a photograph of a natural gametophyte.

Wachsmann’s concept of the universal node, and Haller’s as well, can now be discussed on the systemic level of the “virtual particle,”39 of multi-agent simulation. The conceptual universality of the node reappears in the multifaceted nature of software agents: in the function of these smallest elements, in their behavior and the structures they form, and in the dynamic interaction of these “virtual particles.” In Langton’s sense, these particles “can be designated as the fundamental atoms and molecules of behavior.”40 That it is apparently possible, using a large number of “molecules of behavior,” to analyze not just the complexity of architectural structures, but also the dynamics of physical, biological, or even societal structures, and to synthesize and simulate them in Flusser’s sense, raises additional questions. For example: What idea of technology does this produce for architecture? “Above all, it is a question of modeling,” explains French computer scientist Jean-François Perrot. Using architectural metaphors, he attempts to provide us with some notion of this peculiar software structure: “The current languages oblige applications programmers to describe the phenomena they handle in a strictly mechanistic and hierarchized way. At best, an application is constructed on the basis of independently constructed modules – like a modern building made from prefabricated elements, instead of from stone blocks shaped on site. The procedure is perfect for a program that handles the functioning of a building. But what of the life of its inhabitants? The handsome layout that explains, for example, the complex structure of a story as a composition of rooms and walls, considered as being simpler elements, clearly no longer applies. We must take into account the operational independence of individuals, together with phenomena relating to communication, free will, belief, competition, consensus – and also discord. […] Now that computerized imaging of buildings, blocks of buildings, and even entire developments has almost become commonplace, the next step is to tackle the human beings and societies who inhabit these locations.”41 For architecture, this means extending technical thinking via computer simulation.
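What Perrot’s “operational independence of individuals” looks like in code can be suggested by a minimal sketch. Everything in the following lines – the rooms, the preference rule, the agents themselves – is invented for illustration; the multi-agent systems described by Ferber are incomparably richer.

```python
import random

# hypothetical building graph: each room lists its neighbors
rooms = {"hall": ["office", "cafe"], "office": ["hall"], "cafe": ["hall"]}

class Inhabitant:
    """A software agent with a purely local rule of behavior."""
    def __init__(self, name):
        self.name, self.location = name, "hall"

    def act(self, occupancy):
        # local rule: move to a neighboring room, preferring empty ones
        options = rooms[self.location]
        quiet = [r for r in options if occupancy[r] == 0]
        self.location = random.choice(quiet or options)

agents = [Inhabitant(f"agent{i}") for i in range(5)]
for t in range(10):  # global patterns emerge from local decisions alone
    occupancy = {r: sum(a.location == r for a in agents) for r in rooms}
    for agent in agents:
        agent.act(occupancy)
print({a.name: a.location for a in agents})
```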

40 Ibid, p. 298. 41 Jean-François Perrot in the foreword to Jacques Ferber: Multi-agent Systems. An Introduction to Distributed Artificial Intelligence, London 1999, p. xiii.


Fig. 8: Individual reproduction sequences and more highly developed cellular organisms of the “loops” programmed by Christopher Langton. These digital organisms possess the capacity for self-replication, which is simulated in cellular automata. The rules of the simulation build on von Neumann’s “Universal Constructor.”

According to German physicist and philosopher Carl Friedrich von Weizsäcker, one distinctive feature of such an extension is that it rests upon a “mathematics of temporal processes, which can be represented through human decisions, through planning, through structures, as though they had been planned, or finally, as though they had been guided by chance. They are, then, structural theories of temporal change. Their most important aid is the computer, the theory of which is itself one of the structural sciences.”42 Weizsäcker explains further: “The mathematicization of the sciences is a feature of contemporary scientific developments. A physicist, a population biologist, or an economist can use the same mathematics.”43 In conclusion, we can say that the sociocultural influence of simulation on architectural production occurs on at least four different and interrelated levels of significance: the character of the tool; technologies of visualization; the epistemic category; and the instrument of knowledge. In view of the computer-related reciprocal effects between architecture, technology, and science, it seems logical to add architecture to Weizsäcker’s paradigmatic list of research areas. In architecture as well as in science, numerical digital simulation can be regarded as an experimental field of productivity.44 Thus, it can hardly be disputed that the epistemological success of simulation in the sciences is inseparable from the media resources of the technologies used to visualize abstract quantities of data. “Computer-generated images today display a mathematics of the monstrous, a morphology of the amorphous, and a catastrophe theory,” argues German media theoretician Norbert Bolz in the foreword to Chaos und Simulation, “so that to understand the world means to be capable of simulating it.”45 In this context, computer simulations are not merely “technical images”46 and “the indirect products of scientific texts,” but – and herein lies their epistemic relevance – images of theories.47 Weizsäcker’s 1971 demand that we “begin a discussion of the role of science with the structural sciences”48 is comprehensible in this light.
42 Carl Friedrich von Weizsäcker: The Unity of Nature, New York 1980; original edition: Die Einheit der Natur. Studien, Munich 1971. 43 Ibid. 44 Cf. Gabriele Gramelsberger: “Simulation als Kreativitätspraktik. Wissenschaftliche Simulationen als Experimentalsysteme für Theorien,” in: Günter Abel (ed.): Kreativität: Sektionsbeiträge, XX. Deutscher Kongress für Philosophie, 26.–30. September 2005, Berlin. 45 Norbert Bolz: Chaos und Simulation, Munich 1992, p. 8. 46 Cf. Flusser (1985), 1992. 47 Cf. Gabriele Gramelsberger: “Von der Ambivalenz der Bilder,” in: Klaus Rehkämper and Klaus Sachs-Hombach (eds.): Vom Realismus der Bilder, Magdeburg 2000. 48 von Weizsäcker, 1971, p. 23.

42 Carl Friedrich von Weizsäcker: The Unity of Nature, New York 1980; original edition: Die Einheit der Natur. Studien, Munich 1971. 43 Ibid. 44 Cf. Gabriele Gramelsberger: “Simulation als Kreativitätspraktik. Wissenschaftliche Simulationen als Experimentalsysteme für Theorien,” in: Günter Abel (ed.): Kreativität: Sektionsbeiträge, XX. Deutscher Kongress für Philosophie, 26.–30. September 2005, Berlin. 45 Norbert Bolz: Chaos und Simulation, Munich 1992, p. 8. 46 Cf. Flusser (1985), 1992. 47 Cf. Gabriele Gramelsberger: “Von der Ambivalenz der Bilder,” in: Klaus Rehkämper and Klaus Sachs-Hombach (eds.): Vom Realismus der Bilder, Magdeburg 2000. 48 von Weizsäcker, 1971, p. 23.


For a theoretical confrontation with architecture in the field of tension constituted by the media and information technology, this means that a structural-scientific examination of architectural practice is indispensable if we wish to penetrate critically those layers of architectural production in which the potentials, but also the limits, of design technology based increasingly on computer simulation are becoming evident. In particular (to apply once again Wachsmann’s image of the demystification of architecture via the rationality of technology), it is a question of devoting critical attention to the progressive disclosure of the mysteries of architecture via computer simulation. For “they [the structural sciences]” (and this remark is probably owed to Weizsäcker’s farsightedness with regard to the development of science and technology) “bring with them the temptation to confuse all of reality with executable, plannable structures. [...] One of the most important endeavors in the formation of consciousness must be to complement an eye for structure with an eye for reality.”49

49 Ibid.


Gabriele Gramelsberger

THE EPISTEMIC TEXTURE OF SIMULATED WORLDS

On virtual fish in simulated oceans and other semiotic objects

Suppose you wish to familiarize yourself with the topic of climate change as it occurs in the computer, and with the rather discouraging prognoses that climate modelers offer for the real world. Before long, it will become evident that you have become submerged in a rather peculiar realm, a purely mathematical world. To “simulate” simply means to re-create the phenomena of our lifeworld in computers by means of numeric “representations.” Boldly entering a climate model and leaping into its ocean, you would presumably expect to see fish darting through simulated water; standing on land, you would expect to see parametric clouds and strata of blue sky. Without exception, today’s climate models – including the reference models for the climate change prognoses of the IPCC (Intergovernmental Panel on Climate Change) – are coupled atmosphere-ocean models, which have evolved over a 40-year programming history into organisms of massive magnitude.1 Only in recent years have fish been found in them at all; formerly, climate models lacked oceans altogether, since computers were too slow to calculate the increasingly complex processes involved. But what is the specific function of fish with regard to climate change? Among other things, climatologists are interested in the capacity of the oceans for storing carbon dioxide. This is where the fish come into the picture: plankton stores carbon dioxide before dying and sinking to the ocean floor. In this way, carbon dioxide can be stored for decades.

1 These “gigantic” climate models (which began as atmospheric models before developing into Earth-system models) can be stored on a first-generation USB memory stick. In other words, although the data take up less than 50 MB, their calculation requires supercomputers. The “gigantomachy” of these models resides in the sheer complexity of the Earth as a system, the modeling of which integrates a growing number of processes. Contemporary climate models can be traced back to a small number of simple models from the 1960s, and to some extent, 40-year-old code is still found in up-to-date models, most of it written in FORTRAN. Paul N. Edwards sketches out the family tree of climate models in his article “A Brief History of Atmospheric General Circulation Modeling,” in: General Circulation Model Development, Past, Present and Future: The Proceedings of a Symposium in Honor of Akio Arakawa, ed. by David A. Randall, New York 2000, pp. 67–90.


But when consumed by fish, plankton – along with its carbon dioxide content – re-enters the water as excreted matter, and so the carbon dioxide is released, in the short or long term, back into the atmosphere. Translated into specialist language, this is a question of the positive and negative feedback effects of geochemical processes in the ocean with regard to carbon dioxide, and specifically with regard to the carbon dioxide-plankton-fish cycle. Like all other scientific simulations, climate models consist of a large number of such feedback processes, each of which tells a different story. Each of these stories was first reconstructed through painstaking research involving observation, measurement, and laboratory experimentation. Yet these “fish” should be enjoyed with caution, and fishing is out of the question. And should you risk a dive into a simulated ocean, you would not even see them, for neither the ocean nor the fish exist there in their familiar forms. Instead, we find semiotically defined objects that are exclusively mathematical in nature, and which are subject to a non-visual logic of pure function. These simulated fish exist for the model as a whole in the form of the rate of change of plankton deposit levels. In other words: plankton “dies” as the square of the number of fish, averaged out for the simulated ocean. To swim in a simulated ocean is to swim through an averaged-out plankton-ocean-fish soup; in such discrete worlds, “swimming” would be an equally averaged-out activity, one performed disjointedly, temporal step by temporal step, and interpolated between the points of the calculations. Since ocean models are calculated on grid distances of 20 to 150 km, and at 10- to 20-minute intervals, you yourself would exist only at these intervals, and would find yourself taking a swim in an extremely porous ocean. A rather peculiar world, in which objects are processes that exist in discrete form, and yet are averaged globally. Why are the mathematical worlds of scientific simulations so different from the world of everyday experience? First of all, we are dealing with purely sign-based, that is to say, semiotic, worlds. Second, we are dealing with a specific type of semiotic world, namely a mathematical one. A literary novel also represents a world that is mediated through signs (in this case, readable text). Yet novels differ fundamentally from the mathematical “narratives” elaborated by simulation models. Because the epistemic textures of these peculiar, simulated worlds bear so little resemblance to everyday experience, it can be difficult to understand that the prognoses they generate have tremendous impact on our lives. Even the style of speech used by modelers themselves – who speak of fish, clouds, oceans, and plankton, as well as the visualizations involved – suggests that we are dealing with simulations of real phenomena as perceived by the human eye.
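A toy version of the feedback described above may make this texture palpable. In the following sketch every coefficient is invented and the fish stock is simply held constant; actual ocean biogeochemistry modules are incomparably more detailed. Only the pattern matters: plankton “dies” as the square of the averaged fish number, and a fraction of the dying plankton exports carbon dioxide to the ocean floor, discrete time step by discrete time step.

```python
# Toy plankton-fish-carbon feedback; all coefficients are invented.
plankton, fish, stored_co2 = 10.0, 2.0, 0.0
growth, mortality, export_fraction = 0.5, 0.05, 0.3
dt = 1.0  # one discrete time step of the model clock

for step in range(100):
    dying = mortality * fish**2 * plankton       # quadratic in fish
    plankton += dt * (growth * plankton * (1 - plankton / 20.0) - dying)
    stored_co2 += dt * export_fraction * dying   # sinks to the ocean floor
    # the remainder re-enters the water and, sooner or later, the atmosphere

print(round(plankton, 2), round(stored_co2, 2))
```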

Yet these “in silico” realities have as little in common with the real world as a porous plankton-ocean-fish soup has with the Mediterranean. This does not mean, however, that simulations are science-fiction inventions. It demonstrates instead the degree to which the sciences have mathematized our view of the world over the past four centuries. The logic of this viewpoint – which is employed by climate models, architectural simulations, and in silico crash tests – is always the same, no matter how concrete the appearance of the visualizations produced once a simulation has been concluded. Anyone with an interest in simulation should avoid remaining preoccupied with the images appearing on the surface of the screen, and should instead direct his or her gaze into the depths of the data realm in order to investigate the epistemic texture of the semiotic world.

The constitution of the semiotic worlds of scientific simulation

A brief historical review should serve to clarify the nature of these semiotic worlds. Science employs signs, whether in the form of theoretical descriptions (readable texts), the results of measurements (indices, or data that can be read off), mathematical formulas (operational scripts), or algorithms (operational, symbolic machines). The development and application of operational scripts as symbolic machines, in particular, are the foundations of “in silico” worlds. Sybille Krämer has investigated the formalization, calculization, and mechanization of scripts in mathematics during the 16th and 17th centuries, and has introduced the terms “symbolic machines” and “operational scripts” for mathematical equations.2 The scripts are not conceived in order to convey readable text, but to represent and execute operations, for example, calculations with numerals. If concrete numerals are replaced by letters, as François Vieta did in the late 16th century when he introduced algebra, it is possible to “calculate” using signs in a fundamentally more abstract manner. The introduction of algebra made it possible to use new signs for new operations – for example, the introduction of differential calculus by Gottfried Wilhelm Leibniz, which made it possible to work with infinitesimal magnitudes.3
2 See Sybille Krämer: Symbolische Maschinen. Die Idee der Formalisierung in geschichtlichem Abriss, Darmstadt 1988; Sybille Krämer: Berechenbare Vernunft. Kalkül und Rationalismus im 17. Jahrhundert, Berlin, New York 1991. 3 Isaac Newton also developed a method for calculating infinitesimal magnitudes around the same time, but it was Leibniz’s notation that met with success. See Jason S. Bardi: The Calculus Wars. Newton, Leibniz and the Greatest Mathematical Clash of All Time, New York 2006.


The linking together of algebra and geometry effected by René Descartes in the 17th century made it possible to employ calculation with letters as a general description of calculation with numbers. In this way, the symbolic machinery became more cunning, facilitating increasingly complex operations. And the more cunning this semiotic instrumentarium became, the more interesting were the descriptions and discoveries it made possible. For just as there is an enormous difference between looking at the stars with the naked eye or viewing them through a telescope or radio telescope, representable knowledge is largely dependent upon the constitution of the symbolic machinery employed. But as long as the symbolic machines of science existed only on paper, their range was limited. The question that had plagued the sciences for centuries was: how could these symbolic machines become real machines, thereby dispensing with the arduous labor of carrying out calculations by hand? Before this symbolic machinery could be delegated to real machines, general procedures (algorithms) needed to be developed for operations that had hitherto been carried out (not unlike cooking recipes) solely by human hands. But as we know from experience, cooking recipes are highly imprecise guides; a machine would be totally overwhelmed by the task of executing them. Since the sciences in the modern era have been based increasingly on mathematics and calculation, the requirements for calculating capacities have risen enormously over the past two centuries, with many scientific fields stagnating as a result of the limited human powers of calculation. A general procedure was urgently needed that would allow all kinds of calculations to be carried out by machines. For many centuries, success in this field was seen only in the development of mechanical instruments (physically manifested algorithms, so to speak) for performing the four fundamental operations of arithmetic. What was needed, however, was a general and precise procedure covering all types of calculations that could be delegated to machines. Only in 1936, when an idea occurred to Alan Turing involving the mechanization of paper calculations (that is to say, the way in which signs are used in calculation), did it become possible to conceptualize a generally programmable calculating machine, namely the computer. “Turing turned back to his school days, recording the process of calculation in the boxes of gridded exercise books as number notations and according to fixed rules. This was a totally mechanical process, which Turing described appropriately in the model of a programmed machine, the Turing machine.”4
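The mechanical character of this rule-following can be suggested by a minimal sketch. The following toy machine – its rule table invented purely for illustration – increments a binary number written in the squares of a tape; what it writes, where it moves, and which state comes next are determined by fixed rules alone.

```python
# A toy Turing machine that adds 1 to a binary number; illustrative only.
rules = {  # (state, symbol read) -> (symbol to write, move, next state)
    ("inc", "1"): ("0", -1, "inc"),   # 1 plus carry = 0, carry moves left
    ("inc", "0"): ("1", 0, "halt"),
    ("inc", " "): ("1", 0, "halt"),
}

tape = {0: "1", 1: "0", 2: "1", 3: "1"}  # the number 1011, head at the end
head, state = 3, "inc"

while state != "halt":  # strict criterion for when the operation ends
    write, move, state = rules[(state, tape.get(head, " "))]
    tape[head] = write
    head += move

print("".join(tape.get(i, " ") for i in range(4)))  # 1100 = 1011 + 1
```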

This dramatically expanded the spectrum of applications for mechanically executable algorithms, at least in cases where the necessary operations could be subjected to the following logic: each step must emerge explicitly from the previous one, and there must be strict criteria for determining when an operation begins and ends. All procedures that are describable in these terms can be calculated mechanically. Simulations are such calculable descriptions.

The “nature” of the epistemic texture of scientific simulation

When we turn our attention toward the “nature” of the epistemic texture of scientific simulations, and hence to the nature of computers and their mode of processing signs (algorithms, mathematical operations), then it becomes clear that we are dealing with a precise, discrete, and finite world of binary numbers. To manipulate these worlds means to operate upon them in a way based on rules, and that is exactly what simulations do. They provide thousands of instructions prescribing how the numbers are to be balanced with one another in the calculation space once they have been initialized with measurement data (t0). The course of a climate model simulation, for example, works step by step through all of these instructions before delivering the computed values of the climate state variables for a point in time t1.5 The degree of resolution of the calculation grid plays an important role in determining whether processes are expressed in the data or not. Because storms are smaller than the T21 grid widths of approx. 500 km, even severe storms will remain undetected in a climate model with a resolution of T21. With a resolution of T42 (approx. 250 km grid widths), indications of storm activity become detectable, while at T106 (approx. 110 km grid widths), storms become dramatic events within the model.
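Schematically, such a calculation can be sketched in a few lines. The equations and numbers below are invented for illustration and stand in for the many thousands of instructions of an actual model; only the pattern is the point: state variables initialized with “measured” values at t0 are carried, rule by rule and step by step, to t1.

```python
# Schematic time-stepping of invented state variables; not a real model.
state = {"temperature": 14.0, "humidity": 0.60}  # "measured" values at t0
dt_minutes = 20                    # one discrete step of the model clock
steps_per_day = 24 * 60 // dt_minutes

for step in range(steps_per_day):  # from t0 to t1 = one day later
    t, h = state["temperature"], state["humidity"]
    # instruction 1: relax temperature toward a radiative equilibrium
    state["temperature"] = t + 0.01 * (15.0 - t)
    # instruction 2 (toy rule): warmer air holds slightly more moisture
    state["humidity"] = min(1.0, h + 0.001 * (state["temperature"] - t))

print(state)  # the computed state variables at t1
```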

4 Wolfgang Coy: “Gutenberg und Turing: Fünf Thesen zur Geburt der Hypermedien,” in: Zeitschrift für Semiotik 16.1–2, pp. 69–74, esp. p. 71. 5 In 1904, Vilhelm Bjerknes published an article in the Meteorologische Zeitschrift entitled “Das Problem der Wettervorhersage, betrachtet vom Standpunkt der Mechanik und der Physik” (The Problem of Weather Prediction, Considered from the Standpoint of Mechanics and Physics), thereby laying the foundation of weather and climate modeling. He proceeded from the assumption that the condition of the atmosphere at an arbitrary point in time could be sufficiently determined provided that at each moment, velocity, density, air pressure, temperature, and moisture could be calculated. The resulting equations burst the boundaries of analytical methods, and up to the present can only be solved numerically, that is to say, simulated. See Vilhelm Bjerknes: “Das Problem der Wettervorhersage, betrachtet vom Standpunkt der Mechanik und der Physik,” in: Meteorologische Zeitschrift 21, 1904, pp. 1–7.


What we can learn from simulations is essentially dependent upon the degree of resolution of the calculations involved. Since clouds, fish, and shrimp are considerably smaller than current grid widths of between 500 and 110 km, they would literally fall through the net of the climate simulation if not inserted into the calculations of the climate state variables as globally averaged parameters. In a scientific simulation, all instructions are based on theories. In this sense, simulations unfold a complex, epistemic texture of hypotheses derived from the most diverse theories and disciplines. To this extent, they are not merely multidisciplinary, but also synthetic, instruments of knowledge. These synthetic instruments of knowledge are new to the natural sciences, which were previously interested only in individual aspects and in analytical procedures. Essentially, simulations are instruments for experimenting with theories. How do the programmed hypotheses behave overall? Do they accurately reproduce the phenomena under investigation? The trick here is to adequately validate the results of the simulation, that is to say, to determine accurately whether “in silico” data actually coincide with reality. Visualizations of the results of simulations, then, are images of mathematical theories, and not of reality. The fact that many simulated images appear realistic has less to do with the data calculation than with its “creative” presentation.

Towards a narratology of scientific simulations

If we regard scientific simulations as narrative textual structures, however singular, then a narratology of scientific simulation should be possible. We have learned from the narratology of literary texts that narrative events possess a certain duration, even if these are not always ordered in a strict sequence, and that narratives are conditioned by certain omissions, reminiscences, and anticipations, as well as by a specific narrative schema. Analogously to literary genres (detective fiction, novel, short story), the various types of simulations articulate diverse narrative schemata. The climate simulation described above is based on the classical narrative schema of deterministic simulation using partial differential equations, while the “Monte Carlo” simulations, for example, resort to a stochastic narrative schema that follows the law of large numbers.6

6 The LLN (law of large numbers) refers to the relative frequency of randomized results. According to this theorem, the more often a random experiment is carried out, the more closely the results will approximate theoretical probability.
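A minimal example can make the stochastic schema concrete. The following sketch estimates the number π by random trials: by the law of large numbers described in note 6, the relative frequency of points falling inside the quarter circle approaches its theoretical probability, π/4, as the number of trials grows.

```python
import random

def estimate_pi(trials):
    """Monte Carlo estimate: fraction of random points inside the circle."""
    hits = sum(random.random()**2 + random.random()**2 <= 1.0
               for _ in range(trials))
    return 4 * hits / trials

for trials in (100, 10_000, 1_000_000):
    print(trials, estimate_pi(trials))  # estimates tighten around 3.14159...
```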

Differential equations (developed roughly simultaneously in the 17th century by Isaac Newton and Gottfried W. Leibniz) are the core of modern science, especially physics. These serve to describe the unfolding of a system in time and space, and for this reason, every developmental step emerges explicitly from the previous one. Standing behind these procedures is the idea of a deterministic nature, one that does not develop arbitrarily, but in a way that is predictable from its current state. Whether this narrative schema is true cannot be determined, for as thermodynamics and quantum physics have demonstrated, additional narrative schemata may still be “discovered.” The question of whether the narrative schema owes its existence to nature or to the symbolic instrumentarium will not be discussed here. Of greater interest for a narratology of scientific simulations is an engagement with the order of simulated narrative structures, with their relevance and sociocultural significance. Simulations decompose the simulated events into explicit and discretizable instructions for a finite calculation grid. The sequence of the narrated events (historical time) is not identical with the programmed course of the narrative itself (narrative time). The simulation runs through all programmed data of the simulation model according to a fixed schema, using flashbacks in order to generate predictions. While in nature the states between t0 and t1 develop continuously across space, in the simulation these processes must be decomposed and executed successively. There exists in nature, then, no model for the temporal unfolding of the narrative. The development of individual processes is always a causal reconstruction under simplified conditions. Only that which can be observed and measured can be adequately reconstructed. For this reason, simulation models are subject to numerous omissions and idealizations, but also to additions conditioned by the calculations themselves. Since we have faith in the narratives of scientific simulation, for example in the area of weather and climate prognosis, this type of narratology has a different ontological status than that dealing with literary textual production. It ought to be possible to determine whether a given scientific simulation has generated fiction or fact, but this is far from easy. The “success” of a simulation is not a matter of taste, but rather a question of correctly evaluating the results. Evaluation and validation, then, belong to the repertoire of the sciences of simulation. Scientific narratives are among the grand narratives of our culture, and they have been developed over a period of many centuries. With simulations, they now become “visible” and experimentally accessible.


The language of these narratives is that of mathematics, a universal language spoken by all natural scientists. The fascination of scientific simulations lies in their universality. To be sure, there may exist various narrative schemata, but the constitution and “nature” of these semiotic worlds is virtually universal. To the extent that they can be expressed in this “language,” all of these processes can be integrated into narratives. Consequently, there is a constantly growing collection of simulation models that represent an increasingly comprehensive epistemic texture of our world. And although the “narrator” is not a part of the simulated story, his or her “handwriting” can be detected everywhere: in the selection of processes to be examined, in choices of what to omit, in the ordering of narrative sequences, and in many other respects as well. In contradistinction to literary texts by individual authors, simulation models are collaborative texts based on knowledge that has been acquired by the scientific community through joint efforts. That which counts as knowledge is regulated by generally valid criteria: “stories” must be accurately verifiable by means of measurements, observations, and experimentation, and increasingly by means of “in silico” simulations.7 The future role that scientific simulations will play with regard to the verifiability of theoretical knowledge, as well as to the generation of new knowledge, remains open. The notion of delegating everything to simulations still runs counter to our “empirical sense.” But there are signs that current developments are leading in this direction, and that even now, the sciences are entering a “post-empirical” stage. For simulations are capable of more than narrating stories. In the meantime, they have begun concretizing such narratives in reality, developing into design or engineering tools. Not unlike the digital chain in production and in architecture – CAAD (computer aided architectural design), RP (rapid prototyping), CAM (computer aided manufacturing) – there are already scientific fields doing precisely this (molecular modeling), or attempting to do so (synthetic biology). Today, the sociocultural significance of such changes in science can only be imagined. Discussions in biology, genetics, and pharmacology, with their talk of cyborg cells, “improving human performance,” and other similar topics, will surely make us sit up and take notice.

7 In simulations, many “narratives” emerge that are not prescribed by the programming data. Seasonal weather changes in climate models, for example, are the result of calculations. If they fail to emerge, the model is false.


Erich Hörl

KNOWLEDGE IN THE AGE OF SIMULATION: METATECHNICAL REFLECTIONS

1.

During a stroll through Zapparoni’s new garden of creation, populated by swarms of animated marionettes, a new sense of the world dawns on the protagonist of Ernst Jünger’s Gläserne Bienen (The Glass Bees). This unprecedented feeling is constituted primarily by the loss of his sense of a specific difference. In Jünger’s novel, published in 1957, we find the following passage: “During this intense process of inspection and examination, I lost the capacity to distinguish between the natural and the artificial.”1 The consequence of this loss is a growing skepticism in relation to objects, and a problem “with regard to perception as such, for it failed to fully separate outer and inner, landscape and fancy. The strata lay one upon the next, oscillating, mixing up their contents, their meanings.”2 The distinction between natural and artificial, rendered uncertain now by the “robot parade” in the park of the engineer, was one that had been binding and functional since the inception of Modernity, and even earlier, ever since Plato devised his doctrine of deceptive images and his theory of representation. Also in 1957, Hans Blumenberg investigated with greater theoretical precision the loss of orientation and sense of change expressed in Jünger’s novel. He painstakingly reconstructed the historical shift of meaning seen in the reshaping, reconstruction, and steady decomposition of a guiding Platonic formula, which in classical Greece had solidified the ontologically deficient status of the artificial and of the artistic, that is to say, of the technical in an all-encompassing sense, regulating the difference between natural and artificial. This formula was “the imitation of nature” (Nachahmung der Natur).

1 The above passages translated by Ian Pepper. For the published English version, see Ernst Jünger: The Glass Bees, translated by Louise Bogan & Elisabeth Mayer, New York 1960, p. 138. 2 Ibid.


Blumenberg speculated that the overcoming of the mimesis formula – which (in opposition to the direct participation in being of true images) regards all artistic and technical products as mere derivatives of being, as metaphysically bad, disreputable and deceptive images, phantasmata and simulacra, and as fictitious perversions – “could culminate in a ‘prefiguration of nature’” (Vorahmung der Natur).3 With this concept of prefiguration, Blumenberg coins the watchword of an emerging post-mimetic age (in the best sense). The background before which both of these diagnoses acquire their virulence is articulated pointedly in the prognosis of cyberneticist Abraham Moles: “Forming itself before our eyes is a new world,” he wrote in 1959, “the new world of machines.”4 This process, says Moles, will not stop at science, but its principal effect will be the reshaping of science: “20th-century science will be primarily the science of models.”5 The new “science of semblance,”6 which emerges on the basis of computing machines, would cease to “concern itself overly with reproducing reality.”7 In the cybernetic age, a phenomenon is recognized as valid when it can be “constructed as a model in such a way that it possesses the same behavior as its prototype in all details, and even the same uncertainties”8 without necessarily being a faithful imitation. The inception of the age of modeling and simulation that was articulated in the late 1950s in literature, philosophy, and cybernetics has since come to signify the undermining of the foundations of a knowledge culture based on representation. The concomitant changes in research practices and knowledge production, and the possibilities inherent in machine-based modes of “worldmaking,” have resulted in a new epistemological and ontological constellation. The transition from a knowledge culture based on representation to one based on simulation requires an exit from the long-lived Platonic paradigm, which had been coextensive with Occidental history as a whole, and for which the minority status of the technical had been constitutive.

3 Hans Blumenberg: “Nachahmung der Natur. Zur Vorgeschichte der Idee des schöpferischen Menschen,” in idem: Wirklichkeiten, in denen wir leben, Stuttgart 1996, see p. 93; for an English translation, see “Imitation of Nature. Toward a Prehistory of the Idea of Creative Being,” in: The End of Nature, Dossier on Hans Blumenberg, Qui Parle, vol. 12, no. 1, 2000. 4 Abraham Moles: “Kybernetik, eine Revolution in der Stille,” in: Epoche Atom und Automation. Enzyklopädie des technischen Jahrhunderts, vol. VII, Geneva 1959, p. 7. 5 Ibid, p. 8. 6 Ibid. 7 Ibid., “Nachwort,” p. 123. 8 See note 5.

In a similar manner, and through hard technical procedures, this transition brings us back to the sophistic and Platonic founding scene, one originally directed against the rhetorical technification of the lógos, and one that created a genuinely a-technical episteme as the main scene of its philosophico-scientific program.9 Isabelle Stengers writes with great historical consistency that “the power of the computer as a simulation instrument [...] has [brought forth] a new species of scientist one might call the new sophist, a researcher no longer committed to truth, which silences fiction, but instead to the possibility of mathematical fictions through which any desired phenomena become reproducible.”10 The reversal of Platonism – spoken of from Nietzsche to Heidegger to Deleuze as the task of philosophy and as a new beginning for thought – is also (perhaps even mainly) a machine-based event, implementing an utterly un-Platonic “mathematics as instrument of fiction.”11 Nowhere is this so evident as in the computer-supported transition from a classical to a transclassical knowledge culture, one that shifts the sciences into close proximity to the arts.12 Under the aegis of the technosciences, the age-old tension between episteme and techne loses its force, and the order of knowledge is no longer characterized13 by procedures of truth and the production of certainty, but by the signatures of nonlinear mathematics and non-trivial machines.14

9 See Hans Blumenberg: “Lebenswelt und Technisierung unter Aspekten der Phänomenologie,” in: Wirklichkeiten, in denen wir leben, op. cit., pp. 7–54, esp. p. 13f. 10 Isabelle Stengers: Die Erfindung der modernen Wissenschaft, Frankfurt, New York 1997, p. 209. 11 Ibid. 12 The difference between classical and transclassical machines, and between classical and transclassical thinking, here transferred to the culture of science, comes from German philosopher and cyberneticist Gotthard Günther. According to Günther, classical machines are work- or performance-oriented machines which, in ontological terms, imply a specific relationship to the world, namely that of the form- and meaning-giving subject who manipulates an, in principle, formless and meaningless world of objects. This dual ontological structure corresponds to the bivalent logic established since Aristotle. For Günther, the human hand is a model of the classical machine. Transclassical machines, on the other hand, do not manipulate the world, but process information, and have their model in the brain. They can only be grasped via the contours of a process ontology and a new multivalent logic. See Gotthard Günther: “Maschine, Seele und Weltgeschichte,” in: Gotthard Günther: Beiträge zur Grundlegung einer operationsfähigen Dialektik, vol. 3, Hamburg 1980, pp. 211–235. See also Erich Hörl: “Das kybernetische Bild des Denkens,” in: Die Transformation des Humanen. Beiträge zur Kulturgeschichte der Kybernetik, ed. by Michael Hagner and Erich Hörl, Frankfurt 2008, pp. 163–195. 13 On the difference between trivial and nontrivial machines, see Heinz von Foerster: “Mit den Augen des Anderen,” in: Heinz von Foerster: Wissen und Gewissen. Versuch einer Brücke, ed. by Siegfried J. Schmidt, Frankfurt 1993, pp. 350–363. 14 On the signature of indeterminacy in technology, see Gerhard Gamm: “Technik als Medium. Grundlinien einer Philosophie der Technik,” in: Gerhard Gamm: Nicht nichts. Studien zu einer Semantik des Unbestimmten, Frankfurt 2000, pp. 275–287. On the supplanting of the regime of certainty in mathematics by the pragmatics of modeling, see Amy Dahan: “Axiomatiser, modéliser, calculer: les mathématiques, instrument universel et polymorphe d’action,” in: Les sciences pour la guerre 1940–1960, ed. by Amy Dahan and Dominique Pestre, Paris 2004, pp. 49–81. Eric Winsberg has characterized simulation as the management of uncertainty and indeterminacy; see Eric Winsberg: “Simulated Experiments: Methodology for a Virtual World,” in: Philosophy of Science, 70 (January 2003), pp. 105–125, esp. p. 112f.


To return to Jünger’s officer: in the end, he was granted a formula that seemed to ward off the technological disorientation of his present-day world: “There is much illusion in technics.”15 In transclassical times, even such rhetorical palliatives have become illusory, and his formula remains a sentence drawn from the Occidental archive, one that continues to reduce the technical.

2.

All-encompassing, meanwhile, is the list of sciences and epistemic fields that have implemented digital computing machines as fundamental media of research and knowledge since 1945, and that have grouped themselves around the new epistemo-technologies of simulation. These fields range from astronomy, biology, chemistry, the geosciences, genetics, meteorology and climatology, the neurosciences, and physics, through economics and sociology, the military and administrative sciences, and architecture, to electrical engineering, mechanical engineering, nanotechnology, aeronautics and space travel, artificial intelligence and artificial life, and robotics. Even pure mathematics has been involved in this development since the 1980s, in the form of a new research field called experimental mathematics. Also shaken by the deployment of machines is the anthropocentric self-image of science, which is based on a non-technical understanding of basic procedures of discovery and proof. Moles’s prognosis has been dramatically confirmed. A displacement has occurred in the meaning of the sciences as well, transforming them increasingly into computer sciences through the application of computer-supported practices of modeling and simulation. The computer, then, has become an essential epistemic tool.16 The displacement of meaning effected by the technoscientific caesura has at this point acquired the status of normality.
15 Jünger (1957) 1990, p. 137. 16 The computerization of the sciences is seen as a constitutive element of the history of the computer itself, which has been deployed for purposes of simulation since its inception. See Paul Edwards: “The World in a Machine: Origins and Impacts of Early Computerized Global Systems Models,” in: Systems, Experts, and Computers. The Systems Approach in Management and Engineering, World War II and After, ed. by Agatha C. Hughes and Thomas P. Hughes, Cambridge, Mass./London 2000, pp. 221–253.


This caesura has generated a crisis in the sciences, characterized by Edmund Husserl in 1936 as the fundamental incomprehensibility of the processes of formalization and technification. These processes, wrote Husserl, are remote from perception, and hence create problems for the subject of knowledge, which has always stood primarily in a perception-based, concrete, and non-technical lifeworld.17 It is this technological displacement of meaning that requires the renegotiation not only of the epistemological situation, but of the ontological one as well. The techniques of representation through which the sciences penetrate ever deeper into the mathematicization of the non-representable, generating new objects and areas of experimentation and transforming resistant, nonlinear, complex systems, can no longer be conceptualized as a temporary or purely external disturbance of inherited representational categories and structures of meaning. In light of the originary technicity of the sciences, any hope of dealing with such disturbances via reacquisition on the basis of a pre-given "lifeworld" or through semiotic explications has been rendered obsolete. Rather, these disturbances (recognizable as indices of the technological constitution of the sciences) indicate the boundaries of inherited counter-technological strategies for constructing meaning. And conversely, they conspicuously symptomatize the fact that the question of our technological conditions has entered the foreground in a fully consequential way. Because the radical technification of the sciences calls into question the representative order of things from which the sciences themselves emerged, it prompts us to focus on the strongly emergent technical milieu of our existence, a milieu long regarded by that order as inferior and semantically impoverished, if not altogether meaningless. Although able to discern the structure and scope of the techno-scientific turn only vaguely, Martin Heidegger did emphasize the special role of the sciences for understanding our technological condition: "The more unequivocally the sciences press on toward their predetermined technological essence and its distinctive character, the more decisively does the question concerning the possibility of knowledge laid claim to in technology clarify itself – the question concerning the kind and limits of that possibility and concerning its title to rightness."18

17 One fundamental concern of the phenomenological movement was the loss of any sense of coherent meaning in the sciences, which arrived in the wake of advancing formalization and technification: the movement sought to reinstall a transcendental, pure subject of knowledge, one that would remain in control of all meaning-creation. These concerns received their most incisive articulation in Husserl's so-called "Crisis" text, which appeared posthumously. See Edmund Husserl: The Crisis of European Sciences and Transcendental Phenomenology, Evanston, Ill. 1970.


But the technification of the sciences triggered by the "computational turn" carries us far beyond Heidegger's supposition, according to which technology has always conditioned the essence of science in opposition to philosophical founding claims. Technification leads us back to the intrinsic meaning of the technical itself. It compels us to consider the irreducibility of the technical, its non-derivability, and to recognize it as the original condition and originary milieu of the human and of culture. Technology can no longer be grasped simply as applied science, as the practical precipitate of purely logo-theoretical efforts, as an application of a logocentric interpretation of the world, or as an instrument. Nor can it be positioned as an element of the symbolic order or reduced to a symbolic form of activity.19 When it begins to assert its autonomy, the world of machines can no longer be regarded primarily as a symbolic world – which of course does not necessarily imply the polemical contrary conclusion, that the world of the symbolic is one of machines.20 The question of technology must be liberated from this double misrecognition of the technological, namely the instrumental misunderstanding on the one hand and the symbolic fixation on the other, both characteristic and discursively potent since Plato (and even more so since Aristotle).21 The French philosopher Gilbert Simondon formulated this task quite explicitly. He proposed directing attention toward the "concrétisations objectives," which respectively condition the milieus of human beings, and whose operators and objects we simultaneously are. For the great pioneer of mécanologie, history represented a process of concretization, one that has so far (and in this order) encompassed language, religion, and technics as its systematic centers, and in the course of whose development the human being has progressively decentered and exteriorized him/herself in his/her objective concretizations.

18 Martin Heidegger: "The Word of Nietzsche: 'God Is Dead'" (1943), in: The Question Concerning Technology and Other Essays, New York 1977, pp. 53–114, esp. p. 56.
19 See, for example, Ernst Cassirer: "Form und Technik" (1930), in: Symbol, Technik, Sprache, ed. by Ernst Wolfgang Orth and John Michael Krois, Hamburg 1995, pp. 39–91. A radicalized version, in which the technosciences, and with them technology, enter fully into a pluralism of the symbolic generation of worlds, is proposed by Nelson Goodman in his Ways of Worldmaking, Indianapolis/Cambridge 1978.
20 See Friedrich Kittler: "Die Welt des Symbolischen – eine Welt der Maschinen," in: Draculas Vermächtnis. Technische Schriften, Leipzig 1993, pp. 58–80.
21 On this double misrecognition, see Gilbert Hottois: Philosophies des sciences, philosophies des techniques, Paris 2004. In particular, Hottois sketches an archaeology of the concept of the technosciences (pp. 119–171).


This process of exteriorization qua objective concretization is primordial, according to Simondon. That is to say, human beings are never fully themselves; they do not exist prior to their objective concretizations. Rather, this process is itself constitutive of becoming human, a sign of the human as such: "Man forms a system with what he constitutes."22 Simondon conceived of the process of system formation via technical devices as the fundamental characteristic of the present day. Simondon's attempts to understand the intrinsic significance of the technical can be traced back to his unease with the inability of his contemporaries to decode the new technological situation and develop an adequate conception of technological culture. He distinguished two basic modes, which characterize both the relationship of humanity to technical conditions and the ontological schematization of technical objects: the "statut de minorité" and the "statut de majorité."23 These constellations respectively represent regimes of meaning of technics. In the "statut de minorité," technical devices are regarded principally as objects of use and as means to attain ends, as tools or instruments. Their assigned meanings are essentially subject to considerations of utility as dictated by human need, and are subordinated to the regime of labor. In the "statut de majorité," by contrast, technical knowledge becomes explicit and technical activity becomes a conscious activity, one that enters into a regulated relationship with the sciences and gains coherence, while the technological as such becomes a problem for thought and acquires its own significance. The transition from one mode to the other is a fundamental shift in the meaning of technics. For Simondon, the significant aspect of the concretization of technical objects that indicates the transition from minority to majority status resides in the technological transformation of all milieus of the "réalité humaine," the scope of which far surpasses all of the transformations seen in previous processes of concretization: "The technical object requires more and more a technical milieu in order to exist."24

22 Gilbert Simondon: "Les limites du progrès humain" (1959), in: Une pensée de l'individuation et de la technique, ed. by Gilles Châtelet, Paris 1994, pp. 268–275, esp. p. 270: "L'homme forme système avec ce qu'il constitue." Cited here from Gilbert Simondon: "The Limits of Human Progress" (1959), translated by Sean Cubitt, forthcoming in Cultural Politics.
23 See Gilbert Simondon: Du mode d'existence des objets techniques (1958), Paris 2005, p. 85ff.
24 Simondon (1959), p. 273: "L'objet technique exige de plus en plus un milieu technique pour exister."


An essential feature of this process of concretization is the way in which technics (having transcended its instrumental status) transforms itself into technoscience and assumes the position formerly occupied by the sciences in the framework of pre-technological processes of concretization, in particular linguistic ones, supplanting scientific deduction with technoscientific experimentation. "The concretization of technical objects," writes Simondon, "is conditioned by the reduction of the distance separating the sciences from technics. The handicraft stage is characterized by a weak correlation, the industrial phase by a strong one."25 We are confronted by the technoscientific system far more directly than Simondon was. Seen in this perspective, the system can be expected to shake the foundations of the traditional instrumentalization of technics and the representational conception of science, both of which stem from pre-technological processes of concretization. Emerging from the technoscientific caesura is not only a displacement of the meaning of the sciences, but also a displacement of the meaning of technics. In fact, the former manifests itself as an effect of a changed technical, and henceforth technological, meaning, and is founded and propelled by it. If we want to understand the contemporary situation, which is increasingly conditioned by technoscientific simulation procedures, then we must describe (specifically at the level of this double displacement of meaning) the way in which we form systems via technical concretization, systems in which we ourselves are always already embedded, and which represent the pre-individual milieu of our individuation. This technical modeling cannot be decoded using standards and concepts that originate from a pre-technological phase of objective concretization – from the logo-theoretical configuration all the way to the theological guiding idea of the creative and its secularized form as invention. Any attempt to do so would mean misrecognizing the majoritarian technical constellation in which we find ourselves today.

25 "La concrétisation des objets techniques est conditionnée par le rétrécissement de l'intervalle qui sépare les sciences des techniques. La phase artisanale est caractérisée par une faible corrélation, la phase industrielle par une corrélation élevée." Simondon: Du mode d'existence des objets techniques, p. 36. See also Bernard Stiegler: "La maïeutique de l'objet comme organisation de l'inorganique," in: Une pensée de l'individuation et de la technique, ed. by Gilles Châtelet, op. cit., pp. 239–262, esp. p. 253.


3

In the early 1930s, Gaston Bachelard coined the concept of "phénoménotechnique" to describe the technoscientific situation.26 With this term, created with the terminology of quantum physics in mind, Bachelard conceived of the new scientific spirit not as the description of pre-existing or newly discovered phenomena, but as the originary technical production and invention of scientific objects. The phenomena with which the sciences are concerned manifest themselves as techno-phenomena. Objectivity was now regarded as technically generated, and no longer as the sovereign performance of transcendental observing subjects. The technical was revealed as an originary component of science: no longer as phenomeno-graphic labor, but as phenomenotechnical machinery. In the sciences, Bachelard recognized "the metatechnics of an artificial nature."27 In particular, the technical mobilization of microphysics sharpened his eye for the technicity of science as a whole, revealing to him (as Hans-Jörg Rheinberger has shown in detail) "the instrument" as located "at the center of the epistemic ensemble" of modern science.28 Science now reveals itself as a primarily constructive form of action, realized in technical objects in a way that breaks with prescientific experience. However, Bachelard's instrumentalism not only exposed the technicity of the sciences, but also concealed it. For the instrument that shifted into the focus of investigations of epistemological processes was essentially regarded as a "veritable concretized theorem."29 This primacy of mind and of theory, which still governs his reading of experimental technicity, shows itself in the insistence on the traditional minorization of technical objects.

26 On the precise history of Bachelard's coining of this concept, see Teresa Castelão-Lawless: "Phenomenotechnique in historical perspective: Its origins and implications for philosophy of science," in: Philosophy of Science, 62 (1995), pp. 44–59.
27 Gaston Bachelard: "Noumène et Microphysique" (1931/32), in: Gaston Bachelard: Études, Paris 1970, pp. 11–24, esp. p. 24: "la métatechnique d'une nature artificielle."
28 Hans-Jörg Rheinberger: "Gaston Bachelard und der Begriff der 'Phänomenotechnik,'" in: Epistemologie des Konkreten, Frankfurt 2006, pp. 37–54, esp. p. 45.
29 Gaston Bachelard: Les intuitions atomistiques, Paris 1933, p. 140. Cited in Hans-Jörg Rheinberger (2006), p. 45.


Bachelard thus reveals himself as a transitional thinker whose pronounced technoscientific instincts remain dependent on common sense, a perspective that regards technics only as an instrument, hence depriving it of autonomy. But what remained plausible on the eve of the technological shift in the meaning of the sciences via simulation procedures is no longer sustainable today. This minoritizing trait is also registered in Hans-Jörg Rheinberger's almost canonical investigations of experimental systems as the generation and concretion of so-called "epistemic objects," investigations whose point of departure is Bachelard's process epistemology. The "scientific" or "epistemic object," the actual object of science, is said to be characterized by an "irreducible vagueness" and a "precarious status" because it "embodies the as-yet unknown."30 Before its essentially withdrawn character becomes representable at all, before it can become an object of manipulation in a research context, the epistemic object must be delimited, determined, and stabilized via experimental conditions or "technical objects." "The technical conditions," he writes, "determine the space and scope of the representation of an epistemic object." Moreover – and this is decisive – "sufficiently stabilized epistemic objects" become "constitutive parts of experimental arrangements,"31 thereby sinking to the status of stabilized and stabilizable technical objects. Rheinberger refers explicitly to the minority status assumed by technical objects in the process of knowledge formation, a process understood as the concretion of epistemic objects: "We are dealing here with an apparent paradox: on the one hand, the realm of the technical is a precondition for scientific research, and hence for epistemic objects. On the other, the technical conditions tend consistently to annihilate the scientific objects so defined (as epistemic objects), to integrate them into the inventories of the technical. Essentially, this paradox can only be resolved by the fact that the reciprocal effects between scientific object and technical conditions are non-technical in character. Scientists are 'tinkerers,' not engineers. In contrast to the realm of the technical in the narrower sense, research systems are improvised arrangements oriented not towards mere repetition, but towards the continuous emergence of new and unanticipated events."32

30 Hans-Jörg Rheinberger: "Experimentalsysteme, Epistemische Dinge, Experimentalkulturen. Zu einer Epistemologie des Experiments," in: Deutsche Zeitschrift für Philosophie, 42 (1994) 3, pp. 405–417, cited on p. 408f.
31 Ibid., p. 409.
32 Ibid.


Today, the sciences have long since exited the stage of instrumentality, having been converted into the post-instrumental or trans-instrumental computer sciences. Since the accidental, the unanticipated, and the indeterminate are to a considerable degree generated using machines, even becoming effects of mechanical processing, vagueness as such is a basic feature of the technical object itself. Out of the question, then, is any recourse to an ultimately non-technical research mentality. Henceforth, the "differential reproduction"33 of results and the generation of surprises that characterize the epistemological process will reside in the technical procedures of modeling and simulation, and consequently to some extent in those technical objects that were formerly expected to serve solely as guarantors of stability and determinacy. These technical objects are agents not only of stabilization but also of destabilization, bearers of openness and of the non-finished. In light of the computer sciences, the phenomenotechnical and the inferiority of the technical object implied therein are no longer guiding ideas. Nonetheless, we can speak of an epistemo-technics, and can assume the existence of epistemo-technical structures which produce nature and the world without models prefiguring and projecting them. What occurs here, in the best pragmatic spirit, is "open-ended behavior" and the "open-ended generation of change."34 Increasingly, the epistemological process of concretion obeys the objective concretion arising out of the spirit of the technical object. For their part, the sciences have (precisely in their technical constitution) changed from machines for establishing certainty about the world into generators of uncertainty and indeterminacy, in the process becoming (to invoke an expression coined by Gerhard Gamm) a core element of the apparatus of indeterminacy,35 one that increasingly characterizes our situation. This change also affects the protagonists of the research process.

33 Ibid., p. 410.
34 In his ambitious account of the devaluation of practical action in orthodox philosophical thinking since Plato, John Dewey reinforces the concept of open-ended action. See John Dewey: The Quest for Certainty (1929), Frankfurt 2001, pp. 7–29, esp. p. 26.
35 Gerhard Gamm: "Technik als Medium," p. 276. For years now, Gamm has pursued a research program that examines the linguistic-philosophical, logical, and theoretical implications of the "semantics of indeterminacy." The historical conclusion of this program is that, for a century and a half, a retreat has been observable in various scientific domains from the fundamentally modern conviction according to which science produces certainty and security, and does so, moreover, on a secure and certain methodological basis. For Gamm, the technical production of indeterminacy is only one phenomenon among others, while in my opinion the engagement of the so-called apparatus (dispositif) of indeterminacy is shaped by technology, if not actually induced by it.
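That indeterminacy can be an effect of machine processing itself, rather than of faulty instruments, admits of a simple computational illustration (a toy example of my own, not one used by the authors cited above): even a fully deterministic one-line model, iterated by a computer, produces behavior that can only be known by running it, since the smallest variation of the initial condition is amplified until two runs of the "same" experiment no longer agree.

def logistic(x0, r=4.0, steps=50):
    # Iterate the logistic map x -> r * x * (1 - x): a completely
    # deterministic rule, here in its chaotic regime (r = 4).
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic(0.300000000)
b = logistic(0.300000001)  # the "same" run, perturbed in the ninth decimal
for t in (0, 10, 30, 50):
    print("step %2d: %.6f vs %.6f" % (t, a[t], b[t]))
# The two trajectories coincide at first and have nothing in common by
# the end: differential reproduction instead of mere repetition.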


Emerging as central epistemo-technical actors in place of tinkerers are not so much engineers as simulators and model builders. Their fundamental gestures should be studied, not to mention their trans-theoretical repertoire of technical tricks and artifices.36

4

For Vilém Flusser, a phenomenologist of the media, history is primarily the history of basal cultural-technical codes. Despite a certain symbolic fixation, he sketches a possible framework for the re-description of epistemo-technical structures, one that abandons both the primacy of the theoretical attitude and the inferior status of the technical object. "Today," he claims, "there are numerous symptoms indicating that we are beginning to reorient ourselves from a subjective to a projective attitude,"37 thereby recoding our thinking. According to Flusser, this recoding particularly affects the new use of images, which is observable in the transition from alphanumeric and purely numeric codes to "synthetic codes," and which as a whole characterizes the procedures of simulation: "The old images are re-presentations [Ab-bilder] of something, the new ones are projections, models [Vor-bilder] for something that does not exist, but which might. The old images are fictions, simulations of something, while the new ones are concretizations of possibilities. The old images owe themselves to an abstractive, withdrawn 'imagination,' the new ones to a concretizing, projective 'power of imagination.' We think projectively."38 Under advanced technological conditions, the powers of productive imagination can be expected to arise in the wake of the powers of reproductive imagination. Flusser regarded this change as highly significant. Insofar as the power of imagination is understood as "a complex intentional gesture through which human beings shape their lifeworld," this change of attitude, which reveals itself in the changed regime of the powers of imagination, testifies to no more and no less than "a contemporary cultural influx" and "a new way of being in the world."39

36 In the above-mentioned text, Eric Winsberg already designated the "simulationist" (p. 109) as a central protagonist, while speaking of the "extra-theoretic modeling 'tricks'" said to characterize his work.
37 Vilém Flusser: "Vom Subjekt zum Projekt," in: Schriften, ed. by Stefan Bollmann and Edith Flusser, Bensheim/Düsseldorf 1993–1996, pp. 9–160, esp. p. 24.
38 Ibid., p. 24f.
39 Vilém Flusser: "Eine neue Einbildungskraft," in: Bildlichkeit, ed. by Volker Bohn, Frankfurt 1990, pp. 115–126, esp. p. 115.
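Flusser's contrast between re-presentations of something existing and projections of something merely possible has a direct computational analogue, sketched below in a deliberately naive form (the pavilion example and its parameters are invented for illustration, not drawn from Flusser): a parametric model depicts no given building, but concretizes a whole space of possible ones.

from itertools import product

def pavilion(width, ratio, bays):
    # One concretized possibility: nothing that exists is depicted here.
    return {"width": width, "depth": round(width * ratio, 2),
            "bays": bays, "bay_width": round(width / bays, 2)}

# The "image" is the space of variants projected by the model,
# not the re-presentation of a single existing object.
variants = [pavilion(w, r, b)
            for w, r, b in product((12.0, 18.0), (0.5, 1.0, 1.618), (3, 5))]
print(len(variants), "possible pavilions; for example:", variants[0])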


For Flusser, this change of attitude implies "that the concept of knowledge must be reformulated," and that we "grasp [...] knowledge as artifice."40 Today, science is a machine for generating possibilities, and no longer for the production of statements or the promise of theory concerning that which already exists. This change of attitude has substantially shifted the ontological position and self-understanding of human individuals and of the subject. The technification of observation that characterized the Galilean situation led to a prosthetic extension of the human sensorium, thereby promoting and cementing a subjective attitude. The non-Galilean technification of calculability, on the other hand, of such great significance for our times, manifests itself not only as a human expansion into the realm of mathematical representation, but at the same time as our disappearance into a new techno-mathematicity, as well as in the production of possibilities to which it gives rise.41 On the one hand, the idea of the human is itself dissolved in the multiplicity of complex systems generated by the technology of representation. On the other, the classical position of the subject is destroyed. In the procedures of the new computer science for manipulating objects, world pictures, and world definitions, procedures founded on a new attitude one might refer to as projective, machines assume a major share of the tasks of design and definition, which had hitherto been the province of the transcendental subject. These are presumably the agents of not only a historico-epistemological but also a historico-ontological caesura, one that – succeeding the classical descriptive relation to being – summons a projective relation to being in a techno-medial world. After metaphysics, it might be said (based loosely on a vision by Max Bense) that the sovereignty of explanation concerning the new relationship to being in the technological world passes over to a "metatechnics,"42 one that goes beyond subjectivist and anthropological prejudices to reconfigure our standards for being-in-the-world.

40 See note 37, p. 36f.
41 On the Galilean situation, see Hans Blumenberg: "Das Fernrohr und die Ohnmacht der Wahrheit," in: Galileo Galilei: Sidereus Nuncius, ed. by Hans Blumenberg, Frankfurt 2002, pp. 7–75. On the thesis of the extension of the human into the realm of mathematical representation via simulation procedures, see Paul Humphreys: Extending Ourselves: Computational Science, Empiricism, and Scientific Method, Oxford/New York 2004.
42 Max Bense: "Kybernetik oder Die Metatechnik einer Maschine" (1951), in: Ausgewählte Schriften, vol. 2, ed. by Elisabeth Walther, Stuttgart/Weimar 1998, pp. 429–446, esp. p. 445.


SELECTED LITERATURE

This selected bibliography lists the texts referred to by the authors of this volume. It has also been supplemented by relevant texts. Entries are ordered chronologically. Where feasible and useful, dates of first or original editions have been cited.

— Plato: The Republic, transl. R.E. Allen, New Haven 2006.
— Giorgio Vasari: Einführung in die Künste der Architektur, Bildhauerei und Malerei, ed. by Matteo Burioni and Sabine Feser, Berlin 2006.
— Andrea Pozzo: Perspectiva pictorum et architectorum, Pars I/II, Rome 1693/1700.
— Giovanni Boccaccio: Commento alla "Divina Commedia," ed. by Domenico Guerri, Bari 1918.
— Erwin Panofsky: "Die Erfindung der verschiedenen Distanzkonstruktionen in der malerischen Perspektive," in: Repertorium für Kunstwissenschaft, vol. XLV, Stuttgart 1925, pp. 84–86.
— László Moholy-Nagy: Malerei, Fotografie, Film, Bauhausbuch no. 8, 1925 (reprint Mainz 1967).
— Hermann Weyl: Philosophie der Mathematik und Naturwissenschaft (1926), Munich 1966.
— Erwin Panofsky: "Die Perspektive als 'Symbolische Form,'" in: Vorträge der Bibliothek Warburg (1924/25), Leipzig/Berlin 1927, pp. 258–330. In English as Perspective as Symbolic Form, trans. by Christopher S. Wood, New York 1991.
— Walter Benjamin: Das Kunstwerk im Zeitalter seiner technischen Reproduzierbarkeit. Drei Studien zur Kunstsoziologie (1935/36), Frankfurt 1977 (10th edition).
— László Moholy-Nagy: Vision in Motion, Chicago 1947.


— Max Bense: "Kybernetik oder Die Metatechnik einer Maschine" (1951), in: Ausgewählte Schriften, vol. 2, ed. by Elisabeth Walther, Stuttgart/Weimar 1998, pp. 429–449.
— Rudolf Wittkower: "Brunelleschi and 'Proportion in Perspective,'" in: Journal of the Warburg and Courtauld Institutes, vol. 16, London 1953, p. 275ff.
— Hans Blumenberg: "Nachahmung der Natur. Zur Vorgeschichte der Idee des schöpferischen Menschen," in: Studium generale, vol. 10, Bochum 1957.
— Ernst Cassirer: Determinismus und Indeterminismus in der modernen Physik – Historische Studien zum Kausalproblem (1936), Darmstadt 1957.
— Gilbert Simondon: Du mode d'existence des objets techniques (1958), Paris 2005.
— Abraham Moles: Epoche Atom und Automation. Enzyklopädie des technischen Jahrhunderts, Geneva 1959.
— Edmund Husserl: Gesammelte Werke, ed. by Walter Biemel, vol. VI, The Hague 1962.
— Marshall McLuhan: Understanding Media: The Extensions of Man, New York 1964.
— Warren McCulloch: Embodiments of Mind, Cambridge 1965.
— John von Neumann: Theory of Self-Reproducing Automata, ed. by A. W. Burks, Urbana 1966.
— Lewis Mumford: The Myth of the Machine, vol. I: Technics and Human Development, New York 1967; vol. II: The Pentagon of Power, New York 1970.
— Robert Venturi, Denise Scott Brown, and Steven Izenour: Learning from Las Vegas, Cambridge, Mass. 1972.
— Wolfgang Kemp: "Disegno. Beiträge zur Geschichte des Begriffs zwischen 1547 und 1607," in: Marburger Jahrbuch für Kunstwissenschaft, vol. XIX, Marburg 1974, pp. 219–240.


— Samuel Y. Edgerton Jr.: The Renaissance Rediscovery of Linear Perspective, New York 1975.
— Henri Lefebvre: "Annullierung der 'Erkenntnistheorie' – Simulierung und Simulacrum," in: Metaphilosophie, Frankfurt 1975, pp. 210–218.
— Jean Baudrillard: L'échange symbolique et la mort, Paris 1976. In English as Symbolic Exchange and Death, London 1993.
— Herbert A. Simon: Models of Thought, Dordrecht 1977.
— Jean Baudrillard: Agonie des Realen, Berlin 1978.
— Nelson Goodman: Ways of Worldmaking, Indianapolis/Cambridge 1978.
— Bernard Tschumi: The Manhattan Transcripts, New York 1981.
— Jean Baudrillard: Simulacres et simulation, Paris 1981. In English as Simulacra and Simulation, Ann Arbor, Mich. 1995.
— Architecture and Technical Thinking, DAIDALOS, no. 18, Berlin 1983.
— Vilém Flusser: Ins Universum der technischen Bilder (1985), Göttingen 1992 (4th edition).
— Sybille Krämer: Symbolische Maschinen. Die Idee der Formalisierung in geschichtlichem Abriss, Darmstadt 1988.
— Ars Electronica (ed.): Philosophien der neuen Technologie, Berlin 1989.
— Hans Blumenberg: Höhlenausgänge, Frankfurt 1989.


— Herbert A. Simon: Die Wissenschaften vom Künstlichen, Berlin 1990.
— Florian Rötzer (ed.): Digitaler Schein. Ästhetik der elektronischen Medien, Frankfurt 1991.
— Rolf Haubl: "Unter lauter Spiegelbildern…," in: Zur Kulturgeschichte des Spiegels, 2 vols., Frankfurt 1991.
— Sybille Krämer: Berechenbare Vernunft. Kalkül und Rationalismus im 17. Jahrhundert, Berlin/New York 1991.
— Florian Rötzer and Peter Weibel (eds.): Strategien des Scheins. Kunst-Computer-Medien, Munich 1991.
— Bernhard Hölz: Tractatus poetico-philosophicus: Über Simulation, Essen 1991.
— Jean Baudrillard: L'illusion de la fin ou La grève des événements, Paris 1992.
— Norbert Bolz: Die Welt als Chaos und Simulation, Munich 1992.
— Bernhard Dotzler: "Simulation," in: Ästhetische Grundbegriffe. Historisches Wörterbuch, vol. 5, Stuttgart/Weimar 1992ff., pp. 509–535.
— Friedrich Kittler: Draculas Vermächtnis. Technische Schriften, Leipzig 1993.
— Valentin Braitenberg and Inga Hosp: Simulation. Computer zwischen Experiment und Theorie, Reinbek 1995.
— Stefan Iglhaut, Florian Rötzer, Elisabeth Schweeger (eds.): Illusion und Simulation – Begegnung mit der Realität, Ostfildern 1995.


— Ernst Cassirer: Symbol, Technik, Sprache, ed. by Ernst Wolfgang Orth and John Michael Krois, Hamburg 1995.
— Teresa Castelão-Lawless: "Phenomenotechnique in historical perspective: Its origins and implications for philosophy of science," in: Philosophy of Science, 62, Chicago 1995.
— Christopher Langton: Artificial Life: An Overview, Cambridge 1995.
— Werner Rammert (ed.): Soziologie und Künstliche Intelligenz. Produkte und Probleme einer Hochtechnologie, Frankfurt/New York 1995.
— Hans Blumenberg: Wirklichkeiten, in denen wir leben, Stuttgart 1996.
— Manuel Castells: The Information Age, vol. 1: The Rise of the Network Society, Oxford/Cambridge, Mass. 1996 (2nd edition 2000).
— Brigitte Felderer: Wunschmaschine Welterfindung. Eine Geschichte der Technikvisionen seit dem 18. Jahrhundert, Vienna/New York 1996.
— Peter Galison: "Computer Simulations and the Trading Zone," in: The Disunity of Science. Boundaries, Contexts and Power, ed. by Peter Galison and D.J. Stump, Stanford 1996, pp. 118–157.
— Nils Röller: "Simulation," in: Historisches Wörterbuch der Philosophie, vol. 9, Basel 1996, pp. 795–798.
— Walther Zimmerli: Braunschweiger Texte, Hildesheim 1997.
— Isabelle Stengers: "Das Subjekt und das Objekt," in: Die Erfindung der modernen Wissenschaften, Frankfurt 1997, pp. 201–213.


— Andreas Kablitz and Gerhard Neumann (eds.): "Mimesis und Simulation," in: Litterae, vol. 52, Freiburg im Breisgau 1998.
— Norbert Bolz: Eine kurze Geschichte des Scheins, Paderborn 1998.
— Sybille Krämer (ed.): Medien – Computer – Realität. Wirklichkeitsvorstellungen und Neue Medien, Frankfurt 1998.
— Thomas P. Flint: "Omniscience," in: Routledge Encyclopedia of Philosophy, vol. 7, ed. by Edward Craig, New York 1998, pp. 107–112.
— Thomas Metscher: "Mimesis," in: Enzyklopädie der Philosophie, Hamburg 1999, p. 845.
— Jacques Ferber: Multi-agent Systems. An Introduction to Distributed Artificial Intelligence, London 1999.
— Ichiro Sunagawa: "Growth and Morphology of Crystals," in: Forma, vol. 14, 1999.
— Gerhard Gamm: "Technik als Medium. Grundlinien einer Philosophie der Technik," in: Nicht nichts, Frankfurt 2000, pp. 275–287.
— Lily E. Kay: Who Wrote the Book of Life? A History of the Genetic Code, Stanford 2000.
— Werner Jung: Von der Mimesis zur Simulation. Eine Einführung in die Geschichte der Ästhetik, Berlin 2000.
— David A. Randall (ed.): General Circulation Model Development, Past, Present and Future. The Proceedings of a Symposium in Honor of Akio Arakawa, New York 2000.


— Klaus Rehkämper and Klaus Sachs-Hombach (eds.): Vom Realismus der Bilder, Magdeburg 2000.
— Agatha C. Hughes and Thomas P. Hughes (eds.): Systems, Experts, and Computers. The Systems Approach in Management and Engineering, World War II and After, Cambridge, Mass./London 2000.
— Manfred Stöckler: "On Modeling and Simulations as Instruments for the Study of Complex Systems," in: Martin Carrier, Gerald Massey, and Laura Ruetsche (eds.): Science at Century's End. Philosophical Questions on the Progress and Limits of Science, Pittsburgh 2000, pp. 355–373.
— Nicola Suthor: "Mimesis (Bildende Kunst)," in: Historisches Wörterbuch der Rhetorik, vol. 5, ed. by Gerd Ueding, Tübingen 2001.
— Andres Lepik: "Mies and Photomontage, 1910–38," in: Mies in Berlin [exhib. cat., New York/Berlin], New York 2002.
— Ali Rahim (ed.): Contemporary Techniques in Architecture. Architectural Design AD, London 2002.
— Ali Malkawi and Godfried Augenbroe (eds.): Advanced Building Simulation, New York 2003.
— Evelyn F. Keller: "Models, Simulation, and Computer Experiments," in: The Philosophy of Scientific Experimentation, ed. by Hans Radder, Pittsburgh 2003, pp. 198–215.
— Bernhard Siegert: Passagen des Digitalen. Zeichenpraktiken der neuzeitlichen Wissenschaften 1500–1900, Berlin 2003.
— Valeska von Rosen: "Nachahmung," in: Metzler Lexikon Kunstwissenschaften. Ideen, Methoden, Begriffe, ed. by Ulrich Pfisterer, Darmstadt 2003, pp. 240–244.


— Paul Humphreys: Extending Ourselves: Computational Science, Empiricism, and Scientific Method, Oxford/New York 2004.
— Gabriele Gramelsberger: "Simulation als Kreativitätspraktik. Wissenschaftliche Simulation als Experimentalsysteme für Theorien," in: Kreativität: Sektionsbeiträge, XX. Deutscher Kongress für Philosophie, Berlin, 26–30 September 2005, ed. by Günter Abel, vol. 1, pp. 435–445.
— Gabriele Gramelsberger: "Die Verschriftlichung der Wissenschaft. Simulation als semiotische Rekonstruktion wissenschaftlicher Objekte," in: Kulturtechnik Schrift: Graphé zwischen Bild und Maschine, ed. by Gernot Grube, Werner Kogge, and Sybille Krämer, Stuttgart 2005, pp. 439–452.
— Christoph Hubig: Die Kunst des Möglichen I. Technikphilosophie als Reflexion der Medialität, Bielefeld 2006.
— Tomás Maldonado: Digitale Welt und Gestaltung. Ausgewählte Schriften, ed. by Gui Bonsiepe, Zurich/Basel/Boston/Berlin 2007.
— Johannes Lenhard, Günter Küppers, and Terry Shinn (eds.): Simulation. Pragmatic Constructions of Reality (Sociology of the Sciences Yearbook), Vienna/New York 2007.
— Georg Vrachliotis: "Der Sprung vom linearen ins kalkulatorische Bewusstsein. Evolutionäre Denkmodelle und Architektur," in: Precisions. Architecture between Art and Sciences – Architektur zwischen Kunst und Wissenschaft, ed. by Ákos Moravánszky and Ole W. Fischer, Berlin 2008, pp. 232–262.
— Michael Hagner and Erich Hörl (eds.): Die Transformation des Humanen. Beiträge zur Kulturgeschichte der Kybernetik, Frankfurt 2008.


ILLUSTRATION CREDITS

Hänsli
Fig. 1 Source: Joachim von Sandrart: Teutsche Academie der Bau- Bildhauer- und Maler-Kunst, vol. 3/2, fol. C, Nuremberg 1774. [ETH-Bibliothek Zurich, Alte Drucke]
Fig. 2 Source: Ernst H. Gombrich: Die Geschichte der Kunst, London 1995 (16th edition).
Fig. 3 Source: The Complete Woodcuts of Albrecht Dürer, ed. by Willi Kurth, New York 1963.
Fig. 4 Source: Andrea Pozzo, ed. by Alberta Battisti, Milan 1996.
Fig. 5 Photo: Margita Wickenhäuser, Mannheim.

Gleiniger
Fig. 1 Source: El Lissitzky: Proun und Wolkenbügel. Schriften, Briefe, Dokumente, ed. by Sophie Lissitzky-Küppers, Dresden 1977.
Fig. 2 Source: Erich Mendelsohn: Amerika. Bilderbuch eines Architekten, Berlin 1926, p. 44.
Fig. 3 Source: Robert Venturi, Denise Scott Brown, and Steven Izenour: Learning from Las Vegas, revised edition: The Forgotten Symbolism of Architectural Form, photographic collage, p. 63, © 1977 Massachusetts Institute of Technology, by permission of the MIT Press.
Fig. 4 Source: Toyo Ito, with contributions by Charles Jencks and Irmtraud Schaarschmidt-Richter, Berlin 1995, p. 88.
Fig. 5 Source: Pavilion, by Experiments in Art and Technology, ed. by Billy Klüver, Julie Martin, and Barbara Rose, New York 1972, p. 122.
Fig. 6 Source: Pavilion, by Experiments in Art and Technology, ed. by Billy Klüver, Julie Martin, and Barbara Rose, New York 1972, p. 87.
Fig. 7 Source: Nature Design: From Inspiration to Innovation, ed. by Angeli Sachs, with essays by Barry Bergdoll, Dario Gamboni, and Philip Ursprung, Baden 2007, p. 66.

Vrachliotis
Figs. 1–2 Source: Hovestadt group, Computer Aided Architectural Design, ETH Zurich, 2007.
Fig. 3 Source: Konrad Wachsmann: Wendepunkt im Bauen, Wiesbaden 1962 (2nd edition), p. 93.
Fig. 4 Source: Konrad Wachsmann: Wendepunkt im Bauen (reprint of the original edition of 1959), Wiesbaden 1962. © Akademie der Künste, Berlin, Konrad-Wachsmann-Archiv.
Fig. 5 Source: Fritz Haller: totale stadt – ein globales modell, Olten 1968, p. 104.
Fig. 6 Source: John von Neumann: Essays on Cellular Automata, ed. by Arthur W. Burks, Urbana 1970, p. 519.
Fig. 7 Source: Przemyslaw Prusinkiewicz and Aristid Lindenmayer: The Algorithmic Beauty of Plants, Vienna 1990, p. 161.
Fig. 8 Source: Steven Levy: Artificial Life: A Report from the Frontier Where Computers Meet Biology, New York 1992, p. 216. © 1992 Stephen Wolfram.


BIOGRAPHIES

—

Andrea Gleiniger is a historian of art and architecture. Since 2007 she has been a lecturer at the Zurich University of the Arts, with a focus on the history and theory of space/scenography. She studied art history, comparative literature, and archaeology in Bonn and Marburg; in 1988 she took a doctorate in art history with a project on guiding ideas in large-scale postwar housing development; from 1983 to 1993 she was curator at the Deutsches Architekturmuseum Frankfurt/Main; since 1983 she has held teaching positions and guest professorships at academies in Karlsruhe, Stuttgart, and Zurich. From 2002 to 2007 she was a research assistant at the ETH Zurich/Chair of CAAD. She is active as a writer, particularly on architecture, urban planning, art, and new media in the 20th century.
—

Gabriele Gramelsberger has been an academic assistant at the Institute for Philosophy at the FU Berlin since 2004. Studied philosophy, political science, and psychology in Berlin and Augsburg. Scholarship with the Department for Theory/Philosophy at the Jan van Eyck Academy, Maastricht. Doctorate in philosophy from the Freie Universität Berlin with a project in the philosophy of science, focusing on numerical simulation and visualization. Selected publications: Computersimulationen in den Wissenschaften. Neue Instrumente der Erkenntnisproduktion, Explorationsstudie, Berlin 2004.
—

Thomas Hänsli is an assistant at the Chair of the History of Art and Architecture of Professor Werner Oechslin, Institute for the History and Theory of Architecture (gta), ETH Zurich. Studied music, architecture, and art history in Zurich. Diploma in architecture, ETH Zurich 1993; academic assistant with various research projects dealing with modern Swiss architecture and the history of architectural theory, most recently with the project Architekturtheorie im deutschsprachigen Kulturraum. 1486 bis 1618/48. Research and teaching activities and publications in the history of architectural theory, and the art and architecture of the early modern period.
—

Erich Hörl has been a junior professor of media technology and media philosophy at the Ruhr University Bochum since 2007, and is director of the international Bochumer Kolloquium Medienwissenschaft (bkm), devoted to investigating technological-medial conditions. Studied philosophy in Vienna and Paris; doctorate in cultural studies at the Humboldt-Universität, Berlin. From 2004 to 2006, assistant for the philosophy of technology at the professorship of philosophy at the ETH Zurich. Selected publications: Die heiligen Kanäle. Historisch-epistemologische Untersuchungen zur archaischen Illusion der Kommunikation, Zurich/Berlin 2005; Die Transformation des Humanen. Beiträge zur Kulturgeschichte der Kybernetik, ed. with Michael Hagner, Frankfurt am Main 2008.
—




Nils Röller is a media theoretician and author. Since 2003, lecturer for media and cultural theory in the area of new media at the Zurich University of the Arts; director of studies in media arts. Studied philosophy, Italian language and literature, and media studies at the Freie Universität Berlin; 1995–99, artistic-academic assistant at the Academy of Media Arts in Cologne; 1996–99, conception and direction (with Siegfried Zielinski) of the Festivals Digitale; 2001, doctorate from the Bauhaus University, Weimar, with a work on Hermann Weyl and Ernst Cassirer; since 2001, project-related collaboration (DFG) at the Vilém Flusser Archiv at the Academy of Media Arts in Cologne; 2002, scholarship from the Institute for Basic Research/ZKM Karlsruhe; 2007, Werkbeitrag (award) for literature from the Canton of Zurich. Selected publications: Ahabs Steuer – Navigationen zwischen Kunst und Wissenschaft, Berlin 2005; "Simulation," in: Historisches Wörterbuch der Philosophie, vol. 9, Basel 1996.
—

Georg Vrachliotis has been a researcher and teaching assistant for architectural theory at the Chair of Computer Aided Architectural Design (CAAD) in the Department of Architecture at the ETH Zurich since 2004. Studied architecture, philosophy, and the history of science. Visiting researcher at the Universities of Bremen and Freiburg, and at UC Berkeley. Primary interest: technical thinking in 20th-century architecture. Current research foci: architecture and cybernetics; Fritz Haller's philosophy of construction. In 2007 he founded the research project "Theory of Technology in Architecture" (Techniktheorie) in Prof. Hovestadt's group at the ETH (together with Oliver Schurer, Vienna University of Technology). Since 2006, lecturing in architectural theory at the Institute for Architectural Theory at Vienna University of Technology.
