Complexity
Context Architecture: Fundamental Concepts Between Art, Science, and Technology
Digitalization has altered architectural discourse. Today, discussions in architectural theory and design are shaped by many new ideas, including some that previously had no meaning in that context, or else very different ones. Increasingly, the conceptualizations and strategies of architectural discourse are molded by influences emerging along the interface joining scientific and cultural images of modern information technology. Against this background, the question arises: on the basis of which practical and, in particular, which theoretical concepts can architecture come to terms with these new technologies, thereby entering into a simultaneously productive and critical dialogue with them? Context Architecture presents for debate a selection of such ideas, all of them central to current discourses. Context Architecture is a collaboration of the Zurich University of the Arts (ZHdK) and Ludger Hovestadt, Chair for Computer Aided Architectural Design at the ETH Zurich.
Available in the series Context Architecture:
Simulation. Presentation Technique and Cognitive Method, ISBN 978-3-7643-8686-3
Complexity. Design Strategy and World View, ISBN 978-3-7643-8688-7
Context Architecture A collaboration of the Zurich University of the Arts (ZHdK) and the ETH Zurich
Complexity Design Strategy and World View
CONTEXT ARCHITECTURE
Edited by Andrea Gleiniger and Georg Vrachliotis
Birkhäuser Basel · Boston · Berlin
Translation from German into English: Ian Pepper, Berlin (except for contributions by Denise Scott Brown, Kostas Terzidis and Robert Venturi, incl. biographies)
Copyediting: Monica Buckland, Basel
Cover and layout design: Bringolf Irion Vögeli GmbH, Zurich
Reproductions and typesetting: weissRaum visuelle Gestaltung, Basel
This book is also available in German: Komplexität. Entwurfsstrategie und Weltbild, ISBN 978-3-7643-8687-0.
Library of Congress Control Number: 2008925219
Bibliographic information published by the German National Library: The German National Library lists this publication in the Deutsche Nationalbibliografie; detailed bibliographic data are available on the Internet at http://dnb.d-nb.de.
This work is subject to copyright. All rights are reserved, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, re-use of illustrations, recitation, broadcasting, reproduction on microfilms or in other ways, and storage in data bases. For any kind of use, permission of the copyright owner must be obtained.
© 2008 Birkhäuser Verlag AG, Basel ∙ Boston ∙ Berlin
P.O. Box 133, CH-4010 Basel, Switzerland
Part of Springer Science+Business Media
Printed on acid-free paper produced from chlorine-free pulp. TCF ∞
Printed in Germany
ISBN: 978-3-7643-8688-7
www.birkhauser.ch
7 Andrea Gleiniger and Georg Vrachliotis: EDITORIAL
13 Robert Venturi: CONTEXT IN ARCHITECTURAL COMPOSITION. EXCERPTS FROM M.F.A. THESIS, PRINCETON UNIVERSITY, 1950
25 Denise Scott Brown: CONTEXT AND COMPLEXITY
37 Andrea Gleiniger: "THE DIFFICULT WHOLE," OR THE (RE)DISCOVERY OF COMPLEXITY IN ARCHITECTURE
59 Georg Vrachliotis: POPPER'S MOSQUITO SWARM: ARCHITECTURE, CYBERNETICS, AND THE OPERATIONALIZATION OF COMPLEXITY
75 Kostas Terzidis: ALGORITHMIC COMPLEXITY: OUT OF NOWHERE
89 Klaus Mainzer: STRATEGIES FOR SHAPING COMPLEXITY IN NATURE, SOCIETY, AND ARCHITECTURE
99 Johann Feichter: COMPLEXITY AND CLIMATE
109 Clemens Bellut: "ACH, LUISE, LASS ... DAS IST EIN ZU WEITES FELD," OR: THE GORDIAN KNOT OF COMPLEXITY
Appendix:
117 Selected Literature
125 Illustration Credits
126 Biographies
EDITORIAL
The Complexity of Complexity
“The complexity of architecture begins with the impossibility of questioning the nature of space and at the same time making or experiencing a real space. […] We cannot both experience and think that we experience,” explains Bernard Tschumi in an interview for the Journal of Philosophy and the Visual Arts.1 To the question: “What would be the foundations of a complex architecture?” he replies: “Architecture finds itself in a unique situation: it is the only discipline that, by definition, combines concept and experience, image and use, image and structure. Philosophers can write, mathematicians can develop virtual spaces, but architects are the only ones who are the prisoners of that hybrid art, where the image hardly ever exists without combined activity.”2 Upon closer examination of the concept of complexity, it soon becomes evident that it has always existed in more than one definition, more than one interpretation, and certainly more than one architectural variant. One can either agree with Tschumi’s discussions of complexity or approach them critically. What seems essential to us in this context are his concluding thoughts: “Architecture is not about conditions of design, but about the design of conditions […].”3 Here, it becomes evident that reflections on the concept of complexity and architecture must eventually go beyond simple terminological definitions. Perhaps even more than other cultural disciplines, architecture is confronted by the most diverse levels of contemporary complexity. Against this background, it is apparently often a question of modeling further levels of complexity, and therefore always of contriving a subsequent world picture, rather than of precisely categorizing or analyzing the existent.
What is required are ways of reading complexity that – despite or even precisely because of their differences – lead unavoidably to questions of the respective systems of reference: which context generates which concept of complexity, and what concept of complexity produces what context?
1 Bernard Tschumi: “Responding to the question of complexity,” in: Complexity. Art, Architecture, Philosophy. Journal of Philosophy and the Visual Arts, no. 6, ed. by Andrew Benjamin, London 1995, p. 82. 2 Ibid. 3 Ibid., p. 83.
When Vitruvius referred to the architect as a “homo universalis,” he provided the basis for the complex self-understanding of architects as generalists. Complexity thereby became a self-evident and fundamental precondition for architecture and urban planning, and for various stylistic developments. The architectural concept of complexity has, however, failed to keep up with the societal complexity propelled forward by processes of industrialization. It has become flattened to the point of formal arbitrariness. Far from embracing complexity, 20th-century architectural Modernism set into motion a programmatic concept of simplification and objectification (“less is more”). Against this background, Robert Venturi introduced the concept of complexity into architectural discourse. Venturi was concerned less with re-establishing the semiotic complexity of architectural form and its history than with programmatically responding to what existed, to concrete reality. The disciplines now referred to as “the science of complexity” developed almost contemporaneously, primarily on the basis of physics. This science informs current confrontations with the concept of complexity, and has created a new linkage between the natural sciences and information technology that has become a central reference for computer-based design concepts. It may seem self-evident to want to reclaim complexity as the fundamental precondition for life in general and for architecture and the city in particular. In the course of the 20th century, nonetheless, classical Modernist strategies of systematization, with their orientation toward scientific models, led to the instrumental-rational objectification of design and planning tasks and to giving priority to the reduction of life’s complexity to existential functions and straightforward rules.
Yet the more the complexity of life and its functional relationships became objects of the strategies of simplification that served technocratic planning processes on the one hand, and of unification through industrial manufacturing processes on the other, the more evident the deficits of this orientation became. Andrea Gleiniger’s contribution illuminates the extent to which this loss of identity became the point of departure for the evolving conceptions of architecture that began to subject the project of Modernism to revisions during the 1950s. She foregrounds the diversity of the answers proffered, all of which – consistently with the logic of Modernism – shared a fascination for abstraction.
This is the case whether these answers drew upon the mathematical and natural-scientific foundations of electronics in relation to the information age; or whether architectural-theoretical efforts culminated in the elevation of geometrical formalization to the highest principle; or whether a new consciousness for the “genius loci” was derived from phenomenological forays into the history of building. Yet in the end, it was Robert Venturi’s provocative plea for complexity and contradiction, developed during the 1950s and published for the first time in 1966, that succeeded in restoring to architectural discourse a concept (and ultimately a perceptual category as well) that at that time was primarily associated with the science of complexity then taking shape against a background of the natural and computer sciences. For this reason, Venturi’s text is remarkable in its plea for a narrative architecture of vividness and symbolicity, a plea located between architectural-historical efforts to secure supporting evidence and everyday anamnesis. Equally remarkable is the way in which he mobilized the term “complexity” – as defined then in the interplay of cybernetics, computer science, and information technology – as a dynamic category aimed at recontextualizing the city and architecture. We consider it a tremendous asset for the present volume that Robert Venturi was prepared not only to make available to us excerpts from his master’s thesis, written at Princeton University in 1950, which formed the cornerstone of Complexity and Contradiction, but to append fresh commentary to these excerpts as well, thereby re-examining his earlier position from a contemporary perspective. 
The article written by Denise Scott Brown for the present volume (which also serves to profile Venturi, Scott Brown and Associates (VSBA)) vividly demonstrates the degree to which it was not solely a question back then of re-establishing semiotic complexity for architectural form and its history, but also of the programmatic response to the pre-existing, of a reality as concrete as it was complex. Of great importance for the discourse of complexity, then as now, is her insistence on the significance of the term “context,” which encompasses both the concrete relationality of the situation and its abstraction in the various categories of cultural and societal conditions. Once the demand for complexity became common cultural property (via the ideological vicissitudes and confrontations of the multifariously dazzling positions of Postmodernism since the late 1970s), having been formulated more or less explicitly in diverse ways in major positions of contemporary architectural discourse,4 it went on to acquire a new dimension in the face of digital information processes. In his essay, Kostas Terzidis emphasizes the potential of the generative procedures made possible by developments in digital computer technology, particularly when it is a question of mastering large-scale building enterprises such as residential settlements or high-rise architecture. These considerations imply the option of conceptualizing new possibilities, striving to become established against the background of highly disparate scenarios of digitalized design rhetoric and concepts. Such a new way of thinking would not only scenarize the “difficult whole” of architecture and its context in perfecting the spectacular through “complicated” forms, but would also recognize its responsibility with regard to context. The distinction between the “complex” and the “complicated” is taken up by Clemens Bellut in a discussion of the concept of complexity that is grounded in philosophical and literary concerns. By illuminating the concept as a topos of laments over modernity, he sheds light on the cultural dimensions of the discourse of complexity. With regard to interpretations of complexity grounded in perceptual psychology and Gestalt theory, Georg Vrachliotis discusses the development of the concept in relation to the history of science and technology. This development, triggered by the penetration of cybernetic models into the technical thinking of architecture around the mid-20th century, meant a stepwise “operationalization of complexity,” an approach that would become increasingly explicit via applications of information technology in contemporary digital architectural production.
In his article, Klaus Mainzer shows how the complexity discourse initiated by Robert Venturi – in spite of or precisely because of its heightened opposition to positivist concepts of technology during the 1950s and 1960s – became analogous to developments and discoveries in the science of complexity. Comparatively new is the interest shown by the natural sciences for engaging in a dialogue with architecture – particularly dialogue that would be something more than a descriptive or metaphorical generation of analogies. An important methodological aspect of
4 One thinks of Bernard Tschumi, cited above, or of architects like Wiel Arets, whose concept of context relates to medical-natural scientific analogies, as suggested by his metaphor of the “virological.” See Wiel Arets: “Een virologische architektuur,” in: de Architect-thema 57, pp. 42–48. Also Wiel Arets Architect: Maastricht Academy for the Arts and Architecture, Rotterdam 1994.
some areas in the natural sciences lies in the grasp of complex realities through mathematical symbols and algorithms, not least in order to render reality comprehensible and manageable. A tremendous challenge in this context is represented by the climate system, into whose complexity Johann Feichter offers viable insight. Feichter’s article presents meteorology as a research field that, through John von Neumann, became an early application area for the computer simulation of “complex systems,” and that continues to unite complexity and simulation like virtually no other discipline. In conjunction with a volume on the topic of simulation, the present collection of essays on complexity forms a prelude to the series Context Architecture.5 This series, which grew out of the intensive collaboration between its two editors, sets out to address fundamental architectural concepts lying between art, science, and technology. Our thanks go to all of our authors for their profound contributions, all of them produced especially for this publication. Our very special thanks also to Prof. Dr. Hans-Peter Schwarz, Founding Rector of the Zurich University of the Arts, and to Prof. Dr. Ludger Hovestadt, Chair for Computer-Aided Architectural Design (CAAD) at the ETH Zurich. Their generous financial support and encouragement regarding content made it possible for our book project to assume its present form. The cooperation between these two institutions also reflects the aim of bringing architecture, technology, art, and science into dialogue, and has answered the ubiquitous call for transdisciplinarity in a singular fashion. Birkhäuser Verlag was ultimately responsible for realizing this publication project, and our very special thanks go to Robert Steiger and Véronique Hilfiker Durand for their patient, competent, and consistently committed editorial efforts. Andrea Gleiniger, Georg Vrachliotis
5 Andrea Gleiniger and Georg Vrachliotis (eds.): Simulation. Presentation Technique and Cognitive Method in the series Context Architecture: Fundamental Architectonic Concepts Between Art, Science, and Technology, Basel, Boston, Berlin 2008.
Robert Venturi CONTEXT IN ARCHITECTURAL COMPOSITION EXCERPTS FROM M.F.A. THESIS, PRINCETON UNIVERSITY, 1950 1 Foreword in 2008 This thesis promoted context as a consideration in architecture. In doing so, it ran counter to the tenets of 1950s Modernism and could be seen as revolutionary for its time. But although its subject is now well accepted and indeed a cliché in our field, I believe its early treatment in this thesis has been forgotten. For instance, I heard a Philadelphia colleague declare recently that present ideas of engaging context as an element that architecture should consider and learn from evolved in the 1970s. But I vividly remember my Eureka response in 1949 when I came across the theory of perceptual context in Gestalt psychology in the Eno Hall library at Princeton, and recognized its relevance to architecture. This was at a time when architects were designing exclusively “from the inside out” and Modernism was promoted as universally applicable – and to hell with that old stuff that was, alas, still around. Hard as it is to remember, this was true with some few exceptions, as when Frank Lloyd Wright connected with the prairie or a Pennsylvania waterfall – with the natural context but never with that lousy architecture that preceded him. It’s nice when a daring idea becomes accepted, but it hurts that it is often misunderstood and misapplied. It should also be noted that the significance of including meaning – not only expression – in architecture was acknowledged here, perhaps for the first time in our age, and that this opened the way for an acceptance of plurality and multiculturalism in architectural approaches to design. This thesis can also be seen as a foundation for my first book, Complexity and Contradiction in Architecture (Museum of Modern Art, 1966), which has promoted the importance of context to a wide and lasting audience. It also forms a basis for the work of Venturi, Scott Brown and Associates over the 50 years that followed it.
1 A more complete version of my thesis can be found in my book, Iconography and Electronics Upon a Generic Architecture: A View from the Drafting Room, Cambridge/Mass. 1996, pp. 335–374, ©1996 Massachusetts Institute of Technology, by permission of the MIT Press.
Sheet 19: The design problem was a new chapel for the Episcopal Academy, then in Merion, Pennsylvania. The chapel is situated between two existing mansions that housed the Lower and Upper schools separately. The chapel does not fall within one of the parts, but outside of the parts and within a space among them. This causes the Lower and Upper schools to become oriented toward each other and toward the interior of the site rather than toward the outside highway as before.
The body of the thesis was composed on 25 sheets, which progress from introductory statements, research, and analysis, to formation of a thesis and testing it in the design of a chapel for the Episcopal Academy, then situated in Merion, Pennsylvania. Included here are general introductory remarks, explicit descriptions (excerpts from Sheets 11–13), and portions of the chapel design (Sheets 19 and 20). Intent The intent of this thesis problem is to demonstrate the importance of and the effect of setting on a building. It considers the art of environment; the element of environment as perceived by the eye. Specifically it deals with relationships of the part and the whole and with what architects call site planning. It attempts to evolve principles concerning these issues and ways for discussing them. Implication Its implication for the designer is that existing conditions around the site that should become a part of any design problem should be respected, and that through the designer’s control of the relation of the old and the new he can perceptually enhance the existing by means of the new. Content The thesis of the problem in short is that its setting gives a building expression; its context is what gives a building its meaning. And consequently change in context causes change in meaning. Sources The sources of my interest in this subject are relevant. An early and direct one was my impatience with architecture design problems produced by the Beaux Arts Institute of Design of New York which I did as a student, which frequently lacked indications of the setting or background of the building to be designed or at best indicated merely the physical dimensions of the site. This implied for me a dangerous assumption that the building could be designed only for itself. Another important source lay in certain experiences I had and my interpretation of them on a trip to France and Italy several summers ago. 
This was my first European trip and my approach was one of keen curiosity to discover fact and compare it with anticipation (to paraphrase George Santayana). My anticipation
Sheet 20: An elevation drawing of the campus site shows the new chapel acting as a hyphen between separate parts.
was based on images derived from the usual graphic and photographic means of representing buildings. Invariably this comparison caused surprises. From these reactions and their implications I induced my thesis. The surprises as I analyzed them seldom resulted from the difference caused by the extra dimensions of space and time but from the opportunity to include and relate the individual building and the setting, to perceive in a perceptual whole. This first opportunity for an American to experience characteristic Medieval and Baroque spaces as wholes, especially those derived from piazzas, instilled an enthusiasm for them which made subsequent library research on the Roman ones extensive and stimulating. One last source, a non-empirical one, was the subsequent discovery of Gestalt psychology as a necessary basis for a discussion of perceptual reactions and its usefulness for providing a precise vocabulary for an architect who hesitates to use some of his own worn out words. Among such words are “unity” which has lost precise meaning in criticism and “proportion” which in its usual application Frank Lloyd Wright, for one, has amusingly rendered useless. Method The method of organization and presentation of the material is as follows: The problem is essentially one great diagram so that the sizes and positions of words, symbols, plans, illustrations, etc. convey meaning as much as their symbolic denotations do. The problem as a diagram at the beginning consists of the two thesis statements mentioned above in architectural terms, accompanied, as paragraph headings to the left, by equivalent general statements in psychological terms, followed by amplification of these statements by means of a series of diagrams (Sheets 3 and 5). The following series of sheets (6–15) represents the argument of the thesis via analyses of historical architectural examples in Rome and of contemporary domestic architecture. 
The final series represents the application of the thesis to the design problem, the design of a chapel for an Episcopal country day school for which the rest of the thesis constitutes approach and research. Furthermore, the relation between a diagram and the diagram below it, or a plan and the one below it, as in the Campidoglio of 1545 and the Campidoglio of 1939, is the same as the relation between initial statements 1 and 2. The titles and subtitles of the various sections and also the series of colored round, square, and diamond symbols establish similar secondary horizontal axes of influence and
Sheet 11: Historical analysis shows how similar domed buildings change their appearance and meaning within different urban contexts. Excerpted here are drawings and photographs of the Roman Pantheon (top) and Philadelphia Girard Trust Bank (bottom).
equivalent relationships. The copies of engravings and the photographs of the examples from various views amplify their meaning, and their relative positions and connections via strings indicate equivalencies among illustrations. This form of organization facilitates the integration of the material. Sheets 1 & 2 The subtitle of this part of the research might be: “The recognition of space and form as qualities of a perceptual whole.” One definition of meaning is “Meaning is what one idea is as context of another idea.” Its context gives an idea its meaning, and this statement in architectural terms becomes Statement 1 in the diagram. Its context gives a building expression. A building is not a self-contained object but a part in a whole composition relative to other parts and the whole in its position and in its form. Statement 2 is a consequence of Statement 1: Just as a change in context causes change in meaning in terms of ideas, change in the setting of a building causes change in expression of a building. Change of a part in its position or its form causes a change in other parts and in the whole. These two variables are related to a constant – to psychological responses, to the observer’s visual reactions, to his limit of attention, to his situation, etc. By their adjustment quality can be attained among these relationships.2 There are properties of the whole which are distinct from properties of the part. The whole composition may possess different degrees of articulation. The more articulate the whole, the more does a change in one part affect the other parts and the whole. […] Sheet 11 One reason for the frequent failures of architectural eclecticism lies in its disregard of context – visual, historical, and functional. A comparison of some Pantheons (see Sheet 11) is one of many which could be made and calls to mind the experience derived from Emerson’s shells in his Each and All:
2 The diagram which appears under sheets 1 and 2 consisting of quotations compares the psychologist’s and the architect’s approaches to this idea.
Sheet 13: “Before” and “after” site sections show how Frank Lloyd Wright’s Johnson House in Racine, Wisconsin, gave its prairie site a positive horizontal quality.
The delicate shells lay on the shore; The bubbles of the latest wave Fresh pearls to their enamel gave… I wiped away the weeds and foam, I fetched my sea-born treasures home; But the poor, unsightly, noisome things Had left their beauty on the shore… All are needed by each one; Nothing is fair or good alone.3 Perhaps the setting of the Pantheon is not ideal, but the small scale and close proximity of the surrounding buildings contribute to its magnitude and their angularity reinforces the dominance of its dome. Above all the shape of the enclosing piazza sets it off as a central-type building, the climax of a whole urban composition which its circular and domed form and its function demand. McKim, Mead, and White’s Philadelphia Girard Trust Bank building competes with skyscrapers, becomes one of many units in a row within a gridiron street pattern, which position denies its essential centrality. The innovative square plan below the dome partially ameliorates this condition. Jefferson’s University of Virginia rotunda, if eventually inconvenient as a library, is successful in expressing its centrality within a whole composition, as it is fortified perceptually by its subordinate flanking buildings and by its hilltop situation. A concept of the library as the heart of a university is thereby distinctly reinforced. This thesis and its application fall within a so-called organic approach in architecture but the acknowledgement of context is not antithetical to a Classical tradition in architecture. The definition “meaning is what one idea is as context of another idea” could directly represent a Classical idea of proportion. The Classical concept does recognize context within composition – that of the building as a system of relationships of geometric shapes. But in its concept of universality and lack of emphasis on natural and architectural setting, the Classical approach disregards context from an organic standpoint.
3 Ralph Waldo Emerson: Early Poems of Ralph Waldo Emerson, New York, Boston 1899. It was my mother who suggested this quotation as an appropriate analogy for my thesis.
Insofar as it is Platonic and Neo-Platonic, the Classical tradition in this consideration of context differs from the organic in its dogma. One contextual relationship or system of proportions is constantly superior to others and universally applicable, and this precludes acknowledging the varieties of settings which are inevitable and which the organic approach can exploit. Sheets 12 & 13 Frank Lloyd Wright often mentions the important effect of its site on the design of a building. But this thesis maintains that this effect is reciprocal. And in his Autobiography Wright does acknowledge this idea specifically in reference to his Johnson house in Racine (“Wingspread”): “The house did something remarkable to that site. The site was not stimulating before the house went up ... charm appeared in the landscape.”4 The illustrations substantiate that this prairie site, negative in its aspect, later acquired a positive horizontal quality by means of the inclusion of the complementary form of the house which is sympathetically horizontal for one thing and by the use of its materials which are analogous to the site in texture and color (Sheet 13). This kind of building can exemplify the country house situated alone which, by means of its relation to its setting, becomes a part in a perceptual whole.
4 Frank Lloyd Wright: An Autobiography, San Francisco 1943, p. 478.
Denise Scott Brown and Robert Venturi, 1968.
Denise Scott Brown CONTEXT AND COMPLEXITY 1 Walking in Amsterdam along a street of 19th-century townhouses, I noticed a single house of 20th-century origin. Its date was hard to tell. Its lack of ornament and the proportions of its windows suggested the 1930s, but it could as easily have been a 1950s or even 1960s version of Modernism. Then I saw that the fanlight over the front door was copied from the building next to it. This signaled a 1980s Postmodern provenance. You can tell a PoMo building because it borrows from the building next door. A Modern architect would have scorned to do this. Postmodernism initiated a renewed discussion of context in architecture, one that concerned itself largely with borrowing from historical architecture and with fitting in to an existing context. I aim here to broaden the discussion, to give context a wider context. I count myself lucky that, as I started architecture school, I spent time in wilderness areas where I could see no sign of human beings. Looking at the solitude of nature, I felt any building, no matter how great, would be an intrusion. Here, it seemed we human beings should nestle into the landscape, become part of it, and leave no trace. This was a 19th-century Romantic vision to be sure, but the ecological movement has revived the concept and demanded we pay it heed. In any case, for me, consideration of context should include a no-build datum. Context in the Modern movement Although the term was not heard in architecture before the 1970s, the early Modern movement had its own debate with context; it just did not use the word. I was taught in the late 1940s that I would have to decide whether to design buildings that stand out from the landscape, like Le Corbusier, or enter into it, like Frank Lloyd Wright. Context for my teachers was the landscape. 
It was seen as something the building acted upon, by either standing out or nestling in, but not as an element in its own right having an ongoing and changing dialogue with the building.
1 This article originally appeared as “Context in Context,” Chapter 8 of Architecture as Signs and Systems for a Mannerist Time, by Robert Venturi and Denise Scott Brown (Cambridge/Mass. 2004) and is, in turn, an enlargement upon thought presented in “Talking about Context,” in: Lotus 74, November 1992. The illustrations were excerpted from “Essays in Context,” Chapter 9 of the same book.
CONSEIL GÉNÉRAL, THE HÔTEL DU DÉPARTEMENT DE LA HAUTE-GARONNE, TOULOUSE, FRANCE

The arrangement of buildings derives from our analysis of the urban patterns of its context—the Canal du Midi, the Minimes bridge, the Avenue Honoré de Serres, small perimeter streets, and commercial nodes directly south and north. The decision to organize the building around a pedestrian way connecting the nodes diagonally across the site was more contextual than we realized. Late in the construction process, we discovered there had once been a road where this way now runs. A second contextual challenge was to insert a large, civic building containing the offices of regional government into a delicate, small-scale part of the city—a vital, disorderly place full of cars and alleys, modest private buildings and enchanting vistas. Our decision to make two parallel buildings, keep their heights low, and maintain their bulk within the interior of the site was related to the scale of this environment.
The dictionary agrees. It defines con-text as "around a text or discourse." The center is the text. The definition suggests that what surrounds the center is inert. This view of context as passive background was transferred by Modern architects to urbanism. The early Moderns recommended clearing the existing city and replacing it with parkland, from which their glass towers could rise unencumbered. Later, landscape became "townscape," and Modern buildings were expected to achieve a unity with the existing city despite being in contrast with it. The addition to the Gothenburg Law Courts by the architect Gunnar Asplund was of interest to architects of this mind because it adhered to the dimensions and maintained the facade plane of the existing building, yet was starkly Modern, a frame building that stood out uncompromisingly from the masonry walls beside it.

A second strain in Modernism produced prescriptions for "good manners in architecture."2 Manners required that materials, proportions, cornice heights, wall-to-window ratios, and other regulating lines of the buildings next door be maintained in the facades of the new building, but that they be translated into a Modern idiom politely analogous to the historical surroundings. Although my generation of students called this attitude "ghastly good taste," it was widely held in Europe in the early 1950s (and remains the preferred approach to building on many US university campuses in the early 2000s).

During the post-World War II rebuilding of European cities, the weight of tradition was a strong force to contend with. Reactions to it spanned from complete restoration of what had been bombed to a total turning aside into contrasting forms, with various attempts to accommodate in between.
In America the need to tie into the existing city was less fiercely felt than in Europe, and for the most part US urban renewal in the 1950s followed Le Corbusier’s paradigm of the gleaming towers, though they were more usually surrounded by city streets or parking lots than the parks of the Ville Radieuse. Reaction to this type of urban renewal played a part in the rising concern with context among American architects and urbanists in the 1970s.
2 A.T. Edwards: Good and Bad Manners in Architecture, London 1944, was a source for this approach, although Edwards himself was not a Modernist.
In Toulouse, even the Gothic architecture is of red brick. It is a city of strong shadow, bright sunlight and intense blue sky. In the immediate context of our site, buildings are of stucco and limestone intertwined with brick; the overall effect is not as red as at the city center. Therefore our complex mingles brick and limestone, but its internal walkway and entry points are more red than white. At the main entry, where the Pont de Minimes joins the Avenue Honoré de Serres, we commemorated columns that had once marked this gateway. The building contains shifting scales representing broad governmental relationships. Within the internal walkway, metal and glass passarelles join the building's two wings and stand in contrast to the brick. All day, in changing ways, they reflect the court and the sky. People can be seen above, walking the passarelles, and below them are long vistas to buildings in the neighborhood.
An interactive model

Robert Venturi wrote about context in 1950. He learned from Jean Labatut at Princeton in the 1940s that harmony could be achieved in architecture through contrast as well as analogy, and he developed a taste for a blend of standing out yet melding in, what he later called the "gray tie with red polka dots."3 Bob came to believe that focusing squarely on context could enrich the design of buildings – that context had "meaning," and that each increment of new building could reinterpret and add further meaning to its surroundings. He quoted Frank Lloyd Wright: "The house did something remarkable to that site. The site was stimulating before the house went up—but like developer poured over a negative, when you view the environment framed by the Architecture of the house from within, somehow, like magic—charm appears in the landscape and will be there wherever you look. The site seems to come alive."4

"See context as alive and help it to be alive" seems appropriate advice to designers in the urban landscape. There are a million ways to do it. In aesthetic terms, interactions with context may be as literal as borrowing from the building next door; yet allusions can also be as fleeting as the flash of fish in water. They will probably be there whether we are conscious of them or not, but paying creative and knowing attention to context can make a richer, more relevant architecture. Overattention, however, can produce an obsequious blandness. The architect who cares not at all for context is a boor; the one who cares only for context is a bore.

Seeing context in context

There is more to context than physical fit. At Penn in the 1960s, social scientists and planners included the vibrant although suffering social city around us in the definition of urban context. At the same time, Pop Art was giving new meaning to the immediate objects of our daily lives, often by changing their contexts.
Under these influences, our "Learning from" studies took a more muscular view of context than had the Moderns. Our analyses started with the Las Vegas Strip, which we saw as the archetype of the commercial strips and auto rows to be found at the outskirts of every American city.5

3 Robert Venturi: "Context in Architectural Composition," Master's thesis, Princeton University, 1950. See also "An Evolution of Ideas," Chapter 1 of Architecture as Signs and Systems, op. cit.
4 Frank Lloyd Wright: An Autobiography, San Francisco 1943, p. 478.
5 "Learning from Las Vegas," a research studio conducted at Yale University, fall 1968, published as Learning from Las Vegas, Cambridge/Mass. 1972, revised 1977.
The building’s exterior walls and windows have learned from Toulouse and, as in the historic city, colonnades along the central way bring strong, dark shade to the court.
As architects studying our craft, we analyzed the physical aspects of signs and buildings on the Strip, but our view was broader than the visual, and included the economic, cultural, symbolic, and historical context. Our “Learning from Levittown” study that followed stressed the social and cultural dimensions of housing and examined how these affected its imagery and symbolism.6 A changing context with a life of its own Our studies depicted contexts much larger than the buildings they surrounded. These had a vital life of their own, arising from forces that caused them to form and reform continuously – forces that would act on the individual project as well. Even if the word “context” meant no more than being with or around something, there was needed, nonetheless, an active engagement between object and context. The designer of a building or complex had the opportunity of entering into the changing context, using and adapting its meanings in the individual project and, in so doing, changing the context once again. And both would go on changing forever. An examination of fluctuating context might deliver principles and guidelines for design, to be learned from – but polka-dot fashion, not followed slavishly. In the city, shifting patterns of activities play out as changes in both buildings and their contexts, over time. Each may change at a different rate, and this may bring about jumps in urban scale. Consider the small-town church whose tower dominates the landscape. A hundred years later it is surrounded by skyscrapers. In the process of urban change, large new buildings replace the village. This causes discontinuities that may be unsettling; yet vital cities change – background becomes foreground, history gives way to the present. The Renaissance and Baroque architecture we admire did not, for the most part, arise from green fields; medieval buildings were removed to erect them. 
In the tug between change and permanence there are surely few easy answers, but understanding the processes of urban change and the shifting relations between the ever-changing project and its ever-changing contexts may help.
6 Yale University, spring 1970; it was published as "Remedial Housing for Architects' Studio," in: Venturi, Scott Brown and Associates: On Houses and Housing, London, New York 1992, pp. 51–57.
The building’s shape and siting result in intriguing peeks at it from small streets and broad but untidy vistas from busy arterials.
Cultural context, cultural relevance

From the cultural context, architects can learn ways of doing things at a point in time. The conventions of building – how walls, roofs, and openings are made – can be discovered from the minor architecture of the everyday city around us. These conventions, we believe, should be followed for most building, because they are practical and culturally relevant. And, because in our time and place they derive from Modern architecture, we should, in general and in the major portions of our buildings, follow Modern conventions.

On the basis of cultural relevance, we question PoMo architects' tendency to borrow indiscriminately from what is "next door," either literally or metaphorically—that is, from ideas that are intellectually or aesthetically proximate. For example, some early PoMo borrowings were from Ledoux, a French 18th-century architect venerated by Modernists in the 1950s, probably because his architecture looked like that of the Brutalists. But is Ledoux's work relevant to the symbolic or functional requirements of residential architecture in the United States? To us, Palladio seems a more culturally appropriate source, given the history of architectural influence between Italy and England and later between England and America. Our analyses of the sources of American housing images and symbols placed Palladio at the center in both Levittown and the Hamptons on Long Island. Similarly, why did the new office buildings of the 1980s in Chicago borrow themes from Sullivan's theater there? Wouldn't the 19th- and early 20th-century office buildings of the Chicago School have made a more relevant and more practical model?

Deft allusions

This is a separate argument from the one on borrowing at all and on whether and how it should be done.
Although we feel borrowing – allusion – in some form or other is almost inevitable and often desirable, we criticize the heavy-handed literalness of PoMo borrowing and have looked for something lighter, and more compromised, for ourselves. For example, our extension to the London National Gallery does borrow from the building next door, because it is adding to it. But despite its allusion and inflection to its historical context, the Sainsbury Wing is a modern building inside and out. It uses modern structural methods and follows modern conventions in the majority of its details and spaces. Its circulation patterns and the subdivision of its spaces in plan are in important ways different from those of the historical museum it extends. Its allusions are responses to
both its context and its contents – Early Renaissance paintings – and they make modern use of traditional themes. For example, the syncopated rhythms and juxtapositions of columns and openings on the facade have no equivalent in traditional architecture. As latter-day Modernists (not Post- or Neo-), we have included communication and allusion as part of the functioning of buildings, but we try to ensure that our allusions are representations, not copies, of historical precedent, that they don't deceive. You know what the real building consists of beneath the skin or peeking out from behind the facade. Behind and around the decoration, the shed is always present; the deceit is only skin-deep, and sometimes the "skin" is a surface of light without depth.

Broadening the concept

In proposing that we consider the idea of context in its context, I've suggested that the concept may be inadequate to our tasks as architects and urbanists. In its strict definition it's too object-oriented, too static. Therefore I've tried to augment the definition to maintain the value of the concept. That, in turn, has associated context with the idea of "patterns and systems."7 The broader mandate of putting context in context, then, is to work within the fluctuations of the economic, cultural, and social life around us, to understand its systems and respect the patterns they form. We feel all buildings, even "background" buildings, should add to their context, although the appropriateness of what they add might be subject to discussion. And although some architecture stands too proud, each building should make its context better than it found it. Contextual architecture, then, becomes architecture that holds its own aesthetically and functionally within the natural and human environment, and contributes over time to its changing, pulsating patterns.
7 Described in "Architecture as Patterns and Systems; Learning from Planning," Part II of Architecture as Signs and Systems, op. cit.
Andrea Gleiniger
"THE DIFFICULT WHOLE," OR THE (RE)DISCOVERY OF COMPLEXITY IN ARCHITECTURE

In 1959, the American design team of Charles and Ray Eames created a sensation with their innovative media installation inside Buckminster Fuller's geodesic dome for the American pavilion at the Moscow World's Fair. On seven monitor-shaped screens, they presented a large-scale collage composed of film sequences and images. This multi-vision show bore the title Glimpses of the US; its theme was the American way of life [Fig. 1].

The installation was an event, not just politically, but also in the realm of media technology. Less than 15 years after the end of World War II, that conflict's achievements in the areas of information and media technologies were taken up to great visual effect to create a civilian scenario, albeit one that – with the Cold War in its heyday – was not entirely pacific in character. Alongside the direct derivation of their installation from the configurations of simultaneous transmission and surveillance found in the war room (as well as in the control rooms that had become a ubiquitous element of everyday life) was a cinematic play using the latest observation technologies. These technologies were especially piquant in light of the Sputnik mission, completed successfully by the Soviet Union two years earlier.1

Information Design: Strategies of Visualization for the Computer Age

With Glimpses, the Eameses marked out a territory they would henceforth occupy as would virtually no others: the development of strategies of visualization for the information and computer age. By means of a wealth of installations and films,2 they not only explored the potentialities of the media; their contents too emerged from the logic of the media. As early as 1952, they had used animation with playful seriousness to explain Claude Shannon's information theory (1949). In 1964,
1 Cf. here especially: Beatriz Colomina: "Enclosed by Images. The Eameses Multimedia Architecture," Grey Room 02, MIT, Cambridge/Mass. 2001, pp. 6–29. Includes additional bibliographic references for the "Eames complex."
2 The Eameses' films are now published as a filmography encompassing 5 DVDs. See: http://www.eamesoffice.com/index2.php?mod=filmography.
Fig. 1: Ray and Charles Eames, multiscreen installation Glimpses in the American Pavilion (Buckminster Fuller) of the Moscow World’s Fair, 1959.
at the New York World's Fair, they took up and elaborated the work they had done for the Moscow Pavilion: in an installation consisting of 22 monitors of various sizes, they staged the multi-vision show Think3 in the pavilion of IBM, the American office equipment and computer manufacturer. They highlighted the significance and ubiquitous quality of the computer as a "machine for mastering complexity," demonstrating how it had long since begun to intervene comprehensively in everyday life by systematizing its logical sequencing [Fig. 2].

Consistently, the Eameses experimented with scenographic and medial visualizations of the mathematical and natural-scientific fundamentals of the electronic age that were located in the interplay of cybernetics, information technology, and computer science. This experimentation culminated in the exemplary installation A Computer Perspective of 1971. The "History Wall" they installed there was a kind of display, with its linear and spatial narrative mode concentrated in a multilayered, three-dimensional grid. The result was a complex configuration that seemed to foreshadow the nonlinear, hypertextual structures of the computer age: a metaphor for the complexity of memory (of history), and for the complexity of the processes represented by means of computers.4

Oscillating between the complexity of simultaneous perceptual experiences and the synthesizing complexity of mathematical procedures, the Eameses' films and exhibition concepts were a new conceptualization of information design. They developed a visionary concept of information architecture,5 which had been
3 View from the People Wall (planned for vols. 7–8 of the Eames filmography) presents a summary of Think.
4 "Thus a visitor moving from one place to another in front of the History Wall would find that certain objects, graphics, and comments would disappear from view, while others would appear, providing a fluid, constantly shifting kaleidoscopic effect, a kinetic equivalent of the actual passage of time correlated with the viewer's movement. In this way the exhibit technique reproduced the complexity and interaction of the forces and creative impulses that had produced the computer." Bernhard Cohen, "Introduction," in: Charles Eames and Ray Eames: A Computer Perspective. Background to the Computer Age, New Edition, Cambridge/Mass., London 1990, p. 5.
5 Colomina 2001, p. 23.
Fig. 2: Ray and Charles Eames, multiscreen installation Think at the IBM Pavilion of the New York World’s Fair, 1964.
anticipated earlier in Classical Modernism, in particular by the advanced exhibition designs of figures such as László Moholy-Nagy and Herbert Bayer.6

Between Information Technology and Pop Art

While the Eameses were exploring medial strategies for visualizing the information age, and pioneering the technological logic of modernity, an architect named Robert Venturi was drawing very different conclusions. There can be little doubt that he was familiar with the foundational role that discourses of complexity had in computer science and information technology. Right at the beginning of the "gentle manifesto" that opens his disputatious and combative plea for "complexity and contradiction" [Fig. 3] in architecture, published for the first time in 1966,7 Venturi states: "Everywhere, except in architecture, complexity and contradiction have been acknowledged, from Gödel's proof of ultimate inconsistency in mathematics to T.S. Eliot's analysis of 'difficult' poetry to Joseph Albers' definition of the paradoxical quality of painting."8
6 With good reason, Beatriz Colomina makes a connection to Herbert Bayer's optical investigations. As early as 1930, on the occasion of the German contribution to the 20e exposition des artistes décorateurs français, which was organized by the Bauhaus and took place at the Grand Palais in Paris, Bayer configured the photographs of the exhibited architectural examples not parallel to the wall, but at various angles to it. This arrangement experimented with the expansion of the field of vision. It also became a metaphor for the multi-perspectival vision repeatedly thematized as the founding experience of modern perception. See, among others, Wulf Herzogenrath: "Ausstellungsgestaltung," in: Bauhaus Utopien. Arbeiten auf Papier, ed. by Wulf Herzogenrath (in collaboration with Stefan Kraus), exhib. cat., Stuttgart 1988, p. 194. Bayer seems to have continued these investigations in a surviving drawing dated 1938. For the same exhibition, László Moholy-Nagy conceived and designed a room on the theme of light, which seems to have provided the impulse for the "room of the present" commissioned by Alexander Dorner for the Hanover Museum, which he directed. With the concept for this unrealized "demonstration room" for the various fields of experimentation of the new design, Moholy-Nagy gave form for the first time to a vision of a media room in the modern sense. See, among others, Veit Loers: "Moholy-Nagys 'Raum der Gegenwart' und die Utopie vom dynamisch-konstruktiven Lichtraum," in: László Moholy-Nagy, exhib. cat., Stuttgart 1991, pp. 37–51.
7 Complexity and Contradiction in Architecture arose out of the premises Venturi set down as early as 1950, in his master's thesis at Princeton, an excerpt of which is published for the first time in this volume: see pp. 13–24.
8 Robert Venturi: Complexity and Contradiction in Architecture, New York 1966, p. 22.
Fig. 3: Robert Venturi: Complexity and Contradiction, title page of the original edition (Museum of Modern Art, New York 1966).
Yet, unlike the Eameses, Venturi was not interested in technological structures and processes as foundations for systematizing and optimizing our perceptions of the world around us. He was interested instead in the symbols, images, and sciences of the modern information society. And from the very beginning, he set his sights on this society's reverse side, with the following appeal: "Everything is said in the context of current architecture and consequently certain targets are attacked – in general the limitations of orthodox Modern architecture and city planning, in particular, the platitudinous architects who invoke integrity, technology, or electronic programming as ends in architecture, the popularizers who paint 'fairy stories over our chaotic reality' and suppress those complexities and contradictions inherent in art and experience."9

Countering the dogmas of abstraction, Venturi made a plea for a new symbolicity and graphic vividness. This is more than just a provocation against the logic of Modernism. The conception of complexity associated with his position delimited itself explicitly against all the diversifications of the Modernist project, which had been characterized beginning in the mid-1950s, both in Europe and the US, in a more-or-less obviously contradictory way as the "pathos of functionalism."10 Through a multifaceted panorama of references drawn from architectural history, Venturi aimed his polemic against this "pathos." In ways that were knowledgeable, polemical, disrespectful, and ironic, he battled the orthodox heritage of the "International Style" modeled by Philip Johnson and Henry-Russell Hitchcock on the basis of the spectrum of European Modernist architecture presented at their legendary Museum of Modern Art exhibition of 1932.11 Using a wealth of
9 As indicated by a footnote, Venturi's mention of "popularizers who paint 'fairy stories' over our chaotic reality" is apparently an allusion to Kenzo Tange, and hence to the Japanese Metabolists as a whole. Venturi: Complexity, 1966, p. 21.
10 Heinrich Klotz: Das Pathos des Funktionalismus. Berliner Architektur 1920–1930, Berlin 1974. An event organized by the Internationales Design Zentrum Berlin. Contribution to the Berliner Festwochen 1974.
11 Henry-Russell Hitchcock and Philip Johnson: The International Style. Architecture since 1922, New York 1932. It can hardly be accidental that we have the same museum to thank for the publication of Venturi's polemic, a book that seemed especially well-suited to launching a new series focusing on contemporary architectural discourse. While Arthur Drexler makes no mention of this circumstance in his preface, Vincent Scully, in his foreword to this first edition of Venturi's text, positions it in a sequence that includes Le Corbusier's brilliant pamphlet Vers une architecture of 1923. From today's perspective, and in contrast to Hanno Walter Kruft, who believed that such comparisons significantly overestimated
architectural examples drawn from history and the present day, Venturi was explicit about the principal concern driving his plea for the re-establishment of complexity: the goal was to restore to both art and everyday life the rights they had been deprived of by the strategies of simplifying systematization that governed the abstract model of architectural design prevailing in Modernism. Venturi’s concept emerged within the field of tension of a definition of architectural and art history that appealed in the most general terms for the “human rights of the eye,”12 while drawing support from a “school of perception” that emphasized iconographic readings of architecture. Instead, this concept connected a consciousness of the cultural deep space from which architecture and urban reality emerges with a programmatic turn towards perceptions of the everyday and the ordinary that was thematized by Pop Art. The project of Modernism was based on the intention of integrating art into life. With Pop Art, on the contrary, life was to have been introduced back into art, which no longer shrank from the banal and the everyday, but which was capable of reshaping and defamiliarizing them through wit and irony. The special quality of Venturi’s plea for complexity and contradiction in architecture is found not only in the postulate (located between a securing of evidence via architectural history13 and the anamnesis of everyday culture) for a narrative architecture and for symbolicity. It is also remarkable that, with the concept of complexity, a category is now (re)introduced into architectural discourse that was primarily of significance in relation to the science of complexity, and that had formed itself against the background of the natural and computer sciences. 
the book’s value and theoretical level, “it is possible to emphasize again with Heinrich Klotz the fact that Scully was essentially correct – even if his estimation of the significance of Venturi’s arguments was distorted on the one hand by critiques of Postmodernism’s sometimes rapid decline into superficial and theatrical poses, and on the other by discrepancies between Venturi’s theoretical positions and a critical examination of the architecture realized by Venturi, Scott Brown and Associates (VSBA).”
12 Werner Hofmann, Georg Syamken, Martin Warnke: Die Menschenrechte des Auges. Über Aby Warburg, Frankfurt am Main 1980. Mentioned only briefly here is the role played by the art history shaped by Aby Warburg and the Warburg Institute in London, especially by representatives such as Rudolf Wittkower and Richard Krautheimer, who got to know and to esteem Venturi in the 1950s and 1960s during stays at the American Institute in Rome; see Richard Krautheimer: Ausgewählte Aufsätze zur europäischen Kunstgeschichte, Cologne 2003, p. 45.
13 American architectural critic and theoretician Alan Colquhoun has observed that Venturi left open the question of the specific types of complexity he invoked: the mature multiplicity of historically conditioned complexity, or the inherent complexity of a design concept that positioned itself programmatically in a multiple framework: “Venturi ignores [the fundamental distinction] here between complexities which are intentional and those which are results of accretion over time. The focus of the text oscillates between the effect of a building on the perceptions of the observer and the effect intended by the designer, as if these were historically the same thing. The book is thus a plea for complexity in general without suggesting how different kinds of complexity are related to particular historical circumstances, and therefore how the examples might apply to particular circumstances of our own time.” Alan Colquhoun: “Sign and Substance. Reflections on Complexity, Las Vegas and Oberlin” (1973), in: Oppositions 14, 1978, pp. 26–37. Reprinted in Alan Colquhoun: Essays in Architectural Criticism. Modern Architecture and Historical Change, with a foreword by Kenneth Frampton, London 1981, pp. 139–151, here: p. 140. Colquhoun draws the legitimate conclusion that, on the one hand, this procedure allows Venturi to compare contemporary with historical works of architecture, but that this means, on the other hand, that it is not Modernist architecture that is responsible for lacking complexity, but rather the respective design concept, the circumstances, the capacities of the architect, and so forth.
14 See Klaus Mainzer in this volume: “Strategies for Shaping Complexity in Nature, Society, and Architecture,” pp. 89–97.

In retrospect, it may seem self-evident to want to reclaim complexity as the fundamental precondition of life in general, and of architecture and urban reality in particular. At that time, in fact, the definitional authority of complexity was located indisputably in the gravitational fields of cybernetics, computer science, and information technology.14 Venturi, then, set himself up not only in direct “contradiction” to the postulates of Modernism, with their orientation towards simplification, but also positioned himself in opposition to all of the transformations of postulates involving such universal claims to validity, the revision of which, beginning in the 1950s, took the form of a multifarious spectrum of technological, sociological, phenomenological, formal-geometrical, and other interpretations.

Vividness versus abstraction

With their orientation toward scientific models, Modernism’s strategies of systematization led to the instrumental-rational objectifications of planning tasks in which the reduction of the complexity of life to essential functions and straightforward rules was elevated to the highest postulate. For the language of architecture, this meant the dominance of standards, types, and norms. Classical Modernism had largely procured its societal legitimation through demands for the building and creation of residential models oriented to an existential minimum. Just as architectural form was conditioned by the postulates of the International Style involving decontextualized abstractions independent of location, this legitimation forms the basis of strategies designed to unify architecture and urban planning. The legendary slogan “less is more,” through which Mies van der Rohe had passed from the programmatic ban on ornamentation to the geometric abstraction of primary forms, had become a kind of law – or at least a catchy and popular recipe. And yet the more the complexity of living and functional contexts had become the object of strategies of simplification by technocratic planning processes, on the one hand, and of a unification through manufacturing processes on the other, the more evident the deficits of these approaches would become. A loss of identity, in particular, became a point of departure for the new conceptions of architecture that began to subject the project of Modernism to revision, albeit in highly diverse ways. An initial programmatic expression of these revisionist claims arrived in 1955 with the foundation of Team X in La Sarraz as self-declared critics of “CIAM” doctrines.15 Despite the noticeable heterogeneity of the architectural positions within the group, its members were united by the then unconventional and emphatic demand for the contextualization of design tasks. In many respects, the initiatives brought together by Team X were influenced by the ideas of Structuralism. Linked to the search for generally valid, elementary, “archetypal” forms and figurations was the explicit intention of recreating “meaning” and of generating a counterweight to a technocratic aesthetic of reduction, with its orientation toward one-sided and industrial standardization. And yet the sociological, topological, and typological systems of references within which this structurally influenced architectural approach was set remained somewhat abstract.
Andrea Gleiniger | “The Difficult Whole,” or the (Re)Discovery of Complexity in Architecture

To be sure, both the challenge to validate the qualities of “anonymous architecture”16 formulated by Austrian architect Bernard Rudofsky in 1964 and the project for a “phenomenology of place” found in the historical and theoretical reflections of Norwegian architectural historian Christian Norberg-Schulz recalled the narrative dimension of architecture, expressing an interest both in those qualities defined by perceptual psychology and in cultural “deep space” as foundations for the restoration of complexity. Yet the existential seriousness found in particular in the weighty plea for the “genius loci,” formulated in summary fashion by Norberg-Schulz and used as the title for a volume of writings that first appeared in Italian in 1979,17 remained remote from the programmatic position set forth nearly simultaneously by Robert Venturi and Denise Scott Brown, with its emphasis on vivid images and striking symbols. This search for the topologies and patterns underlying architecture evolved in a way that relied upon a phenomenological orientation conditioned by the idea of the “memorable place” (Charles Moore), while simultaneously providing impulses to derive the initial foundations for information-technological processing from its systematizing descriptions. The “Pattern Language” propagated in the early 1970s by Christopher Alexander, a mathematician who had become an architect,18 is only the best-known example. A fascination with the geometric systematization of the language of architecture is as old as architecture itself. Now, however, the complexity of life is condensed into a diagram. The diagram is favored by architects such as Peter Eisenman – though less with regard to the procedures of information technology than in the sense of an architectural-theoretical analogue to natural-scientific models. Here, the diagram becomes a condensation of architectural ideas and a metaphor or expression of the biological quality of the complexity of life as such.19

15 This is the same location where, in 1928, CIAM, the Congrès Internationaux d’Architecture Moderne, was summoned into existence by Le Corbusier, Sigfried Giedion, and Hélène de Mandrot. On the occasion of the preparations for the 10th CIAM Congress of 1956, a committed group of young architects came together and began to explore opposition to the dominance of CIAM and its rigid planning doctrines. The founding members included Peter and Alison Smithson, Aldo van Eyck, Georges Candilis, Shadrach Woods, Stefan Wewerka, Jaap Bakema, and Giancarlo de Carlo. Although its members were hardly free from error, it would be difficult to overestimate the significance of Team X for the critical process of subsequent decades. To be sure, the architects of Team X represented thoroughly diverse approaches to building, but they were united in their rejection of the apodictic character of Le Corbusian planning doctrines.
16 Bernard Rudofsky: Architecture without Architects. A Short Introduction to Non-Pedigreed Architecture, London 1964. 17 Christian Norberg-Schulz: Genius loci. Paesaggio, ambiente, architettura, Milan 1979; English ed.: Genius Loci: Towards a Phenomenology of Architecture, New York 1979. 18 Christopher Alexander, Sara Ishikawa, Murray Silverstein, with Max Jacobson, et al.: A Pattern Language. Towns, Buildings, Construction, New York 1977. Center for Environmental Structure series, vol. 2. 19 See Sean Keller: “Systems Aesthetics, or how Cambridge Solved Architecture,” in: Architecture and Authorship, ed. by Tim Anstey, Katja Grillner, Rolf Hughes, London 2007, pp. 165–163. Also Georg Vrachliotis: “The leap from the linear to computational consciousness: evolutionary models of thought and architecture,” in: Precisions. Architecture between sciences and arts, ed. by Ákos Moravánszky, Ole W. Fischer, Berlin 2008, pp. 232–261.
Fig. 4: Moshe Safdie, Habitat housing estate, Montreal World’s Fair, 1967.
Yet as before, at least as much significance is accorded to the technological mastery of the complexity of building. In light of the potentialities of the information technologies available today, attention is again being focused on projects and concepts involving architectural and urban planning visions which, since the late 1950s, have celebrated the seemingly unlimited possibilities of technological development. In the growing euphoria over the technological mastery of complexity, manifested at that time among other things in the growing enthusiasm for megastructures, the spirit of Modernism asserted itself in an especially rigid manner. But even at that time, the vacuum left by the logic of the new technologies was already being filled, and not only by more-or-less spectacular design proposals. Especially in Europe, in the climate of the mounting sociopolitical protests of the late 1950s and early 1960s, it also became a projection surface for societal visions within whose spaces rigidified conventions and hierarchies could once again be overcome. That the current revival20 of such megastructures can still be regarded as a “distortion of normal city building process”21 should not distract us from the fact that their diverse conceptions are distinguishable from one another. Literally taking form at the 1967 World’s Fair in Montreal was a paradigmatic test apparatus for the technological-constructive thinking of those years. Just a few kilometers from one another rose Buckminster Fuller’s geodesic dome and Moshe Safdie’s “Habitat” residential settlement [Fig. 4].
20 See, among others, the website http://www.megastructure-reloaded.org/de/313/, which covers all of the important protagonists of that time; or their discussion in the context of new definitions of autonomous building and participation, as in Jesko Fezer, Mathias Heyden (eds.): Hier entsteht. Strategien partizipativer Architektur und räumlicher Aneignung, Berlin 2004. 21 “But for our time the megastructure is a distortion of normal city building process for the sake inter alia of image.” Robert Venturi, Denise Scott Brown, and Steven Izenour: Learning from Las Vegas. The Forgotten Symbolism of Architectural Form, New York 1972, p. 119. The accompanying illustration, no. 114, on p. 119, shows Moshe Safdie’s Habitat.
Fig. 5: Richard Buckminster Fuller, “The Biosphere”, American Pavilion, Montreal World’s Fair, 1967; today Environment Canada’s Biosphère Museum (opened 1995). Detail of dome construction.
While Fuller’s dome – a techno-ornament whose tensegrity structure is formed of a jointed filigree tracery – celebrates the reduction of constructive complexity [Fig. 5], Safdie’s geometrically fissured residential agglomerations manifest a new and crystalline unsurveyability. The combination of altogether 345 self-supporting “boxes” produces an overflowing diversity of spatial and volumetric relationships, one far surpassing in number its 16 standardized types.22 Corresponding to this demonstration of complexity in the spirit of a sculpturally composed modular construction system are the urbanist super-sculptures produced contemporaneously by the Japanese Metabolists, based on analogies with the logic of natural growth. As we saw earlier, Venturi excoriated the Metabolists as “popularizers who paint ‘fairy stories over our chaotic reality’ and suppress those complexities and contradictions inherent in art and experience.”23 Nonetheless, the concept of the megastructure that underlay the visions of designers such as Yona Friedman [Fig. 6], Constant [Fig. 7], and Austrian architects like Günther Domenig and Eilfried Huth was clearly associated with social models of complexity.24 The “cybernetic” dimension of a mobile, flexible, and incipiently participatory conception of space, thematized in projects such as the “Ville Spatiale,”25 visualized by Yona Friedman in an abundance of photomontages, shifted the claims of (artistic) design toward the dynamics of social figurations. The demiurgic gesture of the architect was relativized: finally, the invention of the megastructure, in the spirit of the floating spatializations of technological thinking, was associated with the idea of a newly won space of
22 Unlike other contemporary examples, these 345 special cells were not transportable. They were stacked at the construction site into 158 apartments, grouped around three access routes into twelve-story structures. On Safdie, see among others Adolf Max Vogt, with Ulrike Jehle-Schulte Strathaus and Bruno Reichlin: Architektur 1940–1980, Berlin 1980, p. 202; Blake Gopnick and Michael Sorkin: Moshe Safdie: Habitat ’67 Montreal, Turin 1998. 23 In Learning from Las Vegas, he also comments on Expo ’67, declaring: “At Expo ’67, those pavilions were for us the most boring; they could be interpreted as further developed analogies to the progressive constructions seen at 19th-century world’s fairs, once celebrated by Sigfried Giedion.” p. 178. 24 See, among others, Yona Friedman: L’architecture mobile, Paris 1963; Yona Friedman: Utopies réalisables, Paris 1976; Mark Wigley: Constant’s New Babylon. The Hyper-Architecture of Desire, Rotterdam 1998. 25 See Heinrich Klotz (ed.), in collaboration with Volker Fischer, Andrea Gleiniger, and Hans-Peter Schwarz: Vision der Moderne. Das Prinzip Konstruktion, exhib. cat., Munich 1986, pp. 130–137.
Fig. 6: Yona Friedman, “Ville Spatiale”, 1958ff., vision of a floating city above Paris.
unlimited possibilities,26 in which the complexity of life could develop in the form of social interaction, unimpeded now by the built structures of the past, with their representational systems and functional compulsions [Fig. 8]. Perhaps this is why today – 50 years later – such concepts and visions are awakening so much interest. This interest is oriented less toward the radical gesture of the megastructure, which wanted to reinvent the city from the spirit of the laboratory and propagated a radical break with its inherited structures. Instead, the current interest rests on the idea of self-organization and the formation of social models, and on the room for maneuver gained from such technologically grounded optimization – spaces associated with sociopolitical and, ultimately, design visions.

The Difficult Whole

In opposition to the fragmentation of lived reality, and on the basis of all-encompassing claims for design, classical Modernism had attempted to redetermine the big picture, saturating it with a unifying design thinking. In the second half of the 20th century in particular, this “noble barbarism,” propagated by the heroes of Modernism, provoked multifaceted opposition and a new consciousness of complexity. This consciousness sought out references in diverse conceptual frameworks – the very technical-scientific ordering systems that had largely organized the disciplines from which the logic of Modernism had emerged: sociology, the natural sciences, and information and media technologies. They supplied the cultural paradigm that led to the supplanting of the industrial society by the information society. The loss of narrative in architecture for which Modernism had been blamed could not be hindered by the “fairy stories” – which had themselves been derived from the construction of analogies to become representations of reality emerging from the epistemological horizon of the mathematical and scientific
26 Andrea Gleiniger: “Technologische Phantasien und urbanistische Utopien,” in: Vision der Moderne, 1986, pp. 56–65. Reprinted in: Helmut Draxler and Holger Weh (eds.): Die Utopie des Design, exhib. cat., Munich 1994 (unpaginated).
Fig. 7: Constant, New Babylon (as of 1959), model.
disciplines. It was their lack of expressiveness that architects such as Robert Venturi and Denise Scott Brown opposed. Particularly for the architects of VSBA, the iconography of the mediatized world27 – which increasingly became the reality of the waning 20th century – became the logical continuation of experiences with the narrative qualities of historical architecture and their adequate visualization. They conceptualized complexity not least in relation to experiences of the visible and pre-existing world drawn from experimental psychology. The result was a demand for contextualization28 based on the vivid quality of the experience of real architecture, if occasionally at the price of a renewed simplification through exaggerated symbolism and graphic effect. The dilemma of the discourse of complexity is apparent: in the end, it emerges wherever new strategies of simplification are derived, again and again, from various ways of reading complexity – strategies that take effect where reflections on quality in architecture, the city, and space remain beholden to a pure faith in calculability. Digitalization of the bases of design has by no means solved this problem, but rather displaced it: the disparate scenarios of a digitalized design rhetoric provide a backdrop for those concepts that stage the difficult whole of architecture and its contexts primarily by perfecting the spectacular and the “complicated”29 form. Associated, by contrast, with the potentials of the software agent – which, as a generic element of information technology, has apparently come to occupy the position formerly taken up by the universal constructive joint of Modernism30 – is the option of a new thinking of possibilities. This should not return to fostering
27 Robert Venturi: Iconography and Electronics upon a Generic Architecture. A View from the Drafting Room, Cambridge/Mass. 1996. 28 See Denise Scott Brown in this volume: “Complexity and Context,” pp. 25–34. 29 See Clemens Bellut in this volume: “‘Ach, Luise, lass ... das ist ein zu weites Feld,’ or: The Gordian Knot of Complexity,” pp. 109–115. 30 See Georg Vrachliotis: “Flusser’s Leap: Simulation and Technical Thought in Architecture,” in: Simulation. Presentation Technique and Cognitive Method, from the series Context Architecture: Fundamental Architectonic Concepts Between Art, Science, and Technology, ed. by Andrea Gleiniger and Georg Vrachliotis, Basel, Boston, Berlin 2008, pp. 63–80.
Fig. 8: Yona Friedman, “Ville Spatiale”, 1958ff., view from the floating city, sketch.
the fantasies of omnipotence repeatedly generated by the primacy of technology in architecture, but instead (and going beyond the mode of complexity represented by technological modeling) to the cultivation of a lively relationship with the real world: in the end, this is the quintessential concern of architecture. Against this background, it seems promising to try to derive a consciousness of the “unfinished whole” from a consciousness of the “difficult whole,” thereby creating a space of possibilities for the literally incalculable that exceeds the constantly growing possibilities of calculability. In this, architecture has succeeded, and will continue to succeed, wherever it has translated its knowledge about the environment into willful form.
Georg Vrachliotis
POPPER’S MOSQUITO SWARM: ARCHITECTURE, CYBERNETICS, AND THE OPERATIONALIZATION OF COMPLEXITY

“With clouds replacing clocks,” conjectured American architect and architectural theoretician Charles Jencks in his Architecture of the Jumping Universe, “a revolution in thinking was under way, that can best be understood by opposing it to the dominant world view, by contrasting the Postmodern sciences of complexity with the Modern sciences of simplicity.”1 […] “In the new sciences and architectures the fundamental idea relates to feedback, self-organizing change, which the computer is well-adapted to portray.”2 Jencks presented complexity research as a “new science” and a “new paradigm.”3 For Jencks, Frank Gehry’s Guggenheim Museum in Bilbao, Peter Eisenman’s Aronoff Center in Cincinnati, and Daniel Libeskind’s Jewish Museum in Berlin are architectural replies to the question of the cultural outgrowths of this new science. More than a decade later, and in light of the new technologies being used in architecture, it seems necessary to explore from new perspectives not so much Jencks’s answers as his questions. In the context of architectural production, is it possible to discuss complexity not only as an artistic-aesthetic category, but also as a fundamental technical-constructive idea? In other words: can the epochal publication of Robert Venturi’s Complexity and Contradiction4 be regarded as a development of the concept of complexity in architecture? The metaphor of the cloud – which achieved prominence not long ago through the pavilion constructed by American architects Elizabeth Diller and Ricardo Scofidio at the Swiss National Exhibition Expo.02 – has surfaced repeatedly in architectural history5 [Fig. 1]. Yet Jencks does not invoke this metaphor in the form in which it is found in architecture, that is to say, as a poetic counter-concept to the materiality of built architecture, or as a dream image of a space that has been liberated from physical limitations. Jencks’s level of reference is instead linked to overarching questions such as those concerning the interrelationships between architecture, science, and metaphor, and between architecture, science, and the world picture. Does a new science generate not only new metaphors and a new world picture, but new architectural design strategies as well? The interrelationships between metaphor, philosophy, and the history of science – for example in the context of epochal upheavals and the emergence of technical-scientific metaphors – have been the object of numerous philosophico-historical discourses.6 Yet if “every creative history [has] its world picture, and indeed in such a way as to concern itself from time to time about that world picture,”7 then the question is: what world picture can be represented by a cloud (in contradistinction to the mechanism of the clock)? In searching for the source of Jencks’s metaphorical simile, the trail leads directly to the mid-20th-century discursive space of theories of science: “My clouds are intended to represent physical systems which, like gases, are highly irregular, disorderly, and more or less unpredictable,”8 explained philosopher of science Karl Popper matter-of-factly in April 1965 in his Arthur Holly Compton Memorial Lecture at Washington University, continuing, “[...] There are lots of things, natural processes and natural phenomena, which we may place between these two extremes – the clouds on the left, and the clocks on the right.”9 While Popper’s lecture dealt mainly with socio-philosophical questions, he was concerned in this context with the familiar philosophical question of the degree to which physical determinism or indeterminism supplied adequate descriptions of reality. The title of this lecture, “Of Clouds and Clocks,” supplied the relevant metaphorical dualism.

1 Charles Jencks: Architecture of the Jumping Universe: A Polemic. How Complexity Science is Changing Architecture and Culture, London, New York 1995, p. 31. 2 Ibid., p. 13. 3 Charles Jencks: “Nonlinear Architecture. New Science = New Architecture?” Architectural Design 129, 1997, p. 7. 4 Robert Venturi: Complexity and Contradiction in Architecture, New York 1966. 5 See Andrea Gleiniger: “Of Mirrors, Clouds, and Platonic Caves: 20th-Century Spatial Concepts in Experimental Media,” in: Simulation. Presentation Technique and Cognitive Method, from the series Context Architecture. Fundamental Architectonic Concepts Between Art, Science, and Technology, ed. by Andrea Gleiniger and Georg Vrachliotis, Basel, Boston, Berlin 2008.

Fig. 1: Diller + Scofidio Architects: Blur Building, Yverdon-les-Bains, realized for the Swiss National Exhibition Expo.02.
6 See Lily Kay: “Spaces of Specificity: The Discourse of Molecular Biology Before the Age of Information,” in: Lily Kay: Who Wrote the Book of Life? A History of the Genetic Code, Stanford 2000, pp. 38–72. 7 Martin Heidegger: “The Age of the World Picture” (1938), in: The Question Concerning Technology and Other Essays, New York 1977, pp. 115–154, esp. pp. 127–128 (Martin Heidegger: “Die Zeit des Weltbildes” (1938), in: Martin Heidegger: Holzwege, Frankfurt am Main 1977, pp. 75–96). 8 Karl R. Popper: “Of Clouds and Clocks. An Approach to the Problem of Rationality and the Freedom of Man,” Washington 1966. Republished in: Karl R. Popper: Objective Knowledge. An Evolutionary Approach, Oxford 1972, p. 207. 9 Karl R. Popper: Objective Knowledge, p. 208.
On the basis of a simple example drawn from nature, Popper further elaborated his thoughts on the cloud metaphor. Only at second glance does it become evident that, in choosing this example, he anticipated something that could be modeled with the technical assistance of computers only some 20 years later: “As a typical and interesting example of a cloud I shall make some use here of a cloud or cluster of small flies and gnats. [...] In this case of the gnats, their keeping together can be easily explained if we assume that, although they fly quite irregularly in all directions, those that find that they are getting away from the crowd turn back towards that part which is densest. This assumption explains how the cluster keeps together even though it has no leader, and no structure – only a random statistical distribution resulting from the fact that each gnat does exactly what he likes, in a lawless or random manner, together with the fact that he does not like to stray too far from his comrades [...] Like many physical, biological, and social systems, the cluster of gnats may be described as a ‘whole.’ Yet the cluster of gnats is an example of a whole that is indeed nothing but the sum of its parts; [...] for not only is it completely described by describing the movements of all individual gnats, but the movement of the whole is, in this case, precisely the (vectoral) sum of the movements of its constituent members, divided by the number of members.”10 In light of the following discussion of the concept of complexity and its development, Popper’s lecture offers two concrete points of departure: first, he speaks of the possible description of the swarm as a “whole” – albeit, pointedly, one that is here nothing but the “sum of its parts.” At the same time, through his attempt to explain the enigmatic coherence of the swarm “without a leader,” he supplies us with a structural reference for the technical modeling of the behavior of dynamic systems.
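Popper’s rule for the gnats – fly irregularly, but turn back toward the densest part of the crowd when straying – can indeed be operationalized in a few lines. The sketch below is our illustration, not Popper’s or the text’s: it approximates “the densest part” by the cluster’s centre of mass, and the leash distance and turn-back factor are invented parameters.

```python
# A minimal sketch of Popper's gnat-cluster rule (our illustration, not
# Popper's): each gnat flies quite irregularly, but any gnat that strays
# too far from the densest part of the cluster turns back toward it.
# "Densest part" is approximated here by the centre of mass; LEASH and
# the turn-back factor 0.5 are illustrative assumptions.
import random

random.seed(1)

N_GNATS, STEPS, LEASH = 50, 200, 5.0

# start all gnats near the origin
gnats = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(N_GNATS)]

def centroid(points):
    return [sum(p[0] for p in points) / len(points),
            sum(p[1] for p in points) / len(points)]

for _ in range(STEPS):
    cx, cy = centroid(gnats)
    for g in gnats:
        dx, dy = g[0] - cx, g[1] - cy
        if (dx * dx + dy * dy) ** 0.5 > LEASH:
            # a straying gnat turns back toward the densest part
            g[0] -= 0.5 * dx
            g[1] -= 0.5 * dy
        else:
            # otherwise it moves in a lawless, random manner
            g[0] += random.uniform(-1, 1)
            g[1] += random.uniform(-1, 1)

# the cluster keeps together although it has no leader and no structure
cx, cy = centroid(gnats)
spread = max(((g[0] - cx) ** 2 + (g[1] - cy) ** 2) ** 0.5 for g in gnats)
print(f"spread after {STEPS} steps: {spread:.1f}")
```

The cluster coheres although each gnat follows only a local rule; and, as in Popper’s quotation, the motion of this “whole” is nothing but the mean of its members’ motions.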
An initial reading suggests the immediate adoption of a manner of expression drawn from the field of Gestalt psychology. The investigations of this theory of perception, which emerged in Germany in the early 20th century, were driven by the central question of the perception of complex phenomena or stimuli.11
10 Ibid., pp. 208–210. 11 See Christian von Ehrenfels: “Über Gestaltqualitäten,” Vierteljahrsschrift für wissenschaftliche Philosophie, 14, 1890, pp. 249–292.
A second reading, on the other hand, reveals the outlines of something that would, from the perspective of contemporary technology, presumably be characterized by American computer scientist Mitchel Resnick as “decentralized systems and self-organized behaviors”12: a self-organizing, complex system consisting of a multiplicity of interacting elements13 [Fig. 2]. Popper’s mention of principles drawn from theories of perception is no accident; as early as 1928, his philosophically oriented dissertation Zur Methodenfrage der Denkpsychologie (On Questions of Method in the Psychology of Thinking) was supervised by Karl Bühler.14 Bühler, a German psychologist of thinking and perception, was a decisive influence on many of Popper’s socio-philosophical views. Popper’s remark that, despite the nonlinear movements of each individual mosquito, the swarm as a whole functions like a coordinated collective alludes to the reciprocal effects in complex systems between local behaviors and global outcomes: technically, the global behavior of a system can be modeled on the basis of local knowledge, provided there exists a sufficiently large number of interacting elements. In Die Logik der Sozialwissenschaften (The Logic of the Social Sciences),15 Popper presented a series of theses designed to “articulate the opposition between our knowledge and our non-knowledge.”16 Here we encounter the term “situational logic,” coined by Popper and reminiscent of the general term “situated agent,” which would become paradigmatic in so-called “artificial intelligence.” Both concepts emphasize the coupling between the agent and the context in which the agent acts: whether, as here, in the framework of a situational analysis, or as a cognitive modeling rule for the technical implementation of a multi-agent system.17 The agent is inseparable from the 12 Mitchel Resnick: Turtles, Termites, and Traffic Jams, Cambridge/Mass. 1994, p. 5. 13 See Valentin Braitenberg: Vehicles.
Experiments in Synthetic Psychology, Cambridge/Mass. 1984; Craig W. Reynolds: “Flocks, Herds, and Schools: A Distributed Behavioral Model,” in: Computer Graphics, 21, 4, SIGGRAPH ’87 Conference, 1987, pp. 25–34; Eric Bonabeau, Marco Dorigo, and Guy Theraulaz: Swarm Intelligence: From Natural to Artificial Systems, Santa Fe Institute Studies in the Sciences of Complexity, Oxford 1999. 14 Karl R. Popper: “Zur Methodenfrage der Denkpsychologie” (unpublished dissertation), Vienna 1928. 15 Karl R. Popper: “Die Logik der Sozialwissenschaften,” in: Karl R. Popper: Auf der Suche nach einer besseren Welt. Vorträge und Aufsätze aus dreissig Jahren, Munich 2003, pp. 79–99 (first published in: Kölner Zeitschrift für Soziologie und Sozialpsychologie, vol. 14, 1962). 16 Ibid., p. 80. 17 See William J. Clancey: Situated Cognition. On Human Knowledge and Computer Representations, Cambridge 1997. Many of these concepts go back to the perceptual psychology of American psychologist James J. Gibson. See James J. Gibson: The Ecological Approach to Visual Perception, Boston 1979.
Fig. 2: “The simulated flock is an elaboration of a particle system, with the simulated birds being the particles. The aggregate motion of the simulated flock is created by a distributed behavioral model much like that at work in a natural flock; the birds choose their own course. Each simulated bird is implemented as an independent actor that navigates according to its local perception of the dynamic environment, the laws of simulated physics that rule its motion, and a set of behaviors programmed into it by the ‘animator’.” “Boids” model by Craig Reynolds in collaboration with the Symbolics Graphics Division and Whitney/Demos Productions, 1986/87.
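The “distributed behavioral model” Reynolds describes can be reduced to three local steering rules – separation, alignment, and cohesion – set out in the 1987 paper cited above. The sketch below is our own paraphrase, not Reynolds’s code: the weights, radii, and sequential (rather than synchronous) update are simplifications chosen for brevity.

```python
# Local steering rules in the spirit of Reynolds's "boids": each bird
# reacts only to its neighbours within a perception radius. All numeric
# parameters are illustrative, not those of the 1986/87 model.
import math
import random

random.seed(7)

class Boid:
    def __init__(self):
        self.x, self.y = random.uniform(0, 10), random.uniform(0, 10)
        self.vx, self.vy = random.uniform(-1, 1), random.uniform(-1, 1)

def step(boids, radius=5.0):
    for b in boids:  # sequential update, for brevity
        neigh = [o for o in boids
                 if o is not b and math.hypot(o.x - b.x, o.y - b.y) < radius]
        if not neigh:
            continue
        n = len(neigh)
        cx = sum(o.x for o in neigh) / n      # neighbours' centre
        cy = sum(o.y for o in neigh) / n
        avx = sum(o.vx for o in neigh) / n    # neighbours' mean velocity
        avy = sum(o.vy for o in neigh) / n
        # cohesion: steer toward the neighbours' centre
        b.vx += 0.01 * (cx - b.x); b.vy += 0.01 * (cy - b.y)
        # alignment: match the neighbours' mean heading
        b.vx += 0.05 * (avx - b.vx); b.vy += 0.05 * (avy - b.vy)
        # separation: move away from any neighbour that is too close
        for o in neigh:
            d = math.hypot(o.x - b.x, o.y - b.y)
            if 0 < d < 1.0:
                b.vx += 0.05 * (b.x - o.x) / d
                b.vy += 0.05 * (b.y - o.y) / d
    for b in boids:
        b.x += b.vx; b.y += b.vy

flock = [Boid() for _ in range(30)]
for _ in range(100):
    step(flock)
```

No bird perceives the flock as a whole; the aggregate motion emerges, as in Popper’s gnat cluster, from local perception alone.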
context in which it is situatively embedded. Situatedness, then, is a fundamental condition permitting the modeling of global complexity through a simple system of local rules. Popper’s Arthur Holly Compton Memorial Lecture was published in 1966. Also published that year was Venturi’s Complexity and Contradiction. For decades, Venturi’s concept of complexity would remain central to the discourse concerning the opposition between narratives of complexity in postmodernist architecture and productive clarity in modernist architecture. Only relatively late would ideas concerning complex systems be brought into relationship with postmodernist ideas of complexity, for instance by Jencks and Wolfgang Welsch.18 To be sure, Postmodernism had honed an aesthetic gaze for nonlinear processes. An awareness of the dynamics of complex systems in nature, however, could be generated only through the technical clarity of computer simulation. Contributing to this development were figures such as Claude Shannon, John von Neumann, and Herbert Simon. The question of whether to credit the emergence of theories of chaos and complexity in the history of science with a qualitative transformation of the world picture will surely receive further discussion. Also meriting continuing philosophical-scientific debate is the question of the degree to which humankind has – in contrast to the sciences of modernity – come closer to nature.
Is it the case that “all that we can infer about the nature of the world from the fact that we have to use mathematical language if we want to describe it, is that this world has a certain degree of complexity or, that there are certain relationships in this world that cannot be described with too primitive means?”19 Against the background of the concept of complexity, cybernetics conjured up a structural-scientific foundation whose technological potential and cultural magnitude for architectural production can be discussed only from the perspective of contemporary information technology. “The thought of every age is reflected in its technique,” observed American mathematician Norbert Wiener in his Cybernetics.20 Wiener’s book fostered the technification of concepts in the humanities, natural sciences, and arts. This has had consequences
18 See, for example, Wolfgang Welsch: “Übergänge,” in: Selbstorganisation. Jahrbuch für Komplexität in den Natur-, Sozial- und Geisteswissenschaften, vol. 4, Berlin 1993, pp. 11–17. 19 Karl R. Popper: “What is Dialectic?” Mind, 49, 1940, pp. 403–426, here: p. 421. 20 Norbert Wiener: Cybernetics. Or Control and Communication in the Animal and the Machine, 2nd edition, Cambridge 1965 (1st edition 1948), p. 38.
for architecture as well. Through processes of metaphoricization, such cybernetically minted concepts as “communication” and “feedback” advanced to the status of productive and effective guiding ideas in the architecture of succeeding decades.21 Abstract control processes, much influenced by the promise of a future universal science that would integrate the various disciplines, now stood in the foreground, having supplanted individual features. Relegated to a secondary status was the question of whether we are talking about biological organisms, technical processes of automation, human perceptions, concepts of architectural engineering, or architectural planning and design processes. In short, it was a question of the “ontological restlessness” referred to by German media theoretician Claus Pias in his essay on the utopian potential of cybernetics.22 “This restlessness,” writes Pias, “resides in the indistinctness or interchangeability of that which was previously distinguished from artifacts under the concept of the human.”23 In discussions of architecture and complexity, the cybernetic period is accorded an important role. The appearance of Complexity and Contradiction coincided with Time Magazine’s heralding of the “Cybernated Generation” in April 1965.24 Coinciding with Venturi’s manifesto was a concept of complexity that could be discussed in the context of a “general, formal science of the structure, relations, and behavior of dynamic systems.”25 Complexity, then, could be understood from the perspectives of broadcast technology and information theory.26
21 See Erich Hörl: “Das Ende der archaischen Illusion. Kommunikation, Information, Kybernetik,” in Erich Hörl: Die heiligen Kanäle. Über die archaische Illusion von Kommunikation, Zurich, Berlin 2005, pp. 231–281. 22 Claus Pias: “Unruhe und Steuerung. Zum utopischen Potential der Kybernetik,” in: Die Unruhe der Kultur. Potentiale des Utopischen, ed. by Jörn Rüsen and Michael Fehr, Weilerswist 2004, p. 302. 23 Ibid., p. 302. 24 Time Magazine, 2 April 1965. 25 Hans-Joachim Flechtner: Grundbegriffe der Kybernetik. Eine Einführung, 4th edition, Stuttgart 1969 (original edition: 1966), p. 10. 26 See Claude E. Shannon: “A Mathematical Theory of Communication,” in: Bell System Technical Journal, 27, pp. 379–423 and pp. 623–656, July and October 1948.
Georg Vrachliotis | Popper’s Mosquito Swarm: Architecture, Cybernetics, and the Operationalization of Complexity
Just one year after Venturi’s publication, György Kepes, a Hungarian-American artist and theoretician of art, announced the emergence of an integrated structural order encompassing the arts, architecture, science, and technology, which would interconnect the disciplines. Occurring in the present, according to Kepes, was a movement from the “classical sciences of simplicity toward a modern science of ordered complexity.”27 Against this background, he juxtaposes Pier Luigi Nervi’s supporting frame constructions, Buckminster Fuller’s spatial frameworks, and Max Bill’s concrete painting with electron microscope images and X-ray images of crystals, cells, and fluids. Just as he had done earlier in his New Landscape in Art and Science,28 Kepes presents his arguments through a rhetoric of visual analogies, announcing in this connection that “the most powerful imaginative vision is structure oriented.”29 Despite the fact that concepts such as nonlinearity and self-organization are accorded no explicit significance in Kepes’s structural aesthetics, they are a fundamental condition in terms of Gestalt psychology for attempts to unify scientific conceptions of form with those found in art and architecture at the level of structural science30 [Fig. 3]. From today’s perspective, Kepes should be accorded a pivotal function: he marks the transition from a Gestalt-theoretical conception of complexity to an aesthetic based on technical-scientific structural principles. At least three developmental tendencies are derivable from Kepes’s structural conception of complexity.
First of all, there are Kevin Lynch’s empirical experiments in the field of perception, which dealt with visual complexity in urban structures, converting Kepes’s approaches into an interface joining architecture, urban planning, and the cognitive sciences, which were then just on the point of becoming established.31 Second, there are the “generative aesthetics” of early experimental computer graphics, for example those of Georg Nees and Frieder Nake, which emerged as a new graphic trend 32 [Fig. 4]. 27 György Kepes (ed.): Structure in Art and Science, New York 1965, p. iv. 28 György Kepes (ed.): New Landscape in Art and Science, Chicago 1956. 29 György Kepes (ed.): Structure in Art and Science, New York 1965, p. ii. 30 On complexity, see Arnheim’s chapter “Order and Disorder,” pp. 162–204, esp. the subchapter “Levels of Complexity,” pp. 178–182, in Rudolf Arnheim: The Dynamics of Architectural Form, Berkeley and Los Angeles 1977 (based on the 1975 Mary Duke Biddle lectures at the Cooper Union). 31 See Kevin Lynch: The Image of the City, Cambridge 1960, and Kevin Lynch: Good City Form, Cambridge 1981. 32 See Max Bense: “Projekte generativer Ästhetik,” and Georg Nees: “Programme und Stochastische Grafik,” in: edition rot, ed. by Max Bense and Elisabeth Walter, Stuttgart 1965.
Fig. 3: Visual materials accompanying Norbert Wiener’s essay “Pure Patterns in a Natural World,” from the exhib. cat. The New Landscape in Art and Science, ed. by György Kepes, Boston 1956.
But for current architectural production using information technology, a third aspect harbors perhaps the greatest potential, located on the level of construction. The point of departure is what Frei Otto referred to as “natural construction.”33 On the basis of the investigation of complex systems, he attempted to constructively translate into architectural terms the economics governing the processes of formal invention in nature. Processes of self-organization were investigated from the perspective of their structural significance for construction. In this connection, Austrian-American architectural historian Eduard Sekler refers to the distinction between structure and construction: “The real difference between these two words is that ‘construction’ carries a connotation of something put together consciously while ‘structure’ refers to an ordered arrangement of constituent parts in a much wider sense.”34 Both rhetorically and methodologically, Otto attempted to overcome the purportedly intentional difference between construction and structure delineated by Sekler. Structure and construction were to be rendered equally controllable via the experimental transfer into architecture of the economic criteria of natural processes of formal invention. One of the most fruitful aspects for digital architectural production lies in Otto’s attempt to effect a rapprochement between structure and construction. “I examined natural, technical, and artistic objects, and in particular those processes through which objects acquire their characteristic forms, their gestalts,”35 explains Otto. Through the application of complex systems devoted to the structural determination of this “gestalt,” Otto succeeds in discussing complexity not only on an aesthetic level, but on an economic one as well.
His point of departure was the conviction that “if you begin the design process not from a formal canon, but instead from the modeling of processes, then [it is] recommended that [these] be formulated like the rules of the game.”36
33 See Frei Otto: Natürliche Konstruktionen. Formen und Konstruktionen in Natur und Technik und Prozesse ihrer Entstehung, Stuttgart 1982. 34 Eduard F. Sekler: “Structure, Construction and Tectonics,” in: Structure in Art and in Science, New York 1965, pp. 89–95, here p. 89. 35 Frei Otto: Gestaltwerdung. Zur Formentstehung in Natur, Technik und Baukunst, ed. by György Kepes, Cologne 1988, p. 5. 36 Joachim Krause: “Die Selbstorganisation von Formen. Joachim Krause im Gespräch mit Nikolaus Kuhnert, Angelika Schnell und Gunnar Tausch,” in: Arch+ Zeitschrift für Architektur und Städtebau, 121 (Die Architektur des Komplexen), Stuttgart 1994, p. 25.
Fig. 4: Georg Nees: “Programming stochastic computer graphics,” page from the exhib. cat. Cybernetic Serendipity. The Computer and the Arts, London 1968.
Architectural production hence opened up for Otto a procedural-technical interpretation of the complexity found in nature. Evident, however, with regard to the potentialities of joining architecture and information technology is a conceptual reservation concerning his methodology: both Otto’s concept of natural construction and the design ideas resulting from it rest as a rule on the structural principles of nonbiological processes. Investigated, for example, are the involved geometries of birds’ nests, but not the behavior of flocks of birds. The multilayered structures of anthills are analyzed, but not the behavior patterns of the ants themselves. Otto’s natural constructions are concerned, so to speak, with the design outcomes and finished products of nonlinear processes of formal invention. Yet through growing research into the methods of “artificial life”37 in engineering-oriented, marginal areas of digital architectural production, it has now also become possible to exploit the dynamic behavior of biological systems. Emerging here as well is the decoupling of elements that remained unified in Otto’s concept of “gestalt”: the structural process on the one hand, and the forms it generates on the other. In other words: the separation of structure and form. In the process, something has penetrated into technical thinking in architecture that derives unmistakably from the logic of information technology. The technologization of nonlinear processes serves as the foundation for an independent method of construction.38 Constructing with complex systems can be regarded, then, as “an additional phase of the technical world.”39
37 American computer scientist Christopher Langton is regarded as the founder of the Artificial Life Movement. At one of the first symposia, Langton described this research field as follows: “Artificial Life is the study of man-made systems that exhibit behaviors characteristic of natural living systems. It complements the traditional biological sciences concerned with the analysis of living organisms by attempting to synthesize life-like behaviors within computers and other artificial media. By extending the empirical foundation upon which biology is based beyond the carbon-chain life that has evolved on Earth, Artificial Life can contribute to theoretical biology by locating life-as-we-know-it within the larger picture of life-as-it-could-be.” Christopher Langton: Artificial Life. Proceedings of an Interdisciplinary Workshop on the Synthesis and Simulation of Living Systems, September, 1987 in Los Alamos, New Mexico. Cambridge 1989, p. 1. 38 See Georg Vrachliotis: “Flusser’s Leap: Simulation and Technical Thought in Architecture,” in: Simulation. Presentation Technique and Cognitive Method published in the series Context Architecture. Fundamental Architectonic Concepts Between Art, Science, and Technology ed. by Andrea Gleiniger and Georg Vrachliotis, Basel, Boston, Berlin 2008. 39 Max Bense: “Preface”, in: Louis Couffignal: Denkmaschinen, Stuttgart 1955, p. 8 (German original: “Eine weitere Stufe der technischen Welt.”).
In summary, it can be said that there are three lines of development of the concept of complexity in architecture: a gestalt-psychological line, a cybernetic line, and a biological-algorithmic line. Of course, these lines are not mutually exclusive, but an advancing technologization of the concept of complexity cannot be overlooked. With regard to the structural-scientific model of cybernetic thinking, and in connection with our initial question concerning the extension of the complex of complexity, this development could therefore be designated a rapprochement with an “operationalization of the complex.” To discuss this development from an architectural-theoretical perspective requires more than simply designating individual buildings as trademarks of the new science or of a new world picture. Equally unsatisfactory in exploring the influence of complexity research on architecture is a restriction to aesthetic criteria. Nonetheless, the buildings by Eisenman, Gehry, and Libeskind enumerated by Jencks can justifiably be seen as artistic and architectural symbols of complexity. Still, the development of the concept of complexity discussed here demonstrates that Jencks never goes beyond the level of the visual. In other words: for Jencks, complexity remains indebted to architectural form. His discussion by no means extends as far as a concept of “algorithmic complexity,” for example. Information technologies function independently of form; they operate rather at the structural level. Discussions of architecture and complexity hence become a play with the unrepresentable. Contemporary information technologies confront architectural-theoretical discourses with developments that call for an expanded theoretical instrumentarium. It remains unclear which architectural language might best be used to approach the concept of complexity associated with information technologies.
This question might serve as a point of departure for critical discussions of the syntactic models of information technologies from the perspective of the semantic requirements of architecture. How can we render the concept of complexity that is operative in the information technologies comprehensible in architecture? How can architectural meaning be generated and even shaped by a technology whose operations are non-semantic in nature?
Kostas Terzidis ALGORITHMIC COMPLEXITY: OUT OF NOWHERE
Complexity is a term used to denote the length of a description of a system, or the amount of time required to create a system.1 From networks and computers to machines and buildings, a great deal of effort is spent on how to understand, explain, model, or design systems whose scope, scale, and complexity often challenge the ability of designers to fully comprehend them. While complexity may be a characteristic of many systems or processes in nature, within the field of design the study of complexity is associated with artificial, synthetic, and human-made systems. Such systems, despite being human creations, consist of parts and relationships arranged in such complicated ways that they often surpass a single designer’s ability to thoroughly comprehend them, even if that person is their own creator. Paradoxical as it may appear, humans have become capable of exceeding their own intellect. By using advanced computer systems, intricate algorithms, and massive computations, designers are able to extend their thoughts into a once unknown and unimaginable world of complexity. However, it may be argued that the inability of the human mind to single-handedly grasp, explain, or predict artificial complexity is caused mainly by quantitative constraints, that is, by the amount of information or the time it takes to compute it, and not necessarily by the human intellectual ability to comprehend, infer, or reason about such complexities. Nevertheless, while this assumption may be true, it is only for lack of any other explanation. In other words, if humans are not aware of artificial complexity, then who is? After all, the artificial is by definition human, as are its resulting complexities. However, there is a special category of complexity that, even though human-made, is not only unpredictable, incomprehensible, or inconceivable by humans, but also strange, foreign, and unfamiliar: randomness.
Randomness is a term used to describe a lack of an identifiable pattern, purpose, or objective. In its formal manifestation, randomness can also be defined as a meaningless pattern. While this definition can be applied to the description of a pattern being random, it becomes problematic when it is applied 1 This definition, also referred to as Kolmogorov’s or K-complexity, offers a distinction between visual and structural complexity. Regardless of the complexity involved in the appearance of a pattern, complexity is by definition based on the reproducing algorithm, that is, a series of instructions that will regenerate the visual pattern. See Jesus Mosterin: “Kolmogorov Complexity”, in: Complexity and Emergence, ed. Evandro Agazzi and Luisa Montecucco, New Jersey 2002, pp. 45–56.
to the act of creating a random pattern. The claim itself involves a self-referential paradox: how can one create something that is meaningless? Wouldn’t the mere act of creation assign meaning automatically? In other words, randomness is the process of creating no meaning, which is a contradictory claim. Let us consider the following sentence: “This statement is meaningless.” If it is, then its meaning is that it is meaningless and if it is not, then it has a meaning. This logical paradox is referred to as a self-referential, “begging the question,” or circular fallacy because the premises of the argument include the claim that the conclusion is true. In other words, the creation of randomness involves intention, which is contrary to randomness. However peculiar this may sound, by definition one cannot create randomness. The moment one makes a random move it ceases to be random because it can be interpreted later as a series of causal steps. Nevertheless, while one may not be able to create randomness by one’s own will, one can certainly witness it by observing others. That is, if another person makes a move unpredictable to the first person, then that move is random for as long as the first person cannot identify a pattern, purpose, or objective. Complexity, as defined earlier, is associated with randomness as follows: if a pattern is very regular, it is easy to describe, and so it is simple. In contrast, if it is irregular, then it is difficult to describe, and so it becomes complex. If it is so complex that the information it contains cannot be compressed at all, we say that it is random. So randomness is characterized as the maximum of complexity, and as the opposite of regularity and simplicity. 
Consider the following binary sequences A and B:

A: 001001001001001001001001001001001001001001001001001001 …
B: 101101001101001001001110101010111100100010000110010101 …

Apparently, the first sequence is a repetition of 001, whereas the second does not appear to have any identifiable pattern that can be further compressed, and so it will be assumed to be random until there is proof to the contrary. Now, consider the following sentence A and a random rearrangement of the words in the sentence, B:

A: if it exists, you can think of it
B: if you can think of it, it exists
While preserving the grammatical and syntactical correctness, a random shift in the sequence of words in the sentence A produces a sentence B very different from the original sentence A. In this case, randomness functions as a transformation from one state into another, producing a new form from an existing one. This structural behavior in many ways resembles Dadaist poetry, or Markov processes.2 There, an algorithm functions as a string rewriting system that uses grammar-like rules to operate on strings of symbols in order to generate new strings of text. While the syntax of the resulting text may be consistent with grammatical rules, the meaning of the resulting text is not necessarily associated semantically with the intentions of the original code. In those cases, the introduction of randomness into the arrangement of text can produce results that are unpredictable, complicated, but also accidentally meaningful. However, just because something is random does not mean that it is also unpredictable. Unpredictability is, by definition, a dissociation of intention. But unlike chaos, a random rearrangement of elements within a rule-based system produces effects that even though unpredictable are intrinsically connected through the rules that govern that system. In a similar, almost humorous fashion, the Dada Engine is a computer algorithm that produces random text based on recursive rearrangement of elements in a grammar. The resulting text, while allegedly based on random processes, is readable, occasionally makes sense, and is sometimes surprisingly intelligent. While in all of these cases it is quite apparent that awareness, consciousness, or intention is missing, the resulting language patterns are convincing enough to “fool” someone into believing that they were authentic, that is, worthy of trust, reliance, or belief, as if they were produced by a sentient author. 
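The connection between regularity, description length, and compressibility can be made concrete with an ordinary compressor. Kolmogorov complexity itself is uncomputable, so the zlib sketch below is only a crude proxy, and its sequences are illustrative stand-ins for the binary sequences A and B above.

```python
import random
import zlib

# A: highly regular -- the repetition "001" admits a very short description.
a = ("001" * 200).encode()

# B: an irregular sequence of the same length, generated pseudo-randomly
# (seeded so that the sketch is reproducible).
rng = random.Random(42)
b = "".join(rng.choice("01") for _ in range(600)).encode()

ca, cb = zlib.compress(a, 9), zlib.compress(b, 9)

# The regular sequence compresses far better than the irregular one:
# its compressed length is a much smaller fraction of the original.
print(len(a), len(ca))
print(len(b), len(cb))
assert len(ca) < len(cb)
```

In the terms used above: the shorter the compressed description relative to the pattern itself, the simpler the pattern; incompressibility is the signature of randomness.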
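A string-rewriting generator of the kind just described can be sketched in a few lines of Python. The toy grammar below is an invention for illustration (it has nothing to do with the actual Dada Engine rule set); it merely shows how recursive, randomly chosen rewrites yield sentences that are syntactically well formed but semantically unintended.

```python
import random

# A toy context-free grammar: each nonterminal (uppercase) rewrites to one
# of several right-hand sides, chosen at random; lowercase tokens are words.
GRAMMAR = {
    "S":  [["NP", "VP"]],
    "NP": [["the", "N"], ["a", "N"]],
    "VP": [["V", "NP"], ["V"]],
    "N":  [["plan"], ["structure"], ["machine"]],
    "V":  [["generates"], ["dissolves"]],
}

def expand(symbol, rng):
    """Recursively rewrite a symbol until only terminal words remain."""
    if symbol not in GRAMMAR:          # terminal word: emit it as-is
        return [symbol]
    rhs = rng.choice(GRAMMAR[symbol])  # random rule application
    return [word for part in rhs for word in expand(part, rng)]

rng = random.Random(7)
sentence = " ".join(expand("S", rng))
print(sentence)  # grammatical, yet no author intended its meaning
```

Every output obeys the grammar, but the "meaning" of any particular sentence is an accident of the random choices, which is precisely the effect described above.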
In one case, a paper constructed using the Dada Engine software was allegedly almost admitted to a conference, which, had it happened, would have passed Alan Turing’s classic test of computer intelligence. In the field of design, similarities may exist on formal, visual, or structural levels. Computational rearrangement of formal rules that describe, define, and formulate a certain style can produce a permutation of possible formal expressions for that style. For instance, drawing on Andrea Palladio’s original 40-odd designs of villas, Hersey and Freedman were able to detect, extract, and formulate
2 A Markov process is a stochastic process in which the probability distribution of the current state is conditionally independent of the path of its last state(s). This means that the process has no memory, except the memory of the last observed point(s).
Fig. 1: Skyscraper studies using combinatorial analysis (project by Joshua Dannenberg and Christ Shusta for course GSD 2311 taught by Kostas Terzidis at Harvard University).
rigorous geometric rules by which Palladio conceived these structures. Using a computational algorithm, they were able to create villa plans and facades that are stylistically indistinguishable from those of Palladio himself. Similarly, Dannenberg and Shusta developed an algorithm that produces all possible combinations of skyscrapers for a given site [Fig. 1]. Their strategy uses physical and geometric parameters to script a computer modeling code that builds, renders, and organizes an infinite number of skyscraper possibilities, from which emerges a formal pedigree categorized in texture and performance. What is remarkable about this – or any other combinatorial analysis – is that it can produce computationally any possible form ever created or any yet to be created. Algorithms can be used to solve, organize, or explore problems with increased visual or organizational complexity. In its simplest form, a computational algorithm uses numerical methods to address problems. The basic linguistic elements used in algorithms are constants, variables, procedures, classes, and libraries, and the basic operations are arithmetical, logical, combinatorial, relational, and classificatory, arranged under specific grammatical and syntactical rules. These elements and operations are designed to address the numerical nature of computers while providing the means for composing logical patterns. However, while numbers are often regarded as discrete quantitative units that are utilized for measuring, in computational terms numbers can be constructed to address an infinite degree of division, thus exhibiting theoretical continuity. Similarly, random variables or conditions can be inserted into an algorithm, further increasing the degree of unpredictability of the final outcome and magnifying the level of complexity.
Contrary to common belief, algorithms are not only deterministic processes developed to explain, reason, or predict a humanly conceived problem, but can become venues for addressing complexities that exceed human ability to explain, reason, or predict. The random application of rules to a random pattern does not necessarily produce further randomness but, surprisingly, may produce a certain degree of order. Buffon’s experiment with needles3 showed that random sampling contributes to a stochastic approximation of the number π = 3.141592… In a similar
3 In the 18th century, Georges-Louis Leclerc, Comte de Buffon, noticed that the probability of a randomly thrown needle lying across the lines of a floor made of parallel strips of wood can be used to approximate π.
fashion, random application of simple rules to random patterns produces a phenomenon referred to as self-organization. This is the progressive development of a system out of a random initial condition into an organized form. Instead of applying a center-based hierarchical decision-making process to create order, we use a parallel multiple decision-making process based on local neighboring information that contributes towards a collective emergent ordered behavior. In such systems, called cellular automata,4 local agents interact with one another based on rules pertinent to the information at their level, and contribute information synchronically towards a collective decision that often exhibits unexpected behavior, such as that of self-organization. Such systems exhibit autonomy, self-maintenance, adaptation, heterogeneity, complexity, and order. Cellular automata have been used to describe, explain, and predict complex behaviors found in biology, mathematics, physics, or social systems. An alternative approach to cellular automata, called a genetic or evolutionary algorithm,5 is based on evolutionary biology, using terms and processes such as genomes, chromosomes, cross-over, mutation, or selection. The evolution starts from a population of completely random individuals and proceeds in generations. In each generation, the fitness of the whole population is evaluated, and multiple individuals are selected from the current population (based on their fitness) and modified (mutated or recombined) to form a new population, which becomes the current population in the next iteration of the algorithm. Such systems lead to the emergence of ordered patterns that can simulate, explain, and predict complex behaviors. Random patterns are iteratively enhanced and evaluated until a set of satisfying conditions is met. Genetic algorithms address a problem at the level of binary code (genotype) and use the resulting form (phenotype) as a means of evaluating progress.
Thus, there is an emergent behavior embedded in the process of deriving possible solutions to a problem. This behavior is based on the premise that, under particular constraints, individual units may emerge into globally functional configurations by resolving their local neighboring conditions in a repetitive manner. Contrary to common belief, such seemingly chaotic local behavior does not necessarily result in chaotic overall behavior, but rather in an evolved form that optimizes (if not resolves) the local constraints. 4 See John von Neumann: The Theory of Self-reproducing Automata, ed. Arthur Burks, Urbana 1966. 5 See John H. Holland: “Genetic Algorithms: Computer programs that ‘evolve’ in ways that resemble natural selection can solve complex problems even their creators do not fully understand,” Scientific American, 1992, 66–72.
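Buffon’s experiment (note 3) translates directly into a Monte Carlo sketch: with needle length equal to the line spacing, a needle crosses a line with probability 2/π, so the observed hit rate yields an estimate of π. The sample size below is an arbitrary illustrative choice.

```python
import math
import random

def estimate_pi(drops, rng):
    """Buffon's needle with needle length = line spacing = 1."""
    hits = 0
    for _ in range(drops):
        y = rng.uniform(0.0, 0.5)              # needle center's distance to the nearest line
        theta = rng.uniform(0.0, math.pi / 2)  # needle angle against the lines
        if y <= 0.5 * math.sin(theta):         # does the needle cross a line?
            hits += 1
    # P(cross) = 2/pi, so pi is approximated by 2 * drops / hits.
    return 2.0 * drops / hits

rng = random.Random(0)
pi_hat = estimate_pi(200_000, rng)
print(pi_hat)  # approaches 3.141592... as the number of drops grows
```

Random sampling, applied under a fixed rule, here converges toward a precise mathematical constant, which is the point of the example in the text.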
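Self-organization out of a random initial condition can be observed in even the simplest cellular automata. The sketch below uses elementary rule 184, often read as a single-lane traffic model in which a car advances only when the cell ahead is free; it is offered as a generic illustration, not as the method of the projects discussed here. From a random arrangement at half density, purely local rules settle the ring into a globally ordered, free-flowing pattern.

```python
import random

def step(cells):
    """One synchronous update of rule 184 on a ring:
    a 1 ('car') moves right exactly when the cell to its right is 0."""
    n = len(cells)
    nxt = [0] * n
    for i in range(n):
        left, here, right = cells[i - 1], cells[i], cells[(i + 1) % n]
        # a car stays if blocked; an empty cell is filled by a car arriving from the left
        nxt[i] = 1 if (here == 1 and right == 1) or (here == 0 and left == 1) else 0
    return nxt

rng = random.Random(3)
cells = [1] * 20 + [0] * 20   # half density ...
rng.shuffle(cells)            # ... in random order

for _ in range(80):           # the same local rule, applied repeatedly
    cells = step(cells)

s = "".join(map(str, cells))
print(s)                      # settles into alternating 1s and 0s: the jam dissolves
assert "11" not in s + s      # no two adjacent cars anywhere on the ring
```

No cell “knows” the global state; the order emerges solely from each cell resolving its immediate neighborhood, which is the collective behavior described above.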
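The generational loop described above (evaluate fitness, select, recombine, mutate) can be sketched with the classic “OneMax” toy problem: maximize the number of 1s in a bitstring. The problem and all parameters are illustrative inventions, not anything drawn from the projects discussed in the text.

```python
import random

LENGTH, POP, GENS = 30, 40, 60
rng = random.Random(1)

def fitness(genome):
    """Phenotype evaluation: count the 1s in the genotype."""
    return sum(genome)

def tournament(pop):
    """Selection: pick the fitter of two randomly drawn individuals."""
    a, b = rng.sample(pop, 2)
    return a if fitness(a) >= fitness(b) else b

# Evolution starts from a population of completely random individuals.
pop = [[rng.randint(0, 1) for _ in range(LENGTH)] for _ in range(POP)]
initial_best = max(map(fitness, pop))

for _ in range(GENS):
    elite = max(pop, key=fitness)            # elitism: always keep the best
    children = [elite[:]]
    while len(children) < POP:
        p1, p2 = tournament(pop), tournament(pop)
        cut = rng.randrange(1, LENGTH)       # single-point crossover
        child = p1[:cut] + p2[cut:]
        child = [bit ^ (rng.random() < 1 / LENGTH) for bit in child]  # mutation
        children.append(child)
    pop = children                           # the new population becomes current

final_best = max(map(fitness, pop))
print(initial_best, "->", final_best)        # fitness rises over the generations
assert final_best >= initial_best
```

Each individual step is random, yet selection pressure steers the population toward order, illustrating how random mistakes, iteratively evaluated, converge on a solution.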
One of the main problems in architecture today is the quantity of information and the level of complexity involved in most building projects, especially in high-rises and large-scale housing projects. Housing projects for a few hundred to several thousand people have started to emerge in large urban areas. Here, the old paradigm for housing design was the development of high-rises that served as stacking devices for multiple family housing units. This is unfortunately the only way to address excessive complexity using manual design skills, mainly because it is simple to conceive, but also because it is simple to construct. The unfortunate nature of this approach lies in the uniformity, similarity, and invariability that these projects express, in comparison to the individuality, discreteness, and identity that human beings and families express. In these projects there is a typology of residential units that need to be combined in various schemes to fulfill multiple functional, environmental, and economic constraints. While small apartment buildings may be solvable within one architect’s design capabilities, the design and planning of large projects with several thousand inhabitants is a challenge. The problem is to fulfill all complex requirements without using conventional repetitive high-rise patterns. Snyder and Ding addressed the problem of large-scale high-rise housing by using cellular automata as an ordering device to fulfill multiple housing constraints [Fig. 2]. Similarly, Somnez and Bu used stochastic search to determine the position of building elements within a high-rise [Fig. 3]. What makes randomness problematic for designers and architects is that they have traditionally maintained the view that design is associated with purpose, intention, or aim. Design is thus contrasted with purposelessness, randomness, or lack of complexity. The traditional view is that design can only arise out of the mind of a sentient designer.
Challenging these assumptions, computational theories have proposed an alternative definition of design, in which it is still meaningful to speak of design without always speaking of a sentient designer. Rather than assuming the presence of a sensible mind, it may be that certain impersonal forces are equally capable of giving rise to a phenomenon called design. These two antithetical positions, albeit metaphysical, present two different theoretical approaches to the intellectual origin of design. With the emergence of digital processes, it can be argued that certain qualities of the human mind such as those that contribute to what is considered “smart,” i.e. sharpness, quick thought, or brightness, may not be desirable or even applicable when dealing with the computer’s reasoning. What is considered
Fig. 2: Large-scale housing project using cellular automata (project by Mathew Snyder and Jeff Ding for course GSD 2311 taught by Kostas Terzidis at Harvard University).
to be smart in the one world may be considered stupid in the other.6 The dominant mode for discussing creativity in architecture has traditionally been that of intuition and talent, where stylistic ideas are attributed to an individual, a genius, or a group of talented partners within a practice. In contrast, an algorithm is a procedure whose result is not necessarily credited to its creator. Algorithms are understood as abstract and universal mathematical operations that can be applied to almost any kind or quantity of elements. What matters is not the person who invented an algorithm, but rather its efficiency, speed, and generality. It can be argued, therefore, that human decision-making (i.e. manual design) can be arbitrary, authoritative, and often naive in comparison to computational schemes, in which complexity, consistency, and generality are celebrated, liberating design from subjective interpretation and leading towards the emergence of designs that, while functional, may surprise even their own creators. These two distinct practices have deep and profound differences that are both ideological and methodological. For the last four decades, beginning with Christopher Alexander’s Notes on the Synthesis of Form and Robert Venturi’s Complexity and Contradiction in Architecture and continuing through a plethora of formal studies and computational methods,7 designers, architects, and urban planners have been primarily concerned with the increased complexity involved in the design of buildings, urban areas, and cities. Research and professional practice have attempted to automate traditional “manual” methods of production using computer-aided design tools and to consider architectural schools and offices as hubs for cross-pollination between diverse engineering disciplines. When comparing architectural design with other software-intensive engineering design disciplines, it is necessary to overlook many significant and distinguishing differences in order to identify at least one common theme: the use of computational methods to address excessively complex tasks. Historically, algorithms have been used quite extensively in architecture. While the connotation of an algorithm may be associated with computer science, the instructions, commands, or rules of architectural practice are, in essence, algorithms. Architectural design also has a long history of addressing complex programmatic requirements through a series of steps, yet without a specific design target. Unlike other design fields, where the target is to solve a particular problem in the best possible way, architectural design is open-ended, in flux, and uncertain. Codified information, such as standards, codes, specifications, or types, simply serves the purpose of conforming to functional requirements, yet does not guarantee a successful design solution. However, while deciding under uncertainty requires some degree of experience, intuition, and ingenuity, it may also be argued that it requires an ability to make as many random mistakes as it takes until an acceptable solution is encountered. Traditionally, the second requirement, while plausible, has never been considered a viable option, for at least two reasons: because it is simply too hard to go through all the possibilities that exist, and, crucially, because it lacks the most important ingredient of any decision: human involvement. A decision is by definition intentional, and therefore human. Computational systems lack causal powers that would give them intentionality, a necessary condition of thinking.

6 For instance, to deduce a secret password, a person may exploit context-based conjectures, reductive reasoning, or assumptions as strategies for saving time and effort. In contrast, a computer can solve the same problem by simply checking all possible combinations of all alphanumeric symbols until a match is found. Such a strategy, referred to as brute force, would be considered overwhelming, pointless, naïve, or impossible for a human investigator. Nonetheless, given the computational power of a computer, such a “strategy” may take only a few seconds to check millions of possibilities, something inconceivable to the human mind. 7 See Christopher Alexander: Notes on the Synthesis of Form, Cambridge 1967; Robert Venturi: Complexity and Contradiction in Architecture, New York 1966; also see Marcos Novak: “Computational Compositions”, in: ACADIA 88, Proceedings, pp. 5–30; William Mitchell: Logic of Architecture, Cambridge 1990; Peter Eisenman: “Visions Unfolding: Architecture In the Age of Electronic Media”, in: Domus no. 734 (January 1992), pp. 20–24; John Frazer: An Evolutionary Architecture, London 1995; Greg Lynn: Animate Form, New York 1999.

Fig. 3: Skyscraper studies using stochastic search and random sampling (project by Mete Somnez and XiaoJun Bu for course GSD 2311 taught by Kostas Terzidis at Harvard University).
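The brute-force strategy described in note 6 is easy to make concrete. A minimal Python sketch follows; the alphabet, the target password, and the length cap are illustrative assumptions, not taken from the text:

```python
import itertools
import string

def brute_force(target, max_len=3):
    """Exhaustively try every alphanumeric string up to max_len.

    This is the 'pointless' machine strategy from note 6: no context,
    no deduction -- just checking all combinations until one matches.
    """
    alphabet = string.ascii_lowercase + string.digits
    for length in range(1, max_len + 1):
        for combo in itertools.product(alphabet, repeat=length):
            candidate = "".join(combo)
            if candidate == target:
                return candidate
    return None  # target longer than max_len: not in the search space

print(brute_force("ab1"))  # found by sheer enumeration; prints: ab1
```

Each extra character multiplies the search space by 36 — trivial for a machine at this scale, "inconceivable" as a human strategy.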
Terms such as understanding, deciding, responding, or even more mundane ones, such as knowing, suggesting or helping, involve an elementary level of consciousness that computational devices do not possess.8 And yet, paradoxically, such systems occasionally do come up with interesting solutions using random iterations characterized by the use of dubious names such as genetic, artificial, or automatic, none of which is theoretically accurate. 8 The Chinese room paradox, posed by John Searle, is an argument about intentionality, understanding, and conscious experience. It imagines a room containing a table with cards depicting Chinese ideograms and a set of rules on how to place them. A non-Chinese speaker enters the room and arranges the cards. Then a Chinese speaker enters the room, looks at the card arrangement, and declares that it is a delightful poem. Of course, the first person did not have the faintest idea of what he or she was doing and the effect that it would have. John Searle: “Minds, Brains and Programs”, in: Behavioral and Brain Sciences 3, 1980, pp. 417–457.
Because of its quantitative nature, the study of complexity necessarily involves computational methods for the analysis, simulation, and synthesis of systems that involve large amounts of information or information processing. Unlike traditional methods of analysis and synthesis, computational schemes offer a degree of rationality that allows them to migrate into computer-executable programs. Furthermore, the ability to produce large amounts of random sampling allows the exploration of multiple solutions, some of which would never have been conceived by the designer. Such a possibility opens up potential that did not previously exist: rather than relying on human intelligence alone to resolve design problems, a complementary, synergetic relationship between humans and computers becomes possible. Any scientific approach to design therefore needs to take into consideration not only systematic, methodical and rational models, but also alternative approaches that address the nature of design as an indefinite, ill-defined and chaotic process. Both architects and engineers argue for the deployment of computational strategies for addressing, resolving, and satisfying complicated design requirements. These strategies result from a logic based on the premise that systematic, methodical and traceable patterns of thought are capable of resolving almost any design problem. While this assumption may be true for well-defined problems, most design problems are not clearly defined. In fact, the notion of design as an abstract, ambiguous, indefinite, and unpredictable intellectual phenomenon is quite attuned to the very nature of – or perhaps lack of – a single definition of design.
However, the mere existence of certain ambiguous qualities such as indefiniteness, vagueness or elusiveness, serves to indicate that perhaps design is not only about an epiphany, the extraordinary intellectual power of a genius, or a methodical collage of construction elements, but also about a progressive, optimizing search for possible solutions based on iterative random sampling. While the second approach may sound foreign, naïve, or even dangerous to some, it does possess a certain degree of merit due to its practical implementations within a progressively computational design world. Indeed, the philosophical implications are even more interesting as they challenge the very nature of what design is, or even further, what creativity is. Nevertheless, regardless of the intrinsic differences, it is clear that both positions are essential for detecting, understanding, and addressing complexity.
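The closing idea — design as "a progressive, optimizing search for possible solutions based on iterative random sampling" — can be sketched in a few lines. The one-line objective below is an invented stand-in for a real design evaluation:

```python
import random

def random_search(evaluate, dimensions, samples=1000, seed=42):
    """Iterative random sampling: propose candidates blindly, keep the best.

    `evaluate` returns a cost (lower is better); `dimensions` is the number
    of design parameters, each sampled uniformly in [0, 1].
    """
    rng = random.Random(seed)
    best, best_cost = None, float("inf")
    for _ in range(samples):
        candidate = [rng.random() for _ in range(dimensions)]
        c = evaluate(candidate)
        if c < best_cost:  # a 'random mistake' that happened to work
            best, best_cost = candidate, c
    return best, best_cost

# Toy stand-in for a design objective: squared distance of a
# 3-parameter 'design' from an (assumed) ideal point.
ideal = [0.2, 0.5, 0.8]
cost = lambda d: sum((x - t) ** 2 for x, t in zip(d, ideal))
best, best_cost = random_search(cost, dimensions=3)
```

The search never reasons about the problem; it simply accumulates lucky accidents — which is exactly the theoretical provocation the essay describes.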
Klaus Mainzer

STRATEGIES FOR SHAPING COMPLEXITY IN NATURE, SOCIETY, AND ARCHITECTURE

Homage to Robert Venturi

Robert Venturi’s discussions of complexity in architecture display remarkable parallels with complexity research. His books Complexity and Contradiction in Architecture1 and Learning from Las Vegas,2 as well as his later Iconography and Electronics upon a Generic Architecture,3 voice ideas about architecture that are comparable to ones I have explored in my research into complexity in the natural and social sciences, and in computer, information and communications technologies. Venturi sees complexity as a reply to classical Modernism, whose purist forms and functions he found incapable of adequately conceptualizing the diversity, the ruptures, and the dynamism of the urban environment in the postindustrial late 20th and early 21st centuries. The degree to which Venturi regards this approach to architecture as being a more generalized situational analysis is made explicit right at the beginning of Complexity and Contradiction: “Instead, I speak of a complex and contradictory architecture based on the richness and ambiguity of modern experience, including that experience which is inherent in art. Everywhere, except in architecture, complexity and contradiction have been acknowledged from Gödel’s proof of ultimate inconsistency in mathematics to T.S. Eliot’s analysis of ‘difficult’ poetry to Joseph Albers’ definition of the paradoxical quality of painting.”4 In my opinion, the purified and abstract forms of Modernism correspond to the mathematical laws of symmetry, structure, and invariance found in modern physics, which are ultimately reflected in a view that goes back to Plato. According to these laws, the multiplicity and change found in the world are reducible to simple and unalterable ideal forms.5 The regular bodies of Euclidean geometry, which Plato regarded as the building blocks of the universe, are the equivalent of
1 Robert Venturi: Complexity and Contradiction in Architecture, New York 1966, p. 22. 2 Robert Venturi, Denise Scott Brown, and Steven Izenour: Learning from Las Vegas. The Forgotten Symbolism of Architectural Form, New York 1972, p. 151. 3 Robert Venturi: Iconography and Electronics upon a Generic Architecture. A View from the Drafting Room, Cambridge/Mass. 1996. 4 Venturi, 1966, p. 22. 5 Plato: Timaios 53d.
the symmetrical features of elementary particles in contemporary high-energy physics and cosmology. Analogously, architects such as Le Corbusier sought out the pure and functional forms of Modernism in the proportional language of the Greek temple. In Symmetries of Nature,6 I investigated this scientific perspective of Modernism in mathematics, physics, chemistry, biology, art, and architecture, and traced it back historically to its sources in Antiquity. In general, change, multiplicity, and the genesis of the new are possible only through the rupturing of symmetry and random fluctuations. In such cases, physicists speak of phase transitions and non-equilibrium dynamics. In Symmetries of Nature,7 I make reference to Postmodernism with an eye toward architecture, yet find myself, unlike Robert Venturi, unable to identify with this trend. The often merely additive, arbitrary, and complaisant element of architectural Postmodernism misses that which is essential about the complexity of the world. Thinking in Complexity,8 the first edition of which appeared in 1994, in some sense represents a systems-theoretical counterpart to Venturi’s Complexity and Contradiction in Architecture. It begins with a critique of the Laplacean spirit, according to which the world is regarded as fully calculable according to the presuppositions of classical mechanics. The Modernist mania for planning has its roots in the fictions of the Laplacean spirit. The variegated quality, the ruptures, and the contingencies found by Venturi in the Las Vegas Strip are disclosed for me in the fractality, ruptured symmetry, and nonlinear dynamics of complex systems. Ugliness and the everyday are integral components of the human lifeworld, as are entropy, noise, and randomness in real physical systems. Whoever seeks to eliminate them fails to recognize the reality of the world.
Only through ruptures of symmetry do creativity and innovation become possible in the first place.9 Complex systems consist of numerous interacting elements: from the molecules of liquids or air streams to cells in organisms, to organisms within populations, and all the way to individuals in markets or societies, or processors in computer networks such as the World Wide Web. Complex systems such as the climate, economic systems, organisms, populations, markets, or communication 6 Klaus Mainzer: Symmetries of Nature. A Handbook for Philosophy of Nature and Science, Berlin, New York 1996. 7 Ibid., chapter 5.4. 8 Klaus Mainzer: Thinking in Complexity. The Computational Dynamics of Matter, Mind, and Mankind, 5th edition, Berlin, Heidelberg, New York 2007. 9 Cf. Klaus Mainzer: Der kreative Zufall. Wie das Neue in die Welt kommt, Munich 2007, chapter 7.
networks cannot be guided like mechanical devices. But they do obey nonlinear laws of self-organization that are thoroughly amenable to human understanding. To be sure, their nonlinear mathematics does not generally permit analytical solutions of the kind that are able to precisely predict, and hence to plan, the behavior of single elements such as planets or billiard balls. But we can simulate their dynamics on computers, thereby investigating typical patterns of behavior of the total system under the relevant auxiliary conditions. And in climate models, for example, we are able to test out the relevant events in simulation. Transferring these principles to architecture and city planning, we find that a commercial metropolis such as Las Vegas is a complex urban system, whose nonlinear dynamics cannot be planned out on the drawing board as in the Cartesian city. In this context, computer-based design procedures have opened up new possibilities for creating simulations in virtual reality. Cities are not constructible or plannable machines in the sense of the industrial age, a notion continually propagated by the exponents of Modernism.10 Organisms and ecosystems, which are self-organizing, are typical complex systems. In unstable situations, their independent dynamics can display extreme sensitivity. They nonetheless follow laws that can be expressed mathematically, and in many cases formulated stochastically. It is here that we find their essential difference from architectural Postmodernism: complexity can be modeled according to laws. Admittedly, we need to learn to understand the iconography of complex systems. Time series analysis and attractor analysis serve this purpose in the theory of complex systems.11 Venturi investigated the semiotics of urban systems. In his Iconography and Electronics upon a Generic Architecture, for example, he emphasizes the decisive role of information and computer technology in a forward-looking way.
Complex Systems in Nature and Society

In the late 18th century, at the high point of the French Enlightenment and during the age of mechanics, the French astronomer and mathematician Pierre-Simon Laplace (1747–1827) took as his point of departure the assumption that,
provided all laws of force and initial conditions were known, all of nature would be calculable. The “Laplacean spirit” was the buzzword of the time. Today, we could conceive of a computer based on the totality of these equations. In principle, Laplace’s assumption is true for simple systems, for example, when only two elements interact: the “two-body problem.” In this case, element A causally influences element B. This can be pictured as the physical impact of one ball on another. In a causal sequence of events, cause and effect are proportional, so that similar causes generate similar effects. That is to say: a minimal impact by one ball generates minimal deflection in another ball, while a larger impact generates a larger deflection. In this example, moreover, it can be assumed that the effect is unambiguously determined by the cause. Such proportional and determinable interactions correspond to linear equations that are solvable mathematically. However, if more than two elements interact in a dynamic system (the multi-body problem), then total calculability is no longer possible – and this is true even where cause unambiguously determines effect. A multiplicity of interactions between the elements potentially manifests itself in such cases, involving feedback loops, which usually escalate one another, leading to instability, turbulence and chaos. If cause and effect still unambiguously determine these interactions, as before, we speak of “deterministic chaos.”12 Mathematically, such feedback-governed interactions correspond to mutually dependent nonlinear equations, which cannot simply be solved analytically. In cases of deterministic chaos, the system is extraordinarily sensitive to the slightest alteration of its initial state. This is the well-known “butterfly effect”: in an unstable situation, a small, unnoticed turbulence on a weather chart, one that might have been triggered, in principle, by the flutter of a butterfly’s wings, is amplified to become a global change in the total weather picture.13 This is why weather forecasts are only valid in the short term, and why, in principle, long-term prognoses are impossible: in unstable systems, the calculations necessary for future prognosis grow exponentially – even if the equations expressing cause and effect can be determined unambiguously. Open systems, which remain in a state of continuous exchange of material and energy with their environments, are particularly interesting. In such systems, interactions between the system’s elements at the micro level can collectively generate patterns and structures at the macro level, which can, in turn, be traced back to the behavior of the individual elements. Chaos and order, then, organize themselves through these interactions between the micro and macro levels of a complex system, and are governed by feedback loops in ways that depend upon exchanges of matter and energy with the environment. In such cases, we refer to the “self-organization of chaos and order in open complex dynamic systems.” This can take the form of a climate system consisting of many molecules, an organism consisting of many cells, or populations consisting of numerous plants or animals. Self-organization is linked to the phase transitions of complex and dynamic systems, leading to the genesis of increasingly complex structures. The genesis of a structure that can be traced back to the self-organization of a complex system is also referred to as “emergence.” This phase transition can be vividly illustrated by a branching tree [fig. 1], where the changing state of the system is displayed in relation to a control parameter. Through changed conditions of the system, previous orders collapse in proximity to points of instability, and new orders and patterns emerge. Here, these points of instability are represented as points of divergence, where new branches of development open up.

10 Ulrich Conrads (ed.): Programme und Manifeste zur Architektur des 20. Jahrhunderts, Bauwelt Fundamente vol. 1, Basel, Boston, Berlin 2000 (first edition Berlin 1964). 11 Time series are time-dependent sequences of measurement data that record the changing states (i.e. dynamics) of complex systems. The human heart, for example, is a complex cellular system whose changes of state over time are expressed in EKG curves. Attractors are states in which a dynamic system is involved over extended time periods. A state of equilibrium corresponds to a fixed-point attractor that no longer alters over time (it “remains fixed”). One example of this is a swinging pendulum, which gradually becomes motionless due to friction. In limit cycles, states are periodically repeated. Examples of this are all periodic oscillations (such as those of a clock or a beating heart). In the case of a chaos attractor, the development of a state culminates in non-periodic, irregular behavior, which cannot be predicted in the long term (e.g. ventricular fibrillation). In time series or attractor analyses, the task is to define the states of the respective complex system in order to identify its time series patterns or attractor types and to draw the appropriate conclusions for dealing with the system. Cf. Klaus Mainzer: Komplexität, UTB-Profile, Munich 2008. 12 Ibid., chapter 4. 13 Ibid., chapter 4.
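Deterministic chaos and the butterfly effect can be demonstrated with the simplest textbook case, the logistic map x_{n+1} = r·x_n·(1 − x_n). The parameter values below are illustrative; the example itself is standard and not drawn from the text:

```python
def logistic_trajectory(x0, r=4.0, steps=60):
    """Iterate the logistic map: fully deterministic, yet chaotic at r = 4."""
    xs = [x0]
    for _ in range(steps):
        x = r * xs[-1] * (1.0 - xs[-1])
        xs.append(min(1.0, max(0.0, x)))  # clamp guards against float round-off
    return xs

a = logistic_trajectory(0.400000000)   # the 'weather' as measured
b = logistic_trajectory(0.400000001)   # the same, plus a butterfly
divergence = [abs(x - y) for x, y in zip(a, b)]
# The initial difference of about 1e-9 roughly doubles on average each
# step: short-term prediction works, long-term prognosis is impossible.
```

The rule is exactly known and unambiguous, yet the two trajectories become completely decorrelated — the essence of "deterministic chaos."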
When properly interpreted, such magnitudes can be selected as new patterns of streaming in aerodynamics, or as new species in the ramifying system of Darwinian evolution. With a view to cultural history, it seems promising to conceptualize the development of human societies as dynamic and complex systems. Like weather fronts on a geographical map, hunting, agricultural, and industrial societies diffuse outward. As early as the industrialization of the 19th century, networks of streets and streetcars formed the nervous system of spreading national states. The dynamics of urban development can be studied in computer models: we begin with a uniformly populated region. This region is simulated on a chessboard-style network of nodes, upon which changing population densities over time can be depicted. Locations are linked to specific functions that express their
Fig. 1: Bifurcation tree representing the phase transitions in complex dynamic systems: as a control parameter varies, random fluctuations at a point of instability decide whether the system state retains the old order or branches into a new order.
industrial capacities, traffic linkages, and their values in terms of leisure and recreational amenities. A population equation models the nonlinear dynamics of the settlement, which is displayed in the form of new urban centers, industrial areas, high-density zones, and changes to traffic and transport networks. Settlement patterns correspond to macroscopic orders, which can be characterized through ordering parameters. Such complex systems are reminiscent of the dynamics of organisms rather than those of machines, which can be centrally planned and guided. Architectural Modernism, on the other hand, regarded the city as a machine for living and working. In this sense, it stood in the tradition of Cartesian city planning of early-modern Rationalism, which ran aground on the nonlinear dynamics of complex settlements and urban structures. Complexity research allows us to understand the laws of nonlinear dynamics better, albeit without mastering them. Instead, it reveals the design potential of inducing systems to develop independently in the desired direction by introducing appropriate auxiliary conditions. Climate change cannot be simply “switched off” via the application of the appropriate lever. In order to influence its complex dynamics, marginal and initial conditions must be altered at a sufficiently early stage. Computer models of the dynamics of the city serve the study of independent urban dynamics, so that the impact of specific planning measures can be studied in simulation. At such levels of complexity, there are no analytical solutions that can be read off, so to speak, from a set of equations. 
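A toy version of the settlement model described above — a chessboard-style grid of locations, fixed locational values, population-dependent attractiveness, and nonlinear migration — might look as follows. All coefficients and the attractiveness rule are invented for illustration, far simpler than any research model:

```python
import random

def simulate_settlement(size=10, steps=50, mobility=0.1, feedback=0.5, seed=1):
    """Toy nonlinear settlement dynamics on a grid of nodes.

    Each cell's attractiveness = fixed locational value + feedback * population.
    Each step, a `mobility` fraction of a cell's population moves to its most
    attractive 4-neighbor (if that neighbor beats staying put). The positive
    feedback turns a uniform start into distinct 'urban centers'.
    """
    rng = random.Random(seed)
    # Fixed locational values standing in for amenities, traffic links, etc.
    value = [[rng.random() for _ in range(size)] for _ in range(size)]
    pop = [[1.0] * size for _ in range(size)]  # uniform initial settlement

    def neighbors(i, j):
        return [(i + di, j + dj) for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1))
                if 0 <= i + di < size and 0 <= j + dj < size]

    for _ in range(steps):
        attract = [[value[i][j] + feedback * pop[i][j] for j in range(size)]
                   for i in range(size)]
        flow = [[0.0] * size for _ in range(size)]
        for i in range(size):
            for j in range(size):
                best = max(neighbors(i, j) + [(i, j)],
                           key=lambda p: attract[p[0]][p[1]])
                if best != (i, j):
                    moving = mobility * pop[i][j]
                    flow[i][j] -= moving
                    flow[best[0]][best[1]] += moving
        for i in range(size):
            for j in range(size):
                pop[i][j] += flow[i][j]
    return pop

pop = simulate_settlement()  # macro-level centers emerge from micro-level moves
```

Total population is conserved exactly; what self-organizes is its distribution — high-density centers that no single rule prescribed.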
Self-organization and Dynamics in Computer, Information, and Communications Systems

The previous examples demonstrated how the investigation of chaos and nonlinear dynamics in complex systems must rely on computer simulations, and that this is so because analytical solutions of the corresponding differential and integral equations of nonlinear dynamics are frequently unattainable. The mathematical principles of the theory of dynamic systems were known as early as the beginning of the 20th century. But only the possibilities of visualization and experimentation offered by contemporary computing technology explain the research boom and the popularity of complexity research today. In many cases, it is possible to adequately simulate the high degrees of nonlinearity found in system dynamics. One example is computer-based architectural and urban planning design. These disciplines cannot usually manage without nonlinear differential and integral equations to describe mathematically the developmental dynamics of complex systems such as cities and settlements. But it is not just a question of computer-based simulations of dynamic systems. Meanwhile, our environment is being shaped by complex digital communication networks. In the first Web generation, Web 1.0, virtual documents were only represented and connected via links. The subsequent – current – generation is referred to as Web 2.0. Collective design supplants mere representation and the exclusively passive reception of documents and websites: Web 2.0 promotes collective Web intelligence. Standardization, subsequent usability, collaboration and cooperation, user-friendliness, cost-effectiveness, and intermedia publication require the tools of collective intelligence: blogs, wikis, podcasts, social networking, video communities, photo communities and others could be named. Well known to Web users is the encyclopedia Wikipedia, to which thousands of anonymous authors contribute. Specialized articles are produced via self-organization, by no means haphazardly, and with no sense of being second-best. Self-correction and self-monitoring produce measurable improvements in quality. Another phenomenon of simulated reality is Second Life, a virtual online universe. On this online 3-D playing field, millions of Internet users navigate a photorealistic 3-D world with self-created avatars, interacting with other players and creating new worlds according to their own tastes. The purchase of building sites and the design of buildings and settlements are essential activities in Second Life. It would be interesting to apply Venturi’s Learning from Las Vegas to the typology of virtually existing strips from Second Life, in order to study the iconography and architectural symbolism of a virtual commercial city.
To avoid foundering on the multiplicity and complexity of such virtual worlds, information systems must become more intelligent, and must learn to adapt to their users. Personalization is the answer to the demands of digital complexity. In the meantime, “ubiquitous computing” is being discussed: microprocessors and computing functions found everywhere in the devices of contemporary everyday and professional life. Ubiquitous computing means the all-encompassing realization of information through digital information and communication networks, which already permeate our life environment like a complex nervous system. With ubiquitous computing, personalized information devices are available everywhere, locally as well as globally (wireless), and via the World Wide Web. As devices, they retreat into the background (“invisible computing”), while supplementing the living environment (“augmented reality”). Architects and urban planners must consider how to design the electronic infrastructures of buildings and devices in a “personalized” manner. In Venturi, we read: “The relevant revolution today is the current electronic one. Architecturally, the symbol systems that electronics purveys so well are more important than its engineering content. The most urgent technological problem facing us is the humane meshing of advanced scientific and technical systems with our imperfect and exploited human systems, a problem worthy of the best attention of architecture’s scientific ideologues and visionaries.”14 For architectural and space design, personalized information services in the world of work are an enormous challenge. Ubiquitous computing makes possible conference rooms in which, apart from furnishings such as tables and chairs, no computers are to be seen other than personal PDA (personal digital assistant) end devices. All participants communicate and work on joint projects via sensitive control surfaces and the natural guiding of gestures. Heterogeneous infrastructures can be dealt with via ergonomic architecture. One sensitive issue is how to secure access to personal data while providing reliable general access. What Venturi refers to as a challenge of the digital age, namely “the humane meshing of advanced scientific and technical systems with our imperfect and exploited human systems,”15 must produce new ergonomic infrastructures. This is by no means a question of futuristic science-fiction architecture, which he criticized with good reason.16 In order to become humane, architecture must conceptualize the genuine complexity of human beings and their environment.
14 Venturi, Scott Brown, and Izenour, 1972, p. 151. 15 Ibid. 16 Ibid., p. 150; Learning from Las Vegas, p. 176: “They also share the same tradition in architectural technology, taking the progressive, revolutionary, machine-aesthetic stance of the early Modern architects; part of being ‘heroic and original’ is being advanced technologically. The discrepancies between substance and image in Modern architecture’s technological machismo and the costliness of its frequently empty gestures emerged earlier than architects would admit.”
Johann Feichter

COMPLEXITY AND CLIMATE

“Long before the human spirit’s impulse to build, the constructing movement existed in the world at large, already a layering and joining together, and it has remained so up to the present. Just as we erect our walls, the Earth erects itself layer upon layer from the precipitations of air and water.”1 The astonishing thing about complexity is that it arises from the simplest phenomena. Fractals,2 for instance, acquire their complexity via the constant repetition of simple basic rules. The complex numeric models designed to describe our environment and the climate system function similarly. These models encompass a large number of processes, each of which is described in strongly simplified terms. Complexity arises from the numerous linkages and feedback loops between the individual processes. This “condition of interdependency” is the precise meaning of the Latin word complexus.

From the Complex to the Simple

To think means to abstract, means constructing images from phenomena perceived by the senses. Thinking means creating a model of reality and discovering (or inventing) its laws. In the most radical form of abstraction, we grasp reality in numbers and formulate laws as mathematical algorithms, for example the equations of hydrodynamics that describe flows in the atmosphere and in the oceans. Gaston Bachelard, a French theoretician of science, formulated the following as a counter-thesis to a romanticist sensualism: “Faced with nature, the scientific spirit […] can only learn provided the natural material is purified and the tangled phenomena ordered.”3 Similarly, Gottfried Wilhelm Leibniz argued for a mathematization of our perceptions of nature, claiming that with a calculus for converting all signs into characteristic numbers, “the human being” would be in possession of a new instrument “that would elevate the capacities of the mind far more than optical glass has promoted the sharpness of the eye, and that would
1 Rudolf Schwarz: Die Bebauung der Erde (The Development of the Earth), Heidelberg 1949, p. 21. 2 Ulf Skirke: Technologie und Selbstorganisation: Zum Problem eines zukunftsfähigen Fortschrittsbegriffs, dissertation at the Department of Philosophy, Hamburg University, 1998, p. 77. 3 Gaston Bachelard: La formation de l’esprit scientifique, Paris 1938; English edition: The Formation of the Scientific Mind, Manchester 2006.
surpass the microscope and the telescope to the same degree that reason is superior to the sense of sight.”4 To formulate the complexity of reality in mathematical symbols and algorithms means to attempt to render it graspable and surveyable. What meteorology designates the “climate system” encompasses the atmosphere, the hydrosphere (oceans and hydrological cycles), the cryosphere (comprising sea and land ice), the land surface, and the marine and terrestrial biospheres. If we wish to accord humankind a special status, then we also speak of an “anthroposphere.” Via exchanges of energy, momentum, and mass, the various components of the climate system are linked with one another and stand in a reciprocal relationship in a multiplicity of ways. The individual components experience change on highly divergent temporal scales, and their memories function on divergent time scales. The atmosphere, for example, is a storage unit with very low energy content and a short memory, which functions on a scale of days or at most weeks, a fact that makes weather forecasting over extended time periods difficult. The large ice sheets covering Antarctica and Greenland, in contrast, have long memories, functioning on a scale of many thousands or tens of thousands of years. For deep ocean water, the characteristic exchange occurs on scales of up to a thousand years. Interactions and nonlinear feedback loops between these components mean that the climate system is subject to natural fluctuations. These fluctuations are chaotic, and hence unpredictable over longer periods of time (internal variability). Climatic fluctuations are observable on a variety of temporal scales, ranging from months to millions of years. Alongside internal variability, there are also climatic fluctuations or climate changes triggered by external mechanisms. Such mechanisms can be changes in solar radiation, changes in orbital parameters, severe volcanic eruptions, or human interventions.
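Feichter's opening observation — complexity arising from "the constant repetition of simple basic rules" — can be seen directly in code. Iterating a one-line XOR rule (cellular-automaton rule 90, a standard fractal-generating example assumed here, not taken from the text) draws the Sierpinski triangle:

```python
def rule90(steps):
    """Iterate rule 90: each cell becomes the XOR of its two neighbors.

    Starting from a single live cell, this one rule, repeated, produces
    the fractal Sierpinski pattern -- complexity from pure repetition.
    """
    width = 2 * steps + 1
    row = [0] * width
    row[steps] = 1  # single seed cell in the middle
    rows = [row]
    for _ in range(steps):
        prev = rows[-1]
        rows.append([(prev[i - 1] if i > 0 else 0) ^
                     (prev[i + 1] if i < width - 1 else 0)
                     for i in range(width)])
    return rows

for r in rule90(8):
    print("".join("#" if c else "." for c in r))
```

Nothing in the rule mentions triangles; the structure emerges entirely from the linkages between repeated applications of a trivially simple step.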
The formation of climate system models begins with the preparation of a system of equations to describe the movements of air in the atmosphere or water in the oceans. The objects of experience, air or water, are thereby replaced by idealized objects, in this case by mass points, which are accorded no spatial extension. This procedure yields a system of partial differential equations that cannot be solved analytically. In 1945, the Hungarian-American mathematician John von Neumann proposed replacing analytical methods with
4 Horst Bredekamp: Die Fenster der Monade. Gottfried Wilhelm Leibniz’ Theater der Natur und Kunst, Berlin 2004, p. 85.
Johann Feichter | Complexity and Climate
numeric ones to solve this system of equations for weather forecasting. In order to solve the equations numerically, the area under investigation also had to be subdivided into grid boxes, and the differentials represented as differences between the mean values of the grid boxes (finite differences). The results derived in this way are approximations of the precise solutions. It is also important to be clear about the fact that numeric models of the climate do not reflect the phenomena of our experiential world as they can be perceived by the senses, but as observed by instruments of measurement (pressure fields, temperature, etc.). Many important processes in the climate system, for example storm clouds, have far smaller spatial extensions than those of the grid boxes.5 Since these processes influence the large-scale state variables, they too must be taken into consideration. The method used for taking into account the influence of small-scale processes that the model cannot resolve in space and time is called parametrization. In the first simulated, “dry” climate models, which still contained no circulating water, clouds were still stable forms, and were represented as solid and unchangeable. With the transition from these “dry” atmospheric models to “moist” ones came the simulation of the properties of clouds. Now, water vapor was transported and allowed to condense and to precipitate in the form of rain. The result was a realistic scenario of changing cloud cover conditions. The following example makes more tangible why these small-scale processes are so important: clouds reflect solar radiation, and hence have a cooling effect. But they also absorb heat radiation, and hence function as a greenhouse gas, having a warming effect. The question now is: to what extent do clouds (whether water or ice clouds)6 either increase or reduce warming through greenhouse gases? Different simulation models offer divergent answers to this question.
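The finite-difference idea described above, replacing differentials by differences between the mean values of grid boxes and stepping forward in time, can be illustrated with a toy one-dimensional diffusion problem. This is only a sketch: the grid size, time step, and diffusion coefficient are invented, and real atmospheric models solve a far richer system of equations.

```python
import numpy as np

# One-dimensional diffusion of an "anomaly" across a row of grid
# boxes: the second spatial derivative is replaced by finite
# differences between neighboring box means, and the solution is
# stepped forward in time with an explicit Euler step.
nx, dx, dt = 50, 1.0, 0.1
kappa = 1.0                      # diffusion coefficient (arbitrary units)
u = np.zeros(nx)
u[nx // 2] = 100.0               # an initial anomaly in a single grid box

for _ in range(200):
    # finite-difference approximation of d2u/dx2
    # (periodic boundary conditions via np.roll)
    d2u = (np.roll(u, -1) - 2.0 * u + np.roll(u, 1)) / dx**2
    u = u + dt * kappa * d2u     # explicit Euler time step

print(f"peak value after smoothing: {u.max():.2f}")
print(f"total amount (conserved by the scheme): {u.sum():.2f}")
```

The point-like anomaly spreads smoothly across the boxes while its total is conserved; the result is, as the text says, an approximation of the precise solution, with an error controlled by the grid spacing and time step.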
Since clouds play an important role in the climate system, efforts are being made to simulate extremely small-scale processes as well, all the way down to cloud droplets and ice crystals. But these more complex model clouds still bear no resemblance to the clouds we can perceive and observe. Juxtaposing a storm cloud as we perceive it with the processes calculated by the model, as well as with the results of these calculations, illustrates the problem [Fig. 1].
5 For example, contemporary models of the atmosphere have a horizontal resolution of 100–200 km.
6 Water clouds consist of water droplets, ice clouds of ice crystals.
Fig. 1: a) cumulus cloud, b) physical cloud processes, c) vertical profile state variables. [Diagram labels: vapor deposition, Bergeron process, primary ice, cold cloud / warm cloud, aggregation, riming, ice enhancement, melting, collision-coalescence, condensation, breakup, 0 °C, continuous collection, activation, evaporation, CCN in H2O(v), rainfall.]
In order to conceive of the complex system of a storm cloud, the illustrated model calculates approximate values: the flows of warmth and water vapor in updrafts and downdrafts, condensation and evaporation, the formation of droplets and ice crystals, melting and freezing, rain and snow. Merely in order to describe changes in cloud ice, up to 11 different processes are parametrized in a climate model realized at the Max Planck Institute for Meteorology in Hamburg. As the result of this parametrization, the diagrammatic abstraction shows the changes caused by cloud processes in the vertical distribution of the large-scale state variables pressure, temperature, and humidity: after the “storm cloud” has fulfilled its function, that is to say, once it has enabled us to calculate changes in radiant flux, large-scale state variables, and condensation, it vanishes in the space of the model, and does not survive to a later point in time. Because not only data but also observations enter into parametrization, the climate model combines theoretical considerations with empirical experience. In summary, we can say that simulated climate models are complex mathematical constructs that combine theoretical with empirical knowledge. At the same time, the computer code underlying these simulation models is subject to perpetual growth and continuous development. Gabriele Gramelsberger concludes: “The decomposition of facts into data and the generation of facts from data […] generate a layer of binary codes positioned between observer and observed environment, one that not only grows, but which is also stratified in increasingly multifarious feedback loops.”7 This computer code generates, by means of compilers,8 a machine-specific “assembler code,” i.e. a code that determines storage allocation and transmits arithmetic instructions to the processors. Since this allows complex climatic processes to be represented through code sequences based simply on the numbers 0 and 1, an extremely high degree of abstraction has been attained at this point.
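The idea of representing unresolved small-scale processes as functions of the resolved large-scale variables can be caricatured in a few lines. The humidity-based cloud-cover scheme below is only an illustrative stand-in: the threshold value and the square-root form are assumptions made for this sketch, not the actual scheme of the Hamburg model or any other.

```python
import math

def cloud_fraction(relative_humidity, rh_crit=0.8):
    """Diagnose the sub-grid cloud cover of a grid box from the
    resolved, grid-box-mean relative humidity alone. The threshold
    rh_crit and the functional form are illustrative assumptions."""
    if relative_humidity <= rh_crit:
        return 0.0
    if relative_humidity >= 1.0:
        return 1.0
    # cover grows from 0 toward 1 as the box mean approaches saturation
    return 1.0 - math.sqrt((1.0 - relative_humidity) / (1.0 - rh_crit))

for rh in (0.5, 0.85, 0.95, 1.0):
    print(f"mean relative humidity {rh:.2f} -> cloud cover {cloud_fraction(rh):.2f}")
```

The point of the sketch is structural: a process far smaller than any grid box (an individual cloud) enters the model only as a diagnostic function of the large-scale state, exactly the move the text calls parametrization.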
7 Gabriele Gramelsberger: “Schrift in Bewegung. Eine semiotische Analyse der digitalen Schrift,” published contribution to the workshop “Bild-Schrift-Zahl” at the Helmholtz Zentrum für Kulturtechnik, Humboldt University, Berlin, November 16–17, 2001.
8 A compiler is a software program that translates programming language into machine language.
From the Simple to the Complex Such computer-based simulations – which are produced, for example, for the reports of the Intergovernmental Panel on Climate Change (IPCC) – generate enormous quantities of data in magnitudes of gigabytes to terabytes.9 But how does knowledge about the climate system arise from these “bits and bytes”? It is possible (as in Douglas Adams’s celebrated novel The Hitchhiker’s Guide to the Galaxy,10 in which the supercomputer responds to the ultimate question about the meaning of life by supplying the number 42) to condense these quantities of data, as when calculating the global mean temperature for the year 2100, for example. But if we want to understand how the climate system really functions, then it becomes necessary to examine the respective state variables more closely in their spatio-temporal development. The preferred instrument, one that enables us to render these abstract quantities of data tangible, is computer visualization. Sensory perception now comes into its own once again, for visual representation of knowledge values, “through the organ of sight, can be formed in the mind and strongly imprinted all the more swiftly, gracefully, and almost effortlessly, as though at a glance, without the digression of words.”11 Visualizations and the results of simulation, however, do not supply images in the conventional sense, and their representations bear no resemblance to familiar images of nature. A cloud is not a cloud, but takes the form of a grid box, and hence resembles a cuboid. Images derived from simulations are iconically translated numerics, which are interpretable in the framework of scientific theories.12 They express only measurable state variables, and also offer the possibility of visually representing structural relationships. 
In conjunction with contemporary visualization software, numeric simulation makes it possible to render the intangible systems of equations and abstract manipulations of mathematical symbols sensorily perceptible, thereby making it possible to grasp state variables in their spatio-temporal development, and rendering interrelationships comprehensible.
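The condensation of such data masses into a single number, as with the global mean temperature mentioned above, can be sketched as follows. The grid and the temperature field are invented for illustration, and proper model output carries many more dimensions and variables; the one real subtlety shown is that grid boxes shrink toward the poles, so latitudes must be area-weighted.

```python
import numpy as np

# A made-up 5-degree latitude-longitude grid with an idealized
# temperature field: warm at the equator, cold at the poles (in °C).
lats = np.linspace(-87.5, 87.5, 36)
lons = np.linspace(0.0, 355.0, 72)
temp = np.broadcast_to(
    (30.0 * np.cos(np.radians(lats)) - 10.0)[:, None],
    (lats.size, lons.size),
)

# Grid-box area shrinks toward the poles, so each latitude row is
# weighted by the cosine of its latitude before averaging.
weights = np.cos(np.radians(lats))
global_mean = float((temp.mean(axis=1) * weights).sum() / weights.sum())

naive_mean = float(temp.mean())  # ignores box area; overweights the poles
print(f"{temp.size} values condensed to one number: {global_mean:.2f} °C")
print(f"unweighted mean (too cold): {naive_mean:.2f} °C")
```

Whole gigabyte archives reduce the same way, field by field and time step by time step; what vanishes in the condensation is precisely the spatio-temporal development that visualization is then needed to recover.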
9 To clarify: giga = 10^9, tera = 10^12; 1 byte = 8 bits; 1 bit has a value of either 0 or 1.
10 Douglas Adams: The Hitchhiker’s Guide to the Galaxy, United Kingdom 1979.
11 Bredekamp 2004, p. 165.
12 Gramelsberger 2001.
Understanding Systems through Numeric Simulation While measurements cannot render state variables graspable in their full spatio-temporal extension, computer-based simulations with numeric climate models can represent each state variable at any desired moment and at any given location in the model domain. In the form of cyberspace, the data realm can be navigated and experienced as a physically consistent virtual world. Numerical climate models are the virtual laboratories of the earth sciences. This fact is reflected in scientific terminology as well: we generally refer not to “simulations,” but to “experiments.” Accordingly, numeric simulations are also referred to as “in silico experiments.” We carry out experiments by altering the boundary conditions; we can, for example, elevate the atmospheric concentration of carbon dioxide, or vary insolation levels. Such experiments provide insight into the sensitivity of the climate system, indicating feedback loops and culminating in an understanding of the system, which cannot be attained via measurements alone. If we wish to understand in detail the underlying causes of climate changes, an improved understanding of the system is essential. If there existed, for example, only a pure greenhouse effect – in other words, if increasing quantities of greenhouse gases altered only the long-wave radiation balance – then it would be possible to precisely calculate resultant temperature changes. Should the amount of atmospheric carbon dioxide double, warming would then be somewhat more than 1 °C. Model assessments, however, show a warming effect of between 2 and 4.5 °C. This is because warming triggers a series of feedback loops in the climate system, which can either increase or reduce warming. A warmer atmosphere, for example, can absorb more water vapor; since water vapor is a greenhouse gas, this can heighten warming. If the soil warms, more organic matter will be decomposed by microscopic life forms.
This will release carbon dioxide, thereby heightening the greenhouse effect. On the other hand, carbon dioxide has a fertilizing effect on plant life. Increased growth leads to the increased absorption of carbon dioxide, and hence to reducing it in the atmosphere. In a warmer climate, snow and ice cover tends to recede on the land as well as on the oceans. Ice and snow reflect solar radiation very efficiently, so their recession accelerates warming at higher latitudes. The feedback loops are far more numerous than those mentioned above, and because all of these processes are linked to one another either directly or indirectly, the result is an extraordinarily complex system. The degree of complexity of this virtual world is so high that the entire range of possibilities offered by climate
modeling cannot remotely be exhausted or fully explored. This is all the more true since climate models are dynamic formations, which are subject to continuous development. Fiction and Reality When we move from the complex to the simple, and then back to the complex again, the question arises: What has actually been altered in the process? Do we encounter the same complex world that was our initial point of departure? As we have seen, computer models can contribute to knowledge only if they are capable of reproducing aspects of reality. Evaluations of computer models, then, have two aims: first, to identify these aspects, and second, to clarify the degree to which they adequately depict the behavior of the system in question, and their proximity to actually observed reality. In order to test the system behavior of climate models, we attempt to reconstruct the past (hindcast). Model calculations programmed with the changes in solar radiation caused by fluctuations in the parameters of the Earth’s orbit (Milankovitch cycles) accurately reflect the course of the Ice Ages, and are capable of reproducing the locations and periods of glaciation. While the virtual worlds created in the computer can potentially be investigated and understood in all their aspects and interconnections, this is not the case for real-world phenomena, as perceived via observation and measurement. Measurements of the states of the oceans or of the atmosphere can never be all-encompassing or continuous. This is why models are used in order to extrapolate various measurement data in time and space, thereby generating fields that can be compared to model results (data assimilation). Moreover, state-of-the-art methods measure electromagnetic waves (including remote sensing data taken from Earth or from satellites). In order to derive the desired state variables from them, models are required.
It is only by means of indirectly generated data that we have access to all-encompassing images, those generated by data models.
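Looking back at the feedback argument above, a direct warming of somewhat more than 1 °C for doubled carbon dioxide is amplified by feedbacks into the simulated range of 2 to 4.5 °C. That amplification can be reproduced with the textbook geometric-series gain formula; the no-feedback value of 1.2 °C and the feedback fractions below are illustrative assumptions, not results of any particular model.

```python
# Geometric-series view of climate feedbacks: a direct warming dT0
# triggers feedbacks that return a fraction f of any warming as
# additional warming, so the total is dT0 * (1 + f + f**2 + ...),
# which converges to dT0 / (1 - f) for f < 1.
dT0 = 1.2                        # assumed no-feedback warming for doubled CO2 (°C)

for f in (0.4, 0.6, 0.73):       # invented net feedback fractions
    total = dT0 / (1.0 - f)
    print(f"feedback fraction {f:.2f} -> equilibrium warming {total:.1f} °C")
```

Modest changes in the assumed net feedback fraction span the whole 2 to 4.5 °C range, which is one way to see why different simulation models, with different parametrized feedbacks, give divergent answers.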
13 Karl R. Popper, an Austrian philosopher of science, developed the idea that scientific theories are not verifiable, i.e. their truth value cannot be confirmed, since the theory cannot be tested under all of its conceivable aspects and possibilities. If, however, a single instance can be identified in which the theory is incorrect, then it has been refuted, that is to say, “falsified.” Cf. Karl Popper: Logik der Forschung, Vienna 1935, (edited by Herbert Keuth: Karl R. Popper, Gesammelte Werke, vol. 3: Logik der Forschung, Tübingen 2005).
In conclusion, we can say that meteorology approaches reality from two sides: by means of climate models on the one hand, and with data models on the other. Validation of climate models, or their falsification in the Popperian sense,13 in any event, is not possible. Nonetheless, it is possible to determine the degree to which models are useful for a specific purpose.14 The examination of the results of models is an important aspect of model development, and has meanwhile been largely standardized.15 Although the description of nature on the basis of models has little relation to sensory experience, numeric climate models construct aspects of reality, and can describe complex behavior resulting from a large number of linkages and feedback loops between the various components of the climate system. Despite their simplified descriptions of individual processes, these models are in a position to reconstruct the past behavior of the system and to forecast its future behavior.
14 See Gavin Schmidt’s contribution to the discussion at RealClimate: http://www.realclimate.org/ 15 For more information, see http://www-pcmdi.llnl.gov/projects/model_intercomparison.php. * Cordial thanks to Gabriele Gramelsberger at the Institute of Philosophy of the Free University at Berlin for corrections and suggestions.
Clemens Bellut “ACH, LUISE, LASS ... DAS IST EIN ZU WEITES FELD,” OR: THE GORDIAN KNOT OF COMPLEXITY “The topic is much too vast…” (“Das ist ein zu weites Feld...”). With understated irony, Theodor Fontane sets this formula, repeatedly and in varied form, in the mouth of Effi Briest’s father – specifically, each time Briest’s wife Luise tries to involve him in unfathomable questions about life.1 It is, in a sense, the sole exhaustive reply to the question of complexity. It anticipates the failure of reasoning in the face of complex phenomena. This self-protective resignation places a conclusive and impassable barrier in front of the hopeless prospects of reason: “the topic is much too vast.” It promises only exertion with absolutely no prospect of attaining clarity. Laments of this type concerning the “complexity of relationships” that transcend the horizon of the representable, the conceivable, or the masterable are among the tópoi of “Modernity”: there are laments about the complexity of modern society, of modern technology, of contemporary urbanism, of the complexity of the psychic and of the social, and needless to say, concerning the complexity of the modern world as a whole. To be sure, the “new unsurveyability”2 has been “new” for some time, having been manifest ever since “Modernity” appeared modern to itself – at the latest beginning in the 16th century. Like “progress,” “reflexivity,” and “critique,” the notion of complexity is a part of the constitutive core of consciousness that Modernity – in all of its brilliance and wretchedness – has of itself. The reverse of the coin showing the wretchedness of “unsurveyability” and impenetrability bears the countenance and proud brilliance of an advanced stage of development. For complexity is simultaneously a distinction of social, biological, and technical stages of development that has (in this oppositional value system) outgrown the primitive. 
Recently, this reality has had certain curious aftereffects: new management theories make overblown claims for the insight that certain processes can no longer be guided via linear and mechanical notions, and that they instead demand “highly developed” competencies in the mastering of complexity.3 And complex biological formations such as the retina of the eye have meanwhile been mobilized as evidence against presumptuous claims that argue, against the theory of evolution, for the necessity of some form of “intelligent design” as a precondition for the creation of the world. Prior to its serviceability for describing and categorizing classes of phenomena and their structures, complexity is a programmatic concept of Modernity, against which the latter measures the level of its state of consciousness. This consciousness finds itself capable – in opposition to the primitive, the simple, and the linear – of conceptualizing complexity. Belonging to the prehistory of the concept of complexity is the early Enlightenment “Querelle des Anciens et des Modernes,”4 a literary-theoretical debate occurring mainly in France and Germany in the 17th and 18th centuries, which disputed the relative priority of Classical and modern poetry. The main references were the simplicity, unity, and binding character of Classical poetry, as expressed in the Aristotelian doctrine of the three unities of drama: those of place, time, and action. Modern poetry was as remote as possible from this principle. The question was whether this departure was an error to be rectified in relation to Classical models, or an independent virtue. The celebrated correspondence between Goethe and Schiller fought this question through, and explicitly and paradigmatically resolved it in favor of the incommensurable and coequal status of each.5 A substantial portion of this correspondence accompanied Goethe while he composed his novel Wilhelm Meister, and Schiller as he wrote his play Wallenstein. Schiller reports how, starting from the dramatic center of his project, he found himself repeatedly obliged to create the indispensable preconditions for understanding by inserting elaborate introductory material.
1 Theodor Fontane: Effi Briest (1894f.), Romane und Erzählungen, vol. 7, Darmstadt 1970ff., p. 310.
2 Cf. Die neue Unübersichtlichkeit, the title of a book by Jürgen Habermas, Frankfurt am Main 1985.
As a consequence, the final result was (not unlike the later case of Richard Wagner’s “Ring des Nibelungen”) a complex structure, an intricately suggestive dramatic trilogy, for only in this way could the dramatic aspect of the drama have any impact at all. Goethe and Schiller comprehended this in terms of
3 A topic in relevant magazines in recent years: see the focus theme “Kampf gegen Komplexität,” Harvard Business Manager, December 2007; and the focus on “Komplexität” (“Mach’s dir nicht zu einfach”) in brand eins, 1 (2006).
4 Hans Robert Jauss: “Antiqui / moderni” (Querelle des Anciens et des Modernes), in: Joachim Ritter (ed.): Historisches Wörterbuch der Philosophie, vol. I, Darmstadt 1971, pp. 410–414.
5 Cf. the entirely analogous oppositional structure in Schiller’s essay “Über naive und sentimentalische Dichtung,” in: Werke in drei Bänden, vol. II, Munich 1966.
the entelechy of modern poetry, which, in contrast to the presumed mythical consciousness of the Greeks, was not in a position to invoke the communally shared contemporaneity of mythical narratives and figures. It must therefore first introduce and motivate the poetic material that is ultimately meant to be condensed, for instance, into a dramatic conflict. Reflexivity, openness, infinity: these concepts are the counterparts to immediacy, unity, finitude, and permanence, and they culminate in the idea of complexity. Something will appear complex, however, only against the backdrop of a certain set of preconceptions whose point of departure is found in entirely different, oppositional structures of thought: preconceptions that take for granted a structural logic that explains the world on the basis of linear and mechanical reasoning. This is displayed in an illuminating manner in the opposed idea of simplicity. According to Cartesian ideas, a simple entity is clear in and of itself, and is distinct from other entities: “clara et distincta.”6 Everything that remains nontransparent is regarded as being formed by associations of simple entities. The result would be at most complicated, but not necessarily complex. What appears complex is that which cannot be conceived of under these conditions, and in relation to which the concepts of the understanding find themselves inextricably entangled in contradictions. In German, a mnemonic can be fashioned from the literal meaning of the Latin word complicare, which means “to fold together (zusammenfalten)”: the complicated, then, can be rendered graspable via unfolding (Entfaltung), because it thereby becomes simple (einfach) in the sense of naïve or ingenuous (einfältig). But the Latin word complexus means “mutual embrace,” or, so to speak, the labyrinthine, the convoluted: here, simplicity already contains within itself the seed of all the complexity that comes to appearance through its own development.
In cases of doubt, then, the complicated can be profitably reduced and simplified – but the complex, in contrast, cannot be simplified with impunity. To say that something is complicated means that the finite number of its determinations cannot be grasped directly. To say that something is complex means, by contrast, that the number of its determinations is simply infinite.
6 Cf. “Meditation IV” in René Descartes: Meditationes de prima philosophia, Latin-German edition, ed. Lüder Gäbe, Hamburg 1992.
The classical instance of this phenomenon can be read from the constructions of traditional mechanistic physics: these constructions are simple by nature in that they abstract from and exclude everything that might run counter to clear, simple, and universally valid laws. Everything else, all the other factors that might be influential, are filtered out as negligible quantities, so to speak. The price paid for this is that the reliability of the laws it identifies is closely associated with this constraint: the mechanical laws of motion are analyzed under the conditions of the two-body model, and are true only as long as no additional bodies alter the prevailing conditions. Should this occur, complex patterns of movement will emerge which appear chaotic, and which can under no circumstances be conceptualized in terms of the juxtaposition of more than one simple two-body model. Simply put, the simple is not always simple.
Conversely, under contextual presuppositions, logical constructability appears exceptional. The gesture of abstraction – which distinguishes between negligible factors of influence and methodically isolated categories of investigation – aims for clarity and distinctness (“clara et distincta”), for the transparency of compound simple entities, and for comprehensibility, explainability, constructability. And not only vis-à-vis analytical objectives, but also concerning design intentions. In this context, the design practices of postwar Modernism provide us with a paradoxical example: positioning itself under the aegis of the programmatic formulae “less is 7 Cf. Tzvetan Todorov Life in Common: An Essay in General Anthropology, University of Nebraska Press 2001.
more” and “form follows function,” Modernism abstracts away from contextual relationships with an almost grandiose decisiveness – in this sense, remaining highly similar, and spiritually affined, to classical mechanical physics, both methodologically and in terms of gesture. In Paul Valéry’s “Eupalinos” dialogue,8 Socrates and Phaidros evoke a counter-figure. In their estimation, Eupalinos is an architect, a kind of embodiment of the Vitruvian ideal of the uomo universale. Eupalinos consummately unites the contextual knowledge required for the respective architectural undertaking; he struggles to “search [for] the form with love, striving to bring into existence an object that would delight the eye and communicate with the spirit, would be in harmony with reason and with the numerous usual contingencies.”9 In contrast, the architectural, design, and artistic avant-gardes of the 20th century enter the scene like Alexander the Great confronting the Gordian knot – which, indeed, is what made them avant-garde in the first place. With a heroic gesture, they slice through the complexity of an intricate contextual knot with the sword stroke of a categorical new start. In his inexhaustible essay “Über das Marionettentheater,”10 Heinrich von Kleist demonstrates how the concealed complexity of a jointed puppet is able, when skillfully controlled by strings, to display a consummate grace of movement, a grace that can by no means be reproduced via the guided movement of all of its individual limbs. The combined movements of the individual limbs culminate in a complicated clumsiness. The attempt to trace back a complex movement sequence to the combination of simple movements makes out of the complexly simple something that is instead infinitely complicated. The most ordinary instance of this is the good joke that condemns to failure and ridicule every attempt to explain its inner and irreducible complexity via rational explanation.
Something complex, then, can assume the appearance of simplicity, while the complicated always excludes everything simple from itself. And something simple can, to be sure, be made complicated, while the complex, on the other hand, cannot be generated through the association of simple elements. In contrast with the complex, the complicated is a comparably simple matter, and there is little point in making a fuss over it.
8 Paul Valéry: Eupalinos ou l’architecte, Paris 1923; English edition: “Eupalinos or the Architect,” in: Dialogues (The Collected Works of Paul Valéry), vol. 4, Princeton 1958.
9 Ibid.
10 Heinrich von Kleist: Sämtliche Werke und Briefe, vol. 2, ed. Helmut Sembdner, Munich 1987.
In many cases, a given issue is made to appear complex in the service of political propaganda, with the intention of either avoiding or disguising positions or decisions taken. Conversely, political populism propagates the reduction of complex matters to the simple and self-evident, distinctly delimiting a subject that is otherwise “too vast” and sparing listeners the exertion of weighing and confronting contradictions. The propagandists of complexity pursue the self-ideologization of Modernity, and the populists the enterprise of anti-Modernism. The fragility of this pride in Modernity – which likes to see itself reflected in the concept of complexity – is repeatedly exposed when confronted by reproaches of decadence, emotional remoteness, and feebleness. This is the case with the conflict between Islam and the West in its propagandistic manifestations, when the West responds to criticism not via self-assertion, but by giving expression to wounded feelings and self-doubt, while in practical terms, such responses are carried as far as self-inflicted damage to its own constitutional structures. In many respects, this proud consciousness of complexity behaves like the sublime in Kant’s Critique of Judgment11: that which exceeds all human dimensions, and in relation to which the latter appear insignificantly small, elevates the individual who is exposed to it to a consciousness of grandeur and infinity that goes beyond all concepts of the understanding. This is why a consciousness of complexity feels pride when the concepts of the understanding find themselves in irresolvable conflict, for it is capable of thinking complex orders that cannot be conceptualized as associations of simple elements. It is no coincidence that the concept of complexus is found in the ancient teaching of the temperaments and bodily humors (humores), as well as of alchemy: here is the concept of a whole whose structure cannot be conceived of as the external assembly of simple elements.
Complexity is not least the paradoxical striving to conceive of the incomprehensible. In order to avoid such efforts when faced with the Gordian knot of complexity, one resorts to the sword stroke of reduction and simplification – or one gives voice to the resigned lament: “Ah, Luise, leave it ... the topic is much too vast.” Complexity is an effort, in a manner of speaking, to render thought more consistent with reality, to overcome reduction and simplification through the concepts of the understanding. Yet it is only these concepts themselves and their reasoning that allow complexity to be mentioned at the moment of their collapse
11 Immanuel Kant: Kritik der Urteilskraft (Critique of Judgment), 1790.
and inescapable entanglement. During the 1970s, similar cases were often adorned with the attribute “dialectical.” In contrast to “monocausal” (as they were referred to at the time) and linear efforts at explanation, anyone capable of counting to three was fond of saying: “You have to consider the question dialectically.” Both attitudes miss the insight that Paul Valéry put into Socrates’ mouth in his Eupalinos dialogue: “One has the choice of being a man or a spirit. Man can only act because he is capable of not knowing, of satisfying himself with a portion of knowledge (therein resides his characteristic peculiarity), a knowledge which is, incidentally, greater than it should be!”12
12 Cf. Valéry, 1923, see note 8, p. 113.
SELECTED LITERATURE This selected bibliography lists the texts referred to by the authors of this volume, supplemented with additional texts of relevance. Entries are listed chronologically. Where feasible and useful, dates of first or original editions have been cited. — René Descartes: Meditationes de prima philosophia (1641); in English as Meditations on First Philosophy, trans. by John Cottingham, Cambridge 1986. — Friedrich Schiller: “Über naive und sentimentalische Dichtung,” in: Werke in drei Bänden, vol. II, Munich 1966. Available in English as: “On Naïve and Sentimental Poetry,” trans. William F. Wertz, Jr., http://www.schillerinstitute.org/transl/Schiller_essays/naive_sentimental-1.html — Christian von Ehrenfels: “Über Gestaltqualitäten,” in: Vierteljahrsschrift für wissenschaftliche Philosophie, 14, Leipzig 1890, pp. 249–292; English edition: “On Gestalt Qualities,” in Barry Smith (ed.), Foundations of Gestalt Theory, Munich and Vienna 1988, pp. 82–117. — Paul Valéry: Eupalinos ou l’architecte, Paris 1923; English edition: “Eupalinos or the Architect,” in Dialogues (The Collected Works of Paul Valéry vol. 4), Princeton 1958. — Karl R. Popper: Zur Methodenfrage der Denkpsychologie (unpublished dissertation), Vienna 1928. — Karl R. Popper: Logik der Forschung (1935), ed. by Herbert Keuth, Tübingen 2005 (11th ed.); English edition: The Logic of Scientific Discovery, London 1959. — Gaston Bachelard: La Formation de l’esprit scientifique. Contribution à une psychoanalyse de la connaissance objective, Paris 1938; English edition: The Formation of the Scientific Mind, Manchester 2006. — Martin Heidegger: “Die Zeit des Weltbildes” (1938), in: Holzwege, complete edition vol. 5, ed. by F.-W. von Herrmann, Frankfurt am Main 1977, pp. 87–94; English edition: “The Age of the World Picture,” in: The Question concerning Technology and Other Essays, trans. by William Lovitt, New York 1977.
— Karl R. Popper: “What is Dialectic?” in: Mind, vol. 49, 1940; reprinted in: Karl R. Popper: Conjectures and Refutations, London 1963, pp. 312–335. — Claude E. Shannon: “A Mathematical Theory of Communication,” in: Bell System Technical Journal, vol. 27, pp. 379–423 and pp. 623–656, July and October 1948. — Norbert Wiener: Cybernetics: or Control and Communication in the Animal and the Machine, Cambridge 1948. — György Kepes (ed.): The New Landscape in Art and Science, Chicago 1956. — Kevin Lynch: The Image of the City, Cambridge 1960. — Karl R. Popper: “Die Logik der Sozialwissenschaften,” in: Karl R. Popper: Auf der Suche nach einer besseren Welt. Vorträge und Aufsätze aus dreissig Jahren, 12th edition, Munich 2003, pp. 79–99 (first published in: Kölner Zeitschrift für Soziologie und Sozialpsychologie, vol. 14 no. 2, 1962). — Christopher Alexander: Notes on the Synthesis of Form, Cambridge 1964. — Yona Friedman: “Die 10 Prinzipien des Raumstadtbaus,” in: Ulrich Conrads (ed.): Programme und Manifeste zur Architektur des 20. Jahrhunderts, Bauwelt Fundamente vol. 1, Basel, Boston, Berlin 2001 (1st German edition 1964), p. 176. — Max Bense: “Projekte generativer Ästhetik,” in: edition rot, text 19, ed. by Max Bense and Elisabeth Walther, Stuttgart 1965. — Georg Nees: “Programme und Stochastische Grafik,” in: edition rot, text 19, ed. by Max Bense and Elisabeth Walther, Stuttgart 1965. — Eduard F. Sekler: “Structure, Construction and Tectonics,” in: Structure in Art and in Science, New York 1965, pp. 89–95.
— György Kepes (ed.): Structure in Art and in Science, New York 1965. — Robert Venturi: Complexity and Contradiction in Architecture, New York 1966. — John von Neumann: The Theory of Self-reproducing Automata, ed. by Arthur Burks, Urbana 1966. — Karl R. Popper: “Of Clouds and Clocks. An Approach to the Problem of Rationality and the Freedom of Man,” Arthur Holly Compton Memorial Lecture, Washington 1966. Published in: Karl R. Popper: Objective Knowledge. An Evolutionary Approach, Oxford 1972. — Robert Venturi, Denise Scott Brown and Steven Izenour: Learning from Las Vegas, New York 1972. — Rudolf Arnheim: The Dynamics of Architectural Form, Berkeley and Los Angeles 1977. — James J. Gibson: The Ecological Approach to Visual Perception, Boston 1979. — John Searle: “Minds, Brains and Programs,” in: Behavioral and Brain Sciences, 3, 1980, pp. 417–457. — Kevin Lynch: Good City Form, Cambridge 1981. — Christian Norberg-Schulz: Genius loci. Landschaft, Lebensraum, Baukunst, Stuttgart 1982. — Frei Otto: Natürliche Konstruktionen. Formen und Konstruktionen in Natur und Technik und Prozesse ihrer Entstehung, Stuttgart 1982. — Valentin Braitenberg: Vehicles. Experiments in Synthetic Psychology, Cambridge/Mass. 1984.
— Jürgen Habermas: Die neue Unübersichtlichkeit, Frankfurt am Main 1985; English edition: “The New Obscurity: The Crisis of the Welfare State and the Exhaustion of Utopian Energies,” trans. Phillip Jacobs, in: Philosophy & Social Criticism, 11, 1986, pp. 1–18. — Andrea Gleiniger: “Technologische Phantasien und urbanistische Utopien,” in: Vision der Moderne. Das Prinzip Konstruktion, ed. by Heinrich Klotz in collaboration with Volker Fischer, Andrea Gleiniger-Neumann, and Hans-Peter Schwarz, Munich 1986, pp. 56–65. — Craig W. Reynolds: “Flocks, Herds, and Schools: A Distributed Behavioral Model,” in: SIGGRAPH ’87: Proceedings of the 14th annual conference on computer graphics and interactive techniques, ed. by Maureen C. Stone (Association for Computing Machinery), New York 1987, pp. 25–34. — Heinrich von Kleist: Sämtliche Werke und Briefe, vol. 2, ed. by Helmut Sembdner, Munich 1987. — Lucien Kroll: An Architecture of Complexity, Cambridge 1987. — Klaus Mainzer: Symmetrien der Natur, Berlin, New York 1988; English edition: Symmetries of Nature. A Handbook for Philosophy of Nature and Science, Berlin, New York 1996. — Marcos Novak: “Computational Compositions,” in: ACADIA ’88, Workshop Proceedings, Ann Arbor/Mich. 1988, pp. 5–30. — Frei Otto: Gestaltwerdung. Zur Formentstehung in Natur, Technik und Baukunst, Cologne 1988. — Christopher Langton (ed.): Artificial Life (Proceedings of an Interdisciplinary Workshop on the Synthesis and Simulation of Living Systems, September 1987, Los Alamos, New Mexico), Cambridge 1989. — William Mitchell: Logic of Architecture, Cambridge 1990.
— Denise Scott Brown: Urban Concepts. Architectural Design Profile, 60, January–February, London 1990. — Peter Eisenman: “Visions Unfolding: Architecture in the Age of Electronic Media,” in: Domus no. 734 (January 1992), pp. 20–24. — George Hersey, and Richard Friedman: Possible Palladian Villas. Plus a Few Instructively Impossible Ones, Cambridge 1992. — John H. Holland: “Genetic Algorithms: Computer programs that ‘evolve’ in ways that resemble natural selection can solve complex problems even their creators do not fully understand,” in: Scientific American, July 1992, pp. 66–72. — Wolfgang Welsch: “Übergänge,” in: Selbstorganisation. Jahrbuch für Komplexität in den Natur-, Sozial- und Geisteswissenschaften, vol. 4, Berlin 1993, pp. 11–17. — Klaus Mainzer: Thinking in Complexity. The Computational Dynamics of Matter, Mind, and Mankind, Berlin, Heidelberg, New York 1994 (52007). — Joachim Krause: “Die Selbstorganisation von Formen. Joachim Krause im Gespräch mit Nikolaus Kuhnert, Angelika Schnell und Gunnar Tausch,” in: Die Architektur des Komplexen. Arch+ Zeitschrift für Architektur und Städtebau, 121, Stuttgart 1994, p. 25. — Mitchel Resnick: Turtles, Termites, and Traffic Jams, Cambridge/Mass. 1994. — Andrew Benjamin (ed.): “Complexity in Art/Architecture,” in: Journal of Philosophy and the Visual Arts, 1995. — John Frazer: An Evolutionary Architecture, London 1995.
— Robert Venturi: Iconography and Electronics upon a Generic Architecture. A View from the Drafting Room, Cambridge/Mass. 1996. — Tzvetan Todorov: Abenteuer des Zusammenlebens. Versuch einer allgemeinen Anthropologie, Frankfurt am Main 1996. — William J. Clancey: Situated Cognition. On Human Knowledge and Computer Representations, Cambridge 1997. — Charles Jencks: Architecture of the Jumping Universe: A Polemic: How Complexity Science is Changing Architecture and Culture, Chichester 1997. — Charles Jencks: “Nonlinear Architecture. New Science = New Architecture?” in: Architectural Design, 129, 1997. — Ulf Skirke: Technologie und Selbstorganisation: Zum Problem eines zukunftsfähigen Fortschrittsbegriffs, dissertation at the Department of Philosophy of Hamburg University, 1998. http://www.on-line.de/~u.skirke/tus_titel.html, accessed May 12, 2008. — Mark Wigley: Constant’s New Babylon. The Hyper-Architecture of Desire, Rotterdam 1998. — Eric Bonabeau, Marco Dorigo, and Guy Theraulaz: Swarm Intelligence: From Natural to Artificial Systems, Santa Fe Institute Studies in the Sciences of Complexity, Oxford 1999. — Greg Lynn: Animate Form, New York 1999. — Lily Kay: “Spaces of Specificity: The Discourse of Molecular Biology before the Age of Information,” in: Lily Kay: Who Wrote the Book of Life? A History of the Genetic Code, Stanford 2000, pp. 38–72.
— Beatriz Colomina: “Enclosed by Images: The Eameses’ Multimedia Architecture,” in: Grey Room 02, Grey Room Inc. and Massachusetts Institute of Technology, Cambridge/Mass. 2001, pp. 6–29. — Gabriele Gramelsberger: “Schrift in Bewegung. Eine semiotische Analyse der digitalen Schrift,” publication for the workshop “Bild-Schrift-Zahl” at the Helmholtz Zentrum für Kulturtechnik, Humboldt Universität Berlin, Nov 16–17, 2001. — Jesus Mosterin: “Kolmogorov Complexity,” in: Complexity and Emergence, ed. by Evandro Agazzi and Luisa Montecucco, New Jersey 2002, pp. 45–56. — Horst Bredekamp: Die Fenster der Monade, Berlin 2004. — Robert Venturi, Denise Scott Brown: Architecture as Signs and Systems for a Mannerist Time, Cambridge 2004. — William J. Mitchell: “Constructing Complexity,” in: Computer Aided Architectural Design Futures, ed. by B. Martens and A. Brown, Vienna 2005, pp. 41–50. — Bart Lootsma: “Koolhaas, Constant und die niederländische Kultur der 60er,” in: Disco 1, ed. by Arno Brandlhuber, a42.org / Akademie der Bildenden Künste, Nuremberg 2006. — Michael Batty: Cities and Complexity: Understanding Cities with Cellular Automata, Agent-Based Models, and Fractals, Cambridge 2007. — Klaus Mainzer: Der kreative Zufall. Wie das Neue in die Welt kommt, Munich 2007. — Nicoletta Setta: Chaos and Complexity in the Arts and Architecture, New York 2007.
— Andrea Gleiniger: “Of Mirrors, Clouds, and Platonic Caves: 20th-Century Spatial Concepts in Experimental Media,” in: Simulation: Presentation Technique and Cognitive Method, in the series Context Architecture, ed. by Andrea Gleiniger and Georg Vrachliotis, Basel, Boston, Berlin 2008, pp. 29–49. — Klaus Mainzer: Komplexität, Munich 2008. — Sandra Mitchell: Komplexitäten. Warum wir erst anfangen, die Welt zu verstehen, Frankfurt am Main 2008. — Georg Vrachliotis: “Flusser’s Leap: Simulation and Technical Thought in Architecture,” in: Simulation: Presentation Technique and Cognitive Method, in the series Context Architecture, ed. by Andrea Gleiniger and Georg Vrachliotis, Basel, Boston, Berlin 2008, pp. 63–80.
ILLUSTRATION CREDITS
Gleiniger
Fig. 1 Pat Kirkham: Charles and Ray Eames. Designers of the Twentieth Century, Cambridge/Mass. 1995, p. 322.
Fig. 2 Pat Kirkham: Charles and Ray Eames. Designers of the Twentieth Century, Cambridge/Mass. 1995, p. 327.
Fig. 4 Photo by Andrea Gleiniger.
Fig. 5 Photo by Andrea Gleiniger.
Fig. 6 Vision der Moderne. Das Prinzip Konstruktion, ed. by Heinrich Klotz in collaboration with Volker Fischer, Andrea Gleiniger and Hans-Peter Schwarz.
Fig. 7 Mark Wigley: Constant's New Babylon: The Hyper-architecture of Desire (exhib. cat. Rotterdam 1998/99), Rotterdam 1998, p. 120.
Fig. 8 Vision der Moderne. Das Prinzip Konstruktion, ed. by Heinrich Klotz in collaboration with Volker Fischer, Andrea Gleiniger and Hans-Peter Schwarz.

Vrachliotis
Fig. 1 Photo by Andrea Gleiniger.
Fig. 2 Screenshots, text excerpts from: Craig Reynolds: “Flocks, herds and schools: A distributed behavioral model,” in: SIGGRAPH ’87: Proceedings of the 14th annual conference on computer graphics and interactive techniques, ed. by Maureen C. Stone (Association for Computing Machinery), New York 1987, p. 25; website by Craig Reynolds: http://www.red3d.com/cwr/boids, 5 April 2008.
Fig. 3 The New Landscape in Art and Science, exhib. cat., ed. by György Kepes, Boston 1956, p. 276.
Fig. 4 Cybernetic Serendipity. The Computer and the Arts, exhib. cat., ed. by Jasia Reichardt, London 1968, p. 79.
BIOGRAPHIES — Clemens Bellut, a philosopher, has been Deputy Director since 2006 of Design2context (Institute for Design Research, Zurich University of the Arts). Originally from the rural area of the Lower Rhine, he studied Humanities at the Universities of Bonn and Tübingen, and has worked as an advisor to the Board of Directors of Flughafen Frankfurt Main AG; as a lecturer at academies and universities in Tübingen, Stuttgart, Ilmenau, and Berlin; and as a teacher of German as a foreign language in Frankfurt.
— Johann Feichter, who has a doctorate in meteorology, has conducted research at the Max Planck Institute for Chemistry in Mainz and at the Meteorological Institute of Hamburg University. Since 2000, he has been director of the research group “Aerosols, Clouds, and Climate” at the Max Planck Institute for Meteorology in Hamburg. He is a lead author of the IPCC Third Assessment Report, and a contributing author and reviewer of the IPCC Fourth Assessment.
— Andrea Gleiniger is a historian of art and architecture. Since 2007 she has been a lecturer at the Zurich University of the Arts, with a focus on the history and theory of space/scenography. She studied art history, comparative literature, and archaeology in Bonn and Marburg; in 1988, she took a doctorate in art history with a project on guiding ideas in large-scale postwar housing development. From 1983 to 1993, she was curator at the Deutsches Architekturmuseum Frankfurt/Main, and since 1983 she has held teaching positions and guest professorships at academies in Karlsruhe, Stuttgart, and Zurich. From 2002 to 2007, she was a research assistant at the ETH Zurich/Chair of CAAD. She is active as a writer, particularly on architecture, urban planning, art, and new media in the 20th century.
— Klaus Mainzer – Prof. Dr. Klaus Mainzer holds the Chair of Philosophy of Science at the Center for Mathematical Sciences and is director of the Carl von Linde Academy at the Munich University of Technology. He is a member of the European Academy of Sciences (Academia Europaea) in London and the author of several books (e.g. on complex systems) that have been translated into many languages. He has been a guest professor in Brazil, China, Japan, Russia, the USA, and several European countries.
— Kostas Terzidis – Associate Professor at the Harvard Graduate School of Design. He teaches courses in Kinetic Architecture, Algorithmic Architecture, Digital Media, Cinematic Architecture, and Design Research Methods. He holds a PhD in Architecture from the University of Michigan (1994), a Master of Architecture from Ohio State University (1989), and a Diploma of Engineering from the Aristotelion University in Greece (1986). He is a registered architect in Europe, where he has built several commercial and residential buildings. His most recent work is in the development of theories and techniques for algorithmic architecture. His book Expressive Form: A Conceptual Approach to Computational Design (London 2003) offers a unique perspective on the use of computation as it relates to aesthetics, specifically in architecture and design. His latest book, Algorithmic Architecture (Oxford 2006), provides an ontological investigation into the terms, concepts, and processes of algorithmic architecture and a theoretical framework for design implementations.
— Robert Venturi & Denise Scott Brown Robert Venturi and Denise Scott Brown have been significantly responsible for redirecting the mainstream of Modern architecture since the mid-1960s. By broadening architectural concerns to include pluralism, multiculturalism, and social responsibility; Pop Art, popular culture and the everyday landscape; symbolism, iconography and electronic communication; generic building and a reassessment of the doctrine of functionalism; uncomfortably direct design and the relevance of mannerism, they have called on architects to reconsider the practice of their profession today. The work of Venturi, Scott Brown and Associates ranges from the Vanna Venturi house of 1966 to civic buildings, university complexes, and campus and urban plans. Their designs have never been an interpretation of their theory – that would be too dry – but are developed from the specifics at hand. Most provide surprisingly direct solutions to architectural problems and, although they are carefully tied to context, almost all have some remarkable, even outrageous, characteristics. Scott Brown and Venturi have taught at Harvard, Yale, and the Universities of California and Pennsylvania, among others, and have lectured in the US, Europe, Africa, and the Far East. They work together on all aspects of their practice, and collaborate with the 30 members of their firm. Although both span the totality of their projects, Venturi is principal in charge of design and Scott Brown of urban and campus planning and design.
— Georg Vrachliotis has been a researcher and teaching assistant for architecture theory at the Chair of Computer Aided Architectural Design (CAAD) in the Department of Architecture at the ETH Zurich since 2004. He studied architecture, philosophy, and the history of science, and was a visiting researcher at the Universities of Bremen and Freiburg and at UC Berkeley. His primary interest is technical thinking in 20th-century architecture; his current research foci are architecture and cybernetics, and Fritz Haller’s philosophy of construction. In 2007, together with Oliver Schurer (Vienna University of Technology), he founded the research project “Theory of Technology in Architecture” (Techniktheorie) in Prof. Hovestadt’s group at the ETH. Since 2006, he has lectured in architectural theory at the Institute for Architectural Theory at Vienna University of Technology.