MAINFRAME EXPERIMENTALISM

MAINFRAME EXPERIMENTALISM
Early Computing and the Foundations of the Digital Arts

Edited by

Hannah B Higgins and Douglas Kahn

UNIVERSITY OF CALIFORNIA PRESS
Berkeley · Los Angeles · London

University of California Press, one of the most distinguished university presses in the United States, enriches lives around the world by advancing scholarship in the humanities, social sciences, and natural sciences. Its activities are supported by the UC Press Foundation and by philanthropic contributions from individuals and institutions. For more information, visit www.ucpress.edu.

University of California Press
Berkeley and Los Angeles, California

University of California Press, Ltd.
London, England

© 2012 by The Regents of the University of California

Library of Congress Cataloging-in-Publication Data

Mainframe experimentalism: early computing and the foundations of the digital arts / edited by Hannah B Higgins and Douglas Kahn.
p. cm.
Includes bibliographical references and index.
ISBN 978-0-520-26837-1 (cloth : alk. paper)
ISBN 978-0-520-26838-8 (pbk. : alk. paper)
1. Art and computers. 2. Digital art. 3. Arts, Modern—20th century. I. Higgins, Hannah, 1964- editor. II. Kahn, Douglas, 1951- editor.
NX180.C66M35 2012
776.09'046—dc23
2011049953

21  20  19  18  17  16  15  14  13  12    10  9  8  7  6  5  4  3  2  1

For James Tenney, mainframe experimenter, composer, inspirer, friend

CONTENTS

List of Illustrations
Acknowledgments

Introduction
Hannah B Higgins and Douglas Kahn

PART ONE. DISCOURSES

1. The Soulless Usurper: Reception and Criticism of Early Computer Art
   Grant Taylor
2. Georges Perec's Thinking Machines
   David Bellos
3. In Forming Software: Software, Structuralism, Dematerialization
   Edward A. Shanken

PART TWO. CENTERS

4. Information Aesthetics and the Stuttgart School
   Christoph Klütsch
5. "They Have All Dreamt of the Machines—and Now the Machines Have Arrived": New Tendencies—Computers and Visual Research, Zagreb, 1968-1969
   Margit Rosen
6. Minicomputer Experimentalism in the United Kingdom from the 1950s to 1980
   Charlie Gere

PART THREE. MUSIC

7. James Tenney at Bell Labs
   Douglas Kahn
8. HPSCHD—Ghost or Monster?
   Branden W. Joseph
9. The Alien Voice: Alvin Lucier's North American Time Capsule 1967
   Christoph Cox
10. An Introduction to North American Time Capsule 1967
    Robert A. Moog
11. North American Time Capsule 1967
    Alvin Lucier

PART FOUR. ART AND INTERMEDIA

12. An Introduction to Alison Knowles's The House of Dust
    Hannah B Higgins
13. The Book of the Future: Alison Knowles's The House of Dust
    Benjamin H. D. Buchloh
14. Three Early Texts by Gustav Metzger on Computer Art
    Compiled by Simon Ford
15. Computer Participator: Situating Nam June Paik's Work in Computing
    William Kaizen

PART FIVE. POETRY

16. First-Generation Poetry Generators: Establishing Foundations in Form
    Christopher Funkhouser
17. "Tape Mark I"
    Nanni Balestrini
18. Letter to Ann Noel
    Emmett Williams
19. The Computational Word Works of Eric Andersen and Dick Higgins
    Hannah B Higgins
20. Opus 1966
    Eric Andersen
21. "Computers for the Arts" (May 1968)
    Dick Higgins
22. The Role of the Machine in the Experiment of Egoless Poetry: Jackson Mac Low and the Programmable Film Reader
    Mordecai-Mark Mac Low

PART SIX. FILM AND ANIMATION

23. Stan VanDerBeek's Poemfields: The Interstice of Cinema and Computing
    Gloria Sutton
24. From the Gun Controller to the Mandala: The Cybernetic Cinema of John and James Whitney
    Zabet Patterson

Index

ILLUSTRATIONS

0.1. James Tenney and Lejaren Hiller, Electronic Music Studio, University of Illinois, ca. 1961
1.1. United States Army Ballistic Research Laboratories, Splatter Diagram, 1963
1.2. Manfred Mohr, P-159 rs, 1974
1.3. Sol LeWitt, Variations of Incomplete Open Cubes (Photographic Component), 1974
2.1. Georges Perec's French flowchart, from Georges Perec: A Life in Words
2.2. Perec's English flowchart
3.1. Hans Haacke, News, 1969
4.1. Georg Nees, 23-Ecken, 1964
4.2. Siegfried Maser, "Kybernetisches Modell ästhetischer Probleme," 1974
4.3. Frieder Nake, Refined Model for the Aesthetic Process, 1974
4.4. Nake, Flowchart of the Program Package COMPART ER 56, 1974
4.5. Manfred Mohr, P-050/R, "a formal language," 1970
5.1. New Tendencies 4 installation, 1969
6.1. John Lansdown with dancers, 1969
8.1. John Cage and Lejaren Hiller, detail, Program (KNOBS) for the Listener, Output Sheet No. 10929, 1967-69
12.1. Alison Knowles, Poem Drop, 1971
13.1. Knowles, The House of Dust, 1967/2007
15.1. Nam June Paik, The First "Snapshots" of Mars, 1966
16.1. Emmett Williams, "IBM," in A Valentine for Noel, 1973
16.2. Marc Adrian, illustration for "Computer Texts," 1968
16.3. Margaret Masterman and Robin McKinnon Wood, illustration for "Computerized Japanese Haiku," 1968
17.1. Nanni Balestrini, Tape Mark 1, 1961
19.1. Dick Higgins, Hank and Mary: A Choral for Dieter Rot: Computers for the Arts, 1968/1970
20.1. Eric Andersen, Opus 1966
20.2. Andersen, fragment of Opus 1966
23.1. Stan VanDerBeek, still from Poemfield No. 2, 1966
23.2. VanDerBeek, still from Poemfield No. 1, 1965
23.3. VanDerBeek, Poemfields poster, 1966
24.1. James Whitney, Lapis, 1966
24.2. John Whitney, Permutations, 1968

ACKNOWLEDGMENTS

This book is dedicated to James Tenney, whom we would like to thank posthumously for his role in a 2002 symposium we held at the University of California, Davis. The topic of the symposium was the computer programming workshop he had held in 1967 in the living room of Dick Higgins and Alison Knowles, publishers of Something Else Press in New York City and parents of Hannah Higgins, coeditor of this book. Knowles was the other main speaker at the symposium, for her The House of Dust was generated in the context of the workshop and with Tenney's ongoing assistance. Like several artists whose work with early computers is at the core of this book, we were the beneficiaries of Tenney's generosity and expertise. This book would be, literally, unimaginable without him. Several editors from the University of California Press attended our symposium. Among them were Stephanie Fay and Deborah Kirshman, who expressed an immediate interest in the project. At the time, we imagined a very modest publication, but we soon came across a wealth of unexplored topics, source materials, and a new generation of scholars. We sought out authors and artists internationally, suffered several setbacks, and pulled back when the project exploded in size. This critical process required the gift of patience from our writers and artists and the Press. By 2008, it looked as if we were ready to go, but the copyright clearances have been complex, involving estates and long-since defunct publications. We are especially grateful for the generous granting of rights by the artists, writers, and estates involved. Finally, the astute guidance of Mary Francis and Eric Schmidt in helping to complete the publication stage should not go unacknowledged. Thank you.


Over the ensuing decade, our assistants worked tirelessly to pull this anthology together from Davis, Sydney, Washington, DC, and Chicago. Special thanks go to them for managing a complex project with grace. Douglas Kahn's assistant, Nilendra Gurusinghe, at the University of California, helped with the early stages of the project; Nathan Thomas, Higgins's research assistant at the University of Illinois, Chicago, donned a detective hat and had a keen eye for detail in managing the text and rights issues and formatting as the book was prepared for publication; and Peter Blamey in Sydney saw the project through its final stages. Generous grants from the National Institute for Experimental Arts, College of Fine Arts, University of New South Wales, and the University Scholar program at the University of Illinois, Chicago, made the indexing and the completion of the book possible. Finally, we would like to thank our children and spouses for their continued patience, support, and daily inspiration.

INTRODUCTION
Hannah B Higgins and Douglas Kahn

What we need is a computer that . . . turns us . . . not "on" but into artists.
JOHN CAGE, 1966

Mainframe Experimentalism is a collection of essays and documents on the encounter with mainframe and minicomputers by artists, musicians, poets and writers, and filmmakers in and around the 1960s. The time frame for our book begins with the era of room-size mainframe computers in the late 1950s, extends through the 1960s' refrigerator-size, transistor-operated minicomputers and institutionally bound digital technologies, and ends in the 1970s with the transition to microcomputers, the desktop computers that paved the road for today's ubiquitous digital devices. This duration, the "long" 1960s, was a time when simple access to computers was determined by institutional rather than consumer logics. These institutions inhered to geopolitical, military, corporate, and scientific priorities that were not immediately or obviously amenable to the arts. For those artists lucky enough to find access to these computers, technical requirements mandated the expertise of engineers, so the process was always collaborative, yet rarely sustainable over any great length of time. Thus, while mainframes and minis grew at the core of major institutions, they played contingent and fleeting roles in artistic careers. It is a testament to all involved that so much was attempted and achieved within these constraints.

With Mainframe Experimentalism, we are attempting to bring a new focus to a range of these artistic activities. The vitality and achievement across the arts of the 1960s are highly prized by today's critics, curators, historians, and collectors; a quick perusal of garden-variety museums of contemporary art demonstrates as much, since the decade that brought us pop, conceptualism, Fluxus, Happenings, video art, and minimalism is seen as the foundation for the arts of the present. The same can be said of 1960s literature, music, and film.
In contrast, the mainframe- and mini-based ancestors of today's digital art are generally remembered with some embarrassment. Whether because of a lack of exposure to actual computers, the absence of a critical apparatus that understood what was at stake in computer-based artwork, or the overabundance of geeky stereotypes, public perception of early computing and the arts did not fare well, as is recounted in tragic detail in Grant Taylor's contribution to this volume. If anything, first-generation computer art was and has been synonymous with "bad art" or, more generously, an immature or technologically defined aspirant art. It was associated with engineers with artistic aspirations or artists with engineering aspirations, rather than being seen as a technology that had fused with artistic practice and achievement, such as the chemistry of oil paints or the mechanics of a piano. The artist Jim Pomeroy spoke for many when he lamented "the look of computer art . . . flashy geometric logos tunneling through twirling 'wire-frames,' graphic nudes, adolescent sci-fi fantasies, and endless variations on the Mona Lisa."1 When it was not demonstrating new graphical interface capabilities or plotting information, computer art often meant emulations of the art historical canon, especially modern genres that already trafficked in simplified graphics. There appeared to be some artistic rationale, especially in the 1960s, in the way the computer provided a means to distance the maker from artistic authorship and thus from the perceived excesses of expressionism, yet this distancing could also operate in too close a proximity to the technocratic drive and bureaucratic numbing of both capitalism and the Eastern Bloc.

Historians of fine art, media, and literature have historically avoided serious investigation of the digital arts of this period, and only a modicum of attention has been paid in musicology. The early digital arts seemed degraded as art; they seemed to be about workshopping technological possibilities or, in the case of musicology, seemed constrained to academic computer music. More recently, histories of digital arts have been channeled through prescient moments in the development and social uptake of digital technologies. Where once commentators followed the lines of Miró or Klee and found digital art wanting, later historians were drawn to the lines of Douglas Engelbart's mouse; where once canonical artists were housed in the domain of museums of modern art and the commodity culture of collectors, they were now housed in the computer architectures of John von Neumann and Silicon Valley design centers. The vanguard art world that New York had stolen from Paris during World War II had been digitally rerouted to Palo Alto, if we follow the migrations of the discourse.

With Mainframe Experimentalism, we attempt to reestablish in their own right the efforts, ideas, and achievements of artists, musicians and writers working during the digital period of the long 1960s, many of whom have been flanked by strictures of "computer art" on the one hand and diffusion into "new media"
and "digital culture" on the other. As experimenters well rehearsed in provisiona l l y , these artists were able to operate within inhospitable institutional and economic conditions, negotiate social networks and knowledge competencies, and meet the narrow constraints of computational tools with a sense of possibility. They did so as the exigencies of the military, bureaucratic, and corporate entities that controlled computer access ran up against the grassroots and collective concerns of the bohemian, countercultural, antistate, and antiwar constituencies of the 1960s. In addition, we wish to show that the radical and experimental aesthetics and political and cultural engagements of the period, across conventional disciplines and media, by an international array of individuals, groups, and institutions can stand as a historical allegory for the mobility among technological platforms, artistic forms, and social sites that has become commonplace today. And, in this way, what is detailed in these pages is the formation of the digital arts.

In 1966, the composer John Cage posed a rhetorical question from which the opening sentence of this introduction was taken:

"Are we an audience for computer art?" The answer's not No; it's Yes. What we need is a computer that isn't labor-saving but which increases the work for us to do, that puns (this is [Marshall] McLuhan's idea) as well as Joyce revealing bridges (this is [Norman O.] Brown's idea) where we thought there weren't any, turns us (my idea) not "on" but into artists.2

In place of the mass efficiency normally presumed to append to computers, here we see Cage bearing witness to the labor required of the artist as well as the engineer, which yields unexpected puns and bridges and, most important of all, creates the mainframe artist—the person creatively engaged with an emerging technology. Cage had an abiding interest in technologies to disclose aspects of sound and its potentials, whether it was his early use of radios and recorded sound or the specialized scientific space of the anechoic chamber. More than anyone else, Cage became associated with experimentalism, first in music and then across the ranks of the arts following his time at Black Mountain College in the early 1950s, his classes at the New School for Social Research in the late 1950s, and his prodigious national and international touring. Key to the term experimentalism, as understood by Cage, was the unpredictability of outcome, which could be based in new technology or virtually any other process that removed the author's choice from the composition process. He explained, "The word 'experimental' is apt, providing it is understood not as descriptive of an act to be later judged in terms of success and failure, but simply as of an act the outcome of which is unknown."3


Mainframe Experimentalism is the unexpected outcome of a collaborative investigation by the editors into a little-known computer programming workshop that the composer James Tenney held in New York City for his friends associated with experimental music and art scenes. From 1961 to 1964, Tenney had worked on computer music and psychoacoustics at Bell Labs as a resident artist of sorts, and then at the Polytechnic Institute of Brooklyn as a researcher. Using the example of a study group that John Cage had held on the writings of Buckminster Fuller during the summer of 1967, Tenney decided to demystify and share his programming skills with his friends. The workshop occurred sometime in the fall of 1967, and his friends happened to be a veritable who's-who of the experimental arts scene in New York: Phil Corner, Dick Higgins, Alison Knowles, Jackson Mac Low, Max Neuhaus, Nam June Paik, and Steve Reich. The workshop was held in the offices of Something Else Press in Chelsea, the home of Knowles and Higgins. As Reich himself would later write, "If you think this sounds like a strange context in which to study FORTRAN, you're right."4 There is only the most scattered and fleeting mention of this workshop in the historical record and, unfortunately, not much of a paper trail and few memories among the participants of what transpired. Even historians specializing in the experimental arts were unfamiliar with the workshop.

Tenney's FORTRAN workshop was, of course, not an isolated activity but part of a wider presence of experimental arts within early computing. New York was in communication with important centers in Stuttgart, Zagreb, London, and Los Angeles. Tenney and Knowles both would be included in Jasia Reichardt's famous Cybernetic Serendipity exhibition, which premiered at the Institute of Contemporary Arts in London in 1968. The show also included London-based Gustav Metzger, Californian John Whitney, German Frieder Nake, and many others. Mainframe Experimentalism developed through our investigations into these activities and our sense that, while much important work had been done, especially by the younger generation of historians, much was still neglected and a better overall picture should be developed.

Moving beyond the confines of New York required an expanded sense of the term experimental. E.H. Gombrich, in The Story of Art, characterizes the first half of the twentieth century by the term Experimental Art. In this broader sense, innovations in process and material involve an experimental attitude linking the futures of art to the past through a changing sense of art as linked to artists' ever-changing worlds. In words that echo Cage's admonition that the computer not merely entertain but turn "us . . . not 'on' but into artists," Gombrich's 1965 "Postscript" ascribes to artists and critics "a healthy belief in experiments and a less healthy faith in anything that looks abstruse."5 Despite his cautionary tone and a clear distaste for virtually everything from abstract art onward, Gombrich's use of the term experiments coincides precisely with the sensibility of the early
digital artists described in this book who "keep an open mind and give a chance to new methods which have been proposed."6 Referring to the fifteen-year interval since The Story of Art was first published, he describes major technological changes as affecting the artists' landscape: "There was no jet travel when this book came out, no transistor radios, no artificial satellites, and computers were scarcely on the drawing board."7

Another sense of experimental was derived from science and was more closely tied to a specifically technological elaboration of the arts. From 1955 to 1957, Lejaren Hiller, a former chemist, used a mainframe at the University of Illinois to collaborate with Leonard Isaacson on the composition Illiac Suite for String Quartet. The vacuum-tube computer, weighing five tons and with minuscule memory, was used to generate the notation rather than synthesize the sounds, and the overriding purpose of the project was to technologically regenerate existing musical aesthetics, as was described in Hiller and Isaacson's 1959 book, Experimental Music: Composition with an Electronic Computer.8 The Electronic Music Studio he established at the University of Illinois became the training ground for James Tenney as a graduate student (figure 0.1), and, as Hiller's own aesthetics became more radical, he would collaborate with John Cage on the composition and multimedia extravaganza HPSCHD (1969), the title itself in programming format. Ironically, through his work on HPSCHD, Cage would discover how much labor was involved in working with computers, and, as Branden W. Joseph points out in his essay in this volume, the piece would "stand as the apotheosis of Cage's long-running pursuit of the most advanced technologies or, at the very least, of his doing so on such an ambitious scale."

FIGURE 0.1. James Tenney (seated) and Lejaren Hiller at the Electronic Music Studio, University of Illinois, ca. 1961.

Mainframes ran on vacuum tubes and filled entire rooms. They were much too expensive for ownership by anyone except state and military agencies, large corporations, universities, and research centers. They came equipped with their own institutional logic and list of priorities on which the arts did not rank at all. Moreover, they were born from the military exigencies of the Manhattan Project, the IBM punch cards of Holocaust administration, and the ballistic tests and administrative ranks of the Cold War. The American writer Kurt Vonnegut wrote a satirical story in 1950 about the implausibility of human concerns amid such institutional and material culture. The story was centered on a mainframe designed for military purposes that found it had a talent for writing love poetry. The computer was EPICAC, a name punned from ENIAC, the first famous mainframe, and the emetic ipecac. The real ENIAC cost approximately $500,000 in the World War II dollars that paid $2 billion for the Manhattan Project, and covered less than 700 square feet. The fictional EPICAC cost $776,434,927.54 in 1950 dollars, covered over 40,000 square feet, and took up "about an acre on the fourth floor of the physics building at Wyandotte College," weighing in at "seven tons of electronic tubes, wires, and switches, housed in a bank of steel cabinets and plugged into a 110-volt A.C. line just like a toaster or a vacuum cleaner." Unlike a toaster, its main job was to "plot the course of a rocket from anywhere on earth to the second button from the bottom of Joe Stalin's overcoat, if necessary. Or, with his controls set right, he could figure out supply problems for an amphibious landing of a Marine division, right down to the last cigar and hand grenade."9 There was a problem for which it had no solution; although it could write reams of love poetry, it would never have the protoplasm required to actually love. It self-destructed trying to compute its way around that fate. By 1968, another famous mainframe with personality, the HAL 9000 in Arthur C. Clarke's and Stanley Kubrick's book and film, 2001: A Space Odyssey, had no such internal conflict.

The uses of real mainframes and minicomputers were dictated by cost-benefit factors set by military, intelligence, corporate, logistic, and administrative priorities. None of this was lost on the artist Gustav Metzger, who wrote in 1969:

The first large electronic computers, the ENIAC and EDVAC, were developed under the pressure of the second world war. "It is said that the complex calculations
needed to show whether a hydrogen bomb was possible took six months on an electronic computer. Without the computer the calculations might never have been made." (J.G. Crowther, "Discoveries and Inventions of the 20th Century," London, 1966, p. 68) Let us switch attention to an area that is of particular interest to this Symposium—computers and graphics. "The first display devices were special purpose types primarily provided for the military." (Computer News, St. Helier, [sic] Jersey. V. 12, No. 11, November, 1968, p. 3.) As you will know, the first prizewinners in the now annual computer art contest held by "Computers and Automation" were from a U.S. ballistic group. There is little doubt that in computer art, the true avant-garde is the military.10

The capability to which Vonnegut alluded, of pinpointing a specific button on Stalin's coat, had its precursor during World War II in Norbert Wiener's mathematical work on targeting mechanisms for antiaircraft guns, systems work that led to Wiener's development of cybernetics. The animator, inventor, and computer graphics pioneer John Whitney Sr. worked in a Cold War setting in the early 1950s on films on guided missile projects at Douglas Aircraft, and in the late 1950s he sourced parts from World War II antiaircraft directors in his analog computer for making abstract animations. The films produced on this device entranced thousands of habitués of the counterculture and became stock-in-trade in Gene Youngblood's notion of expanded cinema.11 It would also pave the way to move from analog to digital computing during Whitney's residency at IBM.

James Tenney mused that in a better world the arts, rather than military and intelligence projects, would be first priority for mainframe computers, and he held no illusions that classified projects elsewhere at Bell Labs were about to self-destruct in EPICAC fits of love. At the same time, Tenney also recognized that his boss at Bell Labs during the early 1960s, the telecommunications engineer John Pierce, misrepresented the true nature of Tenney's position to his own bosses in order to carve out an institutional space for Tenney to compose. Still, as the 1960s progressed, antiwar sentiments intensified and institutional clashes became more pronounced. In 1967, not long after the FORTRAN workshop, while addressing faculty members and representatives from the military at the Polytechnic Institute of Brooklyn, Tenney chose to play his Fabric for Ché, based on Che Guevara, who had been captured and killed in Bolivia the same year, as his way to say "something about my disgust about the war in Vietnam."12

As Edward Shanken describes in his essay in this volume, Jack Burnham curated the important exhibition Software at the Jewish Museum in New York in 1970, after having been in residence at the Center for Advanced Visual Studies at MIT during 1968-69, where he worked with a time-sharing computer. The artist Hans Haacke contributed a work that used a computer on loan to the museum to profile the visitors to the exhibition via such questions as "Should the use of marijuana be legalized, lightly or severely punished?" and "Assuming you were
Indochinese, would you sympathize with the present Saigon regime?" However, the computer failed to function. Indeed, the wider availability of minicomputers was concurrent with increasing criticism of corporate profiting in the Vietnam War, pervasive militarism and nuclear proliferation, and technocratic rationalism.

As an emergent technology, the mainframe computer was a different beast altogether from any technology that artists had worked with in the first half of the twentieth century. For example, the technology of cinema was expensive and access to it was difficult, but its challenges were very modest in comparison to those of mainframe computing; likewise, cinema offered an expansively mimetic palette, whereas computing could barely scratch out its results at the end of a laborious process. Scarce opportunities to work with computers seemed to be slanted in favor of the machines. According to a Time magazine review, despite an audience of forty thousand visitors, Cybernetic Serendipity was a show of "One Hand Clapping. Even at its best, the show proves not that computers can make art, but that humans are more essential than ever."13

Indeed, the visual arts and visual research in early computing constituted a particularly volatile mix of influences and conditions. The information aesthetics of Max Bense in Germany exerted a profound influence, as Christoph Klütsch discusses in his thorough essay in this volume on activities in Stuttgart, and they helped usher aesthetics onto a ground of mathematics and technics conducive to computation. The same was true with Abraham Moles and aesthetic perception. While lending concretely to developments in graphic design and visual display, some computer art encountered difficulty once it left research labs and began interacting with other arts amid the intensity of the cultural and political attitudes of the 1960s, even as it aligned itself with constructivist and abstract tendencies that may themselves have had their own culturally and politically formative moments. The value of early computer art, in other words, lay not in where it had come from or what it might become, but in the contestation of what it actually was at a specific historical moment. This contested territory is described clearly in Margit Rosen's essay in this volume on the New Tendencies group in Zagreb. The group, founded in 1961 to exhibit work that distanced the artist from the creative process through rational, technological, or procedural means, mounted a 1968-69 colloquium, exhibition, and symposium concerned with computers as tools in visual research through which, in Rosen's words, art "was intended to become radically intersubjective, communicable, comprehensible, and reproducible, like a scientific experiment."

Because of an affinity with the numbers and letters of programmed systems, mainframe music and poetry were more easily aligned with the progressive aesthetics of the time. Technological controls were conducive to the simplicity of code and latitude of text and musical sound, and, in fact, similar procedures had already been exercised in the histories of formal literature and music. At least since the time of Gottfried Wilhelm Leibniz and the musical dice of Mozart, combinatory and chance processes had provided unpredictability, variation, possibility, and boundless plenitude. In the post-World War II period, information theorists wielded knowing references to the writings of modernist authors such as James Joyce and Gertrude Stein to demonstrate in language the entropic edge of originality versus banality, information versus redundancy, and so forth. Indeed, the technique that Claude Shannon used to demonstrate his "Mathematical Theory of Communication" (1949) could have easily leapt off the pages of the experimental poetry of the 1950s and '60s: "One opens a book at random and selects a letter at random on the page. This letter is recorded. The book is then opened to another page and one reads until this letter is encountered. The succeeding letter is then recorded. Turning to another page this second letter is searched for and the succeeding letter recorded, etc."14

Many of the artists discussed in this book worked with methods and systems that generated unforeseeable results and distanced the subject from authorship. John Cage and Jackson Mac Low both used chance operations and other formal methods long before they worked with computer systems to expedite and manage larger projects, even if, as was the case with Mac Low, the source was the RAND Corporation computer-generated book A Million Random Digits with 100,000 Normal Deviates. Similarly, the artist and poet Emmett Williams developed "the rules of the game" in 1956 for his poem "IBM," completed a decade later. Christopher Funkhouser discusses Williams, among many others, in his valuable survey included in the present volume. One of the most distinguished poetical works from the period of mainframes, "Tape Mark I" (1961), by the Italian poet and novelist Nanni Balestrini, is included in this volume in a new translation with program notes by Staisey Divorski. Balestrini understands this work within a tradition starting in the modernist techniques of Mallarmé and Raymond Roussel, yet it is very much in its own time, interlacing phrases from Lao Tzu's Tao te ching, a pulp detective novel, and Michihiko Hachiya's Hiroshima Diary. Likewise, in 1959, Theo Lutz, a student of Max Bense in Stuttgart, culled passages from Franz Kafka's The Castle that put the machine processing of probabilistic techniques in parallel with the routing systems of social control described in the novel. Georges Perec's The Art of Asking Your Boss for a Raise follows the recursive maze in the frustrating corridors of modern bureaucracy that one must go through to get denied a raise in pay. Perec's "thinking machine" was built in close proximity to computers, but never in the same room; the layout was instead manifested in computer-friendly algorithmic narrative and flowchart. Perec was
a member of Oulipo, the small organization of writers and mathematicians born from the haunches of Alfred Jarry and the Collège de 'pataphysique. Oulipo's interest in computing and literature has appropriately received attention among recent writings on the digital arts; we are pleased to add Bellos's essay to this understanding (see chapter 2).15
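
Restated as a minimal present-day sketch, assuming any plain-text file (a hypothetical source.txt) in place of the book and with the function name purely illustrative, Shannon's sampling procedure quoted above amounts to a second-order letter approximation:

```python
import random

def shannon_digram_text(source: str, length: int = 200) -> str:
    """Imitate the book-sampling procedure Shannon describes: pick a letter
    at random, then repeatedly open the source at a random position, scan
    forward until the current letter recurs, and record the letter that
    follows it."""
    text = source.lower()
    current = random.choice(text)
    output = [current]
    while len(output) < length:
        start = random.randrange(len(text))    # "the book is then opened to another page"
        i = text.find(current, start)          # "one reads until this letter is encountered"
        if i == -1 or i + 1 >= len(text):      # not found after that point: search from the top
            i = text.find(current)
        if i == -1 or i + 1 >= len(text):      # the letter is never usefully followed; stop
            break
        current = text[i + 1]                  # "the succeeding letter is then recorded"
        output.append(current)
    return "".join(output)

# Example usage, with any plain-text file standing in for the book:
# print(shannon_digram_text(open("source.txt", encoding="utf-8").read(), 300))
```

Run against a sufficiently long source, the output approximates the digram statistics of the original without any author choosing the letters.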

In 1966, the physicist and computer display researcher Ken Knowlton worked with one of the quintessential '60s artists, Stan VanDerBeek, at Bell Labs. Along with those of A. Michael Noll, the trials of whom Grant Taylor has described (see chapter 1), Knowlton's early graphic works were synonymous with "computer art." The art historian Johanna Drucker is candid in her assessment: "The contents of works produced by Ken Knowlton at Bell Labs are shockingly dull images of gulls aloft or a telephone on a table or a female nude rendered in symbol set." Drucker qualifies her statement by reminding us, "As technological developments have stabilized, the experimental features of earlier works have become harder to appreciate for the contributions they made at the time."16 In her essay for this volume, Gloria Sutton describes in precise detail the factors feeding the "visual language" of Stan VanDerBeek's computer-animated Poemfields, "a new type of animated film that presented poetry in the interstice between cinema and computing," a project he worked on with Knowlton at Bell Labs beginning in 1966. VanDerBeek compared the labor and frustrations to "learning how to draw by pushing a pencil around with your nose."

The traffic between artists and the mainframes at Bell Labs began musically, because of the musical interests of the engineers John Pierce and Max Mathews, and the acoustical and psycho-acoustical needs of the telephone and telecommunications industry. The composer James Tenney, following Edgar Varèse's enthusiasm for the artistic possibilities of new technologies, sought out the Electronic Music Studio of Lejaren Hiller as a graduate student, and then from 1961 to 1964 was the first person to compose digitally synthesized sound into a sustained body of work. The length of his stay allowed him to develop programming skills and creativity that were otherwise reserved for engineers at the time. Through the 1960s, as engineers from Bell Labs became more active in the New York art world, especially through the organization Experiments in Art and Technology, Tenney connected his friends with Bell Labs, and with the potential of computing in general, through personal introductions and his FORTRAN workshop.

Among the artists in the FORTRAN workshop was Nam June Paik, one of the irrepressible spirits of experimentalism. Tenney remembered, "Nam June Paik used to call me his guru! He liked playing with hardware and here I was talking software."17 In the fall of 1967, Paik began working as a residential visitor at Bell Labs with the assistance of A. Michael Noll, and he began transforming earlier
ideas about the cybernetics of Norbert Wiener and the media theory of Marshall McLuhan into the difficult practicalities of computing. In fact, the most tangible evidence of his residencies was a modest computer text entitled "Confused Rain," in the spirit of Guillaume Apollinaire's calligramme Il Pleut.

Max Mathews of Bell Labs has quoted John Cage, who said that if you are surprised with the result, then the machine has composed the piece. If you are not surprised, then you have composed it. I found out, however, that no matter how genial a computer might be, "he" has no common sense. For example, instead of just saying "Walk," you have to break it down to logical steps, that is, give the weight to the left half of your body, give strength to the muscles below the knee, put the energy to the vector pointing to the sky, making 90 degrees to the earth, move the vector to 160 degrees to the earth, give the energy to the leg in the direction of the earth, using also universal gravitation, stop the movement as soon as the distance between your leg and the earth comes to zero, repeat the above process for your right leg, the right leg meaning your leg on the right side of your body, then repeat the entire process 100 times. I decided to title all my computer pieces in French, to protest the lack of common sense in the computer. Verlaine wrote: "It rains in my heart, as it rains in the city." I say: "It rains in my computer, as it rains in my heart"—"Il pluit dans mon computeur" will be my first piece. It is the mix of real rain and simulated rain in the computer. My second piece will be called "La Computeur sentimentale," and the third piece, "Aimez-vous FORTRAN-programming!" The more it deals with the character of randomness and repetition, the more efficient is the computer. These are the two poles of human artistic materials. Total repetition means total determinism. Total randomness means total indeterminism. Both are mathematically simply explicable. The problem is how to use these two characters effectively. Therein lies the secret for the successful usage of the computer in the creative arts.18

Paik later would refocus his technological efforts on the more immediate results obtained by video. In another institutional setting, the computer and engineers at Regnecentralen in Denmark allowed plenty of bit-room for the deeply serious and comedic artist Eric Andersen, also a Fluxus participant (see chapter 19). In his work Opus 1966, the poetical tasks of daily life began to be addressed to a computational frame: "Life itself is often defined not only as a phenomenon that reproduces itself but more importantly as a process that allows random mutations to take place." The actual machine behind this sentiment differed from the labor-saving device that many imagined the computer would become. Any such domestic appliance could exist only at multiple levels above the architects of computer programming languages, who argued over punctuation and other functional issues of symbolism; even the poetic constraint in the language of Opus 1966 would have been Olympian among the engineers at Andersen's side. The interaction of
actual engineers with artists is often left unacknowledged or represented romantically and, to counter this, Andersen cofounded the Union to Protect Data Processing Systems and Machines from Tedious Work. In the experimental aesthetic, tedium and boredom need not be avoided. Fluxus and intermedia artist and theorist Dick Higgins described the fruits of boredom in his classic essay "Boredom and Danger," and Claes Oldenburg really wanted to agree: "Boredom is beautiful but it is hard to keep awake."19 The work of material and social interactions does not have to be yoked to an image of mechanically induced labor, and when Cage hoped for a computer that "increases the work for us to do," he did not necessarily mean work at the computer. Experimentalism focused its aesthetics and poetics on the everyday, on the seemingly banal dismissed in information aesthetics as redundancy and equilibrium, by creating the conditions for "an act, the outcome of which is unknown . . . anyone has the opportunity of having his habits blown away like dust."20

Alison Knowles's The House of Dust began with what she acknowledged as the generosity of James Tenney's presentation of the arcane and alien workings of FORTRAN to his friends. It seemed to fulfill Cage's criterion of "revealing bridges . . . where we thought there weren't any" by making the transit and translation from one social site to another, from engineering to the arts, from one language to another. In their respective contributions to Mainframe Experimentalism, Benjamin Buchloh and Hannah Higgins follow the probable and impossible houses and inhabitants of The House of Dust from their initial linguistic incarnation in computer printouts, the pages of Cybernetic Serendipity and Fantastic Architecture, to materialization in actual structures, performances, and social occasions that reiterate Tenney's initial act of generosity and interpersonal interaction.21 The House of Dust would surely be one of the masterworks of the arts and computation, of experimental arts, and, indeed, of the art of the period if not for the fact that it intrinsically resists the very notion of mastery. The key to Alison Knowles's going beyond the technological limits of digital computing was her placing of the dedicated output, i.e., the printout of text, amid the ever-changing contingencies of social and poetical practice. This was neither an instrumentalist use of the device nor a formal introduction of chance into otherwise normal operations.

Alvin Lucier, in his composition North American Time Capsule 1967, went beyond technological constraint by cracking open the device, in this case the circuits of the Vocoder at Sylvania Applied Research Laboratories, to intercede between normal input and output. The Vocoder was a digital device designed originally at Bell Labs to carry greater amounts of vocal information through the bandwidths of telecommunications infrastructures. As Christoph Cox writes in his essay (chapter 9), Lucier concentrated on the intermediary ability of the device for vocal analysis rather than the input-output repetition of
synthesis, processed in the electronic dynamics somewhere between digital encoding and decoding, such that the "vocoder becomes a machine with which to liquidate speech and to abolish the identity of the speaking subject, shattering all syntax and pulverizing every semanteme, morpheme, and phoneme into fluid sonic matter." This was characteristic of the experimental approach within electronic music that discovered generative moments and music where others would find bugs and noise, a welcomed embrace of what the composer Pauline Oliveros calls "the negative operant phenomena of systems."22

As minicomputers became more common in universities and technical colleges during the 1960s, these institutions increasingly became arts patrons. This is described very well in Charlie Gere's essay in this volume on minicomputer experimentalism in the United Kingdom. Gere coedited what should be considered a companion volume to the present effort: White Heat Cold Logic: British Computer Art 1960-1980.23 Artists sought out pieces of the infrastructure dedicated to their mission, even if that meant time-sharing in the early morning hours and on weekends. Increasingly, in the latter half of the 1960s, computer corporations and corporations housing computers lent machine time for exhibitions and intermittent artistic research. But not until the mid-1970s, when microprocessors were fitted into preassembled, consumer-level computers such as the Altair 8800 and the KIM-1, did computation really begin to make its big leap from restricted or stigmatized institutional access to the kitchen table and consumer culture.

Conditions have indeed changed. The composer John Chowning has calculated the relative cost of computing, based upon the cost of memory, between an IBM 7090 in 1967, a year of much of the activity chronicled in this book, and his laptop in 2007 (with its 3GB memory and 100GB disk storage) and found that the IBM would in 2007 be worth twelve cents, whereas his laptop would have in 1967 been worth $58,959,960,937.50, far exceeding the outrageous expense of the lovelorn EPICAC.24 The breadth of artistic and cultural practice has likewise increased exponentially and continues to evolve at a rapid pace on an increasing number of digital devices. No matter how accelerated the growth, many of the aesthetic, poetic, social, and political seeds of the digital arts are to be found within Mainframe Experimentalism.

NOTES

John Cage, "Diary: Audience 1966," in A Year from Monday (Middletown, CT: Wesleyan University Press, 1967), 50. In 1966, Cage was involved in the intensive art and technology of 9 Evenings: Theatre and Engineering and in the DIY electronics of David Tudor and Gordon Mumma, but had yet to use a computer. Timothy Leary had just used a technological metaphor in his LSD invitation to "turn on, tune in, drop out" when Cage reengineered the metaphor to express how the computer turns people into artists.


1. Jim Pomeroy, "Capture Images/Volatile Memory/New Montage," in For a Burning World Is Come to Dance Inane: Essays by and about Jim Pomeroy, ed. Timothy Druckrey and Nadine Lemmon (Brooklyn: Critical Press, 1993), 61.
2. Cage, A Year from Monday, 50.
3. John Cage, "Experimental Music: Doctrine," in Silence: Lectures and Writings (Middletown, CT: Wesleyan University Press, 1961/1973), 13.
4. Steve Reich, "Tenney," Perspectives of New Music 25, nos. 1-2 (Winter/Summer 1987): 547-48. As a matter of disclosure, one of us was writing on Tenney and was his former student at CalArts, and the other is a scholar of Fluxus and, as the daughter of Knowles and Higgins, a Fluxkid herself.
5. Ernst Gombrich, The Story of Art (Greenwich, CT: Phaidon Press, 1950/1966), 465.
6. Ibid.
7. Ibid.
8. Lejaren A. Hiller Jr. and Leonard M. Isaacson, Experimental Music: Composition with an Electronic Computer (New York: McGraw-Hill, 1959).
9. Kurt Vonnegut, "EPICAC" (1950), in Welcome to the Monkey House (New York: Dell Publishing, 1998), 297-98. Originally published in Collier's Weekly, November 25, 1950.
10. See Gustav Metzger, untitled paper on theme number three for "Computers and Visual Research" symposium, Zagreb, 1969, in the present volume.
11. Gene Youngblood, Expanded Cinema (New York: E. P. Dutton, 1970).
12. Douglas Kahn, "James Tenney: An Interview," Leonardo Electronic Almanac 8, no. 11 (November 2000), n.p.
13. "Exhibitions: Cybernetic Serendipity," Time magazine 92, no. 14 (October 4, 1968), www.time.com/time/magazine/article/0,9171,838821,00.html (accessed August 20, 2008).
14. Claude E. Shannon, "The Mathematical Theory of Communication," in The Mathematical Theory of Communication, eds. Claude E. Shannon and Warren Weaver (Urbana: University of Illinois Press, 1949), 14-15.
15. David Bellos, Georges Perec: A Life in Words (Boston: David R. Godine, 1993); Noah Wardrip-Fruin and Nick Montfort, eds., The New Media Reader (Cambridge, MA: MIT Press, 2003), 147-92.
16. Johanna Drucker, "Interactive, Algorithmic, Networked: Aesthetics of New Media Art," in At a Distance: Precursors to Art and Activism on the Internet, eds. Annmarie Chandler and Norie Neumark (Cambridge, MA: MIT Press, 2005), 55, 36.
17. Kahn, "James Tenney: An Interview."
18. Nam June Paik quoted in Jud Yalkut, "Art and Technology of Nam June Paik," Arts Magazine (April 1968): 50-51.
19. Dick Higgins, "Boredom and Danger," Source: Music of the Avant Garde 5, vol. 3, no. 1 (January 1969): 14-17.
20. Cage, "Experimental Music: Doctrine," Silence, 13-16.
21. Dick Higgins and Wolf Vostell, eds., Fantastic Architecture (New York: Something Else Press, 1969).
22. Pauline Oliveros, "Valentine," appended to Elliott Schwartz, Electronic Music: A Listener's Guide (New York: Praeger, 1973), 246.
23. Paul Brown et al., eds., White Heat Cold Logic: British Computer Art 1960-1980 (Cambridge, MA: MIT Press, 2009). Another recent book of note is Stephen Jones, Synthetics: Aspects of Art and Technology in Australia, 1956-1975 (Cambridge, MA: MIT Press, 2011).
24. John Chowning, "Fifty Years of Computer Music: Ideas of the Past Speak to a Future—Immersed in Rich Detail," keynote address at the International Computer Music Conference, Copenhagen, Denmark, August 27-31, 2007. Presentation sent in personal correspondence to Douglas Kahn, September 9, 2008.

1

THE SOULLESS USURPER
Reception and Criticism of Early Computer Art
Grant Taylor

I sincerely hope that machines will never replace the creative artist, but in good conscience, I cannot say that they never could.
DENNIS GABOR, SCIENTIST AND NOBEL PRIZE WINNER IN PHYSICS, 1958

Mechanistic muses are expanding their domain to encompass every facet of creative activity.
J. R. PIERCE, "PORTRAIT OF THE MACHINE AS A YOUNG ARTIST," PLAYBOY, 1965

To my good friends the Burroughs B5500 and the CalComp 565.
ARTIST LLOYD SUMNER'S BOOK DEDICATION, COMPUTER ART AND HUMAN RESPONSE, 1968

In 1958, as Dennis Gabor fretted over the future role of artists in the new digital age, a computer was generating the very first images identified as "computer art."1 As if to confirm Gabor's fear, the first article published on the subject of computer art was provocatively entitled "The Electronic Computer as an Artist."2 This landmark 1964 article included the very first example of award-winning computer-generated art, but did not identify any of the artists responsible for its production. In fact, those responsible for this new visual art form were not artists at all, but technologists—and, even more discordantly, these creators were employed by the United States military. This was not exactly an ideal birthplace for the newest of visual art media.

Only a year later, in 1965, the engineer and author John R. Pierce—using the well-known male magazine Playboy as his vehicle—nudged computerized art into the realm of popular culture. By alluding to James Joyce's novel in his reworked title, "Portrait of the Machine as a Young Artist," Pierce portrayed the
computer as an artist who was growing toward a possible artistic maturity. After outlining the significant achievements already made by science and technology in the areas of computerized music, literature, film, and visual art, Pierce issued an open invitation to artists to explore and then develop the computer and its vast "potentialities."3 A number of years passed before artists were sufficiently drawn to this mysterious new machine to brave the complex world of mainframe computers and its various peripheries. One such was the American artist Lloyd Sumner, who published a sentimental account of his growing connection to his novel "machines." However, even as he dedicated his landmark book to those computers with which he worked tirelessly, Sumner was acutely aware of the suspicion with which the orthodox art world regarded what was at the time a foreign and extraneous device. Sumner opened his book with a plea to the reader not to prejudge his art because of its mode of construction.4 His appeal would go unheard for over a decade.

In the late 1960s and early 1970s, computer art aroused the kind of extreme resentment that has characterized many iconoclastic controversies in the history of art. The history of computer art is marked by a variety of aggressive behaviors that include the sabotaging of computers and physical attacks on artists.5 While these extreme reactions have been few, there is nevertheless a litany of lesser responses that range from casual critical dismissal to censorious negation. Considering the diversity and relentlessness of its negative critical reception, computer art was possibly the most maligned art form of the twentieth century.

The hostility to computer-generated visual art is surprising when one considers that the computer was emerging as the most compelling emblem of its time and perhaps as humanity's greatest technological achievement. What was it about early computer art that elicited such a hostile response? By examining the origins of the first award-winning computer art and the critical response to the inaugural computer art exhibition in New York City, we are able to locate the complex interplay of ideological and discursive forces surrounding the computer's difficult entrance into the creative field. While one of the key reasons for computer art's marginality was its emergence in a period when the perceived division between the humanistic and scientific cultures was reaching its apogee, it was the uneasy reaction to the computer as a new but yet undefined machine that deeply affected the criticism of its art in the United States and abroad. Although computer art shared significant aesthetic and theoretical similarities to critically celebrated art movements of the day, art critics and artists continually reproached computerized art for its mechanical sterility. As the idea of the computer transformed within the popular imagination, criticism was inspired by the romantic or humanist fear that a computerized surrogate could replace the artist. Such usurpation was seen to undermine some keystones of Western art, which include the revered and inimitable role of the creative artist.


Almost any artistic endeavor associated with early computing elicited a negative, fearful, or indifferent response. As early as 1956, musicians and poets exploring the vistas of a new technology were ambivalent: thrilled at forging new artistic paths and yet subdued by an undercurrent of misgiving from their cultural peers. As computing impacted all the arts in the 1960s, such responses would intensify. While computer music was often greeted with interest as the latest "novelty," 6 the early computer experimentalist Lejaren Hiller felt emerging from "many quarters" a deepening "incredulity and indignation." 7 Joel Chadabe, another pioneering computer composer, felt that critics and traditional musicians "feared" the machine and its potentially harmful influence on the entire field.8 In the early 1970s, Elliot Schwartz, in his listening guide to electronic music, best summed up the reaction: "The notion of music 'created' by a computer always seems to arouse a surprising degree of hostility, usually on the part of people who find twentieth-century art increasingly 'dehumanized' and 'mechanical.'" 9 Computer poetry fared little better. As Christopher Funkhouser has written, the literary world was underwhelmed by computer poetry. 10 Mirroring the critical responses of mainstream music, literary critics focused on the dehumanizing tendencies of the computer and the perceived ontological break between author and reader. 11 John Morris, writing in the Michigan Quarterly Review in 1967, praised the importance of the written poem as an essential "communication from a particular human being," and noted that if the difficulty of working with the computer discouraged those currently interested, then poems would happily remain "one last refuge for human beings." 12 In the world of dance, also, the computer received what Jeanne Beaman described as a "curious but cool response." 13 Beaman, who in the early 1960s pioneered computer dance and choreography, explained in her introductory presentation to computer dance: "Most of us do not even want a machine of any kind to succeed in conceiving any art form at all. The arts are usually presented as our last refuge from the onslaughts of our whole machine civilization with its attendant pressures towards squeezing us into the straitjacket of the organized man." 14 While the literary and music worlds were cool in their response to the computer, it was the world of fine art that generated the most severe and sustained attack on the emerging medium. Negative criticism permeated the discourse from computer art's beginnings in the 1960s until the early 1990s. 15 The most constant criticism emerged in the 1960s, when mainstream art critics judged computer art to be tediously repetitious and were quick to make clear their belief that it had no claims to the status of art, especially on aesthetic grounds. In 1972, critic Robert E. Mueller wrote in Art in America that visual results from computers had been "exceedingly poor and uninspiring." 16 According to Mueller, technologists lacked the necessary knowledge of art and its history, and their visual creations, which were mathematically inspired, bored the "sophisticated artistic

While the literary and music worlds were cool in their response to the computer, it was the world of fine art that generated the most severe and sustained attack on the emerging medium. Negative criticism permeated the discourse from computer art's beginnings in the 1960s until the early 1990s. 15 The criticism was most constant in the 1960s, when mainstream art critics judged computer art to be tediously repetitious and were quick to make clear their belief that it had no claims to the status of art, especially on aesthetic grounds. In 1972, the critic Robert E. Mueller wrote in Art in America that visual results from computers had been "exceedingly poor and uninspiring." 16 According to Mueller, technologists lacked the necessary knowledge of art and its history, and their visual creations, which were mathematically inspired, bored the "sophisticated artistic mind to death." 17 Even when computer-generated art gained fashionable notoriety, critics spurned it as a "popular sideshow." 18 Some saw it as just another example of the vulgarization of science, in which besotted artists, flirting with the latest scientific and technological media, produced what was tantamount to scientific kitsch. 19 While many galleries showed computer art, these exhibitions were often "condescendingly reviewed," as though the medium were "without serious intent or noble aspiration." 20

THE FUTURE CRASHES: AN UNORTHODOX BEGINNING

From its very beginnings, computer-generated visual art suffered from a questionable provenance. 21 As awkward as it appeared to those first interested in computers and art, a military laboratory, as has been mentioned, produced the first recognized and indeed award-winning piece of computer art in the United States. The trade journal Computers and Automation (later to become Computers and People) 22 facilitated the birth of computer art through its "Computer Art Contest" in 1963. The journal invited submissions of "any interesting and artistic drawing, design or sketch made by a computer." The first and second prizes went to the United States Army Ballistic Research Laboratories (BRL) in Aberdeen, Maryland, the same laboratory that had started the computer industry in the United States during World War II. 23 The army-sponsored revolution in computing at BRL had produced the famous ENIAC, which was followed by the ORDVAC, EDVAC, and the BRLESC 1, which in 1962-63 had a role in producing the first examples of computer art. The prize-winning piece, Splatter Diagram, which was printed on an early printer called a dataplotter, was a design analogue of the radial and tangential distortions of a camera lens (figure 1.1). 24 In 1964, the same laboratory won first prize for an image produced from the plotted trajectories of a ricocheting projectile.

The uncomfortable knowledge of computer art's military origins has prompted many commentators and proponents to situate the emergence of computer art a number of years after 1963, effectively bypassing its military birthplace. 25 As mathematical visualizations of natural phenomena, these authorless images were not produced for aesthetic reasons. As the original captions accompanying the artwork communicate, the images were "merely an aesthetic by-product" of utilitarian pursuits. 26 Nevertheless, the images—deemed "beautiful" by Computers and Automation's editor Edmund C. Berkeley—were published as "computer art." 27 Three years before, in 1960, William Fetter, a Boeing employee, had coined the term computer graphics to describe computer-generated imagery. Berkeley, through the journal Computers and Automation, contributed to the general currency of the term computer art and, in consequence, propelled these new creations toward the discourse of art.

FIGURE 1.1. United States Army Ballistic Research Laboratories, Splatter Diagram (often titled Splatter Pattern), 1963. Computer-generated. From Computers and Automation (1963). Courtesy of the Ballistic Research Laboratories, Aberdeen, Maryland.

To those who first traced the origins of computer art, its emergence out of technologies of fire control at a military ballistics laboratory would have been incongruent with traditional art historiography. Art historians commonly analyze artistic lineage, stylistic change, and a variety of economic and social conditions that inform and impact various creative groups and individuals. Computer art's pedigree had no recourse to normative art histories or modes of development. Rather, computer art emerged from various cultures of engineering that, even prior to the invention of the computer, had explored feedback mechanisms, control systems, and communication theory. 28 While remote from the world of art, these various interdisciplinary enterprises generated a considerable amount of innovation and creative energy among a generation of scientists.

The spirit of research and innovation was embodied in the father of information theory, Claude Shannon, who would influence John R. Pierce, a key figure in promoting computer art. Under the leadership of Pierce, director of the research division at Bell Laboratories, a raft of pioneering experimental computer research was undertaken in the areas of graphics, music, choreography, and animation. As A. Michael Noll says, the energetic environment promoted by Pierce encouraged researchers to "take chances and explore uncharted avenues of discovery." 29

It was Noll and a fellow researcher, Bela Julesz, who produced the first collection of computer art to be exhibited in the United States. In 1964, Howard Wise, art patron and director of the Howard Wise Gallery in New York City, approached Julesz and Noll to exhibit their computer-generated images. It was not until April 1965 that this first organized public exhibition took place. With computer-generated music as an ambient backdrop, the Wise Gallery displayed the first examples of what were called "computer-generated pictures." Whereas the BRL images had arisen from practical pursuits, Julesz was enthusiastic when he discovered that the IBM 7094 computer could "produce patterns of some originality and interest." 30 Likewise, Noll, his fellow exhibitor, came to computer-generated imagery by chance: a fellow intern created what Noll gladly called "abstract computer art" when a microfilm plotter erred and produced an unusual linear design. Reminded of Picasso's Ma Jolie, Noll was convinced that the image could "perhaps be considered a combination of abstraction and cubism." 31 With a laboratory environment that fostered relatively free exploration, encouraging researchers to pursue every avenue of unexpected error, it is easy to see why an interest in chance processes led scientists to acknowledge the potential for creative acts.

Although Julesz and Noll enthusiastically approached the invitation to exhibit their images, problems besieged the exhibition. Julesz was uneasy about using the term art in the title of the exhibition because the images began as stimuli for psychological investigations of visual perception. 32 Noll, by contrast, was quite comfortable identifying his works as "art" because the images were made "solely for their aesthetic or artistic effects." 33 A compromise was reached by titling the exhibition Computer-Generated Pictures. Much of the ambivalence about whether to call the images "art" stemmed from the initial response of the legal and public relations department at AT&T (which owned Bell Labs at the time). As Noll reflected some years later:

Although the research management staff at Bell Labs was very supportive of the Howard Wise Gallery exhibit, the legal and public relations folks at AT&T became worried that the Bell Telephone companies that supported Bell Labs would not view computer art as serious scientific research. Hence an effort was made by AT&T to halt the exhibit, but it was too late, since financial commitments had already been made by Wise. Accordingly, Bela and I were told to restrict publicity, and, in an attempt to foster such restriction, Bell Labs gave Bela and me permission to copyright all the pictures in our own names. 34

When Noll attempted to register the copyright for his piece Gaussian Quadratic (1962) with the copyright office at the Library of Congress, he was refused on the grounds that a "machine had generated the work." 35 Noll patiently explained that a human being had written the program, which incorporated randomness and order. Again the office declined to register the work, stating that randomness was not an acceptable criterion. Copyright was finally granted when Noll explained that although the numbers generated by the program "appeared 'random' to humans, the algorithm generating them was perfectly mathematical and not random at all." 36
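
The distinction Noll drew for the copyright office, numbers that look random to a viewer yet issue from a fully determined procedure, is the ordinary behavior of a seeded pseudorandom generator. The sketch below is not Noll's program; it is a minimal, hypothetical illustration in Python, in which the function name, the seed, and the quadratic vertical spacing (loosely echoing the work's title) are illustrative assumptions rather than documentation of his code.

```python
import random


def deterministic_points(n=99, seed=1965):
    """Hypothetical sketch, not Noll's code: 'random-looking' but fully
    deterministic point coordinates from a seeded pseudorandom generator."""
    rng = random.Random(seed)        # seeded, hence reproducible
    points = []
    for i in range(n):
        x = rng.gauss(0.0, 1.0)      # horizontal: Gaussian-distributed
        y = (i * i) % n              # vertical: quadratic spacing, wrapping
        points.append((x, y))
    return points


# The same seed always yields the identical "random" picture, which is the
# sense in which the output is perfectly mathematical and not random at all.
assert deterministic_points() == deterministic_points()
```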

If the scientists were expecting a positive reception after their initial difficulties, they were to be disappointed. The reaction to the Howard Wise exhibition was hostile from artists and critics alike. The New York Times art critic Stuart Preston opened his review with: "The wave of the future crashes significantly at the Howard Wise Gallery." 37 Although the exhibition was a significant landmark, generating a certain amount of technical interest, the criticism ranged from "cool indifference to open derision." 38 Preston described the imagery as "bleak." 39 The art critic in Time magazine, in a tone of begrudging resignation that computerized art had finally arrived, described the pictures on display as having "about the same amount of aesthetic appeal" as "the notch patterns found on IBM cards." 40 The New York Herald Tribune denounced the works as "cold and soulless," a criticism that would continue to haunt computer art. 41 While the Howard Wise Gallery was the premier commercial venue for presenting innovative art, and the exhibition received considerable press attention, none of the work sold. 42 Noll admitted in retrospect that the public and media's response was "disappointing." 43

Many commentators have suggested that the cool reception of computer art was the inevitable result of friction between the scientific and artistic communities—what Goodman called the "uneasy liaison." 44 The antipathy between the two communities, first articulated in C. P. Snow's influential book The Two Cultures and the Scientific Revolution, 45 pervaded early commentary on the emerging art form. These hostilities were played out in science-based publications that tended to depict the technologist as a zealous scientist forging new artistic paths, while the artist, characterized as defiant and lacking fortitude, languished in the doldrums of technological ignorance. 46 While Herbert Franke, the first significant writer on computer art, recognized that the computer as art maker had raised and exposed many problems, he stated that only members of the scientific community had the language, awareness, and skill to approach the new form. 47 Those who presented themselves as art pundits, according to Franke, needed to give way to "scientists, mathematicians and technicians who, becoming involved in the discussions... injected new energy into the field." 48 In response, the critic Robert Mueller averred that the scientist, who has "no detectable knowledge of the tradition of artistic visual work," was making "work entirely without artistic meaning and completely sterile visually." 49 For both the artist and the critic, the scientist's work was dull and lifeless, evidence of aesthetic ineptitude.

Even as artists began to use the computer, the label of aesthetic ineptitude remained the most common censure of computer art. While the linear abstract designs of early computer-generated art were often simple, the rejection of computer art on aesthetic grounds was more an emotive response than a critical one. The marked animosity between artist and scientist—presented as a deep cultural divide—informed much of the early reception of computer art, but a deeper, more universal response to the computer as an unfamiliar machine sustained the criticism. After all, the critical response was not confined to the United States, where the "Two Cultures" debate was most ardently contested. In West Germany, the artistic community responded to the computer with distrust, even "unrest." 50 Even in Japan, with its cultural embrace of technology, the artistic community was apprehensive. Haruki Tsuchiya observed that artists who were not computer professionals were extremely suspicious of computer art. 51

STERILE AND COLD: THE BIRTH OF COMPUTER AESTHETICS

In the 1960s, one of the most common charges computer artists made against the art establishment was that their artwork was initially accepted on its merits, only to be rejected once curators discovered that it had been produced with the use of a computer. 52 The computer itself, not any aesthetic deficiency of the artist, appeared to stigmatize the art. In fact, the aesthetic criticism of computer art can be assessed by comparing it with critically established art movements of the day, with which it shared significant commonalities. Although comparisons between computer art and parallel art forms of the 1960s can be risky, their aesthetic and process-related commonalities are too extensive to ignore.

The most prominent examples of these shared characteristics are two serial-based bodies of work, one by Sol LeWitt, the foremost exponent of conceptual art, and the other by Manfred Mohr, an equally important figure in computer art. In the early 1970s, both artists completed a series of works that focused on three themes: the cube, seriality, and incompleteness. Although Mohr's Cubic Limit series begins in 1973, effectively predating LeWitt's Variations of Incomplete Open Cubes (1974), both Mohr and LeWitt had long explored the construction or deconstruction of the geometric cube. For both artists, the cube was devoid of subjective overtones, the "least emotive" of
