VIRTUAL WORLDS
VIRTUAL WORLDS
Culture and Politics in the Age of Cybertechnology
PRAMOD K. NAYAR
SAGE PUBLICATIONS
NEW DELHI • THOUSAND OAKS • LONDON
Copyright © Pramod K. Nayar, 2004

All rights reserved. No part of this book may be reproduced or utilised in any form or by any means, electronic or mechanical, including photocopying, recording or by any information storage or retrieval system, without permission in writing from the publisher.

First published in 2004 by

Sage Publications India Pvt Ltd
B-42, Panchsheel Enclave
Post Box 4109
New Delhi 110 017

Sage Publications Inc
2455 Teller Road
Thousand Oaks, California 91320

Sage Publications Ltd
1 Oliver’s Yard
55 City Road
London EC1Y 1SP
Published by Tejeshwar Singh for Sage Publications India Pvt Ltd, typeset in 10/12 Cooper LtBT Normal at C&M Digitals, Chennai and printed at Chaman Enterprises, New Delhi.

Library of Congress Cataloging-in-Publication Data

Nayar, Pramod K.
Virtual worlds : culture and politics in the age of cybertechnology / Pramod K. Nayar.
p. cm.
Includes bibliographical references and index.
1. Information society. 2. Information technology—Social aspects. 3. Popular culture. 4. Computers and civilisation. I. Title
HM851.N39 2004
303.48′33—dc22 2004000998

ISBN: 0-7619-3228-3 (US-Hb)
0-7619-3229-1 (US-Pb)
81-7829-357-9 (India-Hb)
81-7829-358-7 (India-Pb)
Sage Production Team: Sarita Vellani, Proteeti Banerjee, Rajib Chatterjee and Santosh Rawat
For nandini@everymillenium
CONTENTS

List of Boxes 9
Preface 11
Acknowledgements 15

1. TECHNOCULTURE 19
   Technē, Technology, and High Tech 25
   Culture 28
   Theories of the Information Society 46
   Key Concepts and Terms 65
   Technologies 73

2. ART, AESTHETICS, POPULAR CULTURE 89
   Literature 91
   Popular Visual Culture 127
   Music 137
   Dance, Choreography, Installation and Performance Art 141
   Architecture 144
   Computer Mediated Learning 150

3. POLITICS 157
   Cyberspace and the Social 159
   Cyberpolitics 174
   Cyberwar 205

4. BODY 211
   Cyborgs 215
   Posthumanism, Informatics, and the Body 217
   Body Modification 223
   Technological Embodiment 228
   Cosmetic Surgery 245
   The Body, Eroticism, and Sexuality in VR/Cyberspace 248
   Cyborg Soldiers 251
   Posthuman (Body) Arts 254

5. GENDER 263
   The Feminist Critique of Science 264
   Technologies: Feminist Readings 272
   Gendering ICTs, Development, and Globalisation 278
   Cyberfeminism 282

Bibliography and Webliography 313
Index 341
About the Author 345
LIST OF BOXES
2.1 Technorealism 90
2.2 Storyspace 102
2.3 William Gibson 118–19
2.4 The Magic of Cyberspace 125
3.1 Cybercrime 165
3.2 The Cyborg Bill of Rights 179
3.3 Digital City 192
3.4 Infowars 204
4.1 Modern Primitives 224–25
4.2 Genetically Modified Foods 234
4.3 Botox 245
4.4 Anti-aging and Death 247
4.5 Biometrics 253
5.1 Donna Haraway 294
PREFACE
Technology is a critical category that we need to incorporate into our readings of culture since it is, in the wired years of the early twenty-first century, as much a component of our lives as our class affiliations or gender. Virtual Worlds: Culture and Politics in the Age of Cybertechnology seeks to explore the highly technologised cultural conditions of our times. Cyberculture, as the condition is commonly termed, comprises the technologised, wired and networked environments in which daily life is lived in metropolitan cities across the world. Since cybertechnology affects all aspects of life today, any reading of cyberculture necessitates interdisciplinarity in frameworks and methodologies, and this I have tried to do, albeit with all the limitations imposed by my ignorance of other disciplines. Hence the book discusses cyberculture as it mediates and is in turn influenced by globalisation and politics, architecture, medical science and war. Undergirding the debates around cyberculture is the idea of the posthuman—the technologically enhanced, wired, and chemically/surgically altered human who arrives with cyberculture and the informatisation of life.

The book is intended as a modest introduction rather than a critical appraisal of contemporary technoculture. Discussions of major issues and concerns such as ecofeminism or reproductive technologies are kept brutally and unsatisfyingly short because the aim is to provide a survey of various technologies and debates rather than in-depth analyses of particular ones. The final chapter, ‘Gender’, however, serves the purpose of a critical evaluation of cybertechnology. I have used only one category of cultural criticism to read cyberculture. Similar exercises may be performed using other categories, such as race or class. Finally, the question of the ethics of technology has not found place here. Debates concerning
the ethics of various technologies and sciences are too vast and complex to be included in an introduction of this kind. The chapter on gender, however, has a politico–ethical orientation with its debates over gender discrimination and the gendered nature of the development of Information and Communication Technologies (ICTs).

Virtual Worlds focuses on four areas of contemporary cyberculture. Chapter 1, ‘Technoculture’, provides the conceptual frameworks for reading cyberculture, surveying the technologies and theories of the information society, and exploring various aspects of ‘technē’, culture, and science. Chapter 2, ‘Art, Aesthetics, Popular Culture’, discusses how these cultural domains became increasingly technologised in the 1980s and 1990s. Chapter 3, ‘Politics’, discusses the impact of contemporary technology on politics and concludes with a section on cyberdemocracy. Chapter 4, ‘Body’, looks at the one ‘site’ where contemporary technology has had its greatest interventionary role. Chapter 5, ‘Gender’, is of a more polemical nature, and organises issues raised in the preceding ones. An exercise in cultural studies, it surveys the relationship of gender with technology and the rise of cyberfeminism. A detailed bibliography and webliography for further reading is also provided.

Other facets of cyberculture studies have, unfortunately, not been included in this book: science and nationalism and science and modernity in India, race and ethnicity, and queer studies. For studies of these areas of the new ICTs see Ashis Nandy (1990), Donald Morton (1999), Gyan Prakash (2000), Beth Kolko (2000), Veronica Hollinger (2002 [1999]), Lisa Nakamura (2002), Catherine Ramirez (2002), and others.

Virtual Worlds had its germinal form in a short coda, ‘Technocriticism’, in my previous book, Literary Theory Today. Some of the ideas have appeared as pieces of journalism: ‘Her Body, Her Software: The Art of Orlan’ (Deccan Herald, 11 January 2001), ‘L’Ecriture Digital: Writing Bodies in Cyberspace’ (Deccan Herald, 28 January 2001), and ‘Welcome the Posthuman: Stelarc’ (Deccan Herald, 5 February 2002). A section of Chapter 4 appears as ‘Configuring the Techno-body: Towards a Posthuman Culture’ in Sura P. Rath et al. (eds), Reflections on Literature, Criticism and Theory: Essays in Honour of Professor Prafulla C. Kar (2004).
A word about the Webliography. Websites, as regular browsers will have discovered, are notoriously ephemeral. I have tried and updated links and URLs, but it is more than likely that some of these will have changed by the time this book appears in print.

PKN
Hyderabad, July 2003
ACKNOWLEDGEMENTS
I owe a huge debt to several people, without whom this book would not have been this book.

My parents, who make everything possible. Their loving (and understandably, anxiety-tinged) patience at my writing frenzy and long periods of disappearance into libraries and the study is truly monumental.

Smt Nilima and Sri Anil Khadakkar, whose affection and warm hospitality enabled me to recover from months of writing. Their patient encouragement has been inexhaustible.

To Dr Anna Kurian, who is, inexplicably, both colleague and friend, I owe a debt beyond measure. She read and provided incisive comments on every chapter. This book’s content is partly the result of her critical acumen.

Neeraj Khadakkar and Claire McCann, whose endless generosity in supplying wagonloads of books borders on the wondrous. Without their invaluable gift of over two dozen cyberpunk texts, this book would simply not have happened.

Sara Valentine (NY), for her gift of Philip Dick’s short fiction, and Colin Harrison (Liverpool, UK) for his ready photocopying and despatch of critical material.

B.V. Tejah for drawing my attention to the Performing Arts Journal’s issue on Virtual Performance.

The staff of the Whipple Museum of the History of Science, University of Cambridge, UK, for their help with journals and other material during my stay there, 2000–2001.

The Smuts Memorial Fund (University of Cambridge), which facilitated a beautiful year in Cambridge (UK), where much of the preliminary research for this book was done.

The US Department of Energy Office of Biological and Environmental Research, the Human Genome Management
Information System (Oak Ridge, Tennessee, USA), the American Judicature Society (Chicago) for the vast amount of information they posted to me periodically, enabling me to keep track of recent developments in genome research.

The Director and staff of the American Information and Resource Centre, Chennai, India (especially Mr Jagadish), for their prompt despatch of numerous books and articles that I requested with such frightening regularity.

Ajeeth and Panikkar, for their affectionate friendship.

My teachers and mentors: Professor Mohan Ramanan, whose bemused and kind interest in my work, and incomparable personal affection over the years, has been indispensable. Professor Sudhakar Marathé, for his inspiring interest in the culture of science, the constant supply of related books and references, affectionate friendship, and wry wit that enlivens the workplace. Professor Narayana Chandran, whose comments on the ‘Technē-Coloured’ course, and reading—delivered with his patented brand of humour—have been extremely thought-provoking. Professor Probal Dasgupta, for the inspirational conversations of several years, his generous sharing of knowledge, and the train of puns tracking through the groves of academia.

Niyati Dhuldhoya, who generously allowed me to persuade her to read Chapter 1, and offered valuable criticism.

Book review editors and referees for offering insightful criticism of my reviews: Joanna Zylinska of Culture Machine (Teesside), Joseph Behar of Social Science Computer Review (South Carolina), and David Silver of the Resource Center for Cyberculture Studies (Washington).

Members of the course, ‘Technē-Coloured Dreams: Literature and Contemporary Technology,’ in the M.A. programme at the Department of English, University of Hyderabad, 2002–3—the first victims of several of these ideas. Thanks especially to Ravi Chatterjee, B.V. Tejah, Nirmal Panda, Surendra Athawale, N.P. Ashley, Philip
Rajeeth and Venkatesh for their stimulating discussions of numerous issues in cyberpunk.

The mysterious reader of the MSS, for valuable comments and suggestions.

Debjani Dutta of Sage (India), for her patience and remarkable promptness in responding to my no doubt irritating queries.

Proteeti Banerjee, my editor at Sage, for her incredible pace of work, and incomparable attention to detail.

My son, Pranav (all of 1½ years!), for switching off the PC at critical moments, making it imperative that I work faster in order to spend time with him!
[Cyberspace] A consensual hallucination experienced daily by billions of legitimate operators in every nation, by children being taught mathematical concepts … a graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding …

William Gibson, Neuromancer, 1984
There was no there there.
William Gibson, Idoru, 1996

It is the coming of the virtual itself which is our apocalypse, and this deprives us of the real apocalypse.

Jean Baudrillard, Interview, 1997
CHAPTER ONE
TECHNOCULTURE
Networked and wired, jacked-in and cyborged … the rhetoric erupting from the new technoscientific reservoir inundates descriptions of contemporary culture and the popular imagination. Email and digital cameras, mobile phones and MP3 are significant cultural conditions of the late twentieth century. From online shopping to e-governance to electronic voting, we perceive an increasing ‘technologisation’ of the quotidian. The late twentieth century is the age of ‘high tech’. The ‘new electronic tribalism’, as the philosopher and culture critic Jean Baudrillard termed it (2000 [1985]: 98), represents the age of high-tech gadgetry that takes sophisticated technology out of scientific laboratories and research centres and transforms it into devices of everyday use.

Virtual Worlds: Culture and Politics in the Age of Cybertechnology is a survey of the technocultural condition of the late twentieth century or the ‘age of information’. Technoculture today is primarily ‘info-culture’, where knowledge and information (though by no means are the two synonymous) constitute the very cultural fabric of society, especially in the Western world. While ‘cyberculture’ is used in common parlance to refer to Information Technology (IT) or, more accurately, Information and Communication Technologies (ICTs), and specifically the use of Internet technology, I have broadened the use of the term. The assumption here is that we live in an ‘information society’ where information is central to production, consumption, politics and everyday life. If culture is about ‘everyday life’, as James Lull (2000: 131) suggests,
then cyberculture is about an everyday life that relies extensively on information and information technology—what the philosopher of hypermodernism, Paul Virilio (Virilio and Kittler 1999b), terms ‘monotheism of information’. Thus, while at first glance body modification and artificial reproduction might seem out of place in a book on cyberculture, a brief survey of any of these practices will reveal the extent to which they depend on ICTs. We discover that genetic engineering and crime detection, info-medicine and space exploration all rely heavily on ICTs, especially digitisation (translating into numbers) and data storage/retrieval. Cyberculture is a cultural condition where technology affects every aspect of our life—from cosmetics to crime, politics to pornography, and shopping to surgery.

My use of the term ‘cyberculture’ acknowledges the etymological connotation of the term ‘cyber’, which means ‘to steer’. The computer, the mobile phone and the Internet, integral to ‘cyberculture’, constitute the newest moment in what I term the ‘instrumental imperative’ of human life—the drive to convert objects, including our own bodies, into tools for use. Cyberculture today is literally a culture where information and communication technology steers our life—and the late twentieth century can be said to exhibit a ‘cybercultural turn’, or to have gone through a cyber pass to enter the world of information, the posthuman and the technologically embodied.

The world of information itself has been dematerialised or virtualised. When data, say, a photograph, is fed into a computer, it is rendered into numbers. The numbers constitute a description, highly accurate, of the photograph. The photograph does not exist as a two-dimensional (2-D) image. While it does occupy a physical space, it is virtualised. The encoded description of the photograph can be translated into a visible image in a number of different media: on screen, in print form, etc. While the digital encoding is not strictly ‘immaterial’, it occupies less space than a photograph. But it is fluid and volatile (Pierre Lévy’s terms, 2001: 56), and the digital recording is ‘implicit in [its] visible manifestation, it is neither unreal nor immaterial, but virtual’ (ibid.: 56, emphasis in original). This is the virtualisation of information, where information is disembodied but waiting to be actualised, materialised, and made manifest.

A new form of the human—the posthuman—is emerging (or has emerged already, while we were sleeping), with prosthetic devices, altered body, and networked with computers. Much of
what follows in this book modulates into a discussion of the state of the human—in terms of body, mind, or ‘culture’—with the impact of the informatisation of life on the human. The posthuman condition, characterised by this heavily informatised or mathematised human, is one where the dependence on technology—especially software, the Internet and databases—has increased. Every ‘discipline’ of the human sciences (if one may evoke such antiquated terms)—geography, medicine, demography, biology, politics—uses computers and data. This mathematisation produces a new way of looking at humans: as a set of data. Conversely, the human is dispersed, disseminated into the database of credit card records, election rolls, medical histories and tax returns files.

This posthuman condition need not necessarily be a frightening one. Those who feel threatened by the fragmentation of the human subject and identity might want to recall that it has been done before by Freud—the first real demolition man for liberal humanism’s myth of the unified human being. The posthuman subject is actually multiple and enhanced. Dissemination into the cybernetic circuit or ‘reduction’ into data does not necessarily mean the end of the human. Rather, we need to see the human as re-configured and organised differently. This posthuman is constituted by technology and articulated with technological devices—perhaps even improving her/himself as a result.

The posthuman is the ghost that insinuates itself into the pages of this book. While this book does not quite systematically track the figure of the posthuman at every stage of its progress through late twentieth century culture and politics, we certainly have fleeting glimpses of a form that disappears around the corner just ahead of us. Orlan and Stelarc, to mention just two examples, give us some idea of this ghostly apparition that haunts the debates here. Virtual life, lived actually, is a description of this (and our) technologised posthuman condition where, disembodied, we meet other similar apparitions in cyberspace. Our world is becoming increasingly virtualised, while the virtuality itself awaits embodiment and arrival in real time and space. Every digital avatar seeks a body. The virtual world is not, therefore, simply the world of cyberspace: it is the potential actualisation of this world into the real. Technology is what facilitates this landscape of dreams and fantasies and its interface with the real. High tech is easily the technology of virtualisation in the late twentieth century.
This book is thus a cultural study of contemporary ‘high tech’ and its impact on the human. Technology is here to stay, and as users/consumers of such technologies (even in the role of recipients of Western technologies: this book has been written with Microsoft’s help!), we need to be aware of their cultural implications. Adapting Martin Heidegger’s (1977) idea, we can say that the essence of technology is not technological but cultural, and the critique of technology is not the task of technologists alone. R.L. Rutsky succinctly sums up the issue when he writes:

High tech also involves […] a noninstrumental or ‘nontechnological’ aspect. This ‘nontechnological’ aspect is linked to a realm that has generally been cast as the polar opposite of modern technology: that of art and aesthetics […] High tech can no longer be defined solely in terms of its instrumentality or function […] technology becomes much more a matter of representation, of aesthetics, of style. (1999: 3–4)

Rutsky’s idea is an informing assumption of this book, which studies high tech as a cultural and political phenomenon. The technological–technical (the realm of engineers and scientists) is studied in the realm of the cultural, since this is the epicentre of its impact. These technologies are cultural technologies, where the realm of social structures, identities, and individual or national ‘agency’ are all affected by the arrival or modification of a particular technology. Every society has to make large- or small-scale changes to accommodate new technologies. For instance, technological ‘solutions’ offered for problems in developing countries invariably bring with them non-technological problems. Issues such as national and cultural identity, foreign dependency and local or indigenous technologies are at the heart of these debates.

Technologisation has always directly affected working conditions, the division of labour, social structures and cultural–technical skills. In today’s information-driven Indian (metropolitan?) society, software engineering has become a much sought-after skill, thus altering the demand for education in other areas. A technological shift—inspired, also, by the globalisation of the Indian economy since 1991—thus also marks a cultural shift of attitudes towards skills, education and employment.1 Emails and mobile phones have radically altered the shape and context of social interaction. Digital movies and
advances in sound technology can decide the success of cultural artifacts such as motion pictures by influencing audience preferences. Further, technologies frequently double as instruments of surveillance and control. From highway speed cameras to state dossiers on citizens, all technologies possess an immanent ‘pervertability’ (as the philosopher Jacques Derrida warns us). This ‘pervertability’ also has its impact on the social and cultural realm. All culture, therefore, is technoculture, with varying degrees and configurations of the technological.

Moreover, cultural ‘texts’ are not passive reflections of contemporary science and technology. They shape what technologies mean, and underline or question what the scientific theories mean in terms of culture. Technoculture, as Andrew Ross and Constance Penley suggest, also originates in the ‘work of everyday fantasies’ (Penley and Ross 1991: xiii). The dreams and aspirations of common people are as influential in creating, modifying and disseminating technoculture as corporate profit motives. Cable TV, broadband transmission, medical scanning methods and email are as much social creations as technological (as social studies of technology demonstrate). Penley and Ross argue that technologies are not repressively foisted upon populations. They are developed at a specific place and time ‘in accord with a complex set of existing rules or rational procedures, institutional histories, technical possibilities […] and popular desires’ (ibid.: xiv). Bruno Latour (1987), for instance, has demonstrated how institutions, people and laboratories are ‘actors’ in the construction of science. Social constructivist views of science embed the technology in its cultural contexts—both in terms of production and consumption—and see theories and objects (inventions) as originating in social, non-scientific and non-technological cultural contexts (see section on ‘Science as/is Culture’ in this chapter and Chapter 3 for more details on social studies of technology).

Addressing the question of the cultural ‘neutrality’ of technology, Arnold Pacey suggests that we distinguish between technological knowledge and technological practice. Technology practice treats technology ‘not only as comprising machines, techniques and crisply precise knowledge, but also as involving characteristic patterns of organization and imprecise values’ (Pacey 1983: 4). Pacey therefore suggests a useful threefold concept of ‘technology-practice’. According to him, technology practice has three elements: organisational, technical and cultural.
• The organisational aspect represents administration and public policy; it relates to the activities of designers, engineers, technicians and production workers, and also consumers and users of the products produced.
• The technical aspect has to do with machines, techniques and knowledge.
• The cultural aspect involves the goals set, as well as values and ethical codes. It is constituted by the sense and definition of progress that drives scientific development. (Pacey 1983: 5–6)

Pacey then defines technology practice as ‘the application of scientific and other knowledge to practical tasks by ordered systems that involve people and organization, living things and machines’ (ibid.: 6). Pacey’s notion of technology practice and its constituents is, I think, a useful way to think about technoculture.

Further, we need to distinguish between the technical and the technological. ‘Technical’ refers to the knowledge, skills and techniques involved in the development of a particular technological product. It refers to the technical aspect of technology practice as defined by Pacey. ‘Technological’, on the other hand, includes the organisational aspects involved in the development of a technological product or solution to a problem. Any study of technology must therefore look at the technical, politico–cultural and economic conditions in which technology is produced and consumed.

Andrew Webster summarises the changing relationship between science (that is, scientific knowledge), technology and society (1991: 3–5).

• The nature of scientific inquiry within the laboratory has changed. Experiments are conducted by teams of researchers rather than solitary scientists. (We recall here Mary Shelley’s image of the solitary scientist toiling away in his secret laboratory, out of sight of the world, in her 1817 cult text, Frankenstein.)
• With the blurring of the difference between ‘pure’ and ‘applied’ research, the distinction between science and technology is breaking down. Both science and technology have been industrialised today.
• There is a major reliance on corporate labs for basic research. The exploitation of scientific knowledge has
become crucial for the survival—in a highly competitive world—of multinational corporations.
• Science and technology today are firmly located in the political arena because of their importance for the state. All governments today fund research.

Armed with these insights we can now explore the various concepts that figure prominently in this cultural study of contemporary technology: technē–technology–high tech and culture.
TECHNĒ, TECHNOLOGY, AND HIGH TECH

‘Technē’, the root of the word ‘technology’, has connotations that have been lost in contemporary times. Originally, it meant any skill or craft, and described a range of activities from engineering to the arts. The Greeks, for instance, referred to even art as ‘technē’. Slowly, it split along two lines: one, the ‘technical’ or ‘technological’ and two, the arts. The rupture between the technical and the artistic had repercussions in cultural fields, when the social ‘sciences’ and the humanities became ‘subjects’ distinct from ‘science’. Technology became the privileged province of the latter. And the arts, seen as the very opposite of technology, were deemed nontechnical. The ‘scientific revolution’ and subsequent developments reinforced this division—a division that persists to this day.

Technology, in modernity, has been conceived strictly in an instrumental sense. Indeed, modernity, defined in terms of this instrumental rationality and technology, is the basis on which the West distinguishes itself from ‘non-technological’ others. Cultures that perceive the world in terms other than those of scientific–rational and technological control are, therefore, often characterised as feminine, anti-modern, or irrational.

Unlike modern technology, high tech is no longer defined solely in terms of its functionality. High tech today is also a matter of style (aesthetics). This means that technology today advertises itself not merely in terms of its functions but also its aesthetic appeal. Studies of high tech therefore also entail a study of the representations of technology. We see this phenomenon at work when nontechnological objects from shoes to art are described as possessing
a ‘high-tech style’. And in contrast, extremely sophisticated technological devices are described as ‘state-of-the-art’. High-tech design abandons the modernist obsession with mere functional use or efficiency in favour of expressions of cultural styles and desires. More significantly for our study of technoculture, the very functionality of high tech becomes a matter of style and representation. In high tech, Rutsky writes, ‘the ability to technologically reproduce, modify, and reassemble stylistic or cultural elements becomes not merely a means to an end, but an end in itself’ (1999: 4). In the age of high tech, then, technology is increasingly a matter of aesthetics or style. And, keeping in mind this aesthetic aspect of high tech, Rutsky suggests that we refer to the very concept of technology as ‘high technē’, capturing both the resonance of the term ‘technē’ and the advancement of technology itself (ibid.: 5).

Two important moments in the history of modern technoculture set the scene for a reading of cyberculture—modernist aesthetics and technology, and high-tech aesthetics.
Modern Aesthetics and Technology

Modernist aesthetics of the late nineteenth and early twentieth century sought to merge the technological and the aesthetic realms. It linked the spiritual with the technological, treating mathematical and abstract geometric forms as possessing spiritual attributes—mainly in terms of reflecting eternal forms and values. Conversely, the modernists also sought to ‘technologise’ art, to render it practical and functional. When modernism sought to ‘demythologise’ the magical value of the artifact, it did so in technicist terms. Just as modernity’s (and the seventeenth–eighteenth century European Enlightenment’s) view of the world was based on a rejection of the magical and spiritual conception of the world, artistic modernism was based upon the death of the ‘aura’ around the artifact (Benjamin 1969).

Modernism equated technological reproduction, its assemblage and collage with the rationality and functionality of mass production (Rutsky 1999: 11). It produced a ‘machine aesthetic’ that simulated the standardised forms of machines and factories. In a sense, this aesthetic of technological form leads to a conception of technology as noninstrumental, something that anticipates the high technē of contemporary culture. Technological form, in the machine aesthetic
of modernism, becomes separated from its function. And the function of technology itself becomes a matter of reproduction. In short, with modernism, technology undergoes an aesthetic ‘turn’ and aesthetics undergoes a technological ‘turn’.
High Tech Aesthetics

Like modernism, high tech also sees technology as an aesthetic phenomenon. Here, any cultural form or element can be abstracted from its previous context and videotaped, digitised and reassembled. A curious paradox marks this high tech aesthetic. The tendency of high tech is towards minimalist design, towards reducing the world into smaller and more manageable forms. The ‘bit’ or ‘code’ culture of the late twentieth century is an example of this minimalisation of the world. Paradoxically, with the proliferation of these bits (of data), what we get is an extremely high degree of complexity. The minimalism of high tech with its minute pockets of data is also, peculiarly, its complexity. In high tech this complexity is beyond human understanding and control, and exhibits a certain autonomy. This is the moment when technology becomes technocultural (Rutsky 1999: 13). The general incomprehensibility of the late twentieth century’s culture of ICs (integrated circuits), microprocessors and software codes suggests that we are immersed (and surely the term ‘immerse’, used to describe the experience of virtual reality, is apposite) in technological complexity.

Technoculture exhibits a certain mutational aspect. High tech appears to possess a logic of its own, beyond human control. Chaos theory (as explained by popular science writers like James Gleick, and in novels such as Michael Crichton’s Jurassic Park) has demonstrated how a system, after reaching a certain level of complexity, mutates. Various bits of software, for example, in their interactions may reach this level of complexity and start interacting in ways that cannot always be foreseen. When these systems are linked up with bigger networks, the possibility of ‘bugs’ increases with the complexity of the network. When in 1989 AT&T suffered a massive breakdown due to such a bug, one of the company’s technology directors observed: ‘When you’re talking about even a single system, it’s difficult. But when we’re talking about systems of systems, then the risks are greater. All of these stored
programs are interacting with each other and that makes it hellishly difficult’ (quoted in Rutsky 1999: 17).

In high tech, technology must therefore be seen as possessing its own ‘life’ and agency. Technology is an ongoing process of mutation and generation. It possesses its own mutational ‘aesthetic’. Further, we are part of this system of technocultural reproduction. As we shall see in the discussion of the posthuman condition, technology can no longer be seen as an ‘Other’ of the human subject (see especially the discussion of Katherine Hayles’ work below and elsewhere in the book). We need to see ourselves (humans) as sharing boundaries and selves with technology. Technology is no longer a prosthesis, a secondary ‘addition’: it is us. This changed idea about the human subject and technology—the new technocultural condition—is the subtext of this book.
CULTURE

This extraordinarily slippery word/concept is the second half of the term I have been using with such regularity, ‘technoculture’. I shall quickly summarise those theories and approaches that are relevant to our reading of contemporary technoculture. As shall emerge, the most persuasive readings and explications of the term/concept ‘culture’ have come from the Left. The work of Stuart Hall and others appears more convincing for our reading of the close link between media, technology and culture.

Culture, Raymond Williams informs us, is one of the most complex words in the English language (1976: 76). In the nineteenth century the term began to be associated with the elite. The idea of ‘high’ culture was implicit in the very term. ‘Culture’ came from the Latin ‘cultura’ and ‘colere’, meaning inhabit, cultivate, protect and honour with worship. The general process of ‘tending’ acquired class connotations in the eighteenth century. ‘Culture’ as a noun was not common before the nineteenth century, when ‘civilisation’ and ‘progress’ became important ideas of human development. Three general usages of the term are now discernible: culture as (a) a general process of intellectual, spiritual and aesthetic development (eighteenth century); (b) a particular way of life (nineteenth century); and (c) a noun that describes
the works and practices of intellectual and artistic activity. Culture is now music, literature and other arts (Williams 1976: 76–82).

With the rise of cultural studies as a disciplinary field—influenced primarily by the work of Williams—the twentieth century ‘crisis’ of culture, especially the collapse of the distinction between ‘high’ and ‘low’ culture, has been located as a part of the social history of modernity (Chaney 1994: 13). With the influence of Raymond Williams, Michel Foucault and Stuart Hall, writers in cultural theory and cultural studies have developed powerful critiques of the term/concept of culture. Invariably, two categories are used to discuss the concept: power and class. Marx, who proposed that social and economic processes were cultural, and that ideology and power mediate social relations, is a major influence in the work of Stuart Hall and other culture critics.

Culture, for John Hall and Mary Neitz, includes both material things and ideas. They therefore define culture as (a) ideas, knowledge and recipes for doing things; (b) humanly fabricated tools (from shovels to computers); and (c) products of social action that may be drawn upon in the further conduct of social life (such as television). As can be seen, the definition includes both material and ‘symbolic’ culture within its ambit (Hall and Neitz 1993: 4–5). And for the purpose of this book—focusing, as it does, on actual material objects and the ideas and ideologies that produce, modify or govern these objects—it appears to be a fairly inclusive definition.

Computers, globalisation and ICTs have transformed contemporary culture. Much of this transformation has been in the realm of consumer culture. However, this change in itself is not a new one. Consumerism began in England during the sixteenth century, when novelty and fashion became standard practices of specific classes. Mass consumption began in the eighteenth century, and by the nineteenth century different styles to suit different classes had begun to emerge. Consumption, as Rosalind Williams (1982) suggests, began to be seen as an expression of attitude. After World War I, debates about ‘mass society’ and ‘mass culture’ began to emerge.2 The status of individuals began to be ascribed to their social groups. And with the development of mass communication, critics began to fear a homogenisation of culture. Walter Benjamin’s (1969) comments on works of art in the age of mechanical reproduction draw attention to the loss of uniqueness
suffered by an artifact with mass consumption/reproduction technologies. With the technology to reproduce an ‘original’ any number of times, an artifact becomes accessible to large audiences across the world. As a result, the ‘aura’ of uniqueness that makes the work of art special is lost. It is not autonomous any more. Mass culture, therefore, apparently erases true authenticity, autonomy and subjectivity.

Pierre Bourdieu’s work on status and class (1999 [1989]) is a fascinating reworking of Max Weber’s sociology of culture. Bourdieu suggests that basic lifestyle—what he terms ‘habitus’—is constituted by class-located family household life formed by marketplace consumption, family culture and education. People in more or less stable societies participate in a struggle for social distinction, which is class-determined. Bourdieu uses the term ‘symbolic capital’ to refer to the knowledge, taste, sensibilities and material possessions that enable a person to claim a specific status (‘distinction’). Thus cultural capital depends upon a market wherein individuals or families compete for social distinction defined by the cultural and economic capital they possess (ibid.: 169–225). What a person learns culturally is influenced by the tastes and everyday activities of people who occupy the same social class. Though economic and symbolic capital may be symmetrical and opposite, they are interconvertible.

In Bourdieu’s general economy of ‘practices’, economic profit is simply one form of capital along with the symbolic, cultural and political. The relationship between forms of capital and power depends upon the ‘field’. Thus in the economic field, possession of economic capital bestows power. In the field of cultural production, on the other hand, there is not always a correlation between economic and cultural value. Success in the field of cultural production is not measured in economic terms but rather in the creation of value—that is, the generation of belief in the value of art or science for its own sake, the legitimacy of the artist or scientist as a creator of valued objects. Thus, Bourdieu suggests that symbolic capital and the generation of value is as valid for cultural production as money is for economic production. For Bourdieu, the relationship and interconvertibility among different forms of capital are controlled by class interests. Thus redrawing the boundaries of the field, legitimating certain ‘expertises’ with value, and securing that value with institutional support are all modes of control over symbolic and cultural capital.
For the twenty-first century, then, certain kinds of technology bestow distinction, and generate both economic and cultural capital. For instance, in William Gibson’s Neuromancer and other novels, a lot of emphasis is laid on the brand of computers and software used. Gibson underlines the material culture, notions of class, and the prestige that accompany high tech. As we shall see, technology can become a defining feature of class affiliation, and access to/possession of certain technologies or skills in ICTs, for instance, can distinguish the techno-rich from the techno-peasant. Technology, then, is related both to power and social status.

The content of what we see/hear in mass media, as the media theorist Marshall McLuhan famously argued, is less important than the way in which the medium organises our world. We incorporate sounds and ‘bits’ from our immediate ‘lifeworld’ into our activities. Reading may distinguish between people. We choose specific kinds of reading, and the difference in print-driven knowledge/interests can be quite wide. Television, however, performs differently. Oriented to more general audiences, it offers them knowledge previously available only to specialists. Thus an awareness of the tribes of Africa (previously known to anthropologists or academics working in the area), politicians’ lives (previously known to intimates and biographers) or child labour in Bangladesh and India (previously the subject of only social workers’ or UN workers’ knowledge) spreads among the general population. New relationships—of community-building or alienation—among people arise as a result of their access to particular technologies. Technology thus shapes cultural and social interaction.

In the twentieth century, with the widespread social role of business corporations, culture itself has become subject to their power. The emergence of consumer society is based not merely on the production of consumer goods, but also on activities such as advertising and market research. These are activities that constitute a wider social institutional network which includes politics, information processing, planning agencies, government bureaucracies and scientific organisations. That is, when we are caught up in the throes of shopping for goods, we are ‘organised’ by governmental and corporate systems. Further, when we purchase a particular brand—associated with a specific social status—we are actually buying an image associated with that product. This
image—symbol, in other terms—is the effect of representational practices. Stuart Hall is particularly emphatic on the point that culture is material. He writes:

Culture has ceased […] to be a decorative addendum to the ‘hard world’ of production and things …. Through design, technology and styling, ‘aesthetics’ has already penetrated the world of modern production. Through marketing, layout and style, the ‘image’ provides the mode of representation and fictional narrativization of the [human] body on which so much of modern consumption depends. Modern culture is relentlessly material in its practices and modes of production. And the material world of commodities and technologies is profoundly cultural. (Hall 1996 [1989]: 233)

The parallels with Rutsky’s argument about the aesthetics of high tech (presented earlier) are obvious.

With developments in anthropological theory under the influence of Clifford Geertz, James Clifford and others, historiography under Hayden White, and poststructuralist theory, culture becomes the effect of representational practices. Culture is the shared set of practices of a group, community or society. Meaning here is generated out of representations: visual, aural and textual. Essays in the volume Writing Culture (Clifford and Marcus 1990), for example, look at the ways in which ethnography (thus far deemed to be objective in its field work and ‘readings’) is actually subjective (or reflexive) and utilises rhetorical/representational devices to speak of ‘native’ or ‘indigenous’ culture. These readings emphasise the significance of practices of meaning-generation even in so-called objective surveys, which share these devices with literature and the arts. Native culture or the culture under investigation is not just ‘there’ as a stable, coherent object, but is constructed as much by its processes of meaning-generation as the ethnographer’s methods of meaning-seeking and interpretive acts.

Stuart Hall therefore argues that culture is not a set of things but a set of processes or practices through which individuals and groups come to make sense of those things. Culture here is the production and exchange of meaning (Hall 1997a: 3). Hall draws a distinction between the substantive and epistemological aspects of culture. The substantive aspect is culture’s place
in the ‘actual empirical structure and organisation of cultural activities, institutions and relationships of society at any particular historical moment’. The epistemological aspect refers to culture’s position in ‘relation to matters of knowledge and conceptualization, that is how “culture” is used to transform our understanding, explanations and theoretical models of the world’ (Hall 1997c: 208–9).

In the late twentieth century culture industries mediate every process—from the material infrastructure of modern societies to the means of circulation of ideas and images. Cultural industries sustain, in Hall’s words, ‘the global circuits of economic exchange on which the worldwide movement of information, knowledge, capital, investment, the production of commodities, the trade in raw materials and the marketing of goods depend’ (ibid.: 209). Hall points to the ‘global scope … breadth of impact […] the democratic and popular character of the late twentieth century’s cultural revolutions’ (ibid.). The ICT-driven ‘space–time compression’—David Harvey’s (1989) famous formulation—has changed popular consciousness with the almost-obliteration of distances. This has had a major impact culturally—particularly with the worldwide dissemination of CNN, Hollywood, McDonald’s and others (what the social theorist George Ritzer termed ‘the McDonaldization of society’). In this cultural condition, everyday lives have been radically transformed. The decline of industrial work, the rise of services and other occupations, the increase in leisure and job flexibility, changes in family structures (notably single-parent families), decline in marriage and the rise in the number of divorces, the increase in the number of the aged, the decline of religion and the simultaneous rise of religious fundamentalism (both, ironically, enabled by the Internet and other media) constitute the late twentieth century’s ‘dislocations’ of everyday life (Hall 1997c: 214).

With poststructuralism’s debunking of the idea that language faithfully describes reality, cultural theory has undergone a major transformation. Meanings of objects do not rest in them anymore (that is, meaning is not an immanent feature of objects), but are generated through language games and classifying systems. What has always been seen as ‘natural’ is now seen as a construction of language (discourse—defined as a conceptual terrain that allows certain things to be said and represses certain others. It refers to
historically specific systems of meaning which form the identities of subjects and objects. That is, discourse is a system of social relations and practices) and representations. Thus ‘culture’ itself is the ‘sum of the different classificatory systems and discursive formations, on which language draws in order to give meaning to things’ (Hall 1997c: 222). Economic and social processes are now seen as discursive constructions. Stuart Hall’s reference to the ‘cultural turn’ of the 1970s and 1980s draws attention to the cultural studies’ yoking of Foucault’s notions of discourse and Max Weber’s work on the ‘sociology of meaning’ (ibid.: 224). Cultural studies—of which cultural studies of technology, such as this, is an important constituent—with the work of Foucault, Williams, Hall, and the Marxist thinkers Antonio Gramsci and Louis Althusser, is the ‘new’ disciplinary field which begins with this assumption of the discursive nature of science, economics, historiography and the arts.

However, as Stuart Hall is quick to point out, this does not mean that there is nothing but language or discourse—an idea generated from Jacques Derrida’s much quoted formulation: ‘There is nothing outside the text.’ Earlier notions of text treated it as something with margins and an author, organised logically from a ‘beginning’ to an ‘end’. This text was distinct from consciousness and reality. Derrida argues for a text that overruns boundaries when it refers to other texts. A text here is a network of several texts; it is not a ‘finished’ product which concludes on the last page because its references, influences, borrowings and citations extend far beyond that last page. The world is available only as a system of differences (this is from the poststructuralist view of language), and this system of differences is also the feature of texts. Derrida is suggesting that reality—economics, politics, science—all take the form of language. That is, each of these disciplines relies on representational practices to convey meaning, construct ideas and initiate policies. Every social practice, as Stuart Hall, paraphrasing Derrida, states, ‘has a discursive character’ (Hall 1997c: 226, emphasis in original). Further, as we have seen in Pacey and others, politics and economics have a cultural dimension. These are not ‘objective’ disciplines but modes of generating particular forms of meaning, just as ‘culture’ (as Williams and Hall have pointed out) is a set of meaning-generating practices.

As can be seen, meaning-generation is essentially the creation of symbolic forms—whether as physical gestures, verbal utterances
or visual art. Symbolic power, as John B. Thompson defines it, is ‘the capacity to use symbolic forms to intervene in the course of events, to influence the actions of others and indeed to create events’ (Thompson 1995: 17). Thus interpretation eventually concretises as sociocultural events. In contemporary cultural studies symbolic form usually refers to the content of human communication as mediated by print, photographic, film, audio or digital technologies of reproduction and transmission (Lull 2000: 161). In short, symbolic forms are narratives that generate particular meanings upon acts of interpretation.

Stuart Hall examines the processes through which these narratives ‘make sense’. In his early work on television and culture, Hall suggested a typology of ‘decoding’ (that is, the generation of meaning from signs and symbolic codes) practices. Decoding takes place from certain ‘positions’ (a move borrowed from Michel Foucault, who constantly emphasises the position from which ‘speaking’ or ‘listening’ is done). According to Hall, there are three positions:

• The dominant–hegemonic—where the viewer works with the dominant order of the sign/symbol’s connotation. Thus we all agree with the dominant meaning of the military (present in symbolic codes such as the Republic Day parade) as being in the ‘national interest’.
• The negotiated—where the dominant mode is accepted, but called into question in certain cases. To use the same example, while we agree that the military is in our ‘national interest’, we may question whether a nuclearised military is also in the national interest.
• The oppositional—where alternative meanings are proposed. This is where the discourse is itself opened up and possibilities of other meanings are revealed by a careful attention to the modes by which meanings such as ‘national interest’ have been generated in the first place. Thus, in the third position, the term ‘national interest’ is itself called into question, and debates suggest that the term conceals the ideology of domination by a particular class. ‘National interest’ actually serves the interest of specific groups of people, but is passed off as the interest of the masses as a whole (2000 [1980]: 59–61).
Twentieth century theories of culture treat culture and identity as contested spaces. Since culture and identity are processes of signification (that is, meaning-generation) they are prone to continual transformation and revision. Thus the term ‘popular culture’ has acquired different meanings over the ages. Its meaning of the moment depends upon the relation it bears with other categories such as ‘high culture’. What this means is that culture is now seen not as a set, discrete thing, but as something that emerges in a process of negotiation. And because culture is about struggles over meaning (that is, representation), it is ineluctably ideological. Cultural studies of art or technology must therefore account for the generation and dissemination of certain symbols and meanings and explain how specific symbols acquire values for society. What it studies is nothing but the processes through and by which ideology creates and influences meaning in certain contexts. Cultural studies begins with the assumption that the operations of ideology can be discovered in cultural practices—which, adapting from poststructuralist thought, it treats as ‘texts’.

Hall’s work on ‘articulation’ provides us with a useful tool in understanding the related themes of symbols (i.e., representations), ideology and technologies. ‘Articulation’ is a concept that was first put forward by the political theorist Ernesto Laclau in 1977. Laclau argues that concepts are not linked together by logical relations but by connotative or evocative links that custom has established between them. Articulation is this arbitrary link between concepts. Thus we conclude that all links between concepts are connotative. And any exploration of the system involves the exploration of multiple, even non-necessary links between concepts (since there is no way to construct a totality of a system with just one concept).

Moving on from this idea of connotative links to questions of class, culture and hegemony (a notion adapted from the Marxist thinker, Antonio Gramsci, that seeks to explain the pervasive influence and domination by a particular class-driven ideology), Laclau emphasises the role of discourse. He suggests that while no discourse has an essential class connotation, meanings within discourses are always connotatively linked to different class interests. Thus the discourse on nationalism is linked to a feudal project of maintaining traditional hierarchy. The class that achieves dominance is the class that is able to articulate (link) non-class contradictions into its own discourse and
absorb the contents of the discourse of dominated classes. Articulation is thus the ability to ‘articulate different visions of the world in such a way that their potential antagonism is neutralized’ (quoted in Slack 1996: 119).

Stuart Hall suggests that while no practice exists outside discourse, it cannot be reduced to discourse:

There is a specificity to those practices whose principal object is to produce ideological representations. They are different from those practices which […] produce other commodities. Those people who work in the media are producing, reproducing and transforming the field of ideological representation itself. They stand in a different relationship to ideology in general from others who are producing and reproducing the world of material commodities—which are, nevertheless, also inscribed by ideology. (1985: 103–4)

Hall thus insists on specific practices being in different kinds of relations to discourse. ‘Articulation’ enables us to think of ‘how specific practices articulated around contradictions which do not all arise in the same way … [but can] … nevertheless be thought together’ (1980: 69, emphasis in original). Thus cultural studies and theory consider the ‘range of other social forces both in their specificity and in discourse, interrogating the ways in which they are complexly articulated in structures of domination and subordination’ (Slack 1996: 123). For our purposes this means several things:

• Reading social forces such as class relations or social mobility in the age of ICT-driven development (including aspects such as redrawn notions of work or skill in the new media age, the rise of the service sector and new forms of leisure to serve the high-salaried young professional and so on).
• Reading the discourses of modernisation, technoscientific ideas of ‘development’ or emancipation (including hagiographic accounts of the Internet age such as Bill Gates’ The Road Ahead, the critique of commodity culture in Arjun Appadurai, or the discourse of indigenous knowledges in Vandana Shiva).
• Reading the interaction or linkages between the two.
The last chapter in this book, ‘Gender’, looks at the new technoculture precisely in terms of such an articulation between the discourse of modern science, the social structures of development or employment, the discourse of feminism, and the technologies that mediate the interaction between actual technical developments and discourses about/of women. The work of Donna Haraway, for instance, is a brilliant illustration of the practical ‘application’ of Hall’s idea of articulation. Evidently, any survey of cyberculture would be an exercise in cultural studies, informed by the ideas about culture, ideology, discourse and technology theorised by Stuart Hall.
I conclude this section with a brief summary of what Tony Bennett terms a ‘pragmatics for cultural studies’. Bennett suggests four elements of cultural studies—each of which finds its resonance in the chapters that follow:
S Work in cultural studies is interdisciplinary, but does not offer a critique of the disciplines. Rather, it coordinates methods from various disciplines and is concerned with the functioning of cultural practices and institutions within relations of power.
S Cultural studies is concerned with everyday life, with whole ways of life—those practices, institutions and systems of classification through which values and beliefs are inculcated in populations.
S The forms of power in relation to which culture is to be examined include relations of gender, class and race.
S It has a political agenda, even though its relationship with social movements such as the women’s movement or green movements has often been seen as a ‘problem’ (1998: 27–29).
With these readings of ‘culture’, we can now proceed to look at scientific culture.
SCIENCE AS/IS CULTURE
Science, as feminist theorists argue, must itself be treated as a social and cultural activity. The notion that science is culture has,
in Doyle McCarthy’s words, ‘open[ed] up cross-disciplinary inquiries into the cultural foundations of knowledge-seeking and rationality’ (McCarthy 1996: 94). Ruth Bleier thus defines science as ‘a socially produced body of knowledge and a cultural institution’ (Bleier 1986: 2). Dorothy Smith’s work, indisputably one of the most influential in the social studies of science, has constantly drawn attention to the ways in which people’s (in Smith’s case, women’s) lives are ruled and managed by scientific knowledges. Helen Longino (1990) argues that science is essentially ‘social knowledge’, where social values shape scientific developments.
The social movements of the 1970s and 1980s challenged the idea that science is autonomous from sociocultural life. Science, it was argued, has always been linked—practically (financially), politically (sanctioned research, patronage) and culturally (in product styling, for example)—with the institutions of politics, business and the military. The Organisation for Economic Cooperation and Development (OECD) Report (2000) states the case clearly:

[ICT is] a key technology to speeding up the innovation process and reducing cycle times, it has fostered greater networking in the economy, it makes possible faster diffusion of codified knowledge and ideas and it has played an important role in making science more efficient and linking it more closely with business. (quoted in Kuramoto and Sagasti 2002: 218)

The comment illustrates the close link of science and technology with business interests. Peter Golding lists several problems with contemporary technoscience and the new media. As can be seen, not all the problems of the new media are economic—in fact most of them have enormous sociocultural effects. Golding points out that four of the top five international telecommunication routes have the USA as one partner, and the USA is partner to 51 per cent of international telephone traffic in the top 50 routes (1997 statistics). This is mere economics. Golding, however, argues that this ‘mediatisation’ of the new technologies follows past scenarios—of commercialisation, differentiated access, exclusion of the poor, privatisation, deregulation and globalisation (Golding 2000 [1998]: 814). Each of these problems has a cultural dimension, which grounds studies of
contemporary technoscience more firmly in cultural studies. The most significant contribution to the notion of science as/is culture comes from the feminist critics of science (see Chapter 5 on Gender). As has been discussed earlier, ‘culture’ in twentieth-century thought is seen as constructed. Closely paralleling this notion is the idea that scientific discoveries, theories and practices are also constructed out of particular cultural contexts. Further, it is suggested that objects of scientific inquiry were constituted as such within a system of descriptions (representations) that already existed in a particular sociocultural context. That is, science’s objects were already embedded in specific fields of interpretation. Scientific knowledge is thus constructed out of a world already ‘known’ and experienced as ‘something’. To understand this radical rereading of science and scientific culture, we need to look at the work of Thomas Kuhn and others.
The sociology of science and the sociology of knowledge—two influential roots of constructivist readings of science—owe much to Karl Marx, Robert Merton, Karl Mannheim, Emile Durkheim, Thomas Kuhn, Bruno Latour and others. A brief survey of the constructivist position on scientific culture sets the background for the study of contemporary technoculture. In the sociology of scientific knowledge (SSK), scientific belief is socially constructed. Others such as Bruno Latour and Steve Woolgar (1999 [1986]) argue that scientific facts are also social constructions. David Bloor summarises the main features of SSK thought:
S SSK concerns itself with the conditions that bring about beliefs or states of knowledge.
S SSK looks at both sides—truth and falsity.
S It assumes that the same types of causes explain both true and false beliefs. Or, the same cause decides what is acceptable as true or false belief. Thus we need to look at the social conditions that accepted or rejected Galileo’s view of the universe as true or false, and not whether his view was actually true or false (quoted in Kukla 2000: 9).
Thus, SSK suggests that all beliefs—rational and irrational, true and false, scientific and non-scientific—can be explained in social terms. However, SSK also admits that even if all beliefs have social
causes, there are additional causal factors which are different for rational and irrational beliefs. Thomas Kuhn (1970) marks an important step in interrogating the established conception of the culture of science. Kuhn suggests that the scientific tradition legitimised methods of enquiry and values. The scientific ‘paradigm’ (a term that entered popular social theory vocabulary with Kuhn) provides the framework for studying society. It is only within this paradigm that enquiries can be initiated, their objects identified, problems defined, and solutions/interpretations offered. There can be no ‘outside’ position from which to evaluate the enquiry. Truth is thus restricted to what the paradigm defines as truth. Kuhn also argues that there can be no clear distinction between observation and theory. Further, he suggests that scientific concepts are not particularly precise. Science itself is not unified, but consists of loosely overlapping disciplines. In fact, Kuhn uses the word ‘science’ in terms of small groups of research workers who work at one line of inquiry. This is the ‘disciplinary matrix’ (as Kuhn terms it), composed of interacting groups with common goals, basic assumptions, approaches and standards. These are passed on to students through notes, conversations and textbooks, and help in deciding what research should be supported, what problems are important and what solutions are possible. In terms of institutions, it also decides who referees papers, who publishes, and who advances in the career. This is Kuhn’s paradigm—which can be defined as a set of shared values. Based on Bourdieu’s work (discussed earlier), we can say that these shared values are actually social constructions that decide, offer and bestow symbolic capital to scientists. Paul Feyerabend in his Against Method (1978) presents a similar critique of so-called scientific objectivity. Feyerabend suggests that science knows no ‘bare’ facts; the ‘facts’ that enter our knowledge are already viewed in a certain way. Scientific education enables such a viewing. In scientific education a field of research is first delineated. This field, say physics, is separated from other areas such as biology or metaphysics. It is given a logic of its own. The student/researcher is then ‘trained’ in this ‘logic’. When all researchers have received this training in the field’s logic (by this Feyerabend means assumptions, methods and standards), there will be a certain uniformity in their actions. Feyerabend argues that part of the training is to ensure that there is no blurring of boundaries. Thus a person’s religion must not have any connection or
impact upon her/his scientific activity. Even the language the researcher uses must be of a certain kind. ‘Facts’ are thus experienced as being independent of belief and cultural background. A ‘scientific community’, language, set of values—in other words, a paradigm—and a discourse are formed. Scientific knowledge-seeking, as Doyle McCarthy puts it, presumes a cultural habitat in which science unfolds itself (McCarthy 1996: 95). Earlier, scientific discoveries were seen as the inspired work of genius rather than the outcome of method and instrumentation. In addition, science and laboratory work was seen as autonomous and disinterested. Following Bourdieu, we can argue that today, matters of distinction and prestige (Bourdieu’s ‘symbolic capital’) attend the production of scientific knowledge. Science and scientific knowledge are ‘interested’, with an active engagement— social, cultural and economic—with the world. Thus, as Timothy Lenoir argues, the cognitive and the social are mutually implicated in the production of scientific knowledge (Lenoir 1997: 7–8). The construction of natural knowledge, for Lenoir, is ‘simultaneously an attempt to define society and to legitimate one’s own version of social reality or that of the group to which one belongs’ (ibid.: 8). We need to look at the construction of instruments, the manipulation of experimental apparatus in the laboratory, and the relationship of these ‘practical’ activities to their theoretical representations. Thus Thomas Kuhn, describing the state of scientific knowledge and the significance of sociocultural practices of representations, declares: ‘We have no direct access to what it is we know, no rules or generalizations with which to express this knowledge’ (Kuhn 1970: 196). What emerges from Kuhn’s and Bourdieu’s arguments is that for each disciplinary field—philosophy, art, science, religion, law— there corresponds a point of view of the world that creates its own object and provides (or facilitates) the means of understanding that object. As Bourdieu puts it: The structures of thought of the philosopher, the artist or the scientist, and therefore the limits of what presents itself to them as thinkable or unthinkable, are always partly dependent upon the structures of their field, and therefore on the history of the positions constituting the field and the dispositions they favour. (2000: 99)
The field/discipline of science, as Lenoir’s argument demonstrates, is at least partly created by the institutions of knowledge. The laboratory, the university, the research institute, the peer-reviewed journal and the annual conference (the Macy Conferences on Cybernetics, 1943–54, are an example: see Chapter 4 for details on the Macy Conferences) are all modes of institutionalisation. In Bourdieu’s words: ‘Every field is the institutionalization of a point of view in things and in habitus’ (2000: 99). This institutionalisation develops its own rhetoric and modes of representation.
Science is a field that relies a great deal upon representational practices. Stereotypes of the mad scientist, the lonely scientist, the experiment gone wrong, are all modes of representation—and representation is a cultural condition/feature, not a technological one. Lawrence Prelli (1989), for instance, has meticulously documented the very special kind of rhetoric that science employs.
Bruno Latour and Steve Woolgar argue that ‘facts’ are socially constructed. More importantly, the process of construction is such that all traces of their production are made extremely difficult to detect. The logic of this argument is admittedly a little difficult to accept at first, and requires some elaboration. In a laboratory there are certain statements (or concepts or theories, to use common terms). Laboratory members are unable to decide whether these statements are true or false. At some stage the statement begins to stabilise (that is, some of the interpretations are discarded). The statement is ‘split’ (Latour and Woolgar’s term): it becomes, on one side, a set of words that makes a statement about an object and, on the other, the object itself, which takes on a life of its own. So, at the point of stabilisation there appears to be both the object and statements about the object. Slowly, more reality is attributed to the object than to statements about the object. Finally, the object becomes the reason why the statements were formulated in the first place. Another interesting phenomenon takes place. It is made out that the object has been there all along, waiting to be discovered. The history of the construction of this object is now transformed into the pursuit, by ‘great’ scientists, of a single path that led to the ‘actual’ structure. The object ‘out there’—which must be found and understood—is not the cause of scientific work but the consequence of it (Latour and Woolgar 1999 [1986]: 251–59).
We can add that while science deploys specific rhetorical strategies to further its ends (of presenting itself as ‘reason’, and concepts such as ‘discovery’), the rhetoric, in turn, facilitates subsequent thinking/experiments in science. Roger Cooter’s (1984) study of the rise of phrenology as a ‘science’ in nineteenth-century England is an excellent demonstration of the argument that science is a cultural field with its own representational practices. Cooter points out that consent to phrenology’s scientific status was ‘organised’ through certain rhetorical/representational modes employed by both ‘scientists’ and the ‘commoner’. Phrenology becomes a proper discipline with the continued accretion of prestige, scientific ‘evidence’ and social values to it. Much of this, Cooter shows, was the cultural construction of a science. This leads Steve Woolgar, a constructivist philosopher of science, to state:

There is no sense in which we can claim that the phenomenon […] has an existence independent of its means of expression […] There is no object beyond discourse […] the organization of discourse is the object. Facts and objects in the world are inescapably textual constructions. (1988: 73)

Woolgar underlines the representational (discursive) feature of so-called scientific facts. And since representation is essentially about rhetoric and language—which are cultural ‘phenomena’—it follows that science is cultural too.
Science exalts reason over every other human attribute. Bourdieu suggests that ‘reason’ itself is a construct that owes its valence to the very field of science. For Bourdieu, the scientific field is, like all other worlds, a social world. But the field is also different from other fields in that the necessity of reason is ‘instituted to varying degrees in the reality of structures and dispositions’. Bourdieu argues that the scientific field has its own modes and modalities of competition. This presupposes and produces a specific form of interest, but one which is oriented towards ‘winning the monopoly of scientific authority, in which technical competence and symbolic power are inextricably combined’. Bourdieu makes it very clear:

The pursuit of the accumulation of knowledge is inseparably the quest for recognition and the desire to make a name for oneself; technical competence and scientific knowledge
function simultaneously as instruments of accumulation of symbolic capital […] the polemics of reason are the contests of scientific rivalry …. (2000: 110)

Bourdieu’s work suggests the extraordinarily cultural and non-technical nature of science and technology. It is precisely this aspect of technology that invites a description of present conditions as ‘technoculture’. Two conclusions can now be drawn. One, that all observation (in the laboratory) is shaped by reference to a theory. Theories are always underdetermined by data, since several theories are compatible with the same set of data. Thus the choice between theories does not rest on empirical support, but on conceptual issues. Two, statements derived from theory never confront nature alone; they are placed in interaction with several interrelated beliefs. Thus theories and the entire technical culture that they support can be accepted or rejected as wholes (Lenoir 1997: 22).
In the nineteenth century, with cultural capital and authority being associated with ‘pure disciplines’ such as philosophy or literature, figures like Werner Siemens and Carl Ludwig tried to redefine the cultural field to include the natural sciences, and represent scientific practices as ‘pure’. They tried to move these disciplines away from ‘applied’ sciences such as medicine to relocate them in philosophy. We know from Bourdieu’s arguments that cultural capital is accorded to certain forms of knowledge. Scientific knowledge or technology is not exempt from this process. Thus, the increased value accorded to software engineering or genetic medicine in the late twentieth century (as opposed to, say, mechanical engineering in the nineteenth), and the legitimacy being increasingly accorded to science fiction (SF) as an acceptable literary genre (especially the work of ‘prophetic’ writers like Arthur C. Clarke), are examples of the reordering of value and cultural capital within science. What we understand from this discussion is that values are added to and taken away from certain kinds of science and technologies. This value addition or subtraction is not purely a scientific matter, but is embedded in a cultural field. Subsequent developments in the technological field account for and internalise these values. Institutional support—in the form of research grants, new disciplines of study, award of degrees, student intake and creation of employment opportunities—is crucial to this ‘culturation’ of science. A simple example would
suffice to illustrate this argument about science as a cultural field. After the success of the Manhattan Project, and during the Cold War, physics was the most highly valued academic discipline in American universities. Physicists were closer to economic and political power— since the Pentagon and the White House used academics as consultants and advisers on the war or armament programme—and even became intellectual leaders. Physics departments got the highest funding. Physics-based instrumentation such as computers and magnetic resonance imaging crept into other fields like medicine and biology. The influence of Michel Foucault (especially his work on the medical gaze, madness and sexuality, and his massive study of epistemology, The Order of Things) has been particularly strong in cultural studies of science. Science is treated as a knowledge system and technology as its instrument that concentrates power within a culture. Science itself is a product of human material practices. For thinkers like Bruno Latour and Donna Haraway—indisputably two of the most important thinkers on technoscience—the culture of science reinforces existing hierarchies. Technoculture, in the sense that this book treats it, is less about science as knowledge than science as material practice. Everyday life—as I stated at the opening of this introduction—is technologised. And a knowledge system that facilitates everyday life (through mundane structures such as building materials or computers) can become a mode of cultural control (that is, power). Scientists and engineers, as Chandra Mukerji points out, help produce the material world in which we live, and gain power because of this (Mukerji 1994: 146). This view firmly locates science studies in cultural studies, since cultural studies is mainly interested in questions of power and the structures that enable or inhibit its distribution.
THEORIES OF THE INFORMATION SOCIETY
This part of the chapter comprises a historical survey of theories of the late twentieth century information society. While several
theories of the information society exist, this survey orients itself along specific axes. It is organised into two sections: theories of the information society and key concepts. Various social thinkers—from Daniel Bell through Anthony Giddens to Jurgen Habermas and Manuel Castells—have meditated upon and provided major critiques of the information society. In addition, social historians of science, ranging in ideological positions from Marxism to feminism, have provided equally penetrating analyses of the ‘culture of technology’. Mapped here are the various debates for and against the present technologised society.
The latter half of the twentieth century has been characterised by an increasing ‘informatisation’ of social life. A new mode of information—in terms of gathering, analysis and dissemination—predominates late twentieth century life. The developed nations, in particular, have become information societies or ‘wired’ societies. Increasing quantitative flows of information, faster movement of this information, the global movement of finances, and the dissemination of various technologies in conjunction with the centrifugal movement of business corporations characterise the global informational economy.
DEFINITIONS
The term information society can be defined, as Frank Webster (1995) outlines it, in five ways:
S Technological: This view suggests that developments in information processing, storage and transmission have led to the large-scale and ever-increasing application of information technologies (ITs) in everyday life. It sees the convergence of telecommunications and computing, and an increasing linkage between banks, homes, offices, factories, shops and educational institutions. This is the ‘networked society’, where the various sites are linked by the information grid (just as they are by electrical or water supply lines). As John Naisbitt declares in Megatrends: ‘computer technology is to the information age what mechanisation was to the industrial revolution’ (1984: 28).
S Economic: Thinkers such as Fritz Machlup (1962) and Marc Porat (1977) describe the new information-based society entirely in terms of an ‘information economy’, treating knowledge production and distribution in terms of their contribution to the economy (quoted in Webster 1995: 11–13, 28). Thus Porat speaks of the ‘primary information sector’ (mass media, education and advertising), where economic values can be placed on information easily, and the ‘secondary information sector’, which includes research information and government studies (ibid.: 11–15).
S Occupational: This is a fairly straightforward definition of the information society. It pays attention to the change in the nature of occupations, with larger numbers of people being involved in information work or, as Webster puts it, with ‘clerks, teachers, lawyers and entertainers outnumber[ing] coalminers, steelworkers, dockers and builders’ (ibid.: 13). This shift in occupations from industrial labour to deskbound information dealers is what Daniel Bell famously described as the rise of the ‘white collar society’.
S Spatial: The emphasis here is on the ‘networks’ that mark the information society. John Goddard (ibid.: 18) and Manuel Castells, for instance, speak of ‘flows’ of information that link towns, regions and nations. Transborder movements of information—from research data to speculative capital—along electronic information highways radically alter notions of territory. Further, aligned with the information flow is the dispersion of business corporations, where the financial centre and the production unit can be in different continents but wired together (the multinational/transnational corporation).
S Cultural: This emphasises the extraordinary predominance of information in everyday life: movies, radio, television, and now the Internet. We are constantly exchanging messages about ourselves—what is called signification. In fact, the process of information exchange is an end in itself. Signs and images do not refer to anything or any ‘reality’ outside themselves: they are ‘self-referential’. Therefore, in the media-driven culture of today, we get a surfeit of signs.
This is what the philosopher Jean Baudrillard terms ‘hyperreality’—a world of signs but not much meaning. Role-playing and symbolic gestures—both modes of information—dominate our lives in the information society.
THEORIES
It would be useful at this point to take a quick look at those thinkers who began writing about the arrival of a post-industrial society—now seen as a forerunner of the global network society driven by ICTs. More attention will be paid to later thinkers who have directly theorised about the new technologies and society of the late twentieth century.
Daniel Bell and Post-Industrialism
Daniel Bell’s The Coming of Post-Industrial Society (1973) argues that the post-industrial society emerged from changes in the social sector. He sees the economy and occupational structure as influential here, but does not see politics or culture as significant factors in the change. Bell suggests that the following changes occur in the shift from industrial to post-industrial societies:
S Reduction in industrial labour.
S Increase in industrial output despite reduction in labour due to increased mechanisation.
S Increase in wealth due to higher productivity, inducing newer needs of leisure and other services (entertainment, holidays).
S New job opportunities in the service industry.
With the shift from factory-based industry to service industry, the battle (or the ‘game’, as Bell terms it) is between people (as opposed to that between people and nature in the pre-industrial, and between people and fabricated nature in the industrial). What is involved in this game between people is information. Service work—whether banking, travel or therapy—is based on information about the clientele, its needs, paying capacity and demands. In such a society professionals in research, health and education become
the ‘new intelligentsia’ (Bell’s term) because the information needs of the post-industrial society are considerably increased.
Herbert Schiller and Advanced Capitalism
Herbert Schiller’s (1976, 1984, 1989) Marxist approach pays attention to the structural features of media and information. These are basically economic in nature, such as ownership, revenue, audience spending and so on. Schiller looks at the entire socioeconomic system (in this case, capitalism) that leads to specific developments in the information society, and at the same time also pays attention to the general trends and sequence of developments. Schiller argues that the commodification of information is the most significant feature of the information society. Thus research, innovations and technologies are all influenced by market criteria in their quest for information. Further, Schiller argues that inequality characterises the generation and distribution of information (what Pippa Norris [2001], with specific reference to the digitisation of the world, terms the ‘digital divide’). Thus location in the social hierarchy decides the amount and quality of information one gets. Schiller adapts Raymond Williams’ term ‘corporate capitalism’ to argue that since contemporary capitalism is dominated by corporations, their priorities influence the information society. And, Schiller points out, their priority is information for private gain rather than public benefit. The ability to pay is an important factor in the generation of and access to information.
Jurgen Habermas and the Public Sphere
Habermas is one of the most influential thinkers of the twentieth century, and his work on the ‘public sphere’ has influenced several critiques of the information society. Habermas begins his The Structural Transformation of the Public Sphere (English translation in 1989) by arguing that the public sphere is more or less autonomous, even if funded by the state. It is in this space that rational debates—i.e., debates which are not ‘interested’ or ‘manipulated’—can occur and public opinion is formed. Thus, full reportage, increased accessibility, open debate and independence of actors from economic concerns characterise an ideal Habermasian public sphere.
Information is therefore at the heart of the public sphere. Pure, reliable and adequate information enables sound discussion, while tainted or manipulated information effects prejudiced or flawed discussions and decisions. Habermas points out that libraries, museums, art galleries and radio broadcasting were originally intended to provide neutral and pure information to the widest possible population. This kind of public service of information is what makes a true public sphere. If public institutions, such as radio or libraries, are converted into mere profit-minded organisations, then the quality of information they provide will suffer significantly. Today, with the commodification of, and corporate–capitalist control over, information, the quality of information has eroded. What we now have is ‘interested’ information—sponsorship, advertising and an increase in information management by politicians and corporations. That is, information is now increasingly propaganda. The effect is a shift in the quality of public debates and decisions.
Habermas believes that the public sphere is being increasingly contaminated and, with its takeover by corporate houses and capitalist communications organisations, is rapidly shrinking. Information in the public sphere is now market-oriented, seeking to sell products. As a result, sensationalism and trivia characterise much of contemporary information-content (see section on ‘Public Sphere’ in Chapter 3).
Post-Fordism
The Fordist age was characterised by mass production—symbolised by the assembly line—and consumption, the dominance of industrial workers in the employment sector, the nation–state as locus of economic activity (and within the nation–state, oligopolies of companies), and extensive planning (from technological ‘events’ to the welfare state). These areas were considerably affected by globalisation in the 1970s, which modified the scope and role of each of these factors. The rise of the transnational corporation (TNC), which was also instrumental in this change, affected everything—from markets and production (including labour and raw material) to finance and communication. Post-Fordism, as this stage is described, is thus marked by the decline of mass production. It is characterised by flexibility as opposed to the more regimented nature of Fordism.
S Flexibility of employees: A worker today is invariably multi-skilled. Change is inherent to the production process and the worker is subject to ‘lifetime training’. In addition, there is wage flexibility (payment for what they do rather than a national rate), labour flexibility (employment on a contract basis, so that people can expect to change jobs more frequently) and time flexibility (part-time employment, shifts and weekend work).
S Flexibility of production: ‘Just-in-time’ production, where manufacturing starts only after orders are taken, is central to several industries now. Subcontracting and outsourcing mean that the corporation can switch suppliers and products.
S Flexibility of consumption: Electronic technologies offer more variety than before. Uniformity—a feature of Fordist production—is passé, as people look for exclusivity that expresses their personalities better (this is similar to what Anthony Giddens in his work on the Third Way sees as ‘life politics’, where the quality of life becomes a central concern).
There is a direct link between certain features of post-Fordism and the rise of the information society:
S A new international division of labour facilitated by telecommunications networks.
S Global finance and cognate information services—from R&D to client services—depend on IT.
S Increased monitoring of products for more cost-effective technology and control.
S Increase in competitiveness, necessitating not just increased automation and computerisation but also better organisation—all of which use IT (Webster 1995: 147–53).
Manuel Castells and the Informational City
In a series of works on the informational society Castells identifies an ‘informational mode’ of development. This is a new sociotechnical paradigm, with information processing as the main activity.
Information processing influences all the processes of production, distribution, consumption and management. A restructuring of capitalism occurred after the 1970s. The period from 1945 to the 1970s was characterised by state-regulated, welfarist and corporate-oriented capitalism. With the recession of the 1970s, capitalism entered a ‘bad’ phase. The same period saw the rise of the informational mode of development, and capitalism was quick to utilise the new technologies for its own purposes. Thus capitalism and information technology became fused. As Castells (1989: 29) states explicitly, the ‘restructuring [of capitalism] could never have been accomplished without the unleashing of the technological and organisational potential of informationalism’. Castells argues that in the informational order the ‘flows’ of information are of paramount import. The development of ICT networks promotes the importance of information flows for economic and social organisation. It also reduces the significance of territory and geography. Thus the management of these information flows is the major feature of contemporary society and corporate business organisations. One effect of this restructuring has been that corporate bodies now have worldwide production–distribution strategies (for a detailed discussion of this theme and Manuel Castells, see Chapter 3).
Jean Baudrillard and Postmodern Simulacra
Baudrillard’s work in Simulations (1983), America (1988), Seduction (1990a), Cool Memories (1990b) and other books, which has been enormously influential in media studies of the new age, has constantly emphasised the image, the development of ‘virtual’ models of reality and the commodity–consumerist culture of contemporary life. Baudrillard’s central argument is that in the age of perfect reproduction and endless repetition of images, the distinction between the real and the illusory, between the original and the copy, between superficiality and depth has broken down. What we now have is a culture of ‘hyperreality’. Baudrillard suggests that the sign is not an index of some underlying reality—i.e., it does not refer to any external reality (or referent)—but merely refers to other signs. The entire system, constituted by such signs that are ultimately empty (because they only refer to other similar signs rather than
‘truth’), is thus a simulacrum of the real. Baudrillard terms this process ‘simulation’. Simulation is when the image/ model becomes more real than real. To use Baudrillard’s words, in simulation there is a ‘generation by models of a real without origin or reality: a “hyperreal”’. The hyperreal is basically a screen surface—no more, no less. Baudrillard is one of the first philosophers to theorise the ‘codification’ of late twentieth century life. Baudrillard argues that contemporary life is governed by the code: the DNA code in biology, the digital code of sound reproduction, the binary code of computer technology. The code ensures that the object reproduced is not simply a copy. Since the code enables an exact replication, the distinction between the original and the copy breaks down. Virtual reality, global communications, the infinite reproductions of databanks and holograms are examples of the redundancy of the distinction between real and imagined, between ‘true’ and copy. Simulation and models are thus pure reproductions. The origin of things is not an original thing but a code or formula. The last ‘original’ may now be reproduced exactly through the master code of the original—in what may be termed ‘reversibility’. That is, the ‘last original’ actually does not exist: it may be infinitely reproduced. This is the hyperreal. In the hyperreal the image is the reality. Our experiences (of reality) are simulated through technology. Examples of this simulated reality abound: television, electronic shopping, holographic images, and now virtual reality systems. ‘Virtual reality’—which sounds like an oxymoron—offers a rather interesting paradox. Philosophically, the virtual is that which exists potentially rather than actually, and refers to the ‘field of forces and problems that is resolved through actualization’ (Lévy 2001: 29). However, as it is used today, ‘virtual’ suggests ‘unreality’, since reality presupposes some tangible material embodiment. In this book, the terms ‘virtual’ and ‘virtual reality’ are used in the philosophical sense, opposed not to the real but to the actual, where virtuality and actuality are two sides of reality. Baudrillard argues that objects in a consumer society are not simply consumed. They are produced to signify a status rather than satisfy a need. Hence his argument that in a consumer society even objects become signs (of social class, wealth, influence, education, family). Culture, as we have argued earlier, is essentially a matter of interpreting symbolic forms and signs. These signs are
narratives. Baudrillard argues that what we consume are not objects but signs: the signs of advertisements, of television. The objects have value for us as signs. Therefore we consume images, and the exchange value is transformed into sign-value. In such a context, the sign bears no relation to reality at all. Everything is merely a model, a simulation of the real. Reality itself no longer exists outside these representations. In his famous essay, ‘The Masses: The Implosion of the Social in the Media’, Baudrillard writes: ‘Each individual is forced into the undivided coherency of statistics […] a positive absorption into the transparency of computers, which is something worse than alienation’ (2000 [1985]: 101). He suggests that we are haunted by a new uncertainty, an uncertainty produced not by a lack of information or its excess, but by information itself. As an illustration, Baudrillard writes about opinion polls: ‘We will never know if an advertisement or opinion poll has had a real influence on individual or collective wills, but we will never know either what would have happened if there had been no opinion poll or advertisement’ (ibid.: 100–101).
Baudrillard suggests that there is no Other anymore—the scene of the Other has disappeared. Further, he claims that society is self-absorbed and ‘auto-intoxicated’ (ibid.: 101). It must know what it wants, what it thinks, and see itself continually on the videoscreen of statistics. Postmodern society thrives on images of itself (opinion polls, for example). Therefore, obsessed with itself, society can no longer enact itself: it is confused with its own image. It no longer occupies a public or political space, and when ‘it loses its scene [it] becomes obscene’ (ibid.). The postmodern is characterised by the hyperreal, by the collapse of the distinction between the private and the public (home, office, and now home–office, online shopping, teleconferences, simulated participation in events, etc.). This is the implosion of meaning in the media. As Baudrillard says: ‘Only signs without referents, empty, senseless, absurd and elliptical signs absorb us.’
The postmodern, according to Paul Virilio, Katherine Hayles and others, is characterised by massive technological and scientific changes. Virilio declares:

These […] technologies (based on the digital signal, the video signal and the radio signal) will soon overturn […] the
nature of human environment and its animal body, since the development of territorial space by means of heavy metal machinery is giving way to an almost immaterial control of the environment […] that is connected to the terminal body of men and women. (1993: 4) Uncertainty and instability, ephemerality and dispersion, the increasing pervasiveness of superficial ‘icons’ (logos, advertisements)—all features of postmodern thought—also characterise much of contemporary technoculture. Humanism has itself been radically interrogated, and along with the fragmentation of modes of knowledge and ‘reading’, the fragmented, technologised cyborg becomes the posthuman.
Arjun Appadurai and Globalised Culture
Appadurai’s work (2000 [1990]) argues that the central problem of contemporary globalising processes is the ‘tension between cultural homogenization and cultural heterogenization’ (p. 94). Countering arguments about the large-scale Americanisation of the world, Appadurai offers, as an alternative, the numerous ‘indigenisation’ moves occurring elsewhere. Thus people in Java fight against Indonesianisation and Sri Lanka resists Indianisation. Appadurai argues that as forces from metropolises enter new societies, they are indigenised in different ways. Globalisation involves the use of a variety of instruments of homogenisation—armaments, advertising, language hegemonies and clothing styles—that are absorbed into local politics and cultural economies and then ‘repatriated’ as ‘heterogeneous dialogues of national sovereignty’ (ibid.: 99). Note, for instance, in the McDonaldisation of the world, how McDonald’s itself changes its attitudes and indigenises itself. The company’s reluctance to use beef and beef products in its food items in India—a reluctance reinforced by speculation, a few years ago, that it was indeed using beef products—stems from the Hindu reverence for the cow, and is an example of how an international corporation indigenises itself culturally.
Appadurai locates five factors that shape contemporary global cultural processes. These factors or ‘scapes’, as he terms them in his essay, ‘Disjuncture and Difference in the Global Cultural
Economy’, ensure that cultural homogeneity is simply not possible. The five scapes are as follows:
S Ethnoscapes: This describes the flow of people from one section of the world to another. These populations may be tourists, exiles, immigrants, refugees or guest workers. Appadurai suggests that we look not only at the realities of these movements but also at the fantasies of wanting to move. When international capital shifts and governments alter their policies on refugees, even the imagination of these groups changes.
S Technoscapes: This describes the export–import flows and alliances that shape scientific and industrial developments across countries and cultures (see Terry Shinn’s concept of ‘transversality’ in Chapter 3). The distribution of technologies is driven, today, by the relationships between money flows, political possibilities and the availability of labour.
S Financescapes: This describes global money transfer, including international aid and foreign investments.
S Mediascapes: The electronic mass media and the images they produce. These images are frequently, Appadurai argues, images of the Other.
S Ideoscapes: This is the political or ideological aspect of culture. It may be about democracy and voting rights, or about struggles for power and oppressive regimes.
These scapes influence global cultural conditions by their ‘disjunctures’ or ‘breaks’. Mass communication is instrumental in expanding cultural diversity. Globalisation is a complex set of interactions, with frequently oppositional flows of people, materials and symbols. These flows perpetuate heterogeneous cultural positionings (Appadurai 2000 [1990]: 92–102). (For a discussion of the globalisation of technology and the technology of globalisation, see Chapter 3.)
Jean-François Lyotard and the Postmodern Condition
Lyotard’s The Postmodern Condition (English translation, 1984) and several subsequent works define, explicate and set the agenda for
discussions of the postmodern condition. While Lyotard does not directly address questions of tele-technologies, his conception of the postmodern world is relevant to an understanding of the political and cultural implications of the ICT revolution. The postmodern condition, for Lyotard, is characterised primarily by a rejection of any ‘metanarratives’ (such as the emancipatory role of science, the psychoanalytic one of the repressive nature of all human civilisation, or of the human condition as a class struggle) in favour of ‘micronarratives’. This is essentially an anti-foundationalist stance, where any totalising explanations of reality are immediately suspect. Totalising narratives ignore differences and singularities at local, individual and regional levels in favour of an overarching programme. Such knowledges and explanations, Lyotard argues, are inherently tyrannical because they impose a false unity on knowledge and reality. Grand narratives prevent the full development of individuals and cultures. In an argument that comes close to contemporary concepts of science and culture, Lyotard suggests that science and scientific discourses are not about disseminating ‘truth’. Rather, they are about augmenting power. Scientific discourses legitimate certain knowledges and suppress oppositional knowledges in order to retain power. Further, ‘proof ’—that irreducible element of all scientific discourse—belongs to the wealthy. Games of scientific language become the games of the rich. The abuse of power by institutions (the scientific establishment, the state, the educational system, or religion) is born out of an attempt to legitimise their version of truth as universal. Thus multiple or oppositional readings of the scriptures are disallowed. In a comment that comes closest to contemporary technology’s consumerisation of information, Lyotard suggests that the ‘saleability’ of truth is all that matters. The question, ‘What use is it?’ replaces the query ‘Is it true?’ According to Lyotard, knowledge is the most important stake in the world’s quest for power. He predicts that the world will eventually fight for information just as it once fought for territory. The group that controls the metanarrative (or, in the case of cyberculture, the technology that generates the narratives; see the section on Stuart Hall in this chapter and Chapter 3 on ‘Politics’) controls political power. The postmodern condition is about the
‘undecidables’, the limits of control and conflicts characterised by incomplete information. It is about ‘fracta’ (fractals) and catastrophes. It acknowledges that all knowledge is partial and fragmentary. One can only utilise local knowledge to know a part of the truth. The individual’s voice must be retrieved as a resistance and ‘delegitimation’ of the grand narratives that disguise one version of knowledge as universal.
Mark Poster has adapted Lyotard’s work for cybercultural studies. Poster’s The Mode of Information (1990) also speaks, like Lyotard, about the commodification of information. Poster argues that a ‘mode of information’ is slowly supplanting sociocultural formations. He suggests that the institutional ‘routines’ that mark modernity have been disrupted by the arrival and widespread dissemination of the electronic media. TV-based media, surveillance technologies, and other ICT-based media resist traditional social theory. Poster regards electronic surveillance and databases as a ‘superpanopticon’ that strengthens the power of knowledge but also reconstitutes the ‘selves’ that are the object of its surveillance. Following Lyotard, Poster suggests that databases and knowledge banks be thrown open to the public.
Paul Virilio and Hypermodernism
Paul Virilio’s work on dromology (the ‘science of speed’) is, after Baudrillard’s, perhaps the most exciting reading of contemporary technoculture. Virilio’s work is far too extensive, and straddles too many disciplines—from architecture to political economy—to be summarised here. I shall therefore sketch those aspects of Virilio’s thought that seem most useful to understanding late twentieth-century cyberculture. Virilio argues that histories of sociopolitical institutions such as the military, or even of cultural movements, demonstrate that their driving force has been the need for speed rather than commerce and the urge for wealth. Virilio argues that the relationship of politics with speed is as important as its relationship with the accumulation of wealth. In other words, the political economy of speed is as important as the political economy of wealth. Virilio argues that higher speeds belong to the upper reaches of society and the slower ones to the bottom. The ‘wealth pyramid’ is a replica of the ‘velocity pyramid’ (2000: 35).
Virilio’s interest lies in the instruments that accelerate and intensify speeds and that augment the power of those people who control these instruments. It is surely not a coincidence that the Microsoft mogul Bill Gates’ first book is The Road Ahead and the second Business @ the Speed of Thought. The titles suggest not simply the spatial sense of movement, journeys and ever-receding destinations—symbolic of indefinite ambition, scope and potential—but also the sense of acceleration, of perpetually increasing acceleration. Like Virilio, Gates suggests a world increasingly governed by speed—of the transmission of information, money transfer, arming of nations, ‘first strike’, and transportation.
Virilio argues that in the age of ‘cyber-war’, unspecified enemies (‘terrorists’) are invoked by the state in order to justify spending on military weapons. For Virilio, the concept of ‘pure’ war is actually the weaponry of new information and communications technology because it is the weapons of the military–industrial complex that cause ‘integral accidents’ such as the 1987 world stock market crash—brought on by the failure of automated programme trading. Virilio sums up his argument about contemporary technoscience: ‘Science itself has become pure war, and it no longer needs enemies. It has invented its own goal’ (Virilio and Lotringer 1997: 184).
Virilio introduces the concept of ‘picnolepsy’ (frequent interruption—from ‘picnolepsia’, the momentary lapse of consciousness; picnolepsia facilitates cinema, since it enables the production of continuities where none exist) to suggest that the hypermodern vision and the hypermodern city are both products of military power and time-based cinematic technologies of disappearance. Though there are political and cinematic aspects to the human visual consciousness of the city, it is important that we note the disappearance of the grand aesthetic with the arrival of micronarratives. Interrupted viewing, the controlled disappearance of whatever we are watching, the excessive screens in our lives leave ‘imprints’ of whatever we have just witnessed. Like the ‘flash’ on Internet Websites, discontinuous and interrupted images rule our visual life. We get, literally, only a ‘bit’ at a time, but we compose a whole out of the ‘disappeared’ bits. This is what Virilio terms the ‘crisis of whole dimensions’ (Virilio 1991: 9–28, 59–68), where we only have interruptions, ‘jumps’ and ‘technological space-time’.
The screen is the new ‘city square’ and the ‘crossroads of all mass media’ (ibid.: 25–27). The logic of speed plays an important role in the militarisation of urban space, the organisation of territory and the transformation of social, political and cultural life. Dromology looks at the politics of transportation and information transmission, both of which rely on faster and more efficient movement. History itself, for Virilio, progresses ‘at the speed of its weapons systems’ (quoted in Armitage 2000: 6). Space and time are overwhelmed by technologies that instantaneously circulate images across space. In fact, Virilio argues that modernity marks the transition from the age of the brake to that of the accelerator. More significant to our reading of cybercultural technologies is Virilio’s work on the ‘aesthetics of disappearance’. In contemporary times, people’s faith in perception is subordinated to ‘the technical sightline’, and cultural and social substitution has reduced the visual field to ‘the line of a sighting device’ (Virilio 1994: 12–13). With the development of Virtual Reality (VR) and the Internet we ‘register a waning of reality’ (ibid.: 49). These technologies split the modes of perception and induce an aesthetics of disappearance. In ancient societies there was the aesthetic of ‘appearance’. There was an enduring material support for the image and, except for music, ‘aesthetics- related phenomena were phenomena of appearance, of emergence’ (Virilio 2000: 41). With photography and cinematography, the persistence of the image is no longer material but cognitive—it is in the eye of the beholder. Images owe their existence to the fact that they disappear, as on the screen. They appear precisely because they disappear later. At 60 frames a second—with special effects—we have an aesthetics of disappearance (ibid.). However, Virilio is emphatic that this is in no way a parallel of Baudrillard’s notion of ‘simulation’. He suggests that rather than simulation, contemporary technology creates a ‘substitution’, replacing one reality by another. As Virilio puts it: We have a ‘class I reality’, and then there is a simulation of that reality through a new technology, such as photography, or some other thing, or VR […] and then you have a fresh substitution, a second reality. Hence simulation is a mere intermediary phase, without import. What is important is
substitution; how a ‘class I reality’ is substituted by a ‘class II reality’, and so on …. (2000: 43)

In terms of the military and police uses of these technologies, Virilio suggests in The Vision Machine (1994) that we now have ‘a vision without a gaze’. Right from his very early work, War and Cinema (1989 [1984]), Virilio’s argument has been that war depends on the logistics of representation, especially in providing detailed and accurate images of the enemy’s movements, arms and location. Pictorial representations of the enemy have been integral to war throughout history. Contemporary technologies in the form of computer simulation, radar, informatics and satellite imaging are advancements of such ‘cinematic’ technologies of war. However, a central difference is that there is, today, a progressive ‘dematerialisation’ of war, with technologies replacing human beings (see section on ‘Cyborg Soldiers’ in Chapter 4). As illustration, Virilio points to the slow erosion of the military’s reliance on human perception and representation. Computer simulations replace charts, maps and field exercises. In a particularly fine passage Virilio provides a description of a ‘virtual’ or cyber-war:

The disintegration of the warrior’s personality is at a very advanced stage. Looking up, he sees the digital display (optoelectronic or holographic) of the windscreen collimator; looking down, the radar screen, the on-board computer, the radio and video screen, which enables him to follow terrain with its four or five simultaneous targets; and to monitor his self-navigating Sidewinder missiles fitted with a camera or infrared guidance system. (quoted in Kellner 2000a: 109)

With surveillance cameras there were people watching from behind the camera. But with new recognition technologies (see Box 4.5) there is only a micro-receiver and a computer, but no human spectator of the image. The door itself recognises the human. The term ‘vision machine’, Virilio informs us, comes from the Cruise missile. Missiles with built-in radars and mapping systems and equipped with an ‘automatic gaze’ are examples of the vision machine. The machine, as Virilio puts it, ‘looks for itself’
(2000: 48–49). These technologies, in combination with the increasing nuclearisation of the world, produce what Virilio terms ‘pure war’. This is a situation where military technologies and the technocratic system control every aspect of our life. The faith in nuclear deterrence, submission to a system of mass destruction (appropriately termed, in military rhetoric, MAD or ‘mutually assured destruction’), and societies preparing for the perpetual possibility of war are all aspects of Virilio’s pure war. Speed and technology replace democratic participation and undermine politics. Effective media politics diminishes the space of democratic political participation. Instantaneous communication actually reduces the time for detailed discussions, deliberations and consensus-building (see, in sharp contrast, Mark Poster’s views on the role of new ICTs in ‘democratisation’ in Chapter 3). For Virilio, every technology generates its own specific form of accident. Accidents are integral to industries, transport, military and everyday technologies. Virilio suggests that there has been, with contemporary technology, the replacement of ‘local time’ with one single ‘global time’. To understand this idea it is important to look at Virilio’s concept of ‘polar inertia’. Virilio argues that time has ‘telescoped’: We’re heading towards a situation in which every city will be in the same place—in time […] cities which have kept their distance in space, but which will be telescoped in time […] The difference of sedentariness in geographical space will continue but real life will be led in a polar inertia (in Virilio and Lotringer 1997: 64). That is, geographical distances become irrelevant when everything is conducted at the speed of light (absolute speed, from theories in physics). Jacques Derrida puts it slightly differently when he states: [A] technology that displaces places: the border is no longer the border, images are coming and going through customs, the link between the political and the local, the topolitical [a portmanteau term from ‘topology’ and ‘political’], is as it were dislocated (in Derrida and Stiegler 2002: 57).
Virilio also suggests that there has been, in the past few decades, a total collapse of the distinction between the human body and technology. Technological enhancement and substitution lead to what Virilio terms ‘endo-colonisation’. Virilio argues that there is no colonisation of territory without colonisation of the body (2000: 50). Contemporary technology, which turns against itself (the body), parallels a situation where the state turns against its own people. With the ‘transplant revolution’ (for Virilio the ‘Third Revolution’ after transportation, with the invention of the steam engine, and the transmission revolution that begins with Marconi), we have an endo-colonisation of the human body by technology (ibid.: 49–50). For Virilio, the enthusiasm of Donna Haraway and cyberfeminists for the new ICT-based cultural condition heralds a technological fundamentalism—and a politics that Virilio, in his usual inimitable style, terms ‘politics of the very worst’ (Virilio and Petit 1999a). Virilio claims that all these technologies, or what he terms ‘new technologies of instantaneous interactivity’, ‘exile us from ourselves and make us lose the ultimate physiological reference: the ponderous mass of the locomotive body, axis, or more exactly seat of comportmental motility and of identity’ (quoted in Kellner 2000a: 117). For Virilio the new world order is a manifestation of what he terms ‘globalitarianism’. The single market, the convergence of time toward a single time (which dominates local time), the telecommunications conglomerates and cyberspace—all suggest a ‘totalitarianism of totalitarianism’. Where totalitarianism was once localised and singular (as in the Soviet Union), with the advent of globalisation ‘it is everywhere that one can be under control and surveillance’ (Virilio 2000: 38). Virilio warns us in ‘Speed and Information: Cyberspace Alarm’ (1995): We have to acknowledge that the new communication technologies will only further democracy if, and only if, we oppose from the beginning the caricature of global society being hatched for us by big multinational corporations throwing themselves at a breakneck pace on the information superhighways. ()
KEY CONCEPTS AND TERMS CYBORG In order to understand the implications of the body in cyberspace we need to look at the evolution of the cyborg or cybernetic organism. Norbert Wiener’s work on cybernetics, Cybernetics: Or Control and Communication in the Animal and the Machine (1948) was the first major survey of the science of cybernetics. Wiener provides a useful history of automata, a history that climaxes with the animal–machine interface: the Golemic age, the age of the clocks (seventeenth and eighteenth centuries), the age of steam (late eighteenth and nineteenth centuries), and the age of communication and control (the twentieth century, marked by a shift from power engineering to communication engineering). These four stages produced corresponding models of the human body: (a) the body as a clay figure; (b) the body as a mechanism resembling clockwork; (c) the body as a heat engine burning fuel; and (d) the body as an electronic system (Wiener 1948: 51–52). Wiener then defined the automaton as a ‘servomechanism’, coupled to the external world through a mechanism of communication—the incoming messages and the outgoing ones. Wiener was thus suggesting a form of automata that impacted directly upon the social through this feedback mechanism. In a later work, The Human Use of Human Beings: Cybernetics and Society (1954), Wiener described the requirements of a new cybernetic organism: S They must possess effector organs analogous to human arms and legs. S They must be linked with the outer world through sense organs which enable them to record their efficiency in performing certain tasks (the ‘feedback’). Wiener suggested that there must be a central decision organ (such as a brain) which decides what the machine must do next based on the information received as feedback (just as the human
body does). With the machine and the organ comprising two functionally equivalent states, the cybernetic automaton in effect mirrored the human body. Suddenly, then, cybernetic automata forced philosophy, engineering and biology to interrogate the identity of the human itself—a movement that culminates in the heralding of the ‘posthuman’. In the 1960s Marshall McLuhan extended the notion of such a technology to speak of ‘global villages’ of information-based consciousness—what David Tomas terms an ‘electronic collectivisation of the human body’ (Tomas 1991: 34). McLuhan famously spoke of a technology that was an ‘extension or self-amputation of physical bodies’, with an extended nervous system and imbedded in a stream of global information flows, based on such a cybernetic model. Then in 1960 Manfred Clynes and Nathan Kline coined the term ‘cyborg’ (cyb[ernetic] org[anism]), defining it as ‘self-regulating man–machine systems’. They described the cyborg system ‘as an organisational system in which such robot-like problems [such as the body’s autonomy and self-regulating mechanisms] are taken care of automatically and unconsciously, leaving man free to explore, to create, to think, and to feel’ (quoted in Tomas 1991: 35).
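Wiener’s servomechanism amounts to a simple feedback loop: sense the outer world, compare the reading with a goal, and let a central ‘decision organ’ choose the next corrective action. The following Python fragment is a minimal sketch of such a loop—my own illustration with arbitrary names and numbers, not Wiener’s own formalism or code from any source discussed here.

```python
# A toy servomechanism: the "sense organ" reads the current state, the
# "decision organ" computes a correction from the feedback, and the
# "effector" acts on the world.

def decide(goal, reading, gain=0.5):
    """Central decision organ: turn the feedback signal into an action."""
    error = goal - reading        # incoming message: how far off are we?
    return gain * error           # outgoing message: the corrective action

position, goal = 0.0, 10.0        # arbitrary starting state and target
for step in range(10):
    action = decide(goal, position)   # feedback drives the decision
    position += action                # the effector changes the world
    print(f"step {step}: position = {position:.2f}")
```

Each pass through the loop narrows the gap between the machine’s state and its goal, which is all that coupling to the external world through incoming and outgoing messages requires.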
CYBORG DISCOURSE The manner in which various technological, natural, biological, social, linguistic and cultural changes are inscribed and embedded in the text’s narrative/rhetoric.
CYBERSPACE This now-ubiquitous term was coined by the novelist William Gibson. Gibson’s Neuromancer (1984) not only inaugurated the genre of cyberpunk, it also proved to be one of the most influential texts for contemporary studies of technology. Here is his famous definition: [Cyberspace] A consensual hallucination experienced daily by billions of legitimate operators in every nation, by children being taught mathematical concepts … a graphic representation
of data abstracted from the bank of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding …. (Gibson 1984: 51, ellipsis in original) Neal Stephenson in Snow Crash (1993 [1992]) uses the term ‘metaverse’ to describe cyberspace: Like any place in Reality, the Street is subject to development. Developers can build their own small streets feeding off the main one. They can build buildings, parks, signs, as well as things that do not exist in Reality, such as vast hovering overhead light shows, special neighbourhoods where the rules of three-dimensional spacetime are ignored, and free-combat zones where people can go to hunt and kill each other. The only difference is that since the Street does not really exist—it’s just a computer-graphics protocol written down on a paper somewhere—none of these things is being physically built. They are, rather, pieces of software, made available to the public over the worldwide fibre-optics network […] it is always nighttime in the Metaverse, and the Street is always garish and brilliant, like Las Vegas freed from constraints of physics and finance …. He [Hiro, the protagonist] is not seeing real people, of course. This is all a part of the moving illustration drawn by his computer according to the specifications coming down the fibre-optic cable. The people are pieces of software called avatars. They are audiovisual bodies that people use to communicate with each other in the Metaverse …. (pp. 24–42) Gibsonian cyberspace is a place where information is collected and which could be ‘entered’ by an avatar using a network computer (what Gibson in Neuromancer terms a ‘cyberspace deck’). John Barlow of the Electronic Frontier Foundation defines cyberspace as ‘where you exist when talking on the telephone’ (Rucker et al. 1993: 78). Barlovian cyberspace refers to the space computer networks create. Meredith Bricken describes cyberspace thus: ‘In a virtual world, we are inside an environment of pure information that we can see,
hear and touch …. Cyberspace technology couples the functions of the computer with human capabilities’ (Bricken 1991: 363). A more detailed explanation/definition comes from architect Marcos Novak (whose work is discussed in Chapter 2). Novak writes: Cyberspace is a completely spatialised visualisation of all information in global information processing systems, along pathways provided by present and future communication networks, enabling full copresence and interaction of multiple users, allowing input and output from and to the full human sensorium, permitting simulations of real and virtual realities, remote data collection and control through telepresence, and total integration and intercommunication with a full range of intelligent products and environments in real space. (1991: 225) Mike Featherstone and Roger Burrows, following Bruce Sterling, define cyberspace as a generic term referring to a whole group of technologies, all of which have the ability to simulate environments within which humans can interact (Featherstone and Burrows 1998 [1995]: 5). Cyberspace is thus an alternative or parallel universe created by and within the world’s communication networks.
VIRTUALITY AND VIRTUAL REALITY Virtual Reality (VR) is a term coined by Jaron Lanier to refer to a real or simulated environment in which the perceiver experiences telepresence. VR gives the realistic experience of being immersed in an environment. It uses computerised clothing, eyephones and stereo headphones. Virtuality is the general term used to describe the alternate ‘world’ in cyberspace. This world, seen as different from physical ‘reality’ but with its parallels with reality, is ‘virtuality’. Lanier, enthusiastic about the new VR technologies, writes: Virtual Reality exists so that people can make up their reality as fast as they might otherwise talk about it. The whole thing with Virtual Reality is that you’re breeding reality with other people. You’re making shared cooperative dreams all the time. You’re
changing the whole reality as fast as we go through sentences now. Eventually, you make your imagination external and it blends with other people’s. Then you make the world together as a form of communication. And that will happen. (1992: 46) One of the principal differences between real (or ‘meat’) space and the virtual world is that we do not take our physical bodies there. We sit and converse with people, meet them in cyberspace, when all the time our physical bodies remain stationary at the computer terminal. The term ‘virtual’ is now used to describe all computer-mediated activity and electronic culture. The term also has another resonance: that of potential. Marie-Laure Ryan points out that the term ‘virtual’ offers two distinct concepts: (a) fake, illusory, nonexistent, and (b) potential. These two combine in Pimentel and Teixeira’s definition of VR: ‘Virtual reality refers to an immersive, interactive experience generated by a computer’ (quoted in Ryan 1999: 89). Cyberspace as fake means that it provides the sense of a place that does not exist physically. VR as potential is ‘a mode that gives free rein to creative processes’ (Lévy, quoted in ibid.: 92). Pierre Lévy’s work provides a crucial modification of the term and is of particular interest to people looking at the arts. Lévy suggests that a virtual entity can be actualised endlessly, and actualisation is the process of passage from timelessness and placelessness to an existence rooted in the here and now. Knowledge resident within the large databases of the computer network is virtual, mainly because it is not depleted by use. Its value is in its potential to generate wealth. In the case of texts, the version preserved on the disc is virtual and invisible, but can be actualised endlessly. In fact, virtuality is the mode of existence of the text itself. This last is not new, since texts have always existed primarily as mental constructs. The act of writing actualises ideas, thoughts and memories. Thus every act of reading actualises the text. A text is therefore a potential. Marie-Laure Ryan sums up the virtual/actual nature of texts: The virtuality of texts and musical scores stems from the complexity of the mediation between what is there, physically, and what is made out of it. Colour and form are inherent to pictures and objects, but sound is not inherent to musical
scores, nor are thoughts, ideas, and mental representations inherent to the graphic or phonic marks of texts. They must be constructed through an activity far more transformative than interpreting sensory data. (Ryan 1999: 96) What electronic technology has produced is an elevation of a text’s built-in virtuality to a higher power.
INFORMATIONAL SPACE Cyberspace is constructed out of information. It is basically an informational space. The exchange of information is in the form of a software code, and this creates cyberspace. Avatars (see extract from Stephenson quoted earlier) and virtual identities are created through the exchange of information.
CYBERPUNK The term ‘cyberpunk’ was first used by Bruce Bethke in 1983, in a short story called ‘Cyberpunk’ in Amazing Stories, but it became a popular literary genre with William Gibson’s Neuromancer (1984). Bruce Sterling, novelist and the editor of the first cyberpunk anthology, Mirrorshades (1988 [1986]), defines cyberpunk in his Preface to the book as a ‘modern reform’ and a natural extension of elements already present in science fiction (SF) (Sterling 1988 [1986]: xv). He lists the following concerns of cyberpunk: the theme of body invasion via prosthesis, genetic alteration, implanted circuitry and cosmetic surgery (ibid.: xiii). Sterling sees cyberpunk as an integration of technology and the 1980s counterculture—rock video, hackers, hip-hop and scratch music, and synthesiser rock. Veronica Hollinger defines cyberpunk thus: [Cyberpunk] is […] posthumanism with a vengeance, a posthumanism which, in its representation of ‘monsters’—hopeful or otherwise—produced by the interface of the human and the machine, radically decenters the human body, the
sacred icon of the essential self, in the same way that the virtual reality of cyberspace works to decenter conventional humanist notions of an unproblematical real. (1991: 207)
CYBERPOLITICS This refers to the rights of access to and rights in cyberspace. Literally, it refers to the rights of digital constructs within cyberspace. The term is used increasingly to refer to the impact of electronic media and VR on ‘real’ politics.
POSTHUMANISM Simply put, a posthuman is what evolves after the human. It is primarily an attitude about overcoming the limitations of the human form—age, decay, disease, intelligence, looks—through technology. It has three main assumptions: S There is nothing natural or special about the human body. S Human beings are diverse, have diverse goals and diverse ways of reaching them. S Technology can enable us to overcome the limits of human form, a process termed ‘techno-transcendence’. Increasing intelligence and memory (for example, through drugs), implantation of new brain tissue, or wiring the brain directly into computers are some of the possibilities suggested by posthumanists. Katherine Hayles in her extraordinary work, How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics, argues that the posthuman is essentially a ‘point of view’ characterised by a particular set of assumptions (Hayles 1999: 2–3): S It privileges informational pattern over material ‘substance’ or ‘form’. Thus biological embodiment is an accident rather than an inevitability of life.
S Consciousness is only a ‘sideshow’ and not the seat of human identity. S The body is the original ‘prosthesis’ that we learn to manipulate. Therefore extending or replacing the body with other prostheses is the continuation of a process. S There are no significant differences or absolute differences between bodily existence and computer simulation, cybernetic mechanism and biological organism, robot technology and human goals. In the posthuman the human can be seamlessly articulated with intelligent machines.
CYBERCULTURE STUDIES David Silver, founder of the Resource Center for Cyberculture Studies (Washington), and one of the world’s finest analysts of the new technocultural society, provides a useful survey of scholarship in cyberculture studies. Silver identifies three stages in cyberculture studies in the years between 1990 and 2000. 1. Popular cyberculture: Collections of essays, journalism and stories that began appearing in the early 1990s. Between 1993 and 1994 Time published two cover stories on the Internet, and Newsweek did one in 1994. Popular books such as The Internet for Dummies became bestsellers. These were mainly descriptive books. Others, termed ‘technofuturists’, declared cyberspace as the new frontier. Mondo 2000, Wired, Boing Boing and other such magazines greeted the new space with enthusiasm, partly driven by works like William Gibson’s Neuromancer. 2. Cyberculture studies: By the mid-1990s cyberculture studies were well underway, as exemplified in the work of Howard Rheingold (The Virtual Community), Sherry Turkle (Life on the Screen) and works such as Cyberspace: First Steps (edited by Michael Benedikt). Interdisciplinary in nature, studies now also looked at netiquette, consumption of technology, gender and cyberspace, and other themes.
3. Critical cyberculture studies: By the late 1990s ‘critical cyberculture studies’ had emerged as a major field. This genre has four major emphases: (a) it explores the social, cultural and economic interactions that take place online; (b) it examines the stories we tell about these interactions; (c) it analyses a range of social, cultural, political and economic considerations that encourage, make possible and/or thwart individual and group access to such interactions; and (d) it assesses the deliberate, accidental and alternative technological decision and design processes which, when implemented, form the interface between the network and the user (Silver 2000: 19–30).
TECHNOLOGIES The history of man is the history of technology. There has always been a tool, a weapon, a device that enabled man to eat, fight, protect and conquer. The technologisation of man and civilisation is one of the most fascinating stories of all time. But this is a story that deserves a larger canvas and is therefore horrifically abbreviated here. My concern is with the late twentieth century’s instrumental imperative. Pastoral, agricultural, nomadic and urban cultures have always developed their own tools for war, food production and amusement. World mythologies are full of half-metal/half-flesh creatures (cyborgs), automatons and artifacts endowed with magical animate properties. Hero’s mechanical tableaus date back to 300 BC. Jurassic Park and Mission Impossible, with their sophisticated entertainment technologies, are logical sequels to early amusement technologies such as Jacques de Vaucanson’s mechanical duck (1741). The early modern period, which saw the birth of ‘modern’ science, used puppets and prosthetic limbs. Autopsy and transplants, interchangeable artificial limbs (such as those wielded by Count Goetz von Berlichingen), galvanism and computer graphics—the history is varied and chequered. In this section I survey some twentieth-century technologies that have become an integral part of everyday life today.
THE INTERNET The Internet originated in ARPANET, the network of the Advanced Research Projects Agency (ARPA) of the US Department of Defense. The idea was to create a system where researchers could access each other’s computers from a single terminal, from any geographical location. ‘Packet switching’ solved the problem of enormous amounts of time spent in sending large data files. Data was first cut into packets, addressed, and sent off. Later, when they all arrived, these packets could be reassembled. Soon ‘hosts’ and ‘servers’ appeared on the scene. Dedicated network computers would serve the computers (‘hosts’) that people wanted to use. Thus people with no host computer could still connect through a server and use host computers at other places. Computers did not need to be directly connected with each other for this. Any computer, in theory, could be connected to the network through a server. ARPANET grew to have 150 sites by the 1980s. A survey conducted in 1973 revealed that three-quarters of the traffic was electronic mail. This was a significant shift—people were using ARPANET email to communicate with other people rather than with computers—and marks the genesis of what is now called computer-mediated communication (CMC). In 1983 FidoNet emerged. This used home-based personal computers to set up a network for email and online discussions. By the early 1990s it had 1.56 million users. Usenet is a network devoted to a news-based conferencing system. It has over 20,000 topics on any and every subject and enables computer conferencing. Eventually, in order to avoid astronomical telephone bills, it created dedicated lines. FidoNet and Usenet were the most successful of worldwide cooperative networks, though other similar networks such as the Joint Academic Network (JANET) and FNET (France) were also used. However, by the 1990s, with the rise of Local Area Networks (LANs), it appeared as though only secured, restricted and technically incompatible networks would emerge. The response to this was the Internet and later, the World Wide Web (WWW). A solution to the problem of technically incompatible networks (which meant that data could not move between networks, since they could not be connected directly to each other) was found.
Packets of data would go via ‘gateways’. At the gateway the data would be placed in an envelope that could be read by the receiving network. The ability of the computer to communicate with the gateway was defined by a set of instructions called the Transmission Control Protocol (TCP), which was later split into TCP and the Internet Protocol (IP). Thus if a network employed TCP/IP, it could communicate packets to any network that also used TCP/IP. The Domain Name System (DNS) was developed to identify addresses of receivers and senders. Each name (called an IP name) is related to a number that identifies the computer. In the early 1990s Tim Berners-Lee and his colleagues at the CERN physics laboratory in Switzerland developed a common space between all information resources constituted out of shared and compatible addresses. All resources were attached to a string of characters that indicated where the resource was located in this common space. This string was the universal document identifier, soon to be called the Uniform Resource Locator (URL). The Hypertext Transfer Protocol (HTTP), prefixed to the URL, informed the user computer how to deal with the request for information. Information resources now began to be made available via the conjunction of a computer (that took Web requests), a Web server, and the information domain of the Internet. Text, pictures and other resources could be converted into hypertext pages through Hypertext Mark-Up Language (HTML) to be read on screen. The sequencing is as follows: S A user logs on (jacks in) to the Internet requesting specific information. S The computer transfers the request to a Web server using HTTP. S The files or programs carrying the requested information in the Web are sent as ‘Web pages’ to the computer using TCP/IP. S The receiving computer uses a Web browser to compose the images out of the information. The Mosaic browser provided one of the first such Graphical User Interfaces (GUIs), presenting the information as hypertext on the user computer’s screen. Mosaic was later succeeded by Internet Explorer and Netscape.
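The sequence described above—name lookup, a TCP connection, an HTTP request, and the reassembly of returned packets into a page—can be sketched in a few lines of Python using only the standard socket library. This is a minimal illustration, not how any browser is actually implemented; ‘example.com’ is a placeholder host, and error handling, HTTPS and redirects are omitted.

```python
import socket

host = "example.com"                       # placeholder host

# 1. The Domain Name System relates the name to the number (IP address)
#    that identifies the computer.
ip_address = socket.gethostbyname(host)

# 2. Open a TCP connection to the web server; TCP/IP carries the packets
#    between the two networks.
with socket.create_connection((ip_address, 80)) as sock:
    # 3. The HTTP prefix of a URL tells the machines how to handle the
    #    request; here we ask for the page at the root path "/".
    request = f"GET / HTTP/1.1\r\nHost: {host}\r\nConnection: close\r\n\r\n"
    sock.sendall(request.encode("ascii"))

    # 4. The response arrives as packets, reassembled here into the HTML
    #    that a browser would render as a page on screen.
    response = b""
    while chunk := sock.recv(4096):
        response += chunk

print(response.decode("utf-8", errors="replace")[:300])
```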
MULTIMEDIA Since we perceive the world through several senses at once, any division of the media is artificial. Multimedia seeks to deliver the world to all the senses. It is the structured collection of bits of media represented digitally (digitisation is the process of converting a signal from an analogue to a digital form, usually through analogue to digital converters [ADCs]). These can then be manipulated by programs on a computer, stored on disks, and transmitted over networks. Different media can thus be combined into multimedia: sound and moving pictures, photographs and still images, graphic illustrations (say, flowcharts) and text (say, subtitles or captions). One of the central features of digital multimedia is that it is interactive. The viewer can view the scene as a video clip or add sound effects. In some cases scenes can be zoomed in on by clicking on sections of the image. The viewer can ask for a transcription of the dialogue, and people with poor vision can have the computer ‘read’ the scene to them. It is now also possible to change the story by adding or removing elements. With online delivery, information is delivered through a network. CD-ROMs facilitate offline delivery of information, storing it until the computer/viewer seeks to activate it. The CD-ROM specification was published in 1985. Drives for using CD-ROMs appeared on desktop machines from 1989, and the World Wide Web became publicly available from 1992. In January 1997 the HTML 3.2 specification was adopted as a WWW Consortium Recommendation. The following are some of the frequently used techniques in digital multimedia: GIF: This was developed by CompuServe as a common format for exchanging bitmapped images. These files use a compression technique and are restricted to 256 colours. JPEG: Joint Photographic Experts Group developed a compression technique that allows an image compressed using it to be stored in any file format. MPEG (Motion Picture Experts Group): This is a compression technique for digital video. Vector and Bitmapped Graphics: What we see on the computer screen is actually an array of pixels—small, square dots of colour
(the term ‘pixel’ is distilled from ‘picture element’). These dots merge optically when viewed from a distance and produce the impression of continuous tone. To display an image on the monitor the program must set each pixel to an appropriate colour or shade of grey, so that the pattern of pixels on the screen presents the required image. The graphics program keeps an internal model of the image to be displayed. The process of generating a pattern of pixels from a model is termed ‘rendering’. There are two approaches to graphic modelling: bitmapped graphics and vector graphics. In bitmapped graphics the image is modelled by an array of pixel values. The stored values are logical pixels, and the physical dots on the screen are physical pixels. The logical pixels may be of a different resolution during storage, and may have to be ‘scaled’ or ‘clipped’ for the image to be displayed. In vector graphics the image is stored as a mathematical description of a collection of individual lines, curves and shapes. Some computation has to be performed in order to interpret the model and generate an array of pixels to be displayed.3
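The difference between the two approaches can be made concrete with a small sketch: a vector model stores a line as four numbers, and rendering turns that description into a bitmapped array of logical pixels. This is a deliberately naive illustration (simple interpolation rather than a production rasteriser), with made-up dimensions.

```python
def render_line(x0, y0, x1, y1, width, height):
    """Rasterise one vector primitive (a line segment) into a pixel array."""
    pixels = [[0] * width for _ in range(height)]    # bitmapped model
    steps = max(abs(x1 - x0), abs(y1 - y0), 1)
    for i in range(steps + 1):
        # walk from (x0, y0) to (x1, y1), switching on the nearest pixel
        x = round(x0 + (x1 - x0) * i / steps)
        y = round(y0 + (y1 - y0) * i / steps)
        pixels[y][x] = 1
    return pixels

vector_model = (0, 0, 7, 3)                          # the whole "image": one line
bitmap = render_line(*vector_model, width=8, height=4)
for row in bitmap:
    print("".join("#" if p else "." for p in row))
```

The vector description stays compact and resolution-independent; the rendered bitmap is what the screen actually displays, and it would have to be recomputed for a different resolution.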
THE NEW MEDICINE An increasing knowledge of body chemistry and the nervous system has enabled large-scale research into and production of psychotropic drugs. The man–machine interface has become more sophisticated. Muscle sensors now pick up electromyographic signals and convert them to specific commands for the powered prosthesis. There is extensive military research on direct electroencephalographic signals from the brain. Accelerometers to track head motions and voice commands to initiate preprogrammed actions that control computer-directed prosthesis are now available. An arm or a finger can now be made to move via voice-driven prosthesis.
Implants While prostheses have been around for centuries, artificial organ transplants are more recent. The kidney was the first artificial organ. Researched from the early decades of the twentieth century,
the first successful model was built by Willem Kolff in 1944. Research is now being conducted on portable kidneys. In fact, technoculturally speaking, a semicyborg is created when patients requiring regular dialysis have to stay connected to the machine for some time. The first artificial liver was created in 1956 when a human was connected with a dog’s liver. Denton Cooley performed the first human artificial heart transplant in 1969. Cochlear implants for the ears now pick up sound waves and relay them directly to the auditory nerve. Electrical and biochemical stimulations for neurological dysfunctions are also available. Artificial eyes—with cameras that supply visual information through an electronic ‘patch’ on the visual cortex—are no longer merely Terminator/Robocop fantasies. Pacemakers and cardioverter-defibrillators (heart resuscitation machines) get more sophisticated (and smaller) every day. Biological tissues such as glands, vessels, skin, bones and cells now have polymer equivalents (termed ‘biomaterials’). Natural transplants have a long history: kidney (1951), liver (1963), heart (1967), lung (1981). Rejection rates have been high, though advancing research, mainly in genetic engineering and immunosuppressive therapy, has helped improve survival chances greatly. There is also an interesting cultural phenomenon here. Parents with a child suffering from any bone marrow disease produce another offspring to ensure that matching donor marrow will be available. Between 1984 and 1989 at least 40 such babies were born (Gray 2001: 81–82).
The Visible Human Project (VHP) The VHP is a good example of the collaboration between IT and medical technology. Computed Tomography (CT) and Magnetic Resonance Imaging (MRI) images of a representative male and female cadaver were first obtained. The cadaver was then cut into four sections and frozen in blue gelatin at −70ºC. When suitably solidified, the sections were shaved into thin slices of thickness ranging from 0.33 to 1 mm, digitally photographed, and further scanned using both CT and MRI. These three imaging modes were collated in a computer and a data package of the entire cadaver prepared. A viewer can now ‘fly through’ the human body in computer simulation: anatomy comes alive.
The Human Genome Project (HGP) The HGP is basically genome sequencing, which involves the following steps: S Chromosomes (ranging from 50 million to 250 million bases) are broken down into shorter pieces. S Each short piece is used as a template to generate a set of fragments. S The fragments in a set are separated. S The final base at the end of each fragment is identified (the ‘base-calling step’, as it is termed). Thus far, finished sequencing has been achieved for three chromosomes: numbers 20, 21 and 22 (the announcement for number 20 appeared in the February 2002 issue of Human Genome News [Vol. 12, Nos 1–2, p. 3]).
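The fragment-and-base-calling idea can be illustrated with a toy example. In chain-termination sequencing each fragment generated from a template ends one base further along, so separating the fragments by length and reading the final base of each reconstructs the sequence. The sketch below is only an illustration of that logic (with a made-up seven-base ‘chromosome piece’), not real sequencing software.

```python
template = "GATTACA"                 # a made-up short template sequence

# Step 2: generate a set of fragments, each terminating at a successive base.
fragments = [template[:i] for i in range(1, len(template) + 1)]

# Step 3: separate the fragments -- here simply by ordering them by length.
fragments.sort(key=len)

# Step 4: the "base-calling step" -- read the final base of each fragment.
called = "".join(fragment[-1] for fragment in fragments)

assert called == template
print(called)                        # GATTACA
```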
Nanomedicine Nanomedicine is one of the most sophisticated developments in medical technology in the late twentieth century; its enthusiasts say that it will transform healthcare. Robert Freitas—a leading researcher in the field—popularised the idea in his multivolume Nanomedicine (first volume published in 1999). Nanomedicine is the application of nanotechnology (a term used to describe the science and engineering of tiny machines) to the treatment and prevention of disease. Nano, from the Greek ‘nanos’, meaning ‘dwarf’, actually suggests a size of the order of a billionth of a metre (10^-9). It involves the use of nanostructured materials such as enzymes and moves the level of medical intervention right down to the individual cell. Nanomedicine has several related components and technologies: S Nanobiology: A concept first proposed by the Japanese Agency of Science and Technology as part of a study group, Biological Nano-Mechanisms, in 1992–98, this is a technology that combines mechanistic biology and morphology. Most biological processes are carried out
by molecular mechanisms of very small size (1–100 nanometres). Nanobiology studies these processes through scanning probe microscopy and micro-manipulating techniques. S Biomolecular imaging: This uses Atomic Force Microscopes (AFM) to observe biomolecular processes—such as the work of DNA or proteins—within the body. Diffusion Tensor Magnetic Resonance Imaging (DT-MRI) measures three-dimensional random motion of water molecules in soft tissues to produce images of that tissue’s structure. This study enables a better understanding of disease and decay of tissues. Another important area of biomolecular imaging is the study of molecular communication. This involves studying the manner in which external conditions reach the cell, transmit their messages and stimulate appropriate responses within the cell/tissue. S Medical nanoparticles: Related to proteomics—the study of proteins in cells—this involves the creation of magnetic nanoparticles that can be inserted into the body and circulate through the bloodstream. Then, using magnetic fields, these nanoparticles can be deflected or ‘lured’ to specific disease sites in the body to deliver medication or undertake probes. This is likely to create individualised medicine and has the additional advantage of size—small enough to beat the body’s immune system. In Bruce Sterling’s short story, ‘Our Neural Chernobyl’ in Global Head (1994 [1992]) researchers find a way of wiring drug packets into the body, which can be triggered by a specific body biomolecular process to give a permanent ‘high’. Carbon nanotubes can be used to deliver genes into cells in gene therapy, since the cells/immune system does not reject them as intruders. S Biomimetics: This is the study of natural systems to improve the efficiency of artificial ones. Applied Digital Solutions of the US produced a ‘Digital Angel’, an implantable recording device which measures and memorises various bodily processes. Patients can now collect the data themselves, transfer the same to their computers or pass it on to doctors. A read-out medical history at instant notice is now a reality with BioMEMS. Neural probes installed semipermanently,
for instance, now monitor cerebral fluid conditions and administer medications directly. S Medical nanosensors: Nanosensors can be implanted in the body, and produce readings of, say, insulin levels or DNA sequences. Such sensors, chemical, chemo- or bio-, are used today to track a particular piece of DNA that may be linked to a disease. Nanosensors can be administered transdermally (through the skin) to avoid intravenous administration. William Gibson’s fiction frequently describes medication through ‘derms’. These sensors use fluorescent tags that glow in the presence of certain proteins, which can then be detected using laser scanning devices. The most common application today is to measure concentration levels of specific molecules and biopolymers in blood serum or interstitial fluid. Chemotactic sensors are used to sample the chemical composition of surfaces (such as those of cells or tissues). S Nanorobots: These can enter cells and replace or repair damaged structures. They are miniature robot-surgeons which, it is hoped, will eventually be able to replicate themselves or even correct genetic defects. S Shape-memory biocompatible alloys: These are bendable surgical tools that enter the body in one shape and change to the required shape after arriving at the predetermined location. The newer versions of these alloys help bone repair and reinforcement of blood vessels. Screws, pellets, rivets and bone replacement with self-locking capacity will eventually be shape-memory devices: carrying the shape of the body part in their memory, they reach the required (diseased) section of the body and acquire the shape of the body part that needs replacement.
ASSISTED REPRODUCTIVE TECHNOLOGIES (ARTS) The first IVF (in vitro fertilisation) pregnancy was confirmed in 1975. The first ‘test-tube baby’, Louise Brown, was born in 1978. Today the phrase ‘infertility treatment’ has been replaced by
‘Assisted Reproductive Technology’ (ART) to indicate the fact that such techniques are not just for people with infertility, but can be used to ‘engineer’ people’s reproductive lives. This can include anything from choosing the gender of their children to women delaying the birth of the first child until after they have established their careers. IVF—one of the most popular of the ARTs—involves some common elements of procedure. The woman’s menstrual cycle is suppressed by controlling the body’s hormones, specifically the Follicle Stimulating Hormone (FSH). The doctors then administer drugs that stimulate the ovary to produce many Graafian follicles inside which the eggs will mature. When the follicles reach a particular size, a synthetic version of the hCG (human Chorionic Gonadotrophin) hormone is injected to ripen the eggs. The eggs are then removed from the follicles, collected in a glass dish where they are mixed with sperms collected earlier. The sperms are retrieved from the semen by washing it and spinning it in a centrifuge. Thousands of sperms are added to each egg in the dish, which is then placed in an incubator overnight at a temperature similar to that inside the human body. Later, up to three resulting embryos (produced after fertilisation of the egg by the sperm) are transferred to the woman’s uterus. The other embryos are frozen, to be used if the first attempt is unsuccessful. One or more of the transferred embryos will (hopefully) develop into a healthy foetus and, later, a baby. IVF has several variations: S Natural cycle IVF: No fertility drugs are given. This means that only one egg is produced per cycle. S ICSI (intracytoplasmic sperm injection): An individual sperm is injected directly into the egg (in some countries, immature sperms—called spermatids—are used). S SUZI (sub-zonal insemination): Several sperms are injected into the region between the zona and the membrane of the egg. S IVF using donated gametes (sperms or eggs) or embryos: Used in various situations, such as the woman not producing any eggs, or if one of the parents has an inheritable disease.
Sometimes hospitals and clinics offer other variations, where fertilisation is not necessarily achieved outside the body: S GIFT (gamete intrafallopian transfer): The sperms and eggs rather than embryos are transferred into the woman’s fallopian tubes. S DI (donor insemination): The first recorded DI was carried out at least 200 years ago. DI may be either AID (artificial insemination by donor) or AIH (artificial insemination by husband). Here the sperm is transferred from the woman’s partner or sperm donor into the woman’s reproductive tract. S Surrogacy: This is used when a woman is unable to carry a pregnancy to term. Another woman (the surrogate mother) may be inseminated with the sperm of the patient’s husband. Or, the couple can undergo IVF treatment and the embryo can be transferred to a surrogate womb. There have been other developments (reproductive biology is an extraordinarily fertile field!) in the 1980s and 1990s. Two common methods focus on sperm collection: S MESA (microepididymal sperm aspiration): For men with a low sperm count—the primary cause of infertility—a surgical technique is used where sperms are taken directly from the epididymis using a fine needle. S PESA (percutaneous sperm aspiration): Here the sperm are collected via a fine needle inserted through the skin. In extreme cases (where a blockage prevents the sperm developed in the epididymis from mixing with the seminal fluid), testicular biopsy is performed where a small slice of the testis is removed.
Cloning Indisputably the most controversial of ARTs, cloning is the stuff that SF is made of. Cloning involves the use of genetic information (namely DNA) from a single cell to create an entirely new human being. A baby is created when two sex cells—a sperm and an
egg—combine. Sex cells (also called germ cells) contain half a set (23 in number) of such information carriers (chromosomes). The combination of two sex cells produces a new cell with a unique genetic identity (genotype). A fertilised egg therefore contains 46 chromosomes. Any cell other than a sex cell (called a somatic cell) also contains 46 chromosomes. Thus each somatic cell contains a copy of the genotype of the person. Cloning uses the genotype from a single somatic cell to produce a new human being without fertilisation. More importantly, the baby will have the same genetic identity as the person whose cell was used in the cloning. On 23 February 1997, ‘Dolly’, the first cloned sheep, made a public appearance. However, Dolly is not the first cloned life form, since plants have been cloned for some time now, through techniques such as grafting. The root of the word ‘clone’ helps us understand this better. It comes from the Greek klon, meaning ‘twig’, referring to techniques of asexual reproduction in plants such as Chlorophytum (the spider plant). Here the plant develops stalks at the end of which are ‘plantlets’. These plantlets are miniaturised versions of the plant, and soon fall away to grow as copies (clones) of the main plant. Each of these plantlets contains the same genetic structure as the parent. Cloning thus erases the difference between parent and offspring.
MOLECULAR MEDICINE AND GENE THERAPY Recent research into the biology of health and disease has revealed that several diseases and biological conditions have molecular and genetic origins. Medical and surgical techniques in human organ transplantation, pharmacotherapy (in Western medicine) and the unravelling of the human genetic code have been closely associated with the recent developments in medical biology. Gene therapy is the use of genetic manipulation for treatment of disease. It derives from genetics, molecular biology, clinical medicine and human genomics. Gene therapy offers the potential of a one-time cure for inherited ailments and disorders. DNA technologies now allow the identification of genes and manipulation of genetic material, as well as the examination of the molecular level of biology. Gene
therapy is based on the transfer of DNA-based genetic material into an individual. This ‘gene delivery’ can be achieved through the direct administration of the packaged gene into the blood, tissue or cell (in vivo approach); or the packaged DNA can be administered indirectly via a culture of harvested tissue/cells, to which DNA is added and the genetically altered cells/tissue transferred back into the individual (ex vivo approach). Genetic vaccination seeks to extend the body’s immune response, so that it has both prophylactic (preventive) and therapeutic (curative) potential. These are polynucleotide vaccines, with either RNA or DNA, and work by the direct inoculation of specific pathogen genes whose products are immunogenic. These genes subsequently induce protective or neutralising immunity. DNA vaccines—seen as possessing the greatest potential in gene therapy—consist of a bacterial plasmid with a strong viral ‘promoter’, the requisite gene and a genetic transcription termination sequence. The plasmid is grown in bacteria (such as Escherichia coli), purified and injected into the target tissue. The DNA then expresses itself in the diseased/host body. Other devices to boost the immune system are sometimes incorporated into the vaccine. With increasing sophistication of the techniques in molecular pathology, it is expected that gene therapies will find more use in laboratory medicine. Molecular biology and genetics can even be used to monitor a patient’s status. Aging is one area where gene therapy is expected to make considerable contributions. Another recent approach is cellular transplantation. In cases where organ transplants fail (as happens frequently in liver transplants), hepatocyte (liver cell) transplant has been tried. These ‘transfected cells’ may be cultured in the laboratory and genetically modified to prevent the host body’s immune system from rejecting them. Cloned cells and organs are being tried as possible augments to or substitutes for organ transplantation. In the future gene therapy may evolve to include all types of drugs specifically designed to alter patterns of gene expression, and then treat complex diseases by adding or subtracting a single gene. We may develop what has been termed ‘individual genetic preventive medicine’ (Kresina 2001: 348). Physicians and healthcare workers will possess personalised genetic maps of their patients. These maps will enable physicians to identify the genetic interactions
that, along with other factors, cause disease. This means that medical intervention can begin very early—based on the genetic predisposition to disease—and prevent the onset of the disease. Routine gene mapping will facilitate advance knowledge about any genetic disease in a couple’s offspring, even before pregnancy. Genetic information which maps a patient’s response to drugs can be used to maximise benefits and reduce risks in therapeutic medicine. Physicians can choose the best possible drug based on the patient’s constitution—a feature of the new science of pharmacogenetics. From exclusive clubs through monogrammed sartorial accessories, we have entered the age of custom-made medicine: there’s one for everyone, and nobody else has the same one that you do! *** In conclusion, then, ‘cyberculture’ is a shorthand term, a concept-metaphor for what is essentially twenty-first century posthuman ‘technoculture’. ‘Cyberculture’ subsumes under itself a range of technologies, high-tech art forms and theories of social life. It is important to keep in mind that in a field as accelerated in its development as cyberculture and ICT, most of the technologies described here will be ‘upgraded’ or even become obsolete by the time this book is published! This is not to detract from the relevance of the technologies. Rather, it points to a condition of human society: every theory and practice has found its opposite or improved ‘form’ in its own generation. Thus Newton and Galileo had their detractors even when they were alive. Hindsight may be an exact science, but criticism and dissension are more necessary since they drive innovation. Thus future technologies—even if they are radical refashionings—are based on today’s ideas or technologies. Technologies, as we shall see, develop under certain conditions that are almost always non-technological. Culture mediates, informs, selects, and disseminates technologies. Medical science and astrophysics, domestic appliance technologies and computer software—each of these was born out of a perceived need for certain devices, notions and ideas to understand, explain and tackle the world and human life. If the history of ideas and the history of
consciousness elaborate these theories and concepts of human life, the history of technology and the philosophy of science map the concrete examples where these theories and concepts have been instantiated. Twentieth-century cyberculture is not, I hope to demonstrate, merely an exercise in scientific ingenuity and technological brilliance. As the book’s concluding chapter demonstrates, these revolutions in ICTs and related technologies are shot through with problems that are very rarely to do with the actual technology. These problems are frequently cultural, and ideological cultural criticism (whether of the Marxist, feminist, or the postcolonial variety) will need to address questions of technology even in nontechnological disciplines such as the humanities. In an increasingly technologised world, no discipline, critical approach or theory can afford to ignore the technological conditions that have transformed our lives. Thus, cultural studies of, say, the oppressed in First World nations must take into account their access to certain technologies (how many Blacks or Asian–Americans are employed with NASA?) and scientific education (how many Asian–Americans secure PhDs in genetic engineering from Harvard or MIT?). For far too long have the humanities, especially in their postcolonial phase, ignored technological matters in their (frequently) astute discussions of race, class and gender. For this reason, the work of a Donna Haraway, Stuart Hall or Katherine Hayles ought to—if one may risk using imperatives—become points of departure for contemporary cultural studies. In an age where entertainment, information, education, medical treatment, ‘pure’ research, marketing, politics and fashion are all hypertechnologised, a cultural studies programme/project that does not account for this technologising of culture loses its critical and political edge. This book reveals the intimate, even sensuous, link between these areas and their occasionally messy ‘encounters’ with technology. It tries to make a case that every cultural form or practice must be read in conjunction with others, and all read together with technology. While social forecasting is not the (conscious) intention of the book, I hope that the attention to particular technologies and their cultural ramifications provides what Paul Virilio termed the ‘archeology of the future’.
NOTES
1. The National Association of Software and Service Companies (NASSCOM) provides statistics for the growth of ICTs and related technologies in India and the world. In 1985 software exports were estimated at a net value of US $0.3 billion. By 2002 this had reached US $8.1 billion. It is projected to reach US $50 billion by 2008. India has 6 million cellphones and 7 million Internet users, with the number increasing massively every day (Editorial, Science, Technology and Society, 7[1], 2002. Also see Parthasarathy and Joseph in the same issue). 2. For a brilliant social history of modern consumption, see Rachel Bowlby’s Carried Away: The Invention of Modern Shopping. Bowlby points to an interesting use of technology in mass consumption and distribution. She notes that telephone salesmanship and shopping are now favoured modes. She argues that ‘many commodities […] sold through the telephone play upon a sense of personal vulnerability to the outside world’. People barricaded within their homes, not wanting to go out, are still canvassed and persuaded to buy products. The telephone becomes a penetrative, intrusive device that enables such consumption. Ironically, double glazing for windows (used in England as a barricade to keep noise out) ‘is sold by the one method that proves in fact that the product can’t do what it promises!’ (Bowlby 2000: 46–47). 3. Four basic structures are used to work with interactive multimedia: timelines, stacks, flowcharts and object-oriented environments. Timeline programs are useful for works that unfold over time. When made interactive, these become extremely sophisticated. For example, interactivity enables ‘jumps’ (nonlinearly) from one frame to another. More complex are compositions that are separate sections that happen to be placed in the same timeline but are not inherently related to each other in time (Virtual Galapagos, an interactive ‘tour’ and exploration from Terraquest, is an example). Flowcharts treat time not as a linear sequence of frames but as a waterfall that can be diverted into different channels. Different media elements are arranged in chart modules and appear when the program reaches that part of the order/sequence/hierarchy. As the program progresses down the hierarchical chart, it displays images and movie clips and plays sounds. Stacks are arrangements of data, where clicking on one link brings up the data required. That is, stacks are programs that link different pieces of data. The designer composes the work by arranging media in ‘cards’ and creating connections between them. S/he puts graphic interaction tools such as buttons on the cards and then sets up interactions either by following prompts or writing short instructions.
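The ‘stack’ structure described in note 3 can be pictured as cards holding media, with named buttons linking one card to another. The sketch below is my own minimal illustration of that idea in Python (the card names and media files are invented), not code from any actual authoring tool.

```python
class Card:
    """One card in a stack: a piece of media plus named links to other cards."""
    def __init__(self, name, media):
        self.name = name
        self.media = media           # e.g. text, an image file, a sound clip
        self.links = {}              # button label -> destination card

    def add_link(self, label, destination):
        self.links[label] = destination

# Compose a tiny stack of three cards.
home = Card("Home", "welcome.txt")
island_map = Card("Map", "islands.jpg")
sounds = Card("Sounds", "birdsong.wav")
home.add_link("See the map", island_map)
home.add_link("Hear the birds", sounds)
island_map.add_link("Back", home)

# "Clicking" a button brings up the media on the linked card.
current = home
for click in ["See the map", "Back", "Hear the birds"]:
    current = current.links[click]
    print(f"{click!r} -> showing {current.media}")
```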
CHAPTER TWO
ART, AESTHETICS, POPULAR CULTURE
The digitisation of the world has impacted variously on several fields: popular culture, journalism, advertising, education and literature. In order to understand the impact of the technology on popular culture one must first look at the way in which culture is produced and disseminated. With the development of telecommunications technologies there occurred a ‘fusion of all arts into one work’ (Adorno and Horkheimer 1979: 124). That is, all elements of production became integrated in order to produce ‘culture’. Such an integration is now termed ‘synergy’. All forms of entertainment—music, films, literature, recreational sports, television and various forms of art—are linked to each other. Entertainment companies synchronise these forms and their production. Four strategies of synergy are common: S Textual connections or synergies of software: A particular ‘talent’ (author, singer, performer) is first acquired by a company. Her/his art form/talent is then publicised (sold) simultaneously across a range of media, entertainment products and leisure goods: audio recordings (sound images), images in books/magazines (still images), advertisements and films (moving images), and in video and computer games. S Connections between software and hardware or between text and technology: Here the art form and the technology
are closely enmeshed. The art form (say music) must be delivered with as much speed and clarity as possible (hence the success of the Sony Walkman). S Convergence of previously distinct hardware components: Various cultural texts can now be coded into numerical data and stored for future transposition, modulation, mixing and reproduction. Thus music, films, fiction are all digitised and stored. Microprocessors now regularly combine various such forms to produce home entertainment (or multimedia). S Connections between technologies of distribution: Fibre optic cable, digital broadcasting, satellite transmission, information highways and the Internet all provide the technology to transmit various cultural texts/art forms over the entire globe. Thus film production, distribution, advertisement companies and telecommunications corporations all come together; art and technology, artistry and commerce conflate (Negus 1997: 84–86).
BOX 2.1 TECHNOREALISM

Technorealism is a perspective that argues that we need to look critically at the technologies that inform human lives. It suggests that we see all technology, however 'revolutionary' it may seem, as the continuation of similar 'revolutions' throughout human history. Technorealism lists eight principles:

(i) Technologies are not neutral.
(ii) The Internet may be revolutionary, but it is not Utopian.
(iii) The government has a major role in cyberculture and technologies.
(iv) Information is not knowledge.
(v) Wiring (networking) the schools will not save them.
(vi) Information wants to be protected.
(vii) The public owns the airwaves and should benefit from them.
(viii) Understanding technology should be an integral part of global citizenship (see ).
We can see that technology not only mediates the modalities of transmission of culture, but also alters the form that cultural products take. We cannot, therefore, speak of a cultural or art form without talking of the technologies (synergies) that have created the form. With this informing assumption, we can now look at some common cultural forms that contemporary technology has affected. One central aspect of contemporary technoculture is the predominance of the visual. As this book reaches its final stages, news reports arrive of the development of haptic—touch— technologies. People using data gloves and handling objects may soon be able to transmit their haptic sense of the object via the Internet to somebody who is online at that time. The Virtual Reality Laboratories at the University of Buffalo that developed the technology suggest that this will help users master skills—such as surgery, sculpture, even golf—that require precise application of touch and movement. (See ‘Transmitting the Sense of Touch’, The Hindu, 17 July 2003: 16.) Most of the new media technologies are essentially imaging technologies. Our culture is dominated by visual images. These images produce in us a wide array of emotions and cause us to invest the images with enormous power. Marita Sturken and Lisa Cartwright point out that foregrounding the visual in visual culture does not mean that we separate images from writing, speech, language, or other modes of representation. Instead, as contemporary art and advertising reveal, images are very often integrated with words (Sturken and Cartwright 2001: 4–5). The following is a short survey of the various representational practices that have been affected by the new technologies.
LITERATURE

Literature and literary studies have always anticipated, influenced and critiqued science and technological developments. It therefore comes as no surprise when we discover that the term 'cyberspace'—now ubiquitous—was coined by a novelist (William Gibson) and not by a technologist. Literature and literary
studies have been profoundly affected and modified in the digital age. The Internet and related technologies call for new ways of reading, writing and research. This section surveys the impact of ICTs on the literary scene. After looking at the various reconfigurations of the nature and role of author/reader/text, it concludes with a discussion on cyberpunk—the genre launched by William Gibson’s Neuromancer and one which most extensively documents the varied dimensions of contemporary technoculture.
HYPERTEXT

In order to understand the significance of hypertext, it is necessary to understand how contemporary philosophy and literary theory view a text. Thinkers such as Jacques Derrida and Michel Foucault—to mention only two 'practitioners' of what has come to be known as poststructuralism—emphasise textual openness and the infinity of meanings. The 'nature' of reality and the nature of a text are both redefined by poststructuralist thought. A text is a collection of words, which we read by distinguishing one word from another. Meaning is available only through this difference. Language is a system of differences (where the letter/sound a is different from b, and the meaning of the word cat is available only when we recognise its difference from the word bat). Thus a text is a field of differences. The world is constructed and available only in language which, as has been pointed out, is itself a system of difference. People are different from each other, and places, histories, cultures, relationships are all 'named'. That is, a word or sign (a signifier) is used to distinguish them, though the word/name has no direct connection with, and does not directly indicate anything about, the person or place (the referent). The world is available to us only (a) through words and signs, and (b) as a system of differing signs. What we describe as 'culture' or 'history' are all systems that require a language and a system of difference. Thus the parallel between reality and texts is startling. This leads Derrida to declare, in one of his most controversial statements: 'There is nothing outside of the text' (1976: 158). This simply means that all things (contexts or reality) take the form of texts, of language. Thus sounds, hand gestures, clothing, a restaurant menu,
physiognomies, road signs, scientific writing, music, movie images, computer graphics, a legal document, an obituary column, walking, relationships are all based on a system of difference and require signs/words to articulate this difference. The text is not, therefore, just the printed word. All signs are connected and in order to explain, say, a location, one uses words that signify place names, histories, people connected with the place and so on. For example, explaining ‘New Delhi’ to a foreign tourist requires the use of concepts, terms and names like ‘capital’, India-as-nation, North India, Parliament, President, seat-of-government, Old Delhi and so on. These are interconnected names/signs drawn from different aspects of reality—geography, history, political science, economics—all of which are read together to explain ‘New Delhi’. ‘New Delhi’ is thus a text that emerges in the language game of difference, where it is distinguished from and connected to these other texts. Reality thus gets textualised, and political, economic and social contexts all become connected, interlinked and mutually dependent texts (this linking is termed intertextuality). And our reality, or our ‘reading’ of reality, is a movement through this network of texts. What poststructuralism reveals, especially in the work of Derrida and the literary critic Paul de Man, is the self-referential nature of language. Language can no more be taken to refer to a reality out there. It refers only to itself. As we have seen in the example about New Delhi, in order to explain one term, ‘New Delhi’, we have to take recourse to several other terms. Each sign refers back to other signs, never to the space/structure of New Delhi itself. The word ‘cat’ in no way captures the reality of the animal. In order to understand the ‘reality’ of this animal, we need more words: a four-legged creature, of this species, possessing such and such features and so on. That is, these are words whose connection with the actual animal is purely arbitrary. This understanding comes from the work of the early twentieth-century linguist Ferdinand Saussure. Saussure (1966) argues that the connection between the word (signifier) and the concept behind the word (signified) is purely arbitrary. Where does it say that the word ‘cat’ actually describes a particular kind of animal? The sign has simply been taken to mean the animal. The reality of the animal does not in any way conjure up the word or name. It is only through repeated use, and by convention, that the word and
the concept go together. Thus language cannot be taken to represent reality faithfully. This is not to say that poststructuralists distrust language. What they distrust is the ability of language to represent reality accurately. They argue that language can only refer to other words, to itself. Our knowledge of the world, available only through language, is thus tenuous at best. Poststructuralist theories enable us to be critical of the representational nature of language. Most importantly, poststructuralism's interrogation of origins and essences ('truth' in language, 'the' meaning of a text, and so on) seems peculiarly suited to a medium as amorphous and open as the hypertext. While poststructuralist thought by definition rejects totality, essences and the truth value of representations, its insight into the ways in which language gets organised is useful in understanding a hypertext. I hasten to add that poststructuralism does not provide a complete theory of the hypertext. What I would like to highlight is the parallel between poststructuralist thinking about texts, genres and narratives, and the hypertextual condition. It must be kept in mind that poststructuralism's ideas of fluid forms were already anticipated in English literary texts such as Laurence Sterne's eighteenth-century novel, Tristram Shandy. Despite the enthusiasm of critics like George Landow and Katherine Hayles for the 'convergence' (Landow's term, used in the title of his book, Hypertext, 1992) of poststructuralist literary theory, contemporary physics and computer sciences, we need to proceed cautiously. For what may get lost in the easy (and, let's admit it, fashionable) 'application' of poststructuralism to hypertextuality is the former's particular politics and critical–political edge. Wholesale import of Derrida in reading hypertext fiction would then be on a par with the hopelessly parasitic and uncritical appropriation of Marx by Indian academics, who stake out a so-called emancipatory critical—and self-declared 'nationalist' or 'nativist'—stance in everything from literary to media studies while missing the irony in their use of a European Marx. With this cautionary note in place, we can proceed to draw out specific aspects of poststructuralist thought that enable us to understand the nature of electronic texts and genres. In his classic S/Z, Roland Barthes, a literary theorist writing in the 1960s and 1970s, describes an ideal text that has close parallels with contemporary digital texts. Barthes speaks of a text constituted by blocks of images or words linked electronically by multiple
paths. This text would be open-ended and perpetually unfinished because there would be no endings, only paths and networks that lead the reader from one section of the text to another. Barthes describes this kind of plural, multilayered text as follows: In this ideal text, the networks are many and interact, without any one of them being able to suppress the rest; this text is a galaxy of signifiers, not a structure of signifieds; it has no beginning; it is reversible; we gain access to it by several entrances, none of which can be authoritatively declared to be the main one; the codes it mobilises extend as far as the eye can reach, they are indeterminable … their systems of meaning can take over this absolutely plural text, but their number is never closed, based as it is on the infinity of language. (1975: 5–6) Barthes argues that an ideal text is made up of infinite symbols and words (signifiers), none of which provide the ultimate, definite or indisputable meaning. Every signifier leads the reader to yet another one. Meaning is thus never complete and perpetually postponed by the reader’s passage through whole networks. This description of an ideal text captures the essence of the hypertext. Theodor Nelson coined the term ‘hypertext’ in the 1960s to refer to electronic texts. Nelson defines hypertext thus: ‘Nonsequential writing—text that branches and allows choices to the reader, best read at an interactive screen … this is a series of text chunks connected by links which offer the reader different pathways’ (quoted in Landow 1992: 4). The slogan here (from Ted Nelson) is ‘everything is deeply intertwingled’ (quoted in Spalter 1999: 373). Ted Friedman defines hypertext as software that allows many different texts to be linked. A simple click of the mouse can bring up any new related document (Friedman 1995: 74). Together these definitions allow us to reach an understanding of hypertext. S A hypertext has several layers and the reader, unconstrained by the page (since it is read on screen) can follow any link, anywhere, and derive her/his own meaning. S As in Derridean poststructuralism, the hierarchic relation between main text and footnote, or main text and commentary, is broken down in hypertexts. There are no
marginal or peripheral texts here, and every text contains 'traces' of other texts. S With the potential of hypermedia and developments in transmission of various forms of data, the text can now be read with visual and sound images. Diagrams, musical accompaniments, graphics, annotations and maps are all available simultaneously to the reader. The reader does not need to physically shift from a traditional, print library to a music archive to discover the music that was used in the staging of Shakespeare's plays. S/he can obtain it easily through a marked link as s/he reads the text of the play on screen. Further, an image of Shakespeare's stage setting can also be accessed on the same screen. Commentaries and biographical information on the author, the actors and others are all available off the screen through a link, or simultaneously as 'voice-over'. S Hypertext, therefore, has no real boundaries. While a traditional text 'ends' with the covers or the shape of the page (the last line on it), a hypertext offers infinite texts—hovering just out of sight of the computer screen—as reduced 'windows' at the foot of the open window. The hypertext is thus a whole series of texts (where the notion of text itself must be taken to include the non-verbal as well). The hypertext is the ultimate dialogue, for every text is in perpetual 'conversation' with every other text. Jerome McGann argues that unlike the traditional print text, a hypertext disperses attention among several texts. Though every hypertext has a preferred set of arrangements and orderings (as links), these could be more decentralised. Thus a hypertext archive could create any number of centres and links, or relationships with other texts (McGann n.d., 'The Rationale of Hypertext' ). S A print text is read linearly. A hypertext's prime feature is its non-linearity. Since the reader is free to follow any route through the text, the text's linear organisation is of no importance. Espen Aarseth terms such texts 'ergodic'. Ergodic ('ergon', meaning 'work' in Greek, and 'hodos', meaning 'path') phenomena in physics refer to cybernetic systems where feedback loops exist. Each time the system
is engaged, it generates a new sequence. The signs do not have a pre-programmed sequence. The sequence of signs is one of many potential ones in the 'event space of semiological possibility' (Aarseth 1999: 33), chosen by the audience at that moment in time and in that space. Different audiences, therefore, at different times may experience the signs differently. Aarseth lists hypertexts, computer games, literature/story generators and dialogue systems as ergodic narratives. Thus, in Cayley's The Speaking Clock (a poem generator), the program generates a different verbal sequence every moment it runs, based on an internal clock in the computer. Words change letter for letter, and are replaced by others containing the correct letter in the correct time slot. Each word thus gets replaced in a continuous cycle (see also electronic poems such as Indra's Net at ). Marjorie Perloff argues that the most interesting poetic and other artistic compositions of the late twentieth century position themselves against the language of TV and advertising. New forms of poetry and composition emerge as a result—the poetry of John Cage is an example of the new 'artifice' of hypertexts (Perloff 1991: 19). S It also alters writing skills, and therefore has a considerable role to play in education. For instance, students may put out their essays on the Web to be read by a linked audience elsewhere. When the students are asked to write in a manner such that anyone can understand, they face the necessity of developing grammatical correctness, easy syntax and a fluid style. Further, when asked to evaluate similar essays from another country/culture (arriving via the Internet), the students are required to develop specific analytical skills. When faced with a diverse audience, students are forced to re-examine some of the skills they already have (O'Reilly et al. 1997: 195). S Tree Fiction is an advanced form of hypertext literature. Tree fictions use electronic linking to join fragments of text and create forks in the plot. However, the user cannot return to former decision points and there is no merging of paths. Every branch is thus kept separate from others. This effectively delivers a straight, coherent narrative from
start to finish, since there is no backtracking or merging. Once the user clicks on a specific branch/root, s/he has to follow that path alone, making contributions to the plot at specific points. S The Impermanence Agent () is a model for literature and art that is specific to the Internet and the PC. This is an artwork that operates as a function of the user's Web browser. When the Agent is engaged, it appears at the corner of the screen. It moves forward only as the user clicks on other Websites (those not associated with the Agent). The user cannot navigate the Agent's content directly. The Agent monitors the user's Web traffic and continually alters its story using the material drawn from the user's Web browsing. The entire story takes a week to tell. This makes the user an involuntary author of the story, 'composing' simply by clicking on Internet Websites that have no connection with the Agent.1 S Marie-Laure Ryan has an exhaustive list of features of the hypertext: ephemeral, reader freedom, emergent meaning, attention focused on language, text experienced as surface (surfing), decentred structure, free growth, local coherence, reading-as-aimless-wandering, play, diversity, chaos, heteroglossia (many registers), jumps and discontinuity, fluidity, interactive (Ryan 1999: 101–2). S Jerome McGann suggests that contrary to the idea that hypertexts have no 'structure', almost all hypertexts possess some order. McGann argues that the hypertext is structured around some initial set of design plans that are keyed to specific materials in the hypertext, and the imagined needs of the users of those materials. However, unlike a print text, a hypertext's form is always open to alterations of both content and organisation (McGann n.d., 'The Rationale of Hypertext' ). One interesting way of looking at the poetics of the database and the narrative is offered by Lev Manovich. Manovich suggests that the database—defined as a structured collection of data (2001: 218)—is the very opposite of narrative, which follows a cause–effect sequence in organising its data. However, with
multimedia things change. The new media, as Manovich puts it, ‘consists of one or more interfaces to a database of multimedia material’ (2001: 227). The ‘user’ of the narrative traverses a database, following links between its records as established by the database’s creator. An interactive narrative is the sum of the multiple trajectories through a database. A traditional linear narrative is one among many other possible trajectories. With this, traditional linear narrative can be seen as a particular case of hypernarrative (ibid.). All narrative, this argument suggests, is basically database. The significance of this form of text for reading habits can well be imagined and will be discussed in the section on the ‘Reader’. With this notion of a hypertext, we can now move on to discuss other aspects of literature and literary texts.
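Before moving on, Manovich's point may be made concrete with a small sketch. The fragment below is illustrative only: the 'lexias', the links between them and the trajectory function are invented for the example, and no claim is made that this is how any particular hypertext system is built. A database of records plus a table of links yields a different trajectory (a different 'reading') on each run; a traditional linear narrative is simply one such trajectory fixed in advance.

    import random

    # A toy database of records ('lexias') and the links between them.
    lexias = {
        "opening":   "A courier wakes in a rented room.",
        "flashback": "How she lost the implant.",
        "market":    "The night market, described.",
        "ending":    "She walks into the grid.",
    }
    links = {
        "opening":   ["flashback", "market"],
        "flashback": ["market", "ending"],
        "market":    ["ending"],
        "ending":    [],
    }

    def trajectory(start="opening"):
        """Follow links until a record with no outgoing links is reached."""
        path, node = [start], start
        while links[node]:
            node = random.choice(links[node])   # the reader's 'click'
            path.append(node)
        return path

    print(trajectory())   # one reading, e.g. ['opening', 'market', 'ending']
    print(trajectory())   # another run, another reading of the same database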
AUTHOR Michel Foucault, Roland Barthes and several other poststructuralist thinkers question the notion of the author. Speaking of the ‘disappearance of the author’, Foucault wonders whether, when we use the phrase ‘the complete works of Nietzsche,’ we also mean the laundry list, an address or an appointment that Nietzsche scribbled in the margins of his work. In short, Foucault seeks to understand exactly how we define ‘work’, and therefore ‘author’. Several issues about the role of the author need to be discussed here. In a hypertext, the multiple pathways and links deny the privileged position of the single, coherent authorial voice. The fact that a reader may use any link to reach any data about/within a text means that the author frequently disappears. That is, the beginning, middle and end that the author intended for the book may not be the sequence followed by the reader. The author’s authority in controlling the mode of reading is therefore destroyed in a hypertext. The most important impact of digi-culture is visible in the debates over narrative structure. A text or a plot has always been seen in Aristotle’s terms of a linearity with a logical beginning, middle and end. Critics such as David Bolter have suggested that certain kinds of literary texts anticipate the hypertext in their
non-linear, multidimensional, decentred-author structure. Examples of such texts would be Laurence Sterne's Tristram Shandy, James Joyce's Ulysses, and the work of Jorge Luis Borges, Robert Coover and Marc Saporta. Bolter argues that the modes of organisation of these texts have strong affinities with the non-sequential hypertext (Bolter 1991: 132–39). When a reader holds a printed book in the hand and reads it, the author is 'immediate' and 'alone'. This is because the reader's scope of vision is restricted to the sequence of words in that book, and therefore by that author. In hypertext, whose screen may simultaneously display several authors, commentaries on those authors, and graphic images, the reader is actually reading several authors at the same time. Thus the 'original' author, say, Shakespeare, loses his uniqueness when he is read along with his contemporaries, his critics, stage directions for his plays, and spoofs of his work. Shakespeare-the-author thus disappears into a conglomerate of Shakespeares (as though one is attributing a multiple personality disorder to him), where each of these other writers being displayed on screen becomes, in a sense, the Shakespeare text to the reader. The author, like the text, becomes multiple and fragmented. Also, since the author's status is controlled by the manner in which s/he is displayed and read on screen, the role of the author and that of the reader conflate: both are equally responsible for producing the text on the screen. Every text in hypermedia is a collaborative text. When an author in the print system is writing, he or she may give the written section to someone else to read, comment on or edit. This is also a collaboration which, however, takes place later and in a different space (the second writer's). As Landow points out, in a hypertext, 'the author who is writing now [is writing] with the virtual presence of all writers "on the system" who wrote then but whose writings are still present' (Landow 1992: 88, emphasis in original). Computer-generated literature is collaborative in another sense. The computer plays three roles in the man–machine collaboration: (a) it outputs a blueprint for the human partner to translate into literary language; (b) it produces text in a live dialogue with the user; and (c) it performs various operations on a human-created text. The result of such a collaborative effort may be a standard print text or an electronic one (Ryan 1999: 2).
In The Spot (formerly at ), an interactive soap opera, the collaboration between writer and reader becomes more direct. The Spot is about the daily life of a set of five people in a California house. A new instalment in the diary of the characters is posted on the WWW every day. Each character is 'played' by a writer who puts up the diary entry (though the writers/characters must decide, in advance, on a larger coherence). The visitor to the site can view pictures and send emails offering advice regarding the plot. The writer/character is then supposed to respond to the viewer's comments personally. Robert Coover, thrilled by the freedom from a linear and 'closed' narrative of the print text, writes enthusiastically about the new role of the author. He describes the 'allure of the blank spaces' of the hypertext, and is enthusiastic about the potential to mix registers and tone, style and content (and, of course, the visual and the verbal) that the new media offers (quoted in Landow 1992: 105). John Barber's 'cybernetic engines' for writers seek to 'use creative technology to promote the development of higher level learning skills, especially in fiction or poetry writing classes'. The cybernetic writing engine has three essential components: S Essay generator: This creates essays from randomly generated words. Writers can use it to see how unrelated ideas can be creatively combined to produce interesting writing. A new essay is generated each time the story page is loaded. S Quote generator: This produces interesting quotations that developing writers can use as the basis for further writing. S Prompt generator: This engine produces unusual thoughts and ideas. Writers can use it to find interesting ideas/projects/subjects that they can then write about (). The cybernetic engine redefines the author's role by making any 'work of genius' a mechanical act. The author's mind, erudition and vocabulary are only incidental, and may even be unnecessary to produce an essay or text.
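The mechanics of such an engine are simple enough to sketch. The fragment below is only an illustration of the general idea, a prompt generator that combines unrelated words at random; the word lists are invented and Barber's actual engines are not reproduced here.

    import random

    # Invented word lists; each run combines them into a fresh writing prompt.
    subjects = ["the archive", "a cyborg", "the last librarian", "an unsent email"]
    verbs = ["remembers", "dismantles", "rewrites", "haunts"]
    objects = ["the city", "its author", "a forgotten password", "the sea"]

    def prompt():
        return " ".join(random.choice(words) for words in (subjects, verbs, objects))

    # Each 'page load', as it were, offers the writer a new combination.
    for _ in range(3):
        print(prompt())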
BOX 2.2 STORYSPACE

Storyspace is a hypertext system for the Macintosh which has been designed specifically for writers. It was created by Jay David Bolter and Michael Joyce. Storyspace follows Bolter's terminology in calling nodes 'writing spaces'. These are displayed as scrollable windows on a desktop with pull-down menus and a toolbar. Writing spaces may contain text, graphics, sound, or video. Writing spaces can be viewed in three configurations. (a) The Storyspace View displays nodes in a global map; (b) the Outline View shows a vertical list of node names; and (c) the Chart View displays them in a horizontal flow chart. A magnification tool allows one to zoom in on an area of particular interest. Navigation is as easy as 'point and click'. Basic links may be made by drawing a line between two nodes and typing a label. A note tool automates annotation. A local map is provided through the Roadmap menu item. Paths may be named and saved. If there is more than one link from a node, priorities may be assigned to them by using 'guard fields'. A Storyspace Web is a network. Boolean connectives are provided. Writing spaces may hold only 32,000 characters, which limits import of other works. Keywords may be assigned to writing spaces. Several visual window types are possible. There is a manual, 'Getting Started with Storyspace'.
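The 'guard field' mentioned in the box is the one piece of machinery that may need unpacking: a link can be made conditional on what the reader has already visited. The sketch below is a schematic illustration of that idea only, not Storyspace's actual implementation; the space names and link labels are invented.

    # Each link: (label, destination, guard). The guard names a writing space
    # that must already have been visited, or is None for an unguarded link.
    links_from = {
        "garden": [("go to the attic", "attic", None),
                   ("open the letter", "letter", "attic")],   # guarded link
        "attic":  [("return to the garden", "garden", None)],
        "letter": [],
    }

    visited = set()

    def available(space):
        """Links offered to the reader from a space, honouring guard fields."""
        return [(label, dest) for (label, dest, guard) in links_from[space]
                if guard is None or guard in visited]

    visited.add("garden")
    print(available("garden"))   # the guarded link is withheld ...
    visited.add("attic")
    print(available("garden"))   # ... and offered once 'attic' has been read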
Story-generating programs were developed as early as the 1980s (James Meehan's Tale Spin is the most famous). A more recent work is Scott Turner's Minstrel (1994), which uses 10,000 lines of code to generate a plot line of a dozen different King Arthur stories of half-a-page length each. What this project demonstrates, at a small level, is the ability to turn a database into stories. Experimental poetry using computers is also increasingly available. A computer program creates texts out of already available texts by searching the database for elements that fit specific patterns (such as anagrams or rhymes). Using collage and various permutations, the program then fits these elements into a human-created template. John Cage, in I-V for instance, produces such poetry. A program creates acrostics by picking a word from a list and locating its individual letters in quotations randomly culled from another list. In the 'output' text, the capitalised letters of the acrostics are centred in the middle of the page on a vertical axis, surrounded by 'wing words' of the quoted texts.
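The acrostic procedure just described is easy to sketch. The fragment below is purely illustrative: the key word, the little pool of quotations and the column width are invented for the example, and it makes no claim to reproduce Cage's or any other existing program.

    import random

    word = "CAGE"
    quotations = [
        "music is simply trying things out",
        "every something is an echo of nothing",
        "a poem should not mean but be",
        "the world is everything that is the case",
    ]

    column = 40   # the vertical axis on which the key letters are aligned
    for letter in word:
        pool = [q for q in quotations if letter.lower() in q]
        line = random.choice(pool)
        i = line.index(letter.lower())
        line = line[:i] + letter.upper() + line[i + 1:]   # capitalise the key letter
        print(" " * (column - i) + line)                  # pad so it sits on the axis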
In a sense, hypertext achieves what poststructuralist theory—anticipating technologies of the future—has argued: the death of the author. With the control shifting out of the hands of the author into those of the reader, notions of authenticity and original text are rejected. Just as poststructuralism disperses meaning beyond the text's confines with its notion of intertextuality, the hypertext breaks up the figure of the author who now lies somewhere beyond the text. The author is a temporary point where several authors converge, and each of these authors is 'original'.
READER

As we have already noted, in a hypertext it is the reader who chooses where to go in a narrative. Roland Barthes spoke of a 'writerly' text where the reader is 'no longer a consumer, but a producer of the text' (Barthes 1975: 4). Several reasons make the hypertext such an ideal writerly text. The reader is in the same position of authority as the author. As we have seen in the preceding paragraphs, the reader can organise the narrative any which way, and is not required to follow the author's sequence. The reader can bring to her or his reading of the moment any other text which, when displayed simultaneously on the screen, suggests many texts and many authors without privileging any one. The experience of reading a hypertext is also unnerving at times; for instance, the reader may follow a link only to discover that it is a dead end. The hotlink gives no clue as to (a) the data at the end of it, or (b) the quality of this data, when it is available. Thus a certain degree of indeterminacy is introduced into the reading experience—something that is not very common in the print form. Stephen Pulsford compares this uncertainty over unreliable links to the superscript referring the reader to an endnote in a print text. One has to look for the right page with the endnote, and then again look for the exact point in the main text at which s/he stopped before going after the endnote (Pulsford in Browner et al. 2000: 180). In a sense, every reader of a hypertext is an 'informed' or competent reader. For a print text such as a poem, a professor interested in the area will know where to go for the allusions or
contexts. A less competent reader will not find this source. The Internet text of the same poem, however, offers a different schema. The ready-made links direct the reader to exact sources such as concordances, commentaries, dictionaries and reference material. Any reader is thus in a position to locate material previously available only to the specialist. However, this also poses a major problem. In the print version, only authoritative and reliable material is packaged and circulated by and among scholars. The student can therefore approach and access this material with a reasonable amount of confidence in its quality. The Internet hosts thousands of sites that document poor quality material. These sites are hosted without the supervision and referee systems common to most publishing houses. A student looking for material is as likely to go to one of these second-rung sites (since the link does not provide any information about the quality of material offered) as to consult a scholarly, printed work. However, it must be noted that the information available on the WWW is not necessarily useful to students. In fact, the concept of 'information overload' is extremely relevant to contemporary knowledge-seeking quests. The excessive amount and range of materials available makes it difficult for the reader to adjudicate and select what might be most appropriate from them. In fact, by a curious irony, the WWW has reproduced an anxiety of the medieval and early modern period. During the sixteenth and seventeenth centuries the voluminous production of books caused scholars to ponder the problem of information overload. Conrad Gesner, who authored the first catalogue of all the known books in the world in 1545, Bibliotheca universalis, in fact bemoaned the 'confusing and harmful abundance of books' (quoted in Blair [2003: 11]; also see accompanying essays in Journal of the History of Ideas, 64[1], for studies of 'information overload' in the early modern period in Europe).
CRITICAL THEORY

Landow's Hypertext discusses several important similarities between critical theory and digital culture. Landow notes that 'network' is a term and concept that occurs with remarkable
frequency in both. Landow's summary is worth noting in some detail here. He suggests four meanings of network that appear in descriptions of hypertext systems: S Individual print works when converted to hypertext become blocks (called 'lexias' or 'textrons') joined by a network of paths and links. Network here is a hyperlinked electronic equivalent of the printed text. S Any collection of such blocks/lexias also becomes a network or Web. S Network also refers to actual links between computers through cables or such means (e.g., Local Area Networks). S Network takes on the status of a meta-term when it stands for all terms such as docuverse and infoworld. Information constitutes this network (Landow 1992: 23–25). Thinkers such as Barthes, Derrida, Bakhtin, Foucault and, more recently, Landow, Michael Joyce, David Bolter and others have popularised the notion of the non-linear form. Thus narrative—which is central to poststructuralist thinking about the world—is now multilayered, composed of several units of which none is privileged, and each is connected to other units in an endless chain. Much of this theory uses a rhetoric that readily applies to hypertext systems and digital culture. Another realm of transformation within critical theory and criticism is the marking of electronic texts. Using SGML (Standard Generalized Markup Language), anyone can edit an electronic text. SGML's flexibility allows the editor to describe and categorise a text. For example, a poem can be marked into line, stanza, quote, footnote, title: users can customise the appearance of the text on the screen. For literary scholars this is a great development. They can now isolate figures of speech, such as all metaphors for death in a Shakespeare play, or the speeches of a single character in a novel. The case of 'the housewives from Arizona SGML tagging team' is a famous example. These two women spent years 'tagging' Bram Stoker's Dracula (1897). They identified the speaker of every line of dialogue in the novel. They tagged journal entries, newspaper articles and indirect quotes, chapters, paragraphs and pagination. This massive project now enables any reader to find exactly what s/he is looking for: a particular speech, a report or a name in the text. This is an illustration of intertextuality since the women have actually created a series of texts for Dracula (more details can be had from Gloria McMillan's 'Playing the Dracula Tag' ).
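What such tagging makes possible can be shown with a very small sketch. The fragment below is illustrative only; the tag names, attributes and sample entries are invented for the example and are not taken from the Arizona project or from any published edition.

    import xml.etree.ElementTree as ET

    # A scrap of text marked up in an SGML/XML-like way.
    marked_up = """
    <chapter n="1">
      <entry type="journal" author="Jonathan Harker">
        <p>3 May. Left Munich late in the evening.</p>
        <speech who="Dracula">Welcome to my house.</speech>
      </entry>
      <entry type="journal" author="Mina Murray">
        <speech who="Mina">I have been working very hard lately.</speech>
        <speech who="Dracula">Listen to them, the children of the night.</speech>
      </entry>
    </chapter>
    """

    tree = ET.fromstring(marked_up)

    # Isolate every line of dialogue assigned to one speaker.
    for speech in tree.iter("speech"):
        if speech.get("who") == "Dracula":
            print(speech.text)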
Contemporary critical theory rejects the notion of a text's connection with the world. In other words, a text does not reflect the real. Instead, poststructuralism argues, a text is merely self-referential: it refers back only to itself. The distrust of language in contemporary theory coincides with postmodernism's idea of the simulacra. Reality, according to thinkers such as Jean Baudrillard, is only available to us as images (or 'simulacra'). In fact, these images are the only reality we know. What lies behind the image is more images. The distinction between reality and simulated versions of reality breaks down when all life appears to be an advertisement in neon lights (see section on Baudrillard in Chapter 1). The Internet, with its sophisticated technologies of delivering images of reality—the collage, virtual reality, nanotechnology—presents reality as a sequence of images. The movement from link to link, from sense to sense (eye to ear to skin) is mediated throughout by technology. Human interaction is itself mediated through machinery, electronic circuits and sound bytes. The distinction between theatrically staged art/action/events (the WWF fights, for instance) and reality is blurred. Since we live in a science-fictional world (as Bruce Sterling puts it), contemporary life is a series of images. There is no 'reality' that we can see. Instead, the images we see are the reality. Like poststructuralism's notion of all language as 'figurative', all reality is image.
Playfulness and Online Communication

At first sight playfulness as a topic in critical theory and technology seems misplaced. However, Brenda Danet's extraordinarily thorough work on the language of the Internet in Cyberpl@y: Communicating Online (2001) is essentially an exercise that illustrates many contemporary poststructuralist and postmodern notions of the playful, open-ended 'text'. Danet's work on email, e-greetings and virtual theatre is worth discussing in some detail because she suggests ways in which notions of texts and language need to be 'redone' with the arrival of new technologies.
Online communication can be person-to-person (private email), person-to-group (listserv or Internet Relay Chat [IRC]) or person-to-remote-computer (as in accessing the WWW). Danet suggests five principal reasons for the playful nature of digital communication: the qualities of the medium (ephemeral, interactive, ‘immersive’), release from physicality, the frontier-like nature of cyberspace, the influence of hacker culture, and the masking of identity. For Danet typed online communication lies between speech and writing, yet is neither. She describes it as a paradoxical medium: both doubly enhanced and doubly attenuated (2001: 10–11). It is oral, especially in interactive or synchronous modes, insofar as it is dynamic and improvised. It is attenuated speech because the nonverbal and paralinguistic cues that contribute to meaning in ‘real time’ conversations are absent here. It is enhanced writing because it is more dialogic—we can establish immediate communication with the writer (an option not available to the print medium). It is also attenuated writing because the letters (pixels) of text are no longer physical objects. It is thus both doubly enhanced and doubly attenuated. A summary of the common features of digital writing, as provided by Danet, includes: multiple punctuation, eccentric spelling, capital letters, asterisks for emphasis, written-out laughter, descriptions of action, ‘smiley’ icons, abbreviations, and everything in lower case (ibid.: 17). Typed chat, for Danet, is playful performance for which there are customised desktops and screensavers. Danet also predicts that the speech-like, informal email style will increasingly characterise public as well as two-person communication. Extending her reading of the play of cyberspace, Danet looks at IRC performances of Hamnet, a parody of Shakespeare’s play by a group called Hamnet Players, who experimented with virtual theatre in 1993–94. The players played with Shakespeare’s text (coming up with such outlandish lines as the mathematical rendering of the ‘To be or not to be’ soliloquy as ‘2B I!2B’, the mathematician’s way of saying ‘is/is not’!), with the roles of the actors and audience, the frames of the interaction, with IRC software, and with language and norms of decorum (ibid.: 129–31. Details of other virtual drama projects are available at ). Digital greetings, likewise, are also play (e.g., Mark’s Apology Note Generator
). However, there is also plenty of death on the Internet, as the free site Virtual Memorials suggests. ASCII art is an early form of play with computers. This is text art—i.e., art made from elements of text. ASCII art is woven line by line, from the upper left, in the slots where letters and other symbols are normally typed. Visual poetry—such as by Komninos Zervos (for other examples see )—plays extensively with typography. Danet lists several categories of ASCII art: representational images, shaped texts, calendars, stylised lettering, greetings, humour and others (Danet 2001: 210–11).
PEDAGOGY AND RESEARCH

The implications of the digitisation of print for literature, literary studies and research are only beginning to dawn upon scholars of literature. Several issues arise in the context of newer forms of text and modes of reading.
E-texts

Most of the canonical literature from before the twentieth century is available free online. Archives such as Project Gutenberg (), Women Writers Project (Brown University, ), the Electronic Text Center (), Renascence Editions (), Oxford Text Archive (Oxford University, ) and more specific author sites (such as The William Blake Archive—easily one of the best and most scholarly sites on the Net—) provide the entire works of authors, anthologies and even works-in-progress. However, not all sites provide all authors, and most twentieth-century texts with their copyright regulations may not be available. There is the usual problem of definitive and authoritative editions even for e-texts, and several guidelines exist for literary scholars to check on a Website before using the material hosted therein. American Studies International (), Victorian Institute Journal, Britannica Internet Guide and a less
detailed survey by the PMLA in its Internet News often evaluate sites and Internet resources. Peer-reviewed journals such as Romanticism on the Net () also provide links to reliable sites in the field. Journals such as Postmodern Culture—one of the first online journals ()—provide a list of sites on postmodernism. The free availability of texts leads to interesting developments in teaching and learning literature. 1. In the case of actual 'real-time' teaching, the instructor can use a projected image of a text (in the case of a non-wired classroom). 2. In a wired classroom, one station—the instructor's—can control all other stations, thus providing access to all students. 3. Electronic discussion lists facilitate the continuation of classroom discussions. More importantly, an electronic discussion list allows even the most diffident to speak, without the risk of being shouted down or receiving aggrieved/mocking/sceptical looks from fellow discussants to intimidate and discourage them (as happens in real-time seminars). The use of MUDs or Multi-User Dungeons (which allow real-time conversation between many people), MOOs or Multi-user Object-Oriented environments (which allow real-time exchange of graphics in addition to texts) and IRCs or Internet Relay Chats (which allow one-to-one conversations) allows the teacher to engage in email exchanges with every student in the course, set up a chatroom or discussion list, use online discussion materials and so on. 4. In some cases an archive may allow the teacher to run a virtual seminar and create a 'lesson' (usually termed a 'path') through the electronic material. This can be in the form of annotations, comments, further queries, and so on. The lesson, stored in the archive, is then available to everyone. A good example would be WOMDA, the Wilfred Owen Multimedia Digital Archive (). The site displays the paths already created by courses offered at/by various institutions such as Oxford University and Minnesota University. Three links are displayed: British Poetry of the First World War, JTAP and Narratives of War. Each of these takes one to some extraordinary visuals of war scenes and a commentary/notes on the sidebar.
The site itself facilitates a search and browsing of the archive. This includes Owen’s manuscripts (from the Oxford English Faculty Library), photographs and film clips of the Western Front, audio clips from interviews with veterans, and the full text of Owen’s war poems (from the Stallworthy edition). This last provides links to images of the manuscript pages. 5. The Internet will transform teaching through some important deviations that it offers for students of literature: (a) Simultaneous availability of several versions of, say, Whitman’s Leaves of Grass or a Dickens novel. This causes a rethinking of the idea, which the print version induces, of the unified, authoritative literary work. (b) The increase in sheer number of texts alters the canon, or even provides an alternative canon. For example, Renascence Editions, by putting out such texts as Bathsua Makin’s Plea for the Continuation of Education of Gentlewomen (1673) and other women’s polemical tracts, radically revises views of the Elizabethan age simply by providing such rare material free online. Some ideas about the canon, canon formation, and the effect of the new technologies are listed here: (i)
To even call any ‘text’ a work of ‘literature’ is to presume standards of judgement (regarding quality). (ii) Canonisation of a text lends it a certain status (such as being included, say, in the Norton anthology), and becomes a guarantee of quality and aesthetic appeal. (iii) Canonisation is a process that involves social, economic, political and educational practices, none of which can be disentangled from the other. (iv) The canon focuses a reader’s attention by showcasing or highlighting a text. The privilege of being in a canonical anthology means that the reader’s attention is willy-nilly drawn to the texts therein.
(v) Further, since canonical texts are taught in literature courses, a wealth of commentary, analysis and criticism on the text is produced. This kind of cyclical dynamic reinforces the status of a text: a text with more material on it (say Shakespeare) is canonical. However, thinkers such as Claire Warwick suggest that an electronic canon is already forming—that the more canonical a writer, the greater the chance of her/his texts being available online (see Warwick 2000). (vi) With contemporary technology, criticism and reviewing—two important contributors to the canon—are no longer restricted to academics and professional reviewers. Putting aside the question of scholarship in reading/evaluating for the moment, we can say that the Internet enables any reader to respond to a text. For example, Amazon.com advertises new books with a call for reviewers. Anyone can send in a review and see it on the Website, along with details of the book. Discussion groups also mark a significant shift in what the critic Stanley Fish famously described as the 'authority of interpretive communities'. Richard Sears rightly describes this as a 'leveling', where 'ordinary people read great books and comment on them, even love them'. As Sears puts it: 'There is, after all, no particular evidence that academics enjoy reading books more than non-degree-seeking people do' (Browner et al. 2000: 31). This greatly alters the canon—one that was installed by a dynamic between writers, readers, reviewers, critics and academics. (vii) The electronic classroom textualises classroom discourse. It puts the student response on a par with those of the instructor and the author. It thus radically alters the relationships involved in the production of knowledge. The poet, the student and the senior professor of English are all equal
in an electronic classroom. It produces what John Slatin terms 'meta-knowledge': knowledge about the way in which the class itself works or, to put it in other words, knowledge of themselves as participants in an ongoing conversation (Slatin 1994: 32). (viii) Students need not read texts in any chronological, thematic or geopolitical scheme (as arranged in the syllabus). Thus Shakespeare can easily be read—through hypertext links—with Salman Rushdie and Les Murray. This not only alters the reading scheme offered by canonical English literature courses, but also enables a student to draw comparisons between texts, as well as between traditions, cultures, canons and contexts, that would otherwise be impossible if the reading were restricted to the chronology/texts of the syllabus. (ix) The instructor is also free to draw upon diverse texts/contexts and periods to form courses in hypertext. What appears to be both impossible (considering the amount of library resources, photocopying facilities and so on) and bizarre can actually be achieved with hypertext. (c) An increased use of visual images is now possible. For instance, the WOMDA site can be used to project film clips from World War I alongside a Sassoon or Owen poem. Where illustrated print editions of, say, Blake or film clips are expensive to acquire, the free availability of such visuals enables multiple levels of reading. (d) This also means that literary studies can become more (easily) interdisciplinary. Inclusion of visuals, historical documents (WOMDA hosts a range of WWI documents, for example), speeches, diaries, family records, statistics such as demographic shifts, voting patterns, urban planning figures, architecture, music and so on plays an important role in altering the canon because the reader can now connect a literary text with any other canonical or non-canonical (con)text. Thus a student writing on
literacy in early modern England can now utilise texts written by women, which have not been anthologised in standard compilations on the area. Thus two aims are achieved through the interdisciplinary approach of the new media: (i)
It makes a comment on the process of canon formation whereby women writers have been excluded from the more prestigious collections. (ii) It enables commentary on new texts, and thus moves them into positions of public recognition and critical debate—an essential component of canonisation. (e) Students can now be asked to find contextual or primary material from the data on the Web. These assignments may be simple contemporary reviews of a particular author, or more broad-based such as the context of voyages in seventeenth-century English literature. Stephanie Browner lists possible hypertext assignments: (i) Webliographies: a hotlinked bibliography. (ii) Annotations: where the teacher puts a section of the text into hypertext form and students write their comments as links. (iii) Student home pages: where students put down their ideas about a course, with their own essays and comments. (iv) Web essays: the traditional essay posted online. (v) Websites: student created sites, usually collaborative, where links to useful sites, essays and the work of different classes in a course over the years can be placed (Browner et al. 2000: 142–43). (f ) Internet databases such as JSTOR or Project Muse or the MLA Bibliography, available on subscription, enable students and researchers to access the full text of articles from major journals. Further, trips to musty libraries and rare-book collections can be avoided because within the subscribing network, any student/researcher can search,
view, print or download the material in the comfort of her or his room, provided s/he has a PC hooked up to the Web. (g) Online public libraries provide the scope for greater dissemination of information. Betty Turock, former Director of the Graduate Library Science Program at Rutgers University and President of the American Library Association (the oldest and largest library association in the world) begins with the assumption that the electronic frontier must enhance business as well as public interest. She suggests that four national policy issues will decide whether the Internet, and hence the information world, remains free: (i)
Universal service: where libraries designated as ‘universal service providers’ will enable greater service. (ii) Intellectual freedom: with related aspects of indecent material, censorship, politically subversive material and others. (iii) Intellectual property rights: with the freedom of the public to reproduce material for educational purposes. (iv) Equity: where people have just, equitable and affordable access (O’Reilly et al. 1997: 167–68).
ARCHIVES

The Internet transforms the very notion of an archive. Ethnographers, as Sarah Pink has demonstrated, have a great deal to gain by using the new technologies. In the areas of literary and cultural research, digital archivisation can change the very nature of critical inquiry (Pink 2001: 155–75). As a simple example, let us look at a researcher from a country such as India researching eighteenth-century colonial India. The India Office Records of the British Library, London, contains some of the most valuable and rare British documents of the period. Much of this material remains beyond the reach of such a researcher unless s/he can find funding to visit this august space. However, if this wealth of
material is digitised and placed online—even for a fee—research is radically altered. The scholar accesses the Website, searches for the relevant links, and reads/downloads material. An archive is thus no longer spatially located: in cyberspace the geographical distance between, say, a researcher at the University of Hyderabad (India) and the British Library (London) becomes irrelevant. Perhaps this heralds a more egalitarian distribution of intellectual resources. In August 2003, the Andhra Pradesh State Archives declared that its entire collection of manuscripts and documents, dating back to 1724, in Urdu, Persian and Telugu, would soon be available via the Internet. The digitisation of these documents will mean that the material will be available to scholars the world over. With automatic translation software, it will even be available to users unfamiliar with these languages. The question of the archive becomes especially relevant for generally inaccessible material such as classics, banned books or even out-of-print works. A site such as hosts a history of censorship, and provides links to censored authors and texts. Books such as Leopold Ritter von Sacher-Masoch's Venus in Furs (Sacher-Masoch being the man who gave rise to the term 'masochism' for his taste in being whipped and beaten)—rarely available in libraries and bookshops—can now be downloaded for free from here. The Christian Classics Ethereal Library () is a huge index to several religious non-fictional texts, fiction, hymns and reference sources. Medieval and classic collections are available at , <http://www.literature.org/works> and the Online Medieval and Classical Library (). A large amount of pre-twentieth century poetry is available at . There are also author-specific, genre-specific and period-specific sites.
PROBLEMS OF HYPERTEXT

Students and other readers of hypertext fiction do have problems with the hypertext. Jennifer Bowie, at Texas Tech University, in her essay 'Student Problems with Hypertext and Webtext' () lists the following problems that her students admitted to facing: S Getting lost due to too many paths or the several million possibilities available. S Getting lost in the links. S Uninteresting content. S No closure with the texts. S Possibility of missing information, or being unable to backtrack. S Not entertaining or popular. S Medium not user friendly. S Story interpretation issues. S Standards of hypertext not up to what they expect of 'good' literature.
LANGUAGE AND ELECTRONIC TEXTS

With a new medium, a new form of writing, and therefore of language, emerges (as Brenda Danet has demonstrated). As Marie-Laure Ryan points out, electronic literature is the latest episode in the quest for a new language (Ryan 1999: 11). Ryan lists the following developments with computer technology: 1. Language, especially in multimedia and VR performances, combines visual spectacle, music, song, text and dance. It is the nearest to a 'total' language. 2. Even the non-alphabetic qualities of language—such as shape, size, colour and movement (this last especially in dancing letters, so commonly used to flash messages)—are available to the viewer/user. 3. The public text turns into private language. No two people reading cyber- or hypertexts see the same set of signs. The viewer can choose plots, reinvent an identity for her/himself, as well as customise the text (through cutting, pasting or annotating). 4. Visual poetry (John Cayley's, for instance) exploits the spatial arrangement of the words on the screen. The language in hypertexts thus acquires a new dimension altogether. 5. The text is literally a matrix of several texts, and a self-renewing entity (ibid.: 11–15).
THE DEATH OF PRINT?
Daniel Okrent, editor-at-large, Time Inc. (and former editor of Life magazine) believes that the new technology signals the imminent death of print. Okrent points to two important causal factors in the death of print: (a) technological advances will ensure faster delivery of, say, newspapers, but computers will become faster, lighter, more functionally advanced and cheaper, so that eventually it will be more practical to read a newspaper online; and (b) it will be cheaper to produce and circulate a newspaper online (Okrent). John Dujay, in his 'Will "E-Books" Make Paper Ones Obsolete?', lists the advantages of the e-book over the printed one:
Technical Advantages
• When nearing the end of a large volume, it is difficult to hold back hundreds of pages with the fingers. There are no such problems with an e-book.
• If the reader is unsure of a word's meaning, s/he needs only to click and a definition appears.
• If, while reading, the reader forgets who a particular character is, a single click will provide information about the character.
• Audio narration accompanying the scrolled words—which light up—provides an interesting 'version' of the printed book.
Economic Advantages
• For publishers of shorter works, the e-book format is economically more viable.
• We need not buy the expensive scholarly journal or magazine in order to read the one essay required.
• We need not buy an entire volume of an author's work or an anthology to read one piece by our favourite author (Dujay).
In any case, what is certain is that reading a text will never be the same again. However, it is possible that the sentimental reasons for reading a book—the feel of a new book with all its smell and texture, dog-earing a favourite volume—may help maintain the newspaper and the popular thriller in their present forms for some more time.
CYBERPUNK
The Marxist critic Fredric Jameson, in his influential essay 'Postmodernism, or the Cultural Logic of Late Capitalism', argues that we need 'radically new forms' of art to describe the current historical conditions of multinational capital. Such new forms would contain possibilities of a new kind of 'political art', one which is neither nostalgic about the past nor merely represents the new world. These new forms should, Jameson maintains, enable us to 'understand our position as individual and collective subjects', so that we can act and struggle (Jameson 1991 [1984]: 54). Such a new form of art—seeking to understand contemporary technoculture, the role of the human being within it, the crisis of subjectivity and the possibility of radical politics and liberatory action—evolved out of the SF tradition.
BOX 2.3 WILLIAM GIBSON
The man behind the term 'cyberspace', and the acknowledged pioneer of cyberpunk, William Gibson is perhaps the single most influential novelist of the last decades of the twentieth century. Neuromancer famously defined cyberspace as 'a consensual hallucination experienced daily by billions of legitimate operators … a graphic representation of data abstracted from the banks of every computer in the human system'. Burning Chrome—a collection of short stories—provides several themes for cyberpunk. 'Johnny Mnemonic' is about a man whose mind has been reprogrammed into a computer to store data, which he himself cannot access. 'The Belonging Kind' presents women who alter their bodies as they move from one spot to another. 'The Winter Market' depicts the body as a vehicle for experiencing dreams edited into Hollywood thrillers. Gibson's work has provided definitions, concepts and themes for writers as diverse as Bruce Sterling, Pat Cadigan and Rudy Rucker. Neuromancer—a portmanteau term suggesting both
necromancer (magician and raiser of the dead) and neuro (the nervous system)—presents Case, who is an 'intellectual punk' (Spinrad 1990: 111). He is a prototype of cyberpunks in later fiction, where the neuromancer punks are magicians who interface with electronic systems/realms just as traditional shamans interface with mythic realms through drugs or trances. Case is 'an electronic necromancer in a black leather jacket and mirror-shades' (ibid.: 112). Neuromancer presents the central themes of cyberpunk: (a) the contrast between 'meat' and the synthetic/prosthetic body; (b) the relationship between human beings and computers; (c) the transformation of time; (d) the (excessive) wealth of data; (e) the paranoia about some mystical controlling force in our lives; and (f) the fear that we (humans, with emotions and passions) are being supplanted by the computer-generated consciousness that we ourselves have replicated through technology. In Idoru (1996) a rock star, Rez, falls in love with Idoru, a computer-generated 'construct'. The blurring of reality and illusion is brought home when Rez decides to marry this construct. Colin Laney in All Tomorrow's Parties (1999) is able to actually live cyberspace/information. He recognises nodal points that emerge in the flow of information, and is thus able to predict events that will emerge from these nodes. In Pattern Recognition (2003) Cayce Pollard is a 'cool hunter'. She recognises emerging patterns in the fashion industry simply by looking at what people are buying and wearing. Pathologically allergic to labels herself, Pollard advises companies on the effectiveness of their logos. Pollard is one of a long line of information-living Gibson protagonists—Case, Bobby Newmark, Colin Laney.
Cyberpunk, a new genre of fiction, began to emerge in the mid- and late 1980s. The term was first used by Bruce Bethke in a short story called 'Cyberpunk' in Amazing Stories in 1983. Cyberpunk's themes are anticipated in several SF novels of the twentieth century (Cavallaro 2000: 1–13), but it begins to emerge as a coherent body of work only with William Gibson's Neuromancer (1984). After Gibson, several writers (Bruce Sterling, Pat Cadigan, Lewis Shiner, Sue Thomas, Laura Mixon, Amy Thomson, Melissa Scott, Lisa Mason and others) began writing a new kind of novel, and their works have several features in common. Cyberpunk fiction is concerned with the 'organisation of information as virtual spaces and the nature of virtual bodies'
(Jordan 1999: 25). The first is an attempt to understand cyberspace, its 'constitution', extent and features. The second is to understand its implications for the human race (ibid.: 26). In short, it is an exploration of cybersociety and even, perhaps, social and cultural theory (Featherstone and Burrows 1998 [1995]: 8; see also Jameson 1984 quoted earlier). As William Gibson puts it in an oft-quoted interview: 'When I write about technology, I write about how it has already affected our lives' (Gibson 1991: 274, emphasis in original). Bruce Sterling, novelist and the editor of the first cyberpunk anthology, Mirrorshades, provides in his preface an important assessment and introduction to cyberpunk as a literary genre. Cyberpunk, for Sterling, is a 'modern reform' and a natural extension of elements already present in SF (Sterling 1988 [1986]: xv). The writers of cyberpunk, Sterling argues, live in a 'truly science-fictional world' (ibid.: xi). Cyberpunk, according to him, is 'an unholy alliance of the technical world and the world of organised dissent—the underground world of pop, visionary fluidity, and street-level anarchy' (ibid.: xii). The human mind–computer/databank interface and the 'storage' of human personalities are typical cyberpunk themes. Cyberpunk has close parallels with postmodern art forms. Both possess the following features:
1. The breakdown of distinctions between 'high' or 'classic' art and 'low' forms such as popular romances and detective fiction (Fitting 1991: 306).
2. The mixing of genres: romance, sci-fi, fantasy, utopian fiction, detective fiction, psychological thrillers. In fact, critics like Sharona Ben-Tov argue that cyberpunk retains the ideologies of sci-fi. In Ben-Tov's colourful phrase, 'Instead of a technological feat like the Voyager spacecraft, cyberpunk promises us a chip in the brain to make us think we're on another planet' (Ben-Tov 1995: 177).
3. The blurring of boundaries between real and unreal, fact and fiction, and this world and cyberspace. The hallucinatory, 'simulated' environment of postmodern art is emphasised here.
Brian McHale, in his perceptive essay 'POSTcyberMODERNpunkISM', points out that cyberpunk borrows heavily from postmodern fiction (such as William Burroughs' cybernetic trilogy, The Ticket that Exploded, The Soft Machine and Nova Express, and
Thomas Pynchon’s Gravity’s Rainbow or V), which itself has ‘sciencefictionised’ elements. McHale is here tracing a feedback loop connecting sci-fi, postmodern fiction and cyberpunk (McHale 1991: 308–23). Some of the essential features of cyberpunk—which, one must emphasise, it shares with SF and a wide variety of SF cinema—are summarised below. As we shall see, these features identify cyberpunk as one of the most representative genres of contemporary technoculture: 1.
2.
Technology: Cyberpunk writers begin with the assumption that the technological dreams and nightmares of the previous generation are already real now. Cyberpunk is an attempt to negotiate and appropriate this technology before all human beings become mere software, ‘easily deletable from the hard drives of multinationalism’s vast mainframe’ (McCaffery 1991: 12). Cyborgs: The cyborg represents, for cyberpunk, that indeterminable ‘creature’—man–animal–machine—which makes us rethink our analytical categories. (a) When everybody is prosthetic (even a pair of spectacles are a prosthetic device), then all humans are in a sense, cyborgs. In several cases this fiction presents a dystopian vision of such a blending of man and machine. Sterling declares: ‘Technology is visceral [for cyberpunk writers]’ (1988: xiii). (b) The debate, as Kevin MacCarron points out, is really about what is human, and its opposite, the inhuman (McCarron 1998 [1995]: 265). Gibson’s fiction, for instance, is the vision of a ‘nonnatural future, when our organic nature and shared biological origins and history with other creatures on this planet will have been superseded by the hybrid cyborg forms’ (Fitting 1991: 304). (c) The furore over cloning or implants is because the distinction between ‘original’ and ‘copy’, and human and animal is erased. Anything that resists categorisation by occupying more than one ‘position’—both this and the other—is frightening. Cyberpunk’s parallels with literary categories such as the grotesque emerge clearly here.
3. The body: In most cases, the fleshly human body is considered a limitation. (a) There is frequently a revulsion for human flesh (hence Stelarc's slogan: 'The body is obsolete'). (b) There are prostheses, organ banks, genetically modified bodies and cosmetic surgical procedures to 'rectify' the human body. Body modification is a central theme in all cyberpunk. Here is Gibson's (1984) description of a human body in such a 'posthuman' (a term that Bruce Sterling popularised) age: 'Ratz was tending bar, his prosthetic arm jerking monotonously […] the antique arm whined as he reached for another mug. It was a Russian military prosthesis, a seven-function forcefeed manipulator, cased in grubby pink plastic' (Neuromancer: 3–4). In Count Zero Gibson describes the manner in which a man is put back together: 'They cloned a square of skin for him, grew it on slabs of collagen and shark cartilage polysaccharides. They bought eyes and genitals on the open market' (1994 [1986]: 9). David Skal's Antibodies presents characters who are 'robopaths': they believe they are robots trapped inside organic bodies (meat). One of the central motifs for characters in Antibodies is, therefore, the crisis of being 'born meat' and 'dying meat' (Skal 1989: 19). The art of Stelarc—with technological prostheses to his body—and Orlan (altering her body through cosmetic surgical procedures) are other equivalents of cyberpunk's aesthetics of body modification and technological enhancement of the human form. These descriptions, read along with Stelarc and Orlan's work, present a whole new form of art that has its origins in contemporary technoculture. The elision of the human and machine that these artists explore constitutes what I have termed 'technocorporeo art', suggesting the use of technology that focuses on the body for aesthetic effect (Nayar 2002b). (c) Cyberpunk also calls into question the sexual mode of reproduction, suggesting, instead, a technoreproduction of the human. In Bruce Sterling's 'Twenty Evocations', Nikolai Leng cannot believe that he comes from earth. His teacher—'a cybernetic system with a holographic interface'—explains: 'The first true settlers in space were born on earth—produced by sexual means. Of course, hundreds of years have passed since then. You are a Shaper. Shapers are never born' (Sterling 1991: 154). (d) The body is denigrated and the mind elevated in importance in cyberpunk. Personality modification is central to such tales as Rudy Rucker's Software (1982) and W.J. Williams' Hardwired (1986). In the latter a program called Project Black Mind imposes personalities upon the mind. In a sense, therefore, this is also a theme of identity-modification. The human body is only 'meat-jail' (Cadigan 1991: Synners), or 'meat puppet' (Gibson 1984: Neuromancer). This theme frequently relates to information storage, and how it is now possible to store even human personalities. (e) In cyberpunk, the triumph of good over evil is only possible through an augmentation of the body. Thus in John Shirley's Eclipse (1999 [1985]), Rickenharp—a rock musician, drug addict and resistance fighter—battles fascist Second Alliance (SA) forces and wins mainly because he has electronically augmented his body. Robocop fights and wins over evil precisely because of his augmented body.
4. Atmosphere/Setting: The atmosphere of cyberpunk is almost always surreal, in keeping with its presentation of reality as just simulation. Sterling (1988 [1986]: xiv) writes in his preface to Mirrorshades: 'Cyberpunk work is marked by its visionary intensity. Its writers prize the bizarre, the surreal, the formerly unthinkable.' In such a world of 'consensual hallucination' (Gibson's term for cyberspace), it becomes impossible to distinguish whether the human is in the environment, or vice versa. The distinction between inside and outside, individual and society, (wo)man and environment breaks down. Neal Stephenson (1993 [1992]) describes a 'metaverse' in Snow Crash (24–27, 35–42). This metaverse is an interactive electronic space inhabited by digitised body delegates (what Stephenson terms avatars). In cyberpunk space is immaterial, in the sense that it is only a product of electronic mapping. The technopolises (or rather megalopolises) are assemblages. As in postmodern visualisations of space (Edward Soja, Fredric Jameson), cyberpunk rejects the idea of spatial unity. The philosopher Felix Guattari describes contemporary 'information and communication machines' thus: '[They] contribute to the fabrication of new assemblages of enunciation, individual and collective' (Guattari 1992: 18). In cyberpunk the huge multinational economies and corporations, the various subcultural groups and the maze of data (the matrix) are all without boundaries. In Gibson the multinational corporation becomes an evolving organism, living and reproducing (Fitting 1991: 305). The slow pan of the Tyrell Corporation's façade in Ridley Scott's Blade Runner (1982) is an example of the facelessness of this megalopolis. As the city in Blade Runner and the cities in Gibson's fiction reveal, human beings are mere fragmented commodities in the megalopolis.
5. Computers: Computers are a central motif in all cyberpunk. Computers as databanks, but more importantly as memories, figure prominently in Philip K. Dick and William Gibson. Gibson states: 'Computers in my books are simply metaphors for human memory. I'm interested in the how's and why's of memory, the ways it defines who and what we are, in how easily it's subject to revision' (1991: 270). There is also the related theme of 'cyberspace as a means of translating electronically stored information into a form that could be experienced phenomenologically and manipulated by human agents jacked into the network' (Foster 1993: 12).
6. Mythography: Cyberpunk also utilises several myths in conjunction with contemporary technological developments. Dani Cavallaro lists some of the mythological elements in cyberpunk: new forms of life emerging from technology; the presence of shamans, mystics, vampires, visionaries and prophets; 'creation deities' in the form of Artificial Intelligence (AI) engineers, console cowboys and corporate heads; and the animism that is visible in novels such as Gibson's trilogy (Neuromancer, Count Zero and Mona Lisa Overdrive).
Cavallaro points out that there is a great deal of supernaturalism in cyberpunk, which frequently suggests alternative religions (voodoo) in alienating environments (Cavallaro 2000: 41–71; also Spinrad 1990: 111–12).

BOX 2.4 THE MAGIC OF CYBERSPACE
The technology behind cyberspace is itself, most times, mystical and resembles magic. Sherry Turkle in Life on the Screen argues that the complexity of contemporary technology almost makes it alive. Turkle points out that such a machine/system, governed by its own processes of replication and mutation, and evolving its own forms of behaviour, is life-like, because this is what life does. Further, the complex system suggests something beyond human control: an autonomous technological life that runs on its own complex logic. This makes such systems seem magical: 'they provoke spiritual, even religious speculations' (Turkle 1995: 166–67). Terms such as wizard, voodoo, avatars, demons and sprites—common to most cyberspace users—suggest the magico-spiritual dimensions of high tech. The transcendence of the body—a prime concern of high tech and cyberpunk—itself has religious and mystical overtones.

7. Power and control: Cyberpunk suggests that all individuals are subject to already existing systems of control and power. All the characters in cyberpunk are, in a sense, subjects of the enormous world of information, of the 'matrix' (Gibson). An incident from Gibson's Neuromancer illustrates this argument. Molly tells Case that she knows exactly how he is wired because their mutual employer has a complete computer profile (the parallel with state investigative agencies' dossiers is surely inescapable) of Case. This profile is detailed enough to predict his actions, including his death from drug abuse. Thus, as Thomas Foster points out, the two characters' understanding of themselves is mediated through cybernetic control (Foster 1993: 25). From this illustration the theme that emerges is not of a technology of freedom (from the body), but of domination. This leads Mark Poster to ask whether the new social/cultural movements of feminism or minority discourse are in any way connected, antagonistically or otherwise, with these new forms of domination of the high-tech age (Poster 1990: 19).
8. Gender: Cyberpunk's gendered nature has come in for serious critiques from several contemporary thinkers (for a detailed discussion, see Chapter 4). Cynthia Fuchs argues that cyborg films like Robocop, Terminator and Star Trek: The Next Generation all embody a crisis of male subjectivity. She notes three paradigms of such a male hysteria: the 'triumphant macho-cyborg such as Robocop or Schwarzenegger–Terminator, the androgynous cyborg such as Robocop 2, and the human forced to function as cyborg' (Fuchs 1993: 113–33). Donna Haraway believes that cyberpunk and cyborg imagery offers a 'political myth' that can be used by feminists to combat those anti-technological biases that reproduce stereotypes (Haraway 1991a [1985]: 150–53; Haraway sees the myth of the cyborg as 'faithful to feminism, socialism and materialism'). Andrew Ross sees cyberpunk as reasserting 'the most fully alienated urban fantasies of white male folklore' (Ross 1991: 145). Cavallaro sees cyberpunk as ambivalent in its approach to gender roles, for it appears to both perpetuate and subvert stereotypical representations of masculinity and femininity (Cavallaro 2000: 121).
9. Social commentary: Cyberpunk also becomes an extrapolation of contemporary life. The cities of cyberspace are real in the sense that the streets here also have free combat zones where 'people can go to hunt and kill each other' (Stephenson 1993 [1992]: 25). The rich, with the best possible looks on their digitised body delegates, 'computer hair-brushed and retouched at seventy-two frames a second', as Neal Stephenson describes the women avatars in Snow Crash (ibid.: 41), are in sharp contrast to the poor, who can only afford low-resolution black and white avatars. This becomes a parallel world, a world of information where the limitations of the body are overcome—dependent, as always, upon purchasing power—by implants and images. The dirt and grime of contemporary cities, the problems of overpopulation, drug addiction, crime and violence, the subversive subcultures, and pathos figure prominently in the cybercities of films like Blade Runner. William Gibson describes Night City in Neuromancer as 'a deranged experiment in Social Darwinism, designed by a bored researcher who kept one thumb permanently on the fast-forward button' (Gibson 1984: 7). For this reason Vivian Sobchack uses the term 'tech noir' to describe the aesthetics—which combines high tech, noir and MTV—of cyberpunk (quoted in Rutsky 1999: 118–19). Claudia Springer in fact argues that if contemporary cyberpunk amplifies urban city size and the predominance of technology (faster, larger, denser), then it also amplifies the darkness of modern urbanism (Springer 1996: 76).
10. Counter-culture or romanticisation of culture?: Norman Spinrad—himself a reputed writer of SF—offers an interesting take on cyberpunk as a genre. He argues that most cyberpunk fiction presents a new romanticisation of technology. Spinrad writes that when the punks of the 1970s rebelled against the anti-technological aesthetic of counter-culture, what 'came in' was 'shiny black leather and high-tech, high-gloss chrome […] mirror-shades […] defiantly artificial spiked, color-frosted, and sculptured hair-dos'. The new punks, writes Spinrad, 'embraced the real world that science and technology have made, the technosphere, the cybersphere'. More importantly, they (cyberpunk writers) are not simply hagiographers of new technology, but focus on the characters themselves. Spinrad's insight into this new high-tech but still humanist romanticism culminates in the term he coins for such writers and the characters in their work: 'neuromantics' (Spinrad 1990: 114–15).
POPULAR VISUAL CULTURE Digital technology has revolutionised popular cultural forms and artifacts. Computer graphics companies introduced digital imaging to television advertising in the mid-1980s. The Center for Digital Arts and Experimental Media (University of Washington) offers courses in several such new art forms, from computer music to digital video (their Website provides images of the work being done at the centre). Several visual artists have also used cyberspace to extend their artistic output. Eva Pariser points out that the contemporary digital/
Web artists’ Websites are ‘vehicle[s] for self-referential expression’ (Pariser 2000: 62). Since any artist can set up a Website, no establishment or art gallery guidelines need to be followed. Thus no art here is subject to aesthetic judgement, and the focus is on the artist’s ‘self-conscious, self-promotion’ (ibid.). Digital techniques were soon adapted to advertising, music videos and television. An overall patterning among seemingly incompatible elements is achieved through manipulation of colour, direction of movement, editing rhythm, repetition and play of spatial figures, and the theme of mirroring/reflection. Computer games evolved through the 1980s and users engaged with action, characters and sound delivered in a highly realistic mode. It also became, in the West, (a) a public form of entertainment with the rise of video-game arcades with coin-operated machines, and (b) a home-entertainment system with its own hardware and software. Space Invaders marked the turning point in the development of computer games. These games had their own genres—ranging from battle games to sports (Pacman, Lunar Lander, Donkey Kong). With the rise of the PC in the 1970s, computer games became more cerebral: puzzles, riddles, map-making. Music videos, which combine styles, audiovisual forms, genres and devices adapted from theatre, art, cinema, fashion, television and advertising, are a hybrid genre. They resist classification because in them reality and illusion are deliberately mixed. What they do is self-consciously create and project the image. In several cases, these are self-reflexive, where the making of the video and the theme of the video are superimposed. In each of these cases the spectacle is supreme. Questions of aesthetics, of course, do arise. However, as Andrew Darley points out, we need to accept that a shift has occurred in contemporary visual culture toward an aesthetic that ‘foregrounds the dimension of appearance, form and sensation’ (Darley 2000: 6). Ornamentation, style and spectacle are the new aesthetic devices.
MOVIES Digital techniques began to be used to convey and create realism in cinema from the late 1970s. The use of pixel values enabled
refinement of realism, and its effects can be seen in Westworld (1973), Futureworld (1976), Star Trek II: The Wrath of Khan (1982) and The Last Starfighter (1984). From the mid-1980s digital image manipulation (which involves altering pre-existing images on computers) and image synthesis (where images are produced within the computer) made it possible to make a film entirely with computers, without the traditional props of sets, lights and cameras. The first computer-synthesised feature-length film was Toy Story (1995). Films such as Terminator 2, Jurassic Park and Independence Day carried realism to the limit by effectively fusing real (meat) figures and action with digitally synthesised figures. And, more recently, the tropical island in the Tom Hanks starrer, Cast Away (2000), was generated on a computer! Recalling the production, George Joblove, senior vice president for technology at Sony Pictures Imageworks (which created the visuals), confesses: 'Any shot that had ocean or sky in it, was pretty much a special effect' (Span, November/December 2002: 13). David Gauntlett discusses three cinema-related Web phenomena:
(a) Movie promotion and the Web: Websites for specific movies provide detailed information. Production details, director's comments, pictures, interviews and trailers are all usually available on the site. In addition, there are screensavers (scenes from the film) that can be downloaded, and provide a free source of publicity. The Blair Witch Project Website began its 'promos' nearly a year before the film's actual release, fostering the myth of the Blair Witch and the missing youngsters who had supposedly made the film. This myth-generation enabled a great deal of hype for the film.
(b) Films on the Web: Short films now find a new medium in the Internet, though the audiences are still limited. The Internet Film Community was launched to 'exhibit' films by independent filmmakers. Atom Films, Student Films and Pop.com (from Steven Spielberg and associates) are Websites that offer animations, shorts and a variety of other films, thus providing alternatives for home entertainment. Troops, a spoof documentary about the Star Wars stormtroopers, which appeared in 1998, was designed for Web presentation (including special effects).
(c) Reviews and research: The Internet Movie Database () is a huge storehouse of information about movies. It makes research (‘film studies’) much simpler, providing detailed information about the cast, crew, marketing, as well as soundtrack. It also puts out reviews by video-renters and moviegoers, thus making it a valuable audience-input database. The first Web version ran on Cardiff University (Wales) servers, and in 1998 it was sold to Amazon.com. The diversity of responses—from professional film critics and academics to filmmakers and college students—makes it a truly egalitarian source for research (Gauntlett 2000: 82–87). Late twentieth-century cinema marks the ‘renovation of spectacle’ (Darley 2000: 102). Characterised by extraordinary special effects, these films are high-budget productions and are given a lot of media hype (as we noted at the beginning of this chapter, technology informs the transmission of culture, and even alters the nature of cultural products. We cannot speak of a cultural form without talking of the technologies [synergies] that have created and disseminated/popularised the form). In his incisive study of the new aesthetics, Andrew Darley argues that contemporary high-tech cinema emphasises technique and image over content and meaning (ibid.). Such cinema depends upon a ‘fascinated’ (Darley’s term. We can easily substitute it with ‘bewitched’, ‘entranced’, or ‘enchanted’) spectator. Spectacle becomes an end in itself. The intention is to dazzle the eye and other senses. It does not require meaning: it is sufficient to enchant the viewer with endless sequences of images. The emphasis on special effects illustrates this argument. What is interesting is that spectacle and spectacular special effects call attention to themselves. The extreme self-conscious artifice of the effect (seen in such films as Speed, The Matrix, The Matrix Reloaded and Titanic) means that technical expertise produces both the spectacle and the recognition of artifice itself. Darley categorises the various kinds of special effects in spectacle-cinema. (i) ‘Photographing’ the impossible: Digital technology is used to produce the effect of photo-realism in scenes that are really fantastic (that is, with no equivalent in real life). Seen in films after The Abyss (1989), this involves integrating fantastic
objects (the alien tentacle) with traditional live action and live actors. Three levels have to be achieved: (a) naturalism, for the object to look credible despite being fantastic; (b) it has to occupy the same filmic space as the live (human) actors, i.e., its representation has to be indistinguishable from the human cast. This means that both the photographic and the seemingly photographic (digital simulation) have to be seamlessly combined; and (c) the shots have to be combined to provide the illusion of a smooth narrative.
(ii) Verisimilitude and the spectacular: Here both subject matter and image construction become more technologically 'dense'. Fantastic events in Terminator 2, for instance, seem convincing because their digitally-created images have the same quality as the live action characters. Advances in technology now enable even close-up shots of these fantastic-yet-credible computer-generated images. Special effects in such films are of two broad types: (a) those that can be grouped with animatronics, puppetry, make-up and prosthetics, and are used to present the bizarre by producing photo-realist imagery involving close-ups with live actors; and (b) those involving models and miniatures, special sets, mock-ups, pyrotechnics, etc. These are used in long, panoramic shots and large-scale sequences to provide a naturalist narrative. A third type is technology for humour. Pastiche and exaggeration are also modes of using digital imaging techniques, especially evidenced in films such as The Mask. Hyperbole and grotesque transmutation mark this form (Darley 2000: 106–14).
MUSIC VIDEO The music video as a popular genre in visual culture became firmly established in the 1980s. Combining music and music performance, it adapts the techniques of various cultural audiovisual forms: theatre, art, cinema, dance, fashion, television and advertising. It represents, for Andrew Darley, E.A. Kaplan and others, a truly hybrid form, one which resists and even breaks down conventional generic boundaries. Relying heavily on technologies that enable image-making, fusion of forms/genres and special
effects, the music video is perhaps one of the more popular adaptations of technology. Montage, an essential component of the music video, is dependent upon digital imaging. These images also openly reveal their borrowings from other genres. The self-reflexive attention to the devices of production (the sources, for example, of the narrative’s images) overrides the video’s attention to narrative. This means that the video escapes categorisation precisely because it refers to already circulating images, media models, discourses and icons. Music video images resist classification as either illusionist or anti-illusionist. Style is paramount in music video. The sequence of images (the syntax) is more for ornamental purposes than for conveying meaning. The actual song may well be ‘off-screen’, thus reinforcing the effect of visual ornamentalism. Visual images predominate over all other elements in the video. Further, the persona of the star ‘performer’ is crucial. The conspicuous merging of the star’s image with digitally generated effects and sequences again relies on the visual stimulation of the video. Heterogeneity is another characteristic of the music video. Live action, animation, video and digital techniques merge in the music video. Morphing, where one live action figure metamorphoses into another on-screen, is a frequently used technique. Archival (documentary) footage, cartoon animation, 3-D animation, live action/staged performance and narrative fiction—all merge seamlessly.
COMPUTER GAMES
Computer games (or video games) became mass cultural forms in the early 1980s, though such games had developed much earlier with the emergence of computer imaging in the 1960s. The computer game became a public form of entertainment (via the video arcade and coin-operated machines in restaurants and recreational facilities) and a home-entertainment form with the development of software for personal use.
• Space Invaders was the game that really popularised computer games and games arcades. Following its success,
most of the games that came onto the market were battle and combat games, although with additional sound effects.
• Platform games evolved with Donkey Kong. These games involved a character who had to progress through a series of obstacles. The player had to control the movement of this character, and see that the character reached its destination points bypassing the obstacles.
• TV games were another form of computer games, played via consoles attached to TV sets.
• Home computer games moved beyond the action genre. They were mostly puzzle-solving and map-making games and involved more thought than the action games (which relied only on the speed of reflex). The player had to assess the information and then act upon it.
By the mid-1980s advances in computer hardware technology enabled games to be loaded faster into the computer, for the floppy disk had replaced the cassette. Faster and better computers meant faster response during play, and sound effects. The computer games industry began to boom with this. In the 1990s cartridge-based software and the low cost of consoles meant instant playability, less cumbersome procedures (no need of changing disks) and increased capacity (so the player did not have to wait for the game to be loaded). The Nintendo and Sega companies pioneered the home computer games revolution. Arcades now regularly premiere new games before they are available as home entertainment. Other developments have taken place. For instance, the player now sits in a (play/toy) car with a console to play a car-chase game. The arrival of virtual reality technology has further transformed the computer game. Head-mounted display helmets, synchronised movement, multimedia effects and multiplayer games (via the Internet) are now standard features in the area of computer games. Most of these games give the player an illusion of being in an alternative world. The computer game has thus intensified the aesthetic of the realist cinema: illusionist representation (Darley 2000: 158). The three-dimensional space enables the player to become an almost motionless observer/passenger. Even though mobility and freedom are limited, the partial freedom does augment
the impression of involvement with the image on (or 'in') the screen. This kinaesthetic experience is paramount in the development of computer games. Realism is therefore central. The player's relationship with and to the image has been altered enormously through the interactive computer game. The simulation ride—one version of the computer game—is a leisure attraction designed to simulate motion and the experience of travelling. One particular subjective point of view is intensified here—that of the travelling projectile. With digital technology the physical scale of the simulation is multiplied several times over. The screen, the imaginary vehicle and even props like hydraulically-controlled seats in the auditorium are involved here. The seats shift in careful coordination with the movement of the objects on the screen, thus intensifying the player/viewer's impression that s/he is actually moving.
COMPUTER PAINTING, ANIMATION AND OTHER VISUAL ARTS
Painting and Photoediting
Painting and photoediting programs—especially after the release of Apple's MacPaint in 1984—are common features of computer visual arts. The crucial moment in creating computer visuals was the invention of raster graphics by Xerox Corporation in the 1970s. The researchers at Xerox divided the screen into a mosaic of phosphor dots that were turned on and off by a beam that swept the screen methodically row by row. These rows were called 'rasters', and the images made on this new type of screen came to be called raster graphics. Raster graphics enabled the user to fill in selected areas of the screen and thus create a realistic computer image. A computer painting and photoediting program consists of:
(i) Drawing area, or canvas: the screen.
(ii) Tools: the tool palette or tool bar.
(iii) Palettes: subsidiary palettes for customised tools such as changing the brush size, shape and opacity, line widths, choosing colours and patterns.
(iv) Menus: flipping an image and other actions.
Computer artists can now darken or lighten an image by using mapping. Raising or lowering a pixel value alters the image. Filtering changes a pixel’s value by taking into account not only its original brightness but also that of its neighbours. It averages a pixel’s brightness in comparison with its neighbours. This can blur or sharpen images, and be used to emboss or produce ‘glowing’ images.
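The neighbourhood-averaging filter described above can be sketched in a few lines of Python (a toy illustration: the tiny greyscale image and the simple 3×3 box filter are invented for this example and are not drawn from any particular painting program). Sharpening or embossing filters differ only in the weights given to the neighbours.

def box_blur(image):
    # Blur a greyscale image (a list of rows of 0-255 values) by replacing
    # each pixel with the average of its 3x3 neighbourhood.
    height, width = len(image), len(image[0])
    result = [row[:] for row in image]
    for y in range(height):
        for x in range(width):
            total, count = 0, 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < height and 0 <= nx < width:
                        total += image[ny][nx]
                        count += 1
            result[y][x] = total // count
    return result

# A bright pixel in a dark field is spread (blurred) into its neighbours.
print(box_blur([[0, 0, 0], [0, 255, 0], [0, 0, 0]]))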
Computer Animation This is now one of the biggest applications of computers in the visual arts. There are two main types of animation program interface: those that provide a series of cels or frame-like drawing areas to work in, and those that offer a visible timeline for controlling animation processes. Almost all 3-D geometric animation programs offer timelines. The timeline (also called dope sheet, time-layout window, or score) is the main control area. It lets the artist script, edit and control the timing and other parameters of animation objects. The horizontal axis is usually divided into either frames or seconds and the vertical axis into rows or tracks for arranging the different images (or other media) over time. With a timeline artists can view a timebased work spatially all at once and even control both the spatial and temporal aspects of the composition. Morphing—integral to music videos, animation and films— improves the fading effect used in 2-D films. The important features of each image are warped as the fade progresses, so that their locations sustain the illusion that one object is changing into another. The artist first indicates important position points on the two images in the beginning and end ‘keyframes’. The morphing program then interpolates intermediate positions and uses them to warp the images so that the dissolving and resolving features create convincing composites. High-quality morphing uses a large number of points or lines aligned on beginning and end images. The artist also touches up morphs in a digital painting or photoediting program to make them look seamless. Cynthia Beth Rubin (Inherited Memories, 1997, Siberian Summer Tales, 2002, see ) and Joseph Santarromana (Ferdinand M & Joseph S, 1994) both use morphing techniques in their digital paintings. And all this is not mere technology. Morphing becomes an act of recall, of mixing memories and histories of the past and present. It becomes symbolic of the
conflation of cultures, memories and images that constitute human sensibility. As Cynthia Rubin writes: I am interested in how cultural traditions collide and merge, and how this is embedded in all of us […] These images grow from the affinity between my life as a contemporary American, and what I regard as my heritage, extending to times, places, and philosophies far from my own experience […] The computer is the instrument for allowing some images to sing, some to come forward as clear images, others to fall back into barely representational dreams of textures and colors. The interweaving of image fragments within the computer renders the texture of the memories, and creates a narrative out of final composition, even when it is rendered as a fixed twodimensional print […] When I first traveled through Europe looking at these works from my own tradition, I experienced an emotional connection that was completely unanticipated. Others may feel this when they view Renaissance art, but for me the sensation that art and creative works can link us to the lives and thoughts of those that went before us was completely new and overwhelming. Since then, my work has come to focus on communicating this sense of connection with the past and the present. (2001: ) Rubin’s statement captures the cultural uses to which such technologies can be put.
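The keyframe interpolation at the heart of morphing, described above, can be sketched briefly as well (a minimal, hedged illustration: the control points and the five-step schedule are invented for the example, and a real morphing program would additionally warp and cross-dissolve the pixel data).

def interpolate_points(start_points, end_points, t):
    # Linearly interpolate corresponding control points between two keyframes.
    # t runs from 0.0 (beginning keyframe) to 1.0 (end keyframe).
    return [((1 - t) * x0 + t * x1, (1 - t) * y0 + t * y1)
            for (x0, y0), (x1, y1) in zip(start_points, end_points)]

# Control points an artist might mark on the two keyframes (illustrative values).
image_a_points = [(10, 20), (40, 22), (25, 50)]
image_b_points = [(12, 25), (38, 27), (25, 60)]

# Intermediate frames: each set of points drives the warp for that frame,
# while the two images are cross-dissolved with the same weight t.
for step in range(1, 5):
    t = step / 5
    print(round(t, 2), interpolate_points(image_a_points, image_b_points, t))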
Digital Video
In digital video, artists work with different video clips (sequences of captured 2-D images) that are cut up and rearranged in time. Unlike traditional video, which requires fast-forwarding and rewinding to reach a particular frame, digital video is nonlinear. The artist simply clicks any point on a clip (or reaches it by typing in time codes) and converts it into the current clip. Rotoscoping—painting in effects by hand, frame by frame (the best-known example is the light sabres in Star Wars)—is used here to control positions and other aspects. Nonlinear digital editing systems tie several art disciplines together using multimedia. For example, video is frequently combined with digital painting.
3-D Sculpting Digital clay is at the heart of 3-D sculpting. The digital clay approach allows the user to click and drag on polygon vertices. By pushing and pulling on the vertices and connecting lines of the object’s polygonal mesh, we can warp any polyhedral object into bent, twisted, and distorted shapes. Spline patches (2-D surfaces bounded by four splines, spline being a special class of curves created by a mathematical technique that fits a curve to a set of given points called control points) help move vertices on the edges and interior to deform the surface to create varied contours. By increasing the resolution of the patch (by adding more vertices), one can model detailed surfaces such as the human face. Patches are used in car and airplane design and in designing surfaces for computer-aided manufacturing (CAM).
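The push-and-pull deformation of a polygonal mesh described above can be suggested with a toy sketch (the quad mesh, the adjacency table and the fall-off factor are all invented for illustration and do not correspond to any particular sculpting package).

def drag_vertex(vertices, neighbours, index, dx, dy, dz, falloff=0.5):
    # Move one vertex of a polygonal mesh and pull its immediate neighbours
    # along with reduced strength (a crude 'digital clay' deformation).
    moved = [list(v) for v in vertices]
    moved[index] = [vertices[index][0] + dx,
                    vertices[index][1] + dy,
                    vertices[index][2] + dz]
    for n in neighbours[index]:
        moved[n] = [vertices[n][0] + dx * falloff,
                    vertices[n][1] + dy * falloff,
                    vertices[n][2] + dz * falloff]
    return moved

# A single quad face: four vertices, each connected to two others.
quad = [(0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0)]
adjacency = {0: [1, 3], 1: [0, 2], 2: [1, 3], 3: [0, 2]}

# Pull vertex 2 upwards; its neighbours (vertices 1 and 3) follow half-way.
print(drag_vertex(quad, adjacency, 2, 0, 0, 1.0))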
Web 3-D
An important step in VR and WWW-based art is the Virtual Reality Modeling Language (VRML). VRML actualises the vision of writers such as William Gibson and Neal Stephenson, who saw cyberspace as an extraordinarily rich shared space. VRML describes polygonal forms and simple interactions and behaviours. Instead of downloading actual 3-D models, the Web browser downloads the textual VRML descriptions of geometric shapes, light sources and textures, which the browsing machine then renders. VRML can be used to create 3-D worlds that are navigable via hyperlinks. Several such projects are available on the WWW. See, for example, Ed Stastny's HyGrid (begun 1995) and Victoria Vesna's Bodies, INCorporated (begun 1996), among other projects.
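To give a sense of what such 'textual descriptions' look like, the short script below writes a minimal, illustrative VRML 2.0 scene (one light and one coloured sphere) that a VRML-capable browser would then render. The filename and the colour values are arbitrary choices for the example, not part of any of the projects mentioned above.

# A minimal, illustrative VRML 2.0 scene: the browser downloads text such as
# this and renders the geometry and lighting on the user's own machine.
scene = """#VRML V2.0 utf8
DirectionalLight { direction 0 -1 -1 }
Shape {
  appearance Appearance {
    material Material { diffuseColor 0.8 0.2 0.2 }
  }
  geometry Sphere { radius 1.0 }
}
"""

with open("world.wrl", "w") as f:   # .wrl is the conventional VRML file extension
    f.write(scene)

print(scene)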
MUSIC
Information theory holds that all experience, including music, can be imagined as being constructed out of different sequences of basically similar chunks of data. Every experience is a specific signal. Digital technology, based upon the notion of experience as data, has provided new ways of 'reading' and creating music.
Digital editing renders fidelity to sources unimportant, and enables ‘mixing’ of sounds from various sources that can be speeded up or slowed down. Thus Darth Vader’s famous metallic whisper in Star Wars (1977) was produced by mixing James Earl Jones’ voice with a recording of his breathing in a scuba tank regulator! (Cubitt 1998: 115). By the 1970s some models of analog synthesisers had begun to incorporate digital features. The logic circuits enabled programming of synthesisers to activate different sound-producing devices, or to alter pre-set controls. Electronic composers such as Michel Chion and Pierre Schaeffer foreground and emphasise the medium of recording itself (rather like literary modernism’s foregrounding of the process of literary production or the construction of a poem). A digital synthesiser is a computer with customised features that allow it to treat music/sounds in the same manner as ordinary computers treat symbols. Wave shapes are ‘translated’ into strings of numbers. Wave forms can be analysed and disassembled using a variety of mathematical techniques. Fast Fourier Analysis, for instance, breaks down a complex wave into simple S-shaped sine waves. These curves can be defined numerically, thus converting the music into a form that the computer can analyse. A digital oscillator can generate the string of numbers that represents a given wave, and each mathematical operation performed on that string of numbers will generate new numbers until a data list (composed entirely of numbers) is created that corresponds to the particular features of that particular sound. Thus these devices essentially translate sounds into numbers that can then be reprogrammed—since computers work with numbers—to produce further numbers in a specific sequence which, when reconverted, become music again. Theoretically, computers can work with any number sequence. This means that, in principle, they can produce any kind of sound. With the invention of the sampling technique, music was transformed as never before. Samplers sense sounds using a device such as a microphone that converts the mechanical energy in sound into electronic signals. It then measures the strength, frequency and precise ‘shape’ of that signal at every instant, and assigns a particular numerical value to each moment. Digital sampling occurs at the rate of about 40,000 samples a second, ensuring
that very little information is lost. The computer treats this as any other number sequence. Thus any sound once 'coded' as a number sequence can be infinitely reproduced and transposed (a small illustrative sketch of this sampling and analysis appears just before the discussion of MP3 below). And then in the mid-1980s came MIDI, or Musical Instrument Digital Interface. Tod Machover, a composer at the MIT Media Lab, and a team of engineers invented a hypercello. Here an electric cello (a cello analogous to an electric guitar) serves as an input device to a stack of machines that process data created by a performer and generate a variety of responses. As the cellist plays the cello, the sensors transmit data about (a) the angle the cellist's bow-hand wrist forms; (b) the location of the bow on the cello's strings; (c) the pressure on the bow from the cellist; and (d) the actual 'fingering'. This data from the cellist's sensors is then translated into digital information and sent to the system's main computer, which forwards it as MIDI information to samplers and sound synthesisers. Simultaneously, the electronic signal of the tune being actually played is fed into a synthesiser, which may manipulate the sound or pass it through as a conventional cello sound. Machover and his team also developed software to control the information from the performer to the machines. The cellist's bow, which sets up vibrations on the strings, also acts as a conductor's baton. The position of the tip of the bow, the length of the bow drawn across a string per second, and the pressure on the bow—all control musical parameters such as the number of musical lines called up from the computer, the timbre of the pitches played, and so on. A system called 'trigger and control' relies on the computer's ability to follow a musical score. The computer then generates an accompaniment to the cellist's solo. The main computer processes the data it receives, and can even create sounds depending on the instructions in the software. The software—and this is our interest here—is thus both a part of the instrument and part of the musical score. The system as a whole acts as both an instrument to be played by the performer and an orchestra responding to the performer's commands, as if the cellist is a conductor. Headphones approximate the ideal aural space. A few years ago, at a major European retrospective in Cologne, the curators discovered that the sound spaces of two exhibits were overlapping. They therefore provided infrared-activated headsets. The visitor
could now pick up the sound from an installation as s/he walked into the range of the installation’s miniature transmitter. Sound space here is reduced completely to the ears. Sound thus becomes individuated, restricted to the ears (and not even the rest of the body) of the individual. It also means that the acoustic environment—which usually includes reflecting surfaces of décor, room structure, furniture—is significantly altered.
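Before turning to MP3, here is a minimal sketch of the wave-to-numbers translation described earlier in this section: a tone is measured thousands of times a second and then broken into its sine-wave components by Fourier analysis. The 440 Hz and 880 Hz tones, the half-second duration and the variable names are invented for the example; only the figure of about 40,000 samples a second comes from the text above.

import numpy as np

rate = 40000                     # samples per second (the text's approximate figure)
duration = 0.5                   # half a second of 'sound'
t = np.arange(int(rate * duration)) / rate

# A complex wave: a 440 Hz tone plus a quieter 880 Hz overtone.
wave = np.sin(2 * np.pi * 440 * t) + 0.3 * np.sin(2 * np.pi * 880 * t)

# Sampling: the amplitude at each instant becomes a number (here a 16-bit integer).
samples = np.int16(wave / np.max(np.abs(wave)) * 32767)

# Fourier analysis: decompose the complex wave into simple sine-wave components.
spectrum = np.abs(np.fft.rfft(samples))
freqs = np.fft.rfftfreq(len(samples), d=1 / rate)

# The two strongest components recover the tones that built the wave.
strongest = sorted(float(f) for f in freqs[np.argsort(spectrum)[-2:]])
print(strongest)                 # approximately [440.0, 880.0]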
MP3 Technology
MP3 technology, which is at the core of audio innovation in contemporary art, is a 'compression technology' that compacts the size of a musical recording by removing those parts of the sound that are inaudible to human beings (Lessig 2001: 123). A five-minute song can be compressed to a file that is six megabytes in size, and thus transmitted via the Internet in less than a minute. The Internet with MP3 is now a distributor of music; it also produces music, when music scores are written exclusively for transmission via the Net. MP3.com, founded by Michael Robertson, and Emusic.com encourage artists to produce and distribute music across their sites. MP3.com also sees itself as a service bureau. It has developed an 'automatic' way for popular music to reach the top. The more users listen to and download music, the more the music 'floats', and the more the system learns about user tastes. Thus eventually the system can make sure that your favourite music is at the top of your screen. MP3 is also, in a sense, 'My.MP3', since anyone is allowed to develop the technologies that use it. The Internet has several programs that can 'rip'—that is, copy in MP3 format—the contents of CDs. Later, with the development of streaming technologies, the problem of transmission time (mainly due to restrictions of bandwidth) was overcome. The user no longer has to copy the file to hear the music. With streaming, the user can stream the desired track and play it at the same time. RealAudio sold tuners that enabled people to tune in to audio and, subsequently, video content, which could then be compressed and streamed across the Internet. In addition, 'lyric sites' on the Net carry large databases of lyrics from old scores. Fans can now find the lyrics archived in cyberspace. CDDB, or CD database, is a related example. With MP3 equipment becoming more popular, people began wanting information
about CD titles and tracks on their MP3 device. MP3 services came up with a cooperative process for this purpose. When a user inserts a CD, the system automatically searches the central database to see if that CD/track has been catalogued/archived. If it has, the title and track listing are automatically transferred to the user. If not, the user is offered the option of sending in the record and adding it to the database. The archive thus keeps increasing as users send in all sorts of music.
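A toy sketch of this cooperative lookup follows; the disc identifiers, album titles and function name are invented for illustration (real services compute a disc identifier from the track layout of the CD itself).

# A toy CDDB-style cooperative catalogue: look a disc up, or contribute a record.
catalogue = {"disc-8f2a": {"title": "Kind of Blue",
                           "tracks": ["So What", "Freddie Freeloader"]}}

def lookup_or_contribute(disc_id, local_record=None):
    # Return the archived record for a disc, or add the user's own record to the archive.
    if disc_id in catalogue:
        return catalogue[disc_id]            # title and track listing sent back to the user
    if local_record is not None:
        catalogue[disc_id] = local_record    # the archive grows as users send records in
        return local_record
    return None

print(lookup_or_contribute("disc-8f2a"))                              # already catalogued
print(lookup_or_contribute("disc-0c91", {"title": "Unknown Album",
                                         "tracks": ["Track 1"]}))     # contributed by the user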
Napster
Shawn Fanning and Sean Parker invented a method to simplify file-sharing for MP3 files. The idea is brilliantly simple. Napster collects a database of which user has what music. When user 'A' searches for a particular song, the database produces a list of users who possess that song, and indicates which of them are online at that moment. 'A' can then select the copy s/he wants, and the computer sets up a connection between 'A' and the computer with the copy. What this means is that one can find practically any song one wants, and that too, for free!
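A minimal sketch of such a central index is given below. The user names, song titles and the pairing step are invented for illustration, and the actual file would travel directly between the two users' machines rather than through the central server.

# A toy Napster-style index: who holds which song, and who is online right now.
index = {"Blue Monday": {"alice", "bob"}, "Kashmir": {"carol"}}
online = {"alice", "carol"}

def find_sources(song):
    # Return the online users who hold a copy of the requested song.
    return sorted(index.get(song, set()) & online)

def connect(requester, song):
    # Pick an online source; the server only brokers the pairing.
    sources = find_sources(song)
    if not sources:
        return None
    return (requester, sources[0])

print(find_sources("Blue Monday"))    # ['alice']  (bob holds the song but is offline)
print(connect("dave", "Blue Monday"))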
DANCE, CHOREOGRAPHY, INSTALLATION AND PERFORMANCE ART
VR has challenged the boundaries of the body and of space. Dance, which is about the orientation of bodies in, and the physical–sensory relationship to, space is, naturally, a field wherein VR technologies have had considerable if specialised impact. Software such as LabanWriter and LifeForms attracted the attention of choreographers in the 1990s. Assimilating technologies into dance implied several things. First, it meant the relocation of the choreographic–compositional process into a laboratory-like setting; and second, it brought the engineering of interface designs to the foreground. With this, 'sensing' becomes more than the physical and organic (Birringer 2002: 86), i.e., space becomes dematerialised. Movement is captured, transmitted and rematerialised elsewhere. Dancers interact with sensory information such as the video, which projects different 3-D
perceptions of energy, position and velocity. Choreography is now more akin to the 'live mix' of disc jockeys. The Secret Project, presented by half/angel (a 'dance/theatre company' formed by dancer/writer Jools Gilson-Ellis and composer Richard Povall; see www.halfangel.org.uk), uses motion-sensing and sound/MIDI software for choreography. The dance asks viewers to experience movement as live music composition and vocalisation of the poetry Gilson-Ellis has written. Triggering the soundscapes through her interactions with the sensor, Gilson-Ellis uses her hands as live musical instruments. Even her breathing, amplified or recorded, becomes part of the dance. Contemporary dance has four environments:
(a) Interactive: based on sensors and motion-tracking.
(b) Immersive: VR-based, as in CAVE (Cave Automatic Virtual Environment, created in 1992 by the Electronic Visualisation Laboratory, University of Illinois).
(c) Networked: telepresence, video-conferencing, telerobotics—allowing users to experience a dispersed body and to interact with other remote bodies.
(d) Derived: motion-capture-based reanimations of bodily movement or liquid architecture (see the following section on architecture), which can be networked and reintroduced into live telepresence (Birringer 2002: 88).
In several of these dances, transmittable data has to be produced and processed in synchrony at different locations. The Australian group Company in Space, performing Escape Velocity, is an interesting example of this. Two dancers, two cameras and two projectors linked by a direct online connection between the Web Café at Arizona State University and a performance space in Melbourne were involved. The live mix merged the two dancers. Watching Hellen Sky dancing in Arizona also meant watching her sister dance in Melbourne. The teleconference set up between the two spatial points enabled audiences in the two places to see not just the other dancer, but also the other audience. It created, in Johannes Birringer's words, a 'composite dancer floating in a third space' (ibid.: 91).
In a different mould is Richard Lord’s work. This was created exclusively for the Web (). It does not exist in real space, and is activated only through browsing and the viewer’s interaction. According to Scott deLahunta, virtual reality performances involve active audience participation and site-specific responses to space (whether real or actual). It draws heavily upon live art, performance art, installation art and communication art. The viewer interacts freely with an art work by navigating within a 3-D environment created by software. This entails the use of sensors to gather and integrate feedback from the audience. The computer takes in this data, calculates a perspective within the 3-D environment and displays this as ‘output’ to the user/viewer through projection devices (deLahunta 2002: 105). CAVE is an immersive environment art work. One walks into a 10′×10′ room wearing a special pair of active stereo glasses and carrying a mouse ‘wand’ that interacts with the space (not unlike the most famous wand in recent times: Harry Potter’s!). There is an input device in the form of a ‘head tracker’ that provides information about the user/audience’s position in the space. The computer synchronises the data and calculates the correct perspective for each wall from the point of view of the user. Four projectors send the computer-generated images onto the walls and the floor. Simply put, the viewer actually ‘creates’ the art work through her/his position in the room (see section on liquid architecture for the architectural equivalent of this art). Char Davies’ Osmose is a Montreal-based art work. It is modelled on a real-life activity, namely scuba diving, and the scuba diver rising and falling under water while breathing in and out. A vest customised with sensors to detect the movement of the chest enables the user/wearer to move up or down in the virtual world by breathing, and right/left, forward/backward by tilting. A head mounted display renders the 3-D visual experience for the interactive user. Occasionally, an audience can also view this user. In 1997, the Mixed Reality Laboratory (MRL) of the University of Nottingham (UK) collaborated with Blast Theory, a UK theatre group, to create Desert Rain. It used a multiuser VR system (called, appropriately, MASSIVE). Desert Rain is organised as a journey. Each participant is ensconced in a cubicle and stands on a movable footpad that controls the journey through a virtual world.
Communicating with each other through a live audio link, participants explore bunkers, motels and deserts. The world is projected onto a screen of falling water, creating a ‘traversable interface’ through which the performers can visit the players at certain key moments. Players have 30 minutes to find the target, complete the mission and get to the final room. Each performer is first given a set of basic instructions. One set of instructions is about how to get through the virtual world, others give the significance of specific objects. Instructions also come in from the audience. Final instructions come from a performer who comes through the water screen and takes the participant into the final chamber. This is a mixture of performance, a computer game and installation art (deLahunta 2002: 108).
Others such as Troika Ranch, Isabelle Choiniere, Lisa Naugle, Jean-Marc Matos and Douglas Rosenberg use various forms of VR and computer technologies for their work (the International Dance and Technology Conference—IDAT—at the Arizona State University brings together many of these performers). Jet Lag, a collaborative work involving several organisations and artists, was first presented at Kulturhus in Aarhus (Denmark) in 1998. It combined live action, recorded and live video, computer animation, music and text.
The Zentrum fur Kunst und Medientechnologie (ZKM) in Karlsruhe, Germany, launched in 1989, is Europe’s leading exhibition space for digital and electronic arts. It has produced a range of interactive installations by artists-in-residence, a series of artistic CD-ROMs and critical commentaries, and intermedia acoustic performances.
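To make the logic of such interfaces concrete, the breathing-and-tilting navigation described above for Osmose can be sketched as a simple update rule. This is a toy illustration under assumed sensor ranges, not Char Davies’ actual system.

```python
# A toy sketch of the interaction logic described for Osmose: breathing in
# rises, breathing out sinks, tilting steers. The sensor ranges and names
# are assumptions for illustration, not Char Davies' actual implementation.

def update_position(pos, chest_expansion, tilt, dt=0.1):
    """pos is (x, y, z); chest_expansion is -1..1 (exhale..inhale);
    tilt is (forward_back, left_right), each -1..1."""
    x, y, z = pos
    y += chest_expansion * dt          # inhale -> float upwards, exhale -> sink
    x += tilt[1] * dt                  # lean right -> drift right
    z += tilt[0] * dt                  # lean forward -> drift forward
    return (x, y, z)

pos = (0.0, 0.0, 0.0)
for breath, lean in [(0.8, (0.0, 0.0)), (0.5, (1.0, 0.0)), (-0.7, (0.0, 0.5))]:
    pos = update_position(pos, breath, lean)
print(pos)
```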
ARCHITECTURE
The information revolution has transformed architecture and urban design (for the latter, see Chapter 3). Architecture is no longer restricted to static conventions of plan, section and elevation. Buildings can now be fully envisaged using 3-D, Computer Aided Design (CAD), modelling, profiling, prototyping, specialised software, interfaces and hardware. That is, digital technology radically alters the conceptualisation (planning) and
fabrication (modelling) stages of building/architectural production. Frank Gehry, for example, produced 3-D models for every facet of both the surface (of titanium and stone) and the interior structure of curtain walls and stairways before delivering the design details of the Guggenheim Museum in Bilbao. Using 3-D, the structure of the built and finished models can be seen and experienced even before their actual construction. In addition, VR technologies are being used by architects to recreate ruined buildings in virtual space.2 Contemporary architects working with digital technologies are ‘blurring the relationship between matter and data, between the real and the virtual and between the organic and the inorganic’ (Zellner 1999: 9). Marcus Novak beautifully captures the increasing ‘informatisation’ of architecture in his essay ‘Transmitting Architecture: transTerraFirma/TidsvagNoll v2.0’: When bricks become pixels, the tectonics of architecture become informational. City planning becomes data structure design, construction costs become computational costs, accessibility becomes transmissibility, proximity is measured in terms of numbers of required links and available bandwidth. (Novak 1995: 45) Peter Zellner provides a useful summary of the ways in which contemporary architecture is experienced and described: S Objects, places and people are no longer framed in the moment but approached along multiple routes. Objects are thus not restricted by spatial and temporal limits. S Through visual and non-visual means of communication, buildings are set free from linear viewpoints. The building can be experienced variously, and not just as an ordered pattern of movement through it. S Rather than icons of stasis, buildings are now more like inclusive fields of organised materialisation. They respond to the users, and are therefore more ‘alive’ than merely concrete structures. Simply put, they are interactive buildings. S Paradoxically, with the aid of digital technology such as heat sensing, electron scanning and satellite imaging, the
user can perceive the entire building from one position without moving at all.
S The idea of place is recast where instantaneous data exchange replaces mobility. Users of workstations, faxes or televisions remain fixed to these points of interface. Thus, in offices, the users of the space do not move through it, but continue to utilise space to transmit data.
S There is an oscillation between the real and the virtual. ATMs, the video arcade, the phone booth—all suggest a merging of the real and the virtual. In concrete spaces one finds means to access the virtual world.
S Contemporary explorations of built form are not based on Euclidean geometries (of the sphere, cube or pyramid) but on the torus or the Mobius strip. This means that architecture frequently adopts non-regular shapes. The body’s curves or ‘twisted rubber bands’ (this latter in the work of Winka Dubbeldam), for instance, are favourite models for contemporary architects. Such architecture presents a new topology looping into and out of, back and forth, on a surface with no discernible end or beginning, interior or exterior.
S Architects now combine mathematical models with particulars from the ‘real’ world, such as pedestrian and automobile movement, environmental elements such as the wind and sun and conditions of views. Thus both linear time and two-dimensional space are altered in such architectural forms. Also, just as neither human movement nor environmental conditions follow regular geometries or time sequences, the architecture also seeks to incorporate the irregular, the random and the contingent.
Some examples from contemporary architects can be used to understand these new attitudes to time and space:
Co-citational Mapping Sulan Kolatan and William J. MacDonald incorporate systematised fields of data into spatial and material relations, what they call co-citational mapping.
1. For example, they index sectional profiles of domestic items that reveal a relation between a soap dish and a seat. The information and profiling is based on morphology rather than function. 2. At the next step, these associations and dissociations are marked into clusters that represent a network of connected elements. Thus a matrix of cross-referenced, hybrid objects is created. The spatial coordination, not based on any initial links of the functional use of the objects, evolves over time. 3. The O/K Apartment in New York city was renovated using this co-citation mapping. The architects produced a ‘domestic scape’ where they compiled a cross-sectional profile of everyday objects: sink, pillow, bed, soap dish, seat, lamp, refrigerator and bathtub. A flowing domestic environment was then created after the architects associated these profiles to form new spatial structures (after ignoring their original scales), such as a single bed–bathscape where the space shaped by the bathtub and the surface of the bedroom floor/wall is seamlessly transformed.
Hypersurface Stephen Perrella’s work is a good example of hypersurface systems. A mixture of matter and media, hypersurface represents a merging of ‘technology, consciousness, instrumentalities (forms and spaces), economy, representations (images) and identities’ (Zellner 1999: 46). In his first series of panel studies (1997–98), Perrella presents a model of such a hypersurface. As structural diagrams, the panels mix flexing interlocked steel or aluminium frames with a mesh-like metal fabric or synthetic membrane. Structure and ‘skin’ are a seamless continuity here. Perrella now uses animation software for his designs. ‘Mobius house’ seeks a space where the interior and exterior merge, where home and world are intermingled.
Parametric Modelling Kas Oosterhuis in his Saltwater Pavilion combines ‘techno-ecologies’ and body-buildings. The ‘form–gene’ underlying the structure is
an octagonal ellipse that modulates into a quadrilateral along a three-dimensional curved path. The form is then kneaded, bent, stretched and rescaled using computers that facilitate the mathematical description of the geometries. In parametric modelling, Oosterhuis rejects numbers and formulae. Instead, behind every line and surface of the virtual model, there is an algorithm that sets the parameters for the geometry.
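The parametric principle—an algorithm, rather than fixed coordinates, generating the geometry—can be illustrated with a short sketch. The following loosely echoes the idea of a cross-section modulating from an ellipse towards a quadrilateral along a path; it is an assumption-laden toy example, not Oosterhuis’s actual ‘form–gene’.

```python
# A minimal sketch of the parametric idea: the designer writes an algorithm,
# and the geometry falls out of its parameters. Here a cross-section is blended
# from an ellipse towards a boxy quadrilateral as it travels along a path -- a
# loose echo of the 'form-gene' idea, not Oosterhuis's Saltwater Pavilion model.

import math

def cross_section(t, n_points=16, a=4.0, b=2.0):
    """Return the cross-section at parameter t in [0, 1].
    t = 0 gives an ellipse (radii a, b); t = 1 gives a rounded quadrilateral."""
    points = []
    for i in range(n_points):
        angle = 2 * math.pi * i / n_points
        ex, ey = a * math.cos(angle), b * math.sin(angle)          # ellipse
        sx = a * max(-1, min(1, 2 * math.cos(angle)))              # squashed
        sy = b * max(-1, min(1, 2 * math.sin(angle)))              # towards a box
        points.append(((1 - t) * ex + t * sx, (1 - t) * ey + t * sy))
    return points

# Sweep the blended section along a curved path: change a, b or the blend rule
# and the whole envelope regenerates -- the parameters, not fixed coordinates,
# define the form.
sections = [cross_section(t / 10) for t in range(11)]
print(len(sections), len(sections[0]))   # 11 cross-sections of 16 points each
```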
Combinatorial Topology Winka Dubbeldam’s work utilises a branch of topology called combinatorial topology, which views figures as combinations (complexes) of simple figures (simplexes) joined together in a regular but endlessly variable manner. Two figures are topologically commensurate if one can be deformed into the other through stretching, bending or twisting but without cutting, tearing or creasing. This architecture therefore rejects the fractalised fashion of folding in favour of loopings and circuits. Dubbeldam’s work employs elastic figures such as the band, the knot and the loop. She seeks to link the fluidity and instability of urban constructs with the constant cyclings of the virtual world. Her design for an international cruise-ship terminal in Yokohama is the most famous example of this ‘twisted rubberband topology’ style. The terminal building is a long space located in the tightly intertwined figure formed by the intersection of two topological bands, which create a bridge structure that twists over the pier’s surface. The terminal is enclosed by the smooth curves of the bands’ geometries, and is a continuous structural envelope (an aluminium frame wrapped in a translucent skin of aluminium ribbons and fibreglass).
Soft Architecture The work of Rotterdam’s NOX company (founded by Lars Spuybroek) seeks to ‘capture the geometry of the fluid and the turbulent but also to dissolve all that is solid and crystalline— static—in architecture’ (Zellner 1999: 112). Spuybroek seeks an architecture that will absorb and enhance the plasticity and
suppleness of the human body in order to integrate it with the advancing technological environment. This is ‘soft’ architecture, directly related to the mobility of the body, its speeds and movements. Spuybroek writes: ‘Imagine that architecture is swallowed up by technology so that it becomes completely capable of absorbing and enhancing the body’s rhythm. That means the body’s rhythm will affect the form. And conversely it means that the form’s rhythmicality will in turn activate the body’ (quoted in Zellner 1999: 115–16). NOX’s freshH2O eXPO, a water pavilion and interactive installation on an island, Neeltje Jans, is an example of this soft architecture. The building’s form is shaped by the fluid deformation of 14 ellipses spaced out over 65 metres. There are no horizontal floors and no external relation to the horizon. The object’s external deformation is mirrored in the interior environment, which registers movements in 17 sensors that respond interactively with the visitor. The light and sound pulses rhythmically with the visitors’ movements.
Transarchitectures/Liquid Architectures Marcus Novak’s work integrates physical and virtual spaces with algorithmic unfolding, meta-data visualisation and navigable environments. Novak writes: ‘Transarchitecture [is] a way of refocusing liquid architectures to broader issues, and to the idea of eversion, or turning-inside-out, of cyberspace. Transarchitecture is a superset that contains and extends liquid architectures’ (quoted in Zellner 1999: 128). In this new mode of ‘transmodernity’ (Novak’s term) he lists algorithmic conception, rapid prototyping, robotic fabrication, interactive habitation, telepresence and telecommunications, and nano- and giga-presence. Novak seeks a link between ‘the real and the virtual, creating a new continuum of space: local(physical)–virtual–(non-local)physical’ (quoted in ibid.: 130). TransPORTs2001, a project Novak executed with Oosterhuis, is an example of this new style. The project is a physical territory of interactive, shape-shifting liquid architectures that visitors can navigate by controlling sensor buoys and variable floating architectural structures through computer control. Physical
elements here are linked to a parallel field of liquid architectures in cyberspace. Navigation in the physical space is (or can be) done simultaneously with the virtual navigation in cyberspace. In a sense, the players literally modify the shape of the building.
COMPUTER MEDIATED LEARNING It is essential to understand the modes in which technology can influence forms of learning. Computers facilitate a greater degree of collaborative learning through peer exchange, interaction among equals, and interchangeability of roles. The existing divide between real-time education (face-to-face teaching/learning) and distance/correspondence education is now diminishing with the new technologies. S Students at even traditional campuses now take online courses from linked institutions (the education networks, for example) or from their own institution. S Professionals can now take a course at their workplace, delivered by print, video and audiographics from a distant university, with or without a local tutor. S Courses can now combine face-to-face seminars, residential weekends and lectures with computer conferencing as the delivery medium in between. S Groups of students at several sites can now be linked by audiographics with the teacher taking turns at each site to run the session (Mason 1995: 212). Technological influences on the methods, systems and software which can be used for education and training include: (i) Increasing availability of low-cost and more powerful PCs at home and workplace. (ii) Advent of multimedia, now adapted to PCs, facilitating asynchronous communication (fax, email, computer conferencing). (iii) Increasing miniaturisation of components and therefore increased portability of computers. (iv) Development of interfaces for voice and handwritten inputs.
(v) ISDN networks and broadband fibre optic highways. (vi) Growth of academic networks. (vii) Expansion of wireless communications from telephony to data transfer (Kaye 1995: 193). Anthony Kaye lists three classes of technology for groupware environments suitable for collaborative learning: 1. Communications systems: Synchronous text, audio, audiographics and video communication, asynchronous email, computer conferencing, voicemail and fax—these require a great deal more preparation by the teacher and good-quality pre-prepared graphics. However, the ability to see a remote tutor or for the tutor to see remote participants does not ensure better interaction. 2. Resource sharing systems: Synchronous screen-sharing, electronic whiteboards, concept mapping tools, asynchronous access to file systems and databases—all this means that each participant has access to a variety of tools and a public workspace accessible to all (WYSIWIS: what you see is what I see). The workspace can also be a private one, where material is visible only to a particular user before being transferred to a public window. Software tools now allow participants to jointly prepare representations of a body of knowledge. 3. Group process support systems: Project management systems, shared calendars, co-authoring tools, voting tools, brainstorming tools—their use makes it possible to prepare specific discussion formats, virtual environments that users can navigate with the help of browsers. Participants can now take on different roles— tutor, learner, resource person, expert, etc. Software is also available to provide support for specialised roles and differential access to resources, tools and activities (Kaye 1995: 198–203).
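The WYSIWIS principle behind such resource-sharing systems can be sketched briefly. The example below is illustrative only—its class and method names are invented rather than drawn from Kaye—and shows a public workspace visible to all participants alongside private drafts that can be transferred into it.

```python
# Illustrative sketch of the WYSIWIS idea from the resource-sharing systems
# above: items placed in the public workspace are seen by everyone, while a
# private workspace stays with one participant until it is transferred.
# Class and method names are my own, not from Kaye.

class SharedWorkspace:
    def __init__(self, participants):
        self.public = []                              # what you see is what I see
        self.private = {p: [] for p in participants}  # visible to one user only

    def draft(self, user, item):
        self.private[user].append(item)

    def publish(self, user, item):
        """Move a privately prepared item into the public window."""
        if item in self.private[user]:
            self.private[user].remove(item)
        self.public.append((user, item))

    def view(self, user):
        """Every participant sees the same public items plus their own drafts."""
        return {"public": list(self.public), "drafts": list(self.private[user])}


ws = SharedWorkspace(["tutor", "learner1", "learner2"])
ws.draft("learner1", "concept map v1")
ws.publish("learner1", "concept map v1")
print(ws.view("learner2"))   # learner2 now sees learner1's published map
```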
VIRTUAL LEARNING ENVIRONMENTS (VLEs)
VLEs are essentially course delivery systems, and generally include course materials, assessment facilities, conferencing and chat software. They also allow for student administration and monitoring, and authoring. Some examples of these systems are WebCt, LearningSpaces and Topclass.
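A minimal sketch of one such feature—the automatically graded quizzes and tests listed below—shows how mechanical this layer of a VLE is. The answer key and function names are invented for the example; this is not the actual code of WebCt, LearningSpaces or Topclass.

```python
# A minimal sketch of the 'automatically graded quiz' feature that such systems
# advertise (see the feature lists below); it is not the actual code of any
# named VLE product.

answer_key = {"q1": "b", "q2": "d", "q3": "a"}

def grade(responses):
    """Mark each response correct/incorrect and return a simple progress record."""
    marks = {q: responses.get(q) == correct for q, correct in answer_key.items()}
    score = sum(marks.values())
    return {"marks": marks, "score": score, "total": len(answer_key)}

result = grade({"q1": "b", "q2": "c", "q3": "a"})
print(result)   # {'marks': {'q1': True, 'q2': False, 'q3': True}, 'score': 2, 'total': 3}
```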
VLEs possess the following student features: S Course content as HTML Web pages, with its navigation system. S Bulletin boards and conferencing allowing all course participants to communicate. S Chat that enables real-time communication among course participants. S Email that enables one-to-one communication. S Personal notebook that allows students to take notes. S Whiteboard, which is a shared work area facilitating simultaneous (or synchronous) communication within the group. Any user can draw on her or his whiteboard, and all users can see what is drawn. S Tests, where student responses are marked correct or incorrect by the system. S Student presentation areas where students upload preprepared Web pages that can be viewed by all participants. VLEs possess the following teacher features: S Progress tracking, where a teacher tracks the student’s work by looking at what the student has accessed, when accessed and time spent. S Automatically graded quizzes delivered online. S Student management through class and grade lists. Most VLEs possess the following technical features: S Standard interface with built-in navigation features, pre-set headings and subheadings and contents pages generated automatically. S Customisation of Web pages for particular courses, including graphics, text size, links. S Site management with facilities for back-up of materials, uploading of files from instructors and version tracking. Three major developments in virtual education have been the virtual museum, the virtual library and the virtual laboratory. Each of these is an example of a VLE. Through networks of such VLEs,
education can be extended beyond the space of the classroom. Thus ‘extension’ education or ‘distance’ education are irrelevant terms when the classroom can now be reached or extended into the factory, playroom, home or arcade.
1. Virtual museums are an important aspect of virtual universities. Exploratorium, one of the most interactive museum sites on the Web (), is a vast collection of databases. The world’s major museums such as the Tate Museum in London can be accessed online. The visitor can view images of the major works, download and even print some of them, and (since the Web also serves commercial interests) buy prints online. Curators and researchers can call up digital images of objects stored in another museum (‘off site’) or storage facility, and add to the information already available (on, say, site 1). A visitor to the museum can now access images or events that are related to the ones on display on site 1. The museum is thus available anywhere, and in infinite expanse.
A particularly important endeavour is the Four Directions Project funded by the US Department of Education. This project seeks to support culturally responsible education for Native American students. Its three stated aims—culturally responsive teaching, cultural revitalisation and cultural collaboration—inspire the creation of suitable projects. For example, a student working on a virtual museum of Native American cultures enhances community (student, Native American, or the general public) access to material which has for a long time been located in faraway museums, making that access more immediate, and thus creates a culturally responsive learning, since objects/artifacts are now within virtual ‘reach’ for reading, analysis and description. The project has three partner institutions: the Pueblo of Laguna Department of Education, the University of Texas and the Smithsonian National Museum of the American Indian. This project is a good example of the use of technology to collaboratively develop virtual native museums.
3. Virtual libraries demolish physical constraints of space both for storage of material and access. The Online Computer Library Center () is one of the pioneers in the field. In a digital library, the resources are multimedia objects. WorldCat is a merged electronic catalogue with 36 million records in eight bibliographic formats and growing at the rate of 2 million records every year (Ryan et al. 2000: 83). The Library of Congress—one of the most popular of these libraries (with over 2,695,000 hits during a weekday in February 2000)—is host to the American Memory Project which compiles a vast collection of photographs, movies, documents and sound recordings. History writing and learning, with such a virtual university, is obviously not going to be the same again. Other major online libraries include the US National Library of Medicine () and the Virtual Library of WWW Development (). The Internet Public Library is an interesting metalibrary where one can go and read a newspaper—published in the USA or Uganda—wherever and whenever one can go online ().
NOTES
1. For a study of ‘The Impermanence Agent’ see Wardrip-Fruin and Moss (2002).
2. See, for example, Richard Beacham and James Packer’s work seeking to ‘restore’ in virtual space the famous ancient Roman Theatre of Pompey. Called The Pompey Project, this is particularly significant since further excavations of the site are no longer possible. For a study of the project see Hugh Denard (2002). Denard points out that virtuality is the only reality in this case because it is the only reality in which the information structure and appearance of the whole object now exists (ibid.: 34).
SUGGESTED READING
Cavallaro, Dani. 2000. Cyberpunk and Cyberculture: Science Fiction and the Work of William Gibson. London and New Brunswick, NJ: Athlone.
Darley, Andrew. 2000. Visual Digital Culture: Surface Play and Spectacle in New Media Genres. London and New York: Routledge.
Genders, 18. Winter 1993. Special issue on ‘Cyberpunk: Technologies of Cultural Identity’.
Jolliffe, Alan, Jonathan Ritter and David Stevens. 2001. The Online Learning Handbook: Developing and Using Web-Based Learning. London: Kogan Page.
McCaffery, Larry (ed.). 1991. Storming the Reality Studio: A Casebook of Cyberpunk and Postmodern Fiction. Durham: Duke University Press.
Penley, Constance, Elisabeth Lyon, Lynn Spigel and Janet Bergstrom (eds). 1991. Close Encounters: Film, Feminism, and Science Fiction. Minneapolis: Minnesota University Press.
Penley, Constance and Andrew Ross (eds). 1991. Technoculture. Minneapolis: Minnesota University Press.
Pfeil, Fred. 1990. Another Tale to Tell: Politics and Narrative in Postmodern Culture. New York: Verso.
Zellner, Peter. 1999. Hybrid Space: New Forms in Digital Architecture. London: Thames and Hudson.
CHAPTER THREE
POLITICS
What Orwell feared were those who would ban books. What [Aldous] Huxley feared was that there would be no reason to ban a book, for there would be no one who wanted to read one. Orwell feared those who would deprive us of information. Huxley feared those who would give us so much that we would be reduced to passivity and egoism. Orwell feared that the truth would be concealed from us. Huxley feared that the truth would be drowned in a sea of irrelevance. (Neil Postman, Amusing Ourselves to Death, 1986)
Postman’s pithy comment could easily apply to contemporary technologies and politics. The quote distils two ‘fears’ about contemporary technoculture. One is the fear of technology being abused. The other is the fear of technology’s potential being fulfilled but in very unexpected directions. The use and abuse of technology is a political subject, since it is invariably some form of politics that decides on the ‘implementation’ of a new technology or the ‘outdating’ of another. Further, new technologies both demand and create new forms of politics. Print transformed sixteenth- and seventeenth-century Europe (as Elizabeth Eisenstein’s (1991 [1979]) massive study has demonstrated. See also Asa Briggs and Peter Burke’s (2002) exhaustive A Social History of the Media from Gutenberg to the Internet). Television introduced televised national
opinion polls and Presidential debates (in the USA). Propaganda, diatribe, polemics and plain abuse—all use available technology. ICTs are hailed in some quarters for heralding a new age for democracy. Others see it as a means of control and domination. In a sense, both arguments are true—all new technologies carry within them the potential for their own misuse and abuse. This chapter outlines some of the central themes of cyberpolitics. Since an introduction to (any) politics must account for the diversity of views on it, the chapter maps the utopian and dystopian interpretations of cyberculture to provide a detailed sketch of the political implications of the new technologies. Steven Jones (1995: 26) discerns five cultural and political ‘applications’ of computer-mediated communication (CMC). According to him, it (a) creates opportunities for education and learning; (b) creates new opportunities for participatory democracy; (c) establishes countercultures; (d) complicates legal debates on privacy, copyrights and ethics; and (e) restructures man–machine interaction. David Hakken’s ethnographic approach presents a scheme where we move from the smallest to the most complex level in our analysis of cyberpolitics. Hakken lists six levels: (i) The basic characteristics of the entities constituting cyberspace. (ii) The self-identities formed by such entities. (iii) The micro, or close social relations (between intimates and friends) these entities construct. (iv) Their meso-, or intermediate, social relations (community). (v) Their macro social relations (national and transnational). (vi) The political–economic structures which cyberspace entities produce and reproduce (1999: 7). Arturo Escobar’s anthropology of cyberculture suggests five ‘domains’ for analysis: 1. The production and use of new technologies. This includes studies of both ‘designers’ (scientists) and users of these technologies, and the construction of subjectivities and identities.
2. Appearance of computer-mediated communities. What is the kind of community that gets formed in cyberspace? An ‘interface anthropology’ (the term is adapted from Brenda Laurel) which looks at the user/context intersections and the interface design that enables users to interact with each other is needed.
3. Studies of the popular culture of science and technology. This is a study of technology in the social realm.
4. The growth of human computer-mediated communication. The relationship between language, communication, social structures and cultural identity needs to be studied.
5. The political economy of cyberculture, especially the relationship between information and capital, questions of class and elitism, the institutional control or apparatuses of information-gathering and dissemination (Escobar 1996: 121–7).
These pointers from Jones, Hakken and Escobar provide us with a lead while examining the issues involved in cyberpolitics.1
CYBERSPACE AND THE SOCIAL In order to understand the varied dimensions of cyberpolitics we need to first have a clear notion of power. Some simple definitions of power are summarised here: S Power is the ability of an individual or community to act. That is, in very simple terms, power involves a question of agency. Power can be technological, military, epistemological, intellectual, or economic. Power could be the capacity to produce specific effects and perform certain actions. S It is also the ability to overcome resistance. This is power as domination, where the person with power can enforce her/his wishes over others with less power. In both these cases power is treated as a possession. S Power can be delegated. A society may delegate the power of governance to a governor. It may also invest power in certain institutions (courts, legislature), objects (such
systems of control as road rules/signs, toll-gates), or gestures (such as the duty of every citizen to pay tax). Power is thus the effect of a consensus, the authorisation by a community or any collective structure (commonly termed ‘society’). S Power structures relationships of inequality between people. This notion, adapted from Michel Foucault, treats power as an instrument of domination. Power produces subjects. What is significant to this concept of power is that resistance is integral to all forms of domination: there is no domination without resistance. But power is also multiple and productive. Foucault’s definition is worth quoting in full here:
Power applies to immediate everyday life which categorises the individual, marks him by his own individuality, attaches him to his own identity, imposes a law of truth on him which he must recognise and which others must recognise in him. It is a form of power which makes individuals subjects. There are two meanings of the word subject: subject to someone else by control and dependence, and tied to his own identity by a conscience or self-knowledge. Both meanings suggest a form of power which subjugates and makes subjects. (Foucault 1983: 212)
To understand the nature of politics in cyberspace, we also need to identify the role of the individual and society in cyberspace.
INDIVIDUAL IDENTITY IN CYBERSPACE The notion of identity as fluid is a first step in understanding the role of the individual in cyberspace. There are several modes of identity formation and identity switching in cyberspace. ‘Avatars’— a term coined by the novelist Neal Stephenson in Snow Crash to describe the identity created in cyberspace—possess certain identifiers that mark their online identity. These include addresses, names and self-descriptions. Email addresses (such as xyz@hotmail. com) identify the name, real or false, of the sender. Any additional
tags at the end of the email message (such as institutional affiliation) provide further information about the sender. A discussion group’s listserv provides all members of the group every member’s email address, and some idea of the identities and profiles of these members. Other clues include the domain labels attached to email addresses. For instance, .com indicates a commercial organisation, .gov a government one, while .ac.uk would indicate an education network in the UK, and .edu an education network. Hierarchies are difficult to maintain in cyberspace. Every individual can get adequate attention in discussion groups, and s/he cannot be ‘shouted down’ by others (as may happen in real-time discussions). Identity fluidity contributes to this anti-hierarchic nature of cyberspace. If identity tags are erased, then the extent of silencing-by-identification is reduced. Women masquerading as men online, for instance, seek to neutralise gender-based discrimination. Further, the real-time official designation of the person does not matter in cyberspace. Anyone in a discussion group about democracy could contribute to it by posing as a professor of political science. In real time, with the emphasis on status and designation, any equality between the ‘common’ man and such a professor in the context of such a debate is much more unlikely. At the level of the individual, then, cyberspace ignores identities. At institutional levels hierarchies can be reordered to benefit individuals. Myron Krueger describes this fluctuating, unstable self-identity and its social relations thus: In the ultimate artificial reality, physical appearance will be completely composable. You might choose on one occasion to be tall and beautiful; on another you might wish to be short and plain. It would be instructive to see how changed physical attributes altered your interactions with other people. Not only might people treat you differently, but you might find yourself treating them differently as well. (quoted in Robins 1999: 138) Subjectivity, then, is ‘dispersed throughout the cybernetic circuit […] the boundaries of the self are defined less by the skin than by the feedback loops connecting body and simulation in a techno– bio–integrated circuit’ (Hayles 1993: 72).
Michael Benedikt argues that cyberspace becomes a stage where everyone and anyone can play roles. It is the space where anyone can fulfil her or his fantasy to be this mythic figure or that historical hero, to be a Hollywood star or an animal. ‘Cyberspace can be seen as an extension […] of our age-old capacity and need to dwell in fiction, to dwell empowered or enlightened on other, mythic planes’ (Benedikt 1991: 6). Benedikt points at the entirely mythic universe that an individual can occupy through her or his avatar. In a sense, cyberspace is a space that appeals to our romanticism. But this fluidity of identity has its advantages and disadvantages: (i) It may enable a person to divest her/himself of certain traits if the response to her/his online role-playing is rewarded. (ii) However, if at any point the distinction between online and offline identities breaks down, then the game moves from roleplay to deceit (Wallace 1999: 53–54).
An increase in the number of participants in discussions is possible in cyberspace. Vast numbers cannot really be involved in real-time discussions due to various physical constraints. Such limitations are absent in cyberspace. Expertise is not necessary for participating in such online decision-making discussions. And the difficulty of censoring information in cyberspace makes for a relatively ‘freer’ atmosphere for discussion and information exchange. Technology therefore literally constructs identity, thereby governing access to politics and citizenship.
THE COMMUNITY IN CYBERSPACE David Hakken (1999: 96–97) suggests four questions as starting points for the analysis of the community in cyberspace: S How are communities different in cyberspace? Are they more network-oriented, and less group-oriented? S How does cyberspace affect the social reproduction of regions, the physically bound spaces within which humans
spend most of their time? If the region becomes less important as a mediator of social formation reproduction, then what replaces it? S Are organisations in cyberspace moving towards a new ‘cyberfacture’ stage in the labour process? Will such changes in work processes and conditions be influenced by civil bodies? S What are the social and political implications of cyberspacing meso-social (intermediate, community) relations? Howard Rheingold popularised the idea of the virtual community, which is a group of digitised identities that meet regularly in cyberspace. By doing this they create a social space in cyberspace, just as meeting regularly in a park or a bar or a local train creates a social community in that space. Community formation occurs primarily through interactivity. The new communication technologies represent a whole new dynamics of interaction. The members of a virtual community share a system of information and communication. These shared modes could be discussions of the latest cricket match (as happens in local trains), the discussions of a group devoted to a study of the environment, or corporate meetings in cyberspace. The community usually consists of people with shared interests (as in most social groups). However, unlike ‘meatspace’ (a term William Gibson and other cyberpunk writers use to distinguish it from virtual space) communities, where geographical distance is of prime importance in community formation, cyberspace has no geographical restrictions. This is not to say that the Internet automatically facilitates the smooth formation of an egalitarian community. Cultural differences and similarities often prove equally strong barriers in cyberspace. Though the Internet represents a massive multicultural organisation, communities in cyberspace still require some cultural commonality. However, the Internet does facilitate to a remarkable extent our discovery of a like-minded set of people. Patricia Wallace puts it succinctly when she writes: ‘People who share your interest and lean in the same direction as you are just a few keystrokes away, regardless of the issue’s obscurity, social desirability, or bizarreness’ (1999: 79). However, as in real time, violence and assault can occur in cyberspace too. The infamous
case of Mr Bungle in the MUD LambdaMOO is an illustration (Dibbell 1994). Hacking—seen as a criminal activity by business houses and governments—is an example of a subcultural community that works within Net communities. As Douglas Thomas’ excellent work has demonstrated, hacking is a subculture that seeks to undermine the ‘culture of secrecy’ of the information age. The hacking community believes in the freedom of information and resists the control of databanks by any authority. Hackers believe that the commodification of information has led to an increasing investment of power in the media (Thomas 2002). The Website www.hackers.com declares: ‘Hackers are the backbone of computer science!’ Cyberspace communities can also be alienating. The worker networking from a home-office is in a depersonalised state where there is no face-to-face communication. S/he is just another worker in a cell, where any identity s/he may possess is erased simply by reducing her/him to [email protected]. S/he is a name + a domain (a human being + a computer—now termed a ‘monad’), with no face, no gestures, no tone of voice to distinguish her/him from several million others. However, cyberspace is only a part of ‘meatspace’, and even after hours of cyborging and networking, one needs to get back to meatspace. As Allucquere Rosanne Stone says: ‘Virtual community originates in, and must return to the physical’ (quoted in Gray 2001: 134). However, enthusiasts of cyberspace communities (Howard Rheingold, for instance) believe that the success of virtual communities is an indication of the decline of the real. Mark Poster argues that all communities down the ages have been based upon an essential, fixed and stable identity of its members/constituents. However, virtual technologies, prosthetics and genetic engineering interrogate and upset this notion of a fixed and stable subject. The fluidity of identity (discussed earlier) means that a redefinition of ‘self ’ is called for. And, by extension, the notion of community. Howard Rheingold asks: ‘Are relationships and commitments as we know them even possible in a place where identities are fluid?’ (1993: 61). This is the age of ‘technosociality’ where new technologies set up social
BOX 3.1 CYBERCRIME
The increased interconnectivity of nations has also resulted in transnational cybercrime. The hacking into telecom networks such as WorldCom by the group ‘Phonemasters’, the Melissa Virus created by David L. Smith (1999) that is estimated to have caused damage worth $80 million, and the five Russian hackers who stole 5,400 credit card numbers from the Internet and appropriated more than $630,000 before they were arrested, are all examples of cybercrime on a transnational scale. In addition, there have been attacks on official government websites across the globe (India too was affected during 2000–2001), with very serious ones such as bogus changes being introduced into the official exchange rate of the national currency in Romania.
A ‘Draft International Convention to Enhance Protection from Cyber Crime and Terrorism’ was prepared by Abraham Sofaer, Gregory Grove and George Wilson at the Hoover Institution Conference of 1999. It listed the following as ‘offenses’:
S Creating, storing, deleting or transmitting data with the intention of disrupting another cyber system, or providing false information.
S Accessing restricted areas of a cyber system.
S Interfering with tamper-detecting or authentication mechanisms.
S Committing acts prohibited by civil aviation, military or drug legislation.
The Draft also listed jurisdiction for legal appeal and extradition treaties between the signatory parties, and provided for the setting up of the ‘Agency for Information Infrastructure Protection’.
relations. Further, Poster argues that relationships in a real community are based (also) upon ‘visual cues’ about gender, age, ethnicity and social status. In cyberspace, where such cues do not exist/matter, dialogues flourish quickly, uninhibited by any such visual cue. Poster notes that even an enthusiastic supporter of virtual life such as Howard Rheingold acknowledges the ‘hunger for community’ (Rheingold 1992) that drives people into virtual participation. Poster believes that communities in virtual space are also driven by the concept of a real community. Poster concludes
that ‘virtual communities derive some of their verisimilitude from being treated as if they were plain communities, allowing members to experience communications in cyberspace as if they were embodied social interactions’ (Poster 1997: 89–90). The disembodied, fluid and floating identities enjoy a sense of unlimited freedom. However, when such a fantasy of unlimited freedom is socially institutionalised, it will have to negotiate with the real world of situated identities. As Michael Heim suggests— inspired, probably, by the work of Emmanuel Levinas on ethics— this implies great changes to the ethics of interpersonal relations. When the computer ‘amplif[ies] an amoral indifference to human relationships’, the ethical disposition necessary for coexistence is made redundant. As Heim puts it, it is the ‘continuity of grounded identity that underpins and underwrites moral obligation and commitment’ (quoted in Robins 1998 [1995]: 144–45). Critics like Michael Benedikt (1991) argue that in postmodern times cyberspace is seen as a utopian space. This is imagined as an alternative to contemporary social reality—a space, therefore, akin to Disneyland (Robins 1998 [1995]: 146). This idealisation is visible in Rheingold’s The Virtual Community (1992). Rheingold argues that the virtual offers us a space where we can try and recover the values lost in meatspace: We reduce and encode our identities as words on a screen, decode and unpack the identities of others. The way we use these words, the stories (true and false) we tell about ourselves (or about the identity we want people to believe us to be) is what determines our identities in cyberspace. The aggregation of personae, interacting with each other, determines the nature of the collective culture. (Rheingold 1993: 61) This means that we may be able to construct a new community linked by interest and affinity rather than geographical location. Rheingold suggests that we may even envisage a ‘global civil society’ with a ‘revitalise[d] citizen-based democracy’, transcending national, geopolitical boundaries. As we have already noted, Rheingold’s vision is one where the notion of community still dominates (ibid.: 10–15). Cyberspace is ‘communitarian places
online’. Rheingold’s is perhaps the strongest argument in favour of the notion of cyberspace as community. However, Rheingold’s vision ignores certain crucial features of communities. Conflict, difference and inequality are inherent features of all communities. Community formation occurs when there is a subsumption of these differences for the greater common good. Communities are formed out of such forced homogenisation. Contemporary democratic politics, in most cases, is an attempt at acknowledging difference and forging a common consensus retaining these differences; in fact democracy thrives on such healthy conflicts. The pluralism that most nation–states seek is precisely this kind of a differentiated-yet-coherent community. To believe that virtual communities eliminate difference is to believe in an illusory world of perfect unanimity. The Internet must be seen, as David Holmes points out, as both a carrier or register of information and a means of communication: as both a ‘storage network’ and an interactive environment (Holmes 1997: 36–37). The space of cyberspace is based on knowledge and information, a part of this knowledge being the ability to access and navigate cyberspace (a theme to which I shall return in the section on cyberpower and cyberpolitics). The ritual sharing of information ‘makes’ a community, and it is this symbolic interaction that links one group to another (Jones 1995: 19–20). Critics such as James Beniger and Scott Peck speak of a ‘pseudo community’ where people lack a genuine personal commitment to one another. However, they also suggest that we need to redefine what we mean by ‘genuine’ commitment (quoted in ibid.: 24–25). Evidently, none of the terms, concepts or definitions are definitive yet. Traditional notions of ‘commitment’ and ‘public good’ need to be redefined for new forms of politics and social interaction.
PRODUCTION AND CONSUMPTION IN CYBERSPACE Production A traditional definition of production would be ‘the processes by which a commodity is created’. Production has the following components: (a) capital or money, (b) the processes by which
capital, technology, raw material and labour are harnessed to the production process, and (c) the skilled labour involved in the actual production. Capital has been transformed with virtual technology. Finance capital is now truly global, has its effects in real time, and is selfperpetuating. The interlinking of the world’s major stock exchanges and banks means that agents and governments can now deal with exchanges across national boundaries (Castells 1996: 434–47). Stocks, bonds and cash move globally entirely in cyberspace. The very fact that large amounts of money are represented only as numbers on a screen means that cash flows are simply information flows: money is data. The production process has also changed. With the global linkages at work in the information society, the process of production is now split. Banks, labour, sources of raw material and the location of the factory were, at one time, circumscribed within a specific geographical site. In contrast, the contemporary production process is fragmented. The factory is frequently located in the so-called ‘Third World’ countries because of a perennial supply of cheap labour (Lyon 1988: 114). Finances are controlled by banks anywhere in the world. And expertise comes from the controlling research facilities in developed nations. (Manuel Castells’ detailed exposition of this fragmented production process is illuminating. See his 1989 work, The Informational City.) Castells identifies four forms of labour in contemporary informational production systems: the producers of high value, based on informational labour; the producers of high volume, based on lowcost labour; the producers of raw materials, based on natural endowments; and the redundant producers, based on devalued labour. Castells’ central emphasis here is that these forms of labour are not restricted to geopolitical sites. Any country can contain all four types in varying quantities. What is important is that these forms of labour are connected globally, allowing an MNC to coordinate all four types across the globe (Castells 1996: 146–48). Analysing the development of virtual technology and its impact on production, Greg Downey notes that human actors in the history of information Internetworks in the US have laboured in three roles: as managers of state and corporate institutions owning parts of the Internetworks; as consumers of the Internetwork
services and commodities; and as workers paid to help produce and reproduce the Internetwork on a daily basis. Internet workers, Downey points out, are both technologies themselves and moulders of technology (Downey 2002: 209–35).
Dromoeconomics: This is a term used to describe the financial and other ‘productions’ of cyberspace. Davidow and Malone define it as ‘modes of production organised around controlling the speeding flows of capital, labour, information, products, resources and techniques coursing through global modes of production as the output of “virtual offices”, “virtual factories” and “virtual corporations”’ (quoted in Luke 1999: 32). Every item/commodity is now bar-coded to track its movement through virtual offices, factories and stores. Production of services such as legal, financial, contracts, promotional campaigns (those pop-ups!), data analysis of sales, production output (itself recorded on computers in the manufacturing unit) and scientific papers can be conducted entirely online. A mass customisation is visible in these new dromoeconomic systems of production. Virtual firms produce goods and services by creating knowledge about dromoeconomic flows (when and what quantity of globally sourced components need assembling for sale at local points of use). Data from these flows can be guided into the most profitable and fast sites of extraction/production/accumulation/circulation so that the buyer gets the commodity immediately (ibid.: 32–33. See section on niche marketing under ‘Consumption’ for a discussion of the local sales/global manufacture condition).
The Internet and Business
Paul Timmers lists the following key features of Internet businesses:
(i) Availability: 24 hours a day.
(ii) Ubiquity: Eventually every company can possess an Internet connection.
(iii) Global: No physical borders.
(iv) Local: Local presence and person-to-person relationships.
(v) Digitisation: Even physical products will eventually be marketed in info-space.
(vi) Multimedia: For consultancy, entertainment and marketing.
(vii) Interactivity: Customer service can be greatly improved.
(viii) One-to-one: Using feedback mechanisms and customer profiling, business can be personalised (see section on niche marketing under ‘Consumption’).
(ix) Network effects and network externalities: The increase in the number of parties in the network reduces costs, and enables virtual communities and third-party marketplaces.
(x) Integration: Information integration about all aspects of production and consumption (Timmers 2000: 10).
Timmers also analyses current business models in e-commerce. These include: (i)
(ii)
(iii) (iv)
(v)
(vi) (vii)
E-shops: Web marketing of a company or a shop. Any company with its website can be said to have a basic e-shop. The possibility of ordering and paying may be added to this website. These websites can serve as business-to-business shops or business-to-consumer shops. E-procurement: Electronic tendering and procurement of goods and services. Large companies and public authorities use electronic negotiations and contracting, call for tenders on a global scale, and encourage collaborative tendering. E-malls: A collection of e-shops, linked under a brand (say, www.addall.com that unites several online bookshops). E-auctions: Electronic bidding accompanying multimedia presentation of goods. Contracting, payments and delivery can also be arranged electronically. Virtual communities: Either customers or partners adding information to the basic environment of the virtual community (a community of booksellers, for instance). Collaboration platforms: For collaborative design and engineering, and project support to a virtual team. Third-party marketplace: Where companies wish to leave Web marketing to a third party. The third-party marketplace is an additional online channel to serve other existing channels (including physical ones). These marketplaces allow a user to interface with the suppliers’ product catalogue.
There are also other models of Internet business: value-chain integrators, value-chain service providers and information brokers/ trust providers (Timmers 2000: 35–41).
Consumption Consumption is the utilisation of commodities, with mass consumption being a key element of informational economies. Two features of modern consumption patterns stand out: global markets and niche marketing. Trade agreements executed between governments are meant to create larger markets for goods—mainly material goods. Cyberpsace coordinates trade between various geopolitical regions for expanding markets. Niche marketing fragments the market into smaller and smaller groups, and eventually (at least theoretically) every individual can find a commodity to suit her/his exact specifications. This becomes feasible because in the information economy, with each visit to the store, the individual’s data is recorded for future use. This is a major development in the structure of consumption. Information feeds back into the production system and the product with the specifications of that individual fulfilled arrives at the store. Evidently such particularisation can occur only through fast(er) information flows in cyberspace. In a sense, the users/consumers are themselves producing the technology: they are now active participants in (technological) innovation. Products are ‘recontextualised’ when people live with technologies not of their own making, but assign values to the product (Miller 1995: 17). The fact that supermarkets record individual purchases simply means that they possess the exact details of a customer’s preferences and needs. Hearn and Roseneil describe the new order of consumption thus: The individual act of consumption is no longer just about the buying of that particular product at that particular time; rather it has also come to be understood as an item in the production of the overall consumption pattern of particular individuals and types of individuals. This is particularly clear in credit card transactions and the electronic surveillance that accompanies them [purchases by individual consumers]. (1999: 2)
The Internet, then, is a site of personal consumption outlets for the individual purchase of information, goods and services (Noveck 2000: 28). In terms of new consumer technologies for the home, several changes are visible. From the 1990s ICTs have been applied to domestic consumer goods such as video cassette recorders, video games and, more recently, multimedia CD-ROMs. These technologies have transformed the functionality of products. The use of microelectronic components and computerised sensors has produced more miniaturised and portable equipment, more programmability and greater information storage about the individual user’s needs and preferences. The product or technological device now records the individual’s needs for future use, so that on subsequent occasions the user need not reprogram the tool/object. The development of ICT-controlled services such as telebanking and teleshopping has transformed daily experiences. The use of videophones, multimedia home computers, alarms and sensors has also changed life within the space of the house. Integrated home systems are now possible, where all household systems—air conditioning, electric lights, gas supply, burglar alarms, doors and windows—can be computerised to respond together to altering conditions and reset themselves.

In order to understand the consumption of technology, we need to analyse the modes by which values are assigned to technological artifacts. Two important questions need to be answered for an understanding of the politics of any technology: how technologies are embedded into the fabric of organisations, and how (particular) technologies come to have value (McLaughlin et al. 1999: 51; also see Irwin and Wynne 1998 [1996]). For this purpose, the social constructivist approach to technology (commonly termed Social Studies of Technology [SSOT] and Social Construction of Technology [SCOT]) is the most appropriate. The following features need to be kept in mind when analysing technology:

(i) The ‘technical’ is a set of cultural assumptions and practices that privilege certain actors, groupings and ‘solutions’ over others. Consigning people, skills and artifacts to or excluding them from the category of the ‘technical’ is a cultural act. Technical aspects of technology are closed off—commonly called, in SSOT, the ‘black box’ of technology—where once technologies move beyond their engineering origins, these origins are not understood (McLaughlin et al. 1999: 43–44).

(ii) Technical language and technoculture displace and mystify the politics and management of technology through a strategy where certain ways of speaking and acting (generally termed ‘technological capacity’) become valued. Technology is never presented as the result of efforts driven by (say) economic or political concerns—it is presented as the natural outcome of value-free ‘research’.

(iii) Instrumental rationality also claims neutrality and universality, while simultaneously offering the promise of control. Efficiency, rules and formal measures are valued for their own sake, and technology/technological rules assume the status of natural laws rather than cultural beliefs. They naturalise certain categories, relationships and practices and reject others as ‘irrational’ (the most damning rejection in terms of modern technocultures) (ibid.: 29–31).

Silverstone et al. identify four stages in the process of technology consumption:

(i) Appropriation: The route to the possession of the good.
(ii) Objectification: Goods added to the existing systems of objects and meanings via meaning and display.
(iii) Incorporation: As goods become part of the everyday routines and politics of the household, they become invisible as commodities.
(iv) Conversion: Goods become implicated in relationships within the household and between the household and the outside world (quoted in ibid.: 52).

Margolis and Resnick point out that interactions in cyberspace increasingly resemble those of the pluralistic market space of the real world. They write:

The same enterprises that dominate business and […] politics in the real world, appear to dominate the business and politics of cyberspace. Moreover, the Internet users themselves act primarily as consumers or audience members, not producers, discussants, technical wizards, or political participants. (Margolis and Resnick 2000: 48)
CYBERPOLITICS

The essential components of contemporary politics, with a few supplements specific to the new technologies, must remain the cornerstone of any discussion of cyberpolitics. Michael Margolis and David Resnick identify three types of Internet politics:

(i) Intra-Net politics: This is politics within the Net, and has as its object the operation of the Net. It deals with matters that can be settled without reference to political or legal entities outside the Net community. It includes establishing technical standards and behavioural norms, settling disputes internal to newsgroups and listservs, and creating and structuring new virtual communities. This type of politics treats cyberspace as a virtual state of nature. People who work in cyberspace are termed ‘netizens’. For many of them Netlife is an extension of ‘real’ life. Interestingly, the politics of keeping the Net free is taking place in the realm of real space.

(ii) Politics that affects the Net: This includes actions and policies that are taken by political entities. It is frequently treated as an intrusion by those who believe that genuine Net politics can only be intra-Net politics. There has been a spurt of interest in the legal aspects of the Internet, especially after it has become a mass communication medium. The main concern has been the freedom of cyberspace. China has blocked access to many Websites. German prosecutors have threatened legal action against online services that provide neo-Nazi propaganda. The US government and the Organization for Economic Cooperation and Development are exploring ways of establishing guidelines for encryption.

(iii) Political uses of the Net: This refers to the use of the Net to influence public opinion and political activities offline. Political sites urge people to enroll in signature campaigns and to email authorities. With the introduction of electronic voting, the Net may increase democratic participation. The 1996 American election was the first for which there was significant campaigning on the Web. Major media corporations set up Websites that attracted attention and followers (2000: 8–21).
THE PUBLIC SPHERE

Closely aligned with ‘community’, the public sphere is an integral component of any discussion of politics, real or virtual. Virtual technologies and cyberspace effect a shrinking of the public space of everyday life. An increasing privatisation of the individual occurs through this shrinking. Social relationships are now established via email and chat. Earlier, technologies were used to facilitate face-to-face meetings. Contemporary technologies ensure that one meets only in this space of data exchange. This means that the social sources of an individual’s identity are increasingly lost (see section on ‘Individual Identity in Cyberspace’). The online dissemination of information can potentially damage political parties or the power elite, because it can induce skepticism and/or resistance by the citizenry. Such a transformation of the public sphere has been enthusiastically summed up by John Katz in his ‘Birth of a Digital Nation’ thus:

The people rushing toward the millennium with their fingers on the keyboards of the Information Age could become one of the most powerful political forces in history. Technology is power. Education is power. Communication is power. The digital young have all three. No other social group is as poised to dominate culture and politics in the twenty-first century. (Katz 1997)

One important example of the use of the Internet as a medium of revolutionary thinking is the Zapatista communiqués. The Zapatistas combine features of the peasant rebellions with corporate publicity techniques. Their communiqués are frequently put out on the Internet by solidarity groups and sympathisers. Since Internet information is freely disseminated and ageographical, this becomes a major source of political activity. Zapatistas: Documents of the New Mexican Revolution—one of the premier
tracts of the movement—was first published on the Internet before appearing in print form.

Tim Jordan speaks of a virtual imaginary—the collective imagination of cyberspace. Adapting from Benedict Anderson’s influential formulation of nations as ‘imagined communities’ and Howard Rheingold’s work, Jordan argues that imaginaries often appear as becoming real. The imaginary offers promises that, in Jordan’s terms, are not ‘hopes and fears’, but ‘real projects just one or two steps away from completion’ (Jordan 1999: 183). Technological solutions always appear as becoming real. Cyberspace offers visions of either heaven or hell. Further, as Elizabeth Reid points out, cyberspace may be a technological construct, but virtual reality is a mental construct. According to her:

Virtual worlds exist not in the technology used to represent them nor purely in the mind of the user but in the relationship … between internal mental constructs and technologically generated representations of these constructs. The illusion of reality lies not in the machinery itself but in the user’s willingness … [the user needs] to treat the [virtual] manifestations of his or her imaginings as [if] they were real. (Reid 1995: 165–66, emphasis added)

Cyberspace’s projected utopia/dystopia, Jordan argues, proceeds from the realisation that everything is controlled by information codes. Particular visions are collectively imagined and circulated, and then members recognise themselves as part of a community (Jordan 1999: 205). Jordan is essentially speaking of a cybercommunity or a public sphere built entirely on information exchange and a shared imaginary. What is important in cyberpolitics is that cyberspace and virtual environments/communities are not merely simulations of the real, but are substantively their own environments. Virtual reality demands that abstract communities be seen as their own entities, self-contained and self-referential. Since one of the central features of democratic public life is mutual assembly, virtual technologies call for a new way of thinking about the public sphere of assembly itself. There is a progressive ‘cellularisation’ of public life when it is possible to conduct economic, political or social affairs without ever meeting strangers. Mark Poster,
therefore, argues that the ‘age of the public sphere as face-to-face talk’ is clearly over. He argues that the Habermasian model of the public sphere—as a homogeneous space of embodied subjects in symmetrical relations, pursuing consensus through rational arguments—is not applicable to electronic politics (Poster 1997: 219–21).
CYBERPOWER

Cyberpower has three interrelated aspects: the individual, the social and the imaginary. An individual’s power is dependent upon the body’s avatars and virtual hierarchies. Social cyberpower is dependent upon the informational space of flows that produce a virtual elite and a technological lower class. The virtual imaginary consists of the utopia and dystopia that we have already spoken about (Jordan 1999: 208).

The cyberpower of the individual consists of access to information and the defence of her/his online rights. The demand for more technology proceeds from the increasing demand for information. An individual becomes ‘empowered’ with more information. However, this has another side to it. Such structures of information-gathering and dissemination also create a techno-elite—a group which has the expertise to locate and control information. Social cyberpower is the negotiation between the drive to domination by this techno-elite and the individual. The collective imaginary is, of course, what produces newer technologies—created in an attempt to fulfil the dreams of the imaginary.

There are also arguments about the social power available in information control. Chris Bradshaw and Richard Sennett, for instance, argue that control over information makes people more friendly than they would otherwise have been. Introverted people become chatty and more friendly online because the Internet makes it possible to conceal their true identities. Sennett writes:

When everyone has each other under surveillance, sociability decreases, silence being the only form of protection […] People are more sociable, the more they have some tangible barriers between them, just as they need specific places in public whose sole purpose is to bring them together […] Human beings need to have some distance from intimate observation by others in order to feel sociable. (quoted in Nguyen and Alexander 1996: 104)

Questions of cyberpower invariably modulate into discussions about the freedom to access information, the control of the Internet, rights of privacy, and the technopower of the elite.

1. The politics of access: Even accepting that the Internet and the new ICTs represent a whole new space for democracy, certain features of this new informational order need to be detailed in order to understand what they offer as emancipatory situations:

• A minimum level of technological development is necessary for any country to have access to the Internet, or for that access to be really productive.
• A distinction must be made between information and knowledge. As the group ‘Interrogate the Internet’ puts it: ‘Knowledge taken out of context is really just noise’ (Interrogate the Internet 1996: 126). As such, some mechanism to filter knowledge from the vast amounts of information on the WWW becomes imperative for those regions where any access to the WWW is seen as access to ‘truth’ or knowledge. This creates a situation represented by the following equation: ‘data = information = knowledge = wisdom = truth = freedom’ (ibid.: 125).

Beth Noveck argues that the profusion of information in cyberspace actually prevents the creation of a true online knowledge economy. In real space, Noveck points out, we go by trusted media names for ‘true’ information. In cyberspace, faced with so many Websites, we are forced to go back even more to these names because there is no way of ensuring that we get better/truer information from Web sources. What occurs, then, is that information and advertising become indistinguishable, and because companies use content to attract consumers online, the quality and independence of content is compromised (Noveck 2000: 24–25).
BOX 3.2 THE CYBORG BILL OF RIGHTS

As an acknowledgement of the wired human, Chris Gray has drawn up a ‘Cyborg Bill of Rights’ as an extension of Stelarc’s ‘Cyborg Manifesto’, Donna Haraway’s ‘A Manifesto for Cyborgs’, and John Barlow’s ‘Declaration of the Independence of Cyberspace’. This new Bill includes rights to travel in the flesh or in the virtual, freedom of electronic speech, right of electronic property, freedom of consciousness (including freedom to modify it through pharmacological, electronic or other means), right to life (including the right to modify bodies), right to death, right to political equality, freedom of information, freedom of family, sexuality and gender, and the right to peace. Barlow’s definition of cyberspace foregrounds the idea of independence:

Ours is a world that is both everywhere and nowhere […] we must declare our virtual selves immune to your sovereignty […] we will spread ourselves across the Planet so that no one can arrest our thoughts […] We will create a civilization of the Mind in Cyberspace. May it be more humane and fair than the world your governments have made before.
There is a new spatialisation of inequality (and not just in terms of info-rich and info-poor countries—see the following section on ‘Cyberspace, Development and the New World Order’). Global cities are hyperconcentrations of infrastructure and the attendant resources, but large areas in less developed regions remain poor. Within these global cities, there remains a spatialisation of inequality. For instance, New York has the largest concentration of fibre-optic, cable-served buildings in the world, but most of these are located in the centre of the city. Harlem—the black section—has only one such building, while South Central Los Angeles—the site of the 1992 unrest—has none (Sassen 1999: 60).

2. Techno-elitism: ‘Interrogate the Internet’ argues that a new class position is now evolving out of the ICTs. Its arguments about the potential, or lack thereof, of cyberspace and culture as emancipatory sites for democratic politics are worth citing.
(i) Capital and technicians (a ‘technoclass’) control the structure of the Internet. These ‘techno-artisans’ actually control the technology that the general user is dependent upon. The class which lives in teleintegrated, real-time habitats and forms virtual communities is the ‘cybergoisie’ (Dear and Flusty 1999), and they occupy the cyburbia. The cybergoisie include stockholders, CEOs, entrepreneurs and celebrities who are involved in decision-making, exercise global coordination and enjoy socioeconomic security (ibid.: 76). Protosurfs are the sharecroppers, the cheap labour providing services to global systems of production; they include temporary and itinerant labourers.

(ii) The new technological relations can be thought of as occurring between a virtual class and a virtual vanguard. The technicians and other members of the virtual class hierarchy act in the name of the masses whose minds, bodies and souls are ‘saved’ and ‘structured’ with or without their consent (Interrogate the Internet 1996: 128).

(iii) This techno-elitism is already visible in contemporary cyberculture. Multitechno/cultural groups now determine the rules for their own domain. In addition, the commercial nodes of the network cause an agglomeration of capital and grant more power to groups that already have power. In a sense, then, the inequality between groups remains a central feature of cyberculture.

(iv) There is also an emerging trend where the distinction between the techno-elite and the technopeasants (Gray 2001: 24) is widening at a rapid rate. High tech remains the province of upper-middle-class white males. Technology today is an end in itself, a condition that Rutsky terms ‘techno-fetishism’ (Rutsky 1999: 130). People with the newest high-tech gadgetry acquire a certain status. This important aspect of the ‘technocultural unconscious’ must be accounted for in discussions of the dissemination of technology, for all technology has a social relevance and a social effect.

3. Cyberspace, development and the new world order: The actual impact of the new ICTs on developing nations is only beginning to emerge. In a world divided sharply between the so-called ‘First’ and ‘Third World’, technology can be a means that is either liberatory or enslaving. Successful nations (even Asian nations such as Japan or Singapore) are countries where economic and scientific information is easily and plentifully available. Information systems support vital service sectors such as banking and finance, insurance, transport and the media. As John Feather points out, the comparative success of the information-rich economies and the comparative weakness of those that are information-poor is in itself an argument for the importance of information (Feather 2000 [1994]: 115).

Jerry Everard’s work addresses several issues pertinent to the new informational global order. Since the global economy is dependent upon the global communications system, exclusion from or inclusion in this system of source and market of information can decide a country’s ‘politics of life chances’ (as Anthony Giddens describes environmental movements and politics). Any inequality in the production and distribution of such life chances will determine a country’s role in the global economy, with a corresponding effect upon its sovereignty. Everard identifies six central implications of a global expansion of communication networks:

(i) The flow of information across borders has political implications.
(ii) There is a need for technical standardisation.
(iii) There is a danger of cultural standardisation.
(iv) International cooperation is required for infrastructure.
(v) Management of the network requires global flows of capital.
(vi) This also produces a concomitant rise in service sectors to manage the technology/infrastructure (Everard 2000: 30–31).

Access to information depends, as we have noted, on the technology available with a particular group/country. Everard points out that most of the developing nations continue to use outdated technology (‘legacy technology’) from their colonial period. High-speed modems and central processing units, efficient telephone networks and other essentials of the informational society are available only with
the developed nations. Developing nations are saddled with obsolete or high-cost technologies, which increases their access costs. In 1991 there was one telephone line per 100 people in Africa, compared to 2.3 for all developing countries and 37.2 for industrial countries. In 1994 Africa accounted for 2 per cent of the world’s telephone lines. There are more telephone lines in Manhattan or Tokyo than in the whole of Sub-Saharan Africa (Castells 2000 [1998]: 92). Since most developing nations have neither the capital nor the expertise to build their telecommunications infrastructure, foreign companies that bring in both take the profits.

The predominance of English as the Net language means that much of the ‘free’ information is available only to the Western world and the elite of developing nations. A simple example suffices: almost all email uses Cyrillic or Roman scripts, and this causes problems for nations where English may not be the dominant language. This point leads to the thesis that information about the developing South is mainly written from information produced and researched in the North (Everard 2000: 27–43). The parallels with colonial Europe’s ‘Orientalism’—which was essentially the production of a discourse and accumulation of knowledge about the East/Other—are striking. As Edward Said has demonstrated in his epochal work, Orientalism (1978), both hypothesis and proof about Oriental subjects/issues came out of the same Western paradigm. Contemporary technoculture’s accumulation of data about Other cultures, under the (laudable) pretext of the acquisition, systematisation and preservation of rapidly declining/disappearing civilisations and ways of life, is an epistemological ordering that recalls the colonial archiving and ‘understanding’ that preceded control and rule.

In terms of development, the inequality between nations has increased. In the case of Africa, for instance, adjustment policies inspired by the IMF/World Bank have actually increased dependence on primary commodities. Between 1985 and 1994, most African countries saw a deterioration in their terms of trade (Castells 2000 [1998]: 85). Social exclusion—which Castells defines as the process by which certain individuals and groups are systematically barred from access to positions that would enable them to secure an autonomous livelihood within the social standards framed by institutions and values in a given context (ibid.: 71)—is a feature of all societies in the new information age. Castells’ analysis of such ‘black holes of informational capitalism’ is worth recounting in some detail.
Inequality and poverty remain features of these new economies. People and territories are shifted to positions of structural irrelevance because, from the perspective of dominant interests in global, informational capitalism, they are of less importance. The increase in the numbers of the homeless (migrants, the aged, deskilled workers, mentally ill persons)—in any country—is an example of this social exclusion. Functional illiteracy becomes a cause of social exclusion in a society increasingly given to capacities to decode language. Inner-city ghettos are an example of such exclusion. Further, in international terms, there have been changes. The collapse of the ‘Second World’ and the increasing irrelevance of the ‘Third’ (in terms of its geopolitical significance in the global economy) is accompanied by the emergence of what Castells calls the Fourth World. The Fourth World includes regions such as Sub-Saharan Africa and the impoverished rural areas of Latin America and Asia. But there are also Fourth World conditions in the ghettos of the USA, Japan and France. Stigmatised, criminalised, sick and illiterate persons belonging to the Fourth World are, for Castells, the direct result of informational capitalism (Castells 2000 [1998]: 165–68).

Cyberspace and cybertechnologies of digitisation, for Ziauddin Sardar, may result in the erasure of non-Western societies. Sardar argues (and this relates closely to the point made by Everard) that once a culture has been stored and preserved digitally, it becomes more than the real thing. Other people and cultures become data in cyberspace, summoned up or dismissed at the stroke of a key. This, Sardar argues, is a museumisation of the world, where anything different from Western culture will only be present in digital form, and that too ‘sanitised’—where dirt and contamination are removed and a ‘clean’ Other is presented on the CD-ROM (Sardar 1996: 19–20). Another point that Sardar emphasises is the close link between the new technologies and colonialism. Sardar notes that the language of the propagandists of the new informational order is about ‘frontiers’ and ‘new territories’ and ‘worlds’. Cyberpunk magazines such as Mondo 2000 announce the ‘colonising’ of cyberspace. Sardar reminds us that the expansion into new territories was always at the cost of the indigenous population (ibid.: 17–18).

The link between technology and capitalism has transformed contemporary life and politics. Douglas Kellner argues that in the technocapitalist state, society is organised around capitalist ‘imperatives’ of capital accumulation and production, with the related features of worker-exploitation and the hegemony of capital. Today, technical and scientific knowledge, automation and high tech are analogous to the role of human labour power and mechanisation in the earlier era of capitalism (Kellner 1999: 193).

The ICT revolution is inseparable from the economic globalisation that has affected the world’s societies and nations in the latter half of the twentieth century. In fact, while ICT has brought the regions of the world closer, the rise and widespread role of multi- and transnational business corporations has enabled the dissemination of technological change and development. Anthony Giddens (1991), for instance, has argued that globalisation would have been impossible without such technologies as the aeroplane, computers and satellite-based communication. It is thus almost impossible to treat ICTs and what I term the ‘cybercultural turn’ as distinct from globalisation. Technological change and innovation, as seen from Pacey and Andrew Webster’s arguments above (Chapter 1), is shaped by political and economic conditions. Contemporary high-tech culture, and the sophisticated technologies that constitute it, is both affected and effected by national and global forces. The software revolution in India, for example, has been the result of major changes in our economic and scientific policy (especially after 1991, and the Structural Adjustment Programme, economic liberalisation, and entry into the new world order).

Daniele Archibugi and Jonathan Michie see ‘techno-nationalism’ and ‘techno-globalism’ as interrelated aspects of technological change. The globalised economy may transform the conditions for the generation and diffusion of innovation, but national policies continue to be a major influence in the matter. However, they acknowledge that ‘globalisation will increase the impact which national policy will have on domestic living standards’ (Archibugi and Michie 1997: 3). Two central concepts, national systems of innovation and the globalisation of technology, are important for an understanding of the late twentieth century’s cybercultural turn.

Adapting from the work of Friedrich List, Chris Freeman introduced the term ‘national systems of innovation’ (NSI) in 1987 to describe the performance of Japan since the Second World War. Later work by Richard Nelson and others has demonstrated how
national factors such as education, public support to industrial innovation and defence-related technology schemes, the history and culture of a nation, and its demographic–professional aspect influence technological innovation. They have also paid particular attention to research centres and universities, business firms and their interaction in national contexts (Archibugi and Michie 1997: 3–4). Together, these constitute the national systems of innovation. In the case of India, the NSI would include, among others, the departments of science and technology, the various scientific research centres, apex bodies such as the University Grants Commission, the economic policies since independence (socialism and economic liberalism, for example), and the metropolitan contexts of business and research.

Archibugi and Michie suggest that ICTs are effectively technologies of globalisation since they ‘service the increasing global operation of cultural, social and economic life’ (ibid.: 4). Today, strategies developed by governments and business houses to generate technology are no longer based in a single country. Firms compete on an international level, and this frequently causes an upgradation of products and technological processes. Technological globalisation involves three processes:

(i) International exploitation of national capabilities: Firms exploit their innovations in global markets, either by exporting products which embody them or by licensing the know-how (‘technological transfer’ is one aspect of this).
(ii) Collaboration: This involves both public and business institutions in an exchange of know-how. Firms share the costs of industrial research, for instance.
(iii) The generation of innovations across more than one country: This is especially the case with multinational corporations (ibid.: 14, 172–97).

This globalised technocultural condition has been termed ‘transversality’ by Terry Shinn. Shinn argues that technoscientific transversality ‘crosses cognitive, technical, economic and societal boundaries’ (2002: 611). Increasingly we perceive such a transversalised state of technology where ‘open-ended instrumentation that spread across the boundaries of science and engineering,
academia and industry […] as specialists in particular niches adapt and integrate them’ (Shinn 2002: 612). Shinn has accurately summarised two aspects of contemporary technoculture: the technologies of globalisation and the globalisation of technology itself. Today, in the early years of the twenty-first century, we would be hard put to distinguish between the two.
THE ‘INDEPENDENCE’ OF CYBERSPACE: CENSORSHIP AND OTHER ISSUES

Technology does not guarantee freedom. While the Internet, cable television and news networks suggest (especially in their hagiographic descriptions) greater degrees of freedom, the technology itself is structured, developed and disseminated within certain ideologies. Anticipating contemporary communications technologies, Ithiel de Sola Pool writes in his Technologies of Freedom:

Networks of satellites, optical fibers, and radio waves will serve the functions of the present-day postal system. Speech will not be free if these are not also free. The danger is not of an electronic nightmare, but of human error. It is not computers but policy that threatens freedom. The censorship that followed the printing press was not entailed in Gutenberg’s process; it was a reaction to it. (1983: 226)

Groups such as the Electronic Frontier Foundation (EFF) and Computer Professionals for Social Responsibility (CPSR) have been campaigning for the freedom of the virtual word. John Barlow’s ‘Declaration of the Independence of Cyberspace’ (1996, available at ) stated unambiguously that governments cannot be allowed to fetter (at least) the space of the Internet. However, this vision is not so easily realised. Several issues are involved in declaring any space as ‘free’.

The most obvious one would be the issue of censorship. Is it really possible to prevent access to offensive or potentially subversive news reports, studies and polemical documents? The very nature of the Internet and WWW, its decentred structure and lack of geographical specificity—its ‘extraterritoriality’, as Anne Branscomb terms it—means that global standards of ‘acceptable conduct’ may have to be framed to enable censorship. Further, methods for dispute resolution have to be in place precisely because one cannot lodge petitions regarding contravention of these standards in particular countries. Groups such as the Electronic Frontier Foundation have argued that common carrier principles must be extended to the new communication technologies. These carriers must be seen in the same manner as telephones—as conduits for information transmission—and must not be allowed to change the content of the messages.

Writing on the ‘codes’ of cyberspace, Lawrence Lessig of the Harvard Law School argues that the Internet forms an ‘innovation commons’ (a space where resources are held in common). Lessig identifies three ‘layers’ that make up the Net: (a) the physical layer of computers and wires; (b) the logical or code layer, which is the code that makes the hardware work, and the Internet protocols; and (c) the content layer—the actual material that gets transmitted across the wires. Lessig suggests that while each of these layers can be controlled, at the code level things are free (2001: 20–25).

Others have recommended some forms of control over information dissemination to prevent copyright infringement and the violation of intellectual property rights. For example, suggestions made for US National Information Infrastructure legislation included:

1. Outlaw devices that circumvent copyright management technology.
2. Outlaw attempts to alter or delete the copyright identifying information needed for intellectual property management.
3. Reaffirm that transmission is copying.
4. Ensure that online service providers and Internet service providers are held to a standard of responsible copyright management (O’Reilly et al. 1997: 299).

Other issues to be discussed before pleading for or against censorship include an analysis of the kind of information about any issue
that bypasses its borders via the WWW. Everard identifies three kinds of information: those that carry potentially destabilising political messages, those that carry culturally dominating information, and those that speak ‘about’ or speak ‘for’ the state from the outside (Everard 2000: 31).

Ithiel de Sola Pool outlines ten principles for keeping electronic communications free. Though Pool’s analysis is meant for the USA, the principles involved are worth pondering over for other nations adopting information technologies on a large scale.

1. The First Amendment—freedom of speech and expression—applies to all media and, further, it applies to the function of communication and not just to the media.
2. Anyone may publish at will, and there must be no licensing or scrutiny of who may publish/disseminate information in any form.
3. Enforcement of law must be after the fact, and not by prior restraint. Libel, obscenity and eavesdropping are punishable, but prior review cannot be allowed.
4. Regulation is a last recourse. If resources for communication are monopolistic, then use common carrier regulation rather than direct regulation or public ownership.
5. Interconnection among common carriers may be required. The basic principle must be that all will be served without discrimination.
6. Recipients of privilege may be subject to disclosure.
7. Privileges may have time limits. Patents and copyrights are for finite periods.
8. Governments and common carriers should be blind to circuit use. What the facility is used for is not their concern. However, some broad categories of use—such as emergency communications—must be given priority. Control of the conduit must not be seen as a means for the control of content.
9. Bottlenecks must not be used to extend control. Rules on undeliverable mail may have been used to control obscene content.
10. Copyright enforcement must be adapted to the technology (Pool 1983: 244–50).
Lawrence Lessig outlines some of the changes that will have to be made if the Internet is to be retained as a site of creativity and innovation (what he terms the ‘architecture of creativity’ [Lessig 2001: 239]). In addition to changes in the physical layers (the wires and computers), he recommends the following changes to the code and content layers:

(i) The code layer is what will have to be protected to ensure innovation and content-freedom. The platform on which software is run (Windows, for example) must be open.
(ii) No major player should be able to architect the Internet to empower her/his own strategic behaviour.
(iii) Lessig suggests that authors and creators should have copyright for five years once they register, and that registration can be renewed 15 times. If the registration is not renewed the material falls into the public domain (ibid.: 240–61).
TECHNOCITIES AND VIRTUAL CITIES

With the development of the global circulation of capital and the network state, ‘places become increasingly shaped and constructed through their incorporation into powerful, corporate networks of flows and exchange’ (Graham 2000: 20). The contemporary global space of flows transforms cities into nodes of strategic importance and hubs of information exchange and communication.

Eric Swyngedouw (1993) argues, following the work of David Harvey, that contemporary telecommunications technology and the internationalising economy seek to strike a balance between ‘fixity’ and ‘motion’ (Harvey’s terms). Relatively immobile and embedded (fixed) transport and telecommunications infrastructures must first be produced. These will link production sites, distribution facilities and consumption spaces, so that there is an infrastructure to produce profits. As Swyngedouw puts it: ‘Liberation from spatial barriers can only take place through the creation of new communications networks, which, in turn, necessitates the construction of new (relatively) fixed and confining structures’ (quoted in Graham 2000: 20–21).
Paul Adam, however, argues that the ‘personal extensibility’ of a person’s tele-mediated access to distant spaces and services actually allows her/his domination over excluded groups, and thus supports the production of divided spaces and cities. Thus elite groups within a city shape and fortify specific enclosed spaces (see section on ‘Cyburbia’), from where they operate with global extensibility. The rest of the population becomes even more isolated with increasing constraints on social life, the restructuring of welfare and the labour market, and the withdrawal of banking and public transport services. Thus what is actually produced simultaneously with a networked cyburbia is a fragmented urban space for the non-elite. Only a small group controls global resources through the extensible telecommunications technologies. What is produced is less a homogeneous networked society than a highly uneven urban space (Graham 2000: 22).

Wilson and Arrowsmith suggest a method for mapping cyberspace. The defining element of the physical world is distance. In electronic space, the relationship between places can be reduced to a cost per minute—the communication cost from a country (the cost of a three-minute standard telephone call). Analysing such ‘telecom tectonics’, Wilson and Arrowsmith argue that, unable to afford electronic technology, developing countries remain in what Nigel Thrift termed ‘electronic ghettos’. Thus there is really no ‘access for all’. As Wilson and Arrowsmith point out, equity considerations are becoming increasingly important since access, control and management of information have become the foundations of economic growth and development (Wilson and Arrowsmith 2000: 29–40).

However, it is not sufficient, as Stephen Graham argues, to detail the effects of telecommunication-based changes. Rather, we need to see how these effects may be controlled and/or altered (Graham 1999: 12–13). Graham outlines three potential areas for analysis in contemporary urban space planning:

(i) The ‘global positioning’ approach, where telematics are used to attract investment into cities.
(ii) The ‘endogenous’ development approach, where telematics are used to reorganise the social structure of the cities (e.g., community cable networks, city host computers, or France’s Minitel, which enables the public to access information and communicate with local centres of education, social organisations and municipal departments, thus encouraging participatory democracy).
(iii) The utilisation of telematics for public service (payment of bills, shopping, communicating with the government and so on), thus altering the city–citizen link (Graham 1999: 17–20).

Three principles may be used to evaluate virtual cities:

(a) Informativeness: The richness of up-to-date information and the provision of useful services to citizens.
(b) Participation and social access: Urban information technology (IT) initiatives must guarantee wider social participation. Thus far the use of the Internet is restricted to the wealthy, educated and relatively young members of the urban population. Social inclusiveness of urban citizens is a must.
(c) Groundedness: Virtual cities must demonstrate a positive relationship with the host city. IT must be linked closely and effectively to the economic, social and cultural life of the host city, and this can be achieved by a greater attention to local users (Aurigi and Graham 2000: 493–94).

Critics such as William Mitchell (City of Bits) argue that the virtual technologies of today have ‘reinvented the human habitat’ (Mitchell 1995: 166). Mitchell’s influential definition captures the varied dimensions of the technopolis:

This will be a city unrooted to any definite spot on the surface of the earth, shaped by connectivity and bandwidth constraints rather than by accessibility and land values, largely asynchronous in its operation, and inhabited by disembodied and fragmented subjects who exist as collections of aliases and agents. Its places will be constructed virtually by software instead of physically from stones and timbers, and they will be connected by logical linkages rather than by doors, passageways, and streets. (ibid.: 24. Also available at )
BOX 3.3 DIGITAL CITY (www.dds.nl)

Amsterdam’s Digital City (DDS, De Digitale Stad) is an integrated network of virtual cities funded by Amsterdam City Council, the Ministry of Economic Affairs, Dutch Telecom and the Ministry for Home Affairs. In 1994 it had 7,000 registered users and 120,000 monthly ‘hits’. The virtual spaces within the DDS are designed as ‘town squares’ based on particular themes, such as the book square, world square, political square, tourist information square and European square. Each square has its own ‘café’ or ‘pub’ where people can meet and engage in debates (via the WWW). News-stands at each square enable a person to access the world’s newspapers pertinent to that square’s themes. Around each square are eight buildings rented to information providers of that theme. These could be community organisations or commercial ones. Thus, an explicit connection is made between the ‘virtual’ and the ‘real’ city. In 2000 DDS had 50,000 signed ‘residents’ and its own city ‘mayor’. For another good example see Bologna at
Even a virtual city is about organisation and order. Urbanity, argues Kevin Robins, is embodied. He further argues that the human body also requires discomfort and disturbance (Robins 1999: 52). Robins’ arguments belong to the same mode of looking at the city. Manuel Castells argues that the new architectural monuments of our epoch are likely to be built as ‘communication exchanges’: airports, train stations, intermodal transfer areas, telecommunication infrastructures, harbours, and computerised trading centres (Castells 1996: 422).

Virtual cities (electronic analogues of the city on the Internet; see Box 3.3) are, at least in planning and imagining, modelled after real cities: with exits, entrances, nooks, crannies, streets, commuter facilities and so on (as emblematised in Mitchell’s description of the City of Bits). This means that even virtual cities can possess privileged sections, ghettos and slums, which in turn implies that the politics of access, the privileges of inclusion and the geographies of exclusion remain even in virtual space. John Pickering therefore suggests that we pay attention to the cultural surroundings of networks. The questions to be raised—and these are political questions—are: who controls
traffic and access, who will own and profit from the networks, and what political role will they play? Pickering points to the fact that there is a significant difference between the networks in the USA, where they are a state responsibility, and Europe, where they are more a private concern (Pickering 1999: 178).

Saskia Sassen points out that there are no fully dematerialised firms or industries, and that even the most advanced information industries such as finance are only partly installed in electronic space (Sassen 1999: 58–59). Further, there is no dematerialised cyberspace without the material, grounded infrastructure of wires and cables, hard disks and processing units (Luke 1999: 32). Metropolitan regions dominate the physical infrastructure of host computers and telecom links that make up the Internet. Aurigi and Graham argue that the polycentric urban regions resulting from current urbanisation trends are actually engines of electronic communication: physical movement, face-to-face interactions and urban life are all dependent upon a widening infrastructure of phones, mobile phones, Internet networks, TV and radio networks, electronic monitoring devices and so on. The growth of the Internet is itself fuelled by local interconnections and transactions at the level of metropolitan regions (Aurigi and Graham 2000: 490).

Dear and Flusty suggest new terms and concepts to describe the postmodern, information city. They argue that megalopolises of the world have been integrated into a single urban system, or citistat. The citistat is a

Collective world city […] a geographically diffuse hub of an omnipresent periphery, drawing labour and materials from readily substitutable locations throughout that periphery. Citistat is both geographically corporeal, in the sense that urban places exist, and yet ageographically ethereal, in the sense that communication systems create a virtual space permitting coordination across physical space. (Dear and Flusty 1999: 78)

The citistat consists of the following sectors:

(a) Commudities: These are commodified communities created to serve the rich cybergoisie. They are carefully constructed residential and commercial sections, which try to keep out the ‘lawless’ poor.
(b) Cyburbia: Commudities are virtually linked to produce cyburbia, or teleintegrated hyperspaces.

(c) In-beyond: The cheap labour that serves the cybergoisie occupies the in-beyond. These are interest groups that can never achieve hegemonic status (Dear and Flusty 1999: 78–79).

Bailly, Jensen-Butler and Leontidou mark four shifts in the transition from an industrial to an informational city:

(i) Service industries such as business, professional and legal services prefer cities because they provide access to customers, markets and professional expertise.
(ii) The new professional and managerial class prefer cities because they sustain the consumption, lifestyle and work patterns of this class.
(iii) The rising importance of creativity and innovation as a factor of competitive advantage privileges the urban setting owing to its ability to sustain governance activities, media and cultural industries via education and research (concentrated mostly in urban areas).
(iv) Cultural assets of cities—media, theatres and such leisure industries—are a source of urban economic renewal (quoted in Amin 2000: 119).

Four main connections can be drawn between virtual cities and real ones:

(a) Virtual cities can be used to stimulate local development through local linkages between local firms, service providers and consumers.
(b) Virtual cities can be electronic democracy initiatives, widening citizen participation in decision-making (through the Internet) and the relations between citizens and representatives.
(c) They can be used to create a new ‘electronic public realm’ for online debates and communities, feeding back directly into the ‘real’ host metropolis.
(d) They can initiate new forms of urban management, supporting the electronic delivery of public services, and ‘intelligent’ ways of managing urban services such as education, transport, waste and social services and planning (Aurigi and Graham 2000: 492).
One of the most important commentators on the contemporary city, Edward Soja, coins a new term, Postmetropolis, to describe the city of the late twentieth century. The Postmetropolis has six ‘discourses’ for Soja:

(i) Flexcity: In the post-Fordist (see Chapter 1 for ‘Post-Fordism’) industrial economy, more flexible production systems and denser interactive networks of information flows create new industrial spaces that have altered the industrial–geographical ones of the earlier Fordist age.
(ii) Cosmopolis: This is the globalisation of urban capital, labour and culture and a new hierarchy of global cities.
(iii) Exopolis: Edge cities, outer cities and postsuburbia emerge. There is an outmigration of domestic populations and the inmigration of ‘Third World’ workers and cultures.
(iv) Metropolarities: A restructured social mosaic as a result of the out- and inmigration, with new polarities and widening gaps, new ethnic compositions and new welfare-dependent ghettos for the aged, orphans, and others.
(v) Carceral Archipelagos: The rise of fortress cities and surveillance technologies. Armed and guarded communities, increased surveillance by state and capital, stringent antitrespass regulations and vehicular–pedestrian movement characterise these spaces.
(vi) Simcities: The increasing hyperreality of everyday life. Simulating urbanism as a way of life, these cities embody simulation and hyperreality. Where once people went to hyperreal spaces (Hollywood or Disneyland), now hyperreality is the city itself (Soja 1997: 19–30).
THE NATION–STATE AND CYBERSPACE

With the reconstitution and reorganisation of the world after the telecommunications revolution, the role of the nation–state has become integral to any discussion of cyberpolitics. Terms such as ‘postnational’ societies have begun to be applied to this new order. What then is the new concept of sovereignty of the nation–state in the networked global order?
One of the most important changes has been that the state is now more a ‘manager’, facilitating the execution of decisions taken by transnational organisations. Thus, in developing countries, for instance, economic policies have been structured around demands from ‘telematics corporations’. Don Tapscott, in The Digital Economy, argues that a greater polarisation between local and global activity is discernible today. As such, the managerial class and the nation–state have little role to play. Most importantly, there is no control over space anymore (Tapscott 1996: 310).

An interesting development is the delocalisation of labour. Knowledge workers in one country are now employed by knowledge clients in another, without actual physical relocation. ‘Labour migration’ thus takes on new meanings, such that the nation–state is unable to control such work situations in cyberspace. Nation–states are primarily predicated upon surveillance and control mechanisms for an equitable distribution of resources. If this function, and the power over political economy, are taken over by transnational corporate offices, then an important role of the nation–state is at an end. However, governments in the new order will serve other purposes, such as making decisions about external investment or the development of the local telematics infrastructure.

The state intervenes (or can intervene) in four ways between the information-owner and the information-seeker:

(a) It can protect information as a piece of property in which the rights of ownership clearly belong to its legal possessor (the law of copyright).
(b) It can prevent the unauthorised use of information that has been legitimately collected.
(c) It can guarantee the right of access to certain categories of information of interest or benefit to its citizens.
(d) It can prevent the dissemination of information (Feather 2000 [1994]: 135–36).

Three facets of globalisation will inform all government actions: technological developments, liberalisation and free-trade treaties, and the mobility of production mechanisms. The 1996 OECD report, building on these features, listed four challenges facing contemporary governments: (a) facilitating globalisation; (b) ensuring effective adjustment; (c) strengthening the capacity to tackle change within; and (d) seeking open markets (Everard 2000: 89).
The Network State

Castells argues that the present moves towards unifying Europe will produce a ‘network state’. New forms of governance and new institutions of government are being created at the European, national, regional and local levels. The network state is characterised by ‘the sharing of authority along a network’ (Castells 2000 [1998]: 363). The various nodes of the network are interdependent, so that no node can ignore others in the decision-making process. Local and regional initiatives are part of this process, and institutions link up with sub-national levels of government. The adoption of a single currency and interlinked financial markets leads to a homogenisation of macroeconomic conditions among European states, whose networking and integration have been enabled by the expansion of information technology. The networking of trade and investment across national boundaries is a feature of globalisation. Europe, in addition to its investment across the globe, also faces large demographic shifts with the immigration of workers—especially from the Mediterranean southern rim—into its boundaries (ibid.: 338–65).

Castells argues that a new world is perhaps emerging (or has emerged) with the information revolution. Under informationalism, ‘the generation of wealth, the exercise of power, and the creation of cultural codes came to depend on the technological capacity of societies and individuals, with information technology as the core of this capacity’ (ibid.: 367). Society itself has been transformed, with alterations in relationships of production, power relations and relationships of experience. Some features of this transformation are:

• the decentralisation of work,
• the individualisation of labour,
• the rise of global financial markets,
• increased social inequality, exclusion and polarisation,
• the crisis of the nation–state and political democracy,
• ‘cultural battles’—where power lies in the networks of information exchange and symbol manipulation, which relate social actors, institutions and cultural movements through icons and intellectuals,
• the crisis of patriarchalism and a redefinition of family and gender relationships, and
• changes in the material foundations of social life, especially the ‘culture of real virtuality’ where reality itself is immersed in a virtual image setting and symbols are not just metaphors but the actual experience (Castells 2000 [1998]: 371–82).

Benjamin Barber in his Jihad vs McWorld defines the future thus:

[The] future is a busy portrait of onrushing economic, technological, and ecological forces that demand integration and uniformity and that mesmerize peoples everywhere with fast music, fast computers, and fast food—MTV, Macintosh, and McDonald’s—pressing nations into one homogeneous global theme park, one McWorld tied together by communications, information, entertainment, and commerce. (1995: 4)

Discussing the postnational constellation, Jurgen Habermas analyses the effects of globalisation—driven by telecommunications technology, the globalising of capital, etc.—and the prospects for democracy in nation–states. He admits that globalisation does present a danger to the nation–state in its institutional form. His concern is with those aspects of globalisation that threaten the capacity for democratic self-determination in a national society. Habermas’ influential arguments are worth considering in some detail here.

Habermas argues that ‘the idea that societies are capable of democratic self-control and self-realisation has until now been credibly realised only in the context of the nation–state’ (2001: 61). This is why the image of a postnational constellation induces feelings of helplessness. Rather than give in to this, Habermas argues that we need to start thinking of ‘finding the appropriate forms for the democratic process to take beyond the nation–state’ (ibid., emphasis in original). According to Habermas, there ‘seems no trace of a waning power of the nation–state for more classical organisational functions, above all for state guarantees of property rights and conditions for fair competition’ (ibid.: 68). ‘Legitimation gaps’ open up as competencies and jurisdictions are
shifted from the national to the supranational levels (international governmental organisations or even non-governmental ones like the World Wide Fund for Nature or Greenpeace). With globalisation, nation–states lose the fiscal basis for social policies and their capacity to steer the economy via macroeconomic policy. The integrational force of nationality as a way of life and the homogeneous basis of civil solidarity are also being diminished. This makes it difficult for nation–states to meet the need for self-legitimation (Habermas 2001: 80). The interdependency of world society challenges the basic premise that ‘national politics, circumscribed within a determinate national territory, is still adequate to address the actual fates of individual nation–states’ (ibid.: 70).

Habermas suggests that there is a need to bring global economic networks under political control (ibid.: 81). He distinguishes between ‘networks’ and ‘lifeworlds’. Networks are a functional integration of social relations—based, usually, on market decisions of independent actors. Lifeworlds are a social integration of social relations by those who share a collective identity—based on mutual understanding, intersubjectively shared norms and collective values (ibid.: 82). A democratic order need not be rooted in the nation as a pre-political community of shared destiny. Democracy, for Habermas, means social integration. And this can be achieved through political participation by citizens.

There are, of course, problems here. In most nation–states the majority culture comes to identify itself with the national culture and a general political culture. This must end. A decoupling of political culture from majority culture may shift the solidarity of citizens onto a ‘constitutional patriotism’ (ibid.: 74).

Habermas detects two problems for nation–states in this age: How can we envision the democratic legitimation of decisions beyond the schema of the nation–state? And what are the conditions to ensure that states and supranational regimes see themselves as members of a community who need to consider one another’s and the community’s general interests? Habermas offers two answers to these problems:

(a) Democratic procedure is also legitimised by the general accessibility to the decision-making process. Thus, while a functioning public sphere, discussion and accessibility may not be a substitute for conventional procedures of political representation,
they do free democratic legitimacy from the established forms of state organisation. Habermas suggests institutional participation by non-governmental organisations in the decision-making processes at the national level. World organisations can demand that member–states carry out referendums on important issues. This may ensure a ‘transnational will-formation’. (b) Global powers may have to involve themselves in the institutionalised procedures of transnational will-formation. They have to willingly broaden their definitions of ‘national interest’ into a viewpoint of global governance, marking a change from ‘international relations to a global domestic policy’ (Habermas 2001: 110–12).
CYBERDEMOCRACY
Is a new form of democracy evolving in cyberspace? If deliberative democracy is based on ‘thoughtful, reasoned, even languorous conversation about issues of common popular import’ (Noveck 2000: 32), then does the dissemination of information enable such a consensus? Is technology, with its power to affect social structures and influence interactions, democratic? These are some of the questions involved in cyberspace politics. Margolis and Resnick suggest that:
(a) While the Internet may facilitate instant public feedback, it may not necessarily lead to the triumph of direct democracy. Moreover, unity, equality and devotion to the public good, the prerequisites for effective participatory democracy, are scarcely present in contemporary society. There is no evidence that existing democracies are metamorphosing into a new form of direct democracy.
(b) Even if the Net were to facilitate participation of minor parties and interest groups in the democratic process, there is no evidence to suggest that these parties with their new messages can make significant changes in the real world (2000: 207–9).
With this cautionary note about the potential for a new democratic future via the WWW, we can proceed to look at the emerging
prospects for cyberpolitics. As early as 1994, the US government pointed to the benefits of using the new technologies. An efficiency report pointed out that spiralling costs and disjointed administration had become the hallmark of the American government. With great candour, it stated: ‘The federal government is not simply broke, it is broken.’ The Chairman of the Government Information Technology Services Working Group (which produced the 1994 report) stated:
It is an opportunity to use the power of information technology to fight the war on crime, to deliver entitlement benefits to the needy in a secure and efficient manner while eliminating fraud and cheating, to improve health care delivery, to find missing children, to improve privacy protection for all citizens—in short, to completely reshape how government delivers its services to its customers. (quoted in Ferdinand 2000: 4)
The comment reflects a common view of the new technologies: that they can provide managerial innovation in politics. The Internet is seen as offering a new way for ordinary citizens to participate more directly in decisions (see Barber on ‘strong democracy’ later in this section). The transcendence of the body into a cyborg in cyberspace also removes the disadvantages and limitations associated with physical disability, mobility problems or spatial restrictions. This represents a major change in participation because anyone can now express opinions in discussion groups via the Internet from her/his home. Even if the Internet is elitist, there is, indisputably, a rapidly expanding local and grassroots-level participation and information exchange, thus decentralising the use of the technology. It enables new forms of interaction among people, and thus redefines the power relations between participants. (We have already seen how radically ICTs and CMC alter the notion of the public sphere.) Mark Poster suggests that though political discourse has always been mediated by electronic machines, the machines now enable new forms of decentralised dialogue not restricted to the public sphere. As such, the Internet decentralises political discourse, ‘threatening’ the state (through uncensorable communication), private property (the infinite reproduction of
information) and morality (Poster 1997: 220–21). With the facility to reinvent identity, the Internet ‘puts cultural acts, symbolisations in all forms’ in the hands of all participants, irrespective of ethnicity, gender, age or social group. This makes it democratic (ibid.: 222). For instance, the gender disadvantage that women face in the public sphere, especially when it comes to the articulation of political opinions, is neutralised in cyberspace where gender can be switched. However, it remains to be seen how the translation from articulation of opinions (masquerading as men) into actual political practice (as women) can be achieved. The egalitarian, democratic political culture of cyberspace does not so easily transform into similar politics in ‘real’ life. That is, in and by itself technology does not effect political freedom or democratic approaches. Richard Sclove (1995) looks at technology as a social structure in order to understand how ICTs can affect democratisation. Sclove argues, first, that people should be able to help shape the basic social circumstances of their lives. This means that there is a need for participatory social organisations and decision-making processes. Benjamin Barber’s Strong Democracy, for instance, is based on such an idea of participation, and is worth defining in full: It [strong democracy] rests on the idea of a self-governing community of civilians who are united less by homogeneous interests than by civic education and who are made capable of common purpose and mutual action by virtue of their civic attitudes and participatory institutions rather than their altruism or their good nature. Strong democracy is consonant with—indeed it depends upon—the politics of conflict, the sociology of pluralism, and the separation of private and public realms of action [….] It challenges the politics of elites and masses that masquerades as democracy in the West and in doing so offers a relevant alternative to what we have called thin democracy—that is, to instrumental, representative, liberal democracy in its three dispositions. (1984: 117) Thus the old Athenian model of democracy, where all citizens participated in public policy-making, may become an actuality. The Internet by enabling citizens to come together in some virtual forum erases the distance- and size-related problems of
modern nation–states: anyone can vote, no matter where s/he is located, in geographic terms, vis-à-vis the seat of government or the metropolis. Sclove suggests that ‘if citizens ought to be empowered to participate in determining their society’s basic structure, and technologies are an important species of social structure, it follows that technological design and practice should be democratised’ (1995: 91). Sclove then provides a useful system of design criteria for ‘democratic technologies’:
(i) Toward democratic community: Avoid technologies that establish authoritarian social relationships in favour of a balance between communitarian and individual technologies.
(ii) Toward democratic work: Seek flexible, self-actualising technological practices and avoid autonomy-damaging practices.
(iii) Toward democratic politics: Seek technology that empowers socially impaired individuals/groups to participate in social and political life.
(iv) To help secure democratic self-governance: Restrict the distribution of potentially adverse consequences (environmental ones, for instance) to local political jurisdictions, and seek local self-reliance and autonomy.
(v) To help perpetuate democratic social structures: Seek ecological stability and local technology (ibid.: 92–93).
Sclove thus effectively argues for community involvement in the design of technology in order to attain a ‘democratic politics of technology’.3 Beth Simone Noveck argues that ‘the technologies of freedom which enable us to customise and personalise everything about the Net, creating My-Browser and My NewsSite, encourage thinking only about what is good for me, rather than what is good for the world in which I live’ (Noveck 2000: 28). Thus Sclove’s point about a balance between individual and communitarian technologies becomes central. Certain basic issues concerning the new ICTs are pertinent to democratic politics. One such issue is that of free speech. Governments seeking authoritarian control over their subjects cannot allow a technology that disseminates and allows access to any information (free speech, in a sense). If free speech allows debates
over human rights, then governments could see it as a threat to stability (an example would be Burma’s legislation disallowing unlicensed modems). Data and information gathering have been used for social control and surveillance. For instance, South Africa has been using a computerised (IBM machine-driven) population register since 1955 to keep track of black employment and housing. This system has been supplemented by municipal administration systems from Sperry and IBM and a National Police Criminal Investigation System. West Germany was the first country to issue computer-readable identity cards (1987). In the USA Dun and Bradstreet offers an ‘online credit information service’, providing detailed debt records and details of defaulters. The enormous amount of data about an individual generated and stored by government agencies (census, bank transactions, driving licences, arms licences, passports, tax returns, health records, employment records, blood samples) may eventually create what Kenneth Laudon famously termed a ‘dossier society’ (Lyon 1988: 95–96). What these examples suggest is the potential for anti-democratic tendencies inherent in the new technologies. The new ICTs do create a new public sphere that enables people to promote emancipatory and progressive causes that effect social transformation. If the commodification of the Internet and its technological apparatuses proceeds at the current pace, it will mean more consumers at the level of the individual; even so, this wired society expands the participatory base for democracy.
BOX 3.4 INFOWARS
Infowars, or information operations (a term Jerry Everard uses as being more broad-based), has the following features:
• There is no prior warning.
• It blurs boundaries between state and non-state actors, between crime and everyday system/network failures, and between domestic and ‘international’ actors.
• It has no clear ‘front line’, because the targeted system can be anywhere, and of any kind (military, economic, sociocultural) (Everard 2000: 105).
We have already noted how increased connectivity between the citizen and the city enables a greater role for the former, when s/he can address queries to, say, the municipal authorities. Douglas Kellner points to the utilisation of the Internet by intellectuals and resistance groups to ‘circulate’ struggle and fight for rights. He argues that if corporate powers and oppressive states can use the Internet to advance their agendas, then progressive groups such as labour organisations, civil rights groups, or environmentalists can use it to advance the cause of the oppressed (Kellner 1999: 195–96). His term ‘globalisation from below’ (ibid.: 198) captures the development of solidarity networks of struggle within cyberspace, and envisions a major political role for the Internet. Minnesota E-Democracy and Open Meeting in the USA are two groups that use the Internet to improve political participation. Beth Simone Noveck (along with Benjamin Barber, a founder of the project/Website called Civic Exchange), however, has pointed to the paradox at the heart of the new technological revolution. Noveck writes:
Even as we enjoy limitless information, we are at risk of obtaining less knowledge. Though protected by powerful tools to control how we present ourselves to the world, these technologies concomitantly destroy our privacy. The Internet offers decreasing marginal costs of communication access and yet the web is becoming thoroughly commercialised and devoid of interactive ‘public’ spaces. Finally these technologies make community possible and yet encourage atomisation. (2000: 18–19)
Noveck’s concern is that rather than encouraging deliberative dialogue, we find more intrusive, personal messaging and cantankerous emails on the Net (ibid.: 19). She argues that it is content rather than the speed or technique of interaction that is important. The content that comes into wired schools or communities is of far more consequence than wiring them (ibid.: 20).
CYBERWAR
One of the most important developments in contemporary times is the increasing use of communications and computing technology
in warfare. The philosopher Paul Virilio has commented extensively on the logic of ‘militarism’ that governs and drives technoscientific developments and cultural conditions in the late twentieth century. James Dunnigan points out that until the 1980s, combat electronics and countermeasures were generally unknown to the public. Then, with the advent of video and computer games, public knowledge about electronic warfare increased dramatically (Dunnigan 1996: 216). Digital war has thus entered the public imaginary. Advanced computing is integral to C4I2 needs (command, control, communication, computers, intelligence and interoperability), as conceived of by the Pentagon in the USA. Gray (1997: 172), quoting Lindberg, lists five technoscience areas that the US army concentrates on. Of these, four relate to artificial intelligence: (i) very intelligent surveillance and target acquisition (VISTA) systems; (ii) DC3I (distributed command, control, communications and intelligence); (iii) self-contained munitions; (iv) the soldier–machine interface; and (v) biotechnology. Speed—of information-gathering, transmission, decision-taking, transmitting the decision to missile delivery systems and authorities (each integral to war)—requires sophisticated computer technology. Computers outstrip humans in a significant way here: their response times are much shorter than those of humans. The technology and its military uses can be grouped under the following heads:
• Intelligence: Cryptology, surveillance, image and sound analysis, satellite control, database management.
• Autonomous weapons: Remotely piloted vehicles and remote-controlled systems.
• Modelling, testing, simulation: To test the effects of weapons, human impacts, possible casualties, errors. The System for Integrated Nuclear Battle Analysis Calculus (SINBAC) is a computer network that enables military commanders to prepare contingency plans. Flight simulators and missile trainers are all computer-operated.
• C4I2: Communications technology used for crisis management, and recording messages from round the world (Gray 1997: 59–64).
There are three components of cyberwar: (a) the illusion that war can be managed scientifically; (b) the realisation that war is mainly a matter of information and its interpretation; and (c) an emphasis on computers (ibid.: 23–24). Space technology, with its extensive use of communications and tracking technology, is integral to contemporary wars. Defence Meteorological Satellites, the Defence Satellite Communication System, ultrahigh-frequency and superhigh-frequency satellites, the Global Positioning System, and Landsat remote sensing satellites were used by the US military during various conflicts. For instance, during its campaign in Nicaragua, the US supplied the contras with encryption machines, seismic, acoustic, infrared and magnetic remote sensors, satellite-generated images of targets, and remotely piloted vehicles (commonly known as RPVs). Commando teams were linked using ultrahigh-frequency Manpack terminals. The Gulf War of 1991 was truly a cyberwar—both in terms of the technology used and the broadcasting of the actual sequences worldwide. In fact, the commander of Desert Shield and Desert Storm, Norman Schwarzkopf, described it as a ‘technology war’. An interesting sidelight on the link between computers, media, popular culture and the military is provided by James Der Derian (1997). Der Derian documents the use of games like Marine Doom in war simulation exercises in the US armed forces. He notes that there has always been a very close link between military simulations, computer developments and the entertainment industry. The politics of war representations in video and computer games is also a crucial element in the public dissemination and consumption of technology. In short, the ‘Disney-fication of war’, as Der Derian terms it, is a politics in and of itself. The US and its allies relied heavily on computers during the Gulf War. There were at least nine metasystems involved, coordinating the thousands of computers:
(i) The World-Wide Military Command and Control System (WWMCCS).
(ii) The Modern Age Planning Program (MAPP).
(iii) The Joint Deployment System.
(iv) The Stock Control and Distribution Program.
(v) The Tactical Army Combat Service Support Computer Systems (TACCS).
(vi) The Military Airlift Command Information Processing System (MACIPS).
(vii) The Airborne Battlefield Command Control Center (ABCCC III).
(viii) The Joint Surveillance and Target Attack Radar System (Jstars).
(ix) The Operation Desert Storm Network (ODS Net).
Chris Gray (1997) lists some of the uses of computers and computer technology during this war:
• To organise and track armies across the desert.
• To sort out the several thousand satellite images and captured electronic transmissions from over 50 satellites.
• To fly planes and helicopters.
• To guide bombs, missiles and artillery shells.
• To jam radars and fool the Iraqis’ targeting computers.
• To track Iraqi responses and predict the region’s weather.
• To track the weapon platforms and their maintenance.
• To count the weapons expended.
• To look up home addresses of the dead.
A central feature of the Gulf War was the number of images—of all types—that it generated. The Pentagon contracted National Football League Films to film the Department of Defence’s official version of the war. There were also several Hollywood films made about the war: Desert Shield, Human Shield, Shield of Honor and Target USA. Respected journals and magazines such as Cultural Critique, the London Review of Books and the South Atlantic Quarterly carried analyses of these images, thus emphasising the significance of the spectacle of war. The scale of the coverage accorded to the Gulf War was unprecedented: 1 billion people in 108 nations watched the CNN telecasts (Scarry 1993: 63). These were, naturally, one-sided representations: no pictures of the Iraqi dead, but several of injured American soldiers.
CYBERTERRORISM
Cyberterrorism, which combines the terms cyberspace and terrorism, is defined as ‘premeditated, politically motivated attacks by subnational groups or clandestine agents against information, computer systems, computer programs, and data that result in violence against noncombatant targets’ (Conway 2002: 436). This definition, however, does not include within its ambit common misuse of the new technology, such as sending offensive content to minors or posting such material on the Internet, defacing Web pages, stealing credit card information on the Internet, and clandestinely redirecting Internet traffic from one site to another—all of which are standard cybercrime operations (see Box 3.1). In 1999, 12 of the 30 groups deemed foreign terrorists by the US State Department had their own Websites. Today, a majority of the 33 groups on the same list have an online presence (ibid.). Yael Shahar, Webmaster at the International Policy Institute for Counter-Terrorism in Herzliya, Israel, identifies three forms of ‘information terrorism’: (a) electronic terrorism, which targets hardware; (b) psychological terrorism, which uses inflammatory content; and (c) ‘hacker warfare’, which is cyberterrorism (quoted in ibid.: 437). Maura Conway is careful to distinguish between ‘cybercrime’ and ‘cyberterrorism’. Cyberspace attacks must have a ‘terrorist’ component to be labelled cyberterrorism. Only terrorism using computer technology as a weapon or target can be ‘cyberterrorist’. Terrorists who simply use information technology and computers are just using the technology. As Conway points out, a vast majority of terrorist activity on the Internet is limited to ‘use’ (ibid.: 438). After 11 September 2001, surely a watershed date in contemporary world history, the FBI has closed down thousands of sites (something it was prohibited from doing before), including radical Internet radio shows. Ironically (and perhaps prophetically), Dorothy Denning noted in the Harvard International Review, weeks before 11 September: ‘Whereas hacktivism is real and widespread, cyberterrorism exists only in theory. Terrorist groups are using the Internet, but they still prefer bombs to bytes as a means of inciting terror’ (quoted in ibid.: 441).
NOTES
1. None of the terms used in these debates on the new technology and democracy are uncontested or unproblematic. Contemporary political theory, in conjunction with new historical approaches, foregrounds the specific and local rather than universal origins, aims and forms of nation–states, notions of democracy, and ‘culture’. Contemporary critical theories do not accept any of these terms as being of unquestionable validity or as having a single meaning. The nation–state, for instance, is seen as a category of social organisation that has a specifically European history. Debates over modernity, emancipation or political participation therefore frequently encounter problems of both nomenclature and concept.
2. There are several social studies of technology. See Bijker et al. (1997); Silverstone and Hirsch (1994); Rip et al. (1995); MacKenzie and Wajcman (1999). For studies of the social/cultural constructions of ‘laboratory facts’ see Bruno Latour (1987, 1988).
3. Wiebe Bijker and Karin Bijsterveld, using the example of housing projects in the Netherlands, which were planned, critiqued and evaluated by women as users, have demonstrated such a model of democratic technology. Beth Simone Noveck points out that communications technologies are populist. However, the commercial aspect and potential of these technologies is of prime importance. Noveck notes that e-commerce does not require additional promotion because it ‘takes care of itself’. However, a ‘public’ Internet, argues Noveck, requires ‘the support and attention the market cannot provide and can be most readily accomplished by letting democratic goals drive the design of technical architecture’ (Noveck 2000: 22).
SUGGESTED READING
Downey, John and Jim McGuigan (eds). 1999. Technocities. London: Sage.
Everard, Jerry. 2000. Virtual States: The Internet and the Boundaries of the Nation–State. London and New York: Routledge.
Gray, Chris Hables. 1997. Postmodern War: The New Politics of Conflict. London: Routledge.
Holmes, David (ed.). 1997. Virtual Politics: Identity and Community in Cyberspace. London: Sage.
Jones, Steven G. (ed.). 1995. Cybersociety: Computer-Mediated Communication and Community. London: Sage.
Wallace, Patricia. 1999. The Psychology of the Internet. Cambridge: Cambridge University Press.
CHAPTER FOUR
BODY
When Donna Haraway, in her path-breaking ‘Manifesto’, declares that she would ‘rather be a cyborg than a goddess’, she is not coining a jingoistic phrase or ad-libbing for prosthetic technologies (Haraway 1991a [1985]: 181). Haraway is speaking of a technocultural condition wherein certain kinds of bodies, technologies of biopower, and a certain politics of the body are privileged and constructed, while suggesting the means by which such cyborg constructs can be made available for feminist and other emancipatory purposes by subverting these ideologies. In contemporary times digitised bodies, prosthetics and genetically engineered bodies populate the arenas of both real space and cyberspace. Technological body modification or cosmetic and reconstructive surgery are not in themselves new developments. However, the human–machine fusion that the digital era represents is perhaps the most extensive alteration and modification of everything human: limbs, organs, and now even personalities—producing what has been termed the ‘posthuman’. Further, the creation of artificial wombs, cyborg soldiers and new modes of personality and bodily alteration affect the very social and political fabric of human life. Bodies have never been truly ‘natural’. Dieting, exercising, ornamentation, posture training, fashion, medicine, stimulants and surgery have always ‘altered’ the body. The body is shaped, sculpted, beautified and improved constantly. Machine interfaces, implants, pharmacological alterations and cosmetic surgery have commodified the body as never before. Sexuality, eroticism,
‘sensation’ and subjectivity, all of which have deep connections with the body, have been affected by the way the body has been ‘re-formed’ today. This chapter looks at some of the developments and features of bodies in cyberspace: cyborgs, body modifications and technobodies; forms of technological embodiment such as the Visible Human Project and the Human Genome Project; infomedicine and the body; cosmetic surgery; cybernetic human reproduction and genetic engineering; the four kinds of bodies (disappearing, marked, labouring and repressed); and posthuman body arts as exemplified by Orlan and Stelarc. In order to study the various configurations of the body in contemporary ‘high-tech’ culture (especially in developed nations), it would help to look at the theoretical frameworks provided by social historians of the postmodern, posthuman body. ‘Posthumanism’, a point of view and image of the future (of mankind), can be discussed in terms of epistemology and philosophy. The body serves as a ready example to explore the dimensions of the posthuman point of view because, as the preceding chapters have shown, it is the body (its biology, environment and psychology) that figures prominently in cybernetics and cyberculture. Since the body is also the immediate site of our notions of ourselves and identity, and because this is also what contemporary technology calls into question, the body serves our purpose in exploring posthumanism. Featherstone and Burrows (1998 [1995]: 11–12) list the stages in the development of posthuman bodies:
(i) The ‘aesthetic manipulation of the body’s surface through cosmetic surgery, muscle grafts and animal or human transplants’, thus blurring the visual differences between humans and nonhumans.
(ii) The more fundamental alteration of the functioning of the inner human body—such as biochip implants, upgraded senses and prosthetic implants.
(iii) The cyborg interfaced with hardware, and existing in cyberspace—the avatars, digitised body delegates. Here the body is no longer the limit for identity or subjectivity, and the potential for new, disembodied subjectivities emerges.
In the context of the US, Arthur and Marilouise Kroker describe a situation of ‘Body McCarthyism’—a ‘biologically-driven politics
in which the strategies and powers of society come to be invested on the question of the transmission of bodily fluids’. Body McCarthyism is driven by a panic, ‘signs of fear about the breakdown of the immunological order of American culture’, the fear of viral contamination (Kroker and Kroker 1987: 11–12). They argue that in the contemporary postmodern situation, the aestheticisation of the body and its ‘dissolution’ into its parts reveal that we are being processed through a media scene. They list the modes by which the body becomes subordinated:
• Ideologically: The body is inscribed by the signs of the fashion industry. Here the body is a screen effect for a desperate search for desire.
• Epistemologically: Knowledge of the body becomes a basic condition of possibility for the operation of power.
• Semiotically: The body is a floating sign, a surface to reflect the signifiers of modern capitalism and consumerism (tattoos, for instance). This works in two ways: (a) exteriorisation of body organs—which means that hitherto concealed body functions are now performed externally (examples would include in vitro fertilisation [IVF], computers as external memory banks, and medical scans of the inner physiology); and (b) interiorisation of subjectivity, whereby bodies are the manufactured or surrogate receptors of the fashion industry’s desires. For example, ‘You are what you wear. So you ought to desire this particular image, seek to become like this, if you are truly a metropolitan stockbroker.’ Here the subjectivity of the individual is ‘designed’ and structured around the image the fashion industry has decided for the stockbroker.
• Technologically: The body is both superfluous, since the capitalist system can function without it, and yet necessary for the system, since the body is the locus of power (ibid.: 20–22).
Further, there are various body ‘rhetorics’:
• Economic: This targets the body as a privileged site for the acquisition of private property, and invests the body with the ideologies of desire (as we have seen in the preceding paragraph).
• Political: A rhetoric that sees the public body as simply ‘public opinion’, which can be manipulated at will and fed back into the political body.
• Psychoanalytic: Brings back the language of sexual desire and ‘marks’ the body (see Balsamo’s views later in this chapter) with the language of transgression, sublimation and censorship.
• Scientific: Speaks of the body as located at the intersection of structural linguistics, genetics and cybernetics.
• Sports: Where particular body parts are hypervalued and therefore commodified to excess. Thus feet in football or forearms in baseball become fetishes (Kroker and Kroker 1987: 21–22).
With these frameworks and analytical tools, we can now proceed to read the impact of cyberculture upon the body. One of the best descriptions of contemporary accounts of cybernetic technology and the body comes from Donna Haraway. Haraway describes her work as:
Explor[ations] [of] bodies, technologies, and fictions as they are constructed through the mediation of late twentieth-century communications sciences, the simultaneously organic and artifactual domains of biology and medicine, industry, and the military […] Cyborgs [as] [the] site of the potent fusions of the technical, textual, organic, mythic, and political. (1991c: 24–25)
The cyborg, as Haraway sees it, is an example of a figure that does not fit the taxonomic system. She adapts the term ‘inappropriate/d others’ from Trinh T. Minh-ha to describe the cyborg: ‘To be in critical, deconstructive relationality […] to […] not fit in the taxon, to be dislocated from the available maps specifying kinds of actors’ (ibid.: 23, emphasis in original). The cyborg, being neither machine nor human, neither animal nor human (and perhaps neither man nor woman), defies classification. Fitting into a given taxon provides a ready-made identity and a set of characteristics. Haraway is pointing to the construction of identities and subjectivities (‘human’, ‘animal’, ‘artifactual’) by technoculture. She gestures at the social constructions of such identities, and argues that the cyborg, instead, valorises the self-critical practice of difference (where the ‘I’ is never identical with itself) rather than
being constructed out of difference. Haraway’s description accounts for the several related components needed to analyse contemporary bodies, and informs much of what follows in this chapter.1
CYBORGS
In 1960 Manfred Clynes and Nathan Kline coined the term ‘cyborg’ (cyb[ernetic] org[anism]), defining it as ‘self-regulating man–machine systems’. They described this cyborg system thus:
To provide an organisational system in which such robot-like problems [such as the body’s autonomy and self-regulating mechanisms] are taken care of automatically and unconsciously, leaving man free to explore, to create, to think, and to feel. (quoted in Tomas 1991: 35)
Donna Haraway immortalised the concept of the cyborg in her 1985 ‘A Manifesto for Cyborgs’. Haraway’s interpretation of the cyborg has several features that have since been appropriated by theorists of the body.
(i) The cyborg, for Haraway, was a result of late capitalism, a creature of social reality and of fiction.
(ii) It was designed as an entity that would transgress the symbolic boundaries between (a) the human and the animal, (b) the physical and the non-physical, and (c) the organic and the machine.
(iii) Haraway’s cyborg, originating in feminist SF, is a creature of a post-gender world.
(iv) This cyborg is also ‘resolutely committed to partiality, irony, intimacy, and perversity’, as Haraway famously described it in the ‘Manifesto’ (Haraway 1991a [1985]: 151).
(v) For Haraway cyborg imagery has two important uses: (a) it enables a rejection of any totalising theory; and (b) it implies that we ‘take responsibility for the social relations of science and technology […] [thus] refusing an
anti-science metaphysics, a demonology of technology’. The cyborg belongs to ‘a family of displaced figures […] multiply displaced’ (Haraway 1991b: 13).
(vi) Nature itself, argues Haraway, is ‘reconstructed in the belly of a heavily militarised, communications-system-based technoscience in its late capitalist and imperialist forms’ (ibid.: 6).
Vivian Sobchack points out that both the ‘delusionary liberatory rhetoric of technophiles’ and the ‘dangerous liberatory poetry of cultural formalists’ (such as Baudrillard) disavow the moral and material significance of the lived body in their longing to escape the body and its limitations (Sobchack 1998 [1995]: 209–10). She argues that it is the lived body that provides the material premises for the intelligibility of moral categories—all of which proceed from a bodily sense of gravity and finitude. This, according to Sobchack, is the result of a ‘dangerous confusion’ between ‘the agency that is our bodies/selves and the power of our incredible new technologies of perception and expression’ (ibid.: 211). The embodying of identity in cyberspace is frequently the creation of the most desirable cyborg. Human bodies are simply receptive surfaces for the ‘most desirable images’ circulating in the media. Thus Elvis Presley or Marilyn Monroe are used as ‘desirables’ for augmenting/improving the human body in the West. This means that the received identity can be chosen from a range of images, and effected through technology. Nigel Clark notes that computer-generated ‘stereotypically spectacular bodies’ are actually a continuation of images generated by earlier media. These virtual images present an apotheosis of the earlier generation of beautiful bodies, having ‘absorbed and accumulated all the projections of the earlier era’ (Clark 1998 [1995]: 125). This is crucial to our understanding of the politics of cyberspace (see Chapter 5 on ‘Gender’). Regular users of computers experience a different relationship with their PCs. Rather than the human–computer dyad of self versus the other, for many users there is a blurring of boundaries between the embodied self and the PC. Users invest certain aspects of themselves and their cultures when negotiating the PC, and their use of computers contributes to the individuals’ images and experience of their selves and their bodies (Lupton 1998
[1995]: 98–99). In contemporary culture phrases like ‘human computer’ elide the human–machine difference. For Lupton the cyborg body with its perfection represents an anxiety about the human body—with its illnesses, decay, pain and leakage. Men, especially, find the cyborg body more attractive, since the woman’s body has always been seen as leaky and vulnerable (ibid.: 101). Lupton extends this argument to look at nerds and hackers. Popular representations of hackers ‘reveal’ them to be obese, pale-skinned, living on junk food and physically unattractive. They lack social graces and are ‘addicted’ to the computer and cyberspace. Once again the cyborg body, superbly fit, uncontaminated and rationalised, is contrasted with such a nerd-body (ibid.: 102–3).
POSTHUMANISM, INFORMATICS, AND THE BODY
One of the most detailed explorations of the development of the posthuman is Katherine Hayles’ How We Became Posthuman (1999). The study ranges across the new sciences of cybernetics and informatics, and the manner in which the two have ‘enveloped’ the body. Hayles’ work is the basis for much contemporary thinking on the so-called destruction of the liberal–humanist subject and the rise of the cyborg. Twentieth-century thinkers such as Michel Foucault had demonstrated that the ‘human’ as a category emerges as the result of a network of ‘discourses’—medical, psychological, legal and philosophical. People began to ‘talk about’ the ‘human’ in these terms with the European Renaissance. The human was defined, classified (separated from other life forms through taxonomy and classificatory mechanisms) and studied. The body was dissected, explored, exhibited and incarcerated. Science studied the body’s workings, its flaws and pathologies. Religion began to speak about the ills of the flesh and the immortality of the soul resident in the body. The law drafted modes of behaviour for the body—treating some acts as illegal or transgressive, and therefore offensive. Psychology began to speak about the thinking subject and the supremacy of reason (Descartes, of course, is the name that
comes to mind here). Art began to structure the perfect human form, and aesthetics began to define what this form ought to be like. All these cultural forms of knowledge centred the entity called the ‘human’. In the late twentieth century, with developments in information technology, medical technologies, behaviourism and theories of the mind, the human begins to be talked about in a different way. This new ‘form’ of the human involves nothing less than a radical redefinition of what it means to be human in an increasingly technologised age. Debates over the ‘seat’ of human subjectivity (mind or body?) apart, these redefinitions proceeded from an entirely new way of looking at life itself, and are crucial to our understanding of what a posthuman means in the twenty-first century. For our purposes, rather than survey the entire set of debates over mind, body, society and evolution, I shall limit my discussion to the role/question of the body in posthuman culture. The development of the posthuman has three interrelated ‘stories’ (as Hayles terms them): the reification of information, the cultural and technological construction of the cyborg, and the (assumed) ‘erosion’ of the liberal–humanist human subject in cybernetics. Hayles begins by looking at the three stages in the development of cybernetics and their corresponding view of life and the human (for a definition of cybernetics see Chapter 1). Hayles invokes Haraway’s term ‘informatics’ to refer to the technologies of information and the biological, social, linguistic, and cultural changes that initiate and accompany their development. Looking at contemporary representations of bodies in literature, Hayles argues that changes in these represented bodies are linked to changes in textual bodies as they are encoded within information media, as well as changes in the construction of human bodies as they interface with ICTs (Hayles 1999: 29). These are changes that occur in the material body and the message (codes of representation). 1. Stage 1: The age of Wiener, Claude Shannon and Warren McCulloch in the 1940s and 1950s. The Macy Conferences on Cybernetics (1943–54) forged a new paradigm in information theory. This stage redefined the human as an information-processing entity essentially similar to intelligent machines. By extension, a machine could function as a human. Working from
a premise of homeostasis (the ability of an organism to maintain steady states in erratic conditions), the first stage began to speak of a ‘feedback loop’. The feedback loop, through which information flows from outside into the organism to effect responses in the organism that help it maintain a homeostatic state, soon became identified with reflexivity. Reflexivity is ‘the movement whereby that which has been used to generate a system is made […] to become a part of the system’ (Hayles 1999: 8). Thus information flowing back from the observer/ environment into the system through the feedback loop, incorporates the observer as part of the system itself. The cyborg here is an amalgam of the mechanical and the organic human. 2. Stage 2: Influenced by the work of Heinz von Foerster, Humberto Maturana and Francisco Varela, this stage saw the world as a set of informationally closed systems. According to this theory, organisms respond to their environments in ways that are determined by their internal organisation. The only goal of the organisms is to continually produce and reproduce the organisation that defines them as systems. They are therefore, self-organising and autopoietic or ‘self-making’. The central idea here is that systems are informationally closed, and no information crosses the boundary separating the system from its environment. We do not see the world ‘out there’, but only what our internal organisation allows us to see. Information here is indistinguishable from the organisational properties of the system itself. Autopoietic theory seems to suggest that systems may be alive. 3. Stage 3: In this stage, characterised by research into artificial life, self-organisation of systems was seen to have two features— reproduction of internal organisation and emergence. Computer programs allow packets of computer codes to evolve spontaneously. The codes thus have a capacity to evolve. This feature makes them remarkably life-like. The third stage essentially saw life and the universe as composed of information. Jean Baudrillard’s statement in The Ecstasy of Communication sums up the late twentieth century’s notion of (human) life, and is startlingly similar to contemporary cybernetic theory: The human body, our body, seems superfluous in its proper expanse, in the complexity and multiplicity of its organs, of its
tissues and functions, because today everything is concentrated in the brain and the genetic code, which alone sum up the operational definition of being. (Baudrillard 1988: 18)
Donna Haraway in Modest_Witness makes a similar point about the Human Genome Project, arguing:
At whatever level of individuality or collectivity, from a single gene region extracted from one sample through the whole species genome, this human is itself an information structure whose program might be written in nucleic acids or in the artificial intelligence programming language called Lisp®. (1997: 247)
Therefore, these computer codes are life forms because they have the form of life—the informational code. What happens in the third stage is that information and materiality are seen as two distinct entities. Information, as Hayles tells us repeatedly, has ‘lost its body’: it is a disembodied, immaterial substance. Further, in the third stage, information is given the dominant position and materiality the subordinate one. In the late twentieth century, the body is therefore reduced to information. In Neuromancer, William Gibson thus describes the body as ‘data made flesh’. Simultaneously, information is no longer material; it lacks a body. Disembodiment is thus the characteristic of the age. However, Hayles asks that we pay attention to the material interfaces and technologies that make this disembodiment possible. In the human–computer interface, which we can now identify as central to the posthuman condition, ‘functionality’ is a key condition. When the user wears a data-glove, for example, the computer responds to the user and vice versa. The computer can now sense the body’s position and work accordingly (the computer’s sensors detect movement). Likewise, the user adapts to the computer’s responses. The user learns to move her/his hand in gestures that the computer can understand. Therefore a posthuman condition is one wherein the ‘computer moulds the human even as the human builds the computer’ (Hayles 1999: 47). The posthuman is not, Hayles emphasises, the destruction of the subject or subjectivity. Rather, it is the emergence of a certain
kind of subjectivity. This subjectivity is the result of the ‘crossing of the materiality of informatics with the immateriality of information’ (ibid.: 193). This means, simply, a set of two interacting polarities for Hayles. The first polarity is between the body as a cultural construct and the experience of embodiment. The second polarity is between inscription (system of signs and concepts) and incorporation (the material body that is the medium for the sign or gesture, such as the hand making a goodbye gesture). In order to understand the enormity of the posthuman as a concept, we need to look at these polarities in detail. The body, for Foucault and others, is a construct that emerges out of discourses. However, bodily practices have a physical reality that can never be fully assimilated into mere discourse. This brings in the notion of embodiment. Embodied cultural practices take into account variations of class, gender and race (among others) which allow certain discourses to spread through society. That is, there is a feedback mechanism between the discourse— ideas, concepts, ideologies—and materiality. ‘The body’ is an ideal, a discursive construct born out of a set of criteria about what constitutes the body. Images (from art to virtual technologies) create a normalised construct out of the several data points and deliver an idealised version of the object–body. This ‘body’ is thus a normative, an ideal standard. Embodiment is the specific experience in a locality, time and culture. It has its abnormalities, variations and particularities, which are seen as abnormal, varied and particular because of the ‘presence’ of the ‘normative body’. In every age embodiment is in interaction with such constructions of the ‘standard’ body. Embodiment is also, therefore, performative and subject to individual improvisations. The ‘body’ merges into information (precisely because that is what ‘body’ is: a set of ideas, data and concepts). But embodiment, rooted in the specific cultural location (ground), cannot. In simple terms, the ‘body’ is an abstraction of embodiment. Likewise, incorporation and inscription are in constant play. Inscriptions function independently of the material. A gesture or sign can be reproduced in another medium. The practice—rooted in the material—is abstracted into signs. There is a close link between incorporating practices and embodiment. Incorporating practices are actions coded into bodily memory by repeated performance (Hayles 1999: 199). The
body produces culture just as culture produces the body. For example, dress codes and body posture create norms of behaviour for men and women from childhood itself. Incorporating practices perform the bodily content, and inscribing practices interpret—that is, correct, modify or reject—them. Thus when people begin to use their bodies in different ways (for technological or cultural reasons), changing experiences of embodiment find their way into language/culture. For example, the human being’s experience of walking erect has created a set of verticality metaphors. People are described as morally ‘upright’, experiencing a ‘high’, starting at the ‘bottom of the ladder’, or feeling ‘down’ (Hayles 1999: 205–6). This is the feedback loop. Hayles locates five dimensions of these incorporated knowledges: (i) Incorporated knowledge retains its contextual nature since it proceeds from performance in specific contexts. (ii) This knowledge is ‘deeply sedimented’ into the body. (iii) It becomes habitual and therefore becomes more or less ‘invisible’. (iv) Because incorporated knowledge is contextual, resistant to change and obscure to the thinking/reflective mind, it actually defines the boundaries within which conscious thought takes place. (v) When changes occur in incorporating practices, they are usually linked to new technologies. This means that people adapting to new technological conditions adapt their bodies, and embodiment mediates between material technology and abstract discourse by creating new frameworks of experience. These frameworks, in turn, create new discourses about what the ‘body’ is (ibid.: 205). The next crucial step in understanding the posthuman ‘reconfiguring’ of the human (body) is to look at the fascinating new science of Artificial Life (AL). Hayles reads the work of Rodney Brooks and Mark Tilden (of the MIT and University of Waterloo respectively) for this purpose. New robots have sensors and actuators with little communication between them. There is no central representation/control, except when there is a conflict between these sensors. So the robot has no coherent, coordinated view of the world (or consciousness), and each system sees the world entirely
differently from the others. The robot learns about the world directly from its interaction with the environment. Distributed systems interacting with the environment thus see the world in very different ways. In terms of human evolution, consciousness is a late development, and is analogous to the system that kicks in after/when there is a conflict between the sensors in the robots just described. Thus consciousness is not an essential part of the system’s architecture: it arrives late on the scene after the interaction of the system with the environment has proceeded for some time (hence the slogan: ‘consciousness is an epiphenomenon’). Consciousness is thus an emergent and not an inherent property. The goal of AL is to evolve such a consciousness/intelligence within the machine, through pathways found by the computer codes (‘creatures’) themselves. Proceeding from this same idea, human consciousness is perched on top, in terms of time-of-arrival and performance, of the machine-like functions that the systems carry out. The machine becomes the model for understanding the human! This is how, in Hayles’ words, the human is ‘transfigured’ into the posthuman (Hayles 1999: 236–39). The posthuman is thus an ‘envisioning of the human as information-processing machines with fundamental similarities to other kinds of information-processing machines, especially intelligent computers’ (ibid.: 246). In terms of ‘cyberpsychology’ two opposing, i.e., utopian and dystopian, views of the posthuman emerge. In the first view, the dispersion of subjectivity is praised as freedom from repression. The freedom to choose and alter identities is seen as truly anti-deterministic in sexual, somatic and social terms. The other view mourns the passing of the human subject and her/his agency in technoculture (Sey 1999: 38–39).
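The distributed, ‘bottom-up’ control scheme that Hayles finds in Brooks’s and Tilden’s robots can be illustrated with a small sketch. The following Python fragment is a minimal, purely hypothetical illustration written for this discussion; it is not drawn from Brooks’s or Tilden’s actual systems, and all function names and sensor values are invented. Independent behaviours each read the same raw sensors and propose an action, no layer holds a global model of the world, and a simple arbiter matters only when the proposals conflict, an analogue of the ‘late-arriving’ consciousness described above.

import random

def avoid_obstacle(sensors):
    # Propose turning away when an obstacle is close; otherwise stay silent.
    if sensors["obstacle_distance"] < 0.5:
        return "turn_left"
    return None

def seek_light(sensors):
    # Propose turning towards the brighter side; otherwise stay silent.
    if sensors["light_left"] > sensors["light_right"]:
        return "turn_left"
    if sensors["light_right"] > sensors["light_left"]:
        return "turn_right"
    return None

def wander(sensors):
    # Default behaviour: simply keep moving.
    return "forward"

# The behaviours are independent of one another; the list order is used
# only to break ties between conflicting proposals.
BEHAVIOURS = [avoid_obstacle, seek_light, wander]

def arbitrate(proposals):
    # The 'late-arriving' layer: it matters only when behaviours disagree,
    # and then defers to the highest-priority non-silent proposal.
    active = [p for p in proposals if p is not None]
    if len(set(active)) <= 1:
        return active[0] if active else "stop"
    return active[0]

def step(sensors):
    # One control cycle: every behaviour sees the same sensors, none of
    # them consults the others, and the arbiter resolves the outcome.
    proposals = [behaviour(sensors) for behaviour in BEHAVIOURS]
    return arbitrate(proposals)

if __name__ == "__main__":
    for _ in range(5):
        sensors = {
            "obstacle_distance": random.uniform(0.0, 2.0),
            "light_left": random.random(),
            "light_right": random.random(),
        }
        print(sensors, "->", step(sensors))

Even in this toy version, the coordinating layer is an afterthought: each behaviour interacts with its environment on its own, and the arbiter steps in only when their proposals clash, which is the point Hayles draws from Brooks’s architecture.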
BODY MODIFICATION
One of the central themes of cyberpunk and technoscience films has been the modified body. Body modification in itself is not a new development. All cultures have had practices that augmented, ornamentalised and controlled the body’s appearance, functions and capabilities. The body is no longer a ‘given’. As Chris Shilling has demonstrated in his The Body and Social Theory, the body is now a ‘project’.
Chris Shilling argues that with modernity’s affluence, the body ‘is an entity which is in the process of becoming; a project which should be worked at and accomplished as a part of an individual’s self-identity’ (1999 [1993]: 5). The term ‘body modification’ is used to describe a range of practices including piercing, tattooing, cutting, branding, binding and inserting implants to alter the appearance and ‘performance’ of the body. Gymnastics, exercise and diets are also part of the procedures of body modification because they seek to alter the appearance or capabilities of the ‘natural’ body. Prosthetic devices, from a pair of spectacles to nanotechnological neurological implants, are, in effect, body modification technologies. Body modification can be a liberatory experience. Prosthetic devices enable the disabled or the injured to lead a normal life despite their bodies.
BOX 4.1 MODERN PRIMITIVES
Modern Primitives is a subcultural movement that originated in the 1970s in California. Body Play and Modern Primitives Quarterly is its main magazine. Fakir Musafar, the founder of the movement, is a practitioner of body modification, or what he terms ‘Body Play’. Body Play is a ‘deliberate, ritualised modification of the human body’, according to Musafar. Musafar lists seven categories of body play:
• Contortion: Gymnastics, yoga, high-heeled shoes, stretching parts of the body.
• Constriction: Bondage, ligature, tight clothing, corsets, body presses.
• Deprivation: Fasting, sleep deprivation, restriction of movement, cages, helmets, body suits.
• Encumberment: Heavy bracelets, neck ornamentation, manacles, chains.
• Fire: Sun tanning, steam/heat baths, branding.
• Penetration: Flagellation, piercing, tattooing, spiking.
• Suspension: Hanging the body from flesh hooks.
‘Modern Primitives’ seek inspiration from tribal practices of body modification, especially suspension, tattooing and piercing. Members see these practices as ‘spiritual’, the satisfaction of a primal urge. ModPrims revive many non-European cultural practices mainly through their interest in anthropology. Fakir Musafar thus
adapts the bark belt (called ‘bearing the bark’) from the Ibitoes of New Guinea and carries weighted objects like the ‘Ball Dancers’ of South India (who carry loads of hooked-on limes on their bodies). In addition, ModPrims adapt contemporary art and technological forms. This includes the use of digitally remixed music, laser and light effects, piercing their ears with computer chips, wearing clothes that were fashionable in the 1970s and decorating themselves with futuristic hologram jewellery. There is a lot of eroticisation in ModPrims (genital piercing is a much-favoured practice) with close links to S/M, though by no means are their practices exclusively sexual. ‘Body Play’ in the ModPrims becomes an answer to the monotony of capitalist modernity, where entertainment and every aspect of life is predictable and routine. Television, radio and cinema have all resulted, for the ModPrims, in a deadening of the senses. Since pain is an absolute singularity unique to the sufferer, S/M and ModPrims see Body Play as the last authentic experience in a world where nothing is authentic. In response to the endless repetition of the banal stream of empty images (neon signs, flickering computer/TV screens, unending soap operas, movie-star icons), ModPrims see the pain of the body as a threshold event, beyond and outside which there is no authenticity any more. There is a certain element of the magical in the ModPrims. They seek to combine the occult with contemporary technology, totemic practices with information technology. Thus rocket propulsion systems, software, LSD and folk jewellery find equal space in their art. They are also believers in ancient cosmology, ancient shamanic rites and such (adapted from African and Asian cultures again). They have borrowed from contemporary Chaos Theory, Hermeticism and neuroscience. They combine the ‘low’ technology of ancient cultural rites with the ‘high’ technology of cyberculture. The combination of a romanticised pre-civilised, pre-mechanistic past with the mind-boggling technology of today is the most fascinating feature of ModPrims. This is why the ModPrims have been described as ‘technoshamans’. They have been accused of commodifying into Western consumer culture what were essentially communitarian acts of identification, and of retrieving old colonial notions of the ‘primitive’. Their practice is also seen as a commodification of ethnic, racial and cultural difference. The ‘Modern Primitives’ practices of tattooing can also be seen as a means of creating collectivity. James Myers argues that in an increasingly fragmented and individualistic society, people are looking for communal rites. Myers sees these practices as ceremonial events akin to traditional rites of passage (quoted in Klesse 2000: 22).
For others it can be a question of control over one’s body. Thus exercise regimens to shape the body according to one’s needs— and the fashion of the day—are body modification techniques. A modified body becomes a visible sign of identity. In consumer culture the body is one of the first objects to be transformed and commodified—where the body’s shape, appearance and functions are crucial to identity. In sharp contrast, for critics of such subcultural practices, cutting and mutilation—frequently involving pain—is a violation of the body, and an indication of the lack of respect for the self. To such critics, body modification suggests the irrelevance of and irreverence to the body. This ‘project’ of altering the body in order to make a political statement is what I have elsewhere termed ‘the politics of the prosthesis’, connoting the body’s prosthetic supplementarity—meaning both necessary completion and excess, simultaneously—of and to politics, and the political implications of corporeal cyborgisation (Nayar, ‘The Politics of the Prosthesis’ ). Certain kinds of body markings or modifications can be communitarian, identifying the wearer as a member of a particular group or subculture. Bryan Turner has argued that in traditional societies such markings were ritualistic and obligatory (for instance, used to mark a rite of passage into adulthood) to indicate membership of and solidarity with a tribe. In contemporary societies, where no such ‘thick’ loyalties exist, these become optional, voluntary, and even narcissistic (Turner 1999: 39–50). Such practices could also be oppositional to mainstream culture, where particular members ‘set themselves off ’ from others through common practices of tattooing. Western punk, for instance, with its tattoos, coloured hair and leather, contrasted with the suits and formal ties of the upper class and created its own subculture with an alteration of style. In the case of performance artists such as Stelarc and Orlan, the body is a limit that must be overcome. They both seek ‘post-evolutionary bodies’, where technology enables a body to generate new forms of sensory experience and functions. Thus we can see how technological modifications are less about the technology than about the culture in which such practices circulate. The body is the instrument, site
and ‘reaction’ to specific sociocultural and political conditions. In short, it is techne-.

Mike Featherstone argues that body modification also gestures at the increasing sexualisation of the body, where the body is now seen (even more) as ‘a vehicle of pleasure and self-expression’ (Featherstone et al. 1995 [1991]: 170). Thus the presentation of the body (the style, appearance) is now a part of the individual’s identity. Also, practices that augment appearance are closely linked to desirability and the status it bestows upon the individual. Indian markers of marriage, for instance, are also body modification techniques that supposedly add ‘respectability’ to the woman.

Polhemus and Proctor see subcultural body modification practices as ‘anti-fashion’. Fashion, for them, suggests a degree of social mobility and change, since fashion changes with the seasons (quoted in Sweetman 2000: 52). Anti-fashion is more permanent, since in most cases tattoos or scars are permanent and cannot be removed except through painful efforts. Anti-fashion practices therefore become signs of resistance to the status mechanism of mainstream fashion changes and the ideology of social mobility.

Technobodies, such as the kind Stelarc seeks, are also modified bodies. In the case of the technobody, the question is not just of appearance but of functionality. Stelarc’s declaration—‘The Body is Obsolete’—is then a manifesto, a programme of action. His prosthetics of the body—the third arm, the extra ear—are attempts to extend the parameters of the body’s performance. Thus he seeks an augmentation of the visual senses, tactile sensations and mobility. The attempt then is to overcome the biology of the body, to transcend the limitations that the body imposes, seeking what Stelarc has termed ‘alternate operational possibilities’ (quoted in Zurbrugg 2000: 112).

The Extropians (a group devoted to posthuman technology and philosophy) define transhumans thus:

Persons of unprecedented physical, intellectual and psychological ability, self-programming and self-defining, potentially immortal, unlimited individuals. Posthumans may be partly or mostly biological in form, but will likely be partly or wholly postbiological—our personalities having been transferred
‘into’ more durable, modifiable, and faster, and more powerful bodies and thinking hardware. This is called cyborgization. Some of the technologies that we currently expect to play a role in allowing us to become posthuman include genetic engineering, neural–computer integration, molecular nanotechnology and cognitive science. (On extropians see .)

This form of body modification, which combines high tech with contemporary aesthetic concerns, produces the culture of the posthuman body (posthumanism is seen as a ‘subgenre’ of transhumanism). I have argued elsewhere that we need to account for that unquantifiable thing—the human desire to ‘overcome’ the limits of the body—when talking about technological modifications. Technology—whether political or social—has always had to contend with the ‘problems’ of Victor Frankenstein: science as a source of glory and power, and the human desire to exceed the body. Naturally, such amorphous concepts as greed, luxury or pride inform both our individual/personal and communal/collective politics (Nayar, ‘The Politics of the Prosthesis’). While it is perhaps difficult to work categories such as human desires into the politics of technology, the informing assumption of this book (inspired, as the first chapter outlines, by the work of Bourdieu on distinction) is that such factors are crucial in the social life of technology.
TECHNOLOGICAL EMBODIMENT

Body modification is an attempt to redefine, reorganise and, in some cases, revitalise the human body. Contemporary medical technologies have a central role to play in body augmentation (through prosthesis, for instance) and alteration (say, through transsexual surgery). Anne Balsamo (1998 [1995]: 215–37) discerns four major forms of technological embodiment today. These categories
enable us to understand not only the ways in which the body has been transformed, but also the modes of thinking about the body:

1. The labouring body: This includes mainly maternal, reproductive bodies. Today such bodies are also seen as technological—where the reproductive body is the container for the foetus, and the object of technological manipulation in the service of human reproduction. Thus drug-addicted or alcoholic mothers (or mothers-to-be) are charged with criminal child neglect. Balsamo points out that such legislative measures mean that the foetus is seen as an entity with rights. And this transformation of the notion of the foetus is made possible through visualisation technologies such as sonograms and laparoscopy. The mother’s body is only a medium through which this new entity is viewed.

2. The repressed body: Virtual Reality actually provides a sense of control over reality, nature and the body. When immersed in a virtual environment, even if linked to motion and visual sensors, the material body is visually and technologically repressed. The body, as Balsamo puts it, is a ‘sense apparatus […] nothing more than excess baggage for the cyberspace traveller’ (Balsamo 1998 [1995]: 229). Thus virtual reality is essentially an attempt to transcend the imperfections of the body, and attain one with only the preferred and ‘best’ qualities. However, as Balsamo demonstrates, what people seek through reconstruction and technological augmentation are still traditional and socioculturally desirable/prestigious markers of beauty, strength and sexuality. Thus reconstruction of the body rarely implies the reconstruction of a cultural identity (ibid.: 229).

3. The disappearing body: The natural body is augmented, modified and replaced in sections. Life magazine in 1988 was already announcing, in a piece called ‘Visions of Tomorrow’, a catalogue (what it called ‘a Sears catalogue of body options’) of body parts freely available in the market: wrist and elbow joints, tendons and ligaments. Soon it should be possible to replace glass eyes with electronic retinas, for instance, and pacemakers with bionic hearts. More importantly, bodies literally disappear as coded information in large electronic databases, which can be summoned up (resurrected as ‘bodies’) at the touch of a mouse.
However, there are other issues involved here. The Human Genome Project’s mapping of the human genetic code has a related project: the Human Genome Diversity project, whose stated agenda is to record the ‘dwindling genetic diversity of Homo sapiens by taking DNA samples from several hundred distinct human populations and storing them in gene banks’. Some populations see such a coordinated effort at digitising ‘disappearing’ bodies/populations as a threat. The members of the ‘recorded’ population themselves have no access to this information and, as such, cannot monitor its use. In a sense, their bodies disappear, to reappear on somebody’s (this somebody being, inevitably, a person from a developed nation with access to the databank) desktop.

4. The marked body: This kind of body underlines the cultural construction of a body’s identities. Identities here become signs and commodities. The ‘natural body’ is transformed into a sign of culture by the fashion and cosmetic industry. Exotic and ethnic wear, for instance, is a hallmark of the urban corporeal identity based in multiculturalism. Then, the eroticisation of women’s bodies, with further fine multicultural layers of the brown/black body (with specific features and dress), is also increasingly possible. In cosmetic surgery computer technology plays an important role in marking and selecting identities for bodies. The surgeon, with the aid of computer design tools, can now illustrate possible forms for the face, thereby providing the client a range of choices. Invariably, these ‘possibilities’ are themselves culturally conditioned and projected as ‘desirable’ (‘what’s in, what’s not’). Balsamo writes: ‘One of the consequences is that the material body is reconfigured as an electronic image that can be technologically manipulated on the screen before it is surgically manipulated in the operating room’ (1998 [1995]: 226).

Two of the most important projects that constitute new forms of such technological intervention in the representation and ‘reading’ of the body are the Visible Human Project (VHP) and the Human Genome Project (HGP).2 These two projects, with their medico–legal technologies, present the ultimate (for now) ‘spectacularisation’ of the human body. That is, the two projects are basically ways of perceiving and representing the body.
THE VISIBLE HUMAN PROJECT

The Visible Human Project (VHP) was launched in 1986 by the National Library of Medicine (NLM), Maryland, USA. The idea was to complement the printed and bibliographic database about the human body with digital images. Eventually, it was made interactive: a digital anatomy. Several research centres collaborated on sections of the project, with the Centre for Human Simulation (University of Colorado) and Stanford University Medical Media and Information Technologies (Stanford) at the vanguard. The stated aim, according to the NLM, is ‘to transparently link the print library of functional–physiological knowledge with the image library of structural–anatomical knowledge into one unified resource of health information’.

The VHP acquired CT (Computed Tomography) and MRI (Magnetic Resonance Imaging) scans of a representative male and female cadaver. Slices of the cadaver were digitally photographed, and further scanned using both CT and MRI. These three imaging modes were collated in a computer and a data package of the entire cadaver prepared. The package, available as CD-ROMs, allows one to view cross-sections of the body, restack sections and reassemble the body, all in 3-D space. In short, the viewer can reorganise, reorder and manipulate the virtual corpse (inspiring the term postnatural).

The cadavers were obtained from two sources. The male came from the body of a convicted murderer, Joseph Jernigan, who donated his body to medical science before his execution in Texas. The female cadaver was that of an unidentified Maryland woman who died of a cardiac arrest. These two constitute the Visible Man and the Visible Woman respectively. The complete male dataset was made available to the public in November 1994, and the female one in November 1995. In August 2000, higher resolution axial anatomical images (the result of even thinner slices) were released. There are now 1,871 cross-sections of the human body (Jernigan’s and the woman’s), with a digital resolution of 4,096 by 2,700 pixels, providing immensely clear images on the screen. Scaled-down versions of all of these images are now available on the Internet for a fee. Some sample full-scale images are also available. In October 1996 the NLM sponsored the first Users’ Conference on the VHP.
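The kind of reordering the VHP data package permits can be pictured in miniature. The sketch below is a toy illustration only: the array sizes and random values are invented, not actual VHP data. It stacks a series of 2-D slices into a 3-D volume and then cuts new cross-sections along other planes—essentially what ‘restacking’ and ‘reassembling’ the virtual body involves.

    import numpy as np

    # Toy stand-in for digitised axial slices of a body (invented dimensions).
    num_slices, height, width = 100, 256, 256
    slices = [np.random.rand(height, width) for _ in range(num_slices)]

    # Restack the cross-sections into a single 3-D volume (depth, height, width).
    volume = np.stack(slices, axis=0)

    # 'Reorganise' the virtual body: cut new cross-sections along other axes.
    axial = volume[50, :, :]       # the original slice plane
    coronal = volume[:, 128, :]    # a front-to-back section
    sagittal = volume[:, :, 128]   # a side-to-side section

    print(volume.shape, axial.shape, coronal.shape, sagittal.shape)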
The second phase of the VHP is now underway. Segmentation and a prototype database of one section of the Visible Male (the thorax region) have been built (called AnatLine). In the future, NLM will put out a Web-based ‘atlas’ of the head and neck regions. The entire dataset will be available in six functional anatomy teaching modules. Soon NLM hopes to develop a ‘haptic control’ (‘haptic’, meaning the sense of touch) visualisation of the images using what has been termed Next Generation Internet (NGI). For the first time, a user can explore the interior of another’s body. Images from the VHP enable the user to program the ‘body’ to simulate a beating heart, tissue density and facial features, to transpose anatomical structures and even to move limbs. S/he can follow the course of blood circulation—what the NLM has termed ‘flythrough’—or a neural transmission without uncomfortable encounters with blood and other body fluids.

The NLM refers to the VHP as ‘common public domain data’. Note the emphasis on body/anatomy as ‘data’. This illustrates the argument about the informatisation of the body in the age of cybertechnology. The body becomes a virtual world in such a CD, open to exploration and discovery, but always with the potential of being embodied. The advertisement for the VOXELMan 3D-Navigator: Brain and Skull from Springer-Verlag (Heidelberg and New York) describes it as an ‘anatomical atlas’. Towards the end the advertisement specifies: ‘The skull, the brain with its constituents, functional areas, vessels, and blood supply areas may be investigated in 36 interactive scenes’ (the word ‘scene’ is derived from skene, meaning ‘stage’). This suggests a dramatic staging of the human body. (For further details see the Website for the VHP: .)
THE HUMAN GENOME PROJECT

The US Human Genome Project (HGP) was officially launched in 1990 by the US Department of Energy and the National Institutes of Health. Its goals are to:
(i) identify all of the approximately 30,000 genes in human DNA;
(ii) determine the sequence of the 3 billion chemical base pairs of human DNA;
(iii) store (digitise) this data;
(iv) improve tools for data analysis;
(v) transfer related technologies to the private sector; and
(vi) address the ethical, legal and social issues (ELSI) involved (the official Website announces proudly that ‘it is the first large scientific undertaking to address the ELSI implications that may arise from the project’).

The HGP has launched a new field: genomics, or understanding genetic material on a large scale. Genomics has extended into human health applications and has given rise to genomic medicine (see section on ‘Info-Medicine, Nanotechnology and the Body’).

A related project that provides an understanding of the cyborgisation of the human body and is linked to the HGP is the Genomes to Life program, also launched by the US Department of Energy. The goals of the program are to:

(i) identify and characterise the molecular machines of life—the multiprotein complexes that execute cellular functions and govern cell form;
(ii) characterise gene regulatory networks;
(iii) characterise the functional repertoire of complex microbial communities in their natural environments at the molecular level; and
(iv) develop the computational methods and capabilities to achieve an advanced understanding of complex biological systems and predict their behaviour (Genomes to Life, April 2001: 1).

The Genomes to Life program requires very high technological sophistication. The inaugural issue of the newsletter specifies the following: nuclear magnetic resonance (NMR) spectroscopy, neutron scattering technology, imaging technologies including confocal microscopy, x-ray microscopy and electron tomography, mass spectroscopy, microarray technologies, etc. (Genomes to Life, April 2001, Appendix A: 54–64).
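What ‘storing (digitising) this data’ involves can be suggested, at its most elementary, with a toy sketch: a DNA fragment held as a string of the four chemical bases, over which simple analyses are run. The sequence and the marker motif below are invented for illustration and bear no relation to actual HGP data.

    # Toy illustration of sequence data held and analysed digitally.
    sequence = "ATGCGTACGTTAGCATGCCGTATGCGT"   # invented fragment

    # Base composition: count each of the four chemical bases.
    composition = {base: sequence.count(base) for base in "ACGT"}

    # Find every position where a short (hypothetical) marker motif occurs.
    motif = "ATGC"
    positions = [i for i in range(len(sequence) - len(motif) + 1)
                 if sequence[i:i + len(motif)] == motif]

    print(composition)
    print(positions)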
BOX 4.2 GENETICALLY MODIFIED (GM) FOODS

19 May 1994 is an important date in human history. Associated Press announced that Calgene, Inc., was finally given approval by the US Food and Drug Administration (FDA) to market Flavr Savr, a ‘Gene-Engineered US Tomato’. 1994 is also important for other, similar events. It marks the first authorisation by the European Union to market a transgenic plant—a tobacco plant. It is also the year when the first commercialisation of a transgenic plant—the delayed ripening tomato—takes place in the USA. Associated Press published a list of seven genetically engineered foods pronounced ‘safe’ by the FDA: three more tomatoes, a squash genetically altered to naturally resist two deadly viruses and a potato that naturally resists the Colorado potato beetle. Finally, The Washington Post in 1994 also announced the creation of Darwin Molecular Corporation, a new biopharmaceutical company with plans to develop treatments for AIDS, cancer and autoimmune diseases through computer analysis of DNA. The new company had investments of $10 million from Paul Allen and Bill Gates, the founders of Microsoft.

GM foods are also a development of cyberculture, info-medicine and genetic engineering. GM is a technology, or set of technologies, that alters the genetic make-up of living organisms such as animals and plants. Using what is termed ‘recombinant DNA technology’ (combining genes from different organisms), engineers produce GM products such as medicines, vaccines, food and food ingredients, and feeds. GM crops are usually crops altered for specific traits: early yield, disease resistance or particular nutrient content. Today GM crops are grown commercially or as trials in 40 countries. In 2000, 109.2 million acres worldwide grew GM crops.
The Microbial Genome Program (launched in 1994), another related research project of the HGP, seeks to ‘determine the complete DNA sequence—the genome—of a number of non-pathogenic microbes’ (Microbial Genome Program, February 2000: 3). The intention is to understand how the microbe’s genetic structure enables it to ‘act’ on the environment. Microbes inhabiting the earth’s various regions affect or catalyse processes ranging from food decay to environmental clean-up and climate change.
Genetically altered microbes can then be used to perform activities that have a positive role to play in human affairs such as brewing, baking, the dairy industries, pollution control, agriculture and several others. The program’s newsletter presents a future in which we can

• use ‘super bugs’ to detect chemical contamination in soil, air and water, and clean up oil spills and chemicals in landfills;
• cook and heat with natural gas collected from a backyard septic tank or bottled at a local waste-treatment facility;
• obtain affordable alcohol-based fuels and solvents from cornstalks, wood chips and other plant by-products; and
• produce new classes of antibiotics and process food and chemicals more efficiently (Microbial Genome Program, February 2000: 1).

Two other related developments include:

1. The Human Proteome Project (HPP): A project in proteomics—the study of protein expression and function. The Human Proteome Organisation (HUPO) was formed in June 2001 to ‘increase awareness of the HPP and the importance of proteomics in the diagnosis, prognosis and therapy of disease’ (Human Genome News, Vol. 11, Nos 3–4, July 2001: 7).

2. Toxicogenomics: The utilisation of human genome data in toxicology, studying gene response to environmental stressors and toxicants. A National Center for Toxicogenomics was set up in 2000 under the US National Institute of Environmental Health Sciences.
INFO-MEDICINE, NANOTECHNOLOGY, AND THE BODY

Medical science has been one of the more important discourses for the control and modification of the human body. In contemporary times, nanotechnology and info-medicine have intruded further into its investigation of the human body. As we have seen, the VHP is an important moment in the digitisation, informatisation
and virtualisation of the human body. The HGP transforms the body into a set of codes and data for storage. In Donna Haraway’s words:

[In the HGP] we become a particular kind of text which can be reduced to code fragments banked in transnational data storage systems and redistributed in all sorts of ways that fundamentally affect reproduction and labour and life chances and so on. (Haraway 1991b: 6)

With VHP and HGP, medical science now ‘mathematises’ (to use Chris Gray’s term) the human body. There is, today, an explosion in knowledge about the human body. Drugs are now designed to produce specific mental effects. There are technological devices that monitor brain activity and the influence of drugs upon it. Most importantly, medicating oneself is now more or less acceptable to society (Gray 2001: 70).

Another development in the area of medicine is the increase in the number of ‘be better’ drugs. Drugs to improve life and looks abound today—drugs for baldness, cholesterol, bone density, bladder control and skin texture are commonly available. The sheer increase in the use of cosmetic drugs (for practically every part of the body) suggests a dual theme: the increasing obsession with looking younger and better (the body as a desirable phenomenon), and the boom in consumer cosmetic medicine.

Medical modification of the body is a common event today. Several parts of the body can now be replaced, with the heart and kidneys being the most common. There are also electrical stimulation systems that are implanted to control sleep disorders (by stimulating diaphragms), induce urination, improve defecation, facilitate penile erection in paraplegics and stimulate atrophied muscles in para- and quadriplegics. Developments in polymer chemistry have enabled the creation of polymer substitutes for skin, cartilage, glands, bones and vessels. Artificial limbs are much more sophisticated and easier to use today. Sensors implanted in muscles can now pick up electromyographic signals and command prostheses to move. Head motions and voice commands can now be used to move robot arms and computer-directed prostheses. In the military such sensors enhance vision to enable
seeking/tracking targets more effectively. In all these cases medical technology (a) restores lost/reduced functions of the body, and (b) enhances these functions. The following pages discuss specific forms of this technologisation of the body.
Natural Organ Transplants

Biologists today engineer organisms made with genetic material from two species. As Chris Gray points out, such transplanted bodies are chimeras (after the mythic creature with the head of a lion, the tail of a serpent and the body of a goat), since they combine at least two species within them (Gray 2001: 82). They are also cyborgs, where different species and technological prostheses unite.
RealVideo Surgery

In August 1998 the American Health Network (AHN) broadcast live over the Web an open heart surgery being performed at Providence Hospital in Seattle. This was a few months after the first live Webcast of a birth on the Internet. A head-mounted digital camera was worn by the chief surgeon, and was accompanied by a continuous voice-over. The signal was then sent over the Internet using software called RealVideo. This is one more step in the ‘spectacularisation’ of the human body. As Eugene Thacker points out, in RealVideo surgery the Internet and software technology are not essential to the surgical procedure itself. What RealVideo surgery achieves is to combine modern institutionalised medical surgery with the public context of early modern anatomy theatres (the dissection and anatomy theatres in Padua, Bologna and Leiden; see Jonathan Sawday’s The Body Emblazoned [1996] for a survey). With RealVideo surgery the body is marked through technologies of medical knowledge and the mediation and digitisation by Web technologies (Thacker 2000: 329).
Cybernetic Human Reproduction

Perhaps the most controversial of all new technologies—since it affects our thinking about the ‘natural’ process of procreation and
children as ‘gifts’ of god—artificial human reproduction deserves a separate study under cyberculture and the body as well as under cyberpolitics. Questions of ethics, values, consumption and production are implicit in debates around artificial reproduction, surrogate womb technologies and embryo research. Most importantly, the new technologies are closely aligned with social structures and relations because, as feminist critiques point out, they hinge upon certain notions of the family, parenting and the woman’s role. It is for this reason that cybernetic human reproduction figures most prominently in feminist debates about science (and, almost equally prominently, in medical ethics).

Artificial reproduction is a new development in the technological control of human bodies. Louise Brown, the world’s first human test-tube baby, was born on 28 July 1978. Described as a ‘miracle’ in Lesley and John Brown’s narrative of the experience, Our Miracle Called Louise (1979), Louise Brown marked the start of major developments in human reproductive technologies. Today the foetus—previously inaccessible and mysterious—can be easily manipulated through technology. Ultrasound, for instance, enables parents-to-be to start visualising and bonding with their child even before the actual birth. This is cyborgisation, where parents mediate and communicate with the ‘child’/foetus through a machine. Contemporary technology also enables a dead woman to be kept alive as a cyborg womb so that the baby can be delivered. (A famous case is that of Marie Odette Henderson. Henderson died of a brain tumour in 1986. By court order her body was kept functioning for 23 days until the respiratory system of the foetus she carried had matured sufficiently to be independent. The baby was delivered by caesarean section and only then was Henderson disconnected from the life support systems.)

From mere control over human bodies, contemporary medical technologies now seek a transformation of reproductive bodies. Sarah Franklin’s study of ‘assisted conception’ is an important contribution to the debates around technologies of the body. Franklin suggests that there are three components in contemporary reproductive medicine:

(i) There is a significant discrepancy between the representation of in vitro fertilisation (IVF) as a series of progressive stages and the experience of the procedure as a serial failure to progress.
(ii) Technological intervention and potential enablement now define the very reproductive process, and become the focus of the hopes and desires of both clinicians and clients. Reproduction comes to be redefined by the process of becoming technologised, commodified, professionalised, until eventually ‘achieved’. The biological function of reproduction is seen as capable of being assumed technologically. When IVF fails, the response is not to abandon the attempt but to improve the technology.
(iii) There is a production of new uncertainties. Although the degree of technological intervention into human reproduction proceeds from better knowledge and confidence, failure breeds uncertainty and IVF becomes a gamble (Franklin 1997: 10–11).

Franklin points to certain other features of the new reproductive technologies.

(i) IVF mystifies the ‘facts of life’. There are no explanations for missed conceptions, though successes can be explained: biological explanations are most effective in accounting for successful conceptions, and least effective when it comes to reproductive failure.
(ii) The inadequacy of the biological model is reflected in the language of hope and miracles that describes successful conceptions (such babies are frequently described as ‘miracle babies’).
(iii) The clinic becomes the site of technological promise and potential. ‘Hope’ (for a techno-assisted natural miracle) becomes the most important value and is signified by the image of the desperate, infertile woman. The hope stands for a belief in technological progress.
Pharming

This is the process—still at a preliminary stage—through which large mammals are cloned with added human genes. The process was developed in order to produce substances that human bodies need for medical reasons. ‘Pharm-products’ typically include
cartilage, bone tissue, nerve tissue and human skin. Eventually, scientists believe, it might be possible to modify pigs and other mammals to produce organs for human transplants. Even humans could be modified to carry certain antibiotics in their blood.

Oncomouse, developed at Harvard University in 1988, was a transgenic mouse with human and chicken genes. It was made susceptible to cancer (hence ‘oncomouse’, from ‘onco’, meaning cancer), and genetically modified so that the trait would be passed on. Interestingly, such organisms as oncomouse or the oil-eating microbe (engineered in 1971) can be patented; humans cannot (according to a US Supreme Court ruling, on the basis that the 13th Amendment outlaws slavery!). The ruling, however, does not mention genetically engineered human tissues, cells or genes.
MEDICINE AND GENOMICS

This involves several related developments in the new medical ‘systems’. The main aim is to diagnose and predict disease, and the vulnerability of certain races/populations to specific diseases, by looking at the racial genetic structure.
Gene Therapy

This can be used to treat both genetic and acquired diseases. It means using normal genes to replace or supplement a defective gene, or to bestow greater immunity on the natural body. In this therapy disease is prevented or treated by altering a person’s genes. There are two forms of the therapy at present: somatic, where a person’s genes are changed but the change is not passed on to the next generation, and germline, where parent (egg and sperm) cells are changed so that the change is passed on to the next generation. Drugs can be genetically ‘designed’ to target specific sites in the body or to suit different people’s ‘constitutions’.
Gene Testing

DNA-based tests are used to examine the DNA molecule for genetic disorders. These tests can be used to identify the unaffected
but carrier individuals. Such individuals may possess one copy of a gene, of which the disease requires two for expression. These tests are also used for newborn screening, pre-symptomatic tests for adult ailments such as Huntington’s disease, cancer or Alzheimer’s disease, and identity/forensic testing.
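The carrier logic described above—one defective copy of a gene produces an unaffected carrier, while two copies are required for the disorder to be expressed—can be put in a few lines. The sketch below is a toy rendering of that recessive-inheritance rule only; the genotype labels are invented and no real DNA-based test works this simply.

    # Toy recessive-carrier logic: 'A' is a normal copy, 'a' a defective copy.
    def classify(genotype):
        defective_copies = genotype.count("a")
        if defective_copies == 0:
            return "unaffected, not a carrier"
        if defective_copies == 1:
            return "unaffected carrier"
        return "affected (two defective copies)"

    for genotype in ("AA", "Aa", "aa"):
        print(genotype, "->", classify(genotype))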
Pharmacogenomics

This is a new area that combines the traditional study of the effects of drugs with genomics. Pharmacogenomics studies an individual’s response to a drug in terms of her/his genetic traits. That is, it analyses how a person’s genetic code—inherited from parents, community, race, etc.—affects her or his response to drugs. The HGP and the Human Genome Diversity project, with their databases of the genetic codes of entire populations, will then be able to predict the kind of drugs that will suit these populations. Further, once a person’s genetic code and the potential response to drugs have been digitally recorded, it will be possible to design drugs to suit her/his genetic make-up. We are therefore looking at ‘designer’ and ‘personalised’ drugs as a direct result of the cybercultural revolution.
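At its crudest, the prediction sketched above is a mapping from genetic markers to expected drug responses. The lines below reduce that idea to a lookup table; the marker names and responses are wholly invented, and actual pharmacogenomics rests on statistical models far beyond this.

    # Toy mapping from (hypothetical) genetic markers to expected drug response.
    marker_response = {
        "variant-1": "standard dose expected to be effective",
        "variant-2": "slow metaboliser: lower dose advised",
        "variant-3": "little therapeutic response expected",
    }

    def predict_response(patient_markers):
        # Return the prediction for the first marker found in the table.
        for marker in patient_markers:
            if marker in marker_response:
                return marker_response[marker]
        return "no prediction available"

    print(predict_response(["variant-2", "variant-9"]))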
Stem Cell Research

Very early in the development of the human embryo, the cells have not yet begun differentiation into blood, muscle, or any other type of cell. That is, at this stage, all cells can develop into any cell in the adult human. This property is called pluripotency, and is lost in the second week after fertilisation. Stem cell research attempts to return the cells to their pluripotent state. If this can be achieved then the cells, back in their undifferentiated state, can be engineered (by exposing them to different molecular compounds) to develop into particular cells. Thus the cells can be used to grow fresh cells to replace tissues (which are collections of cells) lost in accidents or disease. Nerve cells damaged by neurological illnesses, for example, can then be replaced by fresh cells. Such cells can also be used to grow ‘custom-made’ tissue that exactly matches a particular body, which will help reduce the problem of immuno-rejection during transplants because the
replaced organ/tissue has the same ‘configuration’ as the host. This procedure is called ‘therapeutic cloning’. It involves injecting a patient’s own DNA into an egg and then stimulating the cell and egg to fuse and start dividing. Each cell in the resultant embryo would have the same DNA as the patient, and tissues built from these cells would match the patient’s own tissues, thus preventing rejection. In keeping with cybertechnology, this research is also rooted in the information carried by the embryo, and its potential for materialisation as an organ/body. The embryo is the newest virtual world.
‘In Silico’ Biology

The newest development in the informatisation of medical science is in silico biology. This uses computer simulations of human bodies to identify the causes, progress and cures of diseases. Such simulations of the human body are termed ‘virtual patients’. Each simulation of a disease begins by modelling the normal physiology and interactions of the organs involved in the disease. Basically it enables a researcher to discover how a person gets sick—in other words, to understand the physiological factors that cause disease. Computer simulations can reduce the dependence on human trials.

Denis Noble and his team at the University of Oxford (UK) used computer simulations to develop a virtual heart. The model consists of a mass of virtual cells, each processing virtual sugar and oxygen. On a computer screen, this beats just like a real heart. The virtual heart can be programmed to develop different diseases and scientists can see how it responds to different treatments. The model has been used to predict whether a particular drug will cause abnormal heart rhythm (a common side effect). James Bassingthwaighte and other researchers at the University of Washington are combining individual computer models of genes, proteins, cells, organs and metabolic pathways. This research is called the ‘virtual human’ or ‘physiome’ project.

A related development is the ‘Archimedes model’. The Kaiser Permanente health plan, based in California, has created a computer simulation of a virtual world where people develop diseases—asthma, diabetes and heart disease, so far—and go to doctors for tests and treatment. Archimedes also predicts results of
clinical trials that simply cannot be carried out in the real world. The model takes into account many influences on disease in real people. For example, each model has a virtual liver and pancreas. The health of these organs affects the concentration of sugar in a virtual patient’s blood. When sugar concentrations rise high enough, the patient may experience symptoms of diabetes. Archimedes uses equations that reproduce the disease’s known characteristics, such as the disease’s observed incidence in different ethnic groups (Christensen 2002: 378–80).

The importance of VHP, HGP and genetic engineering can be gauged from the numerous books and scholarly essays that are published on the ethical, social and cultural impact of these new technologies. Respected journals like Configurations, Literature and Medicine, and Medicine and Philosophy carry debates on technology, ethics and culture. Judicature (the Journal of the American Judicature Society) ran a special issue called ‘Genes and Justice’ in November–December 1999 (Vol. 83, No. 3). Several ethical issues still need to be explored, since the ramifications of the new technologies are only beginning to emerge. Three examples from contemporary cultural situations will illustrate the impact and problems of the ‘new learning’.

In July 2001 several judges and scientists met in Hawaii for the first Courts First International Conversation on EnviroGenetics Disputes and Issues. The issues discussed there indicate the multiple uses/abuses, and the ethical and sociocultural impact, of the genetic revolution: GM foods, bioscience and jurisprudence, biological property, genetic testing, human subjects in biomedical research and human rights. The members agreed that no legal precedent or scientific consensus exists in the world of genomics. This recognition of a major problem implies that cultural critics, philosophers and scientists will need to collaborate in order to make judgements in genomics, or on genome-related cultural phenomena such as cyborg families. Biomedical ethics—one such area—now straddles the disciplines of law, medicine, ethics, philosophy and cultural studies. The work of social historians and cultural critics like Donna Haraway, Katherine Hayles and Vandana Shiva is also located at such an interface.

Genetic discrimination may be the next oppressive practice and discourse after colonialism and racism. The important question is
whether employers should be allowed to have access to genetic information. Scientists claim that by 2010 one should be able to purchase genetic marker kits (that identify potential conditions and diseases) for about $100. Employers can then not only know (precognition) about the diseases an employee is potentially prone to, but also the employee’s likely behaviour patterns (stress, kleptomania, depression). In the USA the Equal Employment Opportunity Commission (EEOC) prohibited discrimination on the basis of genetic make-up (under ADA or the Americans with Disabilities Act). The appropriateness of genetic testing of employees is a matter of debate. A sample case would be the Burlington Northern Santa Fe Railroad case. This company subjected its employees to surreptitious testing for genetic markers for carpal tunnel syndrome, so as to avoid ‘problem’ employees prone to stress injuries. The question this raises is: is genetic determinism the new oppression of modern society? The National Educational Foundation of Zeta Phi Beta Sorority, Inc. organised three major international conferences between 1999 and 2000 on the impact of genetics research on minorities. Keeping in mind such phenomena as Social Darwinism, Nazi medical research and the Bell-curve debate, minority scientists and lawyers agreed that the pattern of sampling reflects power relations. They called for the inclusion of African–Americans’ genetic sequences in the HGP project because if the data from the HGP were to eventually serve as commercial information for drug manufacture, then there simply wouldn’t be drugs for certain genotypes. These three examples present only the proverbial tip of the genetic iceberg. Law, science, philosophy and economics are all messily involved in these debates since no issue can be resolved by one discipline/expert alone. Genetic research and cyborgisation affects every aspect of everyday life. Surely, for example, the collection of so much information by one central power (currently the US government) is a major worry since this information can be used to stigmatise and ghettoise people as potential threats. As each discipline borrows from and prosthetically supplements the other, and the essence of technology is no more technological (as Martin Heidegger put it), we can safely claim a cyborgisation of disciplines too.
COSMETIC SURGERY

There are two main fields of plastic surgery. Reconstructive surgery repairs deformities (caused by disease, accidents or birth defects). This type of surgery aims at the restoration of health, appearance and function. Cosmetic surgery, on the other hand, is ‘aesthetic surgery’. This is optional and is frequently associated with status and self-esteem. Cosmetic surgery deserves a separate section in any study of the body in contemporary culture simply because of the commercial and social values attached to the process. Several literary texts have dealt with the theme of women’s bodies in the twentieth century. Margaret Atwood’s The Edible Woman and Lady Oracle, Fay Weldon’s The Life and Loves of a She-Devil and Toni Morrison’s The Bluest Eye deal, albeit with different inflections, with the emphasis on feminine beauty in contemporary society. The strongest response to contemporary culture and technology’s intervention in ‘natural’ beauty has come from the feminists.

BOX 4.3 BOTOX

Botulinum A Toxin (Botox), a protein derived from bacteria, is the latest addition to the cosmetic surgeon’s kit. When injected into a muscle it produces temporary paralysis of that muscle. It thus removes eye muscle spasms that cause crossed eyes and facial muscle imbalance. It is now being used to correct facial muscle tiredness around the eyes (which makes a person look tired or angry). Once the muscles have been inactivated, such expressions are no longer produced. It is used to temporarily cure frown lines, forehead creases, ‘crow’s feet’ and platysmal bands (cords/bands in the neck). The entire procedure takes about 10 minutes, effects take some days to develop, and the results last for 3 to 10 months. Some numbness, swelling and bruising may occur. See advertisements for such treatment from organisations and facilities such as the Appearance Care Center (). In the UK, advertised rates for the procedure start at £175 for the forehead and £200 for the eyes and forehead (2001 rates).
In the twentieth century the ‘cultivation of appearance’, as the feminist historian of cosmetic surgery Kathy Davis terms it in her excellent study, Reshaping the Female Body, has become a major commercial enterprise involving, as its subjects, women of all classes, ethnic groups and regions (Davis 1995: 40). Beauty is still seen, even with feminism and the women’s suffrage movements, as something worth suffering for and spending money on. Role socialisation—with its emphasis on the physical attractiveness of women—plays a major part in this transformative practice of cosmetic surgery. Appearance is now a matter of identity formation. Exercise, diets, costumes are all apparatuses of body modification aimed at presenting the ‘body beautiful’. Contemporary feminist writing on the subject sees beauty as either of two things: as oppression, or as cultural discourse—a systematic ‘technology’ that seeks to discipline the female body (Davis 1995: 52–56).

E-sthetics, providing online information about cosmetic surgery, lists the following ‘recommended’ and more popular procedures: facelift, eyelid tightening, forehead lift, skin resurfacing, rhinoplasty, otoplasty, cheek implant, hair transplant, chin implant, scar revision, breast enlargement, breast reduction, mastopexy, gynecomasty, ultrasonic liposuction, abdominoplasty, arm tightening and thigh lift. The same site has links to similar treatments for men, and ‘body sculpture’. ()

The prevalence of cosmetic surgery in contemporary Western culture can be gauged from statistics. In 1996 the American Academy of Facial Plastic and Reconstructive Surgery’s survey stated that 825,000 procedures were performed on the face alone: one procedure for every 150 people in the US every year. Surveys also reveal a flourishing ‘medical tourism’ business today: Americans go to Mexico, the Dominican Republic and Brazil for face-lifts, the British go to Marbella (Spain), and people from the Middle East go to Israel. Lebanon is a popular choice for transgender surgery. This across-the-class phenomenon is amazing, as the costs are not covered by insurance (except in the Netherlands) since it is categorised as ‘elective’ surgery. And,
BOX 4.4 ANTI-AGING AND DEATH

The oldest search in the history of mankind: eternal life and anti-aging. This is one of the most important areas of medical biology today. In 1965 Robert Ettinger’s ‘The Prospect of Immortality’ was published, detailing the science of cryonics (now available online at ). Cryonics, as defined by this group, means this: ‘a cryonics patient is cooled down to the point where all physical life functions, and all physical decay, completely stops. Then, at some point in the future, when medical science has developed enough to bring them back in good health, the person is restored to good health’ (space travel films frequently employ this theme). Reversible cryonic suspension for adult human beings is not yet a reality, and only tiny human embryos put into cryostasis have been restored. Transhumanists believe that death will soon be overcome, and that the body—with its debilitating conditions of decay and disease—will be transcended (a theme in William Gibson's cyberspace trilogy and Bruce Sterling's Holy Fire).

The International Necronautical Society () was set up to explore death. Their manifesto is an interesting document. It begins with this statement: ‘death is a type of space, which we intend to map, enter, colonise and, eventually, inhabit’. In the language of contemporary critical theory and philosophy, they declare: ‘we shall take it upon us, as our task, to bring death out into the world. We will chart all its forms and media: in literature and art, where it is most apparent; also in science and culture, where it lurks submerged but no less potent for the obfuscation. We shall attempt to tap into its frequencies—by radio, the Internet and all sites where its processes and avatars are active. In the quotidian, to no smaller a degree, death moves: in traffic accidents both realised and narrowly avoided; in hearses and undertakers’ shops, in florists’ wreaths, in butchers’ fridges and in dustbins of decaying produce. Death moves in our apartments, through our television screens, the wires and plumbing in our walls, our dreams. Our very bodies are no more than vehicles carrying us ineluctably towards death. We are all necronauts, always, already’. Finally, it declares: ‘[their ultimate aim] shall be the construction of a craft that will convey us into death in such a way that we may, if not live, then at least persist […] let us deliver ourselves utterly to death, not in desperation but rigorously, creatively, eyes and mouths wide open so that they may be filled from the deep wells of the Unknown’.
60–70 per cent of cosmetic surgery patients are women, most of whom have gone on record to state that they did it because they were unhappy with their bodies. Thus cosmetic surgery is by far the most controversial cultural effect of technology. Issues of aesthetics, sex appeal, femininity and identity conflate in these debates. The body continues to remain at the forefront of all technological debates.
THE BODY, EROTICISM, AND SEXUALITY IN VR/CYBERSPACE

Eroticism and sexuality could figure equally well in Chapter 5. However, since much of cybersex is about corporeality and its ‘forms’, it might be more appropriate to discuss it under ‘Body’. The body’s situatedness in relationships codes sexuality in ‘real’ life. Thus ‘sanctioned’ sexual experiences (married and heterosexual) limit the body’s desire machine. Cybersex becomes an outlet for these prohibited desires. Cybersex also goes under various names: Netsex, Tinysex, MOOsex and Compusex are some of them. ‘Trading Sexpics’—or the circulation, exchange, accumulation and consumption of sexually explicit material—is an important aspect of contemporary online culture. Chat and conversation—via typed text, also termed ‘textsex’—can also become eroticised as cybersex (flirting, sex talk).

Several features of such an eroticised space stand out. Participants see chatting and cybersex as free and transgressive. Here, one can indulge in one’s deepest fantasies—promiscuity, interracial sex, exhibitionism, orgies, incest, bisexuality, S/M, among others. ‘Taboo’ thus does not exist in a cybersex situation. The pleasure of cybersex is dependent upon the clear distinction of this sexuality from embodied ‘real’ life. Further, the commitments that accompany a sexual relationship in ‘meatspace’ do not exist for cybersex. There are no material cares, financial obligations (though payments are usually required to access these sites), fear of disease or threat of reproduction. It
is thus the production of desire without the ‘real’ body’s potential for reproduction. People can safely indulge fantasies and can be as creative as they want to be. More significantly, desire is unlimited and everyone is desired. Thus physical unattractiveness and bodily limitations are not sexual constraints in cybersex. Unending orgasms and the most beautiful partner are both available despite one’s corporeal limitations. Cybersex literally is sex without the body’s limitations or unpleasantness (tiredness, smell, sweat or bodily secretions). Cybersex is about pure consumption using the ‘body’ (perforce the term now goes into quote marks). There is no care of the body, and no question of labour. Everything can be, in the words of Laura Rival et al., ‘pornographied’. To ‘pornography’ the other is to capture/absorb her or him within the place of desire beyond the limits of the body and the cares of the world (Rival et al. 1999: 301). Simply put, cybersex is an end in itself. Cyberspace is a utopia of both time and space.

Rival et al. point out that cybersex does not destabilise established sex, gender or sexuality. In fact, it works with very stable notions of these. Pleasure is experienced within the established ‘constructs’ of sex or gender, without their confining limits. Rival et al. argue that online sex actually works backwards into ‘mundane versions of sexuality and familiality’ (ibid.: 302):

(i) Participants are intensely aware of the performative nature of their online identities and encounters. Yet this is also accompanied by a strong sense of authenticity. Online performativity thus constitutes a series of issues about deception. There is an assumption that all online ‘performances’ somehow relate back to real life. Thus what is performed as online sexuality is more or less true to the ‘real’ self, and cybersex becomes a means of exploring what is repressed in real sexuality. Rival et al. put it succinctly when they state: ‘The ideology of IRC online sexuality is not deconstructive but libertarian’ (ibid.). This is especially so because Internet Relay Chat (IRC) sex ‘involves an almost consumerist normality that never gets near challenging the choosing self’ (ibid.: 302).
(ii) Maintaining the sexualised ‘ambience’ of cybersex requires mundane labour: policing, technical expertise, and so on. There is some homophobia even in cyberspace. And core sexualities—heterosexuality, adult sexuality—are never questioned. Empirical studies by Don Slater show that specific notions prevail in cyberporn and cybersex: that men are heterosexual and women are bisexual; that men insatiably desire, but desire only women. Sexualities enacted online—whether soft or extreme—are within these conventional structures (quoted in ibid.: 301).

(iii) Frequently, cybersex conversations move quickly from the participants’ tastes in pornography to everyday matters—money, family, jobs, etc. IRC sexuality is thus seen as an escape from loneliness and frustration in embodied life. However, this is not always true.

(iv) Peculiarly, the same set of emotions that inform ‘real’ relationships spill over into cybersex. Thus the cloying romantic, the claustrophobically jealous, honesty and ‘permissible’ behaviour figure prominently in cybersexual relationships. Thus, rather than an escape, cybersex marks an extension of the body—with all its emotions and conventions—into cyberspace. Cyberspace sexuality actualises the ideals the body carries in real life. Hence Rival et al.’s reading of cybersex as ‘embodying’ a ‘boundary-defining question’: ‘Is cybersex cheating on your Real Life partner?’ (ibid.: 304).

(v) However, what cybersex does is to produce an ‘erotic overlay on the domestic scene […] not to challenge relationships but to make them a bit more exciting’. Thus ‘conventional domestic life is compatible with the sexualities enacted online, not challenged but rather revived by them’ (ibid.: 304–5).

(vi) However, psychologists have discovered that when face-to-face ‘cues’ are not available, there is a tendency to idealise the person. Hence there is a great deal of resistance to actual meetings in ‘real’ space (Wallace 1999: 154).

(vii) Theorists argue that virtual sex should be treated as a phenomenon of social interaction. Many of the people indulging in virtual sex value it as a form of individualised learning, development and exploration (ibid.).
CYBERPORN

A related issue is cyberporn. There is plenty of sexually explicit material on the Internet. Invitations to join adult groups routinely arrive as spam in email boxes. Patricia Wallace argues that pornography may be one of the ‘killer applications’ that propels this technology into more homes, just as adult videos did for Video Cassette Recorders (Wallace 1999: 159). There are two main kinds of cyberporn: the commercial and the amateur/free. ‘Hot chat’—sexually stimulating conversations with an online person using typed text—is usually ‘paid porn’. Amateur sites are those that exhibit sexual activities of men and women online. Amateurs frequently create Websites with pornographic pictures of their own adventures. A study reveals that 65 per cent of porn materials online come from non-commercial sources (ibid.: 160–61). With technology such as CUSeeMe, moving images can be transmitted from home PCs over the WWW. In a sense, the upsurge of cyberporn is a result of a technology that opens up avenues previously unavailable in the realm of sexuality.

However, there are no thorough studies of cyberporn. Patricia Wallace makes a few speculations on the subject:

• People will probably use Internet porn just as they have used sexually explicit materials in the past—for diversion and sexual enhancement.
• It will mean larger distribution, more access, or simply much more of everything.
• There will be ‘disinhibition’. People will feel freer to read erotic stories or watch explicit pictures. They can explore fetishes and alternative sexualities.
• Eventually there may be a diminishing of interest in it, simply because of its easy availability (ibid.: 169).
CYBORG SOLDIERS

Computer war games for children and Hollywood films such as Robocop and Star Wars depict soldiers who are man–machine
interfaces: cyborgs. Games such as Captain Power and the Soldiers of the Future, Centurions, Silverhawks, He-Man and the Masters of the Universe, Rambo, all involve cyborgs and other ‘strange mixes of the human, the beastly, the alien, and the technological’ (Gray 1997: 197). Since the soldier’s body is the basic instrument of war, and because the body has its limitations, contemporary technology seeks to ‘improve’ and ‘upgrade’ it into a better form. This involves a man–machine integration: a cyborg soldier ‘that combines machine-like endurance with a redefined human intellect subordinated to the overall weapon system’ (ibid.: 196).

Chris Gray identifies three trends in the military’s attempt to ‘remove’ the human element in soldiering: the replacement of most human intelligence collected by agents with signal and satellite data; the widespread experimentation on the use of drugs and hypnosis to create amnesia and to facilitate the reprogramming of agents; and the use of direct electrical implants (bioelectronics) to control behaviour (ibid.: 198). There is also the increasing acceptance that psychic casualties of war are as important as any other. Some of the technologies that create—or seek to create—the cyborg soldier are detailed here.

• The Human Resources Research Office (HumRRO) was set up by the US army in order to investigate the problem. A new term was coined: psychotechnology, signifying the incursion of contemporary technology into the psychic processes of the soldiers. The projects that resulted sought to improve man–machine interaction, psychomotor responses, and the soldier’s sight and hearing—all augmented through technology.
• Research sponsored by the US Air Force includes the development of biochips that can be activated by hormones and neural electrical stimulation. These biochips then trigger specific hormonal and mental behaviour in humans.
• On-board systems in aircraft monitor pilot stress and fatigue in what has been termed ‘pilot state monitoring’. This includes monitoring the pilot’s brain waves, eye
movements and sweaty palms to gauge his mood, and even taking over the controls if the human controller reveals signs of unreliability.
• As early as 1970 research had been undertaken into the brain–computer interface. Computer monitors that read the pilot’s thoughts, and even insert ideas, messages and images into them, are researched by military technologists.
• At Wright–Patterson Air Force Base human subjects have been trained to fire higher or lower intensities of brain waves in response to signals from external lights (biofeedback systems). These waves then adjust the flight simulator—which is to say that the flying is achieved/controlled by brain power.
• There are studies of ‘non-drug management of pain’, of controlling automatic responses to stress and injury, and even of ‘warrior-monks’ (this last by the Delta Force) who work with ESP (extra sensory perception), levitation and psychic healing (Gray 1997: 203–6).
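The biofeedback systems mentioned above rest on a simple loop: measure a signal, compare it with a target, and feed the difference back as an adjustment. The sketch below is a generic toy version of such a loop; the numbers are invented and it stands for no actual military system.

    # Generic toy biofeedback loop: a measured signal is nudged towards a
    # target level; the adjustment stands in for the cue fed back to the subject.
    target_level = 10.0
    signal_level = 4.0
    gain = 0.3  # how strongly the feedback corrects the signal

    for step in range(10):
        error = target_level - signal_level
        adjustment = gain * error      # feedback presented to the subject
        signal_level += adjustment     # the signal drifts towards the target
        print(f"step {step}: signal = {signal_level:.2f}")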
BOX 4.5 BIOMETRICS

Biometrics is the latest in security technology. Biometric technology establishes a person's identity based on distinctive physical attributes. Using a scanner or camera, the computer digitises the biometric information received and checks it against a database. Iris Scan analyses the distinctive striations, furrows and patterns of the human iris (this is used in high security areas such as nuclear plants). Face Scan analyses the distances between the chin, nose and eyes, and is common in surveillance cameras. Digital Finger Scan reads the ridges, whorls and loops of the fingerprint, and is used for various purposes—from driver licensing to criminal investigations. Hand Scan measures the length, width, thickness and surface area of the hands, and is commonly used for employee time clocks, building access and immigration control. Voice Scan maps voice cadence and pitch, and is used in telephone banking and house-arrest checks.
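The matching step shared by all these scans can be reduced to comparing a measured feature vector against enrolled templates, with a tolerance threshold deciding acceptance. The sketch below is a toy version of that step only; the names, feature values and threshold are invented, and real biometric systems use far richer representations.

    import math

    # Toy matching: compare a scanned feature vector with enrolled templates
    # and accept the closest one only if it falls within a distance threshold.
    enrolled = {
        "alice": [0.61, 0.34, 0.88, 0.12],
        "bob":   [0.22, 0.75, 0.40, 0.91],
    }

    def distance(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def identify(sample, threshold=0.15):
        best = min(enrolled, key=lambda name: distance(sample, enrolled[name]))
        return best if distance(sample, enrolled[best]) <= threshold else None

    print(identify([0.60, 0.35, 0.87, 0.13]))   # close enough to 'alice'
    print(identify([0.10, 0.10, 0.10, 0.10]))   # no match within threshold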
POSTHUMAN (BODY) ARTS

STELARC

Stelarc (born Stelios Arcadiou) is an artist whose work seeks to redefine the physical limits of the human form. Combining prosthetics, computer technology and electronics, Stelarc’s work is a fascinating example of ‘techno-corporeo-art’ (Nayar 2002b).

In Exoskeleton a six-legged walking machine was built. The body was placed on a turntable and controlled the machine by means of tilt sensors and micro-switches on an exoskeleton housing the upper body. The body activates the machine by moving its limbs. In one of his most famous performances, Stelarc developed a ‘third hand’, which he then used to write the words ‘evolution’ and ‘decadence’.

In Movatar Stelarc constructed a structure that allowed a body to animate a 3-D computer-generated virtual body to perform in cyberspace. Markers on the body, tracked by cameras, are analysed by computer and the motions transferred to a virtual actor. Stelarc hypothesises a virtual body that can ‘access’ a physical body, enabling the latter to perform in the ‘real’ world. With newer technology the avatar can then be imbued with artificial intelligence, making it an Artificial Life (AL) entity. Physical bodies can then be accessed and made to perform from any part of the world.

In one of Stelarc’s ‘innerspace’ works, Stomach Sculpture, he swallowed a nanorobot camera (after fasting for eight hours). He then filmed his insides (a short movie clip of this is available on his Website). He has thus far filmed about 2 metres of his insides!

Parasite is one of Stelarc’s most famous events. The body is wired into an interactive video field. Images are then fed into the body, which reacts according to the images. Thus it increases the sensory and other capabilities of the physical body. Stelarc fantasises about a search engine that routinely scans the WWW for suitable images and then maps them into the body. The augmented body, with, for instance, a higher visual sense, moves and behaves accordingly. In short, it increases the body’s capacity through stimulation by images. (Parasite is an interesting name here. It means not just ‘dependent’, but also ‘para-site’, a location just off the main
site that enables you to enjoy the main site better. In this case the virtual world is the adjacent site that allows the human to enjoy her/his physical body better.) The body is itself a parasite (in both senses of the term) of the data streams of the Internet.

Fractal Flesh, a version of Parasite, is based on a daring assumption: that a body can extrude its awareness/consciousness into other bodies in other places (not unlike representations of India’s ancient sages and ‘transmigration’). Thus a feedback loop provides alternative awareness, which is then ‘jacked into’ the physical body. Remote agents are then able to control the physical body from great distances. On 10–11 November 1995 Stelarc demonstrated Fractal Flesh at Telepolis, an art and technology exhibition organised by the Munich Media Lab in Luxembourg. Stelarc plugged himself into a ‘muscle-stimulation’ system. The system was connected via the Internet to computer terminals in Amsterdam, Paris and Helsinki. When a participant touched certain parts of the 3-D body-image on a touch-screen, the touch was transferred into Stelarc’s body in Luxembourg, causing it to move. That is, Stelarc’s body was being manipulated by people elsewhere: it was responding to senses ‘jacking in’ from beyond. This, as Stelarc put it, was not a fragmented body but a multiplicity of bodies: fractal flesh.

In another version, Ping Body, the body was manipulated not by other bodies (as in Fractal Flesh) but by data from the Internet itself. The body was now the ‘agent’ of data in what is termed Internet Upload, where the body moves in response to certain streams of data that are jacked into the muscle stimulators.

In his interviews Stelarc has stated several times: ‘The body is obsolete’ (the statement that flashes as soon as one accesses his site). Stelarc here suggests that skin, bones and flesh are no longer the limiting factors of the body. Also, with the creation of enormous quantities of data and technology that the body cannot even begin to comprehend, far less manipulate, the body is literally obsolete. For this reason, Stelarc refers to the body as a ‘structure’ and not as ‘psyche’. He argues that by modifying the structure we can adapt our bodies to new and emergent forms of information and technology. Stelarc explores the possibility of exceeding the flesh. He believes that we are now in the age of post-evolution. He states: ‘I really question whether birth and death are these
fundamental beginnings and boundaries of existence that define what it means to be alive’ (Sterlac 2000: 131–32). Thus the skin is not the external limit of the body, the boundary that demarcates ‘my’ body and ‘yours’.

Frequently accused of allowing technology to assimilate the body, Stelarc objects to the assumption that he is essentially another ‘technobrat’. For Stelarc the body is not merely biology. It is an entity embedded in a complex network of relationships with other bodies. Stelarc extends this idea and literally connects his body with others through the Internet. Stelarc goes so far as to say: ‘It is no longer a matter of perpetuating the human species by REPRODUCTION, but of enhancing male–female intercourse by human–machine interface’ (Sterlac 1995: 91, emphasis in original). This is the adaptation of the body-structure to new data and other structures, increasing its potential and awareness of the world. Stelarc believes that we have to consider a moment in time when we may have to redesign the body to match its machines, ‘creating more effective inputs and outputs and interfaces with these new technologies’ (Sterlac 2000: 139).

Stelarc’s work is also significant in that it acknowledges the ‘invented’ nature of the human (a concept ‘invented’ at the end of the European classical period, and attaining prominence with the Enlightenment, as Michel Foucault has shown). What Stelarc suggests is that it is time to accept the invented nature of humanity, the body and even the mind. We are not ‘natural’, for even in ancient times there were methods of enhancing human abilities and functions (in simple terms, it may be called ‘evolution’) to suit the environment. In contemporary society, with cosmetic surgery, prosthesis and body-building, we have accepted that the human always seeks to re-invent itself, to alter its body and to take charge of its destiny. Therefore, artifice is an acceptable way of achieving difference. Stelarc argues that with the vast flows of information and technology, the body runs the risk of being enslaved by machines, of being further controlled and disciplined. Instead, he suggests that we ‘incorporate’ technology to adapt ourselves. Rather than being frightened of the technology, one needs to see it as yet another means to become different. The natural body may be ‘upgraded’ to suit the new world. If difference and reinvention are accepted as the norm rather than as deviations, then technology does not
become monstrous. Technology becomes a means to negotiate between the natural (if there is such a thing) and the artificial. What we then have is a ‘higher’ human that has developed newer abilities and managed to exceed the limits of the flesh: the posthuman.

The posthuman view, which Stelarc symbolises, privileges informational pattern over material structure. The human body is seen as almost an accident. Further, Stelarc’s posthuman treats the body itself as a prosthesis, an addition that we have learnt to manipulate over centuries of use. As a natural consequence, adding other prostheses to this ‘original’ prosthesis is simply another step in evolution. The most interesting, and perhaps most frightening, point in Stelarc’s work is that it suggests a seamless connection between humans and machines. The posthuman is integrated into the WWW of information patterns and data. The body is an extension of this database, and the database is a prosthesis of the body. Biological and computer-simulated muscle movement, human mind and artificial intelligence, the genetic code and the WWW database are interchangeable because they are merely each other’s prostheses. When Stelarc writes the word ‘evolution’ with his three hands, he is only extending what humankind has always done: (a) invent technology that enhances human ability; and (b) treat the body as ‘originary’ and technology as ‘derived’, and the body as ‘natural’ and technology as culturally invented. Actually the distinction is impossible to sustain when one realises that the human body itself is structured, honed and altered through technologies of law, religion and culture (as we have earlier noted).

However, this does not take away the central problem of technology and Stelarc’s assumptions. If techno-upgraded bodies become the norm, then it is eminently possible that only such upgraded bodies will be seen as human. As with all technology, this could lead to the creation of new definitions of the human that suit only people in the developed nations (since that is where such multimillion-dollar technology develops). Essentially, this automatically consigns more than half of the world’s population to the margins as less-developed, barely human. The imbalance between upgraded people and non-upgraded ones simply leads to greater discrimination. It creates a meritocracy where technologically proficient and techno-modified human bodies are seen as deserving attention, or even as worth surviving.
Official Stelarc Website: .
ORLAN

Orlan’s work has been described as ‘illustrated psychopathology’. Orlan is an artist whose canvas is her skin and flesh, and whose artistic performance consists in redesigning a new face every year. Orlan’s famous statement, ‘This is my body … this is my software’, interrogates several concepts and established categories, prominent among which are art, the human, the body and beauty.

Born Mireille Porte at St Etienne, central France, in 1947, Orlan was educated as a visual artist, and her very first forays into serious art attracted controversy. In a series called Mesurages Orlan measured public spaces and institutions using her body as a ‘metre’, measuring streets named after famous men in terms of ‘orlans’. Thus Victor Hugo street was 25 orlans! In a highly provocative 1971 series called Striptease she draped herself in sheets like the Biblical madonna, slowly dropping them to transform herself from madonna to whore. Courting controversy, she christened this character Saint Orlan, a name she was to use repeatedly over several years. In Kiss of the Female Artist, held at an international art fair in 1977, she sat on a stage inviting people to buy her kisses. As a result Orlan lost her art school teaching post.

Orlan’s work in the 1980s became more outrageous, eliciting howls of protest and debates on whether her work could be termed ‘art’. Focusing upon her body as art-object, Orlan would exhibit herself, wash her clothes in public and so on (one 1981 installation at Aix-la-Chapelle recorded people’s horrified and revolted expressions at her exhibition!).

In 1990 Orlan embarked upon her most adventurous and controversial series. Titled The Reincarnation of Saint Orlan, it was an attempt at the transformation of the facial form through a series of plastic surgery operations that would give Orlan a new face. She referred to it as ‘carnal art’ to distinguish it from body art. She devised a synthesised self-portrait based on various features from women in famous works. Thus she took the forehead from Da Vinci’s Mona Lisa, the chin of Botticelli’s Venus, the nose of Fontainebleau’s Diana, the eyes of Gerard’s Psyche and the mouth of Boucher’s Europa. Interestingly, while these portraits of women were noted for their aesthetic appeal and ‘feminine’ beauty, Orlan’s work adapted them for the myths and stories associated
with them, as Orlan constantly emphasised in her interviews and commentaries. Thus the subtexts that Orlan took from these women were: the transsexuality of Mona Lisa (it is now argued that beneath the famous face is a self-portrait of da Vinci, a sublimation of his desire to become female), Diana as the aggressive adventuress, Europa gazing at an uncertain future in another continent, Psyche combining love and spiritual hunger, and Venus representing fertility and creativity. Orlan says: ‘I chose them not for the canons of beauty they are supposed to represent, but rather on account of the stories associated with them. Diana was chosen because she refuses to submit to the gods or men’ (quoted in Davis 1997: 26).

Each Orlan performance has a theme: Carnal Art, This Is My Body, This Is My Software, I Have Given My Body to Art, Omnipresence, Identity, Alterity. Orlan’s art consists essentially of a live broadcast of the surgery with running commentary by her (occasionally reading from philosophical, psychoanalytic or feminist texts), complete with close-ups of needles and knives cutting into her, blood-soaked flesh, and her grimaces. The surgeons and assistants wear designer clothes (there is, occasionally, a striptease by males). Throughout the surgery, the operating theatre is her surreal studio in which she makes a film, a video, photographs and reliquaries. Later, some of the excised skin, with blood and flesh, is sold as artifact. She intends to continue selling them until she has ‘no more flesh to sell’ (as she put it). In her seventh operation she had two bumps put on her forehead (referred to as ‘horns’ by critics). Later, she installed the largest nose possible for her face, beginning midway up her forehead. She had originally planned longer legs too, but the surgery would have necessitated general anaesthesia, thus preventing Orlan from delivering her commentary. At the conclusion of her procedures she plans to petition the French government to grant her a new name and identity.

Orlan’s work has been broadcast by satellite television all over the world, featured in several magazines and on the Discovery channel. It has been the subject of discussion and debate for several prominent thinkers with affiliations ranging from feminism and performance art to body sociology and cultural studies. Academic essays have appeared in highly respected journals like Body and Society, Art and Design, Art in America and the Performing Arts Journal.
Orlan’s work is an extreme form of body art. Mutilation and self-exhibition are in themselves not original artistic modes. In the early 1960s Hermann Nitsch disemboweled slaughtered lambs over a naked woman. Rudolf Schwarzkogler appeared to maim himself. Gina Pane nicked herself on stage during performances that had religious overtones. Chris Burden crawled over broken glass and crucified himself on a Volkswagen. Bob Flanagan, dying of cystic fibrosis, does performance-installations with his body in hospital settings (Flanagan’s slogan is: ‘Fight Illness with Illness’). However, each of these performances involves an element of theatrical manipulation (there is some doubt whether Schwarzkogler really maimed himself). Orlan is the only one who has demonstrated the actual surgical procedures beyond doubt. Among Orlan’s influences, the most important are the work of the legendary and controversial avant-garde artist Marcel Duchamp and that of Eugénie Lemoine-Luccioni, a psychoanalyst in the tradition of the famous Jacques Lacan. At the beginning of every performance Orlan reads a paragraph from Lemoine-Luccioni’s La Robe, one that summarises her own work:

Skin is deceiving […] in life one only has one skin […] there is a bad exchange in human relations because one never is what one has […] I never have the skin of what I am. There is no exception to the rule because I am never what I have. (quoted in Clarke 2000: 193)

What exactly does Orlan’s work do? Orlan describes plastic surgery as a primary area where ‘man’s power can be most powerfully asserted on women’s bodies […] where the dictates of the dominant ideology […] become more deeply embedded […] in female flesh’ (quoted in Davis 1997: 30). Orlan’s work interrogates the masculinist assumption that women would go to any lengths to attain the standards of beauty set by men. Orlan’s quest for ugliness redefines the woman’s use of surgical techniques for precisely this reason. On a more philosophical plane, the visualisation of flowing blood and scarification in Orlan is a blurring of boundaries between the ‘inside’ (the body) and the ‘outside’ (the world, the spectator). Her RealVideo Surgery tries to demystify medical practice itself, when the world is taken into the operating room. By being in control of the procedure and deciding the course of
surgery, Orlan is also suggesting that she will not be a submissive recipient of medical attention, that the tyranny of technology masquerading (as medicine so often does) as divinity is not acceptable. Most importantly, Orlan transgresses the rule that one should not tamper with nature and the body. As she put it in one of her more controversial statements: ‘My work is blasphemous […] I fight against God and DNA’ (quoted in Clarke 2000: 194). Her body is a text that clones and reinvents itself according to her dictates. Orlan suggests that her work questions ‘the status of the body in our society and its evolution in future generations via new technologies and upcoming genetic manipulations’ (quoted in ibid.: 189). It is a critique of the consequences of advanced technologies and of their possible ‘perversion’.

To understand Orlan’s real significance as an artist with a social agenda, one needs to look at the questions her art raises about cosmetic surgery. Male standards of beauty treat the female body as ‘unruly’, put it under surveillance, and make the woman confess that she is flawed and requires ‘correction’. Orlan rejects this assimilation of and acquiescence to masculinist cultural dictates of beauty. In Orlan’s case, she is the controller of her destiny, her looks: she rejects traditional myths and images about women (as we have seen in her reappraisal of Mona Lisa and Diana). As for her body, with each surgery under epidural block, she increases the risk of paralysis and even death. Orlan does not appear to be scared of this consequence of her art. As she puts it: ‘Art is a dirty job, but someone has to do it’ (quoted in Davis 1997: 31).

Orlan’s work also asks: ‘Can life, with all its grotesque elements, itself be “theatre”?’ (Lovelace 1995: 25). Her self-conscious ‘compositions’ are truly artistic in that they fulfil two primary criteria of art: intention and transformation. She asserts the ability of art to transform (even through shock and revulsion), to force people to rethink issues of beauty, technology and identity. Orlan once stated the function of art: ‘Art is not for decorating apartments, for we have plenty of that with aquariums, plants, carpets, curtains, furniture …’ (quoted in Davis 1997: 29). Orlan uses her body as raw material for a subcultural art-form to ideologically question techno-medicine, the subjugation of the body, identity, and, most emphatically, issues of beauty. In many ways, the SF aesthetics of Orlan is an attempt to go beyond the limits of the natural biological body. Orlan demonstrates
that identity is never ‘innate’ but attained only through repeated performances, that identity is itself only a socially situated performative act. For Orlan such a performance of identity must be born of self-determination, especially in the case of women. She chooses her face, her body-form, and her identity. She enacts this identity without subscribing to others’ standards. And finally, she extends the idea inaugurated with Freud that neither the human body nor the mind can be seen as unified, coherent or whole. Orlan’s work is another step on the road to the posthuman.
NOTES
1. A wealth of material exists on the sociology of the body. For a preliminary reading see Shilling (1999 [1993]); Featherstone et al. (1995 [1991]); Dutton (1995). More specialised studies of medicalised bodies, dieting bodies, sporting bodies, and feminist studies of the body in culture are also plentiful.
2. Several social studies exist on the Visible Human Project and the Human Genome Project. See, among others, works by Marks (1994); Lewontin (1991, 1994); Waldby (1997); Haraway (1991a, 1991b, 1997).
SUGGESTED READING
Balsamo, Anne. 1996. Technologies of the Gendered Body: Reading Cyborg Women. Durham and London: Duke University Press.
Davis, Kathy. 1995. Reshaping the Female Body: The Dilemma of Cosmetic Surgery. New York and London: Routledge.
Featherstone, Mike (ed.). 2000. Body Modification. London: Sage.
Franklin, Sarah. 1997. Embodied Progress: A Cultural Account of Assisted Conception. London and New York: Routledge.
Gray, Chris Hables. 2001. Cyborg Citizen: Politics in the Posthuman Age. New York and London: Routledge.
Morgan, Kathryn Pauly. 1991. ‘Women and the Knife: Cosmetic Surgery and the Colonisation of Women’s Bodies’. Hypatia, 6(3): 25–53.
Waldby, Catherine. 1997. ‘Revenants: The Visible Human Project and the Digital Uncanny’. Body and Society, 3(1): 1–16.
CHAPTER FIVE
GENDER
This chapter organises the issues thrown up by the preceding ones. The focus is on gender as a ‘case study’ for contemporary cyberculture studies. The chapter zeroes in on gender as a category where technocultural themes of identity, body, community and politics can be located. To understand how gender becomes a central concern for contemporary technoculture, one only has to look at popular representations of cyborgs, Artificial Intelligence, or machine culture in SF (both print and film). It is in the popular arts that the public imagination of cyberculture manifests itself best. Many contemporary cyberfeminists therefore ground their programmatic critique of technology in readings of these popular cultural forms. The chapter summarises arguments and views about the new technologies as they impact upon women’s bodies, sex and gender, and identity. Some points from the debates in the feminist critique of science and technology need to be recounted in order to better situate a reading of ICTs and their gendered/gendering character. I have focused on specific technologies and the feminist ‘take’ on them for purposes of clarity. These are only indicators of how ‘objective’ science and purely ‘instrumental’ tools can be socially emancipatory or oppressive.
THE FEMINIST CRITIQUE OF SCIENCE

The debates over gender and science are an offshoot of the feminist movements of the 1970s. Three types of critique come together in feminist readings of science and technology:
• The critique of science and technology as cultural and social forms.
• The critique of science and technology as gendered constructions.
• The critique of science and technology as involved in ecological collapse.
These studies highlight the fact that science and technology are not separable from social and political structures, cultural projects and ecological relations. Science and technologies also have their political economies and political ecologies (as Vandana Shiva has shown). Science and technology are thus cultural projects, reality-producing and reproducing practices (Moser 1995: 5–6).

Feminist studies of the relationship of gender and technology look at technological artifacts, technological practices, systems of knowledge, institutions and competencies. These studies focus on three areas:
(i) The position of women in technological fields, including the sexual division of labour within these fields.
(ii) Women’s experience of technology, including the effects of domestic, industrial, medical and information technologies on women’s lives.
(iii) The gendering of technology, i.e., the eroticisation or feminisation of certain technologies (Booth and Flanagan 2002: 9).

The twentieth-century publication of biographies of women scientists such as Ada Lovelace, Barbara McClintock, Rosalind Franklin and others spurred an interest in the gendered nature of science. It became commonplace to describe the exclusion of women from scientific institutions. The education system and modes of socialisation, it was argued, drew women away from
mathematics and science. That is, cultural conditions conspired to keep women away from science. Later, the low percentage of women in science, itself the product of these cultural conditions, was used as ‘evidence’ that women were ‘naturally’ inept at it. This circularity in the nature–culture argument effectively left science solely in male hands. Sandra Harding, for instance, has argued that asking women to be ‘more like men’ in order to take to science was not accompanied by a corresponding ‘degendering’ process for men (Harding 1986). What is needed is a reshaping of the very institutions of science to accommodate women. This will mean a change in the highly gendered division of labour within the wider society. To argue for an equality of opportunity within science one must first address the overall imbalance in labour. Judy Wajcman illustrates this with an example. Current institutionalised careers in science require long periods of study/research. This does not allow for childcare and/or domestic responsibilities. So, in order to succeed in a ‘science’ career, women have to give up these commitments, something that men are able to do, given the gendered division of labour (Wajcman 1993 [1991]: 2–3).

A Marxist radical science movement in the 1970s began to link developments in science with capitalist production. Thus science was no longer viewed as objective and neutral, but seen as clearly tied to class interests. Further, these critiques revealed science’s links with environmental pollution and warfare. An influential text in the social studies of science was Thomas Kuhn’s The Structure of Scientific Revolutions (1970). Kuhn argues that scientific knowledge, like all other forms of knowledge, is deeply influenced by the society in which scientific experiments, observations and interpretations are ‘performed’. Thus social and political considerations influence scientific observations. ‘Facts’ themselves are changed to suit the dominant social concerns and interests. From this social study of science, the work of Evelyn Fox Keller (Reflections on Gender and Science, 1985), Cynthia Cockburn (Machinery of Dominance: Women, Men and Technical Know-How, 1985), Sandra Harding (The Science Question in Feminism, 1986), Carolyn Merchant (The Death of Nature: Women, Ecology and the Scientific Revolution, 1980) and Ludmilla Jordanova (Sexual Visions: Images of Gender in Science and Medicine between the Eighteenth and Twentieth Centuries, 1989) produced a sustained and incisive feminist reading of science.
Feminist analyses of science began as a concern of the women’s health movement in Anglo–American society in the 1970s. These movements demanded greater control for women over their bodies, health and sexuality. Improved birth control and abortion rights became central themes in the debate. Second-wave feminism, of which this was a part, began to express its dissatisfaction with medical practice and psychiatry. Feminists uncovered the male bias in interpretations of women’s illness (a good example would be ‘hysteria’, identified as a woman’s illness). This questioning of medical practice and research moved into a discussion of science itself.

Re-reading the history of science, feminists saw notions of scientific ‘objectivity’ and ‘rationality’ as masking a whole set of ideological binaries. The epistemology of science worked in dualisms: nature/culture, subject/object, knower/known. Added to these were other binaries: mind/body, reason/emotion and objectivity/subjectivity. These dualisms were also hierarchies, where the male was knower, subject and culture. The woman was known, object and nature. Carolyn Merchant, for instance, demonstrates how sixteenth- and seventeenth-century scientific discourse (including that of Sir Francis Bacon) is replete with sexual imagery, especially of conquest and rape. Ludmilla Jordanova shows how eighteenth- and nineteenth-century medicine used sexual metaphors of unveiling, unclothing and penetrating to describe both nature and women. The discourse of science invariably posited man as the seeker–conqueror, in battle with and victorious over the mysterious, violent and unruly woman–nature. The naturalisation of women and the feminisation of nature begin with this binary.

The older ‘scientific’ view was essentially mechanistic. This meant that nature was seen as a machine, and natural phenomena were analysed according to certain laws. This mechanistic view was essentially masculine. It held that science was comprehensible only by the intellect using objectivity and reason, both supposedly masculine attributes. Evelyn Fox Keller writes:

Modern science […] is based on a division of emotional and intellectual labour in which objectivity, reason, and ‘mind’ are cast as male and subjectivity, feeling and ‘nature’ are cast as female. In this genderisation of the world, it seems ‘natural’ to describe science as a ‘marriage’ between mind and nature—a marriage celebrating not so much union between mind and
nature, but a radical separation of subject and object and, ultimately, the dominion of mind over nature. (quoted in Smith Keller 1992: 14)

Women, excluded from the sphere of rationality, were also, therefore, disqualified from science, which was supposedly the highest expression of rationality. Enlightenment philosophy argued that child-bearing and nurture automatically made the woman emotional, and thereby unqualified to take to science. Maternity was thus a ready tool to be used against the woman. However, women could be passive, non-interfering objects of scientific study (Hilde Hein, quoted in Hekman 1990: 120–21). Thus Ruth Bleier asks: ‘What is the basis of this ignorance about women? What is it about science—or about women—or about feminists—that explains the virtual absence of a feminist voice in the natural sciences?’ (1986: 1).

Feminist critiques of science and technology have discussed the ways in which ‘knowledges’ have been accumulated and disseminated over centuries. Certain technologies such as food preparation and storage, female medicine and midwifery, manual agriculture and textiles are strongly associated with women. Hunting, mechanised agriculture, transport and weapons are associated with men. Laurie Smith Keller identifies three factors that are important in an individual’s career choices: ability or inclination, access to education and training, and perceived opportunity to take up a career (Smith Keller 1992: 30). Women have been consistently excluded from science and technology on all three counts. For example, the ‘desirable features’ for entering a career in high-energy physics were all masculine: ‘aggressive individualism, haughty self-confidence and a sharp competitive edge’ (Traweek, quoted in ibid.: 31). Haraway points out that the self-effacing ‘modest witness’, one who did not bring in his ‘personality’ while reporting ‘objectively’ on scientific experiments, was essential to the very idea of technoscience. Women have never been modest witnesses, even if they were present as labour during these experiments. Haraway’s argument, crucial to understanding the role of women in science, suggests that

[T]he preexisting dependent status of women simply precluded their epistemological, and for the most part, their
physical presence in the most important scenes of action in that period [the seventeenth century] in the history of science. (Haraway 1997: 27)

For authorities like Robert Boyle, the issue was whether women had the independent status to be modest witnesses, and they did not (ibid.). By the early modern period, the image of a ‘disorderly’ nature had replaced that of a ‘nurturing’ nature. This justified man’s control over nature by treating it as an act of ‘taming’. Witches in the sixteenth and seventeenth centuries represented forbidden knowledge and the ‘dark’ side of nature. Hence, as an act of (masculine) control, they were burnt (see Carolyn Merchant’s The Death of Nature). Enlightenment epistemologies spoke in universalist and absolute terms, and ignored the historical and social variables that influenced phenomena.

Until the eighteenth century, science was an activity restricted to sages, priests and teachers. It was also a solitary occupation. The Royal Society of London, founded in 1660, provided a forum for the exchange of ideas, especially with the publication of the Philosophical Transactions from 1665. It was closed to women. Robert Boyle, Robert Hooke and other illustrious members of the Royal Society met and discussed ‘science’ behind closed doors. Margaret Cavendish, Duchess of Newcastle and a natural philosopher, persuaded the Society to allow her to watch scientific demonstrations in the seventeenth century. The first woman admitted to the Society entered in 1945, almost 300 years after Cavendish’s ‘unwelcome appearance’ (Haraway 1997: 32). By the eighteenth century, membership of the Society had been extended to wealthy ‘patrons’. By the late eighteenth century, science had little to do with the daily life of people. By the nineteenth century it had moved into the hands of professionals (scientists), universities and research institutions. This shift in the ‘progress’ of science was also intimately linked with the European discovery of other cultures. Material self-improvement through science and technology became a means of measuring the superiority of the West. Thus the distinction between a ‘traditional’ or non-technologised East and a ‘progressive’ or ‘modern’ technologised West began to emerge in the nineteenth century.
The so-called objectivity of scientific knowledge has been rigorously questioned, especially by the Social Studies of Technology. These argue that scientific method has never been ‘invariant’, and has always been shaped by social conditions. Sandra Harding, for instance, argues for a ‘standpoint epistemology’, which acknowledges the diversity of modes of knowing. Thus the knowledge acquired by ‘Third World’ peoples, tribals and aboriginals, and women must also be treated on par with white male science. Proceeding from the assumption that a technological device becomes successful socially and commercially when the manufacturer/creator makes use of social, legal and market networks, feminists argue that women have been consistently excluded from these networks. The power group, therefore, has a say in which technology gets popularised. Harding argues for a ‘postcolonial feminist science studies’ which looks at the preservation and improvement of natural resources, local community relations, non-western cultures and women’s conditions (1998: 81). Such studies would be a constituent of what I have elsewhere termed ‘postecolonial theory’, where a resistant postcolonial studies incorporates into its theoretical frames of reference ideas and practices from local/native ecological movements and politics (Nayar 2002c [1999]). Juana Kuramoto and Francisco Sagasti, likewise, sum up the (desperate) need to link indigenous knowledge with the new technology to facilitate development:

Development strategies must open spaces and create opportunities for the integration of modern and indigenous knowledge and technology in the efficiency of traditional practices, but at the same time maintain the characteristics that render them useful and attractive to the poor and to indigenous people. (Kuramoto and Sagasti 2002: 239)

Feminist historians argue that women’s contribution to technological innovation has been ignored in mainstream documentation. For instance, Ada Lovelace, daughter of the English poet Lord Byron, and now recognised as the first computer programmer, was instrumental in ‘decoding’ Charles Babbage’s calculating machine. Such women do not find equal space with male inventors.
With regard to technological change, we need to address the following questions:
(i) How do existing gender identities for users influence their ability to participate in shaping the technology?
(ii) What skills do men and women use with new technology and how are these skills gendered?
(iii) How do the gender identities of users and the meanings of technology shift during the processes of innovation and change? (McLaughlin et al. 1999: 148)

There are no such things as ‘natural’ male and female skills. The unskilled nature of women’s employment generally implies a lack of training in the area. Further, in a vicious circularity, the areas in which women do work are deemed ‘unskilled’. And if women take over an area of activity that has previously been seen as male and skilled, then the interpretation of ‘skill’ is likely to change (ibid.: 149). Hilary Rose (1994), for example, demonstrates how the exclusion of women from science was accomplished discursively (that is, through the practices of representation) and through scientific knowledge that construed women in certain ways as outside rationality and authority. Thus objective ‘scientific’ theories in biology and psychology ‘confirmed’ women’s subordinate status, dependency and incapacity. Feminist studies of science treat all descriptions of women’s (in)capacities (‘this is not a woman’s work’, or ‘an unsuitable job for a woman’, as the title of a novel by P.D. James goes) as constructed.

Biology has been a crucial factor in determining sex roles. It has been used to describe the woman as inferior, weak, and generally (naturally) unsuited to scientific work. Feminist critiques therefore go back to the history of such representations, when the study of sex differences became a priority of scientific explorations. As has been demonstrated by Sandra Harding and others, biological inquiry has been shaped by masculine values. Thus every stage in scientific research, from what counts as a problem to its interpretation, is shot through with masculine bias. Feminist critics suggest that the project of rewriting women’s history must draw from women’s own experiences and sociopolitical lives. This foregrounds what Haraway and others term ‘situated knowledges’ (see below).
ECOFEMINISM

One of the most sustained critiques of technoscience has come from ecofeminism. Since the 1960s, social theory has consistently remarked upon the connection between gender and the environment and, almost as a corollary, the impact of technoscience on women and the natural world. Val Plumwood, Carolyn Merchant, Vandana Shiva, Maria Mies and Ariel Salleh, among others, have provided incisive readings of the women–nature problematic in twentieth-century culture. A brief summary of ecofeminism would therefore be useful here.

Ecofeminists argue that the equation of Mother/Woman and Nature has effectively constructed subordinate identities for both. By naturalising the feminine and feminising the natural, science has assigned the weaker identity to both. By extension, this bestows upon the male the superior protector role, with the additional right to exploit/tame/control women and nature. Karen Warren holds that ecofeminism is based on the following assumptions:
(i) Nature and women share a common denominator: oppression.
(ii) We need to understand the nature of the connections between these two oppressions.
(iii) Feminist theory and practice must include an ecological perspective.
(iv) Solutions to ecological problems must include a feminist perspective (quoted in Barry 1999: 110).

Ecofeminism is not a single, unified approach. Within the larger ecofeminist framework, there are two variant forms.
Ecofeminist Spirituality
1. Female attributes and ways of thinking must be made integral to intellectual thought.
2. Action and lived experience must be emphasised, rather than abstract theorising.
3. Women must represent nature.
4. Inner or personal change must precede sociopolitical changes.
Materialist Ecofeminism
1. This pays attention to modes of production and reproduction, institutions and social structures.
2. Underlines human dependence upon nature.
3. Emphasises the fact that women contribute ‘real’ work in human society—reproduction, caring and nurturing, and home-making.
4. Draws together the spheres of production and reproduction.
5. Attempts to formulate an ecofeminist theory of economics, while seeking a restructuring of categories of economic thought (such as ‘work’, for instance) itself.
TECHNOLOGIES: FEMINIST READINGS

The feminist critique of technoscience, best exemplified today by the work of Donna Haraway, has found resonance in readings of technology and technological practices. Several commonplace and ‘advanced’ technological developments are open to feminist readings for the modes in/by which they encode cultural assumptions and misogynist ideologies.
COSMETIC SURGERY

As we have seen in Chapter 4, one of the central features of postmodern/posthuman cyberculture is cosmetic surgery and body modification. Cosmetic surgery was the subject of extensive feminist critique in the 1990s. This critique has significant relevance to studies of contemporary technoculture. ‘Technologies of the face’, as Sandra Kemp (2000) terms them, have captured the public imagination (and popular cinema, as in Face/Off, 1997). For instance, the Website Fa(e)ces of the World (once available online) offered some of the world’s most famous faces for the viewer to mutilate/improve endlessly! The Face Processing Research Group at the University of
Westminster has been researching such technologies. Other similar technologies are also used to reconstruct faces. Oxford’s Facial Image Archive states:

The Archive will collect and collate static and moving, normal and abnormal facial images. All images will be digitised and stored electronically as a database. The principal teaching role of the Archive will be as a distant learning facility over the world wide web. The Archive will undertake inhouse assessments, collaborate in multi-centre research projects and provide material for individual and institutional study.

Here plastic surgeons work with the patient to produce a computer-generated face that the surgeons then use during surgery. The Turing Institute in Glasgow has created a ‘Whole Body Camera’ that can be used to help construct a virtual head so that surgeons can plan surgical procedures. These medical applications of cosmetic and reconstructive surgery apart, such technologies are extremely lucrative.1

Iris Young argues that women in contemporary culture view their bodies as separate from who they are or would like to become, and as the site for negotiations of identity. That is, for women, identity is continually defined through their bodies. However, Young argues that female anatomy and physiology need not be seen merely as symbols of female objectification. They can also be a source of empowerment and subversion. The female body can thus be both object (a mere ‘body’ for society) and embodied subject (where the body is a vehicle for enacting desires and experiencing the world). Cosmetic surgery is thus an ‘expression of the objectification of the female body and of women’s struggles to become embodied subjects rather than mere bodies’ (Davis 1995: 60). The paradox here is that what appears to be a deliberate choice (in terms of the decision on cosmetic surgery) by the woman turns out to be nothing more than an acceptance of and conformity to the social ideal of feminine beauty. Women’s attractiveness is defined as attractive-to-men, and their eroticism is defined as non-existent, pathological or peripheral, especially when not directed towards men (Morgan
1991: 32, 36). Critics such as Beverley Brown and Parveen Adams argue that the body represents a site for feminist action through transformation, appropriation, parody and protest. It is a utilisation of technology for feminist ends. Cosmetic surgery posits an Ideal face, and textbooks on cosmetic surgery frequently recommend classical art to surgeons, so that some idea of the face can provide inspiration (Balsamo 1996: 58). For Kathy Davis, drawing upon the work of Dorothy Smith, cosmetic surgery is also a question of agency, of individual will. This ‘doing femininity’, for Dorothy Smith, is secret agency:

There is a secret agent behind the subject in the gendered discourse of femininity; she has been at work to produce the feminine subject-in-discourse whose appearance when read by the doctrines of femininity transfers agency to the man. (quoted in Davis 1995: 62)

Anne Balsamo argues that cosmetic surgery is driven partly by the discourse of the woman’s body as pathological. This ‘narrative’ suggests that the female body is flawed in its distinctions and perfect when differences are transformed into sameness (through cosmetic surgery). Balsamo writes:

When cosmetic surgeons argue that the technological elimination of facial ‘deformities’ will enhance a woman’s ‘natural’ beauty, we encounter one of the more persistent contradictions within the discourse of cosmetic surgery: namely, the use of technology to ‘augment’ nature. (1996: 71)

The female body is thus the site where dominant cultural meanings get inscribed. This discourse also produces a heightened ‘surveillance’, both personal and collective, of the female form (ibid.: 78). Reading Orlan’s art, Kathy Davis, the leading feminist historian of cosmetic surgery, argues that for Orlan, plastic surgery is a path towards self-determination, a way for women to regain control over their bodies. She is the one who decides and is therefore not the passive object of another’s decisions (Davis 1997: 30). Following Kathryn Morgan’s work, Davis identifies three reasons for treating Orlan’s works as a feminist critique of cosmetic surgery:
(i) They unmask both ‘beauty’ and ‘ugliness’ as cultural artifacts rather than natural properties of the female body. They valorise what is normally seen as ugly, and thereby subvert the cultural constraints upon women to comply with the norms of beauty. The entire notion of a ‘natural’ body is thus destabilised.
(ii) These surgical performances constitute women as subjects, who use their feminine body as a site for action and protest rather than as an object of discipline and normalisation.
(iii) By providing a travesty of surgical technologies and procedures, these performances magnify the role that technology plays in constructing femininity through women’s bodies. They usurp men’s control over these technologies (Davis 1997: 33).

Kathryn Morgan suggests two politically correct feminist responses to cosmetic surgery: refusal and appropriation. Refusal can operate at individual and collective levels. It can proceed from an understanding of the ‘political technologising of women’s bodies’, and from paying attention to the ideological biases that ‘frame the cultural world in which cosmetic surgeons practice’ (Morgan 1991: 42). The response of appropriation is to treat the female body as a site of transformation and protest, which involves valorising the domain of the ‘ugly’ and exploring the commodification aspect of cosmetic surgery (ibid.: 45–46).
REPRODUCTIVE TECHNOLOGIES

The technology that most directly ‘targets’ the woman’s body is reproductive technology. Sarah Franklin in her study of IVF notes that in the history of Western scientific accounts of generation, conception has always been inseparable from metaphysics and cosmology. Contemporary IVF, with its images of ‘miracles’ (the term ‘miracle’ is used in the title of the book on Louise Brown, the first ‘test-tube baby’), is no exception. Women’s bodies, argues Franklin, become conduits for a technological miracle. The woman’s image is a devotional one, for the infertile woman’s hope in technology is akin to faith. Nature and technology thus become commensurate and substitutable. Technological/instrumental capacity is represented as ‘just like’ nature and
instrumental knowledge substitutes for biological function. Nature is now not only knowable, but is also appropriated as an extension of these techniques, and is thus instrumentalised (Franklin 1997: 199–210). As early as 1970 Shulamith Firestone had argued in her influential work, The Dialectic of Sex, that women could be free only by taking reproduction out of the body; that freedom meant freedom from reproductive biology. Valerie Hartouni argues that these new technologies are legitimised because they are said to ‘help women realise their maternal nature, their innate need to mother’ (Hartouni 1991: 49). Such arguments of natural rights, maternal yearnings and ‘fulfilment’ reaffirm established ideas and meanings of ‘woman’. These technologies must therefore be seen as political (ibid.: 50). IVF is not a neutral technology because ‘it reflect[s] the social relations at the time of its innovation’. And further:

IVF curtails any potential for the redefinition of parenthood—or infertility—by focusing exclusively on women’s biological reproduction. In so doing it reinforces the notion of the ‘natural’ bond between a mother and her biological children as well as reinforcing the idea that the nuclear family […] is the only desirable structure of social relations between adults and young children. (quoted in Lorber 1988: 124)

Ruth Hubbard in her study The Politics of Women’s Biology (1992) looks at how contemporary prenatal technologies have transformed childbirth. She points out, via Judith Walzer Leavitt’s work, that as long as women gave birth at home they retained considerable control. When the ‘event’ was relocated to the hospital, physicians took complete charge (Hubbard 1992: 149). She also points out that several delivery technologies, especially ones like Dr Joseph DeLee’s use of forceps and episiotomy (enlarging the vaginal opening by making a deep cut in the vaginal muscle), had other intentions. Dr DeLee wrote that episiotomies produced healthier babies, caused less debilitation, and left women with ‘virginal’ vaginas (his terms, quoted by Hubbard). Hubbard suggests that the technique was primarily concerned with the ‘advantages a woman’s husband would derive from the procedure [by having a wife with a virginal vagina for his sexual pleasure]’ (ibid.: 150). Hubbard concludes: ‘We [must ask] to what extent different ways
of giving birth empower women or, alternatively, decrease our power to structure childbearing around our own needs and those of the people with whom we live’ (ibid.: 162). Donna Haraway suggests that we need an analysis of reproductive freedom from the point of view of marked groups—‘groups that do not fit the white, or middle-class, or other “unmarked” standard—to produce a general statement’ (Haraway 1997: 197).

Anne Balsamo, speaking of the ‘labouring body’, argues that women are far more likely to be the targets of such disciplining and surveillance because the maternal body is increasingly treated as a technological body, a container for the foetus. Pregnant women are under surveillance to ensure that they do not take drugs or abuse their bodies (for instance, the infamous 1989 ‘cocaine mothers’ case in the US). Increasingly, the maternal rights of the woman’s body are set against the rights of the foetus. Visualisation techniques such as ultrasound scans enable the notion of the foetus as an entity with ‘rights’. Thus cultural discourses of reproduction delimit the meaning of material bodies and the range of freedom accorded them (Balsamo 1998 [1995]: 227).
DOMESTIC TECHNOLOGY

With domestic technology, women’s lives within the home have changed considerably. Arguments have been offered suggesting that technology reduces gender inequality in the home. However, thinkers such as Ruth Cowan dispute (male) ideas that women have had nothing to do in the house since the advent of domestic technology. Mechanisation, argues Cowan, created a whole new range of tasks which were equally cumbersome and time-consuming. Domestic technology, as it has developed, has been designed for use by housewives (Wajcman 1993 [1991]: 87). It has reinforced the gender divisions between husbands and wives and ensured that the woman stays firmly in the allocated space of domesticity. Home is the place of the woman’s work and the man’s leisure. Further, there is a general notion that men are skilled with technology. A woman may use technology, but this is not seen as a mark of her skill. In addition, there is a differential use of technological implements. As Wajcman puts it, women’s identity is not enhanced by their use of machines (ibid.: 89). Even leisure
technologies are gendered. Thus men prefer to watch television in silence, while women, studies tell us, can view only distractedly because of their domestic responsibilities. Contemporary research has demonstrated that while routine domestic work has declined, the time spent on childcare and shopping has increased substantially. With the introduction of more electronic technology into the home, gender analysis of such technology is imperative. Most of these technologies, such as the home theatre or the PC and Internet, are leisure technologies and require that the user spend considerable amounts of time with them. But then, it is almost a truism to say that women find less time for ‘play’. And teleworking and the home office only increase the woman’s work in the house (Wajcman 1993 [1991]: 95).
GENDERING ICTS, DEVELOPMENT, AND GLOBALISATION

With the rise of new technologies and globalisation, the nature of ‘work’ has changed considerably. With ICTs playing an increased role in other industries, the effect on women’s work has been significant. While these new technologies require a certain set of skills and training, they open newer spaces for those women who have attained both. The rise of small-scale units has enabled more women to find employment. Computer-aided design has helped women make careers in areas such as fashion design. Mitter and others point out, however, that the gender hierarchy has not really altered with ICTs. Women still form part of the low-level office workforce and middle management, while men occupy the technical–engineering and top management posts in most European telecommunications companies (Mitter and Rowbotham 1997: 33).

Zillah Eisenstein’s Global Obscenities (1998) is an excellent study of the linkage between globalisation, patriarchy and contemporary technocapitalist cybercultures. Eisenstein argues that while cyberspace appears to create a whole new public space, ‘real’ public spaces are shrinking with steady privatisation. The state is increasingly shifting responsibility for civic services onto the voluntary
sector or community efforts. Global capital, for Eisenstein, is both racialised and sexualised. Privatised individualism and volunteerism ‘mask a dependency on patriarchal and racialised forms of familialism’ because this privatisation ‘conceals the labor of women of all colors’. Further, as government programmes get ‘downsized’ or relegated to the private sector, the needs that these programmes no longer meet are consigned to a fantasised family life, and this directly impacts upon women (Eisenstein 1998: 8). Eisenstein argues that we need a new sense of ‘publicness’ with enlarged social responsibility. This reimagining must include women and girls of all colours, ensuring their full participation in public dialogues (ibid.: 28). Thus abortion rights legislation or decisions on child health, aging or reproductive technologies must not be taken without discussions involving the parties most at risk. Women of all colours must set the priorities for breast cancer and AIDS research. Eisenstein warns that ‘public’ is now ‘a’ public reduced to corporate interest as defined by racialised and gendered structures (a point suggested by Mitter’s argument, in the preceding paragraph, about women’s position in the corporate hierarchy of the telecommunications industry).

Balancing the Equation (2002), the report by the USA’s National Council for Research on Women, presents the following figures, which enable us to understand Eisenstein’s arguments regarding the gendered nature of skill evaluation, organisation and employment in the new telecommunications era.
(i) In 1996 women constituted 45 per cent of the workforce in the USA, but held just 12 per cent of science and engineering jobs.
(ii) Women’s share of undergraduate computer science degrees declined from 37 per cent in 1984 to less than 20 per cent in 1999.
(iii) In 1996 women earned 53 per cent of undergraduate degrees in biology, and 46 per cent of degrees in mathematics and statistics, but only 19 per cent of physics degrees and 18 per cent of engineering degrees.
(iv) From 1975 to 1992, three-quarters of African American women receiving PhDs in biology came from historically black institutions.
(v) In 1999, 56 per cent of Advanced Placement test takers were women, but 90 per cent of computer science test takers and 78 per cent of physics test takers were men.
(vi) Less than 10 per cent of full professors in the sciences today are women, despite the fact that women have been earning more than one-quarter of the PhDs in science for 30 years ().
Any discussion of women and technology must take into account such factors as education and access to technology within the international labour economy. Melanie Millar has pointed out that cybertechnology affects diverse groups of women extremely unevenly. The cyberfeminist, the factory worker making computer chips, and the data entry worker have very different relations to cyberculture. Rosi Braidotti emphasises: ‘Hyper-reality does not wipe out class relations: it just intensifies them’ (quoted in Booth and Flanagan 2002: 13). Zillah Eisenstein points out that as government authority continues to shrink, the space for our complaints and debates also shrinks, for there is no one to respond to our emails or computer-menu-driven toll-free numbers (Eisenstein 1998: 29). Eisenstein, as we can see, is in direct opposition here to thinkers such as Mark Poster who believe that cyberculture ushers in greater democracy (see Chapter 3, ‘Politics’). Further, since cyberspace is already ‘colonised’ by corporate interests, ‘pre-existing’ racial, sexual and gender inequalities persist. By privileging the so-called autonomy of technology, this view blurs and conceals the realities of global capital that set up the technology itself. Some statistics are in order here for us to understand the degree of racial inequality in cyberculture. Eighty-four per cent of computer users are from the North American and Northern European regions. Of these, 69 per cent are males, with an average age of 30 and an average household income of $59,000. There is a very evident racial elitism in cybercommunities. Except for Singapore, the top 20 Internet–computer connected countries are all First World. In the USA, 20 per cent of African–Americans have home PCs, and 3 per cent subscribe to online services. Eighty per cent of the world’s population lacks basic telecommunication access. There are more telephone lines in Manhattan
than in the whole of Sub-Saharan Africa. The USA has 35 computers per 100 persons, Japan has 16, and Ghana has 1 per 1,000.
Standard gender expectations continue even in cyberspace. Traditional notions of masculinity and femininity persist in cyberporn and IRCs. It might be useful to study, if possible, the number of young males who visit pornographic Websites in cybercafes in Indian cities. The categories of porn visited, along with the Web presence of each of these categories, might enable an understanding of how prevalent notions of feminine ‘beauty’ or ‘machismo’ spill over into cyberspace.
Then, as governments hand over power to transnational capital and corporations, aspects of ‘statist patriarchy’ (the privileging of men in the public sphere and the restriction of women to the private/home) are transferred to the corporate body. A transnational gendered division of labour occurs. Women’s work is exploited at computer terminals in the Third World, with financial control/profit centred in the First (the spurt in India’s ‘call centres’ and medical transcription centres, staffed mainly by women working late-night shifts to adjust to the time difference between India and the Europe/US zone, is a case in point). Global capital simply ‘rewrites’ the sexual division of labour, and retains areas such as childbearing and nurturing for women. Thus, both transnational capital and the family compete for women’s labour. Just as the nuclear family replaced the extended family in First World countries, nuclear families are now being replaced by single-parent ones. When First World nations privatise, they nostalgically recall the fantasised family of yesterday. In Third World nations supplying the export labour for TNCs, the traditional patriarchal familial patterns are broken. Globalisation redeploys patriarchy in First World countries too. The woman’s burden is now different: women constitute single-parent families and work a ‘double day’ of labour, effectively forced to do more in the absence of state welfare (Eisenstein 1998: 135–37). The woman’s labour (domestic, informal, unregulated) is integral to transnational capital. Home-based women workers—now proliferating in India—are unregulated and undermine the traditional patriarchal family. Home-based work also increases the woman’s work within the domestic space of the home, for, in addition to her regular duties around the house, the woman has to keep up her wage-earning work schedule. This, however, does not take away the fact that women occupy the lower rungs
of the wage scale even in transnational economic conditions. The privatisation of state services affects women badly, especially in the areas of childcare, health and education. Women’s labour, ‘underpaid or unpaid’ (as Eisenstein puts it), is ignored by the IMF and the World Bank. Eisenstein points out that without welfare 70 per cent of single mothers with young children in the Netherlands would fall into poverty (Eisenstein 1998: 142). She concludes that capitalist technologies unsettle traditional forms of patriarchy, replacing them with more modern ones (ibid.: 138).2
Eisenstein makes a fascinating point when she argues that simultaneous with the privatisation of public life/space (as states shift it to volunteerism and the private sector) there is a ‘publicising’ of private life. Sex scandals, talk-show confessionals and the exposure of domestic violence mark this new culture (ibid.: 108). Transnational capital, Eisenstein argues, displaces more professional labour in favour of service labour. As educational institutions lose state subsidies, jobs in these areas become less desirable. As privatisation proceeds, more and more women are allowed into these institutions to manage ‘denuded and downsized arenas of power’ (ibid.: 109). As TNCs take power and states become dysfunctional, with less and less of a role in welfare, the family gets hit. With downsizing, more families suffer from unemployment and lower incomes; in addition, they have no welfare. Families, now expected to fend for themselves since the state cannot provide for them, come to be idealised as a haven by some thinkers (ibid.: 110). For instance, the Third Way politics of Anthony Giddens and Tony Blair’s New Labour emphasises the duty of everyone—regardless of age or disability—to look for work. In this ‘workfare’ state, welfare is linked to the capacity to seek work. This essentially throws the responsibility of finding work upon the individual. (See Anthony Giddens’ The Third Way (1998) and The Third Way and its Critics (2000) for a discussion of these issues.)
CYBERFEMINISM
Cyberfeminism is an attempt to redress the masculinist appropriation of contemporary technology. Cyberfeminists see cyberculture
as a revolutionary social experiment with the potential to create new identities, relationships and cultures (Booth and Flanagan 2002: 11). It is a response to a condition where popular cyberculture ‘is in danger of becoming ensnared in the nest of technological determinism’ (Adam 2002: 161). This cyberculture views technological development as inevitable rather than socially determined. It is precisely this notion of cyberculture that cyberfeminism seeks to undo. Feminist respondents ask two pivotal questions of technology: (a) how are relations of power distributed across and actualised through human–technology interactions, and how do women fare in this distribution?; and (b) how can such relations, where they prove detrimental to women, be challenged and transformed? (Currier 2002: 519). Cyberfeminism builds on these questions to investigate the new forms of gender identity that emerge with contemporary ICTs. Donna Haraway’s ‘cyborg manifesto’ initiated the cyborg as an androgynous figure that feminism could appropriate. Haraway’s work launched cyberfeminism in the sense that it mapped out the trajectory for a feminist appropriation of contemporary technology. In order to understand the essential ideas of Haraway and cyberfeminism, it is first necessary to look at how women’s relationship with technology has been represented. All critics pay attention to (a) identity, (b) the body and corporeality, (c) sexuality, and (d) community in their discussions of cyberfeminism. Cyberfeminisms, as Faith Wilding of the Critical Art Ensemble defines them,
Imagine ways of linking the historical and philosophical practices of feminism to contemporary feminist projects and networks both on and off the Net, and to material lives and experiences of women in the integrated circuit, taking full account of age, race, class, and economic differences. ()
The VNS Matrix’s ‘Cyberfeminist Manifesto for the Twenty-first Century’ declares: ‘We are the virus of the new world disorder/rupturing the symbolic from within/saboteurs of big daddy mainframe/the clitoris is a direct line to the matrix’ (). The VNS Matrix’s ‘Bitch Mutant Manifesto’ also declares: ‘We are the malignant accident which fell into your system while you were sleeping. And when you wake we will terminate your digital delusions, hijacking your impeccable software’ (). Jyanni Steffensen, reading the work of VNS Matrix and artists like Suzanne Treister, argues that cyberfeminist art has a distinctive edge to it. It is, for Steffensen, an effort at ‘transform[ing] the masculinist reproduction of female-sexed subjectivity’. Steffensen goes on:
In these feminist-inspired virtual worlds, the female body is staged as active, intelligent, polymorphously sexual. This constitutes a shift from the cyberpunk signification of female subjects as passive objects of male desire (the cyberbimbo), or as a metaphor for technology as ‘female’, threatening, and out of control (the fembot/vamp). (2002: 216–17)
The VNS Matrix’s specifically and graphically female arts are, in Steffensen’s terms, ‘redeployed […] site[s] for the construction of libidinal pleasures—in sex, in horizontal rather than Oedipal (vertical) relationships, in technological production, in sexy technology—a feminized and postfeminist erotics of technocultural production’ (ibid.: 222).
VNS Matrix and cyberfeminist art therefore appropriate paternal organs, spermatic metaphors and metaphors of viral infection, as well as references to female genitals and bodily processes (games such as DNA Sluts are examples). Spermatic and penetrative metaphors (the latter seen frequently in male cyberpunk) are utilised in imaging the mutating female subject as a virus that infiltrates and penetrates the technobody of patriarchal databases (ibid.). Thus the VNS Manifesto speaks of its own discourse as ‘infiltrating disrupting disseminating corrupting the discourse’. Cyberfeminism sees earlier feminisms as anti-technology, anti-sex, essentialist and irrelevant to women’s circumstances amidst the new technologies. However, as Faith Wilding points out in her ‘Where is Feminism in Cyberfeminism?’ (), the cyberfeminists have already adapted/adopted many of the strategies of avant-garde feminists:
• Separatism: women-only chat/discussion groups, networks and self-help groups.
• Feminist cultural, social and language theory and analysis.
• Creation of new images of women on the Net to counter sexist stereotyping (feminist avatars, cyborgs and pornography).
• Feminist Net critique.
Cyberfeminist groups emphasise education, especially technological education, and collaborate in spreading it. In a roundtable discussion Marjorie Perloff argued that (at least for her) the revolution of the last few years has been the ‘subtle movement of Internet activities into the realm of poetry and the arts’ (). Katherine Hayles, in her contribution to this discussion, argued that the new ICTs are ‘especially well suited to collaborative projects’, and that more women are interested in the artistic use of the medium than in its use for militaristic games and simulations. Cyberfeminists pay attention to the need for increased women’s involvement in Net development (a direct reference to the social constructivist view of technology). Cyberfeminists thus seek to redress the imbalance in the development of technology, where technology has always been ‘masculine’. A brief survey of the various positions on cyberfeminism cuts across theories of science, popular representations of women and technology, and the feminist theoretical ‘appropriation’ of contemporary technoculture.
FEMINISING TECHNOLOGY AND ANXIOUS MASCULINITIES
Mary Ann Doane argues that fears and anxieties over technology are frequently tackled by displacing them onto a feminine figure. She posits that central to the anxiety over technology is the problem of reproduction. Emergent technologies such as IVF, birth control and artificial insemination produce anxieties about the modes of human reproduction and, relatedly, the ‘role’ of the male and female. Doane detects this anxiety over reproduction in films like The Fly, Alien and Blade Runner (Doane 1999: 25). Mary Shelley’s great scientific–gothic fantasy Frankenstein, for instance, has long
been read by critics as embodying a similar fantasy–anxiety of reproduction without a woman. Species reproduction is central to all such cultural anxieties over ‘technobirths’. A related theme is the anxiety over the production–mothering of something monstrous (Doane 1999: 25). Documentation of the birth of monstrous children and animals, categorised as miracles or omens, is an old technoscientific tradition. The Philosophical Transactions of the Royal Society regularly documented such births, and no less a person than Robert Boyle reported and discussed such phenomena. In contemporary technoculture, technologies of reproduction regulate what Doane terms the ‘excesses of the maternal’. Yet they also undermine the epistemological certainty and comfort of the maternal. The mother is no longer knowable, and the first and most credible source of knowledge is here taken away. In addition, these technologies of reproduction take away the comfort of ‘origins’ (ibid.: 31). Women in cyborg films—and Terminator is a good example—are frequently mothers, which suggests that gender concerns are central to popular cyborg representations. An anxiety over masculinity is visible in the ‘pumped-up hyper-masculine’ bodies of the male cyborgs. Several critics have pointed out that such ‘overdone’ bodies (Arnold Schwarzenegger is what comes first to mind when we hear the term ‘cyborg’; he is an advertisement for both contemporary technology and fitness gyms) suggest either reassertions of ‘hegemonic masculinity’ or ‘hysterical over-compensations’ for masculinity in crisis (Holland 1998 [1995]: 166). Claudia Springer points out that while the boundaries between man and machine are enthusiastically broken down, gender boundaries are less easily negotiated, and therefore cyborgs ‘appear masculine or feminine to an exaggerated degree’ (Springer 1999: 41). Springer also argues that, like the comic-book superheroes they are based on, the erotic appeal of these cyborgs lies in the ‘promise of power they embody’ (ibid.: 47). Violence in all these cyborg images substitutes for sexual release. Springer, following the work of Steve Neale, suggests that films and images that caress a naked male body may trigger (unacceptable) homoerotic responses in a male audience. The objectification of the male cyborg body is justified by making him the object or perpetrator of violence. The male body is ‘restored to action to deny its status as passive object of desire by inviting the spectator to admire the beauty of the male
body followed by its participation in violence’ (Springer 1999: 47). Springer argues that cyberpunk and techno-thrillers defend masculinity against femininity by transforming the body into something only minimally human. Technological props are used to ‘shore up’ masculinity. Thus, the irony is that the masculine subject can be preserved precisely by disintegrating its coherence as a male body—with its prostheses of electronic systems. Masculinity is defended as masculine only through its appendages and transformations. Springer argues that by escaping identification with the male body, masculinity in cyborg imagery suggests an essential masculinity that is beyond the merely corporeal. Springer suggests that, for the popular imaginary/ideology, ‘in a world without human bodies […] technological things will be gendered and there will still be a patriarchal hierarchy’ (ibid.: 48). She concludes: ‘Muscular cyborgs in films thus assert and simultaneously disguise the dispersion of masculine subjectivity beyond the male body’ (ibid.: 49). Springer’s is a crucial reading of the gendering of popular representations of the technological condition today.3
Allucquere Rosanne Stone (1999), in her ‘Will the Real Body Please Stand Up?’, argues, in psychoanalytic terms, that for the young male computer user the experience of unlimited power is gendered and closely linked to the need for control. Penetrating the screen becomes analogous to the sexual act, where the male user empowers himself by incorporating the surface of the feminine/feminised cyberspace into himself. The seduction of cyberspace is simultaneously erotic and threatening: unlimited power and the loss of the body. The sense of dizzying physical movement through the matrix carries with it a longing for embodied conceptual space. This sense accompanies the desire to cross the human–machine boundary and sub/merge into the seductive smoothness of cyberspace. Hence the ‘Bitch Mutant Manifesto’ of the cyberfeminist group VNS Matrix declares: ‘The pleasure is in the dematerialisation. The devolution of desire.’ This, Stone argues, is analogous to the longing of the male for the female. The desire for merging into cyberspace and breaking the human–machine barrier is what Stone calls ‘cyborg envy’ (Stone 1999: 90–91). To enter cyberspace is to ‘put on’ cyberspace. In Stone’s extraordinary formulation, ‘to become the cyborg, to put on the seductive and dangerous cybernetic space like a garment, is to put on the female’ (ibid.: 91).
In a cyberfeminist critique Nicola Nixon points out that the ‘console cowboys’ (as William Gibson calls hackers in his fiction) in cyberpunk represent a gendering of the new technologies. The matrix, she notes, is always configured as feminine space. (The word ‘matrix’ originates in the Latin ‘mater’, meaning mother, womb, or something within which something else originates or develops.) Reading Neuromancer, Nixon demonstrates the sexism of images such as the cowboy ‘penetrating’ the matrix. In the novel this ‘jacking in’ runs the risk of hitting the ICE (Intrusion Countermeasure Electronics), a metaphoric hymeneal membrane. The matrix may have a life of its own, but it is one that has been configured by women like 3Jane, Angie Mitchell and Mamman Brigitte in Gibson’s cyberspace trilogy. Even the Artificial Intelligences, Wintermute and Neuromancer, with their autonomous desires for unification, have been, we discover, programmed by Marie-France Ashpool to have that desire. Nixon argues that the ‘change’ (as Gibson puts it in the trilogy) is basically an uncontrollable feminising of the matrix, where viral software is transformed into a feminine Other. Nixon goes on to reject the idea that cyberpunk is counter-cultural. This is because the computers and the software are a part of the corporate system, and there is no way in which Case or Bobby Newmark (in Gibson’s trilogy) can step out of the technology they use. Thus, in several ways, cyberpunk reveals how cyberspace is as gender-biased as ‘real’ space (Nixon 1992: 191–207). However, as Jyanni Steffensen points out, a simple reversal of this stereotype of matrix/cyberspace as female and hacker/computer-user as male in feminist cyberpunk retains the heterosexual norm. These feminist cybertexts retain the essential procreative heterosexual mythos through representations of female cybercultural producers of the phallus. Steffensen, instead, calls for a ‘queer spin’ on the ways in which female cyberartists such as Suzanne Treister and VNS Matrix have constructed a mythical order (Steffensen 2002: 221).
IDENTITY
Anne Balsamo argues that the hybrid figure of the cyborg actually underlines the ‘inviolable opposition’ between the human and the machine. Cyborgs fascinate us by disrupting notions of
Otherness, since all notions of the human depend upon a certainty over the nature of the non-human. The non-human has always been seen as the artificial/mechanical Other. Cyborgs, as simultaneously human and artificial, complicate the oppositions between human and non-human and between authentic and artificial. They foreground and emphasise the constructed nature of all Otherness. They underline the fact that all culture depends upon certain notions of the Other that are arbitrary, shifting and unstable (Balsamo 1999: 153). Balsamo moves the debate over cyborgs one step further when she talks of the female cyborg as challenging the opposition between human and machine. While acknowledging that cyborg images reproduce cultural gender stereotypes, Balsamo argues that female cyborgs, with their emotional, sexual and natural maternal characteristics, are far more radical. These embody a cultural contradiction where objective technology is always associated with masculinity, and the emotional–subjective with femininity. Balsamo’s argument has important consequences for cyberfeminism. She raises two questions: (a) can the ‘feminine’ essentialism of fictional cyborgs be transformed into a non-essential image for contemporary women?; and (b) is there a way that the cyborg image can be used by feminism? (ibid.: 148–49). Balsamo provides two answers, both of which are central to cyberfeminism:
(i) One way of using the female cyborg for feminism is to construct a utopian vision of the possibilities of a technology that emancipates women from the body. Here technology can be liberating. This is Haraway’s contention in the ‘Cyborg Manifesto’. The utopian reading makes the cyborg a symbol for the integration of the old and the new, a feminist transcendental vision.
(ii) The second is to construct an ideological critique of the cyborg image as it has been produced by patriarchal culture. This means that we need to acknowledge that women’s development is not separate from technological development. The woman has been forced to become, like a cyborg, a hybrid creature of fiction and reality, and both cyborgs and women become ‘unknowable’ in our imagination (ibid.: 149).
A critical vision assembles women as cyborg from ‘bits and pieces of women’s experiences that have already been out there, a
reassemblage that sustains a critical perspective of technological/scientific/cybernetic discourse’ (Balsamo 1999: 153). Balsamo concludes by suggesting that the cyborg must become a ‘prototype’ for a feminist rethinking of identity, embracing identity as multiple and diverse. There is no return to ‘origins’ or essential identity of the feminine/lesbian/black or even cyborg.
Sadie Plant, following Luce Irigaray, argues in her famous cyberfeminist essay, ‘The Future Looms: Weaving Women and Cybernetics’, that feminine matter is denied by human culture. This matter takes the form of a virtual system from which the actual emerges. Nature is the actualisation of this virtuality. As Plant puts it: ‘What man sees is nature as extension and form, but this sense of nature is simply the camouflage, the veil again, which conceals its virtuality’ (1998 [1995]: 61). Man requires the virtuality, the nature from which he draws his materials, in order to build a ‘culture’. Plant points out that for man, Mother Nature may have been his material origin, but it is God the father to whom he must be faithful (ibid.: 57). Hence man continually seeks to escape the grounds of his origin in search of autonomy. Technophobia, for Plant, is a display of man’s fear of this matrix behind the actualisation. The screen enables man to ignore the fact that he is still connected to the mater–matrix. The computer, for Plant, ‘joins women on and as the interface between man and matter, identity and difference, one and zero, the actual and the virtual’ (ibid.: 63). What appears on screen is a disguise for what goes on behind it. The pixels are merely the surfaces of the data that hide on the other side. For Plant this is a veil, something a woman weaves and uses to conceal herself. The matrix too hides as its simulation (ibid.: 59–60). In an innovative reading, Plant sees the matrix as the place of a woman’s affirmation. This affirmation is of the future (which can be actualised; ‘virtual’ also means, after all, ‘potential’). This future has not yet arrived but its presence can be felt, behind the veil–screen (ibid.: 60). Plant extends this argument further. She says that looking back on his origins, man sees only the ‘flaw, the incompletion, the wound, a void’ (ibid.: 61). This ‘site of life, of reproduction, of materiality’ must be covered. Hence woman is covered and put on the screen. As Plant says: ‘She is covered not simply because she is too natural, but because
she would otherwise reveal the terrifying virtuality of the natural’ (ibid.: 61).
Dawn Dietrich argues that ‘women stand to gain little as quasi-embodied subjects within a network environment without reference to the material conditions of their subjectivity’ (Dietrich 1997: 178, emphasis in original). Faith Wilding underlines the point when she notes that the new media ‘exist within a social framework that is already established in its practices and embedded in economic, political, and cultural environments that are still deeply sexist and racist’ (). Thus, while VR technologies offer a new stage for the construction and performance of body-based identities, it is more than likely that old, more comfortable stereotypes and identities will persist and even be reproduced. As Jyanni Steffensen puts it: ‘It is […] likely that these new technologies will be primarily utilized to tell old stories—stories that reproduce, in high-tech guise, traditional narratives of the subject’ (2002: 216). Julie Doyle and Kate O’Riordan, reading the Visible Human Project and its ‘sister’ project, the Stanford Visible Female, point to this persistence of cultural stereotypes in medical imaging. The normal body depicted in these projects is that of a young female. Most importantly, it is of a female who is reproductively able. This privileging of the pre-menopausal body, write Doyle and O’Riordan, ‘continues to abnormalize that of the postmenopausal, reinforcing dominant ideologies of legitimate femininity that conflate femininity with youth and reproductive capacities’ (Doyle and O’Riordan 2002: 250). Closely aligned with the debates over ovarian tissue transplantation technologies (which enable postmenopausal women, or those whose fertility has been affected due to radiotherapy, to become pregnant), the discourse privileges the ‘legitimate’ reproductive female over the aged, post-reproductive ‘illegitimate’ one (ibid.: 251). As Doyle and O’Riordan summarise it, ‘The female body is subject to a strict taxonomy of normal as young and actively reproductive, and this is publicly policed through discussion and scrutiny’ (ibid.: 251).
Zoe Sofia, writing about ‘virtual corporeality’, speaks of the ‘seduction of synecdoche’ in these new technologies. Sofia defines synecdoche as
[W]hat allows the disembodied, alienated, objective rationality of a certain gender, class, ethnicity and historical epoch to be vaunted as universal, while other styles and components of rationality—such as embodiment, situatedness, emotion—are ignored or dismissed as non-rational. (Sofia 1999: 57)
Computer microworlds and virtual realities offer the promise of total control and mobility. Abstracted from the reality of the whole body (and the rest of the senses, with the visual and aural being privileged), the microworld of VR offers the illusion of total control over the entire system.
VR has other implications and possibilities for alternative forms of desire and sexuality. Thomas Foster argues that heterosexuality has a fundamental prohibition. Occupying a male body naturally leads in two separate directions: towards self-identification with masculine gender norms and towards a libidinal cathexis on a female object-choice. Foster suggests that VR allows sexual desire to be combined with identification with the object of desire (Foster 2002: 481). VR literalises the inseparability of desire from identification, and thus undoes the process of heterosexual identity, which is based on binary and internally homogeneous constructions of masculinity and femininity (ibid.: 482). Feminist cyberpunk relocates subjectivity outside the space of the physical body. (Allucquere Stone’s theoretical work deals explicitly with this.) It also permits, for Foster, same-sex desire that has been incorporated and repressed through gender identification. Desire is no longer exclusively associated with the material body or identification with virtual spaces where supposedly disembodied minds can communicate. VR unsettles the distinction between desire and identification, body and mind; material and virtual no longer map onto one another perfectly (ibid.: 483).
Alison Adam argues that feminist ethics must be interwoven into studies of cyberculture. She writes: ‘Feminist ethics can help expose the power inequalities that case studies often reveal and that traditional computer and Internet ethics renders invisible in its pursuit of mainstream ethical views’ (Adam 2002: 167). We therefore need to look at the ethical problems of the Internet, especially those that can be viewed as gendered problems. Cyberstalking, Internet pornography (especially paedophilia) and
hacking are three such areas. Most cyberstalkers are male, and most of their victims are female. Most Internet paedophiles are male. Hacking itself is a predominantly male activity (Adam 2002: 168; Adam cites BBC news programmes as her sources for this information). Adam concludes her call for an explicitly feminist ethical approach thus:
The challenge then for a cyberfeminist ethics is to develop further the argument that shows how the masculine individualism of traditional ethics is damaging in extreme circumstances, particularly when coupled with the dystopian, apolitical stance of cyberculture that allows individuals somehow to justify to themselves that their activities are not wrong. (ibid.: 169)
Adam’s is, I believe, a salutary call for a new critical imperative and programme in cyberculture studies.
DONNA HARAWAY AND THE ‘PROMISES’ OF CYBORGS
Haraway’s work has been path-breaking in its exploration of the potential of contemporary cybernetic technoculture for feminism, and is significant enough to merit a separate discussion under cyberfeminism. As she says, her work ‘explore[s] bodies, technologies, and fictions as they are constructed through the mediation of late twentieth-century communications sciences’ (Haraway 1991c: 24), with the emphasis being on ‘constructed’, an important idea in Haraway’s work. Cyborgs, for Haraway, are ‘potent fusions of the technical, textual, organic, mythic, and political’ (ibid.: 24–25). Cyborgs are ‘nonoriginal people’, born out of multiple displacements (Haraway 1991b: 13). She calls into question the integrity and unity of ‘natural’ objects. Such a cyborg feminism, for Haraway, liberates the woman from totalising identity politics without recourse to biological fundamentalism in sexing bodies as ‘female’. Her cyborg is thus a post-gender cyborg.
BOX 5.1 DONNA HARAWAY
Professor with the History of Consciousness Programme, University of California, Santa Cruz (USA), Donna Haraway can only be described as a ‘historian of the systems of thought’ (the designation and official post Michel Foucault occupied in France). She has provided some of the most trenchant critiques of modern technoscience. Her ‘Manifesto for Cyborgs’ launched a concept and metaphor for generations of scholars to work upon. Combining a deeply humanist concern for life (human, plant, animal) with an extraordinary historical sense and critical rigour, Haraway’s work on genetic engineering, scientific laboratories, transplants and other contemporary technologies has inaugurated an entire disciplinary field in cultural studies of technoscience. She has explored the racialised and gendered nature of science and technology, paying particular attention to the material practices that have discursively silenced and marginalised ‘other’ voices in the texts of technoscience. Haraway, like Edward Said in the postcolonial studies field, is surely one of the most important and influential thinkers of the twentieth century. The ‘cyborg’ is on a par with ‘Orientalism’ (Said), ‘différance’ (Derrida), ‘mirror stage’ (Lacan) and ‘episteme’ (Foucault) as one of the most influential and quoted terms of our times.
Working from the assumption that subjectivity is fragmented, Haraway argues that the boundaries between body and technology are socially drawn. The cyborg is a ‘monstrous’ image, the ‘illegitimate’ child of (a) dominant society and oppositional social movements, (b) science and technology, and (c) the human and the machine. It is thus a hybrid. Cyborgs are interactive participants found at all points of information production and replication. Both ‘woman’ and the ‘cyborg’ have been symbolically and biologically produced and reproduced through social interaction. Both the ‘self’ and the ‘body’ are produced in this interaction. The cyborg is an oppositional figure that serves to demarcate the boundaries between fact and fiction, science and culture. Haraway’s argument is that if the cyborg embodies both a human and a technological reality, then it shares a close affinity with the woman’s identity, which is also biologically determined and socially constructed. She argues that cyborg identity—which
she terms an ‘ironic political myth’—is the only possibility for woman-identity in the late twentieth century. Haraway writes: ‘Cyborg feminists have to argue that “we” do not want any more natural matrix of unity and that no construction is whole’ (1991a [1985]: 157). She seeks to unite what was hegemonic feminist theory with theories of local resistance and US Third World feminism. In a challenge to rethink the relationship of women to technology, Haraway suggests that women need to respond to technology without recourse to the traditional opposition—‘technology versus nature’. Writing of a ‘joint kinship’ and the ephemeral nature of all identity in cyborgs, Haraway declares: ‘A cyborg world might be about lived social and bodily realities in which people are not afraid of their joint kinship with animals and machines, not afraid of permanently partial identities and contradictory standpoints’ (ibid.: 154). The notion of joint kinship is something Haraway adapts from contemporary indigenous writings. These writings (in Alice Walker, Gloria Anzaldua and others) seek a relationship outside blood relationship, along the ‘lines of affinity’. The mestizo or border-crosser (a term Anzaldua has deployed to capture the sense of a shifting identity) is one who seeks a relationship across lines of difference, a kinship based on ‘love’ beyond differences rooted in the body/skin. For Haraway, contributions by women of colour (such as the Chicana author Cherrie Moraga) present a cyborg identity. Their emphasis on a ‘differential consciousness’ enables the subversion of fixed identities of ‘original’ characters. Instead Moraga and other writers ‘recode’ tools of communication and intelligence to subvert the command–control set-up. Simone Bergman and Liesbet van Zoonen therefore argue that the ‘communications facilities of the Internet can be seen as the virtual translation of more or less traditional feminine concerns of personal contact, sharing and creating community’ (1999: 105). Science must be treated as contested and attention must be paid to the agents that ‘create’ science (an argument that closely parallels Bruno Latour’s work on actors in science). Emerging from ‘networks of multicultural, ethnic, racial, national and sexual’ situations, these actors are ‘inappropriate/d others’ (a term she adopts from Trinh T. Minh-ha). These inappropriate/d Others are neither ‘Self ’ nor ‘Other’—the binary which all Western identity politics adopts. It means, in Haraway’s words, ‘not to fit in the taxon’, out of the available identity maps, and not ‘originally fixed
by difference’ (1991c: 23). That is, the inappropriate/d Others refuse to be held and restricted to so-called ‘original’ identities based on differences of race, colour, gender or sexual preference. Drawing upon Chela Sandoval’s work, Haraway stresses that the phrase ‘women of colour’ suggests a cyborg entity. Such women are refused location in stable categories of sex, race and class. The women-of-colour subjectivity constructs a postmodernist identity of Otherness and difference. It does not privilege one social position over another, but accepts their intersections. Liberal cyberfeminism sees the computer and computer-mediated communication as liberating, a ‘utopia’ (a term Haraway uses) that overcomes the binaries male/female and heterosexual/homosexual. Liberal cyberfeminism borrows from queer theory, postmodernism—especially when it stresses the contingency of all knowledge and the ‘situatedness’ of all methodological approaches—and feminism. In this utopian vision cyberspace is seen as a more democratic space where the gender inequalities of ‘real’ space can be overcome. In cyberspace, a cyborg science can mean that meanings are only ever temporary. Meanings shift, and attach to others depending upon current needs. This is the ‘differential consciousness’ of Haraway’s situated knowledges. A differential consciousness is a ‘tactical subjectivity’ (Chela Sandoval’s term), with the capacity to recentre depending upon the specific oppression to be confronted at the moment. In the US, this kind of subjectivity crafts a political unity between women of colour from different locations. These women have been positioned within specific histories of exclusion and oppression, and yet they share a common subjectivity. For example, Cherrie Moraga in This Bridge Called My Back (Moraga and Anzaldua 1981) speaks of the ties between Chicana and African–American women. Haraway’s cyborg identity is precisely this: a politics based not on ‘essential’ identity but on positions and affinity. It exposes ‘white’, ‘black’, and ‘woman’ as social fictions rather than stable social categories.
EPISTEMOLOGY AND SITUATED KNOWLEDGES
Mary Flanagan, building on Haraway’s work, pays attention to the ‘shape of knowledge’ that emerges in contemporary technoculture. She argues:
It is at the site of the female body and computer imaging that epistemological implications arise, precisely because it is this body that exists at the forefront of popular media and culture, carrying with it a set of assumptions about the position or shape of knowledge. (Flanagan 2002: 431)
Flanagan suggests that we need to treat the virtual body and its relationship to the user/participant’s corporeal self from an epistemological standpoint rather than look at the surfaces—the representations of women’s images—in cyberculture. For example, she asks: ‘How does the interface of Lara Croft’s digital body affect users/players of that body?’ (ibid.). Flanagan argues—substantiating her argument with empirical surveys—that players position themselves in relation to the heroine in multiple ways, even though the game manufacturer assumes that all users will take the same position. 3-D games can provide at least five positions for users:
(i) The user causes the character to act.
(ii) The character acts independently—with some autonomy and agency—and the user watches the character for signs of life, as it were.
(iii) Users act with the character as friends or co-adventurers.
(iv) Players see the character as simply virtual.
(v) Players identify themselves with the character.
The medium thus involves a very complex identification with the audience. Having a female character complicates this even more. Users may see these figures as powerful females, but the figures have limited agency, and most game-play is dictated by the user’s desire. Thus, while the possibility of fostering ‘situated knowledges’ is technically present, the game content actually leaves little room for critical questions or change. Flanagan suggests a move towards ‘hyperknowledge’. She argues that participants in computer games do not truly experience what it means to be female; rather, their original subject location creates them as knowers. This is a ‘double embodiment’. The computer world user experiences a double consciousness: the class, race and gender identity of the user’s body and that of the virtual body/bodies s/he becomes when participating in ‘screen life’. The dichotomy between the physical subject and the ‘I’ formed
through the screen is undefined and flexible, but it does not replace the user’s raced, classed and gendered experience. Flanagan argues that if we do not abandon the user’s real body, these multiple identities can offer potential for repositioning the subject (Flanagan 2002: 438). The body of the virtual game character is distanced, ‘proxied’ through the mechanism of the user interface. The type of knowledge established through these virtual characters becomes a way of ‘knowing’ through performance. There is the performance of the body—the performance of the gender of the virtual body and the relationship between this secondary performance and that of the gender of the knower. A combined subject position is made available through the multiple positioning of the user and the implications for agency within these virtual worlds. As more and more of the physical body is mediated by/in digital discourse, this kind of combined subject position has far-reaching implications. Flanagan argues that ‘floating’ subjects—as Allucquere Stone and others theorise them—are apolitical, and are uncoded by race or gender assumptions. Floating, for Flanagan, does not describe the experience of knowledge generated between these bodies. Knowledge in virtual space is always negotiated as a ‘product (a very political product) of a located and a roaming subjectivity. Hyperknowledge is created within this third space, in the relationship between the virtual body and the physical’ (ibid.: 440). Thus knowledge, for Flanagan, is not embodied or empirical but a combination of both. We need to look at the ‘shifting space between bodies’ where feminists can develop new ways of identification in space without ignoring the situation. Flanagan argues that a new way of knowing is articulated through movement in computer-generated space. The focus must therefore be on gaming as ‘performance’ (here Flanagan invokes both Judith Butler’s and Jean-Francois Lyotard’s work on performance–event–identity), and upon the possibilities of constructing a ‘gaming subject’ (ibid.: 441–42).
In her reading of cyborg feminism, Haraway speaks of a new science. She suggests that knowledges across differences must be taken into account. Since masculinist science has always believed in a unitary, homogeneous ‘objective’ knowledge, cyborg feminism must utilise knowledge of those categories that have
been excluded from articulation in the power game. Traditional science has always seen knowledge as a given—just there for men to reach out and discover. Cyborg feminism insists that knowledge is ‘constructed’ by ignoring certain kinds of knowledges and valorising others. It suggests that the knowledge of Third World women and tribals—so far excluded from mainstream scientific thought—must also be used. This is what Haraway terms ‘situated knowledges’. Here the objects of knowledge in mainstream science (usually women and coloured people), who have never been allowed to ‘possess’ knowledge, must be seen as actors and agents. That is, from passive objects of knowledge, these become subjects of knowledge production. These subjects transform themselves while simultaneously allowing themselves to be transformed. In short, cyborg feminist science suggests a transcendence of the subject/object, knower/known binary. In her ‘postscript’ to the ‘Cyborg Manifesto’ Haraway writes:
The knowing self is partial in all its guises, never finished, whole, simply there and original; it is always constructed and stitched together imperfectly, and therefore able to join with another…. (1991c: 22)
Chela Sandoval provides an important reading of Haraway’s ‘cyborg feminism’ in the context of feminism, postcolonialism and First World transnational cultural conditions. In her essay, ‘New Sciences: Cyborg Feminism and the Methodology of the Oppressed’ (1999), Sandoval argues that cyborg consciousness is oppositional consciousness. This oppositional consciousness has various names: ‘situated knowledges’, ‘mestiza consciousness’ and ‘womanism’. For such a consciousness to be effective, it must be developed out of a set of technologies that Sandoval calls the ‘methodology of the oppressed’. She posits five technologies that constitute the methodology of the oppressed:
(i) Semiology (Barthes) or ‘signifyin’ (Henry Louis Gates) or ‘deep seeing’ (Audre Lorde): The study of signs and sign systems.
(ii) The de-construction of dominant ideologies.
(iii) ‘Meta-ideologising’: Appropriating dominant ideologies and transforming them.
(iv) Democratics: Orienting the previous three technologies with the intent of producing egalitarian social relations.
(v) Differential movement: Something on which the other technologies depend for their operation.
For Sandoval such a methodology is a new form of feminism itself. Here one needs to identify a set of skills that are capable of connecting our ‘technics’ (material and technical details, rules, machines and methods) and ‘erotics’ (sensuous apprehension and expression of ‘love’ as affinity). This will require a ‘politics of articulation’ (Haraway’s term) capable of forging new collectivities and new coalitions. For this we need to look at the systems that link meanings and bodies (semiology). Then, we need to look at the processes by which ‘objectification’ occurs (such as the objectification of women, who are allowed to be only the objects, never the subjects, of knowledge). We need to take responsibility for the systems of objectification. We need to be accountable for the politics and epistemologies that are constructed. The ‘object of knowledge’ must now become (or be made) the agent and actor in the epistemological game (Sandoval 1999: 247–63).
There are, naturally, dissenting voices in this valorisation of the feminist ‘utility’ of the cyborg image. Looking at lesbian cyborgs, Camilla Griggers raises the following questions of cyberfeminist readings of technoscience: ‘Who gets to produce cyborg bodies, who has access, who provides the laboring and component bodies, and who becomes and who buys the commodities produced?’ (quoted in Bayer 1999: 118). Critics such as Erica Burman believe that ‘far from fashioning new forms of subjectivity that escape traditional gender binaries […] current representations of cyborgs […] seem to work more to reconstruct them anew’ (Burman 1999: 178).
Cyberfeminism critiques several contemporary technologies, prominent among them being those of Artificial Life. Artificial Life or ALife is an interdisciplinary science that combines biology and computer science, and which seeks to simulate and synthesise life. MIT, the University of Sussex, and the Santa Fe Institute, New Mexico, are some of the world’s leading centres for ALife research. In ALife, organisms are regarded as autonomous information-processing and replicating systems which evolve and self-organise. In 1997
the fourth European Conference on Artificial Life was held at Brighton, in collaboration with the Brighton Media Centre. One of the exhibitions was called LifeLike. A new computer game called Creatures was exhibited here. Creatures involved breeding and rearing artificial life forms called Norns. Norns live in Albia (a virtual world). The player of the game plays the role of a parent, starting from incubation and moving through caring for and educating these Norns (words appear in speech bubbles on the screen). Norns start conversing with the ‘parent’, and live for 15 hours. With remarkable foresight, the computer game menu also has a funeral kit, with photograph and headstone. ALife is about self-organisation and the emergence of order and complexity out of ‘simpler’, i.e., less complex origins. It is meant to be autonomous. Sarah Kember’s cyberfeminist perspective on ALife is a particularly perceptive reading of the ‘politics’ of artificial life. Kember argues that ALife’s emphasis on autonomy and self-organisation is a masculine fantasy projected onto an imaginary (or virtual) world of ‘innocent’ or ‘natural’ agents. Kember also argues that this fantasy is basically an abdication of responsibility. She believes that life elements such as emotions and the body, which are historically formed and feminised over long evolutionary periods, are now seen only in functionalist terms. Kember quotes Alison Adam: ‘There is no room for passion, love and emotion in the knowledge created in ALife worlds’ (Kember 2000: 40). She then argues that the autonomous, self-organising and emergent ‘agent’ is modelled on the myth of the essentially masculine, rational, unitary humanist subject. Adapting from Haraway and other thinkers, Kember argues that a ‘hands-on’ engagement with the actual ‘construction’ of ALife, to challenge its biological and evolutionary determinism, is essential. We need to talk about the ‘materialisation’ (borrowing Judith Butler’s term)—in the sense of social and individual responsibility—of this life-as-it-could-be (ibid.: 3–24).
I shall conclude this section on gender with a discussion of what I believe is, to date, the most convincing cyberfeminist argument on identity and technology: the work of Dianne Currier. Dianne Currier argues that ‘in the transformative scenarios of cyberspace, gender becomes a key site of transformation insofar as it is understood to be information inscribed onto a material body that can be transcended through disembodiment or virtual
rearticulation’ (Currier 2002: 529). Here gender-as-information operates in two modes:
(i) Distinct from yet affixed to materially sexed bodies, such that the two cannot be readily detached in the real world; however, through disembodiment or virtuality gender patterns may be eluded.
(ii) Within pure information space, it is a free-floating pattern.
Currier argues that any account which takes this unproblematic formulation of gender as detachable information to be the locus of transformation retains the binary logic, is confined to the economy of identity, and is ultimately unable to secure transformation (ibid.: 528–29). The logic of identity is 1+1. It begins with an original, self-identical entity, which is added to by some exterior, prosthetic element. The element is understood only in terms of its difference from the original, as ‘not-original’. For cyberfeminism to be effective, argues Currier, we need to rethink bodies and technologies outside this binary of original/non-original (ibid.: 529–30).
Adapting from the work of Elizabeth Grosz and Deleuze and Guattari, Currier writes of bodies as ‘assemblages’. Assemblages are functional conglomerations of elements. The component elements are not seen as unified, stable or self-identical entities. In each assemblage, the forces and flows of components meet with and link to the forces and flows of other elements; the resultant distribution of these meetings constitutes the assemblage. Assemblages are thus multiplicities, which are of two types:
(i) Multiplicities that are different in degree are those which are articulated in relation to identity or unity of origin. This type is structured around a central entity that serves as a benchmark for understanding difference. Difference is thus decided and understood in terms of the original ‘centre’.
(ii) Multiplicities that are different in kind are differences that are not seen in relation to any prior entity. They are different in themselves. They change in their very nature as they divide. (ibid.: 531–32)
Bodies can be treated as assemblages, as collections of disparate flows and impulses, instead of as organised, coherent objects.
Currier writes: ‘They take shape within a complex field of relations with the flows and intensities of surrounding objects, knowledges, geographies, and institutional practices in transitory, functional assemblages’ (Currier 2002: 533). The following implications arise from such a view:
(i) If assemblages are temporary aggregations of multiplicities, there is no assumption of a stable, unified body prior to its encounter with technology.
(ii) The encounter between bodies and technology occurs in a field of other intersections.
(iii) The mode of meeting of bodies and technology is never one of simple addition (body + technology, 1+1). Rather, each meeting gives rise to a new configuration of bodies and technologies and all other elements of an assemblage.
(iv) The task, then, is to track what configurations of bodies, technologies, practices, objects and discourses emerge within particular assemblages.
A body in cyberspace is not simply split into (a) a materiality that is excluded, and (b) an electronic/informational body activated in cyberspace. Rather, the energy and impulses of bodies and electronic circuitry combine and find new forms, traversed by flows of light, information, signs, sexuality, sociality and conversation that give rise to differing meanings, experiences and configurations of bodies and technologies. We can now approach technology, masculinity–femininity, technoscientific discourses and military–industrial complexes as a series of assemblages. We can see how associations of masculinity with technologies of computing function, and on what basis women are articulated as incompatible with those technologies. Structures of knowledge and power are now seen as functional elements of an assemblage rather than as overarching and transcendent structures. Cyberspace is thus not simply a technologically generated information space. It is a series of assemblages comprising elements of the technical, social, discursive, material and immaterial. We need to map out these assemblages to understand how power works/flows along them. We can then discover movements that also traverse these assemblages (Currier 2002: 536–37). This is a posthuman view, where the human ‘subject’ is actually seen as an assemblage, with her/his
subjectivity (and there may not even be a her or him in cyberspace) disseminated into and through prostheses that work back into the body/mind. The posthuman ‘subject’ is multiple and one, unified and yet scattered. Technology is within the human, just as the human exists within technology.
WOMEN’S CYBERFICTION AND FEMINIST CYBERPUNK
Women cyberfiction writers pay careful attention to the structures of power and domination in contemporary technoscience. Rethinking male conceptions of cyberspace, these writers expose the anxiety over reproduction, disembodiment and stable identities as a specifically masculine anxiety. They argue that a woman’s experience of such a cyber-anxiety is quite different. Further, the feminisation of cyberspace (the ‘matrix–mater’, as we have seen in Mary Ann Doane’s arguments) does not exist here. In fact, in Pat Cadigan’s classic, Synners (1991), the males are the receptors, and they are implanted with sockets characterised as ‘the female … the receiver’. Amelia DeLoach, in a piece in CMC Magazine (1 March 1996), argues that Cybergrrrls are to the WWW what Riot Grrrls are to music and Guerrilla Grrrls are to art. Faith Wilding argues that these grrrl groups are a manifestation of a new subjective and cultural feminine representation in cyberspace. There are several categories of groups: anti-discrimination protests, medical self-help, artistic self-promotion, sexual exhibitionism, female porn, transgender exhibitionism, lesbian separatism, job services and others. As Wilding puts it: ‘Cybergrrlish lines of flight are important as vectors of investigation, research, invention, and affirmation’ (). In terms of popular culture, Pat Cadigan, Lisa Mason and Laura Mixon represent a feminist cyberpunk, situated by critics such as Mary Catherine Harper (1995) and Karen Cadora (1995) as oppositional to the masculinist cyberpunk of Gibson and Sterling. C.L. Moore was already writing about the relationship between humans and machines in the 1940s. Anne McCaffrey and Alice B. Sheldon (under the pseudonym James Tiptree, Jr.) had cyborgs in several of their texts. Among the newer writers, Cadigan, Mason, Mixon and Melissa Scott appropriated the styles of cyberpunk in
order to fictionalise women’s relationship with the new technologies. Austin Booth’s survey of these writers is a useful summary of the genre.

Women cyberpunk writers tend to explore ideas about imperfect bodies, using disability and deformity as themes. They also explore the manipulation of both male and female bodies, complicating notions of gender norms, heterosexual desire, race and class (Flanagan 2002: 433). Cyberfeminist writers also present a dystopian future of class war, economic suffering and exploitation. While they accept, just as male cyberpunk authors do, that pleasure is possible with VR technologies—perhaps only within VR—they point to the fact that these technologies are mostly controlled by the same corporate and bureaucratic powers that control everyday life. Women’s cyberfiction assumes that there is no ‘outside’ position from which to critique these technologies. Melissa Scott’s Trouble and Her Friends (1994) and Mary Rosenblum’s Chimera (1993) both depict worlds where women are kept in an underclass that is denied any technology.

Unlike cult (male) cyberpunk, which is mostly about ‘console cowboys’ and hackers, feminist cyberpunk is full of people of colour, illegal workers, handicapped characters, lesbians, the poor and the homeless. These texts argue that women of colour, the poor, lesbians and gays, the aged, and the disabled are increasingly denied access to newer technologies. When they do have access, it is to technologies that are obsolete or less sophisticated. Social difference is marked by the type of technology one uses. Such cyberfeminist texts emphasise the need to distinguish between the specific types of technologies used, since every technology has a particular meaning and historical context.

The women protagonists of feminist cyberpunk use technology for several reasons. Primary among them is survival in an increasingly hostile, poor, environmentally unstable and exploitative world. However, they also use technology to challenge evils such as environmental destruction, military organisations, global corporations and information control. In some of these texts, such as Misha’s Red Spider White Web (1990) and Pat Murphy’s The City, Not Long After (1989), women redefine technology to include communication, emotion and magic. Works by Alice B. Sheldon and Pat Cadigan draw links between technology, magic and spiritualism (though there is a strong mystic–shamanic–spiritual dimension in
‘male’ cyberpunk too, as Dani Cavallaro has demonstrated in his book on Gibson).

Feminist cyberfiction frequently uses the figure of the alien to talk about gendered and raced ‘others’. These writers conflate one type of alienation with another: self/other maps onto human/alien, and this mapping is then extended to man/woman or white/nonwhite (Booth 2002: 31). Many of the characters in feminist cyberpunk are biracial, thus representing the ‘inappropriate/d others’. They are neither inside nor outside any cultural group, and so provide a vantage point from which to critique cultural operations. Feminist cyberpunk describes the creation of systems of difference and discrimination within technocultures. Therefore, the popular cyborg image is female in such texts, and serves to demonstrate how women and robots (female robots or ‘fembots’) are constructed and dehumanised in similar ways. (However, these authors are also aware of the tension inherent in such appropriations of the cyborg: the tension between the Harawayan image of the cyborg as emancipatory and the more radical cyberfeminist view of it as a figure of oppression.)

For cyberfeminists the cyborg—a hybrid of Self and Other—calls attention to the constructed nature of categories such as ‘human’ and ‘machine’ and the permeable boundaries between them. Cyberfeminist texts use the figure of the cyborg to present a condition of multiple and dispersed subjectivity. Anne Balsamo argues that the cyborg becomes a useful figure for reflecting on the ways in which women’s subjectivities and identities have always been fragmented (quoted in ibid.: 33). The posthuman may well be a transgendered being. Feminist cyberfiction points out, however, that we cannot equate the challenge posed by biracial characters to systems of racial difference with the challenge posed by cyborg characters to systems of human–machine difference. Austin Booth suggests that the genre poses a series of questions: Are all women cyborgs because women have mechanical body parts, or because women participate in technology-based economies as workers, or because women have historically experienced fragmented identities? How do we
characterise different women’s very different relationships to their cyborgian nature, both in terms of their relationship to technology and to various, multiple systems of difference? (Booth 2002: 34)

Male cyberpunk writers tend to privilege disembodiment and search for techno-transcendence. The separation of mind and body in cyberpunk leaves masculinity undisturbed and unthreatened. In feminist cyberpunk the female protagonists do not want to ‘leave’ their bodies. Unlike male cyberpunk, works by authors such as Cadigan and Rosenblum depict female characters who see cyberspace not as an escape from embodiment, but as a means of enhancing or multiplying embodiment. They wish, therefore, to reclaim and resituate embodiment precisely because the body is the site of being marked or written (as ‘black’ or ‘woman’). Male protagonists in ‘traditional’ cyberpunk see their bodies as unmarked, unmediated by race or gender. Cyberfeminists, by emphasising the materiality of the body, point to the importance of the marking of the body in the experience of subjectivity (experiencing subjectivity and identity as black, as woman and so on).

In an innovative feminist reading of computers and VR, Zoe Sofia presents the computer as a ‘transitional object’. A tool, in the psychoanalysis of Melanie Klein, is a transitional object that is a part of the self projected into the world, and a part of the world introjected for human designs. Tools are midway between the Self and the Other, and guard against complete object loss, for humans can always have these ‘children of the mind to love’ (psychoanalytic anthropologist Geza Roheim, quoted in Sofia 1999: 58). The computer is an exemplary transitional technology. It functions as a projection of certain parts of the mind (language and formal logic), and produces an uncanny effect of the second self between Self and Other, subject and object. The computer permits a ‘quasi-tactile’ manipulation of objects that exist on the boundary between the physical and the abstract. Ambiguous gender, for Sofia, is a similar transitional aspect of computers. The computer’s feminine associations (mother board, consoles) contrast with the heavy, masculine machinery of industry. The computer occupies what Sofia terms a ‘Jupiter Space’ (recalling the birth of Athena from the forehead of Jupiter, after he had ingested the pregnant Metis, the goddess of wisdom). Jupiter Space (the term
comes from 2001: A Space Odyssey) refers to the womb-like brain of the computer. It names the ideological conception of technological artifacts, information, and feminine ‘forms’ as products of the rational, masculine brain. Female fertility, in the Athena myth, is cannibalised and displaced onto masculine intellectual production. The masculine brain functions as a womb and produces a new body (technology, artifacts) from the cannibalised mother. Sofia argues that femininity and maternity are displaced onto corporate and masculine technological fertility. For, instead of a female–mother–body, we find the male–mother–mind (Sofia 1999: 58–60).
POSTCOLONIAL AND MULTICULTURAL FEMINIST SCIENCE STUDIES

Sandra Harding’s recent book, Is Science Multicultural? (1998), addresses significant questions about feminism, technology, postcolonial studies and the ‘formation’ of knowledge. Postcolonial science studies use as their evidence not traditional European history but disseminated, postcolonial, multicultural and global histories. In contrast with earlier histories, which were isolationist—treating the histories of Europe or America as self-contained and separate—these postcolonial studies look at how cultures have always intermingled and interacted. The ‘subject’/speaker in postcolonial studies is not the white man but the non-European Other. The standpoint here is that of vulnerable people, the ‘have-nots’ among the world’s masses. Postcolonial feminism starts with the lives of those people upon whose exploitation the legitimacy of the dominant system depends (Harding 1998: 17).

Harding posits a ‘strong objectivity’ standpoint, where thought starts with the marginalised and the silenced. It looks at the women (and men) whose lives are affected by technologies, but who have no role in the epistemologies that create these technologies. Standpoint epistemology identifies the Eurocentric and androcentric elements in the very conceptual frameworks that are used to analyse scientific and technological change. It looks at the social relations that have valorised certain standpoints and knowledges in its
search for more competent standards for maximising objectivity (Harding 1998: 18).

Postcolonial feminist science studies pay attention to the role of non-European cultures in the Enlightenment and the ‘scientific revolution’ of Europe. (Joseph Needham’s work on the history of Chinese science and technology, for instance, notes the many ‘diffusions’ of Chinese knowledge into European sciences.) Science, these studies demonstrate, has always been ‘multicultural’, since elements of the knowledge traditions of many different non-European cultures have been incorporated within modern science (ibid.: 35). Such studies call for ‘an epistemological equality’ (ibid.: 34), where different cognitive systems are treated on a par. The distinction between ‘true knowledge’ and ‘local belief’—which sustains European notions of development, modernity and science itself—must go. This simply means a fuller appreciation of non-European scientific cultures. Traditional science studies have been hampered by the Eurocentrism of conventional histories, philosophies and social studies of science. Feminist standpoint studies—an important constituent of postcolonial feminist science studies—seek to use the unique resources of women’s particular social locations to identify the masculinist underpinnings of institutions and practices.

Postcolonial feminism focuses on:
(i) the resources and limitations in the different cultures within which women struggle against male supremacy, cultural imperialism, and economic deprivation;
(ii) analysing the mutually causal effects of changing gender relations and changes in global political economies; and
(iii) how the Eurocentrism of northern (i.e., developed nations’) assumptions exacerbates the effects of each culture’s existing androcentrism, leading to the further deterioration of women’s resources (ibid.: 77).
Thus the role of race in the development of ethnography and science comes in for attention from scholars such as Nancy Stepan (The Idea of Race in Science). Feminist critiques of development such as Ester Boserup’s Women’s Role in Economic Development are also postcolonial feminist in their orientation.
Postcolonial feminist science studies begin with the realities of the lives of women of other cultures in the North and South, and of women’s position in the global political economy. They re-examine scientific and technological change from the standpoint of these lives. Postcolonial feminist analyses have been produced primarily in the context of struggles against and within international agencies and local patriarchies, and of grassroots organising in the South. They look at technologies and applied sciences, rather than pure science, when they focus on the ‘lived’ science of the people. In these studies ‘gender’ is not simply a thing attached to individuals, but an attribute of social structures and symbolic systems. Gender is always interlocked with class, race and ethnicity, sexuality and other discursive formations and institutions. Further, where Northern feminist science studies analyse ‘nature’, postcolonial feminist science studies look at the environment. That is, they look at the surroundings with which humans regularly interact in daily subsistence struggles. ‘Nature’ here is frequently used with a moral–spiritual dimension (Harding 1998: 80–87).

Standpoint epistemology proposes that resources for the production of knowledge are to be found in women’s lives rather than in the large (androcentric) conceptual frameworks of disciplines. It pays attention, therefore, to politically and culturally assigned locations in social hierarchies. It works its way backwards from lives to concepts/theories, rather than the other way round (the latter being a feature of standard histories of science). It identifies gender roles in work, for instance, and asks questions about why women are assigned such roles. It sets the relationship between knowledge and politics at the centre of accounts of science, and explains how different political and social arrangements have produced certain knowledges (ibid.: 153).
NOTES

1. For a study of the Whole Body Camera see a brief paper by J.P. Siebert and J.W. Patterson, titled, appropriately, ‘Captivating Models’, in IEE Colloquium on Computer Vision for Virtual Human Modelling, June 1998 (available online, with illustrations, at ). The technology (including Turing Institute’s
C3D and 3D-MATIC) is extremely useful for filmmaking, the authors point out, especially to create virtual actors (called ‘synthespians’). Idoru, in William Gibson’s novel of the same name, is such a virtual construct—a rock star who exists only in cyberspace. Gibson points to the realism of virtuality when another rock star falls in love with Idoru and decides to marry her.
2. For empirical studies on the impact of ICTs and global capital on women’s employment see Mitter and Rowbotham (1997). See also Joshi (n.d.) for case studies.
3. For a range of feminist readings of techno-thrillers and cyborg images in popular Hollywood cinema see essays by James Kavanagh, Judith Newton, Barbara Creed, Vivian Sobchack and Anne Cranny-Francis in Annette Kuhn (1995 [1990]).
4. Women’s groups proliferated on the Net, especially after the widespread use of the Internet by the mid-1990s:
   Rosie X started the cyberzine Geekgirl in January 1995.
   Aliza Sherman started the Cybergrrrl Webstation in January 1995.
   Amelia Wilson launched NerdGrrrl! in November 1995.
   Lynda Weinman started the Homegurrrl site in March 1995.
5. Judith Butler has argued that the taboo against homosexuality precedes the heterosexual incest taboo. Thus homosexuality and heterosexuality—identification and desire—are bound together in the formation of subjectivity (see her Gender Trouble [1990]).
SUGGESTED READING

Eisenstein, Zillah. 1998. Global Obscenities: Patriarchy, Capitalism and the Lure of Cyberfantasy. New York: New York University Press.
Harding, Sandra. 1986. The Science Question in Feminism. New York: Cornell University Press.
Hekman, Susan J. 1990. Gender and Knowledge: Elements of a Postmodern Feminism. Cambridge: Polity.
Mitter, Swasti and Sheila Rowbotham (eds). 1997. Women Encounter Technology: Changing Patterns of Employment in the Third World. London and New York: Routledge.
Wajcman, Judy. 1993. Feminism Confronts Technology. Cambridge: Blackwell.
Wolmark, Jenny (ed.). 1999. Cybersexualities: A Reader on Feminist Theory, Cyborgs and Cyberspace. Edinburgh: Edinburgh University Press.
BIBLIOGRAPHY AND WEBLIOGRAPHY

Aarseth, Espen. 1999. ‘Aporia and Epiphany in Doom and The Speaking Clock: The Temporality of Ergodic Art’. In Marie-Laure Ryan (ed.), Cyberspace Textuality: Computer Technology and Literary Theory, pp. 31–41. Indiana: Indiana University Press. Adam, Alison. 2002. ‘The Ethical Dimension of Cyberfeminism’. In Mary Flanagan and Austin Booth (eds), Reload: Rethinking Women + Cyberculture, pp. 158–74. Cambridge, MA: MIT. Adorno, Theodor and Max Horkheimer. 1979. The Dialectic of Enlightenment. Tr. John Cumming. London: Verso. Alexander, Jeffrey C. and Steven Seidman. 1990. Culture and Society: Contemporary Debates. Cambridge: Cambridge University Press. Amin, Ash. 2000. ‘The Economic Base of Contemporary Cities’. In Gary Bridge and Sophie Watson (eds), A Companion to the City, pp. 115–19. Oxford: Blackwell. Anderson, Warwick. 2002. ‘Postcolonial Technoscience’. Social Studies of Science, 32(5–6): 643–58. Appadurai, Arjun. 2000 [1990]. ‘Disjuncture and Difference in the Global Cultural Economy’. In John Benyon and David Dunkerley (eds), Globalization: The Reader, pp. 92–104. London: Athlone. Archibugi, Daniele and Jonathan Michie. 1997. ‘The Globalisation of Technology: A New Taxonomy’. In Daniele Archibugi and Jonathan Michie (eds), Technology, Globalisation and Economic Performance, pp. 172–96. Cambridge: Cambridge University Press. ———. (eds). 1997. Technology, Globalisation and Economic Performance. Cambridge: Cambridge University Press. Armitage, John. 2000. ‘Paul Virilio: An Introduction’. In John Armitage (ed.), Paul Virilio: From Modernism to Hypermodernism and Beyond, pp. 1–24. London: Sage. Armitt, Lucie (ed.). 1991. Where No Man has Gone Before: Women and Science Fiction. London: Routledge. Aurigi, Alessandro and Stephen Graham. 2000. ‘Cyberspace and the City: The Virtual City in Europe’. In Gary Bridge and Sophie Watson (eds), A Companion to the City, pp. 489–502. Oxford: Blackwell.
Balsamo, Anne. 1996. Technologies of the Gendered Body: Reading Cyborg Women. Durham and London: Duke University Press. ———. 1998 [1995]. ‘Forms of Technological Embodiment: Reading the Body in Contemporary Culture’. In Mike Featherstone and Roger Burrows (eds), Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment, pp. 215–38. London: Sage. ———. 1999. ‘Reading Cyborgs Writing Feminism’. In Jenny Wolmark (ed.), Cybersexualities: A Reader on Feminist Theory, Cyborgs and Cyberspace, pp. 145–56. Edinburgh: Edinburgh University Press. Barber, Benjamin R. 1984. Strong Democracy: Participatory Politics for a New Age. Berkeley: University of California Press. ———. 1995. Jihad vs McWorld. New York: Times. Barber, John. Cybernetic Engines. Barlow, John. 1996. Barrett, Edward. 1994. Sociomedia: Multimedia, Hypermedia, and the Social Construction of Knowledge. Cambridge, MA: MIT. Barry, John. 1999. Environment and Social Theory. London and New York: Routledge. Barthes, Roland. 1975. S/Z. Tr. Richard Miller. London: Jonathan Cape. Baruch, Elaine Hoffman, Amadeo F. D’Adamo Jr. and Joni Seager (eds). 1988. Embryos, Ethics, and Women’s Rights: Exploring the New Reproductive Technologies. New York and London: Harrington. Baudrillard, Jean. 1983. Simulations. Tr. Paul Foss et al. New York: Semiotext(e). ———. 1988. America. Tr. Chris Turner. London: Verso. ———. 1990a. Seduction. Tr. Brian Singer. London: Macmillan. ———. 1990b. Cool Memories. Tr. Chris Turner. London: Verso. ———. 2000 [1985]. ‘The Masses: The Implosion of the Social in the Media’. In Paul Marris and Sue Thornham (eds), Media Studies: A Reader, pp. 99–108. New York: New York University Press. Bauman, Zygmunt. 2000. Liquid Modernity. Cambridge: Polity. Bayer, Betty M. 1999. ‘Psychological Ethics and Cyborg Body Politics’. In Angel Gordo-Lopez and Ian Parker (eds), Cyberpsychology, pp. 113–29. London: Macmillan. Bell, Daniel. 1973. The Coming of Post-Industrial Society: A Venture in Social Forecasting. Harmondsworth: Penguin. Belling, Catherine. 1998. ‘Reading the Operation: Television, Realism, and the Possession of Medical Knowledge’. Literature and Medicine, 17(1): 1–23. Bender, Gretchen and Timothy Druckrey (eds). 1994. Culture on the Brink: Ideologies of Technology. Seattle: Bay Press.
Benedikt, Michael (ed.). 1991. Cyberspace: First Steps. Cambridge, MA: MIT. Benjamin, Walter. 1969. Illuminations. New York: Schocken. Bennett, Tony. 1998. Culture: A Reformer’s Science. London: Sage. Ben-Tov, Sharona. 1995. The Artificial Paradise: Science Fiction and American Reality. Ann Arbor: University of Michigan Press. Benyon, John and David Dunkerley (eds). 2000. Globalization: The Reader. London: Athlone. Bergman, Simone and Liesbet van Zoonen. 1999. ‘Fishing with False Teeth: Gender and the Internet’. In John Downey and Jim McGuigan (eds), Technocities, pp. 90–107. London: Sage. Bijker, Wiebe E. and Karin Bijsterveld. 2000. ‘Women Walking through Plans: Technology, Democracy, and Gender Identity’. Technology and Culture, 41(3): 485–515. Bijker, Wiebe E., Thomas P. Hughes and Trevor J. Pinch (eds). 1997. The Social Construction of Technological Systems. Cambridge, MA: MIT. Birringer, Johannes (ed.). 2002. ‘Dance and Media Technologies’. Performing Art Journal, 70: 84–93. Blair, Ann. 2003. ‘Reading Strategies for Coping with Information Overload ca 1550–1700’. Journal of the History of Ideas, 64(1): 11–28. Bleier, Ruth. 1986. Feminist Approaches to Science. New York: Pergamon. Bolter, David Jay. 1991. Writing Space: The Computer, Hypertext and the History of Writing. New Jersey: Lawrence Erlbaum. Booth, Austin. 2002. ‘Women’s Cyberfiction: An Introduction’. In Mary Flanagan and Austin Booth (eds), Reload: Rethinking Women + Cyberculture, pp. 25–40. Cambridge, MA: MIT. Booth, Austin and Mary Flanagan. 2002. ‘Introduction’. In Mary Flanagan and Austin Booth (eds), Reload: Rethinking Women + Cyberculture, pp. 1–24. Cambridge, MA: MIT. Borsook, Paulina. 1995. ‘The Goddess in Every Woman’s Machine’. Wired, 3(07). Boserup, Esther. 1970. Women’s Role in Economic Development. London: Earthscan. Bourdieu, Pierre. 1999 [1989]. Distinction: A Social Critique of the Judgement of Taste. Tr. Richard Nice. London: Routledge. ———. 2000. Pascalian Meditations. Tr. Richard Nice. Cambridge: Polity. Bowie, Jennifer L. ‘Student Problems with Hypertext and Webtext: A Student-Centered Hypertext Classroom?’ Bowlby, Rachel. 2000. Carried Away: The Invention of Modern Shopping. London: Faber and Faber. Bricken, Meredith. 1991. ‘Virtual Worlds: No Interface to Design’. In Michael Benedikt (ed.), Cyberspace: First Steps, pp. 363–82. Cambridge, MA: MIT.
Bridge, Gary and Sophie Watson (eds). 2000. A Companion to the City. Oxford: Blackwell. Briggs, Asa and Peter Burke. 2002. A Social History of the Media from Gutenberg to the Internet. Cambridge: Polity. Browner, Stephanie, Stephen Pulsford and Richard Sears. 2000. Literature and the Internet: A Guide for Students, Teachers, and Scholars. New York and London: Garland. Brook, James and Iain A. Boal (eds). 1995. Resisting the Virtual Life: The Culture and Politics of Information. San Francisco: City Lights. Burman, Erica. 1999. ‘The Child and the Cyborg’. In Angel Gordo-Lopez and Ian Parker (eds), Cyberpsychology, pp. 169–83. London: Macmillan. Butler, Judith. 1990. Gender Trouble: Feminism and the Subversion of Identity. New York: Routledge. Cadora, Karen. 1995. ‘Feminist Cyberpunk’. Science Fiction Studies, 22: 357–72. Castells, Manuel. 1989. The Informational City: Information Technology, Economic Restructuring and the Urban-Regional Process. Oxford: Blackwell. ———. 1996. The Rise of the Network Society. Oxford: Blackwell. ———. 2000 [1998]. End of Millenium: The Information Age: Economy, Society and Culture, Vol. II. Oxford: Blackwell. Cavallaro, Dani. 2000. Cyberpunk and Cyberculture: Science Fiction and the Work of William Gibson. London and New Brunswick, NJ: Athlone. Chaloner, Jack. 1999. The Baby Makers: The History of Artificial Conception. Basingstoke: Channel 4 – Macmillan. Chaney, David. 1994. The Cultural Turn: Scene-setting Essays on Contemporary Cultural History. London and New York: Routledge. Chapman, Nigel and Jenny Chapman. 2000. Digital Multimedia. Chichester: John Wiley. Christensen, Damaris. 2002. ‘In Silico Medicine’. Science News, 162 (14 December): 378–80. Clark, Nigel. 1998 [1995]. ‘Rear-View Mirrorshades: The Recursive Generation of the Cyberbody’. In Mike Featherstone and Roger Burrows (eds), Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment, pp. 113–33. London: Sage. Clarke, Julie. 2000. ‘The Sacrificial Body of Orlan’. In Mike Featherstone (ed.), Body Modification, pp. 185–208. London: Sage. Clifford, James and George E. Marcus (eds). 1990. Writing Culture: The Poetics and Politics of Ethnography. Delhi: Oxford University Press. Cockburn, Cynthia. 1985. Machinery of Dominance: Women, Men and Technical Know-How. London: Pluto. Conley, V.A. (ed.). 1993. Rethinking Technologies. Minneapolis: Minnesota University Press.
Conway, Maura. 2002. ‘What is Cyberterrorism?’ Current History (December). Cooter, Roger. 1984. The Cultural Meaning of Popular Science: Phrenology and the Organization of Consent in Nineteenth-Century Britain. Cambridge: Cambridge University Press. Corrigan, Peter. 1997. The Sociology of Consumption: An Introduction. London: Sage. Crane, Diana (ed.). 1994. The Sociology of Culture. Oxford: Blackwell. Crang, Mike, Phil Crang and Jon May (eds). 1999. Virtual Geographies: Bodies, Space and Relations. London and New York: Routledge. Cubitt, Sean. 1998. Digital Aesthetics. London: Sage. Currier, Dianne. 2002. ‘Assembling Bodies in Cyberspace: Technologies, Bodies, and Sexual Difference’. In Mary Flanagan and Austin Booth (eds), Reload: Rethinking Women + Cyberculture, pp. 519–38. Cambridge, MA: MIT. Cutting Edge, The Women’s Research Group (eds). 2000. Digital Desires: Language, Identity and New Technologies. London and New York: I.B. Tauris. Danet, Brenda. 2001. Cyberpl@y: Communicating Online. New York: Oxford University Press. Darley, Andrew. 2000. Visual Digital Culture: Surface Play and Spectacle in New Media Genres. London and New York: Routledge. Davis, Erik. ‘Technopagans’. Wired, 3(07). Davis, Kathy. 1995. Reshaping the Female Body: The Dilemma of Cosmetic Surgery. New York and London: Routledge. ———. 1997. ‘“My Body is My Art”: Cosmetic Surgery as Feminist Utopia?’. European Journal of Women’s Studies, 4(1): 23–37. Davis, Mike. 1992 [1990]. City of Quartz: Excavating the Future in Los Angeles. London: Vintage. Dear, Michael and Stephen Flusty. 1999. ‘The Postmodern Urban Condition’. In Mike Featherstone and Scott Lash (eds), Spaces of Culture: City–Nation–World, pp. 64–85. London: Sage. deLahunta, Scott. 2002. ‘Virtual Reality and Performance’. Performing Art Journal, 70: 105–14. De Loach, Amelia. 1996. ‘Grrrls Exude Attitude’. CMC Magazine.
Denard, Hugh. 2002. ‘Virtuality and Performance: Recreating Rome’s Theatre of Pompey’. Performing Art Journal, 70: 5–15. Der Derian, James. 1997. ‘The Virtualization of Violence and the Disappearance of War’. Cultural Values, 1(2): 205–18. Derrida, Jacques. 1976. Of Grammatology. Tr. Gayatri Chakravorty Spivak. Baltimore: Johns Hopkins University Press. Derrida, Jacques and Bernard Stiegler. 2002. Echographies of Television: Filmed Interviews. Tr. Jennifer Bajorek. Cambridge: Polity.
Dery, Mark (ed.). 1994. Flame Wars: The Discourse of Cyberculture. Durham: Duke University Press. Dibbell, Julia. 1994. ‘A Rape in Cyberspace: or, how an evil clown, a Haitian trickster spirit, two wizards, and a cast of dozens turned a database into a society’. In Mark Dery (ed.), Flame Wars: The Discourse of Cyberculture, pp. 237–61. Durham: Duke University Press. Dietrich, Dawn. 1997. ‘(Re)-Fashioning the Techno-Erotic Woman: Gender and Textuality in the Cybercultural Matrix’. In Steven G. Jones (ed.), Virtual Culture: Identity and Communication in Cyberculture, pp. 169–84. London: Sage. Doane, Mary Ann. 1999. ‘Technophilia: Technology Representation and the Feminine’. In Jenny Wolmark (ed.), Cybersexualities: A Reader on Feminist Theory, Cyborgs and Cyberspace, pp. 20–33. Edinburgh: Edinburgh University Press. Downey, Greg. 2002. ‘Virtual Webs, Physical Technologies, and Hidden Workers: The Spaces of Labor in Information Internetworks’. Technology and Culture, 42(2): 209–35. Downey, John and Jim McGuigan (eds). 1999. Technocities. London: Sage. Doyle, Julie and Kate O’Riordan. 2002. ‘Virtually Visible: Female Cyberbodies and the Medical Imagination’. In Mary Flanagan and Austin Booth (eds), Reload: Rethinking Women + Cyberculture, pp. 239–60. Cambridge, MA: MIT. Dujay, John. ‘Will E-Books make Paper Ones Obsolete?’ Dunnigan, James F. 1996. Digital Soldiers: The Evolution of HighTech Weaponry and Tomorrow’s Brave New Battlefield. New York: St. Martin’s. Dutton, Kenneth R. 1995. The Perfectible Body: The Western Ideal of Physical Development. London: Cassell. Dutton, William H. 1999. Society on the Line: Information Politics in the Digital Age. Oxford: Oxford University Press. Eisenstein, Elizabeth L. 1991 [1979]. The Printing Press as an Agent of Change: Communications and Cultural Transformations in Early-Modern Europe. Cambridge: Cambridge University Press. Eisenstein, Zillah. 1998. Global Obscenities: Patriarchy, Capitalism and the Lure of Cyberfantasy. New York: New York University Press. Erikson, Erik Oddvar and John Erik Fossum (eds). 2000. Democracy in the European Union: Integration through Deliberation? London and New York: Routledge. Escobar, Arturo. 1996. ‘Welcome to Cyberia: Notes on the Anthropology of Cyberculture’. In Ziauddin Sardar and Jerome R. Ravetz (eds),
Cyberfutures: Culture and Politics on the Information Superhighway, pp. 111–37. New York: New York University Press. Everard, Jerry. 2000. Virtual States: The Internet and the Boundaries of Nation–states. London and New York: Routledge. Feather, John. 2000 [1994]. The Information Society: A Study of Continuity and Change. London: Library Association. Featherstone, Mike (ed.). 1999. Love and Eroticism. London: Sage. ——— (ed.). 2000. Body Modification. London: Sage. Featherstone, Mike and Roger Burrows (eds). 1998 [1995]. Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment. London: Sage. ———. 1998 [1995]. ‘Cultures of Technological Embodiment: An Introduction’. In Mike Featherstone and Roger Burrows (eds), Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment, pp. 1–19. London: Sage. Featherstone, Mike, Mike Hepworth and Bryan S. Turner (eds). 1995 [1991]. The Body: Social Process and Cultural Theory. London: Sage. Featherstone, Mike and Scott Lash (eds). 1999. Spaces of Culture: City–Nation–World. London: Sage. Ferdinand, Peter (ed.). 2000. The Internet, Democracy and Democratization. London: Frank Cass. Feyerabend, Paul. 1978. Against Method. London: Verso. Firestone, Shulamith. 1970. The Dialectic of Sex: The Case for Feminist Revolution. New York: Bantam. Fitting, Peter. 1991. ‘The Lessons of Cyberpunk’. In Constance Penley and Andrew Ross (eds), Technoculture, pp. 295–316. Minneapolis: University of Minnesota Press. Flanagan, Mary. 2002. ‘Hyperbodies, Hyperknowledge: Women in Games, Women in Cyberpunk, and Strategies of Resistance’. In Mary Flanagan and Austin Booth (eds), Reload: Rethinking Women + Cyberculture, pp. 425–54. Cambridge, MA: MIT. Flanagan, Mary and Austin Booth (eds). 2002. Reload: Rethinking Women + Cyberculture. Cambridge, MA: MIT. Foster, Thomas. 1993. ‘Meat Puppets or Robopaths?: Cyberpunk and the Question of Embodiment’. Genders, 18: 11–31. ———. 2002. ‘“The Postproduction of the Human Heart”: Desire, Identification, and Virtual Embodiment in Feminist Narratives of Cyberspace’. In Mary Flanagan and Austin Booth (eds), Reload: Rethinking Women + Cyberculture, pp. 469–504. Cambridge, MA: MIT. Foucault, Michel. 1983. ‘The Subject and Power’. In Hubert Dreyfus and Paul Rabinow (eds), Michel Foucault: Beyond Structuralism and Hermeneutics. Chicago: Chicago University Press.
Foucault, Michel. 1994 [1974]. The Order of Things: An Archaeology of the Human Sciences. London: Routledge. Franklin, Sarah. 1997. Embodied Progress: A Cultural Account of Assisted Conception. London and New York: Routledge. Friedman, Ted. 1995. ‘Making Sense of Software: Computer Games and Interactive Textuality’. In Steven G. Jones (ed.), Cybersociety: Computer-mediated Communication and Community, pp. 73–89. London: Sage. Frow, John. 1995. Cultural Studies and Cultural Value. Oxford: Clarendon. Fuchs, Cynthia J. 1993. ‘ “Death is Irrelevant”: Cyborgs, Reproduction, and the Future of Male Hysteria’. Genders, 18: 114–15. Garber, Marjorie, Jann Matlock and Rebecca L. Walkowitz (eds). 1993. Media Spectacles. New York and London: Routledge. Gattiker, Urs. 2001. The Internet as a Diverse Community: Cultural, Organizational, and Political Issues. Mahwah, New Jersey: Lawrence Erlbaum. Gauntlett, David (ed.). 2000. Web.studies: Rewiring Media Studies for the Digital Age. London: Arnold. Gauntlett, David. 2000. ‘The Web goes to the Pictures’. In David Gauntlett (ed.), Web.studies: Rewiring Media Studies for the Digital Age, pp. 82–87. London: Arnold. Gibson, William. 1991. ‘Interview with Larry McCaffery’. In Larry McCaffery (ed.), Storming the Reality Studio: A Casebook of Cyberpunk and Postmodern Fiction, pp. 263–85. Durham and London: Duke University Press. Giddens, Anthony. 1991. Consequences of Modernity. Cambridge: Polity. ———. 1998. The Third Way: The Renewal of Social Democracy. Cambridge: Polity. ———. 2000. The Third Way and its Critics. Cambridge: Polity. Gilman, Sander L. 1999. Making the Body Beautiful: A Cultural History of Cosmetic Surgery. Princeton, New Jersey: Princeton University Press. Golding, Peter. 2000 [1998]. ‘World Wide Wedge: Division and Contradiction in the Global Information Infrastructure’. In Paul Marris and Sue Thornham (eds), Media Studies: A Reader, pp. 802–15. New York: New York University Press. Golding, Peter and Phil Harris (eds). 1997. Beyond Cultural Imperialism: Globalization, Communication and the New International Order. London: Sage. Goodall, Jane. 2000. ‘An Order of Pure Decision: Un-Natural Selection in the Work of Stelarc and Orlan’. In Mike Featherstone (ed.), Body Modification, pp. 149–70. London: Sage. Gordo-Lopez, Angel J. and Ian Parker (eds). 1999. Cyberpsychology. London: Macmillan.
Graham, Stephen. 1999. ‘Towards Urban Cyberspace Planning: Grounding Global through Urban Telematics Policy and Planning’. In John Downey and Jim McGuigan (eds), Technocities, pp. 9–33. London: Sage. ———. 2000. ‘The End of Geography or the Explosion of Place?: Conceptualising Space, Place and Information Technology’. In Mark I. Wilson and Kenneth E. Corey (eds), Information Tectonics: Space, Place and Technology in an Electronic Age, pp. 1–28. Chichester: John Wiley. Gray, Chris Hables. 1997. Postmodern War: The New Politics of Conflict. London: Routledge. ———. 2001. Cyborg Citizen: Politics in the Posthuman Age. London and New York: Routledge. Guattari, Felix. 1992. ‘Regimes, Pathways, Subjects’. In Jonathan Crary and S. Kwinter (eds), Incorporations. New York: Zone. Habermas, Jurgen. 1993 [1989]. The Structural Transformation of the Public Sphere: An Inquiry into a Category of Bourgeois Society. Tr. Thomas Burger. Cambridge, MA: MIT. ———. 2000. ‘Beyond the Nation-state?: On Some Consequences of Economic Globalization’. In Erik Oddvar Erikson and John Erik Fossum (eds), Democracy in the European Union: Integration through Deliberation?, pp. 29–41. London and New York: Routledge. ———. 2001. The Postnational Constellation: Political Essays. Tr. and ed. Max Pinsky. Cambridge: Polity. Hakken, David. 1999. Cyborgs@Cyberspace: An Ethnographer Looks to the Future. New York and London: Routledge. Hall, John R. and Mary Jo Neitz. 1993. Culture: Sociological Perspectives. New Jersey: Prentice Hall. Hall, Stuart. 1980. ‘Cultural Studies: Two Paradigms’. Media, Culture and Society, 2(1): 57–72. ———. 1985. ‘Signification, Representation, Ideology: Althusser and the Poststructuralist Debates’. Critical Studies in Mass Communication, 2(2): 91–114. ———. 1996 [1989]. ‘The Meaning of New Times’. In David Morley and Kuan-Hsing Chen (eds), Stuart Hall: Critical Dialogues in Cultural Studies, pp. 223–37. London: Routledge. ———. 1997a. ‘Introduction’. In Stuart Hall (ed.), Representation: Cultural Representations and Signifying Practices, pp. 1–30. London: Sage. ———. (ed.). 1997b. Representation: Cultural Representations and Signifying Practices. London: Sage. ———. 1997c. ‘The Centrality of Culture: Notes on the Cultural Revolutions of our Time’. In Kenneth Thompson (ed.), Media and Cultural Regulation, pp. 208–38. London: Sage.
Hall, Stuart. 2000 [1980]. ‘Encoding/Decoding’. In Paul Marris and Sue Thornham (eds), Media Studies: A Reader, pp. 51–61. New York: New York University Press. Harasim, Linda M. (ed.). 1993. Global Networks: Computers and International Communication. Cambridge, MA: MIT. Haraway, Donna. 1985. Simians, Cyborgs and Women: The Reinvention of Nature. New York: Routledge. ———. 1991a [1985]. ‘A Manifesto for Cyborgs: Science, Technology and Socialist Feminism in the 1980s’. In Donna Haraway (ed.), Simians, Cyborgs and Women: The Reinvention of Nature, pp. 149–81. New York: Routledge. ———. 1991b. ‘Cyborgs at Large’. Interview with Constance Penley and Andrew Ross, Technoculture, pp. 1–20. Minneapolis: University of Minnesota Press. ———. 1991c. ‘The Actors are Cyborg, Nature is Coyote, and the Geography is Elsewhere: Postscript to “Cyborgs at Large”’. In Constance Penley and Andrew Ross (eds), Technoculture, pp. 21–26. Minneapolis: University of Minnesota Press. ———. 1997. Modest_Witness@Second_Millenium.FemaleMan_Meets_Oncomouse™: Feminism and Technoscience. London and New York: Routledge. Harding, Sandra. 1986. The Science Question in Feminism. New York: Cornell University Press. ———. (ed.). 1993. The ‘Racial’ Economy of Science: Toward a Democratic Future. Bloomington and Indianapolis: Indiana University Press. ———. 1998. Is Science Multicultural? Postcolonialisms, Feminisms, and Epistemologies. Bloomington and Indianapolis: Indiana University Press. Harper, Mary Catherine. 1995. ‘Incurably Alien Other: A Case for Feminist Cyborg Writers’. Science Fiction Studies, 22: 399–420. Harris, John and Soren Holm. 1998. The Future of Human Reproduction: Ethics, Choice and Regulation. Oxford: Clarendon. Harrison, Lawrence E. and Samuel P. Huntington (eds). 2000. Culture Matters: How Values Shape Human Progress. New York: Basic. Hartouni, Valerie. 1991. ‘Containing Women: Reproductive Discourse in the 1980s’. In Constance Penley and Andrew Ross (eds), Technoculture, pp. 27–56. Minneapolis: University of Minnesota Press. Harvey, David. 1989. The Condition of Postmodernity: An Enquiry into the Origins of Cultural Change. Oxford: Blackwell. Hayles, N. Katherine. 1993. ‘Virtual Bodies and Flickering Signifiers’. October, 66: 69–71. ———. 1999. How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press. Heap, Nick, Ray Thomas, Geoff Einon, Robin Mason and Hughie Mackay (eds). 1996 [1995]. Information Technology and Society: A Reader. London: Sage.
Hearn, Jeff and Sasha Roseneil (eds). 1999. Consuming Cultures: Power and Resistance. London: Macmillan. ———. 1999a. ‘Consuming Cultures: Power and Resistance’. In Jeff Hearn and Sasha Roseneil (eds), Consuming Cultures: Power and Resistance, pp. 1–13. London: Macmillan. Heidegger, Martin. 1977. ‘The Question Concerning Technology’. In Martin Heidegger (ed.), The Question Concerning Technology and Other Essays. Tr. William Lovitt. New York: Harper Torchbooks. Hekman, Susan J. 1990. Gender and Knowledge: Elements of a Postmodern Feminism. Cambridge: Polity. Holland, Samantha. 1998 [1995]. ‘Descartes Goes to Hollywood: Mind, Body and Gender in Contemporary Cyborg Cinema’. In Mike Featherstone and Roger Burrows (eds), Cyberspace/Cyberbodies/ Cyberpunk: Cultures of Technological Embodiment, pp. 157–74. London: Sage. Hollinger, Veronica. 1991. ‘Cybernetic Deconstructions: Cyberpunk and Postmodernism’. In Larry McCaffery (ed.), Storming the Reality Studio: A Casebook of Cyberpunk and Postmodern Fiction, pp. 203–18. Durham and London: Duke University Press. ———. 2002 [1999]. ‘(Re)reading Queerly: Science Fiction, Feminism, and the Defamiliarization of Gender’. In Mary Flanagan and Austin Booth (eds), Reload: rethinking Women + Cyberculture, pp. 301–20. Cambridge, MA: MIT. Holmes, David (ed.). 1997. Virtual Politics: Identity and Community in Cyberspace. London: Sage. ———. 1997. ‘Virtual Identity: Communities of Broadcast, Communities of Interactivity’. In David Holmes (ed.), Virtual Politics: Identity and Community in Cyberspace, pp. 26–45. London: Sage. Holtzman, Steven. 1997. Digital Mosaics: The Aesthetics of Cyberspace. New York: Simon and Schuster. Hubbard, Ruth. 1992. The Politics of Women’s Biology. New Brunswick, New Jersey: Rutgers University Press. Interrogate the Internet. 1996. ‘Contradictions in Cyberspace: Collective Response’. In Rob Shields (ed.), Cultures of Internet: Virtual Spaces, Real Histories, Living Bodies, pp. 125–32. London: Sage. Irwin, Alan and Brian Wynne (eds). 1998 [1996]. Misunderstanding Science?: The Public Reconstruction of Science and Technology. Cambridge: Cambridge University Press. Jameson, Fredric. 1991 (1984). ‘Postmodernism, or the Cultural Logic of Late Capitalism’. In Fredric Jameson (ed.), Postmodernism, or the Cultural Logic of Late Capitalism. London: Verso. ———. 1993 [1991]. Postmodernism, or the Cultural Logic of Late Capitalism. London and New York: Verso. Jenks, Chris. 1995 [1993]. Culture. London and New York: Routledge.
Jones, Steven G. (ed.). 1995. Cybersociety: Computer-Mediated Communication and Community. London: Sage. ———. 1995. ‘Understanding Community in the Information Age’. In Steven G. Jones (ed.), Cybersociety: Computer-mediated Communication and Community, pp. 10–35. London: Sage. ———. 1997. Virtual Culture: Identity and Communication in Cybersociety. London: Sage. Jordan, Tim. 1999. Cyberpower: The Culture and Politics of Cyberspace and the Internet. London and New York: Routledge. Jordanova, Ludmilla. 1989. Sexual Visions: Images of Gender in Science and Medicine between the Eighteenth and Twentieth Centuries. London: Harvester Wheatsheaf. Joseph, Sarah. 1998. Interrogating Culture: Critical Perspectives on Contemporary Social Theory. New Delhi: Sage. Joshi, Ila (ed.). n.d. Asian Women in the Information Age: New Communication Technology, Democracy and Women. Singapore: Asian Media Information and Communication Centre. Kaplan, E.A. 1987. Rocking Around the Clock: Music Television, Postmodernism and Consumer Culture. New York and London: Routledge. Kar, Prafulla C., Sura P. Rath and Kailash C. Baral (eds). 2002. Theory and Praxis: Curriculum, Culture and English Studies. New Delhi: Pencraft. Katz, John. 1997. ‘Birth of a Digital Nation’. Wired, 5(04). Kaye, Anthony. 1995. ‘Computer Supported Collaborative Learning’. In Nick Heap, Ray Thomas, Geoff Einon, Robin Mason and Hughie Mackay (eds), Information Technology and Society: A Reader, pp. 192–210. London: Sage. Keller, Evelyn Fox. 1983. A Feeling for the Organism: The Life and Work of Barbara McClintock. San Francisco: Freeman. ———. 1985. Reflections on Gender and Science. New Haven: Yale University Press. Kellner, Douglas. 1999. ‘New Technologies: Technocities and the Prospects for Democratisation’. In John Downey and Jim McGuigan (eds), Technocities, pp. 186–204. London: Sage. ———. 2000a. ‘Virilio, War and Technology: Some Critical Reflections’. In John Armitage (ed.), Paul Virilio: From Modernism to Hypermodernism and Beyond, pp. 103–25. London: Sage. ———. 2000b [1995]. Media Culture: Cultural Studies, Identity and Politics between the Modern and the Postmodern. London and New York: Routledge. Kember, Sarah. 2000. ‘Get ALife: Cyberfeminism and the Politics of Artificial Life’. In Cutting Edge (ed.), Digital Desires: Language, Identity and New Technologies, pp. 34–46. London: I.B. Tauris.
Kemp, Sandra. 2000. ‘Technologies of the Face’. In Cutting Edge (ed.), Digital Desires: Language, Identity and New Technologies, pp. 3–24. London: I.B. Tauris. Kirkup, Gill and Laurie Smith Keller. 1992. Inventing Women: Science, Technology and Gender. Cambridge: Polity. Klee, Robert. 1999. Scientific Inquiry: Readings in the Philosophy of Science. Oxford: Oxford University Press. Klesse, Christian. 2000. ‘“Modern Primitivism”: Non-Mainstream Body Modification and Racialized Representation’. In Mike Featherstone (ed.), Body Modification, pp. 15–38. London: Sage. Kolko, Beth E., Lisa Nakamura, Gilbert B. Rodman (eds). 2000. Race in Cyberspace. New York and London: Routledge. Kraft, Michael E. and Norman J. Vig (eds). 1988. Technology and Politics. Durham and London: Duke University Press. Kresina, Thomas F. 2001. An Introduction to Molecular Medicine and Gene Therapy. New York: Wiley-Liss. Kroker, Arthur and Marilouise Kroker (eds). 1987. Body Invaders: Panic Sex in America. New York: St. Martin’s. ———. 1987. ‘Panic Sex in America’. In Arthur and Marielouise Kroker (eds), Body Invaders: Panic Sex in America, pp. 10–19. New York: St. Martin’s. ———. 1987. ‘Theses on the Disappearing Body in the Hyper-Modern Condition’. In Arthur Kroker and Marilouise Kroker (eds), Body Invaders: Panic Sex in America, pp. 20–34. New York: St. Martin’s. Kuhn, Annette. 1995 [1990]. Alien Zone: Cultural Theory and Contemporary Science Fiction Cinema. London: Verso. Kuhn, Thomas S. 1970. The Structure of Scientific Revolutions. Chicago and London: Chicago University Press. Kukla, André. 2000. Social Constructivism and the Philosophy of Science. London and New York: Routledge. Kuramoto, Juana and Francisco Sagasti. 2002. ‘Integrating Local and Global Knowledge, Technology and Production Systems: Challenges for Technical Cooperation’. Science, Technology and Society, 7(2): 215–47. Laclau, Ernesto. 1977. Politics and Ideology in Marxist Theory. London: Verso. Landow, George P. 1992. Hypertext: The Convergence of Contemporary Critical Theory and Technology. Baltimore and London: Johns Hopkins. Lanier, Jaron. 1992. ‘Life in the Data-Cloud’. Interview. Mondo 2000. Latour, Bruno. 1987. Science in Action: How to Follow Scientists and Engineers through Society. Cambridge, MA: Harvard University Press. ———. 1988. The Pasteurization of France. Tr. Alan Sheridan and John Law. Cambridge, MA: Harvard University Press.
Latour, Bruno and Steve Woolgar. 1999 [1986]. ‘A Social Constructivist Field Study’. In Robert Klee (ed.), Scientific Inquiry: Readings in the Philosophy of Science, pp. 251–59. Oxford: Oxford University Press. Lenoir, Timothy. 1997. Instituting Science: The Cultural Production of Scientific Disciplines. Stanford, California: Stanford University Press. Lessig, Lawrence. 2001. The Future of Ideas: The Fate of the Commons in a Connected World. New York: Random House. Levenson, Thomas. 1994. Measure for Measure: A Musical History of Science. New York: Touchstone. Lévy, Pierre. 2001. Cyberculture. Tr. Robert Bononno. Minneapolis and London: Minnesota University Press. Lewontin, R.C. 1991. The Doctrine of DNA: Biology as Ideology. London: Penguin. ———. 1994. ‘The Dream of the Human Genome’. In Gretchen Bender and Timothy Druckrey (eds), Culture on the Brink: Ideologies of Technology, pp. 107–27. Seattle: Bay Press. Longino, Helen E. 1990. Science as Social Knowledge. Princeton, New Jersey: Princeton University Press. Lorber, Judith. 1988. ‘In Vitro Fertilisation and Gender Politics’. In Elaine Hoffman Baruch, Amadeo F. D’Adamo Jr. and Joni Seager (eds), Embryos, Ethics, and Women’s Rights: Exploring the New Reproductive Technologies, pp. 117–33. New York and London: Harrington. Lovelace, Carey. 1995. ‘Orlan: Offensive Acts’. Performing Art Journal, 49: 13–25. Luke, Timothy W. 1999. ‘Simulated Sovereignty, Telematic Territoriality: The Political Economy of Cyberspace’. In Mike Featherstone and Scott Lash (eds), Spaces of Culture: City–Nation–World, pp. 27–48. London: Sage. Lull, James. 2000. Media, Communication, Culture: A Global Approach. Cambridge: Polity. Lupton, Deborah. 1998 [1995]. ‘The Embodied Computer User’. In Mike Featherstone and Roger Burrows (eds), Cyberspace/Cyberbodies/ Cyberpunk: Cultures of Technological Embodiment, pp. 97–112. London: Sage. Lyon, David. 1988. The Information Society: Issues and Illusions. Cambridge: Polity. Lyotard, Jean-Francois. 1984. The Postmodern Condition: A Report on Knowledge. Tr. Geoff Bennington and Brian Massumi. Minneapolis: University of Minnesota Press. MacKenzie, Donald and Judy Wajcman (eds). 1999. The Social Shaping of Technology. Milton Keynes: Open University Press. Manovich, Lev. 2001. The Language of New Media. Cambridge, MA: MIT.
Margolis, Michael and David Resnick. 2000. Politics as Usual: The Cyberspace ‘Revolution’. Thousand Oaks, California: Sage. Marks, Joan H. 1994. ‘The Human Genome Project: A Challenge in Biological Technology’. In Gretchen Bender and Timothy Druckrey (eds), Culture on the Brink: Ideologies of Technology, pp. 99–106. Seattle: Bay Press. Marris, Paul and Sue Thornham (eds). 2000. Media Studies: A Reader. New York: New York University Press. Mason, Robin. 1995. ‘The Educational Value of ISDN’. In Nick Heap, Ray Thomas, Geoff Einon, Robin Mason and Hughie Mackay (eds), Information Technology and Society: A Reader, pp. 211–23. London: Sage. McCaffery, Larry (ed.). 1991. Storming the Reality Studio: A Casebook of Cyberpunk and Postmodern Fiction. Durham and London: Duke University Press. ———. 1991. ‘The Desert of the Real’. Introduction. In Larry McCaffery (ed.), Storming the Reality Studio: A Casebook of Cyberpunk and Postmodern Fiction, pp. 1–16. Durham and London: Duke University Press. McCarron, Kevin. 1998 [1995]. ‘Corpses, Animals, Machines and Mannequins: The Body and Cyberpunk’. In Mike Featherstone and Roger Burrows (eds), Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment, pp. 261–74. London: Sage. McCarthy, E. Doyle. 1996. Knowledge as Culture: The New Sociology of Knowledge. London and New York: Routledge. McGann, Jerome. n.d. ‘The Rational of Hyper Text’. McHale, Brian. 1991. ‘POSTcyberMODERNpunkISM’. In Larry McCaffery (ed.), Storming the Reality Studio: A Casebook of Cyberpunk and Postmodern Fiction, pp. 308–23. Durham and London: Duke University Press. McLaughlin, Janice, Paul Rosen, David Skinner and Andrew Webster. 1999. Valuing Technology: Organisations, Culture and Change. London and New York: Routledge. McMillan, Gloria. 1997. ‘Playing the Dracula Tag: The Adventures of the Two-Housewife Dracula TEI-tagging Team’. Humanist, 11(49). Merchant, Carolyn. 1980. The Death of Nature: Women, Ecology and the Scientific Revolution. New York: Harper and Row. Mies, Maria and Vandana Shiva. 1993. Ecofeminism. New Delhi: Kali for Women. Millar, Melanie. 1998. Cracking the Gender Code: Who Rules the Wired World? Toronto: Second Story. Miller, Daniel (ed.). 1995. Acknowledging Consumption: A Review of New Studies. London and New York: Routledge.
Mitchell, William. 1995. City of Bits: Space, Place and the Infobahn. Cambridge, MA: MIT. Mitter, Swasti and Sheila Rowbotham (eds). 1997. Women Encounter Technology: Changing Patterns of Employment in the Third World. London and New York: Routledge. Moraga, Cherrie L. and Gloria E. Anzaldua (eds). 1981. This Bridge Called My Back: Writings by Radical Women of Color. Watertown, MA: Persephone. Morgan, Kathryn Pauly. 1991. ‘Women and the Knife: Cosmetic Surgery and the Colonisation of Women’s Bodies’. Hypatia, 6(3): 25–53. Morley, David and Kuan-Hsing Chen (eds). 1996. Stuart Hall: Critical Dialogues in Cultural Studies. London: Routledge. Morton, Donald. 1999. ‘Birth of the Cyberqueer’. In Jenny Wolmark (ed.), Cybersexualities: A Reader on Feminist Theory, Cyborgs and Cyberspace, pp. 295–313. Edinburgh: Edinburgh University Press. Moser, Ingunn. 1995. ‘Mobilizing Critical Communities and Discourses on Modern Biotechnology’. In Vandana Shiva and Ingunn Moser (eds), Biopolitics, pp. 1–24. London: Zed Books. Mukerji, Chandra. 1994. ‘Toward a Sociology of Material Culture: Science Studies, Cultural Studies and the Meanings of Things’. In Diana Crane (ed.), The Sociology of Culture, pp. 143–62. Oxford: Blackwell. Mulhern, Francis. 2000. Culture/Metaculture. London and New York: Routledge. Naisbitt, John. 1984. Megatrends: Ten New Directions Transforming Our Lives. New York: Warner Books. Nakamura, Lisa. 2002. ‘After/Images of Identity: Gender, Technology, and Identity Politics’. In Mary Flanagan and Austin Booth (eds), Reload: Rethinking Women + Cyberculture, pp. 321–31. Cambridge, MA: MIT. Nandy, Ashis (ed.). 1990 [1988]. Science, Hegemony and Violence: A Requiem for Modernity. Delhi: Oxford University Press. National Council for Research on Women. 2002. Balancing the Equation.
Nayar, Pramod K. ‘The Culture of Technology’. Review of R.L. Rutsky’s High Techne- . Culture Machine. ———. ‘The Politics of the Prosthesis’. Review of Chris Hables Gray’s Cyborg Citizen. Culture Machine. ———. 2001a. ‘Her Body, Her Software: The Art of Orlan’. Deccan Herald, 11 January. ———. 2001b. ‘L’Ecriture Digital: Writing Bodies in Cyberspace’. Deccan Herald, 28 January. ———. 2002a. Review of Urs Gattiker’s The Internet as a Diverse Community. Social Science Computer Review. 20(3): 357–59. ———. 2002b. ‘Welcome the Posthuman: Stelarc’. Deccan Herald, 5 February.
Nayar, Pramod K. 2002c [1999]. ‘Postecolonial Theory: A New Ontopology and Radical Politics’. In Prafulla C. Kar, Sura P. Rath and Kailash C. Baral (eds), Theory and Praxis: Curriculum, Culture and English Studies, pp. 245–57. New Delhi: Pencraft. Negus, Keith. 1997. ‘The Production of Culture’. In Paul du Gay (ed.), Production of Culture/Cultures of Production, pp. 67–118. London: Sage. Nguyen, Dan Thu and Jon Alexander. 1996. ‘The Coming of Cyberspacetime and the End of Polity’. In Rob Shields (ed.), Cultures of the Internet: Virtual Spaces, Real Histories, Living Bodies, pp. 99–124. London: Sage. Nixon, Nicola. 1992. ‘Cyberpunk: Preparing the Ground for Revolution or Keeping the Boys Satisfied?’. Science Fiction Studies, 19: 219–35. Norris, Pippa. 2001. Digital Divide: Civic Engagement, Information Poverty, and the Internet. Cambridge: Cambridge University Press. Novak, Marcos. 1991. ‘Liquid Architecture in Cyberspace’. In Michael Benedikt (ed.), Cyberspace: First Steps, pp. 225–54. Cambridge, MA: MIT. ———. 1995. ‘Transmitting Architecture: transTerraFirma/TidsvagNoll v2.0’. Architecture Design, 118: 43–47. Noveck, Beth Simone. 2000. ‘Paradoxical Partners: Electronic Communication and Electronic Democracy’. In Peter Ferdinand (ed.), The Internet, Democracy and Democratization, pp. 18–35. London: Frank Cass. Ogilvie, Brian W. 2003. ‘The Many Books of Nature: Renaissance Naturalists and Information Overload’. Journal of the History of Ideas, 64(1): 29–40. Okrent, Dan. ‘The Death of Print?’. O’Reilly et al. (eds). 1997. The Harvard Conference on the Internet and Society. Sebastopol, California: O’Reilly and Harvard University Press. Orlan. 2000. ‘Serene and Happy and Distant’. Interview with Robert Ayers. In Mike Featherstone (ed.), Body Modification, pp. 171–84. London: Sage. Pacey, Arnold. 1983. The Culture of Technology. Oxford: Basil Blackwell. Pariser, Eva. 2000. ‘Artists’ Websites: Declarations of Identity and Presentations of Self ’. In David Gauntlett (ed.), Web.studies: Rewiring Media Studies for the Digital Age, pp. 62–67. London: Arnold. Parthasarathy, Ashok and K.J. Joseph. 2002. ‘Limits to Innovation with Strong Export Orientation: The Case of India’s Information and Communication Technologies Sector’. Science, Technology and Society, 7(1): 13–50. Penley, Constance and Andrew Ross (eds). 1991. Technoculture. Minneapolis: University of Minnesota Press. Perloff, Marjorie. 1991. Radical Artifice: Writing Poetry in the Age of Media. Chicago and London: Chicago University Press.
Pickering, John. 1999. ‘Designs on the City: Urban Experience in the Age of Electronic Reproduction’. In John Downey and Jim McGuigan (eds), Technocities, pp. 168–85. London: Sage. Pink, Sarah. 2001. Doing Visual Ethnography: Images, Media and Representation in Research. London: Sage. Plant, Sadie. 1998 [1995]. ‘The Future Looms: Weaving Women and Cybernetics’. In Mike Featherstone and Roger Burrows (eds), Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment, pp. 45–64. London: Sage. ———. ‘Intelligence is No Longer on the Side of Power’. Interview with Matthew Fuller. Pool, Ithiel de Sola. 1983. Technologies of Freedom: On Free Speech in an Electronic Age. Cambridge, MA: Harvard University Press. Poster, Mark. 1990. The Mode of Information: Poststructuralism and Social Context. Chicago: University of Chicago Press. ———. 1997. ‘Cyberdemocracy: The Internet and the Public Sphere’. In David Holmes (ed.), Virtual Politics: Identity and Community in Cyberspace, pp. 212–28. London: Sage. Postman, Neil. 1986. Amusing Ourselves to Death: Public Discourse in the Age of Show Business. New York: Viking. Prakash, Gyan. 2000. Another Reason: Science and the Imagination of Modern India. Delhi: Oxford University Press. Prelli, Lawrence J. 1989. A Rhetoric of Science: Inventing Scientific Discourse. Columbia, South Carolina: University of South Carolina. Ramirez, Catherine S. 2002. ‘Cyborg Feminism: The Science Fiction of Octavia E. Butler and Gloria Anzaldua’. In Mary Flanagan and Austin Booth (eds), Reload: Rethinking Women + Cyberculture, pp. 374–402. Cambridge, MA: MIT. Rath, Sura P., Kailash C. Baral and D. Venkat Rao (eds). 2004. Reflections on Literature, Criticism and Theory: Essays in Honour of Professor Prafulla C. Kar. New Delhi: Pencraft. Reid, Elizabeth. 1995. ‘Virtual Worlds: Culture and Imagination’. In Steven Jones (ed.), Cybersociety: Computer-mediated Communication and Community, pp. 164–83. London: Sage. Rheingold, Howard. 1992. The Virtual Community. New York: Simon and Schuster. ———. 1993. ‘A Slice of Life in My Virtual Community’. In Linda Harasim (ed.), Global Networks: Computers and International Communication. Cambridge, MA: MIT. Rip, Arie, Thomas J. Misa and Johan Schot (eds). 1995. Managing Technology in Society: The Approach of Constructive Technology Management. London: Pinter.
Rival, Laura, Don Slater and Daniel Miller. 1999. ‘Sex and Sociality: Comparative Ethnographies of Sexual Objectification’. In Mike Featherstone (ed.), Love and Eroticism, pp. 295–321. London: Sage. Robertson, George, Melinda Mash, Lisa Tickner, Jon Bird, Barry Curtis and Tim Putnam (eds). 1996. FutureNatural: Nature, Science, Culture. London and New York: Routledge. Robins, Kevin. 1998 [1995]. ‘Cyberspace and the World We Live In’. In Mike Featherstone and Roger Burrows (eds), Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment, pp. 135–55. London: Sage. ———. 1999. ‘Foreclosing on the City? The Bad Idea of Virtual Urbanism’. In John Downey and Jim McGuigan (eds), Technocities, pp. 34–59. London: Sage. Rose, Hilary. 1994. Love, Power and Knowledge. Bloomington: Indiana University Press. Ross, Andrew. 1991. Strange Weather: Culture, Science and Technology in the Age of Limits. New York: Verso. Rucker, Rudy, Queen Mu and R.U. Sirius (eds). 1993. Mondo 2000: A User’s Guide to the New Edge. London: Thames and Hudson. Rutsky, R.L. 1999. High Technē: Art and Aesthetics from the Machine Age to the Posthuman. Minneapolis: University of Minnesota Press. Ryan, Marie-Laure (ed.). 1999. Cyberspace Textuality: Computer Technology and Literary Theory. Bloomington: Indiana University Press. Ryan, Steve, Bernard Scott, Howard Freeman and Daxa Patel. 2000. The Virtual University: The Internet and Resource-Based Learning. London: Kogan Page. Said, Edward W. 1985 [1978]. Orientalism. Harmondsworth: Penguin. Salleh, Ariel. 1997. Ecofeminism as Politics: Nature, Marx and the Postmodern. London: Zed Books. Sandoval, Chela. 1999. ‘New Sciences: Cyborg Feminism and the Methodology of the Oppressed’. In Jenny Wolmark (ed.), Cybersexualities: A Reader on Feminist Theory, Cyborgs and Cyberspace, pp. 247–63. Edinburgh: Edinburgh University Press. Kemp, Sandra and Judith Squires (eds). 1997. Feminisms. Oxford and New York: Oxford University Press. Sardar, Ziauddin. 1996. ‘alt.civilizations.faq: Cyberspace as the Darker Side of the West’. In Ziauddin Sardar and Jerome R. Ravetz (eds), Cyberfutures: Culture and Politics on the Information Superhighway, pp. 14–41. New York: New York University Press.
Sardar, Ziauddin and Jerome R. Ravetz (eds). 1996. Cyberfutures: Culture and Politics on the Information Superhighway. New York: New York University Press. Sassen, Saskia. 1999. ‘Digital Networks and Power’. In Mike Featherstone and Scott Lash (eds), Spaces of Culture: City–Nation–World, pp. 49–63. London: Sage. Saussure, Ferdinand de. 1966. A Course in General Linguistics. Tr. Wade Baskin. New York: McGraw-Hill. Sawday, Jonathan. 1996. The Body Emblazoned: Dissection and the Human Body in Renaissance Culture. London and New York: Routledge. Scarry, Elaine. 1993. ‘Watching and Authorising the Gulf War’. In Marjorie Garber, Jann Matlock and Rebecca L. Walkowitz (eds), Media Spectacles, pp. 57–76. New York and London: Routledge. Schiller, Herbert. 1976. Information and Cultural Domination. New York: International Arts and Science Press. ———. 1984. Information and the Crisis Economy. Norwood: Ablex. ———. 1989. Culture, Inc.: The Corporate Takeover of Public Expression. New York: Oxford. Sclove, Richard E. 1995. ‘Making Technology Democratic’. In James Brook and Iain A. Boal (eds), Resisting the Virtual Life: The Culture and Politics of Information, pp. 85–101. San Francisco: City Lights. Senft, Theresa M. ‘Performing the Digital Body—A Ghost Story’. Women and Performance, 17. Sey, James. 1999. ‘The Labouring Body and the Posthuman’. In Angel Gordo-Lopez and Ian Parker (eds), Cyberpsychology, pp. 23–41. London: Macmillan. Shields, Rob (ed.). 1996. Cultures of the Internet: Virtual Spaces, Real Histories, Living Bodies. London: Sage. Shilling, Chris. 1999 [1993]. The Body and Social Theory. London: Sage. Shinn, Terry. 2002. ‘The Triple Helix and New Production of Knowledge: Prepackaged Thinking in Science and Technology’. Social Studies of Science, 32(4): 599–614. Shirley, John. 1999 [1985]. Eclipse. California: Babbage. Shiva, Vandana and Ingunn Moser (eds). 1995. Biopolitics: A Feminist and Ecological Reader on Biotechnology. London: Zed Books. Silver, David. 2000. ‘Looking Backwards, Looking Forwards: Cyberculture Studies 1990–2000’. In David Gauntlett (ed.), Web.studies: Rewiring Media Studies for the Digital Age, pp. 19–30. London: Arnold. Silverstone, Roger and Eric Hirsch (eds). 1994. Consuming Technologies: Media and Information in Domestic Spaces. London and New York: Routledge.
Sim, Stuart. 1992. Beyond Aesthetics: Confrontations with Poststructuralism and Postmodernism. New York: Harvester Wheatsheaf. Slack, Jennifer Daryl. 1996. ‘The Theory and Method of Articulation in Cultural Studies’. In David Morley and Kuan-Hsing Chen (eds), Stuart Hall: Critical Dialogues in Cultural Studies, pp. 112–27. London: Routledge. Slatin, John M. 1994. ‘Is There a Class in This Text?: Creating Knowledge in the Electronic Classroom’. In Edward Barrett (ed.), Sociomedia: Multimedia, Hypermedia, and the Social Construction of Knowledge, pp. 27–52. Cambridge, MA: MIT. Sloan, Helen. 2002. ‘Art in a Complex System: The Paintings of Matthias Groebel’. Performing Arts Journal, 70: 127–32. Smith, Dorothy. 1990. The Conceptual Practices of Power: A Feminist Sociology of Knowledge. Boston, MA: Northeastern University Press. Smith, Mark J. 2002. Culture: Reinventing the Social Sciences. New Delhi: Viva. Smith Keller, Laurie. 1992. ‘Discovering and Doing: Science and Technology, An Introduction’. In Gill Kirkup and Laurie Smith Keller (eds), Inventing Women: Science, Technology and Gender, pp. 12–32. Cambridge: Polity. Sobchack, Vivian. 1998 [1995]. ‘Beating the Meat/Surviving the Text, or How to Get Out of This Century Alive’. In Mike Featherstone and Roger Burrows (eds), Cyberspace/Cyberbodies/Cyberpunk: Cultures of Technological Embodiment, pp. 205–14. London: Sage. Sofia, Zoe. 1999. ‘Virtual Corporeality: A Feminist View’. In Jenny Wolmark (ed.), Cybersexualities: A Reader on Feminist Theory, Cyborgs and Cyberspace, pp. 55–68. Edinburgh: Edinburgh University Press. Soja, Edward W. 1993. Postmodern Geographies: The Reassertion of Space in Critical Social Theory. London: Verso. ———. 1997. ‘Six Discourses on the Postmetropolis’. In Sallie Westwood and John Williams (eds), Imagining Cities: Scripts, Signs, Memory, pp. 19–31. London and New York: Routledge. Spalter, Anne Morgan. 1999. The Computer in the Visual Arts. Reading, MA: Addison-Wesley. Spinrad, Norman. 1990. Science Fiction in the Real World. Carbondale and Edwardsville: Southern Illinois University Press. Springer, Claudia. 1996. Electronic Eros: Bodies and Desire in the Postindustrial Age. Austin, Texas: University of Texas Press. ———. 1999. ‘The Pleasure of the Interface’. In Jenny Wolmark (ed.), Cybersexualities: A Reader on Feminist Theory, Cyborgs and Cyberspace, pp. 34–54. Edinburgh: Edinburgh University Press.
Steffensen, Jyanni. 2002. ‘Doing it Digitally: Rosalind Brodsky and the Art of Female Subjectivity’. In Mary Flanagan and Austin Booth (eds), Reload: Rethinking Women + Cyberculture, pp. 201–33. Cambridge, MA: MIT. Stelarc. 1995. ‘Towards the Post-Human: From Psycho-body to Cybersystem’. Architectural Design, 118: 91–96. ———. 2000. ‘In Dialogue with Posthuman Bodies’. Interview with Ross Farnell. In Mike Featherstone (ed.), Body Modification, pp. 129–48. London: Sage. Stepan, Nancy. 1987. The Idea of Race in Science: Great Britain, 1800–1960. London: Macmillan. Sterling, Bruce (ed.). 1988 [1986]. Mirrorshades: The Cyberpunk Anthology. New York: Ace-Arbor House. Stone, Allucquere Rosanne. 1999. ‘Will the Real Body Please Stand Up?: Boundary Stories about Virtual Cultures’. In Jenny Wolmark (ed.), Cybersexualities: A Reader on Feminist Theory, Cyborgs and Cyberspace, pp. 69–98. Edinburgh: Edinburgh University Press. Sturken, Marita and Lisa Cartwright. 2001. Practices of Looking: An Introduction to Visual Culture. Oxford: Oxford University Press. Sweetman, Paul. 2000. ‘Anchoring the (Postmodern) Self? Body Modification, Fashion and Identity’. In Mike Featherstone (ed.), Body Modification, pp. 51–76. London: Sage. Tapscott, Don. 1996. The Digital Economy: Promise and Peril in the Age of Networked Intelligence. New York: McGraw-Hill Trade. Thacker, Eugene. 2000. ‘Performing the Technoscientific Body: Real Video Surgery and the Anatomy Theater’. In Mike Featherstone (ed.), Body Modification, pp. 317–36. London: Sage. Thomas, Douglas. 2002. Hacker Culture. Minneapolis and London: University of Minnesota Press. Thompson, John B. 1995. The Media and Modernity. Cambridge: Polity. Thompson, Kenneth (ed.). 1997. Media and Cultural Regulation. London: Sage. Timmers, Paul. 2000. Electronic Commerce: Strategies and Models for Business-to-Business Trading. Chichester: John Wiley. Tomas, David. 1991. ‘Old Rituals for New Space: Rites de Passage and Gibson’s Cultural Model of Cyberspace’. In Michael Benedikt (ed.), Cyberspace: First Steps, pp. 31–47. Cambridge, MA: MIT. Turkle, Sherry. 1995. Life on the Screen: Identity in the Age of the Internet. New York: Simon and Schuster. Turner, Bryan S. 1999. ‘The Possibility of Primitiveness: Towards a Sociology of Body Marks in Cool Societies’. In Mike Featherstone (ed.), Body Modification, pp. 39–50. London: Sage. Virilio, Paul. 1989 [1984]. War and Cinema: The Logistics of Perception. Tr. Patrick Camiller. London: Verso.
Virilio, Paul. 1991. The Lost Dimension. Tr. Daniel Moshenberg. New York: Semiotext(e). ———. 1993. ‘The Third Interval: A Critical Transition’. In V.A. Conley (ed.), Rethinking Technologies, pp. 3–12. Minneapolis: University of Minnesota Press. ———. 1994. The Vision Machine. Tr. Julie Rose. Bloomington and London: Indiana University Press. ———. 1995. ‘Speed and Information: Cyberspace Alarm’. www.ctheory.com. ———. 2000. ‘From Modernism to Hypermodernism and Beyond’. Interview with John Armitage. In John Armitage (ed.), Paul Virilio: From Modernism to Hypermodernism and Beyond, pp. 25–56. London: Sage. ———. Interview with James Der Derian. Virilio and Philippe Petit. 1999a. Politics of the Very Worst. Tr. Michael Cavaliere. Ed. Sylvère Lotringer. New York: Semiotext(e). Virilio and Friedrich Kittler. 1999b. ‘The Information Bomb: A Conversation’. Angelaki, 4(2): 81–90. Virilio and Sylvère Lotringer. 1997. Pure War. Revised ed. Tr. Mark Polizzotti. New York: Semiotext(e). Wajcman, Judy. 1993 [1991]. Feminism Confronts Technology. Cambridge: Polity. Wakeford, Nina. 2000. ‘New Media, New Methodologies: Studying the Web’. In David Gauntlett (ed.), Web.studies: Rewiring Media Studies for the Digital Age, pp. 19–30. London: Arnold. Waldby, Catherine. 1997. ‘Revenants: The Visible Human Project and the Digital Uncanny’. Body and Society, 3(1): 1–16. Wallace, Patricia. 1999. The Psychology of the Internet. Cambridge: Cambridge University Press. Wardrip-Fruin, Noah and Brion Moss. 2002. ‘The Impermanence Agent: Project and Context’. Performing Arts Journal, 70: 52–83. Warwick, Claire. 2000. ‘The Lowest Common Denominator: Electronic Literary Texts, and the Role of the Information Professional’. Information Research, 5(2). Webster, Andrew. 1991. Science, Technology and Society: New Directions. Basingstoke, Hampshire: Macmillan. Webster, Frank. 1995. Theories of the Information Society. London and New York: Routledge. Westwood, Sallie and John Williams (eds). 1997. Imagining Cities: Scripts, Signs, Memory. London and New York: Routledge. Wiener, Norbert. 1948. Cybernetics: Or Control and Communication in the Animal and the Machine. New York: John Wiley.
Wiener, Norbert. 1954. The Human Use of Human Beings: Cybernetics and Society. New York: Doubleday Anchor. Wilding, Faith. ‘Where is Feminism in Cyberfeminism?’. Wilhelm, Anthony G. 2000. Democracy in the Digital Age: Challenges to Political Life in Cyberspace. New York and London: Routledge. Williams, Raymond. 1976. Keywords: A Vocabulary of Culture and Society. New York: Oxford. Williams, Rosalind. 1982. Dream Worlds: Consumption in Late 19th Century France. Berkeley: University of California Press. Wilson, Mark I. and Colin A. Arrowsmith. 2000. ‘Telecom Tectonics and the Meaning of Electronic Space’. In Mark I. Wilson and Kenneth E. Corey (eds), Information Tectonics: Space, Place and Technology in an Electronic Age, pp. 29–40. Chichester: John Wiley. Wilson, Mark I. and Kenneth E. Corey (eds). 2000. Information Tectonics: Space, Place and Technology in an Electronic Age. Chichester: John Wiley. Wolmark, Jenny (ed.). 1999. Cybersexualities: A Reader on Feminist Theory, Cyborgs and Cyberspace. Edinburgh: Edinburgh University Press. Woolgar, Steve. 1988. Science: The Very Idea. London: Tavistock. Zellner, Peter. 1999. Hybrid Space: New Forms in Digital Architecture. London: Thames and Hudson. Zurbrugg, Nicholas. 2000. ‘Marinetti, Chopin, Stelarc and the Auratic Intensities of the Postmodern Techno-Body’. In Mike Featherstone (ed.), Body Modification, pp. 93–115. London: Sage.
CYBERPUNK FICTION AND RELATED SCIENCE FICTION
Burroughs, William S. 1986. The Soft Machine. London: Grafton. Butler, Octavia. 1988 [1987]. Dawn. New York: Popular. ———. 1989. Adulthood Rites. New York: Popular. Cadigan, Pat. 1991. Synners. New York: Bantam. Dick, Philip K. 2002 [1953–69]. Minority Report. London: Gollancz. Gibson, William. 1984. Neuromancer. New York: Ace. ———. 1994 [1986]. Count Zero. London: Voyager-HarperCollins. ———. 1994 [1993]. Virtual Light. New York: Bantam. ———. 1997 [1996]. Idoru. New York: Berkley. ———. 1998. Mona Lisa Overdrive. New York: Bantam. ———. 2003 [1999]. All Tomorrow’s Parties. New York: Berkley.
Gibson, William. 2003. Pattern Recognition. New York: G.P. Putnam’s Sons. Murakami, Haruki. 2001 [1991]. Hard-Boiled Wonderland and the End of the World. Tr. Alfred Birnbaum. London: Harvill. Pynchon, Thomas. 1995 [1973]. Gravity’s Rainbow. New York: Penguin. Rucker, Rudy. 1985 [1982]. Software. London: Roc-Penguin. Scott, Melissa. 1993. Dreamships. New York: Tom Doherty. ———. 1993. Burning Bright. New York: Tom Doherty. Skal, David J. 1989. Antibodies. Toronto: Worldwide. Stephenson, Neal. 1993 [1992]. Snow Crash. New York: Bantam. ———. 1996 [1995]. The Diamond Age. London: Penguin. Sterling, Bruce. 1988 [1986]. Mirrorshades: The Cyberpunk Anthology. New York: Ace. ———. 1991. ‘Twenty Evocations’. In Larry McCaffery (ed.), Storming the Reality Studio: A Casebook of Cyberpunk and Postmodern Fiction, pp. 154–61. Durham and London: Duke University Press. ———. 1994 [1992]. ‘Our Neural Chernobyl’. In Bruce Sterling (ed.), Globalhead, pp. 3–11. London: Phoenix. ———. 1996. Holy Fire. London: Phoenix. Williams, W.J. 1988. Hardwired. New York: R. Talsorian Games.
FILMS
Alien. Ridley Scott (1979) Alien 3. David Fincher (1992) Blade Runner. Ridley Scott (1982) Event Horizon. Paul W.S. Anderson (1997) Fly, The. David Cronenberg (1986) Invasion of the Body Snatchers, The. Don Siegel (1956) Invasion of the Body Snatchers, The. Philip Kaufman (1978) Johnny Mnemonic. Robert Longo (1995) Lawnmower Man, The. Brett Leonard (1992) Matrix, The. Wachowski Brothers (1999) Matrix Reloaded, The. Wachowski Brothers (2003) Net, The. Irwin Winkler (1995) RoboCop. Paul Verhoeven (1987) Terminator, The. James Cameron (1984) Terminator 2: Judgment Day. James Cameron (1991) Terminator 3: Rise of the Machines. Jonathan Mostow (2003) Total Recall. Paul Verhoeven (1990) 2001: A Space Odyssey. Stanley Kubrick (1968)
WEBLIOGRAPHY
LITERATURE
Michael Joyce’s online fiction:
THEORY
JOURNALS
Ctheory: Ejournal: Postmodern Culture: and
MEDIA STUDIES
Center for Digital Arts and Experimental Media: Resource Centre for Cyberculture Studies:
THE PSYCHOLOGY OF CYBERSPACE
WOMEN AND INFORMATION TECHNOLOGY
MOVIES
CYBERGEOGRAPHY
SOCIETY, CULTURE, ICTS
Electronic Frontier Foundation: Computer Professionals for Social Responsibility: On Print and E-Media: Democracy: American Civil Liberties Union:
WAR
Institute for War and Peace Reporting:
HACKERS
QUEER ARTS RESOURCE
MAGAZINES
Boing Boing: Wired:
INDEX
Aarseth, Espen, 96–97; see also Ergodic Text Appadurai, Arjun, 56–57 archives, 114–15 articulation, 36–38 artificial life, 219, 222–23, 254, 300–1 Assisted Reproductive Technologies (ARTs), 81–83, 237–39, 275–77, 285–86, 291 Author, 99–103 Balsamo, Anne, 228–30, 274, 277, 288–90, 306 Barlow, John Perry, 67, 179, 186 Barthes, Roland, 94–95, 99, 103 Baudrillard, Jean, 19, 48–49, 53–55, 216, 219–20 Bell, Daniel, 49–50; see also postindustrialism Benjamin, Walter, 26, 29–30 Bolter, David, 99–100 Bourdieu, Pierre, 30, 41, 42–43, 44–45 Castells, Manuel, 52–53, 168, 182–83, 192, 197–98 Cavallaro, Dani, 118, 124–25, 125, 306 censorship, 186–89 cloning, 83–84, 239–40 Clynes, Manfred and Nathan Kline, 66, 215; see also Cyborg computer animation, 135–36 computer games, 132–34 Cosmetic Surgery, 244–48, 258–62, 272–75
Currier, Diane, 283, 301–4 cyberdemocracy, 200–205 cyberpower, 177–86 cyberpunk, 70–71, 118–27, 288, 304–8 cyberspace, 66–68, 70, 166–67, 179 cyberterrorism, 209–10 cyborg, 62, 65–66, 121–22, 126, 211–12, 214–15, 215–24, 226–28, 237, 251–53, 254–57, 286, 288–90, 293–96, 303–4, 306–8; see also posthuman deLahunta, Scott, 143–44 Danet, Brenda, 106–8 Darley, Andrew, 128, 130–31, 133 Davis, Kathy, 246, 273–74 Derrida, Jacques, 23, 34, 63, 92–94 discourse, 33–35, 37–38, 44, 58–59, 221–22, 266–67 Doane, Mary Ann, 285–86 domestic technology, 277–78 Downey, Greg, 168–69 ecofeminism, 271–72 Eisenstein, Zillah, 278–82 Ergodic Text, 96–97 Everard, John, 181–82, 187–88, 196–97 Featherstone, Mike and Roger Burrows, 68, 118, 212 Flanagan, Mary, 296–98, 305 Foster, Thomas, 125, 292
Foucault, Michel, 35, 46, 92, 99, 160, 217, 221, 256 Franklin, Sarah, 238–39, 275–76
Lévy, Pierre, 20, 54, 69 Lyotard, Jean-François, 57–59; see also postmodernity
Gauntlett, David, 129–30 Gibson, William, 31, 66–67, 81, 91–92, 119, 121–22, 123, 125, 126–27, 220, 288 Giddens, Anthony, 52, 181, 184, 282 Gray, Chris Hables, 78, 179, 180, 206–8, 236, 237
Macy Conferences on Cybernetics, 218–19 Margolis, Michael and David Resnick, 173–74, 200–201 masculinity, 258–88 McGann, Jerome, 96, 98 McHale, Brian, 120–21 molecular medicine and gene therapy, 84–86, 240–41 Morgan, Kathryn, 273, 275 movies, 128–31 music video, 131–32
Habermas, Jürgen, 50–51, 176–77, 198–200; see also public sphere hacking, 164, 284, 293 Hakken, David, 158, 162–63 Hall, Stuart, 32–37 Haraway, Donna, 126, 179, 211, 214–16, 220, 236, 267–68, 277, 283, 289, 293–96, 298–99 Harding, Sandra, 265, 269, 308–10 Hayles, Katherine N., 55, 71–72, 94, 161, 217–23, 285 Human Genome Project (HGP), 79, 230, 232–35 hypertext, 92–99, 100, 103, 105, 115–16 identity, individual, 160–62; community, 162, 167, 176–77 implants/transplants, 77–78, 237 info/nanomedicine, 79–81, 235–40 Jordan, Tim, 120, 176, 177 Jupiter Space, 307–8 Kellner, Douglas, 183–84, 205 Kroker, Arthur and Marilouise Kroker, 212–14 Kuhn, Thomas, 41–42, 265 Laclau, Ernesto, 36–37 Landow, George P., 94, 95, 100–1, 104–5 Lanier, Jaron, 68–69 Latour, Bruno, 23, 40
national systems of innovation (NSI), 184–85 nation–state and cyberspace, 195–200 network state, 197–98 Novak, Marcos, 68, 145, 149 Noveck, Beth Simone, 178, 200, 203, 205, 210 Orlan, 21, 226, 258–62 Pacey, Arnold, 23–24 painting and photoediting, 134–35 pedagogy and research, 108–14 Plant, Sadie, 290–91 Poster, Mark, 59, 125–26, 165–66, 176–77, 201–2 Post-Fordism, 51–52 posthuman, 20–21, 65–66, 71–72, 122, 126, 211–13, 215–23, 227–28, 254–62, 286, 288–90, 293–96, 303–4, 306–8; see also Cyborg postindustrialism, 49–50; see also Bell postmodernity, 55, 57–59, 118–21, 212–14 postnational state, 198–200 poststructuralism, 92–96, 103, 105–6 print, death of, 117–18 production and consumption, 167–74 public sphere, 50–51, 175–77; see also Habermas
reader, 103–4 Rheingold, Howard, 72, 163, 164, 166–67 Rutsky, R.L., 22, 26–27, 28 Ryan, Marie-Laure, 69–70, 98, 100, 116 Shinn, Terry, 185–86 Silver, David, 72–73 situated epistemology, 308–10 situated knowledges, 296–97, 304; see also situated epistemology social constructivist views of science, 40–46, 172–73 Sofia, Zoe, 291–92, 307 Steffensen, Jyanni, 284, 288, 291 Stelarc, 21, 122, 226, 254–57 Stephenson, Neal, 67, 123, 126, 160 Sterling, Bruce, 70, 80, 106, 119, 120–21, 122–23 Stone, Allucquere Rosanne, 164, 287–88
technocities and virtual cities, 189–95 techno-elitism, 179–80 Timmers, Paul, 169–71 Turkle, Sherry, 72, 125 Virilio, Paul, 20, 55–56, 59–64 Visible Human Project (VHP), 78, 231–32 VNS Matrix, 283–85 Wajcman, Judy, 265, 277 Wallace, Patricia, 162–63, 251 Webster, Frank, 47–52 Wiener, Norbert, 65–66, 218–19 Wilding, Faith, 283, 284–85, 291, 304 Williams, Raymond, 28–29 Zellner, Peter, 145–46, 147, 148–49
ABOUT THE AUTHOR
Pramod K. Nayar teaches at the Department of English, University of Hyderabad, from where he took his doctoral degree, specialising in British fiction on India. He was Smuts Visiting Fellow in Commonwealth Studies, University of Cambridge (UK), 2000–2001. Sections of his work on early British travel writing and India have appeared, and are forthcoming, in the Journal for Early Modern Cultural Studies, Studies in Travel Writing, Journal of British Studies and 1650–1850. Among his other interests are the aesthetics of the sublime and cultural studies. He reviews books on environmentalism, literature, cyberculture and philosophy for Philosophy in Review, Resource Centre for Cyberculture Studies, Culture Machine, Jouvert, Social Science Computer Review and several other journals. His most recent book is Literary Theory Today (2002).