Against Transmission: Media Philosophy and the Engineering of Time 9781474293099, 9781474293129, 9781474293082

Table of contents :
Cover
Half-title
Title
Copyright
Dedication
Contents
List of Figures
Acknowledgements
Introduction: Togetherness and Time
1. Media Temporalities: An Introduction to the Media Philosophical Approach
2. Media Aesthetics
3. Post-Historical Scenes
4. The Radical Cutting of Experimental Television
5. Time and Contemporary Television
Conclusion
Notes
References
Index


Against Transmission

Against Transmission: Media Philosophy and the Engineering of Time

TIMOTHY BARKER

BLOOMSBURY ACADEMIC
Bloomsbury Publishing Plc
50 Bedford Square, London, WC1B 3DP, UK
1385 Broadway, New York, NY 10018, USA

BLOOMSBURY, BLOOMSBURY ACADEMIC and the Diana logo are trademarks of Bloomsbury Publishing Plc

First published 2018
Paperback edition first published 2019

Copyright © Timothy Barker, 2018

Timothy Barker has asserted his right under the Copyright, Designs and Patents Act, 1988, to be identified as Author of this work.

For legal purposes the Acknowledgements on p. ix constitute an extension of this copyright page.

All rights reserved. No part of this publication may be reproduced or transmitted in any form or by any means, electronic or mechanical, including photocopying, recording, or any information storage or retrieval system, without prior permission in writing from the publishers.

Bloomsbury Publishing Plc does not have any control over, or responsibility for, any third-party websites referred to or in this book. All internet addresses given in this book were correct at the time of going to press. The author and publisher regret any inconvenience caused if addresses have changed or sites have ceased to exist, but can accept no responsibility for any such changes.

A catalogue record for this book is available from the British Library.
A catalog record for this book is available from the Library of Congress.

ISBN: HB: 978-1-4742-9309-9
PB: 978-1-4742-9310-5
ePDF: 978-1-4742-9308-2
eBook: 978-1-4742-9311-2

Typeset by Integra Software Services Pvt. Ltd.

To find out more about our authors and books visit www.bloomsbury.com and sign up for our newsletters.

In memory of Norma

Contents

List of Figures viii
Acknowledgements ix

Introduction: Togetherness and Time 1
1 Media Temporalities: An Introduction to the Media Philosophical Approach 23
2 Media Aesthetics 55
3 Post-Historical Scenes 81
4 The Radical Cutting of Experimental Television 103
5 Time and Contemporary Television 135
Conclusion 155
Notes 160
References 162
Index 172

List of Figures

1.1 Gustave Courbet, Un enterrement à Ornans (A Burial at Ornans), 1849–1850 1
2.1 Jim Campbell, Exploded View (Commuters), 2011 56
2.2 Jeff Wall, Listener, 2015 70
2.3 David Claerbout, KING (after Alfred Wertheimer’s 1956 picture of a young man named Elvis Presley), 2015–2016 72
3.1 Étienne-Jules Marey and Charles Fremont, A Study of Blacksmiths at the Anvil, 1894 82
3.2 Étienne-Jules Marey, Chronophotograph of a Man on a Tricycle, date unknown 97
4.1 Illustration of the Bildtelegraph 107
4.2 Telefunken’s large-scale experimental pixel screen 108

Acknowledgements

This entire book is an expression of the generosity, intellectual stimulation and friendship that I have found since moving to Glasgow. Without the support, input and guidance of my wonderful friends, colleagues and students in this city, no part of this book would have been possible. Special thanks to Carl Lavery, Karen Lury, Dimitris Eleftheriotis, David Martin-Jones and Amy Holdsworth, all of whom made very helpful comments on drafts at different stages. I particularly want to thank Carl, Karen and Dimitris, who each in their own way let me talk to them (a lot!) about the book and who have each done more than they know to help me along.

My gratitude to Wolfgang Ernst is immense. Wolfgang was one of the first people with whom I discussed the plans for this book, and I owe him a great debt for the support that he has given to me, the intelligent comments that he made on early ideas for the book and the intellectual stimulation that his work has provided since.

As well as the support of colleagues, I was also supported by the award of a research fellowship from the Leverhulme Trust, which made the book possible. I cannot thank Leverhulme enough for their support and their continued emphasis on a non-utilitarian, individual and responsive approach to funding. Thanks also to Liza Thompson and Frankie Mace at Bloomsbury, who from their initial interest in the idea, through the handling of the proposal to the realization of the book, have been excellent editors to work with.

I was lucky enough during the course of writing the book to have been invited to give talks, where the organizers thankfully were open to letting me test out some new ideas as they were developing. Thanks to Pierre Cassou-Noguès, Claire Larsonneur and Arnauld Regnauld from Paris 8, James Williams and Dominic Smith from the University of Dundee, the students in the Critical, Curatorial and Conceptual Practices Programme at Columbia University, Matilda Mroz from Greenwich University and Paul Flaig from the University of Aberdeen.

The writing of this book took me to some of the most important media archives in the world, and I could not have navigated my way through these institutions without the expert help of dedicated archivists. Special thanks to Joe Hursey from the Smithsonian National Museum of American History, Jörg Schmalfuß from Stiftung Deutsches Technikmuseum, Berlin, Véronique Chauvet from Iconothèque de la Cinémathèque française, and Iain Logie Baird from the National Media Museum, who was so generous with information about John Logie Baird and mechanical television. Thanks also to Jim Campbell, David Claerbout and Jeff Wall, who generously allowed me to reprint images of their artworks in the book.

To my daughter Chloe, who during the writing of this book provided so many (welcomed) distractions: thank you for your patience and thank you for talking with me about my work (which was always very helpful, although you might not have known it!). During the writing of the book and the archive trips that took me away from Glasgow, my wife was always unconditionally supportive and encouraging, even with some serious health issues thrown into the mix (which she’s thankfully overcome). Michelle, you are a wonderful and strong partner. You are my north star. There is no way I could have written this book, or done much else, without you.

Introduction: Togetherness and Time

FIGURE I.1 Gustave Courbet, Un enterrement à Ornans (A Burial at Ornans), 1849–1850. Image courtesy of Musée d’Orsay.

What does it mean to be contemporary? This question, the one at the centre of artworks such as the celebrated Un enterrement à Ornans (A Burial at Ornans) (1849–1850), has been repeated throughout art history and philosophy for decades and has more recently been addressed in accounts of digital culture. Look at the painting. The temporal systems that each figure inhabits, the separate spheres within the painting, are carefully composed by Courbet. The eye moves from one group to another. The women mourning, the politicians, the pall bearers, the altar boys, they all look to one another; more than just their arrangement on the canvas, when we look with a media philosophical gaze, we see that transmission events bind the different figures together into groups. These figures are contemporary, sharing a being-in-time. Figures look to one another, exchanging glances. The men in the centre whisper. The figures move together; they move in groups, along the path. There are mediating structures – things such as dress, facial expressions and gestures, which organize the groups into different types – that allow these transmission events to take place. What then is the condition that makes each of these groups contemporary with one another? What are the structures, more than just a compositional arrangement, which group these figures together? Is it simply about being up to date, sharing an experience, being alike with another, being of one’s own time, belonging to a shared temporal system: the time of the everyday, the time of religion, the time of politics? Or is there something about being contemporary that is fundamentally about media operability, about the way transmission events are able to take place? This book is an investigation of this latter question with reference to the way time has been measured and temporality has been engineered in technical media systems. We will return to the painting later.

Transmission is a foundational media philosophical concept, particularly when considering temporality and the conditions for relations between groups. In Courbet’s painting, the representations of transmission events are one of the techniques used to group the figures together. In the twentieth century, the major transmission event was the translation of events into history, which grouped large collectives together in a shared being-in-time, a shared contemporariness. History was a transmission of events over time. This book is called Against Transmission because it looks to the operation of technical media that stand in the way of the smooth transition from past to present to future, whether it be forward or backward, and, in this sense, offers alternatives to historical temporality. The book investigates the shift from synthetic media – media such as cinema and print, which characterized modernity – to the measurement and storage media that now characterize the conditions for contemporaneity, by producing new temporal systems in which events are mediated and new types of histories are able to be written. In looking at this shift, I explore the storage

media of database, photography and digital television, which now takes on a significant role in discourses of the past, particularly in terms of making the past available in multi-temporal ways, rather than as a transmission from moment to moment. In the medium theory tradition, with form mirroring content, the book does not present a linear history of media developments. The book is not a transmission through media history but instead a story of multiple different points in time that are folded into the present moment. Against Transmission focuses on the point at which media transduce events into signal, the technical function that occurs prior to transmission, which ensures that transmission can take place at all. It looks to the way this function, both technical and aesthetic, creates blockages in time and paradoxically works against the transmission of events that once came to define History (with a capital H). The book does not only focus on the representation of time in audiovisual media – it does not only look to the time-images offered by art, film and television – but also takes on the fundamentally media philosophical task of bringing into view the technical elements of media, their operation in time, showing how this is related to the images of time presented in media culture and outlining the ways that this is embedded in contemporary experience. Etymologically, the contemporary refers to a particular being-with-time. Con comes from the Latin for together or being with and temporary has as its root word tempus, the Latin for time. But what are the structures of this time? How is it produced? And how is it sustained? In short, what are the conditions for the experience of the con-temporary? And how is media, whose operation provides the supports for transmission events in the first place, implicated in generating this condition? How do media – film, television and digital media – organize the temporality of events and set the conditions to be contemporary? This is a particularly pressing question in the age of technical media, whose operability is based fundamentally on temporal concepts such as real-time systems, time axis manipulation, transmission, time-discrete and time-continuous signal processing and storage (Ernst 2016). This book tries to address these questions with a particular emphasis on the way media systems have been designed to solve time-based problems associated with real-time transmission, storage and, most prominently, transduction, and in so doing replacing the dominance of historical media (such as print and cinema). It explores the way time has been engineered in media systems, much like the way time is composed into a scene by Courbet. History, like the cinema, represented time as a line. With it, as already mentioned, people could be reassured of their collective being-in-time (Ernst 2016: 208). As Hegel once argued, history is made possible by the conditions of preservation and more specifically the application of chronological preservation media to events, which immediately makes events into historical

representations (Hegel [1899]1956: 3). A profound difference now opens up with new forms of storage and time-discrete signal processing. Storage media such as print and film once synthesized history into a line. The carrier medium relied on a linear operation, either of the eye moving along the page or the mechanical movement of film through the projector. Now, as new alternatives to this type of time emerge, as history no longer reassures its subjects of their collective being-in-time, it is the media philosophical task to ask how technical media is implicated in the post-historical, a moment where the domain of history, a story of evolution and progress, no longer offers the explanations and understandings of human life that it once did (Breisach 2003: 10). What are the new conditions for being-in-time produced by media that now takes the place of the historical, the alphabetic and the cinematic? ‘Turing’s symbol processing machine strictly requires that the time parameter be treated as discrete [emphasis in original]’ (Ernst 2016: 79). The Turing Machine is not based on the historical concept of a time-continuous evolution along a line but on a sequence of states. The time of the digital is, technically speaking, opposed to the time of History. Philosophers and political scientists have been for the last few decades using the German expression das post-historie to refer to a condition where the future is replaced with the present and the conditions for political change become mitigated by the advent of Western Liberal Democracy, as Fukuyama has argued for instance. The post-historical has been formulated in political theory as an anachronism, an ‘arbitrary simulation of fragments of the past’ (Niethammer 1992: 1). Against Transmission explores the ways that this type of time is not just a product of political realities but also of media technical ontologies. I leave the discussions of the political realities of the post-historical to others and instead look at a different aspect of this relatively new, discrete moment. What is the temporality of the supposed end of history? What happens when we live without the surprises of history? These are questions that address the phenomenological aspects of the post-historical. They ask what it feels like to live without a familiar, collective sense of time. More precisely, when cast in a media philosophical context, these questions ask us to explore the role of time-discrete media in setting up the conditions for the possibility of this experience. The discrete processes, given a technical form by digital media, rehearse the cultural observations made by Boris Groys (2009), who argues that never before has a culture been so obsessed with its own contemporaneity, its being-in-the-present (Groys 2016). But these moments, although a-historical, are not a-temporal (after all, the Turing machine based its operations in the present on information stored from the past). The argument presented in this book outlines the way that different, unfamiliar, non-anthropocentric types of time can be described in the post-historical as the nesting of multiple moments

in the present, rather than points set out on a string. This is undertaken first through an examination of the operability of analytical media and second through an exploration of artworks and the philosophical descriptions of media events that offer new ways to conceive the time of post-history beyond recourse to a-temporality.

In an attempt at coming to terms with the time of the present, people such as Jonathan Crary (2013), Paul Virilio ([1997]2008), Mark C. Taylor (2014), Benjamin Noys (2014) and others who align themselves with the different wings of the accelerationist movement have argued, particularly with regard to questions of time, modernity and neo-capitalism, that the contemporary experiences of digital culture are a product of increasingly fast computational processes that are the product of the steady and accelerating development of the modernist project. They turn to what Serres ([1983]2015) calls ‘the up stream of time’ and see a chain of events flowing in one direction with an ever increasing speed. But these arguments, while presenting an understandable narrative of events, tend to reframe the more traditional view of time as a line, as though footprints leading out of a cave, with contemporary temporalities the effect of a long string of causes, feeding into one another; in this case, it has to do with accelerating speed. Somewhat differently from these voices, what I try to describe in this book is a story less about development and speed and more about fragmentation and the recurring points at which the time criticality of media has been and continues to be expressed. The book is more about deceleration and time expansion involving delays and storage than it is about the time compression of accelerationism. In using this method of analysis, more media archaeological than historical, I am interested in seeing first how time was engineered in the experimental and discontinuous stages of media developments in film, television and computer history, and next in explaining how this can be seen to have real effects on ubiquitous twenty-first-century media, not only through the speed at which they carry out processes but also through their information processing routines themselves. In offering an alternative to accelerationism, we are able to understand the post-historical not as a moment without time – not as a moment where things move too fast for us to have any time left – but as one in which subjects are able to occupy a radically different, multiplied type of time.

As already mentioned, mirroring the arguments of the book, the media history that is presented is not the one based on the chronology of a line, with developments traced from their beginning to their realization in mass media. Instead, following in the wake of figures such as Wolfgang Ernst, Bernhard Siegert, Sybille Krämer and of course Friedrich Kittler, the book presents media discontinuities. The book presents sections of media history – interruptions in the timeline, time-discrete events – in order to explore the types of temporality produced

at experimental stages of a medium’s development. More than just an account of media technical processes however, the book uses this mode of analysis to then begin to describe the multiple temporalities produced in contemporary media culture, as older inventions and solutions are folded into the digital present.

Technological determinism

The phrase ‘technological determinism’ is often used against the lines of argument offered to us by the foundations of media philosophical inquiry. The work of key protagonists such as Kittler and Marshall McLuhan tends to push against the grain of a more traditionally humanist media theory. Raymond Williams made a stinging critique of McLuhan as an a-political technological determinist and spent the first pages of his landmark book on television setting out his opposition to McLuhan’s mode of analysis. A disregard for McLuhan and medium theory in general and particularly in British cultural and media studies followed (now beginning to be overturned with recent developments in software studies and a turn towards media archaeology). Certainly, my claim that analytical media support some kind of cultural anachronism might attract the criticism of replacing human agency or ideological systems with technological determinism. In the face of this potential sticking point, most books and essays written in the tradition of medium theory begin with a few words to try and guard against this criticism.

In my opinion, the problem is not with whether or not the technological can determine the social or vice versa, but rather the inability of the two to be separated in the first place. If, following McLuhan, media are viewed as aesthetic rather than purely communicative tools, as environing processes rather than simply transmission devices, then everywhere we look, from work to leisure, technology and the apparatuses that exist at the relation between humans and human-made artefacts increasingly provide a set of conditions that suggest ways of being in the world. If we can define technology, following Jonathan Sterne (2014), as ‘repeatable social, cultural, and physical processes crystallised into mechanisms’ (121), a definition that is in complete accord with the process philosophy given to us by Alfred North Whitehead, then we might come closer to realizing that technology, as these repeatable processes that become formalized in material hardware, continually imposes itself on social life, in a radically non-hermeneutic, non-sensual way. The technology that is seen to be determining of social relations is not simply the hardware and software of cinema, televisions and computers, but the human techniques, processes and ways of operating that settle as sediments in the materiality of these machines.

Faced with contemporary culture, it is difficult to dismiss technological determinism completely. As Nicholas Gane (2005) argues, as machines ‘learn’ to design and communicate with other machines with little human input, and shape all aspects of the lived environment (from the cars we drive – which increasingly drive us and maintain themselves – to the neighbourhoods we live in, which are structured to an even greater extent by geographical information systems) the power of technologies to determine human life is becoming ever clearer. (40) What is needed is to begin to bring this situation into view, rather than naively dismissing it, to uncover the cultural phenomena that it supports, to explore the dangers, the possibilities, what is lost and what might be gained. What is also needed is an eye to the elemental networks of human and technological processes, as their connection produces cultural techniques, rather than one being merely determined by the other. It is in this sense that technology, following the speculative philosophy of Whitehead, is an objectification of a prior process and it is this process that is determining of future potential social conditions. We programme technology and it ends up programming us. The subject as an a priori category is displaced by the emphasis on the process, the in-betweenness, of media where subjects and objects do not sit comfortably side by side in separate boxes, but are instead, like Whitehead told us, inextricably linked in the world of experience. The trick is to think of technology as a process, rather than as an object. After both McLuhan and Kittler, it is conditions rather than subjects, processes rather than objects, which should be the focus of media studies. In this book, I will not describe the experiences of individuals or groups (a task which has otherwise been undertaken by Sarah Sharma (2014), Judy Wajcman (2015) and Jon May and Nigel Thrift (2001), amongst others). Instead, based on archival research and critical reflection, I try to take on the task of analysing the production of the conditions for the possibility of experience. This task involves thinking the transcendental but it is also fundamentally technical. Instead of placing the human at the centre of experience and explaining the way people describe the experience of being-in-time, I look to conditions – the engineering developments that produced the conditions necessary for there to be an audiovisual discourse in the first place – as a way to get beyond recourse to subjectivity, a way that following Deleuze is both transcendental and empirical. Technical functions and their histories are invisible, undercover, meaningless and not open to consciousness directly. But they are experienced through images, texts, screens and relationships. These conditions, the

crystallization of techniques in technical artefacts, are not ones that are fixed for all time but are capable of arising and disappearing, like faculties. As Dominic Smith (2015) argues, ‘attention to the transcendental need not lead to the reification of any term whatsoever (not “God”, “Consciousness”, “Mind”, “Matter”, “Life”, “Being”, “Language”, or “Technology”); rather, what counts is attention to the relation between the empirical and the transcendental, a relation between “facts” and their conditions that must be persistently scrutinised and re-invigorated through critique’ (549). In short, our task, as scholars intent on the analysis of media, is to think about the way experience is organized and striated by a set of conditions, and in this book, because it is designed to bring into view a media philosophy of time, I look to the technical conditions for the possibility of conscious experience – what Zielinski calls the ‘deep time’ of analytical instruments – written into twenty-first-century media. What processes are in operation? What are the histories latent in these processes? What conditions do these processes support? What functions are carried out? And how do these functions produce theories, or ways of coming to know, the world?

Media anachronisms

Part of this book is spent on (discontinuous) media historical detail, part is spent on philosophical reflection inspired by current media events and part is spent exploring the way contemporary artworks give form to the conditions of the temporality of the present. But here, let’s return to Courbet’s painting.

The artworks most often noted as giving aesthetic form to the temporality of life under modernity were the great works held together under the banner of ‘Realism’. The faces in these works, those mostly turned away from the viewer, those painted by the great modern artists with names like Gustave Courbet, Jean-François Millet and Gustave Caillebotte, and hanging in the world’s most imposing museums, reveal the effects of a crushing length of time. Figures are weighed down by a flow of work, crushing rocks, gleaning wheat, scraping floors and working in the fields. They turn away from us, into their own time. Or sometimes they look straight at us, punctuating our own time, the time of viewing, with traces of their own. Time at these moments extends over actions, containing them, compartmentalizing, encapsulating them within the spheres of modernity. These artworks are historical, but not in the same way that French academic painting was historical; they are not about celebrating great events of historical significance. Instead, they simply place events in historical, linear time. They give a context to events. They say ‘this event took place at one point in time’.

The painting reproduced at the start of this book, however, Courbet’s major work, the one that I want to use to begin an exploration into the conditions of contemporaneity in twenty-first-century media culture, shows us something different. And this is why it is special and provides a good metaphor for contemporary media temporalities. Time in this image is much more complex than simply a linear flow of work towards death. The title given to the work by Courbet in the register of the Salon was Tableau de figures humaines, historique d’un enterrement à Ornans (Painting of Human Figures, the History of a Burial at Ornans). The painting is not of an event, it is not of the historical facts of the burial, but a painting of a radically different type of history of an event to that usually favoured by the academy. The work is post-historical because it refuses to be a part of historical time. This is a multi-temporal, messy scene, with lines going off in all directions. Many people have already said that this is an image of one unexceptional moment, perhaps the burial of Courbet’s great uncle, filled with ugliness, which is certainly not usually the subject of the large canvases earmarked for French academic painting. But it is not just this that makes the painting into a post-historical image. We do not just see one moment in a history of events. We see instead a thickening of time, a multi-temporality, with gestures and images gathering together events, the pleats of time, in one image, where, as Serres ([1993]1995) would argue, the instant represented by Courbet is made up of moments both contemporary and archaic. The history of the burial is a history of many events drawn together; as the mourners gather around the grave, it is not the history of a line. The treatment of the paint, the use of blacks, recalls the traditions of seventeenth century Spanish art and the technique of tenebrism. The subject matter is reminiscent of El Greco’s The Burial of the Count of Orgaz (1586). The sky resembles the Dutch master of 200 years earlier, juxtaposed with figures, contemporary in their rendering. Images of Christ dominate the composition, the history of Christianity, brought together with images of everydayness, images of modernity, images of mourners, and images of politics, a mayor and a masonic judge along with the more ancient rituals of the burial. All of these individual human figures and symbols come with different senses of time: they all have their own tempo, their own rhythms, that they impose on the life of the scene. The eye moves from one time to another, until it is drawn to the black hole in the ground, at the bottom of the canvas flanked by a skull. If you visit the Musée d’Orsay and stand in front of the vast canvas, this hole is right in front of your eyes, but no one really looks at it. It provides more of a context from which to see the rest of the painting. As Michael Fried (1990) points out in his own reading of the work, ‘both the location and the treatment of the grave bears witness to a resolve to cut the ground out from

the […] beholders feet’ (133). The viewer is situated in the hole, the ground cut out from his feet, surveying the scene, in a void between the temporality of viewing and the temporality of the painting. The crowd ebbs and flows, moving both right and left in the composition, their histories, their rhythms of life, pulling them in different directions. As already mentioned, there are many different models of transmission in the image. Some figures are involved in their own conversation, the women at the right of the canvas circle around one another. The small boy looks to the pall bearer. Some figures, the man in the middle of the composition, the woman covering her nose and mouth, the clergyman holding the crucifix, stare at us, including the viewer in the composition. They all crowd around the dark hole. It anchors the composition.1 It organizes the historical events into a multi-temporal scene. They look down. They look into the grave, one of the first pieces of media used to organize a culture. It organizes this culture by drawing together the multiple times of the mourners into one multi-temporal event. A cursory definition can be given to media as a process by which information is transmitted from one point, whether in space or time, to another. The grave carries information – ‘what used to be a person who shared in your ideas of society lies beneath this spot of ground’. It contains material traces from the past; it forms the foundation for the tradition of the burial. It draws together the past and the present, acts as a storage media, a special case of transmission media which transmits events non-chronologically through time and in so doing produces storage time. It articulates the past and the present, both through their similarities and their differences. The grave marks the death and the life. It is a medium which persists in time, unlike the conversations and glances. There are multiple transmission events depicted in the painting but these are all organized around the function of the grave and the ritual of the burial. The figures look to the grave and see the abyss of time: The cave that sucks the events of the world into itself. The tomb, the storage medium, the material trace of a ritual, produces a media temporality. This is what the painting offers at its heart: the tomb, the thing that conditions the events of the painting, the thing around which the scene circulates. The point: this image, given aesthetic form in Courbet’s great painting, the black hole, this old medium, carries out the function of organizing temporality. If we can understand the grave in Courbet’s painting, and the rituals that it materializes, as a storage medium for ordering time, how might this then be used to reformulate the way we describe the relationship of time to other, more traditionally understood media? This is a question that involves not just an exploration of the technical engineering of time in media history but also a reformulation of the concept of media determinism made possible, following McLuhan, by situating media as aesthetic, as world building, rather than as a communication tool for transmission.

The open grave organizes the painting both compositionally and conceptually. Vector lines meet at the black hole. The crowd gathers to witness the body buried in the earth. Their faces, as the eye moves from one to the other, form diagonal lines on each side of the composition, pronounced by the two book ends, the religious official and the man in teal, which intersect at the open tomb. The clouds signal the passing from light earth to black earth. The hole is the medium for experience. It is the in-between that brings together the mourners, the clergy, the mayor and the judge. This medium does not only transmit, as most communication theory would describe it. It does not simply transmit information about the body buried in the earth. It sucks events into itself. It sucks the gaze of the Christians, the politicians, the law makers and the citizens into itself. It is in this sense that it works against transmission. All events are directed towards it. This is the task carried out by all post-historical media. They translate events into scenes: they operate as transducers.

Post-historical media

John Durham Peters (2015), in a tradition begun by McLuhan, has argued beautifully that media should be conceptualized as environments for living, rather than mere communication channels. ‘Media are not only carriers of symbolic freight but also crafters of experience’ (Peters 2015: 15). Communication and entertainment networks, and the rituals that they materialize, pull the time of events into themselves in order to translate events into scenes. They are like the medium of the tomb in Courbet’s painting. They make events meaningful, persistent in time, as a state of things. At their beginnings, before they were to become mass audiovisual media, when they were still fragmented as pieces of experimental research, we see this most clearly as inventors grappled with ways of translating variations in experience into transmittable signal. In the age of synthetic media, when the eye moved along printed text and the film ran through the projector, events could be reproduced in a line and History could proliferate. In an age of analytical media, which privileges time-discrete signal processing, the line is replaced by the storehouse, the hole in the ground. Transduction then, the turning of time-continuous events into time-discrete signals for processing, becomes the media technical process that needs to be addressed in developing media philosophical concepts of time and contemporaneity. The unfamiliar and multiple temporal systems are a product not only of media’s technical operability but also of the polychronic temporality of the histories of media themselves. As has been argued by media archaeologists including Wolfgang Ernst, Siegfried Zielinski and Jussi Parikka,

if we conceptualize supposedly ‘new media’ within the broad spectrum of the history of media practices, it begins to look rather old. Digital media retrieves ancient media practices such as registers, indexes, the census, calendars and catalogues (Peters 2015: 19). Ever since the Mesopotamians used tokens to count agricultural goods and then began inscribing marks to replace these tokens, inventing pictographic writing, the analytical, the discrete, has been used to keep track of the world; after all, when the first writing system developed, it was an offshoot of a method of counting developed around 8000 BCE (Schmandt-Besserat 1995). These techniques, the archaic analytical practices that linked writing (information) and counting (processing), now folded into the digital present, ‘have always been in the business of recording, transmitting, and processing culture; of managing subjects, objects, and data; of organizing time, space, and power’ (Peters 2015: 19). Data processing techniques, the modes of organizing time and space, things that came before culture and gave it its character, as argued by both Innis and Kittler, now reemerge in digital contemporaneity, not as record keeping but as ubiquitous media. Once they were used only by officials; now they are everywhere. Synthetic media, the mass media of the twentieth century, was the exception in media history (Peters 2015: 19). Synthetic media – synthetic in that it produces a new whole from fragments – produced a mass; it produced mass culture. Analytical media, on the other hand, breaks down this mass; they count it, separate it and unwind it. Synthetic media presented a smooth flow of entertainment and news, which was exceptional, but they achieved this through an infrastructure, albeit invisible, that relied on the precise measurement, storage and organization of data, which was itself a much older practice related to the more ancient techniques and one that is now far more visible in the computationally defined instant. Now that audiovisual culture retreats from the public space, now that entertainment and news of all kinds is tailored to individuals rather than a public, now that it no longer produces a mass but a fragmentation, media return to their analytical roots. The time of analytical media now works against – no longer simply a support structure for – the transmission time of the synthetic media of modernity.

Computing

To begin to explore the conditioning produced by the apparatus of analytical media, a good place to start is with Charles Babbage and his work on programmable mechanisms. During his time at Cambridge, Babbage, a figure that would establish the foundations for the conditions of mechanical computing, with a small group of friends founded ‘The Analytical Society’. The goal of these ‘young infidels’, as he describes it, was to peer into the work

on universal notation, supporting Leibniz, rejecting Newton, and do work that explored the very conditions of the mathematical system. Babbage would of course go on to make astounding discoveries that brought with them very new views on the foundations of analysis, not just applicable to number but also to concrete experience. Algebra, he writes in his unpublished Essays on the Philosophy of Analysis (Babbage circa 1820), appears at its first invention to have consisted of little more than the employment of a letter to represent a number to be determined by the conditions of the problem. This became more complicated as symbols began to stand in for time–space directions (Newton) and infinitesimal points (Leibniz) and the notations themselves became conditions for logical analysis. As Leibniz writes, the great value of algebra, what he calls ‘the art of symbols’, lies in the way it ‘unburdens the imagination’ (Beaney 2003). The conditions that allowed proofs to be garnered mechanically were unburdened by the vagaries of mental processes. It was not the mode of presentation that determined truth or falsity but the conditions of the mechanical processes. In 1812 or 1813, in a room in the Analytical Society, Babbage, after labouring over a table of logarithms, began to fall asleep at his desk. As his eyes became heavy and his head began to fall forward, another member woke him up by yelling through the door ‘Well, Babbage, what are you dreaming about?’ Startled, he answered, pointing to the logarithms, ‘I am dreaming that these tables might be calculated by a machine’ (Babbage 1864: 42). This dream would remain in Babbage’s mind until about a decade later when he designed for the Astronomical Society a calculating machine able to deliver printed outputs, using a clockwork mechanism to automatically control wheels with numbers on their edges. After building a small model of this machine for the Society, he went on to spend the rest of his life designing the Difference Engine and then the Analytical Engine. From this point, never before in human history had a culture, via the calculating medium of the computer, attempted to measure and master the world using real numbers (Ernst 2016: 78). At this point also debates and criticisms of media determinism become unthinkable because it becomes impossible to separate media processes from experience in the first place. As McLuhan would suggest, figure and ground become reunited. Thinking subjects and unthinking technologies could no longer be kept in such separate boxes (Peters 2015: 88–89). It is not the thinking subject alone that conceptualized the world, nor was it the thing in itself, that meaningful object, word or equation that could offer a way to think about the events of the world, but, as already mentioned, the condition for the production of the thing itself, the meaningless conditions for possibility, the carefully timed functions, the media, that brought the world into actual existence. Konrad Zuse, the son of a Prussian postal worker, the figure who would be the first to deliver a full realization of the ideas of Babbage, first enrolled

in the construction engineering programme at the Technical University of Berlin. Zuse in his own words felt ‘born to construct’; he wanted to create something new, a new language, a new system, something eccentric that did not function in the way established by others (Zuse interview with Merzbach in Computer Oral History Collection 1968). However, he immediately became put off by the ‘voluminous computations’ that were required. Zuse had a distaste for mathematics both in school and in university. The computations he was asked to perform as part of the construction engineering courses prompted him to say ‘that is really not right for a man, that is beneath a man. That should be accomplished by a machine’ (Zuse interview with Merzbach in Computer Oral History Collection 1968). Calculating machines could be used to crunch numbers, but could a programmed machine be developed to solve more complex problems, such as those that Zuse found so tiresome? Years later, in 1936, Zuse submitted his first patent to the US patent office on the concept of programmed control. The patent examiner sent his application back, commenting that the work that he was proposing had already been undertaken by an Englishman and that Zuse should consult his work. This was the first time that Zuse heard of Babbage and the criticism of his patent application was to lead to a breakthrough. Babbage’s work supplied Zuse with a starting point, to which he could apply the logarithms on which he was working and develop fully his Plan (programming) Kalkul (formal language) given form in the Z1. Binary had been known amongst mathematicians since the work of Leibniz, but Zuse came at the concept from an engineering perspective. It was the ideal number system to work with the relays that were the basis of his machine. Howard Aiken, one of the pioneers behind the Harvard Mk1, also discovered Babbage’s work through a happy set of circumstances that had considerable effects. Working in the Physics department at Harvard, Aiken proposed to his colleagues that he commence work on research into the development of computers. There was little interest. Voices came back: ‘we already have a machine like this and no-one uses it. It’s been stored in the attic gathering dust’ (Aiken interview with Tropp in Computer Oral History Collection 1973). Looking for the machine, Aiken found two of Babbage’s wheels from the difference engine, and this inspired some profound discoveries in the history of computing. Along with the influence from Babbage’s engine, Aiken went on to build his machine from different pieces of already existing technologies. Techniques and technologies from the telephone industry, such as the teletext, tape, switching theory and printing telegraph were assembled together as ‘a broad co-ordination of fragments that already existed in 1937’ (Aiken interview with Tropp 1973). There were a number of major breakthroughs in computing around this very fertile niche in media history that have had marked impacts on the way time

is now organized and experienced in the twenty-first century. One of the most instrumental was the realization that vacuum tubes did not have to reproduce current exactly in order to process signal; they merely had to reach a threshold. Signal could be processed by a tube that was either on or off. As Mauchly, the engineer responsible for early counting machines, puts it: signal processing was no longer about fidelity and linearity but actually about the opposite – and this is what made it realizable (Mauchly interview with Merzbach in Computer Oral History Collection 1970). Another breakthrough in terms of the development of post-historical media was the development of memory and the conversations between von Neumann, Mauchly, Eckert and Goldstine that led to the so-called von Neumann architecture so instrumental to modern computing. Eventually, transistors replaced tubes and delivered reliable forms of memory. Volatile electronic charges are now used to represent and transmit information. They become the second element (the transmitter), coming after the information source, in the five elements described in Claude Shannon’s communication chain, which transmits signal via either modulation or coding and intermediate data storage. They act in ways that could be described by the philosophically rich phrase arché: although they come after the information source in Shannon’s model, they act as the principles for the existence of information at all within the communication chain. They act as the conditions, the underlying substances and processes, from which knowledge, presentness and the post-historical emerge. It is the media philosophical task to begin to reflect on the way these technical conditions and their genealogy impart themselves on the transmission and translation of events. It is the media philosophical task to reflect on the loss of the human power over time (history), which is now taken up by media apparatuses.
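The point about thresholds can be made a little more tangible with a small sketch. What follows is purely illustrative: it is not drawn from the book or from Mauchly's machines, and the voltage levels, gain errors and noise figures are invented for the example. It simply shows that a badly distorted, noisy signal still yields the exact discrete message, because the only question asked of each level is whether it crosses a threshold.

```python
# Illustrative sketch only: threshold detection, not code from the book or
# from any historical machine. A "tube" does not need to reproduce the
# incoming waveform faithfully; it only needs to decide whether each level
# crosses a threshold (on or off).

import random

random.seed(1)

bits = [1, 0, 1, 1, 0, 0, 1, 0]        # the intended discrete message
HIGH, LOW, THRESHOLD = 5.0, 0.0, 2.5   # invented voltage levels (arbitrary units)

# Transmission distorts the signal: gain error and noise mean the received
# levels are nowhere near faithful reproductions of HIGH and LOW.
received = [
    (HIGH if b else LOW) * random.uniform(0.7, 1.3)  # gain error
    + random.uniform(-0.8, 0.8)                      # additive noise
    for b in bits
]

# Threshold detection: the only question is "above or below?", so the
# distorted analogue values still decode into the exact bit sequence.
decoded = [1 if level > THRESHOLD else 0 for level in received]

print("sent:    ", bits)
print("received:", [round(v, 2) for v in received])
print("decoded: ", decoded)
assert decoded == bits  # fidelity is irrelevant; crossing the threshold is enough
```

The design point, in Mauchly's terms, is that the discrete decision is robust precisely because it throws fidelity away.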

Media philosophy

But why media philosophy? Why not just media theory? Media theory describes a wide range of practices, all of which deliver their own definitions of media and the rituals and techniques that accompany their use, from empirical, sociological and anthropological inquiry to more basic theoretical inquiry and critical reflection. Rather than delivering a theory of the media, media philosophy establishes the conditions for reflection on the technology of media. Media philosophy finds its place beyond media theory by conducting an exploration of media that asks: What are the fundamental concepts and experiences produced by the technical infrastructure of the media apparatus? What are the epistemological effects of transduction, transmission and storage? What are the conditions in-between human subjects and technical media that give form to both objects and experiences? Just as philosophers

of language argue for a rigorous investigation into the conditions for meaning and the relationship between language and reality, a philosophy of media looks to the medial conditions for life in describing experience in this inbetween constantly mediated and technical universe. Instead of language and semiotics, which proved so valuable both to the structuralist visions of the world and its reformulation in post-structuralism, media philosophy looks to technical codes, operability and data processing and storage routines. The world of communication could be broken down into signs and signals. Media studies of time look to the embodiment of time in signal processing rather than its representation as a sign. ‘Time-critical media processes are embodied not in symbolic signs but in indexical signal’ (Ernst 2016: 173). Media philosophy thus makes the transition from semiotics to the ‘media-technical time event’ (Ernst 2016: 173). It is the task of media philosophy to examine the role of media – physically grounded in matter, energy, transmission and processes of transduction – as evoking and provoking collective concerns on a daily and hourly basis, creating what Peter Sloterdijk ([2011]2016) refers to as the vibrating nervousness and social cohesion of communities (7). Conducting this task we might see the ‘chronic, symbolically produced stress’ (8), those black holes and black boxes that act as the maintenance for social cohesion. A tradition of philosophical reflections on media has been established and the early drafts of a history of this reflection has started to be written and analysed (Parikka 2012; Krämer 2015; Hansen 2004). This tradition however is difficult to chart. This is mainly because it draws from complicated, confusing and often heterogeneous older traditions, including semiotics, history and literary study. Academic philosophy has come to the party rather late. As Sybille Krämer (2015) argues ‘core areas in philosophy, like the philosophy of spirit and language, epistemology, and the theory of science, not to mention ontology and metaphysics, still remain largely unaffected by the issues in media theory. Why is philosophy struggling with these questions?’ (28). In his essay ‘Towards an Ontology of Media’, Kittler’s (2009) answer to this question is that, since the ontology developed by Aristotle, Western metaphysics has been unable to deal with media as media. The focus was on things in themselves, their matter and form, not on the relations and the ‘in-betweens’ of things in time and space. Or as Mark Poster writes, the major cultural theorists of the 1970s and 1980s, such as Deleuze, Foucault, Derrida, Lacan, Habermas and Althusser ‘either paid no attention at all to the vast changes in media culture taking place under their noses or […] commented on media only as a tool that amplified other institutions like capitalism or representative democracy’ (Poster 2010: 2). These figures, although focusing on communication, archives and aesthetics, often in terms of only one type of media apparatus (cinema for Deleuze, language for Foucault, material inscription practices for Derrida, print for Habermas and the cult of personality of Althusser), gave little attention to technical processes and

often tried to bypass the study of technical media per se as it was emerging in popular culture. While there are others that commented on media more directly and holistically, such as Pierre Bourdieu, Roland Barthes, Walter Benjamin, Theodor Adorno and Jean Baudrillard, they usually focused on the content of media systems rather than the function of transmission, storage and transduction that constituted the technical functioning of the communication chain. They thought ‘through technology in a fundamentally philosophical way without taking into account the actual technical conditions of the apparatus’ (Ernst 2016: 149). An emphasis on the technicity of the global ecologies of media – and the conditions for emergence that this ‘in-betweenness’ engenders – remained largely unwritten in the continental philosophy of this era. To find the technical explorations of culture, one needs to look towards figures such as McLuhan, Innis, Ong and Havelock in English-speaking theory and towards Flusser, Kittler, Ernst, Krämer, Siegert and Zielinski in the German tradition, figures that stood outside traditional philosophy and offered a new emphasis on the ontology of media rather than its content, figures that built on the legacy of Martin Heidegger and his overcoming and reformulation of metaphysics, which made possible a philosophy of media.

The term media philosophy has previously described an approach to communication that focused on the transmission of information. Most notably, figures such as Serres (1982; [1993]1995; [1982]2007), Krämer (2015), Douglas Kahn (2013) and Peters (2015) have set out to reformulate concepts of media and communication in ways made possible after Shannon’s famous mathematical model of information. These figures, all in one way or other, grappling with the conceptualizations of physis and technē made possible after Heidegger, focus on the function of communication channels, the ontology of ‘in-betweenness’ as either ways of uniting or ways of separating entities. They focus on functions as diverse as that of angelic visitation, noise on radio channels, weather systems, the postal system, monetary exchanges, the transmission of diseases and many other examples of either material or symbolic transmission events that formulate links between technology, media and nature, human and non-human. As well as a media philosophy of transmission, a tradition of work has carved out fundamental media philosophical concepts via the exploration of storage media. Most famously, Kittler argued that the storage medium of the gramophone had cultural effects that could not be overestimated. Writing once held the monopoly on storage, turning events into the symbolic and enacting a process of filtering out whatever did not fit into the system of notation. The introduction of the gramophone weakened writing’s dominance and it became possible to store the acoustic goings on in the world, both in terms of sound and noise, which had previously been filtered. For Kittler, the gramophone achieved nothing less than the replacement of the symbolic with the actual

recording of the effects of the real, the stochastic disorder of bodies, in the discourse networks of 1900 (Kittler [1986]1999: 16). After Kittler reformulated them, Lacan’s concepts of the real, the imaginary and the symbolic were able to be conceptualized as that which is now coming in over information channels, what is transmittable and stored. But this apparently ‘real’ recorded on the gramophone, the symbolic of typed letters, the imaginary of the film, are able to be stored precisely because they are able to be transduced into material forms (photographs, the finite stock of a typewriter or records). Since then, figures such as Wolfgang Ernst (2013), Matthew Kirschenbaum (2008) and Jussi Parikka (2012) have developed a thoroughly archaeological approach to digital memory, looking into the mechanisms that have produced the archival characteristic of the present.

The media philosophical approach of this book owes a great deal to these important figures but, rather than storage or transmission, focuses on the concept of transduction as a foundational media function. Transduction takes place as both storage and transmission. In order for signal to be transmitted in either space (transmission) or time (storage), it is modulated by the transmitter into a form that can be passed through a carrier medium, such as film (frames), books (type) or data (volatile electronic charges). This process involves the transition of signal from one state, such as changes in air pressure, to another, such as electronic pulses. It is the very process that Courbet’s work gives form to, as the open black tomb draws into itself the events of the funeral. The tomb orders the events; it makes them meaningful by giving the whole scene the character of a funeral. This is the function that allows media, before it becomes communication media, to act as a device that measures phenomena and makes its analysis possible. It is here that the media philosophical approach offers novelty.

The media philosophical gesture is to turn one’s back to the future and the present and reflect on the past in order to uncover the structures for experience, what McLuhan would call the ground from which figures (the future and the present) emerge. As Foucault said, it is to make the facile, the everyday, the taken for granted, the usually unthought into the topic for interrogation (Foucault 1988: 154–155). The gesture in this book is to explore media before it becomes mass communication media, both in genealogical terms, as I turn back to the experimental beginnings of mass media, and also in a technical sense, as I look to the technical functions that operate before information is able to be either transmitted or stored. In short, this is neither a book about communication nor a book about storage. Instead, this is a book about media transduction. It explores how transduction, as time-continuous events become defined as time-discrete samples in order to become computable, enters into human culture and produces a media theory in the most literal sense of the term (a theory of the world that is produced by media themselves). This is a book about measurement and the organization

of time-based events at moments in the engineering of what would become mass media. At this stage, history of course continues: events, ‘even large and grave events’ (Fukuyama [1992]2006: xii) continue to occur. But these events are turned into post-historical scenes, which mitigate the intellectual atmosphere conjured by time-continuous information (reading, watching, listening). Looking to the technical function of media the book tries to draw out things that are by their nature invisible and meaningless. As Krämer (2015) points out, ‘we hear not vibrations in the air, but rather the kettle whistling; we see not light waves of the yellow colours spectrum, but rather a canary; we hear not a CD, but rather music; and the cinema screen “disappears” as soon as the film grips us’ (31). Media makes things visible whilst withdrawing into the background. When they work well, they are seamless, they are what media theorists refer to as ‘transparent’, only to be sensed vicariously through messages. This book tries to bring into view the often invisible and meaningless functions of media, the function of drawing events into itself, rather than the function of transmission that has since Shannon dominated the field. It asks, ‘How do these invisible, meaningless events start to matter’? But this is not simply a task carried out because of an interest in the history of technology or the archaeology of media, but rather with an interest in the phenomenal. Unlike many media archaeological investigations, which have previously been criticized for forgetting humans entirely in place of an exploration of electronics, the human experience of using or being used by technology remains a central concern. It is only that this human experience is considered, following Alexander Galloway and Eugene Thacker’s formulations, as enmeshed in and inextricable from media channels, what they call the ‘elemental’ aspect of networks, which include both technical and nontechnical media; the unhuman element of networks that ‘nevertheless do not exclude the role of human decision and commonality’ (Galloway and Thacker 2007: 155). The book follows Galloway and Thacker’s impetus and continues their argument that ‘the individuated human subject is not the basic unit of constitution but a myriad of information, affects, and matters’ (Galloway and Thacker 2007: 155). To be human is to be in constant contact with the nonhuman and to stand before the non-human, to stand before the black hole, at every moment of individuation. It is to come to terms with the fact that in order to hope to begin to describe human experience, one must also be able to describe the way this experience is defined and measured by media systems, broadly understood. As Flusser writes in Post-History, Western culture seeks to transform itself into an apparatus (Flusser 1983[2013]: 9). What characterizes the West is the tendency to objectify all phenomena, to turn it into an object of manipulation. This project of transcendence situates the apparatus at the centre and allows it to measure and define its

subjects as objects. It is in this sense that the history of the West, the march of progress, has not ended per se; rather, 'all unrealized virtualities are infected by apparatus' (Flusser [1983]2013). It is in this way that we can speak of a post-historical climate. It is in this way that programmes, and their technical grounding, begin to 'write' history. It is, as Galloway and Thacker put it, to see the unhuman in the human.

Chapters

The first chapter of this book provides an overview of the media philosophical approach from which I take my inspiration and starting points. McLuhan, Kittler and Flusser are introduced and aspects of their intellectual projects, particularly those concerning time, are woven together to begin to set up a theoretical framework for the argument. Finally, Whitehead is introduced, and, reading his work alongside McLuhan, Kittler and Flusser, he is given a new identity as a media philosopher who offers a way to conceptualize post-historical media as a movement between being in and out of time. Whitehead argued throughout all of his work, particularly in his major works Process and Reality and Adventures of Ideas, that the tendency for Western descriptions of reality to remain fixated on staid objects, as though measurable elements, needed to be overcome. But he also argued that a philosophical approach focused on pure becomings and continual flux, such as Spinoza's, was itself insufficient. The task for a satisfactory cosmology, in Whitehead's view, was to find categories that could situate reality between the two. Analytical media removes the becoming and reinforces the view of reality as measurable and staid objects. Whitehead is used to first identify this tendency of thought, situating objects and scenes outside of time. He is then used to try and show how media philosophy can begin to explain the post-historical as not simply a mode of being that is outside of time and process, but one which is in time, albeit a radically different time from linear, Historical time. Chapter 2 begins from the premise that what was once described as cinematic temporality, and which was once seen to proliferate throughout modern culture, far beyond the cinema screen, has more or less ceased to be able to represent the conditions of the contemporary. The chapter begins to probe into the history of computing and the history of experimental storage media in order to describe discoveries that were important in the digital organization of events. It asks how the operation of analytical media can be seen to produce temporal systems that have come to define the condition of contemporariness. It then looks to the media artworks that reflect the non-cinematic, digital temporalities that characterize this condition and offer ways to rethink the ontology of this moment. The chapter then concludes

by using examples from YouTube's archiving of history to illustrate the new temporalities with which histories are now being memorialized and to offer ways that media philosophical reflection, like the media art approach, can describe and reformulate the time of the present, beyond recourse to an a-temporal eternal present. Chapter 3 begins the book's rear view analysis by exploring a wide range of chronophotographic and cinematographic experiments carried out at the beginning of the twentieth century. In these examples of experimental media, photographers, scientists and physicians began to pass events through a technical imaging machine in order to study the outcomes. Marey and Muybridge, two of the most well-known figures in the history of pre-cinematic inventions, assembled devices that were able to arrest movement and then measure and chart its constitutive parts. Based on clinical mechanisms that were previously attached to the body to measure pulse rates – as beats in time – these inventions allowed biological phenomena to be recorded, coded and measured as discrete values. This had vast phenomenological, economic and social effects, involving the colonization of time and the segmentation of experience, particularly for those being measured. In Chapter 4, I explore the temporality produced as the 'problems of television' began to be solved around the end of the nineteenth and beginning of the twentieth centuries. Specifically, I explore how the engineers of experimental television, drawing attention to the very process of image transmission, grappled with time-based problems of synchronization and delay in ways that marked out the temporality of television culture for years to come. The mediated event of television is defined in this chapter with close reference to Whitehead and then worked through the historical detail of the development of experimental television, with specific reference to the early broadcasts of the 1930s and the manner in which engineering solutions to the 'problem of television' structured the temporality of performances in front of the camera. Following on from this, in Chapter 5, I continue an exploration of the technical development of television and read this genealogy into a contemporary example of broadcast journalism, focusing on fragmentation as both a technical representational device and a way of addressing the contemporary realities of television, particularly in the face of terror and trauma. The photographic camera reproduced images as the chemical effect of light upon grains. Television then fragmented the temporal moving image into micro-elements in a way that is far more programmatic than the photographic organization of the image. Philo Farnsworth's early pick-up and scanning device was even called the 'image dissector'. With electronic television we see the first medium to analytically break the temporal image into automatically and mathematically organized bits. As is shown, through an analysis of contemporary television's

treatment of terror and the contingent and its storage of history as an aftermath of trauma, the solutions to time-based technical problems are given form through the aesthetics of the medium. The history now presented by the television, via its techno-aesthetics, through its technical operability and display of images, including storage, repetition and also live transmissions of the contingent, produces a viewing time of the aftermath of history. The book ends by showing how the technical is articulated into the aesthetic and how the media time events that are usually invisible are expressed by audiovisual mass media.
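Because the argument that follows turns repeatedly on this point – that transduction defines time-continuous events as time-discrete samples so that they become computable – it may be worth making the operation concrete in the most schematic way possible. The short sketch below is an illustration only, written for this introduction rather than drawn from any of the devices examined in the chapters ahead; the signal, the sample rate and the number of quantization levels are all arbitrary choices made for the sake of the example.

```python
import math

# A time-continuous event, stood in for here by a pure 5 Hz tone: at any
# real-valued instant t the signal has a value.
def signal(t: float) -> float:
    return math.sin(2 * math.pi * 5 * t)

SAMPLE_RATE = 40   # measurements per second: the time-discrete grid
LEVELS = 16        # quantization levels: amplitude is also made discrete
DURATION = 0.5     # seconds of the event that are captured

samples = []
for n in range(int(SAMPLE_RATE * DURATION)):
    t = n / SAMPLE_RATE                 # the only instants that survive
    value = signal(t)                   # measurement taken at that instant
    # map the continuous amplitude in [-1, 1] onto a finite set of integers
    code = round((value + 1) / 2 * (LEVELS - 1))
    samples.append(code)

# What remains of the event is a finite, computable series of integers.
print(samples)
```

What the sketch makes plain is the reduction at stake: what the apparatus stores and passes on is not the event itself but a finite series of measurements taken from it, and everything that falls between the sampling instants is filtered out before transmission or storage can even begin.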

1 Media Temporalities: An Introduction to the Media Philosophical Approach

The following chapter sets out an introduction to the media philosophical approach taken in this book to address questions of time and temporality. Beginning with Marshall McLuhan's now famous work and moving through Friedrich Kittler, Vilém Flusser and finally Alfred North Whitehead, who offers a way to re-read McLuhan, Kittler and Flusser, the following points become clear:

1. The concept of transduction will be the focus of this book's media philosophical inquiry. Previously, questions of storage and transmission have been the primary focus of media philosophy. The important work of Ernst (2013), concerning storage, and Krämer (2015), concerning time, has set out the conceptual imperatives and phenomenological effects of a culture that has become defined by both its archival tendencies and global transmission media. What I hope to do in this book is work towards a different objective for media philosophy by focusing on the function of transduction, which has been previously described in philosophical terms in the work of Gilbert Simondon (1964; 1992) and Adrian Mackenzie (2002) but also in information theory and electronics by mathematicians, engineers and media inventors.

2. Questions of time are central to this book. As such, I have taken a motivated reading of the work of McLuhan, Kittler, Flusser and Whitehead, focusing on the time-based aspects of their thought on media. The chapter should not be read as an introduction to these thinkers' entire body of work per se (which is otherwise beyond the

scope of any one book) but rather as a setting out of how they can be used to reflect on the post-historical conditions supported by analytical media.

3. There are of course a number of names missing, including Bernard Stiegler, Mary Ann Doane, Roland Barthes and Walter Benjamin, amongst many others that could have been included. These are all important media philosophers of time who have provided us with richly textured concepts with which to face media culture. The focus in this introductory chapter is on McLuhan, Kittler and Flusser, because these figures focus on the ontology of media, the reality of process, and provide a way to describe the hardware of culture as the technical base for the transduction of events into information. The chapter ends with a discussion of the work of Whitehead, focusing on his application to media philosophy. A number of important books have recently come out on Whitehead's philosophy and media theory, including Steven Shaviro's Without Criteria (2009) and Mark Hansen's Feed-Forward (2015), and perhaps my last book Time and the Digital (2012). What this section shows, adding to the recent activity in the field, is the new insights that Whitehead offers to the 'event' and the 'function' when recast as a media philosopher.

4. The analysis of McLuhan, Kittler, Flusser and Whitehead is written to initiate those readers that may be coming to this work unversed, but it is also hopefully written with enough content to keep the already initiated interested. This chapter is designed to provide a theoretical framework for the coming argument on analytical media and the conditions of post-history and as such offers an introduction to the key media philosophical works that will be drawn on as well as establishing a context for the current work.

Marshall McLuhan: Phonetic writing and temporality

Marshall McLuhan, one of the most important figures in the study of media, was, like Alfred North Whitehead, a deeply process-oriented thinker. This, in fact, may be one of his greatest contributions to media philosophy (although he personally would resist the term 'media philosophy'). His approach focused on the becoming of the human subject, the modulation of individuals, based on their investment within a network of technologies that 'massage', as he put it, human life (McLuhan, Fiore and Angel 1967). Via this process-based approach to technological culture, he offers a way to replace, for those interested in aesthetic inquiry, questions of representation with questions relating to functions and to reformulate the questions of determinism, for media theorists, as questions of shared participation. As he wrote in a letter to Harold Innis, 'the business of art is no longer the communication of thoughts or feelings which are to be conceptually ordered, but a direct participation in an experience. The whole tendency of modern communication whether in the press, in advertising or in the high arts is towards participation in a process, rather than apprehension of concepts' (McLuhan to Innis, 19 March 1951). The representation of time is replaced in McLuhan's theory of media temporality with the production and ordering of time via a medium's technical function so that it provides a context for participation. Alphabetic inditing meant that the reader engaged only with linear time. 'The manuscript reader travelled too slowly, travelled too little to develop much time sense' (McLuhan in Findlay-White and Logan 2016: 163). For them, cause always followed effect. Spoken words were broken down into phonemes, translated into meaningless signs, arranged in a sequential order and then able to be read one at a time. McLuhan argued that the ground of media, its operability, had a significant impact on the modes of thought that developed alongside this media, particularly concerning ideas about time and temporality. The simultaneity of the new electronic environments that McLuhan described meant that, in contrast to the environment of alphabetic writing, information came at the subject from everywhere all at once. In the electronic media environment, the receivers of information were able to develop a time sense that was much more complex than the one which developed according to a line. In McLuhan's technical universe, contemporary media users were able to be understood as living in all cultures of the past simultaneously (Findlay-White and Logan 2016: 163). An environment was created where multiple events from the past were, via the operation of technical media, folded into the present in ways that departed from the linearity of History and print.

Through his focus on process, relationships and experience rather than the study of the reception of content, McLuhan’s work maintained a focus on introducing phenomenology to cultural studies and he provided a way to conceptualize living processes, those intimate and difficult elements of experience, alongside media infrastructures (Marchessault 2005: 51). As W. Terrence Gordon put it in his commentary on McLuhan’s work, ‘environment is process, not container; the West speaks of space where the East speaks of spacing; historical descriptions of change are mere narratives that offer no insights into dynamics; debate packages knowledge for display, whereas dialogue organizes ignorance for discovery’ (Gordon 2010: 22). Process, openness, the movement of the many to the one, the production of space, of an environment, the operation of media functions and discovery are at the core of McLuhan’s work, and like Whitehead at the beginning of the twentieth century, he offers a way to come to grips with the processes that occur inbetween the becoming of objects. Much like the thought of Gilbert Simondon, who produced one of the most highly intellectual accounts of transduction and individuation, McLuhan argued that the processually formed environment, the networks of media, the material and symbolic means by which experience and knowledge were translated, transmitted and stored, impacted societies in terms of the way they engaged and understood those experiences and knowledge. It did so, as McLuhan argued, by doing nothing less than separating the senses, filtering experience from its multimodal roots into purely optical, auditory or tactile situations. Media separate the senses, an ear for radio, an eye for print, they create fragmented individuals and this has real consequences for the way media users meet the world. According to McLuhan, the optical, the world of print, had become the most powerful sense due to the proliferation of the phonetic alphabet which had far reaching effects on the development of modern societies. As such, he looks in all his major works to what he describes as technologies of language, which amount to the means of transducing reality into codes. His intellectual project running at least from the publication of The Gutenberg Galaxy involves ‘probing’ the influence of media transformations on social structures, forms and pedagogical practices, in order to identify how technologies are internalized and become vital in processes of individuation. On the first page of The Guttenberg Galaxy, McLuhan (1962) sets out the approach that would come to characterize his media theoretical position: Any technology tends to create a new human environment. Script and papyrus created the social environment we think of in connection with the empires of the ancient world. The stirrup and the wheel created unique environments of enormous scope. Technological environments are not

merely passive containers of people but are active processes that reshape people and other technologies alike. (1) For McLuhan, communication was not reducible to a channel between sender and receiver, as it is imagined in more conventional communications theory. Instead, the medium was a condition for shared experience, a process that creates an environment and that creates the people that inhabit that environment, although this environment, as is the case with electronic environments, remains invisible. Humans design media; McLuhan was famous for saying that we perform to build the environment based on the capacities we find in our own body. Then the media, the environment, ends up programming us. The work of the anthropologist Edward T. Hall was crucial in allowing McLuhan to make these claims.1 In the Prologue to The Gutenberg Galaxy, one of McLuhan’s most important books, he quotes Hall: Today man has developed extensions for practically everything he used to do with his body. The evolution of weapons begins with the fist and ends with the atom bomb. Clothes and houses are extensions of man’s biological temperature-control mechanisms. Furniture takes the place of squatting and sitting on the ground. Power tools, glasses, TV, telephones and books which carry the voice across both time and space are examples of material extensions. (Hall in McLuhan 1962: 4) After Hall, McLuhan could formulate that weapons were extensions of the teeth and fists. Houses were extensions of temperature-controlled bodies. These tools become the media, standing in for the parts of the body with which humans relate to the world. They are processes rather than objects. They are conditions for relations, rather than solid, unchanging, permanent things in themselves. They become, for McLuhan, the environment, the processes involved in ‘spacing’, the entirely real conditions for the possibility of experience. Soon after the development of these tools, those things that relationally constitute the environment for living, it was found that even more tools needed to be developed to maintain and enhance these technical conditions. Knives need to be sharpened; an improved handle can be attached to the spear. Supplementary tools are developed which begin to evolve in the service of the original tool. Carolyn Miller (1978) states ‘the extensions seem to take on purposes of their own; they become distant from the original human capability they first extended and advance at the expense of that capability. Habits of behaviour, institutions, further technology, and ways of talking and thinking get built up around the original extension’ (229). The tool once extended the

human quite obviously, but after the addition of secondary technologies the human becomes extended through a network of technologies. McLuhan argues that these secondary technologies, as media whose content is always another medium, extend and enlarge the scope of human activity. This is the media theoretical context in which I was able to claim in the Introduction that media are a non-optional part of contemporary experience. Media function as the non-human element in human experience. When these media are technical, the outcome of this is the production of unfamiliar temporalities, not based on the way they represent time but based on their operation in time. The influence of McLuhan’s process-based approach to studies of technical media has been significant, not least for those interested in testing the relationship between humans and technologies. Stiegler ([1994]1998), for example, asserts that a co-evolution with technics has occurred, or a ‘technogenesis’, as he puts it. After McLuhan, Stiegler is able to claim that the technologies involved in audiovisual recording allow an externalization of memory that has consequences which should not be underestimated for the biological evolution of human memory as a temporalizing of the relationship between the past and present (Stiegler [2001]2011). N. Katherine Hayles has also made important contributions to this discussion, showing how the act of using recording media, such as a pen and paper or computers, is not simply a means of ‘writing down’ preconceived ideas but is in fact ‘as much a part of [the] cognitive system as neuron firing in the brain’ (93). This is similar to the claim that Jack Goody made in 1987: pens and papers affect thought on such a level that it is possible to conceive of a ‘mind’ out there in the world and a mind inside human bodies. The discoveries around cybernetics entered into the media theoretical discourse in profound ways. As we recognize that media externalize memory and affect cognitive routines due to their operability, including signal processing and data management, it becomes possible to speak of the production of media temporality and to begin exploring its effects. In order to think through the technological modulations of human life, McLuhan, like Hayles, bases his arguments on knowledge and the experience of language as an expressive art. His goal was to place the analysis of technological artefacts on a linguistic and humanist base for the first time (McLuhan 1964: 106). Orality has a special relationship with the human body, not just with human thought. Not only are spoken words formed by the parts of the body such as the lungs, throat, vocal chords, teeth and soft palate, they also express the body. They transcode physical conditions into information, they give conditions form (literally a process of information) in acoustic space. The human being not only uses the medium of orality but is also expressed by orality. In a way that prefigures arguments on media ecology, anti-hermeneutics and post-humanism that reformulate the human as always

within, and in fact individuated by, a network of technology, McLuhan argues that the human’s being is always with orality, as a technology that forms a network of communication. If orality, as a technology, expresses a body, then writing, print, cinema, television, also express a kind of body, but one that due to the new character of these networks of optical media, projects into the world a new type of subject. For McLuhan, this is a type of subject that, like Deleuze and Guattari’s ([1972]2004) formulation to come around eight years later, is separated into discrete organs of ears, eyes, mouth and hands by media technologies. Those perceived treasures of human meaning are produced by the fracturing of the body in order to perform the cultural techniques required to interact with symbolic worlds. After McLuhan, it is possible to say that to read, to look, to listen, to touch are gestures facilitated by the medium that transmits content and the techniques that it supports. This is a subject that is fractured by the timelessness of analytical media. McLuhan (1962) argues that ‘the interiorization of the technology of the phonetic alphabet translates man from the magical world of the ear to the neutral visual world’ (18). The explicitness of the technology of the phonetic alphabet comes from its function of ‘spelling out’ one thing at a time. ‘[O]ne sense at a time, one mental or physical operation at a time’ (McLuhan 1962: 18). For McLuhan, it is the eye that is, after the invention of print, to become separated from all other senses in order to become the privileged way of making meaning of the world. The eye travels along the page. Time and space, the event translated onto the page, is lineally set out before the eye. History comes into being. This is in marked contrast to the time and space presented to the ear. Acoustic space is multidimensional, multi-temporal and full of signal and noise. According to McLuhan, visual space is a sculpting of these time–space relations into a line. ‘[I]f a new technology extends one or more of our senses outside us into the social world, then new ratios among all of our senses will occur in that particular culture’ (McLuhan 1962: 41). Specifically, for McLuhan, the technological subject that lives within the Gutenberg Galaxy is a type of subject that has replaced their ears with eyes. In the world dominated by literary eyes, the alphabet, much like the transduction of electronic communication, broke down human language into phonemes, small, meaningless elements of code. Motivated by the aims of this book to explore the time-discrete techniques of signal processing, we might say that medium theorists and philosophers of technics from McLuhan to Stiegler have alerted us to the fact that the alphabet as a medium created an artificial world of meaningless symbols set before human eyes. Without the printing press, the publishing industry, and educational institutions that acted before print was put before the eye – fulfilling the role of an apparatus for reading – a, e, i, o, u remain symbols that relate to sounds, not meanings. If

we look for the synthetic process of reading, we look to the content. But if we look to the analytic process of converting words into symbols, we look to the medium, which as McLuhan famously told us is where we will find the real message of media’s effect on culture. We should try and look past the content, the images or the words, to their groundings, which may allow glimpses into the function of these invisible ‘environing’ processes. I like McLuhan’s radical emphasis on the carrier medium, rather than the carried message. But my approach differs to this, perhaps the most famous of McLuhan’s claims, in quite significant ways. Rather than looking strictly at the technical function of media process, delving into archives and technical journals, I often begin my analysis, much like I did in the Introduction, by looking through images to try and see how they give form to media preconditions. These images in a sense transduce technical conditions into visible scenes. I am intensely interested in the technical function of media process, but I am also committed to seeing how invisible functions are given a visible form in images and exploring the vicarious effects of the technologies of production. Like Kittler, this would be to try to see how human subjects are formulated by the technical conditions of media themselves, rather than by the conditions of literature, philosophy, science or religion. It would be to see how media tell us about our senses. But it would also, and I am inspired by Whitehead in this respect, move against Kittler to articulate the technical and the phenomenological in ways that cancel out neither side of the equation. ‘For us the red glow of the sunset should be as much a part of nature as the molecules and electric waves by which men of science would explain the phenomenon’ (Whitehead [1920]2007: 29). The task is to link descriptions of experience and technicity, aesthetics and media, to see how functions are given form, to focus on the selective processes where the many are reduced to the one. One of the first methods used to visualize the otherwise invisible functions was the invention of the phonetic alphabet. In The Guttenberg Galaxy, McLuhan (1962) writes lineal alphabetic inditing made possible the sudden invention of ‘grammars’ of thought and science by the Greeks. These grammars or explicit spellings out of personal and social processes were visualizations of non-visual functions and relations. The functions and processes were not new. But the means of arrested visual analysis, namely the phonetic alphabet, was as new to the Greeks as the movie camera in our century. (23) The alphabet amounts to a coded visualization of non-visual processes. It is ‘a visual enclosure of non-visual spaces and senses’ (McLuhan 1962: 43). This transformation of the event, the moving multimodal world, into a coded visual scene, this process of individuation, this transduction, defines

literate cultures and signals a shift from the old oral world of the tribe to the new world of individualism. It is true that McLuhan spent most of his time discussing the urgency with which his readers needed to come to terms with the extensions of the central nervous system and the conditions of being with electronic acoustic environments, particularly when these environments are controlled by large corporations, but McLuhan also gave sustained attention to the process of transduction. Transduction can be read via McLuhan as a way that the systematic rules and procedures of different media organize time-based signals into codes. These codes then become the processually formed context for individuality. They radically cut up space and time to produce individual points. For McLuhan, the most important medium to first undertake such a task was the phonetic alphabet, later to be followed by the universe of electronic communication, which brought with it, according to McLuhan, the possibility of shifting the emphasis of culture from linear visuality to a nonlinear acoustic life. It is not that writing is simply displaced by the invention of audiovisual media such as film, video and television. But rather that its function as an analytical medium, its ‘deep logic’ (Peters 2015: 286), is intensified and there is a potential for writing techniques to shift from the linear construction of chronological time to the multi-temporal construction of acoustic space. This promise of acoustic space that McLuhan would constantly remind us of, this world of free flowing information, of spaces without a centre, continues to escape us, replaced by a more radical, though less visible, partitioning of experience and a higher level ordering of the body. This, which is indeed due to the radical upscaling of the alphabet’s analytic function, will be argued in the coming chapters. For now though, let’s continue with our exploration of McLuhan’s technical universe. Throughout his writing McLuhan shows how mechanization acts as a form of translation. McLuhan (1964) writes, The tendency of neurotic children to lose neurotic traits when telephoning has been a puzzle to psychiatrists. Some stutters lose their stutter when they switch to a foreign language. That technologies are ways of translating one kind of knowledge into another mode has been expressed by Lyman Bryson in the phrase ‘technology is explicitness’. Translation is thus a ‘spelling-out’ of forms of knowing. What we call ‘mechanization’ is a translation of nature, and of our own natures, into amplified and specialised forms. (67) It is in this sense, the sense that McLuhan gives to mechanization as a ‘spelling-out’, that media become in his own words ‘metaphors in their power to translate experience into new forms’ (67). He argues that the spoken word was the first technology that offered to humans a metaphor of experience and

a new way of grasping the world. This argument came to full fruition in the work of Kittler, who argued that just as the media system of nineteenthcentury poetry was established to produced and cultivate the human soul, technical media now define the user via switching. In Kittler’s own words, which are more radical than McLuhan’s in their adherence to a technical a priori, he argues ‘what remains of people is what media can store and communicate’ (Kittler [1986]1999: xl). Orality, as McLuhan argues, translates experience into uttered senses. ‘By means of translation of immediate sense experience into vocal symbols the entire world can be evoked and retrieved at any instant’ (68). Human sense experience becomes translated into the form of information and, following McLuhan, this happens progressively in the age of electronic media. The translation of scenes into words printed using the phonetic alphabet signals a unique process that was, up until the introduction of electronic communication, unique. The content of phonetic writing is speech, whereas the content of hieroglyphic writing is events or situations. ‘Any phonetic writing is a visual code for speech. Speech is the “content” of phonetic writing. But it is not the content of any other kind of writing. Pictographic and ideographic varieties of writing are Gestalts or snapshots of various situations, personal or social’ (McLuhan 1962: 46). The point of the phonetic alphabet is that it dissociates all meaning from the sounds of the letters. ‘The meaningless letters relate to the meaningless sounds’ (McLuhan 1962: 47). The privileging of the visual, as McLuhan argues, has direct consequences for the way time is organized in literate cultures according to the eye, the gaze that moves over the meaningless letters, the meaningless spaces between words, punctuated by symbols for organizing temporality, stringing them together to make meaning. ‘The visual makes for the explicit, the uniform and the sequential in painting, in poetry, in logic, history’ (McLuhan 1962: 57). The idea of grasping the experienced world as sequential events strung together as a whole, a wholly cultural technique that followed on from the phonetic alphabet and the heightening of the visual, introduced notions of causality and continuity, as people began to be able to trace the connection of events with one another and importantly discovered the past (57). History was invented. The Greeks discovered a past as an area of peace in a distant perspective. This, the operation of media in order to control time, to demarcate past, present and future and produce temporality, was the function that the clock was later to amplify. The clock, a machine that produces uniform seconds, minutes and hours, first set to work in monasteries with their need for synchronization and order, helped to create the image of a numerically ordered and mechanically powered universe (McLuhan 1964: 157). But according to McLuhan, this technique for the organization of time, which involves humans using media to synchronize with other human and non-human systems,

had been rehearsed by a much older invention. It was the phonetic alphabet that ‘made possible the visual and uniform fragmentation of time’ (McLuhan 1964: 159). The phonetic alphabet and its extension of sight reduced time to chronology and the regular instant. The eye moves across the page, stringing together meaninglessness into meaning. The alphabet became the source of the Western mechanism that translates experience from audible–tactile modes into the visual and the linear. McLuhan then goes on to argue – and this, as it will become clear in what follows, is another place where I tend to differ from McLuhan – that a new openness of time is afforded by electric technology that supposedly extends the human person’s central nervous system. In Understanding Media, McLuhan points to the vast shifts that have occurred as the lineal space of print and the phonetic alphabet has been replaced by the acoustic space of the television image. He argues that the depth experience offered by electronic media, as opposed to the condition once supported by the Gutenberg Galaxy, can only be explained in terms of the difference between visual and mosaic space. For a subject immersed in the visual world of print, to come to grips with the concepts and percepts offered by the non-visual world requires a totally new way of imagining – literally making an image of – the world. ‘In the ABC of Relativity Bertrand Russel began by explaining that there is nothing difficult about Einstein’s ideas, but that they do call for total reorganization of our imaginative lives. It is precisely this imaginative reorganization that has occurred via the TV image’ (McLuhan 1964: 56). McLuhan argues here that the television has in fact reversed the abstracting of experience and the separation of the senses brought about by the analytic medium of the phonetic alphabet. He argues that ‘the TV image reverses this literate process of analytic fragmentation of sensory life’ (356). As will be seen, particularly in Chapters 4 and 5, I see this as one of the weak points in McLuhan’s argument, largely due to his reluctance to go into any detailed technical analysis of the medium, instead focusing on the westerns, celebrities, politicians and news anchors broadcast on the medium. I agree with McLuhan that television signals a shift from the intimately related lineal organization of events and the privileging of the sense of sight. The television image made events into scenes, multisensory and multi-temporal events that filled the space of everyday life. However, in the television’s functioning, it did not provide an alternative to the analytic function of the alphabet but instead drastically upscaled the effects of the analytical, breaking events down into point of light organized techno-mathematically. It is only that McLuhan never got close enough to the screen (in terms of a close technical analysis) to see points, pixels. As I will argue in Chapters 4 and 5, this technical functioning of segmenting signal into points that were strung together to make bands,

had real cultural effects that went far beyond simply a technical function. The technical separation of light has been mirrored in a cultural segmentation of time, particularly concerning so-called digital television. It became like the unseen ghost that haunts experience. While the television provides a new mode of organizing events, beyond the linearity of print, it does so through a close analysis of a scene, breaking it down point-for-point. McLuhan is right in as much as the television does not present time via a single line (it in fact presents time by combining a number of scan lines). But he misses the fact that, as Kittler will later go on to argue, the television presents both space and time via a radical cutting.
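Since the argument about scan lines returns in detail in Chapters 4 and 5, a schematic sketch of what 'point-for-point' means here may be useful. The fragment below is an illustration of raster scanning in general, not a description of any particular television standard or of Farnsworth's image dissector; the grid of brightness values and the sampling rate are invented for the example.

```python
# A scene, reduced here to a tiny grid of brightness values, is read off
# point by point and line by line, so that a spatial image becomes a single
# ordered signal in time: the 'radical cutting' described above.
scene = [
    [0.0, 0.2, 0.9, 0.2],
    [0.1, 0.8, 1.0, 0.8],
    [0.0, 0.2, 0.9, 0.2],
]

def raster_scan(image, samples_per_second=16.0):
    """Yield (time, brightness) pairs, one scan line after another."""
    n = 0
    for line in image:              # each scan line in turn
        for brightness in line:     # each picture point within the line
            yield n / samples_per_second, brightness
            n += 1

signal = list(raster_scan(scene))
print(signal[:5])  # the first few instants of the resulting signal
```

The point of the sketch is simply that the image never exists all at once for the apparatus: it is reassembled, line by line, from discrete values distributed in time.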

Friedrich Kittler: Defining media (media that does the defining)

Throughout his career Kittler cited McLuhan and it is clear that he saw his work as a valuable point of departure into thinking critically about technical media (Winthrop-Young 2011: 122). McLuhan, following the earlier work of Innis and Giedion, offered to Kittler a model with which to advocate that media studies abandon its focus on content and instead bring into view the figure/ground relationship. As well as this, McLuhan gave Kittler a way to (infamously) say that 'media determine our situation', a claim that led to the conclusion that the only method of describing the human situation was with reference to the genealogy of media, rather than its present state. If media determine our situation, we could only describe the present (our situation) with reference to the grounding provided by the development of media. Media theory, after Kittler's modulation of McLuhan, became media history. As Kittler writes, 'McLuhan, who was originally a literary critic, understood more about perception than electronics, and therefore he attempted to think about technology in terms of bodies instead of the other way around' (Kittler [1999]2010: 29). McLuhan privileged the social and the human. Kittler argued that the only way to understand the cultural effects of technology was to focus on the technical instead of the human senses. A need for rear view analysis, a genealogy of electronics, data processing and computer programming, those things that usually lie at the periphery of the sphere of media studies, but which none the less provide the conditions for the possibility of experience, was brought into focus after Kittler. For Kittler, an archaeology of the present was not achievable through the, at the time, traditional Western focus on semiotics and the theory of signs. What was needed instead was a focus on all types of discourse as essentially networks of data processing, which would replace the emphasis on language with an emphasis on the material and the

technical. Until the 1980s, the world of culture was transformed into a world of discursive signs and referents. Language was discovered as the means with which people came to terms with their place in the world. What this approach has overlooked is that the ‘de-primitivizing’ of the human world, the production of which is culture, has taken place due to ‘techniques and rites, skills and practices that provide for the stability of lived-in space and the continuity of time’ (Krämer and Bredekamp 2013: 21). For Kittler, data processing is the key in this process as it materializes these techniques, rites, skills and practices. The task given to us in Kittler’s wake is to try and analyse the methods by which hardware, throughout history, because of the invisibility of its method of data processing, has been designed to override, rather than extend, the human senses, how this has played out in the production of culture and to see in the present the continuation and at times the drastic scaling up of these moments (Kittler [1999]2010: 36–39). Kittler’s approach to media owes a great deal to Claude Shannon’s work during the Second World War. After Shannon, and his later work with Warren Weaver ([1949]1963), what became important for thinkers oriented towards an analysis of the material ‘stuff’ of communication was the technical and mathematical way information could be understood, beyond subjectivity and representation. McLuhan did not like this emphasis on techno-mathematical questions that adherence to Shannon and Weaver signalled. He found it risky to think of communication without reference to the humanist study of history and aesthetics. Kittler, however, took Shannon and Weaver’s imperatives seriously and developed an approach to media studies designed to get at the media technical conditions for discourse, rather than those established by cultural history and human routines. Because, in his formulation, the medial conditions of systems act before and thus override sense experience, for Kittler the phenomenological was rendered meaningless in media theoretical terms. For Kittler, only that which was switchable existed at all. Ideas of the human and meaning were removed from the equation, and information was understood purely as a mathematical probability function. What was important was the probability of the signal reaching its receiver, which involved an analysis and optimization of the signal to noise ratio. In a drastic upscaling of McLuhan’s approach, the meanings of the message – the meaning of words uttered by the human voice along the channel – were irrelevant. As mentioned earlier, this is an approach to technical media studies that I would like to temper slightly in this book by maintaining a focus on both the technical and the visual.2 I still look to aesthetics but I am more interested in what the image can say about media, transmission events, transduction and the technical conditions for production than I am about cultural meaning or iconography. The focus on images is framed by an attempt to see how images

point back to media conditions and how they can be used to explore what media now mean in the twenty-first century. For Kittler, it was data processing that was the key element to look for in media history. What I look for in the images mentioned in this book, like Courbet’s painting from the Introduction, is the way that data processing, transmission and transduction become embodied and made affective in the phenomenological realm. Kittler’s approach to information is not based on simple recourse to Shannon and Weaver alone. It is based on a radical extension of the way the pair understood the transmission of information. For Kittler, the analysis of information was incomplete without an exploration of the hardware that produced it. Information becomes technically determined (Hansen 2006: 77) and media philosophers have since been focused on the event of mediation, as a process by which information is transformed based on the channels, protocols, software and hardware by which it is produced and through which it proliferates. As Nicholas Gane states, with Kittler’s ‘emphasis on storage and technologized memory, information is no longer treated as purely a probability function (as it was for Shannon and Weaver), but as a material property that is in no way distinct from the physical components that make it – or the choice between different variables – possible [emphasis in original]’ (Gane 2005: 29). Through reference to and then the extension of Shannon and Weaver, Kittler is able to focus on technical and time-based qualities of signal processing, transmission, delay and storage, asking how the temporality of these nonhuman processes affects the production of information and the way that it – after the fact – is experienced as part of the phenomenological world. Along with McLuhan and Shannon and Weaver, another major intellectual figure that gets reformulated by Kittler is Michel Foucault, a figure that prompted a new methodological approach that cut to the conditions for discourse. Specifically, Foucault offered to Kittler a means of thinking critically about modes of discourse by exploring their infrastructure, which was later reformulated by media archaeologists, such as Ernst and Parikka, interested in getting at the technical ways that machines organize and store data. Foucault throughout his works including The Order of Things ([1966]2002), The Archaeology of Knowledge ([1969]2002) and particularly well framed by the essay ‘The Order of Discourse’ (1981), originally delivered as his inaugural lecture at the Collége de France in 1970, maintained a clear focus on the rules, systems and protocols that constitute and are constituted by a human’s attempt to gain knowledge. For Foucault, this involved studying the rules and procedures of systems as constitutive of a realm of discursive practice that comprises the terrain in which knowledge is both formed and produced (Hook 2001: 522). ‘In every society the production of discourse is at once controlled, selected, organized and redistributed by a certain number of procedures whose role is to ward off its powers and dangers, to gain mastery over its chance events,

to evade its ponderous, formidable materiality’ (Foucault 1981: 52). The book, the library, publishing industry and educational institutions acted as a dispotif and provided the protocological conditions through which discourse flows and through which it becomes mastered. Those procedural rules, regulations and functions, which engender events and should not be underestimated, act upon the production of knowledge, including some things whilst excluding others. In a way that echoes McLuhan’s famous adage, after Foucault what should be analysed is not the content of what is said or thought but the rules and functions, the genealogy of protocols, which act as a constituent part of discourse. The medium through which discourse, and hence knowledge, circulates, including all those rules, regulations, functions and procedures that need to operate for it to work, is the message. Kittler extends Foucault by arguing that it is not enough simply to look to the social, political, economic and pedagogical rules accepted as conditions for discursive practices. One must also bring into view the technical conditions that by their very nature make discourse via electronic media possible. It is worth quoting Kittler ([1985]1990) at length here: Foucault developed discourse analysis as a reconstruction of the rules by which the actual discourses of an epoch would have to have been organised in order not to be excluded as was, for example, insanity. His concept of the archive […] designates a historical a priori of written sentences. Hence discourse analytic studies had trouble only with periods whose dataprocessing method destroyed the alphabetic storage and transmission monopoly, that old-European basis of power. Foucault’s historical research did not progress much beyond 1850. All libraries are discourse networks, but all discourse networks are not books. In the second industrial revolution, with its automation of the streams of information, the analysis of discourse has yet to exhaust the forms of knowledge and power. Archaeologies of the present must also take into account data storage, transmission, and calculation in technological media. (369) For cultural criticism to progress, as Kittler argued, it was no longer enough to frame the study of books and writing with the established traditions of literary theory, philology or sociological inquiry. What was needed, which McLuhan also advocated, was a study of writing as a communication channel that flows through schools and universities, as the institutions that connect books with people. Kittler’s innovation was to focus on data processing, which made possible transmission and storage. The discourse analysis practised by Kittler in this period revealed, as Siegert points out, both an indebtedness and a ‘technologically informed’ distancing from Foucault (Siegert 2013: 49). ‘Foucault conceived discursive rules as comprehensible and therefore overlooked

technologies. But innovations in technology of information are what produced the specificity of the discourse network 1900’ (Kittler [1985]1990: 278). Kittler, inspired by Foucault’s archaeological approach and McLuhan’s medium theory, argued that discourse networks began to be conditioned by the processing, storage and retrieval of data due to the immense shifts between the discourse networks of 1800 and those of 1900. According to Kittler, these turning points were constituted by the universal alphabetization circa 1800 and technological data storage circa 1900, seen most acutely in the technological developments represented by the phonograph, the typewriter and film. Foucault argued that all power comes from and returns to the archive, which is preserved based on a network of institutional and discursive rules. Kittler extends Foucault by exploring the media that forms the basis for the discursive practices of processing, storage and transmission. For Kittler, media themselves, the ‘technologically possible manipulations’ that determine what can be a discourse, became active archaeologists of knowledge as they set the protocols for data processing and thus governed the transmission and storage (as transmission in time) of information. At this point, ‘German media theory shifted the focus from the representation of meaning to the conditions of representation, from semantics itself to the exterior and material conditions of what constitutes semantics’ (Siegert 2015: 2). At this point, a radical anti-hermeneutical approach came to characterize the brand of media science that came after Kittler, usually focused on questions of storage and transmission. The book and the library were once the key institutional forms that produced archives of knowledge, now electronic media is responsible for setting the conditions by which data is preserved, which data is omitted and how that data is accessed. Through a methodical process of examining the technical infrastructure for the transduction of signals, and including in this infrastructure a historical analysis of the development of specific media technology, the media philosopher who works in Kittler’s wake might take apart media technologies to see piece-bypiece how they now work and how they have historically worked as a system of storage and knowledge. Not simply a cultural history of outmoded technology, this type of media archaeologist’s task is to explore the ways that technical components of media systems, through their varied history of developments, have built within them a number of implicit technical codes, protocols and ways of operating that are continually rehearsed and refined in their contemporary use. In a sense, this is a task that involves excavating the fossils of media upon which the foundations of the contemporary have been laid. In trying to uncover these inbuilt technical codes, it is the processing of signal, as a function that, following Whitehead, is crystallized in the forms of media images, that, following Kittler, we need to direct our attention towards. But it also (to depart somewhat from Kittler’s unrelenting focus on electronics) involves a level of phenomenological analysis. Continuing what was begun in the introduction to

this book, it is through the exploration of the functions of media-as-technology that we might be able to begin to explore the conditions for experience in twenty-first-century media culture. The human still remains within the frame of the image of media culture presented in this book, but what I am also trying to bring into view with more urgency, following Kittler, is the non-human entities that provide the material and symbolic conditions for subjectivity.
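For readers who want the techno-mathematical notion of information invoked in this section in its barest form, the standard formulation can be set out in a line. What follows is textbook Shannon rather than anything specific to Kittler's own texts, and the notation is the conventional one rather than a quotation:

\[
H(X) = -\sum_{i} p(x_i)\,\log_2 p(x_i), \qquad C = B \log_2\!\left(1 + \frac{S}{N}\right).
\]

Here H measures the average uncertainty removed by receiving a message drawn from the source X, and C gives the maximum rate, in bits per second, at which a channel of bandwidth B can carry a signal of power S against noise of power N. Meaning appears in neither expression, which is precisely the point that the media philosophy described above takes from Shannon and Weaver.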

Vilém Flusser: Particles

Taking an approach that works between Kittler and McLuhan, Flusser, the prolific Czech media philosopher, offers a number of examples where the temporality of media restructures the apparent 'flows' of historical life. Flusser works between McLuhan and Kittler in so much as he retains an emphasis on the phenomenological, keenly interested in movement, gesture and affect, but understands these as networked to more elemental mediating processes. For Flusser, media do not merely extend the senses. In fact, it might be the other way round. For Flusser, people become functionaries of media, with social phenomena being an outcome of the togetherness mediated by these systems. Society itself begins to function as a programmed apparatus. More will be said on this later. Towards a Philosophy of Photography, one of Flusser's defining works, is based on the claim that 'two fundamental turning points can be observed in human culture since its inception. The first, around the middle of the second millennium BC, can be summed up under the heading "the invention of linear writing", and the second, the one we are currently experiencing, could be called "the invention of technical images"' (Flusser [1983]2014: 7). Prior to the mechanized and automatic imaging of photography, the dominant cultural form was the alphabet and the printed word, which had associated with it the production of linear history. Language turned the events of the world into scenarios, passed on through stories either verbally or on the page, to be read from left to right, front to back, beginning to end. Even if the book is not read in this fashion (many media theorists, including Kittler, have taken issue with this description of print, pointing to the way books like the Bible are often read in any order whatsoever, rather than from beginning to end), it remains organized and stored by the technology of print as chronologically ordered pages, with one event following the other. In the post-hermeneutical sense, the medium itself, rather than its reading, presupposes a form of historical consciousness due to the architecture of the storage medium. In a way that unites the analysis of visual images with a media theoretical account of the technology of image making, Flusser argues that photography and the technical image in general mark a break from this tradition as they

40

AGAINST TRANSMISSION

automatically organize the world into particles that can be reorganized based on programmed rules. It is this, his articulation of the surface image with the technical processes that produced it, which is the most important for this book. According to Flusser, the events of the world were once transformed into literary scenarios; now, after the invention of the technical image, signal is transduced into particles. And it is through the images produced by technical media that we feel the effect of these particles. According to Flusser, the birth of the camera represented a shift in media that introduced a mode of thinking with ‘techno-logically’ produced images. Events were once organized based on the linearity associated with the written word and the privileging of the visual that McLuhan argued characterized print culture. But after photography was invented, events were organized based on the hardware (techno) and programming (logic) of apparatuses. Flusser, in both the style and the content of his media philosophy resonates in a number of ways with McLuhan (although he often brought up his disagreements with the man), perhaps most obviously in a passage from Understanding Media, where McLuhan (1964) writes ‘“[r]ational”, of course, has for the West long meant “uniform and continuous and sequential”. In other words, we have confused reason with literacy, and rationalism with a single technology’ (15). Like McLuhan and his arguments about linear, diachronic media, which presupposes the visual space of the eye, Flusser argues that a generalized historical consciousness that cut across social strata was spurred on by Gutenberg’s invention and the move towards compulsory public education. At this point, the peasantry began to live a proletarian and historical life: This took place thanks to cheap texts: Books, newspapers, flyers, all kinds of texts became cheap and a conceptual thinking that was equally cheap – leading to two diametrically opposed developments. On the one hand traditional images finding refuge from the inflation of texts in ghettos, such as museums, salons and galleries, became hermetic (universally decoded) and lost their influence on daily life. On the other hand, there came into being hermetic texts aimed at a specialist elite, i.e. a scientific literature with which the cheap kind of conceptual thinking was not competent to deal. Thus culture divided into three branches: that of the fine arts fed with traditional images which were, however, conceptually and technically enriched; that of science and technology fed with hermetic texts; and that of the broad strata of society fed with cheap texts. To prevent culture breaking up, technical images were invented – as a code that was to be valid for the whole of society. (Flusser [1983]2012: 18) For Flusser, photographs, the first type of technical images, were designed to reintroduce images into daily life, to render hermetic texts imaginable and

to render visible what he calls ‘the subliminal magic’ inherent in cheap texts. They were meant to act like McLuhan’s synchronic or acoustic media that could present instantaneous and balanced information in environments that did not fracture the senses nor stratify society. These images would amount to a common code for art, science and politics, meant to be ‘beautiful’, ‘true’ and ‘good’. But, as Flusser argues and in a way that is at odds with McLuhan’s predictions, technical images do not, in practice, function in this manner. In a way that echoes Walter Benjamin, Flusser tells us that technical images do not reintroduce traditional images into daily life, they do not explain hermetic texts and hold nothing of the ‘magic’, or what Benjamin would call an aura, of cheap texts. The world is not immediately accessible to human beings and therefore images were created. ‘However, as soon as this happens, images come between the world and human beings. They are supposed to be maps but they turn into screens: Instead of representing the world they obscure it until human beings’ lives finally become a function of the images they create’ (Flusser [1983]2014: 9–10). Originals had a time and a place in a context: they were historical. Technical images on the other hand – the mechanical reproduction – are post-historical. They have no site, no location. Technical reproduction transposes events into ‘the network of topologically determined circulation’ (Groys 2016: 141). Flusser differs from McLuhan most acutely in his conceptualization of the universe of electronic media as a radically particalized universe only able to be made meaningful by technical apparatuses that operate based on mathematical rules. For McLuhan, the culture associated with electronic media represented an acoustic space, where the dominance of the eye and linearity was to be overcome by multisensory communication, a synthesis of the once fragmented body, which has associated with it multiple possibilities for interpretation and multiple opportunities for experience. In this environment, time becomes space and an instantaneous and balanced comprehension of code becomes possible, rather than one based on the primacy of the eye and the time-based linearity of reading. This type of multisensory synchronic space is also important for Flusser, but he offers us a very different interpretation of its consequences for life amongst technical media. Taking an approach that owes a great deal to Heidegger, Flusser argues that the particles of the electronic environment, the fragments of the world, rather than offering a free flowing, decentralized space, are radically filtered by technical media in such a way that produces images that are projected back on to the world and used to make sense of the world. The notion of an image is important to Flusser not simply as a surface or a representation of an idea. The technical image, a concept that may be Flusser’s major contribution to media philosophy, is instead formulated as an ‘image that means ideas’ (Jongen 2011: 208); it is a way of thinking through

images. They ‘have no content but are structures like an individual melody which evoke their own world’ (McLuhan 1962: 47). In Flusser’s work, the term technical image does not designate something posterior to ideas, which simply takes on a representational function. But, like Kittler, Flusser sees the technical image as a condition for knowledge: The technical image is a device for carrying out ‘what the antique thinkers called “theoria”’ (Flusser 2011: 286). If photography amounts to a gesture of seeing, then the output of the photograph, the output of seeing – the technical image – produces what these antique thinkers would term an ‘idea’. The technical image for Flusser embodies a process that engenders ways of seeing the world, gestures of interacting with the world and consequently ways of thinking about the world. It is an important concept for those of us interested in media philosophies of time because it brings into view the way images rehearse the temporality of their technical architecture and operation. ‘The technical image is an image produced by an apparatus’ (Flusser [1983]2014: 14). As such it is produced by a mechanical application of scientific texts, and, to follow Flusser, when one is dealing with them they are dealing with the indirect product of scientific texts. When dealing with time-critical media such as photography, film, television, these scientific texts are ones that grapple with and solve time-based problems. ‘They are metacodes of texts which […] signify texts, not the world “out there”’ (Flusser ([1983]2014: 15). They are the transcoding of concepts from text to image, making visible the solutions to time-based problems and the media engineering of temporality. As Flusser ([1983]2014) writes, in a way that is similar to Kittler’s infamous claim, ‘[t]here is something we can say about these images after all. […] they are not windows but images, i.e. surfaces that translate everything into a state of things; like all images, they have a magical effect; and they entice those receiving them to project this undecoded magic onto the world out there’ (16). Technical images translate events into situation, they translate time-continuous events into a state of things. They measure, transduce and then determine our situation based on their own programmes; they make events into situations – post-historical scenes – that are that way machine readable. At stake in Flusser’s conception of technical images is nothing less than the identification of a new form of temporal consciousness where time-based events are reduced to particles organized by programmed machines. As Mark Poster (2010) writes, ‘[i]f history for Flusser is a linear mode of consciousness related to writing, today it must be considered in crisis. The reason for the crisis is simply that writing is being supplanted by images, a new medium is being added to the old and taking priority over it in the culture’ (11). To paraphrase Flusser, to know something in the contemporary, post-historical world is to be able to conceptualize it as an image. This act of

conceptualization, of image making, is undertaken by apparatuses involving televisions, cinemas and cameras. According to Flusser, these complex ‘playthings’, these processes that code ways of looking, are so complex that those playing with them, no matter how long they play with them, remain unable to plumb their depth. The game consists of combinations of the symbols contained within programmes. The player, the human user, the photographer, the television owner, the cinema goer, becomes a functionary within this system. He tries to learn the apparatus’ rules and eventually learns to think through them. Flusser sees this happening in all electronic communication and imaging, but first works his theory through the apparatus of photography. In doing this, we get to see one of the novelties in his approach: he uses the language of computers – programme and information processing – to describe photography, a cultural form of image making that spans the arts and sciences. Like Kittler, Flusser places an emphasis on data processing as foundational to cultural communication. He does not look to the signs themselves in images, but to the systems of code that makes it possible for the sign to become a phenomenon that signifies another. In a passage from Flusser’s ([1983]2014) photography book, he begins to outline the importance of the invisible conditions for the operation of the camera: It is true that many apparatuses are hard objects. A camera is constructed out of metal, glass, plastic, etc. But it is not the hardness that makes it capable of being played with, nor is it the wood of the chessboard and the chess pieces that make the game possible, but the rules of the game, the chess program. What one pays for when buying a camera is not so much the metal or the plastic but the program that makes the camera capable of creating images in the first place. (30) Cameras imprint an image automatically. The programme takes care of the work. It appears to function as a tool. This takes care of the work of image making that painters with a brush once encountered and the photographer is now engaged only in play. However, this play takes place within the parameters set out by the apparatus’ data processing routines. Like the computer, cameras process information according to a programme. The programme of the computer is obvious. But what is less obvious but nonetheless agentive is the programming of the camera, which, as Flusser tells us, acts as a calculating machine that is computational thinking flowing into hardware. It is by looking to the programme of the camera, its inbuilt codes, standards and ways of operating, that Flusser develops a way of conceptualizing the image as the embodiment of the usually invisible technical processes, which reorder relations across time and space.

The image produced by the camera is a technical image, dependent on a programme. For Flusser, these images offer new relations to the world than those offered by either print (as historical media) or traditional images (as prehistorical media). When Flusser discusses the prehistoric function of traditional images he often refers to the ‘magic’ of these images. Here, Flusser means to point to a life dependent on traditional images that act as nonlinear transformations of the world, without the cause and effect presupposed in historical life. When a cock crows in historical time it is because of sunrise. This is radically different from an image of a cock crowing, in which cause and effect cycle through repetitions. The cock at once signifies sunrise and the sunrise signifies the cock crowing (Flusser [1983]2013: 9). In the magical world of images, cause and effect are cyclical. In the historical world of print, everything has consequences and nothing is repeated. Once books became cheap and education became more common, Flusser argues that the peasants who had previously lived a ‘magical’ life based on traditional images, instead invested in the linearity of the printed word and started to live a proletarian and historical life. With technical images, which are no longer magical or historical but now post-historical due to their analytic and mathematical organization of data into images, it is the conditions of contemporary, as fragmented and radically present time, that begin to come into view. The cameraman circles his subject, choosing the right light, the right angle, holding the camera still, posing the subject. He clicks off shot after shot, looking for just the right composition, just the right conditions for making an image out of a changing situation. The scene looks to us like the photographer does not immediately know what he is doing. He struggles to find the right conditions for the photograph; he continues to try out different positions in time and space, different illuminations, hoping that one will pay off. Photography records the subject and hence records its meaning. Photography is created by phenomena (Flusser 2011: 283). Paintings, on the other hand, point to phenomena (they mean phenomena). Because photographs are produced by phenomena they necessarily elicit a gesture, the movement of the photographer, the electromagnetic processes of the camera. The photograph is in a sense the translation of this gesture from one form (a physical process) into another (a two dimensional image). The act of photography amounts to, in Flusser’s words, an act of seeing, a fundamentally philosophical gesture, by which an apparatus is used to ‘fix’ an observation (in the way a chemical fixative is used to develop a photograph). ‘In contrast to the majority of other gestures, the point of the photographic gesture is not primarily to change the world or to communicate with others. Rather it aims to observe something and fix the observation, to “formalize” it’ (Flusser 2011: 286). The programme of the camera offers to the photographer new ways of observing the world. The cameraman moves not

so that he can see new things in the world but so the camera might be able to take new pictures, to fix new observations. ‘They handle the camera, turn it this way and that, look into it and through it. If they look through the camera out into the world, this is not because the world interests them but because they are pursuing new possibilities for producing information and evaluating the photographic program’ (Flusser [1983]2014: 26). They look for the undiscovered virtualities in the camera programme. Their interest is on the camera, no longer on the world, which is now redefined as simply a pretext for the camera (Flusser [1983]2014: 26). It is this observation that leads Flusser in Towards a Philosophy of Photography to conclude that the photographer does not work. He does not look for ways of changing the world. Instead, he looks for new information processed through the camera’s programme.

Alfred North Whitehead: Prehension

The above introduction to the media theoretical accounts of McLuhan, Kittler and Flusser offers a framework to consider contemporaneity – the feeling of presence in the present – as produced by measurement and the rules and operations of machines, which can be described as a special case of transduction and transmission. What can now be added to these accounts, and what inspires most of the remainder of this book, is the process philosophy of Whitehead, which offers one of the most rigorous and also one of the most radical treatments of fragmentation and organization in Western philosophy. McLuhan said in The Gutenberg Galaxy that Whitehead's most important assertion remains that the greatest discovery of the nineteenth century was the discovery of the methods of discovery themselves. As mentioned earlier, McLuhan takes from Whitehead an approach that follows the contours of process, not looking to the objects of process such as railways, motorways, aircraft or electronic communication networks, but instead beginning an analysis of cultural change through attentiveness to the methods of invention and the relationship between the original idea and the final product as experienced in the world. This is certainly the Whitehead in Adventure of Ideas, where his metaphysics of process, reaching their full realization, are brought to bear on the history of civilization by tracing the mutations of human experience. This is a story of commerce, slavery, barbarians, theology, steam, democracy and ideals of freedom, which have all at some stage provided the conditions for civilization by mediating between, sometimes bringing together, sometimes pushing apart, human beings. Whitehead's metaphysics, and his consequent view of a history of events, is based on his rejection of the view of brute indivisible matter. For Whitehead,
it is not this or that object that should be given priority (as has recently become fashionable in the Object Oriented Ontology scene). Rather the processes, on both micro- and macroscales that provide the conditions for the becoming of objects. The actual entities, the really real things of the world, are in fact processes that paradoxically never actually exist in formal completeness. They never exist as fully formed objects because they are always either in the process of becoming or the process of perishing, in order to make room for the becoming of the next actual entity. In describing these strange entities at the base of reality, Whitehead gives form to the temporariness of an instant, what he is fond of calling a drop of experience, which constantly fades into inexistence, to be replaced by a new becoming, which likewise begins its perishing. And it is these drops of experience, these actual entities that produce the time and space of their becoming. By emphasizing process and the ingression of the past into the apparent materiality of the present, what Whitehead offers to media philosophers is a possibility for the definition of technology as a crystallization of techniques. Technologies begin to appear as sedimentations of processes, as formal embodiments of ways of doing things, of cultural techniques, that settle in material form. ‘It is gone, and yet it is here’ (Whitehead [1933]1967: 181). The past has perished, but it has not vanished from existence. It exists in as much as it is there for ingression into the present. One could say the same thing about the past of media systems. Media histories are in the past, but they continue to ingress in the present. As Flusser has shown, media histories do not lead from one event to the next in a serial order but are preserved in the present to be activated by this or that technique. As Kittler (2006) has argued, the computer is a return to the universal alphabet developed by the Greeks, reaffirming the ancient unity between numbers and written words. Global networks of communication are, as Serres has argued, rehearsals of much older myths of angels carrying the word. All media retrieve much older ways of doing things. It is in this sense that Whitehead provides a unique way to understand contemporaneity. The common view of what it means to be contemporary involves two entities that share the same spatio-temporal location. Michael Halewood gives the example of two university students enrolled in the same courses at the same time (Halewood 2013: 34). These two could be considered contemporary. However, as Halewood points out, if we understand their becoming using Whitehead, they are not always contemporary. They are contemporary at one moment, when they share a particular place in a temporal system, when their becoming, their learning, their growth, for instance, takes place in a shared temporal system. They stop being contemporary however when they occupy different temporal systems. For Whitehead, it is relations, their mediation, which is key, rather than the idea of staid existence. And

in this way it is possible to exist both in and out of many different temporal systems. Basing his arguments on time on the then recent discoveries in relativity theory, Whitehead asks us to look to the usually ignored and nonsequential systems of time. For Whitehead, it is not that things endure in time, always being contemporary. Instead, each being, each entity is ‘defined anew on each occasion’ (Halewood 2013: 34–35). The two entities at one point assemble to constitute a new entity and at others resume their distinct existence (Whitehead [1929]1978: 199). In other words, the entity is produced by processes of mediation. After Whitehead, in order to study and become critical of the material ‘stuff’ that provides the historical conditions and technical infrastructure for digital temporality, an orientation towards process, energy, movement and change, the way it has been arrested, the way it has been engineered, and the way temporal systems are produced, percolates throughout the next chapters of this book. Following Whitehead and his major project of speculative philosophy of the 1920s and 1930s, this involves an exploration of the base processes that substantiate objects: media as media. It is incomprehensible for media philosophers to ask about the existence of this or that object. It is the processes that are running, the conditions under which these processes run, the prehensions and the kind of results that arise for the actors involved in this process which are far more persistent questions for those of us interested in trying to come to grips with the conditions of contemporary culture and the way techniques have sedimented – come to rest – in the material form of technologies (Schmidt 2008: 99). Another major contribution that Whitehead makes to media philosophy is the development of a metaphysical system that is able to describe the relationship between organism and atomism (between indivisible experience and the divisible measures that scientists use to describe experience). Both are relative terms. ‘The atomic element in the extensive continuum is that which limits, divides and places special emphasis on satisfaction of subjective aim […]. The organic, on the other hand, stresses the continuum as a whole and asks us to see it, not in a divided, limited fashion, but as a whole in which the parts are thoroughly interrelated’ (Miller 1946: 148). Throughout his work Whitehead argues for the philosophy of organism as a way to account for both the experience of the world and the scientific conception of the world. The systems of atomism that Whitehead describes are now mirrored in the media systems that divide the world in order to measure and analyse it. Computers, film and television segment events and offer a view of the world as though a tapestry of divisible and repeatable instants. The atomistic theory that underpins these devices involves the division of events into data but it also involves many other divisions. Film and television fragment the body of a performer with close ups and montage. Furthermore, the body of

the film is always separated into frames; the body of television is separated into pixels. McLuhan told us about the way media separate the senses of their receivers. An eye for print, an ear for radio. Flusser told us about the mathematical ordering of the world as image making apparatuses begin to operate based on automated programmes. Whitehead offers philosophers and artists, as will be argued in what follows, a way to oppose this purely atomistic tendency so acutely pointed to by media theorists. He offers a way to try and come to terms with the divided world by showing how experience emerges from divisions and he ultimately offers a view of what is lost when the world becomes purely (apparently) tractable. Perhaps by re-reading Whitehead in the contemporary media environment we might get a glimpse of the ways of thinking that need to be retained if we, as media theorists, artists and philosophers, are still interested in life and relationships rather than empty materialism. Whitehead might offer us a way to reconceptualize, a way out of, the anaemic present represented by the pixilation of the universe of technical images, where events are turned into a-temporal scenes that dilate into, and in effect overcome, what was once thought of as the past and future. Whitehead offers a way to speak of the present not as a point, a thin slice of time, on a line, but as a temporally thick moment which nests the past and the future within itself, giving them form rather than obscuring them. More will be said on this later. Whitehead’s first category of explanation is ‘that the actual world is a process, and that the process is the becoming of actual entities. Thus actual entities are creatures’ (Whitehead [1929]1978: 22). These actual entities – the creatures, as Whitehead describes them – that populate and constitute his universe, are at their basis formed by process. To put it another way, there is always something happening before there is some thing. And how that thing becomes reflects what that thing is. This is why it is often so violent, traumatic and impoverishing when experiences of the world become reduced to their measurement. At these points, the ‘organic realism’ that Whitehead advocates, as an outlook that engages fully with process and fluent energy, is replaced by the materialism now sweeping through both our experiential and intellectual engagement with media, which measures vectors, rather than following them. As Whitehead ([1929]1978) states, we need to rid any theory of the world of ‘[…] the notion of vacuous material existence with passive endurance, with primary individual attributes, and with accidental adventures’ (309). These theories of the world, that are given to us by analytical media (and by some theorists interested in measuring an object’s claim to ontology) are not only ‘useless’ as Whitehead puts it, but also focus our attention intensively on the present, its materiality, its inertia and its attributes, which does away with the creativity of the world through the adventures of the flux of process.

‘[T]he man adds another day to his life, and the earth adds another millennium to the period of its existence. But until the death of the man and the destruction of the earth, there is no determinate nexus which in an unqualified sense is either the man or the earth’ (Whitehead [1933]1967: 204). Objects are always in process, always becoming other. Two objects can be contemporary, as they become in the same temporal unit, but may not be contemporary at another time, when they occupy different temporal systems. Their becoming at each moment, their relation to other entities, determines what they are at that moment, which differs as they occupy different spatio-temporal systems. When we measure them, when we observe them, we can only speak in abstractions. In an age of analytical media, an age of the post-historical, these abstractions become the foundation for archiving experience and for predicting the future. ‘But a man is more than a serial successions of occasions of experience. Such a definition may satisfy philosophers – Descartes for example. It is not the ordinary meaning of the term “man”’ (Whitehead [1933]1967: 205). Men or women are not simply constituted by the incorporated mind and body, they are the unity of the social coordination. According to Whitehead, humans are in a very literal sense their own society of occasions, which not only involve their own past but also the networks of elemental media that they find themselves within. And it is here, in the conception of occasions with blurred boundaries, which become based on the nexus they form with other occasions, that Whitehead gives us a way to talk about both a togetherness with media, in the in-betweens, and also the fragmentation, the mitigation of life, delivered by the analytical media. He offers a way to think at once about the fragmentation of experience but also to explore the new temporal systems produced by the post-historical. Like McLuhan, Whitehead takes issue with the fragmentation of the body into its capacity for five senses. ‘But the living organ of experience is the living body as a whole. Every instability of any part of it – be it chemical, physical, or molar – imposes an activity of readjustment throughout the whole organism’ (Whitehead [1933]1967: 225). More than an entity separated by five senses, the body acts as an organism, a nexus, of occasions, with blurred boundaries. If taken seriously, Whitehead’s thought here has implications that are twofold. First, as above, it speaks to the connections and togetherness that bodies form with the so-called external reality, which have since been figured in the cyborg writings that follow Donna Haraway’s work. In a way that is vital for the current book, it speaks to the way the image of the body, demarcated in time and space has been produced as an artefact of civilization – a body that is cut off from all other occasions, that inhabits its own bit of time and space and stands apart from the rest of nature. Analytical media are not solely responsible for this situation but they certainly today continue to play a large

part. Second, by reformulating experience as something that exceeds the five senses, Whitehead begins to offer a way to explore the functions, routines and processes that never reveal themselves in any sensuous way but that nonetheless impinge on the experienced world: to get below the surface of what is already given; to look beyond those things that media lift to primacy and to try and get at those things that retreat to the background. Whitehead is beginning to emerge as a philosopher who offers a radically new way to think about the contemporary condition of subjectivity, particularly with regards to mediation. The term ‘radical’, as used here, has a specialized (mathematical) definition. Etymologically, the term comes from the Latin word radix (root). It has a long history in critical theory, designating an attempt to affect ‘roots up’ political, social or economic change. But as both Kittler and Ernst have pointed out, ‘radical’ is also a mathematical term that refers to a function such as a square root. It is in this context that it refers to a symbol that designates a function used to get to the root number. It is also in this sense that media philosophy might become radical by attempting to fulfil the role of the function that seeks out root-level infrastructures for cultural routines and techniques, rather than the surface effects of these ‘below the surface’ routines. The reason that I look back to the radical philosophy of Whitehead is that he both alerts us so keenly to the function of analysis and also offers a way to speak of media without reducing them to objects. When we look closer at artefacts such as film, television and computers, media which seem to continually change and escape attempts at definition, we notice that they are not objects at all but processes. The material ‘stuff’ of media is actually processual ‘stuff’: media are not objects at all but functions and these functions, those processes that reduce the many to the one, that sustain temporal systems for the togetherness of entities, are of exactly the same category to the functions that occupy Whitehead’s thought. Whitehead also contributes to media philosophy by directing thought towards conditions rather than objects. Though very rarely discussing media, neither in his speculative metaphysical project nor his more historically inclined exploration of the development of civilization, Whitehead’s emphasis on transmission and the translation (transduction) of occasions, as the beating heart of process, whether this be the transmission between epochs or the transmission between atoms, sets to work a focus on the conditions for transmission. As he writes in Adventure of Ideas, our theories are warped ‘by the vicious assumption that each generation will substantially live amid the conditions governing the lives of its fathers and will transmit those conditions to mould with equal force the lives of its children. We are living in the first period of human history for which this assumption is false’ (Whitehead [1933]1967: 93). With this focus on instability, to try and explore the contemporary, the post-historical, is to look to these new conditions for transduction, the new

conditions for transmission from one state of things to another, the new conditions for temporal systems where things can be said to be together, and it is the job of media scholars to look to those conditions produced by media as media. In a way that prefigures much of McLuhan’s thought, Whitehead writes ‘[t]hus the grounds, in human nature and in the successful satisfaction of purpose, these grounds for the current routine must be understood; and at the same time the sorts of novelty just entering into social effectiveness have got to be weighed against the old routine’ (Whitehead [1933]1967: 93). Here, we have Whitehead suggesting, much like McLuhan and indeed Kittler, that to understand the current (media) conditions, we should weigh them against older ways of doing things. By doing this, we might see what elements are retained, what are changed, what are heightened. We might see what occasions from the past ingress into the present fact. Using Whitehead to explore post-historical media immediately puts into work a particular style of thought. Whitehead followed a tradition of speculative, rather than strictly analytical, philosophy. As James Bradley (2008) points out, the speculative is opposed to the analytical in as much as it is focused on activity, ‘understood as the activity of actualization which makes things what they are’ (1). Whitehead became important in this tradition because in his work we find the process philosophy of a deep speculative metaphysician and a mathematically inspired focus on the function of analysis as though a mathematical function that confers a single value to an algebraic variability. From this, an observation of the way a mathematical function signals a movement from many variables to one value, Whitehead develops a speculative philosophy of reality by focusing on the process of mediation which gives form and value to experience. Whitehead focuses our attention on the transmission between concrete forms, rather than the forms themselves. He alerts us to what was earlier described using the vernacular of electrical engineering as transduction and this is used in this book to explore the functional conditions of technical analysis and measurement that underpin contemporary media culture. Whitehead ([1929]1978) writes, ‘the elucidation of immediate experience is the sole justification for any thought; and the starting point for thought is the analytical observation of components of this experience’ (4). However, these analytical observations have from their very beginnings been mediated via systems of knowledge that separate the flux of experience into instants that seem to militate against process thinking. And this, the framing of observation via analytical language, must become, if one is to take Whitehead seriously, the main topic for philosophers and media theorists. Whitehead uses John Stuart Mill to argue that the Greeks had difficulty in distinguishing between things which their language confounded, or in putting mentally together things which it distinguished. This was a language, as Kittler has shown,

that was linked inextricably to the Greek’s numerical system and the logic of separating experience into values. The school of Greek speculation, and the logic that followed through to the middle ages, was based on ‘[…] a sifting and analysing of the notions attached to common language. They thought by determining the meanings of words they could become acquainted with facts’ (Mill in Whitehead [1929]1978: 12). After Whitehead’s modulation of Mill, there is little reason to look to words, the separate measurements taken at instants, as pointing thought directly to the experiencing things in the world. Instead, following Whitehead, the best we can do is to look to the rules of language that produce these images of the world while simultaneously attempting to grasp the buzz of the world that withdraws from this language. If ‘every occurrence presupposes some systematic type of environment’ (Whitehead [1929]1978: 12), the implications for media philosophers are clear: it is towards the systematic environment, the conditions for experience, which are always in a process of being produced, that we should direct our critical inquiry and ask how this both excludes and produces the experienced world. On both technical and social levels, before there is an object, there is a process. Before there is an image on a screen there is a transmission, before there is an object that we call mass media a number of technical solutions and social changes need to play out. The object is always an outcome, however unstable, of these processes that continually reinvent the medium. Media theorists such as Sarah Kember and Joanna Zeilinska (2012), Doug Kahn (2013), Wolfgang Ernst (2013), Mark Hansen (2015), Andrew Murphie (2013), Sean Cubitt (2014) and Matthew Fuller (2005), to give a list that is in no way exhaustive, have all illustrated in unique ways how a process-based approach to media studies offers insights into the complex interactions between technological, cultural, economic, political and experimental entanglements that give rise to contemporary media culture, moving between the fields of media studies, philosophy, and history of technology. This book situates itself within these contexts and attempts to add something to the discussions that mark out these ‘in-between’ fields through an analysis of the material– technical, engineered and historical conditions for digital temporality. It does this not only through an exploration of the technical properties of the computer but also through an exploration of the engineering solutions that led to the development of other analytical media such as chronophotography and experimental television. As mentioned earlier, the major thrust of Whitehead’s Philosophy of Organism is that it is process that makes up reality, not subatomic particles or conscious human minds. In putting forward this position, Whitehead establishes a speculative metaphysics based on what he terms ‘actual entities’, ‘the final real things of which the world is made up’ (Whitehead [1929]1978: 23). Whitehead ([1920]2007) uses the term ‘entity’ in the Latin sense, to mean

‘thing’ (5). But importantly, these ‘things’ or ‘entities’ are always in process. They are always happening or becoming, largely based on the entity’s impression, in a non-cognitive sense, of its environment. This involves the entity’s immediate surroundings and also its environment in a more temporal sense, as a past and future that impact upon the present. Following this, an actual entity is never static; it is always pre-existent, becoming based on its prehension of its environment, or post-existent, perishing to pass information to a subsequent entity. Whitehead attributes this process to every entity in the universe and it is this process, the becoming and perishing of actual entities, that initiates what we commonly think of as objects. ‘Yet the present occasion while claiming self identity, while sharing the very nature of the bygone occasion in all its living activities, nevertheless is engaged in modifying it, in adjusting it to other influences, in completing it with other values, in deflecting it to other purposes’ (Whitehead [1933]1967: 181). The present occasion not only finds its foundation in the past but is also actively involved in reshaping these occasions, in making them other, and then becoming constituted via the ingress of the other into itself. The creativity of the world, for Whitehead, comes from this process by which the present occasion takes past occasions into itself and makes these other based on its own set of conditions for becoming, whether this be the events of civilization such as the movement of ideas from Greek Philosophy into Christian Theology or more subtle events such as the movement of a melody line or the persistence of emotions in the experience of a human being. The concepts of actual entities and prehension run throughout Whitehead’s metaphysical work. This is a mode of thinking that finally reformulates the ancient atomism of Democritus and makes it work with a philosophy of organism. For Whitehead, the world exists as drops of actual occasions. But these actual occasions, these smallest things at the base of reality, the atoms, the bits, the particles, are constituted by the connection and relationships that are formed between them. The present occasion draws into itself the past and becomes datum for the future. It is in this way that we can start to conceptualize the present as thick with time and overcome the portioning of analytical media. For Whitehead, the actual object is always an event and as such extends in both space and time. It has blurred boundaries. To think of an object as stable and compartmentalized from the goings on around it is to abstract the object. It is to think in abstractions rather than in concrete terms, it is as Whitehead ([1929]1978) writes, to rehearse a ‘fallacy of misplaced concreteness’ (7–8). To try and think about the unperceivable connections which in fact give these objects their character is the task after Whitehead but it is also, as will be shown in what follows, the very thing that is obscured in the world of analytical media. It is not enough to think of the post-historical as an eternal state of presence in the present. A way needs to be found to

conceptualize the new temporalities, the new possibilities for rich experience, the new relationships between past, present and future that are now folded into one another, rather than represented on the line of History. In Process and Reality, Whitehead writes that it is the task of philosophy to bring to the surface what has been obscured by the selective process of subjectivity. He challenges those of us interested in grappling with the complicated phenomena associated with the rhythms and temporality of life to bring back into focus those elements of experience that have been filtered by our usual ways of coming to know the world. Like Deleuze after him, he sees philosophy’s role as experimental, bringing into view, in a way that a traditional linear history of progressive scenarios cannot, the pure event. It is the media philosopher’s task, as somewhat different to this, to instead look to the process of radical filtering – radical in the mathematical sense of a function – and contextualize the way this function produces media theories of the world. After Whitehead, philosophers interested in questions of ontology should look to the perpetual perishing of actual entities. Media philosophers, on the other hand, that are interested in questions of technologies of communication and their cultural and intellectual effects, should look to the technical way that these events are ordered, preserved, transduced and transmitted in ways that attempt to mitigate the perpetual perishing of the world and in so doing work against novelty and for melancholia. In reflecting on this, finding new ways to describe being-in-time, we might offer a way out of this condition.

2 Media Aesthetics

After Kittler, the questions for media philosophers have been based around reflections on the way techniques of data processing determine a situation. If we hope to retain the sense of the human within this approach, then this question is one in which we need to focus on media as the in-between where networks between humans and technology are processually formed. The current situation, this in-between, is one where analytic media have displaced synthetic media and have intensified the link between measurement and culture. In attempting to describe this change, the chapter makes the following points:

1. The cinematic temporality that was once used by philosophers and film theorists to conceptualize modernity has been replaced by a different type of mediated, non-anthropocentric organization of time.

2. The organization of time by digital media and the techno-mathematical developments that support it, including information theory and the techniques of archiving and segmenting experience, provide a way to describe what thinkers such as Boris Groys, Terry Smith and Giorgio Agamben refer to as 'the conditions of contemporaneity' or what Flusser refers to as post-history.

FIGURE 2.1 Jim Campbell, Exploded View (Commuters), 2011, 1,152 LEDs, custom electronics, wire, steel, 72 × 46 × 38 inches, photograph by Sarah Christianson. Courtesy of the Artist.

In Jim Campbell’s Exploded View (Commuters) (2011), a thousand LEDs suspended from the gallery ceiling flicker to give the impression of moving bodies, commuters making their way through Grand Central Station. The images have the character of highly pixelated CCTV surveillance footage, where nothing remarkable happens. Prehistorical man once created images to come to terms with the world. A picture of a bull on a cave wall allows you to hunt one with more precision (Flusser [1985]2011: 11–12). Historical man wrote texts to explain images, which became irreversible events in time, where cause follows on from effect. The hunt became a story, a unique event. Post-historical man allows apparatuses to take on new roles, so much so that he now becomes part of the technical image making apparatus itself and takes on the role of a functionary within the system. Campbell still programmes the system, but he does not create the images per se. The light bulbs work together to form an apparatus whose internal relations give form to images. The shadows in Campbell’s work appear to pass by, moving through the station, because of the carefully timed switching and dimming of the lights. The message of the artwork is not in its content but in the environment created by media. The artwork is not about transmitting a meaning but about creating an environment for viewing the world. The images are created by relations set up by the apparatus: between light bulb to light bulb; between artist to medium, between medium to audience. The image of the world is fragmented by an apparatus and then reassembled by the viewing conditions (standing just far enough away, at the correct angle, the lights operating as intended) set up by the apparatus. In order to come to grips with these new images of the world we need to, as Flusser ([1985]2011) tells us, ‘start not from the tip of the vector of meaning but from the bow from which the arrow was shot’ (50). This is an inquiry directed towards media aesthetics, which involves unpicking the relationships set up by the apparatus. This involves first, thinking about the way that the technical function of the apparatus is articulated in culture, second, exploring the relationship between meaning and the realities of signal processing and third, addressing the way that the history of media, the discontinuous development of techno-mathematical machines now embedded in what we call contemporary media, provides what Kittler described as a technical a priori. Instead of looking towards images, this would be to analyse the technical processes that are the legacy of von Neumann, Babbage and Turing, the centrality of storage to communication and information processing, the flow of current through circuits and the developments in media technology, which underpin the technical image and the conditions for in-betweenness. As mentioned in the Introduction, these elements, the pixel, the point, the meaningless, those things that Campbell begins to move to the forefront in his low-resolution installations, are not usually experienced in any direct

way. Apart from in media art installations such as the Exploded View series, they are always hidden behind images. The media always disappears in order that the message can be delivered. But they are nevertheless experienced vicariously through technical images and through the meanings that arrive at these images once they have been filtered by the technico-aesthetic system that reorganizes the signals picked up from what Flusser ([1985]2011) once described as the ‘swarm of particles and quanta’ that now constitute the technical universe (10). These elements, those things that take place between the sender and the receiver in Shannon’s mathematical model of communication, the messengers, although meaningless in the conventional sense, provide the conditions for the possibility of experience, the possibility of communication, the ‘imperceptible background’ (Hansen 2015: 143), and, as will be argued in what follows, are able to be felt, albeit indirectly and unidentifiably, through the domain of experience. Analytical media provide the imperceptible background through their operations that break a perceived, already given whole into its constituent parts (Lechte 2002: 300). Installations such as Campbell’s low-resolution works allow us to think about this function. They make the background perceptible; we see the three-dimensional grid of LEDs filling the installation space. The impression of moving bodies is created as the LEDs dim in precisely the right alignment. The background is made foreground and the figures are only seen due to the absence of the background.1 The figures in this sense become the background upon which we see the organization of the imaging system. It is an inverse of the function of most optical media. It turns on its head synthetic media. The analytical, like that given form in Campbell’s work, involves the dissolution of the present into timeless moments. The background is a timelessness of points of light, rather than a time-specific event. The analytical measures what is thought to be self-evident so that it can be plumbed; it scrutinizes what is already there. The synthetic on the other hand produces novelty; it is contingent and moves forward into a field of chaos, creating something new, becoming, producing the unformed, causing things to appear. Analytical media take events as a pre-given occurrence, subtract time and analyse them as discrete instants, making them into post-historical scenes, no longer moving towards an unformatted future. Once the conditions of the present are established, the future is able to be predicted and hence ceases to exist. The computer is not the only analytical medium, but it is the one that provides the most easily identifiable metaphor for the analytical tendencies of digital culture, so much so that theorists have begun arguing that the algorithm actually provides the means with which to understand contemporary life (Beer 2009; Slavin 2011; Neyland 2015; Totara and Ninno 2014). The mathematical ordering of events not only takes place in the world of computers but has proliferated throughout mass media. It is just that this

function has been usually invisible in twentieth-century media culture. This chapter, in a way that establishes a position for the rear-view analysis to come, begins by looking at the technical functions and moments in both computer engineering and the experimental development of what we now call mass media that in invisible ways underpin what we today think of as digital temporality. It then explores the ways in which contemporary media artists have attempted both to capture what it feels like to be alive in the present and to offer a way of making this present function in new ways. These works attempt to upset the analytical tendency of contemporary media, to reverse digital culture, making it productive, making it in that way dysfunctional, making digital media produce, rather than reduce, new temporalities. The chapter then concludes with an illustration of the way post-historical media can be reconceptualized after these works, using an example of citizen journalism distributed via YouTube. Gathering together these examples, these fragments, I try and show how art and media philosophy can begin to mitigate the segmentation of time by analytical media in digital culture by envisioning and exploring the new modes of temporality nested within the digital present.
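The fragmentation that Campbell's installation makes visible can be given a concrete, if simplified, form. The following sketch illustrates the general technique rather than Campbell's actual electronics: the frame, the grid dimensions and the block-averaging rule are all assumptions made for the purpose of demonstration. It reduces a captured frame to a coarse grid of brightness values, one value per point of light:

```python
import numpy as np

def to_led_grid(frame, grid_h=18, grid_w=32):
    """Reduce a grayscale frame to a coarse grid of brightness values:
    each cell is the average of the block of pixels it replaces."""
    h, w = frame.shape
    bh, bw = h // grid_h, w // grid_w
    cropped = frame[:grid_h * bh, :grid_w * bw]   # drop any remainder pixels
    blocks = cropped.reshape(grid_h, bh, grid_w, bw)
    return blocks.mean(axis=(1, 3))               # one brightness value per 'LED'

frame = np.random.rand(720, 1280)   # an invented stand-in for a captured frame
grid = to_led_grid(frame)
print(grid.shape)                   # -> (18, 32): the scene as discrete points of light
```

Stand close to such a grid and only the discrete points remain; stand far enough away and the points are read as a moving figure, which is the inversion that the low-resolution works described above stage in the gallery.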

Archives and the engineering of time

Wolfgang Ernst, one of the key voices in media theoretical considerations of time, like Flusser, has taken a technico-mathematical approach to reconceive the time of the digital present. As Ernst (2015) argues,

In digital culture more than ever, the present is immediately quantized, 'sampled and held' (the electronic precondition for real-time digital-signal processing). The audio-visual and textual present is being archived as soon as it happens – from Twitter messages and instant photography to sound recording. But even more dramatically undoing the traditional order of times, big data analysis algorithmically predicts the future already as future-in-the-past (futurum exactum). Never has a culture been more dynamically 'archival' than the present epoch of digital media. (22)

The archival rather than the historical character of the contemporary is a product of technical media that is able to process signal from the world based on the von Neumann architecture and store this data along with instructions (algorithms) for processing future signal. The archive not only holds onto the past, but the future perfect as well (what Ernst calls the futurum exactum), as a future that has not yet but inevitably (statistically) will take place. The present begins to dilate into other spheres of time (past and future). It becomes the condition from which the future and the past are written.
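What Ernst describes can be sketched in miniature. The following illustration is not drawn from Ernst; the timestamps, the readings and the linear model are all invented, and stand in for whatever big data analysis actually operates on. It shows how an archive of samples, held the moment they occur, already contains a statistically written future:

```python
import numpy as np

# A hypothetical archive: measurements held the instant they occur ('sampled and held').
timestamps = np.arange(10.0)                                  # ten archived instants
readings = 2.0 * timestamps + np.random.normal(0, 0.5, 10)    # invented measurements

# A trend fitted to the archived past...
slope, intercept = np.polyfit(timestamps, readings, deg=1)

# ...writes the next instant before it happens: the futurum exactum.
predicted_next = slope * 10 + intercept
print(round(predicted_next, 1))   # close to 20.0, foretold from the archive alone
```

The point is not the triviality of the model but the direction of the operation: the prediction is produced entirely from what has already been stored.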

Luciano Floridi, the important philosopher of information, offers a similar formulation in his description of what he terms ‘hyperhistory’. According to Floridi, the historical subject lives with a world where information communication technologies (ICTs), since the development of the protowriting systems around 8000 BCE, are used to record and transmit data. In so-called ‘hyperhistory’, ICTs record, transmit but also process data autonomously and, because of this, human societies become vitally dependent on them as a resource (Floridi 2015: 51–52). What we find in this element of Floridi’s thought, as well as that of both Flusser and Ernst, is the possibility for a media philosophy that offers a way to think about mediated time and temporality in terms of the post-historical. Beyond recourse to discussions about Historical time and its relation to the ‘cinematic time’ that marked out modernity – a type of time that was represented by a succession of images, moving towards the ungiven and ungiveable whole – a media philosophy of time that focuses on the non-discursive function of media, which is radically different from an emphasis on the synthesis of discursive media, offers a way to describe the new and complex temporalities produced by time-discrete signal processing, which are not so much without time, as Bergson once famously argued, as overfull with a type of time that is very different from human, historical, time. Digital media work based on the digital principle of discrete signal processing. But when we reflect on these processes from a media philosophical perspective, looking past purely human experiences of time to the way it is measured by technical media, it might be revealed that the time-discreteness of the digital has the potential to open up vastly new, multiple and folded modes of temporality. But first, let us look to the engineering of the instant as a discrete moment in time.

Time-discrete media

The instant, the supposedly timeless, has been given formal importance in the engineering of media apparatuses themselves. The privileging of the time-discrete occurs most obviously in terms of digital signal processing, where, mainly thanks to the work conducted by the nineteenth-century mathematician Jean-Baptiste Joseph Fourier, samples are taken of a continuous wave function, held and reassembled. The Fourier transform is a process that is instrumental to the operation of digital signal processing. If we follow to its conclusion the process-oriented thought outlined in the last chapter, which holds that human and technical systems become concrescent at moments of individuation, this operation, not just technical anymore, also becomes instrumental to the analysis of the temporality of contemporary media culture.

Fourier made the mathematical discovery that any periodic function could be represented as a sum of sinusoids at multiples of a fundamental frequency. Periodic functions could be dissected into smaller parts, which may be either time-continuous or time-discrete, and then reconstructed. This was a discovery that was to become foundational to information theory. This approach, however, of conducting analysis through the segmentation of a whole was an example of a larger cultural technique that was being undertaken in fields such as medicine, criminology, heredity and biometry, largely supported by the new statistical methods introduced by figures such as Francis Galton and Karl Pearson, which emphasized a data science that was focused on the localized and the discrete from which inferences could be made. The Fourier transform, as a mathematical operation and a way of computing the world, seems the example par excellence of this then new media theory of the world, which would later be represented in the 'particalized' media philosophy of Flusser and the mathematical theory of information developed in the field of engineering by Claude E. Shannon. The Fourier method is important to scientists and media philosophers alike because it represents a moment when the contingent became calculable, when it was realized that the technical media of audio and visual recording for the first time began to record previously invisible and inaudible elements of reality, those things that oscillated either too fast or too slowly for human eyes or ears. The purely unrepeatable 'becomes visible as the sum of decimals, and thereby also becomes repeatable' (Krämer 2006: 101). As in Campbell's installation, the background becomes perceptible. As distinct from language and the phonetic alphabet, the Fourier transform was based on the calculability of the irregular, the organization of the chaotic. The synthetic, the media that became productive of chaos, was replaced by the analytic, the media that was able to record and hence quantify the contingent. The Fourier transform refers to an analytical process where a time-continuous waveform is decomposed into the other time-continuous frequencies of which it is constituted. It is a process where a constant function of time is able to be separated into sinusoids, its modulating constitutive elements of sine and cosine functions. A numerical variation of the Fourier transform, the discrete Fourier transform, operates on samples taken from a time-continuous signal, allowing it to be processed and transformed as time-discrete values. By this operation, not only can a waveform be represented as a sum of sinusoids but it can be further broken into segments and represented as discrete samples. This has proven to be foundational to digital signal processing and compression, where algorithms can be developed to filter out undetectable frequencies and hence reduce the number of bits needed to transmit a reliable signal. For the material world of vibrating signal, the Fourier method achieves a numerical dissection. The Fourier method now gives signal, from sound waves to photography to the analysis of complex change-over-time functions, a unique temporal character that underpins its capacity for representation. Both in terms of the technical qualities of images and the images themselves, it is the instant, the sample, the point, that matters.
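To give this operation a minimal formal statement (the standard textbook formulation, included here only as an illustration), the Fourier transform expresses a time-continuous signal f(t) as a spectrum of frequencies, while the discrete Fourier transform works only on N samples taken from such a signal:

\[ \hat{f}(\omega) = \int_{-\infty}^{\infty} f(t)\, e^{-i\omega t}\, dt \qquad \text{and} \qquad X_k = \sum_{n=0}^{N-1} x_n\, e^{-2\pi i k n / N}, \quad k = 0, 1, \dots, N-1. \]

In the first expression the signal remains a continuous function of time; in the second, only the sampled values x_0, ..., x_(N-1) exist for the apparatus, and every frequency the machine registers is computed from these discrete points, which is precisely the sense in which the instant, the sample, comes to matter.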

Measurement and media

Ernst uses the example of Édouard-Léon Scott's phonoautograph to illustrate the way mass media have roots in measurement and analysis, those things heightened in Campbell's installation, which owe a great deal to the discoveries made by Fourier (Ernst 2013: 184). Ernst's example is a good one to use to begin to see how the apparatus of measurement, once relatively benign and specialized, becomes tangled up in life and forms part of the elemental surroundings that we now call mass media. Scott's phonoautograph, patented in 1857 and preceding Edison's and Berliner's inventions, was used to record sounds as a linear inscription on blackened paper. A membrane was set behind a bell mouth, which was used to amplify incoming sounds and direct these in such a way that set the membrane vibrating and caused a wire brush to trace the frequency of the vibrations, hundreds of them per second, onto a cylinder, giving material form to the mathematical theory proved around thirty-five years earlier by Fourier. Scott's device was never intended as a playback medium but was instead developed as a laboratory instrument to study acoustic waveforms that were both audible and inaudible to humans. It was not until later that it was discovered that the recorded inscription contained enough information about the sound to actually be used to play back the recorded sounds. Kittler similarly points to the analytical function of the phonograph and gramophone, which retrieve the original function of Scott's invention, where he argues that the analysis of speech – an analysis of the material rather than the symbolic – that was afforded by the invention of recorded voice machines was instrumental to the vast changes to discourse seen around the end of the nineteenth century. These machines were able to be used for analytical purposes by slowing down the playback speed to reproduce for human ears sounds previously unable to be notated. The gramophone and phonograph, when used as a talking machine, were able to allow slow-motion studies of single sounds that were previously unavailable. This made the devices ideal in laboratories for measuring hearing. In schools, these devices were instrumental in allowing the analysis of 'the most fleeting, unrepresentable and yet so important, characteristic aspect of language, of line phonetics (speech melody) and of line rhythm' (Surkamp in Kittler [1985]1990: 233). Real phenomena were now able to be stored and played back depending on technical standards, which importantly included the temporality that the medium inscribed on the phenomena. The real began to be defined by what apparatuses could pick up, store and play back, including noise.

Like the phonograph, the camera was also to be applied as an analytical medium. This is perhaps most obvious in the work of the French scientist Étienne-Jules Marey and the English photographer Eadweard Muybridge, who both famously studied movement via the techniques associated with chronophotography. Before the introduction of the photographic image to this technique, scientists such as Joseph Plateau and Simon Stampfer experimentally demonstrated how a spinning disk could be used to mobilize still images into apparent continuous motion (Wade 2016: 4). Drawing on earlier work concerning the persistence of vision, Plateau and Stampfer were able to show how moving reality could be reduced to still images of incremental progression. Scientists such as Albert Londe, working at the La Salpêtrière Hospital under Jean-Martin Charcot, could then use this discovery to record the movements and disturbances of patients with varying neurological disorders and isolate frames in time that could be studied closely and precisely. The real became that which could be ordered by the apparatus. More will be said about this significant moment both in terms of visual culture and the history of analytical devices in the next chapter. The television was likewise experimentally applied as a device for measurement before it became a mass entertainment medium. One example of this is when Edward Appleton, an assistant to the famous Scottish inventor John Logie Baird, used the transmission system of mechanical television to study the possibility of radar sensing. Appleton, working with Baird on his thirty-line mechanical television transmissions for the BBC, noticed that the telegraphic reception of the image often became susceptible to 'ghosting', where a faint image would appear slightly displaced from the original. This 'ghost image' was the result of two signals transmitted from an identical source, reaching the receiver at slightly different times. For those interested in the fidelity of television transmission, this second delayed signal amounted to visual noise and was an obstacle to be overcome. For Appleton, this 'ghost signal' was able to be used as an experimental tool and indicated to him important ways to understand the phenomena associated with the technique of radio ranging. Appleton realized that the second signal was the effect of the television signal reflecting off the ionosphere. By calculating the delay between the original image and the 'ghost image', Appleton was able to measure the distance between the object and the receiver, an important breakthrough in terms of the development of radar systems (Baird, Brown and Waddell 2005).
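The arithmetic behind this kind of measurement can be sketched in an idealized form (the figures below are illustrative and are not drawn from Appleton's own experiments): if the 'ghost' signal arrives a delay of Δt after the direct signal, it has travelled an extra path of roughly c Δt at the speed of light c, and for a reflection from a layer roughly overhead the height of that layer is about half this extra path:

\[ h \approx \frac{c\,\Delta t}{2}, \qquad \text{so that } \Delta t = 1\ \text{ms gives } h \approx \frac{(3 \times 10^{8}\ \text{m/s})(10^{-3}\ \text{s})}{2} = 150\ \text{km}. \]

The delayed image is, in other words, already a measurement; the apparatus needs only to be read against the clock rather than against the eye.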

In Scott's experiments, as with Appleton's and Marey's, the experimental media, the phonograph, the camera, the television, reformulate the experimental subject as frequency modulations, in the case of Scott and Appleton, or as discrete moments in time, in the case of the analytical photographers that followed Marey's work. These are the material, technical elements of culture. The multitemporal crowd of mourners in Courbet's painting were defined by the hole in the ground and the ritual that it materializes. It drew them into itself, captured their gaze; the ritual of the burial defined their roles and behaviours. For viewers of the painting, the figures are able to be characterized as mourners, as officials, as eulogisers, based on the conditions provided by the ritual of the burial. The subjects of technical media have now been defined by a different grounding and a different set of rituals as either frequency modulations or, as is now more common, discrete points in time. The term analytical, from the Greek analytikos, means to dissolve a whole into its smallest parts. It signifies self-evident, codified, segments of information. The Greek word analysis is related to the other Greek word for separation, dialysis, which also means to dissolve or separate. The difference lies in the prefixes ana and dia.2 Dia, which signifies the state of being apart, is replaced by ana, which stands for 'again', 'anew' or 'repetition'. In dialysis, there is a loosening apart. In ana-lysis there is a repeated loosening. One whole subject to dialysis might at one point, after the event, be reunited. Analysis continually separates wholes into new holes. It continues to decompose the things it produces. The crystallization of techniques of analysis into material technical realities – the measuring of phenomena through their dissolution, the task undertaken by the phonoautograph recording short intervals of sound and allowing its analytical graphical study – produces the present as a post-historical moment. The ana of analysis refers to the repeated dissolution and fragmentation of wholes, which in media theory after Kittler replaces the sign system of language in presenting the world – a literal making present of the world – through its media technical operation.

Out of cinematic time

Prior to the dominance of analytical media, throughout the twentieth century, it was the synthesis of mass entertainment media that conditioned the mediation of the world. Ever since cinematic images, the synthetic medium par excellence, became widespread, and perhaps even before this moment (Doane 2002; Stiegler [2001]2011), it had been claimed that civilization, at least in the West, had begun to occupy 'cinematic time'. By presenting apparent continuity through its framing of time as 'stopped' moments, the cinematic apparatus participated in a larger project of modernity where the rhythms of the day were regularized and standardized (Doane 2002). According to E.P. Thompson's formulation (1967), around the birth of the era commonly referred to as modernity, the work day for much of the population of cities in the West became institutionalized and workers began to base the rhythms of their work on mechanisms.

Pre-industrial society waited. It waited for the harvest. It waited for the seasons to change. Industrial society moved things in front of itself. Like synthetic media, it transformed the environment. One worker in line finished their task. The next was then able to commence another. As the product, whether a motor car, children's toy, furniture or textiles, rolled along the factory line, it was assembled with a repeatable precision. This is a relatively simple example of a technique based on the chronology produced under capitalism, but it could be extrapolated into other contexts such as financial markets that represent time on stock tickers or historical time that represents single events within the narrative chronology of eras, with cause and effect predictably following on from one another in relatively 'shallow' time loops. Industrial time, Historical time, is linear; it progresses towards the new. 'It comes from the past and demands the future. Nothing is, everything becomes' (Flusser [1983]2013: 119). As Zielinski points out, this is in complete accord with 'the dogged regularity with which the film carrying the photographs was moved on a fraction, 16 or 32 times a second (according to shutter-type), stopped, illuminated, and then transported again' (Zielinski [1989]1999: 79). Movement was towards a future, the medium was synthetic, the focus was on what was about to become, what was about to appear. A set of cultural techniques was developed for ordering time that was supported by the technical hardware of the cinema, which presupposed a particular 'capturing' of contemporary life by the camera and its transformation and representation as an image always pushing towards the future, towards the next frame to come. The apparatus acted as a device that, as Greg Lambert (2016) writes, 'literally causes ("makes ready") something to begin to appear – sexuality, power, the state, God, etc.' In this case, it was time that the apparatus caused to begin to appear. Given an aesthetic form in the cinema, the time that filtered through modernity was regularized, ordered, packaged and importantly able to be represented. The cinema not only participated in the organization of modern time through its technical function but also, in its self-reflexivity, produced images and stories that reflected the temporalities of life under capitalism, modernity and media machines. This is an argument that is rehearsed in film theory from Deleuze's Cinema 1 ([1983]2005) and Cinema 2 ([1985]2005) books to the equally important work of Doane. But things (as much as they are in time) have changed (as much as time has undergone significant mediations that, put simply, are beyond the cinematic image). The audiovisual apparatus, the disposition, a time-critical rather than ontological system, has changed and so has the time that it causes to begin to appear. 'The worker experiences reality differently from the functionary' (Flusser [1983]2013: 27). The present, rather than its becoming, is brought into view. 'The present is the totality of the real. In it all virtualities are realized. They "present" themselves' (Flusser [1983]2013: 119). Even the past is now stored in the present. Even the future becomes coordinates of the present.

In the present, the waiting begins again, but because we can already envision the future, we wait for something else. What this something else is will become clearer both in Chapter 5 and at the end of this chapter. In its concrete actuality, the actuality that Peter Hallward (2006) draws our attention to, the irreducible flow of time is in fact now made reducible to discrete instants, and hence made less productive. In terms of the way that the dominant apparatus of our time, the digital computer, 'captures', 'downloads' and 'transforms' human subjects, it seems that it is time-discrete data and momentary discontinuities that are now to be privileged, rather than the acoustic, free-flowing environment that McLuhan once described. Perhaps Bergson was right. It is no longer the durational flow that defines contemporary time but rather a return to the function of analytical media that both McLuhan and Kittler alerted us to, as time-discrete signal processing. Once the cinema defined its subjects (in images and stories), now the computer defines its users (as data), just as the phonoautograph, the chronophotograph and Appleton's experimental television system defined their subjects. The operations of media, such as the cinema, and the techniques that they reflect are, as they always have been, functions that give users a sense of the rhythms of the real. This observation is what led Siegert (2015) to argue that it was the material-symbolic infrastructures supplied by techniques of signal processing that have constituted the becoming subject within the world. The mirror of cinema is replaced by the measurement of analytical media. Siegert, drawing on Max Bense, argues that the world when defined by computers is determined as a signal processing that replaces beings with frequencies, attributes with functions and qualities with quantities. To arrive at a media theory of the world, objects, processes and attributes need to be redefined in media technical language, as the world is made understandable both by and for the computer (Heidenreich 2015: 137). Lacan once offered film scholars a way of discussing in psychoanalytical terms the subject arrested before cinematic images, as though an infant before a mirror, defined by experiences given through the camera. To know that I am alive, to know what I am worth, I look to the reflections offered by the other. They tell me what it is like to be alive. Is it now the cultural techniques associated with digital computation that fulfil this role? In the tradition of humanist media studies, however, the computer is, like all other technology, often reduced to a tool. Because of this, the humanities have been able to simply describe the tool based on its appearance to a human user, not based on its own characteristics, which condition the way it is used. 'Those who have tried to pour the fuzzy logic of their insights and intentions into computer source code know from bitter experience how drastically the formal language of these codes distort those insights and intentions' (Kittler 2006a: 49). Logic is poured into the hole, but it first needs to be transformed.

Feedback loops lead from the machine to the human user, from the hole in the ground in Courbet's painting to the mourners (the tomb contextualizes the actions of the figures), not the other way around. Rather than the tool being defined from the point of view of the user, we need to start at the other term in the equation. How is the user defined by the computer? What operations are permissible in terms of the rules of the specific coding language? How does the tomb describe the mourners? Charles Bachman, one of the first people to realize an effective database management system for computers, stood before a computer at General Electric watching a file run through it. Based on the earlier sequential punch card system, the magnetic tape that he was using contained data in a sequence, with each piece of information assigned a unique number. Bachman requested some data be returned by the computer and it sorted through the sequence in sequential order until the correct number was found. Whether looking at social security numbers, purchase orders or bank account numbers, this process was the same. But when Bachman was given access to a new direct access machine, things changed. He no longer stood before the computer as tape passed through it, but used the computer to search through a database. The database, the storage system rather than the computer, was at the centre of the information processing universe (Bachman 1973: 654). Data no longer went through the computer in sequence but into the database and was stored. An opportunity arose for him to dramatically change the face of information processing. He saw the potential for the machine to access information by probing for a record, rather than sequentially sorting through all the data (Bachman 1973: 654–655). This was not only a breakthrough in terms of information systems but would also have dramatic consequences for the way all kinds of events could be archived and organized and for the way a subject could come to terms with a world of data. The subject was no longer a stationary figure that stood before things passing before his eyes. The subject was now defined as an operator able to dive into an n-dimensional data space.
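The difference can be made concrete with a deliberately anachronistic sketch (written here in present-day Python, and in no way a reconstruction of Bachman's actual system): the tape-like store must be read through in sequence until the wanted record passes the reader, whereas the direct access store is probed with a key and answers immediately.

# A deliberately anachronistic sketch: sequential (tape-like) access
# versus direct (keyed) access to the same set of records.

records = [
    {"id": 101, "holder": "account A"},
    {"id": 102, "holder": "account B"},
    {"id": 103, "holder": "account C"},
]

def sequential_lookup(records, wanted_id):
    """Run the whole 'tape' past the reader until the wanted record appears."""
    for record in records:  # every record flows past, in its stored order
        if record["id"] == wanted_id:
            return record
    return None

# Direct access: the store is indexed, then probed with a key rather than scanned.
index = {record["id"]: record for record in records}

def direct_lookup(index, wanted_id):
    """Dive straight into the data space by asking for a key."""
    return index.get(wanted_id)

print(sequential_lookup(records, 103))  # the subject waits while the sequence passes
print(direct_lookup(index, 103))        # the same record, probed for directly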

Dimitris Eleftheriotis (2010) has written that 'a linear, incremental and forward movement of a progressing subjectivity travelling towards ever-increasing knowledge' became the all-pervasive metaphor for life in the eras of modernity (12). In Lutz Niethammer's (1992) words, '[t]he twentieth century is distinguished by the fact that the abstract, linear understanding of time which marked the human sciences in the eighteenth century, as well as the historical conception of nature in the nineteenth, have entered into the everyday life of society' (26). Or as McLuhan (1962) put it, after the Renaissance, a world of multiple durations was replaced by a new lineal world, as people were translated from a world of 'roles' into a world of 'jobs'. 'As literate man entered the twentieth century, he was working, eating, sleeping and making love strictly according to the clock' (Kalmar 2005: 229). At this point, work becomes specialized and the senses fragment. The ear, the eye, touch, taste, smell can be used to develop their own brand of knowledge, with the eye and linear reading being privileged. Both McLuhan and Flusser wrote that the eye moves along a line of print towards ever-increasing knowledge, like the film moves through a projector, towards the future, or like the tape moves through the computer at General Electric. But now, the contemporary subject knows this experience less and less and is instead presented with a condition that could perhaps only be described as the aftermath of the accumulation and measurement of the stories of modernity. Historical subjects put things in front of themselves. Subjects of post-history wait and occupy themselves with the stories that have been accumulated in storehouses of data. To put it another way, Historical subjects put things in front of themselves. Post-historical subjects put things in storage. This will be fleshed out further along in the chapter with examples. For now though, we might say that in the face of ubiquitous computing, cinematic time – a time that does not stand still but instead flows – has been replaced by a time that is discrete, instantaneous and has the potential to be thick with multiple temporalities. This is a time that 'breaks the surface' of cinematic time, that gets 'exposed' and separates itself from the time of succession (Virilio [1997]2008: 27).

Con-temporary media art

If media art is to have a value, it is to explore the conditions of analytical media, to mitigate its timelessness and to provide new ways to address the temporality, the radically non-cinematic qualities, of the present. Contemporary media art indeed only has one option if it hopes to articulate the experience of life in the present. It now turns to the instant. If art today is to be considered contemporary, to embody ways of being-in-time, following the formulation of the art historian Terry Smith (2009), its subject can only be the ontology of the present. But in taking on this subject, contemporary art seeks to make the present dysfunctional; it seeks to make the present function in a radically different and nonproductive way. This has always been the task of what we used to call the avant-garde: Futurism, dada, formalism, all took it upon themselves to make the present dysfunctional by reordering the time of the present through the destruction of the past, the undoing of values, the revolutionary value of the dissolution of images (Groys 2016: 50–52). Contemporary media art continues this tradition by upscaling the processes that define the present as an instant without time – the still, the pixel, the archive – and then reinserting time into these post-historical moments, making them thick with temporalities. In grappling with the present as an instant, a number of media artworks look to the limits of the present: some produce media anachronisms, some expand prior media temporal systems and attempt to view the instant over much more drawn-out moments in time.

Notable artists in this tradition include Douglas Gordon, who radically extends the duration of films to draw attention to both the individual frames and the interstices between frames. The most famous example of this is his work 24 Hour Psycho (1993). But he also uses this technique, perhaps to more traumatic ends, to rework the found footage in 10ms-1 (1994), where he slows down the footage used to document and study the effects of 'shell shock' on First World War survivors. The work measures trauma, and in a sense reproduces this trauma in that it produces a sense of waiting by further segmenting experience and blocking the progression of time so that it accumulates. We wait for something to happen. But we also know what will happen. We can predict what will happen in the next frame; it is folded into our experience of waiting in the present, because of what we see in the present frame. The slowing of duration seen in Gordon's work reaches its full conclusion in Jeff Wall's photographic pieces, where images taken over a period of months, sometimes years, are carefully composed to create photographic montages, which seemingly present samples of time. Durations are not only slowed but stopped and rearranged, as though images in a database, with different times overlapping one another. His most well-known work A Sudden Gust of Wind (after Hokusai) (1993), made to resemble Katsushika Hokusai's woodblock print Yejiri Station, Province of Suruga (ca. 1832), was assembled over one year by taking photographs of sets, props and actors, which were then combined into the instant of the photograph. The theatricality of the image, the wind that blows through the photograph, the relationships between the figures from different times, further exemplifies the way movement, in this case the movement of the production over twelve months, is frozen in a multitemporal present, with the instant represented as a scene that acts like an archive stretching back over the twelve-month production of images and even further to the original woodblock print. Wall's more recent work Listener (2015) (Figure 2.2) similarly presents a stopped, multi-temporal scene. A shirtless man, whose twisting figure resembles the images of Christ once seen in mannerist paintings, kneels on the ground, victimized, but speaking. A man leans over him, listening. What has just happened? What is about to happen? What words have been said? What will be the response? The theatricality of the image presents us with an agonising extended present. All of the elements in the scene, the rocks, the surroundings, the art historical references, the potential for action, take on significance as the past and future are folded into this temporally thick moment. The photograph, like Gordon's found footage, further alerts us to the way this 'real moment in time', and our experience of the tempos of this reality, is technically produced as a moment that nests within itself multiple temporalities and is at once up to date and already past.

FIGURE 2.2 Jeff Wall, Listener, 2015, inkjet print, 159.4 × 233.0 cm © Jeff Wall. Courtesy of the artist.

Another figure in this tradition is David Claerbout, one of the most significant artists to give an aesthetic form to the contemporary philosophies of time. Whereas Wall's works embody the accumulation of time, Claerbout in works such as Arena (2007) and Section of a Happy Moment (2007) radically extends the duration of an instant to break the surface of photographic temporality. Where Wall folds multiple instants into one whole, Claerbout unfolds one instant, one whole, allowing it to unravel over time. In these works, the archive of moments such as the instant before a point is scored in a basketball game or significant but seemingly banal moments of happiness, made still in the photographs, is turned into images not of frozen sections of life but media archaeological forms that, via a reintroduced duration, extend into the future and become scenes. Claerbout achieves this by photographing the moment from multiple and unexpected perspectives and then introducing duration via montage. Standing before one of these installations the viewer sees the instant from multiple perspectives. It is no longer a frozen section of time but a scene that extends into the future and with each new image of the montage offers us new information, new visions of the scene. In these works, the instant is examined and open to analysis in time. The extension of the instant, the usually fleeting moment, into the future, and the way in which this instant seems to be overfull with information, produce the affect, the feeling, of the work.

There is an inescapable sense in these works that, although viewers are offered multiple views of the same still section of movement, there is paradoxically never enough time to grasp the fullness of each instant entirely. There is paradoxically too much time to conceptualize all there is in the work. The temporary has become overfull with significance. These photographic images show, via their continual re-presentation of the instant, what it might mean to be-with the temporary. Perhaps the image that shows this best is Claerbout's recently completed video animation KING (After Alfred Wertheimer's 1956 portrait of a young man named Elvis Presley) (2015) (Figure 2.3). In 1956, Wertheimer photographed Elvis Presley at the age of 21, just before he was to reach the heights of popular stardom and transform into 'The King'. In Claerbout's reworking of Wertheimer's original photograph, the body of Presley, the body that will eventually become The King, is, in a computational type of recursive reflection, overlaid with textures of The King's body taken from a digital archive of famous photographs (Claerbout 2015). The virtual camera in Claerbout's animation circles around the image; it produces the wandering gaze over the surface of the image and comes in for close-ups on the now three-dimensional animated version of Presley's body, made up from a composite of images only visible in close-up. The King is now able to be closely and methodically inspected at the moment of the original photograph, which is now overlaid by the media autopsy of this body, giving form to the thick multi-temporal event. Claerbout shows us the ongoing remediation of a temporary instant, not so much in a way that demonstrates Barthes' well-known phrase 'this will be and this has been' but rather demonstrates the way analytical media continue to operate on archives of the past. The work demonstrates a 'this once was and is now controlled by a program'. The image, not a record of what once was but now a conceptual abstraction, is based on the careful recomposition of a body based on the archive of photographic data of what that body would eventually become. Flusser once claimed that one's wandering gaze over the photographic surface creates temporal relations between the photographed objects. 'It can return to an image it has already seen and the "before" can become "after"' (Flusser [1983]2014: 8–9). In Claerbout's work, the computer takes over the role of the gaze and in a controlled way creates disjunctive temporal relations, not just between obviously visible elements of the photograph but between an archive of data in the form of textures of The King's body parts photographed throughout his career. In this image, we do not look at a 'frozen image' but instead see a state of things translated into a multi-temporal scene. The technical image acts as a dam into which other images flow and become endlessly reproducible (Flusser [1983]2014: 19).

FIGURE 2.3 David Claerbout, KING (after Alfred Wertheimer's 1956 picture of a young man named Elvis Presley), 2015–2016, single channel video projection, HD animation, black & white, silent, ten minutes, edition of 7 with 1 AP and 1 AC © David Claerbout. Courtesy of the artist and Sean Kelly, New York.

An artist that looks to the longer scale of the instant, where the experience of being with the temporary and the fragmentary takes form, in a similar manner to Claerbout's found photographs, is Dominique Gonzalez-Foerster. In TH.2058 (2008), which was first exhibited as an element in the Unilever Series in the Turbine Hall at the Tate in 2008, Gonzalez-Foerster extends the contemporary moment fifty years into the future, when cultural artefacts, as she puts it, 'take shelter' in the Tate from the catastrophic climate change that has begun to take effect. This work presents the instant rather than the durational in the sense that no change takes place, only the sensation of being locked into an archive of artefacts at what might be considered the end point, or aftermath, of historical time. The Tate in 2058 acts as an archive for the books, films, sculptures and other artefacts of culture and a place that now stands in for the collective memory of a civilization at the moment of extinction, an extended but still temporary moment. This archiving also speaks to the nesting of time within the work, not only focused on a type of speculative fiction of what London might look like in 2058, but also nesting within itself the incremental and catastrophic accumulation of present moments, in which we are all, as contemporaries, implicated, and also nesting within itself, via its archive of cultural artefacts and traces of collective memory, events that stretch back much farther than 2008.

The work is analytical because it extends the instant and is intent on carefully analysing what makes up this instant, rather than following vectors or creating the sense of a time that flows through the present to a future. The work engages a media aesthetic because it demonstrates how analytical and archival techniques, using media to preserve the past, create an environment of storage time. Much like the tradition of archive artworks such as Muntadas' The File Room, the series of works conducted by Walid Raad under the banner of The Atlas Group, and indeed Claerbout's King, it presents its archaeology of the present via non-chronological means; it blocks the transmission of events towards a future. What we see is a close analysis of a moment, which has been brought into such a close-up view that it dilates throughout time, and the contemporary, the idea of attempting to grasp and archive a continually fleeting moment, becomes the condition that underpins the work. The temporary instant is extended in this work to the beginning of extinction itself. Where previously the instants of history came to an end, where paradigms would shift and be replaced by a new epoch, now all that is left after the end of the instant, this stretched-out aftermath of history, is extinction itself. Gordon and Claerbout's work, which allows small instants to unravel over an extended time period, Wall's photograph, which theatrically assembles instants into a new scene, and Gonzalez-Foerster's work, which offers viewers an opportunity to engage with the idea of a very large, non-chronological instant, give form to a condition of a present in the aftermath of time, a condition against transmission, or what Siegfried Zielinski described so perfectly as melancholia, a being too much with time. Either in very small or very large instants, where chronology is replaced by the archaeology of the moment, in all the media art works mentioned above, the present is thick with temporalities that percolate beneath images, where events no longer roll on but instead remain blocked up in scenes. In these artworks, the moment, the instant is thick with time and it is in this way that these artworks provide a way to reconceptualize the world of analytical, post-historical media. Where the analytical, as was shown by McLuhan and Flusser, separates time into thin, anaemic points, focusing attention and experience on the present, these artworks, by upscaling the analytical, by making it work harder and that way making it dysfunctional, show how the post-historical might be a time in which the past and future get radically reconceptualized as within, rather than beyond, the present.

YouTube as a post-historical medium

I would now like to shift the focus from media artworks that give form to the storage time of electronic media to an example of so-called mass media that similarly embodies the aesthetics of the post-historical.

There are many everyday examples that could be used to demonstrate the experiences facilitated by post-historical media. One that seems particularly appropriate is the YouTube footage that circulated in 2011 of the killing of Muammar Gaddafi. This is due to its significance as an event of history but also because of the way it – the video, its exhibition online and its storage in a 'digital memory' – represents an outside of history via a particular measuring and analytics of the event. On YouTube, a platform characteristic of many of the recent developments in so-called 'digital television', the bloodied body, stripped and dragged through the streets, was symbolic of the cohesion of rebel fighters. The act of photographing the events, uploading them and circulating them over the Internet was an act of symbolic violence but also an act that highlighted the temporalities with which real-world events are mediated via citizen journalism and non-centralized distribution platforms. YouTube becomes a post-historical medium in this case because, like Claerbout's and Gonzalez-Foerster's work mentioned earlier, it fulfils the operation of segmenting the event and giving it new significance via the system's architecture. Rather than presenting the event within a conventionally historic timeline, the platform presents the event as a clip of the present moment, which now links to related clips either based on their title, metadata or previous user choices. It is in this sense that the aesthetics of YouTube mirrors the time-discrete character of digital signal processing, as samples that are able to be rearranged based on the logic of the apparatus. Citizen journalism here represents a radical cutting of the present, as viewers now occupy the present, via the shaky and chaotic mobile phone footage, for a temporary (and extremely distressing) moment in time, able to be rewatched over and over again. Not only this, but the event has been cut up and analysed by a computer, which now actualizes it based on search queries and predictive algorithms that group it with other supposedly related content. But what are we to do with this knowledge? How can we begin to react to such radical cutting and reassembly of these traumatic events? The images of Gaddafi, not simply the footage of retribution, are, as pointed out by Johanna Sumiala, ostensibly images of sacrifice (Sumiala 2013: 34), offering the same level of symbolic value as the ritualistic human sacrifice to gods. Unlike previous versions of sacrificial rituals, however, these symbolic practices take place in a distributed space. Much older stories of the ancient custom of sacrifice detail rituals at specific places: the Aztecs assembled sacrifices at the Pyramid of Tenochtitlan, the Canaanites sacrificed children at Tophet, the Incas sacrificed humans en masse at great festivals. This is what we can see when the event is taken out of history and understood topographically rather than lineally.

To come together as a community, citizens assembled to collectively participate in witnessing a sacrifice as an offering to a god that will sustain their way of life. But not only do these rituals unite individuals in space, they also link the attitudes of the observers through, as argued by René Girard ([1972]2005), their participation in the collective murder. A new temporality is given to an old practice: sacrifices are now looped in time. A global community comes together at different times and for varying lengths of time and witnesses these events, sharing in the collective guilt, often compartmentalized from their historical preconditions. The sacrifice, like the grave in Courbet's painting, conditions the spectators, drawing the onlookers into a relation with its measurement of events. Not only do the participants that were once gathered in the spatially and temporally distinct site of the sacrifice share in a collective guilt, but it percolates through a temporally and spatially distributed aftermath of the events. Like Claerbout's work mentioned earlier, the instant is extended and repeated, replayed from the archive of YouTube, so much so that there is a feeling that it can never be grasped in its over-completeness. We are both too much with this time and not enough with this time to grasp the event. It is reduced to a scene that is repeated over time. These scenes, as time-discrete moments that can be reassembled and made to connect to other scenes, now constitute a digital temporality – both in the sense of being presented online and also in the more metaphoric sense of representing a compartmentalized digit in time. As with all 'space-biased media', following Harold Innis' famous account, the public's attention towards such events is limited. The events cover space via computer networks, but fail to persist in time. It is true that the events are stored in what a number of scholars would term a 'digital memory', archived on servers. However, the footage at some point loses its place in the public imaginary as it is replaced by a new meme. The sheer volume of data stored on the server and its architecture ensures that only users aware of the search terms or those that search for similar material are able to excavate this media event. More importantly, the media event of the sacrifice of Gaddafi is isolated from a historical narrative of events. Via its measurement and storage, it has become a new, recontextualized media event, whose significance is built via the links it forms with other footage, such as the footage of Barack Obama supporting the revolutionary activities of the Libyan people (Sumiala 2013: 35). The process of tagging videos and the software system that forms links between the tags makes the experience of watching the killing of Gaddafi distinct from the older forms of journalism.

Depending on the platform, it is surrounded by other – and for the computer, related – suggestions for future viewing such as a video documenting the 'Life and Death of CIA asset Sadam Hussein', an interview with Gaddafi, a video titled 'The Women of Gaddafi', articulating these events in a multi-temporal, and sometimes bizarre, measured present of the aftermath. These relationships give character to the ritualistic images of torture, killing and sacrifice. The complexity of the local event is replaced by the multi-temporality of the supposedly global. As in all digital recording, an algorithm links together samples of time, then approximates and normalizes their connections. Ernst Breisach writes, 'history has always been precariously lodged between the domain of philosophy with its universal abstractions of timeless quality distilled from the complex experience of life, on the one side, and the domain of literature with its imaginative reconstructions of life free of the obligation to reflect the past as it once had actually lived, on the other side' (2003: 7). But History is also underpinned, in a way that stabilizes this precarious balancing act, by media. Written language gave a character to History that was different from that of oral language. Ong (1982) and Havelock (1986) have both shown this. Film has given a different temporality to History, as the moving image comes to define an era and, like television would in later decades, allows viewers to be telepresent in times and spaces previously unavailable. The footage of the sacrifice of Gaddafi is a particularly clear example of the temporality produced by post-historical media for a number of reasons, largely due to the way the computer and the mobile phone footage define the event and its relationships to other events. The footage captured by a single mobile phone camera has its own rhythms. Gaddafi's face is only on screen for short moments in time; the majority of the footage consists of flashing colours, blurred images and sounds of gunfire, as the camera operator jostles in the crowd, looking for the right angle for the camera. Because of this, the footage has a distinct tempo, cutting between shots of Gaddafi bleeding, being taken around the town, and rapid movement as the camera operator circles the subject, bumping through the crowd to find the best view, attempting to, following Flusser, create a theory-through-images of the event. The footage is immediate, it is 'live' and feels as if we are 'directly there'. But it is stored in a server, accessed at different times, and this infrastructure for the preservation of the image and the modes of access assures us that the image's liveliness is in the past. The video feels as though it is both past, due to the reality of its existence in storage time, and present, due to the immediacy of the recording. In the footage that I accessed on YouTube, there is also a looping element, as the footage shows a repeated section of the cameraman moving past a truck to locate Gaddafi held by several men on the ground. And then there are the comments under the video posting, which reveal the ideologies and multiple histories that the various posters operate within; some vitriolically oppose the killing, whilst others argue for its justification.

In the experience of watching the video, all these temporalities are drawn together, temporarily combined, in the present. This example of post-historical media, using tags and assembling conversation strands, segments information so that it can be strung together, explored and rearranged. In this instance, YouTube provides a way to analyse an event by presenting it within a context built via software programming and hardware design, from the way the camera as a hand-held mobile apparatus records the event to the way it links with other videos and comments that are assembled to suggest further viewing and public opinion.
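The linking operation itself can be sketched schematically (a toy illustration in Python of tag-overlap grouping, and in no way a description of YouTube's actual, proprietary recommendation system): each clip is reduced to a set of tags, and 'related' clips are simply those whose tag sets overlap most heavily with the clip currently playing.

# A toy sketch of tag-based linking between clips: each clip is reduced to a
# set of tags, and relatedness is measured by the overlap of those sets.
# This illustrates the general technique only, not YouTube's own system.

clips = {
    "clip_a": {"news", "2011", "libya"},
    "clip_b": {"interview", "libya", "politics"},
    "clip_c": {"music", "live", "2011"},
    "clip_d": {"news", "politics", "libya", "2011"},
}

def related(current, clips, limit=3):
    """Rank the other clips by how many tags they share with the current one."""
    current_tags = clips[current]
    scored = [
        (len(current_tags & tags), name)
        for name, tags in clips.items()
        if name != current
    ]
    scored.sort(reverse=True)  # clips sharing the most tags come first
    return [name for score, name in scored if score > 0][:limit]

print(related("clip_a", clips))  # clip_d shares the most tags, so it ranks first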

What, or when, is the con-temporary?

To return to the questions raised on the first page of this book, in the face of distressing media events such as the one discussed above, what does it mean to be contemporary? What is the being-with-time that this term refers to? How are the conditions for this experience produced by online distribution? The experience of being-with time is not a being-with the time of events but a being-with the time of the archive, the time of scenes and the time of analytical media. In the case of the YouTube killing of Gaddafi, this media event is thick with multiple temporalities, involving the present 'fact' of the killing, the comments under the video, arranged on a timeline that documents the video's existence as temporal artefact, the archaic sacrificial rituals that it brings into the present moment and the other videos to which it links. Not just an element in a larger history of events, this temporary moment nests within it a number of events, from the recent to the archaic. This is what we can see when we view this event from Whitehead's perspective, given in the previous chapter. To be contemporary, then, in the face of this video, is to occupy a condition where the temporary moment of the present nests within itself numerous temporalities that are produced by material technology, from the phone that uploaded the footage, to the distribution channels that gave the footage its significance as a sacrificial ritual and the algorithms that create links between videos. The mediation of the event reveals a time that, like Wall's and Claerbout's photographic works mentioned earlier, is gathered together from multiple pleats into points. No longer a line, time is now experienced as points, pixels, of varying density. In the final section of Giorgio Agamben's What Is an Apparatus?, he outlines the conditions of what he terms contemporariness, or the ability to develop a sense of one's own time. For Agamben, this occurs as contemporaries, in a seemingly paradoxical move, disassociate themselves from time in order to grasp their own time (Agamben 2009: 40). Like Nietzsche before him, for Agamben, to be able to see one's own time, contemporaries must form a distance between themselves and time. In short, they must live in time's aftermath. He states,

Contemporariness is, then, a singular relationship with one's own time, which adheres to it and, at the same time keeps a distance from it. More precisely, it is that relationship with time that adheres to it through a disjunction and an anachronism. Those who coincide too well with the epoch, those who are perfectly tied to it in every respect, are not contemporaries, precisely because they do not manage to see it; they are not able to firmly hold their gaze on it [emphasis in original]. (Agamben 2009: 41)

You cannot see something when you are in it. The killing of Gaddafi seemingly brings viewers closer to this event, but simultaneously produces a media event that, as set out above, unfolds in its own multilayered and cyclical time. To be contemporary then, following Agamben, is to be out of joint with time, to see both the original event and the fragmentation and reassembly of the event in a new type of time. This would be to oscillate between conditions of being in the time of events, watching the killing, and also out of the time of the event, scrolling through the comments or accessing other videos suggested by YouTube. This process of sensing time through the disassociations from time brought about through its fragmentation is the basis of analytical media, from the frames of cinema to the pixels of television screens. They are, to paraphrase Agamben, technologies for returning to a present where we have never been (Agamben 2009: 52). Like Jim Campbell's Exploded View (Commuters) that began this chapter, the YouTube video of the killing of Gaddafi is an example of analytical media, as a major global event is broken up and reassembled in such a way that produces a new temporality for the playing out of a media event. The cinema dissolves time into frames. The introduction of the television then went further and introduced a 'radical cutting', not only separating movements in time but also 'disintegrating connections or shapes into individual points in space' (Kittler [1999]2010: 209). The computer then took this pixilation of events, as the mathematical organization of particle elements, one step further. Not simply technical, these techniques seem now to be rehearsed on a phenomenological level as people of all ages, rather than being invested in a flow, learn to rapidly shift between tasks, with attention packeted into smaller and smaller parcels. In a talk at Transmediale in Berlin, Siegfried Zielinski, as mentioned earlier, described this experience perfectly as melancholy, a being 'too much in time'.3 I would add to this that melancholy, a chronic (time-based) ailment, is produced by the continual cutting up of time so much so that there is no alternative but, like in the contemporary artworks mentioned earlier, to be overwhelmed by the measurement of the present.

As McLuhan once argued, from the analytic sequence came the assembly line principle. But because synchronization is no longer sequential, the assembly line is now obsolete in the electronic age. By digital means, synchronization of any number of different acts can be simultaneous. 'Thus the mechanical principle of analysis in series has come to an end' (McLuhan 1964: 164). Events are now analytical in relational systems, rather than sequential systems. In the Gaddafi example, viewers are synchronized with neither a specific time nor a specific space, but rather with a multi-temporal collective of viewers, coming from different times and places. The questions 'What are the structures of contemporary time? How is it produced? And how is it sustained?' are at the centre of this book. Unlike Agamben, and indeed Smith and Groys – three of the current major thinkers to take seriously the condition of the contemporary – the answers that I propose in the chapters to follow are based on a technical analysis of media in order to explore the way the time of the present has been engineered. This will involve an exploration of media events, which are not reducible to the representations on screen but involve instead the operability of media technology which produces these images of culture. Melancholic and anachronistic, the present moment is viewed by theorists such as Groys, Smith and Agamben as continually overshadowed by the pastness of large-scale historical events such as 9/11. My analysis takes this mode of thinking about large-scale political (public) events and turns towards the very small, the technical and the usually invisible. Inspired by the previous treatment of the contemporary and the post-historical in cultural theory and philosophy, I argue that, in addition to major events, the anachronism and melancholia that defines people as contemporaries has been facilitated by the long history of fragmentation that technical media bring to the experience of time. Human users make images by using apparatuses, over which they suppose that they have power. But users are also aware that these media apparatuses function according to rules and probabilities; the media themselves operate based on their own programme and external circumstances. Yet, we still think that it is us, the artist or the image maker, that intends meaning, when in fact it is something of an accident framed by the potential or degrees of freedom of the apparatus. If we can accept this argument, so rigorously pursued by figures such as Kittler and Flusser, storage and time-discrete signal processing become not just the technical system upon which computers are based, but the foundation upon which computer culture and the conditions of the contemporary are based. The stored programme that, based on a series of instructions, is able to produce, to use one of Flusser's terms, a 'theoria', becomes the organizational system upon which the technical universe functions.

From these claims, we are now ready to start the rear-view analysis of the post-historical, exploring the emergence of this condition from inventions and discoveries that came sometimes hundreds of years earlier than the modern digital computer. These inventions, which were foundational to the birth of the mass optical media of film and television, were things that were never perceived in the same way as images on screens and hence were often thought, like most meaningless artefacts, to have a limited effect on culture. What will be shown, however, is that not only did these technical functions have very real effects at the experimental stages of the development of mass optical media but that these effects re-emerge in such a way that provides mass media supports for the post-historical.

3 Post-Historical Scenes

This chapter begins the book's rear-view analysis and offers an exploration of media time rooted in the technical description of the analytical photographic method. The chapter should not, however, be read as an attempt to write a history of post-historical media. Instead, this and the following chapters try and illustrate the ways that, rather than a story of evolutionary progression (Historical time), the past of media can be seen to be folded into its present in ways that continue to have consequences. Post-historical media are thus archival in two different ways: they store information as time-discrete data and they store the genealogy of media, these moments from the past, which were the experiments that created the ground for mass media. Analytical media returns to these experimental beginnings. The chapter makes the following interventions in technical media history:

1. It argues that the organization of time as discrete stills, first given visual form as a technical image via the photomechanical medium, combined with a larger technique of analysis in fields such as psychoanalysis, history and genealogy, was foundational in supporting the new temporal systems of media culture. In order to make this argument, I first look to the way small physical movements were transcoded and analysed.

2. As is widely known, the developments in film and cameras used in analytical photography had demonstrable effects on life in the nineteenth century. Once cinema arrived as a mass medium it replaced these effects and offered to film theorists new metaphors for describing modernity in images. In this chapter, I argue that it is only recently, as photography and the cinema have been replaced by the computer, that time-discrete images begin again to provide metaphors and models for contemporary life.

FIGURE 3.1 Étienne-Jules Marey and Charles Fremont, A Study of Blacksmiths at the Anvil, 1894.

In the late nineteenth century, complementing the great advances made in the field of photography as a medium for observing, isolating and analysing movement, Étienne-Jules Marey collaborated with the civil engineer Charles Fremont to produce a chronophotograph of blacksmiths at work at the anvil (Figure 3.1). Marey worked with Fremont to extend his original work on animal locomotion to the study of the mechanics of the human-animated machine, which they saw, following the growing interest in the field of thermodynamics, as operating based on the dynamic principle of converting energy into movement (Brown 2005: 13). Marey and Fremont’s image depicts two men: one holding the steel to the anvil and hammering it out and the other much more vigorously, swinging the hammer with his entire body. The anvil, in the centre of the image, white due to the exposure time of the image, compared to the duller and more fragmented view of the workers, reveals itself as not only the centre of the image but the only constant through time. If synthesized together, as the eye moves along the vector lines of the hammers, the images of the workers and the swinging hammers combine to give off flickering images, moving up and down from the anvil, as faint lines overlapping, flaring up from the central axis. It is no coincidence that the image resembles the rhythmic flickering of flames, originating from the solid anvil.The heat of the anvil, illustrated by its glow, throws out blurred and less intense flames, in the form of the worker’s movements, as energy is converted and lost as heat. In this image, the blacksmiths work for the apparatus: it is at the centre of the image and it is defined in relation to its light and heat. The apparatus, like the hole in the ground that began this book, creates an environment that measures and defines movements in relation to itself. The increasing faintness of the lines of movement illustrates the fatigue and eventual entropy of labour. The more vigorously the worker moves the more he resembles death, burnt out in the sequence of faint stills. This image, which folds labour time into an image in the present, does not simply represent the so-called ‘facts of life’ but rather defines in an image the way human subjects can become understood in relation to their tools. It sucks these events, these rhythms, into an image. The workers here, no longer seen as stable, individual subjects, are represented as processes that are made operational in order to fulfil the function of tools. In this image, it is the anvil that defines the operations of the blacksmiths. The anvil programmes the apparatus. In terms of the larger imperatives of contemporary realities, as set out in the last chapter, it is the fully programmable electronic computer that now carries out this role of defining experience via its history as a technology engineered to treat time as discrete samples. Like the computer, in Marey and Fremont’s image the tool takes on the role of structuring the temporality of the task, as the only constant in time that demands particular rhythmic operations on the part of the blacksmith in order to attempt to bring the steel

to the point of phase transition, which of course we do not see in this image. It is no longer the worker that is at the centre of the unchanging image, but the tool. In this image, the steel does not change due to the imposition of heat and work. Instead, the steel stays the same and workers change around it. The chronophotographic image allows the study of posture and movement and a graphing of the changing relationship between the labourer and the object, which offered what Anson Rabinbach called a new European science of work. At this point the human becomes an apparatus. 'For the factory owner, the Other is a worker that must be molded according to preconceived models, as a type of mass' (Flusser [1983]2013: 27). They become made into an apparatus by an apparatus. The focus of the new science of work was based on 'the expenditure and deployment of energy as opposed to human will, moral purpose or even technical skill' (Braun 1992: 68). In other words, the technology of chronophotography began to define the subject only in relation to their work. The apparatus (the human) is defined by its operation, which is now programmed by tools. The American Frederick Taylor soon became aware of Marey and Fremont's studies and developed with a team of engineers a number of time-and-motion studies designed to increase productivity and efficiency through standardization, with the aim of identifying the most efficient way to undertake a task and then having workers continually repeat these movements. Workers would carry out tasks with small lights strapped around their wrists and arms in ways that resembled today's motion capture techniques. This then led to the operation of the factory as though a film was running through a projector. The movement and synthesis of film mirrored completely the regularized and standardized moment that came to define modern labour. Modernity moved things in front of itself. It did not wait. This, however, like the cinema, was underpinned by the measurement of movement by time-discrete signal processing. Time, when possessed and measured by industry and corporations, was to be used efficiently. Movements were only to be made when they used the smallest portion of this time to the most productive ends. The instant, as Doane (2005) has argued, becomes owned and regulated by the machines that define modernity itself. Each moment was now able to be analysed in terms of potential for productivity. The apparatus remains at the centre of this new image of labour, and workers synchronize their movements to the rhythms of the apparatus. What we are beginning to see in this image is the way that the temporality and conditions for experience were underpinned by the way engineers and scientists, such as Marey and Fremont, organized time in order for the signal to be processed by their mechanisms. The analytical photograph, as opposed to the cinematic, offers an image of one of the foundations for contemporaneity. It illustrates a cultural technique of archiving usually fleeting moments and transducing them into

points. In effect, this amounts to the application of a theory of information, which documents an event via the sampling of instants that can then be made meaningful. It amounts to the transformation of an event into a scene. As argued in the previous chapter, the cinematic image, which was to become the mass optical media of the age, replaced the time-discrete image of photography as a metaphor for the moving reality of modern life: one was able to quite literally see history becoming. Now, however, as will be argued in what follows, the emphasis on the time-discrete and the sample of an instant begins to re-emerge as the metaphor of and for the present. In the above example, the image has taken on a new role. Rather than being used to orient humans in the world, as a representation or reflection of present conditions, these images are projected onto the world and it is understood in their terms. This is quite different from traditional images, which tend to represent a function of the world in a way that can be decoded. Instead, these new images obscure the world until they and human lives become a function of the images they create. ‘Human beings cease to decode the images and instead project them, still encoded, onto the world “out there”, which meanwhile itself becomes like an image – a context of scenes, of states of things’ (Flusser [1983]2014: 10). Not only do images such as Marey and Fremont’s create scenes, they project these scenes onto the world so states of affairs can be understood as scenes. Technical media do not just provide a metaphor for the present (although they provide this too), but they also provide a measurement and treatment of the present as it is perceived by them. The world, following Flusser, becomes itself a technical image. The danger is that technical images such as these are mistaken for objective symptoms of a real process. The image of the blacksmiths at the anvil is thought to be a symptom of their real movement in terms of an objective measurement. But, as Flusser argues, the difference between a symbol and a symptom is that the symbol means something to whoever has knowledge of the consensus of such a meaning, while the symptom is causally linked to its meaning. The word ‘dog’ symbolizes and the tracks on the ground symptomatize the animal. This pretension of technical images of being symptomatic or objective is fraudulent. In reality, apparatus transcode symptoms into symbols, and they do it in function of particular programs [emphasis in original]. (Flusser 2013) It is these programmes, as invisible, meaningless functions that therefore come to characterize the images that relate to the condition of post-history. They take symptoms of an event and translate it into the symbolic scene, which they then open for analysis.

In this chapter, inspired by the technically oriented approach to media studies coming from figures such as Kittler, Siegert, Krämer, Ernst and Zielinski, I look to the developments in analytical photography, particularly in terms of the chronophotographic method, around the end of the nineteenth century and beginning of the twentieth century as a set of key moments in the organization, systematization and measurement of the movement of bodies in time. The argument of this chapter is that analytical photography was a key moment in the media engineering of time, which began to establish current practices in the documentation and collection of contemporary life as data. Other people, including Mark B.N. Hansen (2011) and Stephen Mamber (2004), have also made this connection between the segmentation of chronophotography and digital media. Following Mamber, media historical descriptions should not cast off Marey’s famous work as a primitive precursor to cinema. Instead, these chronophotographic images should be viewed as a full-fledged visual theory of the analytic, which anticipated much of the theoretical discussions to come around a century later in new media theory. Following Hansen, we can see that the segmentation of time begun by chronophotography and culminating with the digital computer, represents a moment when mechanisms became a major part of an extended cognitive system that favours tractable data. Like the use of SMPTE to precisely organize filmed time and the Keykode printed on the side of filmstrips, these devices reveal the way technical media organize temporality, scaling up the techniques of segmentation undertaken in the studies of history, genealogy and genetics at around the same time, through the careful and precise measurement of instants. An exploration of these moments not only reveals historical and technical curiosities but also indicates how the time that was originally engineered into pre-cinematic systems for capturing and studying movement has in fact re-emerged in the now ubiquitous techniques of gathering and analysing data. Hansen suggests that these systems not only measure time but reveal new mechanically generated temporalities with which movement can be understood. My argument in this chapter, building on Mamber’s and Hansen’s work, is that, at these points in optical media history, time as duration, the time that had a material impact on the experience of having a body, was replaced by immobile images of temporariness, which had a different impact on experience by opening up new ways that bodies could be mediated. These types of images, which quickly vanished from the popular imagination when cinema became dominant, continued in the graphic imagination of the sciences only to re-emerge in information theory and contemporary digital culture. Chronophotographic developments, and the charting of physiology via the graphic method, represented moments when time, via technical images, became subordinate to movement and bodies, as Doane (2005) puts it, were petrified and paralysed and any unnecessary background detail was purged from

the image. Or as Bergson ([1907]1989) famously argued, the segmentation of movement tends to obscure duration. Within the visual regime of the technical image at this point in history, biology, labour and mechanics were reduced to the way a body moved on timescales organized by film speeds. The faster the film was able to register light, the smaller the scales of movement that could be analysed, rationalized and systematized. This led to a universal way to chart motion and a number of developments such as the notorious time–motion studies of early twentieth century Taylorism that, supported by the analytical medium of chronophotography, attempted to homogenize and transform the entire workforce into automated biological machines. It did this by causing time to appear as a point (the very thing that the artworks and my writing on the Gaddafi video in the previous chapter attempted to make dysfunctional). The chronophotographic apparatus, which involves not just technical elements but also the routines, logic and approaches that this engendered, delivered an image that could be used to intricately represent phenomena in analytical terms. In doing so, it did not simply reproduce the movements that it recorded, which previously had receded from human perception, but produced technical images that offered new ways to study, understand and experience movement as stills in time. Marey, furthering the definition of life functions as purely mechanical operations, saw the human subject as a chain of organized and consistent events. Inspired by the work of German organic physicists, Marey’s chronophotographs pictured ‘the body as an animate machine, a machine governed by the same laws that governed inanimate nature: they investigated biological functions as nothing less than manifestations of physical and chemical phenomena’ (Braun and Whitcombe 1999: 219). Lived phenomena were replaced by mathematically perceived data. Everywhere, it seems, at least after Deleuze’s famous reformulation of Bergson in his cinema books, we are being told that we should think of time, in an ontological sense, as a flow. But nowhere, it seems, from the daily grind of post-industrial work to our place as subjects of post-history, is time experienced like this. Like the example of YouTube footage given in the previous chapter, the media of our time, through their ontic operations (Siegert 2015: 9), continue to shape the aftermath of large-scale historical eras, as the stored discourse we call history becomes assembled in databases and less and less resembles a continuous movement of transmission. It sucks into itself what would have previously been considered historical events. It is in this sense – the sense of operating before that which is meaningful and transmittable over both space and time – that technical media downloads time, its periodization and memory, into its own analytical frameworks. The analytical mediation of time, of course, has taken place in ways that has not involved the experimental application of technical media. Historical data has for some time been analysed by breaking events down into small parts.

Historians have analysed major events along different timescales of months, weeks, days, hours and minutes. Historical events may also be analysed in terms of ‘key players’, rather than a chronological narrative of events. Registers, calendars, money and other logistical media have played a fundamental role in organizing global space and time into grids which can be coordinated. Likewise, genealogists do not simply chart a family tree that grows over time but conduct research in analytical blocks of carefully chosen samples. Geneticists use analytical techniques to break down aetiologies and attempt to understand the transmission of diseases. All these fields apply analytical techniques that have mediated time and opened it to study through segmentation. The difference that post-historical media makes is that it breaks these events down into particle elements that can now only be grasped via the computer. Sometimes the events ordered by analytical media were large-scale historical events, sometimes these were more banal, everyday events. Derrida once said that the event is only an event when it signals the arrivant, a strange, unexpected and previously undefined figure. As analytical media ‘downloads’, defines and tunes humans into these events, attempting to define the arrivant, it translates them into post-historical scenes. The arrivant, the unexpected figure, is removed at the point that it becomes defined.
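
For readers who think in code, the analytical function described here can be made a little more concrete with a minimal, purely illustrative sketch. Nothing below is drawn from the book's sources; the motion function and the sampling interval are invented for the example. The point is only that when an event is treated analytically, the movement itself is discarded and what the machine stores and compares is a list of instants.

```python
# Purely illustrative sketch: reducing a continuous 'event' to time-discrete samples.
# The motion function and the sampling interval are invented for this example.

def position(t):
    """Height of a swinging hammer over time, in metres (a stand-in for any movement)."""
    return 1.2 * abs(((2 * t) % 2) - 1)   # a simple rise-and-fall rhythm

dt = 0.01                                  # sampling interval: 100 samples per second
samples = [(n * dt, position(n * dt)) for n in range(50)]

# What the archive keeps is not the movement but this series of instants,
# each one open to storage, comparison and analysis.
for t, x in samples[:5]:
    print(f"t = {t:0.2f} s   position = {x:0.3f} m")
```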

Events and scenes

In Process and Reality, one of the major speculative metaphysical projects of modern Western philosophy, Whitehead alerts us to the flickering, buzzing process that gives form to the world. In a particularly important passage, Whitehead, in a way that resonates with all media philosophical inquiry, argues that these processes are never wholly graspable and, hence, often excluded by both scientific and philosophical inquiry, in which all forms of media are of course implicated:

All modern philosophy hinges round the difficulty of describing the world in terms of subject and predicate, substance and quality, particular and universal. The result always does violence to the immediate experience which we express in our actions, our hopes, our sympathies, our purposes, and which we enjoy in spite of our lack of phrases for its verbal analysis. We find ourselves in a buzzing world, amid a democracy of fellow creatures. (Whitehead [1929]1978: 50)

The language that philosophers and scientists once used to describe the world was argued by Whitehead to be insufficient when one stands before the world

of process. Experience and knowledge always come from an undifferentiated investment in the 'buzzing' world of 'creatures': there is always something happening, something elemental, before we have a thought about any kind of object. Our vocabulary, our media, however, is not fit to deal with such a world. A process of filtering, sampling and transduction takes place where the buzzing world of creatures becomes sorted, stored, made transmittable and thus made open to analysis, following Shannon's formulation of information. This task was once undertaken by historical media, which synthesized events into a whole. Post-historical media now translates these events into its own logic. Marey and Fremont's image of blacksmiths involved in a process of working steel offers a way to arrest the buzzing world and reframe process as a phenomenon that can be captured. This is also the task carried out by calendars, catalogues, registers, databases and all other types of media that operate based on the analytical function. For Whitehead, there is always something happening before there is some thing. Objects, ideas, affects are always conceptualized by Whitehead as preceded by process. The object is not a stable entity that persists through time, but rather a process. It is not simply 'there' but rather it is always produced by the network of events that it finds itself within. In Feed-Forward, Hansen (2015), in his own work on Whitehead, gives a similar definition of human experience. The human is not simply to be thought of in relation to the phenomenon of global networks. Instead, the human is to be thought of as part of the elemental (2). Human experience then amounts to a non-optional outcome of embeddedness in networks. This, however, is not to suggest that networked objects (as processes) somehow define from an external position the becoming of the subject. It should not be read, even given my earlier comments, that a technical object defines those things that it is networked to, including the human subject. What is more to the point, and more in accord with Whitehead, is that technical processes and the act of networking within these processes presuppose certain routines, techniques and protocols that define the human user in order that they may be involved in this system. To paraphrase Kittler, they define the situation, the condition, in which human beings are given attributes. If we use Whitehead to understand media, we might say that it is not that the human is 'determined' by the technology, to use the language of those involved in reductive arguments about who/what determines what/who, but rather that both object and subject are determined by the non-optional embeddedness in networks of both humans and technology that engender social practices. Whitehead gives a base to the media theoretical claim that human activity is now formulated by the computer, as input, in order for the human to be able to become embedded within the network. To explain this, we can turn to one of Whitehead's own examples: a stone lying at the bottom of a flowing river. The stone is permanent, solid and stable

while the river changes around it. The stone is an object, the river is a process. It would seem ridiculous to think anything else. The stone does not move or grow and to say that it has experience would be completely unacceptable to most sensible readers. But for Whitehead, the stone is always first a process that has experiences, in a non-cognitive sense, of its environment, just as the human experiences his or her environment before he or she has a chance to think about this environment. The stone, as it becomes a stone, is objectified as water rushes over it. The stone becomes what we customarily think of as an object as the water carves out the curvature on its surface. Extended further back, the stone was first formed as the river deposited its sediments and bound them together. This is a good metaphor with which to think about the conditions for subjectivity within the networks formed within the universe of technical images. The stone, in its process of becoming other, senses its environment and adapts. Of course the term ‘sense’ is used here after Whitehead to refer to a precognitive process whereby a phenomenon impinges on a being. In this case, it is the river impinging on the stone. The stone, however, does not sense the river in its formal completeness, but only those things that have an effect on its own process of becoming an object. It only senses the flow that causes change to its surface. The stone, due to its potential for experience, defines the river from its own perspective and becomes other based on this experience. Neither is independent. Neither is staid. They are contemporaries and involved in each other’s experience. The capacity for one to ‘sense’ the other is what is determinate of the concrescence of objects. It is the same when Whitehead describes human experience. Things are always touched with our hands. Things are always seen with our eyes (Whitehead [1927]1985: 25). The capacity for individuation is found in these prehensions. Objects, as Whitehead told us in his media philosophy of the 1920s, are always passed through layers of media, whether these be technical or not, and it is this filtering, as information from the environment becomes embodied, where the process of individuation takes place. Of course, from a media philosophical perspective, which replaces the subject with the computer, we are interested in seeing how this functions when technical media, rather than the hands or the eyes, take on the role of defining the world and, as Whitehead puts it, make a transition from the many to the one. In Marey and Fremont’s image, the optical technology of the camera and the network of statistical techniques used to chart movement define the human subject based on the protocological routines and sampling of instants that amounted to a universal way to chart, measure and conceptualize movement. These techniques, which made images out of events, were then projected back on the world via Taylor’s notorious time–motion studies. At this point, the image of chronophotography, the scene that was once derived from an event,

began to define events as samples. This is how the network of techniques and technologies come to define the ‘goings on’ in the world by projecting models onto the world. Due to their programmes, they create capacities for sensing the environment, like the way the stone sensed its surroundings, and begin to define the photographed subject’s place in the world. Attempting to grapple with the ontology of events and the function of media, theorists of photography have done a great deal of work in trying to come to grips with the medium’s capturing of the ‘goings on’ of the world. Roland Barthes, perhaps the most famous philosopher of photography and temporality, opened up this intriguing field of inquiry by formulating the way a photographic scene punctuates vision by signifying a ‘this will be and this has been’ (Barthes [1980]1981: 96). For Barthes, these temporal discontinuities crystallize in the image, with past and present folded within one another, they defined the subject as both alive and dead, and this is what, as argued by Walter Benjamin ([1937]2010), creates the fundamentally melancholic nature of the photographic medium. When looking at a picture of his mother, two girls watching a plane pass over their village or the assassin Lewis Payne moments before his execution, the extension of the moments into the present makes Barthes acutely aware that these people will die and that these people are already dead. But when, as with Marey and Fremont’s image, the photograph documents less of the existence of human subjects and more of their very small movements, it seems to signify not death but rather the deferral of the death of a string of once temporary moments. The instant is preserved so that it can be studied in relation to other temporary moments. The image is not for contemplative reflection. Instead, we are told by the image itself where to look. It is the movement that we are meant to trace with our gaze, not the identity of the workers. It is presupposed in these images that the human subjects are less important than their movements, which can now be analysed and used to improve the way future movements are performed. It is no longer the event that we are interested in surveying, as Barthes was when he was mourning his mother, but the repeatable scene. As Benjamin (1937/2010) argued, within these types of analytical photographs, ‘is manifested in the field of perception what in the theoretical sphere is noticeable in the increasing importance of statistics’ (8). Jean Luc Godard, in his return to mainstream cinema, began to give form to this field of experience. It is no coincidence that in Sauve qui peut (la vie) (1979) (released as Every Man for Himself in North America and Slow Motion in the UK), the awkward, violent film of moments before death, Godard masters the analytical language in order to critique it through the reorganization of the rhythm of events, scaling up the analytical function of the medium that cinema would most like to keep hidden. But this is the operability that Benjamin noticed that pervaded the cinema. It was a way of drawing viewers into the

close analysis of scenes, a fundamentally melancholic function and an intense focus on presence in the present. Sauve qui peut (la vie), this film about death, as revealed in its closing moments, is a film about the death of each moment and the potential for deferral into an abyss of time. At moments the film blocks its own narrative in place of the analysis of scenes. The synthetic medium of film is predicated on transmission between moments. The analytical techniques used by Godard stand against transmission. The melancholia of Godard's film comes from the knots in linear time, the delays, jumps and sequence of analytical stills that arrest movement. What we see in this film is a way of linking human and symbolic worlds through the analysis of stills that was to become dominant in twenty-first-century media culture. Shots freeze when we first see Denise (Nathalie Baye), in motion, riding through the countryside, the apparatus dragging on her, weighing down movement. The awkward kiss on the cheek between Paul (Jacques Dutronc) and his daughter Cécile (Cécile Tanner) is made all the more alienating as it is drawn out. As Denise says in the film, 'this awkwardness, this aimless movement, this sudden acceleration, this hesitation of the hand, this grimace, this discord, is life's struggle to hang on. That thing which in each man silently screams: "I am not a machine!"' (17:30-18:13). In the slow motion, the moments when Godard disjoins the medium from the action, removes the plot, he offers us a view of the structures, the medium, that underlies experience, the hole that, like Courbet's painting given in the Introduction and Marey and Fremont's image of labour given at the beginning of this chapter, sucks time into itself. According to Benjamin, one of the most revolutionary aspects of film was not necessarily its aesthetic potential but its penetration of the boundaries between art and science. Much like the way Freud isolated and made analysable the most banal things that would usually pass by unnoticed, such as a slip of the tongue, photography and film allow the close inspection of events. This is what is given to us by both Godard in Sauve qui peut (la vie) and Marey and Fremont in their analytical photography.

Of a screened behaviour item which is neatly brought out in a certain situation, like a muscle of a body, it is difficult to say which is more fascinating, its artistic value or its value for science. To demonstrate the identity of the artistic and scientific uses of photography which heretofore usually were separated will be one of the revolutionary functions of the film. (Benjamin [1937]2010: 29)

Images such as those in Godard's film, Claerbout's work mentioned in the previous chapter and Marey and Fremont's photography do not simply represent in more precise ways events that were already visible, but difficult to define.

The event is instead turned into a scene, underpinned by analytical media, and, according to Benjamin, 'reveals entirely new structural formations of the subject' (30). The event is turned into a scene as 'an unconsciously penetrated space is substituted for a space consciously explored by man' (30). In Godard's film, as with Claerbout's work, although time is blocked up, there is also a sense that there is too much time in the image. This is the role of art that engages with the time of the present: the images become dammed up because they are thick with time. In contrast, in terms of Marey and Fremont's image, time ceases to appear. The analytical use of photography makes the image nonproductive. It sucks events into itself, rather than producing new events.

Graphs and diagrams: Measuring an era

In the opening passage of Marey's Animal Mechanism, he writes 'living beings have been frequently and in every age compared to machines, but it is only in the present day that the bearings and justice of this comparison are fully comprehensible' (Marey 1879: 1). With Marey, Descartes' description of the living being as a mechanism, an idea of life that can be given once and for all, took on photomechanical form. Not only did the body become a 'working object' much like the technology of the camera that recorded it, with elements that could be examined and explained with relation to the function of the whole, it also became understood via its regulation by clockwork and its organization into stills regulated in time. In order to time the movements documented by chronophotography, pioneers such as Albert Londe, the famous photographer at the Hôpital Salpêtrière, used a metronome. Londe, one of the first to realize the instantaneity offered by the gelatine bromide process, was able to carefully time and hence organize the images that he captured of trauma and disease (Braun and Whitcombe 1999: 222). Pierre Jules César Janssen, the astronomer who first captured a chronophotographic image of the Passage of Venus, developed a clockwork mechanism that inspired Marey's discovery of chronophotography. And Muybridge, the figure who would become known to film scholars as one of the founding old men of cinema, used electrically triggered shutters, tripped as a horse galloped across a line of strings. The representation of the body was regulated by these means. As well as the body in front of the camera lens, the human operator behind the apparatus became largely a product of the machine. His or her role in the observation of phenomena was negligible. In fact, the chronophotographic method was developed in part to remove the personal equation once introduced by the observer. As mentioned in Chapter 1, Flusser gives us the image of the photographer circling the subject, looking for a way to measure and define the subject. The photographer works for

the camera, looking for the conditions that are just right for its programme. The philosophical gesture, the attempt to view the world and create a theory in an image, is inseparable from the camera that elicits this gesture. The photographer, as Flusser argues, ‘aims to observe something and fix the observation, to “formalize” it’ (Flusser 2011: 286). The moving body was rationalized by technical means as a body that could be examined analytically by technologically separating moments in time, separating out the components of the transmission from moment to moment, in order to see the relationship between its parts. An analytical mode of observation was adopted quite literally through the lens of the chronophotographic apparatus that was soon to become standardized. As Robert M. Brain observes, ‘Marey advanced a post-lapsarian image of nineteenth century science, with scientific disciplines having fallen deeper into a mutually incomprehensible Tower of Babel-like isolation […]. The answer lay in the new Adamic language of the graphic method, which offered not only clarity, but represented all results as energy relations’ (Brain 2002: 166). Once the chronophotographic method was settled as an approach that photographed movement and separated it out into microelements in time, a further development was able to be made: microelements of motion were able to be translated into data and represented in diagrams and graphs. A series proliferated thanks to the images of the chronophotographic mechanism that could be taken together as an archive of movement and time. ‘The chronophotographic apparatus became part of a new regime of technical images that sought to remove individual difference in the observation of moving bodies’ (Canales 2011: 107–108). Chronographic notation became a universal language in the study of physiology. It became a protocological language able to be exchanged all over the globe (Zielinski [2002]2006: 245). Much like the modern-day protocols that make computer-aided communication possible, a standard analytical language was developed through the segmentation of time-based events that was compatible all over the world. This gathering of values via photomechanical means was part of the large-scale and ongoing project of gathering numerical data that could be used to stabilize and systematize what previously was understood as highly individualized and asystematic events. At this moment in the history of data analytics, time, the chaotic and the contingent, was able to be recorded, stored and transmitted as a series of discrete moments, with values able to be given to each isolated section of movement. As Curtis states, ‘the ease with which photography generated a series of images of cases corresponded to the dream of a limitless well of evidence. This dream, however, presupposed the evidentiary status of the individual image. It presumed each image was a window; seeing through enough of them could give the observer a vision of the whole field’ (Curtis 2012: 81).
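
To show the shape of this transformation for readers who want it spelled out, here is a toy example. The figures and the sampling interval are invented, and this is in no way a reconstruction of Marey's or Fremont's actual procedure; it only illustrates how, once movement exists as positions sampled at known intervals, it can be differenced into velocities and laid out as the kind of table or curve that the graphic method made exchangeable.

```python
# Toy example with invented figures: the kind of tabulation the graphic method made possible.
# Positions sampled at a fixed interval are differenced into velocities that can be
# graphed, compared and exchanged as standardized values.

interval = 1 / 100                                   # one exposure every hundredth of a second
positions = [0.00, 0.06, 0.13, 0.21, 0.30, 0.40]     # metres along the line of motion (invented)

print("t (s)    x (m)    v (m/s)")
for i in range(1, len(positions)):
    velocity = (positions[i] - positions[i - 1]) / interval
    print(f"{i * interval:0.2f}     {positions[i]:0.2f}     {velocity:4.1f}")
```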

As argued in the previous chapter, it is of course no accident that the development of film coincides with what is often described as the beginning of the modern era. Film itself had a marked influence on the way individuals conceived of history and grouped events in such a way that could constitute an era. Historical moments were created as film recorded, stored and projected events that happened in front of a camera at one point in time, to which most people would have otherwise never previously been exposed. These events then became historical moments of a vastly new order to those archived via photographs or the written word. As Zielinski ([1989]1999) argues at the opening of Audiovisions: In the rhythmic progression of photographs arranged on perforated celluloid strips that outwitted human visual perception, in the anonymity of publicly accessible spaces vested with highly intimate ambience, the human subjects who had been through industrialisation apparently discovered their appropriate and adequate communicative satisfaction. Reproducible dream worlds, staged for the eye and the ear, provided these subjects who had been rushed through the century of the steam engine, mechanisation, railways, and, lastly, electricity, with the material for satisfying their desires for rich sensory impressions, variety, diversion, escapism, but also for orientation. (11) The subjects, who were beginning to realize their existence as intimately tied to and evolving with technological developments, were offered a new technical apparatus within which they could survey and come to terms with their place in the world. Time, in its perceivable ‘realness’, was now able to be stored by machines. People could witness history and, better yet, return to it to watch it again, orienting and reorienting themselves. Chanan states ‘The modern world almost seems to have begun with the birth of film. Because we’re used to seeing film images of the First World War, the First World War seems to be part of the modern period. Anything more than twenty years earlier belongs to an era which we easily feel to be lost’ (Chanan 2005: 12). The image becomes history supported by technical means, with its storage and access dependent on technical protocols. Cinematographic representation has had a profound influence on the philosophies of time and movement, from Bergson and Deleuze to Kittler and Zielinski. At the beginning of the twentieth century, the cinema, through its technical architecture, began to present to audiences a type of time that proceeded sequentially, frame by frame. Philosophers reacted to this, Bergson first and Deleuze later. Bergson ([1907]1989) argued that ‘such is the contrivance of the cinematograph. And such is also that of our knowledge. Instead of attaching ourselves to the inner becoming

of things, we place ourselves outside them in order to recompose their becoming artificially. We take snapshots […] the mechanism of our ordinary knowledge is of a cinematographical kind’ [emphasis in original] (306). Deleuze later adapted Bergson quite radically to show how cinema in fact opens up dramatically new ways to conceive time. Bergson once warned us off cinema. Deleuze ([1985]2005) then gave us a new way to come to terms with the richness of its time-images. If the linear development of time as synthesis, represented by film running through a projector, represented modernity par excellence then the chronophotograph lay the foundations for what a century later we can start referring to as the analytical conditions of the post-historical.

Developments in hardware for organizing time In the bandes chrono-photographiques series of images now housed in the tremendous archive of Cinémathèque Fançaise, Marey gives us figures in sequential phases of chronological movement. In some images he even included a ‘chronometric dial’ – a forerunner to keykode that, since the 1990s has been used to track the frames of cinema in a way that is both human and machine readable – to indicate the beat of time that had been arrested and the interval between stills (Doane 2002: 214). Marey’s chronophotographic images are interesting from a media philosophical perspective for a number of reasons, not least because of the traces of the apparatus that remain on the picture. In any of these images, like the famous studies of animals or athletes in motion, there are obvious differences in the intensity of light between the sequential poses, produced as the film strip is exposed to light for a second time. The image has been cut up by the opening and closing of the shutter and the mechanical movement of film past the lens. The apparatus struggles to keep pace with the moving reality. It is obvious. It has been written about by countless film theorists before, from Eisenstein to Deleuze: Any time we look at the images of cinema, we also access traces of the apparatus and its method of ordering time. But in Marey’s experiments such as the Chronophotograph of a Man on a Tricycle (Figure 3.2), before the technique of using the chronophotographic gun entered full maturity, while he was still working out the process, we can see something else; we can see much more obviously the traces of the apparatus, we see the carrier medium of the film present itself, and are reminded of the demanding conditions for ordering events in time. It is here that we see more clearly the media performing. In the film history dominated by accounts of the apparatus and its mode of aestheticizing time, what has been less obvious than the onscreen representations and editing techniques and has been less widely written about,

FIGURE 3.2 Étienne-Jules Marey, Chronophotograph of a Man on a Tricycle, date unknown.

apart from in the technical literature and guide books, is the role played by the image’s technical support medium. In terms of the inventions that stopped time into visual moments of the present, that caused time as presentness to begin to appear, little has been written about the developments in film stock needed to allow the very quick capture of moving reality. The roll of celluloid film was a breakthrough in terms of the possibility for the analysis of movement over time. The development of a suitable carrier medium was crucial to the development of the analytical method of observation. A discovery made by J.B. Spencer and A.J. Melhuish, later refined by George Eastman, would be instrumental in allowing time to be studied along smaller scales. Marey was limited by the necessity of using glass as a medium to carry the photographic emulsion. In 1854, Melhuish and Spencer patented the idea of using a roll of calotype paper film to feed into a camera. Eastman revolutionized the idea by not only producing the film more efficiently, but also by enhancing the print quality and offering a service to develop the film stock (Chanan 2005: 76). Marey was able to then design a new camera that used paper film, loaded via spools into the camera. A clamping device would hold the film flat and still at which point a rotating shutter would expose segments of the paper

(Coe 1969: 131). The images, however, continued to blur if the movement was rapid. As reported by Brian Coe (1969), a series of successive pauses was required for each pose to be rendered clearly (132). A development in 1890 would change this. Based on work in the development of synthetic materials that began in the middle of the nineteenth century, John Carbutt conceived the possibility of using celluloid as a medium to replace glass plates (Chanan 2005: 77). Hannibal Goodwin, an amateur chemist, applied for a patent on the idea in 1887. However, before the patent was issued, Eastman, with the means of production afforded by a large company such as Kodak, developed and marketed a superior grade celluloid film. With celluloid film, Marey found a material that was much more suitable to the analytical study of movement in time. He was now able to shoot at 100 pictures a second. Moments were able to become much shorter and events were able to be measured at new scales. After Eastman introduced to the market his first paper roll film system, the apparatus of the camera was no longer constrained by the shutter speeds needed to expose a glass plate to light. Like other pioneers of this new carrier medium, such as Louis Le Prince and William Friese-Greene, Marey at first replaced the glass plate with paper film and was able to take pictures at a rate of around twenty per second at an exposure time of 1/500th of a second (Harding 2012). If the paper was passed by the shutter any faster than this it would tear in the cogwheel mechanism. After 1889, Marey, again following Eastman's release of celluloid film, which largely solved the time-based problems of paper, was able to measure movement in time at a much greater speed. The celluloid film's robustness when fed through the camera at high speeds allowed Marey to take pictures at a frequency of one hundred per second (Harding 2012). At this point, the analytical medium begins to be able to produce new images for thought at smaller and smaller scales that are characterized not by human perception but by instruments. Movements, from the perspective of analytical media rather than the perspective of human beings, begin to, as Bergson once pointed out, become describable as a stringing together of isolated frames. Along with developments in film, in order to record time and carry out the attempt at transducing and archiving events as scenes, a number of engineering solutions relating to the camera needed to be found. In 1862, in Paris, a figure known as A. Briois manufactured a photographic revolver originally invented by an Englishman by the name of Thompson, about whom little has been recorded. The revolver itself was able to make four exposures on a glass plate housed in a circular magazine. Around the same time, Thomas Skaif made a similar device, which he used for astronomical purposes (Launay and Hingley 2005: 60). Then the astronomer Janssen, the figure credited with developing the first satisfactory chronophotographic measurement of

phenomena, looked to the skies and developed a photographic device to record the passage of Venus across the sun. The once purely observational field of astronomy became experimental: it became thick with instrumentation, vision became technical and the unmoveable, non-manipulable objects of study became mediated (Peters 2015: 168). As the shutter disk turned continuously, the disk that held the glass plate turned intermittently and the transit could be recorded and analysed (Launay and Hingley 2005: 61). The clockmaker Antoine Rédier built a mechanism for Janssen that drove the disk that contained the shutter continuously, rather than having to start and stop the shutter to take a photograph, which introduced imprecision to the image. The impetus behind Janssen's invention was to overcome an error in previous observations of the passage of planets past the sun's disc. With some embarrassment, astronomers had to accept that for around a century the problem of exactly timing the solar parallax had proved to be unsolvable (Canales 2011: 92). Some blamed the equipment, some blamed 'the personal equation'. Janssen sought to rectify this by shooting a series of photographs, one of which would certainly capture this event (like the amateur photographer who snaps vast collections of digital photographs, only to edit them later). As Jimena Canales (2011) writes, 'astronomers tried to solve reaction time and personal equation problems by developing, most intensely after the 1870s, new methods and techniques. In careful and costly preparations to eliminate tenth-of-a-second differences in observation, they adopted improved photographic techniques, hoping that with them they could finally determine precious astronomical constants' (92). Janssen intended to isolate one specific moment by taking many shots in relatively quick succession and trying to eliminate the problems associated with timing a shot based on the reaction time of an observer. He was not interested in simulating movement, but in identifying and isolating one moment from the flow of time that was not based on the 'nervous reaction' of observers (Canales 2011: 106). Marey extended Janssen's work and developed the now well-known photographic rifle that used a circular photographic plate that was rotated synchronously by a clockwork mechanism. On top of the plate was located a rotating shutter, which exposed the plate while it was stationary and then covered over the portion as it moved on to make the next exposure (Coe 1969: 131). Via this assemblage of mechanical and chemical inventions, from the clockwork mechanism, the moving shutter and the photographic plate that reacted to exposure to light, Marey was able to photograph a limited twelve to fifteen pictures per second. Ernst Mach, who later would outline a great deal of philosophical thought on sense experience and the analytical, developed an extremely fast photographic technique to study supersonic motion. During the Franco-Prussian War, new French rifles were seen to create the crater-like wounds once attributed to

the exploding bullets that had previously been outlawed in the treaty of St Petersburg (Rossell 2008: 880). Mach was exploring possible reasons for this, one of which was the theory that a bullet, when travelling at very high speeds, carried with it compressed air. The old French rifles sent bullets flying too slowly to achieve the significant turbulence to inflict these crater wounds. But the new French rifles could send bullets flying at supersonic speeds, causing shock waves as they travelled through the air. Schlieren photography had already been developed by the German physicist August Toepler, who designed an apparatus that offered the possibility of imaging the refraction of a constant beam of light. Scientists could use this system to photograph the usually invisible movement of air around objects in a way that resembles a thick, viscous fluid. However, this technique could not cope with the very high speeds associated with supersonic motion. Mach adapted Toepler’s schlieren apparatus to produce a very brief spark that illuminated the projectile in motion and allowed it to be registered by the camera without the blur usually associated with very fast movements (Blackmore 1972: 110). Mach’s rebuilt Toepler apparatus operated by passing a projectile through a ring, which forced air through a small tube. This burst of air then caused a candle’s flame to flicker which in turn shorted a circuit and caused a Leyden jar to ignite a spark for a brief moment. By tinkering with the length of the tube that the air had to pass through, the illumination could be timed to the precise moment that the projectile passed in front of the camera lens. Mach then later upsized the experiment and, using the same technique, took a photograph of a canon ball in motion by firing it through a specially built darkened shed. Shortly after this, shutter speeds became increasingly fast and film increasingly sensitive, which meant that photographers no longer had to rely on stroboscopic effects in carefully controlled environments to capture time. In 1888, Ottomar Anschütz invented a camera using a pneumatically driven, electronically released shutter that could operate at 76 millionths of a second, enabling him to take the first daylight photograph of a cannonball in motion (Rossell 2008: 880). Through these inventions, photographers were able to record smaller and smaller processes with increasing fidelity. Inventiveness in the scientific lab – the solutions to time-based problems – gave birth to specific types of technical images, where observers could witness phenomena given form via its mediation. The time was taken out of these images, the future was deferred in place of a temporally thin present. It is through these photographs, these technical images that would become mass media, that we have been offered time at smaller and smaller scales. And it is through these photographs that we come to terms with what Mach, in his philosophical work, after his studies of ballistics, would term the ‘illusions of permanence’. By capturing the moment, deferring and preserving

the death of each moment, in a way that Zeno once described, change is able to be studied as a relation of elements at rest. The constant deferral of death, the misplaced concreteness of images, both in terms of the high-speed photograph and other analytical media designed for the digital processing, storage, retrieval and visualization of data, ensures a continued aftermath of these temporary moments. Why mention all this historical detail? Simply put, it is to show how technical images began to be able to measure time at smaller and smaller scales and the tractable instant became both stored and defined by hardware. The arcane phrase empeiria seems to have had its day. Now replaced by the experiment, which lays down a law from the start that characterizes the moving world, observation takes place based on principles, theories and an apparatus. As Heidegger puts it, it is through experimentally applied laws that nature and history become objective ([1954]1977: 125). Nature and history become the objects of a representing that explains. Such representing counts on nature and takes account of history. Only that which becomes object in this way is – is considered to be in being […]. This objectifying of whatever is, is accomplished in a setting-before, a representing, that aims at bringing each particular being before it in such a way that man who calculates can be sure, and that means be certain, of that being. (Heidegger [1954]1977: 127) Or in Serres’ ([1983]2015) words, the superficial work of analysis, untangling the knotted world, gives us objects of knowledge that are ‘formed by dismantaling forms, by destroying them, or fragmenting them, in which books were wound by unwinding other books, the night without risk or fatigue or clarity, in which Penelope unwraps, the analytical night of the interminable discourse of explication’ (66). The analytical photographic apparatus, a device that could measure and assign calculable values to movement in the world, offered a new scientific image of movement in the world. ‘Truth has been transformed into the certainty of representation’ (Heidegger [1954]1977: 127). This then delivered a means for studying biology, in the sense of being certain of the being of a body, in terms of mechanics, by dismantling a body. By freezing movement in time, measuring and analysing the body, whether a human or animal body or the larger body politic, as a predictable, measurable and programmable entity, a new primacy was given to systematic technical observation.
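
A rough, back-of-envelope illustration of the scales at stake may be useful here. The subject speeds below are invented for the example and are not taken from the sources cited in this chapter; the only point is that the blur in a single exposure is roughly the speed of the subject multiplied by the exposure time, which is why each jump in shutter and emulsion speed opened a new class of movement to analysis.

```python
# Back-of-envelope sketch; the subject speeds are invented for illustration.
# Blur in a single exposure is roughly speed multiplied by exposure time.

def blur_metres(speed_m_per_s, exposure_s):
    return speed_m_per_s * exposure_s

print(blur_metres(5, 1 / 500))      # a fast hand movement at a 1/500 s exposure: ~0.010 m of smear
print(blur_metres(440, 76e-6))      # a supersonic bullet even at a 76-microsecond shutter: ~0.033 m
print(blur_metres(440, 1e-6))       # the same bullet under a roughly microsecond spark: ~0.0004 m
```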

4 The Radical Cutting of Experimental Television

Following the last chapters’ exploration of analytical photography, this chapter turns to the television. For large sections of the population the television provided a way to demarcate moments of the day into episodes. The field of television studies, since Raymond Williams’ work, has been occupied to a great extent in providing an analysis of this function of the medium. While the temporality of the viewing routines of television, like the temporality of cinema, has been written about widely, far less attention has been given to the temporality produced by the technical functioning of this highly technical object. Most work has been done looking at on-screen content or viewing routines, with next to no emphasis in English-speaking media theory on the temporality and the technology of the television as an analytical medium. The chapter sets out the following propositions:

1. At the early and experimental stages of the television’s development, a number of engineering solutions were developed to grapple with time-based problems such as synchronization and delay. The chapter argues that at these moments engineers began the technical organization of time that would come to define the medium and at this point the time produced by television entered into the experience of media culture. Usually, the television is understood as a time-continuous medium, whether this is in terms of its transmission of signal as frequency modulations or the programming of its content. The chapter offers a significantly different way to understand the television as a post-historical medium, which is now predicated on notions of the segmentation of an event, rather than the transmission of events.

2. The engineering of temporality, as the problems of synchronization and delay were solved, had real observable effects on the performance of actors in front of the transmitter. This observation is used to argue that the technical properties of signal processing impinge in readily describable ways on the phenomena of contemporaneous life.

3. The key figure in developing the media philosophical approach to time, the post-historical and the analytical is Alfred North Whitehead. This chapter will continue the exploration of Whitehead’s latent but radical media philosophy begun in the previous chapter, with a particular emphasis on his conceptualization of events along different temporal systems than those given to us by historical media. Following the previous chapter’s formulation of events and scenes, the definition of mediation in this chapter, for which I take a great deal of inspiration from Whitehead, can be read as the processual turning of events into scenes.


The Telectroscope – M. Senlecq, of Ardes, has recently submitted to the examination of MM. du Moncel and Hallez d’Arros a plan of an apparatus intended to reproduce telegraphically at a distance the images obtained in the camera obscura. This apparatus will be based on the property possessed by selenium of offering a variable and very sensitive electrical resistance according to the different gradations of light. The apparatus will consist of an ordinary camera obscura containing at the focus an unpolished glass and any system of autographic telegraphic transmission; the tracing point of the transmitter intended to traverse the surface of the unpolished glass will be formed of a small piece of selenium held by two springs acting as pincers, insulated and connected, one with a pile, the other with a line. The point of selenium will form the circuit. In gliding over the surface, more or less lightened up, of the unpolished glass, this point will communicate, in different degrees and with great sensitiveness, the vibrations of light.

English Mechanic and World of Science, 31st January 1879 (reprinted in Stephen Herbert, 2004, A History of Television vol. 1, p. 15)

With this description, the English Mechanic and World of Science put itself at the feet of wide-ranging experiments with analytical media, where a pregiven scene was carefully examined point-for-point. To ‘see at a distance’ via electricity was what was at stake and this involved a number of solutions to problems of time, transmission, synchronization and performance, all of which were solved via technical developments that measured light in increasingly small and fast ways. Constantin Senlecq’s invention, much like Dennis D. Redmond’s Electric Telescope, invented in the same year, relied on selenium to transduce light into electrical signal. Both these inventions, however, stopping short of television proper, suffered from the slow response rate of the material and both delivered what were ostensibly early fax machines, able to reproduce still images and transmit these images over distances. Senlecq’s and Redmond’s inventions, like similar experimental devices such as Archibald Low’s TeleVista and the work of Arthur Korn that made possible the Bildtelegraph, proved to be breakthroughs in terms of reproducing images at a distance but they all had the same issues that stood in the way of the transmission of moving images and that prevented them from solving the problems of television. When light initially hits selenium it gives off a charge, but the substance takes time to recover its properties as a transducer. The early experiments with selenium all pointed to one common feature: instantaneous changes of light on the selenium did not cause instantaneous changes in charge given off by the material (Burns 1975: 954). The moving image, the event, always withdrew from the selenium, never able to be captured in its perceived completeness. This chapter looks into the technical foundations for the development of
optical media and explores the cultural role of the transmitter (in information theoretical terms) with regard to its signal-processing routines and its function in producing the temporalities for the transmission of images that would begin to become dominant in mass audiovisual culture. When it was first developed, the television acted as an analytical medium by segmenting the world in front of the transmitter into points of light that could be strung together to make bands. It is in this sense that Kittler refers to the medium as the first to give an all-electronic form to Claude Shannon’s Mathematical Theory of Information (Kittler [1999]2010: 202). The television transmits images using a time-continuous signal, but prior to transmission the image is segmented into points of light (this is obvious, but sometimes to get to a description of conditions we need to begin with these basic, general details). The essence of this highly technical object, as stated in the first edition of the first journal to be dedicated to this experimental medium, is ‘that each spot is examined’ (‘Television 1873–1927’ 1928: 23) and then converted into a charge. To achieve this, as Kittler ([1999]2002) points out, images would become ‘discrete quantities of data’ (208). The conditions were established for the now widespread proliferation of images over networks.

Ernst Ruhmer discovered that the image of a cross could be transmitted by segmenting the image into discrete squares of light. As mentioned already, it was then realized that the telegraph could be used not just to send dots and dashes but also to transmit signal that could be recomposed to make pictures. The Bildtelegraph (Figure 4.1), along with Low’s, Redmond’s and Senlecq’s inventions, relied on precise synchronization as the image was drawn one line at a time. Although these devices could not reproduce movement in time, they made operational their own media time in order to synchronize the sender and receiver. They became time-critical (Ernst 2016: 125). The pre-television machines created their own temporal systems in which they operated. They were delicate to time. Their very functionality was based on the way they operated in time, on the fly, producing media temporalities through their operation, as telegraph operators waited for a picture to appear. The waiting, the being-in-time of the users of these devices was based on the new medium’s pacing of its operation.

FIGURE 4.1 Illustration of the Bildtelegraph © Stiftung Deutsches Technikmuseum Berlin, Historisches Archiv.

In Germany, in 1937, Telefunken used lamps to develop a large pixelated screen (Figure 4.2), designed to reproduce television images to the mass public. The image was given form point-for-point, line-for-line. In the United Kingdom, John Logie Baird’s invention broke an image into scan lines, able to be reassembled based on the synchronized operation of two Nipkow discs and an array of lamps. In these experiments, we see the emergence of television time in twentieth-century media culture.

FIGURE 4.2 Telefunken’s large-scale experimental pixel screen © Stiftung Deutsches Technikmuseum Berlin, Historisches Archiv.

The major category of television is time […]. The television, at these experimental stages, amounted to an array of devices that not only represented time, like cinema, but relied at its technical base on time-based concepts such as synchronisation and delay. Sending and receiving devices had to stay in synch with one another for transmission to take place. It became all the more representative of a time criticality as images would disappear once they were transmitted. The temporal dimension of television […] would be an insistent present-ness. (Doane 2006: 251)

The television image was ephemeral, in time, in a way that the photochemically fixed cinematic image was not. The television image was captured, held for a brief moment due to the slightest of delays in the wires of the medium. Like cinema, television gave form to time-images. But these were distinct from cinematic time-images based not on their representational character but on their operation in time.

In this chapter I explore some of the experiments conducted at the beginning of television, which often highlighted the time-based phenomena associated with the medium. These include issues to do with conductivity, sensitivity, transmission and synchronization required to facilitate the transduction of light into electricity, then able to be transmitted as information. The chapter begins by examining the way time was organized at these key moments in the engineering of what would become such a central element in mass optical media in relation to the key problems of synchronization and delay.1 It shows how the engineers that worked on the problems of television, the majority of which were time-based problems, produced the new temporal systems that Whitehead asks us to try to see. As mentioned earlier, Whitehead’s imperative is to bring into view unfamiliar temporal systems that indicate the conditions for being both in and out of time with another. The solutions to the time-based
problems of television worked against the linear, progressive time of historical media and introduced into audiovisual culture the particalized time set out in Chapter 2. After investigating the emergence of these media temporal systems, this chapter ends with an exploration of a number of performances that were the subject of experimental broadcasts in Britain and the United States of America in the 1930s that demonstrate the television’s partitioning of time and experience. The function of the television as an analytical apparatus is explored, from early inventions to experimental broadcasts such as The Man with a Flower (1931), The Eve of St Agnes (1937) and The Queen’s Messenger (1928). The methods of recording and production of a television event – the methods by which the whole was analysed as discrete values – had real effects on dramatic performances to be depicted on screen. These moments reveal not only the early moments when performances had to be simplified due to technical limitations but also offer early glimpses of what was to become the conditions of the post-historical, images where movement is defined by segmentation, which now have come more fully into view. These experiments with time, transmission and synchronization, as media events, supported a particular style of performance in front of the camera, a type of televised subject, whose beginnings are further investigated here in the heavily ‘made-up’ image of the medium’s earliest performers. Questions of time and temporality have been a major part of the discipline of television studies. Raymond Williams (1974) most famously tuned scholars into this when he told us to focus on the programming of time-discrete units (programmes and ads) into a televisual ‘flow’ (89). Major scholars such as Paddy Scannell (1996), John Fiske (1987), Stephen Heath (1990) and Mimi White (1992), focusing on the programming, production and delivery of television content, have also added field-defining concepts to the discussion. Topics such as flow, liveness, the glance and more recently time shifting and mobile viewing seem to characterize the television as a medium that gives viewers a unique access to technically produced forms of temporality. For some cultural theorists, such as Stiegler, Flusser and Virilio, the television not only generates temporality but also brings with it what might be seen as a cultural loss of history and memory (McQuire 1998; Davis 2007). As the television image disappears into the communication chain, as it becomes a volatile electric charge, it is not able to be archived in the way print, photography or film once was. Predigital television did not record events in order for them to be archived. The events disappeared into the black hole of wires and circuits as soon as they were registered by the camera (Kittler 2010: 316). It did not produce a storage time, like print or film. Instead, it produced the temporality of events as they happened. Now that television has proliferated into various digital versions of itself, it is able to store the present, to store the events once captured as live. But it does not store these in order that they are able
to be accessed in chronological order, as events that can be synthesized into a whole. Digital television archives such as YouTube or Netflix are based on relational, rather than chronological, links. And it is by this that the television now produces the time of the post-historical as its images are designed to circulate, to go around in circles, rather than aimed at the future, leading on to the next image in a historical order. As Zielinski ([1989]1999) argues, ‘television is nothing more than the reification of time as a service or a commodity’ (235). In a way that continues today, viewers did not engage with homogeneous historical time slots, as far as the television was concerned. ‘Instead they dipped in and dipped out, helping themselves to the material provided by the industry for their consumption in frequent, small, portions, that broke up the other temporalities associated with home, school, college, and job’ (Zielinski [1989]1999: 235). Instead of a flow, this bitty engagement with the medium represents, for Zielinski, a segmentation of time, or what McLuhan might call a mosaic, that acts as a further support structure for the mechanical compartmentalization of time introduced around the beginning of the twentieth century, with Taylorism being the most acutely felt effect.

The segmentation and electrical transmission of images became relatively commonplace in the mid-1800s. Senlecq’s device, one of the forerunners of television as mentioned earlier, was modelled on an earlier apparatus designed by the Italian physicist Abbé Caselli. Called the ‘pantelegraph’ and much like the Bildtelegraph invented later in Germany, Caselli’s device, which began to be used for practical purposes in the 1860s, enabled the transmission of signatures, handwriting and drawings over telegraph wires. Caselli’s invention was itself apparently modelled on an earlier patent by the Scotsman and inventor of the first electric clock, Alexander Bain (a figure who also was instrumental in the use of the pixel as a representational device). Bain’s device however, due to a number of legal battles, particularly with two other major figures of electrical communication, Samuel Morse and Charles Wheatstone, was to remain sidelined. As is the case with a great many media discoveries, often those figures instrumental to the development of the techniques are not the ones rewarded by commercial gain.

In Bain’s device, raised pins (pixels) were scanned by a stylus that transmitted on–off pulses of signal. This signal, operating on the digital principle of switching, would then be transmitted to a receiver that reprinted the pattern on paper that was treated with a mixture of ammonium nitrate and potassium ferrocyanide, making it sensitive to electric current. In Caselli’s device, letters were painted with a special ink upon tin foil, which was scanned by a pendulum with a stylus attached to its tip. The foil itself was a very conductive material, while the ink applied on top of the foil prevented the conductivity of electric current. After each pass of the pendulum, a mechanism
incrementally moved the image perpendicular to the line that was just traced. The result was that the stylus could measure the variation in electric current over each scanned line. This was then transmitted as electricity to a similar apparatus at the receiving end where a stylus would discharge the current on a piece of paper that, like Bain’s earlier invention, was treated with a chemical solution sensitive to electric charge. These were among the first systems to replace the voice of the telephone and the coded messages of the telegraph with images. The telephone and the telegraph presented the receiver with synthetic information – by either listening or reading the receiver would assemble the particles of the message into a whole. The image however, although drawn line-by-line over time, presents a new temporal system, one where a whole is given first and the receiver is left to inspect its elements as their gaze moves over its surface. To produce this style of communication, where particles of information are organized by the apparatus rather than the human receiver, the rhythms of the device’s operation became important. Neither Bain’s nor Caselli’s invention picked up light; instead, via a regulating clock, they synchronized a scanning stylus at the transmitting end with a writing stylus at the receiving end. Senlecq’s later invention however was able to pick up light via, as mentioned earlier, his use of selenium, a substance that responded to light’s intensity by varying its conductivity of electricity. This was to offer the possibility of capturing movement, rather than simply reproducing still images.

Selenium is a substance that usually has a very high resistance to the passing of a current of electricity. When exposed to light however its resistance drops by 15 to 30 per cent, depending on the intensity of the light. The electrical engineer Willoughby Smith developed an ingenious method to test underwater telegraph wire as it was being laid. To achieve this in practice he would need a material to act as a semiconductor with high resistance. Selenium rods were the perfect choice. However, Smith soon found that his checks were very unreliable due to the wide variation in resistance delivered by the rods, depending on whether or not they had been kept in the dark or exposed to light (Abramson 1987: 6). Although useless for the purpose of running continuity checks, this property meant that once selenium was included in a circuit, the varying intensity of light which made up an image could be changed into a varying intensity of electrical impulses. The selenium-based image could achieve a much higher resolution than Caselli’s pantelegraph, which was only really suitable for blocks of single colour. Selenium however had one serious problem. As mentioned earlier, there were considerable time lags in the substance retaining its conductivity after being exposed to light. This meant that selenium-based television could only ever produce still images.
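That sluggishness can be made concrete with a toy model. The sketch below is purely illustrative rather than a reconstruction of any period apparatus: it treats the selenium cell as a first-order lag that only gradually catches up with the light falling on it and compares it with an idealized, instantaneously responding photocell. The time constant and the flicker rate are assumed values, chosen only to make the smearing visible.

```python
import numpy as np

# Toy model (illustrative only): selenium's conductivity relaxes slowly
# towards the incident light level, so rapid changes are smeared out,
# whereas an idealized photoelectric cell follows the light instantly.
dt = 0.001                                   # time step in seconds
t = np.arange(0.0, 1.0, dt)                  # one second of signal
light = (np.sin(2 * np.pi * 10 * t) > 0).astype(float)   # light flickering on and off 10 times a second

tau = 0.05                                   # assumed selenium recovery time (seconds)
selenium = np.zeros_like(light)
for i in range(1, len(t)):
    # the cell's response moves a small step towards the current light level
    selenium[i] = selenium[i - 1] + (light[i] - selenium[i - 1]) * dt / tau

photocell = light                            # instantaneous, if less sensitive, response

print(round(float(selenium.max()), 2))       # ~0.63: the lagging cell never reaches full response
print(float(photocell.max()))                # 1.0: the flicker is reproduced in full
```

In this crude model a static scene would eventually register at full strength, but anything that changes faster than the cell can relax is blurred away, which is one way of picturing why the moving image ‘withdrew’ from selenium.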
In 1928, the first popular journal dedicated exclusively to this still experimental medium set out the problem of television to the public: ‘the problem of television was to break up each living image into many thousands of fragments, convey these “pieces” to the receiver and reassemble them on the receiving screen in a fraction of a second’ (‘Television 1873–1927’ 1928: 10). Both in the industrial vernacular and the philosophical underpinnings of media theory, televisions do not offer up the re-presentation of events but the production of new scenes via a reassembly of particles that are the outcome of technical and organic entanglements (the performance of the camera and the performance of the body). This is what Bourdieu ([1996]1998) referred to as the invisible censorship of television (15–16), produced by institutional, economic, political and, I would add, also technical conditions.

The components of the apparatus of television, as a measuring instrument, offer particular epistemological engagements with the world. The television measures events, both through its programmed segments and its technical effect, as seen in the early experiments of Korn, Low, Redmond, Dieckman, Ruhmer, Senlecq and others with pixels and scan lines. The chronophotograph of the previous chapter turned the events of the world into scenes, still images which are projected back onto the world. The television, like the cinema but in a more radical way, once again projects scenes back onto the world based on its analysis of the swarming clouds of visual experience. A theory of events, of particles, is produced by media itself, as the apparatus becomes an active component in the images it produces. Media set the conditions to which all transmissions conform. This is the very thing that the term ‘media theory’ signifies and it is the very thing that Whitehead alerts us to in his discussions of the medium of language and logic.
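The 1928 formulation – break the image up, convey the pieces, reassemble them – can also be put in schematic form. The following sketch is only an illustration of the raster logic shared by Bain’s pins, Caselli’s pendulum and the later scan lines, not a model of any particular apparatus; the tiny ‘image’ and its dimensions are invented for the example.

```python
# Schematic sketch of the raster principle: a picture is examined spot by
# spot, flattened into a single stream of on/off pulses, transmitted, and
# reassembled line by line at the receiving end.
image = [
    [0, 1, 0],
    [1, 1, 1],
    [0, 1, 0],
]   # a tiny 'cross', echoing the cross Ruhmer transmitted as squares of light

width = len(image[0])

# Transmitter: scan each line in turn and emit one pulse per spot.
pulses = [spot for line in image for spot in line]

# Receiver: regroup the stream into lines of the agreed width. Everything
# depends on sender and receiver sharing this width and staying in step -
# the synchronization problem taken up later in the chapter.
received = [pulses[i:i + width] for i in range(0, len(pulses), width)]

assert received == image
for line in received:
    print(''.join('#' if spot else '.' for spot in line))
```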

Punctuating media

The essence of mediation is what Whitehead refers to as a movement from the many to the one. A single spatio-temporal unit is derived from a mediating function that passes reality through the here and now. In a passage of Process and Reality, where the vernacular of speculative metaphysics resonates particularly meaningfully with technical media theory, Whitehead states,

[…] the perspective of one sub-region from the other is dependent on the fact that the extensive relations express the conditions laid on the actual world in its function of a medium. These extensive relations do not make determinate what is transmitted; but they do determine the conditions to which all transmissions must conform. They represent the systematic scheme which is involved in the real potentiality from which every actual occasion arises [emphasis in original]. (Whitehead [1929]1978: 288)


Of course in this passage Whitehead is not discussing technical media per se. He is exploring the generalities of physical bodies and the scheme of extensive connections expressed in actual experience, whose transmission is mediated by extensive relations. But it is in this sense that the analytical medium of television and other electronic communications media fulfil the role of punctuation, separating lines into readable points, as though actual entities, and giving these points coordinates. These technical media act as a function that sets the conditions for the connections between coordinates. Images on screen and information over wires are received based on the connections that these conditions make possible, whether they be hardware-based, in the case of the engineering of circuits, or protocological. In terms of the analytical function of television, the segmentation of events is rehearsed both at the level of the protocols of the programming of content and at the level of the engineering of the technology. Williams (1974) points to the interruptions of advertising and trailers, which work to sustain the ‘flow’ of spectatorship. The night’s viewing, the advertising, the trailers for tomorrow night’s, ‘are programmed by providers and then by viewers, as a whole; that it is an event planned in discernible sequences which in this sense overrides particular programme units’ [emphasis in original] (93). In the age of analogue broadcast television, Williams focused on the whole that was synthesized from the parts. He was interested in the cultural meaning, as a whole, that came from watching television, which itself was a process for the crystallization of relations between broadcast moments and the interruptions of advertising. In the age of narrowcasting and greater fragmentation (both in terms of audience fragmentation and fragmentation as a technique for processing signal), the time is ripe to now bring the parts back into view and consider them, as meaningless segments, in media philosophical terms. Regarding this segmentation, John Ellis, in a similar vein to Zielinski, writes, Broadcast TV has developed a distinctive aesthetic form. Instead of the single, coherent text that is characteristic of entertainment cinema, broadcast TV offers relatively discrete segments: small sequential unities of images and sounds whose maximum duration seems to be about five minutes. These segments are organized into groups which are either simply cumulative, like news broadcast items and advertisements, or have some kind of repetitive or sequential connection, like the groups of segments that make up the serial or series. Broadcast TV narration takes place across these segments. (Ellis 1982: 112) The practice of programming ‘moments’ into a larger serial narrative of course has its lineage in the earlier programming logic of radio, but it also owes a
great deal to the experimental broadcasts of the late 1920s and early 1930s and the fragmentation that the new medium’s technical conditions facilitated. In order to understand the fragmentation of twenty-first-century new media from a post-historical perspective – and more specifically to explore the older practices now folded into new media – it may help to look back to the types of fragmentation produced by the medium at its early experimental stages. In the first experimental broadcast in 1928 in Britain, after introductory speeches given by William Graham and Sir Ambrose Flemming, performances from the actors and singers Sydney Howard, LuLu Stanley and Miss C. King were programmed at two-minute intervals. The actors and singers first performed for two minutes and then repeated their performance for the microphone. Both sound and image, broadcast from the roof of John Logie Baird’s studio, had to be transmitted on the same channel. Ears and eyes could not receive the transmission at the same time. They had to be separated. At these early stages, television delivered a time of live transmission to audiovisual media rather than the time of storage like that offered by film. Although the television worked by transmitting time-continuous signal, as modulations, this also delivered a condition of segmentation – breaking up the senses, breaking up movement, undermining the flow, breaking up events into discrete scenes. Digital television, now predicated on storage time, the mining of YouTube for journalistic content, time-shifting, TV on demand, once again retrieves this technique of segmenting events outside of a chronological flow and displaces the ‘liveness’ of the medium. Years after Baird’s series of two-minute broadcasts, in 1937, when Britain utilized both the EMI-Marconi System and the Baird Television System, the programming policy developed by Gerald Cock, the first director of the UK Television Service was based on ‘variety and balance’. Airtime was limited to two hours per day, which was segmented into programmes that were both topical and demonstrational (Jacobs 2000: 32). The programming of televisual time, a time based on transmission rather than long-term storage, was however not solely based on the decisions of the leaders of this new industry. It was also based on its technical supports. The use of the two competing television systems had a great impact on the durational aspects of programmes able to be broadcast. In the weeks that broadcasters used the Baird system, the programmes were limited to fifteen minutes. The reason for this was that the Baird system utilized a method of broadcast known as the Intermediate Film (IF) technique. A television drama to be broadcast was filmed by a stationary camera using a 17.5-millimetre film, which was then developed in under a minute and scanned electronically to be broadcast to Baird televisors and transduced back into moving images (Jacobs 2000: 32–33). The camera could not be moved without a halt in transmission. Both time and movement were locked into the restrictions of the technical apparatus. The EMI system on the
other hand, which eventually won out over the Baird system, proved to be more flexible and allowed for both movement and time to be produced in new ways. This system was more attractive because it was not the static record of an event in time that it offered but instead the production of ‘liveness’, as a particular scenic image. The apparatus could organize the event, the everyday, into a scene of the live here and now. After 1937, the Baird system was shut down and the EMI system delivered a standardization of televisual time and movement. The broadcast day, lasting around four hours, was now split into the sixty to ninety-minute blocks of morning (11.00 a.m.–noon), afternoon (3.00– 4.30 p.m.) and evening (9.00–10.30 p.m.), each with their own characteristics (Jacobs 2000: 33). At the level of programming, the television has been very effective at segmenting the day into moments. In a similar way to the fragmentation of industrial life and the time–motion studies introduced by Taylor and supported by the apparatus of chronophotography, mentioned in the previous chapter, television segments the day on both macro- and microscales. As mentioned earlier, the time produced in order to contextualize the production of capital structured the day around a segment of work time. All other activities had to take place outside a demarcated section of time. In addition, chronophotography and time–motion studies segmented time on a much smaller scale, as workers’ very small movements were analysed and the work day became able to be measured in terms of gestures, which were in that way able to be made more efficient. Time as directly experienced became atomized and lost the quality of prehension, the fluent passing of one occasion to another, that Whitehead described. Television rehearses a similar segmentation of time. It segments the duration of the day, as shown above, into viewing ‘moments’ and also segments events at a much smaller scale into transmittable portions of light. The capacity for producing previously unfelt forms of temporality through transmission is what gives television a uniqueness as a time-based medium and this is something that engineers, inventers, actors and producers have continually experimented with. Before the introduction of the temporality of programming schedules, at its experimental beginnings, due to the transductive qualities of selenium, the apparatus produced a slow, delayed and often blurred form of image making. In television’s contemporary forms it similarly makes once unfamiliar temporalities felt in the everyday, offering time shifting, binge watching, casual viewing and multiplatform engagement that can take place at almost any time of the day, along with the multi-temporality produced by YouTube footage such as that described in Chapter 2. As it was first stabilized in a mechanical form, publicly demonstrated by Baird at Selfridges in London, its capacity for establishing new relations between vision, time and space was emphasized. As detailed by Selfridge himself, under the pen name of Callisthenes (1925),
‘television is to light what telephony is to sound – it means the instantaneous transmission of a picture, so that the observer at the receiving end can see, to all intents and purposes, what is a cinematographic view of what is happening at the “sending” end’ [emphasis in original] (14). However, this historical idea of capturing events as a flow of signal to be transmitted across space is certainly not the case nowadays and, as argued in Chapter 2, illustrated by the YouTube footage of Gaddafi, new types of time, apart from live, direct, linear time are now a product of digital television. From the here and now we might suggest that the archive of YouTube, a medium for the production of post-history, offers viewers a connection to a different type of instantaneity – which exemplifies the production of the present as multiply timed. The television was once predicated on the idea of liveness and a certain amnesia due to its operation as a non-storage medium (apart from the important camera invented by Zworykin which stores light in pixels for fractions of a second). As Kittler argued, the images of television are not optical but electrical. They are not preserved in visible form the way film images are. They are lost in the communication channel. ‘The eyes can only access these signals at the beginning and the end of the transmission chain, in the studio and on the screen’ (Kittler 2010: 316). Digital television is now, however, predicated on the idea of storing liveness, archiving the temporary, preserving its audiovisions as invisible media between sender and receiver and synchronizing a global community of viewers to the temporary point in time. If the phonetic alphabet, as McLuhan famously argued, predisposed lineal thought and the idea of History as a progression, then the storage time of YouTube may signal a shift in this lineal engagement with events by predisposing viewers to a life in the aftermath of the temporary. To paraphrase Flusser, events no longer roll towards the future, but are now sucked into YouTube, synchronized and made to repeat. Although the current character of the so-called digital television is prima facie quite different from the medium’s experimental beginnings, there remain a number of important events in the medium’s history that continue to affect the condition of television today. The current state of the medium draws into itself problems and solutions that are much older. The way issues to do with synchronization, sensitivity and delay were solved almost a hundred years ago continue to impact not just the technical function of the medium (as they led to further developments) but also its role as a cultural form. These moments in the history of television show us the way time-based engineering problems were solved and the way that these solutions and the medium’s new operability produced the temporality of the new audiovisual culture. This will be argued in the next chapter but before entering into media theoretical reflection, some more rear-view analysis and technical details are needed. In the remainder of this chapter, first, some developments in mechanical
television are explored and their effects in relatively contemporaneous experimental broadcasts are traced. Second, in the next chapter, twenty-first century television is explored, with reference to these and other developments with early television to show how, like the early experimental broadcasts, the medium’s history of development, its engineering of time, continues to impose itself on contemporary transmissions.

The problem of time: Delays and synchronization

Before television became standardized and before Baird’s experimental broadcasts, a number of inventors grappled with the problem of transmitting pictures over distances. As mentioned earlier, one of these early inventions was Caselli’s pantelegraph. At one end of the line the image was scanned line-by-line, at the other it was written line-by-line. The ‘great defect’, however, as Thomas Edison described it, of Caselli’s invention was synchronization (Coopersmith 2015: 23). Caselli relied on a system that involved the human operator adjusting the swing of the pendulum so that it came to the limit of its swing at a vertical line held at the edge of the paper. Any imprecision led to distorted and blurred images.

An article that featured in an 1880 edition of Scientific American titled ‘Seeing by Electricity’ detailed an invention by George Carey, which took the form of a selenium camera that could be used to transduce light into electric current. Inside the camera, which was influenced by Bain’s, Senlecq’s and Redmond’s earlier work, a disk was inserted with ‘numerous small holes, each of which is filled partly or entirely with selenium’ (Herbert 2004: 17). Each small hole was then made part of a circuit. When the selenium is exposed to light it becomes conductive and allows the electric current to pass through it. When it is not exposed to light it is not conductive and does not allow current to pass. The changes in voltage amount to measurements of light as expressed by the changes in current. This apparatus, like Bain’s earlier invention, represented one of the earliest types of analytical media as far as the problem of television was concerned. However, although the device seemed to offer solutions to a number of synchronization problems with Caselli’s device, it was still ostensibly a fax machine as the problem of selenium’s response rate continued. It also, like other early experiments with pixels conducted at Bell Labs, proved too expensive and cumbersome to be practically useful. Although informationally economical, in the sense that it separated signal into small values (pixels) rather than the continuous lines of the pantelegraph, the selenium telegraph was not economical in terms of hardware, with each pixel requiring its own wired circuit. The apparatus could separate the image into pixels but could not transmit this information without each pixel requiring its own line.
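The hardware cost of such a parallel arrangement is easy to put in rough figures. The sketch below simply counts circuits, one per selenium-filled hole in a Carey-style mosaic against a single line for a scanned, pantelegraph-style system; the resolutions used are arbitrary examples rather than figures from the sources.

```python
# Rough arithmetic (illustrative figures only): a mosaic of selenium cells
# needs one wired circuit per picture element, while a scanning system needs
# a single line plus some way of keeping both ends in step.
for side in (10, 30, 100):
    parallel_wires = side * side        # one circuit per pixel in the mosaic
    scanned_wires = 1                   # one line carrying the spots in sequence
    print(f"{side}x{side} image: {parallel_wires} parallel circuits, "
          f"{scanned_wires} if scanned line by line")
```

Even at modest resolutions the wire count of the parallel scheme runs into the thousands, which is one way of seeing why scanning, and with it synchronization, became the dominant solution.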
In 1927 and as part of a special issue on the topic, Herbert Ives published a paper outlining the problems of television. Ives set out the broad problems of television as that of converting light signals into electrical signals, transmitting these signals to a distance, and then converting these electrical signals back into light signals (Ives 1927: 552). These three essential tasks, as he put it, had more or less been solved by the experiments of Senlecq, Redmond and Carey, among others. But what these solutions continually rehearsed was the much more difficult problem of ‘developing these means to the requisite degree of sensitiveness, speed, efficiency, and accuracy, in order to recreate a changing scene at a distant point, without appreciable lapse of time, in a form satisfactory to the eye’ (Ives 1927: 552).

A solution had come in 1887, but was not utilized for television until much later. To move television out of its ‘stand still’ brought about by a perpetual time lag, inventors, most notably Baird, looked to a discovery made by Heinrich Rudolf Hertz (Baird 1926: 3). While conducting experiments on electromagnetism, Hertz discovered that the sparks that he was using reacted to ultraviolet light. This observation led Hertz to conduct experiments, showing that sparks move more readily when exposed to ultraviolet light. A number of scientists followed Hertz’s work and began to peer into the behaviour of light and energy transfer at the quantum scale. This work culminated in 1905 when Albert Einstein explained that a beam of light hitting a metal surface passes its energy to the metal’s atoms and displaces some of its electrons. When technical devices became able to measure the displacement of electrons, light was able to be converted to electricity without the delays associated with selenium.

In Television: A Popular Talk, Baird, looking for a solution to the problem of the sluggishness of selenium, cites Hertz’s discovery as a possible solution. However, he did not adopt the photoelectric cells, although his patron Will Day pleaded with him, instead persisting with selenium-based television. Although the photoelectric cell could respond instantaneously to light, as Baird had discovered, its insensitivity compared to selenium proved to be a major practical problem (Burns 2000: 106). Baird’s first system, in the style of the passionate dilettante experimenter, coupling together pieces of technology, testing things out to see what worked, was based on the use of a Nipkow disc spinning in front of selenium cells, at the transmitting end, wired up to lamps at the receiving end, whose light was filtered through a second Nipkow disc spinning in synchronization with the first.2 The difficulty with the delay and the electrical noise generated by the selenium continued, resulting in an inability to acquire the resolution that would make the Baird system viable as a commercial product (Magoun 2009: 32). Prior to a replacement for selenium being found, the closest Baird
could get to televising a moving image was by using silhouettes, which simply required the selenium to react to the light around the silhouette, not the reflected light from a moving body. An article in The Times in 1926 reported on the successful test of Baird’s ‘televisor’ apparatus:

Members of the Royal Institution and other visitors to a laboratory in an upper room in Firth-street, Soho, on Tuesday saw a demonstration of apparatus invented by Mr. J. L. Baird who claims to have solved the problem of television. They were shown a transmitting machine, consisting of a large wooden revolving disc containing lenses, behind which was a revolving shutter and a light sensitive cell. It was explained that by means of the shutter and lens disc an image of articles or persons standing in front of the machine could be made to pass over the light sensitive cells at a high speed. The current in the cells varies in proportion to the light falling on it, and this varying current is transmitted to a receiver where it controls a light behind an optical arrangement similar to that at the sending end […]. Application has been made to the Postmaster-General for an experimental broadcasting license, and trials with the system may shortly be made from a building in St Martin’s lane. (The Times 28 January 1926: 9)

The system still used the selenium-based cells, but made significant improvements by replacing the light bulb with the more sensitive neon lamps in the receiving apparatus and the operation of the Nipkow disc, which used 32 bull’s eye lenses from bicycle lamps to scan an area 10 × 8 inches (Magoun 2009: 35). The apparatus could now give a crude image with some gradation, but one which, as one witness described, ‘is not even pleasant to view’ (Burns in Magoun 2009: 36). These initial problems were surmounted when Baird, following prompting from other researchers and engineers, finally replaced the selenium cell with a thallium sulphide photoelectric cell (Burns 2000: 105).

In addition, Baird made a breakthrough in imaging by using a signal-sharpening circuit.3 Baird was continually frustrated that the sample of a point of light given by selenium lacked a sharp ‘punch’. Selenium could measure intensity very well but was not very good at measuring the change in this intensity (i.e. movement). By using a signal-sharpening circuit, a measurement of intensity could be given but also a measurement of its rate of change. The circuit that Baird used responded to fluctuations, not just intensity, and was able to be used to measure moving light. The circuit also required less amplification and so did not create ‘ground noise’ which would otherwise interfere with sound recordings.
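The sources quoted here do not give the details of Baird’s circuit, but the general idea of responding to change as well as to level can be sketched digitally: adding a scaled first difference to a sluggish signal emphasizes its fluctuations, which is all that ‘sharpening’ amounts to in this toy version. The lag and the gain are assumed values, and the sketch is not a model of Baird’s actual hardware.

```python
# Toy illustration of 'sharpening' a sluggish signal by adding its rate of
# change: the boosted signal responds to movement (fluctuation) far more
# strongly than the lagged intensity alone. Not Baird's actual circuit.
light = [0.0] * 10 + [1.0] * 10              # light steps from dark to bright

tau = 5.0                                    # assumed sluggishness of the cell (in samples)
lagged = [0.0]
for value in light[1:]:
    lagged.append(lagged[-1] + (value - lagged[-1]) / tau)

gain = 4.0                                   # assumed boost applied to the rate of change
sharpened = [lagged[0]] + [
    lagged[i] + gain * (lagged[i] - lagged[i - 1]) for i in range(1, len(lagged))
]

print(round(max(lagged[:12]), 2))            # slow rise just after the step (~0.36)
print(round(max(sharpened[:12]), 2))         # same moment with the change emphasized (~1.0)
```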
Largely based on these new developments, in 1929, the Postmaster-General granted Baird the use of a BBC station for experimental broadcasts, outside normal broadcasting hours. This was after a year of controversy, whilst the BBC continually delayed Baird, due to perceived impracticalities and the low resolution of his images. In 1929 however, after making the improvements to the light sensitivity and photoelectric cells, Baird demonstrated the new system to the BBC and GPO officials, which led to the following response from the Postmaster-General:

The demonstration showed that the Baird system was capable on that occasion of producing with sufficient clearness to be recognised the features and movements of persons posed for the purpose at the transmitting point… In the Postmaster-General’s opinion the system represents a noteworthy scientific achievement; but it is not considered that at the present state of development television could be included in the broadcasting programmes within the existing hours. He bases this view not so much upon the quality of the reproduction which further experiments may be expected to improve as upon the present limited scope of the objects which can be reproduced. The Postmaster-General is, however, anxious that facilities should be afforded. (Swift 1950: 43–44)

On 30 September 1929, Baird Television began its experimental broadcasts outside normal broadcast programming.

Even as the problem of delay was being solved, the problem of synchronization would prove to be a consistent issue to be grappled with by the engineers of experimental television systems. The reconstruction of an image at the receiving end of a television system is dependent on the reconstructed elements falling in exactly the right place at precisely the right time. A 1927 paper in the Bell Technical Journal by H.M. Stoller and E.R. Morton outlined the research conducted to find engineering solutions to the time-based problems associated with transmission. In order to get televisions around the country to operate at the same speed it was ‘necessary to employ unusual features of motor design and control circuits to secure the required results’ (Stoller and Morton 1927: 604). A way needed to be found for the whirring motor to be held in precisely the right tempo for the image to appear correctly in the viewing window. After Baird’s use of signal-sharpening circuits and his transition from selenium to the thallium sulphide photoelectric cell as the light-sensitive substance placed behind the spinning Nipkow disc, images could be transmitted in their liveness. But the problem of synchronization persisted. How could you make sure that the Nipkow disc was spinning at the same speed at each end and that corresponding lights were illuminated at the correct time? If things went wrong, the image would be cut up or framed incorrectly, appearing to splinter in vertical lines. The solution came as the analytical medium was used not only to measure light but also to measure time
and regulate the speed of mechanical wheels. Short pulses were sent to the television to trigger a small toothed wheel that regulates the speed of the motor that drives the Nipkow disc. Baird explains this in an article in Popular Science in 1932. After describing the black unilluminated strip at the top of the receiving image on his television system, Baird states, ‘it is this strip which forms three hundred and seventy five definite impulses a second, and it is these impulses which we impress upon the synchronizing device’ (Baird in Waltz Jr 1932: 85). In Optical Media Kittler suggests that it was the introduction of the Braun tube that solved the problem of synchronization and delay that previously plagued Nipkow-disc-based systems. But in fact it was Baird’s discoveries of the use of signal-sharpening circuits and the synchronization pulses that would remain in use years after the transition to fully electronic television. Baird further provides evidence for this in the description of his method in an instruction manual for one of his first television systems:

the Baird automatic synchronisation gear makes use of the received signal to provide the synchronising impulses. These pulsations are fed to coils actuating 2 electromagnets placed diametrically opposite to one another and between which is a cogged wheel mounted on the motor shaft and having 30 teeth. In operation the received current passing through the coils of the magnets (assuming the disc has now been first synchronised with the transmitting disc) creates a magnetic pull on the tooth passing the magnet face and so holds or checks the motor speed.

The solution to the problems of synchronization and delay that Baird offered was to become one of his greatest contributions to the measurement and production of time, first in experimental settings and later in contexts so everyday that the conditions of so-called real-time and the synchronization of networked machines are no longer seen as spectacular.
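A minimal sketch can indicate why pulses of this kind keep two discs locked together. The figures 375 and 30 come from Baird’s own descriptions quoted above; the drift and the strength of the corrective ‘pull’ are invented for the example, and the loop is a deliberately crude stand-in for the electromagnets and the cogged wheel.

```python
# Toy model of pulse-based synchronization: the receiving disc free-runs
# slightly out of step, and each received impulse 'checks' the motor,
# pulling the phase error back towards zero instead of letting it grow.
pulses_per_second = 375            # impulses formed by the unilluminated strip (Baird)
teeth = 30                         # teeth on the cogged wheel in Baird's sync gear
print(pulses_per_second / teeth)   # 12.5 revolutions of the wheel per second, one tooth per pulse

phase_error = 0.0                  # how far the receiving disc is out of step (arbitrary units)
drift = 0.02                       # assumed error accumulated between pulses
pull = 0.5                         # assumed fraction of the error removed by each pulse

for _ in range(30):
    phase_error += drift                   # the motor wanders between impulses
    phase_error -= pull * phase_error      # the magnetic pull checks the motor

print(round(phase_error, 3))       # settles near 0.02 instead of growing without bound
```

Without the corrective pull the error term simply accumulates, which is the splintered, mis-framed image described above; with it, the error stays bounded and the two discs remain effectively in step.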

Events

Technically produced scenes began to proliferate as the television began to become a more powerful technology. In order to understand the structure of these scenes as the transformation of an event, we can again turn to Whitehead, the philosopher who has given us the most rigorous treatment of the event and process as an outcome of very small actual entities and who is in complete accord with the technically focused media philosophy presented here. As mentioned in the previous chapters, the ‘event’ for Whitehead came to be one of the most important concepts around which he was able
to develop his process-based cosmology, which in effect is a ‘cell-theory of actuality’ (Whitehead [1929]1978: 219). This term designates more than simply the happenings of objects, or the experiences of subjects, but offers a radically non-subjective way to describe the world. An event for Whitehead could be described as ‘the nexus of actual occasions’ (Whitehead [1929]1978: 73). It is a channel or connection between occasions, a relation between collectives, it is what Serres would call a quasi-object, and it is an idea that becomes extremely important in Whitehead’s philosophy. Actual occasions, or actual entities, are terms that designate the smallest level of existence. They are, as Whitehead ([1929]1978) puts it, ‘drops of experience, complex and interdependent’ (18). ‘Actual entities – also termed actual occasions – are the final real things of which the world is made up. There is no going behind actual entities to find anything more real’ (Whitehead [1929]1978: 18). In essence, this concept allows Whitehead to undo the distinction commonly held between objects and events. As argued in the previous chapter, for Whitehead, the object is an event. It exists within a world of events. Something is always happening around it, something is always happening to it and it itself is always happening with these other events. The event runs between actual occasions. It runs between the pixels, between information, behind images. It is the indivisible, extensive quantum that runs between the divisible acts of becoming. The event in these terms is the very thing obscured by analytic media. The event of transmission is now, by the nature of the medium, divisible. The television, in media philosophical terms – seen when we look to the technical function of the medium in order to think critically about the medium – mitigates the event. It projects a scene onto the world. This is one thing that we see when we think about mediation through Whitehead.

Another element of media culture that Whitehead’s philosophical treatment of the event allows us to explore further, much like the work of both Kittler and McLuhan, is the extensive relationships between humans and media systems. The event, the nexus that links together occasions, produces the bodies of which it is an attribute and influences their becoming. ‘An actual entity is nothing but the unity to be ascribed to a particular instance of concrescence’ (Whitehead [1929]1978: 212). In a way that runs counter to much metaphysical thought, Whitehead argues that it is togetherness that happens first and then attributes, as abstractions of this togetherness, emerge. In other words, events, those things at the base of all experience, are not, for Whitehead, simply things that happen to someone, like getting run over by a car. Instead they are things that happen with someone, and that in fact constitute the unity of that subject at that point in time. The human, for Whitehead, is a figure that is constituted by their location within a nexus, within spheres of togetherness.


In an event following Whitehead’s formulation, it is not as simple as saying an object (such as a car) impinges on a subject (and breaks their bones). Instead the car and the so-called subject form a nexus of occasions that impinge on one another. The event, the car running off the road onto the footpath, is what causes the broken bones, not the physical attributes of the car. In fact, the event gives significance to the physical attributes of the car. It is through the event that this potential is actualized and that the emergence of new attributes happens. It is only through this event, which draws together many actual occasions that the weakness of the human body and the strength of the car’s frame actualize in the one moment. These primary qualities of the objects, those things that scientists attempt to measure, are, as has been more recently argued by new materialists such as Karen Barad, only known through these eventful interactions (or intraactions as she more precisely puts it). They circulate, taking on their character, based on the relations instituted by the event. How can this mode of thinking be applied to the history of media? How have subjects, as viewers, been produced by their togetherness with audiovisual technology? As mentioned in Chapter 2, beings now become frequencies, attributes become functions and qualities become quantities when, in eventful interactions, they are transduced by analytical media. Being now involves a relationship with technical media and the human and experience is given its attributes through this nexus. In his first category of explanation, Whitehead points out ‘that the actual world is a process, and that the process is the becoming of actual entities’ (Whitehead [1929]1978: 22). This is the becoming that occurs as frequencies are picked up, transmitted and tuned in, and turned into images based on the operation of media channels. First comes process and this provides the conditions for the definition, resolution or production of actual entities. It is the same with media. The processes that occur in the movement of electricity through circuits and, because of this, the potentiality immanent to hardware provides the condition for the definition and resolution of images on screen. This was seen previously in the description of the experimental selenium based devices that predate television. As Whitehead states in the second category of explanation ‘in the becoming of an actual entity, the potential unity of many entities in disjunctive diversity – actual and non-actual – acquires the real unity of the one actual entity; so that the real actual entity is the concrescence of many potentials’ (Whitehead [1929]1978: 22). Limited by the movement of electricity within the apparatus and the event of transduction, the media apparatus, with a wide ranging potential, begins to define what can and cannot be performed and what can and cannot become an image, which amounts to the making public of an identity of the event. When all the solutions to the problem of television came together and the medium began to stabilize, the cultural form of television moves from
This is a movement that happens a great deal in television's history as a technology, as it has shifted from mechanical to electronic and digital formats and has been used in vastly different ways by different groups. At each stage of emergence, a concrescence, a gathering together of problems and solutions, has occurred. This multi-temporal assemblage of inventions and solutions, where engineering developments from different times are cobbled together, sets the conditions for the mediation of signal. The identity of the event, as was shown in Chapter 2's interpretation of the YouTube image, is produced and then exposed by the multi-temporality embedded in the technical image. This is perhaps what process philosophers such as Whitehead mean when they argue that each moment of the present has integrated within itself every moment of the past and at the same time constitutes the past through this togetherness in time.

Whitehead begins to discuss mediation in Process and Reality in more direct terms with reference to language as communicated via human speech.4 In a section of the book where Whitehead grapples with the function of symbolism, he begins by thinking of a single word – 'forest'. 'A single word is not one definite sound. Every instance of its utterance differs in some respect from every other instance: the pitch of the voice, the intonation, the accent, the quality of sound, the rhythmic relations of the component sounds, the intensity of the sound, all vary' (Whitehead [1929]1978: 182). Whitehead argues that when the word is uttered and some listener gathers its meaning there is a symbolic interplay and an event has taken place. This event involves transmission on a number of levels: in the present, and from the past, expressed both in mental recollections and in the activities, change and ageing that are sedimented into the speaker's and listener's bodies. Whitehead points this out as follows:

Thus we have a two-way system of symbolic reference involving two persons A and B. The forest, recollected by A, symbolises the word 'forest' for A; then A for his own sake and B's sake, pronounces the word 'forest'; then by the efficacy of the environment and of B's bodily parts, and by the supplemental enhancement due to B's experiential process, the word 'forest' is perceived by B in the mode of immediacy; and finally by symbolic reference, B recollects vaguely various forest scenes. (182)

The media event for Whitehead thus consists of various levels of interplay that result in a number of potential outcomes being reduced to one. In this description, Whitehead puts emphasis on the physiology of B, which impacts greatly on the way he or she experiences the world through his or her having a body. However, Whitehead forgets to emphasize the physiology of A, which, as pointed out earlier, impacts greatly on the sound of the word 'forest'.
The word is formed by A as air is expelled from the lungs, through the throat, modulated by the vocal cords, across the soft palate, over the tongue and through the teeth, to then cause vibrations in the air, which are picked up by B's ear. The media event does not just involve the intentionality of A but involves hardware, which has real consequences for the 'pitch of the voice, the intonation, the accent, the quality of the sound, the rhythmic relations of the component sounds and the intensity of the sound' (Whitehead [1929]1978: 182). These elements are outcomes of the movement of the tongue, the shape of the teeth, the strength of the diaphragm and the shape of the vocal cords. These pieces of hardware, themselves given form by a history of events, such as smoking, dental work, dehydration or other environmental and genetic processes that alter the operability of the organs used in speaking – these multi-temporal events folded into the present – alter the way the event of the word 'forest' is able to become meaning for B. In order to arrive at a theory of media events it is not enough simply to look to the subjective experiences of meaning and authorial intention. One also needs to bring into view the hardware conditions, themselves constituted by events, some recent, some from long ago, from which these experiences emerge.

This would amount to the replacement of static stuff with fluent, and constantly mediated, energy. 'Mathematical physics translates the saying of Heraclitus, "All things flow", into its own language. It then becomes "all things are vectors". Mathematical physics also accepts the atomistic doctrine of Democritus. It translates it into the phrase, "All flow of energy obeys 'quantum' conditions"' (Whitehead [1929]1978: 309). In looking to the experimental processes that led to the development of television hardware and the medium's structuring of time, it is just this type of fluent event that I am, in one way or another, trying to get at. Looking to the experiments of Baird, Senlecq, Redmond and Caselli, among others, offers an indication of the process that led to the development of television hardware, which now acts as the substrate upon which the contemporary flows of energy and vectors are drawn. Rather than looking to the static stuff of television, its vacuous materiality, what these experiments indicate is how energy, in the form of light, became mediated in ways that supported its transmission and its conversion into a different form of energy.

As the invisible event of transduction took place, which had associated with it a number of time-based problems to do with delay and synchronization, it had visible effects and produced a new style of performance. It could not capture fast, fluent movement. Actions, events, had to be timed based on the apparatus. On mechanical television screens the image continued to flicker and remained ill defined, with the human face appearing as 'an oval of white, and, if the mouth was open, a flickering black spot appeared in the middle of this white oval' (Baird 1926: 8). In order to remedy these effects at this early stage of the engineering of images, performers in front of the camera had to adapt to the medium.
Before television was stabilized as a medium, 'for the first time a producer came up against the problem of make up' (Swift 1950: 47). Actors were required to wear thick make-up, stand in front of extremely bright lights and remain close to the camera, making slow deliberate movements. The most well-known image of a performer at this time is the picture of Jane Carr, reproduced in a number of books on this period of television history. Most notably, the image appears in two groundbreaking books in television history, Zielinski's Audiovisions and, much earlier, Swift's Adventure in Vision. The image itself is a quaint reminder of the history of television performance but also of the demands made by the television on performance.

It is through looking at the performances, these experiments with broadcasting, that we can see the temporality of the new medium. Viewing old televisions in media museums we can see the history of the design of hardware components. But to see the media temporalities of early television, we need to look at the medium's operability, including importantly the performances that it transmitted and the dramatic scenes that it conditioned. But rarely have historians gone into the actual performance of these early actors, beyond simple descriptions of their appearance, largely because the footage has been lost. Media historians such as Jason Jacobs, Stephen Herbert, John Swift and Donald F. McLean provide some useful starting points, as does print material from newspapers, of which The Times has been the most valuable in both its reviews of broadcasts and reporting of engineering developments. In addition, the first of the experimental British television plays, The Man with a Flower in His Mouth, was recreated in 1967 by Bill Elliot, supported by the Inner London Educational Authority (ILEA) and the original producer Lance Sieveking, using similar (though not identical) cameras, and using altogether different sound recording techniques (McLean 2013). These sources are drawn on here, along with the earlier exploration of the engineering of the apparatus itself, to explore the temporality of experimental television at these very first moments of the televisual media event that was to become so much a part of contemporary life. At these moments, the apparatus is most visible through its imposition on the performance and it is at these moments that we can see most clearly the conditions that are now, as analytical media has matured, far less visible. Going back to these early broadcasts, revisiting and piecing together these fossils of media culture, we might see something of the foundation on which contemporary television scenes are based.

The scene amounts to technical media's mitigation of the pure event that Whitehead describes. It refers to the measurement and limiting of extensive relations, seen most obviously above in Baird's two-minute broadcasts, and the production of divisible and fragmented reality. Mediated scenes, like the type that Flusser described, although existing long before the introduction of
the television, take on a new importance after the problems of television were solved and brought with them significant changes to the complex apparatus of communication. Television, and indeed radio before it, required a highly technical centralized form of organization. As Zielinski ([1989]1999) argues ‘in cinema, it was increasingly people who were new to the industry – businessmen, tradesmen, bankers, but also aesthetic producers – who organized the culture-industrial process. In television, Post Office officials, Post Office technicians, communications engineers, jurists, and administration specialists made the decisions as to what should be televised and how it should be done’ (187). In addition, television, the medium that was to become dominant in the family home, introduces new conditions of ownership. In the cinema, people pay a small amount to ostensibly rent film time or ‘temporal cine-space’ (Zielinski [1989]1999: 187). However, with the coming of the television, viewers were able to, on a large scale, own, rather than rent, the mechanism for producing media events and regulating time. In this way, viewers began to participate in what Zielinski calls the ‘audiovisual discourse’ and what, after Whitehead, we can understand as an entanglement of hardware conditions. Scenes are not reducible to the actual state of affairs dependent on the perceptual capacity of the human but are in fact produced by the elemental network of affordances built into technology at its beginnings (Ikoniadou 2016: 3–4). The events are produced by techniques of production and the conditions, often meaningless and invisible, which are their supports. In the next chapter, more time will be spent exploring the moments when television stabilized as a medium and viewers engaged more fully with the post-historical temporality of television. For now, in order to examine the effects of the apparatus, we turn to the performances in front of the camera and their transduction into scenes.

Experimental performances

In the words of a reporter for The Times, via the televisor it is possible to transmit and reproduce instantly 'the details of movement and such things as the play of expressions on the face' (The Times 28 January 1926: 9). This may, in some sense, be true. But the televisor, and all television to follow, required a great deal of support to be able to produce images through which viewers can be made to believe that the artifice is authentic. As mentioned earlier, movements needed to be paced to keep step with the apparatus (and of course you could not move out of frame) and performers had to wear thick make-up in order to define the features of their faces. Human culture needed to engage in a negotiation with 'the natural laws of media that it had brought forth' (Ernst 2016: 242).
This was true for both the first play to be broadcast in Britain, The Man with a Flower in His Mouth, and the first play to be broadcast in the United States, The Queen's Messenger, which went to air two years earlier. In both plays, performers were only seen from the shoulders up, with close-ups of their hands or props interspersed throughout the programme. In both plays, the time-based problems of delay and synchronization persisted: actors had to stay close to the cameras so that light was able to be picked up by the photoelectric cells, and in both actors had to wear heavy make-up and restrict their movements so that they could be recognized by receivers that were slightly out of synch with the transmitter and split the signal into scan lines that failed ever so slightly to line up. The medium continued to dictate what could and could not be made out on the small receiving screens. The Times featured the following description the day before the broadcast of The Man with a Flower in His Mouth:

The first television play, an adaption of Pirandello's The Man with a Flower in His Mouth, is to be broadcast by the B.B.C. from the Baird studios at 3:30 on Monday afternoon. The production marks an interesting advance in television, for hitherto the broadcasts have consisted of the head and shoulders of just one artist singing or a lecturer talking. In this play the head and shoulders will be seen of each of the three characters – the man, a customer and the woman – as he or she speaks, and not only will the faces of the actors be seen, but there will also be images of their hands, the gestures they make, and other objects illustrating the dialogue. (The Times 12 July 1930: 7)

The play was particularly apt for television as it 'has almost no action, demands no depth of perspective' and 'can be performed without grave loss though but one actor is to be seen at a time' (The Times 15 July 1930: 12). The cultural product fitted into the media temporality of early television and would give this phenomenon an aesthetic form for the first time. The television play was produced by Lance Sieveking, the English writer and experienced BBC radio producer. The cast of actors was Gladys Young, Earle Gray and Lionel Millard and the scenery was painted by the well-known artist C.R.W. Nevinson (Swift 1950: 46–47). Unfortunately, the depth of field provided by the Baird system was not up to the challenge of reproducing this scenery. It did, however, effectively represent gestures and the changing expression of the actor's face (Swift 1950: 47), as long as the actor moved slowly and amplified their facial expressions. As a reviewer in The Times stated, 'a swift movement would perturb the whole delicate affair and blur the screened image. When this actor's part has finished, the circular screen is passed over them and they fade out from view, replaced by another actor, who similarly negotiates between movement and technical constraints' (The Times 15 July 1930: 12).
Following Whitehead, the performance of a body becomes through an entanglement of a variety of mediated processes. The media temporality produced by the technical conditions for transduction and transmission was bodied forth by the actors. The hardware of the television imposes itself on the actors and movements are given their character based on what the camera can pick up as information. If, as Kittler has argued, the gramophone radically altered the discourse networks of 1900, due to its ability to preserve the materiality of speech, to record things as they happened, then the television even more profoundly introduced a material apparatus into the discussions of epistemology and media history. The television from the start imposed a great many constraints on what could be recorded. The gramophone recorded things as they happened, even those things beyond the thresholds of human hearing, but the television in some senses made things happen, it conditioned them, providing the context for togetherness in the first place. In this sense, the medium was synthetic; it synthesized things and people together and produced new experiences. These new experiences, however, were restricted by the medium's analytical characteristics, with both performance and viewing routines, as discussed above, conditioned by the way the pick-up apparatus segmented light and the way programmed content fragmented time into episodes.

In The Man with a Flower in His Mouth, only the head and shoulders of actors were shown and scenery and objects alternately took the place of the actors in front of the camera, always the same size and the same distance from the camera, with transitions provided by a circular screen that was lowered in front of the camera. Actors were made to stay relatively still, wearing heavy make-up before bright and imposing photoelectric cells. This led John Swift (1950) to sympathetically describe the broadcast as 'impressionistic', rather than 'realistic' (47). The proximity of the performers and objects to the camera was to do with the depth of field of the Nipkow-disc-based camera and the need for the performer to remain close to the intensely bright lights was so that the photoelectric cells would register the appropriate amount of reflected light. This is one way that the apparatus conditioned the context for togetherness. At this point, based on technical limitations, television found its medium-specific qualities and achieved its 'seeing up close', as a particular close inspection of objects and events first afforded by the technical qualities of the media event but soon enhanced by television's serialization, its place in living rooms, angled towards families intimately sitting in front of its screen. Television scholars such as Christine Geraghty (2009) and Karen Lury (2005) have previously pointed to this tendency of television to draw viewers into an 'up-closeness' with its texts. Before Geraghty and Lury, Zielinski ([1989]1999), approaching the medium with a different, more technically oriented, mode of analysis, argued that 'the new medium also stood for an important change in the dimensions in which movement appeared visually. The distance between the camera and the objects shifted [as compared to cinema] as did the spatial presence of the visible surfaces on the screen. Televisual seeing became, first and foremost, an act of near-seeing' [emphasis in original] (187).
Before Zielinski, Swift (1950) pointed out that:

The intimacy of the close-up is one of the essential ingredients of the medium. It is a thing that is more easily experienced than described. Broadly, it is created by the presentation of programmes of all types not for mass audiences of hundreds, but for thousands of separate audiences which may number anything from two to a dozen or more […]. There is a vast difference between a presentation that would be acceptable by a Leicester Square cinema audience and that by the family group at home, unaffected by mass reaction. (133–134)

In full accord with the media archaeological approach, it can be suggested that perhaps this is more than just due to compositional decisions and is actually underpinned by the history of the medium as technology, which begins with the recording and transmitting apparatus and is then rehearsed further down the line as the audience, sitting up close to screens, receives the signal on televisors with very small viewing windows. As Swift explained, if you reckon the distance from the actor to the camera and from the screen to the viewer, both are only a matter of feet away from one another. Seeing at a distance (the literal meaning of television) was supplanted by a seeing at close proximity. A particular style of viewing was established by this new medium which amounted to an up-closeness, which seemed to give the medium some of its earliest medium-specific qualities. In relation to The Man with a Flower in His Mouth it was reported that 'the visual transmission is far from perfect; you feel yourself to be prying through a keyhole at some swaying, dazzling exhibition of the first film ever made' (The Times 15 July 1930: 12). The technical conditions of the medium entered into the aesthetics of the audiovisual culture.

Both through its mode of transmission and its artistic content The Man with a Flower in His Mouth deals with what it means to be contemporary, as a particular mediation of being-with-time and the production of different temporal systems. It was an ideal play to be first staged on television not only for its simplicity but because its themes resonated so well with the new medium. At its heart, it is a play about time, given a new temporality via its broadcast and reiterated in its remake in 1967 and now in its ability to be accessed on YouTube. The Man with a Flower in His Mouth opens on a scene in a country bar around midnight, established by a painted canvas that can be vaguely seen through the segmentation of the scan lines.
In the bar, a man, later to be revealed as a dying man, meets a businessman from the city, who has missed his train. The businessman has little regard for time, allowing it to pass by, unable to keep to timetables. The man with the flower in his mouth, on the other hand, carrying with him the aftermath of his diagnosis, lives moment to moment, attempting to fill each one completely. For him, as he explains in the play, time is like a tuft of grass, with his remaining life able to be measured by the discrete blades. For the businessman, time is more like the indistinctiveness of a green field, where it is quite usual to miss the right moment to catch a train. In the transmission of the play, time and contemporariness are likewise major themes as the actors' performances are conditioned by the time-based requirements of the medium, including the pace of their gestures and their proximity to the photoelectric cells of the camera, along with the picture becoming vertically cut up, like blades of grass, due to imprecise synchronization between the transmitter and the receiver.

Many of the constraints that were to be found in The Man with a Flower in His Mouth were overcome as Baird further improved his television system. In 1931, Baird, attempting to improve the quality of his pictures, adopted the mirror-drum camera, invented by Weiller in 1889, in place of the Nipkow-based transmitter. Instead of a rotating disk, these cameras utilized a rotating drum with mirrors in place of the Nipkow disc's holes. With these new cameras the Baird system found a greater depth of field, the ability to switch between close-ups and long shots and greater portability, which led to Baird and his team's attempt to broadcast the Derby in 1931, which was somewhat affected by telegraph noise, but which ultimately led to their great success in 1932.

In another of the early television plays, this new-found resolution proved to be problematic. The Eve of St Agnes was broadcast in 1937 as a television experiment. The story, based on the poem by John Keats, was presented in mime while a narrator read the text. Much like The Man with a Flower in His Mouth, the experiment was reported to be generally successful, for a number of technically specific reasons. First, like The Man with a Flower in His Mouth, the choice of play to suit the limitations of the medium was essential; a number of the more elaborate passages from the original poem were left out. One reviewer noted that 'extreme simplicity was the keynote of the production, and rightly so, for in the present conditions of television any attempt at elaborate staging leads to a distortion and indistinctness' (The Times 29 October 1937: 28). Simplicity was one way to work within the restrictions of television, which The Eve of St Agnes accomplished. There was still the issue of reproducing faces on television screens, which was particularly problematic if the actors were far away from the camera, as was needed in this dramatic performance in order that the performer's body could be seen.
A solution was found. The actors wore masks. This worked well for the long shots, which now allowed viewers to feel as though they could recognize characters. However, due to the improved resolution, when actors approached the camera the faces of the performers, now seen in close-up, were reported to be 'grotesque' and 'out of place'. The content was not sympathetic to the medium and the coolness that McLuhan famously attributed to the televisual. The problem was that television 'viewers', as opposed to cinema 'goers', were usually asked to be actively involved in reconstructing the images on their screen. The images of actors, often unable to be reproduced in high definition, needed a viewer to reassemble the elements lost between the scan lines. The television 'only supplies a moiré pattern comprised of pixels that the audience must first decode back into shapes again in an active almost tactile way' (Kittler [1999]2002: 222). This coolness for McLuhan was in contrast to the hotness of cinema, where viewers more passively received the wide-screen high-resolution content. A spectator of cinematic content can be simply called a cinema 'goer'. They arrive at the box office and the cinema takes care of the rest. A television spectator is different. They are a television 'owner', who takes some stock in the television images that they help to reconstruct through their ownership of a receiver. The faces of The Eve of St Agnes, in grotesque close-up, must have seemed to run counter to this contract, as the small screen was filled with images that viewers felt uncomfortable to own. The images of grotesque faces in close-up heated the usually cool medium.

The work involved in composing and recomposing what was once referred to as the cool images of television involved an intense concentration on the compartmentalization of time, at the transmitting end, and the recomposition of pixels, at the receiving end. In his 1950 book Adventure in Vision, Swift describes the context for television plays such as The Eve of St Agnes:

The stranger, finding himself in either of the two studios during a programme from Alexandra Palace, could be excused for thinking he is in just another film studio at Denham or Shepperton, Elstree or Pinewood. Apart from the quarters being clearly cramped, there is little superficial difference – none whatever to the layman. There are scenery, props, microphones, odd-looking cameras, banks of fluorescent lights, a nightmare entanglement of cables and a lot of apparently dumb men wearing headphones. None of these men stops working every two or three minutes at the command of 'cut' because in television 'cut' merely means a rapid change from one camera to another and not a hold up in shooting as in films. Once the cameras go into action there is no pause until the end of the programme. (149)

Swift's description provides a picture of groups of men working frenetically with the odd-looking mirror-drum cameras in the tight spaces and bright lights of the studio. So much human work was expended by the producer, the actors and the crew to achieve the liveness and flow so celebrated by television critics ever since Swift's work. However, inside these odd-looking cameras the image was fragmented, with light measured at about a twelfth of a second, then transmitted to a receiver, which was put in synch with the camera by the discrete pulses of electricity that accompanied the transmission but which were kept undetectable to the viewer. While the production team worked hard to impose a flow onto the medium, at its base the technical media provide the preconditions for the future segmentation of time that now is such a large part of digital culture and its reduction of phenomena to data. Time was measured by the camera in increments.

This technical rendering of time on very small scales also had consequences for the way time was segmented during the process of production. The lighting engineer was required to know beforehand all the movements of the actor and the camera in order to light the scene differently depending on whether a long shot or close-up was required. The cameramen also had to keep track of time very carefully, with shooting scripts prepared that choreographed their movement. The producer, like a contemporary data management system, sits above the movements of these circuits in the machine, in the production booth – the 'control cabin set high in the studio wall and overlooking the whole area of the studio' (Swift 1950: 149).

Another example of television's very early structuring of temporality was the broadcast of The Queen's Messenger, the first American television play. This broadcast was more technically involved than the first British broadcast, involving three cameras, rather than one, and the producer switching between these inputs at carefully timed intervals. The Queen's Messenger in its formal presentation was similar to The Man with a Flower in His Mouth in the sense that only the heads and shoulders of actors were seen, occasionally interspersed with images of hands and scenery. Two of the three cameras would be focused on an actor, while the third would be focused on either an object or assistant actors who posed their hands as the scene demanded (Edgerton 2007: 35). Due to the use of the three cameras a body could be cut up in space, with the head and shoulders provided by the actor and the hands provided by the assistant, and reassembled through the rhythm of switching between cameras. The body is analysed by the medium's logic and then synthesized by the production team. In what is now relatively commonplace practice, the body parts of different actors are assembled into a new body of performance. The body of analytical media, dispersed in its immediacy, strung out over the set, is united once it is organized by the producer.
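Underneath this organizational work lies the fragmentation described above: light measured in increments of roughly a twelfth of a second and kept in step by discrete synchronizing pulses. A minimal sketch may make this concrete. The following Python fragment is purely illustrative and rests on hypothetical values (a 30-line picture, 12.5 frames per second and an out-of-band sync marker are assumptions, not specifications drawn from any historical apparatus): it scans a grid of brightness values line by line into a single time-varying signal, marks the end of each frame with a pulse, and shows how a receiver that simply counts picture points can reassemble the scan lines.

# A toy illustration of scanning and synchronization (hypothetical values).
# A "scene" is a grid of brightness samples; scanning turns spatial variation
# into a single time-varying signal, with a sync pulse marking each frame.

FRAME_RATE = 12.5        # frames per second, roughly a twelfth of a second per image
LINES, POINTS = 30, 30   # a 30-line picture, in the spirit of early mechanical systems
SYNC = -1.0              # an out-of-band value standing in for the sync pulse

def scan_frame(scene):
    """Scan a 2D grid of brightness values (0.0 to 1.0) line by line."""
    signal = []
    for line in scene:            # each scan line, top to bottom
        for brightness in line:   # each picture point, left to right
            signal.append(brightness)
    signal.append(SYNC)           # frame sync pulse keeps the receiver in step
    return signal

def rebuild_frame(signal, lines=LINES, points=POINTS):
    """Receiver: strip the sync pulse and reassemble the scan lines."""
    body = signal[:-1]            # everything before the sync pulse
    return [body[i * points:(i + 1) * points] for i in range(lines)]

# Example: a flat grey scene survives the round trip intact.
scene = [[0.5] * POINTS for _ in range(LINES)]
assert rebuild_frame(scan_frame(scene)) == scene
print(f"{LINES * POINTS} picture points per frame, "
      f"{int(LINES * POINTS * FRAME_RATE)} brightness samples per second")

If the receiver miscounts even slightly, the rebuilt lines shear sideways, which is one way to picture the synchronization troubles that shaped the performances described in this chapter.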

There are a number of conditions that govern the performances of the early television plays and that are embodied via the gestures of television actors.

The space in which the actors have their being (to say 'in which the actors move' would be to suggest too great a freedom) is half of a small cube, sliced through on its diagonal. Even the full beam of the light projector is not theirs to play in, for the effective plane of the photo-electric cells cuts it diagonally and cuts off the actors' retreat. They must stay very close to the projector. (The Times 15 July 1930: 12)

Maintaining this rigidity in their small space and a proximity to the light, each actor plays their part 'like a man with his head in a band-box' (The Times 15 July 1930: 12). Not only are the actors asked to inhabit small spaces, they are also asked to inhabit small moments in time. In the case of the first television play, the actors and Sieveking, the producer, altered their practice to think in terms of fragments of speech, small scenes that the cameras changed between, and had to invent ways to give coherence to the play given the limits prescribed by the technical apparatus. The performances and the movements of the tools of production are the outcomes of developments made in photoelectric cells, the measurement of light by mirror-drum cameras and the transmission of signal via Hertzian waves. The multi-temporal solutions to the problems of television eventually crystallized a mediation of performance as actors occupied the small spaces of the studio and the conditions of engineered temporality. As Swift (1950) states, 'what he (the viewer) is seeing two, three, or maybe four nights a week is a continual progressing experiment towards a new interpretation of the dramatic art' (153). Swift realized this in 1950 and indicated that the television did not simply combine theatre, cinema and radio into a new dramatic art but offered something vastly different. It offered an experimental testing ground for performances. It offered, following Whitehead, a nexus, a sphere of togetherness, the production of different temporal systems, from which attributes emerged, marked most acutely by the restrictions and opportunities offered by analytical media.

5 Time and Contemporary Television

Moving on from the technical detail of mechanical television, this chapter begins to use the engineering developments related to fully electronic television to articulate the technologies of mass media with a media philosophy of time. In this chapter I continue my emphasis on television as an analytical medium – a medium that makes possible the close study of events, images and the performance of a body by introducing new temporal systems for image making – and begin to explore the moments when it began to stabilize as a technology. This investigation takes in the work of Alexander Bain, Vladimir Zworykin, Boris Rosing and Philo Farnsworth, among many others, and explores the way these pioneers of television technology grappled with questions of temporality and transduction. When the vacuum tube was replaced by the transistor, which would in turn be replaced by the integrated circuit chip, and when the printed circuit board made possible uniform production and performance, the television, in widespread domestic use, first introduced to domestic life the highly technical components that would come to define the contemporary electronic age. The argument of this chapter is that this produced a feeling of time which came from the technical ordering of events. This sense of time and temporality is on view most acutely in the media reportage of large-scale catastrophes and produces the affect – the feeling – of life in the aftermath of events. In order to explore this proposition, the chapter is organized into two sections:

1. Like the previous chapter, I am interested in seeing how the history of television technology is reactivated in contemporary forms of television that tend to produce, via the mechanisms of signal formatting, what I have been calling in this book the conditions of the post-historical. The chapter begins by exploring the television news coverage of the aftermath of the November 2015 terrorist attacks in Paris and the techniques and technologies involved in turning this event into a post-historical scene.

2. To begin to speak of time more fully, the conclusion of this chapter brings up some contemporary examples of digital television and explores its mediation of time in order to begin to build a philosophy of the present from the technical details given in the last two chapters.

As the first step in such a transmission, the space variations in brightness from point to point in the view must be translated into time variations in an electrical current that can be sent over the channel of communication.
Frank Gray, J.W. Horton and R.C. Mathes, 'The Production and Utilization of Television Signals', p. 560

Media mediate change and are therefore the material form of time.
Sean Cubitt, The Practice of Light, p. 257

For those of us interested in speaking of our own time, in as much as it has changed from industrial and historical time, and trying to explore what it means to be with this time, it is media apparatuses and their modes of organizing temporality that need to be constantly brought up and constantly taken apart. As Jacques Derrida ([1996]2002) points out in the opening to his book on television, written with another great philosopher of the medium Bernard Stiegler,1

Today, more than ever before, to think one's time, especially when one takes the risk or chance of speaking publicly about it, is to register, in order to bring into play, the fact that the time of this very speaking is artificially produced. It is an artifact. In its very happening, the time of this public gesture is calculated, constrained, 'formatted', 'initialized' by a media apparatus [emphasis in original]. (3)

In the previous chapters I explored how time has been engineered by media apparatuses, including photography, television and the computer database. It is well known that media alter conceptions of space due to their qualities as transmitters. A wide range of technologies from the telephone to the Internet demarcate and alter the version of social, shared spaces. But transmission media also produce time, through their modulation of signal and reordering of events, as well as the storage of signal, which itself is a special case of transmission, over time rather than space. Derrida points to precisely this condition where 'actuality' is fashioned by the technical operations – the 'formatting' and 'initialization' – of communications channels, which are selective. He states, 'no matter how singular, irreducible, stubborn, distressing or tragic the "reality" to which it refers, "actuality" comes to us by way of a fictional fashioning' (3), which is based on the way the artefacts of media filter discourse before they are transmitted.

On Sunday, 15 November 2015, two days after the tragedy in Paris, UK's Channel 4 newsman Matt Frei stood in front of a camera in the Place de la République. The Channel 4 coverage of the aftermath of this event had been carefully organized into a 'flow', as Raymond Williams first described it.2
This, however, was not a flow that pointed towards the future but rather a flow of information designed to extend the aftermath of the original event, to repeat its significance by repeatedly folding the past into present realities. Carefully timed transitions took place from locations around Paris, where reporters described the new contexts produced by terror. This 'flow' undermined the discrete character of the transmission; it assembled pieces of information into a whole. But 'breaking' news disrupts the flow and television returns to its analytical roots (Ernst 2016: 153). Only a short time into Frei's report, the crowd panicked and ran wildly in all directions, with Frei and the camera struggling to grasp the event, jostling with the crowd. The flow was replaced by a more complex mixture of movement.

In the aftermath of the terrorist events of Friday evening, aimed at television screens, Frei was apparently reporting on the memorial activities of the crowd of Parisian mourners. As post-history becomes written by the television, and archived via video servers, these types of scenes in the aftermath of terror have become ubiquitous. What was unique about Frei's coverage, however, was the moment of fear that it captured and the temporality of chaos and the contingent that it represented, as though a crescendo had been reached. There both was and was not an enemy in the image. We are waiting for what terrifies us, for something or someone to arrive. Television coverage sets up this waiting time. There was no terrorist but what was palpable and agentive was the threat of an enemy in the image of mourners on the 'home soil' of Europe (Hoskins and O'Loughlin 2010: 157–159); the threat erupted through the waiting time. The speculative, the invisible, but very real element provided the conditions for the possibility of experience. The multi-temporal events, ones which came from the experience of being in multiple times at once, circulated around the black hole in the ground. The history of Paris, the violence, the military occupations of the past, provided a sense of being-in-time, the time of history. The anticipation of the future, the prehension of what is to come, the waiting, provided another. The history of terrorism, global scars that the west lives with, the aftermath, tangible, present signs of trauma, provided yet another. All of these times are folded into the present. All of these times are re-read through the prism of the unfamiliar, the monstrous present. This condition produced the catastrophe: a violent, sudden turn, which lasted only seconds, giving the body over to a vibrating tension of nervousness, which was captured by the broadcast news crew.

'I think I heard a gunshot…but I'm not sure if that's true'. When the crowd turned and started running, Frei, swept up in the event, the professional reporter taking on the role usually reserved for the citizen journalist, scrambled for a sense of the scenic significance of the moment. The television apparatus was still trying to make a scene out of the scrambling crowd, fleeing what they heard as a gunshot. We were no longer waiting.
The event of terrorism, the event that was ongoing days after the shooting, turned into a scene by the ritual of the television report, became again, before our eyes, an event. Fear, chaos, uncertainty and the unexpected arrivant were again mobilized in the image. Everything unexpected is terrifying. But only these unexpected events are able to transform our current forms of waiting (Flusser [1983]2013: 122). We wait for what terrifies us. 'What is going to happen? What just happened? The agonising aspect of the pure event is that it is always at the same time something which has just happened and something about to happen' (Patton 2009: 42). As Paul Patton writes, instances of becoming such as these events break through into history. The catastrophic, the violent, break the surface of the analytic organization of scenes. 'They exhibit the hermeneutical sublime in the highest degree insofar as they realize the potential to break with existing frameworks of understanding' (Patton 2009: 43).

But it was not long before television dealt with the eruption of the event. Electronic media, the rituals of reporting, the negotiations between the technical image and the storytelling practices of journalists take history as an input and give us post-history as an output (Flusser [1983]2013: 57). The violent turning of catastrophe was in no time at all again turned back into the post-historical flow of scene after scene. Back to Jon Snow. Events turned back into programmes. The ecstatic time of the emergency returns to the segmentation of one discrete thing after another (Peters 2015: 242). But we still wait for more terrifying news. We look for what terrorizes us. We continue to wait. We balance the terror under which we live with the television. It is not the liveness that offers the feeling of cohesion and togetherness (Couldry 2004) but the aftermath of the event, replayed, now stored on YouTube in a relational structure.

In a way that runs counter to arguments about the medium's 'liveness', which have dominated English-speaking television theory since Williams's work, McLuhan argues, as we see in this example, that its use is premised on a desire to experience the aftermath of an event, not the event itself. This is the defining character of the post-historical. McLuhan uses a number of examples to argue this point. One of the most relevant in this case is that when children's eyes are tracked using head cameras, researchers see that they do not look at the violence of a Spaghetti Western but at the facial expressions. They look for the character's reaction to the gunshot, not the gunshot itself (McLuhan 1964: 341). In Frei's report, there was no gunshot but a swarming, buzzing reaction. The television's role was not to capture the reality of the event in its liveness but rather to turn what Lacan once called 'the real' into a scene, to look to reactions, able to be grasped in its aftermath. For a moment, Frei, almost knocked over, but continuing to report on the uncertainty of his surroundings, offered viewers the image of the event. Shortly after the initial panic, the coverage returned to Jon Snow, the main news anchor, reporting from Notre Dame, to quite obviously give an authoritative voice to the contingent goings-on, to again enact the media rituals associated with reporting terror.
It was here that the coverage returned to what Kirsten Mogensen (2008) described as the norms of reporting at a time of crisis. The television, as a type of 'first aid' in the face of terror, attempts to 'at the same time stimulate rational thinking and limit the negative stimulus of the feelings and senses' (36). It acted both as a witness and, particularly as it was reporting on 'home soil', a way of comforting the witnesses of the shared traumatic event. It attempted to maintain something of an ordered society during times of catastrophe. This is why, in the Channel 4 coverage, television, in the face of chaos, returned to its ordering of the event in almost no time at all. After an initial shock, where liveness was in excess, the present was reordered, filtered, by the 'formatting' of television and returned to its status as a scene. We continue to wait for more information, more images. The waiting time, the togetherness, the contemporaneity, the being-in-waiting-time, is set up by the programming of the apparatus of television, both in terms of the practice of programming media events and the technical operation of the transmission medium.

'When we get close to a screen we see points – pixels' (Flusser [1985]2011: 34). Television images are not in fact images at all, but rather the results of chemical or electronic processes. The output of technical processes, the image amounts to an arrangement of pixels, particles or microelements of time and it is through this process that images become, again according to the media philosophical reflections of Flusser, post-historical. They are particles of information organized by programmes. And it is in this sense that the television image, as McLuhan famously argued, takes on a sculptural rather than a photographic character. There is no stable image on television, as there is in film, but rather the varying intensity of light is produced as electrons hit the screen and sculpt the image over time. This is the condition for waiting. This is the collective condition of waiting for terror. In the television we see what McLuhan would call both the figure, the topic of reporting, and the ground, the condition for this figure. We see the figure: the image of the crowd, Frei knocked out of the way. We see the ground of fear and threat and the anxiety of that which will not last and of that which is to come. What happened? What is about to happen? We see the aftermath of terror and also the post-historical sculpting of a scene.

The filtering of television, when the medium is viewed as an analytical apparatus, amounts to a process where the time of events is reorganized to fit within the logic of the televisual apparatus and media ritualistic (Dayan and Katz 1992) modes of representation. Television theorists have been telling us this for decades. But what now becomes apparent, when we apply the media philosophical approach to this phenomenon, is that this is an effect not only of the programming routines of the industry but may be an inexorable result of the technical code built into television systems themselves.
It is clear that this is very different to what was once called historical time. Events are cut up, organized into parcels of information that are able to be repeated depending on the attention spans of the viewing public. Even though McLuhan argued that television in fact was not an analytical medium and actually reversed the analytical function of the alphabet to deliver viewers into an acoustic space, it can still be seen that the television, as a medium that punctuates life, does in fact carry out the function of analytics. It is based on the deep logics of media that fragment an event in order to measure it. The time-critical medium of television brought a radical fragmentation to the world that resulted in new rhythms and new modes of contemporariness, as the image of an event in time reveals itself as not only able to be segmented into temporary moments but also temporary particles within moments. As Zielinski states,

All techniques for reproducing existing worlds and artificially creating new ones are, in a specific sense, time media. Photography froze the time that passed by the camera into a two-dimensional still. Telegraphy shrank the time that was needed for information to bridge great distances to little more than an instant. Telephony complemented telegraphy with vocal exchanges in real-time. The phonograph and records rendered time permanently available in the form of sound recordings. The motion picture camera presented the illusion of being able to see the bodies in motion that photography had captured as stills […]. Electromechanical television combined all these concepts in a new medium, and electronic television went one step further. ([2002] 2006: 31)

The technical operation of the analytical medium, breaking images into particles that can then be rearranged, plays out not only on very small scales but is also echoed on the larger scale of lived experience. The pixel, the particle, the bit have emerged as cultural artefacts of our time, with which digital memories are being written. As Ernst has argued so convincingly, the aesthetics of storage have radically shifted in the media technical field, with an emphasis on almost immediate reproduction and recycling rather than emphatic cultural long-time memory. 'This change in archival logic corresponds to a technical discontinuity: the physics of printed or mechanical storage media set against the fluid electromagnetic memories' (Ernst 2013: 95). The non-discursive, volatile electronic charge replaces the symbolic inscription practices that were once so dominant. In this context, it is the technical conditions for storage, the materiality of the computer, the very conditions of fragmentation, organization and (limited) preservation that give character to the types of history able to be written from the memory of twenty-first-century media.

The scenes of terrorism are deeply post-historical. They have no history, not because they are insignificant. On the contrary, they are both significant and traumatic events, whose effects percolate through many other events in countless ways. And it is indeed because of their post-historical character as scenes that they become traumatic on a global scale and begin, through this ongoing percolation through other events, to generate conditions for the possibility of experience. These scenes (and again I am using the term in the media philosophical sense given to it by Flusser) are post-historical because they occupy a time organized by analytical apparatuses which do not operate under the same cause and effect relationships as the history once supported by the inscription methods associated with the printing press and linear writing. As Derrida argues, 'What would "September 11" have been without television? […] the real "terror" consisted of and, in fact, began by exposing and exploiting, having exposed and exploited, the image of this terror by the target itself' (Derrida in Borradori 2003: 108). The events are no longer aimed at the future, but aimed at an apparatus that, controlled by the target, controlled by the United States and its allies, causes the image of terror to begin to appear. The images, coming from the television and from YouTube, terrify us. In Flusser's ([1985]2011) words, 'current events no longer roll toward some sort of future but toward technical images. Images are not windows: they are history's obstruction' (56). This apparatus – to bring again to the reader's mind Courbet's image of the tomb – sucks events into itself and repeats them in an aftermath of the original.

The post-historical, following Flusser's formulation, is produced by programmes, including the programming of television sets, rather than written texts. Once it has happened, once the wait is over, these programmes repeat the image of terror and cause a new condition of waiting. 'What we call "history" is the way in which conditions can be recognised in linear texts' (Flusser [1985]2011: 58). The apparatus of print projects itself onto the events of the world and thus projects its own linear structuring of these particular situations. Every event of history, represented on a line, is a unique occurrence that need not repeat itself. Analytical media now carries out a similar operation with completely different results. The programme of what Flusser calls the 'technical image' projects itself onto situations and turns events into 'infinitely repeatable projections' (Flusser [1985]2011: 58). And it is within these repeatable projections, as events continue to occur on screens, that a condition of the aftermath, the traumatic, the blocked up and the repeated, is produced. In a way that resonates deeply with the aftermath of terror, Flusser argues that 'all events are nowadays aimed at television screens, the photograph, in order to be translated into a state of things. In this way, however, every action simultaneously loses its historical character and turns it into a magical ritual and an endlessly repeatable movement' (Flusser [1983]2014: 20). The apparatus and the receivers of its images try hard to translate the images of terror, to make them graspable, understandable, within familiar temporal systems.
But when these images are then turned into scenes, they lose their historical character as they are mediated by the post-historical apparatus. The television image, following Flusser, exists both in and out of time. On a technical level, the event, Frei rushing out of the way of a terrified crowd, once transduced into a value, is able to be represented as a string of time-discrete bits of information. The time of the national disasters documented, circulated and repeated, such as the tragedies in Paris, 9/11 and the London Bombings, is uncanny and upsetting, jolting viewers out of time. In all these cases, technical images indeed place events in time as either citizen journalists film with their smart phones or reporters leave the studio to go out 'on location' in 'real-time' to document and archive signal from the world. But these technical images ultimately produce a cultural experience of being out of time, as the signal originally recorded is continuously disoriented from 'real-time', repeated, remixed and rearranged based on the apparatus of journalism (Allen and Thorson 2009; Hoskins and O'Loughlin 2010; Freedman and Thussu 2012).

This is certainly not to repeat the 'hyper-real' arguments of Jean Baudrillard, who seemed to only understand the television as a representational medium based on reducing it to a language and the theoretical logic of signs, consequently missing the importance of the media technical context and the elemental networks of human and non-human systems. Baudrillard focuses on semiotics; media philosophy focuses on data processing. I would argue that, following the media philosophical imperatives given to us by Kittler, Flusser and McLuhan, thinkers who all understood acutely the importance of media technology as media technology, it is not the relationship between mediated spectacle and so-called immediate reality that should be explored. This would only provide part of the story and would blind us to the way time is modulated by signal processing and data management. Instead, it is the medium's function, its method of organizing data, that needs to be brought into view, which can never be separated from 'culture' or the 'real life' that likewise requires our close attention. All forms of media, from print to television, have embedded within them ways of organizing data that, because of their status as the intermediaries between events, the punctuation between sentences, processes and beings of all kinds, give temporality to the world. The events are certainly real and the aftermath of these events is more than just media spectacle; it is something we can intimately experience and conceptualize. The television, its ordering of the contingent, does not create simulations without originals but instead creates the conditions for the aftermath of the event, in both a wholly transcendental and wholly empirical sense. As will be discussed in what follows, a time of the aftermath is created by these technical images,
which, like Frei’s report that could not get away from television’s grasp, via a framing of the world based on their analytical function, give form to what was once referred to as a ‘world picture’, evoking the experience of being in the present in multiple types of history.

Electronic television: Some technical notes

The idea of scanning an object in order to transduce it into signal and transmit it as electricity was first published in 1880 in La Lumière Electrique by Maurice LeBlanc (Abramson 1987: 11). The first person to move this idea from a suggestion to an achievable task, as mentioned in the last chapter, was Paul Nipkow, who in 1884 patented a disk that had twenty-four holes in a spiral pattern spaced at regular intervals around the disk. Using this system, he supposed that a scene could be carefully inspected point-for-point and the time-varying brightnesses at these points could be transmitted as pulses of electricity (Webb 2005: 7). Light from the scene to be scanned passed through the perforations in the disk onto a selenium cell. At the receiving end, another disk, illuminated by a light source, would rotate in synchronization with the transmitting disk. Although this device was never built by Nipkow, he showed effectively how, by using his disk mechanism, an image could be systematically scanned and separated into its 'elemental point' (Abramson 1987: 13–15). The careful dissection of an event, separating it point-for-point, was a technique both technical and cultural that would come to define television images such as those described above, where the technical operation of the television supports the cultural experience of both intense presentism and dislocated waiting.

Another major accomplishment that made possible the fully electronic television was the development of the cathode ray tube (CRT). The discovery of electrons by J.J. Thomson and the application of his insights in the CRT meant that a great number of experiments were able to be undertaken with this new light-emitting device, which could operate much more efficiently than the earlier slow mechanical systems, discussed in the previous chapter. In 1909, the German Max Dieckmann built a device for transmitting images that could be received with a cold cathode Braun tube. The transmitter was built using twenty wire brushes on a disk rotating at 600 rpm. The brushes contacted a metal template of the object to be transmitted. The device contained no photoelectric cell, but operated like the pantelegraph mentioned in the previous chapter by physically contacting a surface. The signal was sent to the cathode tube and the beam of electrons was deflected when the brushes made contact with an object and formed no luminous spot on the screen. When no contact was made between the brushes and the object, the beam was normal and produced a luminous spot on the screen (Abramson 1987: 33).
was normal and produced a luminous spot on the screen (Abramson 1987: 33). In Dieckman's invention, information was transmitted as a series of on or off signals, which were reassembled to form an image. Although this system, without a light-sensitive cell, amounted to a telegraphic system, it showed how the analytical logic of the coded telegraph message could be applied to multimodal information. In the same year, Ernst Ruhmer gave a demonstration of a device consisting of a mosaic of twenty-five selenium cells, used to transmit simple geometric figures (Abramson 1987: 33). Neither of these inventions was, of course, television proper, but they began, following the arguments of Chapter 3, to produce experimental theories of a world able to be understood as images transduced into particle elements. Images could not only be mechanically reproduced but transmitted through space. In 1911, the Russian physicist Boris L'Vovich Rosing set up a Braun tube linked to a mechanical camera and transmitted the first live images. Television, the transduction of a scene by precise point-for-point measurement of light, had arrived in electronic form. Although the screen could become all-electronic, the light-sensitive camera remained one of the major problems that needed to be overcome to deliver all-electronic television. As with the problems of synchronization and delay that were solved in the mechanical era, this was a time-based problem. Specifically, it was a problem with storage. Early cameras based on the Nipkow system of scanning a scene point-for-point suffered from insufficient sensitivity to light. In 1922, so the story goes, on a classroom blackboard in Idaho, the fourteen-year-old Philo Farnsworth made great advances in the transduction of a scene for transmission (Barnouw 1990: 77). Farnsworth sketched out plans for a camera system to replace Nipkow's. This young boy, without much formal training in high-level engineering, described a system in which an electronic tube would convert light into a commensurate electronic charge. Farnsworth had replaced the Nipkow disk with electromagnetic scanning, with the electron beam controlled by magnetic forces. This remarkable discovery was patented five years later and would replace the mechanical scanning then in use. However, to perfect all-electronic television and make it commercially viable, the issues around the light sensitivity of the camera still needed to be solved. The figure who probably did the most over his long career for fully electronic television was Vladimir Zworykin. Attending the College de France from 1912 to 1914, Zworykin would, after his physics lectures, tinker with electronics and radio receivers. After Zworykin left Paris and moved to Germany, war broke out and he returned to Russia, where he worked as an engineer for the Russian army; this put him in contact with the ideas being developed by the Marconi wireless company (Webb 2005: 25). After leaving Russia during the Civil War, Zworykin ended up in America working for the Westinghouse Electric Company, which employed him to work on radio electronics. In his spare time,
Zworykin was tinkering, much like Baird, with Nipkow disks and, importantly, vacuum tubes, combining the mechanical operation of the disk with the light produced by the flow of electrons in vacuum tubes and controlled by magnets. One of Zworykin's major accomplishments, following on from this early work, was his organization of the timing of the flow of electrons in his storage camera. Zworykin, solving one of the most persistent problems of television, made it possible to store light for fractions of a second in order to arrive at a better quality image. Whereas in photography the exposure time could be extended to overcome problems with the sensitivity of film, with the moving instants that television tries to capture through its radical and very fast cutting no such extended exposure was possible. The camera had to transduce and transmit what was happening at the moment it happened. At first, the only way the light-sensitivity problem could be overcome was by flooding the scene with intensely bright lights and thus causing objects to reflect more light into the camera lens. The time-based problems of the camera were instead overcome by storage, a technique which has come to characterize the contemporary archival media culture that has been described so precisely by both Kittler and Ernst. Zworykin discovered that a far superior quality of image could be gained by projecting light onto a mosaic of insulated cells, each with its own capacitor able to hold and accumulate charge for fractions of a second.3 An electron beam then scans each cell, in effect picking up its stored information. Richard Webb (2005), one of the key figures in the later development of television, describes the technique thus:

when a television camera scans through roughly 250,000 picture elements in a scene, there are 249,999 units of time when each picture element is on its own and could be converting light energy into an equivalent electric charge, accumulating it for release at the next moment the scanning probe arrives. The size of the charge would then be that many times larger than what a nonstorage camera could produce. This is really an increase in exposure time for a television camera. (30)

The camera developed at Radio Corporation of America signalled the beginning of fully electronic television by using capacitors to store energy. Accumulation and release became pervasive techniques (the principle is sketched schematically at the end of this section). The micro-delays that media theorists such as Ernst (2013) and Hansen (2013) would later alert us to in relation to the digital electronic computer were first made operational in mass media by the television camera, as this potential for storage became a major part of the transduction of signal, with importance both technical and cultural. Waiting for very small delays, the engineering of time as able-
to-be-stored, the ordering of the contingent, was built into television at its developmental stage. With Zworykin's use of the capacitor and the gradual development of the use of the CRT, fully electronic television had begun to stabilize as a medium and gather together the earlier experiments of Alexander Bain, John Logie Baird and Herbert Ives, amongst many other inventors, experimenters, amateurs and enthusiasts. At this point, the world, or at least an image of the world, became describable as discrete particle elements. Mathematics and information theory began to order technical images, and thus the theories of what constitutes the 'real'. The technical image of television amounted to a segmented image, broken up and stored in the camera. The version of the 'real' that it supported was one of discrete moments able to be archived, however briefly, in electronic circuits. As Kittler argued, after these technical shifts, only that which is switchable is able to be described as 'real'. The storage function that pervades media culture began with the work of Ewald Georg von Kleist and Pieter van Musschenbroek, who discovered and refined the function of Leyden jars, later to be applied in the research of F.C. Williams, Tom Kilburn, John von Neumann, J. Presper Eckert and John Mauchly, as well as in the research at Bell Labs that led to the development of capacitors. However, the cultural technique of storing microelements of time was first given form in a way that was to become mass media, experienced (though not always consciously) in the workings of the television. Although in its transmission it operated as a time-continuous medium, transmitting signal as continuous frequency variations, in the process of transduction the medium was analytical, separating images into points, sampled and held. It is in this sense that the television, in its material function, offers one of the first technical images of the contemporary, both in and out of time; intensely, as mentioned earlier, focused on the present but also waiting, out of time.
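Since the argument leans so heavily on the two principles just described, point-for-point scanning and the accumulation of charge between scans, a minimal computational sketch may help to fix them. It is illustrative only: the scene, its dimensions and the brightness values are invented, and it reconstructs no historical apparatus, neither Nipkow's disk nor Zworykin's camera. It simply compares an instantaneous, non-storage sample of each picture element with a sample allowed to accumulate for the whole interval between visits of the scanning beam, which is the arithmetic behind Webb's figure of 249,999 units of time.

# A schematic sketch, not a historical reconstruction: point-for-point scanning
# of a scene, with and without the storage (accumulate-and-release) principle.
# The scene and its dimensions are invented for illustration.

import random

WIDTH, HEIGHT = 20, 15                      # invented 'elemental points' of the scene
ELEMENTS = WIDTH * HEIGHT                   # picture elements visited in one scan

# Stand-in scene: a brightness value between 0.0 and 1.0 at every point.
scene = [[random.random() for _ in range(WIDTH)] for _ in range(HEIGHT)]

def scan_without_storage(scene):
    """Sample each point only at the instant the scanning beam reaches it."""
    return [brightness for row in scene for brightness in row]

def scan_with_storage(scene, dwell=ELEMENTS):
    """Let each point accumulate charge for the whole inter-scan interval,
    releasing it when the beam arrives; the signal is roughly `dwell` times
    larger than an instantaneous sample (Webb's 'increase in exposure time')."""
    return [brightness * dwell for row in scene for brightness in row]

instant = scan_without_storage(scene)
stored = scan_with_storage(scene)
print('gain from storage:', stored[0] / instant[0])   # equals WIDTH * HEIGHT

The point of the sketch is only to make Webb's arithmetic visible: the gain from storage is simply the number of picture elements that would otherwise sit idle while the beam is elsewhere.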

Transduction

Why spend the last three chapters going into so much historical and technical detail in a book whose aim is to make an intervention in the media philosophical analysis of contemporary culture? The intention was not simply to provide a tapestry of scenes but rather to illustrate in very concrete ways the material processes of transduction. The purpose of the technical detail was to offer a way to consider the technologies of culture in their entelechy and not in their static state. It was a way of showing how the mass synthetic media of the twenty-first century has, as was mentioned in the Introduction, folded within it analytic operations that are themselves much older. In the case of
chronophotography, time-continuous functions were converted into time-discrete samples. In the case of experimental television, time-continuous functions, in the form of light and sound waves, were converted into the analogic time-continuous functions of frequency modulations. The events that were picked up by mechanical television, as has already been shown, had to be slowed down and often repeated; if sound was to be transmitted in addition to images, it had to be done separately. In both cases, the media function of transduction reorganized the temporality of events and turned them into scenes able to be projected on to the world. Analytical photography quite obviously segmented movement in time and reduced a time-continuous function to samples that could be represented and handled as data. The television, due to restrictions in bandwidth and other limitations to do with the pick-up medium, also introduced a radical cutting based on its hardware conditions. This occurred first on the rooftop of Baird's workshop, as the first transmissions were limited to a repeated two minutes of action, once for image, once for sound. It also occurred later as the image was segmented into pixels, which broke it up in both space and time. Much like these early experimental beginnings of mass media, the technical function of transduction continues to be important for a media philosophical tradition grappling with the production of time in contemporary media culture and the intense focus on being-with the present that seems to characterize the post-historical. The digital computer works by converting time-continuous functions into time-discrete values. In the previous chapters, I have tried to show that the temporality that underpins digital culture – a discrete engagement with the present – is not only produced by this mathematical function of computers. In multiple cultural operations, including disciplines such as statistics, medicine and biology, the process of segmenting the world in order to measure and understand it took hold most profoundly in the nineteenth century. This technique was also coded into what would soon become mass media around the end of the nineteenth century and the start of the twentieth century. Chronophotography was used by scientists to analyse movement as stills in time. Not only did this have epistemological effects, leading to a universal method of graphing movement, but it also had effects on the experience of people subjected to this medium, seen most profoundly in the time–motion studies of Taylor and in Londe and Charcot's chronophotographs. The television, at the experimental moments in the medium's history, was used to segment information into elements able to be transmitted to receivers. By doing this, it delivered the first functional model of Shannon's famous mathematical theory of communication. However, in order to achieve this, a great deal of work went into organizing the temporality and rhythms of transmission, including the synchronization of transmitter and receiver and the elimination of delay. These time-based solutions had real effects on the performers in front of the
camera. Both experimental television and pre-cinematic technologies had the effect of turning those who were subjected to the medium into (pre)digital subjects, with their movements cut up into data. What Shannon was so instrumental in showing was the way that the major communications inventions of the telephone, television and radio could be reconsidered not as devices that deliver meaning but as devices that transmit information, which could be figured in mathematical language. The voice of the telephone and radio, the audiovision of television, was able to be reduced to information problems and, as Shannon so importantly showed, information could be understood and measured as bits when samples were taken of a continuous waveform. It is not until around half-way into his famous mathematical theory of information that Shannon (1948) begins discussing the function of transmitting and receiving apparatuses as 'discrete transducers' (399). In his own words, he endeavoured to 'represent mathematically the operations performed by the transmitter and receiver in encoding and decoding the information. Either of these will be called a discrete transducer' (399). The transducer, in Shannon's formulation, acts as a kind of black box that takes a sequence of input symbols and represents these as a sequence of output symbols. This transducer may have an internal memory, so that its output depends not only on the present input symbol but also on its past. The transducer, based on its current state and also its past states logged in its memory, takes source data and produces an output. The transducer codes a given input and, for Shannon, the important task was to work out how best to design transducers for the possibility of noiseless (i.e. efficient) transmission (a minimal sketch of such a transducer is given at the end of this section). The concept of transduction, in the philosophical formulation offered by Simondon, who sees it as a process at the heart of his concept of individuation, is in some respects quite different. For Simondon, the output of transduction is the result of a series of transformative operations (Mackenzie 2002: 46), and the concept is used to think about the process of becoming, the process that both Deleuze and Whitehead formulated so forcefully as the function whereby objects are formed via process: a process of information processing. Simondon opens Shannon's black box to philosophical reflection. Simondon offers a way to think of media transduction that replaces Shannon's emphasis on the medium as a communication tool with an emphasis on the aesthetics of media and their role as an environment. And by doing this, by moving the media technical concept of transduction into the media philosophical realm, we are offered a way to reconceive the problems of technological determinism previously raised in the field, which have percolated throughout this book. A number of examples demonstrate this. As set out in L'individu et sa genèse physico-biologique (The Individual and Its Physico-Biological Individuation), in Simondon's hands the existence
of a brick, as a durable, tough, weather-resistant object, is the output of a process of transduction involving clay, moulds and craftsmanship. The clay, with its own properties and potential for 'taking shape', such as its water content and its archive of minerals and plant life, comes into contact with a mould, which likewise involves a set of potentials for shape forming, including elements such as the wood species of the mould (Chabot [2003]2013: 76–77). The process of change, of capturing form, happens at the molecular level as forces of energy are converted into material through their interaction. An intermediary between clay and mould begins to take form, the point at which these two realities, that of the clay and that of the mould, interact. The media at play in this process involve not an object but a process: the process of the clay being pushed into the void of the mould; a process where energy is transferred by the craftsman or craftswoman to the clay and, as the clay is pushed into the corners of the mould, stopped by the wood. This is the nexus, the togetherness, that Whitehead describes, the concreteness that comes before objects. The mould does not impose form on the matter of the clay, but rather is an element in a larger scene that involves a translation between different states. In Simondon, the individual is always produced by a process of transduction in which it embodies transformative operations, whether this be a brick, a seed crystal or a human subject (Simondon 1992). After the reformulation of transduction offered by Simondon, it can be suggested that the qualities of objects (as processes) are not ascribed by a self-contained entity, nor are they attributed an a priori identity in their own right. Instead, they come to be based on a set of protocols, networks, relationships, internal properties and the other objects, both virtual and actual, with which they form an assemblage. Following Adrian Mackenzie's reading of Simondon, transduction is a concept that can be used to grasp the way that living and nonliving processes differentiate and develop. In terms of an interaction with technology, he states, 'technologies are not a domain exterior to human bodies, but are constitutively involved in the "bodying-forth" of limits and differences' (Mackenzie 2002: 52). It is in this sense that Galloway and Thacker are able to argue that protocols, as sets of rules that organize the potential relationships that can be formed in a network, whether this be a technical network or any other network of objects, are modes of individuation that arrange relationships that remix both human and nonhuman entities (Galloway and Thacker 2007: 30–31). It is also in this sense that I suggest that this is not just the case in networked computer culture, but was in fact folded into the technology of mass media at its very beginning. The technical operations of the camera and the television, not simply technical infrastructure, are the material form of Shannon's information theory and are also the material processes that support a number of current
cultural affects, such as anxiety and a repeated aftermath of events. This anxiety, this melancholia, as something felt in the body as affect, rather than conceptualized, is itself a process of transduction as technologies and cultural techniques of reporting global media events act as transformative operations that bind people together, in a temporal system, as a viewing public. These are the temporal systems that Whitehead describes, discussed previously in Chapter 1, when he talks about the conditions for moving in and out of contemporariness with others. In his philosophy of transduction, Simondon significantly transforms the philosophy of individuation. A doctrine of individuation that holds that an external mould, condition or principle imposes individuation on the material individual is completely overturned. ‘By presupposing the hierarchical subordination of matter to a transcendent form, the constituted individual is considered to be explicable on the basis of a principle of individuation anterior to it. However, the presupposition of a preformed principle of individuation renders the becoming of the individual as a real process impossible to explain’ (Sauvagnargues 2012: 57). The condition of the post-historical, as a set of technico-aesthetic world picturing devices, does not impose a being contemporary on individuals. It is rather a condition for the processes of being to play out. It is not a formal cause exterior to real processes but rather a product of these real processes that, like Whitehead’s perishing actual entities, provides the conditions for the becoming of new actual entities in their place. ‘The creativity of the world is the throbbing emotion of the past hurling itself into a new transcendental fact’ (Whitehead [1933]1967: 177). To look to the condition of the post-historical is thus to look to the processes of transformation, the coding, the transduction of experience into conditions. The term ‘transduction’ comes from Latin transductionem or traducionem and signifies a removal, or a transfer. It is a combination of the words ‘trans’, meaning across, and ‘duce’, meaning to ‘lead’. Etymologically speaking, transduction amounts to the function of leading across a divide, of navigating differences. Like Hermes, the figure that Serres uses to understand communication in both ancient and modern worlds, transduction is an operation that takes place in-between being, a function that punctuates messages (as in both ordering messages in the linguistic sense and punctuating them in the sense that Roland Barthes gives the term in his famous analysis of the techniques of photography). It carries out both functions of ordering and emphasizing by turning events into scenes and making them accessible. In the sixteenth century, the term took on a new meaning and in a slanderous sense stood for a function by which one would be led astray. To ‘traduce’ was to mislead, to distort messages like Serres’ fallen angels. The Oxford English Dictionary defines the use of the term in the sixteenth century as ‘to speak evil of, esp. (now always) falsely or maliciously; to defame, malign,

slander, calumniate, misrepresent'. But the term also involves the transmission of traits through hereditary lines. It is the process that gives character to individual objects. The term can now be applied to the mass media techniques that are the legacy of analytical media: it is no longer the events or the world that are reported but the full stop, the aftermath, the punctuation between events, the individual objects where processes of transduction accumulate, which are turned into scenes. This is just like the performance of actors in early experimental broadcasts on mechanical television, who tried to maintain a dramatic performance in the aftermath of the technological necessities of the medium. It is also just like the Channel 4 News report given to us by Frei, where the aftermath of the event, the agonizing wait, which was briefly broken, was the subject of the footage. Analytical photography too, before this, looked into the full stop of movement and offered a view of events in which time nested and remained blocked, directed at apparatuses rather than towards a future.
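Before leaving the technical sense of the term, it may help to give Shannon's 'discrete transducer', discussed above, a minimal computational form. The sketch below is not Shannon's own notation and implements no particular historical coder; the alphabet, the states and the transition table are invented for illustration. What it preserves is the property the argument turns on: the output depends on the present input symbol and on an internal state that logs the device's past.

# A minimal sketch of a discrete transducer in Shannon's sense: a black box
# that maps a sequence of input symbols to a sequence of output symbols,
# with an internal memory so that each output also depends on past inputs.
# The alphabet and transition table below are invented for illustration.

class DiscreteTransducer:
    def __init__(self, transitions, initial_state):
        # transitions: (state, input symbol) -> (next state, output symbol)
        self.transitions = transitions
        self.state = initial_state

    def step(self, symbol):
        self.state, output = self.transitions[(self.state, symbol)]
        return output

    def transduce(self, symbols):
        return ''.join(self.step(s) for s in symbols)

# Example: a two-state coder whose memory is the previous symbol; it emits
# '1' only when the input changes, so identical inputs yield different
# outputs depending on what came before.
transitions = {
    ('a', 'a'): ('a', '0'), ('a', 'b'): ('b', '1'),
    ('b', 'a'): ('a', '1'), ('b', 'b'): ('b', '0'),
}

coder = DiscreteTransducer(transitions, initial_state='a')
print(coder.transduce('aabbbab'))   # -> 0010011

Simondon's intervention, in these terms, is to ask what happens inside that transition table rather than treating it as a sealed box.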

Punctuation

My students at the University of Glasgow no longer own televisions. Kittler was right. Media now only exist as a periphery to the electronic digital computer. Their ontology is a product of a larger information transmission system. This is the condition of 'media after media'. In one seminar I asked a group of second-year university students enrolled in Film and Television Studies: 'Hands up who still goes to the cinema on a weekly basis? No one? What about fortnightly?' A few raised their hands. 'Hands up who owns a TV?' Not a hand was raised. Of course they all own computers and they all watch films and television, but they do not own a television set or go regularly to the cinema. They don't watch live television, apart from a game of football at the pub, or at least that is what they told me. Perhaps this is better for them. After all, they are enrolled in an academic programme where their lecturers ask them to closely analyse the medium. To do this, they watch stored events over and over again. Their media literacy is based not on reading but on experiencing repetition and exploring the limits of this condition. As Walter Ong put it, 'literacy is imperious. It tends to arrogate to itself supreme power by taking itself as normative for human expression and thought. This is particularly true in high-technology cultures, which are built on literacy of necessity and which encourage the impression that literacy is an always to be expected and even natural state of affairs' (Ong 1986: 23). For Ong, these habits distort the potential for understanding human thought processes and history by, as he puts it, 'taking possession of the consciousness'. Literacy is now replaced by media literacy. Ong sees this as
secondary orality, where the television and the radio have replaced the written word as the information channel of western culture. But there is a deeper level, the level that first McLuhan and later Ernst alerted us to, showing all who cared about media that the alphabet was itself a medium for segmenting the world into discrete bits of code. This characteristic of the written word, in a culture of high-technology mass media such as televisions and computers, is not replaced but amplified, and what is normal and natural is the segmentation of signal in order that it may be processed and calculated. Derrida ([1996]2002), in an interview recorded as an experiment by the INA (Institut National de l'Audiovisuel) and later transcribed as the book Echographies of Television, speaks to exactly the condition produced by the analytic processing of signal by the television. Derrida reformulates Barthes' famous analysis of photographs, the 'this will be' and 'this has been', in order to think about the experience of being a subject filmed for television:

now, at this very instant, we are living a very singular, unrepeatable moment, which you and I will remember as a contingent moment, which took place only once, of something that was live, but that will be reproduced as live, with a reference to this present and this moment anywhere and anytime, weeks or years from now, reinscribed in other frames or 'contexts'. A maximum of 'tele', that is to say, of distance, lag, or delay will convey what will continue to stay alive, or rather, the immediate image, the living image of the living. […] precisely because we know now, under the lights, in front of the camera, listening to the echo of our own voices, that this live moment will be able to be – that it is already – captured by machines that will transport and perhaps show it God knows when and God knows where, we already know that death is here. (Derrida and Stiegler [1996]2002: 38–39)

The television image becomes, in Derrida's terms, a simulacrum of life, but one that always brings with it death. Moment to moment, via the lags and delays that television continually replays, there is a presupposition of the point at which the subject will both meet death and be preserved in the discrete present moment that was captured and replayed as live. This seems to be the essence of media events and the essence of the contemporary. Mackenzie Wark, in his book Dispositions, comes to terms profoundly with the conditions of the contemporary. Both the form and the content of the book speak to the gridded, fragmented and disjunctive time and space of computer culture. Perhaps the most important moment of this book, at least for those of us interested in the material qualities of media functions, is his self-reflexive writing about sentences and full stops (or what in America are called periods). Wark writes, 'the sentence is what is sentenced to exist, but the period points to the unsentenced. The unsentenced is not a domain of
meanings, but of senses. Stare into the black whole of the period, and all the dimensions of senses peer back' (Wark 2002: 47). The meaninglessness of the full stop, the end of the sentenced, where time accumulates only to be mobilized again as the next sentence begins, is where the condition of the contemporary can be, and now is, experienced: disjunctures, full stops, dots, bits, frames, pixels, those things that stand in the way of transmission. As argued in Chapter 2, analytical media present this as a discrete portion of time, cut off from past and future. But art and media philosophy, focused on the conditions for being-with-time, might reveal it as a new multi-temporal system for contemporariness.

Conclusion

Turn towards the upstream of time and look on it as though it were discrete moments, as though it were a string of full stops. Analytical media help us here. They answer the questions: 'How can we turn events into scenes?' and 'How can we organise disoriented particles into streams that all move in one direction?' Bergson was right. These systems place the truth confidently along uniform and simple chains of points. Serres was right. This relationship concerns time. It concerns death. It concerns history. It generates this hell of history without time: the condition of the contemporary, the post-historical. Media systems of analysis are very effective at filtering and organizing time, like semi-conductors that take alternating current and transform it into time-discrete signals. Events that move backward and forward, with a vibrating nervousness, can be calmed down and in that way made tractable; the contingent becomes able to be recorded and stored. What we are blind to in this condition is the field of potential sown into the determinate: the virtual side to the actual present. What we do not see or hear is the potential at every moment, the density of the present, which is otherwise filtered by functions that we likewise do not see or hear. The invisible, the meaningless. At the opening of this book I asked the reader to accept the grave depicted in Courbet's Burial at Ornans as a medium, one which orders the events of the entire painting, which allows the activities to take on meaning, from the movement of clouds to the conversation and glances between figures. The tomb at once transmits information about the body that lies beneath it and also orders those gathered around it. It acts as a transducer. It gives these images meaning. If we can accept the grave as a medium, then, I asked, how can this offer ways to reconceptualize other media? How can media such as photography, film, television and computers act as processes that both transmit information and stand in the way of transmission, transducing events into scenes? As shown throughout the book, from examples in media art, on YouTube, in films and on television, the archival character of the present
moment, the segmentation and storage of signal, offers a picture of scalar time, a thick, multi-temporal present. There are indeed possibilities for rich experience amongst this swarm of particles that constitutes the present. Mostly these are offered through art that subverts the character of analytical media. 'If art is to have a role or meaning at all in the age of real-time technologies it is to keep our human relation with time open in the light of its potential foreclosure by such technology' (Gere 2006: 1–2). Art, such as the work of Claerbout and Gonzalez-Foerster mentioned in Chapter 2, offers a way to open the time of the storehouses of data to experience, and signals a way of potentially being with the multiple. Claerbout unravels the instant, moving the one to the many. Gonzalez-Foerster presents the time of the aftermath, where all that is left is to drift through archives of relationally meaningful objects. Both open up the black holes of storage time. Apart from these experimental forms, however, what the subject of analytical media is mostly presented with is segments of events that pretend to be symptoms of events. They generate narratives based on what is stored and what is retrievable. Art offers one site of resistance, but so does philosophy. If we can begin to conceptualize the scenes of mass media within a topological rather than lineal phase space, and begin to read our history as multi-temporal rather than progressive, as was the case with the YouTube images presented in Chapter 2, we might begin to see the way that the scene, time-discrete in the way it is now treated by media, has folded within it moments that are much older. In this sense, we might look to post-historical scenes as moments that are not so much out of time as involved in topological relationships based on the apparatus used and its programming. Their background qualities, their invisibility and meaninglessness, are the reason that we usually don't have to take analytical media seriously. In their recent history they merely supported mass synthetic media. They were never experienced in any direct way. They are the abstract, unknowable elements of media. But they are now breaking the surface of the present, extending it into the past and future. The development of databases, where segments of information could be stored and sorted in nonlinear ways, was a breakthrough not just in information management but also in terms of the way time could be analysed. Digital humanities scholars can now open vast potentials for the study of texts through data mining and the analysis of large museum collections through digitization and search-and-retrieve functions that could be used to reimagine history along relational, rather than chronological, lines. Journalists can now mine YouTube for content, with the once unexpected and unfamiliar, such as the scenes of the contingent explored in the previous chapter, now relatively unexceptional. Digital television has become defined by repetition, time-shifting and the tracking and predicting of user preferences, which are the very things offered by the background qualities of its infrastructure as analytical media.
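The claim about databases in the preceding paragraph can be made concrete with a small sketch. The records and the tag below loosely echo examples from earlier chapters but are invented for illustration, drawn from no actual collection or platform discussed in this book; the point is only the contrast between ordering stored segments chronologically and gathering them relationally.

# A minimal sketch of the contrast drawn above: stored segments of an archive
# retrieved chronologically (history as a line) versus relationally (history
# as a topology of shared attributes). The records here are invented examples.

records = [
    {'year': 1894, 'item': 'chronophotograph of a blacksmith', 'tags': {'photography', 'labour'}},
    {'year': 1930, 'item': 'experimental television drama', 'tags': {'television', 'performance'}},
    {'year': 1956, 'item': 'photograph of a young Elvis Presley', 'tags': {'photography', 'portrait'}},
    {'year': 2011, 'item': 'phone footage uploaded to YouTube', 'tags': {'video', 'contingency'}},
]

def chronological(records):
    """Order the archive along the line of history."""
    return sorted(records, key=lambda r: r['year'])

def relational(records, tag):
    """Gather whatever shares an attribute, regardless of when it was made."""
    return [r for r in records if tag in r['tags']]

print([r['year'] for r in chronological(records)])              # [1894, 1930, 1956, 2011]
print([r['item'] for r in relational(records, 'photography')])  # items linked by a shared tag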

The argument has been framed by fragmentation on a number of different scales. Firstly, there is the fragmentation before mass media. The rear-view analysis of Chapters 3 and 4, as well as the discussions of computer history that ran throughout the book, presented moments of media fragmentation, before the experiments with imaging systems were gathered together under the homogeneous banner 'mass media' and before the term 'mass culture' could be used. In this respect the case studies offer a view of the many before they became the one. The fragmentation of signal was explored, as time-continuous events were subject to a radical cutting and transposed into time-discrete scenes. This was the central feature of the book and the technical function that I argued was at the core of the alienating media effects that philosophers such as Pierre Bourdieu, Bernard Stiegler, Vilém Flusser and Paul Virilio have attributed to popular audiovisual culture. It has also been about the fragmentation of viewing experience, the fragmentation of programming into repeatable episodes and the fragmentation of televised performances. Framed by these observations of fragmentation, the book at its heart has been about two things. First, it has been a reflection on the conditions of post-historical media, using Whitehead to think through the functions of technical instruments, showing how this key thinker of the early twentieth century can be recast as a media philosopher. Using Whitehead, along with Kittler, Flusser and McLuhan, the book explored the processual turning of events into scenes and attempted to explain the technical conditions that presuppose this function, which stands in the way of the transmission of events from past to present to future. Second, the book has been an exploration of moments that now take on media theoretical significance in terms of the production of digital temporality. The developments in computer history, involving Babbage and Lovelace's work, von Neumann's insights, Turing's machines, and the pioneers of database design and storage media, such as Eckert, F.C. Williams, Ken Olsen and Norm Taylor, not only led to the development of the modern binary computer but also had great effects on the way time would become mediated by digital systems. In addition to this, the graphic method that Marey was so influential in developing, Eastman's development of the carrier medium of film and Mach's capturing of bullets in flight, among many other experiments outlined in Chapter 3, illustrated the growing trend of using media not simply to carry out the function of a gesture of communication but to analyse the very conditions of that gesture. These systems were not simply transmission or storage systems but systems that translated events into tractable scenes. The medium did not simply transmit from point A (the event to be photographed) to point B (the viewing of the photograph) but instead added things, changed things, translated things along the way. It did not simply transmit information but became the arché of information itself. Looking at media history with an
emphasis on transduction, we might begin to see the technological messages of these systems, rather than mass media content. It might begin to illustrate the togetherness of technical systems and cultural systems. Likewise, the experiments that led to television were based on the segmentation of continuous signal, although the medium itself is traditionally viewed as based on time-continuous modulations. It segmented signal not into frames or bits, but into points of light strung together to make bands. This first had consequences for performances in front of the camera, as argued in Chapter 4, and then influenced television's development as a cultural form, as argued in Chapter 5. For much of its life the television worked against segmentation in order to produce a type of flow that ran through its programming. But it now foregrounds its pixelation and the segmentation that it always carried out. Organization moves into a digitization, a channelling of social interactions through the narrow channels of electronic communication and epistemological systems which from their very beginnings were made possible by their control of time. John von Neumann (1945), in a moment that would come to define a large part of the field of computer engineering, when he transcribed the conversations held at the Moore School and produced the First Draft Report on the EDVAC, wrote:

if the device is to be elastic, that is as nearly as possible all purpose, then a distinction must be made between the specific instructions given for and defining a particular problem, and the general control organs which see to it that these instructions, no matter what they are, are carried out. The former must be stored in some way in existing devices. (2)

The operations of the machine were to be dependent on stored conditions. It is possible to re-read von Neumann from a media philosophical perspective and to expand his definition of storage. The conditions are not just stored as volatile electronic charges in digital memory, but are also stored in media genealogies, including those explored in this book, which are the products of techniques and the development of technologies for ordering time. The conditions of the problem stored in the computer's memory direct its performance. The conditions of the problem of time, stored in media's history, are embodied in the operation of both the new optical media, such as digital television, and the new analytical media, which has come to define the possibilities for speaking of being-in-time. In this book, we have seen how the present extends over what we could once think of as the past and the future. Analytical media represents the present as discrete from the past and the future, whilst extending over what came before and what is yet to become. The present extends into
a timelessness, and it is in this sense that it presents the conclusion to a thinning out of time and an intense focus on the present (what has been called post-history), rather than the line of history. At moments, however, there is an opportunity for art and media philosophy to show how the present, which now extends across time, actually reverses these tendencies and becomes thick with multiple temporalities. This is what artists such as Claerbout, Gordon, Wall and Gonzalez-Foerster show, and it is what I have tried to describe with reference to the Gaddafi YouTube footage and Matt Frei's report of the terror attacks in Paris. The archive artworks and these traumatic global events stand out to show how the future and the past are now contained in the present, as virtual sides to its concrete actuality. Artistic practice and media philosophical reflection can show how contemporary experience is manifested through the nesting of the past and the future in the present: they now transpire within it. Descriptions such as this are needed if we are to find a way to live with analytical media.

Notes

Introduction

1 The grave acts to present what Georges Didi-Huberman ([1990]2005: 238) refers to as the historia, the quasi-object that makes the signified visible. The inclusion of the grave does not simply describe the scene but provides the conditions for it to be possible in the first place due to the invisible relations that it structures and the rituals of the burial that it embodies.

Chapter 1

1 The influence of Hall on McLuhan is documented in the numerous letters that they exchanged, which discussed at length the then emerging concepts of the technological extensions of man that were shared by the pair. This has also been elucidated further in Everett M. Rogers (2000), 'The Extensions of Man: The Correspondence of Marshall McLuhan and Edward T. Hall', Mass Communication and Society 3:1, 117–135.

2 Figures such as Bernard Siegert and Sybille Krämer have taken a similar approach to revising Kittler's anti-humanist position by using the term 'cultural technique'. The term cultural technique, in German Kulturtechniken, refers to a 'chain of operations' that links humans, non-humans and media into systems (Siegert 2007). Originally an agricultural term used to designate technical practices designed to enhance land productivity (Krämer 2003), it has been adapted by media philosophers to refer to the strategies and technical functions for dealing with symbolic worlds, such as reading, writing, listening, viewing and visualizing. As Krämer (2003) argues, through the performance of cultural techniques 'the immaterial, such as meaning, but also knowledge and information, becomes not only visible and audible, but also becomes, in the most literal sense, tangible' [emphasis in original] (528). For readers interested in the concept(s) of cultural techniques see the special issue of Theory, Culture and Society 30(6), edited by Geoffrey Winthrop-Young, Ilinca Iurascu and Jussi Parikka.

Chapter 2

1 One of McLuhan's most well-known examples of the medium as the message is the light bulb, whose operability was more meaningful than its content.

2 Thank you to my friend and colleague Dimitris Eleftheriotis for pointing out to me this important etymological detail.

3 This talk can be watched on YouTube: https://www.youtube.com/watch?v=1yZppfFZHB0.

Chapter 4

1 There are countless inventors and patents that are involved in this story, which simply cannot be represented here. Albert Abramson's 1987 book, The History of Television, 1880–1941, which presents an impressive 350-page timeline of events, could be consulted by any reader keen to see a more exhaustive account of the patents, experiments and false starts that marked this era of invention. Unlike Abramson's book, this chapter is selective in the version and organization of the discontinuous history of television that it offers.

2 The description 'passionate dilettante' is a phrase that I take from Kittler, who in Optical Media describes Baird as 'The passionate Scottish dilettante'.

3 Thank you to Iain Baird for alerting me to this, and also for all his help while I was at the National Media Museum archive.

4 This discussion is an extension of earlier work that he carried out in his book Symbolism: Its Meaning and Effect, published two years earlier than the Gifford Lectures, which would later become his major work Process and Reality.

Chapter 5

1 Derrida and Stiegler's book is produced from transcripts of an experiment where the philosophers discuss the conditions of television as a medium whilst being filmed for broadcast by the INA.

2 The production of these types of media events has been described by media anthropologists (Dayan and Katz 1992; Lule 2001; Couldry 2003; Sumiala 2013) interested in the contemporary rituals of mediation.

3 This was similar to a patent submitted around the same time by Kálmán Tihanyi.

References

Abramson, Albert. (1987), The History of Television 1880–1941, Jefferson: McFarland and Company. Agamben, Giorgio. (2009), What Is An Apparatus, Stanford, CA: Stanford University Press. Aiken, Howard. Interview with Tropp, Henry. (1973), in Computer Oral History Collection, 1968–1974, 1977, Archives Centre, National Museum of American History, Smithsonian Institution. Allen, Stuart and Thorson, Einar. (2009), 'Introduction', in Stuart Allen and Einar Thorson (eds.), Citizen Journalism: Global Perspectives, 1–16, New York: Peter Lang. Babbage, Charles. (circ 1820), Essays on the Philosophy of Analysis (unpublished). Western Manuscripts Collection, MS 37202, British Library Archives and Manuscripts. Babbage, Charles. (1864), Passages from the Life of a Philosopher, London: Longman, Green, Longman, Roberts and Green. Bachman, Charles W. (1973), 'The Programmer as Navigator', Communications of the ACM 16(11): 653–658. Baird, John Logie. (1926), Television: A Popular Talk. Available at: http://digital.nls.uk/scientists/pageturner.cfm?id=74491814 (accessed 4 September 2015). Baird, Malcolm, Brown, Douglas, and Waddell, Peter. (2005), 'Television, Radar and J.L. Baird', in Baird Television. Available at: http://www.bairdtelevision.com/radar.html (accessed 10 September 2015). Barker, Timothy. (2012), Time and the Digital: Connecting Technology, Aesthetics and a Process Philosophy of Time, Lebanon, NH: Dartmouth College Press. Barnouw, Erik. (1990), Tube of Plenty: The Evolution of American Television, New York: Oxford University Press. Barthes, Roland. ([1980]1981), Camera Lucida: Reflections on Photography, trans. Richard Howard, New York: Hill and Wang. Beaney, Michael. (2003), 'Analysis', The Stanford Encyclopedia of Philosophy (Spring 2015 Edition), Edward N. Zalta (ed.). Available at: http://plato.stanford.edu/archives/spr2015/entries/analysis/ (accessed 19 May 2016). Beer, David. (2009), 'Power through the Algorithm? Participatory Web Cultures and the Technological Unconscious', New Media & Society 11(6): 985–1002. Benjamin, Walter. ([1937]2010), The Work of Art in the Age of Mechanical Reproduction, Scottsdale, AZ: Prism Key Press. Bergson, Henri. ([1907]1989), Creative Evolution, trans. Arthur Mitchell, Lanham, MD: University Press of America. Blackmore, John T. (1972), Ernst Mach; His Work, Life, and Influence, Berkley and Los Angeles: University of California Press.

Borradori, Giovanna. (2003), Philosophy in a Time of Terror: Dialogues with Jürgen Habermas and Jacques Derrida, Chicago and London: University of Chicago Press. Bourdieu, Pierre. ([1996]1998), On Television, trans. Priscilla Parkhurst Ferguson, New York: The New Press. Bradley, James. (2008), ‘The Speculative Generalization of the Function: A Key to Whitehead’, Inflexions: A Journal of Research Creation 2. Available at: http://www.senselab.ca/inflexions/n2_The-Speculative-Generalization-ofthe-Function-A-Key-to-Whitehead-by-James-Bradley.pdf (accessed 10 April 2015). Brain, Robert M. (2002), ‘Representation on the Line: Graphic Recording Instruments and Scientific Modernism’, in Bruce Clarke and Linda Henderson (eds.), From Energy to Information: Representation in Science and Technology, Art and Literature, 155–177, Stanford, CA: Stanford University Press. Braun, Marta. (1992), Picturing Time: The Work of Étienne-Jules Marey (1830– 1904), Chicago: The University of Chicago Press. Braun, Marta and Whitcombe, Elizabeth. (1999), ‘Marey, Muybridge, and Londe’, History of Photography 23(3): 218–224. Breisach, Ernst. (2003), The Future of History: The Postmodernist Challenge and Its Aftermath, Chicago: University of Chicago Press. Brown, Elspeth H. (2005), ‘Racialising the Virile Body: Eadweard Muybridge’s Locomotion Studies 1883–1887’, Gender and History 17(3): 627–656. Burns, R.W. (1975), ‘The First Demonstration of Television’, Electronics and Power 21(17): 953–956. Burns, Russel W. (2000), John Logie Baird: Television Pioneer, London: The Institute of Engineering and Technology. Callisthenes (1925), ‘Television’, The Times [London, England] 24 March: 14. Canales, Jimena. (2011), A Tenth of a Second: A History, Chicago: University of Chicago Press. Chabot, Pascal. ([2003]2013), The Philosophy of Simondon: Between Technology and Individuation, trans. Aliza Krefetz and Graeme Kirkpatrick, London and New York: Bloomsbury. Chanan, Michael. (2005), The Dream that Kicks: The Prehistory and Early Years of Cinema in Britain, London and New York: Routledge. Claerbout, David. (2015), Description of KING (After Alfred Wertheimer’s 1956 portrait of a young man named Elvis Presley) (2015). Available at: http:// davidclaerbout.com/KING-after-Alfred-Wertheimer-s-1956-picture-of-a-youngman-named (accessed 20 May 2016). Coe, Brian. (1969), ‘William Friese-Greene and the Origins of Cinematography III’, Screen 10(4–5): 129–147. Coopersmith, Jonathan. (2015), Faxed: The Rise and Fall of the Fax Machine, Baltimore, MD: Johns Hopkins University Press. Couldry, Nick. (2003), Media Rituals: A Critical Approach, New York and London: Routledge. Couldry, Nick. (2004), ‘Liveness, “Reality”, and the Mediated Habitus from Television to the Mobile Phone’, Communication Review 7(4): 353–361. Cubitt, Sean. (2014), The Practice of Light: A Genealogy of Visual Technologies from Prints to Pixels, Cambridge, MA: The MIT Press.

Curtis, Scott. (2012), ‘Photography and Medical Observation’, in Nancy Anderson and Michael R. Dietrich (eds.), The Educated Eye: Visual Culture and Pedagogy in the Life Science, 68–93, Hannover: Dartmouth College Press. Crary, Jonathan. (2013). 24/7: Late Capitalism and the Ends of Sleep, London and New York: Verso. Davis, Wendy. (2007), ‘Television’s Liveness: A Lesson from the 1920s’, Westminster Papers in Communication and Culture 4(2): 36–51. Dayan, Daniel and Katz, Elihu. (1992), Media Events: The Live Broadcasting of History, Cambridge, MA: Harvard Press. Deleuze, Gilles. ([1983]2005), Cinema 1: The Movement Image, trans. Hugh Tomlinson and Robert Galeta, London and New York: Continuum. Deleuze, Gilles. ([1985]2005), Cinema 2: The Time Image, trans by Hugh Tomlinson and Robert Galeta, London and New York: Continuum. Deleuze, Gilles and Guattari, Felix. ([1972]2004), Anti-Oedipus: Capitalism and Schizophrenia, London and New York: Continuum. Derrida, Jacques and Stiegler, Bernard. ([1996]2002), Echographies of Television, trans. Jennifer Bajorek, Cambridge: Polity. Didi-Huberman, Georges. ([1990]2005). Confronting Images: The Ends of a Certain History of Art, University Park, PA: The Pennsylvania University Press. Doane, Mary Ann. (2002), The Emergence of Cinematic Time: Modernity, Contingency and the Archive, Cambridge, MA: Harvard University Press. Doane, Mary Ann. (2005), ‘Real Time: Instantaneity and the Photographic Imaginary’, in David Green and Joanna Lowry (eds.), Stillness and Time: Photography and the Moving Image, 23–38, Brighton: Photoworks. Doane, Mary Ann. (2006), ‘Information, Crisis, Catastrophe’, in Wendy Hui Kyong Chun and Thomas Keenan (eds.), New Media/Old Media: A History and Theory Reader, 251–264, New York and London: Routledge. Edgerton, Gary. (2007), The Columbia History of American Television, New York: Columbia University Press. Eleftheriotis, Dimitris. (2010), Cinematic Journeys: Film and Movement, Edinburgh: Edinburgh University Press. Ellis, John. (1982), Visible Fictions: Cinema: Television: Video, New York: Routledge. Ernst, Wolfgang. (2013), Digital Memory and the Archive, Minneapolis: University of Minnesota Press. Ernst, Wolfgang. (2015), ‘Media Archaeology-As-Such: Occasional Thoughts on (Més-)alliances with Archaeologies Proper’, Journal of Contemporary Archaeology 2(1): 15–23. Ernst, Wolfgang. (2016), Chronopoetics: The Temporal Being in Operativity of Technological Media, London and New York: Rowman and Littlefield. Findlay-White, Emma and Logan, Ken. (2016), ‘Acoustic Space, Marshall McLuhan and Links to Medieval Philosophers and Beyond: Centre Everywhere and Margin Nowhere’, Philosophies 1(2): 162–169. ‘The First Play by Television’ (1930), The Times [London, England] 15 July 1930: 12. Fiske, John. (1987), Television Culture, Oxon: Routledge. Floridi, Luciano. (2015), ‘Hyperhistory and the Philosophy of Information Policies’, in Luciano Floridi (ed.), The Onlife Manifesto: Being Human in a Hyperconnected Era, 51–64, Cham, Heidelberg, New York, Dordrecht and London: Springer.

Flusser, Vilém. ([1983]2013), Post-History, trans. Rodrigo Maltez Novaes, Minneapolis: University of Minnesota Press. Flusser, Vilém. ([1983]2014), Towards a Philosophy of Photography, trans. Anthony Matthews, London: Reaktion Books. Flusser, Vilém. ([1985]2011), Into the Universe of Technical Images, trans. Nancy Ann Roth, Minneapolis: University of Minnesota Press. Flusser, Vilem. (2011), ‘The Gesture of Photographing’ trans. Nancy Ann Roth, Journal of Visual Culture 10(3): 279–293. Flusser, Vilém. (2013), ‘Our Images’, trans. Rodrigo Maltez Novaes, Flusser Studies 15 http://www.flusserstudies.net/sites/www.flusserstudies.net/files/ media/attachments/flusser-our-images.pdf (accessed 10 April 2015). Foucault, M. (1988), ‘Practicing Criticism’ trans. A. Sheridan et al., in L.D. Kritzman (ed.), Politics, Philosophy, Culture: Interviews and Other Writings, 1977–1984, 152–158, New York: Routledge. Foucault, Michel. ([1966]2002), The Order of Things, London and New York: Routledge. Foucault, Michel. ([1969]2002), The Archaeology of Knowledge, London and New York: Routledge. Foucault, Michel. (1981), ‘The Order of Discourse’, in Robert Young (ed.), Untying the Text: A Post-Structuralist Reader, 48–78, Boston, London and Henley: Routledge. Freedman, Des and Thusso, Daya Kishan. (2012), Media and Terrorism: Global Perspectives, London: Sage. Fried, Michael. (1990), Courbet’s Realism, Chicago and London: University of Chicago Press. Fukuyama, Francis. ([1992]2006), The End of History and the Last Man, New York: The Free Press. Fuller, Matthew. (2005), Media Ecologies: Materialist Energies in Art and Technoculture, Cambridge, MA: The MIT Press. Galloway, Alexander and Thacker, Eugene. (2007), The Exploit: A Theory of Networks, Minneapolis: University of Minnesota Press. Gane, Nicholas. (2005), ‘Radical Post-Humanism: Friedrich Kittler and the Primacy of Technology’, Theory, Culture & Society 22(3): 25–41. Geraghty, Christine. (2009), ‘Classic Television: A Matter of Time’, in The Making and Remaking of Classic Television, 19 March 2009, University of Warwick. Available at: http://eprints.gla.ac.uk/6562/1/6562.pdf (accessed 19 May 2016). Gere, Chalie. (2006), Art, Time and Technology, Oxford: Berg. Goody, Jack. (1987), The Interface between the Written and the Oral, Cambridge: Cambridge University Press. Gordon, W. Terrence. (2010), McLuhan: A Guide for the Perplexed, New York and London: Continuum. Gray, Frank, Horton, J.W. and Mathes, R.C. (1930), ‘The Production and Utilization of Television Signals’, The Bell Systems Technical Journal 6(4): 560–603. Groys, Boris. (2009), ‘Comrades of Time’, e-flux 11. Available at: http://www.e-flux. com/journal/comrades-of-time/. Groys, Boris. (2016), In the Flow, London and New York: Verso Books. Halewood, Michael. (2013), A.N. Whitehead and Social Theory: Tracing a Culture of Thought, London: Anthem Press.

Hallward, Peter. (2006), Out of This World: Deleuze and the Philosophy of Creation, London and New York: Verso Books. Hansen, Mark B.N. (2004), A New Philosophy for New Media, Cambridge, MA: The MIT Press. Hansen, Mark B.N. (2006), ‘Media Theory’, Theory, Culture & Society 23(2–3): 297–306. Hansen, Mark B.N. (2011), ‘Digital Technics Beyond the “Last Machine”: Thinking Digital Media with Holis Frampton’, in Eivind Røssaak (ed.), Between Stillness and Motion: Film, Photography, Algorithms, Amsterdam: Amsterdam University Press, 45–72. Hansen, Mark B.N. (2013), ‘Ubiquitous Sensation: Toward an Atmospheric, Collective, and Microtemporal Model of Media’, in Ulrik Ekman (ed.), Throughout: Art and Culture Emerging with Ubiquitous Computing, 63–88. Cambridge, MA: The MIT Press. Hansen, Mark B.N. (2015), Feed-Forward: On the Future of Twenty-First Century Media, Chicago: University of Chicago Press. Harding, Colin. (2012), ‘Celluloid and Photography Part Three: The Beginnings of Cinema’. Available at: www.nationalmediamuseum.org.uk/~/media/Files/…/ TheBeginningsOfCinema.pdf (accessed 1st July 2016). Havelock, Eric. A. (1986), ‘The Alphabetic Mind: A Gift of Greece to the Modern World’, Oral Tradition 1(1): 134–150. Heath, Stephen. (1990), ‘Representing Television’, in Patricia Mellencamp (ed.), Logics of Television, Bloomington: Indiana University Press, 267–302. Hegel, Georg Willhelm Friedrich. ([1899]1956), The Philosophy of History, trans. J. Sibree, Mineola, NY: Dover. Heidegger, Martin. ([1954]1977), The Question Concerning Technology and Other Essays, trans. William Lovitt, New York: Harper Collins. Heidenreich, Stefan. (2015), ‘The Situation after Media’, in Eleni Ikoniadou and Scott Wilson (eds.), Media After Kittler, 135–154, London and New York: Rowman and Littlefield. Herbert, Stephen. (2004), A History of Early TV (Vol. 1), London and New York: Routledge. Hook, Derek. (2001), ‘Discourse, Knowledge, Materiality, History: Foucault and Discourse Analysis’, Theory and Psychology 11(4): 521–547. Hoskins, Andrew and O’Loughlin, Ben. (2010), War and Media: The Emergence of a Diffused War, Cambridge: Polity. Ikoniadou, Eleni. (2016), ‘Primer: The Media Question’, in Eleni Ikoniadou and Scott Wilson (eds.), Media After Kittler, 1–14, London and New York: Rowman and Littlefield. Ives, Herbert. (1927), ‘Television’, Bell Systems Technical Journal October 6: 551–559. Jacobs, Jason. (2000), The Intimate Screen: Early British Television Drama, Oxford: Oxford University Press. Jongen, Marc. (2011), ‘On Anthropospheres and Aphrogrammes: Peter Sloterdijk’s Thought Images of the Monstrous’, Humana.Mente 18: 199–219. Kahn, Douglas. (2013), Earth Sound Earth Signal: Energies and Earth Magnitude in the Arts, Berkley and Los Angeles: University of California Press. Kalmar, Ivan. (2005), ‘The Future of “Tribal Man” in the Electronic Age’, in Gary Genosko (ed.), Marchall McLuhan: Theoretical Elaborations Volume 2, 227– 261, London and New York: Routledge.

Kember, Sarah and Zeilinska, Joanna. (2012), Life after New Media: Mediation as a Vital Process, Cambridge, MA: The MIT Press. Kirchenbaum, Matthew. (2008), Mechanism: New Media and the Forensic Imagination, Cambridge, MA: MIT Press. Kittler, Friedrich. ([1985]1990), Discourse Networks 1800/1900, trans. Michael Metteer and Chris Cullens, Stanford, CA: Stanford University Press. Kittler, Friedrich. ([1986]1999), Gramophone, Film, Typewriter, trans. Geoffrey Winthrop-Young and Michael Wutz, Stanford, CA: Stanford University Press. Kittler, Friedrich. ([1999]2010), Optical Media, trans. Anthony Enns, Cambridge: Polity. Kittler, Friedrich. (2006a), ‘Thinking Colours and/or Machines’, Theory, Culture & Society 23(7–8): 39–50. Kittler, Friedrich. (2006b), ‘Number and Numeral’, Theory, Culture & Society, 23(7–8): 51–61. Kittler, Freidrich. (2009), ‘Towards an Ontology of Media’, Theory, Culture and Society 26(2–3): 23–31. Krämer, Sybille. (2003), ‘Writing, Notational Iconicity, Calculus: On Writing as a Cultural Technique’, MLN 118(3): 518–537. Krämer, Sybille. (2006), ‘The Cultural Techniques of Time Axis Manipulation: On Friedrich Kittler’s Conception of Media’, Theory, Culture & Society 23(7–8): 93–109. Krämer, Sybille. (2015), Media, Messenger, Transmission: An Approach to Media Philosophy, trans. Anthony Enns, Amsterdam: Amsterdam University Press. Krämer, Sybille and Bredekamp, Horst. (2013), ‘Culture, Technology, Cultural Techniques – Moving beyond Text’, Theory, Culture and Society 30(6): 20–29. Lambert, Greg. (2016), ‘What Is a Dispositif’. Available at https://www.academia. edu/25507473/What_is_a_Dispositif?campaign=upload_email (accessed 23 May 2016). Launay, Françoise and Hingley, Peter. (2005), ‘Janssen’s “Revolver photographique” and Its British Derivative, “The Janssen slide”’, Journal for the History of Astronomy 36(1): 57–79. Lechte, John. (2002), ‘Time after Theory: The Cinema Image and Subjectivity’, Continuum: Journal of Media and Cultural Studies 16(3): 299–310. Lule, Jack. (2001), Daily News, Eternal Stories: The Mythological Role of Journalism, New York: Guilford Press. Lury, Karen. (2005), Interpreting Television, London and New York: Bloomsbury. Mackenzie, Adrian. (2002), Transduction: Bodies and Machines at Speed, London and New York: Continuum. Magoun, Alexander. (2009), Television: A Life Story of a Technology, Baltimore, MD: The Johns Hopkins University Press. Mamber, Stephen. (2004), ‘Marey, the Analytical and the Digital’, in John Fullerton and Jan Olsson (eds.), Allegories of Communication: Intermedial Concerns from Cinema to the Digital, Corso, Trieste: John Libbey Publishing, 83–91. Marchessault, Janine. (2005), Marshall McLuhan: Cosmic Media, London, Thousand Oaks and New Delhi: Sage. Marey, Étienne-Jules. (1879), Animal Mechanism: A Treatise on Terrestrial and Aerial Locomotion, New York: D. Appleton and Company. Mauchly, John. Interview with Merzbach, U.C. (1970), in Computer Oral History Collection 1968–1974, 1977, Archives Centre, National Museum of American History, Smithsonian Institution.

May, Jon and Thrift, Nigel. (2001), Timespace: Geographies of Temporality, Oxon: Routledge.
McLean, Donald F. (2013), The Dawn of TV: The Mechanical Era of British Television. Available at: http://www.tvdawn.com/earliest-tv/the-man-with-theflower-in-his-mouth/ (accessed 19 May 2016).
McLuhan, Marshall. (1962), The Gutenberg Galaxy: The Making of Typographic Man, Toronto: University of Toronto Press.
McLuhan, Marshall. (1964), Understanding Media: The Extensions of Man, London: Sphere Books.
McLuhan, Marshall, Fiore, Quentin and Agel, Jerome. (1967), The Medium Is the Massage: An Inventory of Effects, New York: Random House.
McQuire, Scott. (1998), Visions of Modernity: Representation, Memory, Time and Space in the Age of the Camera, London: Sage.
Miller, Carolyn R. (1978), 'Technology as a Form of Consciousness: A Study of Contemporary Ethos', Central States Speech Journal 29(4): 228–236.
Miller, David L. (1946), 'Whitehead's Extensive Continuum', Philosophy of Science 13(2): 144–149.
Mogensen, Kirsten. (2008), 'Television Journalism During Terror Attacks', Media, War & Conflict 1(1): 31–49.
Murphie, Andrew. (2013), 'Convolving Signals: Thinking the Performance of Computational Processes', Performance Paradigm 9. Available at: http://www.performanceparadigm.net/index.php/journal/article/view/135 (accessed 5 July 2016).
Neyland, Daniel. (2015), 'On Organizing Algorithms', Theory, Culture & Society 32(1): 119–132.
Niethammer, Lutz. (1992), Posthistoire: Has History Come to an End?, trans. Patrick Camiller, London and New York: Verso.
Noys, Benjamin. (2014), Malign Velocities: Accelerationism and Capitalism, Hants, UK: Zero Books.
Ong, Walter. ([1982]2002), Orality and Literacy: The Technologizing of the Word, London and New York: Routledge.
Ong, Walter. (1986), 'Writing Is a Technology That Restructures Thought', in Gerd Baumann (ed.), The Written Word: Literacy in Transition, 23–50, Oxford: Oxford University Press.
Parikka, Jussi. (2012), What Is Media Archaeology?, London: Polity.
Patton, Paul. (2009), 'Events, Becoming and History', in Jeffrey A. Bell and Claire Colebrook (eds.), Deleuze and History, 33–53, Edinburgh: Edinburgh University Press.
Peters, John Durham. (2015), The Marvelous Clouds: Toward a Philosophy of Elemental Media, Chicago and London: University of Chicago Press.
Poster, Mark. (2010), 'McLuhan and the Cultural Theory of Media', MediaTropes 2(2): 1–18.
'Report on Baird's Televisor' (1926), The Times [London, England], 28 January 1926: 9.
'Review of The Eve of St Agnes' (1937), The Times [London, England], 29 October 1937: 28.
Rogers, Everett M. (2000), 'The Extensions of Man: The Correspondence of Marshall McLuhan and Edward T. Hall', Mass Communication and Society 3(1): 117–135.

Rossell, Deac. (2008), 'Mach, Ernst', in John Hannavy (ed.), Encyclopedia of Nineteenth-Century Photography Vol. 1, 880–881, New York: Routledge.
Sauvagnargues, Anne. (2012), 'Crystals and Membranes: Individuation and Temporality', trans. Jon Roffe, in Arne De Boever, Alex Murray, Jon Roffe and Ashley Woodward (eds.), Gilbert Simondon: Being and Technology, 57–72, Edinburgh: Edinburgh University Press.
Scannell, Paddy. (1996), Radio, Television and Modern Life, Hoboken, NJ: Wiley Blackwell.
Schmandt-Besserat, Denise. (1995), 'Record Keeping before Writing', in Jack Sasson (ed.), Civilizations of the Ancient Near East, Vol. IV, 2097–2106, New York: Charles Scribner's Sons.
Schmidt, Siegfried J. (2008), 'Media Philosophy—A Reasonable Programme?', in Wittgenstein and the Philosophy of Information: Proceedings of the 30th International Ludwig Wittgenstein Symposium, Kirchberg, 2007, 89–105, Frankfurt: Ontos Verlag.
Serres, Michel. (1982), 'The Origin of Language: Biology, Information Theory, and Thermodynamics', in Josue V. Harari and David F. Bell (eds.), Hermes: Literature, Science, Philosophy, 71–83, Baltimore and London: The Johns Hopkins University Press.
Serres, Michel. ([1982]2007), The Parasite, trans. Lawrence R. Schehr, Minneapolis: University of Minnesota Press.
Serres, Michel. ([1983]2015), Rome, trans. Randolph Burks, London: Bloomsbury.
Serres, Michel. ([1993]1995), Angels: A Modern Myth, trans. Francis Cowper, Paris and New York: Flammarion.
Shannon, Claude. (1948), 'A Mathematical Theory of Communication', The Bell System Technical Journal 27(3): 379–423.
Shannon, Claude and Weaver, Warren. ([1949]1963), The Mathematical Theory of Communication, Champaign, IL: University of Illinois Press.
Sharma, Sarah. (2014), In the Meantime: Temporality and Cultural Politics, Durham: Duke University Press.
Shaviro, Steven. (2009), Without Criteria: Kant, Whitehead, Deleuze and Aesthetics, Cambridge, MA: The MIT Press.
Siegert, Bernhard. (2007), 'Cacography or Communication? Cultural Techniques in German Media Studies', trans. Geoffrey Winthrop-Young, Grey Room 29: 26–47.
Siegert, Bernhard. (2013), 'Cultural Techniques: Or the End of the Intellectual Postwar Era in German Media Theory', Theory, Culture & Society 30(6): 48–65.
Siegert, Bernhard. (2015), Cultural Techniques: Grids, Filters, Doors, and Other Articulations of the Real, trans. Geoffrey Winthrop-Young, New York: Fordham University Press.
Simondon, Gilbert. (1964), L'individu et sa genèse physico-biologique, Paris: PUF.
Simondon, Gilbert. (1992), 'The Genesis of the Individual', in Jonathan Crary and Sanford Kwinter (eds.), Incorporations, 297–319, New York: Zone Books.
Slavin, Kevin. (2011), 'How Algorithms Shape Our World'. Available at: http://www.ted.com/talks/kevin_slavin_how_algorithms_shape_our_world.html (accessed 20 August 2015).
Sloterdijk, Peter. ([2011]2016), Globes: Spheres II, trans. Wieland Hoban, South Pasadena: Semiotext(e).

Smith, Dominic. (2015), 'Rewriting the Constitution: A Critique of "Postphenomenology"', Philosophy and Technology 28(4): 533–551.
Smith, Terry. (2009), What Is Contemporary Art?, London and Chicago: University of Chicago Press.
Sterne, Jonathan. (2014), 'What Do We Want? Materiality! When Do We Want It? Now!', in Tarleton Gillespie, Pablo J. Boczkowski and Kirsten A. Foot (eds.), Media Technologies: Essays on Communication, Materiality, and Society, 119–128, Cambridge, MA: The MIT Press.
Stiegler, Bernard. ([1994]1998), Technics and Time, 1: The Fault of Epimetheus, trans. Richard Beardsworth and George Collins, Stanford, CA: Stanford University Press.
Stiegler, Bernard. ([2001]2011), Technics and Time, 3: Cinematic Time and the Question of Malaise, trans. Stephen Barker, Stanford, CA: Stanford University Press.
Stoller, H.M. and Morton, E.R. (1927), 'Synchronization of Television', Bell System Technical Journal 6: 604–615.
Sumiala, Johanna. (2013), Media and Ritual: Death, Community, and Everyday Life, Oxon: Routledge.
Swift, John. (1950), Adventure in Vision: The First Twenty-Five Years of Television, London: John Lehmann.
Taylor, Mark C. (2014), Speed Limits: Where Time Went and Why We Have So Little Left, New Haven and London: Yale University Press.
'Television 1873–1927: A Brief Outline of What Has Been Accomplished in Little Over Half-a-Century' (1928), Television: A Monthly Magazine, March: 10–11 and 23.
Thompson, E.P. (1967), 'Time, Work-Discipline, and Industrial Capitalism', Past and Present 38: 56–97.
Totaro, Paolo and Ninno, Domenico. (2014), 'The Concept of Algorithm as an Interpretative Key of Modern Rationality', Theory, Culture & Society 31(4): 29–49.
'Upcoming Broadcasts' (1930), The Times [London, England], 12 July 1930: 7.
Virilio, Paul. ([1997]2008), Open Sky, trans. Julie Rose, London and New York: Verso.
von Neumann, John. (1945), 'First Draft of a Report on the EDVAC'. Available at: http://www.virtualtravelog.net/wp/wp-content/media/2003-08-TheFirstDraft.pdf (accessed 15 June 2016).
Wade, Nicholas J. (2016), 'Capturing Motion and Depth before Cinematography', Journal of the History of the Neurosciences 25(1): 3–22.
Wajcman, Judy. (2015), Pressed for Time: The Acceleration of Life in Digital Capitalism, Chicago: University of Chicago Press.
Waltz Jr, George H. (1932), 'Television Scanning and Synchronization by the Baird System', Popular Science Monthly, February: 84–85.
Wark, McKenzie. (2002), Dispositions, Applecross, Western Australia and Cambridge: Salt.
Webb, Richard. (2005), Tele-Visionaries: The People behind the Invention of Television, Piscataway: IEEE Press.
White, Mimi. (1992), Tele-Advising: Therapeutic Discourse in American Television, Chapel Hill and London: The University of North Carolina Press.

Whitehead, Alfred North. ([1920]2007), The Concept of Nature, New York: Cosimo.
Whitehead, Alfred North. ([1927]1985), Symbolism: Its Meaning and Effect, New York: Fordham University Press.
Whitehead, Alfred North. ([1929]1978), Process and Reality: An Essay in Cosmology, New York: The Free Press.
Whitehead, Alfred North. ([1933]1967), Adventures of Ideas, New York: The Free Press.
Williams, Raymond. (1974), Television: Technology and Cultural Form, Oxon: Routledge.
Winthrop-Young, Geoffrey. (2011), Kittler and the Media, Cambridge: Polity.
Zielinski, Siegfried. ([1989]1999), Audiovisions: Cinema and Television as Entr'actes in History, trans. Gloria Custance, Amsterdam: Amsterdam University Press.
Zielinski, Siegfried. ([2002]2006), Deep Time of the Media: Toward an Archaeology of Hearing and Seeing by Technical Means, trans. Gloria Custance, Cambridge, MA: The MIT Press.
Zuse, Konrad. Interview with Merzbach, U.C. (1968), in Computer Oral History Collection, 1968–1974, 1977, Archives Centre, National Museum of American History, Smithsonian Institution.

Index

Note: Locators followed by n denote note numbers.

accelerationism 5–6
acoustic space 17, 28–31, 33, 41, 66, 141. See also synchronic space
actual entities 46–8, 52–4, 113, 121–3, 151
Agamben, Giorgio 55, 77–9
Aiken, Howard 14
analytical media 5, 12, 20, 48–50, 58, 71, 142, 156–9
  film and 81, 93, 98, 101
  television as 105, 117–18, 123, 133
  time and 6, 12, 49, 59, 66, 77–8, 87–8, 134, 154
  timelessness and 29, 53–4, 58, 68
approach to media history 3, 5–6, 12, 14–15, 30, 81, 157–8
  Kittler and 34–6
arché 15, 157
archival culture 23, 49, 81, 98, 109–10, 146
  archival characteristics of present 18, 59, 116, 141, 155–6
  art and 67–73, 159
  Ernst and 59, 141
  Foucault and 37–8
  YouTube and 21, 75–7, 110, 116
the arrivant 88, 139
Babbage, Charles 12–14, 57, 157
Bain, Alexander 110–11, 117, 135, 147
Baird Television System 114–15, 118–21, 128, 131
Barthes, Roland 17, 24, 71, 91, 151, 153
becoming 20, 25–6, 46, 48–9, 53, 58, 65–6, 89–90, 95–6, 122–3, 139, 149, 151
being-with-time 3, 77, 130, 154

Bell Labs 117
Benjamin, Walter 17, 24, 41, 91–3
Bergson, Henri 60, 66, 87, 95–6, 98, 155
Bildtelegraph 105–6, 107, 110
Bourdieu, Pierre 17, 112, 157
Briois, A. 98
British Broadcasting Corporation (BBC) 63, 119–20, 128
broadcast of the Derby 131
Campbell, Jim 56–8
Caselli, Abbé 110–11, 117, 125
casual television viewing 103, 109, 113, 115, 157
Cathode Ray Tube (CRT) 144–5, 147
Channel 4 (UK) 137, 140, 152
Chronophotography 21, 52, 63, 83–4, 86, 94, 96
  digital media and 86
  labour and 84
  Passage of Venus and 98–9
  temporality and 66, 84–5, 93, 96, 148
  time motion studies and 84, 87, 90–1, 115
citizen journalism 59, 74, 138
Claerbout, David 70–5, 77, 92–3, 156, 159
Cock, Gerald 114
computer history 5, 12–15, 58–9, 67–8
contemporaneity 2–4, 9, 45–6, 68–73, 77–80, 84–5, 140
Courbet, Gustave 1–3, 8–11, 18, 36, 64, 67, 75, 92, 142, 155
cultural technique 7, 29, 32–3, 46, 61, 65, 147, 160 n.2

database, history of development 17, 157
data processing 12, 16, 34–7, 43, 143
death and television 73–5, 153
deferral of the instant 91–2, 100–1, 153
delay 5, 92, 146–9
  television transmission and 21, 63, 107–8, 115–21, 128, 153
Deleuze, Gilles 7–8, 16–17, 29, 54, 65, 87, 95–6, 149
depth of field 128–9, 131
Derrida, Jacques 16, 88, 137, 142, 153
Dieckmann, Max 112, 144–5
digital television 3, 34, 74, 110, 114, 116, 136, 156
discourse analysis 18, 34–9, 62–3, 129
discourse networks 17–18, 37–8
discrete transducer 149
early television plays 109, 127–34
early television studios 114, 128, 132–4
Eastman, George 97–8
EDVAC 158
electronic computer 13–15, 66–7, 79–80, 83, 148–9, 157. See also computer history
  as an analytical medium 13, 46, 58, 74
  cognition and 38, 67, 86
  time, temporality and 59, 66
electronic television 21, 121, 141, 144–7
elemental network, the 7, 19, 39, 49, 62, 89, 127, 143
Elliot, Bill 126
EMI-Marconi Company 114–15
Un enterrement à Ornans (A Burial at Ornans) 1–3, 8–11, 18, 36, 64, 67, 75, 92, 142, 155
epistemology and media 15–20, 62–4, 112, 129, 148, 158
Ernst, Wolfgang 5, 11, 17–18, 23, 36, 50, 52, 59–60, 62, 86, 141, 146, 153
The Eve of St Agnes 109, 131–2
event 8–9, 11, 33, 79, 95
  aftermath of 11, 114, 138–42, 150–2
  media and 3–5, 18–19, 39, 58, 109, 153
  media philosophy and 39–44, 79, 88–93, 112, 121–7, 142–3
  transmission events 2–3, 10–11, 17
experimental broadcasts 109, 114–15, 119–21, 152. See also early television plays
Exploded View (Commuters) 56–8
extended cognition 28, 86
Farnsworth, Philo 21, 135, 145
film stock 18, 86, 87, 96–101
  celluloid film 95, 97, 98
  paper film 97–8
Floridi, Luciano 60
Flusser, Vilém 17, 19–20, 39–45, 48, 57–8, 68, 71, 85, 93–4, 140, 142–3
  history and 19–20, 55, 85, 139, 140–3
  Kittler and 39, 42, 46, 79
  McLuhan and 40–1, 68, 73, 116
  non-linear time and 44
Foucault, Michel 16, 18
Fourier transform 61–2
Fourier, Jean-Baptiste Joseph de 60–2
fragmentation 5, 12, 41, 45, 49, 57, 64, 67–8, 79, 101
  audience 26, 113, 157
  signal 113–14, 126, 133, 141, 157
  time and 4, 33, 58–9, 78, 115, 129, 141
Frei, Matt 137–40, 143–4, 152, 159
Friese-Greene, William 98
Godard, Jean-Luc 91–2
Gonzalez-Foerster, Dominique 72–3, 156, 159
Gordon, Douglas 69, 73
gramophone 17–18, 62, 129
Groys, Boris 79
Hansen, Mark 24, 52, 86, 89, 146
Hayles, N. Katherine 28
Hegel, Georg 3–4
Heidegger, Martin 17, 41, 101
Heraclitus 125

illusion of permanence 100–1
individuation, process of 19, 26, 29–31, 60–1, 90, 149–51
information 10, 12, 28, 61, 67, 85–6, 145
  Flusser's approach to 43–5, 140
  Kittler's approach to 36–8
  mathematical theory of 15, 17–18, 35, 89, 105–6, 147–50 (see also Mathematical Theory of Communication)
  philosophy of 60
Inner London Education Authority (ILEA) 126
integrated circuit chip 135
Janssen, Pierre 93, 98–9
KING (after Alfred Wertheimer's 1956 picture of a young man named Elvis Presley) 71–2
Kittler, Friedrich 5–7, 12, 16–18, 20, 30–2, 34–9, 46, 50–2, 62, 89, 106, 116, 121, 147, 160 n.2
  Foucault and 36–9
  McLuhan and 7, 38, 66
Kodak 98
Korn, Arthur 105, 112
Krämer, Sybille 5, 16, 17, 19, 23, 86
Lacan, Jacques 16, 18, 66, 139
LeBlanc, Maurice 144
Le Prince, Louis 98
linear history 3, 25, 39–40, 44, 54
linear time 3, 4, 8, 25, 31, 39, 42–3, 54, 67–8, 96, 116, 142
linear writing 39–42, 44, 67–8, 142
Listener 69–70
liveness (television) 109, 114–16, 120, 133, 139–40
logistical media 88
Londe, Albert 63, 93
Lovelace, Ada 157
Low, Archibald 105–6
Mach, Ernst 99–100, 157
Mamber, Stephen 86
Man with the Flower in His Mouth 126, 128–32, 133

Marey, Étienne-Jules 21, 63–4, 82–7, 89, 90–4, 96–9, 157
Mathematical Theory of Communication 61, 62, 106, 148–9
McLuhan, Marshall 6, 10, 13, 18, 25–34, 35, 37–8, 40–2, 48, 66, 67–8, 78–9, 139–41, 160 n.2
  Hall's influence on 27, 160 n.1
  hot and cool media 132
  phonetic alphabet 30, 116, 153
  Whitehead and 45, 49, 51
mechanical television 63, 110–12, 114–16, 117–21, 125–34, 141, 144–6, 148
media archaeology 5–6, 11–12, 19, 36, 38, 70, 73, 130. See also arché
media art 56–9, 68–73
media as aesthetic 6, 10, 26–7, 35, 57–8, 65, 70, 96, 130, 141, 149, 151
media philosophy 7–8, 15–19, 23, 41–2, 47–8, 60, 143
  media theory and 15–16
  tasks for 3, 4, 15–16, 19, 30, 50, 54, 59, 154, 159
media rituals 74–8, 139–40, 142
melancholia and time 54, 73, 79, 92, 151
mirror drum cameras 131, 133, 134
misplaced concreteness 53, 101
Morse, Samuel and Wheatstone, Charles 110
multi-temporality 9–10, 29, 31, 33, 69–71, 76–9, 124–5, 134, 138, 156
Muybridge, Eadweard 21, 63, 93
near-seeing 129–30
Nipkow, Paul 144
Nipkow Disc 106, 118–21, 129, 131, 145–6
non-linear history 39, 44, 48, 58, 69, 75–6, 100, 139, 142–4
November 15, Paris 137–40
Object Oriented Ontology (OOO) 46
Ong, Walter 17, 152–3

pantelegraph 110–11, 117, 144
Parikka, Jussi 18, 36
pendulum 110, 111, 117. See also synchronization
perpetual perishing 46, 53–4, 151
philosophy and media 16–17
phonautograph 62–4, 66
phonograph 38, 62–3, 141
photoelectric cells 120, 128–31, 134
photographic revolver 98
photographic rifle 99
post-historical media 11, 15, 51, 59, 73–4, 76–7, 88–9
post-history 5, 19, 68, 85, 87, 116, 138, 139, 159
post-humanism 28–9
prehension 45–7, 53–4, 90, 115, 138, 152
printing press 29–30, 142
protocols and epistemology 36–8, 90–1, 94–5, 113, 150
The Queen's Messenger 109, 128, 133
Raider, Antoine 99
the real 17–18, 30, 46, 63, 65–6, 139
Redmond, Dennis 105–6, 112, 117–18, 125
regulating clock 111. See also synchronization
Rosing, Boris 135, 145
Ruhmer, Ernst 106, 112, 145
scan lines 34, 106, 112, 128, 130–1, 132
scanning stylus 110–11
Schlieren photography 100
seeing at a distance 130
selenium 105, 111, 115, 117–19, 120, 123, 144–5
semiotics 16, 34, 143
Senlecq, Constantin 105, 110–12, 117–18, 125
sensitivity 108, 120, 145–6
Serres, Michel 5, 9, 17, 46, 101, 122, 151, 155
Shannon, Claude 19, 35–6, 61, 149. See also transduction
  and Weaver 35–6

shutter speeds 65, 96–100, 119
Siegert, Bernhard 5, 17, 37, 66, 86, 160 n.2
Sieveking, Lance 126, 128, 134
Skaif, Thomas 98
Sloterdijk, Peter 16
Smith, Terry 55, 68–9, 79
Smith, Willoughby 111
Spencer, J.B. and Melhuish, A.J. 97
Stiegler, Bernard 24, 28, 109, 137, 157
storage camera 146–7
storage media 3, 10, 15–17, 28, 36–8, 39, 57, 79–80
  in computer history 15, 67, 157–8
  digital memory 15, 18, 36, 72–3, 74–5, 141
  film as 109, 114
  history and 3–4, 38, 101
  Leyden Jars 147
  television and 21–2, 109–10, 114–16, 145–7
  time and 2–3, 5, 10–12, 73, 73–6, 109, 145–7, 155–6
stroboscopic photography 100
Sauve qui peut (la vie) (1979) (also Every Man for Himself and Slow Motion) 91–3
A Sudden Gust of Wind (after Hokusai) 69
Swift, John 126, 129–30, 132–4
switching 14–15, 32, 35, 110, 145, 147
the symbolic 16, 17–18, 62, 66, 124, 141
symptom/sign distinction 85, 156
synchronic space 41
synchronization 21, 32, 79, 84, 99, 103, 105–9, 111, 117–21, 125, 128, 131, 144–5, 148. See also regulating clock
synthetic media 2–3, 11–12, 29–30, 58, 61, 64–5, 92, 98, 111, 129, 147–8, 156
Taylor, Frederick 84, 90, 115
Taylorism 87, 110, 148. See also Chronophotography, time motion studies

technical image 39–42, 57–8, 71, 85–7, 90, 124, 139
  history and 42, 48, 81, 100–1, 124, 142–4
  photography as 40–1, 44
  television as 147
technologized memory 15, 28–9, 87, 109–10, 141. See also storage media
technology as extension of man 27–9, 31, 33, 35, 39
Telectroscope 105
television close-up 130, 131–2
TeleVista 105
televisual flow 12, 109–10, 113–14, 133, 137–8, 158
temporality of viewing 9–10, 22, 70–1, 74–6, 77–9, 109–10, 116, 151
10ms-1, 69
terrorism and television 137–40, 142–3
TH.2058, 72–3
time shifting (television) 114, 115, 156
Toepler, August 100
togetherness and media technology 39, 49–50, 122–4, 129, 134, 139–40, 158
transduction 3, 11, 15–17, 23–4, 36, 45, 89, 108, 129, 145–52, 158
  Shannon and 148–9
  Simondon and 26, 149–51
  technical definition of 18
  time and 29–31, 50, 123–5, 135
transmission media 2–3, 6, 10–11, 15, 17–19, 36–8, 45, 103, 105–10, 130
  media philosophy and 2, 15–17, 23, 112–13
  time and 12, 21, 50–2, 73, 87, 92, 114–17, 124, 129, 137–8, 147–9
Turing, Alan 4, 57, 157
24 Hour Psycho 69
typewriter 18, 38

vacuum tubes 15, 135, 146
Virilio, Paul 5, 109, 157
von Neumann, John 15, 57, 59, 147, 158
waiting time 66, 69, 106, 138–40, 142, 144, 147
Wall, Jeff 69–70, 73, 77, 159
Wark, McKenzie 153–4
Webb, Richard 146
The Westinghouse Electric Company 145–6
Whitehead, Alfred North 20, 23–4, 30, 38, 45–54, 89–90, 108, 112–13, 121–3, 134, 150–1
  atomism 47, 53
  contemporaneity and 46–7, 77, 151
  Kittler and 51, 89, 122
  language and 51–2, 88–9, 112, 124–5
  McLuhan and 25–6, 45, 49, 51, 122
  media philosophy and 6, 46, 47–9, 50, 112, 123
  technology and 7, 45–6, 89, 112–13
Williams, Raymond 103, 109, 113
  and McLuhan 6
world picturing 144, 151
writing as a technology 12, 17–18, 25, 28–9, 32, 37–8, 39
  history and 25, 29, 31–3, 44, 60, 142
YouTube 21, 59, 73–4, 114, 115, 137, 142, 155–6
  history and 21, 87, 110, 116, 139
  sacrifice on 73–8
Zeno 101
Zielinski, Siegfried 8, 17, 65, 73, 78, 86, 94–5, 110, 113, 127, 129–30
Zuse, Konrad 13–14