Conservation of Time-Based Media Art (ISBN 9780367460426, 9781032343785, 9781003034865)



English · [579] pages · 2023


Table of contents:
Cover
Half Title
Series
Title
Copyright
Contents
List of Figures
List of Tables
Preface and Acknowledgments
Part I Caring for Time-Based Media Art
1 Implementing Time-Based Media Art Conservation in Museum Practice
1.1 Where to Begin
1.2 The Consequences and Costs of Inaction
1.3 It Takes a Village: Organizing In-House
1.4 Finding the Workforce
1.5 Creating a Workplace
1.6 Engaging in Networks of Practice Development
1.7 Conclusion
2 Theories of Time-Based Media Art Conservation: From Ontologies to Ecologies
2.1 Introduction
2.2 Ontologies
2.2.1 Integrities
2.2.2 Work-Defining Properties
2.2.3 Between Instructions and Manifestations
2.2.4 Differential Approaches to Conservation
2.2.5 Medium-Independent Behaviors
2.3 Ecologies
2.3.1 Institutions and Networks
2.3.2 Doing Installation Art
2.3.3 Biographies
2.3.4 Producing Permanence
2.4 Conclusion
3 A Roundtable: Curatorial Perspectives on Collecting Time-Based Media Art
3.1 Moving from Technical Risk to the Importance of Curatorial Knowledge
3.2 The Role of the Curator in Preservation Matters
3.3 Collaboration Beyond the Institution
4 Institutional Assessments and Collection Surveys for Time-Based Media Conservation
4.1 Introduction to TBM Conservation Assessments and Surveys
4.2 Foundational Methodologies for Conservation Assessments and Surveys for Cultural Heritage Collections
4.2.1 Conservation Assessments
4.2.2 Collection Surveys
4.3 Frameworks for TBM Conservation Institutional Assessments and Collection Surveys
4.3.1 TBM Conservation Institutional Assessments
4.3.2 TBM Conservation Collection Surveys
4.3.3 Organizing the Institutional Assessment or Collection Survey
4.4 Conducting a TBM Institutional Assessment
4.4.1 Phase 1: Information-Gathering Prior to the Assessment
4.4.2 Phase 2: Interviews with Key Staff Members
4.4.3 Phase 3: Examination of Museum Spaces, Storage, and Infrastructure
4.4.4 Phase 4: Crafting the Final Institutional Assessment Report
4.4.5 Phase 5: Follow-up Visit
4.5 Conducting a TBM Collection Survey
4.5.1 Format Inventory and Identification of Migration Needs
4.5.2 Storage and Housing Survey
4.5.3 Documentation Survey
4.5.4 Equipment Inventory
4.5.5 Registration and Cataloging
4.5.6 Digital Storage
4.5.7 Condition Assessment
4.5.8 Treatment Prioritization
4.5.9 Crafting the Final Collections Survey Report
4.6 Next Steps
4.7 Conclusion
Appendix 4.1: Artwork Summary Template Used by the Authors for the 2017–18 Met TBM Collection Survey
5 Outside the Institution: Crossing the Boundaries of Communities and Disciplines to Preserve Time-Based Media
5.1 Introduction
5.2 Community Archiving Workshop (CAW)
5.2.1 Partnerships and Museum Relationships
5.2.2 Workflows
5.2.3 Impact on Communities and Collections
5.3 Audiovisual Preservation Exchange (APEX)
5.3.1 Archives and Artist Improvisations
5.3.2 Partnerships and Exchange
5.3.3 Principles of Collaboration
5.4 XFR Collective
5.4.1 Origins and Motivations
5.4.2 Projects
5.4.3 Impact on Communities and Collections
5.5 Conclusion
6 The Role of Advocacy in Media Conservation
6.1 Introduction
6.2 Why Is It a Problem?
6.3 What Is the Problem?
6.4 You Have a Problem—So What?
6.5 Make the Problem a Joint Priority
6.6 Turn the Problem into a Project
6.7 Articulate the Project
6.8 Summary
Part II Building a Workplace
7 Building a Time-Based Media Conservation Lab: A Survey and Practical Guide, from Minimum Requirements to Dream Lab
7.1 Introduction: TBM Lab Fundamentals
7.2 Space and Requirements
7.3 Purpose-Built Labs
7.4 Medium-Specific Overview
7.4.1 Digital Workstation
7.4.2 Tape-Based Media Workstation
7.4.3 Film and Slide Tools
7.4.4 Display Equipment and General Tools
7.5 Priorities and Advocacy
7.6 Sustaining a TBM Lab
Appendix 7.1 Lab Equipment Lists Compiled by Kate Lewis
8 Digital Storage for Artworks: Theory and Practice
8.1 Introduction
8.2 What Is a Repository?
8.3 Repository Models
8.4 Unique Needs of Artworks
8.5 Minimum Requirements
8.6 Deploying the Digital Repository
8.7 Automation
8.8 Conclusion
9 Staffing and Training in Time-Based Media Conservation
9.1 Context
9.2 Staffing Models and the Conservator
9.3 What Skills and Knowledge?
9.4 Who Has Responsibility?
9.5 How? Education and Training
9.5.1 Entry-Level Opportunities
9.5.2 Continual Professional Development
9.6 The Future
10 A Roundtable: Implementing Cross-Departmental Workflows at SFMOMA
10.1 Introduction
10.2 In Response to a Need
10.3 The Value of Collaboration
10.4 A Certain Level of Discomfort
10.5 Honoring a Collective Expertise
10.6 The Role of the Media Conservator
10.7 Looking Ahead
Part III Cross-Medium Practices in Time-Based Media Conservation
11 Documentation as an Acquisition and Collection Tool for Time-Based Media Artworks
11.1 Introduction
11.1.1 The Role of Documentation in Conservation
11.1.2 Evolution of Documentation Practices in Time-Based Media Conservation
11.1.3 Emerging Practice
11.2 Time-Based Media Documentation Stages
11.2.1 Pre-acquisition
11.2.2 Acquisition
11.2.3 Exhibition
11.2.4 Loans
11.2.5 Conservation Interventions
11.2.6 High-Level Documentation: Policies, Strategies, Guidelines, and Workflows
11.2.7 Integrated Approach to Documentation
11.3 Conclusion
12 Inventory and Database Registration of Time-Based Media Art
12.1 The Collection Information Environment
12.1.1 Digital Spreadsheets
12.1.2 Desktop Databases
12.1.3 Collection Management Systems
12.2 Cataloging Time-Based Media Art
12.2.1 Track Every Component
12.2.2 Reflect the Status of Components
12.2.3 Note Locations, Also Digital Ones
12.2.4 Note Relationships Between Components
12.3 Inventory
12.3.1 Defining the Categories
12.3.2 Selecting the Documentation System
12.3.3 Enriching the Record
12.3.4 Updating the Record
12.4 Time-Based Media Numbering Within a Collection Management System
12.4.1 Keep It Flexible
12.4.2 Integrate with the Existing System
12.4.3 Create a Comprehensive Numbering System
12.5 Beyond the Database: Where to Store What Doesn’t Fit?
12.6 Conclusions
13 Digital Preservation and the Information Package
13.1 Introduction: Preservation and the Past, Present, and Future
13.2 Defining the Information Package: The OAIS Reference Model and BagIt
13.3 Benefits of Information Packages
13.4 Data Ingest and Acquisition
13.4.1 Acquisition
13.4.2 File Transfer
13.5 OAIS Information Packages
13.5.1 The Submission Information Package
13.5.2 The Archival Information Package
13.6 Conclusion
14 Disk Imaging as a Backup Tool for Digital Objects
14.1 Disk Imaging in Media Conservation
14.2 Use Cases for Disk Images
14.2.1 Disk Image as an Archival Copy
14.2.2 Disk Image as a Tool for Artwork Examination
14.2.3 Disk Image as a Backup for Exhibition
14.2.4 Disk Image as a Deliverable for Loans
14.3 What to Disk-Image
14.4 Necessary Hardware, Software, and Tools for Disk Imaging
14.4.1 Preventing Damage to Hardware or Data
14.4.2 Disk Imaging Workstation
14.4.3 Software Choices for Creating Images
14.5 Disk Imaging Formats and Selection Criteria
14.5.1 Sustainability
14.5.2 Compression
14.5.3 Metadata
14.6 Review of Formats
14.7 Disk Image Metadata
14.8 Sample Disk Imaging Workflow
14.9 The Evolving Practice of Disk Imaging
15 Managing and Storing Artwork Equipment in Time-Based Media Art
15.1 Introduction
15.2 Change and Assessing Significance
15.3 Levels of Significance
15.4 Documentation
15.5 Sources of Equipment
15.6 Storage
15.7 Technical Care
15.7.1 Who Undertakes Technical Care?
15.7.2 Loan
15.7.3 Strategies for Technical Care
15.7.4 End of Life
15.8 Conclusion
16 The Installation of Time-Based Media Artworks
16.1 Introduction
16.2 Defining Artworks by Spatial, Aesthetical, and Technical Requirements
16.3 Understanding the Space Requirements of the Artwork and the Requirements of the Exhibition Space
16.4 Determining the Aesthetics of an Artwork’s Display
16.5 Integrating the Artwork in the Space
16.6 The Technical Realization of Time-Based Media Artworks
16.7 Creating a Technical Rider for New Works and Commissions
17 A Roundtable: Collaborating With Media Artists to Preserve Their Art
17.1 Introduction
17.2 Is Media Art Intended to Last?
17.3 Transferring Ownership by Transferring Knowledge
17.4 Gathering Distributed Knowledge
17.5 The Significance of Exhibition for the Preservation of Media Artwork
17.6 When Artists Maintain Their Own Work
17.7 The Professionalization of Media Conservation
Appendix 17.1 Bitforms Gallery Collector Agreement
Part IV Medium-Specific Practices in Time-Based Media Conservation
18 Caring for Analog and Digital Video Art
18.1 Introduction
18.2 A Brief Overview of Video as a Medium in Artists’ Practice
18.3 The Beginnings of Collecting and Preserving Video
18.4 Collection Practices Today
18.4.1 Pre-acquisition Process: Defining Deliverables with the Artist
18.4.2 Pre-acquisition Process: Identifying the Significance of Technologies
18.4.3 Acquisition Process: The Evaluation of Artist-Provided Materials
18.5 Acquisition Process: The Condition Assessment of Video Materials
18.5.1 Acquisition Process: Condition Assessment of Analog and Digital Videotapes
18.5.2 Acquisition Process: Condition Assessment of Digital, File-Based Video
18.6 Digitization as a Preservation Strategy
18.6.1 Digitization as a Key Moment in the Life of an Artwork
18.6.2 Preparing for Digitization
18.6.3 Choosing the Right Equipment for Digitization
18.6.4 Time Base Correctors
18.6.5 Determining the Target Formats for Digitization
18.6.6 Maintaining the Sound Layout During Digitization
18.6.7 Documentation of Digitization
18.7 Migrating Digital Video
18.7.1 Review and Analysis
18.7.2 Non-invasive (Non-transcode) Metadata Adjustments
18.7.3 Transcoding and Compression
18.7.4 Digital Storage of File-Based Video
18.7.5 Storing Tapes, Discs, and Drives
18.8 A Closer Look: The Technical Makeup of Video
18.8.1 Basics of the Video Signal
18.8.2 Color
18.8.3 Videotape
18.8.4 Analog to Digital Video
18.8.5 Digital Video
18.8.6 Video Format Identification
18.9 Video Hardware
18.9.1 Analog and Digital Display Technologies
18.9.2 Analog and Digital Playback Technologies
18.9.3 Ancillary Devices
18.9.4 Analog and Digital Signal Connections
18.9.5 Sound Systems
18.10 Preparing Video Artworks for Exhibition
18.10.1 Preparing the Equipment for Exhibition
18.10.2 Preparing the Audiovisual Media for Exhibition
18.10.3 Exhibition Documentation
18.11 Conclusion
19 Sound in Time-Based Media Art
19.1 Introduction
19.2 Physics—The Nature of Sound
19.2.1 Sound as Space
19.2.2 Pitch, Timbre, and Noise
19.2.3 Complex Sound and Sonic Cancellation
19.3 Psychoacoustics—The Experience of Sound
19.3.1 The Perception of Sound Is Relative
19.3.2 Balancing Sonic Environments in an Exhibition Context
19.4 Recording Sound
19.4.1 Analog versus Digital Recording
19.4.2 Analog Audio on Magnetic Tape and Vinyl
19.4.3 Sample Rate and Bit Depth in Digital Audio
19.4.4 Wordclock in Analog-to-Digital and Digital-to-Analog Conversion
19.4.5 Digital Audio Formats
19.5 Reproducing Sound
19.5.1 The Importance of Sample Rates
19.5.2 Getting the Audio to Speakers
19.5.3 Frequency Range
19.5.4 Active and Passive Speakers—Balanced, Unbalanced, and Digital Cables
19.5.5 Choosing Speakers That Reflect the Artist’s Intentions
19.6 Exhibiting Sound
19.6.1 Reducing Sound Reflections
19.6.2 Considering the Space and Location
19.6.3 Tailoring the Sound to the Space
19.6.4 Documenting the Intended Sound Experience
19.6.5 Maintaining the Work During Exhibition
19.6.6 Knowing the Artwork
20 Caring for Analog and Digital Film-Based Art
20.1 Introduction
20.2 A Brief Overview of Film as a Medium in Artists’ Practice
20.3 Collecting Artists’ Film
20.4 What Is Analog Film?
20.4.1 Material Composition
20.4.2 The Basics of Film Printing and Duplication
20.4.3 Manufacturing
20.5 Analog Film Identification, Inspection, and Documentation
20.5.1 Gauge
20.5.2 Base
20.5.3 Film Wind
20.5.4 Types of Film Stocks
20.5.5 Date Codes and Edge Printing
20.5.6 Optical Soundtracks
20.5.7 Magnetic Soundtracks
20.5.8 Base Deterioration
20.5.9 Color Dye Fading
20.5.10 Film Damage/Footage/Splice Count
20.5.11 Identifying Original, Master, and Print Materials
20.6 Duplication and Digitization
20.6.1 Preliminary Steps
20.6.2 Film-to-Film Duplication/Photochemical Film Preservation
20.6.3 Film-to-Digital Duplication/Digitization
20.6.4 Hybrid Approaches: Digital Source Output to Analog Film
20.6.5 Working with Film Labs
20.6.6 Quality Control
20.7 Storage
20.7.1 Film Storage
20.7.2 Digital Storage
20.8 Exhibiting Film-Based Art
20.8.1 Screening Room Considerations and Design
20.8.2 Film Projector Types and Anatomy
20.8.3 Loopers and Gallery Display
20.8.4 Storage, Maintenance, and Long-Term Care of Film Projectors
20.8.5 Digital Cinema Projectors
20.8.6 Analog and Digital Film Loans
20.9 Conclusion
Appendix 20.1 Anthology Film Archives—Film Inspection Report
21 Caring for Slide-Based Artworks
21.1 Introduction
21.2 History of the Slide Projector Show
21.3 Acquisition of Slide-Based Artworks
21.3.1 Components
21.3.2 Cataloging Components
21.4 Examination and Condition Assessment
21.5 Duplicating Slides
21.5.1 Film-to-Film Slide Duplication
21.6 Digitization of Slides
21.6.1 Digitizing with Scanners
21.6.2 Digitizing with Digital Cameras
21.7 Film Recorders: Re-output Back to Film from Digital Files
21.8 Working with Photo Labs
21.9 Slide Mounts
21.10 Storage
21.11 Projectors
21.12 Synchronized Slide Shows
21.13 Fading of Slides and Estimating When to Change Sets During Display
21.14 Conclusion
22 Caring for Software- and Computer-Based Art
22.1 Introduction
22.2 Collecting Software- and Computer-Based Art
22.2.1 What Is the Artwork? Understanding the Intended Experience
22.2.2 Researching the Meaning, Making, History, and Context
22.2.3 Identifying the Anatomy of the Work
22.2.4 Artwork Inspection
22.2.5 Conducting Risk Assessment for Preservation
22.2.6 Collecting Preservation-Relevant Components, Information, and Rights
22.3 Understanding Your Software-Based Artwork
22.3.1 What Is Hardware?
22.3.2 Analyzing and Documenting Hardware
22.3.3 Hardware Failure and Obsolescence
22.3.4 What Is Software?
22.3.5 Types of Software
22.3.6 Analyzing and Documenting Software
22.3.7 Software Obsolescence
22.3.8 Programming Languages and Platforms and Their Preservation Implications
22.3.9 Specific Artwork Genres and Preservation Considerations
22.4 Preservation and Treatment of Software-Based Art
22.4.1 Introduction to Environment Intervention Strategies
22.4.2 Environment Maintenance
22.4.3 Environment Migration
22.4.4 Environment Emulation
22.4.5 Choosing an Environment Intervention Approach
22.4.6 Emulation in Practice
22.4.7 Introduction to Code Intervention
22.4.8 Code Migration
22.5 Conclusion
23 A Word About Performance Art
23.1 Introduction
23.2 Time-Based Media and Performance Art
23.3 Performance Art and the Museum
23.4 Perspectives on the Conservation of Performance Art
23.5 Conclusion: From Emerging Practice to Sustained Development
List of Contributors
Index

CONSERVATION OF TIME-BASED MEDIA ART

Conservation of Time-Based Media Art is the first book to take stock of the current practices and conceptual frameworks that define the emerging field of time-based media conservation, which focuses on contemporary artworks that contain video, audio, film, slides, or software components. Written and compiled by a diverse group of time-based media practitioners around the world, including conservators, curators, registrars, and technicians, among others, this volume offers a comprehensive survey of specialized practices that have developed around the collection, preservation, and display of time-based media art.

Divided into 23 chapters with contributions from 36 authors and 85 additional voices, the narrative of this book provides both an overview and detailed guidance on critical topics, including the acquisition, examination, documentation, and installation of time-based media art; cross-medium and medium-specific treatment approaches and methods; the registration, storage, and management of digital and physical artwork components; collection surveys and project advocacy; lab infrastructures; staffing; and the institutional implementation of time-based media conservation.

Conservation of Time-Based Media Art serves as a critical resource for conservation students and for a diverse professional audience who engage with time-based media art, including conservation practitioners and other collection caretakers, curators, art historians, collectors, gallerists, artists, scholars, and academics.

Deena Engel, Clinical Professor Emerita of the Department of Computer Science at the Courant Institute of Mathematical Sciences at New York University, focuses her research on contemporary art. The winner of four teaching awards, she has experience teaching undergraduate computer science courses on web and database technologies, as well as Digital Humanities courses for graduate students. Deena is Co-Director, along with Prof. Glenn Wharton, of the Artist Archives Initiative, and she also works with major museums on projects that address the challenges involved in the conservation of time-based media art. She has an MA from SUNY-Binghamton in Comparative Literature and Literary Translation and an MS in Computer Science from the Courant Institute of Mathematical Sciences.

Joanna Phillips is a time-based media conservator and Director of the Düsseldorf Conservation Center in Germany. She was previously Senior Conservator of Time-Based Media at the Solomon R. Guggenheim Museum in New York (2008–19), where she launched the first media conservation lab in a US museum, implemented time-based media conservation practices, and headed the Conserving Computer-Based Art (CCBA) initiative. She co-founded the AIC conference series TechFocus and lectures and publishes on TBM topics internationally. As a researcher in the Swiss project AktiveArchive, she co-authored the Compendium of Image Errors in Analog Video (2013). Phillips earned her MA in Paintings Conservation from the Dresden Academy of Fine Arts, Germany (2002).

ROUTLEDGE SERIES IN CONSERVATION AND MUSEOLOGY

Titles in this classic series, published by the world-leading publisher of Museum & Heritage Studies and Conservation books, present the very latest research and practice from the fields of conservation and museology. With contributions from international experts, titles in the series will be relevant to researchers, practitioners and students working in the fields of conservation, museum and heritage studies, and art history. Titles published within the series include:

CHEMICAL PRINCIPLES OF TEXTILE CONSERVATION
Agnes Timar-Balazsy and Dinah Eastop

CONSERVATION OF FURNITURE
Shayne Rivers and Nick Umney

THE HISTORY OF GAUGED BRICKWORK
Conservation, Repair and Modern Application
Gerard Lynch

A PRACTICAL GUIDE TO COSTUME MOUNTING
Lara Flecker

CONSERVATION OF EASEL PAINTINGS, 2ND EDITION
Edited by Joyce Hill Stoner and Rebecca Rushfield

CONSERVATION OF TIME-BASED MEDIA ART
Edited by Deena Engel and Joanna Phillips

For more information on this series, visit: www.routledge.com/Routledge-Series-in-Conservation-and-Museology/book-series/CONS

CONSERVATION OF TIME-BASED MEDIA ART

Edited by Deena Engel and Joanna Phillips

Cover image: 3D rendering for the nine-channel video projection 'Ten Thousand Waves' (2013) by Isaac Julien as installed at the Museum of Modern Art, New York, in 2013. Rendering: Tom Cullen.

First published 2023 by Routledge, 4 Park Square, Milton Park, Abingdon, Oxon OX14 4RN, and by Routledge, 605 Third Avenue, New York, NY 10158.

Routledge is an imprint of the Taylor & Francis Group, an informa business.

© 2023 selection and editorial matter, Deena Engel and Joanna Phillips; individual chapters, the contributors.

The right of Deena Engel and Joanna Phillips to be identified as the authors of the editorial material, and of the authors for their individual chapters, has been asserted in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library.

Library of Congress Cataloging-in-Publication Data
Names: Engel, Deena, editor. | Phillips, Joanna (Art restorer), editor.
Title: Conservation of time-based media art / edited by Deena Engel and Joanna Phillips.
Description: Abingdon, Oxon : Routledge, 2023.
Identifiers: LCCN 2022018149 (print) | LCCN 2022018150 (ebook) | ISBN 9780367460426 (hardback) | ISBN 9781032343785 (paperback) | ISBN 9781003034865 (ebook)
Subjects: LCSH: New media art—Conservation and restoration. | Time-based art—Conservation and restoration.
Classification: LCC NX456.5.N49 C665 2023 (print) | LCC NX456.5.N49 (ebook) | DDC 709.04/07—dc23/eng/20220810
LC record available at https://lccn.loc.gov/2022018149
LC ebook record available at https://lccn.loc.gov/2022018150

ISBN: 978-0-367-46042-6 (hbk)
ISBN: 978-1-032-34378-5 (pbk)
ISBN: 978-1-003-03486-5 (ebk)

DOI: 10.4324/9781003034865

Typeset in Bembo by Apex CoVantage, LLC

CONTENTS

List of Figuresxvii List of Tablesxxi Preface and Acknowledgments xxiii Deena Engel and Joanna Phillips PART I

Caring for Time-Based Media Art

1

  1 Implementing Time-Based Media Art Conservation in Museum Practice Joanna Phillips

3

1.1 1.2 1.3 1.4 1.5 1.6 1.7

Where to Begin  4 The Consequences and Costs of Inaction  5 It Takes a Village: Organizing In-House  6 Finding the Workforce  7 Creating a Workplace  11 Engaging in Networks of Practice Development  11 Conclusion 12

  2 Theories of Time-Based Media Art Conservation: From Ontologies to Ecologies Renée van de Vall 2.1 2.2

Introduction 13 Ontologies 14 2.2.1 Integrities 15 2.2.2 Work-Defining Properties  15 2.2.3 Between Instructions and Manifestations  17 2.2.4 Differential Approaches to Conservation  19 2.2.5 Medium-Independent Behaviors  21 v

13

Contents

2.3

2.4

Ecologies 23 2.3.1 Institutions and Networks  23 2.3.2 Doing Installation Art  24 2.3.3 Biographies 24 2.3.4 Producing Permanence  25 Conclusion 26

  3 A Roundtable: Curatorial Perspectives on Collecting Time-Based Media Art Annet Dekker in conversation with Karen Archey, Ulanda Blair, Sarah Cook, Ana Gonçalves Magalhães, Sabine Himmelsbach, Kelani Nichole, Christiane Paul, and Henna Paunu 3.1 3.2 3.3

Moving from Technical Risk to the Importance of Curatorial Knowledge 29 The Role of the Curator in Preservation Matters  32 Collaboration Beyond the Institution  35

  4 Institutional Assessments and Collection Surveys for Time-Based Media Conservation Lia Kramer, Alexandra Nichols, Mollie Anderson, Nora Kennedy, Lorena Ramírez López, and Glenn Wharton 4.1 4.2

4.3

4.4

4.5

28

Introduction to TBM Conservation Assessments and Surveys  39 Foundational Methodologies for Conservation Assessments and Surveys for Cultural Heritage Collections  40 4.2.1 Conservation Assessments  40 4.2.2 Collection Surveys  42 Frameworks for TBM Conservation Institutional Assessments and Collection Surveys  43 4.3.1 TBM Conservation Institutional Assessments  43 4.3.2 TBM Conservation Collection Surveys  44 4.3.3 Organizing the Institutional Assessment or Collection Survey 45 Conducting a TBM Institutional Assessment  48 4.4.1 Phase 1: Information-Gathering Prior to the Assessment 48 4.4.2 Phase 2: Interviews with Key Staff Members  48 4.4.3 Phase 3: Examination of Museum Spaces, Storage, and Infrastructure 49 4.4.4 Phase 4: Crafting the Final Institutional Assessment Report 49 4.4.5 Phase 5: Follow-up Visit  52 Conducting a TBM Collection Survey  53 4.5.1 Format Inventory and Identification of Migration Needs 54 vi

39

Contents

4.5.2 4.5.3 4.5.4 4.5.5 4.5.6 4.5.7 4.5.8 4.5.9

Storage and Housing Survey  55 Documentation Survey  56 Equipment Inventory  56 Registration and Cataloging  56 Digital Storage  57 Condition Assessment  57 Treatment Prioritization  58 Crafting the Final Collections Survey Report 58 4.6 Next Steps  61 4.7 Conclusion 62 Appendix 4.1: Artwork Summary Template Used by the Authors for the 2017–18 Met TBM Collection Survey  65   5 Outside the Institution: Crossing the Boundaries of Communities and Disciplines to Preserve Time-Based Media Mona Jimenez, Kristin MacDonough, and Martha Singer 5.1 5.2

5.3

5.4

5.5

Introduction 68 Community Archiving Workshop (CAW)  70 5.2.1 Partnerships and Museum Relationships  71 5.2.2 Workflows 73 5.2.3 Impact on Communities and Collections  73 Audiovisual Preservation Exchange (APEX)  74 5.3.1 Archives and Artist Improvisations  77 5.3.2 Partnerships and Exchange  78 5.3.3 Principles of Collaboration  79 XFR Collective  80 5.4.1 Origins and Motivations  83 5.4.2 Projects 84 5.4.3 Impact on Communities and Collections  84 Conclusion 85

  6 The Role of Advocacy in Media Conservation Jim Coddington 6.1 6.2 6.3 6.4 6.5 6.6 6.7 6.8

67

Introduction 89 Why Is It a Problem?  90 What Is the Problem?  90 You Have a Problem—So What?  90 Make the Problem a Joint Priority  90 Turn the Problem into a Project  91 Articulate the Project  92 Summary 92

vii

89

Contents PART II

Building a Workplace

93

  7 Building a Time-Based Media Conservation Lab: A Survey and Practical Guide, from Minimum Requirements to Dream Lab Kate Lewis

95

7.1 7.2 7.3 7.4

Introduction: TBM Lab Fundamentals  96 Space and Requirements  97 Purpose-Built Labs  100 Medium-Specific Overview  102 7.4.1 Digital Workstation  102 7.4.2 Tape-Based Media Workstation  103 7.4.3 Film and Slide Tools  104 7.4.4 Display Equipment and General Tools  104 7.5 Priorities and Advocacy  104 7.6 Sustaining a TBM Lab  106 Appendix 7.1 Lab Equipment Lists Compiled by Kate Lewis  108   8 Digital Storage for Artworks: Theory and Practice Amy Brost 8.1 8.2 8.3 8.4 8.5 8.6 8.7 8.8

Introduction 111 What Is a Repository?  112 Repository Models  114 Unique Needs of Artworks  118 Minimum Requirements  120 Deploying the Digital Repository  126 Automation 128 Conclusion 131

  9 Staffing and Training in Time-Based Media Conservation Louise Lawson 9.1 9.2 9.3 9.4 9.5 9.6

111

Context 139 Staffing Models and the Conservator  140 What Skills and Knowledge?  142 Who Has Responsibility?  144 How? Education and Training  145 9.5.1 Entry-Level Opportunities  150 9.5.2 Continual Professional Development  150 The Future  151

viii

139

Contents

10 A Roundtable: Implementing Cross-Departmental Workflows at SFMOMA Martina Haidvogl in conversation with Michelle Barger, Joshua Churchill, Steve Dye, Rudolf Frieling, Mark Hellar, Jill Sterrett, Grace T. Weiss, Layna White, and Tanya Zimbardo 10.1 10.2 10.3 10.4 10.5 10.6 10.7

154

Introduction 154 In Response to a Need  156 The Value of Collaboration  158 A Certain Level of Discomfort  159 Honoring a Collective Expertise  160 The Role of the Media Conservator  161 Looking Ahead  161

PART III

Cross-Medium Practices in Time-Based Media Conservation

163

11 Documentation as an Acquisition and Collection Tool for Time-Based Media Artworks Patricia Falcão, Ana Ribeiro, and Francesca Colussi

165

11.1 Introduction 165 11.1.1 The Role of Documentation in Conservation  166 11.1.2 Evolution of Documentation Practices in Time-Based Media Conservation  166 11.1.3 Emerging Practice  167 11.2 Time-Based Media Documentation Stages  168 11.2.1 Pre-acquisition 169 11.2.2 Acquisition 169 11.2.3 Exhibition 174 11.2.4 Loans 177 11.2.5 Conservation Interventions  177 11.2.6 High-Level Documentation: Policies, Strategies, Guidelines, and Workflows  178 11.2.7 Integrated Approach to Documentation  178 11.3 Conclusion 181 12 Inventory and Database Registration of Time-Based Media Art Martina Haidvogl and Linda Leckart 12.1 The Collection Information Environment  185 12.1.1 Digital Spreadsheets  186 12.1.2 Desktop Databases  186 12.1.3 Collection Management Systems  186

ix

185

Contents

12.2 Cataloging Time-Based Media Art  186 12.2.1 Track Every Component  187 12.2.2 Reflect the Status of Components  187 12.2.3 Note Locations, Also Digital Ones  188 12.2.4 Note Relationships Between Components  188 12.3 Inventory 188 12.3.1 Defining the Categories  188 12.3.2 Selecting the Documentation System  189 12.3.3 Enriching the Record  189 12.3.4 Updating the Record  189 12.4 Time-Based Media Numbering Within a Collection Management System 190 12.4.1 Keep It Flexible  190 12.4.2 Integrate with the Existing System  190 12.4.3 Create a Comprehensive Numbering System  192 12.5 Beyond the Database: Where to Store What Doesn’t Fit?  193 12.6 Conclusions 194 13 Digital Preservation and the Information Package Nicole Martin

196

13.1 Introduction: Preservation and the Past, Present, and Future  196 13.2 Defining the Information Package: The OAIS Reference Model and BagIt 197 13.3 Benefits of Information Packages  198 13.4 Data Ingest and Acquisition  199 13.4.1 Acquisition 199 13.4.2 File Transfer  199 13.5 OAIS Information Packages  200 13.5.1 The Submission Information Package  200 13.5.2 The Archival Information Package  201 13.6 Conclusion 203 14 Disk Imaging as a Backup Tool for Digital Objects Eddy Colloton, Jonathan Farbowitz, and Caroline Gil Rodríguez 14.1 Disk Imaging in Media Conservation  204 14.2 Use Cases for Disk Images  205 14.2.1 Disk Image as an Archival Copy  205 14.2.2 Disk Image as a Tool for Artwork Examination  206 14.2.3 Disk Image as a Backup for Exhibition  206 14.2.4 Disk Image as a Deliverable for Loans  206 14.3 What to Disk-Image  207 14.4 Necessary Hardware, Software, and Tools for Disk Imaging  207 x

204

Contents

14.5

14.6 14.7 14.8 14.9

14.4.1 Preventing Damage to Hardware or Data 209 14.4.2 Disk Imaging Workstation 210 14.4.3 Software Choices for Creating Images 211 Disk Imaging Formats and Selection Criteria 213 14.5.1 Sustainability 213 14.5.2 Compression 214 14.5.3 Metadata 214 Review of Formats 215 Disk Image Metadata 215 Sample Disk Imaging Workflow 218 The Evolving Practice of Disk Imaging 219

15 Managing and Storing Artwork Equipment in Time-Based Media Art Duncan Harvey 15.1 15.2 15.3 15.4 15.5 15.6 15.7

15.8

Introduction 223 Change and Assessing Significance 224 Levels of Significance 225 Documentation 228 Sources of Equipment 230 Storage 232 Technical Care 233 15.7.1 Who Undertakes Technical Care? 233 15.7.2 Loan 236 15.7.3 Strategies for Technical Care 237 15.7.4 End of Life 237 Conclusion 237

16 The Installation of Time-Based Media Artworks Tom Cullen 16.1 16.2 16.3 16.4 16.5 16.6 16.7

223

239

Introduction 239 Defining Artworks by Spatial, Aesthetical, and Technical Requirements 240 Understanding the Space Requirements of the Artwork and the Requirements of the Exhibition Space 240 Determining the Aesthetics of an Artwork’s Display 242 Integrating the Artwork in the Space 243 The Technical Realization of Time-Based Media Artworks 244 Creating a Technical Rider for New Works and Commissions 245

17 A Roundtable: Collaborating With Media Artists to Preserve Their Art 249
Joanna Phillips and Deena Engel in conversation with Lauren Cornell, Mark Hellar, Diego Mellado, Steven Sacks, Lena Stringari, Siebren Versteeg, and Gaby Wijers
17.1 Introduction 249
17.2 Is Media Art Intended to Last? 250
17.3 Transferring Ownership by Transferring Knowledge 253
17.4 Gathering Distributed Knowledge 257
17.5 The Significance of Exhibition for the Preservation of Media Artwork 258
17.6 When Artists Maintain Their Own Work 259
17.7 The Professionalization of Media Conservation 260
Appendix 17.1 Bitforms Gallery Collector Agreement 262

PART IV
Medium-Specific Practices in Time-Based Media Conservation 267

18 Caring for Analog and Digital Video Art 269
Agathe Jarczyk and Peter Oleksik
18.1 Introduction 269
18.2 A Brief Overview of Video as a Medium in Artists’ Practice 269
18.3 The Beginnings of Collecting and Preserving Video 271
18.4 Collection Practices Today 274
18.4.1 Pre-acquisition Process: Defining Deliverables with the Artist 274
18.4.2 Pre-acquisition Process: Identifying the Significance of Technologies 275
18.4.3 Acquisition Process: The Evaluation of Artist-Provided Materials 276
18.5 Acquisition Process: The Condition Assessment of Video Materials 276
18.5.1 Acquisition Process: Condition Assessment of Analog and Digital Videotapes 277
18.5.2 Acquisition Process: Condition Assessment of Digital, File-Based Video 278
18.6 Digitization as a Preservation Strategy 279
18.6.1 Digitization as a Key Moment in the Life of an Artwork 280
18.6.2 Preparing for Digitization 281
18.6.3 Choosing the Right Equipment for Digitization 281
18.6.4 Time Base Correctors 282
18.6.5 Determining the Target Formats for Digitization 282
18.6.6 Maintaining the Sound Layout During Digitization 283
18.6.7 Documentation of Digitization 283
18.7 Migrating Digital Video 283
18.7.1 Review and Analysis 283
18.7.2 Non-invasive (Non-transcode) Metadata Adjustments 287
18.7.3 Transcoding and Compression 288
18.7.4 Digital Storage of File-Based Video 289
18.7.5 Storing Tapes, Discs, and Drives 289
18.8 A Closer Look: The Technical Makeup of Video 290
18.8.1 Basics of the Video Signal 290
18.8.2 Color 290
18.8.3 Videotape 291
18.8.4 Analog to Digital Video 293
18.8.5 Digital Video 293
18.8.6 Video Format Identification 295
18.9 Video Hardware 295
18.9.1 Analog and Digital Display Technologies 295
18.9.2 Analog and Digital Playback Technologies 304
18.9.3 Ancillary Devices 306
18.9.4 Analog and Digital Signal Connections 306
18.9.5 Sound Systems 306
18.10 Preparing Video Artworks for Exhibition 308
18.10.1 Preparing the Equipment for Exhibition 309
18.10.2 Preparing the Audiovisual Media for Exhibition 310
18.10.3 Exhibition Documentation 311
18.11 Conclusion 312

19 Sound in Time-Based Media Art 317
Christopher McDonald
19.1 Introduction 317
19.2 Physics—The Nature of Sound 319
19.2.1 Sound as Space 320
19.2.2 Pitch, Timbre, and Noise 321
19.2.3 Complex Sound and Sonic Cancellation 323
19.3 Psychoacoustics—The Experience of Sound 324
19.3.1 The Perception of Sound Is Relative 326
19.3.2 Balancing Sonic Environments in an Exhibition Context 327
19.4 Recording Sound 328
19.4.1 Analog versus Digital Recording 331
19.4.2 Analog Audio on Magnetic Tape and Vinyl 331
19.4.3 Sample Rate and Bit Depth in Digital Audio 332
19.4.4 Wordclock in Analog-to-Digital and Digital-to-Analog Conversion 332
19.4.5 Digital Audio Formats 333
19.5 Reproducing Sound 334
19.5.1 The Importance of Sample Rates 334
19.5.2 Getting the Audio to Speakers 336
19.5.3 Frequency Range 336
19.5.4 Active and Passive Speakers—Balanced, Unbalanced, and Digital Cables 337
19.5.5 Choosing Speakers That Reflect the Artist’s Intentions 339
19.6 Exhibiting Sound 339
19.6.1 Reducing Sound Reflections 340
19.6.2 Considering the Space and Location 341
19.6.3 Tailoring the Sound to the Space 342
19.6.4 Documenting the Intended Sound Experience 342
19.6.5 Maintaining the Work During Exhibition 344
19.6.6 Knowing the Artwork 344

20 Caring for Analog and Digital Film-Based Art 347
John Klacsmann, with a contribution by Julian Antos
20.1 Introduction 347
20.2 A Brief Overview of Film as a Medium in Artists’ Practice 348
20.3 Collecting Artists’ Film 349
20.4 What Is Analog Film? 349
20.4.1 Material Composition 349
20.4.2 The Basics of Film Printing and Duplication 349
20.4.3 Manufacturing 351
20.5 Analog Film Identification, Inspection, and Documentation 351
20.5.1 Gauge 354
20.5.2 Base 357
20.5.3 Film Wind 359
20.5.4 Types of Film Stocks 360
20.5.5 Date Codes and Edge Printing 363
20.5.6 Optical Soundtracks 365
20.5.7 Magnetic Soundtracks 367
20.5.8 Base Deterioration 369
20.5.9 Color Dye Fading 371
20.5.10 Film Damage/Footage/Splice Count 371
20.5.11 Identifying Original, Master, and Print Materials 372
20.6 Duplication and Digitization 374
20.6.1 Preliminary Steps 374
20.6.2 Film-to-Film Duplication/Photochemical Film Preservation 375
20.6.3 Film-to-Digital Duplication/Digitization 382
20.6.4 Hybrid Approaches: Digital Source Output to Analog Film 388
20.6.5 Working with Film Labs 389
20.6.6 Quality Control 390
20.7 Storage 390
20.7.1 Film Storage 390
20.7.2 Digital Storage 392
20.8 Exhibiting Film-Based Art 392
20.8.1 Screening Room Considerations and Design 392
20.8.2 Film Projector Types and Anatomy 395
20.8.3 Loopers and Gallery Display 398
20.8.4 Storage, Maintenance, and Long-Term Care of Film Projectors 399
20.8.5 Digital Cinema Projectors 399
20.8.6 Analog and Digital Film Loans 400
20.9 Conclusion 401
Appendix 20.1 Anthology Film Archives—Film Inspection Report 402

21 Caring for Slide-Based Artworks 406
Jeffrey Warda
21.1 Introduction 406
21.2 History of the Slide Projector Show 409
21.3 Acquisition of Slide-Based Artworks 410
21.3.1 Components 412
21.3.2 Cataloging Components 415
21.4 Examination and Condition Assessment 419
21.5 Duplicating Slides 421
21.5.1 Film-to-Film Slide Duplication 421
21.6 Digitization of Slides 422
21.6.1 Digitizing with Scanners 427
21.6.2 Digitizing with Digital Cameras 430
21.7 Film Recorders: Re-output Back to Film from Digital Files 433
21.8 Working with Photo Labs 436
21.9 Slide Mounts 437
21.10 Storage 438
21.11 Projectors 441
21.12 Synchronized Slide Shows 446
21.13 Fading of Slides and Estimating When to Change Sets During Display 448
21.14 Conclusion 449

22 Caring for Software- and Computer-Based Art 453
Deena Engel, Tom Ensom, Patricia Falcão, and Joanna Phillips
22.1 Introduction 453
22.2 Collecting Software- and Computer-Based Art 456
22.2.1 What Is the Artwork? Understanding the Intended Experience 457
22.2.2 Researching the Meaning, Making, History, and Context 458
22.2.3 Identifying the Anatomy of the Work 461
22.2.4 Artwork Inspection 461
22.2.5 Conducting Risk Assessment for Preservation 464
22.2.6 Collecting Preservation-Relevant Components, Information, and Rights 467
22.3 Understanding Your Software-Based Artwork 470
22.3.1 What Is Hardware? 470
22.3.2 Analyzing and Documenting Hardware 478
22.3.3 Hardware Failure and Obsolescence 480
22.3.4 What Is Software? 481
22.3.5 Types of Software 483
22.3.6 Analyzing and Documenting Software 484
22.3.7 Software Obsolescence 487
22.3.8 Programming Languages and Platforms and Their Preservation Implications 488
22.3.9 Specific Artwork Genres and Preservation Considerations 492
22.4 Preservation and Treatment of Software-Based Art 496
22.4.1 Introduction to Environment Intervention Strategies 496
22.4.2 Environment Maintenance 496
22.4.3 Environment Migration 497
22.4.4 Environment Emulation 498
22.4.5 Choosing an Environment Intervention Approach 500
22.4.6 Emulation in Practice 501
22.4.7 Introduction to Code Intervention 503
22.4.8 Code Migration 503
22.5 Conclusion 505

23 A Word About Performance Art 512
Hélia Marçal
23.1 Introduction 512
23.2 Time-Based Media and Performance Art 513
23.3 Performance Art and the Museum 514
23.4 Perspectives on the Conservation of Performance Art 515
23.5 Conclusion: From Emerging Practice to Sustained Development 517

List of Contributors 521
Index 544

FIGURES

1.1 Interdisciplinary collaboration is needed to practice time-based media conservation. 8
3.1 Mabe Bethonico, Ursula Biemann, Uwe H. Martin, et al., World of Matter (2011–18; www.worldofmatter.net). 34
3.2 Andres Bosshard, Telefonia-1291–1991–2021 (1991/2020; https://telefonia.hek.ch). 35
4.1 Lia Kramer and Lorena Ramírez López examine the custom hardware utilized in Jim Campbell’s Motion and Rest #2 (2002) during the combined TBM institutional assessment and collection survey conducted at The Metropolitan Museum of Art from 2017 to 2018. 41
4.2 From left to right: Met Collection Manager Catherine Burns, Lorena Ramírez López, and Lia Kramer evaluate film materials stored in The Met’s cool storage. 55
4.3 Sample media production diagram for a single-channel video artwork, used to describe relationships between artwork components. 57
5.1 Filmmaker Ko Nakajima (center, with cap) answers a question during a Community Archiving Workshop in Tokyo in 2016 as part of a knowledge exchange organized by Collaborative Cataloging Japan. 70
5.2 One of the projections as part of AJI (Archive Jam Improvisation), an event at the museum Collección Engelman-Ost in Montevideo, Uruguay, that was organized by Fundación de Arte Contemporáneo (FAC) and participants of APEX Montevideo. 75
5.3 XFR Collective’s pop-up digitization station at MIX NYC: The New York Queer Experimental Film Festival in New York in 2015. 81
7.1 Christine Frohnert performing quality control of a video artwork in the media lab at Bek & Frohnert, LLC, New York. 99
7.2 David Smith performing treatment in the TBM Conservation Lab at M+ Museum, Hong Kong. 99
8.1 Open Archival Information System (OAIS) reference model. 114
8.2 Artwork elements for preservation. 115
8.3 Information continuum for time-based media art conservation. 117

8.4 NDSA Levels of Digital Preservation Matrix, Version 2.0. 122
10.1 Screenshot of the roundtable discussion on June 5, 2020. 155
11.1 The artwork life stages and correspondent documentation needs. 168
13.1 The OAIS Functional Model, available under Creative Commons (CC BY-SA 4.0). 201
13.2 The BagIt file hierarchy represented in macOS Finder (left) and the command-line program “tree” (right). 202
14.1 Disk structure showing a track (A), a geometrical sector (B), a track sector (C), and a cluster of sectors (D). 208
14.2 For write-blocked access, an external hard drive is connected to a USB 3.0 forensic bridge, which in turn is connected to a computer. 210
15.1 Mark Leckey, Dream English Kid, 1964–1999 AD (2015). 229
16.1 A 3D rendering of the synchronized nine-screen installation Ten Thousand Waves (2010) by Isaac Julien, shown as installed at the Museum of Modern Art New York in 2013. 246
16.2 Schematic for Ten Thousand Waves (2010) by Isaac Julien. 247
16.3 Isaac Julien, Ten Thousand Waves (2010), nine-screen installation, 35 mm film transferred to high definition, 9.2 surround sound, 49′ 41″. 248
17.1 The roundtable participants in discussion on collaborating with media artists to preserve their art. 251
17.2 Siebren Versteeg, Dynamic Ribbon Device (2003), Edition 1/10. 251
17.3 Daniel Canogar, Billow V (2020). Installation view at the Colección ING in Madrid. 255
17.4 A logical diagram that illustrates the programming flow cycle of Daniel Canogar’s Billow V (2020). 256
17.5 The directory of files for Daniel Canogar’s Billow V (2020), prepared by the artist studio for handover to the new owner, includes the source code and other elements necessary to enable the future conservation of the work. 256
18.1 Graph A represents the analog, continuous signal, graph B represents the signal digitized with a low bit depth, and graph C shows the signal digitized with a higher bit depth. 280
18.2 Graph A represents the analog, continuous signal. 280
18.3 The figure shows four different audio tracks with two channels each. 284
18.4 Schematic illustration of a U-matic (LB) tape. 292
19.1 This is a soundbar showing the frequency ranges of three different types of speakers and the frequency ranges of the playable notes on several instruments. 320
19.2 White noise is mathematically random across the spectrum. 322
19.3 The first six harmonics above the C two octaves below middle C (about 32.7 Hz). 322
19.4 Simple sound waves “sum” to create more complex ones. 324
19.5 Test hanging of No Sound Sculpture Passage (2017), by Atelier Pipilotti Rist, Andreas Lechthaler (architect), Dave Lang, Tamara Rist (seamstress), and Nike Dreyer. 329
19.6 Plan of No Sound Sculpture Passage (2017), by Atelier Pipilotti Rist, Andreas Lechthaler (architect), Dave Lang, Tamara Rist (seamstress), and Nike Dreyer. 330
20.1 Simplified cross-section of a piece of analog film. 350

20.2 Workflow for duplicating a film from a negative element. 351
20.3 A rewind bench with a light box and non-motorized rewinds is needed for film inspection and preparation of films for exhibition. 353
20.4 Basic equipment necessary to properly inspect a film. 353
20.5 Film gauges commonly found in artists’ film and time-based media. 355
20.6 The base type can be easily identified by placing the film roll on a light table. 359
20.7 A roll of film that “reads” correctly through the base side is called B Wind, while one that reads correctly through the emulsion side is called A Wind. 360
20.8 (1) Raw stock can be packaged as either A Wind or B Wind. (2) Camera original film is B Wind, and a contact print from it will be A Wind. (3) Films are printed emulsion-to-emulsion in contact printers producing alternating winds. (4) The wind can be confirmed by examining writing in the image relative to the base or emulsion. (5) Optical printing inverts the image: a master is typically run upward through the optical printer’s projector gate. 361
20.9 Different types of B&W film stocks. 362
20.10 Different types of color film stocks. 362
20.11 Date code reference chart. 364
20.12 Kodak date code on 16 mm camera reversal film. 365
20.13 Print-through of a contemporary edge code. 366
20.14 Optical soundtracks. 367
20.15 Magnetic soundtracks. 368
20.16 A 16 mm roll with severe vinegar syndrome, which has warped and spoked. 370
20.17 This faded 16 mm color positive print of Harry Smith’s Early Abstractions displays a strong color shift toward magenta, as its cyan dye layer has faded. 372
20.18 Especially in 16 mm film, editing of the negative or reversing the original elements was often carried out over A+B rolls using a checkerboard technique. 373
20.19 Analog duplication is achieved either through the use of a contact printer (1) or an optical printer (2). 376
20.20 (1) Base scratches cause light refraction during printing resulting in “printed in” scratches in the duplicate. (2) Liquid gates alleviate refraction from base scratches, so they are not printed into the duplicate. (3) In contact printers, printing heads are fully submerged during liquid gate printing. (4) In optical printers, projector gates are enclosed with glass, and the liquid is circulated through the gate. 378
20.21 Workflow for duplicating a film from a reversal or positive element. 380
21.1 Installation view of Sharon Hayes, In the Near Future (2009). 407
21.2 Comparison of regular versus slide duplicating film from test slides of Sharon Hayes’ In the Near Future (2009). 408
21.3 Common slide mounts with mount opening dimensions and film placement. 410
21.4 Installation view of Lisa Oppenheim, The Sun Is Always Setting Somewhere Else (2006). 417
21.5 Installation view of Kathrin Sonntag, Mittnacht (2008). 418
21.6 Two hand-altered slides by Kathrin Sonntag, included with Mittnacht (2008). 419
21.7 Typical production diagram for a slide work with associated component numbers assigned. 421
21.8 Kodak IT8.7/1 color transparency target for scanner profiling. 426

21.9 Digitizing 35 mm slide film with a Linotype-Hell ChromaGraph S3400 drum scanner at Griffin Editions, New York, USA. 428
21.10 Hasselblad Flextight X5 film scanner in use at the Solomon R. Guggenheim Museum, NY. 429
21.11 Nikon Super Coolscan 5000 ED film scanner with detached SF-210 Auto Slide Feeder. 430
21.12 Nikon ES-2 Film Digitizing Adaptor connected to a Nikon D850 DSLR with Micro-Nikkor 60 mm f/2.8 ED macro lens, tethered to a computer with Nikon Camera Control Pro 2 software. 432
21.13 An Agfa Alto CRT film recorder at Mops Computer, Germany. 434
21.14 The first slide from Robert Smithson’s Hotel Palenque (1969–72), slide projection of thirty-one 35 mm color slides (126 format) and audio recording of a lecture by the artist at the University of Utah in 1972 (42 min, 57 sec), Solomon R. Guggenheim Museum, (99.5268). 435
21.15 Sample slide storage enclosure for cold storage. 439
21.16 Common slide projectors. 441
21.17 Computer control of Robert Smithson’s Hotel Palenque (1969–72) to run the projection, audio, and cue. 448
22.1 Inspecting and testing the software-based artwork Subtitled Public (2005), by Rafael Lozano-Hemmer, requires an 8 × 8 m space and three to four people working for a week. 462
22.2 The webpage (left) and underlying HTML code (right) for the web artwork wwwwwwwww.jodi.org %Location (1995) by JODI. 465
22.3 Recoverability assessment, here of an obsolete SCSI printer in the context of a software-based artwork. 467
22.4 Various types of computers. 471
22.5 The side panel of this desktop computer has been removed to reveal the internal components. 473
22.6 Arduino Uno Rev3 SMD microcontroller. 474
22.7 Raspberry Pi Model A Revision 2.0 single-board computer. 476
22.8 The report produced by the built-in System Information tool on a Windows 10 laptop provides a detailed account of all hardware and software components. 479
22.9 Application directory for John Gerrard’s Sow Farm (near Libbey, Oklahoma) (2009), viewed in Windows 10. 482
22.10 A simple C++ application project opened in the Microsoft Visual Studio 2017 IDE. 491
22.11 John Gerrard, Sow Farm (near Libbey, Oklahoma) 2009 (2009), installation at Tate Britain in 2016. 493

TABLES

4.1 Core characteristics of a TBM institutional assessment and a collection survey in comparison 46
4.2 Necessary project staff and resources 47
4.3 TBM conservation roles and responsibilities 50
9.1 Institutional models for staffing in time-based media conservation 140
9.2 Conservation bachelor’s and master’s courses 146
11.1 The type of information gathered at the pre-acquisition stage 170
11.2 Types of legal agreements 171
11.3 Acquisition documentation 172
11.4 Documents created or sourced in preparation for a display 175
11.5 Documents created or updated after the artwork has been displayed 177
11.6 Artwork folder structure and their content, based on “Document Structure developed by TATE and S.M.A.K.” (Inside Installations 2007) 180
12.1 Inventory example with numbering, compiled by the authors 191
14.1 Software used by the collections staff at art museums and broadly adopted by the cultural heritage community 212
14.2 Disk image format comparison chart 216
15.1 Assessing equipment significance 225
15.2 A proposed hierarchy for artwork equipment categorization 226
15.3 Basic equipment information to capture in a CMS 230
15.4 Considerations for storing artwork equipment 232
15.5 Pros and cons of common packing options for artwork equipment 234
18.1 Three levels of critical viewing as part of the quality and condition assessment of digital video 279
18.2 Comparison of the technical metadata of two files with identical content 284
18.3 Key video characteristics 287
18.4 Key audio characteristics 287
18.5 Basic technical aspects of three television standards 291
18.6 Video format identification 296
18.7 File encodings 300
18.8 Analog and digital cabling for video display 307
18.9 Analog and digital cabling for audio signals 308
20.1 Timeline of important manufacturing developments 352
21.1 Common slide film formats and mounts with sizes 411
21.2 Popular Kodak slide projectors and features 442
21.3 Projector lamp options for select Kodak projectors 445
22.1 An example of an initial, basic inventory and component-based risk assessment 459
22.2 Description of common computer components and the key specification information to capture when documenting their specification 472
22.3 Summary of commonly encountered processor architectures, associated operating systems, and a selection of actively developed emulators capable of running them 502

PREFACE AND ACKNOWLEDGMENTS

This handbook is the first attempt to comprehensively map the current field of time-based media art conservation, a practice with early roots in the preservation of video and other electronic media. Over the last one and a half decades, time-based media conservation has developed into a professional specialization within contemporary art conservation. Today, practitioners provide qualified care for complex artworks that contain video, audio, film, slides, and software, and new practice developments are advancing the emerging field with increasing momentum.

Since its early beginnings, the discipline of time-based media conservation has produced online and print publications, mostly in the form of project summaries and presentations, conference papers, and peer-reviewed articles featuring case-study research and explorations of practical and theoretical topics. However, many approaches, methods, and tools developed in practice have never entered the written record, and thus, the field lacks a coherent account of time-based media conservation practices and underlying concepts. With this volume, we strive to connect the dots and present the narrative of time-based media conservation today, offering a compilation of the current knowledge and thinking that governs the collection, examination, documentation, exhibition, management, treatment, and preservation of time-based media artworks.

To accomplish this ambitious task, we invited an international lineup of 34 authors, whose chapters are arranged and developed to refer to, complement, and build on each other. To include additional perspectives and voices, we encouraged authors to interview 85 further contributors, whose statements and experiences are captured in text boxes and other forms of citation throughout the book.
All authors and interviewees are pioneering and prominent contributors to the field, hailing from 15 different countries and a wide range of backgrounds, but not all of them routinely publish through the medium of the written word. Ensuring that their expertise and practice would not remain excluded from the record was one important goal of this book. Another goal was to acknowledge interdisciplinarity as a key factor in successful time-based media conservation. This diverse group of authors and contributors has compiled an extensive bibliography of over 750 items, referencing a remarkable body of literature that will locate and contextualize time-based media conservation as a professional discipline and science from here onward.

The narrative of this handbook is structured in four parts: Part I, “Caring for Time-Based Media Art,” explores a variety of conceptual approaches to reflecting on and implementing time-based media conservation; Part II, “Building a Workplace,” introduces considerations and practical guidance around establishing infrastructure, staff, and workflows for time-based media conservation; Part III, “Cross-Medium Practices in Time-Based Media Conservation,” illuminates an array of high-level practices that are relevant to many types of media artworks; and Part IV, “Medium-Specific Practices in Time-Based Media Conservation,” explores in depth the genre-specific care for video art, for the sound in time-based media, for film-based art, for slide-based art, and for software- and computer-based art, ending with a word on performance art.

We hope that this book will prove to be a valuable resource for scholars and practitioners alike, as well as an approachable and encouraging textbook for students and faculty of conservation and other academic programs. With its emphasis on practice, this book is designed to serve as a guide for a variety of readers in search of both a practice overview and concrete advice. We hope to deliver inspiration not only to experienced time-based media conservators but also to artists, collectors, curators, archivists, and collection stewards who are in charge of time-based media but may have little or no pre-existing knowledge of how to care for these complex artworks.

Even though the emerging field of time-based media conservation is still very much in flux—just like the technologies employed by the artworks in our care—we chose a static book format over a flexible online resource as a publication medium. It was our goal to translate practice into future bibliographies, deliver it to the written record, and bring it into libraries to the attention of researchers and scholars. This book not only represents a snapshot in time in a rapidly evolving field but also marks a formative moment in its history, one that deserves a book of its own.
We are deeply grateful for the huge commitment of our authors and contributors, who have put endless hours of volunteer work into developing and coordinating content, reaching out to fellow authors and interviewees, engaging in cross-chapter discussions, and providing each other with constructive feedback and peer review. Rather unexpectedly, this book turned into a networking project of its own, with uncounted hours of video conferencing across different time zones, amounting to nothing less than a community effort of putting practice to paper.

The writing of this book coincided with the global COVID-19 pandemic, and while some contributors were suddenly left with time to focus exclusively on the book, many others faced enormous personal and professional burdens during times of lockdown or illness and had to overcome major obstacles to complete their contributions. Many contributors attested to the comfort and work rhythm this book provided during the height of pandemic isolation, forging a sense of exchange, global collaboration, and community that is so characteristic of the emerging field of time-based media conservation.

We are grateful to our editors at Routledge—Heidi Lowther, Ella Halstead, Elizabeth Risch, Emmie Shand, Marcia Adams, and their teams—for their ongoing commitment to this project, their timely advice, and their encouragement. Last but not least, we wish to thank our friends and colleagues for all the support they provided throughout the two years of writing, and above all, we wish to thank our families: Edna and Jean-Claude Campell, Paul Kaplan, and Max and Ben Engel-Streich, for their ongoing support and patience during our many hours of video calls, writing, and editing. Without our families, friends, and colleagues, our dream for this book would not have reached fruition.

—Deena Engel and Joanna Phillips


PART I

Caring for Time-Based Media Art

1 IMPLEMENTING TIME-BASED MEDIA ART CONSERVATION IN MUSEUM PRACTICE

Joanna Phillips

Editors’ Notes: Joanna Phillips is Director of the Düsseldorf Conservation Center in Germany, where she heads an interdisciplinary team of conservators who are charged with the care of the city’s collections of over 3.5 million artifacts held across 12 museums and archives. Prior to her current appointment, she served as Senior Conservator of Time-Based Media at the Solomon R. Guggenheim Museum in New York, where she launched and headed the media conservation division from 2008 to 2019. In this chapter, Phillips offers her perspective as a practitioner and “first-generation” time-based media conservator who has developed workplaces, staff infrastructure, and practices for her institutions both in New York and Düsseldorf. By sharing her insights and experiences, she aims to provide practice-based guidance and encouragement to those who wish to implement time-based media conservation in their museum practice.

When artists explore new technologies, materials, and media in their art making, and these artworks start to enter collections, the discipline of art conservation will eventually respond with the development of new methodologies, approaches, and practices to provide appropriate care for these artworks. The emergence of new conservation specialties—like the conservation of photographs or the conservation of plastics in the late 20th century—takes time, as expertise, infrastructure, training, and positions have to be generated, and a professional field develops. The most recent specialization evolving within art conservation is concerned with the care for time-based media: technology-based artworks, which contain elements of video, audio, film, slides, and software and which have been entering contemporary art collections worldwide at an increasing pace.

Over the last 15 years, a growing number of art museums around the world have employed dedicated time-based media conservators, academic art conservation programs have expanded their curricula to include the care of time-based media, networks and initiatives have formed to undertake research and development, and a growing body of literature is advancing conservation theory and practice in this area. Yet when considering the ever-increasing production and collection of time-based media art, many museums lag behind with the implementation of time-based media conservation in their practice. As an example, across Germany, there are over 700 art museums, including dozens of contemporary art museums, but at the time of writing, only four of them have staff positions dedicated to the care of their time-based media collections.

DOI: 10.4324/9781003034865-2



The obstacles to creating new, permanent staff positions are manifold, especially for public sector institutions that have to navigate inert government plans and budgets. However, I want to argue that the implementation of time-based media conservation is achievable if collection caretakers effectively raise awareness among colleagues and administrators, build internal and external networks, and find creative ways of bringing relevant expertise to the institution and—ultimately—if the staff is ready to leave their comfort zone and pioneer new practices as a team.

Each collection has its own institutional culture and individual challenges, which will determine its specific approaches to implementing time-based media conservation. But there are a number of strategic building blocks that have proven to be helpful when taking the first steps, whether as a collection caretaker without prior specialization in time-based media or as a “first-generation” time-based media conservator whose new position has no predecessor, job description, workplace, infrastructure, or established practice in place. This chapter aims to provide orientation and encouragement for those who feel the need to move forward but are looking for starting points and leads for launching the implementation process in their institution. Throughout the chapter, references will be made to other chapters and sections of this book, where readers can explore specific topics in depth.

1.1 Where to Begin

In most collections of modern and contemporary art, the percentage of time-based media art is very small compared to other types of media, such as painting, sculpture, works on paper, or photography. The latter types of art are usually cared for by permanent conservation staff who may be specialized in the conservation of modern paints, for example. But even experienced contemporary art conservators tend to step back when it comes to the media art in their collections. Many conservators feel that they are missing the highly specialized, often very technical expertise that is needed to examine, document, treat, and preserve time-based media artworks. As collection professionals, they are hesitant to engage in work they have not been trained for, and they rarely have the time and resources to take on additional responsibilities that require further education.

Without dedicated conservators, time-based media works regularly drop out of established museum mechanisms that organize and rule the collection care for other artworks—including conservation documentation (Chapter 11), registering and tracking artwork components (Chapter 12), and storage implementation (Chapters 8 and 15), along with risk assessment, preservation planning, and condition monitoring (Chapters 18–22). In lieu of a dedicated time-based media conservator in-house, some collections delegate conservation work to technical staff, such as media technicians, or outsource treatments to external service providers, such as post-production houses. Less frequently, freelance time-based media conservators are hired for specific projects. However, the management of change that is inherent to time-based media works requires an ongoing engagement, and an artwork’s integrity is easily compromised when loans, exhibitions, or technological updates are carried out without integral institutional involvement and understanding of its conceptual and technical identity.
Implementing TBM Conservation Practice

If institutions want to retain continued agency in the collection lives of their artworks, they have to consider ways of establishing conservation oversight and accountability in-house. In-house staff is immersed in the museum’s operations and participates in acquisition, exhibition, and loan processes that can act as key events in a media artwork’s life and history of change. In-house staff is able to bring continuity to collection care beyond temporary projects: iterations of variable artworks can be witnessed, documented, and managed over time, and staff can establish ongoing relationships with artists, galleries, and other critical stakeholders that play a role in supporting the artwork. Structural change—and thus the advocacy for it—can only come from within the institution. This is why—even without having technical expertise or time-based media training—collection professionals play such a critical role in developing in-house stewardship of time-based media works. Committed caretakers often start by creating a cross-departmental working group (see Chapter 10). They commission collection surveys and institutional assessments, as introduced in Chapter 4 of this book, which help to identify the value, condition, and conservation needs of time-based media collections, as well as the structural development an institution must undergo to be able to address these needs. Based on these evaluations, collection staff can identify the quality and quantity of time-based media expertise their institution needs. Chapter 6 in this book provides advice and inspiration on the topic of advocacy and how to effectively start the process of arguing for change. Chapter 5 presents alternative community-driven preservation approaches, which are particularly relevant for smaller institutions or organizations with limited funding.

1.2 The Consequences and Costs of Inaction

Many traditional collection artifacts, such as paintings, photographs, objects, or works on paper, will tolerate a delay in conservation treatments as long as they are appropriately stored and handled. Time-based media artworks, in contrast, can be unforgiving if conservation actions are not proactively timed, often long before discernible loss and damage occur. Factors that drive the need for action include technological developments and obsolescence; for example, when the manufacture of 35 mm slide projectors was terminated worldwide in the early 2000s (see Chapter 21), or that of cathode-ray tube (CRT) monitors around 2010 (see Chapter 18), many museums started stockpiling this equipment for future replacements and spare parts. Today, with increasing scarcity, certain CRT models that are popular among artists and museums for aesthetic and conceptual reasons can achieve exorbitant prices on the secondary market. Failing to migrate formats and technologies while the industry still supports this process can also come at a huge cost for museum budgets and quality standards. For example, art collections that missed the time window to digitize their analog videotapes to file-based video formats are today experiencing great difficulties in finding facilities that still offer these digitization services. Large videotape archives in the commercial world completed their digitization projects a decade ago. Once their business faded and the tape era ended globally, post-production facilities around the world had to let go of tape digitization infrastructure and experienced staff. Even where digitization labs and expertise are still at work, the achievable quality of the digitized signal is continuously and irretrievably degrading with every year that goes by because videotapes are deteriorating, professionals with analog video expertise are retiring, and spare parts such as video heads are no longer manufactured.
The resulting cost of inaction for undigitized video collections is, therefore, not just the starkly increased price of tape digitization today but also the loss of audiovisual quality that is achievable with digitization, thus compromising the experience of digitized artworks for generations to come. Sections 18.6, 20.6.3, and 21.6 address digitization, and Sections 22.2.5, 22.3.3, and 22.3.7 address the concept of obsolescence. Far-sighted conservation actions should take place as early as the acquisition phase: preservation-critical artwork deliverables must be negotiated (Sections 18.4.1, 20.3, 21.3, and 22.2.6), and in-depth documentation of the artwork should be created in close exchange with the artist and their production teams (see Chapters 11 and 17 and Section 22.2).

Joanna Phillips

The identification of preservation-critical acquisition deliverables requires the expertise of a time-based media conservator, and the artist cannot be expected to retain and professionally archive these elements on behalf of the museum. Similarly, artists cannot be expected to relieve the museum of its task of creating meaningful artwork documentation in addition to the artist-provided installation instructions. If artworks are not documented on the occasion of their acquisition, while they are installed and functioning as intended, the effort of reconstructing the work for future displays (which may occur many years later) can become a costly research project that inflates the exhibition budget and may prevent the collection work from being shown.

Far more tangible in the short term is the cost of inaction when neglecting digital storage needs. If collections do not invest in appropriate server storage (including redundancy and retrieval mechanisms, fixity checks, and access regulation) and create backup copies of digital artwork components, the risk of artwork damage or loss is very high, as local storage devices such as hard drives can fail spontaneously (Section 22.3.3). Calculating the sum of the acquisition values of born-digital and digitized artworks can support IT departments in their argument for investing in suitable storage infrastructure. Chapter 8 in this book introduces criteria and approaches to creating appropriate digital art storage, Chapter 13 introduces the basics of digital preservation, and Chapter 14 explains the concept and practice of disk imaging as a backup strategy for digital artworks.

The monetary cost of damage and loss that result from inaction can be a driving factor when arguing for the addition of time-based media resources. But it is equally important to identify the ethical and operational implications for a collecting institution if time-based media conservation practices are not enabled.
Does the museum live up to its mission statement (e.g., the commitment to preserving, researching, and providing public access to all artworks in its care)? And if an institution’s inability to provide care for time-based media works inhibits curators from collecting these works, does that museum fulfill its purpose and mission (e.g., to collect the art of our time for future generations)?

1.3 It Takes a Village: Organizing In-House

The management of time-based media artworks—like other types of variable, installed, or performative artworks—can involve staff from many departments, depending on the size and structure of the institution. In larger museums, the core team that runs acquisition, loans, and exhibition programs may include registrars, who manage the inventory, component tracking, and travel of time-based media works; art handlers, who monitor and move artworks in and out of storage; exhibition designers and curators, who adapt their layout to new exhibition spaces; fabricators and construction teams, who build artwork elements and spaces; media technicians, who conceptualize and realize the technical setup of artworks; and media conservators, who devise preservation and display strategies and treat and document artworks. Outside of this core team, a whole range of additional stakeholders throughout the museum may interact with time-based media collections, such as IT staff, who manage digital storage, servers, and networks; legal staff, who craft purchase and licensing agreements that regulate the display, preservation, and representation of time-based media; and education staff, who enhance the accessibility and contextualization of works on display. While staff roles and responsibilities vary greatly from institution to institution—and country to country—one condition is shared across most institutions: the knowledge and management of a time-based media work are distributed among many stakeholders, and many stakeholders have an impact on its collection life, including its conservation. Consequently, the actual appearance and visitor experience of a time-based media work in the museum gallery is more likely to be the result of a communal decision-making process (involving a multitude of museum staff and artist representatives) than the single-handed production of an individual artist. Capturing the input of all involved parties is a central concern of established models and methods for time-based media documentation (Chapter 11), and it allows future interpreters of the work to differentiate between choices made by the artist and those made by the museum team. Chapter 16 steps through the process of installing time-based media art, and Chapter 17 explores artists’ roles in stewarding their artworks.

Regardless of whether a dedicated time-based media conservator position is in place, in the process of being lobbied for, or hopelessly out of reach: time-based media conservation can only be implemented in museum practice if the aforementioned stakeholders from various departments are cued in to the institutional enterprise of taking conservation responsibility for time-based media collections and if they understand their roles in facilitating this action. Curators, for example, play a vital role in enabling the long-term care for time-based media: they open bilateral relationships with artists to collection teams and help to convey institutional preservation strategies to artists—for example, when conservators ask for additional, preservation-critical acquisition deliverables or want to “shadow” the artist’s team during the install of a new acquisition in order to produce meaningful documentation for the museum. Exhibition managers can equally support the long-term preservation of time-based media—for example, by admitting documentation expenses to the exhibition budget, especially when newly acquired, undocumented artworks are first installed.
These costs may involve travel and accommodation for documentation activities at off-site venues, but experienced exhibition managers know that these investments are quickly exceeded by future expenses if the museum wants to loan or exhibit artworks without having a solid knowledge base to inform treatments and preparations. IT staff sustains the collection lives of artworks by providing appropriate server storage for digital artwork components, fast networks, and media lab infrastructure that is needed to appropriately check, render, and store files. And media technicians enable conservation by openly discussing their technical preferences and approaches to problem-solving, by collaborating with conservators on the planning and testing of displays, and by contributing wiring diagrams and equipment specifications to the documentation of artwork iterations. Their specialty expertise can also critically inform risk assessment processes and the development of long-term preservation strategies.

Developing time-based media roles, responsibilities, and workflows across museum departments requires mutual understanding, respect, and trust. Awareness and inclusiveness can create valuable synergies that benefit not just conservation but all of the contributing departments. Many museums that have successfully implemented time-based media conservation initiated their internal organization process with the launch of cross-departmental working groups. Chapter 10 of this book presents the longest-standing of these working groups, the so-called Team Media at SFMOMA, which formed in 1992 and has been meeting regularly ever since.

1.4 Finding the Workforce

Museums that have explored and understood the care requirements of their time-based media collections acknowledge that time-based media conservation is not a temporary project with a beginning and an end but an ongoing activity at the heart of their museum practice—just like the conservation of paintings, objects, or works on paper. Therefore, the creation of new permanent staff positions for specialized time-based media conservators is often a prime goal of institutional advocacy initiatives.


Figure 1.1 Interdisciplinary collaboration is needed to practice time-based media conservation. Here, a newly acquired virtual reality artwork is being received and checked in by Lan Linh Merli-Nguyen Hoai (Time-Based Media Conservation Fellow, Düsseldorf Conservation Center) and Alain Bieber (Head of Time-Based Media Collection, Kunstpalast). Photo: Joanna Phillips

One of the most common counterarguments faced by advocates is the small artwork count: If one paintings conservator position can cover a collection of 5,000 paintings, why should a museum dedicate a whole conservator position to 200 time-based media artworks? The answer is that the care load is not significantly defined by the artwork count but instead by the complexity of collected artworks, their employed technologies and age, and by the amount of acquisition, loan, and exhibition activities the museum engages in. A thorough condition report of a painting can be completed within a few hours of work (in most cases), but condition checking a video artwork can take many days. Even a simple single-channel video will usually enter the collection in multiple codecs and file formats (e.g., uncompressed archival masters, production elements, and compressed exhibition formats) or even versions (all of the above in different language versions, for example), which all have to be viewed in real time to ensure that no unintended glitches are present. Therefore, museums that have managed to create permanent conservator positions have all moved beyond concerns about artwork counts; even for major international museums with dedicated time-based media conservation teams, such as MoMA, the Guggenheim, The Metropolitan Museum of Art, and Tate, time-based media works make up an extremely small percentage (ranging mostly between 1% and 2%) of the total collection size, rarely exceeding a few hundred artworks.


To invalidate the argument of artwork count effectively, job descriptions and plausible workloads should be mapped out in detail. Chapter 9, which explores staffing and training in time-based media conservation, presents job profiles and describes the skills and qualifications that are needed to fill time-based media conservation positions. If the rate of new time-based media acquisitions is increasing significantly, this should be factored into the projected workload for each upcoming year. Balancing the cost of a permanent position against the ongoing expense of hiring external specialists at hourly rates helps to identify the savings behind the long-term investment of staff addition, and setting these savings against the museum-specific consequences and costs of inaction (see Section 1.2) brings home the benefits of staff dedication.

Until permanent positions are created, alternative staffing approaches can help to offset the lack of a dedicated workforce and launch new practices at the museum. Collections can staff projects with contract conservators, offer temporary employment to media conservators, hire specialists (e.g., video engineers or programmers), take on interns and postgraduate fellows with time-based media expertise, collaborate with traditional conservation programs to bring in students for lab practice, or collaborate with other academic programs (e.g., computer science) to invite specialty skills to projects in exchange for student credit (see Chapter 9). All of these measures can bring valuable expertise and work time to a time-based media collection. If coordinated carefully, these small budget commitments can be organized to contribute toward a larger collection survey, create an overview of artwork needs, and foster cross-departmental appreciation of previously unknown time-based media practices. Getting started even with an improvised workforce can help to lay the foundations for a stronger, more pressing argument for dedicated staff time. It is always harder to argue for works of “unknown condition” than to make a case for concrete emergency scenarios, such as failing computers and missing backups of valued collection artworks.

The roles and responsibilities of in-house staff and temporary workforce should always be clearly defined. Collection caretakers can ensure that ethical guidelines for conservation practice are adhered to, that decision-making and treatments are transparent and documented, and that later interventions are not rendered impossible. Particularly when working with specialists outside of the conservation field, the institution cannot expect conservation principles to be familiar (e.g., that an artist’s hardware modifications or coding style are valued as preservation-worthy). Here, close conservation supervision and interdisciplinary exchange are needed to communicate expectations and monitor results.

On the Institutional Process of Staffing Time-Based Media Conservation

“As a conservator of contemporary art at the Nationalgalerie im Hamburger Bahnhof—Museum für Gegenwart—Berlin, my conservation responsibility also extends to our collection of time-based media artworks, which is part of one of the largest public collections worldwide and is continuously growing. It includes unique modern and contemporary artist videos as well as film-, video- and sound-based installations. The audiovisual media range from historic videotape formats, such as open reel and U-matic, to optical media, and from 16mm and 35mm analog film to digital information carriers such as hard drives.

“Our conservation department has been communicating the urgent need for specialized time-based media staff, technical infrastructure, workflows and standards for some years now, as its core expertise is focused on providing care for contemporary paintings, objects, photography and installation art. Even though we have not yet secured dedicated time-based media staff, our institutional journey of building awareness and fostering exchange has enabled important steps toward this goal: (1) Staff has been receiving continuous further education through workshops and symposia that introduced practices, e.g., how to navigate acquisition processes, register and document collection works and implement digital storage. (2) A scientific position for media arts was created; its holder works with a small team to strategize the implementation of the structural and technical infrastructure needed for the preservation and research of collection works. (3) A temporary, postgraduate fellowship with specialized time-based media conservation expertise was launched. (4) A media lab was built that allows the viewing, examination, archiving and testing of media works. (5) A collection survey by external experts helped to prioritize urgent preservation actions, leading into a pilot project to digitize especially vulnerable works through external service providers. (6) Collaborations with academic conservation programs and guest scientists were initiated in order to tackle the documentation and treatment of individual collection works.

“In reaching these milestones, we have learned that the collection, documentation and preservation of time-based media art is a complex and ongoing activity that requires a cross-departmental, continuous engagement of in-house staff. Only if the knowledge and expertise are implemented in-house and permanently can sustainable structures be created to responsibly collect, preserve and exhibit these works of art.”

—Andrea Sartorius, Conservator of Contemporary Art, Nationalgalerie im Hamburger Bahnhof—Museum für Gegenwart—Berlin, Germany (Sartorius 2022)

On the Role of a Time-Based Media Conservator

“In 2015, the Bayerische Staatsgemäldesammlungen established my part-time position (20 hours/week) at the Doerner Institut (Munich), making it one of the first museums in Germany with a dedicated time-based media conservator. Prior to my appointment, an IT engineer took care of the collection.

“In the beginning, I started with the examination and documentation of the approximately 120 TBM artworks in the Pinakothek der Moderne and Museum Brandhorst collections. I am responsible for the conservation of artworks that contain 16mm and 35mm films, analog and digital video formats, optical carriers such as Laserdiscs, DVDs and Blu-rays, born-digital artworks on hard drives or SD cards and software-based artworks. My daily work mainly focuses on the preparation, installation and disassembly of exhibitions, as well as on the caretaking and maintenance of equipment. In addition to this, I work closely with the curatorial team to process new acquisitions and loans. Unfortunately, due to my limited capacity, there is never enough time for scientific research, examination of artworks and conservation care of the collection. Two areas that need more attention, for example, are digital long-term preservation and the detailed documentation of exhibitions.

“Part of my job is to implement professional standards: we now have condition reports for nearly all artworks, documentation of the data carriers and files, as well as templates for acquisitions and condition reports. The next objective will be hiring a technician specialized in time-based media. The creation of this role will not only reduce the need for external support during the installation of exhibitions and make it a lot easier to take care of the equipment, it will also give me significantly more time to focus on collection care and practice research and development.”

—Andreas Weisser, Time-Based Media Conservator, Doerner Institut, Munich, Germany (Weisser 2022)

On the Job Profile of a Time-Based Media Conservator

“In the German-speaking art world, there are a lot of misconceptions around the job profile of a time-based media conservator, and the roles of media conservators and media technicians are often confused. Only rarely are these two professional fields clearly defined and separately staffed, which is confirmed by current job postings.

“I appreciate that museums have started to create more dedicated positions, but time-based media conservators at museums here are often forced to take on the tasks of media technicians—especially the installation and deinstallation of exhibitions—which is not part of their actual professional profile and scientific qualification.

“We have a lot of work ahead of us if we want to define and reinforce the role of time-based media conservators, which is lagging way behind the prestige and profile of film conservators, for example.”

—Arnaud Obermann, Time-Based Media Conservator, Staatsgalerie Stuttgart, Germany (Obermann 2022)

1.5 Creating a Workplace

The workplace of a time-based media conservator looks different in every museum and could include a single computer workplace, an editing suite, a disk imaging station, a setup of equipment racks, a large space for artwork mock-ups, or any number of other functions. The ideal technical infrastructure will be determined by the types of artworks in the collection as well as the conservation practices that the museum wishes to engage in. Chapter 7 in this book provides in-depth guidance on the features that a functional and efficient media conservation lab should offer and discusses existing media conservation labs for reference. Regardless of its specific technical infrastructure, an ideal media lab is a place that hosts and enables collaborative decision-making processes and fosters the understanding of artworks and planned interventions as part of a participatory, communal effort: a place where artists can meet with conservators to review and compare artwork components, where media technicians can stage artworks prior to an exhibition install, where curators and exhibition designers can experience audiovisual content first-hand, where registrars can inventory incoming acquisitions, and where fascinating projects can be presented to guests.

1.6 Engaging in Networks of Practice Development

At most institutions, a time-based media conservator will be the only conservator at the museum with this particular specialization, and constructive discussion and critical peer review of treatments or new practice developments are not likely to be found in-house. However, a steady, professional exchange is vital for practicing time-based media conservation, as the emerging field is still in rapid development, and the fast-paced evolution of artwork technologies requires perpetual learning and practice expansion. This is why connecting with colleagues outside of the institution through professional associations, informal contact, and projects and initiatives is essential for the advancement of practice development in-house and across the professional field internationally. At this early stage, with few time-based media conservators employed by museums, every new position that connects expertise with collections matters and impacts the practice of others. Engaging in a culture of sharing—through presentations, publications, blogs, conferences, or seminars—contributes to the body of literature, propels discourse, inspires further research, and most importantly, allows practitioners to build on the learnings of their peers. Networks of practice development benefit significantly from interdisciplinary diversity: partnering with professionals from intersecting disciplines such as philosophy (Chapter 2), art history (Chapter 3), digital humanities, information science, digital preservation (Chapter 13), engineering (Chapter 19), and computer science (Chapter 22)—for example, through museum-academic collaborations or cross-disciplinary research projects—invites knowledge and perspectives to art conservation that will impact or even trigger the development of new examination and treatment practices. Opportunities for cross-institutional peer exchange can also be created by opening the lab for regular “salon-style” meetings with informal presentations and ample discussion time.
Inviting in-house curators and other colleagues to these meetings, or even asking them to co-present, will further benefit the awareness and standing of time-based media conservation within the institution. Moreover, a new level of appreciation of the discipline can be created among in-house colleagues if they have the chance to view it against the background of a larger professional community—for example, if time-based media conservators organize public symposia at their museum and turn the institution into a pioneering platform for international discourse. Ongoing outreach and dissemination increase the confidence and visibility of time-based media conservation, externally and in-house.

1.7 Conclusion

The implementation of time-based media conservation in museum practice is not a linear process. First-generation media conservators and other collection caretakers in charge have to simultaneously identify and communicate the needs of their collection artworks, win the support of their colleagues and administrators, raise funds, find the workforce, build a workplace, engage in internal and external networks, and research and develop new practices. Whichever way a museum chooses to start this complex process, organizing in-house (e.g., through creating cross-departmental working groups) is a key step on the journey toward creating dedicated staff resources. Bringing additional, external specialty expertise from relevant disciplines to the museum through temporary project staff, fellowships, or museum-academic collaborations can support the process in meaningful ways and catalyze practice development.

Bibliography

Obermann, Arnaud. “Email Communication with the Author.” March 10, 2022.
Sartorius, Andrea. “Email Communication with the Author.” March 10, 2022.
Weisser, Andreas. “Email Communication with the Author.” March 10, 2022.


2 THEORIES OF TIME-BASED MEDIA ART CONSERVATION: FROM ONTOLOGIES TO ECOLOGIES

Renée van de Vall

Editors’ Notes: Renée van de Vall is Professor of Arts and Media at Maastricht University. Her research focuses on the philosophy of art and aesthetics, specifically on the construction of spectatorship in contemporary visual and new media art; on the theory and ethics of contemporary art conservation; and on processes of globalization in contemporary art and media. In this chapter, she traces the evolution of theoretical frameworks in time-based media conservation by reviewing key initiatives and literature of the past 20 years that changed the understanding and treatment of time-based media artworks.

DOI: 10.4324/9781003034865-3

2.1 Introduction

Time-based media (TBM) art is one of the most challenging genres of contemporary art when it comes to conservation. Although it shares many of the features that make other genres, such as conceptual, performance, and installation art, elusive and unruly (Domínguez Rubio 2014, 2020), its dependence on technologies and devices poses extra problems that are not only difficult to solve but also radically undermine the idea of the artwork as a separate, self-contained, and enduring entity. TBM works of art are typically constructed with short-lived information carriers, vulnerable data, and complicated and relatively unsustainable playback and display equipment and technologies. In order to function, they depend on various factors outside their caretakers’ control, such as the availability and continuous support of the technologies they are made with and the expertise needed to make them work—all of which tend to change at a rapid pace. In order to survive, they may need to be transferred to more up-to-date systems, with inevitable aesthetic and conceptual change as a result. This undermines some of the basic premises of conservation because it is difficult to conceive how a work of art can be conserved if there is not a delineated, relatively durable material or immaterial “something” that can be continued over time. But how can we understand the work of art in a way that acknowledges its transformations without losing track of its identity?

In this chapter, I will discuss various theoretical attempts to answer this ontological question. I will show how in TBM conservation literature, initially the idea of a unique and relatively permanent art object was maintained, but with a diversified understanding of its authenticity. A next step was a radical shift of the criteria for the work’s authenticity through a reconceptualization of its ontology. This reconceptualization proceeded along two strands of thought, which I will present in a theoretical rather than chronological order: from a unique, enduring object to a repeatable performance on the one hand (2.2.1–2.2.4) and to a set of behaviors that could be separated from the work’s original medium on the other (2.2.5). At the same time, we can see increasing attention to the work’s ecology—the practical and institutional environment in which it is produced and sustained—as a relevant factor to consider in its conservation. In the most radical case, this can lead to a perspective that understands ecologies as producing ontologies: rather than deriving conservation practices from the works’ ontologies, these ontologies are considered to be the result of these practices. Although this development is connected to the increasing importance of digital media for TBM art, it has profound consequences for how we approach analog media art as well and, in fact, all fine arts, because ultimately all works of art change over time and are dependent on their contexts of care.

2.2 Ontologies

The question of what constitutes the work of art and how it relates to the material thing that makes up its presence in the physical world is a very old one. But where reflections on the ontology of art are usually concerned with articulating the specificity of the artistic creation, the aesthetic experience, and the critical interpretation of works of art in comparison to other kinds of cultural artifacts, and leave the material object of these reflections literally untouched, in conservation they can have an immediate impact on the work’s physical survival. In contrast to many other artifacts and to at least some other cultures, within the modern Western tradition works of art are supposed to endure across time. But like all artifacts, artworks are physical things that change and ultimately perish, and as Domínguez Rubio (2020) has shown, their perpetuation requires an extensive and expensive machinery of “mimeographic” labor: labor devoted not to creating the “new” but to sustaining the “same.” The physical substrates of TBM art, however, are much more vulnerable than those of traditional painting and sculpture. The material image-carrying mediums used in TBM art (like celluloid film, magnetic tape, or laser discs) tend to deteriorate much faster than oil paint on canvas or marble. Playback and display equipment—video or DVD players, monitors, projectors—suffers from wear and tear and obsolescence. Devices that only a few decades ago were daily household products have been taken out of production, together with the support services needed to use them. Digital media technologies, their hardware and software, and their support systems tend to change even faster. Internet-based works may be distributed over networks or platforms beyond the control of the artist or caretaker.
Hence, it is not surprising that conservators felt a need to reconsider their basic ideas about the connection between the work of art and its material carrier in a more radical way than before. As Renate Buschmann formulated it for media installations, they needed to consider the view that “technology-based installations are, for the most part, artifacts of which the components need not necessarily be linked with the time at which the work was created” (Buschmann 2013, 202). She continues, “Any debate about preservation strategies for media art is dominated by the question of identifying, to begin with, what amounts to the conservation-worthy, ‘original’ state of a media artwork. The differing opinions on this matter range from the strict view that the material constellation of a work as first exhibited should be seen as the original, up to the liberal notion that a work can be reduced to a thematic concept whereby the material equipment is flexible” (ibid.).

Theories of TBM Art Conservation

This section will give an overview of the main contributions to and discussions of the ontology of TBM art. The overview follows a theoretical order, showing how these contributions and discussions evolve with the kinds of technologies at stake, and how change is increasingly valued in positive terms: from a potential loss to a basic condition for survival.

2.2.1 Integrities

Pip Laurenson’s (2004) discussion of the significance of display equipment for TBM installations still phrased the connection between the work and its material carrier in terms that could be, and indeed have been, applied to traditional artworks (Clavir 2002, 52). When confronted with failing and obsolescent components of media display devices, the conservator faces the question of whether the original equipment has more than just a functional significance, in which case replacement would constitute a loss. To indicate what the loss could consist of, Laurenson distinguishes, next to the work’s physical integrity, its conceptual integrity, relating to “the relationship of the work to the process or technology employed and the spirit in which the work was made”; its aesthetic integrity, relating to “the look and feel of visible components and the outputs of the system (i.e., qualities of the sound and image)”; and its historical integrity, relating to “links made by the visible components and discernible outputs of the system to the time the work was made” (Laurenson 2004, 50). Thus, the Walkman™ in Angus Fairhurst’s Gallery Connections (1995) strongly links the work to the 1990s, when it was a common device, and therefore cannot be substituted by a CD player without compromising the work’s historical integrity (ibid.).

2.2.2 Work-Defining Properties

In a paper first published two years later, however, Laurenson (2006) introduced an ontological distinction into conservation theory that allowed change to be evaluated in other terms than integrity versus loss. She proposed, in line with what artists and art theorists had already argued with regard to conceptual and installation art, not to think of TBM installations as we do of paintings and sculptures but rather in the same terms as we do of music or theater. Whereas an exact reproduction of a painting will never be more than a copy or fake, a theater play can be performed again and again without losing its identity, even allowing for many variations (different actors, staging, costumes), as long as the performance complies with the script. In the same way, Laurenson argued, TBM installations, and installations in general, may be reexecuted as long as their “work-defining properties” are respected, thereby coining a notion that is now ubiquitously used. Laurenson made use of the vocabulary of Nelson Goodman (1976), who articulated the difference as a twofold one: paintings and sculptures are supposed to be finished when they leave the studio, and their authenticity is inseparable from their history of production (they are one-stage, autographic arts), whereas in the case of theater plays or notated music, after the writing of the script or score a second step, that of their performance, is required to complete the work. Moreover, scripts and scores are based on systems of notation that guarantee that a copy of the text is just as “authentic” as the original manuscript (they are two-stage, allographic arts). However, whereas Laurenson and much of the conservation literature following her proposal merged these two distinctions, Goodman explicitly separated them: there are also one-stage allographic arts (literature) and two-stage autographic arts (etchings, cast sculptures).

Allographic arts rely on a system of notation, such as an alphabet or musical notation, which allows for a distinction between constitutive and contingent properties. In Goodman’s words, “In effect, the fact that a literary work [and the same goes for a musical score, RvdV] is in a definite notation, consisting of certain signs or characters that are to be combined by concatenation, provides the means for distinguishing the properties constitutive of the work from all contingent properties—that is fixing the required features and the limits of possible variations for each.” (Goodman 1976, 116) Literature is a one-stage art; music is a two-stage art. When the score is written, it still has to be performed, but although a performance does not consist of characters like the score does, there is still a distinction between properties that are constitutive and properties that are not: “[T]he constitutive properties demanded of a performance of the symphony are those prescribed in the score; and performances that comply with the score may differ appreciably in such musical features as tempo, timbre, phrasing and expressiveness.” (Ibid., 117) In contrast, no such distinction is possible for autographic arts; all features of a painting, even hardly perceivable ones, are potentially constitutive, or in Laurenson’s terminology, “work-defining.” “In painting, on the contrary, with no such alphabet of characters, none of the pictorial properties—none of the properties the picture has as such—is distinguished as constitutive; no such feature can be dismissed as contingent, and no deviation as insignificant.” (Ibid., 116) The same goes for the autographic two-stage arts, such as etching. Therefore, for autographic arts, it is necessary to know the history of production (is this painting really made by this artist?)
to decide about a work’s authenticity, but for allographic arts, this history is not relevant: an exact copy of the text of a novel or score of a symphony makes it a genuine instance of that novel or symphony score, regardless of who printed it. TBM installations, and installation art in general, says Laurenson, are somewhere on the continuum between sculptures and performances. Like performances, they are installed events, created in two phases; their identity is defined by a cluster of work-defining properties stipulated by, for instance, artist instructions or artist-approved installations intended as models; and they allow for—and even require—more or less variability, depending on whether the work is thickly or thinly specified. Yet, as in traditional fine arts, a causal link to the artist and to the properties that the artist considers mandatory is still required. A very important difference between installation art and performance arts like music is that in the latter case, there are well-established conventions to specify work-determinative features, whereas in the former case, this specification will differ from artist to artist. In the case of TBM installations, the function of the score in music may be fulfilled by various kinds of information: “The kinds of things that might act as work-defining properties of a time-based media installation are: plans and specifications demarcating the parameters of possible
change, display equipment, acoustic and aural properties, light levels, the way the public encounters the work and the means by which the time-based media element is played back. The artist might explicitly provide work-defining instructions to the museum or designate a model installation from which the key properties of the work can be gleaned. Documentation in this context, whether it be photographs, interviews, light and sound readings or plans, represent an attempt to capture the work-defining properties.” (Laurenson 2006, 7) Another difference is that artists rarely see themselves as working in a context that allows for as much interpretation as composers and playwrights do. Nevertheless, there will always be a gap between what the artist can specify and the realization of the work: the installation is richer than its specifications. This gap asks for interpretation and negotiation between artist, installation crew, curator, and conservator in the process of realizing a new installation of a work. How much is possible depends on the artist’s relationship to the installation phase of the work. This ontological move (from the artwork as a basically unique and irreplaceable object to an installed and repeatable event) was a game-changer in conservation theory, giving the conservator greater freedom to change features that are not considered work-defining and guidance on which features to preserve. Although Laurenson mentions the greater vulnerability of TBM installations to the erosion of identity through their successive presentations, change did acquire a more positive (or less negative) connotation in her proposal. Variability no longer meant only the risk of loss but also pointed to a potential: an installation is richer than its specifications; an element of indeterminacy is central to the idea of the work being performed rather than mechanically executed.

2.2.3 Between Instructions and Manifestations

Although Laurenson’s paper explicitly acknowledged variability, several authors (e.g., Fiske 2009; Gordon 2015) have commented that it still holds on to a certain fixation of the work, and they doubt whether the notion of a score or script as determinative of a work’s identity adequately articulates current practices in (TBM) installation art. What if there is no score or script prior to a work’s first installation? What if the score or script has to be made retrospectively? What if it changes in the course of the work’s successive reinstallations? Would such changes result in new instantiations or versions of the same work or in new works? Actually, many of these questions were already foreseen in Laurenson’s (2006) paper, which pleaded to consider TBM installations as somewhere in between the allographic and the autographic and included a wide array of documents and other information as possibly contributing to a work’s “score.” The subsequent discussion, however, helped to further articulate the problems at stake. Tiziana Caianiello (2013), for instance, points to some significant differences between media installations and music in relation to their materiality, space, and situation. In contrast to notated music, media installations may contain material components whose significance is more than merely functional and that contribute to the work’s identity. The same goes for the space in which a work is performed: the identity of a symphony does not include the characteristics of a specific concert hall, whereas a media installation can be site-related or site-specific. She therefore proposes to view the ontology of media art as comparable to architecture, rather than music or theater, with regard to both its notation and its dependence on a history of production. Architecture falls somewhere in between the autographic and the allographic.
Building plans may contain specifications—for instance, on building materials—that are crucial to a building’s
identity but do not meet the requirements of a notational system. Space and situation are among the most important parameters with regard to the staging of such works, which makes it impossible to identify a work independently from its history of production. Caianiello distinguishes three rather than two stages in the process of creating a media installation: (1) an artist (or an interdisciplinary team) drafts the concept of the work, including the production of audiovisual material and the implementation of the technology; (2) a “director” (who could be the artist or someone else) stages the work in a public situation; (3) a viewer perceives the work, participating in its performance. She observes that very often, artists revise the concept of a work prior to a new staging. Adaptations made during the work’s staging in response to differences in space and situation can result in significant aesthetic and conceptual changes, for which the personal intervention or authorization of the artist is considered to be important. She advises meticulously recording all stagings of an installation, together with the interpretations and deliberations of which they are the outcome, and including context-related aspects of the installation, such as site and space, and the work’s presentation history among its central attributes. These recommendations have been addressed in the Documentation Model for Time-Based Media Art proposed by Joanna Phillips (2015). This model reflects the two-stage constitution of TBM art by documenting not only what would be the equivalent of the score of a work (the “identity” of an artwork) but also the history of its successive manifestations (the “iterations” of an artwork). In contrast to a musical score, artist-provided installation instructions are a part of, but not the only component of, the score.
Taking into account that even a very “thick” set of artist’s instructions may be applicable to a single iteration only and not provide criteria for future iterations in different environments, an Identity Report is compiled by the responsible institution on the basis of additional information, ranging from research into the work’s previous iterations and changing constellations of components to artist statements. The Identity Report specifies how the work should be experienced, which elements may vary and which may not, and how the work should be preserved in the future. The purpose of this Identity Report is “to characterize the work’s behaviors under different circumstances, and to create a rich description of the artwork as a system in its relation to an environment” (Phillips 2015, 176). Simplified installation instructions are derived for use in specific exhibitions or loans (also see Chapter 11, “Documentation as an Acquisition and Collection Tool for Time-Based Media Works”). The second stage is documented as well: Iteration Reports are made for each successive installation of the work, including not only the details of the installation but also the collaborative decision processes leading to specific choices and the reception and evaluation of the result by different stakeholders. “The purpose of Iteration Reporting is creating a history of change, which is related to the history of decision making that determines the ‘career’ of unstable, changing artworks such as time-based media works of art” (ibid., 177). The question of how to accommodate change while at the same time preventing the erosion of a work’s identity is addressed by the interaction between the two types of reports. The Iteration Reports are more than factual registrations of how installations turned out; they have an evaluative dimension.
The information in the Iteration Reports may feed back into the Identity Report, allowing for a dynamic understanding of the artwork’s identity over time while maintaining a critical and self-reflective stance vis-à-vis choices made. “The overall purpose of the model is to create a detailed record of an artwork’s change over time. However, this documentation model also has the benefit of serving as a tool for institutional self-reflection, making current choices transparent to future
interpreters, and thereby helping to prevent uninformed and compromising realizations of an artwork.” (Ibid., 168)

2.2.4 Differential Approaches to Conservation

We could characterize the way in which Laurenson, Caianiello, and Phillips incorporate the evolving nature of TBM artworks as an attempt to open the work to controlled change. Theoretically, they draw on Anglo-Saxon analytical philosophy (e.g., Nelson Goodman and Stephen Davies). More open-ended conceptualizations of artworks’ changing identities have often been informed by French “philosophies of difference,” such as the writings of Jacques Derrida and Gilles Deleuze. Although it does not specifically address TBM art, Tina Fiske’s (2009) introduction of the terms “iterability” and “iteration” is important to mention. Fiske’s critique of ontological typifications like Laurenson’s is that they tend to prioritize presence, identity, and actualization and forget about absence, rupture, and difference. Elaborating on the example of an installation (Andy Goldsworthy’s White Walls (2007)) in which the materials behaved differently from what was foreseen by the artist, Fiske reflects on the pervasiveness of discontinuity in the existence of contemporary artworks and on how this is taken into account (or not) within various attempts to redefine concepts like authenticity and identity. She concludes that the concern with presence is still predominant in conservation theory and that conservation is still mainly a form of domestication. Fiske alternatively refers to Derrida’s notion of iterability, “the possibility of . . . being repeated (another) time,” a possibility that “alters, divides, expropriates, contaminates what it identifies and enables to repeat itself” (Derrida 1988, 61–62; quoted in Fiske op. cit., 232).
Derrida criticizes traditional conceptions of writing (and ultimately of communication in general) by pointing to the absence that is fundamental to a written text’s functioning: a written text that one can only understand in the presence of the writer, or of the receiver, or of what the text is about does not work as a text. This is an insight that Derrida then extends to all marks, spoken or written, to every sign, linguistic and non-linguistic, and ultimately to experience as such. The identity of a signifying form is “paradoxically the division or dissociation from itself,” constituted by its iterability, “the possibility of its being repeated in the absence not only of its ‘referent’ . . . but of a determined signified or the intention of actual signification, as well as of all intention of present communication” (Derrida 1988, 10). Because of this absence, the iteration constitutive of the possibility of communication is not merely repetition but also always implies difference or alterity. What a reader reads in a text is always different from what the writer intended or from what the next reader will read—and this is not a negative side effect but a condition for communication to be possible at all. What Derrida writes about absence (as constitutive of communication) and about iterability (as a unity of repetition and alterity) applies to all communication, to all (spoken and written) language equally. Therefore, it seems a kind of category mistake to claim that a particular artwork like White Walls, or even a specific genre of artworks like installations, may have “as a structural capacity” “that breaking force” that written language has (Fiske op. cit., 233). Nevertheless, this introduction of differential thinking into conservation theory opened new ways to conceptualize and value uncontrolled—or unforeseeable—change. A comparable theoretical contribution has been provided by Hanna Hölling’s study of the conservation of Nam June Paik’s works.
Commenting on the Variable Media Approach (see Section 2.2.5) and implicitly on Laurenson (2006), Hölling states that the notion of “variability,”
which suggests change within predetermined limits, is not adequate to indicate the extent of the transformations some of Paik’s works (for instance, Zen for Film, 1962–64) have gone through. Instead, Hölling proposed the notion of changeability: “an artwork’s potential to transform from one condition, appearance, or constitution to another” (Hölling 2017, 76). Rather than conforming to the prescriptions of an authoritative score, Paik’s works are open to changes beyond the initial instructions, and therefore at least some of them have a high degree of changeability. In order to articulate this open-ended and dynamic character of Paik’s art and the creative nature of its reception and continuation, Hölling uses a distinction between the possible and the virtual, derived from Bergson’s philosophy of time and creative evolution and presented by Deleuze. The “possible” is what resembles the real; some “possibles” are realized, some are not, according to rules of resemblance and limitation (Deleuze 1991, 97). The virtual is not realized but actualized; the rules of actualization are not resemblance and limitation but those of difference, divergence, and creation. The virtual “must create its own lines of actualization in positive acts” (ibid.). Translated to art, this would mean that whereas the notion of an artwork’s possible manifestations implies that these resemble already existing ones (like executions of a score that faithfully copy previous ones) and rules out non-resembling ones as genuine instantiations of the work, its virtual realm indicates the work’s potential to generate as yet unforeseeable, divergent manifestations. What should guide new iterations is not an equivalent of a score but what Hölling calls the archive, which includes scores and instructions but also records of previous performances, the artist’s concept of the work, and the tacit knowledge and memory of all persons involved.
As new iterations are both derived from and added to the archive, it is “a dynamic entity involving constant reorganization, addition, and loss. More than a physical realm of papers, files, and objects, it is also a conceptual realm of thought and interpretation, of tacit and embodied knowledge, and a condition of possibility for a multitude of readings.” (Hölling, op. cit., 148) But Hölling goes a step further than broadening the range of actual and immaterial “documents” that qualify artworks’ identities; although she does not completely erase the distinction between the work and its documentation, her account of the archive goes quite a long way in blurring the boundaries between them. “Conceiving of an artwork apart from its archive is unthinkable because the artwork is irreversibly bound to its archive, which shapes its identity, and because its actualization is dependent on the archival realm. The archive is, in fact, an active part of the artwork, rather than some distinct and static repository of documents.” (Ibid., 160) Making sense of the archive requires the informed judgment (phronèsis) of the conservator and the other collaborators involved. Not surprisingly, Hölling emphasizes the creativity involved in conservation: “In recent installation practices, the archive has become increasingly a realm of creative implementation of conservators’ skills and knowledge” (ibid., 158). Although banned from traditional theories of conservation, this creative aspect has, according to Hölling, always been implicitly present; it has only become more explicit in the conservation of contemporary and multimedia art.

Some difficulties of Hölling’s theoretical contributions resemble those of Fiske’s. It is not unproblematic to apply notions like potentiality and virtuality, which were formulated to understand vast phenomena such as time, life, and evolution, to the smaller-scale realities of conserving an installation. Moreover, the concept of changeability runs the risk of becoming tautological, as when the change of a work is considered to be due to its changeability. It is suggested that changeability indicates not only a potential to change but also the limits beyond which an artwork ceases to be the same work, but this normative function is hard to operationalize. Nevertheless, like Fiske’s chapter, Hölling’s work has become an anchoring point for conservation theorists trying to conceptualize the processual and evolving character of much contemporary art, including TBM works.

2.2.5 Medium-Independent Behaviors

The strand of thought articulated by the previous contributions mainly focused on (TBM) installations. The second strand was developed by the Variable Media Approach (1999–2004), a project initiated by the Daniel Langlois Foundation and the Guggenheim Museum that was explicitly instigated by the growing predominance of digital media in contemporary art (Depocas et al. 2003). Its main result was a questionnaire for artists, with the aim of giving them more voice in and commitment to the continuation of their works. Observing that cultural oblivion and rapid technological obsolescence endanger media art, the authors assert “[t]hat is why the variable media approach asks creators to play the central role in deciding how their work should evolve over time, with archivists and technicians offering choices rather than prescribing them” (ibid., 47). Whereas in the previous contributions change has been taken into account and positively valued, in the VMA change is considered even more emphatically a condition for continuation: as the subtitle of the publication states, “permanence” is only possible “through change.” This idea does not apply to digital artworks only. In Richard Rinehart and Jon Ippolito’s later (2014) book, the supposedly stable sculptures by Eva Hesse are compared with the ephemeral wall drawings by Sol LeWitt: whereas several of the former are currently unpresentable because of their material degradation, the latter are in full bloom because there is a well-established procedure for their re-execution. The central ontological notion in this approach is behaviors, “medium-independent, mutually-compatible descriptions of each artwork” (Depocas et al. 2003, 48) that “describe an artwork in its ideal state” (ibid., 50).
This reflects the fact that, in contrast to painting and sculpture, in media art, and in particular digital media art, the images (or sounds) can be separated from their carriers and transferred to another medium. Moreover, without this possibility, media art would be destined to disappear. Although I just called “behaviors” an ontological notion, the categories described as such do not have the philosophical pretension of providing an ontological typology; they are presented as rather loosely defined organizing labels for sets of questions with which artists and others using the questionnaire can indicate significant features of the work. They are different from the various “integrities” or “work-defining properties” mentioned before, as they primarily relate to those properties of the medium(s) that are relevant for the work’s continuation and that can, but do not necessarily, have aesthetic, conceptual, or historical dimensions. The specification “medium-independent” is a bit confusing, as the descriptions are all related to the medium(s) of the artwork, but it means that they do not coincide with one particular medium and that more than one label can apply to the same work. They are interesting for an ontological exploration for several reasons: they point to a wider range of ontological possibilities than the allographic/autographic binary; they do not exclude each other; and their relative importance may change during an artwork’s lifetime.

The behaviors “contained” and “installed” roughly correspond to the autographic/one-stage and allographic/two-stage distinction. Relatively stable formats like paintings and sculptures are “‘contained’ within their materials or a protective framework that enclose or support the artistic material to be viewed” (Depocas et al. 2003, 124). “Installed” are those works that need more complex operations for their physical installation than “hanging it on a nail” (ibid., 125), “in the special sense of changing every time there is an installation” (ibid., 48). “‘[P]erformed’ works include not only dance, music, theatre, and performance art, but also works for which the process is as important as the product” (ibid., 128); the label assumes a “theatrical or musical setting” (ibid., 48), but the term also applies when “re-creators have to reenact original instructions in a new context” (ibid., 49) (the example mentioned is finding fallen branches for Meg Webster’s Stick Spiral). “‘Reproduced’ describes any medium that loses quality when copied” (ibid.): “[a] recording medium is ‘reproduced’ if any copy of the original master of the artwork results in a loss of quality. Such media include analog photography, film, audio, and video” (ibid., 128). A medium is labelled “duplicated” when a copy cannot “be distinguished from the original by an independent observer,” which applies to “mediums that can be perfectly cloned” (ibid., 125)—this goes for digital works but also for mass-produced items such as the candies in Gonzalez-Torres’s Untitled (Public Opinion). “Interactive” are works in which visitors are allowed to manipulate or modify the work, digital or non-digital (ibid., 126); here, Gonzalez-Torres’s Untitled (Public Opinion) is an example as well.
The description “encoded” applies if “part or all of the work is written in computer code or some other language that requires interpretation (e.g., dance notation)” (ibid., 125), and “networked” if the work is distributed across (ibid., 50) and to be viewed on an electronic communication grid (ibid., 127). What is somewhat confusing in this terminology is that the behaviors do not seem to be altogether consistent in what they refer to: sometimes a label concerns important features of the “original” work that should be kept in translation (works that must be installed or performed); sometimes it concerns what can be done with a work to perpetuate it, including the consequences of such perpetuation (reproduced; duplicated); here, reproducible or duplicable would have been more adequate labels. The second component of the Variable Media Approach is the list of preservation strategies designed to negotiate the inevitable “slippage” between the ideal and the real state of an artwork. A variable media strategy is “one of several philosophical approaches to solving a particular preservation issue” (Depocas et al. 2003, 125). Although some behaviors seem to connect naturally to corresponding strategies (contained works, for instance, seem to call for storage), this connection is in no way a necessary one. Storage is the most conservative strategy (ibid., 129); the example mentioned is buying a supply of light bulbs for Dan Flavin’s fluorescent light works and putting them in a crate; the disadvantage of this strategy is that these ephemeral elements will eventually cease to function, and the work will cease to exist (ibid.). The strategy of emulation means “to create a facsimile . . . in a totally different medium” or “to devise a way of imitating the original look of the piece by completely different means” (ibid., 125). The term can be applied to any refabrication or substitution of an artwork’s components but has a specific meaning in the context of digital art.
A typical example is when new software impersonates old hardware. The migration strategy is not to imitate a work’s appearance with a different medium but to upgrade its medium to a contemporary standard—for instance, replacing video monitors with flat-screens for Nam June Paik’s TV Garden (ibid., 126–127), accepting any resulting changes in the look and feel of the work. The most radical strategy is reinterpretation: this happens when hardware is replaced by a different apparatus with the same social or metaphoric function (a teletype becomes a cell phone) or when a performance is recast in a completely different time period and setting (Hamlet in a chat room). Reinterpretation “is a dangerous technique when not warranted by the artist, but it may be the only way to re-create performed, installed, or networked art designed to vary with context” (ibid., 128).

The VMA behaviors and strategies have become important reference points in conservation discourse. The VMA foregrounds artists’ views on the future of their works, which is both a strong point and a possible liability. As Caianiello (and many others with her) observes, the intentions of the artist may change over time, and technological developments are not always foreseeable. The responsibility for preservation should therefore not be shifted from museum personnel to the artist:

“Although born of a justified need to involve artists in the process of preserving their works, this approach is so radical in nature that it releases museum personnel from their responsibility and rouses misleading hopes that the artist’s authority can create unassailable guidelines for action in the future. Decisions can be taken only in response to tangible situations, however, and are always the outcome of a compromise between different requirements.” (Caianiello 2013, 217)

2.3 Ecologies

2.3.1 Institutions and Networks

The importance of the tangible situations in which conservation decisions have to be made has been recognized by many other theorists in the field. Thus, Laurenson (2013) has observed that many analog media depend on a technological ecosystem that is currently under severe threat; despite heroic efforts, it may become impossible to continue to support the display of these works in the way the artist intended. Around any complex artwork, there is a possible or existing network of people who have the expertise and knowledge necessary to support that work. Experience is teaching us that the conservation of these artworks is as much about the development or preservation of these networks or ecosystems as it is about the object in isolation (Laurenson 2013, 41). In their 2014 book, Rinehart and Ippolito distinguish three types of external threats that endanger the futures of new media artworks: death by technology, death by institution, and death by law. By the first death, they mean what they call the “accelerating cycle of contemporary obsolescence” (Rinehart and Ippolito 2014, 46), governing the “global nexus of associations” formed by the economic and social apparatus of producers, marketing, and users (ibid., 45). The second death comes from the in-built preference of collecting institutions for storage over access (ibid., 77). The third death comes from intellectual property law, copyright, and licensing rights that tend to restrict or prohibit access to works or their future recreations. Instead, Rinehart and Ippolito plead for an Open Museum scenario: an open, widely accessible, and experimental environment in which, for instance, a wide variety of professionals and members of the public could add their interpretations and memories of the work (ibid., 106–114).
The works’ technological, institutional, and legal environments have become a prime consideration in what could be called the “practice turn” or “ecological turn” in conservation thinking. Whereas the previously discussed theoretical developments drew on philosophical concepts and theories, the inspiration for this turn to the practicalities of the works’ environments comes from sociological and anthropological theories and from the interdisciplinary field of science and technology studies. The “turn to practices” (Schatzki 2001) directs the focus of investigation from the inherent structures of the artworks to the ways in which these artworks’

Renée van de Vall

careers are “entangled with the museum’s repertoires, organizational structures, work divisions, politics, documentation procedures and levels of engagement” (Van Saaze 2013, 183). In other words, it understands the development of artworks in terms of their embedding in their environments, or ecology, “the material, atmospheric, semiotic, and imagined conditions in and through which something—be it a category, a body, a rock, or a god—exists, subsists, and becomes” (Domínguez Rubio 2020, 8).

2.3.2 Doing Installation Art

Vivian van Saaze’s pioneering study Installation Art and the Museum (2013) was not the first ethnographic investigation of backstage museum practices, but it was the first to focus on how conservation practices contribute to the continuous construction of installation artworks, including TBM installations, as museum objects. Working with concepts and theories from science and technology studies, in particular actor-network theory (ANT), she showed how central normative categories (original/authentic work; artist’s intention and authorship; ownership) were being done on the museum’s work floor through the often concerted—but not infrequently diverging—efforts of collectives of human and non-human actors. ANT holds that its objects of investigation—here artworks, in other seminal studies phenomena like the microbes discovered by Pasteur (Latour 1999) or the disease of atherosclerosis (Mol 2002)—are not things in and for themselves, waiting to be discovered, diagnosed, or treated. Their existence as singular and stable entities is an achievement and depends on the continuation of that achievement. This makes an artwork’s ontology not a starting point for but a result of conservation practices. In Annemarie Mol’s words: “Ontology is not given in the order of things, but . . . instead, ontologies are brought into being, sustained, or allowed to wither away in common, day-to-day, sociomaterial practices” (Mol 2002, 6; quoted in Van Saaze 2013, 80). Therefore, an ANT approach does not take the work of art for granted but aims to “open the black box” and investigate the interactions and transformations of the various actants that have made it and continue to make it into what it is.
Through her observations of museum work floor practices and her archival research, Van Saaze could observe how the authenticity of Nam June Paik’s One Candle (1988) in the Museum für Moderne Kunst in Frankfurt was not an indisputable given but “was carefully manufactured within the museum and its authenticity . . . enacted through a range of actants such as agreements, taxonomies, wall labels, photographs, etc.,” not only stabilizing the object in the collection but also separating it from incompatible versions of the work (ibid., 105). Likewise, she could observe how two different museums working with artist Joëlle Tuerlinckx constructed two completely different approaches to and accounts of what both considered to be her intentions—the one more focused on the formal features of the finished artwork and the other more on the production process, resulting in two contrasting strategies for the documentation of her installations, which Van Saaze understands in terms of the contrast between knowing that and knowing how (ibid., 140).

2.3.3 Biographies

Van Saaze’s investigations showed that, contrary to traditional views on conservation, works of art do not stop changing once they enter museum collections. In other words, the acquisition of a work of art by a museum does not mean an end to its biography. The notion of cultural biography, derived from the cultural anthropology of things and the study of material culture (e.g., Appadurai 1986; Kopytoff 1986), was introduced into conservation theory in order to incorporate the changeability of contemporary artworks and—while recognizing the individual characteristics of each work of art—to serve as a framework to detect similarities and patterns in the dynamics of such works’ development (Van de Vall et al. 2011). The central ideas of this biographical approach are (1) that the meaning of an object and the effects it has on people and events may change during its existence due to developments in its physical state, use, and social, cultural, and historical context and (2) that these changes will tend to conform to culturally recognized stages or phases, with specific cultures having different notions of what counts as a successful career for an object. An important concept in the biography of things is the notion of regimes of value, coined by Arjun Appadurai in order to argue that being a specific category of object (like a commodity or a work of art) is not a property that some things possess and others don’t; belonging to such a category is a state or situation objects can move into and out of, across different regimes of value, as a phase in their biography.

2.3.4 Producing Permanence

A next step was taken by Fernando Domínguez Rubio, whose ethnographic study of the Museum of Modern Art (2014, 2020) more or less reversed the perspective on change: rather than a fortunate or unfortunate exception, either to be prevented (as damage or loss) or to be cherished (as potentiality or condition for survival), change is the rule. Museums have to create an illusion of stability in the midst of all-pervasive and relentless change, an illusion on which the modern Western notion of art and the modern aesthetic regime depend. “All artworks undergo a continual process of physical transformation and decay that constantly threatens to undermine the specific relationship between material form and intention that defines them as meaningful and valuable ‘art objects.’ The question the museum has to solve is how to prevent, or at least how to slow down, this unremitting process of change and degradation so that artworks can retain their meaning and value as timeless ‘objects’ of formal delectation.” (Domínguez Rubio 2014, 622) Some works of art lend themselves more easily to this work of stabilization, preservation, and objectification than others, but whether an artwork is “docile” or “unruly” depends not so much on the inherent qualities of different types of work as on how they behave within specific organizational and institutional contexts (ibid.). In a classic museum such as MoMA, the relative stability, classifiability and knowability, and portability of oil paintings place them closer to the “docile” pole of what Domínguez Rubio presents as a continuum. Nam June Paik’s Untitled (1993), being relatively variable, elusive, and unwieldy, comes closer to the “unruly” pole.
In his 2014 article, Domínguez Rubio describes the relationship between objects and contexts as a reciprocal one, however: museums shape objects, but “docile and unruly objects actively shape internal boundaries and hierarchies within the museum and, through them, the wider processes of institutional and cultural reproduction” (ibid., 623). Docile objects help to create and sustain the conditions under which they are constructed and represented as such (ibid., 632); unruly objects tend to create discontinuities and may lead to transformations in the institutions that collect them (ibid., 640–641). In his 2020 book, however, this reciprocity seems to hide behind the massive “aesthetic machine” that continuously reproduces the precarious stability of artworks as objects in spite of their inevitable disintegration as things. “Museums are machines for art”; “[they] are not just a means for the display of art; they are aesthetic machines whereby a specific way of imagining, narrating, and practicing art becomes possible” (Domínguez Rubio 2020, 15). Rather than questioning the museum as an institution, modern and contemporary art has reinforced it, because it cannot be understood without the museum and needs the museum to create the conditions for a certain permanence, or at least a deferral of change (ibid., 17). However, it is new media art that may provide an impetus to institutional change. Initially, it appeared that its vulnerability might be countered through digitization, but digital ecologies turned out to be much more unstable, unpredictable, and volatile than the analog ecologies they sought to replace (Domínguez Rubio 2020, 304). Digitization demands a new ecological nexus “that transforms the institutional structures and logics upon which the museum has traditionally relied to produce the systems of boundaries and categories that have organized the modern aesthetic regime of art” (ibid., 305). Instead of producing stabilized, original, authentic, and discrete objects, such a nexus should allow for objects that circulate across environments and technologies (ibid., 308); that exist in multiple forms and must keep multiplying to survive (ibid., 311), blurring the distinction between original and copy (ibid., 313); that are regenerated within each new environment, changing in content and appearance (ibid., 314–315); and that are distributed across different components (hard drives, internet sites, servers) located in different spatial, temporal, and property regimes. These developments might, but will not necessarily, lead to an abandonment of the modern aesthetic regime, as the tech industry continuously attempts to further enclose rather than open up technologies and infrastructures in search of profit (ibid., 323).

2.4 Conclusion

How these developments will turn out is not for this chapter to answer. We might point to theoretical attempts to conceptualize the interdependence of TBM artworks and their environments and to formulate alternative futures, such as the Open Museum mentioned above (Rinehart and Ippolito 2014) or Annet Dekker’s (2018) characterization of net artworks as evolving, collaborative “assemblages” whose continuation is the concern of “networks of care”—networks that bring together professionals and users engaged in the work, with knowledge from a variety of fields and backgrounds, and that “can operate without the structures of centralized archives and authorized custodians, which are present in most museums” (Dekker 2018, 91). What this chapter hopes to have demonstrated is that the continuous effort to understand both the ontologies and the ecologies of TBM artworks, and in particular their interactions, is a vital part of the caring work involved in their continuation.

Bibliography

Appadurai, Arjun. “Introduction: Commodities and the Politics of Value.” In The Social Life of Things: Commodities in Cultural Perspective, edited by Arjun Appadurai, 3–63. Cambridge: Cambridge University Press, 1986.
Buschmann, Renate. “On the Debate About Media Art and Its Conservation.” In Medienkunst Installationen: Erhaltung und Präsentation / Media Art Installations: Preservation and Presentation, edited by Renate Buschmann and Tiziana Caianiello, 199–205. Berlin: Dietrich Reimer Verlag and Düsseldorf: Stiftung imai, 2013.
Caianiello, Tiziana. “Materializing the Ephemeral: The Preservation and Presentation of Media Art Installations.” In Medienkunst Installationen: Erhaltung und Präsentation / Media Art Installations: Preservation and Presentation, edited by Renate Buschmann and Tiziana Caianiello, 207–29. Berlin: Dietrich Reimer Verlag and Düsseldorf: Stiftung imai, 2013.
Clavir, Miriam. Preserving What Is Valued: Museums, Conservation, and First Nations. Vancouver: University of British Columbia Press, 2002.


Dekker, Annet. Collecting and Conserving Net Art: Moving Beyond Conventional Methods. London and New York: Routledge, Taylor & Francis Group, 2018.
Deleuze, Gilles. Bergsonism. Translated by H. Tomlinson and B. Habberjam. New York: Zone Books, 1991.
Depocas, Alain, Jon Ippolito, Caitlin Jones, and Daniel Langlois Foundation for Art, Science and Technology, eds. Variable Media/Permanence Through Change: The Variable Media Approach. New York: The Solomon R. Guggenheim Foundation and Montreal: The Daniel Langlois Foundation for Art, Science and Technology, 2003.
Derrida, Jacques. Limited Inc. Edited by G. Graff. Evanston: Northwestern University Press, 1988.
Domínguez Rubio, Fernando. “Preserving the Unpreservable: Docile and Unruly Objects at MoMA.” Theory and Society 43, no. 6 (2014): 617–45.
———. Still Life: Ecologies of the Modern Imagination at the Art Museum. Chicago: The University of Chicago Press, 2020.
Fiske, T. “White Walls: Installations, Absence, Iteration and Difference.” In Conservation: Principles, Dilemmas and Uncomfortable Truths, edited by Alison Richmond, Alison Bracker, and Victoria and Albert Museum, 229–40. Amsterdam: Butterworth-Heinemann, 2009.
Goodman, Nelson. Languages of Art: An Approach to a Theory of Symbols. Indianapolis, IN: Hackett, 1976.
Gordon, R. “Documenting Performance and Performing Documentation: On the Interplay of Documentation and Authenticity.” Revista de História da Arte 4 (2015): 146–57.
Hölling, Hanna. Paik’s Virtual Archive: Time, Change, and Materiality in Media Art. Oakland, CA: University of California Press, 2017.
Kopytoff, Igor. “The Cultural Biography of Things: Commoditization as a Process.” In The Social Life of Things: Commodities in Cultural Perspective, edited by Arjun Appadurai, 64–91. Cambridge: Cambridge University Press, 1986.
Latour, Bruno. Pandora’s Hope: Essays on the Reality of Science Studies. Cambridge, MA: Harvard University Press, 1999.
Laurenson, Pip. “The Management of Display Equipment in Time-based Media Installations.” In Modern Art, New Museums: Contributions to the Bilbao Congress, edited by Ashok Roy and Perry Smith, 49–53. Bilbao, Spain and London: The International Institute for Conservation of Historic and Artistic Works, 2004.
———. “Authenticity, Change and Loss in the Conservation of Time-Based Media Installations.” Tate Papers, no. 6 (Autumn 2006). www.tate.org.uk/research/publications/tate-papers/06/authenticity-change-and-loss-conservation-of-time-based-media-installations.
———. “Emerging Institutional Models and Notions of Expertise for the Conservation of Time-Based Media Works of Art.” Techne 37 (2013).
Mol, Annemarie. The Body Multiple: Ontology in Medical Practice. Durham: Duke University Press, 2002.
Phillips, Joanna. “Reporting Iterations: A Documentation Model for Time-Based Media Art.” In Performing Documentation, Revista de História da Arte—Serie W, edited by Gunnar Heydenreich, Rita Macedo, and Lucia Matos, 168–79. Lisboa: Instituto de História da Arte, 2015.
Rinehart, Richard, and Jon Ippolito. Re-Collection: Art, New Media, and Social Memory. Cambridge, MA: The MIT Press, 2014.
Schatzki, Theodore R. “Introduction: Practice Theory.” In The Practice Turn in Contemporary Theory, edited by Theodore R. Schatzki, K. Knorr-Cetina, and Eike von Savigny, 1–14. London and New York: Routledge and Taylor & Francis Group, 2001.
Van de Vall, Renée, Hanna Hölling, Tatja Scholte, and Sanneke Stigter. “Reflections on a Biographical Approach to Contemporary Art Conservation.” In ICOM-CC 16th Triennial Conference Preprints, Lisbon 19–23 September 2011, edited by Janet Bridgland. Lisbon: Critério-Produção Grafica, Lda, 2011. https://pure.uva.nl/ws/files/1262883/115640_344546.pdf.
Van Saaze, Vivian. Installation Art and the Museum. Amsterdam: Amsterdam University Press, 2013.


3 A ROUNDTABLE: CURATORIAL PERSPECTIVES ON COLLECTING TIME-BASED MEDIA ART Annet Dekker in conversation with Karen Archey, Ulanda Blair, Sarah Cook, Ana Gonçalves Magalhães, Sabine Himmelsbach, Kelani Nichole, Christiane Paul, and Henna Paunu

Editors’ Notes: This roundtable discussion, conceptualized and moderated by Annet Dekker and held on June 3, 2020, brings together an international group of time-based media curators and scholars who work in a variety of institutional and cultural settings, ranging from museums and galleries to universities, from Asia, Europe, and the US to South America. In conversation, they share their experiences in collecting and preserving art and offer their perspectives on shifting curatorial and conservation roles and evolving collaborative practices. Annet Dekker (Assistant Professor of Media Studies, University of Amsterdam) is a curator and scholar of digital arts; Karen Archey (Curator of Contemporary Art, Stedelijk Museum Amsterdam) heads the research initiative on the conservation, acquisition, and display of time-based media; Ulanda Blair (Curator of Moving Image, M+ in Hong Kong) is developing the moving-image collection at M+; Sarah Cook (Professor of Museum Studies, University of Glasgow) is a curator and writer on digital art and co-founder of CRUMB; Ana Gonçalves Magalhães (Director, MAC USP—Museum of Contemporary Art, University of São Paulo) is an art historian and curator; Sabine Himmelsbach (Director of HEK—House of Electronic Arts Basel) collects and exhibits digital art; Kelani Nichole (Founder, TRANSFER gallery, Los Angeles) explores expanded art practice and supports artists in producing software-based art; Christiane Paul (Curator of Digital Art, Whitney Museum, and Professor of Media Studies at The New School) curates and writes about digital art aesthetics and practices; and Henna Paunu (Chief Curator of Collections, EMMA [Espoo Museum of Modern Art in Finland]) is interested in the relationship between art and environmental and social phenomena.

DOI: 10.4324/9781003034865-4

Within the context of time-based media art preservation, several events and projects have been organized in the past two decades in which the collaboration between curators and conservators was a central topic. In Europe, the first initiative to encourage such exchange was the founding of the International Network for the Conservation of Contemporary Art [INCCA (Website) n.d.] in 1999, which was set up as “a pluralistic model of cooperative research” (Wharton 2009). One of its first activities was the international research project Inside Installations, carried out between 2004 and 2007 (INCCA—Inside Installations n.d.). This in-depth and extensive project showed that, with the increasing complexity of installation art, there was a need to foster greater openness and transparency in the decision-making process by means of collaboration and knowledge exchange between different disciplines and institutions. Whereas the preservation of time-based media has always revolved around relationships and dependencies between the multiple elements of an artwork, these themes also became important for the collaboration between various experts, in which particularly the relationship between the curator, the artist, and the conservator became an important consideration. After more than a decade, collaborations between specialists from different disciplines have become more and more accepted and are implemented both within and between institutions. But how well do these models function, how has this collaborative practice developed, what can be learned from it by those who are struggling to find the necessary expertise or preservation method, and how has the collecting of time-based media art changed curatorial and conservation practices? In an attempt to answer these questions, this chapter is based on a curatorial exchange that took place on a video platform and lasted one and a half hours. The discussion focused on how curators are becoming more closely connected to and involved in preservation practice, particularly where time-based media artworks are concerned. While this happens at various stages, from production to exhibition and collecting, the conversation was aimed at the acquisition process and collection practice of time-based media art.
The premise that prompted the discussion was that there is a need for stronger collaborations between curatorial and conservation departments for time-based media artworks to survive longer; perhaps even a need to merge the two. The group included eight international time-based media art curators who are working in different institutional contexts (museum, commercial gallery, university). Some of the curators have been working with time-based media for several decades; others have more experience with commercial galleries or collecting non-Western perspectives. All of them straddle the division between curatorial and preservation departments and the respective functions and roles. What follows is a summary of the discussion focusing on some of the key commonalities, such as the need for technical and curatorial knowledge, the usefulness and obstructions of standardization, the benefits of collaboration, and the potentialities of shared and distributed networks of knowledge and care (Dekker 2018).

3.1 Moving from Technical Risk to the Importance of Curatorial Knowledge

ANNET DEKKER: The preservation of time-based media art is often discussed in relation to its technical dependencies and unpredictability: the obsolescence of hardware and software, the need for continuous maintenance, the lack of expertise or funding, and the risk of changing or losing parts of the artwork at some point. But are these topics also the concern of curators when they acquire a new artwork into their collection? In other words, what did you consider to be a “risky” artwork, and how did it affect your decision to acquire it?

SABINE HIMMELSBACH: I find that perhaps only ten percent of the artworks that we are collecting could be called very risky; usually these are older artworks. At the moment we are trying to collect a historical overview of time-based media art and net art from Switzerland. Some of the older important artworks are already broken or not fully functioning anymore at the point of acquisition. At that moment we sit down with the artist and discuss how to


best preserve the work in such a way that people can still experience it, even if not in the way it functioned before. Another experience we had was with the work World of Matter (2013–present) by an art collective (Mabe Bethonico, Ursula Biemann, Uwe H. Martin, Helge Mooshammer, Peter Mörtenböck, Emily E. Scott, Paulo Tavares, Lonnie van Brummelen, and Siebren de Haan). The work is an online platform that we transferred to our server (see fig. 3.1). At some point one of the artists, Ursula Biemann, wrote to us that the work didn’t function anymore. However, many people had been involved in setting up the platform, and some separate containers were not mentioned in our initial discussions, so we didn’t know how to take care of those parts. The collective had to dig deeper into what actually had been done and by whom, even though it was done by the same group. In the end it was solved, but with many collaborative projects it becomes really complex to figure out who was responsible for or knows something about the work. Particularly if it is an older work, even one from 15 years ago, people forget what exactly they have done. Often there is also a lack of documentation describing what has been done by whom, etcetera.

SARAH COOK: Yes, I echo that. We tend to think of risk in terms of technical decay, but actually a similar and perhaps greater risk is a lack of connection with the artists in terms of understanding their intention or the process behind the work. Just knowing that I’ve been able to record the intention of the artist, which will remain a part of the work, is more important than knowing whether or not a CRT monitor or a specific software will continue to be around. Hence, the lack of notetaking or documentation about the work can create just as much risk, because often the focus is on the technical material rather than the intent of the artwork.

CHRISTIANE PAUL: Yes, I would second that risk is a very relative term.
Challenges posed by artworks often surface years after acquisition. For instance, the works that I commissioned over almost two decades didn’t seem particularly risky at the time, but by now most of them no longer work properly because so much has changed in terms of platforms and software, whether it is Flash being phased out or something else. There have been so many changes that the works became risky acquisitions in retrospect.

KAREN ARCHEY: We actually have quite a different situation at the Stedelijk Museum, where time-based media acquisitions have less of a tradition. At times acquisitions were made, but before proper documentation could be produced, those in charge of the acquisition had changed positions and were no longer available. So we had to document artworks retroactively. Another example of risk is that we don’t always have the required expertise at the Stedelijk, such as staff who know about the specifics of particular kinds of artworks, for instance, performance art. In those cases, we first had to come up with a best practice around acquiring performance art before we actually acquired it.

ULANDA BLAIR: I recognize these challenges, as we have a similar situation at M+, which is of course a new museum with many newly appointed staff. This means that a lot of the documentation work is happening post-acquisition, in which we continue, or sometimes start, the discussion and planning for future presentations and preservation strategies. One of the solutions is to retroactively interview the artists, as well as to collaborate with other institutions that may have acquired artworks by the same artists. So, in some instances we are still learning about a work’s history and how it should best be presented and preserved.

KAREN ARCHEY: Another challenge that I think is particularly difficult for smaller institutions is that maintenance sometimes happens in art handling departments.
While these are knowledgeable and helpful staff members, a more sustainable procedure towards professionalization, and a future-oriented approach to the care of artworks, is needed, as well as forward thinking about risk assessment that goes beyond presentation needs, which enter at different levels in an institution. Similarly, to be able to do all the documentation that we need to do alongside other collection care tasks, we need dedicated staff who are trained in, or at least familiar with, time-based media conservation practices. While we have made a lot of progress towards professionalization at the Stedelijk Museum, in the end one of the limitations is the lack of funding to pay such dedicated staff members.

CHRISTIANE PAUL: I agree; from an institutional perspective, funds are the biggest limitation or risk for me. Next to funding for more staff, there is the fact that I’m competing with works in different media collected by the institution. Getting support for time-based media specifically is also a matter of making it a priority within the institution, and that very often isn’t an easy case to make.

ANA GONÇALVES MAGALHÃES: Yet often these concerns are not made a priority in an institution. Likewise, the cloud infrastructure—the servers and digital repositories—needed for digital artworks is often situated elsewhere, housed with the cheapest available company. The inconvenience of not having direct access to your collection, as you would with in-house storage, is a risk that is often taken in favor of cheaper hosting elsewhere. A solution would be to develop a shared infrastructure between institutions.

KELANI NICHOLE: This is actually something I’m working on with The Current (The Current—Art That Examines Technology’s Impact on Humanity n.d.)—we have a virtual museum collection of media art, supported by a distributed patronage model.
Our aim is to develop decentralized digital collections storage in which stewardship is more widely shared, including by members or supporters of the museum community, to safeguard the permanence and integrity of artworks. In an attempt to offload stewardship in a different way, the idea is to have the entire collection attached to a networked storage drive with a decentralized protocol, which is accessible to individual plug-ins. Such a model of cooperative digital preservation storage networks allows a shared infrastructure for storage, which would mean less logistical overhead and less dependence on commercial cloud services. In the process it would become possible to streamline digital preservation standards, particularly when it comes to long-term storage.

ANA GONÇALVES MAGALHÃES: Perhaps another interesting example of a risk relates to the work The Book after the Book (1999), a website by Giselle Beiguelman, which we received at MAC USP in 2015. It’s an award-winning hypertextual visual essay about cyberliterature, reflecting the duality between online reading and writing. After years of inactivity, it became a challenge to make the work function in the same way as when it was created almost 20 years earlier. The artist tried to re-stage the piece together with mathematicians and computer scientists from the Institute of Mathematics of the University of São Paulo (IME USP). The main problems were the work’s reliance on a network of other websites that had changed, and would continue to change; the fact that the artist had programmed everything for Web 1.0, so that under Web 2.0, which is more controlled by private software companies, formerly open-source programming was sometimes read as a bug or glitch; and the need to adjust for the speed of modern computers and the web.
By redoing the code and replacing some of the open-source programs with certified programs that followed the security standards of the museum, the team managed to create a “new original.” While the technical preservation was complicated, the discussions between the museum’s conservators, registrars, and the artist were more challenging. For instance, there were simple questions about what the work is and how to describe it, but also more pertinent questions as to what would happen when the work was online again: would it be possible for someone to copy it or steal it, and how would this affect the museum’s collection? Moreover, we questioned what would happen if the work no longer functioned. Should we stop showing the work, or should we (re)interpret the work by translating the intention of the artist when creating it? Or should we accept that the work lives on through its documentation, disseminated through publications and alternative presentations?

ANNET DEKKER: Indeed, these are very important questions, and they all come back to the question “what is it that we are trying to preserve?”

CHRISTIANE PAUL: Yes, but I think this question cannot be answered by curators and conservators alone. From working with conservators over a long period of time, I’ve learned that the biggest challenges are always philosophical and not technical. Technical aspects can be solved, but keeping an artwork’s integrity and being able to understand the artist’s intent is where the curatorial and artistic perspectives and voices, as well as those of staff who worked on previous installations of a given work, are incredibly important. I believe that the conservation challenges ultimately become philosophical rather than technical, and an important step towards answering the question of what we are trying to preserve is thus to connect the different fields of knowledge and insights from the curatorial, conservation, and other departments.

ULANDA BLAIR: I agree, and this is perhaps stating the obvious, but I think curators generally have a much longer relationship with the artists. Often, they are peering behind the curtain, watching as ideas become form. They usually have a good understanding of an artist’s working methods—they make studio visits, meet with studio assistants and managers, etc. So, I think all of that accumulated time and conversation leads to a depth of knowledge that is incredibly important for understanding how a work comes into being.
SARAH COOK: Definitely, a curatorial perspective introduces context in a different way: from studio visits and your knowledge of the socio-cultural and technological histories, you build a longer view on why a particular topic or technology was used. From that knowledge you can anticipate how that work may be interpreted by audiences in the future.

HENNA PAUNU: I very much agree that the biggest challenges are always philosophical and not technical. At the moment, at EMMA (Espoo Museum of Modern Art), we are working with several acquisitions that are commissioned as concept-specific and reproducible works of art. The most important thing in those commissions is to find out and agree about the concept of the work and the artist’s intentions, next to solving any questions related to the life span of the artwork. This information is gathered between conservators and curators and is part of the commission agreement. At the same time, in my experience, the curator’s role is sometimes to take part in planning and production, but the curator is also a representative for the audience, often being the first person to provide feedback on specific questions about an artwork or an artist. I think the main aim is to support the artist and artistic production but also to serve the artist’s audiences.

3.2 The Role of the Curator in Preservation Matters

ANNET DEKKER: Within time-based media preservation, there is a growing and dedicated structure of exchange. Initiatives among museums, such as Matters in Media Art, a collaborative project between the New Art Trust (NAT), MoMA, SFMOMA, and Tate since 2005 (Matters in Media Art 2015), and SBMK, a tightly knit network of museum professionals that aims to develop sound practices around the maintenance and conservation of contemporary visual art and fosters collaboration between different professionals [SBMK (Website) n.d.], have been around for a few decades, and there are several additional project-based initiatives to collaborate and exchange information. Within these networks several standards and protocols are co-created, from the artist’s interview to various documentation models. Generally, also outside these networks, conservators of time-based media art are well connected, and because they use similar methods, it is easy to share information. How do you see this type of collaboration and sharing within curatorial networks, either between institutions or even within a larger institution? From my own experience as a curator and working as a researcher in other institutions, I realized that curators often collect and archive their own records about artworks and artists without much attention to existing standards, which could potentially obstruct preservation efforts. What are your experiences?

ULANDA BLAIR: I think a straightforward and relatively simple strategy is to organize a cross-departmental time-based media committee. Such a committee could be composed of representatives from each department that is involved with time-based media: from curators to registrars, information technologists to conservators, and audiovisual technicians to archivists. At M+ we have this meeting every six weeks to discuss current challenges. Occasionally, a colleague may be invited to present a specific case study that might address, for example, the implications of licensing, or an artwork’s technical elements, or privacy concerns when user data is used in an installation. At M+, the meeting is co-chaired by me and our digital and media art conservator. So, it’s very much a shared curatorial and conservation initiative. Meeting minutes are later shared and kept in a dedicated folder to ensure ease of access and continuity; preferably, note-taking alternates among the members.

SARAH COOK: Questions of trust and different power relations are deeply entrenched in departmental and information structures.
Some kinds of records might be made available to the public via a freedom of information request, but that often means some data will be redacted, while other information might be made available because the curators or artists share it directly. There are different cultures of sharing depending on the size and history of an institution.

ANA GONÇALVES MAGALHÃES: For a workshop we organized on time-based media art preservation, we invited Glenn Wharton when he was a conservator at MoMA, and he spoke about how the attitude of an artist in interview settings could change depending on whether s/he was talking with the conservation team or with a curator. In other words, many artists talk about different things with the curator and the conservator. I guess this is another thing to take into consideration. Sometimes it is useful to have both at the same interview, but at other times it may be counterproductive. I remember that at a certain moment in the past the museum board decided that conservators could only talk to artists after approval by a curator. Such a situation brings out a number of issues, of course, particularly around the hierarchical structures of the museum, which became a hugely complicating factor when trying to discuss problems.

HENNA PAUNU: Yes, this is indeed problematic. I think the curator’s role is very important because I believe they can challenge the artist in a way that a conservator usually cannot when we are talking about the main principles of a work. A conservator’s role is to focus on documentation, which is based on the things or truths the artist is giving or saying, while a curator can more easily ask, for instance, about different options that move beyond the standard questions or protocol.

SABINE HIMMELSBACH: Yes, I think the role of the curator is also about risk-taking, or finding innovative solutions to certain problems.
For example, we just acquired a work by the Swiss sound artist Andres Bosshard, Telefonia—2021–1991–1291 (1991/2020), a telematic piece from 1991 between New York, Winterthur, and the Säntis mountains. Bosshard proposed to make it accessible online through a website, which includes a static archive of all the material from the past but also features a generative performance, which we called “the liquid archive” (see fig. 3.2). While this term worried the conservators, it became a generative way to recombine historical elements from a complex performative media installation. Relabeling this method opened a new path to transforming what the artistic intention was at the time of the initial performance, instead of preserving only the historical documents in a static way. It is a new work, but based on the material from the historical event of 1991.

KELANI NICHOLE: I recognize this, since at TRANSFER gallery we develop contracts for VR artworks, and because the software and hardware change so rapidly, we use the term “living edition” to indicate that when the performance capacities improve in the future, the artist has the option to update the work or not. At the same time, the collector is offered those updates for a small fee to compensate the artist for his or her time. In the end you have a record of acceptable changes.

SARAH COOK: Perhaps similarly, many curators tend to view their curatorial records as being “live.” Their records are constantly updated with additional information, but such changes often lead to different versions that are not always properly tracked in the way this would happen in conservation departments. This challenge becomes even more pressing when working with freelance curators, who also have their own ways of recordkeeping, which don’t necessarily comply with those of the institution they are collaborating with. From my own experience as a freelance curator, it is rare that I have been told about the recordkeeping systems of the organization I’m curating for, and often I offer to share my records—such

Figure 3.1 Mabe Bethonico, Ursula Biemann, Uwe H. Martin, et al., World of Matter (2011–18; www.worldofmatter.net). The maintenance of the collaborative web project, which lives on the HEK server, has presented its challenges because of the many contributors to the platform. Screenshot: Sabine Himmelsbach, HEK


Figure 3.2 Andres Bosshard, Telefonia-1291–1991–2021 (1991/2020; https://telefonia.hek.ch). When HEK acquired the 1991 work in 2020, the artist proposed to make it accessible online, featuring a generative performance and turning it into a “liquid archive.” Screenshot: Sabine Himmelsbach, HEK

as correspondence with artists—for their archive or repository before I am asked to supply them. So, there is also the challenge of gathering all of the information in the first place. One way to approach the tension between necessary standardization (of records about a work) and freedom of interpretation (of records about a work) could be to focus on how information management is taught in curatorial education programs. Knowledge about the value of recordkeeping and documentation practices could be embedded in the teaching. This already happens in cases where curatorial and museum training programs are integrated into Media, Archival, or Information Studies departments instead of the usual Art History department. The same skills of recordkeeping are taught both to future curators and future archivists, and they learn to value each other’s expertise and ways of working.

3.3 Collaboration Beyond the Institution

ANNET DEKKER: We saw some ways of collaborative work in the example of The Book after the Book by Beiguelman, as well as in some of the other examples. So, next to the challenges around the meaning and technical functioning of the work, another challenge may be to cope with all of the different people and organizations that are trying to preserve an artwork. In the case of Beiguelman’s The Book after the Book, the collaborative effort to make the work function again involved the artist(s), research groups from a university, a private company, and other cultural organizations, which may need to reconnect at different points in the future to ensure the functioning of the artwork. In other words, with time-based media artworks, people from different sectors of the museum and people from beyond the institution need to come together to sustain the artwork over time. This network will also have to be maintained somehow. This was also one of the conclusions of Pip Laurenson (Tate) and Vivian van Saaze (Maastricht University) from their research into collecting performance art: that “it is not the problem of non-materiality that currently represents the greatest challenge for museums in collecting performance but of maintaining—conceived of as a process of active engagement—the networks which support the work” (Laurenson and Van Saaze 2014, 39). This dependency on the artists, but even more on the network around the artwork once it is in need of updating, seems urgent but complex, especially with artworks created by larger collaboratives in which it isn’t always clear whom to approach for which problem, as roles may have shifted over time.

CHRISTIANE PAUL: Indeed, even in specialized departments, help is often required either from another department within the institution or from an external party. Moreover, such collaborations are not only necessary for preserving components of an artwork, but also for ensuring that “the digital life” of an artwork is properly documented. In other words, even though an artwork is functioning well, it may change over time due to certain considerations of the artist. For instance, Ian Cheng’s work Baby feat. Ikaria (2013), which is in the Whitney’s collection, is a simulation of three AI chatbots that are commercially used in customer service but are talking to each other in the piece. Cheng’s aim was to have them generate their own conversation, moving from dialogue into absurdity and semantically nonsensical exchanges. The evolving conversation swirls around different abstract shapes on the screen that try to build some sort of form or entity out of the bots’ discussion. In conversation with the Whitney Museum, Ian said that he’d like to change the AI customer service bots, not due to a preservation need, but to follow the technological evolution of bots.
So, I would suggest that it becomes necessary to also develop contextual documentation strategies. In the case of Cheng’s work, the team at the Whitney Museum is documenting these changes and insights in a GitHub repository, where the documentation moves from technical and content issues of the artwork to explaining future potentials that may (in)directly influence preservation decisions about the artwork at some point. This continuous involvement of and exchange between different departments (curatorial, conservation, and IT) brought the interdepartmental collaboration to a different level: instead of focusing on fixing an artwork, it was about documenting the artwork and the changing context in which it evolved.

KELANI NICHOLE: While it is more and more acknowledged that it is important to document the presentation and the visitor experience of the artwork, as in Christiane’s example, it is also important to document the context in which an artwork is experienced, for instance, to show what it was like to be in a gallery space with an Oculus Quest and how awkward it is to configure it correctly. Currently at TRANSFER gallery, we are focusing on creating a “fluid archive” that documents what happens after the acquisition of an artwork, both in the sense of technical changes and of observations made during installation and while the work is on exhibition, since many artworks continue in some ways as malleable entities.

ANNET DEKKER: The documentation of time-based artworks seems to move between explaining the technical functioning of the artwork and explaining the artist’s intention regarding the experience of the artwork and its wider context.
The diversity of documentation strategies emphasizes how time-based media is changing museum practice: the traditional functions of curator and conservator are shifting, as the care of an artwork is not only about repairing or even envisioning future change, but also about following and documenting its evolving “collection life.” Moreover, it shows how the “preservation” of these artworks is also about maintaining the network or ecosystem of the artworks, just as much as it is about the artwork itself. Collaboration is not necessarily new in a museum context. However, instead of discussions that happen at the moment when an artwork is acquired or needs to be restored, there is now a long-term, ongoing exchange between departments that sometimes includes external expertise. Similarly, it is recognized that effective long-term maintenance and management of time-based media art is best done collaboratively, but next to having knowledge about technical histories, art historical influences, and sociocultural phenomena, the challenge is also to find a way to build a sustainable network to support the artworks in the collections. How does one develop ways to organize the expertise needed and an organizational structure to set up and maintain the networks that have become pivotal for the digital life and the preservation of these artworks?

SARAH COOK: I think an interesting example of alternative efforts is the project Future Library (2014–2114) by Katie Paterson, even though it is perhaps not a typical time-based media artwork. For her project, Paterson commissions an author each year, for the duration of the project, to write a manuscript. The idea is that all 100 manuscripts will be printed in 2114. The paper for the books will come from the 100 trees that were planted in 2014, especially for this project, in Oslo’s Nordmarka forest. In the meantime, no one is permitted to read the manuscripts, and a trust has been established to take care of the process. This process includes the artist and contributors but also the shared caring for the trees, their felling and the bookmaking, as well as all the owners of the Future Library anthology, even though they are not explicitly named.
In one of the reviews of the project, it is noted that “the next 96 years do not look promising for the seedlings, which are more vulnerable than their ancestors to all manner of man-made disasters: the storm surges, wildfires, heat waves, and droughts precipitated by global warming, as well as the less dramatic possibility that, amid the daily brutalities of life on earth, people will simply stop tending to them and the books that are their fate” (Emre 2018). So, although the project doesn’t necessarily have any technical media parts in the way that we are used to thinking of time-based media, the way the caretaking is considered by a city, by a location, as well as by an artist and all of the contributors, makes this a project about distributed authorship as well as distributed caretaking.

CHRISTIANE PAUL: A perhaps more pragmatic possibility is to engage in close collaborations with different departments of educational institutions, such as conservation, curation, law (in case legal advice is needed), and computer science. Most museums don’t have the time or resources to hire someone for half a year to analyze the details of an older piece’s coding, but working with students who are doing in-depth analysis as a case study can be hugely rewarding. For instance, we worked with computer science students from NYU, supervised by Deena Engel, on specific artworks from the collection to consider preservation strategies. Even if this was only a first step in the conservation process, it made a huge difference to have a student work on the code analysis of one piece for a whole semester. Such an amazingly in-depth project would not have been possible in the regular workflow of the museum, nor would we have the money and resources to hire someone to do that job for half a year. We all learned so much about the work and actually ended up appreciating it more from both a curatorial and a conceptual perspective.
ANNET DEKKER: At the University of Amsterdam, similar projects have taken place. However, they are still restricted to specific departments. For instance, Archival and Information Studies started to collaborate with Presentation and Preservation of the Moving Image within Media Studies, and there is some contact with Conservation and Restoration of Cultural Heritage. With the latter, however, it’s mostly guest lectures rather than integrated course development. I think it could also be beneficial to organize an interdisciplinary group of students from various departments, including the ones mentioned but also Art History and the faculties of Computer Science and Law, which could focus on different issues simultaneously and collaboratively. In the process, the students learn to collaborate and to solve problems together in a natural way from the start.

SABINE HIMMELSBACH: To briefly go back to the use of shared platforms, such as Git or wikis: we learned that these can be very helpful for getting a whole team—including internal and external people—involved in understanding a process, for example a curatorial or conservation process. At HEK we use a shared wiki, because it is important to get everyone engaged and have them experience first-hand that it is not someone else doing something that you’re not a part of. Making the wiki an active part of the organization and the preservation effort makes a difference in how people see their own role and function, as well as their dependency on and added value for the others. It is really the whole team, from the ones who are booking hotels to the technical team; we really do not want to differentiate. By lifting the hierarchies between departments, job functions, or even generations, we want to encourage people to move closer towards each other, without the fear that they might be regarded as inferior, or, vice versa, as being too established to be approached with a “mundane” question.

ANNET DEKKER: Thank you all for your time and participation in this conversation. It has become clear that collaboration is a complex and intensive process, and instead of short-term discussions that happen at the moment when an artwork needs to be fixed, there is a long-term relation of exchange between departments, often also including external expertise.
To me this indicates that the traditional functions of curator and conservator are shifting, and that the care of an artwork is not only about repairing or even envisioning future change, but also about following and documenting its evolving “digital life.” Moreover, it shows how the “preservation” of these artworks is also about maintaining the network or ecosystems of the artworks, just as much as it is about the artwork itself.

Bibliography

The Current Museum of Art. “Art That Examines Technology’s Impact on Humanity.” n.d. Accessed June 12, 2021. http://thecurrent.art/.

Dekker, Annet. Collecting and Conserving Net Art: Moving Beyond Conventional Methods. London: Routledge, 2018.

Emre, Merve. “This Library Has New Books by Major Authors, but They Can’t Be Read Until 2114.” The New York Times, November 1, 2018. www.nytimes.com/2018/11/01/t-magazine/future-library-books.html.

INCCA. “INCCA International Network for the Conservation of Contemporary Art (Website).” n.d. Accessed June 12, 2021. www.incca.org/.

———. “Project: Inside Installations (2004–2007).” n.d. Accessed June 12, 2021. www.incca.org/articles/project-inside-installations-2004-2007.

Laurenson, Pip, and Vivian van Saaze. “Collecting Performance-Based Art: New Challenges and Shifting Perspectives.” In Performativity in the Gallery: Staging Interactive Encounters, edited by Outi Remes, Laura MacCulloch, and Marika Leino, 27–41. Oxford: Peter Lang, 2014.

Matters in Media Art. “Matters in Media Art: Guidelines for the Care of Media Artworks.” 2015. http://mattersinmediaart.org/.

SBMK. “SBMK (Website).” n.d. Accessed June 12, 2021. https://sbmk.nl/.

Wharton, Glenn. “INCCA, the Model of Conserving Contemporary Art.” Conservation Perspectives, The GCI Newsletter, 2009. www.getty.edu/conservation/publications_resources/newsletters/24_2/incca.html.


4 INSTITUTIONAL ASSESSMENTS AND COLLECTION SURVEYS FOR TIME-BASED MEDIA CONSERVATION

Lia Kramer, Alexandra Nichols, Mollie Anderson, Nora Kennedy, Lorena Ramírez López, and Glenn Wharton

Editors’ Notes: To address the preservation needs of its fast-growing TBM collection, The Metropolitan Museum of Art in New York conducted a major collection survey and institutional assessment in 2017–18. The project identified necessary staff and infrastructure adjustments and resulted in the creation of a permanent TBM conservator position. Based on their first-hand experience as project participants and drawing from conservation literature and numerous interviews with collection professionals around the world, the authors of this chapter offer a roadmap for tackling complex TBM collections of unknown conditions and preservation needs. Met staff at that time included Nora Kennedy (Sherman Fairchild Conservator in Charge of the Department of Photograph Conservation, The Metropolitan Museum of Art) and Mollie Anderson (now Executive Assistant at Weights & Biases, Inc.), as well as Alexandra Nichols (now Time-Based Media Conservator at Tate, London), who at the time was a Sherman Fairchild Foundation Fellow in Conservation at the Met. Lia Kramer (now Andrew W. Mellon Fellow in Media Conservation at MoMA, New York) and Lorena Ramírez López (now Consultant for Digital Preservation and Conservation, Myriad Consulting) were hired as project contractors, and Glenn Wharton (now Professor of Art History at UCLA and Chair of the UCLA/Getty Conservation of Cultural Heritage Program, LA) acted as an external assessor.

4.1 Introduction to TBM Conservation Assessments and Surveys

Acquiring, exhibiting, and caring for time-based media (TBM) artworks can be a daunting challenge for many institutions. To non-specialists, TBM artworks can easily be misconceived as nothing more than simple digital files. In reality, even single-channel videos are often complex, requiring extensive documentation, research, and ongoing attention to preserve them. The dependence of these works on compatible technologies, equipment threatened by obsolescence, and thorough display guidelines puts them at high risk, particularly as time passes. Moreover, because professional TBM conservators have only recently begun to work in museums, many institutions and organizations face a backlog of outstanding research, documentation, and treatment, making it even more difficult to address the needs of their collections.

“What I wanted to do early on was to prove the need for a post for a time-based media conservator and prove the need for a preservation management plan, because there are some interesting attitudes in many institutions towards time-based media, such as, ‘It’s cheap. It’s just that easy. You just stick a DVD in a DVD player, don’t you?’ There’s a perception that if an artwork is sitting on a pen drive, DVD, or hard drive in our stores, that it is safe and accessible . . . but in reality, these carriers have a lack of longevity and so there’s a need for some sort of digital preservation system.”
—Kirsten Dunne, Senior Projects Conservator, National Galleries of Scotland

DOI: 10.4324/9781003034865-5
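A minimal, concrete building block of the kind of “digital preservation system” Dunne alludes to is fixity checking: recording a cryptographic checksum for every file when it enters storage and re-computing the checksums during later audits to detect silent corruption on ageing carriers. The sketch below is purely illustrative; the function names and workflow are hypothetical and are not drawn from any institution’s actual system.

```python
import hashlib
from pathlib import Path


def fixity_manifest(root: Path) -> dict[str, str]:
    """Compute a SHA-256 checksum for every file under `root`.

    Storing the resulting manifest alongside the files allows a
    later audit to detect bit-level changes (silent corruption,
    accidental overwrites) on storage carriers.
    """
    manifest = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            digest = hashlib.sha256(path.read_bytes()).hexdigest()
            manifest[str(path.relative_to(root))] = digest
    return manifest


def audit(root: Path, manifest: dict[str, str]) -> list[str]:
    """Return the files whose current checksum no longer matches the manifest."""
    current = fixity_manifest(root)
    return [name for name, digest in manifest.items()
            if current.get(name) != digest]
```

Production digital preservation systems (and packaging conventions such as BagIt, or repositories designed around the OAIS model) add replication, scheduled audits, and repair on top of this, but the underlying fixity principle is the same.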

Institutional assessments and conservation surveys are vital investments in the survival of these artworks. They form an important first step toward understanding TBM collections, allowing conservators to identify the needs of these collections, evaluate institutional policies and procedures, and advocate for funding and staff resources. Ultimately, they can help museums and other collecting institutions build TBM conservation initiatives, develop effective conservation strategies, and ensure the ongoing care, accessibility, and exhibition of TBM artworks.

From 2017 to 2018, the authors of this chapter conducted a combined TBM institutional assessment and collection survey at The Metropolitan Museum of Art in New York. During this project, the authors interviewed key staff members, evaluated existing policies and practices relating to TBM conservation and care, and examined the collection holdings (see fig. 4.1), identifying risks posed to the collection and areas for further documentation and care (Kramer et al. 2021). The recommendations put forth in the final report helped improve practices and attitudes toward TBM conservation, and as a result, the Met hired its first TBM conservator in 2019. Through this chapter, we hope to share our experience and propose methodologies for surveys and assessments that can be applied to a range of collecting institutions. We begin with traditional methods developed for cultural heritage institutions and show how they can be adapted to serve the unique needs of TBM collections. We then provide guidelines for planning and executing institutional assessments and collection surveys, along with implementing the recommendations in the final report.
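Survey findings are easier to act on when each examined work is captured as a structured record that supports triage. The sketch below shows one possible shape for such a record; the field names, format list, and scoring weights are entirely hypothetical illustrations and do not reproduce the survey instrument used at the Met.

```python
from dataclasses import dataclass


@dataclass
class SurveyRecord:
    """One artwork's entry in a TBM collection survey (illustrative only)."""
    accession_number: str
    title: str
    formats: list[str]                  # e.g. ["U-matic", "ProRes file"]
    has_master_in_storage: bool         # is an archival master accounted for?
    display_instructions_on_file: bool  # are installation guidelines documented?

    def risk_score(self) -> int:
        """Crude triage score: higher means more urgent follow-up."""
        score = 0
        # Carriers commonly flagged as obsolescence risks (example set).
        obsolete = {"U-matic", "Digital Betacam", "DVD", "LaserDisc"}
        score += 2 * len(obsolete.intersection(self.formats))
        if not self.has_master_in_storage:
            score += 3
        if not self.display_instructions_on_file:
            score += 1
        return score


def triage(records: list[SurveyRecord]) -> list[SurveyRecord]:
    """Order survey records from most to least urgent."""
    return sorted(records, key=lambda r: r.risk_score(), reverse=True)
```

Sorting the records by such a score turns a completed survey into a rough priority list for follow-up documentation, migration, and treatment.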

4.2 Foundational Methodologies for Conservation Assessments and Surveys for Cultural Heritage Collections

Conservation assessments and surveys are commonly utilized across a wide range of collecting institutions, such as museums, historic houses, libraries, and archives, and for materials including paintings, sculptures, textiles, books, paper, and photographs. There are several well-established methodologies for conducting these evaluations.

4.2.1 Conservation Assessments

In 1990, the Foundation for Advancement in Conservation (FAIC) and the Institute of Museum and Library Services (IMLS) developed the Collections Assessment for Preservation (CAP) program to encourage a more holistic approach toward collection care with broad, general conservation surveys of collections and museum environments, as opposed to item-by-item collection surveys, which examine the condition of individual objects (Berrett 1994). The FAIC/IMLS CAP program handbook advises that the assessment team should be composed of three individuals who work collaboratively on the assessment: a collection assessor, a building assessor, and a key staff member of the museum (FAIC 2019, 4). The collection assessor is an external specialist working alongside on-site staff members. They provide professional expertise with broad organizational evaluations and an outside perspective that can carry more weight with upper administration or board members (FAIC 2019). The building assessor is responsible for examining the condition of the building envelope and historical environmental data, which is then compared to the risks that materials in the collection face. The key staff member’s role is to provide information about current policies, schedule interviews with other staff members, and facilitate access to various areas of the museum during any on-site visits. The final report, written by the collection assessor and building assessor, provides a summary of the current state of the collection and the building envelope and identifies ways in which collection care can be improved. For more information on this subject, see related publications by the Getty Conservation Institute (Avrami et al. 1999), the Cultural Heritage Agency of the Netherlands (Brokerhoff et al. 2017), and the Council for Museums, Archives, and Libraries (CMAL 2018).

Figure 4.1 Lia Kramer and Lorena Ramírez López examine the custom hardware utilized in Jim Campbell’s Motion and Rest #2 (2002) during the combined TBM institutional assessment and collection survey conducted at The Metropolitan Museum of Art from 2017 to 2018. Image: Naoise Dunne, © The Metropolitan Museum of Art

Lia Kramer et al.

4.2.2 Collection Surveys

Conservators use collection surveys to achieve a variety of goals, such as creating an updated inventory of a collection, evaluating the condition of objects, cataloging components, identifying risks faced by a collection, and prioritizing objects for conservation treatment. Typically, these surveys do not address larger institutional issues such as policies, procedures, and the building envelope. Staff conservators or collection managers commonly complete these surveys, but institutions frequently utilize contract conservators as well. There is no uniform standard for collection surveys (American Museum of Natural History—Collection Surveys n.d.), but they generally fall into one of the following five categories:

1. A broad, general inventory is used to first identify the scope of a collection and is especially helpful when a collection is first being formed, when a large donation is entering the collection, or if there has been inconsistent documentation and registration of the collection in the past. This category of collection survey covers high-level “tombstone” information, such as the name of the artist/creator, culture or region of origin, the date of creation, and the appropriate medium line.
2. Some surveys may focus on a specific type or group of materials, such as a survey to identify works that contain precarious plastics or to identify objects that contain ivory and, therefore, would be subject to government regulations.
3. A storage survey evaluates how items are being stored, assesses whether custom housing is required, and approximates the amount of space available within a specific unit or warehouse.
4. A condition survey focuses on the current condition of objects, estimates the amount of labor required to make a particular object suitable for exhibition or loan, and identifies objects that should be prioritized for treatment.
5. A risk assessment survey calculates the likelihood of future damage based upon the severity of damage from “agents of deterioration” (CCI—Agents of Deterioration 2017) and the frequency of occurrence (Ashley-Smith 2013; Brokerhoff et al. 2017). Often, risk assessments include recommendations for mitigating the risks identified (Taylor 2005; Waller 1994; Xavier-Rowe and Fry 2011; Digital Preservation Coalition—DPC Rapid Assessment Model 2021).

Collection surveys are very flexible and can vary in scope, but they are generally time-consuming and labor-intensive. Staff members working with particularly large collections may adopt certain methods to complete surveys quickly and efficiently. One option is to conduct a sample survey rather than examine every item in the collection. This may include evaluating a certain number of objects per drawer or shelf or creating a random sample to evaluate (Henderson 2000). Another approach is the so-called modified McGinley method, in which the surveyor does not provide detailed information about each artwork but instead assigns each object a numerical score that corresponds with a predefined condition status (Favret et al. 2007; Waller 1995; Carter and Walker 1999). The results of a collection survey can be summarized in a final report that details the state of the collection and highlights artworks that are in need of immediate treatment, thereby informing institutional priorities going forward.
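The scored approach described above lends itself to simple tooling. The following sketch is an illustration only, with hypothetical score bands rather than any published McGinley scale; it tallies per-band counts from a list of condition scores and computes the share of objects flagged for priority treatment.

```python
from collections import Counter

# Hypothetical condition bands for a score-based survey; real surveys
# define their own scale and criteria (these labels are illustrative).
CONDITION_BANDS = {1: "good", 2: "fair", 3: "poor", 4: "unexhibitable"}

def summarize_survey(scores):
    """Tally per-band counts and the share of objects needing priority treatment."""
    counts = Counter(scores)
    summary = {label: counts.get(band, 0) for band, label in CONDITION_BANDS.items()}
    # Here, objects scoring 3 or 4 are treated as treatment priorities.
    summary["priority_share"] = round((counts.get(3, 0) + counts.get(4, 0)) / len(scores), 2)
    return summary

# Example: scores recorded for ten surveyed objects.
print(summarize_survey([1, 1, 2, 2, 2, 3, 1, 4, 2, 3]))
# → {'good': 3, 'fair': 4, 'poor': 2, 'unexhibitable': 1, 'priority_share': 0.3}
```

A tally of this kind can feed directly into the final report’s summary statistics on collection condition.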



4.3 Frameworks for TBM Conservation Institutional Assessments and Collection Surveys

4.3.1 TBM Conservation Institutional Assessments

Our suggested methodology for TBM conservation institutional assessments is strongly influenced by the methodologies developed for the FAIC/IMLS CAP program. An institutional assessment for TBM collections differs from a traditional CAP assessment in that the institutional assessment is led by a TBM conservator and employs a digital archivist or IT specialist as the second assessor rather than a building assessor. This reflects the importance of digital storage and preservation in TBM collection practices. Ideally, the institutional assessment is conducted by an independent TBM conservator who liaises with a member of the institution’s conservation or collection care staff (such as registrars or collection managers). Independent specialists have the advantage of greater objectivity and the cachet of being an outside expert with greater potential for advocacy. If there are no TBM conservators in the region or if a contractor is not a feasible expense for the institution, several initial steps can still be accomplished by staff members that will increase awareness of TBM artwork needs, improve practices, and provide data to justify a future assessment; see Sections 4.4, “Conducting a TBM Institutional Assessment,” and 4.5, “Conducting a TBM Collection Survey,” for more details. Institutional assessments for TBM collections focus on evaluating the organizational policies and procedures in place at an institution, including acquisition and collection care policies, resource allocation, staff hierarchy and organization, and how an institution’s operations align with its mission statement. The assessment is also geared toward identifying needs for equipment to view media for condition assessments and to display the artworks in the gallery.
Finally, institutional assessments evaluate the storage conditions of a collection by examining not only the storage vaults that house media carriers and sculptural components but also the digital storage utilized for preserving media files. It is important to include the perspective of stakeholders across a wide range of roles and departments when evaluating policies and procedures. Curators, collection managers, conservators, audiovisual specialists, and registrars can all make valuable contributions and will have amassed institutional knowledge through their involvement in the artworks’ acquisition and past exhibition. Input from a digital archivist or IT specialist, ideally with experience working with TBM collections, is recommended to facilitate communication with other technical staff and provide advice for improvements to digital preservation practices. The final report summarizes the existing procedures, identifies deficiencies, and recommends concrete actions the institution can take to refine and upgrade its practices.

“From my digital design background, we have been taught that all the digital or technology products couldn’t be done by one person, it’s always a team. I always think it is very funny that someone expects that a conservator can be proficient in all kinds of mediums. So I continue to tell all of my colleagues that these products and digital files are not designed by one person, so people who want to conserve them should also be a team, not an individual.” —Yuhsien Chen, Freelance Time-based Media Conservator, Taipei, Taiwan



4.3.2 TBM Conservation Collection Surveys

In contrast to the institutional assessment, a TBM conservation collection survey is an audit focusing on individual artworks rather than the greater museum environment. This item-by-item survey seeks to document the makeup and condition of the collection; what media carriers, sculptural components, or dedicated equipment are present; and what supporting internal documentation is available. Collection surveys for TBM collections require special considerations beyond those of more traditional media. In addition to focusing on the preservation of physical components, TBM surveys also examine digital files and intangible conceptual aspects that may be central to the artwork. Each of these components, whether analog or digital, requires a thorough cataloging review for accurate tracking and historical record (Sherring et al. 2018). The artwork might rely on unique or obsolete hardware or software for display. Taking stock of available viewing, playback, and display equipment will inform conservation purchasing needs and in-house migration capacity relative to the size of the collection. Documentation plays an important role in transmitting and communicating the understanding of the conceptual aspects of a TBM artwork, particularly how it may be displayed and preserved and what conservation interventions are appropriate. This supplementary documentation is integral to the preservation and continued ability to display TBM artworks and thus should form a significant portion of any TBM collection survey. Such documentation may include noting the existence of any acquisition agreements, copyright agreements, recordings or transcripts of artist interviews or correspondence, installation specifications, conservation treatment reports, and documentation of past displays. If the collection does not have policies for creating and maintaining this documentation, a TBM institutional assessment may be needed.
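To make the cataloging task above concrete, each surveyed artwork can be captured as a structured record listing its carriers, equipment, and supporting documentation. The sketch below is illustrative only; the field names and sample values are hypothetical, not a published schema.

```python
from dataclasses import dataclass, field

@dataclass
class TBMSurveyRecord:
    """One row of an item-by-item TBM collection survey (illustrative fields only)."""
    accession_number: str
    title: str
    media_carriers: list = field(default_factory=list)       # e.g. ["Digital Betacam", "LTO-6 tape"]
    dedicated_equipment: list = field(default_factory=list)  # artist-modified or obsolete hardware
    documentation: list = field(default_factory=list)        # e.g. ["acquisition agreement"]
    condition_notes: str = ""

    def missing_documentation(self, required):
        """Return the required document types absent from this record."""
        return [doc for doc in required if doc not in self.documentation]

# Hypothetical record: flag gaps against a checklist drawn from the survey scope.
record = TBMSurveyRecord("1999.001", "Untitled (single-channel video)",
                         media_carriers=["Digital Betacam"],
                         documentation=["acquisition agreement"])
print(record.missing_documentation(["acquisition agreement", "installation specifications",
                                    "artist interview"]))
# → ['installation specifications', 'artist interview']
```

Keeping survey rows structured this way makes it easy to aggregate results later, for example, to count how many artworks lack installation specifications.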
Prior to embarking on an institutional assessment or collection survey, an institution should consider forming a TBM working group. TBM working groups are composed of institutional stakeholders from a range of departments who have a vested interest in the continued acquisition, care, and display of TBM artworks. Members may include conservators, curators, collection managers, registrars, audiovisual specialists, exhibition designers, and members of the legal team, among others. TBM working groups can function as a means of support and education, through which their members can share concerns, address challenges arising from specific artworks and exhibitions, and develop policies and procedures relating to TBM artworks. Holding regular working group meetings can help build relationships, foster consensus across departments, and sustain focus on the issues surrounding TBM artworks (see Chapter 10).

“I started in 2015 as the first full-time art conservator for the Portland Art Museum in Oregon. We didn’t have any specific funding or dedicated time for this project, so we had to be very strategic and focused with what we could do. It was very clear that the approximately 25 TBM artworks had been overlooked and needed some attention. Since I was new, it was a good opportunity to compile a working group so we could talk about these issues. I approached people individually to talk about how this might impact their work and what they would like to get out of the group. When we met all together I made sure for everybody to have a voice. That helped to make sure that there was buy-in; everybody wanted to participate and help it work. Looking back, the small size of the institution was beneficial to our success. With a smaller staff and fewer levels of hierarchy,



it was easier to communicate and collaborate than I had experienced in larger organizations. Several departments had only one person, which helped our team feel empowered to make change once we identified our needs. I used the Matters in Media Art collection survey document (Matters in Media Art 2015) to inform my approach. As I went through the collection object-by-object, I was able to consult with these colleagues who had institutional knowledge of the work and could collaborate to put policies into place. Having that support was the key to success.” —Samantha Springer, Owner and Principal Conservator of Art Solutions Lab, Portland, Oregon

“The first conversations with management around setting up an interdepartmental taskforce to tackle issues related to the acquisition, display, documentation and preservation of TBM objects started in summer 2017. Later that year, then Senior Conservator Christel Pesme and then TBM Conservator Shu-Wen Lin organised an international symposium to introduce the institution to the current discourses and practices related to TBM conservation. In 2018, Pesme and then TBM Conservator Yuhsien Chen established the Time-based Media Committee, which is an interdepartmental platform among Curatorial, Conservation, Registration, Databases, Digital Team, Rights and Reproductions, IT, and Installations departments. Through this platform, David [Smith] and I have worked with other departments to develop cross-departmental workflows for TBM conservation. We had a huge help from our registrarial and curatorial teams. As we’re a new museum, we had to design everything from scratch, departmental roles are in flux—our very special circumstance means that we’re still learning. Conservation has taken a leading role in proposing solutions and workflows to outline who is doing what, from the very moment that you pull a hard drive out of storage, to how it is brought to the conservation station, how the hard drive’s contents are ingested . . . how to store information about the hard drive and where—what is going to TMS, and what is going to our object files. It took a while to design these processes and they were quite complex.” —Aga Wielocha, Conservator, Preventive, and David Smith, Conservator, Time-Based Media, M+ Museum, Hong Kong

4.3.3 Organizing the Institutional Assessment or Collection Survey

TBM institutional assessments and collection surveys (see Table 4.1 for a comparison of the core characteristics of both) require an array of resources. For a collection survey, one may need to consider whether the institution has staff members with specialized knowledge and experience examining, evaluating, and caring for TBM artworks and whether they are able to set aside time to contribute to such a project. Similarly, an institutional assessment will also require a significant time investment from staff and outside contractors and may require institutional support for grant applications or other fundraising efforts to hire an external TBM conservator as the primary assessor. External consultants have the advantage of an impartial, third-party perspective, and their critiques and recommendations are often better received by colleagues and leadership than those of existing staff. Although an experienced TBM conservator is necessary to properly evaluate the condition of media files or advise on institutional practices relating to TBM artworks, initial steps toward more in-depth projects can be completed by conservators in other specialties, registrars, collection managers, curatorial staff, and/or audiovisual specialists if a TBM conservator is not present (see Section 4.5, “Conducting a TBM Collection Survey”).

Table 4.1 Core characteristics of a TBM institutional assessment and a collection survey in comparison

Institutional Assessment:
• A holistic evaluation of the entire institution as it pertains to TBM artworks.
• Evaluates established policies and procedures relating to TBM acquisition, exhibition, conservation, and collection management.
• Examines staffing and other resources for TBM conservation.
• Builds awareness and support across departments.
• Assesses suitability of physical storage vaults for media carriers and display equipment.
• Assesses suitability of digital storage infrastructure (servers, LTO tapes, etc.) and preservation practices for digital artwork files.
• Gauges TBM-related equipment purchasing needs.
• Identifies funding and staffing deficiencies.
• Provides prioritized long- and short-term recommendations for improvements.
• Provides information about the collection that can be used to do the following:
  ○ Advocate for funding.
  ○ Advocate for improved storage.
  ○ Aid in writing grant proposals.

Collection Survey:
• Evaluates the condition of individual artworks.
• Creates an inventory of the collection.
• Identifies analog and digital media carriers present.
• Tests media carriers for the ability to access content; evaluates the quality of content.
• Evaluates the condition of artist-modified or dedicated display equipment.
• Prioritizes artworks for digitization or other conservation interventions.
• Assesses risks to individual artworks.
• Creates an inventory of any display equipment used for the exhibition of TBM artworks.
• May provide prioritized long- and short-term recommendations for improvements.
• Provides information about the collection that can be used to do the following:
  ○ Advocate for funding.
  ○ Advocate for improved storage.
  ○ Aid in writing grant proposals.

“My initial goal was just to bring awareness to the fragility of the objects and to get a sense of what we had and whether it was stored properly and not sitting on someone’s desk or in a file folder. The second goal was to consider what was being requested at acquisition. The third goal was to educate colleagues about and create protocols for various file types, handling, accessibility, sharing, and storage. I looked at it as slowly, stepwise putting those things into place to first raise awareness and create a baseline to then consider what we could do to protect the collection.” —Samantha Springer, Owner and Principal Conservator of Art Solutions Lab, Portland, Oregon

Whether planning an institutional assessment or collection survey, the collection will benefit if the methodology and structure of the project are tailored to the resources, needs, and structure or composition of the institution or collection. Table 4.2 details the staff and resources recommended for each type of evaluation. An initial or limited collection survey can also be utilized to advocate for additional funding and support to conduct a more in-depth survey or institutional assessment. Some basic statistics about the collection, knowledge of its importance to the institution and the public, and an evaluation of the inherent risks of TBM collections are useful in creating compelling arguments

Table 4.2 Necessary project staff and resources

Institutional Assessment:

Existing Staff
• Conservator, curator, collection manager, and/or registrar to act as liaison to the assessor(s).
• IT specialist or digital archivist to advise on digital preservation of TBM artworks.

External Consultants
• An experienced TBM conservator to act as primary assessor (ideal).
• Digital archivist or preservation specialist, if not available on staff.

Time Commitment
• Time commitment for on-site assessment work will vary based on assessors (internal vs. external, part-time vs. full-time). Depending on the size of the collection, this may be concentrated over a week or weeks or extended over a longer period of time.
• Existing staff must be willing to participate in interviews and act as liaisons for assessor(s), providing existing policies and procedures and access to internal documents and storage spaces.
• Ample time should be devoted to writing and editing the final report and may extend weeks or months beyond the assessment.

Material Resources
• Desk, table, or another workspace.
• A private room to conduct interviews with key staff members.
• Access to physical storage vaults.
• Access to a networked computer to examine and evaluate the digital storage environment and content management system.

Funding
• Compensation for the assessor(s).
• Cost for transcribing interviews.

Collection Survey:

Existing Staff
• TBM conservator (ideal).
• Conservator in a related specialization, collection manager, registrar, digital archivist, audiovisual specialist, or curator (dependent upon the individual level of expertise and scope of the survey).

External Consultants
• An experienced TBM conservator will be required to complete technical evaluations and condition assessments of TBM artworks if not available on staff.
• Artists, programmers, and others with specialized expertise may be consulted.

Time Commitment
• Time commitment will vary based upon the hiring of external contractors or utilization of existing staff members and the amount of time they are able to devote to the survey. Depending on the size of the collection and their availability, this may be concentrated over a week or weeks or extended over a longer period of time.
• Facilitators/liaisons across collecting departments may need to have overlapping availability with surveyor(s).
• Ample time should be devoted to writing and editing any final report, spreadsheets, or other summary documents and may extend weeks or months beyond the survey.

Material Resources
• Desk, table, or another workspace to examine TBM artwork materials and documentation.
• Access to a networked computer to examine digital documentation of TBM artworks and content management system.
• A forensic write blocker.
• Access to physical storage vaults and the digital storage environment.
• If evaluating the content of TBM artworks (“the media”), a viewing workstation with equipment able to play back analog and digital content.

Funding
• Compensation for the surveyor(s).
• Purchase of a write blocker or viewing equipment.


for grant applications and other funding opportunities. The application should emphasize how the assessment or survey will benefit the institution, its staff, and, where applicable, the public. Highlighting any in-kind contributions and including letters of support from consulting TBM conservators familiar with the collection and its needs may also strengthen the application.

4.4 Conducting a TBM Institutional Assessment

A TBM institutional assessment guides an organization toward developing strategies for improved care and display of TBM artworks. The assessment helps staff focus efforts on the immediate needs of the collection, such as obtaining master-quality files or backing up artist-provided materials, as well as achieve longer-term goals, such as establishing permanent staff positions, creating a TBM conservation lab, or building a digital repository. An institutional assessment consists of five phases:

• Phase 1: Information-gathering prior to the assessment.
• Phase 2: Interviews with key staff members.
• Phase 3: Examination of museum spaces, storage, and infrastructure.
• Phase 4: Crafting the final report.
• Phase 5: Follow-up.

4.4.1 Phase 1: Information-Gathering Prior to the Assessment

To establish a baseline for the assessment, the assessors will need to review the institution’s mission statement and established policies, digital storage practices, and staffing levels, roles, and responsibilities. To assist with this, the on-site liaison should provide the assessors with access to existing acquisition workflows, acquisition agreements and contracts, legal agreements, and artwork documentation templates. The assessors should meet with the institution’s TBM working group or current practitioners to review the state of TBM care and gauge staff members’ needs and interests. The perspectives and concerns of the institution’s stakeholders will influence the structure and focus of the interviews to be conducted during the on-site visit. Goals and deliverables for the assessment and the intended audience for the final report should be clearly defined at this stage.

4.4.2 Phase 2: Interviews with Key Staff Members

Confidential staff interviews, conducted by the assessors, form the core of the institutional assessment. Interviewees are most likely to be candid when speaking with an external assessor rather than an internal colleague. Interviewees may include members of the upper administration, curatorial, exhibition design, registrarial, collection management, legal, conservation, IT, and audiovisual departments. It is best to arrange a private, quiet space for one-on-one interviews so that staff members feel comfortable speaking freely about current TBM conservation practices and needs. The interviews should be recorded to assist in report writing and extracting quotes to support final recommendations. Interviewees should be assured of the confidentiality of these conversations. Interview topics may include the following:

• The staff member’s understanding of TBM artworks and related conservation issues.
• The nature and time commitment of each staff member’s involvement with TBM.
• What challenges or accomplishments the staff member experiences when working with TBM artworks.


• What policies and procedures govern their work.
• What TBM-related equipment is currently utilized or needed.
• The nature and level of collaboration with other departments.
• The budgeting and staffing needs in the staff member’s department.
• The need for any future improvements.

Depending on the department or person interviewed, the discussion may also cover acquisition, storage, documentation, exhibition, loan, and conservation intervention practices or needs.

4.4.3 Phase 3: Examination of Museum Spaces, Storage, and Infrastructure

During their time on-site, the assessors visit storage vaults to examine housing and environmental conditions for storage of tape media, external hard drives, and display equipment. They also visit conservation labs and exhibition spaces to learn more about the institution’s capabilities for inspecting, treating, and displaying TBM artworks. The assessors work together to inspect digital storage servers and infrastructure. The staff liaison walks them through digital preservation and cataloging workflows, detailing how digital files are transferred from artist-provided devices, how technical metadata is gathered and stored, file-naming conventions, and what digital preservation practices, such as fixity checks and file redundancy, are in place. The assessors also examine the institution’s content management system to see how media carriers, digital files, and other TBM artwork components are identified, described, and tracked by the collection’s registrars. If more time and resources are available, an expanded assessment may be conducted, incorporating a collection survey alongside the assessment during the on-site visit. For those with limited budgets or time constraints, a smaller, more focused assessment addressing just one aspect of TBM conservation, such as acquisition procedures or digital storage policies, can nonetheless be impactful and serve as an important step toward improving TBM practices and building support for additional resources or funding for a TBM contractor or staff position.
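Of the digital preservation practices named above, fixity checking is the most mechanical to verify: checksums recorded at ingest are periodically recomputed and compared against the stored files. The following Python sketch shows the principle; the manifest format (a simple mapping of file paths to SHA-256 checksums) is a hypothetical stand-in for whatever structure an institution’s repository actually uses.

```python
import hashlib
from pathlib import Path

def sha256_of(path, chunk_size=1 << 20):
    """Compute a SHA-256 checksum without loading the whole file into memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def fixity_check(manifest):
    """Compare recorded checksums against freshly computed ones.

    `manifest` maps file paths to checksums recorded at ingest; returns
    the paths that are missing or whose content has changed.
    """
    failures = []
    for path, recorded in manifest.items():
        if not Path(path).is_file() or sha256_of(path) != recorded:
            failures.append(path)
    return failures
```

An assessor does not need to run such a script; asking whether something equivalent runs on a schedule, and what happens when it reports a failure, is a concrete way to probe a repository’s preservation practices.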

4.4.4 Phase 4: Crafting the Final Institutional Assessment Report

The final report should include an executive summary for senior staff who may not read the entire report, an overview of the project, a description of the collection, a detailed evaluation of current practices as compared to best practices in TBM conservation, and short- and long-term recommendations, all written with input from staff interviews. When possible, institutional practices should be compared to industry standards. Quoting department heads from staff interviews in the final report can alert upper administration and board members to the broad support for the report’s recommendations. Quotes detailing staff experiences and challenges working with time-based media can be used to illustrate recommendations. If such quotes are used, permission must be obtained, and interviewees should have a chance to edit the quotes. Common threads from multiple conversations can be distilled to reflect concurrence. When formulating final report recommendations, the assessors should consider the capabilities and workload of the individuals who will be tasked with meeting the goals. Based on the staff interviews, they can help existing staff define roles and responsibilities for various stakeholders across the institution. Creating a chart of responsibilities to define roles for all stakeholders can be a useful tool for this discussion (see Table 4.3).

Table 4.3 TBM conservation roles and responsibilities. Chart created by Alexandra Nichols during the institutional assessment conducted at The Metropolitan Museum of Art from 2017 to 2018. © The Metropolitan Museum of Art.

Best Practices for The Met’s TBM Collection (Roles and Responsibilities)

Activity areas covered by the chart: Acquisition Paperwork; Artwork Intake; Documentation; Physical Storage; Digital Storage; Treatment; In-House Exhibitions; Outgoing Loans.

Conservation
• Determines deliverables; drafts Identity Report; identifies artwork risks.
• Processes delivery and intake of artwork; verifies deliverables; conducts condition assessments; assigns TMS components.
• Generates checksums and metadata reports; researches artwork; conducts artist interviews; attaches documentation to TMS record.
• Labels and rehouses all physical components; monitors T/RH for storage locations.
• Migrates digital files to the digital repository; monitors digital storage in collaboration with Digital and IS&T.
• Conducts conservation interventions; migrates artworks from at-risk formats to preservation formats.
• For in-house exhibitions: creates installation instructions; conducts condition assessments; assists with installation; advises exhibition design; completes Iteration Report.
• For outgoing loans: completes loan evaluations; conducts condition assessments; creates installation instructions; acts as courier/installer; completes Iteration Report.

Collections Management
• Processes artwork intake; documents deliverables; requests replacement deliverables.
• Retains a copy of all acquisition paperwork for the department files; manages component locations in TMS.
• Labels and rehouses all physical components.
• Oversees internal artwork moves; updates TMS artwork locations.

Registrar
• Sends acquisition documents to the artist or artist representative; creates temporary TMS record; creates final TMS record; arranges artwork delivery; assigns accession numbers.
• Generates outgoing loan paperwork; processes incoming and outgoing loans; oversees artwork shipping and insurance.
• Liaises with artists and borrowing institutions.

Curator
• Selects artworks for acquisition; initiates the acquisition process.
• Acts as liaison between the artist or gallery and Met staff.
• Researches artwork; conducts artist interviews.
• Selects artworks for exhibition; writes wall text and catalog.

Design
• Designs the exhibition; creates exhibition files; sources specialized equipment; installs artworks.

Digital
• Maintains the database for the digital repository.
• Creates exhibition files for gallery installations and outgoing loans.

Information Systems and Technology
• Maintains hardware for the digital repository; manages LTO tapes; conducts regular fixity checks.
• Assists the conservator with artwork migrations and treatment interventions when needed; collaborates with Conservation on changes to the digital repository.

Counsel’s Office
• Creates acquisition contracts; approves modifications to legal documents.
• Authorizes final payment to the seller only once all components pass condition assessment.
• Approves conservation interventions.

Lia Kramer et al.

The final report also should provide a plan for how the institution can address any backlog of TBM conservation actions. The assessors may recommend the establishment of temporary or permanent staff positions to apply newly established standards to the existing collection of artworks, train permanent staff in the updated practices, and address workloads for upcoming acquisitions, displays, and loans of TBM artworks. Underlining the ways in which a TBM conservator could relieve current staff of some of these responsibilities and improve institutional excellence can generate support from colleagues. Finally, a draft of the report should be reviewed by all stakeholders to ensure that all reporting is accurate and that all viewpoints are represented correctly. This practice not only increases the veracity of the report but also creates a sense of teamwork and collaboration in compiling the data that ideally will provide broad support and pave a positive path forward for the institution.

“For the assessment I completed at the Taipei Fine Arts Museum, there are two intended audiences for the final report—the first audience is the conservator at the museum who is directly in charge of the TBM collection, and the second audience is the museum head, as she will need to approve the adoption of any templates suggested in the report. The report consists of five sections. The first section briefly describes why the museum needs to address the preservation needs of the TBM collection. The second section provides an overall review of the status and condition of TFAM’s TBM collection. The third section focuses on the long-term preservation requirements for the digital and physical artwork components, as well as the intangible elements based on the assessment. The fourth chapter gives suggestions for how the staff members caring for these artworks can work cross-departmentally. Although this might be difficult depending on the structure of the museum, I think having a committee from different departments is very important for the care of these artworks. In the fifth section, I provide data about the TFAM collection, especially indicating artworks which are at high risk as potential candidates. Since TFAM is in the middle of a construction project, I will provide some suggestions for how the museum’s new building can be utilized to care for TBM artworks. I will also hold two lectures for museum staff—one is to teach staff the characteristics of TBM works and how to describe TBM works for collections registration, and the second which goes over the very basic risks and threats to the preservation of TBM collections from assessment, documenting, iteration reporting to digital file preservation.” —Yuhsien Chen, Freelance Time-based Media Conservator, Taipei, Taiwan

4.4.5 Phase 5: Follow-up Visit

Approximately six to twelve months after the dissemination of the final report, the assessor should arrange one or more follow-up visits—either virtual meetings or on-site visits—to check in with staff members and evaluate the institution’s progress toward its goals, as well as the effectiveness of the assessment and report. This follow-up provides an opportunity for staff members to talk about any challenges they may have faced, ask questions, solicit advice for moving forward, and celebrate the progress made so far. It also provides feedback to the assessor to improve their practices when consulting with other institutions in the future. Funding for this visit should be built into the original budget.

Assessments and Surveys

“We have a small, but growing collection of approximately 25 TBM works. I joined the museum in 2018 as the Objects Conservator, and in my initial walk-throughs of the galleries, I could tell that some of the TBM works on display had issues. There had been previous interest in establishing a lab, but ultimately it wasn’t practical for such a small collection without resources for indefinite funding. It seemed like the best strategy was to hire a consultant with a specialization in time-based media. The consultant we hired was contracted to perform an assessment and survey and provide us with recommendations for next steps, including the basics of cataloging and documentation.

“In preparation for the consultant’s visit, I met with the newly established TBM working group which includes staff from IT, library and archives, database management, registration, curatorial, and collections management, and explained the goals of the assessment. The TBM consultant also held a meeting with this group at the start of his week-long visit. This helped everyone feel involved and that their input was valued, and established trust with the consultant. He and I later held individual meetings with stakeholders to answer questions and explain findings. These meetings helped get us all on the same page, which was one of the main goals for the project. The consultant helped us understand roles and responsibilities for each department moving forward.

“The working group allowed me to strengthen cross-departmental relationships and build buy-in. One of the most fruitful activities in terms of engaging everyone in the group was bringing members into collections storage to examine physical components of artwork with me—this was of particular interest to those who did not typically work directly with art objects. Another rewarding experience was working alongside an IT colleague in collections storage to examine and understand the behavior of a computer-based work.
“The consultant’s final report served as a roadmap, helping us recognize what we had accomplished, what improvements we needed to make, and provided recommendations for how to achieve them. We have come to see the advantages of working in a small institution—we can work collaboratively with little bureaucracy—and we now understand how to use our strengths and resources. “One of the biggest concerns for the collection was that we did not have master materials for several of the artworks. We now have a better idea of what to ask for and have the confidence to reach out to artists and galleries to obtain these materials retroactively and for new acquisitions. The assessment and survey have given us new strength to move forward. We hope to bring on short-term or project-based consultants again in the future who can be dedicated to TBM, and we are now more informed and can submit stronger grant applications to achieve this.” —Liz Homberger, Objects Conservator, Detroit Institute of Arts

4.5 Conducting a TBM Collection Survey

Depending on the size of the collection and the extent of the survey, several types of documents may be needed to clearly and adequately record and describe survey findings. Individual object reports can be created to describe detailed findings for each artwork and can be written in prose for greater clarity. A spreadsheet is an efficient way to record quantitative data such as the number of tapes of a specific format that are present for each artwork, or a score indicating an artwork’s priority for treatment. Creating “yes” and “no” columns for completed documents or actions is key for turning findings into easily communicated percentages and visuals. If a scoring system is used, consider color-coding these score assignments within the spreadsheet. Using qualitative data to highlight statistics along with visualizations, such as graphs or pie charts, will concisely drive home the conclusions of the final report—for example, the precariousness of the collection resulting from a lack of dedicated TBM staff. The format of any spreadsheets and supplementary documents should be reviewed by the institution’s catalogers, registrars, and conservators in advance to ensure that the data can be merged with existing collection management databases. Establishing controlled vocabularies or adopting those created by the institution prior to embarking on the survey will provide consistency in findings and will aid when drawing statistics for the final report. Preparing a survey guide can improve the results by ensuring that the surveyor understands the intended scale and depth of the evaluation and adheres to it consistently. The guide should instruct the surveyor on how to populate designated spreadsheets or templates and assign scores to artworks.
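The tabulation step described above, turning “yes”/“no” survey columns into report-ready percentages, can be sketched in a few lines of Python. The artwork records and column names below are invented examples, not a prescribed schema.

```python
# Sketch: converting "yes"/"no" survey columns into percentages for the
# final report. Records and column names are hypothetical examples.

def coverage_stats(records, columns):
    """Return the percentage of artworks answering 'yes' for each column."""
    stats = {}
    for col in columns:
        yes = sum(1 for r in records if r.get(col, "no").lower() == "yes")
        stats[col] = round(100 * yes / len(records), 1)
    return stats

survey = [
    {"title": "Untitled (video)", "has_master": "yes", "has_install_docs": "no"},
    {"title": "Slide piece",      "has_master": "no",  "has_install_docs": "no"},
    {"title": "Sound work",       "has_master": "yes", "has_install_docs": "yes"},
]

stats = coverage_stats(survey, ["has_master", "has_install_docs"])
# -> {'has_master': 66.7, 'has_install_docs': 33.3}
```

Figures like these feed directly into the graphs and pie charts mentioned above.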

Case Study 4.1  Recording your Survey Findings Creating a template for clear and concise reporting of survey findings is strongly recommended. In the 2017–18 TBM collection survey at The Metropolitan Museum of Art (Kramer et al. 2021), a summary report, reproduced in Appendix 1, was used to describe the findings for each artwork. The template describes format inventory, risk assessment, component checks, documentation review, and storage assessment, as well as recommendations for treatment, display and research needs, conservation goals, and assigning conservation priorities. This information complemented the concurrent institutional assessment, as it helped underline which policies were already being implemented, what would need to be retroactively addressed, and what still needed improvement. Overview summaries were included in the forms as a way to succinctly convey the artwork’s display and components for individuals that may not have technical backgrounds and would, in the future, be integrated into the Met’s record management system. Data also was recorded in a spreadsheet in order to later collate data for the final report. Recognizing that these individual reports would be unwieldy for collections managers, a ‘quick-reference’ table was created to highlight the most significant findings for each artwork. These summaries, along with individual reports, were distributed to the respective collection departments for their later reference. While the extent of this particular survey may be out of scope for collections with fewer resources, it presents a wide spectrum of possible survey options.

There are several types of TBM collection surveys, which will require different expertise and resources and may be used in any combination to suit the collection’s needs:

4.5.1 Format Inventory and Identification of Migration Needs

High-level surveys such as format inventories or evaluations of digitization needs can be effectively executed with limited resources. These types of surveys can, in some cases, be collaborative efforts. Registrars, collection managers, and conservators in related specialisms may work together with a digital archivist or IT specialist—for example, if a TBM conservation specialist is not available. The surveyor(s) will need to be able to identify physical and digital media formats and should have an understanding of acceptable master formats, archival standards, and obsolescence risks. There are several resources available to aid in the identification of physical media carriers. Analog and digital tape formats are described in glossaries published by Electronic Arts Intermix (EAI 2013), the Texas Commission on the Arts (Jimenez and Platt 2004), and the American Institute for Conservation (Vitale and Messier 2007).
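As a minimal illustration of how a format inventory might be tallied, the sketch below maps carrier names to risk levels. The lookup entries are illustrative placeholders; real risk assessments should draw on the glossaries cited above.

```python
# Sketch: a minimal format-inventory tally. The lookup table is illustrative
# only and does not replace format-specific research.

CARRIER_NOTES = {
    "U-matic":    {"type": "analog video tape",  "risk": "high"},
    "Betacam SP": {"type": "analog video tape",  "risk": "high"},
    "MiniDV":     {"type": "digital video tape", "risk": "medium"},
    "LTO-8":      {"type": "data tape",          "risk": "low"},
}

def tally_by_risk(inventory):
    """Count carriers per risk level for the survey spreadsheet."""
    tally = {}
    for fmt in inventory:
        risk = CARRIER_NOTES.get(fmt, {"risk": "unknown"})["risk"]
        tally[risk] = tally.get(risk, 0) + 1
    return tally

tally = tally_by_risk(["U-matic", "U-matic", "MiniDV", "LTO-8", "DAT"])
```

A tally like this quickly shows how much of the collection sits on high-risk carriers and where migration should begin.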

4.5.2 Storage and Housing Survey

A storage and housing survey addresses the housing materials and physical environment used to store media carriers and dedicated or artist-modified equipment. This may include indicating whether hard drives and USB flash drives are stored in anti-static bags to prevent them from being affected by electrical discharge, noting whether artwork components are stored in conservation-grade housing materials, ensuring magnetic tapes and films are stored in the correct orientation, and obtaining data on temperature and relative humidity in storage spaces (see fig. 4.2). The Image Permanence Institute’s Media Storage Quick Reference, second edition (Adelstein 2009), describes degradation mechanisms and proper storage conditions for various media carriers (see also Chapter 15).

Figure 4.2 From left to right: Met Collection Manager Catherine Burns, Lorena Ramírez López, and Lia Kramer evaluate film materials stored in The Met’s cool storage. Image: Naoise Dunne, © The Metropolitan Museum of Art
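The environmental portion of a storage survey can be partially automated. In the sketch below, the temperature and RH set points are placeholders only, since appropriate ranges vary by carrier type; consult the IPI Media Storage Quick Reference for actual recommendations.

```python
# Sketch: flagging storage readings outside a target range. The set points
# are placeholders, not recommended values for any specific carrier.

def out_of_range(readings, t_range=(8.0, 16.0), rh_range=(30.0, 50.0)):
    """Return readings whose temperature (deg C) or RH (%) is out of range."""
    flagged = []
    for r in readings:
        temp_ok = t_range[0] <= r["temp_c"] <= t_range[1]
        rh_ok = rh_range[0] <= r["rh"] <= rh_range[1]
        if not (temp_ok and rh_ok):
            flagged.append(r)
    return flagged

log = [
    {"room": "cool storage",    "temp_c": 12.0, "rh": 35.0},
    {"room": "gallery closet",  "temp_c": 24.5, "rh": 62.0},
]
flagged = out_of_range(log)  # only the gallery closet is flagged
```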



4.5.3 Documentation Survey

Holding and compiling relevant documentation for TBM works is crucial for their preservation. A simple “yes” or “no” answer can be entered into a spreadsheet to indicate the kind of documentation available. This practice is helpful when tabulating statistics. The resulting data can help identify the level of care the TBM artworks are receiving and highlight any examination and documentation protocols that can be improved. Documentation categories to survey may include acquisition paperwork, technical specifications, file metadata, exhibition parameters, display histories, installation guides, floor plans, artist interview transcripts, video documentation of performances, and other records describing past treatments or displays, such as Identity Reports, Iteration Reports, condition assessment reports, conservation treatment reports, and display specifications. Templates for documenting TBM collections are available online through organizations such as Matters in Media Art (Matters in Media Art 2015) and on the websites of several collecting institutions, including the Solomon R. Guggenheim Museum (Guggenheim Museum—Time-Based Media 2011; Guggenheim Museum—CCBA n.d.), The Metropolitan Museum of Art (Metropolitan Museum of Art—Time-Based Media Working Group n.d.), and the Smithsonian Institution (Smithsonian Institution—Preserving and Collecting Time-based Media & Digital Art n.d.).

4.5.4 Equipment Inventory

An equipment survey will assess the condition of artist-modified and dedicated display equipment as well as the availability and condition of non-dedicated obsolete playback equipment at the institution. Creating an up-to-date record of available display equipment is an essential step in establishing a shared audiovisual playback and display equipment pool. The inventory can support efforts to stockpile obsolete equipment, source backups for dedicated or custom equipment, or outfit a lab with format-specific monitors or decks to view content (e.g., from analog and early digital media carriers) more efficiently. Record all brand names and models for the equipment inventory in a spreadsheet along with their current location, noting which items are unique and artwork-specific and which could be utilized as a part of a general equipment pool. Be sure to obtain user guides and service manuals for all equipment whenever possible (see Chapter 15).

4.5.5 Registration and Cataloging

A registration and cataloging review for TBM artworks verifies that the institution owns all the components that are necessary to preserve and display the works and that these components are consistently cataloged in the institution’s collection management system (ideally with controlled vocabularies) and tracked with locations (physical storage as well as server locations). The surveyor should verify that the institution has collected master and exhibition materials for all artworks and that any artwork files and formats received match the artwork description and acquisition deliverables. Checksums and metadata reports, such as those generated by MediaInfo (Pozdeev and Mediainfo 2021) or ExifTool (Harvey 2021), can help to check formats and digital files against the expected deliverables (see Chapter 18).
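The deliverables check described above can be sketched as follows. This is not a prescribed workflow: the filenames and helper names are invented, and in practice checksums would be computed over files on disk rather than in-memory bytes.

```python
# Sketch: verifying received files against an expected deliverables list
# and artist-supplied checksums. Filenames and content are invented.
import hashlib

def sha256_of(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def verify_deliverables(received, expected):
    """received: {filename: bytes}; expected: {filename: sha256 hex digest}.
    Returns (missing files, files whose checksum does not match)."""
    missing = sorted(set(expected) - set(received))
    mismatched = [name for name, digest in expected.items()
                  if name in received and sha256_of(received[name]) != digest]
    return missing, mismatched

received = {"master_v1.mov": b"fake video payload"}
expected = {
    "master_v1.mov": sha256_of(b"fake video payload"),
    "exhibition_copy.mp4": "0" * 64,
}
missing, mismatched = verify_deliverables(received, expected)
# missing == ['exhibition_copy.mp4']; mismatched == []
```

The same comparison logic applies whether the digests come from an artist’s manifest, MediaInfo reports, or a previous survey.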

“In the last 3–4 months we have been working with our Databases team, using TMS’s “flex fields” to record each video file’s technical metadata within the object record. We’re using Mediainfo to automatically extract this information out from each of the files on our servers, and then we export this into a CSV file, which our database team harvests once a month. They created a script that inserts different metadata fields into the “flex fields” on TMS. We’re looking at applying this process to still images as well. This was very much an interdepartmental project.” —Aga Wielocha, Conservator, Preventive, and David Smith, Conservator, Time-Based Media, M+ Museum, Hong Kong
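The collation step of a workflow like the one described in the quote, where technical metadata rows are written to a CSV file for periodic harvest into the collection database, might look like the sketch below. The metadata here is hard-coded; in practice it would be extracted with a tool such as MediaInfo, and the field names are illustrative rather than any institution’s actual schema.

```python
# Sketch: writing technical metadata rows to CSV for database harvest.
# Field names and the sample record are illustrative placeholders.
import csv
import io

FIELDS = ["accession_no", "filename", "codec", "resolution", "duration_s"]

rows = [
    {"accession_no": "2021.001", "filename": "master.mov",
     "codec": "ProRes 422 HQ", "resolution": "1920x1080", "duration_s": 613},
]

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=FIELDS)
writer.writeheader()
writer.writerows(rows)
csv_text = buf.getvalue()  # hand this file off to the database team
```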

Figure 4.3 Sample media production diagram for a single-channel video artwork, used to describe relationships between artwork components. Source: Diagram created by Alexandra Nichols using the MindGenius software

Recording the relationships from the source material (e.g., artist’s masters) to derivatives (e.g., migrated materials, exhibition copies, and viewing copies) is vital to monitoring an artwork’s history and integrity. This so-called chain of custody information may be recorded in the institution’s collection management system, text documents, or visual aids, such as media production diagrams (see fig. 4.3).
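A chain of custody like the one diagrammed in fig. 4.3 can be captured in even a very simple data structure; the component names below are invented examples.

```python
# Sketch: source-to-derivative relationships as a simple tree, mirroring a
# media production diagram. Component names are invented.

derivatives = {
    "artist_master.mov": ["preservation_master.mkv"],
    "preservation_master.mkv": ["exhibition_copy.mp4", "viewing_copy.mp4"],
}

def lineage(component, tree):
    """Return the component and every derivative descended from it."""
    out = [component]
    for child in tree.get(component, []):
        out.extend(lineage(child, tree))
    return out

chain = lineage("artist_master.mov", derivatives)
```

Even this minimal structure lets the surveyor answer the key question: which files descend from the artist’s master, and which exhibition copies would need to be regenerated after a migration.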

4.5.6 Digital Storage

Digital storage should be evaluated in consultation with a digital archivist, member of the IT department, or some other digital specialist who is knowledgeable about digital preservation practices for cultural heritage institutions. Digital artwork components should not be stored within individual staff members’ personal file folders or storage devices but on a secure artwork server. The surveyor should evaluate current server space capabilities, redundancy and backup protocols, file fixity practices, and access and security regulations. Adherence to digital storage best practices can be evaluated with the aid of tools such as the National Digital Stewardship Alliance’s Levels of Digital Preservation (NDSA 2020), the National Information Standards Organization’s Framework of Guidance for Building Good Digital Collections (NISO 2007), and the Digital Preservation Coalition’s Rapid Assessment Model (DPC 2021). For detailed information, see Chapter 8 on digital storage and Chapter 13 on digital preservation and the information package.
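A basic fixity audit, one of the practices the NDSA Levels ask about, can be sketched as below. The paths and contents are in-memory stand-ins for files on an artwork server, and the function name is invented for illustration.

```python
# Sketch: periodic fixity check against a stored checksum manifest.
# Paths and contents are in-memory stand-ins for files on a server.
import hashlib

def audit(manifest, store):
    """manifest: {path: expected sha256 hex digest}; store: {path: bytes}.
    Returns paths that are missing or whose content has changed."""
    failures = []
    for path, expected in manifest.items():
        data = store.get(path)
        if data is None or hashlib.sha256(data).hexdigest() != expected:
            failures.append(path)
    return sorted(failures)

store = {"works/2019.4/master.mkv": b"bitstream"}
manifest = {
    "works/2019.4/master.mkv": hashlib.sha256(b"bitstream").hexdigest(),
    "works/2019.4/subtitles.srt": hashlib.sha256(b"gone").hexdigest(),
}
failures = audit(manifest, store)  # the missing subtitle file fails
```

Running a check like this on a schedule, and logging the results, is the substance of a fixity practice; the surveyor is asking whether anything equivalent exists.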

4.5.7 Condition Assessment

Accessing and examining the content of analog tapes, film components, or artwork computers will require specialized equipment and expertise, which may necessitate consultation with a third-party lab or specialist. Part IV in this book offers detailed information on conducting condition assessments on a variety of mediums (see Chapters 18–22). When connecting artist-provided hard drives and flash drives to a computer, a forensic write blocker (e.g., a Tableau T8u write blocker) should be used to guard against accidental deletion, modification, or corruption (see Chapter 14.4). Digital files may then be tested by viewing a copy in multiple players or editing software, such as VLC, QuickTime, or Adobe Premiere, to confirm, at a minimum, that they are not corrupted. The content may be spot-checked for anomalies but should be viewed in its entirety when possible.

4.5.8 Treatment Prioritization

Collection surveys bring attention to particularly vulnerable works and create an opportunity to advocate for necessary action. When conducting the survey types outlined earlier, certain artworks will begin to emerge as particularly in need of treatment. Artworks in this category may have aging or deteriorating carriers, unique analog or digital media with no copies, a lack of documentation, or insufficient storage. Significant risk is introduced when any one of these conditions is present. If the artwork has not been properly acquired and stored, artwork materials may be impossible to recover should the artist or their representatives be unwilling or unable to provide replacements, while a lack of display documentation threatens the accuracy and authenticity of future installations. A scoring or stoplight color-coding system applied to each artwork will quickly communicate a holistic picture of the state of the collection.

“Media carriers were the main means by which we held our collection, and I knew that the ones in our collection would be past their acceptable life span. What I wanted to instigate was a traffic light system based on the life expectancy of the carriers and how long the work had been in the collection. We created a key, so we had green/low risk—stable or stored according to best practice, which I don’t think anything is—yellow/medium risk, which was partial or total loss imminent within the next five to 10 years, and red/high risk, which is the media carrier’s lifespan has been exceeded, and there was a risk of loss of part or all the artwork and action is required now. And what I expected is what we got, which is largely a sea of red. The aim is that this will go to our leadership team and it’s a very visual, very easy way of them seeing how high risk our collection is.” —Kirsten Dunne, Senior Projects Conservator, National Galleries of Scotland
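A traffic-light rubric like the one described in the quote above can be reduced to a tiny function. The 50% threshold and the lifespan figures in the example are placeholders; real ratings should draw on carrier-specific life expectancy data and storage conditions.

```python
# Sketch: traffic-light risk rating from carrier age vs. expected lifespan.
# Thresholds and lifespan figures are placeholders, not recommendations.

def risk_rating(age_years: float, expected_lifespan_years: float) -> str:
    ratio = age_years / expected_lifespan_years
    if ratio >= 1.0:
        return "red"     # lifespan exceeded: risk of loss, action required now
    if ratio >= 0.5:
        return "yellow"  # partial or total loss plausible in coming years
    return "green"       # comparatively stable for now

ratings = {fmt: risk_rating(age, life) for fmt, age, life in [
    ("U-matic tape", 38, 30),
    ("DVD-R", 12, 20),
    ("LTO-8", 2, 15),
]}
```

Mapping each rating to a cell color in the survey spreadsheet produces exactly the kind of at-a-glance “sea of red” overview described above.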

4.5.9 Crafting the Final Collections Survey Report

Methods for describing findings in the final report should be structured with the final audience and next steps in mind. Who will be able to address the issues identified in the survey? A conservator may want the greatest possible detail about the condition of the artworks in the collection and what documentation is present; a curator or exhibition designer may want exhibition guidelines; a collection manager may want information about storage conditions for physical components; a registrar may need a summary of cataloging and location needs; an IT specialist will want to know about digital storage needs; a member of the upper administration may only want a summary of the institution’s capabilities and needs, along with projected costs for addressing them. Whatever the approach, gathering statistics of what work is yet to be completed provides support for arguments for funding a TBM staff position or contract work to address these issues. The surveyor should make note of any artworks that are particularly vulnerable or bring attention to aspects of TBM stewardship that fall outside of the expertise of current staff members. Worst-case scenarios can have a silver lining here as they add urgency and provide excellent leverage for improving preservation practices. The final report should include an executive summary and a description of the collection holdings. It should provide a clear outline of short- and long-term steps that can be taken to improve any areas that were found to be lacking. Urgent tasks should be featured. Steps may include reaching out to artists or galleries to gather files or installation instructions, generating documentation, improving cataloging practices, selecting works for digitization or migration, sourcing playback or display equipment, and overhauling digital storage procedures.

“I was initially employed by The Art Gallery of New South Wales, Australia in 2015 on a 6-month part-time contract to undertake a collection survey of the time-based art (TBA) collection and thereby identify gaps and areas for development in collection management and preservation activities. From the perspective of Head of Conservation Carolyn Murphy, the collection survey project served an additional purpose: to develop TBA conservation as a specialisation to address collection needs and following from that to advocate for the need for a permanent role in TBA conservation. Upon reflection, I did not anticipate that as a TBA conservator I would need to acquire advanced skills in advocacy. Moreover, what I presumed would be a short-term collection survey project actually turned into a complex five-year change management journey for the institution.

“Our approach was to break down what was held in the gallery’s contemporary art collections in order to identify where the inconsistencies were in terms of which artworks were and should be classified as TBA. Once the gallery stakeholders agreed on a definition for ‘time-based art’, I was able to classify 40 additional artworks into the TBA collection. Once that first step was completed, we turned our attention to the Vernon collections database, for the purposes of reviewing terminology and the descriptors the gallery used to manage TBA. A small cross-disciplinary team across curatorial, conservation and collection documentation developed standard terminology for the collection and an approach to determine the integral, variable and auxiliary components for an artwork. This was the most complex and, in hindsight, crucial aspect of the project. The magnitude of that task meant that it was easily three months into the survey before I even touched a single artwork!
“As another part of the survey, I investigated the analogue and digital components related to each artwork to ascertain whether the gallery had acquired the work in a master format. Through research and forensic data checks we sought to obtain clarity around what we originally received from the artist, versus what formats had been created at a later date. Specifically, we asked questions like: for how many works in the collection do we only have compressed DVDs, which are not considered a master version? And how many works in the collection currently require digitization or a migration intervention? This part of the survey helped to establish the gallery’s digital storage needs and prioritise funding for the digitisation of analogue carriers. I also gathered data on the documentation related to each artwork. This helped me to identify how many artworks required artist consultation because we didn’t have installation documentation and raised questions as to what the gallery considered standard documentation for TBA.


“Due to the fact that the survey addressed all aspects of TBA—from artwork concept through to media carriers—the results, in conjunction with strong advocacy from Carolyn, supported further project development over the period of 2015–2020. The survey data formed the basis for several secondary conservation survey projects, which created more opportunities for temporary TBA conservation roles to be created. My role was generously supported by gallery benefactors until it was made permanent in July 2018.

“An artist documentation survey followed the initial collection survey. The core TBA project group worked together to establish the base-level documentation requirements including artist created/artist approved installation instructions, a certificate of authenticity, a list of technical specifications etc. Once we had established all of the things we needed, we then surveyed how many works in the collection had this base level of documentation, and how many artworks would require artist/artist studio consultation. Lisa Catt, Curator of International Art, and myself then contacted each artist/artist studio seeking to obtain outstanding material related to an artwork in the collection, including master artwork formats. This was no easy feat as with some artworks 10+ years had passed since acquiring the artwork. We undertook this legacy work at the same time the gallery acquired artworks using our newly established standards.

“The next major survey was undertaken in 2018/2019 to address physical media carrier re-housing and storage, ensuring that all TBA had the new cataloguing system applied. Due to the time-consuming nature of this work, it wasn’t until my colleague Rebecca Barnott-Clement, TBA conservator, began a project-based role with me in early 2018 that we were able to make significant progress on this area of collection care. Rebecca finalised the rehousing, relabeling, and cataloguing of the time-based art collection and conducted a thorough inventory of the collection.

“The most recent major survey is currently being undertaken, which relates to obsolete legacy equipment essential for the functionality and display of many artworks. Project funding through benefaction was obtained for a TBA technician to undertake a survey of all of the artworks in the collection that rely on obsolete equipment. In the same way that the 2015 survey eventually secured the full-time ongoing role of senior TBA conservator, it is my hope that the equipment survey will secure the role of a TBA technician within conservation.

“The survey data has informed all of our procedures and workflows and in a sense, the collection survey has never quite ended, because, as we discuss in the journal paper, “What is the Object?” (Sherring et al. 2018), every time an artwork goes on display, is acquired, or is loaned, TBA conservators use this as our moment to review what we did not receive at acquisition and is still outstanding—master files, documentation, artist interview—and engage the curator and artist in conversation in an attempt to ensure that the artwork can be preserved and displayed in the future.” —Asti Sherring, Doctoral Research Candidate, University of Canberra, Australia, and Former Senior Time-Based Art Conservator, Art Gallery of New South Wales, Sydney, Australia



4.6 Next Steps

After an assessment or survey has been completed, a TBM working group meeting is an excellent venue to present the findings and recommendations of the final report, and additional one-on-one presentations should be scheduled with department heads and/or upper administration if possible. In assessing how the institution can implement the report’s recommendations, consider which actions may be taken on by existing staff and what will require external specialists. While some institutions will be able to hire a full-time TBM conservator to take on responsibility for these artworks, individual tasks or projects may alternatively be addressed by contracting with TBM conservators in private practice. As depicted in Table 4.3, a chart outlining department roles and responsibilities can help to define tasks and new workflows. When tasks are distributed across departments, it can be helpful to communicate how this engagement is directly improving the state of the collection and how these policy changes can relieve future workload strain. For example, conducting detailed documentation and condition assessments as part of the acquisition process is often time-consuming, but this ensures that the right materials are obtained and that these artworks are able to be readily displayed in exhibitions.

“I think having dedicated time is more important than having dedicated funding. Taking time to engage with colleagues to get access to the information you need. When I was trying to collect information about some of the artworks, we have people who’ve been here for 20 years who may have actually been installing it or even just remember it being on view. And if there was no record of it on view, but somebody has a memory of it, that’s pretty important, because there may not be any record of the exhibition or installation in our management system. I think I received support from staff members because when I tapped into their knowledge, it provided an opportunity for them to be engaged and see that they were contributing incredibly valuable information. Now I can go to those people and continue to ask questions about other artworks.” —Kristin MacDonough, Assistant Conservator of Media, Art Institute of Chicago

When advocating for a TBM conservator or contractor on a permanent or longer-term basis, it can help not only to look at the identified workloads but also to evaluate the acquisition pattern for TBM objects in the collection. The urgency for action will be amplified if TBM artworks are being collected at an accelerated rate or occupy a significant percentage of the collection (see Chapter 6). For smaller collections with only a few TBM works and infrequent acquisitions, it may be more cost-effective to rely on contract TBM conservators and consultants rather than establish a permanent position. One option is to raise funds for a limited-term TBM conservator in order to address the most pressing needs of the collection and then bring on consultants to assist with special projects and exhibitions as needed (see Chapter 9). Similarly, a small TBM artwork collection may not justify the purchase of a full inventory of obsolete viewing equipment if occasions for use are rare and no one on staff has the expertise to maintain it. Such tasks are better outsourced as the need arises.


Lia Kramer et al.

“I’ve been at SFMOMA since ’98 and was here for quite a while before we had a media conservator, and there are parallels to this earlier time now that the position is vacant. So how do you build capacity within a staff where everyone’s workload is already maxed? The way we had done it was to bundle the needs with existing activity that is already planned. You can build an activity into your loan, you can incorporate it into your onsite exhibition. Or it can be around an accession or an artist who is coming to town for some other reason, or a donor who is gifting a media artwork. It’s during these opportunities when you can advocate for a better master or additional research. Building up capacity and opportunities around existing activity can often support bringing in a consultant or contractor to address collections care during that time period.

“When I joined Team Media [SFMOMA’s TBM working group], there was ongoing advocacy for a media conservator. There was a little resistance, and I think that some people may have felt a little bit threatened. But the truth is, it’s one of those things that if you haven’t seen it in action, it’s very hard to understand or visualize what it could be. Once we hired Martina Haidvogl, our first media conservator, everybody said, ‘Oh, how did we exist without this?’ I’m sure that other institutions where people are trying to advocate for a media conservator experience the same thing; it can be hard to get support when people don’t quite understand or visualize what it looks like. If you have these opportunities, these cumulative one-offs for incorporating media expertise into an existing museum activity, then everybody can begin to visualize what it looks like to have a media conservator. It’s a way of building capacity and expectation and understanding for what somebody can do, what skills they can bring to care for the collection.” —Michelle Barger, Head of Conservation, San Francisco Museum of Modern Art

Keep track of staff accomplishments and improvements throughout the assessment or survey process; this will help reassure staff members that their actions have improved the collection and will also be important in reporting to funders or for demonstrating institutional buy-in for future grant proposals. Any and all steps taken toward improving TBM conservation practices should be considered a success.

4.7 Conclusion

This chapter provides a roadmap for those who are developing policies and procedures for TBM collections care. Though addressing the needs of a TBM collection may initially feel like a series of insurmountable hurdles, the successful completion of an assessment or survey helps make this process more manageable by taking stock of the contents of the collection, identifying an institution’s unique needs, and laying out clear steps to pursue. Even if one is unfamiliar with TBM artworks, their care centers on the core principles of preventive conservation used for other artwork mediums. It is important to keep in mind that one’s skill set, foundational ethics, and experience caring for more traditional artworks may often be applied to the care of TBM artworks, despite their use of obsolescence-threatened technologies, display variability, intangible conceptual aspects, and unique storage and documentation needs. Once improvements are identified, they cannot be implemented and managed by a single individual or department. TBM artworks require true collaborations in care, with working groups forming the backbone of TBM conservation initiatives. It is in these working groups that one can build alliances and make use of internal resources to jointly achieve the preservation goals of an institution. Indeed, these interpersonal networks can be the strongest driving force in advancing conservation practices as well as in gaining buy-in from upper administration that may not otherwise be aware of the level of care required for TBM collections. It is our hope that the range of individuals quoted throughout this chapter can serve as a secondary support system and expanded network for consultation and discussion. Through collaboration, communication, and teamwork, these complex and engaging artworks can be preserved at any level of resources.

Bibliography

Adelstein, Peter. IPI Media Storage Quick Reference, 2nd ed. Image Permanence Institute, 2009. https://s3.cad.rit.edu/ipi-assets/publications/msqr.pdf.
American Museum of Natural History (AMNH). “Collection Surveys: Museum Collections.” n.d. Accessed August 14, 2020. www.amnh.org/research/natural-science-collections-conservation/general-conservation/documentation/collection-surveys.
Ashley-Smith, Jonathan. Risk Assessment for Object Conservation. London: Routledge, 2013. https://doi.org/10.4324/9780080938523.
Avrami, Erica, Kathleen Dardes, Marta de la Torre, Samuel Y. Harris, Michael Henry, and Wendy Claire Jessup. The Conservation Assessment: A Proposed Model for Evaluating Museum Environmental Management Needs. 1999. www.getty.edu/conservation/publications_resources/pdf_publications/evaluating_museum_environmental_mngmnt.html.
Berrett, Kory. “Conservation Surveys: Ethical Issues and Standards.” Journal of the American Institute for Conservation 33, no. 2 (January 1994): 193–98. https://doi.org/10.1179/019713694806124829.
Brokerhoff, Agnes, Bart Ankersmit, and Frank Ligterink. “Risk Management for Collections.” Cultural Heritage Agency of the Netherlands, 2017. https://english.cultureelerfgoed.nl/binaries/cultureelerfgoed-en/documents/publications/2017/01/01/risk-management-for-collections/risk-management-for-collections_a.pdf.
Canadian Conservation Institute (CCI). “Agents of Deterioration.” 2017. www.canada.ca/en/conservation-institute/services/agents-deterioration.html.
Carter, D. J., and K. A. Walker. “Chapter 9: Policies and Procedures.” In Care and Conservation of Natural History Collections, 177–92. Oxford: Butterworth Heinemann, 1999.
Council for Museums, Archives, and Libraries (CMAL). Benchmarks in Collection Care for Museums, Archives, and Libraries, Version 2.1. Edited by Alex Dawson. London: Collections Trust, 2018.
Digital Preservation Coalition. “DPC Rapid Assessment Model.” 2021. www.dpconline.org/digipres/dpc-ram.
Electronic Arts Intermix (EAI). “Formats.” 2013. www.eai.org/resourceguide/formats.html.
Favret, C., K. S. Cummings, R. J. McGinley, E. J. Heske, K. P. Johnson, C. A. Phillips, L. R. Phillippe, M. E. Retzer, C. A. Taylor, and M. J. Wetzel. “Profiling Natural History Collections: A Method for Quantitative and Comparative Health Assessment.” Collection Forum 1–2 (2007): 53–65.
Foundation for the American Institute of Conservation (FAIC). “Collections Assessment for Preservation Assessor Handbook.” 2019. www.culturalheritage.org/docs/default-source/resources/CAP/cap-assessor-handbook-2019.pdf?sfvrsn=8.
Guggenheim Museum. “Time-Based Media.” 2011. www.guggenheim.org/conservation/time-based-media.
———. “The Conserving Computer-Based Art Initiative (CCBA).” n.d. Accessed September 2, 2020. www.guggenheim.org/conservation/the-conserving-computer-based-art-initiative.
Harvey, Phil. ExifTool (software), version 12.32. macOS, Windows, Unix systems, 2021. https://exiftool.org/.
Henderson, Jane. “Collection Condition Surveys.” In Working with Independent Conservators: Guidelines for Good Practice. London: Museums and Galleries Commission, 2000.
Jimenez, Mona, and Liss Platt. “Videotape Identification and Assessment Guide.” Texas Commission on the Arts, 2004. www.arts.texas.gov/wp-content/uploads/2012/04/video.pdf.


Kramer, Lia, Alexandra Nichols, Mollie Anderson, Nora Kennedy, Lorena Ramírez-López, and Glenn Wharton. “Conducting a Time-Based Media Conservation Assessment and Survey at the Metropolitan Museum of Art.” Journal of the American Institute for Conservation (2021). https://doi.org/10.1080/01971360.2020.1855866.
Matters in Media Art. “Matters in Media Art (Website).” 2015. http://mattersinmediaart.org/.
Metropolitan Museum of Art. “Time-Based Media Working Group.” n.d. Accessed December 22, 2020. www.metmuseum.org/about-the-met/conservation-and-scientific-research/time-based-media-working-group.
National Digital Stewardship Alliance (NDSA). “Levels of Digital Preservation, Version 2.0.” 2020. https://ndsa.org/publications/levels-of-digital-preservation/.
National Information Standards Organization (NISO). A Framework of Guidance for Building Good Digital Collections. Bethesda, MD: NISO Press, 2007. www.niso.org/publications/framework-guidance-building-good-digital-collections.
Pozdeev, Max, and MediaArea. MediaInfo (software), version 21.09. Windows, macOS, Android, iOS, Linux, and other platforms. MediaArea, 2021. https://mediaarea.net/de/MediaInfo.
Sherring, Asti, Caroline Murphy, and Lisa Catt. “What Is the Object? Identifying and Describing Time-Based Artworks.” AICCM Bulletin 39, no. 2 (2018): 86–95. https://doi.org/10.1080/10344233.2018.1544341.
Smithsonian Institution. “Preserving and Collecting Time-based Media & Digital Art at the Smithsonian Institution.” n.d. www.si.edu/tbma/.
Taylor, Joel. “An Integrated Approach to Risk Assessments and Condition Surveys.” Journal of the American Institute for Conservation 44, no. 2 (January 2005): 127–41. https://doi.org/10.1179/019713605806082365.
Vitale, Timothy, and Paul Messier. “Video Format Identification Guide.” 2007. https://cool.culturalheritage.org/videopreservation/vid_id/index.html.
Waller, Robert. “Conservation Risk Assessment: A Strategy for Managing Resources for Preventive Conservation.” Studies in Conservation 39, no. sup2 (January 1, 1994): 12–16. https://doi.org/10.1179/sic.1994.39.Supplement-2.12.
———. “Risk Management Applied to Preventive Conservation.” 1995. https://museum-sos.org/docs/WallerSPNHC1995.pdf.
Xavier-Rowe, Amber, and Claire Fry. “Heritage Collections at Risk: English Heritage Collections Risk and Condition Audit.” In ICOM-CC 16th Triennial Conference Lisbon 19–23 September 2011: Preprints, 11. Lisbon: ICOM Committee for Conservation, 2011.


Appendix 4.1

ARTWORK SUMMARY TEMPLATE USED BY THE AUTHORS FOR THE 2017–18 MET TBM COLLECTION SURVEY

TBM Artwork Summary
Please provide sources for all information contained in this report.

Prepared by:

Date Prepared:

General Information Department: Accession No.: Name of Artist: Title of work: Date of work:

Technical Summary Include a brief, high-level description about the presentation of the artwork, including the use of any display equipment, props, or fabricated objects, and whether video and audio channels should be synchronized.

Notes:



Recommendations: Tombstone: Components Listed in TMS: Registration: Documentation: Rights: Technical Interventions: Physical Storage: Digital Storage: Exhibition Requirements: Loans: Research Prior to Exhibition: Conservation Goals: Conservation Priorities: Assign numerical score

Artwork Documentation/Research: Physical Storage: Digital Storage: Media Documentation: Media Acquisition: Media Migration: Equipment Purchase:

Template Created by Alexandra Nichols © The Metropolitan Museum of Art


5 OUTSIDE THE INSTITUTION: CROSSING THE BOUNDARIES OF COMMUNITIES AND DISCIPLINES TO PRESERVE TIME-BASED MEDIA Mona Jimenez, Kristin MacDonough, and Martha Singer

Editors’ Notes: In this chapter, Mona Jimenez, Kristin MacDonough, and Martha Singer challenge the common preconception that time-based media conservation can only take place in (wealthy) institutions and shift focus to a rich media art ecosystem outside of the institution, where volunteer-driven initiatives bring together creators, communities, and caretakers with the shared goal of preserving their collections. Representatives of three initiatives, the Community Archiving Workshop (CAW), the Audiovisual Preservation Exchange (APEX), and the XFR Collective, offer their perspectives in three roundtable discussions. Mona Jimenez has worked in many capacities on independent media and media art collections for the past 30+ years, founded APEX, and initiated the CAW model. Kristin MacDonough (Assistant Conservator of Media, Art Institute of Chicago) is a founding member of the XFR Collective and a former CAW and APEX participant. Art conservator Martha Singer (Director, Material Whisperer, New York) is in private practice specializing in the conservation of modern and contemporary art, with a focus on working directly with artists and their estates on legacy. Martha has worked with artists whose works include time-based media.

Participants of the CAW roundtable (October 23, 2020): Ann Adachi-Tasch (Director, Collaborative Cataloging Japan), Laurie Duke (Head of Finance and Administration, Grey Art Gallery, New York University), Kelli Hix (Audiovisual Collections Consultant/Project Manager, Nashville Metro Archives Audiovisual Heritage Center), Vanessa Renwick (Artist/Oregon Department of Kick Ass), Moriah Ulinskas (Audiovisual Archivist/Public Historian/Project Director of Community Archiving Workshop), and Pamela Vadakan (Director, California Revealed).
Participants of the APEX roundtable (December 2, 2020): Bill Brand (Artist/Film Preservationist, BB Optics Inc., New York), Humberto Farias de Carvalho (Conservator/Professor, Federal University of Rio de Janeiro, Brazil), Caroline Gil Rodríguez (Independent Media Conservator/Director of Media Collections and Conservation, Electronic Arts Intermix, New York and Puerto Rico), Ángela López Ruiz [Artist, Fundación de Arte Contemporáneo (FAC)/Laboratorio de Cine, Montevideo, Uruguay], Juana Suárez (Associate Arts Professor/Director, Moving Image Archiving and Preservation Program, New York University), and Guillermo Zabaleta [Artist/Fundación de Arte Contemporáneo (FAC)/Laboratorio de Cine, Montevideo, Uruguay].

DOI: 10.4324/9781003034865-6



Participants of the XFR roundtable (December 5, 2020): Andrea Callard (Artist/Filmmaker), Kyle Croft (Art Historian/Curator/Programs Director, Visual AIDS), Caroline Gil Rodríguez (Independent Media Conservator/Director of Media Collections and Conservation, Electronic Arts Intermix), Marie Lascu (Audiovisual Archivist), and Conrad Ventur (Artist/Curator).

5.1 Introduction

In this chapter, we introduce readers to three initiatives that engage creators, caretakers, and their allies and are aimed at the preservation of moving image and time-based media collections: the Audiovisual Preservation Exchange (APEX), the Community Archiving Workshop (CAW), and the XFR Collective (pronounced “Transfer Collective”). While the development of time-based media conservation has predominantly taken place in museums and libraries, these initiatives offer alternative routes to preservation in a variety of contexts. The people behind them work almost exclusively with artists, with small arts and cultural organizations, and with collections outside of major institutions or in under-resourced areas of the world. Thus, the partners are largely without dedicated staff or well-funded conservation projects. The initiatives stress not the perfect but the possible. They consider each conservator, archivist, artist, collection manager, or other caretaker part of an interconnected and interdependent media art ecosystem in which actions in one area can affect the whole. We have provided descriptions of each group but have chosen not to write an analysis of their current practices; instead, we have provided space for members to speak on behalf of their own endeavors. To inform the development of this chapter, we organized three roundtable discussions with representative members and participants of each initiative. Descriptions of the groups are followed by quotes from roundtable participants, in which salient points have been organized by topic and edited for clarity and continuity. After reading this chapter, we encourage readers to research and explore the groups’ abundant online resources (NYU Tisch n.d.b.; CAW 2019; XFR Collective n.d.).
The initiatives recognize the roles of institutional, non-institutional, and hybrid collections and bring to the foreground creators and small or under-resourced organizations that do not necessarily frequent art conservation circles. They play an important role in disseminating information about conservation and preservation and in promoting do-it-yourself (DIY) strategies. In its own way, each initiative fulfills its mission through peer-to-peer and person-centered pursuits, recognizing that each participant brings value to their shared work, whether trained in preservation or not. The groups connect museum and archive-based cultural workers to diverse communities of artists, collectors, and caretakers. In contrast to one-way, service-oriented events, the groups’ activities are designed to build relationships and communities with the understanding that everyone involved has a unique perspective and contribution to offer. For instance, during a CAW, participants may work in pairs or small groups to inspect and inventory boxes of videotapes. APEX assembles cross-partner teams to create new preservation resources, such as a video digitization station or a new telecine setup for film scanning, embodying the program’s emphasis on exchange and reciprocity. XFR Collective may digitize tapes from an arts organization and later collaborate with them on an exhibition or pop-up event that addresses the needs of artist members. Of the three initiatives, XFR Collective has the greatest representation of artists in its active membership, due in part to its roots in an artist, museum, and preservation collaboration. Many of those involved in XFR Collective were emerging professionals when they joined and have


since gone on to become museum conservators, although they were not traditionally trained in conservation. Both CAW and APEX have sought out artists and arts organizations for partnerships, and art conservators have been present at some of their events. Adaptation and experimentation are valued by all three initiatives, whether expressed as an APEX team staging an expanded cinema performance to understand artists’ use of archives, as CAW’s new Training of Trainers program (see Section 5.2), or as XFR Collective’s programming of digitized collections on the live streaming platform Twitch. In each case, their work is carried out with an abundance of generosity and innovation. They all recognize that when it comes to media conservation and preservation, there are fresh approaches to collaborative work that are yet to be discovered or conceived. Although these groups exist outside of traditional conservation spaces, such as museum conservation departments or laboratories, they incorporate core principles of conservation and preservation. All groups value exploring, understanding, and documenting context, which is an essential principle of conservation practice. Also, while different for each group, the initiatives engage in important cross-disciplinary discussions of ethics. The initiatives may align most closely with the practices of caretakers of artist archives, focusing on the collection as a whole and the relationships among its objects rather than on a deep examination of a single object. Maintaining the relationships created with the people and organizations they partner with is a primary concern for all three groups. During the XFR Collective roundtable, conservator Martha Singer noted that the group was “much more involved in the artist’s legacy, the longer arc of the work,” in contrast to the policy of most museums, which tend to collect, and thus conserve, one artwork at a time (Jimenez et al. 2020c).
The experience of filmmaker Vanessa Renwick illustrates the unexpected impact of artist Bill Brand’s philosophy of “one thing leads to another” when working in a DIY manner (Jimenez et al. 2020b). In 2015, Renwick’s personal collection was the focus of a CAW through the Association of Moving Image Archivists (AMIA n.d.). Through this experience, they came to know conservator Peter Oleksik, who later shared with Renwick conservation templates developed through the program Matters in Media Art (Matters in Media Art 2015). Later, when their artwork was acquired by a local museum that did not have a time-based media conservator, Renwick used the templates to create documentation to support the acquisition process. Thus, an educated artist became the link that introduced museum staff to the methods of conservation documentation for media artworks.

This book details practices that, while necessary, are beyond the means of the vast majority of the world’s cultural institutions and artists. A relatively small number of museums collect time-based media art, and a smaller number employ or even contract with time-based media conservators. The three initiatives offer alternatives that address the multitude of media art held by archives, small organizations, collectors, and the artists themselves. In fact, some of the works that benefit from these initiatives may find their way into major museums; it is hoped these efforts will help them survive and be seen in the interim. The initiatives also remind us that not all knowledge needed for time-based media art is found in traditional conservation circles. The survival of the full media art ecosystem depends on the health of all its parts, people and technology alike, within and outside of all institutions. How do we support new forms of caretaking that acknowledge this interdependency?
An awareness of an ecosystem is not new to conservation, nor is the recognition that, together, creators, experts, and allies can ensure the best conservation outcomes. It is our hope that community-based efforts and the projects they are involved in become widely known and that others will experiment with these models. Cross-community and cross-discipline events and experimentation can only benefit the artworks that we all care about.


5.2 Community Archiving Workshop (CAW)

Community Archiving Workshop (CAW) is a model that arose from the belief that many film and media collections are connected to a community of makers, audiences, collection caretakers, and other allies who want the content to endure and are willing to labor together toward that end (Jimenez 2018). CAWs are typically one-day events where 20–30 participants—some experts and some new to handling audiovisual materials—inspect and inventory endangered audiovisual collections, usually 100–200 items. Participants all use the same spreadsheet template, and after the CAW, the files are collected, merged, and provided to the collection’s caretaker. CAWs may also include a station to practice film inspection and/or a station to demonstrate or perform media digitization.

The first CAW was held in 2010 in partnership with the Scribe Video Center in Philadelphia, PA (Scribe n.d.a.). It was organized as part of the course Video Preservation in New York University’s Moving Image Archiving and Preservation Program (MIAP) and was concurrent with a conference of the Association of Moving Image Archivists (NYU Tisch n.d.a.; AMIA n.d.). Along with students from the MIAP program, participants included Association of Moving Image Archivists (AMIA) members and Scribe staff and volunteers. The Independent Media Committee of AMIA was excited about the concept and took on the organization of a CAW in conjunction with the next conference in Austin, Texas, in 2011. For a number of years, the majority of CAWs were aligned with AMIA’s annual conference and were managed as a project of the Independent Media Committee and the Diversity Task Force (Dollman and Orgeron 2015). However, CAWs have been held in Chile, Japan, Mexico, the Philippines, and Thailand;

Figure 5.1 Filmmaker Ko Nakajima (center, with cap) answers a question during a Community Archiving Workshop in Tokyo in 2016 as part of a knowledge exchange organized by Collaborative Cataloging Japan. Photo: Mitsuru Maekawa



on Native American Tribal Lands; and in Puerto Rico and numerous locations on the mainland United States. Most CAWs are all-volunteer efforts and are done without dedicated funding.

A CAW is intended to jump-start preservation processes and lay the groundwork for future projects. Guided by conservation and preservation professionals, collection caretakers gain an understanding of what assets they have and can make preservation-critical decisions about where to direct limited resources. The CAW model is intended to be non-exclusive and replicated by anyone who is moved to take action. Each CAW is unique, having its own character, purpose, and impact. As CAW Committee member Kelli Hix says, “[T]he model is kind of a non-model; it’s like a set of tools and meant to be really malleable” (Jimenez et al. 2020a). Although the tangible output is data, organizers say the event is more about education and community building, which begins at the point of recruiting partners and continues through post-event follow-up.

AMIA has provided important administrative support for several large grants. In 2018, with funding from the Institute of Museum and Library Services (IMLS) in the USA, a Training of Trainers (TOT) program was launched to train arts and cultural organizations on how to organize CAWs (IMLS n.d.; CAW—Training of Trainers Toolkit n.d.). The curriculum and project documentation for the program are being published on the CAW website as the CAW TOT Toolkit. As part of this project, CAW has built traveling kits—one to demonstrate media digitization and one to demonstrate film inspection and handling—for use at CAWs and by partner organizations. At the time of this writing, a major initiative is underway for a series of CAWs in partnership with the Association of Tribal Archives, Libraries, and Museums, funded by the National Endowment for the Humanities in the USA [Association of Tribal Archives, Libraries, and Museums (ATALM) n.d.; NEH n.d.].
While still technically associated with the Independent Media Committee, the CAW Committee has remained a dedicated group of fewer than ten people. With the growing popularity of CAWs, more requests for partnerships are coming in, and opportunities for funding are promising. The committee is in the midst of considering the best organizational structure going forward and how to continue to grow the number of CAWs, CAW organizers, and the community relationships that make CAWs successful.

The following perspectives represent CAW partners (Ann Adachi-Tasch and Vanessa Renwick), a CAW participant (Laurie Duke), CAW organizers and members of the AMIA CAW Committee (Kelli Hix, Pamela Vadakan, and Moriah Ulinskas), and co-authors (Martha Singer and Mona Jimenez).

5.2.1 Partnerships and Museum Relationships

MORIAH ULINSKAS (AMIA CAW COMMITTEE; AUDIOVISUAL ARCHIVIST/PUBLIC HISTORIAN/PROJECT DIRECTOR OF COMMUNITY ARCHIVING WORKSHOP): We treat outreach for

a CAW as an important educational element of what we’re doing. We ask locals for anybody we should contact about the workshop in the area where we’re going for the AMIA conference. The CAW Committee splits up a spreadsheet, and we do emails and cold calls, and then we take more recommendations. We’ve always taken the approach that in itself it is an achievement if we make contact with an organization, and we say, “Hey, this is what moving image archiving is. Here’s a couple of resources if you’ve never thought about preservation as an issue for your collection.” In the end it comes down to one organization that we end up working with. Usually it is just contingent on them having staff who are willing to do it and on having space.

ANN ADACHI-TASCH (CAW PARTNER; EXECUTIVE DIRECTOR, COLLABORATIVE CATALOGING JAPAN): We wanted to do the CAW in Tokyo [Collaborative Cataloging Japan

(CCJ)—Community Archiving Workshop in Tokyo 2016 2020] because we knew that there wasn’t a lot of knowledge of moving image archiving in Japan. We took on the artist Ko Nakajima’s collection, and the point wasn’t really to do a collection survey of Ko Nakajima’s works, but really to gather people. I actually wish that it could be a series. I know that there is an exchange among professionals—like the National Archive of Japan going to the AMIA conference. But I think there is still a need to share this knowledge with a broader range of people who are not necessarily moving image specialists, such as artists.

Some people from established institutions even traveled from the south of Japan to Tokyo in order to participate in the CAW. And there were also representatives present from artists’ festivals as well as nonprofits such as those archiving home movies. Because we did it at Nihon University, there were students from there and from other universities. Also, we partnered with archivist Nobukazu Suzuki and with film restoration expert Mariko Goda. It was great to have people who are professionals coming in to share with the participants how to handle film. I thought it was so lovely to see that. Those people are our partners now. I think it was a meaningful project to do.

MORIAH ULINSKAS: CAW has really dedicated itself to supporting small community-based arts institutions or cultural organizations. But it’s really hard for me to think about the relationship between CAWs and major museums because of my experience working with museums. They’re entirely focused on their own collections. They only have preservation departments or programs because they have acquisitions, and it’s about the value of their collections. And it’s really hard to get them to think outside of those terms. I just think that the culture of museums is based on a scale that is so different and so unaligned with a community-based approach.
ANN ADACHI-TASCH: I think it takes an individual who’s really passionate about community archiving. I don’t want to sound so pessimistic about museums, but I do think that it would require a major shift in the culture of what museums are about for them to organize CAWs.

MONA JIMENEZ (AMIA CAW COMMITTEE; CONSERVATOR/MATERIA MEDIA): There is a push to have the major museums that do have time-based media conservators share the knowledge. But I agree with Ann and others that the initiative probably comes from individuals. I was really intrigued by your thoughts, Moriah, about context. Because of course, context is super important for conservators. There is the examination of the work and the examination of context.

KELLI HIX (AMIA CAW COMMITTEE; AUDIOVISUAL COLLECTIONS CONSULTANT/PROGRAM MANAGER, NASHVILLE METRO ARCHIVES AUDIOVISUAL HERITAGE CENTER): I experienced the same thing that Moriah was talking about, but I definitely feel like there’s hope. I think that it’s also up to us to create incentives for larger institutions to work with their communities. Maybe we can start our creative wheels turning.

LAURIE DUKE (CAW PARTICIPANT; HEAD OF FINANCE AND ADMINISTRATION, GREY ART GALLERY, NEW YORK UNIVERSITY): I’m thinking about the Leslie-Lohman Museum

of Art and their collaboration with XFR Collective (Leslie-Lohman 2021a). That’s one example of bringing more of a grassroots preservation effort into a small museum, and in a very particular way. I think that museums are in the midst of trying to re-envision themselves and their missions, and, increasingly, foundations are also more invested in communities than in big exhibitions and acquisitions. Maybe there is more room for a less precious approach to handling artworks—which I think has its pluses and minuses. But I also think 72

Outside the Institution

that there will be more efforts by museums to connect with their communities and their direct audiences. So possibly the CAW model could come into the museum in some way. I hope so.

5.2.2 Workflows

VANESSA RENWICK (CAW PARTNER; ARTIST/OREGON DEPARTMENT OF KICK ASS):  I showed maybe three of my short films at the beginning of the CAW in Portland in 2015, and then we split up, and some people went to work with the Portland Institute of Contemporary Art's collection, and some people went to work with mine. The one thing that was hard for me was that I had all my tapes of different formats, all the ones about this project or this movie, located in certain places in my studio. And when I brought them to you all, of course they got mixed up, because a lot of them got put in boxes, and then people were archiving them and numbering them. Some had my own numbers on them already, using my own numbering system. Those numbers were thrown out the window, and they got new numbers. And then the subjects that were all about one thing did not have sequential numbers anymore. They were out of order from my own visual order, which makes it easier and quicker for me to find them. So that was the one negative for me. It kind of messed up how I clumped my tapes. I was also scared that something was going to be lost in this process with all of these strangers manhandling tapes. ANN ADACHI-TASCH: I remember Ko was definitely really nervous as well. And I think that's why we ended up just doing a portion of his collection. And I think he gave us all these really crusty films that he didn't care about. And then he would say, “You know, the mold is part of the work.” MONA JIMENEZ: I think as archivists we often try to see patterns so that we can organize our process. But it sounds like it's more than patterns. It's about workflow. It's your creative methods that also need to be understood. LAURIE DUKE: I think there's a kind of tension between having a community dive into a collection and a core group of trained archivists who are trying to “respect des fonds” (Wikipedia—Respect des Fonds 2020) by keeping things in place. Like you were saying, Vanessa, the CAW kind of upset your internal system. It's interesting that you have to let go a little bit if you're letting the community handle your work. MORIAH ULINSKAS: At a CAW in Richmond, Virginia, we were processing a collection of oral histories our partner had acquired. The archive professionals just had what was written on the labels; they didn't have the context. I was working at a table with a bunch of senior citizens who were volunteers at that museum. They could look at a label and say, “Oh, this is about this neighborhood that doesn't exist anymore.” They could give so much more context and richness than we as AV professionals or the museum workers could. I really, really stand by this as the greatest strength of the CAWs.

5.2.3 Impact on Communities and Collections

KELLI HIX:  There's so much we can do in one short day and with the prep work. The last thing that we want at a CAW is for someone to feel like their collection is less organized. So I think one of the things we're doing is really asking ourselves: “How can we have longer-term partnerships when that's appropriate?”

VANESSA RENWICK:  I brought home all my boxes of stuff, and I didn't unpack them because they weren't finished. I thought, “Well, I can't put these away because then I won't know what's finished and what's not.” And they just sat in boxes for a couple of years because it was too overwhelming for me to tackle them. And then I talked to three people, and I had an intern too, who had been at the CAW. We got together in my kitchen and finished archiving the collection. It felt really good to finally have it done. MORIAH ULINSKAS:  I think there's a lot to be said for the fact that the traditional model is a totally volunteer effort with zero resources, and everybody does the best they can. And it's always wonderful. The IMLS grant creates the possibility for a long-term relationship, because we actually have the funding to give all of the partners stipends for their participation and to provide more resources to their organizations. Being able to actually fund something makes all the difference in the world. KELLI HIX:  In 2016, we did a CAW for the Metro Archives in Nashville, Tennessee. It helped to create advocacy and knowledge—to shine a light on this collection that hadn't really been supported before. Eventually that turned into a whole tiny little department with me part-time as the only staff person. But we are now the central hub for the IMLS grant for the Southeastern United States, and we run lots of CAWs for other organizations. The CAW was part of the foundation of turning Nashville into a center to help its own collections, but also to create a network in the Southeast. These are really transformative outcomes that have very widespread effects. It's exciting. MORIAH ULINSKAS:  It is really about finding partners with expertise versus trying to invent the expertise or gain it all yourself. Like when we were in Baltimore last year, A/V Geeks drove up two film scanners and set them up (A/V Geeks n.d.). I think they digitized 20 films on-site that day while we were doing the workshop. PAMELA VADAKAN (AMIA CAW COMMITTEE; DIRECTOR, CALIFORNIA REVEALED):  With the IMLS grant, we built media digitization and film inspection kits that are portable and are meant to be loanable after the grant. People always come to us at the end: “Okay, now what?” They want to know what's next and really to have more control over that process. So I love that we have the kits now and are actually giving the tools to people to use. VANESSA RENWICK:  Another good thing that happened from the CAW was that filmmaker Andrew Lampert introduced me to a conservator at the Museum of Modern Art, Peter Oleksik. The Portland Art Museum ended up buying a seven-channel video installation I made. Peter had told me that they were working on revamping their acquisition process and documentation and that he'd share their new rules with me. They really helped me. When I handed the piece over to the museum, it was probably the best amount of information they've ever gotten from anybody, because I had this worksheet to fill in for them. That only happened because I went to the CAW.

5.3 Audiovisual Preservation Exchange (APEX)

Audiovisual Preservation Exchange (APEX) is an ongoing project of New York University's Moving Image Archiving and Preservation Program (MIAP). Planned with international colleagues, APEX facilitates gatherings of teams consisting of a variety of collection caretakers, students, faculty, artists, media enthusiasts, and experts who work together on endangered media/film collections, with the goal of equal sharing of knowledge and practices. At present, the in-person gatherings take place over a period of two weeks, but the aim of the program is to create long-lasting connections that mutually support learning and care of collections.


Figure 5.2 One of the projections as part of AJI (Archive Jam Improvisation), an event at the museum Colección Engelman-Ost in Montevideo, Uruguay, that was organized by Fundación de Arte Contemporáneo (FAC) and participants of APEX Montevideo. Photo: Mona Jimenez

The teams’ projects result in tangible improvements to the status of the audiovisual collections, but more importantly, international relationships are developed among a wide range of colleagues. Numerous opportunities for exchange are also organized during the visit, such as panels, demonstrations, and workshops. A closing session is held where each team reports on their accomplishments, and participants can plan for future regional and global collaboration. APEX also often results in new connections among disparate people and organizations in a region where many entities hold parts of the local audiovisual heritage. APEX partners have included major institutions such as national film archives, national libraries, and universities, as well as small grassroots organizations, artist spaces, low-power television stations, and accidental archives. APEX began in 2008 as a collaboration with colleagues in Ghana, where a small team of MIAP faculty, staff, and alumni worked with archives in Accra, principally the Cinema Division of the government's Information Services Department, and the J. H. Kwabena Nketia Archives of the Institute of African Studies at the University of Ghana (Ghana ISD 2020; University of Ghana n.d.). The work occurred over the course of six years, often in partnership with NYU's Department of Africana Studies and consultants from the company AVPreserve, now known as AVP (AVP n.d.). A key accomplishment was the creation of an audio preservation lab at the Nketia Archives in 2014, where many hundreds of unique audio recordings of Ghanaian arts and culture have been digitized and made accessible to scholars, students, and creators. The model of APEX as a short-term event, usually two weeks in length, began in 2009. MIAP faculty and alumni organized APEX Buenos Aires, where volunteers from NYU and attendees from a locally held conference of the International Federation of Film Archives (FIAF) worked together on film collections with the staff of Museo del Cine Pablo Ducrós Hicken (Museo del Cine Pablo Ducrós Hicken n.d.). APEX Bogotá, held in 2013, is notable as the first year in which students took the lead as organizers under faculty mentorship; this was also the first APEX that took the multi-institutional, multi-team approach that continues to the present. Teams worked on collections at the Fundación Patrimonio Fílmico Colombiano; visits were organized to filmmaker Luis Ospina's personal collection and to a film lab; a university hosted a digital preservation workshop; and a personal digital archiving workshop was held at a local bookshop [Moving Image Archiving and Preservation Program (MIAP)—APEX 2013: Bogotá n.d.a.; Fundación Patrimonio Fílmico Colombiano n.d.]. Since 2008, APEX has collaborated with approximately 25 institutions and collections in 10 countries, with gatherings in Argentina, Colombia, Uruguay, Chile, Spain, Puerto Rico, and Brazil. During the COVID-19 lockdown, a 2021 online edition was organized to work with Indigenous collections in Mexico and Brazil. APEX has also supported online workshops with numerous institutions in Latin America and is participating in the development of a network of community-based archives in the region. APEX is led by the MIAP director—since 2017, Juana Suárez—with the support of Pamela Vizner, a media archivist and 2014 graduate of MIAP. Students and alumni are involved in all aspects of planning, funding, and preparations and are encouraged to take leadership roles on-site. In some cases, students from other US graduate programs in archiving and preservation have participated. Considerable in-kind and direct support has come from NYU and MIAP, and students have also been supported through the Film Foundation (Film Foundation 2021) and private donations.
Almost all participants incur some costs, and partners provide significant support according to their ability—for example, contributing housing and meals when possible. Donations for supplies and equipment also come from many different sources in the US and the host countries. Here again, we see artist Bill Brand's theme of “one thing leads to another,” or what Suárez calls “the domino effect,” referring to how these collaborations build on one another, generating a kind of connective tissue among caretakers across borders (Jimenez et al. 2020b). For example, Brand participated in APEX Buenos Aires, held at Museo del Cine Pablo Ducrós Hicken, and director Paula Félix-Didier saw an opportunity for Latin American artist-preservationists to network with those from North America. She connected Brand with the artist collective Fundación de Arte Contemporáneo (FAC 2020)/Laboratorio de Cine in Montevideo, Uruguay. Brand and artist Katy Martin were invited for an artist residency that included exchanges on do-it-yourself film preservation, as FAC had an archive of artist media/film works and had been borrowing home movies from families in Montevideo for use in artists' projects. Subsequently, in 2014, FAC became a partner for APEX Montevideo, hosting one of the teams. Thus, Brand's network combined with that of Suárez, who had met colleagues in Uruguay while she was researching audiovisual preservation practices in Latin America. And it was through partnering with FAC that APEX became aware of Memorias Celuloides and La Red de Cine Doméstico, two collectives in Spain devoted to the preservation of home movies. These groups became lead partners for APEX Cartagena, held in Spain in 2017 (Vivancos n.d.; La Red n.d.). FAC's participation in APEX was especially memorable, as their team explored issues of conservation/preservation in part by staging AJI (Archive Jam Improvisation), an expanded cinema event [MIAP—APEX 2014: Montevideo (FAC & AJI) n.d.c.].
FAC was grappling with how to make respectful use of the home movies as art materials. FAC's approach was a dynamic complement to the work of the teams engaged in more traditional activities: the inspection and cataloging of collections in a university archive, in a university Communications Department, and at two of Uruguay's most important national institutions, Cinemateca Uruguaya and the Archivo Nacional de la Imagen y la Palabra at SODRE (MIAP—APEX 2014 n.d.b.; Cinemateca Uruguaya n.d.; SODRE n.d.). APEX strives to keep its networks strong by promoting and publicizing not just the model and its impacts but the expertise and collections of the partners. APEX is often followed by co-organized presentations at international conferences and/or by preservation projects. Some of these projects take place within MIAP, such as in Brand's Film Preservation course or as part of the Orphan Film Symposium, a project of MIAP faculty member Dan Streible (Orphan Film Symposium n.d.). The following perspectives represent APEX partners (Ángela López Ruiz and Guillermo Zabaleta from Montevideo, Uruguay), APEX organizers/participants (Bill Brand, Caroline Gil Rodríguez, and Juana Suárez, all of New York), Humberto Farias de Carvalho (APEX presenter, Brazil), and co-authors (Martha Singer and Mona Jimenez of the New York/New Jersey region).

5.3.1 Archives and Artist Improvisations

ÁNGELA LÓPEZ RUIZ [APEX PARTNER; ARTIST, FUNDACIÓN DE ARTE CONTEMPORÁNEO (FAC)/LABORATORIO DE CINE, MONTEVIDEO, URUGUAY]: Our interest really started when we heard Mona talk at the 7th Orphan Film Symposium about APEX Ghana. We started thinking about how to work with archives in a decentralized, non-institutional way. When we were invited to partner in APEX Montevideo, we made a general call for the participation of artists and collectives. GUILLERMO ZABALETA [APEX PARTNER; ARTIST, FUNDACIÓN DE ARTE CONTEMPORÁNEO (FAC)/LABORATORIO DE CINE, MONTEVIDEO, URUGUAY]:  We had begun to expand our archives and to use them as a tool to add challenges to artworks and to complicate and expand other creative work. As an artists' space, we wanted to think about the archive from the place of art and also in terms of working from a collective. We also were interested in jamming, similar to jazz musicians, as a sort of playful improvisation that takes shape from a creative base.     This intersection of artists and specialists in moving image archiving opened the space to appropriate the home movies archive, turning it into a protagonist. This is because an archive not only holds memories but also can be an active agent through performance and can thus be shared collectively. We were also able to preserve these films for the families that had granted permission and to return them with some guidelines so that the families could continue the work of preservation themselves. ÁNGELA LÓPEZ RUIZ: We think of our AJI activity as expanded cinema created through archives. We called it AJI (Archive Jam Improvisation), like ají in Spanish for chili pepper, meaning that we are creating very spicy art. We had done other kinds of jamming before APEX.     AJI took place at Colección Engelman-Ost, a private museum that had also lent us home movies (Wikipedia—Colección Engelman-Ost 2021). What was different was that AJI generated a new dialogue among the films and the museum's static artworks; we consider both to be parts of their archive. This idea of decentralizing or de-institutionalizing the archive also had to do with the fact that, for us, those participating in AJI also become part of the archive.

GUILLERMO ZABALETA:  APEX Montevideo was a very significant event that led us to revise other projects for preservation, thinking of collective dynamics as a way of preserving memory. With the optical and contact printers, we are able to work with films and capture images. By the time we return the movies to their owners, we have mapped out a record of the materials that allows us to keep working in research, curation, documentaries, and other projects. This also makes it possible for other artists to keep creating with the images we have been able to compile.

5.3.2 Partnerships and Exchange

JUANA SUÁREZ (APEX ORGANIZER; ASSOCIATE ARTS PROFESSOR/DIRECTOR, MOVING IMAGE ARCHIVING AND PRESERVATION PROGRAM, NEW YORK UNIVERSITY): The idea of including performance at the center of APEX, as we did in Uruguay or Cartagena, or even closing APEX with a curatorial activity, is limited by bureaucracy and by the distance from major to minor archives, and at times by a lack of understanding of what minor archives do. It is also limited by the fact that major archives or institutions often have more access to great spaces in which to perform. Major archives own spaces; they often depend on ministries of culture and often have funding (even if limited). That has to do with questions of access as well. One of the things that I learned was the significance of reversing the logic of APEX and shifting the program to focus more attention on minor archives, while being open to the participation of big archives.     Minor archives seem to have an urgency to get work done. They really want to maximize the support to see how they can implement do-it-yourself practices. They are dealing with survival, limited staff, and a lack of spaces for public programming, and they depend on the work and contributions of volunteers. They are quick to understand that we do not come as a group of foreign experts with knowledge for them but that we can be more effective when we promote exchange. We approach institutions curious as to what they have to teach us. Therefore, you create this kind of circularity; we're a feedback loop in a sense.     All of this dovetails with my own background in cultural theory, cultural studies, and a kind of innate feminism about changing the relation of the archives. Not just talking about decolonizing the archive, but actually decolonizing by doing. In APEX, this also has to do with shifting the responsibility of organizing over to students and welcoming the increasing leadership of alumni. BILL BRAND (APEX PARTICIPANT; ARTIST/FILM PRESERVATIONIST, BB OPTICS INC.):  There are two themes that are characteristic of APEX that are instructive and that reflect my experience. One is that “one thing leads to another,” and the other is that “information or ideas flow in both directions.”     I came to archiving from being an artist. I noticed that most attention or most ideas were directed towards institutions and institutional collections. So I found ways to address people like myself who had collections as artists. I developed a whole curriculum around what I called “do-it-yourself preservation” (Brand n.d.). APEX is integrated into my teaching, my art practice, and my passion for film and for moving image preservation.     That relationship between the artist archive and collecting institutions is parallel to the relationship of the dominant nations' archives to archives of less dominant nations. As artist-archivists, our priorities and the way we work are different than if we're institutional archivists. And I think the institutions have learned a lot from us, just as we've learned from the institutions.


    Similarly, the strength of APEX is that it's not simply dominant archives from the big countries bringing knowledge to the archives of less dominant countries. Archives from the dominant nations have been influenced by the practices of archives from the less dominant countries as much as the other way around. Issues such as what things need to be done, what the priorities are, and the whole concept and purpose of an archive are dealt with as an exchange of ideas and information. That's the strength of APEX, and that's what I take from it. HUMBERTO FARIAS DE CARVALHO (APEX PRESENTER; CONSERVATOR/PROFESSOR, FEDERAL UNIVERSITY OF RIO DE JANEIRO):  What made APEX Rio so important to us was that it brought together local students and professionals who are not necessarily working with time-based media but are interested in this field. I gathered from comments from colleagues that APEX opened a door to start conversations, since many of us did not know each other and we got a chance to meet. CAROLINE GIL RODRÍGUEZ (APEX ORGANIZER; INDEPENDENT MEDIA CONSERVATOR/DIRECTOR OF MEDIA COLLECTIONS AND CONSERVATION, ELECTRONIC ARTS INTERMIX):  While planning APEX San Juan, we approached a museum of contemporary art and various art organizations, including artist-led spaces. The project with the contemporary art museum could not materialize. But through a personal connection of Juana's, we were able to get in contact with an artist—Mari Mater O'Neill—who had a collection of digital materials pertaining to her artistic output as well as the digital archives of an early internet zine (El cuarto del Quenepón) in which she published original net-artworks and art criticism from 1995 to 2005. It was a key cultural magazine in the Caribbean. The studio visits and subsequent consultations turned into a fully formed project that, to follow the theme of one thing leading to another, led us to apply for a grant from the Society of American Archivists to subsidize some restoration work, which we were awarded and are now working on.

5.3.3 Principles of Collaboration

MARTHA SINGER (CONSERVATOR/DIRECTOR, MATERIAL WHISPERER):  What's fascinating to me is that the mentor-mentee relationship is easily flipped. You can go in being the mentor but become the mentee easily within that relationship. People aren't stuck in their roles, so they're able to learn and teach simultaneously. That dynamic quality is what's important and also what keeps the relationship going. It's kind of like doing a big performance, and then you stay in touch with the cast members later. You haven't performed just one role. CAROLINE GIL RODRÍGUEZ:  I really learned from and appreciate the fact that the APEX relationship has to be based on multilateral honesty and respect. It is a relationship based on a kind of equal exchange. I have a particular sensitivity to this, perhaps, because I am from Puerto Rico, which is a US colony, and so is both part of and foreign to the USA. In both APEX Buenos Aires and APEX Santiago, I could study the archives as someone from the Caribbean. To study the minor archives was more emotionally affecting to me than working with institutions in the USA. In Latin America and the Caribbean, we are very accustomed to working like scavenger birds. Most things are done in piecemeal fashion. This way of working, which sort of cannibalizes things, is due to a lack of resources and professional opportunities, but also to the fact that we're used to having to overcome excessive bureaucratic procedures and institutions.

    I would like to see larger museums and institutions from the Global North provide financial support for this type of exchange to happen; financial support that does not co-opt. This would achieve a significant advancement in our field, resulting in a more bilateral flow of knowledge concerning time-based media that circulates beyond the ivory tower. Because as professionals, we don't know everything, right? It's always good practice to make a point of learning from others who are approaching conservation with fewer resources and less institutional support. JUANA SUÁREZ:  Collaborative projects are a vehicle for exchange to happen; we are not very invested in creating statistics and reports but in actual results. The goal is to create points of contact and develop those relationships. Having projects and teams is a nice structure for that to happen.     Minor archives in some ways move faster in organizing impromptu gatherings and have needs that are not covered or that are not similar to the needs of larger archives. Independent artists and archivists are interested in moving faster on discussions, such as with the Mari Mater O'Neill digital forensics project that took place in APEX San Juan or the conversation we organized in APEX Rio de Janeiro on the need to promote training in time-based media. The media art conservation field is picking up in Latin America, and chances are that minor archives or independent initiatives will move faster in addressing needs precisely because they work within less bureaucratic frames. However, it is important to add that resources alone are not enough. That should be contemplated when providing training opportunities or creating exchange programs. HUMBERTO FARIAS DE CARVALHO:  I would say it would be really important for international partners to really profoundly understand the economic, social, and political situation of the country that they're coming into. Developed countries need to understand that we do have knowledge. Oftentimes we understand what it is that we have to do; we are aware of good practices and existing tools, but we don't have the resources. So we work by adapting, in some way, the knowledge to the possibilities that we have to work with. That is a lens that needs to be kept in mind as we translate experiences from one institution in a developed country to another with fewer resources.

5.4 XFR Collective

XFR Collective, a nonprofit organization that operates with a non-hierarchical structure, partners with artists/makers and small arts and cultural organizations, largely in the New York City region, that typically do not have dedicated caretakers for their magnetic media collections. These partnerships most often result in providing low-cost digitization for a partner's magnetic media or in events, such as public presentations and educational workshops. The digitized materials are all uploaded to the Internet Archive, a nonprofit digital library (Internet Archive 2014a). Anyone can become a core member of XFR Collective by expressing a desire to contribute to the Collective's work and committing to the schedule outlined by the group. This has typically included attending monthly all-member meetings, participating at least once a month in a transfer session, and, when interested, volunteering for subcommittees that are formed based on upcoming projects or events. To remain a member, a person must be regularly involved in activities, and thus some members may be more involved than others at any given time. Over time, the membership has shifted and expanded, bringing in people from the fields of art and archiving with different skills, backgrounds, and experiences. Those who are no longer involved on a regular basis are considered members-at-large.


Figure 5.3 XFR Collective's pop-up digitization station at MIX NYC: The New York Queer Experimental Film Festival in New York in 2015. Photo: Rachel Mattson

Before XFR Collective, there was XFR STN (pronounced “Transfer Station”), an exhibition project at the New Museum in New York in 2013. Billed as “an open-door artist-centered media archiving project,” XFR STN brought artists in direct dialogue with preservationists and made publicly visible both the equipment and the process of videotape digitization (New Museum 2012–2021). XFR STN was the brainchild of artist Alan Moore and the artist group Colab (Colab Inc. n.d.). It was organized by the New Museum's curatorial, archival, and educational staff under the direction of Johanna Burton. Artist/conservator Walter Forsberg recruited students and graduates from local archival and moving image preservation programs as workers and created and managed the transfer system. Recognizing the educational opportunities and benefits of low-cost digitization, some of the artists and preservationists who had been involved in XFR STN formed what would become XFR Collective. Some of the equipment used in the exhibition was reinstalled in artist and member Andrea Callard's loft in Lower Manhattan, and the group connected with artists and local organizations to digitize tapes. Artwork digitized at XFR STN and most of the content digitized through XFR Collective are available at the Internet Archive. The Internet Archive is the Collective's chosen repository because it stores high-quality files and makes the content widely accessible.


Since 2014, the work of XFR Collective has grown, and the artists and organizations the group works with are considered partners. XFR Collective organizes its partnerships according to three goals: digital preservation and transfer, education and research, and cultural engagement (XFR Collective n.d.). Achieving this trio of goals, or “vectors” as Callard calls them, requires the collective to take direct action—digitizing magnetic media—and to also experiment with different models of partnerships or collaborations with individuals and organizations. Digitization stations have been installed at other locations, including other members’ homes, which has expanded digitization capabilities. Members collaborate with partners to hold public events or “pop-ups,” where tape-to-file transfers are done on the spot. As with XFR STN, these activities are intended to demystify the preservation process and promote conservation actions. Through these activities, the Collective strives to give members the space and support to build their skills with reformatting and the design and troubleshooting of preservation systems. Many of the Collective’s partners do not fit neatly into only one specific goal or meet all of the goals. Over time, members have made “an effort to define different types of partnerships without limiting possibilities,” as member Marie Lascu describes them. XFR Collective “continues to float along in an organic way,” intentionally subverting the restrictions that can arise from over-formalizing, capitalizing, or streamlining for efficiency (Jimenez et al. 2020c). Preservation is seen as a humanitarian act, with wider implications and benefits to the larger media art community. 
Maintaining a non-hierarchical structure within XFR Collective has allowed members to pursue a diverse range of projects, and it provides flexibility to participate in others’ projects collaboratively, such as with the video series We Tell: Fifty Years of Participatory Media, and when the Collective was invited to participate in a workshop as part of the initiative Barbara Hammer: Evidentiary Bodies (Leslie-Lohman 2021b, Scribe n.d.b.). Again, through the Collective’s embrace and modification of XFR STN’s preservation pop-up model, they have made new connections at their events that have generated other collaborations. Since becoming a nonprofit in 2014, XFR Collective has organized pop-up transfer stations at MIX NYC: The New York Queer Experimental Film Festival, PARTICIPANT INC, Secret Project Robot, the Leslie-Lohman Museum of Art, Spectacle Theater (all in New York City), and the Indymedia 20th Anniversary Encuentro in Houston, Texas (MIX • NYC n.d., PARTICIPANT INC 2021, Secret Project Robot n.d., Leslie-Lohman 2021a, Spectacle Theater n.d., 20 IndyMedia n.d.). Many of these partnerships have taken the form of a short residency in tandem with specific exhibitions; however, some have grown to become longer collaborations. One example of the latter is an ongoing relationship with Visual AIDS. Visual AIDS is a New York City–based nonprofit arts organization that supports communities of artists living with HIV and the families, friends, and estates of artists who have passed. During the XFR STN exhibition, Visual AIDS was one of the artist organizations that signed up to have videotapes digitized, and later, when XFR Collective was formed, they became one of the group’s first partners. In July 2019, the Collective held a pop-up event called XFR WKND during Altered After, an exhibition organized by artist and curator Conrad Ventur for Visual AIDS (Visual AIDS—Past Event, Altered After, PARTICIPANT INC n.d.a.).
The exhibition was held at the nonprofit art space PARTICIPANT INC in New York (Visual AIDS—Past Event, XFR WKND, PARTICIPANT INC n.d.b.). The event came about after Ventur met XFR Collective at one of their earlier pop-up events, where he brought one of his tapes to be transferred. Kyle Croft, an art historian and curator, was an installations coordinator for MIX and helped facilitate one of the Collective’s first pop-ups there. Later, as programs director of Visual AIDS, Croft saw how XFR STN’s digitization of We Interrupt This Program/Day Without Art,

Outside the Institution

a live broadcast directed by Charles Atlas for the third Day Without Art (Atlas 1991), led to its re-examination and celebration 25 years later. The Day Without Art, or A National Day of Mourning and Action in Response to the AIDS Crisis, is held annually on December 1 and involves many artists and cultural workers (Visual AIDS—Day Without Art n.d.c.). Croft also credits XFR Collective’s pop-up at PARTICIPANT INC with revealing the tremendous need for audiovisual preservation in their community. XFR Collective members understand best practices—learned through their education and experiences—but also the realities of their partners’ limited resources. With this in mind, XFR Collective focuses not only on the physical materials of a collection but also, and perhaps more importantly, on the people and stories behind the collections. The following perspectives represent XFR Collective partners (Kyle Croft for Visual AIDS and Conrad Ventur), XFR Collective members (Andrea Callard, Caroline Gil Rodríguez, and Marie Lascu), and co-authors (Martha Singer and Mona Jimenez).

5.4.1 Origins and Motivations

ANDREA CALLARD (XFR MEMBER; ARTIST/FILMMAKER): Initially our mission was just to be able to do transfers outside of an institutional setting, not exactly commerce, kind of favors: “Oh, let’s get more of this work done because the need, as revealed by XFR STN, is huge.” I remember saying, “Well, if we work on this every week, we’ll actually get something done.” There was a certain question about whether we work with our friends or meet the public. We tried to set it up so that both could happen, and both did happen.

MARIE LASCU (XFR MEMBER; AUDIOVISUAL ARCHIVIST): I joined in 2015. A big motivator for me to join the group was to work with a video digitization rack hands-on without any pressure to be really good at it. And it was just fun because our meetings were once a week at Andrea’s apartment, and she cooked for us, and we would just hang out and start digitizing a tape. And as former member Rachel Mattson always stressed, we need to be comfortable screwing up in order to lose the fear of even attempting more technical tasks. I think that’s the hardest thing for people to learn in this work.

CAROLINE GIL RODRÍGUEZ (XFR MEMBER; INDEPENDENT MEDIA CONSERVATOR/DIRECTOR OF MEDIA COLLECTIONS AND CONSERVATION, ELECTRONIC ARTS INTERMIX): I joined in 2017. I was already in my last year of MIAP (NYU’s Moving Image Archiving and Preservation program), but I knew about XFR STN from an article I had read about the New Museum exhibit. And that actually was a big motivator for me, deciding to go to MIAP. I thought it was the coolest thing to have a public-facing exhibition on how to migrate media and pull that process out of the obscurity of video digitization labs. I was also inspired by the act of giving people the tools—giving back to the artists or creators the tools and the knowledge to understand the importance of audiovisual preservation and conservation.

CONRAD VENTUR (XFR PARTNER; ARTIST/CURATOR): I think I came in to try to have a tape transferred at an XFR Collective pop-up in Bushwick. I got really interested in super lossy video that people had uploaded to YouTube, and I ended up making installations out of those recordings for a few years. I came across Leslie Kaliades, whom I had vaguely known about, and I found a trilogy of her video work on the archive.org site (Internet Archive—Leslie Kaliades 2014b). That opened up an idea to propose an exhibition to Visual AIDS, which would become Altered After (its title taken from Kaliades’ video trilogy). At that point I’d already known about Visual AIDS—and XFR Collective and XFR STN.
    Maybe a year or two later, I had a few more of my own tapes transferred by XFR Collective. It really helped me move some of my editing forward that had been kind of congested


because I couldn’t even review the material. So I had a really good experience. In fact, I went to maybe two or three meetings at Andrea Callard’s loft just to kind of check in and talk to people and try to learn some things and maybe make some friends.

5.4.2 Projects

MARIE LASCU: For the early pop-ups, it really started out with “Someone carry a deck, someone carry a CRT monitor.” You know, the Blackmagic Design digital converter is pretty small. It can fit in my purse. So it was whatever people could carry. What was the bare minimum that we needed to digitize? And we would stick to one or two formats to try to keep it as simple as possible. It’s a very small, bare minimum version of a professional video digitization rack.

CAROLINE GIL RODRÍGUEZ: The work that we did in 2019 as part of the exhibition Arch at the Leslie-Lohman Museum was really stimulating and encouraging for all of us. We had a week-long residency at the museum. We were invited to install two pop-up stations, one audio, one video. It was a major event, a residency at a real museum modeled after XFR STN. The experience of being among artists and people directly associated with the media we were migrating, intentionally holding an intimate space for them and listening to their stories as depicted on these videotapes, was truly meaningful. Having a conversation with a stranger who just walked through a museum’s door about the content on their tapes and their creative lives, and watching those moving images, those performances. It’s very emotional for us and for them. I think that was the most beautiful thing I’ve experienced in a long time.

KYLE CROFT (XFR PARTNER; ART HISTORIAN/CURATOR/PROGRAMS DIRECTOR, VISUAL AIDS): One of the nuggets that I brought here to unpack is everything that happens after the transfer. I have been researching a live television broadcast called We Interrupt This Program that was staged for the third Day Without Art in 1991. There is a little bit of a paper trail for this, but the only visual record is a tape that XFR digitized many years ago that’s just been sitting on the Internet Archive. In the years since it was digitized, the tape has begun to spark interest in We Interrupt This Program. For example, in 2019, Professor Roderick Ferguson began teaching a class at Yale inspired by the project, examining histories of queer art and politics (Ferguson 2019). In November 2020, The Kitchen—which co-produced We Interrupt This Program with Visual AIDS—hosted a conversation about the project and published some archival materials online for further research (The Kitchen Blog and Kohl 2016). We’re also publishing a book about the history of Day Without Art at Visual AIDS, and We Interrupt This Program is going to be one of the featured projects. My point is that there has been this kind of resurgence of interest in that project, probably five or six years after the file was uploaded. Over time, people have come across the video and wanted to know more about it, even though there was no specific initiative on the part of Visual AIDS to highlight the project; it just developed into an inquiry-based project. It was: “Here’s a tape, here’s a bunch of tapes; let’s digitize them and put them online. And over time, meaning will accrue or understanding will accrue.”

5.4.3 Impact on Communities and Collections

MARIE LASCU: There’s always the battle between “best practices” versus “better than nothing.” And we have to be comfortable with the better than nothing part. And there’s real reasons for that. Asking someone to deal with 10-bit uncompressed files at home by themselves has its own set of problems. Like with CAWs, we’re trying to help people kickstart their own effort that they will hopefully continue on with, but we’re not going to hound people and say, “Have you run checksums? When’s the last time you turned on your hard drive?” And for me personally, at the end of the day, if someone got access to a file and they made a new documentary, or they were able to share home movies with their family, that’s not the worst thing. Forever is a privileged concept. All we can do is our best.

MARTHA SINGER (CONSERVATOR/DIRECTOR, MATERIAL WHISPERER): Museums are looking at one object for a life in the museum. XFR Collective is not just about the life of an artwork in a museum. You’re more about the education of the artist and helping them get to a better process in their decision-making about their workflow and preservation.

ANDREA CALLARD: Actually a recent example is an artist, Hank Linhart, who was in the Monday/Wednesday/Friday Video Club, a Colab project that distributed artists’ videotapes to consumers (Internet Archive—MWF Video Club 2014c). He had been the professor for two people in the Collective. So when I did his tape two years ago, I contacted him. He wrote to me this week asking me how to find his tape as he was trying to think about the bigger picture of his work and find it all. That was kind of an interesting loop.

CONRAD VENTUR: I think there are lots of other artists who want to learn about what they’re working with and especially artists who work with found footage or they’re drawing from work they did in the past, whether it’s their own or other artists’. I think sometimes, when I hear about preservation, it just sort of seems like there’s this thing that got finished. It’s in a box under somebody’s bed, and it needs to be saved.
KYLE CROFT: Something that resonates with both Visual AIDS and XFR is thinking about caretaking or stewardship in ways that are more expansive than just preservation. A lot of times when someone comes to Visual AIDS and they want to put something in our archive, it’s also that they want to come and talk about someone that they’ve lost. It’s about the artwork, but it’s not just about the object. Sometimes a way of caring for something is not just about not having a file degrade, but rather it’s about getting people to talk about it in public, creating conversations. This conversation is making me think about everything that XFR Collective is about beyond just the file or the tape itself.

5.5 Conclusion

Media conservators and archivists do not work in a vacuum; those who practice caring for time-based media within institutions recognize that the majority of their work is collaborative in nature and that it relies on communication with other staff, with colleagues at other organizations, and with artists or their studios, galleries, or estates (Matters in Media Art 2015). Those in private practice are not working alone either—these conservators have close relationships with artists or artists’ representatives and can build bridges between caretakers and communities. After artworks, communication and relationships are the driving forces behind a media conservator’s work. Many of the people involved in the initiatives described in this chapter have an educational or experiential background in media preservation, art conservation, libraries, or archives; some may also work for a large and well-funded institution. Many are already invested in working in a cross-disciplinary manner, have intentionally grown their personal networks along these lines, and thus look to all fields that support audiovisual preservation and conservation for their practices. They have also joined these initiatives specifically to address the gap in care between the large amount of media art that has been created and the small amount that is ultimately acquired and preserved by institutions.


CAW, APEX, and XFR Collective are, of course, not the only initiatives closing the gap with independent artists and small organizations through hybrid collaborations. There are other small nonprofit groups, such as Bay Area Video Coalition, Moving Image Preservation of Puget Sound, and the Texas Archive of the Moving Image, as well as library initiatives like the District of Columbia Public Library’s Memory Lab, which offer media preservation services and organize local or regional events to engage with communities [Bay Area Video Coalition (BAVC) 2021; Moving Image Preservation of Puget Sound (MIPOPS) 2018; Texas Archive of the Moving Image (TAMI) n.d.; DC Public Library—The Memory Lab n.d.]. XFR STN, the catalyst behind XFR Collective, is just one example of an art institution partnering with an artist’s group; there is no limit to the ways that resourced institutions can imaginatively co-develop public programming that bridges the many parts of a local media art ecosystem. These types of initiatives empower the collection caretakers to preserve their collections, at least for a little longer, which provides more opportunities for people and communities to engage with the content again or for the first time. As XFR Collective member Marie Lascu said, “Forever is a privileged concept. All we can do is our best.”

Acknowledgments

The authors wish to express their deep appreciation to Juana Suárez for her very careful review of and revisions to section 5.3, Audiovisual Preservation Exchange (APEX). Many thanks also to Carla Marcantonio for her translation services during the APEX roundtable.

Bibliography

20 IndyMedia. “20 Years IndyMedia, 20th Anniversary Encuentro, Celebrating a Radical Media Movement.” n.d. Accessed September 20, 2021. https://indy20.dangerousmedia.org.
Association of Moving Image Archivists (AMIA). “Association of Moving Image Archivists (AMIA).” n.d. Accessed July 30, 2021. https://amianet.org/.
Association of Tribal Archives, Libraries, and Museums (ATALM). “Association of Tribal Archives, Libraries, and Museums: Sustaining and Advancing Indigenous Cultures.” n.d. Accessed September 20, 2021. www.atalm.org/.
Atlas, C. “We Interrupt This Program/Day Without Art (Video Recording).” 1991. https://archive.org/details/XFR_2013-08-29_1A_07.
A/V Geeks. “A/V Geeks Have 30,000+ 16mm Films, Will Travel!” n.d. Accessed August 4, 2021. https://avgeeks.com/.
AV Preserve (AVP). “AVP.” Accessed August 4, 2021. www.weareavp.com/.
Bay Area Video Coalition (BAVC) Media. “BAVC Media.” 2021. https://live-bavc-wp.pantheonsite.io/.
Brand, Bill. “A Self Preservation Guide for Film/Video Makers (with Toni Treadway).” n.d. http://helios.hampshire.edu/~wsbPF/downloads/Self Preservation Guide/Self Preservation Guide final edit.pdf.
Cinemateca Uruguaya. “Cinemateca Uruguaya.” n.d. Accessed August 26, 2021. https://cinemateca.org.uy/.
Colab Inc. “Collaborative Projects (Colab).” Accessed July 31, 2021. https://collaborativeprojects.wordpress.com/.
Collaborative Cataloging Japan (CCJ). “Collaborative Cataloging Japan (CCJ).” 2020. www.collabjapan.org/.
———. “Community Archiving Workshop in Tokyo 2016.” 2020. www.collabjapan.org/workshoppanel-caw-2016.
Community Archiving Workshop (CAW). “Community Archiving Workshop, the Handbook.” 2019. https://communityarchiving.org/.
———. “Training of Trainers Toolkit.” n.d. Accessed July 30, 2021. https://tot.communityarchiving.org/.


DC Public Library. “The Memory Lab, the Labs at DC Public Library.” n.d. Accessed September 10, 2021. www.dclibrary.org/labs/memorylab.
Dollman, M., and Devon Orgeron. “Oral History Interviews Celebrating the Association of Moving Image Archivists on Its 25th Anniversary (Video Recording).” November 20, 2015. www.youtube.com/watch?v=8jIQeYf2cyI.
Ferguson, Roderick. “Public Praxis.” 2019. https://publicpraxis.org/we-interrupt-this-program-themultidimensional-histories-of-queer-and-trans-politics/.
Film Foundation. “The Film Foundation.” 2021. www.film-foundation.org/.
Fundación de Arte Contemporáneo (FAC). “Fundación de Arte Contemporáneo (FAC).” 2020. www.colectivofac.com/en/.
Fundación Patrimonio Fílmico Colombiano. “Fundación Patrimonio Fílmico Colombiano.” n.d. https://patrimoniofilmico.org.co/.
Ghana Information Services Department (ISD). “INA, The Jnews Government.” 2020. https://isd.gov.gh/.
Institute of Museum and Library Services (IMLS). “Association of Moving Image Archivists, Log Number: RE-85-18-0039-18 (a).” n.d. www.imls.gov/grants/awarded/re-85-18-0039-18-0.
Internet Archive. “Internet Archive.” 2014a. https://archive.org.
———. “Leslie Kaliades.” 2014b. https://archive.org/search.php?query=creator%3A%22Leslie+Kaliades%22.
———. “MWF Video Club.” 2014c. https://archive.org/details/mwf_video_club?tab=about.
Jimenez, Mona. “Community Archiving Independent Media.” KULA: Knowledge Creation, Dissemination, and Preservation Studies 2, no. 1 (November 29, 2018): 15. https://doi.org/10.5334/kula.31.
Jimenez, Mona, et al. “Community Archiving Workshop Roundtable (Audio Recording).” 2020a. https://tisch.nyu.edu/cinema-studies/news/miap-2017-pda-seapavaa.
———. “Audiovisual Preservation Exchange Roundtable (Audio Recording).” 2020b. https://tisch.nyu.edu/cinema-studies/miap/research-outreach/apex.
———. “XFR Collective Roundtable (Audio Recording).” 2020c. https://xfrcollective.wordpress.com/events/.
The Kitchen Blog, and S.R. Kohl. “From the Archives: We Interrupt This Program.” The Kitchen Blog, October 26, 2016. https://thekitchen.org/blog/62.
La Red de Cinema Doméstico. “La Red de Cinema Doméstico.” n.d. Accessed August 3, 2021. http://lareddelcinedomestico.com/.
Leslie-Lohman Museum of Art. “Leslie-Lohman Museum of Art.” 2021a. www.leslielohman.org/.
———. “Barbara Hammer: Evidentiary Bodies.” 2021b. www.leslielohman.org/exhibitions/barbara-hammer-evidentiary-bodies.
Matters in Media Art. “Matters in Media Art: Guidelines for the Care of Media Artworks.” 2015. http://mattersinmediaart.org/.
MIX • NYC. “MIX Festival (MIX NYC).” n.d. Accessed September 20, 2021. www.mixnyc.org/home-page/.
Moving Image Archiving and Preservation Program (MIAP). “APEX 2013: Bogotá.” n.d.a. https://apexbogota.wordpress.com/.
———. “APEX 2014: Montevideo.” n.d.b. https://apexmontevideo.wordpress.com/.
———. “APEX 2014: Montevideo, FAC & Archive Jam Improvisation (AJI).” n.d.c. https://apexmontevideo.wordpress.com/fac-archive-jam-improvisation/.
Moving Image Preservation of Puget Sound (MIPOPS). “Moving Image Preservation of Puget Sound.” 2018. www.mipops.org.
Museo del Cine Pablo Ducrós Hicken. “Museo del Cine Pablo Ducrós Hicken.” Accessed September 19, 2021. www.buenosaires.gob.ar/museos/museo-del-cine-pablo-ducros-hicken.
National Endowment for the Humanities (NEH). “National Endowment for the Humanities Funded Project Query Form.” n.d. Accessed July 30, 2021. https://securegrants.neh.gov/publicquery/main.aspx?f=1&gn=PE-268832-20.
New Museum. “Past, XFR STN, 07/17/13–09/08/13.” 2012–2021. www.newmuseum.org/exhibitions/view/xfr-stn.
NYU (New York University) Tisch. Moving Image Archiving and Preservation M.A. New York University, Tisch School of the Arts, n.d.a. https://tisch.nyu.edu/cinema-studies/miap.
———. Cinema Studies, Audiovisual Preservation Exchange (APEX). New York University, Tisch School of the Arts, n.d.b. https://tisch.nyu.edu/cinema-studies/miap.


Orphan Film Symposium. “Orphan Film Symposium.” Accessed August 3, 2021. https://wp.nyu.edu/orphanfilm/.
PARTICIPANT INC. “PARTICIPANT INC.” 2021. http://participantinc.org/.
Scribe Video Center. “Scribe Video Center.” n.d.a. www.scribe.org/.
———. “We Tell.” n.d.b. www.scribe.org/wetell.
Secret Project Robot. “Secret Project Robot.” Accessed September 20, 2021. www.secretprojectrobot.org/.
SODRE. “SODRE, Archivo Nacional de la Imagen y la Palabra.” n.d. Accessed August 26, 2021. https://sodre.gub.uy/anip/.
Spectacle Theater. “Spectacle.” Accessed September 20, 2021. www.spectacletheater.com/.
Texas Archive of the Moving Image (TAMI). “Texas Archive of the Moving Image.” n.d. Accessed September 20, 2021. https://texasarchive.org.
University of Ghana. “J.H. Kwabena Nketia Archives.” Accessed September 20, 2021. https://ias.ug.edu.gh/content/jh-kwabena-nketia-archives.
Visual AIDS. “Past Event, Altered After, PARTICIPANT INC.” n.d.a. https://visualaids.org/events/detail/altered-after.
———. “Past Event, XFR WKND, PARTICIPANT INC.” n.d.b. https://visualaids.org/events/detail/xfr-wknd.
———. “Day Without Art.” n.d.c. https://visualaids.org/projects/day-without-art.
Vivancos, Salvi. “Memorias Celuloides.” Facebook. Accessed August 3, 2021. www.facebook.com/MemoriasCeluloides/.
Wikipedia. “Colección Engelman-Ost.” 2021. https://es.wikipedia.org/wiki/Colecci%C3%B3n_Engelman-Ost.
———. “Respect des fonds.” 2020. https://en.wikipedia.org/wiki/Respect_des_fonds.
XFR Collective. “XFR Collective.” n.d. Accessed July 30, 2021. https://xfrcollective.wordpress.com/.


6 THE ROLE OF ADVOCACY IN MEDIA CONSERVATION

Jim Coddington

Editors’ Notes: Jim Coddington was the chief conservator at the Museum of Modern Art (MoMA) in New York City from 1996 to 2016, where he launched MoMA’s media conservation program with a collection survey in 2005 and dedicated media conservation staff in 2007. He initiated MoMA’s major Media Conservation Initiative, a five-year project (2017–22) funded by the Andrew W. Mellon Foundation, which allowed MoMA to host several postgraduate fellowships, workshops, and expert discussion meetings to advance the field of media conservation. In this short and encouraging chapter, Coddington highlights the important role of advocacy in promoting and enabling media conservation in institutional settings. He offers strategic advice on building narratives and raising awareness among decision-makers so that underserved collection needs can be successfully addressed.

6.1 Introduction

We all know what advocacy is; we are all advocates. In our daily lives, we advocate, each in our own distinctive voice, for ourselves and for others close to us. This chapter simply places the skills and tools we all already have in a different context, adding some perspective so that they can become effective in our professional lives while still being used in one’s own voice. The chapter’s goal is to look at how to define and address a collection need like time-based media (TBM) conservation. It will do so by asking general questions of the advocate in order to establish priorities both for the institution and collection they work for and, more specifically, within the TBM preservation project they are proposing. The most general questions will focus on institutional structure and how TBM collecting and preserving fit into it. From there, the specifics of the collection and its needs will be queried. Emphasis will be placed throughout on clearly articulating the project plan and the goals that progress will be measured against. At the same time, the importance of placing this in the context of broader institutional priorities will be stressed.

DOI: 10.4324/9781003034865-7



6.2 Why Is It a Problem?

The first question to ask is how this issue became a priority for you. While broad generalities like preserving the present for the future are underpinnings, they are probably not enough to make a special case for this or any other particular material in your collection. Specific cases of TBM works that you have direct experience with can illustrate the broad problem, show how you came to understand it, and thus provide a basis for your broad case. For instance, is a work you are striving to preserve simply not working? How then do you isolate what is not functioning? Perhaps you find that it is obsolescent technology that can or cannot be replaced. Or perhaps it is missing technology that you then need to find. Perhaps that missing technology is, in fact, a mismatch between the hardware you have and the software that is the work of art. Perhaps in resolving some of these questions, you would like to turn to the artist, but the artist does not have the answer for whatever reason. The problems can be complex but can, in their broad outlines, be excellent examples of the fundamental technical and theoretical issues of TBM conservation. What were the difficulties you confronted with these TBM works that made you think about their longer-term preservation? At the same time, one or perhaps a few of these cases might be stories around which you can build a narrative with an immediacy that big ideas and plans have trouble communicating. So as you ask yourself how this became a priority for you, make a note of these illustrative works and return to them as you work through the other questions to see if they can serve as a story for the collection as a whole.

6.3 What Is the Problem?

As before, the key question you should ask is of yourself: why can I, or the rest of the team, not solve this problem in the normal course of our work? Is it a lack of knowledge or specific skills? Is it the scale of the collection, whether due to past collecting or the rate of current collecting? Was it part of a short-term initiative where there was inadequate time? As you answer such questions, keep in mind how each instance you think through is reflective of your TBM collection as a whole. This will later become part of prioritizing the steps in the plan that you develop.

6.4 You Have a Problem—So What?

While it may seem self-evident to you, the need for investing in TBM is not self-evident to everyone. The simple truth that you must always keep in mind is that resources in every institution are scarce, certainly for new initiatives that have long-term costs. Everyone in your institution could use more resources to do their job better, just as you can see what is necessary for you and TBM conservation. Thus, as you answer the first two questions, you should be constantly asking yourself how someone else will see the need that you see.

6.5 Make the Problem a Joint Priority

Preservation of collections has increasingly become a museum-wide responsibility, not just the responsibility of the conservation department. Security, operations, and art handling are just a few of the departments within a museum that have direct responsibility for the safety, and thus the preservation, of the collection. With TBM, this base of support should become


even larger, as at least some of the skills and knowledge needed to exhibit TBM, if not also to preserve it, will be called upon from other departments. So those in your institution charged with technology, computers, audiovisual, and the like will be natural partners, alongside your longer-established partners, in any long-term preservation plan. It is no accident that in recent years almost every institution that has embarked on a coordinated TBM conservation program has initiated, very early in that development, something like a TBM working group (see Chapters 4 and 10). Each is a little different in composition, as every institution is structured differently, but such groups are most effective when they include everyone who touches, literally or figuratively, the TBM work as it moves from acquisition to collection registration, to installation and display, to documentation, to storage, to lending, and so on. This group can be central to articulating the scale of your project, as each of these departments has a valuable perspective on how TBM affects and changes their work. Most importantly, you have others within your institution who are now working toward this goal with you, who are part of solving not just your problem but your institution’s problem of caring for TBM. Of course, many institutions also hire outside help for the presentation and preservation of their TBM works. While these contractors may not be part of an internal working group, what they do, and whom they work with when doing it, should be part of your analysis as you articulate the scope of what you plan to do. As an example, let us look at some of the evolution of TBM conservation at MoMA (Museum of Modern Art, New York City). The initial collection survey led to the need to find outside sources to digitize analog material, which led to finding ways to store such digitized material as well as born-digital art.
As digitization increased, storage requirements grew, leading to the need for a server to hold the files. This in turn required coordination with IT, both to define the need and to meet it within the framework of existing security and access policies, not to mention budgets. At the same time, it was necessary to develop systems for tracking the artwork in its entirety, which involved curatorial and registrar staff in either modifying existing systems or creating new ones. This is just a quick and by no means complete outline of how the care of TBM works grows and comes to involve both old and new partners within and outside of your institution.

6.6 Turn the Problem into a Project

One of the best ways to understand what you will ultimately need for a TBM preservation strategy is to undertake one or two small “pilot projects.” The goal of such projects is not only to better understand the complexity of the works of art themselves but also to grasp and articulate the scale of the problem in both day-to-day restoration work and long-term planning. One common pilot project is the collection survey (see Chapters 4 and 5). The long history of collection surveys as tools for collection preservation is well established, and that experience is readily adapted to TBM collections. Alternatively, identifying a work or works by one artist can be the basis for illustrating how TBM is different from other materials in your collection while at the same time showing that caring for it can be done successfully. Circling back to your original questions about how this became a priority for you can be useful in getting more specific about your project. Will your project emphasize the development of skills and knowledge within your institution? How? If you undertook pilot projects, what did you learn? You might compare what you thought you would learn with what you did, in fact, learn, both in specifics and in the management of the project, all of which could well be important in setting realistic goals for your larger project.

Jim Coddington

6.7 Articulate the Project

It might be good to think of articulating the project as running on two parallel tracks. The first is the fully detailed “deep dive,” and the second is the two-minute summation of what it is and why it is important. The deep dive is going to serve as your guide throughout your project. As such, it should incorporate answers to each of the questions you have been asking of yourself and your larger working group, should you have one. Which skills and knowledge are necessary to do this? How much of this exists already? How much will need to be developed? Will some of that need to be outsourced? If so, how much will it cost on an annual basis? There is no industry standard for care, such as temperature and humidity set points for photographic materials, so that is all the more reason to understand your collection and institutional needs to make the case. This, in fact, gives you a great deal of flexibility to create a plan around your specific collection and institutional structure. This also leads us to a key point, and that is, if possible, to develop the project with several options for implementing it. These could include varying the amount of time to fully implement it or a range of costs for elements of your plan. As you develop this plan, it will be strengthened by getting specific thoughts and practical suggestions from the group of colleagues you have put together. As you work through developing your plan, be alert for ways in which their work might change and how to make that part of the plan. This will make the plan more effective and will also create a broader base of support for it. It will be important for you to think about and make clear how you will measure success. This is not just about making a case that can be “sold” but one that you can truly deliver on. Be clear about how you will know that your plan is on track.
Each collection and institution is so different that it is not possible to say, in any universal way, what success will be, but whatever it is, you need both to make that goal clear and to articulate the markers you will use to get there. The second track, your two-minute pitch, must communicate that plan, and all its detail, in just those two minutes. It is clear that the technical details will not be useful here and will just distract. The story of a work that would specifically benefit from each piece of your plan would be a good way to present this. Or if a specific work prompted you to push this plan forward, that could be the basis for your shorter case. Whatever you choose as you write this story, do it in clear language and, if possible, set it up as a question first that you then, by the end, answer.

6.8 Summary

This chapter is short for a reason. Advocacy should be both expansive and succinct. The details are vital, but the big picture, quickly sketched, is also vital. So as you read this chapter and find unasked questions that should have been asked in your case, use that as an exercise to ask yourself whether, in your advocacy, you too have left something out of your basic brief. Is there a question that someone reading about or listening to your plans might ask that you have not asked? Remember also that others have come this way before, so learn from their experience and ask them questions. Above all, remember that it is not hard; it is just a little different from the advocacy that you already know how to do.


PART II

Building a Workplace

7 BUILDING A TIME-BASED MEDIA CONSERVATION LAB: A SURVEY AND PRACTICAL GUIDE, FROM MINIMUM REQUIREMENTS TO DREAM LAB Kate Lewis

Editors’ Notes: This chapter reflects the cumulative experience gathered through a survey written and conducted by Kate Lewis, The Agnes Gund Chief Conservator of the David Booth Conservation Department at the Museum of Modern Art in New York City. Kate Lewis surveyed 30 practitioners running time-based media labs in a variety of countries across both institutional and private settings. Quoted texts are taken directly from the survey: Michelle Barger (SFMOMA, San Francisco Museum of Modern Art), Rebecca Barnott-Clement (Art Gallery of New South Wales, Sydney), Reinhard Bek (Bek & Frohnert LLC, New York), Richard Bloes (Whitney Museum of American Art, New York), Amy Brost (Museum of Modern Art, New York), Savannah Campbell (Whitney Museum of American Art, New York), Brian Castriota (National Galleries of Scotland, Edinburgh, and the Irish Museum of Modern Art, Dublin), Catherine Collyer (Queensland Art Gallery and Gallery of Modern Art, Brisbane), Kristin MacDonough (Art Institute of Chicago), Kirsten Dunne (National Galleries of Scotland, Edinburgh), Steve Dye (SFMOMA, San Francisco Museum of Modern Art), Kristof Efferenn (Museum Ludwig, Cologne), Jonathan Farbowitz (The Metropolitan Museum of Art, New York), Ben Fino-Radin (Small Data Industries, New York), Flaminia Fortunato (Stedelijk Museum, Amsterdam), Christine Frohnert (Bek & Frohnert LLC, New York), Martina Haidvogl (University of the Arts, Bern, and previously SFMOMA), Agathe Jarczyk (Solomon R. Guggenheim Museum, New York, and Atelier für Videokonservierung, Bern), Louise Lawson (Tate, London), Jo Ana Morfin (private practice, Mexico City), Dorcas Müller (ZKM, Karlsruhe), David Neary (Whitney Museum of American Art, New York), Arnaud Obermann (Staatsgalerie, Stuttgart), Peter Oleksik (Museum of Modern Art, New York), Joanna Phillips (Düsseldorf Conservation Center), Alysha Redston (National Gallery of Australia, Canberra), David Smith (M+ Museum, Hong Kong), Andreas Weisser (Doerner Institute/restaumedia, Munich), Aga Wielocha (M+ Museum, Hong Kong), and Gaby Wijers (LIMA, Amsterdam).

DOI: 10.4324/9781003034865-9



7.1 Introduction: TBM Lab Fundamentals

This chapter addresses the scope, space, and technical infrastructure needed to conduct time-based media (TBM) conservation. What are the minimum requirements and the nice-to-haves? What are the basic and advanced features that a functional and efficient media conservation lab should offer? What compromises can be made if the budget is too small to build the ideal lab? How does one lay the groundwork for gradual growth and infrastructural expansion? This chapter also provides an overview and discussion of existing media conservation labs both outside of and within institutions. It is complemented by Appendix 7.1, “Lab Equipment Lists,” which may serve as a starting point for planning and budgeting one’s own media conservation lab.

TBM labs have been slowly growing in number; the first known lab to be established was at the Stedelijk Museum (Amsterdam) in 1976, followed by a handful of pioneering labs in the 1990s and early 2000s, including labs at Montevideo in Amsterdam (now known as LIMA) in 1991 and at the Tate in London in 1996. At the turn of the 21st century, labs were established at Restaumedia in Munich (2003), the ZKM Laboratory for Antiquated Video Systems in Karlsruhe (2004), and the Atelier für Videokonservierung in Bern, Switzerland (2008). In the decade from 2010 to 2020, the number of TBM labs around the world more than tripled with the arrival of dedicated private labs, such as Bek & Frohnert LLC (2012) and Small Data Industries (2017), both in New York, and institutions with a range of facilities, from a single workstation to a full lab with the ability to test display equipment and accommodate test installations, along with a handful of purpose-built “dream labs.” Institutions in the United States that built labs during the second decade of this century include the Solomon R. Guggenheim Museum (2010), The Museum of Modern Art (2011 and 2019), and the Whitney Museum of American Art (2015) in New York City, as well as the San Francisco Museum of Modern Art (2011) and the Art Institute of Chicago (2018). In Europe, media labs were established at the Staatsgalerie in Stuttgart (2014), the National Galleries of Scotland in Edinburgh (2017), the Irish Museum of Modern Art in Dublin (2019), and the Museum Ludwig in Köln (2019). The Art Gallery of New South Wales in Sydney established a lab in 2016, and a lab was built at the M+ Museum in Hong Kong in 2020. The newest labs included in this survey are at The Metropolitan Museum of Art in New York and the Düsseldorf Conservation Center in Germany, both established in 2021.

To practice TBM conservation, a lab space has to accommodate a broad spectrum of activities (AIC Wiki 2020). With little written on this topic, a survey was conducted of 30 TBM practitioners in a variety of countries across both institutional and private settings. It is their cumulative experience, recommendations, and hindsight that underpin this chapter. It is commonly observed that TBM works do not fully exist in storage or even at the point of acquisition from an artist or gallery, arriving as they often do on an external hard drive or in a 16 mm film can. A lab needs equipment, tools, and technical infrastructure, but of equal importance is the space to check display equipment and run test installations with the various constellations of media, which coalesce to bring the artwork to life, so to speak. This might, for example, involve a work that runs on arrays of 18 × 35 mm slide projectors, a synchronized projected three-channel video with stereo sound, interactive net art, a 5.1 surround sound audio work, or a 16 mm film projection with digital sound. Space, or access to space, is key.
At the Stedelijk Museum, Flaminia Fortunato describes their lab, which incorporates these fundamental aspects: “The lab is divided into three areas. The first is equipped with two working stations for intake, condition assessment, documentation, restoration and analog to digital transfers. The second area is dedicated to installing/mocking-up/testing media works


entering the collection or going on display or loans. Artists are often invited to our lab to install their works with us and to talk about their production process. The third area is dedicated to resources, books, and technical equipment manuals.” This aspect of test installations, affording the opportunity to work with artists, highlights the cross-disciplinary and cross-departmental approach that is now recognized as integral to the stewardship of TBM works (Phillips 2015).

7.2 Space and Requirements

The surveyed practitioners and existing conservation literature on lab planning, design, needs, and construction, including the publication Planning and Constructing Book and Paper Conservation Laboratories: A Guidebook (Hain and Alstrom 2012), highlight the following aspects as key considerations for local context-based planning of a TBM lab:

• collection characteristics, needs, and staffing
• location, physical access to the space, security, size of space, floorplan, future extendibility
• flexibility (e.g., rolling furniture), efficiency, furniture, electrical power requirements, network/data infrastructure, lab safety, lighting, sound considerations (e.g., insulation and absorption)
• ability to perform test installations; ability to view, assess, treat, prepare for exhibition, and store both analog and digital media (collection dependent)
• ability to view, assess, treat, and run physical display equipment; storage (temporary and/or permanent for digital and physical materials)
• space for collaboration with artists/clients/curators/exhibition designers/media and audiovisual technicians/technical consultants/students
• education, training, and resources, such as books, technical manuals, and paper-based documentation

Surveyed participants explained that these considerations were then weighed against the practicalities of location, size limitations, staffing, and budget. For a private lab, a particular approach might be taken: Ben Fino-Radin at Small Data Industries described the requirements as being “100% dictated by client needs,” and the main design consideration was “data throughput (i.e., 10 GbE) and power load,” whereas Jo Ana Morfin noted that the main activities in her lab include “organizing and hosting symposia, events and workshops regarding the care of TBM works.” Private conservators Reinhard Bek and Christine Frohnert (of Bek & Frohnert LLC) observed, “In responding to our clients’ needs, the lab evolved naturally over the years. We are increasingly processing digital born works for acquisition. When it comes to magnetic and exotic media, we collaborate with specialized labs. It made sense in retrospect to build the lab slowly over time with the occurring needs.” For collecting institutions, the first step is often a survey of the whole collection and how it is used (e.g., exhibition and/or research access) to help inform decisions related to space, infrastructure, and staffing (see Chapter 4). Arnaud Obermann at the Staatsgalerie commented, “In order to be able to work efficiently and therefore to acquire the necessary equipment, an examination of the collection was essential at the beginning. I didn’t have


an influence on the size or the location of the lab (which is situated on the 3rd floor), but the electrical wiring was done according to my specifications (mainly to prevent ground loops).” Among institutions with existing traditional conservation labs, a recurring narrative charts gradual progress from the initial establishment of a digital workstation to expansion following advocacy and building on strategic opportunities. Although in one sense a compromise, a gradual approach has proven successful at SFMOMA (San Francisco Museum of Modern Art), MoMA (The Museum of Modern Art), and the Guggenheim (Solomon R. Guggenheim Museum) (Phillips 2015). Brian Castriota, a freelance TBM conservator for the NGS (National Galleries of Scotland) and IMMA (Irish Museum of Modern Art), sums up the reality of how most institutional labs have started: “Risk was the primary determining factor in terms of how I began arguing for and building lab infrastructure at both NGS and IMMA. The most pressing risks at both institutions were initially those posed by the fact that digital holdings were not backed up, held only on ageing, artist-supplied drives that were never intended for long-term storage. These works were therefore at imminent risk of partial or total loss, and were not readily viewable by museum staff or researchers, meaning that these institutions were failing in their obligations to make collection works accessible to the public, now and in the future. Mitigating these risks necessitated, at bare minimum, a computer, monitor, write blocker, RAID and server storage space. 
These were established at NGS in 2017 and at IMMA in 2019.” Such compromises are not ideal, but when constrained by limited space for mocking up artworks, TBM conservators have had to think creatively: larger spaces such as paintings conservators’ spray booths have been borrowed (at the Guggenheim and MoMA), and Kristof Efferenn at the Museum Ludwig in Cologne negotiates temporarily unused space in the galleries between exhibitions: “When I am dealing with larger art installations I have to find another workshop or even a spot in the galleries to be able to install the installation with all parts. I guess space will always be an issue.” The majority of TBM labs today achieve the combination of technical infrastructure and tools with the ability to simulate artwork installations in one modest space laid out with a flexible design. Agathe Jarczyk of Atelier für Videokonservierung, Bern, explains, “Our goal was to have a good space to watch and listen to video, so we chose a spacious room without direct sunlight, dimmable lighting and grey and white surfaces. We have fixed workstations and mobile workstations to facilitate the possibility of working on several projects at a time, and a lot of trolleys! Space limitations require compromise. Our ‘staging area’ is always a temporary mockup that cannot stay for more than a couple of days as we have other colleagues in the space.” During her time at the Guggenheim, Joanna Phillips reflected, “as a first-generation TBM conservator there, I had to develop my job profile and identify lab infrastructure needs in real time as there was no example to lean on. We


Figure 7.1 Christine Frohnert performing quality control of a video artwork in the media lab at Bek & Frohnert, LLC, New York. Photo: Reinhard Bek

Figure 7.2 David Smith performing treatment in the TBM Conservation Lab at M+ Museum, Hong Kong. Photo: Aga Wielocha



were the first TBM lab in a US museum, and so we were making it up as we went along.” She describes advocating for requirements “through steady demand, gradual growth, and folding lab infrastructure investments into exhibition projects. Creating internal and external/public attention around TBM activities at the Guggenheim Conservation Department (exhibition tours, organizing conferences at the Guggenheim, lab visits from colleagues, and entertaining potential donors) supported these efforts.” Reflecting on what they would do differently, Louise Lawson and the Tate TBM team observed that if they were designing a TBM lab today, they would specify “spaces that can be multi-functional, i.e. divided up and used at different times in different ways; lighting that can be split thereby allowing different areas within a space to be selectively lit; and flexible space for setting up, checking and testing works.”

7.3 Purpose-Built Labs

At the M+ Museum, the TBM conservation studio was designed in 2017, before TBM conservators came on staff in 2020. Aga Wielocha and David Smith noted, “The space, location and electrical layout were specified by the architects based on a requirement from M+. It works relatively well as it allows us space to test installation works or project video while also having access to workstations for ingestion or processing.” Despite the solid design and well-considered technical infrastructure, they added, “The initial layout of the lab by the architect was quite rigid (fixed desks etc.). This was changed early on to movable desks and one large custom desk for the edit suite. This has allowed us to do a lot more work with the physical components of installation works that have a TBM element rather than just digital works via workstations.”

In 1992, Team Media was established at SFMOMA: an internal collaborative working group of experts from various departments throughout the museum dealing with media arts (see Chapter 10). Martina Haidvogl and Steve Dye reflected on the history of the TBM lab (Haidvogl 2015): “The media conservation lab grew organically over the years. It started with a small video viewing station within the Conservation Lab in 2011 that allowed us to quality control digital files as well as videotapes. . . . Then, with the museum’s building extension slated to open in 2016, a more ambitious space was planned: a series of shared rooms in the museum’s basement that would serve both the Conservation as well as the Collections Management Department.”



Martina Haidvogl said, “This has been a game-changer for both the space we’ve had and the opportunities it provided us with. It was especially the Black Box [test installation space] that saw continuous use, and more than once we held Study Days or side-by-side presentations for comparison with the larger team.” Steve Dye added: “Even as our use of these workspaces evolves, they have fundamentally improved our capacity to do this work. The Black Box continues to be an exceptionally useful space. It is a relatively large open space. It also has the advantage of having infrastructure (electric power, data, lighting, fixing points in the ceiling) that mirrors conditions in our galleries. As an open space, it does take discipline to keep it clear, to stay on top of projects and to clear the space when those projects are completed.” On the approach to pooling resources and creating shared spaces among conservators, media technicians, and photographers, Steve Dye said, “We (I give credit to Jill Sterrett for this) recognized that we could not individually end up with the appropriate spaces we needed, but if we were able to work out how we might share some workspaces, there was a greater chance we would succeed in getting the work areas that we need. This deep examination by each stakeholder, to define what our specific needs are, and to work with each other to define how we might build spaces that reinforce how we work together, was in itself a very productive process.” This institutional pooling of resources with colleagues in the Media and Audiovisual departments was also mentioned by Rebecca Barnott-Clement at the AGNSW (Art Gallery of New South Wales): “Our current lab is in a space which was not designed for this purpose; it has windows with natural light and is in a comparatively open-plan area shared with other conservation staff.
The decision to locate the lab here was one of compromise (balancing network capabilities and playback functionality with the needs of other conservation staff), and working in such a space continues to require a significant level of compromise (using headphones rather than playing audio aloud while other staff are undertaking stressful conservation treatments, for example). I think in retrospect it would have made more sense to position the TBA (TBA or time-based art is the acronym in common use at AGNSW) lab closer to the workspaces of our AV department, or even together in a shared space. Both the AV and TBM Conservation departments have very similar needs in terms of a workspace (a dark room without natural light for viewing video, somewhere enclosed and with acoustic baffling for audio playback etc.). Both departments also require very similar toolkits in terms of software and hardware when handling and viewing artworks, so pooling available funds to create a shared bank of resources seems sensible.”



In 2011, MoMA established two TBM workstations in an off-site collection storage space: one for digitizing tape-based video and a second for digital materials. In 1996, Pip Laurenson established the Tate TBM lab, also located in a collection storage area, where it remains today, having expanded in footprint and technical capabilities over the last 25 years. With the 2019 MoMA expansion, the conservation department was able to realize a purpose-built “dream” TBM lab adjacent to the existing object-based conservation spaces, thanks to years of advocacy by then-Chief Conservator Jim Coddington (see Chapter 6). Working with the firm Samuel Anderson Architects (New York), MoMA’s Media Conservation team designed four discrete spaces, each incorporating appropriate power loads and network data throughput. The first is an office space with sit/stand digital stations. The second is a clean workroom for ingest, disk imaging, a film bench, and temporary media storage. The third is a viewing room, built in collaboration with video engineer and TBM specialist Maurice Schechter (New York), fitted with acoustic isolation and insulation and with the capability to digitize media, play 5.1 surround sound, and view media across a variety of display devices. The fourth room was inspired by SFMOMA’s shared Black Box, incorporating lighting, walls, and flooring matching the museum’s galleries, where both TBM conservators and the Audiovisual team could run display equipment and test and install artworks. These purpose-built, fully equipped institutional labs are currently exceptions in the field. When presented with the opportunity to build a TBM lab from scratch, Steve Dye of SFMOMA offered the following advice: “Be sure you are at every meeting. There was quite a long process with the architects, prior to, and during construction. The architects are representing you when they are working with the structural engineers/contractors/vendors who are doing the actual construction.
Every construction project encounters discrepancies in the field regarding how a design objective may or may not be accomplished. There ends up being many, many meetings, conversations, and much brainstorming on how to fit this piece, or make that adjustment. On more occasions than I could count, my presence at a meeting helped to ensure that previous decisions remained incorporated in the final project. It is also important to be sure that the changes that do happen are informed decisions.”

7.4 Medium-Specific Overview

7.4.1 Digital Workstation

Not every lab will need to treat all media encompassed within the TBM discipline. Today the minimum requirement common to all labs is a digital workstation. This technical infrastructure of both hardware and software provides the fundamental environment required for safely transferring media off carriers, file examination, playback for viewing assessment/quality control (QC)/condition checking, disk imaging, exhibition preparation, cataloging, documentation, and transfer to digital storage. The following outlines the key elements; see Appendix 7.1 for sample equipment and tool lists.

The hardware typically includes the following:

• Computer with sufficient processing speed or CPU (central processing unit), sufficient storage to work with large files, and the ability to send and receive data over a network connection. For the access, viewing, and treatment of software-based art, a well-equipped lab will likely have a range of different computer environments available, such as a Mac running macOS, a PC running Windows, and a PC running Linux (Olsen and Woods 2018).
• RAID (redundant array of independent disks), which combines multiple physical disk drives into a single logical unit for data redundancy and/or improved speed and performance, both important when working with media files.
• A minimum of one monitor, usually an HD/2K/4K reference monitor, and, if dealing with standard-definition video, a professional-grade cathode ray tube (CRT) monitor.
• Headphones and/or a pair of professional speakers for stereo sound.
• Hardware write blockers that allow the acquisition and examination of digital files without compromising or altering the integrity of the data.

Current software setups typically include the following:

• Tools for transfer, file packaging, and fixity of digital files, such as BagIt (Kunze et al. 2018), Bagger [Library of Congress—Bagger (Software) 2018], Grabbags, and checksum generation (MD5, SHA-1, etc.) (AMIA Open Source 2021).
• File metadata extraction tools such as ExifTool (Harvey 2021), FITS (NASA 2021), MediaInfo (Pozdeev and MediaArea 2021), Invisor (Pozdeev 2021), and BitCurator (Software) (Lee et al. n.d.).
• Common tools listed in the survey for video and digitized film playback, capture, transcoding, and QC (quality control), such as VLC, QuickTime, FFmpeg [FFmpeg (Software) 2021], QC Tools [QC Tools (Software) MediaArea et al. 2020], Adobe Premiere, and DaVinci Resolve.
• Tools for digitized slides, font, and graphic-based works, including Adobe Photoshop and Illustrator.
• Disk imaging or forensic acquisition tools, such as FTK Imager [FTK Imager (Software) AccessData 2012], Libewf [Libewf (Software) n.d.], and BitCurator (Software) (Lee et al. n.d.).

Chapters 13 and 14 in this book provide further information on the tools and practices used for disk imaging and digital preservation more generally. Further medium-specific information on quality and condition assessment practices and their requirements for lab infrastructure can be found in Chapter 18 for video-based art, Chapter 19 for sound-based art, Chapter 20 for film-based art, Chapter 21 for slide-based art, and Chapter 22 for software-based art.
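Format identification tools such as MediaInfo and FITS rely, at the simplest level, on container signatures ("magic numbers") at known byte offsets; registries such as PRONOM catalog these exhaustively. The toy sketch below illustrates the principle only, with a handful of standard signatures and none of the depth of the real tools:

```python
# Toy signature-based format identification. The magic numbers below are
# standard container signatures; real identification tools add versioning,
# validation, and full signature registries, all out of scope here.
SIGNATURES = [
    ("QuickTime/MP4 container", 4, b"ftyp"),              # 'ftyp' box at byte 4
    ("RIFF container (WAV/AVI)", 0, b"RIFF"),             # RIFF chunk header
    ("Matroska/WebM container", 0, b"\x1a\x45\xdf\xa3"),  # EBML header
    ("TIFF image (little-endian)", 0, b"II*\x00"),
]

def identify(header: bytes) -> str:
    """Match the first bytes of a file against known container signatures."""
    for description, offset, magic in SIGNATURES:
        if header[offset:offset + len(magic)] == magic:
            return description
    return "unknown"
```

A sketch like this can sort an intake folder into rough format groups, but preservation-grade identification should always go to the dedicated tools.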

7.4.2 Tape-Based Media Workstation

The digital workstation is foundational to the activity of videotape digitization. For capture software, Blackmagic Media Express and older versions of Final Cut Pro are used in addition to Adobe Premiere. Alongside the range of cables and connectors, additional hardware includes the following (see Chapter 18 and Blewer 2016):

• Analog-to-digital converter, or A/D, that converts an analog signal into a digital signal (e.g., Blackmagic and AJA products).
• Videotape recorder (VTR) decks, which will be format-dependent on collection/artwork needs; ideally, a professional-grade deck is preferable (e.g., the Sony DVW-A500 digital VCR for analog Betacam SP and Digital Betacam [NTSC]).







• Analog tape cleaners: Andreas Weisser of Restaumedia recommends that “if tape-based workflows are part of your lab activities, I always point out that tape cleaners are mandatory. A lab without cleaners sets vintage tapes at risk!”
• Time base corrector, or TBC, for stabilizing and synchronizing an analog video signal during migration and for correcting processing errors common in analog video signals. For example, the Leitch Digital Processing System (DPS-290) component TBC is a popular model used in labs. Some TBCs come with built-in processing amplifiers, which can adjust audio and video signals. Additional components may include an audio mixer and a video distribution amplifier, which can send multiple video signals to various locations.
• CRT reference monitor (e.g., a Sony PVM-D20).
• Oscilloscopes, instruments that display information about a video signal in a scientific manner. Two such complementary tools are (1) a waveform monitor, which displays the luminance of a video signal, and (2) a vectorscope, which displays the chrominance of a video signal (The Media Conservation Initiative at the Museum of Modern Art 2018).
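The luminance and chrominance signals these two scopes display correspond, for component digital video in standard definition, to the luma and chroma planes defined by ITU-R BT.601. A minimal sketch of the conversion from normalized R'G'B', for illustration only (the coefficients are the standard BT.601 values; the function itself is not from the survey):

```python
def rgb_to_ycbcr(r: float, g: float, b: float) -> tuple[float, float, float]:
    """BT.601 components from normalized R'G'B' values in [0, 1]."""
    y = 0.299 * r + 0.587 * g + 0.114 * b  # luma: the trace on a waveform monitor
    cb = (b - y) / 1.772                   # blue-difference chroma
    cr = (r - y) / 1.402                   # red-difference chroma
    return y, cb, cr                       # Y' in [0, 1]; Cb, Cr in [-0.5, 0.5]
```

Pure white (1, 1, 1) yields Y' = 1 with zero chroma, which is why a monochrome test signal draws a dot at the center of a vectorscope, while saturated color bars land on the graticule targets around it.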

7.4.3 Film and Slide Tools

The following tools are commonly used for physical viewing, inspecting, performing minor repairs, preparing prints and slides for exhibition, and preparing materials for storage (see Chapters 20 and 21):

• Motion picture film bench and rewinds.
• Light box to visually inspect film and slides.
• Film splicer, film ruler, split reels, and film cores.
• Loupes and cotton gloves (National Film Preservation Foundation 2004).

7.4.4 Display Equipment and General Tools

• Screwdrivers of all different types, signal generator, vacuum cleaners, soldering kit, carts or trolleys, and cleaning supplies.
• ESD (electrostatic discharge) anti-static table mat and wrist strap; voltmeter.
• Audio, video, and still-image recording capabilities for documentation and conservation interviews; surveyed conservators frequently listed iPhones as a key tool.
• Selection of projectors, monitors, speakers, cables/connectors/adapters, and media players (e.g., BrightSign).

7.5 Priorities and Advocacy

Typically, video works make up the majority of most collections (see Chapter 18). As the history of video began with tape-based formats, many of the formative TBM labs were designed with the ability to digitize analog and digital tapes—for example, LIMA in 1991, Restaumedia in 2003, the ZKM Laboratory for Antiquated Video Systems in 2004, the Atelier für Videokonservierung in 2008, and the artist-initiated spaces BAVC (Bay Area Video Coalition) in 1976 and the Standby Program Inc. in 1983. Dorcas Müller at the ZKM Laboratory for Antiquated Video Systems noted, “In 2004, the ZKM established the Laboratory for Antiquated Video Systems. In the realm of a circle of media pioneers like Peter Weibel and ZKM artists in residence like


the Vasulkas or Ira Schneider, their video archives were the first ones to be digitized in uncompressed 10-bit PAL/NTSC and stored on LTO data cartridges. The activity in the lab focuses on making restoration and digitization of original old video art stock possible, just before the tapes suffer degradation. Although in the last decade we were offered bigger space with daylight several times, we stuck to our undesirable, indisputable former analog video editing suite for a reason—no daylight (good for the machines and tapes), cold museum climate, and an independent data connection drilled right through the wall. In this case politics fit perfectly with the technical considerations.” In labs established over the last decade, two different approaches have evolved. With audio, video, and film production now primarily digital, many collections consist mostly of digital materials, with tape-based formats accounting for just a small percentage. In these cases, conservators have decided to focus on digital capabilities.
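The storage implications of uncompressed digitization, as in the ZKM example above, are easy to estimate. A back-of-the-envelope sketch, using assumed figures (PAL at 720 × 576, 25 fps, 4:2:2 sampling at 10 bits per sample) and ignoring audio, container overhead, and the word-padding of packings like v210:

```python
def uncompressed_rate(width: int, height: int, fps: float,
                      bits_per_sample: int = 10,
                      samples_per_pixel: float = 2.0) -> float:
    """Raw video data rate in bits per second (4:2:2 = 2 samples per pixel)."""
    return width * height * samples_per_pixel * bits_per_sample * fps

pal_bps = uncompressed_rate(720, 576, 25)
print(f"{pal_bps / 1e6:.1f} Mbit/s")                  # -> 207.4 Mbit/s
print(f"{pal_bps * 3600 / 8 / 1e9:.1f} GB per hour")  # -> 93.3 GB per hour
```

At roughly 93 GB per tape hour before overhead, even a modest video archive quickly motivates the server storage and LTO data cartridges the practitioners describe.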
Brian Castriota observed, “There seems to be consensus now that, when faced with limited financial resources, in-house digitization capabilities are not the highest priority anymore; almost all new moving image material is born-digital, and analog material should be viewed and/or digitised on equipment that is regularly maintained by video engineers or technicians.” While establishing the lab at The Metropolitan Museum of Art in 2021, Jonathan Farbowitz echoed the response of a number of surveyed TBM conservators: “After looking at the collection and speaking with multiple colleagues, I decided against putting digitization racks in the lab as they would not be used very often.” He sums up his process as “analyzing items in the collection carefully in terms of the media formats present and their status and anticipating the immediate and most pressing needs instead of designing and specifying equipment based on an idea about what a TBM lab should look like.”

When in-house staff oversee collections with significant digitization needs, or when labs serve regional needs, the capacity to digitize is incorporated into lab designs, such as at freelance conservator Jo Ana Morfin’s lab in 2014, the AIC (Art Institute of Chicago) in 2018, the Museum Ludwig in Cologne in 2019, and the Düsseldorf Conservation Center in 2021.

Ongoing advocacy for funding, space, and institutional collaboration has proved key to Kirsten Dunne at NGS (National Galleries of Scotland), who states, “Advocacy is an ongoing process but now that I have the attention of senior management, I am working on explaining process lines for TBM works and associated lifecycle budget needs so that I can get the budgets I need access to, to move forward. . . . Prioritisation of needs is based on the collection and a risk assessment at the component level, as that has helped us see what is most critical; this is what we are dealing with first.
This year this means getting the budget to get external vendors to digitise tape and film-based material.” David Neary and the TBM team at the Whitney Museum of American Art shared, “We prioritize and advocate based on the most pressing needs of the Whitney’s collection. If a certain media format or artwork is particularly at risk, we make recommendations that can facilitate its treatment.”

Kate Lewis

A similar observation was made by the Tate TBM team: “Advocacy is linked to the growing nature of the collection and to Tate itself, with the opening of Tate Modern [in 2000]. Also the Artist Trustees asked really good questions that supported time-based media conservation. Advocacy for the lab is ongoing, it evolves with the collection and the growing team’s needs, so the lab continually develops as a response.”

7.6 Sustaining a TBM Lab

Once a digital workstation or a full TBM lab is established, everything requires upgrades and maintenance. In institutional settings, this might be achieved in concert with IT colleagues, but where resources and time are restricted, this is not always a given. Kirsten Dunne at NGS noted that “Mac-based platforms are not currently supported by our IT department (that’s something I am working on resolving with a colleague).” The majority of practitioners currently use a Mac-based computer for their digital workstation, although PCs running Windows or Linux/Ubuntu operating systems are used in established labs for disk imaging and working with software-based art.

Brian Castriota wrote, “Maintaining a TBM workstation, both in terms of hardware and software, sometimes feels like a full-time job in itself. In many ways it’s a living organism you have to care for, and it has to be constantly adapted to current needs (prompted by artworks) and technological advances. IT departments can help with some of this but often because our digital tools are so bespoke and customised, I find that I am the one having to do a lot of troubleshooting and maintenance on the software side of things. I am also indebted to the expertise of friends and colleagues around the world; I can’t emphasise enough the importance of being virtually connected with a network of other TBM conservation practitioners in this respect.”

Kristin MacDonough at the AIC (Art Institute of Chicago) takes the following approach to lab maintenance: “For analog equipment we do general cleaning in-house. If needed, we contact local audiovisual companies for service and repairs. Which company we use depends on the equipment that needs servicing. For our media rack, which holds mostly obsolete technology, we had someone come on-site to inspect and test everything, but if a machine needs more time to be serviced or repaired, it is taken off-site.
For the computer station: any changes made to the operating system are documented, so they can be reversed if needed. Anti-virus software is installed on the Mac and the PC computers.”

Adapting and incorporating newer technologies is a perennial activity for TBM labs, so flexibility is fundamental to all aspects of lab design. Steve Dye wrote of the purpose-built SFMOMA spaces: “The key, and I think by and large we were successful in this, is to design enough practical functionality into the spaces, that they can grow and develop as we continue to evolve our working processes.”


Annual budget planning is critical to support the upkeep and evolution of labs. On future planning, Agathe Jarczyk, of the Guggenheim Museum and the Atelier für Videokonservierung, noted, “We update regularly, and acquire new equipment like decks, monitors, computer equipment as well as computers. For the latter we always keep older OS and all DMGs [mountable disk images] and licenses we ever acquired. We also keep and maintain different generations of computers. Future updates will include new computers, new monitors and a shift to higher resolutions for video as well as a move to display and document VR and AR works.”

Opportunistic growth is a common theme in establishing TBM labs, yet after building two TBM conservation labs, one at the Guggenheim in 2010 and another at the Düsseldorf Conservation Center in 2021, Joanna Phillips is a strong proponent of the benefits of strategic planning: “I had the chance to build two TBM labs in my career, and I tried to learn from all the mistakes I had made the first time around. The biggest change in approach for the second lab was to think more strategically: think big, envision future demands and plan everything at once to successfully apply for grant funding.”

In summary, aim high for the dream lab with the necessary equipment, tools, and technical infrastructure, and don’t underestimate the importance of space to check display equipment and run test installations. When the dream lab is not an option, however, this chapter and the experiences of those contained within it demonstrate that there are many paths on the TBM conservation map. The philosopher Lao Tzu once said, “A journey of a thousand miles begins with a single step.” So, too, a committed TBM conservator might start with a single workstation, but with a little advocacy here and a little adaptation there, they can go far. Good luck!


Appendix 7.1 LAB EQUIPMENT LISTS COMPILED BY KATE LEWIS

This appendix collects sample equipment lists compiled from the survey of 30 TBM practitioners, who listed the equipment and tools currently in use in their TBM conservation labs. The purpose of this resource is to aid professionals in planning and establishing a media conservation lab infrastructure.

Computers, Monitors, and General

Hardware:
Computers: Mac; PC; Forensic Recovery of Evidence Device (FRED); RAID (hard drive storage); write blockers/forensic bridges (USB 2.0/3.0/SATA)
Monitors: CRT; generic LCD; LCD reference monitor (HD/2K/4K); vectorscopes; waveform monitors
General: camera; clear plastic bins for temporarily holding artwork components; electronic screwdrivers; ESD (electrostatic discharge) mat, wristband, and gloves; iPhone (camera, video, audio recorder); LTO drive; microfiber cloths; microphone for recording interviews; precision engineer screwdrivers; soldering iron; vacuum cleaner (anti-static); video camera; voltmeter

Software:
Operating systems: macOS, Windows, Linux (Ubuntu)
Tools: Atom Inspector; BagIt, grabbags, Bagger; Bash (Bourne-Again Shell) scripts; BRU PE; ExifTool; FITS utility software; GoodSync; Hex Fiend; Homebrew; HTML5; Illustrator; JavaScript; jQuery; Linear Tape File System (LTFS); MindGenius; network-attached storage (NAS) software; Photoshop; PHP; Python; Rsync; SketchUp; Tar

Video preservation

Hardware: analog tape decks; capture cards; digital media players; digital tape decks; frame synchronizer; optical drives; patchbay; signal generators; tape cleaning machines; test tapes; time base correctors; video projectors
Software: Adobe Media Encoder; Adobe Premiere; BrightAuthor; DaVinci Resolve; DV Analyzer; FFmpeg; Final Cut Pro; HandBrake; Invisor; JES Deinterlacer; Media Express; MediaInfo; MPEG Streamclip; QCTools; QuickTime; VLC

Audio preservation

Hardware: CD player; Genelec speakers; headphones; record player; subwoofer; tape player
Software: Audacity; Pro Phase; Sonic Visualiser; SoX

35 mm slide preservation

Hardware: 35 mm slide mounts; 35 mm slide projectors; light box
Software: Adobe Photoshop

Film preservation

Hardware: archival cores; film projectors; film winders (upright or horizontal); light box; splicer; split reels; tape
Software: Adobe Media Encoder; Adobe Premiere; DaVinci Resolve; FFmpeg

Software-based art preservation

Hardware: see Computers, Monitors, and General above
Software (disk imaging): AnyToISO; BitCurator; dd; Disk Utility; FTK Imager; KryoFlux; libewf
Software (emulation): Apache; Kernel-based Virtual Machine (KVM); libvirt


Bibliography

AccessData. “Forensic Toolkit Imager (FTK) (Software) (Version 3.1.1). Mac OS 10.5 and 10.6x. AccessData.” 2012. https://accessdata.com/products-services/forensic-toolkit-ftk/ftkimager.
AIC Wiki. “Setting up a Conservation Lab.” American Institute for Conservation of Historic and Artistic Works (AIC) Wiki, July 28, 2020. www.conservation-wiki.com/wiki/Setting_up_a_Conservation_Lab.
Association of Moving Image Archivists (AMIA). “AMIA—Open Source.” GitHub, 2021. https://github.com/amiaopensource.
Blewer, Ashley. “Minimum* Viable Workstation Documentation.” Google Doc, 2016. https://docs.google.com/document/d/1oJvr8zCMK4A97GF9xYOM0uijDqyNStuwjtZ23yMRkGw/edit?usp=sharing.
“FFmpeg (Software) (Version 4.4 ‘Rao’). Fabrice Bellard.” 2021. https://ffmpeg.org/.
Haidvogl, Martina. “Expanding into Shared Spaces at the San Francisco Museum of Modern Art.” The Electronic Media Review 3: 2013–2014 (2015): 23–29. http://29aqcgc1xnh17fykn459grmcwpengine.netdna-ssl.com/emg-review/wp-content/uploads/sites/15/2018/09/EMG-Vol.-3-Haidvogl.pdf.
Hain Teper, Jennifer, and Eric Alstrom. Planning and Constructing Book and Paper Conservation Laboratories: A Guidebook. Chicago: American Library Association, 2012. www.ala.org/alcts/resources/preservation/conservation-labs.
Harvey, Phil. “ExifTool (Software) (Version 12.32). MacOS, Windows, Unix Systems.” 2021. https://exiftool.org/.
Kunze, J., J. Littman, E. Madden, J. Scancella, and C. Adams. “The BagIt File Packaging Format (V1.0).” RFC Editor, October 2018. https://doi.org/10.17487/RFC8493.
Lee, Christopher, Matthew Kirschenbaum, Kam Woods, Alex Chassanoff, Porter Olsen, and Sunitha Misra. “BitCurator (Software) (Version 2.2.12).” n.d. Accessed February 7, 2021. https://bitcurator.net/.
Libewf (Software). “Libyal.” n.d. Accessed November 30, 2021. https://github.com/libyal/libewf.
Library of Congress. “Bagger (Software).” 2018. https://github.com/LibraryOfCongress/bagger.
The Media Conservation Initiative at The Museum of Modern Art, funded by the Andrew W. Mellon Foundation. “Media Conservation Lab Essentials.” PDF, New York, 2018. https://static1.squarespace.com/static/5835fd7c15d5db57b19535bd/t/5c003dfd4ae2374f1a408a7f/1543519742692/Media+Conservation+Lab+Essentials.pdf.
MediaArea, Dave Rice, Alexander Ivash, Ashley Blewer, and Jérôme Martinez. “QCTools (Software) (Version 1.2). Windows, MacOS, Debian, Ubuntu, Linux Mint, RedHat Enterprise Linux, CentOS, Fedora, openSUSE, ArchLinux, Flatpak. MediaArea.” 2020. https://mediaarea.net/QCTools.
NASA. “FITS (Software).” 2021. https://fits.gsfc.nasa.gov/fits_utility.html.
National Film Preservation Foundation. The Film Preservation Guide. 2004. www.filmpreservation.org/dvds-and-books/the-film-preservation-guide.
Olsen, Porter, and Kam Woods. “Building a Digital Curation Workstation with BitCurator (Update).” 2018. https://bitcurator.net/2013/08/02/building-a-digital-curation-workstation-with-bitcurator-update/.
Phillips, Joanna. “Cross Disciplinary Conservation: Building a Synergetic Time-based Media Conservation Lab.” The Electronic Media Review 4: 2015–2016 (2015). https://resources.culturalheritage.org/emg-review/volume-4-2015-2016/phillips/.
Pozdeev, Max. “Invisor (Software) (Version 3.16). MacOS 10.7.3+.” 2021. www.invisorapp.com/.
Pozdeev, Max, and MediaArea. “MediaInfo (Software) (Version 21.09). Windows, MacOS, Android, iOS, Debian, Ubuntu, Linux Mint, RedHat Enterprise Linux, CentOS, Fedora, openSUSE, ArchLinux, Gentoo, Flatpak, Snap, AWS Lambda. MediaArea.” 2021. https://mediaarea.net/de/MediaInfo.


8 DIGITAL STORAGE FOR ARTWORKS: THEORY AND PRACTICE

Amy Brost

Editors’ Notes: Amy Brost is Assistant Media Conservator at the Museum of Modern Art (MoMA), New York. At MoMA, she works extensively with the Digital Repository for Museum Collections (DRMC) and with the museum’s cross-departmental digital preservation team. She has been an instructor in NYU’s Moving Image Archiving and Preservation graduate program and has taught in several workshops on the care of media art. In 2015–16, she co-developed the Sustaining Media Art phase of the Matters in Media Art online resource, which was a finalist for a Digital Preservation Award in Teaching and Communications from the Digital Preservation Coalition. In this chapter, Brost shares an overview of approaches to digital art storage and provides practical guidance informed by both her research and her role as a manager of a digital art storage system for over 3,000 time-based media artworks.

8.1 Introduction

DOI: 10.4324/9781003034865-10

Every museum gives careful consideration to its physical art storage environment. This includes planning the overall layout of the physical space, packing artworks in appropriate housings, managing the climate of the space, meticulously cataloging locations, dedicating specialized staff, and ensuring that the museum can financially sustain its physical storage site. Many of the same considerations and principles translate in some fashion to storage for the digital elements of time-based media artworks, which may include video, audio, images, and software. Digital files also need appropriate handling, packaging, cataloging, storage, location tracking, and dedicated staff. There are important differences, but it can be useful to consider physical collection care in order to conceptualize the requirements for the care of these digital elements, which are among the most fragile in a museum’s collection.

Another useful comparison is with digital archives (e.g., within libraries and other cultural heritage institutions). Archives are “facilities or organizations which preserve records, originally generated by or for a government organization, institution, or corporation, for access by public or private communities” (CCSDS 2012, 2–1). Archives can contain a wealth of born-digital and digitized material that is not only designated for long-term storage; it is also made available to various constituencies on an ongoing basis to be accessed, rendered, repurposed, cited, and circulated. Diverse communities draw from archives in order to gain an understanding of the


past and create new knowledge. Digital archivists need to understand the communities they serve in order to preserve the significant properties of the materials in their care. In some ways, this is similar to the work of an art conservator working with the digital elements of artworks in a museum. The conservator needs to preserve these elements over the long term so that the museum can continue to exhibit the artworks in its care. Both of these comparisons are useful, but only to a point. The digital elements of artworks have unique needs that require unique approaches.

Digital preservation practices have become integral to archiving, conservation, scientific research, and many other fields since they first emerged several decades ago. Digital preservation is defined as “the management and protection of digital information to ensure authenticity, integrity, reliability, and long-term accessibility” (SAA 2021). These practices are applied to diverse contexts and digital material. There is a vast body of literature dealing with digital preservation and storage, including useful introductory guides for librarians and archivists (O’Meara and Stratton 2016; Ryan and Sampson 2018), but only a small fraction of this literature deals specifically with artworks, as this community is still developing.

The term “digital conservation repository” (Wharton and Mack 2012, 24) is used throughout this chapter to distinguish a preservation system for the digital elements of artworks from similar systems designed for archives or other contexts. It is critically important that conservators engage in a robust dialogue around digital preservation challenges in order to arrive at appropriate norms for conservation practice. Ensuring the safe storage of the digital collection is one of the time-based media art conservator’s most important tasks. Without proper storage, these fragile works could easily be lost.
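The “integrity” in this definition is typically maintained in practice by computing a checksum for each file at ingest and periodically re-verifying it, a practice known as fixity checking, discussed further in this chapter. The following is a minimal illustrative sketch using only Python’s standard library; the function names and manifest structure are this book’s illustration, not a prescribed tool:

```python
import hashlib
from pathlib import Path

def sha256(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 checksum of a file, reading it in chunks
    so that large video files do not have to fit in memory."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_fixity(manifest: dict[str, str], root: Path) -> list[str]:
    """Compare current checksums against a manifest recorded at ingest.
    Returns the names of files that no longer match (i.e., fixity failures)."""
    return [name for name, expected in manifest.items()
            if sha256(root / name) != expected]
```

A repository would record such a manifest when a file enters storage and re-run the comparison on a schedule; an empty result means every file is bit-for-bit unchanged.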
The goal of this chapter is not to prescribe specific tools or technologies; rather, it attempts to elucidate how art conservators might approach digital conservation repositories both conceptually and practically. The first three sections provide an overview of what a repository is, an introduction to conceptual models for a repository, and a summary of research on how to organize and describe the digital elements of artworks in storage. The remaining sections focus on the practical steps involved in planning, implementing, and managing a digital conservation repository.

8.2 What Is a Repository?

A digital repository is “the technical infrastructure, services, and resources for the storage and management of digital information” (SAA 2021), but broadly speaking, a repository is more than hardware and software. In his book The Theory and Craft of Digital Preservation, Trevor Owens writes, “A repository is the sum of financial resources, hardware, staff time, and ongoing implementation of policies and planning to ensure long-term access to content” (Owens 2018, 4). By this definition, a repository is much more than a technological system. Indeed, thinking of it as such can compromise collection care objectives by placing too much emphasis on technology at the expense of other, equally important aspects. All of the following are essential to building and sustaining a repository:

Financial resources. Repositories require ongoing financial support. Sustaining a digital collection is a perpetual process, not a finite project. This process requires financial support for the necessary personnel to carry out preservation on a continual basis. There are also ongoing costs associated with digital collection storage. Contrary to the myth that data is immaterial, digital collections take up physical space on storage media such as servers and data tapes. Whereas artwork storage in the physical world may cost a certain amount per square foot, storage in the digital world has a cost dependent on both total size in byte


units and on the desired features of the storage system, such as speed and ease of retrieval. To connote the fact that the digital elements of artworks occupy space and require care just like physical objects do, it can be helpful to refer to these elements as digital art objects. There are also costs for the software and tools to package and move digital art objects in and out of storage and to manage the digital collection over time. This does not mean that a museum has to be rich to do preservation. On the contrary, much can be done with modest financial resources, as long as each aspect of the repository is well understood and consistently supported by the institution.

Technology. Repositories require investment in technologies, including hardware and software. Many technologies have a fairly short life cycle. Repository workflows evolve dynamically with the technological landscape. Software needs to be upgraded or changed, hardware needs to be replaced as it ages and is superseded by new technology, and storage media need to be periodically refreshed. It is not unusual for digital collections to require migration to new storage media every few years. The combination of hardware and software used in the repository ecosystem will be in a constant state of evolution and change. Investments need to be made with the understanding that all storage media are temporary, and change is inevitable.

Staff time. Museums need to devote considerable labor to the preservation of digital art objects. Maintaining a repository requires broad cooperation across the institution and may include most or all of the following departments, as well as others: conservation, information technology (IT), registration, curatorial, and finance. The repository may also involve the museum’s existing collection management system (CMS) and digital asset management system (DAM); for a deeper exploration of the differences between these systems, see Chapter 12.
Staff from all these areas may contribute in varying degrees to the development of workflows, policies, and plans. Museum staff will need to continually innovate and also adapt existing practices and workflows to meet the needs of digital collections (Van Saaze et al. 2018).

Policy implementation. Policies delineate the shape of the repository. For example, in an archive, policies may be created to ensure that the archive accepts materials that further its mission and fall within its capacity to provide long-term care. In museums, making acquisition policies with preservation in mind may be more difficult when a museum’s main objective may be to create a collection that represents and reflects periods of creative production by artists working in countless different ways. To achieve this objective, curators need broad latitude to collect artworks with diverse inherent preservation challenges and risks. However, the comparison to physical storage is particularly apt here: a museum would not be able to acquire a work too large for its warehouse. Museums developing digital conservation repositories may find it helpful to create policies to set realistic guidelines for the type and quantity of digital material coming into the collection. This could include guidelines for acquisitions as well as digitization projects that would add digital files to the repository.

Planning. “Nothing has been preserved, there are only things being preserved” (Owens 2018, 5). Within a few months to a year after a new repository has been fully implemented, it will be time to think about what transitions are visible on the horizon. These may include software upgrades, staff changes, storage media migrations, changes in vendor relationships, new contracts, renewals for hardware and software support agreements, workflow and policy changes, and new practices emerging through research in the field.
Key members of the team need to remain in contact and meet regularly to maintain a viable repository.
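The earlier point that digital storage cost depends on both total size in byte units and the desired features of the storage system can be made concrete with a back-of-the-envelope estimate. In the sketch below, the tier names and per-terabyte rates are entirely hypothetical; real figures vary widely by vendor, region, and contract:

```python
# Hypothetical annual cost per terabyte for three storage tiers.
# These rates are illustrative placeholders, not vendor quotes.
RATES_USD_PER_TB_YEAR = {
    "online (fast retrieval)": 250.0,
    "nearline": 100.0,
    "offline tape (slow retrieval)": 25.0,
}

def annual_storage_cost(size_tb: float, copies: dict[str, int]) -> float:
    """Estimate the yearly cost of storing a collection of size_tb terabytes,
    with a given number of copies kept on each storage tier."""
    return sum(RATES_USD_PER_TB_YEAR[tier] * n * size_tb
               for tier, n in copies.items())

# For example, 40 TB kept as one fast online copy plus two tape copies:
# annual_storage_cost(40, {"online (fast retrieval)": 1,
#                          "offline tape (slow retrieval)": 2})
```

Even a rough model like this helps when advocating for a recurring budget line, because it shows that cost scales with collection growth and with the redundancy and retrieval speed the institution chooses.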


8.3 Repository Models

As the foregoing aspects illustrate, a repository is a dynamic system involving people, processes, and technology, none of which can be prescribed in a one-size-fits-all approach. As such, technology-agnostic models can be helpful in visualizing the functions that a repository is meant to perform or support and the expected flow of information within the system. The appropriate technology can then be put in place to meet those needs.

The 2019 book Preventive Conservation: Collection Storage contains chapters that provide an overview of digital preservation, the international standards and best practices for digital preservation environments (Slade et al. 2019), and the fundamentals of care for born-digital collections (Ferrante 2019). However, while essential as a general introduction, these chapters are not geared specifically for art collections. The authors describe repository environments that are mature, large-scale, and purpose-built to house digital archives. They refer to the ISO standards and best practices for digital preservation storage and reference a widely used model in digital archives, the Open Archival Information System (OAIS) reference model (see fig. 8.1). Archives that strive to perfect their alignment with this model can be audited and certified as “trusted digital repositories” (CCSDS 2011; Marks 2015). The model shows how a data object is deposited into the archive by a producer, and then it is ingested into archival storage alongside information about it. At that point, it is termed an Archival Information Package (AIP). Consumers can query the archive to discover and locate desired materials and request access copies, which are derived from the AIP. The AIPs are at the center of the model because, for archivists, the AIP is frequently the focus of preservation and the locus of information about the data object.
The OAIS model is valuable to conservators for its discussion of planning and administrative functions, as well as its description of the types of information that are critical to achieving long-term preservation of stored data, especially representation, reference, context, provenance, and fixity information. Generally speaking, as these terms suggest, this information explains to future users what the data object is, where it came from, how to interpret it, how it relates to

Figure 8.1 Open Archival Information System (OAIS) reference model. Illustration: Amy Brost, adapted from “Figure 4–1: OAIS Functional Entities” (CCSDS 2012, 4–1)


other objects, and how to verify that it has been unchanged in storage and is therefore authentic. Fixity is a term with a broad meaning, as well as a technical one. It refers to “the property of being unchanged” (SAA 2021), but for digital materials, maintaining fixity means to periodically verify this property with a checksum, “a unique alphanumeric value that represents the bitstream of an individual computer file or set of files” (SAA 2021). Like a fingerprint for a digital object, a checksum can indicate whether or not an object is exactly the same as when it first entered the archive. Establishing and maintaining fixity, and preserving not only data objects but also the necessary information to ensure meaningful long-term access to them, are fundamentals of digital preservation. Chapter 13 in this book provides an in-depth introduction to these concepts from an archivist’s perspective.

At the same time, in an art conservation context, a digital art object is often just one part of a complex artwork, especially in the case of installation and computer-based works. Museums are tasked with preserving artworks, not digital assets alone. As the illustration in fig. 8.2 shows, artworks may have digital elements as well as physical elements situated in a gallery environment with characteristics that are significant to the work of art. Museums require an approach to digital conservation repositories that allows them to flexibly meet diverse needs. Conservators collaborate with colleagues across the museum and have preexisting information ecosystems and

Figure 8.2 Artwork elements for preservation. Illustration: Amy Brost


norms for information exchange that often need to be considered in the creation of a digital repository (Brost forthcoming). This means that how a conservator structures a digital repository for artworks in a museum may be quite different from how an archivist structures a digital repository for an archive, and the flow and location of information may or may not align with the OAIS model.

Another reason that these two types of repositories might be structured differently is that the OAIS model assumes that the data object in the AIP has reached the last phase of its life cycle. In entering this final phase, the data object is believed to have fulfilled its original purpose and has become part of an evidentiary, historical record, static and unchanging. However, many time-based media artworks continue to evolve and change after entering the museum (Wharton 2016; Haidvogl and White 2020), and they enter collections well before the final phase of their life cycle (Van de Vall et al. 2011). Some classes of digital objects—including digital records in active use, scientific data, and digital art objects—require “active life preservation,” which “makes change a key issue, not simply in the sense of preventing change but in terms of managing change as part of the digital object’s active life” (Laurenson et al. 2017).

Archivists themselves have disputed the notion that archived data is at the end of its life cycle. Records continuum theory was developed in Australia in the 1990s by records managers and archivists who noted that data objects did not follow a linear transition from active life to the archive. Rather, they observed that there was a dynamic, bidirectional relationship between their fields of practice (McKemmish 2001, 340). They developed a model in which “records are not treated as an end product but as processes” (Laurenson et al. 2017).
Over a decade ago, the San Francisco Museum of Modern Art (SFMOMA) built a pioneering art documentation system on a model that reflected a continuum of collaborative knowledge production in the museum (Haidvogl and White 2020). The diagram in fig. 8.3, “Information continuum for time-based media art conservation,” is loosely based on a graphical representation of the original continuum model (McKemmish 2001, 351), but with artwork elements substituted for archival documents and the axes renamed to reflect common moments of information retrieval and production in the life of an artwork. Created as a counterpoint to OAIS for this discussion, it helps to illustrate the two key issues with applying the OAIS model to art conservation contexts, as discussed earlier. First, the OAIS model has a predominantly linear left-to-right flow of information, whereas the continuum model more accurately represents the cycles and processes involved in working with artworks. Second, the position of archival storage (the digital repository) at the center of the OAIS model reflects its central position in the archive, whereas in a continuum model, the digital repository is where digital elements of artworks are stored, but not where artworks themselves are preserved.

One project focused on developing a model for managing digital art objects not yet at the end of their life cycle was PERICLES (Promoting and Enhancing Reuse of Information throughout the Content Lifecycle taking account of Evolving Semantics). PERICLES, a four-year European Union project that ended in 2017, brought researchers, conservators, digital preservation experts, and software and computer engineers together to develop novel digital preservation methods based on use cases from Tate and the Belgium Space Operations Centre.
While not modeling to the artwork level, the project aimed to model “continuously changing environments such as for time-based media, where OAIS is less appropriate,” basing the approach to preservation instead on a “continuum viewpoint” (Daranyi et al. 2015, 53). The PERICLES team developed ontologies and tools for digital video art and software-based art, among other deliverables (CORDIS 2017, 40). In its work on the digital video art case study, the team explained, “we are not aiming to model the domain at the artwork level, but rather the specifics of dependencies between digital things within a system which forms part of the artwork.

Digital Storage for Artworks

Figure 8.3 Information continuum for time-based media art conservation. Illustration: Amy Brost

Amy Brost

It is, therefore, a partial model related to the dependencies of some of the components of the artwork. In this case, modeling enables the conservator to better understand the digital dependencies within the system and also helps identify areas where automation might be achievable. Modeling also facilitates communication with computer scientists and software developers who might provide tools to support activities related to long-term digital preservation and the corresponding dependencies” (Lagos et al. 2018, part 4). Much in the way that Archivematica software [Archivematica (Software) n.d.] was developed to create AIPs as conceptualized by the OAIS model (Van Garderen et al. 2013), the PERICLES models led to the initial development of software tools that can be applied by diverse industries to dynamic preservation environments. Specific tools aside, the continuum “viewpoint” that underpins this project suggests a way to reframe thinking about digital storage to better fit the actual activity of preserving digital art objects in the museum.

Thankfully, it is not necessary to choose or create a model in order to start building a digital conservation repository. It is only necessary to be aware that storage of digital art objects requires a unique approach so that when adapting models, software, and services from other contexts, the differences and their implications can be noted. For instance, the use of the term Archival Information Package (AIP) should be restricted to those contexts where the OAIS model is being implemented. In most conservation contexts, the term storage package may be more accurate. This can refer to any form of packaging, from a single file deposited into storage to a file within a directory full of rich information. In this nascent moment when digital conservation repositories are being implemented in a growing number of museums, conservators need to develop a shared vocabulary and strive to reach a consensus on terminology.
As time goes on, conservators and others will continue to add to this literature to help guide decision-making. Already, significant progress has been made in defining some of the unique needs of artworks from a repository perspective, as will be discussed in the next section.

8.4 Unique Needs of Artworks

As this discussion of models demonstrates, art conservators are working in a unique context. In a digital conservation repository, the digital elements of artworks remain conceptually linked to the other parts of the work, such as the sculptural elements, consumables, display equipment, and so on. Each part of the artwork has its own storage requirements and documentation. The artwork records of the institution provide the connections between parts when the artwork itself is dissociated. In many institutions, there is a collection management system or another system of record that is used to maintain these connections. As each digital art object is stored, it is necessary to maintain its connection or relationship to the artwork and to the artwork record as a whole. As Kate Lewis, the chief conservator at the Museum of Modern Art (MoMA) in New York, stated, “How digital art is stored, and how we retrieve it is different than object-based collections. That’s the edge where it is getting pioneered, but to the museum at large it shouldn’t look different; it should feel like the same collection” (Van Saaze et al. 2018, 229). In their 2012 paper, conservator Glenn Wharton and technologist Barbra Mack put forth several critical attributes that are fundamental to a “digital conservation repository” (Wharton and Mack 2012). They outlined the distinct advantages of taking a component-based approach to documentation and storage of digital art objects. For example, if a museum acquires a software-based artwork consisting of a computer loaded with software, a monitor, and source code as a separate file for preservation purposes, a range of conservation approaches are possible for each element. An artist may insist that the original hardware and software be maintained in perpetuity because they are essential to the integrity of the artwork. Alternatively, an artist may
want the museum to use whatever technology is common at the time of the exhibition in order to achieve a particular effect rather than exhibit the original hardware and software as supplied. Many artists’ attitudes fall somewhere in between, and some works are “thinly specified” with few instructions while others are “thickly specified” in great detail (Laurenson 2006). According to Wharton and Mack, component-based repositories enable conservators to effectively manage the digital elements of artworks, including information about them and the functions they perform, the significance that the artist assigns to them, and how they are related to one another. Using this repository structure, conservators can maintain component-level documentation that enables them to determine appropriate conservation strategies for each component that takes into account their significance to the artwork as a whole. A component might be a single digital file, such as a video, or a set of files like those copied from a Blu-ray disc. In both cases, even though the number of digital files differs, both may be thought of as a single component of an artwork. The documentation model created by the Documentation and Conservation of Media Arts Heritage project (DOCAM) explained that “components are at the very heart of the changes affecting most media artworks . . . this level promotes the identification and collection of documents that make reference to a specific component . . . which in turn facilitates the tracking of changes made to the work throughout its lifecycle.” (DOCAM—Documentation Model 2010) In the DOCAM Glossaurus, a component is defined as “any physical or logical part of an artwork from which particular characteristics, behaviours or functions can be identified” (DOCAM— Glossaurus 2010). In describing the advantages of component-based repositories, Wharton and Mack explained that “the components have distinct roles in maintaining the entire system. 
Like a human body in which organs such as the heart and lungs are cared for in unique ways, each component is a member of a whole functioning system. Therefore, one cannot readily understand . . . relationships using the hierarchical concepts applied to still or moving image art (e.g., video) such as siblings and derivatives.” (Wharton and Mack 2012, 26) In a component-based system, different relationship types can be defined. Relationships may be hierarchical, such as those between master material and lower-quality exhibition copies made from the master, but they may also be non-hierarchical, as in the case of software or hardware dependencies. A component-based repository can effectively capture both types of relationships. Wharton and Mack also described the importance of documenting artist-assigned values, which in turn inform conservation decision-making. These do not refer to monetary values but to the significance assigned to a component by an artist or by the museum. Wharton and Mack wrote, “Individual components may be assigned different values by the artists. This requires a shift towards applying different conservation strategies to individual components, instead of a single strategy that is holistically applied to the work of art” (Wharton and Mack 2012, 27–28). The significance of a digital component to an artwork as a whole cannot be intuited or derived from its technical characteristics. Moreover, these values can and do shift as the work is shown, particularly when artworks are still unfixed and the artist or their representatives are involved during installations. Artists may provide museums with new masters of their work or adjust their
work for different presentation modes, and artists or museums may create new files for various purposes, such as exhibition or loan. Metadata, “commonly defined as ‘data about data’” (SAA 2021), is vital for understanding digital components. According to Wharton and Mack, “The role of metadata—to ensure the integrity of its digital files, to control access, and to ensure its digital components remain accessible and understandable in the future—cannot be overstated” (Wharton and Mack 2012, 31). Standardized metadata, consistently generated and stored, can be advantageous for making critical information more easily searchable and usable. Searching technical metadata for specific characteristics (e.g., codecs) can help to identify digital material vulnerable to obsolescence, and having basic technical metadata can be useful for planning, research, and reporting collection statistics. Relationships can be described in a standardized way through structural metadata, which is “information about the relationship between the parts that make up a compound object” (SAA 2021). For example, creating metadata to describe the operating systems in software-based artworks could make it easier to generate a list of all the operating systems in the collection and locate artworks running on specific operating systems. Recording information consistently, and ensuring an enduring connection between a digital component and its metadata, can facilitate the interpretation and preservation of digital art objects over the long term.

In summary, there are four critical aspects to organizing and describing the digital elements of artworks in the repository. First, an artwork’s digital elements should be organized into components. Second, each component should have an artist- or museum-assigned value or status associated with it (e.g., artist-provided preservation master, artist-provided exhibition copy, museum-created exhibition copy).
For components made by the acquiring institution, such as for exhibition or loan, the date and reason for the new component should be recorded. Third, components should have clearly defined relationships to one another. And fourth, metadata, especially technical metadata, should be created and utilized. The status, relationships, and metadata of a component may be in the collection management system and/or may be included in the storage package itself. Some attributes, especially status, may change over time (e.g., if a new exhibition copy format is created to be compatible with contemporary playback technology), so if these are to be bundled into the storage package, they need to be editable. Whether a storage package is simply a consistent format [e.g., the BagIt specification (Kunze et al. 2018) or other] or an entirely self-describing AIP with robust metadata (see Chapter 13) will depend upon each institution’s needs and context. Many institutions will land somewhere in between. It may not make sense to bundle all the information necessary to conserve the immaterial and physical aspects of a multimedia artwork alongside the digital art object in storage, so by necessity, a significant proportion of information required for the work’s conservation may reside in complementary systems, such as a CMS (see Chapter 12) or network drives. This constellation of elements and information is gathered, examined, and interpreted by diverse museum staff when the museum prepares to show the work. Complementary systems that contain vital information for the preservation of digital art objects should be considered part of the digital conservation repository and must also be preserved.
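To make the notion of a consistent storage package concrete, the following Python sketch lays out a minimal package following the BagIt structure (RFC 8493): a data/ payload directory, a bagit.txt declaration, and a SHA-256 checksum manifest. The function names are hypothetical, and a production workflow would more likely use an established tool such as the Library of Congress bagit-python library, which also writes tag manifests and bag-info metadata.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute a SHA-256 checksum in chunks so large files fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def make_storage_package(bag_dir: Path, payload_files: dict[str, bytes]) -> None:
    """Lay out a minimal BagIt-style bag: data/ payload plus a checksum manifest."""
    data_dir = bag_dir / "data"
    data_dir.mkdir(parents=True)
    for name, content in payload_files.items():
        (data_dir / name).write_bytes(content)
    # Required declaration per the BagIt specification (RFC 8493).
    (bag_dir / "bagit.txt").write_text(
        "BagIt-Version: 1.0\nTag-File-Character-Encoding: UTF-8\n"
    )
    # One "<checksum>  <path>" line per payload file.
    manifest_lines = [f"{sha256_of(p)}  data/{p.name}" for p in sorted(data_dir.iterdir())]
    (bag_dir / "manifest-sha256.txt").write_text("\n".join(manifest_lines) + "\n")
```

Because the checksums travel with the payload, any later fixity check can validate the package without consulting an external system.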

8.5 Minimum Requirements

Before the museum undertakes the planning and implementation of a repository, it is essential to develop the team that will be responsible for doing preservation on an ongoing basis. The museum’s IT experts are key members of this team. While they may be new to conservation, they are adept at safeguarding critical data, including personnel, development, membership, retail
sales, and other data, and they understand how to protect the museum’s IT systems from viruses, ransomware, and other cyberattacks. Although there may be a significant communication gap to bridge (Prater 2017), once the unique requirements for the digital conservation repository are understood by IT, and once conservators understand how their data storage needs fit into the overall data management landscape of the museum, the conservation and IT departments can form a strong partnership. Similarly, decades ago, new interdepartmental partnerships were forged when conservators and building operations engineers began to collaborate on the monitoring and management of the museum environment. In both cases, departments performing very different functions in the museum have been able to partner effectively and evolve practices within the institution to improve collection care. In collections where digital materials may still be on the original carriers delivered at acquisition (e.g., hard drives, optical discs, USB drives, SD cards, computers, etc.), it can be difficult to know where to begin. Once these digital art objects have been cataloged, examined to assess their condition and status, and safely copied from the data carriers as described in Chapters 13 and 14 in this book, they are ready to be moved to their storage locations. Despite the differences detailed earlier, digital art objects have similar needs to archival data when it comes to ensuring bit-level preservation and security in storage. In 2016, the Matters in Media Art project, a collaboration between MoMA, Tate, and SFMOMA, with funding from the New Art Trust, added Digital Preservation: Sustaining Media Art to its web resource. The resource provides steps to follow for institutions designing their storage for artworks. According to the site, “Preservation storage systems require the active monitoring of data in order to detect unwanted changes, such as corruption or damage.
Their high level of redundancy with copies in several locations enables the data to be restored should a problem arise. Ideally, they will also have a disaster recovery plan.” (MoMA et al. 2015) The authors put forth the following core principles for digital storage:

• Geographic Redundancy: Multiple copies of data should be held at different geographical locations, and a disaster recovery plan should be in place.
• Fixity Checking: Regularly monitoring the checksums of digital files in order to detect corruption or unwanted changes to your data.
• Access and Security: The speed and restriction of access to data need to be appropriate for its intended use and the level of protection required.
• Technology Monitoring: Trends in storage technology should be monitored to assess when migration to new storage media will be necessary.
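The fixity-checking principle can be sketched in a few lines of Python using the standard library’s hashlib. The function names and the dictionary of recorded checksums are illustrative; in practice the recorded values would come from a manifest or database written at ingest.

```python
import hashlib
from pathlib import Path

def compute_checksum(path: Path) -> str:
    """Stream the file in chunks so large video files do not exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def check_fixity(recorded: dict[Path, str]) -> list[Path]:
    """Compare each file against its checksum recorded at ingest.

    Returns the paths that are missing or whose current checksum no longer
    matches, i.e., candidates for restoration from a redundant copy.
    """
    failures = []
    for path, expected in recorded.items():
        if not path.exists() or compute_checksum(path) != expected:
            failures.append(path)
    return failures
```

A routine like this would typically run on a schedule against every storage copy, with any failures reported so that the damaged copy can be restored from one of the redundant locations.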

As these principles show, at its most basic level, the IT system of a digital conservation repository securely stores several copies of the storage packages in geographically dispersed locations, where fixity on all copies is periodically checked, and storage media are periodically refreshed so as to minimize the risk of failure or obsolescence. With these fundamentals in place, the next step is to develop a strategy for increasing the institution’s ability to know, protect, monitor, and sustain its digital collection. These functions correspond to the “Levels of Digital Preservation,” a roadmap created in 2013 and revised in 2018 by the National Digital Stewardship Alliance (NDSA) (see fig. 8.4). As this roadmap shows, any institution can assess its digital storage practices and find new directions for improvement. A contemporaneous explanatory paper (Phillips et al. 2013) can help guide new users.


Figure 8.4 NDSA Levels of Digital Preservation Matrix, Version 2.0. Source: Copyright © 2019 by National Digital Stewardship Alliance, licensed under a Creative Commons Attribution 3.0 Unported License, DOI 10.17605/OSF.IO/QGZ98


There are numerous assessment tools that can be used in a similar way. These include the Digital Preservation Coalition (DPC) Rapid Assessment Model (Mitcham and Wheatley 2019) and the Preservation Storage Criteria (Schaefer et al. 2018), which has an accompanying overview and user guide (Zierau et al. 2019) and even a downloadable board game to facilitate learning and decision-making. One assessment tool, the Digital Preservation Maturity Model, Risks, Trusted Digital Repository Certification and Workflow Crosswalk (Langley 2018), compares a dozen selected digital preservation models in a spreadsheet format. Papers and checklists from the InterPARES Trust (International Research on Permanent Authentic Records in Electronic Systems), including the Checklist for Cloud Service Contracts: Final Version (InterPARES Trust, February 26, 2016) and the Checklist for Ensuring Trust in Storage Using IaaS (Infrastructure-as-a-Service) (InterPARES Trust, August 3, 2016), may be of interest to institutions exploring ways to utilize cloud storage in their digital conservation repository.

Institutions must explore various models and tools, use them to catalyze self-evaluation, and adapt them to specific contexts as appropriate. No standard or roadmap should be applied to all contexts as a checklist of aims or goals or as a measure of success or failure. Each institution must critically evaluate the recommended actions to determine its own standards and needs. Moreover, no methods are prescribed to meet these metrics, and sometimes the most direct solution can suffice, especially when getting started. For an artist with a modest budget, storing multiple copies in different geographic locations might simply mean carrying a copy of the files home from the studio on a hard drive. The goal is not necessarily to store all digital data at the highest standard.
If a film in the collection was transferred to video using the telecine process years ago, and a higher-quality 4K scan has since been performed, both digital files may not require the same level of preservation storage. All digital material designated for retention requires a digital preservation strategy, but lower-value digital data and high-value digital data may be stored to different standards. With artworks, this is often determined case by case and in collaboration with curators and other colleagues. For one artwork, several hours of video documentation of a time-based media artwork running in the gallery may not be high-value digital data, but similar documentation of a performance acquired by the institution may be invaluable for training performers in the future. To maximize the resources available for digital storage, institutions may consider creating two or more data pools corresponding to different levels of storage. However, it is important to remember that the amount spent on digital storage does not directly correspond to the standard of preservation being met. The repository team must understand how different pieces of a repository work together to achieve the institution’s targeted levels of preservation. Access to the files themselves should be restricted as direct access to any artwork is restricted, but it may be useful to make information about the files widely available. Museums may want to ensure that there is a preview copy of the material readily available for curatorial viewing and research purposes. It may also be desirable to have technical metadata in the collection management system, so planning for an exhibition or loan does not require retrieval of a video file from storage in order to see its codec, aspect ratio, bit rate, audio channel layout, and so forth. Providing ample access to preview files and metadata can reduce requests for the files themselves. 
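Searchable technical metadata, as described above, makes this kind of review practical. The sketch below assumes metadata has already been extracted (for example, with a tool such as MediaInfo) into simple records; the field names, sample values, and the at-risk codec list are all illustrative, not an authoritative obsolescence watchlist.

```python
# Each record is technical metadata for one digital component, as it might be
# exported from a collection management system. Field names are hypothetical.
components = [
    {"artwork": "Untitled (1999)", "component": "exhibition copy",
     "codec": "Sorenson Video 3"},
    {"artwork": "Untitled (1999)", "component": "preservation master",
     "codec": "FFV1"},
    {"artwork": "Streaming Piece (2010)", "component": "preservation master",
     "codec": "H.264"},
]

# Codecs an institution might flag as at risk of obsolescence (an assumption
# made for this example, to be maintained by the repository team).
AT_RISK_CODECS = {"Sorenson Video 3", "RealVideo", "Cinepak"}

def flag_obsolescence_candidates(records):
    """Return components whose codec appears on the at-risk list."""
    return [r for r in records if r["codec"] in AT_RISK_CODECS]

for r in flag_obsolescence_candidates(components):
    print(f'{r["artwork"]}: {r["component"]} ({r["codec"]})')
```

The same pattern extends to other characteristics, such as operating systems for software-based works, so long as the metadata is recorded consistently.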
Having technical metadata in a form that allows the institution to view and search by technical characteristics is also useful for preservation planning. By having this information at the ready, institutions can periodically review the codecs and file types in the collection to anticipate obsolescence issues and flag works for proactive evaluation and care. This must also periodically be done for hardware and display equipment that may be in physical storage.

As mentioned earlier, doing preservation is a process, not a project. It involves continual adaptation and evolution. As such, teams should consider keeping a journal about the storage planning
and implementation process, including decisions made and the rationale, dates of deployments of various tools and technologies, issues noted and how they were solved, policy changes and why they occurred, and so on. Collaborating on this journal with everyone on the digital conservation repository team is a way to convert individual knowledge to institutional knowledge and create a valuable reference for reflection and planning. It might also be helpful for institutions to create an individualized self-assessment based on models, internal discussions, and use cases tailored to the institution’s needs and context. As repository plans are developed, the self-assessment can be used to identify any deficiencies or gaps. A sample high-level assessment is provided; however, these questions are not exhaustive, nor will they apply to every context, so it is important for each institution to develop its own.

Sample High-Level Digital Conservation Repository Self-Assessment

A. Storage
1. How and where are the technical aspects of the digital storage infrastructure documented (e.g., capacity, hardware tags, locations, network connections, support contracts)?
2. How are the storage packages containing digital art objects transferred into storage?
3. How is successful transfer and storage verified?
4. How are system status and errors reported and to whom?
5. In what geographical locations are the storage packages kept? Do they have different disaster threats (e.g., risk of natural or human-caused disasters)?
6. How many copies of the packages are there?
7. On what media are they stored (e.g., server, LTO tape)?
8. How long will these media last?
9. What will migration to new storage media entail (e.g., time, logistics, cost)?
10. How is the storage system protected from threats (e.g., intrusion, malware)?
11. Are there different data pools (high-value data, medium-value data)?
12. If working with vendors, what steps would be required to change vendors (exit strategy)?
13. What steps could be taken now to simplify vendor changes in the future?
14. What is the disaster recovery plan?
15. Is the storage system cost-efficient and energy-efficient?

B. Digital components
1. How are components packaged for storage (e.g., BagIt specification or other)?
2. How are packages assigned visible, persistent, unique identifiers (i.e., a naming convention that is unique and independent of cataloging or location)?
3. Where is provenance, status, and relationship information stored?
4. At what point is technical metadata created/extracted, and where is it stored?
5. How and how often is fixity checked?
6. How are fixity results reported?
7. How is corrective action taken when a fixity error is found?
8. How are intentional changes to components managed and documented?
9. Is permanent deletion possible (e.g., due to deaccessioning or ingest in error)? How?

C. Discoverability
1. How are the storage packages associated with the correct artwork?
2. How do museum staff know what digital elements the museum has in storage for an artwork?
3. What information about storage packages and their locations is in other systems of record?

D. Access
1. How are files retrieved from storage?
2. Who has access, and what levels of access/permissions are there?
3. Is there an access layer to protect the storage packages from direct interaction and inadvertent alteration?
4. How are unauthorized people prevented from accessing artwork files?
5. Is there a log or record of who has accessed artwork files and why?
6. Are there preview files available for research purposes? If so, where or in what system? How are they created?

E. Conservation
1. Where are condition reports and treatment reports (e.g., exhibition file creation, migration) stored, and how are they associated with the correct component?
2. Where is documentation stored, and how is it associated with the correct component?
3. How does the repository support proactive care to mitigate preservation risks, such as obsolescence, at the file level? At the artwork level?
4. If complementary systems are being relied upon to provide contextual information and store metadata, preservation actions, and critical documentation, how are those systems being preserved (e.g., backups, exports)?

F. Team building and community building
1. Does the team doing preservation at the institution cover all the necessary functional areas?
2. Are staff roles and responsibilities on the team well understood and documented?
3. Does the team have a shared understanding of its goals and tasks?
4. Have clear policies about storage been developed and communicated institution-wide?
5. Is there a schedule of meetings or activities that ensure smooth and ongoing communication?
6. How and when does the digital conservation repository team share its work more widely across the museum? With colleagues outside the institution?

G. Repository documentation
1. How is the repository documented in the institution (e.g., manuals, websites, flowcharts, procedure documents, training documents, infrastructure documentation)?
2. Where does the documentation reside?
3. Who is responsible for updating the documentation?
4. Is documentation being used effectively for staff awareness and training?

H. Finance
1. What is the sustainable financial commitment the institution can make to the repository?
2. Who is responsible for budgeting and forecasting?
3. What other departments at the institution have digital preservation storage needs (e.g., archives, DAM)? Can the business case be made jointly? Can the administrative and cost burden be lessened by pooling resources?

I. Vendor relationships (if applicable)
1. Is there a clear RFP and approval process for vendor selection?
2. Who is responsible for writing and negotiating vendor contracts? Does the institution have legal counsel? Has this been budgeted?
3. Who are the vendor’s primary contacts at the institution for day-to-day system management? For contracts and billing?
4. If vendor systems will be used for metadata search, storage package search, storage package retrieval, or other functions, what other ways can those functions be performed if those systems are compromised or if the institution changes vendors?
5. How does the institution supervise vendor systems to ensure that they are performing as expected and promised?
6. What reporting is expected from the vendor (e.g., type and frequency of reports)?
7. Have security concerns been addressed (e.g., data security, network access)?
8. What rights and/or control does the vendor have over the institution’s data?
9. What service level agreements, features, functions, and performance are the vendor promising? Are these all in the contract?
10. What is the exit strategy?

8.6 Deploying the Digital Repository

The remainder of this chapter focuses on the digital repository—the technical infrastructure where digital art objects are stored. Moving an entire collection into a digital repository is no small feat. Full collection moves in the physical world are highly visible and time-consuming. When the National Museum of the American Indian (NMAI) moved 800,000 objects from New York to Maryland, the process took five years and involved nearly 200 people from four departments (NMAI n.d.). Similarly, moving a digital collection is labor-intensive and time-consuming, and it requires multiple departments to collaborate. During their collection move, the NMAI stabilized, cataloged, and documented each object. Digital art objects have similar needs. Each digital art object needs to be appropriately cataloged, tracked, characterized, packaged, and safely transferred to its new home in the digital repository. In some contexts, several of these steps are achieved in an automated way using software, but even in those cases, steps are taken to thoroughly verify that the process is occurring correctly. Because digital collections—when they are not on display—are not as visible as physical ones, preservation needs may be overlooked. For example, complex media artworks may be exhibited and extensively documented by an institution, but the institution may not yet have provided for the long-term storage and preservation of the digital elements of that work. In the hierarchy of urgency in collection care for digital art objects, safe storage is at the top.
Paradoxically, conservators are often unable to give this their full attention while also working on exhibitions, treatments, loans, and acquisitions. This may put digital elements at greater risk for damage or loss that will be invisible and undetected until the institution wishes to show the work. Moving a digital collection into storage is largely invisible to the museum at large, and it may be difficult for those not actually performing the work to fully grasp the enormity of the project. Time-based media art conservators need to find ways to convey the importance and scale of this work that no one beyond the digital conservation repository team can readily see. Metrics gathered from moving materials into the repository (e.g., staff time spent, rate of ingest, total collection size, and anticipated time required to move it into storage) can help bolster arguments for additional staff, tools to increase productivity, or working with vendors (see Chapter 6). If additional resources are not available, these metrics can at least help manage expectations, allow broad input on risk assessment, and provide a realistic estimate of when a digital collection will be completely moved into storage. Deploying a digital repository occurs in several phases. Ample time and attention should be given to the earliest phases in order to address any issues before they affect an even greater number of collection objects and their records. As such, the pattern of work tends to be in bursts, where the testing and evaluation phase requires the full attention of several staff members in multiple departments, but then a period of routine operation and planning follows. When a software upgrade or other change occurs, the team will be fully activated again and commit the necessary time.

Testing and Evaluation: Any transfer of digital material from one storage location to another, whether the first time or the tenth time, is not to be taken lightly.
Staff must pilot the new storage system and test it thoroughly. In this phase, the team verifies that all the use cases imagined for how the system should function are fulfilled as planned. During this phase, a few files that are representative of file types and sizes in the collection can be put into the system for testing. A detailed journal of the process should be kept, noting anything that does not perform correctly or as expected and how the issue was resolved. The system must be evaluated end-to-end; that is, in a responsible physical collection move, it would not be enough to pack up the current warehouse and load the trucks. Staff would also supervise the delivery on the other side and the organization of objects in the new location. This is also a good time to test the workflows related to pre-ingest, storage, and retrieval, ensuring that roles and responsibilities are clear and sensible. If working with vendors, museum staff should fully test system features and performance to ensure that vendors are meeting their contractual obligations.

Deployment and Troubleshooting: Following successful completion of testing, the collection is gradually ingested into the digital repository. The workflows are followed and refined. As challenges arise, these are documented, and either adjustments are made or some classes of material are put aside while possible solutions are explored. For example, when MoMA began to ingest raw DPX, a file type associated with film scanning, the extremely long processing time coupled with ingest failures prompted the museum to put all material in that format aside until new curation and packaging decisions could be developed for the material (Brost forthcoming). Issues may be tracked using a ticketing system, such as those commonly used by IT departments, in order to keep a record of any problems, how they were solved, and the parties involved.
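The end-to-end verification described above can be reduced to a simple rule: never trust a transfer until the destination checksum matches the source. A minimal Python sketch, with a hypothetical function name, might look like this.

```python
import hashlib
import shutil
from pathlib import Path

def sha256sum(path: Path) -> str:
    """Chunked SHA-256 so large media files are not read into memory at once."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verified_copy(source: Path, destination: Path) -> str:
    """Copy a file and confirm the destination checksum matches the source.

    Returns the verified checksum so it can be recorded for later fixity checks.
    """
    before = sha256sum(source)
    destination.parent.mkdir(parents=True, exist_ok=True)
    shutil.copy2(source, destination)  # preserves timestamps along with content
    after = sha256sum(destination)
    if before != after:
        raise IOError(f"Transfer verification failed for {source}")
    return after
```

Recording the returned checksum at the moment of transfer also seeds the manifest used for ongoing fixity monitoring.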
Regular meetings of the digital conservation repository team provide opportunities to discuss issues and add to the team journal.

Routine Operation, Monitoring, and Reporting: When the digital repository finally becomes part of routine museum operations and conservation practice, statistics should be collected that will help the institution better understand how it is being utilized. These may include: the number of new storage packages ingested each month, a list of storage packages downloaded and

Amy Brost

for what purpose, the size of storage packages ingested, the total amount of capacity used and available, the amount of staff time required to perform storage-related tasks, the size of the backlog (material awaiting processing and ingest), and so on. These statistics are useful in planning, both in terms of staffing and budgeting. It is common for technical issues to arise from time to time, so reporting on issues and their resolution is also a part of routine operation.

Planning and Forecasting: The digital conservation repository team continues to meet regularly to discuss system status, technical issues, and next steps. The forecast is evaluated against the collected statistics and adjusted accordingly. Planning meetings can include the discussion of emerging practices that could be considered for incorporation into strategic plans. The team should consider ways that its work can be shared more widely with staff and leadership, perhaps through internal presentations, to ensure ongoing awareness and provide a platform for advocacy. Similarly, team members may want to maintain relationships with their counterparts at other institutions with digital conservation repositories, both informally and through conferences and meetings, in order to compare notes and exchange knowledge. When there is a software or hardware change or a collection move, the cycle of phases repeats. It is critical for the team to continually assess how the digital conservation repository fits into museum-wide data storage and stewardship. Conservators need to see themselves partly as museum technologists and play an active role in museum data stewardship discussions alongside others at the institution with digital preservation needs. As digital preservation needs increase across departments, there may be new opportunities to pool resources and develop shared solutions. 
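The statistics above feed directly into the forecast: dividing the backlog by the historical ingest rate yields a realistic estimate of when the collection will be completely moved into storage. A minimal sketch, with illustrative field names:

```python
from dataclasses import dataclass

@dataclass
class IngestStats:
    """Simplified planning metrics of the kind described above."""
    ingested_tb: float     # total moved into the repository to date
    backlog_tb: float      # material still awaiting processing and ingest
    months_elapsed: float  # time spent ingesting so far

def months_to_completion(stats: IngestStats) -> float:
    """Project the months remaining to clear the backlog, assuming the
    historical ingest rate (TB per month) holds steady."""
    rate_tb_per_month = stats.ingested_tb / stats.months_elapsed
    return stats.backlog_tb / rate_tb_per_month
```

For example, a team that has ingested 10 TB in 5 months with 20 TB remaining is roughly 10 months from completion at the current rate; presenting such figures can support requests for additional staff or vendor assistance.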
Efficiencies may also be realized between museums as opportunities arise to develop cooperative digital collection storage agreements, such as those that have been arranged for physical collections. In the Netherlands, LIMA pioneered a shared digital repository for over 30 collections (Van Saaze et al. 2020). In order to help the museum adapt successfully to these evolving needs, digital conservation repository teams must continually monitor the technological landscape and engage with colleagues and communities within and beyond the institution.

8.7 Automation

Multiple actions are performed as digital art objects are packaged for storage, and in many contexts, these are done manually. Materials are checked for viruses and malware, files are examined, file formats are identified and recorded, preview files are placed in other systems for curatorial and research purposes, technical metadata are recorded, fixity is checked, and the checksum is recorded. Once the storage package is ingested, fixity is checked again to ensure that the stored file is correct and complete. The results of all these actions are logged in some fashion to document the actions taken. For larger collections, it may be advantageous to automate some or all of these tasks.

A microservice approach to automation allows for the selection or development of specific tools for specific applications. These tools can then be modified or replaced without making changes to an entire system. This allows for the greatest customization to a specific context and the greatest flexibility (Rice and Schweikert 2019). For example, a digital asset management application may have a transcoding feature that can create derivative files with certain specifications, but changing those specifications in the code would require updating the entire application. With a microservice approach, an application to create derivatives could be independently deployed—just one in a series of individual archival workflow applications loosely coupled together “like links in a chain” (Rice and Schweikert 2019, 54). Some vendors will offer software and services to fully automate the process of packaging materials for storage, running all the aforementioned tasks automatically and bundling the output into the storage package. However, while this may simplify the design of the system, all-in-one

Digital Storage for Artworks

offerings are less customizable: they may include extraneous features designed for other contexts and lack features that museums want. For instance, a conservator may want an alert or failure message at a certain point in an automated process, or to bypass a process entirely, and this may not be possible with a vendor’s all-in-one offering. Working with open-source software like Archivematica provides an opportunity to work with a broader user community and perhaps undertake collaborative development projects where costs are shared and all users benefit. Archivematica software is free, but institutions have paid for new features that all users then enjoy. The model that works best depends on staffing, skills, budget, and the institution’s unique needs.

It is important to note that automation requires digital conservation repository teams to reevaluate their skill sets to determine how roles and responsibilities might shift if steps in the process were automated. In other words, the steps described at the beginning of this section that are performed manually by conservators might be replaced by scripts or by vendor software. Manual processes and automated processes are quite different, like trading a bicycle for a motorcycle: the skills, tools, maintenance, costs, and risks are similar in some ways but very different in others. The changes may require the team to reorganize tasks, learn new skills, add staff, and contract with vendors. Automation is not to be undertaken lightly, but it is extremely practical and useful. Conservators tend to prize hand skills and may believe that a manual approach is inherently superior, but this is one important way in which digital collections differ from collections in traditional media. Manual processes naturally introduce inconsistencies and human error and make the largest collections all but impossible to catalog and store in a reasonable time. 
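The “links in a chain” pattern can be sketched with small, interchangeable functions. Each step here is a simplified stand-in for a real tool (format identification, for instance, would normally call a utility such as Siegfried or FIDO rather than inspect the extension); the point is that any single link can be swapped out without touching the rest of the chain.

```python
from pathlib import Path
from typing import Callable

# Each "microservice" takes a metadata record and returns an enriched copy,
# so steps can be added, modified, or replaced independently.
Step = Callable[[dict], dict]

def identify_format(record: dict) -> dict:
    """Crude format identification from the file extension; a real workflow
    would call a format-identification tool here."""
    record["format"] = Path(record["path"]).suffix.lstrip(".").lower()
    return record

def record_size(record: dict) -> dict:
    """Record the file size as part of the technical metadata."""
    record["size_bytes"] = Path(record["path"]).stat().st_size
    return record

def run_chain(path: str, steps: list[Step]) -> dict:
    """Pass a file through loosely coupled steps, like links in a chain."""
    record: dict = {"path": path}
    for step in steps:
        record = step(record)
    return record
```

Replacing the extension-based `identify_format` with a call to a dedicated tool would change one link only, which is precisely the flexibility the microservice approach offers.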
Properly supervised, automation can significantly reduce errors, increase the rate at which digital material is safely stored, and ensure consistency of storage package formation and structure.

One particularly noteworthy benefit of automation is the ability to import data from the ingest process into collection management systems. This can help reduce manual cataloging processes and keep them consistent. For example, technical metadata can be used to populate fields with technical characteristics (e.g., codecs, file formats), fixity information, the universally unique identifier (UUID) of the storage package, and other information. Using data imports to backfill information in artwork records can greatly reduce the time spent on manual cataloging. The consistent entry of data into the museum’s collection management system enables more database-driven methods to support collection care. For example, this would enable conservators to easily search for vulnerable materials with certain technical characteristics, such as obsolete codecs, in order to proactively assess and treat them.

Moreover, the promise and potential of linked open data continue to be explored through projects like the multi-year Linked Conservation Data project (Ligatus Research Centre n.d.), funded by the UK’s Arts and Humanities Research Council. The project consortium, which formed in 2019, includes more than 20 organizations and has collaborated with the International Institute for Conservation of Historic and Artistic Works (IIC), the Institute of Conservation (Icon), and the American Institute for Conservation (AIC). Its goal is to increase the dissemination and reuse of conservation documentation to support conservation practice (Ligatus Research Centre n.d.). Data-driven collection care and linked open data initiatives are enabled by shifting from a narrative emphasis in conservation documentation to a narrative-database balance. 
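The import described above amounts to a mapping from technical metadata captured at ingest onto collection management system fields. In this sketch, the input dictionary mimics what a characterization tool such as MediaInfo or ffprobe might report, and the output field names are hypothetical placeholders, not actual TMS columns.

```python
def to_cms_fields(tech_md: dict, package_uuid: str) -> dict:
    """Map technical metadata from ingest onto collection management system
    fields for backfilling an artwork component record. All field names here
    are illustrative assumptions, not real TMS column names."""
    return {
        "ComponentFormat": tech_md.get("format_name"),
        "VideoCodec": tech_md.get("video_codec"),
        "FixitySHA256": tech_md.get("sha256"),
        "StoragePackageUUID": package_uuid,
    }
```

Because the same mapping runs for every package, fields like codec and fixity are entered consistently, which is what makes database-driven queries (e.g., "all components using an obsolete codec") reliable.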
For some tasks in an art conservation context, automation is more problematic, especially for condition assessment, creation of preview files, and holistic risk assessment. Because of the high degree of variability in the materials received from artists, it may not be possible to examine or normalize files in batches and generate satisfactory results, even though tools are available to automate these tasks during the ingest process and they are widely used in archives. For 
example, if an archive performs a mass-digitization project and ingests the resulting digital files, file conformance checking software, such as MediaConch by MediaArea, can be used to verify that the files meet the specifications defined by the archive, which is far more practical than checking files one at a time [MediaArea et al. MediaConch (Software) n.d.]. However, artists’ files may not conform to standards for reasons that may be intentional or unintentional. For example, certain creative choices made by the artist or the circumstances of making the artwork (e.g., saturated colors, over- or underexposed footage) may be identified as errors by the software used for audiovisual examination. Conservators examine files individually, documenting and contextualizing their characteristics in terms of the artist’s intentions, the production history of the work, technical research, and so forth. During examination, conservators may use similar software tools to expose various file characteristics, but the conclusions of the assessment are informed by a holistic understanding of the artwork, and potential actions are considered within the framework of conservation ethics.

Software can be used to convert batches of master files to a more manageable and consistent format for preview and research purposes. Some institutions may find that the nature of the collection lends itself to an automated process for this, while others will find that automation generates many unusable files. At MoMA, the DAM is used to share preview files internally for curatorial and research purposes, and the determination of which file to use for that purpose is decided case by case (Brost forthcoming). This is done by evaluating all digital material received at acquisition, noting the differences between the files, selecting and preparing the appropriate file, uploading the file, and then checking the playback. 
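Where batch preview creation is viable, it is commonly scripted around a transcoder such as ffmpeg. The sketch below builds an H.264/AAC preview command for each master; the codec settings are generic starting points rather than house standards, and, as the text cautions, every output still needs a human playback check before it is shared in a DAM.

```python
import subprocess
from pathlib import Path

def preview_command(master: Path, out_dir: Path) -> list[str]:
    """Build an ffmpeg command for an H.264/AAC preview derivative.
    Settings here are illustrative defaults, not preservation specs."""
    preview = out_dir / f"{master.stem}_preview.mp4"
    return [
        "ffmpeg", "-i", str(master),
        "-c:v", "libx264", "-crf", "23",  # lossy but compact for preview use
        "-c:a", "aac",
        str(preview),
    ]

def batch_previews(masters: list[Path], out_dir: Path) -> None:
    """Transcode a batch of masters; failures raise immediately so that
    problem files can be set aside for individual attention."""
    for master in masters:
        subprocess.run(preview_command(master, out_dir), check=True)
```

For collections where automation "generates many unusable files," the same command builder can still serve as a starting point for the case-by-case manual process.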
Some software-based artworks and room-sized installations may not be possible to represent in the DAM, while some multichannel installations may be able to be stitched together to make a useful preview. The diversity and often nonstandard nature of files and formats provided by artists, coupled with the need to understand each artwork in order to represent it faithfully for study, make this a process that requires thoughtful planning before automation is introduced.

Comprehensive risk assessment is also difficult to achieve with automation. As the PERICLES project showed, it is possible to design a system for modeling risks at the level of the digital domain, but risks to the artwork as a whole may be quite different and not discoverable within the digital repository. For example, for one artwork, having an obsolete control system may not be an urgent concern if the artist-assigned value is low and the logic and behavior of the work are well documented. However, for another artwork, the digital files may appear to be straightforward to preserve, but critical instructions for installing the work may be missing. Risk assessment requires examination of the digital elements, but for holistic risk assessment and prioritization, it is necessary to examine the artwork in its entirety.

Digital repositories require ongoing monitoring in all cases, but especially when automation is involved. Institutions need to develop processes for monitoring and verifying the correct operation of the system. All potential failure points should be examined, and strategies for checking on system performance should be developed with those who possess the most technical knowledge of the system. 
It is not enough to be an end-user; when automation is implemented, conservators need to have a deep understanding of what the system was designed to do, how it carries out its tasks, how different software programs or vendor systems interact with each other, what results are expected, and how to evaluate those results.

Recent case studies about MoMA (Brost forthcoming), Denver Art Museum (Colloton and Moomaw 2018), and Tate (Roeck 2016) show how museums with large digital art collections have approached implementing automation. Those just beginning to do preservation can learn from the successes and the difficulties faced by institutions that have pioneered storage systems for art collections. The MoMA case 
study describes how the museum discovered and corrected a software integration issue that had resulted in multiple invalid packages (Brost forthcoming), and a case study by The Metropolitan Museum of Art explains how the institution discovered file corruptions in its art collection storage system and took corrective action (Nichols and Thiesen forthcoming). Neither institution experienced any permanent data loss, largely due to the conservative approaches and redundancy built into their nascent storage systems. Practical applications of the concepts presented in this chapter are highlighted in digital art storage profiles provided by the Denver Art Museum (Case Study 8.1) and The Metropolitan Museum of Art (Case Study 8.2). These case studies and contributions are part of a growing body of literature focused on the preservation of digital art collections.

8.8 Conclusion

Digital conservation repositories must be built on a firm foundation that ensures digital file integrity and security and must functionally support the practice of art conservation in the institution. It is crucial to understand the active-life preservation needs of digital art objects so that models, terms, and tools can be thoughtfully adapted to and refined within a conservation context. Digital art objects should be packaged, described, and contextualized in a consistent manner. This involves organizing artworks into components, recording the artist- or museum-assigned value and status of the component, defining its relationship to other components, and recording technical metadata. Storage of components must, at minimum, involve safely copying digital material off the acquired portable storage devices, securely storing at least two copies in different geographic locations, routinely checking fixity, refreshing storage media as needed, and maintaining the connections between storage packages and their documentation, if the two are not bundled together. To do this successfully requires a committed team empowered by the institution to do preservation on a continual basis. Institutions can take the first steps with modest means and straightforward solutions and build incrementally. Because the overwhelming majority of institutions are in the early stages of deploying digital repositories for artworks, it is essential for the field to develop a community of practice to share approaches and learning experiences so that norms that suit this unique context can be collaboratively established.

Case Study 8.1  Digital Storage for Artworks at the Denver Art Museum

by Kate Moomaw-Taylor

The collection of the Denver Art Museum today includes 90 works with audiovisual and/or digital components within its fine arts collections, as well as 422 such works within the AIGA Design Archives, a collection of graphic design. While the acquisition of these objects began in the 1980s, the development of dedicated, centralized digital storage for artworks did not begin until around 2010. At this time, an interdepartmental working group began meeting, with representatives from the conservation, technology, curatorial, registration, and collection management departments, to formulate and implement more rigorous protocols for acquiring, cataloging, handling, displaying, and storing these works. These efforts were prompted by greater awareness of the needs of these artworks within the museum world, as well as preparations for an unprecedented exhibition of 53 time-based media works from the collection. As artworks were digitized in preparation
for display, the museum implemented a simple digital art storage solution: a dedicated storage array configured as a RAID 5 to provide some insurance against data loss from a drive failure. The device functioned to back up files and data stored on removable media, such as external hard drives, optical disks, and tapes, and to centralize the assets for easier access. As time passed and the collection of digital files associated with artworks grew through digitization efforts and the acquisition of born-digital works, staff recognized the increasing risk of data loss through failure of the RAID and/or the removable media. The need for a more robust storage solution became more and more pressing. Staff in the conservation and technology departments worked to develop expertise in digital preservation standards and practices in this period. Kate Moomaw-Taylor, Associate Conservator of Modern and Contemporary Art; Sarah Melching, Silber Director of Conservation; and Bryon Thornburgh, Director of Technology, led the efforts that developed from this point.

In 2015, the museum had the opportunity to host Eddy Colloton as a graduate intern from the Moving Image Archiving and Preservation program at New York University. The internship was an opportunity to pilot a digital preservation workflow utilizing the preservation software Archivematica [Archivematica (Software) n.d.] to package, store, and retrieve digital files. Archivematica was appealing for its automation of ingest tasks, creation of machine-readable metadata, fixity checking tool, and functionality to retrieve copies of stored files without touching the originals. The pilot project provided the framework and experience for the museum to successfully apply for an Institute of Museum and Library Services (IMLS) grant, Museums for America: Collections Stewardship [Institute of Museum and Library Services (IMLS) n.d.], in the fall of 2015. 
The grant project, which took place between 2016 and 2018, included digitization of the remaining audiovisual content and ingestion of all digital files into a centralized repository. In the process of developing the grant application, staff from conservation and technology worked closely to develop a plan for digital storage of files associated with artworks and included a budget for hardware purchases and subscriptions to a cloud storage service. Self-hosting of Archivematica was also included and closely tied with storage needs. The National Digital Stewardship Alliance (NDSA) Levels of Preservation (NDSA 2019) were key in guiding the decisions. The plan included hosting Archivematica, as well as primary storage for digital files, on one network server. Data from this server would be backed up to a second network server housed in a different building on the museum campus. Finally, the data would also be backed up to a cloud storage service, providing a third redundant copy in a separate geographical region. The technology department’s expertise would be critical in specifying, configuring, and maintaining the hardware, as well as ensuring the security of the data. Conservation contributed knowledge of preservation standards and an estimate of the amount of data to be stored based on the known file formats and durations of video works. A service agreement with Artefactual, the main developer of Archivematica, was also included in the grant to aid in installing and maintaining Archivematica and configuring the primary server.

Conservation and technology staff worked closely to decide on the option for the third redundant copy in a different geographic region. Options other than cloud storage, such as writing files to data tape and storing those tapes at a remote commercial facility or exchanging tapes or hard drives with another institution, were considered. 
The drawbacks of cloud storage were primarily cost, though control over and security of the data were also concerns. In the end, the amount of data estimated seemed affordable for cloud storage, and the accessibility and convenience of this option won out. Grant funding provided the opportunity for a trial during the period of the grant. DuraCloud was selected as the service for its digital preservation ethos and user-friendly upload, download, and data monitoring tools. Whether the museum would
be able to continue funding cloud storage beyond the grant period remained a question, so alternative options were kept in mind. Fortunately, the museum has continued to fund the DuraCloud subscription and some external support for Archivematica. These items have been accounted for in the technology department’s budget, along with the maintenance of the servers. Within the technology department’s budget, these costs do not stand out as unusual, whereas they would be outliers for the conservation budget. Placing these costs in the context of other technology initiatives may have helped to normalize them. Certainly, the coordinated advocacy of the conservation and technology departments, along with the other members of the working group, has been effective thus far.

It remains important, however, to always have a Plan B in mind. With the self-hosting of Archivematica, it is possible to maintain the software without external service. Should Archivematica ever have to be abandoned entirely, the data will remain accessible through the servers or cloud storage and could be migrated to another system. Maintaining two copies of the data on-site means that, barring a calamity, data could be backed up to a different cloud storage provider or to a different solution for a third copy altogether should the current solution no longer be tenable. While this solution is working well now, given the expense and ever-changing nature of digital storage, it is critical to always be thinking ahead.

Case Study 8.2  Digital Storage for Artworks at The Metropolitan Museum of Art

The Metropolitan Museum of Art holds approximately 300 artworks with associated digital files. These works are mostly video-based, but they include installations, software-based works, photographs, and other digital media. Although the museum has a time-based media working group that convened informally in 2001 and became an official museum working group in 2010 (Metropolitan Museum of Art n.d.), digital storage was not part of its purview. Artwork files received on external storage media or generated from tape transfers were routinely copied to the museum’s servers for redundancy. In 2017, an error during a mass file migration on the servers resulted in the corruption of a number of these artwork files. As a result, the museum formed a cross-departmental team to improve its digital storage methods for high-value data, including the art collection (Nichols and Thiesen forthcoming). Spearheading the subsequent research and recovery effort were Alexandra Nichols, the museum’s first Time-Based Media Conservation Fellow, and Milo Thiesen, Lead Technical Analyst in Digital Asset Management, along with others from the Digital Department and the Information Systems and Technology Department (IS&T).

As the artwork files were restored to the server from other backup copies and external storage media, new workflows were implemented, including arranging the digital files for each artwork into components and improving the packaging of files by creating “bags” using the open-source tool BagIt (Kunze et al. 2018; see Chapter 13). By implementing BagIt, The Met was able to create and move bags while verifying upon every move that the contents had not changed. Nichols and Thiesen also worked to improve the storage location for the bags. 
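The bag structure The Met relies on is defined by the BagIt specification (RFC 8493): a data/ payload directory, a bagit.txt declaration, and a checksum manifest that can be re-verified after every move. Production workflows would use the maintained bagit-python library rather than writing this by hand; the stdlib sketch below only illustrates the minimal structure and the verify-after-move check.

```python
import hashlib
from pathlib import Path

def _sha256(path: Path) -> str:
    """Stream the file in 1 MiB chunks and return its SHA-256 hex digest."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def make_bag(payload_dir: Path, bag_dir: Path) -> None:
    """Package a directory of files as a minimal BagIt bag (RFC 8493):
    a data/ payload, a bagit.txt declaration, and a SHA-256 manifest."""
    data = bag_dir / "data"
    data.mkdir(parents=True)
    manifest_lines = []
    for src in sorted(p for p in payload_dir.rglob("*") if p.is_file()):
        rel = src.relative_to(payload_dir)
        dest = data / rel
        dest.parent.mkdir(parents=True, exist_ok=True)
        dest.write_bytes(src.read_bytes())
        manifest_lines.append(f"{_sha256(dest)}  data/{rel.as_posix()}")
    (bag_dir / "bagit.txt").write_text(
        "BagIt-Version: 1.0\nTag-File-Character-Encoding: UTF-8\n")
    (bag_dir / "manifest-sha256.txt").write_text("\n".join(manifest_lines) + "\n")

def bag_is_valid(bag_dir: Path) -> bool:
    """Re-hash every payload file against the manifest, as one would after
    each move of the bag, and report whether the contents are unchanged."""
    for line in (bag_dir / "manifest-sha256.txt").read_text().splitlines():
        if not line.strip():
            continue
        expected, rel = line.split("  ", 1)
        target = bag_dir / rel
        if not target.is_file() or _sha256(target) != expected:
            return False
    return True
```

Because the manifest travels inside the bag, the fixity check requires nothing beyond the bag itself, which is what makes the verify-on-every-move workflow practical.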
They collaborated with IS&T on the setup of a designated artwork server where certain routine operations like compressing files to save space and deleting apparent duplicates would not be executed. In addition, IS&T lengthened
the data retention period on the artwork server and made more frequent backups. Around this time, Nichols interviewed nearly a dozen peer institutions about their digital preservation practices in order to inform workflows at The Met. Through these interviews, Nichols found that practices were diverse and context-dependent. Based on these results, Nichols and Thiesen formulated a plan tailored to The Met’s needs and developed an RFP (request for proposal) for choosing a storage vendor. The RFP criteria were primarily informed by the NDSA Levels of Preservation (NDSA 2019) and the information package structure described in the OAIS model (OAIS 2012). The museum felt that the ideal vendor would not only provide secure ingest and storage for high-value data from multiple sources across the museum (i.e., artworks, video production) but would also be able to provide additional digital preservation services if needed, such as pre-ingest support. The museum selected the company Digital Bedrock (Los Angeles), which it felt would meet the RFP criteria and best complement The Met’s existing staff skill set (Nichols and Thiesen forthcoming).

Broader institutional support materialized due to several concurrent developments. By chance, during the error recovery period that began in 2017, a time-based media conservation assessment was being conducted (Kramer et al. 2021; see Chapter 4). In 2019, the recommendations of the assessment team, coupled with the sustained advocacy efforts of staff, particularly Nora Kennedy, Sherman Fairchild Conservator in Charge in the Department of Photograph Conservation, led to a year of funding for Digital Bedrock’s storage and obsolescence monitoring services, and a time-based media conservator, Jonathan Farbowitz, who joined the staff as Associate Conservator of Time-Based Media. Farbowitz and Thiesen continued to work on the digital storage improvements, and Farbowitz enlisted the help of more staff. 
The goal was to complete a collection-wide audit and check each artwork to ensure that its associated files were properly cataloged, described, and stored. They worked to ensure that each bag on the artwork server corresponded to a single artwork component as cataloged in the collection database, TMS (The Museum System) (see Chapter 12). To ensure a durable connection between the bagged files and the artwork, the bag name contained the user-created TMS component number as well as the persistent artwork record identifier (“ObjectID”) and the persistent component identifier (“ComponentID”). All of the component bags were placed within a parent directory that included the artist’s name and the title of the work in the directory name.

New team members included Diana Díaz-Cañas, Assistant Conservator in Photograph Conservation; Ashley Hall, Manager of Collection Information in the Digital Department; and Caroline Gil Rodríguez, Andrew W. Mellon Fellow in Media Conservation. Farbowitz, Díaz-Cañas, and Gil Rodríguez worked to complete the standardization of all of the bags on the artwork server and prepare them for ingest into storage with Digital Bedrock. The collection managers in the curatorial departments with digital works (Catherine Burns in Modern and Contemporary Art and Meredith Reiss in Photographs) were also involved in the audit as artworks were organized and described. TMS was updated with component-level descriptions of digital elements and location information. Once ingest began, Digital Bedrock checked the fixity on each bag and provided a unique identifier that could be entered into TMS and used to locate the component in storage. Digital Bedrock also generated technical metadata upon ingest to monitor file format obsolescence, and this metadata may at some point be imported into TMS to reduce manual cataloging. The museum’s data are stored with Digital Bedrock on LTO tape in three geographic locations. 
An additional copy of the data is stored onsite on the museum’s artwork server. Preview copies of artwork files are stored in
the museum’s digital asset management system; streaming by staff is permitted, but downloading of preview copies is by permission only.

In a recent discussion with the author, Farbowitz and Thiesen shared some valuable advice for those getting started (Farbowitz and Thiesen 2021). Farbowitz quoted the aphorism that the perfect is sometimes the enemy of the good; he drew a comparison between the “more product, less process” approach within archives (Greene and Meissner 2005) and the need for museums to get their digital collections safely stored sooner rather than later, even if they are not able to implement their ideal workflow. Thiesen stressed the importance of working iteratively, so that the work can begin while refinements are being made and lessons learned inform the plans for what happens next. Both emphasized the importance of documenting one’s present as well as future actions—Thiesen extensively uses comments when coding, and Farbowitz uses tickets in the project management software Jira and the conservation module of TMS to track issues that are outstanding so he can return to them later. Farbowitz summed it up by saying, “Don’t obsess over the perfect system. Sometimes you just have to go with something to be able to move forward and actually get work done.”

This case study is based on an interview that author Amy Brost conducted with two staff members from The Metropolitan Museum of Art (New York)—Jonathan Farbowitz, Associate Conservator of Time-Based Media, and Milo Thiesen, Lead Technical Analyst, Digital Asset Management (Farbowitz and Thiesen 2021).

Bibliography

“Archivematica (Software).” Artefactual Systems Inc., n.d. www.archivematica.org/en/.

Brost, Amy. “Making Practice Practical: Developing Digital Preservation Storage for Artworks at MoMA.” In Electronic Media Review, Volume 6: 2019–2020. Washington, DC: AIC Electronic Media Group, forthcoming.

Colloton, Eddy, and Kate Moomaw. “Rewind, Pause, Playback: Addressing a Media Conservation Backlog at the Denver Art Museum.” The Electronic Media Review Volume 5: 2017–2018 (2018). http://resources.culturalheritage.org/emg-review/volume-5-2017-2018/colloton/.

Community Research and Development Information Service (CORDIS). “A New Approach to Digital Content Preservation.” Research*eu Results Magazine, March 2017. https://cordis.europa.eu/article/id/400798-coping-with-the-forces-beneath-the-earth-s-crust.

Consultative Committee for Space Data Systems (CCSDS). “Audit and Certification of Trustworthy Digital Repositories.” CCSDS Secretariat, September 2011. https://public.ccsds.org/Pubs/652x0m1.pdf.

———. “Reference Model for an Open Archival Information System (OAIS).” CCSDS Secretariat, June 2012. https://public.ccsds.org/pubs/650x0m2.pdf.

Darányi, Sándor, John McNeill, Yiannis Kompatsiaris, Panagiotis Mitzias, Marina Riga, Fabio Corubolo, Efstratios Kontopoulos et al. “PERICLES—Digital Preservation through Management of Change in Evolving Ecosystems.” In The Success of European Projects Using New Information and Communication Technologies, 51–74. Colmar, France: SCITEPRESS—Science and Technology Publications, 2015. https://doi.org/10.5220/0006163600510074.

DOCAM (Documentation and Conservation of Media Arts Heritage) Research Alliance. “DOCAM Documentation Model.” The Daniel Langlois Foundation for Art, Science, and Technology, 2010a. www.docam.ca/en/presentation-of-the-model.html.

———. “DOCAM Glossaurus.” The Daniel Langlois Foundation for Art, Science, and Technology, 2010b. www.docam.ca/glossaurus/hierarchy.php?lang=1. 
Farbowitz, Jonathan, and Milo Thiessen. Interview with Jonathan Farbowitz and Milo Thiessen by Amy Brost, March 12, 2021. Ferrante, Ricardo. “Care of Born-Digital Collections.” In Preventive Conservation: Collection Storage, edited by Lisa Elkin, Christopher A. Norris, Society for the Preservation of Natural History Collections, American Institute for Conservation of Historic and Artistic Works, Smithsonian Institution, and

135

Amy Brost George Washington University, 1st ed. New York: Society for the Preservation of Natural History Collections, 2019. Greene, Mark, and Dennis Meissner. “More Product, Less Process: Revamping Traditional Archival Processing.” The American Archivist 68, no. 2 (September  1, 2005): 208–63. https://doi.org/10.17723/ aarc.68.2.c741823776k65863. Haidvogl, Martina, and Layna White. “Reimagining the Object Record: SFMOMA’s MediaWiki.” Stedelijk Studies (Imagining the Future of Digital Archives and Collections), no. 10 (2020). https://stedelijkstudies.com/journal/reimagining-the-object-record-sfmomas-mediawiki/. Institute of Museum and Library Services (IMLS). “Museums for America: Collections Stewardship.” Accessed November 30, 2021. www.imls.gov/grants/available/museums-america. InterPARES Trust. “Checklist for Cloud Service Contracts: Final Version.” February 26, 2016. https:// interparestrust.org/assets/public/dissemination/NA14_20160226_CloudServiceProviderContracts_ Checklist_Final.pdf. ———. “Checklist for Ensuring Trust in Storage Using IaaS.” August 3, 2016. https://interparestrust.org/ assets/public/dissemination/EU08_IaaS_Checklistv1.2_.pdf. Kramer, Lia, Alexandra Nichols, Mollie Anderson, Nora Kennedy, Lorena Ramírez-López, and Glenn Wharton. “Conducting a Time-Based Media Conservation Assessment and Survey at The Metropolitan Museum of Art.” Journal for the American Institute of Conservation (2021). https://doi.org/10.1080/0 1971360.2020.1855866. Kunze, J., J. Littman, E. Madden, J. Scancella, and C. Adams. “The BagIt File Packaging Format (V1.0).” RFC Editor, October 2018. https://doi.org/10.17487/RFC8493. Lagos, Nikolaos, Marina Riga, Panagiotis Mitzias, Jean-Yves Vion-Dury, Efstratios Kontopoulos, Simon Waddington, Pip Laurenson, Georgios Meditskos, and Ioannis Kompatsiaris. “Dependency Modelling for Inconsistency Management in Digital Preservation—The PERICLES Approach.” Information Systems Frontiers 20, no. 1 (February 2018): 7–19. 
https://doi.org/10.1007/s10796-016-9709-z. Langley, Somaya. Digital Preservation Maturity Model, Risks, Trusted Digital Repository Certification and Workflow Crosswalk. Cambridge: University of Cambridge, December 17, 2018. https://doi.org/10.17863/ CAM.34266. Laurenson, Pip. “Authenticity, Change and Loss in the Conservation of Time-Based Media Installations.” Tate Papers, no. 6 (Autumn 2006). www.tate.org.uk/research/publications/tate-papers/06/ authenticity-change-and-loss-conservation-of-time-based-media-installations. Laurenson, Pip, Kevin Ashley, Luciana Duranti, Mark Hedges, Anna Henry, John Langdon, Barbara Reed, Vivian Van Saaze, and Renée Van de Vall. “The Lives of Digital Things: A Community of Practice Dialogue.” Tate, 2017. www.tate.org.uk/about-us/projects/pericles/lives-digital-things. Ligatus Research Centre. “Linked Conservation Data.” n.d. Accessed January 5, 2021. www.ligatus.org. uk/project/linked-conservation-data. Marks, Steve. Becoming a Trusted Digital Repository. Edited by Michael J. Shallcross. Chicago: Society of American Archivists, 2015. McKemmish, Sue. “Placing Records Continuum Theory and Practice.” Archival Science 1, no. 4 (December 2001): 333–59. https://doi.org/10.1007/BF02438901. MediaArea, Jérôme Martinez, Dave Rice, Guillaume Roques, Florent Tribouilloy, Blewer, Ashley, and Tessa Fallon. “MediaConch (Software) (version 18.03.2). Windows, MacOS, Debian, Ubuntu, Linux Mint, RedHat Enterprise Linux, CentOS, Fedora, openSUSE, ArchLinux, Flatpak, Snap, AppImage. MediaArea.” n.d. https://mediaarea.net/MediaConch. Metropolitan Museum of Art. “Time-Based Media Working Group.”The Metropolitan Museum of Art. Accessed December  22, 2020. www.metmuseum.org/about-the-met/conservation-and-scientific-research/ time-based-media-working-group. Mitcham, Jenny, and Paul Wheatley. “Digital Preservation Coalition Rapid Assessment Model Ver 1.” Digital Preservation Coalition, September 19, 2019. https://doi.org/10.7207/dpcram19-01. 
Museum of Modern Art (MoMA), New Art Trust, San Francisco Museum of Modern Art (SFMOMA), and Tate. “Digital Preservation: Sustaining Media Art.” Matters in Media Art, 2015. http://mattersin mediaart.org/sustaining-your-collection.html. National Digital Stewardship Alliance (NDSA). “2019 LOP Matrix V2.0.” October 14, 2019. https://osf. io/2mkwx/. ———. “Levels of Digital Preservation.” National Digital Stewardship Alliance (NDSA), 2019. https://ndsa. org/publications/levels-of-digital-preservation/.

136

Digital Storage for Artworks National Museum of the American Indian (NMAI). “Moving the Collections.” Smithsonian Institution, n.d. https://americanindian.si.edu/explore/collections/moving. Nichols, Alexandra, and Milo Thiesen. “A Cross-Departmental Collaboration to Improve Digital Storage at The Metropolitan Museum of Art.” In Electronic Media Review, Volume 6: 2019–2020. Washington, DC: AIC Electronic Media Group, forthcoming. OAIS. “Reference Model for an Open Archival Information System (OAIS): OAIS Magenta Book.” CCSDS 650.0-M-2. The Consultative Committee for Space Data System, 2012. https://public.ccsds.org/ pubs/650x0m2.pdf. O’Meara, Erin, and Kate Stratton. Digital Preservation Essentials. Edited by Christopher J. Prom. Chicago: Society of American Archivists, 2016. Owens, Trevor. The Theory and Craft of Digital Preservation. Baltimore: Johns Hopkins University Press, 2018. https://osf.io/5cpjt. Phillips, Megan, Jefferson Bailey, Andrea Goethals, and Trevor Owens. “The NDSA Levels of Digital Preservation: An Explanation and Uses.” In Archiving 2013, Final Program and Proceedings of the 10th IS&T Archiving Conference, 2–5 April  2013, edited by Peter Burns and Society for Imaging Science and Technology, 216–22. Washington, DC: Society for Imaging Science and Technology, 2013. www. digitalpreservation.gov/documents/NDSA_Levels_Archiving_2013.pdf. Prater, Scott. “How to Talk to IT About Digital Preservation.” Journal of Archival Organization 14, no. 1–2 (April 3, 2017): 90–101. https://doi.org/10.1080/15332748.2018.1528827. Rice, Dave, and Annie Schweikert. “Microservices in Audiovisual Archives.” International Association of Sound and Audiovisual Archives (IASA) Journal, no. 50 (August  7, 2019). https://doi.org/10.35320/ ij.v0i50.70. Roeck, Claudia. “Preservation of Digital Video Artworks in a Museum Context: Recommendations for the Automation of the Workflow from Acquisition to Storage.” Master’s thesis, Bern University of the Arts, 2016. 
www.academia.edu/41381787/Preservation_of_digital_video_artworks_in_a_museum_ context_Recommendations_for_the_automation_of_the_workflow_from_acquisition_to_storage and www.academia.edu/41381788/Masters_Thesis_Appendix. Ryan, Heather, and Walker Sampson. The No-Nonsense Guide to Born-Digital Content, 1st ed. London: Facet, 2018. https://doi.org/10.29085/9781783302567. Schaefer, Sibyl, Nancy McGovern, Andrea Goethals, Eld Zierau, and Gail Truman. “Digital Preservation Storage Criteria.” January  16, 2018. https://osf.io/sjc6u/; https://doi.org/10.17605/OSF.IO/ SJC6U. Slade, Sarah, David Pearson, and Steve Knight. “An Introduction to Digital Preservation.” In Preventive Conservation: Collection Storage, edited by Lisa Elkin, Christopher A. Norris, Society for the Preservation of Natural History Collections, American Institute for Conservation of Historic and Artistic Works, Smithsonian Institution, and George Washington University, 1st ed. New York: Society for the Preservation of Natural History Collections, 2019. Society of American Archivists (SAA). “Dictionary of Archives Terminology.” 2021. https://dictionary. archivists.org/index.html. Van de Vall, Renée, Hanna Hölling, Tatja Scholte, and Sanneke Stigter. “Reflections on a Biographical Approach to Contemporary Art Conservation.” In ICOM-CC 16th Triennial Conference Preprints, Lisbon 19–23 September 2011, edited by Janet Bridgland. Lisbon: Critério-Produção Grafica, Lda, 2011. https://pure.uva.nl/ws/files/1262883/115640_344546.pdf. Van Garderen, Peter, Paul Jordan, T. Hooten, Courtney C. Mumma, and Evelyn McLellan. “The Archivematica Project: Meeting Digital Continuity’s Technical Challenges.” In Proceedings of the Memory of the World in the Digital Age: Digitization and Preservation. An International Conference on Permanent Access to Digital Documentary Heritage, 26–28 September  2012, Vancouver, British Columbia, Canada, edited by Luciana Duranti and Elizabeth Shaffer, 1349–59. 
United Nations Educational, Scientific and Cultural Organization (UNESCO), 2013. http://ciscra.org/docs/UNESCO_MOW2012_Proceedings_ FINAL_ENG_Compressed.pdf. Van Saaze, Vivian, Claartje Rasterhoff, and Karen Archey. “Innovative Digital Infrastructures: The Issue of Sustainability.” Edited by Dutton Hauhart. Stedelijk Studies, no. 10 (Fall 2020). https://stedelijkstudies. com/journal/imagining-the-future-of-digital-archives-and-collections/. Van Saaze, Vivian, Glenn Wharton, and Leah Reisman. “Adaptive Institutional Change: Managing Digital Works at the Museum of Modern Art.” Museum and Society 16, no. 2 (July 30, 2018): 220–39. https:// doi.org/10.29311/mas.v16i2.2774.

137

Amy Brost Wharton, Glenn. “Reconfiguring Contemporary Art in the Museum.” In Authenticity in Transition: Changing Practices in Contemporary Art Making and Conservation: Proceedings of the International Conference Held at the University of Glasgow, 1–2 December  2014, edited by Erma Hermens and Frances Robertson. London: Archetype Publications, 2016. Wharton, Glenn, and Barbra Mack. “A Case for Digital Conservation Repositories.” In Electronic Media Review, Volume 1: 2009–2010, 23–44. Washington, DC: AIC Electronic Media Group, 2012. http://resources.cul turalheritage.org/wp-content/uploads/sites/15/2016/07/Vol-1_Ch-5_Mack_Wharton.pdf. Zierau, Eld, Nancy McGovern, Sibyl Schaefer, and Andrea Goethals. “An Overview of the Digital Preservation Storage Criteria and Usage Guide.” In Proceedings of the 16th International Conference on Digital Preservation (IPRES). Amsterdam, 2019. https://ipres2019.org/static/proceedings/iPRES2019.pdf.


9 STAFFING AND TRAINING IN TIME-BASED MEDIA CONSERVATION

Louise Lawson

Editors’ Notes: Louise Lawson is Conservation Manager at Tate, where she heads the time-based media conservation department with 11 permanent conservation staff members, currently the largest dedicated TBM department in any museum internationally. Drawing on her ten years of experience as a team manager and developer, Lawson explores in this chapter how to enable time-based media conservation from a staffing perspective: she identifies five staffing models found at different institutions, evaluates recent job postings for time-based media conservators, investigates required skill sets and responsibilities, and provides an overview of education and training opportunities for time-based media conservators.

9.1 Context

With the growing number of private and public collections committed to collecting and displaying time-based media artworks, this chapter provides insight into the skills and activities that sustain and preserve a time-based media art collection, with a focus on who should be undertaking these activities, what skills are required, and what training is needed. Currently, only a few conservation programs offer specialized training in time-based media conservation, and many institutions have staff from a variety of educational backgrounds and conservation disciplines supporting their time-based media collections. This chapter explores the skills and knowledge needed to perform time-based media conservation; considers the areas of responsibility and the opportunities for training and pathways into the profession; and examines different institutional models for how time-based media conservation is currently staffed and how it might be staffed. It outlines the need for dedicated and specialized conservation staff to support time-based art collections, together with ongoing professional development and learning. The chapter is developed from my professional practice as the time-based media conservation manager at Tate, where I head a time-based media conservation team of 11 permanent staff conservators and conservation technicians, as well as specialist contractors and temporary project staff. Since the first position in 1996 and the team’s inception in 2004, the team and its roles have developed in response to Tate’s institutional vision, the expanding and evolving nature of the collection, the overall series of Tate programs (exhibition, display, and loan), and new practices around the care of artworks being acquired. The current staffing model at Tate ensures that the same level of care, specialist knowledge, and skill is afforded to time-based media artworks as to all other artworks in Tate’s collection.

9.2 Staffing Models and the Conservator

Conservation as a practice is a knowledge-producing activity, and as described by Pereira and Raposo (2019, 184), “knowledge is a collective construction, the interaction, the sharing . . . are the source of creation and consolidation of tacit knowledge.” Being able to develop knowledge is critical for the care of time-based media artworks, and this presents a different staffing proposition, because the creation and consolidation of such layered knowledge takes time (Lawson and Laurenson 2019). Time-based media conservation in particular requires a wide range of skills and knowledge: managing the artworks’ changing and unfolding nature and their associated technologies, working to determine long-term preservation strategies, and adopting staffing models that adapt to and support your collection. In considering existing staffing arrangements, I have identified five key staffing models (see Table 9.1). Three of these come from my direct experience: the “time-based media conservator model,” whereby a time-based media conservator or a time-based media conservation team is in place; the “conservator model,” whereby a conservator not specialized in time-based media has responsibility for the collection; and the “non-conservator model,” whereby institutional staff outside the conservation profession have responsibility for the time-based media collection. The latter tends to rely on retaining or using existing expertise within an institution, such as curatorial or audiovisual staff; at the Los Angeles County Museum of Art, for example, the Collections and Digital Assets team supports the collection, although its members have time-based media conservation and preservation experience (Heinen 2020).
Two further models were identified by Laurenson (2013, 37) in her paper “Emerging Institutional Models and Notions of Expertise for the Conservation of Time-Based Media Works of Art”: “a cross-disciplinary model” and “an external agency model.” Laurenson describes an external agency model as one that operates on a national level and works closely with a group of museums to support the conservation of time-based media (i.e., the LIMA Media Art Platform) (LIMA n.d.). Under this last category, I would now also include private conservation companies.

Table 9.1 Institutional models for staffing in time-based media conservation

1. The Time-Based Media Conservator Model: there is a dedicated time-based media conservator or time-based media conservation team (for example, one time-based media conservator plus a minimum of one other position, such as another time-based media conservator or a time-based media conservation technician).
2. The Conservator Model: there is no time-based media conservator, but another conservator takes responsibility for the conservation of a time-based media collection and develops time-based media expertise.
3. The Non-Conservator Model: there are no conservators, and another discipline in the institution takes responsibility for the conservation of time-based media artworks.
4. The Cross-Disciplinary Model: an internal team is developed to respond collaboratively to the needs of time-based media artworks, including conservators, registrars, technicians, curators, and legal and copyright representatives, plus any additional freelance specialists.
5. The External Agency Model: an external agency on a national level, or a private time-based media conservation company, works closely with a group of museums to support the conservation of time-based media.

A cross-disciplinary model is described by Laurenson (2013, 37) as an internal team developed “to respond collaboratively to the needs of time-based media artworks including conservators, registrars, technicians, curators and legal and copyright representatives, plus any additional freelance specialists.” This model was identified as a mode of collective working to address the challenges of time-based media artworks, referencing the San Francisco Museum of Modern Art (SFMOMA) “team media” as an example (see Chapter 10). Another example of this model was the Variable Media Initiative at the Guggenheim (Guggenheim Variable Media n.d.), active between 1999 and 2004, which included curatorial and conservation staff focused on the challenges of time-based artworks (Phillips 2020). The model was also employed at Glasgow Museums’ Gallery of Modern Art, whose “working group” addressed the challenges of “Audiovisual, Installation, Conceptual and Ephemeral Art (AV/ICE)” (Hendrick 2015), as well as at the Museum of Modern Art (MoMA), where in 2012 the Media Working Group was expanded to ensure that a cross-departmental policy for the needs of time-based media was embedded into core structures and processes (Van Saaze et al. 2018). Importantly, this model places a time-based media conservator central to the discussion; similarly to MoMA, the “time-based art working group” at the Art Gallery of New South Wales (AGNSW) undertook activity to develop structures and processes for time-based media conservation across the institution (Sherring et al. 2018). The time-based media conservator model can be viewed as the ideal model.
In this model, you would have a conservator with all the relevant knowledge and skills to sustain a time-based media collection; in addition, they are trained and immediately ready to address the needs of a collection. Of course, it is easy to say that all institutions with a time-based media collection should have a trained time-based media conservator, but the current reality is that this is not the case. Each institution has its own characteristics and level of resources for dealing with a time-based media collection, so it is difficult to say that one model fits all institutions. Adding a role to any staffing model is challenging, as you need to demonstrate the need for, and benefit of, additional funding to support a new position. The conservator model provides a conservation view of the care and management of a time-based media collection, even though the conservator is not trained specifically in time-based media. A conservator will have key qualities and skills in their approach, thought process, and decision-making that they can apply. The difficulty is that, without training or professional development, it may not be immediately apparent what the challenges or risks are or what is needed for preservation and access. Such conservators also have to balance this with the reality that time-based media may have been added to their existing role, so they will have competing priorities and focus. However, if a non-time-based media conservator is invested in developing their time-based media skills and knowledge, they can support the care of a time-based media collection or even transition to becoming a trained time-based media conservator, if they are willing and there is an institutional appetite for this focus. Although advocacy (see Chapter 6) is paramount in conservation, it gains additional relevance in time-based media conservation, as the role and responsibilities of a conservator can often be more difficult to visualize and advocate for.
The work within time-based media conservation can look and feel very different, even though the core conservation principles that guide the conservation field are the same. Therefore, when advocating, it is important to define what time-based media conservation is, share this definition and need across the institution, and be consistent with the message. Domínguez Rubio (2014, 633) describes digital artworks as “vectors of institutional and cultural change” that, according to Van Saaze et al. (2018, 220), challenge “boundaries and redistribute competencies and expertise.” It can be challenging to ensure that the same level of care is provided for time-based media artworks and that this responsibility falls to a conservator. The models described here can be used to lobby for time-based media conservation staff. For an institution that does not have any conservation staff, there will be a financial tipping point at which, I would argue, it is more cost-effective to hire a permanent conservator. This is not the only tipping point that can dictate the need for a conservator. It can relate to a collection size and need that “is impossible to ignore” (Van Saaze et al. 2018, 224), as at MoMA in 2007 and The Metropolitan Museum of Art (The Met) in 2018; to governance and conservation, as at Tate in 1996; or to working groups or initiatives looking more broadly at issues of collecting time-based artworks. Such groups and initiatives have led to conservator appointments: at the Guggenheim following the Variable Media Initiative in 2008; at the Smithsonian Institution, which set up a working group in 2009 (Goodyear 2020), leading to the appointment of time-based media conservators at both the Smithsonian American Art Museum (SAAM) and the Hirshhorn Museum and Sculpture Garden (HMSG) in 2015; and at the AGNSW in 2015 (Laurenson 2013; Van Saaze et al. 2018; Sherring et al. 2018; Feston-Brunet 2020; Finn 2020; Nichols 2020; Phillips 2020). The trend of appointments is continuing, with new roles created at the Australian Centre for the Moving Image (ACMI) in 2019 and the first conservator role in Canada in 2020.
Time-based media conservation is also growing in the commercial sector. As Fino-Radin (2018, 1) acknowledges, the emergence of institutional time-based media conservation roles “has created the need for time-based media conservation expertise outside the walls of institutions,” and private practice is developing in response. The development of such roles will hopefully continue so that the needs of time-based media collections can be addressed.

9.3 What Skills and Knowledge?

Working in time-based media conservation requires a strong grounding in conservation married with a blend of technical skills and knowledge. There also needs to be an understanding of the philosophical and ethical issues that may pertain to works of art when grappling with how artworks change and the role of the conservator within this (Cane 2009; Lawson and Cane 2016). In order to reflect on the current skills and knowledge one might expect a time-based media conservator to have, I have analyzed seven recent time-based media conservator vacancies. The posts were announced over a 12-month period from across the globe: Tate (April 2019), the Guggenheim Museum (July 2019), The Met (August 2019), the San Francisco Museum of Modern Art (February 2020), M+ (March 2020), the Stedelijk Museum (March/April 2020), and the Art Gallery of Ontario (January–May 2020). In analyzing the job descriptions, four key skill areas, alongside a list of knowledge requirements, were identified as critical. These are as follows:

1. Examination, treatment, and documentation of time-based media artworks:

• Able to undertake a wide range of conservation-related activities, such as examination; condition, quality, or function assessments; documentation; and conservation treatments.
• Able to assess the condition, vulnerability, and displayability of time-based media artworks, including the formulation of long-term preservation strategies.
• Able to apply conservation procedures to formulate and undertake treatments ensuring high standards of preservation and presentation.
• Able to assess and prepare both media and equipment relating to time-based media artworks.


2. Conservation and research of time-based media artworks:

• Able to develop long-term preservation strategies for time-based media artworks, including recommendations for the artworks’ ongoing care and maintenance and addressing issues of obsolescence.
• Able to create, propose, and execute plans for auditing a time-based media art collection to ensure long-term persistence and usability of artworks.
• Able to implement best practices for the conservation, storage, and documentation of the collection.
• Able to develop and contribute to conservation/preservation policies, procedures, protocols, and workflows.
• Able to work closely with digital and information systems teams to develop, plan, or maintain a digital repository solution for time-based artworks.
• Able to conduct research on the materials, techniques, and practices of artists.
• Able to identify requirements for, develop, and support a time-based media art lab for use in condition assessment, migration, and treatment of time-based artworks and obsolescent display equipment.

3. Program-related activity for time-based artworks:

• Able to provide expert advice on the condition, vulnerability, and displayability of time-based media artworks for their acquisition, display, and loan.
• Able to prepare time-based media artworks scheduled for loan and display.
• Able to develop and manage project budgets for display, loan, and acquisition.
• Able to advise and train staff in best practices for the care, handling, display, storage, packing, and shipping of time-based media artworks.
• Able to coordinate and manage the supervision and completion of projects by vendors and external specialists.

4. The soft skills:

• To have a deep curiosity and problem-solving capacity with unfamiliar media, technology, and rapidly changing environments.
• To have a drive for continually gaining new skills and knowledge.
• To have strong oral, written, and interpersonal communication skills.
• To be an innovative and enthusiastic team player with strong critical-thinking and problem-solving skills.
• To be able to work in a dynamic and continuously challenging environment.
• To be able to adapt and work on multiple projects, with multiple deadlines.
• To have a strong professional practice: familiarity with current research and developments in the field, connection to colleagues to ensure collaborative best practice, and commitment to sharing through conferences and publications.
• To be able to collaborate internally and externally.

5. Knowledge requirements in time-based media conservation:

• Knowledge of best practices for the condition assessment, documentation, installation, and preservation of time-based artworks across a range of analog and digital formats.
• Knowledge of historical and current technologies, such as film, video, audio, slides, performance, and software, with full consideration of practical and ethical issues relating to such artworks.
• Knowledge of conservation methodologies, ethics, and techniques for time-based media conservation and for conservation documentation and treatments.
• Knowledge of how to address the preservation and access needs of complex artworks, including installation, performance, and other types of time-based media art.
• Knowledge of modern and contemporary visual culture, architecture, and design, including materials/techniques and conservation techniques and theories.

The skills needed demonstrate the depth and breadth that is expected of a time-based media conservator, but also what is required to ensure that a collection can persist. The requirements are framed within strong conservation and digital preservation standards, with the need to research and develop long-term preservation strategies that consider ongoing care and maintenance. They highlight a mix of skills, from theoretical thinking, to developing conservation or preservation policies, to practical outputs in developing, planning, and maintaining solutions for the long-term care of time-based media artworks: a media laboratory in the case of M+, a digital repository in the case of The Met, or the ability to assess works and collections to ensure the long-term persistence of and access to artworks in the case of the Guggenheim, SFMOMA, and Tate. The conservator must use these skills to understand time-based media artworks in different contexts, such as acquisition, display, or loan, but also to move beyond simply acquiring, displaying, and lending a time-based media artwork, as at Tate or M+. Critical are the soft skills that all institutions require; in this context, soft skills refer to a range of people or social skills, behaviors, and attributes, such as being a team player, innovative, adaptable, and collaborative. These support the development of practice within the dynamic and challenging environment that is time-based media conservation. The core knowledge required across the institutions centers on best practices in assessment, documentation, installation, and preservation; historical and current technologies; conservation methodologies, ethics, and techniques; and broad contextual knowledge.
Beyond this, it is important for the conservator, as described in the ICOM-CC Training needs for the conservation of modern and contemporary art (ICOM-CC 2010, 35), to have “a good awareness of more subjective concepts that are central to contemporary art conservation, such as authenticity, meaning, intent, integrity, authorship . . . etc.” This provides a broad constituency of knowledge and skills expected for a time-based media conservator, so it is important to acknowledge that within this, each conservator will have more or less experience in different areas, as each person will bring their existing education, knowledge, and skills to any position. Time-based media conservators do have a broad base of skills and knowledge but will tend to specialize in a particular area, such as digital preservation or perhaps video, audio, performance-based, or software-based artworks. It is important to ensure the ongoing development of conservation and technical knowledge and skills of any time-based media conservator, as this ensures the sustainability of existing skills as well as the addition of new ones.

9.4 Who Has Responsibility?

A time-based media conservator is critical within any institution that has a time-based media collection, as they have the key skills and knowledge to ensure the long-term preservation of the collection. However, there is still much to be done before a time-based media conservator becomes a standard requirement, and many institutions are currently navigating how to articulate the need. As Sherring describes, there is a “reluctance to treat TBM as ‘real art’ or [they] see TBM as disposable” (Sherring 2020). Unfortunately, this is still a common view, as other roles in an institution are often asked to support aspects of a time-based media artwork, such as display and installation, negating in doing so the perceived need for a time-based media conservator. In these instances, only one dimension of the work is being considered, the ability to display a time-based media artwork, and the whole range of activities needed to care for or preserve an artwork in a collection is absent, placing collections at risk. Navigating beyond this commonly held view requires highlighting the whole range of activities that support a work in a collection: Who is supporting the work’s acquisition? How do you know you have the artwork? Do you have the best-quality master material? What is your preservation plan? Does the storage meet minimum preservation standards? Such questions can aid in articulating all that is needed to support the care of time-based media artworks and the need for a conservator. Addressing the challenges that time-based media conservation presents requires collaboration across technologies and areas of preservation, and teams working together to develop systems or solutions for the care of such collections, for example, through the development and implementation of a digital repository (see Chapter 8). Interdisciplinary and cross-disciplinary working is essential. As Laurenson has stated, “as we develop an ever clearer understanding of what it means to conserve time-based media works of art, it is clear that there is value in a broadly defined interdisciplinary approach. This points to a requirement not only for new forms of expertise but also a more open approach to the networks needed, both inside and outside the museum” (Laurenson 2013, 42). The development of new forms of expertise can be seen across a range of initiatives, such as the Guggenheim Conservation Department’s partnership with the Department of Computer Science at New York University’s Courant Institute of Mathematical Sciences.
This collaboration, led by time-based media conservator Joanna Phillips and computer scientist Prof. Deena Engel, worked on analyzing, documenting, and preserving computer-based artworks from the Guggenheim collection (Guggenheim Blog Part 1–2016; Guggenheim Blog Part 2–2016). At Tate, collaborative doctorates are hosted within the Research Department and work within time-based media conservation to undertake research. Recent doctoral researchers Acatia Finbow (Tate—Finbow 2014) and Tom Ensom (Tate—Ensom 2014) completed projects relating to performance art and software-based artworks respectively, with both informing practice. Developing strong links with universities, whether to bring skills to a specific project or initiative or to host placements, has proven invaluable for the development of the profession and has also introduced new ways of thinking and expertise related to the care of such complex collections.

9.5 How? Education and Training

The pathways into time-based media conservation vary, and conservators enter the profession through a range of routes. This may be an educational route, such as a conservation degree course (albeit not necessarily in time-based media conservation) or a digital or analog preservation degree course. Conservators can also enter the profession with equivalent skills and knowledge acquired through professional experience, such as in computer science, moving image, electronics, and audio. It is important to support these different pathways into the profession, as there is an immediate shortage of trained time-based media conservators. A range of routes persists because specialized training in time-based media conservation has been scarce, unlike other conservation fields where training is well established; this landscape, however, is changing (see Table 9.2). Dedicated time-based media conservation programs can be found at Bern University of the Arts (Switzerland), which in 1997 became the first university to offer "media" as part of its conservation curriculum; in 2007 this was developed into a two-year specialized master's program in modern materials and media conservation. Since 2018, the New York University (NYU) Institute of Fine Arts (IFA) has offered a four-year master's program in time-based media conservation, leading to a dual degree: an MS in the Conservation of Historic and Artistic Works and an MA in the History of Art and Archaeology. Roemich and Frohnert (2018) have identified that time-based media conservation requires "a very particular knowledge and skill-set . . . to understand and analyze preservation challenges for such works" and that "the conservator [needs] to identify, acknowledge, and respect the conceptual nature of these works." This is in addition to the long-established two-year Moving Image Archiving and Preservation (MIAP) MA program at the Tisch School of the Arts at NYU (NYU—Moving Image Archiving and Preservation M.A. n.d.), a preservation-focused master's degree that has also brought time-based media conservators into the profession (see Case Study 9.1). The most recent development is a new two-year master's program focused on the conservation of contemporary art and media, launching in the History of Art Department at University College London (UCL) in 2023. Based at the new UCL East campus, with strong links to the creative and cultural industries, the degree will be the first of its kind in the UK to provide specialist training in the conservation of time-based media art.

Table 9.2 Conservation bachelor's and master's courses referencing contemporary art and time-based media conservation. Developed from the ICOM-CC newsletter on the training needs for the conservation of modern and contemporary art, 2010 (ICOM-CC 2010). Each entry lists the institution; the course and qualification level; and the link to time-based media conservation.

Academy of Fine Arts, Vienna (AT). Conservation and Restoration of Modern and Contemporary Art (bachelor's/master's), www.akbild.ac.at/portal_en/institutes/conservation-restoration/studios/modern-and-contemporary-art. Time-based media conservation is embedded within the broader scope of the contemporary art conservation program; includes core modules relating to time-based media conservation.

Bern University of the Arts (CH). Conservation (bachelor's), www.hkb.bfh.ch/en/studies/bachelor/conservation/.

Bern University of the Arts (CH). Conservation-Restoration, Specialization in Modern Materials and Media (master's), www.hkb.bfh.ch/en/studies/master/conservation-restoration/. Aimed at students with a bachelor's in conservation; includes core modules relating to time-based media conservation.

Cologne University of Applied Sciences, Cologne Institute of Conservation Sciences (DE). Restoration and Conservation of Art and Cultural Property (bachelor's/master's), www.th-koeln.de/kulturwissenschaften/dasinstitut_10350.php. Includes conservation of modern and contemporary art and objects.

New York University, Institute of Fine Arts, Conservation Center (US). Time-Based Media Conservation (dual master's degrees: MS in the Conservation of Historic and Artistic Works and MA in the History of Art and Archaeology), https://ifa.nyu.edu/conservation/time-based-media.htm. Focus on time-based media conservation; four-year program.

New York University, Tisch School of the Arts (US). Moving Image Archiving and Preservation (master's), https://tisch.nyu.edu/cinema-studies/miap. Preservation of moving image: film, video, digital, and multimedia works.

NOVA University of Lisbon (PT). Conservation-Restoration, specialisation in Modern and Contemporary Art (bachelor's/master's), www.fct.unl.pt/en/faculty/departaments/department-conservation-and-restoration. Includes conservation of modern and contemporary art and objects.

Polytechnic University of Valencia (ES). Conservation and Restoration of Cultural Heritage (bachelor's/master's), www.upv.es/titulaciones/MUCRBC/indexi.html. Contemporary artworks from paintings to installations; no reference to time-based media conservation.

Queen's University (CA). Art Conservation (master's), https://queensu.ca/art/art-conservation. Students interested in modern and contemporary conservation could pursue internship opportunities.

State Academy of Art and Design, Stuttgart (DE). Conservation of New Media and Digital Information (bachelor's/master's), www.abk-stuttgart.de/en/studies/study-programmes/conservation-and-restoration-of-new-media-and-digital-information.html. Embedded as part of a traditional conservation training program.

University College London, History of Art Department (UK). Conservation of Contemporary Art and Media (master's; course pending, to commence 2022–23). The first degree in the UK to provide training for conservators looking to develop a specialism in contemporary art and time-based media; aimed at students with a bachelor's in conservation.

University of Amsterdam (NL). Conservation and Restoration of Cultural Heritage, specialisation in Contemporary Art (master's), https://gsh.uva.nl/content/masters/conservation-and-restoration-of-cultural-heritage/study-programme/contemporary-art/contemporary-art.html. As part of the contemporary art specialism, a new core module on media art conservation is being developed.

University of Applied Sciences, Berlin (DE). Conservation of Audiovisual and Photographic Cultural Heritage (bachelor's/master's), https://krg.htw-berlin.de/studium/studienschwerpunkte/audiovisuelles-und-fotografisches-kulturgut-moderne-medien-avf/. Conservation of photographic works and moving images (analog and digital film and video).

University of Melbourne (AU), Grimwade Centre for Cultural Materials Conservation. Cultural Materials Conservation (master's), https://study.unimelb.edu.au/find/courses/graduate/master-of-cultural-materials-conservation/. Time-based media lectures as part of the program; includes performance.

Winterthur/University of Delaware (US). Art Conservation (bachelor's/master's), www.artcons.udel.edu/. Students interested in modern and contemporary conservation could pursue internship opportunities.

Case Study 9.1 Establishing a New Master's Program for TBM Conservation

In 2018, the Conservation Center of the Institute of Fine Arts (IFA) at New York University (NYU) opened a new master's program in time-based media (TBM) conservation. The following summary of the development and implementation of the program is based on an interview the author conducted with Christine Frohnert, Research Scholar and Time-Based Media Art Program Coordinator at the Conservation Center, in September 2020.

Introduction and Development of the TBM Curriculum

The new program (see the Mellon Time-Based Media Conservation Program Outline for a full outline of the curriculum) was initiated out of a deep concern about losing our most recently created cultural and artistic heritage (NYU Institute of Fine Arts—Mellon Time-Based Media Conservation Program n.d.). TBM collections are growing fast worldwide, and in the US there was a lack of specialized TBM conservation education at the graduate level. In 2015, Dr. Hannelore Roemich (Hagop Kevorkian Professor of Conservation and TBM Program Director) and Christine Frohnert engaged in discussions with the Andrew W. Mellon Foundation and were awarded a two-year planning grant (2016–17) to explore how the new TBM conservation specialty could be integrated at the Conservation Center to complement existing tracks in paintings, paper, textile, photographs, and objects conservation. During that phase, the core competencies and skill sets for future TBM conservators were identified based on consultations with experts from European programs and potential employers and practitioners in the US. The TBM project team was supported by an Advisory Board and a Working Group to include a broad range of expertise. Based on the proposed TBM curriculum, the Conservation Center was awarded a follow-up grant from the Mellon Foundation for the implementation of the new TBM specialty, complemented by funding for workshops and public programming for the period from 2018 to 2022.


Identifying Core Competencies and Skills for Future TBM Conservators

As with other specialties within conservation, the core competencies of future TBM conservators are grounded in conservation ethics, methodologies, and science. In order to gain specific TBM skills, a solid introduction to the conceptual framework of modern and contemporary art conservation should provide the basis early in their education, alongside modern and contemporary art history and media theory. Building on that foundation, an introduction to electrics/electronics, computer science/programming, audio/video technology, and digital preservation is important, as well as gaining a solid knowledge of each TBM media category, such as film, slide, video, audio, software, performance, light, kinetic, and internet art. Furthermore, the equipment associated with each medium, the signal processing, and the characteristics of different display and playback devices need to be understood in context to assess the visual and aural integrity of a TBM artwork. Almost equally important are communication skills and the ability to create a network of technical experts within an institution. General soft skills are needed to promote the special needs of TBM works in the institution, to build a lab, and to establish workflows internally and externally.

Time-Based Media Conservation Curriculum at the Institute of Fine Arts, Conservation Center, NYU

The four-year TBM conservation specialty is integrated into the existing conservation curriculum. Students graduate with a dual degree, an MS in the Conservation of Historic and Artistic Works and an MA in the History of Art and Archaeology. The conservation of contemporary art and, more specifically, TBM art at NYU attracts students who can cross the disciplinary boundaries of computer science, material science, media technology, engineering, art history, and conservation. Two TBM applicants are accepted each year to study alongside students of other specialties in a cohort of up to eight students. The Conservation Center's dual degree allows for curricular flexibility and adaptation, a necessity for a successful specialization in TBM. Since the discipline of TBM art conservation continues to expand rapidly, and new technologies emerge while others become obsolete, it is in the best interest of the program to be nimble from its inception and prepared to adapt to an ever-changing field. New course offerings, individualized instruction, and workshops provide options for practical and technological training in media art conservation, utilizing a coalition of experts and specialists in, for example, computer science, digital preservation, engineering, and film and video conservation. Since applicants from various backgrounds are accepted, the program ensures that students receive individualized instruction that complements their pre-program experience and considers their needs and interests. The TBM course outline can be adjusted and modified as necessary for each student and as the field evolves over time. The curriculum requires students to gain skills in very different fields of expertise.
Due to the complexity of TBM, the regular class schedule will not provide enough time to cover all possible topics effectively, leaving some skills to be acquired during directed summer placements, the fourth-year internship, special workshops, or post-graduate fellowships as students begin to narrow their foci. Summer placements and the fourth-year internship provide additional practical experience and opportunities for students to work in leading institutions. Recent placements include the Solomon R. Guggenheim Museum, the Museum of Modern Art, the Whitney Museum of American Art in New York, the Tate, and the Hirshhorn Museum and Sculpture Garden.


Networks and Partnerships

The program works with a range of different professionals in higher education, collecting institutions, and professional practice in the larger New York City area and beyond. Within the NYU network, connections have been established with the Interactive Telecommunications Program, the Integrated Digital Media Program, and the Courant Institute of Mathematical Sciences for Computer Science. These collaborations provide valuable input from colleagues educating the next generation of technologists, engineers, designers, and artists uniquely dedicated to pushing the boundaries of interactivity in the real and digital world. Being exposed to emerging artists who work at the intersection of art, technology, and computer science provides future conservators with a ‘glimpse’ of what might end up in the studio. TBM students can enroll in classes offered through several NYU graduate departments, such as the Moving Image Archiving and Preservation Program.

9.5.1 Entry-Level Opportunities

Moving from education into practice, institutions offer internships and fellowships in time-based media conservation for recent conservation graduates. These provide important opportunities to gain critical practical experience and training. SFMOMA started offering a paid fellowship in 2002; Tate provided internships in 2008 and 2012 in conjunction with the Institute of Conservation (ICON); the Guggenheim offered fellowships supported by the Kress Foundation in 2014 and 2016; and the Smithsonian American Art Museum did so through the Lunder Conservation Center in 2018. More recently, MoMA, through the Media Conservation Initiative (MoMA—Media Conservation Initiative n.d.), has provided three three-year fellowships in which candidates from different training backgrounds receive training and research opportunities. Across 2010–15, Tate also supported the Heritage Lottery Fund's Skills for the Future Programme, which saw three work-based placements in time-based media conservation (Tate—Skills for the Future n.d.). ICON is working to facilitate and support work-based training programs (apprenticeships), which combine on-the-job training, formal learning, and paid employment. The Trailblazer apprenticeships are a government-funded scheme that would introduce a conservator standard, enabling any candidate to gain a master's degree in conservation (ICON—Conservation Apprenticeships n.d.). This provides an additional pathway for those with experience in time-based media conservation but without a conservation qualification.

9.5.2 Continual Professional Development

Continual training is critical to keep skills up to date, and the profession has a responsibility to support emerging time-based media conservators and develop the field. Many professional networks and personal connections support professional development. Key recent initiatives include MoMA's Media Conservation Initiative, which developed a series of summer workshops: "Getting started: a shared responsibility" in 2017 and 2018, "Caring for artists' films" in 2019, and "Conservation of video art" in 2020. New York University's (NYU)
Conservation Center of the Institute of Fine Arts hosts a series of workshops aimed at both its students and professionals. To date, these have included "Art with a plug: introduction to electricity and electronics" (January 2019), an "Artist interview" workshop partnered with Voices in Contemporary Art (March 2019), and "Digital preservation: caring for digital art objects" (October 2019). Other opportunities exist through the Digital Preservation Coalition (DPC), a global membership organization providing a myriad of workshops, online content, support, and guidance (DPC—Upcoming Events n.d.). A range of conservation membership organizations create and develop conferences and workshops that relate to or include time-based media conservation, such as the Electronic Media Group of the American Institute for Conservation (AIC); ICON, with its Contemporary Art Network; the International Council of Museums—Committee for Conservation (ICOM-CC), with its Modern Materials and Contemporary Art Working Group; and the Australian Institute for the Conservation of Cultural Material (AICCM). Other networks have emerged from major projects, such as the International Network for the Conservation of Contemporary Art (INCCA) or "No Time To Wait!," a symposium created as part of the PREFORMA project (PREFORMA n.d.). In addition, institutions working with time-based media collections host conferences and workshops and create communities of practice, such as "Media Art Preservation," organized by the Ludwig Museum in Budapest; "Preserving Technological Arts," organized by the Sabanci Museum in Istanbul; and "Preserving Immersive Media," organized by Tate. As described by Hendrick (2015, 242), it is critical to ensure a "culture of openness and learning where . . . professionals build partnerships to problem-solve and collaborate outside the museum." This culture is prevalent within the field of time-based media conservation, resulting in self-motivated action, whether driven by individual or collective desire or by the challenges that artworks present. This means that knowledge is shared, which in turn raises the professional profile of time-based media conservation.

9.6 The Future

The growth and adaptive nature of the Tate time-based media conservation team reflect the changing landscape of conservation training programs and the inclusivity and diversity of employment. Team members have come from a range of backgrounds, with conservators initially hailing from a range of conservation specialisms, from private practice to other working professions, as well as developing within the team. However, as the education sector continues to develop, Tate is now moving toward employing trained time-based media conservators from the aforementioned training programs. Tied to this is a culture of leadership within the team, one that fosters openness, experimentation, and a level of autonomy and engagement that inspires creativity and innovation. It is critical to create an environment that supports ongoing learning, development, and research. Looking to the future, it is important that the work within the profession continues, but not to the detriment or exclusion of the wider media professions, who can add a dynamic perspective to a team. We also need to continue to support institutions without conservation teams in their journey. This means being accessible and open to sharing information, knowledge, and skills, with environments that create support and encourage continual learning. I also see an increasing need for professional placements and exchange programs, with a shift into the virtual or digital domain facilitating greater collaboration, allowing conservators to work alongside one another, combining knowledge while experiencing the different work practice of another institution and cementing long-term collaborations and opportunities. As time-based media artworks continue to evolve and as artists engage with new and rediscover old technologies,
the time-based media conservator needs to be adaptive, curious, open, and flexible in their approaches to conservation practice and to learning while they continue to collaborate and share knowledge across the museum, heritage, and technological sectors.

Bibliography

Cane, Simon. "Why Do We Conserve?" In Conservation: Principles, Dilemmas and Uncomfortable Truths, 163–76. London: Butterworth, 2009.
Digital Preservation Coalition (DPC). "DPC Upcoming Events." n.d. Accessed June 4, 2021. www.dpconline.org/events.
Domínguez Rubio, Fernando. "Preserving the Unpreservable: Docile and Unruly Objects at MoMA." Theory and Society 43, no. 6 (2014): 617–45.
Feston-Brunet, Briana. "Personal Communication with the Author." June 26, 2020.
Finn, Dan. "Personal Communication with the Author." June 11, 2020.
Fino-Radin, Ben. "How Conservators Are Fighting the Battle Against Built-in Obsolescence." Apollo Magazine: The International Art Magazine, May 24, 2018. www.apollo-magazine.com/how-conservators-are-fighting-the-battle-against-built-in-obsolescence/.
Goodyear, Anne Collins. "History of Smithsonian TBMA Working Group." Smithsonian Institution, March 31, 2020. www.si.edu/tbma/history-smithsonian-tbma-working-group.
Guggenheim Blog. "How the Guggenheim and NYU Are Conserving Computer-Based Art: Part 1." Guggenheim Blog (blog), October 26, 2016. www.guggenheim.org/blogs/checklist/how-the-guggenheim-and-nyu-are-conserving-computer-based-art-part-1.
———. "How the Guggenheim and NYU Are Conserving Computer-Based Art, Part 2." Guggenheim Blog (blog), November 4, 2016. www.guggenheim.org/blogs/checklist/how-the-guggenheim-and-nyu-are-conserving-computer-based-art-part-2.
"Guggenheim Variable Media Initiative." Accessed June 3, 2021. https://guggenheim.org/conservation/the-variable-media-initiative.
Heinen, Joey. "Personal Communication with the Author." June 11, 2020.
Hendrick, C. "The Agile Museum: Organisational Change Through Collecting 'New Media Art.'" PhD diss., University of Leicester, School of Museum Studies, 2015. https://lra.le.ac.uk/bitstream/2381/36093/1/2015HendrickCphd.pdf.
ICOM-CC. "Training Needs for the Conservation of Modern and Contemporary Art." Report from the joint interim meeting, June 12–13, 2010. Bonnefantenmuseum, Maastricht, 2010. www.icom-cc.org/54/document/training-needs-for-the-conservation-of-modern-and-contemporary-art-interim-meeting-report/?id=960.
ICON. "Conservation Apprenticeships: Work Based Training to Support Your Career in Conservation." ICON: The Institute of Conservation. Accessed August 11, 2021. www.icon.org.uk/training/conservation-apprenticeships.html.
Laurenson, Pip. "Emerging Institutional Models and Notions of Expertise for the Conservation of Time-Based Media Works of Art." Techne 37 (2013).
Lawson, Louise, and Simon Cane. "Do Conservators Dream of Electric Sheep? Replicas and Replication." Studies in Conservation 61, no. sup2 (2016): 109–13. https://doi.org/10.1080/00393630.2016.1181348.
Lawson, Louise, and Pip Laurenson. "Developing with Time! Time-Based Media Conservation at Tate 1996–2019." Presented at the symposium Towards a Flexible Future: Managing Time-Based Media Artworks in Collections, Art Gallery of New South Wales, Sydney, June 4, 2019.
LIMA. "Media Art Platform." Accessed August 11, 2021. www.li-ma.nl/lima/tags/media-art-platform.
Museum of Modern Art (MoMA). "Media Conservation Initiative." Accessed June 4, 2021. www.mediaconservation.io/.
Nichols, Alexandra. "Personal Communication with the Author." April 2020.
NYU Institute of Fine Arts. Mellon Time-Based Media Conservation Program Outline: Time-Based Media Art Conservation. Accessed June 4, 2021. https://ifa.nyu.edu/conservation/time-based-media.htm.
NYU Tisch. Moving Image Archiving and Preservation M.A. New York University, Tisch School of the Arts, n.d. https://tisch.nyu.edu/cinema-studies/miap.

Pereira, Orlando, and Maria Joao Raposo. "Soft Skills in Knowledge-Based Economics." Marketing and Management of Innovations, no. 1 (2019): 182–95.
Phillips, Joanna. "Personal Communication with the Author." August 2020.
"PREFORMA." n.d. Accessed June 4, 2021. www.preforma-project.eu/.
Roemich, Hannelore, and Christine Frohnert. "Time-Based Media Art Conservation Education Program at NYU: Concept and Perspectives." The Electronic Media Review 5 (2017–2018): 1–26. https://resources.culturalheritage.org/emg-review/volume-5-2017-2018/frohnert/.
Sherring, Asti. "Divergent Conservation: Cultural Sector Opportunities and Challenges Relating to the Development of Time-Based Art Conservation in Australasia." AICCM Bulletin 41, no. 1 (2020): 69–82. https://doi.org/10.1080/10344233.2020.1809907.
Sherring, Asti, Caroline Murphy, and Lisa Catt. "What Is the Object? Identifying and Describing Time-Based Artworks." AICCM Bulletin 39, no. 2 (2018): 86–95. https://doi.org/10.1080/10344233.2018.1544341.
Tate. "Skills for the Future Programme." Accessed June 4, 2021. www.tate.org.uk/about-us/projects/skills-future-programme.
Tate—Ensom. "Tom Ensom PhD: Technical Narratives: Method, Purpose, Use and Value in the Technical Description and Analysis of Software-Based Art." Tate, 2014. www.tate.org.uk/research/studentships.
Tate—Finbow. "Acatia Finbow PhD: The Value and Place of Performance Art Documentation in the Museum." Tate, 2014. www.tate.org.uk/research/studentships.
Van Saaze, Vivian, Glenn Wharton, and Leah Reisman. "Adaptive Institutional Change: Managing Digital Works at the Museum of Modern Art." Museum and Society 16, no. 2 (2018): 220–39. https://doi.org/10.29311/mas.v16i2.2774.


10 A ROUNDTABLE: IMPLEMENTING CROSS-DEPARTMENTAL WORKFLOWS AT SFMOMA

Martina Haidvogl in conversation with Michelle Barger, Joshua Churchill, Steve Dye, Rudolf Frieling, Mark Hellar, Jill Sterrett, Grace T. Weiss, Layna White, and Tanya Zimbardo

Editors’ Notes: This roundtable discussion, conceptualized and moderated by Martina Haidvogl and held on June  5, 2020, brings together current and former members of Team Media at SFMOMA (San Francisco Museum of Modern Art). Team Media was founded in 1992 and is now the longest-standing cross-disciplinary time-based media working group in any museum. In this chapter, the team members reflect upon the museum’s collaborative working culture, their standing Team Media meeting, and the resulting interdisciplinary workflows established for the care of their time-based media collections. Martina Haidvogl (Lecturer in Media Art Conservation, Bern University of the Arts, Switzerland) was part of Team Media from 2011 to 2019 as SFMOMA’s first media conservator. Current SFMOMA staff in this roundtable includes Michelle Barger (Head of Conservation), Joshua Churchill (Collections Technical Assistant Manager), Steve Dye (Collections Technical Manager), Rudolf Frieling (Curator of Media Arts), Grace T. Weiss (Assistant Registrar for Media Arts), Layna White (Director of Collections), and Tanya Zimbardo (Assistant Curator of Media Arts). The conversation is joined by SFMOMA’s former Director of Collections, Jill Sterrett (arts and cultural heritage adviser), and Mark Hellar (Hellar Studios LLC, San Francisco), a technologist and software consultant who has been supporting media conservation efforts at SFMOMA for many years.

DOI: 10.4324/9781003034865-12

Figure 10.1 Screenshot of the roundtable discussion on June 5, 2020. Photo: Martina Haidvogl

10.1 Introduction

The San Francisco Museum of Modern Art (SFMOMA) is renowned for practicing a unique, cross-departmental working culture. Team Media in particular, founded in 1992 and one of the first time-based media working groups of its kind, consisting of curators, conservators, media technicians, registrars, IT specialists, intellectual property managers, and outside experts, has famously inspired other institutions to follow this model of interdisciplinary working to care for their media artworks (Westbrook 2016; Haidvogl and White 2020). The meeting's framework is straightforward, though not undeliberate: meet once a month at the same time for an hour and a half, with a curated but flexible agenda that allows for an open stage and a broad spectrum of items, ranging from day-to-day to long-term concerns. This frequency, regularity, length, and flexibility prevent the meeting from becoming a burden on busy schedules, while the carefully prepared content, with additional time for anyone to contribute, strikes the balance that makes it worth everyone's while. Sterrett and Christopherson (1998) first describe the "[m]onthly meetings attended by staff from all three departments [that] have provided a forum for developing a preservation plan for the collection" (Sterrett and Christopherson 1998; in BAVC 1998). Today the meeting is a fixture in people's calendars and awareness, and its accompanying culture and values are so ingrained, so intrinsic, that it is hard to tease out the ingredients that have led to this successful working model. So much so that some Team Media members describe its most crucial element as simply "magic." If we try to look behind this magical curtain, we find a long history of persistent and arduous efforts to establish and uphold an underlying value system consisting of appreciation, respect, empowerment, validation, and inclusivity. On this foundation, inter- and intrapersonal connections of trust can be forged. Paul Jeffrey's research on the structure and management of cross-disciplinary research teams found several fundamental ingredients of successful collaboration, among them a shared vocabulary (Jeffrey 2003). What he describes as part of a team-building process of (literary) agreement ultimately leads to a more colloquial landing on the same page, a shared understanding, and a way of communicating. In this same vein, ongoing horizontal knowledge dispersion becomes central to maintaining this common ground, which aims to convey to everybody, continuously, a sense of belonging.
This understanding of the other person’s role, needs, and concerns, paired with established trust, results in a blurring of boundaries
and a sense of territory without losing sight of everyone’s individual expertise. And last but not least, providing the institutional structure and resources to support collaboration incentivizes this way of working and plays an important role in clearing logistical barriers. Daniel Stokols et al. (2008) give a comprehensive overview of contextual factors that influence transdisciplinary team effectiveness on intrapersonal, interpersonal, institutional, environmental, technological, and sociopolitical levels, including empowering leaders, mutual respect, active roles and participatory group settings, organizational incentives, and informal opportunities for contact. Within this framework, different opinions and perspectives are encouraged and supported. Workflows ease as the potential for unhealthy conflict is reduced. Shrum et al. argue that trust has no bearing on the performance of collaborative research beyond a basic, necessary level, although it is perceived to do so because of its association with conflict. Between interpersonal and collective trust, they offer that “perhaps trust is a more complex issue” (Shrum et al. 2001, 686) than to allow for simplistic conclusions. There is, however, a more lighthearted though not insignificant acknowledgment “that trusting collaborations are still ‘better’ in the sense that collaborators prefer them” (ibid., 726). What’s more, positive experiences among members around interdisciplinary approaches further perpetuate their willingness to engage and participate in collaborative teamwork (Stokols et al. 2008, 106). While SFMOMA’s interdepartmental approach was born of a need, a challenge brought on by the complex artworks being acquired, it instilled a culture of collection care that would eventually extend beyond media arts. For a deep reflection on the needs of contemporary art and the institutional shifts required for their care, see Sterrett (2009) and Sterrett and Coddington (2017).
At SFMOMA, the merits of this successful collaboration practice can be measured quite concretely: a curator finding themselves able to consider complex works of art for the collection because of a deep confidence in the capabilities of their team; direct access to and contact with artists, leading to clear lines of communication and better capture of essential knowledge around their artworks, which in turn builds trusted relationships between both parties; broad dissemination of information around new acquisitions, preparing everyone for incoming artworks and giving the team a chance to weigh in and consider issues around preservation from multiple viewpoints; a system of collective documentation efforts through a wiki platform inspired by this particular way of operating, resulting in rich, multivoice artwork records centrally accessible to everyone (Haidvogl and White 2020); a multi-year cross-departmental research project piloting interdisciplinary approaches to contemporary art conservation, generously funded by The Andrew W. Mellon Foundation (Clark and Barger 2016); and finally, a long-standing joy in working together. The following conversation took place during the global COVID-19 pandemic, when museums in the US, including SFMOMA, were reckoning with challenges in funding, resources, and identity. The conversation discusses a way of working collaboratively that, once embraced, cannot easily be undone, even through changing economic circumstances.

10.2 In Response to a Need
MARTINA HAIDVOGL: Jill, as former Director of Collections at SFMOMA, you have overseen more than 25 years of collections care. Very early on in your tenure, in 1994, you were instrumental in founding Team Media. Today, the meeting has grown to 15–20 people who come from all over the museum. Even for a museum of roughly 500 staff, that’s quite a sizable group. Can you tell me about the beginnings? How did it all start?
JILL STERRETT: It started in response to a need. SFMOMA had one of the first museum departments dedicated to media arts and was actively acquiring media artworks. We found
ourselves bringing these artworks into the collection but without adequate knowledge about how to keep them. Out of necessity, we thought, “What if we convened once a week to take on one challenge at a time?” Initially, our team included one registrar, one media technician, and myself, a conservator. We innovated to create a structure to fill a gap within the museum. One of the reasons that I think it succeeded was that we trusted each other. We did not set out to change the museum. We were just addressing a challenge. Sociologist Saskia Sassen has written about the transformations within traditional systems, and she talks about the fact that change happens at the margins and, as it is adopted, moves to the middle (Sassen 2006a, 2006b, 2007). It is one of the tensions we’re up against in organizations—not just museums—with inherited governance structures and hierarchies, which have functioned for many years. So, if you’re going to disrupt the tried-and-true, testing the concept lowers the threat level. Iterating toward a better product is one way in which everybody’s work benefits.
MARTINA HAIDVOGL: It was in 2011 that I joined SFMOMA as the museum’s first media conservator, through the Advanced Fellowship in the Conservation of Contemporary Art. Then, you described that as a “safe” way to introduce a new position, a new person, into an existing structure—“margin activity,” too, if you will. Could you speak more to that culture that you built over 30 years, which allowed SFMOMA to operate so successfully as a team to care for media artworks?
JILL STERRETT: It’s always been interesting for me to reflect on the fact that the creation of a culture was our alternative answer to not having the person—a time-based media conservator. Because we had to create a culture for that gap, and we had to do it for so long, this culture got inculcated. We created the meeting, and then we created the culture, and then we filled in people.
Today people join a culture. When we created Team Media, there was a commitment to values: We respected each person’s expertise, perspective, opinions, and questions. We did not edit agendas, so individuals could opt in or opt out of parts of the meeting. There was no such thing as a stupid question, since we were all figuring it out. There was no issue too big or too small. Upholding these values over time was probably the harder thing to do as new people cycled in (and out) over 20 years. Upholding these values was and continues to be the central accomplishment. It takes time.
LAYNA WHITE: One of the things that I’m very aware of in my role as Director of Collections, and something I’ve been observing about Team Media recently, is the importance of helping people new to the group feel that they are part of the team and that they’re welcome. A fundamental step here is to ensure that we provide new members with sufficient context around actively discussed agenda items—to help level awareness and support active participation around the room, physical or virtual.
MARTINA HAIDVOGL: Rudolf, how was it for you when you joined these existing ways of working?
RUDOLF FRIELING: When I came to SFMOMA as Curator of Media Arts in 2006, the reputation of this culture had already traveled to Europe and beyond. I was particularly excited to step into an environment characterized by collaboration and, to the extent that it is possible, non-hierarchical relationships. I felt I had to do a lot of learning and catching up because I hadn’t been part of that history, that important legacy here at the museum, which still is one of the few places in the world that not only dedicates significant resources to the collection, exhibition, and preservation of media art but also prides itself on being at the forefront of its development.

JILL STERRETT: Rudolf, I remember having lunch with you and Tino Sehgal. You were reflecting on your transition to the US and what it was like to come here, and you called out an English phrase that you don’t have in German: “Get on the same page.” You said, “Everybody here always wants to get on the same page.”


  We’re in a collecting and exhibiting institution and our activities—to put a work on display or to loan or acquire it—involve all of us in different ways. As Collections Technical Manager, I install media artworks, and so I work with curators, conservators, registrars, and preparators on a daily basis. Bringing all of these people together and smoothing out the process is where a collaborative way of working has been most helpful. It creates a trust and deep respect that’s difficult to convey. In this climate, our disagreements are able to inform our decision-making. That’s really magic. TANYA ZIMBARDO:  All of us continue to learn from discussions around complex artworks and addressing changes in technology. The hands-on experience is invaluable to early-career curators too. For instance, collaborating with different departments all of these years on exhibitions was a key form of mentorship that helped me as a curator to better understand what it means to work closely with artists and to negotiate installation layouts. MICHELLE BARGER:  I wonder if it is such interdisciplinary mentorship bringing greater understanding and appreciation for what everybody does that lays the groundwork for successful collaboration. RUDOLF FRIELING:  I want to echo that in a slightly different way. This collaborative culture has created a confidence in each other and a way of operating that doesn’t say, “No, we can’t do that. We cannot acquire this work because we don’t have the infrastructure or technical expertise for this.” When we consider acquiring a complex artwork today that challenges our expertise, it’s actually prompting us to say, “Let’s take this on, because that way we learn not just individually as experts in our various fields but also as an institution.” With help from experts like Mark [Hellar], we’re pushing ourselves constantly. 
We don’t just stay up to code on what has been established so far, but we stay contemporary in the sense that we can deal with the challenges that are coming to us from artists and artworks. This approach, together with a confidence in the team’s abilities, has given me as a curator a broader spectrum of works to consider for acquisition that I feel confident we can take into our care. JILL STERRETT:  Listening to you express your opinions about the value of collaboration and of Team Media, I can’t help but think these are museum issues that extend beyond media arts. As some of you have said, one of the things that comes from collaborative work is an understanding of what everybody does, respect. A collaborative approach implores people to lift their game, to “come prepared and bring it.” And let’s acknowledge that you have permission to bring it. In that sense, it isn’t just about respecting each other. It’s also about helping each other engage at new levels of conversation. That kind of learning and growth becomes part of your job. You have power and responsibility. MARTINA HAIDVOGL:  Who comes to a Team Media meeting? LAYNA WHITE:  When thinking about who should be at the table, for me, the first question is: what does the artwork need? And then also, conversely: who would benefit from being at
the table, from participating in the conversation to do their work? We have staff from across the museum, all of whom are dealing with media art in some part of their work.

10.4 A Certain Level of Discomfort
MARTINA HAIDVOGL: In addition to Team Media, there’s another meeting series that I’ve always considered fundamental to SFMOMA’s collaborative working culture: the museum’s pre-accession meetings. Grace, as the assistant registrar for media arts, could you speak to SFMOMA’s acquisition process?
GRACE T. WEISS: SFMOMA has four acquisition cycles a year, and our Media Arts department acquires around 10–15 artworks per year. Typically, we meet a few months in advance, bringing together stakeholders from many different departments. Our curators then introduce the group to the artworks they are proposing, their background, and their significance for the collection. As a group, we then compile questions regarding the display, function, and preservation needs of the works. Depending on the artwork and its complexity, we follow up with the artist and their studio beforehand. Different members of the team consider different aspects of stewardship, which allows us to look at the work from many perspectives and to reach a place of feeling comfortable and prepared to bring the proposed media artworks into the collection. At the same time, there may be issues that cannot be resolved at the point of acquisition, and being okay with a certain level of discomfort almost comes with the territory.
STEVE DYE: One of the origins of these pre-accession meetings was receiving several artworks as gifts only to understand, “Well, okay, there’s no cost to purchase these, but it will cost us a lot of money every time we want to show them.” We realized that we needed to identify such costs upfront: what it meant to own a particular artwork with regard to preservation, and what was necessary to install it.
LAYNA WHITE: For me, a pre-accession meeting is like a blind date. It’s your opportunity to get to know a work of art.
You don’t just meet the artwork itself but also the people around it: the artists, their gallerists, their studio managers, their trusted technicians. It’s the beginning of the relationships that you enter into when you acquire an artwork for your collection.
MARTINA HAIDVOGL: Following these pre-accession meetings, the artworks are typically installed temporarily for the Media Arts Accessions Committee. Why do you go to such lengths in the artworks’ presentation, which requires a significant amount of staff resources?
RUDOLF FRIELING: It is important for us as a museum to give our donors, who support all acquisitions, an actual experience of the artwork, which is why, to the extent it is possible, we install the works for the accession meeting. It’s a bit of a conundrum for us because, unless you acquire an artwork out of an exhibition, you have to try to install the work temporarily. Setting up a work for only a few hours is difficult, and sometimes we have to make compromises in the presentation, though as much as we can, we try to show our committee what we are proposing. The more complex the works are, the more creative we have to get, which is also a way of teasing out the minimum requirements of what the works are about.
LAYNA WHITE: Do you think installing works temporarily for accession meetings will continue to be challenged in terms of the resources that it takes?
TANYA ZIMBARDO: When we are acquiring major time-based media installations, they often have a higher price tag than artworks brought in by other curatorial areas. We have to
justify to our committee why this particular artwork is proposed. In such cases, installing the artwork allows donors to fully experience and appreciate it.
GRACE T. WEISS: From the registrar and installation crew’s point of view, it’s true that installing the works for a two-hour meeting is challenging. However, it’s extremely valuable to go through the provided documentation and, by installing the work step by step, put it to the test. Through this process, we understand the proposed artworks a lot better, and it starts untangling some of the questions that came up in the pre-accession meeting. It forms the baseline for our documentation moving forward. I remember a performance artwork that pushed this process to a different level. We were not only trying to figure out how to bring the work into a modified space but also “How do we perform it?” Being one of the performers for the accessions meeting, I learned firsthand what it means to exhibit and care for this artwork.

10.5 Honoring a Collective Expertise
MARTINA HAIDVOGL: After the artwork is approved by the committee, there is a whole chain of events to conclude the acquisition, involving many stakeholders: attorneys, registrars, conservators, and curators. When the artwork is then brought in, registrars are the first point of contact, and at SFMOMA, they create an initial condition report. This process was not as streamlined for media artworks. Linda Leckart, then the media registrar, and I started to meet weekly for our “media arts quality control sessions,” in which we analyzed the acquired digital files, watched the full video looking for any issues, created checksums, and uploaded the files to our server. In a second step, we then discussed acquisition numbers and the status of components. This approach created a better understanding of the other person’s role and, through the process, lowered barriers towards digital technology. We then expanded this practice to all of SFMOMA’s collecting departments and made it a staple in the acquisition process. I have to admit, though, acquisitions worked most smoothly in the Media Arts department.
MICHELLE BARGER: As Head of Conservation, I’ve been reflecting on this, and I sense a void in the museum’s other collecting departments in that there isn’t a Team Media, meaning there isn’t a forum for issues of collecting and exhibiting. Without it, there’s a sense of anxiety that we may not be asking all of the questions we need to for these other really complex artworks that don’t fall into a more traditional way of making art. I think, inherently, media curators know and understand that you need a team. And we all know that we need each other’s skills and knowledge when bringing in these works. Honoring a collective expertise is at the core of what is required to meet the artwork where it is.
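The fixity step in the quality-control sessions described above (creating checksums before uploading files to the server) can be sketched in a few lines of Python. This is a minimal illustration, not SFMOMA’s actual tooling; the manifest format and file names here are assumptions:

```python
import hashlib
from pathlib import Path

def checksum(path: Path, algorithm: str = "sha256", chunk_size: int = 1 << 20) -> str:
    """Compute a fixity checksum by streaming the file in chunks."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def write_manifest(files, manifest_path: Path) -> None:
    """Record one 'checksum  filename' line per file, in the style of sha256sum."""
    lines = [f"{checksum(p)}  {p.name}" for p in files]
    manifest_path.write_text("\n".join(lines) + "\n")

def verify_manifest(manifest_path: Path, directory: Path) -> list:
    """Return the names of files whose current checksum no longer matches."""
    failures = []
    for line in manifest_path.read_text().splitlines():
        recorded, name = line.split("  ", 1)
        if checksum(directory / name) != recorded:
            failures.append(name)
    return failures
```

Storing such a manifest alongside the media files lets later condition checks detect bit-level change without rewatching the full video.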
JOSHUA CHURCHILL: Indeed, having Team Media as an ongoing, monthly forum where we can discuss these miscellaneous questions with people from different departments, who have different perspectives and areas of expertise, is incredibly valuable. We’re really lucky to be able to say, “This is a great question or topic for Team Media!” It allows us to discuss, tackle, and sometimes solve problems collaboratively—a process that would be much harder if everyone tried it individually.
TANYA ZIMBARDO: What I enjoy a lot is when members of this team who have presented at conferences then share a recap with all of us. Having a place to talk about various research projects or collection artworks that aren’t being exhibited—it’s important to have that moment to report out to one another.
10.6 The Role of the Media Conservator
MARTINA HAIDVOGL: If Team Media is a successful collaborative model for caring for media artworks, how does a media conservator fit into it? Or, asked differently, can you reflect on the time before you had a media conservator on staff and compare it to when you did?
JOSHUA CHURCHILL: I think it’s pretty symbolic that you came on when we were showing Bill Fontana’s Sonic Shadows, a site-specific, software-based sound installation that the museum acquired in 2010. We had never shown an artwork like that before, in terms of infrastructure, preparation, involvement with the artist on-site, and documentation of it all. Our acquisitions today become more and more complex, and so do the questions around them. It’s a necessity at this point to have somebody who can drill down on the more granular questions and then bring them back to our shared forum, where we discuss all the other aspects of these artworks in terms of concept, collection, and preservation.
TANYA ZIMBARDO: I’m thinking of a number of recent film acquisitions, where it was beneficial to have a media conservator in dialogue with us as well as with the artists. Martina, you were able to notice changes that may have happened during post-production, or analyze the master and then ask for higher quality, even for more straightforward films and single-channel videos. We had several instances where we went back to the artist and said, “Could we work with you on something better?” Quality control and condition assessment were not something we did before having a media conservator, when we only had written guidelines for what registration and curatorial should be requesting.

10.7 Looking Ahead
MARTINA HAIDVOGL: If an institution is just starting out with caring for their media artworks, what advice would you have for them?
MARK HELLAR: When we wrote the third chapter, “Sustaining Media Art,” for Matters in Media Art, a cross-institutional research group between SFMOMA, Tate, MoMA, and the New Art Trust (Matters in Media Art 2015), we formulated it in a way that was not prescriptive. Instead we said: you need to assess yourself, you need to assess your resources, your staff, your skill sets, your budgets, and then look at some core principles for the care of media artworks and see how you could provide these and where you need help. This is all to say: depending on your circumstances, there is an individual solution for your institution, whether you’re a large or smaller institution, a gallery, a private collector, or an individual artist. This is true both for your digital storage solution and for your way of approaching the care of your media artworks as a team.
JILL STERRETT: Every work situation is distinct. I think the gist of what we are talking about with Team Media is that it responds directly to how artists are making art, the networks of production around art, and the notions of ephemerality and variability that are all around us. One could make the case in any work environment for this kind of approach: if it’s required, you have to value it. How it manifests will be influenced by local circumstances and the people you work with. It has always been about people. Whenever we forget that, we dehumanize what we do.
MARTINA HAIDVOGL: Thank you, everyone, for this fascinating discussion and for sharing your reflections and insights into this inspiring way of working together.

Bibliography
Bay Area Video Coalition (BAVC), ed. Playback: A Preservation Primer for Video. San Francisco, CA: Bay Area Video Coalition, 1998.


Clark, Robin, and Michelle Barger. “The Artist Initiative at San Francisco Museum of Modern Art.” Studies in Conservation 61, no. sup2 (June 2016): 24–28. https://doi.org/10.1080/00393630.2016.1193692.
Haidvogl, Martina, and Layna White. “Reimagining the Object Record: SFMOMA’s MediaWiki.” Stedelijk Studies (Imagining the Future of Digital Archives and Collections), no. 10 (2020). https://stedelijkstudies.com/journal/reimagining-the-object-record-sfmomas-mediawiki/.
Jeffrey, Paul. “Smoothing the Waters: Observations on the Process of Cross-Disciplinary Research Collaboration.” Social Studies of Science 33, no. 4 (August 2003): 539–62. https://doi.org/10.1177/0306312703334003.
Matters in Media Art. “Matters in Media Art (Website).” 2015. http://mattersinmediaart.org/.
Sassen, Saskia. Territory, Authority, Rights: From Medieval to Global Assemblages. Updated ed. Princeton, NJ: Princeton University Press, 2006a.
———. “Keynote Presentation and Conversation with Sociologist and Globalization Expert Professor Saskia Sassen.” Keynote presented at ISEA2006, San Jose, 2006b. www.isea-archives.org/symposia/isea2006/.
———. “Talk on How Major Global Changes Are Reflected in Some of the Basic Tenets of the Museum.” Presented at the Shifting Practice, Shifting Roles? Artists’ Installations and the Museum meeting, Tate Modern, March 22, 2007. https://inside-installations.sbmk.nl/OCMT/mydocs/Announcement%20Tate%20seminar.pdf.
Shrum, Wesley, Ivan Chompalov, and Joel Genuth. “Trust, Conflict and Performance in Scientific Collaborations.” Social Studies of Science 31, no. 5 (October 2001): 681–730. https://doi.org/10.1177/030631201031005002.
Sterrett, Jill. “Contemporary Museums of Contemporary Art.” In Conservation Principles, Dilemmas and Uncomfortable Truths, edited by Alison Richmond, Alison Bracker, and the Victoria and Albert Museum, 223–28. Oxford: Butterworth-Heinemann, 2009.
Sterrett, Jill, and Curtis Christopherson. “Toward an Institutional Policy for Remastering and Conservation of Art on Videotape at the San Francisco Museum of Modern Art.” In Playback: A Preservation Primer for Video, edited by Sally Jo Fifer. San Francisco: Bay Area Video Coalition, 1998.
Sterrett, Jill, and Jim Coddington. “Codifying Fluidity.” VoCA Journal, June 30, 2017. https://journal.voca.network/codifying-fluidity/.
Stokols, Daniel, Shalini Misra, Richard P. Moser, Kara L. Hall, and Brandie K. Taylor. “The Ecology of Team Science.” American Journal of Preventive Medicine 35, no. 2 (August 2008): S96–115. https://doi.org/10.1016/j.amepre.2008.05.003.
Westbrook, Lindsey. “Team Media: In Action, in Contemplation.” SFMOMA Stories, February 2016. www.sfmoma.org/read/team-media-action-contemplation/.


PART III

Cross-Medium Practices in Time-Based Media Conservation

11 DOCUMENTATION AS AN ACQUISITION AND COLLECTION TOOL FOR TIME-BASED MEDIA ARTWORKS Patricia Falcão, Ana Ribeiro, and Francesca Colussi

Editors’ Notes: Throughout the collection life of information-based artworks, such as time-based media, documentation is the primary tool for collection stewards to manage the change inherent in these works as they are reinstalled and adapted to different contexts and conditions over time. The three authors of this chapter, Patricia Falcão, Ana Ribeiro, and Francesca Colussi, are all time-based media conservators at Tate (UK) and offer their combined experience to reflect on current and recommended documentation practices in the museum. They identify the occasions when documentation activity should occur, compile the aspects that deserve to be documented, and introduce different documentation models and methods that conservation professionals may choose.

DOI: 10.4324/9781003034865-14

11.1 Introduction
Documentation is an essential tool for the preservation of time-based media art, ensuring that these artworks, with their inherent variability and change, can continue to be displayed in the future and retain their meaning through changes in technology and context. Like other information-based artworks, time-based media artworks are made up of distinct components, such as media, software, equipment, sculptural elements, and manuals, which together comprise a specific instantiation of an artwork. Documentation created through the life of the artwork forms a baseline of information essential to reuniting the different components successfully, in keeping with the artist’s stated preferences, and thus becomes a preservation step in and of itself. Information about the materials and parameters of change will also inform future conservation and display decisions. This chapter is based on the authors’ own experience as well as on their discussions with other colleagues who have generously given their time, including Fernanda D’Agostino, São Paulo, Brazil; Deena Engel, New York, USA; Tom Ensom, London, UK; Jack McConchie, London, UK; Alex Michaan, Paris, France; Gabriela Pessoa Oliveira, São Paulo, Brazil; Joanna Phillips, Düsseldorf, Germany; Iolanda Ratti, Milan, Italy; David Smith, Hong Kong; Camilla Vitti, São Paulo, Brazil; Aga Wielocha, Hong Kong; and Gaby Wijers, Amsterdam, the Netherlands. The authors offer an overview of the history of conservation documentation, the importance of documentation in the context of contemporary art conservation, how these practices evolved over time, and emerging approaches. The heart of the chapter covers documentation steps that can be taken at the different stages of an artwork’s life within a collection context. The final section shares basic information on how to manage and share documentation and some of the current limitations on sharing documentation publicly.

11.1.1 The Role of Documentation in Conservation
Documentation has long been a crucial part of conservation practice and is considered integral to the profession’s standards and ethical approach (ICOMOS 1964; E.C.C.O. 2011; Icon—Code of Conduct 2014). The Institute of Conservation Ethical Guidance (Icon 2020, 14) recommends that conservators always document “the action, or the decision to take no action . . . to the appropriate level of detail and [that] the documentation includes recommendations for the future maintenance and preservation of the item.” For time-based media conservation, the objective of documentation goes beyond recording materials, production processes, and interventions; it aims to support the management of change (Laurenson 2006, 2) and the recreation of an artwork (Dekker 2013, 157). Furthermore, documentation may offer evidence of an artwork’s authenticity and capture its historical value (Hummelen and Scholte 2006, 6), or it may be the only trace of an artwork, sometimes even becoming the artwork itself (Depocas 2002; Hummelen and Scholte 2006, 6). Less dramatically, documentation is the only form of access to an artwork that is not on display. When discussing the evolution of contemporary documentation strategies, Annet Dekker highlights its value as a process in which the act of documenting itself creates knowledge (Dekker 2013, 149–69). Domínguez Rubio argues that there is an “inherent incompleteness of documents,” which blurs the relationship between the artwork and the document, such that the latter can never “contain the totality of that which it purports to register” (Domínguez Rubio 2020, 98).

11.1.2 Evolution of Documentation Practices in Time-Based Media Conservation
Over the last 20 years, documentation practices in time-based media conservation have continuously evolved in response to changing contemporary art and collection practices. Several initiatives have produced frameworks and guidelines by developing documentation models, methods, and systems, along with tools such as templates and questionnaires. In the early 2000s, the Variable Media Initiative proposed a new concept for preservation, describing artworks with a range of media-independent “behaviors” and offering a range of preservation approaches based on these behaviors and conceptual values (Variable Media Network 2003; Depocas et al. 2003). The aim was to allow these works to be translated onto new media in the future. While this approach has limitations in dealing with the potentially changing significance of an artwork’s hardware and technologies beyond their conceptual significance, it opened the door to rethinking conservation’s approach to contemporary artworks and triggered the development of current change management practice. Other projects followed: Between 2004 and 2007, Inside Installations—Preservation and Presentation of Installation Art created extensive resources on documentation methods, including modules on how to document video installations, motion, and sound, as well as introducing the concept of risk assessment
in the field (INCCA—Inside Installations (2004–2007) n.d.). Between 2005 and 2010, the DOCAM project, Documentation and Conservation of the Media Arts Heritage, identified the artwork component as an essential level of documentation because “components are at the very heart of the changes affecting most media artworks” (DOCAM 2010). This opened up the possibility of describing change in the components of an artwork without affecting the work’s identity, a new concept for collections at the time. DOCAM also explored the importance of recording audience experience and/or the artist’s production practices specifically for time-based media art. Starting in 2004, Matters in Media Art, a collaboration between the Museum of Modern Art (MoMA), the San Francisco Museum of Modern Art (SFMOMA), and Tate under the umbrella of the New Art Trust, developed guidelines for the care of media artworks based on the notion of an artwork’s life in a collection: acquisition, loans, documentation, and digital preservation. The project web page provides examples of workflows and templates to support activities such as condition checking different media, along with a survey template and guidelines for installation (Matters in Media Art 2015). Collaboration and sharing documentation tools are now commonplace, and a growing number of institutional websites offer resources such as documentation guidelines and templates. Since 2012 the Solomon R. Guggenheim Museum (Guggenheim Museum—Time-Based Media 2012) has been sharing time-based media preservation models, templates, and insights into their projects and initiatives. Others, such as the Smithsonian Time-Based Media & Digital Art Working Group (Smithsonian Institution—Time-based Media & Digital Art n.d.), the Metropolitan Museum of Art in New York (Metropolitan Museum of Art n.d.), MoMA’s Media Conservation Initiative (MoMA—Media Conservation Initiative n.d.), and Tate’s pages on Software-Based Art Preservation (Ensom et al. 2021) and on Documentation and Conservation of Performance Art (Lawson et al. 2019) have followed. LIMA published their Artwork Documentation Tool, aimed at artists who want to create documentation of their own artworks (LIMA 2017). Sharing documentation is invaluable for this evolving field, where many institutions and professionals are just starting out or need to develop new processes.

11.1.3 Emerging Practice

Documentation practices evolve as technological development leads to the increased complexity of collected artworks and consequently more complex documentation needs, while also bringing new tools to care for these works. Photogrammetry and immersive media (such as 360 video or virtual reality [VR]) allow for 3D documentation of installations, while VR also supports the recreation of historical installations (Lockhart 2020, 700–3). Tools such as wikis, Wikidata, Git, and Trello allow new types of collaboration among teams and museums. One example is the work done at the San Francisco Museum of Modern Art (SFMOMA), where a MediaWiki implementation provides easy cross-team access to artwork records (Haidvogl and White 2020); another is the use of a wiki to document conservation workflows at Tate. The use of Git and wikis, as highlighted by Barok et al. (2019b), can promote affordable online collaboration once the tools are set up and teams have learned to use them. A well-documented example of the use of Git is the collaboration between SFMOMA and artist Jürg Lehni to share the source code for his robotic chalk-drawing machine Viktor (2006 ~) (Haidvogl 2016). The advantages of machine-readable information and linked open data are demonstrated in platforms such as Europeana (Europeana Pro—Linked Open Data n.d.), the Smithsonian Institution’s National Portrait Gallery (National Portrait Gallery—Linked Open Data n.d.), Rhizome’s Wikibase (Rhizome 2021), and Wikidata for Digital Preservation (WikiDP 2016).
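As a minimal sketch of the kind of Git-based collaboration described here, a set of installation instructions can be versioned so that every revision and its rationale are preserved. The repository name, file name, and commit messages below are illustrative assumptions, not a published workflow:

```shell
# Track documentation files with Git (repository and file names are illustrative).
mkdir -p viktor-docs
git -C viktor-docs init -q
git -C viktor-docs config user.email "conservator@example.org"  # local identity for this repo
git -C viktor-docs config user.name "TBM Conservation"

printf 'Projection size: variable, min 3 m wide\n' > viktor-docs/installation-instructions.txt
git -C viktor-docs add installation-instructions.txt
git -C viktor-docs commit -q -m "Add installation instructions from first iteration"

# Later revisions record their rationale, building a history of change.
printf 'Projection size: fixed, 4 m wide (agreed with artist)\n' > viktor-docs/installation-instructions.txt
git -C viktor-docs commit -q -am "Update projection size after discussion with artist"
```

Because each commit carries an author, date, and message, the repository itself becomes a lightweight decision log that collaborators inside and outside the institution can clone and contribute to.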

Patricia Falcão et al.

The Linked Conservation Data project (Ligatus Research Centre n.d.) focuses on “improving access to conservation documentation records.” The possibility of making information about artworks, technologies, and conservation more easily discoverable through linked open data should inform future plans for documentation. Incorporating ethnographic approaches in conservation is another important factor contributing toward a reflective practice (Wielocha and Smith 2021). This has been discussed in the research of Vivian van Saaze (2013) and Fernando Dominguez Rúbio (2020) and through the current project Reshaping the Collectible: When Artworks Live in the Museum (Tate—Reshaping the Collectible 2018). These approaches explore museum structures and how they impact conservation practices, calling for a critical approach to conservation. This process highlights the subjectivity of documentation (Wijers 2021), as seen, for instance, in Stigter (2016), who seeks to make visible the positioning and biases of her role as a conservator caring for an object. Emerging practice thus seeks to be broad enough to cover a range of perspectives, and to be collaborative, by making use of the digital environment we live in.

11.2 Time-Based Media Documentation Stages

Documentation is an ongoing process but usually occurs at key stages of the artwork’s life, such as pre-acquisition, acquisition, displays, loans, and interventions. These stages, or “turning points” (Van de Vall et al. 2011), require caretakers to engage more closely with an artwork, as they often drive change and generate new layers of knowledge. Thoroughly documenting these stages is an opportunity to understand the parameters of those changes or the lack thereof (see fig. 11.1). An artwork’s life starts when the artist develops an idea and materializes it. Subsequently, one or more exhibitions may take place before the artwork is acquired by an institution. At this point, it should go through a defined acquisition process, ahead of being stored and displayed.

Figure 11.1 The artwork life stages and corresponding documentation needs. Diagram: Ana Ribeiro



Eventually, a conservation intervention may be needed, possibly causing significant change to the materials of a work. These stages should be clearly identified so that institutions and collectors understand the documentation needs of each stage and are able to plan and budget accordingly. Conservators working with a collection and planning conservation interventions should take care not to underestimate the amount of time needed for analyzing existing and creating new documentation. In the next sections, we will describe the different life stages along with the appropriate information to gather, suggestions on how to capture it, and references to templates in Section 11.1.2. Besides the documentation of individual artworks, it is also helpful to document high-level processes through workflow and policy documents (Section 11.2.6). These ensure consistency in how processes are applied, as well as clarifying roles and objectives within the institution.

11.2.1 Pre-acquisition

The goal of the pre-acquisition phase is to create a knowledge base to inform the acquisition process. This information will support a balanced discussion with the artist or gallery, help define which deliverables to request, and define any conditions in a purchase agreement. The initial research on a work should develop an understanding of its meaning and behaviors, exhibition history and change, the technologies of production, the significance of individual elements, the identification of risks for preservation, and an estimate of the resources involved. The information gathered will help conservators plan their budget and time commitments for short-term acquisition conservation steps and long-term preservation strategies, including identifying the need for specific experts and further research on the technology and techniques.

In practice, this information is often gathered from the artist or gallery, as well as exhibition catalogs, social media, artists’ websites, and gallery pages. With luck, there is an opportunity to experience and record the work installed at a gallery or at the artist’s studio and the opportunity to ask questions directly. The initial research supports an informed discussion with the artist and gallery that can be more productive than a non-specific questionnaire, and the conservator can then verify the information obtained by checking against the information, media, and objects received during the acquisition stage. The conservator’s role often includes explaining to artists unfamiliar with this process why this information is important for the ongoing care of their artworks. Once the research is underway, the next step is to enter the artwork records into a Collection Management System (CMS), thus integrating results of the research with existing documentation in the institution (see Chapter 12).
Although all of the information gathered can be summarized in a report, it is important to retain copies of email conversations and any information found online. Regardless of the chosen documentation format, this research should answer the questions outlined in Table 11.1. All of these questions require further investigation as an acquisition proceeds, but even preliminary answers based on discussions with an artist or gallery demonstrate an engagement with the artwork, the artist’s practice, and the technologies used. This foundation creates a relationship with artists and galleries that is based on trust and can lead to a collaborative acquisition process that supports the future of the artwork in the collection.

Table 11.1  The type of information gathered at the pre-acquisition stage

What is the work?
The technologies used to produce and display a work can be difficult to gauge just by looking at images or reading a text. When artists and galleries provide thorough information about a work, that is the best place to start.

How was it produced?
Pre-existing documentation usually covers display formats and installation instructions but seldom mentions the production process. Understanding how a work is produced is the basis for deciding which media should be requested for delivery.

What will be supplied as part of an acquisition? Are there any deliverables that the museum should obtain in addition to those planned by the artist/gallery?
Usually, there is already a format planned for delivery, but if necessary, this should be negotiated. This negotiation can also continue after the acquisition if any problems are encountered.

Are there any specific pieces of equipment to consider?
If yes, a budget should be estimated, and technical specs for specific equipment should be documented.

Are there any foreseeable concerns in displaying this work?
For instance, are there any health and safety risks or estimated high costs (e.g., the need to build a space to display a work)?

11.2.2 Acquisition

The acquisition phase is probably the most important stage in ensuring the conservation of a time-based media artwork. The aim of the acquisition process is to allow the institution or collector to assume responsibility for an artwork in the long term, beyond the life of the artist, gallery, or estate.

Acquisition procedures differ from institution to institution. Falcão (2017) and Phillips (2018) provide descriptions of acquisition workflows with respect to the conservation of time-based media art in the museum, in which the acquisition process is used as a focal point where the artist, conservator, and other stakeholders (such as curators, artist’s assistants, technicians, and programmers) engage in complementing or correcting the information gathered during the pre-acquisition stage. The goal is to be sure that the institution or collection has acquired all of the information, rights, media, equipment, and sculptural components necessary to preserve the artwork; that components have been condition-assessed and/or quality-checked; and that any additional unique or custom equipment needed to support the artwork has been purchased. Enough documentation must be available to support the display of a work even in the absence of the artist, including installation instructions or an Identity Report, along with any other manuals. The documentation developed at this stage, together with the documentation of previous iterations of the work (e.g., video and still images), will serve as a guide in the future. At the end of the acquisition process, caretakers should be satisfied that they have created a baseline for preservation with all of the components and information needed to display and preserve a work in the long term. They should also ensure that any specific risks or requirements are clearly stated for the next team who will assume responsibility for the work.

Legal Agreements

The contractual agreements between the artist and collector or collecting institution are legally binding and thus have the potential for practical impact on the preservation and display of a work over time. It is important that agreements are created with preservation in mind to ensure they do not unnecessarily prevent measures to preserve a work, such as the ability to copy and migrate files for preservation purposes. Table 11.2 lists four common types of legal agreements used during the acquisition process, although the content will vary widely from institution to institution and from artwork to artwork.

Table 11.2  Types of legal agreements

License agreements
Usually signed between an artist and the collecting institution. These agreements address intellectual property rights issues in terms of defining how a work can be used or represented in the collection and for publication purposes. Phillips (2018) identifies licenses regulating public displays and performances, production and reproduction of images (video and still), making copies for preservation, exhibition and publication, and finally, agreement on the artist’s availability for conservation interviews and other collaborative work throughout the artwork’s life.

Certificates of Authenticity
Can vary from a single-page document stating that an artwork is made by an artist on a specific date to a hybrid purchase agreement that specifies deliverables and display specifications. The certificate is often signed only by the artist, and if so, the certificate is sold along with the artwork. When acquiring NFTs (non-fungible tokens) as digital alternatives to Certificates of Authenticity, the authors recommend simultaneously requesting a paper certificate along with the NFT, given the uncertainty about future support for NFTs.

Purchase agreements
Happen between a collecting institution and a vendor (e.g., artist, collector, gallery, or artist’s estate). Purchase agreements define deliverables, installation guidelines, and the artist’s involvement in conservation treatments, displays, and loans (Phillips 2018). It is helpful to establish a level of freedom in defining an artist’s involvement in decision-making, as the institution should still be able to display and preserve an artwork in the artist’s absence.

Co-ownership agreements
Made between two or more institutions purchasing together or receiving an artwork jointly as a gift. These agreements define the relationship between the institutions involved and their respective responsibilities when scheduling artwork displays, sharing costs, sharing sculptural elements or equipment, and agreeing on preservation strategies. The authors advise against sharing equipment if possible for environmental, cost, and bureaucratic reasons.

Conservation Documentation at Acquisition

At this point, conservators analyze and compare documentation supplied by the artist (e.g., manuals, images, videos) with information gathered from social media, publications, other internal documentation, and the artwork’s many stakeholders. Synthesizing these many sources of information may raise questions that lead to further communication with artists and collaborators; for example, artist-provided installation instructions often document one particular iteration in the past without offering sufficient details on the variability of the artwork for future displays. The discussions at this point create a historical record that future staff can turn to when the artist is no longer available. Nevertheless, it is important to remember that an artist may change their mind over time and that a ten-year-old interview may no longer reflect the views of that artist. The information gathered at this stage can then be combined into condition reports and installation specifications (see Table 11.3).


Beyond the technical aspects, conservators also identify which elements of an artwork are “constants and which are variables” (Dullart 2009, 1), mirroring the notion of “accidental or variable” elements (Van de Vall 2017, 88) or the use of instructions to describe properties that are essential (Laurenson 2006). Identifying these constant and variable elements is necessary both to understand where change may happen and to make sure that elements that can change will be kept in flux (Lawson et al. 2021), allowing the artwork to evolve within the boundaries set by the artist and their practices and techniques. For instance, an installation comprising a variable number of projections of a single video in a gallery space may be very different depending on the space, and the artist may be happy to leave those combinations unspecified. By contrast, the media content is often a constant that can only be changed by the artist.

Collecting knowledge from a broad range of sources and stakeholders quickly leads to large amounts of information to process, which in turn needs to be considered when budgeting the time and resources required to document an artwork. This process often brings to light contradictory information and, with it, uncertainty (Van Saaze 2013, 24) that makes conservation decisions more open to debate and less comfortable. However, contradictory information is not wrong; rather, it enriches the documentation, as multiple voices describe different experiences and contexts and reflect a specific moment in any given artwork’s life. Table 11.3 lists categories of documentation either provided, gathered, or generated during an acquisition. Collecting institutions should adapt existing templates to reflect their own workflows and organization. This information is only a starting point and is never complete.
From this point on, documentation will not only support the materialization of an artwork but will follow it through its changes, in different contexts, display after display.

Table 11.3  Acquisition documentation

Documentation provided and gathered:

Artist-provided documentation
Often an artwork’s manual/installation instructions are provided. It may include information on individual components and guidelines on how to install them in a gallery context (e.g., equipment, playback system, and gallery preferences). Artist-provided documentation might be based on a specific iteration, include different examples, or be written generically. Other documentation from the artist could include images, plans, or audiovisual content from previous displays.

Artist interviews and communication with collaborators
Interviews (in-person, telephone, or video calls) or written communication with predefined questions. They aim to cover subjects that normally are not included in instruction manuals, such as an artist’s practice and networks (e.g., technicians or collaborators), production history and techniques, and the significance of specific materials. They also include the artist’s views on applicable conservation measures. Different stakeholders should be identified. These communications are normally recorded and transcribed, and a summary may be produced during the interview process. In either case, the results should be shared with the artist for comment and correction (see Chapter 17).

Questionnaire
Form with a series of questions to gather technical information about an artwork’s components, creation, and how it may be exhibited and preserved in the future. Ideally, it should be tailored to the artwork being acquired.

Other sources
Information may be gathered from archives or libraries, social media, artists’ and galleries’ websites, other online sources, newspapers, catalogs, publications, and so on, or might come from other departments within the institution (curatorial, production, etc.). These sources may provide insight into the art historical context and the artist’s practice, critical views on previous iterations, and complementary information about a work, such as images or videos captured by individuals in exhibition spaces, thus contributing to an understanding of how the artwork changed over time.

Documentation generated by collection:

Collection Management System
Listing and description of media and physical components as well as the associated technical metadata entries.

Inventory
For artworks with many equipment and sculptural components, it may be useful to have a separate inventory in the form of a list or spreadsheet with all of the components, a reference number, short descriptions of the type and function of the component, thumbnail images, and a space for notes. This can be created by an export from the CMS or by hand (see example in Table 22.1).

Identity Report
Compiles an artwork’s defining properties and must consequently be updated when those evolve. As explained by Phillips (2015, 176), “the report communicates the intended experience of the piece, outlines its variability parameters and provides guidance for future preservation. The purpose of the Identity Report is not to capture specific solutions for realizing the piece in one venue, but to characterize its behaviors under different circumstances, and to create a rich description of the artwork as a system in relation to its environment.”

Media condition reports
Include information gathered by direct inspection of the media, such as technical metadata, be it a MediaInfo report or a System Report for a computer. These can be organized in folders or structured in medium-specific templates. Communication with artists, technicians, or other collaborators is sometimes necessary to clarify discrepancies between information received and the result of the assessment.

Condition reports
These will include images captured at arrival, technical specifications (e.g., type, model, color, serial number), and details on physical and functional condition (see Chapter 15). Mixed media installations might include other objects such as sculptures or paper elements, which will also need to be documented.

Media component diagrams
Tracks all media components (e.g., masters, production materials, conservation, and exhibition copies) and their relationships (e.g., parent-child).

Installation specifications
Summarizes the necessary knowledge to install the artwork. This specification includes space requirements, media and equipment specifications, and other required elements, such as the type of seating or specific technical knowledge needed. It may also include floor plans, wiring diagrams, and images from previous iterations. These specs differ from artists’ manuals because they reflect an institutional focus and combine several sources of information, including instruction manuals, artists’ interviews, and direct examination of the artwork’s components.

Artwork record summary
Overview of the conservation work and decision-making developed over the life of the artwork in the collection, in chronological order. These documents contain handover information for future conservators and other caretakers. Each entry should be signed by the caretaker who assesses the artwork.
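The component inventory described in Table 11.3 can be as simple as a spreadsheet built by hand or exported from the CMS. A minimal sketch using only Python’s standard library follows; the column names, reference numbers, and example components are illustrative assumptions, not a documentation standard:

```python
import csv
from pathlib import Path

# Illustrative component records for a hypothetical two-channel video installation.
# Column names and reference numbers are assumptions; adapt them to your CMS export.
components = [
    {"ref": "T00001/1", "type": "media", "description": "Exhibition copy, channel 1", "notes": ""},
    {"ref": "T00001/2", "type": "media", "description": "Exhibition copy, channel 2", "notes": ""},
    {"ref": "T00001/3", "type": "equipment", "description": "Dedicated media player", "notes": "serial number on base"},
]

def write_inventory(path: Path, rows: list) -> None:
    """Write a simple component inventory as a CSV spreadsheet."""
    with path.open("w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=["ref", "type", "description", "notes"])
        writer.writeheader()
        writer.writerows(rows)

write_inventory(Path("inventory.csv"), components)
```

A plain CSV like this stays readable in any spreadsheet application, which matters for documentation meant to outlive specific software.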



11.2.3 Exhibition

Exhibitions are another big event in an artwork’s life, calling for in-depth documentation. The aim of documentation here is to create a record of how a work was displayed and what influenced the decisions in order to inform future displays. Often, the first exhibition of a work is the moment when an artist’s concept faces the constraints of the actual spaces, budgets, equipment, and expertise available. For example, it might be necessary to adapt a large video projection to fit it into a gallery space that isn’t quite big enough. The documentation that results from the first display will influence future display requirements. For example, if an artist showed a small video projection and found that it did not work, it is likely that future display requirements will reflect that.

Given the need to allocate staff and contact the artist, an exhibition is an opportunity to review and update information about the artwork, even for collection works with an established exhibition history, as it prompts caretakers to reevaluate existing installation instructions and the appropriate display technologies. The display of a work has long been identified as essential to its preservation and to understanding its identity (Real 2001; Dekker 2013, 163). The decision-making process around an iteration of an artwork in the institutional context is always and inevitably a unique interpretative and collaborative process (Phillips 2018, 169). These ideas were translated into a practical documentation model proposed by Joanna Phillips (Phillips 2015, 177), which starts from an Identity Report as a source to create display-specific installation instructions; each installation process is then documented with an Iteration Report. The decisions made by artists, curators, technicians, conservators, and other institutional staff in the context of a specific iteration can lead to a fundamental evolution of the artwork.
This makes documenting the reasons behind those decisions, as well as the relationships between the artwork and its technical components, essential. In this context, not only does the artwork evolve, but the people involved in the installation process can be affected in very powerful ways as well.

Building the Foundations: Reviewing Previous Documentation

When preparing an artwork for display, it is important to review existing information and to recognize that previous documentation may be insufficient or outdated, especially if the new installation takes place years later and in a different context. Thus, subsequent research on an artwork should include reviewing the already existing conservation documentation (see Table 11.3) and exchanges with colleagues who were involved with the artwork or the artist at the time, as well as additional information available online and in the literature. Finally, for complex artworks, it is not unusual to contact specialists or other institutions with experience in installing a specific artwork. The understanding of an artwork’s history of display and knowledge of an artist’s practice provide the initial framework to discuss options and constraints. This is also important in building good relationships with artists. At this point, it is important to be aware of

• previous installations of the artwork;
• any remastering, updates, or edits made by the artist to either media or equipment;
• the degree of complexity of the artwork installation and approximate costs;
• where to position the artwork within the artist’s practice;
• the network supporting the artist’s practice and this particular artwork (collaborators, artist studio, etc.); and
• any missing information that should be generated to complete documentation.


Documenting the Preparation for Display

In many museums, internal discussions take place across departments before the preparation process starts. Ideally, these exchanges are documented from the beginning by archiving email communications and meeting notes. These discussions should include an assessment of the suitability of an artwork for display in consideration of space, budget, staff, equipment, and potential treatment requirements. If an agreement is reached that the artwork can be displayed, the next step is to contact the artist or their studio/gallery, and a second phase of discussion begins. What is learned at this stage can impact the identity of an artwork, with space, budget, and the direct involvement of the artist potentially driving change in different directions. Keeping all versions of the different documents and a record of the decision-making is helpful to understanding changes and how they may apply in future iterations.

It is also important to document media and hardware preparation. This documentation should include what went wrong or was discarded in the testing phases; these detours can reveal software/hardware dependency issues which could help future decision-making (Colussi and Ribeiro 2019). Caretakers should be aware that during the installation process, previously established parameters such as projection size, details of the space fabrication, or the exact position of the equipment may change. Table 11.4 lists the types of documents produced or sourced around various aspects of preparing a display.

Table 11.4  Documents created or sourced in preparation for a display

Space
Architectural drawings including image size for projections; information on the decoration of the space, including samples of materials for carpet, sound panels, and benches; details of how cables should be run, if trunking is acceptable, and if cables should be visible or not; information on equipment positioning and if any should be housed in a dedicated space; information on health and safety and accessibility constraints and assessments.

Budget
Cost estimate forms; budget negotiations; invoices and receipts for purchases.

Staffing
Estimates of internal resources; booking forms for external specialists, freelancers, and the artist’s collaborators or performers.

Equipment
Lists of equipment and consumables; details of ongoing maintenance and servicing; wiring diagrams; equipment manuals.

Media
Exhibition format files; metadata reports; media production workflows; software screenshots and notes; media player scripts; and software specifications.

Communication
Emails; meeting notes; video calls; text messages and group chats.

Artwork movements
Courier receipts; and shipping lists for artwork components.

Documenting the Installation Phase—Iteration Reports

Documenting the artwork installation process can be challenging, as it often happens in a compressed timeline, involves multiple stakeholders, and requires critical decisions to be made in the moment, sometimes leading the best-laid plans to be abandoned in whole or in part due to unforeseen occurrences. The need to “get things done” by a deadline often overrides every other priority and, in some cases, may postpone the documentation of the installed artwork to a later point in time—for example, right before the piece gets de-installed and put back into storage (Barok et al. 2019a). However, documenting the artwork only in its “stabilized” final setup, when all changes have been agreed upon and implemented, could lead to a loss of information as the participants’ memories of the decision-making process become blurred.

It is helpful to plan ahead by creating checklists of what to document during the installation and what can be captured at a later stage (e.g., details regarding space, fittings, decoration, final equipment positioning, and image size). It is also important to identify the tools needed (e.g., phone, camera, laptop, notebook, audio recorder). The relevant information gathered should then be reviewed and collated in an Iteration Report or similar document.

The aim of the Iteration Report is to create a history of change (Phillips 2015, 177) by offering insight into the decision-making process around one specific iteration, gathering technical information, describing the context, and recording modifications and the roles of stakeholders. It includes a section for comments on the success of the iteration and notes on the opinions of the different people involved. This is an important step toward acknowledging the subjectivity of the conservator’s perspective and any cognitive bias (Marçal et al. 2014, 2), which reflects the uniqueness of a specific experience and its impact on how the artwork is perceived. However, Iteration Reports are usually completed by conservators, who are often observers in the installation phase, collating and summarizing different perspectives from their own point of view. The time-based media conservator should actively participate in the preparation and installation phases and be willing to acquire hands-on technical and practical skills while raising awareness of the importance of documentation.
If the display of a time-based media artwork is a collaborative endeavor, it is important to work out how its documentation can also become more of a team project. The different stakeholders should be encouraged to think critically about the installation phase and to see the value of their contributions to documentation. This leads to the collective development of knowledge around the artwork and its conservation. Dedicated time in the form of debriefing meetings or workshops is essential. The use of task management apps like Trello or Asana or collaborative platforms like Google Drive, Dropbox, shared folders on museum servers, and MediaWiki (Haidvogl and White 2020) also facilitates contributions and engagement.

Documenting Post-installation Care, Maintenance, and Troubleshooting

An installed artwork requires maintenance and supervision depending on the technologies utilized and the degree of interaction. Documenting these aspects of care allows for better estimates of the need for maintenance or replacement of equipment or analog media (such as 35 mm slides or film), ensuring that an artwork looks as it should. Providing clear instructions to gallery staff on how to switch equipment on and off and how to perform basic troubleshooting helps prevent issues related to overuse. Documenting unexpected equipment failure and accidental or malicious interactions by the public through incident reports supports efforts to put measures in place to minimize or mitigate the risk of recurrence. A log of incident reports and maintenance issues should be included in the Iteration Reports to inform resource estimates for future displays. Maintaining the installed artwork, and observing its behaviors and anything that has an impact on its presence in the gallery, is an opportunity to understand what is required to display it.

Documentation of TBM Artworks

Table 11.5 Documents created or updated after the artwork has been displayed (Documentation Type: Content of Documentation Type)

Relevant acquisition documentation: Updates to acquisition documentation. For more details, see Table 11.3.
Display preparation documentation: Updates to display preparation documentation, including architectural drawings; budget forms; correspondence around decisions and negotiation; proposals and drafts; staff booking forms; media preparation reports; equipment lists; invoices; and so on. For more details, see Table 11.4.
Iteration Report: A document aimed at capturing all of the changes and decision-making around one specific iteration.
Photo, audio, and video documentation: Official installation views; images of specific details and equipment with annotations; video/audio recordings (binaural walk-throughs, frequency response tests of acoustically treated spaces, etc.); 360 images and video; VR and AR captures; and so on.
Switch on and off procedures: Step-by-step instructions on how to turn an artwork on and off, along with troubleshooting instructions.
Maintenance logs: To capture equipment failure, routine operational maintenance, and physical media swaps.
Incident reports: To document any tampering with or damage to an artwork while on display.
Correspondence: Internal and external communication (emails, video call recordings, interview recordings, and group chats).

11.2.4 Loans

The documentation of a loan is similar to that of a display, with the difference that the installation of the work happens externally, affecting how an iteration is planned and documented. The conservator’s role is to transmit knowledge about the work, negotiate the conditions and requirements for its display (e.g., space, equipment), and capture information related to the loan (e.g., images of the work installed, the equipment used, and why). This is especially important when an artist or artist’s studio is engaged in the process, in which case the conservator’s role becomes one of following and documenting the discussions. A loan, like an internal display, is another moment to review previous documentation and check whether all of the available information is clear enough to be handed over. This review may lead to updated installation instructions, prompt contact with the artist for clarification, or reveal the need to migrate media or upgrade hardware. Most time-based media artworks require specific equipment and technical skills to be installed; therefore, understanding the degree of expertise and resources available to a borrowing institution helps tailor the information and support provided. It is important to identify and flag any significant dependencies or dedicated equipment involved and any particular circumstances in which a conservator is required as a courier to install and document the artwork. In many cases, this may not be necessary, as the borrower should agree in advance to create adequate documentation of their iteration, providing condition reports and other key information for the lender’s Iteration Reports.

Patricia Falcão et al.

11.2.5 Conservation Interventions

Interventions can happen at any point in an artwork’s life and reflect moments when its components are changed extensively, for instance, when a video master is migrated from tape to file or when the software in an artwork is altered to maintain its functionality. A conservation intervention always requires documentation adapted to the artwork and to the intervention taking place. Interventions are media-specific; some examples of what and how to document may be found in the media-specific chapters in Part IV of this book. As with most other conservation documentation, it is important to understand the object and its condition, the reasons for and methods used in the intervention, and who was involved in the process and decision-making.

11.2.6 High-Level Documentation: Policies, Strategies, Guidelines, and Workflows

Conservation teams in institutions spend a significant part of their time advocating for the artworks under their care. To strengthen this advocacy, the specifics of conserving time-based media artworks need to be clear not only to team members but also to other stakeholders. High-level policy and strategy documents, written to be widely understood by non-specialists, are an effective way to create this understanding. The process of writing and agreeing on these documents with different stakeholders contributes to institutional consensus. The documents are helpful in defining the conservator’s role and in aligning it with institutional needs, and they are often a requirement for accreditation (Collections Trust—Care and Conservation Policy n.d.). Policy and strategy documents provide “a foundation upon which all activities around the management of digital materials can be based” [Digital Preservation Coalition (DPC)—Institutional Policies and Strategies 2015b], and the same can be said more widely of the management of time-based media artworks. As teams grow and change, the creation of guidelines and workflows becomes a way to ensure consistency when quality-checking and documenting artworks, to guide new members of staff, and to highlight the complexity of the processes involved. The latter aspect is essential when advocating for resources or justifying timeframes (see Chapter 6). Workflows illustrate individual processes step-by-step and identify who takes each action (Roeck 2016). They can take the form of diagrams, spreadsheets, text documents, or wiki entries, with each format allowing different levels of granularity. Beyond their immediate value, all of these documents also become records of past processes, helping readers understand historical practices in the institution.

11.2.7 Integrated Approach to Documentation

This section provides an introduction to how to organize conservation documentation about an artwork. For documentation to be useful, it is important to understand

• how and for what purpose it was created,
• where and how to find it, and
• how to keep the documents accessible.

We will look at how this can be done on three levels: the individual document, the information structure (e.g., artwork folders or wiki fields), and the sharing of information externally. Any documentation system must ensure that the relationship between an artwork and its documentation is maintained. In most institutions, artwork-related information is linked via a Collection Management System (CMS), which may contain more or less information. At a minimum, a CMS (which may be an Excel spreadsheet) must provide a unique artwork identifier, usually alphanumeric, to relate paper or digital folders to an artwork. At the other extreme, a CMS can be a place for extensive aggregate documentation (see Chapter 12).

For individual documents, it is useful to understand the basics of data management practice. Guidelines are easily found online—for example, Data Management Services from Stanford Libraries (Stanford University website—Best Data Practices n.d.)—and many institutions will have their own. The first step is to agree on consistent and descriptive file-naming conventions. This facilitates finding documents and understanding their content (Frazer 2013; Maguire 2017). An example of a file name is ArtworkID_TypeofDocument_Date.docx. Version control should be used, either through a date, as in the file-naming example, or by adding a version number. It is important to be clear about which documents are not definitive, such as by adding “Draft” at the end of the filename and in the title of the document (University of Edinburgh—Naming Conventions 2007), to differentiate current from superseded versions. Whenever important changes are made, superseded versions should be kept so that change over time can be tracked. All documents should have the author(s) and date of creation clearly stated within them, and, if partially updated, the name of the reviewer(s) and date(s) of the review should be added.

Short- and long-term access to the files’ content should also be addressed. Any digital documents created, from emails to reports, drawings, or images, should be stored in a sustainable format. Alternatively, a copy in a sustainable format should be stored along with the format of creation. This means, for instance, saving emails as text documents or converting CAD drawings (e.g., made using software like Google SketchUp) to PDF so they can be widely accessed.
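As a rough illustration of such a convention, the following Python sketch builds and checks file names of the ArtworkID_TypeofDocument_Date.docx form discussed above. The identifier shape, the document-type vocabulary, and the “_Draft” suffix here are illustrative assumptions rather than a published standard.

```python
import re
from datetime import date
from typing import Optional

# Pattern for an ArtworkID_TypeofDocument_Date naming convention.
# The free-form alphanumeric identifier and the "_Draft" flag are
# illustrative choices; adapt them to institutional practice.
NAME_RE = re.compile(
    r"^(?P<artwork_id>[A-Za-z0-9]+)_"    # unique alphanumeric artwork identifier
    r"(?P<doc_type>[A-Za-z]+)_"          # e.g., IterationReport, ConditionReport
    r"(?P<date>\d{4}-\d{2}-\d{2})"       # ISO date doubles as a version marker
    r"(?P<draft>_Draft)?"                # optional flag for non-definitive versions
    r"\.(?P<ext>[A-Za-z0-9]+)$"
)

def build_name(artwork_id: str, doc_type: str, when: date,
               ext: str = "docx", draft: bool = False) -> str:
    """Assemble a file name that follows the convention."""
    suffix = "_Draft" if draft else ""
    return f"{artwork_id}_{doc_type}_{when.isoformat()}{suffix}.{ext}"

def parse_name(filename: str) -> Optional[dict]:
    """Return the parts of a conforming file name, or None otherwise."""
    match = NAME_RE.match(filename)
    return match.groupdict() if match else None
```

A script like this could also be run periodically over a shared folder to flag documents that have drifted away from the agreed convention.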
The Library of Congress hosts authoritative information about the sustainability of file formats (Library of Congress—Sustainability of Digital Formats 2005), which is a useful resource when choosing formats for documentation.

The next level pertains to the organization of the information about an artwork. The most common approach involves a folder structure on an institutional server, usually organized by artwork. This is a transfer of the traditional paper folders to a digital context. It is important for the structure to be consistent across all artworks and to match the museum’s processes so that specific information is easier to find. Tate and the Stedelijk Museum voor Actuele Kunst Ghent (SMAK) (Inside Installations—Document Structure 2007) developed a document structure as part of the Inside Installations project, summarized in Table 11.6. It is used here as an illustration, but it should be adapted to each institution’s needs. Table 11.6 provides an example for organizing artwork documentation within a file directory structure. Within a main artwork folder, the documents related to that artwork are organized within subfolders. Similar structures can also be applied in other contexts, from wikis to databases or departmental websites.

The ease of sharing or accessing information within an institution depends on the institution’s size, organization, and individuals. As Fernanda d’Agostino pointed out, “all knowledge in an institution is documentation” (D’Agostinho et al. 2021), and in many cases, it is likely that information is spread across multiple people or departments; curatorial documentation is associated with a specific curator or a curatorial department, for example, while legal agreements are stored by the legal department.
The ongoing development and adoption of Digital Asset Management Systems and Electronic Records Management Systems may in the future make finding information easier throughout an institution, but until then, it is part of the role of the conservator to learn which information available outside the conservation sphere may be relevant, who is creating it, and how best to gain access.

Table 11.6 Artwork folder structure and its content, based on “Document Structure developed by TATE and S.M.A.K.” (Inside Installations 2007)

Introduction: An artwork record summary lists the key moments of the artwork’s life, along with dates, names of those involved, and comments highlighting any issues or changes.
Specifications for display: Information about the display of a work, not specific to one particular display—for instance, any artist-supplied display manuals, along with the institution’s own general description of the display requirements.
Structure and examination: Documentation related to the examination and analysis of an artwork—for instance, condition reports, analysis of the technologies of a work, or information about individual components.
Display history: Information related to specific displays, such as all of the documentation created for each iteration, whether in-house or for a loan.
Acquisition and registration: Information gathered or produced as part of the acquisition process, such as curatorial documents about the work, legal documents, initial correspondence with the artist, or proof of arrival documents.
Conservation: strategy/research/treatment/ongoing: Information related to conservation plans, risk assessments, and treatment reports.
Artist participation: Artist interviews or correspondence that is not directly related to the processes referred to earlier.
Art historical research/context: Results of art historical or contextual research related to the artwork or editions of the work, groups of works, or the artists themselves.
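A structure like the one in Table 11.6 can be generated programmatically so that every artwork folder on the server is created consistently. The short Python sketch below does this for one artwork; the numbered folder names are illustrative adaptations of the Tate/S.M.A.K. section headings, not part of the published structure.

```python
from pathlib import Path

# Folder names follow the eight sections of the Tate/S.M.A.K. structure in
# Table 11.6; the numbering prefixes and exact spellings are illustrative.
SUBFOLDERS = (
    "01_Introduction",
    "02_Specifications_for_display",
    "03_Structure_and_examination",
    "04_Display_history",
    "05_Acquisition_and_registration",
    "06_Conservation",
    "07_Artist_participation",
    "08_Art_historical_research",
)

def create_artwork_folders(root: Path, artwork_id: str) -> Path:
    """Create one consistently structured folder tree per artwork,
    keyed by the unique identifier that links back to the CMS record."""
    artwork_dir = Path(root) / artwork_id
    for name in SUBFOLDERS:
        (artwork_dir / name).mkdir(parents=True, exist_ok=True)
    return artwork_dir
```

Using the CMS identifier as the top-level folder name keeps the link between an artwork and its documentation machine-checkable as well as human-readable.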

Sharing Information Externally

Sharing artwork documentation externally, with other conservators or caretakers, institutions, or the wider public, is a recurrent topic in conservation (Wharton 2015). While potentially extremely useful to those in charge of similar artworks and their conservation, sharing internal records raises a number of concerns that may explain why conservation information is not more openly available:

• Technical challenges, summarized by Wharton (2015) as developing common metadata standards, developing systems for discoverability and access, and accommodating the range of file formats used to archive the documentation.
• Confidentiality concerns, with respect to both legal agreements and financial information as well as information shared by an artist in confidence. Conservators need to build trust with artists to allow unflattering information to be shared, and confidentiality is important for that.
• Incomplete or outdated information, which can still be useful and fit to share with colleagues, while sharing it publicly may be inadvisable, as it may be misinterpreted and cause reputational damage. Institutions and conservators are judged by the quality of the knowledge they produce and share, and “good enough” information in day-to-day practice can be insufficient outside of the context in which it is produced. For instance, a meeting summary can be a short email shared with the people involved and may only make full sense in the context of lengthier email correspondence and broader discussions within the institution.

There have been attempts in the past to create exchange platforms for internal, unpublished documentation records. The International Network for the Conservation of Contemporary Art [INCCA (Website) n.d.] took the first step in addressing exchange needs when it created the “Database for Artists’ Archives” (Te Brake-Baldock 2008), where members could initially provide metadata about documentation and later also upload documents. The initiative was very active in the early 2000s, but the infrastructure is now used more broadly for the INCCA network. This resource still reflects a form of gatekeeping, as it is only available to INCCA members, who must themselves share documents. Sharing documentation, or metadata about selected documentation resources, on a wider platform or a combination of platforms could be a way forward, but projects such as Forging the Future (Forging the Future n.d.) show that institutional engagement is essential to develop and populate those platforms, requiring a high level of resources. This may partially explain why there are currently no equivalent multi-institutional collaborations sharing conservation documentation, even as projects such as Europeana or Google Arts and Culture make sharing collections online a richer experience.

11.3 Conclusion

For someone starting afresh in a collection with conservation and documentation processes still to be defined, the amount of documentation described in this chapter can be overwhelming. However, it should be clear that this documentation is created over time, over the course of the specific stages of an artwork’s life, and that the documentation that is truly essential will vary from artwork to artwork and from life stage to life stage. At a minimum, caretakers should be able to answer the question “Do I have all of the media and information that I need to install a work now? And in 20 years’ time?” If replying “yes” means that you have a quality-checked video file, a one-page display specification, and archived emails from an artist, then that is fine to start with; you will learn and document more over time as the history of the work and the collection, as well as your own experience, grow.

As media art practices evolve, new conservation and documentation approaches continue to emerge. Conservators, with colleagues from other disciplines, collaborate to develop guidelines, along with new tools and methodologies, to keep these artworks accessible to future generations. Conservators, as custodians of the information needed for an artwork to exist, should be at the center of caring for time-based media artworks and take the lead whenever necessary. We must also keep a critical gaze on our own work, acknowledging our power when producing documentation that will shape the future and identity of an artwork.

Bibliography

Barok, Dušan, Julie Boschat Thorez, Annet Dekker, David Gauthier, and Claudia Roeck. “Archiving Complex Digital Artworks.” Journal of the Institute of Conservation 42, no. 2 (2019a): 94–113. https://doi.org/10.1080/19455224.2019.1604398.
Barok, Dušan, Julia Noordegraaf, and Arjen P. de Vries. “From Collection Management to Content Management in Art Documentation: The Conservator as an Editor.” Studies in Conservation 64, no. 8 (2019b): 472–89. https://doi.org/10.1080/00393630.2019.1603921.

Collections Trust. “Care and Conservation Policy—Collections Trust.” Accessed March 11, 2021. https://collectionstrust.org.uk/accreditation/managing-collections/collections-care-and-conservationmanaging-collections/care-and-conservation-policy/.
Colussi, Francesca, and Ana Ribeiro. “From Artwork to File and Back to the Artwork (YouTube Video).” Presented at No Time to Wait 4, 2019. www.youtube.com/watch?v=Gxs4RHycrbI.
D’Agostinho, Fernanda, Gabriela Pessoa, and Camila Vitti. Interview with Fernanda d’Agostinho, Gabriela Pessoa and Camila Vitti by Patricia Falcão and Ana Ribeiro. Audio Recording via Zoom, April 30, 2021.
Dekker, Annet. “Enjoying the Gap: Comparing Contemporary Documentation Strategies.” In Preserving and Exhibiting Media Art: Challenges and Perspectives. Amsterdam: Amsterdam University Press, 2013.
Depocas, Alain. “Digital Preservation: Recording the Recoding—the Documentary Strategy.” 2002. www.fondation-langlois.org/html/e/page.php?NumPage=152.
Depocas, Alain, Jon Ippolito, Caitlin Jones, and Daniel Langlois Foundation for Art, Science and Technology, eds. Variable Media/Permanence Through Change: The Variable Media Approach. New York: The Solomon R. Guggenheim Foundation and Montreal: The Daniel Langlois Foundation for Art, Science and Technology, 2003.
Digital Preservation Coalition (DPC). Digital Preservation Handbook, 2nd ed. Digital Preservation Coalition, 2015a. www.dpconline.org/handbook.
———. “Institutional Policies and Strategies—Digital Preservation Handbook.” 2015b. www.dpconline.org/handbook/institutional-strategies/institutional-policies-and-strategies.
DOCAM (Documentation and Conservation of Media Arts Heritage) Research Alliance. “DOCAM Documentation Model.” The Daniel Langlois Foundation for Art, Science, and Technology, 2010. www.docam.ca/en/presentation-of-the-model.html.
Domínguez Rubio, Fernando. Still Life, Ecologies of the Modern Imagination at the Art Museum. Chicago: The University of Chicago Press, 2020.
Dullart, Eve. “Case Study Report: Black and White . . . (2001, Nan Hoover).” 2009. www.scart.be/?q=en/content/case-study-report%C2%A0black-and-white%E2%80%A6-2001-nan-hoover.
E.C.C.O. “Competences for Access to the Conservation-Restoration Profession.” 2011. www.ecco-eu.org/fileadmin/assets/documents/publications/ECCO_Competences_EN.pdf.
Ensom, Tom, Patricia Falcão, and Chris King. “Software-Based Art Preservation.” Tate Website, 2021. www.tate.org.uk/about-us/projects/software-based-art-preservation.
Europeana Pro. “Linked Open Data.” Europeana Pro. Accessed September 8, 2021. https://pro.europeana.eu/page/linked-open-data.
Falcão, Patricia. “Preservation of Digital Video Art at Tate (Webinar Winter School for Audiovisual Archiving, 2017).” Presented at the Winter School for Audiovisual Archiving, Nederlands Instituut voor Beeld en Geluid, Hilversum, The Netherlands, January 26, 2017. www.youtube.com/watch?v=L_Oom6ggdxo.
“Forging the Future.” Accessed December 1, 2021. https://forging-the-future.net/.
Frazer, Meghan. “An Elevator Pitch for File Naming Conventions.” ACRL TechConnect (blog), January 14, 2013. https://acrl.ala.org/techconnect/post/an-elevator-pitch-for-file-naming-conventions/.
Guggenheim Museum. “Time-Based Media.” The Guggenheim Museums and Foundation, 2012. www.guggenheim.org/conservation/time-based-media.
Haidvogl, Martina. “Case Study #1: Acquiring and Documenting Jürg Lehni’s ‘Viktor’ (2006~).” TechFocus III: Caring for Software-Based Art, 2016. https://resources.culturalheritage.org/techfocus/techfocus-iii-caring-for-computer-based-art-software-tw-2/.
Haidvogl, Martina, and Layna White. “Reimagining the Object Record: SFMOMA’s MediaWiki.” Stedelijk Studies (Imagining the Future of Digital Archives and Collections), no. 10 (2020). https://stedelijkstudies.com/journal/reimagining-the-object-record-sfmomas-mediawiki/.
Hummelen, IJsbrand, and Tatja Scholte. “Capturing the Ephemeral and Unfinished: Archiving and Documentation as Conservation Strategies of Transient (as Transfinite) Contemporary Art.” Technè 24 (2006).
ICOMOS. “International Charter for the Conservation and Restoration of Monuments and Sites (The Venice Charter 1964).” 1964. www.icomos.org/charters/venice_e.pdf.
ICON. “ICON Code of Conduct.” ICON, October 17, 2014. www.icon.org.uk/resource/code-of-conduct.html.
———. “The Icon Ethical Guidance.” ICON, 2020. www.icon.org.uk/resource/icon-ethical-guidance.html.

INCCA. “INCCA International Network for the Conservation of Contemporary Art (Website).” n.d. Accessed June 12, 2021. www.incca.org/.
———. “INCCA Project: Inside Installations (2004–2007).” n.d. Accessed December 1, 2021. https://incca.org/articles/project-inside-installations-2004-2007.
Inside Installation, Preservation and Presentation of Installation Art. “Inside Installation, Research Topics: Documentation.” 2007. https://inside-installations.sbmk.nl/research/index.php.html.
Laurenson, Pip. “Authenticity, Change and Loss in the Conservation of Time-Based Media Installations.” Tate Papers, no. 6 (Autumn 2006). www.tate.org.uk/research/publications/tate-papers/06/authenticity-change-and-loss-conservation-of-time-based-media-installations.
Lawson, Louise, Acatia Finbow, Duncan Harvey, Hélia Marçal, Ana Ribeiro, and Lia Kramer. “Strategy and Glossary of Terms for the Documentation and Conservation of Performance, Published as Part of Documentation and Conservation of Performance (March 2016—March 2021), a Time-Based Media Conservation Project at Tate.” Tate, 2021. www.tate.org.uk/about-us/projects/documentation-conservation-performance/strategy-and-glossary#glossary.
Lawson, Louise, Acatia Finbow, and Hélia Marçal. “Developing a Strategy for the Conservation of Performance-Based Artworks at Tate.” Journal of the Institute of Conservation 42, no. 2 (May 4, 2019): 114–34. https://doi.org/10.1080/19455224.2019.1604396.
Library of Congress. “Sustainability of Digital Formats: Planning for Library of Congress Collections.” Webpage, Library of Congress, 2005. www.loc.gov/preservation/digital/formats/.
Ligatus Research Centre. “Linked Conservation Data.” Accessed January 5, 2021. www.ligatus.org.uk/project/linked-conservation-data.
LIMA. “Artwork Documentation Tool.” 2017. www.li-ma.nl/adt/.
Lockhart, Adam. “VR as a Preservation and Simulation Tool for Media Art Installations.” Montreal, 2020. https://discovery.dundee.ac.uk/ws/portalfiles/portal/54480990/04.1_conference_paper_ISEA_16_Oct_2020_1_.pdf.
Maguire, Lorna. File Naming Conventions: Simple Rules Save Time and Effort. University of Aberdeen, 2017. www.abdn.ac.uk/staffnet/documents/policy-zone-information-policies/File%20Naming%20Conventions%20July%202017.pdf.
Marçal, Hélia, António M. Duarte, and Rita Macedo. “The Inevitable Subjective Nature of Conservation: Psychological Insights on the Process of Decision Making.” In ICOM-CC Publication Online. Melbourne, Australia: Pulido & Nunes, ICOM Committee for Conservation, 2014. www.icomcc-publications-online.org/1538/The-inevitable-subjective-nature-of-conservation-psychologicalinsights-on-the-process-of-decision-making.
Matters in Media Art. “Matters in Media Art; Guidelines for the Care of Media Artworks.” 2015. http://mattersinmediaart.org/.
Metropolitan Museum of Art. “Sample Documentation and Templates.” The Metropolitan Museum of Art. n.d. Accessed June 24, 2021. www.metmuseum.org/about-the-met/conservation-and-scientific-research/time-based-media-working-group/documentation.
Museum of Modern Art (MoMA). “Media Conservation Initiative.” MoMA. n.d. Accessed June 4, 2021. www.mediaconservation.io/.
National Portrait Gallery. “Linked Open Data.” National Portrait Gallery, n.d. Accessed June 18, 2021. www.npg.si.edu/linked-open-data.
Phillips, Joanna. “Reporting Iterations: A Documentation Model for Time-Based Media Art.” In Performing Documentation, Revista de História Da Arte—Serie W, edited by Gunnar Heydenreich, Rita Macedo, and Lucia Matos, 168–79. Lisboa: Instituto de História da Arte, 2015.
———. “Time-Based Media Conservation in Museum Practice.” Conference presentation at Archiving the Unarchivable, documenta archiv, November 22, 2018. https://player.admiralcloud.com/?v=c5e0a3dc-2060-403f-a963-76c5f257ef4e.
Real, William A. “Toward Guidelines for Practice in the Preservation and Documentation of Technology-Based Installation Art.” Journal of the American Institute for Conservation 40, no. 3 (January 1, 2001): 211–31. https://doi.org/10.1179/019713601806112987.
Rhizome. “The ArtBase Relaunches: Welcome to Linked Open Data.” Rhizome, 2021. http://rhizome.org/editorial/2021/apr/26/the-artbase-relaunches-welcome-to-linked-open-data/.
Roeck, Claudia. “Preservation of Digital Video Artworks in a Museum Context: Recommendations for the Automation of the Workflow from Acquisition to Storage.” Master’s Thesis, Bern University of the Arts, 2016. www.academia.edu/41381787/Preservation_of_digital_video_artworks_in_a_museum_context_Recommendations_for_the_automation_of_the_workflow_from_acquisition_to_storage and www.academia.edu/41381788/Masters_Thesis_Appendix.
Smithsonian Institution. “Time-based Media & Digital Art.” n.d. Accessed February 18, 2021. www.si.edu/tbma/.
Stanford University Libraries. “Best Data Practices.” n.d. Accessed March 27, 2021. https://library.stanford.edu/research/data-management-services/data-best-practices.
Stigter, Sanneke. Between Concept and Material. Working with Conceptual Art: A Conservator’s Testimony. University of Amsterdam, 2016. https://pure.uva.nl/ws/files/2679847/174706_PhD_Stigter_20160527_Final_Manuscript_complete.pdf.
Tate. “Reshaping the Collectible: When Artworks Live in the Museum.” Tate, 2018. www.tate.org.uk/research/reshaping-the-collectible.
———. “Documentation and Conservation of Performance—Project.” Tate, 2021. www.tate.org.uk/about-us/projects/documentation-conservation-performance.
Tate, and S.M.A.K. “Document Structure by TATE and SMAK.” Inside Installations, 2007a. https://inside-installations.sbmk.nl/OCMT/mydocs/Document%20structure%20by%20TATE%20and%20S.M.A.K.pdf.
———. “Introduction to Documentation Structure by TATE and SMAK.” 2007b. https://inside-installations.sbmk.nl/OCMT/mydocs/Introduction%20to%20Documentation%20Structure%20by%20TATE%20and%20S.M.A.K.pdf.
Te Brake-Baldock, Karen. “Workshop on INCCA Database for Artist’s Archives.” Conservation DistList Archives, March 14, 2008. https://cool.culturalheritage.org/byform/mailing-lists/cdl/2008/0292.html.
University of Edinburgh, Records Management. “Naming Conventions.” 2007. https://www.ed.ac.uk/records-management/guidance/records/practical-guidance/naming-conventions.
Vall, Renée van de. “Documenting Dilemmas. On the Relevance of Ethically Ambiguous Cases.” Sin Objeto, Collecting the Immaterial (2017): 83–99. https://doi.org/10.18239/sinobj_2017.00.05/ISSN 2530-6863.
Van de Vall, Renée, Hanna Hölling, Tatja Scholte, and Sanneke Stigter. “Reflections on a Biographical Approach to Contemporary Art Conservation.” In ICOM-CC 16th Triennial Conference Preprints, Lisbon 19–23 September 2011, edited by Janet Bridgland. Lisbon: Critério-Produção Grafica, Lda, 2011. https://pure.uva.nl/ws/files/1262883/115640_344546.pdf.
Van Saaze, Vivian. Installation Art and the Museum. Amsterdam: Amsterdam University Press, 2013.
“Variable Media Network.” 2003. https://variablemedia.net/e/index.html.
Wharton, Glenn. “Public Access in the Age of Documented Art.” Revista de História Da Arte 4 (2015): 180–91. http://revistaharte.fcsh.unl.pt/rhaw4/RHAw4.pdf.
Wielocha, Aga, and David Smith. Interview with Aga Wielocha and David Smith by Francesca Colussi and Ana Ribeiro. Audio Recording via Zoom, January 20, 2021.
Wijers, Gaby. Interview with Gaby Wijers by Francesca Colussi, Patrícia Falcão and Ana Ribeiro. Audio Recording via Zoom, February 1, 2021.
WikiDP. “About Wikidata For Digital Preservation | WikiDP.” 2016. https://wikidp.org/about.

12 INVENTORY AND DATABASE REGISTRATION OF TIME-BASED MEDIA ART

Martina Haidvogl and Linda Leckart

Editors’ Notes: Martina Haidvogl is a lecturer at the Bern University of the Arts’ media conservation program (Switzerland), and Linda Leckart is a registrar at the Los Angeles County Museum of Art (USA). In their previous roles at the San Francisco Museum of Modern Art, both authors collaboratively developed workflows and worked side-by-side to quality control and catalog media artworks. In their chapter, they draw on their respective expertise in media conservation and registration to develop a comprehensive numbering system for inventorying media artworks within a traditional collection management system.

12.1 The Collection Information Environment

Collecting institutions employ various systems to track, locate, and manage the objects and artifacts in their care. Most museums today use a database to manage their collections, supporting the work of curators, registrars, preparators, and conservators. A museum’s collection management database typically sits at the core of the institution’s information environment. Within it, the different artworks are cataloged, parts and components inventoried, locations tracked, conservation reports filed, loan activities documented, and exhibitions logged. The collecting of these data points makes the collection searchable, allows quantification of holdings, and facilitates sharing information across departments and stakeholders.

In addition to this database, museums may employ digital asset management systems (DAMS): a central location to store and manage the actual digital assets, such as photographs, videos, and documents, which can be searched for via their accompanying metadata and other forms of tagging. Such DAMS are the digital successors of museum documentation binders filled with paper documents, analog photographs, and slides. However, not all analog recordkeeping has ceased with the advent of digital documentation and cataloging systems. Institutions still keep analog files on artworks, such as acquisition or loan agreements, provenance research, conservation reports, installation instructions, or other paper documentation. In addition to legal responsibilities to maintain certain records, documenting artworks serves the broader museum goals of collection research, education, and preservation. Different departments often keep their own paper records. These departmental records are usually complemented by digital documents of all kinds on shared network drives (see Chapter 10).

DOI: 10.4324/9781003034865-15


While database systems are being utilized today for the stewardship of all art forms, it is the complexity and variability of contemporary art and time-based media art in particular that challenge traditional management systems and require some workarounds and new tools to meet their needs. The most common software-based solutions to managing collections are discussed in the following sections.

12.1.1 Digital Spreadsheets

Microsoft Excel and Google Sheets are two examples of spreadsheet software, which can be used to list, sort, and visualize data. MS Excel is often used as a preparation tool for importing data into a database. As a program, however, it is not always networked, and in some cases only a single user can make changes at a time, without support for automated tracking of contributors. Google Sheets allows for editing by more than one user at a time but may conflict with an institution's security protocols if not integrated within its system.

12.1.2 Desktop Databases

Desktop databases, such as Microsoft Access and FileMaker Pro, support a limited number of users with an interface that has to be customized. They are not typically networked and offer no security through tiered permissions, and data export is a challenge due to proprietary data formats and version compatibility issues if the data have not been regularly exported into a common human-readable format such as .csv or .txt. Because of their affordability and availability, they are often an easy first solution; however, they are not suitable for actively managing collections.
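The risk of proprietary lock-in described above is one reason regular exports into a plain format matter. A minimal sketch of such an export, using Python's standard library and hypothetical component records (the object numbers echo those in Table 12.1; the field names are our assumption, not a standard):

```python
import csv

# Hypothetical component records, as they might be pulled from a desktop database.
records = [
    {"object_number": "2010.50.C.001", "medium": "Apple ProRes 422/QuickTime", "location": "Digital Art Vault"},
    {"object_number": "2010.50.C.XC.001", "medium": "H264/MP4", "location": "Digital Art Vault"},
]

with open("components_export.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["object_number", "medium", "location"])
    writer.writeheader()       # column names on the first row
    writer.writerows(records)  # one row per component
```

The resulting .csv file is human-readable and independent of any database vendor, so it remains usable even if the original desktop database can no longer be opened.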

12.1.3 Collection Management Systems

A Collection Management System (CMS) is a type of content management system that has been customized to serve museums and collecting institutions. Confusingly, CMS is the abbreviation for both Collection Management System and Content Management System and is commonly used colloquially to refer to either. In this chapter, CMS will solely stand for Collection Management System. A content management system consists of a database ("backend") and a user interface ("frontend"). Popular CMS applications include TMS (The Museum System), Museum+, and Adlib Museum, as well as the open-source solutions CollectionSpace [CollectionSpace (Software) n.d.] and Omeka [Omeka (Software) n.d.].

Most CMS are relational databases set up to capture structured data. They allow for tiered user permissions and networked access and are robust systems for capturing core data and tracking artwork locations. However, they are often cumbersome to use and, because of their relational nature, rarely offer the scope and flexibility that works of contemporary art require. Their use for media artworks in particular—which may have multiple components with varying relationships, are often variable in nature, may have replaceable or replaced parts, or may even have no physical components at all—is often characterized by individual solutions and workarounds.

12.2 Cataloging Time-Based Media Art

Bringing time-based media artworks into a collection and developing ways of caring for these artworks often require not just an adjustment of current management systems and processes

Inventory and Database Registration

but also shifts in conceptual thinking by caretakers. How do you condition-check an artwork that has no physical components? What digital locations need to be created to properly track the components? How can there be several master formats for one artwork? What status do exhibition copies of an artwork have? Do they need to be tracked, and if so, how? How does one account for a remastered video version produced by the artist years after the acquisition? How does one track a conceptual artwork that doesn't have any components, physical or digital? While each of these questions has to be examined and solved using an institution's available (and oftentimes historically evolved) system(s), the following recommendations apply universally to any successful cataloging of a time-based media collection.

12.2.1 Track Every Component

A time-based media artwork may come with multiple information carriers containing analog and digital copies of a single artwork, sculptural installation components and dedicated equipment, and a certificate of authenticity (COA). Each copy of the video may have a different file type and be assigned an intended use. In a CMS, each of these artwork components needs to be tracked individually, and components that arrive or are produced at a later date need to be added to record all holdings and their history.

Glossary

Parent records are created in a hierarchical database to represent the whole of a multi-part artwork.

Child records are created to represent the parts of the artwork.

Parts are components of an artwork that are integral to the identity or makeup of the artwork as a whole, as defined by the artist.

Components are physical or digital elements related to an artwork. A time-based media artwork can have components that are not parts themselves but are related to parts of an artwork (e.g., different versions of a video file). These components should be preserved by the collector for the life of the artwork.

Components that should be tracked but can be replaced if needed are called Accessories.
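The parent-child hierarchy defined in the glossary can be sketched as a simple data structure. This is a minimal illustration, not a real CMS schema; the `Record` class and its fields are our assumptions, while the object numbers follow the example in Table 12.1:

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """A hypothetical, stripped-down database record."""
    object_number: str
    description: str
    children: list = field(default_factory=list)  # parts under a parent, components under a part

    def add(self, child):
        self.children.append(child)
        return child

# A virtual parent record representing the multi-part artwork as a whole ...
artwork = Record("2010.50.A-D", "Virtual record for the whole installation")
# ... with a child record (part) for one video channel ...
channel_1 = artwork.add(Record("2010.50.C", "Video channel 1 (virtual part)"))
# ... and component records for the files related to that part.
channel_1.add(Record("2010.50.C.001", "Archival master provided by the artist"))
channel_1.add(Record("2010.50.C.XC.001", "Exhibition copy provided by the artist"))
```

The parent and the part records here hold no files themselves; they exist purely to group the components below them, which is exactly the role of "virtual records" discussed later in Section 12.4.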

12.2.2 Reflect the Status of Components

Unlike traditional works of art, time-based media artworks may have components of varying status within the collection. This difference in status helps with the classification of an artwork's holdings, may impact conservation decisions, and even allows for future deactivation of records. For example, a video's compressed exhibition copy does not sit at the same level as the video's uncompressed master file, which exceeds the former in quality and significance. Since the exhibition copy can be recreated from the master file, it may also not be afforded the same access regulation protocols. In addition, an exhibition copy coming directly from the artist may have a different status than an exhibition copy created in-house, and a master file may need to be "retired" if the artist decides to remaster their artwork at a later date. The status of components can be reflected in the numbering system, via an assigned status field, or simply recorded within a text description field. Because of media art's variable nature, the numbering system needs to come with a certain flexibility to accommodate potential change.


12.2.3 Note Locations, Including Digital Ones

For physical objects, it is well understood how to track their location on a certain shelf or rack. The same system can be applied to digital components by assigning digital locations to each component in the database: a file may first live on a RAID as a form of temporary storage when it is acquired, and only after a condition check be moved to an institution's dedicated network drive, or Digital Art Vault (see Chapter 8).

There is one caveat to digital objects that museum databases cannot account for: the very same digital file can, in fact, live in two (or more) places at once—for example, on the original hard drive it was delivered on and in the new digital location within the museum's storage environment. The hard drive then becomes a hybrid item, being an object and a location at the same time, a duality that is both conceptually and practically hard to process. Depending on the CMS used, there are different workarounds for this scenario, which will depend on in-house guidelines. Most practically, an institution will track the "tangible objects," such as the hard drive, the LTO tape, or the USB drive, as object components with their respective locations, noting which files exist on those carriers in a free-text field. Additionally, only the main location of the individual digital files will be tracked (e.g., "Video Master File, location: Digital Art Vault"). This shows the difference between dealing with a unique art object (e.g., a painting), which can physically be present in only one location, and handling a digital object (e.g., a video file), of which the very same file can exist in several locations at once. These different needs are currently not accounted for in common CMS.
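The workaround described above, tracking carriers as objects with physical locations while recording only one main location per digital file, can be sketched as follows. All record structures and field names here are hypothetical; only the object numbers and locations are taken from the chapter's example:

```python
# Carriers are objects with locations; the files they hold are noted in free text.
components = [
    {
        "object_number": "2010.50.C-D.001",  # the hard drive (tangible object)
        "type": "carrier",
        "location": "Physical art storage",
        "contents_note": "Copies of 2010.50.C.001 and 2010.50.D.001",
    },
    {
        "object_number": "2010.50.C.001",    # the video master (digital file)
        "type": "digital file",
        "location": "Digital Art Vault",     # only the main location is tracked
    },
]

def locations_of(object_number):
    """Everywhere a component may exist: its own location plus any carrier noting it."""
    return [
        c["location"]
        for c in components
        if c["object_number"] == object_number
        or object_number in c.get("contents_note", "")
    ]

print(locations_of("2010.50.C.001"))  # the same file lives in two places at once
```

The query illustrates the duality: the master file's record reports "Digital Art Vault," yet the free-text note on the hard drive reveals a second, physical location for the very same bits.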

12.2.4 Note Relationships Between Components

As part of an artwork's documentation, it is important to track the relationships between components, which have a tendency to multiply. For example, was the digital master file digitized from an artist's master tape, the institution's master tape, or the institution's archival master tape? Is the exhibition copy a direct derivative of the digital master file, or was a separate exhibition copy format (e.g., a DVD) transferred? Noting these relationships is crucial to documenting the provenance of each component with respect to its source. The need for this information becomes particularly apparent when quality has been compromised during a transfer. Some CMS today have limited capabilities to record these relationships, however, and most only allow such information to be recorded in a free text field. Therefore, it is common for bespoke solutions to be established depending on the particular CMS and historical modes of categorization.
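Even when a CMS only offers a free text field, recording each component's direct source in a structured way makes the whole derivation chain reconstructable. A minimal sketch, assuming a hypothetical mapping from component to source (object numbers follow the chapter's example):

```python
# Each component points to the component or carrier it was derived from.
derived_from = {
    "2010.50.C.XC.002": "2010.50.C.001",    # in-house exhibition copy, from the digital master
    "2010.50.C.001": "artist master tape",  # digital master, digitized from the artist's tape
}

def provenance(component):
    """Walk the chain of sources back to its origin."""
    chain = [component]
    while chain[-1] in derived_from:
        chain.append(derived_from[chain[-1]])
    return chain

print(provenance("2010.50.C.XC.002"))
# ['2010.50.C.XC.002', '2010.50.C.001', 'artist master tape']
```

A chain like this answers exactly the questions posed above, for instance whether an exhibition copy is a direct derivative of the digital master, and where in the chain a quality loss may have been introduced.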

12.3 Inventory

All artworks should be inventoried upon receipt. In the case of time-based media artworks, all components, digital or analog, need to be recorded. Digital files, specifically, need to be documented by tracking the number and types of files provided, their file metadata, and their condition. Additionally, it is necessary to document the physical carriers that digital components arrive on. Stewards should also revisit existing collections to confirm that all artworks are registered down to the component level. This may require a comprehensive time-based media collection inventory (see Chapters 4 and 5 for different approaches to collection surveys).

12.3.1 Defining the Categories

The first step in conducting a comprehensive time-based media collection inventory is to define the categories of information to be gathered for each object. Include object number, artist, title, medium (carrier or file type), dimensions, duration, markings, creation/migration history, status, and current location. Include a space for general notes that fall beyond the other categories, and note individual needs for rehousing or further review by registration, conservation, or curators.

Develop an internal glossary for time-based media within your institution to standardize language across departments during the inventory and future acquisition processes. Creating a list of carrier and file types with accompanying identification images can be a useful tool when asking colleagues to help identify unknown caches of time-based media components in their workspaces. This internal glossary could form the basis for a controlled vocabulary within your collection management system. The use of controlled vocabularies (e.g., by implementing term lists within the CMS) critically enhances the searchability of data sets and provides the basis for successful queries and data interpretation (Getty Research Institute (GRI) n.d.; DOCAM—List of Controlled Vocabulary n.d.).
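Enforcing a controlled vocabulary at the point of data entry is what keeps the data set queryable later. A minimal sketch of such a check; the term list is a hypothetical excerpt for illustration, not a published vocabulary:

```python
# Hypothetical excerpt of an institution's controlled vocabulary for carrier types.
CARRIER_TYPES = {"U-matic", "Betacam SP", "DVD", "LTO tape", "Hard drive"}

def validate_carrier(term):
    """Reject free-form variants so the inventory stays consistently searchable."""
    if term not in CARRIER_TYPES:
        raise ValueError(f"'{term}' is not in the controlled vocabulary")
    return term

validate_carrier("DVD")         # passes
# validate_carrier("dvd disc")  # would raise ValueError
```

In practice this role is played by drop-down term lists inside the CMS; the point is the same: every record uses one canonical spelling per carrier or file type, so a query for "DVD" finds all DVDs.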

12.3.2 Selecting the Documentation System

After selecting the information to be gathered, devise a system to document and organize the core data about each object. A spreadsheet program works well for inventory purposes. Google Sheets, for example, allows multiple users to work within the same sheet simultaneously, which is ideal when more than one staff member is actively working on the inventory. Another, more user-friendly way of collecting the information is via a custom form (built with a tool such as Google Forms, or as a PDF form in Adobe Acrobat) that captures data and can be exported directly into a spreadsheet application. Whatever the system, it is critical that the data can be transferred into a collection management system upon completion of the inventory process. In other words, when working with such static documents, it is important to view them as the information-collecting tools that they are and to keep in mind the ultimate destination of the data, so it can eventually be collected and managed in the appropriate place.

12.3.3 Enriching the Record

After physical and digital holdings have been documented in the inventory spreadsheet, the museum's object files should be reviewed for related documentation. Note the existence of certificates of authenticity and consider storing them in archival enclosures and environmentally controlled storage areas. Some artists require the certificate to validate the authenticity of the artwork, and the loss of such a certificate could be equivalent to the loss of the artwork. Consider adding a column to the inventory spreadsheet for COAs, as the institution may decide to track them as components of an artwork. Furthermore, object files may contain installation instructions, acquisition agreements, and shipping receipts that outline the original components received by the institution. This information can help to identify what was found during the inventory, establish relationships, and assign status to those components. Lastly, it is good to know whether equipment manuals, Identity and Iteration Reports, migration and maintenance records, and exhibition histories exist, as they are key to understanding the history of an artwork and can be digitized for wider access and searchability.

12.3.4 Updating the Record

Time-based media inventories may need to be updated more frequently than those of other types of art due to technological obsolescence and changes in the status of components. Ideally, and for the sake of efficiency, information gathered during accessioning should be added directly to the institution's collection management system and be continually checked and updated at the point of exhibition and loan, which then provides an up-to-date accounting of the entire collection. Once the time-based media inventory is complete, confirm that the existing numbering is accurate and reflective of the institution's holdings of each artwork.

12.4 Time-Based Media Numbering Within a Collection Management System

The numbering and documentation procedures for time-based media will ultimately be based on an institution's existing object numbering structure and CMS. While there is no one-size-fits-all solution, there are general recommendations that can apply to most collections.

12.4.1 Keep It Flexible

There are numerous moments in the life of a time-based media artwork that can impact the numbering system. New components may be added through the acquisition of higher-quality formats, duplication, remastering, or migration. The intended uses of an artwork's digital files and carriers might change over time as technology advances, and multiple copies of components can have the same intended use. It is also not unheard of for an artist to change the display parameters for an artwork. Thus, it is important for a numbering system to accommodate existing time-based media components as well as any future additions. When numbering time-based media artworks with sculptural installation elements or artist-created or artist-modified equipment, it is recommended to assign the physical objects part numbers at the top of the hierarchy. Doing so ensures that the records for these elements do not get buried behind the media component(s), which can have many associated records and can continue to multiply over time (Jarczyk 2020).

12.4.2 Integrate with the Existing System

As the numbering for time-based media artworks needs to integrate with the existing system, it is recommended to start with a model for a traditional artwork and expand from there. Since this process requires institutional decision-making, the numbering should be finalized through a dialogue between different stakeholders, including registrars and conservators. If an artwork is made up of multiple parts, which in themselves contain multiple components, the parent-child model may be a useful way of representing it (see the Glossary and Table 12.1 for illustration). Create a parent record for the artwork as a whole and additional child records for the parts that make up the artwork. In such cases, the parent record may conceptually represent the artwork, whereas the physical or digital objects are only added at the child record level. The parent record is then simply a placeholder representing the artwork as a whole. These types of parent records are sometimes referred to as virtual records. They are also used to track conceptual artworks without any physical components. Adding medium descriptions in brackets after the component names may help with quickly identifying the elements in a list view. If done in accordance with the collection management system and solely at the child record level, this information can be imported into various reports without pulling it into label texts or online collection publishing.


Table 12.1 Inventory example with numbering, compiled by the authors

| Object Number | Description | Object Level | Medium | Location |
| --- | --- | --- | --- | --- |
| 2010.50.A-D | Parent/Virtual Object. This record represents the entire artwork (installation with sculptural components). | Whole/Parent | Virtual Record | N/A |
| 2010.50.A-D.COA | Certificate of Authenticity | Part/COA | Paper | Archive file |
| 2010.50.A | Description of sculptural element | Part/Child | Wood, plastic, wire | Physical art storage |
| 2010.50.B | Artist-modified projector; "unique equipment" | Part/Child | Projector, fabric, plexiglass | Physical art storage |
| 2010.50.C | Virtual Object. This record represents the TBM component of video channel 1. | Part/Child | Virtual Record | N/A |
| 2010.50.D | Virtual Object. This record represents the TBM component of video channel 2. | Part/Child | Virtual Record | N/A |
| 2010.50.C.001 | Archival Master provided by artist (CH1) | Component | Apple ProRes 422/QuickTime | Digital Art Vault |
| 2010.50.C.XC.001 | Exhibition Copy provided by artist (CH1) | Component | H264/MP4 | Digital Art Vault |
| 2010.50.C.XC.002 | Exhibition Copy (CH1) created by the institution from 2010.50.C.001 | Component | DVD | Physical art storage |
| 2010.50.D.001 | Archival Master provided by artist (CH2) | Component | Apple ProRes 422/QuickTime | Digital Art Vault |
| 2010.50.D.XC.001 | Exhibition Copy provided by artist (CH2) | Component | H264/MP4 | Digital Art Vault |
| 2010.50.D.XC.002 | Exhibition Copy (CH2) created by the institution from 2010.50.D.001 | Component | DVD | Physical art storage |
| 2010.50.C-D.001 | Hard drive provided by artist | Component | SSD hard drive | Physical art storage |
| 2010.50.C-D.002 | LTO tape | Component | LTO tape | Off-site |
| 2010.50.A-D.AC.001 | Brackets for installation of sculptural element .B | Accessory | Metal | Physical art storage |
| EQ.1234 | Backup projector | Accessory | Projector | A/V storage |


12.4.3 Create a Comprehensive Numbering System

When an artwork has multiple elements, these parts are integral to the identity or makeup of the artwork as a whole. However, they may not always need to be shown at the same time. Some institutions have found it convenient to add prefixes or suffixes to their object numbers to distinguish the intended uses of time-based media elements. For example, if the artwork requires that all parts be included during display, they could be represented by letters, such as A–D (see Table 12.1 for illustration). Other times, it is possible to show only a selection of parts, which could be reflected in the numbering system as a sequential number, such as 1–3. Prefixes may give a quick overview by sorting different components by type; however, they may complicate a search by object number. Some institutions rely solely on text fields and use controlled vocabularies for the descriptions of components. These details often go beyond the file or carrier type to include additional information such as the origin of the component, the corresponding file name on a digital art server, its relationship to other components, and its intended use in preservation and/or display.

Like all artworks, time-based media can have associated materials that are replaceable but need to be tracked in storage and documented for their role in the artwork's history (see Chapter 15). Such accessories common to time-based media artworks include rare equipment stockpiled by an institution, hardware, or other substitutable installation materials.

Table 12.1 outlines an example numbering system for a time-based media installation. In this scenario, the multichannel artwork is made up of four parts: a sculptural element, a piece of artist-modified display equipment, and two media components. It starts with a virtual parent record that represents the artwork as a whole but does not link to a physical or digital object.
Because all of these integral elements must be shown together, the object number for each part includes a letter. The part records for the TBM components are also virtual records and do not link directly to specific carriers or files. This is done to keep the record flexible in the event that the specific media element changes over time. The artwork also includes a certificate of authenticity. While often essential to the artwork, it will likely never be displayed and is numbered with the identifying suffix .COA.

Additional copies or versions of the media elements are given component records in the database. Their unique object numbers share the root and part letter with the virtual media component part record (here: 2010.50.C). There is no hierarchy in the numbering from this point forward. Each additional copy or version of the media element is assigned a running number. Adding two zeros before the first unique number allows for easier sorting and up to 999 potential versions over the life of the artwork. Differentiation between media components and notes about a component's status are accomplished by adding descriptions in other fields of the record. This can include, but is not limited to, the media file or carrier type, a description of its relationship to other parts/components, its status (such as whether it was provided by the artist or made in-house), metadata, file path names, codec information, and checksums.

Exhibition copies are numbered with the root and part letter of the virtual media component (here: 2010.50.C) and a suffix of XC (exhibition copy) followed by a three-digit unique number. Each time an exhibition copy is created, the next sequential number is used. These records can be made inactive if the exhibition copy is destroyed but will remain searchable within the database.
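The running-number scheme just described can be sketched as a small helper that assigns the next zero-padded exhibition copy number. The function name is hypothetical; the number pattern follows Table 12.1:

```python
# Existing exhibition copy numbers for one media part.
existing = ["2010.50.C.XC.001", "2010.50.C.XC.002"]

def next_exhibition_copy(root_and_part, existing_numbers):
    """Return the next number in the scheme: root + part letter + 'XC' + 3-digit running number."""
    prefix = f"{root_and_part}.XC."
    used = [int(n[len(prefix):]) for n in existing_numbers if n.startswith(prefix)]
    nxt = max(used, default=0) + 1
    return f"{prefix}{nxt:03d}"  # two leading zeros allow up to 999 versions

print(next_exhibition_copy("2010.50.C", existing))  # 2010.50.C.XC.003
```

Zero-padding is what keeps a plain alphabetical sort of object numbers in the correct sequence ("010" sorts after "009" rather than after "1").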
Dedicated yet replaceable accessories and media equipment are given a similar treatment, with the full number of the parent record followed by an .AC suffix (accessory) and a three-digit sequential number. Media equipment that is rare or obsolete but is not dedicated to a specific artwork can be tracked in the database with a unique prefix—for instance, EQ (equipment) (Falcão 2020; Harvey 2020). Tracking media equipment in the collection database that is difficult to obtain or maintain allows staff to record usage and demand, informing replacement and maintenance needs. While not an artwork's component, such equipment can be linked to the parent record as a related object, which documents the nature of the relationship.

12.5 Beyond the Database: Where to Store What Doesn't Fit?

While databases are very efficient at tracking objects, whether physical or digital, there are other supporting documents for which the database does not constitute an ideal home: installation instructions, artist interview files, Iteration and Identity Reports, composite or viewing copies, floor plans and production diagrams, packing, installation, and exhibition photography, video documentation, correspondence, and equipment manuals, among many others. The naming of the digital supporting documents should include the object number of the parent record for ease of recognition and organization by the user. Some of these may be linked to a database record. However, too many attachments increase the database's storage footprint. A museum's DAMS, if one exists, may be a more natural place for such files and documents. Through a tiered permission system, their individual access can be managed, and they can be shared with other platforms via links. In a DAMS, however, the context between the assets and their relation to the artwork may not be as obvious.

Many museums have therefore resorted to utilizing a file folder system on a shared network drive as a place to store these kinds of documents in a one-folder-per-artwork structure. With an organized subfolder system, the documents can be sorted, providing a quick overview of the contents. However, with multiple users, documents may be altered unintentionally, and a lack of version control makes such changes untraceable (Barok et al. 2019). If not managed strictly, different versions of a document may soon create a confusing assortment of assets. Generally, the many different files and file types may lead to a fracturing of information, and one may have to browse several documents to get a larger sense of the artwork or to find the small piece of information one seeks.
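A one-folder-per-artwork structure is easiest to keep consistent when the subfolders are created by a script rather than by hand. A minimal sketch; the subfolder names below are our assumptions drawn from the document types listed above, and any real scheme should follow in-house guidelines:

```python
import tempfile
from pathlib import Path

# Hypothetical subfolder scheme for a one-folder-per-artwork structure.
SUBFOLDERS = [
    "installation_instructions",
    "iteration_reports",
    "correspondence",
    "photography",
    "equipment_manuals",
]

def create_artwork_folder(base, object_number):
    """Create the standard subfolders for one artwork, named by its object number."""
    root = Path(base) / object_number
    for name in SUBFOLDERS:
        (root / name).mkdir(parents=True, exist_ok=True)
    return root

base = tempfile.mkdtemp()  # stand-in for the shared network drive
root = create_artwork_folder(base, "2010.50.A-D")
```

Naming the top-level folder after the parent record's object number, as recommended above for file naming generally, keeps the folder system and the CMS in step with each other.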
In recent years, museums and institutions have therefore experimented with alternative platforms that not only handle the previously mentioned challenges better but also support a more collaborative approach through their technological setup. Their flexibility also makes them much better suited to the complex documentation needs of contemporary artworks (Engel and Wharton 2017). Projects such as the Guggenheim's Panza Collection Initiative have used the wiki platform Confluence by Atlassian to provide a group of interdisciplinary researchers with access to compiled information on a collection of artworks (Guggenheim Museum—The Panza Collection Initiative n.d.). The open-source software MediaWiki [MediaWiki (Software) n.d.] was used for the Artist Archives Initiative (Artist Archives Initiative n.d.) at New York University, a collaborative research project that deeply analyzed the oeuvres of the artists David Wojnarowicz and Joan Jonas (Wharton et al. 2016). The same software has also been established at SFMOMA as a complementary platform to internally document collection artworks and support museum workflows (Haidvogl and White 2020). Going beyond the scope of this chapter, there have been explorations into (semantic) data models and linked (open) data for conservation and collections care, which allow for cross-platform and cross-institutional use of data management.

What all these external digital platforms have in common is that they offer a destination for curated content and supporting documents that can be universally accessed and collaboratively edited; that they allow researchers and museum professionals alike to discover a collection through streaming of linked videos and contextualized documents; and that, through version control, they foster collaborative approaches to artwork documentation.


12.6 Conclusions

What is most challenging about cataloging media artworks (and works of contemporary art) within standard museum databases is that these systems were never designed for such non-traditional art forms and have yet to be adapted. To fit these works into existing systems, standards and workarounds need to be developed, which will be unique to an institution's information system environment and historically evolved cataloging practice. Such standards, however implemented, should include tracking every component and its location; reflecting status, intended use, and the relationships among components; and remaining flexible to allow for additions or changes. Despite their smaller price tag and ease of access, desktop databases are not suitable solutions for actively managing a collection of any size. Even Collection Management Systems have their limits, and several institutions have developed complementary solutions to support the needs of media and contemporary artworks. Until CMS databases expand their functionality and capacities with an eye toward non-traditional works of art, collecting institutions will need to remain flexible and creative with their inventory and database registration practices for time-based media artworks.

Acknowledgments

The authors wish to thank Amy Brost, Assistant Media Conservator at The Museum of Modern Art; Amanda Dearolph, Database Administrator at the Los Angeles County Museum of Art; Patricia Falcão, Time-Based Media Conservator at Tate; Erika Franek, Assistant Director, Registration & Collection Information at the Los Angeles County Museum of Art; Duncan Harvey, Time-Based Media Conservator (Loans & Acquisitions) at Tate; Joey Heinen, Digital Preservation Manager at the Los Angeles County Museum of Art; Agathe Jarczyk, Associate Time-Based Media Conservator at the Solomon R. Guggenheim Museum; Jeannette Sharpless, Associate Registrar at The Museum of Modern Art; and Grace T. Weiss, Associate Registrar, Collections at the San Francisco Museum of Modern Art, for sharing their time and expertise with us during the writing of this chapter.

Bibliography

"Artist Archives Initiative." Accessed January 6, 2021. https://artistarchives.hosting.nyu.edu/Initiative/.

Barok, Dušan, Julia Noordegraaf, and Arjen P. De Vries. "From Collection Management to Content Management in Art Documentation: The Conservator as an Editor." Studies in Conservation 64, no. 8 (2019): 472–89. https://doi.org/10.1080/00393630.2019.1603921.

CollectionSpace (Software). "Lyrasis." n.d. Accessed June 30, 2021. https://collectionspace.org.

DOCAM. "DOCAM—List of Controlled Vocabulary." n.d. www.docam.ca/en/tools/controlled-vocabulary-list.html.

Engel, Deena, and Glenn Wharton. "Managing Contemporary Art Documentation in Museums and Special Collections." Art Documentation: Journal of the Art Libraries Society of North America 36, no. 2 (September 2017): 293–311. https://doi.org/10.1086/694245.

Falcão, Patricia. Email from Patricia Falcão to Martina Haidvogl, August 11, 2020.

Getty Research Institute. "Art & Architecture Thesaurus® Online." Getty Research Institute (GRI), n.d. www.getty.edu/research/tools/vocabularies/aat/.

Guggenheim Museum. "The Panza Collection Initiative." The Guggenheim Museums and Foundation. Accessed January 6, 2021. www.guggenheim.org/conservation/the-panza-collection-initiative.

Haidvogl, Martina, and Layna White. "Reimagining the Object Record: SFMOMA's MediaWiki." Stedelijk Studies (Imagining the Future of Digital Archives and Collections), no. 10 (2020). https://stedelijkstudies.com/journal/reimagining-the-object-record-sfmomas-mediawiki/.

Harvey, Duncan. Interview with Duncan Harvey by Martina Haidvogl, August 7, 2020.


Jarczyk, Agathe. Interview with Agathe Jarczyk by Martina Haidvogl and Linda Leckart, July 20, 2020.

MediaWiki (Software). "Wikimedia Foundation." n.d. Accessed January 6, 2021. www.mediawiki.org/wiki/MediaWiki.

Omeka (Software). "Digital Scholar Project." Roy Rosenzweig Center for History and New Media, n.d. Accessed June 30, 2021. https://omeka.org.

Wharton, Glenn, Deena Engel, and Marvin C. Taylor. "The Artist Archives Project: David Wojnarowicz." Studies in Conservation 61, no. sup2 (June 2016): 241–47. https://doi.org/10.1080/00393630.2016.1181350.


13 DIGITAL PRESERVATION AND THE INFORMATION PACKAGE

Nicole Martin

Editors’ Notes: Nicole Martin is Associate Director of Media Archives at Human Rights Watch and former Adjunct Professor and Associate Director at the Moving Image Archiving and Preservation program at Tisch School of the Arts at New York University. With an archivist’s perspective, Nicole Martin introduces digital preservation standards and practices established in the libraries and archives community and presents their relevance to the care of audiovisual heritage and time-based media art collections.

13.1 Introduction: Preservation and the Past, Present, and Future

When performing digital preservation, it is useful to consider the past, present, and future of a digital asset to make data more meaningful, understandable, and portable. Preservationists begin by duplicating data, backing it up for safekeeping, then contextualizing it and creating documentation about assets so they are usable and renderable in the future. All of the information gathered during this process is added to an "Information Package," the entity at the heart of every digital preservation workflow.

Collection stewards begin the process of digital preservation by investigating the past or history of data in their possession, documenting that information to make certain it is available to future users indefinitely. This background information creates context and makes data more meaningful, helping future users understand how a file was created, how it was used, and who used it. Preservationists also document information about the file in its present state. Every aspect of the current construct of a file is explored, and data integrity or fixity checks are generated, making certain every 0 and 1 remains fixed and unchanged over time.

The documentation added to an Information Package supports the data it contains, ensuring this data has everything it needs to be understood, used, and viewed or rendered in the future. The Information Package is like a seed pod containing all the necessary fuel for a seed to propagate successfully in any future situation; it allows data to be moved through its life cycle with all of the contextual information it needs to be viewed, played, or rendered. This detailed attention benefits any data type but is particularly relevant to collection stewards who handle time-based media due to its complexity and reliance on external dependencies such as software, hardware, networks, and other systems.
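A fixity check of the kind described above can be illustrated in a few lines. The choice of SHA-256 is ours for the sketch (practice varies; MD5 and SHA variants are all common in preservation workflows), and the function names are hypothetical:

```python
import hashlib

def checksum(path):
    """Compute the SHA-256 digest of a file, reading it in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):  # 1 MiB at a time
            h.update(chunk)
    return h.hexdigest()

def fixity_check(path, recorded_digest):
    """True if every 0 and 1 of the file is unchanged since the digest was recorded."""
    return checksum(path) == recorded_digest
```

The digest recorded at ingest becomes part of the Information Package's documentation; re-running `fixity_check` at any later point verifies that the data remains fixed and unchanged over time.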

DOI: 10.4324/9781003034865-16

Digital Preservation

Glossary

Archival Information Package (AIP): From the OAIS Reference Model, a set of data retained by an archive that has been documented and described in detail for preservation.
Command-line tool: A program or application that is executed within a text-based interface as opposed to a Graphical User Interface.
Data ingest: The process of acquiring data and moving it into a preservation platform such as a Digital Asset Management system or repository.
Designated community: From the OAIS Reference Model, the defined group of users who will access and utilize data to be preserved.
Dissemination Information Package (DIP): From the OAIS Reference Model, a set of data retained by an archive and designated for access by an anticipated user group.
Fixity check: The act of evaluating the composition of a file to ensure it remains static or unchanged.
Network-Attached Storage (NAS): A data storage server that is accessible via a computer network.
Payload: From the Library of Congress’ BagIt standard, a set of user data designated for preservation.
Storage Area Network (SAN): A dedicated network that gives servers access to shared pools of data storage.
Submission Information Package (SIP): From the OAIS Reference Model, a set of data designated for preservation that has been received and is ready for more detailed description.

13.2 Defining the Information Package: The OAIS Reference Model and BagIt

What exactly is an Information Package, and how does it benefit collections? Digital preservation standards light the way, leading us from basic data backup and redundancy to deeper knowledge about how data should be described, grouped, and maintained for long-term usability. Standards define and enhance our day-to-day archival and preservation practice, providing us with methods to manage data carefully and responsibly throughout its life cycle. They challenge us to consider details we may have omitted, encourage us to share and exchange language and methodologies with our colleagues, and hold us accountable for consistent work year after year. Standards give us a holistic understanding of the makeup and meaning of assets in our care. Collections that are well described and understood are more easily appreciated and readily valued, increasing the likelihood that they will receive the attention and advocacy necessary to last long into the future.

The Open Archival Information System (OAIS) Reference Model defines the Information Package, describes its movement through a repository, and provides guidance that helps practitioners determine what kind of metadata and contextual information should be included with data to assure its longevity. The OAIS model, outlined by the Consultative Committee for Space Data Systems and first published in 2003, is a “reference” or “conceptual” model, meaning it does not make prescriptive workflow recommendations or require special systems or architecture to be utilized. Instead, OAIS encourages collection stewards to embrace concepts that can be implemented in the context of their respective repositories or institutions.

OAIS defines the Information Package simply as “a conceptual container” that holds data (known as the “object”) and associated metadata, stating, “An Information Package is a conceptual container of two types of information called Content Information and Preservation Description Information (PDI)” (OAIS—Reference Model 2012, 11). A basic digital Information Package might consist of a single folder that contains a media asset, such as a single video file, and any associated metadata about that file—for example, a text file containing data integrity checks, such as an MD5 checksum manifest; rights information, such as who the rights holder may be and terms of use; basic description information, such as who created the file and when it was created; and content information, such as who appears in the video and what it depicts.

The OAIS model has a “particular focus on digital information,” both in terms of the data contained in a package and the method of description, but it is not restricted to the virtual world and can be applied to physical collections as well. According to the OAIS, “In this reference model there is a particular focus on digital information, both as the primary forms of information held and as supporting information for both digitally and physically archived materials” (OAIS—Reference Model 2012, 11). The conceptual nature of the OAIS model leaves practitioners the flexibility to design an Information Package that is customized for the collections, institutions, or repositories in which they exist; it does not, however, provide specific guidance on how a package should be stored digitally.
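The folder-style package just described can be assembled mechanically. The sketch below uses standard-library Python to copy an asset into a package folder and write an MD5 checksum manifest and a description file alongside it; the file names and function names are illustrative, not prescribed by OAIS:

```python
import hashlib
from pathlib import Path

def md5sum(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the MD5 checksum of a file, reading it in chunks."""
    digest = hashlib.md5()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_package(package_dir: Path, asset: Path, description: str) -> None:
    """Create a minimal Information Package: the asset plus sidecar metadata."""
    package_dir.mkdir(parents=True, exist_ok=True)
    stored_asset = package_dir / asset.name
    stored_asset.write_bytes(asset.read_bytes())  # copy the asset into the package
    # MD5 manifest in the conventional "<checksum>  <filename>" layout
    (package_dir / "manifest-md5.txt").write_text(
        f"{md5sum(stored_asset)}  {asset.name}\n")
    (package_dir / "description.txt").write_text(description)
```

A future audit can recompute the checksum of the stored asset and compare it against the manifest, confirming that every 0 and 1 remains unchanged.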

13.3 Benefits of Information Packages

There is a multitude of benefits to using Information Packages to preserve data, including identification, portability, data verification, the ability to document relationships and dependencies, renderability, automation and scaling, and preservation of context or original order.

First, an Information Package defines the boundaries of a digital object or set of related files. The package draws a circle around a collection of data, designating an identifiable object that can be described, discovered, and validated. Data that is not contained is simply scattered across an archive and is difficult to monitor, maintain, and analyze. Once data has been properly identified, people and machines can track loss, degradation, and change. Packaged data is also portable and can be backed up, moved, and controlled efficiently. Collection items that are packaged can easily be verified or subjected to data integrity checks that run on an audit schedule (monthly, quarterly, etc.) or after significant events, such as a data transfer from one media storage type to another (e.g., hard drive to magnetic data tape).

Information Packages allow archivists to group data so that some or all of its relationships and dependencies are preserved. For example, data that must remain in its original folder structure hierarchy to be properly understood, viewed, played, or rendered would ideally be packaged in a container so that assets that depend on one another are always grouped and stored together. Packaging also facilitates the automation and scaling of collections because it separates data into designated sets that can be identified and subsequently manipulated by software programs and computer scripts. This type of batch processing allows archivists to scale collection management, choreographing a set of batch operations to be performed upon thousands of files or high-volume data with a single click, on a schedule, or through the execution of commands. Even for small archives and collections, the ability to scale and use analysis programs is critical, as it ensures consistency within data management workflows and future-proofs the archive if large sets of collections are acquired.

Lastly, gathering collection contents or digital objects into packages allows archivists to preserve important contextual information defined by the field of library science as “original order.” According to this principle, “the organization and sequence of records [is] established by the creator of the records” [Society of American Archivists (SAA)—Dictionary 2021]. To use an analogy from the physical world, paper collections may be donated to an archive in folders with hand-written labels that are ordered chronologically by date. In this case, the organizational system itself (a series of folders) provides contextual meaning that should be preserved along with the paper records that were acquired during the accessioning process. The same is true of digital files, which are frequently named and arranged to convey meaningful contextual information. This initial or previous system of organization is known as the original order, and it should be either preserved in its original state or documented within an Information Package.

13.4 Data Ingest and Acquisition

Before an Information Package is created, however, digital assets must be acquired, transferred, and stored by a collector or collecting institution. This is known in the field of library and information science as the process of ingest [Society of American Archivists (SAA)—ingest n.d.]. To explore this in more detail, it may be helpful to follow the process of digital asset ingest, including acquisition and data transfer.

13.4.1 Acquisition

For the purpose of this guide, the example of a single video file, a 13 GB Apple ProRes 422 file, will be used to examine and describe the acquisition process. This file may be delivered to or acquired by an archive, museum, individual, or collecting institution in myriad ways. It is small enough that it may be delivered via network download using a third-party file transfer service (such as WeTransfer) or cloud storage platform (such as Dropbox or Google Drive). Available bandwidth in countries with a strong internet architecture allows a file of this size to be accessible via download. As of this writing, for example, the average wired internet download speed in the United States is about 86 megabits per second (McKetta 2020). At that speed, our 13 GB ProRes file would take approximately 20 minutes to download.

Although network download is feasible, acquisition via physical hard drive is preferred because it eliminates variables that may introduce data loss or corruption, such as network interruptions, file download expirations on third-party platforms, discontinuation of transfer or cloud storage services, or mismanagement of data that is perpetually accessible in the cloud.

In support of best preservation practice during acquisition, files obtained by any means must be transferred to some form of secure, centralized digital storage for processing and long-term archiving. Options may include locally hosted storage devices, like a NAS or SAN, or a cloud storage volume that is directly secured, monitored, and managed by the collecting institution. Files stored on external hard drives may be vulnerable to corruption or loss due to device failure and must be moved to centralized storage so they can be monitored and held in a secure location. By the same token, data stored on basic third-party storage platforms (e.g., Dropbox) is vulnerable to loss, security failures, and access issues and must also be transferred to secure, encrypted, centralized storage under institutional control.
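The arithmetic behind that download estimate is straightforward: convert gigabytes to megabits, then divide by the link speed. A quick sketch (ignoring protocol overhead and network variability):

```python
def transfer_time_seconds(size_gb: float, speed_mbps: float) -> float:
    """Estimate transfer time from size (gigabytes) and speed (megabits/second)."""
    size_megabits = size_gb * 1000 * 8  # decimal gigabytes -> megabytes -> megabits
    return size_megabits / speed_mbps

# The chapter's example: a 13 GB ProRes file over an 86 Mbps connection.
minutes = transfer_time_seconds(13, 86) / 60
print(f"{minutes:.0f} minutes")  # prints "20 minutes"
```

The same function also shows why large time-based media assets are usually shipped on physical carriers: a 500 GB disk image at the same speed would take more than 12 hours.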

13.4.2 File Transfer

When moving digital assets over networks or from one location to another, data can be corrupted, or metadata can be stripped or removed from files. To make sure files remain unchanged, a safe method of file transfer can be implemented to move data from its source location (cloud storage, external hard drive, etc.) to its destination (centralized storage). Some file transfer utilities can create logs or records detailing successes and failures during a file transfer or “job” that can be saved alongside files as part of a preservation package.

Assets that are relatively large (in the range of tens of gigabytes per file), such as time-based media assets, are more susceptible to data loss or corruption due to long transfer times over low-bandwidth or unstable networks. Assets that are moved from one location to another may also lose metadata, such as creation date and time stamps. Digital storage devices (e.g., hard drives, flash drives) are formatted for use with a specific operating system, such as macOS, Windows, or Linux, each of which is designed to store specific categories of metadata in a way that may or may not be retained after a file transfer from one operating system format to another. Some file transfer applications were designed with this in mind, handling files more carefully, ensuring data integrity, and preserving metadata attributes. To avoid file corruption or loss of metadata, use a method of safe file transfer whenever possible.

Rsync [Rsync (Software) n.d.], a command-line backup utility developed in 1996 and covered by the GNU General Public License, comes installed on most Unix or Unix-like operating systems (Linux, macOS, the Windows Subsystem for Linux) and is commonly used to transfer files for preservation. The tool reports errors such as network interruptions or dropped files during transfer, produces a report for each file transferred including the speed of transfer and total size of data in transit, creates a summary report of each transfer job, and gives users the option to “preserve attributes” and retain all file metadata during transfer. Reports that rsync generates can be saved alongside files as a record of a transfer event and stored as a log in a preservation package.
Rsync has many more features and options, such as pausing jobs mid-transfer, reconciling duplicate files by comparing modification times, running checksums or fixity checks, and comparing the data integrity of files in the source and destination locations during transfer. Although rsync is easy to use as a command-line utility in the Terminal, some users may feel more comfortable with an application that employs a Graphical User Interface, such as Exactly [Exactly (Software) n.d.], an open-source tool developed by AVPreserve. Exactly uses the rsync utility “under the hood,” includes many of the same options and features as the full command-line program, and allows users to create custom metadata for each transfer job.
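For illustration, the core of the safe-transfer pattern (copy, preserve metadata, verify fixity, record a log line) can be sketched in standard-library Python. This is a simplified stand-in for rsync, not a replacement for its network features; the function names and log format here are our own:

```python
import hashlib
import shutil
from pathlib import Path

def sha256sum(path: Path) -> str:
    """Compute a file's SHA-256 checksum in 1 MiB chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def safe_copy(src: Path, dst: Path) -> str:
    """Copy a file with metadata preserved, then verify the copy's fixity."""
    checksum = sha256sum(src)
    shutil.copy2(src, dst)           # copy2 preserves timestamps and permissions
    if sha256sum(dst) != checksum:   # compare source and destination integrity
        raise IOError(f"checksum mismatch: {src} -> {dst}")
    return f"OK {checksum}  {src.name}"  # log line to store with the package
```

The returned log line can be appended to a transfer report and stored alongside the assets, mirroring the way rsync reports are kept in a preservation package.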

13.5 OAIS Information Packages

Once data is acquired and transferred to centralized storage, it can be processed by the archive. The OAIS model outlines a helpful series of Information Packages that can be used to facilitate management throughout the life cycle of data. These packages contain both the object or data to be held by the institution and metadata that helps describe the object and package contents. OAIS begins with the Submission Information Package (SIP), created during the acquisition process; continues with the Archival Information Package (AIP), designed to support data for long-term care; and ends with the Dissemination Information Package (DIP), designed for user access and deployment of data outside of the archive.

13.5.1 The Submission Information Package

During the acquisition process, information about incoming data can be obtained from producers (people or entities who created the digital assets) and stored as preservation metadata. This information might include details about provenance, such as who created the object, when, and how, and an overall history of its ownership and use. The Submission Information Package (SIP) would ideally contain rights information, including copyright details, usage agreements, and consent information. It could contain a category of metadata OAIS defines as Context Information, which describes relationships and dependencies of the object, such as external objects elsewhere in the collection. In addition to other types of metadata, such as descriptive information, the SIP should document data about the acquisition itself, such as the staff member who received the digital objects, the acquisition method, and the date of acquisition, all of which will help future users understand and contextualize assets in their care (OAIS—Reference Model 2012, 73).

Information captured from producers during the acquisition process can be stored alongside assets in a package defined by the OAIS standard as a SIP. This package is subsequently ingested or moved into a digital repository and/or to centralized storage, such as a NAS device. At this point, more metadata can be added to the package, and it can transition from a Submission Information Package to an Archival Information Package for long-term storage.

Figure 13.1 The OAIS Functional Model, available under Creative Commons (CC BY-SA 4.0).
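Acquisition metadata of this kind is commonly captured as a machine-readable sidecar file stored inside the SIP. Below is a minimal sketch using JSON; the field names are illustrative rather than mandated by OAIS:

```python
import json
from pathlib import Path

def write_sip_sidecar(sip_dir: Path, metadata: dict) -> Path:
    """Store acquisition metadata as a JSON sidecar inside the SIP folder."""
    sip_dir.mkdir(parents=True, exist_ok=True)
    sidecar = sip_dir / "acquisition-metadata.json"
    sidecar.write_text(json.dumps(metadata, indent=2))
    return sidecar

# A hypothetical acquisition record for the chapter's example ProRes file.
record = {
    "producer": "Artist's studio",              # who created the digital object
    "received_by": "Media conservation staff",  # staff member handling intake
    "acquisition_method": "external hard drive",
    "acquisition_date": "2020-08-01",
    "rights": "Copyright held by the artist; see acquisition agreement",
}
```

Because the sidecar travels with the assets, the same file can later be carried forward unchanged when the SIP transitions to an AIP.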

13.5.2 The Archival Information Package

Once the digital asset or assets in question (for example, our single 13 GB ProRes file) are transferred to centralized storage, a new set of metadata is added to the package. Within the framework of the OAIS conceptual model and preservation standard, the package associated with our ProRes file transitions from a Submission Information Package (SIP) to an Archival Information Package (AIP). At this point, additional metadata may be added to the package, whether by humans who are describing the asset and its content or through manual or automated operations performed on the asset. This metadata may be stored in a Digital Asset Management System, as a series of text files alongside the asset itself in a folder, or in a “sidecar” XML file that travels with the asset throughout its life cycle. In some cases, metadata may even be embedded in the file itself for safekeeping. Examples of metadata that may be added at this stage include reference information, such as unique IDs from previous collections; context information, such as documentation of relationships to other media or digital assets; or provenance information detailing the chain of custody and/or the asset’s origin. Additional metadata might include technical information, such as file format specifications generated through analysis tools like the MediaInfo (Pozdeev and MediaArea 2021) or ExifTool (Harvey 2021) programs; data integrity or fixity checks; or structural metadata about a file describing its makeup and construction, versioning information, and the computing environment in which the file is optimally rendered.

For an example of an extremely simple package that can be used for digital preservation, it is useful to examine the Library of Congress’ BagIt preservation packaging system. The BagIt standard was created to answer the practical question of how to arrange data within the file system or on disk-based storage (such as a hard drive), laying out a set of hierarchical naming, arrangement, and other structural conventions. BagIt utilizes a container known as a bag, which holds “descriptive metadata ‘tags’ and a file ‘payload’ ” (The BagIt File Packaging Format 2018). The bag can be a folder, a data container format (such as TAR or ZIP), or a similar digital entity. The payload is the digital object or data to be preserved, and tags are “metadata files intended to facilitate and document the storage and transfer of the bag” (The BagIt File Packaging Format 2018).

A slightly more complex example of an Information Package might be a standard BagIt package containing several media assets. Fig. 13.2 shows an example of a BagIt package that contains a set of five media and document files: .mov, .mp4, .mp3, .pdf, and .docx files. These files have been acquired through an accession process and are meant to be preserved. In the BagIt universe, these files are known as the payload. In addition to these media and document files, the BagIt standard also requires the inclusion of a set of metadata files, or data about the payload files, which are called tags. In the example shown in fig. 13.2, the tags are a set of four .txt files used to contextualize the payload and ensure data integrity.
The tags include the bag-info.txt tag file (bag size and creation date), the bagit.txt tag file (BagIt software version and text encoding information about the tags themselves), the manifest-md5.txt tag file (data integrity checks for each of the payload assets in the “data” folder), and the tagmanifest-md5.txt tag file (data integrity checks for each tag file in the bag) (The BagIt File Packaging Format 2018; Kunze et al. n.d.). The left-hand image of fig. 13.2 is a screenshot from the macOS Finder application showing a hierarchical view of the files and folders that make up an example bag containing an interview entitled 2020–08–01_TBM_Interview. The right-hand image of fig. 13.2 shows the same set of files and folders in a hierarchical view, this time as the text output of the command-line program “tree,” which lists all files and folders within a given directory.
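The bag structure described above can be generated with a few lines of standard-library Python. The sketch below follows the layout in RFC 8493 (payload under data/, a bagit.txt declaration, and payload and tag manifests); a production workflow would more likely rely on the Library of Congress’ own bagit tooling:

```python
import hashlib
from pathlib import Path

def md5sum(path: Path) -> str:
    return hashlib.md5(path.read_bytes()).hexdigest()

def make_bag(bag_dir: Path, payload: dict) -> None:
    """Create a minimal BagIt bag: a data/ payload plus required tag files."""
    data_dir = bag_dir / "data"
    data_dir.mkdir(parents=True)
    for name, content in payload.items():        # write the payload files
        (data_dir / name).write_bytes(content)

    # The required bag declaration (RFC 8493, section 2.1.1)
    (bag_dir / "bagit.txt").write_text(
        "BagIt-Version: 1.0\nTag-File-Character-Encoding: UTF-8\n")

    # Payload manifest: one "<checksum>  <path>" line per payload file
    (bag_dir / "manifest-md5.txt").write_text("".join(
        f"{md5sum(p)}  data/{p.name}\n" for p in sorted(data_dir.iterdir())))

    # Tag manifest: checksums of the tag files themselves
    (bag_dir / "tagmanifest-md5.txt").write_text("".join(
        f"{md5sum(bag_dir / t)}  {t}\n"
        for t in ("bagit.txt", "manifest-md5.txt")))
```

The optional bag-info.txt file (bag size, creation date, and similar descriptive fields) can be added in the same way before the tag manifest is written.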

Figure 13.2 The BagIt file hierarchy represented in macOS Finder (left) and the command-line program “tree” (right). Screenshot: Nicole Martin.



BagIt packages can be created with command-line utilities or basic software with an easy-to-use Graphical User Interface, or GUI. Bags can be generated manually by a human user or produced through automated batch processes. Importantly, BagIt packages can also be verified for data integrity or “audited” using the same software, either manually or through automation. At any time, a user can run a simple operation that will report whether any file within the bag (including tags) has changed, disappeared, or suffered some form of data loss. These data integrity checks can be performed routinely on a schedule (every three to six months, for example) or after significant events occur, such as the migration of an Information Package from one media type to another (from a hard drive to LTO tape, for instance).

Together, the OAIS Reference Model and the BagIt standard help define the Information Package and how it might be structured on disk-based storage. To package and describe collections, practitioners may choose to use either or both of these standards in some form or simply consult them for inspiration when designing or improving a repository or archival workflow. Regardless of the details, the simple act of implementing packaging for data storage, description, and preservation can increase the longevity and sustainability of digital objects immeasurably.
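The audit itself reduces to rereading the manifest and recomputing each checksum. The following is a simplified stand-in for the validation built into BagIt tooling:

```python
import hashlib
from pathlib import Path

def audit_bag(bag_dir: Path) -> list:
    """Check a bag's payload against manifest-md5.txt; return any problems."""
    problems = []
    for line in (bag_dir / "manifest-md5.txt").read_text().splitlines():
        expected, rel_path = line.split(maxsplit=1)
        target = bag_dir / rel_path
        if not target.exists():
            problems.append(f"missing: {rel_path}")
        elif hashlib.md5(target.read_bytes()).hexdigest() != expected:
            problems.append(f"changed: {rel_path}")
    return problems  # an empty list means the bag passed the audit
```

Run on a schedule, a function like this turns fixity checking into a routine, scriptable event whose results can themselves be logged inside the package.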

13.6 Conclusion

Not only does information packaging increase productivity within an archive or repository, but it also gives collection stewards an opportunity to define, describe, and track digital objects for access and preservation. Creating containers for data assets and assigning metadata to packages facilitates digital preservation and long-term collection care. By utilizing or drawing inspiration from digital standards, archivists and conservators can create packaging structures that include metadata specifically designed for the collections and institutions they support. Preservation packaging performed in the present makes digital objects more sustainable, helping to ensure they are understandable and renderable in a distant, uncertain future.

Bibliography

“The BagIt File Packaging Format (V1.0).” The Internet Engineering Task Force, October 2018. https://datatracker.ietf.org/doc/html/rfc8493.
“Exactly (Software).” AVPreserve. Accessed May 1, 2022. https://www.weareavp.com/products/exactly/.
Harvey, Phil. “ExifTool (Software) (version 12.32). MacOS, Windows, Unix Systems.” 2021. https://exiftool.org/.
Kunze, John, Andy Boyko, Justin Littman, Liz Madden, and Brian Vargas. “The BagIt File Packaging Format (V0.96).” The Internet Engineering Task Force (IETF). Accessed December 1, 2021. https://tools.ietf.org/id/draft-kunze-bagit-04.html.
McKetta, Isla. “U.S. Internet Speeds Increase 15.8% on Mobile and 19.6% on Fixed Broadband.” Speedtest, August 21, 2020. www.speedtest.net/insights/blog/announcing-us-market-report-q2-2020/.
OAIS. “Reference Model for an Open Archival Information System (OAIS): OAIS Magenta Book.” CCSDS 650.0-M-2. The Consultative Committee for Space Data Systems, 2012. https://public.ccsds.org/pubs/650x0m2.pdf.
Pozdeev, Max, and MediaArea. “MediaInfo (Software) (version 21.09). Windows, MacOS, Android, iOS, Debian, Ubuntu, Linux Mint, RedHat Enterprise Linux, CentOS, Fedora, openSUSE, ArchLinux, Gentoo, Flatpak, Snap, AWS Lambda. MediaArea.” 2021. https://mediaarea.net/de/MediaInfo.
“Rsync (Software).” Accessed May 1, 2022. https://rsync.samba.org.
Society of American Archivists (SAA). “Archives Terminology—Ingest.” N.d. Accessed December 1, 2021. https://dictionary.archivists.org/entry/ingest.html.
———. “Original Order.” In SAA Dictionary. Society of American Archivists, n.d. Accessed July 25, 2021. https://dictionary.archivists.org/entry/original-order.html.


14 DISK IMAGING AS A BACKUP TOOL FOR DIGITAL OBJECTS Eddy Colloton, Jonathan Farbowitz, and Caroline Gil Rodríguez

Editors’ Notes: This chapter introduces disk imaging as an emerging practice in time-based media art conservation. It brings together the expertise and experience of three authors who share a background in the Moving Image Archiving and Preservation master’s program (New York University) and pursued careers in time-based media conservation. With their comprehensive survey of use cases, tools, and best practices, Eddy Colloton (Project Conservator of Time-Based Media, Hirshhorn Museum & Sculpture Garden), Jonathan Farbowitz (Assistant Media Conservator, The Metropolitan Museum of Art, New York), and Caroline Gil Rodríguez (Director of Media Collections and Preservation, Electronic Arts Intermix) transfer crucial knowledge from the information sciences and the libraries and archives communities and apply it to art conservation.

DOI: 10.4324/9781003034865-17

14.1 Disk Imaging in Media Conservation

Creating disk images for the purpose of preserving, examining, and exhibiting software-based artworks or other digital objects is a developing area of practice in time-based media conservation. A disk image is a computer file that contains the contents and structure of an entire data storage device or a segment of that device. Storage devices that can be disk-imaged include internal or external hard disk drives, floppy disks (8-inch, 5¼-inch, 3½-inch), optical discs (CD, CD-ROM, VCD, DVD, Blu-ray), USB flash drives (sometimes called thumb drives or pen drives), and data tapes (including LTO and various IBM formats). In other words, a disk image is a bit-for-bit copy of the data on one of these physical data carriers.

Disk imaging was originally used in the IT (information technology) field as a technique to back up computers and recover data from hard drives. It was later adopted by digital forensics practitioners, who specialize in “the recovery and investigation of material found in digital devices” (Wikipedia—Digital Forensics 2020), as a method to create authentic copies of computers and data storage devices for later use as evidence in criminal and civil trials. Seeing the potential value of these practices for born-digital cultural heritage materials in their collections, archivists, librarians, and conservators have begun to integrate disk imaging into their practices (Kirschenbaum et al. 2010).

Extracting and archiving data from artist-provided computers, hard drives, or optical discs is a key conservation strategy, as all digital information carriers are relatively short-lived and not suitable for long-term storage of artwork data. Even if the artist-provided computer or storage device sits on a shelf unused, the data will eventually become corrupted or unreadable (Lunt 2011; Dicks 2020; Backblaze n.d.).

The method of creating a disk image offers the benefit of lossless duplication of the entire content of a storage device. Qualities unique to the original media carrier, such as file systems, deleted files, directory structure, and system settings, are all captured within a disk image without the risk of modification or loss that can come with other forms of data transfer. In this way, creating a disk image allows for a thorough future examination of art objects, including their hardware and software dependencies, without any risk to the original object. Since the disk image captures all of the particularities of the original carrier, it can potentially act as a stand-in for the original in the context of examination or display.

Disk images are more than simply a copy of all the information on a digital carrier. They can be used to activate the artwork independently from its original hardware and software (including the operating system and all applications and files). A disk image of a computer can be used to run an artwork within software known as an emulator or virtual machine (see Section 22.4). Virtualization and emulation are techniques for simulating, for example, the hardware and software functionality of a Windows 95–era computer on a modern machine. Through this approach, a digital artwork can be executed with all of its functions self-sufficiently on a different computer, relying only on the files within the disk image. Emulating disk images can serve a variety of purposes—for example, for understanding and documenting the behaviors of a work or for testing or mocking up equipment (Colloton et al. 2019b). Writing a disk image to a new hard drive can allow for exhibiting artwork on a replica computer (Lewis and Fino-Radin 2015).
Using a disk image instead of the original equipment or storage media means the artist-provided hardware and its contents are preserved and not put at risk through handling or prolonged exhibition hours. Disk imaging empowers conservators to interact with digital media in a museum collection while upholding the tenets of contemporary conservation ethics: to implement reversible, documented treatment that does not adversely affect future use or investigation of the original work (AIC 1994; ICON 2014; ICOM 2017).
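Conceptually, imaging is a sequential, bit-for-bit read of a carrier into a single file, paired with a checksum so the image can be verified later. The sketch below illustrates only this principle; in practice conservators use dedicated imaging tools, often behind a hardware write blocker, and the function name here is our own:

```python
import hashlib
from pathlib import Path

def image_device(device: Path, image: Path, block_size: int = 1 << 20) -> str:
    """Copy a carrier bit for bit into an image file; return its SHA-256."""
    digest = hashlib.sha256()
    with device.open("rb") as src, image.open("wb") as dst:
        while True:
            block = src.read(block_size)  # raw sequential read, no interpretation
            if not block:
                break
            digest.update(block)
            dst.write(block)
    return digest.hexdigest()
```

Because the read is raw, everything on the carrier (file systems, deleted files, directory structure, and system settings) ends up in the image; the returned checksum lets future fixity checks confirm the image has not changed.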

14.2 Use Cases for Disk Images

The disk image’s key potential for time-based media conservators lies in its ability to capture a representative facsimile of a digital object. A copy of a disk image can also be edited, experimented with, or used for a particular purpose. A disk image can serve as an archiving method, as an exhibition copy, as a loan deliverable, or as a means to facilitate the examination and documentation of an artwork.

14.2.1 Disk Image as an Archival Copy

When artists provide computers and other physical data carriers as part of the acquisition of their artwork, conservators may choose to create a disk image of the artwork immediately upon acquisition. Creating a backup of the artwork in its hardware and software environment reduces the risk of damage and loss through physical or functional degradation of the original hardware. Depending on the contents of the drive, a disk image created for this purpose could have the status of a preservation file and be safely and redundantly stored in a trusted digital repository (see Chapter 8).

For museums with substantial collections of digital carriers, disk imaging can be applied en masse, not unlike other preservation measures such as rehousing or deacidification. This alleviates the risk of data existing only on a single physical carrier subject to degradation and allows for more incremental investigations of media collections. Examples of such large-scale disk imaging include the Guggenheim’s Conserving Computer-Based Art initiative (Guggenheim Museum—CCBA n.d.; Farbowitz 2018) and the Denver Art Museum’s Electronic Media Project (Colloton and Moomaw 2018).

14.2.2 Disk Image as a Tool for Artwork Examination

A hard drive could contain valuable data inherent to running the artwork, as well as documentation of the machine’s general settings and levels. Software tools such as BitCurator’s Disk Image Access (Cirella 2021) or Brunnhilde (Walsh 2020), which bundles tools such as Siegfried, ClamAV, bulk_extractor, The Sleuth Kit’s tsk_recover, HFS Explorer, and fiwalk to streamline appraisal and minimal processing of digital archives, allow users to explore the contents of a disk image without mounting it, thus gaining access to technical information about the source of the disk image without risk of affecting the original data. Moreover, creating a disk image can allow the conservator to identify software dependencies and document any risks associated with software obsolescence.

Disk images can also be purpose-built: Rhizome’s Preservation Director Dragan Espenschied proposes the creation of “synthetic disk images” (Espenschied 2017) that contain only the essential dependencies of a specific software-based artwork without any of the other superfluous data. Through a process of trial and error, items are removed from a disk image to test whether or not they are essential. Espenschied has used this technique to locate software dependencies of artworks originally collected on a computer and to create a streamlined copy of the work that contains only the data necessary to run the artwork. A synthetic disk image could also combine data from multiple sources to create an exhibition copy tailored to particular hardware or a particular use case.

A disk image can also act as a “snapshot in time”: documentation of a digital volume’s content and functionality at a particular moment.
For example, in the case of a softwarebased artwork that generates new files on the hard drive, creating a disk image before and after the artwork is in use allows one to note the change in the contents of the drive over time at a very granular level. Digital forensics tools such as fiwalk allow the user to create an inventory of every file and its checksum (see Chapter 13) that are contained within a disk image. These inventories can then be compared through automation in order to arrive at a succinct list of differences between the two disk images.
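Such a before-and-after comparison can be scripted. The sketch below is a plain-Python stand-in for fiwalk's DFXML output (the function names are our own): it builds a path-to-checksum manifest from a mounted or extracted file tree and reports what changed between two snapshots.

```python
import hashlib
from pathlib import Path

def manifest(root: Path) -> dict[str, str]:
    """Map each file's path (relative to root) to its MD5 checksum."""
    entries = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            entries[str(path.relative_to(root))] = hashlib.md5(path.read_bytes()).hexdigest()
    return entries

def compare(before: dict[str, str], after: dict[str, str]) -> dict[str, list[str]]:
    """Summarize added, removed, and modified files between two manifests."""
    return {
        "added": sorted(set(after) - set(before)),
        "removed": sorted(set(before) - set(after)),
        "modified": sorted(f for f in set(before) & set(after) if before[f] != after[f]),
    }
```

In practice the two manifests would be generated from disk images captured before and after exhibition, and the "modified" list would point the conservator directly to files the artwork rewrote.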

14.2.3 Disk Image as a Backup for Exhibition

A disk image may be used as a bootable backup copy for exhibition. Often, consumer backup software like Apple's Time Machine can perform this task, but in certain cases, low-level system files, which are excluded from Time Machine backups, may be important to an artwork's functionality. Using a disk image as a backup ensures that all data from the original hard drive can be restored, if necessary. This exhibition backup may allow the artwork to be reinstated quickly, preventing it from being "out of commission" during exhibition.

14.2.4 Disk Image as a Deliverable for Loans

An artwork's disk image could be loaned to another institution and used by the receiving institution to run the artwork, or access could be granted to an emulator or virtual machine running the image on the internet. While these are emerging practices at the time of writing, recent case studies suggest they could be very promising for the loan and exhibition of software-based artworks. For example, Olia Lialina's My Boyfriend Came Back From the War (1996) was loaned for exhibition in 2016 as a disk image that was specifically prepared for an emulator (Espenschied 2017). This ensured the artwork's functionality within a specific computing environment, configured and prepared by the loaning institution, mitigating the challenges of remote configuration and remote troubleshooting.

14.3 What to Disk-Image

While disk imaging can be an effective method for preserving digital media, it is not a necessary or even ideal practice for all digital media carriers in an art museum collection. Disk images document a particular carrier at a very granular level, including data and files the typical computer user would not interact with directly. This exhaustive level of detail can provide valuable insight into certain types of media but could just as easily occupy valuable digital storage space (potentially several terabytes, depending on the size of the storage media and the imaging format selected) while providing little information on the artwork the disk image relates to (or, worse, implying a higher level of significance for information that is insignificant). Because a disk image is an exact copy of the contents of a physical carrier, even the empty space on the carrier is recorded; therefore, depending on the format selected, a disk image of a 500 GB hard drive with only 1 GB of data may still be 500 GB. A variety of factors should be weighed when determining whether to create and/or store a disk image. The more artifactual value the object has, that is to say, the higher the potential to glean insight about the work from inspecting the structure and function of the data storage device, the more value the disk image will hold.

• Type of Storage Device: Given that disk images can be created from a variety of sources, such as computer hard drives, DVDs, and floppy disks, physical storage format can be a useful metric when deciding when to disk-image a carrier, especially in collection-wide treatments or surveys. Floppy disks and CD-ROMs lend themselves well to disk imaging, as the resulting disk image is relatively small. Artist-provided computers should typically be disk-imaged, as the computer often carries a significant amount of evidentiary value regarding software dependencies and other technical information. However, format alone should not be seen as universally determinative.
• Medium Type: Artworks that are made up of a corpus of files, such as software-based artworks, benefit from disk imaging in that the entire file hierarchy and organizational structure are preserved, whereas storage media that contain single-channel video works are less likely to require disk imaging, as there are typically fewer files involved, and software dependencies (if any) for such works are simpler to ascertain.
• Functionality: Evidentiary and reuse value should always be considered when evaluating whether to create and/or preserve a disk image. For example, an artist-provided computer may seem like an obvious candidate for disk imaging, but if the computer is simply provided as a robust media player for a single-channel video artwork, a disk image of that computer will take up a significant amount of storage space with little useful benefit to conservation. Conversely, a DVD of an exhibition copy for the same video artwork could be seen as a mere carrier of a video signal, but if the artist created an interactive menu design, that functionality would be lost if the DVD were not disk-imaged.
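For collection-wide surveys, these criteria can be encoded as a rough triage aid. The sketch below is purely illustrative: the categories, field names, and decision rules are our own invention, not an established standard, and any real triage would remain a human judgment.

```python
from dataclasses import dataclass

@dataclass
class Carrier:
    kind: str                        # e.g. "floppy", "cd-rom", "computer", "dvd"
    artwork_type: str                # e.g. "software-based", "single-channel video"
    has_unique_functionality: bool   # e.g. an interactive DVD menu authored by the artist

def recommend_imaging(c: Carrier) -> bool:
    """Rough triage: small removable media and artist-provided computers are
    imaged by default; mere carriers of a video signal are imaged only if
    they carry functionality beyond the signal itself."""
    if c.kind in ("floppy", "cd-rom"):   # resulting images are small and cheap to keep
        return True
    if c.kind == "computer":             # usually high evidentiary value
        return c.artwork_type != "single-channel video" or c.has_unique_functionality
    return c.artwork_type == "software-based" or c.has_unique_functionality
```

Run against the examples in the text, a computer serving only as a video player is skipped, while a DVD with an artist-designed menu is flagged for imaging.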

14.4 Necessary Hardware, Software, and Tools for Disk Imaging

This section presents an overview of the hardware, software, and tools used to create disk images. Proper use of these tools ensures the creation of a bit-for-bit copy of the original storage media without altering them in any way during the disk imaging process. The following minimum pieces of equipment and software will be discussed in detail:

• ESD mat and ESD-safe tools, such as screwdrivers and spudgers, to disconnect and extract hard drives from artist-provided computers
• Appropriate drives for floppy disks, optical discs, and data tapes, if present in a collection
• Write blockers/forensic bridges to attach the extracted hard drives to the imaging computer, with cables and connectors that suit the storage devices to be imaged
• Disk imaging workstation (a macOS, Linux, or Windows computer with several terabytes of storage)
• Software for creating disk images
• Software for analysis of disk images

Figure 14.1 Disk structure showing a track (A), a geometrical sector (B), a track sector (C), and a cluster of sectors (D). The term "sector" in the context of computer engineering usually refers to (C). Source: Public domain, https://commons.wikimedia.org/wiki/File:Disk-structure2.svg

The age of the items in a collection and the storage media formats present will affect the equipment required to create disk images. Some legacy or proprietary storage formats require specialized software and hardware to be disk-imaged. For example, the KryoFlux device has become commonplace in libraries and archives for imaging floppy disks (Allen et al. n.d.), and a Zip drive would be required to create images of Iomega Zip disks. However, many of the principles of disk imaging remain the same regardless of carrier type. Disk imaging software will typically report progress in terms of the physical architecture of a storage device at the sector level. A sector is a discrete amount of data stored in physical proximity on the hard drive, floppy disk, or optical disc itself. A common metric for evaluating the completeness of a disk image is the number of accurately copied sectors. An inaccurately read sector is often referred to as a "bad sector" and can connote either a damaged area on the physical carrier or an unsuccessful read by the imaging device.

Much of the hardware and software involved in disk imaging was originally designed to recover and analyze evidence in criminal and civil investigations, so the assumed terminology and use cases may not be familiar to conservators. This category of hardware and software is marketed primarily by US-based companies that serve law enforcement agencies, which has raised some ethical questions in the cultural heritage community about whether to support businesses that are reliant on police departments and criminal justice systems (Arroyo-Ramírez et al. 2018; Colloton et al. 2019a). Since a disk image creates a complete copy of all of the data on a storage device (including data that the user may have thought they deleted), conservators may wish to practice informed consent by using an acquisition agreement that specifies that hard drives, floppy disks, or optical discs may be imaged, including the fact that deleted files on these devices may be retrieved through the process [Farrell 2012; BitCurator Consortium (BCC) (Website) n.d.].
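The sector arithmetic behind such completeness reports is simple. A hypothetical helper (not tied to any particular imaging tool, and assuming the common 512-byte sector size as a default) might check an imaging log like this:

```python
import math

def expected_sectors(capacity_bytes: int, sector_size: int = 512) -> int:
    """Number of sectors an imaging tool should report for a device."""
    return math.ceil(capacity_bytes / sector_size)

def image_is_complete(capacity_bytes: int, sectors_read: int, bad_sectors: int,
                      sector_size: int = 512) -> bool:
    """A disk image is complete only if every sector was read successfully."""
    return (sectors_read == expected_sectors(capacity_bytes, sector_size)
            and bad_sectors == 0)
```

Comparing the sector count reported by the software against the device's stated capacity is a quick sanity check that the image covers the whole carrier.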

14.4.1 Preventing Damage to Hardware or Data

Creating a disk image involves some risks, including damaging the physical storage media being imaged (or the computer it is housed in) or altering the data on the storage device. However, with proper precautions, there is little risk involved in creating disk images. An internal hard drive often must be removed from the computer in order to connect it to a disk imaging workstation. However, for Apple computers manufactured in 1991 or later (Wikipedia—Target Disk Mode 2018), opening the computer's case may not be necessary, as the machine can be placed in Target Disk Mode (in which the computer acts as an external hard drive) and connected to the workstation that will create the image (Henry 2011). Knowing how to properly disassemble computers, identify hard drives, and take precautions against electrostatic discharge (ESD) is essential for disk imaging. ESD takes place when the static electric charge stored in one object suddenly moves to another. When a person charged with static electricity touches an electronic component, they can transmit the charge and potentially damage the component. For more information about opening computers and preventing damage to electronics from ESD, see Desco (2017, 2020) and Synder (2020). Specialized electronics tools are helpful and sometimes necessary to open computers and extract hard drives. Purchasing ESD-safe screwdrivers, wrenches, spudgers, and opening picks is recommended, as well as an ESD wrist band and ESD mat. This equipment is useful beyond disk imaging when working with any type of electronic equipment.

Hardware devices called "write blockers" ensure the integrity of the physical storage media during disk imaging, and using them has become standard practice among media conservators and digital archivists (Dietrich and Adelstein 2015; Colloton et al. 2019a). A write blocker is connected between the drive to be imaged and the computer or other device creating the image. The write blocker prevents any modification of the data stored on the drive. Certain floppy disk formats (such as 3½-inch and 5¼-inch), with the write-protect tab enabled on the disk, do not require a separate write blocker. Disk imaging optical discs also does not typically require write blocking (Farbowitz 2018). An assessment of the collection will help to determine which drive interfaces (the type of cable connection between the drive and the computer) are prevalent and which write blockers are needed. Forensic write blockers are often sold in kits with individual units for different types of drive interfaces. Multiport forensic bridges (a single write-blocking unit with many types of ports) can also be purchased. The most common drive interfaces across the collections the authors have worked on are USB, SATA, PATA/IDE, and FireWire. Many write blocker kits come with the required cables, but additional adapters and cables may also be needed. In the authors' experience, the most common adapters are 3½-inch to 2½-inch PATA/IDE adapters (which convert a laptop hard drive interface into a desktop interface) and FireWire to Thunderbolt 2 adapters.

14.4.2 Disk Imaging Workstation

An ideal disk imaging setup within a media lab will have a computer workstation dedicated to creating, analyzing, and working with disk images. Having a dedicated workstation means that when creating the disk image, no other software is running on the computer. Additional software running in the background may overtax the computer's resources and cause it to freeze or crash during imaging. A freeze or crash may produce a corrupted disk image file, or the imaging process may fail altogether. If either of these situations occurs, one will need to start the imaging process over again. Creating images can take several hours, depending on the size of the hard drive; therefore, the workstation requires a significant amount of uninterrupted time. In some cases, the software may need to run overnight. A flat surface of sufficient space is also needed to position hard drives and write blockers near the workstation. Hard drives should always be placed on a flat surface when running.

Standard desktops or laptops running Windows, macOS, or GNU/Linux will often suffice for creating a disk imaging workstation (Martin 2017; Olsen and Woods 2018). Conservators may also choose to run the BitCurator environment on their workstation [BitCurator Consortium (BCC) (Website) n.d.]. BitCurator is a suite of open-source disk imaging and digital forensics software developed by a consortium of libraries, archives, and museums within the Ubuntu Linux operating system. Due to the large size of disk image files, a hard drive of several terabytes is recommended. For institutions desiring a turnkey solution, specialized all-in-one units used by forensic investigators can also be purchased, such as a Forensic Recovery of Evidence Device, or FRED (Digital Intelligence n.d.), though these units are more expensive. FRED units have built-in write blockers for many kinds of media and preinstalled software for disk imaging and forensic analysis. Purchasing a portable standalone forensic imaging unit, such as the Tableau TX1 or Wiebetech Ditto, is also an option. These units combine the functionality of a write blocker and an imaging computer in one convenient package. The hard drive or computer to be disk-imaged is plugged into the unit, which then creates and stores the image. However, these units have less functionality than a computer workstation and are more expensive.

Elevated administrative ("admin") permissions are typically needed for installing disk imaging software or creating disk images. On computers running the Windows operating system, the person creating disk images needs administrative privileges on the workstation. On macOS and GNU/Linux systems, this level of access may also be called "super user" privileges. Without elevated privileges on the workstation, one may be unable to disk-image. For computer workstations that fall under the purview of an institutional IT department, staff may need to request that this level of elevated access be granted and explain their reasoning for doing so.

Figure 14.2 For write-blocked access, an external hard drive is connected to a USB 3.0 forensic bridge, which in turn is connected to a computer. Photo: Amy Brost

14.4.3 Software Choices for Creating Images

There are many excellent choices for disk imaging software available at little to no cost. Most of this software performs the same functions and can create and read the same group of file formats. Historically, data recovery and cloning programs were the first programs employed to conduct disk imaging activities: for example, dd (Thompson and Various 1974–2021) and GNU ddrescue (Diaz 2020) for data recovery, as well as Ghost (General Hardware Oriented System Transfer) for cloning and backing up disks, which is now included as a supplemental utility in current FRED workstations. The software used by collections staff at art museums and broadly adopted by the cultural heritage community at the time of writing is outlined in Table 14.1. GNU ddrescue and ISOBuster are unique among the imaging programs discussed in that they offer the possibility of recovering data from damaged storage media. Each program allows the user to retry imaging in specific areas of the disk where it was initially unsuccessful and to fill in gaps in the image file. ISOBuster is the most comprehensive imaging program for optical discs, as it can read a wide variety of disc types and file systems. In a blog post, Eddy Colloton described tests he performed using five different disk imaging tools, including Guymager (Voncken 2021), FTK Imager (AccessData 2012), and ddrescue (Diaz 2020), concluding that all of the applications yielded similar results (Colloton 2019). The only differences observed occurred in the extent and format of compression of the disk image and in the contents of the sidecar text files (metadata) that provided a record of the imaging process. Ultimately, software choices for disk imaging may depend on weighing several variables, including access to different kinds of equipment and operating systems and an individual's level of comfort with a graphical user interface (GUI) versus a command-line interface.
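Equivalence tests like Colloton's reduce to a checksum comparison: once any forensic wrapper is exported to raw, two complete images of the same source should be bit-identical. A minimal sketch (the file paths and function names are hypothetical):

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1024 * 1024) -> str:
    """Stream a (potentially very large) disk image file through SHA-256."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def images_match(image_a: Path, image_b: Path) -> bool:
    """Two raw images of the same source device should hash identically."""
    return sha256_of(image_a) == sha256_of(image_b)
```

Streaming in chunks matters here: reading a multi-hundred-gigabyte image into memory at once would exhaust the workstation's RAM.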


Table 14.1 Software used by collections staff at art museums and broadly adopted by the cultural heritage community (all five software packages are listed in the bibliography for further information)

Guymager
• Intended use cases: Disk imaging hard drives, flash drives, or floppy disks
• Operating system: GNU/Linux
• Output file formats: EWF, Raw, AFF3
• Reporting: Produces the most thorough report on the imaging process, in a sidecar text file with the .info extension
• Interface: Graphical user interface
• Open source/proprietary: Open-source (installed by default in the BitCurator environment)
• Developer: Guy Voncken
• Cost (as of February 2021): Free

FTK Imager
• Intended use cases: Disk imaging hard drives, flash drives, floppy disks, or optical media
• Operating system: Microsoft Windows, GNU/Linux, macOS
• Output file formats: EWF, Raw, AFF3
• Reporting: Produces a thorough report on the imaging process in a sidecar text file with the .txt extension; can also create a .csv file listing the contents of the storage media
• Interface: Graphical user interface (Windows only), command line (GNU/Linux and macOS only)
• Open source/proprietary: Proprietary (has not been updated since 2012)
• Developer: AccessData
• Cost (as of February 2021): Free

GNU ddrescue
• Intended use cases: Recovering data from damaged hard drives, floppy disks, or optical discs
• Operating system: GNU/Linux, macOS
• Output file formats: Raw
• Reporting: Produces a brief report on the success of data recovery as a sidecar text file with the .txt extension
• Interface: Command line
• Open source/proprietary: Open-source (installed by default in the BitCurator environment)
• Developer: Antonio Diaz
• Cost (as of February 2021): Free

ISOBuster
• Intended use cases: Disk imaging optical media; recovering data from damaged media
• Operating system: Microsoft Windows
• Output file formats: Raw
• Reporting: Offers options for customized reports on the imaging process or the contents of storage media
• Interface: Graphical user interface
• Open source/proprietary: Proprietary
• Developer: Peter van Hove
• Cost (as of February 2021): US$59.95

dd
• Intended use cases: Disk imaging hard drives, flash drives, floppy disks, or optical media
• Operating system: GNU/Linux, macOS
• Output file formats: Raw
• Reporting: Only produces brief command-line output on the results of imaging; does not produce a sidecar file
• Interface: Command line
• Open source/proprietary: Open-source (installed by default on all GNU/Linux and macOS systems)
• Developer: Ken Thompson, AT&T Bell Laboratories
• Cost (as of February 2021): Free


14.5 Disk Imaging Formats and Selection Criteria

The software used for creating disk images allows for choices between different file formats. Disk image formats have been created for different communities with different use cases, such as the digital forensics community's need to establish a clear chain of custody. These more complex forensic formats are unnecessary for simple data recovery, and in this way, different formats serve different users' specific needs. Forensic disk imaging formats, such as EWF or AFF3, use lossless compression to produce a smaller file, while raw formats are uncompressed and produce a file that is the same size as the total capacity of the media being imaged. Fortunately, most disk image formats can be transcoded between one another with no loss of data. However, conservators must still evaluate the advantages and disadvantages of file formats if they are used for long-term preservation. Keep in mind that there are potential use cases for disk images as temporary storage (DANNNG! n.d.), in which case file format selection would be influenced by a different rubric. For example, if a disk image will simply be held in staging storage while a selection of files within the image is extracted for retention, the disk image format's hardware and software compatibility with the processing workflow would weigh heavily on the decision, while the sustainability of the format would not. This section will discuss established factors to consider when selecting a file format for disk images intended for long-term preservation and compare the current formats used by museums, archives, and libraries (Arms and Fleischhauer 2005; Goethals and AVPreserve 2016; Library of Congress—Sustainability of Digital Formats 2017; Colloton et al. 2019a). All disk image file formats fall into two general categories: raw disk images (sometimes also called flat images) and forensic disk images.
Raw disk images are simply the binary content of a piece of storage media copied into a file with no additional structure or metadata, whereas forensic images have a standard structure and embed additional metadata about the image in the header of the file. Among several forensic file formats, EWF (the Expert Witness Format, sometimes also called E01) disk images have seen high adoption by cultural heritage institutions. Raw images have seen high adoption as well (Colloton et al. 2019a; Farbowitz 2018; Goethals and AVPreserve 2016).
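Because forensic formats carry a standard structure, they can often be told apart from raw images by their opening bytes: an EWF/E01 segment file begins with the eight-byte signature 45 56 46 09 0D 0A FF 00. The sketch below is a simplified illustration of that idea, not a substitute for a format identification tool such as Siegfried; since raw images have no signature at all, the absence of a known magic number only suggests a raw image.

```python
EWF_SIGNATURE = b"EVF\x09\x0d\x0a\xff\x00"  # first 8 bytes of an EWF/E01 segment file

def classify_image(first_bytes: bytes) -> str:
    """Guess a disk image's family from its leading bytes."""
    if first_bytes.startswith(EWF_SIGNATURE):
        return "forensic (EWF)"
    return "possibly raw"
```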

14.5.1 Sustainability

A conservator must consider the predicted sustainability of a format in order to make decisions about its use for long-term preservation. Because the developers of forensic formats are not primarily concerned with long-term preservation and may update, deprecate, or abandon certain formats, the future of a particular forensic format is unknown. Conservators must make a risk calculation and assess the likelihood that the software necessary for accessing the format will survive into the future. In order to make this assessment, conservators may wish to consult resources like the Library of Congress's Sustainability of Digital Formats database, which includes an assessment of several sustainability factors for a file format (Library of Congress 2017), and the UK National Archives' PRONOM technical registry (National Archives—PRONOM n.d.). In general, frequently used formats are likely more sustainable than formats that are not widely adopted. In addition, consider whether the format is open-source (the source code related to the format is publicly available), proprietary (the format is the intellectual property of one or several companies, and information about it is not publicly available), or semi-open (the format is technically proprietary, but open-source tools have been developed for working with it).

The software to read a given file format will inevitably become obsolete. However, as stated by Arms and Fleischhauer, "a format that is widely adopted is less likely to become obsolete rapidly, and tools for migration and emulation are more likely to emerge from industry without specific investment by archival institutions" (Arms and Fleischhauer 2005, 3). Migration and emulation as preservation and access strategies are further discussed in Section 22.4. Conversely, if the format sees little use, there may not be sufficient interest in developing or maintaining software to read it. Furthermore, "some digital content formats have embedded capabilities to restrict use in order to protect the intellectual property" (Arms and Fleischhauer 2005, 4). These technical restrictions (sometimes called "technical protection measures") and a format's patents and restrictive licenses may inhibit the future development of tools that conservators could use to read the format.

14.5.2 Compression

Forensic disk imaging formats (such as EWF) use lossless data compression to create a smaller file, while raw formats are uncompressed. Unlike the lossy compression used in video or image file formats (such as H.264 or JPEG), the compression algorithms used in disk imaging formats are lossless. A lossless compression algorithm finds efficient ways of representing information without compromising accuracy, for example, by abbreviating redundant data or empty space to produce a smaller file. With lossless compression, no original information is discarded, and when decompressed, the disk image is exactly the same bit for bit as the original. However, file formats that use compression present sustainability concerns, because compressed formats depend on the existence of specialized software that can decompress the content. Just as a codec must be installed in a video player to read the video and audio encoded in a compressed format, a piece of software must be capable of properly decompressing the data to read a disk image that uses compression. In addition, using compression "inhibits transparency" (Library of Congress—Sustainability of Digital Formats 2017) of the data and file format, meaning that without proper documentation, future programmers may not be able to understand the compression scheme (or determine which one was used) and may not be able to decode the file. A compressed disk image file will take up significantly less space than an uncompressed one, perhaps up to ten times less, depending on how much data is stored on the hard drive. For a raw disk image, the size of the image is the size of the entire storage media, no matter how much data is stored on it.
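The effect is easy to demonstrate with any general-purpose lossless compressor. Here zlib merely stands in for the schemes used inside formats like EWF, which differ in detail, but the principle is the same: empty space compresses away almost entirely, and decompression restores the original bit for bit.

```python
import zlib

# A mock "disk" that is mostly empty space: 1 MB of zeros plus a little real data.
disk_contents = b"\x00" * 1_000_000 + b"artwork files live here"

compressed = zlib.compress(disk_contents)

# Lossless: decompression restores the original exactly.
assert zlib.decompress(compressed) == disk_contents

# The compressed copy is a tiny fraction of the raw size.
print(len(disk_contents), "->", len(compressed))
```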

14.5.3 Metadata

Conservators can also evaluate a disk image format in terms of its potential for storing metadata. Forensic formats allow for metadata embedded within the disk image file itself. A conservator could embed descriptive information about which artwork the disk image relates to, about the computer the disk image was created from, or about who created the disk image and when. Recommended metadata is discussed in Section 14.7. Embedded metadata establishes a chain of custody for the file and provides an extra layer of protection: if the file is ever disconnected from a cataloging system, one need only look at the file itself to get information about it. Forensic disk image formats automatically embed preservation metadata such as a checksum for the entire image and/or for each block of data. The data can be verified against these checksums to ensure that no block of data has changed. If corruption of the file ever occurs, one can pinpoint exactly where in the file it occurred, facilitating data recovery. Raw disk images, in contrast, are simply a copy of the raw data stored on the physical media and do not allow for embedded metadata of any kind.
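The per-block verification works on a simple principle, sketched below with MD5 over fixed-size blocks (forensic formats actually embed CRCs over their own chunk sizes; the 4 KB block size here is arbitrary and illustrative):

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative; EWF embeds CRCs over its own chunk size

def block_checksums(data: bytes) -> list[str]:
    """MD5 digest for each fixed-size block of a disk image."""
    return [hashlib.md5(data[i:i + BLOCK_SIZE]).hexdigest()
            for i in range(0, len(data), BLOCK_SIZE)]

def corrupted_blocks(original: list[str], current: list[str]) -> list[int]:
    """Indices of blocks whose checksum no longer matches the baseline."""
    return [i for i, (a, b) in enumerate(zip(original, current)) if a != b]
```

A whole-image checksum can only say "something changed"; the block-level list above says where, which is what makes targeted data recovery possible.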


14.6 Review of Formats

Various aspects of disk image formats are compared in Table 14.2. This chart was adapted from one used in a previous article by the authors (Colloton et al. 2019a).

14.7 Disk Image Metadata The value of a disk image is defined through its context. Without robust information about the original source device, the disk image will be far more difficult to repurpose for research or for virtualization or emulation. Without technical specifics about the disk imaging process, the completeness of the disk image could be called into question. Without an accurate inventory of the contents of the disk image, a granular description of the disk image is not possible. Thankfully, much of this contextual information can be generated in an automated manner and stored as metadata. When disk imaging a computer hard drive, the computer can be leveraged to describe itself. Most common operating systems (Windows, macOS, and GNU/Linux) offer a tool (msinfo32.exe, System Information, and hardinfo, respectively) that creates a report documenting the computer and its components. The resulting report describes all of the hardware, firmware, and installed software that make up the computer’s current environment, including the computer’s “peripherals,” such as connected keyboards, cameras, and display devices. If the computer is part of an artwork that relies on such peripherals, then it is ideal to create the report when the work is installed in the gallery and connected to all other components of the artwork. Disk imaging software reports the software’s progress and results as a matter of course usually as a sidecar text file. While different software applications perform this task differently, and with differing levels of granularity, a dearth of automated metadata generation can often be supplemented relatively easily through a manual process or by incorporating other tools. Key information about the disk imaging process includes the following: • Date and time the image was created. • Notable failures, such as the following: • • • •

Documentation of all equipment and software used to create the image, including the following: • • • •



Number of bad sectors. Location of any bad sectors. Any other errors in the imaging process.

Host computer/machine. Host operating system. Disk imaging software (name and version number). Write blocker (model number and firmware version).

Specifications of the device being imaged, including the following: • • • • •

Model number. Serial number. Firmware version. Drive capacity. Physical sector size. 215

Table 14.2 Disk image for mat compar ison chart Format and Extension

Extensions

Type

• Forensic, EWF (Expert .e01, or Witness incrementing propr ietar y for mat For mat) file extensions, i.e. .001, .002, .003 and so on

Tools and Utilities for Media Reading and Writing Format

Sustainability

• FTK Imager • Floppy disks, • (GUI and CLI), optical media, Guymager, libewf exter nal hard (ewfacquire) dr ives and flash SMART, Sleuth dr ives, computer Kit, X-Ways hard dr ives

• •

Raw disk image

• Raw .dd,. raw, .01, or any arbitrar y file extension

• dd, dc3dd, dcfldd, • Floppy disks, FTK Imager, optical media, ProDiscover, exter nal GNU ddrescue hard dr ives, computers





Metadata

• EWF is • Appends MD5 hash compressible and of image as a footer searchable. in the file. • Support for • Embeds cyclic splitting files up redundancy checks to 2 GB. (CRCs) for each block of data within the image file. • Allows users to enter metadata within a limited number of free text fields, which is then embedded within the image file.

• Lack of • Raw disk images compression, contain no which means the additional metadata files use more and rely on other storage space than prog rams to identify a compressed any file system(s) for mat. contained within. • A raw disk image • No support for of a 500 GB hard embedding metadata dr ive with only in the image. 1 GB of data will • No cyclic still be 500 GB. redundancy checks (CRCs) of data blocks dur ing imag ing.

Eddy Colloton et al.

216



Sustainability
• EWF: proprietary file format, somewhat closed, though documentation is widely available and open-source tools are available to work with the format; strong community of users and support; it is possible to export a raw image from an EWF image; uses compression, which could prevent access if software to decompress it isn’t available.
• Raw: uncompressed, so no decompression is needed to read the data; no additional wrapping or encoding, which may make the format more sustainable for long-term preservation.

AFF, .aff (Advanced Forensic Format)
• Type: forensic, open-source format.
• Imaging software: Guymager, Kryoflux, AFFLib.
• Compatible media: floppy disks, external hard drives, computers.
• Compression: allows for lossless compression.
• Metadata: allows for embedded metadata and can store as many fields as desired.
• Sustainability: notable for being the only open-source forensic format; AFF versions 1–3 have been deprecated and placed in “maintenance mode” by the developers [Hellewell (2014) 2020]; AFF version 4 is in the works, but is still in development as of 2021 and not production-ready (AFF4 n.d.).

ISO, .iso
• Type: raw.
• Imaging software: FTK Imager, DVDisaster, Carbon Copy Cloner, IsoBuster, ImgBurn.
• Compatible media: optical media.
• There is no single comprehensive specification for raw disk images of optical media; the naming convention for the variant formats called ISO images arose because of the ISO 9660 file system used for CD-ROMs. Not all disk image files with the .iso extension contain the ISO 9660 file system, and not all .iso files contain the contents of optical media.
• Compression: uncompressed, so no decompression is needed to read the data; however, given that the maximum storage capacity of CD-ROMs and DVDs is small relative to uncompressed high-resolution videos, disk images of optical media take up relatively little space in contemporary storage systems.
• Metadata: no embedded metadata.
• Sustainability: no single specification for all variants of ISO images; uses the file structure of the internal optical disc file system (CDs and DVD: ISO 9660, UDF); broad compatibility with operating systems and software.


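The per-block integrity checks described for EWF above can be illustrated in miniature. The sketch below is not the EWF format itself; it simply shows, using Python’s `zlib.crc32`, how storing one CRC per block lets a steward locate which part of an image has changed, rather than only knowing that a whole-file checksum no longer matches. All names and the block size are illustrative.

```python
# Illustration of per-block integrity checking, conceptually similar to the
# CRCs that EWF embeds for each block of data. This is NOT the EWF format,
# just a sketch of the idea.
import zlib

BLOCK_SIZE = 64 * 1024  # 64 KiB blocks; a real format's chunk size differs


def block_crcs(data, block_size=BLOCK_SIZE):
    """Return a list of (offset, crc32) pairs covering `data`."""
    return [
        (offset, zlib.crc32(data[offset:offset + block_size]))
        for offset in range(0, len(data), block_size)
    ]


def find_damaged_blocks(original_crcs, data, block_size=BLOCK_SIZE):
    """Compare stored CRCs against re-read data; return offsets that differ."""
    current = dict(block_crcs(data, block_size))
    return [off for off, crc in original_crcs if current.get(off) != crc]
```

With a whole-file checksum, a single flipped bit and a catastrophic loss look identical; with per-block records, only the affected block offsets are reported, which narrows down where re-imaging or repair is needed.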

File system information:
• File system type (for example, NTFS, HFS+, FAT).
• Logical block size.
• Block/sector count.

Fixity information:
• MD5 and SHA checksums of the disk image file.

Embedded metadata fields.

Finally, metadata can be gathered from the completed disk image itself to describe its size, fixity information such as a checksum, and contents. Given that disk images can often be hundreds of gigabytes, the checksum of the entire disk image only provides a broad depiction of the fixity of the data within the image. For example, a single bit flipping in a 500 GB disk image and a catastrophic loss of 499 GB both have the same result: a changed checksum. Therefore, a granular description of the contents of a disk image is not only a matter of cataloging or accessibility; it has real implications for the preservation of the disk image as well. A variety of tools exist to create a manifest of all of the files within a disk image, noting fixity as well as other disk image-specific metadata, such as the location of files on the storage media. Digital forensics companies have created powerful tools with user-friendly GUI design, such as AccessData’s Forensic Toolkit (FTK) or Basis Technology’s Autopsy/Sleuth Kit, while standardized output into Digital Forensics XML (DFXML) (Garfinkel 2012) is available through the free open-source software fiwalk. The name is short for “file and inode walk,” an inode, or “index node,” being a data structure for Unix-style file systems that stores the attributes and location of a file (Cormany 2008). Fiwalk traverses the file system of a disk image to collect granular technical metadata on every file within the image and stores it in a human- and machine-readable format. This wealth of information can support a thorough analysis of a disk image, but it can also be overwhelming. While collecting an abundance of metadata has obvious benefits, it also comes with a responsibility to eventually parse and analyze such data in order to render it valuable to future users.
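As a concrete illustration of the kind of per-file manifest such tools produce, the sketch below parses a minimal, hand-written DFXML-like fragment using only Python’s standard library. The element names (`fileobject`, `filename`, `filesize`, `hashdigest`) follow Garfinkel’s published DFXML schema, but the sample document, namespace handling, and function names here are an illustrative assumption, not real fiwalk output.

```python
# Sketch: building a simple file manifest from DFXML-style output.
import xml.etree.ElementTree as ET

SAMPLE_DFXML = """\
<dfxml xmlns='http://www.forensicswiki.org/wiki/Category:Digital_Forensics_XML'>
  <volume offset='0'>
    <fileobject>
      <filename>artwork/loop.mov</filename>
      <filesize>1048576</filesize>
      <hashdigest type='md5'>d41d8cd98f00b204e9800998ecf8427e</hashdigest>
    </fileobject>
  </volume>
</dfxml>
"""

NS = "{http://www.forensicswiki.org/wiki/Category:Digital_Forensics_XML}"


def manifest_from_dfxml(xml_text):
    """Return a list of (filename, size, md5) tuples from a DFXML document."""
    root = ET.fromstring(xml_text)
    rows = []
    for fo in root.iter(NS + "fileobject"):
        name = fo.findtext(NS + "filename")
        size = int(fo.findtext(NS + "filesize", default="0"))
        md5 = None
        for digest in fo.findall(NS + "hashdigest"):
            if digest.get("type", "").lower() == "md5":
                md5 = digest.text
        rows.append((name, size, md5))
    return rows


if __name__ == "__main__":
    for name, size, md5 in manifest_from_dfxml(SAMPLE_DFXML):
        print(f"{md5}  {size:>10}  {name}")
```

A manifest of this shape, stored alongside the disk image, is what makes granular fixity checking possible later: each file’s recorded checksum can be compared individually instead of relying on the single checksum of the whole image.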

14.8 Sample Disk Imaging Workflow

In order to better visualize the disk imaging process, the approach can be conceptualized into three primary stages: pre-imaging, imaging, and post-imaging. This sample workflow is not intended as a one-size-fits-all methodology. Describing these phases is intended to act as a primer for planning more concrete and precise workflows incorporating more specific needs.

1. Pre-imaging phase
   a. Collect information about the artwork, including photography of physical components, ports, and other details, before any significant intervention is implemented.
   b. Inventory all components with unique identifiers, including each piece of physical storage media to be imaged.
   c. Create components for physical and digital objects in a Collection Management System or catalog and clearly establish the relationship between disk image and source media.
   d. Document the workflow for disk imaging, including hardware and software used.

Disk Imaging as a Backup Tool

2. Disk imaging phase
   a. Augment documentation of a disk image through generating automated metadata during the creation of the disk image.
   b. Identify and label derivatives created during the disk imaging process.
   c. When imaging a hard drive or flash drive of a computer, generate system reports on the hardware.
3. Post-imaging phase
   a. Occurs after the image has been created and can be the most technically challenging aspect of the disk imaging process, in which quality assurance and testing begin.
   b. Quality assurance may include the following:
      i. Checking for bad sectors during imaging (which most disk imaging software records in sidecar info reports) and determining if the storage media needs to be reimaged.
      ii. Verification of the image by rescanning the source media after imaging and comparing its contents with the disk image file. To simplify, one can compare the checksum of the disk image file with a checksum generated for the entire original source media. At the time of this writing, Guymager software can optionally perform this verification step automatically after creating an image.
      iii. Mounting the image to ensure it is usable, generating metadata to describe the image’s contents, and confirming the functionality of any operating systems or executables within the disk image (Colloton et al. 2019a).

Ensuring the functionality of a disk image through quality assurance testing can entail troubleshooting using complex tools in order to realize virtualization strategies (if needed); it can mean diagnosing the cause of bad sectors and wrangling a surfeit of technical metadata. While complicated and time-consuming, performing a high level of quality assurance can determine the potential reuse value of the disk image. Emulating or virtualizing the operating system from the disk image of a computer hard drive will present the files and software from that computer in their original (virtual) environment.
The process will identify potential stumbling blocks, such as the need for administrator passwords, underlying software dependencies, required CPU capabilities, and other pitfalls of virtualization. Navigating these hurdles will deepen the institution’s documented understanding of the original computer and its significant properties while simultaneously increasing the viability of presenting the contents of the disk image without the original hardware.

14.9 The Evolving Practice of Disk Imaging

The landscape of disk imaging and emulation is advancing quickly, and significant developments in creating, manipulating, and using disk images within emulators or virtual machines have occurred outside of the conservation field. Connecting with colleagues who work in digital preservation within libraries, archives, and scientific laboratories, and even with individuals outside of memory institutions, will be important for remaining informed on current practices.

• The BitCurator Consortium maintains the BitCurator environment and hosts an annual user forum with talks on disk imaging, digital forensics, and digital preservation for libraries, archives, and museums [BitCurator Consortium (BCC) n.d.].











• EaaSI (Emulation as a Service Infrastructure) is a collaborative emulation project led by the Digital Preservation Services Team at Yale University Libraries, supported by OpenSLX, DataCurrent, PortalMedia, Educopia, and the Software Preservation Network (Software Preservation Network—EaaSI n.d.).
• In the United States, the Digital Library Federation and National Digital Stewardship Alliance host an annual conference with talks on digital preservation topics, including disk imaging.
• In Europe, Nestor is a professional network (Nestor n.d.) that promotes advancement in digital preservation.
• The MoMA Peer Forum on Disk Imaging was a gathering of professionals to discuss the topic as it relates to art conservation. Videos of talks and workflows are available on MoMA’s Media Conservation Initiative website (MoMA—Media Conservation Initiative 2017).
• The PERICLES program (Promoting and Enhancing Reuse of Information throughout the Content Lifecycle taking account of Evolving Semantics), active 2013–17, was a project that addressed the challenge of ensuring that digital content remains accessible in an environment that is subject to change (Tate—Pericles 2017).
• The Forensics Wiki (n.d.) is an excellent resource for general information about disk imaging, write blockers, and other digital forensics topics.

Disk imaging provides an extraordinary tool for time-based media conservators to create an exact copy of a piece of physical storage media. The disk image has many possible uses, from research to exhibition or loan, without ever posing a risk to the original artist-provided components of an artwork. However, creating the image only serves as a first step in the stewardship of a time-based media artwork. The disk image file must be properly stored in a digital repository and, as with all digital assets, potentially re-evaluated or migrated in the future.

Bibliography

AccessData. “Forensic Toolkit Imager (FTK) (Software) (version Mac OS 10.5 and 10.6x Version—3.1.1). AccessData.” 2012. https://accessdata.com/products-services/forensic-toolkit-ftk/ftkimager.
AFF4. “AFF4—The Advanced Forensics File Format.” AFF4. Accessed August 13, 2020. http://www2.aff4.org/.
AIC (American Institute for Conservation). “Code of Ethics and Guidelines for Practice of the American Institute for Conservation of Historic & Artistic Works.” AIC, 1994. www.culturalheritage.org/about-conservation/code-of-ethics.
Allen, Jennifer, Elvia Arroyo-Ramirez, Kelly Bolding, Faith Charlton, Patricia Ciccone, Yvonne Eadon, Matthew Farrell et al. “Archivists Guide to the Kryoflux.” GitHub, n.d. Accessed June 10, 2021. https://github.com/archivistsguidetokryoflux/archivists-guide-to-kryoflux.
Arms, Caroline, and Carl Fleischhauer. Digital Formats: Factors for Sustainability, Functionality, and Quality. Washington, DC, 2005. https://memory.loc.gov/ammem/techdocs/digform/Formats_IST05_paper.pdf.
Arroyo-Ramírez, Elvia, Kelly Bolding, Faith Charlton, and Allison Hughes. “‘Tell Us About Your Digital Archives Workstation’: A Survey and Case Study.” Journal of Contemporary Archival Studies 5, no. 1 (December 20, 2018). https://elischolar.library.yale.edu/jcas/vol5/iss1/16.
Backblaze. “Hard Drive Stats.” Backblaze, n.d. Accessed January 20, 2021. www.backblaze.com/b2/hard-drive-test-data.html.
“BCC—Practice // Values: Situating Digital Forensics Tools in Archives (BCC Roundtable).” BCC Roundtable (BitCurator Consortium), 2020. www.youtube.com/watch?v=q7wgO6d9uu0&feature=youtu.be.
“BitCurator Consortium (BCC) (Website).” n.d. Accessed June 1, 2021. https://bitcuratorconsortium.org/.
Cirella, David. “BitCurator Disk Image Access (Software) (version 0.8.12).” 2021. https://confluence.educopia.org/display/BC/Access+and+Export+Files+from+Disk+Images.


Colloton, Eddy. “Disk Imaging Drag Race.” Eddycolloton.com (blog), Personal Blog, October 21, 2019. http://eddycolloton.com/blog/2019/10/21/disk-imaging-drag-race.
Colloton, Eddy, Jonathan Farbowitz, Flaminia Fortunato, and Caroline Gil. “Towards Best Practices in Disk Imaging: A Cross-Institutional Approach.” The Electronic Media Review, Volume 6: 2019–2020 (2019a). https://resources.culturalheritage.org/emg-review/volume-6-2019-2020/colloton/.
———. “Towards Best Practices in Disk Imaging: A Cross-Institutional Approach.” Electronic Media Group Specialty Session presented at the 47th Annual Meeting of the American Institute of Conservation, Uncasville, CT, May 16, 2019b. https://sched.co/IujY.
Colloton, Eddy, and Kate Moomaw. “Rewind, Pause, Playback: Addressing a Media Conservation Backlog at the Denver Art Museum.” The Electronic Media Review, Volume 5: 2017–2018 (2018). http://resources.culturalheritage.org/emg-review/volume-5-2017-2018/colloton/.
Cormany, Adam. “It’s All About the Inode.” June 10, 2008. https://developer.ibm.com/technologies/systems/articles/au-speakingunix14/.
DANNNG! “Welcome to DANNNG!” DANNNG! Accessed October 16, 2020. https://dannng.github.io/.
Desco Industries. “Desco—06822 ESD Awareness Booklet, Spanish.” 2020. https://desco.descoindustries.com/DescoCatalog/Education-Awareness-Accessories/ESD-Training/06822/.
———. “ESD Awareness Booklet.” 2017. www.baskiville.co.nz/SPNet6/UserFTP/Desco/DES06821/Static%20control%20book.%20ESD%20basic.pdf.
Diaz, Antonio. “GNU Ddrescue (Software) (version 1.25). C++.” 2020. www.gnu.org/software/ddrescue/.
Dicks, Christopher. “Computer Hard Disks and Diskettes—FAQ.” CCI (Canadian Conservation Institute), May 7, 2020. www.canada.ca/en/conservation-institute/services/care-objects/electronic-media/computer-hard-disks-diskettes-faq.html.
Dietrich, Dianne, and Frank Adelstein. “Archival Science, Digital Forensics, and New Media Art.” Digital Investigation, the Proceedings of the Fifteenth Annual DFRWS Conference 14 (August 1, 2015): S137–45. https://doi.org/10.1016/j.diin.2015.05.004.
Digital Intelligence. “FRED Forensic Workstations.” Digital Intelligence, n.d. Accessed February 7, 2021. https://digitalintelligence.com/products/fred/.
Espenschied, Dragan. “Emulation and Access.” Museum of Modern Art (MoMA), December 8, 2017. https://vimeo.com/278042613/a78ee5e46b.
Farbowitz, Jonathan. “Archiving Computer-Based Artworks.” The Electronic Media Review, Volume 5: 2017–2018 (2018). http://resources.culturalheritage.org/emg-review/volume-5-2017-2018/farbowitz/.
Farrell, Matthew J. “Born-Digital Objects in the Deeds of Gift of Collecting Repositories: A Latent Content Analysis.” Master’s Thesis, University of North Carolina, Chapel Hill, 2012. https://cdr.lib.unc.edu/concern/masters_papers/sn00b218p.
“Forensics Wiki (Website).” Accessed September 1, 2020. https://forensicswiki.xyz/page/Main_Page.
Garfinkel, Simson. “Digital Forensics XML and DFXML Toolset.” Digital Investigation 8 (2012): 161–74. https://doi.org/10.1016/j.diin.2011.11.002.
Goethals, Andrea, and AVPreserve. “Harvard Library, Disk Image Content Model and Metadata Analysis.” Harvard Wiki, October 31, 2016. https://wiki.harvard.edu/confluence/display/digitalpreservation/Disk+Image+Formats.
Guggenheim Museum. “The Conserving Computer-Based Art Initiative (CCBA).” The Guggenheim Museums and Foundation, n.d. Accessed September 2, 2020. www.guggenheim.org/conservation/the-conserving-computer-based-art-initiative.
Hellewell, Phillip. “Sshock/AFFLIBv3 (Software) (version 3.7.19). C++.” 2020. https://github.com/sshock/AFFLIBv3.
Henry, Paul. “Forensically Sound Mac Acquisition in Target Mode.” SANS Digital Forensics and Incident Response (blog), February 2, 2011. www.sans.org/blog/how-to-forensically-sound-mac-acquisition-in-target-mode/.
ICON. “ICON Professional Standards.” ICON, June 25, 2014. www.icon.org.uk/resources/standards-and-ethics/icon-professional-standards.html.
International Council of Museums (ICOM). “ICOM Code of Ethics for Museums.” ICOM, 2017. https://icom.museum/en/resources/standards-guidelines/code-of-ethics/.
Kirschenbaum, Matthew G., Richard Ovenden, Gabriela Redwine, and Rachel Donahue. Digital Forensics and Born-Digital Content in Cultural Heritage Collections. CLIR Publication, no. 149. Washington, DC: Council on Library and Information Resources, 2010. www.clir.org/pubs/reports/pub149/.


Lee, Christopher, Matthew Kirschenbaum, Kam Woods, Alex Chassanoff, Porter Olsen, and Sunitha Misra. “BitCurator (Software) (version 2.2.12).” Accessed February 7, 2021. https://bitcurator.net/.
Lewis, Kate, and Ben Fino-Radin. “Preparing for Exhibition: Feng Mengbo: Long March: Restart (2008).” Presented at TechFocus III: Caring for Software-based Art, Solomon R. Guggenheim Museum, New York, September 25, 2015. http://resources.conservation-us.org/techfocus/techfocus-iii-caring-for-computer-based-art-software-tw/.
Library of Congress. “Sustainability of Digital Formats: Planning for Library of Congress Collections.” Library of Congress, January 5, 2017. www.loc.gov/preservation/digital/formats/sustain/sustain.shtml.
Lunt, Barry M. “How Long Is Long-Term Data Storage?” In Archiving 2011 Final Program and Proceedings, 29–33. Society for Imaging Science and Technology, 2011. www.imaging.org/site/PDFS/Reporter/Articles/REP26_3_4_ARCH2011_Lunt.pdf.
Martin, David. “OS X as a Forensic Platform.” SANS Institute Information Security Reading Room, 2017. www.sans.org/reading-room/whitepapers/apple/os-forensic-platform-37637.
MoMA Media Conservation Initiative. “Disk Imaging.” MoMA Media Conservation Initiative, December 2017. www.mediaconservation.io/disk-imaging.
National Archives. “PRONOM.” The National Archives, Kew, Surrey TW9 4DU, n.d. Accessed September 2, 2020. www.nationalarchives.gov.uk/PRONOM/Default.aspx.
Nestor. “Nestor (Website).” n.d. Accessed August 12, 2021. www.langzeitarchivierung.de.
Olsen, Porter, and Kam Woods. “Building a Digital Curation Workstation with BitCurator (Update).” 2018. https://bitcurator.net/2013/08/02/building-a-digital-curation-workstation-with-bitcurator-update/.
Software Preservation Network (SPN). “EaaSI Emulation-as-a-Service Infrastructure.” Software Preservation Network, n.d. Accessed June 11, 2021. www.softwarepreservationnetwork.org/emulation-as-a-service-infrastructure/.
Synder, Jeff. “Electrostatic Discharge.” iFixit. Accessed September 14, 2020. www.ifixit.com/Wiki/ESD.
Tate. “Pericles.” Tate, 2017. www.tate.org.uk/about-us/projects/pericles.
Thompson, Ken, and Various. “Dd (Software).” AT&T Bell Laboratories, 1974. https://git.savannah.gnu.org/git/coreutils.git/.
Van Hove, Peter. “ISOBuster (Software) (version 4.7).” Microsoft Windows, 2020. www.isobuster.com/.
Voncken, Guy. “Guymager (Software) (version 0.8.12).” 2021. https://guymager.sourceforge.io/.
Walsh, Tessa. “Brunnhilde (Software) (version 1.9.1).” 2020. www.bitarchivist.net/projects/brunnhilde/.
“Wikipedia—Digital Forensics.” Wikipedia, August 29, 2020. https://en.wikipedia.org/w/index.php?title=Digital_forensics&oldid=975610833.
“Wikipedia—Target Disk Mode.” Wikipedia, November 14, 2018. https://en.wikipedia.org/w/index.php?title=Target_Disk_Mode&oldid=868820299.


15 MANAGING AND STORING ARTWORK EQUIPMENT IN TIME-BASED MEDIA ART

Duncan Harvey

Editors’ Notes: Duncan Harvey is a conservator in Tate’s Time-Based Media Conservation Department, working on outgoing loans and Tate’s digital repository storage project. During his time at Tate, he has worked as a conservation technician working on displays and acquisitions. This work has included understanding the equipment requirements of procuring, testing, preparing, installing, and maintaining artworks and their equipment at the point they come into Tate’s collection, while on display, and while on loan. Over the last years, Harvey has further developed and established Tate’s systems for categorizing, handling, and storing the hardware of media artworks. Drawing from his own experience, as well as conservation literature and interviews with colleagues, Harvey offers in this chapter a systematic and practical approach to the complex task of managing artwork equipment in a collection context.

15.1 Introduction

Artwork equipment, such as monitors, speakers, projectors, or media players, is needed to install and realize time-based media (TBM) works. It may be that this equipment enters a collection alongside the artwork media, supplied as part of an acquisition, or that only instructions are supplied by the artist. Some equipment may be a unique, integral part of the artwork, whereas other equipment may be replaced with identical, similar, or entirely different equipment. The preservation and maintenance management of artwork equipment depends on the functional, conceptual, and aesthetic significance assigned to the equipment in relation to the meaning and identity of an artwork. This chapter introduces current practices and methodologies to assess and categorize equipment and to assist those tasked with caring for TBM artworks (hereafter referred to as the conservator) in making strategic decisions on its care, maintenance, and storage. We will consider how this methodology can be used to guide the practical considerations needed when establishing modes of organization within a Collection Management System (CMS) (see Chapter 12), conducting an inventory, gathering and producing documentation (see Chapter 11), acquiring equipment, storing equipment, and carrying out maintenance, servicing, and repairs (hereafter referred to as technical care).

DOI: 10.4324/9781003034865-18



This chapter is based on five years of work in Tate’s Time-Based Media Conservation Department, on experiences shared through interviews with Asti Sherring, representing the Art Gallery of New South Wales (Australia); Martina Haidvogl, formerly representing the San Francisco Museum of Modern Art (USA); Morgane Stricot, from the Center for Art and Media, Karlsruhe (Germany); Samantha Owens from Glenstone (USA); and Susanna Sääskilahti from Ateneum (Finland); and on the relevant literature as outlined in the bibliography. A specific debt is owed to Pip Laurenson’s The Management of Display Equipment in Time-based Media Installations (2004, 2005), Joanna Phillips’ Shifting Equipment Significance in Time-Based Media Art (2012), and Emanuel Lorrain’s Obsolete Equipment: A Research Project on Preserving Equipment in Multimedia Art Installations (2013).

15.2 Change and Assessing Significance

Unlike more traditional mediums, TBM artworks lack a discrete whole serving as the focal point for care. Instead, these works are comprised of interrelated and interdependent equipment and media elements, which, either owing to a variability prescribed by the artist or the inevitability of wear and technological obsolescence, warrant repair and replacement over time. Changing elements within this “dynamic system” (Laurenson 2004, 49) has the potential to significantly alter an artwork. Though all artworks face the prospect of change and loss over time, this dependence on technology accelerates the process for TBM works. The pace and sometimes-radical nature of change appear to stand in contrast to “the basic ethical principles of conservation: authenticity, minimal intervention and reversibility” (Bek 2011, 210) and seem to place upon the conservator the onus to decide whether or not to take, as phrased by the artist Bill Viola, either a “purist, original-technology-at-all-costs approach” or an “adapted/updated technology approach” (Viola 1999, 89). Each of these has obvious limitations, whether it be that, eventually, the artwork cannot be shown or that, over time, increasingly inauthentic versions are shown. Inevitably, the solution lies somewhere in between and is dependent on the characteristics of the work. Conservator Joanna Phillips calls this a “balanced approach” to decision-making, an approach that is influenced by the context in which an artwork operates: “In choosing a balanced approach, one central task of time-based media conservation becomes the determination and oversight of acceptable degrees of short-, mid- and long-term change that an artwork may undergo in response to a changing context.
This can include different display environments, ongoing technological developments, curatorial and exhibition design concepts, or a technician’s preferences.” (Phillips 2012, 141)

Laurenson suggests that aesthetic significance “relates to the look and feel of visible components and the outputs of the system” (ibid.) (for example, sound, image, and points of “viewer” interaction), historical significance “refers to the relationship of the work to the process or technology employed and the spirit in which the work was made” (ibid.), and that conceptual significance “refers to the relationship of the work to the process or technology employed and the spirit in which the work was made” (ibid.). It follows that “[t]he stronger and more specific the link [between the equipment and the work], the more vulnerable the work is to loss” (Laurenson 2004, 49). Functional significance is the least significant and is attributed to equipment that cannot be seen by the viewer or, if it can be seen, is not intended to be seen, and/or if the equipment can be swapped for another without changing how the work is experienced.

Table 15.1 contains a summarized and augmented list of questions and prompts, devised by Laurenson (Laurenson 2004, 50), designed as a starting point in the assessment of significance. Their value is dependent upon the individual conservator’s ability to expand upon them and utilize their professional discretion in processing the answers. Such questions can and should be asked at any time in the life of an artwork. However, given that acquisition and early displays of a work bring together the artist and other key stakeholders to actively engage with the conceptual and practical nature of a work (see Chapters 11, 16, and 17), these are important moments to identify and interrogate both the “intended and contingent work-defining properties” (Phillips 2012, 152) that Table 15.1 aims to identify. Many artworks will be flexible enough to allow display needs to be met in a number of ways, and equipment will only have functional significance, but it is important to remember that significance can change over time. Over time, changes in context, as outlined earlier, may highlight properties that were previously regarded as ubiquitous and insignificant, unforeseen obsolescence being the most obvious example of this. As Phillips notes, “Equipment significance is always established in reference to a contemporary landscape of existing and available technologies, the rapid advances of which can hardly be foreseen by more than a few years” (Phillips 2012, 142). The role of the conservator is to prepare for the future, anticipating and meeting change through thorough dialogue, reflection, and documentation. With the passage of time, typically there is a shift in authority around how change is regarded in relation to an artwork. In the formative years of an artwork, the artist’s authority is paramount (see Chapter 17). How flexible or rigid the specifications for an artwork are is at their discretion. 
As time passes, conversations may slow down and become more predictable as a work’s identity becomes established. Depending on the particular artwork and nature of the artist’s practice, there comes a time when the stewardship of the collecting institution becomes the principal determining factor in decisions around change (Laurenson 2004, 51).

Table 15.1 Assessing equipment significance

• Is the equipment visible when displayed?
• Can the function of the equipment be replicated by another piece of equipment without changing the work in a perceptible way?
• Has the artist provided specifications for the equipment to be used with the work? If yes, are these specifications flexible or rigid?
• Has the equipment been modified in any way (either by or under instruction from the artist or as a necessary measure to realize an artist’s intent)?
• If visible, is the equipment part of an interrelated collection of equipment and/or objects which, together, convey all or part of the meaning of the work?
• Does the appearance or function of the equipment provide the viewer with information about the time in which the work was made or the context within which it was made?
• Does the work itself, in any way, reference or conceptually rely upon a specific piece of equipment used in the display of the work?
• Is the equipment considered to be obsolete/legacy equipment?
• If not obsolete, is it thought obsolescence will likely become an issue?
• What are the artist’s thoughts relating to any existing or potential obsolescence issues?

15.3 Levels of Significance

Given that equipment is integral to the conceptual and material realization of TBM artworks, it is advisable that equipment is included within an institution’s CMS to reflect this. Whichever



system is used for this purpose, a CMS entry will offer several benefits. It provides a place to identify the nature of the relationship between a piece of equipment and an artwork and to track that equipment. It creates a space wherein the conclusions drawn from the work around assessing significance, which can result in complex documentation, can then be turned into a system that is more easily understood by other members of staff within an institution. It elevates the status of equipment institutionally, highlighting their importance as cultural (as well as financial) assets, thereby creating a protective “buffer” around these objects. Additionally, accepting that time and changes in technology can afford significance to previously ubiquitous pieces of equipment, this buffer affords the requisite time needed to assess and reassess significance. In order to reflect the level of significance within a CMS it is important to establish a flexible but meaningful nomenclature placing each piece of equipment within a defined category (see Chapter 12). Table 15.2 outlines a proposed hierarchical scheme relating directly to conclusions that may be drawn from the questions asked in Table 15.1. This scheme is a version of that which we employ at Tate within our CMS (Gallery Systems’ The Museum System). How this hierarchy is implemented will vary depending on the existing CMS and the current and historical modes of categorization associated with that. It may be that some workarounds are required to distinguish older modes of categorization from new or to implement a

Table 15.2 A proposed hierarchy for artwork equipment categorization Status

Name

Level 1 Unique Equipment

Level 2 Dedicated Equipment

Description

Management Strategies

Managing and Storing Artwork Equipment

Table 15.2 (continued) Equipment status levels: descriptions and management strategies

Level 1
Description:
• Items conceived of and subsequently made or commissioned by the artist.
• Items supplied by the artist that have been functionally or aesthetically modified by the artist or at the artist's request.
• Items chosen and/or supplied by the artist that have unique, inimitable characteristics.
Management Strategies:
• Must be included in the CMS and have an explicit relationship with an artwork.
• Must be stored in climate-controlled conditions (see Section 15.6, "Storage").
• Technical care should be prioritized above all other equipment (see Section 15.7).
• May require additional care from objects conservators, including traditional types of condition checking.
• Spare parts should be sought, where possible.
• Must not be shared with any other artworks or used for any other purpose.

Level 2
Description:
• Unmodified, replaceable items dedicated to a specific artwork owing to perceived significance, artist preference, or the complexity of the overall equipment system.
• Obsolete, legacy equipment.
• The item is the default choice for display and should be used until it no longer functions as required.
Management Strategies:
• Must be included in the CMS and have an explicit relationship with an artwork.
• Must be stored in climate-controlled conditions (see Section 15.6, "Storage").
• Technical care should be prioritized (see Section 15.7).
• Spare parts should be sought, and replacements stockpiled, where possible.
• Must not be shared with any other artworks or used for any other purpose.

Level 3: Related Equipment
Description:
• Unmodified, replaceable items which, by virtue of having been bought for and/or used with specific artworks, have a historical link to them.
• Obsolete equipment that is shared between any number of artworks.
• Can be used for other works if deemed appropriate and need not necessarily be used for the artwork to which they are related.
Management Strategies:
• Should be included in the CMS and have an explicit relationship with an artwork.
• Should be stored in climate-controlled conditions (see Section 15.6, "Storage").
• Technical care should be conducted where appropriate (see Section 15.7).
• Spare parts should be sought, and replacements stockpiled, especially for items with obsolescence concerns.
• Should only be used for artworks.
• Items should be monitored over time lest obsolescence require that they be assigned Level 2 status.

Level 4: Generic Equipment
Description:
• Items not associated with any specific artworks but which serve to facilitate the display of works whose specifications can be met in a number of ways.
• Items whose function is easily replicated or substituted.
Management Strategies:
• Inclusion in the CMS is not essential but recommended.
• Items need not have a relationship with an artwork.
• Should be stored in climate-controlled conditions (see Section 15.6, "Storage").
• Technical care should be conducted where appropriate (see Section 15.7).
• Spare parts should be sought, but only where appropriate.
• Can be used for non-artwork requirements, if necessary.

Level 5: Consumables
Description:
• Items subject to high wear that have a relatively short life span and are easily replaceable. Often they are inexpensive, such as cables or headphones, but they can include more costly items such as projector lamps.
Management Strategies:
• Inclusion in the CMS is not recommended, but documentation of their properties in relation to artwork display may be important.
• Do not need to be stored in climate-controlled conditions.
• Technical care is typically not required.
• Spare parts need not be sought, nor replacements stockpiled, unless related to obsolete equipment (e.g., analog projector lamps).
• Can be used for non-artwork requirements, if necessary.
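The level-based rules above amount to a small policy lookup, which some institutions may find useful to encode in collection tooling. The sketch below is a minimal illustration only: the dictionary keys, wording, and helper function are my own paraphrase of the table, not any real CMS implementation.

```python
# Illustrative sketch: the Table 15.2 status levels expressed as a policy
# lookup. Keys and values paraphrase the table and are assumptions, not a
# real collection-management-system schema.
POLICIES = {
    1: {"cms": "must", "climate_control": "must", "use": "dedicated to one artwork"},
    2: {"cms": "must", "climate_control": "must", "use": "dedicated to one artwork"},
    3: {"cms": "should", "climate_control": "should", "use": "artworks only"},
    4: {"cms": "recommended", "climate_control": "should", "use": "any, if necessary"},
    5: {"cms": "not recommended", "climate_control": "not required", "use": "any, if necessary"},
}

def needs_climate_control(level: int) -> bool:
    """True when the table calls for climate-controlled storage ('must' or 'should')."""
    return POLICIES[level]["climate_control"] in ("must", "should")
```

Encoding the policy as data rather than scattered conditionals makes it easy to audit against the published table and to revise when an item's status changes (for example, a Level 3 item being promoted to Level 2 because of obsolescence).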

Duncan Harvey

new system altogether. For example, at Tate, the system outlined earlier is being implemented over an existing, simpler one. The older system was less nuanced and resulted in inconsistent categorization and, in some cases, insufficient documentation. To introduce the new system, a convention was adopted that highlights the nature of an object's relationship with an artwork: other than Level 1 items, all equipment is categorized separately from artwork objects and inventoried under separate accession numbers, as EQ objects, in the CMS. It was agreed that remedying inconsistencies in how equipment had been categorized and, more generally, implementing the new system would be done incrementally, over time. For example, Level 1 items were sometimes made components of an artwork; at other times, they were categorized as EQ objects. Now, these items are made components as and when they are encountered. (It should be noted that, while it was agreed they could be either EQ items or components, consistency was more important than which method of categorization was chosen, and the institutional tendency to prioritize artwork components led us to conclude that Level 1 items should indeed be components.)

15.4 Documentation

Whether it is qualitative information gleaned through conversations with the artist and other stakeholders or quantitative data pertaining to the physical properties of the equipment in question, documentation lies at the center of the care of artwork equipment, as it does for other, more traditional forms of artistic media. However, owing to the complexity of equipment and the vulnerability to change these dynamic systems face, traditional methods of documentation need to be expanded upon.

In her 2006 paper, "Authenticity, Change and Loss in the Conservation of Time-Based Media Installations," Pip Laurenson introduces the concept of the "work-defining properties of a time-based media installation" (Laurenson 2006). The identification and documentation of such aspects of a work, she argues, allow the conservator to better manage change by helping us understand the flexibility, or lack thereof, in an artist's conception of their work (see Chapter 11). Equipment, whether an actual object in the possession of the collecting institution or a hypothetical item (e.g., a specification for a proposed piece of equipment to be sourced), is no exception: the search for an understanding of the degree to which an equipment item may or may not be replaceable is the search for the properties of that equipment that may or may not define a work, that is, its significance.

An example from Tate's collection of equipment that manages to be both real and hypothetical (i.e., the current item dedicated to the work is replaceable) is the highly visible CRT projector staged on a plinth in Mark Leckey's Dream English Kid, 1964–1999 AD (Leckey 2015). The choice of CRT projection, a legacy technology of the 1960s through 1990s that has since been superseded by subsequent projection technologies, is highly uncommon for a work created in 2015. However, the artist indicated, in conversations at the point of display, that the projector itself is not unique but replaceable. What is important about the projector is not the make or model (though options are limited when it comes to such things); rather, it is that it is a CRT projector, which reflects aspects of the meanings and temporal references within the work.

A similar example is the turntable in Ceal Floyer's Carousel (Floyer 1996). Conversations with the artist, years after the work was acquired, revealed that a replacement turntable would have to adhere to a relatively strict aesthetic specification and that whatever turntable was sourced would have to be functionally modified in order to display the work (a repeating 10″ record playing the sound of a slide projector's carousel). In this example, the connection between the playback device and the meaning is direct and unequivocal.

As with all documentation, the identification of these "work-defining properties" can and should take place throughout the life of an artwork, with some moments being more productive or critical than others, namely acquisition and display. Should a collection not already have an existing, accurate, or useful catalog of equipment items, then a conservation-minded inventory should be conducted. Information gathered should then serve as the basis for data entered into the CMS (see Chapter 12). Table 15.3 outlines the minimum information one should aim to create and capture for all items deemed to be Levels 1 to 4.


Figure 15.1 Mark Leckey, Dream English Kid, 1964–1999 AD (2015). Purchased with the support of the Contemporary Art Society 2016. © Mark Leckey. Photo: Tate

Besides the types of information outlined in Table 15.3, additional documentation such as manufacturer-produced manuals, purchase and warranty information, types and characteristics of spares and consumables, condition/servicing/treatment reports, photographs, videos, third-party documentation, installation/display specifications, and wiring diagrams can be gathered and generated throughout the life of a piece of equipment. Realistically, not every item will have all types of documentation captured: certain information may not be available or required, and there may not be sufficient resources in terms of time and/or staff. Equipment significance, obsolescence, and scarcity should dictate priorities in this regard; typically, and necessarily, greater quantities and quality of documentation will be generated for Level 1 and 2 equipment and for complex systems.

Where this information is stored depends on existing institutional data-management structures and preferences. For Level 1, 2, and 3 equipment and complex systems, such information could sit within the structures that are in place for capturing other key information about an artwork outside a CMS. At Tate, for example, we save this information in what we call, simply, Artwork Folders: folders on a shared network drive, accessible to all those working in the conservation and curatorial departments. For Level 4 equipment that is not part of complex systems, individual records containing the previous information could be kept, depending on the size of an equipment collection. Smaller collections may find this a more sustainable solution than institutions with large quantities of equipment.


Table 15.3 Basic equipment information to capture in a CMS

• Identification Number: Each item should be assigned a unique number.
• Status*: Level 1–4 (see Table 15.2).
• Associated Artwork(s): Names of any and all associated artworks.
• Category*: The overarching equipment category; for example, audio equipment, film equipment, video display equipment, video player, etc.
• Type*: The specific function of the item; for example, speaker, 16 mm looper, flat-screen monitor, digital media player, etc.
• Make*: Equipment manufacturer.
• Model: Manufacturer-assigned model name and number.
• Serial Number: Manufacturer-assigned serial number.
• Color*: Object color.
• Power*: Power requirements of the object.
• Packaging*: Description of how the object is packed.
• Dimensions: Packed and unpacked dimensions of the object.

* These fields benefit from a controlled vocabulary to aid searching within a CMS.
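Table 15.3 is, in effect, a record schema. As a rough illustration of how its fields and starred controlled vocabularies might be modeled, consider the sketch below; the class, enumerated terms, and example data are hypothetical and are not drawn from Tate's CMS or any particular product.

```python
from dataclasses import dataclass
from enum import Enum

# Controlled vocabularies for the starred fields in Table 15.3.
# The specific terms are illustrative assumptions only.
class Status(Enum):
    LEVEL_1 = 1
    LEVEL_2 = 2
    LEVEL_3 = 3
    LEVEL_4 = 4

class Category(Enum):
    AUDIO = "audio equipment"
    FILM = "film equipment"
    VIDEO_DISPLAY = "video display equipment"
    VIDEO_PLAYER = "video player"

@dataclass
class EquipmentRecord:
    identification_number: str       # unique per item, e.g. "EQ00123"
    status: Status                   # Level 1-4 (see Table 15.2)
    associated_artworks: list[str]   # names of any and all associated artworks
    category: Category               # overarching equipment category
    type: str                        # specific function, e.g. "speaker", "16 mm looper"
    make: str                        # equipment manufacturer
    model: str                       # manufacturer-assigned model name and number
    serial_number: str
    color: str
    power: str                       # power requirements of the object
    packaging: str                   # description of how the object is packed
    dimensions_packed: str
    dimensions_unpacked: str

# Hypothetical example record; all values are invented for illustration.
record = EquipmentRecord(
    identification_number="EQ00123",
    status=Status.LEVEL_2,
    associated_artworks=["Dream English Kid, 1964-1999 AD"],
    category=Category.VIDEO_DISPLAY,
    type="CRT projector",
    make="Example Manufacturer",
    model="XG-135",
    serial_number="SN-0001",
    color="black",
    power="230 V AC, 50 Hz",
    packaging="flight case with internal foam",
    dimensions_packed="80 x 60 x 50 cm",
    dimensions_unpacked="70 x 50 x 40 cm",
)
```

Using enumerated types for the starred fields mirrors the table's point about controlled vocabularies: free-text entries such as "video-display" versus "Video Display Equipment" defeat searching, whereas a fixed term list keeps records comparable.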

15.5 Sources of Equipment

Equipment is expensive and takes time to research and procure. While there are important considerations around where equipment comes from when it comes to assessing significance, there are also practical matters one must be mindful of when procuring equipment. Typical sources of equipment include the following:

• Supplied at the point of acquisition
• Procured by the collecting institution, in consultation with the artist, at the point of acquisition
• Procured at the point of display (in consultation with the artist or not, depending on the significance of the equipment within the work)
• Procured to replace and/or maintain existing equipment
• Procured in the ongoing search for legacy equipment
• Procured in anticipation of technological advances
• Supplied as a gift (for example, in response to a public callout for scarce legacy technology such as domestic televisions)
• Taken or borrowed from an Audiovisual Department collection and used in the display of an artwork
• Hired for display

Receiving equipment from the artist or their representatives at the point of acquisition can be a low- or no-cost way of providing for an artwork's equipment needs and, in some cases, is the best way of ensuring a work is presented exactly as the artist desires; this is especially true when it comes to complex systems such as software-based artworks. While this may be an appropriate solution for collections with very few TBM artworks, larger or growing collections ought to be mindful of the pitfalls of acquiring equipment in this manner.


A potentially more advantageous approach is for equipment to be purchased separately by the collecting institution, especially when it comes to Level 3, 4, and 5 equipment. Acquiring equipment this way affords a number of advantages. Equipment can be procured at a time deemed most suitable by the institution, such as at the point of display, which, in contrast to the point of acquisition, allows for the purchase of up-to-date equipment that can be compared to existing technologies to properly understand differences in output qualities and, therefore, suitability for use. Intentional investment in compatible or modular manufacturer-designed systems can create streamlined, versatile equipment collections capable of meeting a variety of needs; the larger an equipment pool, the more difficult and expensive it is to maintain. Collecting "sets" of equipment for each work as an institutional practice can result in bloated and outdated equipment collections. As Morgane Stricot, media and digital art conservator at ZKM Center for Art and Media Karlsruhe, succinctly puts it, "it doesn't make sense to buy as many projectors as there are artworks" (Stricot 2020). Thorough research can ensure that high-quality items are bought that meet the standards required for prolonged display and are produced by reputable companies offering in-country support, servicing, and repair. Workflows can be honed, and standards maintained, by developing problem-solving skills associated with specific equipment types. Finally, developing ongoing relationships with vendors in this area both creates opportunities for financially favorable arrangements and permits more intuitive, informed conversations around the needs of specific artworks or general display requirements.

Art collections with works reliant upon legacy equipment must undertake an ongoing search for appropriate pieces of equipment and spare parts.
This work can be time-consuming and requires not only the typical international web-based marketplaces but also local specialist suppliers and web-based forums of various kinds.

It is often the case, especially in smaller institutions, that needs are met by, or with support from, an Audiovisual Department. The benefits of this arrangement are that resources are spread more efficiently and that broad and deep technical expertise likely already exists within such departments. The drawbacks are that audiovisual technicians typically do not take a conservation-minded approach to the equipment, and key information garnered at the point of acquisition or display may be lost for lack of documentation or of appreciation of the significance of a piece of equipment within the context of an artwork. Additionally, reliance upon Audiovisual Departments may limit opportunities to develop technical expertise within the conservation department. Ideally, any equipment located within an Audiovisual Department pool but deemed to be Level 1, 2, and potentially 3 should be separated out and brought into a collection concerned solely with artwork equipment. Should this arrangement be unavoidable or deemed the most appropriate, as it may well be, systems should be established that permit good communication and understanding between conservators, audiovisual technicians, and any other relevant stakeholders (see Chapter 10).

Though typically an expensive and unsustainable approach, there may be times when equipment hire is appropriate, for example, short-term hire of Level 4 equipment that is prohibitively expensive to buy. As with purchasing equipment, developing ongoing relationships with trusted hire companies may allow opportunities to reduce costs in certain areas while still meeting display requirements.
Specialist hire companies that provide legacy equipment and have developed familiarity with the typical requirements of TBM artworks may provide not just equipment but also expertise that can be absorbed by the collecting institution. Often these companies work closely with artists to provide guidance on how to realize their ideas and to develop technical specifications. Additionally, though typically keen to protect their collections of legacy equipment, these companies are often willing to sell equipment or entire systems.


15.6 Storage

As researcher and conservator Emanuel Lorrain points out, in addition to the rigors of display, the risks faced by equipment are broadly similar to those faced by more traditional objects: "age, storage conditions, environmental factors such as sunlight and oxygen, or phenomena like corrosion and oxidation of metals" (Lorrain 2013, 235). Ideally, equipment should be stored in the same climate-controlled conditions observed in storage units for artworks stored as metal. If resource limitations make this difficult, the status of a piece of equipment should dictate priorities for storage in such conditions. At the very least, Level 1 and 2 and all legacy equipment should be stored in the conditions outlined in Table 15.4. When establishing institutional practices in this area, one must conduct a cost-benefit analysis weighing the prolonged equipment life afforded by proper storage against the cost of storage in specialist facilities.

Table 15.4 Considerations for storing artwork equipment

Temperature
Description: "Heat and sudden changes of temperature increase the risk of equipment failure. Peaks in temperature, which are highly stressful for all components, have the effect of damaging soldered joints, shortening the life of components, and accelerating chemical reactions inside machines, for example" (Lorrain 2013, 236).
Recommendation: A stable temperature of around 18–20℃.

Humidity
Description: "Excessive humidity causes hydrolysis and thus chemical degradation of materials such as plastic, the appearance of rust on metals, and condensation inside machines and their components. Too dry an atmosphere, however, creates static electricity, a major consequence of which is erasure of the content of an EPROM (erasable programmable read-only memory) on a motherboard" (ibid.).
Recommendation: Stable humidity levels (ideally a relative humidity of 45%). Store vulnerable components in anti-static bags where practicable.

Protection from dust
Description: "Dust, and other particles present in the air, attract humidity and prevent thermal exchange with the outside environment by blocking the means of ventilation and airflow provided by the manufacturer. When dust gets inside a machine, it covers its components and makes them more liable to overheat" (ibid.).
Recommendation: Store equipment on shelves and/or ensure all items are packaged or covered in some way (see Table 15.5).

Dangers originating inside equipment
Description: Old batteries and electrolytic capacitors may leak and cause irreversible damage through corrosion to surrounding components.
Recommendation: Removal of all primary and secondary cell batteries that are designed to be removed. Systematic monitoring of all capacitors, circuit board–mounted batteries, and batteries that are not designed for easy removal. Backups of CMOS values should be made when required and where possible.
Description: Some equipment with components with a high failure rate, such as capacitors, benefits from being switched on periodically in order to retain its specifications longer and avoid breaking down.
Recommendation: If deemed appropriate, establish routines of switching on and running unused equipment at intervals and for durations determined on a case-by-case basis.
Description: Metal whiskering, the appearance of fine, whisker-like protrusions from certain types of metal, can occur over time in early-20th-century and post-2006 circuitry (owing to the removal of lead from consumer electronics following the Restriction of Hazardous Substances Directive [RoHS]). Whiskering can cause short circuits and arcing in electrical equipment.
Recommendation: Regular monitoring for metal whiskering.

In Obsolete Equipment: A Research Project on Preserving Equipment in Multimedia Art Installations (2013), a synopsis of the PACKED vzw Obsolete Equipment project, Lorrain outlines four key areas of consideration when storing equipment: temperature, humidity, protection from dust, and dangers originating inside the equipment itself. Table 15.4 contains the findings of this project, with additional information on potential issues associated with internal components.

As well as preventing a buildup of dust, equipment should be appropriately packed to prevent damage during transport and storage. Table 15.5 outlines some pros and cons of the most common packing options. Questions may arise as to whether to store items as complete systems or separately. Having equipment numbered and tracked within a CMS (see Chapter 12) permits individual items to be stored separately, allowing for more flexible storage arrangements and easier access to those items it has been deemed necessary to access on a regular basis (in line with the considerations outlined in Table 15.4). There are some instances where it may be beneficial to store systems together, for example, systems with Level 1 equipment elements, systems provided in their entirety at the point of acquisition, complex systems, and systems that form part of a larger installation environment. As ever, how something is packed is best judged on a case-by-case basis. Whatever packing type is decided upon, it is important that institutional norms are followed or established in relation to labeling and tracking.
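The temperature and humidity recommendations in Table 15.4 lend themselves to simple automated checks against environmental-monitoring data. The sketch below is a minimal illustration under stated assumptions: the function name, thresholds, and tolerance (a stable 18–20°C and roughly 45% relative humidity, here with an assumed ±5% band) are mine, not a published standard.

```python
# Hedged sketch: flag environmental readings that depart from the storage
# recommendations summarized in Table 15.4. The tolerance values are
# illustrative assumptions, not conservation-standard thresholds.
def check_reading(temp_c: float, rh_percent: float,
                  temp_range=(18.0, 20.0), rh_target=45.0, rh_tol=5.0):
    """Return a list of warning strings for one environmental reading."""
    warnings = []
    if not (temp_range[0] <= temp_c <= temp_range[1]):
        warnings.append(f"temperature {temp_c:.1f} C outside "
                        f"{temp_range[0]}-{temp_range[1]} C")
    if abs(rh_percent - rh_target) > rh_tol:
        warnings.append(f"relative humidity {rh_percent:.0f}% departs from "
                        f"target {rh_target:.0f}% by more than {rh_tol:.0f}%")
    return warnings

# Example: a warm, dry reading triggers both warnings;
# a reading of 19.0 C at 45% RH triggers none.
alerts = check_reading(23.5, 35.0)
```

In practice, stability matters as much as the absolute values, so such a check would typically be run over a logged series of readings rather than a single data point.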

Table 15.5 Pros and cons of common packing options for artwork equipment

Artwork case: Polyethylene foam-lined, ISPM 15–compliant wooden cases.
Pros:
• The best option for protection and climate control in transit.
• Conservation-grade materials.
• Built to measure.
• Easily identifiable as belonging to the collecting institution.
• Suitable for international travel.
Cons:
• Take up more space than any other packaging type.
• Very expensive.
Particularly suitable for: Large Level 1 equipment and complex systems containing Level 1 equipment.

Flight case: A reinforced hard case made with either metal or plastic sides with internal padding, often designed with casters and sprung handles.
Pros:
• Very high level of protection.
• Built to measure.
• Can be constructed entirely from metal and padded with conservation-grade foam to produce a case well suited to archival storage.
• Easy to handle/move.
• Suitable for international travel.
Cons:
• Internal foam will off-gas and deteriorate over time; extra cost may be incurred to fit conservation-grade foam.
• Can be hard to store in a space-efficient manner.
• If made without wheels, can be difficult to move.
• Expensive.
Particularly suitable for: Large, fragile equipment such as CRT monitors and projectors and flat-screens.

Other, hard case: Variable (e.g., rigid plastic box).
Pros:
• Can offer a high level of protection if packaged properly.
• Relatively cheap.
• Versatile.
• Boxes sourced from a single manufacturer can permit standardization of otherwise irregular-sized objects, making stacking easier.
• May be suitable for international travel.
Cons:
• Non-bespoke, therefore introducing the potential to waste space.
• Requires additional labor and skills to line cases with appropriate padding.
Particularly suitable for: Medium and small equipment and complex systems without large equipment components.

Other, conservation-grade container (custom or premade): Variable (e.g., acid-free board or corrugated plastic box).
Pros:
• Conservation-grade material.
• Versatile.
• Suitable for smaller, lighter items.
• Relatively cheap.
• Can be made to measure, in-house.
Cons:
• Potentially limited use for larger, heavier items.
• Requires additional labor and skills to line cases with appropriate padding.
• Unsuitable for international travel without additional protection.
Particularly suitable for: Medium and small equipment and complex systems without large equipment components.

Manufacturer's box: Variable (e.g., cardboard box with polystyrene foam).
Pros:
• Easily identifiable.
• May display important information about the item.
• Generally made for the specific item and therefore permits efficient use of space.
• Requires no extra cost or labor to source and fit.
Cons:
• Generally made using materials that can be damaged very easily, offer minimal protection, and may off-gas.
• Polystyrene carries a static charge and attracts dust.
• Unsuitable for international travel without additional protection.
Particularly suitable for: Level 4 and 5 equipment.

"Soft" wrap: Polythene plastic and/or foam wrap.
Pros:
• Very versatile.
• A simple solution to the need to protect an object from dust.
• Very cheap.
Cons:
• Very poor protective qualities; only suitable to protect items from light contact.
• Proper coverage may require the use of tape, which will deteriorate and leave a residue.
• Unsuitable for travel.
Particularly suitable for: All equipment types, ideally on a temporary basis, or for equipment otherwise stored in a manner that means greater protection is not paramount.

15.7 Technical Care

15.7.1 Who Undertakes Technical Care?

Though this work creates many opportunities to learn, the reality is that having the time to develop in-depth skills related to any and all equipment types is something of a luxury. While this may be possible with smaller or more specialized collections, or within teams that hold specialist knowledge, the role of the conservator is, more often, to establish broad priorities related to assigned status, obsolescence, and display needs; to identify external specialists who can meet those needs; and to document the process. In addition, depending on resource availability and institutional structures, advocacy for conservation technicians and other specialist roles within the department is advisable (see Chapters 6 and 9).

The spectrum of equipment encountered while dealing with TBM artworks can, very simplistically, be seen as modern equipment at one end and legacy equipment at the other. The distinction between modern equipment, which can be "increasingly complex and miniaturized," often meaning it "can no longer be repaired at the level of the individual component" (Lorrain 2013, 238), and legacy equipment, which may allow or, by virtue of its obsolescence, demand a more hands-on approach, is reflected in the kinds of specialists associated with them. Modern equipment, which typically requires less maintenance by virtue of age and design, is generally dealt with by larger companies and companies affiliated with manufacturers; using these companies may even be a requirement of the equipment warranty. Legacy equipment, which requires a higher level of care and attention, tends to be dealt with by smaller companies or individuals who have not just the requisite skills but also a passion for the equipment and its contemporary usage.

Our experience at Tate is that it is often easier, and more of a necessity, to foster close relationships with these smaller companies and individuals. Though some may be protective of their knowledge, more often specialists are happy to work in a collegiate manner and to produce collaborative documentation that helps transfer knowledge and skills to the institution. Over time this can allow more technical care to be conducted internally and ensure potentially problematic codependencies are not generated. Depending on the nature of the work and existing relationships, this may also permit more thorough, conservation-minded documentation and may save money.
However, a balance must be maintained to ensure those who possess these skills are supported professionally; though their skills may be of vital importance to the collecting institution, they are likely required less in the wider world. Additionally, the speed with which such specialists can diagnose and resolve problems, and the fact that they can offer advice on how to avoid those problems in the future, may actually make them the more cost-effective approach. While our experience at Tate is that of a large institution located in one of the world's major cities, it is important not to overlook the fact that specialists in obsolete technologies, and indeed such technologies themselves, are not readily available to all. That said, where such specialists are available and affordable, the principles that underpin the sorts of relationships described earlier should apply to institutions big and small.

15.7.2 Loan

Whether or not a loan request is fulfilled with equipment depends on the nature of the artwork and the policies the lending institution has devised around the matter. Decisions regarding suitability for loan should be made on a case-by-case basis; just as a more traditional artwork may be fragile or require treatment, so may a TBM work with Level 1 equipment. Level 2 items are likely candidates for loan, but careful consideration should be given to Level 2 items designated that status by virtue of being legacy equipment. For these items and all other designations, a decision should be made as to whether the lending institution is willing to absorb the wear associated with display at a venue where it cannot have close oversight of the use and maintenance of the item. This is especially true for high-wear items with mechanical parts, such as equipment for film, slide, and tape. As with internal displays, condition reports play an important role in the loaning of TBM artworks. At Tate, we require that all artworks that travel with equipment are accompanied by a courier, who will sign off a condition report with a local conservator at the point of installation. Consideration should also be given to the fact that, owing to the cost of shipping and the potential need for packing, it may be more practical and cheaper for borrowers to source equipment locally. Doing so provides opportunities for borrowers to develop and support local networks around the technological niches TBM works are often reliant upon.

15.7.3 Strategies for Technical Care

How technical care is approached depends on the status of the equipment item and the nature of the problem to be resolved. For Level 1 and 2 items, for which change presents potential loss to their associated artworks, Laurenson proposes three strategies:

1. "Acquiring spares to repair the original unit or substituting the failed unit with the same make or model. This maintains the integrity of the work on all counts except for any work explicitly designed to be ephemeral. This strategy is the closest to traditional conservation practice in that it preserves the link to the meaning of the work by preserving the identity of the physical components. Success is easy to evaluate—if the same equipment is used then (if all other conditions for installation are adhered to) no loss will occur. This strategy is only an option for the time spares are available."

2. "Making new components or modifying another piece of equipment to match what was considered significant about the failed equipment. For example, putting a modern mechanism into an original casing. When considering this strategy, one should bear in mind the risk of undermining the spirit of the work if the technology is intended to be transparent and un-contrived."

3. "Recreating significant features by inexact substitution perhaps by an item of equipment using the same technology or producing the best match of measurable outputs (quantifiable in terms of dynamic range, resolution, brightness, etc.)."

(Laurenson 2004, 51–52)

Referring back to Phillips’ “balanced” approach, these strategies should be employed in whatever combination and to whatever degree the significance of the artwork dictates.

15.7.4 End of Life

There are two reasons equipment may be regarded as having come to the end of its life: either it has ceased to function as required and cannot be repaired, or its function has been superseded. What is done with items at the end of their functional life is determined by their current status and potential future utility. All Level 1 and some Level 2 items should be retained for historical reference, with priority given to maintaining material stability, as suggested by Bek (2011). Items that are Level 3 and below should be assessed to determine whether they have any potential future utility in terms of spare parts or as examples of historical modes of display for research or display purposes. Equipment that is no longer required should be recycled by a certified specialist, and all documentation kept as a record of that item's life in the collection.

15.8 Conclusion

To properly care for TBM works, one must understand them in their entirety, as dynamic systems of equipment and media that, together, produce a conceptual and material whole. An individual who cares for the media alone is no better placed to understand this than someone concerned only with the equipment. Whatever else the resources of an individual, department, or institution permit, assessing the significance of equipment is the core work that facilitates the proper care of artwork equipment and the framework from which all other practices can be devised and prioritized.

Duncan Harvey

Bibliography

Bek, Reinhard. “Between Ephemeral and Material—Documentation and Preservation of Technology-Based Works of Art.” In Inside Installations: Theory and Practice in the Care of Complex Artworks, edited by Tatja Scholte and Glenn Wharton, 205–15. Amsterdam: Amsterdam University Press, 2011.
Floyer, Ceal. “Carousel.” Vinyl record and record player, overall display dimensions variable, 1996. www.tate.org.uk/art/artworks/floyer-carousel-t12325.
Laurenson, Pip. “Authenticity, Change and Loss in the Conservation of Time-Based Media Installations.” Tate Papers, no. 6 (Autumn 2006). www.tate.org.uk/research/publications/tate-papers/06/authenticity-change-and-loss-conservation-of-time-based-media-installations.
———. “The Management of Display Equipment in Time-based Media Installations.” In Modern Art, New Museums: Contributions to the Bilbao Congress, edited by Ashok Roy and Perry Smith, 49–53. Bilbao, Spain and London: The International Institute for Conservation of Historic and Artistic Works, 2004.
———. “The Management of Display Equipment in Time-based Media Installations.” Tate Papers, no. 3 (Spring 2005). www.tate.org.uk/research/publications/tate-papers/03/the-management-of-display-equipment-in-time-based-media-installations.
Leckey, Mark. “Dream English Kid, 1964–1999 AD.” Video, projection, colour and sound (surround), duration: 23 min, 2 sec. Tate, 2015. www.tate.org.uk/art/artworks/leckey-dream-english-kid-1964-1999-ad-t14666.
Lorrain, E. “Obsolete Equipment: A Research Project on Preserving Equipment in Multimedia Art Installations.” In Preservation of Digital Art: Theory and Practice: The Project Digital Art Conservation, edited by Bernhard Serexhe and Eva-Maria Höllerer, 232–42. Karlsruhe: Zentrum für Kunst und Medientechnologie, 2013.
Phillips, Joanna. “Shifting Equipment Significance in Time-Based Media Art.” Edited by F. Durant, G. Ryan, and Jeffrey Warda. Electronic Media Review 1: 2009–2010 (2012): 139–54.
Stricot, Morgane. Interview with Morgane Stricot by Duncan Harvey. Unpublished audio recording, July 23, 2020.
Viola, B. “Permanent Impermanence.” In Mortality Immortality: The Legacy of 20th-Century Art, edited by Miguel Angel Corzo and Getty Conservation Institute, 85–94. Los Angeles: Getty Conservation Institute, 1999.


16 THE INSTALLATION OF TIME-BASED MEDIA ARTWORKS

Tom Cullen

Editors’ Notes: Tom Cullen, a UK-based media art technician, has managed the installation of time-based media artworks for over 30 years, commissioned by renowned artists, galleries, and museums all over the world. In his career, he has worked with Isaac Julien, Gillian Wearing, Bill Viola, and Steve McQueen. With his unique perspective as an experienced media technician, Cullen provides a structured approach to realizing the successful display of time-based media art. With a focus on practical requirements, he advises on the criteria to consider, mistakes to prevent, and crucial documentation to create.

DOI: 10.4324/9781003034865-19

16.1 Introduction

Technology has changed tremendously over the 30-plus years I have been supporting artists, galleries, and institutions to install time-based media artworks. Working across the world, in some of the largest museums, galleries, and site-specific spaces, I have had the good fortune to help develop exhibits, from working with individual artists to installing huge international biennials in my role as a Technical Manager. Over those years, I have seen exhibition technology rapidly change—from U-matic and LaserDisc video, Apple II computers, and old CRT projectors and monitors, to the latest ultra-high-definition projection and synchronized multi-screen playback systems.

My main work is traveling to support the installation of artist exhibits, including artworks that have not yet entered collections. Many of these works are site-specific, which can involve working with the artist throughout their creative process, advising on what may or may not be possible. At other times I am working on behalf of the artist to ensure that a space can accommodate their existing artwork. In all cases, it is a fine balance of mediation, collaboration, and technical experience to ensure the needs of the artist and the gallery are met. Where museums do not have specialized staff and infrastructure to plan and execute a time-based media exhibit, I can help to realize the project and guide them through the process. My role often includes negotiating with artists and venues, from how to present an artwork in its best possible environment to understanding the venue’s exhibition space and what would work best for the integrity of the art. While installation design, materials, construction, and use of the right equipment have to be agreed upon and managed for the best presentation of an artwork, so too do the needs of the venue, since the artwork is created and installed in a busy, functioning space that visitors must be able to view and navigate with ease.

This chapter offers a personal insight into how I approach the installation of time-based media and how I deal with specific requirements of a whole exhibition or a single artwork so that the installation is a success. In order to get a wider view on the issues of working in public and commercial spaces, I have also consulted a few additional practitioners to get their perspective on installation management: Jon Harris, Technical Manager for Birmingham Museum Trust in the UK; David Wood, Gallery Manager at Victoria Miro Gallery in London; and Jorma Saarikko, owner of ProAV, a company based in Helsinki that has been in the business of installing time-based media internationally for over 30 years.

16.2 Defining Artworks by Spatial, Aesthetic, and Technical Requirements

Understanding the specifics of an artist’s time-based media work and its presentation is fundamental; this can be the difference between the exhibit looking average or amazing. It is also crucial to obtain the right equipment to avoid the disappointing consequence of a work not being shown. With an existing work, there are normally guides, specifications, or installation instructions that I will call a Technical Rider in this chapter. The Technical Rider outlines how that work should be installed and exhibited at its best within the gallery context. These specifications obviously vary depending on the artist or studio which has produced them, but I have generally found that most Technical Riders can be broken into three main parts:

1. The type of space necessary to present the work in the gallery. Dimensions such as width, depth, and height should be included to give you a good understanding of what is required.
2. The aesthetic considerations for showing the work, including the types of materials needed for construction and presentation of the work, painting of walls, and the use of carpet or other acoustic materials.
3. A technical outline, which covers both the specific equipment required for the artwork and the gallery infrastructure requirements, such as mains power and internet access, which the venue will need to provide.
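The three-part structure of a Technical Rider lends itself to a simple checklist template. As a minimal sketch (the field names below are my own illustrative assumptions, not a standard schema used in the field), the three sections could be captured in Python as:

```python
from dataclasses import dataclass, field

@dataclass
class SpaceSpec:
    width_m: float   # required gallery width
    depth_m: float   # required gallery depth
    height_m: float  # required gallery height

@dataclass
class AestheticSpec:
    wall_color: str = ""
    construction_materials: list = field(default_factory=list)
    acoustic_treatments: list = field(default_factory=list)  # carpet, panels, etc.

@dataclass
class TechnicalSpec:
    equipment: list = field(default_factory=list)  # makes/models the artist accepts
    mains_power: str = ""                          # venue-provided power
    internet_required: bool = False                # hardwired access needed?

@dataclass
class TechnicalRider:
    space: SpaceSpec
    aesthetics: AestheticSpec
    technical: TechnicalSpec

# Illustrative values only.
rider = TechnicalRider(
    space=SpaceSpec(width_m=10.0, depth_m=8.0, height_m=4.5),
    aesthetics=AestheticSpec(wall_color="matte black",
                             acoustic_treatments=["wool carpet"]),
    technical=TechnicalSpec(equipment=["projector, 6,000 ANSI lumens"],
                            internet_required=True),
)
print(rider.space.width_m)
```

Keeping the three sections as separate records mirrors the way the rider is assembled in practice: the space and aesthetic sections go to the venue and designers, while the technical section goes to the hire company.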

By looking through the Technical Rider, you can also start to get a good understanding of the budget and what considerations you have to plan for. However, if the artwork is a new commission or being exhibited for the first time, it is unlikely that a Technical Rider is available, and new specifications will need to be produced so the work can be faithfully reproduced in the future. These specifications should always at least cover the three main areas: space, aesthetics, and technical requirements.

16.3 Understanding the Space Requirements of the Artwork and the Requirements of the Exhibition Space

Finding an appropriate exhibition space for a time-based media work requires one to identify how much space the work needs, from a simple monitor-based work, which would take up little space, to a multi-screen projection, which requires a larger space. When considering the right venue, physical and technical requirements will determine the selection and preparation of the exhibition space. Factors that impact an artwork’s need for space include projection throw distances or screen sizes, whether a projection screen is front- or rear-projected, or what type of lenses are available for that particular type of projector.

You will also need to think about issues with the audio reproduction and what might impact the sound in the space. Is there audio spill from one artwork to another? Are there any concerns within the gallery that might affect the sound reproduction, such as noisy air conditioning, sound from elevators, or general public noise outside the space, such as a main thoroughfare, gallery cafés, or reception? Any of these could majorly impact the artwork.

Other points when considering a space are whether the walls, ceiling, and/or gallery roof can support the hanging of screens, projectors, speakers, technical equipment, or sculptural elements of the exhibit. What are the weight-bearing loads that the ceiling can sustain? Can I-beams be accessed and used? If there are issues with hanging, can a support system, such as a truss, be brought in to facilitate hanging the equipment?

If the artwork requires a darkened space, is blackout required? Can you control the ambient light entering the space without building light locks? Are there windows that would need to be covered so that you can achieve the right light levels? Can internal lighting be controlled, and are there illuminated fire exit signs which might impose on the work?

Access to the space is also a major consideration. Can you get the artwork into the gallery? I have seen artworks arrive at venues where they would not fit through the doorways and could not traverse a corridor system. I have also seen works taken out of crates in car parks in the rain, as the height and width of the travel flight case had not been factored into gaining access to the space.

Power and mains distribution are often taken for granted in spaces.
However, as artworks get larger and more complex, you need to ascertain early on in your planning that you have enough power to cover the requirements of the exhibit in the space or that power can be run from elsewhere into the gallery. Obviously, these power requirements change depending on voltage from country to country. We will look at more specific power requirements in Section 16.6 on the technical specifications.

One issue that can quickly turn into a major problem in some venues is internet access. Does the work you plan to install need an internet connection for remote access? This sounds really straightforward, but a large percentage of artworks need a hardwired, stable network connection to the internet as, in my experience, Wi-Fi access can be unstable, even in the best of spaces, often dropping its connection several times a day. Getting a hardwired internet line often means that a special arrangement is needed with the venue’s IT department or providers to gain access or patch through the institution’s firewall system. As this can take considerable time to arrange depending on the venue, I advise that you address this issue early.

Hopefully, a Technical Rider provided by the artist will have good-quality images or layout plans of how the work has previously been installed, giving you a valuable idea of the layout and design. If not, it’s important to engage the artist to fully understand the requirements of their work. I asked Jon Harris of Birmingham Museum Trust what he initially looks for in a Technical Rider. “Well, first of all, you’ve got to look at the physical space that you can offer the artist, and I suppose, that’s me, working with my team leader, who’s also my carpenter and room creation person, to see if we can build what’s needed” (Harris 2020).
Likewise, David Wood of Victoria Miro Gallery adds, “I think the first thing to do is to establish whether it’s been shown at a pre-existing site, has it been shown at another venue or another gallery before it’s been shown by us; how that previous space compares with our space, and if you’re talking about projectors and speakers, then it’s to see how that tech spec marries up with our space, because maybe it’s a whole new technical set-up needed, in terms of the lenses and things like that” (Wood 2020). Jorma Saarikko from ProAV states, “A photograph or a plan is the best resource for understanding the space required to see how things were done before; if the specifications include photographs of a previous install, this is the best way of seeing how it was done before” (Saarikko 2020).

Once you have established that you have the right space to accommodate the artwork, you need to look into the gallery infrastructure to see what can and cannot be achieved. We touched on this earlier when considering hanging points and internet access. Construction will most likely be a major factor for the exhibition. Can walls be added, such as light locks or dividing walls? Can temporary walls be moved or taken down? Can you fix or screw into the floors, and can the ceiling support a hanging structure or screens? Is the mains power in the position it needs to be, or does additional power need to be brought in, and can the additional cabling for this be run out of sight? Much of this can be incorporated into the design and aesthetic considerations when planning the installation.

In order for the exhibition installation to run smoothly, it is vital to confirm with the venue that staff and crew are available for the installation. Always make sure there are some specialist audiovisual technicians and/or time-based media staff available, as well as regular art technicians or art handlers; this is always a good balance on a crew. As a visiting technician, it is great to have local knowledge and someone who knows the gallery well. If these specialists are not available or beyond budget, it may be worth rethinking whether the installation is viable. Most artists will insist that either a trusted technician or company oversee the installation.
The staff and infrastructure resources of a given venue will have a great impact on planning the installation of a work: some museums may have equipment pools of suitable projectors or audio equipment that can be used. If you are unable to modify or restructure a space to fully accommodate the specifications of an artwork, this could seriously compromise it. Never be afraid to say the space is unsuitable; a work that has been compromised to fit a space is not good for the artist, the gallery, or the viewing public.
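One of the space calculations discussed above, projection throw distance, follows directly from the lens specification: throw distance is approximately the lens throw ratio multiplied by the image width. A minimal sketch of that arithmetic (the figures are illustrative; always confirm against the projector manufacturer’s lens calculator):

```python
def throw_distance_m(throw_ratio: float, image_width_m: float) -> float:
    """Approximate projector-to-screen distance for a given lens and image width."""
    return throw_ratio * image_width_m

# A 1.5:1 lens filling a 4 m wide screen needs about 6 m of throw,
# which must fit within the gallery's available depth.
print(throw_distance_m(1.5, 4.0))  # → 6.0
```

Running this check for each lens option in the rider quickly shows whether the work fits a candidate space or whether a shorter-throw lens needs to be sourced.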

16.4 Determining the Aesthetics of an Artwork’s Display

Dealing with the aesthetics of an exhibit runs hand in hand with deciding whether a space is suitable, but the aesthetics of a work have their own set of concerns. Can a space be painted? This sounds simple, but it’s not. Some venues are happy to paint the whole space, but others will only entertain painting in parts, and venues such as heritage sites often will not permit painting at all. The role of a visiting technician is to liaise with the exhibition designers and other staff to communicate needs, such as carpets, wall colors, and so on, and to ensure that the museum takes care of product orders, construction, and so on.

In difficult spaces, a different approach is needed: building a space within a space which can accommodate the work is sometimes a way of providing the right aesthetics. Often rooms within rooms are used in very large spaces to accommodate many works, allowing controlled spaces to be built to the different specifications of the artworks. While this allows the exhibits to be shown correctly, it can be very time-consuming and costly, but it may be the only solution.

Lighting can be a big issue, especially if you are building new spaces. Can lights be placed where needed? Can lights or lighting tracks be added to the space? Would you need some sort of programmable lighting control such as DMX? Can the in-house lighting system be amended to support any extra requirements that are needed for the exhibit? I have found that in most cases, the in-house tech crew are more than happy to accommodate any lighting changes, but some smaller venues will need to call in external support for this, so this again needs to be flagged up early in discussions. Light locks are very important in reducing the amount of ambient light entering a space; these need to be designed sensitively, not so imposing as to stop visitors from wanting to enter a space, but at the same time large enough to keep light out.

In order to have excellent sound in a space, artists will demand acoustic treatments of some description, and these are crucial for the work to be heard at its best. Acoustic treatments will impose an artistic aesthetic, and the larger the space and the higher the flat walls, the more sound treatment will be needed. You need to consider whether you can lay carpet to help with the sound reflection and, if so, whether you can source the correct type and color of carpet. Some galleries never use any acoustic materials, and some newer galleries have them built into the fabric of the walls, but carpets can be extremely important to an exhibit and will help absorb sound. Sound travels by the vibration of air molecules, so the carpet fibers, tufts, and underlay all have different resonant frequencies at which they absorb sound. Carpets can reduce airborne noise, by up to 35% more if you use a wool carpet (Carpet Institute of Australia Limited, n.d.). Many museums will require the use of closed-loop carpets to minimize the distribution of loose fibers within the museum. In addition, carpets reduce many of the sound reflections from hard solid walls. Acoustic panels can also be added to a space to reduce reflections off flat surfaces; these can be the egg-box foam-type panels or frames with Rockwool absorption inside.

So how do you accommodate sound insulation and light locks or truss hanging systems in a space without imposing on the art?
Usually, if the work has been shown before, many of these considerations would have been incorporated into the Technical Rider specifications, and design and placement have already been decided. However, if there are new issues because of the exhibition space, then it can become a careful negotiation between you, the artist, and the gallery to come up with the best solution. In the vast majority of cases, this can be resolved. Other considerations that impact the aesthetics of the display pertain to the presence of architectural elements within the exhibit space, such as exhibition furniture, benches, plinths, bespoke structures, or housings. These will all need to be considered. Do these sculptural elements need to be sourced or built new if the in-house resources are not suitable? Again, if the work has been shown before, there might be good photographs or designs of furniture previously used.

16.5 Integrating the Artwork in the Space

Designing the exhibition space to provide the best environment for showing the work is something that is developed hand in hand with the artist and the gallery. Often the gallery will use external architects to work on the exhibition design, who are normally quite versed in working with artists. However, many artists are very hands-on and need to be included at this stage. Once you have agreed on the space, you need to work out how the artwork is best oriented within the space and how the audience will interact with and navigate the exhibit. How people relate to or interact with the work is vitally important; if people find the navigation of a space too difficult or confusing, they will not come away with a great experience. During this design phase, hanging systems such as a box truss or lighting frames, and speaker and projector placements, can be agreed upon. Rooms or hidden enclosures for equipment can be incorporated into the aesthetic design, and cable runs and concealment planning will also need to be factored in at this stage.


It is vital to produce drawings of the space. I normally produce two types of drawings: first, a 3D mock-up, so that virtual components can be moved around quickly to get a good idea of the feel of the work in the space, and, second, good architectural 2D CAD drawings for big installations. The 3D renderings are great to share with artists and curators, who can get a good understanding of the artwork in the space. The CAD drawings are important as they are precise drawings that you can share with building management or other architects so that structural or building changes can be agreed upon. Often the architects will produce these types of drawings for you, which you can amend. Also, with a good set of drawings, you can plan the visitor flow or queue management in collaboration with the venue staff. If the venue is planning timed screenings, ticket-only access, a one-way system, or interaction with the artwork, all of these can be planned out on paper. Gallery furniture, if needed, can also be added to the drawings at this point.

16.6 The Technical Realization of Time-Based Media Artworks

Equipment is a major topic to deal with in the presentation of time-based media. Most artists have specific equipment they prefer, which they have previously used and which shows the work at its best. Sometimes an artwork will travel with its dedicated equipment. In these rare cases, that equipment may be bespoke or may have been bought by the institution or collector when the work was first purchased. Some larger museums or galleries have a pool of audiovisual equipment that they can let you use for exhibitions, but as most artists require different makes and models, most institutions tend to hire or rent equipment for the duration of the exhibition. Unlike when I first started working in arts venues, there are now many companies who specialize in hardware for exhibitions and hire at rates lower than most commercial audiovisual hire companies.

Most good Technical Riders will give you various makes and models that the artist is happy with or that have similar specifications to the equipment required. Obviously, there can be a lot of negotiation about equipment, and I have found most artists are happy with a like-for-like approach, as long as it meets the requirements of the work. However, if I cannot get the model required, I normally get a slightly better specification than the one requested, as that will help find the middle ground in negotiations on equipment. In some instances, though, you will just have to buy, hire, or source the correct equipment that the artist knows and is happy with. Tracking down any bespoke equipment can be an arduous task. I normally start with the previous venue where the artwork was shown to try and locate the same supplier of the equipment; if that does not work, I try to source it elsewhere. Finally, if all else fails, try to find a similar piece of equipment and negotiate with the artist.
Finding equipment from various sources can be time-consuming and costly, so try to bundle up all of your equipment purchases or rentals into one or two packages that can go to your preferred suppliers or go out to tender. Sourcing equipment from one or two companies means you get the right cabling with the hire, and you have support if there are any issues. One thing I normally request for any artwork installation I am overseeing is that spares are available, such as a spare projector, media player, or projector bulbs, mainly for my own peace of mind but also because it allows the venue to change over problematic hardware very quickly if needed. Sometimes having spares may not be possible, especially if it is a very expensive piece of equipment, but I would insist that the hire company has another in stock, ready to swap out, or an engineer who could fix the unit quickly. These things need to be built into any rental agreement.


As with all time-based media works, the signal path is crucial to a successful exhibition. Using the correct type of cables, connectors, and adaptors is paramount. Knowing the correct cabling, such as the different types of HDMI cable, is critical: Can it carry a 4K image or only HD? Will you need an HDMI extender if you are using an HDMI cable over 10 m? Does the audio cabling need to be balanced or unbalanced, and are you using the correct connectors and adaptors? Knowing your signal path will save you time and energy during an install.

Sourcing legacy equipment like CRT monitors and projectors is getting harder; luckily, there are specialist companies who acquire, maintain, and hire older or obsolete equipment, which allows us to show older works as intended. However, using vintage equipment brings its own concerns, such as reliability, and vintage equipment needs to be looked after carefully, which is another factor when planning your exhibition.

Budgeting is a major issue. You will find many differences between working within the public sector galleries and the commercial ones, with the issue of budget or lack of it. Jon Harris explains how he deals with equipment for exhibitions, on owning or hiring equipment in public sector museums: “In terms of budget, it’s looking at what kit we’ve got in-house and what do we need to hire in, which, you know, more often than not, is all of it. Because as an organization, we don’t have huge amounts of stock equipment, we have acquired a little bit over the years, but we just don’t have the capital to constantly be buying projectors, speakers or media systems, etc. One thing we do have a lot of is media players, but I think it’s all about the realities of what you can achieve within the specs that you’ve got. We can do what we can under the budget; if you want more, then we can do it. I always will try to satisfy the artist’s spec.
I feel that as an organization that’s showing any form of digital art, you have a duty of care and responsibility to do these things properly” (Harris 2020). Most commercial galleries have fewer financial constraints when installing time-based media. When asked about whether the budget is a big issue in commercial galleries, David Wood explains, “I would say not, but it depends on the galleries. I think galleries want the artists to have the freedom to push their ideas. They’ve also got to think about the work, who’s looking at it, is it looking at its best? And you know, ultimately the gallery wants to sell the work. So, I think there’s a bit more flexibility with the budget I would say, rather than saying budgets don’t matter” (Wood 2020).
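The cabling questions raised earlier can be turned into a quick planning check during equipment sourcing. A minimal sketch (the 10 m HDMI rule of thumb comes from the discussion of signal paths above; the helper function and cable-run values are illustrative assumptions):

```python
def needs_hdmi_extender(run_length_m: float, threshold_m: float = 10.0) -> bool:
    """Flag HDMI runs longer than ~10 m, which typically call for an extender."""
    return run_length_m > threshold_m

# Cable runs measured off the installation plan (illustrative values).
runs = {"projector 1": 8.5, "projector 2": 14.0}
for name, length in runs.items():
    if needs_hdmi_extender(length):
        print(f"{name}: {length} m run - plan for an HDMI extender")
```

Walking through every run on the plan like this, before the hire order goes out, ensures extenders and the right cable grades arrive with the rest of the package.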

16.7 Creating a Technical Rider for New Works and Commissions

Earlier, I explored the importance of a Technical Rider for existing artworks. However, if the gallery has commissioned or is showing a new work for the first time, it is important to produce a new specification document and record the installation. This will be required by the commissioning organization and the artist. Jorma Saarikko outlines his experience when producing a new Technical Rider for artists: “I have done many specs this year. I usually put the main needs, whether it’s a DLP or LED panel projector, that its brightness is 6,000 ANSI lumens, do we need to have a changeable lens, do we need a long lens or lenses, etc. But then I usually offer that it can be different models—for example, Epson or Panasonic. I use examples because technology’s development or the speed of the new models is so fast that it’s better to give examples than specific models. Also we can’t give specific models under Finnish law; if we do, we break the law, so we need to give examples. That’s why we say 6,000 ANSI lumens, HD, etc. Then we give all the information about the media coding. If it’s synchronised work, what that (equipment) needs to be and how it’s synchronised, and then we give all the information for the sound systems, whether they need to have Dolby decoding, do they need to have active speakers, etc.” (Saarikko 2020).


As previously discussed, the Technical Rider comprises three main sections (space, aesthetics, and technical); these themes also need to be incorporated into any new Technical Rider, which needs to be written up once the exhibit is finalized. The gallery space specification should include the ideal width, depth, and height, with a good 2D plan of how the work was installed in the space, showing all walls, light locks, and dividing walls that were built. 3D renderings of the work as installed should also be included (see fig. 16.1). Be sure to record the variety of acoustic treatments that were used: carpets, panels, and so on. Include a swatch from the carpet with manufacturer details. If any gallery furniture like benches or plinths was used, make drawings and take photographs of these. List all measurements, no matter how insignificant they seem: the height of screens from the floor, the distance between screens, and the placement of benches, tables, and sculptures.

You should provide a full specification of the equipment that was used and, whenever possible, alternative equipment that could be used in the future, citing different models and manufacturers. This all makes the next venue’s life easier in sourcing the equipment next time around. Make sure you add plenty of notes about the installation and whether internet access is needed. In addition, give details of the power requirements for the work: how many amps the equipment normally draws at 110 volts and at 240 volts. Again, this will help the next venue plan ahead. I also like to include a technical schematic of the work, which consists of a drawing showing how all the equipment is cabled together and the chain of the equipment (see fig. 16.2). For future display of the work, there is then no ambiguity about what is needed.

Once the artwork is installed, good photographs are essential so that galleries understand the aesthetic design of the work within the space. In addition to creating images of the completed

Figure 16.1 A 3D rendering of the synchronized nine-screen installation Ten Thousand Waves (2010) by Isaac Julien, shown as installed at the Museum of Modern Art New York in 2013. Rendering: Tom Cullen



Figure 16.2 Schematic for Ten Thousand Waves (2010) by Isaac Julien. Creator: Tom Cullen

installation (see fig. 16.3), documenting the technical installation process itself can give the next venue a heads-up on what to expect. Try to take photographs of everything: how the projectors or speakers were hung, and shots of any specialist equipment that was used in the installation. No matter how small or trivial, take a snap, and store it on a hard drive and a thumb drive so it can travel with the artwork specifications. If you have the resources, shoot some videos of the artwork and the space it is in. This could simply be on your phone, but it adds a little more information to the Technical Rider.


Figure 16.3 Isaac Julien, Ten Thousand Waves (2010), nine-screen installation, 35 mm film transferred to high definition, 9.2 surround sound, 49′ 41″. Installation view at the Museum of Modern Art, New York, 2013, © 2022. Digital Image, The Museum of Modern Art, New York/ Scala, Florence. Photograph: Jonathan Muzikar

So your finished Technical Rider will have a full outline of the installation and the technical equipment used. It should also include plenty of drawings, from 2D and 3D renderings to schematics, and lots of photos and videos of the exhibit. This will make reinstalling the artwork as intended as straightforward as it can be. Artists will value this, future venues will be grateful, and if you are reinstalling the work again in the future, having a well-planned Technical Rider will save time, energy, and cost.
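The amperage figures recommended for the rider are simple arithmetic: current draw is total wattage divided by supply voltage. A minimal sketch of that calculation (the 20% headroom margin and the equipment wattages are my own illustrative assumptions, not values from any particular rider):

```python
def amps_drawn(total_watts: float, volts: float, headroom: float = 1.2) -> float:
    """Estimate supply current in amps, with a safety margin for inrush and losses."""
    return total_watts * headroom / volts

equipment_watts = [400, 400, 160, 100]  # e.g. two projectors, amplifier, media player
total = sum(equipment_watts)            # 1060 W
print(round(amps_drawn(total, 240), 1))  # draw on a 240 V supply
print(round(amps_drawn(total, 110), 1))  # draw on a 110 V supply
```

Recording both the 110 V and 240 V figures in the rider, as suggested above, lets the next venue check its circuits regardless of which country the work travels to.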

Bibliography

Carpet Institute of Australia Limited. “Acoustic Benefits of Carpet.” Carpet Institute of Australia Limited. Accessed September 13, 2021. www.carpetinstitute.com.au/residential/carpet-noise-pollutionreduction/.

Harris, Jon. Interview with Jon Harris by Tom Cullen, November 18, 2020.

Saarikko, Jorma. Interview with Jorma Saarikko by Tom Cullen, November 26, 2020.

Wood, David. Interview with David Wood by Tom Cullen, November 18, 2020.


17 A ROUNDTABLE: COLLABORATING WITH MEDIA ARTISTS TO PRESERVE THEIR ART Joanna Phillips and Deena Engel in conversation with Lauren Cornell, Mark Hellar, Diego Mellado, Steven Sacks, Lena Stringari, Siebren Versteeg, and Gaby Wijers

Editors’ Notes: This roundtable discussion, held on January 27 and February 22, 2021, brings together seven professionals who are involved in producing, distributing, collecting, curating, and conserving media art. In conversation, they share their perspectives and explore together how they collaborate with living media artists to support the sustainable collection of media art. Steven Sacks (Director of bitforms gallery, New York) represents media artists and distributes their work; Siebren Versteeg is an American media artist represented by bitforms gallery and collected by major museums in the US; technologist Mark Hellar (Hellar Studios LLC, San Francisco) produces media art for artists such as Lynn Hershman Leeson, and supports institutional media conservation initiatives; Gaby Wijers (Director of LIMA, Amsterdam) works with artists to distribute, preserve, and research media art and is charged with the long-term care of multiple collections; Lauren Cornell (Director of the Graduate Program and Chief Curator at the Center for Curatorial Studies, Bard College, New York) collaborates with artists to exhibit their work; Lena Stringari (Deputy Director and Andrew W. Mellon Chief Conservator at the Guggenheim Museum, New York) is charged with the long-term care of the museum’s collection and invites artists such as Siebren Versteeg to inform acquisition and preservation processes; and Diego Mellado (engineer and project manager at Studio Daniel Canogar, Madrid and Los Angeles) produces, installs, and documents art for Daniel Canogar, also represented by bitforms gallery, and provides consulting services for media art collections.

17.1 Introduction

Contemporary artists have long been actively involved in the collection of their art. They promote their work, they cultivate relationships with gallerists, curators, collectors, and museums, and they often collaborate with institutions or galleries to produce, install, or adapt their works for display and acquisition. For complex works such as installations, performances, or media artworks, the artists’ participation will sometimes also extend to the conservation and

DOI: 10.4324/9781003034865-20


Joanna Phillips and Deena Engel

long-term care of their work. Over the last two decades, the conservation community has developed skills, tools, and standards to capture and document artists’ intentions, preferences, and viewpoints. Strategies to interview artists and other stakeholders have been developed and refined (SBMK—Concept-handreiking Kunstenaarsinterviews 1999; INCCA—Guide to Good Practice: Artists’ Interviews 2002–2016; Beerkens et al. 2012; Cotte et al. 2016), artist questionnaires and artist-based documentation systems have been put to use (Gantzert-Castrillo 1996; LIMA 2017; Rhizome—ArtBase n.d.; Ippolito et al. n.d.), and training opportunities for conservation professionals, such as the ongoing Artist Interview Workshop series of Voices in Contemporary Art (VoCA—Artist Interview Workshops n.d.), have enhanced collaborative practices in contemporary art conservation. Even with this progress, and with artist interview practices established in many places, the quality of the information that can be transferred from the artist to the future custodians of an artwork depends greatly on the level of mutual understanding. A collector or institution that wants to own, install, or conserve a media artwork will have to know enough about the technology, inherent change, and preservation risks to pose the right questions and ask for the necessary physical and digital materials. An artist, in turn, has to understand the preservation needs behind requests for certain information, preservation formats, or other deliverables. Even before a work is collected, artist studios, galleries, and distributors can play an important role in documenting the conceptual identity of an artwork so that collection stewards learn about the ways in which a work may or may not change with different exhibition spaces, obsolescing technologies, and changing audiences.

17.2 Is Media Art Intended to Last?

JOANNA PHILLIPS: Media artists choose to work with ephemeral technologies. How concerned are they with the longevity or long-term preservation of their artworks? Is long-term viability even something that is considered when media artworks are produced?

SIEBREN VERSTEEG: The answer for me is yes. Preservation already became an issue at the very beginning of my career. My first few sales were pieces such as Dynamic Ribbon Device (2003), which scraped live data off news websites (see fig. 17.2). And I saw immediately the potential for the software, a proprietary package called Director by Macromedia, not to be sustainable, and the necessity to acknowledge that fact and inform any potential collector about it. That said, I did what I could with the knowledge that I had to support and maintain the work. And that support process has since grown quite a bit in terms of its quality and sophistication. I have updated many pieces from that particular era of my production, one just in the past year, and it continues to run without any major hitch or problem. When you take something that is ultimately pretty ephemeral and performative and nuanced, and it goes into any kind of collecting circumstance, it becomes a commodity or an artifact. So maintaining and updating the work is definitely part of the process, even the creative process, as is somehow capturing and contextualizing information about the piece.

GABY WIJERS: The artists we work with at LIMA are all more or less interested in the longevity of their work but do not necessarily have the time or interest to put a lot of effort into it. Some are very interested and wish you good luck! And others put in a lot of effort and join all the discussions and work on solutions and approaches, like, for example, incorporating measurement controls in installations so that you know beforehand when a lamp will stop


Collaborating With Media Artists

Figure 17.1 The roundtable participants in discussion on collaborating with media artists to preserve their art. Screenshot: Joanna Phillips

Figure 17.2 Siebren Versteeg, Dynamic Ribbon Device (2003), Edition 1/10. Internet-connected computer program output to plasma screen. Dimensions variable. 2019.116. Gift of George Bucciero and Pamela Lui. Photo: Siebren Versteeg, © Courtesy of the Museum of Art, Rhode Island School of Design



operating. Some artists also develop an interest in preservation at a certain age or once they have a certain body of work.

LENA STRINGARI: Yes, I agree. In my experience at the Guggenheim, I see two groups of artists. There is a distinction between artists who make work that involves technology, who understand the technology and utilize it in their studio process (for example, writing code as an artistic practice), and those artists who make art by delegating the technical production, perhaps to studio assistants, computer engineers, or someone else who codes their work. These artists don’t necessarily understand the technology underlying the work. And in my experience, these very different types of artists also take different levels of interest in the preservation of their works. The artists who understand the technology and who really are the creators of the source code, for example, often also take a deeper interest in the preservation of their work.

MARK HELLAR: I agree with Lena, and I think Casey Reas is a good example. He has even written on “How to conserve my code” (Reas 2020). And Rafael Lozano-Hemmer has written a sort of manifesto on GitHub (Lozano-Hemmer 2015) about instruction-based conservation rules. As someone who gets hired to produce artworks for media artists and works in conservation at the same time, I’m always thinking about how a piece is going to run for the next 50 years. But the dynamics of the production, the studio work, don’t always support conservation considerations. Because when you’re in the studio and there’s a project, there are a lot of deadlines, and there’s a budget. One example is Shadow Stalker, a very complex project I did with Lynn Hershman Leeson for the Shed in New York City in 2019 (Hershman Leeson 2018–2021).
Visitors would type their email address into a tablet on the wall, and the piece would go out, gather all of their data off the internet, and stream it on the walls of the Shed inside their shadow, which was live-produced with Kinect cameras. I mean, it’s pretty high pressure to get a show programmed and coded up and running. It’s in the New York Times, you have a deadline, you have four weeks to install it in New York after building it in a studio in San Francisco. You’re trying to stay true to the artist’s vision. There’s not a lot of time in that period for me to say, “Oh, we really have to think about putting this in GitHub and writing all the comments for the code.” And sure, conservation is going to be in there, but that usually comes post-project. I actually wrote a VoCA article, “Artists and Technologies” (Hellar 2019), in which I explore this dual role I’ve had working in the conservation department at SFMOMA while also working in the artist studio. And these roles sometimes conflict.

LAUREN CORNELL: As a curator, I’ve worked with many video artists, many time-based artists. And I see over and over again people producing media and digital work without thinking about its longer life. And then I see, over and over again, this work obsolescing and becoming hard to retrieve. It is the same with arts organizations that produce websites for exhibitions or projects. I always wonder: what is the preservation plan? So I see people really needing to understand the life of media art and how to plan for inevitable obsolescence. A larger point is that the history of digital art is deeply troubled by the fact that artists and institutions haven’t been able to properly conserve works. And so there’s a lot of missing artwork. There’s a lot of missing information, missing stories, missing experience, and it leads to a kind of collective forgetfulness about an important history. And so, conservation is a very important issue to address.



17.3 Transferring Ownership by Transferring Knowledge

JOANNA PHILLIPS: If media artworks are intended to live and be cared for, what can artists, galleries, and institutions do to better prepare artworks for a sustained life? How can preservation-critical materials and information be transferred to collectors, regardless of who will be in charge of the maintenance in the future?

STEVEN SACKS: As a gallery that has focused on media art for around 20 years, we at bitforms definitely connect with the artist in a deeper way before the works are released to the world. We are pretty assertive early on to make sure that the artist is aware of what needs to happen for the work to survive and to be presented over the long term. Depending on the complexity of a work, a range of pre-collection steps occur. If it’s just video art, which for us is almost traditional media today, it’s quite easy. We make sure that the artist is attentive to a master file and a play file and that no flaws are introduced in the copying process. But as things get more complex, obviously using software and bringing in kinetic movement, then we really have to work closely with the artists on what their vision is for the future of the work and how they look at migration of that work, because as we all know, the technology will fail at some point. We usually interview the artist, take notes, and try to get as much information as possible on the conceptual foundation of the work and how migration is built into the idea of the work. We try to figure out how this work can be handed off, whether to a museum or a private collection. But recently, we introduced a more serious document. It’s a four-page contract (see Appendix 17.1), which includes a description of the intent and conceptual foundation of the work. It lists the equipment provided, and it lays out the responsibilities of both parties. So that’s a document that we now actually have the client sign. This creates awareness already prior to the acquisition.
LENA STRINGARI: I have a question for Steve: one of the things that we encounter regularly at the Guggenheim is that we receive gifts from private collectors. So, we’re not actually acquiring the work at the outset, and we don’t have the usual communication directly with the artist or the gallery to request or negotiate deliverables. And the private collector often doesn’t have a master file or doesn’t have a non-exclusive license, in which case we don’t even really have the rights to show the work. So, these are things that we keep coming up against when we receive gifts. And I’m just curious what you think about that—and have your protocols changed over the years?

STEVEN SACKS: Absolutely. It’s a big problem. Many collectors aren’t always sure what they’re buying, and in the past, they have definitely not kept some of the documentation materials we handed over to them. Some of them have even misused works—for example, by running them 24/7. This lack of awareness causes problems for conservation, for people like you, but also for us and the artists. This is why we now have this official agreement that is signed by both parties. Most people are going to read and keep a signed document, and if the work is gifted to a museum or an institution, the document would hopefully move over to the institution.

LENA STRINGARI: What I have also seen over the years is that we as museum professionals play an important role in generating necessary information or prompting the artist for certain answers and the right deliverables. The more we know about what we will do to preserve technology-based or media art, the more we can engage the artist, because then they gain a sense of trust and they understand what we are trying to do. It then becomes a true collaborative exchange, a nice back-and-forth.


SIEBREN VERSTEEG: I absolutely agree with Lena: how much information gets transferred to the collecting institution depends on the level of engagement that it is able to have. Information can only be transferred if there is a receiver. And if there is a knowledge base at the collecting end, I think that really determines how deep that discussion about future conservation can go. In my practice, different institutions’ engagement with my work has contributed a great deal to how I’ve moved forward and developed conservation techniques in my work as well. Of course, like Mark was saying, sometimes things are rushed, and sometimes it’s a race to the exhibition opening in a short timeframe, and things might go out pretty sloppy, but usually before something goes out in a more formal way, I try to comb through all the code and make sure everything is pretty clean and commented in terms of what needs to happen or what the expectations of certain functions are. So I think, again, it’s an evolving discussion, and it grows in complexity as all the parties that are involved are able to engage more with the many nuances of preservation.

MARK HELLAR: I actually have a good example of the impact of institutional engagement on the quality of knowledge transfer. Lynn Hershman’s piece we did for the Shed was acquired by two museums, and I was involved in the acquisition process on the studio side. At the Whitney, Christiane Paul was the collecting curator, and Savannah Campbell, a dear colleague, is the Media Preservation Specialist. They came forward with this amazing questionnaire about software-based art, and Lynn actually hired me to fill it out because it was so extensive. It covered a lot of bases, but there were even more aspects that needed to be conveyed to the museum. The piece is very complex. We are looking at a bunch of networked computers. There’s messaging and a publish-subscribe protocol through which the computers talk to each other.
When the piece goes into the network environment of a different museum, it will need to be adapted to the new environment. So it was not enough to document the iteration at the Shed; the Whitney also needed to understand the concept, a high-level description of the work. I had to give a lot of references, there are certain proficiencies required, and I told them, “You can call me, and I’ll walk you through it when the adaptation has to occur.” When this happens with other museums, such as those that don’t have a media conservator, they generally do not know what to ask, so I am careful to guide them through all of the steps of the work and the documentation and not take anything for granted.

DIEGO MELLADO: At Daniel Canogar’s studio, our approach is to produce and deliver as much documentation of an artwork as possible, even if you don’t know whether the person on the collecting side will be able to understand it. The more you give now, the higher the chance that somebody, maybe in the future, can understand your notes. It’s also important to acknowledge that documentation means different things to different people. Right now, in the studio, we are eight people, and as the engineer, I approach documentation from an engineering point of view: I describe the artwork and its functions, identify which equipment or technologies work for the piece, and what could be a substitute. Our programmer focuses much more on having clean, well-documented code. So, together we try to collect all of this information in one document, an extensive artwork manual that goes much further than traditional installation instructions. Steven Sacks, as Daniel’s gallerist, was actually one of the people encouraging us to develop this in-depth documentation and to add more and more information. One of the most recent manuals we made is for Billow V (2020) (see fig. 17.3). These are not just instructions on how to install or operate a piece.
The manual also includes a compilation of problems that could occur and solutions for troubleshooting. What I like most about all of these manuals is our solution for describing how the artwork actually works. What is the idea behind the artwork? What does the algorithm do behind


Figure 17.3 Daniel Canogar, Billow V (2020). Installation view at the Colección ING in Madrid. Photo: Jorge Anguita Mirón, © Courtesy of Daniel Canogar and Colección ING, Madrid

the artwork? In addition to textual descriptions, we also offer diagrams that visualize the logical makeup of a software-based artwork (see fig. 17.4). In addition to the manual, we also provide a comprehensive file directory that includes documentation materials, the executable and source code for the work, as well as programs and project files that support future system recovery (see fig. 17.5).

GABY WIJERS: At LIMA, our approach for artists we represent is similar. We have our Digital Repository for Artists and Collections, and we encourage the artists to use our Artwork Documentation Tool (LIMA 2017), an online database we developed at LIMA in 2016–17 that we use to empower artists to feel in control of documenting their own artworks so that they are able to show them now and in the future. The tool is open for everyone to use; the information is not shared. LIMA is not in control; artists can use the tool and their own data according to their needs. We represent over 500 artists, and we have over 3,500 works in our collection. So, we don’t have the capacity to create the kind of in-depth, detailed documentation for each artwork that Diego is talking about. We do so for research purposes on the basis of case studies. We don’t sell works; we distribute them. Whenever we have new works in distribution and new artists, we encourage the artists to be aware of preservation issues and to consider future presentations. In addition, we work with our artists in a workshop format. We also teach media art documentation at art academies to raise awareness among emerging artists of how to deal with future presentations and how to manage change if, for example, museums change the architecture of a particular room or technologies become obsolete. When it comes to more complex artworks, for example, if they involve interactivity, we also conduct video interviews with artists and have them explain the behaviors of the work. Most manuals are based too much on current or past situations or iterations when you

Figure 17.4 A logical diagram that illustrates the programming flow cycle of Daniel Canogar’s Billow V (2020). Diagram: Diego Mellado, © Courtesy of Studio Daniel Canogar

Figure 17.5 The directory of files for Daniel Canogar’s Billow V (2020), prepared by the artist studio for handover to the new owner, includes the source code and other elements necessary to enable the future conservation of the work. Screenshot: Diego Mellado, © Courtesy of Studio Daniel Canogar


actually want to look beyond that into the future. So we tend to ask artists more often now about their creative process, about making the artwork and reinstalling the work. Iterations differ because there are often many possibilities. Documenting the decision-making really well can inform how we deal with the work in the future. So when we interview the artist, we always ask which installation they preferred, and why. That really brings out a lot of information.

DEENA ENGEL: So almost a technical art history that can then inform the future?

GABY WIJERS: Yes.
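A handover file directory like the one Diego Mellado describes (see fig. 17.5) can also be supported by simple tooling. As one hedged illustration, not a practice documented in this chapter, a fixity manifest generated at handover would let a future custodian verify that the source code, executables, and documentation arrived and remained intact; the directory name below is hypothetical:

```python
# Sketch: build a SHA-256 manifest for an artwork handover directory so a
# future custodian can verify file fixity. The directory name is hypothetical.

import hashlib
from pathlib import Path

def build_manifest(root):
    """Map each file path (relative to root) to the SHA-256 digest of its bytes."""
    root = Path(root)
    return {
        str(path.relative_to(root)): hashlib.sha256(path.read_bytes()).hexdigest()
        for path in sorted(root.rglob("*"))
        if path.is_file()
    }

# Example usage (commented out; "Billow_V_handover" is a hypothetical path):
# for rel_path, digest in build_manifest("Billow_V_handover").items():
#     print(f"{digest}  {rel_path}")
```

Regenerating the manifest years later and comparing it against the stored copy reveals any missing or silently corrupted files before a reinstallation is attempted.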

17.4 Gathering Distributed Knowledge

JOANNA PHILLIPS: Lena just spoke about artists who delegate the technical execution of their work, and Diego mentioned the engineers and programmers at Daniel Canogar’s studio. Preservation-critical knowledge about an artwork can be scattered across many people who may not even have a stake in collecting or conserving the work. How can this distributed knowledge be gathered?

MARK HELLAR: Definitely speak to those who fabricate and produce work for artists. I have worked with a lot of artists who draw sketches and email me in all caps at three in the morning. And at times I have to do a lot of creative interpretation, because you have to know the technology to make decisions. Sometimes I have to say, “That’s not possible.” And when I talk to certain artists about the technical concerns of preserving their work, it doesn’t always get through. Of course, as a contractor, I’m not going to be with the studio forever.

GABY WIJERS: At LIMA we are increasingly researching and using distributed knowledge platforms (for example, wikis) where not only we or the collector or the artists but also others can join and contribute. And I think that’s one of the newer developments: knowledge-based sharing on distributed platforms. We have this collaborative approach with over 30 collections, but we also do it more often in research.

LENA STRINGARI: Even after an artwork enters a collection, knowledge about the artwork continues to be produced by many stakeholders. This is really part of Joanna’s amazing work from when she was at the Guggenheim; we now document our own decision-making and rationale across museum departments (Guggenheim Museum 2012, 2014; Phillips 2015).
And what we see over the years is that no matter how well you document something, works morph—they morph for various reasons: for budgetary reasons, when the artist changes their mind, when the exhibition space changes, when the curator wants to adapt a work; there are all kinds of ways things transition over time. By capturing the rationale and the decision-making, and by documenting all of the different stakeholders that are involved, including the legal and finance departments, you get a real sense of the dimensionality and of what happens over the life of an artwork.

DIEGO MELLADO: I guess the question is: what do you want to conserve, and what do you want to document? In the case of Daniel Canogar’s work, we want to transfer the ability to replicate the behavior of the artwork. But maybe somebody like Gaby would prefer to document the reactions of the public, or Joanna maybe would prefer to document what changes from one artwork iteration to the next, or maybe somebody is interested in documenting how that artwork represented a given historical moment. Every piece of documentation has a goal. So we need to ask: What is the purpose of my documentation? What is it that I want to conserve? Different stakeholders and different institutions will answer this question differently. One solution for gathering these perspectives is a wiki



approach to artwork documentation. Of course, this also opens the question of how much stakeholder involvement the institution wants to allow. How much does the owner of a work want to open up the conservation question? And with that many views, it perhaps also becomes more difficult to agree on the most important parts of the artwork that need to be conserved.

LENA STRINGARI: Perhaps, along with all the documentation that we do, there should be simple terms that define the work and its behaviors. That was what we proposed at the Guggenheim with the Variable Media Initiative (Variable Media Network n.d.) 20 years ago. We attempted to define an artwork in terms of its behavior (contained, installed, performed, interactive, reproduced, duplicable, encoded, networked), after which a strategy could be employed (storage, emulation, migration, reinterpretation). Although each artwork has its own idiosyncrasies and exigencies, this format enabled a conceptual framework for decision-making. When you’re sifting through all of this documentation, it is sometimes difficult to figure out: what is the most important aspect of this piece? What is the thing, the kernel-like essence that needs to be preserved? Is it spatial form, color, sound, smell? Or is it the computer code itself?

17.5 The Significance of Exhibition for the Preservation of Media Artwork

JOANNA PHILLIPS: I would like to talk about the display of a work as a driving factor for its maintenance and conservation. There are only so many future scenarios you can imagine when you fill out a questionnaire or write a manual. The real world catches up with the work once it gets installed at a given point in time, with new technology at hand, in an unforeseen space, and under a future budget. How important is the repeated activation of the work for sustaining its collection life?

LAUREN CORNELL: The opportunity of exhibition is so important for the preservation of both new, untested artworks and of legacy works. When I collaborate with artists, I always encourage them to think very carefully about how they want their work shown in a gallery space and to document this. And I always tell them: if you don’t think about these questions very carefully and spell your requirements out, then the institution or the curator will make the decisions for you, and you might not be happy with them. In the past couple of years, I’ve worked with two significant video artists: Nil Yalter, on her exhibition Exile Is a Hard Job, which originated at the Museum Ludwig in Cologne and traveled to CCS Bard, and Dara Birnbaum, whose retrospective I am curating now. In both of those cases, exhibiting the early works prompts very exciting questions about how these works, made on older formats, translate into new media and can be shown with contemporary technology. Exhibition provides an opportunity to think through all of these questions: What was the original mode of this work? How can it be re-executed? It’s also time-consuming and can be expensive. Yalter’s early works were made on Portapak and had fallen out of access, but then the Bibliothèque Nationale de France digitized those works, and they started to be seen again, and her place in art history could be better written and appreciated.
LENA STRINGARI: I’m always telling our conservators and curators that “exhibition is preservation.” If no one exhibits the work, there are limited resources, and it is difficult to argue for funds to restore something that doesn’t get activated or will never be experienced. So, I have this dream of a large space where all tech-based works are running so that we can see



in real time if they are functioning or require attention, where students use the collection to learn about preservation and devise strategies that can be shared with artists, collectors, and other institutions.

SIEBREN VERSTEEG: I have a piece at the Yale University Art Gallery that is up and running in storage. It’s a news ticker piece, and they enjoy having it running, which is great. And I absolutely agree: a work’s conservation has a lot to do with the consistency of exhibition, of being displayed. I have many works in collections that I assume must be buried in crates somewhere in storage. Those works simply don’t get the updates that they need and should have. On the other hand, there are collectors who consider the work very important to their day-to-day. And those collectors become just as much stewards of the work as I am. So oftentimes, say, a data-scraping feed will go down because a website changed its HTML code or some of the hooks changed that I was using to access information, and oftentimes collectors will know that before I do, and I’ll get a phone call or email. At this point, I consider that engagement to be a significant part of my occupation, my career, along with the attempt to have as much precognition as possible about where and what might go wrong in the future and to write myself notes in the software that I’m going to be able to decipher and access again in the future. I’ve definitely gotten better at that, although I still write a lot of spaghetti code!

LENA STRINGARI: So, I guess I’m asking: if we know that artworks are going to go into the deep storage of collections that cannot activate the pieces frequently enough, should we still collect them?
I don’t know, because as we’re currently working on this Jenny Holzer work from 1989, Untitled (Selections from Truisms, Inflammatory Essays, The Living Series, The Survival Series, Under a Rock, Laments, and Child Text), I find that the resources you need to go back to 1989 and reconstruct a piece are considerable. So, I think it would take a huge effort to try and change the way we collect, but I think the way we do things right now is not really working in favor of these artworks.

LAUREN CORNELL: I think that curators have to have a closer relationship to digital, time-based, and media artworks than they would with other mediums, because these works do require constant care and maintenance. So, I’m all in favor of collecting fewer works and maintaining a really strong commitment to them over collecting many works and not having the resources to give them the attention that they require.

GABY WIJERS: I know a few artists who sell their artworks with a contract that requires the work to be on display every five years. And I know some artists who request in the purchase agreement that their studio or a technician should check on the work every three years to see if it still runs. They advise the museum to ask their assistant to come to the museum and run it every few years. So there are many examples of artists who want to preserve their own oeuvre for future display. I am thinking of Rafaël Rozendaal, for example, who keeps himself in charge of the preservation of his work.

17.6 When Artists Maintain Their Own Work

JOANNA PHILLIPS: Many institutions that collect media art still don’t have specialized conservators on staff, which can often mean that the expertise and authority to maintain these artworks remain with the artist. Gaby just mentioned a scenario where artists even request to stay in charge of the preservation of their works. How does a gallery like bitforms facilitate the maintenance of collected works through artists? Is there any contractual paperwork to organize the care through artists?

STEVEN SACKS: We definitely put in the contract that artists, or the gallery that represented them at the time, should be contacted first for approval if anything is changed or updated


on the piece. Of course, if the artist is not available, then the owners need to follow the guidelines that we’ve given them. In some cases, when there’s physical hardware, we include some type of warranty in the contract that goes beyond a manufacturer’s warranty; if it’s a custom-made work, the manufacturer’s warranty is typically void. In most cases, we offer a one-year guarantee, and after that, the owner would need to pay for the services to fix the work if the hardware were to fail.

Another really big evolution in terms of real-time artist maintenance is remote access to the works using tools like TeamViewer. Siebren does this, and so do Canogar’s studio and even Rafael Lozano-Hemmer. It has been an incredible help for maintaining and repairing any problems, and it gives the collector a lot of confidence. We’ve done this numerous times now, and it’s been great.

SIEBREN VERSTEEG: I’ve been using a remote desktop on Chrome, but I don’t love it. It has a lot of update needs, and that winds up causing more problems than it’s really worth, but it has gotten me out of a few jams. So, I think if there were a desktop app, that’s something definitely to consider.

DIEGO MELLADO: At Canogar’s studio, we request to have remote access to artworks. Unfortunately, it depends on the institution: some institutions are okay with TeamViewer, but other institutions or collectors require more stringent privacy settings. And of course, it helps us a lot: we are based in Spain and can have control of artworks all around the world. But it’s a double-edged sword, because it puts the care of the artworks on the backs of the studio, too, and that consumes a lot of time. Writing manuals for every single thing that could fail, in every single time zone that is potentially our market, demands a lot of effort from the artist studio.
So, artist studios tend to have two different roles right now: they have to make art, but they also have to support and maintain it.

And depending on whom you’re talking to, conservation can mean different things. For example, for me, conserving the artwork that I produced for Daniel could mean something different from what conservation means for Joanna or for our collector. In the case of Daniel’s work, the computer or the technology or even the programming language isn’t that important, as long as the idea of the artwork, the algorithm that runs the artwork, is conserved. But for other stakeholders, it might be significant to conserve a specific kind of programming language that is typical of his oeuvre. So, whoever carries out the conservation has to be aware that it’s important to find some common ground on what the artwork is and what needs to be conserved.

17.7 The Professionalization of Media Conservation

JOANNA PHILLIPS: Over the past 10 or 15 years, the emerging field of media conservation has evolved considerably. Institutions have started to dedicate specialized media conservation staff to their collections, independent media conservators are consulting for private collections, and the overall awareness of standards has risen. Do you think that the professionalization of time-based media conservation has impacted curatorial collection practices in any way? Now that we have developed expertise in caring for these works, are curators and collectors more confident and courageous about buying media art?

STEVEN SACKS: It absolutely has helped, and the level of confidence has grown a lot. I think any serious collector today will ask a number of questions early on that relate to conservation and preservation issues. Again, I’ve been in business now for 20 years. It was very different back then; there wasn’t even as much media art being offered for collecting. Today,


we have this foundation and level of confidence that can be conveyed to the collector. It is really important to be able to speak about the future integrity of the work and to have these answers during the acquisition process.

LAUREN CORNELL: I’ve definitely seen the impact of those collection and research initiatives at exhibitions in several institutions. I’ve also seen more wonderful older time-based works come out of the MoMA collection, for instance, but I think we still need more projects and resources to share knowledge and best practices for curators. As the director of a curatorial master’s program, I’m trying to build more dedicated teaching around the conservation of time-based media and digital artworks. I think the larger impetus in the contemporary art field is just to be moving forward and discovering the new, without paying enough attention to the question of how to conserve the new, these pieces that mark our moment. So, speaking as an educator, I feel that there are starting to be more common resources, but we still need more widely shared research, books, and literature on best practices.

GABY WIJERS: There is a professionalization going on, but very, very slowly in the Netherlands. As far as I’m aware, there is only one museum here that has a time-based media art conservator on staff, and there are a few freelance media art conservators. I have always wanted to research why the development in Europe is so much slower than in the States.

JOANNA PHILLIPS: I agree with your observation, Gaby. The development of the field is much slower here in Europe; I think the critical mass is still missing. Across Germany, we only count four museum positions so far, and that makes it hard to produce regular meetings such as the annual meetings of the Electronic Media Group [EMG (Website) n.d.] in the US that really foster exchange and innovation.
At the Düsseldorf Conservation Center, we are in the process of launching a media conservation department now. I hope that we can all meet here in person soon for a symposium or other exchange format. Thank you, everyone, for this inspiring conversation!


Appendix 17.1

bitforms gallery | 131 Allen Street New York, NY 10002 | t 212 366 6939 | www.bitforms.art




Collaborating With Media Artists


Joanna Phillips and Deena Engel

Bibliography

Beerkens, Lydia, Liesbeth Abraham, Stichting Behoud Moderne Kunst, Netherlands, and Universiteit van Amsterdam, eds. The Artist Interview: For Conservation and Presentation of Contemporary Art, Guidelines and Practice. Heyningen: Jap Sam Books, 2012.

Cotte, Sabine, Nicole Tse, and Alison Inglis. “Artists’ Interviews and Their Use in Conservation: Reflections on Issues and Practices.” AICCM Bulletin 37, no. 2 (July 2, 2016): 107–18. https://doi.org/10.1080/10344233.2016.1251669.

Electronic Media Group (EMG). “Electronic Media Group.” American Institute for Conservation, n.d. Accessed May 10, 2021. www.culturalheritage.org/membership/groups-and-networks/electronic-media-group.

Gantzert-Castrillo, Erich, ed. Archiv für Techniken und Arbeitsmaterialien zeitgenössischer Künstler, vol. 1. Reprint. Stuttgart: Enke, 1996.

Guggenheim Museum. “Time-Based Media.” Guggenheim Blog (blog), 2012. www.guggenheim.org/conservation/time-based-media.

———. “What Is ‘Time-Based Media’?: A Q&A with Guggenheim Conservator Joanna Phillips.” Guggenheim Blog, March 4, 2014. www.guggenheim.org/blogs/checklist/what-is-time-based-media-a-q-and-a-with-guggenheim-conservator-joanna-phillips.

Hellar, Mark. “The Artist and the Technologist.” VoCA Journal, March 7, 2019. https://journal.voca.network/the-artist-and-the-technologist/.

Hershman Leeson, Lynn. “Shadow Stalker.” Film, Installation and Website, 2018–2021. www.lynnhershman.com/project/shadow-stalker/.

INCCA. “Guide to Good Practice: Artists’ Interviews (Updated Version 2016).” INCCA (International Network for the Conservation of Contemporary Art), 2002–2016. www.incca.org/sites/default/files/field_attachments/2002_incca_guide_to_good_practice_artists_interviews.pdf/2002_incca_guide_to_good_practice_artists_interviews.pdf.

Ippolito, Jon, John Bell, and Craig Dietrich. “Variable Media Questionnaire, 3rd Generation Beta.” Forging the Future. Accessed May 10, 2021. https://variablemediaquestionnaire.net/.

LIMA. Artwork Documentation Tool (Software). Amsterdam, The Netherlands: LIMA, 2017. www.li-ma.nl/adt/.

Lozano-Hemmer, Rafael. “Best Practices for Conservation of Media Art from an Artist’s Perspective.” GitHub Antimodular, September 28, 2015. https://github.com/antimodular/Best-practices-for-conservation-of-media-art.

Phillips, Joanna. “Reporting Iterations: A Documentation Model for Time-Based Media Art.” In Performing Documentation, Revista de História da Arte—Serie W, edited by Gunnar Heydenreich, Rita Macedo, and Lucia Matos, 168–79. Lisboa: Instituto de História da Arte, 2015.

Reas, Casey. “Conservation.” GitHub REAS/Studio, May 29, 2020. https://github.com/REAS/studio/wiki/Conservation.

Rhizome. “ArtBase.” Rhizome, n.d. Accessed May 10, 2021. https://rhizome.org/art/artbase/.

SBMK. Concept-handreiking Kunstenaarsinterviews. Amsterdam, The Netherlands: Instituut Collectie Nederland, 1999. www.sbmk.nl/source/documents/concept_handreiking.pdf.

“Variable Media Network.” Accessed May 10, 2021. https://variablemedia.net/e/index.html.

VoCA. “Artist Interview Workshops.” VoCA | Voices in Contemporary Art, n.d. Accessed May 10, 2021. https://voca.network/artist-interview-workshops/.


PART IV

Medium-Specific Practices in Time-Based Media Conservation

18 CARING FOR ANALOG AND DIGITAL VIDEO ART

Agathe Jarczyk and Peter Oleksik

Editors’ Notes: Agathe Jarczyk is Associate Conservator of Time-Based Media at the Solomon R. Guggenheim Museum in New York and the owner of the Studio for Video Conservation (Atelier für Videokonservierung) in Bern, Switzerland. Peter Oleksik is Associate Media Conservator at the Museum of Modern Art (MoMA) in New York. In this chapter, the authors bring together their combined experience as practitioners in the field of time-based media conservation and video conservation in particular.

18.1 Introduction

Video artworks comprise the majority of time-based media artworks in museum collections due to the long and rich history of artists engaging with this medium. As video technology has bled increasingly into our daily lives, first via the television, then via various generations of information carriers like cassettes and discs, and now via our computers and phones, artists continually mine this ever-changing landscape and grapple with cultural, ideological, and technological issues through the medium of video. What started out as a tape-based, analog technology has morphed into a purely digital, file-based medium while still engaging with various and ever-evolving hardware technologies, ranging from cathode ray tube (CRT) monitors to flat screens like liquid crystal displays (LCDs) and plasma displays, to projectors capable of monumental images. This constant technological evolution, with older technologies quickly falling into obsolescence and new technologies emerging at a fast rate, challenges traditional models of care and necessitates a high level of engagement by various stakeholders within the organization to thoughtfully manage how these works will change over time. This chapter explores the history of video as an artistic medium, gives a short insight into collecting practices, and offers practical advice on the long-term care of this volatile material in a collection.

DOI: 10.4324/9781003034865-22

18.2 A Brief Overview of Video as a Medium in Artists’ Practice

Tracing the history of how artists have explored the medium of video helps to better understand how a collection may have taken shape and helps to illustrate how the artistic use of video has


evolved over time. Here we briefly illustrate the 50-plus-year practice of video as an artistic medium and dive deeper into various technologies later in this chapter.

Artists began their initial explorations into the medium of television in the 1950s via the manipulation of hardware, well before the introduction of videotape. Wolf Vostell and Nam June Paik were among the earliest experimenters in the televisual medium. Paik began by physically manipulating the cathode ray tube itself, perhaps most famously by collapsing the image to a single line on the television screen in Zen for TV (1963) (Elwes 2005, 25). The instantaneous display of a video camera’s view onto a monitor, known as the closed-circuit installation, was another way artists began to use the medium creatively. These performative closed-circuit installations are considered early examples of video art even though in most cases the signal was displayed live and not recorded to tape. Some artists utilized and mastered the equipment’s specific characteristics, such as the lag in the early CRT tubes in Vidicon inscriptions (1974) by David Hall and Videozeichnungen zur Präsenz . . . (1980) by Hannes Vogel. Other artists modified equipment in collaboration with engineers [Jean Otth worked with engineer Serge Marendaz for Hommage à Mondrian (1972)] or built and invented new tools, as Steina and Woody Vasulka did for their pioneering works Matrix I and II (1970–1972) (High et al. 2014, 287) and Violin Power (1969).

The introduction of the first portable video recorder in 1965, Sony’s legendary Portapak, catalyzed artists’ use of the video medium (Marshall 1979, 109). The ability to record the televisual signal completely revolutionized artists’ practice with video, as it allowed for instantaneous viewing without the tedious process of photochemical development required by motion picture film.
Artists began to record their live performances and events, such as Joan Jonas with Organic Honey’s Visual Telepathy (1972) and Bruce Nauman with his many works exploring his studio in the late 1960s. Early video documentary practices emerged as artists began to film in a vérité style in the streets, documenting life around them, such as Les Levine’s Bum (1965), which portrays the Bowery in New York City in the mid-1960s.

The late 1960s saw the first experimentations in a broadcast context. In 1969 Gerry Schum broadcast his “television exhibition” Land Art on SFB in Berlin (Schum et al. 1979, 23) and continued later with his Videogalerie Schum. The TV stations KQED in San Francisco, WGBH in Boston, and WNET in New York established collaborations with artists that influenced the development of new tools (High et al. 2014, 113).

As the medium matured technologically with improved tape formats and technology, professional production and post-production facilities geared toward artists emerged, such as Electronic Arts Intermix (EAI) in New York City in 1971 [Electronic Arts Intermix (EAI) 1999], the Long Beach Museum of Art in 1976 (Huffman 1984, 10), Montevideo in Amsterdam in 1978 (Boomgaard and Rutten 2003) [now: LIMA (n.d.)], the Bay Area Video Coalition (BAVC) in San Francisco in 1976 [now BAVC Media (2021)], 235media in Cologne in 1984 (Buschmann and Nitsche 2020, 154), and many others. In the early 1970s, Maria Gloria Bicocchi’s art/tapes/22 provided a production studio in Florence and also promoted exchanges between European and American artists (Mazzanti 2017, 170). These facilities were crucial in providing artists access to what was still a fairly expensive medium to work with, and most of them were geared toward broadcast or professional settings. Because artists were using professional equipment, albeit in a quasi-professional setting, their works began to adhere to broadcast standards of the time.
This allowed for more uniformity in their practice and, crucially, the ability to disseminate their works both on tape and through television broadcast. Dedicated video exhibition spaces at museums and galleries were established at this time; for example, the Long Beach Museum of Art started to regularly screen video in 1974 (Phillips 2008, 6), and the Museum of Modern Art began its VideoViewpoints screening series in 1975 (London 1997).


The 1980s saw an explosion in the artistic use of video. With the wide, global adoption of VHS in the consumer market starting in the early 1980s (Zielinski 1986, 244), video became increasingly accessible, and artists engaged with the medium in a variety of ways. The artist Gretchen Bender began to stage live, multi-monitor performances, such as Dumping Core (1984) and Total Recall (1986) at the Kitchen in New York City, using video material she had recorded off the air and edited with the new tape-to-tape editing systems. In the 1980s, these works were more actively exhibited and started entering collections with increasing frequency.

In the late 1990s, with the availability of non-linear video editing programs on computers, such as Apple’s Final Cut Pro and iMovie (Meigh-Andrews 2014, 317), and the development of new, miniaturized video cassette formats such as DV, artists’ practice diversified significantly. Some artists moved away from working in a professional production environment and were able to work independently in their own studios, given the new affordability of video. This greatly expanded the creative use of video but also added a level of variability to the final product now that artists were moving away from standardized broadcast video settings. The result is that works were completed using a variety of formats and standards, which meant that the quality of the video signal varied from work to work and artist to artist.

With the introduction of the purely digital video signal in the early 2000s, the previous boundaries of the standard definition (SD) television standards (PAL and SECAM in Europe, Africa, and Eurasia; NTSC in the Americas and Japan) became obsolete: digital file-based workflows were now possible on home computers, and artists could work largely independently, producing works with different resolutions, frame rates, scan types, pixel aspects, and so on.
Some artists worked with low-cost equipment; Petra Cortright, for example, made VVebcam (2007) mostly using her own laptop. Others, such as Ragnar Kjartansson for his multichannel The Visitors (2013), employed an entire production staff.

Video is now more than 60 years old; it began with open-reel tapes, continued with analog and later digital cassettes, and now no longer needs to be produced with a classical setup of a camera, a video cassette recorder (VCR), and a monitor. Today, some artists shoot video with a phone, some build it using animation software or a game engine, others prefer low-tech equipment, and still others apply highly sophisticated tools similar to those of TV or film production. As computing power and video technology continue to advance, new forms of video will be introduced or continue to develop, such as videos generated using new proprietary apps, 3D videos, ultra-high-definition videos, and 360° videos. As in other areas of art, video practices, materials, and tools vary widely and will continue to change and evolve. More examples of artists’ pioneering applications of video technology are discussed in Noordegraaf et al. (2013, 217ff).

18.3 The Beginnings of Collecting and Preserving Video

The history of collecting video is closely interwoven with the history of video production and shaped by the history of distribution. As we discussed earlier, groups of video practitioners and enthusiasts formed in the late 1960s/early 1970s as they began to actively organize and distribute their work. New York State saw a number of these groups, from the Videofreex in the Catskills with The Spaghetti City Video Manual (Videofreex 1975), which was translated into German (Nitzschke and Weller 1975), to the Raindance Corporation in New York City, whose Radical Software zine helped share information about video technology and works (Boyle 1990, 52).

Many of the video artworks that museums collected in these early days were single-channel works on tape intended to be displayed on monitors or to be projected. Some of the works came with requirements on monitor sizes or instructions such as color and contrast settings, but


the majority did not. Collections first began to acquire works in ½″ open-reel format, an early step toward standardization. Soon after, these tapes were replaced by the first successful cassette format, ¾″ U-matic, introduced in 1971 (Zielinski 1986, 174). U-matic cassettes enabled easier distribution and screening of video art until the mid-1990s because the format was portable, relatively affordable, and easy to use. Instead of threading tapes manually, the cassettes could now simply be inserted. The later U-matic players even allowed automatic rewinding and thus made loop playback possible.

Video artworks from this early period were typically sold as unlimited editions, and producers and distributors used different sale models. The Videogalerie Gerry Schum (Schum et al. 1979, 10) sold video works with a certificate that allowed the owner to order a new copy for the price of the technical production costs (Otterbeck and Kunstmuseum Wolfsburg 1997, 68). Other distributors, such as Electronic Arts Intermix (EAI, New York), distinguished between copies for private use and those for institutional use [Electronic Arts Intermix (EAI) and Zippay 1991]. In general, they operated under the concept of “life of tape.” Works were sold at a reasonable price but had a built-in expiration date: the more often a tape was played, the faster it wore out. This model allowed both distributors and artists a nominal revenue stream through the replacement of worn-out tapes.

In these early days of video collection, museums would typically hold a single tape per work. For example, most institutions used a single U-matic tape as their master copy, research copy, and exhibition copy, resulting in wear and tear. Around the mid-1980s, many artists also used VHS and S-VHS as exhibition and master formats, and later in the decade, the first optical media format, the LaserDisc, was launched.
The production of LaserDiscs was a sophisticated and costly process, so the format was mainly reserved for artists with renowned galleries. As an optical medium, the LaserDisc could be read without contact and promised to be free of wear and tear, offering for the first time the possibility of playing a video in a seamless loop.

During this time, initial efforts were undertaken to preserve videotapes (London 1997). These early projects focused on the treatment of videotapes, such as dehydrating tapes that exhibited signs of binder hydrolysis, the “sticky-shed syndrome” [Van Bogart 1995, 5; Bertram and Cuddihy 1982; AMIA (Association of Moving Image Archivists) 2002]. Videotape is composed of a polyester base with a coating of magnetic particles bound to the tape via an adhesive binder. When introduced to moisture via humidity, this binder begins to degrade, and the magnetic particles become unbound from the tape, “shedding” when the tape is played. For further reading on the anatomy of videotape, see Van Bogart (1995) and AMIA (2002); for in-depth explanations of tape deterioration, see Cuddihy (1980) and Bertram and Cuddihy (1982). A grassroots effort to safeguard and treat this material evolved as artists and practitioners regularly traded tips and tricks through various publications. One of the earliest published attempts to restore videotapes, by artist Tony Conrad, can be found in the 1987 edition of the magazine The Independent (Conrad 1987).

In the 1990s, galleries entered the distribution market in greater numbers and introduced the edition model to video art, creating scarcity by offering, for example, an edition of three instead of unlimited editions, in order to drive up sales (Balsom 2016, 97). At the same time, many video artists shifted from the predominant single-channel format to creating more complex video installations, made possible as video projectors became more available and affordable.
As a result, artists started to request specific equipment to display these installations, including projectors, monitors, speakers, and beyond. With this push toward specific environments and equipment for video works, comprehensive installation manuals by artists began to appear in the late 1990s in response to institutional demands (Balsom 2016). These manuals contain not only precise lists of materials and equipment but also complex instructions for installation. At


first glance, these manuals appear to be complete and unambiguous; however, in practice, they can be contradictory or quickly outdated and must always be understood in the context of their time.

As video was maturing as a medium in the 1990s, concurrent efforts were undertaken to better understand and preserve it. EAI started its first dedicated video preservation program in 1988 (Boyle 1993, 12), and others followed in the early 1990s, such as Montevideo/TBA in Amsterdam (Huisman and Mechelen 2019, 143). The first conferences on the preservation of video art emerged in subsequent years, such as the conference “How Durable Is Video Art?” at the Kunstmuseum Wolfsburg in 1996 (Otterbeck and Kunstmuseum Wolfsburg 1997); the “Playback” conference in 1997 (Fifer et al. 1998) and “TechArcheology” in 2000 (Laurenson 2001; Vitale 2001), both held in San Francisco; “Video Arts in Museums” in 2000 in Cologne (Mißelbeck et al. 2001); “Present Continuous Past(s)” in Bremerhaven in 2004 (Frohne et al. 2005), followed by “40yearsvideoart.de” in 2005 (Herzogenrath 2006; Blase et al. 2010); and “Reconstructing Swiss Video Art from the 1970s and 1980s” in Lucerne (Schubiger et al. 2009a, 2009b).

In their 1997 publication, conservators Otterbeck and Scheidemann observed that the “disintegration of magnetic tapes in archives, the breakdown of playing equipment, a lack of replacement components, complicated maintenance and the rapid development of image processing demands from the restorer an intensive consideration of this comparatively young—only thirty-years-old—art form and technology” (Otterbeck and Kunstmuseum Wolfsburg 1997, 9). While the focus of the early conferences was still very much on the materiality and preservation of data carriers, the later efforts focused on preserving the conceptual aspects of the artwork and understanding the artwork’s identity.

As video crossed into the new millennium, digital video quickly dominated both artistic practice and collections.
As video became more accessible, artists worked with the medium more frequently, regularly distributing or editioning their works through traditional gallery representation, distributors such as EAI, or new platforms of exhibition and display in the digital realm. Analog video material in collections had to be migrated to fend off obsolescence. Initially, there was skepticism about digitization as a preservation action, as only a few digital tape formats were available (Boyle 1993, 26) and file-based digitization was a cumbersome process given the computing technology of the early 2000s (Otterbeck and Kunstmuseum Wolfsburg 1997). However, as Digital Betacam became more affordable and widely available, collections and galleries began to adopt it as a professional master format for standard definition (SD) video. Digital Betacam was not only used as a preservation master; collections also demanded it as a standard for acquisition. Looking critically at this practice today, it must be noted that the desire for the best possible archival format obscured the different production practices of artists. With the advent of high definition (HD) video, this practice was repeated for a short time through the requirement for HDCAM and HDCAM SR, but these, in turn, were soon abandoned due to the imminent obsolescence of tape formats.

In the late 2000s, file-based video became the predominant mode of video production and delivery: it arrives at the museum on removable media, such as hard disk drives (HDDs), thumb drives, and solid-state drives (SSDs), or increasingly over the internet, and in a variety of formats. All tape-based materials today are migrated to digital file formats to ensure preservation. Video collections now are, therefore, primarily digital, with the original analog materials serving as backups and research copies, and the needs of the video collection are generally governed by digital preservation practices.
Video collections today often contain a variety of materials that span the rich history of this medium, posing unique challenges to the collection care staff tasked with managing them. The following sections offer practical advice on how to approach and care for video artworks,


whether analog or digital, single-channel or installation, with guidance that can be applied to both current and future collections.

18.4 Collection Practices Today

Collection care of time-based media artworks is an ongoing process with a number of key moments. Acquisition and intake lay the groundwork for creating comprehensive documentation on the concept, production, and display of video artworks. When not on display, the artwork exists as stored analog or digital components, artist-provided instructions, and documentation created by the institution. To make this documentation a comprehensive body of knowledge, it is fundamental to acknowledge that the preservation of time-based media artworks is a shared responsibility and effort. Conservators must collaborate with artists, curators, registrars, technicians, and many other staff members in and outside of the museum in order to build a complete understanding of a work. Research and information sharing define the pre-acquisition phase, condition control is important at collection entry, and digital preservation and storage are key for the long-term preservation of a work.
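In practice, the digital preservation and storage mentioned here rest on fixity checking: recording a checksum for each master file at ingest and re-verifying those checksums on a schedule, so that silent corruption is caught while a good backup copy still exists. The following Python sketch illustrates the core idea; the function names, file names, and manifest layout are our own illustrative assumptions rather than a standard described in this chapter, and institutions would typically use dedicated digital-preservation tools rather than ad hoc scripts.

```python
import hashlib
from pathlib import Path


def checksum(path: Path, algorithm: str = "sha256", chunk_size: int = 1 << 20) -> str:
    """Compute a checksum in chunks so large video masters never sit in memory."""
    digest = hashlib.new(algorithm)
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()


def verify_manifest(manifest: dict[str, str], root: Path) -> list[str]:
    """Return the names of files whose current checksum no longer matches the manifest."""
    return [
        name
        for name, recorded in manifest.items()
        if checksum(root / name) != recorded
    ]
```

At ingest, a record such as `{filename: checksum(path)}` would be stored alongside the media; a scheduled job then calls `verify_manifest` and flags any mismatches for restoration from a backup copy.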

18.4.1 Pre-acquisition Process: Defining Deliverables with the Artist

As we have illustrated earlier, a video work can be associated with a long history of formats and technologies or, for a new work, a relatively short one. Knowing how the work has entered the collection and, more importantly, in what video standard and format is critical for the long-term preservation of the work. Artists work in their own preferred style, and how the video takes shape can vary from artist to artist or work to work. It is the receiving institution’s responsibility to shepherd each work into the collection in a manner that is both faithful to the work and best prepares it for its life within the collection. To achieve this, the authors recommend that institutions and collectors request video works in both a master and an exhibition format. The master should ideally be a direct copy of what the artist considers to be the best version of the work, with the least compression, so that it can be used to derive copies in the future. The exhibition format should be an example of the work as the artist has shown it previously or prefers it to be shown, which can either be used in future exhibitions or serve as a reference when the institution inevitably needs to make new exhibition copies for future display.

The pre-acquisition questions below will help you understand the artist’s production process: what went into producing the piece, what the master and exhibition formats should be, and invaluable context for later condition assessments, when visual and aural anomalies or questions arise. Here are sample format questions:

1. How was the work shot or recorded (e.g., camera types, file formats of video and audio)?
2. How was the work edited (e.g., Final Cut Pro 10.4.1, Adobe Premiere Pro CC 2015)?
3. What is your master format (e.g., Apple ProRes 422 HQ)?
4. What is the video resolution (e.g., 1920 × 1080 px)?
5. Which file format(s) have you used for exhibition (e.g., H.264)?
6. What type of sound arrangement does the work have (e.g., mono, stereo, 5.1 surround)?

Caring for Analog and Digital Video Art

The answers to these questions will help guide you and the artist in selecting the formats that would be best for a master and an exhibition copy. For instance, the artist may not be aware that a compressed file format like H.264 is not a good master format. Based on the information the artist gives you, however, you could work with the artist to see if the editing software's project file still exists so that a less compressed file could be produced and brought into the collection. The rationale is that the less compressed version will be better suited for preservation in the long run. This will vary from work to work and artist to artist, so familiarity with the workflows of video production and post-production is key in these conversations to tease out the nuances of file formats and determine these ideal deliverables. If you are not familiar with video production, turn to people who are, such as the audiovisual staff at your museum or organization or time-based media conservators working in the field.

Acquisition deliverables also include critical information, such as installation instructions and guidelines on handling the variability of an artwork. During the acquisition process, collection caretakers work with artists to ensure that this information is generated and/or transmitted from the artist to the museum. It is not uncommon for video works to have a variety of display parameters due to the flexible nature of the medium; however, some artists may only be interested in one specific aspect of display. For this reason, these questions prove critical in the initial acquisition phase of these works. Here are sample display questions:

1. What are the preferred tangible aspects of the work's display (e.g., lighting, paint color on the walls, type of room)? Answers to this question will guide future exhibitions of the work and build the identity of the work in your collection.
2. Do you have any guidelines or preferences when it comes to display (e.g., showing the work projected, on monitors, with speakers or headphones)?
3. If projection is acceptable for display, do you have a maximum or minimum projection size?
4. Do you have any guidelines or preferences for equipment used to show the work (e.g., manufacturer/model or particular specifications)?

Sample documentation templates can be found online; see "Matters in Media Art" (2015), Metropolitan Museum of Art (2021), Guggenheim Museum (2012), and see Chapter 11.

18.4.2 Pre-acquisition Process: Identifying the Significance of Technologies

As video works rely on different hardware and software technologies for playback, the significance of these technologies in each artwork needs to be identified so the work can be cared for in the best possible way. This is done during the initial pre-acquisition conversations with the artist, which are aimed at defining the significant aspects of an artwork, but also through exhibition and continued dialogue with the artist, curators, and other stakeholders within your organization. For example, the initial pre-acquisition conversations may reveal that the work should only be shown on a CRT monitor, which would make that type of hardware critical to the display of the work, while another work might allow flexibility and could be shown using any type of display, be it CRT, flat-screen, or projection. If the artist is not available for these conversations, research into past displays of a work will prove critical. Written and photographic documentation can reveal how the work was displayed and guide you in acquiring, and possibly dedicating, technology that will be instrumental for the future exhibition of the work.

Agathe Jarczyk and Peter Oleksik

See Lorrain (2013, 236ff) and Chapter 15 for recommendations on managing and storing equipment.

18.4.3 Acquisition Process: The Evaluation of Artist-Provided Materials

Once you have discussed the parameters of the artwork with the artist and decided on the deliverables, the next critical action is the evaluation of the materials delivered. This is essentially an action to confirm that what you received is what was agreed upon. In some cases, the artist may not have participated in creating the deliverables for a variety of reasons: they might employ assistants to create materials for collectors and institutions, rely on post-production facilities that may not be aware of the conversation between you and the artist, or rely on a gallery to deliver the material. It is vital to check the material as quickly as possible after it is received so that any issues can be addressed.

Analog Material

It is important to either preemptively migrate material that is delivered on a tape or disc format as part of the delivery process or to budget for its migration to a digital video file format, as most tape media are now obsolete. In addition, playback devices for these formats are becoming increasingly rare, and playback could potentially damage the tape media. All optical media, whether analog (LaserDiscs) or digital (CDs, DVDs, Blu-ray), should be digitized or migrated to digital file formats as well.

Digital Material

It is important to review all digital material, such as hard disk drives and thumb drives, in a safe environment. This means adhering to digital preservation best practices, such as using a write blocker to prevent any potential harm to the original delivery drive and files, and making a copy of the material for review (see Chapter 14). Once you have safely prepared the drive and copied the contents for review, you can open the material in playback software such as QuickTime Player or VLC to review the content and duration of the delivered material. It is important to verify that the master and exhibition files all have the same duration and adhere to the agreed-upon specifications from the pre-acquisition phase. You can use the menu option "Get Info" in the video playback software to obtain some of the technical metadata on the files to determine whether they are as expected. For example, the artist may have promised an Apple ProRes master, but when opening the file in QuickTime Player, you observe in the Inspector window that it uses H.264 encoding, which is typical for an exhibition or preview copy rather than a master. For further analysis, you can review the technical metadata of the digital material. This evaluation step should be followed by a full assessment of the condition of the video material, which is covered in depth in the next section.

18.5 Acquisition Process: The Condition Assessment of Video Materials

Condition assessment of audiovisual materials is usually conducted as part of the acquisition process or during a collection survey (see Chapter 4). Condition assessment is the practice of reviewing the material to evaluate and determine the condition of its audiovisual and technical aspects. Quality assurance and quality control procedures are conducted during condition assessment. Quality assurance (QA) is conducted to ensure that the material is authentic and that nothing has affected the quality and condition of the material, be it digital or analog. Using fixity to confirm the integrity of a digital file is a good example of QA: stipulating that a checksum be provided at acquisition allows the receiving institution to confirm that the quality of that digital file has not been compromised. Quality control (QC) is conducted by the receiving institution through a complete review of the material visually, aurally, and technically to ascertain the condition of the work and to identify and rectify any issues that are found. Poor encoding, unintentional errors, and other anomalies that can possibly be addressed in conversation with the artist may be caught during this process. It is important to perform this check soon after acquisition while still in active contact with the artist, who can help confirm visual and aural artifacts, answer lingering questions, and/or correct material that was delivered in error.
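The fixity check described above can be sketched in a few lines of Python. The hash algorithm is the institution's choice; SHA-256 is used here for illustration, and the function names are ours:

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 checksum of a file, reading it in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_fixity(path: Path, expected: str) -> bool:
    """Compare a file's checksum against the value supplied at acquisition."""
    return sha256_of(path) == expected.strip().lower()
```

Recomputing the same checksum at each later storage event turns this one-time QA check into ongoing fixity monitoring.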

18.5.1 Acquisition Process: Condition Assessment of Analog and Digital Videotapes

The condition assessment of analog and digital videotapes requires significant infrastructure and studio equipment (see Chapter 7). For those who do not have access to the equipment mentioned in the next paragraphs, the authors recommend working with a trusted vendor. Each tape format requires its own video tape recorder (VTR) and other specialized equipment, such as Time Base Correctors and frame delays, to assist in this review. Specialized training and knowledge are needed to operate the equipment, including waveform monitors and vectorscopes for analyzing the visual signal, meters and phase meters for visually monitoring the aural signal, and speakers and headphones for listening. A sufficiently large monitor for the condition assessment is crucial. While HD videos are best watched on flat-screen monitors with appropriate resolution, flat-screen monitors do not display both fields of interlaced SD content. Therefore, CRT studio monitors are important for condition-assessing standard-definition analog and digital tapes. Ordinary, non-studio CRT monitors play video in overscan mode, which hides approximately 7% of the standard-definition video image (Schmidt 2013, 50). In contrast, CRT studio monitors are able to display video in underscan mode, which scales the image down and displays all of the active scan lines. Underscan mode reveals important indicators such as head switching points at the bottom edge of the image. This visual artifact derives from the switching between the two heads in the VTR. When copied from one analog tape to another, head switching points add up and can move up into the picture area. The number of head switching points indicates the generational distance from the master tape. Visual anomalies are noticed, described, and possibly remedied during the assessment.
For example, dropouts, a common video error, are horizontal white, black, or white-and-black lines that disrupt a scan line. They are caused by momentary loss of head-to-tape contact and are either temporary, in the case of debris, or permanent, in the case of a loss of magnetic particles on the tape's surface. Dropouts that originate during playback, whether as temporary or permanent damage, also cause an interruption in the blanking interval. More details on the anatomy of a video signal can be found in Weise and Weynand (2007, 15ff) and Poynton (2012, 83ff). Dropouts that derive from former tape generations and have been recorded onto a new tape will appear as disrupted scan lines within the picture raster. In order to assess whether a tape's dropouts derive from debris or from a loss of magnetic particles, the tape should be digitized at least twice and the results compared. Cleaning the tape between the tape runs is recommended. Resources are available to assist in identifying these phenomena and, in some cases, suggest remedies (AV Artifact Atlas | AVAA n.d.; Gfeller et al. 2012). Assessing the condition of analog and digital tapes is thus threefold: it includes a physical assessment of the open reel tape or cassette, as well as assessing the video playback and the product of its digitization.

18.5.2 Acquisition Process: Condition Assessment of Digital, File-Based Video

Condition-assessing digital file-based video material uses a combination of analyzing technical metadata and critical viewing to ensure that the material matches the pre-acquisition understanding and is suitable for the collection. This review can be used to identify material that was exported with settings that contradict those discussed with the artist or that exhibits visual or aural artifacts and anomalies that should be addressed with the artist.

Technical Metadata

The first step when reviewing digital video material is to collect the technical metadata about the file. A variety of software tools is available, including dedicated video metadata tools like MediaInfo and Invisor as well as tools that run within playback software like QuickTime Player. The goal is to ascertain as much information as possible about the file in order to identify any incongruities between the information you have about the file and the file itself. For example, a file agreed to be delivered as an Apple ProRes file (Apple Inc. 2020) should not have an H.264 encoding codec. This is also an opportunity to check for any discrepancies within the file's metadata. For example, SAR (Sample Aspect Ratio)/PAR (Pixel Aspect Ratio)/DAR (Display Aspect Ratio) miscalculation is common, indicating that the pixel aspect ratio differs from the ratio required for display (Nagels n.d.). As of this writing, the most common tools used for this purpose are MediaInfo (Pozdeev and MediaArea 2021), Invisor (Pozdeev 2021), ExifTool (Harvey 2021), ffprobe (FFmpeg 2021b), and Siegfried (Lehane 2020).
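As an illustration of how such metadata can be checked programmatically, the sketch below summarizes the JSON report produced by ffprobe (e.g., `ffprobe -v error -print_format json -show_format -show_streams master.mov`). The helper name and the particular set of fields extracted are choices made for this example:

```python
import json

def summarize(ffprobe_report: str) -> dict:
    """Extract the fields most often checked against agreed deliverables
    from ffprobe's JSON output (-print_format json)."""
    data = json.loads(ffprobe_report)
    # Take the first video stream; a delivery file normally has exactly one.
    video = next(s for s in data["streams"] if s["codec_type"] == "video")
    return {
        "container": data["format"].get("format_name"),
        "codec": video.get("codec_name"),
        "width": video.get("width"),
        "height": video.get("height"),
        "duration_s": float(data["format"].get("duration", 0)),
    }
```

A promised ProRes master that reports `h264` as its codec here is exactly the kind of discrepancy this review is meant to catch.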

Critical Viewing

Critical viewing is the act of watching and listening to the video content at one of three levels (see Table 18.1).

Automated Critical Viewing

Automated critical viewing is possible with software tools that analyze the video and audio streams and flag elements they have been configured to identify. For example, the software can be set up to analyze a file and report where luminance information may have been "clipped," or to note files that do not pass file format validation. Automated critical viewing is useful when working with large amounts of material. However, the software will only flag what it is instructed to flag, so the results may be limited by the setup and by the software's ability to identify issues in the video material.


Table 18.1 Three levels of critical viewing as part of the quality and condition assessment of digital video

Minimum Level
• Opening the video material in playback software and watching and listening to it.
• Reviewing the material in at least two different playback software applications (e.g., QuickTime Player and VLC) and checking whether they render the video or audio differently. If they do, this can be the result of an internal error in the video file.
• Confirming the duration and comparing it across multiple versions (e.g., master and exhibition file).
• Confirming that the audio tracks are arranged properly.

Medium Level
• Using more sophisticated software to review the video material, such as non-linear editing software like Final Cut Pro or Adobe Premiere, together with digital oscilloscopes like waveform monitors and vectorscopes.
• Watching the material in its entirety while also watching the signal through the different views afforded by the oscilloscopes, which allow the viewer to verify the levels of the luminance and chrominance channels.

High Level
• Adding more sophisticated hardware into the process, such as calibrated reference monitors and display devices.
• Observing how the material is rendered on different devices to expose possible color gamut issues, encoding issues with interlaced/progressive discrepancies, or a host of other artifacts that can appear when the material is shown on a variety of displays.
• Checking the file integrity by running the FFmpeg command "framemd5".

Current free and open-source tools for automated condition assessment are QCTools (MediaArea and Various 2020), MediaConch (MediaArea and Various 2018), and DVAnalyzer (MediaArea and AVPreserve 2017).
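The minimum-level check of confirming duration across versions (Table 18.1) is easy to script. In the sketch below, the one-frame tolerance is an assumption of ours, since encoder padding (e.g., AAC priming samples) can legitimately shift a duration by a few milliseconds:

```python
def durations_match(durations_s: list[float], fps: float = 25.0,
                    tolerance_frames: int = 1) -> bool:
    """Check that the durations (in seconds) of a master and its exhibition
    copies agree to within a small number of frames at the given frame rate."""
    tolerance_s = tolerance_frames / fps
    return max(durations_s) - min(durations_s) <= tolerance_s
```

Any pair of versions that fails this check should be flagged for closer inspection rather than rejected outright, as the cause may be benign.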

18.6 Digitization as a Preservation Strategy

Migrating the video signal to new formats has long been the primary strategy for video preservation. Until the early 2000s, this meant that analog video was transferred from older analog tape formats to newer ones. For example, ¾″ U-matic tapes would be copied onto the professional analog format Betacam SP. Although the target formats were typically higher in quality than the source format, each analog transfer introduced noise, a phenomenon commonly known as generational loss. The new copy manifested primarily as a "noisier" or degraded image compared to the original, as static, picture softening, and other visual and aural artifacts were introduced into the signal.

Digitization is the translation of the continuous analog signal into a discrete digital signal with defined values. Digital processing has long been part of the video signal chain, dating back to the 1970s with the introduction of Time Base Correctors (TBCs) (Turkus 2016). With the introduction of various professional digital cassette formats, such as D5 and Digital Betacam, in the mid-1990s, the process of transferring video to new formats allowed the signal to be sampled and stored digitally. Critically, this eliminated generational loss as a concern, because digitization allows for identical copies of the video signal. Because the process was expensive and unwieldy at the beginning, it was still common to migrate works to the highest-resolution analog tape formats such as Betacam SP. Costs gradually dropped, however, and working with professional digital cassette formats became easier, so that as technology progressed with faster computational speeds and increasing disk space, the majority of preservation strategies for video came to rely on migration to a digital video file format. All analog videotape formats are now obsolete; it is therefore critical to migrate this material to digital video file formats before access to playback equipment becomes impossible, as we will begin to lose access to this technology within the next five to ten years (Lacinak 2014). The authors advise that all analog material in collections be prioritized immediately for digitization.

Figure 18.1 Graph A represents the analog, continuous signal; graph B represents the signal digitized with a low bit depth; and graph C shows the signal digitized with a higher bit depth. The higher the bit depth, the finer the possible gradations: an 8-bit signal allows for 256 gradations, a 10-bit signal for 1,024. Illustration: Agathe Jarczyk

Figure 18.2 Graph A represents the analog, continuous signal. With each transfer or copy, noise B is introduced, so the resulting signal C contains noise; because noise is added with each transfer or copy, the signal-to-noise ratio degrades. In the bottom row, D represents a digital signal with discrete values. Again, noise E is added during a transfer. However, since digital signals are assigned discrete, fixed values, the resulting signal F after the transfer is interpreted identically to the initial signal: the noise can be disregarded, and a lossless copy is possible. Illustration: Agathe Jarczyk

18.6.1 Digitization as a Key Moment in the Life of an Artwork

Digitization is one of the key moments in the life of an artwork. The entire digitization workflow, including preparing the original tape, choosing the equipment, assessing the tape's condition, configuring the settings, reviewing the signal connections, and selecting the target format, determines the quality of the digitized material. Digitization is labor-intensive and costly, and the tapes will continue to deteriorate, so it is unlikely that digitization will be repeated. Another aspect of this key moment is that digitization separates the material from its physical carrier: all of the associated metadata, such as the labels on a cassette and its case, or additional information in the form of dub sheets, the artist's handwriting, signatures, and so on, are no longer accessible with the digital file. At this key moment, it is essential to ensure that the documentation is clear and comprehensive so that the reference to the original tape remains traceable. For more information on working with a vendor for digitization, see De Stefano et al. (2013) and Lacinak (2006). Vitale and Messier (2013) offer a rich collection of sources on migration and video preservation in general.
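One lightweight way to keep the reference to the original tape traceable is a sidecar record stored alongside the digitized file. The structure below is a sketch, not a published schema; all field names, values, and file names are illustrative:

```python
import json

# A sketch of a sidecar record linking a digitized file back to its original
# carrier; every field name and value here is an illustrative assumption.
sidecar = {
    "digital_file": "AW1998_017_master.mov",
    "source_carrier": {
        "format": "U-matic Low Band (PAL)",
        "label_transcription": "Final edit 1983 -- master, do not erase",
        "condition_notes": "Light wear on tape edges; no mold.",
    },
    "digitization": {
        "date": "2021-12-14",
        "target_codec": "10-bit uncompressed YUV 4:2:2",
        "operator_initials": "AJ",
    },
}

# Write the record next to the digitized file so the pair travels together.
with open("AW1998_017_master.json", "w") as f:
    json.dump(sidecar, f, indent=2)
```

Together with photographs of the cassette and its case, such a record preserves the carrier-bound information that the digital file alone can no longer carry.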

18.6.2 Preparing for Digitization

Whether the digitization is carried out in-house or with the help of an external vendor, workflows, naming conventions, and quality control processes should be determined beforehand. Photographing all of the components supports later identification and provides evidence of a tape's condition. When closely inspecting tapes, always check the condition of the case and the tape itself: look for damage, and check whether the lid is blocked by a label, whether the tape pack is wound evenly, whether tape deformation is evident, and whether there are signs of mold. If there is evidence of mold, separate the affected tapes and store them in a dry place until they can be cleaned. In cases of advanced mold infestation, the layers of the tape pack tend to stick together, which often causes the tape to rip when it is played back or rewound. Next, open the back flap of the cassette (if possible) and check for signs of wear and tear like scratches, creases, or damaged tape edges. Magnetic tapes that are affected by binder hydrolysis, or "sticky shed," often emit a waxy smell. Playing these tapes back will often result in the magnetic particles "shedding" and clogging the heads, damaging the equipment, the tape, or both. If you suspect a tape is exhibiting signs of binder hydrolysis, consult a specialist before attempting playback. More details on preparing for playback and digitization can be found in Gfeller et al. (2012).

Commercially available tape cleaning machines have become obsolete, but older machines are still in use by many vendors. These cleaning machines strain tapes less when rewinding than a regular playback deck does. When the cleaning mode is turned on, a sapphire burnishing blade polishes the tape, cleaning off superficial debris and removing efflorescence from the tape's surface. Tapes with creases should not be cleaned with a burnishing blade, as the sapphire can damage the magnetic tape surface in the area of the creases. Again, work with a trusted specialist before embarking on these tasks.

18.6.3 Choosing the Right Equipment for Digitization

Before digitization can begin, the best playback equipment must be identified. TV standards and optional recording modes like Long Play (LP), Standard Play (SP), or Extended Play (EP) should be considered, as well as format variants, such as U-matic Low Band (LB), High Band (HB), and Superior Performance (SP), which require their own playback devices. Some later formats were backward compatible (Gfeller et al. 2012). The television standards NTSC, PAL, and SECAM are mutually incompatible due to different frame rates, different numbers of scan lines, and different color encoding systems (see Table 18.5).


Players existed in the respective standard for each cassette format, and monitors and TV sets were also usually limited to one standard. Exceptions were the Sony PVM models that emerged in the late 1980s and, later, the stackable monitors often used for display. Many European museums supplemented their collections with works by US-based video pioneers, sold by galleries or video distributors like EAI. Multi-standard devices, such as U-matic players for NTSC, PAL, and SECAM, were therefore more common in Europe than in the US. Galleries and distributors often had video works converted to their respective television standard. It is important to note that each change of standard entails a significant loss of quality: when converting from PAL to NTSC, the frame rate must be converted from 25 Hz to almost 30 Hz, the resolution is reduced from 625 to 525 scan lines, and a different color space is applied. When converting from NTSC to PAL, the number of scan lines is increased, but the frame rate is reduced. Each standard conversion results in a loss of sharpness and color fidelity (see Table 18.5).

18.6.4 Time Base Correctors

In the process of digitization, TBCs are usually required for all analog formats to stabilize the analog signal. A TBC buffers the unstable incoming analog signal and outputs a stabilized analog signal. Depending on the TBC, only a few lines, a field, or a full frame are buffered. Some TBCs are capable of dropout compensation (DOC), replacing faulty lines with buffered lines in the output signal. TBCs usually contain a processing amplifier (proc amp), which allows the video signal to be adjusted. The proc amp is used to adjust all video levels and keep the outgoing signal within a legal range to prevent clipping. The entire video signal can be raised or lowered, and the components of the signal, such as luma, black, chroma, chroma burst, and Y/C delay, can be adjusted. Although most TBCs have a "standard preset," it is worthwhile to check this setting against a reference signal; comparing color bars before and after processing, for example, usually reveals that the standard settings alter the brightness and color values (Bunz and Jarczyk 2019).

18.6.5 Determining the Target Formats for Digitization

For standard-definition analog and digital videotape material, the generally agreed-upon target digital video codec is 10-bit, uncompressed, YUV (color subsampling 4:2:2). The audio tracks should be captured as 48 kHz/16-bit pulse code modulated (PCM) audio, which is also uncompressed. This encoding scheme is equal to the encoding of Digital Betacam, the previous tape-based standard for the preservation of standard-definition material (Wijers et al. 2003, 52). Due to compatibility with editing programs, the predominant video container is QuickTime MOV. There have been recent developments in the use of FFmpeg Videocodec 1 (FFV1) encoding of a standard-definition video signal, but capturing directly to this format is currently not possible with most analog-to-digital capture software except for Vrecord (AMIA—Association of Moving Image Archivists 2021). For further resources on FFV1, see Fleischhauer et al. (2019) and Pfluger et al. (2019).

For standard-definition material that is encoded digitally on a tape, one should ideally transfer the data stream directly, capturing the material in its native encoding. For example, all tape formats of the DV family (DV, DVCAM, DVCPRO) should be transferred via the FireWire (also known as IEEE 1394) output of the capture deck to preserve the encoded video signal on the tape. If you were to capture this material through an analog-to-digital converter, you would lose all of the relevant metadata associated with this format through an unnecessary digital-to-analog-to-digital conversion. For high-definition digital videotape material, such as HDCAM and HDCAM SR, an uncompressed digital encoding with 10-bit depth is still sufficient for the encoding and digital preservation of this information (Fleischhauer et al. 2019).
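Although direct FFV1 capture is still limited, an uncompressed capture can later be transcoded losslessly to FFV1 with FFmpeg. The sketch below assembles such a command line with settings commonly seen in FFV1 preservation guidance; the helper name and file names are placeholders, and settings should be confirmed against current recommendations:

```python
def ffv1_preservation_command(src: str, dst: str) -> list[str]:
    """Build an FFmpeg command that transcodes an uncompressed capture to
    FFV1 version 3 in Matroska, passing the audio stream through unchanged."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "ffv1",
        "-level", "3",       # FFV1 version 3
        "-g", "1",           # intra-only: every frame is self-contained
        "-slices", "16",     # encode in parallel slices
        "-slicecrc", "1",    # per-slice CRCs localize any future bit damage
        "-c:a", "copy",      # keep the uncompressed PCM audio as-is
        dst,
    ]
```

Building the arguments as a list (rather than a shell string) makes the command easy to log in the digitization documentation and to hand to `subprocess.run`.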

18.6.6 Maintaining the Sound Layout During Digitization

The methods for recording sound on videotape evolved over time. Open reel formats started with mono soundtracks, followed by several tape formats offering stereo tracks, and leading to later digital tape formats with up to eight tracks. For faithful digitization, it is essential to retain the audio track assignments in order to preserve the original audio arrangement on the tape. For instance, videos with a monaural, one-channel track date back to a time when televisions and monitors often had only one built-in mono speaker, whereas stereo sound, reproduced over two speakers, is most common in modern displays. The authors advise first capturing the audio "as is," transferring the mono track unchanged. To display the video in a more modern context, the authors recommend then making a copy for display with the mono track doubled on channels 1 and 2 (left and right). Doubling the mono sound on both tracks prevents the sound from being played back on only one side of the speakers or headphones.
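The mono-doubling described above can be performed with FFmpeg's pan audio filter when producing a file-based display copy. A minimal sketch, with placeholder file names and a helper name of our choosing:

```python
def double_mono_command(src: str, dst: str) -> list[str]:
    """Build an FFmpeg command that copies the picture untouched and maps
    the single mono channel (c0) onto both channels of a stereo track."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "copy",                   # do not re-encode the video
        "-af", "pan=stereo|c0=c0|c1=c0",  # mono source onto left and right
        dst,
    ]
```

Because the filter only remaps channels, the capture "as is" remains the preservation master and the doubled version is derived from it on demand.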

18.6.7 Documentation of Digitization

Various TBCs can be used for signal stabilization and dropout compensation during digitization, each with its own pros and cons. Test digitizations are often carried out first to evaluate the best combination of devices. Split-screen or side-by-side video montages are used to compare video quality. They help to convey the reasoning behind each decision, whether applied to the hardware, the software, or the signal path, which is recorded in the written documentation.

18.7 Migrating Digital Video

Digital video, much like analog video, requires reformatting, also known as transcoding. Treatment for different display contexts depends on the characteristics of the original digital material. Because video technology constantly evolves, migration of the signal has been a recurring aspect of any video artwork, both for access and for preservation, to keep pace with changing playback and display technology. For example, material encoded as MPEG-2 may not play on a device that requires H.264. Here, we describe the factors involved in migrating digital video material.
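As an example of such a migration, the sketch below assembles FFmpeg settings often used to derive an H.264/AAC exhibition copy from a master file. The CRF value, audio bit rate, and file names are illustrative choices, not fixed recommendations, and should be tuned to the characteristics of the work:

```python
def h264_exhibition_command(src: str, dst: str, crf: int = 18) -> list[str]:
    """Build an FFmpeg command deriving an H.264/AAC exhibition copy from a
    master file; lower CRF values mean higher quality and larger files."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264",
        "-crf", str(crf),          # quality target rather than a fixed bit rate
        "-pix_fmt", "yuv420p",     # 4:2:0 subsampling for broad player support
        "-c:a", "aac", "-b:a", "256k",
        dst,
    ]
```

The master remains untouched; exhibition copies like this one are derivatives that can be regenerated whenever playback requirements change.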

18.7.1 Review and Analysis

Before embarking on a migration, it is important to ascertain the key characteristics of your source file (see Section 18.8.6, Table 18.6), as this information will guide your decisions.


Figure 18.3 The figure shows four different audio tracks with two channels each. The upper track represents the left channel; the lower track, the right channel. Example A contains two different audio tracks. In example B, the tracks are mixed so that there is a spatial stereo effect, and the listener does not have the impression that left and right alternate. Example C contains a single mono track on only one of the two channels. For producing exhibition copies, it is often advisable to double this track (D) so that the sound can be heard from both sides in the room or on headphones. Illustration: Agathe Jarczyk

Table 18.2 Comparison of the technical metadata of two files with identical content. (The metadata of the uncompressed preservation master and the exhibition copy were extracted using the tool Invisor, which is based on MediaInfo.)

                             PreservationMaster.mov                    ExhibitionCopy.mp4

Wrapper (container format)
  Size                       6.15 GB (6146024697 bytes)                7.37 MB (7367755 bytes)
  Kind / UTI                 QuickTime movie                           MPEG-4 movie
                             (com.apple.quicktime-movie)               (public.mpeg-4)
  Format / Codec ID          QuickTime, qt 2005.03 (qt)                MPEG-4 Base Media/Version 2, mp42 (mp42/mp41)
  Duration                   3 min 40 s 0 ms                           3 min 40 s 0 ms
  Overall bit rate           223 Mb/s                                  268 kb/s
  Encoded date               UTC 2021-12-14 23:39:47                   UTC 2021-12-14 23:42:22
  Writing library            Apple QuickTime                           —

Video stream
  Codec ID                   v210 (uncompressed)                       avc1 (H.264/AVC; CABAC, 3 ref frames; GOP M=1, N=125)
  Bit rate                   Constant 221 Mb/s                         8 670 b/s
  Width × height             720 × 576 (clean aperture 703 × 576)      1 024 × 576
  Pixel aspect ratio         1.457                                     1
  Display aspect ratio       1.85:1 (clean aperture 16:9)              16:9
  Frame rate / frame count   Constant 25.000 FPS / 5500 frames         Constant 25.000 FPS / 5500 frames
  Standard / color space     PAL, YUV, 10 bits                         YUV, 8 bits
  Scan type                  Interlaced, top field first               Progressive
  Compression mode           Lossless (21.333 bits/(pixel×frame))      Lossy (0.001 bits/(pixel×frame))
  Chroma subsampling         4:2:2                                     4:2:0
  Color primaries            BT.601 PAL                                BT.601
  Stream size                6.08 GB (99.0%)                           238 KB (3.2%)

Audio stream
  Format / Codec ID          PCM, 24 bits, signed, little endian       AAC LC (mp4a-40-2)
  Bit rate                   Constant 2 304 kb/s                       Constant 256 kb/s
  Channels / layout          2 channels, L R                           2 channels, L R
  Sampling rate              48.0 kHz                                  48.0 kHz
  Stream size                63.4 MB (1.0%)                            7.04 MB (95.5%)

Other (time code)            QuickTime TC, 25.000 FPS, first frame     —
                             01:00:00:00, striped


Caring for Analog and Digital Video Art

Table 18.3 Key video characteristics

Wrapper (container format): The type of wrapper around the original file (e.g., .mov, .avi, .mkv, .mxf, .mp4) indicates the type of playback software to use.

Codec (video stream): The codec used to encode the video information (e.g., H.264, Apple ProRes, uncompressed) is the most important piece of information when transcoding. For example, if the material is encoded as Apple ProRes 422, you know the chroma subsampling is 4:2:2 and the bit rate is a constant 147 Mbps.

SAR/PAR/DAR: The sample aspect ratio (SAR) and pixel aspect ratio (PAR) (0.9, 1, etc.) indicate the encoded sampling geometry and determine the resulting display aspect ratio (DAR). For example, 486 rows of 720 pixels each, with a pixel aspect ratio of 0.9, results in a display aspect ratio of 4:3.

Frame rate: The rate at which the visual information is encoded (e.g., 24, 25, or 29.97 frames per second).

Bit depth (video): The amount of space allotted for each sample (for example, 8, 10, or 12 bits), which determines the number of possible colors.

Scan type: Whether the encoded information is interlaced (two fields) or progressive (a full frame). Scan type also indicates the scan order of the fields in a frame.

Color subsampling ratio: The ratio used for the subsampling of the color information (4:2:2, 4:1:1, etc.), including how much color information (chroma) is in the video stream and how it was encoded. For example, a 4:2:2 ratio means that for every four samples of luminance, there are two each for the B-Y and R-Y chrominance channels. In 4:2:0 encoding, used in the H.264 codec, the B-Y and R-Y channels are sampled one at a time instead of simultaneously, essentially halving the chrominance information.

Color primaries: The standard used for the color tristimulus reference points. This is key for proper color rendition, as it indicates the color space, such as BT.601, BT.709, or sRGB.
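The PAR-to-DAR arithmetic described above can be checked directly. A minimal sketch using only the Python standard library:

```python
from fractions import Fraction

def display_aspect_ratio(width, height, par):
    """DAR = (width x PAR) / height."""
    return Fraction(width, height) * par

# NTSC standard definition: 486 rows of 720 pixels, PAR 0.9.
dar = display_aspect_ratio(720, 486, Fraction(9, 10))
print(dar)  # -> 4/3, i.e., a 4:3 display aspect ratio
```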

Table 18.4 Key audio characteristics

Codec (audio stream): The encoding of the audio information (PCM, AAC, etc.).

Sample rate: The rate at which the audio information was sampled (48 kHz, 96 kHz, etc.).

Bit depth (audio): How much space is allotted for each sample (for example, 16, 24, or 32 bits).

Arrangement: How the audio will be directed upon playback (e.g., mapping the audio to different speakers depending on the context): for example, mono (one channel of audio and one speaker), dual-channel mono (two identical channels of audio played via two speakers), stereo (two discrete channels of audio played via two speakers), and so on.
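Doubling a mono track into dual-channel mono, as is often advisable for exhibition copies (see Figure 18.3, example D), amounts to duplicating the channel's samples. A toy sketch with samples represented as plain lists (the function and values are illustrative only):

```python
def mono_to_dual_mono(channel):
    """Duplicate one channel of samples so that the same signal
    feeds both the left and the right output."""
    return [list(channel), list(channel)]

left, right = mono_to_dual_mono([0.0, 0.5, -0.25, 0.1])
assert left == right  # identical channels: dual-channel mono
```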

18.7.2 Non-invasive (Non-transcode) Metadata Adjustments

If there are discrepancies in the file's technical metadata, selected aspects of this information, as listed in the next paragraphs, can be corrected non-invasively through a simple edit of the metadata, without a total transcode of the video or audio information. To do this, make a copy of the file and determine the best tool for adjusting the metadata. See Rice (n.d.) for further information.

If the frame rate metadata appears inconsistent, or a file is obviously playing too fast or too slow, use tools such as FFmpeg [(Software) (version 4.4 "Rao") 2021a] or a hex editor to correct the frame rate in the file manually.

Another non-invasive metadata adjustment concerns the pixel aspect ratio (PAR) and display aspect ratio (DAR) in digital video whose source was analog or a hybrid analog/digital format, and which is thus prone to PAR and DAR errors. For example, the software might introduce square pixels into the encoding process while transcoding interlaced material. This would throw off the display aspect ratio, typically resulting in a 3:4 display. Tools such as FFmpeg or a hex editor allow you to adjust the pixel aspect ratio, which then corrects the display aspect ratio.

Finally, when the primary color matrix is incorrectly encoded, the playback software or hardware interprets the color information incorrectly by drawing on the wrong tristimulus reference points. For example, if the listed matrix was BT.601 (Wikipedia—Rec. 601 n.d.; ITU—Rec. ITU-R BT.601 n.d.) but the file was erroneously encoded as BT.709 (Wikipedia—Rec. 709 n.d.; ITU—Rec. ITU-R BT.709–6 n.d.), the video would not be displayed with the correct colors. FFmpeg commands can adjust the color primaries in these cases without re-encoding the file.
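As a sketch of this non-invasive approach, the following builds an FFmpeg argument list that stream-copies the audio and video essence (`-c copy`, so nothing is re-encoded) while rewriting only the container's display-aspect-ratio metadata. The file names are placeholders, and the appropriate fix always depends on the file at hand:

```python
def set_display_aspect(src, dst, dar="16:9"):
    """Command that corrects the container-level DAR without a transcode:
    -c copy passes the streams through untouched, while -aspect rewrites
    only the display-aspect-ratio metadata."""
    return ["ffmpeg", "-i", src, "-c", "copy", "-aspect", dar, dst]

print(" ".join(set_display_aspect("in.mov", "out.mov")))
```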

18.7.3 Transcoding and Compression

Transcoding is the re-encoding of stream information into an entirely new encoding scheme, for reasons ranging from preservation to exhibition. Because transcoding reformats all of the digital information, the process needs to be planned thoughtfully so that the original signal is mapped properly onto the new encoding. Transcoding should always be done on a copy so that no unwanted errors are introduced into the original video. The two primary methods of transcoding are lossless and lossy. In lossless encoding, the information is mapped onto the new encoding stream without loss, so that if the process were reversed, an identical encoding of the original file would result. Lossy encoding discards information to save on file size or bandwidth, using temporal or spatial compression, resulting in a loss of information in the new encoding scheme.

The decision to transcode video material using a lossless encoding process might be driven by space savings or bandwidth constraints. For example, it might be necessary to reduce the size of uncompressed video material by transcoding it with a lossless codec such as FFV1, roughly cutting the amount of information stored in half. In a different situation, if storage space is not an issue, it might be important to reverse the process and transcode from a highly compressed file to an uncompressed file so that the information is encoded as robustly as possible.
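A lossless migration of the kind described above, for example from uncompressed v210 to FFV1 in a Matroska wrapper, can be sketched as an FFmpeg invocation (built but not executed here; the file names are placeholders):

```python
def ffv1_command(src, dst):
    """FFmpeg arguments for a lossless FFV1 transcode that leaves
    the audio stream untouched."""
    return [
        "ffmpeg", "-i", src,
        "-c:v", "ffv1",      # lossless video codec
        "-level", "3",       # FFV1 version 3 (slices, checksums)
        "-slicecrc", "1",    # per-slice CRCs for error detection
        "-c:a", "copy",      # pass the audio through unchanged
        dst,
    ]

print(" ".join(ffv1_command("PreservationMaster.mov", "PreservationMaster_FFV1.mkv")))
```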

Compression: Lossy Migration of Signal

Transcoding with lossy compression is typically done for display contexts, such as the web or exhibition, since master files are typically large and/or require prohibitively high bandwidth for playback. Decisions on how best to translate the full resolution of an original into the new, reduced encoding scheme are dictated by the display settings, as most display devices accept only a limited set of codecs and parameters. For example, H.264 is currently the preferred lossy encoding for display hardware, with a data rate typically around 20–30 Mbps. However, as technology improves to handle larger files, transcoding using lossy compression should be reviewed anew each time new hardware or software is introduced.

18.7.4 Digital Storage of File-Based Video

Digital storage is an active preservation measure for all file-based video. One of the cornerstones of digital preservation is redundancy (i.e., making multiple copies of the digital material that are stored in geographically dispersed locations). If one digital storage location fails in whole or in part, the data can be restored from a backup held in a different location. Redundancy can also exist within a file: the lower the compression, the higher the redundancy. File formats without spatial or temporal compression store each video frame in full, even when succeeding frames are identical, so a loss such as bit rot damages only that specific area or frame. If a bit error occurs in a compressed format, more frames are impacted, and the error can be more widespread.

Fixity is used to ensure that files intended for long-term preservation arrive in the collection safely. Fixity is the assurance that the "digital file you have has remained unchanged, i.e., fixed" (Owens and Bailey, 2014; see Chapters 13 and 14). This is accomplished primarily through the use of a checksum, which can be considered a "digital fingerprint" for a file. Checksums are generated through cryptographic techniques and typically take the form of a string of alphanumeric characters whose length depends on the tool used. Once generated, a checksum is used to verify that nothing has changed within the file, whether it is being moved from storage device to storage device or simply sitting in storage. Redundancy, duplication, and control are all cornerstones of digital archiving.

Next to digital film and disk images from hard drives, video files have the largest storage requirements in time-based media collections. Use the resolution, duration, and video codec, or turn to an online video storage calculator, to estimate your storage needs.
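Both practices can be sketched in a few lines of standard-library Python: a SHA-256 checksum as the fixity "fingerprint," and a rough storage estimate from bit rate, duration, and number of redundant copies (the figures are illustrative):

```python
import hashlib

def checksum(data, algorithm="sha256"):
    """Hex digest (the "digital fingerprint") of a byte stream."""
    h = hashlib.new(algorithm)
    h.update(data)
    return h.hexdigest()

# Fixity: the digest matches only while the bytes are unchanged.
assert checksum(b"video essence") == checksum(b"video essence")
assert checksum(b"video essence") != checksum(b"video essencf")

def storage_gb(bit_rate_mbps, duration_s, copies=3):
    """Rough storage need in gigabytes, including redundant copies."""
    return bit_rate_mbps * 1e6 * duration_s / 8 / 1e9 * copies

# One hour of 224 Mbps uncompressed SD video, kept in three locations:
print(round(storage_gb(224, 3600), 1))  # -> 302.4 GB
```

In practice, checksums are computed by reading files in chunks and are stored alongside the files so they can be re-verified after every move.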

18.7.5 Storing Tapes, Discs, and Drives

Magnetic tapes are susceptible to "sticky-shed syndrome" (see Section 18.3) through binder hydrolysis, which is catalyzed by humidity and warmth and renders affected tapes unusable. Dry and cool storage is therefore considered a critical factor in prolonging the life of tapes; ideal storage conditions are specified at around 50% RH and 15°C (Library of Congress—Care, Handling, and Storage of Audio-Visual Materials n.d.). All video material should be stored in a cool and dry environment. Given that the technological obsolescence of video material will increase with time, along with the ecological impact of maintaining this storage indefinitely, physical storage needs should be reassessed regularly as both technology and costs evolve.

Some cassettes from the 1970s and 1980s are in containers made of PVC, which contains plasticizers that can leach out with age and cause sticky surfaces. Cassette cases that off-gas or are sticky should be stored separately, away from other materials, and their cassettes should be re-housed.

All magnetic media, including open reels, video cassettes, and hard disk drives, are susceptible to magnetic fields. All carriers need to be kept at a distance from hardware such as CRT monitors, loudspeakers, headphones, and microphones, as well as from motors, generators, and other devices in the environment that may emit a magnetic field. The storage environment should be kept clean, as dust and other small particles can cause severe errors during playback. Videotapes and optical media are best stored vertically (see Chapter 15).


All writable optical media, such as CD-R, DVD-R, and Blu-ray, rely on a light-sensitive dye to store information, which makes storage in a dark environment essential for their preservation. For further reading on the preservation of optical discs, see Iraci (2002) and Iraci (2011). Hard disk drives and solid-state drives can enter a collection with or without casing. Especially for the latter, storage in electrostatic discharge (ESD) bags or cases is recommended to prevent static discharges on exposed electronic boards. No removable media, including thumb drives, should be considered long-lasting carriers, and all should be backed up as quickly as possible (see Chapter 14). From the early days of video to today, some artists have signed their media or designed cases, so that these carriers can be considered art objects and treated as such. To document which assets were provided by artists and which were created later by the collection, all assets should be clearly labeled (see Chapters 11 and 17).

18.8 A Closer Look: The Technical Makeup of Video

Broadly speaking, video is an electronic medium for the encoding, packaging, transmission, and reception of audiovisual content. Developed in the early 20th century, video draws from, and is inextricably linked to, the film and audio technologies that developed before and alongside it. The history of video is cumulative: each format's evolution informed the decisions and features of later formats.

18.8.1 Basics of the Video Signal

At its core, the video signal comprises three elements: a visual or "video" element, an audio or "aural" element, and a third element that contains all of the controls for the other two and is invisible to the viewer (Remley 1999, 125). Think of the scanning process as working like a typewriter: light is encoded and decoded electronically from top to bottom, left to right, in rapid succession. These scan lines define the resolution of an image and have been standardized and expanded since the early experiments in television in the 1920s. Early systems used between 200 and 300 lines; today, visual information is scanned at well over a thousand lines. Together, these lines represent the frame of the image, to borrow a term from film technology, and comprise the whole black-and-white visual image of the video signal. The amount of information in these lines is referred to as the raster or sample and is related to the rate at which the encoding device records the light information. This rate of scanning is typically expressed as frames per second (fps). However, the bandwidth of early television only allowed half a frame to be transmitted at a time, resulting in an A and a B field per frame. Field A contained the odd lines; field B, the even lines of a frame. These were encoded and decoded in sequence to construct the image, a process known as interlacement. All of this information is then encoded electronically and/or digitally and either transmitted or captured on some form of medium. The CRT monitor, based on this method of light scanning, was the first display device for video and continued to be used well into the early 2000s.
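Interlacement can be illustrated by splitting a frame's scan lines into its two fields. A minimal sketch, with scan lines represented as list items:

```python
def split_fields(frame_lines):
    """Field A holds the odd-numbered scan lines (1st, 3rd, ...),
    field B the even-numbered ones, as in interlaced video."""
    return frame_lines[0::2], frame_lines[1::2]

field_a, field_b = split_fields(["line1", "line2", "line3", "line4"])
print(field_a)  # -> ['line1', 'line3']
print(field_b)  # -> ['line2', 'line4']
```

On playback, the two fields are decoded in sequence to reconstruct the full frame.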

18.8.2 Color

With the introduction of color into the televisual image, some adjustments needed to be made to the television standards to allow for this signal addition. In 1953, the NTSC (National Television Systems Committee) system codified the addition of color using a subcarrier in the luminance signal to allow for the addition of color while keeping the structure of the original

NTSC signal intact to allow for backward compatibility (e.g., so that B&W television receivers could still receive the same picture information). When Europe began to explore adding color to its system, it introduced color in a similar subcarrier, but with the information phase-shifted line to line (shifting the phase 180 degrees between A and B lines) to cancel out interference. This system was named Phase Alternating Line, or PAL, and was standardized in 1962. SECAM (Séquentiel couleur à mémoire), developed slightly earlier than PAL, is similar to PAL in terms of color encoding.

Table 18.5 These three standards formed the basis of what is commonly known as standard definition (SD) video, which underpinned most televisual production and transmission until the introduction of digital technology in the 1980s. This table outlines the basic technical aspects of all three standards:

Television Standard    Frequency    Scan Lines    Frames per Second
NTSC                   60 Hz        525 lines     30 fps (29.97 fps)
PAL                    50 Hz        625 lines     25 fps
SECAM                  50 Hz        625 lines     25 fps

18.8.3 Videotape

The live broadcast system left much to be desired, notably the degradation of visual information in the transformations from the electronic to the photochemical and back again. To solve this problem, television and radio engineers turned to the then-recent development of magnetic media to record the televisual signal. Magnetic tape as a recording medium was first introduced in Germany during the late 1930s but was not globally popular until the post-war period (late 1940s) (Engel 1999, 47). It combines a substrate, at first typically paper-based but since then mostly plastic, with a binding agent that fixes an oxidized metal to the tape, allowing for magnetic polarity shifts within the oxide. When a magnetized coil is run over the tape, it aligns the metal particles into a pattern. The oxide retains this pattern, and the electronic signal has thereby been "recorded." The process is reversed when a different magnetic coil is run back over the tape and "reads" these imprinted patterns to play back the recorded electrical signal. Much as radio paved the way for television, magnetic tape was first used for the recording and playback of audio.

Three characteristics of recording on magnetic tape are important to take into account when discussing the medium: tape speed, signal-to-noise ratio, and bandwidth. Each of these, while a discrete element, impacts the others in terms of the quality of the recorded signal. Bandwidth is the difference between the highest and lowest frequencies in a given signal and is typically expressed in hertz (Hz), one hertz being one cycle per second. Originally a broadcasting term for audiovisual signals, bandwidth also refers to the amount of area necessary to record the signal to a particular medium.

Tape speed was originally expressed in North America as inches per second (ips) and in Europe as centimeters per second; it is simply the amount of tape transported across the recording heads per second. The more tape that can pass before the head, the more area is available for the signal to be recorded, and thus the better the fidelity of the recording.


Figure 18.4 Schematic illustration of a U-matic (LB) tape. The audio tracks are arranged parallel to the bottom edge of the tape; the control track can be found on the top. The helical video tracks of U-matic are arranged at an angle of 4°58′ (here: displayed steeper). Each video track contains one field. Illustration: Agathe Jarczyk

Signal-to-noise ratio refers to the highest and lowest thresholds for capturing given frequencies in transmission and capture. Any aspects of the signal that exceed the highest or lowest threshold allowable in a given medium will result in distortion or artifacts in the recording (such as tape "hiss" in audio or noise in a video signal).

Tape, and more specifically recording, was exclusively the domain of broadcast organizations and professional settings until the late 1960s (see Section 18.2). In the 1970s, videotape in professional settings meant formats such as 2″ quad and 1″ type C, while cheaper formats such as the Portapak and U-matic were increasingly used by artists and amateurs alike, mostly for artistic, documentary, or home purposes, given their inherent instability and unsuitability for broadcast. In the 1980s, there were two main technological developments in video: further refinement of video processing and early digitization of the video signal. In the broadcast realm, video formats gained in resolution, with new formulations of tape coatings (different metal oxides) and smaller cassette sizes. New formats such as Betacam SP became widely adopted by artists and galleries as a master format, supplanting the previous 1″ format. LaserDiscs, the first optical disc-based video format, were introduced in the mid-1980s, and artists adopted the format for greater exhibition fidelity and control, especially for works that necessitated synchronization, as LaserDisc players could be linked together and controlled in unison via a synchronization signal. The Portapak had made video accessible to the consumer and amateur market, and this access expanded with cheap video cameras using the new cassette formats.

These cheaper consumer formats (such as 8mm and Hi8) sparked further adoption of video by artists, but hurdles remained in terms of ease of use (editing was still a cumbersome activity outside of broadcast/professional settings), and the same incompatibility with professional settings was present as with the Portapak, necessitating signal processing and time base correction to allow for professional broadcasting. In the late 1980s, the first digital video compression standard, H.261, was developed. While not widely adopted or used, its structure formed the backbone of all subsequent digital video coding standards. What is notable about these digital file formats is that they derive mainly from computer technology rather than televisual or broadcast technology. This would change in the late 1990s, when the two mediums formally met.

18.8.4 Analog to Digital Video

Digitization of the video signal began in earnest in the late 1970s and was first introduced with TBCs, which digitally quantized the analog video signal line by line, and later field by field, in the TBC's buffer to allow for stabilization and correction of time base errors. This digitization was then further developed into full-signal digital quantization for transmission and capture, moving video out of the analog realm and slowly into the digital.

Two formats signal the massive shift to digital in the 1990s: Digital Betacam and DV. Digital Betacam was introduced into professional and broadcast settings in 1993 and represents the shift from analog to digital video technology (Wikipedia—Betacam n.d.). Using the same cassettes and VTRs as Betacam SP, which made the format backward compatible with analog workflows, Digital Betacam's 10-bit 4:2:2 format allowed for full adoption of digital quantization of the video signal. The development of Serial Digital Interface (SDI) connections allowed a digital workflow to complement the analog video workflow in post-production facilities before completely supplanting it. Digital Betacam became the de facto standard for master formats because of its high resolution, its lack of generational degradation, and its robust stability compared to analog video formats.

DV, launched in 1995, is a digital video format explicitly designed for home use. Using lossy compression, DV followed the Rec. 601 standard and allowed for fully digital interlaced video. Its small cassette size and its relative stability and robustness compared to Video8 and Hi8 made it very attractive to amateur and home users. It also became widely adopted in other settings because of its unique aesthetic and ease of use, and it truly brought digital video into the consumer market.

On the heels of DV came the ability to capture the signal to digital files, along with editing software accessible to consumers, such as Apple's Final Cut Pro and iMovie applications. Artists, too, quickly flocked to the format for the aforementioned reasons, and it saw wider adoption in artistic circles into the late 1990s. For a more detailed history of magnetic recording, see Daniel et al. (1999).

18.8.5 Digital Video

With the adoption of digital video in both the professional and amateur production markets, the next area primed for analog-to-digital conversion was the consumer market, realized with the introduction of the DVD in the late 1990s. Released in 1996, the DVD format supplanted VHS as the main consumer video format. The mid-to-late 1990s also saw the introduction of discrete video file formats. Audio Video Interleave (AVI), the Windows video file format introduced around 1993, was one of the first digital video file formats to gain popular adoption (Fleischhauer et al. 2019, B-41). QuickTime, the video wrapper introduced by Apple in the early 1990s, is a proprietary format that quickly became one of the most popular file types for video capture and playback. An evaluation of the most common video wrappers can be found in Pfluger et al. (2019).

Wrappers, like QuickTime or MPEG-4, are container file formats that "wrap" the various streams that make up the digital video file. A video stream is primarily composed of the video information that has been packaged, or encoded, by a software application in a certain way. This packaging or encoding software is known as a codec (short for encode/decode). Think of the encoding type as analogous to analog video formats like VHS or Betacam: those formats were a particular type of video packaging technology that needed a specific VTR device to play them back. In a similar fashion, the codec is the software necessary to encode and decode the video in a computer environment, much like the VCR for the VHS tape. In the DV format, for example, the codec is simply referred to as DV.

The most basic aspect of a video codec is its use of compression: whether the encoding software does anything to the signal to simplify and shrink the information, either by encoding it more efficiently or by simply discarding some of it. The two main types of compression are lossless and lossy, and a video stream can compress information in two basic ways, spatially or temporally (see Section 18.7.3). Spatial, or intraframe, compression encodes the visual information within a frame in a simplified manner. For example, areas with the same color can be simplified by encoding the color information of one area once and relying on the playback software to decode and reconstruct the frame from this simplified information. Temporal compression relies on the relationship of information between frames. One method of compressing information at the frame level uses the group of pictures, or GOP, structure, in which frames are categorized by the amount of information they supply within their group. As bandwidth increased, it became possible to have discrete frames, as in film, instead of the fields that dominated standard definition video, and technology evolved to the point that all of the visual material in a frame could be encoded using progressive scanning.
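The idea behind spatial simplification can be illustrated with run-length encoding, in which a run of identical values is stored once together with a count. This is a toy model for the principle, not an actual video codec:

```python
def rle_encode(pixels):
    """Collapse runs of identical values into [value, count] pairs."""
    runs = []
    for p in pixels:
        if runs and runs[-1][0] == p:
            runs[-1][1] += 1
        else:
            runs.append([p, 1])
    return runs

def rle_decode(runs):
    return [value for value, count in runs for _ in range(count)]

row = ["blue"] * 6 + ["white"] * 2   # one row with same-colored areas
encoded = rle_encode(row)
print(encoded)  # -> [['blue', 6], ['white', 2]]
assert rle_decode(encoded) == row    # the round trip is lossless
```

Real intraframe codecs use transforms and quantization rather than simple run lengths, but the principle (encode an area once and let the decoder reconstruct it) is the same.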
Progressive scanning was introduced to take advantage of the higher resolutions now available: all of the visual information of a frame is encoded sequentially, instead of the earlier interlaced scanning that alternated the encoding between the two halves of the lines. Additionally, with these increases in resolution and the new scan type, the frame rate of digital video became higher to allow for more efficient encoding and decoding of the progressive scan: a doubling of the standard definition frequency to 50 fps for PAL and 59.94/60 fps for NTSC. With these increases in resolution and frame rate came a change in frame shape as well. The 4:3 aspect ratio was widened to 16:9 as a happy medium between 4:3 and the widescreen formats previously used in cinema, minimizing letterboxing and pillarboxing between different aspect ratios (Schubin 1996).

The bit depth, or the amount of information that can be stored for each pixel, is a further specification of the digital file stream. For color information, it determines how many variations of color are possible for the pixels comprising the video image. An 8-bit video has 256 (2⁸) possible values, while a 10-bit video has 1,024 (2¹⁰), and so forth. This may not seem like a lot of color information, but the bit depth applies to each of the three channels in spaces such as RGB, so a 10-bit video can represent 1,024 × 1,024 × 1,024 possible colors. The color gamut also increased, given the gains in resolution, scan type, aspect ratio, and so on.

Digital audio operates in a similar fashion: the audio is sampled and encoded digitally using a codec, with a sample rate and a bit depth at which the material was encoded. With analog video, the number of audio channels was fairly limited, but digital video files can carry numerous audio streams. With this came new audio arrangements, from mono and stereo to 5.1 and beyond, allowing the audio to be mixed for six speakers instead of one or two.
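The bit-depth arithmetic above works out as follows (a worked check, not code from the chapter):

```python
def levels(bit_depth):
    """Distinct values one channel can store."""
    return 2 ** bit_depth

def colors_rgb(bit_depth):
    """Possible colors when the bit depth applies to each of R, G, B."""
    return levels(bit_depth) ** 3

print(levels(8))       # -> 256 levels per 8-bit channel
print(levels(10))      # -> 1024 levels per 10-bit channel
print(colors_rgb(8))   # -> 16777216 (about 16.8 million colors)
print(colors_rgb(10))  # -> 1073741824 (about 1.07 billion colors)
```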


All of the digital information packaged into the stream also has a certain rate of transmission, known as the bit rate, typically expressed in kilobits or megabits per second (kbps or Mbps). The computing environment needs hardware and software, including sufficient storage and processing speed, capable of decoding and displaying the information; otherwise playback will stutter, freeze, or simply fail. For example, uncompressed 10-bit standard definition video has a bit rate of 224 Mbps, while an H.264 file made from it for BrightSign playback would have a bit rate of around 20 Mbps. All of these elements combine into the digital video signal, and the subsequent file, that we know today.
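These bit rates can be derived from the stream parameters. For uncompressed 10-bit 4:2:2 standard definition stored with v210 packing, six pixels occupy one 128-bit group, which yields a figure close to the roughly 224 Mbps quoted above (the exact value depends on the frame size used and on whether blanking is included):

```python
def v210_bit_rate_mbps(width, height, fps):
    """Uncompressed 10-bit 4:2:2 video with v210 packing:
    six pixels per 128-bit group, i.e. about 21.33 bits per pixel."""
    bits_per_pixel = 128 / 6
    return width * height * fps * bits_per_pixel / 1e6

print(round(v210_bit_rate_mbps(720, 576, 25)))  # -> 221 (Mb/s, PAL SD)
```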

18.8.6 Video Format Identification

To assist in the identification of video formats and to better understand encoding, see Tables 18.6 and 18.7. In addition, there are useful resources such as the Videotape Identification and Assessment Guide (Jimenez and Platt 2004), the Compendium of Image Errors in Analogue Video (Gfeller et al. 2012), and the Library of Congress Format ID (Library of Congress—Sustainability of Digital Formats 2021). Wheeler (2002) offers a comprehensive glossary of technical terms around video and video preservation.

18.9 Video Hardware The playback devices, display devices, and other associated hardware are just as critical to video artworks as the video signal itself. In installation settings, the equipment can be a critical aesthetic aspect of an artwork. As described in the next section, each display device has particular characteristics that can impact the visual display of video material. Some artworks will specify a particular type of display based on these characteristics—for example, the quality of blacks that a CRT projector provides—while others can be display-agnostic and fine to show on any monitor or to project. These elements should be discussed with all relevant stakeholders (artists, curators, exhibition staff, etc.) when selecting equipment for display.

18.9.1 Analog and Digital Display Technologies Monitors and projectors are the main means of displaying the video image once it has been decoded by the playback device. As with all video technology, video display started in the analog realm and migrated into the digital realm, with a crossover between the two occurring in the late 1990s as the transition from analog to digital took place. Analog display devices accept analog video material through analog connection types, typically BNC, RCA, or VGA, and decode the signal for the monitor or projector. If a digital source is connected, a digital-to-analog conversion occurs somewhere within the device before decoding and display. Digital display devices use a variety of technologies for display, regardless of whether an analog or digital source is connected.

Cathode Ray Tube (CRT) Monitors The CRT monitor was first introduced in the 1940s, around the time NTSC was standardized, and remained the main display device for video information until the early 2000s. The CRT monitor

Table 18.6 Video format identification

1″ Open Reel

History:
• 1″ open reel was introduced in the late 1970s, was typically used within professional broadcast environments, and was mainly used for recording master material.
• Tapes are typically seen in 20- and 60-minute durations, but because they are stored on open-reel flanges, the quantity of tape on a reel can vary.

Artistic Use:
• In the 1970s, the 1″ format was introduced as a replacement for the 2″ format and was typically used in professional, broadcast environments. With more artists using video formats throughout the 1960s and '70s (½″ open reel and U-matic, in particular), this material was routinely mastered to the 1″ format for both standardization and preservation. Because of its relatively high resolution at the time, 1″ was an excellent method of preservation for lower-resolution formats such as ½″ and U-matic, since there was only slight generational loss in mastering. It also allowed material on U-matic and ½″ to be standardized to a broadcast-ready signal, because the equipment necessary for these formats was housed in professional video facilities geared toward broadcast.

Notes and Conservation Risks:
• This material will most likely be the master of early video material, usually transferred from a lower-resolution, more consumer-friendly format, or what is known as a dub master: a generation below the official master recording, used to make copies on lesser formats (U-matic, VHS, etc.).
• This format is obsolete, so it must be migrated for future access.

½″ Open Reel

History:
• The ½″ open reels were the first commercially available videotape formats explicitly advertised for consumer use. The first ½″ open-reel format, CV, was introduced in 1965 by Sony.
• Though initially released with competing formats (AV and CV by Sony), most recorders and playback devices adhered to the EIAJ standard by the late 1960s.

Artistic Use:
• From the 1960s to the 1980s, most video artists used this format.
• The ½″ open reel was typically used as a production or master tape format due to the nascent state of video as a technology.

Notes and Conservation Risks:
• An obsolete format, ½″ must be migrated to a digital video file for long-term preservation.

¾″ U-matic

History:
• The U-matic format was the first cassette-based video format introduced to the market, in the early 1970s. The name derives from the U shape the tape makes when wrapped around the head drum inside the VTR.
• U-matic came in two cassette sizes: the smaller cassette containing up to 30 minutes, the larger up to 60 minutes.

Artistic Use:
• U-matic was a widely adopted format among artists due to its relative robustness, portability, ease of use, and ease of editing.
• Typically used as a production, master, and exhibition format, U-matic was selected for many different applications in an artist's practice, again primarily for economic reasons and thanks to its wide dissemination.

Notes and Conservation Risks:
• An obsolete format, U-matic must be migrated to a digital video file for long-term preservation.

VHS/S-VHS

History:
• Introduced in late 1976/early 1977, the VHS format (Video Home System) was developed by the Japan Victor Company (JVC) as a consumer format competing with Betamax.
• Ubiquitous in both consumer and professional settings, VHS was the common video format for home viewing and for the circulation of video images.
• With the introduction of the compact version, the VHS-C cassette, the first miniaturization of video cameras appeared. With the help of adapter cassettes, VHS-C can be played back on a normal VHS deck.
• S-VHS was introduced in 1987 with an improvement in bandwidth from 240 to 400 lines of horizontal resolution, separate management of the Y/C signal, and the ability to record high-fidelity ("Hi-Fi") audio.

Artistic Use:
• Because of its dominance as the main consumer video format in the 1980s and '90s, VHS was commonly used by artists and instrumental in their production, post-production, and exhibition practices. Artists could shoot on VHS, work with it in post-production, and frequently "master" to it.
• The format can range from a source format for works, to the master of a work, to a format used for exhibition in a variety of settings.
• With cassettes containing up to 240 minutes of video, VHS offered the longest playback durations and was for a long time a widely used tape format for exhibition copies.

Notes and Conservation Risks:
• An obsolete format, VHS must be migrated to a digital video file for long-term preservation.

Betacam/Betacam SP

History:
• Betacam was introduced by Sony in 1982 as the first analog video format that allowed the recording of component signals.
• In 1986, Sony released Betacam SP (Superior Performance), which improved horizontal resolution (from 360 to 450 lines) and cemented Betacam SP's place in professional video settings well into the early 2000s.

Artistic Use:
• Betacam SP (and Betacam) are commonly seen in collections as either masters or dub masters. Artists would typically shoot on a different, inexpensive format (U-matic, VHS, Betamax, etc.) and finish their work on Betacam SP as a format appropriate for mastering and preservation.
• Sometimes you will see pairings of material on Digital Betacam and Betacam SP, where the Digital Betacam serves as the master format and the Betacam SP as the dub master. In this case, copies are made from the dub master, while new dub masters are generated from the master, here the Digital Betacam. These pairings are fairly common for material that was migrated from a lower-resolution format (e.g., U-matic) to a higher-resolution format for preservation and access in the late 1990s/early 2000s.

Notes and Conservation Risks:
• An obsolete format, Betacam SP must be migrated to a digital video file for long-term preservation.

D5

History:
• Developed by Panasonic, D5 was introduced to the market in 1994 and surpassed D1 in quality, recording with a bit depth of 10 bits instead of 8. It is the only standard-definition digital format that uses entirely uncompressed recording. Due to its high recording density, the format was available in three different cassette sizes. Later, Panasonic developed a compressed HD format based on D5, named D5-HD.

Artistic Use:
• D5 was not, or only scarcely, used by artists due to limited availability and very high price. Few collections have used this format as a preservation master, despite the fact that, unlike Digital Betacam, for example, it records without compression.

Notes and Conservation Risks:
• An obsolete format, D5 must be migrated to a digital video file for long-term preservation.

Digital Betacam

History:
• Released in 1993, Digital Betacam became the standard for broadcast television in the 1990s and was used well into the 2010s before being supplanted by file-based workflows.
• Digital Betacam records a 4:2:2, 10-bit YUV digital video signal (with mild compression) at a resolution near the peak of what is possible for standard-definition video.
• In comparison to D1, Digital Betacam was significantly less expensive, and its VTRs were backward compatible with Betacam and Betacam SP, which made adoption easier for studios actively using those formats.

Artistic Use:
• Digital Betacam is commonly seen as a master format for works, much like earlier analog carriers such as 1″ and Betacam/Betacam SP. It offered the additional benefit of not suffering generational loss when copied via digital signal paths.
• Digital Betacam was a more expensive format when introduced, but artists mastered to it just as broadcast facilities did.

Notes and Conservation Risks:
• An obsolete format, Digital Betacam must be migrated to a digital video file for long-term preservation.

DV/DVCAM

History:
• Introduced in 1995, the DV video format uses lossy compression and follows the ITU-R BT.601 standard. It uses DCT compression with 4:1:1 chroma subsampling for NTSC and 4:2:0 for PAL, both at a data rate of 25 Mbps. The information is encoded on tape, or wrapped within different video file formats (AVI, QT, etc.) when migrated into a computing environment.
• The tape format uses ¼″ tape in a variety of cassette sizes corresponding to different formats (MiniDV, DV, DVCAM, DVCPRO).
• MiniDV was perhaps the most widely adopted cassette format in both amateur and professional settings, from the 1990s well into the late 2000s.
• Due to the concurrent popularity of accessible non-linear editors for consumers, such as Apple's iMovie, the format could be used in a production environment and easily migrated onto a computer for post-production and finishing.

Artistic Use:
• DV was widely used by artists upon its introduction, and a large variety of material was shot on this popular format. The tapes themselves, however, typically comprise a small percentage of collections, as the format was not typically one that was mastered to.
• DV material was typically either migrated to a higher-quality format such as Digital Betacam or exists with a digital file as its master. It was typically displayed using popular exhibition formats such as VHS and DVD.

Notes and Conservation Risks:
• An obsolete format, DV must be migrated to a digital video file for long-term preservation via a digital connection type.

HDCAM/HDCAM SR

History:
• HDCAM, introduced in 1997, and HDCAM SR, introduced in 2003, are high-definition video formats. They were among the first capable of recording high-definition video (1440 × 1080 for HDCAM and 1920 × 1080 for HDCAM SR) and could record at a variety of frame rates and in both interlaced and progressive modes.
• Both formats were used for a period in the 2000s before being supplanted by file-based workflows for high-definition content, given the move to fully digital broadcasting.

Artistic Use:
• Much like Digibeta, HDCAM and HDCAM SR served as master formats for high-definition works with high production value at the time, primarily feature films and other large-budget productions. It was atypical to convert standard definition to HDCAM, as migration to digital files was already common by this time.

Notes and Conservation Risks:
• As obsolete formats, HDCAM and HDCAM SR must be migrated to a digital video file for long-term preservation.

LaserDisc

History:
• Introduced in 1978, LaserDisc was the first optical disc technology used for video playback. Originally introduced as a high-end consumer format, the LaserDisc had a resolution roughly equivalent to 1″ Type C videotape, about 440 lines of horizontal resolution.
• LaserDiscs could either be professionally pressed or recorded by "burning" the image into a dye layer.

Artistic Use:
• LaserDiscs became a popular exhibition format because they offered high resolution with no need to rewind videotape.
• The format could be exhibited fairly seamlessly on a loop in a gallery setting, while a tape would always necessitate some time for rewinding.

Notes and Conservation Risks:
• An obsolete format, LaserDisc must be migrated to a digital video file for long-term preservation via a digital connection type.

DVD

History:
• The Digital Versatile/Video Disc, or DVD, was codified in 1995 and introduced to the consumer market in 1996.
• DVDs have the same resolution as D-1 videotape, which, like LaserDiscs before them, made higher-resolution video accessible in consumer and exhibition settings.

Artistic Use:
• DVDs supplanted LaserDiscs and VHS tapes as the preferred exhibition format for video works from the late 1990s through the 2000s.
• DVD playback offered the highest-resolution consumer video format at the time, approximately 480 lines of horizontal resolution.

Notes and Conservation Risks:
• An obsolete format, DVD must be migrated to a digital video file for long-term preservation via a digital connection type.

Table 18.7 File encodings

Uncompressed

History:
• Uncompressed is an encoding format for video in which the information is encoded as YUV 4:2:2 intra-frame data with no compression applied, typically at 8- or 10-bit depth.
• It originated with the development of the QuickTime video software specification for storing uncompressed YUV video data, the primary encoding for digital tape formats.
• Its wide adoption stems from its use in the migration of analog tapes to digital video files, where analog-to-digital capture devices interfacing with non-linear editing software like Final Cut Pro led to its use in post-production facilities.

Artistic Use:
• Uncompressed is still a widely used encoding format for digital video files that result from analog-to-digital migrations.
• The choice of 8-bit versus 10-bit encoding was originally a space-saving measure: less critical material was transferred at 8 bits to reduce storage. This is no longer a concern in modern computing environments.

Notes and Conservation Risks:
• True to its name, uncompressed applies no compression to the video information after it has been encoded (originally as YCbCr, or whatever the pixel format and sampling structure are, e.g., 4:2:2 or 4:4:4:4), and it is a good master format for analog-to-digital migration projects.
• There is minimal documentation as a result of the various QuickTime implementations; it was retroactively codified within SMPTE ST 377:2011, annex G, included in amendment 2.

ProRes

History:
• ProRes is a proprietary video encoding format developed by Apple Inc. and introduced in 2007.
• It is a good intermediary video codec, using intra-frame compression in which each frame is stored individually, allowing for redundancy and less compression of visual information.
• It is a very popular format for video production because of its quality and its effective compression, which reduces file size.

Artistic Use:
• This format is widely used by artists in their video production practice and is a very common master file encoding format.

Notes and Conservation Risks:
• It is a proprietary video format, with Apple Inc. holding the rights to the encoding. However, the format is freely available and has been reverse-engineered and incorporated into open-source tools like FFmpeg since 2011. Given its near ubiquity and adoption, it is a relatively robust and versatile master video format and suitable for most collections.

H.264

History:
• H.264 was developed by the Video Coding Experts Group, together with the Moving Picture Experts Group, and standardized in 2003.
• It is a flexible video standard that allows lower-bit-rate video to maintain sufficient visual quality.
• It is widely adopted, used in most modern video applications (television, internet, etc.), and is the most common encoding format, especially for exhibition copies, as of this writing.

Artistic Use:
• H.264 is the typical encoding format for exhibition purposes because of its low bit rate and file size and its relatively high video quality.
• It has been used by artists since the early 2010s and is still in active use as of this writing.

Notes and Conservation Risks:
• H.264 is a highly compressed format with inter-frame encoding, so frames are not discretely represented.
• It is typically not a suitable master format unless it is what the artist produced as the final master and access to the original post-production project is no longer possible.
• H.264, as commonly used, applies 4:2:0 chroma subsampling. When producing exhibition copies with H.264, color can deviate from source material with higher subsampling rates.
• H.264 exhibition files typically store video as progressive frames only; deinterlacing SD video with interlaced frames is therefore necessary.

MPEG-2

History:
• Developed by the Moving Picture Experts Group (MPEG) in the mid-1990s, MPEG-2 is the standard encoding format for DVDs and television.

Artistic Use:
• MPEG-2, packaged as VIDEO_TS directories on DVDs, was widely used by artists in the 2000s as a popular exhibition format.

Notes and Conservation Risks:
• MPEG-2 is a widely documented video encoding format but has declined in adoption along with the decline in DVD use.
• The main preservation risks for MPEG-2-encoded material largely center on the risks to DVD optical discs.

DV

History:
• Developed for the DV tape format in the mid-1990s, DV was a popular amateur video format with wide adoption.
• DV is a standard-definition video encoding format with 4:1:1 (NTSC) or 4:2:0 (PAL) chroma subsampling and both tape- and file-based applications.

Artistic Use:
• Artists widely used the DV file format in the 1990s; many works were made and mastered in the format and exported back out to DV tape.

Notes and Conservation Risks:
• DV's risks largely concern the loss of metadata when migrating from tape to file if an incorrect signal path is chosen.

DPX

History:
• The Digital Picture Exchange (DPX) format is an image sequence encoding format originally developed for scanning analog film into digital environments for post-production work.
• DPX is an uncompressed, resolution-agnostic format. It is currently used in high-end video applications.

Artistic Use:
• DPX and other image sequence file formats like REDCODE RAW are gaining popularity for artists' productions due to their ability to render high resolutions.

Notes and Conservation Risks:
• DPX, and other image sequence formats, should only be considered a master for material that results from that workflow. DPX is not considered an ideal master format for SD and HD video applications because of its cumbersome properties and its original development for film post-production and scanning.

FFV1

History:
• FFV1 is a lossless intra-frame video encoding format.
• It is free and open-source and in the process of becoming standardized.

Artistic Use:
• There is limited artist adoption of the format, although it has gained popularity in file-sharing and archiving communities.

Notes and Conservation Risks:
• FFV1 is an open-source, documented lossless encoding format that has gained popularity as an archiving format for video material in archives and museums.
• Its lossless compression encodes video material more efficiently than uncompressed formats, saving on storage needs.
• It can also assign frame-level fixity to the video material, affording built-in redundancy and more granular accuracy in verifying the integrity of the file.
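The frame-level fixity attributed to FFV1 in Table 18.7 can be illustrated conceptually in Python. This is a sketch of the idea only, using CRC-32 over hypothetical per-frame byte payloads, not FFV1's actual slice-CRC implementation:

```python
import zlib

# Hypothetical per-frame payloads; in FFV1 each frame/slice carries a CRC.
frames = [b"frame-0-bytes", b"frame-1-bytes", b"frame-2-bytes"]

# Record a checksum per frame at write time ...
stored_crcs = [zlib.crc32(f) for f in frames]

# ... then, during a later fixity check, recompute and compare frame by frame.
def damaged_frames(frames, stored_crcs):
    return [i for i, (f, crc) in enumerate(zip(frames, stored_crcs))
            if zlib.crc32(f) != crc]

print(damaged_frames(frames, stored_crcs))  # [] -> every frame intact

frames[1] = b"frame-1-corrupted"
print(damaged_frames(frames, stored_crcs))  # [1] -> damage localized to frame 1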

Agathe Jarczyk and Peter Oleksik

consists of a vacuum-sealed glass tube containing a source of electrons (typically called the electron gun) and a system for directing these electrons toward a phosphor-coated screen. When the electrons strike the screen, they illuminate the phosphor, tracing the image across the screen over time. The electrons are steered by a series of magnets placed around the glass tube, which interpret the synchronization signals transmitted in the video signal to move the beam left and right and up and down, tracing each interlaced field. During the CRT's 60-year history, the only significant change to the technology came with the introduction of color in the 1950s, which required red, green, and blue phosphors and three electron guns instead of one to excite each of the RGB phosphor regions. CRTs typically have a 4:3 aspect ratio, though a small number of 16:9 CRTs were produced in the late 1990s/early 2000s. CRTs require manual calibration, in which a test pattern, such as the Society of Motion Picture and Television Engineers (SMPTE) color bars, is fed to the monitor to adjust the luminance, chrominance, and picture geometry. There are various levels of calibration one can engage in with CRTs, from minimal adjustments to the highlights, blacks, hues, and saturation to more in-depth calibration involving adjustment of the tube itself. The authors recommend working with an experienced technician to review and set up this technology. Additionally, CRTs are inherently analog display devices and require an analog signal for display. As of this writing in 2021, CRT displays are increasingly hard to find and are no longer being manufactured.

Cathode Ray Tube (CRT) Projectors Operating much like the CRT monitor, a CRT projector uses three vacuum-sealed tubes, corresponding to the three primary colors red, green, and blue, to project the video image, with a system of lenses affixed to the end of each tube to magnify the projected image. The red, green, and blue channels are projected one on top of the other to achieve full image detail and color rendition. CRT projectors were in use from the mid-1960s to the early 2000s and offered excellent image quality for video material, with some artists preferring the CRT projected image to more modern projector displays. A CRT projector needs an analog video input, typically a composite or component signal. These projectors have a fairly low lumen output, between 500 and 1,200 lumens, so they are sensitive to ambient light and achieve the best picture clarity in completely dark environments. Convergence of the red, green, and blue tubes on the screen is set manually via reference patterns, a fairly laborious process that often needs to be revisited periodically throughout an exhibition. The authors advise working with an experienced technician when calibrating and installing a CRT projector. As with CRT monitors, CRT projectors are no longer in production and are considered obsolete technology. For more details on the technology and the alignment of monitors and projectors, see Simpson (1997).

Plasma Display Panels Plasma monitors, popular briefly in the 2000s, are another display technology developed as the modern flat screen became standard. Imagine a grid of thousands of cells, each filled with gas and coated on the interior with one of three colored phosphors (RGB); these phosphors are very similar to those used in color CRT technology. When a cell is excited by electricity, the gas generates ultraviolet light, which in turn causes the phosphor coating to emit visible red, green, or blue light. Plasma displays do not require a backlight, as each cell is self-emissive. Plasma offered deeper blacks, higher contrast, faster response times, a greater color spectrum, and a wider viewing angle than LCDs until LCD technology improved; as a result, plasma displays are no longer manufactured. Plasma displays were for a short time available in a 4:3 aspect ratio, soon replaced by 16:9. For more on plasma displays, see Castellano (1992).

Liquid Crystal Display (LCD) Projectors LCD projectors were first introduced in the early 1990s and were initially manufactured for business and educational settings, intended to display word-processing and presentation content. They operate by taking a single source of white light, splitting it via mirrors into its red, green, and blue wavelengths, and directing each color to its corresponding LCD panel. Each panel consists of thousands of pixels that either block the light or allow it to pass through. The red, green, and blue images are then reconverged before the lens projects the light onto a screen. If you observe an LCD image closely, you can see the tiled pattern of the image, which is the pixel matrix the light passes through in the projector. LCD projectors come in a variety of resolutions and formats and have become more robust as the technology has evolved. Older LCD projectors had a fairly low lumen output, but newer models offer higher brightness and sharper contrast and are capable of displaying fairly high-resolution images.

Liquid Crystal Display (LCD) Monitors Using the same underlying technology as LCD projectors, LCD monitors were the first flat-screen monitors to enter the consumer market and eventually replaced CRT monitors as the dominant display for the video signal. Like the projectors, LCD monitors are backlit, with a grid of pixels whose liquid crystals rotate to allow more or less light through, depending on the picture content. Because this grid is composed of filters rather than an opaque panel, some light always passes through, so blacks are not as deep. Originally, cold-cathode fluorescent lamps were used as the light source, but they were replaced by LEDs as the technology evolved. The display is purely digital and requires a digital video signal input. Calibration can be done manually via the computerized menus within the monitor itself. LCD monitors typically have a 16:9 aspect ratio.

Organic Light-Emitting Diode (OLED) Monitors OLED monitors were first introduced in 2004 and have become one of the better monitor types for digital video display. Where LCD suffers from washed-out blacks and lag in pixel response, each OLED pixel has its own light source instead of sharing a single white backlight, which allows for greater color accuracy and richer image detail. As of this writing, OLED monitors are still quite expensive compared to LED-backlit models but offer better image quality. See Noordegraaf et al. (2013) for more examples.

Digital Light Processing (DLP) Projectors Projector technology has also advanced with Digital Light Processing (DLP) projectors. They use a digital micromirror device (DMD), covered in an array of millions of mirrors, each less than one-fifth the width of a human hair, with each mirror corresponding to one pixel. The mirrors tilt back and forth, reflecting light as necessary to create a highly detailed grayscale image, usually with 256 levels of gray. The color image is created in one of two ways. In the single-chip method, the DMD chip is synchronized with a rotating color wheel consisting of different chromatic sections that pass between the lamp and the DMD chip. The green components of the image are displayed on the DMD when the green section of the color wheel is in front of the lamp, and the same is true for the red, blue, and other sections. The colors are thus displayed sequentially, but at a sufficiently high rate that our brains combine them into a full-color image. This method, however, can produce an anomaly known as the rainbow effect: a brief, visible separation of the colors can appear when the viewer moves their eyes quickly across the projected image. Some people perceive these rainbow artifacts, while others never see them at all; they are said to be most noticeable when the projector is showing black-and-white content. The 3-chip projector uses a prism to split light from the lamp, and each primary color of light (red, green, and blue) is routed to its own DMD. These images are then recombined and routed out through the lens. Three-chip systems are found in higher-end home theater projectors, large-venue projectors, and the DLP cinema projection systems used in digital movie theaters. The 3-chip version gives a brighter image than a 1-chip, without the rainbow effect. Because DLP technology is based on mirrors, which are either on or off, there is no burn-in as there can be in liquid crystal display panels.

Laser Projectors The typical lamps used in video projectors need to be replaced after a certain number of hours of use, usually 3,000–5,000 lamp hours: the lamp gradually fades and must be replaced. These lamps commonly contain mercury and can cost several hundred dollars each. LED light sources are one alternative, but their drawback is that they cannot produce the same brightness levels as commonly used mercury-based projector lamps. A second alternative light source is the laser (light amplification by stimulated emission of radiation). As with LEDs, each laser source only needs to produce one color, making them more energy-efficient. Lasers, however, can produce high levels of brightness with less heat generation, require no warm-up or cool-down time, and last approximately 20,000 hours, greatly minimizing the need for lamp replacement. Lamp-based projectors currently dominate the market, but laser projectors continue to increase in popularity. For more details on projectors, see Sætervadet (2012) and Finzel (2003).

18.9.2 Analog and Digital Playback Technologies As technology has progressed, so have the methods of playing back video material. Originally, playback was only possible using cumbersome open-reel decks that required manual threading of the videotape; now there are small, inexpensive media players for digital video files. Each piece of equipment has unique characteristics that can affect the display of the video signal if the wrong video type is played back. For example, interlaced video material displayed in a progressive manner will expose interlacing artifacts to the viewer. It is important to review the key characteristics of each piece of equipment and choose the hardware best suited to the material you wish to display, always in dialogue with the artist, audiovisual technicians, curators, and other stakeholders in the display of a video artwork.
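The interlacing issue mentioned above can be illustrated with a toy sketch (the "frame" data here are hypothetical strings standing in for pixel rows, not a real codec): an interlaced frame carries two fields woven into its odd and even lines, and showing them as one progressive frame exposes comb artifacts whenever there is motion between the fields.

```python
# A tiny "frame" as a list of scan lines; strings stand in for pixel rows.
frame = ["top0", "bot0", "top1", "bot1", "top2", "bot2"]

# Separating the interlaced frame into its two fields:
top_field = frame[0::2]     # lines 0, 2, 4 ...
bottom_field = frame[1::2]  # lines 1, 3, 5 ...

print(top_field)     # ['top0', 'top1', 'top2']
print(bottom_field)  # ['bot0', 'bot1', 'bot2']

# Re-weaving the fields restores the original frame. If the two fields were
# captured 1/50 or 1/60 of a second apart and the scene moved in between,
# the woven frame shows "combing" when displayed progressively.
woven = [line for pair in zip(top_field, bottom_field) for line in pair]
assert woven == frame
```

Deinterlacing filters work on exactly this structure, either discarding one field or interpolating between the two before progressive display.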

304

Caring for Analog and Digital Video Art

Open-Reel Playback Open-reel decks were the first VTRs for analog video and required manual threading of the tape for playback. The formats that use open-reel playback are 2″, 1″, and ½″. The 2″ and 1″ formats were largely confined to post-production facilities, but the ½″ open reel was targeted toward consumers and would have been used in early exhibition settings for video. In an exhibition setting, the tape would need to be manually threaded, triggered for playback, and then manually rewound.

Cassette Playback Cassettes, introduced with the U-matic format, were the predominant carriers for video material from the 1970s until tape workflows were largely supplanted by digital video files. Cassette VTRs were used in both professional and consumer settings and vary in sophistication and quality depending on make and model. Typically, the tape is mechanically unspooled within the VTR and wound around a head drum, which contains the magnetic read and write heads necessary for reading and writing the video information. Cassette playback was used in exhibition settings, but there would inevitably be some blank period while the cassette rewound. U-matic and VHS were the main formats used in these types of applications.

Optical Disc Playback

First realized with the LaserDisc format, optical disc players offered higher image quality in a format well suited to exhibition, with a quicker method of looping the video material: the read head simply returns to the start instead of waiting for a tape to rewind. DVD and Blu-ray playback became popular for exhibition because the discs were easier and cheaper to produce, offered higher resolution, and the players were more robust and less prone to failure than LaserDisc players. DVD and Blu-ray discs were the main exhibition copies in the 2000s before being supplanted by digital video files and media players. One drawback of optical playback is that a disc can become unstable as it spins, disrupting playback or possibly harming the playback hardware.

Media Player Playback

Media players utilize an entirely digital, file-based playback system. There are numerous types of media players, which vary in their ability to play back higher resolutions and higher-bandwidth files. Media players are the current preferred method of display for video material because of their compact size and durability. Typically, a simple software system plays the file from a solid-state memory device, so there are no moving parts to slowly degrade as there are in other playback devices for video.

Agathe Jarczyk and Peter Oleksik

Computer Playback

Computers, either desktop or laptop, can also be used for the playback of video material. With their ability to handle high-resolution files requiring high processing speeds, computers are typically utilized in theatrical settings. However, they can also be used in gallery exhibition settings with specialized playback software that can be set to loop or to route video to different outputs.

18.9.3 Ancillary Devices

Synchronization devices, common in multichannel displays, were first introduced with the LaserDisc format and later expanded to DVD. Individuals such as Dave Jones designed synchronizers that would communicate with specific models of LaserDisc or DVD players so that each channel of video would stay in sync with the others. In digital, file-based display settings, this synchronization is now achieved through both hardware and software. Different media players allow for different levels of synchronization depending on how it is achieved, ranging from a loose sync, where the channels are kept only roughly aligned with each other, to a higher consistency sometimes referred to as frame sync, where each frame of video is kept in step across the various channels being displayed. Refer to the documentation for each particular display device to determine the type of synchronization it supports.

18.9.4 Analog and Digital Signal Connections

Playback devices connect to displays and other hardware through cables and connectors. Different connection types have different applications, so it is important to be aware of their salient features; otherwise, the signal can be degraded or noise can be introduced, which can mar the eventual display of the work. Table 18.8 summarizes the most common connection types used in video systems.

18.9.5 Sound Systems

Audio on video has increased in quality and quantity, from the single mono audio track of early video technology to the current possibilities of multiple audio tracks for playback or synchronization with separate audio playback systems. The sound system is integral to most video displays and should be chosen to reproduce the audio configuration of the video material accurately (see Chapter 19).

Amplifiers

Amplifiers process the audio signal before it reaches the speakers. This is also known as a gain stage, where volume is determined. There are numerous amplifiers for different applications, so it is best to analyze the audio needs of a work and then collaborate with an audio engineer, or someone with equivalent expertise, to decide on the necessary amplification. A receiver, a form of amplifier, is critical for works with a surround sound arrangement in the soundtrack, as it decodes the audio arrangement and sends each signal to the appropriate speakers in the space.

Equalizers

An equalizer allows you to adjust the different frequencies within an audio signal to help tune the audio more effectively in a given space. Some video works have very involved audio components and may therefore require an equalizer in their playback system for precise audio adjustments.

Table 18.8 Analog and digital cabling for video display

RCA (analog)
Typical usage: consumer-oriented connection type, made for short video runs (playback to monitor).
Applications: common in home and amateur uses or in short-run environments.
Issues: prone to interference over long signal runs.

BNC (analog or digital)
Typical usage: professional or high-end applications for both analog and digital video.
Applications: common in post-production facilities and in professional-grade equipment.
Issues: electrical interference can creep into BNC connectors and cables, so care must be taken never to run power parallel to BNC runs.

VGA (analog)
Typical usage: Video Graphics Array (VGA) originated in computer displays; a 15-pin connection type that allows for component transmission.
Applications: VGA was fairly common in computing environments in the 1990s, connecting computers to their CRT displays. It was largely supplanted by DVI but can still be used in analog display environments.
Issues: obsolete, so it may be hard to find cables.

DVI (analog or digital)
Typical usage: Digital Visual Interface (DVI) was developed to supersede VGA and allows for the transmission of purely digital video at up to 1920 × 1200.
Applications: commonly used for connecting computing environments (e.g., computer towers) to display devices like projectors and computer monitors.
Issues: care must be taken to determine whether the output signal is analog or digital, as DVI can carry both.

HDMI (digital)
Typical usage: High-Definition Multimedia Interface (HDMI) is a purely digital interface for the transmission of audiovisual signals.
Applications: fairly ubiquitous connection type seen in most consumer and professional settings; as of this writing, HDMI is perhaps the most commonly used connection type for digital display to monitors and projectors.
Issues: HDMI connectors can be fragile, so care must be taken when using this cable in exhibition settings.

Connections

As with the visual element of video, there are particular connection types in audio applications that can affect signal quality. Table 18.9 contains the most ubiquitous connection types and their salient features.

Table 18.9 Analog and digital cabling for audio signals

Connection type   Signal    Balanced or unbalanced
RCA               Analog    Unbalanced
¼″ TS             Analog    Unbalanced
¼″ TRS            Analog    Balanced
XLR               Analog    Balanced
EBU               Digital   n/a
HDMI              Digital   n/a
SDI               Digital   n/a

Balanced versus Unbalanced Sound

Unbalanced audio carries the signal on a positive conductor while the negative conductor also serves as the ground. Balanced audio, on the other hand, keeps the positive and negative conductors fully separate from a dedicated ground, which cancels out interference picked up within the cable itself: the receiving end flips the phase of one leg and removes anything that remains identical on both. From a practical point of view, unbalanced connections are more suitable for shorter runs, where interference cannot build up in the cable, or for more "forgiving" situations where a minor hum or buzz is not noticeable or of concern.
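The phase-flip cancellation at the heart of balanced audio can be shown with a toy numeric example; the integer sample values are arbitrary and purely illustrative.

```python
# Toy illustration of why a balanced line rejects interference: the signal
# travels as +s on the "hot" leg and -s on the "cold" leg; hum induced
# along the run appears identically on both legs, and the differential
# receiver subtracts one leg from the other, cancelling the hum.
signal = [5, -2, 8]  # arbitrary audio sample values
hum = [1, 1, 1]      # interference induced equally on both conductors

hot = [s + n for s, n in zip(signal, hum)]    # +signal + hum
cold = [-s + n for s, n in zip(signal, hum)]  # -signal + hum

# Differential input: (hot - cold) / 2 recovers the signal, hum-free
received = [(h - c) // 2 for h, c in zip(hot, cold)]
print(received)  # [5, -2, 8] -- identical to the original signal
```

On an unbalanced line there is no inverted copy to subtract, so the same hum would simply add to the signal.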

Active versus Passive Speakers

Active, or powered, speakers use an independent power source with an amplifier integrated into the speaker for individual adjustment and the ability to raise the gain further. Passive speakers derive their power from the audio signal transmitted from the amplifier, which limits the line run, meaning that the speakers' cabling cannot be too long, or the power will diminish (see Section 19.5.3).

18.10 Preparing Video Artworks for Exhibition

In preparing a video artwork for display, it is important to assemble all of the information about the piece that was gathered during the acquisition phase, along with new research into past displays of the work, and to begin a dialogue with all of the stakeholders who will be involved in the installation. Everyone will need to understand the documented parameters of the work and realize them in physical space, so it is important to engage the appropriate technicians who have the expertise to select and install the equipment. This should also be done in dialogue with the artist and curators so that their goals for the work's exhibition can be achieved both technically and practically. It is helpful to document these decisions along the way so that they can inform future exhibitions of the work.



18.10.1 Preparing the Equipment for Exhibition

The playback equipment, the display equipment, and the connections and signal types between them together determine how a video is rendered in the gallery space, as the relationship between these three aspects of display can alter how the image and sound ultimately appear. Familiarity with the devices and equipment is important, as each piece of equipment has its strengths and weaknesses.

Testing/Calibration

Before equipment is used in an exhibition, it should be tested and calibrated. Each piece of equipment should be tracked physically, whether it is dedicated or from an equipment pool, and its salient details should be documented and monitored. It is worthwhile to record how long a device has been on active display to gain a rough sense of how much longer it can be deemed reliable, or whether its age is a concern and a replacement will be needed. To check that the display equipment is running reliably, a known video signal should be used in this review. For the calibration and testing of equipment such as CRT monitors, plasma displays, LCD monitors, and projectors, different standardized test patterns are suitable. The Society of Motion Picture and Television Engineers (SMPTE) test pattern is the most common standard for adjusting the basic aspects of the video signal: luminance, chrominance, and black and white levels (Hodges 2016). There are a number of additional test patterns for adjusting and documenting other image characteristics, such as color purity, sharpness, and white balance, and image geometry characteristics like distortion and position. Other test patterns have been developed for equipment with special features, such as monitors with 3D capabilities. Because different applications call for different patterns and tools, the authors recommend working with a trusted engineer or video expert to assist in this assessment and calibration and to advise on the most suitable test patterns.
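For a quick check of a display chain, a color-bar test signal can also be generated directly with FFmpeg's built-in `smptebars` source. The helper below is a sketch: the function name, duration, size, and encoding settings are illustrative assumptions, not a calibration procedure.

```python
# Sketch: build an FFmpeg command that renders SMPTE color bars from the
# lavfi "smptebars" generator, for exercising a playback/display chain
# with a known signal. All settings here are illustrative defaults.
def smpte_bars_cmd(out_path, seconds=30, size="1280x720"):
    return [
        "ffmpeg",
        "-f", "lavfi",  # read from a synthesized source instead of a file
        "-i", f"smptebars=duration={seconds}:size={size}:rate=30",
        "-c:v", "libx264",      # widely playable encoding
        "-pix_fmt", "yuv420p",  # maximizes player compatibility
        out_path,
    ]

print(" ".join(smpte_bars_cmd("bars.mp4")))
```

A generated file like this is convenient for pool equipment, but it does not replace the dedicated reference signals and metering an engineer would use for formal calibration.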

Obsolescence Tracking

All technology becomes obsolete over time, so it is important to track market trends for equipment and determine the equipment needs of the works in your collection. For instance, CRTs are no longer in production, but a handful of vendors still offer refurbished CRTs at a significant price. As this practice becomes increasingly rare, it will be necessary to determine which works in a collection absolutely need this technology for display, and thus how much of this equipment to acquire and conserve in order to exhibit this material on CRT technology in the future, balanced against financial, ecological, and resource limitations.

Sourcing, Backups, and Equipment Pools

Video artworks that rely on a specific technology like CRT or plasma monitors, but whose parameters are flexible enough that a specific make or model is not required, can use equipment from a shared equipment pool. Pooled display equipment has the advantage that it remains in regular use with regular assessment and calibration (see Chapter 15). Pooling also allows many works to draw on the same equipment source, which saves space and resources. Acquiring service manuals when sourcing equipment is important to support repair and maintenance efforts; spare parts, consumables, and recommended service cycles are usually well described in them. Obsolete equipment that has been sourced second-hand should ideally be examined, serviced, and tested by a specialist technician or engineer. Information on runtimes, or how long the equipment was in use prior to acquisition, should be available for professional video equipment but usually not for historical or consumer equipment. Older equipment is more likely to fail during an exhibition, and the authors recommend having a sufficient number of spare devices at hand. Sets of similar equipment, such as those designated for multichannel works, should be calibrated for a consistent rendering. Periodic recalibration during longer exhibitions might be necessary.

18.10.2 Preparing the Audiovisual Media for Exhibition

Audiovisual media must be carefully prepared for display. In the current technological environment, media in a gallery or exhibition setting is typically played from digital video files on digital display technology. It is critical to review the needs of the work in question and then prepare it for the technology that has been selected for its display, whether because the work requires it or because it was chosen for this particular exhibition. A video's aesthetic and technical characteristics and the artist's preferences need to be taken into account when preparing media for an exhibition. This requires familiarity with the display technology (the playback device and the aural and visual display hardware), along with the relevant features of the media to be displayed and how to prepare the work so that it is rendered accurately on this technology.

Transcoding

In most applications, it will be necessary to transcode the material to fit the display parameters for the piece. For example, you may have an uncompressed file, but your playback device only supports H.264 encoding. Transcoding must be done in a manner that migrates the essential aspects of the source material into the new encoding scheme without affecting the overall aesthetic of the work through unnecessary compression, encoding errors, or other issues that may present themselves. This should be evaluated case by case, drawing on knowledge of the material and skill in transcoding (see Section 18.7).
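As a hedged sketch of the uncompressed-to-H.264 case above, the helper below builds an FFmpeg command; the function name, CRF value, and audio settings are illustrative assumptions to be weighed against each work's aesthetics, not a recommended preset.

```python
# Sketch: build an FFmpeg command that transcodes a master file into an
# H.264 exhibition copy. All names and settings here are illustrative.
def transcode_h264_cmd(src, dst, crf=18):
    return [
        "ffmpeg", "-i", src,
        "-c:v", "libx264",
        "-crf", str(crf),    # lower = higher quality; 18 is a conservative choice
        "-preset", "slow",   # better compression at the cost of encoding time
        "-c:a", "aac", "-b:a", "320k",
        dst,
    ]

print(" ".join(transcode_h264_cmd("master.mov", "exhibition.mp4")))
```

The resulting copy should always be reviewed against the master before display, since encoding artifacts can be content-dependent.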

Editing

Editing may be required during the preparation of exhibition material and should be done with a clear understanding of the significant properties of a work. This is where the review of master and previous exhibition formats is crucial to determine how the piece was prepared in the past. For example, due to the limitations of playback devices, a short work was typically looped many times on a long tape or optical disc so that the rewind or replay would only happen intermittently. Today, this aspect of display is no longer a concern, and a file can loop almost infinitely.
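When a file is meant to loop, it can be worth rendering a few repetitions to check that the join is seamless (no visible jump or audible pop). A minimal sketch using FFmpeg's `-stream_loop` input option follows; the function name and the stream-copy choice are illustrative assumptions, and stream-copied looping can behave differently across containers.

```python
# Sketch: render a file repeated a few times to preview its loop point.
# -stream_loop N repeats the input N extra times before the output options.
def loop_preview_cmd(src, dst, repeats=3):
    return [
        "ffmpeg",
        "-stream_loop", str(repeats - 1),  # extra passes beyond the first
        "-i", src,
        "-c", "copy",  # no re-encoding, just concatenated playthroughs
        dst,
    ]

print(" ".join(loop_preview_cmd("work.mp4", "loop_preview.mp4")))
```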

Deinterlacing

Standard definition video is recorded as interlaced video, which shows comb-like visual artifacts when displayed on a progressive display, as most modern display devices are. For works of this nature that will be displayed on a progressive display, the authors recommend deinterlacing the interlaced encoding prior to display. This can be done in different ways with various tools (non-linear editors like Adobe Premiere or video tools like FFmpeg) and various filters. For guidance, see resources like ffmprovisr (ffmprovisr n.d.).
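One common route is FFmpeg's `yadif` deinterlacing filter. The sketch below builds such a command; the function name and the H.264 output settings are illustrative assumptions, and the appropriate deinterlacer and settings should be judged per work.

```python
# Sketch: deinterlace with FFmpeg's yadif filter ahead of progressive
# display. Output encoding settings are illustrative, not prescriptive.
def deinterlace_cmd(src, dst):
    return [
        "ffmpeg", "-i", src,
        "-vf", "yadif",  # one progressive frame per input frame by default
        "-c:v", "libx264", "-crf", "18",
        dst,
    ]

print(" ".join(deinterlace_cmd("interlaced.mov", "progressive.mp4")))
```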

Cropping

As with deinterlacing, cropping is commonly needed for standard definition works that are displayed on modern display devices. As covered earlier, standard definition material was inherently overscanned when displayed on CRTs, which hid some picture elements that were never intended for display. In the digital environment, these aspects of the signal are visible and can be distracting in the display of these works. Again, this is a case-by-case decision during preparation, but some cropping may be necessary for standard definition works displayed in a progressive, digital environment and should be evaluated as such.
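As an illustration, FFmpeg's `crop` filter can trim a uniform margin from the frame edges. The helper and its 8-pixel default are hypothetical; how much to crop, if anything, should be judged by eye for each work.

```python
# Sketch: hide overscan debris by trimming a uniform margin from each edge
# with FFmpeg's crop filter (crop=w:h:x:y, where in_w/in_h are the input
# dimensions). The margin value is a hypothetical starting point.
def crop_overscan_cmd(src, dst, margin=8):
    crop = f"crop=in_w-{2 * margin}:in_h-{2 * margin}:{margin}:{margin}"
    return ["ffmpeg", "-i", src, "-vf", crop, "-c:v", "libx264", dst]

print(" ".join(crop_overscan_cmd("sd_master.mov", "cropped.mp4")))
```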

Scaling

Another aspect of SD video played back on HD equipment is the need for scaling. HD resolution is roughly four times that of SD video, and the resolution gap is only increasing. If you play back SD material in this environment, it will be upscaled either by the playback device or by the display equipment, depending on the equipment's individual settings and the wiring. To have the most control over this scaling, and to deinterlace and crop if necessary, the authors recommend scaling the material in advance of display and documenting how the scaling was accomplished. When left to the device, there is no way to document how the hardware and software are scaling the material. Much like deinterlacing, this can be done in non-linear editors like Adobe Premiere and open-source tools like FFmpeg (Software) (2021a).
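A sketch of such a controlled, documentable upscale using FFmpeg's `scale` filter follows; the 1080-line target and the Lanczos scaling algorithm are illustrative choices, not a universal recommendation.

```python
# Sketch: upscale SD material to a 1080-line frame ourselves, so the
# method can be recorded, instead of leaving scaling to playback or
# display hardware. Target size and scaler are illustrative choices.
def upscale_cmd(src, dst):
    return [
        "ffmpeg", "-i", src,
        # -2 keeps the aspect ratio and rounds the width to an even number
        "-vf", "scale=-2:1080:flags=lanczos",
        "-c:v", "libx264", "-crf", "18",
        dst,
    ]

print(" ".join(upscale_cmd("sd_master.mov", "hd_exhibition.mp4")))
```

Recording the exact filter string alongside the exhibition file satisfies the documentation aim described above.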

Pillarboxing/Letterboxing

Pillarboxing or letterboxing refers to the display of material with an aspect ratio that does not match the intended display device. To give an example, the display of 4:3 material on a modern 16:9 display requires pillarboxing, which refers to the black border on either side of the image area. Much like scaling, playback and display devices can do this automatically, but their methods of doing so are not apparent and can be difficult to document or can result in a loss of image fidelity or incorrect geometry. The authors recommend exporting a video file in advance that matches the parameters of the playback and display device in order to avoid accidental errors.
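Following the recommendation to bake the borders into the exported file, the sketch below pads 4:3 material into a 1920 × 1080 frame with FFmpeg's `pad` filter; the function name and the fixed target size are illustrative assumptions.

```python
# Sketch: pillarbox 4:3 material explicitly by scaling to 1080 lines and
# padding to a 1920x1080 frame (pad=width:height:x:y:color); centering
# the image leaves equal black bars on either side.
def pillarbox_cmd(src, dst):
    vf = "scale=-2:1080,pad=1920:1080:(ow-iw)/2:(oh-ih)/2:black"
    return ["ffmpeg", "-i", src, "-vf", vf, "-c:v", "libx264", dst]

print(" ".join(pillarbox_cmd("four_by_three.mov", "pillarboxed.mp4")))
```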

18.10.3 Exhibition Documentation

During the exhibition period, it is useful to document how the exhibition has evolved and to note how decisions were made. This allows the full story of the exhibition preparation to be told and gives future curators, conservators, registrars, and other stakeholders in the care of video artworks a window into how and why a certain work was exhibited. It also helps form the identity of the work, as these works will evolve as technology and culture change over time, and their moments of exhibition are when the work is materially realized and shaped. There are various templates and methodologies for documentation (see Chapter 11). The template and level of detail depend on the capacity and information ecosystem of each organization. However, the key aspects to document in an exhibition of video artworks are the dimensions of the space and the geometric relationship of the equipment (e.g., throw distance of a projector, speaker placement), the equipment used, and the rationale as to why and how the exhibition came together through the labor of various individuals (curators, artists, audiovisual technicians, exhibition designers, registrars, conservators, collection care personnel, etc.). Photographs and video documentation are also invaluable to support the textual descriptions.
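Documenting projection geometry can include the simple arithmetic that relates it to image size: a lens's throw ratio is throw distance divided by image width, so the achievable image width is distance divided by ratio. A small worked example with hypothetical values:

```python
# Worked example for projection-geometry documentation: image width
# achievable at a given throw distance is distance / throw ratio.
# The distances and ratio below are hypothetical.
def image_width(throw_distance_m, throw_ratio):
    return throw_distance_m / throw_ratio

# A 1.5:1 lens placed 4.5 m from the screen yields a 3 m wide image.
print(image_width(4.5, 1.5))  # 3.0
```

Recording both the measured distance and the lens ratio lets a future installer verify that a re-staged projection reproduces the documented image size.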

18.11 Conclusion

The medium of video in time-based media works of art has a rich 60-year history, rife with technological, formal, and creative evolutions. Artists have engaged with the medium in different ways throughout the years, initially interrogating the hardware itself, then using video in a myriad of contexts and applications. As the technology has evolved from a purely analog, tape-based medium to a digital, file-based one that is increasingly creeping into our daily lives, video will continue to be explored artistically well into the future. The challenge for custodians of these artworks is the pace at which technology changes and becomes obsolete, requiring increasingly esoteric knowledge to maintain these works, and the sheer quantity of material now that the format is largely unbound from material storage concerns. The most critical aspect of approaching any collection of video material is first to store it safely, whether physically or digitally, and second to gain a full understanding of the collection. Once this has occurred, you can go about documenting the technological aspects of each work, reviewing different iterations, and so on. Our aim has been to provide practical advice about the care of video artworks, supported by historical information that gives context for how video has evolved. The care of any video artwork is decided on a case-by-case basis, as each work will most likely have needs that differ from another work that seems identical to it. With this information, we hope to have shown that the preservation and maintenance of a video art collection requires a collaborative effort and that specialized knowledge and ongoing education are necessary to safely preserve these works of art into the future.

Bibliography

AMIA (The Association of Moving Image Archivists). "Videotape Preservation Fact Sheets." 2002. https://amianet.org/wp-content/uploads/Resources-Video-Preservation-Fact-Sheets-2002.pdf.
———. "Vrecord (Software). MacOS, Linux, Shell." 2014. Reprint, AMIA Open Source, 2021. https://github.com/amiaopensource/vrecord.
Apple Inc. "Apple ProRes, White Paper. January 2020." 2020. www.apple.com/final-cut-pro/docs/Apple_ProRes_White_Paper.pdf.
"AV Artifact Atlas | AVAA." n.d. Accessed September 20, 2021. www.avartifactatlas.com/.
Balsom, Erika. "The Limited Edition: Old Problems and New Possibilities." In I Have a Friend Who Knows Someone Who Bought a Video, Once: On Collecting Video Art, 90–103. Milano: Mousse Publishing, 2016.
Bay Area Video Coalition (BAVC) Media. "BAVC Media." BAVC Media, 2021. https://live-bavc-wp.pantheonsite.io/.
Bertram, H., and E. Cuddihy. "Kinetics of the Humid Aging of Magnetic Recording Tape." IEEE Transactions on Magnetics 18, no. 5 (September 1982): 993–99. https://doi.org/10.1109/TMAG.1982.1061957.
Blase, Christoph, Peter Weibel, Zentrum für Kunst und Medientechnologie Karlsruhe, Ludwig Forum für Internationale Kunst, Kunsthaus Dresden, and Ausstellung Record Again! 40jahrevideokunst.de—Teil 2, eds. Record again! . . . anlässlich der Ausstellung "RECORD> Again! 40jahrevideokunst.de—Teil 2"; ZKM, Zentrum für Kunst und Medientechnologie Karlsruhe, 17. Juli bis 6. September 2009, Ludwig-Forum für Internationale Kunst, Aachen, 18. September bis 15. November 2009, Kunsthaus Dresden, 28. November 2009 bis 14. Februar 2010 . . . 40jahrevideokunst.de, = 40yearsvideoart.de, Teil 2. Ostfildern: Hatje Cantz, 2010.
Boomgaard, Jeroen, and Bart Rutten, eds. The Magnetic Era: Video Art in the Netherlands 1970–1985. Rotterdam: NAi Publishers, 2003.
Boyle, Deirdre. "A Brief History of American Documentary Video." In Illuminating Video: An Essential Guide to Video Art, 51–70. New York: Aperture, 1990.
———. Video Preservation: Securing the Future of the Past. New York: Media Alliance, 1993.
Bunz, Sophie, and Agathe Jarczyk. "TBC Under Control: Suggestion for a New Documentation Method for the Digitization of Analog Video." Presented at the AIC's 47th Annual Meeting Specialty Session, Electronic Media, Sky Convention Center, Mohegan Sun, Connecticut, May 15, 2019. https://aics47thannualmeeting2019.sched.com/event/Iuk2/electronic-media-tbc-under-control-suggestion-for-a-new-documentation-method-for-the-digitization-of-analog-video.
Buschmann, Renate, and Jessica Nitsche. Video Visionen: Die Medienkunstagentur 235Media als Alternative im Kunstmarkt. Bielefeld: Transcript Verlag, 2020.
Castellano, Joseph A. Handbook of Display Technology. San Diego: Academic Press, 1992.
Conrad, Tony. "Open Reel Videotape Restoration." The Independent: Film & Video Monthly 10, no. 8 (October 1987): 12.
Cuddihy, E. "Aging of Magnetic Recording Tape." IEEE Transactions on Magnetics 16, no. 4 (July 1980): 558–68. https://doi.org/10.1109/TMAG.1980.1060652.
Daniel, Eric D., C. Denis Mee, and Mark H. Clark, eds. Magnetic Recording: The First 100 Years. New York: IEEE Press, 1999.
De Stefano, Paula, Kimberly Tarr, Melitte Buchman, Peter Oleksik, Alice Moscoso, and Ben Moskowitz. Digitizing Video for Long-Term Preservation: An RFP Guide and Template. New York: Barbara Goldsmith Preservation & Conservation Department, New York University Libraries, 2013. https://guides.nyu.edu/ld.php?content_id=24817650.
Electronic Arts Intermix (EAI). "Electronic Arts Intermix." 1997–2001. www.eai.org/.
———. "Electronic Arts Intermix | Editing Post-Production Facility." 1999. https://web.archive.org/web/20010410234927/www.eai.org/catalogue/editingfacility.html.
Electronic Arts Intermix (EAI), and Lori Zippay, eds. Artists' Video: An International Guide. New York: Cross River Press, 1991.
Elwes, Catherine. Video Art: A Guided Tour. London and New York: I.B. Tauris, in association with University of the Arts; in the United States of America and in Canada distributed by Palgrave Macmillan, 2005.
Engel, Friedrich K. "The Introduction of the Magnetophon." In Magnetic Recording: The First 100 Years, 47–71. New York: IEEE Press, 1999. www.wiley.com/en-us/Magnetic+Recording%3A+The+First+100+Years-p-9780780347090.
FFmpeg (Software). "FFmpeg (Software) (version 4.4 "Rao"). Fabrice Bellard." 2021a. https://ffmpeg.org/.
———. "ffprobe (Software)." 2021b. https://ffmpeg.org/ffprobe-all.html.
"Ffmprovisr (Software)." AMIA Open Source, n.d. Accessed October 3, 2021. https://amiaopensource.github.io/ffmprovisr/.
Fifer, Sally Jo, Tamara Gould, Luke Hones, Debbie Hess Norris, Paige Ramey, and Karen Weiner, eds. Playback: A Preservation Primer for Video. San Francisco: Bay Area Video Coalition, 1998.
Finzel, Peter. Projektor Spezial. Videoprojektoren für das Kino zu Hause, 10 ed. Potsdam: Polzer, 2003.
Fleischhauer, Carl, Kevin Bradley, and International Association of Sound and Audiovisual Archives, eds. IASA-TC 06 Guidelines for the Preservation of Video Recordings, revised ed., 2019. www.iasa-web.org/tc06/guidelines-preservation-video-recordings.
Frohne, Ursula, Mona Schieren, Jean-François Guiton, Hochschule für Künste, Bremen, and International University Bremen, eds. Present Continuous Past(s): Media Art-Strategies of Presentation, Mediation, and Dissemination. Schriftenreihe 2 der Hochschule für Künste Bremen. Wien and New York: Springer, 2005.
Gfeller, Johannes, Agathe Jarczyk, and Joanna Phillips. Kompendium der Bildstörungen beim analogen Video = Compendium of Image Errors in Analogue Video. Kunstmaterial 2. Zürich: SIK ISEA, Schweizerisches Institut für Kunstwissenschaft, 2012.
Guggenheim Museum. "Time-Based Media." The Guggenheim Museums and Foundation, 2012. www.guggenheim.org/conservation/time-based-media.
Guidelines for the Care of Media Artworks. "Matters in Media Art." 2015. http://mattersinmediaart.org/.
Harvey, Phil. "ExifTool (Software) (version 12.32). MacOS, Windows, Unix Systems." 2021. https://exiftool.org/.
Herzogenrath, Wulf, ed. 40 Jahre Videokunst.de—Teil 1. Digitales Erbe: Videokunst in Deutschland von 1963 bis heute; (diese Publikation erscheint anlässlich der Ausstellung vom 25. März 2006 bis 21. Mai 2006). Stuttgart: Hatje, 2006.
High, Kathy, Sherry Miller Hocking, and Mona Jimenez, eds. The Emergence of Video Processing Tools, vol. 2. Bristol: Intellect, 2014.
Hodges, Peter. Introduction to Video and Audio Measurement. London: Focal, 2016.
Huffman, Kathy Rae, ed. Video a Retrospective: Long Beach Museum of Art, 1974–1984. Long Beach: Long Beach Museum of Art, 1984.
Huisman, Sanneke, and Marga van Mechelen. A Critical History of Media Art in the Netherlands: Platforms, Policies, Technologies. Prinsenbeek: JAP SAM Books, 2019.
Iraci, Joe. Disaster Recovery of Modern Information Carriers: Compact Discs, Magnetic Tapes, and Magnetic Disks. Technical Bulletin/Canadian Conservation Institute 25. Ottawa: Canadian Conservation Institute, 2002.
———. "The Cold Storage of CD, DVD, and VHS Tape Media." Restaurator 32, no. 2 (January 2011). https://doi.org/10.1515/rest.2011.006.
ITU (International Telecommunication Union), ed. "Rec. ITU-R BT.601: Studio Encoding Parameters of Digital Television for Standard 4:3 and Wide-Screen 16:9 Aspect Ratios." ITU, March 2011. www.itu.int/rec/R-REC-BT.601/.
———. "Rec. ITU-R BT.709-6: Parameter Values for the HDTV Standards for Production and International Programme Exchange." ITU, June 2015. www.itu.int/rec/R-REC-BT.709/en.
Jimenez, Mona, and Liss Platt. "Videotape Identification and Assessment Guide." Texas Commission on the Arts, 2004. www.arts.texas.gov/wp-content/uploads/2012/04/video.pdf.
Lacinak, Chris. "Project Outsourcing: Navigating Through the Client/Vendor Relationship to Achieve Your Project Goals." 2006. www.weareavp.com/wp-content/uploads/2017/07/AVPS_Series_Project_Outsourcing.pdf.
———. "The Cost of Inaction: A New Model and Application for Quantifying the Financial and Intellectual Implications of Decisions Regarding Digitization of Physical Audiovisual Media Holdings." International Association of Sound and Audiovisual Archives Journal 43 (July 2014): 12. www.weareavp.com/wp-content/uploads/2014/07/COICalculator.pdf.
Laurenson, Pip. "Developing Strategies for the Conservation of Installations Incorporating Time-Based Media with Reference to Gary Hill's Between Cinema and a Hard Place." Journal of the American Institute for Conservation 40, no. 3 (January 1, 2001): 259–66. https://doi.org/10.1179/019713601806113003.
Lehane, Richard. "Siegfried (Software) (version 1.9.1). Windows, MacOS, Ubuntu, Debian, Freebsd, Arch Linux." 2020. www.itforarchivists.com/siegfried/.
Library of Congress. "Sustainability of Digital Formats: Planning for Library of Congress Collections." June 17, 2021. www.loc.gov/preservation/digital/formats/index.html.
———. "Care, Handling, and Storage of Audio Visual Materials—Collections Care—(Preservation, Library of Congress)." Web page. Accessed September 19, 2021. www.loc.gov/preservation/care/record.html.
LIMA. "LIMA Preserves, Distributes and Researches Media Art." n.d. Accessed October 15, 2021. https://li-ma.nl/lima/.
London, Barbara. "Video Preservation." In Television and Video Preservation 1997: A Report on the Current State of American Television and Video Preservation: Report of the Library of Congress, vol. 5, 220–22. Washington, DC: Library of Congress, 1997. www.loc.gov/static/programs/national-film-preservation-board/documents/tvmoma.pdf.
Lorrain, E. "Obsolete Equipment: A Research Project on Preserving Equipment in Multimedia Art Installations." In Preservation of Digital Art: Theory and Practice: The Project Digital Art Conservation, edited by Bernhard Serexhe and Eva-Maria Höllerer, 232–42. Karlsruhe: Zentrum für Kunst und Medientechnologie, 2013.
Marshall, Stuart. "Video: Technology and Practice." Screen 20, no. 1 (March 1, 1979): 109–19. https://doi.org/10.1093/screen/20.1.109.
Mazzanti, Anna, ed. art/tapes/22: Le origini della videoarte. Cinisello Balsamo: Silvana, 2017.
MediaArea, and AVPreserve. "DV Analyzer (Software) (version 1.4.2)." Windows, MacOS, Debian, Ubuntu, Linux Mint, RedHat Enterprise Linux, CentOS, Fedora, openSUSE, ArchLinux, Flatpak. MediaArea.net & AudioVisual Preservation Solutions Inc., 2017. https://mediaarea.net/DVAnalyzer.
MediaArea, Jérôme Martinez, Dave Rice, Guillaume Roques, Florent Tribouilloy, Ashley Blewer, and Tessa Fallon. "MediaConch (Software) (version 18.03.2)." Windows, MacOS, Debian, Ubuntu, Linux Mint, RedHat Enterprise Linux, CentOS, Fedora, openSUSE, ArchLinux, Flatpak, Snap, AppImage. MediaArea, 2018. https://mediaarea.net/MediaConch.
MediaArea, Dave Rice, Alexander Ivash, Ashley Blewer, and Jérôme Martinez. "QCTools (Software) (version 1.2)." Windows, MacOS, Debian, Ubuntu, Linux Mint, RedHat Enterprise Linux, CentOS, Fedora, openSUSE, ArchLinux, Flatpak. MediaArea, 2020. https://mediaarea.net/QCTools.
Meigh-Andrews, Chris. A History of Video Art, 2nd ed. New York: Bloomsbury, 2014.
Metropolitan Museum of Art, Time-Based Media Working Group. "Sample Documentation and Templates." The Metropolitan Museum of Art, 2021. www.metmuseum.org/about-the-met/conservation-and-scientific-research/time-based-media-working-group/documentation.
Mißelbeck, Reinhold, Martin Turck, and Museum Ludwig, eds. Video im Museum: Restaurierung und Erhaltung, neue Methoden der Präsentation, der Originalbegriff: internationales Symposium, Museum Ludwig Köln, 9. September 2000 = Video Arts in Museums: Restoration and Preservation, New Methods of Presentation, the Idea of the Original: International Symposium, Museum Ludwig Cologne, September 9, 2000. Köln: Museum Ludwig, 2001.
Nagels, Katherine Frances. "PAR, SAR, and DAR: Making Sense of Standard Definition (SD) Video Pixels—BAVC Media." BAVC Media, n.d. Accessed October 1, 2021. https://bavc.org/par-sar-and-dar-making-sense-standard-definition-sd-video-pixels/.
Nitzschke, Werner, and Charly Weller, eds. Video-Handbuch: Gebrauch, Wartung, Reparatur. Berlin: Büro für off-Kudamm Kinos, 1975.
Noordegraaf, Julia, Cosetta G. Saba, Barbara Le Maître, and Vinzenz Hediger, eds. Preserving and Exhibiting Media Art: Challenges and Perspectives. Framing Film. Amsterdam: Amsterdam University Press, 2013.
Otterbeck, Bärbel, and Kunstmuseum Wolfsburg, eds. Wie haltbar ist Videokunst? Beiträge zur Konservierung und Restaurierung audiovisueller Kunstwerke; Symposium im Kunstmuseum Wolfsburg, 25. November 1995 = How Durable Is Video Art? Wolfsburg: Kunstmuseum, 1997.
Owens, Trevor, and Jefferson Bailey. "Protect Your Data: File Fixity and Data Integrity | The Signal." The Signal, April 7, 2014. https://blogs.loc.gov/thesignal/2014/04/protect-your-data-file-fixity-and-data-integrity/.
Pfluger, David, Reto Kromer, Yves Niederhäuser, and Agathe Jarczyk. "Digital Archiving of Film and Video: Principles and Guidance, v1.2." 2019. https://memoriav.ch/en/dafv/.
Phillips, Glenn, ed. California Video: Artists and Histories. Los Angeles: The Getty Research Institute, 2008.
Poynton, Charles A. Digital Video and HD: Algorithms and Interfaces, 2nd ed. Waltham, MA: Morgan Kaufmann, 2012.
Pozdeev, Max. "Invisor (Software) (version v3.16)." MacOS 10.7.3+, 2021. www.invisorapp.com/.
Pozdeev, Max, and MediaArea. "MediaInfo (Software) (version 21.09)." Windows, MacOS, Android, iOS, Debian, Ubuntu, Linux Mint, RedHat Enterprise Linux, CentOS, Fedora, openSUSE, ArchLinux, Gentoo, Flatpak, Snap, AWS Lambda. MediaArea, 2021. https://mediaarea.net/de/MediaInfo.
Remley, Frederick M. "The Challenge of Recording Video." In Magnetic Recording: The First 100 Years. New York: IEEE Press, 1999.
Rice, Dave. "Sustaining Consistent Video Presentation." Tate, n.d. Accessed October 3, 2021. www.tate.org.uk/about-us/projects/pericles/sustaining-consistent-video-presentation.
Sætervadet, Torkell. FIAF Digital Projection Guide. Brussels: FIAF, 2012.
Schmidt, Ulrich. Professionelle Videotechnik: Grundlagen, Filmtechnik, Fernsehtechnik, Geräte- und Studiotechnik in SD, HD, DI, 3D, 6. Aufl. Berlin and Heidelberg: Springer Vieweg, 2013.
Schubiger, Irene, Kunstmuseum Luzern, and AktiveArchive (Project), eds. Reconstructing Swiss Video Art from the 1970s and 1980s. Zürich: JRP Ringier, 2009a.
———, eds. Schweizer Videokunst der 1970er und 1980er Jahre: Eine Rekonstruktion.
Zürich: JRP/Ringier, 2009b. Schubin, Mark. “Searching for the Perfect Aspect Ratio.” SMPTE Journal 105, no. 8 (August 1996): 460– 78. https://doi.org/10.5594/J09548. Schum, Gerry, Kölnischer Kunstverein, Museum Boymans-Van Beuningen, Amsterdam (Netherlands) Museum voor Hedendaagse Kunst (Gent), Stedelijk Museum, and Vancouver Art Gallery. Gerry Schum: [catalogue of an exhibition held at] Stedelijk Museum, Amsterdam, 21.12.1979–10.2.1980; Museum Boymans. Van Beuningen, Rotterdam, 29.2.1980–13.4.1980; Kölnische Kunstverein, Köln 30.4.1980– 1.6.1980; Museum voor Hedendaagse Kunst, Gent 15.6.1980–30.8.1980; Vancouver Art Gallery, Vancouver. Amsterdam: Stedelijk Museum, 1979.

315

Agathe Jarczyk and Peter Oleksik Simpson, Robert S. Videowalls: The Book of the Big Electronic Image, 2nd ed. Oxford and Boston: Focal Press, 1997. Turkus, Benjamin. “Time Base Correction: An Archival Approach.” Master of Arts, New York University, Department of Cinema Studies, Moving Image Archiving and Preservation Program, 2016. https:// miap.hosting.nyu.edu/program/student_work/2016spring/16s_3490_Turkus_Thesis_y.pdf. Van Bogart, John W. C. Magnetic Tape Storage and Handling: A Guide for Libraries and Archives. Washington, DC and St. Paul, MN: Commission on Preservation and Access, National Media Laboratory, 1995. Videofreex. The Spaghetti City Video Manual: A Guide to Use, Repair and Maintenance. New York: Praeger, 1975. Vitale, Tim, and Paul Messier. “Video Preservation Website.” 2013. https://cool.culturalheritage.org/ videopreservation/. Vitale, Timothy. “Techarchaeology: Works by James Coleman And Vito Acconci.” Journal of the American Institute for Conservation 40, no. 3 (January 1, 2001). https://cool.culturalheritage.org/jaic/articles/ jaic40-03-005_indx.html. Weise, Marcus, and Diana Weynand. How Video Works, 2nd ed. New York: Focal Press, 2007. Wheeler, Jim. “Videotape Preservation Handbook.” 2002. https://amianet.org/wp-content/uploads/ Resources-Guide-Video-Handbook-Wheeler-2002.pdf. Wijers, Gaby, Evert Rodrigo, Ramon Coelho, and Stichting Behoud Moderne Kunst, eds. De Houdbaarheid van Videokunst: Conservering van de Nederlandse Videokunst Collectie = The Sustainability of Video Art: Preservation of Dutch Video Art Collections. Amsterdam: Amsterdam University Press, 2003. “Wikipedia—Betacam.” Wikipedia, Accessed October 3, 2021. https://en.wikipedia.org/wiki/Betacam. “Wikipedia—Rec. 601.” Wikipedia, Accessed October 3, 2021. https://en.wikipedia.org/wiki/Rec._601. “Wikipedia—Rec. 709.” Wikipedia, Accessed September  27, 2021. https://en.wikipedia.org/wiki/ Rec._709. Zielinski, Siegfried. Zur Geschichte Des Videorecorders. Berlin: V. Spiess, 1986.

316

19 Sound in Time-Based Media Art

Christopher McDonald

Editors' Notes: Whether a prominent feature of an audio installation or an integral part of a video projection, the sound in time-based media artworks is often neglected, underestimated, and not treated with the same attention and rigor as the works' visual elements. In this chapter, sound engineer and artist Christopher McDonald (Connecticut, USA) offers his unique perspective as an experienced practitioner whose work focuses on producing and exhibiting sound in time-based media art. For a decade, McDonald has been working with artist Ragnar Kjartansson to produce and realize his media installations, which are famous for their immersive and perfectly synchronized soundscapes. Here, McDonald introduces the physics and psychoacoustics of sound and shows how these translate into the gallery experience of artworks. He provides valuable context and recommendations to support those who wish to record, reproduce, and exhibit sound-based art. Throughout his chapter, he invites the perspectives of artists Ragnar Kjartansson and Pipilotti Rist, as well as art historian Linnea Semmerling (Director, Institute of Intermedia Art, Düsseldorf), who contribute to McDonald's argument that the sound in time-based media cannot get enough attention when creating and maintaining the intended artwork experience.

DOI: 10.4324/9781003034865-23

19.1 Introduction

In this chapter, I will explore sound and its use in artworks. The material in the following sections focuses on the fundamental science of sound and good practices for working with it both digitally and in a real space. Therefore, I hope it will be broadly useful wherever the careful preservation or presentation of sound is the goal. My examples will focus on sound in the making, conservation, and exhibition of video art, as that is my area of expertise. Sound has the distinction of being everywhere and nowhere. There is no place in the universe that a human can go to find silence. John Cage learned this when he was given access to an anechoic research chamber at Harvard in the 1950s (Indeterminacy by John Cage, Smithsonian Folkways, 1959). He was very much looking forward to finally hearing what silence sounded like, and to his disappointment, it sounded like two things: a higher-pitched thing and a lower-pitched thing. The higher-pitched thing, the researchers told him, was his own nervous system in operation, and the lower sound, which we can all confirm on a sleepless and still night, was the pumping of blood circulating to and from the brain through the jugular veins as it goes past the ears. Barring a physiological or neurological inability to process auditory information, sound is with us always and everywhere. At the same time, the vast majority of people never think about sound until the extreme cases where it has become an environmental problem for them. A neighbor living on the other side of our bedroom wall who slams cabinet doors at the most annoying times. A restaurant where you can't hear the person across from you because the proprietors read a study that people think a restaurant is hipper the louder it is. The subway. Even in cases where we are actively listening—for instance, to music at a concert hall or in headphones on an airplane—most of us will always just experience sound as feeling good or bad. In that sense, it is nowhere because it is transparently part of the environment. It tells us which direction a car is coming from. Whether someone has finished doing dishes in the kitchen. Whether a baby goat is in trouble and where it has somehow managed to sneak away to. We don't think about it as sound per se because in our brains it becomes just another spatial description of the environment. Most of the time, oddly enough, it is simply absorbed into our visual map of the world around us.

“It is really difficult to put sound into words because it’s a really important part of how you feel the world. There is the old story that it is scarier to be deaf than blind. Sound is the most important sense. It’s so immediate and so much of the romantic beauty of the world is sound. And sound is such a kick.” —Ragnar Kjartansson (Kjartansson 2021)

This also means that most people have limited sonic imaginations. Even people who may otherwise be artistic visionaries (note, there are no "hearingaries") and deep thinkers about the potential power of sound often make intuitive but incorrect initial assumptions about the actual effect sound will have in a given space or context. People who work with sound for a living, especially the best of them, will spend a fair amount of time haphazardly messing around with odd combinations of microphones and instruments and effects before eventually being hit with the sound they were looking for the whole time. When asked how he "came up" with such incredibly rich and unique sounds in the studio, Brian Eno famously (though perhaps apocryphally) summed up his production process: "Make the machines break." The point to take away from this is that when we try to do something with sound, it will, more often than not, surprise us. Unfortunately for the people likely to be reading this chapter, these surprises are not likely to be as happy and fruitful as they are for Eno. I have been working with Ragnar Kjartansson for more than a decade now in the recording, post-production, and exhibition of his video works, and he remarked in one of our conversations: "You always fool yourself into believing sound can be more easily dealt with than it can. Sound is like water. It goes everywhere" (Kjartansson 2021). It is always wishful thinking that sound and architecture will "play well" together. This is especially the case in museum environments, with their long, rectangular marble and concrete corridors, and perhaps even more so in their ubiquitous small cube-shaped "media" rooms one sheetrock wall away from the museum's Education Center. By the end of this chapter, my goal is for you to be better equipped to think through and imagine what a specific soundtrack will do in a specific space, identify potential problems, and propose solutions.
In short, I want you to be less surprised. You are often limited in the architectural interventions open to you, but you will understand how to make better modifications to rooms and how to better match works and spaces. You will not install a gutsy, tummy-rumbling artwork only to find it thin and a little pathetic (you didn't realize that even at their loudest and biggest, a two- or three-way speaker is not going to do the things the artist needed a subwoofer to do). You will also not put that subwoofer on the other side of the wall from an intimate video work with quiet talking intended to play back off the monitor's built-in speakers and wonder why you still can't hear it clearly even when you turn it up so much that it scares people out of the room. Some works may not even be suitable anywhere in your space, and thinking realistically about such limitations may help inform acquisition decisions. Therefore, what I discuss in this chapter will be helpful for anyone collecting sound-based works, for institutions exhibiting and preparing plans for the longevity of collected sound-based works, and for anyone working in an exhibition context. You will learn the language that will enable you to engage with artists' studios in order to understand what it takes to faithfully reproduce a work. If you are dealing with the long-term preservation of digital works with sound (or are in the process of converting analog works for digital storage), you will understand what makes for a safe, clear, and transparent transfer of sound from one format to another. And even if you are committed to never working with a single sound-based artwork in your life, chances are that (if you are still reading this) you know you will one day find yourself installing on the other side of the wall from someone who is. Understanding how sound works and what special problems it creates in such spaces will put you in a better position to plan future exhibitions and negotiate solutions that will best preserve the viewing experience of your works-on-paper show or the well-being of invigilators when there is a noisy next-door neighbor. I will start with a brief description of sound as a physical phenomenon. I will then discuss how those physical phenomena are perceived by humans.
I will explain how sound waves are recorded—focusing on digital recordings but with a brief review of historical formats. I will look at how recordings are converted back into sound waves, the role specific equipment plays in this process, and some basic issues in the acoustics of how those sounds interact with the spaces they inhabit (and escape from them!). And finally, I will conclude with a discussion of the exhibition of sound works and special concerns in working with artists' studios to create useful specifications for these works for posterity. I will also discuss different analog and digital sound formats, but I will not go into great detail regarding the conservation of analog media, such as vinyl records and magnetic tape (no wax cylinders either!). I will explore the structure of digital audio, sample rates, bit depths, and containers, and issues specific to audio restoration and the production of exhibition files. But I will not go into much detail about what digital files are and good practices for their archival storage (see Chapters 8, 13, and 14).

19.2 Physics—The Nature of Sound

Physically, sound is two axes of information expressed in pressure oscillations: amplitude in time. We express the rate at which amplitude peaks occur, the frequency of the sound wave, in hertz (Hz), or cycles per second. The more amplitude peaks in a given span of time, the higher the frequency of the sound wave and the perceived pitch. The fewer peaks in a given span of time, the lower the frequency and perceived pitch. As we will see in the following section, our experience of the loudness of the range of audible frequencies is not linear. Lower frequencies require more energy (amplitude) in order to appear to us as loud as higher frequencies. Sound waves travel through the air at about 767 miles per hour, or 1,125 feet per second, at sea level when it is about 68 degrees Fahrenheit. When the temperature is cooler, the waves travel more slowly. And they dampen (the amplitude gets less and less) more quickly when they travel through absorptive materials than through open air. In a vacuum, there is no material for the waves to propagate in, so there is no sound. This section will provide a basic outline of what is an extremely complex subject that has been better explored by other engineers, theoreticians, and scientists much more knowledgeable than I am. For anyone seeking more in-depth material on the physics (and philosophy) of sound, I recommend Juan G. Roederer's The Physics and Psychophysics of Music, Bob Katz's Mastering Audio: The Art and the Science, and the collection Audio Culture: Readings in Modern Music, assembled by Christoph Cox and Daniel Warner.
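As a quick arithmetic check (a throwaway sketch of my own, not from the text), the two speed-of-sound figures quoted above agree with each other:

```python
# Convert 767 miles per hour to feet per second.
speed_mph = 767
speed_ft_s = speed_mph * 5280 / 3600  # 5,280 ft per mile; 3,600 s per hour
print(round(speed_ft_s))  # 1125
```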

19.2.1 Sound as Space

We don't intuitively think of sound as space, but as it travels through the air at a (mostly) constant speed, sound waves have a wavelength that varies with their frequency. The wavelength can be calculated by dividing the speed of sound by the frequency. So a 20 Hz sound wave is 56′3″ long (1,125′ per second / 20 peaks per second = 56.25′). Higher frequencies, like those in the critical range for understanding speech, are much shorter. A 2.2 kHz (2,200 Hz) sound wave is only about half a foot long. Fig. 19.1 shows the frequency ranges of three different types of speakers and the frequency ranges of the playable notes on several instruments. The 20 Hz to 20 kHz range shown here is what a lucky under-40-year-old can hear. As you will learn in Section 19.2.2 and as is illustrated in fig. 19.3, there can be very complex combinations of higher harmonics on top of these fundamental pitches. This helps explain why, even though a violin can only realistically play up to the E three octaves above middle C, a violin still emits a lot of sonic information above the frequency of that note. If you were to gently roll off even some of the information above 2,700 Hz, a violin would sound dull, muffled, and unnatural.
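The wavelength arithmetic above can be sketched in a few lines of Python (the function name and the rounded speed constant are mine, for illustration only):

```python
# Wavelength = speed of sound / frequency, using the chapter's figure of
# 1,125 feet per second at sea level and 68 °F.
SPEED_OF_SOUND_FT_S = 1125.0

def wavelength_ft(frequency_hz: float) -> float:
    """Length in feet of one cycle of a sound wave at the given frequency."""
    return SPEED_OF_SOUND_FT_S / frequency_hz

print(wavelength_ft(20))    # 56.25 -- the 56'3" wave discussed above
print(wavelength_ft(2200))  # ~0.51 -- about half a foot
```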

Figure 19.1 A soundbar showing the frequency ranges of three different types of speakers and the frequency ranges of the playable notes on several instruments. Sketch: Gregory Lizotte


This physical fact is beautifully on display in Alvin Lucier's Music on a Long Thin Wire (1977), in which a 50′ length of piano wire is stretched in a space and activated. Many different tones emerge (see the discussion of harmonics in Section 19.2.2) and are visible in the vibrating string. Perhaps one of the most lovely and bizarre aspects of experiencing this piece is the ability to walk a 50′ wavelength and its harmonics. As you move through the space, you experience different pitches getting louder and quieter. You hear (or, rather, feel, as the pitch starts to drop below our auditory threshold) a sound that goes from loud to nothing as you move away from the amplitude peak. Keep walking and you hear/feel the sound get louder and louder again. I use this work as an illustration not just because of how delightful it is but also, as you are beginning to understand, because it is a harbinger of the profound difficulties such low frequencies can pose in exhibition spaces! Thinking of sound as space gives us a lot of insight into the issues we run into when exhibiting sound. Just as a string of a certain length with a certain tension and density will vibrate at a certain frequency when activated by plucking or sympathetic vibration, so does a space when activated by sound. The simplest visualization of this is when you talk into a vase or blow into a partially empty bottle. You are activating the natural resonance of the space. If you have a ceiling height of about 11′3″ and a lot of 100 Hz energy in your sonic material, you will notice the same phenomenon of sympathetic resonance with the 100 Hz sound wave. This helps us understand that exhibiting sound is not as simple as playing it back with the proper equipment: since every space has dimensions, it is also, in effect, an amplifier that will exaggerate certain sounds and even cancel out others.
Curiously enough, Alvin Lucier also has a recorded work, probably his most famous, that explores just this phenomenon: I Am Sitting in a Room (1970). The recording begins with Lucier speaking a phrase into a microphone that begins, “I am sitting in a room different from the one you are in now.” Over the course of the work, Lucier plays his recorded speech into the room and re-records the result. With each iteration, some speech detail is effaced by the frequencies that the room’s dimensions acoustically amplify. By the end of the work, the original recorded speech has degraded into a sonic map of the resonant frequencies of the room, pulsing to the rhythmic cadence of his speech. The words themselves are completely lost. This is a work that I love and a warning to exhibitors everywhere. We can do a lot to try to compensate for this acoustic reality, from equipment like equalizers that adjust the relative frequency levels we are playing back to the careful selection of acoustic panels of different thicknesses (the thicker ones being more effective at sucking the energy out of longer wavelengths). We can also specifically select rooms whose “modes” will least affect a given work’s content. I will address these options in more detail in the final section of this chapter.
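The ceiling example above follows from a standard room-mode formula. Here is a simplified sketch that considers only axial modes along a single dimension (the helper name is my own, and real rooms add tangential and oblique modes on top of these):

```python
# Axial room-mode frequencies for one dimension: f_n = n * c / (2 * L).
SPEED_OF_SOUND_FT_S = 1125.0

def axial_modes_hz(dimension_ft, count=4):
    """First few axial mode frequencies (in Hz) for a single room dimension."""
    return [n * SPEED_OF_SOUND_FT_S / (2 * dimension_ft)
            for n in range(1, count + 1)]

# The 11'3" (11.25 ft) ceiling from the text resonates at multiples of 50 Hz,
# which is why program material heavy in 100 Hz energy excites it:
print(axial_modes_hz(11.25))  # [50.0, 100.0, 150.0, 200.0]
```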

19.2.2 Pitch, Timbre, and Noise

For sound waves to make sense to us, they need to have periodicity, the regularity with which the amplitude peaks occur. When there is no regularity to the amplitude peaks, for instance, if they are pretty much random, we experience noise. White noise is purely random. There are also other types of noise that have randomness but are weighted. For instance, in pink noise, the lower frequencies have an equal audible intensity to the high frequencies. White noise is mathematically random across the spectrum (see fig. 19.2). The randomness in pink noise is weighted, with more power put into the lower frequencies and less into the higher ones so that the different frequencies in the audible spectrum are heard as similar in loudness. White noise sounds brighter than pink noise. And pink noise has more heft to it than white noise. Note also how the frequencies are distributed along the x-axis: they represent


Figure 19.2 White noise is mathematically random across the spectrum. The randomness in pink noise is weighted, with more power put into the lower frequencies and less into the higher ones so that the different frequencies in the audible spectrum are heard as similar in loudness. Sketch: Chris McDonald

Figure 19.3 The first six harmonics above the C two octaves below middle C (about 32.7 Hz). Sketch: Gregory Lizotte

a linear sequence by octave. The frequency of 100 Hz to 200 Hz, a span of 100 Hz, represents one octave (see fig. 19.3). But 10,000 Hz to 20,000 Hz, a span of 10,000 Hz, is also an octave. One octave is not described by a fixed frequency span; it is the span between two frequencies whose ratio is 1:2. That is, white noise is an equal distribution of frequencies by amplitude, while pink noise is a logarithmic distribution of frequencies by amplitude that corresponds to our ability to hear these frequencies as being at the same loudness. In short, white noise sounds higher or "brighter" to us and pink noise sounds lower and "darker." Let's concretize this abstract information into something we understand: music. When a good flutist plays a note, we get a very regular pitch. If the pitch coming out of the flute is 440 Hz (440 amplitude peaks or cycles per second), we hear the A above the middle C on a piano—a freshly tuned contemporary piano, at least. We also get some noise! The breath of the performer contributes a certain amount of sound waves with a lot of randomness, and this in turn activates the cylinder of the flute, whose length is effectively changed when different keys are pressed. The simplest synthesis of flute sounds mimics this: digitally produce white or pink noise with a sharp attack and decay (for the breath) and a sine wave (for the resonant frequency of the flute's cylinder activated by the energy of the breath). This is a relatively simple sound because what we hear in the note played on a flute is mostly the fundamental frequency, the 440 Hz pitch activated when the keys for the A above middle C are held down and the flute is blown into. A fairly regular sine wave is produced that peaks 440 times per second, with a minimal amount of higher pitches that are whole-number multiples of that frequency. We call these higher pitches harmonics or partials. Things get much more complicated in the sound waves other instruments produce, and this accounts for the different timbres, the characteristic and recognizable sound of each instrument that we perceive. To stick with wind instruments, oboes are double-reed instruments with conical bores producing rather loud higher partials relative to the fundamental pitch. Timbres with relatively loud higher partials sound like Fran Drescher from the '90s TV show The Nanny (and for our non-American readers, imagine a baby crying). The harmonic series can be described, for a fundamental frequency f, as f + f*2 + f*3 + f*4 + f*5 + f*6 + f*7 and so on. A flute has a lot of f and some f*2 and not a whole lot else. A clarinet has a lot of f and also a lot of f*3, f*5, and so on but not a lot of f*2, f*4, and so on, and the partials have less energy the higher they get.
Our Western diatonic and modal scales are based on this series (but our contemporary, equal-tempered tuning only approximates it). In fig. 19.3 we see the first six harmonics above the C two octaves below middle C (about 32.7 Hz). On the left we see the musical notation of the chord produced by these harmonics, and on the right is a "string" representation of their relation to each other. The fundamental f—also referred to as the first harmonic or partial—is the frequency produced when the whole length of the string vibrates. The second harmonic is both halves of the string sympathetically vibrating from activation by the fundamental and producing a pitch twice as high as the fundamental: about 65.4 Hz. It is as if we have now plucked two strings half the length of the first. In the next higher harmonic, it is as if we have plucked three strings, each a third the length of the first and producing a frequency three times as high: about 98.1 Hz. And so on. The harmonics above the fundamental are also referred to as overtones. One way to understand the timbres produced by different relative intensities of these higher harmonics, or partials, is to look at different vowel sounds. An "oh" sound has a loud fundamental, successively quieter partials, and very few of the higher ones. An "ee" sound has upper partials that are louder than those of the "oh" but not as loud as the upper partials in "ask" as an American nanny might pronounce it (the "a" in the "wah" of a baby's cry). Simple sound waves "sum" to create more complex ones (see fig. 19.4). And any sound wave, simple or complex, summed with its phase-reversed evil twin, creates silence.
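The series is easy to tabulate. A small sketch (the function name is mine) reproducing the figures just discussed:

```python
# The harmonic series over a fundamental f is f, 2f, 3f, 4f, ...
def harmonics(fundamental_hz, count=6):
    """The first `count` members of the harmonic series, fundamental included."""
    return [n * fundamental_hz for n in range(1, count + 1)]

# The C two octaves below middle C, about 32.7 Hz:
for n, freq in enumerate(harmonics(32.7), start=1):
    print(f"harmonic {n}: {freq:.1f} Hz")
# harmonic 1: 32.7 Hz, harmonic 2: 65.4 Hz, harmonic 3: 98.1 Hz, and so on
```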

19.2.3 Complex Sound and Sonic Cancellation

We now see that the more energy we have in the upper harmonics, the more complex a sound wave becomes. When you are dealing with a full sound mix with many different instruments and sounds, the sound wave can start to look chaotic. Every sound wave produced, for instance, by a symphonic orchestra sums in the space. That is, the sound waves generated by each performer and the reflections bouncing off all the surfaces of the hall hit your eardrum

Figure 19.4 Simple sound waves "sum" to create more complex ones. The simplest and most complex sound waves, when summed with their phase-reversed evil twin, create silence. Sketch: Chris McDonald

as a single waveform. Yet though the sound wave hitting our ears is now extremely complex, we can still hear a flute, an oboe, a timpani drum, some computer-generated swoops or some samples of train sounds or whatever else the composer has thrown at us, because our auditory faculties are able to follow all of the different periodic frequencies summed in this wave that is hitting our ears. But when multiple sound waves sum, the effect is not just purely additive. If you take the sound of a flute playing a 440 Hz tone and the same sound delayed so that the valleys of one align with the peaks of the other—a delay of something like 1.136 ms—these sounds "sum" to ... nothing, because they cancel each other out (see fig. 19.4). This seems a little bizarre, but we actually exploit this feature in noise-canceling headphones, which sample the pressure waves hitting them and then pump the same sound into your ears with the phase inverted (the peaks become valleys and the valleys become peaks) to cancel them out, so you can loosely understand your Harry Potter et les reliques de la mort audiobook on a plane even if your French is rusty. This is a form of destructive interference (Roederer 2008, 35). Normally, for example in an exhibition, we are trying to minimize such sonic cancellations. But it is sometimes nice on an airplane. Hopefully, this begins to give you a sense of how complicated things get in space when sounds are bouncing off of different walls and multiple instances of the same sound reach us out of phase. And that tells us a little more about how important architecture (and sound insulation) is for our experience of sound. I will deal with this more in the final section of this chapter, but next, I will look at how we perceive sound waves, an area of study known as psychoacoustics.
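The cancellation is easy to verify numerically. A minimal sketch (the 48 kHz sample rate and the names are my choices, not from the text):

```python
import math

# A 440 Hz sine summed with a copy delayed by half a period (~1.136 ms):
# the valleys of one align with the peaks of the other and the sum cancels.
FREQ_HZ = 440.0
HALF_PERIOD_S = 1 / (2 * FREQ_HZ)  # about 0.001136 s

def sine(t):
    return math.sin(2 * math.pi * FREQ_HZ * t)

# Peak absolute value of the summed signal over 10 ms, sampled at 48 kHz:
residual = max(abs(sine(n / 48000) + sine(n / 48000 - HALF_PERIOD_S))
               for n in range(480))
print(residual)  # effectively zero, apart from floating-point rounding
```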

19.3 Psychoacoustics—The Experience of Sound

Humans can hear about nine octaves of sound—20 Hz to 16 kHz (or a little higher if you are young or lucky). And those of us with two working ears are extremely sensitive to the timing differences with which these frequencies hit our two ears. A fraction of a millisecond sooner to the left ear than the right, and we immediately have a description of space. We are not listening to an abstract tiger's growl. We are listening to a tiger growling way, way too close to the left side of our body. We are extremely sensitive to differences in the timing of sounds hitting our two ears for obvious reasons. We can also sense whether sounds are in front of or behind us based on how much high-frequency content we receive. Our ears are little parabolic cups to better hear sound in front of us and also little mufflers that make sounds behind us appear duller, because higher frequency sounds travel less well through thick material (even if it is just a little bit of ear). Put this all together, and you understand that sound is always a spatial description. There are incredible new advances in 3D listening coming from the gaming world. You take pictures of your ears from a few different angles and upload them for analysis, and a model is generated (in effect, a personalized head-related transfer function, or HRTF). This model is used to adapt the sounds you will hear in the game to match exactly how you personally hear sounds in front of and behind you based on your ear shape. In your headset now, the sounds behind you in the game are not just muffled, a cheap trick for making sounds appear to be behind you (though one that can just as easily place them in front of you, under a quilt), but now realistically and very specifically behind you.
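A back-of-the-envelope sketch of that interaural timing cue (the 0.6 ft ear spacing is my illustrative assumption, and the straight-line path is a simplification of the real path around the head):

```python
# Interaural time difference (ITD) for a sound arriving from directly to
# one side: extra path length / speed of sound.
SPEED_OF_SOUND_FT_S = 1125.0
EAR_SPACING_FT = 0.6  # rough assumed distance between the ears

itd_ms = EAR_SPACING_FT / SPEED_OF_SOUND_FT_S * 1000
print(f"{itd_ms:.2f} ms")  # 0.53 ms -- a fraction of a millisecond
```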

“We are creating these worlds. . . . And they are so much created in sound composition. The high frequency and the low frequency. When we were doing Sumarnótt [a work from 2019 originally titled Death Is Elsewhere], you think about the rumbling sound of the river versus, like, the rattling of the grass. And it’s really kind of similar to the soundscape of, like, y’know how you deal with a symphony orchestra. It’s just, like, [low punch sound] to get a bass oomph and then the high frequencies to get into the world.” —Ragnar Kjartansson (Kjartansson 2021)

In general, younger people are able to hear higher frequencies than older people. That is, our sensitivity to higher frequency sounds dulls as we age. But for all of us, regardless of age, the experience of this audible frequency spectrum is not linear with respect to loudness (see fig. 19.2). Lower frequencies require much more energy to appear as loud as higher ones. If you have ever nearly thrown your back out moving a subwoofer, you may have already understood this physical fact, if only intuitively. Subwoofers, which are loudspeakers designed to reproduce sub-bass and low frequencies from around 10 Hz to 150 Hz, require much heftier magnets, diaphragms, and wattage in order to produce these low-frequency sounds at levels complementary to the rest of the audible spectrum being produced by much smaller, lighter (and thus often cheaper) speakers. For evolutionary reasons, we are extremely sensitive to sounds around 2 kHz, as these sounds play a big role in the intelligibility of speech. It is striking how much of the rest of the frequency spectrum can be “rolled off” without making speech incomprehensible. It may not sound great, but we can still understand it. You have noticed this in cell phone conversations (or a payphone call if you are old enough to have experienced any). These sibilant, hollow sounds have an unpleasant shrillness that is absent when you are listening to a full-spectrum vocal recording on a nice speaker or, obviously, when conversing in person. The higher-frequency sounds may be just as loud on the cell phone speaker, but with other, lower-frequency sounds supporting them, the experience is much more pleasing.

Christopher McDonald

19.3.1 The Perception of Sound Is Relative

That brings us to one very important fact about our perception of sound: it is highly relative. If higher frequency sounds are boosted, lower frequency ones will seem a little less sturdy. And if higher frequency sounds are cut, the lower frequency ones will seem fuller. The lower frequency sounds remain at the exact same sound pressure level, but the perception changes a lot.

“My advice for those who want to produce, collect and exhibit sounding artworks is not to rely on any presumably essential qualities of sound and not to assume that there is a one-fits-all solution to adequately installing them. I believe that the challenges that museums face with exhibiting sounding artworks are not so much about the fact that the artworks make sound as they are about the expectations that artists have about how these sounds are meant to be listened to. I have had artists explain to me that their works need absolute quiet, that their works can easily be adjusted to any soundscape, or that they can only develop their works for specific soundscapes. All too often, however, it does not suffice for the sounds to be installed as the artist intended for audiences to also listen as the artist intended. This means that it is essential that the artist and the museum staff work together on eliciting and communicating the sensory repertoires that the artwork engages in order to facilitate their shared rehearsal.” —Linnea Semmerling (Semmerling 2021)

Our adaptation (or fatigue) has similar effects, and it happens in both neurological and mechanical realms. Over time, our brains can learn to filter out frequencies, and similarly, the tiny hairs inside our cochlea can become physically unable to respond to the stimuli they receive. Luckily, most of the time, this is temporary “damage” to our ability to hear sonic details across the spectrum, and we will gradually reset back to our baseline. Those of us who have left a loud concert (or even a loud restaurant) feeling mildly deaf are relieved to feel our hearing return to normal as the hours pass. Loud enough sounds (upwards of 90 dB—a leaf blower, a subway coming into a station, or a rock concert) can physically hurt to hear. And the quick review earlier about our relative sensitivity to low- and high-frequency sounds is important here. High-frequency sounds will start to be unpleasant at much softer levels than low-frequency ones. This has implications for how we design our exhibitions and how to balance the sound in them. The considerations for a given work will change depending on what other works are around it. For instance, when installing Ragnar Kjartansson’s Scenes from Western Culture (2015)—which consists of nine video works that can be exhibited together, as single channels, or in smaller groups—each channel is treated differently depending on the selection of channels and their arrangement. When all nine channels are shown, all channels must compromise a little bit. When we’re exhibiting these works, just as when we are making them, we are thinking about arranging and balancing out an orchestration. We will have to adjust the level and equalization of Dinner (Jason and Alicia Hall Moran) differently than if it were being exhibited alone, boosting somewhere in the 1.5 kHz to 2.8 kHz range so that their speech is more intelligible in the mix. 
The roar of the Burning House and the engine throttling in The Boat (Stephan Stephensen, Kristín Anna Valtýsdóttir, and Gyða Valtýsdóttir) will provide bass support to the ensemble, which will benefit from a little more brightness from the frolicking and chattering children in Rich German Children (Ingibjörg Sigurjónsdóttir). If exhibited alone, the same brightness of the children playing and laughing might sound shrill.

“How much of a painting you can make with just sound. In [Me & My Mother], sound is so important. Like the sound of the spitting and the sound of that, y’know, [makes mouth sounds] and, uh, [more mouth sounds] all that saliva sound is so important. It’s very, again, it’s very musical. So, sound is very important for that piece. I mean, it’s always important in the pieces. In the works where I’m dealing with that gorgeous form of sound, it’s really the make it or break it. The sound always has to feel good. Even when you are making something aggressive, the sound has to feel good. . . . It pumps your heart.” —Ragnar Kjartansson (Kjartansson 2021)

19.3.2 Balancing Sonic Environments in an Exhibition Context

Obviously, one important design goal should be not to cause pain or permanent hearing loss for any of our visitors. And we have to remember that we are not just trying to create an appropriate exhibition experience for the visitors but also that we are producing a work environment for security personnel, docents, construction teams, office workers, cleaning and maintenance staff, and many others. Some visitors may spend 30 seconds in an exhibition or five minutes or an hour or more if the work is excellent. But many other people are spending full workdays in this environment, day after day and week after week, so we have to balance the impact of the work with respect for those who have to live alongside it. One excellent way to help insulate works from each other and improve the experience of those working in as well as visiting the museum is to incorporate well-built sound locks between works—which give the added benefit of helping with light bleed between works as well.

“The most successful exhibitions I have come across were usually the ones that made an effort to contextualize the artwork in question with other sounds and materials that allowed visitors to attune their faculties to the specific repertoire in question. This has been implemented at the 1980 exhibition For Eyes and Ears: From the Mechanical Clock to the Acoustic Environment, curated by René Block and Nele Hertling at Akademie der Künste Berlin, for example. The exhibition presented kinetic sound sculptures by Stephan van Huene, Joe Jones, Jean Tinguely and Takis together with 19th century music automata and mechanical clocks, and my research has shown that the historic popular crafts actually sensitized the 20th century audience to the sounds of the contemporary artworks. As sensory repertoires can hardly be delineated without their historical dimension, I believe that the curatorial and conservational responsibilities are similar in nature regardless of how distant the moment of the artwork’s presentation is from the moment of the artwork’s conception. A historically sensitive approach to the respective sensory repertoire will allow audiences of all ages to attune their sensory faculties to the sounds in question.” —Linnea Semmerling (Semmerling 2021)



Once we have made sure we aren’t actually hurting anyone, adaptation and fatigue still create more problems for us. The relative levels of high- and low-frequency sounds not only change the overall experience of the “weight” of the sound in the space but also raise questions about moving from work to work within an exhibition and between unrelated exhibitions. This is akin to the issues faced when mastering a suite of songs for an album—as each track plays through and the next one starts, our ears are not really hearing “flat” anymore but hearing in the context of the song that just ended. If we move from a track with lots of bassy sounds to another without any, it will feel a little hollow even though it might have been “perfect” on its own. When an artist releases an album, a mastering engineer has carefully tweaked each song to create a pleasant journey from beginning to end. When a soundtrack is released that has many different artists’ songs (taken from their own albums), a mastering engineer has to remaster them to “play well” with each other—the context is now different, and the collection of songs requires a slightly different approach. A similar problem arises when jumping from song to song on Spotify, for instance, and has led us into maddening loudness wars and endless mediocre “remasters” of good albums from past decades in order for them to stand up next to the latest Kanye West. Unfortunately, we may find ourselves in a bit of a similar situation when putting multiple sound-based works together in the same exhibition. One way to avoid having to compromise works in an exhibition so that they “play well” with each other is to provide physical space between works [see figs. 19.5 and 19.6 of Pipilotti Rist’s No Sound Sculpture Passage (2017)]. 
If it is possible to arrange an exhibition so that visitors have a minute or two to catch their auditory breath—listening to the ambient sounds of walking, quiet talking, or even simply (relative) silence—it can give our brains and little hairs in our ears time to reset enough to enter a new work and appreciate it on its own terms.

“I also think of the guards and the people who work in the museum. Even if our sound isn’t that loud we invented really good, specialized sound passages. To swallow the sound on your way out and in, we work with a lot of fabrics to make double-sided curtains through which you meander. It is very simple and very effective. Sometimes we even use an additional rubber fabric behind the curtains. Three meters, turn, three meters, turn, three meters and then you’re out. Like a zig-zag.” —Pipilotti Rist (Rist 2021)

When I am spending a week installing and mixing the sound for an exhibition, I am constantly aware of how much time I am spending outside of the exhibition. Too much time tweaking sound without giving your ears time to reset gives wildly diminishing returns. We will discuss some of these issues more in the final section of this chapter, and hopefully, this brief psychoacoustic overview will have helped to situate you for it. In short, everything we hear next is heard relative to what we have heard previously. And what we have heard previously can tire our ears and brains out. So whether we are designing a whole exhibition or even just trying to put the finishing touches on a single installation, remember that it is very important to give the ears a rest.

19.4 Recording Sound

To capture sound, we must translate its two axes of information—amplitude and time—into an archive. There are many formats for such an archive and as many methods of making them. Some of the earliest, in the 19th century, involved physically etching this information onto wax cylinders. A needle attached to the end of some apparatus for amplifying the vibrations of a sound wave rests on the wax. And as the cylinder spins (time), the vibrations cause the needle to cut a groove into the wax (amplitude). This historical method, though quite clever, seems a little primitive or silly now (if it doesn’t seem silly, I urge you to listen to Edison’s liver complaint or vaudeville recordings on the National Park Service’s website [National Park Service (NPS) 2015]), and yet this process remained fundamentally unchanged for decades. Wax and metal and acetate and then vinyl records and magnetic tape eventually replaced the wax cylinders. Microphones, transducing sound wave pressure into electrical voltages, eventually replaced the purely mechanical process of amplifying sounds to etch a needle’s vibrations into a medium. The electrical voltages (amplitude) could be written onto magnetic reel-to-reel tapes spinning at various speeds (time). The speed of the tapes (essentially equivalent to the resolution or sample rate) got faster and faster. This enabled higher and higher frequencies to be recorded with better and better fidelity. Today, we still use microphones to transduce sound wave pressure into electrical voltage (amplitude). An analog-to-digital converter then samples this continuous analog signal a certain number of times a second (the sample rate), which is digitally recorded as discrete chunks of bytes (the number of which determines the bit depth). There are innumerable unsuccessful and successful recording techniques and media from the last century and a half (in terms of their audio quality, market share, and longevity), but almost all of them are wildly outside the scope of this chapter. The important thing here is that every single one of these methods ultimately records sound wave pressure as amplitude over time to a medium for later playback. The previous examples will be helpful as visual metaphors for understanding how digital audio works.

Figure 19.5 Test hanging of No Sound Sculpture Passage (2017), by Atelier Pipilotti Rist, Andreas Lechthaler (architect), Dave Lang, Tamara Rist (seamstress), and Nike Dreyer. Photo: Atelier Pipilotti Rist

Figure 19.6 Plan of No Sound Sculpture Passage (2017), by Atelier Pipilotti Rist, Andreas Lechthaler (architect), Dave Lang, Tamara Rist (seamstress), and Nike Dreyer. Photo: Atelier Pipilotti Rist
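As a rough sketch of that last step, here is a toy “ADC” in Python: it measures a continuous signal a fixed number of times per second (the sample rate) and rounds each measurement to a signed integer of a given bit depth. This is an illustration of the idea only, with invented names; it is not how real converter hardware is implemented.

```python
import math

def adc(signal, sample_rate, bit_depth, duration):
    """Toy ADC: sample a continuous signal and quantize each sample
    to a signed integer of the given bit depth."""
    full_scale = 2 ** (bit_depth - 1) - 1   # e.g. 32,767 for 16-bit
    samples = []
    for n in range(int(sample_rate * duration)):
        t = n / sample_rate          # the moment this sample is taken
        amplitude = signal(t)        # continuous value in [-1.0, 1.0]
        samples.append(round(amplitude * full_scale))
    return samples

def sine(t):
    """A 1 kHz test tone."""
    return math.sin(2 * math.pi * 1000 * t)

# One millisecond of a 1 kHz tone captured at CD quality.
samples = adc(sine, sample_rate=44_100, bit_depth=16, duration=0.001)
print(len(samples))  # 44 integer samples represent that millisecond
```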

19.4.1 Analog versus Digital Recording

The difference between analog and digital recordings is that the former is a continuous change of amplitude over time, whereas the latter must record amplitude over time by sampling it at a specific rate and writing the values as discrete chunks, or samples. That means digital audio is only as high resolution as the frequency at which the amplitude is sampled and the bit depth of the discrete chunks of data archived. The Nyquist-Shannon sampling theorem tells us that a sample rate s cannot accurately capture frequencies above s/2. As the luckiest of us cannot hear much above 16 kHz (less than half the CD standard sample rate of 44.1 kHz), delivery sample rates have not doubled over the last few decades with as much enthusiasm as video resolutions have. Though higher sample rates are used in studios for tracking and mixing, we will generally be dealing with audio files that have a 44.1 kHz sample rate (for pure music) or 48 kHz (for broadcast sound and, because of that legacy, sound on videos). But other factors play as much of a role in the quality of digital recording as the nominal sample rate does. For example, a digital recording is only as accurate as the regularity of the clock triggering the sample collection (see Section 19.4.4 for more explanation). Capturing samples at 44.1 kHz using an extremely precise clock will sound much more realistic, spatially rich, and harmonically nuanced than capturing samples at 176.4 kHz with a less accurate clock. The precision of the conversion of the sampled analog amplitude to a digital value is also at least as important as the nominal sample rate. So the fact that digital formats must always operate in discrete samples introduces many places for sound degradation. But that doesn’t necessarily mean that analog audio recording is always preferable to digital.
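The Nyquist-Shannon limit can be demonstrated numerically. In this illustrative sketch (names invented for the demo), a 30 kHz tone, which lies above the 22.05 kHz Nyquist frequency of 44.1 kHz sampling, produces exactly the same sample values as a phase-inverted 14.1 kHz tone (44,100 − 30,000 = 14,100). Once sampled, the two are indistinguishable, which is why frequencies above s/2 cannot be accurately captured: they “alias” down into the audible band.

```python
import math

SAMPLE_RATE = 44_100
NYQUIST = SAMPLE_RATE / 2  # 22,050 Hz: the highest capturable frequency

def sample_tone(freq, num_samples):
    """Sample a sine of the given frequency at SAMPLE_RATE."""
    return [math.sin(2 * math.pi * freq * n / SAMPLE_RATE)
            for n in range(num_samples)]

# A 30 kHz tone is above Nyquist; its samples are indistinguishable
# from those of a phase-inverted 14.1 kHz tone.
too_high = sample_tone(30_000, 100)
alias = [-s for s in sample_tone(14_100, 100)]

print(max(abs(a - b) for a, b in zip(too_high, alias)))  # effectively zero
```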

19.4.2 Analog Audio on Magnetic Tape and Vinyl

Take reel-to-reel tape, for example. The wider the tape is, the bigger the maximum recorded voltage (as magnetic charge) can be. That translates to a larger difference between loud sounds (and, more importantly, delicate, quiet ones!) and the noise floor, which you can think of as the attempt to “record silence.” If there is more room between a delicate sound you are trying to record and the noise floor, you will be able to hear more detail in that delicate sound, as it will have less background noise drowning it out. Similarly, the faster the tape is spinning during recording (the more tape used to record, say, one second of sound), the higher the resolution of the recorded changes in amplitude will be. In tape recordings, this is described as inches per second (ips), and common rates were 7.5″/15″/30″, with each doubling roughly doubling the highest frequency that could be recorded accurately.


This might make you think of the rotations per minute (rpm) of vinyl records. And you would be right, but why, then, aren’t your grandma’s old 78 rpm Sousa records more than twice as good as your 180 gram, 33 rpm vinyl reissue of Goldie’s “Timeless”? This is the crux of the problem regarding the quality of audio recordings, both analog and digital. The 78 rpm records were etched with tools that had worse noise floors and didn’t cut the grooves as finely, and the motors spun the records with less speed consistency. The high rpm was simply an attempt to compensate for that lack of precision. And it did; at 33 rpm, the Sousas would have been truly unlistenable. Just because analog audio is recorded as continuous changes in amplitude, there is no guarantee of the precision with which those changes, in either amplitude or time, were archived. Before we address some specifics of digital audio processing, let’s start with a fundamental fact we will need to keep in mind whenever we are dealing with recording, processing, or exhibiting audio: there is not necessarily a direct correlation between quality and technical specifications, such as sample rate, bit depth, and frequency response of speakers, even if that feels intuitive. The end result will always be dependent on the quality of everything from clocks to circuitry design, age of components (such as capacitors), and selection of materials for input/output jacks or speaker diaphragms. High quality is just high quality. Of course, high-quality analog audio is fantastic, and the same goes for digital audio.

19.4.3 Sample Rate and Bit Depth in Digital Audio

The parallels in the digital realm for the earlier discussion of tape and vinyl are bit depth (recorded amplitude resolution: tape width or groove depth) and sample rate (recorded time resolution: tape or vinyl medium speed when recording). To get from the analog domain to the digital domain, these amplitude changes (transduced by the microphone into voltage that is still analog) must be converted into samples of a certain bit depth at a stable sampling rate. The most common bit depths you will see are 16 (2 bytes for a signed value range of −32,768 to 32,767), 24 (3 bytes for a signed value range of −8,388,608 to 8,388,607) and 32 (4 bytes for a signed value range of −2,147,483,648 to 2,147,483,647). Bit depths of 32 (or even 64, for a signed value range of −9,223,372,036,854,775,808 to 9,223,372,036,854,775,807) are most often found internally in software to allow math to be done without clipping (losing sample integrity due to loud material exceeding the available headroom) when processing. Let’s look at some smaller numbers to illustrate this problem—let’s look at our hands. Let’s say I want you to add three fingers to some hand-friendly “sample” values: 5, 7, 8, 9. You hold out 5 fingers and add 3 fingers with your second hand. Our first sample is now 8. You repeat for 7 and get 10. Now hold out 8. If I tell you to perform our operation again, adding 3 fingers to the count, you probably can’t. You will probably max out with a count of 10. The same goes for the 9. So our new sequence of samples becomes 8, 10, 10, 10. You clipped our archive: the relationship between our samples is no longer accurate. We didn’t have enough headroom (handroom?), and if this were audio math, then something strange would have been introduced into the waveform.
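The finger exercise translates directly into code. This sketch (with invented names) first replays the hands example, where the count maxes out at 10, and then shows the same clipping happening to 16-bit samples pushed past the top of their signed range by a volume boost.

```python
def clip(value, lo=-32_768, hi=32_767):
    """Force a value into the representable 16-bit signed range."""
    return max(lo, min(hi, value))

# The finger exercise, where a "hand" can only count to 10.
samples = [5, 7, 8, 9]
boosted = [min(s + 3, 10) for s in samples]
print(boosted)  # [8, 10, 10, 10]: the last two samples clipped

# The same thing with real 16-bit audio when a 20% volume boost
# exceeds the available headroom.
loud = [30_000, 32_000, -31_000]
gained = [clip(int(s * 1.2)) for s in loud]
print(gained)  # [32767, 32767, -32768]: the waveform is flattened
```

Doing the multiplication at a higher bit depth first, and only reducing the level afterwards, is exactly the headroom trick the text describes.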

19.4.4 Wordclock in Analog-to-Digital and Digital-to-Analog Conversion

The component responsible for converting the amplitude to integer values is called the analog-to-digital converter (ADC), and it generally includes its own built-in clock (known as wordclock in the context of audio equipment) for triggering the sampling of the incoming signal into a series of bytes with each tick. For digital-to-analog converters (DACs), each tick triggers the conversion of a digital sample back into a continuous analog voltage signal. For a quick illustration of the disconnect between technical specifications and quality, you can get a digital audio interface capable of doing stereo analog-to-digital and digital-to-analog conversion at sampling rates of up to 192 kHz for $200 or less, and you can also get a very, very good, dedicated ADC capable of sampling rates of up to 96 kHz for well over $1,000. The most high-end dedicated clocks can even use atomic decay for the highest possible precision—a rubidium clock generator that only provides this wordclock “ticking” over BNC cables to trigger conversion by other dedicated ADCs/DACs can cost upwards of $6,000. The higher the precision of this clock, the more regular the timing of the conversion. Similarly, the better the conversion circuitry, the more accurately the voltage is converted into digital bytes and vice versa. This results in overtones with much more nuance and clarity as well as a more captivating stereo image (even technical audio terms are visually infused!). In short, you can picture a lower-quality clock for an ADC or DAC as a vinyl record spinning irregularly, sometimes faster and sometimes slower; an effect you’ve experienced if you were ever unfortunate enough to leave vinyl in a very hot car and then try to play it. A similar effect occurs when a turntable’s motor is reaching the end of its life. Remember how sensitive we are to subtle timing differences in sounds hitting our two ears? Clock irregularities (known in the digital audio domain as jitter) can do a profound amount of damage not just to our experience of harmonic nuance but to the spatiality of recorded sound as well.
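The effect of an irregular wordclock can be simulated. In this illustrative sketch (all names invented, and the wobble of up to 5% of a sample period deliberately exaggerated), a sine is sampled once at perfectly regular times and once at times nudged randomly off the grid; the difference between the two sets of samples is the distortion that jitter bakes into a recording.

```python
import math
import random

SAMPLE_RATE = 48_000
FREQ = 10_000               # a high test tone exposes jitter best
PERIOD = 1 / SAMPLE_RATE

random.seed(0)              # a deterministic "cheap clock" for the demo

ideal = []
jittered = []
for n in range(SAMPLE_RATE // 10):   # a tenth of a second of samples
    t = n * PERIOD
    # the cheap clock ticks up to 5% of a sample period early or late
    wobble = random.uniform(-0.05, 0.05) * PERIOD
    ideal.append(math.sin(2 * math.pi * FREQ * t))
    jittered.append(math.sin(2 * math.pi * FREQ * (t + wobble)))

worst_error = max(abs(a - b) for a, b in zip(ideal, jittered))
print(f"worst sample error caused by clock jitter: {worst_error:.4f}")
```

Real converter jitter is measured in picoseconds rather than whole percents of a sample period, but the mechanism is the same: the right value is captured at the wrong moment.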

19.4.5 Digital Audio Formats

Audio stored as a series of bytes representing the voltage changes of an analog audio signal is known as pulse-code modulated (PCM) audio. There is no compression involved (lossless or otherwise), and so it is as good a representation of the original signal as the ADC’s components can provide. It represents as much of the signal’s dynamic depth as its bit depth can store, at as high a resolution as the rate at which those bytes were sampled. The digital data stored on a CD, in an Audio Interchange File Format (AIFF) file, or in a Waveform Audio File Format (for some reason WAV and not WAFF) file all qualify as PCM audio. Apple created AIFF, IBM and Microsoft created WAV, and Sony and Philips created the Audio CD standard. The difference between these formats is largely a question of how the bytes are ordered and what kind of metadata they can include in their headers. That means converting between these formats, whether ripping a CD to AIFF files or burning a CD from a WAV file, is also a lossless process. So whether you choose to use WAV, AIFF, or Core Audio Format (CAF, Apple’s format that succeeded AIFF) is more or less irrelevant in most use cases. The picture changed around the turn of the century, when the Broadcast Wave Format (BWF) extension added multichannel (meaning more channels than stereo) support to WAV files. The 2009 RF64 extension to BWF then added support for 64-bit addressing, which wildly expanded the maximum possible WAV file size. On its release a few years before RF64, CAF had already supported both of these features. The 32-bit addressing of the earlier formats had imposed a maximum file size of 4 GB. The newer additions raised the potential file size to something like 17 billion GB. Back in the ’90s, you could only record yourself telling spooky stories in 48 kHz/24-bit quality for about eight hours. But now you can hit record and, around 3,740,955 years later, if my math is right, end up with one convenient file: AllMy5POOKYTales.wav.
Try that with a tape recorder. But digital audio’s biggest practical triumph over analog audio is that once sound is recorded as digital data, there is no generational loss of integrity when duplicating material, and many simple adjustments, like volume changes, can be completely reversible if you are careful. If the computer is too slow, it will just take its time to make the calculations correctly. In the digital domain, except in the case of some kind of catastrophic drive or processing failure, the data stored is reproduced exactly. No timing distortions and no change in the noise floor. When you lower the volume of a sound by 30% or raise it by 30% (assuming there is enough headroom), there is no noise added, no static introduced, and no pops as a capacitor discharges. The pure math being done ensures that each sample (the chunks of data representing discrete steps in time) is scaled by exactly the same factor, and no other information can be accidentally introduced.
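The “no generational loss” point can be verified with Python’s built-in wave module, which reads and writes PCM WAV files. This sketch (helper names are my own) writes a tone into an in-memory WAV file, reads it back, writes it again, and confirms that every generation is byte-identical to the original samples.

```python
import io
import math
import struct
import wave

SAMPLE_RATE = 44_100

# One second of a 440 Hz tone at half volume, as 16-bit PCM bytes.
pcm = b"".join(
    struct.pack("<h", int(32_767 * 0.5 *
                          math.sin(2 * math.pi * 440 * n / SAMPLE_RATE)))
    for n in range(SAMPLE_RATE)
)

def write_wav(pcm_bytes):
    """Wrap mono 16-bit/44.1 kHz PCM in an in-memory WAV file."""
    buf = io.BytesIO()
    with wave.open(buf, "wb") as w:
        w.setnchannels(1)
        w.setsampwidth(2)          # 2 bytes per sample = 16-bit
        w.setframerate(SAMPLE_RATE)
        w.writeframes(pcm_bytes)
    return buf.getvalue()

def read_wav(wav_bytes):
    """Pull the raw PCM frames back out of a WAV file."""
    with wave.open(io.BytesIO(wav_bytes), "rb") as w:
        return w.readframes(w.getnframes())

# Two "generations" of copying: encode, decode, re-encode, decode.
first = read_wav(write_wav(pcm))
second = read_wav(write_wav(first))
print(first == pcm and second == pcm)  # True: no generational loss
```

Repeating an analog dub chain this way would add tape hiss with every pass; here the thousandth generation is identical to the first.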

19.5 Reproducing Sound

Sound is always going to be pressure waves hitting our eardrums. So we need to be able to take our records (be they stored on vinyl or digitally) and convert them back into pressure waves in the air. For our exhibition of artworks, this will almost always involve getting digital audio back into the analog realm—that is, continuous voltage changes that can be used by loudspeakers to create pressure waves in the air. We have already covered the problems we encounter when importing analog sound into the digital realm, and going in the other direction raises similar concerns. In our digital audio files, we have a series of samples that describe sound. That is, we have a series of big numbers (generally 16- or 24-bit values) representing the amplitude changes of a sound wave over time. As a quick and reversed recap of the previous section, we will need to go from these numbers to voltages and ultimately transduce that voltage into a pressure wave. As these numbers have been “sampled” a specific number of times per second, we will need to be aware of that sample rate and be sure to convert these samples back to pressure waves at the same rate. The sample rate tells us how many discrete samples represent one second of sound. So if we play back a series of samples at a higher sample rate than intended, more samples per second are converted to voltage and then pressure waves, and the resulting sound will get higher in pitch and take less time to play. The opposite is true if we play those samples back at a lower rate than intended: the pitches are lowered, and the playback will take longer than if played back at the “native” sample rate. This is exactly the same problem as when you play back a 45 rpm vinyl record at 78 rpm or at 33 rpm (or, though fewer of us have experienced this, when you play back a 15 ips tape at 30 ips or 7.5 ips).
Understanding what a sample rate is and how it affects the recording and playback of sound is crucial to ensuring the integrity of sound-based artworks, both in the context of exhibiting such works and in the cases where we are transferring a work between different digital formats.
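The arithmetic of such a mismatch is simple enough to write down. This hypothetical helper (the function name is my own) computes the pitch shift and the changed running time when material recorded at one rate is played back, unconverted, at another.

```python
def playback_effect(native_rate, playback_rate, duration_seconds):
    """Pitch ratio and new duration when samples recorded at
    native_rate are played back, unconverted, at playback_rate."""
    pitch_ratio = playback_rate / native_rate
    new_duration = duration_seconds * native_rate / playback_rate
    return pitch_ratio, new_duration

# A 10-minute, 44.1 kHz work played on a system set to 48 kHz:
pitch, length = playback_effect(44_100, 48_000, 600)
print(f"pitch raised by a factor of {pitch:.4f}")  # 1.0884
print(f"runtime shrinks to {length} seconds")      # 551.25
```

Played “fast” this way, the work runs noticeably sharp and roughly 8% shorter, exactly the 45 rpm record on the 78 rpm turntable in miniature.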

19.5.1 The Importance of Sample Rates

As you can now understand, in order to play any sound back at all, a computer or sound device must have a sample rate set. Even if you have never seen it and even if you can’t change it, there is one. Most current computers will have a default of either 44,100 Hz (44.1 kHz) or 48,000 Hz (48 kHz), the standard sample rates for CD audio and the audio on DVD/Blu-ray, respectively (though in special cases, for example, when DVD and Blu-ray are used for audio-only releases, they can have higher sample rates than this). Recall from Section 19.2.1 that most humans can barely hear reliably up to about 20 kHz. For that reason, the vast majority of speakers in cars, laptops, and headphones, let alone earbuds, aren’t designed to do a great job of recreating sounds above 20 kHz that we couldn’t hear anyway.


Recall from the previous section that the Nyquist rule states that a frequency can only be accurately sampled at a sample rate of at least twice that frequency. We can then understand why 44.1 kHz and 48 kHz have stood their ground for audio delivery. Over the past 30 years, video resolution widths went from less than 500 pixels in VHS to 720 in DVD and eventually 1920 (HD) and 3840 (4K) for digital content in the 2010s. At the beginning of the 2020s, it isn’t unheard of for video games to be 7680 pixels wide (8K), let alone motion pictures. The same doubling and redoubling of audio resolution has simply not followed, and the reason is that the difference between 44.1 kHz and 88.2 kHz (let alone 384 kHz) is nothing like the visual quality advance from DVD to HD video—we can barely hear sounds at even an eighth of 192 kHz, the frequency cutoff for what 384 kHz sample rates can reliably reproduce. Because of the CD (music) and DVD (motion picture) heritage, the music and broadcast/motion picture production chains have tended to stick to multiples of 44.1 kHz and 48 kHz, respectively. And while it is very common for working resolutions in recording studios to be 88.2 kHz or 176.4 kHz or even higher, the delivery resolutions for music and at least digital delivery (if not cinema) have not changed in three decades. As expected, there are some “aficionado” audiophile exceptions—DVD audio releases boasted 96 kHz sample rates—but it seems extremely unlikely to me that higher resolutions will become standard for personal home experience. It would not surprise me for 88.2 kHz or 96 kHz to become common in exhibitions. But why bother playing back 88.2 kHz audio on a laptop speaker that simply can’t pump out frequencies that high, even if we could hear them?
To reiterate a point from the previous section on converting audio between the analog and digital realms, an excellent converter operating at 44.1 kHz or 48 kHz will have much more of a positive, tangible impact on the quality of sound in an exhibition space than sound at nominally twice the resolution run through lower-quality converters. Of course, if you are in a position to have excellent DACs operating at 96 kHz and speakers detailed enough to flatly reproduce sounds into that range, for a video work with detailed sound that (I stress) was delivered to you at a 96 kHz sample rate, then you are lucky and should take advantage of that opportunity. On that note, it is important to take a quick moment to discuss upsampling material. To lean on our visual metaphors again: just as an HD video cannot be improved by exporting a 4K file for exhibition—barring AI voodoo that literally “repaints” the frame based on what it has been trained to think the higher-resolution version would look like—so the 48 kHz audio in a work cannot be upsampled to 96 kHz to gain any improvement in quality. Converting MP3 or AAC to uncompressed audio will yield no improvement, just as transcoding ProRes 422 HQ (a codec that is compressed, if very, very good-looking) video to uncompressed video does not, no matter how many museums ask us to, improve the quality of the video. Doing so to create “archival” files may help keep bit rot from noticeably affecting the work, but that is all. In an ideal world, all work would be recorded, processed, and delivered in an uncompressed domain. While video work, at least for shorter works and works with lots of cuts, is often recorded uncompressed, it is prohibitively expensive or (until rather recently) basically technically impossible to exhibit the work that way. With audio, thank goddesses, uncompressed playback has already been the norm for decades.
The audio file will have the sample rate (and bit depth) in its header, but unless we are opening the file in some kind of digital audio workstation (DAW) that automatically sets our system’s audio output sample rate, we will need to check manually that the system is set to match the sample rate of our file. Unfortunately, many quick playback tools, such as VLC or QuickTime, will not complain about a sample rate mismatch and might not resample it cleanly for you. Worse, if they do automatically convert the sample rate and the pitch and timing come out close enough, you still don’t really know how much damage the conversion is doing to the sound quality.

Christopher McDonald

As with video frame rates, the system will generally do an on-the-fly sample rate conversion. Most modern systems have native frame rates of 50, 59.94, or 60 fps and will retime the video accordingly. When this is done—for instance, for a video that is 25 fps on a system running at 60 fps—you can see the periodic jumps every second or so when frames are repeated or dropped to accommodate the operating frame rate. The retiming for sample rates is less severe but still affects the quality of the audio. So for purposes of quality control and especially exhibiting works, it is fully your responsibility to ensure that no unintentional conversion is taking place. This is very system dependent and beyond the scope of this chapter, but a quick search should be enough to find the system-specific settings required so you can check for yourself. This is one of many reasons to use dedicated playback systems where the sample and frame rates are locked to the material instead of Windows/Mac OS/Linux desktop environments where mistakes can happen.
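As a concrete illustration, for plain WAV files the declared sample rate can be read straight from the header with Python's standard-library wave module (delivery containers such as MOV or BWF would instead need a tool like MediaInfo or ffprobe; the filename and contents below are made-up stand-ins):

```python
import wave, struct

# Write a short 48 kHz mono test file (a stand-in for an exhibition asset)
with wave.open("test_48k.wav", "wb") as w:
    w.setnchannels(1)
    w.setsampwidth(2)  # 16-bit samples
    w.setframerate(48000)
    w.writeframes(struct.pack("<100h", *([0] * 100)))  # 100 silent frames

# Read the sample rate straight from the header before playback
with wave.open("test_48k.wav", "rb") as r:
    rate = r.getframerate()
    print(rate)  # 48000 -- compare this against your system's output rate
```

If the header says 48000 and the operating system's output device is set to 44100, an on-the-fly conversion is happening somewhere, and it is worth finding out where.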

19.5.2 Getting the Audio to Speakers

All of the above is rather technical but important to know in order to proceed to the most important stage of the process of reproducing sound: getting your audio to a very good speaker. The speaker uses a coil of wire (an electromagnet) attached to a diaphragm; current passing through the coil in the field of a permanent magnet pushes the diaphragm back and forth, resulting in pressure waves in the air. Sound! In order to get the material from the playback system to the speaker, you will need the same kind of device discussed in the previous section that takes us from analog voltage output to discrete digital bytes, but this one runs in the other direction: digital to analog. The same clock and conversion precision issues apply regardless of which direction you are going. A good digital-to-analog converter will need a good clock (which, as discussed in the previous section, will dramatically affect the quality of the pitch and spatial imaging) and high-quality components for translating the sample values into voltage. In the exhibition context, this means that your selection of equipment is crucial to presenting the work as accurately as possible. And again, the general rule is that you get what you pay for.

Sound in Time-Based Media Art

19.5.3 Frequency Range

Technical specifications are often not very helpful here for converters or speakers. We see specifications for speakers like “frequency range: 20 Hz to 20 kHz,” which essentially tells us nothing useful about the sound quality we will experience. It is good to know we should be getting 20 Hz out of a speaker, but it doesn’t tell us anything about how subjectively good the sound is at either end of the spectrum or in the middle, for that matter. For example, it doesn’t tell us if the speaker has a weird +3 dB bump at 1 kHz. Although some speaker manufacturers provide helpful graphs that give us a sense of the flatness of their frequency response over the spectrum, we still don’t really know how good the frequency reproduction sounds. Further, because every speaker is taking voltage as an input and literally pumping a diaphragm back and forth based on its circuitry, we not only have to deal with questions of frequency response but also dynamic reproduction. Every speaker will essentially act like its own unintentional compressor—a tool for reducing the dynamic range of sonic material. Or it might even act as an expander and increase the dynamic range of the material, which in this case will likely mean making quiet sounds inaudible. Really good speakers will not decrease the “distance” between quiet sounds and loud ones. They also won’t increase it. But you will never see a meaningful or satisfying description of this in technical specifications. It is simply a question of how good the speaker is. This again points us to the importance of the artist’s guidelines in the selection of equipment. Some works may be more about “wall of sound” and guts, and some may be more about delicate nuance. Hopefully, the artists in the first category have selected speakers that are powerful enough, and the ones in the latter category have selected speakers that are detailed enough. Some artists may be in a gloriously different category altogether. It is always best to get some kind of qualitative description of the intent of the work if there isn’t already one. This is discussed more in the following section, but a simple rule would be that a digital-to-analog converter should always be as neutral (i.e., good) as possible while speakers are generally going to be a matter of taste.

“My style is always having the speakers as close to ear height as possible. Curators normally would put the speakers out of sight, but I prefer that it looks bad [laughs]. If you hide the speakers far away, you often need to have too high sound level to fill the space and that gives the spill into the next room. In my exhibitions I always have to avoid spilling. So I try to have more speakers and have them lower—lower in sound and lower in position. If you have more speakers the sound holes are less. “Other guys in my league often have two super big and expensive stereo speakers. I prefer cheaper . . . but more [laughs]. Then they don’t have to be so loud and sound spills less into the other rooms. And I always cover the speakers in stockings, like satin leg stockings, in the color of the wall. The speakers are usually white or black and I usually have colorful walls so we give them the same color with the stockings.” —Pipilotti Rist (Rist 2021)

19.5.4 Active and Passive Speakers—Balanced, Unbalanced, and Digital Cables

One crucial distinction between speaker types is that some are passive (they require an external amplifier to drive them) while others are active, or powered (the amplifier is built into the speaker itself). One major benefit of using powered speakers is that the engineers have already perfectly matched the amplifier circuitry to the elements in the speaker. This generally makes for a bigger, heavier, and more expensive speaker. Each active speaker should take a single balanced cable carrying analog audio as voltage, or even an AES cable carrying a digital audio signal if it is a high-end speaker with its own built-in DAC such as the Genelec 73xx subwoofer series and 83xx speaker series. A single AES cable—which, confusingly, often uses the exact same connector (either TRS or XLR) as a normal balanced cable but should have a slightly different impedance (the amount of electrical resistance the cable puts on the voltage it carries)—can actually carry two channels of digital audio, known as channel A and channel B. Speakers capable of receiving AES digital audio as an input should also have a pass-through output that will allow you to daisy-chain two speakers using AES cables and then select on each speaker whether you want to use channel A or B. This can somewhat simplify stereo installations. Cables with RCA jacks or headphone jacks should never be used because they are unbalanced, which means they lack the ability to cancel out the noise they are picking up. The wires in cables are essentially a kind of weak antenna for electromagnetic waves like radio frequencies and AC power cycling. So the longer the unbalanced cable, the bigger the problem you have. Though we are dealing here with electromagnetic waves instead of pressure waves, as in the case of sound, the wave cancellation process described in fig. 19.4 is exactly how a balanced cable works. The balanced cable has three wires: ground, hot, and cold.
Hot and cold are given the same audio signal, but one is phase-reversed, as are the pair of complex waves in fig. 19.4. When the device on the receiving end gets the balanced audio signal, it flips the phase of one of them and sums the two signals into an audio signal twice as loud, while any noise the two cables picked up (which should be virtually identical on the two wires) is canceled! Though it is best practice to avoid running XLR or AES cables longer than 100′ or so and perhaps to avoid running them parallel to power cables for long stretches, generally their shielding and three-wire design are such that no problematic amount of noise is introduced into the signal they are carrying. Passive speakers are connected to the amplifier using two wires, positive and negative: one carrying the signal and another for the ground. In addition to the possibility of picking up noise, this presents another shortcoming as compared with powered speaker architecture. As we know from our review of sound in Section 19.2.3, sound waves have peaks and valleys that can cancel each other out in space. When you are dealing with passive speakers, the positive and negative inputs become a way of inverting the phase of these sound waves. Plug the correct positive signal into the positive input and the correct negative signal into the negative input, and you produce a sound wave whose peaks and valleys are as expected. Mix them up, and you have now inverted the phase of this audio signal (again, see the mirror-image pair of complex waves in the middle of fig. 19.4 for a visualization of two waves that are inverted or “out of phase” with each other). If you have a two-speaker installation with very similar (or, worse, identical) sound, then you will have extremely strange sound problems in the space if the phase of both speakers is not matched. The two speakers will be pumping out sound waves that will cancel each other out.
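The flip-and-sum arithmetic of a balanced line is easy to verify numerically. In this sketch (assuming NumPy; the signal and noise levels are invented), identical noise lands on both legs, and the receiver's phase flip doubles the program signal while the common noise cancels exactly:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 48000
signal = np.sin(2 * np.pi * 440 * np.arange(n) / 48000)  # program material
noise = 0.2 * rng.standard_normal(n)  # induced hum/RF, identical on both wires

hot = signal + noise    # straight copy plus picked-up noise
cold = -signal + noise  # phase-reversed copy plus the *same* noise

# Receiver: flip "cold" back and sum -> signal doubles, common noise cancels
received = hot - cold
print(np.allclose(received, 2 * signal))  # → True
```

In the real world the two legs never pick up perfectly identical noise, so the rejection is large but finite; the principle, however, is exactly this subtraction.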
To give you a more visceral sense of how bad this is, quick-and-dirty karaoke versions of songs can be made by flipping the phase on one channel of the audio. The main vocal in pop music is identical on both left and right channels, while things like various synthesizers or often most of the drums are spread out across the stereo image. Flip the phase on one channel and . . . that’s right: the vocal magically disappears—perhaps a little reverb ghost of it might remain because reverb is generally used to add “width” or spaciousness to sound, so the left channel and right channel material is usually substantially different and won’t cancel out. When this is done in a DAW, the phase cancellation is mathematically exact and generally obvious—something disappears from the mix. When it happens acoustically in a real space, you may not notice it right away (or from certain spots in the room!), but it can create problems that are difficult to understand. For instance, bass frequencies in stereo mixes are also often identical in both left and right speakers. Recall our discussion of the length of bass frequencies from Section 19.2.1, and you can begin to imagine just how weird this can get. It may manifest as “feeling like there is no oomph,” or it may create a truly bizarre, unnatural experience as you move through the space. If you do have to use passive speakers—for instance, if an artist requires it—pay special attention to two things: the impedance (measured in ohms) and the wattage. Both are outside the scope of this chapter (I am not much of an electrical engineer), but getting the impedance wrong will affect the frequency curve of the sound, and getting the wattage wrong can, as I have experienced, make a speaker catch on fire (not my fault). Hopefully, if an artist stipulates the use of passive speakers, then they are also listing an amplifier to use with them. Even so, it is worth checking that the impedance and wattage are correctly matched.
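The phase-flip karaoke trick described above can be reproduced in a few lines (assuming NumPy; the "mix" here is three invented sine tones standing in for a centered vocal and two panned synths):

```python
import numpy as np

n = 48000
t = np.arange(n) / 48000
vocal = np.sin(2 * np.pi * 220 * t)    # identical in both channels (centered)
synth_l = np.sin(2 * np.pi * 330 * t)  # panned material differs per side
synth_r = np.sin(2 * np.pi * 392 * t)

left = vocal + synth_l
right = vocal + synth_r

# Flip the phase of one channel and sum: the centered vocal vanishes
karaoke = left - right
print(np.allclose(karaoke, synth_l - synth_r))  # → True
```

Anything identical in both channels cancels exactly; anything different survives, which is why reverb tails and wide-panned instruments remain as the "ghost" described above.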
It is important to note that most speakers, whether active or passive, are either subwoofers or two- or three-way speakers. Subwoofers are best at reproducing sub-bass and low-bass frequencies. Two-way speakers contain both a mid-range driver (named because it is specially designed to reproduce the middle-frequency range, not because they are necessarily reasonably priced) and a tweeter, which takes care of the higher frequencies. Three-way speakers add a woofer element to extend low-frequency response, but because the woofer element is not generally as large as in a dedicated subwoofer, the low frequencies will not be as powerful or detailed as with a good subwoofer. There are also mid-range speakers, which contain only the mid-range driver; most commonly, you see them in small radios or not-so-great-sounding car stereos. Subwoofer drivers are the largest, tweeters are the smallest, and the mid-range drivers are predictably in between the two. Recall the discussion in Section 19.2.2 of the relatively high energy needed to produce lower frequencies, and this size difference makes sense. Fig. 19.1 illustrates how these different speaker elements fit into the range of audible frequencies.

19.5.5 Choosing Speakers That Reflect the Artist’s Intentions

So again, we return to the artist’s specifications. The experience of the sound in a work is dependent on (in addition to matching sample rates and bit depth and having a good, neutral digital-to-analog converter) the speakers used. We need to understand what effect is desired. This is more important than specifically listed speakers/amplifiers for a few reasons. First, no equipment is in production forever. And second, parts for discontinued equipment eventually become impractical to acquire. As with the question of “upgrading” works to future unthought-of playback systems and codecs, we are trying to make sure we know that a work is always being exhibited faithfully. As I hope I have made clear, replacing a speaker with one of (per the speaker’s technical specifications) equivalent frequency response or wattage gives no guarantee of maintaining the quality of experience the artist hopes for. I do see this as fundamentally a speaker issue. Every other element of the chain, as we are in the digital realm, I see as a question of maintaining “neutrality.” We seek to translate the sample values and timing as accurately into the analog space as we can. Only in the most esoteric realms will someone demand an Apogee Rosetta converter because of its “special sound,” and there are some cases where that may be called for, but it will not be in the exhibition of artworks. If you have maintained a high-quality audio environment with attention to the specifications of the file, your biggest qualitative question in going from digital audio to (analog) sound will be speaker selection. The bare minimum you can do in selecting speakers and other equipment is to be sure that you are playing back as many channels (each on its own speaker) as the work calls for. Hopefully, this is explicitly stated in the guidelines for the work.
But if there is any uncertainty, you can check the files themselves to see how many channels of audio are in them. Keep in mind that it is possible to have a single track of audio containing multiple discrete channels. And as you know from the discussion of phase and sound cancellation in the previous section, it is important to be sure you are not mixing multiple tracks down to a single speaker. Finally, selecting the right equipment to faithfully reproduce sound in an exhibition is just as important when designing a quality control or proofing room for such works. When works are acquired, it is important to become acquainted with the content of the work in an environment that is a little more controlled than an exhibition space. With good monitoring headphones or a very well-treated room and very good speakers, it is possible to familiarize yourself with what is in the work so you are better positioned to notice, when the work is exhibited, what is and what isn’t supposed to be there.
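If you want to verify the channel count yourself, the header again has the answer. A minimal sketch with Python's standard-library wave module (the file and its contents here are invented stand-ins; in practice you would inspect a real delivery file, and container formats beyond WAV would need ffprobe or MediaInfo):

```python
import wave, struct

# Write a 4-channel interleaved WAV as a stand-in for a multichannel delivery file
with wave.open("quad.wav", "wb") as w:
    w.setnchannels(4)
    w.setsampwidth(2)  # 16-bit
    w.setframerate(48000)
    w.writeframes(struct.pack("<400h", *([0] * 400)))  # 100 frames x 4 channels

# Inspect the header: channel count and frame count
with wave.open("quad.wav", "rb") as r:
    print(r.getnchannels(), "channels,", r.getnframes(), "frames")
```

A four-channel file feeding a stereo pair means something is being folded down somewhere, and, as the phase discussion above shows, folding discrete channels together can do real damage.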

19.6 Exhibiting Sound

There are two competing aims when we exhibit works with sound. We want to present the work as faithfully as possible. But we also want an engaging space that feels like it belongs in the museum. We would prefer for a visitor to feel like they are having a pleasantly continuous experience as they wander from exhibition to exhibition—that is to say, we don’t want them to be noticing anything but the work. In a museum with iconic and beautiful architecture, it is a shame to build a black box. In an ideal world, our acoustically neutral exhibition space would feel like a coherent architectural extension of the rest of the building. Unfortunately, we tend to have to choose one of the two aims to succeed while the other suffers. If we are lucky, the unlucky one only suffers a little bit. When possible, we should try to exist within the museum space rather than creating an acoustically dead black hole within it. Still, sometimes that is the only choice. We wander through a corridor of paintings, turn right, and enter what is essentially a little movie theater with no seats. But in most cases, we can find a compromise where we eliminate the space’s worst acoustic offenses without taking the nuclear option. A quick reflection on our spatial sensitivity to sound as it hits our two ears tells us that exhibiting work in improper or untreated spaces can profoundly damage works with sound. When the sound from the work reflects off surfaces in the exhibition space, the punch and detail described by the “work space” are compromised. Cues that are intended to help the viewer settle into the “work space” are muddied and distorted. It becomes very difficult for the “work space” to overcome the limitations of a poorly designed exhibition space.

19.6.1 Reducing Sound Reflections

We have two options to improve the acoustics of a space: acoustic panels (on the walls and ceiling) and carpeting. Hard surfaces bounce back sound just as bright, shiny ones reflect light. Simply put, the more paneling and carpet coverage, the better (more acoustically neutral) the room will be. But there are some trade-offs. We don’t necessarily want to make a room completely “dead” sounding. What you gain in sound clarity, you lose in the naturalness of the experience of being in the space. Additionally, care should be taken in the selection of the materials, especially panel thickness, which acts as a selective equalizer, with thinner panels absorbing high frequencies (shorter wavelengths) better and thicker panels absorbing low frequencies (longer wavelengths) better. There are many different materials for acoustic paneling, including fiberglass, Rockwool, packed cotton, and wool. Except in cases of availability issues or building code regulations, there is no reason not to use panels made from recycled cotton and wool, which are more environmentally friendly to make and dispose of. We would prefer to have a balance of 2″, 4″, and 6″ thick panels, so we are doing a “flatter” job of absorbing frequencies across the spectrum. The 2″ and 4″ panels work well on the walls and ceiling, mounted directly to the wall and either mounted directly to or floated off the ceiling. The thicker 6″ panels, which are best at targeting the lowest frequencies (longest wavelengths), work best in the corners of the space installed at 45-degree angles. This creates an air gap of around 20″ behind the panel (for a 2′-wide panel), which multiplies the efficacy of the bass trap as the bass frequencies are absorbed by the panel and then decay while bouncing around behind it. All panels should be wrapped with an acoustically transparent fabric for visual reasons.
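The thickness-to-frequency relationship can be roughed out with the quarter-wavelength heuristic: a porous panel absorbs well down to roughly the frequency whose quarter wavelength equals its thickness. This is a common acoustics rule of thumb, not a claim from the text or any manufacturer, and real absorption also depends on material density and mounting:

```python
SPEED_OF_SOUND = 343.0  # m/s at room temperature

def lowest_absorbed_hz(thickness_inches):
    """Quarter-wavelength estimate of a porous panel's low-frequency limit."""
    thickness_m = thickness_inches * 0.0254
    return SPEED_OF_SOUND / (4 * thickness_m)

for t in (2, 4, 6):
    print(f'{t}" panel: effective down to roughly {lowest_absorbed_hz(t):.0f} Hz')
```

The estimate puts a 2″ panel's useful range down to roughly 1.7 kHz and a 6″ panel's to roughly 560 Hz, which is why the mix of thicknesses above, plus corner air gaps for the true bass, does a "flatter" job across the spectrum than any single thickness.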
It is usually easy enough to find a good fabric color match, so the panels blend in with the surface as much as possible. This is preferable to painting the panels, which some manufacturers approve. In our experience, this risks resulting in panels with an acoustically reflective surface, thus defeating the whole purpose of having panels in the first place. Unfortunately, there isn’t a great rule of thumb for exactly how many panels to use. This depends on how large your panels are but also on whether you are paneling the ceiling in addition to the walls, whether you are using carpeting, and the room’s materials and dimensions. We often aim for larger aggregates of panels (since they are less distracting than having a checkerboard pattern all over the room), and we center them on the projection height or room height, whichever feels more natural. If the projections are 8′ tall and the rooms have 16′ ceilings, we would probably go with something like 12′-tall-by-8′-wide panels. The panels might then extend to 2′ above and below the projection. These decisions will often boil down to personal preference (whether the artist wants a completely dead room or not and how willing they are to compromise the look of the space) and budget. In general, it is a game of finding a balance between low visual distraction and good sound. And if, for some reason, there are very few possibilities for acoustic treatment, take advantage of them anyway. If we can’t panel the ceiling, carpeting will make a huge difference. If the walls have a lot of difficult permanent features that limit the number of panels that can be installed, even a few acoustic panels installed at ear height will help.

“The more you cater to the sound and the more clumsy things you do that make the space worse, like put carpet and whatnot, the better the environment gets, because sound is king. I think it started with my godmother. If we would go with her to a restaurant, she would subtly freak out if there was some music in the background. She was like, ‘MUSIC IS NOT SUPPOSED TO BE IN THE BACKGROUND.’ She was an old school early twentieth century singer. She didn’t understand this idea that sound did not matter. I think she was a big influence in thinking that music and sound really, really matter. It’s not like some blanket.” —Ragnar Kjartansson (Kjartansson 2021)

19.6.2 Considering the Space and Location

We may also have to deal with other sound issues relating to the room itself and its proximity to other works. Perhaps there is a very loud air handling unit in the room or a freight elevator on the other side of a wall. For works with delicate extended sections of calm, this can be very disruptive. Preferably, we would have noticed this and chosen a different space to begin with. For some problems, there will be no perfect solution. But other problems like loose rattling metal in the ceiling can be tracked down and silenced or lessened. Such problems often only appear after the work is installed and playing at its normal level. It is safest to test sound in the space ahead of time by bringing in speakers and doing test playback during installation as early as possible. Hopefully, the way this work will impact other exhibitions around it—and how other works will impact it—was already considered early in the room selection process as well.

“When I’m installing, it’s for me very important that we place the speakers in the first days. That’s what I learned over the years: that sound often comes too late or is neglected. Especially in group exhibitions, very much energy goes into placing projectors and monitors and all that and then sound comes very late. So as a rule in our team we want to do it the first day, position the speakers and make tests on the ground. And then adjust to it. And also the mixing is done in the room.” —Pipilotti Rist (Rist 2021)


Keep in mind also that the dimensions of the room itself determine the ways in which it will interact with the piece—think back on Lucier’s I Am Sitting in a Room! In general, the more square a space is and the lower the ceiling height, the more disruptive the room’s frequency response and echoes will be. Unfortunately for museum design, rooms that don’t have 90-degree angles are often very helpful in minimizing acoustic issues. This is why recording and mastering studios often go to such lengths and expense to create non-square spaces. Often, we will not be lucky enough to get spaces like this, designed with acoustic neutrality as a priority. Another general rule is that in spaces that are longer in one direction, it is best for the sound to project down the length of the room.
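The square-room problem can be made concrete with the standard axial-mode formula, f_n = n·c/(2L), for the resonant frequencies along one dimension of length L. This sketch (the gallery dimensions are invented for illustration) shows why a square plan is trouble: two dimensions reinforce exactly the same low frequencies:

```python
SPEED_OF_SOUND = 343.0  # m/s

def axial_modes_hz(length_m, count=3):
    """First few axial room-mode frequencies along one dimension: f_n = n*c/(2L)."""
    return [n * SPEED_OF_SOUND / (2 * length_m) for n in range(1, count + 1)]

# Hypothetical 6 x 6 x 3 m gallery
print(axial_modes_hz(6.0))  # length AND width share these modes, doubling the problem
print(axial_modes_hz(3.0))  # the low ceiling adds its own set starting higher up
```

With length and width equal, every mode of one dimension coincides with a mode of the other, so those bass frequencies get reinforced twice over; non-parallel walls and unequal dimensions spread the modes out instead.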

19.6.3 Tailoring the Sound to the Space

Once the room has been treated to the best of our abilities, we can still use equalizers (EQs) or sometimes other signal processing, like compression, to tailor the sound of the work to the space. Even though we are technically adjusting playback of “the work,” what we are really doing is better thought of as further adjustment of the exhibition environment. The room, even with a good distribution of sound paneling and carpeting, still has specific dimensions which effectively amplify certain frequencies. If this room’s “mode” results in a boost of 5 dB from around 400 Hz to 550 Hz, we can cut that frequency range accordingly in the work. This is almost always going to be done in the room with specialized hardware or software rather than by preparing a new, equalized exhibition copy of the work. This is simply because, in addition to many artists not allowing the production of new exhibition files, it is easier to tweak the sound while hearing the effects of your changes in real time. These adjustments are best made after all painting and, obviously, carpeting and paneling are finished, as they will all affect the sound of the room. If possible, it is highly recommended that the changes be finalized with the ideal or expected number of people in the space as well. Just as with sound panels, human bodies absorb their own range of frequencies. And the more bodies in the space, the more those frequencies are absorbed. The same goes for setting the overall sound levels of the work. It is also helpful to consider the general behavior of the visitors. Some museums tend to have very quiet visitors, while others’ visitors can be rowdy. For example, if there are special openings or specific donor events, you may need to think about creating entirely different “presets” of EQ and overall sound level.
This does add extra complexity and the need for someone to take care that the correct preset is loaded at the correct times, so it is often better avoided, but it can be an option when a museum team has highly competent sound specialists on staff (usually this means a recording artist who also happens to be a museum installer, which can be a boon). Note that some sound systems include automatic room adjustment: the system tests the space with an “impulse” (usually some kind of white or pink noise), which it then records with a microphone and analyzes. It can then apply neutralizing EQ to “undo” the room’s frequency issues. In theory, this works. In practice, it may not yield results as good as patiently adjusting the work by ear. These automatic impulse-based adjustment systems can neutralize the sound of the space but do not thoughtfully address any other problems the particular work may have in a particular space. The artist’s intentions may require additional changes for a number of reasons.
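To make the idea of compensating a room mode concrete, here is a deliberately naive sketch (assuming NumPy; a real room EQ would use smooth parametric filters in dedicated hardware or software, not a brick-edged FFT gain):

```python
import numpy as np

def cut_band(x, rate_hz, lo_hz, hi_hz, cut_db):
    """Naive FFT-domain cut of [lo_hz, hi_hz] by cut_db. Sketch only:
    real-world EQ uses minimum-phase filters with gentle slopes."""
    X = np.fft.rfft(x)
    freqs = np.fft.rfftfreq(len(x), d=1 / rate_hz)
    gain = 10 ** (-cut_db / 20)
    X[(freqs >= lo_hz) & (freqs <= hi_hz)] *= gain
    return np.fft.irfft(X, n=len(x))

rate = 48000
t = np.arange(rate) / rate
x = np.sin(2 * np.pi * 450 * t)       # a tone sitting inside the boosted room mode
y = cut_band(x, rate, 400, 550, 5.0)  # compensate a +5 dB mode with a -5 dB cut

drop_db = 20 * np.log10(np.max(np.abs(np.fft.rfft(y))) / np.max(np.abs(np.fft.rfft(x))))
print(round(drop_db, 1))  # → -5.0
```

The cut applied at the playback system and the boost contributed by the room's geometry roughly cancel at the listener's ears, which is all the specialized in-room hardware is doing, just with better-behaved filters and real-time audition.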

19.6.4 Documenting the Intended Sound Experience

It can be difficult to be faithful to decisions of artistic intent, even if you do know what that intent is, and that is why I advocate for well-documented, qualitative descriptions of the work’s sound. If an artist’s studio does not provide such an account, it is recommended that curators, conservators, and registrars actively seek one out from them as soon as possible—and they should urge artists to include such descriptions with their work in the future. I believe some rules like the following are helpful, if not necessarily all to my taste:

• “The work should have the dynamic range of an orchestra playing a symphony in a hall.”
• “The work should feel consistently, almost overwhelmingly loud with little mid-range sound and maximum bass.”
• “The sound should feel full, transparent, natural, and never pushy.”
• “The guitars should sound like they are in the space with you, not like they are amplified at a rock concert.”
• “It should be so loud that you do not want to stay in the room.”
• “At the moments of peak impact—around 23 and 28 minutes in—the bass should be felt in your stomach.”
• “The work should be just loud enough that the line ‘Save me a bite’ at 17 minutes and 32 seconds is just barely audible when the room is at expected capacity.”

Such rules are helpful precisely because you can get consensus from different people working on the project in order to achieve an end result that, for most visitors, will achieve the work’s goals. The rules are not overly technical and are immediately relatable. Some people prefer sound pressure level meters or similar tools, which analyze the sound level of the room using microphones, but there are major issues with that approach. What if the work is to be played back at 80 dB but the bass response of the room is wrong? As you now know, since the energy of sound waves varies greatly over the frequency spectrum (and low frequencies change a lot with listening position), you could end up with a wildly inaccurate sound level as it reads on a meter. Easily more than a few times too loud or too quiet! More importantly, consensus on the acoustic character will change over time. That means that if we hope to preserve the intended experience of a work for future generations, our best hope is an easily understandable, qualitative checklist rather than an EQ graph or a list of decibel levels. Within the conservation community, some efforts have been made to develop frameworks for the documentation of sound-based artworks (Brost 2021).

“We just say, y’know, that it has to have a gut punch and that it has to be presented nicely and hope for the best. Everything may be different in the future. I mean, like, Titian probably did not envision that in the future his paintings would have a warning about how patriarchal and disturbing they are. Art moves with the times and the technology. Time will tell. Conceptual relations aside then the sonics of these works need to adapt to people hearing differently. People in the future will have a . . . we have a very, very different sense of sound and bass and treble than a 19th century person. Even records of the late 20th century are now remastered for a different sonic aesthetics.
So if anyone wants to show these works in the 23rd century then they will just deal with that in their period. And so keep going with this weird Titian example, then now these ‘poesie’ paintings are presented as brutal rape paintings and it’s really put into the context of our culture now and that’s cool. It’s just really awesome. I think it makes Titian more interesting for contemporary viewers. I never think of the works in the future after my death. I’m just like, ‘Why should anyone be interested?’ but, y’know, maybe people will be and then I just think, y’know, they know it better how to deal with their era than me who is dead in the ground.” —Ragnar Kjartansson (Kjartansson 2021)


19.6.5 Maintaining the Work During Exhibition

Once the work is properly installed, we have a new problem—maintaining the work throughout the duration of its exhibition. The question we are asking here is: What can go wrong? Equipment failure and cable failure can cause problems ranging from fairly subtle to very obvious. There could be nearly imperceptible, sporadic clicks. Perhaps one speaker’s tweeter fails, and that channel is now missing detail in the high frequencies, so it appears dull and muffled. Perhaps one speaker dies completely or loses its signal, and no sound comes out of it at all. And maybe one speaker just blasts extremely loud white noise—all the time or sometimes.

“For the first two weeks of an exhibition I would ask and pay a neutral person in the city who is not linked with the museum to go there twice a week. This person would work kind of like my spy and is often another artist. And I tell the person all the points they have to check. And then after three days you call the museum: ‘Hey one monitor is not running or one speaker is off.’ They feel observed and it has a really good impact on the rest of the exhibition’s maintenance. In a way you just have to double-check in the first two, three weeks. And then the person only goes there once a week and then once a month. . . . That is a good tip: you need a local spy.” —Pipilotti Rist (Rist 2021)

19.6.6 Knowing the Artwork

Especially for the subtler problems, we need to know what is and is not supposed to be in the piece to begin with, so it is important to be familiar with it. As discussed in Section 19.5, we need to listen through the work with good reference headphones and get a sense of the sound quality in general and how it changes over the course of the work. Note that even high-end computers and standalone video players can have very low-quality DACs, so headphone jacks should never be used for exhibition, let alone quality control. A good audio interface for the computer (and a standalone DAC for video players) should always be used.

While doing quality control, there are many questions you should ask yourself. Are there clicks and pops in the work? Static? Clean, detailed passages and noisier, rougher ones? Big changes in sound level from scene to scene or moment to moment? Hopefully, the artist has a list of any issues like this that are intentionally present. During the shooting of Ragnar Kjartansson’s 2005 work Satan Is Real, someone’s cell phone went off and created odd, digital, buzzing interference for a few seconds. We were going through his older works recently, and he decided to keep it, as the duh-dit-dit-di-dit duh-dit-dit-di-dit really was a devilish sign of the times. Or maybe, as with Ragnar Kjartansson’s The Visitors (2012), flies kept landing on microphones—and it is a feature, not a bug! We took out some of them and kept a few for good measure. As with the qualitative rules, if there are no mentions of such issues accompanying the work, it is a good idea to ask the artist’s studio as soon as possible. Are there any issues in the piece that might sound like errors but are intentional? If we are going to be able to catch distortion or glitches during installation or over the course of the exhibition, we will need to study the piece and understand what is supposed to be in it and what is not.
And if we do find something we are concerned about and have no explanation for, we should be sure to check with the artist’s studio right away. Once we are sure the piece is running properly, several people should familiarize themselves with it in the space so that they will be able to check for any changes in quality over the course of the exhibition.
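Some of these checks can be partially automated once the audio exists as digitized samples. The sketch below is a minimal illustration in Python, not a conservation standard: the threshold values are hypothetical and would need tuning per work. It flags channels that are effectively silent (a dead speaker, cable, or output) or persistently at full scale (possible clipping or blasting noise), two of the fault types described above.

```python
import math

def channel_report(channels, silence_rms=1e-4, clip_level=0.999, clip_ratio=0.01):
    """Flag per-channel faults in multichannel audio.

    `channels` is a list of sample lists, one per channel, floats in [-1.0, 1.0].
    Returns a list of (channel_index, status, rms) tuples, where status is
    "silent", "clipping", or "ok". Thresholds here are illustrative defaults.
    """
    report = []
    for i, samples in enumerate(channels):
        # Root-mean-square level of the channel.
        rms = math.sqrt(sum(s * s for s in samples) / len(samples))
        # Fraction of samples sitting at (or essentially at) full scale.
        clipped = sum(1 for s in samples if abs(s) >= clip_level) / len(samples)
        status = "ok"
        if rms < silence_rms:
            status = "silent"      # no signal at all: dead channel
        elif clipped > clip_ratio:
            status = "clipping"    # pinned at full scale: likely distortion
        report.append((i, status, rms))
    return report
```

Such a script could run against short reference recordings captured in the gallery, but it supplements, and does not replace, critical listening by someone who knows the piece.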


Sound in Time-Based Media Art

“I remember I always asked someone who was just better in sound than me to help me with doing it. Right there at the earliest, in the earliest works then I would ask a musician friend if he owns a good mic to use. You know, I think really early on I always realized the importance of trying to have the sound better than I could imagine. “I’ve hardly ever fucked around with recording anything myself. Y’know, even y’know, like, back in the old days when just I had a dictaphone it was just like [gasps] I was just, I was scared of the technology. I think I’ve always had such huge respect for sound that I  have not been recording kind of anything myself through all these years when I think of it. That it’s always this thing, y’know, you should, y’know, you’re-you’re cutting something out of crystal, like, y’know I’m gonna just call a crystal cutter: I dunno how to do this.” —Ragnar Kjartansson (Kjartansson 2021)

My last piece of advice, and probably the best advice in this chapter, is to hire a sound engineer. If no one on staff is a professional mix engineer or acoustician, consider finding one you like working with for projects that require it. When we need to light an exhibition, we hire a lighting designer if one isn’t already on staff. To paint, we get good painters on the job. I believe it is important for all of us in exhibition design, conservation, and curating to have a good grasp of the basic nature of sound and some special issues we run into when dealing with it in the context of time-based media art. I hope that this chapter has served as a helpful introduction. But when you have the opportunity to hire someone more specialized than you, you will probably be glad that you did.

Further Reading on Sound in Time-Based Media

Davis, Gary, and Ralph Jones. The Sound Reinforcement Handbook, 2nd ed. Milwaukee, WI: Hal Leonard, 1990.
Farnell, Andy. Designing Sound. Cambridge, MA: MIT Press, 2010.
Kelly, Caleb, ed. Sound. Documents of Contemporary Art. London and Cambridge, MA: Whitechapel Gallery, MIT Press, 2011.
Licht, Alan. Sound Art Revisited. New York: Bloomsbury Academic, Bloomsbury Publishing Inc., 2019.
Semmerling, Linnea. Listening on Display: Exhibiting Sounding Artworks 1960s–Now. Maastricht, Netherlands: Maastricht University, 2020. https://doi.org/10.26481/dis.20200430ls.
Semmerling, Linnea, Peter Peters, and Karin Bijsterveld. “Staging the Kinetic: How Music Automata Sensitise Audiences to Sound Art.” Organised Sound 23, no. 3 (December 2018): 235–45. https://doi.org/10.1017/S1355771818000146.
Toole, Floyd E. Sound Reproduction: The Acoustics and Psychoacoustics of Loudspeakers and Rooms, 3rd ed. New York and London: Routledge, 2017.
Weibel, Peter. Sound Art: Sound as a Medium of Art [Exhibition, Center for Art and Media, Karlsruhe, March 17, 2012–January 6, 2013]. Cambridge, MA: MIT Press, 2019.



Bibliography

Brost, Amy. “A Documentation Framework for Sound in Time-based Media Installation Art.” Journal of the American Institute for Conservation 60, no. 2–3 (July 3, 2021): 210–24. https://doi.org/10.1080/01971360.2021.1919372.
Cox, Christoph, and Daniel Warner, eds. Audio Culture: Readings in Modern Music. New York: Continuum, 2004.
Katz, Robert A. Mastering Audio: The Art and the Science, 2nd ed. New York: Focal Press, Taylor & Francis Group, 2013.
Kjartansson, Ragnar. Interview with Ragnar Kjartansson by Chris McDonald, 2021.
National Park Service (NPS). “Listen to Edison Sound Recordings,” 2015. https://www.nps.gov/edis/learn/photosmultimedia/the-recording-archives.htm.
Rist, Pipilotti. Interview with Pipilotti Rist by Chris McDonald, 2021.
Roederer, Juan G. The Physics and Psychophysics of Music: An Introduction, 4th ed. New York: Springer, 2008.
Semmerling, Linnea. “Email from Linnea Semmerling to Chris McDonald.” July 26, 2021.


20 CARING FOR ANALOG AND DIGITAL FILM-BASED ART

John Klacsmann, with a contribution by Julian Antos

Editors’ Notes: John Klacsmann is Archivist at Anthology Film Archives in New York City, where he preserves experimental film and artists’ cinema. Applying his experience as both a film preservationist and a laboratory technician to the unique context of artists’ film and time-based media, Klacsmann introduces the analog medium of film and its specific use by artists. He guides the reader through the process of film inspection and condition assessment and offers best practices on film duplication and digitization for archival purposes. At the end of his chapter, in Section 20.8, Klacsmann is joined by Julian Antos (Executive Director, Chicago Film Society, and Technical Director, Music Box Theatre, Chicago), who contributes his perspective on exhibiting film. Approaching this topic as a film projectionist, technician, and archivist, Antos introduces the practical considerations that go into planning a screening room or gallery space and selecting, preparing, and maintaining film projection equipment.

20.1 Introduction

Analog film, a ubiquitous moving image medium for most of the 20th century, has over the last 20 years become a rarefied, and therefore expensive, medium. The commercial entertainment film industry’s successful wide-scale deployment of digital cinema exhibition throughout the 2000s and 2010s effectively transformed analog motion picture film from an industry standard into a niche medium with only small-scale commercial and industrial support. Nonetheless, analog film remains in use today by artists and by the archivists, preservationists, and conservators who care for film-based works. In the context of film-based artworks, analog film is still used because display often relies on duplication: looping setups require multiple sets of film prints for every exhibition, or there is a requirement or desire to exhibit the work in its native analog film format. Archivists, preservationists, and conservators also rely on analog film for its excellent long-term archival properties. That is, in proper storage, new polyester-based film elements can last hundreds of years. Unfortunately, without the prevalent support of the commercial film industry and studios, many film laboratories, where film processing, printing, and duplication take place, have
DOI: 10.4324/9781003034865-24


John Klacsmann with Julian Antos

shuttered. There is now dwindling expertise among technicians at the few remaining film labs, and printing and processing work has become expensive. Eastman Kodak remains the only major manufacturer of raw film stock, yet they have discontinued many of the most important stocks in their product catalog over the last 20 years. Artist filmmakers such as Tacita Dean and other prominent figures from the art, museum, and archival communities began a campaign in 2014 calling on UNESCO to protect and safeguard the medium of film [SAVEFILM (Website) n.d.]. Simultaneously, Hollywood studios reportedly made deals directly with Kodak in 2015 and 2020 committing to the purchase of bulk amounts of film stock for commercial production, ensuring the continued manufacture of motion picture film for the foreseeable future (Giardina 2020). This chapter aims to provide a general understanding of film as an artistic medium, introducing aspects of care, identification, and inspection, as well as providing a brief overview of analog, digital, and hybrid duplication. Film projection and exhibition are covered from the perspective of presenting film-based art, both theatrically and in a gallery setting.

20.2 A Brief Overview of Film as a Medium in Artists’ Practice

Artists began working with film as soon as it became a viable medium in the early 20th century. The 35 mm film format was widely adopted by the nascent film industry by 1909, and as Paolo Cherchi Usai notes, “The success of cinema as a form of mass entertainment is tied to this type of film” (Cherchi Usai 2003, 2). Because 35 mm was a professional format, many of the artists (for example, Hans Richter, Man Ray, Salvador Dalí, and Marcel Duchamp) who were making non-narrative 35 mm art films in the pre-war period were doing so alongside—or ancillary to—their work in other mediums, like painting or sculpture. A proliferation of 16 mm film stocks, cameras, and projectors in the post-war era made the format, initially introduced in 1923 for the “amateur” market, a more practical gauge for artists who were dedicated to producing their own films outside of the commercial entertainment industry. In addition to being more economical to work with, 16 mm (and later 8 mm and Super 8 mm) was also a more flexible and portable projection format, allowing artists to show their work in smaller, non-traditional, and often temporary screening venues. Despite existing largely outside of the film industry, and outside art galleries and museums as well, “independent” and artists’ film flourished internationally by the 1960s. Variously called underground, experimental, or avant-garde film, these loose movements often adopted self-determined, cooperative models of distribution and exhibition. John Hanhardt described the position some artist filmmakers of this era found themselves in: “Museums and art galleries virtually ignored this cinema, as film did not appear to function within the economics of collecting and its attendant support system of galleries, museums, and critical journals.
In the absence of assistance from private foundations, the film industry, and the marketplace, filmmakers were left to their own collective efforts and individual perseverance.” (American Federation of Arts 1976, 37) But by the 1970s, more artists were producing film installation works specifically for gallery exhibition, and as a result, film was more widely exhibited within museum and gallery spaces. These types of works usually relied on film looper devices, which had become more widely available, allowing film prints to be projected continuously. The emergence of optical media and digital video by the 1990s brought new possibilities to show film works in the gallery. Films could be transferred to video and easily shown on loop either on a video projector or monitor.

Caring for Film-Based Art

Likewise, artists could shoot on film but exhibit their work on video. In more recent years, as film becomes more rarefied, there has been renewed interest among artists in shooting and/or exhibiting new works on film. For a more comprehensive overview of artists’ use of film, see Film and Video Art (Comer 2009). P. Adams Sitney’s Visionary Film: The American Avant-Garde 1943–2000 (Sitney 2002) is a detailed account of the post-war experimental film movement in the United States.

20.3 Collecting Artists’ Film

Before media preservation practices started to take hold in museums, the primary film collection objects were film prints, also known as release prints or exhibition prints. Original film elements or printing masters, such as internegatives and interpositives, were acquired less frequently. Film prints can scratch and wear during projection, whether in the course of scheduled theater presentations or on continuous display with film loopers. An extended presentation of the artwork therefore necessitates the ability to create new exhibition prints from master materials. This is why, in many cases in the past, prints may not even have been acquired but simply rented from a distributor or borrowed from artists directly for specific presentations. While artists often did not make many prints, prints were considered relatively easy to replace and were not considered rare. With evolving preservation practices, a wider variety of film elements are collected now: from original elements, multiple intermediates, and printing negatives to multiple print copies, including artist-approved reference prints. A wide variety of video elements are collected as well, from Digital Cinema Packages (DCPs) and videotapes to optical media and digital files. While prints can still be produced, they are now less common than before, more expensive, and slower to produce, and are therefore considered more rarefied.

20.4 What Is Analog Film?

20.4.1 Material Composition

Analog film is made up of light-sensitive silver halide crystals suspended in a gelatin binder (see fig. 20.1). This emulsion is coated on a flexible plastic strip, which acts as a support or base. The strip is perforated so that it can be transported through and consistently registered in a camera, projector, and/or another machine, such as a film printer or scanner. When the silver halides are exposed to light, a chemical reaction occurs in which a “latent image” is formed. In contemporary chromogenic color film stocks, the emulsion has three layers with color dye couplers alongside the silver halides, with layers sensitive to red, green, or blue, respectively. The film must then be developed in chemistry to convert the latent image to a visible image, a process by which the exposed silver halides are converted to metallic silver. In chromogenic color stocks, dye couplers form colored dyes (cyan, magenta, or yellow) during development, and in a subsequent bleach step, the silver is removed. A professional film laboratory typically handles the development of the film and, if copies are desired, prints and develops film duplicates.

20.4.2 The Basics of Film Printing and Duplication

From the beginnings of film in the late 19th century until today, the analog process of film duplication has not significantly changed. There are two ways to duplicate (often


Figure 20.1 Simplified cross-section of a piece of analog film. Illustration: John Klacsmann

called printing) a film: via a contact printing method or an optical printing method (see Section 20.6.2). Film printing and developing are typically performed by skilled technicians at a motion picture film laboratory. With the emergence and wide commercial adoption of digital cinema by the 2010s, the analog process of film printing and developing has become increasingly rare, and few film laboratories remain. The printing process makes use of different types of film stocks (see Section 20.5.4). Eastman Kodak remains the only major manufacturer of motion picture film stock, and availability has decreased in recent years. Nonetheless, polyester-based film elements are excellent long-term carriers for motion pictures, and thus, analog duplication still plays an important role in film archiving and preservation to this day. Additionally, the need or desire to produce film prints for exhibition in a theater or gallery space continues, especially among artists, archives, museums, and galleries.

Once the original camera rolls are photochemically developed, the rolls are typically edited by the filmmaker, with shots physically cut and tape or cement spliced together. Exhibition prints can be struck from the cut camera original, but ideally intermediate generations of film elements (i.e., internegatives, interpositives, or dupe negatives) are produced from it first. These intermediate elements are produced on polyester-based films, which can last many decades when stored properly. Exhibition prints (also called release prints, positive prints, or projection prints) can then be created from an intermediate internegative or dupe negative for the projection and/or distribution of the film. In this way, new prints are produced from a duplicate intermediate master rather than directly from an irreplaceable camera original, which risks damage during handling or printing.
When producing new intermediates or projection prints, the master elements must be “graded” or “timed” first. In this process, a skilled laboratory technician (called a timer) programs the lamphouse of the printer. This programming, which results in specific light changes in the printer while it is being run, will dictate the color and density of each shot in the resulting duplicate.
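The duplication chain just described can be summarized in a small sketch. This is purely illustrative (the element names follow the chapter; the model is not a laboratory standard): each printing step yields the next generation and flips the image polarity between negative and positive, which is why a print struck from an internegative comes out positive.

```python
def next_polarity(polarity):
    """One printing step flips image polarity (negative <-> positive)."""
    return "positive" if polarity == "negative" else "negative"

# Chain from the text: camera original (a negative) -> interpositive ->
# internegative -> release print. New prints are struck from the intermediate
# internegative, never directly from the irreplaceable camera original.
polarity = "negative"                      # the developed camera original
chain = {"camera original": polarity}
for element in ("interpositive", "internegative", "release print"):
    polarity = next_polarity(polarity)
    chain[element] = polarity
```

Following the chain through, the interpositive is positive, the internegative negative, and the release print positive again: three generations removed from the camera original.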



Figure 20.2 Workflow for duplicating a film from a negative element. The process of film printing and duplication creates a variety of negative and positive film elements. Illustration: John Klacsmann

Camera originals or intermediates may also be digitized (called scanning) to produce digital surrogates in contemporary file formats. Terminology for various film elements and processes may vary across film archives, labs, and territories. For clear and reliable definitions, see the Society of Motion Picture and Television Engineers’ Standard for Motion-Picture Film—Nomenclature for Studios and Processing Laboratories (ST 56:2005) (SMPTE 1996) or Kodak’s free online glossary (Eastman Kodak Company—Glossary 2022).

20.4.3 Manufacturing

Since motion picture film was introduced and subsequently widely adopted in the early 20th century, there have been several major international manufacturers (e.g., Eastman Kodak, Fuji, Agfa, Ansco, Dupont, 3M, and Orwo) producing many different B&W and color stocks. Many formats, gauges, color systems, and sound systems have been used throughout history, and this chapter does not aim to be comprehensive. Instead, it introduces the most common types of films and formats encountered in the worlds of artists’ film and time-based media art. Additionally, this overview centers on the history and use of Eastman Kodak’s products. Kodak developed and introduced many of the most widely adopted film formats and technologies, and they remain the only major manufacturer of most of the film stocks still available today. Table 20.1 provides an overview of important film manufacturing developments at the Eastman Kodak Company.

20.5 Analog Film Identification, Inspection, and Documentation

Identifying and documenting the format, status, physical characteristics, and condition of analog film elements in your collection via detailed film inspection will guide key decisions related to storage, exhibition, duplication, and digitization. Film inspection is also the basis for planning and budgeting any digitization or film printing projects (e.g., for exhibition or preservation purposes).


Table 20.1 Timeline of important manufacturing developments

1889  The Eastman Company introduces flexible and transparent still photography roll film based on cellulose nitrate.
1891  Thomas Edison demonstrates the kinetoscope motion picture system, which utilizes 35 mm transparent nitrate-based roll film. The 35 mm gauge will soon become a “standard gauge” for motion picture film.
1910  Cellulose acetate-based motion picture film is introduced in the 22 mm gauge by Eastman Kodak.
1923  Aiming at the amateur market, Eastman Kodak introduces the 16 mm motion picture film format, which utilizes acetate-based B&W reversal film.
1927  Optical sound-on-film technology is introduced with Fox Movietone News and F.W. Murnau’s Sunrise.
1932  Kodak introduces the Regular 8 mm motion picture film format. Technicolor introduces its three-color dye-transfer system for 35 mm.
1935  Kodak introduces Kodachrome color reversal motion picture film stock.
1948  All nitrate-based film stock begins to be phased out by Kodak.
1950  Kodak’s chromogenic motion picture film stocks, Eastman Color Negative and Eastman Color Positive, are introduced.
1952  Nitrate-based film stock is completely discontinued by Kodak.
1958  Kodak introduces Ektachrome color reversal motion picture film stock.
1965  Super 8 mm (Kodak) and Single 8 mm (Fuji) formats are introduced.
1973  Kodak introduces Super 8 mm film stock with magnetic sound stripe for single-system sound-on-film recording.
1991  Kodak discontinues the Regular 8 mm format.
1992  Kodak introduces polyester base motion picture film (Kodak’s trade name: ESTAR).
1993  Digital film technology is introduced with Disney’s Snow White and the Seven Dwarfs (1937) restoration, which utilizes Kodak’s new Cineon System (film scanner, film recorder, restoration workstation).
1997  Kodak discontinues Super 8 mm film stock with magnetic sound stripes.
2006  Kodak discontinues all Kodachrome color reversal motion picture film stock.
2012  Kodak discontinues Ektachrome color reversal motion picture film stock.
2013  Fujifilm discontinues all motion picture camera stock.
2018  Kodak reintroduces Ektachrome color reversal motion picture camera stock.
2020  Kodak discontinues polyester-based color internegative intermediate stock.

Source: Adapted from Kodak’s “Chronology of Film” (Eastman Kodak Company—Chronology n.d.)

Analog film should be inspected on a rewind bench with a light box and non-motorized rewinds (see fig. 20.3). While winding through a film, an architect lamp is crucial for examining the surface of the film, as it reveals scratches or damage on the surface. The beginning of the film is commonly referred to as the head, while the end of the film is called the tail. Flatbed editing tables/viewers (often called Steenbecks) or motorized rewinds should not be used to inspect or view original or intermediate masters, or unique, rare, or film elements of unknown condition (including prints). The risk of damage is too high for these types of elements. If viewing is required, digitization via an archival scanner is preferable. While winding through a film, all information pertaining to the material and technical specifications, as well as the condition, should be collected in an inspection report and entered into a cataloging database. Some basic equipment necessary to inspect a film is found in fig. 20.4. An


Figure 20.3 A rewind bench with a light box and non-motorized rewinds is needed for film inspection and preparation of films for exhibition. Photo: John Klacsmann

Figure 20.4 Basic equipment necessary to properly inspect a film: hand rewinds, metal split reels (1), film cores (2), magnifier/loupe (3), cotton gloves, light box (4), architect lamp, shrinkage gauge (5), tape splicer for all film types (6), cement splicer for nitrate and acetate based films (7), ultrasonic splicer for polyester-based films (8), fine scissors, footage/frame counter (9), “artist tape” for securing the ends of reels (10), and a footage ruler (11). Photo: John Klacsmann


example film inspection form is included as Appendix 20.1. One should take care to handle a film by its edges only while winding through it and not touch the surfaces of the film. Cotton gloves should be worn for extra safety, but take care, as gloves can easily catch on tears or broken perforations in unrepaired films and cause further damage. Sections 20.5.1 through 20.5.11 provide an overview of the different characteristics to document: the gauge, the base, the film wind, the type of film stock, date and edge codes, soundtrack type (if any), base decomposition, color dye fading, film damage/footage/splice count, and identifying original, master, and print materials. The National Film Preservation Foundation’s The Film Preservation Guide: The Basics for Libraries, Museums, and Archives [National Film Preservation Foundation (U.S.) 2004] is an excellent and freely available resource covering recommended film inspection procedures. The National Film and Sound Archive of Australia’s online Technical Preservation Handbook [National Film and Sound Archive of Australia (NFSA) n.d.] is also a recommended and detailed reference guide for film inspection, identification, and preservation. Finally, Kodak’s The Book of Film Care (Blasko et al. 1992) is a useful guide for identifying and naming the different types of deterioration and damage one may encounter during film inspection, and it details the types of repairs that can be performed.
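As a loose illustration of the kind of record such a cataloging database might hold, here is a minimal sketch in Python. The field names are hypothetical and do not follow any published cataloging schema; they simply mirror the characteristics an inspection documents.

```python
from dataclasses import dataclass, field

@dataclass
class FilmInspectionRecord:
    """One inspected film element; fields mirror the characteristics above."""
    title: str
    element_type: str               # e.g., "camera original", "internegative", "print"
    gauge: str                      # e.g., "16 mm"
    base: str                       # "nitrate", "acetate", or "polyester"
    wind: str = ""                  # head or tail out, A/B wind
    stock_and_edge_codes: str = ""  # stock type, date and edge codes
    soundtrack: str = "silent"      # e.g., "optical", "magnetic stripe"
    footage_ft: int = 0
    splice_count: int = 0
    condition_notes: list[str] = field(default_factory=list)  # decomposition, fading, damage
```

A record like this would be filled in at the rewind bench and could later drive storage, duplication, and digitization decisions; a real database would add controlled vocabularies and accession numbers.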

20.5.1 Gauge

A film’s gauge refers to the width of the film, edge to edge, in millimeters. Film has been manufactured in a variety of gauges dictating the format specifications and frame size of the image. The following gauges are the most common (see fig. 20.5).

35 mm
4 perforations per frame
16 frames per foot
Common aspect ratios: 1.33:1 (full aperture/silent), 1.37:1 (academy aperture/sound), 1.85:1 (hard matted widescreen), c. 2.35:1 (anamorphic widescreen, trade name: CinemaScope)
When optical sound was introduced, the frame size was slightly reduced, and a portion of the filmstrip between the perforations and one of the edges of the frame was reserved for the optical soundtrack.
c. 1891–present

The 35 mm gauge is the oldest standard film gauge and goes back to the origins of motion picture film technology. W.K.L. Dickson, an inventor who worked with Thomas Edison, asked Eastman Kodak to cut their then-standard 70 mm gauge photographic film in half for use in his experimental motion picture camera, the kinetograph, which was patented in 1891 (Eastman Kodak Company 2007, 7). In France, the Lumière brothers adopted a similar 35 mm format for their motion picture film camera. The 35 mm gauge was adopted in commercial and studio-produced filmmaking at the emergence of the industry. It was also used by artists, especially in the 1920s and 1930s, prior to the introduction and wider adoption of 16 mm film.



Figure 20.5 Film gauges commonly found in artists’ film and time-based media, from left to right: 35 mm film, 16 mm film (top: double perf; bottom: single perf), Super 16 mm, 8 mm, and Super 8 mm. Source: Adapted by John Klacsmann from images drawn by Max Smith using Visio, 2009–11 (Drawings released into Public Domain via Wikimedia Commons.)


16 mm
2 perforations per frame
40 frames per foot
Aspect ratio: 1.33:1
16 mm film can be perforated on both sides (double perf) or on one side (single perf). Sound films are always single perforated because the track is printed on one of the edges.
1923–present

Kodak introduced the 16 mm film gauge, alongside their camera and projector, in 1923, specifically for home use. The narrower and acetate-based format made it cheaper and safer to use, and cameras were smaller and lighter. It was swiftly adopted for home movie, amateur, educational, and industrial filmmaking. The format came into wider adoption, including by artists, after World War II because of a proliferation of projectors and cameras. In fact, 16 mm would become “the gauge that defined the avant-garde film movement,” as curator Steve Anker noted in Big As Life (Kilchesty 1998, 5). Because the format was cheap and flexible, and its aspect ratio matched that of television screens, the format was also widely used in television production (Thompson and Bordwell 2010, 446). Prior to videotape technology, video was recorded by shooting a monitor’s playback with a 16 mm camera and B&W film. These are called kinescopes. 35 mm films were also often “reduced” to 16 mm prints for non-theatrical projection.

Super 16 mm
2 perforations per frame
40 frames per foot
Aspect ratio: 1.66:1
1970–present

Super 16 mm is a shooting format, not a distinct film gauge: the width of the film base is identical to the 16 mm gauge, but the film frame has a wider aspect ratio. Developed in 1970 as a more flexible and economical widescreen format, Super 16 mm utilizes single perforated 16 mm film stock and is shot with a specialized camera that exposes the frame all the way to the non-perforated side’s edge (i.e., into the area usually reserved for perforations or for an optical track). Since the format is not a projection format, the camera film is usually “blown up” to 35 mm or is transferred to video for exhibition. It has an aspect ratio of 1.66:1, close to HDTV (1.78:1), making it well suited for use in television production (Eastman Kodak Company 2007, 43).

8 mm
2 perforations per frame
80 frames per foot
Aspect ratio: 1.33:1
1932–91

Also known as Regular 8 mm or Standard 8 mm, the 8 mm gauge was introduced by Kodak in 1932 as a cheaper alternative to 16 mm film. The film was sold on 25 ft. spools of double-perforated 16 mm film with twice as many perforations on each side. The perforations are of the same shape and size as 16 mm perforations. Kodak called these 16 mm rolls perforated for 8 mm “double 8.” The roll was loaded into an 8 mm camera, shot on one side of the film strip, and then reloaded into the camera, where the opposite side was shot. The film was developed

Caring for Film-Based Art

as 16 mm and subsequently slit in half and cement spliced in the middle by the development laboratory, which would return a 50 ft. roll of 8 mm film to the customer (Kattelle 2000, 95–6). Some artists adopted this format for their work, including some who used the format for reducing their 16 mm films to 8 mm prints, hoping to sell copies of their work to collectors for viewing in the home (Balsom 2017, 57). Kodak discontinued the manufacturing of 8 mm film in 1991.

Super 8 mm
2 perforations per frame
72 frames per foot
Aspect ratio: 1.33:1
1965–present

“Super 8” was introduced by Kodak in 1965 as a major improvement to the 8 mm format. The film comes preloaded inside a plastic cartridge in 50 ft. pre-slit 8 mm widths for seamless loading into a camera. Cameras had battery-operated motors, allowing filmmakers to shoot long takes without having to stop to wind the camera’s motor. The format used smaller perforations and a slimmer frame line to increase the surface area of the frame, resulting in a higher quality image. Also in 1965, Fuji introduced an almost identical competing format, Single 8, which utilized polyester stock. By 1971 Kodak had introduced auto-exposure in their Super 8 film cameras, and in 1973 the Ektasound system. The Ektasound system utilized a thin magnetic stripe along the edge of the Super 8 film. Cameras contained a magnetic head to record sound directly onto the stripe. This allowed for image and sound to be captured simultaneously in the camera and onto a single piece of film (Kattelle 2000, 237). Magnetic sound stripe Super 8 film is often shot at 18 fps or 24 fps. Super 8 mm projection reels have larger center holes when compared to Regular 8 mm reels. Adapters are required to fit Super 8 reels onto standard film rewinds. The features, quality, cost, flexibility, and ease of use of the Super 8 format led to wide adoption by schools, educators, industry, and even some professional cinematographers (Kattelle 2000, 214). And as Steve Anker explains in Big As Life: An American History of 8mm Films, “Super 8 was embraced by independent filmmakers because of its widespread availability, the novelty of single system sound (sound and picture recorded simultaneously on film) and even the murky look of its images” (Kilchesty 1998, 8). Since intermediate and print stocks are not produced in Super 8, Super 8 films are now commonly duplicated onto 16 mm stocks (called a blow-up) or simply digitized.
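The frames-per-foot figures given for each gauge make footage-to-runtime conversion simple arithmetic, which is useful when budgeting inspection, scanning, or loop lengths. The sketch below does the calculation in Python; the frame rate is an assumption that varies per work (e.g., 24 fps for sound film, 18 fps for much Super 8).

```python
# Frames per foot for the gauges described above (Super 16 mm shares the
# 16 mm figure, since the base width and perforation pitch are identical).
FRAMES_PER_FOOT = {
    "35 mm": 16,
    "16 mm": 40,
    "8 mm": 80,
    "Super 8 mm": 72,
}

def runtime_seconds(feet, gauge, fps=24):
    """Approximate running time of `feet` of film in the given gauge at `fps`."""
    return feet * FRAMES_PER_FOOT[gauge] / fps
```

For example, a 400 ft. roll of 16 mm at 24 fps runs 400 × 40 / 24 ≈ 667 seconds, or roughly 11 minutes, while the same footage of 35 mm holds far less time because each foot carries only 16 frames.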

20.5.2 Base The base (also sometimes called the cell) refers to the flexible plastic support of the film. When inspecting a film, it is important to be able to differentiate between the strip’s emulsion and base sides. The base side of the film can be identified because it is smoother and shinier than the emulsion side. The emulsion side will appear dull and often shows relief in the image area when illuminated with raking light. Since the emulsion side is more fragile, it is typically best to have it facing you while winding during inspection. Over the course of history, there have been three types of film base supports, each with different physical properties: nitrate, acetate, and polyester.

John Klacsmann with Julian Antos

Cellulose Nitrate c. 1891–1951 Commercially produced in 35 mm only The first commercially available film utilized cellulose nitrate for its base. Cellulose nitrate is dangerous because it is highly flammable, and deteriorating film can self-combust. As such, it requires specialized handling and cold storage. If discovered within a collection, it is best to contact the nearest film archive or storage vendor who has specially constructed nitrate film vaults. Be aware, nitrate film can only be shipped by someone who has hazardous material (hazmat) shipping training and certification. Nitrate film base was only commercially produced in 35 mm gauges for professional use. Kodak completely discontinued the use of nitrate in its products by 1951 (Blasko et al. 1992, 14). It can usually be identified by the edge printing “NITRATE.” Found almost exclusively in pre-1951 35 mm film elements, it is not a common film base for artists’ films. For more information about nitrate base deterioration, see Section  20.5.8, “Base Deterioration.” For more information about nitrate film, see Kodak’s online resource, Storage and Handling of Processed Nitrate Film (Eastman Kodak Company 2021).

Cellulose Acetate 35 mm: c. 1951–present 8 mm, S 8 mm, 16 mm: introduction–discontinuation/present Cellulose acetate-based film was developed in the early 1900s as a safe alternative to nitrate film. Since acetate film does not have the flammable properties of nitrate film, it is sometimes called safety film. When Kodak introduced 16 mm and 8 mm, it was with the amateur and home user in mind. Therefore, manufacturers only ever produced these gauges in acetate-based stocks. Beginning in 1951, all 35 mm stocks were also exclusively produced in acetate bases. As such, it is the most common base found among artists’ films. Acetate film base remains in use today for camera stocks in all gauges still manufactured. It can sometimes be identified by the edge printing “SAFETY.” For more information about acetate base deterioration (a.k.a. vinegar syndrome), see Section 20.5.8., “Base Deterioration.”

Polyester 16 mm, 35 mm: c. 1990s–present Print and intermediate stocks only While first introduced in the 1960s, polyester-based stocks began to be widely used in motion picture film in the 1990s. By 2000, most print stocks had moved over to polyester from acetate (Shanebrook 2016, 110). Kodak’s trade name for this more modern “safety” film is ESTAR. Polyester film is thinner and much stronger than acetate film. It is also highly stable in comparison to acetate film. That is, polyester film is not known to be susceptible to major base deterioration or shrinkage. The Image Permanence Institute (IPI) estimates that polyester film “will last five to ten times longer than acetate under comparable storage conditions”

Caring for Film-Based Art

Figure 20.6 The base type can be easily identified by placing the film roll on a light table: acetate base (left) does not permit light through the film pack, while polyester base (right) appears translucent. Photo: John Klacsmann

(Reilly 1993, 15). Because of its long-term stability and durability, polyester films should be used when producing intermediates (e.g., internegatives, interpositives, dupe negatives) or new prints today. Polyester films cannot be cement spliced like nitrate and acetate films. A tape splicer or an ultrasonic splicer, which welds the polyester ends together, must be used instead. Polyester motion picture film can be identified by passing light through a roll. More light will pass through the roll when compared to an acetate film (see fig. 20.6).

20.5.3 Film Wind A roll of film that “reads” correctly through the base side is called B Wind or B Type, while one that reads correctly through the emulsion side is called A Wind or A Type (see fig. 20.7). Since the film is loaded in a camera with the emulsion facing the lens, camera original films are B Wind (i.e., they read correctly through the base side). In contact printing (see Section 20.6.2), films are duplicated emulsion-to-emulsion. This results in each generation of a contact printed film changing winds (e.g., a print contact printed from a B Wind original negative will be A Wind; an internegative contact printed from an A Wind reversal print will be B Wind). When working with single perforated 16 mm films, printing must be done appropriately to ensure single perforated prints can be threaded into a projector and still read properly. As such, A-wind single-perforated film elements must be contact printed from the tail (see fig. 20.8). The wind information should always be documented during inspection because it can help identify a film’s generation and will dictate how the roll is printed or duplicated at a


Figure 20.7 A roll of film that “reads” correctly through the base side is called B Wind, while one that reads correctly through the emulsion side is called A Wind. Source: Association of Cinema and Video Laboratories 1982, 16

laboratory. Single-perforated films’ wind can be identified by the position of their perforations (see fig. 20.8). Double-perforated films’ wind can be identified by locating directional evidence within the content of the film itself. In this case, the text is most commonly used to determine through which side of the film it reads correctly (see fig. 20.7).
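Because each pass through a contact printer flips the wind, the wind alternation can be modeled as a simple toggle. A minimal illustrative sketch (contact printing only; as noted in fig. 20.8, an optical printer can be loaded to produce either wind):

```python
def contact_print_wind(source_wind: str) -> str:
    """Contact printing duplicates emulsion-to-emulsion, so each
    generation flips the wind of its source element."""
    if source_wind not in ("A", "B"):
        raise ValueError("wind must be 'A' or 'B'")
    return "A" if source_wind == "B" else "B"

# B Wind camera original -> A Wind print -> B Wind internegative:
contact_print_wind("B")                      # "A"
contact_print_wind(contact_print_wind("B"))  # "B"
```

This is one reason documented wind is evidence of generation: an A Wind element is an odd number of contact-printing generations removed from a B Wind camera original.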

20.5.4 Types of Film Stocks While a wide variety of film stocks have been produced by manufacturers over the last 120 years, when inspecting a roll of film, one should at a minimum identify the type of film stock: negative, positive, or reversal. The film type will dictate some general characteristics of the roll and will determine the workflows used during duplication and digitization. Within these three types of films, there can be three categories of stocks, each engineered for a different purpose:

• Camera films: made to be shot in a camera; usually films shot with these stocks will be “originals.”
• Intermediate films: made to be used in a film laboratory for producing master elements (i.e., internegative, interpositive, or dupe negative) from originals or other masters.
• Print films: made for producing prints from negatives (or, in the past, from reversal originals/masters)


Figure 20.8 (1) Raw stock can be packaged as either A Wind or B Wind. (2) Camera original film is B Wind, and a contact print from it will be A Wind. The wind of single perforated films can be determined by the position of the perforations on the roll. (3) Films are printed emulsion-to-emulsion in contact printers producing alternating winds. For example, a B Wind original will produce an A Wind print, while an A Wind internegative will produce a B Wind print. (4) The wind can be confirmed by examining writing in the image relative to the base or emulsion. (5) Optical printing inverts the image: a master is typically run upward through the optical printer’s projector gate. But unlike contact printing, the optical printer’s projector can be loaded in such a way to produce an element of either wind. Source: Case 2001, 113

Negative Negative films produce the opposite of what we see. A negative film must be printed onto another piece of film (a positive) to be properly viewed or projected. For this reason, negative and positive films can be thought of as a “system”—the two types of film are designed and manufactured to be used in conjunction. Negatives are usually master materials: camera original films or intermediate films (e.g., internegatives or duplicate negatives), from either of which positives can be produced.


Figure 20.9 Different types of B&W film stocks, from left to right: camera negative, fine-grain master, dupe negative, positive print, camera reversal, reversal print. Photos: John Klacsmann

Figure 20.10 Different types of color film stocks, from left to right: camera negative, interpositive, dupe negative, camera reversal, internegative, positive print. Photos: John Klacsmann

Negative films are produced in B&W and color. B&W negative films can be identified by their inverted image and, usually, a gray or blue base. Color negative films can be identified by their inverted image and an orange or pink base (this is called the integral dye mask). B&W negatives date to the origins of motion picture film. Modern color negative stocks with integrated dye couplers were introduced by Kodak in 1950. This chromogenic negative/positive system was later adopted by other major film manufacturers and is the basis of all currently available color negative/positive film stocks.

Positive Positives are most commonly prints: made from negatives for the purpose of film projection. Positive print films are produced in B&W and color. Both B&W and color positive print films can be identified by their “normal” images and clear bases. Additionally, there are positives which are intermediate films: master elements made from negatives for the purposes of producing new duplicate negatives. These low-contrast


intermediate master positives can also be excellent sources for digitization. These elements are called fine-grain masters (for B&W films), master positives, or interpositives (for color films). They are considered master materials and are not made for projection purposes. B&W master positive films can be identified by their low contrast “normal” image and a gray or blue base. Color master positive films can be identified by their low contrast “normal” image and orange or pink base. Alongside B&W negatives, B&W positives go back to the origins of motion picture film. Similarly, modern color positive stocks with integrated dye couplers were introduced by Kodak alongside their negative counterparts in 1950.

Reversal Reversal films are produced in B&W and color and can be identified by their deep black edges. Reversal films are “direct positive” films, meaning reversal camera stocks will produce “regular-looking” images rather than inverted images as with negatives. It also means you can make reversal elements directly from other reversal elements, with one appearing the same as the other, no negative/positive alternating required. B&W reversal films were first introduced by Kodak in 16 mm so that amateur filmmakers would not need to go through a negative/positive printing process at a film lab. Instead, the original reversal camera film could be processed, returned, and projectable with no printing required (Kattelle 2000, 80). Reversal films are also widely used in 35 mm mounted photographic slides for a similar reason (see Chapter 21). Reversal film stocks were commonly used for all three categories of stocks: camera films, intermediates (e.g., printing masters), and prints (i.e., reversal prints made from reversal originals or masters). As such, it can be challenging to properly identify the category of stock (camera, intermediate, print) once you have identified a roll as a reversal element since they can all look similar. See Section 20.5.11 for more information about identifying original, master, and print elements. Kodachrome, the most recognizable color reversal film, was introduced by Kodak in 1935 in 16 mm. Known for its saturated colors and remarkable color stability, it required complex processing. Kodachrome was discontinued by Kodak in 2009. Ektachrome is another widely used color reversal film that was originally introduced by Kodak in 1958 for professional use. Compared to Kodachrome, it has a simpler development process and a faster speed. Ektachrome Commercial (ECO) was a low-contrast camera version of the stock.
Being low-contrast, this camera film is not projection ready like many other color reversal stocks, and instead, color reversal prints (or an internegative) were meant to be produced from it (Pittaro 1979, 291). While now discontinued, reversal print stocks were also available to produce prints directly from reversal originals or masters. While reversal film printing and processing were more expensive than negative/positive printing, it was more economical if only a few prints were needed. For this reason, reversal film stocks were widely adopted by artists who often were not producing high quantities of prints.

20.5.5 Date Codes and Edge Printing Several film manufacturers printed the year of manufacture of the roll on the outer edge of the strip. Kodak’s coding initially varied across regions and gauges, although their most common codes are provided in fig. 20.11. Kodak’s system primarily used one or two symbols that


EASTMAN KODAK DATE CODES
1916–2002: combinations of one or two printed symbols identifying the year of manufacture (the symbols themselves are not reproducible in plain text).

KODAK KEYKODE DATE CODES
1989 DE   2001 TF   2013 KL
1990 LE   2002 ML   2014 DE
1991 EA   2003 NE   2015 FA
1992 AS   2004 KA   2016 LS
1993 ST   2005 DS   2017 ET
1994 TM   2006 FT   2018 AM
1995 MN   2007 LM   2019 SN
1996 NK   2008 EN   2020 NM
1997 KD   2009 AK   2021 KN
1998 DF   2010 TK   2022 DK
1999 FL   2011 MD   2023 FD
2000 SD   2012 NF   2024 LF

DUPONT DATE CODES
1956 KL   1966 KLT
1957 KN   1967 KNT
1958 KS   1968 KST
1959 LN   1969 LNT
1960 LS   1970 LST
1961 NS   1971 NST
1962 K    1972 KT
1963 L    1973 LT
1964 N    1974 NT
1965 S

FUJI DATE CODES
First two digits of the four-digit code.

KODAK PLANT OF ORIGIN CODES
The position of a symbol within the “KODAK SAFETY” edge print (shown here as a gap) identifies the plant:
K ODAK S AFETY   U.S.A. (Rochester)
KOD AK S AFETY   U.S.A. (Colorado)
KO DAK SA FETY   Canada
KOD AK SAF ETY   U.K. (Limited)
KODA K SAFE TY   France (Chalon)
Figure 20.11 Date code reference chart. Source: John Klacsmann



Figure 20.12 Kodak date code on 16 mm camera reversal film. +O decodes to 1934, 1954, or 1974. This piece of film is from 1954. Photo: John Klacsmann

repeated every 20 years. In 1982 they adopted a system of unique three-symbol codes before moving to a system of two-letter codes or simply the numerical year itself. Keep in mind, date codes signify the year the stock was manufactured, not the year the film was shot. This is an important consideration because filmmakers sometimes shoot stock that is several (or many!) years old. Nonetheless, this information should be cataloged because it can be useful in dating the age of a roll of film.

Other information printed on the edges can be useful as well. For example, stock information or the location of the stock manufacturing plant are often included in edge printing. One should be aware of “print-through,” however: in non-camera original films, edge information can be carried over, or printed through, from a previous generation’s edge information (see fig. 20.13). As such, take extra care in interpreting edge information on intermediates or prints. Interpreting different generations of printed-through edge information can provide valuable evidence about the production history and the status of the film element (i.e., whether it is several generations removed from the original or master elements).

For more detailed information on Kodak’s date codes, see their publication A Guide to Identifying Year of Manufacture for KODAK Motion Picture Films (Eastman Kodak Company 2013). Additionally, the second edition of the International Federation of Film Archives (FIAF) publication Physical Characteristics of Early Films as Aids to Identification (Brown and Bolt 2020) is by far the most comprehensive guide to interpreting the edge information and date codes of most film manufacturers, past and present.
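Because the pre-1982 symbol codes repeat every 20 years, a single code yields several candidate manufacture years that must then be narrowed using other evidence (stock type, edge printing, provenance, content). A small illustrative sketch of that expansion:

```python
def candidate_years(first_year: int, era_end: int = 1981) -> list[int]:
    """Expand a pre-1982 Kodak date code into all candidate manufacture
    years: symbol combinations repeated every 20 years until Kodak
    switched to unique three-symbol codes in 1982."""
    return list(range(first_year, era_end + 1, 20))

# The "+O" code shown in fig. 20.12 first denotes 1934:
candidate_years(1934)  # [1934, 1954, 1974]
```

For the roll in fig. 20.12, the candidates were narrowed to 1954 using the film's other characteristics, exactly as an inspector would in practice.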

20.5.6 Optical Soundtracks Optical soundtracks are photographic recordings of sound and, as a result, are printed and processed similarly to the picture. This allowed sound-on-film technology to be easily adapted into existing formats and systems when introduced in the 1920s and subsequently adopted widely. Optical tracks are still used today in the production of sound projection prints. When encountering an optical track film element, or an optical track on a film print, one should identify the type: variable area or variable density (see fig. 20.14). Variable area tracks are the most common, and there are many variations. They appear as one or multiple vertical waveforms along the edge. Variable density tracks were a competing type of track but are less common. They appear as horizontal lines of differing densities. In the 1990s, digital optical soundtracks (e.g., Dolby Digital, DTS, SDDS) offering surround sound were introduced in 35 mm. They appear as bit patterns (Dolby Digital and SDDS) or a timecode pattern of dashes (DTS). The standard running speed of films with optical soundtracks is 24 fps.


Figure 20.13 Print-through of a contemporary edge code: The 16 mm color camera negative’s date code (top) is printed through to the edge of the positive print (bottom): “7219” is a Kodak color negative stock number and “LS” decodes to 2016. Photos: John Klacsmann

Optical Track Negative Most camera originals or intermediates will not have soundtracks on them. Sound is usually recorded onto its own magnetic tape or digitally, separate from the picture. This is called a “double system” sound production. The soundtrack is edited alongside the picture, and when completed, a lab records the audio onto its own roll of film using a B&W optical soundtrack negative stock. This optical track negative is the sound printing master (see fig. 20.14). It is then synchronized to the picture and printed, alongside the picture, onto a single piece of print film for projection. Be aware—the soundtrack is printed a set number of frames ahead of its corresponding image frames. This is because a projector’s sound head is below the gate through which the image is projected. Optical tracks must always be printed via contact, as optical printing would introduce frame lines into the track. See Section 20.6.2, “Contact Printing,” for more information about film printing. A print that has both image and sound is called a composite print. Much less common are optical tracks from a “single system” recording setup. Here, the picture and the optical track are captured simultaneously within a camera and onto the same piece of film. 16 mm film cameras, such as the Auricon Cine-Voice, manufactured in the


Figure 20.14 Optical soundtracks, from left to right: optical soundtrack negative (variable area type), optical soundtrack positive on a positive projection print (variable area type), and an optical soundtrack positive on a reversal projection print (variable density type). Photos: John Klacsmann

1950s, could record a picture and an optical track at the same time onto a single piece of single perforated 16 mm film.
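The sound advance described above, where the printed track leads its picture frame, has fixed standard values in projection practice. A small sketch; note the advance figures here (21 frames for 35 mm, 26 frames for 16 mm) are the commonly cited standard offsets and are not given in the text above, so verify them against current SMPTE practice:

```python
# Commonly cited standard picture-to-sound advances for optical tracks
# (assumed values, not stated in this chapter; verify against SMPTE practice).
OPTICAL_SOUND_ADVANCE_FRAMES = {"35mm": 21, "16mm": 26}

def sound_lead_seconds(gauge: str, fps: float = 24.0) -> float:
    """Time by which a printed optical track leads its picture frame."""
    return OPTICAL_SOUND_ADVANCE_FRAMES[gauge] / fps

round(sound_lead_seconds("16mm"), 2)  # 1.08
```

This offset is why a track printed in dead sync with the picture would play audibly late: the projector's sound head sits roughly a second's worth of film below the gate.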

Optical Track Positive While the optical tracks on prints are generally optical track positives (printed from the optical track negative), optical track positives are less common as stand-alone rolls. Sometimes optical track positives are printed directly from optical track negatives onto B&W print stock for archival purposes or to serve as a sound master during preservation/restoration when no better sound materials, like a full-coat magnetic soundtrack, survive. These elements are sometimes called track prints.

20.5.7 Magnetic Soundtracks Full-Coat Magnetic Tracks Full-coat magnetic film (“full-coat mags”) can be found in 35 mm, 16 mm, and even Super 8 mm film gauges. They are manufactured to the same specifications as their motion picture film counterparts, including perforations, except the surface is completely covered with iron oxide (see fig. 20.15). This allows for accurate synchronization during editing and playback


Figure 20.15 Magnetic soundtracks, from left to right: 16 mm full-coat magnetic film, a magnetic stripe track on a 16 mm color reversal camera original, and a magnetic stripe on a Super 8 mm B&W positive print. Photos: John Klacsmann

with a 1:1 relationship of image to sound frames. When a film’s soundtrack was completed, a full-coat magnetic track was typically used as a master format from which to shoot the optical track negative. Since full-coat magnetic tracks have higher fidelity than optical tracks, they’re used, when possible, to restore and/or digitize a film’s soundtrack. Multiple audio tracks can be stored on a single full-coat magnetic track depending on the equipment and layout utilized during production. In 35 mm, striped variations were also manufactured in which only a portion of the surface of the film is coated as a cost-saving measure. Like films with optical tracks, full-coat magnetic tracks run at 24 fps.

Magnetic Stripe Tracks Magnetic stripe tracks were also utilized in “single system” sound cameras, most commonly in Super 8, but also in 16 mm and even Regular 8 mm (see fig. 20.15). In these cases, raw camera stock was manufactured with magnetic stripes along the edges of the film. Compatible film cameras were able to shoot picture and record sound to the magnetic stripe simultaneously and


onto the same piece of film. Film projectors were equipped with magnetic sound head readers to play back the magnetic sound on the stripe. Some projectors were even able to record onto the stripe so that filmmakers could record (or re-record) to the stripe after the film was shot and processed. Additionally, sound film prints could be manufactured with recorded magnetic stripes, and previously processed films without magnetic stripes could be “striped” so magnetic tracks could be added. For further information about various magnetic sound media used in film production, see Film into Video: A Guide to Merging the Technologies, second edition (Kallenberger et al. 2000). Additionally, Alan Kattelle’s Home Movies provides a detailed account of the development of magnetic stripe sound technology for 16 mm, Regular 8 mm, and Super 8 mm gauges (Kattelle 2000).

20.5.8 Base Deterioration Cataloging the level and severity of base decomposition during inspection is critical to avoid further damage (e.g., running a brittle or shrunken film through a printer or projector) and/or to prioritize a film for duplication/digitization. Acetate base deterioration (a.k.a. vinegar syndrome) is common and irreversible. Polyester-based film is much more stable, and there is currently no widely adopted system for measuring deterioration.

Nitrate Base Decomposition The International Federation of Film Archives (FIAF) has designated five stages of nitrate film decomposition (Volkmann 1966, 6):

1. Fading of the silver image and brownish discoloration of the emulsion: the film gives off a strong odor.
2. The emulsion becomes sticky.
3. The film becomes soft in parts (“formation of honey”) and blisters.
4. The entire film congeals to a solid mass.
5. The base disintegrates to a pungent brown powder.

These numbered stages are commonly used to rate nitrate decomposition on a 0–5 scale during inspection. See Section 20.5.2 for more information, and warnings, about nitrate film, which is flammable and can self-combust.

Acetate Base Deterioration (a.k.a. Vinegar Syndrome) In the presence of heat, moisture, and acid, decomposing cellulose acetate film releases acetic acid (Reilly 1993, 10). The acetic acid has a vinegar odor, hence the nickname “vinegar syndrome.” This reaction becomes autocatalytic, or self-producing. As the Image Permanence Institute (IPI) describes it, “as the decay progresses, materials become more acidic, degrade at an ever faster rate, and eventually are irreversibly damaged” (Adelstein 2009, 3). Decomposition cannot be reversed once it starts but it can be significantly slowed by storing the deteriorating film in cold or frozen conditions. See Section 20.7.1 for more information about film storage. Other side effects of deterioration are embrittlement and shrinkage (Reilly 1993, 11; see fig. 20.16). These physical changes can make handling, duplicating, and digitizing of a roll of film difficult, expensive, or nearly impossible. The percentage a film is shrunken should be


Figure 20.16 A 16 mm roll with severe vinegar syndrome, which has warped and spoked: when cellulose acetate deteriorates under humid, warm and/or acidic conditions, the film base can shrink, warp, and become brittle. Photo: John Klacsmann

regularly measured and documented with a film shrinkage gauge during inspection (see fig. 20.4). The shrinkage percentage may dictate the means of duplication/digitization, as some machine transports are less tolerant of shrinkage than others. Generally speaking, films with shrinkage above 1% likely will not be able to be run through standard film equipment with sprocket drives, like film projectors, contact printers, and footage counters. Individual film labs will be familiar with the tolerances of their equipment. IPI has developed and manufactures A-D Strips for testing the severity of acetate decomposition. A small paper strip is placed on a roll of film and enclosed in a can for a set period of time (24 hours at room temperature; longer in colder conditions). The resulting color of the strip is matched against a guide and assigned a corresponding level of decomposition. The scale measures 0 (blue; no deterioration) to 3 (yellow; critical). Level 1.5 (green) indicates the point of autocatalytic decay (Image Permanence Institute 2020, 1). This same numbered scale is commonly used to rate acetate deterioration on a 0–3 scale during inspection and immediately after an A-D Strip test.
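Dedicated shrinkage gauges read the percentage directly, but the underlying arithmetic is simple: compare the measured span of a known number of perforations against its nominal length. A sketch; the default pitch value here (0.3000 in, the commonly cited 16 mm long-pitch figure) is an assumption for illustration and is not taken from the text, so substitute the nominal pitch of the stock at hand:

```python
def shrinkage_percent(measured_inches: float, perf_count: int,
                      nominal_pitch_inches: float = 0.3000) -> float:
    """Percent shrinkage over a measured span of perforations.

    The default pitch is an assumed 16 mm long-pitch value; use the
    correct nominal pitch for the gauge and stock being inspected.
    """
    nominal = perf_count * nominal_pitch_inches
    return (nominal - measured_inches) / nominal * 100.0

# 100 perforations of unshrunken 16 mm would span 30.00 in; a span
# measuring 29.70 in indicates 1.0% shrinkage -- at the threshold
# above which sprocket-drive equipment generally cannot be used.
shrinkage_percent(29.70, 100)
```

Measuring over a long span of perforations, as gauges do, reduces the effect of measurement error on the computed percentage.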


20.5.9 Color Dye Fading Many film archives rate the degree and severity of color fade on a subjective 0–4 scale during film inspection. Knowing if a film has lost some of its color dyes can justify preservation and/or digitization prioritization. While some stocks and certain production eras are worse than others, all dyes in color films will fade over time. With motion picture film, fading occurs primarily in the dark, that is, over time and while in storage inside a can. Color film dyes (cyan, magenta, yellow) typically fade at different rates, causing a shift in color balance over time. Dye fading also results in a loss of density, making images look lighter or flatter than they originally did (IPI—Visual Decay Guide | Dye Fade 2021). In 1980 Kodak acknowledged and published data relating to the (in)stability of the dyes in their color stocks for the first time, while promising to continue to publish such data for future color products (Wilhelm and Brower 1993, 305). Indeed, positive projection prints from that era that appear magenta (i.e., that exhibit cyan dye fading) are now a common and recognizable example (see fig. 20.17). While this acknowledgment and subsequent data led to major improvements in color dye stability in the following decades, color dye fading will remain an issue whose progress only proper cold and dry storage can effectively slow. See Section 20.7.1 for more information about film storage. For films that are already faded, digital restoration and color correction can help to rebalance and offset dye loss to some degree. Faded prints should be retired and not exhibited since they no longer represent how the film was intended to look. For further reading about color dye fading, see the IPI Storage Guide for Color Photographic Materials (Reilly 1998) and Chapter 9, “The Permanent Preservation of Color Motion Pictures,” in The Permanence and Care of Color Photographs (Wilhelm and Brower 1993).

20.5.10 Film Damage/Footage/Splice Count Other film damage should be noted during inspection. As with color fading, many institutions rate the following damage on a subjective 0–4 scale during inspection:

• Emulsion scratching
• Base scratching
• Perforation damage
• Edge damage
• Warpage

Base scratching can mostly be hidden in duplicates or digital masters via the use of a liquid gate (see Sections 20.6.2, “Printing with Liquid Gates,” and 20.6.3, “Liquid Gate Scanning”). Emulsion scratching is difficult to hide during duplication since the actual image carrier, rather than the plastic support, has been damaged. Rewashing can aid in lessening the severity of some minor emulsion scratching (see Section 20.6.1, “Rewashing”). When possible, repairs should be made with a tape splicer, especially to splices that are coming apart or to tears. Cement (for nitrate and acetate) or ultrasonic (for polyester) splices may age better long-term, but typically, one must cut out at least two frames to make the splice. Since this is a destructive act, it should be avoided. Damaged edges and perforations can be smoothed out with scissors or repaired with tape so that they don’t catch during winding or in a machine and cause further damage.


Figure 20.17 This faded 16 mm color positive print of Harry Smith’s Early Abstractions displays a strong color shift toward magenta, as its cyan dye layer has faded. Photo: John Klacsmann

Other more severe damage in original or master materials could be noted and left to skilled lab technicians prior to duplication or digitization. Repairs to projection prints may be best left to experienced projectionists. An exact footage count can be useful for comparison purposes or to check completion. This is typically done in a footage counter or synchronizer block, which counts film length in feet and frames. If a synchronizer block is not on hand or the roll is fragile, damaged, or decaying, one may want to use a footage ruler instead, a specially marked stick to roughly estimate the length of a roll from a core’s center. Similarly, it is useful to log the total number of non-edit splices and tears in a roll to document if it is complete or may be missing footage.
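A footage counter's feet-and-frames readout is a fixed conversion from frame count. A minimal sketch; the frames-per-foot figures are the standard values for the two professional gauges, not stated in the paragraph above:

```python
# Standard frame counts per foot (assumed standard values).
FRAMES_PER_FOOT = {"16mm": 40, "35mm": 16}

def feet_and_frames(total_frames: int, gauge: str) -> tuple[int, int]:
    """Convert a raw frame count into the feet + frames readout of a
    footage counter or synchronizer block."""
    return divmod(total_frames, FRAMES_PER_FOOT[gauge])

feet_and_frames(3625, "16mm")  # (90, 25), i.e., 90 ft 25 fr
```

Recording length this way makes rolls directly comparable across elements: two elements of the same work that differ by even a few frames may signal missing footage or unlogged splices.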

20.5.11 Identifying Original, Master, and Print Materials Determining if you have an original or master material, as opposed to a print, is important to understanding and evaluating your materials. When possible, duplication and/or digitization should be done from the best surviving materials that exist, even if not held by your institution. Original and master materials should never be projected and should be handled with care. In some cases, original and/or master materials may be lost or missing. Only after that has been determined should duplication and/or digitization proceed from a print. In these cases, the best surviving (or only) print should be treated as master material and not projected. The determination of an original or master should take into account many pieces of information: in addition to your basic inspection information and any available documentation on the provenance of an element, clues from the film stock itself, its winds, and edits can help to identify the status of a film element.

Interpreting the Film Stock, Film Winds, and Editing Traces Based on the film stock at hand, the following questions help to structure the examination and identification of the status of a film element.

Negative
• Is it edited/does it have splices at each shot? If so, it is likely the original negative.
• If not, can you identify it as a complete intermediate negative (internegative or dupe negative)?

Caring for Film-Based Art

Reversal
• Is it a reversal print (e.g., composite print with soundtrack, “printed in” splices)?
• What wind is the film? An A Wind roll is usually not a camera original.
• Does it have splices at every edit? Be aware—a workprint is a print made during production for editing purposes and may look similar to a reversal original. Workprints are usually edited at every shot with tape splices and are silent prints. They’re often battered from running through a flatbed editor many times.
Positive
• Can you identify it as a fine-grain master (B&W) or interpositive (color)? An interpositive is not the original but is a master element.
• If it is a positive print, it is likely not the original. The positive print would have been made from a negative—either the original or an internegative/dupe negative.
Hand-painted, scratch animation, or other surface intervention
• Likely the original or at least a unique version of the film

Considering A+B Rolls
Since there is only a very small frame line in 16 mm, A+B rolls are used extensively when splicing/editing originals or masters. Using this splicing method, shots alternate between each roll in a checkerboard pattern, and as a result, the cement splices are not “printed in” to any frames in elements printed from the A+B rolls (see fig. 20.18). When a shot appears in the A roll, the black leader will appear in the B roll at the same location, and vice versa. This method of cutting is also used in 16 mm and 35 mm for producing dissolves and superimpositions during contact printing, which is much more cost-effective than having to produce the same effects in an optical printer. In these cases, it is possible to have even more than two rolls—with the film being made up of A, B, and C (or more) rolls. Since A+B rolling is commonly used when cutting an original or master element, it is usually found only in negative or reversal films. If you are working with negative or reversal A+B

Figure 20.18 Especially in 16 mm film, editing of negative or reversal original elements was often carried out over A+B rolls using a checkerboard technique. This means that an edit master often consists of two (or more) film elements (Association of Cinema and Video Laboratories 1982, 11).


John Klacsmann with Julian Antos

rolls, they are likely original or master materials. In these cases, be sure you are in possession of all rolls; an A roll without the corresponding B roll will typically only represent about half of the shots of the completed reel!

20.6 Duplication and Digitization
Once the original or best surviving film elements have been identified, inspected, and documented, photochemical duplication, often called film preservation, and/or the digitization process can begin. Duplication and digitization extend the life of the original film object via the production of high-quality copies. As such, great care must be taken when duplicating or digitizing a film so that it is done correctly and at the highest quality possible. The FIAF Technical Commission provides the following guidelines to keep in mind in their Preservation Best Practice publication (FIAF Technical Commission 2009, 2):

• Preservation must be entrusted to specialized laboratories within or outside the Archive, with a proven record of handling archival film to the highest possible standards of quality, safety, and security. Archives are responsible for identifying the laboratories that best meet these standards.
• No loss of quality in preservation duplicates is acceptable beyond what is unavoidable in analog duplication. For example, image characteristics, such as aspect ratio, format, and so on, must be maintained to the limit of available techniques, the original gauge and format should be retained whenever possible, and reductions (such as duplication from 35 mm to 16 mm) should be avoided. Similarly, when migration or reformatting are performed as part of digital preservation, the original quality of the content must be maintained: lossy compression, reduction of resolution, or bit depth are to be discouraged.
• Because the ultimate goal of preservation is to extend the life expectancy of the original work and to allow for future access, the use of the best available techniques and materials (e.g., polyester base films vs. acetate, well-established film stocks and equipment) is essential.

The best general resource for further information related to photochemical and digital workflows within a film laboratory is Dominic Case’s Film Technology in Post Production (Case 2001). Steven Ascher, Edward Pincus, and David Leitner’s The Filmmaker’s Handbook (Ascher et  al. 2019) is an excellent and comprehensive technical production guide covering film and digital workflows. Finally, Eastman Kodak’s The Essential Reference Guide for Filmmakers (Eastman Kodak Company 2007) is a comprehensive overview of analog and digital film technologies.

20.6.1 Preliminary Steps
Repairs
All film material must be inspected and repaired prior to duplication or digitization. Running film materials through any machine, be it a film cleaner, a printer, or a scanner, is a risk. Careful repairs will minimize the risk of mechanical film damage during duplication or digitization. Most often, tape splicers and pre-perforated tape will be used to secure torn or broken frames, resecure splices, or strengthen broken perforations. Additionally, damaged edges may be trimmed or smoothed to prevent the film from catching and tearing during relatively high-speed printing or scanning.


If duplicating or digitizing at a film lab, skilled lab technicians will inspect the film and can perform necessary repairs as a first step. Lab technicians will be familiar with the machines used at the lab and the best repair techniques for safe transport.

Cleaning
Most film should be cleaned prior to duplication to remove any surface dirt, particles, oil, or residue. Particle Transfer Rollers are sticky circular buffers through which a film can be wound to safely remove loose surface contamination without the use of liquids. Most labs utilize ultrasonic film cleaners, like those manufactured by Lipsner-Smith, which use the solvent perchloroethylene or trichloroethylene and high-frequency sound waves to agitate and remove dirt and residue. The use of perchloroethylene, which has been widely used in dry cleaning, may be banned in some territories. Film run through an ultrasonic cleaner is also buffed and dried, and the process is one of the quickest and most effective methods for cleaning film. Particularly dirty film can be run through an ultrasonic cleaner multiple times prior to duplication. Be aware that some painted, intentionally scratched, or handmade artists’ films should not be cleaned. Cleaning these types of films may ruin or harm them.

Rewashing
Films that have minor to moderate emulsion scratching can be “rewashed” prior to duplication/digitization. This process feeds the roll through the wash and dry steps of a film processing machine. While wet, the emulsion swells, and once dried, some minor scratches can fuse back together, removing them. Rewashing is effective but can be risky for older acetate films, and it is a treatment offered by few of the labs that still run film processing machines. This process can be useful when preserving a film from a print. Prints tend to have more scratches, from projection, than original elements, and rewashing can help lessen minor emulsion scratching prior to duplication.

20.6.2 Film-to-Film Duplication/Photochemical Film Preservation
When to Use Film-to-Film Duplication
Photochemical film-to-film duplication (commonly called film preservation) is a useful approach when there is a need to:
1. Produce an archival/protection film master on polyester stock from original materials. Polyester film stocks have long archival shelf lives, especially when stored in proper cold storage.
2. Produce a new printing negative dedicated to striking new exhibition prints (e.g., for looping display). Running original materials through a film printer to make prints carries some risk. By producing a new dedicated polyester printing negative (either an internegative or a dupe negative), the risk to the original material is minimized.

When working from original materials in good condition and utilizing modern intermediate stocks, photochemical duplication results should be excellent. Since film printers and processors are required, this necessitates working with a film laboratory.


In cases where there may be a need to produce high numbers of exhibition prints, such as for works that require looping projection, it is best practice to produce at least two negatives: one designated for producing prints (a printing negative) and one for preservation (a preservation negative). Since there is always a risk of a negative getting scratched or damaged during printing, the preservation negative is placed in long-term archival storage and is not used for printing purposes, ensuring the long-term survival of the film work.
Using a photochemical duplication approach does not negate or disqualify one from also digitizing original film materials. It is usually appropriate to do both, assuming the budget allows, securing both an analog master on polyester film and a contemporary digital master with access files. Paul Read and Mark-Paul Meyer’s Restoration of Motion Picture Film (2000) is a classic textbook resource extensively covering the photochemical duplication and restoration of motion picture film.

Contact Printing
In contact printing, the duplicating master (e.g., an original, printing negative, or a preservation master) is threaded emulsion-to-emulsion with raw stock in the printer. The machine is run continuously at a relatively high speed by a sprocket drive. Both films are exposed to light through a small opening at the point of contact (called the “printing head”). Once printing is completed, the raw stock is processed. Contact printers usually have a secondary, separate head for printing optical soundtracks. If producing a composite print, for example, the raw print stock, the picture negative, and the optical soundtrack negative are all threaded into the contact printer. The raw stock is exposed to the picture on one head and the sound on the other head during one pass through the printer (see fig. 20.19).
Contact printers typically have lamphouses with dichroic mirrors, which split the light source into controllable red, green, and blue portions. The colors can be programmed to change at precise points in the printing process, resulting in a change to the color and density of the frames

Figure 20.19 Analog duplication is achieved either through the use of a contact printer (1) or an optical printer (2). RS = raw stock, IN = image negative/source, SN = sound negative. Source: Case 2001, 75



in the shots of the element being printed. The skilled process of programming the printer is called color timing or grading.
Contact printers are the most commonly used film printers because they are fast and therefore the most economical. Since film materials are duplicated via emulsion contact, resulting sharpness is optimal. When producing new film prints from new negatives, contact printers are almost always used. However, since the printers are sprocket driven and run at a high speed, they have a strict tolerance with regard to shrunken film. One should be cautious about running film with around 0.75% or higher shrinkage on a contact printer due to a higher probability of damage. Consult with the film lab before having them attempt to contact print shrunken film.
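The shrinkage figure above is typically derived by comparing a measured span of perforations against its nominal length. A minimal sketch, assuming the common long-pitch reference values (16 mm: 7.62 mm per perforation; 35 mm: 4.75 mm); the function names and threshold parameter are illustrative:

```python
# Estimate shrinkage from a measured span of perforations, then compare
# it to the ~0.75% contact-printing caution threshold mentioned above.
# Assumed nominal long-pitch values: 16 mm = 7.62 mm/perf, 35 mm = 4.75 mm/perf.
NOMINAL_PITCH_MM = {"16mm": 7.62, "35mm": 4.75}

def shrinkage_percent(measured_mm, perf_count, gauge):
    """Percent shrinkage over a span covering perf_count perforations."""
    nominal = NOMINAL_PITCH_MM[gauge] * perf_count
    return (nominal - measured_mm) / nominal * 100.0

def within_contact_printing_tolerance(measured_mm, perf_count, gauge, limit=0.75):
    """True if estimated shrinkage falls under the caution threshold."""
    return shrinkage_percent(measured_mm, perf_count, gauge) < limit

# 100 perforations of 16 mm measuring 755.5 mm instead of the nominal 762 mm:
print(round(shrinkage_percent(755.5, 100, "16mm"), 2))          # 0.85
print(within_contact_printing_tolerance(755.5, 100, "16mm"))    # False
```

A roll failing this check would be a candidate for optical printing or sprocketless scanning rather than contact printing.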

Optical Printing
In optical printing, the duplicating master is rephotographed to a raw stock frame-by-frame in an intermittent process. The duplicating master is loaded into the projector side of the machine. The projector is a simple single-frame gate with a pin which advances the duplicating master frame-by-frame. A lamphouse focuses light through the frame in the gate and is aligned precisely with the camera (see fig. 20.19). The raw stock is loaded into a camera magazine and threaded into the camera side of the machine. The camera and lens are attached by a bellows, allowing them to be independently adjusted and dialed in, controlling focus and zoom. Once set up, the frame in the projector gate is photographed by the camera, the projector is advanced one frame, and the next frame is photographed, continuing in sequence.
Optical printing is slow and therefore expensive. The sharpness of materials produced via optical printing can be “off” since focus must be carefully dialed in for each project setup. However, since the duplicating master is advanced with a pin and movement is slow, optical printers are gentle with film. They are also flexible since film can be loaded and run in numerous ways and there is independent control of the camera and projector sides of the printer. As such, optical printing was commonly used to duplicate shrunken, deteriorating, or otherwise fragile (e.g., hand-painted) film. For these reasons it was also the type of printer commonly used to produce analog special effects. Finally, optical printers were used to change gauges: to “blow up” a smaller gauge onto a larger one (e.g., Super 8 mm to 16 mm, 16 mm to 35 mm) or to produce a “reduction” (e.g., 35 mm to 16 mm, 16 mm to 8 mm). An optical printer is not practical for producing prints given its slow speed, expense, and the fact that optical tracks must be printed continuously (in a contact printer).
At most labs, work that previously would have been optically printed is now digitally scanned instead since non-sprocket-driven scanners can excel at capturing shrunken or deteriorating film. The digital copy can then be digitally restored and recorded back to film. See Section 20.6.4, “Hybrid Approaches: Digital Source Output to Analog Film”, for more information.

Printing with Liquid Gates
To minimize printing base scratches from a duplication master into a new film element, contact and optical printers can be fitted with liquid or “wet” gates (film scanners can similarly be fitted with liquid gates). A contact printer’s printing head is submerged in a tank with perchloroethylene, the same solvent often used in ultrasonic film cleaners, which both dries quickly and has a refractive index similar to that of the film’s base (Eastman Kodak Company 2003, 2). In the case of optical printing, perchloroethylene is circulated through a glass-enclosed projector gate. In both cases, during exposure, base scratches are filled in with the perchloroethylene, minimizing the


Figure 20.20 (1) Base scratches cause light refraction during printing, resulting in “printed in” scratches in the duplicate. (2) Liquid gates alleviate refraction from base scratches, so they are not printed into the duplicate. (3) In contact printers, printing heads are fully submerged during liquid gate printing. (4) In optical printers, projector gates are enclosed with glass, and the liquid is circulated through the gate. Source: Case 2001, 89

scattering of light that would cause the scratches to be printed into the new film element (see fig. 20.20). The film is dried immediately after exiting the liquid gate and prior to being wound by the printer. Liquid gate contact printers need to be run more slowly than standard “dry” printers. As with cleaning film, caution should be exercised with any material that may react adversely to perchloroethylene, such as hand-painted films. The use of perchloroethylene may be banned in some territories.

Duplicating from Negative Workflow
Once an original negative film has been identified, it can be sent off to a film lab for duplication. The same workflow would be used if necessary to produce new intermediates from another


type of negative (e.g., an intermediate negative, such as an internegative or a dupe negative). See fig. 20.2 for workflow diagram.
The negative would typically be inspected, repaired, and cleaned by the film lab, timed, and liquid gate printed—via contact if possible—to a so-called answer print, which is the lab’s first attempt at producing a print with proper density or color (for a more detailed explanation, see “Producing Projection Prints” in this section). Once timing of the answer print is approved, the same timing would be used to produce a low-contrast master positive or interpositive. In B&W, the interpositive is often called a fine-grain master. From the interpositive a dupe negative is struck. This is typically done with a “dry” gate on a contact printer since both elements are new. In color, the same stock is used for producing interpositives and dupe negatives. Contemporary interpositive and dupe negative stocks are on a polyester base.
The new dupe negative would then be timed (although since the previously set timing was built into the printing of the interpositive and dupe negative, it may not need many light changes) and new positive prints (with sound, if applicable) struck from it. This is again typically done on a “dry” gate printer since both film rolls would be new.

Duplicating from Reversal or Positive Print Workflow
Once an original or master reversal film has been identified, it can be sent off to a film lab for duplication. The same workflow would be used in the worst-case scenario: duplicating from a reversal or positive projection print, in cases where the original reversal or negative does not survive. This same workflow would also often be used when working from an original reversal/positive hand-painted or scratched film.
Just like the regular duplication workflow from a negative, the reversal/positive source would typically be first inspected, repaired, and cleaned by the film lab. Then, it would be timed and liquid gate printed—with a contact printer, if possible—to an internegative. In B&W, the internegative is often, somewhat confusingly, called a dupe negative. This is because there is just one B&W intermediate negative stock, while in color there are two: one for duplicating from reversal/positive (internegative) and one for duplicating from an interpositive (dupe negative). Contemporary internegative stocks are on a polyester base. See fig. 20.21 for workflow diagram.
The new internegative would then be timed and new positive prints (with sound, if applicable) struck from it. This is typically done on a “dry” gate printer since both film rolls would be new.

Duplicating Sound
Sound is duplicated separately from the picture and from the best surviving sound master. Typically, this is a mixed full-coat mag, which is digitized, restored, and/or remastered, synchronized to the picture’s digital video reference file, and recorded back to a new optical track negative. Other master sound elements could be a ¼″ audio tape, a DAT tape, or even a digital file, depending on the era the film was produced. Sometimes, no sound master survives and, in that case, an optical track positive must be used as a master and digitized instead. In this case, a new optical track positive is usually struck from the original optical track negative. If the original optical track negative does not survive, a print (or multiple prints) is used. Working from optical tracks is not ideal because their dynamic range is much less than that of a magnetic source, so it should be avoided whenever possible.


Figure 20.21 Workflow for duplicating a film from a reversal or positive element. Illustration: John Klacsmann

The new optical track negative is then synced with the new picture negative so that composite prints can be struck using both elements. The new digital sound master is archived as a digital file and can also be recorded to a polyester full-coat magnetic track so that it can additionally be archived on film.

Producing Projection Prints
Once a new printing negative is made, the process of producing acceptable new prints in coordination with the film lab begins. During this process, one works with a color timer (or grader), a lab technician who specializes in photochemical density and color correction. The color timer determines which printer lights to use and lays out this “timing,” determining where to change the lights. The color timer programs the film printer for a mixture of red, green, and blue, each on a 0–50 scale. This determines the color and densities of the resulting print. If B&W, all three colors will be programmed with the same light and work together simply to adjust the density. Typically, light changes occur at splices so that each shot and scene is corrected to give it an appropriate look. This printer programming is typically delivered in a readable format called a timing sheet, which details the light values and the footage at which each change occurs.
At the outset of producing new prints, one communicates with the color timer and specifies if the goal is to produce tungsten (3200 Kelvin) or xenon (5400 Kelvin) color balanced prints. The lamp type (tungsten or xenon) of the projectors the print will be exhibited with will determine this. Xenon is more common today, although many portable 16 mm projectors use tungsten or halogen lamps.
The first print produced in this process is called an answer print. The process of answer printing can continue until one approves a print, often in consultation with the artist who made the film. Answer printing is the most expensive part of producing prints, given the time and expertise required. It is critical when working through the answer printing of an older film to have a reference print. A reference print is usually an older print that best represents


how prints originally looked. The reference print can be used by the color timer as a target to match, and during quality control to suggest adjustments. Once an answer print is approved, its timing is used to produce a series of release prints. The release prints are the prints that are used for exhibition, while the answer print is typically archived alongside the negative so that it can be used for future reference purposes, especially when producing additional release prints in the future. Release prints are cheaper to produce than answer prints since the time and expertise of the color timer are no longer required. A check print is a print made once timing has already been set via an answer printing process, often to “check” that a negative is good or to ensure previously determined timing still holds.
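The light-change logic recorded on a timing sheet can be modeled as a footage-ordered list of entries on the 0–50 per-color scale described above. The footage points and light values in this sketch are invented for illustration; only the scale and the footage-keyed structure come from the text.

```python
# A timing sheet as footage-ordered printer-light changes. Each entry is
# (start_footage, (R, G, B)) on the 0-50 point scale; values are invented.
import bisect

TIMING_SHEET = [
    (0,   (25, 25, 25)),   # head of reel
    (42,  (27, 24, 23)),   # light change at a splice
    (118, (30, 26, 25)),   # a denser scene
]

def lights_at(footage):
    """Return the (R, G, B) printer lights in effect at a given footage."""
    starts = [start for start, _ in TIMING_SHEET]
    i = bisect.bisect_right(starts, footage) - 1
    return TIMING_SHEET[max(i, 0)][1]

print(lights_at(0))     # (25, 25, 25)
print(lights_at(50))    # (27, 24, 23)
```

For a B&W print, all three values in each entry would simply be set to the same light, adjusting only density.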

Film Stocks Used for Image and Sound Duplication
B&W Fine-Grain Master:
• Kodak Fine-Grain Duplicating Positive Film (35 mm: 2366, 16 mm: 3266*)
• Orwo Duplicating Positive Film (35 mm and 16 mm: DP31)
B&W Dupe Negative:
• Kodak Fine-Grain Duplicating Panchromatic Negative Film (35 mm: 2234, 16 mm: 3234*)
• Orwo Duplicate Negative Film (35 mm and 16 mm: DN21)
Color Interpositive and Dupe Negative:
• Kodak Vision Color Intermediate Film (35 mm: 2242, 16 mm: 3242)
Color Internegative:
• Kodak Color Internegative (35 mm: 2273, 16 mm: 3273)**
Optical Track Negative:
• Kodak Eastman EXR Sound Recording Film (35 mm: 2378, 16 mm: 3378)
• Orwo Sound Recording Film (35 mm: TF12d)
Full-Coat Magnetic Track:
• Kodak Analog Magnetic Sound Film 800 (35 mm)***
B&W Print Stock:
• Kodak Black-and-White Print Film (35 mm: 2302, 16 mm: 3302)
• Orwo Positive Print Film (35 mm and 16 mm: PF2)
Color Print Stock:
• Kodak Vision Color Print Film (35 mm: 2383, 16 mm: 3383)
* In August 2014, Kodak discontinued their B&W Dupe Negative stock in 16 mm (3234), although it may still be produced by special order. It is still available in 35 mm (2234). ORWO produces a comparable B&W Dupe Negative in 16 mm and 35 mm (DN21).
** In December 2020, Kodak discontinued Color Internegative (35 mm: 2273 and 16 mm: 3273). As a result, color internegatives will need to be produced on the slowest camera stock, a 50-speed daylight balanced film (35 mm: 5203 and 16 mm: 7203). Results will be similar to 2273 and 3273 Internegative, with the big downside being


these camera stocks are acetate-based instead of polyester. To produce a photochemical polyester master, an interpositive would have to be struck from the new acetate internegative.
*** In December 2020, Kodak discontinued full-coat magnetic sound film in 16 mm. 35 mm, which is still available, should be used in all cases.

20.6.3 Film-to-Digital Duplication/Digitization
When to Use Film-to-Digital Duplication
Film-to-digital duplication (or digitization) is a useful approach when there is a need to do the following:
1. Produce a digital file for access or online presentation purposes.
2. Produce a copy of the film for contemporary digital exhibition (e.g., HD or higher file, 2K/4K Digital Cinema Package, etc.).
3. Archive a high-resolution digital copy of an actively deteriorating or fading film master.
4. Duplicate a film that is deteriorating, warped, and/or shrunken and may not travel through a film printer.
5. Restore a color faded film and/or one that is damaged (e.g., has tears in frames).
6. Restore a film from a variety of sources that must be conformed to build a complete version of a work.

Digital film-scanning technology, which captures a film to individual digital frames sequentially, has advanced dramatically in the last decade. This technology has largely replaced telecines, which transfer films to video signals/formats in real time. There are a variety of manufacturers of multi-gauge film scanners, and prices have come down to the point where many institutions have purchased film scanners for on-site use in media conservation labs. These more affordable scanners may be more limited, however, and depending upon how much film there is in the collection and to what level you need the film digitized (access vs. archival), working with an outside film lab may still be desirable. Most film labs have film cleaning and liquid gates, as well as higher-end film scanners, with higher-resolution/higher-quality sensors, better optics, and better movements, which may be needed for certain projects, especially ones that involve film elements that are deteriorating, fading, or require a hybrid approach with intended output to film. Labs also have technicians and colorists on staff with expertise in film-scanning and the digital grading and color correction of film. In short, as FIAF has noted, “It is rarely considered that one scanner is able to handle all types of material equally well. Those caring for collections will want to obtain access to as great a variety as possible for consideration in individual cases” (FIAF Technical Commission n.d.).
Digitizing a film does not disqualify one from also photochemically duplicating film materials, when possible. It is usually appropriate to do both, assuming the film is in good enough condition and the budget allows, securing both an analog master on polyester film and a contemporary digital master with access files. For information about film-scanning technologies, see Film into Video: A Guide to Merging the Technologies, second edition (Kallenberger et al. 2000).
Quantel’s Film in the Digital Age is also an excellent technical guide (Throup and Pank 1996). Finally, FIAF’s Technical Commission has a host of online resources and specifications that should be consulted (FIAF—Resources n.d.).


Types of Film Scanners
There is a wide variety of film scanners with different methods of transporting the film and capturing/converting frames to digital images. The principle of all film scanners is the same, however. The film is moved across a gate which is illuminated by a light source. A sensor attached to a lens captures the image in the gate and is responsible for the conversion to a digital image. Sensors are built for different digital resolutions. In motion picture film scanning, 2K (2,048 pixels horizontally, vertical pixel count determined by aspect ratio), 4K (4,096 pixels horizontally, vertical pixel count determined by aspect ratio), and higher resolutions are common. Most film scanners use either charge-coupled device (CCD) or complementary metal oxide semiconductor (CMOS) sensors. Color is recorded in an RGB color space by one of the following methods:
1. Three monochrome sensors: using a prism, the image is separated and color filtered to three sensors, one each for red, green, and blue.
2. A single monochrome sensor with multiple captures per frame: each frame is recorded three times sequentially (with red, green, and blue light respectively) using a single monochrome sensor. This method is slow since it requires intermittent movement and multiple captures per frame. A similar principle can also be used to capture a single frame with multiple levels of light exposure, increasing the dynamic range recorded (FIAF Technical Commission n.d. Part 1).
3. A single sensor: one sensor that captures red, green, and blue at once, using a mosaic color filter, like a Bayer matrix filter. Bayer filters do not have an equal proportion of each color in their filter arrays. Therefore, they do not capture each color at full resolution like the other methods previously mentioned. The sensor must mathematically “de-mosaic” the captures to put them into the full RGB color space.

Line scanners capture the film with a narrow horizontal line sensor and a slit gate that the film is precisely and continuously moved across. Each frame is digitally reconstructed vertically from the horizontal line scans. These types of scanners may not be tolerant of film splices or damage, which may cause unsteadiness during transport.
Area scanners capture an entire frame at once. This can be done through an intermittent movement (similar to an optical printer) or continuously in synchronization with a strobing LED light source (FIAF Technical Commission 2016, 5). Some intermittent area array film scanners use pins to physically register a frame in the gate (much like an optical printer). This allows for a precise and consistent registration across all images in the sequence but is slow and may not be tolerant of shrunken film. Many contemporary area array scanners use capstans to drive the film continuously through the scanner without the use of sprockets or registration pins. In this case the frames, which are captured with a slight “overscan,” can then be registered digitally in post-processing using a perforation as a reference or “anchor” point across images in the sequence. These types of scanners can be tolerant of highly shrunken film, as long as it is not brittle. For severely deteriorating rolls, a scanner that does not use sprockets or pin registration may be the only option for duplication/digitization.

Liquid Gate Scanning
Like in film printing (see Section 20.6.2), scanners can be fitted with liquid gates. Similarly, the enclosed gates are filled with perchloroethylene. The “perc” fills in base scratches in the film,


minimizing light scatter and “hiding” the scratches from appearing in the digital frames captured by the sensors. This technique works well and can be preferable to digital scratch removal, which can be expensive, time-consuming, and prone to errors or artifacting.

Scanning Resolution
There is no simple rule regarding which digital resolution will fully capture all of the information in a piece of analog film. As the International Federation of Film Archives (FIAF) explains, “Digital sampling of an analog image requires a sampling rate higher than the finest detail on the original (actually twice the highest frequency of the original, as described by the Nyquist theorem) but it can be argued at one extreme that there is no theoretical limit to the resolution required to perfectly render an analog film image, right down to the microstructure of the film grain.” (FIAF Technical Commission 2016, 1–2)
Furthermore, the resolution one should ideally use depends on the age, gauge, stock, and generation of the film material. For example, film negatives hold more detail than prints from them, and 35 mm materials hold more detail than 16 mm ones. Therefore, a 35 mm negative would require a higher resolution scan than a 16 mm print. As a compromise, many film archives are now scanning at 4K resolution (4,096 pixels horizontally) at a minimum for 35 mm materials (Fossati 2018, 109).
Practically speaking, the end goal of the digitization project, and your budget, may dictate the resolution you choose. Higher resolution scans will cost more and produce more data, which then must be properly archived. If access is the goal, HD or 2K may be enough. If the aim is full preservation or restoration, with subsequent archiving, 4K or higher may be required. In the end, scanning at the highest resolution you can afford, and whose data you can manage, is the best approach. This will allow your scan to be used for a variety of deliverables (e.g., mezzanine file, Digital Cinema Package, access file), even if the uncompressed/uncorrected scan is archived and some derivatives are produced at a later date.
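To make the data implications concrete, a back-of-the-envelope storage estimate for uncompressed scans might look like the following sketch. It assumes 10-bit RGB DPX frames (three 10-bit samples packed into 4 bytes per pixel) and a 4096 × 3112 full-aperture frame, one common 4K scanning choice; actual sizes vary by scanner, aperture, and file format.

```python
# Rough storage estimate for uncompressed film scans, assuming 10-bit
# RGB DPX (three 10-bit samples packed into 4 bytes per pixel) and
# ignoring per-file header overhead.
def scan_size_gb(width, height, frames, bytes_per_pixel=4):
    """Approximate size in gigabytes of a sequence of uncompressed frames."""
    return width * height * bytes_per_pixel * frames / 1e9

frames = 24 * 60 * 10   # ten minutes of film at 24 fps
print(round(scan_size_gb(4096, 3112, frames)))   # about 734 GB at 4K
print(round(scan_size_gb(2048, 1556, frames)))   # about 184 GB at 2K
```

Estimates like this, however rough, are useful when budgeting storage for an archival scan against its derivatives.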

Lookup Tables (LUTs)

Lookup tables (LUTs) are used to map pixel values when converting between color spaces. They are used to display film scans on monitors during digital color grading, to preview what a DCP or a film print produced in the digital intermediate process will look like, and to output derivatives for certain display scenarios.

Sound Digitization

When digitizing from a composite element (e.g., a print that has optical sound), sound can often be digitized on the same machine and at the same time that the picture is being scanned. This may not provide the best results but is cheaper and, depending upon the goal of the project (access vs. archival), may be adequate. When done at the same time, there are two approaches:

1. The track is captured on a separate dedicated optical or magnetic stripe sound head that the film material is continuously moved across, similar to how a film projector works.

Caring for Film-Based Art

2. An optical track is scanned during the picture scan (i.e., a wider gate is used to capture the picture and the optical track next to the picture). The frame scans are processed by software that translates the optical track images into a digital audio file. AEO-Light is an example of open-source software that performs this extraction (Wilsbacher and Aschenbach 2019).

Digitizing sound in a separate step, on its own machine, is preferable. With this approach, sound must be synced and rejoined to the picture scans at a later step, but it allows one to digitize the best sound masters that exist on a machine that will provide the best results. Magnetic master materials (e.g., full-coat magnetic tracks, ¼″ audio tape, or DAT), which are higher quality than optical tracks, can be utilized to produce a new digital master with the best quality picture and sound. Be aware of frame rates when working with sound elements. Sound is often digitized for and synced to video files running at 23.976 fps (a.k.a. 23.98 fps), a standard rate for HDTV. Files may need to be converted to or from “true” 24 fps depending upon what is being produced.
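The 24 versus 23.976 fps relationship mentioned above follows the NTSC-derived 1000/1001 ratio. A small sketch (illustrative numbers only, not a prescribed workflow) shows why audio digitized against true 24 fps must be stretched by about 0.1% to stay in sync with a 23.976 fps file:

```python
# Illustration of the 24 fps vs. 23.976 (24000/1001) fps relationship.
from fractions import Fraction

TRUE_24 = Fraction(24)
NTSC_24 = Fraction(24000, 1001)   # a.k.a. 23.976 / 23.98 fps

def playback_duration_seconds(frame_count: int, fps: Fraction) -> Fraction:
    """Running time of a given number of film frames at a given rate."""
    return Fraction(frame_count) / fps

# A 10-minute reel shot at 24 fps contains 14,400 frames.
frames = 10 * 60 * 24

at_24 = playback_duration_seconds(frames, TRUE_24)      # exactly 600 s
at_23976 = playback_duration_seconds(frames, NTSC_24)   # 600.6 s

# Audio digitized against true 24 fps must therefore be stretched by
# 1001/1000 (about +0.1%) to stay in sync with a 23.976 fps video file.
stretch = at_23976 / at_24
print(float(at_24), float(at_23976), stretch)  # 600.0 600.6 1001/1000
```

Using exact fractions rather than floats avoids rounding drift over long reels, which is why professional tools express these rates as 24000/1001 rather than 23.976.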

Color Correction and Digital Restoration

After a film is scanned, digital post-processing can begin. The files off the scanner will typically appear flat and dull. As such, at a minimum, one must go through the process of digital color correction. For films that are color faded, digital color correction is often the only route to properly rebalance the color and restore a proper look to the film. Compared with photochemical duplication, this post-processing step allows a much wider opportunity for cleanup, repair, and correction using digital tools. Other common post-processing steps include stabilization/registration, dirt/dust removal (“dust busting”), scratch removal, and damage repair (e.g., fixing film tears). This cleanup can be done in an automated process, by a computer, or manually, by a human operator. Automated processes are cheaper and less time-consuming than a human operator but can be prone to errors that result in artifacting. No matter which approach is used, careful and painstaking quality control is a necessity to ensure the restoration and cleanup are proper, free of artifacting, and representative of the original object. Similarly, sound can be digitally restored. Common digital sound restoration steps include noise reduction, click removal, repair of damage or splices, and sound “sweetening” or remixing. Works in Progress: Digital Film Restoration Within Archives (Parth et al. 2013) is a useful resource providing a variety of case studies on the digital restoration of film materials.

Workflow

The film is scanned to an uncompressed image sequence, usually Digital Picture Exchange (DPX) files. Rather than a video file in a container (e.g., MOV, AVI, MKV), digital film masters are a sequence of files, one file per frame, named sequentially and stored in a folder. This can make them unwieldy and difficult to directly view, store, and manage. The “raw scan” refers to the resulting uncompressed image files/sequence produced by the film scanner prior to any post-processing, usually in the DPX format (.dpx). Post-processing is done on the “raw” image sequence, including cropping/framing, color correction, stabilization, and any digital restoration. Sound is digitized either simultaneously or separately. Similarly, the “raw” sound transfer (e.g., an uncompressed digital sound capture) can be restored and remastered, then synced with the picture in the post-processing step.

John Klacsmann with Julian Antos

Once the picture and sound mastering are completed and approved, a variety of derivatives, including Digital Cinema Distribution Master (DCDM)/DCP, mezzanine, and access files, are produced from this fully corrected and restored uncompressed digital master, or Digital Source Master (DSM). There is no standardization for DSMs but typically uncompressed image and sound formats that match or are similar to the “raw” masters are used, such as DPX and WAV.
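Because a digital film master is a folder of thousands of sequentially numbered frame files rather than a single video file, a simple completeness check before archiving can catch dropped frames. The sketch below assumes a hypothetical `scan_0000001.dpx`-style naming pattern; real scanner output naming varies:

```python
# A minimal sketch of verifying that a DPX frame sequence is complete:
# frames are numbered sequentially, so any gap means a missing frame.
# The filename pattern is an assumption for illustration.
import re

FRAME_RE = re.compile(r"(\d+)\.dpx$", re.IGNORECASE)

def missing_frames(filenames):
    """Return frame numbers absent from an otherwise sequential DPX set."""
    numbers = sorted(
        int(m.group(1)) for name in filenames
        if (m := FRAME_RE.search(name))
    )
    if not numbers:
        return []
    expected = range(numbers[0], numbers[-1] + 1)
    return sorted(set(expected) - set(numbers))

# Example: a scan folder where frame 3 was never written.
names = ["scan_0000001.dpx", "scan_0000002.dpx", "scan_0000004.dpx"]
print(missing_frames(names))  # [3]
```

A check like this is cheap to run after every transfer and before writing masters to LTO tape.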

Deliverables

The specifications of the file types and formats one produces and subsequently archives during film digitization are not entirely standardized and are evolving. The United States government Federal Agencies Digitization Guidelines Initiative’s Audio-Visual Working Group provides some useful guidelines and a technical overview for deliverables related to the digitization of motion picture film in their 2016 publication “Digitizing Motion Picture Film: Exploration of the Issues and Sample SOW” (Federal Agencies Digitization Guidelines 2016). For an overview of the German Institute of Standardization’s DIN SPEC 15587, the standard for the digitization of motion picture film, see Egbert Koppe and Jorg Houpert’s summary, originally published in FKT, translated into English, and published online in 2019 (Koppe and Houpert 2019). In 2019, FIAF collected information from leading film archives regarding what kinds of digital files they were producing and archiving when scanning films. The file types below are pulled from that survey and provide a relatively accurate overview of what is currently most common. See the FIAF Technical Commission’s “Digital Statement Part V: Survey on Long-term Digital Storage and Preservation” by Céline Ruivo and Anne Gant for more information (Ruivo and Gant 2019). Be aware that, like all things digital, file types, formats, resolutions, and so on will undoubtedly evolve and change in the coming years.

DIGITAL FILM MASTERS

Institutions are archiving both the raw scan and sound captures (i.e., the files produced by the machines) and the corrected versions (the DSMs) as uncompressed digital files. DPX files are most commonly used. Sound is archived as separate uncompressed audio files and synced to the picture in software. The raw scans are typically overscanned, meaning that they are uncropped and include more image area than is typically projected, including above and below the frame lines and at least some of the edges of the reel, including perforations. In these cases, the film may be scanned at slightly higher resolutions so that when cropped later, the scans are at full standard resolutions (2K, 4K, etc.). The corrected scans (i.e., the DSMs) are cropped, fully color-corrected, and include any image processing/digital restoration. The corrected sound files likewise include any digital restoration: cleanup, sweetening, and noise reduction.

RAW AND DIGITAL SOURCE MASTERS (DSMs)

Image: DPX file sequence
• 10-, 12-, or 16-bit depth, log
• 2K or 4K resolution
• RGB color space
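The storage implications of these uncompressed specifications can be estimated with simple arithmetic. The sketch below assumes 10-bit RGB packed one pixel per 32-bit word (a common DPX packing) and typical full-aperture frame dimensions, and it ignores per-file header overhead, so its results approximate rather than reproduce the per-hour figures quoted in this section:

```python
# Back-of-envelope estimate of uncompressed film-scan storage. Assumes
# 10-bit RGB DPX stored as one packed 32-bit word (4 bytes) per pixel
# and ignores DPX header overhead; actual scanner output dimensions
# vary, so published per-hour figures differ slightly.
def tb_per_hour(width: int, height: int, bytes_per_pixel: int, fps: int = 24) -> float:
    """Decimal terabytes of image data generated per hour of film."""
    bytes_per_second = width * height * bytes_per_pixel * fps
    return bytes_per_second * 3600 / 1e12

print(round(tb_per_hour(2048, 1556, 4), 2))  # 1.1  (2K 10-bit, approx.)
print(round(tb_per_hour(4096, 3112, 4), 2))  # 4.41 (4K 10-bit, approx.)
```

Note that quadrupling the pixel count (2K to 4K) quadruples the data rate, which is why resolution choices have direct archiving-cost consequences.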


Typical file sizes:
• 2K 10-bit: 1.04 TB/hour
• 4K 10-bit: 4.22 TB/hour

Sound: WAV or BWF (Broadcast Wave File)
• 24-bit depth / 96 kHz

Typical file sizes:
• Mono: 1.03 GB/hour

FROM DSM

From the DSM, two deliverables are commonly produced: a Digital Cinema Package (DCP) for contemporary digital cinema projection and a mezzanine file, a master file for producing other, smaller, derivative files for access, streaming, or home video (i.e., Blu-ray) purposes. In practice, many people encode DCPs directly from DSMs, but there is a standardized master specification, the Digital Cinema Distribution Master (DCDM), from which DCPs are encoded. The DCDM is made from the DSM, is uncompressed and unencrypted, and contains everything needed to produce DCPs. The standards for DCDMs and DCPs can be found in the DCI Specification (Digital Cinema Initiative 2020). DCPs are typically delivered on their own external USB hard drive or on a hard drive housed in a Cru DX115DC sled. Digital cinema servers require EXT2 or EXT3 formatted drives; some servers can also read NTFS formatted drives. Unencrypted DCPs can be duplicated easily with a computer that can read, write, and format EXT2 or EXT3 drives.

DCP: DCPs use JPEG2000-encoded .j2c files wrapped in MXF
• XYZ color space
• Unencrypted
• 2K or 4K
• 24 fps*
• Stereo or 5.1 Surround Sound (including center-only channel Mono)

Typical file sizes:
• 2K, Full, 24 fps, stereo sound: 90 GB/hour

*A frame rate of 24 fps is most common for works originally made on film, although some other frame rates, like 25, 30, 48, 50, and 60, are supported. Additionally, frame rates of 16, 18, 20, and 22 fps are supported but are optional, meaning not all playback servers support these rates (Saetervadet 2012, 32).

Mezzanine file: ProRes422 or ProRes4444 QuickTime MOV with PCM sound
• REC 709 color space


• HD or scanning resolution (e.g., 2K/4K)

Typical file sizes:
• HD ProRes422: 95.04 GB/hour

FROM THE MEZZANINE FILE

From the mezzanine file, a highly compressed file for access or online presentation is made.

Access file: H.264 with AAC sound
• HD
• REC 709 color space

Typical file size:
• HD H.264: 16.42 GB/hour

20.6.4 Hybrid Approaches: Digital Source Output to Analog Film

When to Use Hybrid Approaches

A hybrid approach involves scanning a film to digital, doing any cleanup/correction, and then outputting the digital master back to polyester film negative frame by frame for long-term archiving and printing purposes. This process is often called a digital intermediate workflow. It was first used by Disney to restore Snow White in 1993 using a scanner, workstation, and film recorder developed by Kodak. The workflow soon became common for high-end special effects work, which would be created and composited digitally, then recorded to 35 mm film. Today, this workflow is particularly useful when a work needs the power and flexibility of scanning and digital restoration tools (due to color fading, damage, or decay), a polyester film negative for long-term archiving, and new exhibition film prints. It can also be used when one wants to “blow up” a film (i.e., change to a larger gauge), a case where an optical printer would have been used in the past. A production shot with digital cameras (i.e., “digital film”) can similarly utilize a film recorder for outputting born-digital images to film. Since one is performing a high-resolution scan as part of the process, the digital part of the workflow is nearly identical to the film-to-digital workflow in Section 20.6.3. One should produce the same deliverables as in that workflow, in addition to the recorded film negative and prints produced in this one. See the online resource The ARRI Companion to Digital Intermediate, by Harald Brendel (Brendel n.d.), for further technical information about the digital intermediate workflow and film recorders. Phil Green’s online resource The Digital Intermediate Guide is slightly less technical but equally useful (Green n.d.).

Film Recorders

Film recorders record each frame of a digital film master onto analog film stock. The model still most widely used is the ArriLaser, which was only ever manufactured to record to 35 mm stocks. The ArriLaser is carefully calibrated and uses three colored laser beams (red, green, and blue) to record the pixels of each frame of the digital master onto film stock line by line. B&W stock can also be used in the machine. Some labs have produced their own film recorders for 16 mm film. These 16 mm “film-out” machines operate very similarly to optical printers (see Section 20.4.2), except that instead of film frames in a projector being shot by a camera, digital frames displayed on a high-resolution monitor are shot. These recorders utilize optical printer film cameras pointed at and aligned with high-resolution monitors. The monitor and camera are synchronized and computer-controlled. The monitor displays each frame of the film master at its native resolution, one by one, with each frame shot by the film camera. The resulting exposed film stock is then photochemically developed.

Workflow

The corrected digital master is fed to the film recorder, which records each frame back to a polyester-based film negative. This negative is then processed like any film negative and is often called a film-out negative. From the new film-out negative, work proceeds as in a photochemical workflow: answer and release prints are struck using a contact printer, along with an optical track (if it is a sound film), and processed. Just as in a digitization workflow, the raw and corrected digital film masters should be archived. In addition to the film-out negative, DCPs and mezzanine files can also be produced from the corrected digital film master with the aid of LUTs.

Stocks Used for Film Recording

B&W negative:
• 35 mm: Kodak Vision3 Digital Separation Film (2237)
• 16 mm: Orwo Duplicate Negative Film (DN21)

Color negative:
• 35 mm: Kodak Vision3 Color Digital Intermediate Film (2254)
• 16 mm: Kodak Color Internegative (3273*)

*In December 2020, Kodak discontinued Color Internegative (35 mm: 2273 and 16 mm: 3273). This internegative stock was commonly used in custom-built 16 mm film recorders.

20.6.5 Working with Film Labs

Film printing, recording, and processing must be contracted out to a film laboratory. You may also need to send film to a lab for scanning and digitization if your institution does not have a high-end film scanner of its own or requires the expertise of scanner technicians and digital colorists. Few film labs are left, and it is important to be aware of which labs offer which services and what work each lab does particularly well. Once a lab has been selected, you can contact their customer service department and ask for an estimate or quote. At this stage you should already have determined what you’ll be using as master material. Information you’ve collected during film inspection, including the length and condition of each element, will be crucial to provide to the lab so that they’re able to write an accurate estimate and ask any technical questions. The lab may ask to receive the film elements for inspection prior to providing an estimate. Assuming the estimate fits within your budget, you can begin the project. It is important to include detailed instructions on your project, as well as any deadlines, so the lab technicians know the goals of the project and the desired workflow. Remember, you will be more familiar with the film than the lab. The lab technicians will do their own inspections, adding leaders, labeling, and repairing the film as necessary. The film will be readied for duplication and, in the case of film-to-film work, color timed or graded. The film will then be cleaned and printed and/or scanned. Once a new film negative is ready, it too will be timed/graded to produce a new print. In the case of digitization, the raw scan will be color-corrected to produce a digital master. At this juncture, you and the lab will go through the process of quality control, making any corrections as needed. Once final deliverables are produced and delivered, it will be your duty to quality control the materials, whether prints, digital exhibition files, DCPs, or all of the above. Once you have paid all lab invoices, ensure the lab returns all materials, both the original masters and new elements. Traditionally, many labs have allowed clients to “vault” materials at the lab, the idea being that a printing negative can be pulled from the on-site lab vault and a new print struck with little delay. This is no longer desirable given the low volume of printing, and the risks vastly outweigh the minor turnaround benefits: many films vaulted at labs have been lost over the decades because of closures. Furthermore, institutional cold storage collection vaults should be better than those of most film labs.

20.6.6 Quality Control

Quality control of analog and digital film requires careful and regular calibration of all devices used in duplication and/or digitization and in playback. This includes film printers, film processors, film scanners, film recorders, viewing monitors, speakers, digital projectors, and film projectors. Every element (analog or digital) produced in the chosen workflow should be fully inspected and quality controlled before moving to the next step in the process. While direct side-by-side comparison of projected digital and film images is extremely challenging in practice (since playback cannot be synced, film projection requires a projectionist, etc.), having a color-calibrated light box next to a workstation can be useful. Or, if possible, being able to project a film in the same room as a workstation is beneficial. This allows for some comparison between the digital and analog film. In any case, having a good-quality reference print against which to match all new film and digital elements is crucial during quality control. In practice, one cannot truly quality control a scan of a film without seeing (and being familiar with) a copy of the film itself! When ready, it is best to quality control exhibition materials (e.g., a DCP, a release print, an HD file) with the display device (e.g., a D-Cinema projector in a theater, a 16 mm film projector in a gallery, a calibrated computer monitor or HDTV) and at the playback speed that will be used when exhibited.

20.7 Storage

20.7.1 Film Storage

The Image Permanence Institute’s Media Storage: Quick Reference (Adelstein 2009) is an essential resource for understanding the storage needs of various types of media and has become a definitive reference guide with regard to film storage. SMPTE Recommended Practice Storage of Motion-Picture Films (RP 131–1994) is also a useful reference (SMPTE 1994).

Cans/Cores

Films should be stored in consistently and clearly labeled cans, which provide protection and make identification simple. Film cans should not be sealed or taped shut. Do not store any other materials (other rolls of film, paperwork, etc.) in the cans with the film. Rolls should be stored with smooth winds on three-inch plastic cores, not on metal or plastic projection reels, and with the end secured with tape. With 8 mm or Super 8 mm film, archival plastic reels are acceptable since 8 mm cores are not manufactured. Film rolls should not be placed in plastic bags while in storage: with decaying acetate films, plastic bags trap acetic acid and accelerate deterioration. Film rolls are often placed in plastic bags for shipping, so be sure to check and remove films from any bags after receiving them.

Cold Storage/Image Permanence Institute Guidelines

The Image Permanence Institute (IPI) has studied the long-term effects and advantages of the cold storage of film and provides the most reliable data and guidelines. In short, normal room conditions are not good enough for the long-term storage of film materials and will lead to a much faster rate of base deterioration and dye fading than cold storage. Cold-temperature storage is the most important factor in storing film properly and extending the life of a film collection. Additionally, high relative humidity can lead to mold growth and is also a factor in color fading and base deterioration, so dry storage is important as well. IPI succinctly summarized their storage guidelines for film: “Make the temperature as low as you can, and preferably make the average relative humidity (RH) 20% to 30%. RHs up to 50% will do no special harm; they will only result in a faster rate of vinegar syndrome and dye fading than would occur at 20% RH” (Reilly 1993, 19). Be aware, however, that special precautions must be taken to avoid condensation forming on film materials when moving them from cold or frozen storage to room temperatures. Intermediate, slightly warmer (but still below room temperature) staging rooms are often constructed between film vaults and the exterior, or tightly sealed plastic bags can be used. No matter where you store your film, a temperature and relative humidity data logger will help you understand your current environment, make projections, and, if necessary, provide ample data to advocate for better storage conditions at your institution. The IPI’s filmcare.org website allows one to upload data from a logger and will generate plots and automatically assess the conditions of the collection (IPI—Filmcare.org n.d.).
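As a practical illustration of working with logger data, the sketch below summarizes a set of readings and flags those outside IPI’s preferred 20–30% RH band. The CSV column names are assumptions for illustration; real loggers export in many different layouts:

```python
# A minimal sketch of summarizing temperature/RH data-logger readings
# before uploading them to a tool such as filmcare.org. The CSV layout
# (timestamp, temp_c, rh_percent) is a hypothetical example.
import csv
import io

def summarize(csv_text: str) -> dict:
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    temps = [float(r["temp_c"]) for r in rows]
    rhs = [float(r["rh_percent"]) for r in rows]
    return {
        "avg_temp_c": sum(temps) / len(temps),
        "avg_rh": sum(rhs) / len(rhs),
        # Count readings outside IPI's preferred 20-30% RH band.
        "rh_out_of_band": sum(1 for rh in rhs if not 20 <= rh <= 30),
    }

sample = """timestamp,temp_c,rh_percent
2023-01-01T00:00,4.0,28
2023-01-01T01:00,4.5,31
2023-01-01T02:00,5.0,26
"""
print(summarize(sample))
```

Even a summary this simple, run monthly, provides the kind of longitudinal evidence the text suggests using to advocate for better storage conditions.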

Segregating and Freezing Decaying Acetate Film

As Peter Adelstein of the IPI describes, “acetate films that have already deteriorated can emit acidic gasses that may be absorbed by other film stored nearby. It is possible that absorption of acidic and oxidizing vapors from degrading films can accelerate decay among undeteriorated films in the same storage area” (Adelstein 2009, 3). For this reason, acetate decomposition is sometimes called “contagious.” If possible, decaying acetate films should be segregated from films that are not actively decaying. However, as the IPI points out, cold temperatures, ventilation, and fresh air exchange are more important in lessening the threat of vinegar syndrome in storage areas (Adelstein 2009, 3).


Since acetate film decomposition cannot be reversed, it can be helpful to freeze deteriorating films as the low temperature will significantly slow the rate of decay. Practically speaking, standalone consumer or industrial freezers can be used for this purpose. FilmForever.org provides a useful guide for this process, which entails carefully sealing the film can and then placing it in two sealed plastic bags, which are labeled (Bigourdan, Coffey, and Swanson 2020). When removed, the film is allowed to “thaw” and acclimate to room temperature before being removed from the plastic bags.

20.7.2 Digital Storage

Raw and DSM files are stored as uncompressed files, typically DPX and an uncompressed audio format. The Federal Agencies Digital Guidelines Initiative (2019) has published a comprehensive guide for embedding metadata within DPX files, Guidelines for Embedded Metadata within DPX File Headers for Digitized Motion Picture Film. The open-source tool embARC can embed and edit metadata in DPX and MXF files (embARC n.d.). Since the total file size of a digitized film can easily reach terabytes, many utilize LTO magnetic data tapes for long-term digital storage of the masters. Since DPX files are sequences (i.e., typically a collection of thousands of DPX files, one file per frame) rather than individual files, they can be burdensome to manage and read/write from/to storage. As such, some package and wrap the DPX sequences into an individual file such as a ZIP or TAR package. A more recent tool developed by MediaArea, RAWcooked, can package the sequences into open-source lossless compression formats, FFV1 (picture) and FLAC (audio), in a Matroska (MKV) container. RAWcooked can “restore” the original sequence exactly, if necessary (RAWcooked n.d.). Similarly, the Library of Congress utilizes an MXF wrapper with JPEG2000 (picture) and linear PCM (sound) formats (Federal Agencies Digitization Guidelines Initiative Audio-Visual Working Group 2016, 6). Other files (mezzanine, access, etc.) utilize compressed file formats and are often stored within a digital repository system for digital archiving and access (see Chapter 8).
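As one illustration of lossless packaging, the sketch below builds an FFmpeg command line to encode a DPX sequence and WAV audio to FFV1/FLAC in a Matroska container. Note that this is not equivalent to RAWcooked: a plain FFmpeg encode preserves the picture and sound losslessly but discards the original DPX file headers, so it is not bit-exact reversible back to the source files. The paths and numbering pattern are hypothetical:

```python
# Sketch of assembling (not executing) an ffmpeg command to encode a
# DPX sequence plus WAV audio to FFV1/FLAC in Matroska. Unlike
# RAWcooked, this discards DPX headers, so it is not reversible to the
# original files. Paths and the %07d numbering pattern are assumptions.
def build_ffv1_command(dpx_pattern: str, wav_path: str, out_mkv: str, fps: int = 24):
    return [
        "ffmpeg",
        "-framerate", str(fps),          # frame rate of the image sequence
        "-i", dpx_pattern,               # e.g. "scan_%07d.dpx"
        "-i", wav_path,
        "-c:v", "ffv1", "-level", "3",   # FFV1 version 3 (intra, checksummed)
        "-c:a", "flac",
        out_mkv,
    ]

cmd = build_ffv1_command("scan_%07d.dpx", "sound.wav", "master.mkv")
print(" ".join(cmd))
```

Wrapping the command construction in a function like this makes it easy to log exactly what was run for each title, which is itself useful preservation metadata.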

20.8 Exhibiting Film-Based Art

By Julian Antos

There are many ways in which artists’ film and film-based media art is exhibited: a 16 mm film continuously projected on a looper in a museum gallery; a multi-format program as part of a film festival using a temporary projection setup; or a 35 mm screening at a cinema with permanently installed equipment are just a few examples. Regardless of the setting, the setup and presentation of analog film always requires careful planning. The following section steps through the considerations that go into choosing and designing the exhibition space, as well as sourcing and maintaining the necessary equipment.

20.8.1 Screening Room Considerations and Design

Room Conditions

A screening room—meaning a space which is purpose-built for the display of motion picture material—should avoid acoustically reverberant and visually reflective materials if at all possible. Light reflections from screening room walls will eventually make their way to the screen and degrade the projected image contrast. Acoustical reverberations act similarly, bouncing around the room and reducing dialog intelligibility. Additionally, ambient light, such as from exit signs or other galleries, should be kept to a minimum, and care should be taken that light and sound from adjacent rooms do not bleed into the screening space (and similarly, that sound from the screening space does not bleed out).

Screen Material

Many galleries choose to use a white wall as a projection surface. While this is a common and economical approach, it does not allow for placement of speakers behind the image, which—to a degree—degrades the illusion that the sound is coming from the screen. If a wall is to be used for projection, it is beneficial to use a reflective screen paint specifically designed for projection rather than a white primer paint. Another common way of creating fitted screens is flexible screen fabric stretched over (custom) frames. Screen materials are typically made of vinyl, with perforations to allow the transmission of sound from speakers placed behind the screen. Recent developments have also been made in acoustically transparent material that does not need perforations and is lighter weight. For standard gallery or cinema settings, a matte white screen with a gain of 1.0—meaning the light reflected off the screen is equal to the light projected onto it—is ideal. Higher-gain screens are available for increased light output but sacrifice luminance uniformity, viewing angle, and color balance as a tradeoff. High-gain silver screens used for 3D projection in commercial cinemas are severely compromised for standard 2D projection, resulting in an image with uneven brightness and poor contrast.
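Gain enters screen-brightness arithmetic directly: reflected luminance in foot-lamberts is roughly projector lumens divided by screen area (in square feet), times gain. A sketch with illustrative numbers follows; the 16 fL figure used in the comment is the commonly cited SMPTE aim for film projection, and all other values are hypothetical:

```python
# Rough screen-luminance estimate: with a matte 1.0-gain screen,
# foot-lamberts ~= projector lumens / screen area (sq ft) * gain.
# All projector/screen numbers below are illustrative assumptions.
def foot_lamberts(lumens: float, width_ft: float, height_ft: float, gain: float = 1.0) -> float:
    return lumens / (width_ft * height_ft) * gain

# A 13-ft-wide, 1.85:1 screen lit by a hypothetical 1,500-lumen source:
w = 13.0
h = w / 1.85
print(round(foot_lamberts(1500, w, h), 1))  # 16.4, near the commonly cited 16 fL aim
```

The same arithmetic shows why high-gain screens trade evenness for brightness: multiplying by a gain above 1.0 raises the on-axis number without adding any light, so the extra luminance must come at the expense of off-axis viewers.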

Screen Masking

In analog film projection, the edges of the projected image are never perfectly sharp, because the emulsion of the film and the edge of the aperture are on different focal planes. While many artists choose to expose these characteristically soft image edges and rounded corners, in the theater or black box context, screen masking is often a key component of a screen setup. Typically a black fabric, masking creates a “hard” edge that replaces the soft edge of a projected analog film frame (or straightens out keystoning in video projection). Masking has the additional benefit of increasing the contrast on screen and “pulling” the audience into the image. Masking may be made of a variety of materials, from black paint on a wall to highly sophisticated motorized systems with programmable presets. When a fabric is used, it should be acoustically transparent so that the speakers behind it are not muffled. Regardless of the material, masking should absorb light, not reflect it, and border all four sides of the screen.

Sound in Analog Film

Sound system design in commercial cinemas has increasingly gravitated toward adding more surround channels, but the screen channels, meaning the speakers behind the screen, have remained relatively consistent for the past several decades. Unlike a stereo music recording, stereo sound in film usually has three screen channels: left, center, and right. Mono films are typically mixed to be played back from one center-channel speaker, preferably installed behind the screen. Many small venues, or venues where film is installed in addition to performing arts (which place very different demands on a sound system than film), will forgo a center channel and use left and right only. This has the unfortunate effect of altering the sound mix of stereo films, which would normally be played through three speakers, and of mono films, which would normally be played through a center channel only. The result is usually a reduction in overall dialog intelligibility. If at all possible, separate sound systems should be installed for live sound (e.g., concerts, public address) and cinema sound. For gallery installations in crowded or noisy areas, installing several pairs of headphones can be a great solution for ensuring excellent sound quality. Bluetooth headphones tend to have latency and should be avoided.

Projection Booth or Enclosure

If projection equipment is installed in the same room as the screen and audience, care should be taken to ensure that the equipment cannot be tampered with by the curious public. That said, the equipment also needs to be in a place where projectionists and technicians are able to access it. A projection booth provides room for projectors, sound equipment, and often a film inspection area. Adequate space should be allocated for the equipment, the projectionists operating it, and the technicians servicing it. Optical-quality glass should be used for port windows so that image brightness and contrast are not compromised. Viewing ports for projectionists should be positioned at a comfortable height near the projector so that the operator can adjust focus and framing while seeing the image on the screen; these ports should also be made of optical-quality glass. A spotting scope or a pair of binoculars should be a requirement for any projection booth so that the operator can make critical focus adjustments. Light leaks from projectors and other sources within the booth should be isolated, so that the only light entering the auditorium from the booth is the projected image hitting the screen. Many booths have a fish-tank-style design with a single large port window spanning the entire width of the booth. While this is appealing for audience members who might want to admire the technology, this design makes controlling light leaks from equipment and booth lighting very difficult.

Essential Film Preparation Equipment

Any venue planning to exhibit film should have at minimum the following equipment: splicers for each film gauge being used, a set of rewinds, split reels and good-quality take-up reels, a good set of tools, test films and practice reels, and cleaning supplies. All equipment should be kept in good repair, maintained by a qualified service technician, and operated by a qualified projectionist. Any room where equipment is being operated should be kept as clean as possible, free from dust and dirt, and cool and dry.

Artist Requirements

Regardless of best practices and standards for screening room design, most artists have ideas about how their films should be presented, and these may sometimes deviate from what we would expect in commercial cinemas or even from standard best practices. The principles above can be applied to any installation, and compromises can be made to ensure the best possible presentation of the film. For more on screening room design, see Dolby Stereo Technical Guidelines for Dolby Stereo Theaters (Dolby Laboratories 1994) and THX Theater Alignment Program Recommended Guidelines for Presentation Quality and Theater Performance for Indoor Theaters (Lucasfilms Ltd. 2000).

Caring for Film-Based Art

20.8.2 Film Projector Types and Anatomy Motion picture film projectors are fairly simple machines, and generally speaking, they all operate in the same way. A constant speed sprocket pulls the film into the projection head. An intermittent movement moves the film one frame at a time, holds it still in the projector gate for a fraction of a second, and repeats. While the film is held still, light from a lamphouse shines through the film, and that light travels through a lens and onto the screen. While the film is advanced, a shutter blocks the light from passing through so that the audience does not see this movement. The film leaves the intermittent sprocket, and its stop-start-stop movement is stabilized by either another constant speed sprocket or an impedance roller. The film then enters a sound head where the light from an exciter lamp or LED shines through the optical sound track of the film, through a lens, and onto a solar cell. The variations in light are converted into an electrical audio signal and are amplified into the auditorium. The motion picture projector is an optical system. The pieces of this system need to work in harmony and be perfectly aligned by a qualified technician, or the resultant image and sound will be degraded, and film prints quickly scratched or otherwise destroyed. Many advancements in film projection technology have been made over the past 100 years, but the fact that the general functionality of film projectors has remained consistent means that a film projector from 1939 can still produce state of the art picture and sound in 2020 (with some modifications and adjustments). Similarly, many projectors of 1950s vintage are still in use today.

Theatrical versus Portable Film projectors can be divided into two main categories: theatrical and portable. Theatrical projectors are usually modular, consisting of a picture head, sound head, pedestal or column, lamphouse, and film transport (reel arms, platter, tower system). They often weigh several hundred pounds, are designed for permanent or semi-permanent installation, and have more demanding power requirements than portable machines. Theatrical projectors are available in 16 mm, 35 mm, and 70 mm, as well as dual 16/35 mm and 35/70 mm machines, and are suitable for small screening rooms, enormous cinemas, and everything in between. Except for a handful of specialty cinemas using carbon arc lamphouses, which were the dominant light source for theatrical film projection until the 1970s, all theatrical projectors use xenon lamps. The wattage of the lamp, in tandem with variables such as shutter angle and frequency, film gauge, lens speed, and reflector type, determines the size of the image that can be projected. A small screening room might use a three-bladed (72-cycle) shutter and a 2,000 W bulb to project 35 mm onto a 13′-wide screen, and a large cinema might use a 4,000 W bulb with a two-bladed (48-cycle) shutter to project 70 mm onto a 60′-wide screen. Because xenon lamps generate a considerable amount of heat, an exhaust system is necessary to pull hot air out of the projector lamphouse. Additionally, most manufacturers recommend the use of water-cooled film gates in projectors using lamps over 3,000 W. Heat filters and glass reflectors, which do not reflect UV light, are also recommended to reduce the risk of color fading and heat damage to the projection print. Generally speaking, the maximum image size attainable without putting the film at risk of heat damage is about 20′ wide for 16 mm, 50′ wide for 35 mm, and 80′ wide for 70 mm, assuming all safety precautions are in place and the image on screen is properly illuminated.
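The arithmetic behind these figures is simple to sketch. A minimal helper, using the shutter cycle rates and rule-of-thumb screen-width ceilings quoted above (the function names and lookup table are illustrative, not drawn from any projection standard):

```python
# Rule-of-thumb planning helpers for a theatrical projection setup.
# The screen-width ceilings follow the figures in the text; treat them
# as planning estimates, not engineering limits.

# Approximate maximum screen widths (feet) before heat damage becomes a
# risk, assuming heat filters, proper exhaust, and adequate illumination.
MAX_SCREEN_WIDTH_FT = {"16mm": 20, "35mm": 50, "70mm": 80}

def flicker_rate(fps: float, shutter_blades: int) -> float:
    """Light interruptions per second: frame rate times shutter blades.
    A two-bladed shutter at 24 fps yields 48 cycles; three blades, 72."""
    return fps * shutter_blades

def width_ok(gauge: str, screen_width_ft: float) -> bool:
    """Is the proposed screen within the rule-of-thumb ceiling for the gauge?"""
    return screen_width_ft <= MAX_SCREEN_WIDTH_FT[gauge]

print(flicker_rate(24, 2))   # 48 cycles/s (large-cinema two-blade shutter)
print(flicker_rate(24, 3))   # 72 cycles/s (small-room three-blade shutter)
print(width_ok("35mm", 60))  # False: beyond the ~50' guideline for 35 mm
```

The higher cycle rate of a three-bladed shutter is also why it is recommended at silent speeds: 18 fps through a two-bladed shutter would flicker at only 36 cycles per second.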
Theatrical projectors are quite loud and should generally be used in sound-insulated projection booths, though it is not uncommon to see them in gallery settings. For archival projection of feature films, a two-projector changeover system is almost always a requirement. A changeover system allows for projection on 2,000′ reels without the need

John Klacsmann with Julian Antos

to “build” or “platter” a print by cutting heads and tails, which is considered a destructive practice. Portable projectors are typically one self-contained piece and are light enough to be hand-carried. The projector mechanism, sound head, reel arms, and amplifier are typically all in the same piece. They are available in 8 mm, Super 8, 16 mm, and (less commonly) 35 mm. Historically used in classrooms, libraries, prisons, homes, and other non-theatrical settings, portable projectors were designed for small screens. Some of the more common 16 mm projector models encountered in the art context today are the Eiki SL and SSL (Slot Load and Super Slot Load—rebranded as Elf projectors in some countries), Elmo 16CL (Channel Load), and the Kodak Pageant 250S. A typical Eiki 16 mm projector using a 250 W halogen bulb will project an adequately bright image up to about 5 or 6′ wide. Portable projectors using xenon lamps can be used on even larger screens, though one should not make the mistake of using a 500 W 16 mm xenon projector (which produces a beautiful 8′-wide image) on a 20′-wide cinema screen. An underlit image will be pale and washed out, and not a good example of the 16 mm format. Although most portable projectors have an integrated audio amplifier, many users choose to connect the projector to a larger audio system. To do this successfully, the user must be sure that they are getting a line-level, rather than speaker-level, output from the projector. Most 16 mm projectors do not have a “line-out” output for sound and only offer a “speaker-out” connection. Running the speaker output to a mixer, PA, or sound processor will result in very distorted sound. Instead, a speaker level to line level adapter (typically a direct input box, or DI box) can be used, or the projector can be modified with a line-level output.
It is best to avoid routing the audio signal from the projector through multiple components and to avoid multiple stages of equalization (e.g., if going from a projector’s internal amplifier to a DI box to a mixer to a loudspeaker amplifier, set the bass and treble controls of the projector to flat and set equalization in the mixer). Although the dynamic range of 16 mm optical soundtracks is quite limited, the sound should never be distorted or muffled, except in the case of a poorly recorded track. Limited dynamic range should not be confused with poor-quality audio. A good optical soundtrack properly reproduced can yield very pleasing audio quality.
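The mismatch between speaker level and line level can be quantified. A rough sketch, assuming a nominal professional line level of +4 dBu (about 1.23 V RMS); the numbers are illustrative, and a real DI box handles impedance matching as well as level:

```python
import math

def rms_voltage(power_w: float, impedance_ohms: float) -> float:
    """RMS voltage a speaker-level output delivers: V = sqrt(P * R)."""
    return math.sqrt(power_w * impedance_ohms)

def pad_db(speaker_v: float, line_v: float = 1.23) -> float:
    """Attenuation (dB) needed to bring a speaker-level signal down to
    line level (~1.23 V RMS, i.e., +4 dBu, a common pro-audio reference)."""
    return 20 * math.log10(speaker_v / line_v)

# A modest 10 W portable-projector amplifier driving an 8-ohm load:
v = rms_voltage(10, 8)      # just under 9 V RMS at full power
print(round(pad_db(v), 1))  # roughly 17 dB of attenuation needed
```

Feeding nine volts into an input expecting about one is why an unpadded “speaker-out” connection overloads a mixer so badly.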

Tungsten versus Xenon The color temperature of the tungsten lamps used in most portable projectors is about 3200 Kelvin, while the much cooler color temperature of the xenon lamps used in theatrical and more robust portable projectors is about 6000 Kelvin. For this reason, it is important to tell your film lab which type of projector you will be using when ordering new prints. A tungsten-balanced print projected with a xenon lamp will look too cool or blue, and a xenon-balanced print projected on a tungsten projector will look overly warm or yellow. Of course, when you are projecting prints that have already been made, you simply get what you get. Modern (post-1980) 35 mm prints are almost universally balanced for xenon projection.

Intermittent Type: Claw versus Sprocket versus Electronic Pulldown Three different systems can be used in a film projector to achieve intermittent movement:

• A claw system is driven by a cam movement and pulls the film down one frame at a time with a two- or three-tooth claw. This system is used in 8 mm, Super 8, and 16 mm portable projectors and some 16 mm theatrical projectors.
• A Geneva movement, typically found in 16 mm, 35 mm, and 70 mm theatrical projectors, uses a sprocket attached to a star and cam movement in an oil bath to advance the film.
• An electronic movement uses a stepper motor, which stops and starts electronically.

No system is inherently better or worse, though modern projectors with electronic pulldowns can be much more expensive and complicated to repair than their mechanical counterparts.

Lenses In 35 mm projection, most lenses are of fixed focal length. A 35 mm projector in use in a cinema will typically have a lens for each aspect ratio and an aperture plate that masks the film in the projector so that only the intended image area is projected. For CinemaScope films, an anamorphic attachment is used, stretching (or rather, decompressing) the image by a ratio of 2:1. Lenses for 35 mm projection are typically 70.6 mm in diameter so that any lens can be used on any projector (with some exceptions). In 16 mm projection, both zoom lenses and fixed focal length lenses are used. The barrel size of lenses for 16 mm projection is not standardized, so lenses are not always interchangeable between manufacturers. Anamorphic adapters are also available for 16 mm projectors, and 35 mm scope lenses can be adapted for 16 mm projection with a little bit of engineering.
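The 2:1 “decompression” is just a multiplication of aspect ratios. A one-line sketch (the 1.19:1 on-film figure is a nominal value for 35 mm scope prints, not an exact aperture dimension):

```python
def projected_aspect(on_film_ratio: float, squeeze: float) -> float:
    """Aspect ratio on screen after anamorphic decompression: the lens
    stretches the image horizontally by the squeeze factor."""
    return on_film_ratio * squeeze

# A 35 mm scope frame of roughly 1.19:1 on film, through a 2x anamorphic
# attachment, yields the familiar ~2.39:1 CinemaScope image.
print(round(projected_aspect(1.195, 2.0), 2))  # 2.39
```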

Film Playback Speeds Most film projectors are designed to run at 24 fps, commonly called sound speed. Many portable 8 mm, Super 8, and 16 mm projectors can also run at 18 fps, or silent speed, by means of a mechanical pulley, and some older projectors can run at 16 fps or below. Most Super 8 sound projectors can run at both 18 and 24 fps, and it is common for this format to have sound recorded at either speed. Theatrical projectors found in archives, museums, and cinematheques are often equipped with variable frequency drives or inverter drives, which allow for true variable speed from 16 to 24 fps (and sometimes higher or lower depending on the projector and motor); these drives require inverter-duty motors, which are not always compatible with all projectors. When silent speeds are used, a three-blade shutter is recommended to reduce flicker. Exhibiting films at nonstandard speeds requires careful planning. At a recent exhibit of Andy Warhol’s Screen Tests, portable projectors had to be modified at great expense to run at 16 fps because the preferred equipment for the exhibition in terms of reliability and film handling could not readily accommodate silent speed.
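Because footage is fixed but projection speed is not, running time depends on both gauge and speed. A small calculator using the standard frame counts per foot for each gauge:

```python
# Frames per foot for common motion picture gauges.
FRAMES_PER_FOOT = {"8mm": 80, "super8": 72, "16mm": 40, "35mm": 16}

def runtime_minutes(gauge: str, length_ft: float, fps: float) -> float:
    """Running time in minutes for a given footage and projection speed."""
    return (length_ft * FRAMES_PER_FOOT[gauge]) / fps / 60

# A 1,200' reel of 16 mm runs about 33 minutes at sound speed...
print(round(runtime_minutes("16mm", 1200, 24), 1))  # 33.3
# ...but stretches past 44 minutes at silent speed.
print(round(runtime_minutes("16mm", 1200, 18), 1))  # 44.4
```

The same footage therefore runs a third longer at 18 fps than at 24 fps, which matters when scheduling screenings of silent-speed work.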

Optical Sound Playback In an ideal setting, optical soundtracks should be played back through a high-quality preamplifier or cinema sound processor, such as a Dolby CP65 or CP650. For playback of 35 mm stereo soundtracks, a cinema processor is essential for noise reduction and multichannel decoding. 35 mm prints may also have digital soundtracks (Dolby Digital, DTS, or SDDS), which require additional sound processors, though almost all prints with digital soundtracks also contain an analog backup. For more on film projection, see The Advanced Projection Manual (Sætervadet 2006) and The Art of Film Projection: A Beginners Guide (George Eastman Museum 2019).


20.8.3 Loopers and Gallery Display Loopers Loopers are one of the most popular devices for displaying films in the gallery. Rather than projecting a film from one reel to another, the head of the film is spliced to the tail and the film is projected continuously without interruption, rewinding, or rethreading. When looping films, the media must always be a new print made specifically for looping exhibition. Over the course of several hundred projections, the film will invariably become worn out through a combination of color fading from light exposure, abrasion, scratches, and dust. Thus, only replaceable (i.e., not “one of a kind”) prints should be exhibited on a looper. Typically, multiple new prints are struck for any looping exhibition and replaced preventatively at regular intervals; color prints are commonly replaced every three weeks, i.e., after three six-day weeks of continuous screening during museum hours, while B&W prints are less vulnerable to fading and may last longer. Loopers come in several different sizes, and the best practice is to order a looper that is the next size up from the film you wish to show. For example, if you’re looping a 15-minute film, you would want a looper designed to play at least 20 minutes. Most loopers are custom-made by individuals and small companies, and each has its own set of eccentricities. If using a looper for the first time, it would be wise to use an expendable practice film to become familiar with its operation. On most loopers, a fresh print will tend to “misbehave” for the first few runs through the projector. It will need to be closely monitored and adjusted; if all is right, it will then run without much trouble. Motorized loopers are available for longer runtimes. If possible, use an ultrasonic splice rather than a tape splice to join the head and tail. Over time, a tape splice will stretch, degrade, and eventually fail.
If an ultrasonic splicer is not available, make a point of checking the integrity of the splice at regular intervals. Although loopers allow for uninterrupted projection, they are not maintenance-free. A successful gallery installation of a looped film includes daily cleaning of the projector and looper, weekly mechanical checks, preemptive replacement of prints before scratching or wear becomes a distraction, and regular monitoring of the system by staff who are familiar with film projection to ensure a quality presentation. An unsuccessful gallery installation will include a poorly maintained projector and looper that have not been used in some time, unsupervised projection of an out-of-focus image, heavy scratching from a dirty gate and poorly adjusted rollers, and damage to the projector mechanism. Because of the strength of modern polyester film stocks, it is important that the film and projector be monitored: if the film jams or catches, the damage to the equipment may cost more time and money to repair than hiring an experienced projectionist to supervise it. Common causes of film wraps are very low relative humidity and static buildup, dirty and oily prints, and sticking looper rollers. Almost all problems can be avoided with regular maintenance. Under the best circumstances, prints on a looper can last for over 1,000 runs without significant signs of wear. As prints are taken off, a technician should evaluate them for signs of damage to see if anything in the film path could be adjusted or improved upon.
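The replacement schedule for looping prints can be estimated from the film’s runtime and the gallery’s opening hours. A back-of-the-envelope sketch using the 1,000-run best case mentioned above (the function and its parameters are invented for illustration; actual replacement intervals should follow conservation policy, not run counts alone):

```python
import math

def looper_plan(film_minutes: float, gallery_hours_per_day: float,
                days_open_per_week: int, max_runs: int = 1000):
    """Estimate runs per day and how many weeks a looping print lasts
    before reaching a conservative run-count ceiling."""
    runs_per_day = math.floor(gallery_hours_per_day * 60 / film_minutes)
    weeks_to_ceiling = max_runs / (runs_per_day * days_open_per_week)
    return runs_per_day, weeks_to_ceiling

# A 15-minute film looping 8 hours a day, 6 days a week:
runs, weeks = looper_plan(15, 8, 6)
print(runs)             # 32 runs per day
print(round(weeks, 1))  # about 5.2 weeks to reach 1,000 runs
```

Note that a color print would normally be swapped out at the three-week mark anyway; the run-count ceiling is the best case, not the target.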

Multiple and/or Synchronized Projections Some film installations require multiple projectors or even frame-accurately synched projections. To improve aesthetic coherence across projections, it is advisable to source a set of identical


projector makes and models and to replace the projector lamps regularly at the same time. Just like loopers, synchronization units are custom-made by individuals and small companies. Each unit type has its own syncing method, and some require the use of custom-modified projectors to bypass built-in motors.

20.8.4 Storage, Maintenance, and Long-Term Care of Film Projectors Although nearly all of the companies that manufactured film projectors and parts have gone out of business or transitioned to digital imaging technology, many old stock parts are still available as new from third-party dealers for all but the most esoteric machines. Where parts are no longer available, many individuals and companies have manufactured their own gears, gates, shutter blades, and other parts. Institutions would do well to plan ahead and work together when considering parts longevity for film projectors. For example, it may not be economically feasible for one museum to pay for the custom manufacture of a single replacement gear, but if ten or twenty other institutions will eventually need that same gear, the cost goes down considerably. When projectors are not in use, they should be stored in a dry environment to avoid rust and condensation. The best practice is to run the projector with film for several minutes at least once a month to exercise the belts and bearings. If this is not possible, oil should be drained, flywheels and belts removed, and any parts containing springs should be in a “relaxed” position. Xenon lamps should be removed from lamphouses for long-term storage due to the risk of explosion, which could damage the reflector. Using film projectors regularly can be one of the best forms of maintenance. For example, an art house theater that runs film only once a year for a festival would do well to have a monthly 35 mm screening to exercise the mechanics of their projectors and ensure their projection staff is in good practice. Similarly, a library that only runs 16 mm when a patron requests to view a film might schedule a regular 16 mm screening in their conference room once a month to make sure their equipment is in good shape and cleaned regularly. 
Scratch tests, which involve running a piece of virgin film (either clear leader or raw film stock, available from a lab or most cinema equipment suppliers) through a projector multiple times and making sure it does not scratch, should be a regular part of maintenance, performed at regular intervals, before equipment is returned to use after sitting dormant, and any time a projector is modified or repaired. Many modern projectors contain electronics, which should be turned on periodically, and associated electronic equipment, such as sound processors, should be powered on from time to time and not left dormant. The current supply of projectors and parts is enough to satisfy the demands of the industry for many years to come, but maintaining the knowledge to operate, repair, and restore these machines is critical.

20.8.5 Digital Cinema Projectors Section 20.6.3 of this chapter covers the digitization of analog film, resulting in Digital Cinema Packages, or DCPs. Since the mid-2000s, DCPs (which include both digitized and born-digital films) have been introduced in commercial cinemas as a replacement for analog 35 mm film projection. In recent years, DCPs have also increasingly entered the art world for exhibition purposes. In the cinema world, each DCP may contain multiple Composition Playlists, or CPLs, for a given title, and each CPL is typically a different version of the title.


Here are two examples:

Tenet_FTR-3_F-220_EN-XX-CCAP_OV_51-HI-VI-Dbox_4K_WR_20200812_DTB_SMPTE_OV
Tenet_FTR-3_F-220_EN-XX-OCAP_OV_51-HI-VI-Dbox_4K_WR_20200813_DTB_SMPTE_VF

These CPLs are two different versions of the same movie: one with closed captions (CCAP) and one with open captions (OCAP). Both have a D-Box track, which causes the audience’s seats to rumble if the venue is so equipped. Most DCPs are encrypted and unlocked with a Key Delivery Message, or KDM, which stipulates a time, day, and projector on which the CPL may be played. These KDMs are typically generated by a lab, and miscommunications between an exhibitor, distributor, and lab can often lead to canceled or delayed shows (a common example is a KDM for a midnight screening expiring at 12:00 a.m. on the wrong day). Mistakes can be avoided if KDMs are delivered several days in advance with long validity windows and are tested as soon as possible. A wide variety of open-source software for making DCPs is now available, and artists and filmmakers can make their own DCPs for exhibition at very little cost. However, unless you have the equipment to properly test and evaluate DCPs, it is strongly advised to work with a post-production facility or lab that does. Furthermore, testing a DCP on in-house equipment does not necessarily mean it will work the same at every venue (for example, an in-house server might read an NTFS-formatted drive, but another venue’s server may not). Projectors designed to show DCPs are referred to as DCI-compliant projectors, meaning they adhere to the standards set forth by Digital Cinema Initiatives (Digital Cinema Initiatives 2020). They are quite large, weighing from 100 to 350 pounds, and are typically noisy enough to require a projection booth or enclosure. DCI-compliant projectors can also show video sources other than DCP, though some external processing is required for analog video and certain resolutions. The life span of a digital cinema projector and digital cinema server is comparable to that of a good-quality desktop computer.
Digital cinema equipment is usually retired after five to ten years of regular use, either due to obsolescence or because repairs are no longer cost-effective. It is critical to power on dormant digital cinema equipment at least once a month.
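The underscore-delimited composition names and the KDM validity window described above both lend themselves to simple tooling. A sketch that pulls a few fields out of a composition name and sanity-checks a showtime against a KDM window (field positions follow the common digital cinema naming convention, but real-world names vary, so treat this as illustrative):

```python
from datetime import datetime

def describe_cpl(name: str) -> dict:
    """Extract a few informative fields from an underscore-delimited
    composition name. Field order follows common naming conventions."""
    fields = name.split("_")
    caption = ("closed" if "CCAP" in name
               else "open" if "OCAP" in name else "none")
    return {
        "title": fields[0],
        "content_type": fields[1],  # e.g. FTR-3: feature, version 3
        "captions": caption,
        "package": fields[-1],      # OV (original version) or VF (version file)
    }

def kdm_valid_for(show_start: datetime, not_before: datetime,
                  not_after: datetime) -> bool:
    """Would this showtime fall inside the KDM's validity window?"""
    return not_before <= show_start <= not_after

cpl = describe_cpl(
    "Tenet_FTR-3_F-220_EN-XX-CCAP_OV_51-HI-VI-Dbox_4K_WR_20200812_DTB_SMPTE_OV")
print(cpl["captions"], cpl["package"])  # closed OV

# The classic midnight-show failure: the KDM is keyed to the wrong day
# and expires at 12:00 a.m. before the screening even starts.
show = datetime(2020, 8, 15, 0, 0)      # the "Friday midnight" show
kdm_end = datetime(2020, 8, 14, 0, 0)   # window ends a day too early
print(kdm_valid_for(show, datetime(2020, 8, 13, 8, 0), kdm_end))  # False
```

Running a check like this as soon as a KDM arrives, days before the show, is exactly the kind of early testing the text recommends.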

20.8.6 Analog and Digital Film Loans Many institutions provide access to their film materials through loans of analog or digital materials. One of the most common loan requests is for projection prints for theatrical exhibition, ranging from a 35 mm print of Jaws from Universal Studios to a 16 mm print of one of Andy Warhol’s Screen Tests from the Museum of Modern Art. Lenders typically require several weeks’ advance notice before lending a print, proof of insurance, references from other institutions, proof of competent projection staff and well-maintained changeover projection equipment, and proof of copyright clearance if the lender does not own the copyright. Borrowers are expected to treat the film with great care and return it in the same condition in which it was sent out, to agree not to platter the print by cutting off heads or tails, to credit the lending institution if requested, and to return the film promptly after the screening. Both the lending institution and the borrower should coordinate shipping well in advance of the screening



to be sure there are no delays and the print arrives in time to prepare it for projection. Lending institutions typically inspect film for damage before and after a screening. Loaning film prints for continuous gallery exhibition works a bit differently. Because the prints are meant to be worn out over the course of the exhibition, multiple prints must be ordered and struck. This means that the prints should be ordered six to twelve months in advance of the first screening to ensure they are ready on time and no corrections are needed. The lending institution may want to coordinate with the borrower to ensure that a competent technician is installing the work and that the staff is adequately trained on film equipment. At the end of an exhibition, films should be returned to the institution or individual who loaned them unless directed otherwise. Damaged prints are often destroyed at the end of an exhibition; however, if prints were well cared for during exhibition and are still in very good condition, the lending institution could extend the life of the print by having a lab clean it and put it into service as a screening or backup print. The loaning of DCP files has become increasingly common. Typically, DCPs are delivered on an appropriately formatted hard drive installed in a CRU sled carrier (specifically the CRU DX115 DC), which is designed to be inserted into a cinema server for a quick and easy ingest. DCPs may be delivered on external USB drives, formatted according to spec, but the standardization of CRU drives makes them the most reliable and quickest to ingest. All DCP drives should be formatted EXT3. If a DCP is screening at multiple venues in a short time span, the lending institution should provide each venue with its own drive, just as they would a film print.

20.9 Conclusion In the years to come, the availability and use of analog film will continue to decline. Simultaneously, there will be advances in relevant digital technologies, including film scanners, sensor resolutions, display and projection, and post-processing and restoration software. As such, it will become cheaper and easier to digitize, exhibit, and archive digital copies of analog films, while it becomes more difficult and expensive to duplicate them on film. To an extent, the survival of “film on film” in a post-analog world depends on filmmakers and archives continuing to use it and exhibitors continuing to exhibit it. For now, many artists insist on film projection as a technical requirement for displaying their artwork, while archivists and conservators still depend on film to produce reliable long-term archival carriers. While digital reproductions of analog films can be excellent facsimiles, allowing broader access and often making exhibiting films easier, digital files are fundamentally different from their analog counterparts. Analog film once made up an enormous industry, and it was the format of choice for motion pictures for over 100 years. Even once new analog film stocks are completely discontinued, there will remain massive amounts of existing film elements to collect, care for, exhibit, and digitize.

Acknowledgments The authors wish to thank Walter Forsberg, who reviewed and provided helpful feedback on a draft of this chapter. John Klacsmann wishes to thank Sarah Whitman-Salkin, who provided crucial encouragement and support during the long process of writing this chapter.


Appendix 20.1 ANTHOLOGY FILM ARCHIVES—FILM INSPECTION REPORT



ANTHOLOGY FILM ARCHIVES FILM INSPECTION REPORT

INSPECTED BY:

FILM TITLE:
FILMMAKER:
Archive #:                    # of Reels:
Former Archive #(s):          Shelf Location:
Edge Code:                    Gauge:
Year of Prod:                 FPS:
Aspect Ratio:

COLOR / B/W
SILENT / SOUND (specify below)
NEGATIVE / POSITIVE
OPTICAL (VARIABLE AREA / VARIABLE DENSITY)
SEPARATE ELEMENT (specify):

OVERALL PRINT CONDITION

Physical condition:  excellent / good / mediocre / poor
Image:               excellent / good / mediocre / poor
Sound:               excellent / good / mediocre / poor

0 = NONE/NEGLIGIBLE   1 = LIGHT   2 = FAIR   3 = MODERATE   4 = HEAVY
These are just estimates–use your best judgement

REEL #                                            1  2  3  4  5  6  7  8  9  10
EMULSION SCRATCHES (WHITE, GREEN, YELLOW, BLUE)
BASE SCRATCHES (GRAY OR BLACK)
PERFORATION DAMAGE
FADING (PINK OR RED)
SPLICES (EXACT # OF SPLICES PER REEL)
VINEGAR
COUNTDOWN LEADER (Y/N)
LENGTH (IN FEET)
SHRINKAGE (%)

NOTES:

(please continue on back)


Bibliography

Adelstein, Peter Z. IPI Media Storage: Quick Reference, 2nd ed. Rochester, NY: Image Permanence Institute, 2009.
American Federation of Arts. A History of the American Avant-Garde Cinema: A Film Exhibition. New York: American Federation of Arts, 1976.
Ascher, Steven, Edward Pincus, and David W. Leitner. The Filmmaker’s Handbook: A Comprehensive Guide for the Digital Age. New York: Plume, 2019.
Association of Cinema and Video Laboratories. Handbook: Recommended Procedures for Motion Picture and Video Laboratory Services, 4th ed. Hollywood, CA: Association of Cinema and Video Laboratories, 1982.
Balsom, Erika. After Uniqueness: A History of Film and Video Art in Circulation. Film and Culture. New York: Columbia University Press, 2017.
Bigourdan, Jean-Louis, Liz Coffey, and Dwight Swanson. “Film Forever.” Accessed December 30, 2020. http://filmforever.org/.
Blasko, Edward, Benjamin A. Luccitti, Susan F. Morris, and Eastman Kodak Company, eds. The Book of Film Care, 2nd ed. Kodak Publication H-23. Rochester, NY: Eastman Kodak Company, Motion Picture and Television Imaging, 1992.
Brendel, Harald. “The ARRI Companion to Digital Intermediate.” n.d. http://dicomp.arri.de/digital/digital_systems/DIcompanion/index.html.
Brown, Harold, and Camille Bolt-Wellens. Physical Characteristics of Early Films as Aids to Identification. Brussels: FIAF Fédération internationale des archives du film, 2020.
Case, Dominic. Film Technology in Post Production, 2nd ed. Oxford and Boston: Focal Press, 2001.
Cherchi Usai, Paolo. Silent Cinema: An Introduction. London: BFI Publishing, 2003.
Comer, Stuart, ed. Film and Video Art. London: Tate Publishing, 2009.
Digital Cinema Initiatives. “DCI Specification, Version 1.4.” July 20, 2020. www.dcimovies.com/specification/index.html.
Dolby Laboratories. “Dolby Stereo Technical Guidelines for Dolby Stereo Theaters.” 1994.
Eastman Kodak Company. “Use of Perchloroethylene in Motion Picture Wet-Gate Printing.” 2003.
———. “The Essential Reference Guide for Filmmakers.” 2007. www.kodak.com/en/motion/page/essential-reference-guide-for-filmmakers.
———. A Guide to Identifying Year of Manufacture for KODAK Motion Picture Films. April 2013. https://studylib.net/doc/18157524/a-guide-to-identifying-year-of-manufacture-for-kodak-motion.
———. “Glossary of Motion Picture Terms.” 2022. www.kodak.com/en/motion/page/glossary-of-motion-picture-terms.
———. “Chronology of Film: History of Film from 1889 to Present.” n.d. www.kodak.com/en/motion/page/chronology-of-film.
———. Storage and Handling of Processed Nitrate Film. Accessed July 26, 2021. www.kodak.com/en/motion/page/storage-and-handling-of-processed-nitrate-film.
“EmbARC (Metadata Embedded for Archival Content).” n.d. www.digitizationguidelines.gov/guidelines/embARC.html.
Federal Agencies Digitization Guidelines Initiative Audio-Visual Working Group. “Digitizing Motion Picture Film: Exploration of the Issues and Sample SOW.” April 18, 2016. www.digitizationguidelines.gov/audio-visual/.
———. “Guidelines for Embedded Metadata within DPX File Headers for Digitized Motion Picture Film.” April 23, 2019. www.digitizationguidelines.gov/audio-visual/.
FIAF International Federation of Film Archives. “Resources of the Technical Commission.” Accessed August 20, 2021. www.fiafnet.org/pages/E-Resources/Technical-Commission-Resources.html.
FIAF Technical Commission. “The Digital Statement: Recommendations for Digitization, Restoration, Digital Preservation and Access.” n.d. www.fiafnet.org/pages/E-Resources/Digital-Statement.html.
———. “Choosing a Film Scanner.” 2016. www.fiafnet.org/images/tinyUpload/E-Resources/Commission-And-PIP-Resources/TC_resources/Film%20Scanners%20v1%202.pdf.
———. FIAF Technical Commission Preservation Best Practice. 2009. www.fiafnet.org/images/tinyUpload/E-Resources/Commission-And-PIP-Resources/TC_resources/Preservation%20Best%20Practice%20v4%201%201.pdf.
Fossati, Giovanna. From Grain to Pixel: The Archival Life of Film in Transition. Amsterdam: Amsterdam University Press, 2018.
George Eastman Museum. “The Art of Film Projection: A Beginners Guide.” 2019. https://www.worldcat.org/title/art-of-film-projection-a-beginners-guide/oclc/1090863824.
Giardina, Carolyn. “Studios Re-up Kodak Deals to Keep Celluloid Film Alive.” The Hollywood Reporter, January 29, 2020. www.hollywoodreporter.com/movies/movie-news/studios-up-kodak-deals-keep-celluloid-film-alive-1274709/.
Green, Phil. “The Digital Intermediate Internet Guide.” n.d. www.digital-intermediate.co.uk/.
IPI Image Permanence Institute. “User’s Guide for AD Strips Film Base Deterioration Monitors.” Filmcare.org, 2020.
———. “FilmCare.Org.” n.d. Accessed December 30, 2020. https://filmcare.org/.
———. “Visual Decay Guide | Dye Fade.” Filmcare.org, 2021. www.filmcare.org/vd_dyefade.php.
Kallenberger, Richard H., Stuart Blake Jones, and George D. Cvjetnicanin. Film into Video: A Guide to Merging the Technologies. Oxford: Focal Press, 2000.
Kattelle, Alan. Home Movies: A History of the American Industry, 1897–1979, 1st ed. Nashua, NH: Transition Publishing, 2000.
Kilchesty, Albert, ed. Big as Life: An American History of 8mm Films. San Francisco, CA: San Francisco Cinematheque, 1998.
Koppe, Egbert, and Jorg Houpert. “DIN SPEC 15587: The New Standard for the Digitization of Cinematographic Film.” 2019. www.cube-tec.com/en/news/announcements/fkt-din-spec-15587.
Lucasfilm Ltd. “THX Theater Alignment Program Recommended Guidelines for Presentation Quality and Theater Performance for Indoor Theaters.” 2000. https://closinglogosgroup.miraheze.org/wiki/THX_Theatre_Alignment_Program.
National Film and Sound Archive of Australia (NFSA). Technical Preservation Handbook. n.d. Accessed August 20, 2021. www.nfsa.gov.au/preservation/guide/handbook.
National Film Preservation Foundation (U.S.), ed. The Film Preservation Guide: The Basics for Archives, Libraries, and Museums. San Francisco, CA: National Film Preservation Foundation, 2004.
Parth, Kerstin, Oliver Hanley, and T. Ballhausen. Works in Progress: Digital Film Restoration Within Archives. Vienna: Synema, 2013.
Pittaro, Ernest M., ed. Photo Lab Index: The Cumulative Formulary of Standard Recommended Photographic Procedures, 2nd compact ed. Dobbs Ferry, NY: Morgan & Morgan, 1979.
“RAWcooked (Software) (version 21.01).” MediaArea, n.d. https://mediaarea.net/RAWcooked.
Read, Paul, Mark-Paul Meyer, and Gamma Group, eds. Restoration of Motion Picture Film. Butterworth-Heinemann Series in Conservation and Museology. Oxford and Boston: Butterworth-Heinemann, 2000.
Reilly, James M. IPI Storage Guide for Acetate Film, minor revision 1996. Rochester, NY: Image Permanence Institute, 1993.
Reilly, James M., and New York State Program for the Conservation and Preservation of Library Research Materials. Storage Guide for Color Photographic Materials: Caring for Color Slides, Prints, Negatives, and Movie Films. New York: University of the State of New York, New York State Education Department, 1998.
Ruivo, Céline, and Anne Gant. “Digital Statement Part V: Survey on Long-Term Digital Storage and Preservation.” FIAF Technical Commission, April 2019. www.fiafnet.org/images/tinyUpload/2019/04/Preservation_Digital_Statement_Final.pdf.
Sætervadet, Torkell. The Advanced Projection Manual. Brussels: FIAF, 2006.
———. FIAF Digital Projection Guide. Brussels: FIAF, 2012.
“SAVEFILM (Website).” n.d. Accessed December 13, 2021. www.savefilm.org/.
Shanebrook, Robert L. Making Kodak Film. Rochester, NY: Robert L. Shanebrook, 2016.
Sitney, P. Adams. Visionary Film: The American Avant-Garde 1943–2000. Oxford: Oxford University Press, 2002.
SMPTE. Storage of Motion-Picture Films. SMPTE, May 1, 1994.
———. Standard for Motion-Picture Film: Nomenclature for Studios and Processing Laboratories. SMPTE, May 28, 1996.
Thompson, Kristin, and David Bordwell. Film History: An Introduction, 3rd ed. New York: McGraw-Hill Higher Education, 2010.
Throup, David, and Bob Pank. Film in the Digital Age. Newbury, Berkshire, England: Quantel, 1996.
Volkmann, Herbert. Film Preservation. London: FIAF International Federation of Film Archives, Preservation Committee, 1966.
Wilhelm, Henry Gilmer, and Carol Brower. The Permanence and Care of Color Photographs: Traditional and Digital Color Prints, Color Negatives, Slides, and Motion Pictures, 1st ed. Grinnell, IA: Preservation Pub. Co., 1993.
Wilsbacher, Greg, and Tommy Aschenbach. “AEO-Light (Software) (version 2.x). Soundtrack Extraction from Film Scans.” 2019. https://usc-imi.github.io/aeo-light/.

405

21 CARING FOR SLIDE-BASED ARTWORKS Jeffrey Warda

Editors’ Notes: Jeffrey Warda is Senior Conservator, Paper and Photographs, at the Solomon R. Guggenheim Museum in New York where he oversees the study, conservation, and display of the museum’s works on paper and photographic works. The ability to preserve and exhibit slide-based artworks relies on the production of high-quality slide duplicates because slides will fade within weeks of display in a projector. As photography labs have stopped offering traditional, analog film-to-film duplication, alternative duplication options involve a range of digitization methods and output processes that require considerable expertise and supervision by collection caretakers. In this chapter, Warda introduces best practices for collecting, displaying, storing, and loaning slide-based artworks and systematically builds the knowledge base necessary to navigate the challenges of post-analog slide duplication. Projection hardware and syncing technologies common in the art world are characterized in detail, and selected case studies from the Guggenheim’s collection offer real-life conservation scenarios.

21.1 Introduction

Similar to motion picture film, the preservation and display of analog, slide-based artworks rely not only on the storage of original film materials but also on the production of duplicate copies so these works can be exhibited in the future. Since slides start to fade after a matter of hours in a projector, the only way to exhibit these works—and for them to live on—is through duplication. As many collecting institutions have learned in recent years, it is this latter task that has become particularly challenging. For the Solomon R. Guggenheim Museum, New York, USA, the issue came to a head in 2011 while preparing a slide work for an upcoming exhibition, Found in Translation. The Guggenheim needed to produce roughly 3,000 exhibition copy slides for a recent acquisition, a 13-projector artwork by Sharon Hayes (b. 1970) titled In the Near Future (2009), consisting of 1,053 color 35 mm slides (see fig. 21.1). Copying that many slides was a tall order, but it seemed like a relatively straightforward duplicating project, especially since the museum had recently used a local photo lab to complete a similar project successfully. However, every time


DOI: 10.4324/9781003034865-25


Figure 21.1 Installation view of Sharon Hayes, In the Near Future (2009), thirteen 35 mm color slide projections, edition 1/3, Solomon R. Guggenheim Museum, New York (2010.12) in Found in Translation at the Solomon R. Guggenheim Museum, January 28–April 10, 2012. Photo: Kristopher McKay

conservation staff met with the artist at the lab to color-proof duplicate test slides for this new project, they were dissatisfied with very high-contrast images that did not adequately match the originals. After multiple trips and growing frustration, a slide mount was eventually pried open, only to reveal that the lab had been using a regular camera slide film (Fujifilm Provia 100F RDP-III) instead of specially formulated Kodak or Fujifilm slide duplicating film (see fig. 21.2). When pressed, the lab revealed—to great dismay—that both Kodak and Fujifilm slide duplicating films had been discontinued a few years prior and that its backstock had been used up. Slide duplicating films are essential for analog reproduction of color slides in large part because they are specifically designed by manufacturers to provide much lower contrast than regular camera slide films. The linear, reduced contrast of duplicating film allows for accurate reproduction of subtle details in the highlights and darks of original film materials (Eastman Kodak 1984a). Switching to regular camera slide film to duplicate Hayes' work caused an unacceptable loss of detail in the darks and highlights and altered the color saturation (see fig. 21.2); the wide tonal range of the original slides had been unacceptably compressed. With the installation date fast approaching, a different lab was chosen, but the technician responsible for slide duplicating there had recently retired and the lab had started to discard its mounting equipment. Desperate calls were made to other photo labs, all of which confirmed that they too had stopped offering slide dupes. A few companies said they still offered the service, but when asked what kind of film



Figure 21.2 Comparison of regular versus slide duplicating film from test slides of Sharon Hayes’ In the Near Future (2009). Left: copy of a camera original slide with Fujifilm CDU Type II duplicating film with accurate tonal range; middle: copy of the same camera original slide with regular slide film (Fujifilm Provia 100F RDP-III)—note the compressed tonal range and loss of details in the darks; right: same as the middle image, but the film was pulled during processing to try to reduce contrast. Note unacceptable loss of detail in the highlights. Photo: Jeffrey Warda

they were using, it was never a slide duplicating film. Remarkably, the retired technician at the new photo lab was hired back and began making calls to locate remaining duplicating film stock—eventually sourcing some expired 30-meter rolls of duplicating film in a freezer at Fujifilm in Japan. What had previously been common practice came to an unexpected and alarming halt, threatening the long-term viability of slide-based artworks in collections. It was a moment of panic and of realization that time was running out to preserve slide-based artworks and keep them accessible long-term. After struggling through the analog slide duplication of Hayes' work in 2011, the Guggenheim ultimately shifted to a process of scanning or digitizing slides and then using a lab to write the images back to film with a film recorder. Throughout this period, the cost of duplicating slides has only increased, and there are now only a handful of photo labs in the world able to make accurate duplicates to high standards, often using obsolete equipment with limited life spans. Even slide mounts have become more difficult to source and often must be found on secondary markets like eBay. In response, some institutions, such as Tate, London; the Hamburger Kunsthalle, Germany; and the Guggenheim, launched projects to make multiple extra copies of slide works as the best method to ensure their display in the future. See Weidner's summary of Tate's project, titled Dying Technologies: The End of 35 mm Slide Transparencies (Weidner 2012a), and the Hamburger Kunsthalle's project in Back to the Future: Riding the Slide Carousel (Sommermeyer and van Haaften 2019). While the landscape has changed considerably, there are still viable methods to reproduce slides to high standards.
However, the future of this process is uncertain and is dependent on the availability of color reversal slide film, access to specialized equipment needed for duplication, and whether the few remaining photo labs, photographers, and technicians will continue to offer the service. Slide duplicating is a collaborative process that relies on expertise and tools provided by others outside of a collecting institution. This chapter gives a brief history



surrounding slides and provides insight into the acquisition, duplication, storage, and display of slide works within collections.

21.2 History of the Slide Projector Show

The ubiquitous 35 mm slide show, popularized in homes with family slides from the 1950s through the 1980s, was preceded by a much earlier type of projector called the magic lantern, which projected images on glass plates. The magic lantern is largely credited to Christiaan Huygens (1629–95), a Dutch astronomer, mathematician, and physicist who used a concave mirror, candlelight, and a lens—all housed in a wooden box—to project painted images on glass (Mannoni and Crangle 2000). By the mid-1800s, magic lantern slide shows in theaters were common, and eventually a wet plate collodion process was developed that allowed photographic images on glass plates. After 1890, this process was largely replaced by gelatin dry plates, which were easier to develop and handle, making slide shows more commercially available. These photographic images could be hand-colored and were projected with an ever-advancing assortment of magic lantern projectors. Various photographic processes were used to produce images on glass lantern slides, but it was the Lumière brothers who finally perfected the color positive image with the autochrome process in 1907 (Lavédrine 2009). Glass lantern slides were eventually replaced with film, beginning in 1936 when the Eastman Kodak Company, USA, introduced Kodachrome in a 35 mm roll film. Kodachrome was the first commercially successful color reversal (slide) film, based on the chromogenic process. 35 mm film allowed for smaller cameras and for slide shows to move from the theater to the home, spawning huge advances in film types and a new array of slide projectors and related equipment. Table 21.1 lists the major color reversal slide film formats that could be used in projectors. Fig. 21.3 illustrates typical film and slide mount sizes. In tandem with the introduction of the slide projector, Kodak introduced the standard 50 × 50 mm (2 × 2 in.)
paper slide mount in 1937 (a year after introducing Kodachrome film), and many other manufacturers followed with different films, mounts, and projectors. Kodak's early projectors, beginning in 1937–40, were geared more toward home use. As the technology advanced, more robust projectors were introduced, starting with Kodak's Ektagraphic projectors from the late 1960s into the 1970s and 1980s, followed by the professional Ektapro line introduced in 1992 (Eastman Kodak 1984b). With these advancements, slide shows became increasingly complex, with the ability to run multiple projectors in sync for multimedia or "multi-image" productions, as they were called, most often for a commercial market (e.g., corporate product launches, displays at tourist destinations) or for professional meetings. Artists quickly appropriated these methods, adopting the full range of possibilities from the most basic single projection to elaborate productions with multiple synced projectors and audio. Some of the earliest slide works, from the late 1960s, emerged through the act of documenting performance art or remote land art pieces (Sobieszek and Smithson 1993; Alexander et al. 2005). Through narrative, sound, or sequencing, still images could be transformed to create an experience for a gallery viewer that was both intimate and captivating in a way that framed photographic prints could not equal. Other artists used slides to create completely new works, in many ways culminating in Nan Goldin's (b. 1953) The Ballad of Sexual Dependency (1985), consisting of close to 700 color slides set to an audio soundtrack with multiple projectors. A detailed survey of the medium is provided in the exhibition catalog Slideshow, curated by Darsie Alexander at the Baltimore Museum of Art in 2005.



Figure 21.3 Common slide mounts with mount opening dimensions and film placement, for formats including 110, 135 (full frame, half frame, and Super 35), 828, 126, 127 "Superslides," and 120 medium format (645, 6 × 6, and larger sizes including 6 × 7, 6 × 8, and 6 × 9). See corresponding Table 21.1 for more information, including larger medium-format slides not illustrated here. Illustration: Jeffrey Warda

21.3 Acquisition of Slide-Based Artworks When slide-based artworks enter collecting institutions, there are important aspects to evaluate for long-term preservation and access. A slide work should ideally include related components (often referred to as “deliverables” when part of an acquisition), each with specific purposes


Table 21.1 Common slide film formats and mounts with sizes

Film Width (mm) | Format | Description | Film Image Frame Size (mm) | Slide Mount Window Size (mm) | Slide Mount Overall Size (mm) | Years in Production
16 | 110 | Pocket Instamatic cartridge | 13 × 17 | 12 × 16 | 30 × 30, or 50 × 50 in 2 × 2 in. adaptors | 1972–82
35 | 135 | 35 mm SLR cameras | 24 × 36 | 23 × 34 | 50 × 50 | 1934–present
35 | 135, half frame | Half-frame 35 mm camera | 24 × 18 | 24 × 17 | 50 × 50 | 1934–present (popular 1960s)
35 | 135 | Super 35 | 29.4 × 37.4 | 28 × 35 | 50 × 50 |
35 | 828 | Bantam camera film | 28 × 40 | 27 × 39 | 50 × 50 | 1935–85
35 | 126 | Instamatic cartridge | 28 × 28 | 26.5 × 26.5 | 50 × 50 | 1963–2008
46 | 127 | Super 127 "Superslides" | 41 × 41 | 38 × 38 | 50 × 50 | 1912–95
62 | 120 (220, 620 for longer rolls) | Roll film without sprockets; medium format (2-1/4): 6 × 4.5 (645), 6 × 6, 6 × 7 (67), 6 × 8 (68), 6 × 9 (69) | 54 × 40.2, 54 × 54, 54 × 66, 54 × 76, 54 × 86 | 53 × 40, 53 × 53, 53 × 65, 53 × 68, 53 × 82 | 70 × 70, 70 × 70, 85 × 85, 85 × 85 and 105 × 105, 105 × 105 | 1901–present
70 | 116 (616) | 70 mm motion picture film can be used to copy 120 medium-format slides; cut to fit into 120 format mounts | various 120 format sizes | various 120 format sizes | 85 × 85, 85 × 100, 82 × 100 | 1899–1984
85 (typical glass plate) | n/a | Glass lantern slide | n/a | 69 × 75 | 82.5 × 101.7 (3-1/4 × 4 in.) | 1850–1950

presented later. The list that follows represents a best-case scenario, and many factors may make some of these items impractical or simply not possible to acquire. If a slide work is already held in a collection, it often falls on the current owner to create some of these components to help ensure the artwork has a viable long-term future. Whenever possible, the artist, foundation, or gallery should be consulted before embarking on a project to create new components. As is true for many time-based media artworks, what may seem self-evident to a caretaker may actually not result in an accurate representation of the artist’s visual (or conceptual) intent.



The terms used for the components listed here are suggestions; other institutions may use slightly different language. The exact vocabulary is less important than finding terminology that your institution agrees on and uses consistently. In particular, adding a short descriptor of the component's maker provides a quick way to distinguish between artist-provided materials and those produced by others. The Guggenheim uses the following descriptors based on who produced the slides: Artist, Museum, or Gallery. For example, the "Museum Exhibition Copy" refers to slides produced by the museum or institution, while the "Artist's Exhibition Copy" refers to slides produced directly by the artist. The term "set" is used when there is more than one slide for the artwork (e.g., a set of 80 slides), and the term "file" is used for digital images. The materials listed in the next sections should be cataloged in a collections management system to identify and track them as components or accessories to the artwork. Examples of actual component descriptions for an artwork follow at the end of this section to provide context.

21.3.1 Components

Master—Original

The "master" or "master set" refers to the original slides that the artist shot in a camera (also known as camera originals). Alternatively, a master set could be the digital files supplied by the artist from which subsequent slides are derived. High-resolution scans of camera originals can also be referred to as the "archival master files." Acquiring camera originals for a slide work is advisable because they represent the best-quality image, without generational loss from subsequent duplication campaigns. When a slide work is part of a larger edition with multiple owners, it may not be feasible to request the camera originals, or some artists may simply not agree to release their original slides to another entity. It may make more sense for a collecting institution to temporarily borrow the camera original slides from the artist or foundation to allow for in-house digitization of the originals, in the interest of securing the best-quality image files. The ability of a collecting institution to offer long-term dedicated cold storage for camera original slides may help in these negotiations. See Section 21.10 for a discussion of cold storage benefits. Camera original slides are unique and should not be projected, due to the risk of fading, except for short periods when necessary. For slide works where the content was originally captured with a digital camera (not with film), the master set would be the digital image files. It is important to understand the properties of any files that are acquired, for long-term preservation and use. In the case of master files, all image adjustments should already have been saved to the file to the artist's specifications and stored in an uncompressed image format such as TIFF (Tagged Image File Format).
Images shot in a proprietary camera raw format should similarly be processed by the artist with raw processing software, such as Adobe Camera Raw, and saved as TIFF files at 16 bits/channel in Adobe RGB (1998) (Rieger 2016), ProPhoto RGB, or another wide-gamut color space such as eciRGBv2. Unlike TIFF or JPEG (Joint Photographic Experts Group) files, raw image files are not processed with white balance, tonal range, exposure, and many other adjustments finalized or "locked" into the file. Raw images can open with different processing parameters applied on different computers, so they are not reliable images of record for an artwork. If raw files are acquired, they should at minimum be converted by the artist from the proprietary camera manufacturer's raw format (e.g., Nikon's NEF or Canon's CR2) to the Adobe Digital Negative



(DNG) format. DNG allows image processing adjustments to be saved directly within the file instead of to external sidecar files, which must travel with the file to be accessible. The DNG format allows image adjustments made on one computer to be available if the image is moved and opened on a different computer (Warda et al. 2011). If raw DNG files are acquired, derivative TIFF master files should also be created and stored. Similarly, if JPEG files are supplied by an artist, these should be retained, but they should also be duplicated and saved as uncompressed TIFF files separately. Alternatively, some artists shoot analog film, scan their film, make adjustments in image editing software, such as Adobe Photoshop, then re-output back to slide film with a film recorder (see Section 21.7 for a description of film recorders). In this case, the camera original slides would not be acquired as they are an intermediary step in the artist’s working process. For a work created in this fashion, the final edited digital files would constitute the master set. A master set of image files may contain layers that hold image adjustments that the artist created in Adobe Photoshop, or they may be saved in a larger bit depth (e.g., 16-bit/channel) or color space (e.g., ProPhoto RGB) than what might be recommended for printing. Similarly, the master set of files may also be at the native camera pixel dimensions (resolution) and sized larger than what would be required to make new slides. As a master, these properties should be retained, and smaller derivative files can be saved with appropriate settings for viewing or printing.

Reference—Viewing Copy

A viewing copy, or reference set of analog slides, is extremely useful because, unlike the master slides, it can be projected for testing or review. If supplied by the artist, these slides can be referred to within a collections management database as the artist's reference set. If the collecting institution produced the set after acquisition, it can be referred to as the museum reference set or the gallery reference set, as the case may be. Many slide artworks do not consist of 80 unique images filling an 80-slot carousel; instead, a smaller number of images, such as 27, may be repeated three times to fill out the carousel (with one slide in the zero slot), or a work could be 40 images repeated twice. In instances like these, the reference set should contain all 80 or 81 slides, not just the first 27 or 40. It is important that the reference set be a full realization of the artwork and include any other components, such as sound or a cue track for use with a dissolve unit or other controller. If a reference set was not originally acquired, an exhibition copy set can often be designated as such or used for testing.
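The carousel-filling arithmetic described here (27 unique images repeated three times gives 81 slides, including the zero slot; 40 images repeated twice gives 80) can be sketched as a small helper. This is only an illustration; the function name and image labels are invented, not from the chapter:

```python
from itertools import cycle, islice

def fill_carousel(unique_images, total_slots):
    """Repeat a sequence of unique images to fill every carousel slot.

    total_slots counts the zero slot, so an 80-slot carousel loaded
    with a slide in slot 0 holds 81 slides.
    """
    if not unique_images:
        raise ValueError("need at least one unique image")
    return list(islice(cycle(unique_images), total_slots))

# 27 unique images repeated three times fill slots 0-80 (81 slides);
# 40 images repeated twice would fill an 80-slide load instead.
load_order = fill_carousel([f"img_{n:02d}" for n in range(1, 28)], 81)
```

A load order generated this way can double as the numbered sequence of thumbnails used to verify that a carousel was loaded correctly.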

Duplication Master—Duplicating Copy

A duplication master set is the set of analog slides or digital files used to create new exhibition copies. A duplication master set need only contain the unique images, not all 80 slides or files, if, for example, the artwork consists of 40 slides repeated twice to fill a carousel. If film is held, it should have been produced from the camera originals using slide duplicating film and should match the originals in terms of color, tonal range, and cropping. Although it was common practice to hold a duplication master set of slides in a collection, this type of film-to-film copying has limitations due to the generational loss of image quality that occurs when not copying directly from the camera originals. Duplicates from duplicates will not match the image quality of duplicates from camera originals, especially in terms of sharpness. If digital



files are used to reproduce a work, the duplication master set may be derivatives of the master set of files, but sized to the correct output film size (e.g., 24 × 36 mm) at the resolution (in pixels per inch) specified by the printer, with any adjustment layers flattened, in an appropriate image bit depth (typically 8 bits/channel), and with an embedded color space [typically Adobe RGB (1998) (Rieger 2016)]. Files should be saved in an uncompressed image format such as TIFF.
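The relationship between the output film size and the printer-specified resolution is simple arithmetic: pixels = millimeters ÷ 25.4 × ppi. A minimal sketch, with the 2,000 ppi figure chosen purely for illustration (a lab will specify its own value):

```python
def pixels_for_film(width_mm, height_mm, ppi):
    """Pixel dimensions needed for a film frame at a printer-specified ppi."""
    mm_per_inch = 25.4
    return (round(width_mm / mm_per_inch * ppi),
            round(height_mm / mm_per_inch * ppi))

# A 36 x 24 mm (135 format) frame at a hypothetical 2,000 ppi:
width_px, height_px = pixels_for_film(36, 24, 2000)  # -> (2835, 1890)
```

Computing the target dimensions up front makes it easy to confirm that the derivative files are neither undersized (soft output) nor needlessly large for the recorder.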

Proof Set

A proof set is important when using any duplicating process other than analog slide duplicating. If sending digital files to a lab to have them output back to film, the lab must have an approved set of physical slides to proof against to determine contrast, tonal range, and color. Proof sets are equally important when producing black-and-white slides. There are far too many variables in processing chemistry to expect accurate slides if the lab does not have a proof set of slides to compare against. Asking a lab to compare their test slides against digital image files on their computer monitor is not recommended—even with a profiled monitor. If a proof set was never designated by an institution, a duplication master set could be used for proofing, or even an approved exhibition copy set.

Exhibition Copy

After completing a slide duplication process, ideally there will be additional exhibition copy sets to store away for future use. Note that some institutions refer to the exhibition copy set as the duplication set. An institution may decide to request a batch of exhibition copy sets at the time of acquisition directly from the artist or to include the estimated cost of producing additional exhibition copy slides post-acquisition as part of the acquisition budget process. Since it is impossible to forecast how many times a work will actually be displayed in the future, deciding on how many exhibition copy sets an institution should acquire or produce is challenging and imprecise. For color slide works, two or three sets of slides are typically needed to fulfill the needs of one exhibition, so an institution will have to decide on a minimum number of sets it can produce within a particular budget with estimates for future use.

Artist Interviews

Whenever possible, an interview with the artist should be part of the acquisition process, preferably in person and in front of the artwork. It is often the case that issues not apparent to the collecting institution are only revealed upon setting up and running a work with the artist present. Logistically, this may not be possible, and a phone or video conference may be necessary. For works already in a collection, reaching out to the artist is a critical component of understanding an artwork, how it should be displayed, and the options for duplication of exhibition copies. With the consent of all parties, interviews should be recorded with audio or video. Audio should be transcribed to make the information more accessible to future caretakers and researchers. When discrepancies or condition issues are found in a slide work, it is imperative that the artist be engaged whenever possible. Photography is somewhat unique in that the final product relies on many outside factors often beyond the control of any one individual. The type of lab equipment, the film used, the condition of the processing chemicals, and the skill level of the operator at a photo lab all influence how the final image will look. Aesthetic parameters, like contrast, tonal range, and color, are highly subjective. It may be the case that what was delivered to the


collecting institution does not fully represent what the artist was striving for and may include the minor compromises that photography often demands. There is a fine line that can easily be crossed, though: the technology the artist originally used is an important frame of reference for an artwork within its historical context. Caretakers of collection works should be cautious about "improving" an artwork simply because better technology or resources are available, though there may be valid cases where such changes are appropriate.

Installation Instructions

Despite best intentions, artworks are occasionally installed by institutions in a way that does not fully capture the artist's preference—often attributable to a lack of comprehensive installation instructions. Artist-provided installation instructions are essential to obtain at the time of acquisition, but it is often necessary for a collecting institution to also produce its own instructions, which incorporate the artist's guidelines and expand on additional issues that arise when the institution lends the artwork. The scope of installation instructions will vary but should include

• specifications on the projector type (if applicable) and power requirements;
• ideal projected image size (including a range of acceptable sizes);
• timing for each slide;
• sync information for works with multiple projectors;
• audio information and instructions, including equipment and formats (if applicable);
• type of projector stand or pedestal (if visible or hidden from view);
• room or space requirements (e.g., a size range, wall color, room lighting, seating); and
• a numbered sequence of images as thumbnails to verify the slides were loaded into the projector carousel correctly.

It is important to include a range of options (if allowed), such as minimum and maximum projected image sizes to accommodate different situations where the ideal setting may not be possible. If an artist has filled out an artist’s questionnaire form or agreed to an interview with the collecting institution, elements from these interactions can be incorporated into the institution’s installation instructions.
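The idea of recording an ideal size plus an allowed minimum and maximum can be expressed as a simple validation step during installation planning. This sketch is hypothetical; the function name and the meter values are inventions for illustration, not values from the chapter:

```python
def projection_size_ok(proposed_m, ideal_m, min_m, max_m):
    """Check a proposed projected image width against the allowed range."""
    if not (min_m <= proposed_m <= max_m):
        return False, f"outside allowed range {min_m}-{max_m} m"
    if proposed_m != ideal_m:
        return True, f"acceptable, but ideal width is {ideal_m} m"
    return True, "matches ideal width"

# Hypothetical instructions: ideal 3.0 m wide, anywhere from 2.0 to 4.0 m allowed.
ok, note = projection_size_ok(2.5, ideal_m=3.0, min_m=2.0, max_m=4.0)
```

Encoding the range explicitly lets a borrowing venue confirm in advance whether its available wall can accommodate the work.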

21.3.2 Cataloging Components

For each duplication master, exhibition copy, or reference set of slides created or acquired (both film-based and digital files), ensure there are component numbers for each set within a collections management system or database. Component numbers are typically a string of sequential numbers following the main accession number for a collection object. Each component should have a descriptive name (e.g., "Artist's reference copy set of 40 slides, mounted") and include a narrative description that provides as much detail as possible to explain the context. Main topics that should be addressed include:

• description of who made the slide or sets of slides and why,
• the production date,
• the equipment used,
• the type of film,
• which component was used as duplication master (slides or digital files),
• the method for proofing to ensure accuracy,
• the total quantity of slides or sets produced, and
• the type of slide mounts (e.g., glass or non-glass mounts, paper or plastic).
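These cataloging fields can be gathered into a small record type whose fields mirror the list. This is an illustrative sketch, not a prescribed schema; the component number suffix and description text below are invented:

```python
from dataclasses import dataclass

@dataclass
class SlideComponent:
    """A cataloged slide set, numbered under the main accession number."""
    component_number: str    # sequential string following the accession number
    name: str                # descriptive name, including the maker descriptor
    maker: str               # "Artist", "Museum", or "Gallery"
    description: str         # narrative: who/why, date, equipment, film,
                             # duplication master used, proofing, quantity, mounts
    exhibited: bool = False  # once displayed and faded: mark "do not use"

# Hypothetical record; the ".2" component suffix is invented for illustration.
exhibition_set = SlideComponent(
    component_number="2010.12.2",
    name="Museum Exhibition Copy set of 81 slides, mounted",
    maker="Museum",
    description=("Produced by scanning the camera originals and writing the "
                 "files back to film with a film recorder; proofed against the "
                 "approved proof set; plastic, non-glass mounts."),
)
```

Keeping the maker descriptor and the "exhibited" flag as structured fields, rather than burying them in free text, makes the "do not use" state easy to query later.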

This information is valuable for future custodians and will help guide decisions. Once a set has been exhibited and has presumably faded, make sure to mark the outside of the slide container in storage and the component description within your collections management system as "exhibited—do not use" or some other notation to ensure the slides are not used again by accident. Given the fragile state of slide duplication, it is probably a good idea to retain used slides and not discard them. If storing digital image files, their component descriptions should ideally include the file properties for each file type within a collections management system. This is beneficial to record so that future users will be able to discern file characteristics without having to access the file. It is also a simple way to maintain an external reference (independent of the actual files) that defines the pixel dimensions and other aspects of the files. Image file properties can be gathered with an image browsing program such as Adobe Bridge, which allows the user to view file properties within the metadata pane without actually opening the file. File property component descriptions should include

• pixel dimensions,
• image size,
• resolution in ppi,
• color space,
• bit depth, and
• file format.

It is equally helpful to include a descriptive text document, saved in the same folder as the image files, that addresses the same who, what, when, where, and why questions. This can be a simple rich text file (RTF), plain text file (TXT), or portable document format (PDF) file, named with the accession number and “Read Me,” such as “2009–87–4_READ_ME.rtf.” For further information on inventory and database registration, see Chapter 12.
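If read-me files are maintained for many works, they can even be generated with a small script when master files are packaged. The sketch below is only illustrative; the accession number, wording, and field contents are all hypothetical, not real collection data:

```python
from pathlib import Path

def write_readme(folder, accession, lines):
    """Save a plain-text READ_ME file, named with the accession number,
    answering the who/what/when/where/why questions described above."""
    path = Path(folder) / f"{accession}_READ_ME.txt"
    path.write_text("\n".join(lines) + "\n", encoding="utf-8")
    return path

# All values below are invented examples.
readme = write_readme(
    ".",
    "2009-99-9",  # hypothetical accession number
    [
        "Who:   Duplication master files scanned by a photo lab (example).",
        "What:  Set of 15 digital image files, TIFF, Adobe RGB (1998).",
        "When:  Scanned November 2014 (example date).",
        "Why:   Source for future exhibition copy slides via CRT film recorder.",
    ],
)
```

A script like this simply keeps the read-me format consistent across works; the document itself remains an ordinary text file that future users can open without special software.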

Case Study 21.1  Lisa Oppenheim—Digital Files, Output to Film

Lisa Oppenheim’s (b. 1975) slide work The Sun Is Always Setting Somewhere Else (2006), Solomon R. Guggenheim Museum (2009.60), consists of a set of fifteen 35 mm color slides that is repeated five times to fill 75 slots of an 80-slot carousel, with a blank slot between each of the five groupings (see fig. 21.4). The artwork runs on a continuous loop without sound. The artist created the work by first making inkjet prints of sunset images she downloaded from photo-sharing sites like Flickr, posted by American soldiers stationed in Iraq. She then held up these prints, each sized like a postcard, in front of actual sunsets in New York and photographed the print with the sunset in the background. As the slide show progresses, the sun gradually goes down and the sky darkens in both scenes. Although the work was originally shot on 35 mm slide film, the artist had the film scanned and has always used the digital files to make additional slides through Gamma Tech, a photo lab in New Mexico, USA, that uses a CRT (cathode ray tube) film recorder.

Caring for Slide-Based Artworks

Figure 21.4  Installation view of Lisa Oppenheim, The Sun Is Always Setting Somewhere Else (2006), slide projection of fifteen 35 mm color slides, continuous loop, A.P., edition of 4, Solomon R. Guggenheim Museum (2009.60) in Photo Poetics: An Anthology at the Deutsche Guggenheim Berlin (now the PalaisPopulaire), July 10–August 30, 2015. Photo: Mathias Schormann

For this collection work, the Guggenheim holds a reference set (75 slides), a duplication master set of 15 digital files, a color proof set (15 slides), and multiple exhibition copy sets (75 slides each) that the museum has produced over two duplication campaigns. The following is an example of a typical component description for a set of exhibition copy slides, representing the type of information that the Guggenheim Museum captures within its collections management system for a given duplication campaign. In this case, 25 sets of exhibition copy slides were produced, with each set assigned a sequential component number and the following text pasted into the museum’s collections management database:

Component Description
Museum Exhibition Copy set of 75 slides (1 of 25 total sets produced), duplicated from the Artist’s Duplication Master set of 15 digital image files (2009.279.1). 35 mm Kodak 100D motion picture film 5285 (similar in characteristics to Kodak E100G) was used with a CRT film recorder by Gamma Tech, New Mexico, on 18 November 2014. Film recorder resolution: 4096 × 2792 pixels. The Artist’s Color Proof set of slides (2009.279.2) was sent to Gamma Tech for proofing. Final color and exposure proofed by Jeffrey Warda. Non-glass plastic mounts were used. These 25 sets of slides were produced in preparation for their display in the exhibition Photo Poetics: An Anthology (2015).


Case Study 21.2  Kathrin Sonntag—Digital Files Output to Film with Hand Alterations

Kathrin Sonntag’s (b. 1981) black-and-white slide work Mittnacht (2008), Solomon R. Guggenheim Museum (2011.15), consists of eighty-one 35 mm slides, all taken in the artist’s studio from carefully constructed scenes and juxtapositions related to 19th-century spirit photography (see fig. 21.5). The artist shot regular black-and-white negative film and reverse-processed it with a reverse-processing kit in the studio. Reverse processing allows one to shoot with negative film but process the film into a positive image. After processing, the artist noticed white processing-chemical residue on some of the film, which was deliberately retained as it seemed fitting within the context of spirit photography. Sonntag then scanned the film and did some minimal editing in Photoshop, largely to reduce some of the chemical halos in the images. She then worked with MOPS Computer GmbH, a photo lab in Münster, Germany, to write the files back to film with a CRT film recorder onto Agfa Scala black-and-white 35 mm slide film. The artist also hand-manipulated two slide mounts with paint and by either cracking the glass or driving a nail into the mount, cracking the glass (see fig. 21.6). The broken glass was then secured in place with clear pressure-sensitive tape on the outside. On acquisition, the Guggenheim received the duplication master digital files, a color proof set consisting of unmounted strips of film produced from the digital files, and one set of mounted reference copy slides. The artist also supplied six additional copies of the hand-altered slides.

Figure 21.5  Installation view of Kathrin Sonntag, Mittnacht (2008), slide projection of eighty-one 35 mm slides, continuous loop, edition 3/3, Solomon R. Guggenheim Museum (2011.15) in Photo-Poetics: An Anthology at the Solomon R. Guggenheim Museum, November 20, 2015–March 23, 2016. Photo: David Heald


Figure 21.6 Two hand-altered slides by Kathrin Sonntag, included with Mittnacht (2008). Photo: Jeffrey Warda

The following is a description of a typical duplication master set of digital files, in this case for Kathrin Sonntag’s slide work in the Guggenheim’s collection:

Component Description
Artist’s Duplication Master files, set of 79 image files provided by the artist at the time of acquisition. These files are approved for use for future exhibition copy slides created with a CRT film recorder. The files were created on 15 March 2009 on a Hasselblad Flextight X5 film scanner by the artist from the artist’s camera-original slides. Scanned files were color corrected by the artist, and dust and scratches were reduced within the Hasselblad scanning software, with additional cleanup in Adobe Photoshop. The file properties are 4092 × 2732 pixels, sized to 36 × 24 mm at 2889 ppi, in Adobe RGB (1998), 8 bits/channel, saved in the TIFF format. Note that slide number 50 is hand-colored with an altered glass mount (without film) and does not exist as a digital image file. While there is a digital image file for slide number 69, it must be placed in a mount with cracked glass, provided by the artist.

21.4 Examination and Condition Assessment

Whether a slide work has recently entered a collection or has been held for many years, it is important that the work be examined carefully to assess its condition and to gauge options for duplication, ensuring there is a viable long-term display solution. It may be helpful to create a condition-reporting template for slide-based works to help ensure every aspect of a work’s condition and characteristics is covered (see, for example, the template provided in Sommermeyer and van Haaften 2019, 165–67). The following are recommendations on different slide examination techniques.


Slide Examination

Examine each slide on a light box designed for viewing film transparencies. A transparency or photo light box typically has lamps with a daylight correlated color temperature (CCT) of 5000 K and a high color rendering index (CRI) above 90. It can be helpful to lay strips of black mat board or paper around the slides to block stray light. Use a loupe, magnifier, or stereo microscope to check for condition issues, including dust, scratches, mold, or mis-registration in the mount. If appropriate, open slide mounts to confirm the type of film used (i.e., whether it is a duplicating or a regular slide film). Mounted slides should always be documented on both sides before removing the film from the mount; see Section 21.6 on digitization for a description. Plastic mounts can usually be opened with a micro-spatula or blade edge, but opening sealed paper mounts requires a decision to permanently alter the original slide mount: once these are cut open, they cannot be reused, and it will likely be necessary to store the film separately from the original paper mounts. Ultimately, long-term storage of slide film separated from the mount is preferred (see Section 21.10 on storage), so rehousing can be an important opportunity to learn more about a collection work. Always use tweezers and powder-free nitrile gloves on a clean surface when handling film. Opening slide mounts can damage film if not done carefully, and it may be more appropriate to involve a photograph conservator in condition assessments. Film showing signs of mold should be handled carefully to prevent contamination of the work surface and other objects (Pietsch 2021).

Projection of Slides

Just as important as examining slides on a light box is projecting them on a screen to check for condition issues and to better understand the work. Viewing a large projected image reveals different aspects of the artwork, such as orientation and sequence. If digital files were also provided by the artist, projecting the physical slides and comparing them against the digital files on a large monitor helps confirm cropping and orientation and further checks for abrasions or dust, which will appear much larger when projected. Subtle differences in color should be ignored when comparing slides to digital images on a monitor. Particularly in slide-duplicating projects, some images can be flipped horizontally by accident, as some mounts lack clear orientation marks; projecting them makes this more apparent.

Production Diagrams

When multiple versions of an artwork are created over time, it can become difficult to assess the status of the various components, especially if the full details of this process are not documented. For more complicated works, or works that have been duplicated repeatedly, it can be beneficial to maintain a production diagram that visually illustrates how each component relates to the others. Notes in a collections management system are important but are often not as easily interpreted as a single graphic illustration. Production diagrams can be color-coded: typically red for the master set (indicating it is the most fragile), orange for the reference or duplication master set (less fragile but important to preserve), and green for the exhibition copy slides (indicating they can be used for exhibition) (see fig. 21.7).
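A production diagram can also be kept in a machine-readable form alongside the drawn version. The sketch below, with illustrative component numbers and labels, emits a Graphviz DOT graph using the red/orange/green convention just described:

```python
# Role-to-color mapping following the convention described in the text.
COLORS = {"master": "red", "dup_master": "orange", "exhibition": "green"}

def to_dot(components, edges):
    """components: {component_number: (label, role)}; edges: (parent, child)
    pairs showing which component was duplicated from which."""
    lines = ["digraph production {"]
    for cid, (label, role) in components.items():
        lines.append(f'  "{cid}" [label="{label}\\n{cid}", color={COLORS[role]}];')
    for parent, child in edges:
        lines.append(f'  "{parent}" -> "{child}";')
    lines.append("}")
    return "\n".join(lines)

# Illustrative components only, not a real collection record.
dot = to_dot(
    {
        "1992.56.1": ("Master Set (camera originals)", "master"),
        "1992.56.2": ("Duplication Master (analog dupes)", "dup_master"),
        "1992.56.3": ("Exhibition Copy 1/4", "exhibition"),
    },
    [("1992.56.1", "1992.56.2"), ("1992.56.2", "1992.56.3")],
)
print(dot)
```

The resulting text can be rendered with Graphviz or simply stored next to the component records as a compact description of the duplication history.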


Figure 21.7  Typical production diagram for a slide work with associated component numbers assigned.

[Diagram shows the relationships among: Master Set—Slides (Camera Originals), 1992.56.1; Duplication Master (Analog dup. slides), 1992.56.2; Exhibition Copies (Analog dup. slides) 1/4–4/4, 1992.56.3–.6; Duplication Master (Scanned Files), 1992.56.7; Proof Copy (1 set of slides); Exhibition Copy Sets (15 sets of slides), 1992.56.9–.24]

Diagram: Jeffrey Warda

21.5 Duplicating Slides

There are two main methods for duplicating slides. Prior to its obsolescence around 2009–12, the traditional and most common process was analog film-to-film duplication using a film camera. An alternative process, which has since become the most common, uses a dedicated film recorder that can write digitized or born-digital images (back) onto analog slide film. Whichever method is used, the objective in slide duplication is to match the original slide in color, tonal range, contrast, and cropping without introducing new artifacts into the image. That said, it is important to keep in mind that film chemistry and optical systems limit the accuracy of any “copy” or “duplicate”; there is no such thing as an exact copy in photography. Slide duplication must often balance the goal of accurate reproduction against compromises imposed by technological limitations.

21.5.1 Film-to-Film Slide Duplication

Film-to-film, or analog, duplicating involves illuminating a slide with diffuse transmitted light and photographing it with a camera loaded with film specifically formulated for copying slides. Unfortunately, the only films appropriate for this work, Fujifilm CDU Type II and Kodak EDUPE, were both discontinued in 2009 (Kodak Technical Support 2012). Back-stocks of these two films could be sourced post-2009, but by 2012 this was no longer a feasible commercial endeavor for most photo labs unless they had stockpiled large quantities of expired film in freezers. Analog duping is presented here as a reference because it was such an important process.

Film-to-film duplicating setups range widely, from a relatively simple copy stand or bellows setup with a single-lens reflex (SLR) camera and macro lens to a much more sophisticated slide duplicator or professional animation camera (also known as a rostrum camera) with a precision x-y stage, high-quality optics, tunable color compensation, and pin registration for film advancement. Because a camera lens must be used, there is always some generational loss in analog duping, but
excellent results can be obtained using a professional duplicator with a high-quality lens. Slide works that were originally created with a film recorder from digital files should preferably be duplicated with that same technology, not a film-to-film duping process (see Section 21.7 on film recorders).

As mentioned in the introduction, slide-duplicating films are essential for analog copying because they are inherently low contrast compared to regular camera slide films (Jacobson et al. 1988). Contrast is reported by film manufacturers in what is called a characteristic curve for each film type, typically found in the data sheet for that film. A characteristic curve illustrates how a given film responds to exposure by plotting the measured density of the film, from the lightest area (D-min) to the darkest (D-max), against the amount of exposure the film receives. Essentially, the steeper the curve, the higher the contrast of the film (Warda and Munson 2012). Copying slides with regular camera slide film will compress the tonal range and result in the loss of detail in the darks and lights. A photo lab using regular slide film for duplicating may try to compensate for the increased contrast by reducing the processing time (pulling the film), but this comes at the expense of other important characteristics, such as reduced color saturation or unacceptable loss of detail in the highlights (see fig. 21.2 for examples of each of these effects). Kodak and Fujifilm duplicating films were color balanced for either tungsten (3000–3200 K) or daylight (5500 K) exposure, with slight variance between each manufacturing lot of film. To correct for this, Kodak and Fujifilm included color compensating filter recommendations to be applied during duplication to balance the color. These values were typically stamped on the outside of the film container and box.
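The contrast comparison behind the characteristic curve can be made concrete: gamma is the slope of the curve's straight-line portion. A minimal sketch with invented sample points (not taken from any data sheet):

```python
# Sketch: film contrast (gamma) as the slope of the straight-line portion of a
# characteristic curve, which plots density against log exposure. The sample
# (log exposure, density) points below are invented for illustration.

def gamma(p1, p2):
    """Magnitude of the slope between two (log10 exposure, density) points;
    reversal-film curves slope downward, so only the magnitude matters here."""
    (e1, d1), (e2, d2) = p1, p2
    return abs((d2 - d1) / (e2 - e1))

# A low-contrast duplicating film vs. a steeper regular camera slide film:
dup_film = gamma((-2.0, 0.4), (-1.0, 1.4))     # slope magnitude about 1.0
camera_film = gamma((-2.0, 0.4), (-1.0, 2.2))  # slope magnitude about 1.8
assert camera_film > dup_film  # steeper curve = higher contrast
```

The same one-stop change in exposure produces a much larger density change on the steeper (camera film) curve, which is why copying onto regular slide film compresses shadow and highlight detail.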
Before embarking on a copy project, a lab will need to shoot multiple frames with different color filters and exposure settings to compare against the original. Color print viewing kits could be used to estimate the amount of color compensation needed, and this process could take multiple rounds of exposure and processing before finalizing the settings and starting a project. Depending on a photo lab's duplicating equipment, film-to-film analog duplicating may involve shooting mounted slides, requiring the image to be magnified enough to exclude the slide mount from the captured frame. Shooting unmounted film requires more setup to ensure the loose piece of film is centered perfectly within the field of view, ideally within a holder or frame, and the film is more vulnerable to scratches from handling. Professional animation cameras are well suited to copying unmounted film because of the precision stages they employ; slide works with multiple projectors that rely on precise registration for dissolve or crossfade techniques must be duplicated unmounted with an animation camera with pin registration.

Film-to-film duplicating has largely ceased worldwide, with only a handful of people able to produce accurate analog dupes, having previously stockpiled rolls of expired duplicating film in a freezer. Activity-Studios in Germany (Activity-Studios n.d.) is possibly the only lab in the world still producing accurate analog slide dupes with slide-duplicating film. Originally serving the corporate advertising field, Jochen and Elke Trabandt built a professional facility with precision equipment and transitioned to serving museums and artists in 2010, as every other photo lab abandoned the technique following the discontinuation of slide-duplicating stock and the widespread adoption of digital projection (Jennings and Weidner 2012; Weidner 2019).

21.6 Digitization of Slides

Before film recorders can be used as an alternative to analog duplicating, slides must first be digitized. Digitization also provides a form of documentation and facilitates the production of derivative image files for an artwork, so that viewings do not require accessing the physical slides. If the slides will be removed from their mounts for digitization and long-term storage, a photographic record of the slides in their original mounts is an essential form of
documentation prior to digitization. A contact sheet can be created on a copy stand: an array of mounted slides (e.g., between 4 and 20 slides per shot) is tiled out on a light table, with some additional ambient or reflected lighting, so that the slide film is legible in transmitted light and any inscriptions or labels on both sides of the mount are legible in reflected light.

Although it is not always possible, there are compelling advantages to digitizing slides outside of their mounts, capturing the full image frame of the original slide, and ensuring the image frame is level. The opening window of any slide mount is slightly smaller than the actual film image area; this mount mask crops the image to provide a clean projected image. For 135-format 35 mm film (with a 24 × 36 mm image area), the mask of a mount is roughly 1 mm smaller, at 23 × 35 mm (±0.5 mm). If digital files are used to output back to film to make new slides and the original slides were digitized in their mounts, the image will be slightly enlarged, and there will be an additional crop from the new slide mounts that was not in the original slides. Digitizing slides unmounted allows future duplicate slides to have the same field of view (the full image frame) and thus the same crop from the mount window openings used for future exhibition copy slides. Digitizing slides in paper mounts with rounded corners is even more problematic, as these require a more significant crop of the image to remove the rounded black corners of the mounts for new exhibition copy slides. For large digitization projects with hundreds or even thousands of slides, however, it may not be realistic or practical to unmount each slide prior to scanning.
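The compounding crop from digitizing mounted slides can be quantified. A small sketch, assuming the approximate 1 mm mask figure above and that each output is enlarged to fill the full frame before remounting:

```python
# Sketch of the compounding crop described above. A mounted slide is digitized
# through its mask, the scan is enlarged back to full frame on output, and the
# new mount masks the image again. Dimensions in mm; mask size is approximate.

FULL_MM = 36.0  # long dimension of the 135-format image area
MASK_MM = 35.0  # typical mount opening, roughly 1 mm smaller

def visible_fraction(mounted_scans):
    """Fraction of the original image width still visible after the given
    number of scan-in-mount cycles, plus the final mount's own mask."""
    per_cycle = MASK_MM / FULL_MM
    return per_cycle ** (mounted_scans + 1)

print(round(visible_fraction(0), 3))  # 0.972: any mounted slide shows this much
print(round(visible_fraction(1), 3))  # 0.945: scanned in mount, then remounted
```

Digitizing unmounted (a "scan count" of zero extra crops) keeps every generation at the single, unavoidable mask crop of the final mount.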

Digitization Standards

Digitization encompasses a wide range of materials and technologies, and many guidelines are available on digitizing cultural heritage materials. See, for example, the Technical Guidelines for Digitizing Cultural Heritage Materials: Creation of Raster Image Files, part of the United States Federal Agencies Digital Guidelines Initiative (referred to here as FADGI) (Rieger 2016), and the Metamorfoze Preservation Imaging Guidelines (Dormolen 2021). Guidelines like FADGI and Metamorfoze are extremely important and should be consulted prior to embarking on a digitization project. While some best practices are beyond the scope of many institutions or collectors, both of these guidelines provide different tiers of image quality. Equally important, digitization hardware and software tools are only as good as the operator. Although there are many consumer options for digitizing film, with a wealth of online resources, obtaining the best possible scan of an artwork requires access to increasingly rare professional equipment, full knowledge of the correct settings for operating these devices, and an understanding of post-processing options: a highly specialized skill set. Working with a photo lab may be preferable for digitization, but one needs to be mindful of having a unique collection work held off-site; it is advisable to send a staff member as a courier to ensure collection works are handled safely. Staff at a photo lab must also be made aware of the objective of digitization: to accurately reproduce the original slide, not to make a “better” image through post-processing. Describing imaging guidelines in full and covering all the facets of digitizing slides is well beyond the scope of this chapter.
Instead, the recommendations that follow focus on 35 mm slide-based artworks, with the added assumption that duplicate exhibition copy slides will need to be produced from these files while obtaining the best-quality image possible given the equipment and budget. Unfortunately, many high-end scanning devices have been discontinued by their manufacturers in recent years, limiting options for future projects. While many of these scanners are still quite functional, they often rely on outdated computers, cable connections, and operating systems that cannot be updated due to hardware incompatibility; a dedicated computer that is not connected to the internet or network drives may be the only way to
run these devices. Many of these film-scanning devices are now only available on the secondary market. Instead, medium-format digital cameras are the current standard for the digitization of slides, discussed in more detail in Section 21.6.2.

Resolution

The term “resolution” is often used, to great confusion, to refer both to the pixel count of the physical sensor inside an imaging device and to the ability of that device to resolve fine detail: two very different concepts (Burns and Williams 2007). While a scanner or camera manufacturer may report the “resolution” of a device, this is more accurately the sampling rate, reported in ppi (pixels per inch), pixels per mm, or (less frequently) samples per inch. The sampling rate describes how many pixels are on a sensor, not necessarily how well it can resolve detail. To make matters more confusing, the sampling rate is also variously referred to as spatial, optical, native, optimal, addressable, and limiting resolution. For example, a flatbed scanner with 40,800 photodiodes on an 8.5-inch linear CCD (charge-coupled device) sensor has an approximate sampling rate, or optical resolution, of 4800 ppi (40,800 pixels divided by 8.5 in.). It is important to keep in mind, though, that this does not mean the device can actually resolve detail at the sampling rate. Other components within an imaging device, such as lenses, mirrors, glass, analog-to-digital converters, and other electronics, all effectively limit the resolution the sensor can achieve. The actual resolution, or resolving power, of an imaging system is always less than the optical resolution of the sensor within that system; in flatbed scanners in particular, the actual measured resolution can be less than half the manufacturer’s reported optical sensor resolution. This principle applies to all imaging devices, including digital cameras. Actual scanner or camera resolution (the ability to resolve fine detail) must be independently verified by scanning and evaluating a specifically designed resolution target.
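The flatbed arithmetic above, and the gap between nominal and measured resolution, can be sketched as follows (the measured figure is invented for illustration, not a test result):

```python
# Sketch of the flatbed example above: the sampling rate follows directly from
# the sensor's pixel count, but measured resolving power is typically much
# lower and must be verified with a resolution target.

def sampling_rate_ppi(pixels_on_sensor, sensor_width_inches):
    """Nominal sampling rate: pixels divided by the sensor width in inches."""
    return pixels_on_sensor / sensor_width_inches

nominal = sampling_rate_ppi(40_800, 8.5)
print(nominal)  # 4800.0 -- the manufacturer's "optical resolution"

measured = 2300  # hypothetical target-measured effective resolution, in ppi
print(f"usable fraction: {measured / nominal:.0%}")  # usable fraction: 48%
```

The point of the sketch is only that the two numbers answer different questions: the first is a property of the sensor, the second a property of the whole optical system.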
A USAF-1951 (United States Air Force) test target designed for film can be used to visually evaluate the actual or effective resolution of a scanner or camera, reported in line pairs per mm. With digital imaging, this is more accurately done with DICE (digital image conformance evaluation) software that uses specific targets conforming to International Organization for Standardization (ISO) standards, such as ISO 12233 for cameras and ISO 16067-2 for film scanners (Williams 2003; Rieger 2016). Measurements of the spatial frequency response (SFR) and modulation transfer function (MTF) are more objective and reliable ways to evaluate the resolution of an imaging system (Williams 2003; Steinhoff 2009; Rieger 2016).

Scanning slides at the manufacturer’s maximum reported sensor resolution when the actual measured resolution of the device is much lower will result in bloated files with large file sizes but no appreciable increase in detail. Many manufacturers report even higher values (as “maximum” or “enhanced” resolution) in product literature; these rely on software interpolation, which increases the file size by adding neighboring pixels to the final image. Scanning at higher resolutions that involve interpolation should always be avoided. Note that scanner manufacturers also often report two values for the optical resolution (e.g., 4800 × 9600 dpi); in all cases, the lower value is closer to the actual spatial resolution.
If slides are being digitized in order to re-output the scans back to film with a film recorder for new exhibition copy sets, the resolution of the scanning device should exceed the addressable resolution of the film recorder. Most film recorders have an addressable resolution of 4096 pixels in the long 424

Caring for Slide-Based Artworks

dimension. Consult the lab about optimal file sizes for scanned images prior to beginning a scanning project (see Section 21.7 on film recorders). The FADGI guidelines provide four tiers of image quality (rated between 1 and 4 stars), with a resolution for 35 mm slides ranging from 1000 ppi at the lowest quality level to 4000 ppi at the highest, 4-star level. Note that each FADGI quality level also differentiates between a number of other image quality metrics, such as bit depth, dynamic range, and the resolution of the imaging system as measured through MTF and SFR analysis (Rieger 2016). Cultural heritage digitization standards may not take into account the output objectives of a collecting institution that must produce new sets of slides from the digital files, and it may be advantageous to scan at higher resolutions depending on the digitization tools available.
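For orientation, the pixel dimensions implied by these ppi figures for a full 36 × 24 mm frame work out as follows:

```python
# Sketch: pixel dimensions implied by a given scanning resolution for the
# 36 x 24 mm image area of a 35 mm slide (FADGI tiers span 1000-4000 ppi).

MM_PER_INCH = 25.4

def pixels(length_mm, ppi):
    """Pixel count along one dimension at the given sampling rate."""
    return round(length_mm / MM_PER_INCH * ppi)

for ppi in (1000, 4000):
    print(f"{ppi} ppi -> {pixels(36, ppi)} x {pixels(24, ppi)} pixels")
# 1000 ppi -> 1417 x 945 pixels
# 4000 ppi -> 5669 x 3780 pixels
```

Note how the 4-star tier's long dimension (about 5669 pixels) comfortably exceeds the 4096-pixel addressable resolution of most film recorders mentioned above.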

Density and Dynamic Range

Another value that film scanner manufacturers report for their products is maximum density, or dynamic range. Unlike negative films or print media, reversal slide films often have a wide dynamic or tonal range, meaning they can capture subtleties within a broad range between the lightest and darkest areas of the film. These two end points are reported as the D-min (minimum density, typically the transparent film base) and the D-max (maximum density, the darkest image area of the film); the dynamic range is the range between them, recorded in density units. Film density is recorded on a logarithmic scale, with a density of 5.0 being a completely opaque film; a film density of 3.0 D, for example, is ten times darker than a film density of 2.0 D (Fraser et al. 2005, 540–41). For scanning color reversal films, a high D-max is recommended, at or above 4.0 D (Rieger 2016). The actual dynamic range of scanners is limited by light flare within the system and by detector noise; manufacturers typically do not account for this in their reported values, instead publishing theoretical D-max or dynamic range figures (Williams 2000). Dynamic range in film scanners can be tested independently through ISO 21550. When imaging or scanning tools cannot capture the full density range of the film, the resulting images may lose detail in the highlights, the darks, or both, and color accuracy can also suffer (Rieger 2016).
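The logarithmic scale can be sketched numerically; transmittance falls by a factor of ten for each whole density unit:

```python
import math

# Sketch of the logarithmic density scale described above:
# transmittance = 10 ** (-D), so each whole density unit is 10x darker.

def transmittance(density):
    """Fraction of light passing through film of the given density."""
    return 10.0 ** (-density)

# A density of 3.0 D passes one-tenth the light of 2.0 D:
assert math.isclose(transmittance(2.0) / transmittance(3.0), 10.0)

# A 4.0 D area (the recommended minimum D-max for reversal film scanning)
# passes only one ten-thousandth of the light:
print(f"{transmittance(4.0):.4%}")  # 0.0100%
```

This is why a scanner whose real (not theoretical) D-max falls short of the film's D-max blocks up the deepest shadows: the signal it must distinguish there is vanishingly small.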

Dust Removal Techniques During Scanning

Slides often carry dust, scratches, or mold that will be captured during digitization; when projected, these artifacts are greatly enlarged and far more noticeable. An air puffer or soft antistatic brush should be used to remove loose dust or lint prior to scanning, but embedded dust may require more extensive cleaning of the film. Most dedicated slide-scanning devices have dust- and scratch-removal tools, which vary in effectiveness. Hardware-based tools use an infrared light source for a separate scan of the film: since color film is transparent to infrared radiation but lint and dust are not, the effects of dust can be effectively subtracted from the final image. The best-known implementation, Digital ICE (Image Correction and Enhancement), was originally developed and marketed by Kodak and is found on Nikon, Epson, and other scanners; other companies use different names for the same technology. Hardware-based infrared dust detection does not work with gelatin silver film, which is not transparent to infrared, and some versions of Digital ICE will not work on Kodachrome film. In SilverFast, a third-party scanning software by LaserSoft Imaging AG, hardware-based infrared dust detection is called iSRD (Smart Removal of Defects); SilverFast’s SRD feature is a software-based option that can be combined with or used independently of iSRD. Applying too much dust removal will result in distracting image artifacts and can soften the image (Steinhoff 2009), so testing is usually needed to determine the correct amount of Digital ICE before embarking on a larger project. There is also an assortment of software-based dust
removal tools for use in image editing software post-capture, such as SilverFast’s SRDx plug-in for Adobe Photoshop, but these are generally not as effective as hardware-based tools. Manually removing dust or scratches with the Clone Stamp or Spot Healing Brush in Adobe Photoshop or other image editing software is extremely time-consuming, or expensive if outsourced.

Mold, mildew, or fungus growth is of particular concern, as the gelatin layer in film emulsions is receptive to biological attack, and the process will continue to progress if left unchecked. Mold activity can cause stark color shifts and visible growth patterns and can eventually degrade the emulsion layer and separate it from the acetate base (see an excellent case study that includes treatment of mold on slides and digitization in Pietsch and Fernandes 2019). A photograph conservator should be consulted if artworks require cleaning or mold remediation prior to digitization and rehousing. In cases of mold, cleaning the digitization equipment between each slide is necessary to prevent cross-contamination.
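The principle behind infrared dust detection, as described in this section, can be illustrated with a toy one-dimensional sketch. The values and threshold below are arbitrary, and real implementations are far more sophisticated:

```python
# Toy illustration of infrared dust detection: film dyes are largely
# transparent in IR, so dark pixels in the IR channel mark opaque dust or
# lint, and those pixels can then be filled in from their neighbors.

def dust_mask(ir_scanline, threshold=0.5):
    """True where the IR channel is dark, i.e., something opaque (dust)."""
    return [value < threshold for value in ir_scanline]

def fill_from_neighbors(color_scanline, mask):
    """Replace masked pixels with the nearest unmasked value to the left."""
    out = list(color_scanline)
    for i, masked in enumerate(mask):
        if masked and i > 0:
            out[i] = out[i - 1]
    return out

ir = [0.9, 0.9, 0.1, 0.9]     # dip at index 2 = dust shadow in the IR pass
color = [0.5, 0.6, 0.0, 0.7]  # the dust also darkens the color channels
print(fill_from_neighbors(color, dust_mask(ir)))  # [0.5, 0.6, 0.6, 0.7]
```

The sketch also makes the failure modes plausible: silver-image film is opaque in IR everywhere (so the mask covers the whole image), and overly aggressive filling visibly softens real detail.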

Scanner Input Profiles Film scanners and cameras can be profiled to improve color and tonal range accuracy—an important step to faithfully reproduce slide film. Profiling is the process of scanning or imaging a calibration target with color and grayscale patches, then comparing the results of the scanned image to known measured values for those patches (found in a target reference text file from the manufacturer). An input profile is then created within profiling software, and this profile can be applied to subsequent scans to account for any differences in color and tonal range (Fraser et al. 2005). The IT8 series calibration target, ISO 12641-1, and the expanded ISO 12641-2 (2019) is the primary target in use, as well as the HCT targets from HutchColor, LLC. The IT8.7/1 target is designed for film transparencies and can be found on Kodak Ektachrome E100G film, Kodachrome (no longer produced), or Fujifilm Provia 100F film (see fig. 21.8). It is generally (but not universally) agreed that a Kodak Ektachrome profile can be applied to a scan of Fujifilm slide film (and vice versa), but this is not the case when scanning Kodachrome film,

Figure 21.8 Kodak IT8.7/1 color transparency target for scanner profiling. Photo: Jeffrey Warda


Caring for Slide-Based Artworks

which requires a profile created on Kodachrome film (Warda and Munson 2012). Kodak previously produced IT8 targets and hosted their reference text files on a Kodak FTP (file transfer protocol) site, which is no longer supported or accessible. Instead, targets and their associated reference files can be sourced through SilverFast and their scanning software packages. Before profiling, the scanner must be normalized to stabilize its response by turning off all auto functions that control white or black point, color compensation, or unsharp mask (sharpening) (Fraser et al. 2005). Some scanners have a dedicated calibration setting that turns these settings off. Digital ICE dust and scratch removal features do not impact profile creation and can be left on. Profiling a scanner or imaging device can bring you closer to accurate color, but additional color and tonal corrections may still be needed, and profiling cannot correct for inherent hardware or software limitations in the scanning equipment or for inaccurate setup by the user (Warda and Munson 2012). Digital cameras can also be profiled, but the process is more difficult to control: far more variables change with lighting and setup over time with a digital camera than within the very controlled environment of a film-scanning device (Fraser et al. 2005). Successful camera profiles require a tightly controlled studio setup; photography studios or digitization firms that specialize in imaging artworks or cultural heritage materials should be able to profile their cameras for more accurate results. Likewise, successful scanner profiles require consistent use of scanner software settings.
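The patch-comparison step at the heart of profiling can be illustrated with a short sketch. This is a hypothetical example, not part of any profiling package: the reference and measured Lab values below are invented for illustration only, and a real workflow would read them from the target reference file and from measurements of the scanned target.

```python
import math

# Hypothetical reference Lab values, as would be read from a target
# reference text file (invented numbers for illustration only).
reference = {
    "A1": (38.0, 12.5, -22.0),
    "A2": (65.2, -30.1, 4.8),
    "GS10": (52.0, 0.0, 0.0),
}

# Hypothetical Lab values measured from the scanned target image.
scanned = {
    "A1": (36.5, 14.0, -20.5),
    "A2": (65.0, -29.8, 5.1),
    "GS10": (49.0, 1.5, -2.0),
}

def delta_e76(lab1, lab2):
    """CIE76 color difference: Euclidean distance in Lab space."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Report patches whose error exceeds an arbitrary tolerance of 3 delta-E;
# profiling software performs this comparison across hundreds of patches.
for patch, ref in reference.items():
    de = delta_e76(ref, scanned[patch])
    status = "OK" if de < 3.0 else "exceeds tolerance"
    print(f"{patch}: dE76 = {de:.2f} ({status})")
```

Profiling packages use more sophisticated color-difference formulas and build a full correction transform, but the principle is the same: measure each patch, compare against the reference, and derive a mapping that minimizes the differences.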

21.6.1 Digitizing with Scanners

Drum Scanners

Generally, the highest-quality scan of film comes from a drum scanner, but the technique has notable issues for unique works in collections. Drum scanners use a plastic cylinder (the drum) that holds film wet-mounted onto its surface while it spins during scanning (see fig. 21.9). The image is captured with a photomultiplier vacuum tube (PMT), which is far more effective at capturing light than the typical CCD-and-lens combination found on other imaging devices. Wet-mounting of film requires that slide mounts be removed, which may not be possible for some collection materials. Wet-mounting greatly reduces, if not eliminates, imaging of scratches and abrasions on the film surface by reducing light scattering, and it also removes the possibility of Newton's rings forming (interference fringe patterns that form when film touches glass or plastic). Wet-mounting also reduces perceived film grain (Vitale 2010) and provides better saturation and density range. Mounting fluids are proprietary solutions, and despite manufacturers' claims that they do not damage film, the exact components of these fluids are often unknown. Many are either combinations of organic solvents or mineral oil; the latter must be cleaned off the film after scanning. There is no uniform agreement on the effects of wet-mounting film, and it may not be deemed appropriate for unique, historical, or fragile film materials. The FADGI guidelines recommend against drum scanning of cultural heritage materials (Rieger 2016). If you are considering drum scanning, find out what kind of mounting fluid the lab uses and make sure you are comfortable with how collection materials will be handled. Quality drum scanning requires a highly skilled operator who understands how to optimize the scanner for a given film type and condition. Drum scanners have reported optical resolutions that far exceed any other scanning technique; the actual or effective resolution will be lower, yet still higher than other forms of digitization. Manufacturing of new drum scanners is not cost-effective without significant demand, so sales are now driven by the second-hand market and servicing of existing units. Heidelberg (Linotype-Hell), Howtek, and ICG were major manufacturers. Aztek Inc. is still selling refurbished models and servicing their line of drum scanners (Aztek 2021).

Jeffrey Warda

Figure 21.9 Digitizing 35 mm slide film with a Linotype-Hell ChromaGraph S3400 drum scanner at Griffin Editions, New York, USA. The unmounted film in the center of the image is wet-mounted on a clear acrylic cylinder while scanning. Image courtesy of Griffin Editions. Photo: Griffin Editions

Hasselblad Flextight Scanners

The Hasselblad Flextight scanner is generally held to be second only to a drum scanner in image quality (see fig. 21.10). First offered in 1997 by the Imacon Corporation, Denmark, the Flextight line was eventually acquired by Hasselblad, Sweden. It is referred to as a "virtual drum scanner" because the film is suspended within a flexible metal mask and gently flexed to curve over a hollow drum while it is scanned. Like other dedicated film scanners, the Flextight has no glass platen between the sensor and the film surface that would otherwise reduce image quality. A CCD sensor captures the image through a high-quality Rodenstock lens. The film holder prevents imaging of film edge codes and sprockets, but the scanner can capture the full image frame area (24 × 36 mm) of a 35 mm slide and all other standard larger formats, up to 100 × 125 mm (4 × 5 in.) sheet film. Slides can also be scanned while still in their mounts with a separate holder. Flextight scanners have a reported D-max of 4.9. They do not include Digital ICE technology for hardware-based dust and scratch removal; instead, a diffuse light source reduces the appearance of small dust particles, and a software-based dust and scratch removal tool (Flextouch) is provided. Care should be taken with brittle or degraded film base materials, which can be damaged by the way the Flextight curves the film during scanning (Pietsch 2021). Like drum scanners, these scanners are very expensive, and unfortunately Hasselblad discontinued production in 2019. Image quality of scanned slides is excellent, the scanner does not require a skilled operator as a drum scanner does, and wet-mounting is not needed (although third-party accessories enable wet-mounting). Hasselblad reports an optical resolution of 8000 ppi for 35 mm film (in the long dimension), but the actual resolution was independently measured with a USAF-1951 resolution target at 6900 ppi by ScanDig GmbH (Wagner 2021a). Larger film formats have a lower maximum optical resolution on the Flextight scanner due to the way the optics are constructed.

Figure 21.10 Hasselblad Flextight X5 film scanner in use at the Solomon R. Guggenheim Museum, NY. The film holder feeds either unmounted or mounted slide film into the scanner. Photo: Midge Wattles
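The relationship between scan resolution in ppi and the resulting pixel dimensions of a file is simple arithmetic, sketched below. The figures echo the Flextight values discussed above (8000 ppi reported, 6900 ppi measured) applied to the 24 × 36 mm frame of a 35 mm slide; the function itself is a generic illustration, not part of any scanner software.

```python
MM_PER_INCH = 25.4

def pixels(film_mm: float, ppi: float) -> int:
    """Pixel count across a film dimension scanned at a given ppi."""
    return round(film_mm / MM_PER_INCH * ppi)

# A 35 mm slide image frame measures 24 x 36 mm.
for ppi in (8000, 6900):
    w = pixels(36.0, ppi)
    h = pixels(24.0, ppi)
    print(f"{ppi} ppi -> {w} x {h} px ({w * h / 1e6:.0f} MP)")
```

The same arithmetic makes it easy to sanity-check a delivered scan: if a file's pixel dimensions imply a much lower ppi than was specified, the scan was not made at the requested resolution.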

Film Scanners

Nikon, Minolta, and Canon were the primary manufacturers of dedicated slide and negative film scanners in the 1990s. The Nikon Super Coolscan 9000, 8000, and 5000 ED film scanners provide some of the best image quality (see fig. 21.11). Nikon scanners have a reported D-max of 4.8 and a reported sensor resolution of 4000 ppi; the 9000 ED was independently measured with a USAF-1951 resolution target at 3900 ppi by ScanDig (Wagner 2021b). Nikon scanners use infrared Digital ICE dust and scratch removal tools. Unfortunately, Nikon discontinued its film scanners in 2004, and there are currently no comparable options that achieve the same image quality, although many of these scanners are still functional and available from third parties. The Nikon scanning software may not be compatible with recent operating systems, but SilverFast software can be used instead. While not matching the image quality of a medium-format camera, a Flextight scanner, or a drum scanner, dedicated slide and negative film scanners generally perform better than flatbed scanners, particularly for 35 mm slides.

Figure 21.11 Nikon Super Coolscan 5000 ED film scanner with detached SF-210 Auto Slide Feeder. Photo: Jeffrey Warda

Flatbed Scanners

Originally designed for scanning reflective materials such as prints, many flatbed scanners also scan in transmissive mode for slides and negatives. Slides can be scanned mounted or unmounted, either directly on the glass platen or held in plastic holders provided by the manufacturer. There is a risk of Newton's rings (interference patterns) forming when film is placed directly on glass. It is possible to replace the glass on some scanners with an anti-Newton's ring glass platen, if desired. Wet-mounting film on a flatbed scanner will avoid Newton's rings as well as remove scratches and abrasions in the film, but this may not be recommended for unique, historical, or fragile film materials (see Section 21.6.1 "Drum Scanners" for discussion of wet-mounting). Wet-mount kits are manufactured for select scanners. Plastic slide and film holders that come with a scanner can cause focus issues if the film is not held perfectly in-plane; some scanners have an auto-focus feature to account for this, while others are fixed focus. High-end professional flatbed scanners were produced by Creo (Kodak), Heidelberg, and Fuji, but these are no longer manufactured. These professional scanners produced better-quality images than any currently available flatbed scanner. Flatbed scanners also typically have a lower maximum density (in the range of 3.4 to 4.0 D) than high-end dedicated film-scanning devices. Flatbed scanner manufacturers are also notorious for reporting much higher optical resolution than can be independently measured with a resolution test target (Williams 2000; Steinhoff 2009). Check independent reviews with actual measured resolution values (which can be as little as half of the manufacturer's reported value) before purchasing a scanner; alternatively, an existing scanner can be measured with a resolution test target or with DICE analysis software and target packages that measure imaging device resolution.

21.6.2 Digitizing with Digital Cameras

With the widespread discontinuation of many high-quality scanning devices, it is becoming increasingly common to digitize slides or other transparencies over a light box with a digital camera. Unlike obsolete film-scanning devices, digital camera and sensor technology is continually advancing, approaching and in some cases exceeding earlier technologies in image quality. Medium-format digital cameras are by far the preferred tool for digitization, but digital single-lens reflex (DSLR) cameras are also discussed in the following sections. One drawback of using a digital camera is the lack of the hardware-based dust and scratch removal tools found on purpose-built slide-scanning devices. While an air puffer or soft brush can be used to remove dust prior to digitization, some dust can be embedded in the film and more difficult to remove.

Medium-Format Digital Cameras

For best results, a professional medium-format camera with a digital back should be used. These cameras have much larger sensors than full-frame DSLR cameras, ranging from 43.8 × 32.9 mm to 53.7 × 40.2 mm. A larger sensor generally results in sharper images with a wider dynamic range and less noise (Warda et al. 2011). Manufacturers of medium-format cameras include Hasselblad, Phase One, Sinar, FujiFilm, and Pentax. Phase One offers complete digitization workstations, film carriers, and a Cultural Heritage version of its Capture One software. See also the "Camera Lenses" section below.

DSLR and Mirrorless Cameras

Full-frame digital cameras, either mirrorless or DSLR-type, have also been used for slide digitization. Cameras with a cropped sensor smaller than 24 × 36 mm (e.g., APS-C) should ideally be avoided. Even the highest-resolution full-frame 35 mm digital cameras cannot match the resolution of medium-format digital cameras, nor do they offer the same level of sharpness, dynamic range, or quality lens optics.

Camera Adaptors for Digitizing Slides and Film

There are numerous accessories that can be used with a DSLR to digitize slides, designed to make it easier to copy multiple slides (see fig. 21.12). The Nikon ES-1 Slide Copying Adaptor and ES-2 Film Digitizing Adaptor are simple accessories that can be used with either of Nikon's MicroNikkor 60 mm f/2.8 D or ED macro lenses. The ES-2 can reach 1:1 magnification without the extension tubes that are needed on the ES-1, and it also includes a film strip holder for shooting unmounted slide film. One advantage of this type of digitizer over shooting on a light box is that camera shake is avoided, since the adaptor is attached directly to the lens barrel. Nearly all major camera manufacturers produced bellows systems for macro photography that included adaptors for analog film-to-film copying; these systems can also be used for digitizing slides (e.g., the Nikon PB-6 bellows with PS-6 slide holder). Bellows enable greater magnification than a lens was designed to achieve by spacing the lens further from the film or sensor. This arrangement has the potential to degrade image quality, because it in effect spreads the native lens resolution over a larger-than-intended area. One exception may be the Novoflex bellows for photomacrography with a high-quality Schneider Kreuznach flat-field, apochromatic lens, which includes a slide-duplicating attachment. None of these adaptors will produce the image quality of a professional medium-format camera, but they are all far more affordable, and if carefully managed with a high-resolution camera and good optics, acceptable results can be obtained. Systems that use a macro lens capable of reaching 1:1 magnification, and thus not requiring extension tubes or bellows, are preferred.

Figure 21.12 Nikon ES-2 Film Digitizing Adaptor connected to a Nikon D850 DSLR with MicroNikkor 60 mm f/2.8 ED macro lens, tethered to a computer with Nikon Camera Control Pro 2 software. A Nikon SB-700 Speedlight flash was used with a PocketWizard MiniTT1 wireless transmitter and FlexTT5 transceiver. Photo: Jeffrey Warda

Camera Lenses

With medium-format or DSLR cameras, one must always use a prime lens (with a fixed focal length, e.g., 60 mm on a full-frame DSLR), not a zoom lens (with a variable focal length, e.g., 35–70 mm), for copying slides. The lens must also be flat-field and ideally apochromatic (APO). A flat-field lens is corrected against distortions (e.g., barrel or pincushion distortion) at close working distances (Stroebel and Zakia 1993). Most macro lenses are flat-field. Apochromatic lenses correct against chromatic aberrations by focusing each wavelength to the same point on the film or sensor surface; non-apochromatic lenses can show color fringes at the edges of high-contrast image areas, often as a red, green, or blue fringe. A macro lens will also allow magnification at or close to 1:1 (1×), whereby the 24 × 36 mm slide image frame is projected edge-to-edge onto the camera's 24 × 36 mm full-frame sensor. A good macro lens should have uniform sharpness at the center and corners of the image frame, typically requiring stopping down to roughly f/8; inexpensive macro lenses are often softer at the corners. Similarly, illumination should be uniform across the image frame, without vignetting (darkened corners).
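At 1:1 magnification, the effective "scan" resolution of a camera capture is fixed by the sensor's pixel count across the frame, which makes camera setups directly comparable to scanner ppi figures. A rough sketch follows; the 8256 × 5504 pixel full-frame sensor is a hypothetical example chosen for illustration, not a recommendation of any particular camera.

```python
MM_PER_INCH = 25.4

def effective_ppi(sensor_px_across: int, film_mm_across: float) -> float:
    """Effective sampling resolution when a film dimension is projected
    edge-to-edge across the given number of sensor pixels, as happens
    when copying a full-frame slide at 1:1 on a full-frame sensor."""
    return sensor_px_across / (film_mm_across / MM_PER_INCH)

# Hypothetical 8256 x 5504 px full-frame sensor copying a 24 x 36 mm slide at 1:1.
print(f"long dimension:  {effective_ppi(8256, 36.0):.0f} ppi")
print(f"short dimension: {effective_ppi(5504, 24.0):.0f} ppi")
```

A result in the high-5000 ppi range sits between a dedicated film scanner and a Flextight, which helps explain why high-resolution camera digitization can be a credible substitute for discontinued scanners when the optics and setup are carefully controlled.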

Lighting

Working on a copy stand with a 5000 K light box, slides can be shot in their mounts or unmounted, and a full sheet or strip of film can be imaged. Light boxes use fluorescent or LED sources; either is fine, provided it has a high color rendering index (CRI), above 90. The ballasts inside fluorescent light boxes produce heat, which can cause unmounted film to distort, affecting focus. High-quality LED sources are preferable to prevent heat buildup, and some LED sources are capable of CRI ratings above 95. Not all light boxes provide a uniformly lit surface; uneven illumination is typical of shallow light boxes where the translucent acrylic or glass surface sits too close to the lamps, and these should be avoided. Ideally, the rest of the light box surface is blacked out to reduce light spill into the lens. If using a slide film adaptor mounted to the lens of a DSLR camera, either a continuous light source or an external flash or strobe system can be used. Flash is well suited to this setup, using either a radio-controlled external transmitter or a cable to position the flash opposite the slide. However, some form of continuous light is still needed to check focus for each shot. Continuous light sources for photography should likewise have a CRI above 90.

Tethered Shooting

It is highly recommended to shoot while tethered to a computer, enabling live view on a graphics-quality or otherwise good-quality monitor to check precise focus and framing. At high magnifications there is a very shallow depth of field, so accurate focusing is best done with the image enlarged to 100% on the monitor. Monitors should ideally be calibrated and profiled with a hardware profiling device (Warda et al. 2011).

Post-processing of Digitized Slides

No matter the technique used to digitize slides, post-processing of the files within image editing software such as Adobe Photoshop or Adobe Lightroom will be necessary. It is best practice to first verify each scan against the physical slide to ensure it was scanned in the correct orientation (it is common for slides to get flipped in scanning) and to check that each file was scanned at the same resolution (with the same pixel dimensions). If a slide needs to be rescanned, this is best determined soon afterwards so that the same scanner settings can be applied. Minor color corrections, dust and scratch removal, and cropping may also be needed. If significant color corrections or changes to the tonal range or contrast of the image are required to match the original slide, this is an indication that the tools used for digitization were not ideal.
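The pixel-dimension consistency check described above lends itself to a simple script. The sketch below is a minimal illustration: in practice the width and height values would be read from the scan files themselves (e.g., with an image library), but here a hard-coded list with invented filenames stands in for a batch of scans.

```python
from collections import Counter

# Hypothetical batch of scans: (filename, width_px, height_px).
scans = [
    ("slide_001.tif", 9780, 6520),
    ("slide_002.tif", 9780, 6520),
    ("slide_003.tif", 6520, 9780),  # rotated; orientation is checked separately
    ("slide_004.tif", 4000, 2667),  # scanned at a different resolution
]

def flag_outliers(scans):
    """Return filenames whose pixel dimensions differ from the most
    common size in the batch, ignoring orientation (a rotated file has
    the same size, so it is not flagged here)."""
    sizes = [tuple(sorted((w, h))) for _, w, h in scans]
    expected, _ = Counter(sizes).most_common(1)[0]
    return [name for (name, w, h), size in zip(scans, sizes) if size != expected]

print(flag_outliers(scans))  # flags only slide_004.tif
```

Catching a mismatched file this early, while the scanner settings are still known, is far cheaper than discovering it after the equipment or operator is no longer available.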

21.7 Film Recorders: Re-output Back to Film from Digital Files

Once a digitization project has been completed, the resulting digital files can be used to create new exhibition copy slides with a specialized film recorder. Film recorders are unique in that, unlike analog film duplicating, they are not reliant on obsolete low-contrast slide duplicating film stock and can instead use regular camera slide films, which are still in production. Because film recorders work from digital files, they can be profiled to ensure that the output slide is exposed at the same color, contrast, and tonal range as the image file or slide it is based on, effectively removing the inherent high-contrast characteristics of regular slide film. In theory any slide film can be used; in practice, Kodak E100G and 100D and Fujifilm Provia 100F (RDP III) have been used with good results. Existing black-and-white film stocks such as Agfa Scala can also be used with film recorders (Prasse 2017; Dodge 2018). Kodak announced in 2012 that it would end production of all color reversal slide films, but fortunately changed course and reintroduced 35 mm E100 color reversal film in 2018 (Eastman Kodak 2018).

CRT and LCD Film Recorders

The most common type of film recorder is the CRT or LCD film recorder (see fig. 21.13). Developed in the 1980s, these devices use a high-resolution display to project an image file, without the use of a lens, back onto film. They can accommodate 35 mm film stock as well as medium- and large-format film up to 8 × 10 in. (20 × 25 cm). High-quality recorders suitable for duplicating slides use a monochrome CRT display with a series of red, green, and blue color filters to expose the film. 4K CRT film recorders can expose film at a resolution of roughly 2732 × 4096 dpi, while 8K film recorders are capable of 5464 × 8192 dpi. At 4K, the addressable resolution of the recorder exceeds the resolution of most slide films used in film recorders (Prasse 2008). Film recorders can write to the entire 24 × 36 mm image frame of 35 mm roll film. Just because a photo lab has a film recorder does not mean it can produce accurate slides: the technique requires a high-quality recorder capable of 4K resolution and, most importantly, careful calibration and profiling of each film stock to achieve accurate and repeatable results. In addition, the exposed film must be developed and processed correctly, so many labs work with specific photo processors who reliably maintain their processing chemicals. A number of photo labs still offer film recorder services for slides, including Gamma Tech, USA; MOPS Computer, Germany; and Activity-Studios, Germany. Agfa was a major manufacturer of quality film recorders; this line was acquired by CCG GmbH, Germany, which has recently released updated models.

Figure 21.13 An Agfa Alto CRT film recorder at Mops Computer, Germany. Photo: Mops Computer

LVT Film Recorders

LVT (light valve technology) film recorders work somewhat like a drum scanner in reverse, using photodiode lasers to expose sheet film wrapped onto a rotating cylinder. Both color and black-and-white silver halide film can be used. Exposure is a continuous-tone process, without raster lines, at a resolution beyond that of the film itself. Only sheet film can be used, up to 8 × 10 in. (20.3 × 25.4 cm), or 16 × 20 in. (40.6 × 50.8 cm) at a lower resolution. Like CRT film recorders, LVT film recorders can be calibrated and profiled to correct for color and tonal range, allowing them to print to regular camera films (Warda and Munson 2012). The use of sheet film makes LVT recorders more time-consuming to use for slide works, as multiple images must be tiled out on a sheet of 8 × 10 film, then cut by hand and fit into slide mounts. Sheet film lacks sprocket holes, so there can be registration issues in the mount if a work requires pin registry for dissolves or cross-fading of multiple images (Weidner 2012b); maintaining precise registration may require additional steps in printing and cutting the images. Large-format sheet film is also thicker than roll film, which can make it challenging to fasten into some thinner plastic slide mounts. One advantage of LVT film recorders is that they allow text or identification marks to be printed below the frame and subsequently covered by the mount. Kodak and Durst Dice were major manufacturers with their Rhino film recorder. Manufacture of LVT recorders was discontinued in 2002, but a number of photo labs still operate them, including Chicago Albumen Works, Massachusetts, USA.

Case Study 21.3 Robert Smithson Analog Slides Duplicated with LVT Film Recorder

Robert Smithson's (1938–73) iconic slide work Hotel Palenque (1969–72), Solomon R. Guggenheim Museum (99.5268), consists of thirty-one 126 format color slides with an audio track that is currently synced to the projector through a small-form computer running a custom script in Python (n.d.). This artwork began as a lecture Smithson gave at the University of Utah, Salt Lake City, in 1972, based on travels Smithson had previously made in the Yucatan Peninsula in Mexico with his wife Nancy Holt (1938–2014) and gallerist Virginia Dwan (b. 1931) in 1969. The university audience assumed that Smithson's lecture would focus on the Mayan ruins of Palenque, but he instead meticulously described, in a fashion both deadpan and eloquent, the various ways in which the hotel they had stayed at was being continually built up and destroyed at the same time.

Figure 21.14 The first slide from Robert Smithson's Hotel Palenque (1969–72), slide projection of thirty-one 35 mm color slides (126 format) and audio recording of a lecture by the artist at the University of Utah in 1972 (42 min, 57 sec), Solomon R. Guggenheim Museum (99.5268). Left: Smithson's camera original 126 format slide, mounted; middle: the unmounted original film; right: an exhibition copy duplicate slide of the original from an LVT film recorder with identification information printed below the image, produced by Chicago Albumen Works. © 2022 Holt/Smithson Foundation/Licensed by Artists Rights Society (ARS), NY. Photos: Jeffrey Warda

This artwork entered the Guggenheim's collection as the camera original slides that Smithson showed in 1972, along with an audio cassette tape recording of the lecture. Shot on 126 format Instamatic cartridge Kodak Ektachrome and Kodachrome film, these slides rely on a square 26.5 × 26.5 mm mount opening (see fig. 21.14). To produce analog film-to-film duplicates, photo labs had previously reduced the image size to 24 × 24 mm in order to fit the film into regular 135 format 35 mm film and slide mounts. With the end of analog slide duplicating films, the Guggenheim set out on a multi-year project in partnership with Chicago Albumen Works (CAW). CAW made new high-resolution images by shooting each unmounted slide on a light box with a Sinar 54H multi-shot medium-format digital camera. The images were re-output onto 8 × 10 in. Kodak E100G color transparency film using an LVT film recorder. Each frame was tiled out in a grid on the sheet of film, and the accession number and date of production were printed below the image, covered by the slide mount mask. The individual frames were then cut down and mounted by hand in Gepe mounts (Gepe Geimuplast GmbH, Germany), sourced through a combination of eBay and backstock directly from Gepe, as 126 mounts were no longer manufactured. It was not possible to source enough mounts for the number of sets produced, so the remaining sheets of 8 × 10 film were left untrimmed with the expectation that the old mounts would need to be reused (Warda and Munson 2012).

21.8 Working with Photo Labs

Slide-duplicating projects (either film-to-film or with film recorders) can be demanding and time-consuming for all parties, so it is always best to make sure expectations are conveyed and understood by both the photo lab and the collecting institution. Most labs need to return a job as soon as they can, at the best quality that timeframe allows, whereas a collecting institution may expect more precise color corrections or more test strips of film than a lab typically provides. Making this clear from the outset will help the process greatly, as this may not be the regular business model for most labs. A company may need to charge more for this kind of attention, but it is far better to anticipate these costs before starting. It can often help to speak to the owner or manager so that they understand the expectations and can evaluate whether their services are appropriate. By communicating these needs beforehand, one is less likely to see the cost of the project exceed the original quote.

Quality Control

Just as it is necessary to examine a new acquisition carefully with multiple techniques, the same applies when checking new exhibition copy slides or digitized files. View the slides on a light box and also project them, as the two methods reveal different aspects of the work. There can be a tendency to view newly duplicated slides only on a light box, but ultimately it is how they look projected that matters, and some issues are only evident in projection. The parameters that define a high-quality duplicate are summarized in Section 21.5.

21.9 Slide Mounts

With any duplication project, the choice of mounts can become an additional obstacle, as many manufacturers have discontinued production. Slide mounts range widely in type, from very thin paper mounts to thick plastic mounts with glass on either side of the film to protect against dust and scratches; less frequently, metal mounts can be found on slides from the 1940s to 1960s. Slide projectors rely on gravity for slides to drop into the slide gate, and thicker, heavier plastic mounts (with or without glass) are often more reliable and less prone to slide jams than thin paper mounts. Paper mounts (and some plastic mounts) use adhesives to seal the mount closed and to keep the film from moving; occasionally, these adhesives have been known to stain film. The best-quality plastic mounts are by Gepe and Wess Plastic Inc., USA, both of which produced mounts either without glass or with two pieces of anti-Newton's ring (AN) glass on either side of the film. Gepe ended production of slide mounts in 2019 (Gepe 2021); they can still be sourced as of this publication, but only as back-stock items. A limited number of Wess slide mounts (only without glass) are still produced by a third party, BCA Manufacturing, LTD, USA (BCA 2021).

Pin Registry

Both Gepe and Wess offered professional pin registration mounts that have raised points on one side of the mount that fit into the film's sprocket holes, perfectly aligning the film in the frame and holding it firmly in place. Pin-registry mounts should only be used when slide duplicates were produced with pin-registry slide-duplicating equipment (a rostrum camera) or if it can be confirmed that the camera exposes film in register (Bishop 1984). For slides without pin registry, Gepe's standard metal mask Clip System is designed to clip onto the film, but aluminum-backed slide mask tape is typically needed to fully secure the film and prevent it from shifting. Gepe's metal mask also gives sharp corners to the projected image and helps hold the film in place. Wess mounts are recognized by a hinged system that snaps the mount into place. Wess also made a variable registration hand punch that adds two new holes to the film if the film's sprocket holes are not aligned for pin-registered mounts; hand-punched film uses the Wess 2VR mount.

Glass or Non-glass Slide Mounts

There are mixed opinions on the merits of glass versus non-glass mounts. Dust specks on the film can be one of the biggest problems when slides are projected for an extended time in an exhibition; dust often becomes embedded in the film, making it impossible to remove with an air blower. Glass slide mounts protect the film from dust and scratches, particularly in high-use environments, and they are much easier to wipe down if needed, with no risk to the film. Mounts with AN glass may require cleaning with a microfiber cloth to remove white hazy deposits if they have sat in storage for an extended time. There is some debate as to whether glass-mounted slides fade faster or slower when projected; Wilhelm and Bower (1993) found there was no difference. There is a potential for heat buildup when glass-mounted slides are projected for a long period without advancing the carousel, but artworks with this requirement are rare. For works that require a particularly long exposure time for one mounted slide, that slide will likely fade slightly more slowly in a non-glass mount. Manufacturer markings on slide mounts can also provide insight into when a set of slides was produced (Pietsch and Fernandes 2019).

21.10 Storage

Camera original slides are unique objects that are highly vulnerable to many types of damage and degradation. Slides are easily scratched, prone to fingerprints, attract dust, are susceptible to mold, are extremely light-sensitive when projected, and even fade or shift in color while stored in the dark. While steps can be taken to minimize or prevent damage from handling, the best way to halt long-term dark-storage fading is to store slides in a cold environment.

Cold Storage Cold storage can be separated into three broad categories: cool storage, between 13 and 18°C (55–65°F); cold storage, between 0 and 12°C (32–54°F); and below freezing, between −18 and 0°C (0–32°F). In addition to temperature, a stable relative humidity (RH) within an acceptable range of 30–50% is equally important. The benefit of cold storage and lower RH is dramatic for color film. Storing color slides at ambient temperatures of 23°C and 50% RH can result in noticeable changes in the color dyes after only 25 years. Dropping the temperature to 0°C and lowering the RH to 30% can increase the time to a projected 1,200 years before noticeable change (Voellinger and Wagner 2009b). Few single factors can so dramatically improve an artwork's long-term preservation as cold storage can for color photography. Deciding on the most appropriate storage climate will inevitably take into account many financial and logistical factors and be influenced by the size and type of other materials in a collection (Wagner 2007). Options range from large walk-in rooms with combined relative humidity and temperature control (an active RH system) to low-cost frost-free household freezer or refrigerator units that rely on sealed storage packages to maintain stable relative humidity (a passive RH system). If a large cold storage facility is beyond an organization's budget, it is relatively easy to store thousands of slides in a household freezer, greatly prolonging the color stability of a collection. In a passive cool or cold storage system, the refrigerator or freezer unit has no internal control of relative humidity, which can fluctuate widely, especially when frost-free units cycle through the brief elevated temperatures designed to prevent frost buildup. Stable relative humidity for the artwork is maintained through the use of a sealed double-bag system.
The inner bag should be a high-performance laminate moisture-barrier film, such as Marvelseal 360 (aluminized nylon and polyethylene film), Dri-Shield (aluminized polyester and polyethylene film), or a translucent static-shielding bag or film that can be sealed with pressure-sensitive tapes. The outer bag can be either a 6–8-mil low-density polyethylene zipper-seal bag or a second layer of Marvelseal, Dri-Shield, or static-shielding film (Voellinger and Wagner 2009a) (see fig. 21.15). Slides should be stored within paper-based enclosure materials, with minimal plastic or metal enclosures. Ideally, paper-based slide boxes should be used, with voids filled with crumpled paper tissue, archival corrugated board, or mat board. Paper has the benefit of absorbing and desorbing moisture to buffer against changes in relative humidity inside the slide storage package. Padding also prevents materials from moving within a box. Specific steps must be in place for removing materials from a cold storage environment. Double-bagged materials in a freezer can be brought directly into a room at normal ambient conditions, but they must be left fully sealed until they have acclimatized to the room temperature to prevent condensation from

Caring for Slide-Based Artworks

Figure 21.15 Sample slide storage enclosure for cold storage. (A) Double-bagged slide storage box with an extended Marvelseal end to enable the bag to be resealed; each time a bag is cut open, it shortens by roughly 3 cm. (B) The inner box removed from the outer enclosure (the extended end of the bag is folded under the box); archival corrugated board acts as buffer material. (C) The inside of a typical slide storage box, capable of holding roughly 1,000 mounted slides; each set is labeled by component number. Photos: Jeffrey Warda



forming inside the enclosures. The time required for this can vary depending on size, but almost all enclosures are safe to open the next day. There are many sources on this topic to consult if considering cold storage, namely Reilly 1989 and Voellinger and Wagner 2009a, 2009b. Active climate-controlled cold or cool storage systems are much more expensive to build and run, and they require dedicated attention to maintain. These are more typical for large walk-in spaces holding larger materials. Within a larger space, all items in cold storage would still need to be bagged before moving them into ambient conditions. Keeping materials bagged is also an added precaution against mechanical failure, which is a serious concern with these systems. Alternatively, some larger rooms have an anteroom just outside the cold storage space that is kept at cool storage conditions to gradually acclimate materials before bringing them into the warmer general storage spaces at ambient temperature. Anterooms are typically needed for larger materials where bagging is not feasible. Unbagged materials can also be placed into a thermally insulated storage container, brought into a space at ambient conditions, and left overnight before removal. Some cold storage freezer units are located within a larger cool storage space, which acts as an anteroom.

All storage enclosure materials for slides and other photographic materials should have passed the Photographic Activity Test (PAT), an international standard (ISO 18916:2007) that includes both the Standard PAT (for black-and-white film) and the Color PAT (for dye coupler reactivity), developed by the Image Permanence Institute, Rochester, New York, USA. Conservation archival supply companies sell various paper-based slide storage boxes and enclosures. All enclosures in cold storage units should be clearly labeled, with all of their accession and component numbers listed on the outside of the bag or enclosure so they can be easily retrieved.
It can also be helpful to include a printout of the component descriptions inside the box with any relevant information a future user may need when accessing the slides. This information would also be included in a collections management database, but it can be helpful to have a hard copy printed out with the slide materials. A separate inventory file is also useful to maintain, with labeled shelf locations so the box or enclosure can be quickly located and removed from a freezer.

Storage of Slide Film in Glass Mounts Generally, storing slides long-term in glass mounts is not advisable, as microclimates can develop on the film where it touches the glass mount, particularly when it is not possible to maintain stable long-term relative humidity. These areas can be more prone to developing mold over time. Paper enclosures are much less likely to create microclimates in contact with film and are therefore preferred when relative humidity is not stable (Lavédrine et al. 2003). Paper enclosures will not facilitate easy viewing of the film, but film materials should ideally have been digitized prior to cold storage. Although archival-quality polyester (Mylar), polyethylene, and polypropylene films are also recommended for film storage (ISO 18911, Imaging materials—Processed safety photographic films—Storage practices), there are instances where these materials can contribute to microclimates forming when relative humidity levels are not stable. Ideally, remove any unique slides from glass mounts, place each film frame into a paper-based folder or pocket, and then insert the film into a standard archival slide sleeve (Pietsch and Fernandes 2019). Slides in plastic and paper mounts without glass are acceptable for long-term storage. They can be stacked in standard slide storage boxes or placed within archival polyethylene sleeves.

Digital File Storage Digitized slides, audio tracks, synchronization tracks, and any other file-based components must be stored as carefully as the original physical slides (see Chapter 8).


21.11 Projectors 35 mm Slide Projectors By far, the most common slide projectors were the Kodak Ektagraphic, Ektagraphic III, Ektagraphic S-AV, and Ektapro lines of projectors. Leica produced the Pradovit RT-s and RT-m, which are nearly identical to the Kodak Ektapro but use Leica's own line of lenses. For medium-format slides, Hasselblad made the massive PCP 80 (discussed below) (see fig. 21.16). Kodak's line of projectors is the primary focus of this section. The Ektagraphic and Ektagraphic III projectors were assembled in the USA, while the Ektapro and the S-AV were produced in Germany (the S stands for Stuttgart). Running a slide projector in an exhibition space every day for three to four months places great stress on this equipment, beyond the use for which it was generally intended. Professional projectors such as the Kodak Ektapro, the Ektagraphic S-AV, and the Ektagraphic III models are preferred for their feature set and durability. Earlier projectors with a linear cassette tray (instead of a circular carousel) should be avoided unless specified by the artist. Auto focus, a dual lamp module, a variable timer, and integration with dissolve units or other programmers are all recommended features. Only the Ektapro line and the S-AV 2060 model can automatically switch over to the second lamp when one burns out, a big advantage in a gallery exhibition space. Table 21.2 lists major features and models for popular Kodak projectors.

Slide Carousels Although many projectors accept 140-slot carousels, these are not recommended as they are far more likely to cause slide jams compared to 80-slot carousels, and they also do not accept thicker slide mounts. Carousels that come with plastic dust covers should be used to reduce dust accumulation on the slides. Two options include the Kodak Ektapro 80 Slide Tray or the Kodak Ektagraphic Universal Deluxe Covered Slide Tray. The Ektagraphic 89 Slide Tray with cover will work on the S-AV model. These are available second-hand. Both carousels can accept slide mounts up to 3.2 mm thick (Eastman Kodak 2001).

Figure 21.16  Common slide projectors. From left to right: Kodak Ektagraphic, Ektagraphic S-AV, Ektapro, and the Hasselblad PCP 80 medium-format projector. Photo: Jeffrey Warda


Table 21.2  Popular Kodak slide projectors and features

Kodak Ektapro (3020, 4020, 5020, 7020, 9020)
• Auto focus: 5020, 9020
• Dual leveling feet: all models
• Dual lamp module: all models (automatic switchover)
• Dissolve built-in: 7020, 9020
• Timer: 1 to 60 sec (5020, 9020)
• Auto zero: 9020
• Lens mount type: geared or spiral groove
• Lenses: Ektapro FF or Ektapro Select FF
• Computer control: all models

Kodak Ektagraphic S-AV (2000, 2030, 2050, and related models)
• Auto focus: 1030 and 2050AF only
• Dual leveling feet: all models
• Dual lamp module: all models (manual switchover; automatic on the 2060)
• Dissolve built-in: 1050, 2020, 2030, 2050
• Timer: 4 to 40 sec with model 1030, or with the external Carousel Interval Timer on model 1010
• Auto zero: 2010, 2030
• Lens mount type: spiral groove (all models) and geared (1010, 1030, 1050, 2010, 2050, 2060)
• Lenses: S-AV Retinar (spiral) and Vario-Retinar lenses; not compatible with Ektapro FF or Ektanar C lenses
• Computer control: no

Kodak Ektagraphic III (A, AS, AT, ABR, AMT, ATS, B, BR, E, ES)
• Auto focus: A, AS, AT, AMT, ATS
• Leveling feet: one foot
• Lamp: single lamp in a removable module
• Dissolve built-in: no
• Timer: 3 to 22 sec (AT, AMT, ATS)
• Auto zero: no
• Lens mount type: geared
• Lenses: Kodak Ektanar C and FF
• Computer control: no

Kodak Ektagraphic (E-2, AF, AF-1, AF-2, AF-2K, AF-3, B-2, B-2AR)
• Auto focus: AF-1, AF-2, AF-3, AF-2K
• Leveling feet: one foot
• Lamp: single lamp
• Timer: 4 to 15 sec (AF-2, AF-2K)
• Auto zero: no
• Lens mount type: geared
• Lenses: Kodak Ektanar C and FF
• Computer control: no

Power supplies across these lines include 110–125 V, 60 Hz; 110–130 V, 50 or 60 Hz; 220–240 V, 50 or 60 Hz; and selectable 110/130/220–230/240–250 V, 50 or 60 Hz units.

Medium-Format Slide Projectors The Hasselblad PCP 80 is nearly twice the size and weight of a standard 35 mm projector. The PCP 80 is one of the few medium-format projectors to hold a circular 80-slot slide carousel; it projects 120 medium-format slides in 7 × 7 cm slide mounts with 6 × 4.5 cm or 6 × 6 cm image frame sizes. PCP stands for perspective control projector, a feature that moves the lens to correct keystone (or parallax) distortion if the projector is not perpendicular to the


projection surface, a feature found on high-end perspective control lenses on 35 mm projectors. The PCP 80 and related line of lenses were discontinued in 2001. Götschmann Diaprojektoren, Germany, currently produces medium-format projectors with linear slide trays.

Projector Maintenance and Operation Slide projectors require regular cleaning and maintenance. As the number of labs offering this service has greatly diminished, you may need to develop in-house skills to clean and maintain your projectors. See, for example, the projector maintenance instructions from Tate (Cavaliere and Weidner 2010). Kodak produced detailed service manuals that are essential when setting out to service a projector in-house. When looking for a service provider, it may be necessary to ship the projector outside of your local area or state. Waterproof hard plastic cases with interior foam are best suited for shipping projectors. Prior to shipping, check the manual, and make sure to remove the lens and carousel and pack them separately.

Sourcing and Lending Projectors After a 67-year run, Kodak produced its last slide projector in 2004 in Rochester, New York, USA (Sarlin 2015). While they can still be sourced second-hand, professional projectors in good condition are increasingly difficult to find. Some institutions have built up a collection of projectors for their needs, either with dedicated projectors for a particular artwork or with a wider pool of shared projectors and lenses. When lending a slide-based work, it is often simpler to have the borrowing institution source its own equipment. That way, it will have a more active role in ensuring everything works properly and in securing a backup projector and lamps. This is even more relevant when the borrower's mains voltage differs from the lending institution's. Although some projectors have a built-in voltage and fuse selector that allows them to work in different countries (world use), there is always a risk that the borrower will not set the voltage and fuse properly before turning the projector on. Failure to do so will either blow a fuse (preferably) or damage the control board on the projector, an expensive part to replace. Further, lending equipment requires cleaning and maintenance prior to and after the loan. Even with regular maintenance, projectors can still fail. Either way, given the demands slide-based works put on projectors, having a backup projector available during an exhibition is always recommended.

Projector Lenses Projector lenses can be either prime lenses with a fixed focal length or zoom lenses with a variable focal length. Shorter focal length (wide-angle) lenses (e.g., 36 mm) produce a larger projected image at a given projector distance (known as short-throw lenses), while longer focal length lenses (e.g., 100 mm) produce a relatively smaller image at the same distance (known as long-throw lenses). There are many online projector-lens calculators to determine the throw of a lens at a given distance. The aperture of projector lenses is fixed, with larger-aperture lenses (e.g., f/2.8) allowing more light to pass through (a brighter image) than smaller-aperture lenses (e.g., f/3.5). Generally, zoom lenses are not as sharp as prime lenses, and poorer-quality zoom lenses often exhibit pincushion or curvilinear distortion at some focal lengths, whereby the edges of the projected image bow in slightly and are not straight (Stroebel and Zakia 1993). Prime lenses are less likely to distort the image frame. When possible, it is recommended to use a prime lens, but this may require some advance planning to


determine projector placement for a particular image size given the exhibition space. The focal lengths of Kodak's Slide Projection, Ektagraphic, and Ektanar lines of lenses are often reported in inches instead of millimeters (e.g., a 3 in. lens is roughly equivalent to 75 mm).
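The throw relationship above can be sketched with a simple thin-lens approximation. This is an illustrative sketch only: the function name and the 34.2 mm mount-aperture width are assumptions, not figures from a Kodak manual, and actual apertures vary by mount.

```python
def projected_width_m(throw_distance_m, focal_length_mm, gate_width_mm=34.2):
    """Approximate width in meters of the projected image for a 35 mm slide.

    For throw distances much longer than the focal length, magnification is
    roughly distance / focal length, so image width scales as
    slide aperture width x distance / focal length. A landscape 35 mm mount
    aperture of about 34.2 x 22.9 mm is assumed here.
    """
    return throw_distance_m * gate_width_mm / focal_length_mm

# A 100 mm lens at a 5 m throw yields roughly a 1.7 m wide image,
# while a 36 mm wide-angle lens at the same distance yields roughly 4.75 m.
```

The same arithmetic run in reverse (image width × focal length ÷ aperture width) gives the throw distance needed for a target image size when planning projector placement.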

Flat-Field and Curved-Field Lenses Projector lens manufacturers produced both curved-field (CF) and flat-field (FF) lenses. Generally, CF lenses are recommended when using slides in open mounts (without glass) where the film is curved or bows toward the projector lamp and away from the lens. This is common with most color reversal and slide-duplicating film in paper mounts. CF lenses were designed to compensate for the slight curvature of normal slide film when it heats up in a projector, allowing the corners and center of the image to be in optimal focus on a flat screen. CF lenses are not recommended if the film has the opposite curvature, whereby the film curves toward the projector lens and away from the lamp; in this case, an FF lens should be used. Nor should CF lenses be used with glass slide mounts; in these cases, the center and corners of the image may not be in focus at the same time. FF lenses are best used when the film is flat, particularly with film in glass mounts (Eastman Kodak 1984b). Check your lens barrel for FF, CF, or C labels, and ideally, do not mix glass-mounted and non-glass-mounted slides in the same carousel. If your slides are mixed, it is generally safer to stick with an FF lens. If you notice that the center is in focus but not the edges, you may not have the correct type of lens for your film type. Higher-quality lenses have metal housings and better-quality glass optics with multi-coatings, and some lenses have perspective control (PC), whereby minimal keystone (or parallax) distortion can be corrected when the projector is not mounted perpendicular to the projection surface. Schneider (Isco), Leica, select Kodak Retinar lenses, and Navitar made a suite of excellent-quality perspective control lenses with a separate rotating ring to address keystone distortion, typically identified with PC in the name (Biere 1990).

Projector Lamps Some early Ektagraphic projectors required turning the projector over to access the lamp behind an access panel door, which necessitates realigning the projector on the stand each time while on display. The Kodak Ektapro, Ektagraphic III, and Medalist II Carousel slide projectors are all equipped with a detachable lamp module, an assembly that can be easily removed from the side of the projector to access and change the lamp. The Ektapro 7020 and 9020 and S-AV 2060 projectors hold two lamps and will automatically switch to the second lamp in the event of a burnout. Other S-AV models have a lever that manually switches to the second lamp. Without this feature, it can be helpful to have an additional lamp module on hand to allow for easier lamp replacement in case of a burnout; otherwise, you will need to wait for the assembly to cool. Always wear gloves when handling lamps, as hand oils on the bulb or dichroic reflector can cause the lamp to burn hotter, altering its performance or shortening its life. Table 21.3 lists each type of lamp that can be used with the most common Kodak projectors, along with estimated life spans in hours when operated at standard or full power. Slight differences in line voltage will result in much greater differences in brightness and lamp life span (Eastman Kodak 1984b). Generally, the brighter the lamp, the shorter the life span. Most projectors have both a standard and an economy (lower power) lamp setting, which decreases brightness and increases life span. The choice of lamp and power setting will depend on a number of conditions, including how brightly or darkly the rest of the gallery space is lit. You may need to make the final determination during the installation period. Once set, you will want to purchase enough lamps

Table 21.3  Projector lamp options for select Kodak projectors

Kodak Ektapro (3020, 4020, 5020, 7020, 9020)
• Lamp types: EXR 300 W/82 V (35 hours), maximum, 100% brightness; FHS 300 W/82 V (70 hours), high, 70% brightness; EXY 250 W/82 V (200 hours), low, 60% brightness
• Power settings: Standard (100%); Economy (75% of lamp rating) triples lamp life; High light mode (120% of lamp rating) reduces lamp life by 20%, models 7020 and 9020 only

Kodak Ektagraphic S-AV (2000, 2030, 2050)
• Lamp type: EHJ 250 W/24 V (50 hours)
• Power settings: Full power (100%); Economy (70% of lamp rating); the 2050 has only one power setting

Kodak Ektagraphic III (A, AS, AT, ABR, AMT, ATS, B, BR, E, ES)
• Lamp types: EXW 300 W/82 V (15 hours), maximum, 150% brightness; EXR 300 W/82 V (35 hours), high, 125% brightness; FHS 300 W/82 V (70 hours), medium, 100% brightness; EXY 250 W/82 V (200 hours), low, 75% brightness
• Power settings: Full power (100%); Economy (70% of lamp rating)

Kodak Ektagraphic (E-2, AF, AF-1, AF-2, AF-2K, AF-3, B-2, B-2AR)
• Lamp types: ENG 300 W/120 V (15 hours), maximum; ELH 300 W/120 V (35 hours), high; ENH 250 W/120 V (175 hours), low
• Power settings: Full power (100%); Economy (70% of lamp rating)

to last through the course of your exhibition. Always schedule projector lamp replacements before the lamps are expected to burn out during the run of an exhibition. The extra-bright mode on the Ektapro 7010, 7020, 9010, and 9020 is not recommended for exposures longer than one minute per slide due to increased heat on the slide and greatly reduced lamp life (Eastman Kodak n.d.). Check the projector manual before changing lamps. It is a common misconception that running the fan after turning off a projector lamp helps to "cool" the lamp down. The rapid temperature change of an incandescent lamp filament caused by running the fan after the lamp is turned off will actually reduce lamp life. Instead, turn off the lamp and fan at the same time and allow the lamp to cool gradually. Use the fan to cool the lamp assembly if the lamp must be changed (Sundt 1980).
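As a rough planning aid, the number of lamps an exhibition will consume can be estimated from rated lamp life and total running hours. This is a sketch with illustrative numbers; rated lives are nominal values that vary with line voltage and power setting, so spares should always be added on top.

```python
import math

def lamps_needed(weeks, days_per_week, hours_per_day, rated_lamp_life_hours):
    """Estimate lamps consumed over an exhibition run (spares not included)."""
    total_hours = weeks * days_per_week * hours_per_day
    return math.ceil(total_hours / rated_lamp_life_hours)

# A 13-week show open 7 days a week, 8 hours a day, runs 728 hours:
# an EXY lamp rated at 200 hours implies 4 lamps, while an FHS rated
# at 70 hours implies 11 lamps.
```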

Projected Image Brightness Image brightness is best addressed through lamp choice, a wider-aperture lens, or a reduced projected image size, but there are other ways to increase it. While a projection screen is ideal for projecting slides, screens are not often used in a museum or gallery setting. Projection wall paint is another option, but it is not always possible


to predetermine the exact area of projection. Ambient light spilling onto the projection surface will greatly reduce image quality by decreasing contrast and lightening the darks in an image. It is best to reduce ambient light near the projection surface as much as possible.

21.12 Synchronized Slide Shows The introduction of dissolve units in the early 1960s paved the way for major changes in how slides were used and displayed. A dissolve unit connects to one or more projectors to change the lamp output and control carousel and slide gate movements. The first dissolve units could fade out one slide while fading in another from a second projector, so that the images dissolved into one another (known as a cross-fade). By the 1970s and 1980s, these features expanded greatly to control an array of projector functions as well as many more projectors. An entire industry grew out of this technology that specialized in audiovisual shows, often produced for corporate product launches and major entertainment events. Referred to as multi-image, these productions used large racks of projectors all in sync to vividly animate otherwise still images and text through slides. Using entirely analog film techniques combined with early programming languages, highly skilled practitioners created many of the features we now take for granted in software such as Microsoft PowerPoint. The technical skill and amount of time involved in achieving these results with analog photography is hard to fathom today (Kenny and Schmitt 1981; Bishop 1984; Biere 1990). Early dissolve units used a proprietary programming language to write an audible tone or pulse to audio tape, known as a cue track. Some units had built-in audio cassette decks, while others required external decks. Both were connected to the projector with a seven-pin connector cable (Kenny and Schmitt 1981). The earliest dissolve models included punched-tape programmers that physically punched holes in rolls of paper tape, which were later read to control different projector functions. Single-tone programmers were replaced by more advanced tone-control programmers that could eventually animate multiple projectors and functions.
Finally, more advanced control came with solid-state microprocessors in early computer workstations that predate the current personal computer (Kenny and Schmitt 1981). As projector technology advanced, new models were able to advance each slide at a faster rate, which multi-image productions took advantage of for more dynamic effects. Each manufacturer used its own proprietary programming language and cue tone, resulting in incompatibility between different systems, a shortcoming lamented even at the time (Biere 1990). Some of the major audio tone track languages include Mate-trac, developed by Arion and used by Kodak and 3M; PlusTrac and FreeTrac, used by Bässgen AV; and Procall, proprietary to Audio Visual Laboratories (AVL) (O'Connor 2016). Eventually, floppy disk drives were introduced to save and recall programs. As multi-image slide programming continued to advance, digital photography, LCD projectors, and personal computers rapidly took over the market. In response, slide film production declined greatly, duplicating film was discontinued, and the production of slide projectors came to an end (O'Connor 2016). Artists also turned to synchronized programming of slide-based artworks, though typically in a more limited fashion, with more basic dissolve units used to fade images into one another or for side-by-side display. Slide works that use a cue track or any form of synchronized programming and dissolve features present real challenges for long-term preservation. Even when these works were programmed on a device that could save the program file to external storage media, it is rare that those files were ever saved or remain accessible. With a collection work scheduled for display in a gallery setting, there are two main options: (1) use the original playback


equipment or (2) migrate the cue and projector commands to a computer, which can be used to control the projectors and any sound file. Playback of a slide-based artwork that relies on a cue track from an audio cassette tape requires regular maintenance and attention. An audio cue track signal will degrade on a cassette tape with constant playback every day over a three-month-long exhibition (Biere 1990). If the projector is not able to correctly discern the signal, the synchronization will be disrupted, and the projector will not advance correctly. To protect against this, multiple tape cassettes must be made and changed out regularly during an exhibition. Further, when copying to new cassette tapes, whether tape-to-tape or output from a file, the output signal level must be recorded precisely to ensure the projector can accurately interpret the commands. In general, the more obsolete equipment an artwork relies on to perform, the more likely it is that problems will develop in exhibiting the work. This equipment can be difficult to service and maintain, and the expertise required to set up and run it is significant. Another approach is to migrate analog cue tracks and programming features to a computer, which in turn controls the projectors and any audio soundtrack. The advantages of a file-based system on a computer are a more robust and reliable setup with far greater options and less maintenance. A micro form-factor or ultra-compact personal computer can be used and more easily fits within a pedestal, out of sight. It can be programmed to launch and stop a slide show on a schedule, or be controlled remotely through a network connection. Interfacing a modern computer with the projector requires a Kodak Ektapro projector with a serial bus port. A major challenge to this path, though, is that there is currently no clear way to reverse-engineer the analog cue signal from an audio cassette to extract a precise timestamp of a dissolve sequence.
The various original programming languages used to write the analog cue to audio tape were proprietary, and there is no method to accurately read the original code that would enable extracting a precise timestamp from the cue track. This obstacle was explored in depth in 2014 by O’Connor at Tate (2016). Case Study 21.4 describes the Guggenheim Museum’s work to create a file-based cue that coordinates the playback of sound files with the advancement of the slides by means of a script in the Python programming language, running on a small personal computer. Another option for similar works that require programming to advance a projector is to use an Arduino board with a microcontroller (Haidvogl 2019).
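At its core, the file-based approach reduces to a loop: read a list of cue timestamps, wait out each interval, and fire a slide-advance command while the audio plays alongside. The sketch below is a minimal illustration, not the Guggenheim's or Tate's actual code; the cue times and the `send_advance` callable are hypothetical stand-ins for writing the advance command to the Ektapro's serial port and for starting the sound file at show launch.

```python
import time

def advance_delays(cue_times_s):
    """Convert absolute cue timestamps (seconds from show start) into
    successive wait intervals between slide advances."""
    delays, prev = [], 0.0
    for t in cue_times_s:
        delays.append(t - prev)
        prev = t
    return delays

def run_show(cue_times_s, send_advance, sleep=time.sleep):
    """Run one pass of the show: wait out each interval, then advance.

    send_advance is a callable; on real hardware it would write the
    projector's slide-advance command to its serial port.
    """
    for delay in advance_delays(cue_times_s):
        sleep(delay)
        send_advance()
```

Injecting `sleep` and `send_advance` as callables keeps the timing logic testable without a projector attached; a scheduler (such as cron on a Linux machine) can then launch and stop the script during museum open hours.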

Case Study 21.4  Computer Control of Projector and Audio Tracks, Robert Smithson As previously discussed in Case Study 21.3, Robert Smithson's slide work Hotel Palenque (1969–72) entered the Guggenheim's collection as the camera-original slides that Smithson showed in 1972, along with an audio cassette recording of the lecture Smithson gave at the University of Utah. The Guggenheim had the audio cassette recording digitized and the audio restored and saved in WAV (Waveform Audio File) format. Since there was no sync track for the lecture, it was necessary to manually create a timestamp based on when Smithson called out, "Next slide, please," or described aspects of a slide image in his lecture. Over the years, different types of scripts and computers were used to sync the audio to the slide projector, eventually settling on a Python script to control the projector and audio on a small-form PC running the Linux operating system. The use of a computer allows for automated scheduling to bring the projector



Figure 21.17  Computer control of Robert Smithson's Hotel Palenque (1969–72) to run the projection, audio, and cue. The monitor, keyboard, and mouse were connected to allow control of the computer; normally, the computer is out of view when on display. Photo: Jeffrey Warda

out of standby mode each morning to start the presentation during open hours at the museum, along with shutting down at the end of each day. By networking the computer, the Guggenheim’s audiovisual technician can also connect remotely to the computer to turn the system on or off if needed or to troubleshoot issues when the work is on loan (see fig. 21.17).

21.13 Fading of Slides and Estimating When to Change Sets During Display The most thorough work on the fading of color slides was done by Wilhelm and Brower (1993), who demonstrated that each brand of color slide film has a different projector-induced fading rate and that, despite attempts by projector manufacturers to reduce heat buildup at the film plane with dichroic filters on lamp housings and mirrors, it is primarily light that causes fading of color slides. Importantly, Wilhelm's testing method was based on each slide receiving 30 seconds of light, six times a day, to mimic the intermittent exposure a slide receives in a projector. An EXR lamp was used in a Kodak Ektapro projector set to the high setting, and the slides were in mounts without glass. Wilhelm measured the sensitivity of the cyan, magenta, and yellow dyes and confirmed that the magenta dye was most prone to fading in all color films. Wilhelm separated his recommendations into two categories: general use and critical museum use. Since museums would only be exhibiting exhibition-copy slides (not critical original slides), Wilhelm's general (and less restrictive) recommendations can be followed. Wilhelm found that Fujichrome films, including CDU Type II duplicating film, had the best light stability compared to Kodak, Agfa, and 3M color films. Fujichrome films were found to show noticeable fading of the magenta dye after a cumulative exposure of five hours and twenty minutes. The maximum exposure for Kodak Ektachrome EDUPE duplicating film was found

Caring for Slide-Based Artworks

to be two hours and forty minutes (Wilhelm and Brower 1993). From this data, one can—in theory—calculate how long before a set of slides in an exhibition will reach its maximum exposure limit based on how many seconds each slide is displayed when it drops into the projector gate, how many hours the projector is running each day, and how many days the gallery is open per week. In practice, there are many variables that influence fading, including the type of projector, lamp, and power setting, and even the color characteristics of the slides on view. For example, a shift toward yellow from fading will be more noticeable on a slide that is predominately white compared to a slide of a warm yellow sunset. Generally, color slides should last between six and eight weeks of display, assuming slides are in a carousel on a timer. For a typical three-month-long exhibition (roughly 13 weeks), two sets of slides should be sufficient. This will provide six and a half weeks for each set, but an additional set should always be held as a backup. The best way to make this determination for a particular slide work is to monitor the slides on a weekly basis and make a note of when noticeable fading starts. If slide works are lent to other institutions, the borrower can be asked to keep track of how often they changed out sets. Examine the exhibited slides once they return from loan to arrive at a general recommendation for that artwork moving forward. As slide duplicating has grown more scarce and expensive, it is not uncommon to extend the exposure of a set of slides longer than one would normally. Slides on black-and-white film rarely require changing out as they are quite stable to light exposure. Over time, the film base may start to yellow (typically in the center where the lamp is brightest), but black-and-white slides have been exhibited over the course of many exhibitions without noticeable change. 
AN glass mounts are recommended to prevent dust and scratches from building up on the film with such extensive use.

21.14 Conclusion

Preservation of slide-based artworks necessitates an active role in duplication: working closely with photography labs and other specialists in scanning and printing, and managing display equipment. The process is not without challenges and occasional roadblocks, as it navigates a complicated landscape of obsolete materials and equipment. Duplication was often not considered until a slide-based work was scheduled for exhibition, but this is no longer a sustainable model. With the rapid decline in the manufacturing of film types and all film-related equipment, there is now a sense of urgency: good stewardship requires duplication to effectively stockpile slides for future use.

Given these challenges, there may be discussions within an institution on the merits of analog versus digital projection. Someone will inevitably ask, "Can't we just scan the film and use a digital projector?" Ideally, have this conversation with the artist, when possible, to clarify whether analog film and actual slide projectors are important to the nature of the artwork or whether a digital projection is equally acceptable. Even with that knowledge, as long-term custodians of artworks it is equally important to be mindful of the technology an artist chose originally and to recognize that migrating a work to newer technology dissociates that piece from its place in history and from the technology available at that moment. Meaningful visual, audible, and historical associations are lost when switching to a digital projection, and there is a strong case for continuing slide production as long as it is physically and financially possible. As Marchesi insightfully observed, reproduction is both a subtractive and an additive process (2017, 66). With each and every slide duplication project, there will be elements removed from and added to the original presentation.
Dust, scratches, lint, subtle changes in color, tone, sharpness, or cropping: these types of alterations are inevitable in any photographic process, but there are techniques to minimize artifacts and help retain as much as possible of the rich visual quality of slides for future viewers.

Jeffrey Warda

Acknowledgments

Many aspects of this chapter were developed out of multiple collaborative projects and an earlier presentation in 2012 with Doug Munson of Chicago Albumen Works (retired). The author is deeply appreciative of Doug's involvement in this area and for reviewing the content of this chapter. Similarly, Katrin Pietsch, University of Amsterdam and the Nederlands Fotomuseum, Rotterdam, graciously reviewed the text and offered valuable suggestions. Particular thanks are due to Sharon Hayes, Lisa Oppenheim, and Kathrin Sonntag for their generous feedback and correspondence related to their own slide-based artworks in the collection of the Solomon R. Guggenheim Museum. Within the Guggenheim Museum, I am thankful for the continued support of Lena Stringari, Deputy Director and Andrew W. Mellon Chief Conservator, as well as Agathe Jarczyk, Associate Conservator, Time-Based Media; Joanna Phillips, former Senior Conservator, Time-Based Media, and current Director, Düsseldorf Conservation Center; Maurice Schechter, Schechter Engineering; Piotr Chizinski, Head of Media Arts; Paul Paradiso, contract technician; and Vance Stevenson, former Head of Media Arts, for their help in realizing many slide-based artworks. Charlie Dodge of Gamma Tech, New Mexico; Juergen Prasse of MOPS Computer GmbH, Berlin; and Elke Trabandt of Activity-Studios, Esslingen, have been invaluable resources in duplication projects and in sharing their combined technical knowledge. Thanks also to Fergus O'Connor, former Senior Conservation Technician, Time-Based Media, Tate. Lastly, I am indebted to Dan Kushel, Professor of Conservation Documentation (retired) at the State University of New York, College at Buffalo, and Timothy Vitale, conservator in private practice (deceased), who both generously shared their expertise in analog and digital photography.

Bibliography

"Activity-Studios (Website)." n.d. Accessed November 28, 2021. www.slidelab.de.
Alexander, M. Darsie, Charles Harrison, and Robert Storr. Slideshow. Baltimore and University Park, PA: Baltimore Museum of Art, Pennsylvania State University Press, 2005.
Aztek Scanner Support. "Phone Conversation with Author." June 8, 2021.
BCA Manufacturing, LTD. "Email Communication with Author." May 30, 2021.
Biere, Julien. The Guide to Multi-Image: Professionelle Dia-AV. Schaffhausen, Switzerland: Verlag Photographie, 1990.
Bishop, Ann. Slides: Planning and Producing Slide Programs. Kodak Publication, no. S-30. Rochester, NY: Eastman Kodak Co., 1984.
Burns, Peter D., and Don Williams. "Ten Tips for Maintaining Digital Image Quality." Proceedings of the IS&T Archiving Conference (2007): 16–22.
Cavaliere, Lee, and Tina Weidner. "Slide Projector Maintenance: Kodak S-AV 2050 Slide Projectors." Tate, February 2010. www.tate.org.uk/file/example-report-maintaining-kodak-s-av-2050-slide-projectors-dying-technologies-project.
Dodge, Charlie. "Email Communication with Author." January 15, 2018.
Dormolen, Hans van. Metamorfoze Preservation Imaging Guidelines: Image Quality. National Library of the Netherlands, January 2021. www.metamorfoze.nl/english/digitization.
Eastman Kodak Company, ed. Copying and Duplicating in Black-and-White and Color. Kodak Publication, M-1. Rochester, NY: Consumer/Professional & Finishing Markets, Eastman Kodak Co., 1984a.
———. The Source Book: Kodak Ektagraphic Slide Projectors. Rochester, NY: Eastman Kodak Co., 1984b.
———. "Quick Reference: Ektapro." Eastman Kodak Company, 2001. https://resources.kodak.com/support/pdf/en/manuals/slideProj/ektapro_QuickReferenceGuide.pdf.
———. "To the Delight of Photographers and Filmmakers Everywhere, New EKTACHROME Films to Begin Shipping." Press Release, August 18, 2018. www.kodak.com/en/company/press-release/ektachrome-film-begins-shipping.
———. "Kodak Ektapro Slide Projector Instruction Manual: 3020, 4020, 5020, 7020, 9020." Eastman Kodak Company, n.d. www.manualshelf.com/manual/kodak/ektapro-5020/instruction-manual-slideprojector-3020-4020-5020-7020-9020.html.
Fraser, Bruce, Chris Murphy, and Fred Bunting. Real World Color Management: Industrial-Strength Production Techniques. 2nd ed. Berkeley, CA: Peachpit Press, 2005.
Gepe Geimuplast GmbH. "Email Communication with Author." May 31, 2021.
Haidvogl, Martina. "Slides, Projectors and Challenges: Case Studies from the SFMOMA Collection." In Back to the Future!: Im Karussell der Diakonservierung; Riding the Slide Carousel, edited by Barbara Sommermeyer and Julia van Haaften, 91–111. Bielefeld: Kerber Verlag, 2019.
Jacobson, R. E., Sidney F. Ray, and G. G. Attridge, eds. The Manual of Photography. 8th ed. London and Boston: Focal Press, 1988.
Jennings, Kate, and Tina Weidner. "Geeks, Boffins, and Whiz-Kids: The Key Role of the Independent Expert in Time-Based Media Conservation." The Electronic Media Review 2 (2011–2012): 119–31. Paper presented at the Electronic Media Group Session, AIC 40th Annual Meeting, May 8–11, 2012, Albuquerque, New Mexico. https://resources.culturalheritage.org/emg-review/volume-two-2011-2012/geeks-boffins-and-whiz-kids-the-key-role-of-the-independent-expert-in-time-based-media-conservation/.
Kenny, Michael F., and Raymond F. Schmitt. Images, Images, Images: The Book of Programmed Multi-Image Production. 2nd ed. Kodak Publication, no. S-12. Rochester, NY: Kodak Motion Picture and Audiovisual Division, 1981.
Kodak Technical Support. "Email Communication with Author." October 4, 2012.
Lavédrine, Bertrand. Photographs of the Past: Process and Preservation. Los Angeles: Getty Conservation Institute, 2009.
Lavédrine, Bertrand, Jean-Paul Gandolfo, and Sibylle Monod. A Guide to the Preventive Conservation of Photograph Collections. Los Angeles: Getty Conservation Institute, 2003.
Mannoni, Laurent, and Richard Crangle. The Great Art of Light and Shadow: Archaeology of the Cinema. Exeter Studies in Film History. Exeter, Devon: University of Exeter Press, 2000.
Marchesi, Monica. Forever Young: The Reproduction of Photographic Artworks as a Conservation Strategy. Leiden: Monica Marchesi, 2017.
O'Connor, Fergus. "Slow Dissolve: Re-Presenting Synchronized Slide-Based Artworks in the 21st Century." The Electronic Media Review 4 (2015–2016). https://resources.culturalheritage.org/emg-review/volume-4-2015-2016/.
Pietsch, Katrin. "Email Communication with Author." June 25, 2021.
Pietsch, Katrin, and Lénia Oliveira Fernandes. "The Slides of Ed van der Elsken at the Nederlands Fotomuseum." In Back to the Future!: Im Karussell der Diakonservierung; Riding the Slide Carousel, edited by Barbara Sommermeyer and Julia van Haaften, 69–87. Bielefeld: Kerber Verlag, 2019.
Prasse, Jürgen. "Film Exposure with Film Recorders." Trade fair presentation at Photokina, Cologne, Germany, 2008.
———. "Email Correspondence with Author." November 2, 2017.
Python Software Foundation. "Python (Software)." Accessed April 17, 2021. www.python.org.
Reilly, James. Storage Guide for Color Photographic Materials: Caring for Color Slides, Prints, Negatives, and Movie Films. Albany, NY: University of the State of New York, New York State Education Department, New York State Library, New York State Program for the Conservation and Preservation of Library Research Materials, 1989. https://s3.cad.rit.edu/ipi-assets/publications/color_storage_guide.pdf.
Rieger, Thomas. "Technical Guidelines for Digitizing Cultural Heritage Materials: Creation of Raster Image Files." Federal Agencies Digital Guidelines Initiative, 2016. www.digitizationguidelines.gov/guidelines/FADGI%20Federal%20%20Agencies%20Digital%20Guidelines%20Initiative-2016%20Final_rev1.pdf.
Sarlin, Paige. "The Work of Ending: Eastman Kodak's Carousel Slide Projector." PhotoResearcher 24 (2015): 10–19.
Sobieszek, Robert A., and Robert Smithson. Robert Smithson: Photo Works. Los Angeles, CA and Albuquerque, NM: Los Angeles County Museum of Art, University of New Mexico Press, 1993.
Sommermeyer, Barbara, and Julia van Haaften. Back to the Future!: Im Karussell der Diakonservierung; Riding the Slide Carousel. Bielefeld: Kerber Verlag, 2019. https://www.kerberverlag.com/en/1727/back-to-the-future.
Steinhoff, Sascha. Scanning Negatives and Slides: Digitizing Your Photographic Archive. 2nd ed. Santa Barbara, CA: Rocky Nook, 2009.
Stroebel, Leslie, and Richard D. Zakia, eds. The Focal Encyclopedia of Photography. 3rd ed. Oxford: Focal Press, 1993.
Sundt, Christine. "Running the Lamp Fan—Necessary or Excessive?" Visual Resources Association Bulletin, special bulletin #3, 7, no. 4 (1980): 67–68. https://online.vraweb.org/index.php/vrab/article/view/152/149.
Vitale, Timothy. "Film Grain, Resolution and Fundamental Film Particles." 2010. https://cool.culturalheritage.org/videopreservation/library/film_grain_resolution_and_perception_v24.pdf.
Voellinger, Theresa, and Sarah Wagner. "Cold Storage for Photograph Collections—Vapor-Proof Packaging." Conserve O Gram, National Park Service, 2009a. www.nps.gov/museum/publications/conserveogram/14-12.pdf.
———. "Cold Storage for Photograph Collections—An Overview." Conserve O Gram, National Park Service, 2009b. www.nps.gov/museum/publications/conserveogram/14-10.pdf.
Wagner, Patrick. "Hasselblad Flextight X5—Virtual Drum Scanner." ScanDig, 2021a. www.filmscanner.info/en/HasselbladFlextightX5.html.
———. "Nikon Film Scanner Super CoolScan 9000 ED." ScanDig, 2021b. www.filmscanner.info/en/NikonSuperCoolscan9000ED.html.
Wagner, Sarah S. "Cold Storage Options: Costs and Implementation Issues." Topics in Photograph Conservation 12 (2007): 224–38. http://resources.culturalheritage.org/pmgtopics/2007-volume-twelve/12_30_wagner.pdf.
Warda, Jeffrey, Franziska Frey, Dawn Heller, Dan Kushel, Timothy Vitale, and Gawain Weaver. The AIC Guide to Digital Photography and Conservation Documentation. Edited by Jeffrey Warda. 2nd ed. Washington, DC: American Institute for Conservation of Historic and Artistic Works, 2011.
Warda, Jeffrey, and Doug Munson. "Acquisition and Creation of Exhibition Copies for Slide Works." Presentation at TechFocus II: Caring for Film and Slide Art, Hirshhorn Museum and Sculpture Garden, Washington, DC, 2012. https://vimeo.com/121089320.
Weidner, Tina. "Celebrating Collaborative Practice: Reflections on Trust, Time and Transparencies." In Back to the Future!: Im Karussell der Diakonservierung; Riding the Slide Carousel, edited by Barbara Sommermeyer and Julia van Haaften, 37–53. Bielefeld: Kerber Verlag, 2019.
———. "Dying Technologies: The End of 35 Mm Slide Transparencies." Museum Website, Tate, 2012a. www.tate.org.uk/about-us/projects/dying-technologies-end-35-mm-slide-transparencies.
———. "Fading Out: The End of 35 Mm Slide Transparencies." The Electronic Media Review 2 (2011–2012): 157–73. https://resources.culturalheritage.org/emg-review/volume-two-2011-2012/fading-out-the-end-of-35-mm-slide-transparencies/.
Wilhelm, Henry Gilmer, and Carol Brower. The Permanence and Care of Color Photographs: Traditional and Digital Color Prints, Color Negatives, Slides, and Motion Pictures. 1st ed. Grinnell, IA: Preservation Pub. Co., 1993.
Williams, Don. "Debunking of Specsmanship." RLG DigiNews, February 15, 2003. http://worldcat.org/arcviewer/1/OCC/2007/08/08/0000070519/viewer/file2003.html.
———. "Selecting a Scanner." Guides to Quality in Visual Resource Imaging, July 2000. https://old.diglib.org/pubs/dlf091/dlf091.htm#visguide2.


22 CARING FOR SOFTWARE- AND COMPUTER-BASED ART Deena Engel, Tom Ensom, Patricia Falcão, and Joanna Phillips

Editors' Notes: For this chapter on caring for software- and computer-based art, the editors of this book, Deena Engel (Clinical Professor Emerita, Department of Computer Science, Courant Institute of Mathematical Sciences, New York University, USA) and Joanna Phillips (Director, Düsseldorf Conservation Center, Germany), take on the role of authors and are joined by Tom Ensom (Tate/Independent, UK) and Patricia Falcão (Tate/Goldsmiths, University of London, UK). Deena Engel and Joanna Phillips draw from their five-year museum-academic collaboration to study and treat software-based works at the Solomon R. Guggenheim Museum, New York, where Phillips previously served as Senior Conservator of Time-Based Media. At Tate, conservators Tom Ensom and Patricia Falcão collaborate toward improving the care of software-based artworks in the collection, drawing from practices outside the conservation field and adapting them to the institutional context. With their combined expertise and experience, the authors step through an ideal acquisition process that enables the preservation of software- and computer-based artworks for the future. An in-depth look at the technical makeup of software-based art aims to serve as a foundation for its examination, documentation, and treatment, and an overview of different conservation practices and treatment approaches is offered to inform the decision-making process toward sustaining the collection life of these artworks.

DOI: 10.4324/9781003034865-26

22.1 Introduction

Within the field of time-based media conservation, the newest area of specialization is the care of software- and computer-based art. While contemporary artists have been employing software and computers since the mid-20th century, the number of collected artworks in this category is still relatively small, even in major international collections with large time-based media holdings. Over the last decade, however, as artists continue to embrace the latest digital technologies in their work and collectors grow more confident in their ability to care for these works, an ever-increasing number of software- and computer-based artworks are entering art collections. The wide variety of these works ranges from web artworks viewed in a browser to generative animations on the screens of Times Square, from robotic arms performing in the gallery to game-like apps on your mobile phone.

However diverse in experience, these works have many commonalities with other types of time-based media in regard to their conservation: they must be understood and treated as dynamic systems of interdependent hardware and software components, their industrially produced technologies may be replaceable and/or variable, and their inherent change has to be managed carefully to prevent damage to their integrity. But the extreme complexity of their systems and their accelerated obsolescence and failure can make their care more challenging than caring for "traditional time-based media works" (Laurenson 2014), and the specialized expertise needed to maintain and treat them can be very specific.

To address these challenges and support the collection and longevity of software-based art, different communities, initiatives, and institutions have been researching and developing new conservation practices and conceptual frameworks in recent years. Contemporary art museums have launched dedicated projects such as the Conserving Computer-Based Art initiative at the Guggenheim Museum in New York (Guggenheim Blog 2016; Guggenheim Museum—CCBA 2019) and the projects Software-Based Art Preservation (Ensom et al. 2021) and Preserving Immersive Media (McConchie et al. 2021) at Tate in London, and have organized dedicated symposia (Smithsonian—TBMA Symposia 2014; AIC—TechFocus III 2015). Cross-institutional networks have been formed to transfer knowledge across disciplines, such as the Community of Practice on Software-Based Art (Dekker and Falcão 2017) in the PERICLES project and the Software Preservation Network (SPN 2022). In-depth case studies are used to map artwork needs and catalyze new conservation responses (Lintermann et al. 2013; Fino-Radin 2016; Phillips et al. 2018; Roeck et al. 2019; Sherring et al. 2021).

The range of research topics is continuously expanding and covers, among many others: approaches to analysis for the examination and treatment of these works (Engel and Wharton 2014; Wharton and Engel 2015; Ensom 2018); continued access to legacy artworks through emulation (Rechert et al. 2016; Roeck 2018), EaaS (emulation as a service; Espenschied et al. 2013), and virtualization (Falcão et al. 2014; LIMA 2018); collecting and conserving web-based artworks (Hellar 2015; Dekker 2018); the development of tools such as Webrecorder and Conifer by Rhizome [Webrecorder (Software) n.d.; Rhizome—Conifer (Software) n.d.], as well as Rhizome's research into the use of linked open data to describe web-based artworks and preservation information (Fauconnier 2018; Rossenova 2020b); the documentation of software-based artworks (Lurk 2013; Barok et al. 2019; Ensom 2019); the collection, preservation, and treatment of source code (Swalwell and de Vries 2013; Di Cosmo and Zacchiroli 2017; Engel and Phillips 2018); and the application of conservation ethics to the examination and treatment of software-based art (Engel and Phillips 2019). In addition, notable artists have taken the initiative to reflect and advise on collecting and sustaining software-based works (McGarrigle 2015; Lozano-Hemmer 2015; Reas 2016), and researchers such as Qinyue Liu and Colin Post have reviewed conservation practices by artists (Liu 2020; Post 2017).

Chapter 22 acknowledges the point of acquisition as a decisive moment in the life of a software-based artwork and guides the reader through the collection process (Section 22.2). Section 22.3 offers an in-depth look at the technical makeup of software-based art, and Section 22.4 explores a variety of conservation treatment practices.

Glossary

algorithm: A defined sequence of steps or instructions to accomplish a specific task.
API (application programming interface): An API defines interactions between software applications; it is commonly used to exchange files or data.


bus: A communication system for transferring data between components of a computer, including wiring, connectors, ports, and protocols.
client-side: When application code runs on the user's machine, it is considered client-side (e.g., the HTML/CSS code of a website). (See server-side for comparison.)
command-line interface (CLI): This interface allows users to issue commands and run programs by typing text and using the keyboard for navigation. (See Graphical User Interface for comparison.)
comments (in source code): Comments are annotations, instructions, documentation, or other notes written within the code of a computer program that are meant for human readers and ignored by the computer.
CPU (central processing unit): An integrated circuit that can execute instructions represented as computer code.
cross-browser compatibility: A feature of websites that run successfully under different browsers and fail gracefully when necessary.
downwardly compatible: Downwardly compatible software (also referred to as backward-compatible software) refers to a programming language or application that can continue to run files written for earlier versions of that language or software.
device driver: Also simply called a driver, a device driver is a computer program used to manage external devices such as a printer, scanner, or other equipment. Drivers are specific to both the computer's operating system and the device.
emulation: The use of software or hardware tools called emulators, which allow one computer system to behave as if it were a different computer system, to run software on computer hardware that it was not designed for.
EPROM (erasable programmable read-only memory): Non-volatile storage that can be read from, written to, and erased using strong ultraviolet light.
EEPROM (electrically erasable programmable read-only memory): Non-volatile storage that can be read from, written to, and erased using an electrical signal.
executable: An application file that runs when accessed; for example, when one calls it up at the command line and hits Enter, or double-clicks it with a mouse within a Graphical User Interface.
GUI (Graphical User Interface): A GUI offers the user graphical icons and audio along with input devices such as a mouse. (See command-line interface for comparison.)
hardware environment: A set of interconnected hardware components that a specific software program requires to run. (See software environment for comparison.)
HDD (hard disk drive): A non-volatile storage device that reads and writes data from a magnetic platter.
human-readable file or data: Contents that can be opened in a text editor or at the command line and rendered without special characters so that a human being can naturally read them. For example, the digits directly below a bar code are the human-readable portion of the information.
integrated circuit (IC): A circuit contained on a single piece of semiconductor material. Also known as a microchip or chip.
library: A software library contains additional code that is used with a given application or programming language. In many cases, the library is considered a third-party library, meaning that the external library was not written by the same developers who wrote the application or programming language that calls the library.
OS (operating system): The software that controls a device's basic functions, such as file management, managing peripherals, the user interface, and more. Examples include Windows, macOS, and Linux.
PCB (printed circuit board): A fabricated circuit consisting of a conductive material in a non-conductive substrate that forms a base for mounting electronic components.
PROM (programmable read-only memory): Non-volatile storage that can be read from but only written to once.
RAM (random-access memory): Volatile storage used to store data and programs while they are accessed.
ROM (read-only memory): Non-volatile storage that can only be read from and not written to.
SCSI (small computer system interface): (Pronounced "scuzzy.") A set of standards for physically connecting and transferring data between a computer and its peripherals, such as a printer. It also refers to the physical connectors that implement the standard.
server-side: When application code runs on the server (e.g., in PHP or Perl), it is considered server-side. (See client-side for comparison.)
software environment: A set of interconnected software components that a specific software program requires to run. (See hardware environment for comparison.)
software obsolescence: As hardware, operating systems, and other elements of a system change over time, many programming languages and software applications will not run as expected or will no longer run at all.
source code: A human-readable set of instructions, written in a formal programming language, that tells a computer what to do.
SQL (structured query language): Often pronounced "sequel," SQL is a language used to manage data and databases.
SSD (solid-state drive): A non-volatile storage device that reads and writes data from solid-state flash memory.
VCS (version control software): VCS refers to software applications that are used to manage and document change across computer programs, documents, and other files. Git (often used with tools by GitHub) is an example of a popular VCS application.
WYSIWYG (what you see is what you get): An acronym signaling that output to the printer or web reflects what the user sees on the screen (e.g., when developing content using desktop publishing, web development, or graphics software).

22.2 Collecting Software- and Computer-Based Art

When software- and computer-based artworks are collected, their sustainability comes into focus. The responsibility for their care and maintenance is transferred from the artist to the new owner, and each artwork's longevity will critically depend on the actions taken—or not taken—by its new custodians. Before a collected artwork breaks down or requires repair and maintenance, there are important steps that should already occur in the acquisition process to enable future preservation. Relevant information about the artwork and its underlying technical systems has to be gathered, preservation-critical components have to be collected, and rights and responsibilities have to be cleared. Conducting an initial preservation risk assessment and projecting the future needs of an artwork can critically inform the acquisition process and help to successfully sustain the artwork's collection life (Falcão 2019). This is why it is best practice to include conservators and other relevant stakeholders in the acquisition process as early as possible and to proactively define the deliverables needed to support the longevity of a work.

When bringing software-based artworks into a collection, it is ideal to examine and document them as soon as possible after their creation, when they are still fully functional in their native hardware and software environments and display all of the behaviors intended by the artist. Surveying artworks later, perhaps years after their entry into the collection, bears the risk that a software or hardware malfunction could go unnoticed. As an example, when viewing the Guggenheim's web artwork Brandon (1998–99) by Shu Lea Cheang prior to its 2016–17 restoration, words that were intended to blink across the sprawling website appeared static. The words were still legible, and it would not occur to an audience unfamiliar with the work that the blinking function was missing, as contemporary browsers no longer supported the HTML blink tag. Only source code analysis (see Section 22.3.6) revealed the missing functionality (Phillips et al. 2017; Engel et al. 2018). Had there been a simple video of the piece, recorded around the time of its creation, collection caretakers would have had a valuable reference when visually assessing loss and damage many years later.
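One simple form of such source code analysis can be sketched with standard tooling: scanning archived page source for HTML elements that modern browsers no longer render. The scanner and page fragment below are illustrative assumptions, not the method or code used in the Brandon restoration.

```python
from html.parser import HTMLParser

# Elements dropped or deprecated in modern browsers; an illustrative,
# deliberately non-exhaustive list.
DEPRECATED = {"blink", "marquee", "applet", "font", "center"}

class DeprecatedTagScanner(HTMLParser):
    """Collect occurrences of deprecated elements in archived page source."""
    def __init__(self):
        super().__init__()
        self.found = []

    def handle_starttag(self, tag, attrs):
        # HTMLParser normalizes tag names to lowercase before this call.
        if tag in DEPRECATED:
            self.found.append(tag)

# Hypothetical fragment of an archived 1990s web page:
source = '<p><blink>road trip</blink> <font color="red">begins</font></p>'
scanner = DeprecatedTagScanner()
scanner.feed(source)
print(scanner.found)  # → ['blink', 'font']
```

A report like this only flags candidates for review; whether a flagged behavior (such as blinking text) is significant to the work remains a conservation judgment.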
Learning from experiences like this, the conservation community has developed a range of acquisition practices to support the future preservation of software- and computer-based works.

22.2.1 What Is the Artwork? Understanding the Intended Experience

The best way to start an acquisition process is to experience an artwork first-hand, installed and running. This creates an immediate impression of its intended behaviors and potential implications for conservation and maintenance, and critically informs the acquisition process. Many artwork features and behaviors may not be spelled out in the artist-provided information and are not easily transmitted through documentation. This is particularly true for interactive works that require audience engagement or data input. When first experiencing a software-based artwork in operation, collection caretakers aim to answer questions like the following:

• What is the artwork intended to do, and how is the audience expected to interact with it?
• What are the conditions of display and space?
• What are the employed technologies (software and hardware), and do they have an aesthetic or conceptual relevance for the artwork that could make them hard to replace?
• What could challenge the maintenance, upkeep, or repair of the work for the next few exhibitions, as well as far into the future?

Experiencing the work pre-acquisition, even if only partially installed during an acquisition committee meeting or in a gallery setup, may be the last opportunity to gather relevant information before the new owner is charged with the next reinstallation. It is therefore advisable to produce as much documentation as possible on this occasion, whether by means of video, via screen recording or a combination of different documentation methods (see Chapter 11).


Deena Engel et al.

22.2.2 Researching the Meaning, Making, History, and Context

Contextual and technical research during the acquisition phase is a key preservation step for software-based art and lays the groundwork for appropriate conservation strategies. While variability and change are integral to all time-based media, they are particularly fast-paced and urgent in software- and computer-based artworks. Any operating system (OS) or browser update, hardware failure, software obsolescence, or conceptual preference (for instance, if an artist wants to see their historical web artwork updated to run on mobile devices) can be a driving factor for changes to an artwork. The research phase aims to answer questions like the following:

• What is the artwork's previous history of exhibition and change?
• How has it evolved or been updated over time? Were some iterations considered more successful than others, and if so, why?
• How was the artwork conceived and produced (technically and conceptually), and what is significant about it?
• Who was involved in the development of the work?
• Which programming language(s) or software technologies did the artist (or programmer) use, and for what reason?
• How do they relate to other hardware or software technologies that the artist/technician uses? Is this platform used regularly by the artist, or is it a one-off?
• Where does the artwork sit within the practice of the artist and the overall artistic and technological practice of its period?
• How common was that technology at the moment when the artwork was produced?
• How is the artwork experienced and perceived by the audience? The press? The public?

The information sources consulted during this research phase may include, among others: artist-provided manuals, relevant internal and external email correspondence, conversation notes, video recordings of the functioning work, images and videos of a work on social media, entries in media art databases [e.g., ADA | Archive of Digital Art (Website) n.d.], catalog texts, curatorial descriptions of a work, artists' websites, or GitHub pages. Gathering and organizing this information, for example with the help of templates (Guggenheim Museum—CCBA n.d.; Ensom et al. 2021; Metropolitan Museum of Art—Sample Documentation and Templates n.d.), will support an informed discussion with the artist. Conducting formal acquisition interviews with living artists is a common practice in many contemporary art collections (see Chapter 17); in the case of software-based artworks, such interviews may also include programmer(s) and others with in-depth knowledge of the technological makeup of a work.

The process of analyzing and documenting artwork components is iterative, and a deeper analysis of the system, as detailed under Section 22.3, will produce further information. Eventually, the object file of a software-based artwork will contain a range of documentation, including but not limited to a component-based inventory (see Section 22.2.3), display and configuration instructions, identity and iteration reports (see Chapter 11), a component-based risk assessment (see Section 22.2.6 and Table 22.1), any source code documentation (see Section 22.3.6), and metadata in the Collection Management System. Considering the context of the collecting institution, including the type and size of the collection and the resources and expertise available, is fundamental to devising sustainable preservation strategies.


Table 22.1 An example of an initial, basic inventory and component-based risk assessment. This table models the type of information that may be captured in the process; the level of available detail, including the work-defining properties of computer components, may deepen with continued examination and stakeholder involvement.

Component: Computer
  Make and Model: Torch RedFoCS (2003)
  Function, Specs, Relationships: Executes artwork; capable of running the OS and executable file; sends display information to the monitor; see spec sheet
  Work-Defining Properties: Fits behind the monitor; joint with the monitor and mounted/hidden behind it
  Significance: Functional; conceptual; historical (early example of a monitor with built-in computer)
  Hazards/Risk: Wear and tear; most at risk: hard disk, CPU, power supply, video cards, and capacitors. Risk is high
  Recoverability: Replaceable, as long as it remains invisible behind the monitor and has appropriate video output
  Preservation Strategies: Controlled storage (climate, dust-free, ESD-safe); create and store disk image; source and set up a spare computer (for possible replacement); clean regularly, particularly when on display

Component: Monitor
  Make and Model: Torch RedFoCS/NEC LC (2003)
  Function, Specs, Relationships: Displays artwork; screen size 17″ to 19″; LCD panel type
  Work-Defining Properties: 4:3 aspect ratio; slim black frame; joint with the computer. See spec sheet
  Significance: Functional; aesthetic; conceptual ("discrete object"; 4:3 content)
  Hazards/Risk: Wear and tear; change of brightness and color balance; permanent screen lines. Risk is high
  Recoverability: Replaceable with a different monitor (17″ to 19″, 4:3, slim black frame)
  Preservation Strategies: Controlled storage (climate, dust-free, ESD-safe); source backup monitor (for possible replacement); clean regularly, particularly when on display

Component: Video connector cable
  Make and Model: VGA connection from PC to monitor
  Function, Specs, Relationships: Transmits image signal
  Work-Defining Properties: Short, hidden behind monitor; compatible with computer output and monitor input
  Significance: Functional (connection type not significant)
  Hazards/Risk: Wear and tear; low risk
  Recoverability: Replaceable with other hardware-compatible cables/connector types
  Preservation Strategies: No action needed (VGA cables are still available)

Component: Operating system
  Make and Model: Windows XP (Home Edition)
  Function, Specs, Relationships: Runs artwork executable (Director .exe)
  Work-Defining Properties: Capability to run artwork.exe
  Significance: Functional
  Hazards/Risk: Obsolete; no support/security updates, but artwork is offline; low risk
  Recoverability: Replaceable with an OS that can run artwork.exe, or XP can be run in emulation
  Preservation Strategies: Store backup copy of Windows XP in a digital repository

Component: Application ("the artwork")
  Make and Model: Executable file (.exe) created in Macromedia Director 8.0
  Function, Specs, Relationships: Machine-readable instructions suited for OS Windows XP
  Work-Defining Properties: Contains instructions and data dictating artwork behaviors (animation speed, colors, movement, shapes, etc.)
  Significance: Functional; aesthetic; conceptual; historical
  Hazards/Risk: Dependency on Windows OS XP; no hardware or network dependencies; moderate risk
  Recoverability: If it does not run on a recent Windows OS (test), run on XP emulation
  Preservation Strategies: Store backup copy in a digital repository

Component: Artist's project file
  Make and Model: Project file (.dir), Macromedia Director 8.0
  Function, Specs, Relationships: Production element, required for treatment and examination
  Work-Defining Properties: Allows conservators/programmers to fully capture (and potentially reconstruct) artwork behaviors and to update any settings
  Significance: Preservation-critical element
  Hazards/Risk: Proprietary, obsolete, limited support; dependent on obsolete Director platform. Risk is high
  Recoverability: Run legacy Director in an emulated legacy environment, if needed
  Preservation Strategies: Store project file in a digital repository; obtain and back up a copy of the legacy platform Macromedia Director 8.0
Caring for Software-Based Art

22.2.3 Identifying the Anatomy of the Work

To effectively care for a software-based artwork, collection caretakers need to identify its interconnected hardware and software components and understand their relative importance in realizing the artwork's work-defining properties, as well as how each component potentially affects the way the system behaves. The materials supplied by the artist will often consist of an executable file or sets of files and directories, a computer with peripherals ready to be set up in a gallery for display or, for web-based art, a group of files and folders to be uploaded to an institutional web server. Some artists may also supply source code or other source materials. Additional components may include sculptural elements, temporary exhibition hardware (e.g., a projection screen or monitor), replacement parts, or hard drives with backup files and software not included in the OS.

For complex works with many elements, it is helpful to start with a basic inventory listing all the known components of the artwork. The example in Table 22.1 illustrates how a component-based inventory can then be expanded to structure the analysis and risk assessment process and support the development of preservation strategies. The inventory should record basic information on hardware, software, and data components and note their functionality and the relationships among the components. For hardware components, peripherals and connectors should be considered, and standards, protocols, makes, and models should be specified. See Section 22.3.1 for an overview of commonly encountered computer components and Section 22.3.2 for guidance on analyzing and documenting hardware. For an introduction to software technologies, see Sections 22.3.4 through 22.3.8.
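The structure of such an inventory can also be captured in machine-readable form. The following Python sketch is illustrative only (it is not drawn from any cited institutional workflow); the field names loosely mirror the columns of Table 22.1, and the example component is hypothetical. Modeling records this way allows them to be serialized for an object file or collection management system.

```python
from dataclasses import dataclass, field, asdict

@dataclass
class Component:
    """One entry of a component-based inventory and risk assessment (cf. Table 22.1)."""
    name: str                        # e.g., "Monitor"
    make_and_model: str              # e.g., "Torch RedFoCS/NEC LC (2003)"
    function: str                    # function, specs, relationships
    work_defining_properties: str
    significance: list = field(default_factory=list)   # functional/aesthetic/conceptual/historical
    hazards: str = ""
    risk_level: str = "unknown"      # e.g., "low", "moderate", "high"
    preservation_strategies: list = field(default_factory=list)

# A hypothetical inventory with one component, mirroring the cable row of Table 22.1:
inventory = [
    Component(
        name="Video connector cable",
        make_and_model="VGA connection from PC to monitor",
        function="Transmits image signal",
        work_defining_properties="Short, hidden behind monitor",
        significance=["functional"],
        hazards="Wear and tear",
        risk_level="low",
        preservation_strategies=["No action needed (VGA cables are still available)"],
    ),
]

# Convert to plain dictionaries, e.g., for JSON export to the object file.
rows = [asdict(c) for c in inventory]
```

A serialized inventory like this can be re-validated at each condition check, although in practice it supplements rather than replaces narrative documentation.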

22.2.4 Artwork Inspection

During an acquisition process, software-based artworks should be inspected in the conservation lab to ensure that none of the needed deliverables are missing and that the work is fully functioning, and to record its behaviors in the artist-supplied or artist-approved environment. The examination and documentation of the hardware and software components may be undertaken by the conservator themselves or—depending on their skill set and the level of engagement of the artist—through collaboration with the artist and/or programmers and other technical specialists.

Creating an Inspection Environment

In order to test the software delivered, an inspection environment—made of suitable hardware and software—will be required. In some cases, this will be the hardware delivered by the artist. If no hardware is received with the artwork, the examiner has to set this up. In some cases, this environment will also be used for display. Ideally, the environment is created in collaboration with the artist or by using an artist-supplied specification. If this is not available, decisions can be made based on the format of the executable software (i.e., which operating system it will run on), the period in which the artwork was produced, and any knowledge of the technologies used in its production or previous displays. As the software environment is created through the process of installing and configuring an operating system and other programs, the components that were used to build it should be documented and archived.

In some cases, a desktop computer with the intended operating system, or even just a few browsers, may be enough for an initial inspection. However, some works—regardless of whether hardware was provided by the artist or not—may need to be fully installed to be correctly inspected. There may be limitations as to how much of the work can (or must) be installed for inspection purposes, namely, for large works or works that require a physical installation space with certain characteristics. For example, to inspect and test Subtitled Public (2005), a software-based artwork in the Tate collection by Rafael Lozano-Hemmer, it is necessary to have a completely dark space with a minimum size of 8 × 8 meters and with an appropriate floor covering. Testing this software can involve three or four people working for a week (see fig. 22.1).

Figure 22.1 Inspecting and testing the software-based artwork Subtitled Public (2005), by Rafael Lozano-Hemmer, requires an 8 × 8 m space and three to four people working for a week. Photo: Tom Ensom

Switching the Artist-Provided Artwork On for the First Time

Before powering on a software-based artwork for the first time, it should be verified that an identical copy of the software and data that comprise the artwork is stored safely. Any computer or storage device can fail spontaneously at any time, potentially resulting in irretrievable data loss or corruption. Supplying power to or booting a computer can in itself cause it to fail if it is in poor condition. Risk increases with its age, hours of usage, and improper storage conditions. A computer that has been sitting on a shelf for five or ten years could suffer a hardware failure upon being switched on (see Section 22.3.3). If a computer has been improperly stored, you may wish to ensure that internal components are fully dry and clean before use. Dust can be removed with a can of compressed air, an ESD-safe vacuum cleaner, or equivalent, while ingrained dirt and debris can be removed with isopropyl alcohol and a lint-free cloth.


Especially if the software-based artwork runs on an artist-provided computer, a backup of the data contained on the computer (not just the files that make up the artwork) should be created. The ideal backup is a disk image of the entire computer's storage devices, which includes all installed systems and applications. Chapter 14 provides a detailed overview and guidelines on how to create disk images. The data must be safely backed up (see Chapters 8 and 13), and only then should the computer be turned on. At this point, any other related hardware (e.g., peripherals such as cameras or joysticks) should be connected so that the full functionality of the work can be tested and documented. Before the artwork itself is run, it is advisable to use built-in system tools such as the system report to create basic documentation of the hardware and software environment (see Section 22.3.2). This way, the configuration is captured directly from the system rather than derived from inspection or manuals, and before any operational failure may be triggered by executing the artwork.
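The backup precaution described above can be made verifiable with a fixity check: a backup is only trustworthy if it is bit-identical to the original. The following Python sketch is illustrative (the function names and paths are our own, not part of any cited tool) and compares SHA-256 checksums of a disk image and its copy before the original computer is ever powered on.

```python
import hashlib
from pathlib import Path

def sha256_of(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 checksum of a file, reading in 1 MiB chunks
    so that multi-gigabyte disk images do not need to fit in memory."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def verify_backup(original: Path, backup: Path) -> bool:
    """True only if the backup is bit-identical to the original."""
    return sha256_of(original) == sha256_of(backup)
```

The recorded checksums can also be stored with the object file, so that later condition checks can detect silent corruption of the stored disk image itself.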

Running and Inspecting the Software and Hardware

Once the system documentation is created, the software can be run. Especially when inspecting older, fragile artworks that have a higher risk of failure, this moment should be considered a valuable opportunity to create video documentation of the work running and of the user interface on the computer. When inspecting the software and hardware running, collection caretakers should aim to generate answers to questions like the following:

• Are the artist-supplied components (equipment, software, and data) complete and working correctly?
• Are we missing any crucial information necessary to run the piece, such as passwords?
• Which software is employed by the artwork, and in which version and configuration?
• What is the setup and calibration process required to run the software correctly?
• When running correctly, what is the software supposed to do functionally and aesthetically (what are its audiovisual, kinetic, and/or haptic behaviors, including colors, image characteristics, movement, speed, duration, data input, interactivity, etc.)?

Answering some of these questions may require an in-depth analysis of the artwork as described in Section 22.3.

Understanding the Work-Defining Properties

The concept of "work-defining properties" was introduced by Laurenson (2006) and has since been widely adopted in the time-based media conservation community. It serves to identify the properties of the artwork—such as hardware characteristics, audiovisual content, installation components, and spatial parameters—that define its identity, are essential for its uncompromised integrity, and should be preserved. For software-based art, Laurenson (2014, 24) expanded these general properties to include appearance (including sculptural hardware components), behaviors, and modes of interaction. In addition to these discernible characteristics, significance in software-based art has also been attributed to its original source code—namely, the artist's or programmer's choice of technologies, programming languages, and coding styles (Engel and Phillips 2019, 181).


Developing an understanding of the artwork's work-defining properties is a critical step in the risk assessment process. This can be supported by a wide range of information, including interviews with the artist (and other stakeholders), technical specifications, source code, and documentation of past installations; the interpretation of this information will be informed by the conservator's own knowledge and experience. One approach is to identify work-defining properties at the component level (see Table 22.1). However, care should be taken to ensure that these are not interpreted as fixed or authoritative. Attaching significance can involve subjective judgments, and a complete picture of the artwork's identity will continue to emerge throughout its life.

22.2.5 Conducting Risk Assessment for Preservation

Risk assessment supports the development of preservation strategies and helps to prevent damage and loss to objects and collections. The concepts of risk assessment and risk management are widely used in cultural heritage conservation as a way of identifying hazards, threats, and risks for specific objects and collections, as well as evaluating the impact of those risks and finding ways to mitigate them. ICCROM (the International Centre for the Study of the Preservation and Restoration of Cultural Property) defines risk as "the chance of something happening that will have a negative impact on our objectives" (ICCROM 2016). Building on these practices, Agnes Brokerhof (2011) developed a risk assessment methodology for contemporary art installations, which has also proved helpful for software-based art (Falcão 2010).

Unlike traditional fine art conservation, where change is commonly seen as loss, technology-based artworks often must change to retain their functionality. Moreover, the possibility of full recovery, by which damaged or failing components can be replaced without change to the work-defining properties of an artwork, means that recoverability, or the balance between technical feasibility and affordability, becomes a key factor when identifying and prioritizing risks and devising preservation strategies.

The following section explains how to identify and evaluate risks for the conservation and display of a software-based artwork. In the process, each component's significance for the artwork is assessed, including its functional, aesthetic, or conceptual significance (see Table 22.1). Then, its risks of failure and loss and its potential recoverability, from both a conceptual and a technical perspective, are identified. A similar risk may impact different components in different ways, depending on their significance and ability to recover from failure. This risk assessment is best reviewed and agreed upon with the artists, creators, curators, and other stakeholders to build consensus around preservation, maintenance, and display strategies.

Understanding and Attributing Significance

Significance refers to "the values and meanings that items and collections have for people and communities" (Russell and Winkworth 2009) and reflects the value attributed by different stakeholders; a conservator and an artist may value a specific component differently. Moreover, the significance attributed to the same component may vary in the context of different artworks: an ordinary-looking desktop computer may carry a purely functional significance for one artwork but be of conceptual significance in the context of another (e.g., if the artist chooses to display it on the gallery floor). The computer could also carry historical value if it represents a significant point in art history, the history of technology, or a key moment in the artist's practice. Artwork components often combine several types of significance at once, in varying degrees: a computer could have both functional and historical significance, but the functionality might be considered more significant for the artwork than its agency in representing a certain point in history.

One way of attributing significance is through an understanding of the artwork's identity and work-defining properties. Table 22.1 exemplifies how significance attribution (as an integral part of the risk assessment process) considers all hardware and software components, including monitors, cables, operating systems, and associated networks and databases. Especially when planning conservation treatments (see Section 22.4), it may be necessary to break down software and hardware components further (e.g., to consider the significance of a particular graphics card inside a computer) or to consider the programming language underlying the software. For example, the HTML code in JODI's web artwork wwwwwwwww.jodi.org %Location (1995) does not only functionally create the page but also carries conceptual and aesthetic significance, as it is written so that the HTML code takes the form of a drawing (fig. 22.2).

The relative significance of artwork components is the basis for the subsequent assessment of their recoverability. For instance, if a computer's significance is mostly functional, the full spectrum of options for recovery—from careful storage to complete replacement—is available. If it is deemed to have aesthetic significance, it may still be replaced, as long as the replacement does not alter the work's appearance. If the value of the computer is deemed to be conceptual, the options for preservation may be limited to storing the original computer, like a traditional sculpture—in extremis, to the detriment of its functionality. Generally, in the context of time-based media art, high significance is attributed to functionality, with the aim of preserving a functioning artwork, often overriding the historical or aesthetic value of a component by allowing for its replacement.
In other words, faced with a work that stopped functioning, many, but not all, artists will be happy to make changes to maintain that functionality and often to technically improve it. The latter may raise ethical concerns, as these changes may compromise historical or aesthetic significance.

Figure 22.2 The webpage (left) and underlying HTML code (right) for the web artwork wwwwwwwww.jodi.org %Location (1995) by JODI. Screenshot: Patricia Falcão


It is important to highlight that different collections, communities, and other stakeholders may value different aspects of a work. For instance, early computer art by artists such as Frieder Nake or Vera Molnar has long been collected by museums only in the form of its printed outputs. The value of the underlying algorithms has been neglected in that context, but it was valued at the time of creation by the artists themselves and by journals such as Computer Graphics and Art (Della Casa 2012), which chose to publish that code. Currently, the community around the Recode Project is reusing that code and translating it to Processing (Recode Project n.d.).

Identification of Hazards and Estimation of Associated Risks

The agents of deterioration for traditional objects of cultural heritage include physical forces, fire, water, criminals, pests, contaminants, light and UV radiation, incorrect temperature, incorrect relative humidity, and custodial neglect (Waller 1994). In spite of the misleading discourse about the immateriality of digital objects, all of these hazards are relevant and will apply to data carriers (even servers in the cloud), equipment, and any other physical components of a software-based artwork. Brokerhof (2011, 97) identified further tangible and intangible agents of deterioration for media installations: electricity, wear and tear, plastics and metals degradation, information dissociation, malfunctioning, and misinterpretation. Further to these, the digital preservation community has established additional hazards specific to software and, more broadly, digital objects, such as obsolescence, data loss, rendering issues, or inaccessibility (Digital Preservation Coalition (DPC) 2022). Fino-Radin identified three primary risks faced by new media artworks in the Rhizome collection: diffusivity (i.e., a tendency to reference external data sources), data obsolescence, and physical degradation (Fino-Radin 2011).

The magnitude of a specified risk depends on three factors: the likelihood of a hazard happening, the impact of that hazard, and the possibility of recovery. For example, the risk that a file is lost if it is stored on a single hard drive is very high, as it is known that hard drives fail sooner or later. If copies of that file are stored on multiple storage devices, the risk of losing that file is much lower.
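As a toy illustration of the three-factor model described above, the following Python sketch scores each factor on a 1 to 5 scale and combines them into a single magnitude. The scales and the multiplicative combination are illustrative assumptions of ours, not an established conservation metric; real assessments weigh these factors qualitatively.

```python
def risk_magnitude(likelihood: int, impact: int, recoverability: int) -> int:
    """
    Combine the three factors named in the text into a rough score.
    Each factor is scored 1 (low) to 5 (high). Recoverability is inverted,
    since easy recovery reduces the effective risk. The multiplicative
    rule is an illustrative choice, not a standard.
    """
    for factor in (likelihood, impact, recoverability):
        if not 1 <= factor <= 5:
            raise ValueError("factors are scored on a 1-5 scale")
    return likelihood * impact * (6 - recoverability)

# A file on a single hard drive: failure is likely, loss is total,
# and without copies there is no easy recovery.
single_copy = risk_magnitude(likelihood=5, impact=5, recoverability=1)  # 125

# The same file replicated across multiple storage devices:
# a single drive failure is still likely, but recovery is easy.
replicated = risk_magnitude(likelihood=5, impact=5, recoverability=5)  # 25
```

Even a crude score like this can help rank components of a collection for attention, provided the subjective inputs are documented alongside the result.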

Assessing Recoverability and the Impact of Obsolescence

Even before hardware and software components fail, they may be affected by obsolescence, "a condition that affects an element or system when it or its parts are no longer commercially available or supported by the industry that initially produced them" (Falcão 2011). It is important to understand that hardware or software does not stop working because it is obsolete. The issue is that for replaceable components, it becomes increasingly difficult and expensive to repair or replace them once they are obsolete, and the increasing incompatibility of obsolete hardware and software with newer environments compromises their (correct) function. More staff time must be invested in sourcing replacements, and dwindling specialty expertise is increasingly costly. For factors that drive hardware failure and obsolescence, see Section 22.3.3, and for software obsolescence, see Section 22.3.7. For an introduction to treatment responses to obsolescence, see Section 22.4.

A decision-making approach can support the process of assessing the recoverability of artwork components and understanding how obsolescence impacts their recovery in the context of their system. The example in fig. 22.3 explores the recoverability of an obsolete SCSI printer component of a software-based artwork, the different options for its recovery, and how, after a certain point, this may drive the need to change the whole system.


Figure 22.3 Recoverability assessment, here of an obsolete SCSI printer in the context of a software-based artwork. Diagram: Patricia Falcão

Assessing recoverability during the acquisition process helps to identify the deliverables needed to support the artwork's recoverability in the future. In this example, the need for spare SCSI printers and printer parts was identified as a recovery option until they can no longer be sourced. Once printers with SCSI connections are no longer available, the software will need to be changed to output to USB printers or whatever the common connection is at that point. Having the source code will make this intervention in the code much easier (see Section 22.4). In summary, the main questions regarding recovery/replacement are as follows:

• Is the element replaceable?
• Will its replacement lead to other changes to the system?
• Can an appropriate replacement be found or produced?
• Do we have the means (conservation elements, expertise, and funding) to produce that replacement?
• Is there a preferred approach if the element cannot be replaced? If the work cannot be recovered, is exhibiting it as documentation an option?
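The questions above can be read as a rough decision chain, in the spirit of the assessment diagrammed in fig. 22.3. The following Python sketch encodes one possible ordering; the branch order and outcome labels are illustrative assumptions of ours, and real decisions involve stakeholders and are rarely this linear.

```python
def recovery_path(
    replaceable: bool,
    replacement_sourceable: bool,
    means_available: bool,
    documentation_acceptable: bool,
) -> str:
    """
    Walk the recovery/replacement questions in a fixed (illustrative) order
    and return a suggested course of action.
    """
    if replaceable and replacement_sourceable and means_available:
        return "replace component (assess knock-on changes to the system)"
    if replaceable and not replacement_sourceable:
        return "stockpile spares now; plan migration before supply runs out"
    if not means_available:
        return "defer treatment; secure expertise or funding first"
    if documentation_acceptable:
        return "exhibit as documentation"
    return "work at risk of loss; escalate with stakeholders"
```

For the SCSI printer example, this chain would point first at stockpiling printers and parts, and then, once they can no longer be sourced, at intervening in the software itself.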

22.2.6 Collecting Preservation-Relevant Components, Information, and Rights

The sustained collection life of a software-based artwork critically depends on the hardware and software components and on the information and rights that the collector receives from the artist or gallery. Time-based media conservators play an active role in negotiating and obtaining deliverables. The collection team should establish the deliverables in close dialog with the artist/gallery and list them in the purchase agreement (see Appendix 17.1 "Bitforms Gallery Collector Agreement"). This way, the transaction is clearly defined for both parties, and funds can be released upon the successful quality and condition control of the received deliverables.

Artist-Provided Deliverables

Acquisition deliverables for software-based artworks depend on the type of artwork and its technological dependencies, the artist's preferences, a collection's resources, and the preservation awareness of the involved parties. In general, the collection goal is to enable the new owner to responsibly care for the work, to maintain and update it, to examine and treat it, and thereby to manage its change without compromising its integrity—even if the artist is not or no longer available to consult. To support this transition of care and responsibility, the artist should be expected to supply the following:

• A full, running version of the work, including all hardware and software components. This setup acts as an artist-approved reference of how the work should appear and function. If the artist does not supply all components, the collector should source or build the setup and have it approved by the artist at the point of acquisition or the first gallery display of the work.
• Technical specifications for hardware that is not supplied and should be sourced by the collection, including specifications on the future replacement of supplied hardware.
• Comprehensive installation instructions. This includes detailed instructions on how to install the artwork in a variety of spaces and conditions; how to install, configure, and run the software; how to handle hardware and software maintenance and failures (troubleshooting, replacements, updates); and whom to contact if specialty expertise, service, or supply is needed.
• A stockpile of hardware or replacement parts (e.g., in the case of rare legacy equipment or custom-made parts).
• A complete backup copy of the software required to run the artwork, including executable files, software dependencies, and firmware. Where executables can be produced for multiple operating systems (e.g., a Unity executable can be created for macOS, Windows, or Linux environments), an artist can be asked to produce executables for one or more operating systems.
• For web-based works, an agreement on who is responsible for hosting the website; ideally, the collector will have a full backup of the site, even if not hosting the site themselves. The rights to any significant domain must also be acquired at this stage.
• The source code of the project, as well as file directories, text files, media assets, libraries, and project files (e.g., for a development environment such as Flash, Director, Xcode, or Unity, or 3D print files).
• Granting of rights (via purchase agreement or non-exclusive license): the artist should grant the collector the right to show, publish, and preserve the artwork; to create copies of the work for the purpose of preservation and access; and to treat the work (e.g., to migrate or update the code or project file, preferably in consultation with the artist, as long as the artist is available) to sustain the life of the artwork.
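Once deliverables are received, their completeness and integrity can be tracked with a simple manifest. The following Python sketch is illustrative (it is not drawn from a specific institutional tool, and the directory layout is hypothetical); it records the size and SHA-256 checksum of every artist-supplied file, so that re-running it later and comparing results reveals missing or altered components.

```python
import hashlib
from pathlib import Path

def build_manifest(root: Path) -> dict:
    """
    Walk a directory of artist-supplied deliverables and record, for each
    file, its size in bytes and SHA-256 checksum, keyed by relative path.
    Note: read_bytes() loads each file whole; chunked reading (as for disk
    images) is preferable for very large files.
    """
    manifest = {}
    for path in sorted(root.rglob("*")):
        if path.is_file():
            manifest[str(path.relative_to(root))] = {
                "size_bytes": path.stat().st_size,
                "sha256": hashlib.sha256(path.read_bytes()).hexdigest(),
            }
    return manifest
```

The resulting dictionary can be serialized (e.g., to JSON) and stored in the artwork's object file alongside the condition report, giving the quality and condition control of deliverables a concrete, repeatable baseline.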

“Obtaining the uncompiled, human-readable source code in addition to the compiled, machine-readable executable file . . . is perhaps the most crucial act of preventive conservation” (Engel and Phillips 2019, 192). Section 22.3.6 shows how source code analysis is a critical tool for identifying artwork behaviors, especially randomized, generative, and other behaviors that cannot be quantified and qualified through human observation. Source code is also often needed in the long term to maintain, update, or migrate a software-based artwork (see Section 22.4.8) and to ascertain software and systems dependencies that the conservator must take into account for conservation and future treatment.

However, collectors may not always succeed in their negotiations to obtain the source code in addition to the executable file. While some preservation-conscious artists like Rafael Lozano-Hemmer, Jürg Lehni, and Cory Arcangel share their code with collectors or GitHub communities, other artists feel more protective of it. Contractually agreeing on who has access to the source code and for what purposes can be very helpful in alleviating any concerns and obtaining an artist's consent. Not having access to the source code increases the risk of losing access to the artwork in the future. Although it does not preclude preservation, intervention may be significantly more technically challenging and costly without the source code, requiring reverse engineering of the software (see Section 22.3.6).

Sourcing and Creating Additional Components as an Act of Preventive Conservation

Additional measures are recommended to be taken by the collector to broaden future preservation options:

• Create an identical exhibition copy of the whole artwork system, both hardware and software. This limits the use of the artist’s original over extended periods of time, prolonging its life as a reference for the creation of new copies and systems.
• Create a disk image (see Chapter 14), either of the original computer or created from scratch by installing the artwork software on a disk image with the correct software environment. This supports access not only to the artwork software but to the whole software environment that surrounds it.
• Gather supporting software in its artwork-relevant versions, including production software (e.g., Adobe Flash, Visual Basic, Max/MSP, Processing), media playback software (e.g., RealPlayer, Adobe Flash plug-in), operating systems, libraries, plug-ins, and drivers. In some cases, the software may not be available to purchase anymore and will have to be sourced through the second-hand market.
• Record details of license keys and/or account information required to access or update these components, where applicable.
• Extract and store media files separately outside of the software (e.g., image or sound files animated on a Director timeline) to reduce the risk of access loss and to enable future reconstruction of the artwork with different technologies.
• If the collector is unable to obtain the source code, future examination and treatment may depend on decompiling the artist-provided executable. Even if this path does not guarantee success (Engel and Phillips 2019, 192), the collector should source the appropriate decompiler and compiler software immediately as an act of preventive conservation. The compiler software is not artwork-specific but may be difficult to obtain in the future when it might become necessary to recompile the artwork.
• Create comprehensive audiovisual documentation, such as a video recording, narrated screen recording, or similar, to capture the artwork’s intended appearance, experience, behaviors, and functions.
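Backup copies, disk images, and extracted media files are only as trustworthy as the collector's ability to verify that they have not changed. As a minimal sketch of such fixity documentation (this is an illustration, not a tool referenced in this chapter; the function names `make_manifest` and `changed_files` are invented for the example), the following Python code generates a SHA-256 checksum manifest for a directory of acquired files and reports any file whose contents later change:

```python
import hashlib
import json
import tempfile
from pathlib import Path

def sha256_file(path: Path, chunk_size: int = 1 << 20) -> str:
    """Compute the SHA-256 digest of a file, reading in chunks."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        while chunk := f.read(chunk_size):
            digest.update(chunk)
    return digest.hexdigest()

def make_manifest(root: Path) -> dict:
    """Map each file path (relative to root) to its SHA-256 digest."""
    return {
        p.relative_to(root).as_posix(): sha256_file(p)
        for p in sorted(root.rglob("*"))
        if p.is_file()
    }

def changed_files(root: Path, manifest: dict) -> list:
    """Return paths whose current digest no longer matches the manifest."""
    current = make_manifest(root)
    return sorted(p for p in manifest if current.get(p) != manifest[p])

# Demonstration on a throwaway directory standing in for an acquired backup.
with tempfile.TemporaryDirectory() as tmp:
    root = Path(tmp)
    (root / "artwork.exe").write_bytes(b"\x00\x01\x02")       # placeholder executable
    (root / "assets").mkdir()
    (root / "assets" / "clip.mov").write_bytes(b"fake media")  # placeholder media asset
    manifest = make_manifest(root)
    (root / "manifest.json").write_text(json.dumps(manifest, indent=2))
    print(changed_files(root, manifest))  # [] — nothing has changed yet
```

Storing the manifest alongside the backup (here as JSON, a plain-text format) allows periodic re-verification, in the same spirit as the non-proprietary documentation formats recommended later in this chapter.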


Deena Engel et al.

If other preservation-relevant deliverables are not received, the collector is well advised to step in and source or create these components on the artist’s behalf (seeking artist approval, where applicable), including stockpiling equipment and replacement parts or sourcing software the artwork depends on. For handling proprietary software in the context of preservation, the Software Preservation Network published a report on how fair use may be applied to software preservation (ARL 2019).

22.3 Understanding Your Software-Based Artwork

In order to effectively care for a software-based artwork, conservators will need to be able to identify the technologies used, make sense of their role within the system of the artwork, and understand how these technologies are likely to be impacted by the passing of time. Through a process of analyzing and documenting their findings, conservators should aim to capture critical information to guide preventive conservation now and treatments in the future. In this section, the authors aim to support the in-depth examination process by providing an overview of commonly encountered hardware and software technologies, their dependencies, and implications for preservation. Key failure and obsolescence risks are explored, and practical guidance on approaches to analysis and documentation is provided.

Software-based artwork systems are made up of two types of components: hardware and software. Hardware components are electronic devices that are capable of processing and responding to instructions (or other inputs) and generating tangible outputs (such as images on a computer monitor). The hardware associated with a software-based artwork will usually consist of at least a digital computer containing a microprocessor. This is an integrated circuit—a circuit contained on a single piece of semiconductor material (also known as a chip)—capable of processing binary data through logical and mathematical operations. Software components consist of encoded instructions and associated data that tell a computer what to do. These instructions are written in programming languages and can be stored and run by the computer on demand.

22.3.1 What Is Hardware?

Hardware consists of the physical, electronic components of a computer system. Modern computer systems make use of an array of hardware components, which, together with software components, turn instructions and other inputs into tangible outputs. The purpose of a system will dictate the kinds of hardware components used. For example, a computer used to run virtual reality software will need to incorporate powerful graphics processing hardware and attach to specialized peripherals, while a computer that can fit inside a smartphone will need to use components with reduced size and, consequently, reduced performance.

In many cases, the hardware used to realize a software-based artwork may include the use of display equipment common to other forms of time-based media. For example, an artwork may utilize monitors or digital projectors to display moving images alongside audio equipment to reproduce sound. These kinds of hardware are sometimes referred to as peripheral devices in the context of computing.

The primary focus of this section is on computers, as they are essential to the execution of the software and thus central to the realization of the artwork. Conservators are likely to encounter a diverse array of computer hardware when caring for software-based art. In this section we introduce common types of computer hardware and consider why and how they are used by artists. Particular preservation risks to hardware are also highlighted, but it is important to note that all hardware is at risk of loss in the long term (see Section 22.3.3).


Figure 22.4 Various types of computers. Clockwise from top left: iMac G3, custom desktop tower, Raspberry Pi Model A single-board computer, Arduino Uno Rev3 microcontroller, off-the-shelf COMPAQ desktop, and Mac Mini. Photo: Tom Ensom

Common Computer Components

Computers are made up of electronic circuits and can employ a single circuit board or many circuit boards connected via suitable interfaces. Desktop computers are typically made up of many removable components connected to the motherboard via internal (concealed within the computer case) and external (accessible on the exterior of the case) buses. This approach means that individual components can be replaced or upgraded without having to replace the entire system. As computer size decreases, there is a higher likelihood of components being soldered directly onto a reduced number of circuit boards. Microcontrollers represent this at the smallest extreme, where all of the hardware components required for a particular function can be contained on a single circuit board or even a single integrated circuit. This offers much-reduced size but makes replacing or repairing individual components more challenging and alters prospects for maintenance in the course of conservation treatment (see Section 22.4).

Despite the wide variety of uses of computers and the myriad forms they take, there are certain common components found across modern computers that are frequently encountered when caring for software-based art. These are summarized in Table 22.2 as a reference for the use of this terminology in subsequent sections. While the components listed are framed in the context of separate devices contained in a desktop computer, many of the components listed can also be found built into a single circuit board or integrated circuit.

Table 22.2 Description of common computer components and the key specification information to capture when documenting their specification

Motherboard
• Description: The motherboard is the circuit board to which the computer’s other hardware components connect and through which they communicate. Also known as a mainboard or logic board.
• Key specification information: model; supported interfaces.

Central Processing Unit (CPU)
• Description: The CPU carries out operations based on instructions written in computer code. A CPU implements a specific instruction set architecture (ISA), which defines the kind of instructions that can be executed (e.g., x86-64 in modern desktop computing and ARM64 in mobile computing).
• Key specification information: model; architecture; core number; clock speed.

Random-Access Memory (RAM)
• Description: RAM is volatile memory to which programs and data are loaded when they are executed or used, allowing for rapid access by the CPU. The information in RAM is lost when the machine is turned off or rebooted.
• Key specification information: model; capacity.

Storage Devices
• Description: Devices for storing and accessing digital data. At least one storage device within a computer will contain a software environment consisting of an operating system, software programs, and user data.
• Key specification information: model; capacity; host connection interface (e.g., PATA/IDE, SATA, SCSI).

Graphics Processing Unit (GPU)
• Description: A GPU is a device for rendering 3D graphics. While sometimes taking the form of standalone expansion cards, GPU functionality can also be integrated into a CPU.
• Key specification information: model; VRAM (video RAM); host connection, if removable (e.g., AGP, PCIe).

Network Devices
• Description: Devices that support or extend the computer’s networking capabilities.
• Key specification information: model; host connection (e.g., AGP, PCIe); external interfaces (e.g., 8P8C ethernet).

Audio Devices
• Description: Devices that process, route, and/or generate audio signals, such as sound cards or audio interfaces.
• Key specification information: model; host connection (e.g., PCIe, FireWire 400); supported input and output connections (e.g., 3.5 mm jack, RCA).

Power Supply Unit (PSU)
• Description: A device that routes power to components that require it.
• Key specification information: model; wattage.

Peripheral Devices
• Description: Removable devices which extend the functionality of the computer system, such as cameras, sensors, and human interface devices.
• Key specification information: host connection interface (e.g., USB 3.0, FireWire 400).
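The fields of Table 22.2 can also serve as a template for structured, machine-readable documentation. As an illustrative sketch (the `ComponentRecord` class and the example model names and values below are invented for this example, not drawn from the chapter), the following Python code records component specifications and exports them to JSON, a widely understood, non-proprietary plain-text format:

```python
import json
from dataclasses import asdict, dataclass, field

@dataclass
class ComponentRecord:
    """One entry in a hardware inventory, mirroring the fields of Table 22.2."""
    component: str                              # e.g., "CPU", "Storage Device"
    model: str                                  # manufacturer model designation
    specs: dict = field(default_factory=dict)   # key specification information

# Example records; models and values are purely illustrative.
inventory = [
    ComponentRecord("CPU", "Example CPU X100",
                    {"architecture": "x86-64", "cores": 4, "clock_speed": "3.6 GHz"}),
    ComponentRecord("Storage Device", "Example SSD S500",
                    {"capacity": "500 GB", "host_connection": "SATA"}),
]

# Serialize to JSON so the record remains readable without special software.
inventory_json = json.dumps([asdict(r) for r in inventory], indent=2)
print(inventory_json)
```

Keeping such records in plain-text formats follows the same rationale given later in this chapter for exporting analysis outputs: it maximizes the chance that the information remains accessible in the long term.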

Desktop and Laptop Computers

Computers can be manufactured to conform to a variety of form factors: a term for characterizing different size and shape profiles. Full-size desktop computers incorporate discrete hardware components in a relatively large case and, as the word desktop implies, are intended for home and office use. They contain full-size, off-the-shelf hardware components (which are usually easily accessible via a removable panel) and therefore tend to be the most powerful and favored for high-performance applications, such as real-time 3D rendering. The components of an example desktop computer are annotated in fig. 22.5. Desktop computers can be bought prebuilt or built to a custom specification using off-the-shelf components either by the artist or a third-party supplier. In the event that an internal component fails, it can usually be removed and replaced with a similar component (if available).

Figure 22.5 The side panel of this desktop computer has been removed to reveal the internal components: (1) heatsink and fan array mounted above the CPU, (2) four sticks of RAM, (3) optical drive mounted in a 5.25″ drive bay, (4) GPU attached to a PCIe bus on the motherboard, (5) 2.5″ solid-state drive, (6) 3.5″ hard disk drive, (7) motherboard, and (8) PSU. Photo: Tom Ensom

Small form factor desktops contain similar components to full-size desktops but with a much-reduced physical size. Their smaller size sometimes comes at the cost of maintainability, as components can be difficult to access or, in some cases, are soldered directly onto the motherboard. They are typically sold as mass-produced, pre-built units, such as the Mac Mini or Intel NUC ranges. While they may be less performant than full-size desktop computers, their smaller size means that they can be more easily concealed in an exhibition space.

Laptops often use similar components to those found in small form factor desktop computers while also integrating essential peripherals into the case, such as a screen, keyboard, and trackpad. They can suffer from similarly reduced maintainability.


Microcontrollers

Microcontrollers are small computers contained on a single microchip (an integrated circuit or IC). A microcontroller IC will contain at least a CPU but may also incorporate a small amount of RAM and persistent storage. The latter may be programmable and modifiable to some extent, depending on whether it uses ROM, PROM, EPROM, EEPROM, or flash memory. Unlike a desktop computer, a microcontroller does not run an operating system, so it can only be programmed by connecting it to another computer. Microcontrollers can carry out a reduced set of possible operations (described in the datasheet for that model) in comparison to a microprocessor-based, general-purpose computer. For example, a microcontroller might be used to control the movement of a servo motor or an array of lights. Microcontrollers are likely to be chosen by an artist where a low-cost and small-size computer is needed to implement a simple control system.

A microcontroller IC needs to be built into a circuit board with other components to be used. To simplify the process of working with a microcontroller, circuit boards containing a microcontroller IC and other components are manufactured and sold as units—for example, the Arduino (see fig. 22.6) and Basic Stamp series. These are known as single-board microcontrollers. These units make use of an integrated interface (e.g., USB, RS-232) and development software to simplify the process of loading instructions onto the device. This contrasts with a standalone microcontroller IC, where compiled code must be loaded using a specialized device—known as a programmer—connected to a host computer. The latter is rarely used by artists today due to the specialist requirements for customization in comparison to single-board microcontrollers but may be encountered where an artist has worked with an engineer, particularly in older artworks before the wider availability of single-board microcontrollers.

Figure 22.6 Arduino Uno Rev3 SMD microcontroller. This model uses an integrated ATMega328P microcontroller IC and USB connection. Photo: Tom Ensom

Arduino, microcontrollers, and related technologies can play two roles in a conservator’s work: as an artistic medium of artworks that enter a collection and need to be conserved and as a conservation strategy when migrating other works to this technology for preservation purposes.

On Microcontrollers

SASHA ARDEN: Microcontrollers can play two roles in a TBM conservator’s work. I was trying to think of all of the artworks I have encountered or know of that definitely use microcontrollers. And I came up with just as many conservation projects that I’ve been involved in or know of that use microcontrollers as a treatment. So, it’s about half and half.

MARK HELLAR: I would say that each artwork is different, right? So, some artworks have microcontrollers that talk to the internet, some have microcontrollers that use Bluetooth and there’s a vast array of sensors. So, we could say, “Oh, it’s an artwork with a microcontroller,” but I think it might make more sense to say, “What does this artwork do? And what are the significant properties?” And then I could take it out of this paradigm that this is a microcontroller-controlled work, and I’d spend time with the artwork to really understand what we would need to do and what elements we would need to pursue to care for this artwork.

SASHA ARDEN: It is wise to document one’s own treatments using microcontrollers as well as documenting artworks that use them. I record the hardware that I use, the make and model of the microcontroller, and I create a circuit diagram and list all of the electronic components that were in the external circuit. I make wiring diagrams of how all those things were connected and create a text file of the program that I wrote, and I also take a photo as a kind of a screenshot when I can.

MARK HELLAR: There are a number of tools to help with this process. They’re known as PCB tools, which means “printed circuit board design tools.” What you do is you create a schematic, and there are standard symbols for electronic schematics. Then you can convert them so that you can lay out a printed circuit board and have that refabricated, for example, to help you to migrate from one microcontroller to another.

SASHA ARDEN: Also, even just having a standard electrical schematic would be enough for someone who can read that to build a new circuit by hand.

MARK HELLAR: In terms of proprietary and open-source microcontrollers, you need to consider, “What does that mean in a hardware context?” Keep in mind that the Arduino project publishes all of the schematics and the source code, so you could just go out and get some parts and put together your own Arduino on a breadboard, and you can just download the source code and program the bootloader for it. But that is clearly not true for an iPhone! (Laughter)

From an interview with sasha arden and Mark Hellar, October 26, 2020 (arden and Hellar 2020)

Microcontrollers should be well-documented to ensure their function is understood and could be recreated. Care should be taken to ensure that any code stored on the IC or board is also stored and preserved separately, as it is not possible to create disk images from microcontrollers, and it may be difficult to retrieve data from them at all once it has been transferred. Microcontrollers are also harder to maintain than desktop computers, as components are usually soldered directly to the circuit board, making replacement difficult.

Single-Board Computers

Single-board computers incorporate a set of discrete hardware components on a single circuit board but, in contrast to a single-board microcontroller, include a standalone microprocessor rather than a microcontroller IC. While they can look superficially similar to single-board microcontrollers, they are actually closer to desktop computers in terms of purpose. The increased sophistication of the components in comparison to a microcontroller means that they are suitable for use as general-purpose computers, can be easily connected to computer peripherals (such as a monitor and keyboard), and can run desktop operating systems. Single-board computers are found inside smartphones and media players. They are also available as standalone devices, such as the Raspberry Pi series (see fig. 22.7). These may be used by artists where the complexity of their software exceeds that which could be supported by a microcontroller (such as running an operating system or using a display output), but a similarly reduced size or lower cost is required.

Figure 22.7 Raspberry Pi Model A Revision 2.0 single-board computer. This model uses a Broadcom BCM2835 ARM CPU and includes significantly more input/output options than a single-board microcontroller, including HDMI, composite video, 2x USB 2.0 ports, ethernet, and 3.5 mm audio. Photo: Tom Ensom



As with microcontrollers, single-board computers are difficult to maintain, as components are soldered directly onto the circuit board and are not easily replaceable. The recoverability of code and other stored data is highly dependent on the device and the nature of the storage medium. Data may be retrievable where it is contained on removable storage media (e.g., an SD card or USB stick), but storage integrated into the board may be very difficult to access.

Peripherals

The capabilities of a computer may be extended by attaching peripherals: easily removable hardware devices that extend the functionality of the host system by providing additional inputs or outputs. Examples of frequently encountered peripherals in software-based art are monitors, cameras, sensors, networking devices, and human interface devices (such as keyboards, mice, and controllers). Each will have its own unique characteristics, and understanding its importance within the larger system will have to be approached on a case-by-case basis. It is generally important to identify the interface used to connect the peripheral (e.g., RS-232, FireWire, USB), as the obsolescence of these interfaces is a common source of long-term compatibility problems.

The term “peripherals” overlaps with what we typically refer to as display equipment in time-based media conservation. This is particularly so for software-based artworks with video outputs, which may make use of digital projectors, TV monitors, and audio equipment commonly encountered in the care of video artworks (see Chapter 18).

Servers and Networking

While networking hardware overlaps with the types of hardware described in the preceding sections, it is discussed here in further detail due to its significance in networked and web-based artworks. A network is a connection between two or more computer systems through the use of shared communication protocols. The connection allows the connected systems to share and exchange resources, such as files and web pages. Networks can operate at a variety of scales, ranging from a local area network (LAN) between devices in physical proximity to the internet, the global network of interconnected networks that provides access to the World Wide Web. Networking capabilities may be provided by components built into computers (such as ethernet and Wi-Fi adapters) and extended through attached peripherals (for example, a network switch used to route LAN traffic between several connected computers).

Servers are computers that are used to serve content to other computers over networks. One computer—a client—connects to another—the server—over a network, allowing for the exchange of information between the two. Servers can be used to provide access to web content, route email, or provide access to databases and computational resources. Servers contain similar hardware components to desktop computers. While they may use a form factor that allows for them to be rack-mounted, any computer can be used as a server providing it has networking capabilities and can run a suitable operating system. Artists may operate their own servers or use servers provided by a third-party service provider.

Where servers are employed, they can present a significant long-term preservation risk if not properly managed. Providing long-term public access to servers (which may be required for web-based artworks) requires that the server be maintained and updated, or otherwise risk loss of access and reduced security. The extent to which it is possible or desirable to move servers under the control of an institution is one important factor to consider.
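The client-server exchange described above can be illustrated in miniature. The following Python sketch (purely illustrative, not a preservation tool; the names `HelloHandler` and `serve_once` are invented for the example) starts a tiny web server on an ephemeral localhost port and fetches a page from it as a client, using only the standard library:

```python
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    """Serve a fixed response body to any GET request."""
    def do_GET(self):
        body = b"Hello from the server"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # silence per-request logging for this demonstration

def serve_once() -> bytes:
    """Start a server on a free localhost port, then fetch one page as a client."""
    server = HTTPServer(("127.0.0.1", 0), HelloHandler)  # port 0 = pick a free port
    thread = threading.Thread(target=server.serve_forever, daemon=True)
    thread.start()
    try:
        url = f"http://127.0.0.1:{server.server_address[1]}/"
        with urllib.request.urlopen(url) as response:
            return response.read()
    finally:
        server.shutdown()
        server.server_close()

if __name__ == "__main__":
    print(serve_once())
```

Real web-based artworks involve far more (domain names, HTTPS, databases, maintenance), but the same client-request/server-response pattern underlies them all.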



Custom Hardware

Artists may use custom-built hardware to realize software-based artworks. Custom hardware is likely to be unique and can be contrasted with off-the-shelf hardware, which is industrially produced and sold at volume. Instead, bespoke circuit boards are created to meet a unique specification. These can be built by hand using a fixed base (such as breadboard or stripboard) and suitable electronic components. They can also be designed using electronics design software and sent to a third party for manufacture and assembly. As custom hardware is likely to have been made to carry out a specific purpose within a software-based artwork, understanding it, and having the necessary documentation to rebuild it if necessary, is very important.

Artist Jeffrey Shaw has frequently used custom hardware in his work, particularly bespoke human interface devices. For example, in The Legible City (1989–91), gallery visitors ride a bicycle mounted in front of a projection screen to navigate the virtual environment projected in front of them. This work relies on a custom-made analog-to-digital converter, the documentation and replication of which was a critical issue in a treatment of the work undertaken by ZKM (Lintermann et al. 2013).

In addition to being built from scratch, off-the-shelf hardware can be modified by artists in order to alter the way it functions. Understanding the nature of the intervention and how it might be recreated is very important if artworks incorporating modified hardware are to be successfully preserved. Cory Arcangel is an American artist who has worked extensively with hardware modification, particularly through his interventions in video game technology. This has included the modification of video game console controllers so that they self-play, in the case of Various Self Playing Bowling Games (2011), and of a video game cartridge in Super Mario Clouds (2002) (Magnin 2015).

22.3.2 Analyzing and Documenting Hardware

Gathering information about hardware components should be done while the system is operational. Capturing the precise model and other relevant information (see Table 22.2) for all components will help to create duplicate backups, source future replacements, and define emulation configurations (see Section 22.4).

Gathering Computer Component Information

Off-the-shelf computers such as Apple’s Mac Mini series or Intel’s NUC series may have well-defined component sets associated with the specific model. In addition to capturing the manufacturers’ specifications, the computer should be examined directly (e.g., to retrieve a list of components, including attached peripherals, operating system, and installed software). Most contemporary operating systems include built-in tools, such as those packaged with Windows 10 and macOS, both called System Information (see fig. 22.8). Third-party tools, such as HWiNFO (Software) for Windows (Malík n.d.), can provide more detailed information. Where possible, choose software tools capable of exporting the information they gather in a widely understood, non-proprietary format, such as plain text, XML, or HTML. Storing analysis outputs in these formats increases the chances that this information remains accessible in the long term. For microcontrollers and single-board computers, the relevant datasheet from the manufacturer should be located and retained.

Figure 22.8 The report produced by the built-in System Information tool on a Windows 10 laptop provides a detailed account of all hardware and software components. Screenshot: Patricia Falcão

If software tools cannot be used (e.g., in older computers where analysis tools may not be available), a visual inspection of the physical components inside the computer can produce model numbers and/or serial numbers, which can then be cross-referenced with other resources to gather more detailed information. The interior inspection of the computer is also an opportunity to assess its condition and identify problems such as dust buildup or leaking capacitors. The contents of some computers may be difficult to access due to the way they have been assembled or the absence of screws. For example, the internal components of some Apple iMacs can only be accessed by prising off the glued-on LCD panel. Careful consideration should be given to balancing the possibility of damaging the computer against the value gained through this intervention.

When handling hardware components with exposed circuitry, there is a risk of damaging components through electrostatic discharge (ESD). To avoid this, use a combination of ESD-mitigating equipment such as anti-static mats, wristbands, and gloves (see Chapter 14 for further guidance).
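Where a scripting environment is available on the artwork computer, even the standard library of a language such as Python can capture a coarse baseline snapshot. This does not replace dedicated tools like System Information or HWiNFO, but it illustrates exporting gathered information in a non-proprietary plain-text format (the function name `basic_system_record` is invented for this sketch):

```python
import json
import platform

def basic_system_record() -> dict:
    """Collect a coarse hardware/software snapshot using only the standard library.

    Far less detailed than dedicated system-information tools, but it shows the
    principle of exporting what is gathered in a widely understood format.
    """
    uname = platform.uname()
    return {
        "system": uname.system,        # e.g., "Windows", "Darwin", "Linux"
        "release": uname.release,      # operating system release
        "version": uname.version,      # detailed OS version string
        "machine": uname.machine,      # e.g., "x86_64", "arm64"
        "processor": uname.processor,  # processor description (may be empty)
        "python_version": platform.python_version(),
    }

if __name__ == "__main__":
    # Write the record as JSON, a plain-text format readable without special software.
    print(json.dumps(basic_system_record(), indent=2))
```

The same principle (gather on the live system, store as plain text) applies whichever tool produces the report.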

Documenting Peripherals and Custom Hardware

Peripheral devices, such as cameras, sensors, human interface devices (e.g., controllers), and other off-the-shelf hardware, should be recorded—including noting and photographing the interface used to connect to the computer. Any manuals or other relevant documentation available from the manufacturer should also be collected where possible. Peripheral hardware may be accessed using specific drivers, so it is important that this software be identified, documented, and appropriately archived.


Custom-made hardware devices require particularly detailed documentation. As a general rule, aim to gather enough information that they could theoretically be reconstructed from scratch. Technical documentation to seek includes schematics, circuit diagrams, and bills of materials. If this documentation is not available, as much of it as possible should be generated at the point of acquisition, ideally in collaboration with the artist and inspecting the circuits as necessary. It is also important to identify whether hardware has been programmed with custom code that cannot be easily retrieved (e.g., a microcontroller), as this should be acquired separately. If the code is not available, measures can be taken to retrieve it using interfaces attached to the microcontroller or to reverse engineer its function through the use of a logic analyzer (Fino-Radin 2016; Bunz 2018).

Human interface devices may require more extensive documentation of user experience. Factors to consider include the array of possible interactions, use of haptics, look and feel of materials used in the interface’s construction, and the relationship between the movement of the interface and its effects on the software. Documenting interface devices can be achieved by creating videos of user interaction. Care should be taken to ensure that as many aspects of interaction as possible are captured and to consider whether video capture would benefit from comments, narration, or further written explanation. The materials that constitute the device should also be identified, and appropriate steps should be taken to care for them; this is work that can be undertaken with a specialist conservator with suitable knowledge of the materials.

22.3.3 Hardware Failure and Obsolescence

Failure of hardware components occurs due to damage to or degradation of their electronic circuits or mechanical components. This can be caused by faulty or degraded components, environmental conditions (such as extreme heat or damp), power surges, and electrostatic discharge (ESD). Based on discussion with a number of conservators with experience maintaining hardware—Candice Cranmer and Nick Richardson (ACMI, Melbourne), Chris King (Tate, London), Paul Jansen Klomp (media artist/tutor Media Art at ArtEZ, Enschede), Arnaud Obermann (Staatsgalerie Stuttgart), and Morgane Stricot and Matthieu Vlaminck (Zentrum für Kunst und Medien, Karlsruhe)—the following were identified as common sources of failure (Cranmer and Richardson 2021; King 2021; Klomp 2021; Obermann 2021; Stricot and Vlaminck 2021):

• Battery failure. For example, reduced ability to hold charge or leaking battery acid. This is particularly challenging to address where batteries are soldered to the motherboard and therefore difficult to replace. Leaking battery acid can also damage the circuit board and adjacent components.
• Capacitor failure. For example, reduced ability to hold charge or leaking electrolytic fluid. The latter is particularly common among batches of capacitors from the early 2000s, where a buildup of gas in a faulty electrolytic capacitor can cause the shielding to split.
• Failure of a mechanical component, such as a fan, switch, or drive. For example, the motor that spins the magnetic platter inside a magnetic hard disk drive may fail, rendering the data contained inaccessible.
• High operating temperatures. Factors such as high ambient temperature and failure of cooling systems (e.g., accumulation of dust, fan failure) can result in operating temperatures exceeding component tolerances, which can accelerate the degradation of materials.
• Oxidation of connectors between components, which impedes connectivity. For example, this can occur in the metal contacts that connect an expansion card to a motherboard.
• Poor storage or display conditions. For example, exposure to humidity and accumulation of dust and other debris can cause components to short-circuit when powered on.

There is little publicly available empirical data on the failure rates of computer hardware, making it hard to reach meaningful conclusions about how frequently failures occur even in well-controlled conditions. Given the long time spans over which those collecting software-based art aim to preserve it, however, it can be assumed that any given component will eventually fail, since all objects tend to deteriorate over time.

The sources of hardware failure listed earlier can often be addressed, although in all cases specialist equipment and expertise are required. A faulty capacitor can be identified through visual inspection and replaced before it causes damage. A magnetic disk platter can be removed from a failed drive in controlled conditions and the data recovered. Other types of hardware failure may be more catastrophic and make repair impossible, particularly because of the industrial processes by which circuits are manufactured. Whether a failure is worth addressing can be determined by balancing the benefit against the resources required: it may be more cost- and time-effective to replace a component than to repair it.

Replacement, however, becomes increasingly challenging with hardware obsolescence, whereby hardware components are no longer sold or supported by their manufacturer. Hardware obsolescence is driven by commercial interests. Newer products may provide better performance or additional features that make them more appealing to the consumer, shifting demand from an old product to a newer one and disincentivizing the manufacturer from continuing production. In other cases, a manufacturer may simply choose to stop producing or supporting a product despite continued consumer interest.
The effect is that hardware becomes increasingly difficult both to repair, as expertise dwindles along with the financial incentive, and to replace like-for-like, as the now fixed pool of components becomes increasingly rare. This forces replacements to be made using different hardware, a treatment approach called hardware migration. The process has a secondary effect on the viability of software due to changing hardware support in operating systems: vendors are unlikely to continue to provide driver software for hardware that is no longer commercially available (see Section 22.3.7 on software obsolescence and Section 22.4.3 on hardware migration).

22.3.4 What Is Software?

Software consists of stored instructions—or code—written in programming languages: formal languages used to tell a computer what to do. Software can run on many different hardware platforms, from small devices such as microcontrollers to mobile devices, tablets, desktop computers, and servers. In some cases, software is written to run on devices of any size; in other cases, it is so closely tied to a specific or custom hardware device that it cannot run on anything else. Web-based software runs on a web server using the programming languages, media file types, and other paradigms appropriate for a web-server environment.

Programming Languages

Programming languages have evolved over time, reflecting the evolution of hardware, the role of computers, and how they are used in society, to meet specific computational and usage requirements across all areas of study and professional practice. There are several hundred
programming languages (Rösler 2021); those most frequently encountered in software-based art are introduced in Section 22.3.8.

Source code is a human-readable set of instructions written in a specific programming language. It can be read and/or edited in any text editor and is commonly written using a software program called an IDE (integrated development environment), which is designed to support the software development process. Source code can also be developed within some WYSIWYG environments, such as Adobe’s Flash and the Unity game engine.

The term high-level programming language refers to programming languages that remain distant from the details of the computer in order to make programming easier, so that onerous tasks such as memory management are handled “behind the scenes.” High-level languages are meant to be human-readable; using one, artist-programmers can focus on writing the instructions for their artwork using variables, Boolean logic (if-then structures), iteration (such as loops), functions, and other features of the language that are not specifically dictated by the computer’s architecture.

In order for a computer program to be executed, it must be transformed into instructions—called machine code—which can be understood by the target computer. Machine code is a low-level programming language with close correspondence to the actions carried out by the computer hardware. The point at which this transformation happens varies and is usually dictated by conventions associated with the programming language used. For interpreted languages like PHP, the source code is sufficient to execute the program; for languages such as C++ and Java, the source code is compiled to executable code, which is not human-readable. Where the source code is compiled, it is no longer needed to run the software.
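To make the contrast concrete, the sketch below shows the kind of high-level, human-readable source code described above, written in Python. The function and values are purely illustrative and not drawn from any actual artwork; the point is that variables, Boolean logic, iteration, and functions are all legible to a reader without any knowledge of the underlying machine code.

```python
# Hypothetical sketch of high-level, human-readable source code:
# the names and logic are illustrative only.

def next_frame_color(frame_number, palette):
    """Pick a color for this frame using iteration and Boolean logic."""
    index = 0
    # Iteration: step through the palette positions.
    for i in range(len(palette)):
        if frame_number % len(palette) == i:  # if-then structure
            index = i
    return palette[index]

palette = ["red", "green", "blue"]  # a variable holding a list
print(next_frame_color(0, palette))  # red
print(next_frame_color(4, palette))  # green
```

None of this tells us anything about the machine code the interpreter or compiler ultimately produces, which is precisely what makes high-level source code the preferred object of study.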

Executable Software

Executable software is the collection of the executable code and associated data required to run a program. This is stored in and managed as files, ranging in complexity from a single executable file to a collection of files that are called upon when the software is run (see fig. 22.9). This file

Figure 22.9 Application directory for John Gerrard’s Sow Farm (near Libbey, Oklahoma) (2009), viewed in Windows 10, containing a Windows Portable Executable (.exe) file, settings stored in plain text files, and batch and shortcut files for running the software. Screenshot: Tom Ensom
or collection of files can be referred to by a number of other similar or related terms, including application, binary, and build. The contents of executable files are stored in forms optimized for computer processing and are not intended for human reading. Executable code is stored and managed in a variety of formats and may be accompanied by a wide variety of other files. A useful starting point for understanding these collections of files is to examine the folder-file structures (including file and directory names, which may be indicative of meaning) and the file formats present. Somewhere within the collection of files will be one or more executable files, which are used to run the software on a suitable host computer.

Successfully running the executable software relies on connections between it and other software and hardware components, which were embedded in the executable software during its development. These are known as dependency relationships. Their importance to the realization of the software-based artwork may vary. In some cases, access to a highly specific component might be critical, and its absence or failure may cause the software to fail to execute altogether; for example, a driver required by the software to communicate with a hardware device. In other cases, there may be some flexibility in the component used; for example, software may require access to a graphics processing unit (GPU) to render 3D graphics but may not need a specific model. Dependency relationships can also extend to external resources, such as web services and APIs. See Section 22.3.6 for further guidance on identifying dependency relationships between components.
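The first-pass survey of folder-file structures described above can be scripted. The following Python sketch walks a directory tree, tallies file extensions, and flags files whose extensions suggest they are executables or launchers; the extension list is an illustrative assumption, not a standard, and real surveys would be tuned to the platform at hand.

```python
# Sketch of a first-pass survey of an application directory:
# tally file extensions and flag likely executable/launcher files.
# The extension set below is an illustrative assumption.
import os
from collections import Counter

LAUNCHER_EXTENSIONS = {".exe", ".bat", ".lnk", ".app", ".sh"}

def survey(root):
    counts = Counter()
    launchers = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower() or "(no extension)"
            counts[ext] += 1
            if ext in LAUNCHER_EXTENSIONS:
                launchers.append(os.path.join(dirpath, name))
    return counts, launchers
```

Run against a directory like the one shown in fig. 22.9, this would report, for instance, one .exe file alongside plain-text settings and batch files, giving a quick overview before any deeper analysis.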

22.3.5 Types of Software

Custom Software versus Off-the-Shelf Software

For contemporary artists who create software-based works of art, custom software is their primary medium. Artists may have their own characteristic style with respect to selecting programming languages, developing and coding the algorithms that drive the appearance and behaviors of the artwork, and making stylistic programming choices (e.g., whether to calculate color shifts using an HSB color model rather than an RGB model, or whether to include comments and thereby retain earlier “drafts” of source code within the code). In other cases, artists may work with a collaborator or team of collaborators who also influence the technical choices made and the style of programming.

Off-the-shelf components are software components that have not been made specifically for the software-based artwork. Instead, they are typically pre-packaged, widely distributed software components designed to meet some common need. For a software-based artwork, off-the-shelf components usually consist of at least an operating system—itself incorporating many individual software programs—but may also include additional libraries, application programming interfaces (APIs), and drivers that are not part of the operating system by default. These components serve critical supporting roles in the execution of the artist-created software and form a layer that connects the artist-created software with the hardware environment.
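The HSB-versus-RGB stylistic choice mentioned above can be illustrated with Python's standard colorsys module (which uses the equivalent term HSV). Shifting a color's hue is a one-line rotation in HSB space but would require coupled arithmetic on all three channels in RGB; this is a generic illustration, not code from any particular artwork.

```python
# Illustrative sketch: a color shift expressed in the HSB (HSV) model
# using only Python's standard colorsys module.
import colorsys

def shift_hue(r, g, b, amount):
    """Rotate hue by `amount` (0-1) in HSB space; return an RGB triple."""
    h, s, v = colorsys.rgb_to_hsv(r, g, b)
    return colorsys.hsv_to_rgb((h + amount) % 1.0, s, v)

# Pure red rotated a third of the way around the hue wheel
# becomes (approximately) pure green.
print(shift_hue(1.0, 0.0, 0.0, 1 / 3))
```

Recognizing such idioms in source code helps characterize an artist's working style, as discussed above.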

Web-Based Software

Websites run software that is configured to run on a web server (see Section 22.3.1) and accessed by browsers. Websites often require the use of more than one technology (e.g., a programming language such as JavaScript together with a markup language such as HTML), and they can also host any required media files, such as digital images, audio, and/or video files.

Deena Engel et al.

Proprietary versus Open-Source Software

Proprietary software is under copyright by a publisher and available only to licensees. The vendor might set limitations, such as the number of installations permitted for a given license, and enforce them through product keys, serial numbers, online authentication, or other means. Free and open-source software is provided at no cost with access to the source code. It can be released under a variety of licenses: a public-domain dedication such as the Creative Commons CC0 license (Creative Commons (CC) n.d.), a permissive license such as the MIT license (Open Source Initiative n.d.), or a copyleft license such as the GNU GPL (GNU Operating System n.d.). The use of open-source software benefits preservation, as source code can easily be gathered and reused, ensuring that it can be modified for preservation purposes if necessary (see Section 22.4.8).

22.3.6 Analyzing and Documenting Software

Source Code Analysis

Artwork inspection can provide a good understanding of the makeup of a work, but not all behaviors are discernible to human observation by simply running the piece. Complex features, like open durations, randomization, or generative functions, can only be identified through source code analysis. A close reading of the source code, which contains the instructions and the algorithms that create the artwork, reveals not only the artist’s decisions regarding the behaviors and presentation of the work but also its technological makeup and genesis. This examination method requires the expertise of a programmer or computer scientist and has been introduced to the conservation community in recent years through interdisciplinary efforts between media conservators and computer scientists (Engel and Wharton 2014; Wharton and Engel 2014; Guggenheim Blog 2016; Engel and Phillips 2018, 2019). When conducting source code analysis, the examiners try to find answers to the following questions:

• Color: How does the artist stipulate the use of color (e.g., HSB or RGB)? How are color values calculated?
• Randomization: Does randomization play a role in the work? What is the impact of the random results on the artwork’s behavior and appearance?
• Speed of animation: Is animation speed measured in absolute terms, such as seconds or milliseconds, which are hardware-independent, or in relative terms, such as delays that reflect processing time?
• Input: What is the role of user input, if any? Does other input, such as the ambient sounds in the exhibition space, play a role?
• Sound and image creation: Are sounds and images created dynamically as the software is running, or does the artwork use external media files that are played or displayed?
• Data sources: Does the artwork use one or more data sources, and are they static or dynamic? How are the data retrieved and processed? Are the data conserved for study or future exhibitions?
• Systems issues: How does the artwork interact with and depend on the operating system and peripherals, and what role does configuration play?
• Web-based artworks: What are the parameters for responsive design (the specific page layouts for a mobile device, tablet, or desktop)? What are the procedures for permissions and for access that requires authentication? Does the software address specific browser constraints?
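The distinction raised under "Speed of animation" can be sketched in Python. Both functions below are hypothetical illustrations, not code from any artwork: the first enforces an absolute, hardware-independent frame duration, while the second appends a fixed pause after processing, so total frame time inherits whatever the host hardware took.

```python
# Illustrative sketch of two timing idioms found in source code.
import time

def absolute_delay(frame_start, frame_duration=1 / 30):
    """Absolute timing: sleep whatever remains of a fixed frame budget,
    so the frame rate is independent of how fast the hardware is."""
    elapsed = time.monotonic() - frame_start
    time.sleep(max(0.0, frame_duration - elapsed))

def relative_delay():
    """Relative timing: a fixed pause appended after processing, so the
    total frame time depends on how fast the processing step ran."""
    time.sleep(0.01)
```

On faster replacement hardware, code using the second idiom will speed up, a change a conservator may need to detect and compensate for.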

Caring for Software-Based Art

The following conservation concerns and preservation risks can be identified in the source code:

• Programming languages and third-party libraries: Use source code analysis to identify all of the programming languages and all of the external, third-party libraries that should be conserved along with their source code.
• Technologies: Use source code analysis to identify all of the technologies used in a given work, including device drivers for hardware and any hardware requirements noted.
• Cost and resource analysis for preservation and interventions: Use source code analysis to outline the scope and skills required in order to plan time and cost budgets for interventions.

Source code analysis—that is, reading and documenting the source code of an artwork—can provide a rich resource not just for conservation but also for art history, technical art history, and curatorial study. For conservators, source code analysis and documentation can offer an understanding of the behaviors of an artwork and insights into potential preservation risks it may be facing. It helps to assess how much work is required to make a piece run again, which technical skills are needed to do so, and the scope of such a project, so that the conservator can make informed decisions on intervention and conservation treatment.

Software documentation is essential to long-term maintenance; it has a long history as a major activity in software engineering, and there is a great deal of literature on the practice (Garousi et al. 2013; Bavota et al. 2019). Source code documentation can take a number of different formats, depending on the resources available with respect to time, cost, and the expertise of those writing the documentation, as well as the goals of the project (e.g., whether it is meant to benefit conservators, curators, and art historians and/or future programmers who may be commissioned to work on the piece). Successful approaches include making a study copy of the source code and annotating it with comments and explanations for future technologists; writing a descriptive narrative of each logical section of the source code for curatorial and art historical readers; and building out tables and diagrams that identify specific aspects of the code. Tables or spreadsheets can be used to list a color vocabulary, the media file types used in the artwork, the static image files used, and other facets of the artwork, complementing documentation that identifies the specific areas of the source code that constitute risk, such as deprecated code (Phillips et al. 2017). Visualizations such as flowcharts are appropriate for older works, such as those written in Pascal, while UML diagrams are commonly used with object-oriented programming languages such as Java (Engel and Wharton 2014; Wharton and Engel 2015).
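As one illustration of the tabular documentation described above, a small script can harvest media file references from source code for listing in a spreadsheet. The extension list and regular expression below are illustrative assumptions and would need adapting to the languages and file types actually present in a given work.

```python
# Hypothetical helper: collect quoted media-file references from source
# code so they can be tabulated. Extensions and regex are assumptions.
import re

MEDIA_EXTENSIONS = ("jpg", "png", "gif", "mp3", "wav", "mp4", "mov")

def media_references(source_code):
    pattern = r'["\']([^"\']+\.(?:%s))["\']' % "|".join(MEDIA_EXTENSIONS)
    return sorted(set(re.findall(pattern, source_code, re.IGNORECASE)))

snippet = 'img = loadImage("dog.png")\nsound = loadSound("bark.mp3")'
print(media_references(snippet))  # ['bark.mp3', 'dog.png']
```

The resulting list can seed one column of the kind of table described above, to be checked by hand against the files actually supplied with the work.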

On the Value of Source Code Analysis

“Source code analysis is a new tool in the conservator’s technical research kit, but it is not yet widely used. It is a standard maintenance procedure in software engineering, but as with many research methods adapted from other industries, it needs to be applied with a strong understanding of conservation ethics. Now that a new generation of conservators are being trained to work on time-based media, no doubt some will arrive in museum labs with programming skills. Yet the field has always relied on technical expertise from other fields, and I predict that software engineers will increasingly be consulted in software-based art conservation.”
—Glenn Wharton, Professor, Department of Art History and Lore and Gerard Cunard Chair, UCLA/Getty Program in the Conservation of Cultural Heritage, University of California, Los Angeles (Wharton 2021)


“The degree to which artists annotate their source code varies, and source code analysis helps to determine the purpose, functionality, and intent of the code. Source code analysis has a tremendous impact in that it helps to identify the conservation strategies—from rewriting code for a migration to emulating the environment—that best maintain the original artwork’s functionality and intent. The Whitney Museum is still in the early stages of conducting source code analysis of the works in its artport collection, the Whitney Museum’s platform for net art, and so far this analysis has been undertaken only in collaboration with classes at New York University, where students have performed the work. The results have not yet been actively used but were already very helpful in determining potential strategies for conserving software-based work.”
—Christiane Paul, Curator of Digital Art, Whitney Museum of American Art and Professor, School of Media Studies, The New School in New York City (Paul 2021)

“Source code analysis helps to understand and value the materiality of the artwork. It has the same purpose as scientific imaging and chemical analysis of a painting. Source code analysis contributes to a better understanding of the artist’s and/or programmer’s intent. Together with interviews of the artist and/or the programmer, one can better decide which algorithms or features are significant and have to be maintained. Source code analysis makes one more aware of the craftsmanship of the programmer/artist and of the community culture linked to certain digital tools. Even if a work has to be transferred to a new technology, the knowledge of the algorithms, program features, and parameters used in the source code informs the choice of programming language and platform and facilitates reprogramming. When acquiring software-based artworks, we do source code analysis to estimate whether and how long the work will be functional and what additional information or tools we need to preserve it. Source code analysis in the case of TraceNoizer (2001) by LAN (Annina Rüst, Fabian Thommen, Roman Abt, Silvan Zurbruegg, and Marc Lee) (Roeck n.d.) supported our decision-making regarding preservation strategies on a technical level and also on a contextual level (what kind of technology was used in 2001 and how the technological landscape has changed since, what that means for the interpretation of the artwork, and so forth).”
—Claudia Roeck, PhD Candidate, University of Amsterdam, Media Studies, and Time-Based Media Conservator (Roeck 2021)

Executable Analysis

Executable software typically contains compiled code, which is not human-readable and makes analysis and documentation more difficult if the source code is not available. Nonetheless, in some situations, executable code is the only representation of the artist-created software available. It is important to note that this may significantly increase the risk of losing access to the artwork in the future, as it makes applying some preservation strategies difficult, if not
impossible. In this section, we briefly discuss the options available in such situations and the limits of what we can learn about a computer program, and the way it functions and behaves, from executable code alone.

Unlike other file formats common in time-based media conservation, such as digital video, executable files usually contain little in the way of accessible metadata. Instead, specialized tools such as debuggers and decompilers are required to extract useful information from them and, even when successful, may offer limited insight compared to full source materials. The use of such tools leads us down the path of software reverse engineering—that is, attempting to gain information about how the software works with limited prior knowledge about how it was created. At its most complete, reverse engineering would aim to produce source code similar or equivalent to the original from the executable code. In practice, however, this is a laborious and highly specialized activity, particularly for larger and more complex programs. One approach that may come close is decompilation: the use of a software tool called a decompiler to create an approximation of the original source code from the executable software. The success of decompilation will depend on the characteristics of the software (e.g., a program compiled to machine code is harder to decompile than a Java executable) and the availability of a suitable decompiler. Approaches to reverse engineering software-based artworks are discussed further in Ensom (2018).
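One small, concrete first step in examining an executable file is identifying its container format from its leading "magic" bytes, which are publicly documented for the common formats. A minimal Python sketch:

```python
# Identify an executable's container format from its first bytes.
# Signatures: PE files begin with the "MZ" DOS header; ELF with 0x7F "ELF";
# Mach-O and universal binaries with one of several documented magic values.
def executable_format(first_bytes):
    if first_bytes[:2] == b"MZ":
        return "Windows PE (MZ header)"
    if first_bytes[:4] == b"\x7fELF":
        return "ELF (Linux and others)"
    if first_bytes[:4] in (b"\xfe\xed\xfa\xce", b"\xfe\xed\xfa\xcf",
                           b"\xcf\xfa\xed\xfe", b"\xca\xfe\xba\xbe"):
        return "Mach-O / universal binary (macOS)"
    return "unrecognized"

print(executable_format(b"MZ\x90\x00"))  # Windows PE (MZ header)
```

Knowing the format then tells the examiner which debugger or decompiler tooling is even applicable, before any deeper reverse engineering begins.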

Documenting a Software Environment

When documenting a software environment, identify which operating system is being used and any additional off-the-shelf software, such as device drivers, runtime libraries, and database software. Documentation should include the specific version and build number of each program where possible. Most contemporary operating systems include built-in tools for reporting this, such as the tools (both called System Information) packaged with Windows 10 and macOS (see fig. 22.8 and Section 22.3.2). Where possible, choose software tools capable of exporting information in a widely understood, non-proprietary format, such as plain text, XML, or HTML.

Alternatively, it may be possible to gain equivalent information by examining the contents of storage media (e.g., a disk image). This method is less reliable, as it requires the conservator to manually locate installed programs on the storage media. While there are typical install locations (e.g., Windows 10 uses the Program Files directory, while macOS uses the Applications directory), this is something that can easily be customized by the artist or their collaborators.

It is important to note that while the software environment is distinct from the artist-created software, the default configuration of off-the-shelf software components can be further modified by the artist. For example, web server or database software requires a level of configuration that can then impact its connections with other software components. In the case of John Gerrard’s Sow Farm, the artist applies image processing techniques using the graphics card driver, which significantly impacts the characteristics of the projected image (Ensom 2019). The extent of any custom configuration of off-the-shelf components should therefore be identified and documented.
Given the large number of configuration options available for some software, this can be difficult to identify through examination alone, so it is best to work closely with the creator of the system—be that the artist or a collaborator—to identify these and their significance.
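A minimal sketch of exporting basic environment facts in a plain-text, non-proprietary form, using only Python's standard library; a real environment report would also record installed programs and their build numbers, which this sketch does not attempt.

```python
# Sketch: record basic environment facts as plain text using only the
# standard library. Illustrative, not a complete environment report.
import platform
import sys

def environment_report():
    lines = [
        "os: %s %s" % (platform.system(), platform.release()),
        "architecture: %s" % platform.machine(),
        "python: %s" % sys.version.split()[0],
    ]
    return "\n".join(lines)

print(environment_report())
```

Because the output is plain text, it can be stored alongside other documentation without any proprietary software being needed to read it later.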

22.3.7 Software Obsolescence

As hardware and operating systems evolve over time, the risk of software obsolescence grows. External libraries are updated or no longer supported, devices become outdated or drivers are
no longer written for current hardware and operating systems, programming languages develop and change and may not be backward compatible, and other changes in the system can cause software applications to run in ways that are not anticipated, to exit prematurely in a “crash,” or not to run at all. This can be managed in the short term by avoiding changes to the software environment (e.g., preventing automatic updates, creating disk images). However, when hardware fails or an external dependency changes, the obsolescence of components will have to be negotiated.

Web-based artworks are a clear example of works subject to software obsolescence and to the incompatibilities that can arise when the underlying operating system and/or software environment are updated or changed in ways that prevent the artwork from running as expected. Updates to operating systems, programming languages, and other widely used software resources are often publicly announced ahead of time; in a best-case scenario, risk assessment and interventions for the affected artworks can be considered while the artworks still run. Web-based artworks are also subject to updates in browser software and should be tested periodically using the most popular browsers available at that time to ensure compatibility.

22.3.8 Programming Languages and Platforms and Their Preservation Implications

Many artworks employ more than one programming language in order to best handle the tasks that the artist wishes to accomplish. It is important to understand which language(s) are in use and the specific role(s) they play in order to plan for long-term conservation and re-exhibition. In the following section, several categories of programming languages are introduced, along with examples of languages that conservators may encounter when working with contemporary artworks. The details listed here are not meant to be a comprehensive software review or to replace the extensive manuals and documentation on these languages available in print and online, but rather to point out specific features, functionality, behaviors, and risks associated with each of these languages or language groups in the context of art conservation.

High-Level Programming Languages That Are Compiled

This group of programming languages includes languages such as Java, C, and C++. Programs compiled to machine code are platform-specific, meaning that they run only on a computer whose processor architecture and operating system (Windows, macOS, etc.) correspond to the machine code produced; Java differs in that it is compiled to an intermediate bytecode, which in turn requires a Java runtime environment to be available for the host platform. Risk assessment of these works includes taking care to find out not only which programming language was used but also which specific version of that language and whether any additional libraries of source code were included when the software was compiled. High-risk artworks in this category include the many Java applets written for web-based applications in the 1990s and early 2000s, as Java applets are no longer supported by most browsers. Works in C or C++ are at risk due to current or future hardware obsolescence, as the close pairing of the language with the operating system and hardware will render these works inoperable when the appropriate hardware is no longer functioning or available.

High-Level Programming Languages That Are Interpreted: Scripting Languages

There are a number of programming languages that do not require compilation but rather require an interpreter. An interpreter executes the human-readable source code at the point at
which the program is run. Running a Python script, for example, requires a suitable Python interpreter to be installed and available to the software. Client-side JavaScript is an example of a scripted programming language that can run locally or be served from a web server and executed in a browser; either way, viewing source within a browser will typically reveal the JavaScript source code directly to a viewer in human-readable form. Server-side web-based scripting languages, such as PHP, Perl, and Python, are also interpreted, but the source code is not accessible to the institution or collector unless the artist provides access to the server or the files. Further, scripts that handle tasks such as file management are hidden from view and must also be specifically provided to the collector or institution.

Preservation measures for artworks in this category include collaboration with the IT staff managing the web server to maintain a stable and secure environment, including consistency in the appropriate path designations to ensure that the artworks run under the correct programming language and version. Software upgrades can cause incompatibilities and failure; for example, while PHP is typically backward compatible, Python 3.x is not, and software written in Python 2.x will not run, or will not run correctly, under Python 3.x. High-risk artworks in this category include those that rely on widely used libraries, such as server- and client-side JavaScript libraries that change over time. In these cases, a virtual development space in which a conservator or their team can test the artwork in a software environment configured to match the updated languages and libraries of the production server is crucial to ensure the ongoing successful presentation of these works.
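The Python 2/3 break mentioned above can be seen in a single operator: the meaning of integer division changed silently, so affected code may still run under Python 3 yet behave differently than the artist intended.

```python
# Run under Python 3. In Python 2, 7 / 2 evaluated to 3 (floor division);
# in Python 3 it evaluates to 3.5 (true division): a silent behavior change.
print(7 / 2)   # 3.5 under Python 3
print(7 // 2)  # 3; the floor-division operator preserves the old result
```

Such changes are easy to miss in testing because nothing crashes; only careful comparison of the artwork's behavior across versions reveals them.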

Markup Languages

Markup languages, such as HTML (Hypertext Markup Language), are not programming languages but rather conventions of annotation that allow content to be distinguished from formatting and from document structure and, in some cases, identify content roles, such as designating text as a citation using the <cite> . . . </cite> tags in HTML; this is known as semantic markup. HTML files are human-readable and may be associated with CSS (Cascading Style Sheets) files, which describe the rules for how content is displayed with respect to page layout, font size and color, and other format specifications, whether in print or on a tablet, mobile device, or other device. As text files, HTML and CSS files are easily stored and preserved.

These languages change slowly over time, often with lengthy announced lead times, which gives conservators time to address any necessary updates, such as replacing tags that have been deprecated. For example, a list of tags noting which ones are deprecated is available on the w3schools website (W3Schools—HTML Element Reference n.d.); additional documentation is available on other sites as well. Risk assessment of these works includes documenting the versions of the markup languages used and identifying deprecated tags and a remediation strategy where necessary. Risk assessment will also include checking for cross-browser compatibility to ensure either that a given work opens in multiple browsers or that a user is advised which browser(s) to use when viewing the work if limitations are found.
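The deprecated-tag check described above can be sketched as a small script. The tag set here is a partial, illustrative selection of elements deprecated in HTML5; a real check would use a maintained reference list such as the one cited above.

```python
# Sketch: flag deprecated tags in an HTML document. The tag set is a
# partial, illustrative selection of elements deprecated in HTML5.
import re

DEPRECATED_TAGS = {"font", "center", "big", "strike", "marquee", "applet"}

def deprecated_tags_used(html):
    found = re.findall(r"<\s*(\w+)", html)  # opening tags only
    return sorted({t.lower() for t in found} & DEPRECATED_TAGS)

page = "<html><body><center><font size='3'>hi</font></center></body></html>"
print(deprecated_tags_used(page))  # ['center', 'font']
```

The output gives a starting list for the remediation strategy mentioned above; each flagged tag still needs a human decision about an appropriate replacement.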

Query Languages

Query languages are used to manage data and answer questions posed to a specific database. Many artworks use data held in databases, which are retrieved by running queries against either a dynamic or a static database. The queries themselves are typically launched via a program written in a programming language such as Python, PHP, or Perl.


Preservation measures begin by identifying the type of database and its storage location, along with verifying access and permissions to ensure security. SQL (Structured Query Language) is a query language used to obtain data and results from a server-side relational database, such as MySQL, Oracle, or PostgreSQL, or from SQLite, with its smaller footprint, running either locally or on a server. Standard database documentation useful to the conservator includes a data dictionary (a list of all of the tables, fields, and other objects in the database, with descriptions), an E-R diagram (an entity-relationship diagram, which provides a visualization of the relationships among the tables), and the results of a standard “dump” query. A “dump” query, such as “mysqldump” in MySQL, records the contents of the database as a human-readable script file for study and analysis or to create a new copy of the database. SPARQL (SPARQL Protocol and RDF Query Language) is used to retrieve and manipulate data in Resource Description Framework (RDF) format, such as in Wikibase implementations or with Wikidata.

A decision must be made as to whether an artwork will run against a static (i.e., unchanging) database (e.g., the data available when it was first exhibited) or a dynamic database, which is updated each time the work is run. If the artwork will run against a static database in the future and is “frozen in time,” then the data should be backed up and conserved for future use. If a decision is made to run the artwork against a dynamic (changing) database, then the software used to obtain and manage the database becomes an integral part of the conservation effort, and preservation measures going forward should include identifying all of the data sources and their respective availability.
For example, if external data are captured from one or more websites, the data sources and/or data formats and availability will likely change over time, creating risk to the work’s long-term preservation. In such cases, it is advisable to store copies of the database at appropriate intervals so that the work can be exhibited in the future with historical datasets if necessary and if so agreed by all of the appropriate stakeholders. Assessing and displaying artworks that store data from user input should ideally be coordinated with the IT professionals involved to evaluate security and to protect against provocative, illegal, or threatening entries by users. In many countries, it is important to consider the General Data Protection Regulation (GDPR) if any identifiable personal data (e.g., name, age, contact details) is or can be supplied by users. For artworks that collect statistics and data based on user activity, it may be important to retain these datasets for future study.
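For SQLite databases, the kind of “dump” described above can be produced with Python’s built-in sqlite3 module. The sketch below is purely illustrative (the table name and sample data are assumptions, and a conservator would open a collected database file rather than an in-memory one):

```python
import sqlite3

# An in-memory database stands in for the artwork's collected .sqlite file.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE artwork_data (id INTEGER PRIMARY KEY, value TEXT)")
conn.execute("INSERT INTO artwork_data (value) VALUES ('sample entry')")
conn.commit()

# iterdump() yields the database contents as human-readable SQL statements,
# comparable to running mysqldump against a MySQL server. The resulting
# script can be stored for study or used to rebuild a copy of the database.
dump_script = "\n".join(conn.iterdump())
print(dump_script)
```

The same script file can later be replayed against a fresh database to produce the “frozen in time” copy discussed above.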

Multimedia Application Authoring Languages and Platforms

Some programming languages run inside of software applications and have been used by artists. Lingo is a scripting language developed for use with Macromedia’s, and later Adobe’s, Director platform in the 1990s. Director files (.dcr) could be output as executables to run directly on Windows or macOS or as content for the Shockwave runtime environment on the web at that time. ActionScript was developed for use with Adobe’s Flash and ran in the Flash Player plugin on the web. In both cases, conservators would need the appropriate version of either Director for Lingo or Flash for ActionScript to set up a development environment that will allow study, further development, or recompiling if necessary. ActionScript 3.0 is not backward compatible, so special care must be taken to correctly identify which version of the language is in use in order to set up the development environment. Artworks that use these formats are at risk.


Caring for Software-Based Art

Max (also known as Max/MSP/Jitter) by Cycling ’74 and Pure Data (Puckette and Various 1996–2021) are used to build multimedia applications. Both use a visual programming language in which the programmer generates code by manipulating a graphical environment rather than by writing code textually. Unity and Unreal Engine are game development platforms; Unity primarily uses the programming language C#, while Unreal Engine uses C++. Risk assessment of artworks built with these languages includes an evaluation of whether the work runs on a local Mac or PC as a compiled version and whether that is sufficient for re-exhibition (Engel and Phillips 2019; see Section 22.4).

IDEs (Integrated Development Environments)

An integrated development environment (IDE) is a category of software used to support software development. IDEs may be language-specific, such as NetBeans for Java and PyCharm for Python, or target a specific platform, such as Microsoft Visual Studio for Windows development, Xcode for Mac development, or the Arduino IDE for Arduino microcontrollers. They can also target a specific group of users, such as Processing, an IDE created for artists. Preservation measures for these works include documenting which version of the IDE was used and with which programming language. Using Processing as an example, earlier versions of Processing do not support all of the programming languages currently available, and some versions are not backward compatible, so it is important to collect the appropriate version of Processing for each artwork for further documentation and treatment.

Figure 22.10 A simple C++ application project opened in the Microsoft Visual Studio 2017 IDE. Screenshot: Tom Ensom


22.3.9 Specific Artwork Genres and Preservation Considerations

Web-Based Art

Web-based artworks range in complexity from the static HTML web pages (see Section 22.3.8) of the early web, to those making use of technologies like JavaScript and Flash to create interactivity, to large sites that are dynamically generated by software running on a web server. A web-based artwork may include one or a combination of high-level scripting languages and/or markup languages, with the possible addition of a query language if a database is used for the artwork, in addition to any media files. Some of the web technologies used by artists in the past are at particular risk at this time. These include Java applets and Adobe Flash content, which no longer run in contemporary browsers, and the loss of functionality associated with the removal of support for specific HTML tags. Other considerations include loss of functionality for browser-based media playback, which changes over time as support for old formats is removed. Browsers are updated and modified regularly, and new browsers are introduced, which may not correctly display earlier web-based artworks.

Preservation of a web-based artwork can be approached from the perspective of the files stored on the web server (server-side) or the display-ready web pages returned by the server (client-side). When implementing server-side preservation, the materials required to run the server should be archived, which is likely to require close collaboration with IT staff or whoever is responsible for those servers. The key aspects to be agreed on are as follows:

• Levels of access and security permissions for conservators and web developers, to allow them to document the artwork on the server, as well as upload, download, and/or modify the software and other components that build the website.
• Notifications of any updates to the web server software environment that might impact the artwork. Web servers often host many websites, so decisions about which version to use for the operating system, databases, programming languages, or other software requirements could be made on the basis of many users’ needs and thus unintentionally break a web-based artwork. Timely notification allows conservators to test the artwork and ensure that there was no change.
• Ensuring the security of the web server, particularly with respect to network and systems integrity, to avoid malicious tampering with the artwork.
• If there is visitor input, the hosting institution will need to consider monitoring the input for offensive content and agree on how and whether to respond.

The conservator should document the hierarchy or file tree of the files for the artwork on the web server, the specific programming languages and dependencies, and any specific network considerations, along with standard web server issues such as security, storage, backup, and server maintenance. It is important to test all web-based artworks regularly to document and address any problems that may arise. Using a variety of current browsers on a scheduled basis to check for anomalies or problems helps to ensure cross-browser compatibility or to highlight the need to advise users which browser is preferred if necessary.

For web artworks where server-side preservation is not possible, or as a complementary measure, client-side preservation measures, sometimes referred to as web archiving, may be an option (National Archives 2011). Specialized tools such as web crawlers can be used to retrieve the code and data displayed by a browser in the form of a WARC (WebARChive) file, a standardized web-archiving format (Library of Congress—WARC 2020). These recordings maintain some navigation functionality among the pages of the site and a degree of interactivity. Tools such as Conifer [Rhizome—Conifer (Software) 2020] and Webrecorder (n.d.) allow for small-scale, user-guided recording using an intuitive interface, as well as playback of the WARC files. Web-crawling tools, such as Browsertrix Crawler [Browsertrix (Software) n.d.], Heritrix [Heritrix3 (Software) n.d.], and Wget [Wget (Software) n.d.], can be used to carry out automated capture of larger or more complex sites. Using client-side tools means that processes happening server-side are not captured, so not all recordings are successful. These tools can also be combined with server-side preservation—for instance, when links to external pages are important for the work—so they can be captured preemptively.
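The server-side documentation of a file tree mentioned above can be partially scripted. The following is a minimal sketch using only the Python standard library; it walks a directory and records a relative path and SHA-256 checksum for each file, which can be repeated at intervals and compared to detect unnoticed changes (the directory layout it is run against is, of course, specific to each artwork):

```python
import hashlib
import os

def document_file_tree(root):
    """Return a sorted list of (relative path, SHA-256 checksum) pairs for
    every file under root, suitable for inclusion in documentation records."""
    records = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in sorted(filenames):
            full_path = os.path.join(dirpath, name)
            with open(full_path, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            records.append((os.path.relpath(full_path, root), digest))
    return sorted(records)
```

Comparing two such listings made at different times shows which files were added, removed, or modified on the server.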

Real-Time 3D, Virtual Reality, and Augmented Reality

Real-time 3D (RT3D) rendering is the use of software to generate a moving image sequence in real time from 3D data sources. In contrast to pre-rendered 3D and digital video, where a fixed number of frames of image data are stored, in real-time 3D the frames are generated on the fly. This allows for a moving image that is dynamic and can respond to user interaction. The development of RT3D rendering has been led by the video game industry, where the technology has become ubiquitous. Software-based artworks utilizing RT3D rendering technologies are becoming increasingly common; see, for example, the artists John Gerrard (see fig. 22.11) and Ian Cheng, who are represented in a number of museum collections.

Figure 22.11 John Gerrard, Sow Farm (near Libbey, Oklahoma) 2009 (2009), installation at Tate Britain in 2016. This software-based artwork uses real-time 3D rendering to create a virtual representation of a remote pig farm in the USA. © John Gerrard. Photo: Tate


RT3D software is developed with the assistance of existing libraries and tools. Perhaps the most significant of these is what is known as an engine. An engine is software that packages a set of reusable features and tools for RT3D software development. This will typically include a development environment, known as an editor, which incorporates WYSIWYG elements and graphical interfaces that simplify the process of development for non-programmers. Within this development environment, 3D scenes can be built from assets imported into the engine (such as 3D meshes, textures, and audio), while interactivity and dynamic events can be added through scripting languages or visual editors. Engines are created using high-level programming languages like C++ and can be extended by users through plugins or through modification and recompilation from source (if the developer makes this available). Prominent examples of RT3D engines include Unity and Unreal Engine.

Using an engine editor, a user can build executable software to run on a variety of different platforms. This consists of a package containing all of the necessary code and data to run the software on the target platform—usually a particular operating system and a 3D API (e.g., DirectX or OpenGL). Real-time 3D software often has a close link with dedicated graphics hardware, which is designed to offload common calculations in 3D rendering from the CPU. The presence of a GPU, alongside other powerful hardware components, ensures that frames can be generated at a rate sufficient to maintain an illusion of motion in the images displayed on an output device. This can be measured using the metrics frame rate (the number of frames generated per second, or fps) and frame time (the amount of time in milliseconds taken to render a frame). The GPU hardware, drivers, and the 3D API used may alter the characteristics of a rendered image.
Recent research on the preservation of real-time 3D artworks has focused on virtual reality (VR) artworks (Campbell and Hellar 2019; Ensom and McConchie 2021; Brum et al. 2021) and augmented reality (AR) artworks (Fortunato and Heinen 2020). VR and AR technologies, sometimes referred to using umbrella terms such as immersive media and extended reality (XR), build on real-time 3D rendering. These technologies are centered on hardware that increases the level of immersion in the virtual environment. VR and AR differ in the extent to which they immerse the user: in VR, the user sees only the virtual environment, whereas AR uses cameras mounted on a headset or mobile device to display the real world, which is then augmented with virtual elements. Mixed reality (MR) artworks combine elements of VR and AR.

Common immersive media technologies include the head-mounted display (HMD), a display device that mounts one or two screens in front of a user’s eyes, providing a wide field of view. An HMD can achieve a stereoscopic image by rendering two frames at slightly different viewing angles, one for each of the user’s eyes, which creates the illusion of depth. A high frame rate must be achieved to prevent motion sickness in the user, which currently requires the use of a powerful computer system and the application of frame interpolation techniques. In addition to HMD technology, VR and AR also make use of tracking systems. These allow the real-time 3D software to track the user’s position in the physical environment and translate this into movement within the virtual environment.
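The two performance metrics introduced above, frame rate and frame time, are simple reciprocals of one another, which is useful to keep in mind when comparing documented performance targets. A minimal illustration:

```python
def frame_time_ms(fps):
    """Convert a frame rate in frames per second to frame time in milliseconds."""
    return 1000.0 / fps

def frames_per_second(frame_time):
    """Convert a frame time in milliseconds back to frames per second."""
    return 1000.0 / frame_time

# A 60 fps target corresponds to a rendering budget of about 16.7 ms per
# frame; a 90 fps target, common for VR headsets, allows only about 11.1 ms.
print(round(frame_time_ms(60), 1))  # 16.7
print(round(frame_time_ms(90), 1))  # 11.1
```

Recording either metric during exhibition provides a benchmark against which future hardware or environment changes can be assessed.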

Physical Computing

Physical computing, in the context of software-based art, usually refers to projects of a DIY nature that use combinations of low-cost microcontrollers (e.g., Arduinos) or computers (e.g., Raspberry Pi) with sensors and other electronic components to control output devices. These components are often used to generate interaction or respond to external inputs. They can be used, for instance, to read inputs such as light on a sensor, a finger on a button, or a Twitter message, and turn them into an output such as activating a motor, turning on an LED, or publishing content online [Arduino (Website) n.d.]. This type of technology is closely aligned with open-source communities and hacker practices, and the software tools are often open source and the hardware specifications openly shared. Physical computing is part of the curriculum in many art schools as a fun way to introduce programming concepts. The robustness of the components and movable parts is usually the main preservation concern (see also Section 22.3.1), and although these components tend to be generic in the functionality they offer, they may have an aesthetic value and may be difficult to replace in the future.
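The read-input, decide, act-on-output loop at the heart of such projects can be sketched in Python with the hardware calls stubbed out. The function names and threshold below are illustrative, not a specific board's or library's API:

```python
LIGHT_THRESHOLD = 512  # illustrative threshold on a 10-bit analog reading

def led_should_be_on(light_reading):
    """Decide the output state from a sensor input: turn the LED on
    when the ambient light level drops below the threshold."""
    return light_reading < LIGHT_THRESHOLD

def read_light_sensor():
    """Stub for a hardware call (e.g., an analog read on a microcontroller)."""
    return 300  # a fixed sample value for illustration

def set_led(on):
    """Stub for a hardware call (e.g., a digital write to an output pin)."""
    print("LED on" if on else "LED off")

# One pass of the typical microcontroller loop: read input, decide, act.
set_led(led_should_be_on(read_light_sensor()))
```

Separating the decision logic from the stubbed hardware calls, as here, also makes it possible to document and test an artwork's behavior without the original hardware present.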

3D-Printed Art

The 2021 conference TechFocus IV—Caring for 3D-printed Art described 3D-printed art as “the process of creating a three-dimensional object via computer-aided design (CAD) programs and digital files, printing it using a range of materials from plastic to metal more conventionally, to all kinds of experimental materials like chocolate or shrimp shells” (AIC—TechFocus IV 2021). Other names for 3D printing are additive manufacturing and rapid prototyping. While the artworks produced using 3D printing technologies are generally cared for by objects conservators, preserving the software and technology required to create these artworks may also provide a role for time-based media conservators as more artworks using this medium enter collections and software obsolescence is seen as a risk factor. TBM conservators may thus encounter .stl (standard tessellation language) files, which express triangular representations of 3D objects (McCue 2019). Artists “may also be willing to supply the working design files, in Computer-Aided Design (CAD), Rhino©, or other software” (Hamilton 2017). The fragile nature of the objects themselves may necessitate decisions on the materiality and conservation of the objects in cases where a museum and the artist make decisions about reprinting the object (Oleksik and Randall 2018). Objects conservator Emily Hamilton has observed, “Some artists feel more strongly about the prints, while for others the files are the art” (Hamilton 2020). Collaboration between objects conservators and TBM conservators will best support the conservation of these artworks.
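In their ASCII form, .stl files are plain text and can be inspected directly. The sketch below counts the triangular facets in a file's contents; it is a minimal illustration assuming a well-formed ASCII file, not a substitute for dedicated CAD tooling (binary STL files, common for larger models, would require a different parser):

```python
def count_stl_facets(stl_text):
    """Count triangular facets in the text of an ASCII STL file.
    Each facet is introduced by a line beginning with 'facet normal'."""
    return sum(
        1 for line in stl_text.splitlines()
        if line.strip().startswith("facet normal")
    )

# A minimal ASCII STL solid containing a single triangle.
SAMPLE_STL = """solid example
facet normal 0 0 1
  outer loop
    vertex 0 0 0
    vertex 1 0 0
    vertex 0 1 0
  endloop
endfacet
endsolid example
"""

print(count_stl_facets(SAMPLE_STL))  # 1
```

A facet count recorded at acquisition gives a simple fixity-style check that a supplied file has not been truncated or altered.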

On the Acquisition of .stl Files at MoMA

“At MoMA, Media and Sculpture Conservation collaborated on a case study of the artwork Altar/Engine (2015) by Tauba Auerbach. Consisting of 40-plus 3D-printed objects, the artwork also arrived at acquisition with the .stl files used to print the objects and a composite .stl file of the entire piece in a 3D environment. Media and Sculpture Conservation reviewed the composite .stl file to help guide us in setting installation parameters for the piece (how the objects are placed on a table surface and the distance between them, for example) but quickly realized that the .stl files used to generate the objects were also a critical resource as it allowed us to evaluate the condition of the objects and assess how much they have degraded over time. A number of the objects, it was observed, had already started to collapse on themselves, which we only realized in this comparison. Additionally, we used the .stl files to aid in designing the packing for long-term storage for some objects to provide better support. The use of .stl files as documentation, as reference, and as a resource has since become a critical element of our long-term strategy for caring for these objects, and we now regularly request them when accessioning 3D-printed/rapid prototype objects.”

—Peter Oleksik, Associate Media Conservator, Museum of Modern Art, New York City (Oleksik 2021)

22.4 Preservation and Treatment of Software-Based Art

In the long-term care of software-based art, we must contend with the inevitable failure and obsolescence of components (see Sections 22.3.3 and 22.3.7). In order to continue to display software-based art, it may therefore be necessary to make changes to the system used to realize the artwork. Through carefully considered modification of components, we can extend the life of the artwork without compromising its identity. Despite the need to accommodate change in the conservation of software-based art, the technologies with which a work was created will always be an important part of the artwork’s history, reflecting the period and context of creation, as well as the artist’s practice. They must, therefore, always be thoroughly documented, even if they are replaced, to ensure the work remains displayable.

In this section we present a variety of approaches to preservation, treatment, and interventions for computer- and software-based art, including both environment and code interventions. Code intervention is the modification of code in order to maintain the functionality of an artwork. Environment intervention strategies contrast with code intervention strategies in that they involve modification to the hardware or software environment surrounding the artwork. In practice, you may wish to choose one of these intervention approaches or combine them as dictated by the requirements of the treatment.

22.4.1 Introduction to Environment Intervention Strategies

Environment intervention strategies are those which alter or adapt the technical environment in which software runs so that access to that software can be maintained. A technical environment consists of two parts: the hardware environment, which is made up of physical hardware components, and the software environment, which consists of off-the-shelf software components (as opposed to artist-created software). Environment intervention strategies can be used to address a wide variety of problems in the care of software-based art, such as replacing a failed hardware component or allowing software to run on computers for which it was not designed. In this section, we will introduce the various approaches that can be taken, their limitations, and some general principles for their application.

22.4.2 Environment Maintenance

Environment maintenance is the process of keeping a technical environment operational so that software can continue to be run within this environment. This can include processes such as repairing hardware and updating off-the-shelf software. This approach is most likely to be effective in the short term, while access to replacement hardware components, software updates, and the expertise required to use them remains available. As maintenance continues over extended periods of time, compatible hardware components may become increasingly difficult to find, updates to off-the-shelf software may cease, and expertise in the technologies may dwindle or disappear altogether. When such factors become limiting, environment migration, environment emulation, or code intervention strategies should be explored.

One common example of environment maintenance is the replacement of failed hardware components. For example, if an internal component of a computer fails, it could be replaced with an identical or similar device. Similarity can be very important when making such replacements, as this will help lower the risk of breaking dependency relationships. Important factors to consider include the performance of replacement components, the interface between the component and motherboard, and the availability of drivers for the target OS. For example, it may not be possible to replace an IDE/PATA hard drive with a SATA hard drive unless this newer interface technology is supported by the motherboard and operating system.

Maintenance can go beyond the replacement of components and address failure directly through the repair of electronics at the circuit level. Successfully doing so is contingent on suitable electronic components being available, along with access to the tools and expertise required to carry out the repair. For example, a specialist might be able to remove a failed capacitor, identify a suitable replacement by carefully matching the specification, and solder it into place in the circuit. However, this kind of repair can be very challenging (if not impossible) for newer computer hardware due to the use of industrially manufactured PCBs, which can be high-density and multi-layered and use components that are not widely available (Stricot and Vlaminck 2021).

Off-the-shelf software updates are another form of environment maintenance—for example, applying security updates to an operating system or updating to a newer, more stable hardware driver.
As a consequence of dependency relationships and the alteration or removal of features by software developers over time, carrying out this kind of update can risk loss or change of work-defining characteristics. For networked artworks, where public internet access must be maintained over extended periods of time, security risks are an added concern. The software on a server supporting a web artwork will need to either be updated or the server securely emulated (see Section 22.4.4). It is important to note that if the environment is updated, the artist-created source code must be tested for compatibility and evaluated for possible code intervention (see Section 22.4.7).

22.4.3 Environment Migration

Environment migration is the process of moving software from one technical environment to another, newer technical environment without altering the software itself. An example of this approach would be installing software designed for an obsolete computer system on a contemporary computer system. Where possible, this is advantageous in the conservation of software-based artworks, as it is one way of mitigating the risks posed by the failure and obsolescence of components. However, its usefulness is often limited by the tendency for technical environments to change over time, which can break dependency relationships between the software you wish to preserve and its technical environment. For example, software designed to run on the Macintosh computers of the 1990s and early 2000s, which used PowerPC CPUs, will no longer run on the Intel-based and ARM-based Macintosh computers available today. For this reason, environment migration tends to be feasible only in a relatively short-term timeframe.


Assessing the viability of environment migration should be led by an understanding of the dependencies of the software at the center of the software-based artwork system. For example, in the case of replacing an aging computer system with a newer one, information such as the required CPU architecture and operating system will help determine a suitable system to migrate to. Ideally, these two elements of the hardware specification would be matched in the newer computer system. However, due to the process of obsolescence, this may be difficult to achieve. For example, as Windows XP is no longer sold or supported by Microsoft, hardware drivers are rarely created to support modern hardware components on this operating system. A change in the hardware environment may therefore precipitate a change in the software environment. For software developed to run on Windows XP, the chance of success depends on whether it uses any elements of Windows XP that are no longer present in a newer version of Windows, such as Windows 10.

In the case of the artwork Subtitled Public (2005) by Rafael Lozano-Hemmer, it was possible for conservators at Tate to install and run the Windows XP software on Windows 10 without altering it. However, a quirk of the programming caused the software, which carries out motion tracking using a computer vision system, to run with significantly increased latency. The source code was refactored and recompiled by the artist’s programmer to resolve this. Further risks that may require future intervention were identified as part of the treatment, most significantly that a Windows component used by the software to access video camera feeds has been deprecated—meaning its use is discouraged by Microsoft—and may be removed in future versions of Windows. As this example illustrates, environment migration is often a short-term measure where it is possible at all.
In situations where there have been significant changes in prevalent hardware and software environments, such as a CPU architecture change, environment emulation may be an effective alternative.

22.4.4 Environment Emulation

Rather than moving software to a new technical environment, it may be more advantageous to emulate a suitable environment. Emulation can be defined as the use of software or hardware tools called emulators, which allow one computer system to behave as if it were a different computer system. Emulation is useful because it allows us to run operating systems and other software on computer hardware for which they were not designed. This has significant advantages in supporting long-term access to digital materials broadly, as emulating an obsolete computing platform allows us to maintain access to it without requiring physical hardware, which, as discussed earlier, becomes harder to repair and source over time. Emulation was first widely discussed in a digital preservation context after a series of articles by Jeff Rothenberg in the 1990s (Rothenberg 1995, 1999). Its value as a preservation strategy is now supported by a growing body of evidence from a variety of digital preservation contexts (Wheatley 2004; von Suchodoletz et al. 2008; Espenschied et al. 2013; Dietrich et al. 2016).

The term emulation can be used to refer to a number of related approaches. Emulation in its strictest sense—known as full-system emulation—is where all hardware is completely simulated within the emulator. At the time of writing, software tools are the best option for emulating desktop computers and will be the focus of this section. However, hardware devices are also available, such as field-programmable gate array (FPGA) integrated circuits that can be programmed to mimic other hardware. When talking about emulation using a software emulator, the physical computer system can be referred to as the host, while the computer system that is emulated in software can be referred to as the guest.


On Emulation-as-a-Service

While local access—that is, access via a physical computer at a specific location—is useful for the display of software-based art in physical exhibition spaces, it is not the only way to provide access to emulated software-based art. We have recently seen the emergence of platforms that provide web browser-based access to emulators running on remote servers (von Suchodoletz et al. 2013; Liebetraut et al. 2014). This approach is known as emulation-as-a-service (EaaS) and has already been applied to providing access to software-based art from the Rhizome collection (Espenschied et al. 2013). The Scaling Emulation as a Service Infrastructure (EaaSI) project is continuing the development of the EaaS software and infrastructure [Software Preservation Network (SPN)—EaaSI n.d.].

The great advantage of the EaaS approach is that it allows the presentation of emulated software-based art to anyone with an internet connection and a web browser. It also greatly simplifies the process of using emulators by shifting the technical aspects of emulation to external management at a central location and then providing access to pre-configured environments to suit a variety of common use cases. The disadvantage is that control of the technical aspects of emulation is handed over to the service provider, which may reduce configuration and customization options. EaaS tends to be most successful for software-based artworks in which the software has few external dependencies (e.g., on physical peripherals) and requires a single video output. It has been successfully used for emulating web-based artworks, where it can offer a secure way of providing public access to web servers that require outdated software (Roeck et al. 2018).

Virtualization

Virtualization is similar to emulation but differs in that it allows the guest system access to some of the underlying host hardware. This is most frequently encountered in CPU virtualization, in which a specialized piece of software called a hypervisor allows the virtualization software to execute code directly on the physical processor of the host machine. The condition of this is that the guest operating system is able to make use of the same instruction set architecture (ISA) as the host. This is because, unlike emulation, processor instructions are not translated to suit execution on the host processor. This has significant advantages for the performance of the guest, as having direct access to underlying hardware allows the software to execute at closer to native speeds than would be possible when instructions are translated in full-system emulation. Virtualization has been demonstrated as a viable strategy for the preservation of software-based art (Lurk 2008; Falcão et al. 2014).

While the term virtualization is often used synonymously with CPU virtualization, other components may also be virtualized. This is typically known as paravirtualization and is used for GPUs or other components which it is desirable to share between guest and host (Roeck 2018). A related technique called passthrough may also be used to allow the guest direct access to a host hardware component (e.g., a USB device). While this allows the component to be used at native speed, it comes with the disadvantage that the hardware component cannot be shared with the host. This also affects preservation prospects, as virtualized components are more closely tied to physical hardware and the host operating system-specific drivers that support them (Rechert et al. 2016).


Containerization Containerization (also known as operating system-level virtualization) involves a similar paradigm to virtualization but instead relies on containers: packages of software consisting of the program to be run and any software dependencies it requires to run on a host operating system (IBM Cloud Education 2021). This container is stored in a form that is easily shared and reused with minimal configuration required in order to use it, known as a container image. A container image can conform to a variety of file format specifications. Much like virtual machines, containers have found use in the context of web development, where they offer a means of running multiple software environments on the same server hardware. Unlike virtual machines, container images do not contain an operating system. As a result, they are smaller in size and rely on a container runtime, such as Docker Engine or containerd (Cloud Native Computing Foundation n.d.), instead of a hypervisor (see Section 22.4.4 “Virtualization”). Containerization can achieve faster performance than emulation or virtualization. Containerization is finding use in the presentation of web-based artworks. For example, it has been used as a way to provide access to old versions of web browsers by Rhizome (Rossenova 2020a). It is also used by Haus der Elektronischen Künste in Basel for managing web servers, which Claudia Roeck describes as follows: “Each artwork is in a container, and each container contains a web server with the artwork. This is because they all need different scripting language versions, database versions, everything—each artwork has a different environment. Some might use the same software, but they’re not using the same versions, so you can’t really have them on the same web server. 
So, for each work you need a web server, and they are packed each in one container.” (Roeck 2020) Containerization does not, however, offer the preservation advantages of emulation or virtualization, which achieve a level of independence from the physical hardware (Roeck 2020). Containerization is instead reliant on an additional piece of off-the-shelf software (the container engine), and any software packaged in a container must be able to run on the host operating system.
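As a sketch of the per-artwork web server setup Roeck describes, the following hypothetical example uses Docker (names and versions are illustrative): a Dockerfile pins the exact interpreter version a work depends on, and each artwork is then built and run as its own isolated container:

```shell
# Hypothetical Dockerfile for one web-based artwork that requires PHP 5.6
cat > Dockerfile <<'EOF'
# Pin the exact interpreter/server version the artwork depends on
FROM php:5.6-apache
# Copy the artwork's scripts and media into the server's document root
COPY artwork/ /var/www/html/
EOF

# Build the container image and run it; other artworks on the same host
# run in their own containers with their own software versions.
docker build -t artwork-one .
docker run -d --name artwork-one -p 8080:80 artwork-one
```

Each work thus gets its own web server and environment without interfering with the versions required by other works on the same host.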

22.4.5 Choosing an Environment Intervention Approach

When choosing which environment intervention approach to apply to a work requiring treatment, the first step should be to identify whether hardware can be repaired or replaced like-for-like. If so, environment maintenance is likely to be a good option to explore, as it minimizes the risk of unintended alterations to the artwork. If repair or replacement is not an option or is prohibitively costly, assess whether carrying out environment migration is likely to be feasible. Decisions about how to approach this must be made on a case-by-case basis but will usually involve assessing the dependencies of the software and determining whether they can be maintained after the proposed migration process. This may necessitate a certain amount of experimentation, as it is not always possible to predict the impact changes will have on the execution of the software. When it is not possible to migrate to newer hardware, an intervention involving environment emulation should be considered.


Caring for Software-Based Art

22.4.6 Emulation in Practice

When deciding which approach to emulation to take, consider the following:

• What emulation tools are available to support access to this kind of software on modern computers?
• What are the hardware dependencies of the software, and can one approach support these better than another?
• What are the performance requirements of the software, and can one approach support these better than another?
• To what extent does this approach mitigate obsolescence, and how soon would another intervention be required?
• Will this approach support requirements for access to and/or display of the software-based artwork?

Experimentation with different tools may be necessary to find the approach that best meets the aim of the intervention. Time-based media conservator Jonathan Farbowitz describes a typical emulation use case as “a very simple setup of software running on a single computer with a single video output, and a minimum of peripherals” (Farbowitz 2020). More complex use cases can come up against the limitations of emulation technology (e.g., works requiring 3D acceleration capabilities or access to peripherals). There are a large number of tools and workflows for carrying out emulation, and we do not cover these exhaustively here. Instead, we present a simple workflow to help apply emulation successfully. This sequence of steps is informed by the workflow in use at Tate, which was developed during the Software-based Art Preservation Project (Ensom et al. 2021), based on outcomes of a research collaboration between Klaus Rechert and Tate in 2016 (Rechert et al. 2016).

1. Select an emulator. Use information about the original computer system on which the software was designed to run (e.g., processor architecture, operating system) to select a suitable emulator (see Table 22.3). Suggested open-source emulators are Basilisk II (Bauer and Various n.d.), PCem (Walker 2007), QEMU (The QEMU Project Developers 2010), SheepShaver (Bauer and Various n.d.), and VirtualBox (Oracle Corporation 2004). Choosing an emulator which also supports virtualization can improve performance.

2. Prepare a disk image. A disk image (see Chapter 14) will act as a storage device for the emulated computer system. If the disk image has been acquired from a physical machine, you may need to convert it to a format supported by the chosen emulator. If you have not acquired a disk image from a physical machine, you can create a synthetic disk image (see Chapter 14 and Espenschied 2017). This is an empty disk image container and will require you to install an operating system and any other required software via the emulator. Two useful tools for creating and converting disk images are qemu-img (QEMU Project Developers 2021), associated with the QEMU emulator, and VBoxManage (Oracle Corporation 2004), associated with the VirtualBox emulator.

3. Configure the emulator. Options within the emulator should be used to match or approximate the hardware requirements of the software. Ensuring these are as closely matched as possible increases the chance of success.


Table 22.3 Summary of commonly encountered processor architectures, associated operating systems, and a selection of actively developed emulators capable of running them

Processor Architecture | Associated Operating System(s) | Suggested Emulators
Motorola 68000 (also known as 68k) | Mac System 7 | Basilisk II; QEMU (qemu-system-m68k)
Intel 80386 (also known as i386, IA-32, or x86-32) | Windows 9x (95, 98, Me, 2000); Windows XP; Windows Vista; Ubuntu | PCem; QEMU (qemu-system-i386); VirtualBox
PowerPC (also known as PPC) | Mac OS 8; Mac OS 9; Mac OS X (10.0–10.5) | QEMU (qemu-system-ppc); SheepShaver
x86-64 (also known as AMD64 or Intel x86-64) | Windows Vista; Windows 7; Windows 8; Windows 8.1; Windows 10; Mac OS X (10.4 or later); Ubuntu | QEMU (qemu-system-x86_64); VirtualBox
ARM | macOS 11 or later; Ubuntu (and other Linux distributions); Android; iOS | QEMU (qemu-system-aarch64) (limited or no support for some operating systems, e.g., iOS)

4. Run the emulation. If you experience problems at this step (e.g., the disk image does not boot), you can return to the previous steps and modify the emulator configuration or the disk image.

5. Assess the emulation. Assess the success of the emulation by making use of side-by-side comparison with physical hardware, image and video reference materials, quantitative metrics, and other documentation. See Magnin (2015) and Ensom (2018) for example approaches.

6. Document the emulation. Document the process so that it can be understood in the future. Important information to record includes the workstation used, the versions of software tools used, any modifications made to the base disk image (e.g., format conversion), a record of the configuration options used with the emulator, and notes on the success of the emulation (e.g., problems encountered and solutions developed, differences between the emulation and the original hardware).

7. Archive the emulation. Determine whether you wish to archive the components of the emulation so that it can be repeated in the future. Where emulation was successful, and particularly where the emulation was used to display the work, doing so will help support future treatment and display. Archived components should include the binaries or installer for the specific version of the emulator and the disk image used, along with documentation as described in the previous step.

Emulation is often an iterative process of troubleshooting problems through the testing of different tools and configurations and consulting existing online resources, where the same problems may well have been encountered before. Some useful practical resources include the E-Maculation forum (Emaculation.com n.d.) for Mac emulation, the QEMU QED website (Gates n.d.) for


guidance on using the QEMU emulator, and the developer-created user documentation for the tools mentioned earlier. Although not every emulation attempt will be successful, emulation and other forms of environment intervention are also opportunities to gain knowledge about your software-based artwork; by testing different intervention approaches, we learn about and document the connections between a piece of software and its technical environment.
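As an illustration of steps 2 to 4, the command-line sketch below assumes a disk image acquired from a Windows 98-era PC and the QEMU emulator; the image name and hardware settings are hypothetical and would in practice be matched to the documented original system:

```shell
# Step 2: convert the acquired raw disk image to QEMU's qcow2 format
qemu-img convert -f raw -O qcow2 artwork-disk.img artwork-disk.qcow2

# Steps 3-4: configure the emulator to approximate the original hardware
# (here 128 MB of RAM and a Cirrus Logic video adapter) and boot the image
qemu-system-i386 -m 128 -vga cirrus -hda artwork-disk.qcow2
```

The configuration flags used, along with the qemu-img conversion, are exactly the kind of information that should be captured in step 6.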

22.4.7 Introduction to Code Intervention

In some cases, code intervention—the modification of code in order to maintain access to software—is deemed necessary to restore a work of software-based art. Evaluating the scale of intervention needed is the first step toward remediation. For example, a relatively small intervention of one or two lines of code may be all that is required when the source code of a scripted work must be updated to access a different URL linking to an external dependency such as a data source or media player. Additional steps will be required even for this small change if the software for the artwork needs to be recompiled. If more substantial changes are required—for example, where an obsolete version of a programming language is no longer supported—the code may need to be migrated to another programming language.
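A small intervention of the kind described above might look like the following hypothetical Python fragment, in which a dead URL for an external data source is replaced and the change is annotated so that it remains visible in the source:

```python
# Hypothetical fragment of a software-based artwork that loads an external feed.
# Conservation intervention: the original provider's URL no longer resolves.
# The original value is retained in a comment as a record of the change.
# DATA_FEED_URL = "http://old-provider.example.com/feed"   # original (obsolete)
DATA_FEED_URL = "https://new-provider.example.org/feed"    # replacement

def feed_url():
    """Return the URL the artwork will request at runtime."""
    return DATA_FEED_URL
```

The change is one line, but the annotation preserves the original state for future conservators, in keeping with the documentation practices discussed later in this chapter.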

22.4.8 Code Migration

Migration is a term that has different meanings in a variety of contexts. With respect to the conservation of film and video, it usually refers to a change in the carrier or format of the media. For software-based artworks, it can refer to migrating an artwork from one hardware environment to another for exhibition or preservation purposes (see Section 22.4.3). Code migration refers to migration from one programming language or development environment to another and is an intervention that requires reviewing and rewriting parts or most of the source code. Migrating a software-based work of art from one programming language or development environment to another can be a resource-intensive process, undertaken after considering “the existing migration options [that] are useful for preserving software-based art and how they compare to the emulation options currently used” (Roeck et al. 2019). When evaluating such a migration, there are a number of considerations that are pertinent to software-based artworks but not necessarily relevant to other software applications. These arise from the conservation goals of minimal intervention, the significance of the artist-authored code, respecting reversibility, and selecting programming solutions that best support the long-term preservation of the artwork.

Selecting the Target Programming Language

Following are some of the questions to consider when selecting the programming language to migrate the artwork to. First, is there a newer version of the same programming language or development environment available? If so, then upgrading to a newer version of the software or environment that the artist originally selected is in keeping with the artist’s choice of language or environment at the time the work was created, taking care to check for missing or new features that are not backward-compatible. In addition, migrating a software application to a newer version of the same language or environment is likely a well-documented process, and it is possible that some of the migration can be handled programmatically, either


by using custom software written for this purpose or by reviewing tools available online. For example, software has been written to do a rough “first pass” on converting Python 2.x code to Python 3.x code. If this is not an option, the next question to address is whether there is another programming language that is similar in syntax and structure that could be used. While making decisions around migration, other considerations play a role as well. For example, are there analogous functions, libraries, or other code sources to support the restoration as envisioned? As artists make use of third-party libraries for graphics, media players, special effects, and other purposes, it is important to ensure that these libraries are compatible and available in the new programming language or environment or that comparable libraries are available. Conservators, along with their IT or programmer specialists, should also consider how and whether the software and hardware environments for the specific artwork would need to be reconfigured to accommodate the migration and, if so, whether that would cause further issues. Finally, as is true for all software, a determination needs to be made that the language selected will support all of the systems and functional requirements needed to render the artwork’s behaviors as intended. In the case of software-based artworks, special attention must be paid to how the artwork will appear, to ensure visual and artistic integrity (a concern not necessarily present in software migrations outside the visual arts). The long-term prospects of the new programming language are also of great importance, so that future interventions can be postponed as long as reasonably possible.
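The rough “first pass” conversion mentioned above cannot catch every behavioral difference, however. A minimal Python illustration: integer division changed meaning between Python 2 and 3, so a migrated artwork that relied on the old behavior must make floor division explicit, or its rendering logic may silently change:

```python
# In Python 2, 3 / 2 evaluated to 1 (integer division); in Python 3 it is 1.5.
# A migration must preserve whichever behavior the artwork's logic depends on.
python2_style = 3 // 2  # floor division: reproduces the Python 2 result
python3_style = 3 / 2   # true division: the Python 3 default

assert python2_style == 1
assert python3_style == 1.5
```

A difference of this kind is easy to miss in review but can alter, for example, coordinate or timing calculations that drive an artwork's rendering.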

Code Resituation

When a new programming language must be selected, it is important to consider whether there is another programming language similar in syntax and structure that could be used. For example, when restoring a web-based work of art that runs as a Java applet, the intervention will require migration to another programming language, as Java applets no longer run under standard browsers. In such a case, one could consider Python or JavaScript, to name two examples; between the two, the syntax and structure of JavaScript are much closer to Java than those of Python, so although a Python restoration could ultimately look and behave the same as the original artwork, the code would by necessity look quite different. With Java and JavaScript, on the other hand, some portions of the artist’s original code can be copied as-is from Java to JavaScript, along with a small library of functions to manage some of the differences between the languages (such as the way that floating-point numbers are handled). This approach was developed at the Solomon R. Guggenheim Museum during the restoration of John F. Simon Jr.’s web artwork Unfolding Object (2002) (Phillips et al. 2018) and was named Code Resituation (Engel and Phillips 2018). In this case, a significant amount of Unfolding Object’s original code could be retained, including the algorithms that render the abstract “objects” when users interact with the work, thus ensuring fidelity to the artist’s work. Further, when code can be migrated in this way, those aspects of the functionality are unaffected, which in turn supports testing and A-B comparisons during development.
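As a hypothetical illustration of such a helper library, consider a line of Java like `float y = x / 3.0f;`, which computes in 32-bit floating point, while JavaScript arithmetic uses 64-bit doubles. A small shim function lets the resituated JavaScript reproduce Java's precision while leaving the shape of the artist's expression intact (all names here are illustrative):

```javascript
// Hypothetical shim for resituating Java float arithmetic in JavaScript.
// Math.fround rounds a 64-bit number to the nearest 32-bit float value,
// reproducing the precision of Java's float type.
function jfloat(value) {
  return Math.fround(value);
}

// Java original (hypothetical):  float y = x / 3.0f;
// Resituated JavaScript, with 32-bit rounding applied as Java would:
function step(x) {
  return jfloat(jfloat(x) / jfloat(3.0));
}

// The 32-bit result differs measurably from naive 64-bit division:
const resituated = step(1.1);
const naive = 1.1 / 3.0;
```

Small numerical differences of this kind can accumulate in iterative rendering algorithms, which is why a shim that reproduces the original precision supports faithful A-B comparison against the original work.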

Code Migration: Quality Control and Documentation

As with other treatments, it is important to conduct standard software development testing throughout development to ensure performance. Web-based artworks should also be tested using all of the commonly used browsers and on different platforms (Windows, macOS, and Linux, if available). Appropriate A-B comparisons running the work in its original form, if that


can still be arranged alongside the restored version, can be part of the quality control process throughout development as needed. For web-based artworks, a development environment (e.g., a website or platform such as AWS that is configured to simulate the institutional or collection web server but is not available to the public) is valuable throughout restoration treatment and testing. During this phase, the programmer, the conservator, and the artist, if appropriate, can all have access to view the progress. A VCS implementation, such as Git, is helpful for tracking progress and bugs and for documenting the software migration. Upon completion of the restoration, it is advisable to download a report from the VCS software to capture all of the steps and decisions made throughout the process, either as part of the treatment documentation or as a separate treatment report. Any code intervention should also be documented within the source code of the restoration copy through code annotation (for examples, see Phillips et al. 2017 and Engel and Phillips 2019).
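The VCS reporting step might be carried out with Git as in the following sketch, which builds a toy restoration repository (the commit messages are hypothetical) and exports its full history as a plain-text record for the treatment file:

```shell
# Create a toy repository standing in for the restoration working copy
mkdir -p restoration-demo && cd restoration-demo
git init -q .

# Each intervention is committed with a descriptive message
git -c user.name="Conservator" -c user.email="conservator@example.org" \
    commit -q --allow-empty -m "Baseline: artist's original source code"
git -c user.name="Conservator" -c user.email="conservator@example.org" \
    commit -q --allow-empty -m "Migrate rendering loop from Java applet to JavaScript"

# Export the commit history (with per-file statistics) as a treatment report
git log --stat --date=iso > treatment-report.txt
```

Adding `--patch` to the `git log` invocation would additionally capture the code changes themselves, producing a fuller record of the intervention.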

22.5 Conclusion

Hardware and software will continue to change as new devices, new programming languages, and new paradigms become available over time; a comprehensive list of scenarios is not possible, nor would we want one, as artists continue to explore new technological horizons in their work. Those caring for works of software- and computer-based art will be faced with languages and environments that are not discussed in this chapter because they have not yet been developed. However, the principles remain the same: understanding the technologies behind the hardware and software and the possible impacts of failure and obsolescence, and considering the nature of the programming language(s) and other tools used in creating the artwork, as well as the role of data, media files, and external dependencies, if any. Emerging challenges for conservators will include the ongoing introduction of industrial algorithmic processes such as artificial intelligence and machine learning into artists’ works, and the technical and ethical implications of NFTs and other blockchain technologies. Breaking down these new artworks into their respective components—the anatomy of the artwork, as it were—will inform the decision-making of conservation teams moving forward. Conservation treatment options will evolve as the time-based media conservation community continues to pioneer new intervention approaches to help safeguard the longevity of software- and computer-based art. Our response to new technologies and challenges will be best supported through collaboration with colleagues in the wider digital preservation and computer science communities.

Acknowledgments

The authors wish to thank Lan Linh Merli-Nguyen Hoai, Time-Based Media Conservation Fellow at Düsseldorf Conservation Center, and Professor Craig Kapp, Clinical Professor of Computer Science, Department of Computer Science, Courant Institute of Mathematical Sciences at New York University, for taking the time to read this chapter and provide valuable feedback. The authors also wish to thank Euan Cochrane, Digital Preservation Manager at Yale University Library, and Ethan Gates, Software Preservation Analyst at Yale University Library, for taking the time to meet with us on video to discuss emulation; as well as Amad Ansari, Emma Dickson, Xincheng Huang, and Karl Rosenberg, recent graduates of the Department of Computer Science at the Courant Institute of Mathematical Sciences at New York University, who participated in research and projects related to the conservation of software-based art and met with us on video to share their thoughtful reflections.


Bibliography

AIC (American Institute for Conservation). “TechFocus III: Caring for Software-Based Art.” American Institute for Conservation (AIC) and Foundation for Advancement in Conservation (FAIC) Resource Hub, September 25, 2015. https://resources.culturalheritage.org/techfocus/techfocus-iii-caring-for-computer-based-art-software-tw-2/. ———. “TechFocus IV: Caring for 3D-Printed Art.” AIC (American Institute for Conservation), 2021. https://learning.culturalheritage.org/products/techfocus-iv-caring-for-3d-printed-art. Archive of Digital Art (ADA). “ADA | Archive of Digital Art (Website).” N.d. Accessed February 27, 2021. www.digitalartarchive.at/nc/home.html. arden, sasha, and Mark Hellar. Interview with sasha arden and Mark Hellar by Patricia Falcão, Tom Ensom and Deena Engel, October 26, 2020. Arduino. “Arduino (Website).” 2022. www.arduino.cc/. Association of Research Libraries (ARL). “Code of Best Practices in Fair Use for Software Preservation.” 2019. www.arl.org/wp-content/uploads/2018/09/2019.2.28-software-preservation-code-revised.pdf. Barok, Dušan, Julie Boschat Thorez, Annet Dekker, David Gauthier, and Claudia Roeck. “Archiving Complex Digital Artworks.” Journal for the American Institute of Conservation 42, no. 2 (2019): 94–113. https://doi.org/10.1080/19455224.2019.1604398. Bauer, Christian, and Various. “Basilisk II (Software).” Accessed December 20, 2021. https://basilisk.cebix.net/. ———. “SheepShaver (Software).” N.d. Accessed December 20, 2021. https://sheepshaver.cebix.net/. Bavota, Gabriele, Emad Aghajani, Csaba Nagy, Olga Lucero Vega-Marquez, Mario Linares-Vasquez, Laura Moreno, and Michele Lanza. “Software Documentation Issues Unveiled.” In 2019 IEEE/ACM 41st International Conference on Software Engineering (ICSE), 1199–210. Montreal, QC: IEEE, 2019. https://doi.org/10.1109/ICSE.2019.00122. Brokerhof, Agnes W.
“Installation Art Subjected to Risk Assessment—Jeffrey Shaw’s ‘Revolution’ as Case Study.” In Inside Installations, edited by Tatja Scholte and Glenn Wharton, 91–101. Amsterdam: Amsterdam University Press, 2011. www.academia.edu/8034919/Installation_Art_Subjected_to_Installation_Art_Subjected_to_Risk_Assessment_Jeffrey_Shaws_Revolution_as_Case_Study. “Browsertrix Crawler (Software).” Rhizome Webrecorder, n.d. Accessed February 7, 2022. https://github.com/webrecorder/browsertrix-crawler. Brum, Olivia, Mauricio van der Maesen de Sombreff, Eléonore Bernard, and Gaby Wijers. “A Practical Research into Preservation Strategies for VR Artworks on the Basis of Justin Zijlstra’s 100 Jaar Vrouwenkiesrecht.” LIMA and the Dutch Digital Heritage Network, 2021. www.li-ma.nl/lima/sites/default/files/Rapport_A-Practical-Research-into-Preservation-Strategies-for-VR-artworks-on-the-basis-of-Justin-Zijlstras-100-Jaar-Vrouwenkiesrecht.pdf. Bunz, Sophie. “Preserving Stephan von Huene’s Electronic Artworks by Means of Bit-Stream Documentation.” The Electronic Media Review Volume 5: 2017–2018 (2018). https://resources.culturalheritage.org/emg-review/volume-5-2017-2018/bunz/. Campbell, Savannah, and Mark Hellar. “From Immersion to Acquisition: An Overview of Virtual Reality for Time Based Media Conservators.” Electronic Media Review Volume 6: 2019–2020 (2019). https://resources.culturalheritage.org/emg-review/volume-6-2019-2020/campbell/. Cloud Native Computing Foundation. “Containerd (Software).” Go. Accessed December 20, 2021. https://containerd.io/. Cranmer, Candice, and Nick Richardson. Interview with Candice Cranmer and Nick Richardson by Tom Ensom. Audio Recording, March 3, 2021. Creative Commons (CC). “Creative Commons—About the Licenses.” N.d. Accessed December 1, 2021. https://creativecommons.org/licenses/. Dekker, Annet. Collecting and Conserving Net Art: Moving Beyond Conventional Methods.
London and New York: Routledge, Taylor & Francis Group, 2018. Dekker, Annet, and Patricia Falcão. Interdisciplinary Discussions about the Conservation of Software- Based Art Community of Practice on Software-Based Art. London: Tate, Pericles, 2017. https://openresearch.lsbu. ac.uk/item/8v21z. Della Casa, Davide. “PDFs of ‘Computer Graphics And Art.’ ” TopLap (blog), September 27, 2012. https:// toplap.org/pdfs-of-computer-graphics-and-art/.


Di Cosmo, Roberto, and Stefano Zacchiroli. “Software Heritage: Why and How to Preserve Software Source Code.” In IPRES 2017–14th International Conference on Digital Preservation, 1–10. Kyoto, Japan, 2017. www.softwareheritage.org/wp-content/uploads/2020/01/ipres-2017-swh.pdf. Dietrich, Dianne, Julia Kim, Morgan McKeehan, and Alison Rhonemus. “How to Party Like It’s 1999: Emulation for Everyone.” The Code4Lib Journal, no. 32 (April 25, 2016). https://journal.code4lib.org/articles/11386. Digital Preservation Coalition (DPC). “What Are the Risks?.” 2022. www.dpconline.org/digipres/implement-digipres/dpeg-home/dpeg-risks. Emaculation.com. “Emaculation.Com | Forum.” N.d. Accessed February 1, 2022. www.emaculation.com/forum/. Engel, Deena, Lauren Hinkson, Joanna Phillips, and Marion Thain. “Reconstructing Brandon (1998–1999): A Cross-Disciplinary Digital Humanities Study of Shu Lea Cheang’s Early Web Artwork.” Digital Humanities Quarterly 12, no. 2 (2018): 44. www.digitalhumanities.org/dhq/vol/12/2/000379/000379.html. Engel, Deena, and Joanna Phillips. “Introducing ‘Code Resituation’: Applying the Concept of Minimal Intervention to the Conservation Treatment of Software-Based Art.” The Electronic Media Review Volume 5: 2017–2018 (2018): 22. https://resources.culturalheritage.org/emg-review/volume-5-2017-2018/engel-2/. ———. “Applying Conservation Ethics to the Examination and Treatment of Software- and Computer-Based Art.” Journal of the American Institute for Conservation 58, no. 3 (July 3, 2019): 180–95. https://doi.org/10.1080/01971360.2019.1598124. Engel, Deena, and Glenn Wharton. “Reading Between the Lines: Source Code Documentation as a Conservation Strategy for Software-Based Art.” Studies in Conservation 59, no. 6 (November 2014): 404–15. https://doi.org/10.1179/2047058413Y.0000000115. Ensom, Tom.
“Revealing Hidden Processes: Instrumentation and Reverse Engineering in the Conservation of Software-Based Art.” In Electronic Media Review, Volume 5: 2017–2018 (2018). Houston, TX, 2018. https://resources.culturalheritage.org/emg-review/volume-5-2017-2018/ensom/. ———. “Technical Narratives: Analysis, Description and Representation in the Conservation of SoftwareBased Art.” Ph.D., King’s College London, 2019. https://kclpure.kcl.ac.uk/portal/en/theses/tech nical-narratives(e01bff94-08bd-4b83-aeef-4e7d6d5b0dfc).html. Ensom, Tom, Patricia Falcão, and Chris King. “Software-Based Art Preservation—Project | Tate.” 2021. www.tate.org.uk/about-us/projects/software-based-art-preservation. Ensom, Tom, and McConchie, Jack. “Preserving Virtual Reality Artworks Report (Tate).” Zenodo, August 13, 2021. https://doi.org/10.5281/ZENODO.5274102. Espenschied, Dragan. “Emulation and Access.” Museum of Modern Art (MoMA), December  8, 2017. https://vimeo.com/278042613/a78ee5e46b. Espenschied, Dragan, K. Rechert, Dirk von Suchodoletz, Isgandar Valizada, and Nick Russler. “LargeScale Curation and Presentation of CD-ROM Art.” IPRES, 2013. https://purl.pt/24107/1/iPres2013_ PDF/Large-Scale%20Curation%20and%20Presentation%20of%20CD-ROM%20Art.pdf. Falcão, Patricia. “Developing a Risk Assessment Tool for the Conservation of Software-Based Artworks.” 2010. www.academia.edu/6660777/Developing_a_Risk_Assessment_Tool_for_the_conservation_of_ software_based_artworks_MA_Thesis. ———. “Preservation of Software-Based Art at Tate.” In Digital Art through the Looking Glass: New Strategies for Archiving, Collecting and Preserving in Digital Humanities, edited by Oliver Grau, Janina Hoth, and Eveline Wandl-Vogt, 271–87. Krems an der Donau, Wien: Edition Donau-Universität Krems; ÖAW, Austrian Academy of Science, 2019. www.donau-uni.ac.at/dam/jcr:a29638aa-f334-4abb-960110e36652d09f/Digital_Art_through_the_Looking_Glass_updated.pdf. ———. 
“Risk Assessment as a Tool in the Conservation of Software-Based Artworks.” EMG- Electronic Media Review Volume 2: 2011–2012 (2011). https://resources.culturalheritage.org/emg-review/ volume-two-2011-2012/falcao/. Falcão, Patricia, Ashe Alistair, and Brian Jones. “Virtualisation as a Tool for the Conservation of Software-Based Artworks.” Melbourne, Australia, 2014. www.academia.edu/12462584/Virtualisation_ as_a_Tool_for_the_Conservation_of_Software-Based_Artworks. Farbowitz, Jonathan. Interview with Jonathan Farbowitz by Tom Ensom and Patricia Falcão, August 14, 2020. Fauconnier, Sandra. “Many Faces of Wikibase: Rhizome’s Archive of Born-Digital Art and Digital Preservation.” Wikimedia Foundation News, September  6, 2018. https://wikimediafoundation.org/ news/2018/09/06/rhizome-wikibase/.


Fino-Radin, Ben. “Digital Preservation Practices and the Rhizome Artbase.” Rhizome at the New Museum, 2011. http://media.rhizome.org/artbase/documents/Digital-Preservation-Practices-and-the-RhizomeArtBase.pdf. ———. “Art In the Age of Obsolescence: Rescuing an Artwork from Crumbling Technologies.” MoMA: Features and Perspectives on Art and Culture, December 21, 2016. https://stories.moma.org/art-in-the-age-of-obsolescence-1272f1b9b92e. Fortunato, Flaminia, and Joey Heinen. “There Is No ‘I’ in IOS: Preserving Mobile Technologies as AR Experiences and Objects.” Presented at the Preserving Immersive Media Workshop, March 27, 2020. www.youtube.com/watch?v=6M1YgYhYGf0. Garousi, Golara, Vahid Garousi, Mahmoud Moussavi, Guenther Ruhe, and Brian Smith. “Evaluating Usage and Quality of Technical Software Documentation: An Empirical Study.” In Proceedings of the 17th International Conference on Evaluation and Assessment in Software Engineering—EASE ’13, 24. Porto de Galinhas, Brazil: ACM Press, 2013. https://doi.org/10.1145/2460999.2461003. Gates, Ethan. “QEMU QED—The Missing Manual for Digital Preservation.” N.d. Accessed February 2, 2022. https://eaasi.gitlab.io/program_docs/qemu-qed/. GNU Operating System. “GNU General Public License.” GNU Operating System, n.d. Accessed December 1, 2021. www.gnu.org/licenses/gpl-3.0.en.html. Guggenheim Blog. “How the Guggenheim and NYU Are Conserving Computer-Based Art. Part 1.” Guggenheim Blog (blog), October 26, 2016. www.guggenheim.org/blogs/checklist/how-the-guggenheim-and-nyu-are-conserving-computer-based-art-part-1. Guggenheim Museum. “The Conserving Computer-Based Art Initiative (CCBA).” The Guggenheim Museums and Foundation, 2019. Accessed September 2, 2020. www.guggenheim.org/conservation/the-conserving-computer-based-art-initiative. Hamilton, Emily. “Reprinting as a Conservation Strategy: Collecting Additive Manufactured Works at SFMOMA.” In Future Talks 017: The Silver Edition. Visions.
23 A WORD ABOUT PERFORMANCE ART

Hélia Marçal

Editors’ Notes: Hélia Marçal is Lecturer in History of Art, Materials, and Technology at University College London. Among other topics, her research explores the ethics and performativity of cultural heritage and the conservation of time-based media and performance art. With her brief chapter, Marçal reflects on the intersection between time-based media and live performance art, providing an overview and outlook on the state of emerging practice and research in the collection and conservation of performance art.

DOI: 10.4324/9781003034865-27

23.1 Introduction

As a form of artistic practice whose materiality is understood to be in perpetual unfolding, time-based media art was introduced into contemporary art museums through the acquisition of audiovisual and electronic media. The genre of time-based media art, however, incorporates in its expanded meaning other artistic manifestations that are not necessarily associated with audiovisual and electronic media, such as performance art. Long gone are the days when live performance art seemed to be, by its very nature, in direct contrast with collecting and conservation practices. It is not that performance art was kept out of collections and museum spaces; indeed, traces of performance artworks have inhabited the museum and its various structures since the emergence of the genre in the 1950s (Calonje 2014; Phillips and Hinkson 2018). Collecting institutions, however, refrained from collecting performance art as a live, repeatable action until 2005, when the first live performance artwork was acquired by Tate (London). Since then, we have witnessed performance entering art collections at a rapid pace. In 2021, the Monoskop website identified more than 60 live performance artworks as being part of art collections in Europe and North America (Monoskop 2021). The list is, however, not exhaustive, and many more have entered museum collections throughout the world. But what are the challenges that performance artworks—as living forms of art making—pose to collecting institutions, and in which ways can performance artworks be conserved? This chapter aims to reflect on the conservation of performance art and how it has evolved in the last 15 years. It will focus on the parallels between conserving time-based media and performance art, the emerging and changing practices of collecting live art in museums, and
the intertwining of conservation and curatorial practices in museums. In providing an overview of research initiatives on both theory and practice and reflecting on the role of performance art in the development of the conservation field, this chapter will also discuss the challenges and possibilities that we can foresee in the future. The chapter is divided into four sections. The first section explores the intersections of performance art and other time-based media artworks. The second section approaches the different performance art types one can find in museum collections, reflecting both on collecting trends and the reasons why some kinds of performance art are yet to be collected. The third section highlights the expertise that has been developed in both collecting institutions and universities, focusing not only on how much of it is contingent on the collection and context of operation but also on the possibilities afforded by emerging tools and frameworks. The fourth and last section discusses the challenges performance art is bringing to the future of conservation and how we can create opportunities for the development of practice in museums and other institutions.

23.2 Time-Based Media and Performance Art

Performance art—which, in some academic and museum circles, is also called live art (with the term claiming a specific theoretical lineage—cf. Heathfield 2004)—is usually recognized as an artistic movement that emerged around the 1950s and was later adopted as a genre by visual artists in the 1960s (Jones 2012). The emergence of this genre, however, is neither visually nor geographically contained. Under the overarching notion of art as bodily action, performance art developed in multiple directions, reframing the idea of motion and bodies in the process. And indeed, although the published chronologies are undoubtedly inclined toward the contexts of North America and Europe (mainly Anglo-Saxon cultures), the movement has been adopted beyond these contexts, with renowned artists and groups emerging around the world—such as, for example, the Japanese collective Gutai, which was particularly active in the 1950s and 1960s, Rirkrit Tiravanija (b. 1961, Argentina), or Cildo Meireles (b. 1948, Brazil)—with many of these only now being recognized by the canons of the history of art (as is the case of artists from Eastern Europe, whose pioneering practices were only recently mapped—Bryzgel 2017). Materially speaking, performance art is characterized by a set of actions that take place during a limited time. Both human and nonhuman bodies can perform those actions: the artwork Tatlin's Whisper #5, created by the Cuban-born artist Tania Bruguera (b. 1968) in 2008, for example, is a performance artwork that includes actions by both mounted police and horses; digital performance, as another example, often implies the performance of people through digital media [such as the social media durational performances by Amalia Ulman (b. 1989, Argentina)]. The actions performed usually follow a sequence, which can be minimally predetermined, just be triggered by an overarching idea, or be somewhere in the middle of these scenarios.
The time limit for these actions also varies greatly: John Cage's (1912–92, USA) ORGAN2/ASLSP (1987), for example, which sits at the intersection of music and performance art, is being performed over 639 years in Halberstadt; Tehching Hsieh's (b. 1950, Taiwan) One Year Performances (1978–79, 1980–81, 1981–82, 1983–84, 1985–86) and his Thirteen Year Plan (1986–99) are also examples of an action being dilated in time; in contrast, the Austrian-born artist Erwin Wurm (b. 1954) has created One-Minute Sculptures (since 1988), which are activated by the public for 60 seconds. In considering performance art's relation with time—particularly when discussing its conservation—one needs to account not so much for the duration of performance artworks but rather for how they change with time. Similarly to other time-based media artworks, performance artworks not only change materially during their institutional life cycle, but those changes are also a fundamental part of what makes these artworks what they are.


As one can see through reading the various chapters that comprise this handbook, changes in time-based media artworks—whether as part of the process of displaying them or just keeping them safe and accessible—are both triggered by an ecology of practices that makes them particularly prone to obsolescence and buoyed by conservation's aim to keep them alive and functioning for present and future generations. That is also the case with performance artworks that are to be preserved and displayed as live action—efforts to keep them alive are intrinsically linked to the degrees to which we can change them. Both types of artworks (or, indeed, all artworks) can be said to operate along a range of performativity, in which the most performative artworks accommodate more material changes from activation to activation, while artworks at the other end of the spectrum seem more self-contained. Another characteristic shared by time-based artworks—and one that is not medium-specific—pertains to their information-based materiality. Indeed, time-based media artworks rely on more than their separate components in order to be realized in contexts of display—information is not only an integral part of the formal materiality of some of these works (Kirschenbaum 2008), but it is also what allows them to be displayed, activated, and performed when installed. In looking to preserve both the possibilities of change and the information-based materiality as much as the artwork itself, the conservation of time-based media and performance art seeks a balance that is quite hard to achieve: how can we foster different iterations of a given artwork while keeping it the same artwork? And in which ways can performance artworks thrive in museums?

23.3 Performance Art and the Museum

There are many reasons behind the decision to collect a specific form of artistic practice. In the particular case of performance art, relevant literature has shown that live performance is not exactly new in museums—it is only new to its collections (see Calonje 2014; Lawson et al. 2019). It is possible to say, however, that it has taken quite some time for museums to start acquiring performance art. One of the reasons is that performance art (and, may I say, particularly artworks created in the 1960s and 1970s—cf. Bishop 2012) was considered uncollectible due to the stance that such artworks were unrepeatable, impossible to document, and prone to an inevitable disappearance (Phelan 1993). As the field of performance studies took off in the 1990s (Davis 2008), new perspectives on performance documentation—namely, some that contested this initial position—started to appear (see, for example, Auslander 2006, 2008; or Jones 1997). The new millennium led to several reenactment projects that, according to the scholar Jessica Santone, responded to a turn to nostalgia (Santone 2008). Not only did reenactment projects like Marina Abramović's (b. 1946, former Yugoslavia) Seven Easy Pieces, commissioned and presented at the Guggenheim Museum in New York in 2005, unveil the possibility of repetition for performance art in museums, but they also fostered a critical mass of scholars and artists who helped redefine the possible futures of performance. This period coincided with the time the first performance artworks were acquired by museums (Monoskop 2021). Good Feelings, Good Times, created by Roman Ondák (b. 1966, former Czechoslovakia) in 2003, was acquired by Tate in 2005, quickly followed by This is Propaganda (2002), by Tino Sehgal (b. 1976, United Kingdom), and David Lamelas' (b. 1946, Argentina) Time (1970).
The Stedelijk Museum (Amsterdam) collected Sehgal's Instead of allowing something to rise up to your face dancing bruce and dan and other things (2000), also in 2005, and the Van Abbemuseum (Eindhoven) acquired This is Exchange (2002) by the same artist in 2006. In the United States, the Museum of Modern Art in New York was the first museum to acquire a live performance—Kiss (2003), also by Sehgal—in 2008. More than the Zeitgeist, however, it was the material conditions of these works that led them to be collected in the first place. Up until very recently (see Marçal 2022b), all of the
artworks that entered museum collections shared characteristics that make them relatively more collectible than performances created in the 1970s. These artworks are what the art historian Claire Bishop has called delegated performances. This notion describes performance artworks that are executed by people other than the artist or, in Bishop's words, "the act of hiring non-professionals or specialists in other fields to undertake the job of being present and performing at a particular time and a particular place on behalf of the artist, and following his/her instructions" (Bishop 2012, 219). Delegated performances are, therefore, not so closely associated with the body of the artist who created them. Nor is there such emphasis on the idea that performance ought to be considered a single event, never to be repeated or reproduced in any way, as there was with many artworks created during the 1960s and 1970s. Because these works are designed to be performed by someone other than the artist, mechanisms for the conservation of delegated performances are usually devised by the artists during the process of their creation. These artworks are additionally often acquired (or even shown) as a set of instructions that can be followed by hired performers or members of the audience. The delivery of such instructions can vary significantly: while some instructions are supplied during the acquisition of the artwork, as in the case of Roman Ondák's Good Feelings, Good Times (see Marçal et al. 2022), others, as in the case of Tino Sehgal's artworks, are conveyed through a network of trusted collaborators (see Laurenson and van Saaze 2014). And while gathering and fostering such networks is a challenge in itself (Laurenson and van Saaze 2014)—and one that museums need to grapple with for many other reasons—delegated performances such as Tino Sehgal's artworks still thrive in contemporary museums. Besides this form of delegation, other mechanisms of transmission have started to inhabit museums.
That is the case, for example, of performance artworks acquired with an alternative mode of display—for example, artworks that can be shown either as a live performance or as an installation that includes performance documentation. The alternative mode accounts for a time in which the performance can no longer be presented live due to, for example, the degradation of unique objects that are essential for the performance to take place. There are artworks in which the role of the main performer—such as the artist—is conscientiously and intentionally delegated to a person or a group of people who are tasked with performing the artwork in a regime of quasi-co-authorship (see Marçal 2017, for an example). Devising the material limits and possibilities of a given performance artwork is, therefore, a crucial task for the conservators bringing the artwork into a collection. Documentation (see Chapter 11) and collaboration with artists and other stakeholders (see Chapter 17) are two essential strategies in negotiating the material needs of the artwork, not only during the acquisition process but also throughout the life cycle of the work in the collection. The next section will highlight some of the research efforts, in both theory and practice, in the conservation of performance art that have been developed in recent years.

23.4 Perspectives on the Conservation of Performance Art

Studies on the conservation of performance art go as far back as the Guggenheim-based Variable Media Initiative (1999–2004; see Depocas et al. 2003), which, already in 2001, introduced Robert Morris' (1931–2018, USA) Site, created in 1964, as a case study. The most prominent initiative to reflect on the conservation of performance after this pioneering effort of the Variable Media Network was the project Collecting the Performative, a research network
examining emerging practice for collecting and conserving performance-based art, which took place from 2012 to 2014 (Tate 2014a). This project was based at Tate and developed in collaboration with Maastricht University. Outputs from this collective and international effort include essays by the curator Catherine Wood (Wood 2014) and by Pip Laurenson and Vivian van Saaze (Laurenson and van Saaze 2014), as well as a resource for professionals interested in collecting performance, named The Live List: What to Consider When Collecting Live Works (Laurenson et al. 2014). This resource is designed as a prompt to promote thinking about what is required when bringing a live performance into a collection. The Live List has some similarities with the Variable Media Questionnaire (Variable Media Network 2003) but goes beyond the realm of the artist, summarizing aspects to consider when acquiring performance artworks, concerning both the nature of these works and the politics and procedures of the institution (Laurenson et al. 2014). Research and practice in the conservation of performance art were furthered around the time Collecting the Performative took place, with publications focusing on performance art starting to appear across the whole spectrum of the conservation field (see Giguère 2012, 2014; Marçal et al. 2013, 2016; van Saaze 2015; Wharton 2016; Finbow 2018). Some of these publications (and, in general, the field) were influenced by the seminal paper "Reflections on a biographical approach to contemporary art conservation," published by Renée van de Vall and co-authors in the preprints of the ICOM-CC 16th Triennial Conference (van de Vall et al. 2011).
This cohort of authors was also involved in the Marie Skłodowska-Curie Innovative Training Network New Approaches in the Conservation of Contemporary Art (NACCA), which, in 2014, launched 15 PhD projects, one of which was dedicated to the preservation of performance art (see Goldie-Scot 2021; for more information see Tate 2019). The opening of the field to the study of the conservation of performance art was also propelled by conferences emerging from conservation projects and associations. The first conference, "Performing Documentation in the Conservation of Contemporary Art," took place in Lisbon in 2013. Part of the research network NeCCAR (Network for Conservation of Contemporary Art Research; see Tate 2014b), this conference joined conservators and researchers in discussing the role of documentation in the conservation of contemporary art and, in particular, performance-based art (for the proceedings, see Matos, Macedo, and Heydenreich 2015). The first conference exclusively focused on the conservation of performance art happened in 2016. Organized by the German Association of Conservator-Restorers (VDR) and the Kunstmuseum Wolfsburg, "Collecting and Conserving Performance Art" (June 9–11, 2016) brought together international scholars and practicing conservators to discuss the lives of performance art in the museum. The videos of the conference are available online (Verband der Restauratoren 2017), and the VDR journal Beiträge zum Erhalt von Kunst- und Kulturgut published the majority of contributions across two volumes (VDR 2017, 2018). Significant expansions of theory and practice in the conservation of performance art occurred after 2016, drawing on these earlier initiatives. The year 2016 marked the beginning of the project Documentation and Conservation of Performance at Tate and the consequent development of a strategy specially dedicated to the care of these works (see Lawson et al. 2019; Tate 2021).
The years that followed saw the publication of the book Histories of Performance Documentation: Museum, Artistic, and Scholarly Practices (Giannachi and Westerman 2018), as well as various publications centered on the practices of conserving performance art (e.g., Borges and Inês 2018; Lane and Wdowin-McGregor 2016; Lawson and Potter 2017; Lawson et al. 2019; Marçal 2017, 2021a, 2021b; Nogueira et al. 2016), or raising the profile of the theory being developed at the intersection of performance art and conservation (see, for example, Castriota 2021; Hölling 2021; Marçal 2019b, 2021a, 2021b, 2022a; Rieß et al. 2019; van den
Hengel 2017). One of the latest seminal research efforts to look into the theory and practice of the conservation of contemporary art with a focus on performance art is the Andrew W. Mellon-funded project Reshaping the Collectible: When Artworks Live in the Museum (2018–21), led by the Head of Collection Care Research at Tate, Pip Laurenson. Intending to understand the ways in which artworks that challenge traditional categories or practices can reshape museums, this project has highlighted conservation as a social activity that goes beyond the walls of the museum [see the project’s website (Tate 2018); for the conservation-related state-of-the-art report, see Marçal 2019a]. The end of 2020 saw the beginning of a new project—Performance: Conservation, Materiality, Knowledge, led by Hanna B. Hölling. Based at the University of Bern, this four-year project aims to study the conservation of performance art at the intersection of theory and practice (Performance: Conservation, Materiality, Knowledge 2021). The first colloquium (2021) focused on discussing the ethics and the politics of care, and a publication is forthcoming. The currently published body of literature explores five topics in particular: (1) the need to account for implicit knowledge in the documentation of performance, (2) the importance of body-to-body transmission, (3) the role of audiences and performance participants, (4) conservation as a social activity, and (5) the role of documentation in transmitting performance artworks from iteration to iteration. Among the documentation tools developed for the conservation of performance (or for time-based media artworks that share some of its characteristics), one can find the Identity and Iteration Reports developed by Joanna Phillips and her team at the Guggenheim (see, for example, Phillips 2015; Phillips and Hinkson 2018), and the tools and workflows developed by the time-based media team at Tate, led by Louise Lawson (see Lawson et al. 
2019; Lawson and Marçal 2021; Lawson et al. 2021). Other documentation tools—particularly those designed for objects that require change—could be applied to the conservation of performance; although this could lead to the reformulation of some aspects of the templates, the exercise itself can be useful for rethinking the categories we use in understanding and conserving artworks.

23.5 Conclusion: From Emerging Practice to Sustained Development

This chapter explored the emergence of performance art as both an artistic practice and an operator in the makings of conservation in collecting institutions. I have argued that although performance art helped destabilize some of the practices of museums, the potential for further change is as significant as the ambition to continue to collect performance art. It is only a matter of time until artworks that defy intentions of delegated authority, such as non-delegated or activist performance, or that make us revise our notion of what performance art is, such as digital performance, start being acquired by collecting institutions. As I've argued elsewhere (Marçal 2022b), not only has the acquisition of non-delegated artworks already started, but it is also one of the ways in which museums have been looking to expand. The acquisition of activist performance art will further impact museum practices as well as the understanding of what it means to conserve performance in museums. The consolidation of practice around the conservation of performance—and, in particular, of artworks that not only depend upon but also create social networks—will contribute to revising models of documentation (including digital preservation and collaborative categorization—see, for example, Chapter 10), creating opportunities for the development of practice in museums and other institutions in the form of collaborative networks and shared tools and frameworks (as in the project Matters in Media Art—see Matters in Media Art 2015). One of the first steps is, however, to normalize the care of

Hélia Marçal

performance art within conservation departments and as part of a conservation strategy. More­ over, in creating the conditions for the various types of performance art to thrive in museums, it will be inevitable for conservation and other departments within museums to become ever more collaborative as active participants in the ecology of these works and to underwrite their preservation.

Bibliography

Auslander, Philip. “The Performativity of Performance Documentation.” PAJ: A Journal of Performance and Art 28, no. 3(84) (September 1, 2006): 1–10. https://doi.org/10.1162/pajj.2006.28.3.1.
———. Liveness: Performance in a Mediatized Culture. 2nd ed. London and New York: Routledge, 2008.
Bishop, Claire. Artificial Hells: Participatory Art and the Politics of Spectatorship. London and New York: Verso Books, 2012.
Borges, Maria Inês Silvestre. “Conservar la performance: La documentación y su papel en la conservación de obras de arte transitorias.” November 7, 2018. https://riunet.upv.es/handle/10251/112033.
Bryzgel, Amy. Performance Art in Eastern Europe Since 1960. Rethinking Art’s Histories. Manchester: Manchester University Press, 2017.
Calonje, Teresa, ed. Live Forever: Collecting Live Art. London: Koenig Books, 2014.
Castriota, Brian. “Object Trouble: Constructing and Performing Artwork Identity in the Museum.” ArtMatters International Journal for Technical Art History, special issue no. 1 (2021): 12–22.
Davis, Tracy C., ed. The Cambridge Companion to Performance Studies. Cambridge Companions to Literature. Cambridge: Cambridge University Press, 2008. https://doi.org/10.1017/CCOL9780521874014.
Depocas, Alain, Jon Ippolito, Caitlin Jones, and the Daniel Langlois Foundation for Art, Science and Technology, eds. Permanence Through Change: The Variable Media Approach. New York: The Solomon R. Guggenheim Foundation; Montreal: The Daniel Langlois Foundation for Art, Science and Technology, 2003.
Finbow, Acatia. “Multiplicity in the Documentation of Performance-Based Artworks: Displaying Multi-Media Documentation in Rebecca Horn’s Body Sculptures at Tate.” Journal of New Music Research 47, no. 4 (August 8, 2018): 291–99. https://doi.org/10.1080/09298215.2018.1486432.
Giannachi, Gabriella, and Jonah Westerman, eds. Histories of Performance Documentation: Museum, Artistic, and Scholarly Practices. London: Routledge, Taylor & Francis Group, 2018.
Giguère, Amélie. “Art Contemporain et Documentation : La Muséalisation d’un Corpus de Pièces Éphémères de Type Performance.” Thèse de doctorat, Avignon, 2012. www.theses.fr/2012AVIG1110.
———. “Collectionner La Performance : Un Dialogue Entre l’artiste et Le Musée.” Muséologies: Les Cahiers d’études Supérieures 7, no. 1 (2014): 169–85. https://doi.org/10.7202/1026653ar.
Goldie-Scot, Iona. “Performance Art in the Making: Distributed Knowledge and New Dependencies.” ArtMatters International Journal for Technical Art History, special issue no. 1 (2021): 30–35.
Heathfield, Adrian. Live: Art and Performance. London: Tate Publishing, 2004.
Hölling, Hanna B., ed. Object-Event-Performance: Art, Materiality, and Continuity Since the 1960s. Cultural Histories of the Material World. New York: Bard Graduate Center, 2021.
Jones, Amelia. “ ‘Presence’ in Absentia.” Art Journal 56, no. 4 (December 1, 1997): 11–18. https://doi.org/10.1080/00043249.1997.10791844.
———. “Timeline of Ideas: Live Art in (Art) History, A Primarily European-US-Based Trajectory of Debates and Exhibitions Relating to Performance Documentation and Re-Enactments.” In Perform, Repeat, Record: Live Art in History, 425–32. Chicago: Intellect, The University of Chicago Press, 2012.
Kirschenbaum, Matthew G. Mechanisms: New Media and the Forensic Imagination. 1st paperback ed. Cambridge, MA: MIT Press, 2008.
Lane, Robert Lazarus, and Jessye Wdowin-McGregor. “This Is so Contemporary? Mediums of Exchange and Conservation.” Studies in Conservation 61, no. sup2 (June 1, 2016): 104–8. https://doi.org/10.1080/00393630.2016.1204528.
Laurenson, Pip, Christiane Berndes, Hendrik Folkerts, Diana Franssen, Adrian Glew, Panda de Haan, Ysbrand Hummelen et al. “The Live List: What to Consider When Collecting Live Works.” Collecting the Performative Network, Tate, January 24, 2014. www.tate.org.uk/about-us/projects/collecting-performative/live-list-what-consider-when-collecting-live-works.

A Word About Performance Art

Laurenson, Pip, and Vivian Van Saaze. “Collecting Performance-Based Art: New Challenges and Shifting Perspectives.” In Performativity in the Gallery: Staging Interactive Encounters, edited by Outi Remes, Laura MacCulloch, and Marika Leino, 27–41. Oxford: Peter Lang, 2014.
Lawson, Louise, Acatia Finbow, Duncan Harvey, Hélia Marçal, Ana Ribeiro, and Lia Kramer. “Strategy and Glossary of Terms for the Documentation and Conservation of Performance, Published as Part of Documentation and Conservation of Performance (March 2016–March 2021), a Time-Based Media Conservation Project at Tate.” Tate, 2021. www.tate.org.uk/about-us/projects/documentation-conservation-performance/strategy-and-glossary#glossary.
Lawson, Louise, Acatia Finbow, and Hélia Marçal. “Developing a Strategy for the Conservation of Performance-Based Artworks at Tate.” Journal of the Institute of Conservation 42, no. 2 (May 4, 2019): 114–34. https://doi.org/10.1080/19455224.2019.1604396.
Lawson, Louise, and Hélia Marçal. Unfolding Interactions in the Conservation of Performance Art. Beijing: International Council of Museums, 2021. https://discovery.ucl.ac.uk/id/eprint/10119488/1/Marcal_388_Marcal_MMCA_paper_finalversion%20%281%29.pdf.
Lawson, Louise, and Deborah Potter. “Contemporary Art, Contemporary Issues—Conservation at Tate.” Journal of the Institute of Conservation 40, no. 2 (May 4, 2017): 121–32. https://doi.org/10.1080/19455224.2017.1318079.
Marçal, Hélia. “Conservation in an Era of Participation.” Journal of the Institute of Conservation 40, no. 2 (May 4, 2017): 97–104. https://doi.org/10.1080/19455224.2017.1319872.
———. “Contemporary Art Conservation: Published as Part of the Research Project Reshaping the Collectible: When Artworks Live in the Museum.” Tate Research, 2019a. www.tate.org.uk/research/reshaping-the-collectible/research-approach-conservation.
———. “Diffracting Histories of Performance.” Performance Research 24, no. 7 (October 3, 2019b): 39–46. https://doi.org/10.1080/13528165.2019.1717863.
———. “Situated Knowledges in the Conservation of Performance Art.” ArtMatters International Journal for Technical Art History, special issue no. 1 (2021a): 55–62.
———. “Towards a Relational Ontology of Conservation.” In ICOM-CC 19th Triennial Conference Preprints, Beijing, 17–21 May 2021, edited by Janet Bridgland. Paris: International Council of Museums, 2021b.
———. “Becoming Difference: On the Ethics of Conserving the In-Between.” Studies in Conservation 67, no. 1–2 (February 17, 2022a): 30–37. https://doi.org/10.1080/00393630.2021.1947074.
———. “Ecologies of Memory in the Conservation of ‘Ten Years Alive on the Infinite Plain’.” Reshaping the Collectible, Tate Research Publication (blog), 2022b. https://www.tate.org.uk/research/reshaping-the-collectible/conrad-ecologies-memory.
Marçal, Hélia, Louise Lawson, and Ana Ribeiro. “Experimenting with Transmission.” Reshaping the Collectible, Tate Research Publication (blog), 2022. https://www.tate.org.uk/research/reshaping-the-collectible/tony-conrad-experimenting-transmission.
Marçal, Hélia, Rita Macedo, Andreia Nogueira, and António Duarte. “Whose Decision Is It? Reflections About a Decision-Making Model Based on Qualitative Methodologies.” CeROArt: Conservation, Exposition, Restauration d’Objets d’Art, no. HS (September 11, 2013). https://doi.org/10.4000/ceroart.3597.
Marçal, Hélia, Andreia Nogueira, Isabel Pires, and Rita Macedo. “Connecting Practices of Preservation: Exploring Authenticities in Contemporary Music and Performance Art.” In Authenticity in Transition: Changing Practices in Contemporary Art Making and Conservation, edited by Erma Hermens and Frances Robertson, 117–27. London: Archetype, 2016.
Matos, Lúcia Almeida, Rita Macedo, and Gunnar Heydenreich, eds. Performing Documentation in the Conservation of Contemporary Art. Revista de História da Arte—Série W. Lisboa: Instituto de História da Arte, 2015.
Matters in Media Art. “Matters in Media Art (Website).” 2015. http://mattersinmediaart.org/.
Monoskop. “Performance Art.” Monoskop. Accessed August 6, 2021. https://monoskop.org/Performance_art#Works.
Nogueira, Andreia, Rita Macedo, and Isabel Pires. “Where Contemporary Art and Contemporary Music Preservation Practices Meet: The Case of Salt Itinerary.” Studies in Conservation 61, no. sup2 (June 1, 2016): 153–59. https://doi.org/10.1080/00393630.2016.1188251.
Performance: Conservation, Materiality, Knowledge. Accessed March 31, 2021. https://performanceconservationmaterialityknowledge.com/.
Phelan, Peggy. Unmarked: The Politics of Performance. London and New York: Routledge, 1993.


Phillips, Joanna. “Reporting Iterations: A Documentation Model for Time-Based Media Art.” In Performing Documentation, Revista de História da Arte—Série W, edited by Gunnar Heydenreich, Rita Macedo, and Lúcia Matos, 168–79. Lisboa: Instituto de História da Arte, 2015.
Phillips, Joanna, and Lauren Hinkson. “New Practices of Collecting and Conserving Live Performance Art at the Guggenheim Museum.” VDR-Journal Beiträge zum Erhalt von Kunst- und Kulturgut 1 (2018): 124–32.
Rieß, Eva, Carolin Bohlmann, and Ina Hausmann. “From Action to Object: On the Preservation of Performance-Based Installations by Joseph Beuys.” Journal of the Institute of Conservation 42, no. 2 (May 4, 2019): 79–93. https://doi.org/10.1080/19455224.2019.1604397.
Santone, Jessica. “Marina Abramović’s Seven Easy Pieces: Critical Documentation Strategies for Preserving Art’s History.” Leonardo 41, no. 2 (April 2008): 147–52. https://doi.org/10.1162/leon.2008.41.2.147.
Tate. “Collecting the Performative—Project.” Tate, 2014a. www.tate.org.uk/about-us/projects/collecting-performative.
———. “NeCCAR: Network for Conservation of Contemporary Art Research—Project.” Tate, 2014b. www.tate.org.uk/about-us/projects/neccar-network-conservation-contemporary-art-research.
———. “Reshaping the Collectible: When Artworks Live in the Museum.” Tate, 2018. www.tate.org.uk/research/reshaping-the-collectible.
———. “NACCA: New Approaches in the Conservation of Contemporary Art—Project.” Tate, 2019. www.tate.org.uk/about-us/projects/nacca.
———. “Documentation and Conservation of Performance—Project.” Tate, 2021. www.tate.org.uk/about-us/projects/documentation-conservation-performance.
———. “Strategy for the Documentation and Conservation of Performance.” Tate. Accessed March 31, 2021. www.tate.org.uk/about-us/projects/documentation-conservation-performance/strategy-and-glossary.
Van de Vall, Renée, Hanna Hölling, Tatja Scholte, and Sanneke Stigter. “Reflections on a Biographical Approach to Contemporary Art Conservation.” In ICOM-CC 16th Triennial Conference Preprints, Lisbon, 19–23 September 2011, edited by Janet Bridgland. Lisbon: Critério-Produção Gráfica, Lda, 2011. https://pure.uva.nl/ws/files/1262883/115640_344546.pdf.
Van den Hengel, Louis. “Archives of Affect: Performance, Reenactment, and the Becoming of Memory.” In Materializing Memory in Art and Popular Culture, edited by László Munteán, Liedeke Plate, and Anneke Smelik, 125–42. London and New York: Routledge, Taylor & Francis Group, 2017.
Van Saaze, Vivian. “In the Absence of Documentation: Remembering Tino Sehgal’s Constructed Situations.” In Performing Documentation in the Conservation of Contemporary Art, Revista de História da Arte—Série W, edited by Lúcia Almeida Matos, Rita Macedo, and Gunnar Heydenreich, 55–63. Lisboa: Instituto de História da Arte, 2015.
“Variable Media Network.” 2003. https://variablemedia.net/e/index.html.
“Variable Media Questionnaire: Documentation.” Accessed March 31, 2021. https://variablemediaquestionnaire.net/.
VDR. “Beiträge zur Erhaltung von Kunst- und Kulturgut [2].” Verband der Restauratoren, no. 2 (2017).
———. “Beiträge zur Erhaltung von Kunst- und Kulturgut [1].” Verband der Restauratoren, no. 1 (2018).
Verband der Restauratoren. “Collecting and Conserving Performance Art—Videos.” Verband der Restauratoren, 2017. www.restauratoren.de/collecting-and-conserving-performance-art-videos/.
Wharton, Glenn. “Reconfiguring Contemporary Art in the Museum.” In Authenticity in Transition: Changing Practices in Contemporary Art Making and Conservation: Proceedings of the International Conference Held at the University of Glasgow, 1–2 December 2014, edited by Erma Hermens and Frances Robertson. London: Archetype Publications, 2016.
Wood, Catherine. “In Advance of the Broken Arm: Collecting Live Art and the Museum’s Changing Game.” In Live Forever: Collecting Live Art, edited by Teresa Calonje, 123–46. London: Koenig Books, 2014.


CONTRIBUTORS

Adachi-Tasch, Ann
Ann Adachi-Tasch is Executive Director of Collaborative Cataloging Japan, a nonprofit that supports preservation and archiving of Japanese historical and experimental moving image works. Ann has worked at the Museum of Modern Art, where she managed projects for the museum’s global research initiative titled Contemporary and Modern Art Perspectives (C-MAP), and contributed to the launch of its digital platform, post (post.at.moma.org). In 2009, she organized a touring screening program and publication of Japanese experimental video and film, Vital Signals, at Electronic Arts Intermix, a video art archive and distributor where she was Distribution Coordinator. Ann has given presentations and written about the status of media archiving in Japan at the Museum of Modern Art (Tokyo), Tate Modern (London), Keio University Art Center (Tokyo), and the Archives of American Art (Washington, DC), among others.

Anderson, Mollie
Mollie Anderson is Executive Assistant at Weights & Biases, Inc. She has worked in administrative roles at museums and cultural institutions in Los Angeles, New York, and Philadelphia, including the Getty Conservation Institute, The Metropolitan Museum of Art, and the Conservation Center for Art and Historic Artifacts. Mollie earned her BA in philosophy from the University of Pennsylvania in 2013.

Antos, Julian
A projectionist, technician, filmmaker, and archivist living and working in Chicago, Julian Antos currently serves as Executive Director for the Chicago Film Society. Julian projects at CFS shows, manages the CFS film collection and film loans, and has a special interest in making sure every bit of salvaged film and equipment has a useful life. He has installed 16 and 35 mm exhibitions for Steve McQueen, Ben Rivers, and Mathias Poledna and oversaw the 16 mm projection for the Art Institute of Chicago’s Andy Warhol exhibition, From A to B and Back Again. Julian also serves as Technical Director at the Music Box Theatre, Chicago’s premier film venue with a focus on exhibiting new films on 35 and 70 mm.



Archey, Karen
Karen Archey is Curator of Contemporary Art, Time-Based Media, at the Stedelijk Museum, Amsterdam. She is an American curator and art critic formerly based in Berlin and New York and a 2015 Creative Capital | Warhol Foundation Arts Writers Grant recipient for short-form writing.

arden, sasha
sasha arden is currently a fourth-year graduate student in the Time-Based Media Conservation program at New York University. Having been involved in arts production, installation, and management throughout their career, sasha embraces the long-term thinking and development of appropriate stewardship practices while negotiating ecosystems of stakeholders and values unique to each artwork. Their ongoing research examines the intersection of technical capabilities and the philosophical and ethical questions arising through the conservation process, often questioning conventional approaches in pursuit of a holistic approach to the integrity of cultural assets.

Barger, Michelle
Michelle Barger is Head of Conservation at the San Francisco Museum of Modern Art, where she oversees conservators with expertise in paintings, paper, objects, photography, and electronic media. Developing models of care for contemporary art is of particular interest to her—especially when integrating the role of living artists. Unorthodox approaches to art making and how this shapes conservation practice are subjects of specialized focus and have led to publications on Eva Hesse and Bruce Conner. Barger has worked at the Museum of Fine Arts (Boston) and the Philadelphia Museum of Art. She holds a BA in art history from the University of North Carolina, Chapel Hill, and an MS in art conservation from the University of Delaware/Winterthur Museum.

Barnott-Clement, Rebecca
Rebecca Barnott-Clement is Senior Time-Based Art Conservator at the Art Gallery of New South Wales (AGNSW), Australia, where she focuses on the care and conservation of contemporary and time-based art collections. Prior to her current appointment, Rebecca worked as Junior Time-Based Art Conservator at AGNSW and as a contemporary art conservator at the Museum of Contemporary Art (MCA) and the 21st Biennale of Sydney, having completed an MA in Cultural Materials Conservation at the University of Melbourne in 2017.

Bek, Reinhard
Prior to enrolling in the Conservation Program of the HTW, University of Applied Sciences in Berlin, Reinhard Bek apprenticed as a ship-builder in Hamburg (1993 to 1996) and completed internships in the conservation departments of several museums in Germany. He was a fellow at the Swiss Institute of Art Research (SIK), Zurich, Switzerland, in 2001, and completed his training as an objects conservator in 2002. He holds a graduate degree in the conservation of objects. Reinhard joined the Museum Tinguely in Basel, Switzerland, from 2002 to 2012 as Head of Conservation. In 2005 he co-organized the symposium Moving Parts on the Preservation and Documentation of Kinetic Artworks at the Museum Tinguely. Reinhard was a participant in the European conservation research projects Inside Installations and PRACTICs. In 2009, he was a 12-month conservation research fellow with the Museum of Modern Art in New York. Since 2012, Reinhard has been a partner of Bek & Frohnert LLC, based in New York City.


Blair, Ulanda
Ulanda Blair is Curator of Moving Image at M+, a new museum for visual culture, which opened in Hong Kong in 2021. At M+, she is developing the moving-image collection while planning and presenting exhibitions, screenings, publications, and events. Prior to joining M+, she was Curator at the Australian Centre for the Moving Image (ACMI) in Melbourne, Australia.

Bloes, Richard
Richard Bloes has been an AV installer and projectionist since being hired by the Whitney Museum in 1979. Having installed many works that the museum later acquired, he became more active in the conservation of those works, especially after 2010. These works include Nam June Paik’s iconic Magnet TV and Nan Goldin’s The Ballad of Sexual Dependency, both of which have traveled extensively.

Brand, Bill
Bill Brand is a co-owner of BB Optics Inc., a company that specializes in archival film preservation and post-production services. He is Professor Emeritus at Hampshire College and teaches film preservation in New York University’s Moving Image Archiving and Preservation graduate program. He is a multidisciplinary artist whose films, public artwork, works on paper, painting, and installations have been exhibited worldwide in museums, galleries, festivals, and microcinemas and on television. His 1980 Masstransiscope, an animated mural installed in the New York City subway, is in the MTA Arts and Design permanent collection.

Brost, Amy
Amy Brost is Assistant Media Conservator at the Museum of Modern Art (MoMA), New York. She earned an MA in art history and MS in conservation from New York University, a BS in art and BA in art history from the University of Wisconsin-Madison, and an AS in chemistry from New York City College of Technology. She is pursuing an MEd in Curriculum and Instruction from SUNY Empire State College to connect high school chemistry curricula, artists’ practices, and conservation. She held media conservation training positions at The Metropolitan Museum of Art, MoMA, and the Solomon R. Guggenheim Museum.

Callard, Andrea
Andrea Callard is an artist making experimental narrative films, photographs, and industrial videos. She is a member of Collaborative Projects Inc., a.k.a. COLAB, and her photographs of COLAB’s seminal 1980 exhibit The Times Square Show are widely published. Callard documents the former New York City landfill as it becomes Freshkills Park, a 30-year process. Her works have been exhibited at MoMA PS1, Exit Art, the Kentler International Drawing Space, White Columns, and the Museum of Chinese in America, among others. Screenings of her early preserved films include Ambulante, the Walker Art Center, MoMA, DOXA 2013, the 10th and 7th Orphans Film Symposium, the Maysles Cinema, and the 56th Oberhausen Short Film Festival. She is a founding member of the XFR Collective. Since 2008, she has produced more than 120 short videos documenting the work of Green Planet 21.

Campbell, Savannah
Savannah Campbell is Media Preservation Specialist, Video and Digital Media, at the Whitney Museum of American Art as part of the Media Preservation Initiative project. Previously she was a Fellow in Magnetic Media Preservation at the Standby Program and has worked on audiovisual preservation projects for the Dance Heritage Coalition, CUNY TV, and Crawford Media Services. Savannah holds an MA in Moving Image Archiving and Preservation from New York University.

Castriota, Brian
Brian Castriota is a researcher, educator, and conservator specializing in time-based media, contemporary art, and archaeological materials. He is Adjunct Lecturer in Time-Based Media Art Conservation at NYU’s Institute of Fine Arts, is a freelance conservator for time-based media and contemporary art at the National Galleries of Scotland and the Irish Museum of Modern Art, and serves as Supervising Conservator with Harvard Art Museums’ Archaeological Exploration of Sardis. He completed graduate-level training in conservation at NYU’s Institute of Fine Arts (2014) and received a PhD in History of Art from the University of Glasgow (2019). Prior to his doctoral studies, he was a Samuel H. Kress Fellow in Time-Based Media Conservation at the Solomon R. Guggenheim Museum in New York and worked as a contract conservator for time-based media artworks at the Smithsonian American Art Museum in Washington, DC.

Chen, Yuhsien
Yuhsien Chen, a gamer and digital culture enthusiast born and educated in Taiwan, has a keen interest in preserving born-digital heritage and artworks that have digital media components. She has played an active role in initiating TBM art conservation in Taiwan under the sponsorship of the Digital Art Foundation and has worked independently with art museums in Taiwan since early 2020. Her main areas of practice are collection assessment, artist interviews, and conservation strategy planning. Yuhsien was an associate conservator at the M+ museum in 2019. She holds a bachelor’s degree in digital media design and a master’s degree in Museum Studies and is currently pursuing a PhD degree.

Churchill, Joshua
Joshua Churchill is Assistant Manager of the Collections Technical team at SFMOMA. He supports the Collections Technical team in the installation and documentation of a wide variety of media-based artworks and exhibitions, with a focus on the technical aspects of incoming media-based acquisitions and the preservation of media-based works in the museum’s collection. Most recently, he led the technical installation and conservation of media-based collection works for the international touring retrospective exhibition Nam June Paik. Churchill is also a practicing media artist and musician, and his practice and work within the Collections Technical team constantly overlap and inform one another.

Coddington, Jim
Jim Coddington is a conservator, retired from the Museum of Modern Art in New York as the Agnes Gund Chief Conservator. In addition to his restoration of paintings, he has published on topics including structural restoration of paintings, development of color-accurate documentation of art, automated texture identification of photo papers, multi-spectral analysis of paintings, and the use of flash thermography for paintings. He has also published on the theory and practice of conservation and studies of Pollock, de Kooning, Miró, Cézanne, and Pissarro.

Colloton, Eddy
Eddy Colloton is Project Conservator of Time-Based Media for the Hirshhorn Museum and Sculpture Garden, where he works closely with the conservation department on the museum’s diverse array of media artworks. Colloton received his MA from the Moving Image Archiving and Preservation program at New York University in May 2016. He has previously worked as Assistant Conservator at the Denver Art Museum and Time-Based Media Technician at the Los Angeles County Museum of Art.

Collyer, Catherine
Catherine Collyer is Associate Sculpture Conservator at the Queensland Art Gallery and the Gallery of Modern Art (QAGOMA), with a special interest in time-based art. She obtained her MA in Cultural Materials Conservation from the University of Melbourne in 2016.

Colussi, Francesca
Francesca Colussi is a time-based media conservator at Tate. She holds an MA in Visual Arts from IUAV University (Venice) and moved to London in 2012 to work for Afterall, a research center of the University of the Arts London. She subsequently worked as Studio Manager for the artist Mark Lewis, where she deepened her interest in the display and preservation of video art. At Tate, she has been focusing on the preparation and delivery of exhibitions and displays, including those of Mark Leckey at Tate Britain in 2019 and Steve McQueen at Tate Modern in 2020.

Cook, Sarah
Sarah Cook is a curator and writer specializing in contemporary art at the intersection of digital creativity, technology, and science. She is Professor of Museum Studies (Information Studies), School of Humanities, College of Arts, University of Glasgow, and Curator for NEoN (NorthEast of North), Scotland’s only digital art festival.

Cornell, Lauren
Lauren Cornell is Director of the Graduate Program and Chief Curator at the Center for Curatorial Studies, Bard College, New York. Previously, Cornell was Curator and Associate Director of Technology Initiatives at the New Museum in New York City. From 2005 to 2012, Cornell served as Executive Director of Rhizome, an organization that commissions, exhibits, and preserves art engaged with technology. She is a co-editor, with Ed Halter, of Mass Effect: Art and the Internet in the Twenty-First Century (New Museum and MIT Press, 2015).

Cranmer, Candice
Candice Cranmer is ACMI’s first time-based media conservator, with many years’ experience in archiving and preservation. She is the co-convener of Electron, a special interest group of the Australian Institute for the Conservation of Cultural Material, and has a keen interest in innovative documentation strategies for the conservation of time-based media.

Croft, Kyle
Kyle Croft, an art historian and curator, is Programs Director at Visual AIDS. His research examines the formative role of AIDS in debates about identity, participation, and social responsibility in the art world. He holds an MA in art history from Hunter College, City University of New York, and he was a member of the VHS Archives Working Group at CUNY’s Center for the Humanities from 2017 to 2019.

Cullen, Tom
Tom Cullen has been installing time-based media art for over 30 years across the world. Tom has taken on many varied roles, from Technical Manager of large art festivals to adviser to architects on gallery builds, but still enjoys working one to one with artists.


D’Agostino Dias, Fernanda Fernanda D’Agostino Dias is a museum professional based in Sao Paulo, Brazil. She holds a BA in Visual Arts and a graduate degree in Archival Management. Currently, Fernanda is Head of Collection Care and Management of MASP (Museu de Arte de São Paulo Assis Chateaubriand). Fernanda worked as a curatorial assistant for the 28th Sao Paulo Biennial (2008) and 12th Istanbul Biennial (2011) and was chief registrar of Pinacoteca de São Paulo’s collection from 2013 to 2020. Dekker, Annet Annet Dekker is a curator and researcher. Currently, she is Assistant Professor of Archival and Information Studies and Cultural Analysis at the University of Amsterdam and Visiting Professor and Co-director of the Centre for the Study of the Networked Image at London South Bank University. She has published numerous essays and edited several volumes, among others, Documentation as Art (co-edited with Gabriella Giannachi, Routledge 2022) and Curating Digital Art. From Presenting and Collecting Digital Art to Networked Co-Curating (Valiz 2021). Her monograph, Collecting and Conserving Net Art (Routledge 2018), is a seminal work in the field of digital art conservation. Duke, Laurie Laurie Duke has worked at the Grey Art Gallery, New York University’s fine arts museum, since 2008 and taken on a series of roles, including that of Exhibitions and Publications Coordinator. For the last five years, in her current position as the Head of Finance and Administration, Duke has been responsible for museum and exhibition budgets, contracts, grants, individual giving, compliance, and strategic planning. She received an MA in Moving Image Archiving and Preservation from NYU’s Tisch School of the Arts in 2016 and oversees the Grey’s archives and active exhibition files, managing both traditional and born-digital records and audiovisual documentation. 
Dunne, Kirsten Kirsten Dunne ACR is Senior Projects Conservator at the National Galleries of Scotland, where she has worked since 2005. She trained as a paper conservator but, since 2019, has moved into a new role within the conservation department, which focuses on time-based media, microfading, and the application of technology to conservation practice. Kirsten holds an MA in Conservation of Fine Art, Works of Art on Paper, from Northumbria University and an MA in History of Art from Edinburgh University. Dye, Steven Steven Dye is Collections Technical Manager at the San Francisco Museum of Modern Art (SFMOMA). The Collections Technical team works collaboratively with artists and relevant museum staff to translate written and verbal descriptions, schematics, documents, floorplans, and technology into successful installations. In partnership with SFMOMA Conservation and Registration departments, the Collections Technical team also develops short- and long-term strategies for the preservation of media-based artworks in SFMOMA’s permanent collection. Efferenn, Kristof Kristof Efferenn is the Time-Based Media Conservator at Museum Ludwig in Cologne since 2019. He was previously a staff member of the archive unit at the Academy of Media Arts 526

Contributors

Cologne. He received his MA in Conservation of New Media and Digital Information at the Stuttgart State Academy of Fine Arts. Engel, Deena Deena Engel, Clinical Professor Emerita of the Department of Computer Science at the Courant Institute of Mathematical Sciences at New York University, focuses her research on contemporary art. The winner of four teaching awards, she has experience teaching undergraduate computer science courses on web and database technologies, as well as Digital Humanities courses for graduate students. Deena is the Co-Director along with Prof. Glenn Wharton of the Artist Archives Initiative, and she also works with major museums on projects which address the challenges involved in the conservation of time-based media art. She has an MA in Comparative Literature and Literary Translation from SUNY-Binghamton and an MS in Computer Science from the Courant Institute of Mathematical Sciences. Ensom, Tom Tom Ensom is a London-based Digital Conservator specializing in the conservation of softwarebased art. He works with those caring for software-based art to research, develop, and implement strategies for its long-term preservation. His PhD, an AHRC-funded Collaborative Doctoral Partnership between King’s College London and Tate, explored approaches to conservation documentation for software-based art, addressing a number of gaps in existing theory and practice. Tom currently works primarily with the time-based media conservation team at Tate, where he has been developing their conservation strategy for software-based artworks and undertaking the assessment, documentation, and treatment of a range of time-based media artworks. Falcão, Patricia Patricia Falcão is a Portuguese time-based media conservator working at Tate, where she researches and develops strategies for the preservation of digital components of artworks, from slide- to video- to software-based artworks. 
More recently, in the context of the Reshaping the Collectible project, the range has broadened to include the acquisition and preservation of web-based artworks. In the past eight years, she has consistently published on the theme of preservation of time-based media, digital, and software-based art, both within the conservation and digital preservation communities.

Farbowitz, Jonathan
Jonathan Farbowitz is the Associate Conservator of Time-Based Media at The Metropolitan Museum of Art. Previously, Jonathan was a Fellow in the Conservation of Computer-Based Art at the Solomon R. Guggenheim Museum, where he took part in the restoration process for the museum's three web artworks. He holds an MA in Moving Image Archiving and Preservation from New York University and a BA in Film from Vassar College and has previous experience in software development and testing.

Farias de Carvalho, Humberto
Humberto Farias is a conservator and Assistant Professor at the Federal University of Rio de Janeiro, Brazil, where he has been teaching Conservation of Contemporary Art since 2012. He participated in the Visiting Scholar program of the Moving Image Archiving & Preservation Program at New York University in 2016. He was an assistant researcher on the project "Brazilian Concrete Art. The Materiality of Form: Industrialism and the Latin American Avant-Garde," funded by the Getty Foundation. From 2009 to 2014, he was a member of the cultural committee of the Instituto Brasil Estados Unidos (IBEU), where he curated several solo exhibitions.

Fino-Radin, Ben
Ben Fino-Radin is an art conservator, entrepreneur, and founder of Small Data Industries, a lab and consultancy working globally to support institutions, collectors, and artists through the complex challenges presented by variable time-based media art and the digital transformation of art information and documentation. Fino-Radin also hosts the Art & Obsolescence podcast, a show featuring conversations with the artists, collectors, curators, and conservators shaping the past, present, and future of time-based media art. Prior to founding Small Data Industries in 2017, Ben served as a time-based media conservator at the Museum of Modern Art (MoMA) and at Rhizome. They hold Master's degrees in Information Science and Digital Art from Pratt Institute, have taught courses at NYU, and have led training workshops in Hong Kong, Basel, Brussels, Mexico City, and New York City.

Fortunato, Flaminia
Flaminia Fortunato is Time-Based Media Coordinator in the Conservation-Restoration Department at the Stedelijk Museum in Amsterdam. She works with the collection of media and performance art. Prior to this role, she completed a three-year Andrew W. Mellon fellowship in media conservation at the Museum of Modern Art and at the Brooklyn Museum, both in New York. She has conducted research on the preservation and documentation of software-based works, sound, and complex multimedia installations. In 2017, she earned an MA in Conservation-Restoration of Modern Materials and Media from the University of the Arts in Bern, Switzerland.

Frieling, Rudolf
Curator of Media Arts at the San Francisco Museum of Modern Art (SFMOMA) since 2006, Rudolf Frieling is the co-curator of the recent retrospectives Nam June Paik (2019–22), Suzanne Lacy: We Are Here (2019), Bruce Conner: It's All True (2016–17), and the thematic survey Soundtracks (2017).
He curated the influential thematic exhibitions The Art of Participation: 1950 to Now (2008–09) and Stage Presence: Theatricality in Art and Media (2012) and over 20 monographic exhibitions at SFMOMA. Frieling is also Senior Adjunct Professor at the California College of the Arts in San Francisco. He holds an MA from the Free University Berlin and a PhD from the University of Hildesheim.

Frohnert, Christine
Christine Frohnert (graduate degree in Conservation of Modern Materials and Media, University of Arts, Berne, Switzerland, 2003) has been a partner of Bek & Frohnert LLC, Conservation of Contemporary Art, based in New York City, since 2012. Previously, Ms. Frohnert served for 12 years as Conservator of Contemporary Art and later as Chief Conservator at the Museum Ludwig in Cologne, Germany. Ms. Frohnert was named the inaugural Judith Praska Distinguished Visiting Professor in Conservation and Technical Studies at the Conservation Center, Institute of Fine Arts at New York University (CC/IFA/NYU) in 2012. She is currently Research Scholar and Program Coordinator of the time-based media curriculum and offers instruction in TBM art conservation.

Gil Rodríguez, Caroline
Caroline Gil Rodríguez is a media conservator and Director of Media Collections and Preservation at Electronic Arts Intermix, New York. At EAI, she is responsible for the management, acquisition, documentation, and conservation of a collection of 4,000 media artworks. She is also an adjunct professor at NYU, where she co-teaches the Handling Complex Media course. Caroline has previously worked at the New York Public Library, The Metropolitan Museum of Art, the Museum of Modern Art (MoMA), and the Smithsonian Institution. Caroline holds an MA in Moving Image Archiving and Preservation from NYU and an MA in Cinematography from the Escola Superior de Cinema i Audiovisuals de Catalunya (ESCAC).

Gonçalves Magalhães, Ana
Ana Gonçalves Magalhães is an art historian, professor, and curator at the Museum of Contemporary Art of the University of São Paulo (MAC USP), where she now holds the post of Director (2020–24).

Haidvogl, Martina
Martina Haidvogl is Lecturer in Conservation of Contemporary Art at the Bern University of the Arts. Prior to this appointment, she was Associate Media Conservator at the San Francisco Museum of Modern Art (2011–19), where she piloted documentation and preservation initiatives for SFMOMA's media arts collection. Martina has lectured and published internationally on media conservation and its implementation within collecting institutions. Her research focuses on cross-disciplinary collaboration practices fostered through digital tools, serving the needs of the art of our time.

Hamilton, Emily
Emily Hamilton is Assistant Professor of Objects Conservation at Buffalo State College. Previously, she was Associate Objects Conservator at SFMOMA, Assistant Objects Conservator at the Saint Louis Art Museum, and the Samuel H. Kress Sculpture and Media Conservation Research Fellow at MoMA. Emily earned a BA in Art History from Reed College and an MA in Art Conservation from Buffalo State College. Recent projects include organizing an iteration of the TechFocus conference series hosted by the American Institute for Conservation (AIC) Electronic Media Group, focusing on 3D printing.
Harris, Jon
Jon Harris is the Technical Manager for the Birmingham Museum Trust in the UK, where he oversees several galleries and buildings for the trust. Jon has been Gallery Manager and Technical Manager at various galleries and visitor attractions and has a wealth of knowledge in designing and implementing exhibitions in varied spaces.

Harvey, Duncan
Duncan Harvey is a time-based media conservator at Tate focusing on loans-out, acquisitions, and Tate's High-Value Digital Asset (HVDA) storage project. In his time at Tate, he has worked as a Conservation Technician working on displays and acquisitions. He holds an MA in Politics from the University of Glasgow.

Hellar, Mark
Mark Hellar is a creative technology consultant for cultural institutions throughout the San Francisco Bay Area and beyond and the owner of Hellar Studios LLC. He specializes in innovative yet practical digital media and software-based solutions for multimedia artists and the institutions that support their work, with an emphasis on developing systems and best practices for exhibition, documentation, and long-term preservation.

Himmelsbach, Sabine
Sabine Himmelsbach is Director of HEK (House of Electronic Arts Basel). After studying art history in Munich, she worked for galleries in Munich and Vienna from 1993 to 1996 and later became the project manager for exhibitions and conferences for the Steirischer Herbst Festival in Graz, Austria. In 1999 she became the exhibition director at the ZKM | Center for Art and Media in Karlsruhe. From 2005 to 2011 she was the artistic director of the Edith-Russ-House for Media Art in Oldenburg, Germany.

Hix, Kelli
Kelli Hix is an audiovisual archivist and consultant. She is Program Manager of the Audiovisual Heritage Center (AVHC) at Metro Archives Nashville and a core member of the Community Archiving Workshop (CAW). Kelli has worked as a consultant, archivist, and curator for organizations including the Smithsonian Institution, the National Geographic Society, and the Country Music Hall of Fame and Museum.

Homberger, Liz
Liz Homberger joined the staff of the Detroit Institute of Arts as Objects Conservator in 2018. She is responsible for the preservation of objects in the DIA's collection, ranging from contemporary outdoor sculpture to 15th-century ceramics to ceremonial headdresses from the Akan. Prior to moving to Detroit, Liz lived in Los Angeles for ten years and worked at the Los Angeles County Museum of Art and the Natural History Museum of LA County. She earned her MA in Conservation from Buffalo State College in New York in 2008.

Jarczyk, Agathe
Agathe Jarczyk is Associate Time-Based Media Conservator at the Solomon R. Guggenheim Museum in New York. She joined the museum in 2020 and focuses on technical examination, research, and the conservation of the time-based media works in the Guggenheim's collection. Prior to joining the Guggenheim, Jarczyk was Adjunct Professor for the Conservation and Restoration of Modern Materials and Media at the University of the Arts in Bern, Switzerland.
She is the owner of the Studio for Video Conservation in Bern, Switzerland. Jarczyk received her diploma in Conservation of Modern Materials and Media from the University of the Arts Bern.

Jimenez, Mona
Mona Jimenez has worked in many capacities on independent media and media art collections in libraries, archives, museums, TV stations, and in a wide variety of community-based settings for the past 30-plus years. From 2003 to 2017 she was on the faculty of the Moving Image Archiving and Preservation Program (MIAP) at New York University and led curriculum development in areas of video preservation, collection management, and media art conservation. She founded the Audiovisual Preservation Exchange, which she directed from 2008 to 2016, and initiated the Community Archiving Workshop model. She recently received awards from the Association of Moving Image Archivists (Silver Light Award, 2021) and the American Institute for Conservation (David Magoon-University Products Conservation Advocacy Award, 2019) for her contributions to video preservation and media art conservation.

Kennedy, Nora
Nora Kennedy is Sherman Fairchild Conservator in Charge of the Department of Photograph Conservation at The Metropolitan Museum of Art. Her department conserves The Met's photographs and has oversight of time-based media (TBM). Met conservators continue to expand the museum's initiatives in scholarship, education, publication, and advocacy. Kennedy is an adjunct faculty member of New York University's Institute of Fine Arts Conservation Center, which recently established a new TBM conservation specialization. Kennedy holds a BFA in Fine Arts from York University, an MS in Conservation from the University of Delaware, and an Honorary Doctorate from the Academy of Fine Arts and Design, Bratislava.

King, Chris
Chris King is Assistant Time-Based Media Conservator, Acquisitions, at Tate. He has a background in media art, DIY electronics, museum AV, digital learning, and show control programming for artists.

Kjartansson, Ragnar
Ragnar Kjartansson engages multiple artistic mediums, creating video installations, performances, drawings, and paintings that draw upon myriad historical and cultural references. An underlying pathos and irony connect his works, with each deeply influenced by the comedy and tragedy of classical theater. The artist blurs the distinctions between mediums, approaching his painting practice as performance, likening his films to paintings and his performances to sculpture. Throughout, Kjartansson conveys an interest in beauty and its banality, and he uses durational, repetitive performance as a form of exploration.

Klacsmann, John
John Klacsmann is Archivist at Anthology Film Archives in New York City, where he preserves artists' cinema and experimental film. Klacsmann holds a Bachelor of Science in Computer Science from Washington University in St. Louis and is a graduate of the George Eastman Museum's L. Jeffrey Selznick School of Film Preservation. Before joining Anthology in 2012, he worked as a preservation specialist and optical printing technician at Colorlab, a film laboratory in Maryland. He co-edited two volumes of The Collections of Harry Smith: Catalogue Raisonné and Manuel DeLanda: ISM ISM.
He is a contributing editor of INCITE: Journal of Experimental Media.

Klomp, Paul Jansen
Paul Jansen Klomp (Almelo, NL, 1956) is a media artist and tutors Media Art at Artez, Enschede. He studied monumental design/painting at AKI Academy Enschede, the Netherlands, from 1979 to 1984. Since then, he has worked as an autonomous media artist (audio, video, installation, painting, photography). From 1993 to 2003 he was involved in the Amsterdam Montevideo media lab (later NIMk, now LIMA) as one of the consultants in the weekly consultation hour where artists could freely walk in and discuss plans involving art and technology. He also founded "Klomp Kunst & Electro", a small company dedicated to developing hardware and software for artists. Now he teaches Media Art/Physical Computing at AKI Enschede (Fine Art Sculpture) and Leiden University (Master's in Media Technology). In 2017 he received his master's degree (MADtech) from the Frank Mohr Institute, Groningen, NL.

Kramer, Lia
Lia Kramer is a Mellon Fellow in Media Conservation at the Museum of Modern Art. She has conducted project-based work in time-based media conservation for institutions including Tate, The Metropolitan Museum of Art, Cooper Hewitt Smithsonian Design Museum, and the Solomon R. Guggenheim Museum. Lia holds an MA in the History of Art and Archaeology and an MS in Conservation of Historic and Artistic Works from the Conservation Center of the Institute of Fine Arts, New York University, and a BFA in Drawing and Painting from Georgia State University.

Lascu, Marie
Marie Lascu is Audiovisual Archivist for Crowing Rooster Arts, a nonprofit that has spent 20-plus years documenting the arts and political struggles of Haiti, and Digital Archivist for Ballet Tech, a public school for dance in NYC. She has been a member of XFR Collective since 2015 and has been a member of the Community Archiving Workshop (CAW) since 2011. She is a graduate of NYU's MA program in Moving Image Archiving and Preservation (2012) and is the 2016 recipient of the Society of American Archivists Spotlight Award.

Lawson, Louise
Louise Lawson is the conservation manager of time-based media conservation at Tate. She is responsible for the strategic direction, development, and delivery of all aspects relating to time-based media conservation at Tate. This requires working across a wide range of projects and programs: exhibitions, displays, acquisitions, loans-out, collection care, and research initiatives. Her research has focused on developing how performance artworks in Tate's permanent collection are documented and conserved through the projects Documentation and Conservation of Performance at Tate (2016–21) and Reshaping the Collectible: When Artworks Live in the Museum (2018–21). Her current work is focused on the conservation of choreographic artworks.

Leckart, Linda
Linda Leckart is a registrar with more than 15 years of experience in the management, care, acquisition, documentation, and exhibition of art. She spent a decade at the San Francisco Museum of Modern Art, where she specialized in time-based media and photography. Currently, she is Registrar, Permanent Collection and Loans, at the Los Angeles County Museum of Art. Linda earned an MA in Museum Studies from San Francisco State University and graduated with a BA in Anthropology from the University of California, Los Angeles.
Lewis, Kate
Kate Lewis is a media conservator and the Agnes Gund Chief Conservator at the Museum of Modern Art, New York. She has worked with artists including Beryl Korot, David Lamelas, Tony Oursler, and Lis Rhodes. Prior to joining MoMA, she held the position of Time-Based Media Conservator at Tate in London (2005–13). She earned her MA in the Conservation of Works of Art on Paper from the University of Northumbria at Newcastle-upon-Tyne, UK. Engaged in training and education, Lewis led the five-year Media Conservation Initiative at MoMA, funded by The Andrew W. Mellon Foundation, to address the long-term preservation of media-based works through professional collaborative training, and serves on the board of Voices in Contemporary Art (VoCA).

López Ruiz, Ángela
Ángela López Ruiz is a visual artist, independent curator, professor, and researcher based in Montevideo, Uruguay. She has an MA in Latin American Studies and a BA in Visual Arts from the Universidad de la República, Uruguay. She works as a guest professor at EICTV (Cuba) and other art education institutions in the region. She is part of the curatorial team of Ism, Ism, Ism/Ismo, Ismo, Ismo: Experimental Cinema in Latin America, and she contributed a chapter on the silenced protagonism of women filmmakers in the history of Latin American experimental cinema for the associated anthology (University of California Press in association with the Los Angeles Filmforum, 2017). In 2019 she published her book Poéticas del Cine Experimental en el Cono Sur (Poetics of Experimental Cinema of the Southern Cone) and was awarded a grant from the Competitive Culture Fund of the Ministry of Culture of Uruguay.

MacDonough, Kristin
Kristin MacDonough currently works as Assistant Conservator of Media at the Art Institute of Chicago (AIC), where she collaborates with colleagues throughout the museum to implement processes for acquiring, assessing, exhibiting, and conserving time-based media artworks. At AIC, she co-leads the TBM Forum and oversees the development of digital storage for artworks. Kristin is a founding member of XFR (pronounced "Transfer") Collective and a participant in the Community Archiving Workshops and the Audiovisual Preservation Exchange (APEX Bogotá in 2013 and APEX Montevideo in 2014). She is a 2013 graduate of the Moving Image Archiving and Preservation program at New York University.

Marçal, Hélia
Hélia Marçal is Lecturer (Assistant Professor) in History of Art, Materials, and Technology at the Department of History of Art, University College London, and a researcher at the Institute of Contemporary History (NOVA University of Lisbon). Before this appointment, she was the Fellow in Contemporary Art Conservation and Research of the research project Reshaping the Collectible: When Artworks Live in the Museum at Tate, London (2018–20). In addition, she has been the coordinator of the Working Group on Theory, History, and Ethics of Conservation of the Committee for Conservation of the International Council of Museums (ICOM-CC) since 2016.

Martin, Nicole
Nicole Martin (they/them) is Associate Director of Archives and Digital Systems at Human Rights Watch and established the organization's first archive. As an adjunct professor, Nicole teaches digital preservation and time-based media conservation at New York University's Moving Image Archiving and Preservation (MIAP) graduate program.
Previously, Nicole worked as the archivist at Democracy Now! and as Associate Director of the MIAP program and led digital preservation workshops at The Rockefeller Archives Center and NYU's Institute of Fine Arts. Nicole is a board member of OpenArchive, a human-rights-focused tech developer, and the creator of twobitpreservation.com.

McDonald, Christopher
Christopher McDonald is an artist and sound engineer who lives in Connecticut. His most recent exhibition at SPACE Gallery in Portland, Maine, featured video works, photographs, pastels, and egg tempera paintings from the last decade. When he is not blazing new trails in the woods with his goats, he has collaborated as Ragnar Kjartansson's director of sound on works such as The Visitors (2013) and Sumarnótt (2019). He has supervised the installation of these works at the Hirshhorn Museum, The Metropolitan Museum, and the GES-2 in Moscow. Homesteading with his husband remains the constant and renewing highlight of his life.

Mellado, Diego
Diego Mellado is Technical Director at Studio Daniel Canogar and a new media art conservation researcher. With a strong focus on technical solutions and documentation models for software-based artworks, since 2010 he has used his background in communications engineering to design, produce, document, and maintain new media artworks for several artists. In 2020, Diego finished the MediaArtHistories program at Danube University, with a master's thesis on computer-based art conservation. As of 2022 Diego is responsible for the conservation of the artworks developed at Studio Daniel Canogar and has worked with public collections such as Museo Nacional Centro de Arte Reina Sofía, Centro de Arte Dos de Mayo in Madrid, and Escuela Nacional de Conservación Restauración y Museología in Mexico City. He has been part of the board of the AIC Electronic Media Group and has taken part in Danube University Krems' programs such as the Media Art Preservation Institute and the MediaArtCultures program.

Moomaw-Taylor, Kate
Kate Moomaw-Taylor has been Associate Conservator of Modern and Contemporary Art at the Denver Art Museum since 2011. With an interest in building community and exchange among conservators, she is serving on the founding board of the Contemporary Art Network (CAN!) of the American Institute for Conservation (AIC) as Program Chair. Previously, she was on the board of the AIC Electronic Media Group. She graduated from the Conservation Center of New York University in 2007 and completed training positions at the Museum of Modern Art, the Stedelijk Museum (Tate), The Metropolitan Museum of Art, and the Hirshhorn Museum and Sculpture Garden.

Morfin, Jo Ana
Jo Ana Morfin is a time-based media conservator who works on projects in the field of Digital Cultural Heritage and Contemporary Art. Morfin, trained as a conservator of traditional artworks, now conducts research on issues of digital preservation, the creative use of documentation and archives, and the crossovers between performance studies and variable media art preservation.

Müller, Dorcas
Dorcas Müller studied media art at the University of Design Karlsruhe (Ulay) and received her doctorate in media theory (Boris Groys). She has been recognized for her artwork with stipends from the Kunstfonds Bonn and the Körber Foundation Hamburg, Germany, among others.
As a founding member of the Laboratory for Antiquated Video Systems, she started to work for the ZKM | Center for Art and Media Karlsruhe in 2004. In 2011 she became the head of the ZKM | Laboratory for Antiquated Video Systems. She participated in numerous ZKM productions as an artist, author, video editor, and video restorer. In 2017 two major ZKM archive exhibitions resulted from the work of the Laboratory: Aldo Tambellini: Black Matters and Radical Software: The Raindance Foundation, Media Ecology and Video Art. An anniversary video edition, 100 Years of Joseph Beuys, was published in 2021.

Neary, David
David Neary is Project Manager of the Whitney Museum of American Art's Media Preservation Initiative. He is a graduate of NYU's Moving Image Archiving and Preservation program and also holds an MA in Film Studies from University College Dublin. He has worked to preserve the film collections of institutions including the Lincoln Center for the Performing Arts, MoMA, and the Oregon Historical Society, and specializes in researching complex multiprojector film installations.

Nichole, Kelani
Kelani Nichole is the founder of TRANSFER, an experimental gallery founded in 2013 that explores simulation and expanded practice. The gallery supports artists making computer-based artworks and hosts immersive media exhibitions. In 2018 she established The Current Museum collection, in collaboration with a community of like-minded curators, artists, gallerists, writers, and patrons. Her background and ongoing professional role are in the tech industry; she consults with startups and cultural partners as a User Experience Specialist.

Nichols, Alexandra
Alexandra Nichols is Time-Based Media Conservator at Tate, focusing on exhibitions and displays. Prior to working at Tate, she was a Sherman Fairchild Foundation Fellow at The Metropolitan Museum of Art and a Samuel H. Kress Fellow at the Solomon R. Guggenheim Museum in New York, concentrating on the conservation of time-based media. Alexandra Nichols holds an MS in Art Conservation from the Winterthur/University of Delaware Program in Art Conservation and a BA in Art History from the University of Maryland.

Obermann, Arnaud
Arnaud Obermann was trained as a designer for print and digital media. After graduating from Stuttgart Media University with a degree in Library and Media Management, he received an MA in Conservation of New Media and Digital Information at the Stuttgart State Academy of Art and Design. From 2010 to 2013 Arnaud Obermann was conservation coordinator for the INTERREG project digital art conservation at ZKM | Karlsruhe, and he has been an associate lecturer for the MA course Conservation of New Media and Digital Information since 2011. He has been working as a conservator for time-based art at the Staatsgalerie Stuttgart since 2013.

Oleksik, Peter
Peter Oleksik is Associate Media Conservator at the Museum of Modern Art (MoMA), where he has been working since 2011 to conserve the museum's time-based media collection. Outside of MoMA, Peter regularly writes and teaches on various topics within time-based media conservation, and works with artists, filmmakers, and musicians to preserve and provide access to their media collections.
Oleksik received his BA in Cinema Studies from the University of Southern California and his MA from New York University's Moving Image Archiving and Preservation (MIAP) program.

Owens, Samantha
Samantha Owens is Assistant Conservator at Glenstone Museum in Maryland. She holds an MS in Art Conservation from the Winterthur/University of Delaware Program in Art Conservation (WUDPAC), specializing in modern and contemporary objects, and a BA in Art History from Emory University. Samantha was previously a fellow at Glenstone and held graduate internships at The Metropolitan Museum of Art, the Cleveland Museum of Art, and the Hamburger Kunsthalle.

Paul, Christiane
Christiane Paul is Curator of Digital Art at the Whitney Museum of American Art and Professor in the School of Media Studies at The New School in New York City.

Paunu, Henna
Henna Paunu is Chief Curator of Collections at EMMA, Espoo Museum of Modern Art in Finland.

Phillips, Joanna
Joanna Phillips is a time-based media conservator and Director of the Düsseldorf Conservation Center in Germany. She was previously Senior Conservator of Time-Based Media at the Solomon R. Guggenheim Museum in New York (2008–19), where she launched the first media conservation lab in a US museum, implemented time-based media conservation practices, and headed the Conserving Computer-Based Art (CCBA) initiative. She co-founded the AIC conference series TechFocus and lectures and publishes on TBM topics internationally. As a researcher in the Swiss project AktiveArchive, she co-authored the Compendium of Image Errors in Analog Video (2013). Phillips earned her MA in Paintings Conservation from the Dresden Academy of Fine Arts, Germany (2002).

Ramírez López, Lorena
Lorena Ramírez López is a consultant for digital preservation and conservation at Myriad Consulting, as well as the community manager at Webrecorder for the web archiving needs of museums and universities. Lorena graduated as a full-stack web developer from the Flatiron program (Access Labs) and holds an MA in audiovisual archiving and preservation from the Moving Image Archiving and Preservation program at New York University.

Rechert, Klaus
Klaus Rechert is a researcher leading the digital preservation research group at the University of Freiburg and, since 2018, CEO of OpenSLX. Klaus was the principal investigator of bwFLA and architect of Emulation as a Service. Over the last ten years, Klaus has assisted in the implementation of emulation solutions in a range of organizational types, including national libraries, art museums, and university data management departments. Klaus studied Computer Science and Economics at the University of Freiburg, received a diploma in Computer Science in 2005, and received a doctoral degree from the University of Freiburg in 2013.
Redston, Alysha
Alysha Redston is Time-Based Media Conservator at the National Gallery of Australia and the first conservator to hold this position. She obtained her master's in Cultural Materials Conservation from the University of Melbourne in 2017.

Renwick, Vanessa
Vanessa Renwick has been a singular voice in experimental cinema for over 20 years. Eschewing an allegiance to any one medium or form, Renwick builds authentic moving image works revealing an insatiable curiosity and unflinching engagement with the world around them. Often focusing their lens on nature, freedom, and the locales of their adopted home, the Pacific Northwest, Renwick uses avant-garde formal elements to explore radical politics and environmental issues. An artist who often self-distributes, their screening history reads as a map of independent cinema worldwide. They have screened work in hundreds of venues internationally, institutional and not, including the Museum of Modern Art, Light Industry, The Wexner Center for the Arts, Art Basel, Oberhausen, the Museum of Jurassic Technology, Centre Pompidou, Bread and Puppet Theater, and True/False Film Festival, among many others. Their website is www.odoka.org.

Ribeiro, Ana
Ana Ribeiro is a time-based media conservator at Tate. Ana studied conservation in Lisbon (FCT-UNL) and subsequently trained in media art conservation at the S.M.A.K., NiMK, and Tate. At Tate, she has worked across both the acquisitions and loans program areas, developing expertise in areas such as digital preservation and documentation practices. Currently, Ana focuses on all aspects of bringing new media works into the collection, including performances. Ana worked on the acquisition of Tony Conrad's Ten Years Alive on the Infinite Plain, part of the Reshaping the Collectible project, and is now participating in a project about choreography in the museum titled Precarious Movements.

Richardson, Nick
Nick Richardson, Head of Collections and Preservation at ACMI (Australian Centre for the Moving Image), has worked in film archives for over 35 years. He previously worked at the Film Archives of the Australian Institute of Aboriginal and Torres Strait Islander Studies (AIATSIS), the Australian Broadcasting Corporation, and the National Film and Sound Archive. At ACMI, Nick manages 250,000 moving image items ranging from 16 mm film prints dating back to the early 1900s to video games and the latest in digital and VR art pieces. He has a particular focus on enabling public access.

Rist, Pipilotti
Pipilotti Rist, a pioneer of spatial video art, was born in 1962 in Grabs in the Swiss Rhine Valley on the Austrian border and has been a central figure within the international art scene since the mid-1980s. Astounding the art world with the energetic, exorcistic statement of her now-famous single-channel videos, such as I'm Not the Girl Who Misses Much (1986) and Pickelporno (1992), her artistic work has co-developed with technical advancements and in playful exploration of their new possibilities, proposing footage resembling a collective brain. Through large video projections and digital manipulation, she has developed immersive installations that draw life from slow, caressing showers of vivid color tones, like her works Sip My Ocean (1996) and Worry Will Vanish (2014).
Roeck, Claudia Claudia Roeck is a time-based media conservator who graduated from the Bern University of Arts (Switzerland) in 2016. She is affiliated with the University of Amsterdam (Netherlands), where she is pursuing a PhD in preservation strategies for software-based artworks. This PhD is part of the EU-funded project New Approaches in the Conservation of Contemporary Art (NACCA). She did her practical research from 2016 to 2018 at LIMA, an archive and research platform for media art in Amsterdam. From 2013 to 2016 she worked as an assistant time-based media conservator at Tate (London, UK), mainly with video-based art. From 2019 to 2021 she collaborated on a software preservation project at the Netherlands Institute for Sound and Vision. Currently she works part-time to preserve the collection of the House of Electronic Arts in Basel and contributes to conservation projects at LIMA. Saarikko, Jorma Jorma Saarikko is the owner of ProAV, based in Helsinki. He set up ProAV some 30 years ago to provide professional technical equipment and assistance to artists and galleries. Jorma has extensive knowledge of gallery technologies and design and has built long-standing relationships with artists, galleries, and organizations that still run strong to this day. Sääskilahti, Susanna Susanna Sääskilahti is an art historian working at the Finnish National Gallery, where she is the project manager of digital preservation issues. She also takes care of audiovisual archives, and

she is interested in developing documentation on media art in Kiasma. In Finland, digital preservation for cultural institutions is provided in cooperation with the IT Center for Science (CSC). Susanna Sääskilahti has studied media art preservation and documentation (e.g., at Danube University). Sacks, Steven Steven Sacks founded bitforms gallery in November 2001 in New York City. bitforms gallery represents established, mid-career, and emerging artists critically engaged with new technologies. Spanning the rich history of media art through its current developments, the gallery’s program offers an incisive perspective on the fields of digital, internet, time-based, and new media art forms. Sartorius, Andrea Andrea Sartorius is a conservator of contemporary art at Nationalgalerie im Hamburger Bahnhof, Museum für Gegenwart, Berlin, where she also cares for the media artworks in the collection. Her current research focuses on artist interviews as a conservation tool to enhance the understanding of collection artworks. Previously, she was the conservator at Kunstmuseum Wolfsburg (2009–20). She holds an MA in Paintings Conservation from the Hochschule für Bildende Künste Dresden, Germany, and completed post-graduate internships at the Hamilton Kerr Institute (Cambridge, UK) and the J. Paul Getty Museum (Los Angeles), as well as a fellowship in media art conservation at Bek & Frohnert LLC (New York). Semmerling, Linnea Linnea Semmerling is Director of the IMAI (Inter Media Art Institute) in Düsseldorf, Germany, a nonprofit foundation dedicated to the collection, distribution, presentation, and conservation of time-based media art. She holds a BA in Cultural Studies, an MA in Art Studies, and a PhD in Technology Studies from Maastricht University. Her dissertation “Listening on Display: Exhibiting Sounding Artworks 1960s–Now” was awarded the 2019 Media Art Histories Emerging Researcher Award.
Semmerling regularly organizes exhibitions for, among others, ZKM Karlsruhe (DE), IKOB Eupen (BE), and TENT Rotterdam (NL). Sherring, Asti Asti Sherring is a trained paper, photographs, and time-based art conservator. She completed a bachelor’s in Media Arts (honors) at Sydney University and a master’s in Materials Conservation at Melbourne University. Asti held the position of Senior Time-Based Art Conservator at the Art Gallery of New South Wales between 2015 and 2020. She has also worked at notable institutions such as the Los Angeles County Museum of Art and the Museum of Contemporary Art. Asti is currently undertaking doctoral research at Canberra University, which explores contemporary conservation theories and practices of works that are digital, ephemeral, immersive, participatory, and technological in nature. Singer, Martha Martha Singer is an art conservator and Director of Material Whisperer, a firm in New York City specializing in the conservation of modern and contemporary sculpture. Martha received her MA in Art History and Conservation from New York University. Martha was trained in modern and contemporary conservation first as a fellow at SFMOMA and then through contracts at other modern art museums. Martha has published on modern artists and their intentions and working techniques, including Jean Arp’s bronzes, as well as Donati and Duchamp’s Prière de

toucher. Martha is a Fellow of the American Institute for Conservation. Presently she is the editor of Contemporary Art Review. Smith, David David Smith is a time-based media and digital art conservator with 20 years’ experience working across museums, archives, broadcast, and digital projects around the world. At M+, David contributes to the preservation, research, management, and display of the museum’s TBM and digital collections. After moving to Asia, David spent seven years at the Hong Kong–based nonprofit Asia Art Archive, managing their collections and developing digital projects to raise awareness, interest, and reuse of archives relating to Asia’s art history. Before moving to Asia, he worked at Europeana, Archives New Zealand, and the Imperial War Museum, London. Springer, Samantha Samantha Springer established Art Solutions Lab in 2020 in the Portland, Oregon, area to provide preventive care and treatment services to regional arts and culture organizations, artists, and private collectors. Her practice grows from her MS attained at the Winterthur/University of Delaware Program in Art Conservation and work at institutions such as the Portland Art Museum, Cleveland Museum of Art, Field Museum of Chicago, and National Museum of the American Indian. Samantha has published articles in art and conservation books and journals and developed the Materials Testing Results Tables and Image Gallery of Damages on the AIC-Wiki. Sterrett, Jill Trained in conservation, Jill Sterrett has worked in universities, museums, and libraries throughout her career. She was Interim Director and Deputy Director at the Smart Museum of Art at the University of Chicago (2018–20) and Director of Collections and Conservator at the San Francisco Museum of Modern Art (1990–2018). Her research focuses on the role of museums in contemporary society, and she works at the intersection of contemporary art practice and materials, conservation, and collections.
Today, she is an independent arts and culture adviser engaged with sustainable heritage projects and art exhibitions in the US and abroad. Stricot, Morgane Morgane Stricot is a senior media and digital art conservator at ZKM Center for Art and Media, Karlsruhe, Germany. She heads the acquisition, storage, preservation, loan, and presentation of the digital art collection of ZKM, with a focus on software and internet-based art. While at ZKM, she has been a research fellow in media archaeology in the PAMAL (Preservation and Art—Media Archaeology Lab) research program of Avignon School of Art and at the Media Archaeology Laboratory of the Orléans School of Art and Design, France. Prior to her work at ZKM, she attained a master’s degree in Digital and Media Art Conservation at Avignon School of Art, France, and completed a research fellowship at the University of Maine, Orono, USA, in the New Media Department, Still Water Lab, under the direction of Jon Ippolito. Stringari, Lena Lena (Carol) Stringari is Deputy Director and Andrew W. Mellon Chief Conservator of the Solomon R. Guggenheim Museum in New York. Having joined the Guggenheim in 1992, she is responsible for the care and treatment of the collection and managing conservation for a global loan and exhibition program. Stringari has overseen exhibitions and research projects on materials and processes, including Jackson Pollock: Exploring Alchemy, Imageless: The Scientific

Study and Experimental Treatment of an Ad Reinhardt Black Painting, and Seeing Double: Emulation in Theory and Practice. Stringari played a key role in formulating and implementing the Panza Collection Initiative and the Variable Media Initiative at the Guggenheim. She was a founding member of the International Network for the Conservation of Contemporary Art and has served as an adjunct professor at the Institute of Fine Arts at NYU. Stringari has contributed to numerous publications and lectures throughout the world on ethical considerations and the conservation of contemporary art. Suárez, Juana Juana Suárez is the Director of the Moving Image Archiving and Preservation Program at New York University (NYU MIAP), a scholar in Latin American Cinema, and a media-preservation specialist. She is the author of Cinembargo Colombia: Critical Essays on Colombian Cinema (Spanish-language edition, 2009; English-language edition, 2012) and Sitios de contienda: Producción cultural y el discurso de la violencia (2010); the co-editor of Humor in Latin American Cinema (2015); and the Spanish language translator of A Comparative History of Latin American Cinema, by Paul A. Schroeder-Rodríguez (2020). Suárez is currently working on a book tentatively entitled Moving Images Archives, Cultural History and the Digital Turn in Latin America and coordinating arturita.net, a collaborative digital humanities project on Latin American audiovisual archives. Thiesen, Milo Milo Thiesen is Media Asset Manager for the Office of Knowledge and Legacy at the Lincoln Center for the Performing Arts Inc. in New York City. For 15 years they have specialized in media archiving and Digital Asset Management for photography, video, interactives, and animation in cultural heritage and higher education. 
Milo was Lead Technical Analyst for Digital Asset Management at The Metropolitan Museum of Art for five years and has held positions in Digital Asset Management at the American Museum of Natural History and the University of Massachusetts Amherst. Milo has guest lectured at Pratt Institute, New York University, and the Metropolitan New York Library Council. Ulinskas, Moriah Moriah Ulinskas is an audiovisual archivist and public historian whose work supports collections that fall outside mainstream historical narratives and institutions. From 2011 to 2017 she served as Diversity Chair for the Association of Moving Image Archivists and was Preservation Program Director at the Bay Area Video Coalition from 2011 to 2014. She contributed to Citizen Internees: A Second Look at Race and Citizenship in Japanese American Internment Camps and has published articles in KULA: Knowledge Creation, Dissemination, and Preservation Studies and Places Journal. She is a member of the Community Archiving Workshop and a PhD candidate in Public History at UC Santa Barbara. Vadakan, Pamela Pamela Vadakan directs California Revealed, a California State Library program that partners with public libraries, archives, museums, and historical societies across California to digitize, preserve, and provide online access to primary sources documenting the state’s history, art, and cultures. She is also a member of the Community Archiving Workshop and serves on the board of the Center for Home Movies. She has a master’s degree in Moving Image Archiving and Preservation from New York University.

van de Vall, Renée Renée van de Vall is Professor in Art and Media at Maastricht University. Her research focuses on the phenomenology of spectatorship in contemporary art, more specifically on how works of art construct the embodied engagement of spectators, on processes of globalization of contemporary art and media, and on conservation theory and ethics in the context of contemporary art. She has been the project leader of various research projects, most recently (2016–19) of the Marie Sklodowska-Curie Innovative Training Network, New Approaches in the Conservation of Contemporary Art (NACCA). Ventur, Conrad Conrad Ventur is a multimedia artist based in New York City. His work, which encompasses photography, video, installation, and independent publication projects, considers ecology, family, habitat, and memory. Ventur studied photography at the Rochester Institute of Technology (BFA, 1999) and fine art at Goldsmiths, University of London (MFA, 2008). Recent solo exhibitions include A green new deal at Participant Inc (2020), IVY at Baxter St at CCNY (2016), and Pink Seat at Rokeby, London (2016). Recent group exhibitions include Be Seen: Portrait Photography After Stonewall at the Wadsworth Atheneum, Hartford, Connecticut (2019) and Face to Face: Portraits of Artists at the Philadelphia Museum of Art (2018). In 2019, Ventur curated Altered After for Visual AIDS. Ventur’s work is held in the permanent collections of the Philadelphia Museum of Art, the Wadsworth Atheneum, and the Whitney Museum of American Art. Versteeg, Siebren Siebren Versteeg uses digital technology to make work that synthesizes painterly abstraction with algorithmic programming. Solo exhibitions include bitforms, NY; the Museum of Art at Rhode Island School of Design; Hallwalls, Buffalo; the Wexner Center for the Arts, Ohio; Max Protetch, New York; and the Museum of Contemporary Art in Chicago. He has been included in group exhibitions at the Whitney Museum of American Art; the Hermitage, St.
Petersburg; Smithsonian’s Hirshhorn Museum and Sculpture Garden, Washington, DC; the Fabric Workshop and Museum, Philadelphia; and the National Museum of Art, Czech Republic. Public collections include the Whitney Museum of American Art, the Solomon R. Guggenheim Museum, Albright-Knox, Yale University Art Gallery, Hirshhorn Museum and Sculpture Garden, and Chicago’s Museum of Contemporary Art. Vlaminck, Matthieu Matthieu Vlaminck is a Senior Digital Art and Media Conservator at ZKM | Center for Art and Media Karlsruhe. He graduated from the Ecole Supérieure d’Art d’Avignon in Visual Arts. He also holds a diploma in programming/network and in music (cello). His current specialization is the preservation and restoration of 3D computer-generated cinema models, especially Star Trek ships. Matthieu’s research and work focus on the preservation of digital art, notably on third-party products as part of artworks (maintenance/adaptation of historical and obsolete commercial software/APIs for the sake of art preservation) and the archiving of artworks using 3D visualization. Warda, Jeffrey Jeffrey Warda is Senior Conservator, Paper and Photographs at the Solomon R. Guggenheim Museum, New York. He has worked closely with artists, photography labs, and technicians to complete multiple digitization, slide duplication, and rehousing projects of slide-based artworks for acquisition and display. He was chair of both the Electronic Media Group (2006–08)

and the Digital Photographic Documentation Task Force (2007–08) of the American Institute for Conservation (AIC). He co-authored and edited both editions of The AIC Guide to Digital Photography and Conservation Documentation (2008 and 2011), a publication that provided practical recommendations for conservators transitioning from film-based conservation documentation to digital photography. He also founded and was managing editor (2009–2014) of the AIC Electronic Media Group’s biennial publication, the Electronic Media Review—the first print publication of EMG designed to build a body of knowledge for the field of time-based media conservation. He received an MA and Certificate of Advanced Study in art conservation from the State University of New York College at Buffalo. Weiss, Grace T. Grace T. Weiss is Assistant Registrar for Media Arts at the San Francisco Museum of Modern Art. She has presented internationally on best practices for media arts registration and is a contributing author of Museum Registration Methods, sixth edition. Grace holds an MA in Museum Studies from New York University and dual BA degrees in Art History and Communications from Fordham University. Specializing in time-based media, her work focuses on how museums collect and display the art of our time. Weisser, Andreas Media conservator Andreas Weisser studied conservation and restoration at the Cologne University of Applied Sciences. From 2003 to 2015, he was employed by the Freiburg Municipal Museums, and since 2002, he has been working as a freelance consultant for archive analysis and digitization and as a conservator for audiovisual data carriers. The focus of his work is advising on analog and digital long-term archiving and developing sustainable digitization strategies. This includes the evaluation of suitable formats as well as the design of storage spaces. 
He is a lecturer at the Cologne University of Applied Sciences and HTW Berlin as well as an instructor at the Deutsche Welle Academy for the Middle East and North Africa. Wharton, Glenn Glenn Wharton is a Professor of Art History at UCLA and Chair of the UCLA/Getty Conservation of Cultural Heritage Program. Prior to this appointment, he was on the faculty of Museum Studies at New York University. He initiated and managed MoMA’s time-based media conservation program during its early years, and he currently co-directs the NYU-based Artist Archives Initiative, which develops digital information systems for the display and conservation of contemporary art. He received his PhD in Conservation from the Institute of Archaeology, University College London, and his MA in Conservation from the Cooperstown Graduate Program in New York. White, Layna Layna White is Director of Collections at the San Francisco Museum of Modern Art. She leads a division comprising five high-performing departments (Conservation, Registration, Collections Management, Collections Information and Access, and Library + Archives) in cross-disciplinary art stewardship placed in direct dialogue with activating, experiencing, and documenting art. Her interests include developing lively, situational approaches, systems, and documentation related to art, art practices, and art experiences, toward maximizing access and sharing for staff and public benefit. Wielocha, Aga Aga Wielocha is a collection care professional and a researcher specializing in contemporary art. Currently she holds the position of Conservator, Preventive, at M+ in Hong Kong. She

holds a PhD from the University of Amsterdam, Amsterdam School for Heritage, Memory, and Material Culture. Her doctoral research, carried out within the program New Approaches in the Conservation of Contemporary Art (NACCA) and situated at the crossroads of art history and theory, conservation, museology, and heritage studies, was focused on the lives and futures of contemporary art in institutional collections, particularly on works which are variable and unfold over time. Prior to her doctoral studies, she served as a conservator at the Museum of Modern Art, Warsaw. Wijers, Gaby Gaby Wijers is the founder and director of LIMA. Previously she was the coordinator of collection, preservation, and related research at the NIMk, Amsterdam (NL); she has a background in information management, theater, and informatics. She has initiated, advised on, and participated in multiple national and international projects dealing with the documentation of, preservation of, and access to immaterial and interactive art, specializing in media art and performance. These projects include ArtHost, UNFOLD, NACCA, Transformation Digital Art, Preservation of Media Art Collections in the Netherlands, Inside Installations, Inside Movement Knowledge, Obsolete Equipment, Digitizing Contemporary Art, Digitalcanon?!, and Documenting Digital Art. She participates in national and international networks such as the Foundation for the Conservation of Contemporary Art (SBMK), the Dutch Digital Heritage Network (NDE), and the Network Archives Design and Digital Culture, and she is a guest lecturer at Amsterdam University and an Honorary Research Fellow at Exeter University. Wood, David David Wood is based in London and has many years’ experience in art installation, from public institutions such as the Hayward Gallery to commercial galleries like Victoria Miro. David has also worked extensively on art fairs across the world.
Zabaleta, Guillermo Guillermo Zabaleta is dedicated to the rescue and preservation of film, both through teaching and through alchemical experimentation in his laboratory and in Filmperformance. His work forms a whole with his affection for the images he lives with daily and shares with other generations, both as a transmission of knowledge and as a call to create new images with which to live and work from the field of art. He studied museology, theater, and cinema and trained in audiovisual preservation in London, Madrid, Buenos Aires, and Montevideo. He is a member of the Fundación de Arte Contemporáneo (FAC) in Montevideo, Uruguay, where he has directed the Laboratorio de Cine since 2009. Zimbardo, Tanya Tanya Zimbardo is Assistant Curator of Media Arts at the San Francisco Museum of Modern Art. She has curated exhibitions of work by contemporary artists Jim Campbell, Runa Islam, Pat O’Neill, and Kerry Tribe, and recently, Future Histories: Theaster Gates and Cauleen Smith. She has curated select screening programs online and for theatrical contexts. Zimbardo co-curated Nothing Stable under Heaven, Soundtracks, Nam June Paik: In Character, The More Things Change, and Fifty Years of Bay Area Art: The SECA Awards, among other exhibitions, and worked on surveys of William Kentridge and Suzanne Lacy. She has participated in several SFMOMA collection initiatives.


INDEX

3D-printed artworks, 495
Abramovic, Marina, 514
ACMI (Australian Centre for Moving Image), 142, 480
acquisition, 5–6, 24, 29–30, 32, 50–51, 159–61, 165, 168–73, 177, 180, 250, 253–54, 261, 273–78, 411, 413–14, 418–19, 436, 453–54, 456–58, 461, 467–68, 470, 495, 517
  contract (legal agreements), 34, 44, 48, 51, 169–71, 179–80, 253–54, 259–62, 467–68
  deliverables, 5–7, 50–51, 56, 169–71, 250, 253, 274–76, 390, 411, 457, 461, 467–68, 470
  documentation, 165, 168–73, 177, 180, 185, 254, 276–78, 411, 456–58, 461, 467–68, 470
  process (best practice workflows), 168–73, 253–54, 261, 274–78, 411, 413–15, 418–19, 436, 456–58, 461, 467–68, 470
Adachi-Tasch, Ann, 67, 71–73
Adelstein, Peter, 391
advocacy for time-based media conservation, 5, 7, 43, 59, 74, 89, 91–92, 104–6, 128, 133, 141, 178, 236
AGNSW (Art Gallery of New South Wales), 59–60, 95–96, 101, 141–42, 224
AIP (archival information package), 114, 116, 118, 197, 200–201
AJI (Archive Jam Improvisation), 76–77
algorithms in software-based art, 254, 260, 454, 466, 483–84, 486, 504–5
allographic artworks, 15–17, 21–22. See also autographic artworks
AMIA (Association of Moving Image Archivists), 69–71, 272, 282
Anderson, Mollie, 39–66
Anker, Steve, 356–57
Ansari, Amad, 505
Anthology Film Archives, 347, 402
Antos, Julian, 347, 392–401
APEX (Audiovisual Preservation Exchange), 67–69, 74–79, 86
APIs (application programming interfaces), 454, 483
application programming interfaces. See APIs
AR (augmented reality), 177, 493–94
Arcangel, Cory, 469, 478
Archey, Karen, 28, 30
archival information package. See AIP
Archive Jam Improvisation. See AJI
Archivematica, 118, 129, 132–33
arden, sasha, 475
Arduino, 447, 471, 474–75, 491, 494–95
Art Gallery of New South Wales. See AGNSW
Art Institute of Chicago, 61, 67, 95–96, 105–6
artist interviews, 30, 33, 44, 50–51, 104, 108, 151, 171–73, 177, 193, 250, 253, 255, 257, 414–15, 458, 464, 486
Artwork Documentation Tool. See LIMA (Living Media Art), Artwork Documentation Tool
artwork record (database)
  child record, 173, 187, 190–91
  component, 44, 50–51, 56, 118–20, 124, 131, 133, 167, 172, 185, 187–91, 193, 218, 228, 412, 415–17, 419–20, 461, 469–70
  parent record, 6–7, 57, 187, 190–93, 458
aspect ratio, 294, 302–3, 311, 356–57, 374, 383, 397, 459
Association of Moving Image Archivists. See AMIA
Association of Tribal Archives, Libraries, and Museums (ATALM), 71


ATALM. See Association of Tribal Archives, Libraries, and Museums
Atelier für Videokonservierung (Bern), 95–96, 98, 104, 107, 269
audio
  channels, 283–84, 287, 333, 337–39, 344, 393–94
  signal, 306, 308, 338, 396, 472
  tracks, 279, 282, 284, 292, 435, 440, 447
Audiovisual Preservation Exchange. See APEX
Auerbach, Tauba, 495
augmented reality. See AR
Australian Centre for Moving Image. See ACMI
autographic artworks, 15–17, 22. See also allographic artworks
automation (ingest and storage of digital assets), 118, 128–30, 132, 198, 203, 206
Bagger (software). See Library of Congress, Bagger
BagIt (software). See Library of Congress, BagIt (software)
Baltimore Museum of Art, 410
Barger, Michelle, 62, 95, 154, 158, 160
Barnott-Clement, Rebecca, 60, 95, 101
BAVC (Bay Area Video Coalition), 86, 104, 155, 270
Bay Area Video Coalition. See BAVC
Beiguelman, Giselle, 31, 35
Bek, Reinhard, 95, 97, 237
Bek & Frohnert LLC, 95–97
Bender, Gretchen, 271
Betacam/Betacam SP. See video formats, Betacam/Betacam SP
Bethonico, Mabe, 30, 34
Bicchochi, Gloria Maria, 270
Bieber, Alain, 8
Biemann, Ursula, 30, 34
Birmingham Museum Trust, 240–41
Birnbaum, Dara, 258
Bishop, Claire, 515
BitCurator (software), 103, 209–12, 219
bitforms gallery, 253, 259, 262, 467
Blair, Ulanda, 28, 30, 32–33
Bloes, Richard, 95
Blu-ray. See video formats, Blu-ray
Bosshard, Andres, 33, 35
Brand, Bill, 67, 69, 76–78
Brokerhof, Agnes, 464, 466
Brost, Amy, 95, 111–38
Bruguera, Tania, 513
Burton, Johanna, 81
cable connections, 175, 243, 245, 306–8, 333, 337–38, 344, 459, 465
CAD (computer-aided design), 179, 244, 495
Cage, John, 317, 513
California Revealed, 67
Callard, Andrea, 68, 81–83, 85
camera original, 351, 356, 360, 363, 366, 407–8, 412–13, 419, 421, 423, 433, 436, 438, 447
Campbell, Jim, 41
Campbell, Savannah, 95, 254
Canogar, Daniel, 249, 254–57, 260
Castriota, Brian, 95, 98, 105–6
cataloguing time-based media art, 44, 53, 56, 58–60, 77, 111, 129, 131, 134, 172, 185–87, 194, 214, 218, 352, 415, 458
cathode ray tube. See CRT
CAW (Community Archiving Workshop), 67–74, 85–86
CCBA Initiative. See Guggenheim Museum, CCBA (Conserving Computer-Based Art) Initiative
CCJ (Collaborative Cataloging Japan), 67, 70, 72
CCSDS (Consultative Committee for Space Data Systems), 111, 114, 197
CD (audio CD), 15, 109, 204, 216–17, 276, 331, 333–35
CD-ROMs, 204, 207, 216–17
cellulose nitrate. See film, nitrate (cellulose nitrate)
certificate of authenticity, 60, 187, 191–92
Cheang, Shu Lea, 457
checksums (see also fixity of digital files), 50–51, 56, 85, 103, 115, 121, 128, 160, 192, 198, 200, 206, 214, 218–19, 277, 289
Chen, Yuhsien, 43, 45, 52
Cheng, Ian, 36, 493
Chicago Albumen Works, 435–36, 450
Chicago Film Society, 347
Chizinski, Piotr, 450
Churchill, Joshua, 154, 160–61
circuit boards, 232, 456, 471–72, 474–78, 480
CMS (collection management system), 56–57, 113, 118, 123, 129, 169, 172, 178–79, 186–90, 194, 218, 223, 226–30, 233, 412, 415–17, 420, 458
CMS (content management system), 186, 190
Cochrane, Euan, 505
Coddington, Jim, 89–92, 102
code resituation, 504
Collaborative Cataloging Japan. See CCJ
Collecting the Performative. See Tate, Collecting the Performative
collection management system. See CMS (collection management system)
collection survey, 9–10, 39–49, 53–54, 56, 58–62, 91, 97
Colloton, Eddy, 132, 204–22
Collyer, Catherine, 95
color dye fading (film). See film, color dye fading
color films, 363, 371, 425, 438, 448
color slides, 407, 416–17, 435, 448–49
Colussi, Francesca, 165–84
command line tools, 197–98, 200, 202–3, 211–12, 279, 288, 447, 455


Community Archiving Workshop. See CAW
compression, 211, 213–14, 216–17, 274, 285–86, 288–89, 293–94, 296–301, 310, 333, 342, 374, 392
  audio, 333, 342, 392
  digital film, 374, 392
  disk images, 211, 213–14, 216–17
  video, 274, 285–86, 288–89, 293–94, 296–301, 310
computer-aided design. See CAD
computer-based art. See software-based art
condition assessment, 8, 10, 41, 44–47, 50–52, 57–58, 61, 96, 102–3, 125, 129, 143, 161, 167, 171, 177–78, 180, 236, 276–79, 281, 351–52, 419–20, 428, 444, 462
  film, 351, 353–54, 394, 402–3
  slides, 414, 419–20
  video, 276–81, 468
Conrad, Tony, 272
conservation assessments, 39–40
Conservation Center (Institute of Fine Arts, New York University), 146–49, 151
Consultative Committee for Space Data Systems. See CCSDS
containerization (system level virtualisation), 500
content management system. See CMS (content management system)
Cook, Sarah, 28, 30, 32–34, 37
Cornell, Lauren, 249, 252, 258–59, 261
Cortright, Petra, 271
CPU (central processing unit), 102, 455, 472–74, 494
Cranmer, Candice, 480
Croft, Kyle, 68, 82–85
CRT (cathode ray tube), 5, 30, 84, 103, 108, 228, 235, 239, 245, 269, 275, 277, 290, 295, 302–3, 307, 309, 311, 417, 433–34
  film recorders. See film recorders, CRT
  monitors, 5, 30, 84, 103–4, 108, 234–35, 245, 269, 275, 277, 289–90, 295, 307, 309, 311
  projectors, 228, 234–35, 239, 245
Cullen, Tom, 239–48
custom hardware (computer-based art), 478–81, 487
custom-modified film projectors, 399
D’Agostino Dias, Fernanda, 165, 179
Dalí, Salvador, 348
DAMS (digital asset management system), 113, 126, 130, 135, 179, 185, 193, 197, 201
data
  corruption, 58, 121, 133, 199–200, 214, 462
  fixity. See fixity of digital files
  ingest. See ingest of data
  integrity, 196, 198, 200, 202–3
  loss, 132, 199–200, 203, 213, 466
  recovery, 211–14
DCP (Digital Cinema Package), 349, 382, 384, 386–87, 390, 399–401
Dean, Tacita, 348
Dearolph, Amanda, 194
Dekker, Annet, 26, 28–38, 166
Deleuze, Gilles, 19
Denver Art Museum, 130–31, 206
dependency relationships, 483, 497
Derrida, Jacques, 19
development environment (software), 468, 490, 494, 503, 505
Dickson, Emma, 505
digital
  artworks, 6, 21–22, 31, 113–16, 118, 120–21, 124, 126, 128, 131, 141, 151, 205, 245, 252, 261
  audio, 294, 319, 331–34, 337, 339
  cameras (slide digitization), 412, 424, 427, 430–31, 436
  collections, 112–13, 121, 126–27, 129, 135
  forensics, 80, 108, 204, 206, 211, 213–14, 218–20
  objects, 112–13, 115–16, 118–20, 188, 190, 192, 198, 202–5, 218, 466
  video, 273, 278–79, 283, 289, 293–94, 296–98, 300, 303, 305–6, 310, 348, 487, 493
digital asset management system. See DAMS
Digital Betacam. See video formats, Digital Betacam
digital cinema, 347, 350, 386–87, 399–400
Digital Cinema Package. See DCP
digital conservation repository. See digital repository
digital preservation, 31, 49, 57, 76, 82, 112, 114–16, 118, 121–23, 126, 128, 132, 134, 144, 149, 167, 196–97, 199, 201–3, 219–20, 273–74, 276, 283, 289, 466, 498
Digital Preservation Coalition. See DPC
digital repository, 31, 48, 112, 115–16, 118–21, 123, 126–28, 130–31, 144–45, 201, 220
Digital Repository for Artists and Collections. See LIMA (Living Media Art), Digital Repository for Artists and Collections
digitization
  of analog film, 351–52, 360, 363, 369–72, 374–75, 382–86, 389–90, 392
  of slides, 408, 412, 420, 422–28, 430–33
  of sound (in film and video), 283, 384–86
  of video tape, 5, 80–84, 103, 105, 113, 279–83
DIP. See distribution information package
disk imaging, 6, 11, 102, 106, 109, 204–21, 289, 459, 463, 469, 475, 487–88, 501–2
  formats, 212–17
  software, 109, 207–8, 211–12, 215–17, 219
  workstation, 11, 102–3, 106, 109, 207–11
display equipment. See equipment
distribution information package (DIP), 197, 200

546

Index
DOCAM (Documentation et Conservation du Patrimoine des Arts Médiatiques), 119, 167, 189 Documentation Model for Time-Based Media Art, 18, 174 documentation of time-based media art, 7, 44, 50 – 51, 56, 165 – 81, 183, 185, 188 – 90, 228 – 29, 231, 236 – 37, 254 – 55, 257 – 58, 274 – 75, 342 – 43, 351, 372, 422, 463, 479 – 80, 484 – 88 Dodge, Charlie, 450 Doerner Institut, 10 – 11, 95 Dolby surround sound, 306, 365, 387, 393, 397 Domínguez Rubio, Fernando, 25, 168 DPC (Digital Preservation Coalition), 123, 466 DPX. See video file formats (encodings), DPX drum scanners. See slide scanning, drum scanners Duchamp, Marcel, 348 Duke, Laurie, 67, 71 – 73 Dunne, Kirsten, 40, 58, 95, 105 – 6 Düsseldorf Conservation Center, 3, 8, 95 – 96, 105, 107, 261, 450, 453, 505 DV/DVCAM. See video formats, DV/DVCAM DVD. See video formats, DVD DV files. See video file formats (encodings), DV Dye, Steven, 95, 100 – 102, 106, 154, 158 – 59

legacy equipment, 226, 228, 230 – 32, 236, 245, 468 significance, 15, 17, 78, 119, 159, 166, 169, 172, 187, 207, 223 – 26, 228 – 30, 237, 275, 459 – 60, 463 – 65, 477, 487, 503 See also significance assessment storage, 223, 226 – 27, 232 – 34 ESD (electrostatic discharge), 104, 108, 208 – 9, 290, 462, 479 – 80 Espenschied, Dragan, 206 Espoo Museum of Modern Art (Finland) (EMMA), 28, 32 examination of time-based media works, 10, 102 – 3, 130, 142, 173, 180, 206, 372, 419, 461, 469 – 70, 484, 487 executable, 455, 459 – 61, 468 – 69, 482 – 83, 486 – 87, 490, 494 executable analysis, 486 exhibition copy, 187 – 88, 192, 205 – 7, 274 – 76, 284, 296, 319, 406, 413 – 14, 417, 419 – 20, 423, 448 FAC (Fundación de Arte Contemporáneo), 67, 75 – 77 FAIC (Foundation for Advancement in Conservation), 40 – 41 Falcão, Patricia, 165 – 84, 453 – 511 Farbowitz, Jonathan, 95, 105, 134 – 35, 204 – 22, 501 Farias de Carvalho, Humberto, 67, 77, 79 – 80 Fédération Internationale des Archives du Film. See FIAF FFmpeg (software), 109, 278, 282, 288, 300 – 301, 311 FFV1. See video file formats (encodings), FFV1 FIAF (Fédération Internationale des Archives du Film), 76, 365, 369, 374, 382, 386 file transfer, 199 – 200 file types, 123, 127, 187, 189, 193, 386, 416 film acetate (cellulose acetate), 352 – 53, 356 – 59, 369 – 70, 372, 374 – 75, 382, 391 – 92 acetate base deterioration (vinegar syndrome), 358, 369 – 70, 391 analog film, 300 – 301, 347, 349 – 50, 352, 377, 384, 388, 390, 392 – 93, 399, 401 cleaning, 375, 378, 382 color dye fading, 371, 382, 388, 391, 395, 398, 403 damage, 354, 371 date code, 354, 363 – 65, 403, 428 duplication, 347, 349, 351, 360, 372, 374 – 75, 378 – 79, 390, 409, 449 edge printing, 358, 363, 365 gauges, 348, 351 – 52, 354 – 56, 358, 363, 367, 369, 374, 377, 384, 388, 394 – 95 inspection, 71, 351, 353 – 54, 371, 389 intermediate, 360 – 62

EaaS (Emulation as a Service), 454, 499 EaaSI (Emulation as a Service Infrastructure), 220, 499 EAI (Electronic Arts Intermix), 55, 270, 272 – 73, 282 ecologies of TBM artworks, 13 – 14, 23 – 24, 26, 514, 518 Efferenn, Kristof, 95, 98 Electronic Arts Intermix. See EAI electrostatic discharge. See ESD EMMA. See Espoo Museum of Modern Art (Finland) emulation, 22, 205 – 7, 214 – 15, 219 – 20, 258, 454 – 55, 460, 478, 497 – 503 Emulation as a Service. See EaaS Emulation as a Service Infrastructure. See EaaSI encoding (see also video file formats (encodings)), 216 – 17, 276 – 77, 279, 282 – 83, 287, 290, 294 – 95, 300 – 301, 310 Engel, Deena, 37, 145, 165, 249 – 66, 257, 453 – 511 Eno, Brian, 318 Ensom, Tom, 145, 165, 453 – 511 equipment, 5, 10 – 11, 13 – 15, 17, 43, 47, 56, 59, 102, 104 – 7, 170, 175 – 77, 223 – 37, 240 – 41, 243 – 46, 295 – 97, 304, 308 – 12, 394, 398 – 400, 409, 415, 441, 443, 447, 470, 477 categories, 226 inventory, 56, 61, 172, 188, 191, 223, 228, 459 – 61


liquid gate scanning, 372, 382 – 83 loopers, 230, 348 – 49, 392, 398 – 99 nitrate (cellulose nitrate), 353, 357 – 59, 369, 372 nitrate base deterioration, 358, 369, 391 photochemical duplication (film-to-film), 374 – 76, 380, 382, 385, 389 polyester, 358 – 59, 376, 382, 388 projection, 96, 304, 347 – 50, 356 – 57, 362 – 63, 365 – 67, 372, 376, 379 – 80, 387, 390 – 91, 393 – 401, 443 – 44, 446 projection booth, 394 – 95 projectors, 109, 369 – 70, 390, 395 – 97, 399 release prints, 349 – 50, 381, 389 – 90 reversal, 352, 359 – 60, 362 – 63, 365, 367 – 68, 373, 379 – 80, 409, 425, 433, 444 screening room, 392, 394 shrinkage, 358, 369 – 70, 377, 383, 403 shrinkage gauge, 353, 370 soundtrack. See soundtrack (film) stocks, 348, 350 – 52, 354, 356, 360, 372, 374, 409, 422, 434, 444 storage, 369, 371, 390 – 91, 440 See also storage, cold film printing, 347 – 51, 359, 363, 366, 369, 372, 374 – 80, 382 – 83, 389 – 90 contact printing, 350, 359, 361, 366, 370, 373, 376 – 79, 389 liquid gate printing, 372, 377 – 79 optical printing, 361, 373, 376 – 78, 383, 388 – 89 film recorders, 388 – 90, 408, 413, 417, 420 – 22, 424 – 25, 433 – 36 CRT, 417 – 19, 433 – 34 LCD, 433 LVT, 434 – 36 film scanners, 349, 374, 377, 382 – 83, 385, 388 – 90, 401, 424 – 26, 429 film-to-digital duplication. See digitization, of analog film Finbow, Acatia, 145 Fino-Radin, Ben, 95, 97, 142, 466 Fiske, Tina, 19, 21 fixity of digital files, 6, 49 – 51, 57, 103, 114 – 15, 121, 124, 128 – 29, 131 – 32, 134, 196 – 97, 200, 202, 218, 277, 289, 300 – 301 floppy disk, 204, 207 – 9, 212, 216 – 17, 446 Floyer, Ceal, 228 forensics. See digital, forensics Forensic Toolkit. See FTK Imager (software) Forging the Future (project), 181 formats digital audio, 333, 387 disk images. See disk imaging, formats film. See film, gauges slide film. See slide, film formats video. See video formats

Forsberg, Walter, 81, 401 Fortunato, Flaminia, 95 – 96 Foundation for Advancement in Conservation. See FAIC frame rate, 282, 287 – 88, 294, 336, 385, 387, 494 Franek, Erika, 194 Frieling, Rudolf, 154, 157 – 59 Frohnert, Christine, 95, 97, 99, 148 FTK Imager (software), 103, 109, 211 – 12, 216 – 18 Fundación de Arte Contemporáneo. See FAC Gallery of Modern Art (Brisbane), 95 Gates, Ethan, 505 Gerrard, John, 482, 487, 493 Gil Rodríguez, Caroline, 67 – 68, 77, 79, 83 – 84, 134, 204 – 22 GitHub (software), 36, 252, 456, 458, 469 Glasgow Museums, Gallery of Modern Art, 141 Gonçalves Magalhães, Ana, 28, 31, 33 Gonzalez-Torres, Felix, 22 Goodman, Nelson, 15 – 16, 19 Grey Art Gallery (New York University), 67 Guggenheim Museum (Solomon R. Guggenheim Museum), 8, 56, 96, 98, 100, 107, 141 – 42, 144 – 45, 149 – 50, 167, 193 – 94, 252 – 53, 257 – 58, 406 – 8, 412, 416 – 19, 429, 435 – 36, 447 – 48, 450, 454, 457, 504, 514, 517 CCBA (Conserving Computer-Based Art) Initiative, 205, 454 Panza Initiative, 193 Variable Media Initiative, 19, 21 – 22, 141 – 42, 166, 258, 515 – 16 Gutai, 513 Haidvogl, Martina, 62, 95, 100 – 101, 154 – 62, 185 – 95, 224 Hall, David, 270 Hamburger Kunsthalle, 408 – 9 Hamilton, Emily, 495 Hanhardt, John, 348 hardware failure, 458, 462, 480 – 81 Harris, Jon, 240 – 41, 245 Harvey, Duncan, 194, 223 – 38 Haus der Elektronischen Künste (Basel) (HEK), 28, 34 – 35, 38, 500 Hayes, Sharon, 406 – 8, 450 HDCAM/HDCAM SR. See video formats, HDCAM/HDCAM SR Heinen, Joey, 194 HEK. See Haus der Elektronischen Künste (Basel) Hellar, Mark, 154, 158, 161, 249, 252, 254, 257, 475 Hershman Leeson, Lynn, 249, 252 Hesse, Eva, 21 Himmelsbach, Sabine, 28 – 29, 33, 38


Hirshhorn Museum and Sculpture Garden (HMSG), 142, 150 Hix, Kelli, 67, 71 – 74 HMD (head-mounted display), 494. See also VR (virtual reality) HMSG. See Hirshhorn Museum and Sculpture Garden Hölling, Hanna B., 19 – 21, 517 Holzer, Jenny, 259 Homberger, Liz, 53 Hsieh, Tehching, 513 Huang, Xincheng, 505

Irish Museum of Modern Art (IMMA), 95 – 96, 98 Iteration Reports (see also Identity Reports), 18, 50 – 51, 56, 170, 174 – 77, 189, 193, 458, 517 Jarczyk, Agathe, 95, 98, 107, 194, 269 – 316, 450 Jimenez, Mona, 67 – 88 JODI, 465 Jonas, Joan, 193, 270 Julien, Isaac, 239, 246 – 48 Kaliades, Leslie, 83 Kapp, Craig, 505 Kennedy, Nora, 39 – 66, 134 King, Chris, 480 Kjartansson, Ragnar, 271, 317 – 18, 325 – 27, 341, 343 – 45 Klacsmann, John, 347 – 405 Klomp, Paul Jansen, 480 Kramer, Lia, 39 – 66 Kunstmuseum Wolfsburg, 273, 516 Kushel, Dan, 450

ICCROM (International Centre for the Study of the Preservation and Restoration of Cultural Property), 464 ICOM-CC (International Council of Museums-Committee for Conservation), 144, 146 – 47, 151, 516 ICON (Institute of Conservation), 129, 150 – 51, 166 Identity Reports, 18, 50 – 51, 56, 170, 172, 174, 193. See also Iteration Reports Image Permanence Institute. See IPI IMAI. See Intermedia Art Institute (Düsseldorf) IMLS (Institute of Museum and Library Services), 40, 71, 132 IMMA. See Irish Museum of Modern Art INCCA (International Network for the Conservation of Contemporary Art), 28 – 29, 151, 181 information package (storage package), 57, 113, 118, 120 – 21, 124 – 25, 127 – 29, 131, 134, 196 – 203, 244, 392, 494 ingest of data, 100, 102, 127, 129 – 30, 132, 134, 197, 199, 401 Inside Installations (project), 29, 166, 179 – 80 installation instructions for time-based media art, 6, 18, 50 – 51, 170 – 74, 240 – 41, 246 – 47, 254 – 55, 272 – 73, 415, 468 installation of time-based media art, 11, 17, 160, 174 – 76, 239 – 41, 243 – 45, 247, 319, 328, 341, 394, 398, 401 Institute of Conservation. See ICON Institute of Museum and Library Services. See IMLS institutional assessments, 40 – 41, 43 – 51, 54 Intermedia Art Institute (Düsseldorf) (IMAI), 317 International Network for the Conservation of Contemporary Art. See INCCA Internet Archive, 80 – 81, 83 – 84 inventory of time-based media, 42, 46, 54, 56, 60 – 61, 68, 70, 172, 185, 188 – 91, 194, 206, 215, 218, 223, 228, 416, 440, 458 – 61 IPI (Image Permanence Institute), 358, 369 – 71, 390 – 91, 440 Ippolito, Jon, 21, 23

Laboratorio de Cine (Montevideo, Uruguay), 67 LACMA. See Los Angeles County Museum of Art Lamelas, David, 514 Lampert, Andrew, 74 Lascu, Marie, 68, 82 – 84, 86 LaserDisc. See video formats, LaserDisc Laurenson, Pip, 15, 17, 19, 23, 36, 102, 140 – 41, 145, 224 – 25, 228, 237, 463, 516 – 17 Lawson, Louise, 95, 100, 139 – 53, 517 Leckart, Linda, 160, 185 – 95 Leckey, Mark, 228 – 29 Lehni, Jürg, 167, 469 lending time-based media artworks, 50 – 52, 62, 177, 206 – 7, 236, 400 – 401, 406, 415, 443, 448 – 49 Leslie-Lohman Museum of Art, 72 Levine, Les, 270 Lewis, Kate, 95 – 110, 118 LeWitt, Sol, 21 Library of Congress, 179, 202, 213 – 14, 289, 295, 392, 493 Bagger (software), 103, 108 BagIt (software), 103, 108, 120, 124, 133, 197, 202 – 3 Sustainability of Digital Formats, 179, 213 – 14, 295 light box, 109, 352 – 53, 419 – 20, 432, 436 LIMA (Living Media Art), 95 – 96, 104, 128, 141, 167, 249 – 50, 255, 257, 270, 454 Artwork Documentation Tool, 167, 255 Digital Repository for Artists and Collections, 255 previously Montevideo (Amsterdam), 67, 75 – 77, 96, 270


microcontrollers, 447, 471, 474 – 78, 480 – 81, 491, 494 migration, 22, 54, 104, 113, 124, 133, 276, 281, 283, 288, 298 – 301, 374, 481, 486, 497 – 98, 500, 503 – 5 code migration. See source code, migration environment migration, 497 – 98, 500 hardware migration, 481 video migration, 281, 288, 298 – 301 MIPOPS (Moving Image Preservation of Puget Sound), 86 Mol, Annemarie, 24 Molnar, Vera, 466 MoMA (Museum of Modern Art), 8, 25, 32, 74, 91, 96, 98, 102, 104, 127, 130, 141 – 42, 149 – 50, 161, 248, 270, 400, 495 Montevideo. See LIMA (Living Media Art), previously Montevideo (Amsterdam) Moomaw-Taylor, Kate, 131 – 32 Moore, Alan, 81 Morfin, Jo Ana, 95, 97, 105 Morris, Robert, 515 Moving Image Archiving and Preservation, New York University. See MIAP Moving Image Preservation of Puget Sound. See MIPOPS Müller, Dorcas, 95, 104 Munson, Doug, 450 Museum Ludwig (Cologne), 95 – 96, 98, 105, 258 Museum of Contemporary Art, University of São Paulo, 28 Museum of Modern Art. See MoMA The Museum System (TMS), 45, 50 – 51, 56 – 57, 134 – 35, 186, 226

Linhart, Hank, 85 Linked Open Data, 129, 167 – 68, 454 loans. See lending time-based media artworks loopers. See film, loopers López Ruiz, Ángela, 67, 77 Lorrain, Emanuel, 224, 232 – 33 Los Angeles County Museum of Art (LACMA), 140, 194 Lozano-Hemmer, Rafael, 252, 260, 454, 462, 469, 498 LTO (Linear Tape-Open), 46, 105, 108, 124, 134, 188, 191, 203 – 4, 392 Lucier, Alvin, 321, 342 Ludwig Museum (Budapest), 151 LUTs (Look Up Tables), 384, 389 M+ Museum (Hong Kong), 28, 30, 33, 45, 57, 96, 99 – 100, 142, 144 MacDonough, Kristin, 61, 67 – 88, 95, 106 Mack, Barbra, 118 – 20 Marçal, Hélia, 512 – 20 Marcantonio, Carla, 86 markup languages, 483, 489, 492 Martin, Katy, 76 Martin, Nicole, 196 – 203 Martin, Uwe H., 30, 34 master files, 130, 187, 253, 288, 387, 412 master materials, 53, 119, 349, 361, 363, 372, 374, 389 Matters in Media Art, 32, 45, 56, 69, 85, 121, 161, 167, 275, 517 McConchie, Jack, 165, 454, 494 McDonald, Christopher, 317 – 46 McQueen, Steve, 239 MediaConch (software), 130, 279 Media Conservation Initiative (MoMA), 104, 150, 167, 220 media conservation lab, 7, 10 – 11, 48, 95 – 109, 210 – 11, 352 – 53, 390, 419 MediaInfo (software), 56 – 57, 103, 109, 172, 201, 278, 284 media players, 104, 245, 305, 476, 503 – 4 Meireles, Cildo, 513 Mellado, Diego, 249, 254, 257, 260 Merli-Nguyen Hoai, Lan Linh, 8, 505 The Met. See Metropolitan Museum of Art metadata, 120, 123, 125, 134, 181, 185, 192, 197, 199 – 201, 203, 211, 213 – 18, 283 – 84, 287, 300, 333, 392, 458 Metropolitan Museum of Art (The Met), 8, 40 – 41, 50 – 51, 54 – 56, 66, 95 – 96, 105, 131, 133, 135, 142, 275, 458 MIAP (Moving Image Archiving and Preservation, New York University), 67, 70, 74, 77, 83, 132, 146, 150 Michaan, Alex, 165

NACCA (New Approaches in the Conservation of Contemporary Art), 516 Nakajima, Ko, 70, 72 Nake, Frieder, 466 Nashville Metro Archives Audiovisual Heritage Center, 67, 72, 74 NAT. See New Art Trust National Digital Stewardship Alliance. See NDSA National Film Preservation Foundation, 354 National Galleries of Scotland, 40, 58, 95, 98, 105 National Gallery of Australia (Canberra), 95 National Museum of the American Indian. See NMAI Nauman, Bruce, 270 NDSA (National Digital Stewardship Alliance), 57, 121 – 22, 132, 134, 220 NDSA Levels of Preservation, 57, 122, 134 Neary, David, 95, 105 New Art Trust (NAT), 32, 121, 161, 167 New Museum (New York), 81, 83 NGS (National Galleries of Scotland), 96, 98, 105 – 6


PERICLES (project), 116, 118, 130, 220, 454 Pessoa Oliveira, Gabriela, 165 Phillips, Joanna, 3 – 12, 18 – 19, 95, 98, 107, 145, 165, 170 – 72, 174, 224 – 25, 237, 249 – 66, 253, 257 – 61, 450, 453 – 511, 517 Pietsch, Katrin, 450 Portapak, 258, 270, 292 Portland Art Museum, 44, 74 Prasse, Juergen, 450 Processing (programming language), 466, 469, 491 production diagrams (for slide-based artworks), 420 – 21 programming languages, 260, 446 – 47, 455 – 56, 458, 463, 465, 470, 481 – 83, 485 – 86, 488 – 92, 503 – 5 high-level, 482, 488, 494 low-level, 482 object-oriented, 485 scripted, 489 visual, 491 Python (programming language), 436, 447, 489, 504

Nichole, Kelani, 28, 31, 34, 36 Nichols, Alexandra, 39 – 66, 133 – 34 nitrate base deterioration. See film, nitrate base deterioration NMAI (National Museum of the American Indian), 126 NTSC, 105, 271, 281 – 82, 290 – 91, 294 – 95, 298 – 301 OAIS (Open Archival Information System), 114, 116, 118, 134, 197 – 98, 200 – 201 OAIS Reference Model, 114, 197 – 98, 201, 203 Obermann, Arnaud, 11, 95, 97, 480 obsolescence, 21, 23, 120 – 21, 123, 125, 134, 213, 224 – 27, 229, 233, 236, 252, 255, 269, 271, 273, 276, 280 – 81, 296 – 97, 307, 309, 312, 460, 466, 470, 477, 480, 496, 498, 501 of hardware, 56, 60, 225, 227, 233, 245, 310, 408, 447, 466 – 67, 481, 488 of software, 206, 456, 458, 466, 487 – 88, 495 of video formats, 129, 296 – 99 O’Connor, Fergus, 450 Oleksik, Peter, 69, 74, 95, 269 – 316, 496 Ondák, Roman, 514 – 15 O’Neill, Mari Mater, 79 ontologies in time-based media art, 13 – 15, 17, 24, 26 Open Archival Information System. See OAIS open reel. See video formats, open reel operating systems, 106, 120, 200, 205, 211, 219, 423, 447, 455 – 56, 458, 461, 465, 468 – 69, 472, 474, 476 – 77, 481, 483 – 84, 487 – 88, 492, 497 – 501 Oppenheim, Lisa, 416 – 17, 450 optical media, 9, 212, 216 – 17, 276, 289 – 90, 348 – 49 Orphan Film Symposium, 77 Ospina, Luis, 76 Otth, Jean, 270 Owens, Samantha, 224 Owens, Trevor, 112 – 13

QA. See quality assurance (QA) QC. See quality control (QC) QCTools (software), 102 – 3, 109, 279 QEMU (emulator), 501 – 3 quality assurance (QA), 219, 277 quality control (QC), 5, 22, 99 – 100, 102 – 3, 160 – 61, 185, 187 – 88, 190, 219, 229, 231, 234 – 35, 250, 277, 279, 281 – 83, 295 – 303, 305 – 6, 308, 331 – 33, 335 – 36, 339, 344, 374, 385, 390, 394, 396 – 98, 412 – 13, 421, 425, 428 – 33, 436 – 37, 444, 504 – 5 Queensland Art Gallery and Gallery of Modern Art, 95 query languages, 489 – 90, 492 questionnaires for artists, 21, 166, 169, 172, 250, 254, 258, 415, 516 QuickTime (.mov wrapper), 109, 191, 282, 284, 293, 300 QuickTime (software), 58, 103, 109, 276, 278, 335

PACKED VZW Obsolete Equipment Project, 233 Paik, Nam June, 19, 22, 24 – 25, 270 PAL, 271, 281 – 82, 285 – 86, 291, 294, 298 – 301 Panza Initiative. See Guggenheim Museum, Panza Initiative Paradiso, Paul, 450 Participant Inc, 82 – 83 Paterson, Katie, 37 Paul, Christiane, 28, 30 – 32, 36 – 37, 254, 486 Paunu, Henna, 28, 32 – 33 performance art, 16, 22, 30, 36, 160, 167, 409, 512 – 19 performance art documentation, 514 – 17

Ramírez López, Lorena, 39 – 66 Raspberry Pi, 471, 476, 494 Ray, Man, 348 real-time 3D (RT3D), 493 Reas, Casey, 252, 454 Rechert, Klaus, 501 recoverability, 464 – 67, 477 Redston, Alysha, 95 registration of time-based media, 56, 185 – 95 relative humidity. See RH


Renwick, Vanessa, 67, 69, 71, 73 – 74 Reshaping the Collectible (Tate). See Tate, Reshaping the Collectible resolution, 386, 424 – 25, 428, 430 – 34 film, 374, 386, 413 – 14, 428 – 29, 434 optical, 416, 424, 428, 430 scanning, 383 – 84, 386, 388 sensor, 401, 424, 429 sound, 331 – 32, 335 video, 107, 216, 274, 282, 290, 292, 294, 298 – 99, 311, 331, 335, 430 RH (relative humidity), 232 – 33, 391, 398, 438, 440, 466 Rhizome (New York), 454, 466, 493, 499 – 500 Conifer (software), 454, 493 Webrecorder (software), 454, 493 Wikibase, 167 Ribeiro, Ana, 165 – 84 Richardson, Nick, 480 Richter, Hans, 348 Rinehart, Richard, 21, 23 risk assessment, 4, 7, 31, 42, 54, 105, 127, 129 – 30, 166, 169, 180, 457 – 61, 464 – 66, 488 – 89, 491 Rist, Pipilotti, 317, 328 – 30, 337, 341, 344 Roeck, Claudia, 486, 500 Roemich, Hannelore, 148 Rosenberg, Karl, 505 Rozendaal, Rafaël, 259

significance assessment, 15, 17, 78, 119, 166, 169, 172, 187, 207, 223 – 26, 228 – 30, 237, 275, 459 – 60, 463 – 65, 477, 487, 503 Simon Jr., John F., 504 Singer, Martha, 67 – 88 single-board computers, 471, 476 – 78 SIP (submission information package), 197, 200 – 201 slide carousel, 228, 413, 415 – 16, 437, 441 – 42 cleaning, 425 – 26, 437, 443 digitization. See digitization, of slides duplicating film, 407 – 8, 413, 421 – 22, 436, 444, 446 duplication, 406 – 9, 414, 416, 420 – 22, 433, 437, 449 fading, 412, 438, 448 – 49 film (regular), 407 – 8, 413, 416, 418, 420 – 23, 426 – 27, 433 – 34, 440 film (reversal), 409, 425, 433, 444 film formats, 411 mounts, 109, 363, 407 – 8, 411, 416, 420, 422 – 23, 428 – 29, 434 – 39, 441 – 42, 444 projection, 406 – 7, 417 – 18, 420, 435, 444, 446, 448 projector lamps, 444 – 45 projectors, 5, 96, 409, 413, 437, 441 – 43, 446 – 47, 449 synchronization, 446 slide-based artworks, 406 – 9, 411 – 17, 419, 421 – 50 slides, soundtrack, 410, 435, 440, 447 slide scanning drum scanners, 427 – 30, 434 film scanners. See film scanners flat-bed scanners, 424, 429 – 30 Hasselblad Flextight, 419, 428 Small Data Industries (New York), 95 – 97 Smith, David, 45, 57, 95, 99 – 100 Smith, Harry, 371 Smithson, Robert, 435 – 36, 447 – 48 Smithsonian American Art Museum (SAAM), 142, 150 Smithsonian Institution, 56, 142, 167 SMPTE (Society of Motion Picture and Television Engineers), 302, 309, 351, 391 Society of American Archivists. See SAA Society of Motion Picture and Television Engineers. See SMPTE software custom, 480, 483, 486 – 87, 496, 504 dependencies, 205 – 7, 219, 468, 500 environment, 205, 455 – 57, 461, 463, 469, 472, 487 – 89, 496, 498 off-the-shelf, 487 open-source, 129, 484 software-based art, 103, 106, 109, 145, 205 – 6, 254 – 55, 453 – 54, 456 – 58, 461 – 64, 466 – 71,

SAA (Society of American Archivists), 79, 112, 115, 120, 199 SAAM (Smithsonian American Art Museum), 142, 150 Saarikko, Jorma, 240, 242, 245 Sääskilahti, Susanna, 224 Sacks, Steven, 249, 253 – 54, 259 – 60 sample rate, 286 – 87, 294, 319, 331 – 32, 334 – 36, 339 San Francisco Museum of Modern Art. See SFMOMA Sartorius, Andrea, 10 Sassen, Saskia, 157 Schechter, Maurice, 102, 450 Schneider, Ira, 105 Schum, Gerry, 270 SECAM, 271, 281 – 82, 291 Sehgal, Tino, 158, 514 – 15 Semmerling, Linnea, 317, 326 – 27 SFMOMA (San Francisco Museum of Modern Art), 7, 32, 62, 95 – 96, 98, 100, 102, 116, 121, 141 – 42, 144, 150, 154, 156 – 57, 159 – 61, 167, 193, 224, 252 SFMOMA Team Media, 7, 62, 100, 141, 154 – 61 Sharpless, Jeannette, 194 Shaw, Jeffrey, 478 Sherring, Asti, 60, 144, 224


477 – 78, 482 – 83, 486 – 87, 493 – 94, 496 – 99, 501, 503 – 4 Software Preservation Network (SPN), 220, 454, 470 Sonntag, Kathrin, 418 – 19, 450 sound cancellation, 323 – 24, 337 – 39 digitization. See digitization, of sound (in film and video) duplication (film), 379, 381 dynamic range, 237, 336, 343, 379, 383, 396, 425, 431 equalizer, 306, 308, 321 frequencies, 319, 321 – 22, 325 – 26, 339 frequency range, 325, 336, 338, 342 – 43 level, 337, 342 – 44 noise, 241, 243, 279, 321 – 24, 331 – 32, 334, 337 – 38, 342, 344, 385 – 86, 397 panels, 175, 342 pitch, 319 – 21, 323, 334 – 36 psychoacoustics, 318, 324 recording, 283, 291 – 92, 318, 328, 331 – 32, 334, 357, 368 reflections, 340 reproducing, 334, 336, 338 timbre, 16, 321, 323 waves, 319 – 21, 323 – 24, 329, 331, 334, 338, 343 sound-based artworks, 9, 161, 317, 319, 326, 328, 334, 343 soundtrack (film), 366, 368, 373 digital, 397 magnetic, 14, 55, 272 – 73, 277 – 78, 281, 289, 291, 319, 329, 331, 352, 357, 366 – 69, 382, 385 optical, 352, 354, 356, 365 – 68, 377, 379 – 80, 384 – 85, 389, 395 – 97 soundtrack (slides). See slides, soundtrack source code, 31, 118, 167, 213, 252, 254 – 56, 259, 454 – 56, 464 – 69, 475, 477, 481 – 89, 491 – 92, 496, 498, 503 – 5 analysis, 457, 468, 484 – 86 documentation, 455, 458, 485, 489, 505 intervention, 454, 496 – 97, 503, 505 migration, 503 – 5 resituation. See code resituation speakers, 103, 109, 223, 243, 245, 247, 275, 277, 283, 287, 294, 306, 308, 319 – 20, 325, 332, 334 – 39, 341, 344, 393 – 94 active vs. passive, 308, 337 – 38 SPN. See Software Preservation Network Springer, Samantha, 45 – 46 staffing time-based media conservation, 3 – 7, 9, 11, 47 – 48, 139 – 41, 143 – 53, 156, 161, 259, 261 Stedelijk Museum (Amsterdam), 30 – 31, 96, 514 Sterrett, Jill, 101, 154, 156 – 58, 161

Stevenson, Vance, 450 sticky shed syndrome, 272, 281, 289 Stokols, Daniel, 156 storage cold storage, 105, 303, 358, 370, 375, 390 – 91, 412, 438 – 40 digital storage, 6, 43, 49, 57 – 59, 102, 111 – 21, 123, 125 – 37, 161, 199 – 200, 289, 319, 386, 392 equipment storage. See equipment, storage video tapes and disks, 121, 289 – 90 Stricot, Morgane, 224, 231, 480 Stringari, Lena, 249, 252 – 53, 257 – 59, 450 Suárez, Juana, 67, 76 – 80, 86 submission information package. See SIP Taipei Fine Arts Museum, 52 Tate, 32, 36, 96, 100, 106, 116, 121, 139 – 40, 142, 144, 150 – 51, 161, 167 – 68, 179 – 80, 224, 226 – 29, 236, 408 – 9, 443, 447, 450, 454, 493, 498, 501, 512, 514, 516 – 17 Collecting the Performative, 515 – 16 Reshaping the Collectible, 168, 517 TBC (Time Base Corrector), 104, 109, 277, 279, 282 – 83, 293 Team Media at SFMOMA. See SFMOMA Team Media Technical Rider. See installation instructions for time-based media art Thiesen, Milo, 133 – 35 Time Base Corrector. See TBC time-based media conservation lab. See media conservation lab time-based media working group, 5, 7, 12, 44, 48, 53, 62 – 63, 131, 133, 141 – 42, 148, 154 Tiravanija, Rirkrit, 513 Trabandt, Elke, 422, 450 training in time-based media conservation, 80, 139 – 53 TRANSFER Gallery, 28, 34, 36 Tuerlinckx, Joëlle, 24 Ulinskas, Moriah, 67, 71 – 74 Ulman, Amalia, 513 U-matic 3/4″. See video formats, U-matic 3/4″ Vadakan, Pamela, 67, 71, 74 Van Abbemuseum (Eindhoven), 514 van de Vall, Renée, 13 – 27 van Saaze, Vivian, 24, 36, 128, 141, 168, 516 Variable Media Initiative. See Guggenheim Museum, Variable Media Initiative Vasulka, Woody and Steina, 105, 270 Ventur, Conrad, 68, 82 – 83, 85 version control, 179, 193, 456 Versteeg, Siebren, 249 – 51, 254, 259 – 60 VHS/S-VHS. See video formats, VHS/S-VHS


video tape binder hydrolysis. See sticky shed syndrome cleaning, 106, 109, 278, 281 digitization. See digitization, of video tape video tape recorder. See VTR vinegar syndrome. See film, acetate base deterioration Viola, Bill, 224, 239 virtualization, 205, 215, 219, 454, 499 – 501 virtual reality. See VR Visual AIDS, 68, 82 – 85 Vitale, Timothy, 450 Vlaminck, Matthieu, 480 Vogel, Hannes, 270 Vostell, Wolf, 270 VR (virtual reality), 8, 167, 177, 470, 493 – 94 VTR (video tape recorder), 103, 277, 293, 296 – 99, 305

Victoria Miro Gallery, 240 video cassettes, 269, 272, 278, 281, 289, 293, 296 – 97, 305 codec, 214, 287 – 88, 294, 335, 339 deinterlacing, 300 – 301, 310 – 11 interlacing, 277, 287 – 88, 293, 300 – 302, 304, 310 – 11 letterboxing, 311 pillarboxing, 311 projection, 228, 241, 275, 302 – 4 signal, 104, 207, 271, 277, 279, 282, 290, 292 – 93, 295, 302 – 4, 309 storage. See storage, video tapes and disks transcoding, 103, 128, 283, 287 – 89, 310 video artworks, 99, 104, 269, 271 – 76, 278 – 79, 282 – 83, 289, 295, 298 – 302, 304 – 6, 308 – 9, 311 – 12, 335 video displays CRT. See CRT (cathode ray tube), monitors LCD, 108, 269, 303 OLED, 303 plasma, 269, 302 – 3, 309 video file formats (encodings), 128 – 29, 132, 179 – 80, 211, 213 – 14, 216 – 17, 273, 275, 292 – 93, 335, 392, 416, 483, 487 Apple ProRes, 191, 199, 201, 274, 276, 278, 287 – 88, 300, 387 – 88 DPX, 300 – 301, 385 – 86, 392 DV, 282, 294, 300 – 301 FFV1, 282, 288, 300 – 301, 392 H.264, 214, 274 – 76, 278, 283, 287 – 88, 295, 300 – 301, 310, 388 Uncompressed, 8, 85, 191, 199, 201, 274, 276, 278, 287 – 89, 295, 299 – 301, 310, 387 – 88 video formats, 9 – 10, 55, 272 – 73, 276, 279 – 80, 283, 292 – 301 Betacam/Betacam SP, 103, 279, 292 – 93, 296 – 97, 299 Blu-ray, 204, 276, 290, 305, 334, 387 Digital Betacam, 103, 273, 279, 282, 293, 296 – 99 DV/DVCAM, 109, 271, 279, 282, 293 – 94, 298 – 301 DVD, 40, 204, 207, 216 – 17, 276, 293, 298, 300, 306, 335 HDCAM/HDCAM SR, 273, 283, 298 – 99 LaserDisc, 239, 272, 276, 292, 298 – 99, 305 – 6 open reel, 278, 283, 289, 296 – 97 U-matic 3/4″, 272, 279, 281 – 82, 292, 296 – 97, 305 VHS/S-VHS, 271 – 72, 293 – 94, 296 – 99, 305 video projectors, 109, 272, 304, 348 CRT. See CRT (cathode ray tube), projectors DLP, 245, 303 – 4 laser, 304 LCD, 303

Warda, Jeffrey, 406 – 52 Warhol, Andy, 397, 400 Wearing, Gillian, 239 web archiving, 492 web-based artworks, 453 – 54, 461, 465, 477, 484, 488, 492, 497, 499 – 500, 504 – 5 Weiss, Grace T., 154, 159 – 60, 194 Weisser, Andreas, 11, 95, 104 Wharton, Glenn, 33, 39 – 66, 118 – 20, 180, 485 – 86 White, Layna, 154, 157 – 59 Whitman-Salkin, Sarah, 401 Whitney Museum of American Art, 28, 36, 95 – 96, 105, 149, 254, 486 Wielocha, Aga, 45, 57, 95, 100 Wijers, Gaby, 95, 249 – 50, 255, 257, 259, 261 wiki documentation, 38, 167, 178 – 79, 257, 490 wiring diagrams, 7, 173, 175, 229, 475 Wojnarowicz, David, 193 Wood, David, 240 – 41, 245 work-defining properties, 15 – 17, 21, 225, 228, 459 – 61, 463 – 65, 497 write blockers, 47, 58, 98, 209 – 11, 220, 276 Wurm, Erwin, 513 XFR Collective, 67 – 69, 72, 80 – 83, 85 – 86 Yalter, Nil, 258 Zabaleta, Guillermo, 67, 77 – 78 Zimbardo, Tanya, 154, 158 – 61 ZKM Center for Art and Media, 95, 104, 231, 478 ZKM Laboratory for Antiquated Video Systems, 96, 104
