Engineering and Philosophy: Reimagining Technology and Social Progress



Table of Contents
Contents
About the Editors
Chapter 1: Reimagining Conceptions of Technological and Societal Progress
1.1 Introduction
1.2 Why Philosophy and Engineering?
1.2.1 Structure of the Book
1.3 Section 1: Technological Progress
1.3.1 Part I: Reimagining How Engineering Relates to the Sciences
1.3.2 Part II: Re-imagining Engineering Epistemology and Reasoning
1.4 Section 2: Social Progress
1.4.1 Part III: Reimagining Values and Culture in Engineering and Engineered Systems
1.4.2 Part IV: Reimagining Social Progress Through Engineers’ Ethical Principles
1.5 Section 3: The Connection Between Engineering and Social Progress
1.5.1 Part V: Re-imagining How Engineering Relates to Complex Sociotechnical Systems
1.5.2 Part VI: Reimagining Social Progress in Democracy, and the Need to Align Engineering to Social Values
1.6 Part VII: A Provocation – Reimagining the Limits of Philosophy and Knowledge Through Generic Design
1.7 On Progress for the Philosophy of Engineering
References
Part I: Technological Progress: Reimagining How Engineering Relates to the Sciences
Chapter 2: Engineering Design Principles in Natural and Artificial Systems: Generative Entrenchment and Modularity
2.1 Introduction
2.2 Generative Entrenchment
2.2.1 Generative Entrenchment and Engineered Technological Systems
2.2.2 Entrenchment Can Drive Asymmetry and, thus, Diversity
2.2.3 Entrenchment and Bauplans: General Frameworks for Adaptive Radiations
2.3 Top-Down Modularity: The Emergence of Order from the Big Ball of Mud
2.3.1 How Modularity in Engineering Can Become Entrenched
2.4 Conclusion
References
Chapter 3: Technological Progress in the Life Sciences
3.1 Introduction
3.2 A History of Genetic Intervention
3.3 What’s a Technological Revolution Anyway?
3.4 Is CRISPR-Cas9 Really Revolutionary?
3.5 Why Tracking Innovation Matters
3.6 Conclusion
References
Part II: Technological Progress: Re-imagining Engineering Knowledge
Chapter 4: Philosophical Observations and Applications in Systems and Aerospace Engineering
4.1 Introduction
4.2 The Protagonists of Protagoras: Engineering Rhetoric
4.3 Aristotle’s Children: Teleology in Engineering
4.4 Euclid vs. Ptolemy: From Axiomatic to Model-Based Systems Engineering
4.5 Hermeneutics, Pragmatism, and the Theory of Fault Management
4.6 Conclusion
Chapter 5: Prehistoric Stone Tools and their Epistemic Complexity
5.1 Introduction
5.2 Section 1: Subjective vs Objective Perspectives of Knowledge
5.3 Section 2: How Does Knowledge Take on Material Forms?
5.4 Section 3: Early Stone Tools as Epistemically Complex Entities
5.5 Concluding Remarks
References
Chapter 6: Narrative and Epistemic Positioning: The Case of the Dandelion Pilot
6.1 Introduction
6.2 Plots and Askability
6.3 Interpretation and Synoptic Judgement
6.3.1 Flow Visualisation
6.3.2 Narrative Helping Others to See
6.4 Conclusion
References
Part III: Social Progress: Considering Engineers’ Ethical Principles
Chapter 7: Constructing Situated and Social Knowledge: Ethical, Sociological, and Phenomenological Factors in Technological Design
7.1 Introduction
7.2 Algorithmic Bias
7.2.1 Photographic Bias
7.2.2 Surveillance
7.2.3 Google’s Search
7.3 Intersubjective Intersections
7.3.1 GIGO (Garbage In, Garbage Out)
7.4 Conclusion
References
Chapter 8: Towards an Engineering Ethics with Non-engineers: How Western Engineering Ethics May Learn from Taiwan
8.1 Introduction
8.2 The Meanings of “Engineering”
8.3 An Engineering Ethics for “Non-engineers” as Well
8.4 From Ethics of Professional Engineers to Ethics of the Engineering Profession
8.5 Conclusion and Implications
References
Chapter 9: Broadening Engineering Identity: Moving beyond Problem Solving
9.1 Introduction
9.2 Solutions Versus Responses
9.3 Proposed Ontology
9.4 Response-Type Taxonomy
9.4.1 Negation or Elimination
9.4.2 Delay
9.4.3 Buffer
9.4.4 Avoidance
9.4.5 Denial
9.4.6 Reframe (the Challenge)
9.4.7 Discussion
9.5 Nature of the Sustainability Challenge
9.6 Ontology-Epistemology-Pedagogy
9.7 Curriculum Implications
9.8 Summary and Conclusions
References
Part IV: Reimagining Values and Culture in Engineering and Engineered Systems
Chapter 10: Engineering, Judgement and Engineering Judgement: A Proposed Definition
10.1 Methodology of This Paper
10.2 Phronesis, a First Cut Definition
10.3 Why Phronesis?
10.4 What Is Phronesis?
10.5 Phronesis and Other Virtues
10.6 A “Rational Quality”
10.7 The Case for Adding Eustoxia to Our Definition of Engineering Judgement
10.8 A Problem of Scope: Phronesis Excludes the Act of Making Things
10.9 Aristotle’s Categorization of Virtues
10.10 Another Problem of Scope: Universal Applicability of Phronesis vs. Specific Applicability of Judgement
10.11 Phronesis: Summary
10.12 Definition of Engineering Judgement – Second Cut
10.13 Creativity in Engineering
10.14 Uncertainty and the Problem of Truth
10.15 Koen’s Postmodern View of Truth vs. that of the Typical Engineer
10.16 Truth vs. Optimization
10.17 Is Engineering Judgement a Rational Quality?
10.18 Summary of Lessons Learned from Consideration of Koen’s Method
10.19 Engineering Judgement – Third Cut
10.20 What Is Unique About Engineering Judgement?
10.21 How Is Engineering Different?
10.22 Engineering Judgement – Final Cut (for This Chapter)
References
Chapter 11: Technology, Uncertainty, and the Good Life: A Stoic Perspective
11.1 Introduction
11.1.1 Technology and Uncertainty
11.1.2 Technology and Human Fulfillment
11.1.3 Sustainability and Resilience
11.2 Relevant Stoic Concepts and Arguments
11.2.1 Eudaimonia
11.2.2 Global Concern
11.2.3 Fate
11.2.4 Avoiding Judgment
11.2.5 Visualizing and Growing from Adversity
11.2.6 Phronesis
11.3 Application of the Concepts to the Proposed Challenges
11.4 Conclusion
References
Part V: Re-imagining How Engineering Relates to Complex Sociotechnical Systems
Chapter 12: The Impact of Robot Companions on the Moral Development of Children
12.1 Introduction
12.2 Virtues
12.3 Types of Robot Companions
12.4 Potential Justifications for the Use of Robots with Children
12.5 Types of Interactions
12.6 Encouraging Prosocial Behavior
12.7 Discouraging Antisocial Behavior
12.8 Ethical Concerns with the Strategy
12.9 Other Related Ethical Objections
12.10 Conclusion
References
Chapter 13: Engineering Our Selves: Morphological Freedom and the Myth of Multiplicity
13.1 Introduction
13.2 Part 1: Transhumanism, and the Quest to (Re-)Engineer the Body
13.3 Part 2: Bills of Rights and Engineering
13.4 Part 3: Streamlining and Eugenics
13.5 Part 4: A Deeper Dive into Morphological Freedom
13.5.1 The First Clause
13.5.2 The Third Clause
13.5.3 The Fourth Clause
13.6 Conclusions
References
Part VI: Reimagining Social Progress in Democracy, and the Need to Align Engineering to Social Values
Chapter 14: Shared Learning to Explore the Philosophies, Policies and Practices of Engineering: The Case of the Atlantic Coast Pipeline
14.1 Introduction
14.2 Shared Learning
14.3 Research Design
14.3.1 Case Context: Atlantic Coast Pipeline
14.3.2 Research Team and Structure
14.3.3 Research Activities
14.4 Findings: Confronting Disengagement via Shared Learning
14.4.1 Dualism and Socio-technical Systems
14.4.2 Meritocracy and Pipeline Planning
14.4.3 Politicization
14.5 Concluding Points
References
Chapter 15: Middle Grounds: Art and Pluralism
15.1 Introduction
15.1.1 Background
15.1.2 Intersection of Art & Technology
15.1.3 Overview of Artist Duo Caitlin & Misha
15.1.4 Repurposing Technology via the Worries Bash and Other Projects
15.2 Creating Space
15.2.1 Space that Is Open to the Public
15.2.2 Creating Spaces for Shared Experiences
15.3 Shaping Culture
15.3.1 Possibility Space
15.3.2 Possibility Space of the Mobile Sauna and Sweat Battery
15.3.3 Possibility Space of the Shareable Biome
15.3.4 Possibility Space of Worries Bash
15.3.5 Possibility Space of the Pink Noise Salon
15.3.6 Possibility Space of Total Jump
15.4 Conclusion
References
Chapter 16: The Artefact on Stage – Object Theatre and Philosophy of Engineering and Technology
16.1 Landmarks of Engineering
16.2 Objects, Puppets and Theatre
16.3 Technology on Stage
16.3.1 Case 1: The Second Reality
16.3.2 Case 2: Eliza – Uncanny Love
16.4 Philosophical Experiences
16.5 The Mirror Image
References
Chapter 17: Imagined Systems: How the Speculative Novel Infomocracy Offers a Simulation of the Relationship Between Democracy, Technology, and Society
17.1 Introduction
17.2 How Science Fiction Can Inform Policy
17.3 On the Relationship among Technology, Society and Democracy
17.4 The World of Infomocracy
17.5 How Infomocracy Embodies Forecasting, Values Reflection and Governance
17.6 Infomocracy as a Governance Simulation for Re-engineering the Relationship Between Technology and Society
References
Part VII: A Provocation
Chapter 18: The Discrete Scaffold for Generic Design, an Interdisciplinary Craft Work for the Future
18.1 Introduction
18.1.1 Views of the Generic
18.1.2 Scaffolding for Construction Crafts
18.1.3 Scaffolding for Generic Design
18.2 Building a Conceptual Framework for Generic Design
18.2.1 Generic Epistemology and Generic Space
18.2.2 Situating Generic Design in Craftwork
18.2.3 The Role of Shared Memory in Generic Design
18.3 Approach: Generic Design, a Theory/Practice Framework
18.4 Poincare’s Partially Explicit Generic Design
18.4.1 Poincare: Interdisciplinarian and Precursor to Generic Design
18.4.2 Poincare’s Synoptic Ordering of Science Conjoined with Philosophy in Popular Science Writing
18.4.3 Poincaré’s Ordering of Disciplines as Conceptual Scaffolding
18.5 Historical Epistemology of Einstein’s Breakthroughs in Physics: Viewed as Generic Design
18.6 Extending Poincare and Einstein
18.7 A Generic Design Perspective on Engineering
18.8 Dimensional Analysis: A Case of Multi-authored Generic Design
18.9 Conclusion
References

Philosophy of Engineering and Technology

Zachary Pirtle David Tomblin Guru Madhavan  Editors

Engineering and Philosophy: Reimagining Technology and Social Progress

Philosophy of Engineering and Technology Volume 37

Editor-in-Chief
Pieter E. Vermaas, Delft University of Technology, The Netherlands

Editors
Darryl Cressman, Maastricht University, The Netherlands
Neelke Doorn, Delft University of Technology, The Netherlands
Byron Newberry, Baylor University, U.S.A

Editorial Advisory Board
Philip Brey, Twente University, The Netherlands
Louis Bucciarelli, Massachusetts Institute of Technology, U.S.A
Michael Davis, Illinois Institute of Technology, U.S.A
Paul Durbin, University of Delaware, U.S.A
Andrew Feenberg, Simon Fraser University, Canada
Luciano Floridi, University of Hertfordshire & University of Oxford, UK
Jun Fudano, Kanazawa Institute of Technology, Japan
Craig Hanks, Texas State University, U.S.A
Sven Ove Hansson, Royal Institute of Technology, Sweden
Vincent F. Hendricks, University of Copenhagen, Denmark & Columbia University, U.S.A
Don Ihde, Stony Brook University, U.S.A
Billy V. Koen, University of Texas, U.S.A
Peter Kroes, Delft University of Technology, The Netherlands
Sylvain Lavelle, ICAM-Polytechnicum, France
Michael Lynch, Cornell University, U.S.A
Anthonie Meijers, Eindhoven University of Technology, The Netherlands
Sir Duncan Michael, Ove Arup Foundation, UK
Carl Mitcham, Colorado School of Mines, U.S.A
Helen Nissenbaum, New York University, U.S.A
Alfred Nordmann, Technische Universität Darmstadt, Germany
Joseph Pitt, Virginia Tech, U.S.A
Ibo van de Poel, Delft University of Technology, The Netherlands
Daniel Sarewitz, Arizona State University, U.S.A
Jon A. Schmidt, Burns & McDonnell, U.S.A
Peter Simons, Trinity College Dublin, Ireland
Jeroen van den Hoven, Delft University of Technology, The Netherlands
John Weckert, Charles Sturt University, Australia

The Philosophy of Engineering and Technology book series provides the multifaceted and rapidly growing discipline of philosophy of technology with a central overarching and integrative platform. Specifically it publishes edited volumes and monographs in:

• the phenomenology, anthropology and socio-politics of technology and engineering
• the emergent fields of the ontology and epistemology of artifacts, design, knowledge bases, and instrumentation
• engineering ethics and the ethics of specific technologies ranging from nuclear technologies to the converging nano-, bio-, information and cognitive technologies

written from philosophical and practitioners’ perspectives and authored by philosophers and practitioners. The series also welcomes proposals that bring these fields together or advance philosophy of engineering and technology in other integrative ways.

Proposals should include: a short synopsis of the work or the introduction chapter, the proposed Table of Contents, the CV of the lead author(s), and, if available, one sample chapter. We aim to make a first decision within 1 month of submission. In case of a positive first decision the work will be provisionally contracted: the final decision about publication will depend upon the result of the anonymous peer review of the complete manuscript. We aim to have the complete work peer-reviewed within 3 months of submission. The series discourages the submission of manuscripts that contain reprints of previously published material and/or manuscripts that are below 150 pages/75,000 words.

For inquiries and submission of proposals authors can contact the editor-in-chief Pieter Vermaas via: [email protected], or contact one of the associate editors. More information about this series at http://www.springer.com/series/8657

Zachary Pirtle • David Tomblin • Guru Madhavan, Editors

Engineering and Philosophy: Reimagining Technology and Social Progress

Editors Zachary Pirtle Independent Scholar Washington, DC, USA

David Tomblin University of Maryland College Park, MD, USA

Guru Madhavan National Academy of Engineering Washington, DC, USA

ISSN 1879-7202  ISSN 1879-7210 (electronic)
Philosophy of Engineering and Technology
ISBN 978-3-030-70098-0  ISBN 978-3-030-70099-7 (eBook)
https://doi.org/10.1007/978-3-030-70099-7

© Springer Nature Switzerland AG 2021

This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

This Springer imprint is published by the registered company Springer Nature Switzerland AG. The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland.

Contents

1  Reimagining Conceptions of Technological and Societal Progress (Zachary Pirtle, David Tomblin, and Guru Madhavan)  1

Part I: Technological Progress: Reimagining How Engineering Relates to the Sciences

2  Engineering Design Principles in Natural and Artificial Systems: Generative Entrenchment and Modularity (William C. Wimsatt)  25

3  Technological Progress in the Life Sciences (Janella Baxter)  53

Part II: Technological Progress: Re-imagining Engineering Knowledge

4  Philosophical Observations and Applications in Systems and Aerospace Engineering (Stephen B. Johnson)  83

5  Prehistoric Stone Tools and their Epistemic Complexity (Manjari Chakraborty)  101

6  Narrative and Epistemic Positioning: The Case of the Dandelion Pilot (Dominic J. Berry)  123

Part III: Social Progress: Considering Engineers’ Ethical Principles

7  Constructing Situated and Social Knowledge: Ethical, Sociological, and Phenomenological Factors in Technological Design (Damien Patrick Williams)  143


8  Towards an Engineering Ethics with Non-engineers: How Western Engineering Ethics May Learn from Taiwan (Bono Po-Jen Shih)  161

9  Broadening Engineering Identity: Moving beyond Problem Solving (Thomas Siller, Gerry Johnson, and Russell Korte)  181

Part IV: Reimagining Values and Culture in Engineering and Engineered Systems

10  Engineering, Judgement and Engineering Judgement: A Proposed Definition (Daniel McLaughlin)  199

11  Technology, Uncertainty, and the Good Life: A Stoic Perspective (Tonatiuh Rodriguez-Nikl)  219

Part V: Re-imagining How Engineering Relates to Complex Sociotechnical Systems

12  The Impact of Robot Companions on the Moral Development of Children (Yvette Pearson and Jason Borenstein)  237

13  Engineering Our Selves: Morphological Freedom and the Myth of Multiplicity (Joshua Earle)  249

Part VI: Reimagining Social Progress in Democracy, and the Need to Align Engineering to Social Values

14  Shared Learning to Explore the Philosophies, Policies and Practices of Engineering: The Case of the Atlantic Coast Pipeline (Rider W. Foley and Elise Barrella)  271

15  Middle Grounds: Art and Pluralism (Caitlin Foley and Misha Rabinovich)  291

16  The Artefact on Stage – Object Theatre and Philosophy of Engineering and Technology (Albrecht Fritzsche)  309


17  Imagined Systems: How the Speculative Novel Infomocracy Offers a Simulation of the Relationship Between Democracy, Technology, and Society (Malka Older and Zachary Pirtle)  323

Part VII: A Provocation

18  The Discrete Scaffold for Generic Design, an Interdisciplinary Craft Work for the Future (Ira Monarch, Eswaran Subrahmanian, Anne-Françoise Schmid, and Muriel Mambrini-Doudet)  343

About the Editors

Zachary Pirtle is a researcher of systems engineering and philosophy based in Washington, D.C., as well as a program executive and engineer enabling science and human exploration on the Moon.

David Tomblin is director of the Science, Technology and Society program at the University of Maryland, College Park.

Guru Madhavan is the Norman R. Augustine Senior Scholar and senior director of programs at the National Academy of Engineering, Washington, D.C.

The views expressed in this volume are those of the individual authors and not necessarily of the editors and their respective employers.


Chapter 1: Reimagining Conceptions of Technological and Societal Progress

Zachary Pirtle, David Tomblin, and Guru Madhavan

Abstract  Engineers love to build ‘things’ and have an innate sense of wanting to help society. However, these desires are often not connected or developed through reflections on the complexities of philosophy, biology, economics, politics, environment, and culture. To guide future efforts and to best bring about human flourishing and a just world, our volume, Engineering and Philosophy: Reimagining Technology and Social Progress, brings together practitioners and scholars to inspire deeper conversations on the nature and varieties of engineering. The perspectives in this book are an act of reimagination: how does engineering work, how does it serve society, and, in a vital sense, how should it? Our introductory chapter builds on the book’s perspectives to reframe notions of both technological and societal progress and the connection between the two. While there have long been philosophical and science and technology studies literatures that provide deeper perspectives on engineering, we worry that little of that reflection has actually shifted the practice and trajectory of engineering in a way that can better support societal progress. We seek to highlight how ‘reimagined’ conceptions of technological and societal progress can serve to provoke critical reflection about engineering among both engineers and everyday citizens. We conclude with a comment on what progress for the philosophy of engineering should look like, noting the need for both broad engagement and fundamental conceptual shifts.

Keywords  Philosophy of engineering · Engineering practice · Complex systems · Social responsibility

Z. Pirtle (*)
Independent Scholar, Washington, DC, USA

D. Tomblin
Science, Technology and Society Program, University of Maryland, College Park, MD, USA

G. Madhavan
National Academy of Engineering, Washington, DC, USA

© Springer Nature Switzerland AG 2021
Z. Pirtle et al. (eds.), Engineering and Philosophy, Philosophy of Engineering and Technology 37, https://doi.org/10.1007/978-3-030-70099-7_1



1.1 Introduction

How does engineering contribute to society’s progress? In a vital sense, how should it? Building on the perspectives in this book, we argue that there is a need to reimagine engineering and the relationship between technology and society. To buttress this argument, we draw on well-established areas in the philosophy of engineering as well as science and technology studies (STS), which address how technology co-evolves with science and society (Mitcham 2019; Johnson and Wetmore 2008; Jasanoff 2004; Vincenti 1990; Shrader-Frechette and Westra 1997; Vallor 2016).2 However, as researchers and practitioners, we worry that very little of this critical reflection on engineering has yet permeated engineering practice or public discourse.3 Our volume identifies new ways to pursue change in how engineering works, who it serves, and the future paths it could follow. Using the term broadly, we seek to highlight how ‘reimagined’ conceptions of technological and societal progress can serve to provoke critical reflection about engineering among both engineers and everyday citizens. We use philosophy as a way to encourage inclusive reflection about engineering from diverse areas of expertise (engineers, policymakers, STS scholars, artists, and other humanists) and many sectors of life (government, industry, academia and non-profits), bringing a variety of values and perspectives into discussion with one another. True and meaningful change will require dialogue around multiple ideas through broad societal engagement. Deep change requires major conceptual shifts, or ‘reimaginings’, of the sort outlined in this volume.4

1  All opinions expressed both in this introduction and in the subsequent chapters are those of the individual authors, and do not necessarily represent the views of any affiliated organizations or their employers. We’re also thankful to the Springer Philosophy of Engineering and Technology (POET) series editor-in-chief Pieter Vermaas for his support and insight. This introduction and the overall volume were greatly helped by an anonymous reviewer’s comments. For enabling our hosting of the fPET2018 event, we also appreciate support from the fPET steering committee, especially including Diane Michelfelder, Rick Evans and David Goldberg. Pirtle is especially grateful for the help of the space policy practitioner Katelyn Kuhl, both in supporting the organization of fPET2018 and in strategically advising and supporting the process to create this book.
2  To underscore the length of this history of reflection on engineering: the Society for Philosophy and Technology began with its first meeting in 1976. Much of the field of science and technology studies (STS) began in the 1970s, with the Society for Social Studies of Science (4S) being founded in 1975. The history goes much deeper still: Layton (1971) traces the history of engineers’ ethical debates in the early 1900s, and Mitcham (2019) points to much earlier antecedents as well.
3  This is partly based on our experience of seeing awareness of philosophy of technology and STS in Washington, DC policy circles. This seems broadly true of science policy professionals in the US.
4  Our use of the phrase ‘reimagining’ is based on a general recognition of the potential value of major conceptual changes, here focused on technology and progress. We do not intend to ascribe to Jasanoff and Kim (2015), who refer to sociotechnical imaginaries, but recommend their work. A new account that we are just now studying is Schatzberg (2018), which likewise hopes to change action by revising our concepts of technology, in his case by highlighting cultural dimensions of technology in addition to instrumental ones.


1.2 Why Philosophy and Engineering?

We strongly believe that philosophy offers key insights into how humanity can go about reimagining technology and society, involving a broad range of issues surrounding engineering. Research on the epistemological, ontological, historical, and ethical foundations of engineering helps to improve our understanding of the processes that comprise engineering itself, from design to operations to maintenance.5 All of these are key pieces of what many would frame as an overall ‘philosophy of engineering,’ which builds on work in the philosophy of technology (Mitcham 2019). The “knowledge” created and required by engineers presents seemingly unique challenges, as they engage in work that is at times hard to capture with language or precise description, and which the practitioners themselves often do not care to represent. Insights bridging philosophy and engineering provide us with better guidance for performing engineering that leads to better outcomes for society (Allenby and Sarewitz 2011). More cogent depictions of engineering can guide our understanding of our increasingly “tech centric” world, and can contribute to the management and governance of engineering developments, which in turn provides a foundation for more responsible forms of engineering (Vallor 2016; Owen et al. 2013). Bringing scholars and practitioners of engineering to the same table to exchange ideas and tacit knowledge related to these issues is key to developing more comprehensive philosophical answers, but it also ushers insights from philosophy into the core concepts and practice of engineering, and vice versa.

In Engineering and Philosophy: Reimagining Technology and Social Progress, we bring together perspectives from a diverse community of philosophers, engineers, scientists and public policy practitioners. The specific thoughts here began with the 2018 Forum on Philosophy, Engineering, and Technology (fPET), which we (Pirtle and Madhavan) co-chaired and (Tomblin) hosted at the University of Maryland, College Park, a continuation of a dialogue that began in 2007 at the Delft University of Technology.6 Our goal in selecting chapters for this volume is to deepen and advance the engagement of the philosophy of engineering with the many forms of engineering practice. This is necessary to reflect on the nature of engineering work, and to rethink technology and social progress beyond the traditions and constraints of our current thinking. To become accustomed to engaging with philosophy and with engineering, we have found that engineers and philosophers both benefit from forums like fPET, which represent a shared desire amongst philosophers and engineers to learn from each other.7

5  Epistemology is the structured study of knowledge and how we obtain it; ontology refers to the nature of engineering and of engineering artifacts.
6  Interested readers who are intrigued by philosophy and engineering would do well to explore future biennial meetings of fPET, as well as meetings of the Council on Engineering Systems Universities and the Society for Philosophy and Technology. fPET follows in the tradition of the 2007 Workshop on Philosophy and Engineering (WPE) at Technical University Delft and the 2008 WPE meeting at the U.K. Royal Academy of Engineering. After rebranding to fPET, a 2010 meeting was held at the Colorado School of Mines, a 2012 meeting at the Graduate University of the Chinese Academy of Sciences in Beijing, a 2014 meeting at Virginia Tech, and a 2016 meeting at Friedrich-Alexander University Erlangen-Nuremberg. A virtual meeting occurred in 2020, and a 2022 meeting is being planned at https://philosophyengineering.com/


These chapters collectively pose challenges that engineers and all members of society should think about, especially where engineering as a discipline (and set of disciplines) should focus and engage, and what the ultimate goals for engineering should be. How should engineers and the engineering profession in general more deeply define and encourage social progress? As we will discuss, there are caricatures of both technological and social progress that we must move beyond. Reimagined conceptions need to be more proactive, as engineering systems can have major unintended consequences, both good and bad, which motivates early study of engineering artifacts. To successfully intervene and improve the world, engineers need to be reflective about how to develop technologies that can work with society, to know about society and its values, and to understand how all of it relates. To have meaningful moral agency, engineers need to understand the complex issues that surround their work (Shrader-Frechette and Westra 1997). But engaging deeply with complex adaptive systems behooves everyone to slow down and establish some inquisitive level of understanding of many traditions of philosophy, from ethics to epistemology, metaphysics, philosophy of language, and philosophy of biology, as well as of how the humanities can inform and engage with engineering systems. All of these topics are touched on in various ways within this volume.

1.2.1 Structure of the Book

The book is divided into the following main content areas:

• Section 1: Technological Progress
–– Part I: Reimagining How Engineering Relates to the Sciences
–– Part II: Re-imagining Engineering Knowledge
• Section 2: Social Progress
–– Part III: Reimagining Social Progress Through Engineers’ Ethical Principles
–– Part IV: Reimagining Values and Culture in Engineering and Engineered Systems
• Section 3: How Engineering Relates to Social Progress
–– Part V: Re-imagining How Engineering Relates to Complex Sociotechnical Systems
–– Part VI: Reimagining Social Progress in Democracy, and the Need to Align Engineering to Social Values
• Part VII: A Provocation: Reimagining the Limits of Philosophy and Knowledge Through Generic Design

7  We are grateful to the over one hundred attendees at fPET 2018 for lively and fulfilling conversations, and also to our keynote speakers Robyn Gatens, William Wimsatt, Daniel Sarewitz and Malka Older.

1.3 Section 1: Technological Progress

What would it mean to reimagine technological progress? We focus here on relatively advanced technologies that are often shaped by engineers, though of course technology can touch on a broad range of arts, crafts, and performance.8 There is a popular caricature of technological progress as a cascade of new “innovations,” which forms an implicit notion of technological progress. This view is sometimes pushed to excess, with some commentators talking about empty ‘innovation speak’ drowning out intelligent conversation about the use of technology (Vinsel and Russell 2020). A myopic view of engineering as one piece of a chain of endless innovations does not properly appreciate the nuances of how engineering works, and calling for disruptive innovation does not prepare society well to solve major problems.

A rich understanding of engineering, one that can help it solve problems and promote well-being, does appear to be lacking. While the public understanding of science has increasingly professionalized, public understanding of engineering has barely begun to recognize the complexity of how engineering works. While engineers create technology, the way they go about doing so is much deeper and richer than the popular conception of engineers performing ‘applied science’ recognizes (Vincenti 1990). While there is a rich history and much social science research on engineering, the question and history of how engineering should relate to society is rarely taught to engineers (Bud 2012; Wisnioski 2012).

Indeed, misconceptions about engineering persist at the national policy making level. Odumosu and Narayanamurti (2016) argue that the pillars of American innovation policy still place far too much emphasis on the standard distinctions of basic and applied science, and miss out on how innovation occurs at a systemic level. They make a powerful argument that the way innovation policy is discussed at the federal level rests on a large misconception about how innovation works. The need to infuse STS and philosophy of engineering insights into engineering policy means there could be great opportunity to reconceive the nature of the nation’s engineering programs, potentially allowing for different and more helpful programs to be executed. The reimagining of engineering and technological progress that is needed requires a deeper sense of what engineering is, to allow policy and management decisions about engineering to be based on a deep foundation of social science knowledge. There is a need to get the right sorts of knowledge that allow engineering to advance and solve problems, recognizing the diversity of engineering activity.9

There are many varied definitions of technological progress that can be pursued: a desire for more efficient technology that keeps productivity high; technology as a knowledge stream; or the care and maintenance of technology as a core goal (Vinsel and Russell 2020). Our main plea for reimagining technological progress is grounded in wanting an engineering that is capable of helping society solve major problems and grow in ways that encourage human flourishing.

8  See Mitcham 2019 for further reflection on important nuances between technology and engineering that we gloss over here. We imagine that it would make for interesting work to look at notions of progress in technological areas that involve no engineers!
9  A standard academic way of describing progress in philosophy and science is through increased problem-solving capability. This is a sense that goes back at least to Kuhn and Laudan.

1.3.1 Part I: Reimagining How Engineering Relates to the Sciences

Context: Engineers actively create systems, doing so in ways that rely upon complex design processes that are nuanced and in some respects distinct from science. A central part of the philosophy of engineering has long focused on the nature of design (Kroes 2002) and its connection to how all of society shapes and is shaped by engineering systems. The naïve conception of engineering as merely applied design gets embedded not only in engineering work but also into the supposed structure of engineering knowledge and the scientific principles that engineers bring to bear. The work that engineers do to design, and to successfully operate, new systems is unique, and shows that engineering is different from the popular caricature of mere ‘applied science’. Notably, Walter Vincenti has argued that engineering knowledge can and does have its own unique features and implications that are distinct from other areas of science.

In “Engineering Design Principles in Natural and Artificial Systems: Generative Entrenchment and Modularity,” the philosopher William Wimsatt offers design principles that he views as important for engineered, natural and social systems. He systematically examines how engineered and biological systems can have deep similarities in how underlying concepts and ideas become key to their functions. He captures this with the concept of generative entrenchment, which describes how initial features of a system become self-reinforcing over time. He discusses how entrenchment shapes maintenance and adaptations in complex systems, and offers considerations for how engineers and designers should think about the long-term consequences of the decisions they make. Overall, Wimsatt helps to show us how the study of engineering could inform deeper study of both biology and philosophy, and shows that connections between engineering and other areas of science may be deeper than one first recognizes.

In “Technological Progress in the Life Sciences,” the philosopher Janella Baxter proposes a new standard for the meaning of a “revolutionary” technological breakthrough, going beyond discussions of knowledge. She uses Wimsatt’s (2007) notion of generative entrenchment to understand which technologies break out of previous approaches. She uses this framework to assess the development of CRISPR-Cas9 gene editing technologies, which have expanded humanity’s capability for modifying organisms. She helps us realize that our definitions of what counts as innovative and unique within engineering can be matters of convention, and we should choose to celebrate and encourage types of innovation that add a coherent addition to our overall set of knowledge. This helps to illustrate how the philosophy of engineering literature can inform broader contributions in the philosophy of synthetic biology.

Engineer, historian, and philosopher Stephen B. Johnson provides several insights into epistemology in his article, “Philosophical Observations and Applications in Systems and Aerospace Engineering.” He begins by discussing the nature of our knowledge about the world, including debates on whether scientific laws or models are the basic unit of scientific knowledge. He shows how notions of applying science from high-level laws may be too crude to meaningfully capture nuances in engineering. Within this frame, he discusses cases within engineering that offer ways in which engineers could (re)consider their work. He shows how concepts of teleology, hermeneutics, and other concepts apply to engineering, in ways that are not strict derivations from science.

1.3.2 Part II: Re-imagining Engineering Epistemology and Reasoning

Context: The nature of knowledge in engineering is also critical, and has long been a topic of study (Bunge 1966; Laudan 1984; Vincenti 1990). The importance of engineering knowledge can be motivated by the late engineer and historian Walter Vincenti’s truism: “what engineers do...depends on what they know.” Studying engineering knowledge is essential to understand what engineers do, and can also help discover rich nuances on where engineering knowledge comes from. The chapters in this section show the richness of engineering knowledge and reasoning patterns: how knowledge can be embedded in material things (akin to Davis Baird’s (2004) idea of ‘thing knowledge’) and be shaped by non-linear narrative devices.

The philosopher Manjari Chakraborty provides a critical lens for focusing on knowledge in engineering. In “Prehistoric Stone Tools and their Epistemic Complexity,” she explores the material dimensions of engineered artifacts and how knowledge can become embedded within artifacts as operational principles. In a case study about embedded knowledge in prehistoric artifacts, she puts them in the context of Popper’s conception of modes of knowledge. Her work underscores the need for engineers and scholars of engineering to think through how knowledge is embedded in artifacts. This conception of engineering’s worldview as involving knowledge embedded in physical artifacts represents a key way in which engineering knowledge is different from the public caricature of how engineering works.


In “Narrative and Epistemic Positioning: The Case of the Dandelion Pilot,” the historian Dominic Berry explores the role of narrative in engineering knowledge production, an important but previously undeveloped dimension of engineering. He uses a case study of engineers who are part of a multi-disciplinary team with mathematicians and biologists, all studying fluid mechanics surrounding a dandelion seed. Berry uses interviews and small periods of ethnography to study the ways they designed tests, evaluated data, and theorized, highlighting how narratives help engineers use data and understand events. While the public caricature of engineering sometimes involves the straight application of scientific laws, Berry helps to show how the analysis is non-linear and involves critical changes. This provides frameworks others can use in the study of engineering knowledge, but it also highlights how engineering does not work in terms of straight derivations from science. It involves uncovering, imagining, and considering how to approach key issues.

1.4 Section 2: Social Progress

What is social progress, such that we might reimagine it? A naïve popular caricature might imply that it is easy to measure benefits to society by looking at some objective quantitative measure such as economic growth or average lifespan. Likely every political view, be it ‘liberal or conservative’, has conceptions of societal good that can be pursued by individuals as well as through broader collective effort. However, deep conflict can arise in debates about social progress, especially when inspection starts to raise questions about social progress for whom, and at what cost and benefit. A great challenge for democratic societies is to learn to adjudicate terms for shared progress amongst many groups, while avoiding dangerous extremes. While it is outside our intent to fully reimagine social progress, we definitely subscribe to pluralistic visions for collective debate on what sort of good should be pursued (Pirtle and Tomblin 2017). This reimagined notion of social progress thus recognizes the complexity of social benefit, and the need for a more pluralistic democratic debate on the topic.

Engineers can relate to social progress in terms of their own ability to be ethical, reflective actors, in dialog with others, and seeking to provide value for society. If the progress we are interested in concerns challenging the self-conception of engineers and their ethical duties, then historians have raised serious doubts about whether engineers have increased their awareness of their societal obligations. It has long been clear that engineering is a force that transforms, and is transformed by, society, and holistic approaches to engineering and engineering systems have long been called for by engineers themselves (Cooke 1916; Mesthene 1969; NAE 2004). But these changes are often cyclical, and seem not to result in changes to engineering practice without fundamentally expanding how engineers think about their ethical obligations.

Written a generation apart, Edwin Layton (1971) and Matthew Wisnioski (2012) chronicled two separate waves of engineering reflection, from the 1910s–1930s and then the 1960s–1970s, respectively. In the first wave of engineering reflection in the 1910s, visions of technocracy and a desire to aid business development drove visions of engineering progress, with Layton seeing engineers’ desire to contribute to social progress as potentially self-serving, a way to put engineers in charge of society (Layton 1971). A second wave of engineering reflection occurred in the 1960s, amidst criticism of technology during the Vietnam War (Wisnioski 2012). In both waves, critics highlighted the negative effects of engineering artifacts alongside the positive, and engineers have long debated how to respond, drawing on a range of virtuous and conflicted rationales. What is most alarming across the two books is Wisnioski’s note about Layton’s work: “I became captivated by Revolt of the Engineers for the same reason as the president [Donald E. Marlowe] of the American Society of Mechanical Engineers (ASME), who, in his 1971 review of Layton’s book, concluded that ‘many of the papers and speeches cited could have been presented in the last few years without loss of impact’” (p. 5). For us reading this today, we feel alarm at these cyclical patterns, and worry about a lack of progress in engineering ethics from the 1910s through today, especially as most engineers are unaware of this history.

Wisnioski does note how the 1960s engineering ethics wave led many engineering societies to create technology and society divisions. However, with notable exceptions,10 many of these divisions began to veer into advocacy and public engagement focused on raising the status of the engineering disciplines in popular culture, and do not seem to have led to significant changes. Both Layton and Wisnioski cast skeptical tones about engineers’ ethical pursuits actually leading to societal progress. This speaks to the need for new, potentially re-imagined, approaches to how engineers seek to change and engage with society. We agree with others that engineers need to embrace a much expanded notion of engineering responsibility, going beyond narrow situational ethics and whistleblowing (Herkert 2005). There needs to be a broader constellation of actors involved in discussions on engineering, helping to infuse diverse values into the systems that engineers build, as well as the principles that engineers use, such as balance and sustainability. The following two sub-sections focus on the role values play in engineering systems and cultures, as well as in principles that engineers may want to consider.

10  We do recognize the IEEE Society for Social Implications of Technology as a uniquely deep and active source of reflection and potential change in engineering.

1.4.1 Part III: Reimagining Values and Culture in Engineering and Engineered Systems

Context: One key aspect in which social progress ties to engineering comes from the societal values that are embedded in engineering artifacts. There has been a sea change of thinking about how human values manifest in engineering and in engineered artifacts (Vallor 2016; Douglas 2009; Longino 1990). While some scholars like Douglas (2009) show roles for objectivity amidst important social values, the influence of values on thinking about the systems that we create and build is undeniable. Engineering is not some dispassionate and objective activity where the market decides how engineering activities and products get used. An ongoing topic in the philosophy of engineering has been how values get embedded in engineered systems. Understanding this topic is an entry point into ethically salient issues about engineering, but it also changes the nature of how engineers design, create, and use knowledge. Before trying to create positive change in the world, engineers should reflect on the values brought to bear by themselves and their forebears, and what biases (positive or negative) might be induced as a result. With these foundations in mind, engineers are then well prepared to see studies and reflections about how to be ethical actors within complex systems, as discussed in many of the chapters.

The philosopher and science and technology studies scholar Damien Patrick Williams offers a thorough and evocative description of how the values of human beings get instilled into technological systems. In “Constructing Situated and Social Knowledge: Ethical, Sociological, and Phenomenological Factors in Technological Design,” he focuses on algorithms used in a variety of technological systems, exploring how key biases penetrate into those systems, with significant consequences. Discussing photographic image assessment, search engine results and other examples, he shows how the choice of who participates in design and the values of those designers can lead to unjust societal outcomes. He provides a way of thinking for everyone, not just engineers, to reflect and call for a more just approach to engineering.

The science and technology studies scholar Bono Po-Jen Shih delves into how other values get embedded into engineering, in this case by focusing on values implicit in the language engineers use. In “Towards an Engineering Ethics with Non-engineers: How Western Engineering Ethics May Learn from Taiwan,” he focuses on the definition of “engineering” in Chinese languages and how it conveys meanings that are importantly different from Western contexts. Specifically, he reviews how the phrase gong cheng, commonly used to mean engineering, includes a broader and more diverse set of people, including laborers and others who work alongside engineers. Shih explores how this influences the discussions of engineering ethics in Taiwan, nudging ethical debates there to be more inclusive. He offers a major lesson for the United States: it can learn from Taiwan’s more expansive concept of engineering ethics, which can spare engineers from having to navigate cultural battles.

The engineers and social scientists Thomas Siller, Gerry Johnson, and Russell Korte provide an additional pedagogical reflection, noting that real-world problems do not conform easily to the well-defined problems seen in many engineering curricula. In “Broadening Engineering Identity: Moving Beyond Problem Solving,” they call upon engineers to reflect on their identity and how this influences how they solve problems. Using a framework promoting interaction between ontology, epistemology and pedagogy, they show the need to approach complex problems like sustainability from multiple dimensions that cannot be resolved into a tangible problem statement. They believe that a narrow definition of engineering as problem solving may prevent engineers from engaging in truly interdisciplinary work.

1.4.2 Part IV: Reimagining Social Progress Through Engineers’ Ethical Principles

Context: The earlier need to reimagine how engineers fit into social progress brings to mind an array of questions about how engineers can best consider ethics. The dominant approach toward engineering ethics is often too focused on whistleblowing and professional responsibility. Herkert (2005) distinguishes between microethics and macroethics, with the former being a focus on individual professional responsibility and the latter focused on understanding the larger societal implications of engineering. Understanding how broader values inform engineering would be part of the latter. Ensuring that engineers are taught about a variety of ethical principles that can be used to inform what they do is critical, and can provide ethical guidance in specific situations. Anchoring engineering ethics on a detailed discussion of engineering principles can be essential.

The engineer Dan McLaughlin discusses the nature of virtues in engineering and the way in which such values affect how engineers think. In “Engineering, Judgement and Engineering Judgement: A Proposed Definition,” he presents a series of reflections on how the ways in which engineers make decisions can differ from other types of judgment. He reviews longstanding debates advocating the importance of phronesis, the virtue that regulates other virtues such as temperance and courage, and questions whether it fully describes the type of thinking that engineers engage in. McLaughlin proposes focusing on the virtue of conjecture, or eustoxia, as a better way to describe the values that are fundamental to engineering. McLaughlin’s approach straddles Herkert’s notion of micro- and macro-ethics, as the balance he seeks in judgment can be applied to individual as well as societal contexts.

The engineering professor and philosopher Tonatiuh Rodriguez-Nikl discusses a strategy for confronting the complexity of our largely technological world by drawing upon Stoic philosophy. His “Technology, Uncertainty, and the Good Life: A Stoic Perspective” proposes to deepen the concept of eudaimonia, or human flourishing, to remind us all of the limits of our control in the world and the necessary role of luck in shaping all of our lives. He shows how his proposal for human flourishing can remind us to think more globally, and applies a framework for approaching important environmental challenges. In this sense, he is giving a set of macro-ethical principles for engineers. He concludes with an engineering meta-reflection on how ethical frameworks might be judged to have practical value.


1.5 Section 3: The Connection Between Engineering and Social Progress

What is the connection between technological progress and social progress? The naïve popular view might be that engineering and technological progress are linearly related to social progress, where increases in technology directly lead to benefits for society.11 Vannevar Bush’s ‘linear model of innovation’ might assume that engineering is just a pass-through as benefits accrue from science on their way to benefit all of society (Pirtle 2013). But as was noted above, technological progress via engineering is very complicated and non-linear, with many shades of complexity. And as also noted, social progress means different things to different individuals, and we subscribe to a pluralistic approach for adjudicating that good. The relation between technological progress and social progress involves channeling the complex and non-linear processes by which engineering materializes into pluralistic, and at times democratic, processes for how complex sociotechnical systems emerge.

Connecting engineering and social progress presents continued challenges in terms of how engineers can build systems that accomplish certain desired goals, as well as in how they can align with democratic outcomes. Many engineers have come to recognize the importance of the social dimensions of systems (De Weck et al. 2011; Madhavan 2015). Keeping engineering in close collaboration with a diverse set of stakeholders can help to ensure that engineers are more likely to meet some societal need, with stakeholder and public engagement helping to ensure that the requirements will actually meet a relevant need (Pirtle and Tomblin 2017; Madhavan, Poste, and Rouse 2020). These chapters discuss aspects of the complexity of sociotechnical systems with an eye to helping engineers decide what the right thing to do is when their decisions have a multitude of consequences and stakeholders. Discussion focused on the ethics of particular engineering innovations is key, as is the use of broad groups of stakeholders to examine the system.

11  This certainly seems to be the implied view of some innovation advocates, though this does not necessarily line up with historical reality (Vinsel and Russell 2020).

1.5.1 Part V: Re-imagining How Engineering Relates to Complex Sociotechnical Systems

Context: Engineers have had a tendency to implement technological fixes, but often find these fixes don’t meet the intended societal need (Sarewitz and Nelson 2008). Occasionally technological fixes for societal problems can work, as targeted vaccines have helped in eliminating smallpox, and airplanes enable structured mass travel. But in many cases technological fixes break down, and the relationship between technology and how to solve societal problems is often not linear or clear (Sarewitz and Nelson 2008). Having principled ways to ask ethical questions about the goals for engineering work is one critical way to avoid getting trapped in a techno-fix mindset.

1  Reimagining Conceptions of Technological and Societal Progress

13

the goals for engineering work is one critical way to avoid getting trapped in a techno-fix mindset. The chapters in this section describe instructive entry points for general ethical reflection about emerging areas of engineering systems. The philosophers Yvette Pearson and Jason Borenstein provide conceptual tools to understand the complex and potential interactions between humans and robots. In “The Impact of Robot Companions on the Moral Development of Children,” they focus on the interactions between robots and children, which offers a unique point for ethical reflection on the nature of robots in society. Their approach uses a virtue ethics lens to understand what virtues robots might instill into children, either through direct conversation or by reliance on robots. They offer a number of ethical points that they call for engineers and ethicists to engage in prior to the widespread use of robots for childcare. The science and technology studies scholar Joshua Earle turns to another complex issue, which is the topic of re-eengineering the human form and attempts to justify this through a Bill of Rights. In “Morphological Freedom and the Myth of Multiplicity,” he discusses transhumanists who advocate transforming the human mind and body through technology. Earle specifically focuses on the notion of morphological freedom, showing how transhumanists’ push for diverse lived experiences could be ethical but only if they acknowledge unjust issues in society today, and attempt to address and reconcile them. He contrasts efforts to found a right to re-engineer humans to the more common notion that engineers should feel free to engineer the world around them at large. Earle calls for pluralistic and equitable values to inform engineering goals generally.

1.5.2 Part VI: Reimagining Social Progress in Democracy, and the Need to Align Engineering to Social Values

Context: A key part of understanding social (or societal) progress is to think through engineers' relationship to democracy. This ties to the need to reimagine social progress as keeping close track of what society wants through the involvement of many stakeholders to direct engineering activities toward desirable societal outcomes. Creating engineering projects that align better with societal goals helps to ensure that engineers are more likely to solve societal problems, and not to create techno-fixes without any deeper ability to actually make an impact on society. Art helps us contemplate the meaning of technology, and can critically help draw more stakeholders into discussions about what engineering should be. Art has always played a key role both in informing engineering practice and in helping us understand the effects of engineering on society. This volume explores three ways in which art matters to the philosophy of engineering. These serve as invitations to reengage with what role engineering should play in broader culture, and ways to encourage everyone to think about what sort of world we want to collectively engineer together. The engineers and science and technology studies scholars Rider Foley and Elise Barrella offer a reflection on how co-learning with engineering students and citizens at large can help engineers to understand the full complexities of the systems they work with. In "Shared Learning to Explore the Philosophies, Policies and Practices of Engineering: The Case of the Atlantic Coast Pipeline," they discuss a research and policy project on the titled case. Drawing on sociologist Erin Cech's work on sources of disengagement in engineering, Foley and Barrella involve students and the community in research that leads toward more engagement as well as better research results, offering practical lessons for engineering and humanities scholars. Art is an important way to involve more people in dialog. In "Middle Grounds: Art and Pluralism," the artist duo Caitlin Foley and Misha Rabinovich reflect on the relationship of art to engineering. While many scholars recognize how technology informs art by offering up novel artistic tools, they seek a different approach. In their art projects, they illustrate how technologies can provide a communal space to reflect on the trajectory of society, and perhaps discover unknown aspects of technology and how it will shape and be shaped by society. In "The Artefact on Stage: Object Theatre and Philosophy of Engineering and Technology," the philosopher and engineer Albrecht Fritzche discusses several works of art, detailing in each how technology mediates the human experience. He discusses how engineers produce artifacts that affect all of society, and how it can be difficult to obtain an objective perspective on how technologies affect and shape society. He discusses an art troupe that actively reflects on the role of technology in our lives, noting how they draw in the audience. The selective choice of what is not present in the staging of a work of art can highlight what is importantly missing from a given social situation. Fritzche shows a way for art to fuel deeper societal reflection on engineering, drawing in a broader audience. The science fiction author and sociologist Malka Older shows how science fiction is a powerful lens to think through the co-evolution of values with technology, which can allow technology policy makers to reflect on how to shape and guide new technologies. With her co-author Zachary Pirtle, the chapter "Imagined Systems: How the Speculative Novel Infomocracy Explores the Relationship between Democracy, Technology, and Society" explores themes from her novel Infomocracy, called by the Washington Post one of the best science fiction novels of 2016. The book centers on an expanded information technology system that interacts in powerful and surprising ways with a novel system of "microdemocracy." Older's work serves as a great example of how to teach others about the nature of engineering and how it relates to society, and to implore them to consider what sort of world we collectively want to engineer.

1.6 Part VII: A Provocation – Reimagining the Limits of Philosophy and Knowledge Through Generic Design

In "Toward a Philosophy and Engineering of Generic Scaffolding and Design," engineering and design are approached from a very different perspective, one that discusses the role of conceptual scaffolds that aid thinking through design work. Writing from several engineering and interdisciplinary backgrounds, Ira Monarch, Eswaran Subrahmanian, Anne-Francoise Schmid, and Muriel Mambrini-Doudet explore how to describe multiple dimensions of the design process, offering a generic, context-sensitive method for doing design and for describing the information emanating from design. The authors also establish the need for more than just concepts: the inclusion of community care and interests that inform discussions and approaches. They offer up a way to dissolve broad distinctions between engineering and philosophy, focusing on fundamental aspects of category theory and generic concepts. Their approach has the potential to reframe debates on engineering, putting it on common ground alongside many other aspects of science and engineering. This ends our summary of the book chapters.

1.7 On Progress for the Philosophy of Engineering

When we call for reimagining engineering in service to society, we are recognizing that engineers, everyday citizens, and many aspects of society need to reconsider their core concepts of what engineering is meant to be, how it relates to societal problems, and how to pursue broad and systemic revisions to whom we involve in engineering work. With the path for the volume laid out, we now want to offer advice based on reflecting on the idea of progress in the philosophy of engineering as a whole, especially its own ties to technological and social progress. Research focused specifically on the philosophy of engineering gained momentum with the first precursor to fPET, which took place in 2007, but of course there have been generations of philosophical reflection on engineering before this. Layton (1971) and Wisnioski (2012) discuss much ethical deliberation amongst engineers beginning in the early 1900s and running into the 1970s. The Society for Philosophy and Technology was founded in 1976. In terms of intellectual progress, we think Mitcham (2019) does the best job of showing the quality and quantity of publications that yield a deeper insight into the nature of engineering, though he notes there is still much room to grow. Many of the chapters in our book hint at future unexamined questions. But if one had a desire for a philosophy of engineering to infuse and help guide technological and social progress, then the case for progress seems uneven at best. Odumosu and Narayanamurti have shown that many major innovation policies rest on a misunderstanding of the nature of engineering, and the philosophy of engineering seems to have provided little assistance on this. In terms of ethical practice, society's expectations for engineers have been largely consistent for over a hundred years, as evidenced by the cyclical histories described in Wisnioski and Layton above. Awareness of the philosophy of engineering and of related fields like STS and research areas like responsible innovation is still minimal,12 and seems to rarely influence or shape policy decisions about how engineering works.

12  In a major engineering organization that the first co-editor has worked in, awareness of philosophy of technology or STS is virtually non-existent, where only a handful of people out of many thousands have heard of the topics. This is part of a broader problem of the influence and priority of the social sciences and humanities generally. Mark Solovey (2020) gives a recent history of social science research at the U.S. National Science Foundation, showing that generations of STEM leaders enter government with little awareness of social science research. Philosophers and STS scholars would do well to study this history (Pirtle in preparation).

New opportunities and action may be needed to truly allow philosophy of engineering and STS work to meaningfully affect technological or social progress. In the US context, Fisher (2019) is a striking review of a piece of legislation, the 2003 Nanotechnology Research and Development Act, that had STS/philosophical influences through the testimony of the philosopher Langdon Winner. It is most striking because it is an isolated example: U.S. legislators have not encouraged any comparable STS/science policy research projects since. Detailed study of the impact of STS and philosophy research on the development of the human genome project, nanotechnology, synthetic biology, geoengineering and others has not yet shown a major contribution to policy or to scientific or engineering practice (Sullivan 2018). While this may seem dire, we should remember that advocates for the structured use of STS and philosophy to inform science and engineering policy decisions have never been able to secure the funding to implement a deep, structured program (Guston and Sarewitz 2002; Solovey 2020).13 As such, perhaps the real value of using philosophy of engineering and STS/science policy research to inform decisions must await a more detailed implementation at scale, and subsequent evaluation. It is uncharitable to dismiss the philosophy of engineering's ability to help inform technological and social progress when its practitioners may not have been given the means to do so. Recognizing this, as well as the depth of intellectual contributions that come before us, we offer the following reflections on how to continue the journey of engineering and philosophy, drawing on our (the authors') own experiences as researchers, teachers, engineers, and policy advisers:

• Need broad support for community building for philosophers of engineering (and other STS scholars), including support for case study research between philosophers and engineers. A stable research community with the funding to do detailed studies on engineering organizations would be critical in helping to connect ideas from the philosophy of engineering to social progress in general. Sustained funding for a community is needed: work done out of the goodness of an individual engineer's or philosopher's heart is noble, but can easily get lost, especially in chaotic times. A range of governmental and non-governmental funding sources and teaching requirements14 need to work together
to allow a deeper research community to grow. Scholars need to be rewarded for engaging in this type of blended work among both engineers and philosophers. Providing incentives for structured case studies that involve philosophers and social scientists engaging in engineering can be a substantive way to gain deeper knowledge of engineering, but also to get engineers to embrace and learn from social science (Pirtle and Moore 2019; Pirtle 2013). Finding ways to embed more structured social science studies of engineering organizations would be essential to helping share and reveal knowledge like that found in this book. If one hopes that a philosophy of engineering would have a significant impact on social progress, then one needs to reflect on what scope of work is needed to do that, and what ties a research community needs to have to decision makers in order to have an impact.

13  It is also not clear if there have been sufficiently influential decision makers who listen to the results of philosophical or science policy research (Sullivan 2018), which makes assessing the prospective extrinsic value of such research difficult to discern. Changing this attitude may require more grassroots advocacy within the political process (Pirtle 2021). Because we believe many engineers and decision makers have an inherent desire to do the right thing, we focus on reimagining concepts of engineering and progress, in the hope that changes of perspective among engineers and everyday citizens can enable them to make different decisions within their existing political authority and processes. But, as mentioned below, having the time for reflection requires resources and leadership.

14  The Accreditation Board for Engineering and Technology (ABET) has in some ways created a community around engineering ethics through its requirements to teach engineering students engineering ethics. Debates over the nature and extent of what ethical requirements need to be taught to undergraduate and graduate engineers are important.

• Many, varied participants are needed to successfully reimagine engineering: A key aspect of success and improvement here needs to involve bringing in more than engineers. The technocratic visions of the 1920s still lurk in the background of some science policy discussions (Cooke 1916; Layton 1971). Our work here is partially a reflection of how we might go about doing this: we brought together engineers, philosophers, policy analysts, community members, artists, fiction authors and others into this volume. Discussions just amongst engineers (or philosophers) aren't sufficient. The Section 3 papers help to motivate why we need to include representatives from all of democracy in discussions of engineering activities. Involving the public in engineering decisions is a mission that two of the co-editors have pursued in other venues as well (Bertrand et al. 2017). A more societally engaged engineering must have engineers be part of a democratic process, accountable to and engaged with key stakeholders and the public at large.

• Working with decision makers requires trust and recognition of the decision context of practitioners. Philosophers and other social scientists face this hurdle in providing guidance on engineering activities, whether they are trying to influence the decisions of an engineering manager or of some higher-level policy maker (Eisenhardt and Zbaracki 1992). Some philosophers' advice for responsible engineering can seem disconnected from how engineering systems are actually developed, deployed, and managed. It is important to have a keen sense of practice, of how engineering institutions work and what knowledge gets brought to bear in the design, production, operations and maintenance of engineered systems. To guide responsible engineering efforts toward human flourishing and a just world, more reflection and trust-building conversations are needed with practitioners.

• Recognize the good intent of engineers, and guide them to be reflective: We think an important part of gathering momentum for a deeper re-imagining of engineering should come from recognizing the good intent of many engineers, enrolling them into conversations about what sort of world they wish to live in (Karwat 2020). As Cech (2014) has shown, many engineering students begin their education wanting to contribute to society, but they lose some of this intent through their study or through acculturation into working business environments.
As practitioners and teachers of engineers, we see that many engineers do want to do good, but there can often be doubts or confusion about how engineers can positively benefit the world. And of course, as several of our chapters (especially Foley and Barrella) indicate, in many complex situations it can be unclear what engineers can really do or how they can help.15 Reflecting on key concepts, such as those in this book but also in the broader philosophy of engineering and STS world, can be critical for enabling engineers to actually achieve more good in the world. The recent effort by engineers to embrace AI for social good is a sign that this may have broad interest among engineers and could scale across engineering (but recall Layton and Wisnioski!).

15  Richard Nelson's 1977 book The Moon and the Ghetto is a challenging volume that questions why engineers can figure out how to send humanity to the moon but cannot resolve broader societal issues. What is it about engineering that enables success in some contexts but not others?

• Do not take technological progress as a given. While advances in computer and information technology (arising within broader political and economic contexts) have brought society the iPhone and increased social media access, some areas of industry appear to have stalled. In a systematic and alarming study, The Rise and Fall of American Growth (2016), the economist Robert Gordon has argued that the century from 1870 to 1970 saw a far greater pace of technological advancement, and that productivity for the average person has grown at a slower rate since the 1970s. Others, such as Tyler Cowen (2011) and Vernon Ruttan (2006), have made similar arguments.16 Recognizing this may help identify areas of engineering that have lacked substantial progress, which may engender important research. If key government and private leaders push to drastically reshape society's approach to innovation, then there could be a powerful role for the philosophy of engineering to help inform how different organizations can bring knowledge and people to bear to actually solve major societal problems. This can range from epistemological advice to helping identify the values surrounding a given technology, prompting engineers to identify new technological options that can remove political logjams (Pielke Jr 2007; Pirtle and Szajnfarber 2017). But again, as our chapters indicate, one must always question the ultimate goals and purposes for such innovations, and proceed through the broad pluralistic dialogues noted above.

16  Cowen (2011) talks about running out of 'low-hanging fruit': the easy-to-develop technologies have already been developed. Ruttan (2006) argued that the major advances in IT were due to a surge of government investment made during WWII and the Cold War.

• Consider how engineering shapes the 'long-term', irrespective of whether it is 'innovation'. Recently, some scholars have increasingly pushed for a more long-term perspective on engineering, on the need for designing systems that will last for the long haul. A focus on maintenance, which is a task that many engineers work on, has also called into question whether a more maintenance-focused approach to engineering for society could be embraced (Vinsel and Russell 2020). Engineers can help foster important conversations about the role of maintenance in society. The study of 'thing knowledge', of the knowledge
embedded in old artifacts that Chakraborty discusses, may be particularly helpful for maintenance of older systems.

• There may be no easy, single-point end state for how engineering, technology and society relate. A subtle message in the volume is that the connection between technology and society needs to be a constant focus of governance. We want technology to progress in ways that society collectively desires, and thus there needs to be some sort of direction and accountability, as well as an awareness of all the ways in which the two can co-evolve. But such directions will often be changing and evolving. Malka Older's Infomocracy offers a mental simulation by which we can debate what sort of world we want to live in with respect to technology co-evolving with society. But in Older's world, change is always coming with either the next election or the next technology upgrade, which might put the system out of whack. In sum, there may be no perfect end state for a reimagined engineering in service of society; just as with Infomocracy's sequels, the drama continues.

Looking forward, engineers seriously engaging philosophy and philosophers seriously engaging engineering have the potential to reshape and collectively reorient these intellectual traditions and practices toward the design of a more just world. We hope our volume serves as such a stimulus.

References

Allenby, B. R., & Sarewitz, D. (2011). The techno-human condition. Cambridge: MIT Press.
Baird, D. (2004). Thing knowledge: A philosophy of scientific instruments. University of California Press.
Bertrand, P., Pirtle, Z., & Tomblin, D. (2017). Participatory technology assessment for Mars mission planning: Public values and rationales. Space Policy, 42, 41–53.
Bud, R. (2012). "Applied Science": A phrase in search of a meaning. Isis, 103(3), 537–545.
Bunge, M. (1966). Technology as applied science. In Contributions to a philosophy of technology (pp. 19–39). Dordrecht: Springer.
Cech, E. A. (2014). Culture of disengagement in engineering education? Science, Technology, & Human Values, 39(1), 42–72.
Cooke, M. L. (1916). Public engineering and human progress. Cleveland Engineering Society, IX (January 1917), pp. 245–263.
Cowen, T. (2011). The great stagnation: How America ate all the low-hanging fruit of modern history, got sick, and will (eventually) feel better. A Penguin eSpecial from Dutton. Penguin.
De Weck, O. L., Roos, D., & Magee, C. L. (2011). Engineering systems: Meeting human needs in a complex technological world. Cambridge: MIT Press.
Douglas, H. (2009). Science, policy, and the value-free ideal. Pittsburgh: University of Pittsburgh Press.
Eisenhardt, K. M., & Zbaracki, M. J. (1992). Strategic decision making. Strategic Management Journal, 13(S2), 17–37.
Fisher, E. (2019). Governing with ambivalence: The tentative origins of socio-technical integration. Research Policy, 48(5), 1138–1149.
Gordon, R. J. (2016). The rise and fall of American growth. Princeton University Press.
Guston, D. H., & Sarewitz, D. (2002). Real-time technology assessment. Technology in Society, 24(1–2), 93–109.

Herkert, J. (2005). Ways of thinking about and teaching ethical problem solving: Microethics and macroethics in engineering. Science and Engineering Ethics, 11(3), 373–385.
Jasanoff, S. (Ed.). (2004). States of knowledge: The co-production of science and the social order. Hoboken: Routledge.
Jasanoff, S., & Kim, S. H. (Eds.). (2015). Dreamscapes of modernity: Sociotechnical imaginaries and the fabrication of power. University of Chicago Press.
Johnson, D. G., & Wetmore, J. M. (Eds.). (2008). Technology and society: Building our sociotechnical future. Cambridge: MIT Press.
Karwat, D. M. (2020). Self-reflection for activist engineering. Science and Engineering Ethics, 26(3), 1329–1352.
Kroes, P. (2002). Design methodology and the nature of technical artefacts. Design Studies, 23(3), 287–302.
Laudan, R. (Ed.). (1984). The nature of technological knowledge. Are models of scientific change relevant? (pp. 83–104). Dordrecht: Springer.
Layton, E. T., Jr. (1971). The revolt of the engineers: Social responsibility and the American engineering profession. Baltimore: Johns Hopkins University Press.
Longino, H. E. (1990). Science as social knowledge: Values and objectivity in scientific inquiry. Princeton: Princeton University Press.
Madhavan, G. (2015). Applied minds: How engineers think. New York: W.W. Norton.
Madhavan, G., Poste, G., & Rouse, W. (2020). Complex unifiable systems. The Bridge (National Academy of Engineering), 50(4).
Mesthene, E. G. (1969). Some general implications of the research of the Harvard University program on technology and society. Technology and Culture, 10(4), 489–513.
Mitcham, C. (2019). Steps toward a philosophy of engineering: Historico-philosophical and critical essays. London: Rowman & Littlefield Publishers.
Narayanamurti, V., & Odumosu, T. (2016). Cycles of invention and discovery: Rethinking the endless frontier. Harvard University Press.
National Academy of Engineering. (2004). The engineer of 2020: Visions of engineering in the new century. National Academies Press.
Nelson, R. R. (1977). The moon and the ghetto. New York: W.W. Norton.
Owen, R., Stilgoe, J., Macnaghten, P., Gorman, M., Fisher, E., & Guston, D. (2013). A framework for responsible innovation. Responsible innovation: Managing the responsible emergence of science and innovation in society, 31, 27–50.
Pielke, R. A., Jr. (2007). The honest broker: Making sense of science in policy and politics. Cambridge: Cambridge University Press.
Pirtle, Z. (2013). Engineering innovation: Energy, policy, and the role of engineering. In D. Michelfelder et al. (Eds.), Philosophy and engineering: Reflections on practice, principles and process (pp. 377–390). Dordrecht: Springer.
Pirtle, Z. (2021). Book review: Social science for what? Battles over public funding for the "other sciences" at the National Science Foundation. Journal of Responsible Innovation. https://doi.org/10.1080/23299460.2021.1907045
Pirtle, Z., & Moore, J. (2019). Where does innovation come from? Project Hindsight, TRACES, and what structured case studies can say about innovation. IEEE Technology and Society Magazine, 38(3), 56–67.
Pirtle, Z., & Szajnfarber, Z. (2017). On ideals for engineering in democratic societies. In Philosophy and engineering (pp. 99–112). Cham: Springer.
Pirtle, Z., & Tomblin, D. (2017). Well-ordered engineering: Participatory technology assessment at NASA. In J. C. Pitt & A. Shew (Eds.), Spaces for the future: A companion to philosophy of technology. Routledge.
Ruttan, V. W. (2006). Is war necessary for economic growth? Military procurement and technology development. Oxford: Oxford University Press.
Sarewitz, D., & Nelson, R. (2008). Three rules for technological fixes. Nature, 456(7224), 871–872.

Schatzberg, E. (2018). Technology: Critical history of a concept. University of Chicago Press.
Shrader-Frechette, K., & Westra, L. (Eds.). (1997). Technology and values. Lanham: Rowman & Littlefield Publishers.
Solovey, M. (2020). Social science for what? Battles over public funding for the other sciences at the National Science Foundation. MIT Press.
Sullivan, M. (2018). The expansion of science policy in the United States in three cases: rDNA research, the Human Genome Project, and the National Nanotechnology Initiative (Doctoral dissertation).
Vallor, S. (2016). Technology and the virtues: A philosophical guide to a future worth wanting. Oxford: Oxford University Press.
Vincenti, W. G. (1990). What engineers know and how they know it: Analytical studies from aeronautical history (Johns Hopkins Studies in the History of Technology). Baltimore: The Johns Hopkins University Press.
Vinsel, L., & Russell, A. (2020). The innovation delusion: How our obsession with the new has disrupted the work that matters most. New York: Currency.
Wimsatt, W. C. (2007). Re-engineering philosophy for limited beings: Piecewise approximations to reality. Cambridge: Harvard University Press.
Wisnioski, M. (2012). Engineers for change: Competing visions of technology in 1960s America. Cambridge: MIT Press.

Part I

Technological Progress: Reimagining How Engineering Relates to the Sciences

Chapter 2

Engineering Design Principles in Natural and Artificial Systems: Generative Entrenchment and Modularity

William C. Wimsatt
Minnesota Center for Philosophy of Science, University of Minnesota, Minneapolis, MN, USA

Abstract  I see in the nature of our minds and the character of our problem-solving methodologies a search for simplifying tools that will let us model a complex world (be it biology or society) and get away with it far more often than we might suppose. As it turns out, this broad a reach to mind and world is possible because both turn on common properties of evolved complex adaptive systems. These are in effect "design principles" for the architecture of nature—all of it, from biological systems to ourselves and the technologies that we engineer. I explore how generative systems may, under some circumstances, lead to adaptive radiations, and how the growth of complexity is entailed by their compositional embedding of prior systems, stabilizing their features as architecture thru a process that I call generative entrenchment. I also explore how modularity has two forms: top-down modularity or "quasi-independence" (Lewontin 1978), in which evolving systems require the possibility of changing parts of the system without scrambling the organization of the rest, and bottom-up modularity, in which a stable alphabet of standardized parts can be combined in various ways to generate an adaptive radiation of diverse systems to accomplish different things.

Keywords  Generative entrenchment · Modularity · Von Baer's law · Differential stability · Combinatorial algebra · Near decomposability · Big ball of mud

2.1 Introduction

Unlike most philosopher emigres from engineering (who often seem to be relieved at their escape) my experience in engineering (theory and practice!) was very positive and has influenced my views on all sorts of things—on mind, body, and
society.1 Indeed, since these are the three great engineered systems in our experience, and also common foci of philosophical research, it seems something of a paradox that most philosophers do not see in their engineering training useful levers to employ on the systems that they study.2 I see in the nature of our minds and the character of our problem-solving methodologies a search for simplifying tools that will let us model a complex world (be it biology or society) and get away with it far more often than we might suppose. As it turns out, this broad a reach to mind and world is possible because both turn on common properties of evolved complex adaptive systems. These are in effect "design principles" for the architecture of nature—all of it, from biological systems to ourselves and the technologies that we engineer. I discuss two common aspects of the architecture of adaptive systems and their importance across natural and artificial domains. These are generative entrenchment and modularity (of which there are two types, not one). Moreover, these relate to other properties widely recognized as important, such as robustness, near-decomposability, self-organization, and evolvability. These properties, with the nature of heuristic design, search, and problem-solving techniques—paralleling in their logic the features of biological adaptations—provide a rich toolkit for the production and use of complex adaptive systems. All of these are produced as a result of biological selection or culturally evolved efficient design processes to engineer new artifacts (which are themselves selection processes). These are essential or inescapable. And generative entrenchment, as a process and as a product, is the main reason why history matters in the analysis of such systems. Generative entrenchment and modularity interact in multiple ways, and are sometimes in tension, because increases in modularity commonly decrease generative entrenchment by decreasing interaction between modules. But in one important mode, combinatorial entrenchment, modularity can substantially increase the entrenchment of the combinatorial elements. Both are important sources of creativity and adaptive change, but in different ways.

1  For more on my engineering studies and work, see the epilogue of Wimsatt (2007a). That volume also provides content to other ideas mentioned in this article, such as robustness and hierarchy.
2  An important exception here is the work of Brett Calcott (e.g. 2014), who was a software engineer for a decade before he came into philosophy. His work is important, paradigmatic of the perspective I urge, and complementary to the approaches advocated here.

2.2 Generative Entrenchment

Many evolving adaptive systems can be described using the concept of a life cycle, where the system evolves and becomes more complex over time, with recurring, cascading developments. Different elements of the system help produce or causally influence other future elements of the system. This definition can apply to organisms, but also to the learning history of an individual, or to the development of a
technology. Simple generative entrenchment is about the extent to which a system element drives future states of the overall system. Specifically, differences in the number and magnitude of downstream consequences from a given element are differences in that element's generative entrenchment (GE). Some elements of a system have a greater extent of entrenchment than others, and differential GE generates different degrees of evolutionary conservation of these elements, because changes in more deeply entrenched things are more likely to fail (they are more highly constrained by adaptive demands, as other system elements depend on them). To change a highly entrenched element can lead to bigger cascading failures, more often lethal, and thus to be more strongly selected against. More deeply entrenched elements are thus more likely to be preserved unchanged or nearly so. And influences acting earlier in the developmental life cycle are more likely to have more downstream consequences. But things that are preserved have more chance to acquire further downstream dependent features which presuppose their presence. So entrenchment can be self-amplifying, and different degrees of entrenchment give a dynamics for differential rates of change in processes of evolution. In biology, this provides a basis for integrating macro-evolution and micro-evolution, since it applies to selectionist differences of all sizes. Population genetics elaborates a theory in which selection intensities are abstracted from phenotypic characteristics, whose fitnesses must be added as hypotheses. Differential entrenchment relates fitness directly to phenotypic properties. This is a significant advantage. Since the theory is not based on genes but on the organization of dependencies within the adaptive structure of the phenotype and its environment, it provides an intrinsic evolutionary dynamics wherever there is an articulatable structure of dependencies. This means that GE can also apply to the evolution of culture and technology, where there are no genes. Thus a systematic theoretical basis for evolutionary change not based on genetics—though presupposing a means of hereditary transmission—is possible. Although it becomes increasingly difficult to make changes or substitutions in elements as they become increasingly entrenched, if something can meet the constraints necessary for a substitution, the effects can be revolutionary, and open new kinds of possibilities. We will consider these constraints, and some cases of successful revolutionary deep substitutions.

The search for simple generative entrenchment can be a fruitful lens to examine adaptive systems, as can be seen from the history of a widely discussed phenomenon from developmental biology called "von Baer's Law", after embryologist Karl Ernst von Baer, who formulated it in 1828. Von Baer noted that earlier stages in the embryos of developing organisms look more alike than later stages. To Darwin this suggested their common origins, and it is indeed evidence for such. This would also be expected if the elements of earlier stages, or the genes playing roles in their production, were more GE'd than those of later stages—a feature also consistent with a common evolutionary origin. The following illustration from Ernst Haeckel's Das Menschenproblem (1907) shows the development of vertebrate embryos of a bat, a gibbon, and a human, illustrating this phenomenon.
Since this would also generate similar starting points for successive developmental stages, in successive generations, GE also gives a natural account of why there are life-cycles.3 We will return to further discussion of the vertebrate Bauplan below (Fig. 2.1).

Fig. 2.1  An illustration of von Baer's Law. Ironically this occurs in a book by Ernst Haeckel, who repeatedly (from 1876 on) used illustrations like this as confirmation of his idea of recapitulation, which it was not, and in fact contradicted it.

3  This explains why there should be life cycles, returning to a very similar place at the beginning of development, though not why life-cycles should also start in the particularly simple one-cell zygote. For discussion of this, and the hypothesis that it would serve reduction of transmitted parasites or cancers, see Grosberg and Strathman (1998).

Entrenchment requires that the action of downstream consequences must feed back to maintain the presence of that type of element that is said to be entrenched. Once an entrenched event occurs it is past, and for entrenchment to lead to evolutionary conservation, it must again recur in the next iteration. This causal closure exemplifies the apparent teleology characteristic of functional explanations, in which consequences of an element are parts of the explanation for the recurrences of similar elements in successive iterations of the life-cycle (Wimsatt 1972, 2002, 2015), and indeed, generative entrenchments are a subclass of functional events and processes. Individual events, without these feedbacks, do not get generatively entrenched, even though they may have widespread downstream consequences. Thus the
asteroid collision that killed off the dinosaurs induced many generative entrenchment processes in the mammalian adaptive radiations that flourished after the exit of the dominant reptilian predators, but the asteroid collision was not itself entrenched by them. Entrenchment can also be observed in the maintenance of a system, which one might call maintenance entrenchment. The closed causal loop that explains the persistence of entrenched elements in life cycles can apply when the elements in question play a role in maintaining an important feature in a persisting adaptive system and may thereby contribute to the preservation of the system and itself. These "closed causal loops" give the causal occurrence necessary to explain maintenance of the entrenched elements in a homeostatic system without invoking the recurrence of whole successive life cycles. The feedback loops of GE play roles both in the maintenance of stable mature systems and in the continued development of competence in a given area or set of areas through continued use or reuse of a developing capability. Thus an organism (or person or technological system!) can generate or scaffold developing recurrences and entrenchments within it, improving it without passing to the next generation. Language acquisition and the development of mathematical skills, and their continued use within the life of an individual, are obvious examples of potentially entrenched developments that do not get passed on, and the development of any sequentially acquired competence in which new skills are layered upon and articulate earlier ones also fits here. This is in addition to those traits an entity may pass on. I would argue that skills and the technologies constructed through and for their use are the most important elements of transmitted culture. Spoken language and the evolutionarily recent ability to read and to write a language are perhaps the most important of these skills (Wolf 2008). Both require complex multi-stage learning trajectories. The boundedness of these skills within a generation is demonstrated by the lack of heritability of specific languages in cross-cultural adoptions and the lack of any sort of (automatic) heritability of mathematical achievements from parent to child. Nonetheless, both language and mathematics can persist trans-generationally in populations of users—language usually spanning the range of a cultural system as a whole, and mathematics (and different parts of it) being taught and used in different subcultures (initially astronomers, and more recently accountants, engineers, technologists, and scientists) within a society. This mode of maintenance entrenchment is crucial to identifying entrenchment in cultural systems, where many closed causal loops mediate quite complex performances. Here, systems are persisting rather than reproducing, and biological and cultural entrenchments often are based upon and articulate such maintenance entrenchments.

This basic idea of generative entrenchment was significantly developed and elaborated before genetic data by Rupert Riedl (1978), and applied by him primarily to morphological data. Others have discovered it and extended it in multiple directions, ranging from the elaboration of technology to the innate-acquired distinction
and the fixation of meaning.4 Analyzing GE and its consequences has become one of the most crucial tools in comparing and untangling developmental processes. It is applied to massive phylogenetic data sets of the genetic constitution of different organisms, by developmental geneticists and evolutionary developmental biologists, to study comparative differences in conservation and performance, in analyzing the working structure and evolution of developmental programs (Wimsatt 2015). The explosion of genomic data giving comparative phylogenetic information has made possible detailed genomic analyses like those of Eric Davidson and Doug Erwin of genetic circuitry. Cross-phyletic comparison allows the extraction of genomic architectures of broader applicability, like that pictured on page 219 of Davidson's (2006), indicating a kernel underlying specification in the heart progenitor field. Davidson compares the circuitry in Drosophila and in vertebrates, and extracts architectural features of the circuitry they have in common to come up with a circuit—presumably ancestral to both. Interestingly, the conserved elements are not DNA sequences, or even genes, but the functional roles that they serve. This is a move up in level from the reductionism that would have been common in the last generation, but the preservation of functional roles is a crucial issue to which we return.

2.2.1 Generative Entrenchment and Engineered Technological Systems

Differential generative entrenchment of different parts of a complex system is a deep property which is universal among complex machines. It is ubiquitous and generic, and indeed, even self-organizing. Why is it so common? If we look at machines, we see that they comprise an articulation of different parts. The different forms of these parts and their different roles allow them to perform a variety of complex tasks. Without this differentiation of parts and how they are hooked together, engineering would be boring indeed, and the array of ways in which we could interact with our complex worlds much less varied, interesting, and powerful. But these different forms and different roles induce different failure rates for the parts, whose failures lead to different radiations of consequences ranging from inconsequential to catastrophic.

4  This would include William Wimsatt, Wallace Arthur, Jeffrey Schank, and Nicolas Rasmussen in evolutionary and developmental biology (Wimsatt 1981, 1984, 1986, 2007b, 2015; Schank and Wimsatt 1988, 2000; Wimsatt and Schank 1988, 2004; Rasmussen 1987); Brian Arthur in economics (1992) and technology (2009); Wimsatt and Griesemer in cultural evolution (Griesemer and Wimsatt 1989; Wimsatt and Griesemer 2007); Wimsatt in cognitive development (1986, 2003), the evolution of scientific diagrams (2012), and technology (2007a, 2013a); and linguistics (Dove 2012). See Turner (1991) for the transformation of figurative to literal meaning thru entrenchment.

By coincidence, I was entering O'Hare airport on May 25, 1979, and witnessed the DC-10 crash of American Airlines flight 191 flying out of Chicago, a particularly terrible event. With 273 killed it was the worst aircraft disaster in the continental U.S. It illustrates the massive proliferation possible in a failure cascade. An improper maintenance procedure for removing the engine (instituted to save time and money, and later forbidden) led to cumulative structural damage from metal fatigue in the rear of the pylon securing the left engine to the wing. This damaged section sheared under full-power takeoff load, with engine and pylon tearing out of the wing, severing hydraulic lines, which ultimately caused raising of the outboard flaps on that wing. Without the outboard flaps extended that wing had a higher stall speed, leading the wing to stall out and drop on that side. Electrical sensors were also severed when the engine ripped out of the wing, so the pilots were not alerted to the flap retraction and did not know to take corrective action (dropping the nose to increase airspeed would have terminated the stall), and the aircraft rolled over to that side and nosed into the ground. Airliners like the DC-10 can take off with a single engine failure, but this engine "failed" in a particularly destructive way. Like most severe failures, this local failure led to consequences that crossed over into multiple systems: loss of motive and electrical power from the missing engine, hydraulic control of flaps, electrical sensing and control of systems on that side, and ultimate aerodynamic failure leading to the crash.5 Here we have differential consequences of different parts, with a catastrophic failure cascade. So all complex machines with differentiated parts can be expected to have differential probabilities and consequences of failure for their different parts. When talking about production or maintenance of these systems, this leads to differential generative entrenchment. This justifies why analysis of generative entrenchment in engineered systems is relevant.

5  Wikipedia entry, American Airlines Flight 191, consulted on February 12, 2019. Some details of the failure are condensed, resulting in a simplified but much shorter description.
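A toy sketch (not from Wimsatt's chapter) can make the structure of this point concrete: represent the parts of a machine as nodes in a directed graph whose edges run from a part to the parts that directly depend on it. A part's generative entrenchment can then be crudely proxied by how much lies downstream of it, and a local failure propagates along the same edges. The part names and graph below are invented purely for illustration and only loosely echo the accident description.

```python
# Toy sketch: differential entrenchment and failure cascades in a part-dependency graph.
# Edges point from a part to the parts that directly depend on it (invented example names).
from collections import deque

DEPENDENTS = {
    "engine_pylon": ["engine", "hydraulic_lines", "electrical_sensors"],
    "engine": ["thrust"],
    "hydraulic_lines": ["outboard_flaps"],
    "electrical_sensors": ["stall_warning"],
    "outboard_flaps": ["lift"],
    "stall_warning": [],
    "thrust": [],
    "lift": [],
}

def downstream(part):
    """Everything that directly or indirectly depends on `part` (a crude proxy for its entrenchment)."""
    seen, queue = set(), deque([part])
    while queue:
        for dep in DEPENDENTS[queue.popleft()]:
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return seen

# Differential entrenchment: parts differ in how much depends on them...
for part in sorted(DEPENDENTS, key=lambda p: -len(downstream(p))):
    print(f"{part}: {len(downstream(part))} downstream consequences")

# ...and a local failure of a deeply entrenched part cascades through all of them.
print("pylon failure knocks out:", sorted(downstream("engine_pylon")))
```

On this sketch the pylon, with the most downstream dependents, is both the most entrenched part and the one whose failure cascades furthest, which is the asymmetry the text is describing.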

2.2.2 Entrenchment Can Drive Asymmetry and, thus, Diversity

But this is not all. One of several characterizations of self-organization, symmetry-breaking, is particularly relevant to generative entrenchment. In certain systems, generic properties of interest emerge or self-organize through self-amplifying deviations from conditions of homogeneity or symmetry. The emergence of differential entrenchment in evolving complex systems is such a symmetry-breaking process, as is illustrated if we start with a system that is initially homogeneous in entrenchment. Consider a system, say a gene control network, in which all of its elements are equally strongly selected, or have equivalent amounts of generative entrenchment or downstream dependency. (This homogeneity is unlikely for any differentiated system, whose parts would have different conditions and consequences of failure, as
illustrated above, and all interesting mechanisms are made of differentiated parts— perhaps the first principle of engineering!) Suppose a mutation adds a new adaptive downstream element depending upon one or more, but not all, of the existing elements. Any such mutation would increase the generative entrenchment of those elements. But then their loss thru other mutations would be more strongly selected against. Thus their expected lifetime in a system evolving under a selection-­mutation balance, and their chance of acquiring still more entrenching mutations, would both be increased. This is a symmetry-breaking process favoring increasing differential generative entrenchment from an initially homogeneous situation. So differential generative entrenchment is characteristic of any differentiated system, and also would emerge from symmetry-breaking transitions (fluctuations in system or environment) in any system that did not already show it. It is thus robustly generic, and a fundamental architectural feature in evolving systems. Both contingent elements and generic ones can become entrenched—they just must be stable for long enough to acquire further dependencies, and this gives them greater chance of surviving to acquire still more. So differential GE is at the core of explanations for the appearance of diversity and contingency in organic life and in our technological developments. For cultural evolution, developmental structure and generative entrenchment are even more crucial than for biology. Biological genes are acquired from a single breeding population in a single bolus at the beginning of the life cycle. But elements of culture are acquired throughout the life cycle, with earlier ones often affecting which later ones are acquired, and how they are interpreted and elaborated. This dependency structure is crucial in skill acquisition. We pass through a series of culturally induced breeding populations as we grow older, with social structures often scaffolding (through schooling and curricula) the skills we must acquire. Cultural evolution emerges through an articulation of internal and external structure, but in a way that makes the cultural elements we acquire through development differentially entrenched (Wimsatt 2010, 2019). So both for biology and for culture, differential generative entrenchment is ubiquitous and important. Indeed, in biology, generative entrenchment is the main reason that history matters, and that historical information is so revealing. And entrenchment is essential for preservation of elements to build upon, and thus for cumulative cultural evolution. So it must also be a major factor in the importance of history in human technology and culture as well.
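A minimal simulation, offered here in the spirit of this argument rather than taken from it, can make the symmetry-breaking visible: start from elements of equal entrenchment, repeatedly add new downstream elements that depend on a random subset of existing ones, and delete elements with a probability that falls as their number of dependents grows (a crude stand-in for selection against the loss of entrenched elements). The dependent counts quickly become highly unequal. All parameters and the loss rule below are arbitrary choices for illustration.

```python
# Toy sketch of entrenchment as symmetry-breaking: symmetric founders, random accretion
# of dependents, and loss that is selected against in proportion to entrenchment.
import random

random.seed(1)
dependents = {i: set() for i in range(5)}   # five founding elements, initially symmetric
next_id = 5

for _ in range(300):
    # A "mutation" adds a new adaptive element depending on a couple of existing elements.
    parents = random.sample(sorted(dependents), k=min(2, len(dependents)))
    for p in parents:
        dependents[p].add(next_id)
    dependents[next_id] = set()
    next_id += 1

    # Occasional loss: the more dependents an element has, the less likely its loss survives.
    victim = random.choice(sorted(dependents))
    if random.random() < 1.0 / (1 + len(dependents[victim])):
        dependents.pop(victim)
        for deps in dependents.values():
            deps.discard(victim)            # toy model: dangling links are simply dropped

counts = sorted((len(d) for d in dependents.values()), reverse=True)
print("dependent counts, most to least entrenched:", counts[:5], "...", counts[-5:])
```

Running a sketch like this shows a few heavily depended-upon survivors and a long tail of elements with few or no dependents, even though nothing distinguished the founders at the start, which is the self-amplifying asymmetry described above.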

2.2.3 Entrenchment and Bauplans: General Frameworks for Adaptive Radiations

So far entrenchment may seem like a conservative force blocking constructive change. Generative entrenchment is not an unalloyed good. It can also anchor in dependencies, practices, and structures that we would like to eliminate, and block
major innovations. Thus the two-digit date-year convention adopted early in computer history, when memory was a limiting factor, led to fear of massive errors in time-indexed (e.g. financial) data when the date turned 2000, which could not be differentiated from 1900. Only massive reprogramming and purchase of new computer equipment turned a potential disaster (the so-called "Y2K problem", Webster 1999) into a collection of relatively minor dislocations. But fixation of undesirable features is not always the case, and not the only major pattern of change! As we will see, entrenchment generates not only conservation, but also frameworks on which different adaptive designs can be constructed, sometimes generating an adaptive radiation of new possibilities.

Earlier in discussing von Baer's Law, I showed developmental diagrams of the embryos of three vertebrates. But their similarities (as vertebrates) persist into adulthood. The vertebrates constitute a wide and varied phylum, one that includes all larger animals, from small fish through elephants to great blue whales, because the structural strength and scalability of the vertebrate skeleton allows integral and controlled motion through a large range of sizes. The vertebrate skeleton has provided a framework for major and diverse forms of adaptive creativity. This common architecture is called a Bauplan, when a major arrangement of parts is conserved within a phylum. In the vertebrate Bauplan, the basic form and arrangement of bones is conserved, with only few and minor modifications, across species occupying massive differences in size and ecological niche (Figs. 2.2 and 2.3).

Thus the same skeleton that serves for humans (with hands, shoulders and pelvis modified from our large primate ancestors for upright posture, efficient bipedal walking, throwing, and writing) has in the elephant—two orders of magnitude heavier at 15,000 lbs—a significant set of metrical changes, but the bones stay and, though changed, their topological arrangement remains unchanged. Most obvious are massive thick leg-bones to handle the greater weight of a body that grows as the cube of the dimensions but must be supported by limbs whose cross-sectional area grows only as the square of the dimensions. (This is why elephants, with their piano-leg limbs, under threat of breaking a leg, can be confined in an enclosure bounded by a ditch that a human could easily traverse.) Also striking are their canines, modified to become massive tusks. Going the other way in mass to the "flying fox" (a fruit bat), which weighs four orders of magnitude less (two orders of magnitude less than us, at a pound and a half), we have very thin bones, which are further lightened for flight, with the finger bones vastly extended to act as structural members for the wings, whose thin membranes are elaborations of the inter-phalangeal skin-folds.6

6  The size range for mammals is even larger, from 2 g for moles (less than a tenth of an ounce) to over 200 tons for the great blue whale. That is roughly eight orders of magnitude! It is astounding that the same basic architecture can be well-adapted over such an enormous size range, but also that it is preserved over such a range. The mass of 200 tons is possible only for a marine animal, whose massive displacement of water helps to support it. A much smaller whale would (and does) smother when beached. The largest land animal, a dinosaur, was just (an astounding) 65 tons, and it is still not clear how it could support such a mass for land locomotion.
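The load argument behind the elephant's piano-leg limbs is the familiar square-cube relation; the following is a standard scaling sketch rather than anything specific to this chapter, with W, A, L, and the stress symbol introduced here only for the illustration:

\[
W \;\propto\; L^{3}, \qquad A \;\propto\; L^{2}, \qquad
\text{stress} \;=\; \frac{W}{A} \;\propto\; \frac{L^{3}}{L^{2}} \;=\; L,
\]

where W is body weight, A the cross-sectional area of the supporting limb bones, and L the linear scale of the animal. Under uniform (isometric) scaling the compressive stress in the limbs grows linearly with size, which is why very large land vertebrates need disproportionately thick limbs, and why the very largest masses are feasible only where water carries part of the load.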

Fig. 2.2  Skeleton of an Elephant, from Georges-Louis Leclerc, Comte de Buffon’s Histoire Naturelle (1749–1788), illustrating with Fig. 2.3 the diverse transformations in the common vertebrate Bauplan. (Author’s personal copy). The conservation of the vertebrate Bauplan is a product of generative entrenchment

Why was the skeletal form preserved across these adaptive variations and size scale changes? Can't it be changed? Wouldn't we do better at our complex tasks (like tool making) with a lengthened torso and an additional pair of arms? Or perhaps just with an additional pair of ribs? Not so quick! Either of these would be problematic.7 Many aspects of the functional phenotype—the whole organism—depend upon this

7  The "Bithorax" mutant, in Drosophila, gave an extra pair of wings (or restored it, since four wings, as in the dragonfly, was the ancestral state). This too was a very deleterious mutant (for the fly), but it was historically important and interesting since it seemed to provide such an organized (but dysfunctional) mutant. It proved to be the first mutant of a new class of genes, the Homeobox genes, whose study was crucial in unravelling many major features of development in diverse organisms from yeast to man.


Fig. 2.3  Skeleton of a fruit bat, from Buffon's Histoire Naturelle (1749–1788). The differences between this figure and the preceding were products of adaptations to the 4 orders of magnitude weight difference, with some special adaptations (enlarged canines to tusks in the elephant; lengthened digits to support interphalangeal membranes to make wings in the bat). (Author's copy)

architecture. In addition, the processes that generate this architecture are themselves important for other structures and processes, so cannot be changed with impunity. Frietson Galis and collaborators (Galis et al. 2006) have shown both of these: that skeletal processes are not easily changed, and that if changed, they are sensitive indicators of deep problems elsewhere in ontogeny that are usually lethal. She was studying variation in the number of vertebrae in different taxa. These are highly variable from species to species in birds and reptiles, but strongly conserved in almost all mammals. By looking in the human medical literature at morphological anomalies among spontaneously aborted human embryos, she found a relatively common anomaly: the conversion of the seventh cervical vertebra to a thoracic one (which then has a small spine like a rudimentary rib, but is otherwise unremarkable). This almost unnoticeable change was accompanied by major and diverse changes elsewhere in different human embryos. None were viable. This included


widespread cancers of diverse types, missing kidneys, and a missing corpus callosum. The cause of such widespread chaos was a homeotic transformation that modified the action of a Hox gene, which had numerous downstream cascading effects. This points to a crucial generalization: to make a deeply entrenched change that is viable, it is crucial to preserve the core functions of the system. This induces constraints that become successively more demanding as the entrenchment increases over evolutionary time. This applies to biological systems, to technological systems (such as the computer or the internal combustion engine), and to conceptual systems such as scientific theories.

But why bother to make such a change if it is overwhelmingly likely to have such disastrous consequences? The reason is that other consequences of a successful new substitution can have revolutionary effects because of other properties of the new substituted element. Thus an early symbiotic association of a proto-bacterium with a parasite led to a mitochondrion-containing eukaryote, capable of oxygen metabolism, facilitating multi-cellular organization, vertebrate evolution, and us.

So we have considered biological changes. I now consider one each of a technological change and a scientific change, and then explore this claim further. In 1958, IBM released its vacuum-tube 709 scientific computer. It was the first computer to emulate an older computer, the IBM 704, introduced 4 years earlier, so it could continue to run Fortran. The Fortran (for "formula translation") computer language had become more than a success: it was a requirement for scientific computing. This backwards compatibility of the 709 became paradigmatic of a broader trend as computers progressed, ranging from a requirement to a strong desideratum for newer computers, for hardware accessories, and for popular software languages or applications, and illustrates our point. But development of the 709 changed the game in a crucial way. To meet an Air Force requirement to run reliably at radar sites in the Arctic, IBM then put the 709 team to work producing the 709T (or the phonetically identical 7090, as it soon came to be known), a logically identical version of the 709 computer substituting transistors for tubes (Ceruzzi 2003). This major and fundamental change in the core switching element is a case of conservation of function in spades: it was computationally isomorphic! This massively cut development time, so it could be released only a year later. The 7090 was smaller, much more reliable, and much less demanding in its support scaffolding. All of this had to be changed, but this was standard engineering, and thus unchallenging. It had much lower (5 V) power requirements (none of the high-voltage filament transformers required for vacuum tubes), much reduced need for air conditioning, and half the cost. It also ran 6x faster. With higher reliability, it had much less down time, and its higher speed allowed real-time control of processes that the 709 could not manage, and significantly expanded suitability for large computational tasks. It became a smash hit. The 7090 was logically identical as far as running programs was concerned (and it was thus quick to develop), but its other characteristics gave it much wider distribution and use. The magnitude of this success guaranteed the future use of transistors in computers.
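A rough way to see why the transistor's reliability mattered so much: treat the machine as a series system that needs every one of its switching elements working at once. The per-element probabilities below are invented for illustration, not historical failure data.

```python
# Toy series-reliability model: the machine is up only if every switching
# element is up. Per-element probabilities are invented for illustration,
# not historical failure data.

def p_machine_up(n_elements: int, p_element_up: float) -> float:
    """Probability that all n_elements independent elements are working."""
    return p_element_up ** n_elements

N = 10_000  # a machine with ten thousand switching elements
for label, p in [("less reliable elements", 0.9999),
                 ("more reliable elements", 0.999999)]:
    print(f"{label}: machine up with probability {p_machine_up(N, p):.3f}")

# less reliable elements: machine up with probability 0.368
# more reliable elements: machine up with probability 0.990
```

A seemingly modest gain in per-element reliability is what makes machines with very many working components feasible at all; this is the point made about vacuum-tube burnout in the footnote below.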
But the explosive growth of computing emerged mainly from other innovations and scaffolding technologies.8

8  Scaffolding is crucial in the evolution of complex adaptive systems. See Caporael et al. (2013) and Wimsatt (2013a).


Vacuum tube technology was by then some 50 years old, and miniaturization of vacuum tubes had gone about as far as it could go. Even initially, transistors were smaller, but a game-changing aspect of their development emerged from the use of another technology which could preserve function while vastly increasing performance and simultaneously reducing cost—truly having your cake and eating it too! Transistors could be made smaller, and with the invention of the integrated circuit (in 1958) the path was set for them to become much smaller. The crucial application of photolithography, and developments in it (including the move to higher electromagnetic frequencies to allow resolution of smaller components), has, thru 4 decades of its evolution, led to x-ray microlithography capable of making integrated circuits with billions of components on a chip the size of a fingernail. This reduction in size has allowed reductions in cost for more powerful computers that were both faster and consumed less power because the components were smaller and closer together. But all of this would have been pointless without the much higher reliability of the transistor compared to the vacuum tube—only thru this higher reliability could one have built machines with so many working components that were not breaking down all of the time.9 In a way the computer revolution was made possible by a consequence that was initially a side effect: the change to a base technology that simultaneously (thru miniaturization) increased power and speed while reducing cost. So computers improved in four dimensions at once: reliability, power use, speed, and cost. And with the massive growth in software (inspired of course by the massive changes in these dimensions), how could there not be a revolution?

But this requirement that, to allow substitution, the core functions of a deeply entrenched element must be preserved applies more broadly, and indeed appears to be universal in analogous cases. But what for conceptual structures would be the analogous form of preserving or meeting the core functions of the adaptive system? For this we need to look to their use. For scientific change, it is analogously crucial to have a newer theory generate the core explanatory successes (and therefore the core applications) of its ancestor either directly or as limiting cases—so crucial that the older theory cannot be overthrown without meeting this condition. Michel Janssen's (2019) studies of the theoretical changes from classical mechanics and electromagnetic theory to relativistic and quantum mechanics have demonstrated that in each of the five separate cases he analyzed, newer theories were built upon the scaffolding of the older ones and preserved (most of) the crucial characteristics of the classical theories. This begins to illustrate just how generally applicable this requirement is (Fig. 2.4).

Janssen's examples are deep and convincing, but technically demanding in mathematical detail, so consider an easier, but crucially important, example from biology which is well documented. In 1842, Darwin was faced with a deep anomaly for his developing theory. The organization and specialized labor of the different castes of

9  The problem of the reliability of vacuum tubes was severe. Writers forecast tight limits on the size of computers: above a few thousand tubes, a tube would burn out and the machine would break down before it could execute a program. One of von Neumann's most intriguing papers (1956), on how to synthesize reliable organisms from unreliable components, was an attempt to design components that would be more reliable by introducing redundancy of function within the component; modern memory chips introduce redundancy of function at higher levels.


Fig. 2.4  Illustration of social insects from Das Kupferbibel (Augsburg, 1731–1735), so named because of the many copperplate engravings used to illustrate events in the Bible. The social insects are recognizable from the form of their burrows: termites on the left (and at top) and carpenter ants on the right. (Author’s copy)


social insects contributing to the welfare of the colony made them a paradigm for the beneficent wisdom of the Creator in the then dominant Natural Theological story of adaptive design in Nature. This was a key homily for the explanatory account from Natural Theology—a moral lesson: work hard and accept your place in the social hierarchy!10

But for Darwin, this posed a problem: how could the adaptations of sterile castes of social insects have evolved, and be passed on to descendants? This seemed a direct challenge to his theory of evolution by natural selection, which required variations in organismal form that caused fitness differences and were inherited! Darwin saw that this case would surely come up immediately upon presentation of his theory, and he described it to a friend in a letter in 1842 as "the rock upon which my theory could founder." A solution was not obvious, and according to Bob Richards' account (1983, 2015), Darwin did not hit upon the solution until actually writing up The Origin of Species in 1858.

Darwin (1859) found a way to co-opt the case of the social insects, but had to modify his theory to accommodate colony selection in order to do so. If the whole colony was the unit of selection, then heritability was managed at that level (in effect, the queen was the gonads for a large complex organism) and the existence of sterile castes of termites or carpenter ants (the two social insects depicted in the illustration) was no more a problem than the existence of skin cells with no direct reproductive offspring was a problem for individual selection. They had become adaptive tools. For both, the reproductive machinery was elsewhere—in the queen for the social insects, and in the gametes for "normal" individuals. The hierarchy of adaptations attributed to the insect colony was preserved across this revolutionary co-option in all major details, and only further anchored by new information. (The centrality of this case is reflected in a four-volume illustrated Bible by Johann Jakob Scheuchzer, Physica Sacra (Das Kupferbibel), published in Augsburg, 1731–1735. It also shows that Natural Theology was not limited to England.) The changes only affected purposes at the top of the adaptive hierarchy, substituting natural selection for a grand designer, and left virtually all of the massive adaptive information and explanations accumulated by naturalists unaffected. Darwin's co-option of the widespread studies of adaptation supporting the competing account of Natural Theology midwifed a revolution in biology. This "co-option" strategy, with preservation of the main successes of the earlier version, persisted thru the introduction of the gene as the hereditary element, yielding the successive revolutions of neo-Darwinism (population genetics), molecular (DNA based) evolution, and evolutionary developmental biology—each with major and sometimes foundational changes, each expanding the range of the theory while each time preserving almost all of the original unchanged—enough still to be called "Darwin's theory". Darwin's Origin is still required reading for evolutionary biologists in training. In how many sciences is an analogous founding work so centrally honored? This too is preservation of function in spades!

10  "Look thou o sluggard to the ant; consider her ways and be wise." Proverbs 6:6 was cited at the bottom of the illustration.


So generative entrenchment preserves structure, which can serve many functions, in maintaining compatibility with other structures and practices (thus the key importance of backwards compatibility) and, when first introduced, provides a framework for adaptive elaboration in different directions. What means are there to help to preserve key functions when making a deeply entrenched substitution? We must accomplish functionally equivalent behavior, which would be expected to get more demanding as entrenchment increases. Strategies that will accomplish or make this task easier include the following (a small software sketch of strategies 1 and 4 appears at the end of this section):

1. Duplicate the entrenched entity, so that one copy can continue to function as before, and the other can be transformed (e.g., tandem duplication of genes in genetics).
2. Make changes in the supportive environment to meet some of the functional requirements (vitamin C is required but no longer synthesized in humans and other mammals, who have managed to make it a regular item in their diets; scurvy, the bane of long sea voyages, resulted from its absence).
3. Find other ways to accomplish some of the functional requirements (as, for example, the redundancy of many pathways in primary metabolism, giving a distributed duplication of function).
4. Modularity/compartmentalization/sequestration: isolate relevant functions from deleterious effects of the substitution on the module (e.g., a main-effect gene plus modifier loci to reduce side effects).
5. Break down the functions of a complex substitution into elements that accomplish functional requirements either piecemeal or collectively, to simplify analysis or replacement. (Something like this resulted from the separation of the autocatalytic and heterocatalytic functions of the gene, which allowed the development of a successful theory of (classical) transmission genetics without yet having an account of gene expression and development.)
6. Accept reduced functionality; change use; accept failure in the initial function. (The 68020 computer chip included a 68881 numeric co-processor; some were defective in that part but otherwise functional, so were marketed as the 68010 and used in cheaper portables that did without fast floating point computations.)

In all of these, we see the mark of history, and it is secured by the processes of generative entrenchment. Indeed, I know of no other process in biology that would preserve the effects of history against the degradational forces of mutation, and of direction-changing selection in changing contexts and co-options that would erase the effects of past optima (Wimsatt 2001). Indeed, generative entrenchment is, in biology, the main reason that history matters. In human history and technology there are so many forces acting that it is harder to make the same argument, but it would be difficult to argue that generative entrenchment would not be a major influence in preserving the long hand of history even thru human affairs. I close this section with what is perhaps nothing more than an urban legend. This is that the width of railways (at least in the UK) owes its measure to the standard practice of wagon-makers (who would have constructed the first railway carriages), which matched the width of ruts in English roads. (If you don't match it, you quickly break wheels and axles.) And that,


it is said, goes back to the width of Roman chariots. So the triumph of English industrialism owes a debt to the Roman Empire. But the Russian railways, you might say—those are different. Ah yes, but the Roman Empire never reached into Russia, and the Russians are said to have deliberately made their carriage widths different from those in Germany, so as to hinder invasion. That would suggest that generative entrenchment and history matter, but we must be careful to include all of it.
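To make strategies 1 and 4 from the list above concrete in a software setting, here is a minimal, hedged sketch; the function names and the date format are invented for illustration and are not drawn from any particular codebase. The old implementation keeps running for existing callers (duplication) while a transformed copy is developed behind one stable interface (modularity).

```python
# Hedged sketch (invented names, no real API): preserving the core function
# of an entrenched element while substituting its internals.

def legacy_parse_date(s: str) -> tuple[int, int, int]:
    """Entrenched two-digit-year behaviour that existing callers depend on."""
    yy, mm, dd = (int(x) for x in s.split("-"))
    return 1900 + yy, mm, dd          # the entrenched (Y2K-prone) assumption

def new_parse_date(s: str) -> tuple[int, int, int]:
    """Transformed copy: four-digit years, developed alongside the original."""
    yyyy, mm, dd = (int(x) for x in s.split("-"))
    return yyyy, mm, dd

def parse_date(s: str) -> tuple[int, int, int]:
    """Stable interface: dispatches on format, so both old and new callers work."""
    year_field = s.split("-")[0]
    return new_parse_date(s) if len(year_field) == 4 else legacy_parse_date(s)

assert parse_date("99-12-31") == (1999, 12, 31)   # old behaviour preserved
assert parse_date("2000-01-01") == (2000, 1, 1)   # new capability added
```

The same pattern is what "backwards compatibility" amounts to in practice: the entrenched behaviour is preserved exactly for old inputs, and the substitution is visible only where new capability is wanted.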

2.3  Top-Down Modularity: The Emergence of Order from the Big Ball of Mud

Now we turn to a discussion of modularity, and a contrast between "top-down" and "bottom-up" modularity, commonly characteristic of evolved and engineered contexts respectively. Modularity as a design strategy for adaptive systems does not attempt directly to produce a functionally equivalent substitute for a generatively entrenched element. Rather, it makes it much easier to produce variations (and possibly to use search heuristics to direct their production or evaluation in a focused way), which can then be used to explore alternative possibilities, either to accomplish the function directly, or to replace the whole structure containing the offending element or module. A crucially important way to create modularization is to invent structure to support a combinatorial alphabet for generating new variations in a systematic way. Such innovations have had revolutionary effects whenever they have occurred in the history of organic life or of culture—with RNA, DNA, proteins, antigens, languages (oral and written), machine tools and interchangeable parts, electronic components, and computer languages—each producing an adaptive radiation of systems utilizing this new alphabet of parts to explore both traditional and new adaptive niches. These modules are progenitors of the most massive elaborations of culture and technology. I consider this below when we discuss combinatorial entrenchments.

Arriving at cleanly modularized parts is not always guaranteed, often due to poor "top-down" planning. Consider an organizational problem for evolved systems as seen by two computer scientists contemplating the evolution of complex software, such as operating systems, or multi-functional packages, such as Microsoft Office, both of which are constructed by large teams of programmers working at least partially independently, and maintained by their successors who do not have the documentation for the original parts of the program:

"A BIG BALL OF MUD is haphazardly structured, sprawling, sloppy, duct-tape and bailing wire, spaghetti code jungle. We've all seen them. These systems show unmistakable signs of unregulated growth, and repeated, expedient repair. Information is shared promiscuously among distant elements of the system, often to the point where nearly all the important information becomes global or duplicated. The overall structure of the system may never have been well defined. If it was, it may have eroded beyond recognition. …"11

11  I owe this delightful reference to Bret Calcott, who spent a decade as a software engineer.

—Brian Foote and Joseph Yoder, University of Illinois, Dept. of Computer Science, June 1999
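A deliberately tiny caricature of the contrast Foote and Yoder describe (invented code, not taken from their paper): in the "mud" version, information is shared promiscuously through module-level globals, so any change risks touching everything; in the modular version the same behaviour sits behind a narrow, explicit interface.

```python
# Caricature of the contrast, invented for illustration.

# "Big ball of mud": state shared promiscuously through module-level globals.
tax_rate = 0.07
last_total = 0.0

def mud_checkout(prices):
    global last_total
    last_total = sum(prices) * (1 + tax_rate)    # silently depends on globals
    return last_total

# Modular version: the same behaviour behind a narrow, explicit interface.
class Checkout:
    def __init__(self, tax_rate: float):
        self._tax_rate = tax_rate                # information hidden locally

    def total(self, prices) -> float:
        return sum(prices) * (1 + self._tax_rate)

assert abs(mud_checkout([10.0, 20.0]) - Checkout(0.07).total([10.0, 20.0])) < 1e-9
```

The two produce the same numbers; the difference lies in how safely either can be changed later.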

Modularity can emerge organically in a "bottom-up" approach. Biological evolution has famously proceeded thru a series of random mutations winnowed by selection. Although the randomness of these mutations at the phenotypic level has recently been qualified by work in evolutionary developmental biology, it is still accepted that roughly anything that produces an improvement anywhere in the phenotype will be selected for, in a "first come, first served" opportunistic manner. This suggests that evolution, like most software maintenance, proceeds through a series of contextually advantageous layered kluges. If so, how is this different from the "big ball of mud" envisioned by Foote and Yoder, and how do we escape its consequence of an unmanageable mess? How indeed do we even have a discernible functional organization (Wimsatt 2001, 2013b)? Sadly, while there is a means of escape for biological evolution—in that positively selected entities live while others fade away—it is not, or not easily, available to software programmers, who are pressured to keep using existing software approaches and resources.12 I think that the operation of recombination drives the modularity of quasi-independent elements allowing evolution, at least in sexual species, and may also drive system robustness. And in such systems, functional organization remains relatively stable thru the action of generative entrenchment (Wimsatt 2015). Careful thinking about which modules within a system are likely to be essential matters here, which makes the study of robustness important.

A generation ago, in the work of George Williams (1966), popularized by Richard Dawkins (1976), we were taught that genes were the only things sufficiently stable to be evolutionary units, because recombination disrupted any larger complexes. And genes show significant epistasis (non-linear gene interaction) in complex combinatorial ways. So a change in either genes or environment should change the phenotype, which, so the argument goes, must be less stable than either. And the argument continues: since phenotypes are products of genes and environment, and gene complexes and even genes are scrambled in sexual recombination, phenotypes should be very unstable. WRONG! THEY AREN'T! Traits are often robust to many changes of inputs. Despite all of the non-linear gene interactions and new combinations, most offspring phenotypes are not only characteristically of the same species as their parents (and not unrecognizable embryological hamburger), but reflect many of their parents' detailed morphological, behavioral and biochemical traits. Thus we may say that a child has her mother's eyes, her father's forehead and hair color, and her grandmother's disposition. Human mothers in their prime (say, 25 years old) have roughly 50% viable embryos. How is this possible? Why aren't almost all new zygotes immediately inviable—nothing more than cellular hamburger? Furthermore, if there weren't significant heritability both of fitness and of individual traits from parents to offspring, evolution would be impossible. We would not be here.

12  Actually, it is arguable that supporting this task is one of the aims of Object-Oriented Programming, by making parts of code more easily portable, extendable, and reusable.


Andreas Wagner (2005) shows that evolution has tuned the phenotype for robustness of essential characteristics at multiple levels (from redundant DNA on up) in the face of environmental and genetic variations in all essential properties, and this should generate the relative stability (for crosses between not too distant relatives) to allow sexual recombination to do the finer tuning. Wagner, however, attributes the selection for robustness to environmental variation. In this he underestimates the effect of recombination changing the genetic environment of adaptive gene complexes. Sexual recombination is a major driver of robustness in sexual species. Given genetic variability at 5% or more of loci (so that about 1000 out of 20,000 loci would be segregating in humans), random recombination of genes in offspring, widespread epistasis among alleles at different loci, and non-additivity among alleles at the same locus, recombination must act as a strong filter against high epistatic fitness variations among different recombinations. Livnat et al. (2008) have since confirmed this claim in simulations, arguing that selection in the presence of sex robustly favors alleles that do well across multiple genetic backgrounds. These alleles can serve as powerful modules amidst a wide range of contexts. This is an instance of the kind of complex stability that modules can afford in open, non-linear, highly context-dependent and facultative (biological, and more generally, adaptive) systems. I suggest that similar things are true for evolved adaptive systems in general, including complex context-dependent and conditional cognitive organism-environment-artifact-multi-agent complexes. If it were not so, we could not make human culture or other forms of coordinated behavior work. And of course, human engineers start with a host of compatibility requirements as fundamental design constraints for new technologies' ability to operate with other aspects of our technological environment. I know from my experience working as an engineer in the adding machine division at NCR that a major part of an engineer's learning on a new job is becoming familiar with the constraints imposed by the existing architecture of the machine within which her designed parts must operate.

We have seen that the emergence of ordered quasi-independent behavior in a spontaneously evolving system is plagued by the fact that improvements are local fixes—usually kluges—that don't meet any overall architectural plan. They emerge as driven by the need for improvement, which may occur anywhere, and opportunistically, in any manner that works.13 This means that their functional architecture will be to some extent mysterious, although broadly constrained by a Bauplan of the more deeply entrenched elements, structures and processes that provide a framework within which the fixes and adjustments yield improvements. But the presence of kluges compromises the architecture, and makes de-bugging or localization of faults more complex. This thicket is manageable only because of this entrenched architecture and the modularizing influence of recombination, driven by internal and external needs for heritability of fitness and, in a relatively stable environment, heritability of traits.

13  In some ways this parallels the mysteries of "deep learning" in connectionist networks. There too, we have improvements in the behavior of a network driven by no architectural plan, but only by positive effects under training with reinforcement.
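Returning to the mixability point above, the quantity at stake can be shown with a purely illustrative calculation (the fitness values are invented and this is not Livnat et al.'s actual model): when recombination reshuffles genetic backgrounds every generation, an allele is repeatedly tested against many backgrounds, so its mean fitness across backgrounds matters more than the fitness of its single best combination.

```python
# Invented fitness values, for illustration only (not Livnat et al.'s model).
# Each row gives one allele's fitness in three possible genetic backgrounds.
backgrounds = ("B1", "B2", "B3")
fitness = {
    "generalist allele": {"B1": 1.0, "B2": 1.0, "B3": 1.0},
    "specialist allele": {"B1": 1.6, "B2": 0.7, "B3": 0.6},
}

for allele, table in fitness.items():
    best = max(table.values())                      # what a fixed, unshuffled combination rewards
    mean = sum(table.values()) / len(backgrounds)   # what repeated reshuffling by sex tests
    print(f"{allele}: best combination {best:.2f}, mean across backgrounds {mean:.2f}")

# generalist allele: best combination 1.00, mean across backgrounds 1.00
# specialist allele: best combination 1.60, mean across backgrounds 0.97
```

On these invented numbers the specialist's best combination wins if backgrounds never change, but its average across reshuffled backgrounds falls below the generalist's steady 1.0, which is the sense in which sex is argued to favour "mixable" alleles that can serve as modules across contexts.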



This modularity is commonly context-dependent, and if only recently achieved thru selection (a case of genetic assimilation), can be quite sensitive to destabilization (West-Eberhard 2003). Modularity is a recurrent theme in evolved objects and systems, and it is often aided by scaffolding: adaptive structures and processes that facilitate, support, and provide the context by which the module can do its work (Wimsatt and Griesemer 2007; Caporael et al. 2013). We have already considered how top-down modularity may be produced thru recombination, and how important it is to evolution. But for engineering, bottom-up modularity is more widespread and obvious. We design and build things by assembling standardized mass-produced interchangeable parts. We don't "grow" them. Nor do they reproduce or repair themselves. Bottom-up modularity—however context-free the ultimate pieces seem to be—is secured and maintained only through substantial scaffolding, an often reticulated apparatus of support. Thus we suppose that DNA contains within it the modularity of the code, but the code is such only thru the complex apparatus for transcription to messenger RNA, and the "tape-reader" head of the ribosome assembling the polypeptide string. The "selfish gene" is a functional localization fallacy—a mythology—and "genetic engineering" simply resets the jigs in a cellular factory to make a new product.

An important thing to note is that the functional role of modules may be multi-dimensional and cross-cutting. That is to say, top-down selection to produce "quasi-independence" of traits and bottom-up constructions to make gene control networks or "tinker toy" assemblages of pre-defined parts need not produce or use the same mapping of function to module. In biology, this produces the complexity of the genotype-phenotype map. In technology, this does not occur with the simplest design practices, where there tend to be 1–1 mappings between functions and parts at various levels, as a product of piecemeal engineering practices. It is only when systems are designed (or later kluged) in ways such that individual elements serve multiple functions (Wimsatt 1974), and recognizable functions are accomplished thru the interaction of multiple elements, that we begin to approach the complexity found with top-down modularity. This situation, called "functional multiplexing" by Winograd and Cowan (1963), can be used deliberately to increase efficiency or reliability, or can be produced inadvertently when fixes or extensions of capacity are accomplished through multiple opportunistic kluges that ignore the existing functional architecture, as described in the "big ball of mud" of Foote and Yoder (1999).

2.3.1  How Modularity in Engineering Can Become Entrenched

In our elaboration of culture, and of biological organization, the operation of scaffolding is a common element to support modularization (Caporael et al. 2013). However, this focus on bottom-up modularity did not come to Western technology easily.14

14  Apparently the crossbows accompanying the thousands of Chinese pottery warriors associated with the tomb of Emperor Chin Shi Wang ca. 200 BC show signs of having been mass-produced, and do have parts that are interchangeable (Williams 2008), anticipating the West by two millennia.


I wish to describe more fully the efforts to produce interchangeable standardized parts which, when distributed, allowed the hardware store, with nuts, bolts, and standardized fittings, to largely replace the ubiquitous blacksmith shop, where the measure of quality was the ability to produce customized solutions in iron on demand. In this replacement, the development of interchangeability was scaffolded at every stage. The development of standardized parts and mass production in the nineteenth century provides a paradigm of the emergence of a combinatorial modular system crucial to the development of the industrial revolution. We can also see multiple stages in its emergence, through the development of supporting structures and practices. The initiating motivation was the desire of the military for muskets with interchangeable parts to reduce the number of arms put out of action, by allowing for much simpler repairs in the field—something crucial to an army and worth the added expense.15 In 1812 the Ordnance Department of the new American government encouraged development of interchangeable parts in arms produced at each of its two armories, in Springfield, Massachusetts, and Harper's Ferry, Virginia. Interchangeability was not easy to manage (it was never achieved for the fussy flintlock mechanisms of 1812 muskets) and not fully achieved until 1841, when the later and more reliable percussion muskets produced at the two armories had parts interchangeable not only with other muskets at the same armory but with each other (Smith 1977).

Why did it take so long? Several developments were required before interchangeability became possible. For anything larger than molecules, interchangeability is a hard-won battle, demanding significant scaffolding. The new mode of manufacture required reconfiguring the workforce and production, but also the development of specialized machine tools, of template gauges for accurate dimensioning, and of frequent measurement operations to make parts to given standards with successively greater precision. A move away from the craftsman tradition of individuals who made whole rifles with custom-fitted parts, and with great pride, to separately decomposed tasks of making individual types of interchangeable parts predictably met with massive resistance. It called for massive changes in lifestyle: demands that people keep regular hours to start their work-shift on time, and specialize on simpler subassemblies (like "lock, stock, and barrel"); still later, the work path was decomposed further to have individuals concentrate on still smaller parts or operations on them. Less skilled or unskilled workers replaced craftsmen (who were in short supply), and the new workers were paid lower wages using piece rates. The pride of creation was lost (it was hard for a piece-worker to identify with the whole complex product) and the changes met with resistance from workers as their jobs were "de-skilled" (Smith 1977); later, time-studies and efficiency of production (in units per hour) became a top-down tyranny of "Taylorite" production.

The armories were influential in two other respects. A number of "mechanics" learned the emerging mass-production techniques in these and in other smaller centers of arms manufacture and fanned out to other industries that began to take on "armory practice." And there was the development of a clear methodology.
15  Such arms were initially far more expensive, due to the cost of setting up for and de-bugging manufacturing, even though they ultimately became increasingly cheaper as this was accomplished.


The development and use of specialized machine tools allowed precision operations, and their configuration to do specified operations in a set order (to avoid time loss and errors in resetting for different operations), so they were set up in a line to allow workpieces to be passed from one to another for successive operations (Hounshell 1984). This reflected generative entrenchment of manufacturing operations. All of these (with frequent inspection operations) contributed to the reproducible production of parts sufficiently similar to be interchangeable. The use of these machine tools spawned a machine tool industry that made both tools specialized for other ends and tools (like lathes, drill presses, and milling machines) which were reproducibly adjustable to diverse precision operations (Smith 1977, 288).

The use of gauges and measurement also deserves special mention (Hounshell 1984, 41–42). Hall adopted the use of three sets (each comprising 63 gauges) for the first variety of muskets successfully produced with interchangeable parts. There was a master set against which all others were calibrated, an inspection set used for testing whether the pieces met the standard, and one or more work sets, used for testing in the actual production of the pieces. This allowed for successively reduced wear on the inspection and master sets. (There are significant parallels with biology here, with reliability the concern: DNA as the master molecule produces messenger RNA as the "working" molecule in protein synthesis, but is otherwise protected in the nucleus, while the proteins do the work.) Another important innovation by Hall was to set up a system for measuring all dimensions in the assembled product from a single reference or bearing point, so that error or "slop" in the placement of various fixtures to do operations on the whole piece would not be cumulative. But practice was still required: the desired interchangeability demanded a learning and ongoing feedback process of redesign involving back-and-forth interaction between machinists, machines, and inspection procedures, and not infrequently (from my design experience working for NCR) redesign to better serve reliable (or inexpensive) production.

After the initial experience with parts for muskets, standardized parts spread to the manufacture of sewing machines, reapers, bicycles, and larger mechanical assemblies, and also to smaller parts like threaded fasteners (screws, nuts, and bolts, and many other things; see Fig. 2.5), which played diverse roles in all kinds of machines. Institutional standardization of machinery began with boiler manufacture, after a number of boiler explosions wreaked havoc in New York (Hounshell 1984), and standardization spread to all sorts of things. Coordinated setting of standards becomes self-reinforcing in a "coordination game": if most others are standardized and you are not, your parts don't fit in their machinery and you rapidly go out of business. Different kinds of parts led to different manuals setting design standards, and where it was crucial enough, standards committees set up visitation procedures to assure that manufacturing met the standards. There are currently over 500 manuals setting standards for different kinds of manufactured objects (Fig. 2.5).
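Hall's single-reference-point practice can be illustrated with a small, hedged simulation (the error magnitude and fixture count are invented, and the model is only a caricature of tolerance stack-up): locating each fixture off the previous one lets placement "slop" accumulate, while locating every fixture from one datum does not.

```python
import random

# Caricature of tolerance stack-up; numbers are invented, not shop-floor data.
random.seed(1)
SIGMA = 0.05        # per-placement error, arbitrary units
N_FIXTURES = 10
TRIALS = 10_000

def worst_placement_error(chained: bool) -> float:
    worst = 0.0
    for _ in range(TRIALS):
        running = 0.0
        for _ in range(N_FIXTURES):
            e = random.gauss(0.0, SIGMA)
            running = running + e if chained else e   # chained placements inherit earlier slop
            worst = max(worst, abs(running))
    return worst

print(f"chained placements, worst error seen:      {worst_placement_error(True):.2f}")
print(f"single-datum placements, worst error seen: {worst_placement_error(False):.2f}")
```

On typical runs the chained scheme's worst-case error comes out markedly larger, which is the cumulative "slop" that measuring from a single bearing point was designed to avoid.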
Once multiple industries came to depend upon the parts (which now served multiple functions), distribution of standardized parts became a requirement, both for manufacturing and for retail consumption in specialized stores for hardware, electronics, plumbing, computer and automotive parts. This is an often ignored requirement: the


Fig. 2.5  "Threaded fasteners: miscellaneous bolts and screws" in H. Herkimer's Engineer's Illustrated Thesaurus, Chemical Publishing Company (1952), pp. 12–13, showing an adaptive radiation of different threaded fasteners as standardized interchangeable parts to serve different kinds of functions

right parts must be in the right place at the right time, calling for the elaboration of coordinated transport mechanisms. (Containerized shipping and "just-in-time" stocking of items for use are only the latest extensions of this trend.) In this way, distribution and use of standardized parts is like metabolism in biology. Indeed, I have a teaching chart for high school biology from 1944 exploiting this analogy, depicting the human body as a factory. The importance of modularity led to a focus on standardization, which became increasingly generatively entrenched. These standardized parts can be produced more cheaply and found more readily. If their use requires special training, that too becomes standardized, producing, for example, auto service stations, where mechanics can service (and in the old days, even repair) a variety of different brands of car, although the major and more specialized operations required at least getting parts from the dealership. The standardization of these parts, now anchored by the increasing number of functions that they serve, produces a combinatorial explosion—an adaptive radiation of products that use them—and rapidly produces a deep generative entrenchment for them. And any deeply generatively entrenched part must meet the very demanding conditions for its replacement described above. This is combinatorial generative entrenchment.


The existence of standardized parts has led to other productive practices. Chunking is the standardized assembly of a particular set of parts into a larger functional unit, which may itself then become a part that is distributed, or further chunked into still larger assemblages. Thus electric motors became significant modules, as did gasoline motors, fans, and pumps, all of varying sizes and capacities. There was no limit to this aggregation. Sears sold kit houses with 30,000 parts, delivered by rail (Thornton 2002; Wimsatt and Griesemer 2007). This modularity is emphasized by Brian Arthur (2009), when he claims that technology is "recursive", though he takes the way down, of parts-within-parts-within-parts … rather than the way up, as we have.

Along with larger assemblages of machinery went a complementary trend, of "black-boxing" the content of the new modules.16 Gone is the ability to disassemble the modules into their parts, or even to know what is inside, or how it works. This is in fact a necessary aspect of our increasingly complex technology: if we only have to know what a module does in a larger complex, and can ignore how it does it, we can learn to use increasingly complex machinery, and not be overloaded with all of its assembly and manufacturing details all the way down. This is a requirement for the use and understanding of more complex artifacts. Psychologist Donald Norman (1993) documents the impact of cognitive overload in pilots and machine operators who are given too much information ("too many dials to watch"), leading to catastrophic failures. At this point, the "user interface" becomes important, reducing the operation of complex machinery to knowing how to find the desired operations and what they do in an often overcomplex computer menu. But even this complexity must be limited. Excessive complexity has led to the standardized yoking together of functions commonly used together, leaving it only to the expert to figure out how to utilize the complex menus to accomplish customized functions or capabilities of the machine. And interface design has become a specialty increasingly in demand.

The bottom-up modules that emerged in the widespread utilization of standardized parts were widely used in standard engineering practice (also discussed by Arthur 2009). This "standard engineering practice" was furthered (and revealed) in engineering design books like that published in 1952 (and in multiple editions since) by H. Herkimer, called An Engineer's Illustrated Thesaurus. Herkimer's remarkable text is worthy of special note as a tool, for in it the modularity of design mimics the architecture of the artifacts: this "thesaurus" is organized by kind of
(Arthur’s work is also of broader interest, see Arthur 1982, 1984, 1988, 1997).

16


mechanism, and within kind of mechanism by more specialized function. It creates a "design alphabet" of alternatives organized for easy look-ups, but also shows nicely how basic mechanisms can proliferate varieties. They aren't strict functional equivalents, since each is specialized to a more particular kind of application. And they are types, not particular parts. This text also encourages engineers to break complex design problems into sub-problems which are "nearly decomposable" (Simon 1962/1969) and to use existing solutions rather than invent yet other variants unless absolutely necessary. Thus engineering practice becomes modularized and standardized as well!

2.4  Conclusion

So we have seen the importance of basic design principles in engineering and in Nature. Generative entrenchment and top-down and bottom-up modularity are crucial across all of what Herbert Simon named "The Sciences of the Artificial". This is not yet a complete list of such design principles: a fuller one would include hierarchy, robustness, self-organization, and evolvability, all worth discussing on another occasion, with self-reproduction, self-repair,17 and metabolism also important, but so far lacking in our engineered artifacts. Simon was unapologetic in lumping cases of engineering (and social) design with design in natural products, and in talking about features pertinent to both. His idea of near-decomposability has proved to be one of the most important analytical tools of the second half of the twentieth century. (I have used it extensively, e.g., Wimsatt 2007a.) His ideas of a heuristic procedure and of satisficing are no less important. It has been 50 years since the publication of Simon's groundbreaking integrative work. And it is no less important now than it was then. I dedicate this paper to his memory and to the continued influence of his ideas.

References

Arthur, W. (1982). A developmental approach to the problem of evolutionary rates. Biological Journal of the Linnean Society, 18(3), 243–261.
Arthur, W. (1984). Mechanisms of morphological evolution: A combined genetic, developmental and ecological approach. Chichester: Wiley.
Arthur, W. (1988). A theory of the evolution of development. New York: Wiley.
Arthur, B. (1992). Increasing returns and path dependence in the economy. Ann Arbor: University of Michigan Press.
Arthur, W. (1997). The origin of animal body plans: A study in evolutionary developmental biology. Cambridge: Cambridge University Press.

17  There is perhaps an exception justified here for hard disks or memory that assess themselves and avoid sectors identified as bad. This is of course a software issue.


Arthur, B. (2009). The nature of technology. New York: Macmillan.
Buffon, C. (1749–1804). Histoire Naturelle.
Calcott, B. (2014). Engineering and evolvability. Biology and Philosophy, 29, 293–313. https://doi.org/10.1007/s10539-014-9425-3.
Caporael, L., Griesemer, J., & Wimsatt, W. (Eds.). (2013). Developing scaffolding in evolution, culture, and cognition. Cambridge, MA: MIT Press.
Ceruzzi, P. (2003). A history of modern computing (2nd ed.). Cambridge, MA: MIT Press.
Darwin, C. (1859). The origin of species.
Davidson, E. (2006). The regulatory genome. New York: Elsevier.
Dawkins, R. (1976). The selfish gene. Oxford: Oxford University Press.
Dove, G. (2012). Grammar as a developmental phenomenon. Biology and Philosophy. https://doi.org/10.1007/s10539-012-9324-4.
Foote, B., & Yoder, J. (1999). Big ball of mud. University of Illinois Department of Computer Science. http://www.laputan.org/mud/mud.html
Galis, F., et al. (2006). Extreme selection in humans against homeotic transformation of cervical vertebrae. Evolution, 60(12), 110–121. Fig 1, p. 111.
Griesemer, J. R., & Wimsatt, W. C. (1989). Picturing Weismannism: A case study in conceptual evolution. In M. Ruse (Ed.), What philosophy of biology is: Essays dedicated to David Hull (pp. 75–137). New York: Kluwer.
Grosberg, R., & Strathmann, R. (1998). One cell, two cell, red cell, blue cell: The persistence of a unicellular stage in multicellular life histories. TREE, 13(3), March.
Haeckel, E. (1907). Das Menschenproblem und die Herrentiere von Linne. Berlin: Paul Parey.
Hounshell, D. A. (1984). From the American System to Mass Production, 1800–1932. Baltimore: Johns Hopkins University Press.
Janssen, M. (2019). Arches and scaffolds: Bridging continuity and discontinuity in theory. In A. C. Love & W. C. Wimsatt (Eds.), Beyond the meme: The role of development and structure in cultural evolution. Minneapolis: The University of Minnesota Press.
Latour, B. (1987). Science in action. Cambridge, MA: Harvard University Press.
Livnat, A., Papadimitriou, C., Dushoff, J., & Feldman, M. (2008). A mixability theory for the role of sex in evolution. Proceedings of the National Academy of Sciences, 105(50), 19803–19808. https://doi.org/10.1073/pnas.0803596105.
Simon, H. (1962). The architecture of complexity. Proceedings of the American Philosophical Society, 106(6), 467–482.
Smith, M. R. (1977). Harper's Ferry Armory and the New Technology: The Challenge of Change. Ithaca: Cornell University Press.
Norman, D. (1993). Things that make us smart: Defending human attributes in the age of the machine. New York: Addison-Wesley.
Rasmussen, N. (1987). A new model of developmental constraints as applied to the Drosophila system. Journal of Theoretical Biology, 127, 271–301.
Richards, R. (1983). Why Darwin delayed, or interesting problems and models in the history of science. Journal of the History of the Behavioral Sciences, 19, 45–53.
Richards, R. (2015). Chapter 11: The myth of Darwin's delay. In R. Numbers, K. Kampourakis, & N. Rupke (Eds.), Textbook myths about science. Cambridge, MA: Harvard University Press.
Riedl, R. (1978). Order in living organisms: A systems analysis of evolution (trans. R. P. S. Jefferies; German original 1975). New York: Wiley.
Schank, J. C., & Wimsatt, W. C. (1988). Generative entrenchment and evolution. In A. Fine & P. K. Machamer (Eds.), PSA–1986 (Vol. 2, pp. 33–60). East Lansing: The Philosophy of Science Association.
Schank, J. C., & Wimsatt, W. C. (2000). Evolvability: Modularity and generative entrenchment. In R. Singh, C. Krimbas, D. Paul, & J. Beatty (Eds.), Thinking about evolution: Historical, philosophical and political perspectives (Vol. 2, pp. 322–335). Cambridge: Cambridge University Press.


Scheuchzer, J. J. (1731). Das Kupferbibel. Augsburg.
Simon, H. A. (1962). The architecture of complexity. In H. A. Simon (Ed.), The sciences of the artificial (3rd ed.). Cambridge, MA: MIT Press.
Thornton, R. (2002). The houses that Sears built: Everything you ever wanted to know about Sears catalog homes. Alton: Gentle Beam Publications.
Turner, M. (1991). Reading minds: The study of English in the age of cognitive science. Princeton, NJ: Princeton University Press.
Von Baer, K. (1828). Über Entwicklungsgeschichte der Thiere. In Scientific memoirs, selections from foreign academies of science, and from foreign journals: Natural history (Vol. 1, pp. 221–224) (trans. Henfrey, A., & Huxley, T.). London: Taylor and Francis, 1853.
Von Neumann, J. (1956). Probabilistic logic and the synthesis of reliable organisms from unreliable components. In C. E. Shannon & J. McCarthy (Eds.), Automata studies (pp. 43–98). Princeton: Princeton University Press.
Wagner, A. (2005). Robustness and evolvability in living systems. Princeton: Princeton University Press.
Webster, B. F. (1999). The Y2K survival guide: Getting to, getting through, and getting past the year 2000 problem. Upper Saddle River: Prentice Hall PTR.
West-Eberhard, M. (2003). Developmental plasticity and evolution. Princeton: Princeton University Press.
Wiener, N. (1947/1957). Cybernetics. Cambridge, MA: MIT Press.
Williams, G. (1966). Adaptation and natural selection: A critique of some contemporary thought. Princeton: Princeton University Press.
Williams, D. (2008). Mass-produced pre-Han Chinese bronze crossbow triggers: Unparalleled manufacturing technology in the ancient world. Arms and Armour, 5, 142–153.
Wimsatt, W. C. (1972). Teleology and the logical structure of function statements. Studies in History and Philosophy of Science, 3, 1–80.
Wimsatt, W. C. (1974). Complexity and organization. In K. F. Schaffner & R. S. Cohen (Eds.), PSA-1972 (Boston Studies in the Philosophy of Science, Vol. 20, pp. 67–86). Dordrecht: Reidel.
Wimsatt, W. C. (1981). Units of selection and the structure of the multi-level genome. In P. D. Asquith & R. N. Giere (Eds.), PSA 1980 (Vol. 2, pp. 122–183). Lansing: The Philosophy of Science Association.
Wimsatt, W. C. (1986). Developmental constraints, generative entrenchment, and the innate-acquired distinction. In W. Bechtel (Ed.), Integrating scientific disciplines (pp. 185–208). Dordrecht: Martinus-Nijhoff.
Wimsatt, W. C. (2001). Generative entrenchment and the developmental systems approach to evolutionary processes. In S. Oyama, R. Gray, & P. Griffiths (Eds.), Cycles of contingency: Developmental systems and evolution (pp. 219–237). Cambridge, MA: MIT Press.
Wimsatt, W. C. (2002). Functional organization, functional inference, and functional analogy. Substantially revised and expanded version of 1997a, for a collection on function edited by Robert Cummins, Andre Ariew, and Mark Perlman. Oxford, 174–221.
Wimsatt, W. C. (2003). Evolution, entrenchment, and innateness. In T. Brown et al. (Eds.), Proceedings of the 1999 Piaget Society meetings (pp. 53–81). Mahwah: Lawrence Erlbaum and Associates.
Wimsatt, W. C. (2007a). Re-engineering philosophy for limited beings: Piecewise approximations to reality. Cambridge, MA: Harvard University Press.
Wimsatt, W. C. (2007b). Echoes of Haeckel? Re-entrenching development in evolution. In J. Maienschein & M. Laubichler (Eds.), From embryology to evo-devo: A history of developmental evolution (pp. 309–355). Cambridge, MA: MIT Press.
Wimsatt, W. C. (2012). The analytic geometry of genetics: The structure, function, and early evolution of Punnett squares. Archive for the History of the Exact Sciences (inaugural biology issue) (17,498 words + 10 figs.).
Wimsatt, W. (2013a). Scaffolding and entrenchment. In L. Caporael, J. Griesemer, & W. Wimsatt (Eds.), Developing scaffolding in evolution, culture, and cognition. Cambridge: MIT Press.


Wimsatt, W. C. (2013b). Evolution and the stability of functional architectures. For CNRS conference on function and teleology. In P. Huneman (Ed.), Functions: Selection and mechanisms (Synthese Library #363) (pp. 19–41). Dordrecht: Springer.
Wimsatt, W. C. (2015). Entrenchment as a theoretical tool in evolutionary developmental biology. In A. C. Love (Ed.), Conceptual change in biology: Scientific and philosophical perspectives on evolution and development (Boston Studies in Philosophy of Science). Berlin: Springer.
Wimsatt, W. C. (2019). Articulating Babel: A conceptual geography for cultural evolution. In A. Love & W. Wimsatt (Eds.), Beyond the Meme: Development and Population Structure in Cultural Evolution (Minnesota Studies in Philosophy of Science, 22, pp. 1–41). Minneapolis, MN: University of Minnesota Press.
Wimsatt, W. C., & Griesemer, J. R. (2007). Reproducing entrenchments to scaffold culture: The central role of development in cultural evolution. In R. Sansome & R. Brandon (Eds.), Integrating evolution and development: From theory to practice (pp. 228–323). Cambridge, MA: MIT Press.
Wimsatt, W. C., & Schank, J. C. (1988). Two constraints on the evolution of complex adaptations and the means for their avoidance. In M. Nitecki (Ed.), Evolutionary progress (pp. 231–273). Chicago: University of Chicago Press.
Wimsatt, W. C., & Schank, J. C. (2004). Generative entrenchment, modularity and evolvability: When genic selection meets the whole organism. In G. Schlosser & G. Wagner (Eds.), Modularity in evolution and development (pp. 359–394). Chicago: University of Chicago Press.
Winograd, S., & Cowan, J. D. (1963). Reliable computation in the presence of noise. Cambridge, MA: MIT Press.
Wolf, M. (2008). Proust and the squid: The story and science of the reading brain. New York: Harper.

Chapter 3

Technological Progress in the Life Sciences

Janella Baxter

Abstract  The new gene-editing tool, CRISPR-Cas9, has been described as "revolutionary." This paper takes up the question of in what sense, if any, this might be true and why it matters. I draw from the history and philosophy of technology to develop two types of technological revolutions (Hughes, Technological momentum in history: Hydrogenation in Germany 1898–1933. Oxford University Press, New York, 1969; Wimsatt, Re-engineering philosophy for limited beings. Harvard University Press, Cambridge, MA, 2007; Constant, The origins of turbojet revolution. The Johns Hopkins University Press, Baltimore, 1980; Scaife, Sci Am 252(4), 1985). One type of revolution involves a technology that enables users to change a generatively entrenched structure (Wimsatt, Re-engineering philosophy for limited beings. Harvard University Press, Cambridge, MA, 2007). The other type involves a technology that works within a generatively entrenched structure, but as a result of incremental improvement becomes the "new normal" technology for a community (Scaife, Sci Am 252(4), 1985). In what follows, I argue that if CRISPR-Cas9 is revolutionary at all – and I do not take a stand on the issue – it is in becoming the "new normal" molecular technology across biology labs. By contrast, a technology that has the potential of being revolutionary in Wimsatt's sense is the orthogonal tRNA technique developed by Peter Schultz's synthetic biology lab. Whether or not CRISPR-Cas9 or the orthogonal tRNA technologies are revolutionary, I propose to treat these two types of putative revolutions as distinct types of technological innovation. I argue further that observing distinctions between types of technological innovation can be useful for tracking the epistemic and normative consequences that technology raises.

Keywords  Genetic technologies · Technological revolutions · Synthetic biology

J. Baxter (*)
Washington University in St. Louis, St. Louis, MO, USA

© Springer Nature Switzerland AG 2021
Z. Pirtle et al. (eds.), Engineering and Philosophy, Philosophy of Engineering and Technology 37, https://doi.org/10.1007/978-3-030-70099-7_3


3.1  Introduction

The new gene-editing tool, CRISPR-Cas9, has sparked a lot of attention and excitement since its unveiling. Indeed, the technology is often described (explicitly and implicitly) as revolutionary. The aim of this paper is to analyze in what sense, if any, this claim is true and whether it matters. I achieve this by drawing from cases in the history and philosophy of technology to distinguish between two types of revolutions (Hughes 1969; Wimsatt 2007; Constant 1980; Scaife 1985). One type of revolution involves the creation of a tool that enables users to alter what the philosopher of science William Wimsatt calls generatively entrenched structures (Wimsatt 2007). The other type involves the creation of a tool that enhances or improves upon an existing technology in such a way that it helps explain the widespread adoption of the tool in a community of users (Scaife 1985). I call cases that achieve this sort of change the “new normal” technology. “New normal” technologies differ from the former type in that they do not enable users to change generatively entrenched structures, but instead function within them. I argue that if CRISPR-Cas9 is revolutionary at all, it is not revolutionary in the sense that it enables the alteration of a generatively entrenched structure. For unlike some technologies developed by synthetic biologists – notably, the orthogonal tRNA case (Liu and Schultz 2010) – CRISPR-Cas9 functions within the generatively entrenched structure of DNA and the standard genetic code. Instead, if CRISPR-Cas9 is revolutionary, it is in terms of being the “new normal” molecular technique among biologists across labs globally. Ultimately, whether a technology is in fact revolutionary is an empirical question best left to social scientists to settle. Nevertheless, in laying out the distinct types of revolutions that historians have described, I hope to clarify the historical and contemporary significance of novel molecular tools in biology like CRISPR-Cas9 and the orthogonal tRNA technology.

Whether or not technological revolutions ever really occur, I argue that it is nevertheless useful to track technological innovations in terms of “new normal” technologies and technologies that enable users to change generatively entrenched structures. In that case, it is more appropriate to use the language of technological innovation instead of revolution. Technological innovations that proceed along these different paths have distinctive epistemic features and normative challenges. As I’ll argue (Sects. 3.2 and 3.3), “new normal” technologies are recent developments of a longstanding approach to solving a problem. This means that the designers and users of this type of technology have adequate background and theoretical knowledge about whether and how the technology will perform a desired function. This is in contrast with technologies that enable users to change a generatively entrenched structure. At least in the initial stages of such a technology’s design and development, there is significantly less background and theoretical knowledge as to whether and how the tool will work. In this way, technologies like the orthogonal tRNA technique developed by synthetic biologists share some similarities with radical technologies like the Davis airfoil described by Walter Vincenti (1990). I use the CRISPR-Cas9 and orthogonal tRNA cases to illustrate the epistemic differences


that arise for each type of technological innovation. The epistemic differences characteristic of different types of technological innovations have consequences for the relevant normative questions that designers and users of a technology ought to consider. In the case of CRISPR-Cas9, scientific communities have a sense of what potential harms can arise from using the tool to modify human germlines or to control the inheritance patterns of populations (as a gene-drive mechanism). The potential risks associated with applying CRISPR-Cas9 technologies to real-world problems, like treating disease and controlling pest populations, have sparked a burgeoning literature exploring the ethics of such applications (O’Keefe et al. 2015; Smolenski 2015; Rodriguez 2016; Comfort 2015; Charo and Greely 2017; Guttinger 2018; Deplazes et al. 2009). In the case of the orthogonal tRNA technology, researchers currently have much less knowledge about the potential risks of using this technique to engineer organisms carrying an alternative genetic code that incorporates unnatural amino acids into proteins. In enabling users to change the genetic code – a generatively entrenched structure – the orthogonal tRNA technology makes possible things that biologists had scarcely imagined or hypothesized. Thus, it is challenging to raise specific ethical concerns as authors have done in the case of CRISPR-Cas9. Nevertheless, at the moment more research and greater reflection are needed on whether synthetic biologists ought to advocate for the use of orthogonal tRNA as a biocontainment safety mechanism in, say, agricultural settings (Schmidt 2010; Diwo and Budisa 2019). By making the alteration of a generatively entrenched structure possible, technologies like the orthogonal tRNA case also raise further normative questions. As synthetic biologists advance in their efforts to engineer wholly artificial life, conceptual and normative questions arise about how synthetic organisms relate to other paradigmatic living systems. How we answer this question will bear importantly on how we ought to value these specimens. Currently, the predominant attitude among synthetic biologists has been to value these specimens in terms of their economic and experimental value; however, this may need to be rethought as synthetic biologists continue to advance their efforts. Finally, as a discipline that defines itself by adopting ever more radical approaches to engineering life, the synthetic biology community is in a position to reflect on whether an interest in gaining increasing amounts of control over life processes is a responsible norm to embrace.

This paper has five parts. I begin (Sect. 3.2) by outlining the dominant approach by which biologists have sought to control life processes and evolution – namely, by developing tools and techniques to intervene on genetic material. I show how CRISPR-Cas9 is the latest development in this long history. Next (Sect. 3.3) I lay out several conditions that must be met for a technology to be revolutionary and I spell out two types of technological revolutions. I proceed to argue (Sect. 3.4) that if CRISPR-Cas9 is revolutionary at all – and I do not wish to take a stand on this matter – it falls under the “new normal” type of revolution. I contrast CRISPR-Cas9 with the orthogonal tRNA technology, which I argue is an example of a tool that enables users to change a generatively entrenched structure of life – namely, the genetic code. Finally (Sect. 3.5), I argue that tracking different kinds of technological innovation – whether or not they constitute revolutions – is useful for


understanding the recent surge of ethical challenges authors have posed to some proposed applications of CRISPR-Cas9. Tracking technological innovation is also useful for laying out the normative challenges that arise for technologies that enable users to change generatively entrenched structures.

3.2  A History of Genetic Intervention

The history of inheritance and genetics is very much a history of developing and using novel technologies to intervene on heritable material. In what follows, I trace this history back to the early twentieth century, before the molecular age of biology, to the use of crude technologies such as X-rays, radium, and colchicine. I argue that CRISPR-Cas9 is the latest development in this history. What’s noteworthy is that it is not even the latest molecular technology developed to intervene directly on genetic material. However, it is the most affordable and customizable of molecular gene-editing technologies.

Whether for the pursuit of knowledge, commercial gain, or curative purposes, a central aim in the biological sciences since the twentieth century has been to intervene on heritable material as a means of manipulating and controlling life processes. By the late nineteenth and early twentieth century, researchers and agriculturalists had developed controlled breeding regimens that indirectly manipulated the genetic constitution of biological populations for the purposes of study and managing crop yield (Morgan and Bridges 1919; Kohler 1994; Paul and Kimmelman 1988). As a tool for the study and manipulation of life, breeding regimens were somewhat effective at aiding in the observation and direction of inheritance in some kinds of populations. However, the history of genetics has been driven by the pursuit of greater control over inheritance and the many metabolic processes that contribute to life.

Before the molecular structure of DNA was first described in 1953, biologists applied a diverse array of mutagenic substances and processes to induce random changes in the chromosomal makeup of plants and animals (Campos 2016; Curry 2016). Radium, X-rays, and colchicine were all employed to generate novel biological varieties for the purposes of studying genetic inheritance and speciation, speeding up evolution, treating human disease, and producing commercial goods (de Vries 1909; Muller 1927; Burpee and Taylor 1940). The expectations that some scientists and breeders had for these methods were high. In agriculture, the hope was to develop methods that put the direction of evolution in the hands of breeders (Curry 2016). No longer, researchers and breeders hoped, would industry have to wait for the slow, unreliable process of natural selection to generate bigger and better varieties. Popular news outlets and magazines advertised each of these techniques as revolutionary. For example, the Charleston Daily Mail Magazine touted the use of X-rays as giving breeders the power to make “New Life to Order” (Gray 1932; Curry 2016). But as tools for intervention on heritable material, X-rays, radium, and colchicine were relatively crude. Researchers and agronomists often had little control over the genes targeted by these technologies. Furthermore, application of these technologies


targeted many genes at once, making it challenging to isolate the contribution that a few genes make to an organism’s phenotype, survival, and reproduction. And finally, these technologies rarely intervened just on heritable material. Often, they damaged unrelated phenotypic traits and metabolic processes. Observable changes were often only achievable in some types of organisms and were of limited heritability. Despite the immense hype surrounding each of these technologies, they nevertheless fell short of the hopes and expectations of researchers and agronomists of the time.

The rise of a molecular age of biology around the 1950s brought forth not only a new way of thinking about heritable material, but also novel ways to imagine human intervention on genes. Discovery of DNA’s molecular structure along with the deciphering of the genetic code helped usher in a novel gene concept – the molecular gene concept (Waters 1994). Molecular genes consist of regulatory modules and sequences of nucleic acid bases – adenine, thymine, cytosine, and guanine – that in turn control the linear sequences of other molecular products. For example, the nucleic acid sequence along a segment of DNA specifies the nucleic acid sequence of a complementary segment of DNA according to Watson-Crick base-pairing rules, by which adenine always pairs with thymine and guanine always pairs with cytosine.1 When it comes to proteins, the genetic code ensures that the same gene sequence is systematically translated into the same amino acid sequence across life. The genetic code is a mapping between nucleic acid triplets (codons) and the ~20 canonical amino acids. There is some redundancy in the genetic code – meaning that more than one codon associates with the same amino acid. Nevertheless, for the most part changes in the nucleic acid sequence of a molecular gene can produce many specific changes in the linear sequences of other DNA sequences, RNA transcripts, and proteins, which can in turn produce a range of changes in the numerous metabolic and biochemical processes that sustain life.

With a molecular conception of the gene came new possibilities for human intervention on life processes. By intervening on the nucleic acid sequence of a protein-coding gene, for example, a researcher can manipulate and control the linear sequence of other biomolecules and perhaps even their functions. An interventionist approach to understanding and controlling life processes has persisted into the molecular age, despite significant conceptual shifts in our understanding of heritable material. Beginning in the late 1960s and through the 1970s, one of the first means by which researchers directly intervened on genetic material was the use of specialized proteins called restriction enzymes, which induce DNA recombination by cleaving the sugar-phosphate backbone of DNA at specific locations in a genome (Smith and Wilcox 1970; Werner 1979). DNA recombination is a process that repairs broken strands of DNA in natural, living organisms. This process involves a host of DNA repair enzymes that rejoin DNA strands by either introducing new nucleic acid bases or ligating the existing nucleic acid bases together at the site of

1  A similar relationship holds for the intermediary copies of DNA called ribonucleic acid or RNA transcripts. The only difference is that uracil replaces thymine.


breakage. In repairing broken DNA strands, recombination often results in a nucleic acid sequence different from the one originally present. In fact, researchers have found ways to guide the new nucleic acid sequence that gets inserted into the breakage site by providing nucleic acid sequence templates at the place and time of recombination. DNA recombination techniques continue to be a central step of contemporary gene-editing methods.

The specificity with which restriction enzymes identify and target precise nucleic acid sequences in DNA comes from a DNA-binding domain, or subregion of the enzyme, which binds to particular nucleic acid bases in DNA. Different restriction enzymes exhibit different “codes” or rules by which their DNA-binding domains recognize and bind to nucleic acid sequences. Although some restriction enzymes are effective for targeting specific nucleic acid sequences, a major engineering challenge for biologists has been to decipher and expand an enzyme’s “code” so that it can target a broader range of nucleic acid sequences. Often, this requires altering the protein’s conformation by changing its amino acid sequence in subtle ways. The engineering challenge has been to identify, from a very large set of possible alternative protein conformations, the alternative amino acid sequences that will expand the enzyme’s target range. An average protein of about 300 amino acids can take on 20^300 possible alternative amino acid sequences, since there are ~20 possible amino acids from which to choose at each position of a protein. Two types of specialized enzymes that are relatively amenable to engineering and, thus, can target an expanded range of sequences have been discovered and are widely used – namely, zinc finger nucleases and TALENs (short for transcription activator-like effector nucleases). Zinc finger nucleases and TALENs consist of two major components – a customizable guide element that helps precisely identify the nucleic acid sequence a researcher wishes to target and a DNA-cleaving enzyme (Gaj et al. 2013). Researchers have been relatively successful in identifying the “code” by which the DNA-binding domains of these proteins identify different nucleic acid sequences.

Gene editing with restriction enzymes coupled with DNA recombination processes represented a significant technological advancement over the use of X-rays, radium, and colchicine. Zinc finger nucleases and TALENs gave researchers the ability to intentionally intervene on a wide range of gene sequences. This is in stark contrast to the blind, random mutagenesis that previous interventionist technologies enabled. Gene-editing with zinc fingers and TALENs, furthermore, gives researchers the ability to intervene on DNA while also minimizing the number of unwanted changes in other unrelated processes. Administration of some mutagens can severely damage, if not kill, the cellular tissue of an organism. By contrast, delivery of zinc finger nucleases and TALENs to living cells is much less damaging to an organism. All that is required is that a cell take up genetic material that encodes these proteins, which can be achieved in various ways – such as viral plasmids – that are much gentler on the organism. Gene-editing with restriction enzymes has a level of precision that is more surgical than the crude methods of the early twentieth century. The precision with which these techniques enable intervention on heritable material has important epistemological and problem-solving value.
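To give a sense of the combinatorial challenge just mentioned for engineering new DNA-binding specificities, here is a small worked illustration (my own sketch, not from the original text); it simply restates the 20^300 figure above in more familiar terms.

```python
# Rough size of the sequence space for a 300-residue protein, with ~20
# canonical amino acids available at each position (the 20^300 figure above).
from math import log10

protein_length = 300
amino_acids = 20

sequence_space = amino_acids ** protein_length          # exact big integer
print(len(str(sequence_space)))                         # 391 digits
print(round(protein_length * log10(amino_acids)))       # i.e. roughly 10^390
```

Even a screen that tested billions of protein variants per day would cover only a vanishingly small fraction of such a space, which is one way of seeing why customizing zinc finger and TALE proteins for new target sequences is so laborious.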
The precise intervention on gene sequences with restriction enzymes enables researchers to investigate and reason


about how differences in a gene sequence might make a difference to other metabolic processes in a systematic way. When differences in a metabolic process reliably correlate with differences in a gene sequence, researchers can reasonably infer that the gene is causally relevant in some way to the metabolic process. A systematic examination of how a gene is causally relevant to some other biological process was more challenging to achieve with X-rays, radium, and colchicine. Since each of these methods makes many changes at one time, researchers couldn’t know which of the changes caused changes in other metabolic processes. Gene-editing with restriction enzymes also enhances our ability to solve real-world problems. A serious challenge in the treatment of disease, for example, is the minimization of unwanted side effects of a drug. More precise interventionist methods hold the promise of changing a cause of a disease without also producing other, deleterious problems for a patient.

As significant a technological advance as zinc finger nucleases and TALENs may be, they nevertheless face limitations. The most serious of these has to do with the customizability of the DNA-binding domain to identify and target a wide range of gene sequences. A novel zinc finger or TALE protein must be customized for each novel gene sequence a researcher wishes to target. This is both an expensive and laborious feat (Ledford 2015). The amino acid sequences of the DNA-binding domain of a protein can interact with other subregions of a zinc finger protein in such a way as to compromise its ability to precisely recognize the desired gene sequence. The range of nucleic acid sequences that TALENs can recognize and bind is limited to sequences that begin with the nucleic acid base thymine (Gaj et al. 2013). Although gene-editing with protein-based technologies has represented a significant technological advancement in terms of the precision with which researchers can target gene sequences in DNA, this advancement has been tempered by the cost – both in terms of time and money – of engineering novel DNA-recognition domains of zinc finger and TALE proteins.

CRISPR technologies, especially CRISPR-Cas9, are the latest development in gene-editing technologies and overcome some of the limitations of the previous techniques. Like the previous molecular technologies I’ve discussed, CRISPR-Cas technologies – both the inspiration and the material components – were derived from naturally evolved genetic mechanisms. In microbes, a CRISPR (short for clustered regularly interspaced short palindromic repeats) array is a sequence of nucleic acid bases encoded in the organism’s genome that contains a genetic signature of past encounters with infectious agents (like viruses). Flanking the array are gene sequences that encode a suite of proteins (or CRISPR-associated proteins – Cas proteins) with various functions. Some Cas proteins insert new genetic signatures into the CRISPR array from infectious agents; some transcribe and process intermediary products called CRISPR RNA (or crRNA) that carry the genetic signature of genetic pathogens; others – most notably, Cas9 – cleave genetic material found in the microbe that matches the genetic signature encoded by the CRISPR array. The gene-editing tool, CRISPR-Cas9, employs single guide RNAs (sgRNAs) that are


synthesized and designed by the experimenter (Jinek et al. 2012).2 Like the naturally evolved crRNAs, sgRNAs identify genetic material with matching nucleic acid sequences by Watson-Crick base-pair rules – adenine binds with thymine (with uracil in RNA), thymine/uracil binds with adenine, guanine binds with cytosine, cytosine binds with guanine. The sgRNA guides the DNA-cleaving enzyme, Cas9, to a precise gene sequence for inducing a breakage.3 In comparison with zinc finger nucleases and TALENs, CRISPR-Cas9 is significantly easier and cheaper to customize to target almost any gene sequence a researcher may wish.4 The design and synthesis of sgRNAs that can identify almost any nucleic acid sequence in DNA is significantly easier than the design and synthesis of complex proteins. So, while zinc finger nucleases and TALENs are relatively precise and customizable, CRISPR technologies have expanded the range of gene sequences that can be targeted to include almost any sequence a researcher wishes.

Although CRISPR-Cas9 represents a significant advancement in gene-editing, the extent to which it enables researchers to “make new life to order,” as was hoped in the early twentieth century, is unclear. For one thing, all of these technologies rely on DNA recombination processes to recombine broken DNA strands. Researchers don’t have complete control over the nucleic acid sequence that results from repair of the broken DNA strands (Ma et al. 2017). Furthermore, all of the gene-editing technologies can have off-target effects, cleaving gene sequences that a researcher did not intend (Kosicki et al. 2017). And lastly, even when a gene edit is successfully carried out with minimal disruption to the living system, it is a further matter entirely whether and how that edit will impact the overall phenotype of the organism. It is common for organisms to have evolved backup mechanisms that compensate for the loss of a gene that would otherwise contribute to a phenotypic trait (Kitano 2004). This means that some gene edits (even with CRISPR-Cas9) will have no effect on the organism. It is also common for a gene to have multiple functions (pleiotropy). An intervention on a pleiotropic gene could result in unintended phenotypic differences (Guttinger 2018). What this suggests is that more precise techniques for intervening on genes will not always be sufficient on their own to produce precise control over living organisms.

2  sgRNAs fuse a crRNA with an additional RNA molecule (called a trans-activating crRNA or tracrRNA) that helps guide the CRISPR-Cas technology to its precise target.
3  Although the Cas9 protein doesn’t perform the function of locating the precise target for intervention alone, it has nevertheless been modified in artificial ways to expand its performance as a technology. One of the most important modifications has been the addition of a nuclear localization signal to enable the Cas9 protein to enter the nucleus of cells – a function that naturally evolved Cas9 proteins don’t have (Cong et al. 2013).
4  When authors describe CRISPR-Cas9 on its own, without comparing it to other technologies, it is common for them to praise it for its superior precision, accuracy, and specificity (Jinek et al. 2012). However, this sort of assertion doesn’t help us understand the state of current technology. When CRISPR-Cas9 is compared to TALENs and zinc finger nucleases, its precision, accuracy, and specificity are widely recognized as being comparable (Bennett 2017; Gaj et al. 2013; Chari and Church 2017).
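As a rough illustration of the sgRNA targeting logic described above (a simplified sketch of my own, not the authors’ method or any lab protocol), the following code scans a DNA string for sites that a 20-nucleotide guide can base-pair with under Watson-Crick rules; real Cas9 targeting also involves requirements such as a PAM motif and some tolerance for mismatches, which are omitted here.

```python
# Minimal sketch of guide-directed target finding by Watson-Crick pairing.
# A guide RNA base-pairs with the DNA strand complementary to the protospacer,
# so in practice one searches for DNA stretches identical to the guide
# (reading U as T). PAM requirements and off-target scoring are omitted.

COMPLEMENT = {"A": "T", "T": "A", "G": "C", "C": "G"}

def reverse_complement(seq: str) -> str:
    """Return the reverse complement of a DNA sequence."""
    return "".join(COMPLEMENT[base] for base in reversed(seq))

def find_target_sites(genome: str, guide_rna: str) -> list[int]:
    """Return start positions (on either strand) where the guide matches."""
    protospacer = guide_rna.replace("U", "T")        # guide read as DNA
    hits = []
    for strand in (genome, reverse_complement(genome)):
        start = strand.find(protospacer)
        while start != -1:
            hits.append(start)
            start = strand.find(protospacer, start + 1)
    return hits

genome = "ATGCGTACCGGATTACCGTTAGCCGGATATCGGCTA"   # toy sequence
guide = "CCGGAUUACCGUUAGCCGGA"                    # 20-nt guide, written as RNA
print(find_target_sites(genome, guide))           # [7] for this toy example
```

The point is only that once a short guide sequence can be designed and synthesized cheaply, retargeting the tool amounts to changing a short string, whereas retargeting a zinc finger or TALE protein means re-engineering the protein itself.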


This journey through the history of technology development in the biological sciences is meant to demonstrate how technological progress, at least since the twentieth century, has been driven by an interest in precise interventions on as wide a range of gene sequences as possible. Precision has epistemic and real-world value. As an epistemic value, precision enables researchers to (at least in some cases) associate changes they induce on particular gene sequences with changes in life processes. Knowing that a researcher has induced a change in a particular gene and (relatively) nothing else warrants the inference that any other differences that correlate with the intervention are under the causal control of the gene. But this isn’t all. The ability to produce precise changes in the heritable information of living organisms has, for quite some time even before the twentieth century, been of immense agricultural, ecological, and medical value. However, precise intervention on a wide range of gene sequences isn’t just a matter of engineering more surgical gene-editing technologies. The engineering of a gene-editing tool that enables precise intervention on a wide range of gene sequences requires that the tool also be relatively cheap to customize. This is the lesson to glean from the history of protein-based gene-editing technologies, like zinc finger nucleases and TALENs, and the emergence of CRISPR technologies. In other words, the experimental and real-world significance of an interventionist tool is a matter of both precision and the ease with which it can be customized.

3.3  What’s a Technological Revolution Anyway?

The popular rhetoric surrounding CRISPR-Cas9 paints a picture of a revolutionary technology – a technology that is ushering in widespread social change. Scientists, humanists, and journalists alike have characterized CRISPR-Cas9 with intense excitement. It’s been described as “revolutionizing medicine” (Marks et al. 2018), an “exponential advance” (Chari and Church 2017), and “the holy grail of genetics” (Reichel 2018). If technological revolutions occur at all, is there a principled way of judging what is and is not a revolution? In what follows, I lay out some conditions that (I take it) must be met for a technological revolution to occur. I show how the history of technological innovation can meet the two conditions described here in various ways. What this means is that technological revolutions can come in different varieties depending on the sort of interventions they make possible and the changes they bring about.

If technological revolutions occur at all, at minimum two conditions must be met. First, the technology must have some feature that sets it apart from other existing technologies available at a point in time. This makes the attribution of a revolution to a particular technology rather than another principled and not arbitrary. The feature that sets a technology apart can be anything that makes some intervention possible that was previously beyond the reach of scientists, researchers, agronomists, medical professionals, and so on. Alternatively, a relevant feature could have to do with the economics


of an intervention, by making intervention easier, cheaper, more efficient, etc. Whatever the feature is, what matters is that a technology possess some property that justifies it being singled out from other existing technologies as special.

Second, technological revolutions, like the automobile boom in America in the 1920s, are commonly described as having dramatic, large-scale effects on society (Hughes 1969). Novel technologies may possess many features that set them apart from previous tools, but not all of these will be responsible for ushering forth important social change. As with a technology’s distinguishing features, I also mean significant social change broadly. Social change can be a significant increase in the use of an experimental technique or in the production of a commercial good. Alternatively, significant social change might be a shift in the attitudes, beliefs, and theoretical knowledge of a community. Furthermore, what is a significant social change to one community may not be to another. Some communities may be untouched by a technological advancement either because of lack of access, a lack of interest, or due to economic, political, and cultural barriers. Ultimately, whether a technology has a significant social impact is an empirical matter that I won’t settle in this paper. My aim is to lay out the conceptual landscape for technological change. Since there are many ways a new technology can differ from other existing tools and many ways a new technology might cause social change, there are conceivably different types of technological revolutions. A tool might be revolutionary in some ways and not in others.

The claim that a technology plays a role in significant social change requires further elaboration. I am not suggesting that a technology is the singular cause of social change. Rarely do technologies alone have significant social impact – various social, political, cultural, and economic conditions must be in place for social change to occur (Cook 1995; Hughes 1969). For example, the demands of war and restricted foreign trade created the impetus for Germany’s rapid adoption and development of the Haber-Bosch method of producing nitrogen compounds (Hughes 1969). The proposal instead is that a technology (or set of technologies) is one of several causally significant variables that make a difference to whether social change occurs. Technology must feature as one variable along with economic, political, cultural, and other social variables to help explain some social change. In this way, the social change brought about by technological innovation is not inevitable or inescapable. The absence of a crucial social, political, or economic variable could very well mean that the relevant social change does not occur. The sense of technological revolution defended here may very well diverge from other meanings of the expression. Yet for the notion of a technological revolution to have any plausibility, it should accommodate the many non-technological factors that make major social change possible.

Many authors recognize that technological revolutions can come in different types. One type involves the design and development of technologies that make it possible to change a deeply, generatively entrenched feature or structure of a system (Wimsatt 2007). Something is generatively entrenched so long as it (1) plays a role in systematically and reliably producing and reproducing a structure and (2) is foundational for the functioning of other features that are part of the whole structure


(Ibid, 134). Generative entrenchment is a matter of degree, depending on the number of other processes that depend upon it. A paradigm example of a deeply generatively entrenched feature of a system is DNA and the nearly universal genetic code – the precise mapping between nucleic acid sequences in DNA and the ~20 canonical amino acids that make up protein sequences. A widely held theory is that the genetic code is a “frozen accident” (Crick 1968; Maeshiro and Kimura 1998). That is to say, the precise codon-amino acid assignments are a historically contingent fact about life. However, evolutionary constraints select against any mixing and matching of codons and amino acids. Since the genetic code originated, numerous life processes have evolved to rely on the faithful translation of gene sequences into amino acid sequences, such that any mixing and matching between codons and amino acids is likely to be fatal to the living organism. A common attitude among some authors has been that the genetic code is nearly impossible to alter: “Arbitrary features, like the particular association between anti-codons5 and amino acids, acquire an essential character as organic systems increase in complexity. Everything would be scrambled if their associations changed, making modifications in them impossible” (Wimsatt 2007, 137).6

Because deeply generatively entrenched structures provide the stability for so many further processes, for Wimsatt any change in such a structure is a revolutionary one: “Something that is deeply generatively entrenched is in effect a foundational element, principle, or assumption. A great deal depends upon it, and must be given up or generated, or justified in other ways if it fails. Changing principles with greater generative entrenchment is revolutionary in effect if successful, and disastrous if not” (Ibid, 140).

“Success” can be interpreted in many ways. When it comes to any technology that changes the genetic code, one way to interpret the success of such a technique is whether it sustains the life of the organism. Continuation of life despite variation in the genetic code would be a mark of success from the perspective of the organism in which the procedure is performed. Furthermore, altering the genetic code in a life-sustaining way would (especially in multicellular organisms) be surprising to biologists, given how widely accepted the frozen accident hypothesis is. Wimsatt’s account of revolution may satisfy the first condition mentioned; however, altering a generatively entrenched structure by itself needn’t satisfy the second condition. A technology that successfully alters a generatively entrenched

5  Anti-codons are the component of the transfer RNA molecule (tRNA) that ensures that the same codon associates with the same amino acid during translation.
6  For other examples of a similar attitude see Weber (2013, 2017). Weber (2017) appeals to alternative codon-amino acid assignments as “mad or gerrymandered” possibilities – possibilities that lie outside the scope of what is relevant to the study of biology. Biologists recognize that a few subtle variations in the codon-amino acid assignment of some organisms have evolved; however, most are found in single-celled organisms where the amount of disruption to further downstream effects is smaller than in multicellular organisms (Maeshiro and Kimura 1998). For this reason, the genetic code is often described as “nearly universal” (Koonin 2017).


structure is likely to have some properties that set it apart from other existing technologies. Since generatively entrenched structures are challenging to alter, it’s likely that many technologies and bodies of knowledge will operate within systems that are sustained by a generatively entrenched structure. As I’ll discuss below, the orthogonal tRNA technology successfully varies a structure that Wimsatt takes as a paradigm example of something that is deeply generatively entrenched. In this way, the orthogonal tRNA technology meets the first condition for a technological revolution. But as I’ll explain, this novel technology has not yet helped bring about significant social change. As a consequence, it is far from being revolutionary.

Another way technological revolution is often used by authors doesn’t concern generatively entrenched structures, but instead concerns what has become the “new normal” technology (Scaife 1985). On this sense of revolution, technologies are revolutionary when they become widely adopted by a community. Some authors regard the Parsons Steam Turbine as having “revolutionized shipping and the generation of electric power.” The Parsons Steam Turbine was an enhancement and modification of previously existing turbine technologies (Scaife 1985, 132). By the end of the nineteenth century, the mechanical workings of steam turbines involved a jet of high-pressure steam directed at a single set of fanlike blades. The set of blades spins in response to a jet of steam (Constant 1980; Scaife 1985). Turbine technology was already in place when Charles Parsons (around 1884) improved upon the turbine’s efficiency and power. One might say that turbine technology was already generatively entrenched by this time. Parsons’s steam turbine was a combination of incremental enhancements of previously designed components and genuinely novel innovations. Perhaps the most noteworthy change was the multistage setup, which involved (initially) 15 pairs of bladed turbine wheels. Each pair consisted of one wheel of fixed blades and one wheel of moving blades. As the steam flowed through each fixed set of blades, its velocity increased, thus increasing the kinetic energy at each step. The blades increased steadily in size to catch as much of the energy of the steam as possible as it expanded and sped up at each stage of the turbine. Parsons also adopted the use of a highly efficient nozzle (called a convergent-divergent nozzle) developed by the Swedish engineer Carl Gustaf de Laval, which accelerates the flow of a gas in the direction of the turbines. Other important features included a close-fitting collar that surrounded the turbine and created a seal to prevent any loss of steam. The Parsons Steam Turbine stood apart from previous technologies in reducing noise, withstanding high pressures, improving efficiency, and producing unprecedented wattage and speeds from low-pressure steam. The historian of science Edward Constant has called the Parsons generator systems “the epitome of the new normal technology” (Constant 1980, 77). For the Parsons generator quickly became adopted worldwide, nearly monopolizing electrical power generation, naval and passenger ships, and train stations, and eventually setting the standard for future designs. Unlike technologies that alter generatively entrenched structures, “new normal” technologies have the potential of satisfying both conditions necessary for a technological revolution. For a


technology to be the “new normal,” it must have properties that set it apart from other tools and, furthermore, it must be widely adopted by a community. Critics might question whether technologies that alter generatively entrenched structures or “new normal” technologies deserve the label “revolution,” given the crucial role that non-technological variables must play in meeting the second condition previously spelled out.7 I am sympathetic to this complaint. Indeed, it may be that the very concept of technological revolution is incoherent and useless. Nevertheless, I maintain that it is worth tracking different types of technological innovation. Innovations that change generatively entrenched structures differ from ones that are the “new normal” in their historical, scientific, and social significance. Importantly, they also differ in the sorts of political, ethical, and social challenges that they present. Tracking different types of technological innovation can help us appreciate the sorts of normative questions authors have raised regarding CRISPR-Cas9. I’ll argue further that tracking the different types of technological innovations is an important step for identifying the normative questions that the orthogonal tRNA technology raises as well. If technological revolutions occur, then several types can be distinguished. One type occurs when technologies enable users to change deeply generatively entrenched structures. Another type occurs when a new technology attains the status of the “new normal” technique used by a particular community. In what follows, I show how CRISPR-Cas9 and technologies from synthetic biology fit into this taxonomy.
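Before turning to the cases, the idea that generative entrenchment comes in degrees can be made a bit more concrete with a schematic sketch (my own toy illustration, under the simplifying assumption that dependence relations can be drawn as a directed graph): the more downstream processes depend on an element, the more deeply entrenched it is, and the costlier it is to change.

```python
# Toy model of generative entrenchment as downstream dependence.
# Nodes are features or processes; an edge X -> Y means Y depends on X.
# Degree of entrenchment of X = number of nodes reachable from X,
# i.e. how much would be disrupted if X were altered.

from collections import deque

dependencies = {
    "genetic code": ["translation"],
    "translation": ["protein synthesis"],
    "protein synthesis": ["metabolism", "signaling"],
    "metabolism": ["development"],
    "signaling": ["development"],
    "development": [],
}

def entrenchment(graph: dict[str, list[str]], node: str) -> int:
    """Count distinct downstream dependents of a node."""
    seen, queue = set(), deque(graph.get(node, []))
    while queue:
        current = queue.popleft()
        if current not in seen:
            seen.add(current)
            queue.extend(graph.get(current, []))
    return len(seen)

for feature in dependencies:
    print(feature, entrenchment(dependencies, feature))
# "genetic code" has the most dependents, so it is the most deeply entrenched.
```

On this toy picture, the genetic code sits upstream of everything else, which is just a restatement of why Wimsatt treats it as a paradigm of deep generative entrenchment.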

3.4  Is CRISPR-Cas9 Really Revolutionary?

As I’ve shown in Sect. 3.2, CRISPR-Cas9 is the result of a history of incremental change of previous gene-editing tools. In what follows, I argue that if CRISPR-Cas9 technology is revolutionary at all, it is not in virtue of enabling users to change a generatively entrenched structure. If anything, it is revolutionary in the second sense I discuss – in virtue of being the “new normal” technology for biologists working in labs globally. This does not mean that there are no candidates for radical technological revolutions, however. I contrast CRISPR-Cas9 with the orthogonal tRNA technology, which does enable users to change a generatively entrenched structure – namely, the genetic code.

As a gene-editing tool, CRISPR-Cas9 does not enable scientists to change a generatively entrenched structure. This technology operates on the generatively entrenched structures of DNA and the genetic code. It is not a tool that enables altering these structures. CRISPR-Cas9 enables scientists to rearrange the nucleic acid sequences of DNA, but researchers rely on Watson-Crick base pairing and the

7  For example, Constant and Vincenti do not regard “new normal” technologies as revolutionary.


codon-amino acid assignments of the genetic code to successfully use this tool. Nor does it alter other generatively entrenched structures. One might consider the use of molecular mechanisms to target and rearrange DNA a generatively entrenched structure – perhaps generatively entrenched practice is more apt in this case. On this proposal, one might argue that molecular gene-editing mechanisms satisfy Wimsatt’s two criteria for generative entrenchment. One might argue that the relative success of zinc finger nucleases, TALENs, and CRISPR-Cas9 in targeting and rearranging genetic material has contributed to why gene-editing practices with molecular technologies continue to be produced and reproduced. One might even maintain that gene-editing by means of molecular tools is foundational for many other scientific practices. Many experimental practices depend on the ability to precisely target and rearrange genetic material. For example, the introduction of a genetic marker – a gene that produces an easily observable effect, like green fluorescence – can be accomplished by means of a gene-editing tool. Yet, even if one wishes to interpret experimental practices along the lines of generative entrenchment, CRISPR-Cas9 nevertheless fails to alter this structure. As a molecular mechanism that has been derived from naturally evolved components to function as a gene-editing tool, CRISPR-Cas9 is very much continuous with this generatively entrenched practice. A technology that would alter this sort of generatively entrenched practice might be one that is not derived from naturally evolved mechanisms or, indeed, one that shifts the dominant DNA-centered approach by which biologists seek to manipulate and study living processes to a different approach entirely.

If CRISPR-Cas9 is revolutionary at all, it is in virtue of being the “new normal” technology used by experimentalists globally. Just as the Parsons multistage steam turbine was adopted across several major industries, CRISPR-Cas9 has been widely adopted by several types of biologists. Several features of CRISPR-Cas9 set it apart from zinc finger nucleases and TALENs – namely, its affordability and customizability – thus meeting the first condition for a revolutionary technology. One significant social change that this technology has helped bring about concerns its experimental use. CRISPR-Cas9 is a much-needed addition to current experimental techniques used for the study of gene function (Housden et al. 2016). A major area of investigation performed by geneticists is the study of what a gene (or set of genes) does at different developmental stages across diverse organisms. This research often proceeds by interfering with the expression of a gene as a way to produce an observable difference in an otherwise genetically identical population. CRISPR-Cas9 has made the knocking out of a gene a central experimental method across biology labs. Knockout studies involve (ideally) the complete removal of a gene and its products from living organisms. Prior to this technology, the dominant approach to studying gene function involved gene-silencing techniques. Gene-silencing works by significantly reducing the amount of a gene product in a living organism and was primarily performed with RNA interference (RNAi) technology. Gene-silencing techniques are quite effective at producing observable differences in the phenotypes of an experimental population. However, for some types of genes, gene silencing techniques fail to produce any observable difference.
This can be due to gene silencing leaving small traces of the gene product which may be all that an


organism requires to maintain a stable phenotype.8 By contrast, the knockout technique (ideally) eliminates all trace amounts of a gene product. In some cases, knockout techniques reliably produce phenotypic differences in populations where knockdown techniques cannot. This is not to say that gene knockout experiments will replace knockdown experiments. For there can be cases where gene knockouts fail to produce a phenotypic effect where knockdowns do. In this way, CRISPR-Cas9 has made possible the investigation of a wide range of gene functions that were previously challenging to probe. Moreover, whether or not CRISPR-Cas9 is in fact a better technology than zinc finger nucleases or TALENs, the common attitude among the public and researchers is that CRISPR-Cas9 is the standard technology to use. For example, gene therapy studies that have recently used zinc finger nucleases have been criticized for using “an outdated technology; a relic of the pre-CRISPR era” (Bennett 2017). Similar to how the Parsons Steam Turbine became a part of everyday life in train stations, ships, and power companies, CRISPR-Cas9 is now an everyday molecular technology in biology labs and in the imaginations of researchers.

Additional applications of CRISPR-Cas9 also fall into the “new normal” category. CRISPR-Cas9 has been proposed as a technique for modifying human germlines9 and for developing gene-drive technologies. Modification of human germline cells is highly controversial owing to the possibility that it will enable the passing on of potentially harmful genetic modifications to future human generations. By contrast, gene-drive technologies are a method for biasing the transmission of a gene (or set of genes) across generations at higher rates than standard Mendelian laws of inheritance allow. When a gene-drive mechanism promotes genes that lower the fitness of a population in some way, the population size can be significantly reduced. Researchers have proposed the use of gene-drives as a method for controlling mosquito populations that act as vectors for harmful pathogens like malaria, dengue fever, yellow fever, chikungunya, and Zika (Macias et al. 2017; Sarkar 2018). CRISPR-Cas9 holds the promise of making these applications easier and more affordable; however, it is not the first technology to make these

8  This is not the only reason a knockdown or knockout experiment might fail to produce phenotypic differences in a population. As previously mentioned, biological organisms have evolved to maintain a stable phenotype despite subtle genetic variations. Sometimes, there is simply no way to produce phenotypic differences by intervening on genes. Other reasons why knockout/knockdown experiments fail have to do with researchers simply not having an assay to detect phenotypic differences (Barbaric et al. 2007).
9  Unfortunately, CRISPR-Cas9 has recently been used to modify twin human embryos despite an informal prohibition from the biological community (Cyranoski 2018). Not only has the researcher, He Jiankui, most likely violated several ethical principles, but the intervention itself was not as surgical and precise as is necessary for using CRISPR-Cas9 for therapeutic purposes in humans (and perhaps non-human animals as well). It appears that the changes produced by CRISPR-Cas9 in the human embryos were not the exact ones intended by the researcher. What this demonstrates is that even though CRISPR-Cas9 may be an improvement over previous technologies, researchers do not exercise complete control over its use. Note that this is not a surprising finding. Researchers were aware of the limitations of using CRISPR-Cas9 on developing embryos before He Jiankui’s experiment (Ma et al. 2017).


applications possible. Previous gene-editing technologies had the potential to precisely target and modify human germlines even though they were not generally regarded as reliable enough for human application. In addition to zinc finger nucleases, other naturally evolved genetic-modification mechanisms have been developed into gene-drive technologies, such as synthetic maternal-effect dominant embryonic arrest (MEDEA) (Macias et al. 2017). Although the extension of CRISPR-Cas9 to a wide range of applications represents significant technological advances, CRISPR technologies nevertheless fall well within the dominant DNA-centric approach to intervening on and manipulating life processes that’s been in place since the rise of molecular biology. Gene-editing technologies like CRISPR-Cas9 do not alter the generatively entrenched structure of life – they operate within it. In this way, CRISPR-Cas9 cannot represent a radical type of technological revolution. If it is revolutionary, it is in terms of changing what the normal technology is in the minds and labs of scientists.

In contrast with the CRISPR-Cas9 case, an example of a technology that does change a generatively entrenched structure comes from synthetic biology. The orthogonal transfer RNA (tRNA) technology, developed in the early 2000s by the synthetic biologist Peter Schultz and his laboratory, facilitates the successful variation of the genetic code by artificial means (Liu and Schultz 2010). Variation of the genetic code by means of the orthogonal tRNA technology occurs at the level of protein translation. Translation is the process by which messenger RNAs (mRNAs) are “read” by the ribosome and a polypeptide chain of amino acids is synthesized according to the codon sequence of the mRNA. tRNAs work together with the ribosome to deliver the amino acid that associates with each codon. With the exception of “stop codons” – codons that associate with no amino acid and instead carry the instruction to halt protein synthesis – there is a unique tRNA for each codon.10 Orthogonal tRNAs are molecules that have evolved in one lineage and have been artificially modified so that, in an organism from a different lineage, they associate a synthetic amino acid with a “stop codon.”11 The orthogonal tRNA technology makes it possible for synthetic biologists to successfully alter and expand the genetic code while also maintaining the continued existence of a wide range of single-celled and multicellular organisms (yeast, bacteria, roundworm, fruit fly, mouse, human cell cultures, etc.). The orthogonal tRNA technology represents a notable shift away from DNA-centric approaches to manipulating and controlling proteins. The predominant approach to controlling the amino acid sequence of a protein has been to intervene directly on the corresponding DNA sequence (with a tool like

10  Aminoacyl-tRNA synthetases charge each tRNA with its specific amino acid. For this reason, the Schultz Lab had to also engineer orthogonal aminoacyl-tRNA synthetases to associate an unnatural amino acid with its corresponding tRNA.
11  Synthetic amino acids are naturally occurring amino acids whose R-groups have been altered in subtle ways; for example, p-fluorophenylalanine has a fluorine atom instead of a hydrogen as part of the aromatic ring of phenylalanine. Although synthetic amino acids are modified canonical amino acids, they are nevertheless synthetic. These amino acids are not ordinarily found outside of the laboratory and are human-made.


CRISPR-Cas9). On this approach, the protein synthesis machinery, including tRNA, does not help specify the amino acid sequence of a protein. It merely chains together amino acids in the order specified in DNA. The technique developed by the Schultz Lab is an alternative to this approach. Now, tRNA can specify the order of amino acids of a protein. Synthetic biologists have made it possible for biologists to synthesize proteins in a wide range of living organisms without altering DNA. Indeed, they can (in principle) synthesize proteins that have no evolutionary history at all in a wide range of living organisms, thanks to an expanded genetic code that includes synthetic amino acids.

The state of biological and engineering knowledge prior to the development of orthogonal tRNA has notable similarities with what the historian of science, Walter Vincenti, has described as radical design (Vincenti 1990). A design is radical when engineers have little theoretical knowledge to justify belief in whether the design will function as expected. Vincenti’s example is David Davis’s airfoil design for the Model 31 airplane. Davis, a self-taught aeronautical engineer, designed an airfoil with relatively low drag at increasing lift – desirable properties for long-range flight. Despite resting on unconventional mathematical equations and being “almost completely blind to fluid mechanics,” Davis’s design was adopted by Consolidated Aircraft for a variety of World War II aircraft, including the B-24 (Ibid, 37).12 It was not until sometime later, when a theoretical method for deriving airfoil shape from desired pressure distribution was developed, that researchers verified that the Davis model indeed reduced drag at low angles of attack. Nevertheless, Consolidated Aircraft adopted the Davis airfoil at a time when engineers had insufficient evidence that it was, in fact, a better design.

Design and development of orthogonal tRNA was not the result of amateur efforts as in the case of the Davis airfoil. However, design of this technology was radical in the sense that it challenged some aspects of the “frozen accident” theory. Prior to the successful achievement of the orthogonal tRNA technology, some subtle variations in the genetic code had been identified in mitochondria and some single-celled lineages (Osawa et al. 1992; Barrell et al. 1979). Some stop codons have been reassigned to one of the ~20 canonical amino acids in extreme cases – cases with relatively small genomes and high guanine-cytosine content. By the time the Schultz Lab had developed the orthogonal tRNA technique, there had also been some modest success in incorporating unnatural amino acids into living organisms that were starved of the canonical amino acid (Budisa et al. 1999).13 Yet despite these findings, the common view among biologists is that these are rare, exceptional cases. Variation of codon-amino acid assignment in naturally evolved single-celled organisms and mitochondria is likely to have minimal downstream effects and thus is likely to be

12  At the time, mainstream engineers commonly relied on Max M. Munk’s thin-airfoil theory, which analyzed lift in terms of the camber line (the mean line between upper and lower surfaces) and the thickness distribution from leading edge to trailing edge. By contrast, Davis did not appear to know of this approach. Instead, he based his calculations on geometrical considerations – a “throwback to the period before Munk” (Vincenti 1990, 37).
13  Organisms starved of an amino acid are auxotrophic.


compatible with the life of the organism. (In other words, the genetic code in these cases is less generatively entrenched than in less extreme genomes.) Nevertheless, variation of the genetic code, in multicellular organisms especially, ventured into territory previously uncharted by experimental and theoretical work. The Schultz research team faced a host of uncertainties, such as whether varying the genetic code in multicellular organisms would be life-sustaining, which (if any) synthetic amino acids could be incorporated into single- and multicellular systems, how many codon-amino acid reassignments could be life-sustaining, and more (Chen et al. 2007; Liu et al. 2007; Xiao et al. 2013). The original test cases of the orthogonal tRNA technology served to prove that variation in the genetic code could be successful for single- and multicellular organisms despite the “frozen accident” theory. The initial test cases served as a proof of principle that warranted further development and optimization of the technique. In challenging the orthodox view of what conditions are life-sustaining in multicellular organisms, the Schultz Lab pursued an engineering feat that faced significant uncertainties, such that the label of “radical design” may be warranted.

This is not to claim that the orthogonal tRNA technology is revolutionary. It may satisfy Wimsatt’s criterion of a technology that makes it possible for users to change a generatively entrenched structure. However, at the moment it has not made a dramatic impact on a community. Scientists have not yet reorganized their experimental practices around this technology, nor has the field of medicine embraced it as a potential therapeutic. Techniques that operate within the domain of Watson-Crick base pairing and the codon-amino acid assignments of the genetic code continue to be the technologies of choice among biologists. Although the orthogonal tRNA technology may enable scientists to alter a deeply generatively entrenched structure of life, its uses are limited to specific applications like the design of synthetic proteins for the study of protein structure and for the development of special cancer therapies (Liu and Schultz 2010).

If CRISPR-Cas9 and the orthogonal tRNA technology were revolutionary at all, they would be revolutionary in distinct ways. As an affordable and flexible gene-editing tool, CRISPR-Cas9 has become the “new normal” technique adopted by biologists across labs. By contrast, the orthogonal tRNA case does not have the status of a “new normal” technology. If anything, it is a technology that enables scientists to alter a deeply generatively entrenched structure – namely, the genetic code. Whether or not the status of these technologies warrants the label “revolution,” what matters is that they are instances of distinct types of technological innovation. As distinct types of innovation, they bring distinct types of normative questions.
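To make the contrast concrete, here is a minimal sketch (my own simplified illustration, not drawn from the Schultz Lab’s protocols or any specific implementation) of where the two kinds of intervention act: CRISPR-Cas9 changes the coding sequence while the codon table stays fixed, whereas the orthogonal tRNA approach leaves the sequence alone and reassigns a codon, here the amber stop codon UAG, to a placeholder unnatural amino acid.

```python
# Translation under a fragment of the standard genetic code versus an
# artificially "expanded" code in which the amber stop codon (UAG) is
# reassigned to an unnatural amino acid (placeholder "Uaa").
# Only a handful of codons are included; the real code has 64.

STANDARD_CODE = {
    "AUG": "Met", "UUU": "Phe", "GGC": "Gly", "AAA": "Lys",
    "UAG": "STOP",  # amber stop codon
}

def translate(mrna: str, code: dict[str, str]) -> list[str]:
    """Read an mRNA in codon triplets until a stop codon (or the end)."""
    protein = []
    for i in range(0, len(mrna) - 2, 3):
        residue = code[mrna[i:i + 3]]
        if residue == "STOP":
            break
        protein.append(residue)
    return protein

mrna = "AUGUUUUAGGGCAAA"

print(translate(mrna, STANDARD_CODE))        # ['Met', 'Phe'] - stops at UAG
expanded = {**STANDARD_CODE, "UAG": "Uaa"}   # the codon table itself is altered
print(translate(mrna, expanded))             # ['Met', 'Phe', 'Uaa', 'Gly', 'Lys']
```

The sketch is only meant to show which element each technology touches: a gene-editing tool rewrites the string mrna while relying on the fixed table, whereas the orthogonal tRNA technique, by supplying a new tRNA/synthetase pair, in effect rewrites an entry of the table.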

3.5  Why Tracking Innovation Matters

As it turns out, different types of innovation prompt different normative questions. In what follows, I argue that "new normal" technologies like CRISPR-Cas9 prompt us to ask whether a technology ought to be the new norm of a community. By


contrast, technologies that enable the change of generatively entrenched structures, like the orthogonal tRNA, have prompted a range of normative questions, some of which have not yet been addressed by the biological community. This is, partly, a feature of such a radical and recent technological development.

Understanding CRISPR-Cas9 as the "new normal" technology in molecular biology helps explain why the many ethical questions that have been raised take the form they do. Several authors have raised various ethical concerns about the use of CRISPR-Cas9 to modify genes in human somatic cells and human germlines. Some of the ethical concerns raised include whether CRISPR-Cas9 is reliable enough to guarantee that only the targeted genes will be modified and nothing else, such that harmful effects are not inherited by future progeny (O'Keefe et al. 2015; Smolenski 2015; Rodriguez 2016); whether human germline modification risks violating the unborn person's autonomy (Smolenski 2015); whether genetic modification is ethical at all (Rodriguez 2016); and whether human gene-editing might revive a eugenics movement in which privileged consumers have the power to guide human evolution (Comfort 2015). Similar concerns have been raised about unknown risks associated with using CRISPR-Cas9 to modify crop and livestock genomes and with using it as a gene drive to control pest populations (Charo and Greely 2017; Rodriguez 2016). Finally, others have expressed concern over the potential use of CRISPR-Cas9 as a bioweapon (Reeves et al. 2018). What accounts for this surge of ethical concerns regarding CRISPR-Cas9 when it is only the latest tool in a long line of gene-editing technologies? Part of the answer has to do with what made CRISPR-Cas9 the "new normal" technology in molecular biology. "New normal" technologies prompt discussion about whether a technology ought to be adopted as the norm for communities that have the know-how and power to use them widely and for potentially dangerous purposes. As genome modification becomes more ubiquitous and facile, scientific communities become that much more comfortable with the technique as a possible solution to real-world problems. A risk is that real-world problems might be overwhelmingly addressed by gene-editing technologies even when other solutions are more apt.

By contrast, technologies that facilitate the changing of generatively entrenched structures raise distinctive sets of ethical and normative questions. At initial stages of technology development for cases that enable the change of generatively entrenched structures, lack of theoretical and experimental knowledge of how well the tool works is likely to make risk assessment challenging. In the case of CRISPR-Cas9, we know that off-target and interaction effects of pleiotropic genes are risks that are relevant to application of this tool. However, given that the scientific community has never before had a technique for changing the genetic code in life-sustaining ways, it is highly uncertain what the possible harmful effects of implementing this technology in organisms outside of the laboratory might be. Without further knowledge of what the metabolic and evolutionary consequences of artificially altering the genetic code in otherwise naturally evolved organisms are, we may not even be in a position to identify the potential risks that are relevant to applications of this technology. Currently, the orthogonal tRNA technique for


incorporating unnatural amino acids into proteins is regarded as operating with minimal off-target and interaction effects. Yet, scientists may not be in a position to adequately assess this. Many current applications primarily concern the production of synthetic proteins, which are purified for study or therapeutic use. It has yet to be implemented for the purposes of giving living organisms novel functions. It may be too early to have identified the long-term effects of altering the genetic code in living organisms. Technologies that allow scientists to alter generatively entrenched structures make new things possible that had previously been beyond technological reach. In doing so, users often face uncertainties that they might not have previously imagined or hypothesized.

Currently, incorporation of unnatural amino acids into proteins by means of rewriting stop codons is restricted to controlled laboratory settings; nevertheless, real-world applications have already been proposed. Synthetic biologists have yet to engineer metabolic pathways to synthesize the unnatural amino acids that are associated with stop codons during protein synthesis, such that an organism could synthesize synthetic proteins independent of human intervention. Instead, unnatural amino acids must be provided to an organism's environment for the technique to work. However, this limitation of the orthogonal tRNA technology is precisely what makes it a promising method for biosafety. For some, it has the potential to serve as "the ultimate biosafety tool" for crop and laboratory management (Schmidt 2010; also see Diwo and Budisa 2019). As a safety containment method, the orthogonal tRNA's association with a stop codon is exploited. A stop codon is inserted early in a protein-coding gene's nucleic acid sequence. This ensures that a full-sized protein would be synthesized only when an orthogonal tRNA and unnatural amino acid are present in the living cell. In the absence of these elements, no protein would be made. In some cases, this could ensure that the relevant gene sequence cannot be transmitted from one organism to another. In others, this could ensure the genetic isolation of one lineage from another.

Several normative questions should be addressed before implementing this technology for biosafety. The language of "orthogonality" may turn out to be misleading. Orthogonality suggests that the technology operates independently of the rest of the living system in which it is embedded. Yet, there is insufficient experimental evidence to determine how effective and precise this technology will be in agricultural contexts. The possible off-target and harmful effects of implementing orthogonal tRNA in crops are just one set of unknown risks. Implementing this technology as a way to prevent genetic transfer between organisms in the wild at this stage of development would require some further method of making unnatural amino acids available. One question that deserves attention is what the impact will be if unnatural amino acids are introduced into the wild. Is the impact worth the gain in crop improvement and prevention of gene transfer?

Questions of uncertainty are not the only normative questions that arise for technologies that help change generatively entrenched structures. A further normative question concerns how the technology itself ought to be valued by users and broader society. In the orthogonal tRNA case, are organisms carrying an expanded genetic


code fundamentally different from other life forms? Does the engineering of an organism with an expanded genetic code make it a different species from the one from which it was originally derived? Currently, organisms are extremely limited in the degree to which they express an expanded genetic code: the codes being expressed are quite modest, with only a few unnatural amino acids being incorporated into proteins, and only for short intervals of time during the organism's life. Furthermore, expanded genetic codes cannot be inherited across generations. At the moment, the current technology may not warrant classifying existing organisms as different species or different forms of life. Yet as the technology advances, engineered genetic codes are likely to expand more and more as organisms are made to incorporate a greater number of unnatural amino acids, are able to do so for longer periods of time, and perhaps even inherit them. Whether organisms carrying expanded genetic codes are different life forms or belong to different species will bear importantly on how we ought to value these specimens. If they are a new species or a new form of life, does this mean we have a moral responsibility to preserve them, on the grounds that they are rare or for some other reason? Currently, their primary value is economic and experimental, as they are used to produce commercial products like drug therapies, to probe protein structure, and to confine potentially harmful specimens to controlled environments. Technologies that enable changing generatively entrenched structures enable the creation of entities that, again, a community may not have imagined or hypothesized previously. This means that the relevant community might not have reflected on the status and value of such entities.

Questions of how to value organisms with expanded genetic codes become increasingly relevant as the entity becomes increasingly artificial. The orthogonal tRNA technology is just one component of a larger orthogonal living system that synthetic biologists are currently developing (Liu et al. 2007; Schmied et al. 2018; Malyshev et al. 2012; Liu and Schultz 2010). The aim of this research is to develop artificial nucleic acid bases that function as analogues of the naturally occurring nucleotides, that replicate according to analogous Watson-Crick base-pairing rules, and that are transcribed and translated by synthetic analogues of RNA polymerase and ribosomes. Ideally, this orthogonal system would encode and express wholly artificial biomolecules (RNA transcripts and proteins). Currently only some components of this artificial DNA-protein synthesis system have been successfully developed to function in parallel with the natural DNA-protein synthesis system. One day, synthetic biologists might find a way to embed the orthogonal system in an entirely artificial cell that functions analogously to naturally evolved phospholipid membranes. Similar questions bear on the creation of wholly synthetic organisms as on the creation of organisms with expanded genetic codes. Users are likely to face great uncertainty about the potential harm this sort of technology may cause in environments outside of the laboratory. A notable difference concerns how we ought to value such creations. The answer to this question will hinge on whether such creations count as living organisms or machines (Boldt and Müller 2008; Deplazes et al. 2009).
The orthogonal components that have so far been successfully engineered have been derived from naturally evolved, living components and then artificially modified. The more synthetic biologists are able to create de novo, the less connected synthetic organisms


will be to paradigmatic living organisms. It's not evident whether this is possible or whether there will be any motive for pursuing this sort of project. Nevertheless, if wholly synthetic organisms are ever successfully engineered, an answer to this question would be needed.

Even though contemporary synthetic biology is the latest development in biology's history of developing new techniques and tools for achieving greater amounts of control over life, it has a distinctive culture. Contemporary synthetic biology formed as a discipline through researchers coming from scientific backgrounds other than biology, or through researchers turning to other scientific disciplines like computer science, electrical engineering, and chemistry for novel tools, concepts, and approaches (Campos 2010; Endy 2005). Drew Endy, one of the more outspoken founders of the synthetic biology community, has characterized the aims of this movement in terms of making biology more amenable to engineering. One way synthetic biologists have proposed achieving this has been in terms of engineering standardized biological components (BioBricks) that can serve as basic building blocks of more complex life processes and that function the same across species without the need for customization (Shetty et al. 2008). Synthetic biologists commonly express the attitude that the immense complexity of naturally evolved biological systems significantly hinders our ability to understand and control life, and that only when an engineering approach is adopted will we make progress (Brand 2014). While the interest in gaining greater amounts of control over life processes is not a new aim among biologists, what is distinctive about contemporary synthetic biology is its reliance on engineering and other physical sciences to make artificial parts and systems. Indeed, sometimes the components and systems synthetic biologists create are things that (1) have not actually evolved and (2) (in some cases) are unlikely to evolve were it not for human engineering efforts. The expanded genetic codes achieved by the Schultz Lab are an illustrative example of an engineering feat that meets both (1) and (2). The genetic codes they create are unlikely to evolve by natural means, given how generatively entrenched14 the near-universal code is in living organisms and the rarity of unnatural amino acids in natural environments.

As contemporary synthetic biology emerges as a distinctive discipline, participants and observers have an opportunity to deliberate and decide on the norms that ought to shape the community. Greater control over life has been a longstanding interest of biologists, yet is this an appropriate guiding norm for an emerging discipline with the potential to engineer wholly artificial life forms? How might the

14 Alternative genetic codes are quite rare in naturally evolved species owing to the standard genetic code's high degree of robustness. Redundancy in the standard genetic code significantly minimizes the likelihood that a very different (biochemically) amino acid will be inserted in a location that will be disruptive to the protein's normal functioning. The less generatively entrenched the genetic code is in a particular species – e.g., microbes with small genomes – the less robust and more evolvable the genetic code may be (Maeshiro and Kimura 1998). Of course, alternative genetic codes that have evolved in natural lineages only encode the ~20 canonical amino acids, unlike the unnatural amino acids used by the Schultz Lab.


language and assumptions implicit in research programs in synthetic biology promote or inhibit an understanding of how engineered living components might benefit or harm the environments in which they are introduced? A discipline guided by the aim of gaining greater amounts of control over life may also have economic and structural consequences. Highly specialized molecular components that confer great amounts of control over life processes may further exacerbate existing inequalities between scientists who have access to substantial research funding and specialized training and those who do not (Deplazes et al. 2009). The cutting-edge drug therapies developed by synthetic biologists may also further exacerbate existing global inequalities, as some communities are unlikely to have the wealth and infrastructure to access these treatments.

Settling whether CRISPR-Cas9 or orthogonal tRNA are revolutionary technologies may not be all that interesting. What matters is that they represent different kinds of technological innovation with different kinds of social significance. CRISPR-Cas9 has been widely adopted as the "new normal" technology across biology labs, as it has made the precise targeting and rearranging of almost any gene sequence across model organisms significantly more affordable and flexible. Understanding CRISPR-Cas9 as the "new normal" technology helps account for the recent surge in ethical concern over its application to the human genome and more. By contrast, the orthogonal tRNA technology and the expanded genetic codes it makes possible represent a more radical type of innovation, as they enable changing a deeply generatively entrenched structure. Radical technologies like this raise distinctive ethical and safety concerns. Concerns range from whether scientists have an adequate understanding of the potential harms that might arise from introducing the technology into uncontrolled environments, to whether the culture of synthetic biology ought to embrace its radical approach to engineering life.

3.6  Conclusion

Popular rhetoric concerning CRISPR-Cas9 describes it as a revolutionary technology. If there is any truth to this claim, "revolution" is best understood in terms of what I've called the "new normal" technology. For unlike orthogonal tRNA and other radical technologies, CRISPR-Cas9 does not enable users to change generatively entrenched structures. Instead, its successful activity depends on a generatively entrenched structure – namely, Watson-Crick base pairing and the standard genetic code. Ultimately, whether CRISPR-Cas9 is in fact a revolutionary technology is an empirical matter that is best left for social scientists and historians to settle. What matters is that the distinction between "new normal" technologies and technologies that enable the alteration of generatively entrenched structures is a useful way of tracking ethical and normative questions that new technologies raise. "New normal" technologies and technologies that enable the alteration of


generatively entrenched structures raise distinct kinds of ethical and normative questions. I've argued that "new normal" technologies prompt the question of whether the technology ought to be the norm for a given community. When a technology becomes the "new normal," the community of users becomes that much more accustomed to the kinds of interventions it makes possible. Yet, as many authors have expressed, is it wise for scientific communities to become so comfortable with genome editing that it becomes the presumptive solution to real-world problems? By contrast, technologies that make it possible for users to alter generatively entrenched structures raise distinctive epistemic and ethical questions. Such cases make things possible that a community is likely not to have imagined or hypothesized sufficiently. This means that users are unlikely to have sufficient knowledge of the relevant risks of applying the technology to real-world problems.

At the moment, synthetic biologists are successfully engineering technologies that enable users to change generatively entrenched structures in living organisms. Because such technologies are radical, synthetic biologists do not currently have sufficient knowledge of the risks associated with using them to solve real-world problems, like genetic containment between crops. As the technology is developed further and expanded to create increasingly artificial living systems, additional normative questions should be raised concerning how engineered organisms relate (if at all) to naturally evolved ones. How we answer this question will have further implications for how we ought to value these creations. And finally, the orthogonal tRNA technology is a product of a culture that is willing to adopt increasingly novel strategies to engineer life. As an emerging discipline, synthetic biology is in a position to think carefully about what norms ought to guide its community.

References Barbaric, I., Miller, G., & Dear, T. N. (2007). Appearances can be deceiving: Phenotypes of knockout mice. Briefings in Functional Genomics and Proteomics, 6(2), 91–103. Barrell, B. G., Bankier, A. T., & Drouin, J. (1979). A different genetic code in human mitochondria. Nature, 282. Bennett, C. (2017). Are zinc finger nucleases making a comeback? CRISPR is not yet king of the clinic. Genetic Engineering & Biotechnology News. Boldt, J., & Müller, O. (2008). Newtons of the leaves of grass. Nature Biotechnology, 26(4). Brand, S. (2014). Massively collaborative synthetic biology. Interview with Drew Endy. http:// longnow.org/seminars/02014/sep/16/igem-­revolution/ Budisa, N., Minks, C., Alefelder, S., Wenger, W., Dong, F., Moroder, L., & Huber, R. (1999). Toward the experimental codon reassignment in vivo: Protein building with an expanded amino acid repertoire. Federation of American Societies for Experimental Biology, 13(1), 41–51. Burpee, D., & Taylor, F. J. (1940). So we shocked mother nature. Rotarian, pp. 15–17. Campos, L. (2010). That was the synthetic biology that was. In M. Schmidt, A. Kelle, A. Ganguli-­ Mitra, & H. de Vriend (Eds.), Synthetic biology: The technoscience and its societal consequences. Dordrecht: Springer.


Campos, L. (2016). Radium and the secret of life. Chicago/London: University of Chicago Press. Chari, R., & Church, G. M. (2017). Beyond editing to writing large genomes. Nature Reviews: Genetics, 18. Charo, R. A., & Greely, H. T. (2017). CRISPR critters and CRISPR cracks. The American Journal of Bioethics, 15(12), 11–17. Chen, S., Schultz, P. G., & Brock, A. (2007). An improved system for the generation and analysis of mutant proteins containing unnatural amino acids in Saccharomyces Cerevisiae. Journal of Molecular Biology, 371(1), 112–122. Comfort, N. (2015). A new eugenics. The Nation. Cong, L., Ran, F., Cox, D., Lin, S., Barretto, R., Habib, N., Hsu, P., Wu, X., Jiang, W., Marraffini, L., & Zhang, F. (2013). Multiplex genome engineering using CRISPR/Cas systems. Science, 339(6121), 819–823. Constant, E. (1980). The origins of turbojet revolution. Baltimore: The Johns Hopkins University Press. Cook, S. (1995). The structure of technological revolutions and the Gutenberg Myth. In New directions in the philosophy of technology. Cambridge, MA: Kluwer Academic Publishers. Crick, F. (1968). The origin of the genetic code. Journal of Molecular Biology, 38. Curry, H. (2016). Evolution made to order: Plant breeding and technological innovation in twentieth-­century America. Chicago/London: The University of Chicago Press. Cyranoski, D. (2018). First CRISPR babies: Six questions that remain. Nature. de Vries, H. (1909). The mutation theory: Experiments and observations on the origin of species in the vegetable kingdom. Chicago: The Open Court Publishing Company. Deplazes, A., Ganguli-Mitra, A., & Biller-Andorno, N. (2009). The ethics of synthetic biology: Outlining the agenda. In M.  Schmidt, A.  Kelle, A.  Ganguli-Mitra, & H. de Vriend (Eds.), Synthetic biology: The technoscience and its societal consequences. Dordrecht: Springer. Diwo, C., & Budisa, N. (2019). Alternative biochemistries for alien life: Basic concepts and requirements for the design of a robust biocontainment system in genetic isolation. Genes, 10, 17. Endy, D. (2005). Foundations for engineering biology. Nature Reviews, 438(24). Gaj, T., Gersbach, C. A., & Barbas, C. F., III. (2013). ZFN, TALEN, and CRISPR/Cas-based methods for genome engineering. Trends in Biotechnology, 31(7), 397–405. Gray, G. W. (1932, August 22). New life made to order. Charleston Daily Mail Magazine, pp. 3. Guttinger, S. (2018). Trust in science: CRISPR-Cas9 and the ban on human Germline editing. Science and Engineering Ethics, 24, 1077–1096. Housden, et al. (2016). Loss-of-function genetic tools for animal models: Cross-species and cross-­ platform differences. Nature Reviews, 18. Hughes, T.  P. (1969). Technological momentum in history: Hydrogenation in Germany 1898–1933. New York: Oxford University Press. Jinek, M., Chylinski, K., Fonfara, I., Hauer, M., Doudna, J., & Charpentier, E. (2012). A programmable dual-RNA-guided DNA Endonculease in adaptive bacterial immunity. Science, 337, 816–821. Kitano, H. (2004). Biological Robustness. Nature, 5. Kohler, R. (1994). Lords of the fly: Drosophila genetics and the experimental life. Chicago: University of Chicago Press. Koonin, E. (2017). Frozen accident pushing 50: Stereochemistry, expansion, and chance in the evolution of the genetic code. Life, 7, 22. Kosicki, M., Tomberg, K., & Bradley, A. (2017). Repair of double-Strand breaks induced by CRISPR-Cas9 leads to large deletions and complex rearrangements. Nature Biotechnology, 36. Ledford, H. (2015). CRISPR, the disruptor. Nature, 522. 
Liu, C., & Schultz, P. (2010). Adding new chemistries to the genetic code. Annual Review of Biochemistry, 79, 413–444.


Liu, W., Brock, A., Chen, S., Chen, S., & Schultz, P. G. (2007). Genetic incorporation of unnatural amino acids into proteins in mammalian cells. Nature Methods, 4, 239–244. Ma, H., Marti-Gutierrez, N., Park, S., Wu, J., Lee, Y., Suzuki, K., Koski, A., Ji, D., Hayama, T., Ahmed, R., Darby, H., Van Dyken, C., Li, Y., Kang, E., Reum Park, A., Kim, D., Kim, S., Gong, J., Gu, Y., Xu, X., Battaglia, D., Krieg, S. A., Lee, D. M., Wu, D. H., Wolf, D. P., Heitner, S. B., Belmonte, J. C. I., Amato, P., Kim, J., Kaul, S., & Mitalipov, S. (2017). Correction of a pathogenic gene mutation in human embryos. Nature, 000. Macias, et al. (2017). Gene drive for mosquito control: Where did it come from and where are we headed? International Journal of Environmental Research and Public Health, 14. Maeshiro, T., & Kimura, M. (1998). The role of robustness and changeability on the origin and evolution of the genetic codes. PNAS, 95, 5088–5093. Malyshev, D.  A., Dhami, K., Quach, H.  T., Lavergne, T., Ordoukhanian, P., Torkamani, A., & Romesberg, F. E. (2012). Efficient and sequence-independent replication of DNA containing a Third Base pair establishes a functional six-letter genetic alphabet. PNAS, 109(30). Marks, N. (Producer), Morris, K., Woods, J. (Associate Producers). (2018). CRISPR: The gene-­ editing Tool Revolutionizing Biomedical Research. 60 Minutes, CBS. Morgan, T. H., & Bridges, C. B. (1919). The origin of Gynandromorphs. In Contributions to the genetics of Drosophila Melanogaster. Washington, DC: Carnegie Institution of Washington. Muller, H. J. (1927). Artificial transmutation of the gene. Science, 66, 84–87. O’Keefe, M., Perrault, S., Halpern, J., Ikemoto, L., & Yarborough, M. (2015). Editing’ genes: A case study about how language matters in bioethics. The American Journal of Bioethics, 15(12), 3–10. Osawa, S., Jukes, T., Watanabe, K., & Muto, A. (1992). Recent evidence for evolution of the genetic code. Microbiological Reviews, 56(1), 229–264. Paul, D. B., & Kimmelman, B. A. (1988). Mendel in America: Theory and practice, 1909–1919. In R. Rainger, K. R. Benson, & J. Maienschein (Eds.), The American development of biology. Philadelphia: University of Pennsylvania Press. Reeves, R.  G., Voeneky, S., Caetano-Anollés, D., Beck, F., & Boëte, C. (2018). Agricultural research, or a new bioweapon system? Science, 362(6410). Reichel, J. M. (2018). CRISPR-Cas: The Holy Grail within Pandora’s box. https://www.lindau-­­ nobel.org/blog-­crispr-­cas-­the-­holy-­grail-­within-­pandoras-­box/ Rodriguez, E. (2016). Ethical issues in genome editing using CRISPR/Cas9 system. Journal of Clinical Research & Bioethics, 7, 2. Sarkar, S. (2018). Researchers hit roadblocks with gene drives. BioScience, 68(7). Scaife, W. G. (1985). The parsons steam turbine. Scientific American, 252(4). Schmidt, M. (2010). Xenobiology: A new form of life as the ultimate biosafety tool. BioEssays, 32. Schmied, W. H., Tnimov, Z., Uttamapinant, C., Rae, C. D., Fried, S. D., & Chin, J. W. (2018). Controlling orthogonal ribosome subunit interactions enables evolution of new function. Nature, 564. Shetty R.P., D. Endy, T.F. Knight Jr. 2008. “Engineering BioBrick vectors from BioBrick parts.” Journal of Biological Engineering, 2, number 5. Smith, H. O., & Wilcox, K. W. (1970). A restriction enzyme from hemophilus influenzae II. Base sequence of the recognition site. Journal of Molecular Biology, 51. Smolenski, J. (2015). CRISPR/Cas9 and germline modification: New difficulties in obtaining informed consent. 
The American Journal of Bioethics, 15(12), 35–68. Vincenti, W. G. (1990). What engineers know and how they know it. Baltimore: The Johns Hopkins University Press. Waters, C. K. (1994). Genes made molecular. Philosophy of Science, 61(2). Weber, M. (2013). Causal selection vs causal parity in biology: Relevant counterfactuals and biologically normal interventions. In C.  K. Waters, M.  Travisano, & J.  Woodward (Eds.), Causation in biology and philosophy. Minneapolis: University of Minnesota Press.


Weber, M. (2017). Discussion note: Which kind of causal specificity matters biologically? Philosophy of Science, 84(3), 574–585. Werner, A. (1979). Promotion and limitation of genetic exchange. Science, 205. Wimsatt, W. (2007). Re-engineering philosophy for limited beings. Cambridge, MA: Harvard University Press. Xiao, H., Chatterjee, A., Choi, S.-H., Bajjuri, K. M., Sinha, S. C., & Schultz, P. G. (2013). Genetic incorporation of multiple unnatural amino acids into protein in mammalian cells. Angewandte Chemie International Edition, 52, 14080–14083.

Part II

Technological Progress: Re-imagining Engineering Knowledge

Chapter 4

Philosophical Observations and Applications in Systems and Aerospace Engineering

Stephen B. Johnson

Abstract  This paper describes several examples of recent practices in systems engineering and aerospace engineering research and development of interest to the philosophy of technology community. These examples include cases in which philosophical and social concepts and practices are (or can be) used in engineering, cases in which engineers use philosophical strategies (often without realizing it), and cases that philosophers have investigated and found of interest in the past. Engineers implicitly use some philosophical practices. Rhetorical ploys are quite common. One common practice is to change key terms to distinguish one engineering approach or practice from another. This is often used to attempt to influence a discipline or acquire funding. Quantification is a key form of engineering rhetoric. Attempts to create engineering disciplines, establish an engineering organization, or win an argument about a design or operational method can succeed or fail depending on the ability to generate and present quantitative results. There are several cases in which philosophical and social concepts have been implicitly or explicitly used in engineering. One important example is the growing importance of “goals” in the engineering of autonomous systems. Autonomous systems are designed with “intelligence” to enable them to change goals based on changes in the external environment or internal health. These goal-based approaches are inherently teleological. Another case of interest is the use of concepts of social communication and cognitive limitation, which are core principles in the newly forming discipline of system health management, and in the older field of systems engineering. Engineers are also developing axiomatic and model-based approaches to systems engineering disciplines. Finally, philosophers using hermeneutical and pragmatic approaches argue that active engagement with the world profoundly influences the way in which individuals understand and interact with the world. I argue that both engineering and philosophy have this in common, and that using these approaches is useful to understand the nature of both.

S. B. Johnson, Dependable System Technologies, LLC, Westminster, CO, United States


Keywords  Artificial intelligence · Communication · Function · Goal · Hermeneutics · Models · Model-based systems engineering · Pragmatism · Quantification · Rhetoric · Systems engineering · System health management · Teleology

4.1  Introduction

In this chapter, I provide examples drawn from personal experiences in aerospace engineering and systems engineering research and development related to the idea that "active engagement with the world" and the resultant "problem solving" are key ingredients in technological, scientific, and philosophical development. In philosophical terms, this emphasis resembles or draws from several existing approaches. One is the emphasis on "intervening" in the world, and not merely representing it, as exemplified in Ian Hacking's introductory philosophy of science book Representing and Intervening.1 Another is the remarkable similarity between recently developed ideas in the engineering of complex, autonomous systems and hermeneutics as represented by Patrick A. Heelan and Jay Schulkin, in their 1998 paper "Hermeneutical Philosophy and Pragmatism: A Philosophy of Science."2 The authors emphasize the active engagement with the world promoted by hermeneutics and pragmatism. Engineering systems actively engage with the world, and thus philosophical approaches that emphasize engagement are congenial to the philosophy of technology. Engineered systems engage with the world to achieve goals, and any theory or philosophy of engineering must recognize and emphasize the importance of this practical bent. Effective engagement with the world also typically needs effective representations of the world. In this regard, I argue for a model-based (as opposed to "law-based") approach to representation, as argued by Ronald N. Giere in various publications, such as Science without Laws.3

I also modify or extend these philosophical threads in certain respects. The "engagement with the world" and the "goals" that actors have in doing so depend on the "micro-world" in which those actors are engaged. Technologies are typically designed for specific roles, though they can be, and often are, used in ways not originally intended. Even with unexpected uses, no engineered system or tool is useful in all contexts for all possible purposes. Their use is inherently restricted. Human actors too operate in their own micro-worlds, which clearly influence the nature of their engagement. For example, the goals of academic philosophers are

1 Ian Hacking, Representing and Intervening: Introductory Topics in the Philosophy of Natural Science. Cambridge: Cambridge University Press, 1983.
2 Patrick A. Heelan and Jay Schulkin, "Hermeneutical Philosophy and Pragmatism: A Philosophy of Science", in Robert C. Scharff and Val Dusek, eds., Philosophy of Technology, The Technological Condition, An Anthology, Malden, MA: Blackwell, 2003, pp. 138–153, originally from Synthese 115 (1998): 262–302.
3 Ronald N. Giere, Science without Laws. Chicago: University of Chicago Press, 1999.


quite different from those of engineers developing complex technologies. Both have interests that differ from the goals of engineering researchers in academia or the government. The problems that actors in different organizational settings address are distinctive, and these in turn impact the questions being asked, the methods by which these questions are addressed, and the solutions or outputs produced. Producing "recognized novelty" in published papers to gain tenure or academic stature differs hugely from working efficiently as an individual or in groups to solve numerous small problems and integrate those solutions into a greater whole, as is typical in the design of large-scale engineering systems. Novelty is neither necessary nor desired in addressing most problems, unless it is essential to developing an effective and affordable solution.

The cases in this paper are selected for their "bridging capacity" between engineering and philosophy. Some highlight engineering use of implicit or explicit philosophical approaches, and others illustrate engineering practices of potential use or interest for philosophers of engineering and technology. The value of these cases as "bridging material" is ultimately defined by their utility to engineers and philosophers. Philosophers will find these cases useful if they help the development and publication of novel and influential ideas to solve philosophical problems. Development engineers will find these cases useful if they aid system development. Research engineers are somewhere in between, as they can succeed either by publishing novel and effective ideas or by successfully implementing their ideas in engineering products.

The cases discussed in this paper draw from my own experiences in engineering and technology. As some of these examples are of activities with which I am directly involved, I inevitably bring my own biases and judgments to bear. I will reflect on and describe some of my own experiences, not in the hope of being fully objective, but rather simply because this is often an effective approach to engage with the world.4

4  It may be helpful to understand some of my background and biases. I have a bachelor’s degree in physics and a PhD in the History of Science and Technology, with strong supporting programs in philosophy of science, economic history, and European history. I worked in aerospace engineering from 1980 to 1995 and from 2005 to the present, and in academia as a regular tenure-track faculty member from 1997 to 2005. I have worked for large companies, in academia as regular and as research faculty, as a civil servant, and running my own business. I have worked as a civil servant on torpedoes for the Navy and launch vehicles for NASA. My experience includes working for large companies on target drones, as manager of a digital-analog simulation laboratory, engineering of deep space probes, and as researcher in Vehicle Health Management. I have worked in academia teaching space management, economics, and history, doing research on System Health Management and systems engineering for NASA, the Army, and Missile Defense Agency. I have my own small business, which includes work for NASA on launchers, and as a consultant on system health management and commercial diagnostic tools. I am the author of two books on the origins of systems engineering and project management (one is The Secret of Apollo: Systems Management in American and European Space Programs), general editor for System Health Management: with Aerospace Applications and the two volume space history encyclopedia Space Exploration and Humanity: A Historical Encyclopedia, and many papers and books in space history, space-related political economy, system health management, and systems engineering.


4.2  The Protagonists of Protagoras: Engineering Rhetoric

Since 1985, I have been involved with the emerging engineering discipline variously called integrated diagnostics, fault protection, vehicle health monitoring, system health management, integrated system health management, dependability, fault management, engineering resilience, and prognostics and health management, among others. This proto-discipline is historically and philosophically interesting for a number of reasons, several of which I will explore in this paper. One curious aspect of this field is the current and historical lack of a label for it that is accepted by all, or even most, practitioners. The multiplicity of names relates both to the difficulty of defining the technical scope of the discipline and to practitioners' political choices. I will use "system health management (SHM)" as my term of choice for reasons I will explain shortly, but currently it remains one of several alternatives. In my opinion, most of the terms listed above describe essentially the same practices, but with differences of emphasis and application. However, some contest this characterization.

A very short history will provide some context. Like all histories, this summary reflects my own biases and interpretive framework about what to emphasize and what to leave out, as well as explicit interpretations of meaning intended for readers. The very early beginnings of this field are literally rooted in failure. Specifically, it is grounded in attempts to reduce the chance of failure, to eliminate failure if possible and to mitigate failure if not. Several well-known narratives about the relationship of failure to engineering disciplines have been written by Henry Petroski, such as in Success through Failure: The Paradox of Design.5 I fully endorse Petroski's views regarding the utility of engineers learning from failure, but here I am attempting a different task: to briefly describe a new proto-discipline dedicated to mitigating failure. An early 1940s–50s approach to address failure was the creation of the field of "reliability" to estimate failure rates of components and systems.6 Another less obvious approach was the creation of systems engineering. Reliability emphasizes mathematical calculation and probabilistic estimates, whereas systems engineering uses process cross-checks and configuration control mechanisms to eliminate or reduce the chances of certain kinds of failures. These included, but were not limited to, issues such as mismatches between engineering drawings and the as-built system, and the allocation of resources and coordination of the schedules needed for system design, build, and manufacture.

System health management got its start primarily from the development of complex systems such as aircraft, helicopters, missiles, rockets, and spacecraft. Of particular importance were highly autonomous systems like the Voyager spacecraft that

5  Henry Petroski, Success through Failure: The Paradox of Design. Princeton: Princeton University Press, 2006. 6  The first paper that I have found using the term ‘reliability’ was from 1950, from the Army Ballistic Missile Agency at Redstone Arsenal in Huntsville Alabama, addressing mathematical prediction of missile failure rates, or conversely, success rates.


had to perform emergency actions without human intervention, and very complex, high-reliability systems such as the Space Shuttle, and military and commercial aircraft. All of these systems featured redundant hardware and software to ensure the system could continue to function even when components failed, and monitoring to capture flight data to search for problems in flight and from flight to flight. By the late 1980s and early 1990s, managers and engineers at the National Aeronautics and Space Administration (NASA), the Department of Defense (DoD), and at major aerospace companies were developing new technical and organizational approaches to address failure in complex systems. One was the development of a new DoD engineering standard for Integrated Diagnostics. Another was the development of a NASA working group for Vehicle Health Monitoring. A third early effort was the creation of the Dependability Working Group (DWG) organized by The Aerospace Corporation, focused on computer dependability. Another example was the development of "health and usage monitoring systems (HUMS)" for helicopters. Starting in the 1970s, the Jet Propulsion Laboratory (JPL) began developing what they called "fault protection", a term used for the on-board software of its deep space probes that detected and responded to failures without human intervention. I became involved with the field in 1985 on the Magellan project, which used synthetic aperture radar to map the surface of Venus. On this project, managed by JPL but with the spacecraft designed by Martin Marietta Corporation, one of my roles was developing the Attitude and Articulation Control System Fault Protection. Later deep space missions continued to deploy fault protection, which by the 2010s was relabeled as "fault management."

By the 1990s, some new terms for these activities came into being. The term "Vehicle Health Monitoring", which seems to have originated in aircraft applications to determine what data to monitor from flight to flight, was largely superseded by "Vehicle Health Management", which indicated that the point was to manage the health of the vehicle, not merely to monitor it. Vehicle Health Management in turn was largely superseded by System Health Management, which indicated that the scope of the discipline was broader than the vehicle, including the entire set of ground equipment, human operators, and the vehicle(s) together. Humans were part of the system, not separate from it. By the 2000s–2010s, further terms and variations appeared, including "Integrated" System Health Management, Prognostics and Health Management, and Fault Management. "Integrated" System Health Management was promoted by some groups who wanted to emphasize the "integrated" aspect of health management, and not merely management of parts of the system. Prognostics and Health Management was promoted largely by research engineers who wanted to emphasize the prediction of future failure (prognostics). Fault Management was a new term from the NASA science community to supersede the older term "fault protection."


In organizing, writing, and editing a reference text for the field, I eventually decided to use the term "System Health Management" for the book.7 I rejected "Integrated" SHM because in my view the word "integrated" added no value, as systems are inherently integrated in some way or other. Additionally, some groups focus on health management of components, which from their viewpoint are their "systems." I rejected "Prognostics" and health management as this promoted the idea that prognostics was co-equal to health management, whereas in my theory of SHM (more on this later), it is merely one function among several that are needed to manage system health. I rejected "fault" management because in SHM theory the term "fault" has a precise meaning related to causation of failures, and because the point of SHM was to manage health, not manage faults or failures per se. In the NASA science spacecraft world, fault protection engineers historically designed mechanisms to detect and respond to faults and failures, whereas across many other applications, managing and mitigating faults and failures was only one part of what was needed to keep a system healthy. For most systems, a common strategy is to prevent faults and failures from occurring at all through large design margins, quality assurance, and other similar methods, and the term "SHM" captures this fact.

What is the significance of these naming variations? Some relate to the fact that engineers in different application areas were not familiar with similar work in other application areas. "Dependability" was and is a term used in relation to computing, "HUMS" for helicopters, "diagnostics" in aircraft applications, "fault protection" for deep space probes, and so on. Later variations emerged due to better understanding, such as the idea that addressing failures was not simply a function of gathering data, but also of using that data, as with the switch from "monitoring" to "management." Scope changes also emerged, as some engineers expanded their work from addressing the health of components, to vehicles, and then to systems that include the vehicles and the people operating them. This is shown by the shift from "vehicle" to "system" health management.

Some of the variations came into being for at least partly "political" reasons. By this, I mean that one group of researchers wanted to distinguish their approach from those of other groups. This seems a likely motivation for the term "Prognostics" and Health Management, as a prominent group in the formation of the PHM community was prognostics researchers. A similar motivation likely existed for the case of "Integrated" System Health Management, in which the researchers involved wanted to emphasize the integration of data to produce information and then knowledge in SHM. The use of words to emphasize differences so as to favor your particular approach is prominent in the research world as part of the competition for research funds. The ideal for any researcher or academic is to have your term become "the" term, as this often implies that its associated ideas will become dominant. That usually means that funding flows to the dominant approach. Competitions for development

7 Stephen B. Johnson, Thomas J. Gormley, Seth S. Kessler, Charles D. Mott, Ann Patterson-Hine, Karl M. Reichard, Philip A. Scandura, Jr., eds. System Health Management: With Aerospace Applications. Chichester, UK: John Wiley & Sons, 2011.


contracts also feature rhetorical differentiation. This is the sort of political phenomenon investigated by Bruno Latour and others with actor-network theory. They often describe the competition of ideas as a phenomenon of politics in networks of actors, as much as or more than a matter of the technical merit of those ideas.8 There is much of value in this approach, though I argue that political factors are only one aspect of the spread of ideas and hence of the relative power among actors.

Another practice that is both rhetorical and technical is quantification. As described by Theodore Porter in Trust in Numbers, the mere fact of quantification brings its own rhetorical benefits.9 In both the newly emerging discipline of SHM and the older one of systems engineering, one of the major historical disadvantages has been the inability to quantify their benefits or performance. This has recently been changing. By 2011, I (and colleagues) developed a theory of SHM that enables quantification, with successful application in NASA's human spaceflight program. Aspects of this theory draw from Control Theory concepts. The quantification enabled by these concepts is similar to cost-benefit analyses or risk analyses, framed in probabilistic terms.10

Management at NASA's Marshall Space Flight Center (MSFC) recognized that SHM was a field of growing importance for space systems, and thus created an institutional home for it in 2004. The new Integrated System Health Management and Automation Branch gave SHM some institutional recognition and a place to build expertise. However, the bottom line for the organization's success would ultimately be based on how well its personnel performed on MSFC's major program, which after the Columbia accident of 2003 was to replace the Space Shuttle with a new launch vehicle to place astronauts into space. On the Ares I project, and then by 2011–2012 on the Space Launch System (SLS) program, a major task of MSFC ISHM personnel was to decide which on-board algorithms are needed to detect and respond to failure, and then to design them.

Deciding which on-board algorithms are needed is a particularly difficult task. Historically, engineers making these decisions used qualitative techniques to decide what algorithms are necessary. Given a proposed system design, historical information is used to create a list of critical failures that can occur. In this context, "critical" means components and functions necessary to achieve the mission, and in human-rated systems, to keep the crew safe. Using this list, and historical information indicating which components have relatively higher rates of failure than others, algorithms would be designed to detect and respond to these failures. This mostly qualitative activity had at least one major disadvantage: if there were questions about whether an algorithm was necessary, there was little information regarding the likelihood of the failures that the algorithm was to detect and mitigate, or regarding the effectiveness of that algorithm in addressing the failure. The question "is it

8 Bruno Latour, Science in Action, Cambridge, MA: Harvard University Press, 1987.
9 Theodore M. Porter, Trust in Numbers: The Pursuit of Objectivity in Science and Public Life, Princeton: Princeton University Press, 1995.
10 Stephen B. Johnson, Sudipto Ghoshal, Deepak Haste, and Craig Moore, "Fault Management Metrics," 2017 AIAA Science and Technology Forum, Grapevine, Texas, 9–13 January 2017.


worthwhile to have this algorithm?” could not easily be answered. Given that most other engineering disciplines provide quantitative estimates of the performance and effectiveness of their designs, this was a significant deficiency.11 Developing a quantitative method by which the value of the algorithm could be estimated alleviated this problem. Applying this method, engineers in the ISHM & Automation Branch provided information in the quantitative form preferred by MSFC and SLS senior management. In the struggle for credibility, this was an important factor cementing the existence of the branch, and its acceptance as a “standard engineering organization” like others in MSFC’s organization. The importance of rhetorical factors in the competition for funding should not be underestimated. This competition is particularly obvious in the research domain, in which researchers fight for funding nearly every year and attempt to distinguish their approach from those of competing researchers. It is less obvious but just as important in other contexts, such as achieving credibility for a new organization and discipline, in which quantification is an important rhetorical factor in gaining acceptance as a “regular engineering discipline.” For the engineer, understanding the role of terminology and quantification as rhetorical tools as well as technical descriptions is very useful in the competitive activities typical in engineering. For the philosopher of technology, these rhetorical factors are well-known, and the cases described here provide engineering examples in which rhetorical factors are important.
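To give a concrete flavor of how that question can be framed quantitatively, here is a minimal sketch of an expected-loss comparison for a single candidate failure-detection algorithm. This is not the method used on SLS or presented in the cited fault management metrics paper; the structure and every number below are hypothetical placeholders, intended only to show how "is it worthwhile to have this algorithm?" can be posed in probabilistic, cost-benefit terms.

# Hypothetical sketch: expected-loss comparison for one candidate detection/response algorithm.
# All probabilities and normalized losses are made-up placeholders, not flight data.
p_failure = 1e-3        # probability the covered failure occurs during a mission
p_detect = 0.95         # probability the algorithm detects that failure in time
p_false_alarm = 1e-4    # probability the algorithm triggers with no failure present
loss_unmitigated = 1.0  # normalized loss if the failure occurs and is not mitigated
loss_mitigated = 0.2    # residual loss when the failure is detected and mitigated
loss_false_alarm = 0.3  # loss from an unnecessary abort or shutdown

# Without the algorithm, every covered failure goes unmitigated.
loss_without = p_failure * loss_unmitigated

# With the algorithm: detected failures are mitigated, missed failures are not,
# and false alarms add their own cost.
loss_with = (p_failure * p_detect * loss_mitigated
             + p_failure * (1 - p_detect) * loss_unmitigated
             + (1 - p_failure) * p_false_alarm * loss_false_alarm)

print(f"expected risk reduction from adding the algorithm: {loss_without - loss_with:.6f}")
# A positive value argues for including the algorithm; comparing such values across
# candidates (and against development cost) supports a quantitative down-selection.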

4.3  Aristotle's Children: Teleology in Engineering

Engineered systems exist to achieve one or more goals or purposes, and to accomplish them the system performs functions that enable achievement of those goals. One major job of the engineer is to design a system with the capabilities to perform the necessary functions to achieve these goals. From the beginnings of philosophy, Aristotle argued that goal-directedness was a fundamental aspect of philosophy, which became known as teleology. Whether or not teleology is today a critical topic of philosophy can be debated. However, there can be no debate that it is inherently central to the nature of engineering. This section describes how engineers themselves are beginning to recognize the centrality of goal-directedness and hence (though few recognize it) of teleology.

I will start with the role of goal-directedness in the Theory of SHM.12 The theory starts by defining SHM as "the capabilities of a system that preserve the system's

11 An example of this quantification is described in Yunnhon Lo, Stephen B. Johnson, and Jonathan Breckenridge, "Application of Fault Management Theory to the Quantitative Selection of a Launch Vehicle Abort Trigger Suite," IEEE 2014 Prognostics and Health Management Conference, Spokane, Washington, June 2014, Paper ID 3225693.
12 Stephen B. Johnson, "The Theory of System Health Management", in Stephen B. Johnson, Thomas J. Gormley, Seth S. Kessler, Charles Mott, Ann Patterson-Hine, Karl M. Reichard, Philip A. Scandura, Jr., eds., System Health Management: With Aerospace Applications, Chichester, UK: John Wiley & Sons, 2011.


ability to function as intended," or alternatively "the capabilities of the system that preserve the system's ability to achieve its goals." The simplest system is typically one with the minimum set of capabilities to achieve the goals. Any capability added to the system to help ensure that those goals are achieved can be classified as SHM. SHM is much like insurance, which exists so that goals are achieved even in the face of unexpected or undesired events. The consumer purchases an insurance policy to ensure that, in the case of events such as a car accident or illness, the deleterious effects of these events can be mitigated. SHM operates similarly. A certain amount of capability is "purchased" and put into the system either to prevent failure or, if an undesired event occurs, to mitigate its deleterious effects. Goals are central to the theory, because non-achievement of a goal is defined as failure. Another related definition of failure is "the unacceptable performance of intended function." Since functions exist to achieve goals, the linkage is obvious in natural language. In formal mathematical terms, the relationship is even clearer. A function is defined in the same way as in classical mathematics, as y = f(x, t), where y is the output state, x is the input state, t is time, and the function f is what transforms or maps the inputs to the outputs over time. A goal being achieved is defined as the output state y being successfully constrained to be within an acceptable range over the time t. That is, r1 ≤ y ≤ r2 over the relevant interval of time.
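A minimal sketch of this formalization (my own illustration, not drawn from the SHM theory literature): a function maps an input state and time to an output state, and a goal counts as achieved when the output stays within the acceptable range [r1, r2] at every time of interest. The sample function, bounds, and time steps below are placeholder values.

# Sketch of the goal/function formalization: y = f(x, t), with the goal achieved when r1 <= y <= r2.
def f(x, t):
    # toy system model: the output decays slowly from the input state over time
    return x * (1.0 - 0.01 * t)

def goal_achieved(f, x, times, r1, r2):
    """True if the output state stays within [r1, r2] at every sampled time."""
    return all(r1 <= f(x, t) <= r2 for t in times)

times = range(0, 50)  # sampled times of interest
print(goal_achieved(f, x=10.0, times=times, r1=5.0, r2=10.0))  # True: output stays in range
print(goal_achieved(f, x=10.0, times=times, r1=9.0, r2=10.0))  # False: output drifts below 9.0
# In the theory's terms, the second case is a failure: the goal is not achieved.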


This was reinforced by a handwritten comment at the end of the survey, "The pipeline restricts driving farm equipment across the land in places and only brings temporary jobs. Is there a way for landowners to get like a 'royalty' based on gas volume for the pipe to cross their land?" Such questions are resolved through eminent domain; debates about the distribution of risks and benefits are generally ignored in favor of the categorical imperative of expanding energy infrastructure. There is little (if any) direct public benefit to the broader community affected by the taking of private lands.

Another key lesson on politicization centered on how geographic distance and sense of community influenced political views about the pipeline. Prior to our research, a poll was commissioned by the Virginia Chamber of Commerce and gathered responses from 500 Virginia residents via telephone. That poll showed support for the ACP at 55%, while opposition was 29% of respondents (The Tarrance Group 2016). Another poll in May 2017 by Hickman Analytics (2017) showed support for the ACP at 54%, while opposition had increased by only 2 points to 31%, well within the statistical margin of error. The results from our survey differed appreciably from those prior polls. The student researchers collected 272 random responses in Nelson and Augusta counties and found opposition to the ACP was 73.5%, with a 5% margin of error at a 99% confidence level. This shows that geographic distance influences political opposition to infrastructure. This has been widely described as the NIMBY (Not In My BackYard) phenomenon (Devine-Wright 2009, 2012). What became clear to the student-researchers was that residents living in Nelson and Augusta counties (where the pipeline is proposed to be built) held views that differed significantly from Virginians not impacted by the proposed pipeline. As one student-researcher reflected, "This is an issue that is dividing the state and setting up people living in the coastal cities where I grew up against others living in central and western regions."

In an effort to generate shared learning between the research team and community members, the next two community-based workshops were designed to challenge the NIMBY attitudes portrayed in the survey results. The two counties that were the focus of this research, Nelson and Augusta counties, primarily import power generated in other parts of the state, which means that other communities currently are burdened by the risks associated with power generation, while the residents of Nelson and Augusta counties reap the benefits. To confront this, the student-researchers designed a workshop inspired by an in-class activity two engineering students had participated in the prior semester, Energy Scenarios and Trade-off, which drew upon work by Berkes (2009). The workshop asked participants to consider trade-offs between alternative energy technologies and then to physically map the locations and quantities of energy generation within their county. As a student researcher observed, "It was an interesting task to create a game-like activity that was realistic and had real applications. It was also rewarding when the activity was successful in forcing people to make educated decisions about what they want to see in energy infrastructure. It was great to learn about other people's opinions and to encourage them to question their NIMBY feelings." At the end of the workshop series, another one of the engineering students reflected, "When it comes to research, I learned how important it is to engage the public. Progress demands tradeoffs, but some of those tradeoffs can be mitigated through simple involvement and engagement."


The workshop drew a small group of four participants from the local community who had completed the survey. The participants force-ranked seven alternative energy sources from 1 (most preferred) to 7 (least preferred). Then the participants re-ranked the energy sources using six criteria: environmental impacts, land requirements, human health risks, monetary cost, long-term employment, and aesthetics. Using natural gas as the baseline energy source, participants assigned values of +1 (better than natural gas), 0 (about the same as natural gas), or −1 (worse than natural gas) for each criterion. The participants felt the activity helped them work through the tradeoffs between energy sources.

The participants paired up into two groups of two people for the next activity, which we called Energy Grid Mapping. Both groups created their own energy planning maps for Augusta County, placing pushpins into a map to indicate where they would site energy production facilities to meet the projected electricity needs in Augusta by 2040. The groups were provided basic size, space, and resource availability information for each energy type (e.g., geothermal potential) to inform their decisions. While neither group favored new coal or nuclear power plants within Augusta County, there were marked differences in the two maps. Group 1 focused on empowering individuals to build their own energy systems by incentivizing new home construction and upgrading municipal buildings, yielding a decentralized pattern of solar coupled with geothermal (see Fig. 14.2, left). Group 2 focused on concentrated efforts supported by policies and incentives, as well as the redevelopment and use of restored land (e.g., wind and biomass at the landfill), which resulted in solar power in tightly concentrated areas and a few wind-generating sites and small-scale hydroelectric facilities (see Fig. 14.2, right).
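A toy illustration of the scoring arithmetic used in the trade-off activity described above. The six criteria follow the text, but the scores below are invented and do not reproduce the participants' actual responses.

# Toy illustration (scores invented) of the workshop's trade-off scoring:
# each energy source gets +1 / 0 / -1 per criterion relative to natural gas,
# and the totals produce a re-ranking.

criteria = ["environment", "land", "health", "cost", "employment", "aesthetics"]

# Hypothetical scores from one participant; the real workshop values differed.
scores = {
    "solar":         [+1, -1, +1,  0,  0, +1],
    "wind":          [+1, -1, +1,  0,  0, -1],
    "hydroelectric": [+1,  0, +1,  0, +1,  0],
    "geothermal":    [+1,  0, +1, -1, +1,  0],
    "biomass":       [ 0,  0,  0,  0, +1,  0],
}

totals = {source: sum(vals) for source, vals in scores.items()}
for source, total in sorted(totals.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{source:>13}: {total:+d} relative to natural gas")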

Fig. 14.2  Energy sources charted by Group 1 (left) and Group 2 (right). Note: Yellow = Solar; Pink = Geothermal; Green = Biomass; White = Wind; Blue = Hydroelectric. (Image credit: Rider Foley)


The experiential learning approach facilitated dialogue about energy preferences and differences over plausible energy scenarios in the region. The first two activities allowed for individual expression of knowledge and values. The initial questionnaire showed that the top three preferred energy sources were solar, geothermal, and hydroelectricity. However, the next activity, which confronted participants with trade-offs, resulted in hydroelectric, geothermal, and biomass as the top three preferred energy sources. The third activity, energy grid mapping, took the specific geographic capacity for different energy sources into account, resulting in a preference for solar, geothermal, and biomass. This shows how participants expressed their values in different ways depending on the activity and changed course when confronted with site-specific information and with the preferences shared by other workshop participants. While the group size precludes drawing generalizations, the research team was struck by the difference between the distribution of energy on the two maps (Fig. 14.2). The two groups vigorously debated the decentralized option expressed by Group 1 and the centralized, concentrated distribution expressed by Group 2. For the student researchers this highlighted that while the preferred energy sources were shared, the distribution and governance of energy could still be quite varied: "Their focus [Group 2] was on concentrated efforts supported by policies and incentives, as well as the redevelopment and use of restored land. They agreed with group 1 in regards to wind, although they decided to place wind turbines on the reclaimed land around landfills. This idea was in conjunction with the idea of the use of wind turbines in places such as West Virginia as a method to reclaim the land from abandoned coal mining such as mountain top removal techniques."

Following the workshop, we collaborated with one of the participants, an elementary school teacher, to develop learning modules based on the three workshop activities. The Energy Infrastructure Decisions and Trade-offs modules were implemented at an elementary school with approximately 90 students and 4 teachers as part of the 4th grade "Resources" unit (aligning with state science standards). An advanced version was also implemented in a college classroom with 19 junior science students and 21 first-year engineering students. The activity that we experimented with during the workshop appeared to generate learning around complex issues like energy infrastructure and to be scalable and adaptable for a range of ages, educational levels, and group sizes. Through experience, participants can discover how energy technology is intertwined with contextual dimensions such as rural versus urban distribution, political power, and justice. Such an approach is highly encouraging to the research team as it appears to garner mutual respect, listening, and co-creation of knowledge that is grounded in experience and local expertise.

14.5  Concluding Points

This research project was born from a simple question asked by an engineering student to a faculty member with training in the social sciences and humanities. Together they built a larger team, garnered resources for their collaborative research project, and worked to design a study that would explore the issue of natural gas pipeline infrastructure. Over the period of a year, the research team co-created research outcomes (questionnaires, maps, media accounts, and political process mapping), as well as co-designed and tested activities to learn with the community members.


This effort sought to follow the framework of shared learning: listening and communicating in non-violent ways, and co-creating activities that shared knowledge between the different groups (faculty and student researchers; the research team and participants). Such a mode of shared inquiry between faculty and student-researchers demands time and authentic relationship building.

What the team discovered through this approach was that the three pillars of disengagement articulated by Cech (2014) can be observed in the practice of designing large-scale socio-technical infrastructure and not just in the education of engineers. The separation of technical design by the energy utility and then the secondary step of societal approval showed how dualism manifests in the process. However, the intertwined approval process and iterative technical re-design suggest that this is a false divide. The notification by an energy utility to FERC, the regulatory body, immediately reveals the political nature of the process, not to mention that President Trump listed the Atlantic Coast Pipeline among his administration's top 25 infrastructure priorities (Marcellus Drilling News 2017), which makes it both a [P]olitical and [p]olitical process. Further, the project was not designed for the faculty researchers to consider the student-researchers as 'empty vessels' that needed to be filled with information (often termed the deficit model; Brown 2009). Rather, throughout the research project the student-researchers were able to make discoveries and teach the faculty about the lessons they were learning through the framework of shared learning. The interactions between the research team and the workshop participants were also designed to facilitate shared learning about the regulatory approval process with invited experts (workshop 1) and participatory energy mapping activities (workshop 2).

And finally, the voices that have united in opposition to the ACP are veterans of the coal-field battles of Appalachia and have the civic capacity and prior experience to mobilize quickly. However, the process is not well informed by the voices of those persons who are often marginalized in civic discourse. The process for planning pipeline infrastructure is intended to appear meritocratic, yet the research team observed who took the survey and how participants expressed that the decision-making process is far from ideal. Community involvement is relegated to reacting to proposed routes and environmental impacts, as shown in Fig. 14.1, rather than proactively shaping the decisions about the future of energy infrastructure.

There are, of course, alternative policies to the design-first, ask-permission-later approach outlined by FERC. In other nations, such as Canada, the United Kingdom, and the Netherlands, there is greater emphasis placed on engaging with key stakeholders in more deliberative processes prior to initiating any specific technological designs. For example, in Canada the process entails stakeholder identification and engagement before any energy infrastructure is proposed. That approach aims to connect societal values with the initial design of energy infrastructure, and is intended as a way to avoid dualism between social and technical decisions. In the United Kingdom there is a discrete timeframe allocated for the political process to unfold. This demands that all public engagement, site visits, and environmental assessments are conducted within one calendar year, and then a political decision is made.
That approach acknowledges the explicit political nature of decisions about energy infrastructure and seeks to avoid long, drawn-out legal battles. In the Netherlands, decades-long planning processes about energy infrastructure have sought to include diverse perspectives, make decision processes transparent, and yield equitable outcomes for the nation. At face value, the process described by Kemp and Loorbach (2006) seems to be the most meritorious among these four nations. Yet each of those processes has its critics, cf. Kim (2018) on the TransCanada pipeline or Bell et al. (2013) on the UK's wind approval process. Kemp and Loorbach's (2006) reflections on the energy transition project in the Netherlands suggest that the process undertaken there is not ideal. There appears to be much left to learn from one another, and a more systematic review of energy infrastructure planning processes that explores cross-national comparisons could offer important insights.

While an ideal deliberative, equitable process for designing energy infrastructure remains elusive, it is important to engage in shared learning. For faculty this means avoiding giving the 'right answers' to students in our classrooms and student-researchers in our research teams. At times, the openness and autonomy, more reflective of a professional workplace than a classroom, frustrated the students. As one student researcher observed, "Often when us undergraduates would meet up, we would discuss how we aren't really sure exactly what we needed to be doing because we weren't given detailed guidance on what needed to be done. We understand that this project was a little loose for us, and that we could be drawing our own conclusions from research… As the summer went on though, we reached out more to the professors and that helped. The professors were very willing to help at any cost and this was very appreciated." Rather than supplying answers, the faculty focused on facilitating the exploration of questions about complex socio-technical systems and offered the student-researchers an opportunity to learn and then bring information back to the research team and educate the faculty. The faculty were not experts in pipeline planning processes; rather, they facilitated a research process to learn about this case alongside the student-researchers. This required patience and commitment to the objective of shared learning and respect for different ways of coming to understand the societal, political, and justice dimensions of socio-technical systems. The research team, upon learning about the absence of energy generation in Augusta County, designed a set of workshop activities to engage members of that community. That workshop facilitated learning for both the participants and the research team, and offered a new understanding of the similarities and differences among the participants in their preferences for future energy infrastructure.

Philosophers, ethicists, and social scientists often interrogate the practices and outcomes of engineering from a critical distance. The approach presented here demands that philosophers and social scientists commit to discovering things alongside their students, colleagues, and community members. The desire to share knowledge and demonstrate expertise can preclude (or otherwise restrict) an approach that facilitates shared inquiry. Jointly considering questions that appear to be easily answered might serve to motivate difficult but important work.
The mindset of shared learning offers an opportunity to build relationships and reorient the dynamics between scholars from different disciplines, between faculty and students, and between research teams and stakeholders.

References

Barth, M. (Ed.). (2015). Implementing sustainability in higher education: Learning in an age of transformation. London: Taylor & Francis.
Barth, M., Lang, D. J., Luthardt, P., & Vilsmaier, U. (2017). Mapping a sustainable future: Community learning in dialogue at the science–society interface. International Review of Education, 63(6), 811–828.
Bell, D., Gray, T., Haggett, C., & Swaffield, J. (2013). Re-visiting the 'social gap': Public opinion and relations of power in the local politics of wind energy. Environmental Politics, 22(1), 115–135.
Berkes, F. (2009). Evolution of co-management: Role of knowledge generation, bridging organizations and social learning. Journal of Environmental Management, 90(5), 1692–1702.
Brown, S. (2009). The new deficit model. Nature Nanotechnology, 4, 609–611.
Cech, E. (2014). Culture of disengagement in engineering education? Science, Technology, and Human Values, 39(1), 42–72.
Devine-Wright, P. (2009). Rethinking NIMBYism: The role of place attachment and place identity in explaining place-protective action. Journal of Community & Applied Social Psychology, 19(6), 426–441.
Devine-Wright, P. (2012). Explaining "NIMBY" objections to a power line: The role of personal, place attachment and project-related factors. Environment and Behavior, 45(6), 761–781.
Downey, G., & Lucena, J. (2005). National identities in multinational worlds: Engineers and 'engineering cultures'. International Journal of Continuing Engineering Education and Life Long Learning. https://doi.org/10.1504/IJCEELL.2005.007714.
Eaton, E., & Kinchy, A. (2016). Quiet voices in the fracking debate: Ambivalence, nonmobilization, and individual action in two extractive communities (Saskatchewan and Pennsylvania). Energy Research & Social Science, 20, 22–30.
Emanuel, R. E. (2017). Flawed environmental justice analyses. Science, 357(6348), 260–261.
Ewing, R. (2016). Pipeline companies target small farmers and use eminent domain for private gain. North Carolina Central Law Review, 38(2), 125–141.
Finley-Brook, M., Williams, T. L., Caron-Sheppard, J. A., & Jaromin, M. K. (2018). Critical energy justice in US natural gas infrastructuring. Energy Research & Social Science, 41, 176–190.
Foley, R. W., & Gibbs, B. (2019). Connecting engineering processes with responsible innovation: A response to macro-ethical challenges. Engineering Studies, 11(1), 9–33.
Foley, R. W., Barrella, E., Kirkvold, H., Wilkins, R., Sloss, A., Mazur, E., Trevisan, C., Rogerson, J., Katleman, D., Mohan, C., Lindsey, V., Dang, F., & Clarens, A. (2017). Future energy infrastructures: Engagements with the Atlantic Coast Pipeline. Research Report. https://doi.org/10.18130/V3CR5NB8D
Froyd, J. E., Wankat, P. C., & Smith, K. A. (2012). Five major shifts in 100 years of engineering education. Proceedings of the IEEE, 100(S), 1344–1360.
Gibson, R. B. (2006). Sustainability assessment: Basic components of a practical approach. Impact Assessment and Project Appraisal, 24, 170–182.
Heaney, T. (2013). The illusive ground between town and gown. New Directions for Adult and Continuing Education, 139(Fall), 35–43.
Hecht, G. (1998). The radiance of France: Nuclear power and national identity after World War II. Cambridge, MA: MIT Press.
Hickman Analytics Inc. (2017). Poll profile and results in Virginia. Consumers Energy Alliance. https://www.abralliance.org/wp-content/uploads/2017/05/VA_poll_profile_and_results.pdf. Accessed 10 July 2019.


Hughes, T. P. (1987). The evolution of large technological systems. In W. Bijker, T. P. Hughes, & T. J. Pinch (Eds.), The social construction of technological systems (pp. 51–84). Cambridge, MA: MIT Press.
Kasl, E., & Marsick, V. (1997). Epistemology of groups as learning systems: A research-based analysis. Proceedings of the 27th Annual SCUTREA Conference, London, UK, July 1–3.
Kemp, R., & Loorbach, D. (2006). Dutch policies to manage the transition to sustainable energy. https://repub.eur.nl/pub/7629/. Accessed 12 June 2019.
Kemp, R., Parto, S., & Gibson, R. (2005). Governance for sustainable development: Moving from theory to practice. International Journal of Sustainable Development, 8(1–2), 12–30.
Kim, J. (2018). Perspectives from the ground: Colonial bureaucratic violence, identity, and transitional justice in Canada. Conflict and Society, 4(1), 116–134.
Marcellus Drilling News. (2017). Hope: Atlantic Coast Pipe on Trump list of high priority projects. https://marcellusdrilling.com/2017/01/hope-atlantic-coast-pipe-on-trump-list-of-high-priority-projects/. Accessed 12 June 2019.
McCall, L. (2013). The undeserving rich: American beliefs about inequality, opportunity, and redistribution. New York: Cambridge University Press.
Norgaard, R. (2004). Transdisciplinary shared learning. In P. Barlett & G. Chase (Eds.), Sustainability on campus: Stories and strategies for change (pp. 107–120). Cambridge, MA: MIT Press.
Ohnesorge, L. (2019). Dominion CEO: We'll take the ACP fight to the supreme court. Triangle Business Journal, May 7. https://www.bizjournals.com/triangle/news/2019/05/07/dominion-ceo-well-take-the-acp-fight-to-the.html. Accessed 2 Dec 2019.
Rawls, J. (1985). Justice as fairness: Political not metaphysical. Philosophy and Public Affairs, 14, 223–251.
Riley, D. (2008). Engineering and social justice: Synthesis lectures on engineers, technology, and society. Williston: Morgan & Claypool Publishers.
Seron, C., & Silbey, S. S. (2009). The dialectic between expert knowledge and professional discretion: Accreditation, social control and the limits of instrumental logic. Engineering Studies, 1(2), 101–127.
Slaton, A. E. (2014). Meritocracy, technocracy, democracy: Understandings of racial and gender equity in American engineering education. In S. H. Christensen, C. Didier, A. Jamison, M. Meganck, C. Mitcham, & B. Newberry (Eds.), International perspectives on engineering education: Philosophy of engineering and technology (pp. 171–189). Cham: Springer.
Stirling, A. (2014). Transforming power: Social science and the politics of energy choices. Energy Research & Social Science, 1(1), 83–95.
Talwar, S., Wiek, A., & Robinson, J. (2011). User engagement in sustainability research. Science and Public Policy, 38(5), 379–390.
The Tarrance Group. (2016). A survey of voter attitudes in Virginia. Virginia Chamber of Commerce. https://www.vachamber.com/wp-content/uploads/2016/10/15420-VIRGINIA-POLL-CHARTS.pdf. Accessed 12 June 2019.
Tierney, S. (2017). Natural gas pipeline certification: Policy considerations for a changing industry. Analysis Group. https://www.analysisgroup.com/uploadedfiles/content/insights/publishing/ag_ferc_natural_gas_pipeline_certification.pdf. Accessed 2 July 2019.
Trencher, G., Yarime, M., & McCormick, K. B. (2013). Beyond the third mission: Exploring the emerging university function of co-creation for sustainability. Science and Public Policy, 41(2), 151–179.
Vilsmaier, U., Engbers, M., Luthardt, P., Maas-Deipenbrock, M., Wunderlich, S., & Scholz, R. W. (2015). Sustainability Science, 10(4), 563–580.
Winner, L. (1980). Do artifacts have politics? Daedalus, 109(1), 121–136.
Yin, R. K. (2003). Applications of case study research (Applied social research methods series, 2nd ed.). Thousand Oaks: Sage.
Young, M. (1994). The rise of meritocracy. New Brunswick: Transaction Publishing.

Chapter 15

Middle Grounds: Art and Pluralism
Caitlin Foley and Misha Rabinovich

Abstract  From the inventions of Leonardo Da Vinci, the tintype photographs of Dorothea Lange, Salvador Dali's collaborations with filmmakers and animators (including Walt Disney), to MacArthur Fellow Trevor Paglen's recent escapades with artificial intelligence, artists are often among the first groups of people to bring new technology into the public spotlight. The first thing created with a new technology is often seen through the lens of art, simply because of the novelty. As the technology diffuses into the culture, the content of the work made with it and the context the work is presented in drive the artistic value over the groundbreaking novelty of the format. For example, the pioneering video artist Nam June Paik created the first video art by simply getting his hands on a Sony Portapack in the mid-sixties and recording the first thing he encountered (Meigh-Andrews: A history of video art. Bloomsbury, New York, pp 17–19, 2006). The stranglehold on TV content production, controlled by huge corporations within the context of a one-to-many medium, was broken. However, once they became generally available, the content captured and produced by camcorders became more important, and the form alone was no longer enough to justify the result as artwork. Art can create contexts for engaging with technology without the pressure of using it for practical means and thus allows technology to be seen from totally new angles, to be imagined in different contexts, and to be questioned from new aesthetic and ethical points of view. We, the artist duo Caitlin & Misha, aim to use technology as a means of creating unique shared experiences and to harness it as a tool for engaging people in deep thinking and feeling about the current trajectory of our culture.

Keywords  Novelty · Art and technology · Possibility space · Machine learning · Artificial intelligence · Neighborliness · Temporary autonomous zone · Heterotopia · Pluralism · Speculative fiction · Design fiction · Alchemy · Participatory installation · Relational aesthetics

C. Foley (*) · M. Rabinovich
Art & Design Department, University of Massachusetts Lowell, Lowell, MA, USA
e-mail: [email protected]; [email protected]
© Springer Nature Switzerland AG 2021
Z. Pirtle et al. (eds.), Engineering and Philosophy, Philosophy of Engineering and Technology 37, https://doi.org/10.1007/978-3-030-70099-7_15


Caption  Sphinctegraphs (gut bacterial ecologies of 7 FMT donors, data from OpenBiome) by Caitlin & Misha

15.1  Introduction

15.1.1  Background

From the inventions of Leonardo Da Vinci, the tintype photographs of Dorothea Lange, Salvador Dali's collaborations with filmmakers and animators (including Walt Disney), to MacArthur Fellow Trevor Paglen's recent escapades with artificial intelligence, artists are often among the first groups of people to bring new technology into the public spotlight. The first thing created with a new technology is often seen through the lens of art, simply because of the novelty. As the technology diffuses into the culture, the content of the work made with it and the context the work is presented in drive the artistic value over the groundbreaking novelty of the format. For example, the pioneering video artist Nam June Paik created the first video art by simply getting his hands on a Sony Portapack in the mid-sixties and recording the first thing he encountered (Meigh-Andrews 2006). The stranglehold on TV content production, controlled by huge corporations within the context of a one-to-many medium, was broken. However, once they became generally available, the content captured and produced by camcorders became more important, and the form alone was no longer enough to justify the result as artwork.


Art can create contexts for engaging with technology without the pressure of using it for practical means and thus allows technology to be seen from totally new angles, to be imagined in different contexts, and to be questioned from new aesthetic and ethical points of view.

This realization shows two general paths for how to engage with new technologies as art. There is a fair amount of cutting-edge technological artwork produced by artists and engineers who are quick to jump on new inventions (sometimes creating gimmicks to 'ooh' and 'aah' over). This work produces results that are often pleasing to the senses, marking a turning point in the diffusion of the technology. There is also artwork that does not aim to glorify the technology but to employ it as another artistic medium, a tool wielded to tell stories, evoke emotion, and/or reflect on social, political, and cultural issues. However, choosing to use a new or developing technology as a medium means that the artwork will inherently raise questions about the larger implications of technological developments on human society and the world. We, the artist duo Caitlin & Misha, aim to use technology as a means of creating unique shared experiences and to harness it as a tool for engaging people in deep thinking and feeling about the current trajectory of our culture.

15.1.2  Intersection of Art & Technology

We perform our art with the recognition that there are many different visions of what technology is and will be. Various forms of fiction, from science fiction to design fiction, present opportunities for imaginative engagement with new or as-of-yet uninvented technologies. Our culture is inundated with dystopic visions of technology in the future, which play out on larger-than-life screens in our movie theaters, Xboxes in the living room, and handheld devices in transit. How are these visions of the future shaping the development of technology? Beyond the utopic/dystopic dichotomies found in a lot of fiction there is a messy space of pluralism1 where many possibilities coexist. Pluralism describes situations where multiple visions of the world are applied to the same object, with no vision or description being more important than the others. As artists we are interested in building experiences where the audience is invited to help create an open possibility space (one which allows different possibilities for our current and future relationship with technology).

1  Nancy Cartwright’s ideas on a dappled/mottled/patchworked world are operative here. One of the examples she uses is the problem of believing quantum physics has superseded classical physics. In the real world, quantum physics is only practical in very specific circumstances, while classical mechanics are still useful generally. The two theories coexist together in a kind of patchwork that “is more like an outcome of negotiation between domains than the logical consequence of a system of order” (Cartwright 1999).


In this text we will explore how the use of technology and this concept of heterotopia relate to our work. First we will describe our work, and then illustrate how we repurpose technology in ways that engage various social issues. Then, recognizing the possibility of competing pluralistic visions from any technology, we discuss several of our artworks and the possibility spaces that they espouse. Our participatory installations are tangible and also speculative. Experiencing our work makes the pluralistic experience tangible because multiple viewpoints are present via a shared experience, while the technological exploration may be more speculative. What if we banked our stool daily? What if we trained AI to worry for us? Encouraging possibility spaces like these is one tangible way to use technology to pursue our broader artistic vision.

15.1.3  Overview of Artist Duo Caitlin & Misha

Across all of our projects, technology enables the relational aspect of our work and plays a key role in fostering true participation from our audience. When we embrace practices aimed toward long-term survival, such as eliminating waste by transforming it into energy, we are able to live more symbiotically with other systems on Earth. One of our first collaborations, the DS Institute Sweat Battery, actively creates and engages a sharing community through the collection of sweat from participants using our mobile sauna, presents an alternative ecology for energy production, and transforms "waste" sweat into power to charge cell phones and symbolize collective energy. Naturally occurring systems such as rhizomatic networks of mycelium, the microbiome ecology, and emergent pink noise are inspirations for the shared experiences we construct in our collaborative art practice.

Our ongoing work with pink noise, or 1/f noise (a form of noise believed to have meditative, relaxing effects, and much less harsh than white noise), includes a series of wearable devices such as the Heart Hat, which focuses the user on the rhythms of his or her own heartbeat. Though commonly associated with nature, e.g. the sound of a waterfall, pink noise also appears in certain human cultural ecologies; we found that multiple radio stations or YouTube video soundtracks mixed together converge into pink noise. Pink noise is also an energy pattern found in many systems on earth, from the natural to the cultural. Our project investigates the value of the sonic form of pink noise as a means of filtering media overstimulation.
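As a hedged illustration of what "1/f" means in practice, here is one standard way to approximate pink noise by shaping the spectrum of white noise. This is our sketch of a common signal-processing technique, not the artists' actual tool chain; the duration and sample rate are assumptions.

# Minimal sketch (one standard approach, not the artists' actual tool chain):
# approximate pink (1/f) noise by shaping the spectrum of white noise so that
# power falls off in inverse proportion to frequency.

import numpy as np

def pink_noise(n_samples, sample_rate=44100, seed=0):
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n_samples)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n_samples, d=1.0 / sample_rate)
    freqs[0] = freqs[1]                 # avoid dividing by zero at DC
    spectrum /= np.sqrt(freqs)          # amplitude ~ 1/sqrt(f)  =>  power ~ 1/f
    pink = np.fft.irfft(spectrum, n=n_samples)
    return pink / np.max(np.abs(pink))  # normalise to [-1, 1]

samples = pink_noise(10 * 44100)        # ten seconds of pink noise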


Caption:  Radio Wormhole is an aural bridge between two cultures built using a mashup of the top ten radio stations in NYC (left ear) and London (right ear) to produce a cross cultural experience of pink noise. By Caitlin & Misha

In addition to exploring ecology, media, and alchemy, we aim to create artworks that provide unique opportunities for neighborliness. The Pink Noise Salon was a comfortable place for people to immerse themselves in listening experiences, attend related talks, and participate in a group sonic meditation. Worries Bash is a similar experiment, driven by an archive of recorded worries from various participants. Each worry retains its uniqueness while at the same time contributing to the production of a kind of heterotopia, or space of othernesses. These experiential projects aim to bring different life experiences into one place where they can sit side by side.

15.1.4  Repurposing Technology via the Worries Bash and Other Projects

Art can make the potential benefits of technology accessible to a broader public, more so than is possible in the realm of research and development. We are interested in taking technology that is often capitalized on in pay-to-play spaces into places that are free and open to the general public. With this in mind we often combine cutting-edge technology with age-old materials and techniques. Worries Bash, an interactive installation, includes an audio portrait composed of thousands of recorded worries collected from people via an online portal. These are exhibited as emanations from fragile papier-mâché sculptures inspired by traditional piñatas. Attendees were able to bring the audio recordings into focus by tapping or hitting the sculptures, which were ultimately broken open to ceremonially release the worries.


People are often surprised when they interact with the sculptures in Worries Bash and discover that their actions are triggering the audio. There is a playfulness inherent in combining papier-mâché (newspaper, flour, and water) with electronics and software.

Artificial Intelligence (AI) is one of the most talked-about technologies at the moment, with daunting social implications. We brought AI into this playful do-it-yourself environment by using the recorded worries to train a Sample Recurrent Neural Network (SampleRNN), with the help of Dadabots (Carr and Zukowski 2017), to synthesize new worries. After about twenty-nine evolutions the network could produce what sounded like synthetic worries: it learned the rhythm and cadence of the speech of worried humans. The only discernible word in this audio is "worried". The child- or infant-like sounds (rendered in adult voice timbres) with almost recognizable words are what people hear by default when an audible human worry isn't being triggered. The worried mumbles became distilled concerns that sound familiar and help create an uncanny sonic environment. Tapping or hitting the main interactive piñata sculpture triggers a custom piece of software to attenuate the mumbles and amplify one of the actual recorded worries.

The first Worries Bash was conducted in a German artist-run space called AgvA C.I.A.T in Berlin in 2016. We were concerned that the piñata-bashing environment could be disrespectful to the seriousness of the thousands of worries recorded by various contributors. To our relief, our goal to help bear the worries collectively was achieved through the careful use of ritual as well as by leveraging several technologies such as machine learning and an ultrasonic beam speaker or "sonic laser". We have since iterated the project for US exhibitions in Dayton, Ohio and Boston, Massachusetts. During normal gallery hours leading up to the bash, visitors can interact with the main piñata sculpture and bring different worries into focus out of the cloud of SampleRNN mumbles. The piñata acts as a kind of worries browser. By employing AI as just one of our artistic media in a layered and textured installation there is an opportunity for imagining the possibilities of AI through a new lens.
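The following is a rough sketch, in ordinary Python, of how tap-triggered attenuation of this kind could work. It is our illustration under assumed file names and timings, not the installation's actual software.

# Rough sketch (not the installation's actual code) of the interaction logic:
# a tap on the pinata ducks the synthetic mumbles, brings one recorded worry
# into focus, then fades back to the ambient mumble bed.

import random
import time

recorded_worries = ["worry_001.wav", "worry_002.wav", "worry_003.wav"]  # placeholders

def set_volume(track, level):
    # Stand-in for whatever audio engine actually drives the installation.
    print(f"{track}: volume {level:.1f}")

def on_tap():
    worry = random.choice(recorded_worries)
    set_volume("mumbles", 0.2)     # attenuate the SampleRNN mumble bed
    set_volume(worry, 1.0)         # amplify one actual recorded worry
    time.sleep(8)                  # let the worry play out (duration assumed)
    set_volume("mumbles", 1.0)     # return to the ambient state

on_tap()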

Caption: Ecology of Worries consists of several animated characters speaking worries that were synthesized using various machine learning algorithms (which were trained on an archive of recorded worries that the artists collected since the 2016 US elections). Pictured here projection-mapped for ILLUMINUS, Boston (photo by Aram Boghosian). By Caitlin & Misha

We are using this archive of over a thousand worries collected for Worries Bash to develop a new work called Ecology of Worries, featuring a landscape of hand-drawn critters who take turns sharing worries. The worries that they speak were produced using TextGenRNN and GPT-2 machine learning models trained on the worries archive (one plausible generation recipe is sketched below). It is largely speculative in asking the questions: What if we could train a machine to worry for us? What could be gained and what could be lost? We use familiar text-to-speech voice synthesizers so that the critters' voices are reminiscent of virtual assistants such as Alexa, beckoning reflection about how much our thoughts are already influenced by machines.

This transformation is replicable in multiple areas. Machine learning is commonly used to summarize information with the goal of providing accurate data sets. We are repurposing it to tell fictional stories that present new lenses for looking at culture. As part of our Shareable Biome project we created visualizations about the microbiome and ideas about the sharing communities that make Fecal Microbiota Transplantations (FMTs) possible. One such visualization is a series of data portraits of the microbial composition (or "enterotypes") of OpenBiome's stool donors, which we dubbed Sphinctegraphs. People can have a thousand species of bacteria in their guts. Including each bacterium would present too many pie slices to make visual sense of, so we used topic-modeling machine learning software (Girdhar et al. 2013) to boil roughly 1000 dimensions down to ten (a generic stand-in for this reduction is sketched below). Each of the ten topics can be thought of as a certain collection of bacteria that are likely to appear together. The shape of the graph is reminiscent of a biomorphic sphincter, to make the idea of a poop transplant, which is laden with taboos, more familiar and comfortable. It is also an opportunity for a person or a group to see themselves as an ecology containing multitudes of beings. As a whole our Shareable Biome project explores the need for a diverse microbial culture as a potent analogy for the health of a more diverse human culture.

Social media applications have had some of the most far-reaching effects on human culture and behavior over the past decade. Our project Total Jump uses social gaming to bring people face-to-face in the same place (as opposed to isolating or dividing them up). Total Jump is an arcade-style jumping game that trains people for a worldwide coordinated jump where everyone lands at exactly the same time. The near impossibility of accomplishing a total jump, combined with the ease and fun of training for it, invites the audience to bridge the gap between a postmodern pluralistic2 world and the necessity of globally coordinated solutions in the face of the Anthropocene.
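One plausible recipe for the text-generation step mentioned above, using the textgenrnn library. The file name, epoch count, and sampling temperature are assumptions; the artists' actual training setup may well differ.

# One plausible path (not the artists' documented pipeline) for generating
# synthetic worries from a text archive; 'worries.txt' holds one worry per line.

from textgenrnn import textgenrnn

textgen = textgenrnn()
textgen.train_from_file("worries.txt", num_epochs=10)
synthetic_worries = textgen.generate(5, temperature=0.8, return_as_list=True)
for worry in synthetic_worries:
    print(worry)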
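And a minimal sketch of the topic-modeling reduction described above, using scikit-learn's LatentDirichletAllocation as a generic stand-in for the software the project actually used (Girdhar et al. 2013). The abundance counts below are synthetic.

# Minimal sketch: reduce a donors-by-taxa count matrix to ten "topics"
# (groups of co-occurring bacteria), which the Sphinctegraph slices visualize.

import numpy as np
from sklearn.decomposition import LatentDirichletAllocation

rng = np.random.default_rng(0)
# Rows: 7 donors; columns: ~1000 bacterial taxa (synthetic abundance counts).
abundance = rng.poisson(lam=2.0, size=(7, 1000))

lda = LatentDirichletAllocation(n_components=10, random_state=0)
topic_mixture = lda.fit_transform(abundance)   # shape (7, 10)

print(np.round(topic_mixture[0], 2))           # one donor as ten topic weights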

2  Artistic modernism and architecture were keen to develop a universal language of abstraction. This aesthetic drive arguably led us to fascism and authoritarian communism. To contrast this, postmodern pluralism is an embrace of context (e.g. Josef Albers' notion of interaction of colors or the perception of color being totally relative to other colors around it) and inclusive culture where many kinds of outlooks can coexist. This pluralism can actually make unitary political solutions to wicked problems like climate change harder to achieve.


When participants successfully land in unison, a photo of them in midair appears on the screen, which is super satisfying and often prompts laughter and a desire to keep playing. We released a beta version of the Total Jump Live mobile app for training, in which players are prompted by a countdown on the screen and jump while holding their mobile device. This allows people to coordinate jumps with friends and family standing in the same room as well as with strangers from around the world, and potentially to discover fellow jumpers when out and about.

Art can debunk preconceived notions about technology. In some ways using technology is simply a way of embracing the language of the times. Henri de Toulouse-Lautrec's lithographs employed the new technology of the time, which provided the ability to produce multiples of a given image. Employing cutting-edge technologies (machine learning, smartphone apps), as well as creating work that embraces media archeology and uses technology that has been in use for a long time (physical computing, electronics, textiles, printmaking, 2D animation, etc.), can provide viewers with multiple entrance points into experiencing the artwork.

There is a lag between technological innovations and their adoption. We think that science and engineering are producing everything that we need to survive as a species, and yet our culture is not integrating these insights equitably. The world is changing rapidly due to technological innovation, so both science and art must react to the same changes. Employing technology as an artistic tool within the framework of contemporary culture creates a space for asking questions and developing alternative visions for the uses of technology.

15.2  Creating Space

15.2.1  Space that Is Open to the Public

Modern and contemporary life has grown almost unbearably bureaucratic, with technology that was supposed to make our lives easier actually making them more complicated. Max Weber was prescient when he predicted the technocratic disenchantment of our lives (Owen and Weber 2004) due to the confluence of capitalism, technology, and rationalism. He also wrote about the deprioritization of public monuments in exchange for personal fulfillment in private relationships and individualistic encounters with art. We are inspired by the Fluxus and Situationist movements, happenings, and maintenance art to create space for relational artistic experiments.

We built a mobile sauna3 while based in Syracuse, NY, where there are no public saunas. In addition to the health benefits of saunaing, we were interested in creating a space for people to have a shared social experience. The idea to make the sauna mobile came from our desire to call attention to the privatization of public space and of culture more generally. Communities often form in times of need as a sort of survival instinct.

3


People living right next to each other who have never spoken may join forces when digging out of a snowstorm or recovering from a flood. Public spaces can help provide a place to come together without necessitating a disaster. Sharing a unique experience such as a sauna together can create valuable interactions. On an individual level, the sauna cleanses people of physical toxins and releases stress. Surrendering to this act of cleansing in the presence of others creates a unique social bond. The public becomes intimate.

Caption: While participants rejuvenated in the DS Institute Mobile Sauna parked outside of the New Museum in NYC, their phones recharged using the early version of the DS Institute Sweat Battery so they could call their friends and family to join them. Also pictured are the artists’ prototypes of “Baghdad batteries” based on the mysterious out-of-place artifacts. By the DS Institute (Caitlin Foley, Misha Rabinovich, and Zach Dunn).

When we brought the sauna to NYC for the New Museum's Ideas City Festival, we invited bathers to donate their sweat to be the active ingredient in a working electrical battery used to charge mobile phones. Bathers could then message their friends and family to join. Our battery was inspired by the so-called "Baghdad battery", the curious archeological discovery which appeared to be a primitive battery dated to 400 B.C. (Frood 2003). It is unclear what this battery was used for, but we suspect ritual worship. We decided to reproduce the battery and iterated on the original design to create a community sweat battery consisting of 98 cells: groups of seven cells wired in series to attain a high enough voltage (~3 V), with the resulting 14 groups hooked up in parallel to boost the amperage and charge USB-compatible mobile phones (Griffin 2013). We handed out an illustrated how-to sheet (based in part on an aluminum-air battery design; Chasteen et al. 2008) to participants. The use of media archeology and a do-it-yourself battery is a response to the role of technology in the human relationship to the environment.
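Back-of-the-envelope arithmetic for the series/parallel layout just described. The per-cell voltage is implied by the text (seven cells yielding roughly 3 V); the per-cell current is an illustrative assumption, not a measured value.

# Rough arithmetic (our illustration) for the sweat battery layout above.

cells_total = 98
cells_in_series = 7                                   # one series group
groups_in_parallel = cells_total // cells_in_series   # = 14

volts_per_cell = 3.0 / cells_in_series   # ~0.43 V per aluminum/copper/sweat cell
milliamps_per_group = 35.0               # assumed, for illustration only

pack_voltage = volts_per_cell * cells_in_series           # ~3 V (series adds voltage)
pack_current = milliamps_per_group * groups_in_parallel   # parallel groups add current

print(f"{groups_in_parallel} groups, ~{pack_voltage:.1f} V, ~{pack_current:.0f} mA available")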


15.2.2  Creating Spaces for Shared Experiences

The Mobile Sauna and Sweat Battery provide a communal experience. Worries Bash creates a shared experience through a combination of familiar rituals and novel technology. Worrisome shifts in the USA's politics triggered our interest in collecting worries. Leaders are drawing the culture inward while souring relationships with longstanding allies. The emotionalization of events by the media is engendering worries that swirl inside us, trapping us in manufactured anxieties. We have been asking people in our communities what they are worried about and find they have a plethora of concerns at the ready. This project is an opportunity to collect various types of worries and consider similarities in emotional cycles. There are many social situations where it is unacceptable to express these thoughts and emotions, yet we all have them.

Before we officially start bashing the piñata sculpture with a stick, we ask everyone to form a loose circle and then come up one by one to hit or tap the piñata so we can all really listen to the worries. In this way we pay respect to the woes of the individual recordings. Then the first person to bash the sculpture gets blindfolded, spun around, positioned in front of the sculpture, and allowed two attempts to hit it. People cheer and urge them to find their mark. The person has to select the next basher, and so on until the piñata is destroyed. When the sculpture breaks open, the audio stops, and transcriptions of the worries, along with small dolls inspired by Guatemalan worry dolls, spill to the floor. People typically crouch side by side, sifting through worries, reading them and pairing them with the dolls. It takes a group of people to break the sculpture. The act of selecting worries and dolls to take home often results in reading and discussing them with others.

15.3  Shaping Culture

15.3.1  Possibility Space

Now that we've described how our work repurposes technology, we would like to elaborate on how each project opens up the space to look at multiple possibilities of technology. When accepting the National Book Foundation's medal, the late, great science fiction author Ursula Le Guin said: "Hard times are coming when we will be wanting voices who can see alternatives to how we live now, can see through our fear-stricken society and its obsessive technologies to other ways of being; even imagine grounds for hope." The dystopias in our fiction, movies, etc. risk becoming self-fulfilling prophecies if we don't feed our utopian imaginary. We aim to create a pluralistic space with our artworks where unique possibilities of our current and future culture can be explored.


15.3.2  Possibility Space of the Mobile Sauna and Sweat Battery

Caption: DS Institute Sweat Battery. By the DS Institute (Caitlin Foley, Misha Rabinovich, and Zach Dunn).

With the mobile sauna and sweat battery projects we hope to expand the possibility space of neighborliness and sharing community. This is not a utopian premise. Hannah Arendt described the sterility of utopias, remarking that in a utopia everyone is the same (Arendt 1958). A heterotopia is a place where many different kinds of individuals can coexist. The Mobile Sauna is a sort of temporary autonomous zone. It doesn't matter where it is parked: it is always the same warm space inside. There is a saying that everyone in the sauna is equal. You can't wear a Rolex in the sauna, and everyone has a similar physical experience. In Finland there is a practice called sauna diplomacy, and politicians have negotiated using a sauna. It is a space where people can bring their different perspectives to an equalized playing field.

The ion transport solution in our sweat battery is sweat, or released stress, which is still real and holds its charge but is externalized and no longer doing damage inside the body and mind. Sweat that would otherwise be wasted is sublimated into power. This sort of alchemy is a recurring theme in our work. The sweat battery requires sweat from many people to produce a substantial amount of energy. In our culture an individual's sweat is taboo, but a shared sweat can build solidarity, especially when we notice that it all looks the same! Sublimating the sweat into power symbolizes communal energy, a potential within a united community. The aluminum air battery requires two different metals joined by an electrolyte solution, e.g. sweat. We used copper nails left over from sauna construction and pieces of aluminum heat barrier to construct the cells. The battery thus served as a direct index to the sauna itself. The experiential dimensions of this project and the creation and activation of a sharing community became important strategies for our projects going forward.


The mobile sauna creates an opportunity for a diverse group of people to come together into a kind of heterotopia (Foucault and Miskowiec 1986), or a space where many kinds of differences can coexist. The battery is a symbol of the ability to harness alterity into something generative.

15.3.3  Possibility Space of the Shareable Biome

Caption: Caitlin & Misha’s Shareable Biome lecture-performance at the Science Gallery, London, UK. Photo by the Science Gallery at King’s College.

The human microbiome is a kind of heterotopia where diversity of microbes is required for health. Our Shareable Biome project celebrates microbial diversity while proposing speculative solutions to the loss of microbiome diversity. The human microbiome is all of the microorganisms living on one's skin, in one's lungs, in one's guts, etc. Diet and lifestyle play an important role in maintaining a healthy balance of microflora. There is a general understanding that the health of our microbiome greatly affects our physical and mental health. At the core of microbiome research is the radical life-saving probiotic procedure called the Fecal Microbiota Transplantation (FMT). FMT is a suppository of a healthy person's feces into the bowel of a patient and is pure alchemy: feces is recontextualized and through that act becomes a life-saving medicine. Alchemy tends to be a recurring theme in our work: using people's sauna sweat to charge phones, and mixing excessive amounts of media to produce relaxing pink noise.


To us the ecological need for a diverse microbiome is a fertile analogy for a multitude of cultural struggles in which sharing communities play a key role. The conservation of this diversity is not possible by individuals alone but relies on ever-expanding groups and networks of beings. Culture refers to both bacterial culture and human collective truth. Research supports playing in the dirt, gardening (and inhaling soil microbes), etc. as an effective way to combat depression (Schlanger 2017). The alchemical possibilities of poo are plentiful, e.g. fertilizer. During the lecture-performance component of our Shareable Biome project we invite the audience to share their skin microbiomes with each other by dipping a finger into distilled water (for gene sequencing later). We think this sort of shared experience has the capacity to inspire thought experiments and dark discussions while making visible the maintenance work required for a healthy culture.

Networked homes and smart cities promise to help us coexist efficiently with each other and with technology in urban environments. But what about all the age-old threats to urban life that haven't gone away? Weakened immune systems and a high density of people could be the conditions for another perfect pathological storm such as the black plague.4 Can the Internet of Things (IoT) help with this? We believe a system of networked smart bathrooms can track imbalances in human microbiomes and even go beyond, to augment the microbiomes of people in several ways: by literally infusing the environment of a deficient person with microbes from healthy people, as well as by providing data to help people choose diverse neighborhoods (microbiomically speaking). Anthropogenic changes in climate and urbanization have greatly destabilized the relationship between microbes and humans, and the smart city of the future could address this issue. Although the smart city could leverage technology to redress this dysbiosis, it would require a radical openness from the human denizens the likes of which we haven't seen. Design fiction could show the way.

Studies already show that young chickens sprayed with healthy adult chicken bacteria are more resilient to disease than chickens fed antibiotics (Edens 2003). A smart networked bathroom equipped with high-throughput gene sequencing could monitor the microbiomes of its users (e.g. a shower drain tracking skin microbiome diversity). Based on these ideas (and the use of Fecal Microbiota Transplants, FMTs), we have already proposed a kind of "two-way toilet", using the slogan "you poop into it and it poops into you". This novel toilet has made an appearance in our lecture-performance that is part of our Shareable Biome project. We propose extending this idea further into a generalized smart bathroom that measures not only the gut but the skin, oral, and other microbiomes. Just as in the forest ecology, where mycelium interconnects tree roots and delivers custom-tailored nutrition to each tree based on needs (Stamets 2005), so can the smart bathroom network create a microbiotic internet where technology helps to maintain diversity.
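One small, standard calculation of the kind of score such a bathroom could track is the Shannon diversity index, H = -sum(p_i * ln p_i) over the relative abundances of observed taxa. The sketch below is our illustration with synthetic counts, not the project's software.

# Shannon diversity of a microbiome sample (our illustration, synthetic data).

import math

def shannon_diversity(counts):
    total = sum(counts)
    proportions = [c / total for c in counts if c > 0]
    return -sum(p * math.log(p) for p in proportions)

# Synthetic abundance counts for a handful of skin taxa on two days.
monday  = [120, 80, 40, 10, 5]
tuesday = [230, 10, 5, 3, 2]      # one taxon dominating => lower diversity

print(shannon_diversity(monday), shannon_diversity(tuesday))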

4  Written prior to the COVID-19 pandemic.


15.3.4  Possibility Space of Worries Bash

For the Shareable Biome, we propose a networked smart city where machine learning helps us keep our microbiomes diverse. In the Worries Bash, we used machine learning to boil down the recorded worries of individuals into a more elemental distillation of Worry that allowed for a new kind of connection between the participants. These two projects used machine learning to help create a more empathetic experience for our audience.

In our Worries Bash installation, an ultrasonic beam speaker by Holosonics is mounted above the audience and pointed down towards where the basher stands. The speaker creates a discreet beam of sound which is only audible when standing in the beam. The beam can also be bounced off of smooth floors and walls to create a totally uncanny experience: a listener in the beam can clearly hear the sound, and yet when they look in the direction of the sound they see nothing but a wall, floor, ceiling, or sculpture. The technology behind the speaker has been weaponized to create various forms of crowd control and a "voice of God" weapon, and arguably presents a new form of social control (Goodman 2012). We are opportunistic about this technology's ability to enable empathy. We use it to beam the innermost worries of one person into another, hopefully creating an unexpected resonance across different life experiences. Can the destruction of these worry vessels create space for clarity?

15.3.5  Possibility Space of the Pink Noise Salon

One day we were playing with a mashup of many YouTube videos at once and noticed how similar the sound was to the relaxing sound heard in our Shellphones, headphones made out of large seashells. Analyzing the spectrum of the mashup revealed that the soundscape tended towards pink noise (aka 1/f noise). Pink noise is commonly found in nature (e.g. a waterfall) and is considered to have relaxing effects. White noise features every frequency at equal power and is quite harsh. Pink noise, on the other hand, is more soothing because as the frequency gets higher, the power decreases (i.e. power is inversely proportional to frequency). We became interested in the relationship between the occurrence of pink noise in both nature and culture as well as its effect on listeners. Eventually we came to think of it as the golden ratio of sound: a steady state occurring in nature and culture.

We organized the Pink Noise Salon, featuring a series of wearable listening devices and installations created by us and other Flux artists in residence at the Flux Factory in NYC. The salon investigated the value of filtering media overload and alleviating symptoms of "innovation fatigue" (Foley and Rabinovich 2016). We invited Tricia MacKenzie, PhD, a neuroscientist, to give a presentation on the uses of pink noise in neurology both during the salon and as part of a podcast we produced on the subject. She described pink noise as an energy pattern ubiquitous in systems on earth. It's in physics, chemistry, biology, economics, film, social


networks etc. Tricia was skeptical of some of our claims (e.g. that pink noise is the golden ratio of sound). Instead of trying to ameliorate our differences or to smooth over the boundaries between the art and the science, we were happy to leave the boundary exposed. This created a productive tension between a positivist, evidence-based approach and an artistic mode of research and production not beholden to reproducibility. The juxtaposition of our artistic presentation and Tricia's scientific one was a heterotopia of sorts: two radically different approaches to knowing the world sitting in the same space, butting up against each other and exposing a generative gap.
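For readers who want to make the spectral claim concrete, the 1/f property can be approximated in a few lines of code. The sketch below (in Python with NumPy; the function name pink_noise and the octave check are illustrative and not part of the salon) shapes white noise in the frequency domain so that power falls off in inverse proportion to frequency, and then checks that each octave carries roughly the same total power, which is precisely where pink noise differs from white noise:

```python
import numpy as np

def pink_noise(n_samples: int, seed: int = 0) -> np.ndarray:
    """Approximate pink (1/f) noise by shaping white noise in the frequency domain."""
    rng = np.random.default_rng(seed)
    white = rng.standard_normal(n_samples)
    spectrum = np.fft.rfft(white)
    freqs = np.fft.rfftfreq(n_samples, d=1.0)
    scale = np.ones_like(freqs)
    scale[1:] = 1.0 / np.sqrt(freqs[1:])   # amplitude ~ 1/sqrt(f), so power ~ 1/f
    pink = np.fft.irfft(spectrum * scale, n=n_samples)
    return pink / np.max(np.abs(pink))      # normalise to [-1, 1]

# Pink noise carries roughly equal power per octave; white noise does not.
x = pink_noise(2 ** 16)
power = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(len(x), d=1.0)
for lo, hi in [(0.001, 0.002), (0.002, 0.004), (0.004, 0.008)]:
    band = (freqs >= lo) & (freqs < hi)
    print(f"octave {lo:.3f}-{hi:.3f}: total power {power[band].sum():.1f}")
```

Played back as audio, such a signal has the waterfall-like softness described above, while white noise built from the same random samples sounds noticeably harsher.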

15.3.6  Possibility Space of Total Jump

Caption: Total Jump (eight player version) at Game On! put on by Artspace in New Haven, CT by Caitlin & Misha.

The idea to train people for a coordinated worldwide total jump was inspired by various examples of social synchronization. The Estonian singing revolution led to Estonia freeing itself from the USSR. Christoph Schlingensief invited people to overflow a lake with six million swimmers, the number of Germany's unemployed, in an attempt to flood the chancellor's house. When the Millennium Bridge in London first opened, it swayed dangerously after hundreds of people intuitively fell in sync. The Rally to Restore Sanity in 2010, led by comedian Jon Stewart, incited the crowd to jump, and encouragingly the effect was detected by hired


seismologists. There is an urban legend that if everyone in a city flushed the toilet at the same time the water mains would burst (Freeman 2017). Would a total jump precipitate an earthquake or unite humanity against all odds?

Total Jump works perfectly well as a single-player game. Because jump countdowns are synced, however, players become participants. Participatory works such as multiplayer games can unite different people in reaching a shared goal. An eight-player version of Total Jump was commissioned by Artspace in New Haven, CT to take place in a large public festival. Practice jumps continued throughout the day and culminated in a city-wide jump. The practice sessions used eight jump pads on the ground and players could walk on at any time during a countdown. We often witnessed player groups helping new players get synched up and ready for the jump. Everyone had to be on time to reach the winning condition. We saw a group coaching a differently-abled person who didn't understand the instructions, helping her jump and land on time. Everyone was invited to actively coexist; to help meet individual needs in order to reach the common goal.

15.4  Conclusion

We described a series of examples of heterotopic applications of technology. Technology can help to constitute a heterotopia if it's used to enable participatory artwork. Art provides a playspace for envisioning cultural applications for technology. Interactive artwork requires only one person to make it work. Even if multiple people can interact with a piece, they are not necessarily interacting with each other. In our art, technology is used to enable the relational component; the ability for people to participate in the work with and because of each other. While artwork such as ours itself doesn't have a practical use, it can present a unique opportunity to experiment with technology in totally free and open ways. Participatory installations can present great opportunities for exploring the relationship between humans and artistic/technological artefacts and to explore how technology affects the values of a society.

The first use of a technology is often art due to the novelty of the form alone, with the content (and context) being less important. As the technology becomes more familiar, the use of it formally is not enough: the content and context become primary to the manifestation of artwork. Participatory installation, with its multimodal and interdisciplinary approach to creating experience, and the inherent relational aspects which must be activated by a group of people, employs technology as just one of the tangible media. Using a novel technology alongside more familiar ones in this context allows for the opening up of a possibility space for speculative visions of the future. When the relational aspects of such an experience are successfully inclusive, and when they play with unexpected uses of a new technology, a heterotopia can emerge where many visions of a technological world can coexist and butt up against each other, vie for the attention of the audience, and underscore our


ability to shape our future with our technologies, as opposed to relinquishing control to the technology alone.

References

Arendt, H. (1958). The human condition (Vol. xviii, pp. 202–224). Chicago: University of Chicago Press.
Carr, C. J., & Zukowski, Z. (2017, December 8). Generating black metal and math rock: Beyond Bach, Beethoven, and Beatles. Retrieved from: http://dadabots.com/nips2017/generating-black-metal-and-math-rock.pdf
Cartwright, N. (1999). The dappled world: A study of the boundaries of science (Vol. 2, pp. 190–218). Cambridge: Cambridge University Press.
Chasteen, S., Chasteen, N., et al. (2008). The salty science of the aluminum-air battery. The Physics Teacher, 46(9), 544. https://doi.org/10.1119/1.3023656
Edens, F. W. (2003). An alternative for antibiotic use in poultry: Probiotics. Brazilian Journal of Poultry Science, 5(2), 75–97. https://doi.org/10.1590/S1516-635X2003000200001
Foley, C., & Rabinovich, M. (2016, April 1). Maintenance art and sharing communities. Retrieved from: https://static1.squarespace.com/static/56a8e2fca12f446482d67a7a/t/56fda6f3356fb0f980498757/1459463928297/Maintainance+Art+and+Sharing+Communities+updated.pdf
Foucault, M., & Miskowiec, J. (1986). Of other spaces. Diacritics, 16(1), 22–27. The Johns Hopkins University Press.
Freeman, C. (2017, March 11). New Media Caucus hub: Fermenting at Flux: Live and active cultures (Part 3). Retrieved from: http://www.newmediacaucus.org/fermenting-at-flux-part-3/
Frood, A. (2003, February 27). BBC News: Riddle of 'Baghdad's batteries'. Retrieved from: http://news.bbc.co.uk/2/hi/science/nature/2804257.stm
Girdhar, Y., Giguere, P., et al. (2013). Autonomous adaptive exploration using realtime online spatiotemporal topic modeling. International Journal of Robotics Research, 33(4), 645–657. https://doi.org/10.1177/0278364913507325
Goodman, S. (2012). Sonic warfare: Sound, affect, and the ecology of fear (pp. 183–188). Cambridge, MA: MIT Press.
Griffin, M. (2013, June 25). "Maximizing sweat equity": Project converts sweat to battery charge on Adafruit forums [Web log]. Retrieved from: https://blog.adafruit.com/2013/06/25/maximizing-sweat-equity-project-converts-sweat-to-battery-charge-on-adafruit-forums/
Meigh-Andrews, C. (2006). A history of video art (pp. 17–19). New York: Bloomsbury.
Owen, D. S., & Weber, M. (2004). The vocation lectures (pp. 1–31). Indianapolis: Hackett Classics Series.
Schlanger, Z. (2017, May 30). Dirt has a microbiome, and it may double as an antidepressant. Retrieved from: https://qz.com/993258/dirt-has-a-microbiome-and-it-may-double-as-an-antidepressant/
Stamets, P. (2005). Mycelium running: How mushrooms can help save the world (pp. 26–31). Berkeley, CA: Ten Speed Press.

Chapter 16

The Artefact on Stage – Object Theatre and Philosophy of Engineering and Technology

Albrecht Fritzsche

Abstract  Philosophical approaches to engineering tend to use the technical artefact as a starting point. From the artefact onwards, they look at its design and its importance in human life. All this, however, only seems possible if the artefact is somehow put in an exposed position, where it becomes available as a reference for further investigation. A similar kind of exposure seems to take place in theatre when something is put on stage. This chapter therefore investigates how the study of theatre may provide a valuable contribution to philosophy of technology. The investigation focusses on object theatre, where the treatment of artefacts plays a particularly important role. After a conceptual clarification of the approach, the paper discusses two recent productions concerned with engineering and technology in more detail.

Keywords  Artefact ∙ Staging ∙ Object ∙ Theatre

16.1  Landmarks of Engineering

Engineering evokes powerful imagery of tools, machines and buildings that have become part of our cultural heritage: steam engines, airplanes, towers, dams, railway lines and power stations stand out as achievements of human civilization that have changed the face of the earth. Such artefacts are a source of identity for the whole discipline of engineering. With their help, stories of adventure, courage, intellectual achievement and professional excellence can easily be told. Everyone can relate to these stories. They hold engineers together as a community and distinguish them from others. Furthermore, they have implications for the responsibilities that engineers tacitly accept as members of their profession. Artefacts accordingly serve as references for numerous different claims related to their work.

A. Fritzsche, Ulm University, Ulm, Germany


At the same time, the landmark artefacts that engineers have created during the last centuries also affect people who are not involved in engineering. They shape the external perspective on the discipline as well. Heidegger's encounter with a hydroelectric plant is a famous example (Heidegger 1993). Situated in the middle of a river, the plant is at once recognized as an alien body. It does not fit the environment, which seems in comparison untouched by human intervention and still in its original, natural state.1 The plant is an industrial installation, erected for the sole purpose of gaining energy from the current of the river. It provides a perfect illustration for Heidegger's argument about modern technology and the spirit that drives it, which has strongly influenced many philosophical reflections on engineering, especially those that proceed from the perspective of observers from outside.

1  Heidegger, of course, must have known that the whole scenery he observed at the southern edge of the Black Forest has been shaped by human hand. Neither the riverbed nor the vegetation on the mountainside was in a natural state. Both have been cultivated over many centuries.

Today, in a world permeated by mobile and interconnected digital devices, it has become quite obvious that a clear distinction between an internal and an external perspective on engineering is not always helpful. A smartphone needs to be discussed in a different way than a hydroelectric plant in order to understand its implications in daily life. Considering the fluidity of modern information systems architectures, one might even ask if the reference to a static artefact in general is a suitable approach in such situations to give an appropriate account of contemporary engineering. It seems very well possible that speaking about artefacts as if they were static oversimplifies the situation and distracts us from the questions that really need to be asked about engineering in the twenty-first century. This chapter therefore tries to step beyond current philosophical practice in dealing with technology and engineering. It looks for a way in which speaking about artefacts in the philosophy of engineering and technology can itself become a topic of reflection.

Since the turn of the millennium, engineers and philosophers have shown an increased interest in a common dialogue to learn more about each other's points of view (Franssen et al. 2016). The forum for philosophy, engineering, and technology (fPET) has become a popular platform for such a dialogue and inspired a lively exchange across disciplinary boundaries (Michelfelder et al. 2017). This does not necessarily mean that the differences in perspective at fPET have disappeared (Fritzsche and Oks 2018). They are woven too deeply into the fabric of industrialized societies to expect that they can be easily eliminated, and there may even be good reason to uphold them in order to assign responsibilities and thus keep some control over the trajectories of technical development. In other words: integration might not be the most important goal of our dialogue. Instead of trying to understand each other, it might be much more interesting to try to understand how and why we do not understand each other. In doing so, we might learn a lot more about our practice of referring to artefacts and its limitations in dynamic, evolving and often ambiguous application contexts.

How can we proceed to make this happen? The following pages provide an answer that might come as a surprise. It is based on the impression that the stories


told about engineering through landmark artefacts have a theatrical quality. The stories take the efforts of engineering out of the context of daily toil and trouble. They put them on a stage, elevating the engineer to the role of a protagonist in a play, while others turn into an audience that watches the events from below. Klyukanov (2010) has described such stagings as a fundamental element of communication. Nevertheless, they are often ignored in the current discourse on technology. Theatre therefore might provide an access point for reflections on engineering that has so far been wrongfully neglected. Contemporary object theatre, drawing on the traditions of puppet theatre and physical theatre, seems to be a particularly interesting format, as it puts the artistic treatment of dead matter in the spotlight.

On the next pages, this potential of theatre is explored in more detail, starting with a brief overview of the concepts and history of object theatre that pays particular attention to the staging of artefacts. Then, two recent productions of object theatre are taken as examples to show the potential of this form of art in practice. One piece deals with the digital transformation of society, the other with robotics and artificial intelligence. The summary of the productions is followed by an analysis of the different ways in which they address philosophical questions concerning technology and engineering, and a short discussion of the contributions that they might provide for the ongoing discourse in the field.

16.2  Objects, Puppets and Theatre

For various reasons, including the ignorance of the author, this section cannot give a full rendition of extant theory on puppetry, pantomime and the general usage of objects in theatre. It can only explore some very specific aspects that seem particularly relevant for the treatment of technical artefacts. Technology, of course, also plays a huge role in theatre as a basis for different forms of stagecraft, from lighting and sound design to the creation of movable sets and the operation of the curtains (White 2015). For the topic at hand, however, it is more important to look at technical artefacts and operations as part of the presentation.

Independently from theatre as a specific form of art, there are numerous social rituals in which objects take over a ceremonial function that has little to do with the actual impact of their usage on the material world. For instance, being touched by a sword can elevate someone to knighthood and having a ring put on someone's finger can create a lasting bond to a life partner – under the condition that the people who are involved agree with the conventions of the ritual and behave accordingly. If they do, however, the objects used in the ceremony have an actual instrumental value.

Objects on stage can take over functions in a similar way, independently of their material properties. Just as the actors can impersonate someone else, an object can embody something else. To be used as a gun, a telephone, a key or an oven in a play, an object does not have to enable its user to cause the same kind of effects on its physical environment that one expects from any such tool or machine outside the


theatre. The audience just needs to understand what kind of function the object is supposed to have and observe that the enactment of its usage has the expected consequences. All this can be achieved solely by the performance of the artists who are present on stage and all others who contribute to the play from outside. (Certain parallels in the context of engineering have already been explored by Evans 2013.)

One of the most fascinating aspects of theatre is that the artists and the audience are aware of the artificiality of the situation. They know that everything that happens on stage is a construction. Theatre creates a temporary illusion of reality that will soon enough perish, once the people involved have left the premises. Nevertheless, they can get emotionally engaged, and their emotions may even become stronger inside than outside the theatre. The fact that a certain plot is put in the spotlight, where it can be appreciated without distractions or further need to think about its consequences, allows a depth of experience that can hardly be compared with anything else. Many aspects of this phenomenon are discussed in Aristotle's drama theory and subsequent investigations in the same area. They are also a source of criticism of theatre, for example in Plato's Ion.

In many respects, the artificiality of theatre can be considered to reach its maximum in puppetry, where even the actors on stage are replaced by objects. Puppetry has for a long time been part of popular culture (Bell 2000), practised in public places like market squares, where the staging, the plots and the portrayed personas cannot reach a high level of sophistication. As a form of mass entertainment, puppetry relies on simplification. The puppets only bear a superficial resemblance to human beings, showing the audience what kind of person they are supposed to be. Puppets may be caricatures of people whom the audience knows, which makes puppetry to this day a perfect medium for satire (Brillenburg Wurth 2011; Finch 1981). But in spite of all simplification and disfigurement, the audience can still empathize with the characters. In fact, the phenomenon of the uncanny valley, which was first observed in animated movies (Mori 2012), indicates that the imperfection of the puppet is itself what enables an emotional response. Like the spotlight on stage, it helps the audience to focus on the essence of the presentation.

In his text "On the Marionette Theatre" from 1810, Heinrich von Kleist relates the puppet on stage to the ideal of a dancer: a body whose mechanics are under full control of the artist (see Kleist 1972). Like a musical instrument, the body serves as an object that the artist uses to express him- or herself. Drawing on similar ideas, Craig (1957) speaks of the ideal actor in theatre as an "über-marionette", dedicated to the sole purpose of fulfilling its function on stage. As a consequence, many new possibilities to combine puppetry, conventional theatre and dance have lately been explored. At the same time, puppet theatre has moved on to new ways of working with physical matter, as the attention of modern art shifted from what is perceived to how it is perceived (Callery 2001). The audience is considered to be actively involved in the experience of theatre. The ability to relate to the presentation on stage regardless of its artificiality is by itself made a topic. The puppeteer therefore does not need to be hidden. She or he can appear on stage, confronting the audience with the mechanics of the play and forcing it to step actively beyond this knowledge to follow the presentation. Furthermore, modern object theatre and physical theatre


abandon the idea of concrete representations of physical reality on stage. Instead of material objects, the actors work with empty space and leave it to the audience to fill in what is missing. In a similar way, the spoken word becomes optional on stage. Even the idea of the stage as a dedicated space in a theatre building is given up (Artaud 1958) and artists move out to the streets (Doyle 2001).

By leaving the traditional arrangements of stage, actors, props and audience behind, contemporary object theatre creates many occasions when the audience is reminded of the fact that it willingly engages in the experience of the play. The audience is not just exposed to the work of the artists. Theatre can only happen if the audience develops a common understanding with the actors about the kind of communication that is going on between them. The audience has to accept the stage as a stage, with all the artificiality of the setting, to let a play happen. The performing artists can only offer an experience. If and how it actually takes place depends on the agreement of the audience. The means of expression that object theatre has at its disposal make sure that the need for such an agreement is not so easily forgotten. This is exactly why there is something to be learned from theatre for engineering.

16.3  Technology on Stage

The previous section of this investigation has looked at the staging of objects and the artificiality of theatrical experience. It has discussed the isolating effect of spotlights and simplified representations of persons and instruments. Similar to a technical installation, the presentation on stage turns the attention to a specific fraction of reality that can be accessed in the form of a narrative, a story of problem solving, or anything else that allows the participants to engage in an emotional journey. The previous section has also shown how modern object theatre turns the staging itself into a topic of theatre and reminds the audience of its choice to get involved. After this investigation has thus shed light on the possibilities of theatre to let us experience different ways of addressing engineering and technology, it now turns to the question of how these possibilities are actually used by artists.

The investigation proceeds with two examples that can be considered as particularly revelatory cases in this context, as they make use of a large variety of different means of expression that modern object theatre has to offer. Both examples are taken from the works of Meinhardt & Krauss, a group of artists that has shown particular interest in questions concerning technology, science, artificiality and medialization. Furthermore, they have presented excerpts of their plays at an fPET conference and engaged in discussions with the attendees. Meinhardt & Krauss have worked together with various other artists over the years. An overview of their complete oeuvre can be found on www.meinhardt-krauss.com, including many other pieces that would be interesting objects of investigation, but need to be ignored here for the sake of brevity.


In line with common hermeneutic patterns of modern multimedia analysis (e.g. Fritzsche and Dürrbeck 2019), the following description of the two cases addresses the subjects of the plays, the presentation on stage and the means of expression used in it.

16.3.1  Case 1: The Second Reality

The Second Reality is inspired by Freud's (1920) reference to the outrages that humanity had to endure from science. According to Freud, science has not confirmed the outstanding position of humanity in the universe, but instead shattered human self-love on several occasions. In astronomy, humanity has lost its position in the centre of the universe; in evolutionary theory, it has been reduced to the status of an animal species among many others; in psychoanalysis, it has lost control over its own intellectual capacities. Meinhardt & Krauss argue that recent progress in engineering has a similar effect regarding the belief that humanity can improve the world by the means of technology (Fig. 16.1).

With the title of their piece, Meinhardt & Krauss refer to the artificial environments that human beings create for themselves in the course of the digital transformation. This creation and its results are depicted on a stage that remains completely devoid of any physical material except for the bodies of three actors and stones. With these stones, the actors start to 'draw' on the floor, leaving behind traces of light. The act of creation continues with different impressions of light and sound on the

Fig. 16.1  The Second Reality: virtual shadows, able to move as digital twins. (Photo: M. Krauss)


Fig. 16.2  The Second Reality: puppets on invisible strings. (Photo: M. Krauss)

stage, creating a virtual, abstract living environment that includes images of the actors themselves, and while the actors continue to produce new audio-visual effects, they are also forced to react to their own creations. All their experience is digitally mediated and magnified in a spectacle of abstract imagery, larger than life and immaterial. They are exposed to a strange artificial habitat, replacing what nature had to offer with different possibilities of intervention by humans that give an impression of unlimited power, but in the end entangle the individual in a larger network that they cannot control (Fig. 16.2).

For the interplay between the actors and the audio-visual effects, Meinhardt & Krauss, together with fellow colleagues Marcus Doering and Oliver Feigl, have developed their own tracking and tracing technologies with infrared light and other means. They make it possible to record certain movements of the actors and to use them as triggers for different visual and auditory effects, based on predefined geometric and rhythmical patterns. The same technology provides the actors with projections of digital twins. Sometimes these appear on stage with a short delay, which enables the actors to relate to them from different positions. In doing so, they must be very precise in their movements. Their performance can therefore be described as puppetry in two directions. On the one hand, the actors control the audio-visual effects with which they interact. On the other hand, they become puppets themselves, as they are under the control of the technology on stage.
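The description above can be read as a small interaction architecture: tracked movement comes in, delayed copies of it go back out, and certain movements trigger effects. The following sketch (in Python; the class name DelayedTwin, the 1.5-second delay and the zone trigger are assumptions made for illustration, not details of the system used on stage) shows one minimal way such a delayed "digital twin" could work in principle: a buffer of tracked poses is replayed after a fixed interval, and a simple zone test fires a cue.

```python
from collections import deque
from dataclasses import dataclass
from typing import Optional

@dataclass
class Pose:
    t: float  # timestamp in seconds
    x: float  # tracked stage coordinate (e.g. from an infrared marker)
    y: float

class DelayedTwin:
    """Replays tracked poses after a fixed delay, so a projected 'twin'
    follows the performer a moment later. Names and numbers are illustrative."""

    def __init__(self, delay_s: float = 1.5) -> None:
        self.delay_s = delay_s
        self.history = deque()  # poses waiting to be replayed

    def update(self, pose: Pose) -> Optional[Pose]:
        """Store the live pose; return the pose recorded roughly delay_s ago (or None)."""
        self.history.append(pose)
        delayed = None
        # Drop everything older than the delay window; the last dropped pose
        # is the one that is now `delay_s` seconds old and gets projected.
        while self.history and self.history[0].t <= pose.t - self.delay_s:
            delayed = self.history.popleft()
        return delayed

def in_zone(pose: Pose, x_min: float, x_max: float) -> bool:
    """A toy trigger: fire a cue when a pose enters a predefined stage zone."""
    return x_min <= pose.x <= x_max

# Toy run: a performer walking back and forth, tracked at 30 fps for ten seconds.
twin = DelayedTwin(delay_s=1.5)
for frame in range(300):
    t = frame / 30.0
    live = Pose(t=t, x=t % 4.0, y=1.0)
    shadow = twin.update(live)
    if shadow is not None and in_zone(shadow, 2.0, 3.0):
        pass  # here one would send a cue to the projection or sound system
```

Driving projections from such a buffer is part of what forces the performers to be precise: they are reacting to their own movements of a moment ago.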


16.3.2  Case 2: Eliza – Uncanny Love

In Eliza, Meinhardt & Krauss pick up the ancient Greek story of Pygmalion, the sculptor who fell in love with one of his statues. Based on this story, Meinhardt & Krauss explore how human beings can become romantically involved with artificial intelligence. Eliza is the name of the female protagonist of George Bernard Shaw's play Pygmalion. It is also the name of a computer program by Joseph Weizenbaum that was able to apply simple conversation patterns in the exchange with human users (Weizenbaum 1976). To Weizenbaum's own surprise, users showed a strong emotional response to the program, which did not disappear when the simplicity of the program code was explained to them. With the second part of the title, Uncanny Love, Meinhardt & Krauss refer to the phenomenon of the Uncanny Valley, which was already mentioned above. It describes the sudden drop in affection toward an artificial image of a human being when the image becomes too perfect (Mori 2012) (Fig. 16.3).

During the play, Pygmalion appears on stage as an engineer who creates a humanoid robot. A singer accompanies his actions with a recital of Ovid's rendition of the story in the Metamorphoses. Meinhardt & Krauss, however, deviate from the original in various ways. The engineer takes measurements of his own body for the design of the robot. He already falls in love during the process of creation. He caresses arms and legs long before they are put together to form a full body, using his own gestures and movements to cope with the missing parts of the robot in a solitary dance performance. When the robot is finally finished and starts to speak, the magic is lost. The object of desire has become uncanny and catastrophe ensues (Fig. 16.4).

In comparison to The Second Reality, Meinhardt & Krauss take a rather traditional approach to the usage of technology in their piece. The actor on stage works

Fig. 16.3  Eliza: anticipating desire. (Photo: M. Krauss)


Fig. 16.4  Eliza: construction of the other. (Photo: M. Krauss)

with physical matter. The parts of the robot's body consist of a movable core structure and a thin, white cover material. Some of the latter is added by Pygmalion during the play. The parts are arranged on stage like they would be in the studio of a sculptor at work. The only difference is that they include a multitude of electric motors to enable active movements of the limbs and facial expressions without intervention of the actor. The motors are controlled from backstage via a thick bundle of cables, reminiscent of the strings of a puppet. The speech of the robot is pre-recorded. The movements of the mouth, eyes and skull are aligned with it.
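Since the piece borrows its name from Weizenbaum's program, it is worth recalling how little machinery that program needed. The following toy sketch (in Python; the rules and responses are invented for illustration and are not Weizenbaum's original DOCTOR script) shows the kind of simple pattern substitution that was nevertheless enough to provoke strong emotional responses from users:

```python
import random
import re

# A handful of ELIZA-style rules: a regular expression plus response templates
# that echo fragments of the user's input back as questions. Invented examples.
RULES = [
    (re.compile(r"\bI need (.+)", re.IGNORECASE),
     ["Why do you need {0}?", "Would it really help you to get {0}?"]),
    (re.compile(r"\bI am (.+)", re.IGNORECASE),
     ["How long have you been {0}?", "Why do you think you are {0}?"]),
    (re.compile(r"\bI feel (.+)", re.IGNORECASE),
     ["Tell me more about feeling {0}.", "When do you usually feel {0}?"]),
]
FALLBACKS = ["Please go on.", "I see.", "What does that suggest to you?"]

def reply(user_input: str) -> str:
    """Return a canned reflection of the user's words, ELIZA-style."""
    for pattern, templates in RULES:
        match = pattern.search(user_input)
        if match:
            fragment = match.group(1).rstrip(".!?")
            return random.choice(templates).format(fragment)
    return random.choice(FALLBACKS)

print(reply("I feel alone when the robot speaks"))
# One possible output: "Tell me more about feeling alone when the robot speaks."
```

The gap between the simplicity of such code and the attachment it can produce is exactly the phenomenon Weizenbaum found so unsettling.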

16.4  Philosophical Experiences

The Second Reality and Eliza broach the subject of technical acts of creation from two very different perspectives. On the one hand, the creation concerns a comprehensive environment: the digital habitat of humanity in the twenty-first century. On the other hand, it concerns an individual artificial intelligence: the autonomous robot that assumes the role of a human being. Both subjects are core topics of contemporary engineering. Both can also be considered as landmark projects that draw public attention to the work of engineers, without exhausting the full bandwidth of activities that are currently going on in the fields of digitization and artificial intelligence. What Meinhardt & Krauss present to the audience are stagings of engineering that attract public attention and serve as a source of identity for the professionals who are involved. The medium of object theatre makes it possible for these stagings of engineering to become themselves the topics of the plays. What can be observed on


stage is how the acts of creation are carried out and what happens to human beings in the process. Eliza shows how the aspired-to outcome of the act of engineering, the artificial creature, provides a source of affection during its assembly. The engineer starts to get emotionally involved with the creature long before it actually comes alive. He experiences moments of tenderness in his play with its parts, which are formed after the image of his own body. The love affair ends as soon as the creature has tasted the fruits of knowledge. The artefact is attractive as long as it remains in the process of becoming. After that, there is little left for the engineer to do with it. Given the history of object theatre, this can also be considered as a reflection on puppet theatre as a play with the imperfect image of a human being, a projection under the control of the puppeteers. It is not supposed to be a real person, but to serve as a means to advance the plot.

The Second Reality puts more emphasis on the intangible nature of digital creations. They are reduced to audio-visual effects, sounds and lights. The only tangible objects that are seen on stage are stones, which makes the observer wonder if this stage is not actually a cave with projections on the walls. Step by step, the actors explore the enormous possibilities to express themselves with the help of digital technology. Again, the process of staging a play is mirrored by the action on stage, where the actors switch between the roles of the set designer, the puppeteer, the puppet, and to some extent also the dancer and musician. They create their own world out of abstract forms. Unsuitable parts of their bodies disappear in the shadows. All further interpretation is left to the audience. With all the overwhelming stimuli that the play provides for eyes and ears, it does not tell a finished story, which would allow the spectators to lean back and enjoy. There remains an emptiness in the spectacle, as if it only drew attention to itself in order to force the audience to get involved.

The work of Meinhardt & Krauss can in many different ways be related to the discourse in philosophy of technology. In the usage of light and sound, enframings of the action on stage can be recognized. The play with artificial body parts, just as puppetry in general, allows references to the idea of organ replacement and organ projection. The treatment of objects on stage draws immediate attention to the dual nature of technical artefacts in the combination of physical embodiment and symbolic meaning. Technology provides at the same time a topic of the plays and infrastructure for the performance, as it mediates the play on numerous different levels. Furthermore, there is a strong postmodern element in the presentation, letting the field of engineering appear as a fragmented, often self-referential activity with a recurring need to gain orientation in the vast space of possibilities for technical progress (Fig. 16.5).

None of the references to the discourse on philosophy of technology in the given examples is elaborated to an extent that would allow the formulation of a clear statement as a contribution to this discourse. More likely, the theatrical presentation can be considered as a commentary on the discourse itself, showing how many different impressions and experiences shape our understanding of engineering and technology at the same time. There is no given, static artefact to which we can refer, but an


Fig. 16.5  Eliza: dancing (with) technology. (Photo: M. Krauss)

ongoing ritual dance to summon it for a moment in a fragile, floating image that looks different every time, forcing the spectator to get involved and fill the gaps that the creative forces behind it have left open. The stage is not a pedestal. It does not show us the artefact as a monument, but rather as part of an ongoing performance. In the treatment of the objects on stage that mirrors object theatre as a whole, we can recognize ourselves and our counterparts in the discourse on technology as performers.

16.5  The Mirror Image

Various scholars have already looked into the possibilities to use object theatre to support design education and practice (e.g. Ryöppy and Skouby 2015; Buur and Friis 2015). Others have looked at different kinds of intersections between robotics and theatre (e.g. Nijholt 2018; Walters et al. 2013), with a particular interest in puppetry (Mikalauskas et al. 2018; Sakashita et al. 2017). This text suggests that there is also something to be learned from theatre on another level. It seems that the practice of theatre provides us with a mirror image of our treatment of the artefact in the discourse of engineering. Whether engineering is approached from an inside or an outside perspective, the artefact serves as an important point of reference by which the work of the engineer and its impact on the world is assessed. Theatre allows us to reverse this assessment and study the process that turns the artefact into


a point of reference: a process of staging the artefact, somewhere between artists and audience. The staging requires effort from everyone involved to become a valuable experience. In contemporary object theatre, many different facets of these efforts can be observed. They concern the shape of the object, the way the actors make it work and how it assumes agency in the play. Most interestingly, the theatrical presentation also lets us observe how much is missing in the staging of the object and how the audience copes with it. All this seems to be relevant for technical artefacts in engineering as well. Even if the mirror image of our discourse in theatre is a little distorted, enough can be recognized to inspire new thoughts and insights.

All pictures used with the friendly permission of Meinhardt & Krauss.

References

Artaud, A. (1958). The theater and its double, translated by M. C. Richard. New York: Grove Press.
Bell, E. (2000). Strings, hands, shadows: A modern puppet history. Detroit: The Detroit Institute of Arts.
Brillenburg Wurth, C. A. W. (2011). Spitting image and pre-televisual political satire: Graphics and puppets to screens. Image [&] Narrative, 12(3), 113–136.
Buur, J., & Friis, P. (2015). Object theatre in design education. Nordes, 1(6).
Callery, D. (2001). Through the body: A practical guide to physical theatre. New York: Routledge.
Craig, E. G. (1957). On the art of the theatre (2nd ed.). New York: Theatre Arts Books.
Doyle, M. W. (2001). Imagine nation: The American counterculture of the '60s and '70s. New York: Routledge.
Evans, R. (2013). Engineering as performance: An "experiential Gestalt" for understanding engineering. In D. Michelfelder, N. McCarthy, & D. E. Goldberg (Eds.), Philosophy and engineering: Reflections on practice, principles and process (pp. 27–37). Dordrecht: Springer.
Finch, C. (1981). Muppets and men: The making of the Muppet show. New York: Alfred A. Knopf.
Franssen, M., Vermaas, P. E., Kroes, P., & Meijers, A. W. (2016). Editorial introduction: Putting the empirical turn into perspective. In M. Franssen, P. Vermaas, P. E. Kroes, & A. W. M. Meijers (Eds.), Philosophy of technology after the empirical turn (pp. 1–10). Amsterdam: Springer.
Freud, S. (1920). A general introduction to psychoanalysis. New York: Horace Liveright.
Fritzsche, A., & Dürrbeck, K. (2019). Technology before engineering: How James Bond films mediate between fiction and reality in the portrayal of innovation. Technovation, 28(1–2), 20–28.
Fritzsche, A., & Oks, S. J. (2018). Translations of technology and the future of engineering. In A. Fritzsche & S. J. Oks (Eds.), The future of engineering (pp. 1–12). Cham: Springer.
Heidegger, M. (1993). The question concerning technology. In D. Krell (Ed.), Basic writings (pp. 307–342). New York: Harper Collins Publishers.
Kleist, H. V. (1972). On the Marionette Theatre, translated by T. G. Neumiller. The Drama Review, 16(3), 22–26.
Klyukanov, I. E. (2010). A communication universe: Manifestations of meaning, stagings of significance. Lanham: Lexington Books.
Michelfelder, D. P., Newberry, B., & Zhu, Q. (2017). Philosophy and engineering: An unconventional work in progress. In D. P. Michelfelder, B. Newberry, & Q. Zhu (Eds.), Philosophy and engineering: Exploring boundaries, expanding connections (pp. 1–12). Cham: Springer.


Mikalauskas, C., Wun, T., Ta, K., Horacsek, J., & Oehlberg, L. (2018). Improvising with an audience-controlled robot performer. In Proceedings of the 2018 designing interactive systems conference (pp. 657–666).
Mori, M. (2012). The Uncanny Valley (transl. MacDorman, K. F., & Kageki, N.). IEEE Robotics & Automation Magazine, 19, 98–100.
Nijholt, A. (2018). Robotic stand-up comedy: State-of-the-art. In International conference on distributed, ambient, and pervasive interactions (pp. 391–410). Cham: Springer.
Ryöppy, M., & Skouby, A. H. (2015). Object theatre – a playful understanding of design. In 4th participatory innovation conference (pp. 458–461).
Sakashita, M., Minagawa, T., Koike, A., Suzuki, I., Kawahara, K., & Ochiai, Y. (2017). You as a puppet: Evaluation of telepresence user interface for puppetry. In Proceedings of the 30th annual ACM symposium on user interface software and technology (pp. 217–228).
Walters, M. L., Koay, K. L., Syrdal, D. S., Campbell, A., & Dautenhahn, K. (2013). Companion robots for elderly people: Using theatre to investigate potential users' views. In IEEE RO-MAN (pp. 691–696).
Weizenbaum, J. (1976). Computer power and human reason: From judgment to calculation. New York: W. H. Freeman.
White, T. R. (2015). Blue-Collar Broadway: The craft and industry of American theater. Philadelphia: University of Pennsylvania Press.

Chapter 17

Imagined Systems: How the Speculative Novel Infomocracy Offers a Simulation of the Relationship Between Democracy, Technology, and Society

Malka Older and Zachary Pirtle

Abstract  We reflect on the novel Infomocracy as a way to simulate the relationship between democracy, technology and society. While some talk about science fiction predicting the future, its predictiveness has been questioned. We think science fiction holds greater potential to help us understand what sort of world we want to create, starting from the present. A key part of how we create the world we want to live in is by creating policy, that is, setting forth plans of action and governance principles for how humans govern themselves and technology. We provide a framework for how science fiction can help inform policy, serve as a kind of simulation to test values and ground normative assertions about governance, and offer space to reflect on how technology and society relate. To motivate this framework, we will draw significantly on key themes from the first author's 2016 novel Infomocracy.

Keywords  Technology · Philosophy of technology · Science and technology studies · Science fiction · Futures

All opinions expressed in this article belong to the authors and do not necessarily represent the views of NASA or the United States Government.

M. Older, Arizona State University, School for the Future of Innovation in Society, Tempe, Arizona
Z. Pirtle, Independent Scholar, Washington, DC, USA

17.1  Introduction

Technological "progress" is not an exogenous, inevitable, neutral force, wherein given developments are likely to occur sooner or later and in more or less the same order, regardless of individual decisions or broader societal trends. Rather, technology and society co-evolve with one another, shaping each other in a complicated


process (Jasanoff 2004). Science and technology studies (STS) and sociological literature offer powerful demonstrations of the interaction of technologies and the values infused within them, and arguments for why society needs to responsibly develop new technology (Vallor 2016; Winner 1980; Shrader-Frechette and Westra 1997).

Interrogating the relationship between technology and society leads us to look more seriously at science fiction, especially as it can offer a sociological perspective on how technology may evolve. While some talk about science fiction predicting the future (Merchant 2018), its predictiveness has been questioned (Rejeski and Olson 2006). We think science fiction holds greater potential to help us understand what sort of world we want to create, starting from the present. A key part of how we create the world we want to live in is by creating policy, that is, setting forth plans of action and governance principles for how humans govern themselves and technology. Indeed, science fiction has long been pointed to as a way to inform policy by serving as a tool for reflection on the role of technology and society (Miller and Bennett 2008; Davies et al. 2019).

In this chapter, we want to give a framework for how science fiction can inform science and technology policy, serve as a kind of simulation to test values and ground normative assertions about governance, and offer space to reflect on how technology and society relate. To motivate this framework, we will draw significantly on key themes from the first author's 2016 novel Infomocracy, which posits a future where an international information management entity takes responsibility for the condition of having an informed constituency, and the world is broken into relatively small groups of people that choose preferred governments without regard to geography, with mixed results, as we will show. We will describe how the book modifies the traditional technology and society relationship, and how it may help us to imagine policy decisions about how we should make technology and our political and democratic systems relate. Our deconstruction of how fiction can help explore how democracy, technology and society relate reinforces broader calls for holistic approaches for assessing technologies and the values humans build into them (Noble 2018; Vallor 2016).

17.2  How Science Fiction Can Inform Policy

Many scholars have pointed toward science fiction as a lens by which we can explore the implications of technologies, and help society govern and shape new technology (Miller and Bennett 2008; Finn and Eschrich 2017). Science fiction has been an active topic of research for several decades (Brown and Kahn 1977; Rejeski and Wobig 2002), and it has inspired many public debates on how science fiction can – or should – shape our current thinking about the world (Eveleth 2019; Merchant 2018). There is an increasing body of academic literature on how science fiction can serve as a tool to improve science policy: Conley (2018) and York and Conley (2020) discuss ways to use science fiction to improve ethical deliberation capacities for engineers and scientists, which might in turn help them better serve society.


Science fiction can serve a number of purposes to inform policy and research on technology and science. We propose a framework of three key goals for the use of science fiction: forecasting, values reflection, and governance. These diverse uses for science fiction capture all of the major use cases for using science fiction in policy that are laid out in the classic "Thinking Longer Term About Technology" by STS and policy scholars Clark Miller and Ira Bennett, with the recognition that there can be intersections between them. Our framework here is a new synthesis, with its simplicity meant to help practitioners see the roles for science fiction in policy while still teasing out fundamental considerations for science policy.1

Forecasting describes how science fiction can envision what may happen in the world, and help to identify new possibilities and pathways by which change may occur (Merchant 2018; Samuel 2019; Selin 2011). This includes broad speculation about how people might live with new technologies, and what function those technologies might have in the real world. Miller and Bennett (2008) note some use cases that we would label as forecasting activities, such as identifying and exploring "non-linearity," which is the ability to predict or recognize break points and potential places for rapid changes; they also discuss creating "Clear visions of desirable – and not so desirable – futures", thereby laying out broad options for our future which can be used to make decisions about which paths we may want to go down.

Values reflection is about examining what normative good there might be in current or future technologies and configurations of society, considering the values underlying social change and technologies (Jørgenson and Bozeman 2007; Douglas 2009; Vinsel and Russell 2020). Thinking about science policy should be informed by considering the societal benefits and costs that future science will encourage (Bozeman and Sarewitz 2011; Bertrand et al. 2017). Considering what diverse peoples want in both our current and future worlds can help to encourage empathy and reflection (Conley 2018). Miller and Bennett (2008) note science fiction use-cases that we would describe as values reflection: "resonance, meaning and identity", which involves how readers can seek to create meaning and assess the value of the worlds they are creating, motivating why we should care about possible technological change, and considering how technology begins to take on new meaning in our lives. They also note how science fiction can encourage "people centrism," which focuses readers on how people and societies engage with new technologies. As we think about future governance of technology, keeping the human element foremost in mind is critical, and science fiction is a way to center the reader on the people involved in the story; character development is a strength of fiction and narrative as compared to other forms of futurism.

1  Having only three main categories of influence is a virtue in the attention-starved world of policy, and helps to distill the concepts from Miller and Bennett. This framework owes some heritage to reflective technology assessment approaches by Guston and Sarewitz (2002), for whom the second author worked from 2005 to 2007. Specifically, Guston and Sarewitz talk about 'technology assessment and choice', which we have here focused on as values reflection and governance. Guston (2014) and other work on anticipatory governance have evolved Guston and Sarewitz's work, but did not feed into our analysis.


Governance in our framework is about how to implement changes in the world, thinking reflectively on how to bring about change and improvement. This is the area where democratic deliberation can occur on how to influence and shape the world toward what we aspire to (Guston 2014). Conceptual tools to help people deliberate on what world they seek are necessary for meaningful deliberation. Further, governance requires many actions to implement in the real world, and reflecting on how to achieve them can be helpful. Miller and Bennett discuss use-cases that we would classify as governance: they highlight how science fiction can provide people with "enduring symbols and problem framings": a recent edition of Mary Shelley's Frankenstein held a rich set of reflections on the Frankenstein metaphor (Shelley et al. 2019). Or, to take a more recent example, Skynet (from the 1980s movie The Terminator) is still used as shorthand for "AI taking over the world". This helps to serve as a vocabulary to lay out ethical dilemmas, and to conceive of technologies that will continue to exist after their creator, which everyday people can use to make sense of potential policy decisions. Science fiction can also provide for "capacity building for reflective governance," which is about enabling citizens to better discuss the future of technology. This helps governance by helping citizens be more reflective, more informed, and more disposed to engage in governance and policy activities. Lastly, Miller and Bennett note a use-case where science fiction offers "a fertile playground for the imagination." They refer to this as the ability for fans of a science fiction work to continue to wonder about what would make a certain idea possible (such as creation of a giant ringworld).

We will focus the most on governance in our analysis of Infomocracy (Older 2016), showing how the richness of the book serves as a playground for reflecting on the relationship between technology and society, acting as a kind of simulation.2 We are using the notion of a playground here primarily to think about different ways to configure how society, science and technology relate, and not to reflect on 'what if' questions about science, as Miller and Bennett allude to. As we discuss Infomocracy below, we show how themes from the book can fit into our framework categories of forecasting, values reflection, and governance, illustrating how science fiction can help us think about governance and what sort of society we want to live in. Infomocracy isn't about trying to predict the future, and we will note that some core themes have existed in the real world for a long time (like political operatives glibly manipulating facts and fact-checkers alike).

2  For overviews on the nature of simulations as tied to narrative fiction, see the work of Michael Poznic (Poznic et al. 2020; Poznic 2016).

17.3  On the Relationship among Technology, Society and Democracy

To understand the ways in which science fiction – and Infomocracy in particular – can be a simulation for how we think on the relationship between technology and society, a few key touchstones are needed to guide us through the detailed


relationships that occur in sociotechnical systems. This is particularly important for understanding our discussion below of how Infomocracy changes key democratic and technological systems, altering the relationship between technology and society in ways we can use to inform science policy. The idea that technology shapes society has a long history, and it is sometimes taken to a unidirectional extreme. 'Technological determinism' describes the view that technology develops by its own logic and shapes how society evolves. There are several ways in which technology shapes social behavior: Bimber (1994) notes three conceptions of technological determinism that each partly reflect how technology can affect society: (1) technology becomes a normative goal, with cultural values focusing more on the pursuit of new technological and material resources as ends-in-themselves (a mindset criticized by theorists such as Jacques Ellul and Lewis Mumford); (2) technology can be a source of major unintended consequences or side effects, which affect human life and need to be controlled; and (3) a 'nomological', or law-based, determinism, in which the internal logic of technology grows and shapes social action. This shaping can have two general meanings: Heilbroner (1994) argues that there is often a natural order of steps for specific technological developments, and that different social groups will eventually converge on similar technologies and technological approaches. Vincenti (1992) notes how technology often constrains decisions, and that for lower-level components of a technological system, engineers can face very few design options that will physically work.3 All of these are ways in which technology shapes society.4 Many alternatives to technological determinism capture the ways in which technology and society co-evolve bi-directionally.5 In particular, the historian Thomas Hughes uses the notion of technological momentum to capture this bidirectionality: as a technology becomes more frequently deployed and used in more places, it takes on a type of momentum that creates dependence; it becomes harder to imagine alternatives to the technology, and it develops political constituencies with new interests. Of course, the ways in which society shapes technology are deep as well: cultural norms and values often drive the sense of what problems are worth solving through technology, with tacit conceptions of aesthetics and cost-benefit tradeoffs affecting technology at all levels.

3  Vincenti also broadly recognizes how social influences can affect higher-level systems engineering decisions, but he does worry about constraints in the details of engineering work.
4  It should be noted that even the most ardent advocate of technological determinism, Heilbroner (1994), offers many caveats about the important influences of social factors on and within the design process. Despite the many critiques of and alternatives to technological determinism, Ceruzzi (2005) and Wyatt (2008) note that technological determinism may not have been properly assessed.
5  The social construction of technology movement was perhaps the most contrasting alternative to technological determinism (Bijker et al. 1987). Others refer to technological momentum (Hughes 1994), technological styles (Bijker 1987), and actor-network theory (Latour 2005). These alternatives differ slightly in their concepts, but all serve as possible lenses for examining the influences on a technology, and all recognize political agency as a key influence on how technologies develop. Jasanoff's (2004) notion of co-production offers a comparable way to capture all of these connections in broad terms, with some similarities to Hughes.


There are a myriad of other ways in which society shapes technology. Intuitively, those who hold power and political and financial capital in society plausibly have the ability to significantly shape which technologies are developed. For example, famous case studies have explored how political power and regional interests shaped the development of early U.S. missile systems (Sapolsky 1971; Eisenhardt and Zbaracki 1992). The entrenched interests of those who control existing technologies are also a way that political power locks in, sometimes ensuring the survival of a particular technological pathway (Weiss and Bonvillian 2012). And, of course, humans have values, and those values get infused into the technologies they build, such as biases filtering into algorithms (Williams 2020; Douglas 2009).

Figure 17.1 offers a simplified schematic to visualize the shared influence between technology and society: it treats technology and society as one conjoined sociotechnical system, since technology is inherently a part of society and there is no easy way to cleanly separate the two at all levels. For schematic purposes, we call out separately the influences of society and of technology, with technological influences in green, capturing the ways in which technologies shape human action. The red arrow reflects the broader role of human and cultural influences on technology, illustrating the constellation of political actors and values that shape technology. We will explore later how Infomocracy offers a thought experiment for changing how technology and society relate to one another.

Fig. 17.1  Simplified schematic to visualize the shared influence between technology and society as they act on a nation-state and its associated technological infrastructure. Society's influence on technology: cultural norms shaping goals for technology; entrenched interests promoting technological stagnation; judgment and values. Technology's influence on society: technology makes some actions easier and constrains others; shared patterns of technological growth. Graphics by Elaine Seward

Many of the ways society shapes technology take place through debates and mechanisms that are broadly part of a nation's form of government, including democracies and similar representative governments. So, in that sense, democracy broadly shapes technology through some of the mechanisms noted above. Changes to the democratic process can also have their own effects on technology.6 While some philosophers have advocated using democratic theory as a means to decide on goals for science and technology (Kitcher 2004; Brown 2009), there have been many critiques about the need for much more detailed engagement with democratic processes in order to inform science and technology policy (Brown 2004; Pirtle and Tomblin 2017; Durán and Pirtle 2020). Much of this literature depends on the interplay between facts and values, and on how efforts to use technology to inculcate objectivity can go astray (Williams 2020; Noble 2018). Such interplay underscores the need for democratic oversight of technology. Further, efforts to maintain and protect democracy require a great deal of attention (Talisse 2019) and call for engagement in everyday public and political contexts. The first author has been active in public discussions about the implications of the nation-state, the limits of direct democracy, and the ways in which we could change the structure of democracy (Older 2019a, b, c, d).

6  There is a vast literature on the nature of democracy, including much on how democracy relates to the governance of science and technology policy (Brown 2004, 2009; Gutmann and Thompson 1998). We will not attempt to elaborate on it fully here.

17.4  The World of Infomocracy

Infomocracy imagines a world some 60 years in the future. In this future, the nation-state system is largely defunct, although a few holdouts continue to proclaim sovereignty within their diminished borders. Most of the world now participates in microdemocracy, a system in which the basic jurisdictional unit, called a centenal, is made up of roughly 100,000 people. Centenals are population-based, not territory-based, so in a dense urban area a centenal might take up a few city blocks, while in a sparsely populated rural area it could cover hundreds of hectares. In global elections, which take place every 10 years, each centenal can vote for whatever government it chooses, out of all the governments that exist anywhere in the world, of which there are around 2000 at the time the story begins. As the author, Older construes "governments" in Infomocracy as the combination of laws, processes, and organizational structures that would suffice to describe the government of a nation-state today, along with the policy platforms and customs of what we commonly think of as political parties. Governments can be localized, holding one or two or a few centenals quite close together, concerning themselves primarily with local issues, and not seeking to contest elections more broadly. Others are regional, or focus on an issue with broad resonance across the globe for a scattering of adherents. At the extreme end of the scale, the largest governments might hold dozens or hundreds of centenals, and compete fiercely for more at every election. The government that wins the most centenals in a given election is known as the supermajority. This inaccurate title does not confer nearly the power it suggests, but its holder does get a say in certain administrative processes and decisions related to the functioning of the system as a whole, which, along with the sheer glory of winning the most, makes winning additional centenals desirable.


Fig. 17.2  Key sociotechnical changes introduced in Infomocracy: the move to centenal states and to Information infrastructure. The figure contrasts the present world, with nation-states and current technology, against the Infomocracy world, with centenals and Information. Political changes: from nation-states to centenals. Information technology changes: from the current news world (the current internet and social media; traditional and emerging news organizations) to Information, a powerful, synthesizing news source (real-time evaluation; global surveillance).

Figure 17.2 shows the two main changes that Infomocracy makes to the structure of democracy and its technological infrastructure. The first half of Fig. 17.2 captures the shift to microdemocracy, and makes clear that Infomocracy engages in political fiction, imagining possible evolutions of the social technology that is democracy. Technology in the colloquial (i.e., electrical/digital/physical) sense enters the story most directly through how information gets shared throughout the microdemocracy system by a complex sociotechnical organization called Information. A massive global information-management bureaucracy, a sort of cross between the United Nations and Google, Information serves both technical and political roles in supporting the microdemocracy system. It is funded in large part by the interest on winnings from civil suits against disingenuous advertising and news companies, as well as by a modest system of member dues. Information is responsible for the technological infrastructure of data transfer as well as for collecting, compiling, analyzing, and disseminating information across the globe. Its goal is to make information not only available but accessible; it provides translation, data visualization, tailoring to different reading levels, audio, video, and whatever other innovations it can come up with to equitably promote an informed populace. This can be proactive as well as reactive: Information annotates advertisements and political speech (such as electoral debates), noting when claims veer too close to deception, and it can levy fines and remove content in the case of outright lies.

Information as an organization has some unique characteristics, which play into the development of the technology it operates. Information's staff is housed in hubs all over the world, and the organization attempts a relatively flat structure, without a single predominant leader or even a board of leaders. Aware that an organization dealing in information can never claim to be neutral, Information instead strives for transparency, detailing its own processes as openly as any others. This is elaborated most fully in the third book of the trilogy, State Tectonics, in which a character working for Information is accused of producing biased public reports. The idea that Information, despite its purported exclusion from the fray of politics, is skating on fairly thin legitimacy is a consistent if subtle theme throughout the trilogy.


17.5  How Infomocracy Embodies Forecasting, Values Reflection and Governance

The political system described in Infomocracy was not designed or intended as a utopia, or even a partial utopia. For Older, it is a thought-experiment response to a few specific frustrations. Microdemocracy derives from her experience of living, working, and traveling in a few of the world's many countries, including well-established democracies, some with separatist movements. It is an attempt to grapple with the contradiction between the democratic ideal of governments chosen by the people and the increasing insistence on unchanging, theoretically controlled borders, which is tantamount to a refusal to let the vast majority of people decide what country they want to live in. It also reflects an irritation with the antiquated idea that territory is a source of power in a global economy far more sensitive to population numbers. The centenal system is a reaction to seeing increasingly granular, county-level data about, for example, U.S. presidential elections, which show communities living side by side that hope for very different things out of government. While the novel emerged from a complex mélange of influences and ideas in Older's path as an author, we can use it to exemplify the varied goals for using science fiction to inform policy that we proposed above: forecasting, values reflection, and governance. Table 17.1 presents a series of quotes from Infomocracy that capture these core categories of how science fiction can influence policy. We will discuss a few more examples below, but collectively this helps affirm the diversity of ways in which science fiction in general, and Infomocracy in particular, might inform policy.

In terms of forecasting, Information responds to the increasingly fragmented nature of the media and data environment in the early twenty-first century. Media outlets differ not only in ideologies, but in the assertions (statistics, anecdotes, and supposed facts) that they use to support those ideologies. Nor is this entirely surprising. Much of the evidence on which we claim to base our political, commercial, and household decisions is either more abstract or less certain than it sounds. Pronouncements on the economy are based on theories and estimations; nobody "knows" exactly what the national unemployment rate or median income, for example, is at any given moment, nor is it something one can verify or disprove by direct observation. Public opinion polls are presented as inarguable indicators of election outcomes even when they work with tiny sample sizes and questionable methodology. This is not to say that there is no substance behind those numbers; rather, it is extremely difficult to be certain which interpretation is "true." It is certainly beyond the resources of the average consumer of information. This probably contributes to what has been described as a post-truth context, in which people are suspicious of "facts"; it also means that different media venues can, to a certain extent, present seemingly valid bases for their contradictory positions by cherry-picking or framing those numbers. This brings us to reflect on what value we might find in this future world, given that echo chambers of opinion are not entirely a new development of social media.


Table 17.1  Relevant quotes from Infomocracy that tie to our policy influence goals

Forecasting. Imagining how society might respond to micro-democracy and associated immigration policies. Quote: The whole point of micro-democracy was to allow people to choose their government wherever they were, but plenty of people didn't agree with their 99,999 geographically closest friends. Some areas—Ireland being one classic example, vast zones of what used to be the United States another—had been polarized so deeply and so long that your choices if you stayed were pretty much A or B. (p. 28)

Forecasting. Shows a plausible pathway wherein people will be naive as they embrace a new system; hints at the challenges tied to any major changes. Quote: In the first election, information leadership was naïve and idealistic (Mishima's read not only of the broadcast speeches but the internal memos). They thought that providing data about each candidate government would be enough for people to make informed, more-or-less-sensible choices. (p. 67)

Forecasting. Echoing the present, discusses how people might use information to look at significantly different things, which shapes social and political groupings. Quote: "We find that the people who hate each other rarely view the same types of Information," the man says. "It seems terribly unlikely that they will ever know. In the meantime, everyone is happy, and the possibility for real aggression is being defused." (p. 345)

Governance. Example of a playground: imagines configurations different even from the ones the author proposed, helping prompt the reader to consider other alternatives. Quote: "...they're going to try to get down to the ten-thousand-person level," she says. "Instead of centenals, I don't know, decimals. To keep minorities from getting overrun in their centenals, you know? And so that people take more responsibility for their votes and hopefully make more informed choices. Nano-democracy, they're calling it."... [in terms of changing Infomocracy's 10-year election cycles:] "There are the centenal elections—some governments do them as often as every 2 years. And referendums and policy shifts within that. The last 5 years are all about positioning." (pp. 366–7)

Values reflection. Imagines how society might create new meaning and value tied to Information and the value of transparency; explores the balance between technological and institutional power. Quote: "Information is a public good," one of the older men says with finality. "It may fail for technical reasons, and we may strategize about the best technical approach to get it back up, but we will not withhold Information once it is in our power to make it available. We cannot give ourselves the power to see and leave everyone else blind." A brief silence, and then the same projection says, "Well, then, we bring it up for everyone. I need to know what's happening so that we can respond! Centenals could already be fighting each other." (p. 248)

Values reflection. Information is more than just technology; people make meaning out of it, drawing on many complex emotions. Quote: She has sometimes wondered if this is why people hate Information; whether the idea of all those people working for an enormous bureaucracy, supposedly sapped of their individual identities, spurs some primal fear in people. Or guilt that unimaginable masses of workers have to sort through vids and grind out commentary nonstop to provide the extraordinarily individualized Information that almost everyone on Earth now feels entitled to. (p. 41)

Values reflection. Explores how a group outside of the information system reacts to it, seeing it through ideological lenses. Quote: "Regardless, I am not interested in changing a system that I do not participate in." Seems unlikely. "It is a shame that Information is constantly attempting to influence the minds of your people, claiming that the election system is the answer to all their problems." "Information does not enter here," the sheikh answers. "We provide our subjects with all the news and entertainment that they need." "But still." (p. 54)

Values reflection. Questions the objectivity of the information system and its value, signifying challenges in separating facts and values. Quote: [With respect to the context that Information provides around 'facts':] 'They call it compilation and distribution, or in some cases highlighting and contouring, but he's done it before under a different name: campaigning. As habitual as it is for him, he can't shake the feeling that doing it for Information is somehow amiss.' (p. 353)

Values reflection. Explores value preferences, highlighting the nature of a different system by comparing it to a seemingly more 'status quo' option. Quote: "You should be dancing for joy that the election was disrupted." "I hate your stupid pseudodemocratic infomocracy, true," Domaine agrees. "But I would hate a corporate dictatorship manipulated by the military-industrial complex even more." (p. 266)

Echo chambers have long existed across a variety of technological media, from pamphleteers and hometown newspapers to radio stations and cable news channels. But the thoroughness of immersion in thought bubbles does seem to be growing, which may be diminishing the possibility for useful ideological argument. As a fictional construct that is both political and technological in nature, Information represents the longing for an adult in the room: a single universally shared data source that everyone can agree on to resolve factual questions, supply points in an argument, or decide that key variables remain unknown to the sum of human expertise. Information was also born out of Older's experience of seeing that functional role play out on a small scale during a disaster response in West Sumatra in 2009, when the United Nations Office for the Coordination of Humanitarian Affairs (UNOCHA) brought in a dedicated information manager whose sole job was to receive, compile, and make available data being brought in by field agencies (Older 2019).


It was a powerful demonstration of the potential to see information as a public good, with sufficient externalities to make it worth providing as a public service, like other technological utilities such as power and transportation infrastructure.

The novel lays bare options for how we can govern society, and forces the reader to consider the value trade-offs between different ways of governing technology and society. As tempting as Information sounds, it is hard to ignore its dangers. It differs from Orwell's Big Brother or the propaganda machines of current governments in its efforts at transparency and, perhaps most of all, in its focus on distributing, rather than sequestering, the data it collects. However, it has relatively little at stake in elections, which removes democratic accountability for both algorithm-made and human-made judgments. It is still monolithic as an entity, however diverse its staffing and flat its organizational structures, and it wields incredible power in the most dangerous way possible: under the pretense of facilitation and disinterest. When everyone trusts a single source of information, the potential for corruption is that much more perilous, while without a back-up infrastructure any loss of trust in the organization would invite chaos. Algorithms that seem objectively coded actually encode the values of their designers (Noble 2018; Williams, this volume), and humans struggle to extricate their perspectives from their judgments.

Infomocracy also raises many issues about how one might implement policies to govern society in a different way. It is probably clear by now that while it may be set towards the end of the twenty-first century, Infomocracy is very much about the present, rather than the future, and as such is not primarily an exercise in forecasting. While Older's characters have access to some futuristic technology (wearables that project visuals at eyeball level, flying vehicles, antennae equipped with minuscule cameras that twitch to indicate danger), the critical elements of the Information and microdemocracy systems could function almost as easily in the world we inhabit today. Indeed, many of them do. Google, social media technologies, and educational organizations, as well as the established news media, serve many of the roles of Information. Smartphones are nearly as pervasive and powerful as the wearables described in the book. There have been, and continue to be, many attempts to rein in our information chaos, from fact-checking websites to Wikipedia, although so far none has been a complete success. Even the surveillance aspect of Information, which many readers seem to find frightening or even dystopian, is not terribly far from where we are today, with the difference that in our case the data remains hidden after it is taken.

Microdemocracy can point to other 'real-world' precedents as well: many countries include territory that is not geographically contiguous with the rest of the state, like Alaska, Ceuta, Gibraltar, and others. Conversely, many metropolitan areas that seem like single units contain multiple municipalities with distinct laws, requiring signage to let visitors know when they can no longer turn right on red, for example. State, province, and prefectural borders do the same. The plausibility of many of Infomocracy's key premises underscores its potential utility as a tool for informing our conversations today about what world we want to live in and how to govern it.


17.6  Infomocracy as a Governance Simulation for Re-engineering the Relationship Between Technology and Society

While Infomocracy has multiple nuances for how it might inform policy, Older's thought experiment allows us to reconsider the normal relationship between technology and society, because both technology and society are changed in key ways, including in how they relate to each other. Figure 17.3 below starts from the present world of nation-states and then considers how the existing relationships between technology and society might change with the new decentralized governance and the Information system. Infomocracy switches two key aspects of modern political and information technology. Today we live in a world in which, except for the elite and sometimes (in the case of refugees) the opposite, the form of government we live under is largely decided at birth, while in many countries our sources of information are highly fragmented. Infomocracy posits a future in which we have far more choice of governments, but a monolithic source of information. Information collects data on everyone, with its embrace of transparency making it harder for individuals to maintain a sense of privacy. Widespread data sharing and annotated commentary on news put a heightened focus on the supposed objectivity of politically disputed facts.

In sum, the book offers a playground for the reader to simulate and reflect on how technology and society could relate, to help us debate our values, and to ground normative assertions about the future governance of technology. This is what we would call a governance benefit, in that it offers all of us a partial model for how we can deliberate on what sort of world we want to create. We can use the playground/simulation of Infomocracy as a detailed example in the debate about whether changes are needed in our own democratic process, and how technology might contribute. While the novel does not fully answer how technologies would differently affect societies comprised of micro-democracies, it does show a world that seems both foreign and familiar, and it invites further consideration.

Fig. 17.3  Conceptual map of the changing relationships between technology and society resulting from shifting away from a Nation-state system to a Centenal system and Information. Changes to society's influence on technology: citizens are able to immigrate and to try to leave technological systems; some power grabs attempt to upturn the Information system. Changes to technology's influence on society: Information rapidly shares information, including personal details, deemphasizing privacy; news annotation focuses dialog on questionable notions of objectivity; all political powers need to accommodate and act in accordance with Information.


More broadly, Infomocracy, along with the two sequels completing the Centenal Cycle trilogy (Older 2017, 2018), is largely an attempt to play out the implications of different configurations of democracy and information technology that might work in human society. Once again, this is less about whether that setup is "better" or "worse," and more about highlighting some of the unquestioned parameters of our society. As an author, Older seeks nuance and unexpected consequences rather than justification, and aims to give all of us a stronger lens through which to view our government and technology today. Given the damage wreaked by the nation-state concept over the last century and a half, is there another way of organizing our geopolitics? If democracy requires an informed constituency, is there a sustainable technological and organizational model for providing that information, or at least promoting that outcome? How do our values get embedded into algorithms that supply information, and how does that shape (and get shaped by) political processes such as elections?

The plots of the books serve as just one simulation run of an alternative sociotechnical system, testing it through various mechanisms: natural disasters that might possibly be caused by technology; technological failure and sabotage; bad actors; self-interested actors; actors who believe a different system would be more just; espionage; and human susceptibility to misinformation. To say that the system is being "tested" in this way is, of course, something of a stretch, since all of the plot twists, and their resolutions, came from the invention of the author, occasionally with input from the book's editor. Readers, potentially including policy makers and those advising them, can imagine other tests. Nevertheless, we have shown above how the novel embodies three broad ways in which science fiction can inform policy, especially insofar as it encourages values reflection and consideration of governance. The two main conceptual shifts of Infomocracy, the change to a much more expansive and centralized information system and the move to microdemocracy, provide axes along which a reader can continue to imagine the relationship between technology and society, treating it as a playground and mental simulation for considering even further changes.

All readers have some faculty to interrogate and reflect on how life might be under a set of potentially different scenarios (Berne and Schummer 2005; Berne 2008). We are, as a culture, highly saturated with narratives; we are also relatively critical audiences, trained in the nuances of character development and motivations, even if we can sometimes be blinded by star power or special effects. It is far easier for an author to convince readers of dragons and wormholes than of characters who don't behave the way people do in everyday life. Infomocracy provides a mental playground for all of us to simulate how a different configuration of democracy would influence, and be influenced by, the technologies and organizations we use for news and information. All humans use technology, and our actions either directly or indirectly influence the design of those technologies (Subrahmanian et al. 2020). In recognizing these fundamental connections between democracy, technology and society, we should ask: what world do we collectively want to create (Kitcher 2004)? And how can we engineer our political systems, institutions, and technologies in order to build that world?


References

Berne, R. W. (2008). Science fiction, nano-ethics, and the moral imagination. In Presenting futures (pp. 291–302). Dordrecht: Springer.
Berne, R. W., & Schummer, J. (2005). Teaching societal and ethical implications of nanotechnology to engineering students through science fiction. Bulletin of Science, Technology & Society, 25(6), 459–468.
Bertrand, P., Pirtle, Z., & Tomblin, D. (2017). Participatory technology assessment for Mars mission planning: Public values and rationales. Space Policy, 42, 41–53.
Bijker, W. E. (1987). The social construction of Bakelite: Toward a theory of invention (pp. 159–187). Cambridge, MA: MIT Press.
Bijker, W. E., Hughes, T. P., & Pinch, T. (Eds.). (1987). The social construction of technological systems: New directions in the sociology and history of technology. Cambridge, MA: MIT Press.
Bimber, B. (1994). Three faces of technological determinism. In Does technology drive history? (pp. 79–100). Cambridge, MA: MIT Press.
Bozeman, B., & Sarewitz, D. (2011). Public value mapping and science policy evaluation. Minerva, 49(1), 1–23.
Brown, M. B. (2004). The political philosophy of science policy. Minerva, 42(1), 77–95.
Brown, M. B. (2009). Science in democracy: Expertise, institutions, and representation. Cambridge, MA/London: MIT Press.
Brown, W. M., & Kahn, H. D. (1977). Long-term prospects for developments in space: A scenario approach. Final report prepared by the Hudson Institute for NASA. Available at: https://ntrs.nasa.gov/citations/19780004167
Ceruzzi, P. E. (2005). Moore's law and technological determinism: Reflections on the history of technology. Technology and Culture, 46(3), 584–593.
Conley, S. N. (2018). An age of Frankenstein: Monstrous motifs, imaginative capacities, and assisted reproductive technologies. Science Fiction Studies, 45(2), 244–259.
Davies, S. R., Halpern, M., Horst, M., Kirby, D. A., & Lewenstein, B. (2019). Science stories as culture: Experience, identity, narrative and emotion in public communication of science. Journal of Science Communication, 18(05), A01.
Douglas, H. (2009). Science, policy, and the value-free ideal. Pittsburgh: University of Pittsburgh Press.
Durán, J. M., & Pirtle, Z. (2020). Epistemic standards for participatory technology assessment: Suggestions based upon well-ordered science. Science and Engineering Ethics, 26(3), 1709–1741.
Eisenhardt, K. M., & Zbaracki, M. J. (1992). Strategic decision making. Strategic Management Journal, 13(S2), 17–37.
Eveleth, R. (2019). Can sci-fi writers prepare us for an uncertain future? Wired Magazine. https://www.wired.com/story/sci-fi-writers-prepare-us-for-an-uncertain-future/
Finn, E., & Eschrich, J. (Eds.). (2017). Visions, ventures, escape velocities: A collection of space futures. Tempe: Center for Science and the Imagination, Arizona State University.
Guston, D. H. (2014). Understanding 'anticipatory governance'. Social Studies of Science, 44(2), 218–242.
Guston, D. H., & Sarewitz, D. (2002). Real-time technology assessment. Technology in Society, 24(1–2), 93–109.
Gutmann, A., & Thompson, D. F. (1998). Democracy and disagreement. Cambridge/London: Harvard University Press.
Heilbroner, R. (1994). Technological determinism revisited. In Does technology drive history? (pp. 67–78). Cambridge, MA: MIT Press.
Jasanoff, S. (Ed.). (2004). States of knowledge: The co-production of science and the social order. London: Routledge.
Jørgenson, T. B., & Bozeman, B. (2007). Public values: An inventory. Administration & Society, 39(3), 354–381.


Kitcher, P. (2004). What kinds of science should be done? In A. Lightman, D. Sarewitz, & C. Desser (Eds.), Living with the genie (pp. 201–224). Island Press.
Latour, B. (2005). Reassembling the social: An introduction to actor-network-theory. Oxford: Oxford University Press.
Merchant, B. (2018). Nike and Boeing are paying sci-fi writers to predict their futures. https://onezero.medium.com/nike-and-boeing-are-paying-sci-fi-writers-to-predict-their-futures-fdc4b6165fa4
Miller, C. A., & Bennett, I. (2008). Thinking longer term about technology: Is there value in science fiction-inspired approaches to constructing futures? Science and Public Policy, 35(8), 597–606.
Noble, S. U. (2018). Algorithms of oppression: How search engines reinforce racism. New York: NYU Press.
Older, M. (2016). Infomocracy: Book one of the Centenal Cycle. Tor.com Books.
Older, M. (2017). Null states: Book two of the Centenal Cycle. Tor.com Books.
Older, M. (2018). State tectonics: The Centenal Cycle book 3. Tor.com Books.
Older, M. (2019a). The United States has never truly been a democracy. The New York Times. https://www.nytimes.com/2019/10/24/opinion/democracy-electoral-college.html?smtyp=cur&smid=tw-nytopinion
Older, M. (2019b). The Kurds are the nation-state's latest victims. Foreign Policy. https://foreignpolicy.com/2019/10/31/turkey-syria-kurds-nation-state-latest-victim/
Older, M. (2019c). Presidential debates could be much more imaginative. The New York Times. https://www.nytimes.com/2019/12/19/opinion/presidential-debate-alternatives.html
Older, M. (2019d). An op-ed from the future: The United States should welcome a strong, united Latin America. The New York Times. https://www.nytimes.com/2019/06/17/opinion/future-united-latin-america.html
Older, M. (2019). Organizing after disaster: The (re)emergence of organization within government after Katrina (2005) and the Tohoku Tsunami (2011) (Doctoral dissertation, Paris, Institut d'études politiques).
Pirtle, Z., & Tomblin, D. (2017). Well-ordered engineering: Participatory technology assessment at NASA. In J. C. Pitt & A. Shew (Eds.), Spaces for the future: A companion to philosophy of technology. Routledge.
Poznic, M. (2016). Make-believe and model-based representation in science: The epistemology of Frigg's and Toon's fictionalist views of modeling. Teorema: Revista Internacional de Filosofía, 201–218.
Poznic, M., Stacey, M., Hillerbrand, R., & Eckert, C. (2020). Designing as playing games of make-believe. Design Science, 6, E10. https://doi.org/10.1017/dsj.2020.8
Rejeski, D., & Olson, R. L. (2006). Has futurism failed? The Wilson Quarterly (1976–), 30(1), 14–21.
Rejeski, D., & Wobig, C. (2002). Long term goals for governments. Foresight, 4(6), 14–22.
Samuel, A. (2019). Can science fiction predict the future of technology? The Digital Voyage column, JSTOR Daily. Available at: https://daily.jstor.org/can-science-fiction-predict-the-future-of-technology/
Sapolsky, H. M. (1971). The Polaris system development. Cambridge, MA: Harvard University Press.
Selin, C. (2011). Negotiating plausibility: Intervening in the future of nanotechnology. Science and Engineering Ethics, 17(4), 723–737.
Shelley, M., Guston, D., Finn, E., & Robert, J. (Eds.). (2019). Frankenstein: An annotated version for scientists and engineers. Cambridge, MA: MIT Press.
Shrader-Frechette, K., & Westra, L. (Eds.). (1997). Technology and values. Savage: Rowman and Littlefield Publishers.
Subrahmanian, E., Reich, Y., & Krishnan, S. (2020). We are not users: Dialogues, diversity, and design. Cambridge, MA: MIT Press.


Talisse, R. B. (2019). Overdoing democracy: Why we must put politics in its place. Oxford University Press.
Vallor, S. (2016). Technology and the virtues: A philosophical guide to a future worth wanting. New York: Oxford University Press.
Vincenti, W. G. (1992). Engineering knowledge, type of design, and level of hierarchy: Further thoughts about what engineers know…. In Technological development and science in the industrial age (pp. 17–34). Dordrecht: Springer.
Vinsel, L., & Russell, A. L. (2020). The innovation delusion: How our obsession with the new has disrupted the work that matters most. Currency.
Weiss, C., & Bonvillian, W. B. (2012). Structuring an energy technology revolution. Cambridge, MA/London: MIT Press.
Williams, D. P. (2020). Constructing situated and social knowledge: Ethical, sociological, and phenomenological factors in technological design. In Z. Pirtle, G. Madhavan, & D. Tomblin (Eds.), Engineering and philosophy: Reimagining technology and social progress. Springer Press.
Winner, L. (1980). Do artifacts have politics? Daedalus, 121–136.
Wyatt, S. (2008). Technological determinism is dead; long live technological determinism. In The handbook of science and technology studies (Vol. 3, pp. 165–180). Cambridge, MA: MIT Press.
York, E., & Conley, S. N. (2020). Creative anticipatory ethical reasoning with scenario analysis and design fiction. Science and Engineering Ethics, 26(6), 2985–3016. https://pubmed.ncbi.nlm.nih.gov/32705538/

Part VII

A Provocation

Chapter 18

The Discrete Scaffold for Generic Design, an Interdisciplinary Craft Work for the Future

Ira Monarch, Eswaran Subrahmanian, Anne-Françoise Schmid, and Muriel Mambrini-Doudet

Abstract  We introduce the notion of generic design from a new perspective, though the term does have a history in fields like systems science and, along somewhat different paths, in cognitive science, artificial intelligence and software engineering. For the latter, it was hypothesized as a unique type of thinking, with similarities among design activities in different situations but with crucial differences between these and other cognitive activities. For the former, it was an approach to managing complexity through systems design via sociotechnical systems theory. The focus of our work is different from both. We do not investigate design as a unique type of activity, though we do not deny that it could be. Nor is our work solely ensconced in a sociotechnical system: for us, the latter can also be seen as a physical and/or biological system from which materials for new construction can also be employed. Also for us, the arts are as important to address as science or engineering. Our work is generic because it does not start by limiting itself to designated disciplines or paradigms, fields or domains. Rather, for us, generic design supposes an encompassing multi-disciplinary, multi-field approach for dealing with the ever-growing number and diversity of knowledge islands that are becoming more interconnected, with support for going somewhat native. We also show how generic design participates in the qualitative, or what are sometimes called incommensurable, paradigmatic shifts through which knowledge grows. We begin the development of generic design by combining philosophy and engineering, situating them in craftwork through the use of an integrative object from the construction crafts: scaffolding. Scaffolding becomes a way station between knowledge in its current state and new conceptions under development. This chapter is itself a way station, which links generic design with developments in philosophy (Wittgenstein, Poincaré and Laruelle), in science (Poincaré and Einstein) and in engineering (dimensionless numbers).

Keywords  Generic design · Scaffold · Craft · Conceptual change · Shared memory · Integrative object

I. Monarch · E. Subrahmanian (*)
Carnegie Mellon University, Pittsburgh, PA, USA
e-mail: [email protected]

A.-F. Schmid
Mines Paris Tech, Paris, France

M. Mambrini-Doudet
Institute National De la Recherche Agronomique, Alimentation, Paris, France

© Springer Nature Switzerland AG 2021
Z. Pirtle et al. (eds.), Engineering and Philosophy, Philosophy of Engineering and Technology 37, https://doi.org/10.1007/978-3-030-70099-7_18



18.1  Introduction

We introduce the notion of generic design from a new perspective, though the term does have a history in fields like systems science (Warfield 1994) and, along somewhat different paths, in cognitive science, artificial intelligence and software engineering (Goel and Pirolli 1989, 1992; Visser 2009). For the latter, it was hypothesized as a unique type of thinking, with similarities among design activities in different situations but with crucial differences between these and other cognitive activities. For the former, it was an approach to managing complexity through systems design via sociotechnical systems theory. The focus of our work is different from both. We do not investigate design as a unique type of activity, though we do not deny that it could be. Nor is our work solely ensconced in a sociotechnical system: for us, the latter can also be seen as a physical and/or biological system from which materials for new construction can also be employed. Also for us, the arts are as important to address as science or engineering. Our work is generic because it does not start by limiting itself to designated disciplines or paradigms, fields or domains. Rather, for us, generic design supposes an encompassing multi-disciplinary, multi-field approach for dealing with the ever-growing number and diversity of knowledge islands that are becoming more interconnected, with support for going somewhat native. We also show how generic design participates in the qualitative, or what are sometimes called incommensurable, paradigmatic shifts through which knowledge grows. We begin the development of generic design by combining philosophy and engineering, situating them in craftwork through the use of an integrative object from the construction crafts: scaffolding. Scaffolding becomes a way station1 between knowledge in its current state and new conceptions under development whose identity remains indeterminate. This chapter is itself a way station, which links generic design with developments in philosophy (Wittgenstein, Poincaré and Laruelle), in science (Poincaré and Einstein) and in engineering (dimensionless numbers). The rest of this section discusses the generic in generic design, pointing to a genericity that is a knowledge that does not know itself, or at least cannot predict where it is going, and to a scaffolding that is not simply thrown away on the completion of what is built but rather develops conceptually and practically as the conception of what is being built develops conceptually and practically.

1  Way Station is a 1963 science fiction novel by American writer Clifford D. Simak. Travelers arrive at the way station by a form of teleportation via duplication, where the original body remains at the source and a new live copy is created at the destination.


The rest of the chapter then fills out the notions of generic design and scaffolding. Section 18.2 develops a conceptual framework for generic design that is the basis for our approach to generic design theory and scaffolding, described in Sect. 18.3. This approach is then employed in Sects. 18.4 and 18.5 to discuss precursors of generic design and the use of generic scaffolding in the invention of scientific theories and concepts. Section 18.4 describes how Poincaré, in his roles as mathematician, scientist and engineer, was practicing key aspects of generic design and using a form of generic scaffolding, even making explicit some of his own practices in his popular descriptions of scientific practice. Section 18.5 discusses how Einstein crisscrossed the disciplines in search of problems that forced the generation of new concepts and scaffolding. Unlike Poincaré, Einstein began to forge scaffolding that was used by others in the paradigm changes occurring from the late nineteenth into the twentieth century. While Einstein also popularized some key concepts of special relativity, he did not make explicit the change in scaffolding. Section 18.6 extends the generic practices of Poincaré and Einstein to make explicit the change from single-minded concept construction and supporting scaffolding to multi-minded concept construction and scaffolding. The latter is especially well exhibited in the construction of engineering concepts, practices and methods, as described in Sect. 18.7 and exhibited in Sect. 18.8 on the history of dimensional analysis, a case of multi-authored generic design. The chapter ends with a concluding section that summarizes it in the context of prior work in the philosophy, history, sociology and economics of science, and states our intention to take generic design in additional and perhaps more radical directions.

18.1.1  Views of the Generic

There are many theories in linguistics, psychology, philosophy, education and informatics that touch on the generic in its various forms, its procedures and its particular operational power. They remain partial and even incompatible. On the one hand, as Leslie and Lerner (2016) point out, "generic generalizations are inherently fuzzy and non-totalizing" as to their application.2 They also suggest that the method of cases, though generic in this sense, has been treated as though it were knowledge of universal necessary and sufficient conditions for the application of the relevant concepts (Leslie and Lerner 2016). On the other hand, the generic can be confounded with generalization or universalization, or considered the opposite of the singular or specific. It is more like analogy: its point is not to produce inductive generalizations, but to extend the application of exemplars to new situations. However, generic thinking is not limited to analogy.

2  Generic statements that express generalizations are non-totalizing because, unlike quantified statements, they do not carry information about how many members of the kind or category have a specific property.


Moreover, it is not limited to reasoning from antinomies. (1) Generic thinking goes beyond analogy in that the consideration of bringing together fragments from different disciplines and/or fields is not limited to a previous analogical basis that guides the new composition. (2) It reasons beyond antinomies in that it is not a new proposed solution to previous antinomies, as discussed by Kant, e.g., whether the universe is finite or infinite, discrete or continuous, or whether there is free will. It is the invention of both new antinomies and new solutions. The generic is a vehicle for the logical expression of our knowledge about the world, especially with respect to the construction of new propositions deeply compatible with large and heterologous knowledge and conceptual resources. To use this vehicle, we emphasize the role of identifying incompatibilities among results in different disciplines and fields and creating hyper-compatibilities among their conceptual resources and fields.3 This means that the properties of things and the property rights to things, whether presumed living or non-living, object or subject, natural or made, are open to change according to hypotheses not limited by the confines of analogy and known antinomies. They accord with a logic limited by use and agreement.4 The logical openness of generic design welcomes a Commons where intellectual property rights are transformed. Taken in this way, the generic helps reshuffle the cards of logical order.

The generic deserves more attention from philosophers, though some have already begun to consider it in order to revisit ancient paradox (Nickel 2016); in this way the generic has been used to reinterpret an important class of scientific generalizations (Claveau and Girard 2019). But the philosopher who builds a philosophical proposal for the generic, liberated from usage and habits and currents of interpretation (philosophical, social or mathematical) by a radical gesture combining philosophy with quantum physics, is François Laruelle.5 For Laruelle, the generic is not another philosophical system providing yet again a total world view that imposes a limiting order on the sciences, engineering and the arts, as has been done from Plato to Logical Positivism. Rather, it acts as a 'weak force' that transforms philosophy and the disciplines in a piecemeal, open-ended fashion (Laruelle 2011). For Laruelle, the generic is an inchoative process, a melody, that is linked with philosophy and the language of quantum theory. It forms a remini-science that can move the philosopher from the cavern, to the world and to the universe. It liberates us from being fixed in one disciplinary field or stasis (Tétralogos, un opéra de philosophies, Paris: Cerf, 2019), moving us toward the lived experience of genericity.

For us, the generic provides conditions for accessing conceptual fragments indexed to background knowledge expressed in the literature associated with domains of knowledge.

3  These hyperbolic but still logical operations will be discussed in the section on Einstein's breakthroughs in physics.
4  See Sect. 18.2.3 and especially the discussion of Wittgenstein's use of scaffolding.
5  Introduction aux sciences génériques [Introduction to Generic Science], Pétra, Paris, 2008; Philosophie non-standard: générique, quantique, philo-fiction [Non-Standard Philosophy: Generic, Quantum, Philo-Fiction], Paris, Kimé, 2010.


Generic practice can be established in association with multiple disciplinary spaces that can be criss-crossed by means of interdisciplinary or non-disciplinary scaffolding. A similar role for scaffolding was introduced by Wimsatt and Griesemer (2007), who introduced the notion of entrenched functional scaffolding that permits major cultural changes, sometimes at a precipitous rate. We focus on conceptual change in several cases from science and engineering to show how new concepts are constructed in specific generic spaces spanning multiple disciplines.

18.1.2  Scaffolding for Construction Crafts

The role of scaffolding in craftwork, especially in the construction crafts and certain forms of art from ancient times6 to the present, is pivotal for our conception of generic design. Scaffolding orients construction work in two directions, supporting two of the conditions for the construction process. Attention is alternately focused on: (1) materials or resources, so they can be selected and brought to the scaffolding, and (2) the construction process, so that the staging and organizing of the selected materials or resources and work can take place. The orientation of the scaffolding thus supports two of the conditions for the construction work to get done, and thereby supports (3) performing the craft construction process itself. While selection is preliminary to the actual work done on the construction scaffold, the form and functioning of the scaffold orients the selection process and, together with the staging and organizing work, helps determine (a) what needs to be brought to the scaffold, (b) when it is brought to the scaffold and (c) how the sorts of expertise needed for composing what is being constructed using the scaffold will be organized.

Scaffolding plays a key role in modern-day construction, with scaffolds being structures surrounding buildings that enable builders to walk around the structure being built. We would add that work using scaffolding involves not only walking around but also help with lifting heavy loads, including various sorts of pulleys (starting perhaps with the pyramid builders) and even construction cranes. In other words, transferring heavy or large amounts of material from staging areas to where it can be used for construction is also part of the role that scaffolding plays. The time needed to build scaffolding can be burdensome, but once the scaffolding is complete it allows the work to proceed more efficiently and productively.

6  Although historians cannot be certain of prehistoric scaffolding use, archaeologists suspect that some form of scaffold system was in place for cave paintings. Indentations, suspected to be socket holes, were discovered at the Lascaux Caves in France, whose paintings date back over 17,000 years. These paintings extend onto the ceiling of the cave, and it is thought that the artist would have required scaffolding to paint the images found there. Herodotus, writing about the pyramids of the ancient Egyptians, related that they used light scaffolds as a means of lifting heavy blocks from step to step. He wrote that their scaffold was intentionally lightweight so that it could be easily moved straight up to the next step (Herodotus, Histories, 2.125). There is documented evidence of scaffolding systems being used in ancient Greece, both in art (https://en.wikipedia.org/wiki/Berlin_Foundry_Cup) and in architecture (https://www.metmuseum.org/toah/hd/grarc/hd_grarc.htm).


18.1.3  Scaffolding for Generic Design

While scaffolding in generic design has for us been informed by its role in the construction crafts, it became an open-ended object as its role in concept construction was re-imagined or re-modeled. In the case of generic design, what is primarily being constructed is a conceptual structure: not just single concepts, but concepts that fit together in a structure.7 As Hatchuel and Weil (2009) have pointed out, in design spaces concepts reorganize knowledge, and design activity begins with concepts; otherwise it would be standard optimization or problem solving. For us, generic design is the beginning of design with the construction of new concepts. This also includes techniques or new ways of doing things, because they too depend on the development of these new conceptual structures. In other words, new techniques and ways of doing things are not disseminated by mere imitation alone but force the composition of newly linked concepts or conceptual structures.

As in the construction crafts, work on conceptual scaffolding orients concept construction in two directions: selecting the concepts to be worked on and transforming them into new concepts. The concepts are manipulated as quasi-objects or as embodied. Construction of object-concepts or concept-objects is a process that includes the extraction and/or indexation of fragments of inscribed knowledge or memory, usually from shared knowledge sources accessible in physical or digital libraries. This extraction or indexation corresponds to (1) the scaffolding's orientation toward the selection of materials, in this case conceptual materials. The selected fragments come from different knowledge islands or from different areas of human activity. These fragments can be staged or organized in, for example, a matrix to make manifest incompatibilities that may suggest new avenues of thought. As we will see in Sect. 18.5, according to some scholars (Renn and Rynasiewicz 2014) this ability to surface incompatibilities as materials for creating hyper-compatibilities is characteristic of Einstein's work in developing concepts for a new physics. Hyper-compatibilities influence (2) the orientation of the scaffolding toward the construction work. (3) The construction work itself, for which (1) and (2) are conditions, is also supported by the conceptual scaffolding. Tools and techniques used on the conceptual scaffold to implement generic operators support the composition and transformation of the conceptual materials made available for concept construction under the conditions shaped by (1) and (2). These operators are used to construct integrative objects, as well as processes and subjects, that can populate wider conceptual structures like fictions, models and theories.

So what is conceptual scaffolding made of? It is not something that exists transcendentally, as Kant would have it. It does not somehow presuppose transcendental primitives or transcendental constructs that maintain the identity of knowledge categories. Rather, it is implemented in everyday objects and activities like written notes, printed materials and digital objects such as web sites.

7  Some prefer to call it a network, e.g., a semantic network. For current purposes, we consider structure and network interchangeable.


Conceptual scaffolding in the design of new conceptual structures, or the repair of old ones, is physically embodied, typically in communication media, e.g., art-of-memory techniques in oral poetry, speech, writing, printing, or electronic and digital systems. All of these knowledge representation technologies can be viewed as memory aids. In (1) above, the scaffolding accommodates or indexes global knowledge representations (like disciplines and even cultures) or, even more generally, indexes across such structures. Indexing enables the extraction and selection of fundamental but seemingly incompatible fragments and data from currently accepted knowledge sources. These fragments can then be used in (3) to combine with new concepts that form new conceptual structures. In these cases, (1) selecting and (2) staging and organizing materials or resources is based on the indexing of knowledge resources that are increasingly being made available in evolving knowledge representation technologies (Brachman and Levesque 2004). Whether oral, written, printed or digital, their indexation shows how they can be activated as what we call "shared memory". Two of the authors were part of a team that proposed the concept of shared memory (Konda et al. 1992) as part of a project to analyze the reasons for the success or failure of industrial design organizations. These organizations were evaluated on the basis of how well they built and managed organizational shared memory. The shared memory team recorded and kept track of what problems arose, how they were addressed and what the outcomes were across the multiple projects in the organization. The team investigated how well shared memory was structured, especially with regard to interdisciplinary and inter-field principles, and how well these records were indexed and used in subsequent projects. Suggestions were also made on how to improve these processes. The team sometimes suggested the use of digital support tools they were developing (Monarch et al. 1997). The physical embodiment of shared memory and conceptual scaffolding is becoming steadily more capable and more portable. Writing, drawing, photography, videography and digital tools can be brought to different local sites to set up scaffolding for concept construction work. Although used in multiple local situations and relatively temporarily, conceptual scaffolding still provides stability for representing the object under construction, which usually undergoes many changes to its periphery and even to its core. Multiple workers usually use scaffolding, though it can be geared to a single person. In this way, digital scaffolding, for example, can be shared in different places at the same time.
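To make the notion of organizational shared memory slightly more concrete, here is a minimal sketch in Python of the kind of record-and-index structure just described: projects contribute records of problems, how they were addressed and their outcomes, and a simple concept index lets later projects retrieve them. The class and function names (SharedMemory, Record, lookup) are our own illustrative assumptions and do not reproduce the tools reported in Konda et al. (1992) or Monarch et al. (1997).

```python
from dataclasses import dataclass, field

@dataclass
class Record:
    """One entry in organizational shared memory: a problem, how it was
    addressed, the outcome, and the concepts/disciplines it touches."""
    project: str
    problem: str
    approach: str
    outcome: str
    concepts: set[str] = field(default_factory=set)

class SharedMemory:
    """A minimal shared memory: stores records and indexes them by concept
    so that later projects can extract relevant fragments."""
    def __init__(self):
        self.records: list[Record] = []
        self.index: dict[str, list[int]] = {}   # concept -> record positions

    def add(self, record: Record) -> None:
        pos = len(self.records)
        self.records.append(record)
        for concept in record.concepts:
            self.index.setdefault(concept, []).append(pos)

    def lookup(self, *concepts: str) -> list[Record]:
        """Return records indexed under all of the given concepts."""
        hits = None
        for concept in concepts:
            found = set(self.index.get(concept, []))
            hits = found if hits is None else hits & found
        return [self.records[i] for i in sorted(hits or [])]

# Usage: a later project queries the shared memory built up by earlier ones.
memory = SharedMemory()
memory.add(Record("pump redesign", "impeller cavitation at high load",
                  "rescaled test model; checked dimensionless groups",
                  "failure reproduced and corrected",
                  {"scaling", "fluid flow"}))
print([r.project for r in memory.lookup("scaling")])
```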

18.2  Building a Conceptual Framework for Generic Design

Indexing our collective "shared memory", we draw on several sources in building the scaffolding for generic design. A generic orientation in epistemology and science enables us to situate the interplay among science, philosophy and design. In the first instance, generic design is part of craftwork, opening the way to extend the


use of scaffolding in the building construction crafts to the generic design craft of constructing new concepts. Craftwork provides a concrete setting for the generic posture we take toward design. Generic design is not a-priori philosophy in disguise, because craft practices ground theoretical articulation empirically, though not completely. Lastly, we consider shared memory itself and the indexing that conceptual scaffolding provides. All conceptual design indexes shared memory to mine and select the materials for concept construction, and insofar as new techniques and processes involve new concepts to interpret and use them, these are indexed as well. Such conceptual scaffolding can be seen in technologies that maintain the particular organizations and conceptual guidance of visualizations, sketches, architectural plans, models and the like. Communication technologies like writing, printing and, increasingly, electronic and digital forms all serve as memory aids for all forms of concept construction. They also provide the building materials for the construction of conceptual scaffolding, like automated indexing, the extraction of snippets or fragments and the flows of new conceptual materials. Designers survey, collect, organize and recompose conceptual materials from different disciplines and crafts. They do this by traversing several different disciplinary domains, often with the intention of providing something new that will be of use. With our scaffolding allowing the interplay between philosophy, science, organization and design, we bring a certain perspective on the conceptual pathways employed by major knowledge makers. Galileo and his disciples traversed disciplines and crafts (including geometry, kinematics, hydraulics, strength of materials, and astronomy) and played a large role in moving from Aristotelian to classical physics. To go a step further, as detailed in the next sections, we will (1) reconstitute the conceptual itinerary followed by Poincaré, (2) investigate how historical epistemology (Renn 2004) portrays Einstein in reinterpreting the classical physics of Boltzmann (kinetic theory of gases), Lorentz (electrodynamics) and Planck (black body radiation). The historical epistemology allows the work of Einstein to be treated as if he were a generic design craftsman developing many of the initial but critical conceptual building blocks for the construction of modern physics (statistical mechanics, the light quantum and relativity theory). (3) We will also discuss the discovery of dimensionless numbers. We will show how these new mathematical but physically applicable objects become more specified in the increasing diversity of their contexts and in the processes and methods used to derive them. The same model of the generic design process is used in tracing conceptual innovation in both the science and engineering cases. The generic design process isolates fragments, extracting them out of their multiple source disciplines using the indexing and mining tools that conceptual scaffolding makes available. With the orientation turned toward generic space, these fragments are then composed using generic techniques in the form of operators with the intention of creating new concepts. These operators are oriented toward the future insofar as they initially make fictions by composing new concepts portrayed in the form of integrative objects. These integrative objects are virtually or partially manifested – part object, part concept – rather than simply concepts.
They exist in a space that permits the consideration of objects that are under-specified, open to changing their identities and therefore


variable. Sometimes these integrative objects become further manifested as exemplars that can be modified to fit new contexts, thus generalizing the contexts in which the exemplars apply. They bear a family resemblance to previous exemplars. Such generalizations transform disciplines, as shall be shown in the sections on Poincare, Einstein and the dimensionless number. These transformations are wrought on craft-like scaffolding that borders disciplinary and generic space, fostering innovative construction in generic-like conditions. It would seem that non-standard and craft-like epistemological engineering tools are needed for this transformation.

18.2.1  Generic Epistemology and Generic Space

Two of us have initiated the work on generic epistemology and procedures for science in the future (Schmid and Hatchuel 2014; Schmid and Mambrini 2019). In our book (Schmid and Mambrini 2019), we describe extensively the way to open up epistemology. In short, first, we point out the three fundamentals of generic epistemology: (i) the generic, (ii) the hypothesis and (iii) the "Poincaré criterion". Second, we propose tools to situate research in "generic-like" conditions, to design strong hypotheses – this is not trivial in interdisciplinary regimes – and to evaluate the depth of the disciplinary combination (the "Poincaré criterion"). This criterion was identified from the analysis of the posture and paths that Poincaré adopted for his own invention (Schmid 2019); this analysis is revisited in later sections of the present chapter. Third, we present the two series of procedures with which generic epistemology operates. The first series opens and maintains the generic space. This series deals with the way the object is studied, and therefore we introduce the concept of the "integrative object". It also deals with the conditions for efficiently combining knowledge and savoir-faire, and we propose the concept of "collective intimacy" (Schmid et al. 2016). The second series of procedures helps in dealing with the non-standard heterogeneity of knowledge islands. We have selected three operators and propose a device that allows long-lasting compatibilities among disciplines and the production of concepts leading to theoretical shifts (Schmid and Mambrini 2019). This framework helped identify the tools of generic design and gave value to its techniques and technologies.

18.2.2  Situating Generic Design in Craftwork

The notions of craft, craftwork and craftship8 helped provide us with a view of the multi-dimensional, interdisciplinary and inter-field theoretical framework we are constructing. Looked at carefully, these notions apply to science, engineering,

8  We use craftship instead of craftsmanship because it applies both to men and women.


arts, business and the public and policy spheres. If they are not too quickly dismissed as having inferior or pre-theoretical status, if they are distinguished from being simply "bricolage", then we can realize to what extent they intervene in design activities and support new knowledge and concept production. What is common in all these areas is that craftship is essential to crucial parts of the work performed. For example, in science, craftship is involved in most aspects of setting up and performing experiments. In all areas of engineering, designs have to be carried out and craftwork is always involved (for example, the construction crafts). Medicine is a science and a craft. In the arts, the performance of music, the act of painting, sculpting and writing poetry are all crafts. Political performance and diplomacy are crafts, and so is raising children. So what is craftship? Our framework draws on the ancient Greek idea of craftship as making things or doing a job well for its own sake,9 where the criterion of doing it well is that the output of the craft works well for the users of the craft, e.g., patients, dwellers or children.10 The importance of craftship has recently been revitalized by Richard Sennett in The Craftsman (2008). He sees Homo sapiens from the point of view of homo faber while also insisting on a dialog between the two. Craftship emphasizes techniques, practices and skills and their composition separately and together. Crafts are not static. They develop through a dialog between concrete practices and thinking, evolving into sustainable habits from mutual processes of problem solving and problem finding. From drawings to computer-generated imagery, the understanding and development of techniques proceeds through the powers of imagination and reflection on concrete activities communicated in language and visualizations. Engineering is assumed to have emerged from such craftwork, especially from craft handbooks (e.g., Vitruvius, De architectura, a precursor to the engineering handbook tradition – see the Vitruvius Project at http://www.cs.cmu.edu/~Vit/vit-project.html) that attempt to direct and guide bodily skill.11 We may say that craftwork has some generic aspects in the sense that someone, somewhere constructs new forms combining materials, techniques, methods and principles that have been exemplary in other contexts. The theoretical framework of generic design that we have been describing has in common with craftwork that it is based on tools that have had an exemplary use in other contexts, enriching the theoretical framework while leaving it open and locally transforming social organization. Generic design, as craftwork, demands the acquisition of knowledge, practice and savoir-faire and can itself be a craft, but in our common interdisciplinary effort, we

9  See the Homeric hymn to Hephaestus and Plato in the Republic and other dialogs, as discussed in Sennett's The Craftsman (2008) and Parry's Plato's Craft of Justice (1996).
10  In this sense, the sake of the craft essentially involves the people who will use its products, so it is also for the sake of these people and, more generally, for the sake of all humanity who use craft products.
11  For an interpretation of De architectura as something more politically grandiose than simply a manual for architects and engineers, see Vitruvius: Writing the Body of Architecture by Indra Kagis McEwen.


have found that it combines theory with practice in tracing the generic paths of invention/discovery.

18.2.3  The Role of Shared Memory in Generic Design

The concept of shared memory in design proposed by two of the authors12 emphasizes not only that knowledge of design successes and failures is valuable for future design, but also that insofar as such knowledge could be recorded in text, drawings and computational models, it could be made available for future design work and updated more efficiently and effectively on a computer. Much has been written on how a computer program can represent knowledge (Brachman and Levesque 2004). The two authors have written in this area as well, focusing on the acquisition, representation, indexing and annotation of very large online corpora that could make available conceptual materials for the composition of prototypes and exemplars (Monarch and Nirenburg 1993; Konda et al. 1992; Monarch et al. 1997; Coulter et al. 1998; Subrahmanian et al. 2003). These knowledge representation capabilities contribute to the creation and use of scaffolding to increase design generativity. In the twentieth century, one philosopher in particular wrote on knowledge representation from a logical point of view with both epistemological and non-ontological ramifications. He had also been trained as an engineer and used the notion of scaffolding to describe his proposals. This philosopher or non-philosopher is Wittgenstein, who in the Tractatus (1922/2015) said, "A proposition constructs a world with the help of a logical scaffolding, so that one can actually see from the proposition how everything stands logically if it is true" (4.023) and "The logical scaffolding surrounding a picture determines logical space. The force of a proposition reaches through the whole of logical space" (3.42). For Wittgenstein, a proposition (or propositional sign – 3.14) represents facts by picturing, that is, "[that] the elements of a picture are related to one another in a determinate way represents that things are related to one another in the same way" (2.15). The logical scaffolding surrounding a propositional sign determines a logical space that relates a proposition, if it is true, not only to the facts that it pictures but to all the facts of the world that have logical relations with the proposition. In other words, Wittgenstein's logical scaffolding makes available all the logical inferences a proposition is involved in. A recent article (Floyd 2016) argues that Wittgenstein transfigured his notion of logical scaffolding into an association of Lebensform (form of life) with scaffolding and logic, and also with binding and weaving, with handwriting, charactery, and

12  Monarch and Subrahmanian have combined their backgrounds in software engineering and engineering design research with historically based philosophy of science (Kuhn 1962; Foucault 1969; Feyerabend 1975; Sellars 1968, 1973) through a series of case studies describing their design interventions in various engineering companies.


certainties, including in mathematics.13 Quoting Wittgenstein (2009, §§240–242), she maintains that Wittgenstein extends his conception of logical scaffolding to encompass the idea that it concerns agreement in judgments:

240: Disputes do not break out (among mathematicians, say) over the question of whether or not a rule has been followed … This belongs to the scaffolding [Gerüst] from which our language operates (for example, yields descriptions).

241: "So you are saying that human agreement decides what is true and what is false?" – What is true or false is what human beings say; and it is in their language that human beings agree. This is agreement [Übereinstimmung] not in opinions, but rather in form of life [Lebensform].

242: It is not only agreement in definitions, but also (odd as it may sound) agreement in judgments that is required for communication by means of language. This seems to abolish logic, but does not do so. – It is one thing to describe methods of measurement, and another to obtain and state results of measurement. But what we call "measuring" is in part determined by a certain constancy in results of measurement.

Floyd takes it that the image of "scaffolding" Wittgenstein is using is a revision of Hilbert's idea of logic as part of the Fachwerk (half-timber structure) of a theory, making it dynamic in the varieties of its uses. Referring to a letter of Hilbert's to Frege, she relates that according to Hilbert, any theory is a Fachwerk of concepts, or a schema of concepts together with their necessary relations to each other. This view is similar to Wittgenstein's use of scaffolding in the Tractatus, except that the Wittgenstein of the Investigations now sees logic as subject to change or, in our terms, sees conceptual structures as in flux. This suggests a new sort of scaffolding, which we call conceptual scaffolding, that can handle the indexing and re-indexing of changing conceptual structures. Floyd argues further that, like Turing, and compatibly with his position in the Tractatus, Wittgenstein viewed logic and the axiomatic method in terms of more general and fundamental ideas: rule formulation, catenation, iteration, substitution, and calculation – procedures of calculation that can be taken in, begin somewhere, and may be carried on, reorganized, and articulated indefinitely. According to Floyd, what Turing did in "On computable numbers" was to make logically and mathematically more rigorous Wittgenstein's attempted generalization of the axiomatic method. For, partly indebted to Wittgenstein, he rooted his analysis of the notion of "calculation in a logic", a "formal system", in a "language game": a snapshot of human calculative behavior, a person with pencil and paper living in a social world. In the software world, scaffolding in our interpretation of Wittgenstein has been developed and used in knowledge representation systems that can handle changing theoretical or formal structures. Though not called conceptual and not necessarily directly traceable to Wittgenstein and Turing, these have been called something similar – "software scaffolding." For example, the notion of scaffolding has been applied

13  Floyd refers the reader of her article to Wittgenstein (2009), §§240–242 and xi, §§345–362.


to systems that support software code generation and project generation (e.g., the Ruby on Rails framework14). These systems go beyond managing changing conceptual structures by providing default structures for software applications. The use of conceptual indexing to handle large-scale information infrastructures like the Web and digital libraries is, from our point of view, an implementation of conceptual scaffolding.
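As a rough illustration of what conceptual indexing over large text collections involves, the following hedged sketch builds an inverted index that maps concept terms to the document fragments that mention them, so that fragments from different knowledge islands can be extracted and juxtaposed. The corpus, concept terms and function names are invented for the example and do not correspond to any particular digital library or Web indexing system.

```python
from collections import defaultdict

def fragments(text: str, size: int = 40) -> list[str]:
    """Split a document into small word-window fragments."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def build_index(corpus: dict[str, str], concepts: set[str]) -> dict[str, list[tuple[str, int]]]:
    """Map each concept term to (document, fragment number) pairs where it occurs."""
    index: dict[str, list[tuple[str, int]]] = defaultdict(list)
    for doc_name, text in corpus.items():
        for n, frag in enumerate(fragments(text)):
            frag_words = {w.lower().strip(".,;:") for w in frag.split()}
            for concept in concepts:
                if concept in frag_words:
                    index[concept].append((doc_name, n))
    return index

# Juxtaposing fragments indexed under the same concept but drawn from
# different disciplines is one way seeming incompatibilities can be surfaced.
corpus = {
    "mechanics_notes": "The ether is assumed stationary while bodies move through it",
    "electrodynamics_notes": "The speed of light is the same in every inertial frame",
}
index = build_index(corpus, {"ether", "light", "frame"})
print(index["light"])
```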

18.3  Approach: Generic Design, a Theory/Practice Framework

To develop generic design we choose to build a shared theoretical framework, based on generic epistemology, craftwork and conceptual change in science and information management. The framework is further developed by tracing the paths of inventions and discoveries that have provoked a radical paradigm shift in philosophical, scientific and engineering domains. As exemplars, we draw on the work of Poincaré, Einstein and the engineering science community that invented/discovered dimensionless numbers. Our theoretical framework is dynamic and enriched through the paths that it helps describe. Our work is based on a strong hypothesis: that scaffolding helps bring out generic-like conditions for crafting original paths from knowledge to new constructions that lead to discoveries. We believe our theoretical hypothesis covers a broad territory, from advanced mathematics and experiments with the Large Hadron Collider to prehistoric cave painting. Simply put, our theoretical framework consists of a scaffolding that is oriented in two directions given by our own concepts: (1) shared memory and (2) integrative objects. We will start with the work of Poincare, focusing on how his travel across disciplines was undertaken as an incipient generic designer. We will next draw on the revolution in twentieth century mathematical physics, focusing on the work of Einstein. We will relate his work to Poincare's as both being incipient generic designers, considering the invention/discovery of special relativity (Renn and Rynasiewicz 2014) and the atomism underlying the emergence of statistical mechanics (Poincare 1900, 1908, 1912; Perrin 1913; Einstein 1956; Renn 2005; Chalmers 2009). We will also look at Einstein's considerations of the light quantum (Renn and Rynasiewicz 2014; Renn 2012). Finally, we will turn to the role of generic design and scaffolding in engineering. We will look at the interaction of art and engineering in a discussion of Da Vinci and others, and finish with an example of the interaction between engineering and science – dimensionless numbers.

14  Rails is a web application framework built around the Model-View-Controller (MVC) architectural pattern, providing default structures for a database, a web service and web pages. It encourages and facilitates the use of web standards such as JSON or XML for data transfer and HTML, CSS and JavaScript for user interfacing. Rails also emphasizes the use of other well-known software engineering patterns, such as Convention over Configuration (CoC).


18.4  Poincare’s Partially Explicit Generic Design15 In this section, we will discuss Poincare’s work in mathematics and physics focusing on his use of what we are calling conceptual scaffolding. Poincare used conceptual scaffolding to support his creation of new concepts in mathematics and physics. This scaffolding manifests itself in the way he orders conceptual pathways between the various disciplines he was working in to solve problems across them including: arithmetic, algebra, analysis, geometry, mechanics, mathematical physics and experimental physics. Poincare’s approach can be seen as having some similarity to what has been called the lattice of theories (Sowa 2006; Spivak and Kent 2012). The lattice of theories is an entailment order or specialization-generalization order (Spivak and Kent 2012) that uses four basic operators for navigating the lattice: contraction, expansion, revision and analogy. Poincare’s ordering and conceptual pathways were more flexible than those of the lattice of theories. His aim was to facilitate discovery and invention by adopting parts from different disciplines and forcing them to come together in seemingly incompatible ways. While Poincare was a rigorous mathematician, scientist and engineer, he thought the understanding needed for discovery and invention was generic across all the work he did, whereas validation and justification were local or specific to each discipline. He was the first philosopher scientist to point out hyper-­ compatibility of fragments across disciplines or compatibility between seemingly incongruous, though minimally analogous, fragments as being important in discovery and invention of new concepts. He specified the role of decomposition and generalization in identifying candidate fragments that can be composed in new ways. He also proposed a novel theory of hypothesis formation between (1) sensation and judgment that conjures up repetitions that had not been previously noticed and between (2) mathematics and physics that demands creative leaps across significant institutional divides and different disciplines, requiring daring and courage.

18.4.1  Poincare: Interdisciplinarian and Precursor to Generic Design

Poincare's writings in the sciences, engineering and philosophy have some very remarkable characteristics, not only as to their content but also as to their general organization: they are constructed in such a way as to permit the most effect with a minimum of means, achieving an economy of thought. His writings are (1) very systematic, always elaborating the same techniques for analogous problems; (2) regional or local, because they offer philosophic views interspersed with the problems

15  Much of the following discussion is based on the work of one of the authors of this chapter (Schmid 2001).


specific to each scientific and engineering discipline. Up to now, this part of Poincare's achievement has not been much emphasized. The whole of Poincare's generic architecture is not made explicit in one place. Rather, it is dispersed in threads across his different writings. It is not visible in any single text, nor is it self-explanatory. Rather, it makes itself seen in threads through his books and papers that contain the scientific and engineering contents of his work. In many of his books and papers, Poincare focuses more on looking for analogies among domains than on elaborating special theories. His scientific work touches a variety of domains in a manner that plays with their agreed limits. This is how he can propose an algebraic mechanics that was for a long time missed by scientists and engineers and that has found some recent applications in engineering. The solutions he found to some arithmetic problems were applied to a problem apparently distant from them concerning the stability of the solar system. It is by observing these interdisciplinary crossings that an interpretation of his philosophy can be elaborated. The disciplines and their organization show us where the connections between the sciences and philosophy organize themselves according to Poincare, where philosophic materials are configured systematically with materials from engineering and the sciences, with ramifications even for art (Holton 2001).

18.4.2  Poincare's Synoptic Ordering of Science Conjoined with Philosophy in Popular Science Writing

Seen as a thread through his writings, Poincare classifies scientific contents – arithmetic, algebra, analysis, geometry, mechanics, mathematical physics and experimental physics – according to an ordering which spans from what can be achieved through thinking alone to what needs some amount of organized experience. The ordering moves from arithmetic and the innate or a-priori ability to repeat or iterate to the laboratory experience of experimental physics. Our hypothesis is that this ordering provides enough of a synoptic view to facilitate the creation of analogies and hyper-compatibilities seen across disciplines and in conjunction with philosophical materials, but specific to the scientific and engineering context for which the combination of conceptual materials was being developed. In this way philosophic material becomes scientific, especially when the scientific material being conjugated with philosophic materials is deprived of its equations, as in popular science texts. Popular science is the genre where Poincare brought science and philosophy together. This popularization of the sciences and engineering is where the generic scaffolding that Poincare used is most manifest. No one to our knowledge has devoted sufficient attention to why Poincare engaged in popularizing science. It does not seem to be for public attention, because by all accounts he was not interested in this. Popularizing science certainly fits with Poincare's putting discovery and invention on an equal footing with validation and proof (Detlefsen 1992; Heinzmann and Stump


2017). In any case, he regularly published papers in popular science and philosophy journals, many of which were collected and edited in a series of volumes that included La Science et l'Hypothèse, published in 1902, followed by La Valeur de la Science in 1905, and Science et Méthode in 1908 (Poincare 1913). These were written in a generic, non-standard philosophic style that differs from typical philosophical language by its frequent irony and taste for paradox, which in the texts served to undermine philosophic authority over the scientific and engineering problems being addressed. The popular writings also included essays like the Dynamics of the Electron (1908) that would be understandable by an audience not trained in any of the disciplines whose contents were being discussed.

18.4.3  Poincaré's Ordering of Disciplines as Conceptual Scaffolding

Poincare linked mathematics and experience from two distinct epistemic sources: (1) the capacity to repeat an operation once it is made possible, and (2) experiences from the performance of experiments. For Poincare, each discipline falls on a line between something derived from almost pure repetition and something that depends equally or more on experience. Each discipline in the ordering borrows, without conflict, the amount of repetition and experience of the disciplines above it, though with the ability to use specific elements or fragments from the disciplines above in exceptional ways. This ordering results in a classification of disciplines that is not merely formal, but plays a constituent role in Poincare's way of discovery and invention. Here is the classification:

Arithmetic, whose only source is the capacity to repeat an operation, is the basis for mathematical induction (based on the capacity to repeat an operation) and supports the creation of new mathematics by construction;

Algebra and Analysis introduce the concept of a mathematical group, connecting to

Geometry, a non-experimental discipline with an indirect relation to experience;

Mechanics, close to geometry in its deductive form, but also experimental, though its principles, at least for Poincare, are nonetheless outside the constraints of experiment;

Mathematical physics, broader than mechanics (especially Maxwell's thermodynamics and electromagnetic theory), but with a similar relation to experience;

Experimental physics, the closest to experience as packaged by experiments in the laboratory.

According to Poincare, theories can be metaphorical and conventional, that is, they can privilege, for example, a continuous or discrete approach that is not experimentally confirmed and may never be. However, he did think that concepts from different disciplines could be combined in the confirmation of theories, and he acknowledged that this did happen for the atomic hypothesis (Poincare 1900, 1912; Demopoulos et al. 2012; Demopoulos 2013). Theories can also proceed by


progressive generalization from the facts, distinguishing "raw facts," coming in a "naïve" form, from "scientific facts", the latter being formulated from seemingly incompatible concepts from different disciplines. For Poincare, generalization is more than just generalizing from facts and data, since it is necessary to build, step by step, conditions of compatibility between scientific propositions and the relevant disciplinary knowledge. Building such compatibility is what he did in proposing an algebraic mechanics. Unexpected and new disciplinary connections are possible if the order of disciplines and the construction of generalizations are respected. What Poincare saw as radical breaks between sensation and judgment, and between mathematics and experience, also contributed to his views on extending hypothesis formation. For Poincare, the break between mathematics and experience opens the development of generic intermediaries that move across disciplinary borders. Between sensation and judgment, there is no continuous transformation – each time there is a "leap." Hypotheses produce a break or a jump in thought, which puts into play relations between disciplinary fragments, allowing for controlled generalization, articulation between models and theories and the bringing into play of several heterogeneous fragments of science in generic space. Hypotheses are not restricted to hypothetico-deductive reasoning. They demand daring and scientific courage in allowing the dynamic passage between sensation and judgment. So scientific reasoning is not just a continuous generalization of facts. Poincare constructed models as well. Many of these models, especially in engineering, tried to account for the same phenomenon but were often based on disparate hypotheses that did not seem compatible even though they were not contradictory. Poincare's classification ordering of disciplines allowed for hyper-compatibility rather than standard compatibility because it liberated a space between models and reality due to the gaps between repetition and experience, between fact and generalization, and between conventions and "disguised definitions." In a scientific paper with almost the same title as one referred to previously, "Sur la dynamique de l'électron" (Poincare 1905), Poincare delved more mathematically into the same subject area to formulate his theory of restricted relativity. He combined the Michelson-Morley experiment, principles of mechanics, Maxwell's equations, the deformations of Lorentz and those of Langevin, where each experimental result needed to be generalized in such a way as to keep all this knowledge in different disciplines compatible. Poincare constructed many results by making new links between disciplines rather than by extending particular theories, showing his emphasis on the importance of generic comparability in science and engineering.

18.5  Historical Epistemology of Einstein's Breakthroughs in Physics: Viewed as Generic Design

To discuss the early twentieth century conceptual breakthroughs in physics, we will rely on Renn and Rynasiewicz (2014). Their thorough work describes the methods by which Einstein constructed the concepts that produced the paradigm shift or


epistemic break for a new non-classical physics. These breakthroughs emerged piecemeal from his treatment of borderline problems of classical physics. This piecemeal process is interpretable using our generic design framework. For Einstein, a borderline problem was a problem that pertained to two or more domains of classical physics, such as electrodynamics and mechanics, each of which overlaps with at least one of the others in the details of the physics. Fluctuations play an essential role both in his notion of the light quantum and in Brownian motion. Also, the empirical laws governing radiation play a role both in the advent of special relativity and again in the light quantum. Renn and Rynasiewicz (R & R) describe Einstein's work as not only interdisciplinary in tackling the borderline problems that span different disciplinary domains, but also as revealing fundamental problems with these frameworks that have to be solved outside the classical framework. In their rendering, Einstein's focus was on creating something new, even while using materials from something traditional. In this way, according to R & R, the knowledge system or expression of shared memory was the same for both Einstein and the then current masters of classical physics like Ludwig Boltzmann (the kinetic theory of gases), Hendrik Antoon Lorentz (the atomistic elaboration of Maxwellian electrodynamics) and Max Planck (the theory of black-body radiation). We would add to R & R's description that much of the conceptual scaffolding was shared, since the same borderline problems were indexed. What differentiated Einstein's new physics from classical physics was his intention toward the problems and the conceptual resources used for the construction of new models. Einstein searched for a conceptual basis for all of physics, which he hoped to find with the help of an "interdisciplinary" atomism, and he sought unlikely connections between properties in different domains that seemed incompatible. In terms of generic design, these unlikely connections are examples of hyper-compatibilities. Many of the arguments are based on attempts to use atomistic ideas to build and explain hyper-compatible connections between seemingly disparate physical properties. For example, Einstein constructed a connection between the electrical and thermal conductivities of metals, which he traced back to mobile charged particles inside metals, or between intermolecular and gravitational forces. Einstein's idea of an atom was not defined by a particular theory, but served as a mental model whose exact properties still had to be discovered from experience, i.e., an integrative object or object x of generic design. His interdisciplinary atomism covered such diverse applications as an atomistic theory of liquids, diffusion processes, the electron theory of metals, and heat radiation. In light of these explorations, he began to consider a generalization16 of the kinetic theory beyond what Boltzmann had already achieved. Whereas kinetic theory typically starts with the interaction between the various atomic constituents of a macroscopic system, such as collisions between molecules of a gas, statistical mechanics does not focus on such processes. It can therefore be applied much more generally than kinetic theory. Statistical mechanics focuses not on time development but on the statistical properties of virtual

16  Here generalization is playing much the same role it played for Poincare.


ensembles. A virtual ensemble denotes a large number of essentially identical copies of one and the same system. All of these copies obey the same dynamics, but they differ in the exact configuration of their atomic constituents: there is a vast number of ways that billions and billions of atoms can zoom about and still make up a gas with the same volume, pressure, and temperature. The practically impossible analysis of the time development of a microscopically described system is now replaced by the evaluation of statistical mean values over the virtual ensemble for the system under consideration. R & R discuss Einstein's other interests in addition to interdisciplinary atomism. For example, his interest in the electrodynamics of moving bodies led him to ask how a light wave would look to an observer moving alongside it at the speed of light. The answer obviously depended on the underlying ether model. In a stationary ether, which is not carried along by the moving system, the speed of light relative to the observer would certainly have to change. Einstein hit upon the borderline problems between electromagnetism and mechanics early on, but always with an eye to bringing them under a changed conception. In this way, on our view, he was tacitly simulating the use of a generic matrix by way of his conceptual scaffolding. He simulated the use of the matrix by indexing conceptual structures of the shared memory of classical physics with an eye to borderline problems and incompatibilities in order to forge hyper-compatible conceptual structures that became different fragments of the newly emerging physics. Based on this review of Einstein's methodology, we maintain that while revolutions in paradigms and epistemic breaks do exhibit incommensurability, this does not interfere with a repeatable methodology (generic design) for forging such breakthroughs. These incommensurable concepts were constructed in a craft-like way based on shared conceptual resources that were composed in new ways. The conceptual scaffolding he shared with Boltzmann, Lorentz and Planck was often the same, but his orientation and indexing were different and more in line with Poincare on relativity, lesser-known figures like Smoluchowski and Sutherland on Brownian motion, and Heisenberg building on Bohr, Born and others in quantum theory, with Schrödinger going it alone (Renn 2012). What he had in common with the latter is that he was no longer trying to resolve the incompatibilities within a classical framework, but rather beginning to forge a new one that produced incommensurable concepts of matter, space and time while drawing on the same materials and conceptual manipulations that could have been shared with the classicists. R & R also write that Einstein observed in his light quantum paper of 1905 that the empirical results captured in the Rayleigh-Jeans law were incompatible with the possibility of radiation reaching thermal equilibrium and at the same time called into question the idea of a continuous ether. For Einstein this showed that Planck's radiation formula is incompatible with classical physics and in need of radically new physical models. Einstein conjectured that the Brownian motion of particles suspended in a liquid realized this other model. His decisive insight was that fluctuations of heat radiation can be related directly to a material process. Einstein imagined a cavity filled with heat radiation containing a mirror, which should exhibit behavior similar to Brownian motion as a consequence of the incident radiation and


the friction force from radiation pressure. Einstein's thought experiment involving the mirror served as a conduit for transferring insights from one domain of knowledge to another. Thus one could learn from radiation theory that the observability of fluctuations may be enhanced by an appropriate scaling of the dimensions of the system, and also that the scale of displacement in the expected random walk does not depend on the mass per se but on other dimensions of the problem,17 the surface area in the case of a mirror and the radius of the suspended particle in the case of a liquid. The irregular motion of these particles manifests a conflict between two areas of classical physics, similar to the conflict in the electrodynamics of moving bodies between the relativity postulate and the constancy of the speed of light, and also similar to the conflict between the assumption of a continuum of wavelengths and an equipartition of energy in the case of heat radiation. Einstein's reaction to these conflicts was similar in all three cases. He dared to look for new concepts with which he could overcome the incompatibilities of classical physics. R & R also discuss the advent of special relativity. Essential to this was Einstein interpreting the Lorentz transformation equations in a new way. Lorentz had introduced transformation equations to show that the earth's motion relative to the ether will almost always be unobservable. To ensure the unobservability of motion relative to the ether, Lorentz had to introduce special assumptions that did not have a clear physical meaning. In particular, the impossibility of achieving thermal equilibrium directly contradicted the basic assumptions of the then-current theory of radiation: the existence of a stationary ether. Without an ether, Einstein did not have a basis for the crucial assumption of Lorentz's theory, that the speed of light in the ether is constant. Without an ether, he likewise had no basis for the explanation of the contraction of moving bodies, which Lorentz had introduced in order to explain the result of the Michelson-Morley experiment. Concepts were needed that could be used to provide physical meaning for space, time, and velocity. The meaning of these concepts could no longer be derived from the framework of Lorentz's theory in terms of the behavior of the ether, molecular forces, and so forth. So Einstein focused on those elements that held the key for the eventual solution: for one, the strange notion of local time introduced by Lorentz and the auxiliary assumption he had introduced concerning distances in moving systems of reference, and for another, the constancy of the speed of light and the principle of relativity. If the space and time coordinates that occur in the Lorentz transformations are interpreted as physically meaningful quantities, then these transformations imply that identically constructed measuring rods and clocks in a system that moves relative to an observer define for this observer different units of length and time than his own, that is, relative to measuring rods and clocks at rest. Was there any way to verify the curious behavior of bodies and processes in moving frames of reference? How do measuring rods and clocks behave in such frames? What does it mean to say "this event takes place simultaneously with another one," and how does one check that?
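For reference, the Lorentz transformations at issue can be written in their standard modern textbook form (not a reconstruction of Lorentz's or Einstein's own notation). For relative velocity v along the x-axis,

\[ x' = \gamma\,(x - vt), \qquad t' = \gamma\left(t - \frac{v x}{c^{2}}\right), \qquad \gamma = \frac{1}{\sqrt{1 - v^{2}/c^{2}}}, \]

so that, read as statements about actual measurements, a moving rod is measured as contracted and a moving clock as running slow by the factor \(\gamma\), which is precisely the "curious behavior" of rods and clocks whose verification is being asked about.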

17  Note the relationship to dimensional analysis, to be discussed later.


Einstein posed such questions inspired by his philosophical reading, particularly the writings of David Hume, Ernst Mach, and Henri Poincaré. This background reading opened him to the idea that the concept of time is not simply given, but represents a complicated construct, and that to ascertain simultaneity at different locations one needs a definition that must be based on a practical method. Einstein's method for ascertaining simultaneity at different locations – the synchronization of spatially distant clocks by light signals that propagate with a finite speed – was used in contemporary technical practice, as Einstein knew from the reading of popular scientific texts and from the context of his patent work. When taken seriously as actually measured quantities, the curious space and time coordinates in the Lorentz transformations no longer contradicted the foundations of kinematics. Einstein's insight was to view the Lorentz transformations as resulting from the principle of relativity and from the new non-classical principle of the constancy of the speed of light. Einstein was able to interconnect two levels of knowledge – a theoretical and a practical one – in a new way. As we saw, his reflection about the concept of time interlinked the practical knowledge about time measurements at different locations with propositions about the propagation of light, based in the theoretical knowledge about the electrodynamics of moving bodies. Interconnecting these two levels of knowledge in Einstein's 1905 paper was the starting point of a scientific revolution that introduced the relativity of simultaneity and its incommensurability with previous notions of simultaneity. However, it was based on clear and concrete transformations of the conceptual materials extracted from classical physical theories. It also led to further conceptual change. Since, according to this framework, physical interactions cannot propagate faster than the speed of light, Newton's well-established theory of gravitation, assuming an instantaneous action at a distance, was no longer acceptable after 1905. Einstein's formulation of the general theory of relativity replaced Newton's theory in 1915. Einstein proposed a tensor equation relating local space-time curvature (expressed by the tensor) with the local energy and momentum within that space-time. Janssen and Renn (2015) argue that Einstein built his field equations on the Entwurf (outline or draft) theory, whose field equations provided scaffolding that supported the construction of the new field equations as arches. Their view is a slightly different use of scaffolding than the one we have been using as part of generic design. On their view, the theoretical and mathematical scaffolding is actually "supporting," in a sense analogous to the physical, the construction of the new Einstein field equations. We would say that Einstein's conceptual scaffolding indexes parts of the Entwurf text that can be used to construct the new equations. On our view, their use of scaffolding also fits in with generic design as craftwork, but in a somewhat more Platonically ideal sense.
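The tensor equation referred to here is what is now written as the Einstein field equations; in their standard modern textbook form (not the notation of the 1915 papers),

\[ R_{\mu\nu} - \tfrac{1}{2} R\, g_{\mu\nu} = \frac{8 \pi G}{c^{4}}\, T_{\mu\nu}, \]

where the left-hand side expresses the local curvature of space-time and \(T_{\mu\nu}\) the local energy and momentum within it.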

18.6  Extending Poincare and Einstein

So far our discussion of generic design has focused on individuals. To put it in terms of Poincare's faculty of repetition, repetition is not just a matter of what takes place in individual minds but also a matter of what takes place across individual minds.


Generic design is always open to multi-minded concept construction, where multiple designers (including users)18 index parts of a shared conceptual space or shared memory, even if across disciplines or fields. Multiple minds can contribute to the development of the same concept, as will be seen in the dimensionless numbers/dimensional analysis case discussed below. In this case, multiple researchers share exemplars of an emerging concept of dimensionless number as part of the conceptual framework of dimensional analysis. They even form collaborative units, though not all together or all at once, and in smaller groups (two or three) than the larger number that will develop the new concept in different contexts using prior cases as exemplars. The dimensionless number case is different from the previously described cases of Poincare and Einstein. Poincare and Einstein indexed the work of others, and Poincare popularized what we are calling scaffolding in his philosophic and popular science writing. They had collaborators. For Einstein, these included Marcel Grossmann and Michele Besso, who were close friends, but who did not contribute independently of Einstein to the new conceptions he was developing. For Poincare, his interaction with other researchers was more in the form of oppositional collaboration. For example, while he communicated with Bertrand Russell and Hilbert concerning the explication of axioms, definitions and conventions in geometry, his new conceptual proposals were not issued as collaboratively coming from the three of them or any two of them, but from himself alone. The relationship among the researchers developing the new concept of dimensionless number was different in important ways. As will be shown, different researchers made independent contributions to the development of the new concept in different contexts, indexing some different concepts and developing new techniques to adapt a previous exemplar to the exigencies of their situation. All worked in multiple disciplinary and practical domains. Some of the researchers worked together in producing new exemplars in different contexts, e.g., Froude's work with Brunel on scaling models for testing in the building of large ships. Most followed the work in the literature of some of the others quite closely, e.g., Rayleigh followed the work of Fourier and later the work of Strouhal and Reynolds, and Buckingham (of the Pi theorem) followed the work of most of these. Bridgman acted as a chronicler of all this work, showing how the different exemplars and the theorem could be used in future engineering and physics work. However, no one of these researchers was the single name associated with the new generic generalization, the dimensionless number.

18.7  A Generic Design Perspective on Engineering

Craftspeople and engineers have created techniques and technologies that we see as supporting the generic design process. We also see these techniques and technologies as the products of generic design. This is especially true of the construction of

18  See Generic Design Process in Architecture and Engineering (Bucciarelli et al. 1987).


visualization aids and memory aids, handbooks of drawings, diagrams, architectural plans and textual descriptions of exemplary designs, and physical or digital models. Engineering and craftwork are based on already existing constructions that serve as exemplars for future constructions. These not only save one from having to experience the original constructions first hand, but do more than this by providing guidance for understanding the exemplar. As Ferguson proposes in his book Engineering and the Mind's Eye (Ferguson 1992), the engineer/designer cannot completely express the implicit knowledge gained from pragmatic experience in verbal and mathematical descriptions. Polanyi (1974) called this sizeable residual knowledge tacit knowledge. However, diagrams and visual models in the collective context of a guild can capture at least some of the residue. Together they make it possible to duplicate some aspects of the original experience concisely, thus facilitating the investigation of important aspects of the experience and also the exploration of alternative possibilities by changing the diagram or visual model in different ways, rather than having to duplicate everything that made for the original experience. In this way, the visualizations and memory aids play the role of conceptual structures that can be indexed via conceptual scaffolding to produce new concepts and conceptual structures. When engineering science came on the scene, these techniques and technologies were elaborated using various experiential means and were developed even before engineering theories were developed, or even when they were not. They also supplemented the engineering theories already being used in a domain or discipline. For example, exemplars of visual and hands-on aids can provide a space for virtual experience of alternative engineering possibilities. Leonardo's sketches collected in his codex are a great example of his ability to transfer his understanding of the mechanics of the animal body to create imaginary devices and machinery (Capra 2008). Redtenbacher took the handbooks to a whole new level.19 Rendering exemplars by visualizations, or a collection of visualizations, from the viewpoint of conceptual scaffolding guides investigation into the viability, efficiency and reliability of structures and their behavior without the trial and error costs of building the actual structures first. Craft and engineering handbooks are exemplars of knowledge representation technologies that serve as memory aids for shared memory. Collecting these exemplary sketches and scaled models in codices and engineering handbooks has been critical for making engineering more efficient, effective and reliable. Documentation of the elements and processes of design is the basis of preventing failures, gaining more efficiency or reliability and exploring new concepts and compositions of concepts. In the next section we explore the evolution of dimensional analysis as a method that allows one to reason without the functional equational form of the phenomena studied (Macagno 1971). The case of dimensional analysis and dimensionless numbers we describe next exhibits a more collective approach to concept innovation than we have described previously. The case also further elaborates the role of exemplars in generic design.

19  Starting in the 1830s, Redtenbacher began collecting and organizing practical and empirical knowledge on machine design and the disciplinary knowledge of hydrodynamics and classical mechanics that were steeped in geometry and algebra (Wauer et al. 2009).


This case can be introduced in terms of two views concerning scale in engineering design. Addis, an historian of structural engineering, maintains that scaling from toy and other scaled-down versions of the design object is critical for detecting boundary conditions, which often drive failures in the development of the artifact (Addis 2013). Petroski, who has written many popular books on engineering design failure, puts a different wrinkle on the scaling up of scaled-down models. He points out in his book on design paradigms (Petroski 1994) that scaling up of objects is one place where designs fail. Petroski points to numerous examples where designers are overconfident that small-scale models, or even smaller versions of the same artifact, will scale up. Concerning the size of great ships, Petroski provides the Titanic disaster as an example (he also makes the same point about bridges). For Petroski, the Titanic's true failure mode may not have been only the sharpness of the iceberg, since no large gash was discovered in the wreck, but rather its colossal size. Both Addis and Petroski cite Rankine as having made an important contribution to avoiding design failure, though Addis had little if any interest in scaling up as a failure mode in and of itself. According to both, Rankine introduced the idea of the harmony of theory and practice, though Addis, unlike Petroski, essentially devotes a book to it, Structural Engineering: The Nature of Theory and Design (1990). Addis provides the following key quote from Rankine that Petroski also references as an example of the "harmony", though Rankine does not mention anything about such harmony in what he says: "The study of scientific principles with a view to their practical application is a distinct art, recognizing methods of its own…This kind of knowledge (intermediate between purely scientific and purely practical)…enables its possessor to plan a structure or machine for a given purpose without the necessity of copying some existing example–to compute the theoretical limit of strength and stability of a structure, or the efficiency of a machine of a particular kind–to ascertain by how far an actual structure or machine fails to attain the limit, and to discover the cause and the remedy of such a shortcoming–to determine to what extent, in laying down principles for practical use, it is advantageous, for the sake of simplicity, to deviate from the exactness required by pure science; and to judge how far an existing practical rule is founded on reason, how far on mere custom, and how far on error." When Rankine refers to his view of what might be called the practical theory of the artifact20 as a distinct art, he is confirming our view of the craft-like nature of even engineering science. We describe what this means in the case of dimensional analysis next.

20 See the article by J. C. Reddy et al., "Designing as Building and ReUsing of Artifact Theories: Understanding and Support of Design Knowledge" (Reddy et al. 1997).
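The scaling worry that Petroski raises can be made concrete with a minimal square-cube illustration; the following back-of-the-envelope argument is ours, not a reconstruction of Petroski's or Addis's own calculations. For a structure of characteristic length L made of a material with density ρ and allowable stress σ, self-weight grows with volume while load-bearing cross-sections grow only with area:
\[
\frac{\text{self-weight load}}{\text{load-bearing capacity}} \;\propto\; \frac{\rho\, g\, L^{3}}{\sigma\, L^{2}} \;=\; \frac{\rho\, g}{\sigma}\, L .
\]
Stress from self-weight therefore grows linearly with the scale factor, so a geometrically similar enlargement of a safe model can exceed the material's strength unless the design itself, and not merely its size, is changed. This is one simple sense in which small-scale success does not guarantee full-scale success.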


18.8  Dimensional Analysis: A Case of Multi-authored Generic Design

Dimensional analysis arose in the context of solving problems in the development of Fourier's analytical theory of heat, which Rayleigh applied in his initial version of the theory of sound, probably using Strouhal's work on the production of sound. It was further developed through Stokes' mathematical calculation of the frictional force due to viscosity in fluid motion, Reynolds' experiments to help predict fluid flow patterns in different contexts related to shipbuilding, and Froude's work with Brunel on scaled models for testing in the building of large ships, and it was later generalized by Buckingham's Pi theorem. Dimensional analysis was a byproduct of solving physics and engineering problems of heat and fluid flows in various contexts. To solve these problems, five important conceptual innovations were needed to: (1) distinguish dimensions from units, where a dimensional category can be expressed in multiple units of measurement; (2) extend the notion of dimension from geometric space to a broader conceptual space that initially included physical phenomena like fluid flow and, more specifically, heat and sound; (3) identify the principle of dimensional homogeneity to enable algebraic consistency; (4) construct dimensionless numbers where the conceptual parameters of an object interacting with a phenomenon are known but the functional, equational relationships are not; and (5) derive a mathematical existence theorem that identifies the dimensionless numbers for a mathematical function describing a physical phenomenon. Alongside these five conceptual changes is the development of the principle of similitude (also referred to as the principle of similarity). The principle can be stated in generic terms as a generic generalization encompassing the relationship between a model and the context of a real application in terms of their geometric, kinematic and/or dynamic similarity. In what follows, this principle can be seen to be at work in the examples we discuss. From the point of view of generic design, these five conceptual innovations contribute to a generic method for measurement and analysis of dynamic phenomena. It is generic in its origins and is still generic today. In the first instance, it is generic because the new concepts arose in a traversal of multiple disciplines, providing conceptual materials and techniques that needed to be composed and fitted to different scientific, engineering and craft domains. The invention of dimensionless numbers focused not only on concepts and mathematical techniques but also on various technologies like shipbuilding and acoustics. This is one reason why the dimensionless number case is different from the Einstein case, even though both science and engineering disciplines were involved in Einstein's conceptual innovations as well. In the Einstein case, the conceptual innovations were infused into the interpretation of empirical laws and data-generating experimental apparatuses. The confluence of his thought experiments and others' physical experiments (e.g., Jean Perrin (1913)) across disciplines was important in the confirmation of atomism and the development of relativity theory. However, the conceptual and other innovations of Einstein were not conditioned on technologies in the way they were in the emergence of
dimensionless numbers. Einstein's breakthroughs may be the basis for nuclear power and weapons, but the latter were not a condition of his conceptual innovations. The scaffolding for the construction of dimensionless numbers was therefore also different from the Einstein case. The scaffolding for dimensionless numbers was not geared to a single researcher who could survey shared memory in multiple directions, but was rather geared to multiple researchers, each with their own orientation to the multiplicity. Each researcher had a somewhat different orientation and surveyed some overlapping and some different shared memory domains. The generic innovations include seeing heat flow as fluid flow and extending fluid flow with viscosity to airflow (aerodynamics). We can also include here the conjugation of spatial dimension with physical phenomena, resulting in the identification of new physical dimensions. In addition, a generic method or approach (based on exemplars) had to be discovered/invented for applying the new concepts of physical dimensions. It was developed in applications of these new dimensions in different contexts, each of which provided exemplary characterizations of the dynamics of behavior for a given context. Dimensions have their roots in applied geometry (Martins 1981). Geometry was the first means to represent scaled dimensions of structures that, when scaled to the real device, would be invariant in the desired properties. Identification of the three dimensions in Euclidean geometry and the use of dimensions as exponents arose in the early days of geometry for describing physical objects geometrically as areas and volumes. The dimensions were treated as exponents of the length measure: area as L² and volume as L³. They were even called "rectangular number" or "solid number." This notion of dimension as exponent in geometry was related to the principle of homogeneity and the rules for composition, requiring that the number associated with area cannot be added to the number for volume; the two can only form a numeric ratio with each other. Dimensional homogeneity was used in specific contexts, especially mechanics, as it was by Foncenex in the 1700s vis-à-vis the laws of composition of forces (Martins 1981). Subsequently, Poisson also used it for the composition of forces. In the early nineteenth century, Fourier explored the differences between units and dimensions in his analytical theory of heat, referencing Foncenex (Martins 1981; Macagno 1971). Fourier was the first to expand geometric dimensions to other physical dimensions. He made the claim that dimensions are independent of units and that any equation that characterizes a phenomenon has to be homogeneous in its dimensions. These observations contributed to his inventing the algebra of dimensions independent of units.21 However, he did not invent dimensional analysis. Left to be done were a specification of how to choose parameters and an explicit formulation of the role of similarity (similitude and scaling) in such an analysis.

21 Fourier was the first astute observer to notice "that the equations by which the relation of the component parts of the system is analyzed are expressed in such a general form that they remain true when the size of the fundamental units is changed." (Macagno 1971)
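As a minimal illustration of the reasoning these innovations make possible (our example, not one drawn from Fourier or the sources cited above), the requirement of dimensional homogeneity alone already fixes much of the form of a simple law. Suppose the period t of a pendulum depends on its length l, the gravitational acceleration g and the bob's mass m, with [t] = T, [l] = L, [g] = L T^-2 and [m] = M. Homogeneity of t = C l^a g^b m^c requires
\[
T^{1} = L^{a}\,(L T^{-2})^{b}\,M^{c}
\quad\Rightarrow\quad
a+b=0,\qquad -2b=1,\qquad c=0,
\]
so that t = C\sqrt{l/g}: the mass drops out, the dimensionless group t\sqrt{g/l} is a constant, and only that constant remains to be found by experiment.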


Rayleigh continued the work of Fourier in his theory of sound (1877–1879). Rayleigh recognized the need for the dimensional equivalence of the two sides of an equation for physical phenomena. However, he was not yet aware of Reynolds' work and so did not recognize that the Strouhal number was a dimensionless number for the generation of pitch in the theory of sound produced by strings. The theoretical discussions starting with Fourier and continuing with Rayleigh started out independently of the development of a practical approach in fluid mechanics by Reynolds and Froude that contributed to the invention of dimensional analysis (Rott 1990). Osborne Reynolds was familiar both with the fundamentals of mechanics and with the engineering tasks that were part of his work at the shipbuilding company where he was employed. He was the first to recognize the change in behavior of a fluid with respect to its increasing velocity through a pipe. Reynolds created an experiment in which he used transparent pipes and dye to study fluid flow. He also identified the power law for pressure drop in pipes with respect to velocity changes of the fluid in the pipe. He used the power law to recognize the transition of the flow from laminar to turbulent. Reynolds' aim was to find the upper critical velocity at which the phase change occurs in the fluid flow. He used the formula V = P/BD, where V is the critical velocity, D is the pipe diameter, P is the ratio of the kinematic viscosity ν to ν₀, the kinematic viscosity of water at 0° Celsius, and B is a constant determined by experiment; B was used to compute the dimensionless quantity that is the reciprocal of Bν₀. He conjectured that the critical velocity should occur at a particular number at which the behavior of the fluid changes from laminar to turbulent flow. He used the power law to identify the lower critical velocity at which there was a phase transition and a critical pressure drop. With these experiments, Reynolds predicted the critical value of the Reynolds number to be between 1900 and 2000. The value used today is 2300. However, in order for the application of the resulting Reynolds number to be extended to other contexts, those contexts had to be further articulated. In the case of the sound produced by a string in air (Aeolian tones), the similarities and differences between fluid flow in pipes and over flat plates on the one hand and airflow around a vibrating string on the other had to be articulated, as was done by Rayleigh (Rott 1990). Based on the work of Reynolds, Rayleigh proposed a revised "theory of sound," reinterpreting the experimental work of Strouhal on the origin of Aeolian tones as independent of the wire and as purely aerodynamic. Rayleigh acknowledged that his theoretical prediction, based on viscosity not having a role in fluid flow, was not in line with Reynolds' experiments. Taking this into consideration, he used the Reynolds number as an exemplar to reinterpret Strouhal's number as a dimensionless number (Rott 1992). He concluded that the inclusion of even an infinitesimally small value of viscosity had an effect on the fluid flow behavior. From the point of view of generic design, Rayleigh constructed a generic generalization of dimensional analysis, extending it to calculate drag in aircraft design. Rayleigh's generalization was generic in the first instance because it initiated a set of instances based on exemplars.
However, there is reason to believe that dimensional analysis has remained a generic generalization to this day.
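In modern notation, which postdates Reynolds' own V = P/BD formulation, the dimensionless character of the quantity he isolated can be checked directly; the following is the standard textbook rendering rather than a reconstruction of Reynolds' own calculation:
\[
\mathrm{Re}=\frac{\rho V D}{\mu}=\frac{V D}{\nu},
\qquad
[\mathrm{Re}]=\frac{(M L^{-3})\,(L T^{-1})\,(L)}{M L^{-1} T^{-1}}=1,
\]
where ρ is the fluid density, μ the dynamic viscosity, ν = μ/ρ the kinematic viscosity, V a characteristic velocity and D a characteristic length such as the pipe diameter. Because every dimension cancels, the same number marks the laminar-to-turbulent transition in any geometrically similar flow, whatever the fluid or the system of units, which is precisely what allows Reynolds' pipe results to serve as exemplars in other contexts.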


The development of dimensional analysis based on the exemplary work of Reynolds and Rayleigh was extended to other cases, showing how related physical phenomena could be characterized to allow dimensionless numbers to be constructed. These characterizations supported new predictive capabilities restricted to specific contexts. A watershed event occurred in 1914 with Buckingham's publication of "On Physically Similar Systems: Illustrations of the Use of Dimensional Equations." Buckingham formulated the Pi theorem for dimensional analysis (Buckingham 1914). Buckingham's contribution was to formalize the generic observation that, given a set of variables describing a phenomenon, one can determine the number of dimensionless numbers that can be created to describe different aspects of that phenomenon. For example, if there are n variables participating in an unknown functional form of an equation describing the phenomenon, then n − k dimensionless numbers can be created, where k is the number of dimensions participating in the functional form. However, Buckingham's theorem only provides a way of generating sets of dimensionless numbers, leaving their "physical meaning" underdetermined. Underdetermination has to be addressed through experimental means, and each case is different. Art and craft are required for choosing the optimal variables and developing the experimental means. There were several other contributions to dimensional analysis, including Prandtl's work on boundary layer problems in fluid flow (Rott 1990). The technique of using the theorem had already been discussed in the works of Rayleigh. Still, Rayleigh wondered whether physicists would appreciate Buckingham's theorem. In spite of these advances, the method did not spread until Bridgman's book on the topic codified how it and its exemplars, including the Buckingham Pi theorem, could be used in engineering and physics (Bridgman 1922). In any case, dimensional analysis has remained to the present a generic method that can be applied, using exemplars and the principle of similitude, to many areas of engineering and physics. These areas bear only a family resemblance to one another, not an identity that can be characterized universally.
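As a brief worked illustration of the counting the theorem licenses (our example, not Buckingham's own), consider the drag force F on a sphere of diameter D moving at speed V through a fluid of density ρ and viscosity μ. There are n = 5 variables and k = 3 base dimensions (M, L, T), so the theorem guarantees n − k = 2 independent dimensionless groups, for instance
\[
\Pi_{1}=\frac{F}{\rho V^{2} D^{2}},
\qquad
\Pi_{2}=\frac{\rho V D}{\mu}=\mathrm{Re},
\]
and the unknown law collapses to Π₁ = φ(Π₂), a single curve to be determined by experiment. Which particular groups to form, and which carry "physical meaning," is exactly the underdetermination noted above; the choice is guided by exemplars such as Reynolds' number rather than by the theorem itself.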

18.9  Conclusion

Scaffolding for generic design supports (1) the flow of meaningful fragments from shared memory through the scaffold for the construction of a new concept, and (2) the conjugation of these fragments, whether of a theoretical or practical kind, across knowledge boundaries, acting as both a filter of current knowledge and a protector of what might otherwise be rejected as non-standard or even heretical design work performed in a no-man's land of generic space. This scaffolding (1) helps to survey and also index the conceptual structures of shared memory, e.g., for pre-classical, classical and modern physics. It also (2) puts in place, for construction, tools and techniques from shared memory. Finally, (3) it provides a place for framing the conjugation of fragments from shared memory as new concepts. Conjoined fragments are often incompatible from the point of view
of each of their disciplinary sources. Framing a new conception or integrative object needs to show them to be not only compatible, but also jointly productive in a fictional or hypothetical universe. Not only are concepts imported via the scaffolding; so are tools. The tools enable operations for determining the integrative status of the objects, which then give rise to theoretical shifts and means for engineering the future. Our hypothesis is that such activity on the scaffolding opens a local generic space in which it can be seen that the three operators, virtual, fictional and "seen from the future," identified by generic epistemology, are used. In generic space, fictions depart from the Real and point to the Real,22 oriented toward the sake of humanity. Our work is linked to other frameworks, such as the scientific paradigms of Kuhn (1962) and Dosi (1982) and the discursive formations of Foucault (1969). Kuhn saw conceptual innovation in physics, and Foucault in medicine and the human sciences, as unexplainable epistemic breaks akin to religious conversions. Feyerabend (1975), while promoting more conceptual innovation, opposed method in favor of anarchism. Dosi (1982) was interested in the distinction between the continuity of progress along a technological trajectory defined by a technological paradigm and the discontinuities associated with the emergence of a new technological paradigm. He also argued against one-directional explanations of the emergence of new technological paradigms, like those that assume "the market" is the prime mover. He saw this emergence as the interplay of scientific advances, economic factors, institutional variables, and unsolved difficulties on established technological paths. He claimed to have a model that defines the selection of new technological paradigms among a greater set of notionally possible ones. However, he does not describe where the notionally possible ones come from. In other words, Dosi does not describe how new paradigms are constructed. This construction process, however, is the focus of generic design, though in this chapter we have considered only how the conceptions of a new paradigm are constructed. If our demonstration has succeeded, then the particular scaffold that the present chapter is helping to build offers the conditions to see how the conceptions of new paradigms and new discursive formations are constructed, even when these turn out to be incommensurable with the old. As materials for constructing hyper-compatibilities, the old conceptions are not simply thrown onto unusable junk heaps, but are used in the generic design process to construct the new conceptions. Also, while Dosi argues against one-dimensional market explanations of the emergence of new technological paradigms, and correctly argues that discovery and invention need to be seen multi-dimensionally, he does not consider how epistemological, ethical and aesthetic dimensions are brought to bear. These latter issues clearly bear on such dilemmas as climate change, the sustainability of the biosphere and humanity, and the other global issues we face. In this way, sustainability and ethics are not added as an afterthought but are at the core of the generic design model.

22 Think of Plato's design of the Republic and/or the systems philosophy of Buckminster Fuller's spaceship earth: both are fictions, but meant to have real effect.


The authors of this chapter are working in a generic space, generating the generic design model itself. We are constructing new intellectual identities as we elaborate generic design. We are juxtaposing and interlacing strands of our work in non-standard philosophy, generic and historical epistemology, agricultural engineering and engineering design. Since disciplinary rules governing our work in this generic space are lacking, orientations, principles and rules are worked out during the collision and conjugation of the conceptual materials that we each bring to bear. While innovation is something we see generic design supporting, innovation is not an end in itself, but tied to what works for human beings considered generically. One of the reasons we adopt a generic orientation to design is that we extend the notions of craftwork and scaffolding to much wider areas of application, covering all or much of design. We also extend the notion of design to include not only its typical role in fashion, engineering, architecture (interior and exterior) and the arts, but also, for example, the design of scientific experiments and concepts, the design of proofs and concepts in mathematics,23 the design of policies for managing organizations and governing, even the design of philosophical systems and, perhaps most importantly for our purposes, interdisciplinary design and, more generally, inter-field design. By doing this, we intend to re-situate the relation of philosophy to design or engineering by combining them unilaterally into a conception of generic design. In this latter regard, we also take a generic stance adapted from Francois Laruelle.24 Our contributions to generic design continue to grow. The feedback and interaction at the conference from which this chapter derives, and the comments of the referees, have also provided significant steps in this process.

23 The authors have recently begun to investigate how category theory may be used as a generic tool not only for mathematics but for other disciplines as well (Subrahmanian et al. 2017), basing their work on (Spivak and Kent 2012; Spivak 2015; Gangle 2016).

24 Two of us (Mambrini-Doudet and Schmid) have already worked on extending Francois Laruelle's non-standard philosophy. Laruelle's project combines philosophy with external knowledge, usually from scientific disciplines, creating what he calls generic science (see the Glossary of the Generic Quantum in Laruelle 2010, translated online at https://differendkomplex.files.wordpress.com/2014/10/glossary-of-the-generic-quantum12.docx). Focusing on quantum theoretical concepts produces non-standard philosophy. Its extension on the part of two of us became generic epistemology, with an emphasis on interdisciplinarity and collective intimacy (Schmid et al. 2016; Schmid and Hatchuel 2014; Schmid and Mambrini 2019).

References

Addis, W. (1990). Structural engineering: The nature of theory and design. New York: Ellis Horwood Limited. Addis, B. (2013). Toys that save millions: A history of using physical models in the design of building structures. SESOC Journal, 26(2), 28. Brachman, R., & Levesque, H. (2004). Knowledge representation and reasoning, the Morgan Kaufmann series in artificial intelligence. Amsterdam: Morgan Kaufmann Publ. Inc.
Bridgman, P. W. (1922). Dimensional analysis. New Haven: Yale University Press. Bucciarelli, L., Goldschmidt, L. G., & Schon, D. A. (1987). Generic design process in architecture and engineering. In J. P. Protzen (Ed.), Proceedings of the 1987 conference on planning and design in architecture. American Society of Mechanical Engineers. Buckingham, E. (1914). On physically similar systems; illustrations of the use of dimensional equations. Physical Review, 4(4), 345. Callon, M., & Rabeharisoa, V. (2003). Research ‘in the wild’ and the shaping of new social identities. Technology in Society, 25, 193–204. Capra, F. (2008). The science of Leonardo: Inside the mind of the great genius of the renaissance. New York: Anchor Books. Ceccarelli, M. (Ed.). (2009). Distinguished figures in mechanism and machine science: Their contributions and legacies. Dordrecht: Springer Science & Business Media. Chalmers, A. (2009). The Scientist’s atom and the Philosopher’s stone: How science succeeded and philosophy failed to gain knowledge of atoms. Dordrecht: Springer. Claveau, F., & Girard, J. (2019). Generic generalizations in science: A bridge to everyday language. Erkenntnis, 84(4), 839–859. Coulter, N.  S., Monarch, I., & Konda, S.  L. (1998). Software engineering as seen through its research literature: A study in co-word analysis, Published 1998 in JASIS. Demopoulos, W. (2013). Logicism and its philosophical legacy. Cambridge: Cambridge University Press. Demopoulos, W., Frappier, M., & Bub, J. (2012). Poincare’s “Les conceptions nouvelles de la matiere,”. Studies in History and Philosophy of Modern Physics, 43, 221–225. Detlefsen, M. (1992). Poincaré against the logicians. Synthese, 90, 349–378. Dosi, G. (1982). Technological paradigms and technological trajectories: A suggested interpretation of the determinants and directions of technical change. Research Policy, 11(3), 147–162. Einstein. (1956). Investigations on the theory of Brownian movement (edited by Furth, R. ad translated Cowper, A.D.). New York: Dover. Ferguson, E. S. (1992). Engineering and the mind’s eye. Cambridge, MA: MIT Press. Feyerabend, P. (1975). Against method: Outline of an anarchistic theory of knowledge. London: NLB. Floyd, J. (2016). Chains of life: Turing, Lebensform, and the emergence of Wittgenstein’s later style. Nordic Wittgenstein Review, 5(2), 7–89. Foucault, M. (1969). The archaeology of knowledge. (trans: Sheridan Smith, A. M.). London/New York: Routledge, 2002. Gangle, R. (2016). Diagrammatic immanence: Category theory and philosophy. Edinburgh: Edinburgh University Press. Goel, V., & Pirolli, P. (1989). Motivating the notion of generic design within information-­ processing theory: The design problem space. AI Magazine, 10(1), 18–36. Goel, V., & Pirolli, P. (1992). The structure of design problem spaces. Cognitive Science, 16, 395–429. Hatchuel, A., & Weil, B. (2009). C-K design theory: An advanced formulation. Research in Engineering Design, 19(4), 181–192. Heinzmann, G., & Stump, D.  Henri Poincare, from the Winter 2017 Edition of the Stanford Encyclopedia of Philosophy. https://plato.stanford.edu/archives/win2017/entries/poincare/. Holton, G. (2001, April). Henri Poincaré, Marcel Duchamp and innovation in science and art. Leonardo, 34(2), 127–134. https://doi.org/10.1162/002409401750184681?journalCode=leon. Janssen, M., & Jürgen Renn, J. (2015). Arch and scaffold: How Einstein found his field equations. Physics Today, 68(11), 30–36. https://doi.org/10.1063/PT.3.2979. Published by the American Institute of Physics. 
Konda, K., Monarch, I., Sargent, P., & Subrahmanian, E. (1992, March). Shared memory in design: A unifying theme for research and practice. Research in Engineering Design, 4(1), 23–42. Kuhn, T. S. (1962). The structure of scientific revolutions (PDF) (1st ed.). Chicago: University of Chicago Press.

Laruelle, F. (2008). Introduction aux sciences géneriques. Paris: Editions Petra. Laruelle, F. (2010). Non-standard philosophy: Generic, quantum, Philo-Fiction. Paris: Editions KIME. Laruelle, F. (2011). The generic as predicate and constant: Non-philosophy and materialism. In L. Bryant, N. Srnicek, & G. Harman (Eds.), The speculative turn, continental materialism and realism. Melbourne. (Originally published in French in: Laruelle, Francois, Introduction aux sciences generiques, Editions Petra, Paris, 2008, ch. 2 and 5.). Laruelle, F. (2012).  Photo-Fiction, a Non-Standard Aesthetics, Univocal, (English and French Edition) (French) Paperback – November 1, 2012 by François Laruelle (Author), Drew S. Burk (Translator). Le Masson, P., & Weil, B. (2013). Design theories as languages of the unknown: Insights from the German roots of systematic design (1840–1960). Research in Engineering Design, 24(2), 105–126. Leslie, S. -J., & Lerner, A. (2016). Generic generalizations. Stanford Encyclopedia of Philosophy, Copyright c 2016 by the publisher, The Metaphysics Research Lab, Center for the Study of Language and Information, Stanford University, Stanford, CA 94305. Macagno, E. O. (1971). Historico-critical review of dimensional analysis. Journal of the Franklin Institute, 292(6), 391–402. Martins, R. D. A. (1981). The origin of dimensional analysis. Journal of the Franklin Institute, 311(5), 331–337. Monarch, I., & Nirenburg, S. (1993). An ontological approach to knowledge acquisition schemes. In N. Bourbakis (Ed.), Advanced series on artificial intelligence: Volume 2, knowledge engineering shells. World Scientific, New Jersey. Monarch, I., Konda, S., Levy, S., Reich, Y., Subrahmanian, E., & Ulrich, C. (1997). Mapping socio-technical networks in the making. In G. Bowker, S. Star, L. Gasser, & W. Turner (Eds.), Social Science, technical systems, and cooperative work beyond the great divide. Lawrence Erlbaum, New Jersey. Neffe, J. (2009, August 24). Einstein a biography. Johns Hopkins University Press; Reprint edition. Nickel, B. (2016). Between logic and the world: An integrated theory of generics. Oxford: Oxford University Press. Parry, R. D. (1996). Plato’s craft of justice. Albany: State University of New York Press. Perrin, J. Les Atomes, (Felix Alcan, Paris, 1913), translated as Atoms, by D. Ll. Hammick (Ox Bow Press, Woodbridge, Conn., 1990). Petroski, H. (1994). Design paradigms: Case histories of error and judgment in engineering. Cambridge: Cambridge University Press. Poincare, H. (1905). Sur la dynamique de l’électron. Rendiconti del Circolo matematico di Palermo, 21, 129–176. Translation online at https://en.wikisource.org/wiki/ Translation:On_the_Dynamics_of_the_Electron_(July). Poincare, H. (1908). La dynamique de l’électron. Revue générale des sciences pures et appliquées, 19, 386–402. Translation online at https://en.wikisource.org/wiki/The_New_Mechanics. Poincare, H. “Les conceptions nouvelles de la matiere,” Foi et Vie, 1912, translation by Frappier, M. and Bub, J., with an introduction by Demopoulos, W. Frappier, M. and Bub, J., appears in [Demopoulos et al. 2012]. Poincare, H. (1913). Hypotheses in physics. In The foundations of science. The Science Press. Authorized translation by Halsted, G.B., of “Les relations entre la physique experimental et la physique mathematique,” Reveue Generale des Sciences Pures et Appliquees, 1900. Poincare, H. (1982). Science and method. 
In The foundations of science: Science and hypothesis, the value of science, science and method, translations of Poincaré 1902, 1905b & 1908, University Press of America. Polanyi, M. (1974). Personal knowledge: Towards a post-critical philosophy. Chicago, IL: University of Chicago Press.

Reddy, J.  C., Finger, S., Konda, S.  L., & Subrahmanian, E. (1997). Designing as building and ReUsing of artifact theories: Understanding and support of design knowledge. In Proceedings of the workshop on engineering design debate. London: Springer London. Renn, J. (2004). The relativity revolution from the perspective of historical epistemology. ISIS, The History of Science Society. Renn, J. (2005). Einstein’s invention of Brownian motion. Annals of Physics (Leipzig), 14(Supplement), 23–37. Published online 2005, https://www.physik.uni-­augsburg.de/theo1/ hanggi/History/Renn.pdf. Renn, J. (2012). Schrödinger and the genesis of wave mechanics. Preprint 437, Max Planck Institute for the History of Science. Renn, J., & Rynasiewicz, R. (2014). Einstein’s Copernican revolution. In M. Janssen & C. Lehner (Eds.), The Cambridge companion to Einstein. New York: Cambridge University Press. Rott, N. (1990). Note on the history of the Reynolds number. Annual Review of Fluid Mechanics, 22(1), 1–12. Rott, N. (1992). Lord Rayleigh and hydrodynamic similarity. Physics of Fluids A: Fluid Dynamics, 4(12), 2595–2600. Schmid, A.  F. (2001, January 1). Henri Poincare: Les sciences et la philosophie. Editions L’Harmattan; HARMATTAN edition. Schmid, A.-F. (2012). The science –Thought of Laruelle and effects on epistemology. In J.  Mullarkey & A.  P. Smith (Eds.), Laruelle and non-philosophy. Edinburgh: Edinburgh University Press. Schmid, A. F. (2019). L’Age de l’Epistémologie. Ré-édition Editions Kimé. Schmid, A.  F., & Hatchuel, A. (2014).“On generic Epistemology”, Angelaki, Journal of the Theoretical Humanities, 19(2), 131–144. Schmid, A. F., & Mambrini, M. (2019). Épistémologie générique Manuel pour les sciences futures, Kimé, Paris. Schmid, A. F., Mambrini-Doudet, M., Coutellec, L., Sanchez-Albarracin, E., & Perez, A. (2016). Convoquer les disciplines au banquet des interdisciplines. De l’ ‘ intime collectif ‘ à l’intimité collective comme dimension de l’épistémologie générique,” in :  Interdisciplinarités entre Natures et Sociétés – Bernard Hubert et Nicole Mathieu (dir.), Berne, Peter Lang, pp. 143–166. Sellars, W. (1968). Science and metaphysics: Variations on Kantian themes. London/New York: Routledge & Kegan Paul Ltd/The Humanities Press. Sellars, W. (1973). Conceptual change. In G.  Pearce & P.  Maynard (Eds.), Conceptual change (pp. 77–93). Boston: D. Reidel. Sennett, R. (2008). The craftsmen. New Haven/London: Yale Universit Press. Sowa, J. (2006, January). A dynamic theory of ontology. Conference Paper in Frontiers in Artificial Intelligence and Applications, 150, 204–213. Spivak, D. (2015, June 25). Categories as mathematical models. arXiv, 1409.6067v3 [math.CT]. Spivak, D., & Kent, R. (2012, January). OLOGS: A categorical framework for knowledge representation. PLoS One. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0024274 Star, S. L., (1999, November/December). The Ethnography of Infrastructure. American Behavioral Scientist, 43(3), 377–391. Star, S. L. (2002, April 22–23). Got infrastructure? How standards, categories and other aspects of infrastructure influence communication. In The 2nd Social study of IT workshop at the LSE, ICT and globalization. Subrahmanian, E., Monarch, I., Konda, S., Granger, H., & Milliken, R. (2003, May). Boundary objects and prototypes at the interfaces of engineering design. Computer Supported Cooperative Work, 12(2), 185–203. Subrahmanian, E., Schmid, A. F., Monarch, I., & Breiner, S. (2017, June). 
Perspectives on modeling: Generic epistemology and category theory. In Workshop on philosophy of models in engineering design, Karlsruhe Visser, W. (2009). Design: One, but in different forms. Design Studies, 30(3), 187–223.

Warfield, J. N. (1994). A science of generic design: Managing complexity through systems design. Ames: Iowa State University Press. Wauer, J., Mauersberger, K., & Moon, F. C. (2009). Ferdinand Jakob Redtenbacher (1809–1863). In M. Ceccarelli (Ed.), Distinguished figures in mechanism and machine science: Their contributions and legacies (pp. 217–247). Dordrecht: Springer Science & Business Media. Wimsatt, W., & Griesemer, J. (2007). Reproducing Entrenchments to Scaffold Culture: The Central Role of Development in Cultural Evolution. In Integrating Evolution and Development: From Theory to Practice, edited by R. Sansome and R. Brandon, 228–323. Cambridge, MA: MIT Press. Wittgenstein, L. (1922). Tractatus Logico-Philosophicus, First published by Kegan Paul (London). Side-by-side-by-side edition, version 0.42 (January 5, 2015), containing the original German, alongside both the Ogden/Ramsey, and Pears/McGuinness English translations. Available at: http://people.umass.edu/klement/tlp/. Wittgenstein, L. (2009). Philosophische Untersuchungen = Philosophical investigations (trans: G. E. M. Anscombe, P. M. S. Hacker, and Joachim Schulte., Rev. 4th ed.). Chichester/Malden: Wiley-Blackwell.