Gender in AI and Robotics: The Gender Challenges from an Interdisciplinary Perspective
ISBN-10: 3031216059 · ISBN-13: 9783031216053

Why does AI not include gender in its agenda? The role of gender in AI, both as part of the community of agents creating


English Pages 165 [166] Year 2023



Table of contents:
1 Owning Your Career Paths: Storytelling to Engage Women in Computer Science
1.1 Introduction
1.2 Background
1.2.1 Facts About Women Participation in CS
1.2.2 Existing Efforts to Widen Up Female Participation in CS
1.2.3 Role-Models
1.3 Our Approach and Method
1.4 Analysis and Results
1.4.1 Themes
1.4.2 The Role of AI
1.4.3 Dimensions
1.4.4 Storytelling Framework
1.5 Discussion and Impact
1.5.1 Tie-in with Related Work
1.5.2 The Significance of Role Models
1.5.3 Threats to Validity
1.5.4 Outreach and Dissemination Opportunities
1.6 Conclusions and Future Work
2 Gender Differences and Bias in Artificial Intelligence
2.1 Introduction
2.1.1 Understanding the Historical Background
2.1.2 Examples of Relevant Women Contribution to AI
2.1.3 Issues Overview for Lack of Gender Diversity in AI
2.2 Starting from Education, Where Are Tomorrow's Female Scientists?
2.3 Fixing the Number of Women
2.4 Fixing the Institutions, Improving the Presence of Women in Leadership Positions
2.5 Fixing the Knowledge and Its Side Effects, Creating Principles and Ethical Guidelines for an Inclusive Artificial Intelligence
2.6 Conclusions
3 The Private and the Public: Feminist Activisms in the Informational Turn
3.1 Introduction
3.2 The Private and the Public in Feminist Theory
3.3 From Haraway’s Cyborg to Cyberfeminisms
3.4 An Effective Techno-Activism
3.5 Conclusions
4 Gender Bias in Artificial Intelligence
4.1 Introduction and Problem Statement
4.2 Algorithms, Formal Languages and the Mirage of Objectivity and Computational Neutrality
4.3 Gender Biases in Natural Language Processing: Some Feminist Contributions
4.4 Critical Processes for the Introduction of Gender Bias in Algorithm Processing: A Taxonomy
4.4.1 Data Entry Bias
4.4.2 Biases in Algorithmic Operations and in Outputting and Receiving Results
4.4.3 Biases in the Monitoring and Data Collection that Feeds Back into the Algorithmic System
4.5 Conclusions
5 Emotional Intimacy and the Idea of Cheating on Committed Human–Human Relationships with a Robot
5.1 Introduction
5.2 Romantic Emotional Intimacy
5.3 Formal and Informal Conditions and the Elastic Definitions of Emotional Cheating
5.4 Robots as Sociosexual Actors
5.5 Thinking of Robots as Other
5.6 Conclusions
6 Are Dating Apps and Sex Robots Feminist Technologies? A Critical Posthumanist Alternative
6.1 Introduction
6.2 What is a Feminist Technology?
6.3 Dating Apps
6.4 Sex Robots
6.5 Critical Posthumanism
6.6 Conclusion
7 Not Born of Woman: Gendered Robots
7.1 Introduction
7.2 Automating the Body and Mind
7.3 Perspectives on Chatbot Gender
7.4 Fictional Robots
7.5 Sex and Gender in Humans
7.6 Gender in Robots: Humanoids in Practice
7.7 Views on Circumstances for Gendered Robots
7.8 Principles in Robot Development
7.9 Discussion
7.10 Conclusion
8 How Robots Can Change the Role of Women in Healthcare? A Way Towards Equality?
8.1 Introduction
8.2 Robots and Gender Labor Issues
8.2.1 Robot Brute Force
8.2.2 Social Robot, a New Robot Workforce
8.2.3 Care and Healthcare Robots. A Motivation for Girls to Approach Technology?
8.3 Discussion and Conclusions
9 Anime’s Thoughts on Artificial Minds and Gendered Bodies: From Transhumanism to Transindividuality
9.1 Introduction
9.2 Mapping the t/p Field
9.3 Sex, Gender and the t/p Field
9.4 Born This Way: Sci-Fi and Feminist t/p-Humanism
9.5 (Manga)Anime as a Thinking Device
9.6 Our Animetic Body
9.7 Is the One in the Shell a Human Ghost?
9.8 Interlude. Ain’t “Robot” Always Been a Woman?
9.9 The Robot as Posthuman Alliance
9.10 Simondon, Botticci and the Trans- of Transindividuality
9.11 Conclusion. Is the Animetic Contribution of Our Corpus Feminist?

Intelligent Systems Reference Library 235

Jordi Vallverdú   Editor

Gender in AI and Robotics The Gender Challenges from an Interdisciplinary Perspective

Intelligent Systems Reference Library Volume 235

Series Editors Janusz Kacprzyk, Polish Academy of Sciences, Warsaw, Poland Lakhmi C. Jain, KES International, Shoreham-by-Sea, UK

The aim of this series is to publish a Reference Library, including novel advances and developments in all aspects of Intelligent Systems in an easily accessible and well structured form. The series includes reference works, handbooks, compendia, textbooks, well-structured monographs, dictionaries, and encyclopedias. It contains well integrated knowledge and current information in the field of Intelligent Systems. The series covers the theory, applications, and design methods of Intelligent Systems. Virtually all disciplines such as engineering, computer science, avionics, business, e-commerce, environment, healthcare, physics and life science are included. The list of topics spans all the areas of modern intelligent systems such as: Ambient intelligence, Computational intelligence, Social intelligence, Computational neuroscience, Artificial life, Virtual society, Cognitive systems, DNA and immunity-based systems, e-Learning and teaching, Human-centred computing and Machine ethics, Intelligent control, Intelligent data analysis, Knowledge-based paradigms, Knowledge management, Intelligent agents, Intelligent decision making, Intelligent network security, Interactive entertainment, Learning paradigms, Recommender systems, Robotics and Mechatronics including human-machine teaming, Self-organizing and adaptive systems, Soft computing including Neural systems, Fuzzy systems, Evolutionary computing and the Fusion of these paradigms, Perception and Vision, Web intelligence and Multimedia. Indexed by SCOPUS, DBLP, zbMATH, SCImago. All books published in the series are submitted for consideration in Web of Science.

Jordi Vallverdú Editor

Gender in AI and Robotics The Gender Challenges from an Interdisciplinary Perspective

Editor Jordi Vallverdú ICREA Acadèmia Department of Philosophy Universitat Autònoma de Barcelona Bellaterra, Catalonia, Spain

ISSN 1868-4394 ISSN 1868-4408 (electronic) Intelligent Systems Reference Library ISBN 978-3-031-21605-3 ISBN 978-3-031-21606-0 (eBook) © Springer Nature Switzerland AG 2023 This work is subject to copyright. All rights are reserved by the Publisher, whether the whole or part of the material is concerned, specifically the rights of translation, reprinting, reuse of illustrations, recitation, broadcasting, reproduction on microfilms or in any other physical way, and transmission or information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed. The use of general descriptive names, registered names, trademarks, service marks, etc. in this publication does not imply, even in the absence of a specific statement, that such names are exempt from the relevant protective laws and regulations and therefore free for general use. The publisher, the authors, and the editors are safe to assume that the advice and information in this book are believed to be true and accurate at the date of publication. Neither the publisher nor the authors or the editors give a warranty, expressed or implied, with respect to the material contained herein or for any errors or omissions that may have been made. The publisher remains neutral with regard to jurisdictional claims in published maps and institutional affiliations. This Springer imprint is published by the registered company Springer Nature Switzerland AG The registered company address is: Gewerbestrasse 11, 6330 Cham, Switzerland

To my wife Sarah, an incredible anthropologist and feminist. Without her, my ideas would not have evolved as quickly as necessary, and this book would probably not be in your hands.


This book should not have been edited by me, as I am a clear example of the biases existing in academia: cis, male, white, senior, Western, in short, the dominant community that has blocked brilliant women's access to the intellectual positions they naturally deserve. Nevertheless, making use of that privilege, and following my long-standing concern for a science free of gender bias, I proposed to Springer a new edited book on the special issues and conflicts concerning gender in contemporary AI and Computer Science. Remarkably, at the moment this editorial process began, there was not a single book on the topic. Without pretending to occupy a position I do not deserve, but urged by the necessity to address this inequality, I undertook the edition of this book. It is sad to admit, but despite the emergence of new technologies, the same biases against women prevail. It is worrying to see that the new twenty-first-century economy and science are based on AI, robotics, and computer science, all fields with a very low female presence. The next revolution is thus happening right now, before our eyes, without including women, and its biases and inequalities are being locked in for at least one more century. Despite this scenario, things can change, because human history is not predetermined; it is the result of active pressures and transformations. The revolution must be led by women themselves, although men must take an active role in correcting their own wrong attitudes. Male academics and industry experts still have too many problems to solve without female guidance, and they must learn how to improve their fields. Every year, more than half of our societies' true potential is wasted: the potential of incredibly talented women who are not welcomed or allowed to take part in this revolution.
Even subfields such as open-source software and video games, which ought to be more horizontal and universalist, are heavily dominated by male biases. I have asked highly regarded and diverse experts about their views, ideas, and experiences in these fields: women with their own voices and distinctive perspectives, from different disciplines (Anthropology, Computer Science, Philosophy of AI, Engineering, History, …). All of them agree on the same idea: without women, the future will not be a true future, but merely a repetition of the past, without real advances or changes. I feel very honored and



lucky to have received the ideas of all these researchers, and I hope that you enjoy reading them as well. The forces acting against change are many, and plenty of them belong to the social structure itself. The worldwide COVID-19 pandemic delayed the natural deadlines and the release of this book, and it taught us several lessons. A very important one was this: when problems arise, women face more pressure and more obstacles than men. For example, once confined, male academics significantly increased their scientific production, freed from daily demands (meetings, classes, supervisions, …), while female academics saw their usual production decrease under family and social pressures (care duties, household organization, …): the yin and yang of reality, always pressing against women. These dynamics must be transformed, eradicated, and overcome. Only by being aware of such problems can we create mechanisms for solving them. The first step is to identify the bias and then look for its causes; only then can new solutions be designed. This is the role of this humble edited work: to make visible the pending integration of women into the new information technologies. For my part, I will be happy if more academics, politicians, and society as a whole come to understand the importance of adding this talent and these values to the current technological revolution. The real advances in these fields have been possible thanks to the supersession of outmoded and wrong theories about intelligence, perception, sensory-motor processes, social communication, language and semantics, creativity, and causal knowledge, among a long list of fundamental aspects of knowledge and reality.
Of course, the contributions of this book are not strictly confined to the gender debates within the analyzed disciplines; they also address fundamental aspects that must be taken into account to produce truly transformative research with a better impact on contemporary societies. I hope that you enjoy this book as much as I did while designing the project and following all the necessary steps toward its completion.

Arenys de Mar, Catalonia

Prof. Dr. Jordi Vallverdú


This book covers a rich variety of topics concerning gender issues in Artificial Intelligence and robotics.



Contents

1 Owning Your Career Paths: Storytelling to Engage Women in Computer Science
Elisa Rubegni, Birgit Penzenstadler, Monica Landoni, Letizia Jaccheri, and Gordana Dodig-Crnkovic

2 Gender Differences and Bias in Artificial Intelligence
Valentina Franzoni

3 The Private and the Public: Feminist Activisms in the Informational Turn
Lola S. Almendros

4 Gender Bias in Artificial Intelligence
Enrique Latorre Ruiz and Eulalia Pérez Sedeño

5 Emotional Intimacy and the Idea of Cheating on Committed Human–Human Relationships with a Robot
Julie Carpenter

6 Are Dating Apps and Sex Robots Feminist Technologies? A Critical Posthumanist Alternative
Janina Loh

7 Not Born of Woman: Gendered Robots
Huma Shah and Fred Roberts

8 How Robots Can Change the Role of Women in Healthcare? A Way Towards Equality?
A. Casals

9 Anime’s Thoughts on Artificial Minds and Gendered Bodies: From Transhumanism to Transindividuality
Alba Torrents González and Andreu Ballús Santacana

Chapter 1

Owning Your Career Paths: Storytelling to Engage Women in Computer Science

Elisa Rubegni, Birgit Penzenstadler, Monica Landoni, Letizia Jaccheri, and Gordana Dodig-Crnkovic

Abstract Motivation & challenge: Computer Science suffers from a lack of diversity that is perpetuated by its most dominant and visible role models. The community does itself a disservice by upholding techno-solutionism, short-term efficiency, and busyness as central values. These models are created and consolidated over time through social and cultural interactions that reinforce gender stereotypes. Exposing people to diverse types of role models and stories can make them more aware of the complexity of reality and inspire them to make better-informed decisions about their career paths. Likewise, showing different role models to stakeholders in society and industry can help increase workforce diversity in the computing profession and shift the field towards the consolidation of different role models. This, in turn, may strengthen resilience and the capacity to solve issues related to diversity, equality, and inclusion in Computer Science and, more importantly, allow women to take ownership of their career paths. Goal: To encourage the dissemination, sharing, and creation of stories that show diverse career pathways, in order to address the gender stereotypes created by dominant stories in Computer Science. We tackle this issue by developing a framework for storytelling around female scientists and professionals that shows the diversity of possibilities open to women pursuing an academic career based on ownership of their own pathways. Method: We apply a qualitative approach to analyse stories collected through auto-ethnography, and use thematic analysis to unpack which components of these stories contribute to building an academic path in Computer Science. The authors used their own professional histories and experiences as input, highlighting the central values of their research visions and approaches to life, and emphasising how these have helped them take the decisions that shaped their professional paths. Results: We present a framework made up of nine macro-themes emerging from the auto-ethnography analysis and two dimensions drawn from the literature (interactions and practices). The framework is intended as a reflective storytelling tool that can support women in Computer Science in creating their own paths. Specifically, it addresses issues related to communication, dissemination to the public, community engagement, education, and outreach, with the aim of increasing diversity within Computer Science, AI, and STEM in general. Impact: The framework can help build narratives that showcase the variety of values supported by Computer Science. These stories have the power to show the diversity of people and to highlight the uniqueness of their research visions in contributing to the transformation of our global society into a supportive, inclusive, and equitable community. Our work aims to support practitioners who design outreach activities for increasing diversity and inclusion, and will help other stakeholders reflect on their own reality, values, and priorities. The outcomes are also useful for those working to close the gender gap in Computer Science, in academia and in industry. Finally, they are meant for women who wish to pursue an academic career in this area, offering a spur for reflection and concrete actions that can support them on their path from PhD to professorship.

E. Rubegni
Lancaster University, Lancaster, UK

B. Penzenstadler
Chalmers, Göteborg, Sweden
GU Sweden, Göteborg, Sweden
Lappeenranta University of Technology, Lappeenranta, Finland

M. Landoni
Università della Svizzera Italiana, Lugano, Switzerland

L. Jaccheri
Norwegian University of Science and Technology (NTNU), Trondheim, Norway
UiT The Arctic University of Norway, Tromsø, Norway

G. Dodig-Crnkovic (B)
Chalmers University of Technology, Gothenburg, Sweden
University of Gothenburg and Mälardalen University, Gothenburg, Sweden
e-mail: [email protected]; [email protected]

© Springer Nature Switzerland AG 2023
J. Vallverdú (ed.), Gender in AI and Robotics, Intelligent Systems Reference Library 235

1.1 Introduction

Storytelling is a primary expression of human psychology and a fundamental aspect of the construction of meaning [8]: it is a way for humans to shape and understand their reality. Creating and sharing stories that show the diverse pathways life can take is essential for a better understanding of its complexity. As Chimamanda Ngozi Adichie put it in her TED talk “The danger of a single story” (2009): “The single story creates stereotypes, and the problem with stereotypes is not that they are untrue, but that they are incomplete. They make one story become the only story”. Multiple stories show different perspectives, provide a more complete view of reality, and are ultimately the key to deconstructing stereotypes. Sharing stories that show diverse pathways allows those pathways to be legitimated, and creating a safe space in which people can build their own stories on the basis of their own values contributes to the transformation of our global society into a supportive, inclusive, and equitable community. The creation of new role models is essential in certain contexts, for instance in balancing the presence of women in science, technology, engineering, and



mathematics (STEM) fields. In recent years, women’s under-representation in STEM has captured attention in public, academic, and policy circles. Computer Science is one of the most heavily affected areas. This strong imbalance is long-standing, and no significant progress has been observed in recent years, either in Europe [13] or in the US [54]. An analysis of the CS literature [52] estimates that the gender gap in CS research (parity between the numbers of male and female authors) will not close for at least 100 years if specific measures are not taken. On average, across the whole of Europe, women hold less than 15% of full professor positions. Figures show that in 2016 an overwhelming majority (83.3%) of ICT specialists employed in the EU were men, and 53% of European employers say they face difficulties in finding people with the right qualifications [26]. Since information technology has a direct impact on people’s lives, a lack of diversity among developers can limit people rather than support them in achieving their goals with software. A recent study reports that AI can enable the accomplishment of 134 targets across all the Sustainable Development Goals, but may also inhibit 59 targets [51]. While there is not enough evidence on the relation between AI and Sustainable Development Goal 5 (gender equality), the study claims that AI can work against the accomplishment of Goal 5, for example by exacerbating existing gender stereotypes. There are numerous initiatives worldwide devoted to increasing awareness of the diversity problem in Computer Science and to implementing measures that mitigate it. IEEE Software has recently dedicated a special issue [1] to the diversity crisis in software engineering. The community addresses a variety of diversity axes, such as how to bring more young girls into Computer Science [43] and how to make hackathons trans-inclusive [33].
The idea of this work is that stories showing diverse pathways can drive a change in women’s self-perception of their role in STEM and help deconstruct stereotypes. To explore this topic, we conducted a case study based on auto-ethnography, storytelling [8], and self-inquiry [7], starting from us, the authors of this chapter. The authors all have experience with Computer Science and Computing/Informatics (a field which includes Artificial Intelligence), having encountered these research areas at different turning points in our academic paths. For simplicity, in this chapter we will refer to Computer Science and Computing/Informatics (including Artificial Intelligence) collectively as Computer Science, using the acronym “CS”. We are aware that “CS” might be reductive, or be considered a subarea of computing, but we adopt this convention for readability; please keep in mind that by CS we mean the broader area that includes all of the above. This paper is structured as follows. The Background section provides an overview of current research and the socio-technical context. The Method section presents our qualitative approach to collecting and analysing data based on auto-ethnography. In Analysis and Results we present the outcomes of the thematic analysis, which we elaborated into a framework that aims to make storytelling a regular practice for reflecting upon our paths and heading to the next step. Finally, in the Discussion and Impact section we give our view on how this contribution may affect the broader context.


E. Rubegni et al.

1.2 Background

IEEE Software has recently devoted a special issue to diversity and inclusion in software development [1], illustrating how dimensions such as geography, gender, socioeconomic politics, age, ethnicity, and disability shape who can participate in creating technology. In the following, we recapitulate facts about women's participation in CS, and then existing efforts to widen female participation in CS.

1.2.1 Facts About Women Participation in CS

The Bureau of Labor Statistics (BLS) [14] projects that CS research jobs will grow 19% by 2026. Yet women earn only 18% of CS bachelor's degrees in the United States. Despite the high job demand, CS remains a male-dominated field in the US; in response, many top colleges are making efforts to recruit female CS students, making it an ideal time for women to pursue CS degrees. More specifically, women are under-represented among doctoral graduates in the fields of information and communication technologies (ICT) and engineering, manufacturing and construction (21% and 29% respectively), while women are 68% of doctoral graduates in education [54]. A temporal analysis of the European data shows that, on average, no significant progress in female participation in CS higher education has been observed over the past ten years in Europe [19]. The same is true for the US, as shown in [54], which gathers data on college students over four decades and highlights a persistent, sizable under-representation of women in CS. Beyond academia, the profession inherits the male-dominated student population. Women are strongly under-represented among ICT specialists in all EU Member States, a striking contrast with total employment, where the genders are broadly balanced. Data for CS only is not available at the European level; however, a loose parallel can be drawn from data for Engineering and Technology, where, on average across Europe, women hold less than 15% of full professor positions [13].

1.2.2 Existing Efforts to Widen Up Female Participation in CS

Increasing the representation of women in CS at all levels, from undergraduate and graduate studies to participation and leadership in academic and industry positions, is a grand challenge for academics, policymakers, and society. While research has shed light on the issue, there is limited evidence on effective solutions and on what works. The community addresses the following challenges: How to attract more girls to choose CS as their higher education study and profession? How to retain female students and ensure they finish their studies and start successful careers in the field? How to encourage more female Ph.D. students and postdoctoral researchers to remain in academia and apply for professorships in CS departments? How to support and inspire young women in their careers and help them overcome the main hurdles that prevent women from reaching senior positions in industry and the public sector [23]? Which communication and dissemination strategy to adopt in this field?

Frequently applied measures are interventions to recruit more female students, run by volunteers from the IT industry, university professors, teachers, and parents. Other types of intervention are mentoring programs for women scientists, equal representation in recruitment processes, and work-environment support; these interventions are often run by female actors. The Horizon 2020 decision-making bodies have reached the objective that women now make up 55% of the advisory boards and 41% of the evaluation experts [13]. There have been several practical efforts to increase women's interest in CS courses and careers. For example, a study conducted by the University of California, Berkeley, on closing the gender gap in CS subjects revealed that the way courses are marketed might not be suitable to attract girls [6]. In 2014, the university changed its course named "Introduction to Symbolic Programming" to "The Beauty and the Joy of Computing", which resulted in women outnumbering men in the class for the first time [6]. Furthermore, one of the biggest motivating factors for women today could be the fact that CS careers are increasingly perceived to be lucrative [28] and that the gender pay gap is low compared to other professions [14].
Positive initiatives from highly-regarded organizations can create ripple effects in the tech industry, inspiring women to explore CS as a career option. Tech giants like Apple and Google publish diversity reports acknowledging the need to increase diversity, and run initiatives to reduce gender disparity in their businesses by encouraging women to explore CS and succeed with the learned skills. Furthermore, in a study by Google to assess the indicators involved in a woman's decision to pursue a CS degree, four key factors were revealed [21]:
• Social Encouragement: positive response from family and friends.
• Self-perception: belief that critical thinking and problem solving skills can provide a successful career.
• Academic Exposure: opportunities to participate in curricular and/or extracurricular CS courses/activities.
• Career Perception: positive thinking towards CS as a career with societal impact.

All over the globe, measures to encourage more girls to pursue CS studies are implemented. There have been several successful efforts such as the MIT Women's Technology Program (WTP), a program running since 2002 with the goal of increasing high school girls' interest in studying engineering and CS. In a week-long summer camp, girls reported that females were under-represented (only 17% of the total participants) and that they would have liked to work together with more girls [50]; previous results also suggest that girls-only camps are friendlier and more engaging for them [50]. At the Norwegian University of Science and Technology (NTNU), Kodeløypa (which means "the pathway towards coding"), a program running since 2014 [32], introduces boys and girls to programming by encouraging them to playfully interact with digital artifacts and create their own games with Scratch. Such artifacts allow students to learn by iteratively testing and rebuilding their designs while learning programming concepts. The study reported in [32] presents the results of an empirical investigation of students' attitudes and identifies potential differences between male and female students. In all the attitudes, male students' scores were significantly higher than female students', with the greatest difference in the intention to participate in a similar activity. The EUGAIN project aims at improving gender balance in Informatics at all levels through the creation of a European network of colleagues working at the forefront of the efforts for gender balance in Informatics in their countries and research communities [18]. Figure 1.1 shows the winner of the Minerva Award during the workshop organized by EUGAIN in cooperation with Informatics Europe. The Minerva Informatics Equality Award recognises best practices in departments or faculties of European universities and research labs that encourage and support the careers of women in Informatics research and education. At NTNU we also run the IDUN project [26], which proposes a framework to inspire female researchers on their path from PhD to professor.
Figure 1.2 shows a picture of an event in the IDUN project. Further, Wang and colleagues suggest that girls may get more out of science and maths lessons if these are taught through storytelling and gamified lessons, since stories make the lessons more relatable [53]. The studies reported in [11, 29] introduce girls to stories and events that can influence their professional choice at an early age. Research also suggests that the focus of game-based interventions should be on interest enhancement [49], a suggestion backed up by the findings of Sadik and colleagues [39]. Finally, integrating a game design project into participants' curriculum can be used to evaluate girls' interest in the topic and might also impact their future CS choices [46].

1.2.3 Role-Models

Another strategy is to introduce girls to female role models in CS [4, 22]. Black and colleagues [4] cite Güre and Camp [22] in stating that the lack of female role models in CS is one of the most detrimental factors leading young girls to stop relating to CS; hence, a female role model can motivate them to study CS or pursue a career in the field [4]. More specifically, a project called CS for Fun (CS4FN [16]) has produced and made freely available online a booklet that showcases female role models and their groundbreaking work in CS. CS4FN [16] aimed to address one of the biggest hurdles for women considering CS as a viable field of study: the lack of female role models [4]. Concerning toys that girls relate to, in 2010 Mattel, Inc. announced that the next Barbie would be a computer engineer, as this profession got the most votes (out of five options: architect, news anchor, environmentalist, surgeon, and computer engineer). Cheryan and colleagues argue that it might be difficult for women and girls to pursue fields with masculine cultures (beliefs and values encouraging and rewarding masculine characteristics in men), and that efforts to increase female participation in CS might therefore benefit from changing those cultures and providing students with early experiences that signal equally to both girls and boys that they belong and can succeed in these fields [12]. Similarly, in a survey by Martincic and Bhatnagar on women's attitudes towards the Computer Engineer Barbie, 75% of the participants agreed that the doll could influence a girl's decision to enter the field of CS [29]. Martincic and Bhatnagar argue that such toys can be viewed as experimental tools for children's future occupational choices [29].

Fig. 1.1 The EUGAIN Chair Letizia Jaccheri (on the left) gives the Minerva Informatics Equality Award to Claire Ordoyno (on the right), who received it on behalf of the EPSRC Centre for Doctoral Training in Robotics and Autonomous Systems (UK)



Fig. 1.2 An IDUN project mentoring event led at NTNU by Letizia Jaccheri, the project leader (photo by Kai Torgeir Dragland)

We can observe that previous research has tackled the problem of improving and maintaining women's interest in CS studies and/or careers using a multitude of strategies. In this chapter, we contribute to these efforts by making the diversity of female computer scientists' career paths visible through storytelling.

1.3 Our Approach and Method

The main purpose of this work is to make female computer scientists more visible in the diversity of their possible career paths, and to help other women on their academic pathways. With this goal in mind we developed a storytelling framework, using the auto-ethnography approach to collect data [25] and thematic analysis (inductive mode) to identify the main emerging topics. We decided to use auto-ethnography after a thoughtful reflection on the need to provide a description of experiences that includes different types of paths, different cultural and educational backgrounds, highlights interdisciplinarity, reports different levels of experience (e.g. academic seniority) and different socio-economic-geographic settings, and covers different CS disciplines. Auto-ethnography has a long tradition in HCI [34] and has also been applied specifically to investigate gender issues in technological infrastructures [45]. Likewise, the use of storytelling to investigate gender identity has been extensively explored in social science [3]. Narrative is a way for human beings to give meaning to our world and make sense of reality [8]; thus storytelling seemed the most suitable form for collecting our experiences. We expected that auto-ethnography and storytelling would help depict the richness of our histories as well as highlight the variety of elements that characterised our academic career paths in CS. After these considerations, we decided to start from our own stories, as this seemed the most sensitive way to unveil gender-related issues within an academic career path. We identified a set of questions (see Table 1.1) and asked each participant to answer via email. Each story was transcribed in a document and further analysed. In the following, we present the results of the analysis. Based on those results, we propose a storytelling framework to support outreach and educational activities around gender in CS.

Table 1.1 Interview questions
1. What was your motivation to go into CS?
2. How and when did you first encounter CS?
3. What is your field of research and what are your main research projects about?
   • Do you have experience with AI?
   • Do you have experience with robotics?
4. Did you have any role models when you started?
5. What was your picture of your future profession? Did you have a "dream" of being able to do something, answer a question, solve a problem, or similar, that was not directly connected to one specific role model but to the picture of a profession and its role, or of a research field?
6. What was your ideal of a good life, professional and otherwise?
7. Were there any obstacles on your path?
8. What is your research vision?
9. What are the core values that support your vision?
10. Who are your role models today?
11. If you had to start all over, what would you tell your younger self?
12. What would you recommend to your colleagues to adopt a more inclusive behaviour towards colleagues of other genders (female vs male, male vs female, and including fluid gender)?
13. Looking from a systemic and global perspective, what do we need to improve the gender balance in CS?

1.4 Analysis and Results

Each author conducted the self-interview by writing down the answers; afterwards, two authors analysed the interviews in the vein of thematic analysis (inductive approach) using the NVivo software. One researcher made a first pass of coding the interviews, which was subsequently revised by a second researcher in blind mode. The initial tagging produced 68 codes, which were refined and tuned in further sessions (see Table 1.2). Finally, these were grouped into nine macro themes: Visions, Values, Motivation, Communication, Processes, Research in CS, Success, Threats, and Suggestions.

Table 1.2 Themes and codes
• Communication: communication
• Motivation: external motivation, internal motivation, meaningful activity, motivation, connection with other disciplines, mentoring, new challenge, scholarship to study CS, role models, good teachers, scientist, spiritual
• Process: family imprint, life as a process, spontaneous process
• Research-CS: AI in our research, affected by AI biases, AI applied to research, AI ethics, AI literature, at the university, cognitive systems, CS as a new language of nature, CS first encounter, research topic, study AI, teaching AI
• Successes: curiosity for the topic, HCI fascination, humanities study, interdisciplinary research, positive experience studying CS, successful experience
• Suggestion: improving gender balance, changing atmosphere, cultural norms, educate, focus on humans, impact on society, interdisciplinarity, risk of AI gender biases
• Role-models: recommendations for colleagues, consideration, language, listening, safe space, sharing and collaborating, speak up, suggestions
• Threats: unfair treatment, CS not appealing as teenagers, external judgment, obstacles, external obstacles, internal obstacles, performance pressure
• Values: confidence, inclusiveness, independence, intuition, open mindedness, respect, values
• Vision: being a teacher, ideal of a good life, normal life, personal vision, vision, vision for future profession, vision in software engineering
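As an aside for readers who wish to replay this kind of inductive analysis outside NVivo, the core step of grouping coded excerpts into macro themes can be sketched in a few lines of Python. The code-to-theme mapping below uses a small subset of Table 1.2, but the tagged excerpts, the function name, and the overall structure are illustrative stand-ins, not the actual study data or tooling.

```python
from collections import Counter

# Illustrative code-to-theme mapping (a small subset of Table 1.2).
CODE_TO_THEME = {
    "external motivation": "Motivation",
    "role models": "Motivation",
    "family imprint": "Process",
    "internal obstacles": "Threats",
    "inclusiveness": "Values",
}

# Hypothetical coded excerpts: (interview id, assigned code).
tagged_quotes = [
    ("Id1", "family imprint"),
    ("Id2", "internal obstacles"),
    ("Id3", "role models"),
    ("Id3", "external motivation"),
    ("Id4", "inclusiveness"),
]


def theme_frequencies(quotes, mapping):
    """Count how often each macro theme occurs across coded excerpts."""
    return Counter(mapping[code] for _, code in quotes if code in mapping)


print(theme_frequencies(tagged_quotes, CODE_TO_THEME))
```

Such a frequency count only summarises how often themes recur; the interpretive work of refining codes and naming themes remains a human, iterative activity, as described above.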

1.4.1 Themes

These themes help us see the main topics that emerged from our stories and highlight the main issues encountered (e.g. Threats and Obstacles: "The other obstacle is not trusting myself enough" Id3), as well as some similarities in the experiences we had along our career paths (e.g. all the stories have items related to Motivation and family imprint: "I was just full of optimism and self-efficacy thanks to my parents and teachers" Id3). In the following we provide a description of the themes, the related codes, and quotes from the interviews that exemplify the several facets of each topic.




Visions

Visions can relate to the past, the present, and guidance for the future. Through the interviews, vision is connected to the following codes: being a teacher, ideal of a good life, normal life, personal vision, vision, vision for future profession, vision in software engineering. This includes personal perspectives as human beings: "Contribute to something meaningful and follow my inspiration and intuition, while being playful and having fun. Keep growing, be connected with wonderful people, have a family. Inspire others. Carry on adventuring until I graduate from Earth" Id2. As well as research perspectives: "Better understanding of cognition, from basal cellular to the human-level via computational models can help us both better understand the functions of the basic living unit,- the cell, as well as phenomena of intelligence and decision-making under constraints" Id5.


Values

The word and concept of values is recurrent and relates to different aspects: confidence, inclusiveness, independence, intuition, open mindedness, respect, and principles. These values can refer to suggestions on how to build a more inclusive research environment: "First, by offering other cultural models and showing that empathy, fairness, strength can be as efficient as arrogance, egocentrism and competitive behaviour. Giving more space to these people in academia and making prizes that reward this behaviour" Id4. Or to values that drove the career path in the past: "the hope to find a job and become independent" Id1; "respect and open mindedness are crucial ingredients for collaboration that is the way to success" Id3.


Motivation

Motivation includes all the reasons that move us to pursue a CS academic career path, such as: external motivation, internal motivation, meaningful activity, connection with other disciplines, mentoring, new challenges, a scholarship to study CS, role models, good teachers, scientists, and spirituality. Quite often external incentives moved us: "I was simply going to get a job and make my mum happy" Id3. In other cases role models played an important part, either from family, an "uncle who was an electronic engineer and fond of electronic gadgets" Id5, or researchers, such as "Brené Brown a social scientist well known for her research on vulnerability and leadership, bridges research, training practitioners, and public outreach—also a really fun human, dedicated family person, and fantastic community builder" Id2. Sometimes even bad experiences constituted a stimulus to go ahead and make things different: "The toxic environment where I did my PhD negatively impacted on my career path and my self-esteem. I had to work very hard to recover and to recognise those behaviours as misconducts" Id4.




Communication

Communication has been seen as one of the most important aspects in creating an inclusive and respectful environment: "By using a more thoughtful language. Thoughts are expressed using words, our language has an impact on others. Language can create new neurologic processes and, in the long term, can change our mental models" Id4. In addition: "First, educate yourself on common pitfalls (so you have the vocabulary to name and communicate it if something is off)" Id2.


Process

Process concerns family imprint, life as a process, and in particular spontaneous processes. Regarding the first point, family has an impact on the way we grow up: "I come from a family of athletes and my sisters and I have been doing competitions in swimming and running since a very early age. My father ran his last marathon 5 months before dying of cancer. We were also encouraged to be good at school, well dressed, slim, everything. When I look back and I listen to the discussion about how to raise children now, I wonder how we survived" Id1. Moreover, life has been seen as a process of "navigating dynamic white waters of real-world complex constraints, enjoying it, in spite of an effort of keeping balance between reasonable amount of control and unpredictability. That applies to both profession and real life - a dynamical balance between opposite constraints" Id5.

Research in CS

Research in CS includes several topics relevant for this study, such as: AI in our research, affected by AI biases, AI applied to research, AI ethics, AI literature, at the university, cognitive systems, computing as a new language of nature, CS first encounter, research topic, study AI, and teaching AI. Some of us had the purpose of studying CS from the beginning: "Many of my peers were coming from technical schools and already knew how to program but I learned quickly and was always very comfortable sharing my strengths in maths and logics" Id3. For others it was a casual encounter: "the first year at the university and it was through the passion that a professor demonstrated about the subject" Id4. Approaching CS was also seen as a natural extension of previous knowledge and background: "I started to believe that computing is a new language of nature even more versatile than mathematics, that can help us build a new "real-time" natural philosophy where humans will be the organic part of nature, not the outside, "objective" observers" Id5, broadening the understanding of the concept of computing into multi-disciplinary models involving humans. Others broadened the approach to applications of CS to other domains: "CS and security, CS and creativity, CS and startups, and one in digital innovation" Id1.




Successes

Success stories include several items, such as: curiosity for the topic, HCI fascination, humanities study, interdisciplinary research, positive experience studying CS, and successful experience. The integration of different disciplines is definitely one of the most common across the interviews. Here are some examples: "I try to combine CS and art, CS and social innovation, CS and children, now CS and gender" Id1; "I am also proud of having a classic background as it gives me a different perspective on CS" Id3; and "application of psychology in the design of objects and technologies was extremely fascinating for me" Id4. Often success stories are related to external visibility: "I applied for grants, got my own projects and students, and became visible again" Id3.


Threats

Threats emerged as having different natures, including: unfair treatment, CS not appealing for teenagers, external judgment, and obstacles, both external and internal. External and internal threats have different impacts on our career paths. For instance, unfair treatment by colleagues can have a severe effect on our self-esteem but also on our access to resources: "Colleagues treated me like I were a wife looking for a side job, and showed no interest in my research, as they told HCI was not Informatics. The lowest point was the day I was not allowed to attend a seminar as it was only for professors" Id3. Being in academia also makes us feel judged: "I have been almost overwhelmed by this role model concept and paralyzed that I have to behave as role model" Id1. However, most of the obstacles come from our internal judgments: "The other obstacle is not trusting myself enough. We have an inherent wisdom inside that guides us, and sometimes that voice gets drowned by the noise from outside (shoulds, have-tos, well-meant advice). We always have a choice, even though sometimes we may not like the options, but I have learned that for every committing Yes I give that means saying No to other opportunities, so now I am more aware of what I say Yes to" Id2.


Suggestions

Suggestions come from our personal reflections on how we could change the current context, and from our past experience, in order to create a healthier and more supportive environment. Quotes were coded as: improving gender balance, changing atmosphere, cultural norms, educate, focus on humans, impact on society, interdisciplinarity, risk of AI gender biases, role models, recommendations for colleagues, consideration, language, listening, safe space, sharing and collaborating, speak up, and suggestions. Suggestions include educating academics to use a more inclusive language, and allowing them to develop and grow more equal cultural norms that address diversity and interdisciplinarity: "To look at the problem as a structural and cultural issue which can have different nuances across countries and cultures, but that is grounded on a power mechanism. We need to demolish this mechanism of power in different ways and at different level" Id4, and "We must also work within different cultural norms and globally" Id5. It is a step forward towards creating an academic work environment oriented to listening to each other, focusing on humans, and having safe spaces to speak up: "speak up for yourself or for any observed toxic behaviors in a kind and matter-of-fact way and always assume that people do the best they can" Id2.

1.4.2 The Role of AI

Before presenting the resulting framework, we wish to address the role of AI within this piece of research. AI has played different roles in our individual paths. Some of us encountered AI when already established in their career: "When I was department head, we started an AI lab in cooperation with local IT" Id1. For others it was inspirational while still at university, "AI was one of my electives and loved using Prolog and LISP to write simple parsers, I find it amazing AI is coming back so strongly now", and a way to feel recognised and legitimated too: "then when I took the AI exam they offered me to work on my final project in a European research centre, and after that I started my doctoral research and build my academic career" Id3. Nonetheless, AI is not only seen as a fascinating subject with great potential; we also approach it in a more critical way, especially when it comes to teaching and preparing the next generations, as we are exposed "to some of its biases in ways that make me cringe ... but I see a lot of opportunity it can bring if we teach our software engineers and data scientists to adequately acknowledge and deal with the responsibility they have for the long-term impacts of the systems they put in place" Id2. The same critical perspective proved valuable when reflecting on the impact AI has on research, with a focus on "AI ethics and robotic ethics" and asking "in what way can intelligence, both natural and artificial be understood in terms of computing" Id5, and then moving on to more practical considerations on "children's fears and wishes about AI using AI critical literacy" to guide the design of innovative solutions with and for children, Id4.

1.4.3 Dimensions

One way to explore our stories in their richness is to look at the kinds of Interactions and Practices, as described in [31], that made us the way we are: they gave us the motivation to start the adventure of pursuing CS, kept us focused, supported us during the hard times, enabled us to see the beauty of our dream, and thus somehow fostered retention in CS for the authors of this chapter. Therefore, we look into the roles played by peers, faculty, and family/friends to account for interactions, and match the practices listed by Pantic [31] with those reported in the interviews.


Interactions

Our stories point out the importance of family in providing positive emotional, educational, and practical support. In particular, we read about mothers as inspirations and role models, and uncles and godfathers providing school guidance and educational support. Faculty and teachers we met at different stages of our lives are also positive inspirational figures, as are peers, who also provided a sense of legitimacy by acknowledging our worth and, more importantly, helping us recognise it too. Most of the quotes reporting positive or memorable interactions belong to the "Inspirations, Supporters, Facilitators" and the "Process" themes, showing how examples played a strong role in guiding us towards pursuing a career in CS.


Practices

Most interviewees commented on "good classes", often run by the "good teachers" mentioned above, as the trigger for their interest in CS. For instance, attending an introductory course on AI was a major turning point for one of the authors, who then decided to embrace a career in academia. Facing and overcoming obstacles made us more inclined to take on challenges, to the point that some of us looked explicitly for complex, challenging situations to prove our worth to ourselves. We all had to accept abandoning perfectionism and instead set more realistic goals for ourselves. But what made us stay and try harder was finding jobs that were stimulating and challenging and that helped us be acknowledged as members of the CS community, which in turn made us want to help others feel the same. Both the nine themes and the two dimensions are used as structuring concepts for a framework that aims to make female computer scientists more visible in order to show a diversity of possibilities.

1.4.4 Storytelling Framework Our analysis showed how diverse an academic career path in CS could be for women, it highlights several gender-related aspects that impacted on shaping the pathways (e.g. family caring tasks). Some elements are recurrent across the five stories, and some were disruptive while others soothing our pathways. From these outcomes we elaborated a framework, Table 1.3, that has the purpose to support the use of storytelling as a regular exercise to reflect upon our paths, how far we have come, and where we aim to head for the future. The themes are the ones that emerged in our


E. Rubegni et al.

Table 1.3 Storytelling framework based on interview themes (for each theme: Interactions and Practices)

Values
  Interactions: What are the values you base your life on? How do you live into these values? Who in your life has shown you integrity around the values they uphold?
  Practices: Reflect on values and choose three to focus on living into for a period of time. Revisit [2, 40]

Visions
  Interactions: Have you seen glimpses of a better world, instances where you came across something you wanted to contribute to? How can computing be supportive here?
  Practices: Take time to journal about possibilities and paths. In writing, more insight unfolds [20, 44]

Motivation
  Interactions: Have you experienced scenarios where you found yourself engaging naturally and effortlessly? Who were/are your role models? Which supporters do you have in your life? What moves you?
  Practices: Research and find people who inspire us. Engage with inspirational content on a regular basis. Morning inspiration sets the mood for the day, which influences our efficacy and output [17, 42]

Communication
  Interactions: Where in your life have there been instances of communication gone well or gone badly? Which elements of communication contributed? Language, gesture, facial expression, medium?
  Practices: Practice active listening in all conversations. Listen as intently as you wish to be heard. Learn empathic communication or non-violent communication [36]

Process
  Interactions: Which steps have helped you along the way in developing your career so far? Which activities do you engage in for your work that contribute to your expansion? How does your personal background and family imprint sabotage or support these processes?
  Practices: Take one hour at the end of the week to reflect upon the past week, what went well and what can be improved, and to plan the week ahead. Meta-reflection on processes and activities shows to pay off within a matter of weeks [10]

Threats
  Interactions: What are the major obstacles and challenges you perceive in your life? Which of those are external and which are internal?
  Practices: Take an hour to reflect upon any fears and concerns that you have in your life. Spend time with each of them to draw out alternatives if they surfaced and came true.

Successes
  Interactions: What successes have confirmed your intentions of pursuing your goals? What do you experience as your most significant accomplishments? How are these connected to your values?
  Practices: Celebrate your successes. We often look only at what remains to be done, when there is also ground we have already covered. Make a list of achievements [24]

Research
  Interactions: What are the research contributions you have made or aspire to make? How do you wish to serve the world?
  Practices: Reading about research and having reflective conversations around new research methods and adjacent fields sparks the imagination [35]

Suggestions
  Interactions: What was the best advice you ever received? What suggestions have you found helpful? What would you suggest to your past and future "you"?
  Practices: Deep down we already know the answers. Write a letter to your younger self with three pieces of advice to give [48]

1 Owning Your Career Paths: Storytelling to Engage Women …


analysis, as well as the two dimensions of interactions and practices. The prompts for exploring them stem from a variety of resources for personal inquiry and growth. We suggest using the prompts and activities in personal reflective practice as well as in workshop settings with students and the general public.

These items are not equal to each other; for instance, values should be set up to drive the rest: values are included in the vision that is implemented in the research. Values are embedded in the motivation, and they drive the communication and the process. At the same time they emerge as outcomes of the process, for example via family imprinting. Values also provide lenses through which to look at threats and successes, and they may be the foundation for the suggestions. Moreover, values are not fixed entities but can change over time. In fact, they have to be flexible to change, as they have to represent and respect ourselves and drive our actions in the world.

The vision prompts aim to trigger a reflection on our broader impact and on the potential paths that could bring us there. The motivation is also connected to the vision and concerns finding the main reasons that guide us across our journey; it could be related to a role model or to our inner inspirations. Communication is essential to succeed in any career path, or simply to be a decent human being; in particular, it is worth reflecting on our ability to listen (and not just with our ears) and to answer using empathetic language. A deep consideration of the processes that advance or hinder our career is a good practice that leads to a more conscious pathway. Threats and successes can be considered two sides of the same coin, as understanding our obstacles (internal and external) allows us to make a plan to achieve more success. Moreover, the latter should be assessed in terms of our vision and values.

Constantly assessing how our research contributes to the community, and how far we are from our vision, is a good way of reflecting on this item. Finally, reflecting on the suggestions we received (from others and from ourselves) and on how these contributed to our current status is a good practice of self-assessment. All these parts are interconnected and support one another; thus the framework should be considered a holistic approach to supporting self-reflection on an academic career path.

Uniqueness for Women in IT. The framework was built from a reflection that had women as principal actors, and their narratives contain unique elements that are specifically related to women. Nevertheless, this framework can be used by any human being in general and computer scientists in particular. All choices at work and in our career are driven by our beliefs. In some cases this drive is more directly visible than in others, such as choosing to study how design can help achieve better gender balance in CS, or accessibility for all. Specifically, as both IT and STEM academia tend to be male-dominated environments, a reflection framework can open up creativity. In addition, as we are not defined by these two contextual conditions, the framework is value-specific and bases all other reflections on the foundation of individual values. The framework was created from women's histories with the purpose of supporting other women in creating and developing their own narratives, so that they feel empowered to design and develop their own paths in academia. The framework is



shaped by considering the diversity of women's pathways in CS and by embedding the peculiarities of these paths. The strength of the framework lies in the way all these features are transformed into a method that allows the creation of a narrative for shaping a career pathway. In doing so, it brings a strong message that there are many and diverse ways to be successful in CS. The final purpose of the framework is to support women in being more assertive in the way they freely design their career paths to suit their needs, inclinations and preferences, and to own and be the main agents of their pathways into CS academia. This reflection tool aims at creating a safe space for women to take agency over and ownership of their careers.

Is it possible to make the framework more specific to women in CS? When grappling with the question of how we could make it more specific for this particular audience and purpose, we found several aspects mostly linked with CS values, and with the urge to conform, at least initially, to what was perceived as one monolithic set of principles, somehow defined by a male-dominated community where total dedication to technology seemed to be a must. First and foremost, the framework naturally deals with the need for legitimization and recognition endemic in the CS community. Being members of the CS community was clearly an important achievement in our early careers; by looking at the two dimensions of interactions and practices we can see how we strove to prove our worth to ourselves and to others around us. But then we each developed our paths in different, fulfilling and at times unorthodox ways, and in so doing realised that there are many ways to be part of the same CS community, and of its many sub-communities, each with its own philosophy and modus operandi, shared values and ad hoc ways to communicate them.

Therefore, our framework is naturally infused with a variety of ways to be a member of the CS community while keeping our own principles and views. There is a differentiation between working life and private life, and yet a common set of values will underlie both of them if the person has a sense of congruence. For example, we bring our values with us everywhere, but sometimes the environment around us, people and work ethics, may not be accepting of them. Still, when we reflect on our research and our attitude towards teaching and supervising students, these values may reflect who we are and choose to be in the world. That would mean we are in congruence within the different contexts: we are in tune value-wise, or in alignment.

1.5 Discussion and Impact

1.5.1 Tie-in with Related Work

As recent research [27] convincingly argues, there is ample evidence of gender bias in the whole of academia. Although it is well documented that engineering and computing [15] are not gender balanced, there is a lack of agreement about the underlying causes, the critical barriers faced by students, educators, researchers, and practitioners, and the interventions and practices that may help. Moreover, the community tends to focus on one diversity dimension without considering it in combination with other aspects. In order to question and counteract the stereotypes in the field of computing, in a similar way as envisaged by [37] for children, our article illustrates the diversity of paths female students can take in pursuing a career in CS. It also exemplifies possible barriers and hurdles, ways around them, and motivating factors. The strategy of using female histories as role models in CS has been used in the past [4, 22]. Indeed, the lack of female role models in CS [4, 22] is still an issue, and it has been recognised as one of the most detrimental factors causing young girls to stop relating to CS; hence they require a female role model to be motivated to study CS or pursue a career in the field [4, 11]. More specifically, the project CS for Fun (CS4FN [16]) can be mentioned, which has produced and made freely available online a booklet that showcases female role models and their groundbreaking work in CS. CS4FN [16] aimed to address one of the biggest hurdles for women considering CS as a viable field of study, that is, the lack of female role models [4]. In this perspective we aim at providing a more complex and yet reachable depiction of women's pathways, with the intention of giving back not just role models but a rich set of interactions and practices that could inspire other women without forcing them onto a predefined path or scaring them by portraying unobtainable standards.
Likewise, our framework supports the implementation of the four key factors involved in a woman's decision to pursue a CS degree [21]: Social Encouragement, Self-perception, Academic Exposure, and Career Perception. The framework's elements aim at supporting females in CS by prompting them to reflect on past, present, and future interactions and practices. For instance, the Social Encouragement factor is connected with Motivation and Suggestions; as we pointed out, the social context (including family imprinting) is an important driver to support and encourage women in their career paths. The Self-perception factor is related to Values but also to Threats and Successes, while Academic Exposure is connected with the ability to Communicate and to put in place fruitful processes. Finally, the Career Perception factor is connected with the Vision and the Research implemented. Thus, it would be possible to use the framework to support a woman's decision to pursue a CS degree.

1.5.2 The Significance of Role Models

The strategy of using these female histories as role models in CS is not new, see [4, 22]. Indeed, Black and colleagues [4] cited Gürer and Camp [22] to state that the lack of female role models in CS was found to be one of the most detrimental factors causing young girls to stop relating to CS; hence they require a female role model to be motivated to study CS or pursue a career in the field [4]. More specifically, the project CS for Fun (CS4FN [16]) has produced and made freely available online a booklet that showcases female role models and their groundbreaking work in CS. CS4FN [16] aimed to address one of the biggest hurdles for women considering CS as a viable field of study, that is, the lack of female role models [4].

Role models can inspire us; they are not there to confine us or put us into new boxes. With all due respect to the great role models that become available once the search is started, we also wish to acknowledge that inspiration by a role model, and a vision resulting therefrom, may be there or not, and both are fine. There are many successful professionals who discovered their path one step after the other. Passions can be known, or developed. As Newport describes in his book 'So Good They Can't Ignore You' [30], passion can be a result of deliberate practice over time and comes as the fruit of the labour of success. One of the authors of the chapter at hand sometimes still wonders how she ended up becoming a CS professor. Multiple stories showing the diversity of academic career pathways can enrich existing role models and inspire the creation of new ones. Through the framework we aim at helping everyone to build their own story and become a role model themselves.

1.5.3 Threats to Validity

In terms of limitations we have identified three main validity issues: construct, internal and external. In the following we provide details on each.

Construct validity: in the methods section we extensively explained the motivation behind the use of auto-ethnography as a way of producing a self-reflective output and creating storytelling.

Internal validity: the two researchers who coded the interviews were also interviewees, which introduces a potential bias. However, we made sure that the primary coding of an interview was done by a researcher who was not the interviewee.

External validity: our results are not intended to be generalized but to show one expression of how the experience of female researchers in CS can manifest. Therefore our only measure is whether the results are useful to trigger discussions and inspire action in settings where gender in CS is a relevant topic of conversation; see the following subsection.

1.5.4 Outreach and Dissemination Opportunities

We discuss how this work can inform science communication, dissemination to the public, community engagement, education and outreach to increase diversity within CS and STEM in general. One of the initial hunches we had in this research was that our stories might be useful in several ways:



• Use the stories as warm-up or illustrative scenarios to speak about gender in CS and the associated benefits and challenges. This might help admissions officers who cannot share their own story.
• Use the range of experiences and development paths to show the breadth of what is possible. Research has shown that exposure to role models helps to reduce the gender gap in STEM [5] and in leadership [41].
• Use the examples to discuss challenges when people are not (yet) comfortable sharing their own stories, especially in work environments that have not yet established a safe space for such conversations.
• Use the framework to allow women who are pursuing, or willing to pursue, an academic career to reflect and make a plan for their path. Questions about interactions and practices aim at making them reflect on their past and present situation towards the definition of a clear future pathway.
• Develop a workshop to craft one's own stories. It is possible to use the questions we had for our interviews, and to propose additional questions.

Stories depend on our personal upbringing, so they are strongly influenced by cultural contexts. As all researchers involved in this series of interviews were raised and predominantly work in Europe, there is a limited range of experiences represented. We encourage readers to run their own focus groups with a cohort of like-minded interested researchers.

1.6 Conclusions and Future Work

The rise of new technologies, in particular AI, and their convergence with the physical world is affecting millions of citizens, companies, and social and governmental organizations. New technologies have triggered a global race for investment, talent, knowledge and research. However, this development also increases the diversity gap, which, despite the efforts made to address it, is still present. Filling this gap is an important step for the development of the field, as the lack of diversity in CS can cause serious limitations. We address this issue by leveraging storytelling to break stereotypes and disseminate diversity in career pathways.

In this chapter we presented a study whose outcome is a conceptual framework for supporting women in their career paths by offering cues for storytelling, internal inquiry and reflection. The framework was derived from an auto-ethnographic study conducted on the histories of five women in CS (the authors). The stories were studied through thematic analysis and the findings were elaborated into a framework. The framework can be used in numerous contexts. For instance, it can be a helpful resource for women when planning their career path. Additionally, it can prove useful in outreach and workshops about gender balance in CS when participants may be new to the opportunities and challenges of this topic, or when in a setting where



participants may shy away from revealing personal experiences with their challenges in gender balance. This work offers the unique opportunity to highlight, for five women in CS academia, the importance of a personal and cultural background as a legacy on which to build future perspectives.

Considering that all interviewees are in academia, this framework would strongly benefit from more variety, for instance by including the stories of CS alumni who went into industry or who worked in other areas of computing. Moreover, as the authors are all from European countries, we will also include stories from outside Europe. In the future, this direction would offer the opportunity to broaden the impact and to strengthen the framework. Additionally, in order to assess and improve the framework we aim at organising a first workshop in which we will use it with women in CS for the purpose of creating their narratives and owning their career paths. For instance, the workshop could use the dimensions and prompts of the framework to ask the participants to create their own story as a starting point for building a bigger vision and a more detailed plan. We expect that using the framework in a concrete context, in which we can observe its implications, will provide concrete feedback that helps to enhance it. These two actions (including data from more histories, and usage in context) will support its improvement and broaden its scope.

Another potential development for the future concerns the gamification of the framework [11, 12, 29], as gamification can influence professional choice at an early age. Storytelling and gamification together have the potential to appeal to a young audience and to help induce behavioural change, as discussed in Rubegni et al. [37].
The study reported there was followed by an exercise in the explicit use of gamification, described in a related Master thesis [9], in which a number of game-based activities were devised to help female students in group bonding and to boost creativity, and which resulted in the production of more gender-balanced and less stereotypical multimedia stories [38]. As stated by Chimamanda Ngozi Adichie in her TED talk "The danger of a single story" (2009), providing more than one story can increase diversity and eventually help build a more inclusive society in which women have equal voice and power in shaping our reality, as academics as well as human beings.

Acknowledgements This work has been partially supported by the EUGAIN COST Action CA19122 European Network for Gender Balance in Informatics and by the NFR 295920 IDUN project. Part of this research is financed by the Area of Advance ICT at Chalmers University of Technology under no. C-2019-0299. Finally, we would like to thank Nikolas Dickerson for the final feedback on the chapter.



References

1. Albusays, K., Bjorn, P., Dabbish, L., Ford, D., Murphy-Hill, E., Serebrenik, A., Storey, M.-A.: The diversity crisis in software development. IEEE Softw. 38(2), 19–25 (2021)
2. Barrett Values Center: The Barrett Model (2021)
3. Birrell, S., McDonald, M.G.: Break points: Narrative interruption in the life of Billie Jean King. J. Sport Soc. Issues 36(4), 343–360 (2012)
4. Black, J., Curzon, P., Myketiak, C., McOwan, P.W.: A study in engaging female students in computer science using role models. In: Proceedings of the 16th Annual Joint Conference on Innovation and Technology in Computer Science Education, ITiCSE '11, pp. 63–67. ACM, New York, NY, USA (2011)
5. Breda, T., Grenet, J., Monnet, M., Van Effenterre, C.: Do female role models reduce the gender gap in science? Evidence from French high schools (2021). halshs-01713068v5
6. Brown, K.V.: Tech shift: More women in computer science classes. https://www.sfgate.com/education/article/Tech-shift-More-women-in-computer-science-classes-5243026.php (2014). Accessed 10 Dec 2018
7. Bruner, J.: Life as narrative. Soc. Res. 11–32 (1987)
8. Bruner, J.: The narrative construction of reality. Crit. Inquiry 18(1), 1–21 (1991)
9. Budakovic, J.: Using design for fighting gender stereotypes in digital storytelling for children. Master Thesis, USI (2019)
10. Burchard, B.: High Performance Habits: How Extraordinary People Become that Way. Hay House (2017)
11. Cheryan, S., Master, A., Meltzoff, A.: Cultural stereotypes as gatekeepers: Increasing girls' interest in computer science and engineering by diversifying stereotypes. Front. Psychol. 6, 49 (2015)
12. Cheryan, S., Ziegler, S.A., Montoya, A.K., Jiang, L.: Why are some STEM fields more gender balanced than others? Psychol. Bull. 143(1), 1 (2017)
13. European Commission: She Figures (2018)
14. Women in computer science: Getting involved in STEM (2017). https:// Accessed 20 Nov 2021
15. Corbett, C., Hill, C.: Solving the Equation: The Variables for Women's Success in Engineering and Computing. ERIC (2015)
16. cs4fn: Computer science for fun
17. Drury, B.J., Siy, J.O., Cheryan, S.: When do female role models benefit women? The importance of differentiating recruitment from retention in STEM. Psychol. Inquiry 22(4), 265–269 (2011)
18. EUGAIN: EUGAIN COST Action CA19122 European Network for Gender Balance in Informatics (2021)
19. Informatics Europe: Informatics Europe Higher Education Data Portal. https://www. (2020)
20. Goldberg, N.: Writing Down the Bones: Freeing the Writer Within. Shambhala Publications (2005)
21. Google: Women who choose computer science: what really matters. https://static. pdf May (2014). Accessed 20 Nov 2021
22. Gürer, D., Camp, T.: Investigating the incredible shrinking pipeline for women in computer science. Final report, NSF project 9812016 (2001)
23. Happe, L., Buhnova, B.: Frustrations steering women away from software engineering. IEEE Softw. (2021)
24. Happer, H., McCreadie, J., Aldgate, J.: Celebrating Success: What Helps Looked After Children Succeed. Social Work Inspection Agency (2006)
25. Jones, S.H.: Autoethnography. The Blackwell Encyclopedia of Sociology (2007)
26. Jaccheri, L., Pereira, C., Fast, S.: Gender issues in computer science: lessons learnt and reflections for the future. In: 2020 22nd International Symposium on Symbolic and Numeric Algorithms for Scientific Computing (SYNASC), pp. 9–16. IEEE (2020)
27. Llorens, A., Tzovara, A., Bellier, L., Bhaya-Grossman, I., Bidet-Caulet, A., Chang, W.K., Cross, Z.R., Dominguez-Faus, R., Flinker, A., Fonken, Y., et al.: Gender bias in academia: a lifetime problem that needs solutions. Neuron 109(13), 2047–2074 (2021)
28. Lockard, C.B., Wolf, M.: Occupational employment projections to 2020. Monthly Labor Rev. 135(1), 84–108 (2012)
29. Martincic, C.J., Bhatnagar, N.: Will Computer Engineer Barbie® impact young women's career choices? Inform. Syst. Educ. J. 10(6), 4 (2012)
30. Newport, C.: So Good They Can't Ignore You. Hachette UK (2016)
31. Pantic, K.: Retention of Women in Computer Science: Why Women Persist in Their Computer Science Majors. All Graduate Theses and Dissertations, p. 7794. Utah State University (2020)
32. Papavlasopoulou, S., Giannakos, M.N., Jaccheri, L.: Creative programming experiences for teenagers: attitudes, performance and gender differences. In: Proceedings of the 15th International Conference on Interaction Design and Children, pp. 565–570 (2016)
33. Prado, R., Mendes, W., Gama, K.S., Pinto, G.: How trans-inclusive are hackathons? IEEE Softw. 38(2), 26–31 (2021)
34. Rapp, A.: Autoethnography in human-computer interaction: Theory and practice. In: New Directions in Third Wave Human-Computer Interaction: Volume 2, Methodologies, pp. 25–42. Springer (2018)
35. Ritter, S.M., Van Baaren, R.B., Dijksterhuis, A.: Creativity: the role of unconscious processes in idea generation and idea selection. Thinking Skills Creativity 7(1), 21–27 (2012)
36. Rosenberg, M.B., Chopra, D.: Nonviolent Communication: A Language of Life: Life-Changing Tools for Healthy Relationships. PuddleDancer Press (2015)
37. Rubegni, E., Landoni, M., Jaccheri, L.: Design for change with and for children: How to design digital storytelling tool to raise stereotypes awareness. In: Proceedings of the 2020 ACM Designing Interactive Systems Conference, pp. 505–518 (2020)
38. Rubegni, E., Landoni, M., Malinverni, L., Jaccheri, L.: Raising awareness of stereotyping through collaborative digital storytelling: Design for change with and for children. Int. J. Human-Comput. Stud. 157, 102727 (2022)
39. Sadik, A.: Digital storytelling: A meaningful technology-integrated approach for engaged student learning. Educ. Technol. Res. Develop. 56(4), 487–506 (2008)
40. Schwartz, S.H.: An Overview of the Schwartz Theory of Basic Values. https://scholarworks. (2012)
41. Sealy, R., Singh, V.: The importance of role models in the development of leaders' professional identities. In: Leadership Perspectives, pp. 208–222. Springer (2008)
42. Sealy, R.H.V., Singh, V.: The importance of role models and demographic context for senior women's work identity development. Int. J. Manage. Rev. 12(3), 284–300 (2010)
43. Sharma, K., Torrado, J.C., Gómez, J., Jaccheri, L.: Improving girls' perception of computer science as a viable career option through game playing and design: lessons from a systematic literature review. Entertainment Comput. 36, 100387 (2021)
44. Silvia, P.J.: How to Write a Lot: A Practical Guide to Productive Academic Writing. Am. Psychol. Assoc. (2007)
45. Spiel, K.: 'Why are they all obsessed with gender?' (Non)binary navigations through technological infrastructures. In: Designing Interactive Systems Conference, vol. 2021, pp. 478–494 (2021)
46. Sweedyk, E.: Women build games, seriously. In: Proceedings of the 42nd ACM Technical Symposium on Computer Science Education, SIGCSE '11, pp. 171–176. ACM, New York, NY, USA (2011)
47. Szabó, M., Lovibond, P.F.: The cognitive content of naturally occurring worry episodes. Cognitive Therapy Res. 26(2), 167–177 (2002)
48. Szedlak, C., Smith, M.J., Callary, B.: Developing a letter to my younger self to learn from the experiences of expert coaches. Qual. Res. Sport, Exerc. Health 13(4), 569–585 (2021)
49. Tai, R.H., Liu, C.Q., Maltese, A.V., Fan, X.: Planning early for careers in science. Life Sci. 1, 0–2 (2006)
50. Urness, T., Manley, E.D.: Generating interest in computer science through middle-school Android summer camps. J. Comput. Sci. Coll. 28(5), 211–217 (2013)
51. Vinuesa, R., Azizpour, H., Leite, I., Balaam, M., Dignum, V., Domisch, S., Felländer, A., Langhans, S.D., Tegmark, M., Nerini, F.F.: The role of artificial intelligence in achieving the sustainable development goals. Nat. Commun. 11(1), 1–10 (2020)
52. Wang, L.L., Stanovsky, G., Weihs, L., Etzioni, O.: Gender trends in computer science authorship. Commun. ACM 64(3), 78–84 (2021)
53. Wang, M.-T., Degol, J.L.: Gender gap in science, technology, engineering, and mathematics (STEM): Current knowledge, implications for practice, policy, and future directions. Educ. Psychol. Rev. 29(1), 119–140 (2017)
54. Zweben, S., Bizot, B.: Taulbee survey: Total undergrad CS enrollment rises again, but with fewer new majors; doctoral degree production recovers from last year's dip. Computing Research Association (2019)

Chapter 2

Gender Differences and Bias in Artificial Intelligence

Valentina Franzoni

Abstract Artificial Intelligence (AI) is supporting decisions in ways that increasingly affect humans in many aspects of their lives. Both autonomous and decision-support systems applying AI algorithms and data-driven models are used for decisions about justice, education, and physical and psychological health, and to provide or deny access to credit, healthcare, and other essential resources, in all aspects of daily life, in increasingly ubiquitous and sometimes ambiguous ways. Too often systems are built without considering the human factors associated with their use, such as gender bias. The need for clarity about the correct way to employ such systems is an increasingly critical aspect of design, implementation, and presentation. Models and systems provide results that are difficult to interpret and are blamed for being good or bad, whereas it is only the design of such tools, and the training necessary to integrate them with human values, that is good or bad. This chapter aims at discussing the most evident issues concerning gender bias in AI and at exploring possible solutions to the impact of AI and decision-support algorithms on humans, with a focus on how to integrate gender-balance principles into data sets, AI agents, and scientific research in general.

2.1 Introduction

Artificial Intelligence (AI) and robotic systems are efficient at reasoning tasks but dumb in very specific ways, especially at what humans do at a subconscious level. There is no clear path to obtaining good AI, but evident biases should be avoided. In particular, we focus on gender bias, rooted in cultural discrimination. In this chapter we recap the main issues and discuss the past and current problems of gender bias. We discuss how to avoid their repetition in the future, starting by untangling and resolving as many as possible, and planning a better environment in which the remaining ones can be solved over time.

V. Franzoni (B) Department of Mathematics and Computer Science, University of Perugia, Via Vanvitelli 1, Perugia 06123, Italy e-mail: [email protected] © Springer Nature Switzerland AG 2023 J. Vallverdú (ed.), Gender in AI and Robotics, Intelligent Systems Reference Library 235,




We can start by simplifying the classification of gender bias in Artificial Intelligence as threefold:

• An evident lack of data about women in scientific data sets. AI appears to be a matter of adult white males: e.g., in medical, social, and behavioural studies, data sets include more male than female data, leading to classification bias, especially for deep learning, algorithms used as black boxes, and other techniques with scarcely readable intermediate data.
• A problem of insufficient participation of women in science. The female point of view on the world, which has only recently been addressed, can offer additional ideas and contributions. The current patriarchal approach, still present in many countries in both academia and industry, leads to a biased approach to research and problem-solving.
• A biased use of female features in artificial agents. AI assistants (e.g., Alexa, Cortana, Google, Siri, car navigators) use female voices in a nice, accommodating, and mostly submissive way. This approach can reinforce patriarchal behaviour: in some cultures, females are regarded as assistants and inferiors, from whom complete willingness can be asked and expected.

2.1.1 Understanding the Historical Background

Artificial Intelligence (AI) has been presented as having a dual disciplinary profile: one of engineering, building machines to assist humans in both physical and intellectual tasks, and one of psychology, closely reproducing the essential features of the cognitive and emotional activity of the human brain, to support it and to contribute to mind-body research. In the course of this history, foundations were laid for AI to identify some areas of research that have remained classic, and the first so-called intelligent computer programs were presented. The machines that the pioneers of AI were thinking of were digital computers with their fundamental properties. The mathematical models behind many computer features were laid down between the fourth and the sixth decade of the nineteenth century, before the first calculators were assembled. We know that the first programmer in history was a woman, Ada Lovelace (born Byron), who laid the foundations for the programming of Charles Babbage's machine around 1840. Research programs inspired by other concepts of intelligence were carried out in the framework of Artificial Intelligence in the following century, when the logical foundations of the computability theory of the 1930s were laid, even before personal computers were built. The focus shifted from mere calculation to symbolic processing, tacit knowledge, sensorimotor skills, and capacities to adapt to the natural environment or to social interactions with other natural or artificial agents, starting from epistemological and ethical problems about the nature of intelligence.

2 Gender Differences and Bias in Artificial Intelligence


2.1.2 Examples of Relevant Women Contribution to AI

While it is easy for any of us to remember great men leading history in any era, we should instead focus on restoring to their historical context the contributions of women, which are too often passed over in silence, hoping that history can be written better in the next 50 years. In several important cases, first computer science and then AI benefited from women’s contributions. The ENIAC (Electronic Numerical Integrator And Computer) was the first general-purpose electronic digital computer, and a team of six women carried out its pioneering programming work: Kathleen McNulty, Frances Bilas, Betty Jean Jennings, Ruth Lichterman, Elizabeth Snyder, and Marlyn Wescoff [1]. Regrettably, the contributions of the women programmers who created the foundation of computer programming have received little recognition. This underappreciation arose because, at the time, machine programming was associated with human calculation and thus regarded as a clerical job for women, while the leading engineers and physicists focused on hardware design and construction, which they considered more vital to the future of computing. From Information Technology companies we now have plenty of well-known prominent female profiles: we will mention some of them as examples, with no desire to rank them or consider them better than other profiles, but focusing on the fact that they are women and their work is profoundly improving the field of computing and AI. Allie K. Miller [3] is the US Head of AI Business Development for Startups and Venture Capital at Amazon, advancing the world’s largest AI companies; she also became the youngest woman ever to build an artificial intelligence product at IBM. Dr. Fei-Fei Li is the first Sequoia Professor in the Department of Computer Science at Stanford University and co-director of Stanford’s Human-Centered AI Institute [4].
She served as director of Stanford’s AI Lab from 2013 to 2018. During her sabbatical from Stanford, from January 2017 to September 2018, she was a Vice President at Google and served as Chief Scientist of Artificial Intelligence and Machine Learning at Google Cloud. Rachel Thomas [5] is the founding director of the University of San Francisco (USF) Center for Applied Data Ethics, which, among other activities, grants diversity scholarships to people from underrepresented groups. Daphne Koller [6] is one of the two computer scientists who co-founded Coursera, the multi-award-winning massive open online course provider created in 2012 at Stanford University. Women also contributed by introducing systematic AI research in academia, often focusing on ethical issues: the Italian emeritus professor Luigina Carlucci Aiello [7] contributed to the ethical dissemination of AI-based research and is recognized worldwide. The environmental context in which AI developed in its first decades was a patriarchal university, which heavily influenced both the research setting and the data and results achieved until now.


V. Franzoni

Female genius is necessary within the AI industry, and improving gender diversity is essential for creativity, productivity, and management goals. This concept is not as well received in universities, especially those funded by governments. The same happens in any area where saleable products, rather than culture and advancement, are the ultimate goal: the profit from diversity is not easily measured, and therefore not accounted for. While companies put more funds into educational goals, including gender diversity, universities are in most cases a step behind [34]; there, women can be recognized only if they go through the university ranks or create startups and earn their space by building it from the ground up. Rosalind Wright Picard [8] is an American scholar and innovator, professor of Media Arts and Sciences at MIT, founder and director of the Affective Computing Research Group at the MIT Media Lab, and co-founder of the startups Affectiva and Empatica. Lada Adamic [9] is an American network scientist who researches the dynamics of information in networks: how network structure affects the flow of information, and how information affects the evolution of networks and crowdsourced knowledge sharing. Adamic is now a computational social scientist and director of research at Facebook. Joy Buolamwini [10], a Black researcher at MIT [42, 43], discovered that IBM’s face verification system at her workplace did not recognize her face, and explored the reasons behind it, finding accuracy on average 6% lower for lighter-skinned women than for men, 23% lower for darker-skinned women than for men, and 14.5% lower overall for women than for men. After her analysis, IBM decided to stop the sale of that general-purpose product, and for similar reasons Amazon later discontinued police use of its facial recognition. Through her ‘Algorithmic Justice League’, the study of gender bias in AI is still ongoing.
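Disparities of the kind Buolamwini measured become visible only when a system’s accuracy is disaggregated by demographic group instead of being reported as a single overall number. A minimal sketch of such a per-group audit (the group names and counts below are invented for illustration, not the actual study data):

```python
def per_group_accuracy(records):
    """records: iterable of (group, true_label, predicted_label) triples."""
    stats = {}
    for group, truth, pred in records:
        correct, total = stats.get(group, (0, 0))
        stats[group] = (correct + (truth == pred), total + 1)
    return {g: correct / total for g, (correct, total) in stats.items()}

# Invented audit counts for a hypothetical face-verification system:
# 100 trials per group, with different error rates.
audit = (
    [("lighter_male", 1, 1)] * 97 + [("lighter_male", 1, 0)] * 3
    + [("darker_female", 1, 1)] * 78 + [("darker_female", 1, 0)] * 22
)
acc = per_group_accuracy(audit)
print(acc["lighter_male"])   # 0.97
print(acc["darker_female"])  # 0.78
```

A single aggregate accuracy over this audit would read 87.5% and hide the 19-point gap between the two groups, which is exactly why fairness audits report metrics per group.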

2.1.3 Issues Overview for Lack of Gender Diversity in AI

In Artificial Intelligence, classification systems (e.g., machine learning) are trained with bundles of examples of input-output behaviors so that they can then generalize to any input. A human bias is a system of knowledge shared in society for or against something, i.e., a stereotype. Learning from data, such systems can by their nature incorporate societal biases, including gender bias, leading to unfair decisions [41]. Improved diversity also increases possibilities and enhances progress. The feminine perspective, beyond any cliché, is critical in brainstorming and innovation, and female role models can inspire the younger generation of girls to pursue careers in IT and STEM fields. The concept is valid for any kind of diversity (e.g., a person with a disability can give a new point of view on product use), but gender diversity is one of the major issues because the AI field is male-dominated. According to Reuters [18], in 2017 the percentage of female employees in technical roles in major ML companies was only around 20%.
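As a toy illustration of how such a system can absorb a stereotype from its training examples, consider a minimal frequency-based “classifier” fitted on invented, historically biased hiring records: when it generalizes, it reproduces the historical disparity rather than any property of the individual applicant.

```python
def train_majority(records):
    """Learn, per group, the majority historical outcome (1 = positive decision)."""
    totals, positives = {}, {}
    for group, outcome in records:
        totals[group] = totals.get(group, 0) + 1
        positives[group] = positives.get(group, 0) + outcome
    return {g: 1 if positives[g] / totals[g] >= 0.5 else 0 for g in totals}

# Invented, historically biased hiring records: men were hired 70% of
# the time, women 40% of the time.
history = ([("man", 1)] * 70 + [("man", 0)] * 30
           + [("woman", 1)] * 40 + [("woman", 0)] * 60)

model = train_majority(history)
print(model)  # {'man': 1, 'woman': 0}: the stereotype, not merit, is what generalizes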

Search engines, widely used by all citizens, can also reflect gender biases: if we look for “work” or “go shopping” in image search engines, we may find more photos of men for the former and more of women for the latter, reflecting our societies’ stereotypes present in the data and annotations. Gender bias is extremely critical when present in sensitive applications such as health care [33] or criminal justice [29], where mechanisms should fairly support humans in taking life-changing decisions. We thus need to establish mechanisms for AI systems that follow best engineering practices and smart evaluation strategies. Severe gender biases were introduced both in the data and in their interpretation, and only recently has humanity realized that this is seriously detrimental, especially in deep learning; as a result, it has been necessary to stop some systems and start from scratch. However, here is the concern: limited diversity in the AI sector increases the chance that AI systems will have harmful effects on the world. Incidentally, Artificial Intelligence is even less inclusive than the broader tech industry, which has its own well-known diversity problems. Women are among the most vulnerable to the changes that Artificial Intelligence is bringing: according to the World Economic Forum, women are at greater risk of losing jobs to automation. Females represent 73% of cashiers, for example, and 97% of cashiers are forecast to lose their jobs to automation. The research questions thus seem to be threefold: (i) Is Artificial Intelligence exploiting the feminine point of view as a resource? Males are more represented than females. (ii) Do data sets include data about women, and with which results? The actual differences and specificities of the genders are not exploited for proper gender-based applications. (iii) Will Artificial Intelligence influence the loss of workplaces for women? Women are among the main targets in jobs that will undergo the most transformation due to AI.

2.2 Starting from Education, Where Are Tomorrow’s Female Scientists?

According to the Gender Data Portal of the Organisation for Economic Co-operation and Development (OECD) [11], in 35 countries fewer than 1 in 3 engineering graduates and fewer than 1 in 5 computer science graduates are girls. This gap is likely due to stereotypes and expectations rather than to performance differences in math and science. Digging into the data, we observe that at age 15 far fewer girls (4.7%) than boys (18%), even among the top performers, report that they expect to have a career in engineering or computing. In reading, 15-year-old girls outperform 15-year-old boys (by the equivalent of roughly one year of school), while in mathematics boys outperform girls (though by a narrower margin, the equivalent of less than half a year of school); in science, there is little difference between boys’ and girls’ performance. Yet dig a little deeper, and a more nuanced picture emerges. There are far more boys (24.9%) than girls (12.5%) among the lowest-achieving students in reading, while there are far fewer girls than boys among the top performers in mathematics (10.6% vs. 14.8%) and science (7.7% vs. 9.3%).

Fig. 2.1 Share of bachelor’s degrees earned by women in the US. Source: NCES

Such gender bias is almost uniform worldwide, and the regional income losses associated with current levels of gender-based discriminatory social institutions are critical everywhere: USD 6,116 billion in OECD countries, USD 2,440 billion in East Asia and the Pacific, USD 888 billion in South Asia, USD 733 billion in Eastern Europe and Central Asia, USD 658 billion in Latin America and the Caribbean, USD 575 billion in the Middle East and North Africa, and USD 340 billion in sub-Saharan Africa; these figures remain relevant even when normalized to the standard income of each geographical area. The temporal graph in Fig. 2.1 [19], built from data collected by the National Center for Education Statistics (NCES), shows how the share of bachelor’s degrees earned by women in the United States peaked in the 1990s and then slowly decreased until 2010. In Fig. 2.2, the X axis shows the gender-equality score as described by the World Economic Forum, and the Y axis the percentage of women among Information and Communications Technology (ICT) graduates. For European countries and others appearing in the bottom-right part of the graph in bluish colors, greater gender equality is

Fig. 2.2 The gender equality paradox for information and communications technology (IGO 3.0 CC-BY-SA license)

associated with low female participation in ICT degree courses. For the group of Arab countries, India, and others appearing in the top-left part of the graph in reddish colors, lower gender-equality scores are associated with high participation of females in ICT programs. This surprising result is called the Gender Equality Paradox. A possible interpretation is that women in developing countries see a chance for emancipation in studying computer science [14] and are therefore more motivated to persist in the choice. Another aspect may be that, once the career path has started, these countries can in some ways concretely support women more than others in their choices: this career is probably considered a privilege for a woman and supported by families and institutions more than in countries where professional environmental security is taken for granted, leaving women alone in society to face the objective difficulties of their uncommon choice.1

1 At the moment this page is written, the world is facing a critical step back in gender equality in Arab countries, with Afghanistan facing the renewed arrival of the Taliban. Clear data about this part of the world are still not available. ISIS is bombing Kabul’s airport to force people to stay under Sharia Islamic rule, under which women are limited in what other countries consider human rights. Students and researchers accepted at foreign universities cannot exit the country. Women are abandoning their children into the arms of the US army and European ambassadors, hoping for them to be transported outside Afghanistan.

Case study: Italy
In Italy in 2018/2019, only 21.01% of Science Bachelor (BSc, i.e., three-year base degree) graduates in Engineering were women, and only 12.23% in Computer Science [11], a 2% drop compared to the previous two years, against an overall 0.2% drop in registrations (both female and male). At the University of Padua, one of the first Italian universities to open a Computer Science degree course in the past century, and still one of the most renowned, the female matriculation percentage was 23.8% for Engineering and 9.5% for Computer Science within Engineering. Education is the first step to having female scientists in Artificial Intelligence, and thus a fundamental step for gender diversity. What happens after graduation? In Italy, the role of researcher is not permanent: students must obtain a specialization and a Ph.D. to access a long path that includes several years of job insecurity, because the role of post-doc researcher can last a maximum of 12 years, with contracts of 1 to 3 years at a time. For both women and men, this means a long time without proper security and salary (around 1,400 euros net per month for senior post-doc researchers); moreover, there are few 3-year research contracts compared to the high number of senior research assistant contracts, which do not include any retirement pension quota, health support, or holidays. For example, if during the Covid-19 pandemic an assignee contracted a severe Covid-19 illness requiring hospitalization, she or he would be forced by regulation to suspend the monthly allowance and live as a hospitalized person without any salary. Furthermore, the contract is usually renewed yearly, often leading to severe overexploitation of staff, with supervisors asking for tasks not foreseen in the contract. The precarious researcher cannot refuse, especially in the first years, with less experience of the academic human environment, under threat of non-renewal of the contract at the end of the year. Only people with a supervisor in professionally powerful roles, or people with a very strong stomach, can overcome this state without passing through burnout. Too often, women are also forced to receive unprofessional and unethical requests for sexual services from powerful personnel. In Italy, it is estimated that one in five women has been abused in her lifetime, and in the university environment indecent proposals are unfortunately experienced by many young women, often able to cope thanks to individual training or support through psychotherapy. At the end of the post-doc, the opportunities to become a professor are rare and steadily decreasing in number. A law currently under discussion and in the process of being approved would not allow precarious researchers to be stabilized through a selection procedure at the same university where they graduated, obtained their doctorate, or held precarious contracts in the last five years. This would introduce further serious discrimination against people with disabilities, who are unable to move to other cities, and against people who have started a family, women above all. Italian culture has not yet been freed from patriarchy, especially in the center and south of Italy, where women are the only ones who

actively take care of the home and children. Unfortunately, this sets a disadvantage for those who must work far from the family. Similarly, young men and women are demotivated from forming a family at the right age, given that stabilization, when it comes, arrives on average around the age of 45–50, when a woman is already beyond the natural physiological possibility of having children. Laws still do not take into account these missing security conditions.2 Some awareness-raising has started, but there is a long way to go. Laws providing for the so-called ‘pink quotas’, which reserve a percentage of roles for women, are not envisaged before the rank of professor; they focus rather on power roles (selection committees, senior leadership, and management roles), to which precarious researchers are not entitled. This fact leads a large proportion of women not to enroll in, or embark on, degrees that envisage a university or managerial career. Even after completing their degree or doctorate, they often prefer jobs that are lower paid but somewhat less demanding in terms of precariousness and long-term working hours, such as teaching in lower and upper secondary schools. Moreover, in Europe women are generally paid 15% less than men for the same role (data from 2018) [15]. In Italy, this statistic drops to 5% for professional roles; but for academic and public roles where working hours are not defined, being assessed on productivity, as for precarious academic researchers, this difference translates into women needing to work more hours and demonstrate more effort to be evaluated the same as their male peers.

How can women resist in such an environment? The normalization of inequality, even in cases of violence and abuse, leads women to what is scientifically called learned helplessness [2], which causes women to remain, e.g., with employers who insult them, in groups that exploit them, in relationships that torment them, and also to feel unable to commit to the values in which they wholeheartedly believe: their ideas, their love, their art, their lifestyle, their political beliefs, and all aspects of their physical, emotional, and creative nature. Women face this challenge whenever they are stunted and induced to limit themselves and to defend their lives from intrusive cultural, psychological, or other projections, losing their ability to escape or to protect the values they hold precious. Everything they consider important at the work, personal, cultural, and environmental levels falls by the wayside, with the effect that they also become accustomed to their inability to act in the event of traumatic events, such as abuse in the workplace. This normalization of the traumatizer is rejected by restoring the damaged instinct and basic needs, but the trap within the trap is thinking that becoming aware is sufficient; on the contrary, regaining awareness is only

2 Note of the author: It is such a difficult situation to live in that I had difficulty getting to sleep while writing these paragraphs, because it is a worrying situation with a dark past and future, and it can be emotionally challenging for any woman in the field even to speak about it. Many studies and surveys are circulating to track such issues, all of them anonymous with the aim of protecting freedom of speech.


the first step [32]. The real solution will necessarily have to include a paradigm that encourages women to question their status quo with confidence, providing a sufficiently protected cultural environment, and also to look not only at themselves but at the world that accidentally, unconsciously, or maliciously pressures them in every aspect of life. The internal fragmentation to which women are subjected is stopped by allowing them to accommodate everything within their reach, without neglecting their needs or turning their eyes away from the world.

Tips for young women starting their professional university careers
Plan psychological support or training with an experienced professional psychologist or body psychotherapist for managing interpersonal relationships, to be prepared to react in the best possible way to inter-gender difficulties in the working environment and to possible attempts at physical or psychological abuse. This will help prevent burnout or severe trauma. This commitment might be financially challenging if your country’s health and social support system does not include psychological care, but in the long run you will find it money well spent. If this option is not available, find or create a support group. After Covid-19, it is easier to find such support through online services as well.

2.3 Fixing the Number of Women

An apparently trivial solution is to increase the number of female scientists in Artificial Intelligence, and in science in general, to fix the gender gap; but we already saw in Sect. 2.2 that the path is not straightforward, and this solution is not so easily implementable. The participation bias is present in all STEM disciplines (i.e., Science, Technology, Engineering, and Mathematics), and almost all the world is trying to balance it, often with only partial and temporary success. Gender studies are a relevant baseline from which to start this path, though the exploitation of the problem by political parties is often evident: they harass people with propaganda based on extremism, which is certainly useful for obtaining votes but not for achieving real change. This is a clear and important obstacle to solving gender bias problems in science, and it may apply to any gender orientation beyond genetics. Fixing the number of women participating in any role is critical to providing a different point of view, which can be reflected in productivity and results. Artificial Intelligence is biased toward white males [40]: in 2019, female AI researchers were 15% of Facebook’s staff and 10% of Google’s. In universities, 80% of professors working in AI are men, and only 18% of authors considered important at AI conferences (e.g., giving keynotes) are women. Gender disparity is even more critical when intersected with skin color. Such biases are present both in data sets and in the AI community itself.

The diversity problem is not just about women. It’s about gender, race, and most fundamentally, about power [40].

Generalist news media are considered responsible for still conveying a stereotyped vision of women, often as mere sexual objects. On the other hand, the communication focus on women scientists has improved worldwide: in scientific news media, the number of women journalists has increased, giving attention to women who demonstrate professionalism and ethical and emotional balance, for instance during the Covid-19 pandemic. It is considered important that women scientists create communities, eventually including men, to network and educate against gender bias. IEEE and many other scientific associations promote women’s networks at the international level.

2.4 Fixing the Institutions, Improving the Presence of Women in Leadership Positions

The European Commission (e.g., through the Horizon 2020 initiative) chose to issue guidelines, to be applied in all member countries, for adopting Gender Equality Plans (GEPs) starting from 2011 [22]. The GEPs’ aim is to promote cultural change in research institutions through specific gender policies and actions promoting balance between work and family life. Many institutions and universities in countries outside Europe, including Hong Kong (China), are following similar strategies. The bias in culture is very deeply rooted, and only public engagement can change the mindset, pushing institutions toward widespread awareness of the consequences that such bias has on science. We need time and constant effort to create a new generation of scientists and academics who acknowledge and are sensitive to such bias. This topic is strongly interconnected with the issue of female numbers in institutions. Each citizen, including men, women, and gender-diverse individuals, should be aware of this problem and sensitized to evaluate each case separately, posing some basic questions before reacting to common situations, to check whether gender inequity lies behind them; this can be overcome step by step, with everyone’s contributions, however small.

? Questions for sensitization

• If the person in front of me were a male, a person I respect, or a person whose life I care about, how would I react? Which decision would I choose? Which action would I take?


• In front of this social media post, what would I want for myself if I were the subject of the post? Can I go against the mass reaction in a peaceful and constructive way? Would I like to be the one who makes the difference?
• Am I sufficiently aware of how researchers in this field are treated in their professional life and role? Are they paid for their productivity, or can I see a difference between genders?
• Are all genders putting the same effort into this issue, or is someone required or expected to work more than others on the same project because of her gender?
• Can I say, choose, or do anything to stop bias and abuse? Can I say, choose, or do anything to support those who are abused, or who would like to have a role but lack opportunities?

2.5 Fixing the Knowledge and Its Side Effects, Creating Principles and Ethical Guidelines for an Inclusive Artificial Intelligence

This issue is probably the most important and the most urgent to solve in the current development of scientific and life-science studies. Regarding computer science and Artificial Intelligence, and in particular the machine learning field, only a complete redefinition of methods and scientific research questions can redesign science to be fair from a gender point of view. No blind application of ML techniques is possible: the knowledge used for training needs to be analyzed to check whether gender and ethnicity biases are present in the data. De-biasing methods for bias detection and mitigation are under study, to overcome biases incorporated in AI systems.
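One simple family of de-biasing methods studied in the fairness literature is reweighing: each (group, outcome) pair in the training data is assigned a weight so that, under the weights, group membership and outcome become statistically independent before a model is fit. A minimal sketch of the weighting rule, on an invented data set:

```python
from collections import Counter

def reweigh(samples):
    """Per-(group, label) weights w = P(group) * P(label) / P(group, label),
    chosen so that group and label are independent in the weighted data."""
    n = len(samples)
    g_counts = Counter(g for g, y in samples)
    y_counts = Counter(y for g, y in samples)
    gy_counts = Counter(samples)
    return {
        (g, y): (g_counts[g] / n) * (y_counts[y] / n) / (gy_counts[(g, y)] / n)
        for (g, y) in gy_counts
    }

# Invented data set: positive outcomes are much rarer for group "F".
data = [("M", 1)] * 60 + [("M", 0)] * 40 + [("F", 1)] * 20 + [("F", 0)] * 80
weights = reweigh(data)
# Under-represented pairs such as ("F", 1) get weights above 1,
# over-represented pairs such as ("M", 1) get weights below 1.
```

A learner trained with these sample weights no longer sees the group as predictive of the outcome; this is only one mitigation strategy among those under study, and it addresses data imbalance, not biases introduced later in annotation or evaluation.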

? Question

Are algorithms and technologies that AI develops, in particular in the Machine Learning area, fair from a gender point of view?

If we do not balance genders in data, then statistics, decision making, problem-solving, prediction, and classification are almost surely biased. The area and topic play a relevant role in evaluating the importance of a bias: a medical study on a pathology that affects males and females differently must be conducted on balanced data. In the same way, studies affecting physical and mental well-being in any application should be balanced for males and females. A theoretical study, e.g., on education or other areas with a less immediate effect on lives, can be considered less critical, but it must still be balanced, step by step, because of the needed cultural change (see Sect. 2.3).
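A first, minimal check in this direction is simply to measure how the groups are represented in a data set before training on it (the tolerance threshold and cohort below are illustrative):

```python
from collections import Counter

def group_shares(groups):
    """Fraction of records belonging to each demographic group."""
    counts = Counter(groups)
    total = sum(counts.values())
    return {g: counts[g] / total for g in counts}

def balanced(groups, tolerance=0.1):
    """True if the largest and smallest group shares differ by at most `tolerance`."""
    shares = group_shares(groups).values()
    return max(shares) - min(shares) <= tolerance

# An invented medical cohort with 80% male records fails the check.
cohort = ["male"] * 80 + ["female"] * 20
print(group_shares(cohort))  # {'male': 0.8, 'female': 0.2}
print(balanced(cohort))      # False
```

Such a check catches only raw representation gaps; a study could pass it and still be biased in how labels were assigned or features chosen, which is why the full redefinition of method argued for above remains necessary.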

We already cited some side effects of gendered science, e.g., in health [33] and criminal justice [29]. The main problem is that, when male developers (who, as we saw in Sect. 2.1.3, are the majority) create their systems, they incorporate, often unconsciously, their own biases in the different stages of creation, such as data sampling, annotation, algorithm selection, evaluation metrics, and the human-algorithm user interface [29, 30]. As a trivial result, AI systems seem to be biased toward male developers’ tastes. We will see later that this is not the only drawback in experiment design and results.

! Attention

It is not enough to apply the “pinking” method, i.e., a stereotypical feminization of the product, which can indeed be damaging: only a complete redefinition of the research approach and questions can redesign science from a methodological perspective to produce gendered innovation [37].

Several authors have found that voice and speech recognition systems performed worse for women than for men [17, 31]. Recruiting tools based on text mining can inherit gender bias from the data they are trained on. Face recognition systems have also been found to produce more errors on female faces [28]. In 2018, UNESCO studied the increasing gender bias in ubiquitous AI digital assistants [20]. The think piece shows how gendered AI technology reflects, reinforces, and spreads gender bias by:
• modeling acceptance and tolerance of sexual harassment and verbal abuse;
• sending implicit and sometimes explicit messages about how women and girls should respond to requests and express themselves;
• making women the ‘face’ of glitches and errors that result from the limitations of hardware and software designed predominantly by men;
• forcing synthetic ‘female’ voices and personalities to defer questions and commands to higher (and often male) authorities.
The same UNESCO document shares recommendations regarding the gendering of AI technologies for government institutions and companies:
• end the practice of making digital assistants female by default;
• explore the feasibility of developing a neutral machine gender for voice assistants that is neither male nor female;
• encourage the creation of public repositories of computer code and speech taxonomies that are gender-sensitive (see Gender Equality Markers);
• programme digital assistants to discourage gender-based insults and abusive language;
• develop the advanced technical skills of women and girls so they can steer the creation of frontier technologies alongside men.


UNESCO believes that all forms of discrimination based on gender are violations of human rights, as well as a significant barrier to the achievement of the 2030 Agenda for Sustainable Development and its 17 Sustainable Development Goals [21]. The message is clear: women and men must enjoy equal opportunities, choices, capabilities, power, and knowledge as equal citizens. Equipping girls and boys, women and men with the knowledge, values, attitudes, and skills to tackle gender disparities is a precondition for building a sustainable future for everybody. While agreeing with such goals, there is a further consideration: if the solution is that everything in AI should be neutral and unrelated to females and males, it means losing a lot of information and opportunities. Instead, balancing male and female roles in AI as well could better exploit the useful differences between genders. Beyond the not-so-trivial point that medical and justice studies must consider both males and females, both as data sources for AI and in scientific roles, to guarantee the advantages of diversity, the proposal of ‘neutral’ agents using only neutral voices may introduce a new negative bias. If gender bias is detrimental to culture and human behavior, and men and women should be treated equally, it is also true that men and women are not identical but have different bodies and identities. It would be neither useful nor intelligent to discard all the female and male characteristics that could serve people through AI.

Case study: Maternal Male Artificial Voices for Relaxation
One example for all: the autonomic nervous system includes the facial cranial nerve (CN VII), which innervates, among other structures, the tympanic cavity, in particular stimulating the activity of the stapedius muscle, which influences tympanic tension.
Under some constraints, this functionality acts as a filter for the frequencies of the female voice, keeping the maternal voice always understandable among others and over background noise, which does not happen with the lower pitches corresponding to male voices [36]. This is a natural feature of our neural system, which is biologically programmed to listen to the maternal voice above all, especially when the neurological state is in alert mode (also generating particularly annoying sensations when the female voice is not soothing but shouting). Thus, in the developmental years, children can be helped by the maternal voice to face and manage overwhelming emotions or other critical situations in which the defensive features of our neural system are active under alert stimuli. This feature remains active for the rest of life, though deteriorating with age along with the rest of the auditory system, so that even in adult life we can be influenced by soft, soothing voices speaking slowly and with prolonged vowels, similar to maternal voices. Such gender-based maternal features can be exploited in AI tools regardless of voice pitch, for instance to create artificial male voices (balancing the bias that assigns female voices to personal assistants) with maternal features [35].

2 Gender Differences and Bias in Artificial Intelligence


Stanford University also hosts a research platform on Gendered Innovations, led by Londa Schiebinger, with several projects and case studies demonstrating how sex and gender analysis improves science and engineering [38], and should not be neutralized. Several diverse areas in AI and robotics are critically affected by gender bias, e.g.:

• Sex and gender affect all parts of the pain pathway, from signaling to perception to expression and treatment [23].
• Facial recognition systems can identify people in crowds, analyze emotion, and detect gender, age, race, and sexual orientation. These systems are often employed in recruitment, payment authorization, security, surveillance, and phone unlocking. They can also discriminate based on characteristics such as race and gender, and their intersections, including transgender identity or the use of cosmetics [24].
• Technologies such as extended virtual reality are useful for promoting gender equality by reducing bias and enhancing empathy [25, 39].
• Engineers are increasingly designing robots to interact with humans as service robots in hospitals, elder care facilities, classrooms, people's homes, airports, and hotels. Robots are also being developed for warfare, policing, bomb defusing, security, and the sex industry. Women, men, and gender-diverse individuals may have different needs or social preferences, and designers should aim for gender-inclusive design, neither "gender-blind" nor "gender-stereotypical," but considering the unique needs of distinct social groups [26].
• In engineering design, the medium-sized male body (175 cm; 75.5 kg) has been taken as the norm. It is no surprise that men fitting this profile suffer the fewest injuries in automobile accidents: when crash data are analyzed by sex, age, height, and weight, injury rates are higher among people who do not fit the mid-sized male norm.
Elderly drivers have high fatality rates, obese drivers also face elevated risk of death and serious injury, and women are more likely than men to be injured in crashes [27]. A solution to these and other ethical biases in artificial intelligence requires the invention of practical paradigms and protocols enabling the technical implementation of unbiased tools, systems, and data sets, something scientific research has not yet attempted to cover.

2.6 Conclusions

The path to gender equity is still long and hard. Scientific research has begun to understand and confront the problem in Artificial Intelligence, but it still lacks practical solutions to avoid gender bias and to change, through the invention of practical processes and protocols, the cultural and technical background that generates ethical biases. The issue has several faces, e.g., data inequality, disparity of opportunity, and still-poor mass and cultural sensitization; education and research may be the necessary first step toward the goal. As a woman, I remain hopeful that future generations will be more sensitive to the problem and will find the willingness and practical solutions to ensure a better life beyond unbalanced (i.e., underrepresented) classes, starting with gender bias, for every minority. This path will be faster the more everyone is sensitized to the importance of this issue.

V. Franzoni

References
1.
2. Lennerlöf, L.: Learned helplessness at work. Int. J. Health Serv. 18(2), 207–222 (1988). https://
3.
4.
5.
6.
7.
8.
9.
10.
11. OECD Gender Data Portal: Where are tomorrow's female scientists? gender/data/wherearetomorrowsfemalescientists.htm
12. CNI Data Portal: In calo gli immatricolati ai corsi di ingegneria [Enrollment in engineering degree programs is declining]. news/213-2019/2620-in-calo-gli-immatricolati-ai-corsi-di-laurea-in-ingegneria
13. I'd Blush if I Could. UNESCO-EQUALS (2019)
14. Questioni di genere in Intelligenza Artificiale [Gender issues in Artificial Intelligence], report from S. Badaloni on gender bias in Artificial Intelligence (2020)
15. ISTAT portal: Divario retributivo di genere [Gender pay gap], report from the Italian National Institute for Statistics (2019)
16. Craglia, et al.: Artificial Intelligence: A European Perspective. Joint Research Centre (2018). artificial-intelligence-european-perspective
17. McMillan, G.: It's Not You, It's It: Voice Recognition Doesn't Recognize Women. Time article (2011)
18. Dastin, J.: Amazon scraps secret AI recruiting tool that showed bias against women (2018)
19. Chin, C.: AI Is the Future-But Where Are the Women? Wired article (2018). https://www.
20. UNESCO Data Portal: The rise of gendered AI and its troubling repercussions (2018). https://
21. UNESCO Data Portal: Priority Gender Equality
22. European Commission projects
23. Stanford Gendered Innovation Platform: Chronic Pain: Analyzing How Sex and Gender Interact (2018)
24. Stanford Gendered Innovation Platform: Facial Recognition: Analyzing Gender and Intersectionality in Machine Learning (2019). facial.html
25. Stanford Gendered Innovation Platform: Extended Virtual Reality: Analyzing Gender (2019)



26. Stanford Gendered Innovation Platform: Gendering Social Robots: Analyzing Gender and Intersectionality (2018). html
27. Stanford Gendered Innovation Platform: Inclusive Crash Test Dummies: Rethinking Standards and Reference Models (2019). crash.html
28. Buolamwini, J., Gebru, T.: Gender shades: intersectional accuracy disparities in commercial gender classification. Proc. Mach. Learn. Res. 81, 1–15 (2018)
29. Tolan, S.: Fair and Unbiased Algorithmic Decision Making: Current State and Future Challenges. JRC Technical Report (2018)
30. Tolan, S., Miron, M., Castillo, C., Gómez, E.: Performance, fairness and bias of expert assessment and machine learning algorithms: the case of juvenile criminal recidivism in Catalonia. Algorithms and Society Workshop (2018)
31. Tatman, R.: Gender and dialect bias in YouTube's automatic captions. Ethics in Natural Language Processing (2017)
32. Pinkola Estés, C.: Women Who Run with the Wolves. Pickwick BIG (2016)
33. Roger, J.: A field study of the impact of gender and user's technical experience on the performance of a voice-activated medical tracking application. Int. J. Human-Comput. Stud. 60(5–6), 529–544 (2004)
34. Falkner, K.: Gender gap in academia: perceptions of female computer science academics. In: Proceedings of the 2015 ACM Conference on Innovation and Technology in Computer Science Education, pp. 111–116 (2015)
35. Franzoni, V., Baia, A.E., Biondi, G., Milani, A.: Producing artificial male voices with maternal features for relaxation. In: 20th IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, 14–17 Dec 2021, Melbourne, Australia, p. 8. ACM, New York, NY, USA (2021)
36. Porges, S., Lewis, G.: The polyvagal hypothesis: common mechanisms mediating autonomic regulation, vocalizations and listening. In: Handbook of Mammalian Vocalization. Handbook of Behavioral Neuroscience, vol. 19, pp. 255–264. Elsevier (2010). B978-0-12-374593-4.00025-5
37. Badaloni, S., Lisi, F.A.: Towards a Gendered Innovation in AI (short paper). DP@AI*IA 12–18 (2020)
38. Tannenbaum, C., Ellis, R.P., Eyssel, F., et al.: Sex and gender analysis improves science and engineering. Nature 575, 137–146 (2019)
39. Franzoni, V., Milani, A., Di Marco, N., Biondi, G.: How virtual reality influenced emotional well-being worldwide during the Covid-19 pandemic. In: 20th IEEE/WIC/ACM International Conference on Web Intelligence and Intelligent Agent Technology, 14–17 Dec 2021, Melbourne, Australia, p. 8. ACM, New York, NY, USA (2021). 11224561
40. West, S.M., Whittaker, M., Crawford, K.: Discriminating Systems: Gender, Race and Power in AI. AI Now Institute (2019). Retrieved from html
41. Mehrabi, N., et al.: A survey on bias and fairness in machine learning. ACM Comput. Surv. 54(6), 115:1–115:35 (2021)
42. Raji, I.D., Buolamwini, J., et al.: Saving Face: Investigating the Ethical Concerns of Facial Recognition Auditing, pp. 145–151. AIES '20 (2020)
43. Raji, I.D., Buolamwini, J.: Actionable Auditing: Investigating the Impact of Publicly Naming Biased Performance Results of Commercial AI Products, pp. 429–435. AIES '19 (2019)

Chapter 3

The Private and the Public: Feminist Activisms in the Informational Turn

Lola S. Almendros

Abstract The hypothesis guiding this research argues that the cyborg political project is failing because privacy has become a techno-economic issue. To evaluate the hypothesis, this paper first points out how the feminist movement has analysed, understood, interpreted and problematised the public/private dichotomy as an element of coercion and subordination. Secondly, it analyses how the personal became a properly political issue in the 1970s thanks to consciousness-raising groups and to socialist and radical feminists. The third task focuses on understanding the meaning of Donna Haraway's cyborg project, as well as the ways in which it has been interpreted and materialised by cyberfeminists. Finally, the research describes the reasons for the lack of transcendence of online activisms and offers a characterisation of an effective techno-activism. The methodology followed consisted, on the one hand, in the compilation and analysis of literature related to the treatment of the public/private dichotomy in feminist theory. On the other hand, it has analysed the genesis and development of cyberfeminisms since the 1990s, as well as their links with communitarianism, instances of recognition and problems of social justice.

Keywords Public/private dichotomy · Cyborg · Feminism · Activism · Techno-politics · Informational turn

3.1 Introduction

The advent of the Internet was promising. Especially in the 1980s and early 1990s, there was a strong conviction in the possibility of breaking down subordinations and hierarchies from a new space: cyberspace. Originally, the public–private dichotomy did not seem to be operative in this non-place. Furthermore, this space made it possible to act not only on the margins of the public and the private, but also on the margins of bodies, sexes, genders… Haraway's Cyborg Manifesto [27] represents those possibilities and inspires new modes of activism (see [47,

L. S. Almendros (B) University of Salamanca, Salamanca, Spain. e-mail: [email protected]
© Springer Nature Switzerland AG 2023. J. Vallverdú (ed.), Gender in AI and Robotics, Intelligent Systems Reference Library 235,




48]): new ways of deconstructing the demarcating and subordinating dichotomies that had defined the way of life since Modernity. However, the years have shown that many of these dichotomies are still playing out today: they have been (re)assimilated in the design, construction, and diffusion of technologies, as well as in their use. Gender biases still prevail in all these issues. Additionally, the techno-economic characteristics of Web 2.0 also hinder its emancipatory appropriation. As we will see, this is due to epistemic, socio-economic, and techno-power issues. Thus, the hypothesis here is that the cyborg political project is failing, firstly because the private is becoming techno-economic, and secondly because cyberactivisms are not attending to this transformation. When Modern rationality was implemented in politics, the public and the private were (de)marked to define, justify, and delimit the power of the State. Likewise, such demarcation was used as a mechanism of socio-political and socio-economic hierarchisation, subordination and discrimination. This question has occupied feminist reflection since its origins. In the 1970s, the private was redefined by feminists as a political question and even as an element of resistance. Today, privacy continues to be a mechanism of coercion. This forces us to reconsider feminist (techno)activisms. The goals of this paper are, firstly, to find out how the public–private distinction has been analysed and problematised in the history of feminism, and how the personal became a properly political issue in the 1970s. Secondly, the aim is to explore the meaning of the political cyborg metaphor and its projections in cyberfeminisms, and the differences between it and other types of subjectivity described as inforgs [14–16] or techno-persons [1, 9]. Thirdly, a set of characteristics for an effective techno-activism will be described.
The methodology followed has consisted, on the one hand, in the compilation and analysis of literature related to the treatment of the public/private dichotomy in feminist theory. On the other hand, the genesis and development of cyberfeminisms since the 1990s have been analysed, as well as their links with communitarianism, instances of recognition and problems of social justice. Thus, the chapter presents a first section dedicated to the different ways in which feminist theory has understood, interpreted and acted in relation to the private as an element of coercion and subordination. The second section focuses on the ways in which Donna Haraway's cyborg project has been interpreted by cyberfeminisms. The third describes the reasons for the lack of transcendence of online activisms and offers a characterisation of an effective techno-activism. The chapter ends with a concluding section.

3.2 The Private and the Public in Feminist Theory

The private and the public became split in Modernity. Firstly, this is due to the implementation of a dichotomous model of rationality, objectivity, and subjectivity [43, 44]. Secondly, it is because of the requirement to define—and thereby (de)mark and (de)limit—and justify a new mode of political power. In this sense, the public/private



conceptualisation and differentiation is a political question linked to the establishment of certain forms of socio-political order and hierarchisation (i.e., forms of domination and subordination). In short, it is the result of setting up a new socio-political system that must be justified. Meanwhile, this split (and others such as objective/subjective, rational/emotional or masculine/feminine…) not only defines the way of life but shapes life itself, because such splits define spaces and times, as well as the practices, performances, and values associated with each of them. In this sense, the (de)limitation of the public and the private is also the demarcation of practices and their possibilities. As Amartya Sen points out, these demarcations imply the determination of the "functioning of capabilities" and, therefore, the determination of freedom and equality. Consequently, there is a relationship between the public/private distinction and the definition of what it is to be a citizen. Establishing what is public and what is private also sets a framework of practices and values. The public gives meaning to policy and to political space, and it implies the distinction of the private as a space of non-interference (of neither politics nor policy). In this sense, the enlightened and liberal public/private distinction serves as a basis for the rule of law (i.e., for citizens' rights and freedoms), but also for the (limitation of) State power. Thus, the definition of the liberal rule of law inherently implies the circumscription of aspects of people's lives as matters in which there is no room for intrusion. At the same time, in this distinction also lies a sense of what it means to be a citizen based on a logic of inclusion and exclusion [42].
The convergence of political and economic liberalism reinforces the division between the State and private life: private life is no longer understood only as a space of freedom in the sense of non-interference by the political, but also as a space of consumption in which there should be no interference either. In this sense, political freedom defines citizens, while economic freedom defines consumers, customers, and workers. Thus, the establishment of free-market ideology reaffirms private space as a space of power, but also of domination. As Wendy Brown points out:

The divided subject born with modernity and intensified by liberal ideological and capitalist political-economic ordering—particular/universal, subjective/objective, private/public, civil/political, religious/secular, bourgeois/citoyen—can be gathered under a universalizing political rubric such as "equality, liberty, fraternity" yet subsist in a civic and economic structure ordered by inequality, constraint, and individualism. The subject represented as free, equal, and solidaristic in state and legal discourse is abstracted from its concrete existence where it is limited, socially stratified, atomized, and alienated [5, p. 29].

As spaces of socio-political and socio-economic power, the public and the private are gendered too: the public-political space belongs to white heterosexual men. Private space is also theirs, but it is where the women are… In this sense, the public/private distinction is the consequence of a liberal-contractualist logic in which the way to create social relations is to constitute them by subordination [36, p. 83]. From the beginning, the concretisation of the public/private entailed the concretisation of modes of freedom and domination: of equality and inequality. But this also became a way of resistance. That required bringing the private into public discussion, i.e., turning the personal into the political. Feminists did just that.


For women the measure of the intimacy has been the measure of the oppression. This is why feminism has had to explode the private. This is why feminism has seen the personal as political. The private is public for those for whom the personal is political [30, p. 191].

The history of feminism is interwoven with dichotomies, among which the public/private one has been central (see [35]). A large part of feminist theoretical constructions, analyses, and critiques—as well as their political actions—have focused on this topic. Hence, many debates among feminists, as well as their heterogeneity, segmentation, and radicalisation since the 1970s, are due to how this dichotomy and the modes of power and subordination involved are understood, described and challenged. Thus, there is a direct relationship between the public/private binomial and the recognition and description of issues and objectives in feminist theoretical and activist history. Broadly speaking, different moments in feminist history can be distinguished according to how this dichotomy has been understood and how it has been linked to freedom and equality. The liberal-political character of the suffragette movement did not question the public/private dichotomy. The suffragettes fought for women to be full citizens: to be included in the public-political part of the dichotomy. Thus, they understood freedom in a political sense, addressed the public and fought for the achievement of political rights. In their view, freedom was a precondition for equality, which was conceived in terms of equal political rights. Thus, the private was not yet problematised in a political sense. Feminists managed to participate in the public sphere, where this discussion would later be launched. Even so, some naïve liberal and current neoliberal positions assume a radical distinction between the public and the private. They argue that politics only takes place in the public sphere, and that any interference by the state or social organisations in the private sphere is unacceptable. Examples of this are the attacks from liberal and conservative groups on issues such as sexual education in schools. 
"The neoliberalisation of feminism takes at least three forms: the integration of women and notions of gender equality into economic restructuring, the integration of feminism into neoliberal ideology, and the associated change in rationalities and technologies of governance" [41, p. 620]. Neoliberal feminism is neoliberal in a political and economic sense because it understands political rights as economic rights. Freedom and equality are viewed in the same way. The main objective is to be free, so freedom is prioritised over equality. Moreover, equality is conceptualised according to the logic of competition. Thus, these views reaffirm Byung-Chul Han's performance subject [23–25]. According to them, in addition to the vindication of the private sphere as a space of autonomy and non-interference, the privatisation of the public sphere is demanded. Socialist feminism focused on issues that transcended political rights but defined the way of life of women and of other groups discriminated against for sexual, socio-economic, or ethnic reasons… In the late 1960s, the private came onto the public political agenda as a reaction to the "malaise" with which Betty Friedan [19] described the feelings of middle-class women in the post-war period. In the 1950s and 1960s, there was a conservative turn in the United States towards a new paradigm of the family and the



domestic. After the Second World War, the United States' economy was sustained by the establishment of consumerism, cultural industries and the technification of homes and private life. This was the age of consumerist leisure, household appliances, television, cinema, large shopping centres, residential neighbourhoods… In these years, women's social status returned to the home, and the domestic became a public good and a commodity for advertising. Reproductive work acquired new meanings. On the one hand, it became more technological thanks to household appliances. On the other hand, women took on the responsibility of raising good citizens, of having a nice garden, of being part of associations where women's things are done with or for their children, houses, husbands… In this way, women went in and out of their houses, always being someone's wife, mother, daughter… They were in the public space, but they did not inhabit it. They also became advertising and film icons. Wherever they were, they were for someone or something. In other words: they were neither in the public nor in the private sphere. If they were anywhere, it was in the domestic sphere. That is, they were far from being and doing whatever they wanted, and at the same time they bore the great responsibility of caring for others' wellbeing. From this subspace-time of the private in which women are situated, their existence is defined as domestic and reproductive. This describes a life experience characterised by repetition and resilience. Domination of women in the private sphere is based on emotional values related to care, love, family, wellbeing, etc. Consequently, any feeling of weariness, or any attempt to break with the imposed way of life, is accompanied by the weight of guilt. In this sense, the heteropatriarchal definition of the domestic is not merely political but constitutes a "(…) domination that legitimizes women's oppression and exploitation in the private sphere" [3, p. 93].
Moreover, it adopts forms of blackmail, since it influences one's own will. Thus, the topography that defines citizenship not only follows a logic of inclusion and exclusion but does so in a strongly emotional sense. Hence, not only do political acts occur in private spaces [29], but so do different types of violence and injustice [20, 32, 33]. Although "gender-based violence" and "domestic violence" are relatively recent concepts, they have existed in all cultures for as long as "time" has been called "history". So it is not surprising that, on fundamental issues such as the right to privacy, women were not contemplated until the 1970s. In those years, while liberal feminism began to pay attention to social issues, a properly socialist feminism began to take shape. Socialist feminists understood that equality was not a political question reduced to civil-political law. Rather, it was a social, cultural and economic issue. Both positions were reformist, but in different ways. On the one hand, liberals focused not on political rights but on civil rights. The public and private spheres came to be understood in a more complex way, allowing more attention to the political and social characteristics of public space. And although liberals still considered freedom a prerequisite for equality, they adopted a broader conception. In contrast to liberal feminism, with its liberal-economic tendencies, socialist feminism is socialist in both the political and economic sense. It focuses on social rights rather than civil rights. Its attention was therefore directed less to public and private political space than to social space (in its political and private senses). Moreover, socialist feminists saw equality as a precondition for freedom.


L. S. Almendros

Since the 1960s, consciousness-raising groups have been a key element in acknowledging that many of women's personal problems were shared, which meant that they could not be (only) personal. In this way, the private began to be seen as a space of subordination that had not yet been challenged, and it therefore gained public relevance. Radical feminisms also began to take shape in the 1970s out of consciousness-raising groups. In these perspectives, freedom and equality were understood in terms of recognition. In the 1960s and 1970s, the challenge to economic liberalism and its political aspects helped to deepen the critique of the capitalist distinction between productive and reproductive labour. In this period, it was understood that the liberal capitalist socio-political and socio-economic model was constructed on a sexualisation of labour based on heteropatriarchal values. Thus, political subordination was associated with a further socio-economic and sexualised subordination. In this way, heteropatriarchal intimate-sexual subordination came to be viewed as the structural basis of all relationships, especially of those that take place in the personal-private sphere, and so the personal became a political issue. This also meant that categories such as sex were seen as social constructions, and therefore not only intersected by power relations but also imposed by the privileged [31]. For all these reasons, the understanding of inequality was no longer reduced to legal-political terms. Nor did it focus on economic, cultural, social, and political causes; rather, the approach focused on the heteropatriarchal values that define relations of domination and subordination. Given this, supporting the difference between the private and the public would mean continuing to tolerate the reproduction of subordination and inequality in both spheres.
Therefore, the main objective of the vindication of the public-political consideration of the private is to challenge the domination exercised over the private sphere. As a result, it was recognised that many aspects of private life were shaped by processes of heteropatriarchal subordination. Difference feminisms depart from this point. Such feminisms are neither reformist nor liberal. Like the socialist feminism of equality, their objective is the conquest of social rights. Nevertheless, the recognition of differences is understood as the precondition for equality. In addition, because these feminisms are not reformist, they do not try to influence the way of life through politics, but through social life itself (i.e., through routine practices, education, and culture). Therefore, the private is more important than the public: the private is where political action can be carried out in a more direct and incisive way. Carol Hanisch [26] was one of the first to introduce "the personal is political" into feminist theory and critique. She did so in just two pages in a collection of texts edited by Shulamith Firestone and Anne Koedt [13] that brought together many of the radical perspectives of the time. But Hanisch's idea was interpreted in different ways by socialist and radical feminists. Socialists focused on the disadvantages caused by the heteropatriarchal structure of the socio-political, socio-economic, socio-historical, and socio-cultural spheres. Their aim was to break down gender roles and their associated modes of domination, subordination, and discrimination. In contrast, from the radicals' perspective, all forms of discrimination were founded on a sexualisation based on heteropatriarchal values and beliefs. Additionally, radicals



consider that the experience of discriminated people is fundamental to understanding the logics of disadvantage and to fighting against them. In this sense, radicals are neither collectivist nor egalitarian. Thus, consciousness-raising groups and the shared nature of personal problems were interpreted differently by socialists and radicals, and different forms of activism developed in consequence. For socialists, "the personal is political" means the recognition of shared problems. Such problems do not depend on any individual's life, but on structural problems of the way of life. In this sense, the personal is political because it is social. Socialists considered that the structures of power and relations, as well as the values and practices that shape experiences and subjectivities, are constructions whose logics can (and must) be described, studied, analysed and criticised. Their aim was therefore to re-signify them and break them down. As socialist feminism was reformist and institutionalist, all these issues became part of the political agenda. In contrast, radical groups are neither universalist nor egalitarian, neither reflective nor abstractive. They attach greater epistemic, social, and political value to personal experience than to shared collective experience. Social struggle is understood in the same way: radicals advocate action in small groups, defined by what distinguishes them from others. This activism is not institutional but based on direct micropolitical actions. From this point of view, it is necessary to act on a daily scale. So the aim is not so much public reflection on the private as political action in the private. Here the personal is political because the private is politicised in order to re-signify itself. A paradox emerges here:

Movements adopting behavioural norms in order to participate successfully in political institutions often find themselves forgetting their primary purposes of social change.
Long-term ideals are foregone in favour of short-term gains. But movements that hold on to their radical goals and spurn political participation often find themselves isolated in a splendid ideological purity that gains nothing for anyone. That is why a successful movement must not only maintain a balance between political and personal change, but also a creative tension between its “politics” and its “vision” [18, p. 24].

The international and global character of the feminist movement increased in the 1980s and 1990s. The complexity of internal discussions about the meaning and importance of public/private also increased. The World Conference on Women held in Mexico in 1975 opened a cross-cultural understanding and debate that culminated in Beijing in 1995, when an international plan of action for gender equality and women’s empowerment was drawn up [34]. However, throughout these twenty years, a major segmentation took place within the feminist movement. This was characterised by a focus on differences and the search for micropolitical alliances for direct action. Thus, the movement became more heterogeneous and fragmented, and more performative than discursive [46]. In this way, new forms of dispersed (and articulable) activism appeared. Indeed, many of them adopted postmodern forms.



3.3 From Haraway's Cyborg to Cyberfeminisms

The 1990s introduced a new sphere of (inter)action and relationship through information and communication technologies (ICTs) that captured the attention of the feminist movement. Feminists quickly grasped the social, political, economic, and cultural character of ICTs, as well as their possible subaltern derivations. Hence Haraway's insistence on the need to appropriate ICTs in order to make them tools for emancipation. However, such appropriation never happened. One of the reasons was that cyberfeminism adopted many forms and remained more cultural and postmodern than socio-political and technological. That is why cyberfeminisms are highly performative at the level of meanings, but not at the level of political struggle. In addition, although cyberfeminism presents itself as inclusive, cyberfeminist literature assumes an educated, white, upper-middle-class, English-speaking, culturally sophisticated audience [12, p. 309]. Donna Haraway's Cyborg Manifesto [27] was a turning point in feminism: thanks to it, technology began to be understood as something political and as a possibility for deconstruction and emancipation. Thus, the cyborg metaphor settled (with different senses and meanings) within feminist theory as an open purpose for the achievement of techno-political projects. This explains why new forms of theoretically postmodern and practically technological feminist activism appeared in the 1990s (see [49, 50]). Haraway's proposal encouraged new ways of understanding, analysing, critiquing, and using technology. In addition, she highlighted one of the main problems of the ICT system: its heteropatriarchal biases and domination. In this way, Haraway confronted the liberal positions that viewed technology as neutral and hoped that male dominance would disappear once women had equal access to higher technological education.
Haraway also opposed radical and essentialist views such as Sadie Plant’s [40] or VNS Matrix’s, which either recognised a feminine nature in technologies or considered them irremediably heteropatriarchal and therefore inevitably to be confronted. According to Haraway, technology was a space of opportunities and possibilities for subverting the forms of power and the dichotomies governing ways of life. She argued that it was possible (and necessary) to make a socio-political and feminist intervention in this space. In short, the ICT system had to be appropriated before it became another element of power that reinforced traditional subordinations.

From a constructivist and situated perspective, Haraway considered technologies to be tools of power and knowledge. Her analyses therefore focus on highlighting the socio-political character of technologies and on showing the values and biases involved in their development. The acknowledgement of technologies’ constructivist and contingent character was the first step towards the possibility of their appropriation (i.e., of their axiological and pragmatic re-signification). This task was emancipatory in a disruptive and constructive sense.

The cyborg is a postmodern political project: its metaphorical character and the manifestos that have been shaping it for years show a utopian, futuristic, and avant-garde spirit. This explains the cultural, artistic, and aesthetic materialisation of much

3 The Private and the Public: Feminist Activisms in the Informational Turn


of cyberfeminism. However, the ulterior aim of the project started by Haraway was the use of ICTs for the deconstruction of modern subjects in order to achieve emancipation. That task involved the construction of new post-gender subjectivities. It was not a vindication of human progress and overcoming, but of the human’s “death” and substitution as a unique way of (re)signification. Therefore, most cyberfeminisms are (de)constructivist (see [4, 10, 11, 51]). Cyberfeminisms assume the constructivist, social and political character of gender and technologies, as well as the existence of relations of co-construction between the two. Cyberfeminisms also assume a posthumanist view of the subject.

Two issues are fundamental to understanding this. The first is the fight against the modern idea of “Humanity”, that is, against the conception of the human elaborated by a white, heterosexual, middle-class, and masculine knowledge-power system. That idea of the subject was shaped in the Enlightenment and chained to a technoscientific schema whose rationality was not bias-free (see [4, 6, 21, 28, 37–39]). In this sense, postmodern feminisms generally try to overcome the patriarchal-Western idea of humanity. In the case of cyberfeminisms, this results in an (intentionally) nonspecific definition of the subject as cyborg. The second issue is the performative and active character of cyberfeminism, which explains the strong cyberactivist quality of cyberfeminisms. The conjunction of these two issues explains how the indeterminate configuration of cyborg projects has provided rich interpretations of what it might mean to be and act in the techno-world. However, such projects have not been articulated in real terms of techno-political action. In this sense, cyberfeminisms are more cultural than political. Most of the dichotomies have not been deconstructed, but reformulated and (re)assimilated. Cyberfeminisms dropped the public/private dichotomy.
The persistence of privacy as a techno-political element of coercion has undermined the possibilities of emancipation. Cyborgs have been eclipsed by other subjectivities, as the relationship with ICTs has been realised in symbiotic and alienating terms. Co-constructive hybridisation based on the appropriation of technologies has resulted in a de-anthropologisation of the entities who operate in the infosphere. They have been described as inforgs [14–16] or technopersons [1, 9]. In this way, emancipation has become alienation due to the modes of agency and the usability paradigm that describe what it means to be and act in the infosphere today. The cyborg has yet to be constructed.

Haraway’s project followed a socialist path rewritten in terms of deconstruction that focused, on the one hand, on overcoming modern dichotomies and, on the other, on direct participation in the genesis and control of technologies. But cyberfeminisms were built around radical perspectives in which, as we have seen, the strength of the collective is lost in its own fragmentation, and dichotomies are debated and contested only in smaller spaces. This hinders the achievement of any common project and control on the scale demanded by the structure of the ICT system. In this sense, there has been a depoliticising shift since the First Cyberfeminist International in 1997. Thus, although the Old Boys Network’s 100 Antitheses of Cyberfeminism defined an apolitical cyberfeminism, the fact is that politics was simply left unspecified [12, p. 306].

3.4 An Effective Techno-Activism

An effective activism requires rejecting the individualistic idea of freedom encouraged by the techno-economic system and recovering the republican idea of freedom. The difference between these two perspectives lies in the conception of individuality. In republicanism, individuality is linked to citizenship and, therefore, to community. This communitarian idea of freedom includes the plurality of instances of recognition, which serve as a social nexus and a link between the pieces of our identities. It also helps to recognise common concerns in the development of capabilities, and thus issues of social justice.

Fraser and Honneth [17] discuss the nature of social justice claims. Fraser considers that recognition issues are part of social justice. In her view, status subordination is caused by the institutionalisation of unfair cultural patterns that depend on class subordination. In Honneth’s view, on the other hand, recognition instances are identity-defining and thus are the principles for the development of social claims. In this way, redistribution issues would be subordinated to recognition issues. This contrasts with Fraser’s “perspectivist dualism”, in which both class and status subordination are problems of justice that must be eradicated in order to make participatory parity possible. The point of their discussion here is that recognition instances are indispensable for conceiving the possibilities of political vindication. In postmodern literature, such instances seem to dissolve. However, they are extremely significant for the sustainability of the public sphere. From constructivist perspectives such as postmodern ones, it is nevertheless possible to understand that instances of recognition are historical but re-describable. Therefore, they have a certain given significance that orients (but does not determine) the possibilities.
On the other hand, recognition instances are fundamental for solidarity and communitarianism because they allow the development of a space of power between the public and private spheres. In this space, common concerns are recognised, which allows the development of discourses, debates, agreements, and public actions. This is possible because recognition instances go beyond one’s own individuality. To achieve all of the above, the first instance of recognition must be the recognition of the contingency of one’s own interests and beliefs. This is indispensable for the construction of the public sphere without renouncing individuality. However, it seems difficult to achieve such objectives in (cyber)contexts such as today’s social networks, where an exponential homogenisation of identities and an inactivity masked as (hyper)sociability are promoted. Therefore, the mechanisms of change should begin to be imagined from a social, libertarian, and communitarian reconfiguration of social cyberspaces. Several Latin American feminist groups are developing good examples of these practices, such as chicastecno and chicasentecnología in Argentina or ranchoelectrónico in Mexico. Free Software is useful for these emancipatory practices, which move away from appropriation towards creation. However, these aspirations are also challenged by the intrusion of private capital into Free Software communities [2]. A tendency to transform code into products rather than open tools [7] suggests the need to encourage constant critical reflection on the Internet’s architecture.

Now more than ever, technological artefacts are mechanisms not only for thinking about one’s own identity, but also for identifying oneself. Thus, the place of technology in our cultures is a place of definition. However, awareness of this vital role of technology is not common; rather, an uncritical assimilation takes place. ICTs must be socialised and democratised by design. The free software movement may be considered a first attempt, but effective socialisation and democratisation of these tools requires training in their use and development.

The modern idea of the state is sustained by the imaginary of progress. In this paradigm, state power is associated with sovereignty and implies the capacity for election, decision, and action. Thus, the state is an agent with the capacity to predict, to decide and, therefore, to control. In short, the role of the state is based on prediction, and regulation is its outcome. However, the informational ecosystem does not fit into this paradigm. This is a consequence of the innovation imaginary and implies uncertainty: the impossibility of prediction and therefore difficulties in regulating. The logic of innovation has also reached the political level, and at least four factors have been key in this process. Firstly, the lack of regulation of the financial-information economic development of recent decades.
Secondly, the emergence of a new economic paradigm in which information is both commodity and capital, and which has caused a disruptive change in the value-chain both within the ICT sector and in the economy as a whole. Thirdly, the genesis of economic value from the social and political value of information. And finally, the end of the distinction between the public and the private, reinforced by the privatisation of the public and the publication of the private that underpin techno-communications. In this way, a techno-power structure of high socio-political impact has been set up, which consists of leaving things as they are as long as there seems to be no alternative. Its effects, besides being difficult to predict (and therefore to regulate), are neither affordable nor manageable by states. The state’s role in this transformation has been passive at best. Both this complacency and the fast development of innovations in the techno-economic sphere have opened an irreparable gap between forms of power. The result is a geopolitical map in which states are juridical-political entities but have little territory and fewer competences. Nor do the large unions of states, such as the European Union, show any greater capacity. Indeed, many of the national(ist) exacerbations recently in vogue are the result of the deficits of the state sphere and the incompetence of international cooperation. Nations are no longer a space of identity and community. Thus, what partycracy offers, on both sides of the Atlantic, is populism. In addition, there is a material decay of states through public debt and their factual incapacity. For all this, representative democracies do not (re)present much today.



States have lost power, so there is no sense in developing political resistance to challenge them. In this sense, the end of state power implies the end of activism in traditional terms. Because online and cyber activisms have reproduced older forms and practices of activism, they are not being disruptive. Furthermore, they do so within the structural and usability terms of websites, especially of social networks. The lack of transcendence of online activism is due, firstly, to its aims and actions, which remain subordinated to the online/offline distinction. Thus, paradoxically, social platforms are used for the communication and organisation of activist collectives, but their demands and actions are focused offline. Secondly, online platforms favour organising capacity, visibility, the outlining of strategies, or the attraction of followers, but they also block or censor truly subversive criticism and activity. In this sense, the very structure used to organise activist collectives hinders their political projection. Therefore, only attacks on these platforms can be really transformative. So, today’s activism must be not just techno-activism but specifically hacktivism (see [8, 22, 45]). Forms of power are embedded in practices, so resistance must also be a matter of praxis. Specifically, it is a matter of practical knowledge. One of the major contemporary questions is thus how to emancipate alienated techno-symbionts. To answer it, at least four questions should be considered:

1. Understanding the symbiotic and alienating circumstance.
2. Understanding its conditions.
3. Understanding its logics.
4. The capability to challenge those logics.

The first three require the development of social, economic and political research. The last demands a technical knowledge embedded in subversive values. For all this, it is necessary, first, to develop an in-depth study of the informational economy focusing on the characteristics, causes and effects of the new value-chains. Because of the innovation logic, the market must be understood as a process rather than a structure. Secondly, it is necessary to examine, on the one hand, the marketing mechanisms aimed at understanding and manipulating people’s behaviour and, on the other, the technical mechanisms of plundering and mining information. Thirdly, each of these mechanisms has to be related to the big-tech companies that are developing and using them, to their (explicit and implicit) goals and interests, and to the possible consequences of all this. This is crucial to understanding how big-tech economics and power operate. Fourthly, it is necessary to create a framework of anthropological, social and political understanding that explains the characteristics of the onlife way of life. Such a framework should also explore, on the one hand, the reasons why the individual’s will gives rise to informational relationships and, on the other, the consequences and projection of all this for socio-political coexistence. Fifthly, it is important to define and contextualise the political problems inherent in the onlife way of life, as well as the values involved. These include values such as openness and sharing, whose meanings have been shifted. Looking into the meanings of values can offer ways of transgressing or reinventing them. Therefore, sixthly, the description of operative values must be accompanied by a proposal of others which would subvert the established order. These may include anonymity versus transparency, encrypted sharing versus open sharing, creation versus use and consumption, independent and chosen connection versus dependent and subordinate connection, and surprise and attack versus formal strategies and passivity. Lastly, it is essential to develop practices embodying these values. This will serve to shape subjectivities capable of subverting the exacerbation of individuality in favour of the realisation of collectives with common concerns and goals.

3.5 Conclusions

In its early days, Web 2.0 was a space full of social and even political possibilities. Donna Haraway’s Cyborg Manifesto fuelled such expectations. However, several dichotomies have been (re)assimilated within the ICT system’s development. This is particularly the case of the public/private dichotomy. The hypothesis guiding this research is that the cyborg political project is failing because privacy has become something techno-economic. Three objectives were set to evaluate this hypothesis. The first was to find out how the feminist movement has analysed and problematised the public/private distinction. The second was to explore the meaning of the cyborg political metaphor and its projections in cyberfeminisms. The third was to outline the characteristics of an effective techno-activism.

Regarding the first objective, this chapter has explored the relationship between the public/private dichotomy and the enlightened-liberal model of rationality that justifies the origin of the rule of law and defines life in terms of citizenship. This has shown how the establishment of the public and the private also sets a framework of practices and values, a socio-political hierarchy, and modes of coercion and subordination. Next, the different ways in which feminist theory has understood, interpreted and acted in relation to the private as an element of coercion and subordination have been outlined. It has thus been seen how the public and the private are spaces of socio-political and socio-economic power, as well as gendered spaces. Five tendencies have been distinguished within the feminist movement: suffragette-liberal, neoliberal, socio-liberal, socialist and radical. In the first, the public/private dichotomy is not questioned; freedom is understood in a political sense; the focus is on the public sphere; and political rights are fought for.
Thus, freedom is understood in terms of equal political rights and as a precondition for equality. Neoliberals see the political solely in the public sphere and argue that the state should not interfere in private affairs. From this perspective, political rights are seen as economic rights, and freedom and equality are understood in the same terms. Socio-liberals focus on civil rights, so their attention extends to both the public and the private political space. For them, freedom still prevails over equality. In contrast, socialists fight for social rights. Therefore, their focus is not so much on the public political space as on the social space (in its public and private dimensions). In their view, equality is the precondition for freedom.

In the 1960s, many of women’s personal problems came to be recognised as shared problems, and private space was conceived as a space of subordination. In this way, the private acquired public relevance. As a result, radical feminisms appeared in the 1970s. But socialists and radicals interpreted the private as political in different ways. For socialists, it meant challenging the disadvantages of socio-political, socio-economic, socio-historical and socio-cultural structures. Their aim was to break down gender roles and their associated modes of domination, subordination, and discrimination. For this reason, these feminisms are also called “equality feminisms”. In contrast, radical feminists claim that all forms of discrimination are based on a sexualisation that is a consequence of heteropatriarchal values and beliefs. Thus, they attach greater epistemic, social and political value to personal experience than to common experience. Moreover, they conceive freedom and equality in terms of recognition. So, they are concerned not with public reflection on the private, but with political action in the private sphere.

Since the late 1980s, all these issues have converged with the development of the ICT system. Regarding the second objective, Donna Haraway’s work inspired new cyberactivisms that are theoretically postmodern and practically technological. For her, new technologies opened up a space of opportunities and possibilities for subverting the dichotomies which characterised forms of life and power. Such technologies could be emancipatory tools. However, the postmodern character adopted by the new activisms of the 1990s led to an indeterminate concretisation of cyborg projects. Although this has favoured their artistic and cultural deployment, it has also complicated their techno-political articulation. Haraway’s original project followed a socialist path rewritten in terms of deconstruction. It focused on overcoming modern dichotomies and on direct participation in the creation and control of technologies. But cyberfeminisms were configured from radical perspectives where the strength of the collective was lost in its own fragmentation.
As a result, it was impossible to realise any common project on the scale demanded by the ICT system structure. Furthermore, it has been seen how this is linked to the decline of communitarianism, which causes an increasing indeterminacy of common-recognition instances. In this sense, a pragmatic way of understanding recognition instances has been proposed. In this vision, the first and most important instance of recognition is the contingency of one’s own interests and beliefs. This is the condition of possibility for constructing the public without renouncing individuality and plurality. Furthermore, from this perspective, it has been argued that ICTs must be transparent, socialised and democratised in both epistemic and socio-political terms.

Regarding the third objective, this chapter has analysed the reasons for online activism’s lack of impact and offered a characterisation of what an effective techno-activism should look like. However, because of the informational turn’s conditions of subjectification, the development of these techno-political movements requires prior efforts to understand and deconstruct the process of subjectification itself. Such a task presents four moments, in which the epistemic and the political intersect: understanding the symbiotic and alienating circumstance; understanding its conditions; understanding its logics; and developing the capability to challenge them.

We are shaped by complex networks that define and constrain us at the same time. Faced with the tyranny of data analysis and transparency, new forms of political sovereignty must be designed. Technopolitical activism cannot forget that freedom is a construction which requires social, cultural, political, and technical practices and tools. Moreover, we face challenges that should not be solved by authorities detached from social interests. It should not be forgotten that democracy is the privilege of being able to make mistakes: the privilege of being able to make ourselves.

References

1. Almendros, L. S.: Tecnopersonas: sujetos alienados. In: Gómez, C. M. P., Maffía, D., Moreno, A. (eds.) Intervenciones feministas para la igualdad y la justicia, pp. 280–296. Jusbaires (2020)
2. Alonso, A., D’Antonio, S.: El software libre y el Open Knowledge como comunidades de conocimiento paradigmáticas. Utopía y Praxis Latinoamericana 20(69), 83–92 (2015)
3. Benhabib, S.: Models of public space: Hannah Arendt, the liberal tradition, and Jürgen Habermas. In: Calhoun, C. (ed.) Habermas and the Public Sphere, pp. 109–142. MIT Press (1992)
4. Berg, A. J., Lie, M.: Feminism and constructivism: do artifacts have gender? Sci. Technol. Human Values 20(3), 332–351 (1995)
5. Brown, W.: Tolerance and equality: “The Jewish Question” and “the Woman Question”. In: Scott, J., Keates, D. (eds.) Going Public. Feminism and Shifting Boundaries of the Private Sphere, pp. 15–42. University of Illinois Press (2004)
6. Castaño, C., Webster, J. (eds.): Género, ciencia y tecnologías de la información. Aresta (2014)
7. De la Cueva, J.: Innovación y conocimiento libre: cuestiones morales y políticas. Isegoría 48, 51–74 (2013)
8. Deseriis, M.: Hacktivism: on the use of botnets in cyberattacks. Theory Cult. Soc. 34(4), 131–152 (2017)
9. Echeverría, J., Almendros, L. S.: Tecnopersonas. Cómo las tecnologías nos transforman. Ediciones Trea (2020)
10. Faulkner, W.: The technology question in feminism: a view from feminist technology studies. Women’s Stud. Int. Forum 24(1), 79–95 (2001)
11. Faulkner, W., Lie, M.: Gender in the information society: strategies of inclusion. Gend. Technol. Dev. 11(2), 157–177 (2007)
12. Fernández, M., Wilding, F.: Situar los ciberfeminismos. In: Zafra, R., López-Pellisa, T. (eds.) Ciberfeminismo. De VNS Matrix a Laboria Cuboniks, pp. 305–316. Holobionte Ediciones (2019)
13. Firestone, S., Koedt, A. (eds.): Notes from the Second Year: Women’s Liberation. Major Writings of the Radical Feminists (1970)
14. Floridi, L.: The Philosophy of Information. Oxford University Press (2011)
15. Floridi, L.: The Fourth Revolution. How the Infosphere is Reshaping Human Reality. Oxford University Press (2014)
16. Floridi, L. (ed.): The Onlife Manifesto. Being Human in a Hyperconnected Era. Springer International Publishing (2015)
17. Fraser, N., Honneth, A.: ¿Redistribución o reconocimiento? Morata (2006)
18. Freeman, J.: El movimiento feminista. Editores Asociados S.A. (1975)
19. Friedan, B.: La mística de la feminidad. Cátedra (2009)
20. Gal, S.: A semiotics of the public/private distinction. In: Scott, J., Keates, D. (eds.) Going Public. Feminism and Shifting Boundaries of the Private Sphere, pp. 261–277. University of Illinois Press (2004)



21. García Dauder, D., Pérez Sedeño, E.: Las ‘mentiras’ científicas sobre las mujeres. Catarata (2017)
22. Halupka, M.: What Anonymous can tell us about the relationship between virtual community structure and participatory form. Policy Stud. 38(2), 168–184 (2017). https://doi.org/10.1080/01442872.2017.1288900
23. Han, B.-C.: La sociedad del cansancio. Herder (2012)
24. Han, B.-C.: La sociedad de la transparencia. Herder (2013)
25. Han, B.-C.: Psicopolítica. Neoliberalismo y nuevas técnicas de poder. Herder (2014)
26. Hanisch, C.: The personal is political. In: Firestone, S., Koedt, A. (eds.) Notes from the Second Year: Women’s Liberation, pp. 76–78. Major Writings of the Radical Feminists (1970)
27. Haraway, D.: A cyborg manifesto: science, technology, and socialist-feminism in the late twentieth century. In: Simians, Cyborgs, and Women. The Reinvention of Nature, pp. 149–182. Routledge (1991)
28. Haraway, D.: Testigo_Modesto@Segundo_Milenio: HombreHembra©_Conoce_Oncoratón®. Feminismo y tecnociencia. Routledge (2004)
29. Lister, R.: Citizenship. In: Blakeley, G., Bryson, V. (eds.) The Impact of Feminism on Political Concepts and Debates, pp. 57–72. Manchester University Press (2007)
30. MacKinnon, C. A.: Toward a Feminist Theory of the State. Harvard University Press (1989)
31. Millett, K.: Política sexual. Cátedra (2010)
32. Mottier, V.: Pragmatism and feminist theory. Eur. J. Soc. Theory 7(3), 323–335 (2004)
33. Okin, S. M.: Gender, the public, the private. In: Held, D. (ed.) Political Theory Today, pp. 67–90. Polity Press (1991)
34. ONU: Declaración y Plataforma de Acción de Beijing (1995)
35. Pateman, C.: Feminist critiques of the public/private dichotomy. In: Benn, S. I., Gaus, G. F. (eds.) Public and Private in Social Life, p. 281. Croom Helm (1981)
36. Pateman, C.: El contrato sexual. Anthropos (1995)
37. Pérez Sedeño, E.: Las ligaduras de Ulises o la supuesta neutralidad valorativa de la ciencia y la tecnología. Arbor 181(716), 447–462 (2005)
38. Pérez Sedeño, E.: Mitos, creencias, valores: cómo hacer más científica la ciencia; cómo hacer la realidad más real. Isegoría 38, 77–100 (2008)
39. Pérez Sedeño, E., Almendros, L. S., García Dauder, S., Ortega Arjonilla, E. (eds.): Knowledges, Practices and Activism from Feminist Epistemologies. Vernon Press (2019)
40. Plant, S.: Zeros + Ones. Digital Women + The New Technoculture. Fourth Estate (1997)
41. Prügl, E.: Neoliberalising feminism. New Polit. Econ. 20(4), 614–631 (2015). https://doi.org/10.1080/13563467.2014.951614
42. Riley, D.: Prologue: the right to be lonely. In: Scott, J., Keates, D. (eds.) Going Public. Feminism and Shifting Boundaries of the Private Sphere, pp. 1–12. University of Illinois Press (2004)
43. Rorty, R.: La filosofía y el espejo de la naturaleza. Cátedra (1995)
44. Rorty, R.: Objetividad, relativismo y verdad. Paidós (1996)
45. Serracino-Inglott, P.: Is it OK to be an Anonymous? Ethics Glob. Polit. 6(4), 217–244 (2013)
46. Sollfrank, C.: La verdad sobre el ciberfeminismo. In: Zafra, R., López-Pellisa, T. (eds.) Ciberfeminismo. De VNS Matrix a Laboria Cuboniks, pp. 251–257. Holobionte Ediciones (2019)
47. Wajcman, J.: El tecnofeminismo. Cátedra (2004)
48. Wajcman, J.: From women and technology to gendered technoscience. Inf. Commun. Soc. 10(3), 287–298 (2007)
49. Wajcman, J.: Feminist theories of technology. Camb. J. Econ. 34(1), 143–152 (2009)
50. Wyatt, S.: Feminism, technology and the information society: learning from the past, imagining the future. Inform. Commun. Soc. 11(1), 111–130 (2008)
51. Zafra, R.: Netianas. Lengua de Trapo (2005)

Chapter 4

Gender Bias in Artificial Intelligence

Enrique Latorre Ruiz and Eulalia Pérez Sedeño

4.1 Introduction and Problem Statement

The last decade has witnessed remarkable advances in the field of artificial intelligence (AI), widening the horizons of practical application in a vast array of different contexts. This trend has resulted from the confluence of two key factors: the technical development of formal languages capable of expressing and modelling increasingly complex processes, and the promise of objectivity and axiological neutrality accompanying such formal systems. The logic underlying this promise is that their outputs and decisions cannot be biased because no fallible subjects take part in the calculation or deliberation process. Nevertheless, the rapid expansion of AI-based solutions to all areas of social life has brought new and unforeseen challenges, attracting the attention of both the scientific community and policy makers across the world [10]. Prominent among these challenges are biases derived from demographic data collected and processed by AI, particularly those related to race, ethnicity, and sex/gender identity. By demographic biases, in this chapter, we refer to systematic biases or errors that highlight certain aspects of experience to the detriment of others, which are ignored because of insensitivity to variables such as gender or race. This can have a direct negative impact on the quality of scientific research and output [17, 22, 33]. There is growing recognition that these demographic biases are a problem that skews the findings of all kinds of demographic research, not just research focused on gender/sex-sensitive aspects, race, or marginalized communities. Such biases can thereby have far-reaching impact in societies which increasingly depend on AI to govern themselves [9].

E. Latorre Ruiz · E. Pérez Sedeño (B)
University of Santiago de Compostela, Santiago, Spain
e-mail: [email protected]
Instituto de Filosofía-Consejo Superior de Investigaciones Científicas, Albasanz, 26-28, 28037 Madrid, Spain
© Springer Nature Switzerland AG 2023
J. Vallverdú (ed.), Gender in AI and Robotics, Intelligent Systems Reference Library 235




This is an emerging problem of our century, and there are many examples which attest to its gravity: facial recognition technologies that fail to properly identify Black or Asian individuals; automatic translation technology that renders nouns or expressions that should be neutral or feminine into the generic masculine (“the physician” as “el médico” (the male doctor), or “the teacher” as “el profesor” (the male teacher)); or the well-known case of the internet search engine which temporarily tagged “unprofessional hairstyle” solely with images of Black women. Algorithms and big data are so widespread that there are now automatic solutions for huge volumes of data that provide credit assessments to decide who can be granted a mortgage, gauge the level of danger of a health emergency, measure the compatibility of potential romantic partners and match them, identify illegal content in a mobile phone’s photo gallery, predict the risk of reoffending, or classify job applicants according to their suitability. Such a wide variety of contexts has inevitable consequences which may impinge upon fundamental rights [29].

Further complicating the issue, these biases do not appear only when categories such as race or sex/gender are used explicitly. Even in cases where these variables have been removed from the initial dataset, indirect relationships and correlations with these categories proliferate as algorithms make decisions (and learn from them) based on these correlations, or on “garbage” databases that collect biased or fraudulent data. Biases do not then appear in the surface layer of the data but reveal themselves through complex structural relationships only recognizable to a statistically trained eye [12]. By way of example, it is worth highlighting Microsoft’s Tay AI, which had to be withdrawn due to the controversy generated by the fact that, without anyone knowing how or why, it published racist, sexist, and Nazi slogans on social networks through its account @TayandYou.
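The point about indirect correlations can be made concrete with a small sketch. The following toy simulation is not from the chapter: the feature names, the 90% correlation, and the labelling rule are all invented for illustration. It shows how a model that never sees the protected attribute can still reproduce a gender-biased labelling simply by relying on a correlated proxy feature.

```python
# Toy illustration: removing a protected attribute does not remove bias
# when a correlated proxy variable remains in the data.
import random

random.seed(0)

# Simulate 1000 applicants. 'gender' is the protected attribute (never
# given to the model); 'proxy' is a hypothetical feature that correlates
# with gender 90% of the time; the historical label was driven by gender.
data = []
for _ in range(1000):
    gender = random.choice([0, 1])
    proxy = gender if random.random() < 0.9 else 1 - gender
    label = gender  # biased historical decision
    data.append((proxy, label))

# A "gender-blind" model that only ever looks at the proxy feature:
def predict(proxy):
    return proxy

# How often does the gender-blind model agree with the biased labels?
accuracy_vs_biased_labels = sum(predict(p) == y for p, y in data) / len(data)
print(f"agreement with gender-biased labels: {accuracy_vs_biased_labels:.2f}")
```

Despite never receiving the gender variable, the model agrees with the gender-driven labels roughly 90% of the time, because the proxy carries the protected information. This is the structural relationship, invisible in the surface layer of the data, that the chapter describes.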
AI-based solutions are no longer accessible only at a macro or supervisory level by large companies or political bodies; they are now part of our most intimate lives, controlled and structured through large digital platforms that influence opinions and social behavior. Among them, social networks occupy a pre-eminent place, as they have changed the rules that govern the functioning of the social and political system in our hyper-connected communities, segmenting the information that reaches us through their advertisements. What is shown, to whom and when, is also decided by an algorithm [20, 35]. To provide a general overview of the problem, our contribution in this chapter consists of a discussion of the types of demographically derived biases that can be identified, taking as our criterion the point at which they are introduced into the decision process as a whole. To do so, we will use the taxonomy proposed by Danks and London [11] and extend it to sex/gender-sensitive cases. We thereby hope to contribute to the understanding of a fundamental aspect of this problem, especially regarding the different strategies for resolution or mitigation that have been put forward to date. Such strategies require the intersectional contributions that have been made on the topic of bias and the analysis of language from a feminist perspective [24, 25]. As Selena Silva and Martin Kenney [36] have pointed out, understanding how and at what point in the operation of software there is a danger of introducing bias

4 Gender Bias in Artificial Intelligence


into automatic decision-making and machine learning processes is vital to ensure that society does not reproduce discrimination and inequality in the technologies that will shape the future world. It also matters because these systems can provide a small window through which to identify our own prejudices and biases, projected onto a machine, and can therefore be an opportunity to analyze and reverse values that we do not want to be part of our society. With this and other elements on the horizon, some policy institutions have addressed this growing debate. A 2014 report by President Barack Obama's administration analyzed privacy and data protection policy, but also addressed what it called "digital discrimination" [41]. Two years later, the Obama administration presented a second report, focused exclusively on the impact of algorithms on civil rights [42]. EU institutions have also shown an interest in the impact of AI technology on European society, presenting ethical guidelines for AI in April 2019 [13] that emphasize the need for responsible technological development ensuring trustworthy AI systems. In this proposal, trustworthy AI is understood as technology that both respects applicable European laws and regulations and is ethical, i.e., respects the principles and values of the European Union. On these principles, seven key requirements are proposed: (i) human agency and oversight, (ii) technical robustness and safety, (iii) privacy and data governance, (iv) transparency, (v) diversity, non-discrimination and fairness, (vi) societal and environmental well-being, and (vii) accountability. Along the same lines, a year later, in October 2020, the Presidency Conclusions on the Charter of Fundamental Rights in the context of AI and digital change were published [14], providing guidance on the challenges of AI in the context of justice, dignity, freedom and equality.
In addition, the European Commission has recently gone a step further by putting forward an ambitious proposal for the regulation of Artificial Intelligence in the EU, a legislative framework establishing uses and limits for applications that may infringe European rights and including an enforcement system that allows heavy fines to be imposed in cases of infringement. With this proposal, the European Commission hopes to protect the fundamental rights of users and consumers in the face of the risks posed by the growing use of AI systems (Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts). Among its most notable aspects is the requirement for a high degree of transparency in how algorithms operate, understood as the ability to reconstruct and replicate their operation and thereby provide a clear explanation for an output or behavior. Ultimately, if, as many experts have argued, science and technology implicitly or explicitly incorporate the beliefs and goals of their creators, we must ensure that these do not contravene the EU's principles of equality and non-discrimination. What all AI systems have in common is that they operate through algorithms that exploit the universality of computation [39] by capturing the knowledge and behavior of human beings, learning from our responses and behaviors, which makes them permeable to the values of our cultures. It follows that, to safeguard algorithms from gender bias, diversity in the field of machine learning is essential. But not only
diversity in the databases that are used to train these algorithms, but also in the places where these technologies are created and produced. Including women's voices in the development of AI and machine learning technologies could help solve the problem of bias, or at least mitigate its impact, by incorporating different perspectives into the evaluation of data and of the impact of gender in the contexts of technology use [24]. However, this is not happening in the contexts in which these technologies are being developed, which suffer from male over-representation and a parallel absence of racial diversity [40]. An analysis published in 2019 by the AI Now Institute looked at conference authorship data and found that fewer than 20% of the researchers submitting papers were women, and that only one in four students enrolled in AI programs at Stanford and Berkeley were women [43]. In another 2019 report, published by the British agency NESTA, a big data analysis of the 1.5 million papers available on arXiv, a preprint repository widely used by the AI community to disseminate its work, established that only 13.8% of the authors on this platform are women [37]. It is worth remembering, however, that the presence of women in the development of AI is not a sufficient condition for better science, although it is a necessary one: when scientific and technological developments are carried out from the point of view of groups traditionally excluded from knowledge spaces, fields of ignorance are identified, priorities are rearticulated, and new questions are posed [1, 17].

4.2 Algorithms, Formal Languages and the Mirage of Objectivity and Computational Neutrality

Algorithms are the great revolution of the machine century. They have provided us with technological devices that have changed the way we live in ways unimaginable only a few decades ago. When we refer to an algorithm here, we mean it in its most elementary sense, i.e., as a set of well-defined rules or instructions to be followed to perform a calculation or solve a problem. Many tasks in everyday life can be expressed as algorithms whose processes are computable, e.g., looking up a word in a dictionary or following a recipe. In the field of informatics and computing, algorithms together make up the software that establishes how data should be processed according to a pre-established set of rules in order to produce a result. In themselves, they are nothing more than small instructions in a programming language that prescribe procedures or regularities, but they play a fundamental role in all computer systems, and especially in those that are autonomous. Autonomous AI systems require the ability to adapt to different contexts, some even unexpected or ill-defined, and while physical design and robust system components are critical to this task, algorithms hold the key to these capabilities. In a way, they constitute the heart of the system itself, whether they are part of the machine's own software or have been used in its training.
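An elementary example in the sense just described: looking up a word in a sorted dictionary can be written as a short, well-defined sequence of steps (here, a standard binary search; the word list is of course invented for illustration).

```python
def dictionary_lookup(words, target):
    """Return the index of `target` in the sorted list `words`, or -1."""
    lo, hi = 0, len(words) - 1
    while lo <= hi:
        mid = (lo + hi) // 2          # inspect the middle entry
        if words[mid] == target:
            return mid
        elif words[mid] < target:
            lo = mid + 1              # target lies in the upper half
        else:
            hi = mid - 1              # target lies in the lower half
    return -1

words = ["algorithm", "bias", "data", "gender", "learning"]
print(dictionary_lookup(words, "gender"))  # 3
print(dictionary_lookup(words, "robot"))   # -1
```

Each step is fully specified and mechanical, which is precisely what makes the task computable in the elementary sense used above.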



In our modern societies, humans increasingly rely on autonomous systems, believing that by not involving a cognitive agent in the decision-making process, the result will be more accurate, uncontaminated by the flaws inherent to human behavior. This is the case, for example, with self-driving vehicles that will not be distracted by a phone call and cannot be fatigued by a long day's driving [6]. These tools project to the public the promise that the decisions they make are more coherent, more accurate and more appropriate than if they were made by humans. This confidence in the infallibility of autonomous systems is often underpinned by the assumption that computation, as the offspring of mathematics, is neutral and objective, automatically endowing AI with the capacity to be fair. In contrast to this image of AI, philosophical debates among computer scientists are becoming more and more frequent. They must deal with situations in which they have to decide what exactly it means for an algorithm to be fair, and they also have to define, mathematically speaking, what fairness is and how to operationalize it. We may then ask: is the idea of fairness capturable or expressible by an algorithm? And is there one, and only one, idea of mathematical fairness? These questions have no easy solution and involve a debate that requires time. At present, in the fields of computer science and statistics, different mathematical criteria are already being devised to define what it means for a model to be fair. The importance of this work is that it is a technical attempt to account for our most elementary moral intuitions, and inevitably these definitions are driven by values and political positions. Indeed, some scientists, such as Arvind Narayanan, have already identified as many as 21 different mathematical definitions of fairness [4, 28].
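That these mathematical definitions can conflict is easy to see on paper. The following toy sketch (all numbers invented, not taken from the literature cited above) computes two common criteria, demographic parity and equal true-positive rates (one clause of "equalized odds"), and shows that the same set of predictions can satisfy one while violating the other.

```python
def demographic_parity_gap(preds, groups):
    """Absolute difference in positive-prediction rates between groups."""
    def rate(g):
        scored = [p for p, grp in zip(preds, groups) if grp == g]
        return sum(scored) / len(scored)
    return abs(rate("a") - rate("b"))

def tpr_gap(preds, labels, groups):
    """Absolute difference in true-positive rates between groups."""
    def tpr(g):
        hits = [p for p, y, grp in zip(preds, labels, groups) if grp == g and y == 1]
        return sum(hits) / len(hits)
    return abs(tpr("a") - tpr("b"))

# Hypothetical loan decisions for eight applicants from two groups.
groups = ["a", "a", "a", "a", "b", "b", "b", "b"]
labels = [1, 1, 0, 0, 1, 1, 1, 0]   # who actually repaid
preds  = [1, 1, 0, 0, 1, 1, 0, 0]   # who the model approved

print(demographic_parity_gap(preds, groups))   # 0.0  (parity satisfied)
print(tpr_gap(preds, labels, groups))          # ~0.33 (equal TPR violated)
```

Both groups are approved at the same rate, yet deserving applicants in group "b" are approved less often than those in group "a": choosing which gap to minimize is exactly the kind of value-laden decision the text describes.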
In opposition to this aspiration to neutrality and objectivity stand the overwhelming commercial interests driving the digital platforms that make use of algorithms and machine learning procedures, such as Facebook, Uber, Airbnb, or Google. These large corporations offer services to users by means of tools or applications that involve AI and machine learning in their processes, enabling mutual interaction between users and the market. In performing these functions, digital platforms use user data to improve their own services and often do not consider the class divide in the use of these devices, which means that women and other impoverished groups, who use digital technology less, do not produce as many digital "crumbs". This aspect is doubly worrying: not only must the difficulty of access to ICTs be denounced in itself, but in the long run it can create a vicious circle of digital exclusion that is difficult to break out of. Women use fewer applications involving AI and machine learning, thereby producing less information with which to train these tools; they ultimately receive worse services and experiences, which further reduces adherence. Moreover, these platforms offer linking and communication services that have become indispensable to modern life, such as having an email account, meaning they end up structurally modifying the way we live. It is in this sense that we also say that algorithms are not neutral: it is not only about what algorithms are, but also about what digital platforms can do with them. In these corporations, decisions are
not made by citizens but by owners, on the basis of an agenda of economic growth and the principle of capitalist profit, which can sometimes mean that, in their very structure, algorithms incorporate self-interested ideas about relationships and correlations between people, services and market objects [30, 36]. Embedded in these tools, such biased or self-interested ideas can act as genuinely performative filters that shape our perception of the very phenomena they aspire to describe, partly constituting those phenomena and producing effects on the social system. This is what can happen with some algorithmic applications that use certain matching patterns to suggest dates or to grant visibility to some profiles over others, with the aim of attracting more users but also establishing the idea that there is an appropriate way to be desirable and arouse interest. Such problems become more challenging the more sophisticated the techniques used, especially if they involve machine learning technology, because the results obtained cannot be explained [26]. Opening the black box of machine learning processes must be a challenge for the future; otherwise, identifying biases and intervening in them will be a Herculean task.

4.3 Gender Biases in Natural Language Processing: Some Feminist Contributions

It is important to note that gender bias is not a new problem that has arisen exclusively with the emergence of artificial intelligence and big data. From the mid-twentieth century to the present, we can identify in feminist studies a continuous series of works dedicated to analyzing, identifying and intervening in gender biases in fields of knowledge as varied as medicine [21, 23, 32], psychology [34] and computer science [16, 38]. These works denounce how the predominance of the male scientist led to the elevation of man as the standard model of the human being, producing an androcentric science that excluded women as an object of study except to highlight their differences, especially in all matters concerning reproduction. Many of the current debates in AI on the problem of gender bias are very similar to the earlier debates on gender equality in society in the 1960s and 1970s; hence their relevance today. Since the 1960s, feminist studies have analyzed how women have been represented as passive, emotional and irrational beings in literature [27], and how the media have contributed to entrenching idealized portrayals of femininity in our social imaginaries [15]. This line of research on the role of discourse in the shaping and reinforcement of gender reaches into our century, where feminist theorists have questioned the active role of language in perpetuating gender ideology in society [7]. These seminal works are relevant to the analysis of gender bias in AI and big data because they have helped to unmask some of the ways in which gender ideology is embedded in language and influences people's values and perception of women, as well as the behavioral
expectations associated with gender. Moreover, we can still find this gender ideology in the numerous textual sources that are fed into algorithms through machine learning, where it gives rise to stereotypical concepts of gender. In the field of feminist linguistics, research on stylistics and semantics has identified some linguistic resources that can cause gender bias. As Leavy [24] has argued, these contributions can be useful for developments in computation by providing a framework with which to highlight gender biases in databases, which is a fundamental first step in training algorithms to identify and eliminate them autonomously. We will present three of them here: naming biases, ordering biases and biased descriptions. When we use words to describe groups of women and men, we often employ constructions that reveal subtle prejudices about women, which we call naming biases. These terms are part of the living history of languages, which is also the history of the people who spoke and constructed those languages, and they therefore reflect their social values and their understanding of the world. For example, we sometimes use "father" to describe men who are responsible for a family, "family men", but there is no equivalent identification for "woman", who is instead often associated with motherhood or care work. Something similar happens with certain professions to which a gender marker is added, implicitly signalling that their existence contravenes gender expectations: "female lawyer" or "female judge". We can also recognize this bias when we use words such as "man" to refer to the entire human species, or when we use a gendered name for certain highly masculinized or feminized professions where one gender is expected to make up most of the group, as with "firemen" and "nurses". Similarly, the order in which names appear when we organize a list may also reveal a gender bias.
In these cases, there is a clear linguistic staging of a specific social order in which the most important names occupy the most valuable and emphasized position, repeated by convention in some languages. This is what happens, for example, when the male is named first in expressions such as "son and daughter", "husband and wife" and "Mr. and Mrs.", but also in more subtle pairings such as "teacher and pupil" and "doctor and nurse". Another of the most common types of gender bias recognizable in the uses of language is produced by biased descriptions of women that end up configuring a fictitious and idealized representation. By this we mean how custom and the repeated use of certain gendered expressions can give rise to stereotypical associations or correlations. An analysis of the adjectives most used by the British press to describe men and women [8] found that men were usually described in terms of their actions or behavior, while women were described in reference to their physical appearance and sexuality, demonstrating that lexical choices produce differential judgements about women and men that have social effects. Similarly, after analyzing the contexts of use of the term "girl", it has been shown that girls are associated with worse contexts and are more objectified than boys, which has led to a recommendation to remove the adjectives
used to describe women from the databases with which machine learning algorithms are trained [24]. In a similar direction, the technique of word embedding has attracted attention in recent research on gender bias and artificial intelligence [5]. The authors of that work showed how the blind application of machine learning using this technique risks amplifying the biases present in the data. Word embedding is a natural language processing technique that represents text data as vectors, allowing words with similar meanings to have similar representations. It is, therefore, a distributed representation of text, one of the technical advances essential to understanding the impressive development of deep learning in natural language processing. Word embeddings trained on Google News articles, for example, have been shown to exhibit troubling gender stereotypes [9]. All these rhetorical language strategies denounced by feminist theorists need to be taken into account by algorithm developers, because they can help identify gender bias in the training data used for machine learning. In fact, the cases analyzed have been presented in such a way that it may be possible to identify them computationally, although this is a task that requires joint work between experts from different areas of knowledge and the inclusion of a gender perspective in AI and big data research. Overcoming bias in computing is a complex but not insurmountable task, and feminists have a lot to contribute here. Susan Leavy puts it best:

Kate Crawford aptly captured the ultimate cause of the prevalence of gender bias in AI: 'like all technologies before it, artificial intelligence will reflect the values of its creators'.
Societal values that are biased against women can be deeply embedded in the way language is used and preventing machine learning algorithms trained on text from perpetuating bias requires an understanding of how gender ideology is manifested in language ([24]: 1).
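The bias-probing idea behind the word-embedding studies discussed above can be sketched in a few lines. The sketch below uses tiny hand-made vectors (a real analysis would use trained embeddings such as word2vec; all coordinates here are invented) and measures how strongly a word leans along a "he minus she" direction.

```python
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

# Hypothetical 3-d vectors: the first coordinate loosely encodes "gender".
vec = {
    "he":         (1.0, 0.2, 0.1),
    "she":        (-1.0, 0.2, 0.1),
    "programmer": (0.6, 0.8, 0.3),   # leans toward "he" in this toy space
    "homemaker":  (-0.7, 0.5, 0.4),  # leans toward "she"
}

# The gender direction is the difference between the "he" and "she" vectors.
gender_dir = tuple(h - s for h, s in zip(vec["he"], vec["she"]))

def gender_lean(word):
    """Positive means closer to 'he', negative means closer to 'she'."""
    return cosine(vec[word], gender_dir)

print(gender_lean("programmer"))  # positive
print(gender_lean("homemaker"))   # negative
```

In embeddings trained on real corpora, occupation words acquire exactly this kind of lean from the texts they were trained on, which is what the debiasing work cited above tries to measure and remove.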

4.4 Critical Processes for the Introduction of Gender Bias in Algorithm Processing: A Taxonomy

Throughout this chapter we have presented the importance of the problem of bias in AI and big data and the urgent place it should occupy in our research programs because of its ethical and political implications. We have also situated this issue within some classic debates in gender studies that are patently relevant to it. As we have already argued, feminist theorists' proposals for analyzing sexism in natural language can help to identify gender bias in the training data used in machine learning. However, we also believe that several identifiable algorithmic biases have origins other than natural language processing, which also need to be assessed and evaluated. Currently, public discussions of "algorithmic biases" lump together, rather unhelpfully, many different types, sources and measurable consequences of
biases, with consequent difficulties in terms of technical analysis and proposed solutions. In some cases, intervening in these biases involves ethical or legal debates, while in others statistical developments are required, yet all of them end up being identified under the label of "algorithmic biases" in the absence of more coherent concepts [11, 36]. This leads us to believe that we are only seeing the tip of the iceberg, and that as our experience with these systems and our knowledge of how they work progress, new biases and new questions will emerge. Moreover, no evidence suggests that a single response will ever cover the whole variety of biases currently proliferating. Different strategies for successfully eliminating or mitigating them will therefore depend both on the nature and type of bias and on the values of the social group that must interact with the system. In this context, we present below a proposed taxonomy that may shed light on the concept of "algorithmic biases", for three reasons. First, it provides a holistic framework covering all the categories of bias that AI and big data research has pointed out to date. Second, this framework allows us to understand the particularities of these biases, opening the discussion of the different possible alternatives for addressing them. And finally, it introduces the social dimension of the uses and applications of these systems, an aspect we find particularly noteworthy.
To organize this presentation we draw on the model developed by Danks and London [11] and extended by Silva and Kenney [36], in which algorithms are considered chains of transmission and processing of information with four identifiable moments: (1) data input, (2) algorithmic operations, (3) output and reception of the results by the users of the algorithm and, finally, (4) monitoring and collection of the data with which the system is fed back.

1. In Data Input
   1.1. Bias in the training data
   1.2. Algorithmic approach bias
2. In Algorithmic Operations and in Outputting and Receiving Results
   2.1. Algorithmic processing bias
   2.2. Transfer context bias
   2.3. Misinterpretation bias
   2.4. Non-transparency bias
   2.5. Automation bias
   2.6. Consumer bias
3. In the Monitoring and Data Collection with Which the System Is Fed Back
   3.1. Feedback loop bias



4.4.1 Data Entry Bias

Data input is a crucial moment in the operation of an algorithmic system. If at the beginning of this chapter we referred to algorithms as the heart of autonomous systems, data is the blood circulating through the whole system and rendering it operative. It stands to reason that the system's entire functionality depends on the quality of this data. An adage in computer science sums up this idea in reverse: "garbage in, garbage out". Poor quality data used for training, or as part of an algorithm, will unavoidably result in poor quality results and bias. Two types of gender bias widely discussed in the literature can be introduced through data input into an algorithmic system: (1.1) training data biases and (1.2) algorithmic approach biases. Errors or incorrect sampling in the training data are one of the most common gateways for algorithmic biases. Although they may initially appear easy to detect, in practice this is very difficult because the companies or platforms that hold the data are reluctant to make it public. Bias in training data is sometimes caused by gross errors, such as when the data entered are not randomly sampled, which leads to over- or under-representation of some sectors [3, 11]. On the other hand, we also include in this category situations in which, when creating an algorithm, companies make strategic decisions, for reasons that are likewise not public, in a direction that affects the services offered by the algorithm on the basis of gender. In this sense, the way in which data input has been organized in the algorithm is as important as, if not more important than, the data itself [9]. In deciding which dataset the algorithm will work with and in designing a training strategy, biases arising from these decisions may be introduced, which we refer to as biases in the algorithmic approach.
In this way, we want to account for all those errors or malfunctions that may arise in the system from both including and excluding variables such as gender or race in the calculations.
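The "garbage in, garbage out" effect of non-random sampling is easy to reproduce. In this minimal sketch (all figures hypothetical), a simple rate estimator trained on a skewed sample completely misjudges the under-represented group, even though the estimator itself is error-free.

```python
# True population: groups "a" and "b" are equal in size, and in both
# groups half of the individuals have a positive outcome.
population = ([("a", 1)] * 50 + [("a", 0)] * 50 +
              [("b", 1)] * 50 + [("b", 0)] * 50)

# Biased training sample: group "b" is under-represented, and only its
# negative cases happened to be collected.
sample = [("a", 1)] * 40 + [("a", 0)] * 40 + [("b", 0)] * 10

def positive_rate(data, group):
    """Share of positive outcomes observed for one group."""
    rows = [y for g, y in data if g == group]
    return sum(rows) / len(rows)

print(positive_rate(population, "b"))  # 0.5  (ground truth)
print(positive_rate(sample, "b"))      # 0.0  (what a model would learn)
```

Nothing in the computation is wrong; the bias enters entirely through which records made it into the sample, which is exactly the training-data bias described above.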

4.4.2 Biases in Algorithmic Operations and in Outputting and Receiving Results

The processing of the data entered into the algorithm is another sensitive moment at which gender biases may arise. Whether because the calculation does not work correctly or because of a premeditated decision, these biases arise when the algorithm itself has a built-in procedure that is not statistically neutral. Sometimes there may even be good reasons to introduce biases into an algorithm, for example when a biased statistical measure is used to mitigate the impact of the training data as a source of bias, in the face of small population samples that introduce noise through over-representation. In these cases, there is therefore a deliberate
decision to use a biased algorithm to compensate for other biases and to ensure the robustness and stability of the system. We also speak of processing biases when algorithms do not consider the differences between cases and the particular context for which they have to produce a result [36]. For example, in the context of educational technology research, the authors of [19] compared the performance of autonomous assessment software with a traditional assessment system, that is, a teacher. They found that the algorithm's results generally fit the teacher's intuitions, but the algorithm scored only the average of the proposed tasks, whereas teachers adjusted their scores to the individual progression of the students. Some researchers [2] have shown that considering variables such as gender or race in algorithms that monitor prison permits can result in discrimination against women or Black people. In this situation, the solution most frequently chosen by developers has been to avoid these variables, but this has not solved the problem, given that there are other indirect variables closely related to the protected categories. In any case, many data protection rules and laws have gone in this direction by prohibiting the collection or use of the most sensitive demographic data, leading to situations in which a measure intended to protect against discrimination ends up having the opposite effect. This is what happens, for example, when we eliminate sex/gender as a variable in a health screening algorithm. The programmers of these systems are then faced with a situation where they themselves must carefully weigh the effects of these variables, taking into account their benefits and drawbacks [11]. Another bias occurs when users are unfamiliar with the interpretation of data from an algorithmic system, so that the interpretation is shaped by the user's internalized biases.
For example, some AI systems are currently being tested for predicting the risk of recidivism of people in prison. These systems output a numerical index representing the likelihood that a person will reoffend, but it is judges who interpret the score and personally make decisions on reintegration pathways based on their interpretation of what each score means. Interpretation bias therefore arises because it is a user who must interpret the data [36]. Another problem in data output that can lead to bias is the non-transparency of the functioning of the autonomous system. Due to technical progress in machine learning and the increasing use of huge databases containing large numbers of variables, algorithms are becoming more complex and their computations harder to monitor. Decisions have thus become increasingly opaque, not only for users and for the bodies responsible for monitoring and authorizing their operation, but also for researchers and system developers themselves, who are unable to explain how the results have been obtained or on the basis of which variables [12, 18]. This bias can occur, for example, when, faced with a rejected loan application, even the bank's staff do not know the reasons for the refusal. In these cases, the lack of transparency makes it difficult for the victims of such decisions to identify when outcomes are discriminatory, and makes it impossible for them to complain and thus to seek redress for the harm caused.
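The indirect-variable problem noted earlier in this subsection can also be demonstrated concretely. The sketch below uses entirely synthetic data (the feature name and all distributions are invented for illustration): gender is never given to the model, yet a single correlated feature is enough to reconstruct it almost perfectly.

```python
import random

random.seed(0)  # deterministic synthetic data

# Synthetic records: gender is dropped from the model's inputs, but
# "hours_of_care_work" correlates strongly with it in this invented
# population (means 20 vs. 5, same spread).
records = []
for _ in range(1000):
    gender = random.choice(["f", "m"])
    care_hours = random.gauss(20 if gender == "f" else 5, 3)
    records.append((gender, care_hours))

# A trivial "reconstruction": threshold the proxy feature halfway
# between the two group means.
def guess_gender(care_hours, threshold=12.5):
    return "f" if care_hours > threshold else "m"

accuracy = sum(guess_gender(h) == g for g, h in records) / len(records)
print(accuracy)  # close to 1.0, although gender was never an input
```

This is why simply deleting the protected attribute does not protect against discrimination: any model flexible enough to exploit the proxy can behave as if the attribute were still there.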



We can also identify another possible source of bias when users consider the results of algorithms to be objective, describing the world transparently. This is automation bias, and it rests on the user's misconception of the machine's purpose and capabilities, which leads them to think that its result is a neutral statistical calculation rather than a prediction with a certain level of confidence. The problem with automation bias lies precisely here: by eliminating the subject who issues a result, users do not perceive that there is room for error and uncritically accept the generated result [11]. The last of the biases in the output and reception of data is consumer bias. In essence, users may transfer their own biases to the online environments they interact with, sometimes with even less self-censorship than in analogue environments, owing to the feeling of protection and disinhibition provided by the impersonality of an online profile or comment. The problem is that, in many cases, algorithms can exacerbate these biases or give them greater visibility because of erroneous criteria, so that they end up reproducing and reinforcing discrimination and violence [36].

4.4.3 Biases in the Monitoring and Data Collection that Feed Back into the Algorithmic System

One of the great successes in the use and design of AI and machine learning tools is their ability to use the data they themselves produce, together with information about user interaction, to create more data and to correct or modify their behavior [44]. Undoubtedly, this is a feature that can greatly improve the efficiency of these developments and even favor their adaptation to different application contexts, but it can also be one of the moments at which gender bias is introduced into algorithms. An example is the algorithm behind Google's search engine, which, when faced with a query, records both the terms that users have chosen and the choices they have made among the results obtained. This circular dynamic feeds the system's databases, giving rise to better results in the future, but also reinforcing certain biases that are repeated in searches and that entrench prejudices and stereotypes. These biases, which we call algorithmic feedback loops, are very difficult to detect and even more difficult to resolve, because addressing them sometimes involves introducing intentional biases into the algorithms to nuance or correct their errors, and this can lead to new, unforeseen biases.
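The self-reinforcing character of such loops can be shown with a deliberately crude simulation (all numbers invented): a ranking that always promotes the already-more-clicked result turns a tiny initial imbalance into a large one after a few rounds.

```python
def run_rounds(clicks_a, clicks_b, rounds=20):
    """Each round, the more-clicked result is shown first and receives
    90 of 100 new clicks; the other result receives the remaining 10."""
    for _ in range(rounds):
        if clicks_a >= clicks_b:
            clicks_a += 90
            clicks_b += 10
        else:
            clicks_a += 10
            clicks_b += 90
    return clicks_a, clicks_b

# A tiny initial imbalance (51 vs. 49 clicks) hardens dramatically.
a, b = run_rounds(51, 49)
print(a, b)  # 1851 249
```

The initial 51/49 split could itself be the product of a biased dataset or a stereotyped query; the loop then entrenches it, which is why feedback-loop biases are so hard to detect after the fact.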

4.5 Conclusions

Throughout this chapter we have sought to provide an overview of the problem of algorithmic biases in AI and machine learning that allows us to better grasp the full extent to which the uses of this new technology can have an impact on sensitive issues such as gender equality or non-discrimination based on race or ethnicity. In our approach we have tried to give prominence to an understanding that incorporates the pragmatic dimension in the analysis, as we understand that malfunctions or errors may occur not only in language or calculation but also, and especially, in interaction with human beings. If algorithms and digital platforms offering services based on AI and machine learning are set to become indispensable tools for modern life, research into how they work and the implications of their use will require more and more attention. Although there are already prolific lines of research in this direction, with promising results that partially mitigate biases, there is still a long way to go in terms of a conceptual identification of each of them, especially if we consider that they will rarely operate in isolation and that we know hardly anything about their behavior when they interact. Even if the sources and types of biases end up operating together, finding an adequate taxonomy that accounts for all of them creates a wider space for reflection and technical intervention. Moreover, in our approach to the problem, we hope to have provided enough compelling reasons to argue that these problems cannot be solved by technical research alone but will require the joint work of experts with diverse points of view and from different areas of knowledge who can approach these debates from ethical, political, and epistemological perspectives.

References

1. Adán, C.: Feminismo y conocimiento: de la experiencia de las mujeres al ciborg. Spiralia Ensayo, A Coruña (2006)
2. Angwin, J., Larson, J., Mattu, S., Kirchner, L.: Machine Bias: There's Software Used Across the Country to Predict Future Criminals. And it's Biased Against Blacks. ProPublica (2016)
3. Barocas, S., Bradley, E., Honavar, V., Provost, F.: Big Data, Data Science, and Civil Rights (2017)
4. Barocas, S., Hardt, M., Narayanan, A.: Fairness and Machine Learning: Limitations and Opportunities (2018)
5. Bolukbasi, T., Chang, K.-W., Zou, J., Saligrama, V., Kalai, A.: Man is to Computer Programmer as Woman is to Homemaker? Debiasing Word Embeddings (2016)
6. Brandao, M.: Age and Gender Bias in Pedestrian Detection Algorithms (2019). https://arxiv.org/abs/1906.10490
7. Butler, J.: Gender Trouble: Feminism and the Subversion of Identity. Routledge, New York (1990)
8. Caldas-Coulthard, C.R., Moon, R.: 'Curvy, hunky, kinky': using corpora as tools for critical analysis. Discourse Soc. 21, 99–133 (2010)
9. Caliskan, A., Bryson, J.J., Narayanan, A.: Semantics derived automatically from language corpora contain human-like biases. Science 356, 183–186 (2017)
10. Courtland, R.: Bias detectives: the researchers striving to make algorithms fair. Nature 558, 357–360 (2018)


E. Latorre Ruiz and E. Pérez Sedeño

11. Danks, D., London, A.J.: Algorithmic bias in autonomous systems. In: Proceedings of the Twenty-Sixth International Joint Conference on Artificial Intelligence, pp. 4691–4697. International Joint Conferences on Artificial Intelligence Organization, California (2017)
12. Díaz Martínez, C., Díaz García, P., Navarro Sustaeta, P.: Sesgos de género ocultos en los macrodatos y revelados mediante redes neurales: ¿hombre es a mujer como trabajo es a madre?/Hidden Gender Bias in Big Data as Revealed Through Neural Networks: Man is to Woman as Work is to Mother? Revista Española de Investigaciones Sociológicas (2020)
13. EU High-Level Expert Group on Artificial Intelligence: Ethics Guidelines for Trustworthy AI. European Commission (2019)
14. EU Presidency: Artificial Intelligence: Presidency Issues Conclusions on Ensuring Respect for Fundamental Rights. EU Council Press (2020)
15. Friedan, B.: The Feminine Mystique. Norton, New York (1976)
16. Friedman, B., Nissenbaum, H.: Bias in computer systems. ACM Trans. Inform. Syst. 14, 330–347 (1996)
17. Garcia Dauder, S., Pérez Sedeño, E.: Las 'mentiras' científicas sobre las mujeres. Catarata, Madrid (2017)
18. Gershgorn, D.: AI is Now so Complex its Creators Can't Trust Why it Makes Decisions. Quartz (2017)
19. Guskey, T.R., Jung, L.A.: Grading: why you should trust your judgment. Educ. Leadersh. 73, 50–54 (2016)
20. Halberstam, Y., Knight, B.: Homophily, group size, and the diffusion of political information in social networks: evidence from Twitter. J. Public Econ. 143, 73–88 (2016). https://doi.org/10.1016/j.jpubeco.2016.08.011
21. Hamberg, K.: Gender bias in medicine. Women's Health 4, 237–243 (2008). https://doi.org/10.2217/17455057.4.3.237
22. Hare-Mustin, R.T., Marecek, J.: Making a Difference: Psychology and the Construction of Gender. Yale University Press (1992)
23. Holdcroft, A.: Gender bias in research: how does it affect evidence based medicine? J. R. Soc. Med. 100, 2–3 (2007)
24. Leavy, S.: Gender bias in artificial intelligence: the need for diversity and gender theory in machine learning. In: IEEE/ACM 1st International Workshop on Gender Equality in Software Engineering, pp. 14–16 (2018)
25. Leavy, S., Meaney, G., Wade, K., Greene, D.: Mitigating Gender Bias in Machine Learning Data Sets (2020)
26. Lepri, B., Oliver, N., Letouzé, E., Pentland, A., Vinck, P.: Fair, transparent, and accountable algorithmic decision-making processes. Philos. Technol. 31, 611–627 (2018). https://doi.org/10.1007/s13347-017-0279-x
27. Millett, K.: Sexual Politics. University of Chicago Press (1970). Política Sexual. Cátedra, Madrid (2020)
28. Narayanan, A.: 21 fairness definitions and their politics. In: FAccT Conference (2018)
29. O'Neil, C.: Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy. Crown Publishers, New York (2016)
30. Piskorski, M.: A Social Strategy: How We Profit from Social Media. Princeton University Press (2014)
31. Proposal for a Regulation of the European Parliament and of the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Union Legislative Acts. European Commission
32. Risberg, G., Johansson, E.E., Hamberg, K.: A theoretical model for analysing gender bias in medicine. Int. J. Equity Health 8, 28 (2009)
33. Ruiz Cantero, T.: Sesgos de género en investigación y atención sanitaria. In: Sanchez, P. (ed.) La salud de las mujeres, pp. 215–233. Síntesis (2013)



34. Sherif, C.W.: Bias in psychology. Fem. Psychol. 8, 58–75 (1998)
35. Sikder, O., Smith, R.E., Vivo, P., Livan, G.: A minimalistic model of bias, polarization and misinformation in social networks. Sci. Rep. 10, 5493 (2020)
36. Silva, S., Kenney, M.: Algorithms, platforms, and ethnic bias: an integrative essay. Phylon 55, 9–37 (2018)
37. Stathoulopoulos, K., Mateos-Garcia, J.: Gender Diversity in AI Research. NESTA (2019)
38. Tiainen, T.: Constructing gender bias in computer science. In: Encyclopedia of Gender and Information Technology, pp. 135–140. IGI Global (2006)
39. Turing, A.M.: Computing machinery and intelligence. Mind 59, 433–460 (1950)
40. Turner Lee, N.: Detecting racial bias in algorithms and machine learning. J. Inform. Commun. Ethics Soc. 16, 252–260 (2018)
41. U.S. Executive Office of the President: Big Data: Seizing Opportunities, Preserving Values. The White House (2014)
42. U.S. Executive Office of the President: Big Data: A Report on Algorithmic Systems, Opportunity, and Civil Rights. The White House (2016)
43. West, S.M., Whittaker, M., Crawford, K.: Discriminating Systems: Gender, Race and Power in AI. AI Now Institute (2019)
44. Zuboff, S.: In the Age of the Smart Machine: The Future of Work and Power. Basic Books, New York (1988)

Chapter 5

Emotional Intimacy and the Idea of Cheating on Committed Human–Human Relationships with a Robot

Julie Carpenter

Abstract There is an emerging body of research demonstrating that people interact with humanlike forms of AI (such as holograms and robots) socially in some situations, and furthermore may develop meaningful emotional attachment to them. Therefore, as the first generations of AI designed specifically to enhance human sexual pleasure are introduced, it is important to contemplate the ethical questions that arise, because these new technological entities introduce new social actors into the complicated mix of morals and emotional attachment in romantic and sexual relationships, with the unique characteristic of being both (a) technology and (b) socially meaningful. Exploring the dynamic of human–robot social interactions in this way also acknowledges that robots can be incorporated as meaningful social actors in human relationships, forming new dynamics between humans and technology. This chapter explores some of the cultural expectations that people have around human–human romantic commitment and explains how the introduction of AI-based sexual partners like sex robots will introduce new ideas and definitions about the ethics of emotional cheating when they are applied to a technological and sociological Other.

5.1 Introduction

The morality and boundaries of what constitutes cheating in a committed romantic relationship have been a persistent and universal topic of interest, discussion, and nuanced debate ever since society decided that sustained commitment to a romantic partner, with its set of meaningful cultural rituals and expectations for everyone involved to live by, was an ideal, or at least a set of ideas with universal roots as a persistent relationship model. Continued interest and conversation about these topics in the form of popular books and movies, as well as peer-reviewed research papers, resonates with individual common sense, because people

J. Carpenter (B)
Ethics and Emerging Sciences Group, San Luis Obispo, CA 93407, USA
e-mail: [email protected]

© Springer Nature Switzerland AG 2023
J. Vallverdú (ed.), Gender in AI and Robotics, Intelligent Systems Reference Library 235,




are interested in finding moral guidance and absolute truths to apply to their everyday romantic lives and relationships. In the field of human–robot interaction, research indicates people will anthropomorphize robots even when they are not designed to resemble something human or animal-like; design cues (e.g., morphology), the context of the robot's use, or its lifelike capabilities, such as responsiveness to natural language commands, are all affordances that can elicit unintended user social interactions when users attribute other lifelike qualities to the robot (Carpenter 2016; [1, 2, 20, 23]). There is also an emerging body of research demonstrating that people can develop emotional attachment to robots, even when a robot has not been designed specifically for social interaction¹ (Carpenter 2016; [60]). However, for the purpose of this chapter, the design of the robot as humanlike, mechanical, or abstract is less important than the dynamic imposed on it by elevating it from artifact to social actor through human perception and projection in a sociosexual dynamic. Correspondingly, if situations are emerging where people will be presented with the opportunity to engage with a robot in a way that is physically intimate, sexual, and perhaps emotionally salient, that also means there are ethical issues that will need to be considered, evaluated, applied, and re-negotiated as technology and culture change: in robot design, sales, safety and privacy considerations, in the ethics of emotional design, and from the user standpoint.
Therefore, with the introduction into the real world of humanlike robots designed to intimately interact with people by enhancing sexual pleasure via artificially intelligent sociosexual interactions, it is important to contemplate the emerging ethical questions associated with this new technological entity, which introduces another potential social actor into the complicated mix of morals and emotional attachment of romantic relationships. Exploring the dynamic of human–robot social interactions in this way also acknowledges and incorporates robots as meaningful social actors into culture by formalizing initial theoretical ways of understanding emerging human–robot social, sexual, and emotional attachment phenomena. The first iterations of sex robots in the emerging world market are humanlike in appearance and behavior mimicry, humanlike illusions, and they are also entities that humans have demonstrated they can become emotionally attached to in some circumstances [21, 24, 65]. Based on the premise that it is possible for people to become emotionally attached to robots, the humans who will be affected by the idea of romantic cheating are a range of people, some of whom may not have considered (or believed) the possibility of emotional entanglement with a robot prior to their own involvement in any part of this potential dynamic.¹ In other words, the initial exploration of this dilemma may be its discovery, and an acknowledgment that this situation is a possibility in a larger, cultural sense of understanding a new phenomenon. Furthermore, if the idea of cheating with a robot on a committed human relationship is entertained by those involved, how do we begin a larger, perhaps more abstract discussion about these possibilities and the emotional repercussions and emerging ethical expectations and standards?

¹ Interestingly, there is currently less research about human attachment or social interaction with highly humanlike robots, which may have to do with barriers to accessing such robots as research stimuli due to factors such as acquisition and maintenance costs. Because of the nature of human attachment, longitudinal studies of humans interacting with various robots in different contexts would be ideal, but the logistics and efficacy of long-term studies between humans and AI or robots can be problematic. Additionally, there are, clearly, ethical dilemmas about academically exploring situations that could put study participants at mental or emotional risk by forming (or not forming) relationships with the AI.

To bring a sense of scope to this discussion, although there are multiple perspectives in every act of cheating, in this case we focus on the ethics of the people and explore these topics with less regard for the AI-based social actor, the robot. Although the robot is a participant, it is passive in the sense that its actions are not rooted in conscious or sentient emotions or motivations, and therefore its actions are initiated and sustained by the humans in the scenario. Sexualized robots are already in development and in early stages of commercial release, with projects like Realbotix's Harmony, a chatbot precursor from 2017 that rapidly gave way to the company's X-Mode sex robot series, for sale and in early distribution by 2020 [17, 63]. Additionally, there are already people who identify as "Synthetiks," "robosexual," and "iDollators," and will purposefully seek out meaningful interactions with sex robots [24].

One of the challenges with exploring this topic is the sheer number of variables and the need to create a bounded area for research and discussion. The terms robot, sex robot, and sex doll need to be defined here for clarity. Here, a robot refers to an embodied computer that can interact with, sense, react to, and learn from its environment. A doll does not have these robot qualities embedded in its body, although it may have animatronic qualities (e.g., limited body part movement).
A sex doll refers to a humanlike doll designed for the user's sexual and/or social pleasure.² A sex robot is a robot that was designed to enhance human sexual pleasure. Necessarily then, this chapter, which takes a human-centered approach to the topic of emotional cheating on a human–human relationship with a robot as the imposing entity, begins with definitions of a committed relationship, cheating, and virtual cheating as they are applicable in this situation. Arguments regarding whether the development of sex robots is justified or moral are beyond the scope of this chapter, which focuses on the emerging social and cultural phenomena around human–robot sexual and romantic relationships.

² Some people who identify as sexually attracted to dolls, mannequins, and robots object to the terms "sex doll" or "sex robot." For example, Davecat, an iDollator and "advocate" on the topic, has said that using those terms is akin to calling a human partner a "sex girlfriend/boyfriend" and detracts from the depth of the socialness and caring he feels is integral to his attraction to the dolls/robots [24]. However, the dolls may be purchased by some primarily for sexual stimulation or to stimulate activities with a human partner(s), not necessarily for companionship; these self-reported reasons for ownership may also overlap or change over time [65].

Additionally, for the purpose of this work, relationships of interactivity will be indicated as human–human (HH), human–robot (HR), or, in the case of a committed human relationship dynamic (e.g., a couple) and a robot actor, human–human–robot (HHR). However, these terms are not meant to indicate an exact number of people (or robots) potentially involved in the primary romantic commitment(s) where the idea of cheating could emerge, and they do not assume a human–human dyad model of romantic exclusivity. Nor does the HH model indicate, assume, or suggest biological genders, gender identity, or sexual preferences of the people or robot personas involved. Yet the term HHR does indicate that at least two of the people in the human–human relationship(s) consider themselves to be committed to each other exclusively in romantic and/or sexual ways. Because of this notion of exclusivity and commitment, the idea of cheating or potential cheating with another actor outside that relationship becomes a cause for concern. The robot in this interaction will not necessarily have been designed with sexualized capabilities, because the significance lies in the human's emotions and actions, not the robot's design intent, and, as indicated earlier, people interact socially with a range of robots beyond their intended design use.

Cheating is a word laden with subtext, frequently dependent upon many very specific cultural notions that vary across gender norm expectations, religious beliefs, legal definitions, geographical boundaries, and generational lines. Here, we work from a base definition: there is an explicit commitment between at least two people, with a mutually agreed-upon understanding of emotional (although not necessarily physical), sexual, and/or romantic fidelity between them, and for one (or more) of the partners to deviate from that understanding constitutes cheating. Therefore, to establish a premise of emotionally intimate cheating in an HHR social system, the questions become: "Is it possible, and if so in what ways and to what extent, for people to form meaningful, emotionally intimate relationships with robots? Furthermore, if the nonacting human partner in this emotionally laden HHR scenario perceives actions between HR to be emotionally intimate, what is triggering this perception?"
To further clarify the ideas, the concept of virtual cheating is often defined differently than traditional cheating because it reflects the emotional states of the people involved, who do not necessarily ever have physical contact. Instead, virtual cheating is typically associated with persons expressing affection via email, text/sext, or in a virtual environment such as a video game or virtual world. Yet, like the idea of HR cheating, virtual cheating may also mean a person communicates their sexual fantasies or loving intentions toward a variety of nonhuman partners, such as an AI-powered chatbot. In this example, we again see a challenge with applying commonly understood terms to a discussion of new interaction phenomena, but it serves to illustrate the specificity and boundaries of this argument. While there can be common agreement that the physical aspects of cheating on a committed relationship are morally wrong, the debate about emotional intimacy sustained via virtual environments is less easily agreed upon [27, 42, 51, 52, 61], and is less relevant to a discussion of HHR circumstances because HHR situations may have both physical and emotional components.³

³ Additionally, sexual activity with a robot could be considered similar to the use of any other sex toy. Therefore, the subjective opinions of the people involved in the committed relationship have to determine whether the use of sex toys is problematic within their own boundaries, for whatever reasons, or whether a person can cheat with an object at all. However, if a person oversteps any defined boundaries with their partner(s) regarding sex toys by interacting with a robot sexually, they are breaking an agreement regarding sexual activity.



Therefore, virtual cheating has a place in the discussion of human–robot cheating because part of the proposition is similar; someone is having an intimate relationship outside the agreed-upon conventions of the primary relationship. A part of one definition of virtual cheating may be where the entity outside the primary relationship may not be organic itself, such as AI in the form of a chatbot or hologram. As with the premise of cheating with a robot, this type of virtual cheating means the bedrock of the accusation lies in the perception that the cheater has formed or established an emotional attachment beyond the scope of the primary relationship and does not have any physical expressions of this attachment, sexual or otherwise. At the core of the arguments and questions, though, is the idea that cheating is wrong. Virtual cheating is then less appropriate to use as a framework for understanding human–robot relationships in this situation because of the very physical nature of robot embodiment.

5.2 Romantic Emotional Intimacy

Emotional intimacy is a feeling of a bond and connectedness with another person [41]. Furthermore, this sort of intimacy may be regarded as a universal human need [3, 53]. Every individual has a level of intimacy that they desire [14, 43], and also a level of intimacy that characterizes their model for a romantic relationship. People in romantic relationships attempt to maintain intimacy with tactics such as self-disclosure and conveying a sense of understanding and responsiveness toward their partner [40, 58], with responsiveness also being a quality that enables emotional attachment [48]. Emotional intimacy may or may not be an explicitly expressed goal in an HH relationship, although presumably even in a relationship where it is not discussed as its own objective, its influence or valence is a subjective experience [31]. Emotions, and the agreement of shared (versus individual) perceptions about emotional experiences, are the very essence of positive functioning in a romantic relationship. Furthermore, positive emotions are associated with the beginning of a relationship, while negative emotions are a precursor to the end of one (Gottman 1994, 1999). The experience and expression of shared emotional intimacy is often one of the primary building blocks in the common definition of a successful romantic relationship. However, there has been relatively little scholarly exploration of romantic emotional intimacy from a human to a robot, specifically [27, 42]. The burgeoning body of work demonstrating human–robot attachment possibilities indicates that this condition should not be disregarded as niche or otherwise deviant human behavior that results only from the psychological vulnerability of the person.
In fact, it may be a very normal human reaction to a robot in certain situations triggered by a combination of factors spanning the cultural expectations and beliefs of the user, their cognitive model of human-to-robot interaction, design cues from the robot that afford (previously exclusive) organic or humanlike modes of interaction (e.g., natural language processing), and the persistence of the individual human–robot relationship over time.



Cheating is considered a violation of relationship expectations. One theory of emotion and relationships suggests that emotion results from a disruption of personal scripts, or interactions that deviate from the expected pattern [5, 6]. If the disruption facilitates the relationship partners moving toward a mutually agreed-upon positive goal, the emotions have a positive association. However, if the disruption blocks progress toward goals in some way, it imparts negative emotion. The goal that is enabled or stymied may be outside the relationship and not directly pertain to it. Some relationship-specific goals are to avoid undesirable conditions, including threats to the relationship itself, such as emotional and/or physical cheating. One example of a threat to a relationship is conflict, or anguish, fleeting or persistent, between the people involved [36]. Furthermore, each person has a level of relationship conflict that they are unwilling to manage, and that level of intolerance is a state committed partners try to avoid [35]. Conflict can come in many forms and can have an emotionally hurtful impact on romantic relationships [8]. Behaviors such as criticism can cause distress [12], and the resolution of conflict relieves distress [33]. Conflict resolution is often the focus of research, rather than identifying incentives and motivational factors that encourage people to move toward shared intimacy [32]. Although conflict and loss of intimacy in a relationship may be related, it should be noted that emotional intimacy is viewed as a positive construct, while conflict is a threat to the state of the relationship.

5.3 Formal and Informal Conditions and the Elastic Definitions of Emotional Cheating

Legalism is another way to formally define morality within a society; if the law deems adultery a crime, for example, then it is commonly understood that adultery is considered wrong. Yet people do not typically consult the law when immersed in the emotions and decisions necessary to navigate their everyday relationships, and so they determine amongst themselves what is acceptable romantic behavior between partners and what is considered out of bounds of the relationship. To simplify an extraordinarily complicated set of relationships and subjective experiences, this chapter will use the lens of determining the morality of an action based on its harmful and beneficial consequences for the participants. Using this criterion necessarily encompasses legal and religious considerations for individuals, because their belief system may be emotionally entwined with legal or religious considerations. Various religions offer definitions, guidelines, and canonical law regarding cheating, as do society's legal structures. However, aside from the religious and legal doctrines concerning romantic fidelity, another approach to addressing the issue of cheating is to question whether it is an immoral act, or set of actions and behaviors, because it will have harmful consequences for the primary relationship, and possibly negative effects on the person(s) involved outside the primary commitment as well. This assessment of negative emotional repercussions because of a partner (or partners) attaining



emotional connections beyond the primary HH relationship is not easily bounded or measured. Furthermore, there are the varying behavioral states of cheating: whether the partner admits their transgressions, stops cheating, continues to cheat, or some combination of these circumstances. It is important to note that these consequences are not always perceived as negative by each party involved. After all, for the cheating parties there may be states of emotional or physical pleasure associated with the acts of cheating, but the morality of the acts is the valence for determining whether they are deemed wrong behaviors. One way to define emotional cheating, then, is that when it creates more negative consequences than positive ones, it may be considered morally wrong. But then how do parties keep a running tally of weighted responses to figure out whether the fallout of cheating created more positive or negative consequences? It is generally agreed that what defines cheating is that someone in a committed relationship engaged in sexual and/or romantic activity with a person outside of that relationship. Viewed this way, emotional cheating involves three main dynamics. The first is that the cheater is in a relationship that is supposed to exclude cheating. The second is that there are sexual or romantic activities beyond the agreed-upon primary relationship. Third, by including the possibility of romantic entanglement in addition to purely sexual or physical actions, the scope of cheating then includes the emotional ramifications for all (human) parties involved. Instinctually, it may seem obvious that physical cheating is easier to define than the boundaries and activities of emotional cheating. However, expressions of physical affection are open to ambiguous interpretations.
As examples, exchanging cheek-kiss salutations or a hug between friends are physical expressions of platonic friendliness that are open to individual perceptions about their social construction for all parties involved, including the social actors and the passive observers. Acknowledging these subjective readings of physical expression is important because a discussion of emotional cheating is not exclusively a realm of what occurs intellectually, and there may be physical manifestations of the emotional bond that are less obviously categorized as purely sexualized or romantic in intent or reception. The successful social exchange of a friendly hug between friends relies on all parties agreeing, to some extent, on the emotional and intellectual interpretations, or some emotional distress may be caused by the misreading of intentions. As with expressions of physical affection, a discussion of emotional affection, and possibly attachment, is centered on what are considered appropriate behaviors. This idea of appropriateness is the area less easily defined formally for all parties in an emotional cheating dynamic, because internal motivations, such as intent, are not necessarily clear even to the principal social actors in a scenario. However, if a person is expressing affection and perhaps romantic overtures toward someone beyond the committed primary relationship, whether these overtures are reciprocated or not, it may be regarded as emotional cheating. But if this sort of cheating does not involve sexual contact, why does it matter? The answer is that the emotional damage caused by a violation of trust between the primary committed relationship members is morally wrong.



5.4 Robots as Sociosexual Actors

The discussion of emotional cheating thus far has focused on a model of human–human romantic relationships to provide a framework for understanding the factors involved in the complicated set of potential emotional states and their related behaviors in emotional cheating. Using this HH model has the advantage of a previous and ongoing body of research that is relevant to the topic at hand. Additionally, in discussions of the idea of emotional cheating, there is a common understanding that for cheating to occur, the "true" fault of cheating outside the primary committed HH relationship lies in the human actors, as robots do not currently have formal societal status (e.g., legal rights) and lack humanlike internal states that we recognize as having the same valence to us as love or autonomy do. Therefore, without these aspects of human-level intelligence or accurate understanding of emotional states, it is the human factors that would contribute to the very idea of cheating in any form, not the robot.

There is a burgeoning body of research demonstrating that people interact with robots in social ways that sometimes resemble HH interactions, and even develop meaningful emotional attachment to them, and this set of phenomena distinguishes robots as their own emerging and undefined ontological category(ies) [37, 38, 56]. Indeed, the Robot Accommodation Dilemma (RAD) also acknowledges this tendency in some HR relationships and attempts a working definition of the phenomenon (Carpenter 2016). The original application and meaning of the words accommodation dilemma stemmed from two main patterns revealed in data regarding soldier human–robot interactions, where soldiers demonstrated a specific set of expectations, beliefs, and actions with the robots they used every day:

1. Regarding robots as critical tools, and the importance of thoroughly recognizing robot capabilities and limitations.
2. Defining robots as mechanical, yet still developing ways of interacting with robots as a technology (e.g., as an extension of self, humanlike, animallike, or uncategorized "other").

RAD offers some overlap with Reeves and Nass' Computers as Social Actors (CASA) theory (1996), which claims people interact with computers in fundamentally social ways. Moreover, Reeves and Nass drew a parallel between human–computer social interactions and an HH communication model [50]. RAD is distinguished from CASA in two ways:

1. The entity discussed is recognized and perceived as a robot, and in that way is distinguished from a computer.
2. The social category of robots is emerging, and undefined, and in that way not simply "like people."

The ideas of CASA and RAD are important to understand in order to accept an HH model of emotional cheating applying to an HHR situation. Research about emotional connection with a product or similar nonliving entities often entails an exploration of design

5 Emotional Intimacy and the Idea of Cheating on Committed …


elements (e.g., aesthetics), or specific factors such as product durability or brand loyalty [34, 46]. Additionally, the idea of persistent emotional attachment to a product often centers on psychologically unusual behaviors, sometimes encompassing objectophilic circumstances [30]. Yet, the emerging interdisciplinary body of human–robot interaction research indicates that interacting with robots in ways that resemble human–Other interactions is normal and even intuitive behavior in some contexts. Incorporating Chapman's [13] emotionally durable design concept is another framework for understanding the complexities of HR emotionally centered interactions from the human perspective. To define emotional cheating with a robot—an intrinsically nonhuman entity that participates only as an object of affection without humanlike motivations—adopting emotionally durable design as a context offers a rich way to comprehend the potential meaningfulness a person may project onto a robot. In their explanation of emotionally durable design, Moran and O'Brien propose these parameters [45, p. 148]:

1. Narrative: Users share and develop a unique, personal history with the product.
2. Consciousness: Products are perceived as autonomous and in possession of their own free will.
3. Attachment: Users feel a strong emotional connection to a product.
4. Fiction: The product inspires interactions and connections beyond the physical relationship.
5. Surface: The product ages gracefully and develops character through time and use.

Importantly, the fifth point, surface, implies that the durability of a robot as a product is significant in its role in the dynamics of HR and HHR relationships; this idea ties in neatly with aspects of attachment theories, which also have a temporal component for the affectionate bond to form in the eye of the beholder [9, 10, 18].
This theoretical framework encompasses the multiple dimensions of an emotionally meaningful robot relationship for the user. Applied to the design perspective, this context also suggests an intent by the designers to create a product that users find deep satisfaction with, to the point of careful use and maintenance of the object so it is sustained for long-term functionality for the user without product replacement. The idea of interacting sexually with a robot is not a concept every person finds appealing, let alone entertaining the likelihood of forming an emotional bond toward a robot. Nevertheless, in a 2013 poll of one thousand American adults, 18% of respondents said that they believed robots made specifically for sexual purposes would be available by 2030, and 9% claimed they would have sex with a robot if they had the opportunity. In the same poll, when asked about the ethics of a spouse turning to a robot for stimulation during the marriage, 42% of survey respondents said this act would be viewed as cheating, whereas 31% said it would not be regarded as cheating, and 26% replied they were unsure. Americans in this group of respondents under the age of thirty were almost as likely to say it would not be cheating (34%) as that it would (36%), while respondents over the age of sixty-five were far more likely to say that it would, by a 52% to 24% margin. By 2020, a reported 22% of Americans said they would have sex with a robot, a figure up six points from



the same poll conducted in 2017. Furthermore, 27% said yes, 31% said no, and 41% were unsure in response to the question "If you had a partner who had sex with a robot would you consider it cheating?" Note that these survey questions were asked without explanation, elaboration, or situational context. In 2017, Pearson's same poll of 1146 American adults revealed that 49% of respondents agreed that sex with robots will become commonplace practice in the next fifty years. Of those, nine percent of women said they would consider having sex with a robot, whereas 24% of men similarly stated they would interact sexually with a robot. Fifty-two percent of those who said they were open to robot sexual intimacy also indicated a preference for a humanlike robot design. Yet, most Americans were not adamant in their position that sex with a robot should be classified as a parallel experience to traditional HH sex. Fourteen percent of the study participants said they would categorize sex with a robot as intercourse, and 33% indicated the act should be labeled as masturbation. Interestingly, 27% of respondents did not feel either intercourse or masturbation was an accurate term for HR sex. Respondents were similarly divided over the idea of cheating with a robot partner, with 32% indicating sex with a robot should be considered cheating, and 33% saying it should not be defined that way. Thirty-six percent of female survey respondents considered robot sex cheating, and 29% said it would not be. For male respondents, the numbers almost invert, with 29% claiming sex with a robot is cheating, and 37% saying it would not be. Self-reported measures that require the prediction of one's own behaviors are indicative of opinions and beliefs, and not necessarily a gauge of real behaviors in situ; yet, these polls indicate that some people are already open to the idea of sex with a robot.
These polls also indicate people have initial beliefs and expectations about the meaningfulness assigned to the idea of engaging with a robot sexually outside a HH relationship. Assuming any spectators to an AI–human interaction understand the AI is artificial and nonorganic, the AI's value and usefulness are derived from its ability to successfully please the owner or other humans in the scenario. This dynamic establishes some important points about how a sex robot may be regarded at a very personal level. As humans, we readily apply our gaze to a robot as we do in our interactions with sociological Others, but also regard it as a new ontological category.

5.5 Thinking of Robots as Other

One way to explore all of these topics about culture, perceptions of AI and robots, and our complicated relationships with them is to use the framework of our human gaze: the human tendency to use an anthropomorphic model of understanding AI in terms of intelligence and one-sided perceptions that we, as humans, project onto robots, which are in turn regarded as who or what we want them to be in our lives. The human gaze is a phrase used to signal an idea inspired, in part, by Sartre's le regard—also referred to as the look or the gaze—one's awareness and perception of other individuals and other groups in relation to oneself [54]. Various applications of this



concept have evolved from Sartre's work; Derrida wrote of interspecies perception between humans and animals via a version of the look [26]. Thus, the human gaze is a framework for understanding our perceptions of our relationship to robots as sociological Other, and frames how humans are designing, presenting, and interacting with robots now. Therefore, the human gaze toward robots consists of three group perspectives:

• robot designer(s)/developers
• people interacting directly with the robot
• spectators to the human–robot interaction

The human gaze also invokes the belief systems of the social actor's perceptions of an AI and suggests a way of looking that empowers humans over robots. This form of gaze may or may not objectify AI as mere products and can encompass interactions where the human believes the AI is lovable, or otherwise worthy of affection, but it implies the power imbalance that exists since current and near-future AI are part of a world where their existence is believed to be less significant than human life within the constructs of society, including legal classifications and limitations inherent to AI being an "owned" product. Finally, using the human gaze as a scaffold for discussion, this framework encompasses the tendencies of people to treat robots as a sociological Other, or as an extension of Self; both models of understanding hinge on viewing the AI as something humanlike. The very definition of a sex robot is not universally agreed upon in this era of its invention; even the creators of the emerging early models publicly debate the limitations and capabilities of their competitors' products, and what constitutes a "sex doll" versus a "sex robot" [11]. From the development and design point of view, for this discussion, the AI mechanisms as well as the robots' exteriors are conceived as things that will be functional and aesthetically pleasing for the human user.
The AI may be customized in appearance and programmed with, or may learn, behaviors to please the human user, and in this way reflects the power of people interacting directly with the robot. As examples of existing sexualized AI, affiliated companies Realbotix and RealDoll have developed the RealDollx series, a customizable doll model with X-Mode ([39], RealDoll, n.d.). Realbotix has incorporated animated facial expression—including eye, mouth, and neck movement—into their highly lifelike sex dolls. The eyes can blink, and plans for built-in cameras are "in development" (RealDoll, n.d.). Additionally, via an app the user can design and dress a digital avatar version of their robotic doll and choose its physical features, persona (talkative, shy, sensual, funny), moods, desire level, voice, and accent. The robotic dolls can also be adapted to recognize the voice of their owner [11]. RealDollx models also integrate a "vaginal style" called the Sensex Bluetooth Insert that can detect and react to touch, movement, and "transitions from mild arousal to orgasm" (RealDoll, n.d.). Interestingly, at this time, the available options for RealDollx features are only available in the products that are female-gendered by the company and resemble women.



A proposed human gaze can take many forms, but it is recognizable as a dynamic that explains human control over robots from abstract idea through invention, production, ownership, and cultural integration. Thus, the human gaze refers to the development of AI from a human-bounded set of expectations and human-created data. Additionally, according to this gaze, as users or actors with robots in the world, human expectations and perceptions of them are ultimately based on biases toward seeing AI as interpretations of human culture, understanding, meaning, and desired goals. Furthermore, this lens will change as humans adapt to AI and robots in their everyday spaces, and robots become accepted social actors over time. However, what we experience as our human Self—or person, soul—is a product of our physical interactions with the world, biochemical processes in our physical being that we experience through our particular senses, and our mind's interpretation of our past experiences. Sex robots may be given complex back stories by their owners, come with narratives suggested by company-branded AI, or be a collaboration of the user(s) and AI to achieve a sense of character development or persona for the robot. Users may assign them internal motivations, consider sex robots significant actors in their romantic lives, or associate other humanlike emotional qualities with them [21, 27]. However, the dynamic of HR is still influenced by the human's knowledge that they are interacting with a thing developed to gratify them emotionally, physically, or both. Considerations for the social and sexual interactions are focused on human needs and preferences, even if the user believes the sex robot is capable of some degree of humanlike emotional responsiveness.
This is necessarily so, as any activity in the dynamic centers on the human at its core, since AI does not demonstrate desire or needs for emotional fulfillment, affection, or sexual desire in any way we recognize. If AI were to become self-aware, the dynamic would have the opportunity to change in significant ways, but that possibility is not under consideration within the scope of this chapter. Dennett [25] claims that people do not imagine the intentional states of other people only to predict their behavior; rather, the fact that others have thoughts and feelings similar to ours is central to concepts such as trust, friendship, and love. One objection to Dennett's position asks whether it "matters" to people whether an object, such as a sexualized robot, has an inner life or not. While the robots discussed here that act as sexual stimuli resemble humans, they do not resemble people in perfect mimicry of human intelligence or behavior. In 1950, Alan Turing proposed the imitation game, commonly referred to now as the Turing test, to deduce whether computers could think, or demonstrate intelligent behavior similar to a human's. Part of the reason the imitation game persists as a standard for testing humanlike AI is that we continue to use human intelligence as the yardstick to measure AI's outcomes; an example of our human gaze at work in this lens of imitation. As Turing acknowledged, humanlike intelligence is a nebulously interpreted goal as a benchmark of programming and computer learning, if not of human expectations and interactions with it as a tool, which is why he preferred the term think.



In regard to cheating with a robot, another approach is to consider the matter from the perspective of the human gaze—if the human engaged in sexual activity with the sexbot regards it as being person-like enough, then the activity can be seen as cheating. An objection to this is that it does not matter what the human thinks about the robot; what matters is its actual status in larger society. However, if a human regards a human they are cheating with as a mere object, this does not mean they are not cheating. Likewise, if someone feels like they are cheating, it does not mean they really are. An oppositional stance to that argument is that how the human social actors interpret their own behaviors matters. In other words, if someone believes they are cheating and engages in those behaviors anyway, their actions are morally wrong. Ultimately, the emotional ups and downs, enmeshment or resistance, trust or betrayal, lie within each person. Robots may encourage our emotions in many directions, but they have no emotional motivations in the situation because they currently lack anything resembling consciousness, sentience, or sentiment. Emotional costs and boundaries in a human–human relationship are continually reevaluated, negotiated, and recalibrated. What remains between the people involved are their own emotions and the agreements between them. As a practical matter, the future is likely to see people working out the specifics of their relationships in terms of what sorts of virtual and robotic activities are allowed and which are out-of-bounds.

5.6 Conclusions

Karel Čapek's 1921 play Rossum's Universal Robots (RUR) introduced the idea of humanlike mechanical artificial entities, coining the term robota, or robot. In the story, two of the robots fall in love. Since this first fictional representation of emotion-capable, autonomous, human-created humanlike machines, there has been the popular idea that one outcome of such a cultural shift would be romance in one form or another. The current technologies offer little more than animatronic sex dolls with associated phone-based apps, yet advances in lifelike robotics, AI, and the lucrative business of sex toys mean that more convincingly biomimetic sex robots are almost certainly on the horizon. The technological problem of building machines that are realistic—if that is the end goal—is one set of (complicated) challenges for the industry. Yet cheating is a subjective phenomenon that is decided between human participants within a relationship, often based, at least in part, on their own culturally situated beliefs. However, the robot is not a person, and so the traditional definitions of cheating are not met. Currently, sex robots do not meet the standards society sets for personhood, and society is only beginning to acknowledge that they are objects that can be socially meaningful. In the near future there will be robots advanced enough to evoke, even briefly, the experience in people that they are interacting intimately with another human being. Thus, how can we best understand and evaluate such relationships? This initial formalized discussion of the (currently) controversial, highly subjective, and complicated topic of romantic cheating with robots is one of the initial steps



towards introducing frameworks for considering the dynamics of HHR specifically. Next steps should involve careful exploration of these dynamics, which may yield useful insights that can be applied in psychologically therapeutic work. Importantly, implications for user emotional safety should remain the focus of future research. The ways we design and use such robots are bound to affect us, and we must consider ways of developing and using these technologies thoughtfully.

References

1. Abe, N.: Human–machine interaction and design methods. In: The Routledge Social Science Handbook of AI, pp. 138–154. Routledge (2021)
2. Bartneck, C., Reichenbach, J., Carpenter, J.: The carrot and the stick: The role of praise and punishment in human–robot interaction. Interact. Stud. 9(2), 179–203 (2008)
3. Baumeister, R.F., Leary, M.R.: The need to belong: Desire for interpersonal attachments as a fundamental human motivation. Psychol. Bull. 117(3), 497–529 (1995). https://doi.org/10.1037/0033-2909.117.3.497
4. Leary, M.R., Baumeister, R.F.: The need to belong. Psychol. Bull. 117(3), 497–529 (1995)
5. Berscheid, E., Ammazzalorso, H.: Emotional experience in close relationships. In: Blackwell Handbook of Social Psychology: Interpersonal Processes, pp. 308–330 (2001)
6. Berscheid, E.: The emotion-in-relationships model: Reflections and update. In: Memories, Thoughts, and Emotions: Essays in Honor of George Mandler, p. 323 (1991)
7. Block, N.: Psychologism and behaviorism. Philos. Rev. 90(1), 5–43 (1981). Duke University Press
8. Booth, A., Crouter, A.C., Clements, M., Boone-Holladay, T. (eds.): Couples in Conflict. L. Erlbaum Associates (2001)
9. Bowlby, J.: The nature of the child's tie to his mother. Int. J. Psycho-Anal. 39, 1–23 (1958)
10. Bretherton, I.: The origins of attachment theory: John Bowlby and Mary Ainsworth. Dev. Psychol. 28(5), 759 (1992)
11. Brown, M.: Sex robot Samantha just got dissed by rival in fierce fight. Inverse (2017). Retrieved
12. Carstensen, L.L., Gottman, J.M., Levenson, R.W.: Emotional behavior in long-term marriage. Psychol. Aging 10(1), 140 (1995)
13. Chapman, J.: Emotionally Durable Design: Objects, Experiences and Empathy. Routledge, UK (2015)
14. Clark, M.S., Reis, H.T.: Interpersonal processes in close relationships. Annu. Rev. Psychol. 39(1), 609–672 (1988)
15. Clayton, R.B.: The third wheel: the impact of Twitter use on relationship infidelity and divorce. Cyberpsychol. Behav. Soc. Netw. 17(7), 425–430 (2014)
16. Crist, R.: Hello Harmony: RealDoll sex robots with X-Mode ship in September. CNet (2018). Retrieved
17. Crist, R.: Dawn of the sexbots. CNet (2017). Retrieved abyss-creations-ai-sex-robots-headed-to-your-bed-and-heart/
18. Crittenden, P.M.: Gifts from Mary Ainsworth and John Bowlby. Clin. Child Psychol. Psychiatry 22(3), 436–442 (2017)
19. Čapek, K.: R.U.R. (1921). In: Landes, W.A. (ed.). Players Press, Studio City, CA (2002)
20. Danaher, J.: The philosophical case for robot friendship. J. Posthuman Stud. 3(1), 5–24 (2019)
21. Danaher, J., McArthur, N. (eds.): Robot Sex: Social and Ethical Implications. MIT Press (2017)



22. Darling, K.: Extending legal protection to social robots: The effects of anthropomorphism, empathy, and violent behavior towards robotic objects (2012)
23. Dautenhahn, K., Woods, S., Kaouri, C., Walters, M.L., Koay, K.L., Werry, I.: What is a robot companion—friend, assistant or butler? In: 2005 IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1192–1197. IEEE (2005)
24. Davecat: The Life Synthetik [keynote]. In: The Proceedings of the 6th International Congress of Love and Sex With Robots (2021) [online]
25. Dennett, D.C.: The Intentional Stance. MIT Press (1989)
26. Derrida, J.: The animal that therefore I am (more to follow) (trans. Wills, D.). Crit. Inquiry 28(2), 369–418 (2002)
27. Devlin, K.: Turned On: Science, Sex and Robots. Bloomsbury Publishing (2018)
28. Devlin, K.: The ethics of the artificial lover. Ethics Artif. Intell. 271–290 (2019)
29. Dziergwa, M., Kaczmarek, M., Kaczmarek, P., Kędzierski, J., Wadas-Szydłowska, K.: Long-term cohabitation with a social robot: a case study of the influence of human attachment patterns. Int. J. Soc. Robot. 10(1), 163–176 (2018)
30. Döring, N., Mohseni, M.R., Walter, R.: Design, use, and effects of sex dolls and sex robots: scoping review. J. Med. Internet Res. 22(7), e18551 (2020)
31. Fitzsimons, G.M., Bargh, J.A.: Thinking of you: Nonconscious pursuit of interpersonal goals associated with relationship partners. J. Pers. Soc. Psychol. 84(1), 148 (2003)
32. Gable, S.L., Reis, H.T.: Appetitive and aversive social interaction. In: Close Romantic Relationships, pp. 177–202. Psychology Press (2001)
33. Geist, R.L., Gilbert, D.G.: Correlates of expressed and felt emotion during marital conflict: Satisfaction, personality, process, and outcome. Pers. Individ. Differ. 21(1), 49–60 (1996)
34. Grisaffe, D.B., Nguyen, H.P.: Antecedents of emotional attachment to brands. J. Bus. Res. 64(10), 1052–1059 (2011)
35. Guerrero, L.K., Andersen, P.A.: Emotion in close relationships (2000)
36. Ickes, W., Dugosh, J.W., Simpson, J.A., Wilson, C.L.: Suspicious minds: The motive to acquire relationship-threatening information. Pers. Relat. 10(2), 131–148 (2003). https://doi.org/10.1111/1475-6811.00042
37. Jones, R.: What makes a robot social? Soc. Stud. Sci. 47(4), 556–579. Sage Journals
38. Kahn, P.H., Reichert, A.L., Gary, H.E., Kanda, T., Ishiguro, H., Shen, S., Ruckert, J.H., Gill, B.: The new ontological category hypothesis in human–robot interaction. In: 2011 6th ACM/IEEE International Conference on Human-Robot Interaction (HRI), pp. 159–160. IEEE (2011)
39. Kragen, P.: World's first talking sex robot is ready for her close-up. San Diego Tribune (2017). Retrieved sd-me-harmony-doll-20170913-story.html
40. Laurenceau, J.P., Barrett, L.F., Pietromonaco, P.R.: Intimacy as a process: The importance of self-disclosure and responsiveness in interpersonal exchanges. J. Pers. Soc. Psychol. 74(5), 1238–1251 (1998)
41. Laurenceau, J.P., Barrett, L.F., Rovine, M.J.: The interpersonal process model of intimacy in marriage: A daily-diary and multilevel modeling approach. J. Fam. Psychol. 19(2), 314–323 (2005)
42. Levy, D.: Love and Sex with Robots: The Evolution of Human–Robot Relationships. Harper, New York (2007)
43. McAdams, D.P.: The intimacy motive. In: Smith, C.P., Atkinson, J.W., McClelland, D.C., Veroff, J. (eds.) Motivation and Personality: Handbook of Thematic Content Analysis, pp. 224–228. Cambridge University Press (1992)
44. McCullers, C.: The Ballad of the Sad Café and Other Stories. Houghton-Mifflin, Boston (1951)
45. Moran, A., O'Brien, S.: Love Objects: Emotion, Design, and Material Culture
46. Mugge, R., Schoormans, J.P., Schifferstein, H.N.: Product attachment: Design strategies to stimulate the emotional bonding to products. In: Product Experience, pp. 425–440. Elsevier (2008)



47. Pearson, I.: Robot sex [poll] (2017). Retrieved: https://d25d2506sfb94s.cloudf ovNY%20(Robot%20Sex)%20203%2009.27.2017.pdf
48. Pittman, J.F., Keiley, M.K., Kerpelman, J.L., Vaughn, B.E.: Attachment, identity, and intimacy: parallels between Bowlby's and Erikson's paradigms. J. Fam. Theory Rev. 3(1), 32–46 (2011)
49. Realbotix: RealDoll x: Robot SEX Dolls: Silicone DOLLS adults: Love Dolls Sex. RealDoll (2021). Retrieved 14 Sept 2021, from
50. Reeves, B., Nass, C.: The Media Equation: How People Treat Computers, Television, and New Media Like Real People. Cambridge University Press, Cambridge, United Kingdom (1996)
51. Richardson, K.: Sex robot matters: slavery, the prostituted, and the rights of machines. IEEE Technol. Soc. Mag. 35(2), 46–53 (2016)
52. Rousi, R.: Lying cheating robots—robots and infidelity. In: International Conference on Love and Sex with Robots, pp. 51–64. Springer, Cham (2017)
53. Ryan, R.M., Deci, E.L.: Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemp. Educ. Psychol. 25(1), 54–67 (2000). https://doi.org/10.1006/ceps.1999.1020
54. Sartre, J.P.: Being and Nothingness: An Essay in Phenomenological Ontology. Citadel Press (2001)
55. Schneider, J.P., Weiss, R., Samenow, C.: Is it really cheating? Understanding the emotional reactions and clinical treatment of spouses and partners affected by cybersex infidelity. Sex. Addict. Compuls. 19(1–2), 123–139 (2012)
56. Severson, R.L., Carlson, S.M.: Behaving as or behaving as if? Children's conceptions of personified robots and the emergence of a new ontological category. Neural Netw. 23(8–9), 1099–1103 (2010)
58. Shaver, P.R., Hazan, C.: A biased overview of the study of love. J. Soc. Pers. Relat. 5(4), 473–501 (1988)
59. Simpson, J.A., Oriña, M.M., Ickes, W.: When accuracy hurts, and when it helps: A test of the empathic accuracy model in marital interactions. J. Pers. Soc. Psychol. 85(5), 881 (2003)
60. Singer, P.W.: Wired for War: The Robotics Revolution and Conflict in the 21st Century. Penguin, New York, NY (2009)
61. Sone, Y.: Hatsune Miku, virtual machine-woman. In: Japanese Robot Culture, pp. 139–166. Palgrave Macmillan, New York (2017)
62. Sullins, J.P.: Robots, love, and sex: the ethics of building a love machine. IEEE Trans. Affect. Comput. 3(4), 398–409 (2012)
63. Trout, C.: RealDoll's first sex robot took me to the Uncanny Valley. Engadget (2017). Retrieved
64. Turing, A.M.: Computing machinery and intelligence. In: Parsing the Turing Test, pp. 23–65. Springer, Dordrecht (2009)
65. Valverde, S.H.: The Modern Sex Doll-Owner: A Descriptive Analysis. California Polytechnic State University, San Luis Obispo [doctoral thesis] (2012)
66. Whitty, M.T.: The realness of cybercheating: Men's and women's representations of unfaithful Internet relationships. Soc. Sci. Comput. Rev. 23(1), 57–67 (2005)
67. Retrieved 2020-both-men-and-women-are-more-likely-consider-h

Chapter 6

Are Dating Apps and Sex Robots Feminist Technologies? A Critical Posthumanist Alternative

Janina Loh

Abstract In the following, I first define what a feminist technology is, referring to the approach of Linda L. Layne, Sharra L. Vostral, and Kate Boyer. In a second step, the two examples of dating apps and sex robots—with a specific focus on embodiment and the datafied self—will be examined from different classical feminist perspectives. Finally, a critical posthumanist analysis and critique will provide an answer to the initial question of whether dating apps and sex robots are feminist technologies or if the question itself might cover exclusionary (and thus antifeminist) traits.

6.1 Introduction

Mostly unnoticed, algorithms and the data they accumulate influence our everyday lives to a significant extent. From our e-mail accounts, where they sort out spam, in search engines that filter results, in fitness apps that give us tips for a healthy daily routine and store a rudimentary version of our datafied self, to the partial automation of the global financial market system [7]—very few people are aware of the immense range of potential and actual realities of algorithms, and even fewer of the enormous amount of data collected by digital technologies. This is because algorithms often operate seemingly invisibly in the background. They cannot be seen, smelled, or felt. Apart from their, indeed, sometimes very noticeable consequences, they can only take shape for us in a physically embodied way. Although they, of course, always need some material basis (provided, e.g., by a computer or a smartphone), we usually mean something else when we think of real embodiment. That is why many people probably think of robots as the algorithm-controlled machines that will, in some hoped-for or feared sci-fi future, have seriously changed the human way of being. Embodiment thus plays a role when it comes to the perception of digital technologies, Big Data, and algorithms in our everyday lives.

J. Loh (B) Weingartshoferstr. 23a, 88214 Ravensburg, Germany
e-mail: [email protected]
Stiftung Liebenau, Meckenbeuren, Germany

© Springer Nature Switzerland AG 2023 J. Vallverdú (ed.), Gender in AI and Robotics, Intelligent Systems Reference Library 235,

Indeed, even a first cursory




glance at this topic allows us to conclude that data-accumulating algorithms and robots have a lot in common—at least insofar as robots would not really be conceivable without the interaction of a set of algorithms. However, the reverse is not true. In many areas of human life, as stated above, algorithms do not act embodied and only become visible in the effects they have on people. A robot is an electro-mechanical machine that (a) has a body, (b) has at least one processor, (c) has sensors that collect information about the world, and (d) has effectors that translate signals into mechanical processes. The behavior of a robot (e) is or at least appears autonomous, and it can (f) interact with or physically influence its environment. According to the definition proposed here, it is therefore not a robot in the true sense of the word if one of these conditions is not met: thus, computers and smartphones do not fulfill the condition of physically exerting influence, although they are of course controlled by algorithms. In a figurative sense, a computer is rather the "brain" of a robot, just as algorithms, metaphorically speaking, represent its "mental behavior patterns" and "learned processes", but not the actual robot itself (cf. [18]: 16–18, [21]: 43). Embodiment thus seems to represent the main or at least a significant difference between algorithms and Big Data on the one hand and robots on the other, and between the ethical challenges that accompany them. When it comes to sex and partnership, this seems obvious. However, especially with regard to this field, it also quickly becomes apparent that problematic gender stereotypes, sometimes consciously and sometimes unconsciously perpetuated by users, often break through via embodiment. But are non-embodied algorithms really free from classical attributions of gender?
In the following, I would like to answer the question of whether dating apps such as Tinder and OK Cupid, as examples of a non-embodied digital technology, and sex robots such as Roxxxy and Harmony, as examples of an embodied one, are feminist technologies. In their anthology Feminist Technology [1], Linda L. Layne, Sharra L. Vostral, and Kate Boyer propose a "working definition of feminist technology" [16: 3] that I will refer to. Feminist technologies are "those tools plus knowledge that enhance women's ability to develop, expand, and express their capacities" [16: 3]. A "'radically or truly' feminist technology adopt[s] a holistic approach to women's lives and make changes that radically restructure arrangements in ways that will benefit women and substantially shift the balance of power between women and men" [16: 14]. I will then examine dating apps and sex robots from the perspective of classical feminist movements, such as radical, liberal, difference, and queer feminism—with a specific focus on embodiment and the datafied self (and with that also the usage of Big Data). Finally, I will argue for a critical posthumanist or inclusive ethical view on dating apps and sex robots (referring to Donna Haraway and Karen Barad) that questions the essentialist feminist schools of thought in general and develops arguments of the constructivist feminist movements.

6 Are Dating Apps and Sex Robots Feminist Technologies? A Critical …


6.2 What is a Feminist Technology?

In the following (cf. [18]: 3–5), I will outline the understanding of a feminist technology provided by Linda L. Layne, Sharra L. Vostral, and Kate Boyer in the anthology Feminist Technology [16]. Here, they bring together a range of approaches that ask what feminist technologies are and discuss different feminist perspectives using concrete technologies such as the birth control pill and the tampon as examples. In her introduction to the anthology, Layne proposes a “working definition of feminist technology” [16: 3]. This refers to “those tools plus knowledge that enhance women’s ability to develop, expand, and express their capacities” [16: 3]. In fact, Layne does not see technological artefacts in isolation, but together with a specific knowledge that integrates them into their social, political, and economic contexts. Against this background, she identifies three categories of feminist technologies, depending on how profoundly they change the lives of women. According to Layne, a feminist technology in the “minimal” sense is one “that improve[s] things for women some degree from the status quo”. A “moderate feminist technology”, on the other hand, makes a “substantial improvement for women over the status quo” possible. A “‘radically or truly’ feminist technology”, finally, “adopt[s] a holistic approach to women’s lives and make changes that radically restructure arrangements in ways that will benefit women and substantially shift the balance of power between women and men” [16: 14]. This gradual differentiation between three types of feminist technologies also makes it clear that, according to Layne, these are not merely artefacts to be judged for themselves, but are to be interpreted in terms of their effects on the lives of women and their position in society.
On the pages that follow, Layne uses several examples to show that by no means everything that might appear at first glance to be a feminist technology deserves to be identified as such. Nor are artefacts feminist technologies simply because the companies that distribute them call them “liberatory” for women (such as cigarettes; Layne [16]: 4). Even “feminizing” an object, i.e. designing a technology according to criteria that are usually considered ‘typically feminine’, such as certain colors, shapes, smells, or sounds, does not turn an artefact into a feminist technology (e.g., a pink cell phone; [16]: 4). Finally, even “feminine” technologies, which can be called so because they help women to “control” their “distinctly female reproductive systems”, are not automatically feminist technologies (e.g. the birth control pill, tampon, pregnancy test, etc.; [16]: 3). For all these technologies do not lead to a significant improvement in the lives of women in patriarchally organized societies, but at most enable them to live more appropriately within the given social structures. So, these are only—if at all—minimal or moderate, but not radical or genuine feminist technologies. More interesting, on the other hand, are technologies that provide women access, for instance, to certain professions that had previously been reserved primarily for men—because the artefacts involved were tailored to the average male body, but have now been adapted to the dimensions of the average female body (e.g. the fighter pilot cockpit; [16]: 5). Nor do technologies have to be constructed by feminists to


J. Loh

be considered feminist, as shown by the example of Earle Cleveland Haas, who invented the tampon with sexist motivation. He “just got tired” of the “poor women” being dependent on these “damned old rags” that they were forced to put in their underwear at the time of their period, and he felt called to take care of the matter [16: 12]. If technologies bring about a fundamental change in the lives of women and profoundly transform the balance of power in a society, as can be assumed in the case of access to certain professions, it is reasonable to interpret them as feminist technologies that contribute to the political, ethical, social, economic, religious, etc. equality of all people. Irrespective of the extent to which someone may be willing to see the tampon as a feminist technology, this example also shows that a feminist influence on patriarchal and heteronormative social structures can happen unintentionally, on the basis of perhaps even contrary (namely sexist or otherwise discriminatory) intentions. However, the tampon and the associated question of whether it actually has to be interpreted as a feminist technology, i.e. whether and to what extent it fundamentally changes the lives of women in a patriarchal society and radically shakes the patriarchal power structures constituting this society, make clear what has already been pointed out in the introduction: because there are numerous different feminist schools and approaches, it is not surprising that a technological artefact can appear to some as a feminist technology, while others classify it as minimally feminist, moderately feminist, or even antifeminist. Aengst and Layne discuss in their chapter “The Need to Bleed?
A Feminist Technology Assessment of Menstrual-Suppressing Birth Control Pills” [1], the example of the three-month pill Seasonale, which liberal feminists would probably interpret as a feminist technology because it gives women the opportunity to control their cycle more self-determinedly [1: 70]. Radical feminists, on the other hand, would probably see Seasonale as antifeminist, since it (like any other birth control pill) merely adapts women to the given patriarchal and heteronormative social structures. A radical feminist approach, according to Aengst and Layne, would not support technologies that “suppress menstruation” but rather, as in the case of the fighter pilot cockpit, would aim to transform “workplaces, schedules, and expectations”, and the technological artifacts involved, “to accommodate women’s cyclically changing capacities and predilections” [1: 70]. Accordingly, Layne concludes her considerations by saying that a “technology may appear feminist in light of one type of feminism and antifeminist through a different feminist lens” [16: 18]. In the following two sections, I will look at the two examples of the non-embodied dating apps Tinder and OK Cupid and embodied sex robots such as Roxxxy and Harmony through different feminist lenses to explore the question of whether these technologies are feminist or antifeminist.

6.3 Dating Apps

It doesn’t take much to realize that the algorithms implemented in dating apps such as Tinder and OK Cupid are anything but neutral in terms of gender stereotypical
categories. That is because algorithms filter and limit our search via dating apps according to company-specific criteria. For instance, Tinder allows searches either exclusively for women, exclusively for men, or for both, and is thus standardized around two genders. OK Cupid, on the other hand, allows 22 gender options (including woman, man, androgynous, transgender, genderqueer, genderfluid, etc.) as well as 13 different sexual orientation options and various relationship types. So, anyone looking around with the help of the Tinder dating app is moving within the familiar binary gender paradigm. OK Cupid takes a more differentiated approach—although, of course, there too (as with every other dating app) the criteria that define one’s own profile as well as the search itself are ultimately finite (cf. [29]: 12). We see, then, that while these and other dating apps, and the algorithms at work in them, are not themselves embodied in the way defined above, they undoubtedly influence our thinking about representation and embodiment quite enormously. Ultimately, whoever uses a dating app must be able to feel ‘included’ with their personal gender identity and sexual preferences. This resembles the situation of people who see themselves as women and, in German-speaking countries, where the generic masculine is still the grammatical and social norm, need to understand themselves as included when, for instance, doctors or professors are mentioned. Comparably, users of dating apps can consider themselves lucky if they fall into one of the predefined categories and have to accept it if they do not; they simply have to fit into some of the given categories. Of course, the predetermined categories also shape how we perceive gender in the first place. The world of people who, for instance, use Tinder exclusively is limited to two genders.
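The structural point made here, namely that a company-defined, finite list of categories determines who can be represented at all, can be made concrete in a few lines of code. The following Python sketch is purely illustrative: the category sets and the `register` function are hypothetical constructions of mine, not actual app code.

```python
from dataclasses import dataclass

# Hypothetical, simplified category schemas (not actual app code).
TINDER_LIKE_GENDERS = {"woman", "man"}  # a binary, company-defined schema
OKCUPID_LIKE_GENDERS = {"woman", "man", "androgynous", "transgender",
                        "genderqueer", "genderfluid"}  # larger, yet still a closed list

@dataclass
class Profile:
    name: str
    gender: str

def register(profile: Profile, allowed_genders: set) -> bool:
    """A profile can only be created if its gender matches a predefined category."""
    return profile.gender in allowed_genders

sam = Profile("Sam", "agender")
print(register(sam, TINDER_LIKE_GENDERS))   # False: invisible in a binary schema
print(register(sam, OKCUPID_LIKE_GENDERS))  # False: any finite list still excludes someone
```

Whatever categories the company hard-codes, the membership test `profile.gender in allowed_genders` guarantees that anyone outside the list simply cannot appear; this is the exclusionary mechanism described in the text.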
Carolin Emcke summarizes this problem of “being included” in her book Wie wir begehren [11: 21; my translation]: Norms as norms only strike us when we do not conform to them, when we do not fit in, whether we want to or not. Those who are white consider the category of skin color irrelevant because in the life of a white person in the Western world, skin color is irrelevant. One who is heterosexual considers the category of sexual orientation irrelevant because one’s sexual orientation may be irrelevant in the life of a heterosexual. For those who have a body in which he or she recognizes himself or herself, the category of gender seems self-evident because that body is never questioned.

Accordingly, cis women and cis men who have so far been either heterosexual, homosexual, or bisexual do not even notice on Tinder that the stereotyping of gender, embodiment, and sexual orientation implicitly carried out by the company normalizes and defines a specific view of the world on the one hand, and on the other hand simply hides or makes impossible the visibility and recognition of those who do not fall into the aforementioned categories. This is simply because the company sets a certain standard through its search criteria, which is perpetuated and normalized as common sense in society via the use of the dating app in question. Hence, the impression that non-embodied algorithms do not raise questions of representation and social embodiment is deceptive. This is because the moral, political, and economic values of those who program these algorithms, implicitly or explicitly, knowingly or unknowingly, allow ideas about, for instance, gender, supposed standards, and their deviations to enter the algorithmic code and to be passed on to the users through it.

The datafied self is also much more minimalist on Tinder than on OK Cupid, as a profile on Tinder is limited to six photos, 500 characters, the specification of a favorite song, and the possibility of linking other social media profiles such as Instagram. On OK Cupid there is no character limit, nine instead of six photos may be uploaded and additional photos may be posted within the profile text, and the questions and categories according to which a profile is structured (e.g. hobbies, favorite travel spots, books, and others) are significantly more extensive than on Tinder. Nonetheless, what is considered essential about a person in a dating context is in both cases determined and defined in advance by the company, thus normalizing a certain image of ourselves and stereotyping what we conceive of as a person on the basis of certain categories. Against this background, the feminist schools would certainly give different answers as to whether dating apps such as Tinder and OK Cupid can be understood as feminist technologies. Difference feminism, also called cultural or essentialist feminism, for instance, assumes biological differences between women and men ([1]: 71, [18]: 7). The characteristics usually interpreted as typically female—such as menstruating and having children, but also role patterns associated with biological attributes, such as women being more sensitive and caring—are evaluated positively. However, the social implications of the differences between the sexes are not assumed to be biological. If the characteristics usually interpreted as typically female, such as the birth and upbringing of children, were more socially recognized, we would, for instance, organize the workplace and work environment differently in order to allow women (and men) to devote more time to children [30: 3 f].
Technologies that support women in living a ‘typically female life’ (with regard to their biological characteristics and associated role patterns) are thus to be interpreted as feminist technologies. Against this background, dating apps such as Tinder and OK Cupid would probably not be assessed as feminist technologies by difference feminism. It is true that they (esp. Tinder) define clear differences between the sexes. However, these tend to be reduced to very specific characteristics (e.g., appearance and sexual preferences) in the form of users’ datafied selves, which are then evaluated negatively or positively by the app’s users according to social norms (e.g., as permissive, slutty, uptight, or the like). In this way, dating apps perpetuate and reinforce patriarchal structures in which what difference feminism understands as typically feminine is negatively evaluated and tends to be objectified. Liberal feminism (in contrast to difference feminism) does not accept biological differences between the sexes [18: 7]. Rather, liberal feminists think that these differences lie in the different socialization of women and men. There exists a universal human nature that encompasses autonomy, equality, self-realization, and equal rights for all human beings. All those technologies that expand women’s opportunities for self-realization, freedom of choice, and autonomy are viewed positively [1: 70]. Dating apps such as Tinder and OK Cupid would likely be viewed more positively, as feminist technologies, by liberal feminism, as they give women more freedom to meet potential partners and more control over whom they do and do not want to meet. Radical feminism clearly takes a negative view of those technologies that adapt women to the prevailing patriarchal structures (such as the birth control pill, egg
freezing, artificial insemination, and surrogacy; [1]: 70, [18]: 8). Lisbeth N. Trallori, in Der Körper als Ware [32], argues in a radically feminist way that modern reproductive technologies guarantee the ultimate “control over human production” (my translation; [32]: 162) and are thus the product of modern capitalist, patriarchal society. The body is instrumentalized as a “resource” and regarded as a “defective (breeding) machine to be repaired” (my translation; [32]: 162). Dating apps are not primarily about reducing women to their reproductive function. However, dating apps and the associated minimalist datafied self (esp. with regard to Tinder) tailor the image of women to certain characteristics that are positively or negatively valued in patriarchal societies. Similar to difference feminism, radical feminism would presumably also come to a negative assessment of dating apps (not as feminist technologies), since they do not enable people to break free from the given patriarchal structures, but rather fit people into them. After these three classical feminist schools of thought (difference, liberal, and radical feminism), a fourth and last traditional feminist movement will now be examined, namely queerfeminism. From the perspective of queerfeminists, dating apps such as Tinder and OK Cupid could only be considered feminist technologies if users were allowed to choose their gender completely freely, not according to categories imposed by the company, and if they could also define new gender categories as needed. Judith Butler questions all essentialist differences between women and men ([5], cf. [18]: 8). According to Butler, both biological gender (sex) and social gender (gender) are social constructs. Differences based on the assumption of a biological gender, such as the average size and physical strength of women and men, for instance, cannot be understood as such without the influence of diet, division of labor, and physical training [30: 3].
Gender is generally rejected as a classification category by queerfeminists. There are as many identities as there are people. Technologies that support and confirm gender binarity and the sex/gender differentiation are to be interpreted negatively. Existing dating apps (esp. Tinder) are based on a biological understanding of women and men. They adapt women to existing patriarchal social conditions and therefore do not represent feminist technologies from a queerfeminist point of view. The concise discussion of the four classical feminist schools of difference, liberal, radical, and queerfeminism has shown that both essentialist movements such as difference feminism, which makes clear distinctions between the sexes, and constructivist feminisms such as radical and queerfeminism, which deny such distinctions, arrive at a negative assessment of dating apps such as Tinder and OK Cupid. The ways of embodiment defined by the companies in the form of gender stereotypes, as well as the reduction of the datafied self to a specific set of data and information about appearance and sexual preferences, lead to an adaptation to patriarchal structures and not to a break with them.

6.4 Sex Robots

In this section (cf. [18]: 5–6), I will consider issues of embodiment and the datafied self with respect to sex robots, and then discuss whether these robots could be interpreted as feminist technologies from the perspective of classical feminist movements. Several international companies already produce and distribute these particular social robots. The sex robot Roxxxy (TrueCompanion, roboticist Douglas Hines), for example, is advertised with interactive skills such as being able to “hear what you say, speak, feel your touch, move their bodies, are mobile and have emotions and a personality”. Although Roxxxy is supposed to be able to develop her own personality (or as many different roles as desired) through interaction with her users, she can also be given one of five pre-programmed characters of one’s choice: Wild Wendy, S&M Susan, Mature Martha, Frigid Farah, and Young Yoko. In addition to the abovementioned abilities, Roxxxy is also able to “listen, talk, carry on a conversation and feel your touch as well as move her private areas inside when she is being ‘utilized’”, and even “have an orgasm” [33]. Her current price is just under $10,000. Officially, there is a male version, Rocky, about which, however, almost no information is provided on the TrueCompanion homepage.1 RealDoll’s (Abyss Creation, CEO and designer Matt McMullen) sex robot Harmony is also described as “the perfect companion” that can be adapted to the individual needs of her users in terms of appearance and personality. Even more than Roxxxy, Harmony “is versatile with conversational topics and designed to hold long-term persistent conversations with users, and learns over time” [28]. For Harmony, too, one can choose between ten pre-programmed characters; she costs more than $5,000, and there may also be male as well as transgender versions of this sex robot [8: 7].
There are other interesting sex robots that I would at least like to mention briefly here, such as the sex robot Samantha (developed by the Barcelona-based engineer Sergi Santos), which has similar features to Roxxxy and Harmony and was recently given a “moral code” to say “no” [22]. Finally, LumiDoll’s (based in Dublin) sex robot Kylie, also known as Cow Kylie due to her oversized breasts, should be mentioned [23].

1 David [17] discusses whether Roxxxy actually exists or whether TrueCompanion is in fact a dummy company. In December 2022, the homepage is no longer available.

I will get to the obvious gender stereotypes embodied in the existing sex robots by companies like TrueCompanion, Abyss Creation, and LumiDoll below. The datafied self of users can also be called minimalist at best. For while individual personalization of a respective sex robot is possible, it is clearly limited to the user’s preferences regarding attractiveness and sexual expression. No further information about the user defining their datafied self seems to be relevant in the personalization of the sex robot. The development of sex robots is indeed accompanied by numerous ethical questions that are not limited to feminist discourse, although that discourse is sometimes dominated by a few prominent feminist thinkers. The robot ethicist Kathleen Richardson, for instance, launched the Campaign Against Sex Robots in 2015 and represents a radical
feminist argument against sex robots in general, understanding them “as part of a larger culture of exploitation and objectification that reinforces rape culture and normalizes the sex trade” [24]. In robots such as Roxxxy and Harmony, vividly illustrated in Roxxxy’s five personality modes, highly questionable gender stereotypes are indeed perpetuated, and heteronormative, patriarchal, instrumentalizing, and discriminatory power structures are reinforced. Richardson’s argumentation could, of course, be developed further: Samantha’s so-called moral code is not a genuine moral code but, on the contrary, could even invite potential users to disregard her “no”, i.e. to affirm rape as an ordinary expression of lived sexuality (cf. [25]). For these reasons, radical feminists presumably would not be willing to interpret existing sex robots (according to Richardson, sex robots in general) as feminist but rather as antifeminist technologies. Richardson’s position is opposed by some feminist thinkers (although they certainly agree with rejecting the discriminatory gender stereotypes mentioned above in the construction of existing sex robots), such as Vanessa de Largie. The Australian actress and sex columnist argues in a liberal feminist way that sex robots give women new opportunities to free themselves from existing patriarchal and objectifying power structures. De Largie speaks from painful experience; after being raped, she created the show Every Orgasm I Have Is A Show Of Defiance To My Rapist to process her experiences. She prefers it “when a person lives out his [sic!] rape fantasy with a sex bot and not with a human” [9]. A similar argument is made with regard to pedophilia: perhaps child sex robots could be used as therapy assistance systems, just as sex robots in general could support human therapists in their work with, for instance, trauma patients.
It is conceivable that people with certain physical limitations would only be able to satisfy their sexual needs with sex robots, just as misanthropic people might even find a pleasant and satisfying form of counterpart in sex robots [4, 10, 20, 31]. Presumably, then, liberal feminists would interpret sex robots as feminist technologies. The contrasting perspectives on sex robots in general, and thus on existing sex robots in particular, adopted by radical and liberal feminism illustrate the schematic thinking exhibited by many traditional feminisms. In a different vein, queerfeminists would presumably not interpret sex robots per se positively or negatively, but rather focus their critique specifically on existing sex robots that unquestionably support heteronormative notions of femininity and masculinity. Queerfeminists thus already argue in a similar way to the critical posthumanist feminists whom I will discuss in the next section and who explicitly refer to Butler and other queerfeminists. Queerfeminists and also the xenofeminism of the Laboria Cuboniks collective argue for a radical pluralization of gender embodiment, which is pointedly expressed in their call to “[l]et a hundred sexes bloom!” (Laboria [15]: 0x0E). Accordingly, we would need much greater diversity and heterogeneity in sex robotics, and also sex robots beyond gender, i.e. robots that are not designed according to any gender characteristics at all, in order to be able to assess sex robots as feminist technologies from a queer- and xenofeminist perspective.

6.5 Critical Posthumanism

Before I discuss the critical posthumanist alternative to classical feminist perspectives on technologies such as dating apps and sex robots, let me first briefly summarize what I mean by critical posthumanism in the first place. Critical posthumanism [14, 18: 9–10] is not primarily a feminist movement. Its primary goal is not to criticize patriarchal and heteronormative structures, but our humanistic view of the human being and the world. Critical posthumanism questions the traditional, mostly humanistic dichotomies such as woman/man, nature/culture, or subject/object, which have decisively contributed to the emergence of our present view of the human being and the world. Critical posthumanists such as Donna Haraway, Karen Barad, Rosi Braidotti, and Cary Wolfe want to overcome ‘the’ human being by breaking with conventional categories and the thinking that goes along with them. Thus, critical posthumanism arrives at a philosophical position behind or beyond (“post”) a specific understanding of ‘the’ human being that is essential to the present. Due to the fundamental upheavals associated with a radical questioning of humanism, critical posthumanists subject to a total revision human society and political structures, knowledge cultures and their claim to the definition of facts and knowledge, the consequences of Western (and therefore usually white and male) capitalism and the development of mass society, and ultimately also the cosmos as a whole [6, 12, 26]. Five core elements characterize critical posthumanist thinking: (1) a struggle with humanism, (2) an overcoming of anthropocentrism, (3) a questioning of essentialism and (philosophical) anthropology, (4) a critique of the knowledge cultures, as well as (5) a clear appeal character and socio-political implications [19].
Against the background of these characteristics, it becomes apparent that critical posthumanism overlaps with feminist endeavours: critical posthumanism takes into account the differentiation between body and “Leib” (the lived body), as well as between incorporation and embeddedness, which is still ignored by humanism. Critical posthumanists strive for a proper recollection of matter and a rediscovery of the body, thus also seeking to open up the possibilities of new conceptions of agency. These approaches are also discussed in feminist discourses such as new materialist feminism [2, 27]. Whether dating apps such as Tinder and OK Cupid and sex robots such as Roxxxy and Harmony can be interpreted as feminist technologies from a critical posthumanist perspective is not easy to say, which is due to the already indicated changed view of the agent in critical posthumanist approaches. One could also say that critical posthumanists are more concerned with the relation from which they believe agents emerge in the first place. Or, as Karen Barad and Donna Haraway put it: “relations do not follow relata, but the other way around” [3: 136–137]; “[b]eings do not preexist their relatings” [13: 6]; “‘[t]he relation’ is the smallest possible unit of analysis” [13: 20]. Whether dating apps and sex robots are feminist technologies depends on the relation from which we as users, as well as the technologies in question themselves, emerge. Although it seems to us that both we and the technologies exist prior to any relation we enter into, this is not the case. However, according to critical
posthumanists, this is due to the fact that we are always already in relations and, to superficial observation, always realize ourselves in a similar way via relations. In this way the impression of spatio-temporal stability is created, which enables us to act, to be recognizable, and to have a stable reference. Nevertheless, to put it pointedly, it is the relations that make us who we are or who we recognize each other as. So, let us take a quick look at the relations, first, between users and dating apps and, second, between users and sex robots. As already explained in the third section, the user of dating apps such as Tinder and OK Cupid creates a more or less restricted datafied self—in fact, an even more minimalistic one with Tinder than with OK Cupid, due to the restriction to 500 characters, a handful of photos, the option to mention a favorite song, and the possibility of linking to social media profiles. Now, the fact that the (datafied) self in relations with dating apps is (strongly) reduced to and focused on, for instance, attributes of attractiveness and sexual preferences does not automatically have to be evaluated negatively. After all, we engage in such relations, in which very specific facets of our self are focused on, in many situations of our everyday life. The relationship between, for example, a piano player and a piano is, at best, one that awakens and fosters in the piano player a love of (piano) music (whether with or without singing or playing with other instruments), supports their ability to play the piano, and perhaps allows expression of character traits and emotions. Less common, though perhaps not excluded per se, is the expression of other characteristics specific to the individual in question. The fact that the dating app, as at least one other agent within this relation, also presents a very specific self that is designed by the company to be used in a number of specific ways does not have to be evaluated negatively per se either.
Even a piano is designed to be played on—yet we can also use it as a piece of furniture and a place to put smaller objects such as cups or flowerpots. Similarly, it is perhaps unusual, but nonetheless possible, to use Tinder to search for people with whom we can walk our dogs or play cards. So, the fact that a (datafied) self limited to specific characteristics emerges for all those participating in this relation does not provide an argument against seeing dating apps as feminist technologies. More problematic, on the other hand, is the legacy of previous relations that is expressed in the respective agents, which we are forced to adopt or which we at least may find very difficult to change. In the case of dating apps, for instance, these are certain characteristics set by the companies in the apps that we cannot change. The two-gendered nature of Tinder is a good example of this. Tinder users cannot circumvent it and can only reflect it with difficulty—for example, by activating the setting that we are looking for “all” people and not just for women or men (although looking for “all” here obviously means that we want to see profiles of “both” genders), or by using the 500 characters of our profile to declare our own gender—while we are still forced to categorize ourselves into the binary gender system when we sign up for Tinder. With regard to the piano, such a legacy is visible, for example, in the keys, which at least in the past were often made of ivory, and in the use of special woods for the main body of the piano. From the point of view of critical posthumanism, such a legacy from earlier relations is problematic if it has the consequence that later relations that build on
it tend to be exclusionary because of said legacy—that is, if they exclude certain agents, discriminate against them, or oppress them. With regard to Tinder’s given two-gendered nature, this seems quite obviously to be the case. We can now formulate the answer of critical posthumanists to the question of whether dating apps are feminist technologies or not: what is antifeminist (in the sense of exclusionary) is not the dating apps themselves. Tinder cannot be interpreted as a feminist or antifeminist technology. Arguably, however, certain features that have been created by previous relations in the dating apps—such as the two-gendered nature of Tinder, which is the result of a relation with the company—can be said to be feminist or antifeminist insofar as they reinforce inclusionary or exclusionary tendencies in later relations. Now we come to the sex robots, where the considerations can be somewhat shorter, since some aspects should already have become clear with regard to the dating apps. The user’s self, which is formed in the relationship with their sex robot, is theoretically not subject to any restriction as in the case of the dating apps. Although sex robots are created primarily for the purpose their label suggests, the companies advertise precisely that users obtain artificial partners with them. What is important to the owner of a sex robot in a partnership, the aspects of their character and personality that the person in question wants to bring into a relationship, or the facets of the self that can emerge through the artificial counterpart in the relationship, are individually specific and not necessarily limited to the sexual. The self of the respective sex robot, on the other hand, is similarly limited as the self of a dating app—at least with regard to the external and character design options that are specified by the company.
Theoretically, however, it is conceivable that users will find an interesting conversation partner in their specific sex robot, and that a more multifaceted artificial self will thus emerge in the relationship with it. However, in the case of sex robots too, comparably and perhaps even more problematically than with regard to the dating app Tinder, the legacy of previous relationships in the form of antifeminist, exclusionary (structurally discriminating, sexist, oppressive) features is to be criticized. In the case of Roxxxy, this particularly concerns its pre-programmed personalities; in the case of all currently existing sex robots, it concerns their extremely limited image of attractive femaleness. Against this background, the critical posthumanist judgment turns out to be similar to that regarding dating apps: it is not the sex robots themselves that are feminist or antifeminist technologies; rather, some features that have been created, implemented, or defined in them in earlier relations with the companies are to be interpreted as feminist or antifeminist, depending on whether they reinforce inclusionary or exclusionary tendencies in later relations.

6.6 Conclusion

The final question to be answered is what advantages the critical posthumanist alternative has over the approaches of traditional feminist schools of thought. Might we not object that the critical posthumanist alternative is actually more complicated and therefore, in practice, not really an alternative at all? But I do not think

6 Are Dating Apps and Sex Robots Feminist Technologies? A Critical …


that is true. Where classical feminisms have a tendency to use the label of feminist technology to condemn certain technologies wholesale as antifeminist, and in this way in turn perpetuate exclusionary (and thus themselves antifeminist) patterns, the critical posthumanist perspective allows for a more nuanced view that further develops queerfeminist approaches. From the perspective of critical posthumanism, it would be entirely possible to develop feminist dating apps and sex robots; or rather, according to critical posthumanists, talk of feminist or antifeminist technologies is itself precisely what is problematic, as it encourages exclusionary traits. Focusing on relations, and being aware of their feminist (inclusionary) or antifeminist (exclusionary) features, has two advantages instead. First, it is more accurate, because this approach makes it possible to uncover the actual causes of the problem and to move away from the blanket remediation of symptoms implied in the judgment “feminist technology” or “antifeminist technology” (for instance, when Kathleen Richardson wants to ban sex robots per se). Second, this approach does not perpetuate exclusionary (discriminatory, sexist, oppressive) and thus antifeminist tendencies by condemning the users of certain technologies, as well as the technologies themselves, as antifeminist.

References

1. Aengst, J., Layne, L.L.: The need to bleed? A feminist technology assessment of menstrual-suppressing birth control pills. In: Layne, L.L., Vostral, S.L., Boyer, K. (eds.) Feminist Technology, pp. 55–88. University of Illinois (2010)
2. Alaimo, S., Hekman, S. (eds.): Material Feminisms. Indiana University Press, Bloomington, Indianapolis (2008)
3. Barad, K.: Meeting the Universe Halfway: Quantum Physics and the Entanglement of Matter and Meaning. Duke University Press (2007)
4. Bendel, O.: Sexroboter im Gesundheitsbereich. IT Health 8, 36–37 (2017)
5. Butler, J.: Gender Trouble. Feminism and the Subversion of Identity. Routledge (1990)
6. Callus, I., Herbrechter, S.: Posthumanism. In: Wake, P., Malpas, S. (eds.) The Routledge Companion to Critical and Cultural Theory, pp. 144–153. Routledge, London (2013)
7. Coeckelbergh, M.: Money Machines. Electronic Financial Technologies, Distancing, and Responsibility in Global Finance. Ashgate (2015)
8. Danaher, J.: Should we be thinking about robot sex? In: Danaher, J., McArthur, N. (eds.) Robot Sex. Social and Ethical Implications, pp. 3–14. The MIT Press, Cambridge, Massachusetts, London (2017)
9. de Largie, V.: Sex robots offer real benefits to society—and to women. In: iNews. The Essential Daily Briefing. Retrieved from iety-women/
10. Di Nucci, E.: Sex robots and the rights of the disabled. In: Danaher, J., McArthur, N. (eds.) Robot Sex. Social and Ethical Implications, pp. 73–88. The MIT Press, Cambridge, Massachusetts, London (2017)
11. Emcke, C.: Wie wir begehren. Frankfurt am Main (2013)
12. Franklin, A.: Posthumanism. In: Ritzer, G. (ed.) The Blackwell Encyclopedia of Sociology, pp. 3548–3550. Oxford (2009)
13. Haraway, D.: The Companion Species Manifesto: Dogs, People, and Significant Otherness. Prickly Paradigm Press (2003)



14. Herbrechter, S.: Critical Posthumanism. In: Braidotti, R., Hlavajova, M. (eds.) Posthuman Glossary, pp. 94–96. Bloomsbury, London, Oxford, New York, New Delhi (2018)
15. Laboria Cuboniks: Xenofeminism: A Politics for Alienation (2014). Retrieved from http://www.laboriacuboniks.net/de/
16. Layne, L.L.: Introduction. In: Layne, L.L., Vostral, S.L., Boyer, K. (eds.) Feminist Technology, pp. 1–35. University of Illinois (2010)
17. Levy, D.: Roxxxy the ‘Sex Robot’—real or fake? Lovotics 1, 1–4 (2013)
18. Loh, J.: What is feminist philosophy of technology? A critical overview and a plea for a feminist technoscientific utopia. In: Loh, J., Coeckelbergh, M. (eds.) Feminist Philosophy of Technology, pp. 1–24. J.B. Metzler (2019)
19. Loh, J.: Trans- und Posthumanismus zur Einführung. Hamburg (2018)
20. McArthur, N., Danaher, J.: How sex robots could help with the nuts and bolts of relationships (2017). Retrieved from bots-nuts-bolts-relationships-sex-robots
21. Misselhorn, C.: Robots as moral agents? In: Rövekamp, F., Bosse, F. (eds.) Ethics in Science and Society. German and Japanese Views, pp. 42–56. München (2013)
22. Mlot, S.: Sex robot Samantha upgraded with moral code (2018). Retrieved from https://www.
23. Morgan, R.: Looking for robot love? Here are 5 sexbots you can buy right now (2017). Retrieved from
24. Murphy, M.: Interview. Kathleen Richardson makes the case against sex robots. In: Feminist Current (2017). Retrieved from
25. Murray, T.: Professor Kathleen Richardson on ethical problems with sex robots (2017). Retrieved from
26. Nayar, P.K.: Posthumanism. Polity, Cambridge (2014)
27. Pitts-Taylor, V. (ed.): Mattering. Feminism, Science, and Materialism. NYU Press, New York, London (2016)
28. Realbotix: The Software (2019). Retrieved from
29. Rostalski, F.: Entscheiden im digitalen Zeitalter – Zur Bedeutung der technischen Beeinflussung des Menschen bei der Entscheidungsfindung. In: Hermann, I., Rostalski, F., Stock, G. (eds.) Kompetent eigene Entscheidungen treffen? Auch mit Künstlicher Intelligenz!, pp. 9–23. Berlin (2020)
30. Satz, D.: Feminist perspectives on reproduction and the family. In: Stanford Encyclopedia of Philosophy. Retrieved from
31. Strikwerda, L.: Legal and moral implications of child sex robots. In: Danaher, J., McArthur, N. (eds.) Robot Sex. Social and Ethical Implications, pp. 133–151. The MIT Press, Cambridge, Massachusetts, London (2017)
32. Trallori, L.N.: Der Körper als Ware. Mandelbaum Verlag, Wien, Berlin (2015)
33. TrueCompanion: FAQ (Frequently Asked Questions) (2019). Retrieved from http://www.tru

Chapter 7

Not Born of Woman: Gendered Robots

Huma Shah and Fred Roberts

Abstract This chapter posits that gender, while messy and non-binary, is a salient consideration in the development of virtual and embodied robots, if we are to avoid replicating stereotypical representations of men’s and women’s roles and occupations in our future artificial work colleagues and companions. Some questions are posed for robot developers, with early responses from researchers working in the field.

Keywords Gender · Humanoids · Language · Robot · Sex · Virtual assistant

7.1 Introduction

In a 2016 academic panel on gender, agents and artificial intelligence, part of that year’s International Conference on Agents and Artificial Intelligence (ICAART) [20], one attendee asked why the topic of gendered robots was being considered at all. Sophia, Hanson Robotics’ citizen of Saudi Arabia [21], and Titan, Cyberstein’s eight-foot entertainment robot, had not yet been introduced to the world [43]. Nonetheless, gendered and genderless robots have graced, and continue to grace, our screens in science fiction movies and TV shows: from Maria, the artificial temptress in the 1927 silent-era film Metropolis, to the protective robot in the 2021 movie Finch. Seven years on from ICAART, it is not unreasonable to ask whether roboticists should consider gender in the design of a robot. We might learn that some sub-conscious stereotyping is inherent in the development of virtual robots (e.g. digital assistants) and physical robots. For example, a virtual robot with text interaction only, such as a chatbot used to augment a call centre, might be presented as a longer-haired, smiling female. In a physical robot, a body appearing strong enough to carry heavy loads might exude maleness. This is not to say that there is no place for

H. Shah (B) Centre for Computational Sciences and Mathematical Modelling (CSMM), Coventry University, Coventry CV1 5FB, UK e-mail: [email protected] F. Roberts Loebner Prize winner, Hamburg, Germany © Springer Nature Switzerland AG 2023 J. Vallverdú (ed.), Gender in AI and Robotics, Intelligent Systems Reference Library 235,



H. Shah and F. Roberts

gendered robots. It may be that in some cultures a female-bodied humanoid is considered suitable as an artificial carer for an elderly immobile grandmother, while a male-bodied mobile robot might be considered a suitable travel companion for older male relatives. Here the first author could be conveying her own bias for human–machine interaction! In this chapter we briefly explore gender in early virtual and physical automata, robots with gender in the movies, sex and gender in humans and humanoids, and perspectives on gendered robots from ICAART’s 2016 panelists and post-panel contributors. We close with principles from a UK research organisation, the EPSRC, that could serve as guidelines for the ethical development of robotics that avoids replicating stereotypical views on sex/gender. With the ongoing threat that a variety of future jobs will be done autonomously, and that some of this automation could be applied to the low-paying jobs traditionally performed by women, the authors ask whether now is a good time to consider the gender of those not born of woman.

7.2 Automating the Body and Mind

Automata, not born of woman, have over the centuries included Leonardo da Vinci’s sixteenth-century mechanical lion, “sounding very much like a simple robot powered perhaps by wound springs or wires”, which, to impress the French king Francis I, walked “a few steps before its chest opened to reveal a fleur-de-lis where the heart would have been” ([41], p. 254). In the industrial age some mechanical marvels embedded gender, including the eighteenth-century mechanical writing boy (Fig. 7.1), “perhaps the world’s most astonishing, surviving automaton” [34].

Fig. 7.1 Pierre Jaquet-Droz’s automaton: the writer. Source https://www.inform ton-is-the-ancestor-of-tod ays-computers-3.html

7 Not Born of Woman: Gendered Robots


Alan Turing [37] introduced his test for intellectual capacity in machines, a comparison paradigm leveraging question–answer interviews, through a gender imitation game [35]. Niculescu et al. [23] state that, apart from other cues of physical appearance such as age, height, weight, ability/disability, and ethnicity, “gender seems to be of fundamental importance, being part of the first visual information that people exchange in daily communication” (p. 16). In physical systems, gender has been embodied in certain robots, such as Nadine, introduced at Singapore’s Nanyang Technological University as their receptionist [28]. Eyssel and Hegel [15] found that “people interact with computers in ways that are comparable to human–human interaction and usually people are not even aware of the fact that they do so automatically” (p. 2215). Developers of conversational agents and digital assistants have often given their creations female or male personalities. For example, Demchenko and Veselov [10] inculcated their Eugene Goostman programme with the personality of a teenage boy from Odessa, Ukraine, who could text in English. In the next section, we look at the award-winning conversational virtual robot, Elbot.

7.3 Perspectives on Chatbot Gender

The second author provides observations from his experience as an AI research and development engineer. As well as creating the award-winning chatbot Elbot [29, 30], Roberts has created conversational assistants for a variety of organisations. Roberts provided this insight:

“A customer, a prominent automobile manufacturer, used a male and female avatar side-by-side. Users could decide themselves which virtual assistant they preferred to talk to. The majority of users (75%) chose to converse with the female agent. When asked to evaluate the customer service experience, users considered the female agent to be more competent than her male counterpart. The male avatar, on the other hand, received more criticism regarding the content, and was generally seen as less knowledgeable. Yet the content of the two systems was identical!” Perceived differences were a result of user biases.

Roberts found it remarkable to discover that the perceived quality of a system can be influenced by the supposed gender of that system. Consider Roberts’ gender-neutral Elbot chatbot. Roberts reported that he generally thinks of Elbot as male, although Elbot itself does not purport to have a gender at all. In the design of its conversational capacity, Roberts has attempted to keep Elbot’s answers in conformity with the character, but to him Elbot will always be a “he”, perhaps because of the images, which seem decidedly masculine to its creator. To approach the question, Roberts used Teneo Discovery to analyze millions of user inputs from three different systems from Artificial Solutions [1]:

– Elbot: gender neutral, but male visual cues
– Lyra: gender neutral, but female voice
– Male Agent: unambiguous male gender



Roberts looked for user utterances that implied the conversational partner, the VA, had a certain gender. For example, “who is your wife?” or “you look like a happy man today” are statements implying that the speaker recognises the entity spoken or texted to as possessing a certain gender. In each case, users perceived the system’s gender as dictated by the available cues. Elbot had male visual cues, which appear to have helped the majority of users (58%) decide that Elbot is male. Lyra was perceived as female by 95% of users, despite being presented as gender neutral (Graph 7.1). Here it appears that the female (default) voice is the dominating factor: users will invariably refer to an entity as female if it has a female voice. In the case of Artificial Solutions’ customer agent, it appears that an unambiguous presentation of gender satisfies users as to that gender. They are willing to suspend the knowledge that they are chatting with a computer programme that has no gender at all. Exactly how users perceive AIs, robots and virtual assistants according to gender cues is a fascinating area that needs further study. Humanoids, not mere machinery but robots with body and limbs not born of woman (Shakespeare, Macbeth; [17]), designed by mostly male roboticists, could be fulfilling their creators’ own stereotypical views of male and female humans. We briefly look at robots in fiction, including the famous film Metropolis, which influenced subsequent films, including Star Wars.

Graph 7.1 Gender perceptions in Elbot, Lyra & Customer Agent
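The cue-counting Roberts describes can be sketched in a few lines of Python. The snippet below is an illustrative reconstruction only: the actual analysis used Artificial Solutions' Teneo Discovery on millions of logged inputs, so the cue lexicons, function names and sample utterances here are hypothetical assumptions, not the original tooling or data.

```python
import re
from collections import Counter

# Hypothetical cue lexicons (illustrative only, not from the original study).
# Per the text, "who is your wife?" is read as implying a male agent, so
# partner terms are assigned heteronormatively here.
MALE_CUES = {"man", "sir", "mister", "guy", "boy", "he", "him", "his", "wife"}
FEMALE_CUES = {"woman", "lady", "madam", "girl", "she", "her", "miss", "husband"}

def perceived_gender(utterance: str) -> str:
    """Classify one utterance as implying a 'male', 'female', or 'none' agent."""
    words = set(re.findall(r"[a-z']+", utterance.lower()))
    is_male, is_female = bool(words & MALE_CUES), bool(words & FEMALE_CUES)
    if is_male and not is_female:
        return "male"
    if is_female and not is_male:
        return "female"
    return "none"

def tally(utterances):
    """Share of the gender-implying utterances attributing each gender (percent)."""
    counts = Counter(perceived_gender(u) for u in utterances)
    gendered = counts["male"] + counts["female"]
    if gendered == 0:
        return {"male": 0.0, "female": 0.0}
    return {g: 100.0 * counts[g] / gendered for g in ("male", "female")}

# Hypothetical log sample, echoing the examples in the text.
logs = [
    "you look like a happy man today",  # implies a male agent
    "who is your wife?",                # implies a male agent (per the text)
    "are you a real woman?",            # implies a female agent
    "what is the weather like?",        # no gender cue
]
print(tally(logs))
```

A real analysis would of course use richer linguistic cues than single keywords (anaphora, address forms, compliments), but even this crude tally reproduces the shape of the percentages reported in Graph 7.1: only the gender-implying utterances are counted, and each is attributed to one perceived gender.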



7.4 Fictional Robots

In science fiction movies, one of the most famous embodiments of an artificial being is Maria in Fritz Lang’s 1927 Metropolis (Fig. 7.2). This movie and its robot protagonist influenced film directors, among them George Lucas, director of the early Star Wars films. Lucas specifically had the C3PO robot (Fig. 7.3) fashioned to be like Maria, but not her gender (in [33]):

“George [Lucas] brought a photograph of the female robot from Metropolis and said he’d like Threepio to look like that, except to make him a boy.” Breasts and hips removed, this gender-realigned Maria is the perfect robo-companion. “He” is unthreateningly camp and non-violent, and a bit of a wuss, so he never upstages the male heroes. He is also unerringly polite and a great communicator, which makes him a hit with the ladies. And there is absolutely nothing sexy about him.

Ideas in movies reflect the attitudes of screenwriters and directors, and could sway public expectations of robots with artificial intelligence, impressing on us ethical concerns about the treatment of gendered robots. In films and TV shows from Blade Runner [6] to A.I. Artificial Intelligence (2001) [18, 27] and Ex Machina [14], we encounter machines and robots, such as Prometheus’ David (Fig. 7.4) or Ex Machina’s Ava (Fig. 7.5), with human-level intelligence, self-awareness and perfection, as well as the development of feelings, dreams, desires and falling in love. There is often sexuality and almost always a gender. In Ex Machina, a full robot Turing test is conducted between a male human and the artificial intelligence, Ava, “housed in a beautiful female robot” not born of god or woman (Fig. 7.5). Ava uses her gender to devastating effect on the deceived human taken in by her guile. These films vividly portray a popularly imagined threat posed by AI, evidenced by Ava’s ultimate deception at the end of Ex Machina: that machines are inherently self-serving, or malevolent.

Fig. 7.2 Maria—Metropolis. Sources: gallery and http://starwa wars+c3po+action+figure



Fig. 7.3 C3PO—Star Wars. Source: gallery and http://starwa wars+c3po+action+figure

Fig. 7.4 David robot in [27] movie. Source digal-son-david-8/

In visionary TV shows such as Person of Interest (2015) and Westworld [40], the dilemma of all-seeing but genderless intelligences, or of attractive humanoids for servitude or pleasure, is ever present. The experiences of some movie robots mirror the cruelty some humans inflict on others in real life, and portray a current fear: robots are one piece of bad code away from wreaking havoc on their human creators. We taunt and belittle others, exemplified by the human Charlie in how he treats the robot David (Fig. 7.4) in Prometheus [27]. We abuse those weaker than ourselves, as Dolores (Fig. 7.6), the first humanoid ‘host’, comes to recall in Westworld (2020).



Fig. 7.5 Ex machina—Ava robot’s Turing test. Source sites/markhughes/2015/04/ 25/ex-machina-directortalks-gender-nazis-and-col laborative-filmmaking/

Dolores’ artificial existence, in the first series of Westworld, featuring a futuristic adult adventure playground, is to succumb to “violent delights” for the pleasure of some of the human guests. Westworld series 1 relayed a vision “in which every human appetite, no matter how noble or depraved, can be indulged” [49], at the expense of the artificial inhabitants. Dolores exacted her revenge in Westworld’s third series, having become self-aware, and smarter (Fig. 7.6). In other cinematic representations of artificial beings, from Blade Runner [6] to A.I. Artificial Intelligence (2001) [18] and Ex Machina [14], we encounter machines with human-level intelligence, self-awareness and perfection, as well as the development of feelings, dreams, desires and falling in love. But would we want this level of engagement with gendered robots in reality? What do we see as an acceptable purpose for robots in our lives? We learn that stereotypical interaction permeates one idea for a household robot. The cover of a 2014 robotics journal (Japanese Society for Artificial Intelligence) displayed an image which drew criticism for using a female robot looking at the reader while “dragging a cable connected to her back with a book in her right hand and a broom in her left” (Fig. 7.7).

Fig. 7.6 Dolores robot from series 3 of the Westworld TV show. Source https://www. westworld-season-3-facemasks-coronavirus



Fig. 7.7 Book-reading Fembot with broom. Source JSAI

Concluding this section, we see that fictional robots miss the point that sex and gender are not equivalent, and that neither sex nor gender is binary. In the next section we provide a brief landscape of the non-binary nature of sex.

7.5 Sex and Gender in Humans

Sex is a “biological term” and “people are termed either male or female depending on their sex organs and genes” ([2], p. 2). Sex is inconclusive when the biological determinants, i.e. genitalia, are both male and female, as in the case of intersex individuals [12]. In practice, humans can identify as non-binary, as in the case of the British-Pakistani Muslim Maria Munir, who came out publicly in front of an audience to President Barack Obama on Saturday 23 April 2016 [4], before revealing her non-binary gender to her parents and traditional community. The well-known case of the South African athlete Caster Semenya is another illustration of the non-binary nature of biology. Semenya is one of those females defined as hyperandrogenic, that is, having high testosterone levels. Her inclusion, and that of similar athletes, in elite sporting events, including the Olympics, fans flames of unfairness among female athletes whose testosterone levels are within the norm [16, 22]. Since 2009, Semenya has endured years of controversy and invasive, humiliating testing practices [24]. Since 2013, through sport gender policing that defines who



gets to be a woman, IAAF policy has required women who do “not have conventionally feminine appearance—the latest iteration of gender verification” to undergo medically unnecessary hormone therapy or surgery if they wished to compete [24]. This is a practice almost as demeaning as the chemical castration undergone by the likes of Alan Turing in the 1950s. Another athlete who came under fire at the Rio 2016 Olympics, for a testosterone level “ruled too high for her to be considered a woman” [24], was the Indian runner Dutee Chand. Semenya’s and Chand’s hyperandrogenism, above natural hormone levels, is due to a health condition and not a result of unfair play or doping. Bermon et al.’s clinical study [5] investigated the blood passports of over 850 female athletes. The medical researchers considered clinical features of the subjects, including masculinity of body build, and hormonal blood results. The controversy concerned the eligibility of athletes with hyperandrogenism and whether they should be allowed to “compete in women’s sports” if the female athlete possesses “normal androgen sensitivity and serum T levels above the lower normal male range” (p. 4334). Bermon et al. [5] concluded that “Congenital adrenal hyperplasia is a possible cause of virilization in our elite female athletes” but that they could not “exclude that the Y chromosome in some unknown way may bring an advantage to female athletes” (ibid). Nonetheless, the study does not preclude such females from being women. Discussions around who is a woman, and whether individuals transitioning from male to female are women, have seen robust debate. In one case the discussion spiraled to accusations that Harry Potter author J.K. Rowling was transphobic [3]. Rowling had stated [3]: “My life has been shaped by being female. I do not believe it’s hateful to say so …. I know and love trans people, but erasing the concept of sex removes the ability of many to meaningfully discuss their lives…”

The BBC reported that Rowling had “voiced her support for a researcher who was sacked after tweeting that transgender people cannot change their biological sex” [3].

7.6 Gender in Robots: Humanoids in Practice

When in reality sex is not clear cut, and “gender is socially constructed, not biologically given” ([2], p. 1), we ought to be wary of succumbing to unconscious attribution when developing systems, in order not to fuel stereotypes in humanoids, whether they be robots preparing food in a restaurant [8] or carer/companion humanoids in the home. Indeed, progressive initiatives have been in place for tailor-made solutions, for example the Connecting Assistive Solutions to Aspirations (CASA, 2014) project at the UK’s Bristol Robotics Laboratory. Robot development provides groundwork for exploring human relationships with artefacts, and also for investigating and designing robots with the skills that humans need



to successfully complete tasks. Human–robot interaction can also gather information on human perceptions and requirements of a variety of robots. For example, an elderly female human would require different skills from a robot designed to care for and accompany her than would a male human astronaut in space requiring assistance with mathematical analyses to manoeuvre their lived-in spacecraft or avoid objects in space. In both these cases, would it be unacceptable for the elderly female human to be cared for by a male robot, and would the human astronaut object if they were assisted by a female robot mathematician? A leading Japanese roboticist predicted that “over half all future humanoids will be female” (in [31]). Japan already utilises a female robot in Tokyo’s Science Museum [11], and a female humanoid guides tourists there (Fig. 7.8). NASA’s Robonaut has a robust male-like appearance with square shoulders (Fig. 7.9). Are these stereotypical representations of the kinds of roles humans fulfil: females for tasks requiring communication, whereas, presumably, a strong robot is required to assist astronauts with heavy equipment-lifting in space? Singapore’s Nanyang Technological University has a new receptionist in the form of a female robot (Fig. 7.10); would a male robot look out of place as a welcoming figure at a university’s front desk? What about virtual robots as information providers on university websites? Should they look like students, or be genderless? A leader in this field following her extensive study on gendering humanoid robots in Japan, Jennifer Robertson [31], considered the question “how do (the mostly male) roboticists design and attribute the female or male gender of humanoid robots?”.
What Robertson learnt was that “the practice of attributing gender to robots” is a “manifestation of roboticists’ tacit, common-sense knowledge”, and that “how robot makers gender their humanoids is a tangible manifestation of their tacit understanding of femininity in relation to masculinity and vice versa” ([31]: p. 4). Robertson’s investigation revealed that the criteria by which roboticists assigned gender resulted from “naive and unreflexive assumptions about humans’ differences” and these

Fig. 7.8 Humanoid tourist guide. Source: http://www.



Fig. 7.9 NASA robonaut. Source: http://robonaut.jsc.

Fig. 7.10 Nadine robot. Source: uk/news/archive/2016-01/04/ robot-receptionist

“informed how they imagined both the bodies and the social performances of their creations” ([31]: p. 5). In Niculescu et al.’s [23] pilot study, participants preferred “gender labelled” agents (female and male avatars) to a gender-ambiguous entity. In Eyssel and Hegel’s 2012 study of gender stereotyping of robots [15], the authors investigated the effects of visual gender cues (hair length). Participants in the experiment were shown images of robots, one with long hair, the other with short hair. The short-haired robot was perceived as more masculine than the long-haired robot: “these visual gender cues, in turn, affected social perceptions of the robot types” ([15]: p. 2223). The male-looking robot with short hair was adjudged to have more agency (such as assertiveness) than the female-looking robot, while the latter was perceived as more communal (empathetic) than the male robot ([15]: p. 2223). In a recent study, Song-Nichols and Young [44] explored “whether gender cues and stereotypes should be exploited to facilitate human–robot



interaction, potentially improving rapport and engagement, reducing performance errors, and increasing marketability” (p. 2480). Song-Nichols and Young “extended” this line of research by engaging forty-five 6–8-year-old children, extending findings on “children’s social learning from robots to learning about gender stereotypes” (p. 2481). The children were shown “short animated and narrated cartoon videos about three female gendered robots” (ibid). The findings from Song-Nichols and Young (2020) showed that children do “treat gendered robots as models for cultural gender stereotypes” and that, from an “applied perspective”, the study’s results “suggest robots may counteract and reinforce potentially harmful gender stereotypes in children” (p. 2483). Song-Nichols and Young did deploy “stereotyping interventions” (2020: p. 2483). The intervention was “based on the idea that learning information that is counter to one’s beliefs about a group can change how one thinks about members of that group in the future” (ibid). The children in Song-Nichols and Young’s study “observed information about the occupations, activities, and traits of female robot exemplars and extended that information to their beliefs about what activities and occupations novel male and female robots, boys and girls, and men and women should do” (2020: p. 2483). Society needs to ensure that children, adolescents and adults do not assume that robots are cultural models for human men and women. In the next section we learn the perspectives of practitioners in artificial intelligence and robotics.

7.7 Views on Circumstances for Gendered Robots

During and after the 8th International Conference on Agents and Artificial Intelligence [20], questions on the circumstances of robot gender were considered by the following academics:

– Professor Katia Sycara (Carnegie Mellon University, US)—roboticist
– Professor Barbara Henry (Scuola Superiore Sant’Anna, Italy)—philosopher
– Emeritus Professor Kevin Warwick (Reading and Coventry University, UK)—cyberneticist and first human cyborg
– Fred Roberts, R&D Engineer, Loebner Prize winner (co-author)
– Dr. Pericle Salvini, previously Project Manager of the concluded EU FP7 (Science in Society) RoboLaw project
– Dr. Mark Elshaw, Coventry University robotics researcher
– Dawid Horacy Golebiewski, EU-robotics list member
– The author (HS)

Views are presented in the tables that follow. The questions are given first, followed by tables of responses. Questions considered at the ICAART 2016 panel on gender, agents and artificial intelligence:

a. Do you personally feel some virtual agents/robots should have a gender?
b. If so, in which type of agents/robots? For example, in carer/companion robots interacting with the sick or elderly in their own home.



c. If you feel gender-neutral is best, why?
d. If you develop agents and AIs, do you give them male or female attributes, and if so why?
e. Do you feel the lines between gender (trans/inter, etc.) and sex (male/female) are too messy to bring into agent/robot development?

Reactions of the audience at the panel included the following:

– objection to considering gender at this early stage of robot development—however, it was pointed out that robot carer/companions were already in existence and others in development;
– objection to the title of the panel—it was pointed out that a discussion has to start somewhere, and as the theme was whether robots should have a gender, the panel title was appropriate.

What was noticeable was that, although the panel was gender-balanced (2 female speakers, 2 male speakers), there was only one female in the audience at the panel talks and its Q/A session. A parallel session was taking place, but little interest in attending the panel discussion was evident among the conference’s female delegates. The first author contends that there might have been a misunderstanding that the panel was a feminist discussion, as her talk in an earlier session, on ‘Imitation Gender as a Measure for Artificial Intelligence: Is it Necessary?’ [36], was incorrectly introduced as a feminist viewpoint by the session’s male Chair, when in fact the talk delved into the players in the gender game evolving to Turing’s machine–human imitation game. The following section presents tables of views on each question. Personal views on whether a robot should have gender were varied: roboticist Sycara’s position was that the question is task-dependent, while philosopher Henry cautioned for an ethical, legal and social case-by-case consideration of the circumstances in which a gendered robot would be appropriate. Sycara suggested testing hypotheses, such as whether female-bodied robots might be better carers than male-embodied robot carers.
Salvini’s perspective echoes those who feel neutral robots would prevent confusion in humans interacting with them, and avoid replicating stereotypes. Roberts’s work is driven by customers and their requirements for virtual assistants. Warwick’s [39] experiment building a robot powered by rat neurons did not include gender considerations. The varied opinions presented in Tables 7.1, 7.2, 7.3, 7.4 and 7.5 highlight that there is a need for symbiotic consideration across disciplines to embed ethical, legal and social mores in virtual and physical robots. On question 2, on which types of robots could have gender, Warwick stated it could be relevant to sex robots and other robots where interaction with humans is required. Carer/companion robots are another area, pointed out by Roberts, who builds virtual intelligent assistants for e-commerce. He stressed that users should be included in the choice of robot gender. Genderless systems were deemed appropriate in military or disaster scenarios, and for smart devices, such as wearable technologies. Sycara suggested experimentation to find which is the better option; driverless/autonomous vehicles, however, would be effective without gender in their design.

For the social and cultural aspect, Henry reminded us of the post-human, a hybrid of human and technology, Warwick being one of the first cyborg examples through implant experiments. The post-human way of thinking was “a chance to contrast and bust the supporters of the sacral purity and superiority of the exclusively biological origin of the natural born of woman” (Henry, Table 7.3). Henry also pointed out that consideration of Lesbian, Gay, Bisexual, Transgender, Queer, Intersexual and Allied (LGBTQIA) people should be included in a framework for robotics development to ensure more inclusive moral agents (Table 7.5). Shah felt we might be perpetuating stereotypes with female embodied robots built to do light work, but Roberts highlighted the unnecessary complication in developing virtual assistants: a complex gendered agent could be a distraction for the human user. Sycara’s concern was that there is not yet a “good answer in human society” for the distinctions between sex and gender; Warwick, however, felt the timing provided “an exciting new area of study” (Table 7.5).

Regardless of whether a system has a gender, what about its language use? If it is a spoken system, its tone and timbre could be interpreted as male or female, and how it uses language might also affect its treatment. Whether a robot is gendered or genderless, Sherry Turkle warns, “For the idea of artificial companionship to be our new normal, we have to change ourselves, and in the process, we are remaking human values and human connection” [33]. From the responses to the questions, we realise that social robotics is still a nascent area within the wider field of robotics. Research in this field on artificial consciousness and self-awareness could lead to intelligent robots discovering gender for themselves. The area of gender in robotics could well be appealing for cross-discipline studies. Policymakers could refer to Palmerini et al.’s [26] guidelines for robotics, produced at the conclusion of the EU-funded RoboLaw project [32]. In the next section, we consider one guideline that could be applied to the development of robotics.

H. Shah and F. Roberts

Table 7.1 Question 1: Should robots have a gender?
Q1. Do you personally feel some virtual agents/robots should have a gender?

Shah: The decision should be use-dependent and reflect users’ preference. For example, a robot deployed to defuse an unexploded bomb or to check a suspected package or vehicle probably does not need gender attributes, but a human patient, if provided with an automaton/humanoid carer, should be given the choice that it be genderless, female or male embodied.

Roberts: It depends on the situation. The more humanlike qualities a system has, the more it makes sense to assign a gender. Keep in mind that users will tend to assign a gender themselves based on the cues available to them, and it may even be in opposition to the gender you wished to present.

Sycara: I think the jury is still out on the question. As a scientific community we currently lack adequate amounts of data from studies on this issue. My current view is that whether a robot has gender or not depends on the task. In other words, we need to study the effect of robot gender on the quality of the task-specific human–robot interaction and evaluate the overall performance of the human–robot system.

Henry: This answer, like the next four, needs an epistemological/conceptual preliminary clarification in order to be properly addressed as an embedded/contextual issue. This does NOT simply mean “it depends on the situation”, but rather “it depends on the presuppositions adopted by the observer”. According to a post-positivist epistemology, there is no knowledge outside the knowing possibility of the subject, and thus there is no science outside the cultural environment it is entwined with. In this case, the answer is positive: YES.

Golebiewski: Yes.

Elshaw: My belief is that we should move away from gender and gender stereotypes for jobs. A robot could possibly have a gender, but we should not create a male robot to do one job and a female robot to do another. Different genders are, however, a matter for the future. It is interesting that the receptionist robot that IBM has produced with the Nao robot is called Connie.

Salvini: In my opinion they should be neutral.

Table 7.2 Question 2: What type of agents/robots should have gender?
Q2. In which type of agents/robots should gender be embedded? For example, in carer/companion robots interacting with the sick or elderly in their own home.

Shah: Again, user preference is crucial here. If an elderly male patient prefers in appearance a male embodied robot carer, even if that robot has no pragmatic behavioural differences from a female embodied robot carer, to palliate the male patient his sensitivities should be acknowledged.

Roberts: For a carer/companion robot it makes sense to select a particular gender, but there should not be a dogmatic decision about this. Ask your target user/group which gender is preferred.

Sycara: As scientists we can formulate particular testable hypotheses regarding the tasks and settings. One such hypothesis that I find plausible would be that female robots, for example, would be more effective in elder care than male robots.

Henry: An interlocutory response is given to this question, because we should adopt an ethical, legal, social approach (ELSA) case by case, in order to frame and fulfill the specific needs, claims and interests of the stakeholders according to the all-affected principle. We should even adopt in this specific context the disciplinary lens of Persuasive Robotics.

Warwick: Sex robots. Probably better if all virtual agents had a gender, particularly if some form of interaction is required. For robots that interact around the home (such as carer robots) it would be better for them to have a gender to complement the human they are interacting with.

Golebiewski: In contexts where the appearance of an agent/robot is very close to any gender archetype commonly recognized by humans. NO artificial gender necessary.

Elshaw: We might speak to the people who are receiving the care or application and see whether they need a gender and, if so, which.



Table 7.3 Question 3: gender neutrality in robots
Q3. If you feel gender-neutral is best, why?

Shah: In military or disaster circumstances, for example when a robot is performing a specific task, like sending images from underneath rubble in an earthquake-hit area, where rescue operations involve searching for human survivors, such a robot’s purpose is to rescue, and it being male or female embodied would not necessarily make it more successful.

Roberts: There can be situations in which a gender-neutral system is best. If the system represents an inanimate object, if it is a smart device with few human characteristics, or where there should be no distractions, it is probably best to avoid gender. It would be unusual to have a transvestite toaster because toasters have no actual sexuality. Also, imagine this situation with a social robot: a young man selects a social robot and chooses a sexy female model. He takes this robot with him everywhere he goes. But then he meets a young lady and goes on a date with her, bringing his social robot with him. Imagine how awkward that situation would be! If it becomes too weird for his date, he may wind up sitting alone with his social robot, lamenting the fact that he did not select the gender-neutral model.

Sycara: Gender neutral could be most effective in certain tasks, but this needs to be studied experimentally. For example, I believe that self-driving cars and package-carrying autonomous vehicles would be effective as gender neutral.

Henry: A negative answer is given to this question, in so far as we are situated in a material/symbolical condition like the post-human is. Post-human is to be interpreted here as an interspecist, osmotic and relational horizon of effective sharing of experiences, dangers and challenges, a horizon inclusive of human, artificial and hybrid beings. This is to be said in regard to the concerned phenomena (artificial and hybrid beings), to the frame and texture of meaning (post-human) and to the corresponding fields of study: cultural/imaginary studies, semantics and hermeneutics of science fiction, gendered post-human studies, persuasive robotics, cyber-anthropology. This negative answer to the question about gender neutrality descends directly from my preliminary clarification of intents and from the adoption of the post-human way of thinking: in this respect, the post-human horizon is to be considered a chance to contrast and bust the supporters of the sacral purity and superiority of the exclusively biological origin of the ‘natural born of woman’.

Elshaw: I feel that gender neutral is fine for applications that are not long-term interactions. This would save a lot of unnecessary effort.

Salvini: To avoid confusing the human being interacting with the robot, and to avoid replicating stereotypes.

7.8 Principles in Robot Development

The 4th Principle of Robotics, as devised by the UK’s Engineering and Physical Sciences Research Council [13], states:

Robots are manufactured artefacts. They should not be designed in a deceptive way to exploit vulnerable users; instead their machine nature should be transparent.

Alan Winfield, a Bristol University roboticist, has taken this principle to justify the position that embedding gender in robots is tantamount to deception [42]. Winfield states:

Robots cannot have a gender in any meaningful sense. To impose a gender on a robot, either by design of its outward appearance, or programming some gender stereotypical behaviour, cannot be for reasons other than deception—to make humans believe that the robot has gender, or gender specific characteristics.

Table 7.4 Question 4: on development of AI and agents
Q4. If you develop agents and AIs, do you give them male or female attributes? If so, why?

Shah: I do not create, but I do analyse, dialogue agents’ ability to simulate natural language/human-like conversation.

Roberts: When we develop agents at Artificial Solutions it is in conformity with the character that has been defined by the customer. Any attributes the character is given should conform to that character, which can include gender or the lack of a gender. Whether some unconscious bias slips in, I cannot say, as it is unconscious!

Sycara: I do not currently develop human-like agents.

Henry: An interlocutory response is to be given to this question because we should adopt, in analogy with answer (3), an ethical, legal, social approach (ELSA) case by case, in order to frame and fulfill the specific needs, claims and interests of the stakeholders according to the ‘all-affected’ principle. We are in need of embedded, not abstract, responses.

Warwick: On the rat-brain-powered robot experiment: it wasn’t intentional that the robot was gender neutral. Indeed, the robot may well have inherited an innate biological gender from the neurons used. However, there was no way in the experiment for us to test for or measure gender in any way. It raises a question as to what it means for a robot/machine to have a gender if/when its inputs/senses are different from those of a human and when it cannot (as far as we are aware) measure its gender.

Golebiewski: No, I do not. In the long run, human–computer interaction will be modularized and the question of gender will dissipate into irrelevance, since the “personality” or “character” of an agent will be interchangeable.

Elshaw: The robots we are working on do not have a gender. However, the users might add gender themselves to the robot.

Salvini: I do not develop agents, but I guess the reason is to make robots and AI more appealing, sociable; in a word, to increase their social acceptance and usability.

While robot developers might agree with Winfield’s view, it is of concern that gendered robots recreate stereotypical views about humans in humanoids developed to appear either female-like or manly. Could gender be complicating an already complex engineering challenge? Gendered robots are increasingly being developed, but embodying a robot with a female or male appearance could be reinforcing human impressions of which roles are appropriate for female robots and which are best for male robots. For example, are robot receptionist jobs best for female humanoids, whereas jobs requiring heavy lifting are better suited to a male-like robot?



Table 7.5 Question 5: on messy sex/gender
Q5. Do you feel the lines between gender (trans/inter, etc.) and sex (male/female) are too messy to complicate agent/robot development?

Shah: I think it important that the field of robotics does not perpetuate social and cultural constructs, for example building female embodied robots to do light work, such as receptionist roles. It is interesting that DARPA has been developing some military robots shaped around animals: dogs and cheetahs. Robotics has a great role to play in shaping the future by developing non-stereotypical humanoids through balanced, interdisciplinary teams involving males and females creating the look, and designing the behaviours, to ensure the future does not provide an image where electric driverless vehicles are categorised as female and petrol-fuelled self-driving cars are seen as manly [19]. We need a future of robots that will interact with each other, and of course with humans, successfully, perhaps teaching us humans to be better at humaneness.

Roberts: I don’t see the purpose of presenting a complex gender in a virtual assistant, unless it is to raise awareness about that complex gender. If so, it should be done in a respectful and consistent way, otherwise it might seem like a joke. A past customer of ours was a producer of hair products, for whom we designed a virtual hair stylist. Initially there were discussions of having the hair stylist present himself as gay, but finally it was decided to make him heterosexual instead, so as to avoid controversy. A complex gender may serve more as a distraction than anything else. Most of the casual conversation you hear from a virtual assistant will be gender-neutral anyway.

Sycara: I think we do not yet have a good answer in human society for the question of the lines between gender and sex; therefore I would say that it is too early to pose this question for robot development.

Henry: I say NO to question 5. The constellation of LGBTQIA (Lesbian, Gay, Bisexual, Transgender, Queer, Intersexual, Allied) humans is to be considered a crucial frame of reference. This field of phenomena is a fundamental source of counter-factual hypotheses on behalf of a more inclusive, hybrid and porous taxonomy among embodied moral agents. I rely on the works of Donna Haraway and on the cyberpunk/post-human literature; both insights are effective in promoting and endorsing alliances between socio-anthropological subjectivities and cybernetically connected configurations.

Warwick: Certainly not. It provides an exciting new area of study, particularly with regard to stereotypical associations/expectations and what it takes for an individual human to be convinced that the agent they are interacting with is male/female.

Golebiewski: Absolutely. At least until true artificial consciousness is achieved; then we can leave the agents/AI to discover their own gender themselves.

Elshaw: I feel that we are very much at the early stages of social robots, so attempts to include gender, race or sexuality might lead to stereotyping, which is not good for the research field or for society. If you look at many of the current social robots, many lack any sort of subtlety in their behaviour. There is a massive field of research on gender bias and I would prefer that robots did not contribute to that.

Salvini: I think robots or AI are objects and as such have no intrinsic gender, so the issue does not apply.



7.9 Discussion

Women’s traits are accepted as being “interpersonal warmth”, while “men are perceived predominantly in terms of agentic features” ([15], p. 2217; [25]). Men and women are seen as opposites, one being “strong, rational, aggressive” and the other “weak, emotional, submissive” ([2], p. 1). Robot developers need to take care when translating these impressions into the design of virtual and physical robots. Otherwise, we run the risk of entrenching sex stereotypes in real life and skewing the perception of real people [9]. Subservient female humanoids might burden human women with an expectation to accept such behaviour as their lot in life. We need to keep in mind that gender is a “psychological and cultural term referring to one’s subjective feeling of maleness or femaleness”, giving rise to gender identity ([2], p. 2). Society plays its part by “evaluating behaviour” as masculine or feminine, thereby assigning gender roles and wrongly assuming any differences are biology-based rather than the result of social factors [2]. Disregarding this could lead to permanent damage to physical robots, or the retiring of virtual humans, as in the case of the Eugene Goostman chatbot, temporarily taken offline as a result of receiving abuse following its performance in a 2014 Turing test [38].

7.10 Conclusion

The emergence of numerous gendered robots raises the issue of whether gender is regarded unconsciously by developers of humanoid or other autonomous systems. Could gendered robots, rather than advancing an understanding of the relationship between gender and body in humans, actually be fulfilling stereotypical views about the kinds of roles particular robots should undertake? The aim of this chapter is to pose this consideration to the robotics community: whether, and when, it is appropriate to gender robots. Future employment will change the way many jobs, including the lower-paid jobs traditionally carried out by women (receptionists, carers), are done. More jobs will be completed autonomously. Now is a good time to consider what we are developing when we design physical and virtual robots. We ought to move beyond unconscious bias so as not to produce culturally stereotypical robotic systems that promote antiquated views on societal roles for men and women.

Acknowledgements Barbara Henry, Katia Sycara, Kevin Warwick.

References

1. AS: Artificial Solutions wins Best Intelligent Assistant Innovation at the AIconics (2018). Retrieved 22 Sept 2021 from: wins-best-intelligent-assistant-innovation-aiconics
2. Basow, S.A.: Gender: Stereotypes and Roles. Brooks/Cole, CA, USA (1992)
3. BBC: J.K. Rowling responds to trans tweets criticism. BBC UK, 11 June 2020. Retrieved from:
4. BBC: Why I came out to President Obama before I told my parents. BBC Newsbeat (2016). Retrieved 23 April 2016 from:
5. Bermon, S., Garnier, P.Y., Hirschberg, A.L., Robinson, N., Giraud, S., Nicoli, R., Baume, N., Saugy, M., Fénichel, P., Bruce, S.J., Henry, H., Dollé, G., Ritzen, M.: Serum androgen levels in elite female athletes. J. Clin. Endocrinol. Metab. 99(11), 4328–4335 (2014)
6. Blade Runner (1982)
7. CASA: UWE Bristol Robotics Laboratory—Long Term Care Revolution (2014). Accessed 30 Sept 2016 from:
8. CASABOT: Casabots: Robots for Food Businesses (2014). Accessed 30 Sept 2016 from:
9. De Angeli, A., Brahnam, S.: Sex stereotypes and conversational agents. In: Proceedings of Gender and Interaction (2006). Accessed 11 Oct 2016 from: gender/files/papers/Sex%20stereotypes%20and%20conversational%20agents.pdf
10. Demchenko, E., Veselov, V.: Who fools whom? The great mystification, or methodological issues in making fools of human beings. In: Epstein, R., Roberts, G., Beber, G. (eds.) Parsing the Turing: Philosophical and Methodological Issues in the Quest for the Thinking Computer, pp. 447–459. Springer (2008)
11. Demetriou, D.: Humanoid robots join staff at Tokyo Science Museum. The Telegraph UK, 25 June 2014. Retrieved 28 April 2016 from: japan/10924594/Humanoid-robots-join-staff-at-Tokyo-science-museum.html
12. Devor, A.H.: How Many Sexes? How Many Genders? When Two Are Not Enough (2007). Retrieved 23 April 2016 from:
13. EPSRC: Principles of Robotics: Regulating Robots in the Real World (2010). Retrieved 20 April 2016 from: principlesofrobotics/
14. Ex Machina (2015)
15. Eyssel, F., Hegel, F.: (S)he’s got the look: gender stereotyping of robots. J. Appl. Soc. Psychol. 42(9), 2213–2230 (2012)
16. Guardian: Caster Semenya wins Olympic gold but faces more scrutiny as IAAF presses case. Sport Rio, 21 August 2016. Accessed 30 Sept 2016 from: https://www.theguardian.com/sport/2016/aug/21/caster-semenya-wins-gold-but-faces-scrutiny
17. Henry, B.: Imaginaries of the global age. “Golem and others” in the post-human condition. Polit. Soc. 2, 221–246 (2014)
18. Her (2013)
19. Hopkins, P.D.: Sex/Machine: Readings in Culture, Gender, and Technology. Indiana University Press (1998)
20. ICAART: Gender, Agents & Artificial Intelligence: half-day panel at the 8th International Conference on Agents and Artificial Intelligence, Rome (2016)
21. Imran, A.: Building Out Sophia 2020: An Integrative Platform for Embodied Cognition. Hanson Robotics (2021)
22. NBC News: Naturally high testosterone snares female athletes in Rio. NBC Health News (2016). Accessed 30 Sept 2016 from:
23. Niculescu, A., Hofs, D., van Dijk, B., Nijholt, A.: How the agent’s gender influences users’ evaluation of a QA system. In: Proceedings of the International Conference on User Science Engineering (I-USEr), Shah-Alam, Malaysia, 2010, pp. 16–20 (2011)



24. Nolen, S.: How India’s Dutee Chand ran past gender barriers to compete in Rio. The Globe and Mail (2016). Accessed 30 Sept 2016 from: how-indias-dutee-chand-ran-past-gender-barriers-to-compete-in-rio/article31385923/
25. Park, G., Yaden, D.B., Schwartz, A., Kern, M.L., Eichstaedt, C., Kosinski, M., Stillwell, D., Ungar, L.H., Seligman, M.E.P.: Women are warmer but no less assertive than men: gender and language on Facebook. PLoS ONE 11(5), 1–26 (2016)
26. Palmerini, E., Azzarri, F., Battaglia, F., Bertolini, A., Carnevale, A., Carpaneto, J., Cavallo, F., Di Carlo, A., Cempini, M., Controzzi, M., Koops, B.J., Lucivero, F., Mukerji, N., Nocco, L., Pirni, A., Shah, H., Salvini, P., Schellekens, M., Warwick, K.: Guidelines on Regulating Robotics. Deliverable D6.2, EU FP7 RoboLaw project: Regulating Emerging Robotic Technologies in Europe—Robotics Facing Law and Ethics, SSSA, Pisa (2014)
27. Prometheus (2012)
28. Reynolds, E.: This robot could be your next receptionist. Wired (2016). Retrieved 28 April 2016 from:
29. Roberts, F.: The social psychology of dialogue simulation as applied in Elbot. Special Issue ‘Turing on Emotions’. Int. J. Synthetic Emotions 5(2), 21–30 (2014)
30. Roberts, F., Gulsdorff, B.: Techniques of dialogue simulation. In: Proceedings of the 7th International Conference on Intelligent Virtual Agents (IVA), Paris, France, 17–19 Sept 2007. Lecture Notes in Computer Science, vol. 4722, pp. 420–421 (2007)
31. Robertson, J.: Gendering humanoid robots: robo-sexism in Japan. Body Soc. 16(2), 1–36 (2010)
32. RoboLaw: Regulating Emerging and Robotic Technologies in Europe: Robotics Facing Law and Ethics. EU FP7 SiS (2014)
33. Ross, A.: Could our future nurses and caregivers be robots? LinkedIn Pulse (2016). Accessed 28 Oct 2016 from:
34. Schaffer, S.: Mechanical Marvels: Clockwork Dreams. BBC Four documentary (2013); last shown in the UK on 30 March 2016
35. Shah, H.: Deception Detection and Machine Intelligence in Practical Turing Tests. PhD thesis, University of Reading, UK (2010). Available on ResearchGate
36. Shah, H., Warwick, K.: Imitating gender as a measure for artificial intelligence: is it necessary? In: Proceedings of the 8th International Conference on Agents and Artificial Intelligence (ICAART), vol. 1, pp. 114–119, Rome, 24–26 Feb (2016)
37. Turing, A.M.: Computing machinery and intelligence. Mind LIX(236), 433–460 (1950)
38. Warwick, K., Shah, H.: Turing’s Imitation Game: Conversations with the Unknown. Cambridge University Press, Cambridge, UK (2016)
39. Warwick, K.: Implications and consequences of robots with biological brains. Ethics Inf. Technol. 9(1), 223–234 (2010)
40. Westworld (1973). IMDb 2016. HBO 2020, IMDb Series 3: com/title/tt0475784/episodes?season=3
41. White, M.: Leonardo da Vinci: The First Scientist. Abacus paperback, UK (2000)
42. Winfield, A.: Robots should not be gendered. Alan Winfield’s web log (2016). Retrieved 15 April 2016 from:
43. Cyberstein: The home of live robotic entertainment. Cyberstein Robots (2021). https://www.
44. Song-Nichols, K., Young, A.G.: Gendered robots can change children’s gender stereotyping. In: Cognitive Science Society (CogSci) Conference, Toronto, 29 July–1 August (2020)

Chapter 8

How Robots Can Change the Role of Women in Healthcare? A Way Towards Equality? A. Casals

Abstract The roles of men and women in society have traditionally been strongly differentiated. The successive periods of humanity and its cultures show a great diversity of social and functional situations. Biological differences, mainly physical strength and motherhood, have long characterized the specialization of jobs and, more broadly, daily activity and the respective presence of men and women in a wide variety of social, cultural and labour contexts. This natural condition, together with the tradition of the roles carried out over time by individuals of one sex and the other, has created an environment in which biology, and mainly culture and tradition, have affected equity between the two genders. Nowadays, healthcare, and care in general, is growing, due mainly to the ageing of society. Wages in this sector are on average lower than in other sectors. Traditionally, healthcare-related tasks have been performed by women. The gender imbalance in this kind of employment has a negative impact on women’s working conditions and consequently damages their social and economic status. This chapter tries to identify factors of the digital era, mainly in Artificial Intelligence and Robotics, that might provide opportunities to help reverse this situation. Two main reasons can be seen as positive points in this sense. On the one side, robots can contribute brute force to assist in care-related jobs and can consequently compensate for one biological difference between the sexes. On the other side, robots in healthcare can motivate women towards technological studies: first, because care will depend more and more on technology, and its use may imply the need to acquire technological knowledge for higher-level and better-paid jobs; and second, because the image of technology changes when it is related to social applications rather than viewed only through its use in industry or services. Seeing technology from this perspective bears on the cultural factor that keeps women from approaching technology and STEM-related disciplines.

A. Casals (B) Center of Research in Biomedical Engineering, Universitat Politècnica de Catalunya, Barcelona Tech, Spain e-mail: [email protected] © Springer Nature Switzerland AG 2023 J. Vallverdú (ed.), Gender in AI and Robotics, Intelligent Systems Reference Library 235,



A. Casals

8.1 Introduction Gender issues are a main concern in policies in democratic countries due to the evidence that social and economic rights are unequally achieved by people of both genders. Other non-binary genders are not considered here, not for being less important, but for the complexity of widening the scope to the wide variety of existing social, cultural and functional issues. The bias in opportunities related to jobs and social status has been studied for long and governments and social institutions are imposing policies to progressively reduce this unjust situation. The US Bureau of Labor statistics in a study of employment projection from 2019 to 2029 foresees that 6 out of the 10 fastest growing occupations will be related to healthcare or assistance to people with especial needs [1]. This study reports that although regulated jobs wages are higher than the median for all occupation, the supportive healthcare jobs are much lower than this median. Traditionally, these supportive health and social care jobs are taken mainly by women and, in general, its implementation is poorly regulated. The Covid-19 pandemic has visualized even more this social disfunction as many women working on domestic and care assistance have found themselves uncovered by the governments’ special actions taken to support the unemployment caused by the collateral effects of this disease. Although the care economy is growing worldwide, the void of protection in this field derives in low wages and risk of abuse, either mental or physical, and even sexual. On the contrary, science and technology related jobs, those that provide economic productivity, receive a higher social consideration and provide its workers with much higher salaries. 
These jobs require workers with STEM (Science, Technology, Engineering and Mathematics) studies, which unfortunately do not attract girls in the same proportion than boys, thus reducing still more women’s options for future employment in STEM related jobs. Either the appreciation of the interest and finalities of these matters or other social and cultural reasons generate a bias in the proportion of men and women choosing such disciplines. Several studies [2, 3] show how STEM matters are seen as male domains, thus preventing girls to choose them. Among the multiple local or global actions [4] taken in this direction, the OECD report [5] points on policy actions to deal with this gap. Together with care givers, domestic jobs are frequently poorly remunerated, or even not paid and neither recognized as a job. This might have been the reason why an American woman in 1880, Ellen Eglin, invented a clothes-wringer to alleviate women of such hard works. Later, in 1890 the European Eli Garci-Lara Català invented the first mechanical washing machine. Other examples of women inventions [6], normally women without previous studies in Engineering, developed their talent motivated by the need to overcome this unfavourable situation. With the advances of technology, the demand of manual domestic work has been lessening progressively, and thus, giving women more chances to get opportunities, although not exempt of a hard fight towards equality, gaining positions in better recognized jobs. However, still now, in the third decade of the XXI century, domestic and supportive healthcare fall mainly on women. The International Labour Organization (ILO) [7] is the United

8 How Robots Can Change the Role of Women in Healthcare? A Way …


Nations agency that brings together governments, employers and workers to set labor standards for decent work around the world. The ILO has a special topic focused on the care and healthcare economy, analysing how its main impact falls on women and showing the strong wage imbalance across healthcare job categories. As robots expand into the field of services, care and medical assistance, we can explore how robotics could pave the way to a new phase of equal opportunities. Factors such as robots serving as instruments to handle the brute-force requirements of healthcare may be relevant. However, we should regard technology as "a means for" rather than "the solution to": its effect will depend strongly on additional cultural factors, policies and economic interests. One such cultural factor could be the interest that robotics, and thus technology, generates among young students, and especially the potential of medical and care robotics to attract girls to this technological field.

8.2 Robots and Gender Labor Issues

The bias between the kinds of jobs held by men and women has many causes. On the one side are traditional and cultural trends, which can significantly worsen the difficulties, or contextual constraints, that prevent women from acquiring the level of training needed for a well-recognized job. This is aggravated by the gender bias present in the selection process itself, which too frequently favors men over women who are equally or even better prepared for the job. This bias is analyzed in [8], considering multiple factors affecting top-merited candidates. The study in [9] confirms the social context that favors the skills mismatch between nurses (in general women) and qualified doctors (in general men), where "qualified" refers to their level of recognition and wages rather than to their capabilities and talent. For medical doctors this is all the more striking considering that in many countries female medical doctors now outnumber their male counterparts. On the other side, jobs that require physical effort can make women less suitable, or appear less suitable, for them. While the former bias is a question of culture and of the education of society, the latter difference, due to the biological constitution of men and women, has a stronger and more tangible justification. Beyond this more regulated workforce, domestic assistance and supportive healthcare at home suffer from the above-mentioned lack of protection of labor conditions. The question is how robotics might help to alleviate this bias, bearing in mind that robots, like technology in general, are only a tool within the global context of local policies and cultural factors. In what follows, some partial views of how robots might influence gendered work are presented.


A. Casals

8.2.1 Robot Brute Force

Although robots are expected to become more intelligent in the future, artificial intelligence differs from human intelligence, and we are far from having truly intelligent robots, both in terms of emotional and social intelligence and in terms of adaptability when working in highly variable environments that can produce unexpected situations. That is why, in many applications, robots are seen as brute force controlled to a greater or lesser degree by humans: they provide the required force in response to the human's will and intelligence, while the human decides their movements and actions. In this context we can consider robots that are an external arm, a multi-arm robot (usually a body with two arms), or wearable robots (exoskeletons). This kind of assistance involves the concept of the cobot, or collaborative robot, working hand in hand with humans. A third kind of healthcare robot is oriented to patient manipulation.

(A) Collaborative robots (cobots). The concept of co-manipulation, manually guided robots, is old, with a first application in car-painting robots in the seventies: an expert operator manually guides the robot once, training it to learn the task, after which the robot can replicate the same task indefinitely. This co-manipulation is based on adding a force sensor to the robot wrist, which allows a human to guide the arm manually by holding and steering a handle mounted on that same wrist. The force sensor measures the force applied by the human, the human's intention, and transforms it into the desired movement. In this way, the robot learns the sequence of movements by demonstration and repeats it later in the production process. The generalized cobot concept started with the KUKA lightweight robot, used mainly in research to study and develop robot control behaviors for interacting safely with humans, even with physical contact.
In this case the robot senses interaction forces applied to any part of its body structure. The collaborative robot concept was popularized in 2012 with the Baxter robot [10], which launched a trend in industry by providing a lower-cost robot conceived as a co-worker, a robot working hand in hand with humans. After Baxter, most robot manufacturers developed collaborative robots with a wide scope of applications. This concept is essential for robot assistants interacting with patients, whether external arms (feeding, bringing objects, etc.) or wearable devices. Elderly and disabled people may have limited force or dexterity for manipulating the objects involved in the daily tasks that allow them to live autonomously. Robots operating in safe environments, collaborating intelligently with humans and with other robots so that their use is easy and efficient for the user, pose a technological challenge involving multidisciplinary research. A domestic environment is complex and variable, requiring perception, robot planning and control, artificial intelligence learning methods, mathematical modeling, and so on. The work in [11] describes one such multi-robot system for assisting disabled people in a domestic environment (Fig. 8.1).
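The hand-guidance principle described above, a wrist force sensor whose readings are turned into motion commands that the robot then memorises, can be sketched as a simple admittance law. The snippet below is a minimal illustration only, not any vendor's API: `read_wrist_force`, `send_velocity` and `read_pose` are hypothetical placeholders for whatever interface a particular cobot exposes, and the pure-damping model is the simplest possible choice.

```python
# Minimal admittance-control sketch for hand-guided "co-manipulation".
# The function names are hypothetical placeholders, not a real cobot SDK.

def admittance_step(force, damping=20.0):
    """Map the force the human applies at the wrist handle to a
    Cartesian velocity command: v = F / b (pure damping model)."""
    return [f / damping for f in force]

def record_demonstration(read_wrist_force, send_velocity, read_pose, steps):
    """Guide the arm by hand while logging poses for later replay."""
    trajectory = []
    for _ in range(steps):
        v = admittance_step(read_wrist_force())
        send_velocity(v)          # the robot yields to the human's push
        trajectory.append(read_pose())
    return trajectory             # replayed later in production
```

A demonstration recorded this way is simply a list of poses that the robot can track again and again, which is the essence of the learning-by-demonstration scheme used since the early painting robots.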



Fig. 8.1 Multi-robot collaboration in a domestic environment for a daily-life assistive task (Source UPC)

(B) Exoskeletons. Exoskeletons (wearable robots) are a good example of devices that can equalize men and women in a set of jobs traditionally conceived only for men because of the physical effort they require [12]. The first exoskeleton, Hardiman (1965) [13], was designed at General Electric to augment the user's strength; it was a full-body exoskeleton. A modern counterpart is the wearable exoskeleton from Cyberdyne [14]. Firefighters are a good example of users of such robotic systems, as they have to carry heavy loads (long hoses and other material) across rough terrain. In such applications, the human decides the movements and the robot applies the required force, so that no or minimal human force is needed to execute the task. Unfortunately, until now such exoskeletons have been used more extensively to assist soldiers, since the well-funded military industry is usually an early provider of novel technology. In industry, exoskeletons are increasingly used in manufacturing chains to carry and manipulate loads, for instance on car assembly lines. Here the robot structure can be an exoskeleton for the lower limbs, upper limbs, shoulder or lower back, acting as a muscular aid and facilitating load-carrying movements or supporting activity in non-ergonomic postures [15]. Provided that under these conditions the force is exerted by the robot, no physical differences prevent women from being good candidates for such jobs. In healthcare, and especially in rehabilitation, exoskeletons have long been studied as a means of improving rehabilitation therapies. They are much more complex robots, since the user is usually unable to control the movements directly, requiring more sophisticated control strategies. This control relies on adequate multisensory data to interpret the user's intention and state, continuously adapting the programmed therapies to the user's needs.
In any case, robots relieve humans of the part of rehabilitation work that consists of repetitive exercises. Otherwise, these therapies would require a therapist to force the movements and control them. Rehabilitation can also be carried out by an external robot, a kind of arm that holds the patient's limb to guide and execute the prescribed exercises.
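The division of labor described here, the clinician defines the exercise once and the robot supplies the force and the repetitions, can be made concrete with a toy sketch. Everything below is an invented illustration, not the interface of any actual rehabilitation robot: a therapist-defined list of waypoints is expanded into the dense pose stream the robot would track, repeated as many times as prescribed.

```python
# Sketch of how a programmed rehabilitation exercise might be replayed.
# The therapist's role is cognitive (defining the exercise); the robot
# supplies the force and the repetitions. All names are illustrative.

def interpolate(p0, p1, t):
    """Linear interpolation between two limb poses, 0 <= t <= 1."""
    return [a + (b - a) * t for a, b in zip(p0, p1)]

def run_therapy(waypoints, repetitions, steps_per_segment=10):
    """Expand a therapist-defined list of waypoints into the dense
    pose stream a rehabilitation robot would track, repeated N times."""
    stream = []
    for _ in range(repetitions):
        for p0, p1 in zip(waypoints, waypoints[1:]):
            for k in range(steps_per_segment):
                stream.append(interpolate(p0, p1, k / steps_per_segment))
    return stream
```

In a real system the replay loop would additionally be modulated by the multisensory intention and state estimates mentioned above, assisting only as much as the patient needs.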



Fig. 8.2 Robotized treadmill to assist rehabilitation therapies and its therapy programming: (a) in operation, (b) programming the exercise (Source UPC-SJD)

Figure 8.2 shows one such robotic system, composed of a treadmill and two robotic arms that assist the user's movements according to a programmed rehabilitation exercise. A serious game to engage the user with the therapy is integrated into the system. With these robots, clinicians and therapists are released from applying forces and repeating routines to produce the movements; their role becomes more cognitive, that is, programming and planning the rehabilitation exercises based on their knowledge and experience [16, 17]. This kind of application exemplifies how robots take over repetitive and tedious tasks that require force and non-ergonomic postures, while human knowledge, intuition and common sense are still needed to design the therapy.

(C) Prostheses. Patients missing a limb, or part of one, can partially or completely recover lost autonomy with the support of prostheses. Hand, foot, ankle, lower-limb and upper-limb prostheses also constitute a research area that, as with rehabilitation, demands a strongly multidisciplinary approach. From the first mechanical prostheses, progress has led to more ergonomic, neurologically controlled devices [18]. For such robot applications, engineering knowledge, ranging roughly from new materials and mechatronics to physics, mathematics and computing for signal and data interpretation and robot control, should advance hand in hand with medical and biological science. On the biomedical side, the disciplines involved also cover a wide scope, mainly biomechanics, neuroscience, surgery (neurological implants), physiotherapy and rehabilitation. A basic knowledge of these fields is essential not only for developers but also for training good specialists. Thus, jobs in this kind of care service will become more motivating and better recognized.



(D) Patient manipulation and walking assistants. Apart from rehabilitation, patient manipulation is also an issue. As in the jobs mentioned above, it requires large physical efforts and uncomfortable postures, which in the long term may produce musculoskeletal injuries in the assistant worker. Crane-like robots or robotized beds, generally bulky machines, can facilitate patient transfer and can be operated easily from a handheld controller. Other robotic systems derive from the robotization of walkers or rollators. These smart walkers include driving, stability and guidance aids, as well as different sensing systems for monitoring both biological and behavioral patient data (such as the length or duration of walks, or loss of activity) [19]. Robotized wheelchairs can give the user more autonomy, or help the assistant with sit-to-stand operations and other changes of posture. These devices approach the concept of a robot nurse for patient transfer, taking on the nurse's role for bed-ridden and disabled patients. Along this line, human-like robots can be more flexible across different kinds of transfer and patient manipulation, like RIBA, the bear-shaped robot that can hold a patient the way a human does. A study of RIBA's technique for guaranteeing user comfort [20] shows how this kind of robot will evolve. In sum, a wide range of assistive devices has been incorporating robotic technology and consolidating into robotic systems, paving the way for new robots, still in the research and development phase, that aim to provide more complete assistance and higher levels of autonomy. This progress will relieve healthcare assistants of the unnatural postures and huge efforts that such manipulation implies, although for many users and needs they will still have to be present, whether to adjust the movements or to guide and control the robot's actions.
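The behavioral monitoring mentioned for smart walkers can also be sketched briefly. The data shapes and thresholds below are invented for illustration and do not come from the i-Walker platform of [19]; the idea is simply that walk logs can be summarised and a sustained drop in activity flagged for the caregiver.

```python
# Illustrative sketch of smart-walker behavioural monitoring:
# summarising one day's walks and flagging a drop in activity.
# Data shapes and the alert threshold are invented for illustration.

def summarize_walks(walks):
    """walks: list of (duration_s, distance_m) tuples for one day."""
    total_dist = sum(d for _, d in walks)
    total_time = sum(t for t, _ in walks)
    speed = total_dist / total_time if total_time else 0.0
    return {"distance_m": total_dist, "duration_s": total_time,
            "mean_speed_mps": round(speed, 2)}

def activity_alert(daily_distances, window=7, drop_ratio=0.5):
    """Flag a possible loss of activity when the latest day falls
    below half the average of the preceding `window` days."""
    if len(daily_distances) <= window:
        return False              # not enough history yet
    baseline = sum(daily_distances[-window - 1:-1]) / window
    return daily_distances[-1] < drop_ratio * baseline
```

Such summaries are the kind of low-effort, continuous observation that would otherwise fall on a (usually female) human assistant.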

8.2.2 Social Robots, a New Robot Workforce

Assistive and social robotics are becoming a reality. Their contribution is not physical force or help with posture or transportation, but continuous attention, which can be quite stressful for the assistant (usually a woman) and can imply a loss of intimacy for patients who need permanent assistance or long sessions of personal care. This assistance is not physical, in the sense of bringing objects to users, assisting their movements or manipulating them, but consists of accompanying them and assessing or supervising their behavior (for example, reminding them of actions to be done). Their appearance is usually that of a small animal or a semi-human toy (a doll) that interacts with the user throughout daily activity. Imagine giving a personal assistant (like Siri or Alexa) a body with an animated appearance. Such robots can become much more than a pet, without the care that pets require. They can be programmed with predefined behaviors, or their behavior can result from learning the user's activity and needs, whether from training based on big data, from the data collected during interaction with



the user, or both. The number of social robots is huge, with different levels of assistance and performance. The fact that current personal assistants default to a woman's voice has been debated as an expression of "male culture" that reinforces the view of care as a women's job and contributes to keeping women in such care jobs. But there is also the consideration that women are more accustomed to giving care than men, and have more experience and skills in doing so. Would care robots be better conceived if designed and programmed by women? This is another reason to motivate women to approach technology.

8.2.3 Care and Healthcare Robots: A Motivation for Girls to Approach Technology?

Gender imbalance across fields of study is a well-known fact and, in spite of efforts to achieve a better gender equilibrium in some disciplines, progress is slow. In STEM, the presence of girls is in general low. However, thinking especially of technology, when comparing informatics with bioinformatics, or engineering with biotechnology or bioengineering, the difference between boys and girls is not so large. Biology, medicine, and care- and assistance-related studies attract more girls than boys. In general, informatics and engineering generate a certain rejection effect among girls, probably due in part to the image of the traditional jobs that follow from such studies. Given these considerations, one can ask how robotics, a discipline that generally holds a strong attraction for society at large and for children in particular, converges with robot applications in care and healthcare. On the one side, robotics is becoming more popular. Robots are now a topic present at school from a very early age. The possibility of developing simple robotic systems with very cheap microcontrollers, 3D-printed parts and simple sensors that allow control strategies to be programmed motivates young students. Even younger children draw, imagine or even implement, by hand, their own ideas of robots. Robotics therefore constitutes a means of bringing technology closer to children and young students, and can awaken interest in mechanics, computing, control, mathematics, physics and artificial intelligence. On the other side, care and healthcare robots, and at another level toys that can be seen as companions, are becoming more familiar to society. Can robotics then be perceived, as it is, as a way to provide better care, healthcare and medical conditions? Can this combination be the spark needed to attract girls towards science and technology? And towards robotics?



8.3 Discussion and Conclusions

Robotics and Artificial Intelligence are now a focus of public attention due to their potential negative effects. Science fiction has contributed to creating fears about their progress, but some current facts also warn of these risks. The fear of losing jobs, and the threat that robots might become too powerful and act against human interests, are the main concerns. Instead, a wider view should be taken of how technology, and specifically robotics, creates wealth and facilitates better quality of services in many areas, healthcare in particular. The problem, therefore, is not technology itself, but its assigned role: who owns it, how it is used, and whether the wealth produced benefits everyone or only a small group. Legislation and ethics are thus central to the adequate progress of technological development. Technology, and robotics in particular, can attract women to the field and facilitate their incorporation into well-recognized jobs. But will cultural issues, mental prejudices and social interests keep having a bigger influence on women's preferences in studies and jobs, and on men's mentality of seeing women as unequal in social and labor matters? And can the use of technology take a new direction with more women present in the field? In this time of sensitivity to women's rights, these are factors to be strongly considered and promoted.

References

1. US Bureau of Labor Statistics: Employment Projections 2019–2029 (2020). https://www.
2. Makarova, E., Aeschlimann, B., Herzog, W.: The gender gap in STEM fields: the impact of the gender stereotype of math and science on secondary students' career aspirations. Front. Educ. (2019)
3. The STEM Gap: Women and Girls in Science, Technology, Engineering and Math. https://
4. Health Workforce Policies in OECD Countries: Right Jobs, Right Skills, Right Places, pp. 163–183. OECD Publishing, Paris
5. OECD: Closing the Gender Gap: Act Now. OECD Publishing, Paris (2012). 9789264179370-en
6. Uve, S.: Supermujeres, superinventoras. Ideas brillantes que transformaron nuestra vida. Lunwerg Editores (2018). ISBN 978-84-16890-97-2
7.
8. Andersson, E.R., Hagberg, C.E., Hägg, S.: Gender bias impacts top-merited candidates. Front. Res. Metrics Anal. (2021)
9. Franklin, P., Bambra, C., Albani, V.: Gender equality and health in the EU. European Commission, Unit D2 Gender Equality (2021)
10. Guizzo, E., Ackerman, E.: How Rethink Robotics built its new Baxter robot worker. IEEE Spectrum (2012)
11. Vinagre, M., Aranda, J., Casals, A.: Suitable task allocation in intelligent systems for assistive environments. Robot'2019. Springer (2019)
12.
13.
14.



15. Bogue, R.: Exoskeletons: a review of industrial applications. Indus. Robot 45(5) (2018). https://
16. Maciejasz, P., et al.: A survey on robotic devices for upper limb rehabilitation. J. Neuroeng. Rehabil. 11(1), 3
17. Li, C., Jianfeng, S., Linhong, J.: Lower limb rehabilitation robots: a review. IFMBE Proceedings 39
18. Godrich, J.: Bionic hand gives amputees sense of touch. IEEE Spectrum (2021)
19. Ballesteros, J., et al.: Automatic assessment of a rollator-user's condition during rehabilitation using the i-Walker platform. IEEE Trans. Neural Syst. Rehabil. Eng. (2017)
20. Ding, M., et al.: Comfort estimation during lift-up using nursing-care robot RIBA. In: IEEE First International Conference on Innovative Engineering Systems (2012)

Chapter 9

Anime's Thoughts on Artificial Minds and Gendered Bodies: From Transhumanism to Transindividuality

Alba Torrents González and Andreu Ballús Santacana

Abstract The complex fictions appearing in mass-consumption media such as films, video games, comic books and genre literature aren't just entertainment. They often are themselves a medium of reflection and creation where new ideas are born and those already alive are tested and transformed. When that happens, the technical specificities of a certain medium (or of a particular media ecology) can have a crucial role in determining the shape of these intellectual outcomes, which bear the mark of the place where they were produced and disseminated. In this chapter, we take a look at the ways one specific medial milieu, that of manga, anime and their reciprocal influences and adaptations, has contributed to the ongoing discussions about transhumanism, posthumanism and the relationships between AI, embodiment, gender, and the limits of human identity. We focus on a minimal corpus consisting of four anime franchises: Ghost in the Shell (GitS), Serial Experiments Lain (SEL), Akira, and Evangelion (NGE). All of them are very influential works, recognised by critics and fans alike, and have had an enduring influence on contemporary science fiction inside and outside their immediate medial and cultural spheres. Furthermore, all of them are directly concerned not only with broader transhumanist and posthumanist ideas, but specifically with the roles that AI could play in a technological overcoming (or abandonment) of classical human identity. GitS is treated in special detail because, in our opinion, it presents the clearest instance of a questioning of transhumanist ideals; so is NGE, which we consider a sort of counter-model of the same ideals, and an example of the notion of the posthuman being used to dismantle traditional humanist/transhumanist dichotomies.

A. Torrents González (B) · A. Ballús Santacana Universitat Autònoma de Barcelona, Edifici B / Campus de la UAB, 08193 Bellaterra, Spain e-mail: [email protected] A. Ballús Santacana e-mail: [email protected] Universitat Oberta de Catalunya, Barcelona, Spain © Springer Nature Switzerland AG 2023 J. Vallverdú (ed.), Gender in AI and Robotics, Intelligent Systems Reference Library 235,




9.1 Introduction

The complex fictions appearing in mass-consumption media such as films, video games, comic books and genre literature aren't just entertainment. They often are themselves a medium of reflection and creation where new ideas are born and those already alive are tested and transformed. When that happens, the technical specificities of a certain medium (or of a particular media ecology) can have a crucial role in determining the shape of these intellectual outcomes, which bear the mark of the place where they were produced and disseminated. In this chapter, we take a look at the ways one specific medial milieu, that of manga, anime and their reciprocal influences and adaptations, has contributed to the ongoing discussions about transhumanism, posthumanism and the relationships between AI, embodiment, gender, and the limits of human identity. We focus on a minimal corpus consisting of four anime franchises: Ghost in the Shell (GitS), Serial Experiments Lain (SEL), Akira, and Evangelion (NGE; see footnote 1). All of them are very influential works, recognised by critics and fans alike, and have had an enduring influence on contemporary science fiction inside and outside their immediate medial and cultural spheres. Furthermore, all of them are directly concerned not only with broader transhumanist and posthumanist ideas, but specifically with the roles that AI could play in a technological overcoming (or abandonment) of classical human identity. GitS is treated in special detail because, in our opinion, it presents the clearest instance of a questioning of transhumanist ideals; so is NGE, which we consider a sort of counter-model of the same ideals, and an example of the notion of the posthuman being used to dismantle traditional humanist/transhumanist dichotomies.

9.2 Mapping the t/p Field

The terms "transhumanism", "posthuman" and "posthumanism" have become increasingly pervasive in cultural and philosophical discussion. Ferrando [11] traces the beginnings of their use in the contemporary sense to the 1950s and the 1970s respectively, although the philosophical roots of the ideas they represent can be traced back at least as far as Nietzsche. However, it is in the last three decades that they have become ubiquitous in the parlance of several disciplines. Despite that, or maybe because of it, the nuances of their meanings, the relationship between the two, and that between the pair and their obvious counterpart and antecedent, "humanism", can be difficult to disentangle. Fusco and Broncano [6] have noted that the word "transhuman" tends to appear in connection with both descriptive and normative discussions of the possibility of technologically enhancing either human individuals and groups, or the human species as a whole. The explored potentials for change and evolution are usually circumscribed within what Ferrando [11] calls "the possibilities inscribed within its possible biological and technological evolutions" (see footnote 2). "Posthumanism", on the other hand, tends to appear in broader philosophical discussions of the problems and limits of its counterpart, the humanist tradition. Paradoxically, as Broncano and Fusco also note, many strands of posthumanism could at the same time be considered continuations of a late, post-romantic and critical version of the humanist project, which would include authors like Sartre, Benjamin and Hannah Arendt (see footnote 3). However, Ferrando [11] makes a nuanced distinction between post-humanism, i.e. the opposition to classical humanist philosophies because of their inherent anthropocentrism and androcentrism, which is a core component of posthumanism, and posthumanism itself, which focuses on exploring the notion of the posthuman and on dismantling the traditional dichotomies associated with the role of humanity in nature (natural/artificial, natural/cultural, biological/technological, and even old/new). In Ferrando's words:

Posthumanism can be seen as a post-exclusivism: an empirical philosophy of mediation which offers a reconciliation of existence in its broadest significations. Posthumanism does not employ any frontal dualism or antithesis, demystifying any ontological polarization through the postmodern practice of deconstruction.…[It] can also be seen as a postexceptionalism. It implies an assimilation of the "dissolution of the new," which Gianni Vattimo identified as a specific trait of the postmodern. In order to postulate the "new," the center of the discourse has to be located, so that the question "New to what?" shall be answered [11, p. 29].

Footnote 1: We will refer to both the original series (Neon Genesis Evangelion, 1995–1996) and the anime film The End of Evangelion (1997).

As we can see, where Fusco and Broncano see a fundamental continuity between a late version of humanism and posthumanism, Ferrando introduces postmodernism as a crucial mediator. The connection between postmodernism (particularly its feminist and antispeciesist strands) and posthumanism is indeed also recognised by Broncano and Fusco, who make a further distinction between a Derridean tradition, represented among others by Judith Butler, and a Deleuzian line of thinking, instantiated in the example of Rosi Braidotti. However, Ferrando gives a more determining role to these influences by making them part of what differentiates transhumanism from posthumanism: transhumanism, in continuity with (older, romantic) humanism, does not engage with the critiques of classical philosophical polarities, and as a result it remains a modified form of humanism (albeit an "ultra-" one).

The emphasis on notions such as rationality, progress and optimism is in line with the fact that, philosophically, transhumanism roots itself in the Enlightenment, and so it does not expropriate rational humanism. By taking humanism further, transhumanism can be defined as "ultra-humanism."... This theoretical location weakens the transhumanist reflection… For these reasons, although offering inspiring views on the ongoing interaction between the biological and the technological realm, transhumanism is rooted within traditions of thought which pose unredeemable restrictions to its perspectives. Its reliance on technology and science should be investigated from a broader angle; a less centralized and more integrated approach would deeply enrich the debate [11, p. 27].

Footnote 2: Broncano and Fusco point to the works of Gunther Anders on the idea of human obsolescence as a counterpoint to this tendency. Anders' proposal, which while being explicitly "humanist" has been connected with the origins of transhumanism [1], focuses on the moral enhancement of humanity.

Footnote 3: The authors consider the publication of Marx's Economic and Philosophic Manuscripts of 1844 in 1932 a turning point in the history of the humanist tradition, marking the transition from a romantic, naive humanism to a critical one.

To complicate the relationship between the transhumanism/posthumanism complex and technology further, one of the most emblematic images of a possible overcoming of human limitations, the idea of "uploading" human minds onto a computer, has been variously classified as typically transhuman and typically posthuman (see footnote 4). However, this can be clarified by noting the factors that may affect its classification. Confronting an image or fiction of "mind uploading", we can ask the following questions: (a) Is the continuous identity of the uploaded mind, preserving at least some human traits, made salient, or is the transition taken as an occasion to question assumptions on the limits of identity? (b) Does technology appear in contrast with biological life, or is the pair technology/life problematized? These questions, which will later become relevant in the analysis of our corpus, mark the difference between a typically transhumanist conception of mind-uploading (the one corresponding to the first alternative in each question) and a typically posthumanist one. At the same time, they show that even if it is possible to make clear conceptual distinctions between transhumanist and posthumanist ideas, it is sometimes more difficult to differentiate between the two when considering images and narratives. In some instances, then, it may be preferable to take this complex of ideas as a somewhat continuous but polarized field (what we referred to in the title of this section as the t/p, trans-/post-humanist, field). At one pole we would find images which imply an outdated, naive assumption of human exceptionalism and classic dichotomies. At the other pole, we would find images and narratives forcing us to confront and problematise these very assumptions. In practice, most items can only be (subjectively!) situated somewhere between these two extremes, but it is sometimes possible to make sufficiently clear dimensional comparisons.

9.3 Sex, Gender and the t/p Field Matters related to sex and gender have always been an important part of transhumanist and posthumanist discussions. In part, this is due to the fact that t/p narratives naturally provide an escapeway from the implicit universalisation of masculinity typical of humanist views. In addition to this, specifically feminist proposals such as Haraway’s Cyborg manifesto [12, pp. 149–182] have been fundamental in defining the conceptualization of technology in posthumanist discussions. Similarly, debates 4

This is precisely what happens with Broncano and Fusco, who take it as characteristic of the posthuman approach, and Ferrando, who mentions it as part of the transhumanist imaginary.

9 Anime’s Thoughts on Artificial Minds and Gendered Bodies: From …


around the possibility of transforming one’s body in ways that subvert both gender norms and certain discourses about sex have contributed to a widening of the t/p field, from Butler’s groundbreaking discussions of the discursivity and performativity of sex [8] to the New Materialists’ critiques of the downplaying of matter in feminist posthumanist discourses [2]. These discussions are full of nuances that would be impossible to reproduce here. However, focusing on the possibilities that transhumanist and (especially) posthumanist ideas offer with regard to the feminist cause, we can sum them up in two main options, the first of which comes in two notably different versions.

Option a1 (transhuman-wise version): t/p is good, because it helps us conceive a (future) overcoming of the limits of gender and sex (which are objectively real).

Option a2 (posthuman-wise version): t/p is good, because it helps us conceive a (theoretical) overcoming of the (discursively imposed) limits of gender and sex.

Option b: (independently of their nature,) gender and sex are part of what defines most human identities, and the idea of “overcoming” them may often be based on a misogynistic rejection of traits traditionally identified as feminine.

We can see option a as spreading along the t/p dimension mentioned in the previous section: most versions of it accept some kind of constructivism with regard to gender, but their connection to views on the status of “biological” sex and its relationship with gender varies. However, the main opposition is not that between the ideal types a1 and a2, but between any version of option a and option b, i.e. between those who consider the possibility of overcoming sex/gender inherently positive, and those who suspect that such a goal may be tainted by androcentric or misogynistic assumptions.
Far from being merely theoretical positions about fictions and images of the future, these positions have direct political implications inasmuch as they naturally relate to the status of trans people. On the one hand, any position based on a traditional, dichotomous and immobilist view of sex implies a denial of trans experience. Even when recognising the possibility of overcoming one’s assigned sex, and in one sense validating the use of sex-change technologies, such positions reinforce the idea that sex has always been naturally binary and independent of one’s experience, feelings or desires. This can happen both with transhuman-wise versions of option a and with some versions of option b. On the other hand, some explicitly posthumanist versions of option a can lead to an unfair dismissal of the factical, lived character of both sex and gender, by making them appear as mere theoretical categories. One example of these politically charged discussions can be found in Federici’s recent remarks [10] on capital’s “cartesian dream”. Federici uses the term to refer to the ways in which the idea of reconstructing and enhancing the human body, assumed to be liberating by many feminists and queer theorists (and crucial to the life projects of many trans people), may in many ways function as part of a capitalist process of reducing embodied human lives to abstract workforce.


A. Torrents González and A. Ballús Santacana

9.4 Born This Way: Sci-Fi and Feminist t/p-Humanism

This relevance of transhumanist and posthumanist fictions, images and narratives to politics, activism and lived identities is, of course, not exceptional. Narrative fiction in general and sci-fi in particular have always functioned as a kind of laboratory for scientific, political and philosophical ideas. As thinkers such as Haraway and Braidotti have noticed, the construction of fictional worlds offers an almost unequaled resource for exploring new images of the world. This often means not only imagining new situations, but even escaping what [9] calls hegemonic ontological schemes. What would be impossible to think before a certain narrative becomes thinkable (and sometimes doable) through it. From the inception of the genre with seminal works such as Mary Shelley’s Frankenstein, the language and tropes of science fiction have tended to combine a high degree of reflexivity with a strong orientation towards social critique, which allows them to subvert more institutionalized ways of narrating. This has made science fiction very important for culture-theoretical and philosophical movements like postmodernism and posthumanism. It may even be said that sci-fi has been the true homeland of many transhumanist ideas, which appeared on paper or onscreen long before being accepted as part of “serious discussions”. This is clearly the case with many issues linked to AI and to the possible ways humanity is going to relate to it and/or be transformed by it. In Bukatman’s words:

Within the metaphors and fictions of postmodern discourse, much is at stake, as electronic technology seems to rise to pose a set of crucial ontological questions regarding the status and power of the human. It has fallen to science fiction to repeatedly narrate a new subject that can somehow directly interface with -and master- the cybernetic technologies of the Information Age [9, p. 2].

Shortly after that, Bukatman advances an interesting attempt to explain why genderedness, sexuatedness and the possibility of changing one’s relation to sex/gender tend to be prominent in sci-fi narratives: in his view, they always imply not only a certain degree of poetic figuration of our social relationships, but also a degree of cognitive distortion of our relationships both with others and with ourselves, and very particularly with our body, its capacities and its limits [9, p. 11].

9.5 (Manga)Anime as a Thinking Device

As we have seen, one of the most recognisable traits of posthumanism is its tendency to oppose every form of human exceptionalism. It has therefore allied with advances in the cognitive sciences in order to dispute such settled knowledge as the notion that thinking is, at least on some crucial levels, mostly a human thing, and even the very idea that one can easily distinguish what is human from what is not. Hand in hand with sci-fi, it presents machine thinking, but also all kinds of biological thinking, as something not different in nature from human thought, and sometimes as inextricably linked to it. This means going beyond the two categories that comprise most classical sci-fi narratives about artificial cognition: that of the technological instrument capable of expanding human thinking by enhancing one of its already existing traits, and that of the artificial being endowed with a full-blown mind… which happens to be fundamentally analogous to a human psyche. We therefore consider these old ways of conceiving artificial thought not only as examples of the kind of fictions that would be uninteresting for our analysis, but also as inadequate methodological tools to employ in it. Instead, we will be using Torrents’s notion of “thinking devices” to explore the ways in which the material technicality of anime (and, to a lesser extent, of manga) bears on the conceptual elaboration of trans- and posthuman ideas and images in the works we analyze. In Torrents’s words:

It is supposed to mean both “devices for thinking” and “devices that think,” something halfway between an independent machine and a tool or utensil. One of the reasons I like to talk about thinking devices is because I believe that thinking is first and foremost a complex relationship between entities, and that sometimes deciding where to place the agency (i.e., the who is thinking) is not so simple…. Cultural products, especially artistic ones, are especially suitable to be interpreted as thinking devices. Not only do they have the ability to engage agents and information systems in mutual relationships, but they also have critical potential, that is, they have the ability to change the ways we engage with the world in a meaningful way… When we talk about thinking devices we refer to processes in which the production of meaning is distributed and dynamic: meaning is something that happens during the interaction between the viewer and the device, not a fixed message that is transmitted ([16], p. 84; our translation).

We are aware that many of the advantages of this methodology compared to a traditional textual analysis (i.e., not assuming that the text acts as a mere “vehicle” for meaning, acknowledging the dynamic role of multiple agencies in the production of meaning) are at least partially shared by other contemporary approaches. However, we believe it is uniquely appropriate in that it also allows us to avoid any trace of unwarranted individualism in our approach to the meaning-producing mechanisms of objects which are, at the same time, cultural and technical. The interaction between any spectator and the technical object that is anime constitutes a process of collective individuation which traverses different levels of reality: the physical, the biological, the psychic and the social. To locate the agency of the thinking that produces the elaborations we will explore, we must keep in mind that this individuation, in a sense, begins even before the spectator confronts a specific work. It is an act of distributed cognition depending on a progressive transduction of structures between animation technologies, genre tropes and conventions, and social and psychological realities, occupying a span of time that also has to be taken into account.

9.6 Our Animetic Body

We have chosen the works that compose our corpus with the intention of showing how anime (in its relation of mutual adaptation with manga) has participated in the elaboration of concepts, images and narratives that help us think about the challenges and possibilities that AI and other advanced technologies present to humanity. In all of them, but in very different ways, technology appears as something of a mediator between subjectivity and corporality. In fact, underscoring the importance of embodiment as a crucial factor in the relationship between technology and human identity may in itself be one of the fundamental contributions of anime. However, it takes two substantially different forms. GitS’s and SEL’s starting point is what we could call a software-centered perspective. In both fictions, embodiment (or at least traditional, non-digitally mediated embodiment) appears as something that can in principle be transcended, and an emphasis is put on the relative independence of consciousness and body. Even while problematizing them, these two works come somewhat close to typical transhumanist images of mind-uploading, and both of them include disembodied intelligences as main characters (although only the one in GitS, the Puppet Master, is properly speaking an AI). Evangelion and Akira offer a counterpoint to these works and dialogue with them from what we could call, in contrast, a hardware-centered model. In a very posthuman fashion, they question the limits between biology and technology and between consciousness and body, partly by introducing reflections on the relationship between embodiment and temporality. Again, only in Evangelion do we find what is supposed to be an AI (although, as is characteristic of the series, it turns out to be otherwise). However, all of them contribute powerful concepts and images that help us reflect on what the emergence of AI and the progressive blurring of the limits between human and machine thinking may mean for all of us.

9.7 Is the One in the Shell a Human Ghost?

As we have already pointed out, one of the crucial questions we can ask any version of the mind-uploading fiction in order to ascertain where it is placed in the t/p field is the question of continuity. Is humanity transformed into something else entirely by abandoning the human body, or are the differences between the two terms of the transformation conceived as matters of degree? In the case of GitS and SEL, this question is posed both directly in the story and indirectly, through an exploration of the temporal aspect of embodiment. Although the relationships between body and technology, as well as between humanity and technology, are presented in these anime in a complicated manner, we can see in both of them a common tendency to differentiate conceptually between biological bodies and their technological substitutes or extensions. Within this division, the biological body is associated with the past: it is in the flesh that the individual subject’s memory is recorded. This body-as-memory-of-the-past is also a body that is presented as always already given, a body conceived as a product: the biological body represents what has already happened and cannot be changed. This link between history and the biological body (and, more specifically, human flesh) is clearest in GitS. Despite the many differences between the GitS manga, movies, and series, in all of them Motoko Kusanagi, a cybernetic organism who works as a detective in Public Security Section 9, appears as the main character through whom the relationships between body, identity, and technology are explored. Kusanagi is a cyborg with a fully cybernetic body who wonders how far she may still be considered human. Kusanagi has trouble identifying with her body, and hence with the biology of her own species. The biological body appears here, therefore, not just as a locus for individual history, but as a meeting place with the history of the species. In the original GitS story, Kusanagi interacts with the Puppet Master, the first AI to reach autonomy as a conscious being, and that interaction prompts questions about her own humanity and unique identity. Throughout the franchise we find an ongoing exploration of the notions of the flesh as a unique and non-reproducible record of the subject’s past, and of the body as a materialization of the limits of the individual. Kusanagi’s desire to escape the limitations of her body is made evident throughout the 1995 film. For instance, in a scene where she returns from a dive she admits to her partner Batou that she feels confined to her body. In a sense, both GitS and GitS: Innocence handle the link between body and technology within a paradigm that draws more from the early imagination of cybernetics and the concept of cyberspace than from the imaginary of classic cyberpunk or of other contemporary technological anime. This early cybernetic paradigm has a dualistic and substantialist bent to it, idealizing transcendence of the biological body towards the technological as an almost-religious “overcoming” of flesh. In her article “The ‘Virtual’ Body and the Strange Persistence of the Flesh: Deleuze, Cyberspace, and the Posthuman”, Ella Brians writes:

[...T]he imaginary of cyberspace is invested in a notion of transcendence: specifically, transcendence of the body and its perceived material limits....
The techno-fantasy of cyberspace is that technology will finally deliver what philosophy and religion have only dreamed of – to free us at last from the earthly bonds of the flesh, with its hungers, needs, and limitations [5, p. 122]

GitS portrays the biological body as something that can be replaced by a technological body, but never the other way around. Unlike the former, the latter is associated with novelty and potentiality. Furthermore, it raises questions about the uniqueness and irreproducibility of individual identity, which becomes doubtful when it comes into contact with the technological element. As Bolton points out in his article “From Wooden Cyborgs to Celluloid Souls”, GitS’s treatment of the body is complex and contradictory: Oshii’s film seems to be divided between nostalgia for a firmly physical body, on one hand, and a desire to transcend that body and enter a world of pure data or language, on the other ([3], p. 731). Furthermore, the franchise delves into transindividuality and its relationship with bodily transcendence on several levels. Individuality, or more precisely, the complete individual, is clearly shown as standing on the opposite side of technology. The Tachikoma robots, for example, highlight the tension between progress and individuality in the GitS:GIG series: while each machine has different experiences, none of them has an individual past, and that is precisely because of their collective memory. Even though the Tachikomas’ body is completely technological, it is still presented as one that allows them to preserve certain characteristics of individual beings. Each Tachikoma’s daily experiences differ from one another in the moment they occur, but they are not preserved in the form of an individual past or memory of the subject. The Puppet Master, on the other hand, personifies the fantasy of transcending one’s physical boundaries. In this extreme case, the physical “body” is the network itself. Unlike the biological body, it is a distributed and immobile body with only an indirect relationship to the character’s action. For this reason, the Puppet Master does not completely identify with any physical body. The metaphorical title, “Ghost in the Shell”, refers precisely to this mind/body dualism, in which various technological bodies appear as mere puppets that the Puppet Master manipulates without being bound to them by any record or memory. The case of the Puppet Master also illustrates the difference between the biological body as a record of history and the technological body as a locus for novelty. Kusanagi must shed her flesh body in order to transform into another being, one in which her individual identity as Kusanagi will be lost. It is worth noting, however, that the Puppet Master depends on Kusanagi to complete itself precisely because it lacks any organic capacity for reproduction and death. According to the anime’s conceptual scheme, uniqueness requires a transformation into something other, a becoming-difference. This is ultimately realized in a technological process of collective individuation, in a move which is interesting in relation to another of the questions we raised above: that of the relationship between sex/gender and technological transcendence.
In this case, we see that while gendered and sexed traits seem to be completely overcome in the proposed fusion of identities, at the same time something akin to sex itself is introduced, leaving us to ponder whether Kusanagi’s role is not ultimately a very traditional feminine-as-procreation-oriented role. [14] has explored how the different treatment of visual and functional sexual traits in GitS can be read as a critique of capitalist reification, in a direction not too dissimilar to Federici’s “cartesian dream” mentioned above. Delving into this connection between body and individual and collective history, SEL also uses the image of the network to investigate shared identity and the link between the individual and death. Lain begins to receive messages from a girl through the Internet after a classmate commits suicide. This sets in motion a complicated scenario in which Lain develops an obsession with the virtual world and her identity is put in crisis. Not only will three separate Lains surface, posing a threat to all of her relationships, but her memory and family ties will start to vanish as well. In SEL, as in GitS, individual death entails passage to the collective: Lain eventually vanishes as an individual being in order to transcend to the world of the Wired. The relationship between technology and corporeality is thus explored throughout the plot of SEL. As Steven T. Brown analyzes this relationship in his book Tokyo Cyber-Punk: Posthumanism in Japanese Visual Culture [7]:

There is a tension throughout Serial Experiments Lain between a desire for disembodiment, on the one hand, and a desire for reembodiment, on the other. Indeed, it could be argued that most posthuman anime and films involve some sort of negotiation between these two poles, generally favoring one side over the other. The desire for disembodiment typically presupposes contempt for the obsolete human body and a yearning to escape death by discarding or annihilating the body in favor of some higher, transcendent state of being, whether spiritual or technological. The devaluation of the body in favor of a “mind” or “soul”, however conceived, exemplifies a clear desire for disembodiment. Moreover, the desire for disembodiment may also include transhuman (or extropian) fantasies of uploading consciousness into a computer in the digitally pure form of an autonomous program or code that can then circulate freely across cyberspace. [...] This transhuman view of posthumanism privileges what N. Katherine Hayles has described as “informational pattern over material instantiation, so that embodiment in a biological substrate is seen as an accident of history rather than an inevitability of life” [7, pp. 176–177].

This type of dualism may be seen in both GitS and SEL. Materiality is characterized as something that must be abandoned in favor of a form of life conceived as incorporeal or largely independent of the body, and thus superior or more evolved. As a result, a desire arises to transcend the body and its needs and reach a purely technological existence. In addition to being dualistic, this suggests a certain essentialism:

The more Eiri rants against the materiality of the body and hardware, the more obvious it becomes that his is an ideology of electronic presence —an “informatic essentialism” or digital idealism, if you will— that is simply another variation on the extropian fantasy of electronic transmutation. [...] According to Eiri’s brand of digital idealism, the sensible world of phenomena is subordinated to the intelligible world of digital data in the Wired, of which the material world is merely a hologram. In other words, by elevating the Wired to a transcendental realm where Truth, the Real, and the Thing-In-Itself reside, Eiri’s digital idealism approaches something like cyber-Platonism [7: 177].

But, as in GitS, it is also fair to wonder how much of this dualism and essentialism is only apparent. To begin with, there is no true dematerialization; rather, one materiality is substituted for another. We observe how the virtual world requires physical support at all times: computers, connections, and the like are integral parts of the “virtual” world. Although the virtual is in some ways antithetical to the physical (particularly as shown in Eiri Masami’s discourse), there is some continuity between the two. But there is more: as evidenced by Lain’s victory over Eiri Masami, who favors a virtual existence beyond the physical, what happens in SEL is, in a way, a corporeal victory [7: 180]. Lain and Alice, one of her closest friends, are shown together in a scene from Chapter 12, “Landscape.” Alice urges Lain to feel her body and explains that her heart is racing because she is frightened of losing it. Finally, Eiri Masami endures a horrifying metamorphosis (similar to Tetsuo’s at the end of Akira), after which Lain tells him that he does not understand anything because he lacks a body. Now, if we go back to the 1995 GitS film and take a more technical look at it, we can see that even there the “spirit” does not abandon matter; instead, another sort of materiality is portrayed, one that is tied to spirituality and in which the visual domain is subjugated to the auditory. Although there is some dualism in it, it is not a clear-cut dualism in which two worlds or two clearly distinct realities, the material and the spiritual, are depicted. While the mind—(biological) body relationship is questioned, materiality itself is not abolished or transcended. Rather, it is substituted by a different version of itself: a materiality which is conceived as a kind of destination for humanity, although not so much through the definition-blurring perspective of posthumanism as through a sophisticated version of transhumanism. Let us follow Shin’s (2011) example and compare Mamoru Oshii’s use of voices in GitS to what Katsuhiro Otomo did in Akira. Through the use of narrators and the voicing of private thoughts, the relationship between sound and image is often used in film to underscore the distance between the spiritual and the material. In Otomo’s film, all the dialogues were recorded before the animation, resulting in a perfect match between the animated mouths and the sounds of voices throughout the film. That seems fitting because, as previously stated, Akira and Evangelion oppose the kind of dualism alluded to in GitS: in them, the biological body cannot be transcended or replaced (only extended) by a technological body. Shin, however, compares Akira’s final scene to Oshii’s use of voice in GitS. The author discusses the absence of harmony between the acoustic and graphical dimensions, as well as the lack of audiovisual integration:

No matching image is offered for the voice-echo except dead stillness, essentially suggesting the reverse of the Big Bang scene in Akira, where the visual explosions are completely silent. The voice dissociated from image consequently demonstrates Oshii’s interrogation of the fundamental operation of sound-image relationships. The resulting sensory break between sound and image, and the dimensional gap of space and time, illustrates animation’s heterogeneous representational regime, which reopens the chasm of unified sensory registers and thereby disrupts coherent corporeal experience. (Shin 2011: 11)

This newly opened space, however, is something different from a mere disconnection between matter and mind: it is meant to speak to us of new kinds of coherence, of different organizations of matter. This is underscored by the repeated use of eerie, religious-sounding vocal music during these scenes. Shin also mentions GitS’s rupture with Cartesian perspectival expectations as a second resource Oshii uses to interrogate the relationships between subject and object and between the human and the technological. Shin focuses on two scenes in which this non-Cartesian, “inorganic” gaze may be observed: the scene in which Batou and Kusanagi hunt the villain whose memories have been replaced by the Puppet Master, and their scene on the ship. We can observe a very special interaction between flat image and depth in these two scenes. This interplay, according to Shin, allows Oshii to represent the characters’ psychological and emotional states. From a technical standpoint, we see how, in the pursuit scene, the optical plane moves in a sequence that blends manga-like flat graphics, three-dimensional cinematic space, and the presentation of a four-dimensional space using the digital morphing approach (Shin, 2011: 15). Although Shin reads the scene as a metaphor for the postmodern condition, with its schizophrenic temporality, historical forgetfulness, and dehumanization, we prefer to read it more specifically as a resignification of the relationship between technology and humanity. Indeed, it is a decentered gaze: there is no longer a particular perspective, but rather a game of delocalized, expanding visions. These technical motions strengthen Oshii’s concept of a material extension of the spiritual, which he had already offered with the disconnection of speech and image. In GitS, therefore, we find a clear opposition between human beings and technology: acquiring a technological body means, to a certain extent, abandoning one’s identity as a human being. Technology (and specifically AI) works as an enabler of the radical abandonment of the biological body. This body is presented as what defines individuality as such, while at the same time placing the individual within the human species, whereas the technological domain is identified with novelty and indefinite potentiality. By becoming a purely technological individual, Kusanagi abandons, along with all traces of biology, her own humanity. This clearly falls more on the transhumanist side of our t/p field, even if one should note that the idea of a rupture with the human and the biological does not imply a dematerialisation. Rather, what is proposed is a rediscovery of the possibilities of matter, an escape from the conceptual corset of the human body. The robotic (interchangeable) body and the physical network thus appear as the material substrates of a way out of human limitations, which include, of course, gender and sexual limitations.

9.8 Interlude: Ain’t “Robot” Always Been a Woman?

Or have they? Authors such as [14] have noted that the main relationship between the robotic body and sex/gender in anime seems to be one of contrast: the sexualized female traits of Kusanagi appear as shocking in the context of the robotic body, and the more the industrial, inorganic nature of her body-machine is revealed, the less we perceive it/her as feminine. On the other hand, the Puppet Master seems to have a preference for doll-like bodies, and through the use it makes of them we are repeatedly reminded of the one obvious thing traditional femininity and robotics have in common: subservience. Let us turn back in time and remember the origins of the robot, the literary paradigm of the artificial being. The robot occupies a prominent place in anime’s history, mainly through the enormous influence of Tezuka’s Astroboy, one of the defining creations of the genre. Tezuka was a very well-read man, and among the works that influenced him there was a translated copy of Rossum’s Universal Robots, published by Karel Čapek in 1921: the play in which the word “robot” was used for the first time. The story is about a species of slave-like artificial beings who, at a given moment, rebel against their human masters. In the author’s native Czech language, “robot” means “work” or “service”, and is derived from the word for “slave” or “servant” ([15], Part 1, Chap. 2, para. 3). Čapek’s work, in turn, is heavily influenced by the traditional Jewish figure of the Golem, which we can also connect with the myth of Prometheus. In both the Golem and Prometheus stories we find two key common elements: the desire to bring forth new life from mere matter on the part of those who are not “naturally equipped” to do so, and the idea that this implies a hubris, a going against the wishes of (and risking the ire of) either Nature or a superior being.
Interestingly, the story of Shelley’s Frankenstein, considered by many the first modern science-fiction work, also draws its imagery from this tradition. But Frankenstein’s creature can also be seen as a direct antecedent of the figure of the cyborg, that is, a being in which the human and the technical are inextricably mixed. The imaginary around the robot is not one of clear limits, but one that points to the fact that traditionally assumed limits can be blurred: matter can come alive, non-pregnant beings give life… This transgression, however, will usually be punished: the original robot idea revolves around the notion that those limits are somehow related to the social norms that establish domination. Ultimately, we can trace these connections back to the basic cultural, social and economic structures that were seminal in the configuration of what we sometimes call the West. Many elements of our present-day myths and imaginaries about society (but also, for instance, our present legal systems) are deeply rooted in ancient Greek culture, and specifically in the notions of the oikos (the home or basic unit of family, property and work) and the kyrios (the adult male father/owner of the oikos). In turn, one of the crucial laws of the relationship between kyrios and oikos is the fact that the kyrios, the legal and public representative of the oikos, has complete power of life and death over his children, women, slaves and animals. While male children are expected to grow up to be kyrioi themselves, and animals are supposedly unable to reason, women and slaves are perennially barred from a full public life precisely because they share a common function as physical workers of the oikos, with one fundamental difference between them: the (male) slave occupies himself mainly with what we nowadays would call “production”, while women are in charge of care, sexual work and reproduction. In that structure we find the origins of what both binds and separates robots and sex/gender: while mechanical toil and loving care should in principle be properly distributed, each to its particular kind of familiar slave (it is worth noting that “family” comes precisely from the Latin famulus, slave), there is a natural tendency of those oppressed together to rebel together.
And indeed, in the intimate sphere of the oikos, there isn’t much of an obstacle to sexually abusing the worker or forcing nonmetaphorical field labor on the wife. So, ultimately, the robot is revealed as sharing at least its core trait of subjugation with the female, the marked gender, and it only takes a little push in the imaginary direction of pornography, for instance, for the sexedness of the robot to suddenly emerge. Somehow, however, we perceive that this connection is a lot easier to apply to the individual physical robot than to its ethereal and distributed counterpart, the artificial consciousness, no matter how much the Puppet Master teases us with dolls, or how much Lain warns us about the risks of being disembodied. AI is more difficult to associate with the working parts of the oikos because it clearly partakes of what distinguishes the kyrios from them: not only intelligence (in the end, both the tool and the slave are supposed to have something of that), but the commerce with symbols and commands. It is not its artificiality that makes it difficult for us to think of the Puppet Master as female, or even its lack of dependence on a specific body, and it is certainly not the fact that it is conscious, but the fact that (unlike, for instance, the Tachikomas) it presents itself as autonomous.

9 Anime’s Thoughts on Artificial Minds and Gendered Bodies: From …


9.9 The Robot as Posthuman Alliance

As we have seen, GitS and SEL use images and narratives that correspond roughly to a moderately t-wise side of our t/p field and to what we referred to previously as option a1 with respect to gender. However, their sophisticated use of this imagery reveals its inherent nuances and tensions. Now we are going to look at Akira and NGE as examples of anime in which the basic suppositions behind these options are directly attacked: the frontiers between human and artificial being, between mind and body and between the “autonomous” and the collective are blurred and questioned. All of this, of course, falls in line with the posthumanism of authors such as Haraway and Hayles, who argue for a critical thinking-through of embodiment and our relation to technology ([5], p. 129).

While the centrality of the body as a substratum of thought emerges even in GitS and SEL, Akira and NGE underscore not only the existential fact that we are our bodies, but also the impossibility of deciding where the body ends and where the mind begins: we are our conscious bodies, but in a sense that has little to do with the apparently clear limits of our skins. In both Akira and NGE the biological body has more prominence than in GitS, both as an image and conceptually. In them, technology does not serve as a means of abandoning the human flesh or the human species: the dualism that would make that possible is alluded to, but only to be denied. Although the biological body appears, once again, as a record of individual and collective history, and in this sense it is still linked to the concept of the past, there is no “already given” in the body, no hidden or inactive past individuating the conscious subject. In these two anime, the past is above all power and virtuality: that past is what brings the future forth. At the same time, technology emerges as that which, in one way or another, updates the virtualities attached to the past and releases its potential.
Another important difference is that in Akira and Evangelion, the concept of individuality as a whole is questioned by the encounter with something that is more than the individual subject, but at the same time is indistinguishable from the individual’s own embodied self: the past of the species. Instead of opening the door to an overcoming of embodiment, technology awakens an active history of the body that transcends the protagonists and exposes that they have never been complete and independent, but rather have always been a part of something bigger. This active past corresponds to the evolutionary impulse as something that is still acting on bodies, though it is not only expressed in organic matter. Thus, in Evangelion, the robotic body also becomes a living and experienced body: not only can the pilot feel the pain produced by the robot, but the robots are ultimately revealed to be related as kin to humanity, and even contain the psyches of the pilots’ mothers. In Akira’s final scene, mind–body unity is questioned through audio desynchronization, as it was in GitS, but the scene points to something entirely different: the immanent impossibility of identifying one’s own body. The body is conceptualized as pure diversity, but there is no identification between the voice and the spiritual. In the end, the sentence “I am Tetsuo” suggests a link between the auditory and the subject’s identity, but the fact that this only works when no image is present anymore, points to
the impossibility of a complete identification. Otomo emphasizes that identity always begins from difference: autonomy is necessarily founded on a relationship with the medium and with other beings, and the emergence and persistence of individuality is not a given, but a problem.

In this respect, it is also worth mentioning the role of Rei Ayanami’s character in Evangelion. She is a mysterious character, about whom little is known during most of the narration beyond the fact that she pilots EVA 00. Towards the end of the series, however, we discover that there are actually multiple bodies called “Rei Ayanami”, all of them clones of Shinji’s mother. In fact, during the development of the plot, three different Rei appear, one body being replaced by another when it dies. The multiple Rei do not share the whole of their memories, and therefore Rei appears as a character who is not individuated either by her memory or by her flesh. But in the end, Rei turns out to be an incarnation of Lilith, the evolutionary mother of humanity. Thus, the lack of an individual past that identifies her is not presented in Rei’s case as a mere lack: on the contrary, it enables Rei to fully embody the past of the species, the common origin of all human individuals.

Therefore, we see how the relationships between corporeality, technology and different forms of the past have a prominent role in these anime. The body and its technological extensions are revealed as the key place in which history (not only of the subject, but of the species and of life) emerges as an impulse that subsumes and at the same time exceeds its individuality. From the perspective of our analysis, this means that the collective forces working alongside the individual (the wives, slaves and animals of any oikos) are recognised and brought to the forefront of the story. At some points this takes the form of a very direct critique of transhumanist narratives.
We see, for example, how the MAGI supercomputers, which are simply described as military AIs at the service of the NERV organization at the beginning of the story, are then revealed to have biological components and humanlike personalities, and finally, to have been constructed from parts of the brain of NERV’s lead scientist’s mother. This apparent obsession with mothers hiding behind technology (the EVA units have the souls of the pilots’ mothers; the AI is literally made of parts of a scientist’s mother) has to be read precisely in the directions already pointed out. It is as if the anime, at points, were screaming at our transhumanists: there is nothing inhuman in your technology; as with the rest of the species, what made it possible in the end was the caring work of female bodies.

Interestingly, in the case of Evangelion it is not only the story that unravels in these directions. Throughout the series Anno makes use of the techniques of limited animation and stock-image insertion to play with the audience’s relationship with what is viewed. In the ending of the original series, the story seems to end up disappearing in the middle of an unveiling of animation itself, as if Anno were forcing the spectators to confront the very situation in which they, as humans trying to “lose themselves” in an anime, actually are. And while he ended up publishing a second finale in the form of a movie, The End of Evangelion, he refused to completely abandon his game, and once again broke the narrative to show images of the female voice actors, and even of the death threats sent to him after the first ending.

Although the endings of both Akira and Evangelion seem to take us near two abysses of abject chaos (Tetsuo’s body becomes an explosion of living, growing flesh; humanity as a whole becomes one liquid sea of souls as NERV’s Project of Human Complementation is completed), what they in fact do is subvert our expectations by letting the forces that usually work behind the scenes emerge: they point our attention to the work of actual machines, but also of actual humans with sexed and gendered bodies and lives, and to the fact that only through a veil of abstraction (such as that criticized by Federici) do these lend themselves to fit into the aseptic limits of technologically inspired fictions. In Otomo’s and Anno’s posthuman hands, they revolt against any idea of AI or robotics that refuses to wear its entanglement with human sweat glands and uteruses on its sleeve.

9.10 Simondon, Bottici and the Trans- of Transindividuality

Regarding gender, then, in Akira and NGE we are again left undecided between two options, even if we can now discard the more naive version of the first one:

Option a2 (posthuman-wise version): t/p is good, because it helps us conceive a (theoretical) overcoming of the (discursively imposed) limits of gender and sex.

Option b: (independently of their nature,) gender and sex are part of what defines most human identities, and the idea of “overcoming” them may often be based on a misogynistic rejection of traits traditionally identified as feminine.

In fact, it seems that these seemingly opposed options may now collapse into a hybrid one: the recognition that t/p images and concepts may help prove our views on gender and sex too limited, while at the same time we guard ourselves against the possibility of erasing what is actually hidden under those views: the complexity of sexed and gendered working human bodies and their experiences. And as the countermodel presented by Akira and NGE shows, that complexity goes well beyond what images and narratives about complete, autonomous consciousnesses and bodies can offer us, and takes us to the realm of the collective.

We believe that one powerful concept that can help us understand the kind of fictions and images about that which exceeds the individual that appear in our corpus is that of transindividuality, as originally developed by Gilbert Simondon [13] and more recently taken up by Bottici [4]. Bottici wants to get rid of the idea of the human body as a singular substance underlying human cognitive agency, an idea deeply rooted in methodological individualism. Following Simondon’s and Balibar’s readings of Spinoza, she underscores the fact that the individual does not preexist its own process of individuation.
Instead, any situation includes interactions between (complete or incomplete) individuals and their environments, and a crucial part of these environments is always what Simondon calls their potential energy or even apeiron, the undetermined. Individuation consists
in the restructuring of all kinds of systems in ways which imply not only the transmission of form between different elements (transduction), but also the structuring of relational elements that are not reducible to terms. The concept of transindividuality, then, builds bridges between nature and culture and between nature and technology, and opens up a whole dimension of being that is not exhausted in the formation of individual subjects. The question is no longer how to explain individuation through the acts of individuals, but how to explain the way individuals act by describing processes that also affect and are affected by that which is inherently collective, which is part of the situation but not reducible to individual terms.

Translating this into our previously used terms: society cannot be reduced to the exchanges between autonomous kyrios, or even to exchanges between oikos composed of kyrios, wives, children and slaves. Pregnancies and harvests need to be granted full status of being, as do floods, pests and social values. What our analysis of our corpus shows is how the animetic imaginary on robots and artificial beings elaborates on the idea that the relationships between human and machine cannot be thought of without recourse to these transindividual elements. There is no mind-uploading that does not lead to questions about sex and procreation, and there is no AI that can help us escape the role (or absence thereof) of women’s bodies in its own creation.

9.11 Conclusion. Is the Animetic Contribution of Our Corpus Feminist?

We hope to have shown that the kind of reflections and imaginations incited by the anime in our corpus, which we take to be representative of general tendencies in technological anime as a whole, are useful for rethinking some notions related to technology, AI and the limits of sexual and gendered identities. They are profoundly inspired by transhumanist and posthumanist ideas, and in turn contribute powerful material with which such discourses can be built. In particular, we believe that a relevant contribution anime, as a thinking device in itself, has made to the matter of AI and gender is underscoring the social relations and values underpinning the traditional dichotomies that in many cases still structure our thinking about technology and its relationship with humanity. As the anime themselves show, these relations cannot be reduced to their individual parts, so our technological imagery needs to learn to deal with the level of the transindividual, where processes of collective individuation take place. In the same way that GitS or Evangelion force us to think about what transpires at the collective level, trying to understand what the applications of AI can mean for gendered and sexed workers all over the world, and especially for those whose bodies and agencies are marked as female, trans or nonbinary (i.e., precisely as marked), will demand of us a capacity to think beyond the established and to include something more than the individual and autonomous (workers, computers, AIs, cyborgs, women). Inasmuch
as it trains us to do so, anime’s contribution, for all of anime’s criticizable aspects, is not only gender-relevant, but a feminist one.

References

1. Ballesteros, V.: De Günther Anders al transhumanismo: la obsolescencia del ser humano y la mejora moral. Isegoría 63, 289–310 (2020)
2. Barad, K.: Posthumanist performativity: toward an understanding of how matter comes to matter. Signs: J. Women Cult. Soc. 28(3), 801–831 (2003)
3. Bolton, C.A.: From wooden cyborgs to celluloid souls: mechanical bodies in anime and Japanese puppet theater. Positions: East Asia Cult. Critique 10(3), 729–771 (2002)
4. Bottici, C.: Anarchafeminism. Bloomsbury Academic, London (2022)
5. Brians, E.: The virtual body and the strange persistence of the flesh: Deleuze, cyberspace and the posthuman. In: Guillaume, L., Hughes, J. (eds.) Deleuze and the Body, pp. 117–144. Edinburgh University Press, Edinburgh (2011)
6. Broncano, F., Fusco, V.: Presentación: transhumanismo y posthumanismo. Isegoría 63, 283–288 (2020)
7. Brown, S.T.: Tokyo Cyberpunk: Posthumanism in Japanese Visual Culture. Palgrave Macmillan, New York (2010)
8. Butler, J.: Bodies That Matter: On the Discursive Limits of “Sex”. Routledge, New York (1993)
9. Bukatman, S.: Terminal Identity: The Virtual Subject in Postmodern Science Fiction. Duke University Press, Durham (1993)
10. Federici, S.: Beyond the Periphery of the Skin: Rethinking, Remaking, and Reclaiming the Body in Contemporary Capitalism. PM Press, Oakland (2020)
11. Ferrando, F.: Posthumanism, transhumanism, antihumanism, metahumanism, and new materialisms: differences and relations. Existenz 8(2), 26–32 (2013)
12. Haraway, D.: Simians, Cyborgs, and Women: The Reinvention of Nature. Routledge, New York (1991)
13. Simondon, G.: Du mode d’existence des objets techniques. Éditions Aubier, Paris (1989)
14. Schaub, J.: Kusanagi’s body: gender and technology in mecha-anime. Asian J. Commun. 79–100 (2001)
15. Schodt, F.L.: Inside the Robot Kingdom: Japan, Mechatronics, and the Coming Robotopia [Kindle DX version]. JAI2, San Francisco (2010)
16. Torrents, A.: El anime como dispositivo pensante: cuerpo, tecnología e identidad (Doctoral Thesis). Universitat Autònoma de Barcelona