Racist Zoombombing 9780367725808, 9780367743376, 9781003157328



English Pages [73] Year 2021


Table of contents :
Cover
Half Title
Title Page
Copyright Page
Table of Contents
Acknowledgments
Introduction
1 New Platform, Same Racists: How Social Media and Gaming Route Racist Hatred to Zoom
2 Zoom Is Memetic Warfare: Zoombombing and the Far Right
3 Affective Violations: Black People’s Experiences with Zoombombing
4 Conclusion
Index


Racist Zoombombing

This book examines Zoombombing, the racist harassment and hate speech on Zoom. While most accounts refer to Zoombombing as simply a new style or practice of online trolling and harassment in the wake of increased videoconferencing since the outbreak of COVID-19, this volume examines it as a specifically racialized and gendered phenomenon that targets Black people and communities with racialized and gendered harassment. Racist Zoombombing brings together histories of online racism and algorithmic warfare with in-depth interviews with Black users about their experiences. The book explains how Zoombombing is a form of racial violence, interrogates our ideas about online space and community, and challenges the notion of a clean distinction between online and offline racial harassment of Black people and communities. A vital resource for media, culture, and communication students and scholars who are interested in race, gender, digital media, and digital culture.

Lisa Nakamura is Gwendolyn Calvert Baker Collegiate Professor of American Culture at the University of Michigan, Ann Arbor, where she is the inaugural Director of the Digital Studies Institute. She is the author/editor of four books on race, gender, and digital media: Race in Cyberspace (2000), Cybertypes: Race, Gender, and Ethnicity on the Internet (2002), Digitizing Race: Visual Cultures of the Internet (2007), and Race After the Internet (2011). She has also published essays and book chapters on racial passing and videogames, online toxicity, and the politics of digital infrastructure.

Hanah Stiverson is a PhD candidate in the department of American Culture at the University of Michigan, Ann Arbor. Her research focuses on the growing overlap between the far-right and the mainstream in the United States.

Kyle Lindsey is a PhD student in American Culture at the University of Michigan, Ann Arbor. His research focuses on how people of color navigate interpretation, criticism, and responsibility on digital platforms.

Racist Zoombombing

Lisa Nakamura, Hanah Stiverson and Kyle Lindsey

First published 2021
by Routledge
52 Vanderbilt Avenue, New York, NY 10017
and by Routledge
2 Park Square, Milton Park, Abingdon, Oxon, OX14 4RN

Routledge is an imprint of the Taylor & Francis Group, an informa business

© 2021 Lisa Nakamura, Hanah Stiverson and Kyle Lindsey

The right of Lisa Nakamura, Hanah Stiverson and Kyle Lindsey to be identified as authors of this work has been asserted by them in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Library of Congress Cataloging-in-Publication Data
A catalog record for this title has been requested

ISBN: 978-0-367-72580-8 (hbk)
ISBN: 978-0-367-74337-6 (pbk)
ISBN: 978-1-003-15732-8 (ebk)

Typeset in Times New Roman by codeMantra

Contents

Acknowledgments  vii

Introduction  1

1 New Platform, Same Racists: How Social Media and Gaming Route Racist Hatred to Zoom  15

2 Zoom Is Memetic Warfare: Zoombombing and the Far Right  29

3 Affective Violations: Black People’s Experiences with Zoombombing  41

4 Conclusion  53

Index  63

Acknowledgments

We would like to thank the University of Michigan’s American Culture Department and the Rackham Graduate School’s Faculty Allies and Student Allies Diversity Grant Program, which offers funding for doctoral students to participate in research with faculty over the summer, for enabling the partnerships that went into writing this book. We especially owe thanks to the administrators of our department’s Diversity Allies Program, Julie Ellison and Maryam Aziz, who reached out to the department’s doctoral students and faculty at the height of the pandemic and expanded the program to support as many students as possible. We truly appreciate our department’s efforts to support collaborative work and would not have written this book without it. We are also very thankful to Sheni Kruger and Emma Sherriff, our editors at Routledge, who hastened to get this book into print while it was still relevant. When we started this project in April, we did not expect that the pandemic would last this long and that zoombombing would still be such a widespread and serious problem. We’re sorry that it is indeed still such a big part of everyday life and hope that some of these insights can contribute to making online life a less toxic and more sustaining place. Lisa would like to thank Tom Carson, the staff of the Digital Studies Institute and the American Culture department for their steadfast and steady support; Hanah Stiverson and Kyle Lindsey for their willingness to take on this new project while so busy with other things, and for their generosity, persistence, and mutual support during one of the strangest times to write a book about digital media that she can remember; and every essential worker at the university who cleaned, stood outside buildings to deliver equipment, and did the invisible and dangerous labor of maintaining the university infrastructure.


Hanah would like to thank Lisa for her constant support and encouragement. The process of writing this book was truly enjoyable with coauthors as creative and generous as Kyle and Lisa. She would also like to thank Megan Rim and Mallory Whiteduck for being the best and most brilliant support system she could imagine. Finally, she is eternally grateful to her family and in particular Nick, for the constant stream of love and coffee, which makes everything more bearable. Kyle would like to thank Lisa and Hanah for being such worthwhile collaborators on this project. Through this book, he has learned how enriching collaborative research and writing can be for scholars in the humanities. He is also eternally grateful to his mother, who has always been supportive of his aspirations even as they take him away from home. He hopes this book is a meaningful addition to her collection.

Introduction

“The Familiar Feeling of Racism: How Zoom Hurts People Where They Live”

Technology allows for magical thinking, and never more so than during the COVID-19 pandemic, the historical period during which we write this book. Zoom, the most popular videoconferencing and online chat software during COVID, is both hated and appreciated, tiresome and indispensable, new and, at the same time, a minimally different version of video platforms that have been around for decades. Users make sense of their radically changed lifeworlds by endowing the digital systems and infrastructures that have saved them from extreme isolation during the pandemic with inherent emotional characteristics. Systems, tools, and networks are seldom critiqued until they offer more drawbacks than benefits, as was the case for Facebook after the Cambridge Analytica scandal became news. Even then, users put aside their well-founded reservations about Facebook as social media fills the gap that pandemic public life leaves open.1 The problem with magical thinking is that it allows those with the most privilege to turn away from moments of rupture, to ignore contrarian reports of structural inequality, lack of access, or exploitative practices that contribute to systems of oppression. When tech utopianists or industry leaders are confronted with proof that their platforms are harmful, they release paradoxical statements such as “this is not who we are as a company and does not represent what we value,” while claiming that the problem is an anomaly, a bad apple, a glitch, an outlier, rather than the rule. 2020 is a year of collision but only partially of exception. Many old forms of digital harm have come to a head and are colliding in ways that feel explosive, especially to those who have looked away for much of their lives. This book is about zoombombing – the use of videoconferencing software by infiltrators to attack

unsuspecting users with racist, sexist, and other toxic content. We argue that the racism and misogyny that characterize zoombombing are the same racism and misogyny that the Internet has trafficked in from its origin. Zoombombing differs from more benign kinds of trolling such as DDoS (distributed denial of service) attacks because it has intimate ties and critical engagements with the growth of the far-right, the manosphere, and the culture of “red-pilling,” which are all vital to understanding why videoconferences are so often disrupted by racist and sexist hate and what purposes these acts serve in our society. Racism has deep roots in early and contemporary Internet culture, and platforms, users, and policy makers have come to tolerate and even expect the worst kinds of hate speech and imagery online. Similarly, anti-Black racism is an integral part of American culture and capitalism; it provides a deep repertoire of horrifying symbols, words, and acts that prompt the shock and affective response that zoombombers seek.2 This book asks the reader to recognize zoombombing for what it is: a reification of systems of harm that exposes a broader audience to what women and people of color have known about technology and our society all along. Zoombombing is a new social formation that draws on a long heritage of Internet harassment, which has historically centered people of color and women as primary targets, exposing them to real harm both on and offline.3 Empirical research shows that the digital industry’s investments in content moderation labor are a project of failure, much of it performed by women and people of color for low wages.4 Despite the increasing moderation, every digital platform hosts racial harassment and toxicity to varying degrees. We wrote this book in the midst of two pandemics.
First, the COVID pandemic, which has created a collision and cultural breaking point with regard to how we imagine digital platforms such as Amazon, Google, Apple, and Zoom. They are both our heroes and our scapegoats: Zoom has become radically overdetermined as a source of intimacy and connection and as a vector for deep and justified anger. Second, we write in the midst of a global struggle over anti-Black racism, where everything – life itself – is in the balance. The following story illustrates how Zoom and anti-Black racism came together during the earliest days of the pandemic. Well before the pandemic, ecologist Dr. Tiara Moore posted a call on Twitter for members interested in sessions of collaborative working, yoga, and happy hours on Zoom. She meant it to be a space for


Figure I.1 A WOC Space.

Black and Brown joy, a radically accepting and anti-meritocratic invitation: “WOC, excellent or not, come raggedy or beat faced, vent, SHOUT, speak, be heard, listen, support. Let’s make a WOC space and build each other up from now on.” Tiara Moore established A WOC Space as a social and intimate community that centered experiences of women of color (WOC) and as a networked digital space of collectivity and WOC safety. This grew out of a need for a space separate from the inhospitable working environments that women of color experience in the sciences. It wasn’t until the pandemic hit that Dr. Moore’s online gathering place for women of color in STEM fields to meet, collaborate, and share their moments of frustration and joy was invaded by zoombombers (Figure I.1). Moore announced and hosted the group’s successful Zoom meetings by advertising the links under the “A WOC Space @AWOCSpace” Twitter account just as she had since the fall of 2019. In the early days of the pandemic, on March 30th, she was preparing a drink with her Zoom room open on her computer when an unfamiliar attendee was first to join the room. After a moment of chatting, the attendee stated that Moore should be wary of the possibility of hackers. Immediately after this comment, a flood of new users joined and began shouting derogatory and racist language. The N-word was screamed multiple times. Dr. Moore shut down the room and for subsequent meetings implemented protective measures to limit the risk of a second attack (Figure I.2). Zoombombing has become and remains very common since we learned of Dr. Moore’s experience; we were drawn to the topic partly because of the work of journalists who published dozens of stories about horribly racist and sexist zoombombings targeting a wide range of people and groups.5 We chose to narrow our scope and to conduct in-depth interviews with three Black users who were


Figure I.2 A WOC Space cancels due to zoombombing.

targeted by zoombombers. These detailed interviews make the case that the problem and the solution live close together: we interviewed one of our informants, Dr. Moore, over Zoom, even as we gathered data that buoyed our concerns and criticisms about the platform. We chose to focus on these three cases because they brought us close to the emotional and affective worlds of Zoom racism and the specific ways it harms specific individuals. Zoombombing has had lasting effects upon targets’ confidence, comfort, and sense of safety while working and socializing through videoconferencing and has strong parallels with the paradoxical familiarity and alienation of off-camera racial abuse. Zoombombing is a new term, but the racism and abuse were by no means new to the people who talked to us. In defiance of the notion that there’s a bright line between the virtual and real, we found that racial abuse during Zoom meetings that included family, coworkers, and friends caused lasting trauma, anxiety, and anger. It was a familiar feeling, but platformed in a new way, an intense reminder that racism is a fundamental part of US culture and the technologies that arise from it. As we will discuss in Chapter 1, racist harassment is inherent and foundational to Internet and gaming cultures, but there are some specific ways that zoombombing is new. Firstly, Zoom is the essential worker of COVID-era digital infrastructure. Zoom (NASDAQ:ZM) was founded in 2011 by Eric Yuan, formerly a vice president of engineering at WebEx, a different videoconferencing company, and has expanded tremendously as the pandemic forced workers to labor at home: it has almost 50% of the total market share, and the company claims to have more than 300 million daily active participants. Yuan’s shares in the company were worth $16.8 billion as of October 8, 2020, and the company is worth $90–110 billion, depending on the day.
The company reported in August 2020 that total revenue had grown 355% year-over-year to $664 million in the second quarter. These

results exceeded their original projections of $495 million to $500 million, as demand remained at heightened levels and more free users were converted to paying customers. Most impressively, current customers with more than ten employees increased by 105,000 to approximately 370,000.6 In “Why Zoom Is Booming,” business reporter Rich Karlgaard of Forbes attributes the perception of Zoom as a technology for creating social connection and happiness to its CEO’s clear messaging about Zoom’s corporate identity, to consolidation of its first-mover status by expanding rapidly and effectively from the desktop into mobile platforms, and to its “land and expand” strategy, which offered limited access for free, luring companies into paying to keep their employees on a platform that they preferred over other videoconferencing services.7 Organizations have a strong incentive to buy licenses for their members because free Zoom accounts are limited to 40-minute sessions, while paying customers can create meetings of unlimited length. A quick look back at the history of Internet videoconferencing can help us understand how Zoom came to dominate the market. The first commercial webcam, QuickCam, was released for the Mac in 1994. Later PC applications such as CU-SeeMe were designed to run on home webcams connected to the Internet and were touted as a way to “bring people together” to support remote teams and to get work done rather than as primarily sites for social activity.8 Skype, the first free cross-platform videoconferencing service, was used as a phone replacement because, at its date of launch in 2003, cell phone minutes were expensive and limited for most users. When Skype was acquired by Microsoft in 2011, it cultivated a new identity as a “work” platform, though, like camming, it has been widely used as a space for streaming sociality and pornography. Videoconferencing gained mainstream relevance in 2004, when American homes began using broadband networks.
Then, in 2010, the technology experienced another surge in popularity with the introduction of Apple’s iPhone with FaceTime, a video chat app that helped verticalize and consolidate the company’s market share by keeping all user data on its proprietary networks.9 Though it looks quite similar to Skype or FaceTime, Zoom has its own distinct features and history. Zoom seems to be the best of all videoconferencing worlds: free, easy for new users to understand, and equally suited for work or play; a broader and more flexible platform that combines business and pleasure. Since the start of COVID-19, the workplace and the home have merged in

ways that cause context collapse between public spaces and nominally private and domestic spaces, and Zoom has become emblematic of pandemic life. Zoom is well suited for this particular moment because it has successfully marketed itself as the positive face of working from home: the corporate slogan it adopted in 2019 is “Elevate Every Encounter.”10 By creating a “simple and purposeful” platform for everyone to use, Zoom has become the default virtual meeting room for work, education, family gatherings, celebrations, and community engagements. It has become an aggregator for intimate moments while encouraging public access. COVID has turned Zoom into a ubiquitous social platform, which has also made it a platform for racial hate and harassment. Zoombombing matters because the most innocuous and commonplace platforms have the greatest potential for far-reaching harm. Algorithmic racism informs everyday activities such as using search engines, which reproduce racist thinking and word associations.11 Google, another company that, like Zoom, has become synonymous with general activity on the Internet, becomes a space of access to warfare tactics against racialized and gendered bodies and communities. Zoom is part of a de facto monopoly that includes Google, Amazon, and Apple. Zoom’s perceived ease of use has made it the only videoconferencing platform that many institutions are willing to purchase. For example, our workplace at the University of Michigan opted to cancel its paid access to the BlueJeans videoconferencing platform that had been used successfully for over a decade and now offers Zoom as its core platform option. Many students and workers who rely on videoconferencing to meet their daily demands are now possible targets of zoombombing. Zoom released a statement in April 2020 about increased security measures in the face of these zoombombing attacks.
While never directly addressing the racial and gendered nature of the harassment, nor the harassment itself, Zoom claimed that they were working on “doing better in the future” but framed these issues as stemming from the recent break from what the platform was originally intended to be – a resource for institutions, not individuals.12 Zoom has many corporate customers, and it was always meant to be used with full IT support available, according to the brief email response offered to those who report attacks to the company. This statement situates Zoom as a victim of circumstance, a platform whose original goals have been subverted by surging individual

usage and isolated bad actors. It’s an apology that disavows both blame and a deeper look at Zoom’s complicity within structural and algorithmic racism.13 Like other COVID-era digital services, Zoom has used the pandemic as cloud cover for preventable structural complicity and normalizing radical behavior. One of the reasons we care about this issue is that Zoom is a tool for parents to remediate racism. Black parents have noticed that virtual school has protected their children from unfair treatment in the classroom and allowed Black kids to focus on learning rather than contending with harassment from peers and teachers. Theresa Chapple-McGruder, a Black maternal and child health epidemiologist, was so surprised at the positive changes in her second grader’s educational experience online that she posted a survey to the national Facebook group Conscious Parenting for the Culture. The survey went out to more than 10,000 Black parents of children from prekindergarten through 12th grade. The 373 parents who responded overwhelmingly said they appreciated the way virtual learning allowed them to shield their children from anti-Black bias and protect them from the school-to-prison pipeline — the well-documented link between the police in schools and the criminalization of Black youth and other students of color.14 Zoom has already shown great potential for providing an escape from some aspects of racism in the classroom and, in a better-planned world, would extend that benefit to the parts of the platform that the company can control: user access, robust reporting, and culturally sensitive customer support. Zoombombing is being taken seriously by one federal agency: the FBI, which issued a warning that defined zoombombing as a “computer crime,” a federal offense that merits arrest and possibly time in jail.
In their press release they stated, “if you or anyone you know becomes a victim of teleconference hacking, they can report it to the FBI’s Internet Crime Complaint Center.”15 At least two zoombombers, both of them male teenagers, were arrested in April 2020 and charged with disrupting classrooms. A boy in Madison, Connecticut, was charged with fifth-degree computer crime, fifth-degree conspiracy to commit a computer crime, and breach of peace for defacing a class with “obscene language and gestures.” On April 3rd, the United States Attorneys’ Office for the Eastern District of Michigan

released a press statement warning against “teleconference hacking during Coronavirus pandemic.”16 While this agency is very clear that zoombombing is a form of hacking or misuse of computer resources and networks, i.e., “disrupting a public meeting, computer intrusion, using a computer to commit a crime, hate crimes, fraud, or transmitting threatening communications,” the racism and misogyny that are part and parcel of zoombombing as in the Madison case are downgraded to “displaying pornographic or racist words and images,” a very vague set of terms that minimizes how harmful, personal, and traumatic this practice often is. Zoombombing is in no way a lighthearted or casual encounter for those who have experienced it. The incidents that we have researched and documented have been extreme, even by Internet standards; for example, zoombombers have used images of child pornography. They have put Black women professors’ home addresses in the chat and threatened to cut them up with meat cleavers. There is too much at stake for zoombombing to go unregulated and ignored. Multiple actors are complicit in the proliferation of zoombombing attacks. In Chapter 4, we provide some suggestions and ideas for new directions and policies for digital platforms based on reparative justice models, inclusive hiring priorities, and empowering users to report and remediate abuse on their own. Dr. Tiara Moore’s story shows us the wide gap between corporate rhetoric and lived experience. Though racism and sexism were not purposely designed into the platform, Zoom has now become synonymous with zoombombing: the unpredictable, widespread, and seemingly random practice of racial and gendered attacks on individuals who are historically harmed by systems of oppression and dismissal and the intimate spaces they have crafted for collaboration and celebration.
The attacks on these spaces are motivated by a history of white supremacist and patriarchal threats toward collaborative Black intimacy, and they harm everyone who witnesses them. Who benefits when these acts are categorized as “trolling” or “cybercrime” rather than “racism”?17 As Lorenz and Alba reported in April 2020 in their excellent article “‘Zoombombing’ Becomes a Dangerous Organized Effort,” when confronted by Dr. Dennis Johnson, a Black academic who was racially attacked during his dissertation defense, the company released a statement that completely avoided any mention of racism or hate crimes. Nate Johnson, a spokesperson for Zoom, claimed that “Zoom strongly condemns harassment of this kind” (italics ours). This book exists because as

race scholars we are bound to explore and press hard on the vagueness and disavowal inherent in this rhetoric. Zoombombing is the latest example of a long history of social and state control of Black community gatherings in public spaces such as parks, recreation centers, graduations, and weddings.18 Its purpose is to pollute the once-gated spaces of the classroom and the meeting, and it has a disproportionate effect on Black people’s communities. The aggregation of gendered and racial animus toward moments of celebratory Blackness is linked to the longer history of violent anti-Blackness in online and offline spaces. All of our interviewees described their efforts to protect themselves and, in some cases, to communicate their experiences and concerns to the company. However, they have not been satisfied with Zoom’s resistance to admitting culpability in these instances of racial and gendered harassment. Dr. Dennis Johnson created an online petition after his dissertation defense was attacked and gave interviews to spread the word about his experience and the lack of structural support Zoom offers to combat instances like these.19 Dr. Tiara Moore was unable to find any contact link for Zoom and was forced to reach out to the company via Twitter. Angelique Herring reported the incident to her workplace. Moore and Johnson found Zoom’s response superficial and dismissive; the company ended the correspondence by sending a canned notice that the case had been resolved. Why does it matter that Zoom dominates the market? Zoom’s nascent monopoly is not only economic; it is also cultural, rhetorical, and psychic.
Zoom is the latest software platform to become rhetorically established as the best choice or standard program for its entire category of digital telecommunication.20 Zoom is similar to older digital platforms such as Facebook or Twitter that have systems for reporting bugs or other technological problems that users encounter, but unlike them it lacks a reporting system that allows a user to specifically flag hate speech or harassment. Omitting this feature effectively hides the problem by failing to acknowledge its existence. And this normalization of harassment capitalizes on users’ deference to software; for many users, especially new users, Zoom is the only option. Microsoft Office Suite programs such as PowerPoint or Word aren’t the only slideshow or word processing software programs available, but they are the only ones that many people have access to or are aware of. Alternatives such as the Google Suite applications (Google Drive, Google Docs, Google Sheets, etc.) are seen as temporary spaces before users

convert their work into a Microsoft Office Suite object. Microsoft Office is an efficient and effective high-end software option for digital consumers, but deference to the Microsoft empire limits the imagination of its users at a programming and individual level. Every program that doesn’t do things the way Word or PowerPoint does is perceived as unfamiliar and thus inferior despite the limitations that Word or PowerPoint place upon our own creativity and style. This tendency to defer to software, and the inability of most digital consumers to identify, produce, or design their own technologies as alternatives, produces a built-in psychic resistance to seeing Zoom and other software as bad objects when the time comes to demand more from them. And that time has come. This psychic resistance to critiquing software expresses itself as critique’s opposite number – trust and faith, which manifest themselves alongside resignation and exhaustion during COVID. This “faith” allows consumers, journalists, and CEOs to minimize these racist and misogynistic attacks as “crashing” and Internet pranking, and to place their hope on users being better at technology, removing Zoom’s accountability to users from the equation. Doing this research made us tired. This book is organized around three insights about digital racial histories: like other forms of online harassment, zoombombing relies upon a complex ecosystem of adjoining platforms and existing histories of anti-Black racism; zoombombing leverages the far-right movement’s effective use of humor alongside hate; and Black user experiences have fallen out of the public and academic discourse about videoconferencing.
In Chapter 1, “New Platform, Same Racists: How Social Media and Gaming Route Racist Hatred to Zoom,” we analyze how critical race theory can help us understand the “weaponization of Zoom” as a hate speech platform and how this move ties into earlier histories of racism in the US and on the Internet. In Chapter 2, we analyze Zoom raid accounts on adjacent platforms such as Reddit and 4chan that have prior connections to the far-right movement and a robust record of harassment campaigns before COVID. If we fail to analyze zoombombing’s roots, inspirations, and place within existing media ecosystems, we run the risk of viewing it as an isolated and unique practice that can’t be remedied. Even televisual media such as Fox News contribute to the problem by minimizing or failing to acknowledge that these crimes are racially motivated and by creating a culture of permission for white supremacists.

In Chapter 3, we turn to our three interview subjects’ firsthand accounts as zoombombing targets. These long-form interviews provide a nuanced account of what zoombombing looks like from the targets’ point of view, their desires and efforts to protect themselves and their families and communities, and how the experience felt both continuous with, and different from, anti-Black racism and misogyny that they had experienced offline. In defiance of the cyberutopian notion that the Internet would reduce racism and sexism by moving it into a more controllable digital space, our interviewees found that Zoom gave them little to no control in the moment that they needed it. The trauma and pain they experienced there compared readily with and, in some cases, exceeded real-life racism and had more lasting effects on their self-esteem, sense of identity, and feeling of safety. In Chapter 4, we discuss strategies for moving from tech’s magical thinking to informed realism by envisioning a different digital future. An anti-racist digital platform is possible if the industry can learn from and compensate users who have experienced the worst that Zoom has to offer.

Notes
1 Hutchinson, “Facebook Releases New Insights on Groups Usage during COVID-19.”
2 Robinson, Cedric J. Robinson: On Racial Capitalism, Black Internationalism, and Cultures of Resistance.
3 Duggan, “Online Harassment 2017”; Schoenebeck, Haimson, and Nakamura, “Drawing from Justice Theories to Support Targets of Online Harassment.”
4 Gray and Suri, Ghost Work; Roberts, Behind the Screen.
5 Law, “Oklahoma University’s Virtual Graduation Ceremony Disrupted by Racist Hacker”; Raache, “Oklahoma City University’s Virtual Graduation Hacked; Racist Language, Swastika Displayed during Blessing”; “Zoom Meeting for African American Students Hacked with Racist Images, Slurs”; Hernandez, “A Zoom Meeting For Women Of Color Was Hijacked By Trolls Shouting The N-Word”; Allen, “Video Shows Racists Clad in Blackface, Swastika Zoombomb Black South Carolina Students”; Redden, “‘Zoombombers’ Disrupt Online Classes with Racist, Pornographic Content.”
6 Williams, “Millions of Users Love Zoom.”
7 Karlgaard, “Why Zoom Is Booming.”
8 Matters, “The History of Video Conferencing.”
9 Apple Newsroom, “Apple Brings FaceTime to the Mac.”
10 Williams, “Why We Changed Our Tagline.”
11 Benjamin, Race After Technology; Buolamwini and Gebru, “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification.”

12 Yuan, “A Message to Our Users.”
13 Benjamin, Race After Technology.
14 Anderson, “‘You’re Out of Your Mind If You Think I’m Ever Going Back to School.’”
15 “Internet Crime Complaint Center (IC3).”
16 “Federal, State, and Local Law Enforcement Warn Against Teleconferencing Hacking During Coronavirus Pandemic.”
17 “‘Zoombombing’ Becomes a Dangerous Organized Effort,” https://www.nytimes.com/2020/04/03/technology/zoom-harassment-abuse-racism-fbi-warning.html.
18 Simmons, Crescent City Girls: The Lives of Young Black Women in Segregated New Orleans.
19 Johnson, “Demand That Zoom Immediately Create a Solution to Protect Its Users from Racist Cyber Attacks!”
20 http://kairos.technorhetoric.net/20.2/inventio/stolley/index.html#point-two

References
Allen, Matthew. “Video Shows Racists Clad in Blackface, Swastika Zoombomb Black South Carolina Students.” The Grio. Accessed August 29, 2020. https://thegrio.com/2020/04/26/black-south-carolina-students-racist-zoombomb/.
Anderson, Melinda D. “‘You’re Out of Your Mind If You Think I’m Ever Going Back to School.’” The New York Times, October 28, 2020, sec. Opinion. https://www.nytimes.com/2020/10/28/opinion/virtual-schoolracism.html.
Apple Newsroom. “Apple Brings FaceTime to the Mac.” Accessed November 1, 2020. https://www.apple.com/newsroom/2010/10/20Apple-BringsFaceTime-to-the-Mac/.
Benjamin, Ruha. Race After Technology: Abolitionist Tools for the New Jim Code. Medford, MA: Polity, 2019.
Buolamwini, Joy, and Timnit Gebru. “Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification,” 2018.
Duggan, Maeve. “Online Harassment 2017.” Pew Research Center: Internet, Science & Tech (blog), July 11, 2017. https://www.pewresearch.org/internet/2017/07/11/online-harassment-2017/.
“Federal, State, and Local Law Enforcement Warn Against Teleconferencing Hacking During Coronavirus Pandemic.” April 3, 2020. https://www.justice.gov/usao-edmi/pr/federal-state-and-local-law-enforcement-warn-againstteleconferencing-hacking-during.
Gray, Mary L., and Siddharth Suri. Ghost Work: How to Stop Silicon Valley from Building a New Global Underclass. Boston, MA: Houghton Mifflin Harcourt, 2019.
Hernandez, Salvador. “A Zoom Meeting For Women Of Color Was Hijacked By Trolls Shouting The N-Word.” BuzzFeed News. Accessed June 3, 2020. https://www.buzzfeednews.com/article/salvadorhernandez/zoomcoronavirus-racist-zoombombing.
Hutchinson, Andrew. “Facebook Releases New Insights on Groups Usage During COVID-19.” Social Media Today, October 15, 2020. https://www.socialmediatoday.com/news/facebook-releases-new-insights-ongroups-usage-during-COVID-19/587146/.
“Internet Crime Complaint Center (IC3).” Accessed November 15, 2020. https://www.ic3.gov/.
Johnson, Dennis. “Demand That Zoom Immediately Create a Solution to Protect Its Users from Racist Cyber Attacks!” Organize For. Accessed November 1, 2020. https://campaigns.organizefor.org/petitions/demandthat-zoom-immediately-create-a-solution-to-protect-its-users-fromracist-cyber-attacks.
Karlgaard, Rich. “Why Zoom Is Booming.” Forbes. Accessed October 8, 2020. https://www.forbes.com/sites/richkarlgaard/2020/09/16/why-zoomis-booming/.
Law, Tara. “Oklahoma University’s Virtual Graduation Ceremony Disrupted by Racist Hacker.” Time, May 10, 2020. https://time.com/5834845/oklahoma-city-university-zoom-racism-hacker/.
Matters, Business. “The History of Video Conferencing.” Business Matters (blog), January 8, 2015. https://www.bmmagazine.co.uk/tech/history-video-conferencing/.
Raache, Hicham. “Oklahoma City University’s Virtual Graduation Hacked; Racist Language, Swastika Displayed during Blessing.” KFOR.com (blog), May 9, 2020. https://kfor.com/news/local/oklahomacity-universitys-virtual-graduation-hacked-racist-language-swastikadisplayed-during-blessing/.
Redden, Elizabeth. “‘Zoombombers’ Disrupt Online Classes with Racist, Pornographic Content.” Inside Higher Ed. Accessed August 13, 2020. https://www.insidehighered.com/news/2020/03/26/zoombombers-disruptonline-classes-racist-pornographic-content.
Roberts, Sarah T. Behind the Screen: Content Moderation in the Shadows of Social Media. New Haven, CT: Yale University Press, 2019.
Robinson, Cedric. Cedric J. Robinson: On Racial Capitalism, Black Internationalism, and Cultures of Resistance. New York: Pluto Press, 2019.
Schoenebeck, Sarita, Oliver L. Haimson, and Lisa Nakamura. “Drawing from Justice Theories to Support Targets of Online Harassment.” New Media & Society, March 25, 2020. https://doi.org/10.1177/1461444820913122.
Simmons, Lakisha Michelle. Crescent City Girls: The Lives of Young Black Women in Segregated New Orleans. University of North Carolina Press, 2015. https://uncpress.org/book/9781469622804/crescent-city-girls/.
Williams, Barry. “Why We Changed Our Tagline.” Accessed November 1, 2020. https://blog.zoomint.com/blog/why-we-changed-our-tagline.
Williams, Christine. “Millions of Users Love Zoom: Here’s Why That’s Bad News.” The Motley Fool, September 10, 2020. https://www.fool.com/investing/2020/09/10/millions-of-users-love-zoomheres-why-thats-bad-ne/.
Yuan, Eric S. “A Message to Our Users.” Zoom Blog (blog), April 2, 2020. https://blog.zoom.us/a-message-to-our-users/.
“Zoom Meeting for African American Students Hacked with Racist Images, Slurs.” WSPA 7News. Accessed July 2, 2020. https://www.youtube.com/watch?v=mAW838Ph-z8.

1

New Platform, Same Racists How Social Media and Gaming Route Racist Hatred to Zoom

Zoombombing has become increasingly common, with thousands of individual cases reported by journalists and many more that have gone unreported. Though a very large archive of racist zoombombing events has emerged and continues to grow, we chose to analyze three specific events in order to understand their media ecology: where they come from, how they are executed, and their effects upon targets.1 How do zoombombers find their victims? Zoombombers leave tracks on the Internet that reveal their techniques for mobilization and clearly indicate the platforms where they prefer to collaborate. Multiple social platforms integrate with Zoom to create an easy path for online hatred to follow people of color and women. This has become more common as classrooms and student life increasingly integrate with social media, with events being organized on Twitter, Facebook, and Instagram. Dr. Tiara Moore, whom we interviewed for this book, advertised her women of color academics’ group on Twitter because she wanted to cast a wide net in the name of inclusion. This chapter will focus specifically on Discord and 4chan because they offer the greatest structural advantages to zoombombers and other online abusers: they are anonymous or pseudonymous, it is easy to create new groups when old ones are taken down or banned for violations, and they already host robust communities of white supremacists and far-right adherents who have either been banned from or are not welcome on other social network sites. The old argument that gamers do not deserve protection from harassment because they are playing “for fun” was always a flawed and discriminatory claim, and it certainly no longer holds when everyone’s online spaces so thoroughly blur the line between “fun” and work. Zoom has become synonymous with the workday for many. Even those who use it for community building or “fun” have little choice of platform given the pandemic and Zoom’s hegemony. Given that

we know that racism and sexism are fixtures of online gaming and that the industry and its core fans have intentionally created and defended this identity as an antidote to “PC” culture and a bastion of “free speech,” it is not surprising that these same people have brought these bombing techniques to Zoom. Though racism and sexism might not always be the most obvious motivator of online harassment, they have historically defined the activity. As Whitney Phillips writes in her book This Is Why We Can’t Have Nice Things, racism is a way of hacking the attention economy.2 Critical race theory understands racist acts in public as more than bids for attention; they are a tactic and a strategy to create a world where people of color are unwelcome, treated as not-people, and both erased and made hypervisible as racist spectacle. On April 8, 2020, the Associated Press reported that “law enforcement agencies across the country are trying to adapt and respond to reports of uninvited guests on videoconferencing platforms who make threats, interject racist, anti-gay or anti-Semitic messages, or show pornographic images.”3 How can we tell the difference between racism as a cynical move meant to attract attention and “felt” racism, racism as part of an ideology that extends out into and organizes a person’s life and politics? Does intention matter here? We say no. Intent does not matter when the actions cause harm. The distinction between individuals who recognize themselves as racist and those who use racism without viewing themselves that way is completely illegible and inconsequential to those who have to live with the fallout of these experiences. This chapter focuses on the networks and platforms that support these actions rather than the individuals who take part in these acts of harassment and hate.
In this book, we focus on the experiences of Black people who have been targeted by zoombombers because they exemplify the harm that videoconference drive-by racism creates, especially in the context of COVID-19 and during a period of intense protest and public discourse about the value of Black lives. We did not interview zoombombers for this project as their thoughts and reasoning on this issue are beyond inconsequential. Instead, we analyze artifacts from platforms where these campaigns unfold and draw our conclusions from their own words and actions. Some zoombombers identify themselves as racists or misogynists, as part of the manosphere or broadly defined far-right, but many do not. Whether or not bombers claim this identity, the activities that emerge from these spaces provide the conditions for these ideologies to continue to spread and thrive. In this chapter, we analyze

data captured from youth-oriented platforms such as Discord and 4chan to show how zoombombers represent their activity on a continuum of racism, at times as innocuous homosocial acts of male bonding, and at other times as ideological attacks on “snowflakes,” “libtards,” and Black people. On April 14, 2020, a 4chan board user (all 4chan users are anonymous) who organized an attack on his “fat bitch landwhale of 400 pounds” teacher posted specific advice to participants, saying “join with real-sounding names or she won’t let you in,” and after she had closed the link, praised the group by saying “good job lads” (Figure 1.1). Though the student displays misogyny toward his teacher to rally strangers on 4chan to attack her, in the end it is the bond between the group that they celebrate. “Laddish” behavior is archetypally defined as annoying but harmless; however, it highlights the spectrum between “normal” and toxic masculinity. It is a new articulation of an old and tired claim that “boys will be boys,” one that makes invisible the actual violence of these normative behaviors. Similarly, conversations around gendered acts often obfuscate the racial components that make up much of the violence and the sense of entitlement to frameworks of power. The Internet has only increased the scope of this type of behavior through an expansion of

Figure 1.1  Screengrab from 4chan.


Figure 1.2  Screengrab from Discord.

who, what, and where “laddish” behavior is able to target. Zoom is a new and easily accessible space for this gendered and racial dynamic to take place. Even seemingly innocuous examples of zoombombing have racist and sexist undercurrents, which are often overlooked or erased. Internet pundits such as Jordan Peterson produce YouTube videos that route men and boys into radicalized media and social networks that promote real-life violence. This is a time-honored gateway which eases the user into politicized racist and sexist harassment. This type of pseudo-radicalization (or overt radicalization in other instances) can be a subtle process, and it works best when it is couched in humorous, laddish terms. One Discord poster, zoom X, encouraged participants by saying “Let’s make zoom-bombing great again!!!!,” a reference to Donald Trump’s political slogan, “Make America Great Again.” Even though there are no other references to Trump and politicized racism in the post, it interpellates Zoom attacks as living on the edge of deniability, irony, and legitimate politicized attacks (Figure 1.2). We can see the continuum of zoombombing from laddish prank to precisely targeted attacks on queer parents, Black people, and women. On April 23, one organizer encouraged zoombombers to be as “toxic as humanly possible,” and another one specifically requested “the N word please” (Figure 1.3). On April 24, 2020, the University of South Carolina’s Association of Black Students held their yearly cookout to support students studying for final exams on Zoom rather than in person. This celebratory event was advertised on Twitter because it was


Figure 1.3  Screengrab from 4chan.

Figure 1.4  Screen capture from YouTube video reporting University of South Carolina zoombombing.

always “open to all.” There was no record of racist behavior or harassment while the event was held in person; however, moving the event to Zoom created additional opportunities for disruption. As Black students logged onto the event, a flood of violent language and imagery filled the Zoom call. Swastikas and images of white people wearing blackface makeup were projected on their screens while attackers shouted the “N” word and “Fortnite!” This incident showcases zoombombing’s links to white supremacy both as an ideological framework and as a structure that supports these types of attacks and protects the attackers, not the targets (Figure 1.4). The Black Students’ Association turned to Zoom in an effort to capture some of the intimacy, feeling of presence, and social solidarity that sustained their students in the midst of our collective

isolation. After the incident, Bob Caslen, president of the University of South Carolina (USC), released a statement asserting that the university IT office was working with Zoom to identify the culprits and called on students to report any incidents of online racial harassment through the school’s office of Diversity, Equity, and Inclusion (DEI).4 This statement illustrates why zoombombing is so confounding and so poorly handled by institutions; it falls into a crack between an information technology problem that needs to be addressed by systems engineers and telecommunication specialists and a “climate” problem that would bypass the IT department and become the responsibility of the DEI or personnel office. These two parts of the university tend not to work together. Because zoombombing isn’t like a server crash that affects hundreds of people, it is often addressed as a one-off event or glitch rather than an open door to structural racialized hate. From an IT perspective, zoombombing is a known issue and the fix is to implement preventative measures that displace labor away from Zoom and onto the user. Though individual IT professionals are helpful, caring, and knowledgeable, their focus is on educating users to capitalize on the platform’s features and debugging failed software rather than preventing and taking responsibility for attacks, and zoombombing has not been elevated to the status of a drop-everything emergency. DEI rubrics were developed during a time when in-person racism and name-calling were the norm; DEI offices do not have expertise in digital racism and sexism despite this being a major source of harassment and a barrier to educational access on college campuses.
The “if you don’t like it, don’t use it” advice that many receive is also outdated; terminating or boycotting social media is not an option for students searching for connection and community, especially students of color at predominantly white institutions (PWIs). Many universities and schools have moved online since COVID-19, and DEI initiatives need to as well. The Internet’s utopian self-identity, dating back to a historical period well before video platforms were a possibility, imagined the user as powerful, self-reliant, and white/male, and therefore immune from the racial climate problems that have characterized the real Internet from the beginning. Innovations such as chat rooms, which, like Zoom, invited participants into shared virtual spaces to socialize in real time, and multiplayer video games were touted as bringing people together to form a new utopia. Advice from IT

departments and institutions to protect themselves by using software features differently is part of this way of seeing the digital: as a nascent community that is safe to use so long as we can master its features. Matthew Allen and other journalists describe Black USC students’ zoombombing as “Shocking Images and Hurtful Slurs,” an omission of racial discourse in the place where it most belongs. The double gesture of covering these stories of zoombombing to bring visibility and awareness to the problem while eliding the central role of racism in these acts both acknowledges racism and buries any mention of the word within the text itself rather than including it in a headline. This strategy is very common in the news stories that we found, but we argue that we must expect and demand more during two of the most serious crises of the 21st century: the epidemic of violence against Black people and COVID-19.5 Zoom (or Google Meet, or BlueJeans, or Skype – Zoom has come to stand in for all videoconferencing platforms) is no longer an optional aspect of professional, educational, or social life. Before COVID-19, Zoom and other videoconferencing platforms were not the preferred means of communication for many professional and academic settings; rather, they were used to compensate for someone not being able to appear physically by some twist of fate or prior obligation. But given the exodus from public space and this mass digital migration, it has become essential for individuals to use these platforms to build and maintain community, as well as to work from home and maintain their livelihoods. COVID-19 has drastically increased the number of users on the platform. Zoom had only 10 million daily meeting participants at the end of 2019; that number skyrocketed to over 300 million in April of 2020.6 Zoom is an essential service in the COVID era.
“Essential” has come to mean “non-optional.” Zoom users are not choosing the platform; rather, they are forced onto it with varying degrees of support and preparation, and they must learn to navigate new ideas of labor, community building, and social life in the face of a pandemic. In short, everyone is a vulnerable user on Zoom. However, not everyone has experienced abuse on the platform … yet. Because Zoom has not successfully protected its suddenly enormous number of users from the structural flaw that is zoombombing, it is crucial that we understand users’ experiences, their options for protecting themselves, and the obligation that the platform has to protect them (Figures 1.5–1.8).


Figure 1.5  Screengrab from 4chan.

Figure 1.6  Screengrab from Zoom X Discord.

Figure 1.7  Screengrab from Krvavi Discord.


Figure 1.8  Screengrab from Zoomheads Discord.

Oftentimes zoombombings originate from a call-to-arms on message boards such as 4chan or Discord, posted by individuals who have access to a meeting ID and/or password. This is often the case when the instigator has a direct or indirect connection to a class, a church, work, or a local government meeting. Raid organizers target teachers, other students, coworkers, or other members of their social networks. Other message board users will answer the call and spew racist or homophobic slurs and show violent or lewd imagery by screensharing. If the target is openly liberal, a person of color, or LGBTQ, the raid will focus on attacking these identities. Discord and 4chan have very different regulation policies. 4chan’s moderation and content regulation is limited at best, meaning people can (mostly) post whatever they want, including Zoom meeting information, without fear of it being taken down.7,8 Larger public Discord servers can be taken down by Discord if they violate the service’s guidelines, which prohibit organizing to harass others or spread hate speech. Many servers created to organize zoombombings have lasted only a week or so before being taken down by Discord; thus Discord users have to be much more discreet and willing to adapt when their servers are taken down. Individuals on Discord organize zoombombing by employing bots programmed to scrape public Zoom IDs from other social media sites, primarily Twitter. As everything began shifting online in early 2020, many people were completely surprised when zoombombing began to spread, and many were unaware of the perils of online interactive spaces. And yet gamers may have been less confused by this, as online

gaming has been synonymous with trolling for over a decade. Toxicity in gaming is a major reason why women often avoid multiplayer games despite enjoying and wanting to partake.9 Video games have been a largely unregulated and unmoderated, proudly disruptive force since well before the funeral bombing occurred. In 2006, a World of Warcraft player posted a video to YouTube entitled “Serenity Now bombs a World of Warcraft (WoW) funeral.”10 This video records one of the most egregious examples of gaming’s unique mixture of cruelty and humor, which set the tone for how bad behavior would be conceptualized by both platforms and many users: as fundamentally harmless, or “for the lulz.”11 When a beloved and devoted WoW player died of cancer, her guild staged an in-game funeral service for her. As mourners stood in line to honor their friend, an opposing faction who had learned about the event charged in and killed everyone. This is an excellent example of the cognitive dissonance that intimate online violence engenders; reactions ranged from shock to cynical eye-rolling. The Serenity Now bombing (known as the “funeral raid”) was deeply painful to those who showed up to mourn and to celebrate the life of their guildmate.12 Unlike the majority of zoombombing incidents, it didn’t feature obscenities but instead followed gaming rules by attacking and killing the attending players. While this is not the same type of incident that this book focuses on, it speaks to the gamification of zoombombing that many participants exploit. In gaming, players who purposely campaign either alone or in groups to harass other players are called “trolls” or “griefers.” Like zoombombing, much of this activity is racist and sexist, but not all of it is. We argue that these terms minimize the damage that calling a player the “N” word on Xbox during a game produces.
Griefers specialize in winkling out these moments of celebration and joy in digital play in order to destroy them. Zoombombing wouldn’t exist the way it does if racist griefing in gaming were not already such an entrenched practice, a rehearsal space for racism in the public events where Black people and other non-white people must now live their lives. The Internet didn’t become a trashfire all of a sudden: it happened over a long period of time. Zoombombing has deep roots in toxic early gaming culture, and incidents that ought to shock and horrify us are accepted as part of the territory. Unlike gamers, some of the people of color who have been zoombombed were new to the medium and had never experienced

this particular form of racial attack. Zoom is like many gaming platforms because the company regulates its space lightly, if at all; users consent to a Terms of Service agreement that almost nobody has ever read.13 The people we spoke to who had experienced zoombombing reported shock, disbelief, and trauma. It’s important that we trace the term “bombing” to its roots in order to show that its intention is to deliberately hurt others in real time. Like other forms of griefing, zoombombing is an enduring feature of the badly regulated or unmoderated Internet, not something new, and the people who’ve experienced it have told us that it was devastating. In short, zoombombing is an act of terrorism: isolated, explosive, anonymous or semi-anonymous, and digitally coordinated. As Internet researcher Manuel Castells wrote in 2004, well before the dominance of social media or online video meetings, extremist groups are optimized for Internet harassment.14 Intimacy and gaming culture have conjoined in moments of massive upheaval and disruption in gaming communities. #Gamergate was a well-known and far-reaching harassment campaign that began as an attack on game designer Zoë Quinn (who uses they/them pronouns), in response to the accusation that they’d used sexual, intimate relationships to advance their career as a game developer. Offered up as a sacrifice to harassment by their ex-boyfriend, Quinn became a figurehead meant to represent female intrusion in a supposedly male-centric space. Their gendered position, and the accusation of intimacy as power, worked to position them as a threat to the greater gaming community. #Gamergate transformed into something much larger, amorphous, and violent. Brianna Wu, Randi Harper, Katherine Cross, and other prominent women and femmes in gaming were similarly threatened with rape and death.
Game scholar Kishonna Gray’s work on Xbox racism describes how women of color players experience misogynoir (a term developed by Moya Bailey and Trudy to describe violence targeted specifically at Black women)15 at the hands of other gamers, and how they have mobilized themselves to fight back. Misogynoir allows us to understand the scope of what goes into Black experiences in online and gaming spaces. It works to uncover the intersections of white supremacy, anti-Blackness, patriarchy, and the objectification of Black women. Anonymity increases the viability of misogyny and racism in online spaces. The freedom with which trolls and racist actors operate to enact violence against other users is predicated

on the protections they expect to enjoy on these platforms. Gray’s use of the term “flaming” is helpful for understanding how we arrive at a new form of virtual attack with zoombombing.16 Flaming, according to Gray, is a spontaneous use of racist, homophobic, or violent language during electronic communication. Zoombombing is an act of misogynoir. We use this term to describe how racism targets Black women even when they are not in a Zoom room being abused “live.” Zoombombing creates ripe opportunities for racism and sexism that have a chilling effect upon women and people of color who read about what might happen, know a person to whom this has happened, or must log on to a new Zoom room every day without knowing if security measures have been put into place. Bombers use language and images designed for maximum shock value, and the swastikas, the “N” word, and porn that Dennis Johnson and his family were forced to look at during his dissertation defense are acts of violence that target women and people of color, whether they are absent or present. Who benefits when zoombombing is described as party-crashing rather than as a racist attack, when we minimize it and understand it as business as usual? The following chapters address this question, how zoombombed targets experience bombing, and useful ways that we might think about cultural and technological harms.

Notes
1 Neil Postman defines media ecology as a method for understanding “a medium [as] a technology within which a culture grows; that is to say, it gives form to a culture’s politics, social organization, and habitual ways of thinking.”
2 Phillips, This Is Why We Can’t Have Nice Things.
3 Associated Press, “Teen Arrested after ‘Zoom Bombing’ High School Classes.”
4 Bingham, “Association of African American Students Hosts Follow-up Call in Wake of Racist Hacking – The Daily Gamecock at University of South Carolina”; “Office of Diversity, Equity and Inclusion – University of South Carolina.”
5 Allen, “Video Shows Racists Clad in Blackface, Swastika Zoombomb Black South Carolina Students.”
6 “90-Day Security Plan Progress Report.”
7 4chan does have some regulation and moderation. As Internet scholar Tarleton Gillespie explains in Custodians of the Internet: Platforms, Content Moderation and the Hidden Decisions that Shape Social Media, even less-moderated sites such as 4chan have hard lines that can’t be crossed.
8 Gillespie, Custodians of the Internet.
9 Cote, “‘I Can Defend Myself.’”
10 http://marcuscarter.com/wp-content/uploads/2014/05/AoIR-WoW-Funeral-Final.pdf. Gibbs et al., “Serenity Now Bombs a World of Warcraft Funeral: Negotiating the Morality, Reality and Taste of Online Gaming Practices.”
11 Phillips, This Is Why We Can’t Have Nice Things.
12 See Tonia Sutherland’s “Making a Killing: On Race, Ritual, and (Re)Membering in Digital Culture.”
13 As of August 2020, Zoom notes that any usage of its products and services is contingent on compliance with its terms of service.
14 Castells, The Power of Identity.
15 Bailey and Trudy, “On Misogynoir: Citation, Erasure, and Plagiarism.”
16 Gray, Race, Gender, and Deviance in Xbox Live; Dorwick, “Beyond Politeness.”

References “90-Day Security Plan Progress Report: April 22,” April 23, 2020. https:// blog.zoom.us/wordpress/2020/04/22/90-day-security-plan-progress-­ report-april-22/. Allen, Matthew. “Video Shows Racists Clad in Blackface, Swastika ­Zoombomb Black South Carolina Students.” The Grio. Accessed August 29, 2020. https://thegrio.com/2020/04/26/black-south-carolina-studentsracist-zoombomb/. Associated Press. “Teen Arrested after ‘Zoom Bombing’ High School Classes.” New York Post (blog), April 9, 2020. https://­nypost. com/2020/04/08/teen-arrested-after-zoom-bombing-high-school-classes/. Bailey, Moya and Trudy. “On Misogynoir: Citation, Erasure, and Plagiarism.” Feminist Media Studies 18, no. 4 (March 13, 2018): 762–768. Bingham, Jack. “Association of African American Students Hosts ­Follow-up Call in Wake of Racist Hacking – The Daily Gamecock at University of South Carolina.” Accessed November 1, 2020. https://www. dailygamecock.com/article/2020/04/aaas-followup-bingham-news. Castells, Manuel. The Power of Identity. 2nd ed. Information Age, Economy, Society, and Culture, vol. 2. Oxford; Malden, MA: Blackwell Publishing, 2004. Cote, Amanda C. “‘I Can Defend Myself’: Women’s Strategies for Coping with Harassment While Gaming Online.” Games and Culture 12, no. 2 (March 1, 2017): 136–155. https://doi.org/10.1177/1555412015587603. Dorwick, Keith. “Beyond Politeness: Flaming and the Realm of the Violent.” Presented at the 44th Annual Meeting of the Conference on College Composition and Communication, San Diego, 1993. http://eric.ed.gov/ ERICDocs/data/ericdocs2sql/content_storage_01/0000019b/80/13/ dc/99. pdf. Gibbs, Martin, Marcus Carter, Michael Arnold, and Bjorn Nansen. ­“Serenity Now Bombs a World of Warcraft Funeral: Negotiating the Morality, Reality and Taste of Online Gaming Practices.” n.d., 4. https:// doi.org/10.5210/spir.v3i0.8845.

Gillespie, Tarleton. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. New Haven, CT: Yale University Press, 2018.
Gray, Kishonna L. Race, Gender, and Deviance in Xbox Live: Theoretical Perspectives from the Virtual Margins. Theoretical Criminology Series. Waltham, MA: Anderson Publishing, 2014.
“Office of Diversity, Equity and Inclusion – University of South Carolina.” Accessed November 1, 2020. https://www.sc.edu/about/offices_and_divisions/diversity_equity_and_inclusion/index.php.
Phillips, Whitney. This Is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture. The MIT Press, 2015. http://www.jstor.org/stable/j.ctt17kk8k7.
Postman, Neil. “The Humanism of Media Ecology: Keynote Address Delivered at the Inaugural Media Ecology Association Convention, Fordham University, New York, New York, June 16–17, 2000.” Proceedings of the Media Ecology Association, Volume 1, 2000. https://www.media-ecology.org/resources/Documents/Proceedings/v1/v1-02-Postman.pdf.
Sutherland, Tonia. “Making a Killing: On Race, Ritual, and (Re)Membering in Digital Culture.” Preservation, Digital Technology & Culture 46, no. 1 (2017): 32–40. doi:10.1515/pdtc-2017-0025.

2 Zoom Is Memetic Warfare: Zoombombing and the Far Right

The goal of media scholarship is to help us understand the histories and afterlives of communicative forms. Digital or “new” media presents challenges because of its ephemerality, massive scale, and emergent qualities, requiring us to analyze it while living in it. Memes, user-produced mashup images that combine text with familiar figures such as cats, cartoons, and badly drawn MS Paint images, have received particular attention as indispensable and characteristic tools of the right. “The left can’t meme” is itself a meme: a repeatable, memorable short piece of content that has roots in earlier forms and lives within an ecosystem of actors, material artifacts, production cultures, and cultural politics. In this chapter, we situate zoombombing within the memic architecture of US racial and gender digital politics, examining how the practice serves the right’s purposes even when it is not deployed by the right. Zoombombing deploys classic online harassment techniques and adds something new: it exploits Zoom’s uniquely liminal space, a space of intimacy generated by users via the relationship between the digital screen and what it can depict, the device’s audio tools and how they can transmit and receive sound, the software that we can see, and the software that we can’t. Two conditions have paved the way for zoombombing: a resurgent fascist movement that has found its legs and best megaphone on the Internet, and an often-unwitting public who have been suddenly required to spend many hours a day on this platform. COVID permitted no training or onboarding and offered no advance warning about the dangers of this platform. As this book shows, Zoom benefited from a boom that made it especially attractive to racist actors and racist acts. The Internet has always offered the option of creating community or allowing work to flow into the intimate spaces of the home, but with the global pandemic, these spaces have become intricately and necessarily linked, and

Zoom has stepped up to become a useful and, at the same time, monopolizing force for the facilitation of this overlap. Zoom and other videoconferencing services are both the bad object and the unquestionable infrastructure for social and economic life. Zoombombing repurposes new platforms in dynamic and far-reaching ways; it is both like and unlike older styles of racial and gendered attacks but, in the end, is a uniquely COVID-era and neofascist activity.1 Zoombombing sprang into being as a widespread practice because it tapped into the frustration and opportunities presented by the US’s culturally polarized society and was buoyed by a platform that provides ample opportunities for these particular styles of attack. Internet harassment shifts and adapts along with new platforms, growing in parallel with them as they get bigger and more popular. Zoombombing is more than just trolling; though it belongs to a broad category of online behavior meant to produce a negative reaction, it has an intimate connection with online conspiracy theorists and white supremacists. Trolls want an audience, and big platforms produce big audiences; while trolling is often racist and misogynistic, it can also entail less nefarious tactics and goals, such as rickrolling. Zoombombing should not be lumped into the larger category of trolling, both because the word “trolling” has become so broad it is nearly meaningless at times, and because zoombombing is designed to cause intimate harm and terrorize its targets in distinct ways. While some forms of zoombombing mimic trickery and mischief that were already present in spaces such as real-life classrooms and town halls, this book analyzes how acts cloaked in the Internet’s prior histories of joyful and mischievous pranking are actually hateful harassment. This tactic emerged from online spaces that deploy memetic warfare and from the online ecosystems that radicalize users into increasingly violent actions.
Some of the earliest examples of zoombombing happened in the classroom. Like other online harassment campaigns, zoombombing is nothing if not opportunistic. As described in Chapter 1, these groups leave traces of their active organizing and mobilization on other online platforms such as Twitter and Discord. Their planning focuses on gathering more participants, offering up details about the space that is going to be attacked, and providing the attackers with a set of images, rhetoric, and video tools. At times the instigator will make requests based on the intended target, usually linked to their race and/or gender, but always focused on shock value

provided through profanities, racial slurs, and graphic porn. In a Discord chat used to organize a zoombombing raid, one student who volunteered his class’s login code for Zoom later drew the line at showing porn to fourth-grade students.2 There were many other examples of trolling behavior linked to classrooms that intended not only to disrupt but to humiliate or harm the teacher or students by targeting their race and gender. Though there is a wide spectrum of tactics and goals within the larger category of zoombombing, we’ve found that it is most commonly used for racist and gendered attacks and that attackers regularly seek out spaces intended for community and safety. COVID has reprioritized and reframed everyday acts of living, working, and communicating and has created fertile ground for a regime of seemingly random racial terror. Zoombombing now fits into a growing framework of memetic warfare: platforms such as Discord, Twitter, and 4chan funnel bombers and unwitting users alike into an unpredictable, unmoderated, anonymous, and consequence-free space. We entitled this book Racist Zoombombing to distinguish our topic from other forms of Zoom disruption that don’t have a racial component. Some instances of zoombombing were intended to disrupt institutional spaces, such as classrooms or government board meetings, and may or may not have used racial and gendered content. We also understand the messiness of that distinction. At times context and content seem antithetical: the content used to disrupt could be racist even if the targets themselves were white, or sexist even when the target was male. This speaks to zoombombing’s memic nature, which draws from the past and present to repurpose a formula and to spread its content as far as possible. Racism and misogyny are endemic to the US cultural context and are thus repurposed for a variety of contexts.
We are not claiming that zoombombing is in itself an inherently racist act, but that through its targets or its content, it is used as a form of racial and gendered terror, and that by and large, the targets of these attacks are people of color and women. Kids disrupt class online just as they did offline, and bored and frustrated people with new time on their hands and justified aggression about being stuck at home seek connection by associating with subcultural groups online that organize around disruption. Because the majority of zoombombings reported in the press involve overtly racist name-calling, pornographic imagery, threats to kill people of color, images of swastikas and KKK regalia and symbolism,

and other signs and signifiers, we see this subset of Zoom misuse as both distinctive and as belonging to a more expansive category of proto-fascist content that has been banned on other platforms but appears there frequently nonetheless. We trace zoombombing’s genealogy back to the memetic warfare waged by the far-right and its use of the same racist signs and images disguised and embedded within more playful ones, such as the cartoon character Pepe the Frog. Like certain memes or other tactical appropriations of popular culture, zoombombing is designed to provide users deniability. When zoombombing happens in a group of all white people and bypasses people of color, it overlaps structurally, historically, and semiotically with the far-right even when its perpetrators are not aware of or participating in those movements, which is exactly why we call this memetic warfare. Zoombombing serves the far-right racist movement even when the carriers of this meme who decide to zoombomb for fun aren’t aware of it. A virus doesn’t need a host to believe in it: it just needs a carrier and a population to infect. The excitement of trolling has much to do with the excitement of emergent behavior on the Internet that utopians celebrate, of not knowing how far this might go, a tonic for alienated and often legitimately disenfranchised people who cling to this right as other more meaningful forms of social engagement have disappeared or were never present in the first place. The history of trolling is inseparable from racist and misogynistic content and behavior because these are often the tools used to shock and harm, and in this way zoombombing is like other kinds of trolling that leverage novelty and strong effects.
While zoombombing can certainly be understood as part of the lineage or ecosystem of trollish behavior, we argue that it needs to be critiqued and understood as more than simply trolling because that term emerged during an earlier, less media-rich and interpersonally live Internet. Many people have heard the term “red-pilling”3 before, as documentaries such as The Red Pill (2017) and The Brainwashing of My Dad (2015) find their way onto Netflix and journalists use it in articles about the rise of far-right populist movements. The term is drawn from The Matrix film franchise and describes how the memetic Internet recruits users into far-right movements and how digital media content is both an act of community building and a form of propaganda that provides momentum and power to Internet harassment.4 Choosing to take the red pill rather than the blue pill awakens one to the “reality” of feminism as a plot against men,

liberalism as a way to victimize white people, and diversity initiatives as indoctrination and as “Black supremacy.” Those who take the red pill claim a new awareness of the lies and harm that feminism or multiculturalism have created for society at large, and the particular harm that white males face as a consequence. To be red-pilled is to be radicalized into a male-supremacist and/or white-supremacist community, to be “awakened.” The term took root in the manosphere/incel communities, quickly spreading to white supremacist spaces. It might mean following “Q” and the QAnon movement – a conspiracy theory that envisions Democrats as child molesters and has migrated from an online-only space to offline political demonstrations; QAnon signs have been spotted at Trump rallies. To “take the red pill” is to have one’s eyes opened to the harms of feminism, liberalism, or multiculturalism.5 It is to be inducted, ideologically, into a loosely defined subculture that opposes progressiveness or mainstream society. Red-pilling can be more or less focused on aspects of radical-right thinking, it can be more or less organized toward male supremacy, or it can be loosely racist or homophobic, all depending on which platform or community the red-pilling takes place in. Red-pilling is attractive to resentful and angry converts because it rewards and requires action. This can be as simple as sharing your conversion story on a Reddit forum, making a racist meme “for the lulz,” or zoombombing a meeting using one of these memes.
Red-pilling is fundamentally about creating noise, violence, and harm.6 These communities’ and subcommunities’ tactics and tools are so malleable and widespread that they have infiltrated the broader culture.7 Those who participate in this particular type of antagonism, trolling, or general harassment are taking part in a culture of red-pilling, whether or not they realize it or actively engage with spaces more readily understood as extremist.8 The Overton Window, or the indicator of what is publicly acceptable in social discourse, has shifted so far to the right that it is now not only possible but the norm that zoombombing is viewed as a relatively harmless prank compared to the plethora of other less ephemeral, longer-duration, and overtly racist and misogynistic content and behavior online. The term and the ideologies behind red-pilling have become so widespread that it doesn’t actually require intentional recruitment for someone to “fall down the rabbit hole” into engaging with the same ideologies and tactics that the more radical online spaces traffic in. As some researchers have pointed out, there is such a plethora of

misinformation and online hate speech that users can effectively “red-pill” themselves.9 One of the core tactics used by recruiters is to frame racist or misogynistic material as “trolling” or, as Ryan Milner calls it, the “Logic of the Lulz,” which lends an attempted plausible deniability to acts of harassment or overt ideological terrorism.10 This was a key element in early online harassment, and zoombombing has refined it for the COVID age. Trolling is both an act of aggression meant to situate the victim as unwanted and unwelcome and a way to codify shared values for the subcultural group that deploys these tactics. It defines the target as the “other,” and in doing so, it helps strengthen the internal characteristics of the community. Zoombombing conceals and contains the terror and psychological harm that targets of active harassment face because it doesn’t leave a trace unless an alert user records the meeting. Likewise, zoombombing is articulated, most commonly, through a focus on anti-Blackness. Even in instances where a Black subject is not present, anti-Black imagery and language are often deployed. This core element of zoombombing speaks to the longer history of the United States and how structures of power, subjugation, and race are articulated through white supremacy and anti-Blackness. Similarly, it also showcases the racialized history of Internet culture and the power dynamics of historic harassment. Zoombombing is the latest iteration of a much longer history of loosely organized, highly effective memetic campaigns by subcultural groups that often have violent real-world aftereffects. Zoombombing, like other kinds of bombing, induces terror through targeted violence against an enemy other, often in gendered and racialized terms but operating under the cover of the impersonal. And yet, racism and sexism are always personal; they are an attack on a person’s very being and identity.
Even the most innocuous cultural objects can be easily weaponized, such as Pepe the Frog, whose origins were apolitical but who is now recognized as a hate symbol.11 Pepe has for years been used in memetic warfare campaigns to harass and threaten targets through racist and misogynistic humor. As a cultural symbol, he signals to a loose membership of networked communities or ideologies that users can tap into when useful. And yet, he is often used to claim an updated version of “the logic of the lulz,” which separates intent from action. In much of the commentary we found on online platforms used to organize these attacks, participants framed their use of zoombombing similarly. They saw their actions as humorous and claimed in the comments not to understand why people reacted as strongly as they did – why they didn’t get the joke – even as it was clear that the

violence of the act and the shock was always the point. The culpability for these actions, and the resulting harm, falls on the perpetrators but also on the infrastructures that continue to support and allow zoombombing to take place. Earlier this year, despite the increased evidence of these attacks, Zoom continued to maintain that these acts were simply “party-crashing” and not part of an organized campaign of hate. While its language has changed after organizing by those affected, there continues to be a lack of responsibility and support. The spreadability of memes such as Pepe or zoombombing allows for a variety of uses. Not every Pepe image is deployed in a racist way, and not every user of Pepe is a white supremacist.12 Context does matter. Nonetheless, zoombombing and Pepe are both part of a larger contextual framework that has effectively mobilized and empowered hate campaigns. Zoombombing is itself a meme.13 As with Pepe, zoombombers derive social capital from its usage through constructs of violence, and memes and memetic warfare are an important part of the accumulation of social capital. Limor Shifman calls Internet memes “units of popular culture that are circulated, imitated, and transformed by Internet users, creating a shared cultural experience.”14 This shared experience, however, takes on a particular racialized and gendered meaning when viewed through the longer history of the fight to keep the Internet a white, male space. Zoombombing, then, is simply a newer version of this longer struggle. It deploys the same tactics through a new venue. Racist and misogynistic language and imagery have long been used as guerrilla-warfare tactics to push unwanted individuals out of a digital space through fear, disgust, or discomfort. Though new platforms look different, these goals and tactics remain the same.
Memes are the improvised explosive device (IED) of information warfare.15 We are well in the midst of a digital cultural war based on information and data, rather than weapons and bodies, and memes and other elements of the far-right’s political aesthetic play a key role in this conflict.16 The rise of the online troll as a political player and of the alt-right are merely the logical outcomes of these systems.17 The term “zoombombing” is inherently violent, invoking terrorist and warlike tactics. It is no coincidence that these martial metaphors are so common in online spaces, as culture wars are largely waged online; like explosives, memes are not precise weapons. They are not easy to control once circulated on social platforms and can harm more than the intended target. Memetic warfare isn’t always racist; some of the most effective and widespread anti-racist campaigns during COVID have also

been memetic. Memes can be a genuine form of resistance against propagandized rhetoric by powerful institutions and can be used by victims to disrupt systems of power and harm. The fans of BTS, a popular K-Pop group, known as “Army” members, have appropriated military language to a very different end, coordinating highly successful attacks against white supremacy hashtags by creating their own memes and posting thousands of fancams of group members.18 This kind of memetic warfare can push back against hate groups or carceral regimes with stark examples of opposing experiences. It can draw together the masses to disrupt power. For example, though the #myNYPD campaign was designed to collect promotional material meant to showcase positive interactions with police officers, users flooded it with thousands of images of police brutality, eventually spreading nationally to include similar hashtag campaigns such as #myLAPD.19 A more current example of memetic warfare comes from the 2020 campaign trail, when K-Pop stans, fans of particular Korean pop groups, rallied together to take over the hashtag #whitelivesmatter, an inherently white supremacist pushback to the growing Black Lives Matter movement.20 Or when thousands of teens on TikTok waged multiple attacks against the Trump reelection campaign through negative reviews on Trump’s reelection app;21 TikTokers and K-Pop stans claimed credit for tanking the numbers at Trump’s Tulsa, Oklahoma rally in June.22 However, while these examples are lauded as highly impactful, most campaigns “from the other side” are rarely as effective as those “on the right.” One key to the right’s effectiveness is the straightforwardness of their goals and their allergy to nuance. Those who take part in campaigns of targeted harassment want to create havoc, confusion, and harm. They do so by using the most direct and explosive tools available to them.
Zoombombers enter a space meant for joy, intimacy, celebration, or collaboration, and disrupt and shock its target audience through violent imagery or language. Making the targets feel unsafe and unwanted in their space and their skin is the most simple yet effective strategy. Early trolling culture claimed to be apolitical, targeting a spectrum of adversaries with the common goal of showcasing a nihilistic, trickster aesthetic.23 And yet, even then, misogyny and racism, in particular anti-Black racism, were core tactics in those attacks and harassment campaigns. Zoombombing is simply one of the latest memetic weapons used against people in precarious social positions. The US military considers memetics a subset of neurocognitive warfare and a tool in

“information war.” Although memetic warfare is often understood through a focus on state-against-state tactics, it also refers to spaces where online self-designated “meme warriors” have launched targeted attacks against a cultural enemy group in a variety of organized and disorganized ways. Meme warriors see themselves as digital guerrilla fighters against institutional monopolies on knowledge and narratives, such as the mainstream media and other centralized authorities.24 Gamergate, “the Fappening” (or Celebgate), and the subsequent Comicsgate are examples of campaigns that deployed active memetic warfare in racialized and gendered ways to attack and drive away a constructed enemy.25 Like memes, zoombombing operates under the moral cover of humor, yet as we have found from speaking to Dr. Tiara Moore, Angelique Herring, and Dr. Dennis Johnson, hearing the “N” word shouted at you during your dissertation defense is a form of informational and psychological warfare, and zoombombing, like other memetic warfare campaigns, has grown naturally, and asymmetrically, across multiple platforms. All users are vulnerable by design on Zoom, but the dangers are not evenly distributed. It is truly inspiring to see how users are creating sacred and nurturing spaces on Zoom by offering free yoga and boxing classes; holding prayer groups and meditation sittings; conducting funerals, weddings, and graduations on video; and saying their last goodbyes to loved ones with COVID as they pass from this life alone in hospital beds. It is exactly because Zoom is a lifeline to community and intimacy that Black life is particularly targeted there. As previous memes such as Barbecue Becky and the driving, walking, or standing while Black catchphrases have demonstrated, the sight of Black joy or public life enrages and disturbs whiteness and those white folks who feel their privilege is threatened.
At its best, the Internet networks and connects individuals and allows them to create new forms of knowledge, community, and intimacy. When used as a space for joy, or to organize to disrupt power and oppression, memes and other emergent digital practices can be creative, collaborative, and a force for good. However, Zoom is often used to support and perpetuate harm and, for better or for worse, has become both a battleground and the COVID era’s site of connection for work, for family, and for community. And like other styles of warfare, those who are most targeted and most harmed are the ones who already live in a state of precarity. Racism can be separated neither from our understanding of technology nor from cultural movements. Zoombombing is simply the most recent iteration of the culture wars played out online.


Notes
1 Steinbeck, “Virtual UGA Guest Lecture Hijacked with Death Threats, Racial Slurs Directed toward Professors.”
2 Kan, “Students Conspire in Chats to ‘Zoom-Bomb’ Online Classes, Harass Teachers.”
3 “What the Red Pill Means for Radicals”; Marwick and Lewis, “The Online Radicalization We’re Not Talking About.”
4 “What the Red Pill Means for Radicals.”
5 “What the Red Pill Means for Radicals.”
6 Cunha, “Red Pills and Dog Whistles.”
7 Phillips, “The Oxygen of Amplification.”
8 Crawford, “The Influence of Memes on Far-Right Radicalisation.”
9 “VasilistheGreek (Discord ID: 270328712367570955).”
10 Milner, “FCJ-156 Hacking the Social.”
11 Morlin, “Pepe Joins (((Echoes))) as New Hate Symbols.”
12 Chan, “Intimacy, Friendship, and Forms of Online Communication among Hidden Youth in Hong Kong.”
13 What is a meme? According to Limor Shifman, memes are “(a) a group of digital items sharing common characteristics of content, form, and/or stance, which (b) were created with awareness of each other, and (c) were circulated, imitated, and/or transformed via the Internet by many users.” Shifman, Memes in Digital Culture, 367.
14 Shifman, Memes in Digital Culture, 367.
15 Siegel, “Is America Prepared for Meme Warfare?”
16 Bogerts and Fielitz, “Do You Want Meme War?”
17 Fichman and Sanfilippo, Online Trolling and Its Perpetrators; Graham, “Boundary Maintenance and the Origins of Trolling”; Greene, “‘Deplorable’ Satire”; Hodge and Hallgrimsdottir, “Networks of Hate.”
18 “K-Pop Fans Drown out #WhiteLivesMatter Hashtag.”
19 Lopez, “Twitter Critics Take on LAPD after NY Police Hit on Social Media.”
20 Ohlheiser, “How K-Pop Fans Became Celebrated Online Vigilantes.”
21 Banjo and Egkolfopoulou, “TikTok Teens Are ‘Going to War’ Against the Trump Campaign After Republicans Call to Ban the App.”
22 Lorenz, Browning, and Frenkel, “TikTok Teens and K-Pop Stans Say They Sank Trump Rally.”
23 Phillips, Beyer, and Coleman, “Trolling Scholars Debunk the Idea That the Alt-Right’s Shitposters Have Magic Powers.”
24 #Gamergate is a good example of a memetic campaign against women in gaming that figured itself as an insurgent protest of a larger and more powerful entity—the gaming journalism establishment.
25 Massanari, “#Gamergate and The Fappening.”

References
Banjo, Shelly, and Misyrlena Egkolfopoulou. “TikTok Teens Are ‘Going to War’ Against the Trump Campaign After Republicans Call to Ban the App.” Time, July 10, 2020. https://time.com/5865261/tiktok-trump-campaign-app/.

Bogerts, Lisa, and Maik Fielitz. “‘Do You Want Meme War?’: Understanding the Visual Memes of the German Far Right.” (2019): 137–153. https://doi.org/10.14361/9783839446706-010.
Centre for Analysis of the Radical Right. “What the Red Pill Means for Radicals.” June 8, 2018. https://www.radicalrightanalysis.com/2018/06/08/what-the-red-pill-means-for-radicals/.
Chan, Gloria Hongyee. “Intimacy, Friendship, and Forms of Online Communication among Hidden Youth in Hong Kong.” Computers in Human Behavior 111 (October 2020): 106407. https://doi.org/10.1016/j.chb.2020.106407.
Crawford, Blyth. “The Influence of Memes on Far-Right Radicalisation.” Centre for Analysis of the Radical Right (blog), June 9, 2020. https://www.radicalrightanalysis.com/2020/06/09/the-influence-of-memes-on-far-right-radicalisation/.
Cunha, Darlena. “Red Pills and Dog Whistles: It Is More than ‘Just the Internet.’” Aljazeera, September 6, 2020. https://www.aljazeera.com/opinions/2020/9/6/red-pills-and-dog-whistles-it-is-more-than-just-the-internet/.
Fichman, Pnina, and Madelyn R. Sanfilippo. Online Trolling and Its Perpetrators: Under the Cyberbridge. Lanham, MD: Rowman & Littlefield, 2016.
Graham, Elyse. “Boundary Maintenance and the Origins of Trolling.” New Media & Society, May 30, 2019. https://doi.org/10.1177/1461444819837561.
Greene, Viveca S. “‘Deplorable’ Satire: Alt-Right Memes, White Genocide Tweets, and Redpilling Normies.” Studies in American Humor 5, no. 1 (2019): 31–69. https://doi.org/10.5325/studamerhumor.5.1.0031.
Hodge, Edwin, and Helga Hallgrimsdottir. “Networks of Hate: The Alt-Right, ‘Troll Culture’, and the Cultural Geography of Social Movement Spaces Online.” Journal of Borderlands Studies 35 (February 26, 2019): 1–18. https://doi.org/10.1080/08865655.2019.1571935.
“K-Pop Fans Drown out #WhiteLivesMatter Hashtag.” BBC News, June 4, 2020, sec. Technology. https://www.bbc.com/news/technology-52922035.
Kan, Michael. “Students Conspire in Chats to ‘Zoom-Bomb’ Online Classes, Harass Teachers,” n.d. https://www.pcmag.com/news/students-conspire-in-chats-to-zoom-bomb-online-classes-harass-teachers.
Marwick, Alice, and Becca Lewis. “The Online Radicalization We’re Not Talking About.” Intelligencer. Accessed November 1, 2020. https://nymag.com/intelligencer/2017/05/the-online-radicalization-were-not-talking-about.html.
Lopez, Robert J. “Twitter Critics Take on LAPD after NY Police Hit on Social Media.” Los Angeles Times, April 23, 2014, sec. California. https://www.latimes.com/local/lanow/la-me-ln-twitter-critics-mylapd-mynypd-20140423-story.html.
Lorenz, Taylor, Kellen Browning, and Sheera Frenkel. “TikTok Teens and K-Pop Stans Say They Sank Trump Rally.” The New York Times, June

21, 2020, sec. Style. https://www.nytimes.com/2020/06/21/style/tiktok-trump-rally-tulsa.html.
Massanari, Adrienne. “#Gamergate and The Fappening: How Reddit’s Algorithm, Governance, and Culture Support Toxic Technocultures.” New Media & Society 19, no. 3 (March 2017): 329–346. https://doi.org/10.1177/1461444815608807.
Milner, Ryan M. “FCJ-156 Hacking the Social: Internet Memes, Identity Antagonism, and the Logic of Lulz.” The Fibreculture Journal, no. 22 (2013). http://twentytwo.fibreculturejournal.org/fcj-156-hacking-the-social-internet-memes-identity-antagonism-and-the-logic-of-lulz/.
Morlin, Bill. “Pepe Joins (((Echoes))) as New Hate Symbols.” Southern Poverty Law Center, September 28, 2016. https://www.splcenter.org/hatewatch/2016/09/28/pepe-joins-echoes-new-hate-symbols.
Ohlheiser, Abby. “How K-Pop Fans Became Celebrated Online Vigilantes.” MIT Technology Review, June 5, 2020. https://www.technologyreview.com/2020/06/05/1002781/kpop-fans-and-black-lives-matter/.
Phillips, Whitney. “The Oxygen of Amplification.” Data & Society, May 22, 2018. https://datasociety.net/library/oxygen-of-amplification/.
Phillips, Whitney, Jessica Beyer, and Gabriella Coleman. “Trolling Scholars Debunk the Idea That the Alt-Right’s Shitposters Have Magic Powers.” Vice, March 22, 2017. https://www.vice.com/en/article/z4k549/trolling-scholars-debunk-the-idea-that-the-alt-rights-trolls-have-magic-powers.
Shifman, Limor. Memes in Digital Culture. Cambridge, MA: The MIT Press, 2013.
Siegel, Jacob. “Is America Prepared for Meme Warfare?” Vice, January 31, 2017. https://www.vice.com/en/article/xyvwdk/meme-warfare.
Steinbeck, Foster. “Virtual UGA Guest Lecture Hijacked with Death Threats, Racial Slurs Directed toward Professors.” The Red and Black. Accessed November 2, 2020.
Unicorn Riot: Discord Leaks. “VasilistheGreek (Discord ID: 270328712367570955).” Accessed November 1, 2020. https://discordleaks.unicornriot.ninja/discord/user/1445.

3

Affective Violations: Black People’s Experiences with Zoombombing

Chapters 1 and 2 of this book defined zoombombing, its historical precursors, and some of the ways that bombers use digital platforms such as 4chan and Discord to recruit collaborators by publicizing private links and encouraging strangers to disrupt meetings in the most offensive ways possible. As scholars such as Alex Stern and Jessie Daniels have noted, these and other anonymous platforms have long been organizing spaces for white supremacists and for red-pilling, the practice of recruiting new people into these movements through exposure to increasingly violent tactics. Zoombombing, which operates under cover of “pranking,” therefore serves as an important stepping stone to more harmful, directly racist behavior.

Zoombombing and other forms of online abuse are like a stool with three legs: the abuser, the platform, and the target or witness. This chapter turns to the third leg: those who have suffered abuse on Zoom and the price they have paid, and continue to pay every day, to keep this dysfunctional stool upright. If we have done our job as researchers, it should be clear by this point that zoombombing is more than a harmless prank or annoying behavior that Internet users have no choice but to accept as part of the price of usage.

To that end, this chapter gives voice to those who have been on the receiving end of this harassment. We conducted three in-depth interviews from July to September of 2020 with people who have been bombed since COVID-19: environmental ecologist Dr. Tiara Moore, digital content manager Angelique Herring, and Dr. Dennis Johnson, a newly minted PhD in Education. Their experiences provide much-needed perspective and lived detail about the specific forms of emotional and social damage that zoombombing produces and the racialized labor that combating it requires. This is a story of uneven awareness and of attempts to protect meetings from “party-crashers.” The many examples that we studied tell us that no one invites zoombombing or “asks for it,” but rather that determined harassers can overwhelm even the most carefully planned interactive events, and that properly securing or locking down events can reduce interactivity.

Dr. Tiara Moore

We interviewed Dr. Tiara Moore, an environmental ecologist and founder of A WOC Space, a community workspace centering the needs of women of color in professional settings, after reading about her experience with zoombombing on BuzzFeed.1 Moore’s was one of the first cases of zoombombing that we could find in the popular press. As we describe earlier in this book, Dr. Moore started the group to create community and fellowship after a series of events left her feeling a social lack: she was the only Black woman in her department and had experienced a variety of subtle (and not so subtle) racist incidents in her work spaces.

Moore was a postdoctoral fellow at the Nature Conservancy in Seattle, Washington, when the attack happened. Postdoctoral fellowships are high-stakes, high-stress positions in which labor and research are expected to be produced quickly and thoroughly, and their outcomes determine a scientist’s access to the labs and resources needed to complete their work. These pressures, on top of the COVID-19 pandemic, provided strong motivation for Moore to create a virtual community space where women of color could gather, collaborate, and unwind with activities such as women’s circles, happy hours, and game nights.

Dr. Moore began the Monday, March 30th women’s circle meeting on her iPad at 5:00pm. She was alone in the Zoom room, which she had created using her institutional account as a University of Washington employee. She was making final preparations for the meeting when a woman logged in to the room and greeted Dr. Moore. Though this person was not someone she recognized from any prior meetings, an unfamiliar face did not sound any alarms for Dr. Moore, who had advertised the meetings as open to all via Twitter with the catchphrase, “grab a drink, click the link.” Dr. Moore attempted a conversation with the new woman as she moved around her kitchen preparing for the rest of her group to log on.
After a few moments of awkward chatting, the woman said something strange: “[Y]ou should be careful because you could get hacked!” Instantly the room was flooded with faces shouting the “N” word and other slurs. Despite her initial shock, Dr. Moore tried to address the problem by leaving the call and opening it again on her computer. Seeing that she had returned, the harassment began again. Dr. Moore was bombarded with more people shouting the “N” word, questions about why she had closed the meeting, and a chat moving so quickly that she could barely process the content being thrown at her. Dr. Moore closed the meeting again and canceled the gathering for that day. She quickly took steps to protect future meetings, changing the room’s settings to require a password and adding a second host to help moderate the space. When asked, Dr. Moore was unsure whether she lost members with the new restrictions, but she felt that overall her numbers remained the same. However, a new sense of caution was added to what had been intended as an intimate, joyful space for her and her colleagues.

These were the early days of Zoom during COVID, and like most people, Moore had never heard of zoombombing; yet in some ways the vitriol was unsurprising. Dr. Moore argued that although it was shocking, she “didn’t let it defeat me.” The attack had a major impact on the labor required of Moore to keep the group alive and to fulfill its original purpose: to support women of color. A WOC Space altered the structure of future Zoom events; rather than receiving a public link on Twitter, individuals who wanted to join the group now needed to sign up for an email listserv to receive passwords to events. In addition, one of the three original organizers who acted as administrators for A WOC Space now had to serve as a moderator rather than leaving the event open. As an additional precaution, all meetings were recorded.

Following the zoombombing, Dr. Moore went to Twitter and tweeted at Zoom about what had happened, receiving no response. Later on, an account with a larger following also reported an attack, and Zoom responded, on Twitter, with a link to submit a formal complaint. Dr. Moore used the link to file a complaint with Zoom, initially receiving an automatic response. Three weeks after the filing, she received an email saying that her case was officially closed, offering no additional information.

Angelique Herring

Angelique Herring is the Digital Content Manager for the marketing team at Eckerd College in St. Petersburg, Florida. We interviewed her on July 21st, 2020. Angelique was zoombombed during an event hosted by Eckerd’s Afro American Society (AAS), a student organization that serves as a resource and support system for the college’s Black students. The Afro American Society held a Zoom event on June 9th, 2020, as part of a Vent & Share series during which Black students could speak with the Eckerd community about their experiences and struggles as Black students on campus. The event was widely advertised on both the AAS and the college’s Instagram accounts, which collectively have 10.9 thousand followers. Angelique explained that the goal was to host a “discussion on race at our campus…specifically for Black students” to share their experiences dealing with racism, but also to engage the larger campus community, who were unaware of this aspect of the Black experience. Her goal was also to provide a virtual space for Black students to check in about their mental health and general well-being following the public uprisings against police violence in the summer of 2020. The link to the event was posted on the Afro American Society’s Instagram page and included the Zoom event’s meeting ID.

Angelique logged into the event on her iPad, where a number of students, alumni, and faculty were already on the call. The student president of the Afro American Society had agreed to serve as designated moderator. As the event was starting in earnest, Angelique and the rest of the room heard a high-pitched voice saying, “Whoo, I’m going to show you something, something, something.” Angelique scrolled through the other meeting participants and saw the voice was coming from a profile whose video feed showed “a very petite white person with a little black bob.” After speaking in this voice on screen for a bit, this white person began masturbating on camera. After the initial shock of witnessing this unexpected event, the group left the call and returned, hoping that the disruptive person would leave without an audience to perform for.

This strategy was not successful: the same user continued to disrupt the meeting, using the audio channel to spout nonsense and expose themselves. During these interactions, the student president and Angelique were active in a group chat with Afro American Society members, warning them that something strange was happening and requesting that they delay entering the meeting. After a few rounds of this, the group collectively decided to leave the Zoom event and moved it to Google Hangouts, to which Eckerd has access through a paid Google Suites license for digital communication.

Angelique told us that many of her Eckerd staff colleagues saw this as a confirmation of their fears that Zoom was less secure than the Google Suites software they had originally been using. Google Meet required participants to have email addresses in the Eckerd College system to access meetings. As a result of the attack, the Afro American Society used the Google Meet platform for the remaining events in the series. Angelique noted that many students found Zoom easier to use, but after being zoombombed, the risks outweighed the rewards.

Angelique had experienced racial and gender harassment many times before in offline spaces, but as is so often the case with intersectional identities, she often found it impossible to tell whether the people who shouted expletives at her on the street were targeting her race, her gender, or possibly something else. She had also experienced frequent “rudeness” online from men that she described as “usually pretty subtle” but annoying nonetheless. Theorist and poet Cathy Park Hong describes this kind of racism as a “minor feeling” that doesn’t rise to the level of a headline-worthy event but skates under and around the line, like a nagging headache that isn’t yet a migraine.2 Similarly, Angelique described the zoombombing as a complex emotional event in which two feelings, normalization and shock, both profound anomie and paralyzing surprise, can “live together at once.” Having heard of zoombombing before did not prepare her for its reality; her first thought as it was happening was, “Oh my God, this is happening to me.”

Dr. Dennis Johnson

On August 17th, 2020, we interviewed Dr. Dennis Johnson, a Senior Program Manager at the EXP Opportunity Engine, a California-based nonprofit, and a lecturer at California State University Long Beach. Dr. Johnson spoke about his experience with zoombombing and the event during which it took place: his dissertation defense. He described the event as particularly meaningful because his path to doctoral education was not a straight one; he had graduated from high school with a less-than-stellar GPA and barely got into his undergraduate program, period. And yet he quickly accelerated scholastically as a quantitative scholar in his field, and on the day of his defense, he was the youngest person in his program and had been approved to graduate early. On March 26, 2020, his dissertation defense was zoombombed. He, along with the participants in the meeting, his family, friends, colleagues, and mentors, was forced to experience the shock and violence that zoombombing so often inflicts.

Dr. Johnson was feeling hot and somewhat nervous on the afternoon of his defense. He was sitting at his kitchen table, wearing a three-piece suit and preparing himself mentally to sum up several years of research in less than two hours. Dr. Johnson’s dissertation analyzed African-American experiences and work-based learning programs, and he had spent years gathering and analyzing data. Dissertation defenses are often public affairs, and over 40 attendees had registered for his, preparing to watch the culmination of all his hard work and innovation.

Dr. Johnson’s defense was very different from Dr. Moore’s and Angelique Herring’s Zoom meetings. Because it was an official academic event hosted by his university, its security and logistical work were handled by the university’s professional IT specialists. This event had the most extensive security measures of any we had heard of, and far more than the other people we interviewed could access. Dr. Johnson was not expected to moderate or host his own defense, nor were any of his faculty committee members; rather, he was meant to focus on presenting his research and on completing the final test of a new PhD. His defense was not publicly advertised; invitations to the Zoom defense were sent only to specific people and required an RSVP, and the IT staff at the university had pre-arranged a verification process to vet guests and prevent mishaps. They sent Dr. Johnson a list of names to preapprove before guests could be admitted to the room where the defense was held. The IT staff were well aware of Zoom’s dangers and prepared as well as they could to defend the event against bombing.

Dr. Johnson’s dissertation defense started without interruption. However, once he reached the portion of his presentation covering his historical analysis of anti-Black violence in the US education system, the atmosphere changed radically. As Dr. Johnson spoke over the PowerPoint slides he was sharing on his screen, he noticed that he had lost control of what he was seeing. The zoombombing had begun in earnest. A person he did not know began to draw images of a penis, write the “N” word, and post other pornographic images over his slides. Meanwhile, his committee members and IT staff frantically attempted to identify the source of the disruption. The zoombombing ended after about five minutes, either because someone was able to identify the source and remove them, or because the bombers had left of their own accord.

Dr. Johnson was able to complete his defense, but for weeks afterward he found himself mentally returning to the zoombombing despite his desire to put it behind him. This moment that he had looked forward to and prepared for so extensively had, in many ways, been stolen from him. The event catalyzed a new activist and scholarly project for Dr. Johnson on top of his existing ones. He immersed himself in zoombombing research and discovered that he was far from the only, or even the first, Black person to experience this new form of racist affective violation. When he was finally able to get Zoom to respond to his request for improvement, the company pointed to its rapid COVID-prompted growth as the reason it felt the issue was, in some ways, out of its hands. Some time later, Zoom changed its website to emphasize the expectation that users will provide their own IT staff to prevent these events, an irony considering that this is exactly what Dr. Johnson had at his disposal, and it made no difference.

In the end, Dr. Johnson’s campaign to raise awareness about the problem and advocate for solutions despite Zoom’s lack of responsiveness was motivated by people other than himself. His main concern was how these racist violations on Zoom might affect the children who must use the platform every day for school. He told us that his “biggest fear was that this was going to happen to a young person…in elementary classrooms, like third or fourth grade.” Dr. Johnson thought carefully about how best to effect change at Zoom and was also far more aware of the racial politics of platforms than most users. He chose to host his petition, which, as of this writing, has attracted over 36,000 signatures, on Change.org.

Like Angelique Herring, he described how two feelings, shock and resignation, can live together. Dr. Johnson explained that as a Black man in the United States, he had already experienced systemic and sustained racism in many aspects of everyday life. He speculates that his mother, grandmother, and other relatives were not surprised to see this racialized attack and were able to continue to attend and support him without comment because “we already knew that if we couldn’t be protected in society that there was no way that we were going to be protected online.”

Similarities to Analog Experiences

In keeping with the argument made throughout the rest of the book, our participants did not experience Zoom as a novel platform or zoombombing as a completely new experience; quite the opposite, in fact. Zoombombing fit quite comfortably alongside other experiences of racial harassment that these three people had endured in the past. While these zoombombing attacks were examples of digital disruption, our three interviewees linked them in various ways to their previous experiences of in-person racist harassment and shared the multiple emotional and technological strategies and skills that they had evolved for coping with them.

Racist zoombombing led our thoughtful and generous interviewees to reflect on their own identities as Black professionals, academics, women, family members, workers, organizers, and researchers, and on how the intersections between those identities produced different affective or emotional responses. Angelique Herring reflected on the ways the zoombombing led her to reconsider and reexamine her racial identity as a Black woman and her previous and current exposure to harm. Unprompted by us, she offered a story of having expletives screamed at her on the street while she was walking and compared it to her feelings during the zoombombing. Both moments required significant emotional processing.

Like Angelique Herring and Dr. Johnson, Dr. Moore was no stranger to virulent public racism. When the “Central Park Karen” story broke, she saw resonances between her experiences and those of Christian Cooper, the Black male birdwatcher who was harassed in New York’s Central Park by Amy Cooper, a white woman who refused to leash her dog. Though Amy Cooper, an investment banker, was later charged with lying to the police and was fired from her job, the event reminded viewers that police violence is not just the fault of the policing system or of “bad apples,” but rather of an ecosystem that values white people’s stories over Black people’s lives. The video went viral in the same timeframe as the rise of zoombombing, reflecting the viral spread of anti-Black racism on and offline. While conducting field research on a beach, Dr. Moore was similarly confronted by a white woman who told her to leave and threatened to involve the police. She described the moment as both disturbing and commonplace, just the sort of interaction that Black people experience in public every day.

Dr. Johnson’s story was painful to hear and demonstrated to us how zoombombing hurts people where they live and work. The event fundamentally changed the way he was able to celebrate and even accept his educational achievement. The purpose of the dissertation defense is not only to validate the quality of a researcher’s major project; it is also a rite of passage meant to signal to the community and to the candidate that they are now an expert in their field and on their topic. Despite his having moved on to a fulfilling new job and passing his defense with flying colors, he is understandably upset and unsatisfied with Zoom because neither the company nor his university has done something simple, straightforward, and technology-free: apologize. We concur with his speculation that this omission may have been motivated by the platform’s and the institution’s desire not to be sued, and there is no doubt in our minds that this apology was deserved. Dr. Johnson is a resilient and resourceful scholar and activist who spent significant amounts of time remediating the problem as best he could: engaging in labor to draw attention to the widespread issue that zoombombing had become, supporting all those it affected, and offering advice to the company that readily benefited from his labor. He performed this work while enduring the lasting effects of zoombombing. As he said, “when people said ‘doctor,’ for the next few months…all I saw was the word, ‘n-----’.”

Zoom’s Spatial Context Collapse and the Feeling of Intimate Racial Violence

COVID-19’s spread has significantly reduced the number of spaces that one can “safely” inhabit in one’s personal and professional life. For many people, personal domestic spaces such as kitchen tables, couches, and beds that were once reserved for leisure or for briefly catching up with work have become dedicated professional spaces. As many people have been made to work from home, personal and professional lives have flattened even further; for those who were already freelancing or working from their homes, there is no respite or retreat from the possibility of work.

This is one of zoombombing’s more nefarious violations. As our interviewees mentioned to us, after they had been zoombombed, they still had to sleep in that bed, sit on that couch, eat at that table. Zoombombings are not just an attack on the political, communal, or celebratory work that makes life feel vital and fulfilling; they bring those attacks directly into your home. One’s living space comes to hold not only the promise and experience of comfort and safety but also the traumas of racial violence. While our interviewees noted that online spaces and physical spaces are not the same, they also made a point of drawing affective connections between the two; for them, these were functionally the same tactics, just in a new space.

Previous experiences with zoombombing had permanently changed how our informants experienced their own feelings of self-assurance and privacy in their homes and led to some self-reflexive moments about place, space, and safety. Dr. Dennis Johnson pointed out that zoombombing had transformed his experience of his own personal space: while speaking to us, he was aware that he was sitting at his kitchen table, the same place where he had been bombed while defending his dissertation. Dr. Tiara Moore’s attack began not at her office but in her kitchen, on her iPad, while she was making a drink. During her interview with us, Angelique was walking around her apartment, the same place where she had watched her panel get zoombombed. These overlaps manifest both in physical spaces and in internal monologues. Dr. Tiara Moore had intended to create a safe virtual space during a time when it felt especially needed, and the violation of that space through zoombombing was so severe that it limited future meetings, making them more regulated and guarded where they were once intended to be open and welcoming.

The Labor of Micromoderation: How People of Color’s Entrepreneurial Reparative Work Benefits Zoom and Other Platforms

Earlier in this chapter, we described Dr. Moore’s experiences volunteering her labor as an unpaid organizer to create intellectual and emotional resources for Black women in STEM. For her, Zoom started out as a workable fix to an untenable situation, our social isolation and loneliness during an interminable-seeming and open-ended lockdown, but it produced a new set of serious problems: a new set of tasks she needed either to perform herself or to assign to someone else, on the same screen where she was already laboring on her research. Science is a collaborative enterprise and professional networks are key to advancing; in the absence of conferences or poster sessions, Zoom meetings take on a new resonance for junior researchers who are building careers, especially when they are conspicuous minorities in their fields. Dr. Tiara Moore spent time and energy negotiating Zoom’s technical affordances and coming up with new ways to modify, experiment, and eventually recruit help from colleagues to gatekeep A WOC Space and protect it from zoombombers. Unfortunately, the labor of moderation precludes participation: keeping the meeting safe and monitoring the door makes it very difficult to be part of a conversation.

Zoom puts the onus on users to implement the platform’s safeguards, a daunting task considering how often its terms, interfaces, and affordances change and how quickly bombers find new ways to get around them. The history of video game toxicity has taught us that no amount of automated filtering or even “live” human moderation can change a culture that has been allowed to harbor harassment and overt cruelty toward specific racial and gendered groups. Hackers have always deeply enjoyed the creative challenge of getting around new forms of security, both as a sign of their own technological prowess and as part of gaming traditions such as “leetspeak,” which uses alphanumeric characters to fool NLP filters.3

The problem isn’t that Zoom can’t fix it, or even that we as users don’t have the time or capacity to fix it: the problem is that we have gotten used to feeling powerless and disengaged when the Internet ruins elections, turns out search results that identify Black people as gorillas, and produces AI that tells Asian people to open their eyes for digital cameras. All of this is dismissed as the Internet’s basic nature, and most of us can’t remember a time when it was ever different.

Our interviewees represented a spectrum of approaches, outcomes, and reactions: Angelique Herring continued to use the platform but did so more cautiously; Dr. Tiara Moore poured time and energy into protecting and nurturing her Zoom meetups and into talking to journalists to spread the word about the problem in public; and Dr. Dennis Johnson approached the company directly to seek redress and mobilized awareness through an online petition. Each of these people also chose to spend their time explaining what zoombombing meant to them, another sign that they see it as a serious problem worth their energy, focus, and attention during a historical moment when all three are in severely short supply.
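The “leetspeak” tactic mentioned above can be made concrete with a minimal sketch of our own (it is not drawn from this book and is not Zoom’s actual moderation code; the blocklist word and substitution table are hypothetical placeholders): an exact-match keyword filter is trivially defeated by swapping letters for look-alike alphanumeric characters.

```python
# Hypothetical stand-in for a moderation blocklist (a neutral placeholder word).
BLOCKLIST = {"slur"}

def naive_filter(message: str) -> bool:
    """Return True if the message contains a blocklisted word (exact match only)."""
    words = message.lower().split()
    return any(word in BLOCKLIST for word in words)

# A tiny leetspeak substitution table: letters replaced with look-alike
# characters, so the disguised word no longer matches the blocklist entry.
LEET = str.maketrans({"s": "5", "l": "1", "u": "v"})

disguised = "slur".translate(LEET)  # becomes "51vr"
print(naive_filter("this message contains slur"))       # True: blocked
print(naive_filter(f"this message contains {disguised}"))  # False: evades the filter
```

Robust filters try to normalize such substitutions before matching, but as this chapter argues, the deeper problem is cultural rather than technical: evasion and filtering escalate together.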
As we’ve discussed earlier in this book, the Internet has always been a precarious space for people of color and women, and for Black folks in particular. Even as marginalized communities push back and, against numerous odds, build networks of community and joy, the Internet continues to be violent in various ways. This inherently misogynistic and racist structure places the onerous task of moderation on those most affected. Moderation is a form of labor, mostly unpaid, which forces the burden once again onto those historically required to do unseen and uncompensated work. People of color and women have been dedicating their time, energy, and psychic resources to the work of unpaid micromoderation since the early days of the Internet, creating productive spaces in the midst of trauma.4 In each case, these users felt obligated to do the work of moderation that makes a platform usable; they felt responsible for taking care of their people and their communities: their students, friends, colleagues, and even strangers. In other words, they became volunteer moderators in the service of their communities, and Zoom was the beneficiary.

Notes
1 Hernandez, “A Zoom Meeting For Women of Color Was Hijacked By Trolls Shouting The N-Word.”
2 Hong, Minor Feelings.
3 Phillips, This Is Why We Can’t Have Nice Things.
4 Nakamura, “The Unwanted Labour of Social Media”; Roberts, Behind the Screen.

References
Hernandez, Salvador. “A Zoom Meeting For Women Of Color Was Hijacked By Trolls Shouting The N-Word.” BuzzFeed News, April 2, 2020. https://www.buzzfeednews.com/article/salvadorhernandez/zoomcoronavirus-racist-zoombombing.
Hong, Cathy Park. Minor Feelings: An Asian American Reckoning. New York: One World, 2020.
Nakamura, Lisa. “The Unwanted Labour of Social Media: Women of Colour Call out Culture As Venture Community Management.” New Formations: A Journal of Culture/Theory/Politics 86, no. 1 (December 16, 2015): 106–112.
Phillips, Whitney. This Is Why We Can’t Have Nice Things: Mapping the Relationship between Online Trolling and Mainstream Culture. Cambridge, MA: The MIT Press, 2015. http://www.jstor.org/stable/j.ctt17kk8k7.
Roberts, Sarah T. Behind the Screen: Content Moderation in the Shadows of Social Media. New Haven: Yale University Press, 2019.

4

Conclusion

Moderating Zoom from Below

Zoombombing is a complex issue that demands a collaborative counter-response. Zoombombers who wield racist and pornographic speech and images to harm users create dangerous and traumatic spaces for everyone, especially for Black people and women of color. They put enormous effort into using their existing networks and technical ingenuity to find their way past passcodes, waiting rooms, and ended calls; therefore, Zoom and other platforms must work equally hard to make their platforms safer and to be more forthright about specifically identifying racism as it occurs through their services.

Which harm-reduction methods would move the platform in the right direction? We leave the policymaking to policymakers, but as humanistic critical race and digital studies scholars, we urge a rethinking of digital labor and responsibility. Holding some of the richest and most dominant companies responsible for user safety, rather than demanding that individual users jump through hoops to craft protection that may never come, is long overdue. At the same time, these initiatives should be undertaken in conversation with the individuals actually using the platform, not without them. While Zoom has options for teams, businesses, and enterprises, our research has shown that harassment is always experienced on an individual level, and it is impossible to measure harm without collaborating with the person harmed. The tech industry is likewise overdue for an update in terms of race, gender, and diversity in its workforce. We see zoombombed people as valuable collaborators who can create meaningful reform and moderation efforts; indeed, it is hard to imagine how change might occur without them.

Creating safety on Zoom requires a shift in consideration. Rather than operating with a top-down model of safety that prioritizes institutions, Zoom must consider a bottom-up model of safety that values individual users, both as people with a range of experiences and knowledge to draw from and as workers belonging to institutions. Since zoombombing harms individuals much more than it does institutions, individual users are the consumers who should be considered when framing the company’s future responses to harassment.

Zoom’s institutional focus prevents a clear view of some key issues. First, individuals without institutional or paid premium licensing for a “small team, small/big business, or a large enterprise[s]” have no access to key Zoom options. The pricing packages for Zoom licenses reveal that users who rely on the free Personal Meeting package must make do with smaller, time-limited group meetings and without cloud storage for meeting recordings or interactions with Zoom staff.

Second, being under the banner of an institution willing and able to pay for licensing does not guarantee the safety of the people on those platforms. All the individuals we interviewed in Chapter 3 were using Zoom through licenses they had received via their institutions. Dr. Johnson even had the security that Zoom recommends for events like his dissertation defense. And yet here they are: featured stories in a book on zoombombing. In addition, both Dr. Moore and Dr. Johnson had to use social media platforms to receive even a minimal response from the company. This communication came not through an Information Technology department or from one of the company’s Content Success Managers but, in Dr. Moore’s case, via a message on Twitter that simply indicated a closed case.1 Making a fuss on Twitter or through Change.org got them basic recognition when they were unable to obtain the protections that supposedly come from institutional connections.
As we discussed in Chapter 3, the communication and subsequent response were seriously lacking and extended the frustration and harm that the original event caused. Third, the safety options on offer often alter the spaces and communities they are meant to protect in ways that detract from their original goals. Restructuring Zoom events and calls after a zoombombing makes them fundamentally different spaces, not only in feeling but in function. Drawing again on Chapter 3, after her zoombombing attack, Dr. Moore was forced to turn a fluid space of community building and consciousness raising into a more structured environment that relied on her and the other leaders of A WOC Space to work as Zoom security guards and unpaid

moderators. The very nature of this space had to transform once its leaders were forced to take moderation into their own hands. Knowing that an institutional model of security measures cannot offer the kind of safety our interview subjects needed, what does a more individual focus offer us? Let’s look at one more instance of zoombombing to consider a range of options. On October 28, 2020, Dr. Lorgia Garcia Peña of Harvard was a guest lecturer for a seminar on violence against women of color hosted by the University of Georgia (UGA). The lecturer and the nearly 50 participants, many of them women of color, were zoombombed. Event co-organizer and UGA assistant professor Sharina Maillo Pozo’s home address was put on the screen. The attackers shouted racial slurs, claimed to be part of the Ku Klux Klan, threatened participants’ family members with violence, and showed a dismembered body on the screen. The entire attack took about five minutes. To completely remove those participating in the attack, the hosts were forced to remove all participants who didn’t have a UGA email. This horrific incident happened even though the event organizers took the steps that Zoom suggests to mitigate or stop zoombombing. According to The Red and Black, a UGA student newspaper,2 the organizers and Peña anticipated that the event might be zoombombed because of its subject matter and their own identities, and took all of the precautions available to them. They did not share the event on social media, they set up multiple hosts, and they required a waiting room.
The only advertising the event received was through instructors’ classes, which led some who experienced the attack to question the safety of their classrooms, since the culprits might have been students or connected to students; but because there is no mechanism for seeking out the attackers, the victims are left in the dark about who was responsible and how their preparations failed to protect them. The organizers wanted to use the webinar function to enhance their security but were unable to do so because this function sat behind a paywall that Peña and the others could not get past. Even if that had been an option, the webinar function carried the risk of losing the interactivity with the audience that is so vital to events such as this. Setting the lecture up as a webinar would have stripped participants of the ability to interact with the speakers, either audibly or through their screens. Webinars are meant to be

livestreaming events and are seemingly Zoom’s most secure option. In reality, however, many users like Peña must weigh the risks against the rewards of having open access and interactive learning spaces, and had they had the option, many who have been attacked would have chosen greater security. The webinar function, beyond being simply unattainable for many, limits what Zoom has to offer in terms of community. In our current pandemic moment, videoconferencing has become our link to work, communication, and long-distance intimacy. When that space is threatened, it is damaging, and the shift to the “livestream” function strips away the communal side of these spaces. In the webinar, there is a disconnect between the speaker and the audience. Interaction is severely limited in the name of functionality and security. This works for some formats but is impossible for many others. Dr. Moore’s group could not exist within the webinar function, and so even if it had been offered to her, it would not have provided the protection she needed. Let’s look back at the key features that defined Dr. Peña’s zoombombing. She and the other organizers were left without platform features to protect themselves or other members of the meeting from the threat of outside invaders: the first issue of the institutional model. The second issue of the institutional model is its inability to address internal compromises within networks. Even if removing the attackers by requiring UGA emails had worked, it offered no protection if the zoombomber had come from within the UGA network; the strategy does nothing to address the core issue, or the actual threat, when the attacker’s link to the meeting comes from within the institution itself. The third key issue is that the webinar function is not the saving grace for all academic or similar spaces of labor.
The webinar function works well for replicating large lectures or special convocations, but what about classroom visits? What about other spaces in which community members want something more personal and casual than a lecture? The safest option would have made this talk, like A WOC Space, unfit for its purpose.

From Parasitism to Collaboration

A refocus on satisfying individual paying and non-paying customers would allow Zoom and other tech companies to move away from a parasitic model that defines Zoom as a generous host and its

individual users as supplicant guests.3 Zoom and other videoconferencing platforms are inescapable for people who need to connect and work; they are the system through which COVID life is navigated. While the institutions that license Zoom are seen as essential parts of the system, individual users, both affiliated and unaffiliated with a larger institution, are seen as irksome or parasitic rather than as smaller but essential pieces of this digital ecosystem. Users’ pings of complaint gain acknowledgment only when they are too public and too loud to ignore, and even then the actions taken are less than satisfactory. Dr. Moore’s and Dr. Johnson’s testimony would have been taken seriously when they reported their attacks if they had been seen as potential collaborators rather than unhappy customers or parasites. An individual model also allows companies to be in meaningful conversation with people who are zoombombed. Our interviewees received vague emails about investigations being underway or being closed with no further details in response to their complaints and offers to help; these characteristic forms of communication demonstrate how institutional responses are privileged over individual ones. They convey the message that these individuals are a nuisance rather than consumers struggling to use and improve the product they are engaged with. It is possible to create a different climate, one that includes these users in the process of coming up with alternatives and strategies rather than writing them out of the conversation.4 It is certainly not a lack of ideas that prevents individuals from contributing to this process. Dr. Johnson’s petition offers quite a few insights. Given the detail of that petition and the remedies it seeks, its message is far more than an attempt by Dr.
Johnson to complain and publicly shame Zoom; instead, it demonstrates a commitment to a collective mission to make this platform, and other platforms on the Internet, into safer, more considerate places for all users, not only for the most prominent and profitable clients. While this model focuses on the individual, it is not a claim that only individuals would benefit from this shift. In fact, taking seriously the particularities of individual instances of zoombombing allows for a safer community of people within a specific licensing network like a university, and by association would strengthen the safety and support of the larger pool of individuals who form professional, academic, or personal communities on Zoom. If the most basic user of the platform feels safe, or at least feels like there are plausible avenues for reporting or attaining assistance that would

produce well-communicated results, everyone on the platform would be much safer. Zoom has begun altering the way it speaks about zoombombing, moving from calling it “party-crashing” to gesturing toward the attacks as something harmful. And yet it has become increasingly evident that this issue will not simply go away, as it arises from and contributes to the longstanding problem of hate speech on the Internet. User protection has been shoved behind an expanding paywall, one that encourages institutionalized relationships and dissuades personal use while allowing for the continued consolidation of the monopolizing force that Zoom has become in the COVID era. The move from no usage to several hours of usage a day has left little affective overhead for critique of the platform itself. We often hear talk of how tired “we” are of “Zooming,” but as a larger society we have not grappled with the exhaustion that continued racial and gendered attacks inflict on individuals targeted for their personhood. There is now ample evidence that “neutral” platforms produce harassment and abuse, and that neutrality is impossible because digital infrastructures express social values and priorities.5 In order for Zoom to pivot away from neutrality toward responsibility, it needs anti-racist policies, values, and participation by the people of color who are most harmed by it. One example of the lack of actual neutrality under which platforms such as Zoom operate is their quiet moderation of content that they deem politically sensitive or critical of their policies.
In an article entitled “Zoom Deleted Events Discussing Zoom ‘Censorship’,”6 Jane Lytvynenko quotes Andrew Ross, a professor at NYU who helped organize a conference meant to criticize Zoom’s cancellation of an event featuring Palestinian rights activist Leila Khaled: “Everyone working in higher education right now depends on Zoom and we cannot be in a position of allowing a corporate, third-party vendor to make these kinds of decisions,” Ross said. “It’s simply unsustainable.”7 When the organizers moved the event to Google Meet, it was “trolled,” that is to say, disrupted by abuse. In other words, it was zoombombed without Zoom. Zoom cancelled the “We Will Not Be Silenced: Resisting the Censorship of Leila Khaled, Palestinian Voices…and the Online College Classroom” event because it believed the event violated its Terms of Service and Community Standards; Khaled, a Palestinian refugee, is a controversial figure who has been called a terrorist by her critics. Zoom’s choice to “moderate” her voice was a political

one, couched though it was in the legalese of “community standards.” However, as we’ve learned from talking to Dr. Johnson, Dr. Moore, and Angelique Herring, these policies are not enforced equally. Those who used the platform to zoombomb clearly did so outside of the “community standards” that Zoom has put forth, yet little to no effort has been made to moderate or deplatform those causing actual harm. Zoom has very quickly become the core videoconferencing platform of the COVID era. It has become the primary channel of communication for institutional users, for private collaborations, and for the maintenance of communities. The people most affected by zoombombing belong to the same groups that have historically been attacked for their identities and for the threat they pose to white and male supremacy, both on and offline. We need more research on Zoom and its effects upon social life, inequality, and racial justice; though we have focused specifically on toxic uses of the platform, there are many examples of racial organizing and the creation of new possibilities that deserve further study. We also badly need quantitative research on how widespread zoombombing has become over time; as critical qualitative scholars, we chose to analyze specific cases in detail in order to get at the affective, personal, and individual dimensions of teleconference toxicity, but we would have found data about zoombombing’s national and global scale and reach extremely helpful in our research. Asking platforms to be transparent about their moderation policies, to apply them equally, and to confront zoombombing head-on might pose an existential challenge to companies that platform not just life online but life itself. Keeping themselves technologically functional despite massive demand has been the key to Zoom’s and other successful digital companies’ growing market share and social identity.
However, this focus does not produce a socially functional or fair outcome for users. We see an opportunity to do right by the Black users and the women whose zoombombing experiences have activated them, by engaging them as policymakers and consultants. Black women have played major roles as unpaid moderators on Twitter and other platforms.8 It’s time to compensate and elevate their contributions.

Looking Forward

This study has been a qualitative snapshot of how zoombombing has played out in the first months of COVID-19 in the United

States, a limit due both to the time span of our data and to the time constraints we had in writing this book. We write from the midst of COVID’s second wave, which in the US has already eclipsed the first. There is much work to be done to tell the full story and understand the full scope of zoombombing as a racist practice. Survey research on how zoombombing affects Black communities and networks is a logical and critical next step: scholars such as Dr. Dennis Johnson have already contributed expertise and experience to platform improvement, and we see great potential in longer-term, well-paid collaboration. Dr. Johnson is a central figure, organizer, and researcher in the story of racial harassment on videoconferencing platforms, and much of our critique is indebted to his work in seeking accountability from Zoom and the other institutional bodies involved in his own and other zoombombing attacks. It should be clear by now that we strongly believe platform safety needs to be a higher priority. Among the major players in the zoombombing story, Zoom remains one of the least active participants. While online harassers are creating networks on Discord and other chat rooms to better coordinate attacks, individual users are frantically trying to keep up with security measures to maintain their networked spaces of community, at the expense of their general feelings of safety and intimacy. An ambitious and energetic few have been able to rally enough time and personal resources to organize campaigns for accountability; rather than offering public statements calling racist users “party-crashers” and updating its terms of service and tutorials, Zoom would do well to listen to them.
It seems to us that the people who use the platform are working harder to maintain ongoing conversations about this issue and to expand platform safety than those who actually have the resources and access to make these changes. We don’t say this merely to disparage Zoom, as we realize that all large digital platforms tend to work this way, but rather to highlight that there are steps that can be taken beyond the ones recommended to and implemented by the users. Digital platforms are terrible at moderation during COVID not because the pandemic altered them but because they have always been so.9 In the Internet’s early, pre-platform days, moderation was not a priority, since digital communication was perceived as

“neutral.”10 Like other platforms, Zoom benefits from “the two sides of the harbor, the ‘right but not the responsibility’ to police their sites as they see fit.”11 Zoom has enjoyed freedom from the increased scrutiny and backlash that platforms such as Google and especially Facebook have confronted in the wake of the 2016 election tampering and widespread critiques of misinformation and hate speech. Zoom benefits by disidentifying with racist practices like zoombombing. Therefore, we must challenge the monopoly status that Zoom and other companies maintain in their particular realms. We see an opportunity for platforms to be leaders in addressing the unchecked, overt, and widespread racist and misogynistic harassment that continues to occur every day in the new normal of virtual classrooms, workplaces, and other gatherings during COVID. 2020 has been hard on everyone but a true disaster for Black people, as police violence, differential death tolls, and economic hardship have affected them in much greater measure. Like it or not, we have all had to become users who must contend with the world as we know it becoming more digitally intimate than ever before, in the best and worst ways. It is clear after almost a year under COVID that zoombombing is not going away, that bombers are not getting tired of it, that women and people of color are almost always its targets, and that its effects upon witnesses are serious and long-lasting: zoombombing is far more than a prank or silly joke. Zoombombing is part of a larger legacy of racist and sexist harassment on and offline, and addressing it from both a technological and an interpersonal perspective can move the needle from magical thinking to informed realism.

Notes

1 A dedicated Content Success Manager is one of the benefits provided by the Large Enterprise-Ready package.
2 Steinbeck, “Virtual UGA Guest Lecture Hijacked with Death Threats, Racial Slurs Directed toward Professors.”
3 Fisher, The Play in the System.
4 Schoenebeck, Haimson, and Nakamura, “Drawing from Justice Theories to Support Targets of Online Harassment.”
5 Gillespie, Custodians of the Internet; Noble, Algorithms of Oppression.
6 Lytvynenko, “Zoom Deleted Events Discussing Zoom ‘Censorship’.”
7 Lytvynenko, “Zoom Deleted Events Discussing Zoom ‘Censorship’.”
8 Nakamura, “The Unwanted Labour of Social Media”; D. Clark, “DRAG THEM.”

9 Gillespie, Custodians of the Internet.
10 Gillespie, Custodians of the Internet, 25.
11 Gillespie, Custodians of the Internet, 44.

References

Clark, Meredith D. “DRAG THEM: A Brief Etymology of So-Called ‘Cancel Culture.’” Communication and the Public, October 16, 2020. https://doi.org/10.1177/2057047320961562.
Fisher, Anna Watkins. The Play in the System: The Art of Parasitical Resistance. Durham, NC: Duke University Press, 2020.
Gillespie, Tarleton. Custodians of the Internet: Platforms, Content Moderation, and the Hidden Decisions That Shape Social Media. New Haven, CT: Yale University Press, 2018.
Lytvynenko, Jane. “Zoom Deleted Events Discussing Zoom ‘Censorship.’” BuzzFeed News, October 24, 2020. https://www.buzzfeednews.com/article/janelytvynenko/zoom-deleted-events-censorship.
Nakamura, Lisa. “The Unwanted Labour of Social Media: Women of Colour Call out Culture As Venture Community Management.” New Formations: A Journal of Culture/Theory/Politics 86, no. 1 (December 16, 2015): 106–112.
Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York: New York University Press, 2018.
Schoenebeck, Sarita, Oliver L. Haimson, and Lisa Nakamura. “Drawing from Justice Theories to Support Targets of Online Harassment.” New Media & Society, March 25, 2020. https://doi.org/10.1177/1461444820913122.
Steinbeck, Foster. “Virtual UGA Guest Lecture Hijacked with Death Threats, Racial Slurs Directed toward Professors.” The Red and Black. Accessed November 2, 2020.

Index

abuse 4, 8, 21, 26, 58
abuse, online 15, 41
affect 2, 4, 41, 48–49, 58–59
affordance 50–51
anti-blackness 2, 7, 9, 10–11, 25, 34, 36, 46, 48
black communities i, 9, 60
black parents 7
black people i, 9, 16–18, 21, 24, 41, 48, 51, 53, 61
black students 7, 18–19, 44
black women 8, 25–26, 48, 50, 59
community standard 58–59
content moderation 2, 23, 26, 52, 58
COVID i, 1–2, 4–7, 10, 16, 20–21, 29–31, 34–35, 37, 41–43, 47, 49, 57–61
digital politics 29
discord 15, 17–18, 22–23, 30–31, 41, 60
facetime 5
far-right i, 2, 10, 15–16, 29, 32–33, 35
fascism 29–30, 32
4chan 10, 15, 17, 19, 22–23, 26, 31, 41
gamer 15, 24–25
gamergate 25, 37
gaming 4, 10, 15–16, 24–25, 38, 51
Google Hangouts 44
Google Meet 21, 45, 58
Google Suites 9, 44–45
grief 24–25
harassment 7–10, 15–16, 18, 20, 23–25, 33–34, 36, 41–43, 45, 51, 53–54, 58; online i, 10, 16, 20, 29–30, 32, 34; racial i, 2, 4, 6, 18, 20, 48, 60–61; sexist i, 6, 9, 18, 45, 61
hate speech i, 2, 9–10, 23, 34, 58, 61
host 2, 15, 32, 43–44, 46–47, 55–56
infrastructure i, 1, 4, 30, 35, 58
institutions 6, 20–21, 31, 36–37, 42, 49, 54–60
Internet culture 2, 4, 34
intimacy 2–3, 6, 8, 19, 24–25, 29–30, 36–37, 43, 49, 56, 60, 61
isolation 1, 7, 10, 20, 25, 50
KPop 36
labor vii, 2, 4, 20, 21, 42–43, 49–51, 56; digital 53; racialized 41
licensing 5, 44, 54, 57
meme 29, 32–33, 35, 37
memetic warfare 29–32, 34–37
micromoderation 50, 52
misinformation 34, 61
misogynoir 25–26
misogyny 2, 10–11, 16–17, 30–36, 51, 61
normalization 7, 9, 45
pandemic vii, 1–4, 6–8, 15, 21, 29, 42, 56, 60
password 23, 43
people of color (POC) i, 2, 15–16, 23–24, 26, 31–32, 50–51, 58, 61
Pepe the Frog 32, 34–35
privacy 6, 41, 50, 59
redpill 2, 32–34, 41
regulation 8, 23–26, 50
safety 3–4, 11, 31, 36, 49–50, 53–58, 60
shock 2, 21, 24–26, 30, 32, 35–36, 43–47
Skype 5, 21
TikTok 36
toxic masculinity 17
troll i, 2, 8, 11, 24–25, 30–36, 58
Twitter 2–3, 9, 15, 18, 23, 30, 31, 42–43, 54, 59
Utopia 1, 11, 20–21, 32
videoconference 1–2, 4–6, 10, 16, 21, 30, 56, 59–60
violation 15, 23, 47, 49, 58; affective 41, 47
waiting room 53, 55
webinar 55–56
white supremacy 8, 10, 15, 19, 25, 30, 33–36, 41, 59
women of color (WOC) 2–4, 15, 25–26, 42–43, 50, 53–55, 56