Theorising Future Conflict: War Out to 2049 [1 ed.] 9781032113661, 9781032113654, 9781003219576


English | Pages: x, 241 [253] | Year: 2024


Table of contents:
Cover
Endorsement
Half Title
Series Information
Title Page
Copyright Page
Table of Contents
Acknowledgements ix
1 Introduction: A Mug’s Game 1
Apocalyptic International Politics?
Trends in the Liberal Way of Future Warfare
Outline of the Book
Bibliography
Part One War and Peace in the Twenty-First Century 25
2 The Liberal Way of Future Warfare 27
Liberal Futures of War and Peace
Bibliography
3 The Lethal State of Modernity 44
Zygmunt Bauman: Modernity and Violence
Paul Virilio: Modernity, War and Acceleration
Concluding Remarks
Bibliography
Part Two The Tactics, Terrains and Technologies of Future Warfare 65
4 The Impure 1: On the Sub-Threshold of Modernity and War 67
Impure Wars
The Gerasimov Doctrine and Unrestricted War
The Russian Way of Future War?
Unrestricted Warfare and the Intelligentisation of Warfare
The Liberal Way of Future Warfare: Mosaic Warfare?
Concluding Remarks: Impure War and Strategies of Deception
Bibliography
5 The Impure 2: Glitches in the Digital War Machine—The (Hu)Man, the State and (Cyber)War 90
Crimes of the Future
The State of Cyber Play
The Informational Dimension
The Infrastructural Dimension
The Uncertainty of Cyberwar
The (Hu)Man, the State and (Cyber)War
The Human: The State of the Hacker
The State: Sub-Threshold Cyber and International Relations
War: A War On Hackers? War Over Cyber?
Concluding Remarks: Beyond Cyber
Bibliography
6 The Granular 1: The Changing Scale in Conflict 114
War, Modernity and the Changes in Scale
The Future Megacity Wars in (Un)Granular Times
Three Scenarios On the Granularity of Future Conflict
Protopian Mogadishu 2038
Mogadishu Raid 2040: Lethal Empowerment in the Urban Grey Zone
Drones Over Aleppo 2042: Terrorism in an Age of AI
Bibliography
7 The Granular 2: The Granularity of Future War 138
The Three Block Robot War
The Future of Interstate War in Granular Times
Concluding Remarks
Bibliography
8 The Machinic 1: The Battle Angels of Our Better Nature 158
Remote Control and Modernity
Drone War and the Blade Runner State
Protopian Drones?
Concluding Remarks
Bibliography
9 The Machinic 2: The Great Accelerator? AI and the Future of Warfare 182
Security and AI in the Time of the Shoggoth
War in a Time of AI Battle Angels and Blade Runner States
The Banality of Artificial Intelligence?
Concluding Remarks: War and Security in a Time of Multiplication
Bibliography
10 Cyberpunk International Politics?: Enter the Shimmer 211
Causes of Futures Wars
The Future of Decision-Making
China and Future Wars
Climate Change and Future Wars
Enter the Shimmer
Concluding Remarks: Future Worlds of Fantasy and International Politics
Bibliography
Index 228

‘Theorising Future Conflict takes us on a thrilling journey into the cyberpunk politics of war and security that the near future may well have in store. Weaving a path amid sci-fi dystopias, liberal theory, and war studies, Mark Lacy offers us a sobering assessment of how global security (and society) may be transformed this side of 2049. Exactly the kind of free and clear thinking we need in a moment too often in hock to glib optimism or dark dystopias.’ Ruben Andersson, University of Oxford, UK

‘Lacy takes us to a future of “shimmers”, “hybrids” and “cyborgs” that speaks to our present. How did we reach it? What alternative futures were discarded on the way? Instead of ready-made answers, this book proposes materials, tools and spaces to speculate, face and form our futures.’ Anna Leander, Geneva Graduate Institute, Geneva, Switzerland

‘Drawing inspiration from science fiction, Mark Lacy’s Theorising Future Conflict is an innovative and theoretically sophisticated meditation on the possible trajectories of armed conflict.’ Duncan Bell, University of Cambridge, UK

THEORISING FUTURE CONFLICT

This book explores the changing tactics, technologies and terrains of twenty-first century war. It argues that the world in 2049 is unlikely to look like the climate change/artificial intelligence (AI) dystopia depicted in Blade Runner 2049; but nor will it be a world where conflict and war have been transformed by a ‘civilising process’ that eradicates violence and conflict from the human condition. 2049 is also the year that the U.S. Department of Defense has suggested China will become a world-shaping military power. All states will be engaged in ‘arms races’ across a variety of new tools and technologies—from drones and robotics to AI and quantum computing—that will transform politics, economy, society and war. Drawing on thinkers such as Zygmunt Bauman and Paul Virilio, the book suggests that future war will be shaped by three broad tendencies that encompass a range of tactics, technologies and trends: the impure, the granular and the machinic. Through discussions of cybersecurity, urban war, robotics, AI, climate change, science fiction and new strategic concepts, it examines how these tendencies might evolve in the different geopolitical futures and types of war ahead of us. The book provides a thought-provoking and distinctive framework through which to think about the changing character of war. It concludes that for all the novel and dangerous challenges ahead, the futuristic possibilities of warfare will likely continue to be shaped by problems familiar to students of international relations and the history of war—albeit problems that will play out in geopolitical and technological contexts that we have never encountered before.

This book will be of much interest to students of critical war studies, security studies, science and technology studies and international relations in general.

Mark Lacy is a senior lecturer in the Department of Politics, Philosophy and Religion at Lancaster University, UK, and author of Security, Technology and Global Politics: Thinking with Virilio (2014) and Security and Climate Change (2005).

Routledge Studies in Conflict, Security and Technology
Series Editors: Mark Lacy, Lancaster University; Dan Prince, Lancaster University; and Sean Lawson, University of Utah

The Routledge Studies in Conflict, Security and Technology series aims to publish challenging studies that map the terrain of technology and security from a range of disciplinary perspectives, offering critical perspectives on the issues that concern publics, business and policymakers in a time of rapid and disruptive technological change.

Militarising Artificial Intelligence: Theory, Technology and Regulation
Nik Hynek and Anzhelika Solovyeva

Understanding the Military Design Movement: War, Change and Innovation
Ben Zweibelson

Artificial Intelligence and International Conflict in Cyberspace
Edited by Fabio Cristiano, Dennis Broeders, François Delerue, Frédérick Douzet, and Aude Géry

Digital International Relations: Technology, Agency, and Order
Edited by Corneliu Bjola & Markus Kornprobst

Theorising Future Conflict: War Out to 2049
Mark Lacy

For more information about this series, please visit: https://www.routledge.com/Routledge-Studies-in-Conflict-Security-and-Technology/book-series/CST

THEORISING FUTURE CONFLICT War Out to 2049

Mark Lacy

Cover image: Getty © grandeduc

First published 2024
by Routledge
4 Park Square, Milton Park, Abingdon, Oxon OX14 4RN
and by Routledge
605 Third Avenue, New York, NY 10158

Routledge is an imprint of the Taylor & Francis Group, an informa business

© 2024 Mark Lacy

The right of Mark Lacy to be identified as author of this work has been asserted in accordance with sections 77 and 78 of the Copyright, Designs and Patents Act 1988.

All rights reserved. No part of this book may be reprinted or reproduced or utilised in any form or by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying and recording, or in any information storage or retrieval system, without permission in writing from the publishers.

Trademark notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

British Library Cataloguing-in-Publication Data
A catalogue record for this book is available from the British Library

Library of Congress Cataloging-in-Publication Data
Names: Lacy, Mark J., author.
Title: Theorising future conflict : war out to 2049 / Mark Lacy.
Description: Abingdon, Oxon ; New York, NY : Routledge, 2024. | Series: Routledge studies in conflict, security and technology | Includes bibliographical references and index.
Identifiers: LCCN 2023037384 (print) | LCCN 2023037385 (ebook) | ISBN 9781032113661 (hardback) | ISBN 9781032113654 (paperback) | ISBN 9781003219576 (ebook)
Subjects: LCSH: War–Forecasting. | Military art and science–Technological innovations. | World politics–Forecasting. | War in literature. | War–Philosophy.
Classification: LCC U21.2 .L339 2024 (print) | LCC U21.2 (ebook) | DDC 355.02–dc23/eng/20231002
LC record available at https://lccn.loc.gov/2023037384
LC ebook record available at https://lccn.loc.gov/2023037385

ISBN: 978-1-032-11366-1 (hbk)
ISBN: 978-1-032-11365-4 (pbk)
ISBN: 978-1-003-21957-6 (ebk)

DOI: 10.4324/9781003219576

Typeset in Times New Roman by Newgen Publishing UK

CONTENTS

Acknowledgements ix

1 Introduction: A Mug’s Game 1

PART I
War and Peace in the Twenty-First Century 25

2 The Liberal Way of Future Warfare 27

3 The Lethal State of Modernity 44

PART II
The Tactics, Terrains and Technologies of Future Warfare 65

4 The Impure 1: On the Sub-Threshold of Modernity and War 67

5 The Impure 2: Glitches in the Digital War Machine—The (Hu)Man, the State and (Cyber)War 90

6 The Granular 1: The Changing Scale in Conflict 114

7 The Granular 2: The Granularity of Future War 138

8 The Machinic 1: The Battle Angels of Our Better Nature 158

9 The Machinic 2: The Great Accelerator? AI and the Future of Warfare 182

10 Cyberpunk International Politics? Enter the Shimmer 211

Index 228


ACKNOWLEDGEMENTS

This book has its origins in a time at Lancaster University when Cynthia Weber, Michael Dillon and I were devising a new MA programme that came to be titled ‘Theorising Security and War’. Mick was working on his books The Liberal Way of War and Biopolitics of Security. Cynthia had just finished Imagining America at War and was using film to explore the politics of security after 9/11. Seeing how they thought, wrote and taught opened up new possibilities for me on how I could write and teach Security Studies and International Relations. After he retired, I took over Mick’s third year course ‘The Politics of Global Danger’—and over the years the course mutated into the course that is, in many ways, the basis of this book. So, the book owes a debt of gratitude to both Cynthia and Mick—and all the students who have taken both ‘Theorising Security and War’ and ‘The Politics of Global Danger’. I am also deeply grateful to some of the research students who have been my interlocutors over the years: Karena Kyne, Ben Zweibelson, Anna Dyson, Katja Jacobsen, Muneeb Hafiz. I am particularly grateful to Dan Öberg who invited me to present a chapter from the book at the Swedish Defence University in Stockholm in the January before we went into lockdown. I am also deeply grateful to an intellectual community that—especially since lockdown—feels like it now exists as much in the virtual space of social media as it does in the physical space of a university department. But for all the focus on the future and the virtual, Nayanika, Nikhil and Milon kept me grounded in the now.

1 INTRODUCTION A Mug’s Game

In 2015 I spent a few days at Sandhurst, the British army officer training institution not far from London. After arriving in what felt like a rather bland English commuter town, we went through the gates into Sandhurst, a short journey that felt like travelling through a portal (or border, which it was in a sense) into another world, a world where time had stopped, a place of woodlands, playing fields and impressive buildings in which army officers had been trained since 1741. Given that the reason for visiting Sandhurst was to give a lecture on the future of warfare (war out to 2035), as I walked around the grounds I couldn’t help but think about the history of the place—but also the future: What would Sandhurst be like in coming decades or even centuries? What will Sandhurst look like in 2049, the year of the Blade Runner sequel? Or 2149? In 2049, would there be training exercises with robots or ‘cyborg’ soldiers in the woodland? In 2149, would Sandhurst (and England) have been transformed by climate change then ‘terraformed’ or geoengineered into something different? Would the officer training be focused primarily on how to analyse/exploit data, on how to operate or supervise the new machines of war in times of artificial intelligence (AI)? Would the future soldier be ‘enhanced’ in ways that blurred the boundaries between the human and the ‘artificially intelligent’ machine? Or would Sandhurst be one of the few institutions that would remain impervious to all the changes that would transform life in the twenty-first century, an education that would simply add ‘upgrades’ to what was essentially a timeless education on an aspect of the human condition that remains the same? Sandhurst would remain this exceptional, timeless space until the last days of what we currently understand as England (whenever that may be). The reason I was at Sandhurst was to attend a conference where academics and policymakers (from ‘think tanks’ and various parts of government) would offer their speculations on the future of war, on the wars that would be fought around 2035.

This was followed by a war gaming event involving military personnel from a number of countries. The first day involved academics telling the army what war might look like in 2035; my sense is that most people were saying similar things: the future of war would be in megacities, hybrid war and ‘cyber’ would change the character of war, ‘great power politics’ would generate ‘novel’ challenges as new superpowers began to make their geopolitical presence felt. I got the sense that some of what I said didn’t really go down too well, a vision of future warfare where more and more activities would involve ‘cyber-skills,’ ‘grey zone’ activity, interventions responding to climate emergencies or economic/social collapse (HADR—humanitarian assistance and disaster relief), where urban war might become impossible given the technological complexity of a messy, congested future, where all sides in a conflict would be able to do things currently unimaginable. There wasn’t much ‘traditional’ warfighting in my tour of war in 2035. It was the type of overview that a politician seeking to cut traditional military spending might like; those trying to justify increased military spending on the army might be less impressed. The implication of my view was of a smaller military, composed of more trained ‘technicians’ in the arts of increasingly ‘remote war,’ a military composed of soldiers that were a hybrid of warrior-researcher-technician-scientist-educator. This might not have been the future that they were looking for in Sandhurst, a view based on the position that a country like the United Kingdom (or any liberal democracy) might be reluctant to get involved in the problems of distant territories and the risk of waging messy and relentless ‘unnecessary wars.’ It was not that I rejected the declaration made by the French in 2021 on the inevitability of hypothèse d’engagement majeur—high intensity war with states like Russia or Turkey. But I was arguing that if we did see the emergence of global conflicts outside the liberal world, liberal states would be reluctant to get too involved, preferring to provide support and expertise backed up by economic sanctions and other measures designed to deter war, sub-threshold action at a safe distance. Even if liberal states made errors and miscalculations in their foreign policies and strategic plans, the tactics, techniques and technologies of warfare were likely to be different from the past—especially in a time of dramatic technological and geopolitical change. The ‘realist’ of international politics will argue that the tendency towards hubris and arrogance in liberal democracies will likely continue to produce unnecessary wars that result in regional chaos. But, the realist might add, we should hope that continued investment in the new technologies of war deters other states and actors from aggressive and chaos-producing foreign policies in these times of geopolitical transformation; the risk remains, however, that the creation of new technologies and tactics of war creates temptations for liberal democracies to demonstrate and experiment with their new ‘tools.’ But along with the grounds and the buildings, what made an impression on me were the opening speeches by the organisers on the first morning of the event: we were told that we should all be thinking like science fiction writers as we thought about the future of warfare.

In the opening speech David Mitchell’s The Bone Clocks—and its depiction of the future of the United Kingdom and the role of China in world affairs—was mentioned; later in the day I remember having discussions about William Gibson’s The Peripheral and its depiction of the future soldier entangled in the latest robotic technology to wage war at a distance in an America ravaged by economic decline and poverty. On one level, this book takes up the challenge to think about warfare from the perspective of science fiction. But really once you begin to read policy documents and talk to people working on the future of warfare in various parts of the armed services or government, you quickly realise that lots of people are already thinking like science fiction writers; the technology is already showing us possible futures that feel like science fiction (into the realm of AI, quantum computing, robotics and astropolitics). Even those who would argue that there is too much focus on the ‘new’ problems (‘cyber’) and approaches (e.g., the United Kingdom’s Integrated Review in 2021)—at the expense of the traditional technologies, vehicles and resources (where the number of flesh and blood soldiers is cut in favour of more robots and drones)—are still thinking about warfare in a world that is filled with possibilities that look like the scenarios of science fiction movies. The difference is between those who see the futuristic threats and possibilities as being ‘hyped up’—and those who think we are in a game-changing moment that will change everything we think war is and can be. The subtitle of the book points to the film Blade Runner 2049, the sequel to the original film set in 2019 that was based on a story by Philip K. Dick. The book is not suggesting that the world of 2049 will be like the world depicted in the film, a planet ravaged by ecological disaster, social alienation and urban degradation, a world with hostile ‘replicants’ rebelling against their exploitation by humans for war and business, the ‘non-humans’ trying to resist their inbuilt termination, struggling against their ambiguous identities (although in June 2023 smoke from wildfires did turn skies in North America a shade of orange similar to scenes in Blade Runner 2049). The book is based on the idea that there are a number of possible futures ahead of us that will most likely be shaped by tendencies—not inevitable and inescapable ‘laws’ or ‘logics’ of history—in technology, strategy and global politics (as well as moments of ‘strategic surprise’ or ‘wild card’ events). These tendencies will open up the possibility of a number of different ‘war futures’ (possibly generating unintended consequences and accidents, along with ethical and political risks and dilemmas). The tendencies or trends examined here contain a range of strategic and technological possibilities that reflect where we are now and where we are most likely heading in the ‘near future’; the assumption is that while the specific technologies and tactics might change (or most likely evolve or mutate) the tendencies may remain more secure or ‘embedded.’ 2049 felt like a date close enough to be able to make some reasonable assertions about where warfare might be heading—beyond 2049 feels like a step too far into the realm of science fiction.

The assumption in this book is that the ‘long peace’ (meaning the lack of catastrophic great power conflict or world war) will continue—but a long peace where the lines between war and peace are often blurred and where this global ‘peace’ confronts intense stresses and strains that will risk pushing international relations above the ‘sub-threshold’ character of much contemporary conflict. But there was another reason for thinking about war and international politics out to 2049. In 2017 it was declared that the objective of the People’s Liberation Army was for China to be a ‘world-class’ military power by 2049: the Department of Defense noted that becoming ‘world-class’ might mean becoming superior to the United States or any actor viewed as a threat to China (Office of the Secretary of Defense 2020). Of course, this moment of becoming ‘world-class’ might arrive sooner than 2049—or it might arrive later (or never…). But that moment when China becomes a world-class military power might signal a radically transformative moment for all aspects of international politics: this transformation or evolution of international society might be managed and handled by smart and sensitive diplomacy from all sides, but at the same time, this transition to a new type of world order might be dangerously disruptive and socially/culturally/politically disorientating, resulting in an intensification and multiplication of international conflicts. Managing this ‘Thucydides Trap’ might be the most challenging geopolitical problem of the twenty-first century. Of course, thinking about the future can be absurd, a futile exercise that may confront ‘game-changing’ events that can unsettle (or even destroy) the worlds we live in and all our assumptions about what ‘the future’ will be (on a personal, societal or planetary scale). Indeed, the future 2019 depicted in the original Blade Runner film—released in 1982—is fairly useless as a work of futurism (although its vision of the future might have transformed how artists, designers, filmmakers, writers and musicians imagined and depicted the future). As the science fiction writer and humourist Douglas Adams declared: ‘Trying to predict the future is a mug's game. But increasingly it's a game we all have to play because the world is changing so fast. We need to have some sort of idea of what the future's actually going to be like because we are going to have to live there, probably next week’ (Adams 1999). Thinking about the future is something we do as a species—and it is something we are drawn to particularly in a historical moment like this one, where there is a sense of radical and disruptive technological and geopolitical change and transformation, where, as Justin Trudeau (2018) put it at Davos, ‘the world has never been this fast but it will never be this slow again.’ Of course, Trudeau’s statement is political and technological hyperbole and, while the ‘pace of change’ in technology is the corporate and political obsession of our time, there are many possible reasons for the world to become ‘slow again.’

So we live in a time of great interest in ‘the future’ because the future has never possibly felt so unpredictable or ‘unknowable’ (or rather we have never been so aware of our inability to realistically or credibly imagine ‘the future’—whether we are talking about decades or centuries); Franco Berardi goes as far as to suggest we are now ‘after the future’ given that many have lost a sense of ‘the future’ understood in terms of the inevitability of progress and improvement in technology, security, economy and living conditions (regardless of whether we are talking about decades or centuries) (Berardi 2011). There are continuous waves of books on ‘the future’ from a variety of academic disciplines and perspectives. We have books such as Adrian Hon’s (2020) A New History of the Future in 100 Objects, an imaginary history of the twenty-first century in terms of the ‘small details’ that transform existence written from a future viewpoint. In Savage Ecology: War and Geopolitics at the End of the World, Jairus Victor Grove (2019) concludes his exploration on how war and ecological degradation have shaped international politics with a bleak sketch of ‘Visions of Los Angeles, 2061.’ In The 2084 Report: A History of Global Warming from the Future, James Lawrence Powell (2020) provides a depiction by a future historian of the damage done to the world by climate change: a future of wars, climate refugees and social collapse. In AI 2041: Ten Visions for Our Future, technology expert Kai-Fu Lee and science fiction writer Chen Qiufan produce a variety of scenarios to generate debate and discussion on the impact of AI on work, war, (international) relationships and politics (2021). Possible futures are multiplying just as work on the future is multiplying. These depictions of the future are often designed to generate discussion on actions that could be taken in the present to avoid dystopia or catastrophe. Or as the economist and activist Yanis Varoufakis (2021) illustrates in Another Now, ‘economic science fictions’ can be used to show what alternative futures could look like, alternative futures that open up creative solutions to a variety of social, technological and economic problems, problems that often feel overwhelmingly dystopian and unmanageable. But at the same time, while speculating on the future might be a ‘mug’s game’ or utopian speculation on alternative futures, there can be problematic consequences for the present emerging from the ways governments and corporations prepare for the future—and, in particular, for catastrophic events—through the various techniques and tools used for shaping all aspects of life in the present, processes outlined in Politics of Catastrophe: Genealogies of the Unknown by Claudia Aradau and Rens van Munster (2012). As George Orwell might have put it, those who attempt to control depictions of the future work to control the present. While we might feel an ‘existential’ need to examine what might be ahead of us (to take action in the present to help and protect our future selves, to get a sense of what is coming towards us), from a military perspective you need to think about the future because you need to work out what type of army and what type of technology you will need to defend your state in the future, especially as some of the steps you may need to take will involve long term planning, whether it is the manufacture of a ‘future proof’ ship or making sure you have the right people—and enough of the right type of people—for the armies of the future.

So we live in a moment where there are many attempts to think about the future of war and our international relations, thinking about what the ‘world order’ is going to look like, who (or what) the key ‘actors’ are going to be, what technologies will transform the ways we live (and die) around the planet, what events could radically transform all our best-made plans for order and security. It is, as the Hitchhiker’s Guide to the Galaxy observes, a mug’s game. But as the historian of war Lawrence Freedman concludes in The Future of War: A History, we need to engage with thinking about future war while at the same time maintaining a critical perspective: ‘As in the past there will be a stream of speculative scenarios and anxious warnings, along with sudden demands for new thinking in the face of an unexpected development. Whether couched in the language of earnest academic papers, military appreciations or fictional thrillers, these will all be works of imagination. They cannot be anything else because the future is not preordained. This is the main reason why prediction is so difficult. There are decisions yet to be made, even about challenges that are well understood, along with chance events that will catch us unaware and developments already in train that have been inadequately appreciated. These works of imagination will often have value in helping to clarify the choices that need to be faced and at times will even turn out to have been prescient. For that reason many will deserve to be taken seriously. They should all, however, be treated with scepticism’ (Freedman 2017: 287). It is important to think about the future of war, from the near future and beyond—as absurd as the inquiry can quickly become—even if it is just to put into play ideas and concepts that can be challenged or refuted. In this sense, a book like this one succeeds if it leads to better ideas and clearer thinking than is found here. This contemporary field of ‘future war studies’ is one composed of many useful and provocative books and policy reports; there are also creative attempts to examine future conflicts through the creation of fictional scenarios such as Peter W. Singer and August Cole’s (2015) Ghost Fleet: A Novel of the Next World War or General Sir Richard Shirreff’s (2016) War With Russia; the difference between many of the books—the similarity being most of the books here are written from the perspective of the United States or Europe—is often in the emphasis given to certain elements. For example, Christopher Coker’s (2015) Future War tends to focus more on the philosophical (and ethical) aspects of how war is changing (and possibly changing what it means to be human) than detailed debates about strategy and tactics in different terrains.

Robert H. Latiff’s (2017) Future War: Preparing for the New Battlefield covers much of the same terrain as Coker’s book but begins to explore what changes in technology will mean for the future ‘warrior’ and the broader impact on society; Sean McFate’s (2020) New Rules of War outlines how a radically different military (and way of thinking) will be needed if the United States is to be able to deal with future warfare in an arena of new actors, tactics and technologies; Mick Ryan’s War Transformed: The Future of Twenty-First Century Great Power Competition and Conflict (2022) provides us with a useful overview of the institutional and organisational military contexts and challenges as we confront a time of technological competition and change. Then there are books such as A Brief History of the Future: A Brave and Controversial Look at the Twenty-First Century by Jacques Attali (2011), the French intellectual/policymaker/businessman and futurist, a book that makes Coker, McFate and Latiff’s books look rather timid and really takes us into the realm of science fiction in a fascinating (and sometimes troubling, bizarre and eccentric) tour of the future. This book begins to think about future tactics, terrains and technologies of warfare and international conflict in a time where the futures ahead of us feel like they are multiplying in ways that might exceed our political and sociological imaginations in both positive and negative ways, a future of non-lethal weapons eradicating violence through to dangerous ‘suicidal states’ creating havoc and destruction—or a world that begins to resemble the dystopian landscapes of Blade Runner 2049. As the science fiction writer Kim Stanley Robinson remarks: ‘The future is radically unknowable: it could hold anything from an age of peaceful prosperity to a horrific mass-extinction event. The sheer breadth of possibility is disorienting and even stunning’ (Robinson 2018).

Apocalyptic International Politics?

For many around the world, we are heading into an age of ‘apocalyptic international politics,’ a period of planetary disintegration, a time of irreversible environmental, social and political degradation (for many others, we are already there). The ‘realist’ vision of international politics as inescapably bleak and tragic due to great power politics and competition is supplemented or upgraded by a view of the future where the sources of insecurity and danger multiply and extend beyond states: non-state actors, ecological fragility, technological accidents. These dark visions circulate and proliferate through our popular culture. Mad Max: Fury Road, Dredd or Cormac McCarthy’s The Road show us the brutal future that will unfold after climate change, pandemics or violent nationalism have wrecked the planet. Ready Player One (set in 2045) shows us a (seemingly) less catastrophic future but one where many struggle to survive on a planet where technology and capitalism have creatively destroyed any possibility of a stable, equitable society; there is an even starker vision of the future in the movie Elysium (directed by Neill Blomkamp and released in 2013) where the rich and powerful have found health and safety in space stations while the rest of humanity struggle back on the wasteland that is now planet Earth.

The visions of apocalyptic international politics are split between stories of total societal and ecological collapse and worlds on ‘life support’ with intense division and fragmentation at all scales of human life—visions of suicidal states or suicidal networks, groups or ‘lone wolves’ with the destructive capacity of states. But we are perhaps less troubled by the dystopian futures of science fiction than by the images of destruction in the present, in Aleppo, Gaza or Kharkiv, where the tragedy of international politics is that we cannot seem to escape the forces that drag parts of the world back into the violence, depravity and damage of previous centuries. The world appears divided between a liberal world in a ‘post-heroic’ age that prefers economic war to total war and regimes that unleash warfare that produces horrific events of suffering in cities in Syria or Ukraine. But there are those who argue that these apocalyptic depictions of the future (and traumatic violence in the present) are downplaying the trends that point to more complex futures, futures where there will still be social, economic and political problems (and new technologies will continue to produce disruptive and harmful events and accidents) but where the futures in Mad Max or The Road are really the extreme worst-case scenarios in the twenty-first century. In particular, for the liberal internationalists and technological optimists, the limits placed on the use of warfare as a political tool emerge from deeper transformations of society, changes in our attitudes to violence, the new ways in which we can be made aware of the realities of war, the new laws that can place constraints on the conduct of war, the new ways in which we are interconnected and entangled. For others, the costs of war are simply too high (both in human and economic terms), the ‘blowback’ too risky (especially for European powers close to the war zones in the Middle East or Africa); the geopolitical reality of an emerging multipolar world gives good reason for strategic caution. Deterrence will continue to place limits on the behaviour of states—the possibility of apocalypse will dampen any desire to raise the stakes in geopolitical conflict. We will manage the threat of terrorism through surveillance and intervention—even as the destructive capacity of non-state actors may increase. The balance of deterrence through the threat of nuclear destruction and ‘deterrence through entanglement’ will help maintain some kind of world order. Economic and technological growth and innovation will improve life around the planet; progress will be accelerated in a time of AI. This book begins from the assumption that war will not disappear from the human condition in the coming decades (although how we understand what warfare is might change). But the book does not see the coming decades in terms of inevitable hypothèse d’engagement majeur or future global interstate wars that are a rerun of the previous century but with more destructive weaponry. The book is interested in how war might be different in the coming decades. Rather than seeing inevitable global progress and improvement, the book suggests that international conflict and tension will not disappear; war will be transformed by new tactics, technologies and geopolitical contexts and world orders; the possibility for strategic mistakes will intensify in times of geopolitical change—along with the possibility for global accidents in a time of technological acceleration and ecological change; and regardless of whether the state is a liberal democracy or an authoritarian regime, political leaders and military elites will overestimate their own capabilities and tools—and underestimate the capabilities of an enemy or the complexity of a situation.

We might all ‘get lucky’ in the decades ahead but the future is likely to be messy—and the book is interested in how the future gets messy. Theorising Future Conflict: War Out to 2049 sets out a broad framework of trends through which to sketch out how war might evolve in the decades ahead—but these trends all contain a messiness and potential for accidents that can melt and dissolve like a dystopian surrealist painting on future war (or morph into something radically different). So, on the one hand, the book accepts the assumption often developed by liberal internationalists that there is a transformation of global politics, a view that often challenges the ‘realist’ view that the tragedy of international politics is that we cannot escape from insecurity in a system that continues to produce the conditions for dangerous interstate competition (Mearsheimer 2014). But the book is interested in how war might change or mutate in the movements of ‘recurrence and repetition’ in international politics. The future of war is unlikely to be like Edge of Tomorrow (the movie based on the Japanese science fiction novel All You Need Is Kill published in 2004), the story about a soldier fighting an alien force on earth who—when he is killed in combat—returns to the time before the battle that killed him; the soldier is able to learn and adapt to the conflict in order to defeat his enemy. The film looks like a science fiction-inflected version of a Second World War movie (or even a depiction of trench warfare)—but with more advanced weaponry and kit. There is a sense that some of our thinking on the future of war does the same—we are going to repeat the battles of the past but with faster and more powerful technology. The actors (states, armies), the terrains or domains (the real space of battlefields) and the objectives (defence against an existential threat, defence of territory) will basically stay the same. Theorising Future Conflict: War Out to 2049 is interested in how war might begin to become different in coming decades. While the position developed in the book is open to the possibility that the coming century will see the unfolding and realisation of the ‘better angels of our nature,’ a world of non-lethal weaponry, improved conflict resolution, technical fixes for all our social, economic, political and ecological problems, the book is uncertain about whether this is the future ahead of us; while we might be creating a planet of deepening economic and technological interconnectedness, deterrence by entanglement, we are also continuing to develop arms races in a multitude of technologies in a context where states and their leaders will make mistakes, where there could be global accidents that disrupt and damage the geopolitical (dis)order of things.

Trends in the Liberal Way of Future Warfare

How can we think about the future when we know that it is a ‘mug’s game’? There will be surprising events that transform international politics (events such as the end of the Cold War, 9/11 or possibly Covid-19). There will be technologies that are not as radical as predicted—and new technologies that transform how we live, work and fight in ways that we did not see coming: What will Apple’s Vision Pro be capable of as we approach 2049—and how will similar technologies of spatial computing be used in warfighting? There will be ‘actors’ that emerge that change international politics in both positive and negative ways (How will terrorist and criminal groups use future Vision Pros and spatial computing?). There will possibly be dangers (and opportunities) on the horizon that we are currently not aware of. There will likely be threats that we fail to see before it’s too late. Astropolitics and AI might both radically transform all aspects of the human condition and global politics in the twenty-first century. But space as a domain of competition and conflict might be too risky (risking a catastrophic and cascading accident that encircles the planet with deadly debris), difficult (it will be difficult to know who is doing what in the ‘fog of space’) and expensive (where the difficulty and cost of an event might result in more Earth-bound tactics to achieve an objective) beyond acts of espionage and the sabotage of communications used in warfighting on the Earth below; generative AI might become a rather banal and uncreative tool used in business management (a more efficient tool for hiring and firing) and entertainment/‘content’ production that gives new tools to people around the planet enabling anyone to create a movie that looks like it was made by Disney or Lucasfilm. In other words, we might be hyping the importance of space and AI in a manner that prevents us from seeing the ‘black swans’ on the horizon. The approach taken in the book is to suggest that we can see a number of trends or tendencies in the way that war is being thought about and developed by liberal states. In War: How Conflict Shaped Us, the historian Margaret MacMillan observes: ‘Predictions about the future shape of war are like betting on the horses or guessing where new technology is going’ (MacMillan 2021: 284). But MacMillan suggests that it is ‘possible at the very least to identify trends in war’ (ibid.: 285). This book attempts to identify trends that might reveal something about how war and international politics will be changing in coming decades. The book identifies three broad trends that might shape the future of war and international politics out to 2049, trends that could result in radical and challenging possibilities in the technological and geopolitical contexts of the twenty-first century. While framing the future of war and international conflicts in terms of trends and tendencies implies a certain orderliness, structure or stability to emerging patterns, all the tendencies can generate destabilising and destructive unintended consequences and accidents. These are the three trends outlined in the book:

The Impure. After 9/11, the French philosopher Paul Virilio (2008) suggested that we were now in a time of impure war, a period radically different from the ‘pure war’ of the Cold War, the geopolitical conflict between great powers with the underlying threat of apocalyptic nuclear war.

Impure war involves terrorist networks finding destructive and creative ways to exploit the vulnerabilities of everyday life and critical infrastructure: to uncover ‘impure’ possibilities in the normal worlds we inhabit (from cyberspace to airports to city streets), worlds that are constantly transforming in times of economic, social, technological and geopolitical change. At the same time, for Virilio the response to 9/11 was impure war, a new age of micro-targeting, information and cyberwar, drone strikes, new tactics in a war against a network, what the geographer Derek Gregory described as the ‘everywhere war’ (Gregory 2011). On one level, the Cold War could be viewed as a time of big states with big weapons seeking to uphold strategies of deterrence on a clearly mapped out chessboard of international politics; in a time of impure war, everything is messier, a chaotic realm of constantly changing actors, terrains, technologies and tactics. Impure war is about the search for new tactics in a constantly changing terrain, the search for and cultivation of new types of creativity in war. For the terrorist, it is the search for new types of violence and destruction using the limited resources you have access to; for states, it’s about finding ways to degrade the capacity of an opponent that changes its form, a network that can appear and disappear in different places around the world; or to deter an opponent through the production of an image of deadly creativity combined with technical skill, capacity and capability. For all sides, impure war is about limits (resources, laws, knowledge/intelligence) and the new possibilities created by constantly evolving technologies (for communication, sabotage and espionage). Many commentators would argue that the Global War on Terror has been displaced by the return of ‘Great Power Politics’ with the rise of China as a great power and the desire of Russia to return or ‘re-animate’ as a great power. There has been a great deal of geopolitical anxiety in the West about the new tactics that are sub-threshold, activities in the ‘grey zone’ between war and peace: David Kilcullen (2020) describes this type of conflict as ‘liminal war,’ conflict in an ambiguous zone of interconnectedness between war and peace. For some, this time of the grey zone or liminal war is the state that we are likely to remain in, a time of perpetual grey zone; for others, liminal war is the condition of tension and sub-threshold conflict as we move from one world order to another; for others, liminal war or the grey zone is the prelude to war: and all the unconventional tactics we associate with the grey zone will be accelerated and intensified in the prelude to a real war. What we see occurring at the moment are experiments and preparations for global war. The anxiety about all things ‘cyber’ is a source of unease about changing geopolitics, technology and impure war. On one side are those who see a dangerous new age of experimentation as the possibilities of the ‘grey zone’ are tested and explored. On the other side are those who argue that we should remember that these problems are sub-threshold (and have technical and tactical limits) and so we should be cautious about being drawn into the sense of panic and cyber-doom (Debrix 2007).

Others see cyber as one of the emerging elements that will multiply the terrains and tactics of war to deadly effect. But regardless of the significance of cyber, it does look likely that we are in a time when there is a desire to find ways to be more creative than opponents, to find vulnerabilities in our complex societies or to use elements in the war machine in creative and surprising ways. Much of this new age of technology and international conflict might remain sub-threshold, in the grey zone, but we are left with this question: in the prelude to war—or during war—between states will we see types of destruction that overwhelm states due to the sheer creativity and cunning in the ways they exploit the vulnerabilities of our interconnected societies? Will the creativity of the techniques, the expertise of the ‘warriors’ and the vulnerability of our interconnectedness mean that all geopolitical tensions will remain sub-threshold, deterrence by entanglement? What could Russia or China orchestrate from a distance without ever having to fly a plane or launch a ship? And could a sub-threshold event play out in ways beyond the control of those who orchestrated it? How creatively destructive will terrorist networks be in the technologically complex societies of the coming decades? So, it looks likely we will be dealing with the destructive creativity of non-state actors such as terrorists or criminal networks along with the sub-threshold creativity of states. It could be argued that it has always been the case that militaries have had to devise creative approaches to war. In the twenty-first century, the difference might be the range of actors (the states, terrorist networks), technologies (AI, quantum computing, robotics) and tactics (new ways of targeting and trying to manipulate communities and organisations). It might be the case that impure war—the destructiveness of terrorism and the wars of terror that result—fades in significance for a variety of reasons. But what is less likely to disappear is the desire to be more creative than the enemy both in terms of sub-threshold activity and in planning for future wars between states. The broader question is: how vulnerable are our societies in an age of entanglements and interconnections?

The Granular. In his comments on impure war and 9/11 and the Global War on Terror, Virilio mentions the ‘change in scale’ in matters of technology, war and security (Virilio 2008: 12). The implications of what he was pointing to are even clearer in the 2020s. Micro-targeted drone strikes. Micro-targeted political campaigning. Planetary surveillance from satellites in space. Granular analysis of data in organisations and societies in a time of ‘surveillance capitalism’ (Zuboff 2019). The possibility of the micro-drone. A single ‘hacking’ event releasing unprecedented quantities of data. A memory stick containing malware that is the result of an expensive international research collaboration. Events that can be organised through a real-time global network of actors and technologies to make possible the assassination of an individual in an increasingly confined ‘kill box.’ Events that might be orchestrated by an individual or group that produces a global event. An individual taking pictures on a phone that can change the course of a war or damage the stability of a political regime.

This focus on the significance of the change in scale and the miniaturisation of technology—and the possibility of granular tactics—does not necessarily mean a ‘shrinking’ of war or a world of decreasing conflict. Future conflicts and world wars might be composed of a multitude of granular events, some involving action at a distance, some involving technology, some involving special forces or ‘little green’ men and women. The targeting of key personnel in vital organisations in order to degrade the capacity of a military; a careful manipulation of lines of communication, of command and control, disrupting the effectiveness of a military to operate across domestic and global space. Events that might surprise us in the way they can creatively degrade the capacity of an organisation while at the same time not requiring unnecessary risks to large numbers of military personnel. The focus on ‘minor’ infrastructural vulnerabilities (such as the search for vulnerable/exploitable elements in a supply chain) that can wreak havoc on the effectiveness of a state or network—disrupted through a cyberattack, drone strike or human interference. The vulnerability caused by the targeting of small things. In the liberal world, there is debate underway about what size of military is needed and what skills and technology will be required (and for what type of conflict with what type of enemy…). Will the main use of highly trained soldiers be in missions such as the killing of Osama bin Laden? Will interventions that attempt regime change at huge expense be viewed as too costly and strategically dubious for states with a multitude of problems to deal with? Future conflict in megacities is commonly cited as one of the inevitable and inescapable battle spaces that armies need to be prepared for in coming decades. But what type of actions will be possible in states concerned about the loss of life and possibly dealing with warlords, criminal and terrorist organisations that might be using increasingly advanced technology in complex, congested urban environments? Simply put, what do the events depicted in Ridley Scott’s Black Hawk Down look like in 2049? Not all states might be so concerned about the risks and costs of urban warfare but how will liberal states operate? Will the future of urban war involve small groups of special forces—or even non-uniformed soldiers—orchestrating granular events involving robots of various shapes, sizes and capacities? In a time of granular tactics, important political questions will remain: if events play out through increasingly granular tactics will societies be able to keep track of what is going on in the processes designed to protect them? It might look like a world of liberal peacefulness and values—but underpinned by spaces of granular violence orchestrated through a variety of tactics and technologies. But the focus on the granular also raises questions about how far or ‘small’ the granular will go. What are the implications of quantum physics for both future technologies and our understanding of reality? What about the possibilities of neuroscience and neurowarfare (Krishnan 2016)? As we approach 2049, will war begin to resemble the scenarios devised by Christopher Nolan in films such as Tenet or Inception?

The Machinic. Human beings continue to develop ‘tools’ to distance themselves from danger and unpleasant situations; they also develop tools to enhance their capacity or to transform their capacity. The tools they develop range from the simple (the use of flint for weapons and fire) through to the latest innovations in sophisticated technologies where ‘action at a distance’ can result in a robotic helicopter being used on Mars. At the same time, sometimes individuals, organisations and societies use other animals or human beings as tools (the suicide bomber, for example, or the slave) to enhance the capacity to perform action at a distance. In the twenty-first century the liberal way of warfare might involve keeping highly trained soldiers away from dangerous battle spaces through the use of technologies that create the possibility of action at a distance (or training local forces who might then be involved in risky activities); it might also involve the use of machines that will evolve in unpredictable ways with every passing decade, transforming the battlespace in a time of drone swarms and AI. Or the technological transformation of war might be limited by ethical and legal pressure, more traditional forms of deterrence or by technological limitations and the potential for accidents that cancel out the benefits promised by the technological innovation. We live in a time that feels like a revolution in the possibilities for our machinic futures. Militaries around the world appear to be taking seriously the idea that war will be transformed by the possibilities of robots or drones of all shapes, sizes and capacities along with the AI that will be able to accelerate the pace of conflict and create different capabilities for militaries and non-state actors. As shown in the film The Creator (2023), the arms race in AI/robotics might result in different approaches to tech across the planet, resulting in a world of machinic difference and diversity that will intensify as the pace of technological change unfolds. Or we might see a world of machinic homogenisation: the exploration of new possibilities—possibilities such as autonomous weapons systems—might be incredibly troubling from a legal and ethical perspective but might be inescapable if states do not want to be ‘outpaced’ by rival actors: at the same time, we do not know what type of public sphere (even in the ‘open societies’ of liberal democracy) will be available to keep open the spaces for critical discussion on the tactics and technologies of war. And in a time of ‘open technological innovation,’ the vital innovations in technology and war might emerge in laboratories and research centres outside of government control or direction—and be developed by actors who live with radically different worldviews and strategic objectives. It is also not clear what type of machines we will be discussing as we approach 2049. The technologies of war might be beyond anything we can currently imagine, or they might be an intensification or acceleration of what we see now on the horizon. The technologies of future war might create possibilities beyond what we currently understand as action at a distance or enhancing capacity: it might also be the case that new technological capacity transforms ‘domains’ or ‘terrains’ such as space or virtual reality in ways that will change war forever.

Interlinked. These trends or tendencies are interlinked. The impure might require technologies (the machinic) to create new tactics and events. But the impure could involve humans in the loop in a manner familiar to any observer of modern tactics of Machiavellian statecraft. The machinic will depend on the creativity of the military, and some states might lack the organisational and machinic capacity to cultivate the creativity of impure war. The use of technology might create new possibilities of granular war, but the machinic might not involve the 'granular' at all (the use of robot tanks on a large scale unlike anything ever experienced in modern war). At the same time, the proliferation of machinic possibilities might cancel out their usefulness, taking us back to the exploitation of human or organisational vulnerabilities, new tactics of psychological war that produce mistrust and confusion across organisations and societies. Each trend will be important, but the significance of the trend will depend on the geopolitical situation, geographic terrain or enemies' capability. But the potential of the machinic and granular will depend on the creative processes of the impure. The flipside of all this is that these three elements might also be used by non-state actors—machinic innovation could be a growing problem if non-state actors can access technologies that enhance capacity, allowing for new types of action/destruction at a distance; here the creativity of the impure might be vital—but these groups might lack the organisational structures, processes and capacities to be that significant—and events will be more opportunistic rather than the result of a structured process of developing creativity in impure war.

Chapters 2 and 3 outline what might be viewed as the cultural, economic and political context in which these trends will be developed in the liberal way of future warfare. While there are tensions, contradictions and ambiguities in the development of liberal internationalist ideas through modernity (Ikenberry 2020), it is suggested that ideas and perspectives that emerge from these values and political processes produce an important set of trends and movements that transform understandings of violence and war. To be sure, the 'long peace' since the Second World War might be more fragile than many of us would like to believe and the temptation to experiment with our latest military technology and strategic ideas might be difficult to overcome. But it does seem to be the case that liberal societies are concerned with the risks, costs and unintended consequences of war in a manner that is unprecedented in world history. While this might change in coming decades (and centuries), the citizens of liberal democracies are often profoundly concerned about sending soldiers to fight overseas and are increasingly concerned about the impact of wars on the inhabitants of distant territories: equally, they are also concerned with the costs and blowback from wars that might be viewed as 'wars of choice' rather than responses to 'existential threat.' Liberal societies are generally concerned with the role that new technologies are playing in war, and they are increasingly aware of the economic costs of these increasingly complex military operations (although it could be argued that China is equally concerned about the risks of AI and other emerging technologies). There is also a sense in
which events that might have previously occurred overseas (and could be ignored) can now be recorded and circulated around the planet, an increasingly granular view of war—with strategic and legal consequences for the states and individuals concerned. The assumption in the book is that liberal societies will strive to limit the use of force in international politics—and explore the non-lethal potential of emerging technologies and policies—but our tendency to turn to violent solutions is unlikely to disappear (as is the tendency to make strategic mistakes or miscalculations). The liberal way of war can still produce strategic mistakes and miscalculations, acts of geopolitical hubris and irresponsibility—even in social and political contexts that are increasingly anti-war or wary of endless wars overseas. In The Liberal Way of War: Killing to Make Life Live, Michael Dillon and Julian Reid remind us of the weakness of democratic accountability in liberal regimes of power: 'Had Bush or Blair been even minimally accountable to the advice of their professional advisors, alone, conflict with Iraq would have been handled quite differently' (Dillon and Reid 2009: 24). Future wars in liberal democracy might result in new forms of democratic accountability, but it remains to be seen what direction liberal democracy will take in a time of often unexpected political trends, events and leaders: some would argue that the protection and flourishing of liberal political possibilities is under threat both domestically and internationally: in this view, the future belongs to future Putins.

The book assumes that there are three broad types of war and conflict in which the three trends are likely to come into play.

Policing Wars and Humanitarian Interventions. First, there remains the possibility of wars or military interventions by liberal states in response to civil wars, ethnic cleansing, regional instability, environmental disaster or state collapse. These interventions might be for 'noble' and geopolitically honest goals, the 'responsibility to protect' people from violence and suffering. At the same time, these interventions might be as much about 'power politics' (economic or geopolitical/strategic—or both) as they are about the protection of life, our humanity: for example, the invasion of Ukraine in 2022 was justified in terms of 'peacekeeping' and eradicating 'fascism' but also in terms of protecting the region from the damaging consequences of NATO expansion through the use of a 'special military operation.' Many would argue that the Global War on Terror was driven by similarly ambiguous objectives. There are many reasons for states with the capability to continue to use military force around the planet; Samuel Moyn (2021) argues in Humane: How the United States Abandoned Peace and Reinvented War that attempts to 'improve' the conduct of the liberal way of war through more humane laws, tactics and technologies work to perpetuate the use of war as a tool of international strategy and politics. These humane 'policing' interventions will generally involve liberal states where there is an imbalance of power and capability and where there is the assumption that other 'great powers' will not intervene beyond sanctions or other non-military
measures; this was certainly the geopolitical condition from 2001 to 2021. In a multipolar world composed of authoritarian regimes, the liberal world might have to accept watching unpalatable events unless existential threats are on the horizon. But it might also be the case that states such as Russia find the costs of war (and the new techniques used against them in an interconnected world) unacceptable in the twenty-first century; the response from the liberal world might be to use a variety of sub-threshold actions to support a side in a conflict without openly using their own troops—beyond providing assistance and training; Moyn would probably argue that this sub-threshold assistance will work to prolong and intensify war during events such as the Russo-Ukrainian war. But there may be liberal interventions and policing operations driven by a desire to secure access to the resources that are fundamental to the technologies and economies that make possible the worlds we live in. As Helen Thompson (2022) argues in Disorder: Hard Times in the 21st Century, for all the optimism about new technologies and the interconnectedness of international politics, the turbulence generated by the need for fossil fuels will continue to produce global insecurity even with the transition to green technologies; and the need for resources for our 'informational' and digital economies might tip over the threshold of economic policy into new resource conflicts involving military force (directly or indirectly). For all the visions of liberal states protecting and liberating the vulnerable, the 'dirty' side of international politics will continue to produce disorder, conflict and competition that will result in new wars, wars that may well be justified as humane wars; as we approach 2049 we may well be seeing more clearly what the astropolitics of resource competition might look like in terms of conflict on Earth—and in space. At the same time these policing wars might stem from a tendency for liberal states to ambitiously impose liberal democracy around the world as a path to progress and national/international improvement (or for darker reasons of geopolitical calculation and reasoning). However, the desire in the liberal world to retreat from the 'relentless wars' that followed 9/11 seems unlikely to change: events, though, coupled with technological innovation, can produce a renewed enthusiasm for policing 'new world orders.' And it seems likely that the coming decades will see the emergence of horrific events of suffering and inhumanity that might require interventions that use our latest military techniques and tactics. The international community might have developed new techniques of conflict prevention, resolution and management—but it seems rather optimistic to see radical transformative innovation beyond war and the use of force out to 2049. The question will be what form the impure, the machinic and the granular will take for policing wars, humanitarian and peacekeeping missions.

Interstate Wars. Out to 2049, there will likely be wars where a state threatens the 'rules-based order' of the international society in a dramatic and destructive fashion (similar to Russia's 'special military operation' in Ukraine); liberal states will most likely be involved 'at a distance' and through sub-threshold tactics that
explore a variety of tactics that avoid placing troops on the ground or using weapons of mass destruction on a 'peer' or 'near peer' competitor. The assumption of the book is that much of international conflict and competition will take place in the grey zones of cyber espionage, information war, political manipulation and 'technical support.' These tactics might be intensified in the prelude to war but otherwise they will remain the grey and murky terrain in which all states exist. In a complex multipolar world, states will search for more creative tactics to exploit in a world that will most likely become increasingly messy and interconnected—economically, technologically, culturally, in terms of the global risks that need to be managed. But there might be events and personalities that quickly transform international relations in a manner that changes all assumptions about what is possible/permitted—and what some states see as acceptable, what risks they are willing to take and how important economic growth and interconnectedness is to them. And while there might not be 'open wars' between superpowers or nuclear powers there might be moments of confrontation where both sides seek to contain the destructive potential of conflict; wars fought through the accumulation of 'granular' targets, attempts to creatively undermine and deter an opponent.

The Sub-Threshold. Third, there will be the emergence of 'sub-threshold' events, events that might not be described as acts of war (although those experiencing them might disagree as they experience both intensely violent and socially disruptive action at a distance) but tactics that will be carried out by the military to achieve a variety of objectives in time-sensitive (and possibly politically sensitive) events in any territory around the world. These sub-threshold events will involve assassinations, raids, drone strikes, cyberattacks, acts of sabotage and information war. These sub-threshold events might be the most common acts of state violence and international conflict in coming decades; interventions where the impure, granular and machinic trends are used in the most creative and cunning manner, events that are creatively destructive in unprecedented ways, focused on specific locations and using small groups of special forces, relying as much as possible on technologies that minimise the risks to all involved. At the same time, the logistical problems of a multipolar world may place limits on these events, limits that might drive the search for new types of creativity in war and the granular tactics and machinic possibilities. Accidents or miscalculation in this grey zone of international conflict could push events out of the sub-threshold and into the realm of 'open' interstate war. These sub-threshold actions will most likely be the continual backdrop to an international environment dealing with mutations of terrorism and global crime—but wars on terror will be fought through sub-threshold action rather than territorial operations of regime change, actions too costly and complex in a multipolar world.

So, the book is sceptical about the hypothèse d'engagement majeur in coming decades—wars with Russia or China or even another long-term intervention like we witnessed in Iraq or Afghanistan; liberal states will explore the possibilities
of the sub-threshold, the grey zones of hybrid or liminal war. Or if war between great powers/superpowers emerges it will be fought with an attempt to contain the fighting to granular war/targeting based on demonstrations of military capability, attacks on a diverse range of critical infrastructures ranging from undersea cables to power plants to satellites in space. But not great power conflict that involves strikes on the cities of the states involved, the use of civilians as targets in conflict between nuclear powers. While the book is based on the position that the hypothèse d'engagement majeur is unlikely, the book concludes by discussing some of the areas that could push conflict into dangerous new zones of interstate war in a world where deterrence by entanglement (and weapons of mass destruction) disintegrates; if the hypothèse d'engagement majeur did take place between 'great powers'—and went beyond any attempt to contain or restrict fighting—it would play out in the deadliest form of the granular, the machinic and impure war (and possibly entering into the apocalyptic zone of 'pure war'), all the trends discussed in the book magnified and intensified to a level beyond the imagination of the ideas contained here, the complete and nihilistic breakdown in the strategies of deterrence that continue to underpin international relations (and deterrence that, for many, contains its own dangerous nihilism). In the conclusion of the book the emergence of different types of world orders out to 2049 is explored, world orders that might make certain types of war more likely and others less likely. Questions about what comes after 2049 are beyond the scope of a book already playing a mug's game.

Outline of the Book

Part I of the book on 'War and Peace in the Twenty-First Century' begins with Chapter 2 on 'The Liberal Way of Future Warfare'. The chapter suggests that while we are often confronted with visions of future dystopia in war and international politics, there is a strand of theory and practice on the future that is central to the liberal view of history and international theory: the idea that liberal societies will continue to improve all aspects of existence and the human condition—including war. In the liberal view of history and progress, war is viewed as changing in response to new values, social attitudes, modes of awareness of the consequences of modern war, legal and cultural steps towards increasingly 'humane' warfare. At the same time, war is transformed by new technological possibilities, possibilities that intensify the lethal potential of warfare but also produce non-lethal techniques and technologies of 'precision' in times of conflict and war. From a 'protopian' perspective, the years out to 2049 will see the liberal way of war continue to be 'improved' by changing societal attitudes and emerging technologies. So, the liberal way of war will be shaped by an approach to warfare that will be focused on the development of increasingly humane war. While there might be tactical and technological 'improvements' in warfare, Chapter 3—'The Lethal State of Modernity'—explores what Achille Mbembe describes as the 'nocturnal body' of the liberal state, the sides to liberal society that might be ignored or silenced in more 'optimistic' understandings of the liberal way of war and international politics. Simply put, the chapter suggests that there is an ambiguity in the idea of the 'civilising process' that is important to liberal thinkers like Steven Pinker in their views on the sources of progress in modernity. The chapter outlines a different view on the 'civilising process' through the work of Zygmunt Bauman in Modernity and the Holocaust. This is followed by an overview of the work of Paul Virilio: Virilio sees a number of omissions in the 'stories' of progress, security and improvement in liberal accounts of modernity and international politics. Theorising Future Conflict: War Out to 2049 is suggesting that while liberal and protopian 'drives' to improve war and international politics will attempt to shape the future of war, we should not lose sight of what Mbembe describes as the 'necropolitical' possibilities that emerge from the 'nocturnal body' of the liberal state. The chapter concludes by suggesting that this tension between the protopian and the necropolitical will continue to shape the development (and consequences) of the liberal way of warfare out to 2049. These protopian/necropolitical 'drives' will shape the three broad trends in warfare that will be explored in the book: the impure, the granular and the machinic.

Part II of the book on 'The Tactics, Terrains and Technologies of Future Warfare' begins with Chapter 4, 'The Impure 1: On the Sub-Threshold of Modernity and War'. The chapter introduces perspectives on one of the key contemporary debates on the changing nature of war and international conflict: the emergence of concepts and practices in terms of sub-threshold actions, the grey zone, unrestricted warfare, liminal war, the Gerasimov Doctrine. It is argued that a key trend in war, security and international politics will be responding to what Paul Virilio described after 9/11 as a time of impure war; it is suggested that the liberal way of warfare will likely continue to be focused on conflicts with non-state actors exploiting vulnerabilities in the infrastructures of liberal societies—but also with states carrying out a range of 'sub-threshold' actions against liberal states. The chapter outlines ideas emerging from Russia and China on the emerging tactics and technology of warfare and international conflict; the chapter argues that the tactics outlined in these Russian and Chinese writings also describe trends in the liberal way of warfare. But while these 'sub-threshold' trends in competition, conflict and warfare might be viewed as an improvement in international relations, there are troubling ethical and strategic questions in this time of 'unrestricted warfare.' One of the key questions on the horizon is what type of state and military will be the most 'creative' in the emerging terrains of future war and security.

Chapter 5, 'The Impure 2: Glitches in the Digital War Machine—The (Hu)Man, the State and (Cyber)War,' introduces one of the key debates on the changing character of 'impure war' in the twenty-first century: cyberwar. The chapter outlines the debates on the significance of 'cyber' as a tool both of warfare and of the sub-threshold 'grey zones' of international politics, a broad range of tactics of informational or infrastructural espionage, subversion and sabotage.
While cybersecurity will likely remain vital to the digital aspects of society in the years out to 2049, it is suggested that cyber will be one of many tools in what is described as 'mosaic warfare'; and its significance will vary in light of emerging vulnerabilities, technical fixes and offensive capabilities, the protopian and necropolitical possibilities on the horizon. At the same time, the debate on cyberwarfare highlights the importance of two key aspects of the impure wars of the future: attacks and sabotage on critical infrastructures; and attacks and sabotage on the means of communication in times of war, the subversion of command and control. While the technology and tactics might change, these two areas will remain the vital concerns of future war and security.

In response to 9/11 and the Global War on Terror, Virilio suggested that there was a change in the scale of conflict in international politics driven by new technological and tactical possibilities. Chapter 6, 'The Granular 1: The Changing Scale in Conflict,' begins to think through what the change in scale might mean for the future of security and warfare. Simply put, we are seeing the emerging possibilities of a world where the 'kill box' can 'shrink' the lethal spaces of violence, where terrorist networks can orchestrate world-changing events, where an individual can produce an act or exploit of large-scale political or organisational cyber-sabotage, where our systems of surveillance can analyse increasingly detailed patterns of life around the planet, where the term 'bonsai army' has been used to describe the smaller armies of states like France in the twenty-first century. But what does this change in scale mean for many of the global challenges ahead for war and security? The chapter suggests that problems of scale in times of technological change are central to debates on one of the key terrains of war: urban warfare. The chapter concludes with three fictional scenarios that encourage the reader to think about the possibilities in the change in scale in war, tactics and technology.

Chapter 7, 'The Granular 2: The Granularity of Future War,' continues to explore the change in scale in technology, terrains and tactics. Drawing on a variety of interventions and perspectives on urban warfare, the chapter explores the possibilities for the liberal way of warfare in terrains that might be increasingly transparent to various forms of surveillance and congested with different weapons, technologies and actors. Much of the discussion of the change in scale in the chapter focuses on liberal interventions in the Global South, but the focus then turns to great power conflict. While this focus on the granularity of conflict might seem to point to 'small' conflicts, technologies and actors, it is suggested that the granularity of future conflict might be central to great power conflict and competition.

Chapter 8, 'The Machinic 1: The Battle Angels of Our Better Nature,' suggests that the development of 'tools' to distance humans from harm and to transform capabilities has been a driving force in the development of machine-based civilisation. It is suggested that the debates about the ethical and tactical consequences of drone war during the Global War on Terror are shifting towards discussion of conflicts such as the Russo-Ukrainian war and the potential use of drones by non-state actors in a time of 'open technological innovation.' While the
use of drones might continue to transform the liberal way of war in more 'humane' directions, drone war might also prolong and extend the time of war through the technological enhancement of warfighting; the use of drones might also result in the possibility of military–technical events that have implications not factored into strategic calculations.

One of the most urgent debates on the future of warfare is how AI will transform warfighting. Chapter 9, 'The Machinic 2: The Great Accelerator? AI and the Future of Warfare,' suggests that AI will likely have radical and unpredictable societal and economic consequences that could be both positive and negative. But there is uncertainty on how AI will transform the liberal way of war; while the chapter outlines some of the more dramatic possibilities, the chapter also suggests that there might be limits on the radical possibilities of AI in warfare. There might be both technical limitations (the possibility of accidents, the loss of trust) as well as strategies of deterrence and regulation that limit the use of AI in wars. AI might be used in security and war more as a bureaucratic tool of research and management. At the same time, the impact of AI might be important for how it transforms a broader range of technologies, tactics and domains vital to war and international politics in the years out to 2049.

The book concludes with Chapter 10, 'Cyberpunk International Politics? Enter the Shimmer,' a discussion of the events and tendencies that might continue to produce chaos and war in our international relations. On the one hand, it might be the case that international politics becomes a 'grey zone' of conflict and competition in what I describe as a time of 'cyberpunk international politics.' In this view, states and militaries will be dealing with the management of new actors, tactics and technologies in a security landscape of sub-threshold messiness and complexity. While great power conflict might remain sub-threshold, there will likely be brutal conflicts emerging around the planet that will serve as zones of necropolitical experimentation in the tactics and technologies of warfare (as some argue has been the case in Ukraine) (Olearchyk 2023). There will, however, be events and 'traditional' tendencies in international politics that will threaten to push great power conflict into open war and unnecessary wars. The book concludes that all the trends and possibilities in future warfare will depend on the types of 'world order' that will begin to emerge out to 2049; some of these world orders are sketched out in the conclusion of the book. The book ends by suggesting that the problem of policymakers existing in what Hans Morgenthau described as 'worlds of fantasy' will likely become more challenging in a multipolar world dealing with emerging actors, technologies, terrains and global disorder. Simply put, in the time when Morgenthau was writing during the Cold War it could be argued that the risk was posed by two sets of politicians and policymakers that could inhabit dangerous 'worlds of fantasy' that could create and destroy worlds; in the twenty-first century, there might be a multiplication of the actors that can attempt to design and produce their worlds of fantasy; international politics in a time of multiplication—the multiplication of actors, tactics, terrains and technologies.

Bibliography

Adams, Douglas. 1999. 'Comment: The Last Word,' The Independent, 28 November: www.independent.co.uk/life-style/comment-the-last-word-1129474.html
Aradau, Claudia and van Munster, Rens. 2012. Politics of Catastrophe: Genealogies of the Unknown (London: Routledge).
Attali, Jacques. 2011. A Brief History of the Future: A Brave and Controversial Look at the Twenty First Century (London: Arcade Publishing).
Berardi, Franco. 2011. After the Future (Oakland, CA: AK Press).
Coker, Christopher. 2015. Future War (Cambridge: Polity Press).
Debrix, Francois. 2007. Tabloid Terror: War, Culture and Geopolitics (London: Routledge).
Dillon, Michael and Reid, Julian. 2009. The Liberal Way of War: Killing to Make Life Live (London: Routledge).
Freedman, Lawrence. 2017. The Future of War: A History (London: Penguin).
Galeotti, Mark. 2022. The Weaponisation of Everything: A Field Guide to The New Way of War (New Haven: Yale University Press).
Gregory, Derek. 2011. 'The Everywhere War,' The Geographical Journal, Vol. 177, Issue 3, 238–250.
Grove, Jairus Victor. 2019. Savage Ecology: War and Geopolitics at the End of the World (Durham: Duke University Press).
Hon, Adrian. 2020. A New History of the Future in 100 Objects (Cambridge, MA: MIT Press).
Ikenberry, G. John. 2020. A World Safe For Democracy: Liberal Internationalism and the Crises of Global Order (New Haven: Yale University Press).
Kilcullen, David. 2022. The Dragons and the Snakes: How the Rest Learned to Fight the West (London: C Hurst and Co).
Krishnan, Armin. 2016. Military Neuroscience and the Coming Age of Neurowarfare (London: Routledge).
Latiff, Robert. 2017. Future War: Preparing for the New Global Battlefield (London: Alfred Knopf).
Lee, Kai-Fu and Chen, Qiufan. 2021. AI 2041: Ten Visions for Our Future (London: WH Allen).
MacMillan, Margaret. 2020. War: How Conflict Shaped Us (London: Profile Books).
McFate, Sean. 2020. The New Rules of War: How America Can Win—Against Russia, China and Other Threats (London: William Morrow).
Mearsheimer, John. 2014. The Tragedy of Great Power Politics (New York: W.W. Norton).
Mitchell, David. 2015. The Bone Clocks (London: Sceptre).
Moyn, Samuel. 2021. Humane: How the United States Abandoned Peace and Reinvented War (New York: Farrar, Straus and Giroux).
Office of the Secretary of Defense. 2020. 'Military and Security Developments Involving The People's Republic of China 2020': https://media.defense.gov/2020/Sep/01/2002488689/-1/-1/1/2020-DOD-CHINA-MILITARY-POWER-REPORT-FINAL.PDF
Olearchyk, Roman. 2023. 'Military Briefing: Ukraine Provides Ideal 'Testing Ground' for Western Weaponry,' Financial Times, 5 July: www.ft.com/content/8819b598-7595-47cc-a992-8897b86b57c6
Powell, James Lawrence. 2020. The 2084 Report: A History of Global Warming From the Future (London: Hodder and Stoughton).
Robinson, Kim Stanley. 2018. 'Empty Half the Earth of Its Humans,' The Guardian, 20 March: www.theguardian.com/cities/2018/mar/20/save-the-planet-half-earth-kim-stanley-robinson
Ryan, Mick. 2022. War Transformed: The Future of Twenty-first Century Great Power Competition and Conflict (Annapolis: Naval Institute Press).
Shirreff, Richard. 2016. War With Russia (London: Coronet).
Singer, Peter and Cole, August. 2015. Ghost Fleet (London: Houghton Mifflin Harcourt).
Thompson, Helen. 2022. Disorder: Hard Times in the 21st Century (Oxford: Oxford University Press).
Trudeau, Justin. 2018. 'Justin Trudeau's Davos Address in Full,' World Economic Forum: www.weforum.org/agenda/2018/01/pm-keynote-remarks-for-world-economic-forum-2018/
Varoufakis, Yanis. 2021. Another Now (London: Vintage).
Virilio, Paul. 2008. Pure War (Los Angeles: Semiotexte).
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism (London: Profile Books).

PART ONE

War and Peace in the Twenty-First Century

2 THE LIBERAL WAY OF FUTURE WARFARE

In Stanislaw Lem's Return from the Stars (1966), the Polish science fiction writer describes the experience of an astronaut—Hal Bregg—who returns to an earth that has undergone radical transformation: ten years in space meant that 127 years had passed on his home planet and the world he returns to is profoundly different from the one he left. The world he returns to is not different simply in terms of innovations in fashion, architecture and technology: violent and aggressive impulses have been removed from our identities/bodies by being 'betrizated.' It is a world so controlled and saturated with pervasive technologies that danger has been removed from life. A doctor tells the returning astronaut:

Consider something, for example, something you have become so accustomed to that you no longer see the exceptional nature of the phenomenon: risk. It does not exist any more, Bregg. A man cannot impress a woman with heroics, with reckless deeds, and yet literature, art, our whole culture for centuries was nourished by this current: love in the face of adversity. Orpheus went to Hades for Eurydice. Othello killed for love. The tragedy of Romeo and Juliet…Today there is no tragedy. (Lem 2020: 82)

Return from the Stars is exploring a situation (using science fiction in order to exaggerate and intensify the scenario) that emerges from the liberal view of history where the assumption is that the human condition will become less violent—at all scales—transforming our everyday interactions with other people through to our international politics. The book is an examination of a world without risk or danger where people live in cities filled with giant screens, a life designed to reduce anxiety or stress, a planet where all movements are under surveillance,
where technology has a solution for every problem or need: life of constant safety, security and control. Return from the Stars explores the potential problems of a world where danger and conflict have been erased from the human condition; the book explores the potential downsides of this technological progress for the human condition: the problem of life in a state of complete control. But a person from 1922 or 1622 who arrived in a liberal society in the 2020s might find attitudes to risk, violence and everyday security as disorientating as the character in the Lem story. While lurking in the background are constant anxieties about nuclear war, terrorism and climate change, the underlying story of modern, liberal history is one of progress—and of our capacity to overcome the inevitable dangers that the modern world produces. In this view, modernity is a period where science, liberal democracy and capitalism continue to generate multiple sources of improvement in the human condition—from improved medicine, education, communication, transportation, political systems, policing, the emergence and protection of human rights through to attempts to reduce inequality while improving the global economy. Even war is improved. In the speculations of a wide-ranging cluster of philosophers, designers, scientists and business leaders there is a sense that history has a direction, a direction that emerged in Europe but is spreading globally, a world of Kant's 'perpetual peace'; more specifically, the liberal 'peace theory' suggests that war between liberal democracies is highly unlikely due to our shared values and deterrence by entanglement. Even those states that we might not label as liberal democracies become entangled in the international relations of business, education and culture. Violence and disorder will emerge in states that have failed to develop strong and stable centralised states that can contain (democratically and responsibly) the internal sources of war and crime; or in authoritarian regimes that can ignore concerns about the human and material costs of war (and can control the presentation of war, maintaining control of the narrative and propaganda about the sub-human other that is being fought and controlled). But, the story goes, the direction of history will make it harder for authoritarian regimes to mask the costs of war or deal with the international response, responses composed of non-lethal solutions to limit and deter the use of violence and warfare.

Confronted with images and stories of a future ravaged by climate change, terrorism, machines that will produce mass unemployment, we live in a time of anxiety about the future, about the constant threat of 'apocalyptic international politics.' But the liberal internationalist vision remains a strong current in thinking about the future of war and international politics. Indeed, some would argue that liberal internationalist visions of the future are constantly reworked and reimagined in response to the anxieties and pessimism of specific moments (the time after world wars or financial crises). For the political theorist John Gray, anxiety about the dystopian present or apocalyptic near future drives the ambitious visions of progress in many contemporary liberal writings: 'Today those who peer into the
future only want relief from anxiety. Unable to face the prospect that the cycles of war will continue, they are desperate to find a pattern of improvement in history' (Gray 2016: 99). In a book review published in January 1939, George Orwell criticises the philosopher Bertrand Russell's Power: A New Social Analysis for not providing a serious analysis of politics and power in work that is 'better at pointing out what is desirable' rather than being able to explain how to transform society: 'he merely utters what amounts to a pious hope that the present state of things will not endure' (Orwell 2014: 75). In this view, liberal ideas about the future are secular 'medicine' in response to the brutal reality of the human condition in modernity—and the chaos and destruction that liberalism and capitalism produce.

In this chapter I suggest that liberal democracies and liberal internationalist approaches to global politics transform war in modernity, changing the character of war in a manner that is likely to continue in a time of concern over the 'relentless wars' fought since 9/11. But the perspective of the following chapters raises a note of caution about the liberal transformation of future warfare. The liberal way of warfare is transformed by the values, laws, social movements and technologies (of war and communication) that emerge in political modernity. The freedom of speech in liberal democracies—coupled with the new technologies of communication from newspapers to war poets to war blogs—enabled citizens to see the brutal consequences of warfare in the age of industrial states. The philosophies of liberal democracy created legal and moral cultures that challenged the wars that could reduce human beings to objects that could be used and exploited in an instrumental fashion as cogs in a modern war machine: the 'civilising process' of modernity, where our sense of 'moral community' expands, changes citizens and the international society that comes into being (Linklater 2020)—although there are ambiguities to these processes of social and cultural 'improvement' that will be explored in the next chapter. Interconnectedness and entanglement make war less 'rational' and, in a 'post-heroic' age, the human and economic costs become increasingly unacceptable; the willingness to die for a nationalist project is replaced by the desire to buy for a more personal and individualist project of self-creation. But liberal states can have moments where they seek to impose or create the liberal world and international system through war; in War and International Thought Jens Bartelson argues that a view of war in terms of its 'constitutive functions' (what he describes as an 'ontogenetic view') in relation to the production of world order and stability is 'likely to be as contagious in the present as it was in the past. Once it has taken hold among the powerful, it becomes an offer you cannot refuse but something you have to emulate in order to survive' (Bartelson 2017: 196). The implication of Bartelson's historico-philosophical analysis is that we need to be cautious about the sense that an 'ontogenetic' view of war has lost significance (especially after the withdrawal from Afghanistan by the United States in 2021, the end of 'relentless' wars on terror); there might be emerging great powers who seek to emulate experiments in the use of war as a productive force (believing they will have learnt from the mistakes of the past, guided by their
own sense of 'exceptionalism'); in the liberal world, the view of war in terms of its constitutive function in relation to world order and security might 'reform' or regroup after waves of technical-military innovation. Liberal states can make strategic errors and miscalculations (Mearsheimer 2014). More broadly and globally, the values and ideas of liberal internationalism might prove to be a rather local and temporary phenomenon. While liberal societies might be reluctant to wage war, other regimes might see this moment of geopolitical and technological transformation as a moment to exploit liberal unease about war and the future structure of international society; some suggest that the liberal world is dealing with the end of the end of history (Hochuli, Hoare and Cunliffe 2021). But the argument of the book is that while some leaders and states will pursue strategies and tactics that look like a return to previous centuries, liberal societies will hold on to the possibility of becoming a different kind of war machine.

Liberal Futures of War and Peace

In the early 1990s, Francis Fukuyama's The End of History and the Last Man attempted to map out where we might be heading after the collapse of the Cold War; it was a world where liberal democracy and 'free markets' would enable human beings to realise their potential through the production and maintenance of a 'rules-based' international order. In this view, communism had created a dystopian landscape of failed social and economic experiments where human beings were crushed by the overwhelming power of modern bureaucracies and by the tendency towards what Hannah Arendt described as a 'remoteness from reality' (Arendt 1972). Liberal democracy, Fukuyama was arguing, might not be perfect (consumer society produces pollution; free markets create inequality) and might result in a life that was dull and without risk, but the rules-based order would create the political spaces to manage the dangers of our social, political, economic and technological complexity. Liberal democracy creates the conditions for transformation; authoritarian or totalitarian regimes do not (or produce transformation that is not likely to benefit the majority of citizens). Liberal democracy enabled people to explore different routes for fulfilling their potential (however they chose to define their potential—if indeed they even wanted to define life in terms of fulfilling potential). But regardless of how individuals or groups defined human potential, liberal democracies created political and legal systems that enabled people to be recognised as human, to be worthy of value regardless of race or class, to be protected from all abuse of power in what Kant described as a 'cosmopolitan condition' where citizenship and sovereign territory can be disconnected through a legal protection against 'criminal' governments (Habermas 2006: 86). Liberal democracy enabled individuals and communities to express and explore concerns, to warn of social, economic, political and environmental dangers in the public sphere, to demand solutions to the inevitable problems of human life and to evaluate the possible solutions through open dialogue
and debate (Beck 1999). At the most fundamental level, liberal democracies tried to eradicate the fear that you could be punished for ideas that might challenge those in authority; this lack of fear is the source of dynamism and social and political innovation. Fear is intended for those who violate human rights or commit war crimes, those who will become subject to the instruments of international law and global governance. The emergence of liberal societies is part of a process not an end state (as much as we might talk about 'the end of history'). As G. John Ikenberry puts it:

No liberal state has ever acted in international affairs solely on the basis of liberal principles. Hypocrisy is inherent in the rhetoric of liberal democracy and human rights. But the spaces opened up within even a deeply flawed liberal international order create opportunities for political struggles that can bring the order closer to its founding ideals. (Ikenberry 2020: xiv)

In other words, liberal democracy attempts to produce and maintain the conditions for the political struggles that will challenge the legacies and consequences of histories of violence and inhumanity—and the present and future sources of insecurity and suffering.

In the 2010s, Steven Pinker's The Better Angels of Our Nature became the latest ambitious statement on where we might be heading and how people/society had changed in the modern liberal age. Whereas Fukuyama was writing in response to the uncertainty of the post-Cold War world, Pinker is writing in response to a world that feels dangerous, on the verge of dystopia or apocalypse. For Pinker, the tragedy of our time is that we think the world is more dangerous and more hopeless than it is. Perceiving the world through all our various 'devices,' we are presented with a reality that looks like the stuff of dystopian and apocalyptic fiction: sadistic torture in territories dominated by terror groups; violence and disorder at home fuelled by a digital culture that pushes young people to terrorism and crime; the sense of an environmental crisis that will reduce the world to the type of 'state of nature' depicted in Cormac McCarthy's The Road—where the father has to accept he may have to kill his son rather than let him be captured by the violent gangs hunting humans. The brutal aspects of our past are overcome by liberal democracy, but the brutal possibilities of our future are exaggerated in the popular culture of liberal democracy, presented as entertainment for citizens in a world where risk and tragedy begin to fade from the human condition; our lives of security and safety in liberal democracy make us all, as the rock band Radiohead puts it, 'paranoid androids.' Pinker makes the case that the world is less violent than we think it is and less violent than it has ever been (Pinker 2011). The human condition is improving; people are becoming more tolerant of all groups that may have previously been treated as inferior or 'subhuman'; we are becoming healthier and more educated;
we are in a 'long peace' where war between states (especially war between liberal states—and war between nuclear powers) is less likely; the wars that we do fight are policing wars designed to maintain a safe, secure world (rather than nationalistic attempts to conquer territory or assert our supremacy as a race or society). And when we do fight for global order and security we are concerned with collateral damage, with minimising harm for all concerned; we are exploring new technological possibilities to make war even more non-lethal. We are becoming more like the future humans in Return from the Stars, repulsed by violence, developing the technology that has the potential to eliminate violence from the human condition. A world where all individuals, groups and states are transformed by education/training and surveillance. Those leaders who do fight what come to be seen as costly, irresponsible and unjustifiable 'unnecessary wars' will be challenged by the citizenry. To be sure, it remains to be seen how this will play out in authoritarian regimes where information can be controlled and manipulated: it was reported that Putin was deeply troubled by the possibility of being killed in a manner similar to the execution of Gaddafi in 2011; authoritarian leaders will seek to aggressively control the information space of society and politics in order to suppress dissent and revolution (and possible execution). It remains to be seen whether leaders will be able to manage both the 'real space' of war and the information battlegrounds of the twenty-first century where, for example, there were examples of 'viral misinformation' on TikTok as the war in Ukraine began in 2022 and where young Ukrainians started 'battling' trolls spreading misinformation on social media about the war (Spring 2022). The Ukrainian information war attempted to undermine 'official' Russian narratives about the causes and consequences of the war: Russia attempted to control the information space of Russian media channels by banning the term 'war' to describe its 'special military operation'; attempts to challenge Putin's narrative and presentation of the war involved sharing recordings of Russian captives, a move that risked violating the Third Geneva Convention and the norms to protect captives from humiliation and threats to safety (another example of how war is 'improved' and made 'humane' in liberal modernity) (FT Reporters 2022). The use of social media in war also highlights the complexity and diversity of actors involved in the 'congested' battlespaces in the contemporary way of war: after a tweet from the Ukrainian 'digital minister,' Elon Musk adjusted the constellation of his Starlink satellites and sent internet-ready terminals to Ukraine in order to provide a 'potential lifeline' for government telecommunication networks that were damaged. Simply put, the proliferation of technologies and actors in the liberal way of future warfare presents dangers and opportunities for both liberal and authoritarian regimes—and, of course, their citizens. The question is whether the messiness, complexity and 'congested' nature of warfare (congested with actors and technologies, actors and technologies that will likely mutate and
proliferate) will make war increasingly unappealing for all types of political regime in the decades out to 2049. So, to be sure, the liberal still sees serious problems across the planet—problems of war, inequality, racism, sexism, homophobia, xenophobia, pollution and ecological damage, species loss. But liberal democracies, it is argued, create the processes to find local/global solutions to the problems of a 'cosmopolitan condition' where citizens can think and act beyond sovereign territory; the process is often messy, imperfect and complex—Elon Musk and his Starlink satellites might reflect positive developments in the possible democratisation of technology in wartime while his 'control' of Twitter/X might reflect the dangers of social media and economic power for liberal democracy. Indeed, after the election of Donald Trump (and 'populist' movements and events in other states) and with concern about societal division and domestic political conflict, the focus in liberal political debate turned to the threats to democracy that are far more serious than anything found in Fukuyama's The End of History and the Last Man or Steven Pinker's The Better Angels of Our Nature: Why Violence Has Declined. In Liberalism and Its Discontents, Fukuyama (2022) defends liberalism in times of populism and what could be seen as the risky political and social experiment in emerging technologies of communication and information in the public sphere; in How Democracy Ends (2018) David Runciman gives a provocative account of how liberal political systems might implode after various catastrophes and technological accidents—not the end of history but the end of liberalism in a time of climate change and artificial intelligence (AI). But for the liberal optimist, our modern societies agonise about what we get wrong and explore how we can improve; in this view, even the most critical and apocalyptic critic of global politics is contributing to the anxiety that improves the resilience of the liberal world, an element in the anxiety that contributes to the demand for solutions and improvements to avoid the catastrophic tipping point. From this perspective, Runciman's How Democracy Ends contributes to the resilience of liberalism and democracy through the book's bold and provocative dystopian speculations. Our international politics has created organisations that set out to manage planetary problems, organisations that herald the beginning of a cosmopolitan world order or mechanisms of planetary governance. We are haunted by the horror of industrial war, the camps/prisons that reduce life to objects of what Giorgio Agamben describes as 'bare life' in a 'state of exception,' the trauma of nuclear war that has created the foundation of a cosmopolitan consciousness. Catastrophic events still take place—the slow but world-changing acts of ecological degradation, the human destruction in states like Syria, Yemen or Ukraine—but we remain haunted by the horror of history, working to ensure that the concentration camps of industrial killing never return. Liberal societies are haunted by the horrors of the past, present and future. Of course, at the same time there are those who are haunted by the past in a different way, in a way that drives the political and military desire for a restoration of past spiritual or national vitality.

And there are always limits to what states and international organisations can do to control the violence and chaos of events. But in the liberal view, the ingenuity and creativity of democracy and capitalism will create technological 'fixes,' the process of transformation and evolution in our fast-moving interconnected societies, societies that will offer more and more people lives of unprecedented (human, cultural and technological) richness and complexity. And for those who see environmental apocalypse ahead, many liberal optimists will argue that the technological fixes will generate solutions (Pinker 2019). We can create societies where all people can flourish in environments that will be resilient to the problems that technological development produces. Simply put, the costs of progress are still heavily outweighed by the benefits, benefits that will continue to radically improve the quality of life for people around the planet. The solutions will not necessarily be 'natural,' possibly resulting in the 'post-human,' the geoengineered environments to manage climate change, cities that—like in Return from the Stars—are more like theme parks in their intense attention to control and safety.

The contemporary problem for liberal thinkers such as Pinker is that people living lives of increased security and peace perceive the world as dangerous; Pinker is concerned about the danger of pessimism and the sense of fatalism that circulates through society. In this view, if we descend into apocalyptic thinking about the future, we will be less likely to act as engaged citizens focused on developing a vibrant public sphere and culture; we will be less interested in becoming the scientists and technicians that will contribute solutions to the inevitable problems of progress. Society is a never-ending story of innovations and improvements in the human condition—but if it becomes a story of catastrophic endings then people will lose the will to contribute to the process, to the evolution and transformation. In A World Safe For Democracy: Liberal Internationalism and the Crises of Global Order, G. John Ikenberry (2020) sees the contemporary crisis more in terms of political threats to liberal order, threats that have both internal and external causes. In a world challenged by authoritarian regimes and the rise of populism, liberal internationalism will possibly require reworking and reimagining; there is no end point but evolution or transformation. And this transformation extends to the liberal way of warfare.

Modern states that create unprecedented order and security have been shaped by war both in terms of the very formation of large, centralised states and in terms of the emergence of industrial war, war that is 'scaled up' through the use of larger armies, the logistical possibilities made possible by transportation and modern bureaucracy, the new machines of war (MacMillan 2020). European states required a system of taxation to fund both national defence and the emergence of overseas expansion, colonies and wars; and the new age of modern war and national defence required states to take a role in educating and training the population. The consequences of modern, industrial war required states to provide welfare for citizens that could no longer be viewed indifferently after they had fought for
the nation (or had lost family members in war). Innovations emerged from the processes of military–​technical research and development that have transformed all aspects of society, economy and technology. As Paul Virilio put it, ‘history progresses at the speed of its weapons systems’ (Virilio 2006: 90). The technologies of warfare that produce destruction and the possibility of ‘species extinction’ are made possible by the same infrastructures and technologies that interconnect people in ways that transform international relations. While the driving forces of innovation in a time of AI, quantum computing and robotics (and whatever technologies and scientific revolutions are to come as we approach 2049) might now emerge more in the private sector, preparation for war by modern liberal states is viewed to have played a vital role in improving modern life. As Margaret MacMillan observes on the ambiguity of ‘progress’ and war: It is another uncomfortable truth about war that it brings both destruction and creation. So many of advances in science and technology—​the jet engine, transistors, computers—​came about because they were needed in war time. Penicillin, which has saved so many lives, was first discovered in 1928 by Sir Alexander Flemming but the funds to develop it were not available until the Second World War. The Canadian doctor Norman Bethune pioneered blood transfusions on the battlefield. The practice of triage, now common in A&E departments in hospitals, started in wars, possibly the Napoleonic. By the First World War French battlefield doctors were leading the way in the division of the wounded into those whom no treatment would help, those who might live if they were dealt with immediately and those who could wait. Surgery—​for traumatic wounds or to rebuild shattered faces—​made huge advances during the wars of the twentieth century in part because there were so many patients to practise on. (MacMillan 2020: 38) While innovation in the technologies of war might increasingly emerge as the ‘dual-​ use’ side effects of research on areas focused primarily on economic transformation, it seems clear we are going to see even more radical transformations in the pace of change in technology in the coming century, transformations that will transform the human condition in unimaginable ways. New possibilities in terms of our health that might not simply eradicate diseases that still cause so many of us misery but also where we might begin to gradually increase the length of life and change what it means to be human. We might see new possibilities in technologies of policing and warfare that will eradicate the sources of urban crime and limit the possibilities of violence in war zones around the world; we will be able to see everything that is happening around the world and be able to intervene in any event through non-​lethal means. Even the most sadistic individuals or groups will be constantly aware of the technologies that might be able to see and record what they are doing wherever and whenever they are attempting to hide what they are doing, in underground tunnels, remote
areas or in a building in a dense megacity. Those who embrace a death by drone—​ that will magnify their heroic legacy and prevent the humiliation of a trial and imprisonment—​may be disappointed as they are taken out and captured through non-​lethal means where the kill box becomes the sleep box. For liberal states, it seems highly likely that there will be a continual concern with non-​lethal solutions to conflict and the search for increasingly targeted and proportional uses of ‘humane’ war—​which as Moyn argues might work to continue the use of warfare (even if it more humane and non-​lethal) as a political tool and technique for resolving political conflicts (Moyn 2021). The liberal way of warfare in a digital age will still require resources to fuel its growth and expansion in an age of clouds, data centres, quantum computing and cyberwar: it remains to be seen what geopolitical and ecological pressures the ‘material’ aspects of digital war and business will place on the planet—​and on international competition for resources. As Kate Crawford observes in Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence, just as the ‘dirty work’ of mining was often distant from city dwellers, so data centres are often invisible to us: This contributes to our sense of the cloud being out of sight and abstracted away, when in fact it is material, affecting the environment and climate in ways that are far from being recognised and accounted for. The cloud is of the earth, and to keep it growing requires expanding resources and layers of logistics and transport that are in constant motion. (Crawford 2021: 46) For Crawford, the desire to create a planet shaped by the precision, efficiency, connectivity and speed promised by new technologies will have costs that are often ignored in the digital utopianism presented to us by tech companies and ‘thought leaders.’ In The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future, the futurist Kevin Kelly maps out some of the key areas of technology and innovation that will transform life in the twenty-​first century. There is not much attention to war—​the assumption is possibly that war will not be the primary challenge of the century and will gradually disappear from the human condition (Kelly 2017). Göran Theborn’s The World: A Beginners Guide (2011) examines how human life is changing in the twenty-​first century, but questions of war are not significant to an analysis that is focused on family, health, inequality, social movements and globalisation. Margaret MacMillan suggests that liberal societies do not take war as seriously as we should preferring to ‘avert our eyes from what is so often a grim and depressing subject’ (MacMillan 2020: 3). It’s as if the horror of war or the irrationality of war makes it difficult to imagine the possibility of future Iraqs or Vietnams, future strategic mistakes in the liberal way of war that are damaging in so many different ways for domestic orders and the international rules-​based order: we will learn from our mistakes; we will be smarter and more
aware in the future and our leaders will make better decisions about foreign policy. It’s almost hardwired into our modernity that our descendants will be cleverer—​ with all their new tools and devices—​than us, just as we might like to believe we are smarter than previous generations: I will return to this point in the conclusion of the book. Kelly’s focus is more on the ways that people will live and work differently, the challenges ahead for citizens in a world where privacy disappears in a world of hyperconnectedness, a world where data provides states and corporations with new types of intelligence and ways of understanding and using (or exploiting) that data (Kelly 2017; Zuboff 2019). But Kelly is cautious about suggesting the emergence of a technological utopia that improves human potential is on the horizon. As states, businesses and individuals develop and use new technology there will be paths that are potentially dangerous for human beings (and the planet)—​some paths will be rejected, some paths may become the norm; the future will most likely be shaped by messy and uneven technological development. A liberal techno-​optimist, Kelly suggests that what we will see around the planet is a process—​not the immediate and revolutionary creation of a utopia but the creation of ‘protopia’ where there will be experimentation with new technologies, technologies that will ultimately improve all aspects of life. Kelly declares: Protopia is a state of becoming, rather than a destination. It is a process. In the protopian mode, things are better today than they were yesterday, although only a little better…. The problems of today were caused by yesterday’s technological success, and the technological solutions of today’s problems will cause the problems of tomorrow. The circular expansion of both problems and solutions hides a steady accumulation of small net benefits over time. Ever since the Enlightenment and the invention of science, we’ve managed to create a tiny bit more than we’ve destroyed each year. (Kelly 2017: 13) From this perspective, if there is future war waged by liberal states it might be best described as a protopian war; a war that is shaped by new technologies and tactics of war that seek to limit harm (shaped by liberal concerns and values—​ values that will be undergoing transformation and evolution); a war that is orchestrated through techniques and technology that are increasingly precise, the age of (increasingly non-​lethal) micro-​targeting rather than mass slaughter; the research and development of non-​lethal tools of future war where the enemy can be incapacitated but not killed; and where creative techniques of deterrence mean that from state leaders to young people attracted to terrorist groups, the costs of war and violence will be too high. The consumer-​citizens of liberal democracy (and possibly even authoritarian regimes) will be unwilling to pay the price that previous generations paid or to tolerate politicians and policymakers who pursue unnecessary wars or wars based
on dubious strategic reasoning or calculation. But as Moyn warns in Humane: How the United States Abandoned Peace and Reinvented War, the new tools of the American/​liberal way of warfare might result in the expansion of war around the globe as war becomes either increasingly invisible or presented as increasingly precise, responsible and humane: ‘The American way of war is more and more defined by a near complete immunity from harm for one side and unprecedented care when it comes to killing people on the other’ (Moyn 2021: 8). For thinkers like Moyn, the possibility of humane tactics and technologies in the liberal way of warfare raises the risk of more endless wars rather than the end of war. The influential philosopher of war, Clausewitz, famously described battle and conflict in terms of the ‘fog of war’ (Coker 2017); decisions have to be made in fast, messy and uncertain situations where you have imperfect information, confusion and possibly deception. The fog of future war is unlikely to disappear (nor will the fear of destruction and death or the fears that drive strategies of deterrence); while we might have more data, faster and more innovative means to analyse data, more effective forms of decision-​making, we might also be overwhelmed by data and have to deal with more radical and disruptive forms of deception and information war that confuse all parts of society and key elements in military organisations. The fog of war in the twenty-​first century might also involve placing a ‘fog’ over a building, city, organisation/​group or population/​state; a fog that makes it impossible to see what is going on, whether it is through shutting down the internet, blocking the vision of a drone or a network of surveillance cameras, to a point where we cannot trust what we are able to see (the type of scenario depicted in Spiderman Far From Home where swarms of drones are able to produce the spectacle of imaginary events in cities, the theatre of war becoming the IMAX of war); where we cannot trust the messages we receive from colleagues as the techniques of impersonation and organisational disruption are so convincing and authentic, the outcome of research into psychological war; where our digital societies and organisations that are generally so fast and efficient are clouded in a fog where all movement is slow and cautious; where we have no situational awareness and no way of orchestrating the tools we have developed to fight and protect; where the logistical supplies are brought to a standstill through various techniques of sabotage; a fog that puts a society and military to sleep, allowing for non-​lethal tactics of incapacitation and neurowarfare (Krishnan 2016; Moreno 2012). Simply put, the liberal way of war, for the protopian optimist, will be transformed by the values and technologies of an age where the human and material costs of war create new types of conflict and warfare but not the end of war and interstate conflict. The assumption that underpins this book is that liberal states will desire protopian war, creative approaches to conflict that will harness non-​lethal solutions (as much as is tactically possible and effective) that focus on limiting the risk to all sides in war, approaches that seek to manage conflict through new ways of conflict resolution and prevention. But the book is hesitant about buying too fully into this
vision of the liberal way of protopian warfare. There will still be the possibility for wars that rely on lethal solutions to conflict, unnecessary wars based on flawed strategic thinking, wars where there are events that reveal a darker side to the liberal order. An important element to Pinker’s argument in The Better Angels of Our Nature is on the emergence of more ‘civilised’ humans in modernity, on the processes outlined by the sociologist Norbert Elias in his work on the ‘civilising process’ (Elias 2000); here the idea is that the transformation of society is not simply a result of the ‘pacification process’ created at the level of international relations, the emergence of peace between liberal states, the entanglement or interconnectedness of all states in global flows of trade, ideas, art, technologies, migrations and threat: human beings change in modernity—​transformation that plays out in the most basic levels of how they interact with others, how they present or conduct themselves, what they see as acceptable in all aspects of social (and intimate) life, how what is repulsive or distasteful changes through time. What I find disgusting or shocking in England of the 2020s will be different from what ‘I’ would find disgusting in 1066 or 1789 or even 1914; simply put, if we met our medieval ancestors, we would most likely find them disgusting, vulgar and childish. It is not that the human is different at a biological level (although we might be in the process of becoming increasingly post-​human during the twenty-​first century) but modern values, education and culture work to reshape and transform individuals and societies at a fundamental level—​a process that ensures the necessary order for the institutions to evolve. In The Civilizing Process, Elias examines the transformation of everyday manners set out, for example, in guides to etiquette where readers are informed on how to perform tasks such blowing a nose in ways that are not simply more hygienic, less likely to spread disease, but that will also allow one to move ‘correctly’ in the increasingly complex social hierarchies that were emerging (or being disrupted) in the industrial age, the age of empire and the emergence of the global economy (Elias 2000). Indeed, the story of the outsider (generally from a different class or social status) learning to negotiate new and different ‘codes’ produces tensions and the potential for comedic situations that we continue to find entertaining/​fascinating. The dynamism and movement of modernity brings together individuals and groups in a way that produces endless misunderstandings, confusions and painful education in the codes and practices of the different groups that proliferate in modern life; books such as Stendhal’s The Red and the Black (1830) are warnings on the problems of negotiating worlds of new codes, with Stendhal’s book charting the downfall of those who travel from the simple worlds of the provinces to the sophisticated and socially dangerous worlds of a city like Paris, with its ambition, hierarchy and power politics. But as well as being a story of society in terms of codes and practices of inclusion and exclusion—​where people are being watched for any sign or trace that reveals their inferior ‘breeding’ (or where the individual changes their own
behaviour, manners and dress in response to being watched)—​the civilising process is also about the moral transformation of the individual; the ‘gentleman’ can play rough or violent games on the rugby field but not on the streets at night; ‘belligerence and aggression’ can be expressed, Elias suggests, in sporting contests with the ‘imaginary identification with a small number of combatants’ where the ‘moderate and precisely regulated scope is granted for the release of such affects’ (Elias 2000: 170). The gentleman should be able to use a shotgun and hunt, but they should also be able to see the beauty and importance of Shakespeare. There is what Elias describes as the ‘civilization of the affects’ (ibid.: 180). The civilising process orders and shapes human activities so that things can be in their right place (the etiquette of how to eat, where aggressive emotions can be displayed, which type of bodies can be allowed in certain places)—​and all types of unpleasantness can be placed out of sight. In this view, being human does change; the civilising process operates through a combination of fear (fear of embarrassment, fear of punishment) and through a process of education and ‘refinement’ (we come to see how irrational some ideas are, becoming disgusted with practices once deemed normal and acceptable). We are taught control and delayed gratification; the contemporary problems of violence and disorder emerge from those who are unable to control and contain impulses or instincts. But the question is what does the civilising process and the ‘civilisation of affects’ mean for the future of liberal warfare—​does it mean that conflict and aggression gradually disappears from the human condition, or does it mean that violence is transformed? Elias is more cautious and ambivalent on modern war compared to thinkers like Pinker who draw on his work. For Elias, war becomes more ‘impersonal’ with ‘affective discharges’ that are less intense than in the medieval phase (Elias 2000: 180); cruelty and ‘joy in the destruction and torment of others’ are placed under stricter control in the state organisation: it is only in spaces and times of ‘social upheaval or where social control is looser’ (such as in colonial regions) that actions are less constrained by shame and disgust. Violence and war do not disappear from the human condition, but the age of ‘mechanised struggle’ requires a ‘strict control of the affects’: In the civilized world, even in war individuals can no longer be given free rein to their pleasures, spurred on by the sight of the enemy, but must fight, no matter how they feel, according to the commands of invisible or only indirectly visible leaders, against a frequently invisible or only indirectly visible enemy. And immense social upheaval and urgency, heightened by carefully concerted propaganda, are needed to reawaken and legitimize in large masses of people the socially outlawed drives, the joy in killing and destruction that have been repressed from everyday civilized life. (Ibid.: 170)


In other words, there needs to be a directed effort to mobilise the war effort, especially in times when increasingly 'civilised' populations are able to see the consequences of war. The liberal way of war has been transformed by the civilising process. While it might be the case that it is the strategies of deterrence that have really transformed the possibilities—or rather lack of possibilities—for interstate war, it is undoubtedly the case that there is a perpetual concern with sending young people to fight 'wars of choice' rather than existential threats; few families will want to proudly send their offspring to fight overseas in order to restore family or national honour, to become heroic twenty-first century warriors in wars of choice. We live in societies that—while there may be celebrations of our military successes (the triumphant return of troops) and mourning for our historical trauma and sacrifice—have a profound concern with placing our young in dangerous situations: the First World War poets are taught to teenagers in the United Kingdom; we watch films like Born on the Fourth of July or The Hurt Locker; we see the damage that war does in intimate and painful detail. In the liberal states of the 2020s, it seems impossible to imagine 'wars of choice' such as the Vietnam war—where, of the 58,000 Americans who died, 61% were younger than 21—taking place again.

To be sure, authoritarian regimes in the twenty-first century might view a moment of geopolitical uncertainty and transformation as a time for military–technical exploration. The human and material costs of war might be less significant for a leader like Vladimir Putin than they are for Biden, Sunak or Macron. But it is unclear whether authoritarian leaders will be able to manage these costs either from domestic sources (backlash over the numbers of war dead) or from an international response in terms of sanctions or other measures (or the tactics of information war focused on generating confusion, deception and discontent in any type of regime). The civilising process that Elias writes about extends across all societies in a world shaped by the globalisation of information technologies. It might be the case that the liberal world's reluctance to risk interstate war (combined with the emergence of new great powers) means that some states become more willing to explore the possibilities of the hypothèse d'engagement majeur; or to threaten destruction and chaos but then engage more in the hypothèse d'engagement mineur. Either way, the liberal way of warfare will explore how to engage differently in international conflicts. Simply put, liberal states are unlikely to risk the lives of their citizenry in Ukraine or Taiwan—but they would if a clear existential threat emerged, a threat to the security and order of the liberal world that is currently not on our 'threat horizon.' The liberal world will prepare for wars of existential threat, but the strategies of deterrence will keep liberal states in the realm of the hypothèse d'engagement mineur. The desire for protopian war will continue to drive the liberal way of warfare unless there is a radical degradation or disruption to the worldview and attitudes of liberal citizens. But just as there might be a replicant 'machine' inside the body of Rick Deckard in Blade Runner, there
might be another body inside the liberal 'body' that is addicted to the power and technology of the war machine.

Bibliography

Arendt, Hannah. 1972. Crises of the Republic (London: Harcourt Publishers).
Bartelson, Jens. 2017. War in International Thought (Cambridge: Cambridge University Press).
Beck, Ulrich. 1999. World Risk Society (Cambridge: Polity).
Coker, Christopher. 2017. Rebooting Clausewitz: On War in the 21st Century (London: Hurst).
Crawford, Kate. 2021. Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence (New Haven: Yale University Press).
Elias, Norbert. 2000. The Civilizing Process: Sociogenetic and Psychogenetic Investigations (Cambridge: Wiley-Blackwell).
FT Reporters. 2022. 'How Ukraine Tries to Undercut Moscow's Censorship over Russian War Victims,' Financial Times, 4 March: www.google.com/search?q=financial+times+how+ukraine+undercut&rlz=1C1AVFB_enGB750GB750&oq=financial+times+how+ukraine+undercut&aqs=chrome..69i57j0i546j0i546i649l2.9752j0j7&sourceid=chrome&ie=UTF-8
Fukuyama, Francis. 2022. Liberalism and Its Discontents (London: Profile Books).
Gray, John. 2016. The Soul of the Marionette: A Short Enquiry into Human Freedom (London: Penguin).
Habermas, Jurgen. 2006. The Divided West (Cambridge: Polity).
Hochuli, Alex, Hoare, George, and Cunliffe, Philip. 2021. The End of the End of History: Politics in the Twenty-First Century (London: Zero Books).
Ikenberry, John. 2020. A World Safe for Democracy: Liberal Internationalism and the Crises of Global Order (London: Yale University Press).
Kelly, Kevin. 2017. The Inevitable: Understanding the 12 Technological Forces That Will Shape Our Future (London: Penguin).
Krishnan, Armin. 2016. Military Neuroscience and the Coming Age of Neurowarfare (London: Routledge).
Kurth Cronin, Audrey. 2022. Power to the People: How Open Technological Innovation is Arming Tomorrow's Terrorists (Oxford: Oxford University Press).
Lem, Stanislaw. 2020. Return from the Stars (Cambridge, MA: The MIT Press).
Linklater, Andrew. 2020. The Idea of Civilization and the Making of Global Order (Bristol: Bristol University Press).
MacMillan, Margaret. 2020. War: How Conflict Shaped Us (London: Profile Books).
Mearsheimer, John. 2014. The Tragedy of Great Power Politics (New York: W.W. Norton).
Moreno, Jonathan. 2012. Mind Wars: Brain Science and the Military in the 21st Century (New York: Bellevue Literary Press).
Moyn, Samuel. 2021. Humane: How the United States Abandoned Peace and Reinvented War (New York: Farrar, Straus and Giroux).
Orwell, George. 2014. Seeing Things As They Are: Selected Journalism and Other Writings (London: Harvill Secker).
Pinker, Steven. 2011. The Better Angels of Our Nature: Why Violence Has Declined (London: Penguin).
———. 2019. Enlightenment Now (London: Penguin).
Spring, Marianna. 2022. 'The Young Ukrainians Battling pro-Russian Trolls,' BBC News, 6 March: www.bbc.co.uk/news/blogs-trending-60596133
Therborn, Göran. 2011. The World: A Beginner's Guide (Cambridge: Polity).
Virilio, Paul. 2006. Speed and Politics (Los Angeles: Semiotext(e)).
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism (London: Profile Books).

3 THE LETHAL STATE OF MODERNITY

In Homo Deus: A Brief History of Tomorrow, the futurist and historian Yuval Noah Harari explores the possible impacts of revolutions in biology and technology in the twenty-first century and beyond. Like Kevin Kelly in The Inevitable, Harari appears more anxious about the threats to humanity that emerge from the laboratory than about those that emerge from the future battle space. The threats to humanity will result from attempts to improve and transform the human condition through technological fixes, a world where non-conscious but 'highly intelligent algorithms may soon know us better than we know ourselves' (Harari 2017: 397). Harari suggests that famine, plagues and war will most likely continue to claim millions of lives in the twenty-first century. However, he suggests that these catastrophic events are no longer 'unavoidable tragedies' but are becoming 'manageable challenges' (ibid.: 19). In terms of war, Harari declares that 'a growing segment of humanity has come to see war as simply inconceivable' (ibid.: 15). Nuclear weapons have turned war into 'collective suicide' and the 'global economy has been transformed from a material-based economy into a knowledge-based economy' (ibid.: 15). Put bluntly, for Harari, wars are restricted to the zones where economies are material-based—such as the Middle East or Africa: 'What Rwanda earned from an entire year of looting Congolese coltan, the Chinese earn in a single day of peaceful commerce' (ibid.: 19). Terrorists lack the technological capacity to inflict serious damage on the world—and they will be controlled by new techniques of global surveillance and intervention. While he is not suggesting wars will disappear, he does not examine how wars will change, nor does he explore how war might be changed in times of artificial intelligence (AI), big data and transformations in biology and technology. For Harari, the dangers for humanity will come not from war but from the potentially destructive powers that emerge in the laboratories tasked with improving life, not
waging war. These dangers will constitute a threat to liberalism in a world where some humans will be ‘upgraded’ and those who are not upgraded, the not-​quite-​ replicants, used for the unpleasant jobs not carried out by machines, will live on the edges of society; the idea of a shared society governed democratically will come under threat as the ‘nature’ of the citizenry fragments and mutates in times of cyborgs or post-​humanism, a world of blade runners and replicants. The implication is that the ethical and political questions we will confront will make issues relating to war pale into insignificance in an age of upgraded humans. What will the human being look like by 2049? Possibly not much different from the 2020s—​but with access with a range of products and tools to radically transform what it means to be human. But what will human beings look like by the end of the century? What will that mean for war? While there are important differences between Steven Pinker, Yuval Noah Harari and Kevin Kelly, they all seem to share the view that while war will remain a problem in the twenty-​first century, war will be a problem of decreasing significance compared to all the other global challenges the states and individuals will confront. While they view the coming decades as a time of radical transformation in international politics and technology, there is a limited attempt to consider how these changes might result in different tactics and technologies in war. Simply put, the profound questions of the century will be about aging and health not death and the way old leaders can send young men and women to die. The dominant social and economic desire of the twenty-​first century will be the ability to make more of us like Mick Jagger—​not the 20-​year-​old Mick Jagger but the energetic Mick Jagger of 78: the health-​drive of older generations will take over from the death-​ drive that leads older men to send younger men to war. Some argue that the liberal optimism found in thinkers like Pinker, for example, is flawed to begin with in its data-​driven account of war and international politics. In ‘The Big Kill,’ John Arquilla argues that the various hopeful studies and pronouncements that declare our time to be less violent than the past are problematic in their reliance on ‘battle deaths’: The pattern of the past century—​one recurring in history—​is that deaths of noncombatants due to war has risen, steadily and very dramatically. In World War 1, perhaps only 10 per cent of the 10 million-​plus who died were civilians. The number of noncombatant deaths jumped to as much as 50 percent of the 50 million-​plus lives lost in World War II, and the sad toll has been rising ever since. Perhaps the worst, but hardly the only, terrible example of this trend can be seen in the Congo war—​flaring up again right now—​in which over 90 percent of the several million dead were noncombatants. As to Pinker’s battle death ratios, they are somewhat skewed by the fact that overall populations have exploded since 1940; so even a very deadly war can be masked by a ‘per 100,000 of population’ stat. (Arquilla 2012)
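To see the arithmetic behind this objection, consider a deliberately simplified illustration (the figures below are rounded and used only for the sake of the example, not drawn from Arquilla or Pinker). Expressed per 100,000 of world population, the same absolute death toll shrinks as the denominator grows:

\[
\text{deaths per } 100{,}000 = \frac{\text{war deaths}}{\text{world population}} \times 100{,}000
\]
\[
\frac{1{,}000{,}000}{2.3 \times 10^{9}} \times 100{,}000 \approx 43 \quad \text{(c. 1940)}, \qquad
\frac{1{,}000{,}000}{8 \times 10^{9}} \times 100{,}000 \approx 13 \quad \text{(2020s)}
\]

A war that kills a million people today thus registers at roughly a third of the per-capita rate it would have produced in 1940, which is precisely the masking effect Arquilla describes.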


It is more useful, Arquilla argues, to focus on the number of armed conflicts that are underway at any given time; here the numbers of wars taking place raises a note of caution on the idea that war and violence are on the decline. Arquilla focuses on ‘Big Kill’ wars where a million or more die and—​while there might not be the prospect of such wars on the horizon—​there is an ‘alarming trend’ of smaller conflicts (the Balkan wars of the 1990s, the civil war in Burundi 1993–​2005, the Chechen resistance to Russia) that cause the deaths of hundreds of thousands. Nassim Nicolas Taleb and Pasquale Cirillo unpack the problems with the ways in which Pinker uses data, accusing him of a ‘naive empiricism’ that places too much faith in the reliability of data, the definition of events and the problem of ‘fat tailed phenomena’: Pinker’s severe mistake is one of standard naive empiricism—​ basically mistaking data (actually absence of data) for evidence and building his theory of why violence has dropped without even ascertaining whether violence did indeed drop. This is not to say that Pinker’s socio-​psychological theories can’t be right; they are just not sufficiently connected data to start looking like science. (Cirillo and Taleb 2016: 6) John Gray (2015) adds that great powers/​liberal states fight proxy wars at a safe geopolitical distance: wars take place in fractured states with armed irregulars and mercenaries attacking civilians with tactics of ‘methodical starvation’ and ‘systematic destruction of urban environments’. It is also the case, when considering the data on war deaths, that there are ‘many kinds of lethal force that do not produce immediate death’ (ibid.). There are indirect causes of suffering and death that might not be captured in the data, consequences not captured by the bounded nature of ‘events.’ In terms of the debate on deterrence and the ‘Long Peace,’ John Gray (2015) suggests no serious military historian would question the idea that nuclear weapons prevent conflict between great powers, but liberal internationalists like Pinker are unwilling to consider the role of nuclear weapons in preventing industrial-​style war; progress is driven by the civilising process. An argument that is often viewed as fundamental in explaining the changing character of international relations is the view that deterrence and the threat of global destruction changes the calculations of leaders; interstate conflict between great powers plays out in the ‘grey zone’; great powers will wage ‘policing wars’ that play out below the strategic threshold that would draw in ‘peer’ or ‘near peer competitors’ that will produce destructive interstate war. Just as modernity has refined and intensified the practices of deterring violence and disorder domestically, moving from punishment as a violent spectacle through to less brutal forms of incarceration and potentially rehabilitation, the geopolitical world has been transformed by strategies of deterrence. For liberal optimists like Pinker, however, there appears to be an unwillingness to accept the idea
that the threat of violence and destruction is what limits the possibility for future catastrophic acts of war and destruction. The idea that violence ends violence is an unpleasant conclusion for those who believe in the possibility of a humanity that can overcome its basic instincts: ‘Whatever peace a policy of deterrence may produce is fragile, because deterrence reduces violence only by threat of violence’ (Pinker 2011: 35). To be sure, the architecture of an international politics underpinned by the constant evolution of weapons that hold the potential of global apocalypse is a costly and risky way to manage global political life. There is the possibility for a political and technological accident; as Fred Kaplan (2020) concludes in The Bomb: Presidents, Generals and The Secret History of Nuclear War, the twenty-​ first century is filled with dangerous possibilities in terms of weapons of mass destruction in this time of transformation and change in international politics. It might be the case that a new generation of leaders and strategists in a multipolar world view the use of nuclear weapons as an acceptable tool of war or geopolitical control or coercion; some leaders, as Virilio argued, might produce suicidal states; there is the constant possibility of a strategic or technical accident (Virilio 2006). There is also the economic cost and ecological risk of a civilisation that ends war through the constant threat of its self-​destruction. So, it is likely the coming decades out to 2049 will see wars and conflicts that produce ‘direct’ and ‘indirect’ forms of human suffering. The landscape of global war might not involve interstate wars, but a continual proliferation of ‘small’ conflicts driven by a variety of causes and intensified by the use of a broad range of technologies and tactics, conflicts on the ‘margins’ or ‘edges’ of world order. But it might also remain the case that liberal states are willing to risk the human and economic costs of wars in a manner that is a legacy from the time when liberal states were colonial powers: liberal states might not have overcome the tendency to fight the ‘relentless’ wars that presidents and policymakers declared to have ended with the withdrawal of the United States from Afghanistan in 2021. Writing about democracies and the violence of empires, the Cameroonian political theorist Achille Mbembe challenges the narrative of liberal democracies as a pathway from crude, unstable and violent medieval societies to the civility, refinement and culture of the Renaissance, the Age of Enlightenment and Modernity; while the emergence and maintenance of liberal democracies depended on attempts to control violence through legal means, and the type of moral policing focused on producing better individuals in the ‘civilising process,’ the ‘brutality of democracies has simply been swept under the carpet’ (Mbembe 2019: 16). Mbembe argues that underneath many of our accounts of liberal internationalist history is the ‘nocturnal body of democracy’ where bodies are taken by force to work as slaves; where families are separated and violently controlled in slave ships, plantations and camps; where democracies rule ruthlessly over empires; where control is exerted through everyday punishments and humiliations: the horror produced by the liberal way of war and business was not a miscalculation but the normal tactics of colonial projects. A system where the plantation owners in the West Indies were able to fund
art and culture, to contribute to the ‘civilising process,’ through the profits resulting from the control and exploitation of bodies transported across the planet, bodies viewed as less valuable than European bodies (ibid.: 19). These histories of democracy and capitalism—​and the role that slavery and empire played in the production of the modern world—​continue to make liberal societies uncomfortable (as we see in the debates on how the history of colonialism should be taught in schools). Of course, the liberal internationalist will possibly respond that we are beginning to examine the ‘nocturnal body’ of democracy; and modernity has been a process where moments of racial violence and control have resulted in political processes that have overturned what Mbembe describes as ‘necropolitics,’ the brutal techniques used to dominate (and destroy) non-​White bodies. And while there are still brutal events of racial control and violence, the liberal internationalist will suggest there is a growing planetary awareness that signals a deeper recognition on the role of racism that continues to play in our contemporary world politics and in our colonial histories. This transformation will continue to change the world. But thinkers like Mbembe are not so convinced. A Europe already anxious about migration will confront a world where climate change might increase flows of climate refugees, resulting in more brutal forms of border control and policing, along with intensified control inside liberal states to create hostile environments for migrants and refugees. In terms of war and international conflict, we live in a time when there is the view that the United States is rethinking its role in international politics; the costs of its twenty-​first century ‘relentless’ wars have raised questions about war as a means to make the liberal states safer and the world order more orderly. But Mbembe’s writings seem less convinced we will be able to move beyond an approach to international politics that views liberal states as the world’s policeman with the right and capacity to wage war in territories inhabited by black or brown bodies. This historical legacy—​or habit—​might take longer to overcome: in the meantime, liberal tactics of war will evolve in response to anxieties in our ‘civilising process’ about the possibility for inhumane acts and dehumanising necropolitical policies and practices—​and the emerging technologies that can transform warfare: the trends outlined in the book might be viewed as the path to protopian war or they might equally be viewed as the latest stage in necropolitics. It might also begin to become a ‘necropolitical’ habit of other states in the international system who might view others in a colonial or racist frame as they develop their foreign policies. What thinkers like Mbembe are suggesting is that the liberal world is still close to this ‘nocturnal body’ of democracy. To be sure, we might be in the process of overcoming the nocturnal body, but liberal states are still a cause for concern in international politics; multipolarity (or post-​unipolarity) in international politics might result in new types of competition that create necropolitical violence and tactics; the tactics might evolve, but we will continue to see war in terms of the fundamental solution to global problems; warfighting remains a vital preoccupation for liberal states—​and while development and improvement of the war machine
might be a strategy of deterrence, the constantly evolving military capabilities produce necropolitical temptations: the implication of Mbembe's work is that the 'creative' energy and expense devoted to the war machine could be used to develop alternative strategies for how we live together on the planet—but that would be too unsettling for those used to a particular way of being in the world (Mbembe 2023). The civilising process of modernity is far more ambiguous than the liberal optimists like Pinker often acknowledge, and this ambiguity should raise a note of caution about 'progress' in international politics in the twenty-first century: the future of war will be shaped by the ambiguity and tensions inside the civilising process of liberal modernity.

Zygmunt Bauman: Modernity and Violence

A different view on violence and the 'civilising process' is developed in Zygmunt Bauman's Modernity and the Holocaust. The Polish sociologist challenges the view that the Holocaust was an eruption of 'primitive' violence in modern Europe, a return or regression to barbarism in a state that had played a central role in the philosophical and cultural development of modernity. Rather than being a destructive collapse of modernity or a catastrophic erasure of the 'civilising process,' the violence of the concentration camps was an outcome of the civilising process, an event that was possible because of the social, political and ethical characteristics of modernity. What Bauman illustrates in Modernity and the Holocaust are the techniques of management and organisation that made it possible to use the processes of mass production that were transforming economic life in the first half of the twentieth century to create a Fordist production line of death (Beilharz 2000). The department of the SS headquarters in charge of the destruction of the European Jews was called the Section of Administration and Economy; Bauman suggests that the title does not simply reflect a desire to deceive or mislead people about what was being designed and orchestrated from the offices:

Except for the moral repulsiveness of its goal (or, to be precise, the gigantic scale of the moral odium), the activity did not differ in any formal sense (the only sense that can be expressed in the language of bureaucracy) from all other organized activities designed, monitored and supervised by 'ordinary' administrative and economic sections. (Bauman 1991: 14)

The horror of the camps was made possible by the bureaucratic tools and processes of modernity, the techniques of the civilising process that make many of us the efficient technicians of whatever task we have to perform. Bauman suggests that the camps were not made possible by psychopaths who lacked any sense of empathy towards the men, women and children in the camps;
while there may have been psychopathic individuals in the camps, the majority of those who worked in all aspects of the organisation of the camps were just ‘normal’ Europeans from the middle of the twentieth century, no ‘better’ or ‘worse’ than people found in England, France or Spain. Indeed, the challenge for those managing the camps, the designers of the Holocaust, was how to ensure that ‘normal’ individuals could function efficiently when confronted with the horror of the camps: We know that people enlisted into the organizations most directly involved in the business of mass murder were neither abnormally sadistic nor abnormally fanatical. We can assume that they shared in the well-​nigh instinctual human aversion to the affliction of physical suffering, and even more universal inhibition against taking life. (Ibid.: 20) Central to what Bauman illustrates in Modernity and the Holocaust is how it is possible to create ‘moral distance’ between those contributing to the killing that took place in the camps and the victims; Bauman is suggesting that part of the civilising process is learning to be indifferent—​we couldn’t function in the modern world if we had not learnt to become indifferent to all the unpleasantness we encounter in everyday life (or that we contribute to either directly or indirectly), to control our sense of being a moral being; we can only function as stable, efficient consumers, employees and citizens if we are able to remain ‘calm and carry on’ when confronted by homelessness on our streets, poverty in our communities, decisions we have to make in the workplace, brutal conditions in the factories that make our consumer goods, the horror of industrial meat production, the possibility of species extinction and ecological destruction from our economic development, and so on. The civilising process is about learning to care for ourselves and others in new ways—​but it is also learning to not care (or to control our responses) for the increasingly number of people we encounter or the local/​global problems on the ‘threat horizon’—​and in our dystopian nightmares. In the camps, Bauman suggests, there was an attempt to try to limit moral proximity to the inhabitants and victims of the camps, to produce and maintain moral distance; and as Jonathan Freedland depicts in vivid and harrowing detail in his novel The Escape Artist, the camps also functioned through the use of a number of strategies to deceive victims about the purpose of the camps and its processes and objectives, strategies that made it easier to control people through tactics of spatial organisation and psychological manipulation (Freedland 2023). Moral distance can operate through the construction of a people as ‘sub-​human’ or dangerous predatory humans (the ‘othering’ that emerges from mass campaigns of demonisation and scapegoating often central to violent conflict). But moral distancing also emerges from the way a factory-​logic is installed where jobs are broken down into a variety of tasks where the only criteria for the action to be
carried out is efficiency; like many ‘modern’ jobs in offices, an individual has to perform the task at hand and not think about the ‘human’ or ‘moral’ implications of what you are doing. As Bauman notes, ‘a multitude of vengeful and murderous individuals would not match the effectiveness of a small yet disciplined and strictly co-​ordinated bureaucracy’ (Bauman 1991: 20). Violence was authorised (‘official orders coming from the legally entitled quarters’), routinised (‘by rule-​governed practices and exact specifications of roles’) and dehumanised (‘by ideological definition and indoctrinations’) (ibid.: 21). As Hannah Arendt illustrated in Eichmann in Jerusalem: A Report on the Banality of Evil, a study that attempted to understand the mentality of the bureaucrats that organised the Holocaust, what was striking was the mediocrity of those orchestrating the camps, their inability to think about the broader consequences of what they were doing, the ‘banality of evil’ that would take place in offices and bureaucracies (Arendt 2006). But Bauman illustrates how moral distance can function in the everyday operation of tasks in the camp—​the jobs and roles reduced to the banality of tasks to be performed efficiently—​and how there can be a spatial dimension to the social production of indifference. When we do not see the consequences of what we do (or what is done in our name) it is possible to function as an efficient worker/​individual. The civilising process is about managing the increasing separation between the modern citizen and all the unpleasant activities that we might have previously been more familiar with (the proximity to the death of people in families, the killing of animals for food); in modernity, many have the possibility of remaining distant from aspects of life that might unsettle us or make us uncomfortable. The camps were an extreme example of this distancing, an attempt to stop people thinking too much about the consequences of their actions. The efficiency of the camps depended on making the victim ‘psychologically invisible’ through the ‘mediation of action.’ When people were killed at point blank range, shooters were faced with the ‘reality’ of the violence they participated in; the acts were harder to reduce to the banality of evil: This is why the administrators of genocide found the method primitive and inefficient, as well as dangerous to the morale of the perpetrators. Other murder techniques were therefore sought—​such as would optically separate the killers from their victims. The search was successful, and led to the invention of first the mobile, then the stationary gas chambers; the latter—​the most perfect the Nazi had time to invent—​reduced the role of the killer to that of the ‘sanitation officer’ asked to empty a sackful of ‘disinfecting chemicals’ through an aperture in the roof of a building the interior of which he was not prompted to visit. (Bauman 1991: 26) Pinker refers to the perspective of thinkers such as Bauman on the Holocaust in The Better Angels of Our Nature in terms of people who see ‘reason’ as ‘overrated,’ thinkers who are anti-​Science and anti-​Enlightenment, thinkers who will ultimately
contribute to the type of irrationality that resulted in the concentration camp (Pinker 2011: 643). Pinker's work is driven by the fear that to raise any notes of caution about the past, present and future of capitalism, liberal democracy and the Enlightenment is to unleash a nihilistic movement that will lead to an age of camps, cults and catastrophe; Pinker might not see violence spreading around him, but he sees a constant cultural threat to the sources of progress. But Bauman is not suggesting that because the modern European world produced the concentration camp or necropolitical violence we should reject modernity or liberal democracy; on the contrary, he suggests that it is when a plurality of perspectives on events is closed down that we risk the rise of dangerous authoritarianism or totalitarianism; in other words, we need democracy and a plurality of perspectives to challenge the often dangerous simplicities of us/them, self/other, foreign danger versus domestic security and order. Simply put, thinkers like Bauman are not—as Pinker seems to imply—suggesting that we reject all attempts to improve the human condition through modern ideas or technologies; he is suggesting that we constantly need to question all that is sold to us as the route to progress, to think more critically about the dangers that might be hidden in the bureaucratic language of progress, innovation and necessity. Pluralism, Bauman suggests, 'is the best preventive medicine against morally normal people engaging in morally abnormal actions…The voice of individual moral conscience is best heard in the tumult of political and social discord' (Bauman 1991: 166). It is in this sense that authoritarian regimes in the twenty-first century are concerned with controlling and manipulating the circulation of information, to limit discussion about the ethical consequences of policies or actions (or inaction). At the same time, however, in the liberal world the proliferation of voices risks becoming 'weaponised' in ways intended to generate confusion, apathy or mistrust. As Paul Virilio observed in a comment on the Kosovo War (1998–1999) that anticipates the contemporary anxiety over fake news, disinformation and social media:

whereas in the past it was lack of information and censorship which characterised the denial of democracy by the totalitarian state, the opposite is now the case. Disinformation is achieved by flooding TV viewers with information, with apparently contradictory data. (Virilio 2001: 48)

Bauman is issuing a warning about the present and future in his work on the Holocaust: this could happen again, and the civilising process is possibly more fragile than a thinker like Pinker admits. For Bauman, Norbert Elias is pointing to an ambiguity in the civilising process and the ways in which violence is transformed by it; Pinker, on the other hand, appears to present the civilising process as the potential overcoming of violence through the creation of more enlightened, humane beings. Bauman explains that the work of
Elias is pointing to a transformation of violence—​or relocation of violence—​in modernity: As we have seen, the apparent elimination is in fact merely an eviction, leading to the reassembly of resources and disposition of centres of violence in new locations within the social system. According to Elias, the two developments are closely interdependent. The area of daily life is comparatively free from violence precisely because somewhere in the wings physical violence is stored—​ in quantities that put it effectively out of the control of ordinary members of society and endow it with irresistible power to suppress unauthorized outbursts of violence. Daily manners mellowed mainly because people are now threatened with violence in case they are violent—​with violence they cannot match or reasonably hope to repel. The disappearance of violence from the horizon of daily life is thus one more manifestation of the centralizing and monopolizing tendencies of modern power; violence is absent from individual intercourse because it is now controlled by forces definitely outside the individual reach. But the forces are not outside everybody’s reach. Thus the much vaunted mellowing of manners (which Elias, following the etiological myth of the West, celebrates with such relish), and the cosy security of daily life that follows have their price. A price that we, dwellers in the house of modernity, may be called to pay at any time. Or made to pay, without being called first. (Bauman 1991: 107) So, for Bauman we might not be as far from those conditions as Pinker and others believe. In his analysis of Nazi Germany and the concentration camps in Black Earth: The Holocaust as History and Warning, Timothy Snyder (2016) raises the possibility of a future where geopolitical pressures generated by climate change result in a violent defence of territory (living space) in a world of constant flows of climate refugees; it might be the case that while citizens of liberal states have a serious concern with the suffering of ‘distant others’ in wars, famines, pandemics and natural disasters, there may be catastrophic social and political events that produce and require moral distance and indifference. Bauman is warning us that we should be wary of the faith that liberal states will become increasingly responsible in their approach to war and their foreign policies, learning from their mistakes; liberal citizens can become indifferent to the actions orchestrated by their leaders. And protopian tactics and technologies of war might be driven by a desire to make warfare more humane—​but, equally, they might be driven by the desire to keep actors at a safe (moral) distance; tactics and technologies of increasing precision and distance can work to perpetuate the role of war in societies shaped by the ambiguity of the civilising process. In the twenty-​first century, Bauman turned his attention to the ways in which all aspects of existence in consumer society were changing in a time of ‘neoliberal’ globalisation and disruptive economic, social, political and technological change;
Bauman sees a transformation from a 'heavy modernity' based on states, 'solid' identities, 'welfare states' and a faith in the future through to a 'liquid modernity' where there is anxiety about the liberal state's ability to provide protection, security and safety for its citizens. The wars conducted by the West at the beginning of the century were about policing the planetary frontier, 'remote war' that uses the latest technologies to distance populations from the reality of war while also attempting to limit the exposure of the military to risk through new technologies such as drones and 'light-footprint' operations. Indeed, the essays in the edited collection In/Visible War: The Culture of War in Twenty-First Century America by Jon Simons and John Louis Lucaites point to the ambiguity of a time when we can be exposed to more information about war at the same time as the 'reality' of war can be made increasingly 'invisible' through deliberate strategies of control—or through the desire of citizens to remain distant and indifferent (Simons and Lucaites 2017). Even authoritarian regimes have to deal with an information environment where they may struggle to control images of war deaths and injuries. Indeed, in March 2022 there were reports that the Russian army was using 'mobile crematoriums' to prevent the circulation of the type of pictures or footage that might generate or intensify a backlash against the war (Nicholls and Vasilyeva 2022); the mobile crematoriums are used as a means of maintaining moral distance from the reality of war. Mbembe would add that wars on the 'planetary frontier' are a continuation of colonial strategies that remain indifferent to the suffering of non-white bodies; while liberal states might seek to minimise harm to black and brown bodies in the conflicts of the twenty-first century, they are still willing to create situations where non-white bodies are placed at risk of direct and indirect harm that would be unacceptable for white bodies, risks that are viewed as legitimate policies in order to protect the citizens of liberal states. In this hierarchy of human value, the risk of killing a non-white person can be acceptable in states like Iraq if it results in better lives for future non-white people and a safer international community; this is a legacy of the nocturnal body of democracy that continues into the twenty-first century. As Grégoire Chamayou illustrates in Manhunts: A Philosophical History, hunting human beings for economic or security reasons is a thread that runs through civilisation and is unlikely to disappear from the human condition (Chamayou 2012). The manhunt will take place with new technologies and tactics in the twenty-first century: the French intellectual Paul Virilio explored these questions of technology, security and war, raising even more notes of caution about modernity and the civilising process; his work is outlined in the following section.

Paul Virilio: Modernity, War and Acceleration

Paul Virilio was interested in how the desire for speed transforms the world—​ and war (Virilio and Brausch 2011). For Virilio, war and international politics is a history of who is the fastest, who can make decisions the fastest, who can


communicate across an organisation the fastest, who can move troops across territory the fastest, who can have a missile that reaches the target the fastest. Virilio sees this capacity to scale up the size of militaries in modernity—​and be faster than the enemy—​as the source of new processes (such as the emergence of military logistics that drives broader transformation in the logistics of the global economy) and technologies (information communication technologies, medicines and innovations in healthcare) that transform all aspects of life. Society, Virilio tells us, progresses at the speed of its weapons systems. The world has never been this ‘fast,’ but it will never be this slow again: although for Virilio the speed and interconnectedness of the world might result in technological accidents that might make us ‘slow’ again. One of the current debates on society and war is that the technologies that are transforming life (and possibly death)—​and ‘speeding up’ life—​do not necessarily emerge from the military’s desires for technological innovation and advancement in the way that Virilio outlined in books such as Speed and Politics. In Social Acceleration: A New Theory of Modernity, the sociologist Hartmut Rosa shows how the institutions of the military are no longer the ‘pacesetter’ of social acceleration (Rosa 2015: 203). It is now ‘tech’ companies that are revolutionising the world, the companies on the cutting edges of data science, the ‘life sciences,’ AI and robotics; the military—​or rather the state—​is not the producer of innovation but the actor that needs to harness the new possibilities that are being created by the tech ‘megamachines’ or the latest ‘start up’ in AI or robotics. Indeed, the relationship between the military and technology companies has been the source of some widely publicised tensions with some Google employees concerned about being involved with military projects such as Project Maven (Crawford 2022: 190). The pace of change across a range of emerging technologies feels both exciting and unsettling; it is hard to imagine what the latest ‘smartphone’ in 2035 or 2049 or 2067 (the setting of an apocalyptic movie on an Earth dealing with climate change and nuclear war) will look like, how it will change our lives or how it will change politics and business. Will smartphones exist in 2049? Will they be integrated into our body? What will a ‘body’ be in 2049? And how will future smartphones (whatever they will have become) change war or crime? It might, of course, be the case that the latest smartphone in 2035 or 2049 might not be radically different from the phones we use in the early 2020s. The most significant transformation might be linked to their energy consumption or how they can be recycled in economies where sustainability is a vital objective. Or the future smartphone might be beyond anything we can currently imagine—​taking us into the realm of ‘neuralink’ technologies and ‘brain–​computer interfaces.’ One popular way of understanding the foundations of the ‘pace of change’ is the idea that Moore’s Law is producing disruptive exponential change across technology, economy and society. Named after Intel’s Gordon Moore, Moore’s Law refers to the belief that computing processing speeds double every 18 months (Baldwin 2019: 97); there is debate about whether research into the faster


chips will continue to drive technological transformation or whether the pace of change will switch into other areas such as health and at the intersection of biology and technology. There is also the question of whether Moore's Law will be transformed by innovations resulting from AI (which some would argue is an area made possible by the acceleration of Moore's Law) and quantum computing. There are also other 'laws' often used to understand the increasing complexity and disruptiveness of technology (and technology corporations) on society (such as Metcalfe's Law on the value and power of networks as they grow) (ibid.).

Some of the most urgent contemporary debates are focused on how societies and economies should respond to the new actors of what Shoshana Zuboff (2019) describes as 'surveillance capitalism,' on how we will manage the disruptive consequences on work and economy that will emerge from possibilities in AI, robotics or automation (Varoufakis 2021). The liberal internationalist (or protopian optimist) might be anxious about the ethical and legal questions that might surface in a world where innovation may emerge beyond the control of liberal societies—or beyond the control of any state. A dominant theme in science fiction books and movies is the corporations that will be the key actors that transform the human condition, the planet and the galaxy (the Tyrell Corporation in Blade Runner or the Weyland-Yutani Corporation in Alien). But the liberal internationalist sees all the problems generated by the pace of change in technology as challenges that will be managed. The dystopian visions of the future show us a world where the power of corporations or technological innovations are transforming the world in the most destructive ways; the reality is that there will be powerful counter-balancing forces in society: legal, ethical and social forces that will place limits on the more dangerous possibilities of future technology. Dystopian science fiction movies are part of the protopian process of societal and individual reflection and critique.

For the liberal optimist, the world is being transformed by technology and processes of globalisation that (even in times of anxiety about 'de-coupling') are improving the human condition (and non-human condition, in all its forms); the twenty-first century will be one of economic disruption, disruption that will possibly cause severe societal and geopolitical tension, but the primary and long-term process will be of economic growth that improves the quality and security of life for all around the planet; wars will continue to generate setbacks to a progress that will continue to unfold—but they will be managed in increasingly humane ways. To live in the poorest sectors of the planet in 2049 will not mean hunger, lack of education or unpleasant jobs. New technologies of robotics, AI, biotechnology and data science will mean that people will increasingly be replaced by machines, but this disruption will be managed by innovative economic policy (e.g., the implementation of 'basic income schemes') and offset by the emergence of new jobs and possibly more 'rewarding' jobs. All people will be healthier and safer wherever they are: there will be no streets where someone can be hurt or abducted without being seen by the watchful eye of a 'vision machine' (Bousquet


2018); managing pandemics and ecological degradation will be the focus of our ingenuity and incredible AI-enhanced technological power; the enlightened self-interest of the economic actors that are benefitting from the global economy will find ways to prevent society becoming a science fiction dystopia like Elysium, divided between the upgraded humans and pre-upgrades. A planet with dangerous inequality that could tip out of control—especially with the new technologies that all people can access, the democratisation of technology that creates 'super-empowered' individuals and groups—will be managed by a constantly improving and innovating technocratic politics of security. The world out to 2049 will not simply be richer in terms of the wealth that is produced and distributed; it will be richer in terms of experience; we will have the possibility to live in global communities where distance is no barrier to communication and connection, we will live in virtual worlds that open up the possibility for new forms of solidarity, education and enjoyment through the incredible innovations in our Apple Vision Pros. People will have access to education wherever they are and whatever resources they have (the education from an Ivy League university accessed online from the poorest villages or 'slums' around the planet). We will live longer and better—and more of us will live longer and better. Rather than descending into dystopian pessimism that may incapacitate us (or make reactionary political movements more attractive), we should be excited about the coming decades and a century that will be architecturally, technologically, environmentally, culturally and politically exciting, enriching and rewarding. There will be 'glitches' on the way, but the experience of the twenty-first century will be awe-inspiring. Of course, there are people who are raising awareness about the dangers of the world that is emerging—on 'surveillance capitalism,' the 'rise of the robots,' the upgraded humans, the dangers of 'the Anthropocene'—but these are all part of the process, the raising of awareness so that we can ensure we make better, safer and more equitable use of the technological possibilities accelerating towards us. History is still a story of progress, of protopia.

Virilio, on the other hand, sees the possibility of a world that is far more fragile than the visions of the future offered by the protopian or liberal internationalist. While history might be a history of progress, of improvements in technology, health, education, the civilising process that expands our sense of 'moral community,' modernity is also a history of endless war and technological accidents. Every technological innovation in modernity creates the possibility of an accident: the Titanic, Chernobyl, financial crisis, climate catastrophe (Virilio 2003). And for Virilio, the twenty-first century might see the proliferation of accidents that emerge from both our technological advancement and acceleration—and our intensifying global interconnectedness; these accidents will be magnified in scale and destructiveness due to our globalisation, entanglement and 'volume'/scale as a species, a species that has forced the planet into the Anthropocene, a planet that is becoming as 'unnatural' as our cyborg bodies will become.


The cascading effects of climate change, cyber(in)security or AI, the emergence of a pandemic resulting from a laboratory experiment (for a bioweapon for future warfare) might exceed the capacity of states to 'restore' order, domestically or internationally; the global accident might be the 'black swan' event that we currently cannot see or imagine (Taleb 2010). Covid-19 gave us an insight into the possibility of a world where all elements in society could be radically and quickly disrupted and, for Virilio, the twenty-first century will be filled with the possibility of events that transform everything we thought were 'certainties' about the security of our families and fellow citizens. What we have experienced so far—9/11, financial crisis, Covid-19, war across the Middle East and in Europe—is just a taste of the 'general accidents' to come, accidents that will radically transform and degrade the world. Simply put, the world depicted in Blade Runner 2049 remains a possibility in a way that it would not for the protopian or liberal internationalist; and we need to open our eyes to some necropolitical possibilities beyond anything we are discussing or writing about in our studies of future war and international politics: new actors, technologies, terrains and tactics that are like something from our worst necropolitical/David Cronenberg-directed nightmares.

Virilio sees a 'fragility' to the world that possibly results from his experience as a child during the Second World War in France, the sense that the world had the permanence of a film set (it might also result from a particularly French strand of 'apocalyptic' thinking compared to the more liberal internationalist or protopian thinking often found in the United Kingdom and United States) (Virilio and Brausch 2011). The 'realist' of international relations views the 'tragedy' of great power politics in terms of the constant return of world war; for the Virilio-like pessimist, the tragedy of the world is also our ability to produce chaos through war and technological accidents in our search for security and progress. For Virilio, the nuclear deterrence that underpins the 'long peace' since the Second World War is a risky and fragile foundation for 'world order.' Virilio sees the possibility of a century of increasingly severe accidents and catastrophes that we will struggle to manage—with each one posing novel challenges—and that we will struggle to recover from, each event or accident compounded or overtaken by another event. We will deal with unrelenting waves of accidents; just as we recover from financial crises and pandemics, we will be forced to deal with the inescapable consequences of a new war or disaster. Rather than a century that enables the flourishing, improvement and refinement of the human condition, the future is one where we live with escalating technological mess, chaos and waste. The future is not the spreading of a clean, sleek and stylish 'retro-futuristic' Minority Report-like utopia; the future is the spread of the uneven and divided megacities we see today in Kolkata, Dhaka or Lagos—with the technology of science fiction. For Virilio, the future is as depicted in Children of Men, Elysium or Alita: Battle Angel. The liberal internationalist visions of the future and our better angels will reflect a very specific (and local) moment and


perspective on modernity from the early stages of the twenty-first century in the Euro-American world. After 9/11, Virilio suggested we were in an age of impure war, war where states fight networks or terrorist groups, where liberal states deploy activities under the threshold of 'conventional' or traditional warfare (such as economic war or cyberwar). At the same time, non-state actors are developing the capacity to unleash damaging events with global impact; states respond with impure wars different from the wars and conflicts of the previous century, using new technologies and tactics that reflect the change in scale in international politics (the 'kill boxes' and 'kill webs' of drone war, pattern-of-life analysis and surveillance, cyberattacks that originate in small devices like memory sticks). In previous times there would be a 'hierarchy' in international politics between those states and actors that had the advanced and sophisticated technology—and those at the bottom, with limited access to the tools that could enhance their capacity (Kurth Cronin 2022). In the twenty-first century, the anxiety is that the pace of change means that the powerful need to work harder to 'stay ahead' in technological arms races as other states and actors obtain access to the new tools of conflict and business. This anxiety is central to much of the thinking in the United States (in initiatives such as the Third Offset Strategy) on the need to use, in particular, AI to stay faster than the enemy, to explore new possibilities in human–machine teaming, and to stay on the 'cutting edge' when competitors such as China may soon overtake the liberal world in the development of AI (Kania 2017). The liberal optimist will argue the pros still outweigh the cons and can be managed; the Virilio-like pessimists will see a century of potential disorder and catastrophe.

So, on the one hand, the coming decades will most likely see a liberal world cautious about embarking on the types of interventions seen in Iraq and Afghanistan. There will be concern about the costs and risks of wars in a time of intense geopolitical competition; there will be concern about the human costs of war (and the instability that can result from these interventions) in a time of radical social media interventions and reporting. At the same time, there will be states and other actors that seek to exploit and shape a period of geopolitical transformation and technological innovation. What will private security companies—often assisted by states—be attempting to do/exploit in a period of transformation to a multipolar world order, a world of emerging possibilities and vulnerabilities? What will the Wagner private military group be doing in parts of Africa in the decades ahead? What will liberal states be doing in response? How will the capacities of groups like Wagner be enhanced in accelerated times of AI, robotics and bioweapons? I think Virilio would be hesitant about suggesting that lessons have been learnt from the liberal way of warfare since 9/11 or that policymakers in liberal states are thinking seriously about what is on the horizon. Similar to Mbembe, Virilio would be hesitant about suggesting that liberal states are beyond their colonial, necropolitical past in the way they view their right to use force in the world. Furthermore, the liberal world will continue to prepare for a variety of future wars and will continue


to spend vast sums on the transformation of war, constantly seeking innovation across a range of technologies as part of strategies of deterrence. Deterrence might prevent open interstate warfare—but we will be living in an Anthropocene of unprecedented militarisation/securitisation and what Virilio describes as endo-colonisation, the colonisation of all aspects of life by policies and technologies of security (and control). Writing about the Kosovo war in 1998, Virilio wrote:

With the doctrine of the 'revolution in military affairs', American technology seems to be becoming today, for Bill Clinton, a sort of Wonderland in which the warrior, like a child in the playpen, wants to try out everything, show off everything, for fear of otherwise seeming weak and isolated.
(Virilio 2001: 10)

I think Virilio's concern here is that liberal states place huge resources into innovating the technologies and techniques of war and the desire to use them is often too great; a superpower wants to produce a spectacle of its superpower status as a warning to those who might challenge or attack. War becomes a zone of experimentation to test the results of what is a huge part of what states devote their energies to. It might also be the case that policymakers are seduced by the promise of these new 'tools' they have access to in a manner that might play down the potential for accidents and unintended consequences, as well as limiting the critical discussion on the broader questions about the strategies that are being pursued: many commentators suggest that the thinking about the war in Iraq after 9/11 placed too much faith in the technologies of war, failing to explore the broader strategic basis for war (as well as thinking about the broader consequences of the war and the post-war challenges in Iraq) (Walt 2005). Virilio's warning is that liberal states will arrive at geopolitical moments when the possibilities of new technology create a 'wonderland': to be sure, experiments with emerging tactics and technologies will be viewed as the latest stages in the liberal way of humane warfare, with greater levels of precision, control and risk management, the new potential of increasingly non-lethal options for warfighting. But as commentators like Moyn (2021) would argue, the optimism about the possibilities of humane warfare can drive the continuation of war as a primary 'problem-solving' technique in international relations. The following chapters begin to explore the emerging possibilities in this time of technological acceleration for the liberal way of future warfare.

Concluding Remarks

The visions of the future in the writings of people such as Pinker or Kelly remain a possibility. The liberal world might drive a transformation of international politics towards a world where war becomes an exception in a rules-​based order where the


use of violence gradually disappears from all aspects of life; or if military force is used, it will be increasingly non-​lethal, shaped by new technologies and tactics of precision, control and responsibility. This liberal drive is unlikely to disappear from the politics and values of liberal societies out to 2049—​although it might compete with the desires of authoritarian states that are indifferent about the suffering their policies cause; at the same time, it might also be the case that states like China are keen not to get caught up in expensive and complex wars overseas; the consequences of the Russo-​Ukrainian war might be vital in shaping attitudes to war outside the liberal order in the decades ahead. And caught up with waves of protopian techno-​optimism and a sense of liberal democratic exceptionalism, liberal states might embark on future necropolitical wars of regime change around the planet, future Iraqs or Afghanistans. As the last two chapters have tried to demonstrate, there is ambiguity in the ‘civilising process’ of liberal states. Liberal states can be involved with ‘invisible wars’ where the citizenry remain indifferent from the suffering in distant territories (caused by war or economic policies and strategies); liberal states are used to viewing the world outside of Europe and North America in terms that might remain a legacy from colonial times. Liberal states that continue to invest vast resources into the research and development of future war might continue to see war as an inevitable and necessary tool of international politics: it is what we are ‘hardwired’ to do. But the approach to the liberal way of warfare is also driven by a concern with ‘improving’ war, with making war more ‘humane.’ In Blade Runner there is an uncertainty about the identity of Rick Deckard and the question of whether he is human or a replicant, a replicant being used to hunt down the replicants that have gone rogue; there is an uncertainty about whether Deckard suspects he is a replicant or is secure in his sense of being human. For Virilio or Bauman, liberal citizens and societies exhibit a similar uncertainty, holding on to their sense of being on the side of the better angels while often suppressing any anxiety on the possibility that they might actually be replicants. In other words, we need to at least consider the possibility that we might be replicants, products of a ‘civilising process’ that is darker than we like to believe and contributing to the production of future necropolitical possibilities in technological advancement and innovation; we then have to work out ways to live with that knowledge, to learn how to live with others and ourselves (a theme that emerges in Blade Runner 2049 on the complexities of the relations between humans and the replicant ‘freedom movement’). At the same time, emerging technologies of biology, AI and robotics might take liberal citizens and societies to places where there is no ambiguity on the nature of the civilising process. In the following chapters I want to explore the trends that might transform the liberal way of war out to 2049: the impure, the granular and the machinic. While all these trends are very much outcomes of the liberal way of humane warfare, approaches to war that emerge from the cultural, ethical, political, technological and strategic contexts of liberal states, this position is not suggesting that the protopian


future is on the horizon out to 2049. It might be the case that these trends reflect an 'improvement' in the conduct of war. But following Virilio and Bauman, these trends are also filled with the possibilities of accidents, unintended consequences and necropolitical violence.

Bibliography

Arendt, Hannah. 2006. Eichmann in Jerusalem: A Report on the Banality of Evil (London: Penguin).
Arquilla, John. 2012. 'The Big Kill,' Foreign Policy, 3 December: https://foreignpolicy.com/2012/12/03/the-big-kill/
Baldwin, Richard. 2019. The Globotics Upheaval: Globalization, Robotics, and the Future of Work (London: Weidenfeld and Nicolson).
Bauman, Zygmunt. 1991. Modernity and the Holocaust (Cambridge: Polity).
———. 2002. 'Reconnaissance Wars of the Planetary Frontierland,' Theory, Culture and Society, Vol. 19, Issue 4: 81–90.
Beilharz, Peter. 2000. Zygmunt Bauman: Dialectic of Modernity (London: Sage).
Bousquet, Antoine. 2018. The Eye of War (Minneapolis: University of Minnesota Press).
Chamayou, Grégoire. 2012. Manhunts: A Philosophical History (Princeton: Princeton University Press).
Cirillo, Pasquale and Taleb, Nassim Nicholas. 2016. 'The Decline of Violent Conflicts: What Do the Data Really Say?': https://papers.ssrn.com/sol3/papers.cfm?abstract_id=2876315
Crawford, Kate. 2022. Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence (New Haven: Yale University Press).
Freedland, Jonathan. 2023. The Escape Artist (London: John Murray).
Gray, John. 2015. 'Steven Pinker Is Wrong About Violence and War,' The Guardian, 13 March: www.theguardian.com/books/2015/mar/13/john-gray-steven-pinker-wrong-violence-war-declining
Harari, Yuval Noah. 2017. Homo Deus: A Brief History of Tomorrow (London: Vintage).
Kania, Elsa. 2017. Battlefield Singularity: Artificial Intelligence, Military Revolution, and China's Future Military Power: https://s3.us-east-1.amazonaws.com/files.cnas.org/hero/documents/Battlefield-Singularity-November-2017.pdf?mtime=20171129235805&focal=none
Kaplan, Fred. 2020. The Bomb: Presidents, Generals, and the Secret History of Nuclear War (London: Simon and Schuster).
Kurth Cronin, Audrey. 2022. Power to the People: How Open Technological Innovation is Arming Tomorrow's Terrorists (Oxford: Oxford University Press).
Mbembe, Achille. 2019. Necropolitics (Durham: Duke University Press).
———. 2023. The Earthly Community (Rotterdam: V2).
Moyn, Samuel. 2021. Humane: How the United States Abandoned Peace and Reinvented War (New York: Farrar, Straus and Giroux).
Nicholls, Dominic and Vasilyeva, Nataliya. 2022. 'Russia Deploys Mobile Crematoriums to Follow Its Troops into Battle,' The Daily Telegraph, 23 February: www.telegraph.co.uk/world-news/2022/02/23/russia-deploys-mobile-crematorium-follow-troops-battle/
Pinker, Steven. 2011. The Better Angels of Our Nature: Why Violence Has Declined (London: Penguin).
Rosa, Hartmut. 2015. Social Acceleration: A New Theory of Modernity (New York: Columbia University Press).


Simons, Jon and Lucaites, John Louis. 2017. In/Visible War: The Culture of War in Twenty-First Century America (New Brunswick: Rutgers University Press).
Snyder, Timothy. 2016. Black Earth: The Holocaust as History and Warning (London: Vintage).
Taleb, Nassim Nicholas. 2010. The Black Swan: The Impact of the Highly Improbable (London: Penguin).
Varoufakis, Yanis. 2021. Another Now (London: Vintage).
Virilio, Paul. 2001. Strategy of Deception (London: Verso).
———. 2003. Unknown Quantity (London: Thames and Hudson).
———. 2006. Speed and Politics (Los Angeles: Semiotexte).
———. 2008. Pure War (Los Angeles: Semiotexte).
Virilio, Paul and Brausch, Marianne. 2011. A Winter's Journey—Four Conversations with Marianne Brausch (Kolkata: Seagull Books).
Walt, Stephen. 2005. Taming American Power: The Global Response to U.S. Primacy (New York: W.W. Norton).
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism (London: Profile Books).

PART TWO

The Tactics, Terrains and Technologies of Future Warfare

4 THE IMPURE 1
On the Sub-Threshold of Modernity and War

Impure Wars

In the 2008 edition of Pure War, Paul Virilio provides a new introduction and conclusion to the interviews with Sylvère Lotringer that were published in 1984. Virilio's work in the 1970s and 1980s was often responding to the apocalyptic possibilities of the Cold War, the possibility of deterrence becoming catastrophically overwhelmed by the speed of weapons systems or the accidents of decision-making (or technological accidents). But writing after 9/11, Virilio sees a different danger in a world where geopolitical insecurity and uncertainty were no longer dominated by the 'pure war' of great power politics—and the possibility of apocalyptic destruction—but confronted the impure war of terrorism and non-state actors: wars fought in new terrains or domains with emerging technologies and tactics by often 'deterritorialised' actors such as terrorist networks, criminal organisations and private security companies (Virilio 2008: 11). Virilio sees a new type of conflict emerging where individuals and groups exploit vulnerabilities in the various infrastructures of everyday life (cars and planes turned into suicide vehicles of destruction, the digital world turned into virtual zones of crime and terror), the tactics of impure war where small groups can become destructively creative. In response to impure war, states attempt to 'design out' these attempts to exploit infrastructural vulnerabilities as creatively as possible; states try to find design solutions of 'deterrence by denial' to close down any creative 'openings' that a terrorist might identify. States will wage 'policing' wars or 'special military operations,' impure wars careful to avoid clashes with states that could tip above the threshold into open war; state-on-state competition between nuclear powers will involve impure tactics, activity below the threshold that can lead to war. The United States will develop a massively expensive and


constantly expanding war machine as a means of deterring interstate war and as preparation for great power conflict and competition. The impure wars (or what Derek Gregory describes as ‘the everywhere war’) that are fought rely on the military infrastructure and logistical support created for its dominance across multiple domains of land, sea, air and space (Gregory 2011). The wars that are fought will be different from the wars and great power conflicts that this vast military war machine is built to fight; wars against terrorist networks or events involving special forces; providing political, economic and military support that tries to keep involvement—​as we have seen in the Russo-​ Ukraine war—​sub-​threshold, below the threshold that could lead to open war; the increased focus on securing emerging domains such as space upon which so much of the military–​technical infrastructure and capability depends; the use of all aspects of the state and economy as a means to shape the geopolitical and military–​technical environment (as we see in the concern with ‘chip wars’) (Miller 2022). The problem with the attempts at regime change and transformation in Iraq and Afghanistan is that impure war with terrorist networks and ‘rogue’ states veered towards a strategy that was too ‘pure,’ too grounded in ambitious territorial occupation and the redesign of regional power. Geopolitics has faded, for Virilio, in favour of what he terms ‘metropolitics’ (although if he were writing in the 2020s he might replace this with ‘astropolitics’): ‘The battlefield has clearly become the city, the field of the city of men and women. Urban concentration has clearly won out over territorial geostrategy, over front lines, ramparts, Maginot Lines, Atlantic Walls, etc.’ (Virilio 2008: 10). In other words, the age of interstate wars over territorial conquest is over; what we will see is the targeting of everyday life in cities (exemplified by 9/​ 11)—​and the military intervention in cities around the planet to contain and control the threat of terror and crime (the question of cities and urban warfare is returned to in Chapter 6). But the geopolitical condition has become more complicated from these early declarations on impure war. A world of deterrence by entanglement (or deterrence by punishment) has solidified in our international relations—​and the warfighting that will take place is in the impure wars against terrorist networks or the sub-​threshold support of actors in global conflicts. The United States and its allies have learnt (for the time being) of the dangers of relentless, territorial war; Russia and Putin possibly underestimated the problems of a territorial war in Ukraine (with all sides attempting to prevent open, pure war between ‘great powers’); China will be weighing up the risks of invading Taiwan in a time of new defensive military-​technological developments—​and the possibility of a chaotic invasion combined with the risk of economic conflict and open war with the United States. Although the risk of open interstate war remains it is combined with the threat of impure war involving terrorist networks that might have enhanced capabilities in times of technological acceleration (what will a future Bin Laden be able to plan in 2049?) and what Audrey Kurth Cronin describes as ‘open technological innovation’ (Kurth Cronin 2022); indeed, it’s this messiness and complexity that


leads Virilio to talk of a time of impure war, this 'fusion between hyper-terrorist civil war and international war' (Virilio 2008: 13). Virilio had written about the new threat landscape in 1993, commenting on the attacks on the World Trade Center. Virilio saw these attacks as showing the 'clever combination of a strong symbolic dimension and an urban demolition capability involving only a small number of individuals who used a delivery van to deliver terror' (Virilio 2000: 21). But the attacks in 1993 and 2001 were not events with an obvious 'territorial' dimension: while there were issues about the military presence of the United States across the planet, the events were (and continue to be) about generating fear in the West (or any spaces that were seen to be embracing Western lifestyles and values), attempts to undermine the fabric of multicultural societies through creating suspicion and hostility between different races and religions. Multicultural society was—or is—viewed by fundamentalists as creating a 'grey zone'; what was needed was the resurrection of clear lines of racial and religious demarcation. The intention was also to draw the West into an 'apocalyptic' war, into asymmetric wars that would be costly for states in all senses.

A different threat landscape was emerging composed of groups, networks and the manipulation of individuals through processes of radicalisation, the production of molecular threats that could possibly be assembled into swarms, swarms that could operate across a city or across the planet. Groups and individuals that could exploit the vulnerabilities in 'open,' networked and mobile societies, the search for vulnerabilities in the critical infrastructures that support all aspects of life: some in the 'security world' remark that while we might have seen the scale/strategic surprise of horrific attacks in cities around the world since 9/11, there has been the potential to generate far more destructive events—and in this sense, we have been fortunate not to experience this type of creative exploitation of the already existing vulnerabilities in our various critical infrastructures. Individuals and groups have so far lacked the knowledge/expertise to combine creativity and technology/tactics to fully exploit the possibilities of our time of impure war; of course, in the years out to 2049 the question will be how artificial intelligence (AI) and other tools will enhance the dangerous creativity of the terrorist—or to what extent these new tools will create unprecedented levels of surveillance and intervention. Here of course there is the question for liberal societies on how to prevent emerging tactics and technologies from entering the necropolitical zones that will undermine attempts to maintain the balance of liberty and security in the politics of security.

Virilio's comments on the age of impure war seem to imply that this will be the dominant 'narrative' for the future of security politics; impure wars of terror taking place in an international order based on nuclear strategies of deterrence. But it is unclear how permanent this new 'state of emergency' will be in his vision of future war: Will this age of impure war be the exception or will it become the geopolitical norm? A world dealing with constantly mutating terror and the impure wars fought to control and contain the terror. Virilio sees the emergence of a 'worldwide civil war,' a civil war that is not necessarily a 'class war' but a


worldwide civil war involving groups or networks fighting against states. In this sense, China, Russia, India, Europe, the United States would all be dealing with the same problem: groups and networks would become the fundamental security problem for states in coming years and decades. A time of impure wars fought in the context of a worldwide civil war. This time of worldwide war would extend a securitising strategy of deterrence into all aspects of domestic life; after 9/​11, this new politics of security based on pervasive tactics and technologies of policing and control was the primary ethico-​political problem for liberal societies. But this view of impure war seems rather overstated now in a world where there is anxiety over the economic and military rise of China and a Russia willing to wage war over territory, where there is a sense of arms races across a multitude of dual-​use technologies shaping the twenty-​first century in a time of AI and robotic/​ cyborg futures. A world where for all the talk of ‘fractalisation’ and impure war, one does not see the ‘exhaustion’ of interstate competition but a moment when a new multipolar world is emerging with states seeking to shape and exploit the new international politics to test the possibilities of a world of new economic and military possibilities, the possibility of creatively redesigning, re-​imagining and remaking a new world order underpinned by Chinese economic and military power. In this sense, impure war has become more impure: more actors, more technologies, more tactics and more terrains. What I want to suggest in this chapter is that Virilio’s declarations about impure war continue to reveal important aspects of international politics. While the liberal world might be reluctant (for the time being) to wage ‘relentless wars,’ the problem of terrorism is unlikely to disappear; much of interstate conflict and competition will play out in the sub-​threshold zones below the thresholds that lead to war; while much of this ‘grey zone’ activity will involve election hacking, disinformation campaigns or political subversion or manipulation, there might be events that push the threshold to the limit, risking ‘pure war.’ There will be moments where liberal ‘reason’ will be overcome by necropolitical desires to restore order or to punish those who have orchestrated an horrific event. But liberal states will try their hardest to use the tactics of impure war against states and other actors (economic manipulation, military support, information or cyber tactics of sabotage, espionage and subversion, more kinetic types of sabotage, more ‘human’ forms of espionage and subversion). It might be the case that the war in Ukraine is an exceptional moment that reveals the costs and dangers of a ‘special military operation’ in an interconnected world, the risks posed by the impure war in response, the sub-​threshold support, training and assistance from powerful friends. Or we might see the proliferation of such special operations around the world, a new time of necropolitical disorder that serves as an experimental laboratory for emerging technologies and tactics: this will be a time of experimentation in impure war, to become ‘creative’ in times of new risks/​costs and new opportunities/​ possibilities. Simply put, the armies of Europe and the United States are unlikely to be involved in battles with Chinese or Russian soldiers that look like a recreation


of scenes from the world wars of the previous century. War—if it takes place—will be something different, impure wars whose form and possibility change with each passing decade out to 2049. I am now going to introduce some ideas that are often viewed as revealing the strategic thinking or strategic culture in China and Russia. What I want to suggest is that both sets of ideas have been discussed in the liberal world because of the dangerous and frightening visions of emerging interstate conflict they produce, a vision of 'cunning' opponents exploiting our open society or weaponising our technology against us. My suggestion is that as terrifying as many of the ideas presented here are, what they all point to is the exploration of impure war; and the impact of impure war is different from the pure wars of world wars and the possibility of nuclear war between great powers. And impure war is being explored in the liberal way of war as much as the authoritarian way of war: one of the questions here is who is going to be the smartest and most creative operator—and who will avoid making strategic mistakes (or avoid being made to make mistakes). Simply put, the 'critical infrastructures' that support life around the planet have become more complex; the ways of attacking those infrastructures can be sophisticated or (as the terrorists seek to exploit) simple; impure war is the search for ways to protect and attack in times of infrastructural complexity—and for the liberal way of war, this impure war contains the possibility of war becoming more 'humane'; bombing a city in a manner that reduces the urban environment to a wasteland is pure war—bombing underwater gas pipelines (like Nord Stream 1 in 2022) is impure war. The terrorist attacks on 9/11 were acts of impure war; the killing of Bin Laden was impure war (the invasion of Iraq was not impure war).

The Gerasimov Doctrine and Unrestricted War

The Russian Way of Future War?

One of the most fashionable terms that has been influential in debates about the future of war is hybrid war, an approach to war that combines more conventional military tactics with elements of cyberwar (attempts to sabotage critical infrastructure or espionage) or information war (disinformation, subversion), along with creative tactics designed to generate disruption 'on the ground.' The term 'hybrid war' is often used in relation to what is seen as the Russian approach to conflict in the twenty-first century (although it is sometimes controversially referred to as the 'Gerasimov Doctrine'); less well known is the concept of 'unrestricted warfare,' which is viewed as a significant conceptual foundation of Chinese thinking on the future of war (although we need to be cautious about seeing these ideas as offering profound insights into the 'strategic culture' of other states—ideas change and there can be multiple and conflicting strategic visions in play). And as we seek to understand


strategic culture, the importance of organisational culture might be equally—​or possibly more—​significant in understanding the future of warfare (as the Russo-​ Ukraine war might illustrate); just because there are visions or philosophies of war, it does not mean that they will mean much in the fog and friction of war, decision-​ making and politics. So both the Gerasimov Doctrine and unrestricted warfare might reveal very little about how the future of war is thought about in China or Russia; and they possibly reveal even less about the politics of decision-​making and planning during times of war, the strategic cultures that might produce mistakes and miscalculations; the philosophies of Sun Tzu might quickly be overwhelmed by the domestic/​ internal power politics that resembles more the television show Succession or The Godfather films than Clausewitz. But at root, both perspectives outlined in this chapter are attempts to make sense of a time of impure war, of the tactics and techniques used by liberal states to shape international politics—​and of the tactics needed to compete with the liberal order. The Russian approach to twenty-​first century conflict is often referred to in terms of hybrid war or, more specifically, the Gerasimov Doctrine. General Gerasimov is a Russian general who published an essay in 2013—​‘The Value of Science is in the Foresight’—​that attempted to clarify how conflict was changing and how Russia needed to adapt to the changing tactics and domains of international politics. Some see the essay as an overview and analysis of how states—​in particular after the radical events in North Africa and the Middle East—​could become ‘a victim of foreign intervention’ in a ‘matter of months and even days’ and ‘sink into a web of chaos, humanitarian catastrophe, and civil war’ (Gerasimov 2016: 24). Rather than a manifesto that explained what Russia was doing in its international politics, the essay was an analysis of what other states were doing in their international relations. Gerasimov suggests that while the Arab Spring might not be understood in terms of war (with no military ‘lessons’) it might be the case that the events reveal what war actually is in the twenty-​first century where the rules of war have changed: ‘The role of nonmilitary means of achieving political and strategic goals has grown, and, in many cases, they have exceeded the power of force of weapons in their effectiveness’ (ibid.). Gerasimov’s essay is thinking through the question of what war is in the twenty-​first century: of course, after the invasion of Ukraine there is the question of whether Gerasimov actually wrote the essay (or whether the invasion illustrates the risk of ascribing too much importance to these writings that are viewed to be revealing or insightful on future plans and tactics). The Gerasimov ‘Doctrine’ is an overview of how states operate in an environment that is composed of limits created by strategies of deterrence, prompting states to explore terrains or domains filled with new opportunities of traditional or ‘low-​ tech’ techniques (‘little green men,’ placing Commandos without insignia into a territory such as the Crimea to create disruptive events) through to the latest innovations in offensive cyber/​information war (potentially using a hybrid of state expertise combined with criminal orchestration of a task) (Greenberg 2019). In the


Gerasimov Doctrine, ‘action at a distance’ is possible in a world where targets can be interfered with through digital exploits; a state can exploit the anonymity of the cyber age due to the difficult and time-​consuming nature of working out—​and creating evidence—​on the origin of a cyberattack or exploit; and where a state can also deploy small groups of invisible ‘little green men’ in foreign territories. In this sense, anonymity and distance are central to the Gerasimov ‘Doctrine’: Long-​distance, contactless actions against the enemy are becoming the main means of achieving combat and operational goals…All this is supplemented by military means of a concealed character, including carrying out actions of informational conflict and the actions of special-​operations forces. (Gerasimov 2016: 24) Examples that Gerasimov draws on are events such as Operation Desert Storm, attempts by the United States to shape—​and control, from the perspective of Gerasimov—​international politics (rather than the essay being an account of Russian actions or experience). The focus is on new techniques of ‘asymmetrical actions’ to nullify an enemy’s advantage: Asymmetrical actions have come into widespread use, enabling the nullification of an enemy’s advantages in armed conflict. Among such actions are the use of special operations forces and internal operations forces and internal opposition to create a permanently operating front through the entire territory of the enemy state, as well as informational actions, devices, and means that are constantly being perfected. (Gerasimov 2016: 25) Both these quotations could be viewed as clear and concise statements that point to important trends in the future of the liberal way of warfare: ‘contactless’ actions against an enemy; new tactics to nullify an enemy’s advantage in armed conflict; the use of ‘informational actions.’ So for those who read the Gerasimov essay as a statement of intent, the worst-​case scenario was (in the 2010s) of events and actions originating in the Kremlin being orchestrated across the planet, disrupting elections, tampering with infrastructure, deploying little green men in cities across the world, all tactics of impure war against, in particular, the West and liberal democracies. For Putin, it was the end of the ‘end of history’ and he was going to help support the implosion of the liberal world—​the liberal world that was seeking to destroy him through the tactics outlined in the Gerasimov essay (Galeotti 2019). For those anxious against this attack on the liberal world, these tactics could become increasingly disruptive and dangerous—​not simply disrupting our political systems (pushing the liberal world further into a land of confusion, conspiracy theory and disinformation), cultivating the rise of extremist groups that may share


Putin’s worldview: Putin’s tactics might also endanger life through the ‘creative’ forms of violence that it may deploy in extreme events. In this view, the poisoning of Sergei Skripal in Salisbury is a shape of things to come; and, in the worst-​case scenario, the destruction might come from an accident that results from the use of a cyber exploit or ‘targeted’ use of a biological weapon. Or from something we cannot imagine—​or does not yet exist. The counterargument to all this is that Putin is far more calculating in his desire to orchestrate actions at a distance that manage to remain sub-​threshold: the key point here is that an exploit or event should remain sub-​threshold, under the threshold that could lead to open war; Putin’s vision for war in Ukraine was possibly that the invasion would be over so quickly—​and the West would be overwhelmed by the economic challenges of the post-​Covid world—​that it would ultimately be viewed as a sub-​threshold event. But there is a perspective that suggests that this view of the Gerasimov Doctrine and the anxiety over the supposedly new Russian tactics has become rather exaggerated and ‘hyped’ up in the discourses of military dangers circulating in European and U.S. discussions on the changing character of war and conflict. The Gerasimov Doctrine suggests cinematic images of special forces acting in a deadly and terroristic manner across Ukranian territory (or in small English towns with deadly nerve agents), or using armies of hackers to attack power grids, or funding far right groups across Europe, hacking the emails of American politicians to influence elections and—​in the most radical scenario—​putting in place Donald Trump as the President of the United States: a security strategy that plays out like a Christopher Nolan movie. Russia can radically transform the politics of Europe through the ‘weaponising’ of migration, producing unmanageable flows of refugees as a means to destabilise Europe, producing geopolitical chaos in one state (such as Syria) as a military means of producing even more social and political chaos in other regions. The Kremlin might not be conquering or invading other societies, but it was transforming them from within. David Kilcullen provides a fascinating and disturbing account of these tactics in The Dragons and the Snakes: How the West Learned to Fight the West where he concludes that liberal societies will not necessary be able to counter all these tactics of what he describes as ‘liminal warfare’ but will need to focus on building ‘societal resilience’ (Kilcullen 2020: 255). In other words, the liberal world is not dealing with traditional war but is in the midst of an impure war that it will need to become better at managing; in this sense, this time of the liminal or impure war is an improvement on past moments in history—​but there are serious challenges ahead in times of fake news, disinformation, cyber-​insecurity, infrastructural sabotage and societal subversion. It might also be the case that anxiety about this contamination of the liberal world by the tactics of impure war is overstated. Michael Kofman comments that European allies are undergoing a modern version of America’s red scare from the 1940s and 50s. Someday, we may look back on this time in Europe and call it the hybrid war

On the Sub-Threshold of Modernity and War  75

scare. Russian influence and subversion are real throughout much of Europe, but whipping up fears of this mystical hybrid warfare has led European officials to see the Kremlin’s agents behind every corner. (Kofman 2016) In other words, while the Kremlin might be deliberating pursuing strategies to undermine the West, it might be going too far to suggest that the Russians are behind all the problems we confront; the election of Trump, the Brexit referendum, the problems of populism, tensions over multiculturalism across Europe; these problems in liberal societies would probably exist regardless of external interventions or manipulation. Putin sees international politics in terms of the ‘end of liberalism’ and is trying to push liberal world into chaos. Through talking about the end of liberalism, Putin can possibly convince his domestic ‘audience’ to accept his approach to authoritarian leadership with all its inequality and corruption. Until the invasion of Ukraine in 2022, Putin’s strategic objectives were possibly more symbolic, producing the image of revived great power-​ness; there might also be an element of dissimulation, of quite liking the paranoia that presents Putin as the mastermind of events he would never be able to orchestrate, the uncertainty of the world leader as potential Bond villain. As Catherine Belton suggests in Putin’s People: How the KGB Took Back Russia and Then took on the West: The KGB playbook of the Cold War era, when the Soviet Union deployed ‘active measures’ to sow division and discord in the West, to fund political parties and undermine its ‘imperial’ foe, has now been fully reactivated. What’s different now is that these tactics are funded by a much deeper well of cash, by a Kremlin that has become adept in the ways of the markets and has sunk its tentacles deep into the institutions of the West. (Belton 2020: 16) But it might be going too far to describe all the events ascribed to the Gerasimov Doctrine as elements in a coherent strategy. Indeed, Mark Galeotti, the academic who coined the phrase the Gerasimov Doctrine, regrets ever describing it as a doctrine, intending it more as a snappy title for a blog post. Galeotti suggests that there is no single Russian doctrine but rather an often fragmented and opportunistic attempt to divide and distract in states such as Ukraine, a variety of tactics orchestrated by political ‘entrepreneurs’ who are trying to gain favour with the Kremlin (Galeotti 2018). Russia attempts to exploit vulnerabilities with whatever tools it has at its disposal, an opportunistic ‘mixed methods’ of tactics, a hybrid of techniques to create problems for the liberal international order. But the focus on the Gerasimov


Doctrine risks overstating the significance of these unconventional tactics for Russia's approach to warfare. Michael Kofman comments:

The mythology of Russian hybrid warfare stands in stark contrast to the historical track record of how Russia uses military power to achieve desired political ends at home and abroad. Simply put, what Russia does best is conventional war, and if a conflict does not start that way, it is how it always ends.
(Kofman 2016)

Indeed, the Russo-Ukraine war could be a result of irritation with the failure of these supposedly Gerasimov Doctrine-inspired tactics. And for all the focus on the unconventional tactics of information war and little green men operating across a territory, at the core of Gerasimov's controversial essay is a concern with the question of what future wars Russia should prepare for in a time of technological change and acceleration: 'What forms and means should be used against a roboticized enemy? What sort of robots do we need and how can they be developed?' (Gerasimov 2016: 26). Gerasimov possibly sees the future of war like the cover of this book. So, for all the Western anxiety on the creativity of Russia in its use of cyber and innovative tactics and techniques, the Gerasimov essay ultimately points to a more traditional view of international conflict, albeit one recognising the need to explore the possibilities of AI and robots. What the essay by Gerasimov—and in particular the two quotations cited earlier in this section—points to is possibly the future of the liberal way of warfare and its contactless action at a distance, its use of special forces, informational actions. While Russia appeared bogged down in logistical, planning and intelligence failures, having to resort to brutal war on civilians and infrastructure, the liberal world began to explore the possibilities of impure war, the use of economic weapons and sub-threshold actions: the speed with which the liberal order orchestrated this response might have shocked Putin, given his views on the end of the liberal international order. All sides attempted to prevent the conflict from escalating into open war (this, of course, may change): Putin tried to maintain that the war was a 'special military operation,' a war against fascism akin to interventions made by the West during the War on Terror. What the war illustrates is that for all the focus (and hype) on the technological, social and political complexity of future war, the future of war and international conflict might continue to be shaped by the catastrophic decisions of leaders trapped in (Covid-19) bunkers where paranoia and 'group-think' limit careful strategic thinking (something that all types of state can succumb to). International politics out to 2049 might be divided between those who pursue 'traditional' war to achieve their strategic objectives and those who explore the sub-threshold possibilities of impure war. In this sense, the consequences of the Russo-Ukraine war might prove crucial in shaping war out to 2049, as states examine the impact and 'success'

On the Sub-Threshold of Modernity and War  77

of the war—​and calculate the risks of war when confronted with impure war based on ‘actions of informational conflict and the actions of special-​operations forces.’ Unrestricted Warfare and the Intelligentisation of Warfare

Written by two senior People’s Liberation Army colonels in the last decade of the twentieth century, Unrestricted Warfare: China’s Master Plan to Destroy America (the subtitle appears to have been added to the American version of the unofficial, ‘bootlegged’ book) sets out to examine the emergence of the ‘network society’ and the geopolitical events that the authors saw as pointing to a new type of international conflict and warfare, events such as the Asian financial crisis (and there is a great deal of interest in economic war throughout the book) and new terrorist movements that were exploring the possibilities for networked destruction (‘semi-​warfare, quasi-​warfare and sub warfare’) (Liang and Xiangsui 2017: xv). As Kilcullen remarks, the book is a ‘product of its time, a turn of the century piece, yet also a remarkably prescient document’ (Kilcullen 2020: 201). For the authors, while the West had showcased the new possibilities of ‘network centric’ war or the ‘revolution in military affairs’ in Iraq and then in the response to conflicts in Europe, in Bosnia and Kosovo, these conflicts also signalled the emergence of a world where—​for all its unprecedented technological power and supposed precision—​a superpower would have to deal with the vulnerabilities that this new age of technological acceleration (and the democratisation of technology) would bring. At the same time, events in the post-​Cold War world revealed the limits of military solutions to the messy problems of a world of increasing technological and economic interconnectedness, the problems that were emerging from this age of globalisation: ethnic and religious conflict, environmental problems, crime, poverty and inequality. The authors suggest a new type of world order might emerge from these changes: ‘At present it is still hard to see if this age will lead to the unemployment of large numbers of military personnel’ and the possibility of war ‘vanishing from the world’ (Liang and Xiangsui 2017: xv). But while traditional war might begin to disappear from the world, the authors see a mutation of war on the horizon (what Virilio would most likely describe as impure war): ‘war will no longer be what it was’ and ‘war will be reborn in another form and in another arena’ (ibid.). In other words, the age of what becomes known as the time of ‘ambiguous war’ or Kilcullen’s ‘liminal war,’ the international ‘grey zone’ of the twenty-​first century. Unrestricted war is shaped by the society that is emerging from the information age or ‘network society’: ‘Even in the so-​called postmodern, post-​industrial age, warfare will not be totally dismantled. It has also reinvaded human society in a more complex, more extensive, more concealed, and more subtle manner’ (Liang and Xiangsui 2017: xvi). Driven by technology and the ‘market system,’ a type of war will emerge based not on the stark geopolitical divisions of the Cold War, presented in terms of the heroic and potentially apocalyptic battle between capitalism and the


workers of the world uniting, but from the complex manoeuvres where states are increasingly interconnected and dependent on one another, the conflicts that will emerge from entanglement. From the perspective of unrestricted warfare, interconnectedness and the enthusiasm for globalisation in the 1990s masks the underlying geopolitical ambitions which will often remain under the surface of what is said in public; and the authors are interested in what lies beneath the geopolitical and diplomatic surface. For the authors of Unrestricted War, the war in Iraq might have been presented as a ‘policing’ operation in the new world order of the early 1990s but at its core it was a resource war; it is suggested that rising powers will seek access to markets and resources to fuel their rise; resource scarcity and competition in a time of technological acceleration (and environmental crisis) will fuel global conflict, the type of conflict exemplified in the 2020s over ‘chip wars’ (Miller 2022). Conflict may also emerge when a rising power does not want to be a rule taker and wants to shape the rules of a new world system. In such a situation of interconnectedness and potential conflict, a state will have to deter but also potentially degrade (or transform) an opponent. Some would say (possibly General Gerasimov) that this is what the West has always done in subtle and not so subtle interventions around the world. The sense in Unrestricted Warfare is that a case is being made for a grand overarching strategy that will be used across different areas of military and non-​military capability, to focus innovation and strategic direction, a ‘mixed methods’ of lethal and non-​lethal possibilities, where the battlespace is potentially anywhere—​and in places that other powers may have ignored. Unrestricted Warfare offers a dark dystopian view of technology that talks about the ‘technological plague’ that has been released from a Pandora’s box, an ‘irrational expansion of technology’ in a manner where societies and individuals lose sight of the dangers of ‘progress’: at times the book reads as if it were influenced by the Frankfurt School critical theorists from the 1930s and 1940s Europe or Paul Virilio in the Paris of the 1980s or 1990s. Humans have a ‘thirst for progress’ and are ‘seduced’ by technological innovation and acceleration; anticipating debates about the impact of AI, the technologies of the digital age will be revolutionary in terms of how they will transform other technologies or produce new possibilities through creative combinations: ‘the independent use of individual technologies is now becoming more and more unimaginable. The emergence of information technology has presented endless possibilities for match-​ups involving various old and new technologies and among new and advanced technologies’ (Liang and Xiangsui 2017: 13). There will be a proliferation of different military technologies and a proliferation of technologies that work together to make a unique and unprecedented event possible: questions on future technology and warfare will be discussed in Chapters 8 and 9, in the section on ‘the machinic.’ Anticipating the social, economic, military and technological trends that would transform the first decades of the twenty-​first century, Unrestricted Warfare is concerned with the new vulnerabilities emerging from the activities, behaviours and technologies that have


become central to life and security in increasingly connected societies, the creation of terrains where ‘non-​war’ exploits will be an essential component of future military conflict—​and where war will emerge in previously non-​war terrains. Unrestricted Warfare could be viewed as the People’s Liberation Army manual for asymmetric war in the digital age, an upgrade of Sun Tzu’s The Art of War, arguing that future warfare will be about undermining and degrading an enemy’s economy, institutions, government, and beliefs: it is viewed as the book that changed Steve Bannon’s thinking over China and the future of conflict, possibly shaping the Trump administration’s approach to China (especially in terms of its approach to Chinese tech companies) (Mitchell and Liu 2018). Destructive future war over territory will be futile and counter-​productive (in most cases); what you might be able to do is undermine the resolve of a state and its population to adopt positions that are counter to your more ‘local’ territorial interests (Taiwan for China, Ukraine for Russia). One can seek technological supremacy and design the weapons for what you think the future battlespace will be like. But customising ‘weapons systems to tactics which are still being explored and studied is like preparing food for a great banquet without knowing who is coming’ (Liang and Xiangsui 2017: 9). There will be vulnerabilities that you cannot plan for (as the authors say the United States discovered in Somalia in 1993): ‘On the battlefields of the future, the digitalized forces may very possibly be like a great cook who is good at cooking lobsters sprinkled with butter: when faced with guerrillas who resolutely gnaw corncobs, they can only sigh in despair’ (ibid.). What they are suggesting is that the United States can search for the most advanced (and expensive) weapons imaginable, but even high-​tech weapons and precision-​guided bombs are just ‘improvements’ on the past. In this view, new concepts are needed to think about how weapons can be used creatively in this interconnected world, the search for what can be ‘weaponised’ in our already existing infrastructures (and future infrastructures yet to be built), a position based on the view that ‘there is nothing in the world that cannot become a weapon, and this requires that our understanding of weapons must have an awareness that breaks through all boundaries’ (ibid.: 13). From the perspective of unrestricted warfare, the United States might have the resources to create more sophisticated and destructive weapons, but—​possibly constrained by ethical, legal and political concerns—​it lacks the imagination and creativity to exploit the vulnerabilities of its opponents. New concepts and practices, it is suggested, are needed to explore the material and psychological impact of techniques and tactics that can transcend what we would view as traditional military terrains or domains. In one of the most ominous passages in the book, the authors declare: The appearance of new-​concept weapons will definitely elevate future warfare to levels which is hard for the common people—​or even military men—​to imagine. Then the second thing we have to say should be: The new concept


of weapons will cause ordinary people and military men alike to be greatly astonished at the fact that commonplace things that are close to them can also become weapons with which to engage in war. We believe that some morning people will awake to discover with surprise that quite a few gentle and kind things have begun to have offensive and lethal characteristics. (Ibid.: 14)

This is a terrifying statement, the stuff of dystopian science fiction, paranoid visions of everyday technologies and infrastructures becoming weaponised—or manipulated, sabotaged or controlled from a distance; a vision of a world where everyday life in liberal societies is attacked by a tactic or technology that we were unprepared for; an event that will shock and surprise us through its creativity, ingenuity and daring. Unrestricted Warfare was written before the ‘internet of things’ was a term in wide use (although the authors’ position on technological combination and digital technologies points to its emergence) and before we had seen what could be possible in terms of the ‘exploits’ of cybercrime or cyberwar (the focus of the next chapter), the potential for vulnerabilities in our health service, for example, or in the increasingly profitable cybercrime industries. It might be the case that, since the book was published in 1999, this dystopian vision of networked societies filled with hidden vulnerabilities looks overly apocalyptic: we now have a greater awareness and understanding of cyber vulnerabilities and are placing considerable resources into securing all parts of our digital lives. The counterargument would be that the acceleration of the technology and its penetration through all areas of a society will contain lethal characteristics that we still fail to see. One would imagine that this search for creative exploits (if the organisation and techniques are in place to cultivate this lethal creativity) will not involve tactics and technologies that we are already planning for; the experience of Covid-19 might give us an insight into the type of event imagined in Unrestricted Warfare: an event that is hard to prepare for (or even imagine) and hard to manage once it emerges. Or Covid-19 might be the accident that resulted from the attempt to explore new types of weapon. At the same time, the statement on lethal characteristics might be wishful thinking on the part of the authors, or a statement intended to generate anxiety, a tactic of deterrence. We might worry about the lethal creativity of an enemy but, as Virilio would remind us, we should possibly worry more about the different types of accident that all states (including liberal states) might create in the twenty-first century. The implication of Unrestricted Warfare is that future war will require the uncovering (and possibly the production) of vulnerabilities that ‘the victim’ or target will lack the imagination, creativity or political will to see and confront. The ‘open societies’ of liberal democracy will provide plenty of targets for the imaginative creators of unrestricted warfare to explore and exploit; the implication seems to be that the ultimate goal of the unrestricted warrior is to uncover the vulnerability that no one is thinking about, the military ‘black swan.’ If we are


concerned about a technology or terrain—​and its potential for exploitation by a foreign actor—​then the unrestricted warrior has failed. The counterargument to this ominous warning is that liberal societies have ‘future threat’ organisations (such as DSTL in the United Kingdom)—​often working in close collaboration with international allies—​constantly exploring where the vulnerabilities are across all aspects of society and technology. And it might also be the case that this type of ‘unrestricted’ thinking has been displaced and China is pursuing the ‘big ticket’ items of military technology and defence (the strategy that the authors of Unrestricted War argue is problematic for the United States). China might not be thinking in terms of unrestricted warfare for its future strategy (or some might be but they might not be influential in the Chinese military or political system) and will be thinking of this ‘impure warfare’ as one aspect of its approach to future international politics; or the thinking that originated in Unrestricted Warfare might have evolved in interesting and creative ways. China clearly aims to rival the United States in terms of all elements of military and geopolitical power by 2049. What China might be doing is maintaining the overarching (or undermining) approach of the unrestricted warfare position but enhancing it with the desire for supremacy in areas that will enable a range of creative tactics that other states will be overpowered by. Writing about AI and the ‘battlefield singularity’ Elsa Kania writes: This could be the start of a major shift in the PLA’s strategic approach, beyond its traditional asymmetric focus on targeting U.S. vulnerabilities to the offset-​ oriented pursuit of competition to innovate. The PLA is seeking to engage in ‘leapfrog development’ (跨越发展) to achieve a decisive edge in ‘strategic front-​line’ (战略前沿) technologies, in which the United States has not realized and may not be able to achieve a decisive advantage. The PLA is unlikely to pursue a linear trajectory or follow the track of U.S. military modernization, but rather could take a different path. Since the 1990s, the PLA has focused on the development of ‘trump card’ (杀手锏) weapons that target vulnerabilities in U.S. battle networks, seeking to develop, in the words of then-​Central Military Commission (CMC) Chairman Jiang Zemin, those weapons that ‘the enemy is most fearful of.’ This asymmetric thinking will likely persist in the PLA’s approach to AI. For instance, the PLA may seek to use swarms to target and saturate the defenses of U.S. aircraft carriers. However, China is no longer in a position of technological inferiority but rather sees itself as close to catching up with and overtaking the United States in AI. As such, the PLA intends to achieve an advantage through changing paradigms in warfare with military innovation, thus seizing the ‘commanding heights’ (制高点) of future military competition. (Kania 2017) So Unrestricted Warfare is ominous in its depiction of a state that will continue to produce multiplying vulnerabilities that can be exploited from a distance.
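The saturation logic behind the swarm example in the Kania passage can be made concrete with a toy calculation. The short Python sketch below is not drawn from Kania or from Unrestricted Warfare; the intercept capacity and per-engagement kill probability are hypothetical figures chosen only to show why a defence with a finite number of engagements is eventually overwhelmed by cheap, expendable attackers, however good each individual intercept is.

```python
def expected_leakers(n_attackers: int, capacity: int, p_kill: float) -> float:
    """Expected number of attackers that get through a defence which can
    engage at most `capacity` of them, each engagement succeeding with
    probability `p_kill`; attackers beyond capacity leak through unengaged."""
    engaged = min(n_attackers, capacity)
    unengaged = n_attackers - engaged
    return unengaged + engaged * (1.0 - p_kill)

# Hypothetical defence: 20 engagements available, each with a 90% kill chance.
for swarm in (10, 20, 40, 80, 160):
    print(swarm, round(expected_leakers(swarm, capacity=20, p_kill=0.9), 1))
```

On these invented numbers, almost nothing leaks through below the capacity threshold; beyond it, every additional cheap platform gets through regardless of how exquisite the defence is, which is the asymmetric wager the quoted passage describes.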


But while the term unrestricted war—​and the comments on a morning when people uncover the new lethal characteristics of the world they inhabit—​appears to suggest an approach to war where everything is permitted, where there will be limited concern for humanitarian catastrophe and where there will be the destruction of the ecological security and the critical infrastructures that support life, the authors are clear that unrestricted war should be focused on the pursuit of limited or clearly defined objectives (certainly not endless wars on terror): indeed, the authors were apparently angry about the ‘bootlegged’ copy of the book that circulated in the United States that has the 9/​11 attacks on the cover (Mitchell and Liu 2018). The authors see a broader transformation in war and international politics, a recognition of the limits of brutal territorial wars in an interconnected world and the emergence of what they describe as ‘kinder’ weapons, non-​lethal weapons or increasingly precise weapons: technological progress has given us the means to strike at the enemy’s nerve centre directly without having to harm other things, giving us numerous new options for achieving victory, and all these people believe that the best way to achieve victory is to control, not to kill. (Liang and Xiangsui 2017: 15) The desire here would be for a state to be able to overwhelm an opponent with a variety of tactics, technologies and techniques (the destruction of command and control to a point that a state is powerless to use its complex global war machine across all terrains and domains) in a manner that would minimise the loss of life and damage to infrastructure; this could be viewed as the perfection of war, war where you would not destroy the terrain that you would seek to occupy (or where you have investments), where you would be able to control a state and not destroy it. These issues will be explored further in Chapter 7. Unrestricted warfare could still involve the use of violence—​and could use autonomous weapons and new psychological techniques for generating fear in the opponent—​to achieve objectives, but it will seek to limit the harmful impact of weapons technology. Rather than the fast destructive wars of what the Russians called the ‘reconnaissance strike complex,’ the wars that would draw on the latest machines to hunt and kill the enemy, unrestricted war is interested in the slower techniques that will degrade a rival’s institutions or systems of command and control without realising they are being ‘tampered’ with. But if war becomes more kinetic then unrestricted war is about generating new combinations of technology and tactics, adding elements that will be unprecedented and will surprise the enemy. More recently, there have been reports that some Chinese officials talk about the ‘intelligentisation’ of warfare: one report points to how advances in science and technology are creating new techniques for subduing enemies, pointing to the possibility of ‘brain control weaponry’: ‘War has started to shift from the pursuit


of destroying bodies to paralysing and controlling the opponent’ (Gertz 2021). To be sure, this could involve political subversion and information war as much as the futuristic brain control weaponry—which, like some of the pronouncements on unrestricted warfare, could be about generating fear rather than providing a realistic account of the possibilities of future weapons. At the same time, these statements might point to the necropolitical trends in neurowarfare that will shape war out to 2049 and beyond. On the optimistic/protopian side it could be argued that the ideas on unrestricted war or the intelligentisation of warfare might reflect a more humane approach to a Chinese way of future warfare: although it could be argued that the implications of war without destruction and war with new techniques of control and impure war could be as necropolitically terrifying as any type of warfare. Of course, it could be argued that for all the dystopian science fiction mentioned above there will still be limits on the tactics and techniques used (although the frightening implication is that we will be too controlled or paralysed to do anything). There is a focus in Unrestricted Warfare on being more cunning and creative than the enemy; the focus of this ‘intelligentisation’ of warfare is on being more innovative with the militarisation of new technological possibilities. But there is a question about what type of society creates the conditions for creativity; the ‘military design’ movement seeks to draw on the creative techniques of cutting-edge industries in the liberal world in order to create innovative approaches to future war (Zweibelson 2023). After what has been seen in the Russo-Ukraine war it remains to be seen if the organisational culture in the political–military sphere of authoritarian states will be able to produce types of warfare that live up to the visionary documents they produce. But it seems likely that the militaries of liberal states will have to prepare for the emergence of types of creativity and technology never encountered before. And it might be the case that the type of ideas being put forward about unrestricted warfare and the intelligentisation of warfare might reveal as much about the future of the liberal way of war as it does about the plans of authoritarian states for warfighting and international conflict.

The Liberal Way of Future Warfare: Mosaic Warfare?

These ideas or speculations might not reveal much about how China and Russia are thinking about the future of war and international conflict; and speculations about the ‘intelligentisation of war’ might be more future war fantasy rather than the military ‘art of the possible,’ tactics and technologies that even if they are realised may confront resilience in the opponent and limits in the ingenuity of the attacker. Observers in the West might point to these ideas on unrestricted war to justify a strategic or political position on the emerging geography of future conflict; some leaders outside the liberal world might be content for these ideas to flourish, to generate fear about the dangerous creativity they might be cultivating.


For all the interest in unrestricted warfare and the ‘intelligentisation’ of war, the reality of future war might be shaped by limited and flawed ‘group think’ and organisations that are not prepared for war at some fairly mundane levels (such as Russian vehicles with low quality tires that made off-​road manoeuvrability difficult in Ukraine) (Foy and Bott 2022). Simply put, the tragedy of future war might be the way it plays out like a time-​travel movie such as Christopher Nolan’s Tenet with warriors of the future fighting against tactics and technologies from the past; layers of different technologies and tactics clashing over a territory and population; tanks attempting to conquer territory confronting drones exploiting all types of deterritorialised possibility, tanks on roads confronting drones that are everywhere and nowhere. For all the visionary declarations on the future of conflict and international politics, the future of warfare might be shaped by organisational failure, bureaucratic incompetence and political or strategic hubris; and this might apply to liberal states as much as authoritarian states. The liberal way of future war also projects visions of international conflict based on the creative ‘combinations’ pointed to in Unrestricted Warfare or in the reports on the intelligentisation of war. For example, the Defense Advanced Research Projects Agency (DARPA) has begun to develop a concept—​Mosaic warfare—​ that attempts to respond to the type of innovative thinking on how to use military technology that is outlined as a possibility in Unrestricted Warfare: DARPA suggests that the concept would be familiar to Sun Tzu in The Art of War (Magnuson 2018: 23). Mosaic warfare is based on the idea that it is not enough to be faster and deadlier than an enemy because there will be asymmetric possibilities and vulnerabilities that could be exploited by an opponent. For example, if a situation were to arise where the United States and the United Kingdom were in a critical stage of preparation for war with Russia and dealing with a fraught diplomatic breakdown then we might have a day where there are five Novichok nerve agent events in cities across the United Kingdom (and there could be waves of ‘fake news’ and disinformation about ongoing attacks); such an event would generate unprecedented panic, confusion and fear even as Russia faced the superior military force of the United States. There could be cyberattacks that radically disrupt the logistics that underpin food and hospital supplies in the prelude to war. The anxiety about real and imagined threats and events could overwhelm decision-​makers/​ politicians, generating social and political turmoil. The concept of Mosaic war is focused on how it might be possible to use a variety of technologies and tactics to overwhelm adversaries through the production of ‘multiple dilemmas,’ to, according to the director of DARPA’s strategic technology office, ‘get inside and disrupt’ its leaders’ decision-​making processes. In this view, the range of problems and dilemmas that an enemy is dealing with would overwhelm them; the enemy would be overwhelmed, shocked and surprised by the innovative tactics and technologies that they were confronted with. Similar to the points made in Unrestricted Warfare, mosaic warfare is about ‘combining weapons we already have in new and surprising ways,’ where each weapon or tactic is an


element in a ‘bigger picture’ where the loss of one element in the mosaic of the war machine will not destroy the whole: traditional weapons systems ‘are more like pieces of a puzzle than tiles for a mosaic. They are exquisitely engineered to fit into a certain part of the picture and one part only. You can’t pull it out and put in a different puzzle piece. It won’t fit’ (Magnuson 2018). The scale of a ‘mosaic war’ will be designed to overwhelm a ‘command and control’ that will already be facing multiple forms of disruption and deception. Mosaic war will produce effects from a variety of armed services—​sea, land, air or cyber—​according to what will be faster and most effective. This example is provided in a DARPA press release: in the air domain, four F-​16s might be going head-​to-​head with four rival jet fighters. However, in a mosaic warfare context, the U.S. Air Force might also deploy four relatively inexpensive and expendable unmanned aerial systems ahead, each with different weapons or sensor systems. In this view, the combatant commander can treat these assets like ‘a football coach who chooses team member and then positions them on the field to run plays. The added aircraft make the situation much more complex and will be intended to overwhelm the opponent’s decision-​making’ (Magnuson 2018). While mosaic war sets out to produce the seamless integration of different elements, a system of systems, it is also about enabling elements of the mosaic to operate if they are cut off from key decision makers and to provide multiple sensors that will inform the decision-​making or the OODA loop (observe, orient, decide and act). In other words, to have multiple sensors that provide information that allow the different combinations of human–​machine teams to orchestrate events through a variety of military platforms—​carried out in a way that is fast, surprising and confusing for the enemy. At this level and intensity of liberal war, the intention is to overwhelm but, the implication seems to be, not to destroy. These new terms and ideas seem to change every couple of years, but the general direction sketched out here on mosaic war seems a likely path for the liberal way of future warfare; to use a creative range of tactics and technologies in order to overwhelm and confuse an opponent; these tactics might involve the targeting of individuals and groups through the latest innovations in impure war—​or it might involve the global coordination of a war machine composed of a multitude of war/​non-​war elements used in a manner never experienced before. This might include a multitude of tactics across a multitude of terrains orchestrated by a war machine that can operate creatively and effectively in all domains; whether such a vast war machine could co-​ordinate this mosaic war at the speed and creativity it desires remains (obviously) to be seen; while it might be the case that authoritarian regimes might lack the political will and means to harness and cultivate the creativity and effectiveness of what the Chinese refer to as the ‘military–​civil fusion’ (White and Yu 2023), liberal societies might lack the ability to orchestrate the size and complexity of the war machine operating across all domains, to control order and chaos on the battlefields of future modernity (Bousquet 2022).
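One way to see why ‘multiple dilemmas’ matter more than any single weapon is to model the adversary’s decision cycle as a queue. The toy simulation below is my own illustrative sketch rather than anything published by DARPA; the timings and the re-orientation penalty are hypothetical, and the only point is that a heterogeneous stream of problems can leave a command staff permanently behind even when a homogeneous stream of the same size would be manageable.

```python
def backlog_after_attack(n_types: int, minutes: int = 480, interval: float = 2.0,
                         base: float = 1.5, penalty: float = 0.25) -> int:
    """Toy model of a headquarters absorbing a 'mosaic' attack.
    The attack opens with one dilemma of each of `n_types` kinds arriving at
    once, then one more every `interval` minutes. Resolving a dilemma takes
    `base` minutes plus `penalty` minutes for every distinct threat type still
    waiting (the cost of re-orienting between domains). Returns the number of
    unresolved dilemmas left at the end of the shift."""
    queue = list(range(n_types))            # opening salvo: one dilemma per type
    busy_until = 0.0
    for i in range(int(minutes / interval)):
        t = i * interval
        while queue and busy_until <= t:    # the staff clears work whenever it is free
            busy_until = max(busy_until, t) + base + penalty * len(set(queue))
            queue.pop(0)
        queue.append(i % n_types)           # the next dilemma in the stream arrives
    return len(queue)

# Hypothetical parameters, for illustration only.
print(backlog_after_attack(n_types=1))   # a single, familiar threat: the staff keeps up
print(backlog_after_attack(n_types=8))   # eight interleaved threat types: the backlog grows
```

Run with these invented parameters, the single-type stream is cleared almost as fast as it arrives, while the eight-type ‘mosaic’ stream leaves a growing backlog of unresolved dilemmas by the end of the shift: a crude stand-in for the overloaded OODA loop the concept aims to produce.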


Concluding Remarks: Impure War and Strategies of Deception

A concern that runs through Russian, Chinese and American ideas on future warfare is tactical creativity and sub-threshold cunning. On the one hand, the desire is to exploit new vulnerabilities and technological possibilities in creative ways, ways that will take your enemy by surprise, in ways that will overwhelm command and control and decision-making capacity. At the same time, these plans for the intelligentisation of war, on the surface, are also about how to avoid destruction and death through the creative use of alternative techniques and, perhaps more importantly, the projection of a deterring image of being creatively dangerous. Indeed, the Russians and the Chinese might like the liberal paranoia about the Gerasimov Doctrine or unrestricted warfare. But it might be the case that for all the focus on innovation and creativity, future wars will be shaped by mistakes familiar to historians and philosophers of war, basic mistakes of logistics, faulty technology, of underestimating an enemy and overestimating the ability to manage and master the order and chaos of war, the technological and tactical complexity and messiness of war (Bousquet 2022). It might be the case that liberal societies are better at cultivating creativity than authoritarian ones and better at planning and preparation; it might be the case that authoritarian regimes can allow the space to be more ruthlessly creative but are worse at the operational and logistical aspects of warfare (and cunning and ruthlessness might be the decisive factor for weakening liberal societies in times of war). And while the visions of future war outlined here might lead us to think we understand something of the ‘strategic culture’ of different states and their plans, what might be more revealing is the organisational culture of different states and militaries. But again there is no guarantee that the organisational culture of liberal war machines will be superior to that of an authoritarian organisation; different regimes will make different types of mistakes waging different types of war at different scales. It might be the case that liberal states become more creative and effective in the dark ‘arts’ and techniques of impure war (it might also be the case, as Gerasimov would probably argue, that they are already well versed in the dark arts); it might also be the case that the liberal world order can mobilise and organise more effectively and quickly to deter or end wars. But it might also be the case that liberal states are easier to push into strategic mistakes (regarding energy dependence, for example) or that their social and political vulnerabilities (political and economic corruption and influence operations) are easier to exploit. All types of state might make strategic mistakes in the coming decades, chaos-producing wars of choice or unnecessary wars that produce unintended consequences with long-term global impacts. As we will see in Chapter 9, while liberal states will seek to explore these sub-threshold possibilities made possible by the creative use of emerging tactics and technologies, the range and complexity of emerging ‘tools’ will present a range of strategic, ethical and organisational challenges.


The ‘impurity’ of war will spread through so many aspects of life that the organisational challenge will be to try to contain and manage the complexity that spreads through all aspects of life, politics, infrastructure and technology. But the position being put forward here is that the liberal way of war will be focused on the tactics of impure war, tactics that try to deter military confrontation or use contactless action at a distance to shape emerging conflicts around the planet: from economic war and sanctions, cyber-operations and information wars, action and support at a distance through training and support, actions of a concealed character. Or to fight impure wars against networks or states that are unable to pose an ‘existential threat’ to the liberal or world order, using all the latest tools and technologies in the laboratory of future war, wars that will be increasingly ‘impure’ after the costs and lessons of Iraq and Afghanistan. After the Russo-Ukrainian war, we see even more clearly that Gerasimov was writing a description of the liberal way of war and international politics—rather than the strategic plan for future Russian activity. The coming years will most likely be a time of experimentation and exploration in the tactics and techniques of impure war. Tactics of impure war might involve exploring the possibilities that Sun Tzu pointed to when he wrote: ‘You disorient them with speed so they make blunders that undermine their own moral credibility’ (Pickard and Parker 2022). For example, in May 2021 it was reported that the Israel Defense Forces (IDF) tricked Hamas into believing a ground invasion was underway in order to prompt fighters to hide in the underground tunnels beneath Gaza City (Tibon 2021); the idea was to bomb the tunnels with the fighters inside after reports on the invasion had circulated via social media. Strategies of deception have also involved reporting fake injuries to soldiers in order to secure a ceasefire with Hezbollah in 2019 (Sanchez 2019). These tactics are possibly the tip of the iceberg in terms of what we know about the creative forms of deception that use a combination of elements to deceive the enemy. We see emerging groups—such as the ‘military design’ movement—inside the military that seek to explore ways to cultivate organisational and tactical creativity in response to the perceived failures of the ‘legacy’ approaches to war and foreign policy—and the possibility that militaries might find themselves in types of war and international conflict they have never encountered before (Zweibelson 2023): impure wars beyond anything we can currently imagine. So just as terrorists try to uncover the impure possibilities of different infrastructures, states and militaries will seek to explore impure combinations of elements to surprise, confuse and deceive enemies: these tactics might range from the fairly straightforward (for example, the use of social media by the IDF) to emerging possibilities of mosaic war on a scale that aims to overwhelm a ‘peer competitor.’ Dan Öberg writes about the new dangers of ‘transgressive creativity’ in warfare, arguing that there needs to be a serious examination of the ethico-political dangers for societies that might rely more and more on these impure possibilities (Öberg 2018).


Whether it is being developed by liberal states or authoritarian regimes, this desire for the ‘intelligentisation of war’ might lead to less economic and human damage and harm; we might be entering a new age of creative destruction or creative control, a world where states and citizens are left in states of paralysis and confusion or pushed into causing catastrophic accidents. The challenge ahead extends beyond questions of deception in war; impure war will possibly involve combinations of emerging technologies, tactics, terrains and actors that have never been encountered before. Pure war will be found in nostalgic war films from the previous century. Impure war might result in protopian wars that minimise harm—or impure war might result in deadly necropolitical possibilities that continue the ambiguous processes that can be found in the ‘nocturnal body’ of the civilising process in liberal societies; the ambiguity where, as Bauman suggests, liberal societies can remain indifferent to their role in exporting chaos and destruction through, for example, their weapons industries (Loewenstein 2023). I will now turn the focus to one of the most controversial and widely discussed contemporary developments in the future of war and international conflict—cyberwar—examining how this element of impure war may or may not be changing the character of war and conflict. Cyberwar is a vital element in the visions of unrestricted warfare and mosaic war but also contains implications that extend beyond these approaches to impure war.

Bibliography

Belton, Catherine. 2020. Putin’s People: How the KGB Took Back Russia and Then Took On the West (London: William Collins).
Bousquet, Antoine. 2022. The Scientific Way of War: Order and Chaos on the Battlefields of Modernity (London: Hurst).
Foy, Henry and Bott, Ian. 2022. ‘How Is Ukraine Using Western Weapons to Exploit Russian Weakness?’ Financial Times, 16 March: www.ft.com/content/f5fb2996-f816-4011-a440-30350fa48831
Galeotti, Mark. 2018. ‘I’m Sorry for Creating the “Gerasimov Doctrine”,’ Foreign Policy, 5 March: https://foreignpolicy.com/2018/03/05/im-sorry-for-creating-the-gerasimov-doctrine/
———. 2019. We Need To Talk About Putin: How the West Gets Him Wrong (London: Penguin).
Gerasimov, Valery. 2016. ‘The Value of Science Is Foresight: New Challenges Demand Rethinking the Forms and Methods of Carrying Out Combat Operations,’ Military Review, January–February: www.armyupress.army.mil/portals/7/military-review
Gertz, Bill. 2021. ‘Chinese “Brain Control” Warfare Work Revealed,’ The Washington Times, 29 December 2021: https://m.washingtontimes.com/news/2021/dec/29/pla-brain-control-warfare-work-revealed/
Greenberg, Andy. 2019. Sandworm: A New Era of Cyberwar and the Hunt for the Kremlin’s Most Dangerous Hackers (New York: Random House).
Gregory, Derek. 2011. ‘The Everywhere War,’ The Geographical Journal, Vol. 177, Issue 3: www.jstor.org/stable/41238044


Kania, Elsa. 2017. Battlefield Singularity: Artificial Intelligence, Military Revolution, and China’s Future Military Power: https://s3.us-east-1.amazonaws.com/files.cnas.org/hero/documents/Battlefield-Singularity-November-2017.pdf?mtime=20171129235805&focal=none
Kilcullen, David. 2020. The Dragons and the Snakes: How the Rest Learned to Fight the West (London: Hurst).
Kofman, Michael. 2016. ‘Russian Hybrid Warfare and Other Dark Arts,’ War on the Rocks, 11 March: https://warontherocks.com/2016/03/russian-hybrid-warfare-and-other-dark-arts/
Kurth Cronin, Audrey. 2020. Power to the People: How Open Technological Innovation Is Arming Tomorrow’s Terrorists (Oxford: Oxford University Press).
Liang, Qiao and Xiangsui, Wang. 2017. Unrestricted Warfare: China’s Master Plan to Destroy America (Los Angeles: Shadow Lawn Press).
Loewenstein, Antony. 2023. The Palestine Laboratory: How Israel Exports the Technology of Occupation Around the World (London: Verso).
Magnuson, Stew. 2018. ‘DARPA Tiles Together a Vision of Mosaic War,’ DARPA: https://issuu.com/faircountmedia/docs/darpa_publication/s/23215
Miller, Chris. 2022. Chip War: The Fight for the World’s Most Critical Technology (London: Simon and Schuster).
Mitchell, Tom and Liu, Xinning. 2018. ‘The America Hawks Circling Beijing,’ Financial Times, 7 December: www.ft.com/content/d425ee0a-f9bf-11e8-8b7c-6fa24bd5409c
Öberg, Dan. 2018. ‘Warfare as Design: Transgressive Creativity and Reductive Operational Planning,’ Security Dialogue, Vol. 49, Issue 6: https://journals.sagepub.com/doi/abs/10.1177/0967010618795787?journalCode=sdib
Pickard, Jim and Parker, George. 2022. ‘Dominic Cummings, “Partygate” and the Campaign to Unseat Boris Johnson,’ Financial Times, January 2022: www.ft.com/content/8573d428-e858-43e5-a54c-740a08ab5c90
Sanchez, Raf. 2019. ‘Israel Faked Injuries to Its Soldiers to Trick Hizbollah,’ The Telegraph, 2 September: www.telegraph.co.uk/news/2019/09/02/israel-faked-injuries-soldiers-trick-hizbollah/
Tibon, Amir. 2021. ‘Foreign Media Fumes after Being Used in IDF Ploy to Trick Hamas,’ Haaretz, 15 May: www.haaretz.com/israel-news/2021-05-15/ty-article/.premium/foreign-media-fume-over-idf-ploy-israeli-army-lost-its-credibility/0000017f-e6be-dc7e-adff-f6bf2f580000
Virilio, Paul. 2000. A Landscape of Events (Cambridge, Mass.: The MIT Press).
Virilio, Paul. 2008. Pure War (Los Angeles: Semiotext(e)).
White, Edward and Yu, Sun. 2023. ‘Xi Jinping’s Dream of a Chinese Military-Industrial Complex,’ Financial Times, 19 June: www.ft.com/content/6f388e4b-9c4e-4ca3-8040-49962f1e155d
Zweibelson, Ben. 2023. Understanding the Military Design Movement (London: Routledge).

5 THE IMPURE 2
Glitches in the Digital War Machine—The (Hu)Man, the State and (Cyber)War

Crimes of the Future

In an article on the 2015 Hatton Garden robbery in London, the Financial Times (FT) reported that the theft of deposit boxes was notable not simply due to the age of some of the criminals (the oldest member of the gang was 76) but due to the event being an ‘old-fashioned heist in a new-fashioned underworld.’ A few decades ago, the article informs us, there were hundreds of bank robberies in the United Kingdom every year; in 2015 there were 88 (Thompson 2016). Crime evolves with the emergence of new technologies and—just as there is a desire in modernity to distance humans from risk, danger and all the unpleasant aspects of existence—so criminals and organised crime explore techniques and technologies to minimise the risk of being caught, injured or killed (while at the same time creating the possibility for bigger heists or rewards from criminal activity). A former operational head of the Flying Squad tells the FT about the evolution of criminals: ‘But the clever ones went into drugs, which removed them from the front line and let them surround themselves with a layer of associates. And the newer ones started using technology, which gave them even more distance [from the actual crime]’ (ibid.). Some would argue that the costs of cybercrime pose a serious danger to national security in a manner that places ‘cyber’ as one of our most pressing security concerns—and one that will likely grow in significance, damage and danger. But the possibility of action at a distance is as attractive to states and militaries as it is to the criminal: new tactics of attacking and defending, new techniques for collecting data and intelligence, new techniques of espionage and communication; and as Zygmunt Bauman might add, depending on the type of action or event, geographical (moral) distance from the ‘reality’ or consequences of an action can make it easier for the perpetrator to live with their actions (or inactions).
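The economics behind this shift can be made concrete with some back-of-the-envelope arithmetic. The short Python sketch below is purely illustrative and is not drawn from the FT article or from any dataset; the campaign size, click rate, compromise rate and average loss are hypothetical numbers chosen only to show why remote, repeatable crime can out-earn a physical heist once the marginal cost of each additional target approaches zero.

```python
def expected_haul(targets: int, p_click: float, p_compromise: float,
                  avg_loss: float) -> float:
    """Expected proceeds of a mass phishing campaign: number of targets, times
    the chance a target clicks, times the chance a click leads to a compromised
    account, times the average loss per compromised account."""
    return targets * p_click * p_compromise * avg_loss

# All parameters are hypothetical, purely to illustrate the scaling logic.
print(expected_haul(targets=1_000_000, p_click=0.03, p_compromise=0.1, avg_loss=400))
# Prints 1200000.0: a notional seven-figure return with no physical exposure,
# compared with the fixed, high-risk payoff of a single vault robbery.
```

The point is the scaling logic rather than the particular numbers: distance and repetition, not daring, become the sources of criminal reward.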


But while there is clearly a dramatic (but often ‘invisible’) new world of cybercrime where criminal actors have been enhanced by technology, there remains a debate about the significance of ‘cyber’ for the future of war: Is the digital age changing the ‘character’ of war or is it primarily a problem of crime and economic national security? Has the age of supposed cyberwar just provided new tools for older tactics and techniques of statecraft and war? Or does the digital age herald a dangerous new world of impure war where anyone or any infrastructure can become a target (or agent) of cybercrime and cyberwar, the possibility of creatively destructive events and ‘exploits’ orchestrated by lone hackers, terrorist networks and organised crime (and organised crime used by states as a tool of the grey zone)? Cyber encapsulates much of what Virilio was beginning to describe as impure war where a range of actors, technologies and tactics are being used for a variety of outcomes and strategic objectives, objectives that appear less ‘destructive’ than in the time of modern, ‘pure war’ (the theft of intellectual property rather than the invasion or theft of actual territory): a range of outcomes and impacts that are not comparable to the destruction of ‘pure war’ nor with the apocalyptic possibilities of nuclear weapons and weapons of mass destruction—​a twenty-​first century nuisance that can be corrected or patched, information bombs that will never inflict the damage on bodies and infrastructure that actual bombs can. Of course, the debate about artificial intelligence (AI), as we see in Chapter 9, is filled with apocalyptic warnings about the end of humanity in Terminator-​like scenarios. But what does the concern with cyber reveal about the trends in the liberal way of warfare out to 2049? What I want to suggest in the chapter is it would be like playing a ‘mug’s game’ to speculate on how cyber will develop over the decades to 2049: with each passing decade the possibilities depicted in The Matrix or Ready Player One might begin to transform the human condition, producing new social, legal and ethical challenges for humanity—​and the types of crime and corruption depicted in the ‘cyberpunk’ novels of William Gibson or Rudy Rucker; we might enter into a world of new technological possibilities beyond anything we can currently imagine, worlds where the ‘intelligentisation of war’ and crime radically transforms what we understand as the weapons, tactics, and terrains of international conflict. The cyberpunk writer Rudy Rucker makes this comment in some introductory notes to his book Juicy Ghosts: ‘We’ll see commercial telepathy, or teep, before long. And we’ll want a channel that’s richer than text and images. Users might transmit templates for the neurochemicals that are affecting their current mood. Your friends feel your pheromones! In Juicy Ghosts, people do this with gossip molecules, which are nano-​assemblers with tiny antennas’ (Rucker 2021). To be sure, Rucker does not provide an indication on the timeline of ‘before long’ or how this commercial telepathy might become widely available, but it seems likely that as we approach 2049 we will be working with technologies of communication beyond anything we can currently imagine (even if it doesn’t involve the commercial telepathy envisaged by Rucker). Elon Musk’s ‘Neuralink’ might be the start of this new era


in ‘cyborg’ communication that might be essential to future warfighting by 2049 (and possibly much sooner). Of course, we might see cybersecurity disappear as a cause of concern as ‘technical fixes’ and new laws and regulation ensure that this (now inescapable) aspect of our existence becomes safe and uninteresting in social, political, economic and military terms; for the protopian, we are simply in the difficult (and not that challenging or worrying) early stages of the technologies and processes that will radically improve all aspects of life. For the liberal optimist, the tactics and technologies of whatever cyber becomes might result in new types of humane war, war that takes place increasingly in virtual domains or creates more precise ways of targeting and non-​lethal interventions in the traditional terrains of war. If, as Thomas Rid (2013) has argued, cyberwar is better understood in terms of emerging techniques of sabotage, espionage and subversion, then it might be the case that digital or cyber tactics will be seen as one set of possibilities to consider in the pursuit of an objective; and while cyber has been widely discussed in the first decades of the twenty-​first century, the most significant acts of sabotage, subversion and espionage might not emerge from the tactics of ‘cyberwar.’ For example, the most significant act of sabotage in the Russo-​Ukrainian war might have been the destruction of the Kakhovka dam in June 2023: it might be the case that an act of such economic, ecological and humanitarian sabotage would not be achievable via a cyber route (or there might have been a minor role for cyber at some stage in the orchestration of the final event). If so much of war is about signalling, confusing and deceiving an enemy then digital tools and techniques might emerge in the combination of tactics that will be creatively explored by future warfighters exploring the possibilities of mosaic warfare. But at root, the future out to 2049 might be more about defensive cybersecurity rather than offensive cyberwar (due to the focus on cyber-​defence), security objectives driven by the need to protect whatever infrastructures are essential to our world of instantaneous communication; for example, in the domain of astropolitics and space conflict, cybersecurity will possibly be the key area of vulnerability (at least until 2049) given the challenges of kinetic activity in space (Marshall 2023). So while it is impossible to provide a sensible set of speculations on global technological evolution out to 2049 what we can do is begin to speculate on the broad strategic and technological framework in which cyber will be used in based on current strategic debates—​and events that might indicate what is on the horizon. What I suggest is that while it might not make sense to talk of cyberwar, cyber will most likely be an element in the mosaic of future wars: how significant the role of cyber will be in the mosaic will depend on a variety of factors. First, the significance of cyber will depend on how the technologies evolve—​and what vulnerabilities or ‘glitches’ there are that can be exploited (and how effective the legal, social and technical ‘fixes’ are). Second, the importance of cyber will depend on how significant the impact is of the vulnerabilities that can be exploited: the vulnerabilities might be ‘game-​changing’ during war or in the ‘grey zone’ prelude to


war—or they might be relatively minor ‘nuisances’ that are insignificant compared to the impact of other weapons and acts of sabotage. At the same time, there might be devastating or costly attacks by states or non-state actors in the impure wars of the future, wars in which liberal states will struggle to protect the citizenry (and where attribution will keep the attacks in the realm of the impure); the impure wars fought by liberal states will be against actors enhanced by the tools of the digital and AI age. Then again, cyberattacks on critical infrastructure might be a distant memory of a past anxiety in a world with new powers of quantum security—and a world heading towards 2049 dealing with the more ‘material’ consequences of conflict driven by climate change and the resource wars of the digital/AI age. But regardless of the technologies and tactics being used by us and against us (or the emerging terrains we are operating in), what we can be fairly sure of is that states will seek (often invisible and sub-threshold) ways to creatively exploit the informational spaces and infrastructures that shape and support networked life—whatever the specific technologies and practices become.

The State of Cyber Play

‘Cybersecurity’ includes a broad range of techniques and problems; it is possible that by 2049 we will not be talking about the threats of cybersecurity or cyberwar at all, because every ‘thing’ will be networked and connected in different ways and through different technologies (and possibly increasingly through our bodies and brains); we will just be talking about security and problems of communication and information.

The Informational Dimension

In the first decades of the twenty-​first century there are a range of threats and vulnerabilities to our digital societies and economies that can be divided into—​ rather crudely—​the informational and the infrastructural. Simply put, in the digital age states and tech companies explore the possibilities for connecting individuals, families, groups or communities to other people or groups (and increasingly to the tools provided by AI): the social media we use, the Teams or Zoom used for work, the global gaming environment of Roblox. The possibilities of the connected society also involve new ways for connecting consumers to emerging infrastructures (controlling your smart home from a distance) or services (food delivery, online shopping): or connecting those infrastructures to us. The connected society creates new ways for states and businesses (or criminal groups) to connect to consumers and citizens (or potential victims) in order to be able to better serve them (or better survey them). This desire for connectivity is one of the key obsessions and planetary projects of our time, the desire for greater speed, efficiency and connectivity. And one of the key debates of our time is


between the protopian optimists who see the connected age as the path to progress and a transparent society of openness and communication (with more sustainable and efficient infrastructures) and those who see the age of connection as one of dystopian possibilities of surveillance, control and alienation, the type of scenarios depicted in shows such as Black Mirror. In terms of security and war, the informational, at the most basic level, refers to the theft of information/intellectual property or financial crime; it involves the way that an individual or group can access or steal information either through physically accessing information (or spreading/circulating information as we see in cases of informational sabotage in organisations) or through remote access. The distinction between the informational and the infrastructural is not always so clear; the problem of juice jacking occurs when a phone or laptop has information taken from it or malware installed on it through a USB charger in a public charging station where a concealed device or computer is being used; some security professionals are warned about juice jacking in the hotel rooms of certain states. Or there could be the problem of the undersea cables that could potentially be used for espionage and subversion by a state. The informational can also involve the manipulation of behaviour due to the information that is stolen from a smartphone or from files stored on, for example, a laptop. The manipulation of behaviour can be individualised, organisational or societal; it can involve the bribery that occurs when a celebrity’s photographs are stolen or when the details of an email are leaked in an attempt to change the outcome of an election. The realm of the informational can also involve the big data that can be collected by companies on individuals’ consumer habits and political preferences (what has been referred to as ‘surveillance capitalism’); this increasingly ‘granular’ information can then be used by political actors to generate more targeted campaigns for elections (or referendums). This informational aspect of cyber may increase in significance as states use big data to micro-target individuals and communities with messages designed to appeal to their specific fears and desires (and when increasingly granular and detailed data is stolen). But this all might just be the next stage of political marketing and advertising (and its impact will be cancelled out as other parties/campaigners deploy the same techniques or through education to make citizens aware of the tactics used on them). In an authoritarian/totalitarian regime the information that can be collected and analysed may take all our twentieth-century-inflected anxiety about authoritarian dystopias to a level beyond what even George Orwell (or William Gibson) could have imagined. For the optimist these concerns about information detract from the positive possibilities of the connected and networked society. And for those focused on the future of war, these might be viewed as issues on the periphery of national security.

To be sure, it is debatable how significant online recruitment is compared to more traditional means of radicalisation of (generally) young people; and there are countermeasures that turn this dark space of the online world into a transparent space with innovative forms of intervention (Busher, Malkki and Marsden 2023). The online dimension of terrorist organisations can be used to market the organisation and the 'lifestyle' possible for those who join it; again, there can be new techniques to disrupt these strategies, new ways to insert counternarratives into the online terrain. As with many of these questions of cybersecurity and online culture, it is not clear whether our innovative responses are catching up with tools and techniques that are dying out, or whether the tools and techniques will constantly be mutating and multiplying, accelerating beyond our reach. One case that was of direct relevance to the future of civil war and ethnic conflict was when Facebook was used as a means to circulate violent and dehumanising hate speech about the Rohingya Muslims in Myanmar during the intensification of violence in 2017; Facebook was viewed to have been ill-prepared to monitor what was going on in an area where the information space of social media shaped violence in the 'real' spaces of conflict (Mozur 2018). In terms of information or data, the anxiety is about the emerging possibilities in theft, blackmail and manipulation through actions that can (at least initially) be 'contactless.' Cybercrime 'scales up' the possibility of criminal activity, from the way that 'phishing' can target large groups in the hope of finding vulnerable individuals who will pass on important information (credit card details, passwords) through to the crime that can target funds/accounts far greater than can be held in a physical vault or bank. In terms of future warfare, the key areas of concern are the theft of 'intellectual property' (either remotely or 'on site') that might have immediate or long-term economic or military–technical consequences; the disruption of communication in times of war (or the circulation of information in times of peace, the anxiety about an Edward Snowden figure releasing 'sensitive' information); the micro-targeting of individuals/groups (both in terms of your own personnel and of groups involved in conflicts); the analysis of a city/community that could then be used in the tactics of unrestricted or mosaic war; and the sabotage of critical infrastructures of war/non-war.

The Infrastructural Dimension

The infrastructural dimension of cybersecurity or cyberwar refers to the way that different infrastructures—​from power plants to hospitals to entire ‘smart’ cities—​ can be tampered with or sabotaged; malware is inserted into an infrastructure in order to prevent it functioning as the user intends. Stuxnet is one of the most widely known examples in the realm of interstate conflict and competition: the attempt—​ uncovered in 2010—​ to disrupt Iranian development of nuclear weapons by

targeting industrial systems (Arquilla 2021: 12); the attacks on the Ukrainian power grid in 2015 are seen as cyberattacks that targeted infrastructure in what was then the 'grey zone' of interstate conflict (Buchanan 2020: 288–301). The second—and less direct—way that critical infrastructures can be tampered with is by criminal organisations preventing users from being able to access the 'information' that enables, for example, the circulation or movement of ambulances or police; the correct functioning of an organisation or infrastructure can be disrupted, as the city of Baltimore discovered in 2019 and as the NHS in the United Kingdom discovered in 2017, a victim of the WannaCry ransomware attack. But cyber is also infrastructural in the way that the digital age depends, as Andrew Blum (2013) demonstrates in Tubes: Behind the Scenes at the Internet, on the undersea cables, energy needs, data centres, real space, resources and equipment that underpin the virtual world, the cloud and cyberspace. The sabotage through kinetic means of the critical infrastructure that supports the digital might be a vital aspect of what cyberwar actually involves (although the severity and impact of such sabotage remains to be seen). In terms of the future of warfare, the anxiety is that different parts of the military machine will be sabotaged—left unable to communicate or with 'tools' that cannot be used—or that civilian society will be attacked, the infrastructures critical to everyday life sabotaged, destruction produced by techniques of unrestricted warfare (from states) or impure war (from terrorist networks); sabotage we have never experienced before with consequences we are not prepared for. For the dystopian techno-pessimist, we have yet to see the full impact of an attack on our digital society because if it were to happen then it would be so severe that when the perpetrator was uncovered there would be a response that could be a 'mixed methods' response combining the cyber and the kinetic; in other words, actors are deterred from the deadliest forms of cyberattacks, the possible attacks that remain in the blurred grey zones of our security imaginations. It may also be the case that dangerous non-state actors that want to create a terrorist cyberattack (the Cyber 9/11) have yet to uncover the right vulnerability or to develop the best line of coded attack in this emerging virtual space—but this will change in the future terrains of cyberwar or cyber-terrorism when actors are enhanced through their use of AI (Kurth Cronin 2020). Furthermore, and perhaps more importantly, states will not reveal what they are capable of doing until it's essential, in the lead-up to open war or during wartime. Until the time of war, you keep your knowledge of what is possible in cyberwar, the vulnerabilities you have uncovered or produced, secret; a cyberweapon might have limited, one-time use because once a vulnerability is revealed it can be 'patched' up. As Ben Buchanan suggests in The Hacker and the State: Cyber Attacks and the New Normal of Geopolitics, some military leaders might view their cyber capabilities like they see their tank battalions, as the 'reliable assets that can be deployed against a wide range of targets and whose force is easily understood' (Buchanan 2020: 8).

But for Buchanan, comparing cyber 'weapons' and attacks to nuclear and conventional weapons risks misleading us about the character of cyberattacks:

Cyber capabilities are not nearly as powerful as nuclear weapons or even most conventional military capabilities. Nor are they as dependable, fungible, or retargetable as traditional arms. Maybe most vexing of all, the operational functioning of cyber capabilities is nonintuitive; while most policymakers and scholars understand what nuclear weapons and tanks can do, the possibilities, pitfalls, and processes of hacking missions are comparatively opaque. (Ibid.)

Buchanan goes on to suggest that cyber operations are poor at changing an adversary's behaviour through the threat of a visible form of harm, the signalling that can deter an adversary from pursuing a course of action. The sabotage of the Kakhovka dam might have a much more lasting and wide-ranging impact in the Russo-Ukraine war than any act of cyber-sabotage. Cyber operations contain an uncertainty for the 'attacker' where there is often no control over how much damage an event might generate; Buchanan argues that some attacks do less damage than anticipated while other events (such as the impact of WannaCry) have greater impact than was possibly envisaged; the NotPetya attacks are described as being 'lobbed in the general direction of an adversary' (Buchanan 2020: 310). The threat of a cyberattack can be defended against once an attack is spotted and measures can be taken to secure systems, although Buchanan notes that defending against attacks is clearly less effective if the attack is informational, resulting in the leaking of documents into the world (ibid.). Buchanan concludes dramatically: 'For better and for worse, hackers—working for, against, and within states—are shaping the future of the world' (Buchanan 2020: 9). The question of whether this 'new normal' of geopolitics is making the world safer, reflecting the progress that liberal internationalists view as a possibility, or whether this age of cyberattacks is generating new dangers and insecurity is left open; the implication of his argument is that cyber operations will be an important and often invisible element of statecraft and war, an element that is neither a sign of progress and 'humane' war nor a dark new age of Cyber 9/11s; cyber operations are just a vital element in the new 'tempo' of geopolitics.

The Uncertainty of Cyberwar

At the risk of greatly simplifying the causes of digital insecurity, cyberattacks occur because of a glitch—​an error of code that is sometimes intentional and sometimes just a mistake: some organisations offer bug bounties to those who uncover vulnerabilities and potential exploits; some argue that AI will be able to spot and fix vulnerabilities before they can ever become problematic. The ‘protopian’ believer in technological progress (thinkers like Kevin Kelly) will argue that our current

anxiety about cybersecurity and cyberwar will be short lived; what we are seeing at the moment are the 'teething troubles' of a new technological age rather than the beginning of a new digital dark age; the pace of change in digital technology has resulted in various types of cyber(in)security, but these vulnerabilities will be overcome with technical fixes, better management and (global and national) governance. Writing in 2019, Andy Greenberg concludes his often bleak account of the use of hackers by the Kremlin to manipulate and sabotage 'opponents' with the suggestion that there is no reason to believe the digital attacks on Ukraine will be confined 'by the contours of geography'; and for Ukraine, 'Low-grade, endless war remains the dystopian reality of a country that straddles the fault lines between civilizations' (Greenberg 2019: 313). But what looked at the time like a dystopian reality of an endless grey zone/sub-threshold conflict was possibly viewed by some Russians as Gerasimov-inspired tactics of preparation for war and invasion—or possibly (if someone in the Kremlin believed the hype) as the use of cyber-tools and tactics that would make a traditional invasion unnecessary; the invasion might have resulted from a frustration with the limits of these tools. Greenberg suggests that we will need to explore the possibilities of a digital Geneva Convention to limit the war crimes of the future or at least to explore new rules of cyber conflict. But Greenberg is not optimistic about states limiting their experiments in the new laboratories of war (ibid.: 295). The protopian techno-optimist, however, will argue that the idea of criminals or terrorists identifying the weakest link in our digital societies and unleashing the Cyber Pearl Harbour or Cyber 9/11 is science fiction: overhyped and exaggerated visions of digital disaster that result from the unsettling and often disorientating impact of a world-changing technology. This digital age is unsettling in the ways we feel our lives can be interfered with remotely; indeed, it is not surprising that there are cyber-horror/ghost stories such as the novel Daemon ('We are all connected. There is no escape.') or films such as Pulse (and even Dragon Day, a 'b-movie' that depicts the occupation of the United States after a digital onslaught on critical infrastructure orchestrated by China, resulting in all citizens becoming prisoners under deadly surveillance). There is something unsettling about technologies that bring the 'global' and/or the criminal into our lives, the new infrastructure of proximity and distance that is now built into our lives; for example, when you receive an email suggesting that someone has been watching what you have been doing online and will reveal it to everyone you know unless you pay them. There is also something potentially unsettling about the way memories (or the news of people from your past) can return via social media, the online as a space, in a sense, of hauntings. But the protopian optimist will argue that we will get better at dealing with all the unintended consequences of the digital world; we will get better at the technical fixes; new innovations will make the digital world healthier and safer. There are those who argue we are witnessing the 'dawn of the quantum internet' and the race is on to create super secure online spaces; at the same time there is the concern that 'the quantum internet will turn the dark web darker still, and some are bound to take advantage' (Battersby 2021: 40).

The pessimist will argue we are heading for the digital dystopia of proliferating cybercrimes, digital terror and a diversity of enhanced non-state actors all seeking to exploit our personal and societal vulnerabilities. But what does this all mean for the future of warfare? In 2012, Thomas Rid argued in Foreign Policy that we should be critical of the hype about the 'digital bogeyman'; virtual conflict was more hype than reality and the scenarios depicting a Cyber Pearl Harbour or Cyber 9/11s, catastrophic events orchestrated by terrorist or rogue state hackers, exist only in the imaginations of cyberpunk writers (or, we might add, in the sales pitches of those who would seek to profit from selling the solutions and technical fixes to this new age of cyberwar). Returning to one of the most influential theorists of modern war, the Prussian General Carl von Clausewitz, Rid suggests that what we might call cyberwar is misleading: war is the use of violence to achieve political and strategic objectives; the various aspects of emerging digital attacks on states, organisations and infrastructures are phenomena that—while the exploits can be a nuisance—do not kill anyone directly and lack the threat that can change the behaviour of a state or army; no state will remove their armies from a state they have invaded due to the threat of a superpower unleashing a cyberwar. In the James Bond story 'Risico,' Ian Fleming writes that nothing made M. 'more angry than having to divert his staff from their primary duty. This duty was espionage, and when necessary sabotage and subversion' (Fleming 2022: 132). Cyber can be many things but it is not war (Rid 2013): for Rid, rather than having the potential to become an instrument of war and violence that can deter behaviour or change strategy, all things cyber are basically the new tools of sabotage, espionage and subversion—the latest attempts at what states have always done in war and peace (and the 'grey zone' that might be the constant 'background noise' of international politics). And even some of the most well used examples of cyber exploits are not that significant in terms of radically transforming the behaviour of states and militaries. Rid (2012) uses the example of the attacks on Estonia in April 2007—the denial of service attacks that emanated from 85,000 computers, the attacks on its largest bank, newspapers and other services—that were compared to the blockade of harbours or airports; Rid suggests that the attacks were more emotional than infrastructural; the bank, for example, was 'down' for 90 minutes one day and then two hours the next day. John Arquilla (2012) points out (in his debate with Rid) that one of the most widely cited examples of the potential of cyberwar—the Stuxnet worm that targeted Iranian nuclear-enrichment capabilities, an example of a strategic cyberattack or what he calls 'cybotage'—will not stop Iranian proliferation on its own. But Arquilla makes an interesting point in the debate with Rid in Foreign Policy. Arquilla—who, along with David Ronfeldt, declared in 1993 that 'cyberwar is coming!'—suggests that while the use of sabotage against civilian populations might not be a 'game changer' (civilian populations are likely to withstand 'assaults by bits and bytes'), what is significant is the way that cyberweapons might 'scale up' in this time of acceleration and connectivity; simply put, we do not know what will be possible in coming years and decades in terms of vulnerabilities that will emerge in connected and networked societies.

Arquilla suggests that cyberwar has already proved significant: 'When Russian tanks rolled into Georgia in 2008, their advance was greatly eased by cyberattacks on Tbilisi's command, control and communications systems, which were swiftly and nearly completely disrupted' (Arquilla 2012). And the chances are that the creativity of this impure war will produce tactics, technologies and terrains we cannot currently imagine. Arquilla appears less concerned with the possibility of the electronic 'siege' of cities and states and sees the growing importance of disruption to battlefield information systems and the virtual 'bitskrieg.' The bitskrieg might be either informational and intentionally visible (the clever use of social media during war to influence perceptions of a war) or infrastructural and intentionally invisible (attempts to sabotage an army's ability to communicate). The war in Ukraine appears to have involved creative attempts by Ukraine to exploit the informational dimensions of digital war and social media (the use of deception, countering disinformation) while at the same time preparing for the infrastructural dangers of cyberattacks. We also need to consider the broader implications of what Arquilla terms the 'social netwar'; what subversion is becoming might be radically different in the future (as Gerasimov warned). Simply put, what Arquilla sees in terms of the importance of social networks might become more radical than either he or Rid seem to suggest, radical new tools of information war and psychological war that might be used in damaging and creative ways in times of war, peace and the grey zone. Indeed, both the informational and infrastructural possibilities might be more radical than any theorist can imagine. Discussing NotPetya, the malware unleashed in 2017 that targeted Microsoft Windows systems, resulting in a wave of costly exploits, Greenberg notes:

Even Thomas Rid, a Professor of strategic and military studies at Johns Hopkins who has written sceptically about the potential for cyberwar, criticising overblown metaphors of 'cyberweapons' and an impending 'cyber 9/11,' has said that NotPetya finally represented an event that warranted that sort of hyperbole. 'If anything comes close to "cyber 9/11,"' Rid told me, 'this was it.' (Greenberg 2020: 140)

Rid seems to acknowledge that what sabotage, espionage and subversion might mean could change: while he is not going so far as to agree with the usefulness of the term cyberwar, he might be open to the idea that what a future act of sabotage, espionage or subversion might become will not be captured by those terms. Indeed, many of the terms we are using in the 2020s might be redundant by 2049 (while the 'nature' of war might not change, the technologies and tactics of war might change beyond what we can currently imagine).

What we can see in this debate between Rid and Arquilla is the uncertainty about the impact and consequences of cyber: an age of new technologies for traditional tactics of statecraft, or a time of cyberwar where techniques of informational or infrastructural war prove decisively destructive in future wars, destructive or 'game-changing' in ways beyond the virtual or digital.

The (Hu)Man, the State and (Cyber)War

For Virilio, this time of impure war in the twenty-first century differs from the pure wars of the previous century in the way it expands and multiplies the actors, technologies, terrains and tactics of war and international conflict. In the following sections I want to comment on the different elements in this time of impure war and international politics in order to think through the trends in cyber and war, focusing on: the human or individual; the state and society; war and interstate conflict. I argue that there is uncertainty about how significant cyber will be in war and the sub-threshold zones of the future: it could be manageable sabotage, espionage or subversion—or it could be a game-changing age of cyberwar. But I conclude by arguing that while cyber will remain a vital issue for the foreseeable future, we should recognise that in the decades ahead we need to remain alert to the possibility that our societies of connectivity might change and evolve beyond what we can currently imagine, the futures imagined in books such as Rucker's Juicy Ghosts or Jennifer Egan's The Candy House with its depiction of a world where memories can be shared and commodified. But what will drive militaries and other actors is the search for creative ways to explore and exploit informational and infrastructural connectivity. While the techniques and tech might change, the logic of intensifying connectivity, communication and commodification is unlikely to disappear (unless there is a dramatic turn of economic, ecological or geopolitical events). The liberal way of future warfare will fight impure wars and will seek to defend against impure wars; wars will be fought with new information technologies and wars will be fought to defend the informational infrastructures and foundations of the networked society. We might witness the emergence of impure wars that take us into the worlds imagined by William Gibson or David Cronenberg. We might see incredible 'quantum leaps' into worlds currently unimaginable, taking impure war into dangerous zones of lethal empowerment; at the same time, we might see a retreat into the greater use of tactics and technologies from pre-digital times. But at root the focus of future warfare will be the desire to creatively attack and defend the infrastructures of command and control, the technologies that enable complex organisations operating across the globe to communicate in dynamic and complex environments, the tactics of espionage, subversion, sabotage, deception and confusion. There might be wild new possibilities of impure war—or the possibilities might be tempered by digital technical fixes that limit the creative potential or by the vulnerability of emerging technologies.

I would agree with Rid that it currently might not make sense to talk about cyberwar (although I would agree with Arquilla that we are seeing a pace of change that might need terms like 'bitskrieg' or 'cybotage'). But my suggestion is that cyber might be best understood as an element in what some describe as mosaic warfare; its significance and impact might vary in light of the offensive and defensive possibilities. Cyber is unlikely to become a weapon of pure war but will be part of the messiness in our times of impure war, a time of uncertainty, blurred lines between war and peace, the state and the criminal organisation, the intimate and the global, the home and the city, the human and (possibly by 2049) the replicant (or the post-human).

The Human: The State of the Hacker

In the summer of 2019 there were a couple of months where I experienced (albeit temporarily) what felt like a 'digital exclusion' from my university while a 'cybercrime' targeted at the university was investigated (BBC News 2019); my sense at the time was that the incident illustrated how hackers/criminals could target vulnerabilities in organisations that were waiting to be exploited (and then technically fixed), an event that takes an organisation by surprise when really it was clearly an 'accident' waiting to happen. But the incident also gave me a sense of the unsettling way you could become entangled in such an event in your everyday work in an organisation that depends heavily on connectivity, on being networked: the sense of unease and uncertainty about being suspected of cybercrime and cyberattacks reflected the change in scale in international politics that Virilio wrote about, a change of scale where a quiet summer in a campus town begins to turn into a cyberpunk thriller or a paranoid Philip K. Dick conspiracy, where a 'distant' actor exploiting the systems we use for work felt dangerously close, producing as much paranoia as one is likely to experience in the university workplace, the strange intimacy between you and someone you might never have met, someone who could have a destructive impact on your life. As I and others speculated on the perpetrator, theories ranged from the 'lone wolf' hacker to organised crime seeing universities as an 'easy target' to hackers from a foreign state who might have an issue with universities teaching the next generation of security professionals. For the techno-optimist, such events are part of the 'learning process' as we head deeper into the constantly changing (and expanding and intensifying) digital age, the 'teachable moments' in cyber education and management. For the more dystopian pessimist, the event reflects the emerging individualised insecurity where we can have our lives disrupted through our entanglement with actors that might be based in distant territories—or are geographically close to us (but morally or politically distant); and for the pessimist, this insecurity and vulnerability will become more creative and catastrophic, impure wars that target the individual or organisation in increasingly devious fashions.

For some commentators, we live in a time where individuals can become targets of a variety of creatively nefarious cyber-tactics, from the influence operations that try to weaponise social media for political ends through to attempts to steal information from individuals—for a variety of objectives that might be (geo)political or might be criminal, exploitation focused on the 'weaponisation of everything', as Mark Galeotti (2022) puts it; in the more dystopian science fiction, our everyday infrastructures are targeted, our smart homes, smart cars, the devices we rely on in all aspects of our lives, the type of scenario depicted in Samanta Schweblin's Little Eyes where the infiltration by new techniques of 'remote access' is used to invade intimate spaces of everyday life. At the same time, one of the urgent concerns of our time is of individuals or small groups having enhanced powers and capacities due to the vulnerabilities they can exploit through cyber-skills that might have 'low barriers' to entry. In Power to the People: How Open Technological Innovation Is Arming Tomorrow's Terrorists, Audrey Kurth Cronin explores the emerging possibilities for (dis)organised violence in times where destructive capabilities can be generated through technologies that are different from the 'closed' systems of more 'traditional' weapons of 'great power politics'; the development of destructive capacity in closed systems depends on expertise, huge budgets and access to often highly restricted resources; the possibilities for drones, AI or cyber appear to exist in 'open systems' of fairly ubiquitous global technologies (and open systems that might become more radical, destructive and disruptive in the years out to 2049). Kurth Cronin sees the digital terrain as often providing 'new tools' for 'old tactics' and she points to troubling possibilities in a time when internet-connected technologies are being used to mobilise individuals for political violence (Kurth Cronin 2020: 177). The trends she identifies point to tendencies that might unleash a chaotic time of violent (non-state) actors, enhanced by mobile streaming videos and live streaming of violent acts, the use of first-person filmmaking technology to bring proximity to violent and destructive events, the use of viral fake news to confuse and end-to-end encryption to organise; and most worryingly, the use of drones, bioweapons, robots and AI to create a dangerous age of lethal empowerment. Violent actors are potentially empowered by the possibilities of constantly evolving connectivity: 'Driven by the powerful business model behind social media, online technologies have transitioned from being mainly information-sharing platforms into digital juggernauts aimed at capturing attention, collecting data, and facilitating like-minded groups' (Kurth Cronin 2020: 187); the exploitation of psychology and social media that can expand and intensify groups who use more 'traditional' techniques (the suicide bomber) of terror and crime. But this age of open technological innovation could result in a dangerous and intensified period of impure war enhanced by emerging technologies. Kurth Cronin concludes by exploring the strategies that states could begin to implement to limit the risks and dangers in what she ominously terms 'an Age of Lethal Empowerment'; whether we can limit this empowerment will undoubtedly be a vital social, political, legal and technological challenge in the years and decades ahead.

What will lethal empowerment look like as we approach 2049? What will cyber sabotage, espionage and subversion look like in 2049? For the protopian optimist (and for analysts like Kurth Cronin), we will develop ways to manage the innovation and openness of the digital age in ways that ensure the benefits outweigh the costs (benefits that will be revolutionary and possibly beyond anything we can imagine if we are considering the world by 2049); the world out to 2049 will not be utopia but nor will it be dystopia; it might contain elements we would currently describe as utopian just as it might contain aspects we would describe as dystopian. Fundamentally, the world will just be increasingly different. But for the pessimist, lethal empowerment will result in a dangerous world of impure war where all areas of life will be targeted and states will fight endless wars with a variety of actors with a range of destructive capabilities, capabilities that are impossible to regulate and control. As the Pentagon futurist Andrew Marshall commented in 2003:

A friend of mine, Yale economist Martin Shubik, says an important way to think about the world is to draw a curve of the number of people 10 determined men can kill before they are put down themselves, and how that has varied over time. His claim is that it wasn't very many for a long time, and now it's going up. In that sense, it's not just the US. All the world is getting less safe. (Marshall 2003 in McGray 2003)

For Virilio, writing about the impure wars after 9/11, we might be in a time of 'peace' but any aspect of everyday life and infrastructure could become a (temporary) battlespace and states could get caught in 'endless wars' against terrorist networks: the strangeness of impure war is the horrific ways in which normal life—wherever you live—can be targeted by new tactics and technologies, a drone strike or terrorist bomb that takes you by surprise, the tactic that finds a new technique of 'weaponising' everyday life. Technologies of connectivity produce new ways of connecting us to events (when we see the images that are intended to generate fear) or making us part of events (when we are hacked for cybercrime or influenced by social media 'bots' or click farms). But in terms of twenty-first century warfare, there are signs of other ways that individuals and groups might become 'tools' of the information war rather than the targets of informational manipulation. For example, in 2022 an 'app' was produced that allowed Ukrainians to report to the military the movement of Russian troops from their towns and villages, the grassroots surveillance (or sousveillance) enabled by smartphones (Harwell 2022); Ukrainians also used social media to counter Russian propaganda and disinformation about the invasion through the way they could circulate stories and images.

The individual potentially becomes a new tool of information war and a tool to counter information war; for the optimist, the informationally empowered individual or group will have new tools to creatively challenge the actions (and inactions) of states and militaries. As we approach 2049, the question will be to what extent the individual will be insecure and vulnerable as a result of the connectivity that is exploited by states and non-state actors—and to what extent the individual will be lethally empowered in ways that produce dangerous new acts of impure war, unprecedented acts of informational and infrastructural harm. At the same time, there is the possibility that the individual will be empowered in protopian ways that will transform warfare. The 'size' and 'shape' of the enemy (smaller networks but more destructive; 'invisible' but able to create global events and spectacles; the use of drones and robots) might change just as the size and shape (and the education and skills-base) of militaries might change. The role of all individuals in conflict situations might change dramatically both in terms of the potential threat an individual can pose, 'tooled up' with skills and technology, the cyborg entangled in a network of machines inside and outside their bodies that they use (and are used by) for a variety of purposes—but also in terms of individuals who can be used as potential 'assets' in war, opening up new creative possibilities in warfighting. Not the human terrain but the cyborg terrain. The space or geography of conflict might look radically different in a time where any human in any city on the planet becomes super-empowered or lethally empowered in ways we cannot currently imagine. The liberal way of warfare will likely be looking for ways to exploit and control the super-empowered individual who creates new possibilities in lethal empowerment and connectivity but also new tactics of conflict prevention and resolution (and surveillance). The future liberal warfighter will be tooled up and teamed with so many tools of AI, robotics and sensors in and on their body that they will be the constant focus of increasingly detailed and granular tactics of cyber-sabotage, espionage and subversion: the challenge will be to ensure that the Agile warrior (to use a term that was used in the United Kingdom at the time of my visit to Sandhurst that began the book) does not become the Fragile warrior. But in this time of mosaic warfare there will be the constant production of new elements where the human is entangled in emerging technologies of connectivity in ways filled with both protopian and necropolitical possibility.

The State: Sub-Threshold Cyber and International Relations

Attempting to build defensive and offensive cyber capabilities is an important objective of possibly all states in the twenty-​first century. To be sure, there is a great deal of uncertainty about what states such as the United States or China are actually able to do in this ‘domain’ or how ‘good’ (or creative) they are at research and development on these cybersecurity problems. There is unease

about the ethical and legal aspects of cyber defence and offence in this murky grey zone of constantly changing and evolving technologies; there is unease about the possibility of accidents, mistakes and unintended consequences in attempts by states to secure the digital: the mysterious Shadow Brokers hackers that leaked secrets from the National Security Agency might reflect the vulnerabilities of security agencies to outside attack—or the event might reflect the problem of insider threat or even insider negligence or incompetence. The Shadow Brokers 'auction' of hacker tools might illustrate the dangers of states developing cyber skills that they struggle to control before they go 'into the wild' to be repurposed by other states or criminal organisations (such as NotPetya's use of NSA-linked EternalBlue and EternalRomance) (Buchanan 2020: 253; Greenberg 2019: 182). For the protopian, states will get better at managing the problems of cyber research and development; for the dystopian, 'hacker states' might be producing the weapons and vulnerabilities that will be used against them, their allies and businesses—and their own citizens (Follis and Fish 2020). Virilio would add that we need to watch for the accident of technology that might prove as destructive as anything made possible by a state or orchestrated by a terrorist network. One of the concerns articulated in strategic initiatives such as the Department of Defense's Third Offset Strategy is with 'staying ahead' in a time where technological innovation in the 'open' systems described by Kurth Cronin risks unsettling the 'order' of international society. Simply put, other states or non-state actors might begin to match the United States in emerging technologies in a way that produces geopolitical disorder and tension: one of the worst-case scenarios being that undeterrable actors, actors without territory to lose or economic interests to protect, launch catastrophic attacks using the latest innovations in biological or cyberwarfare. Initiatives like the Third Offset Strategy aim to ensure that the United States and its allies are able to manage and pre-empt military–technical challenges in a time where there are a multitude of areas of innovation, from AI to cyberwarfare to robotics to quantum computing and biological warfare (Latiff 2017). But there remains the view that when it comes to all things cyber the risks and dangers are overstated. In May 2022, the head of Government Communications Headquarters (GCHQ) suggested that the ability of Russia to launch devastating cyberattacks on the Ukrainian military and civilian infrastructure had been 'overhyped' (Srivastava 2022). To be sure, there are undoubtedly constant risks of an informational or infrastructural nature, but those risks are being managed by the focus on defensive and offensive capabilities and, as T.X. Hammes (2021) argued in Joint Force Quarterly, there is resilience in the 'complex adaptive system' of the internet that will limit the possibility of the more catastrophic scenarios of cyberattack and vulnerability. Kilcullen concludes his study of what he calls 'liminal war' by suggesting that 'societal resilience' can counter some of the creative attempts to exploit the grey zone (Kilcullen 2020: 255).

A key role for the military–technical experts in the liberal way of future wars will be focused on providing support and training for states and other actors that are the victims of sub-threshold activity—or protecting their infrastructures from sabotage in times of war. It might also involve enabling other states to launch their own offensive cyber operations (and while liberal states will strive for protopian goals, they might become the cyber enablers for states adopting more necropolitical tactics); it might also be the case that states may not need the assistance of the United States and its allies in deploying their creative and damaging cyber offensives. And while we might stay ahead in the development of offensive and defensive cyber capabilities, other states and actors might explore the possibilities of cybercrime and disruption as a way of supporting their ambitions in kinetic military action. One well-known example of an ambitious bank heist occurred in February 2016 when 'criminals' targeted the Bangladesh Bank; 35 fraudulent orders were issued via the SWIFT network to transfer $1bn from a Bangladesh Bank account based in the Federal Reserve Bank of New York to accounts in the Philippines and Sri Lanka: five of the instructions were successful, with $20 million traced to Sri Lanka and $81 million ending up in the Philippines: the money that was transferred to Sri Lanka has been recovered while only $18 million of the money transferred to the Philippines has been recovered. The 'heist' was halted when a spelling mistake was spotted in the instructions. The instructions were issued when the bank's office was closed in Bangladesh and criminals were able to access the bank's computer network and issue orders for payments (having inside knowledge of the processes); the bank in New York lacked the ability to detect fraud in real time and the funds directed to the Philippines were laundered quickly into the gambling world during the Chinese New Year (Zetter 2016). The heist is interesting in the way it was orchestrated globally through the exploitation of different vulnerabilities, from the ability to access the computer networks through to what looks like the clever exploitation of times when offices were closed or on holiday in different territories. What made it possible was the identification of a local vulnerability; while there had been previous cybercrime in banks in Bangladesh, the vulnerabilities that had been identified there were still problems that had yet to be addressed. The heist might be an example of the exploitation of vulnerabilities that will be eradicated through better regulation—or an inevitable side effect of the uneven development and implementation of new systems and technology; in this view, there will always be a weak link to be exploited as new systems and technologies emerge and are used in more areas of life and business. But there is uncertainty about who was behind the heist; some suggest that it was North Korea and the Lazarus Group (an allegedly state-sponsored hacking group suspected to be behind the attack on Sony in 2014 and the WannaCry ransomware attacks in 2017) while others suggest that organised crime made it look like North Korea was behind it, using anxieties about state behaviour in the grey zone as a strategy of deflection and deception (and there were people in Bangladesh who saw the event as an example of domestic corruption and crime orchestrated by political and business elites).

One of the arguments regarding North Korea's objectives is that cybercrime is a means to fund their more traditional military–technical objectives. The question here is how effective and significant operations in this grey zone are (and will become): these might be desperate and opportunistic operations with limited effectiveness, or states might become increasingly effective at uncovering vulnerabilities in governments and corporations to fund the search for military–technical advantage outside of the cyber domain. While it is difficult to know whether we will still be talking about cyberwar and cybersecurity as we approach 2049, attacks on infrastructure and information will be a vital concern of all states and organisations (from universities and research centres to banks) for the foreseeable future; it remains to be seen how dangerous this space will become given how much focus there will be on building resilience and protection. A Cyber 9/11 or Cyber Pearl Harbour does not look likely in the 2020s, but this might partly be down to how central cyber has become to state thinking and preparation for the future of security and war—and the constant anxiety about the 'black swan' event that might still take us by surprise. So while we might not see a Cyber 9/11, we might see a cyber-funded future 9/11. But a vital element in the liberal way of future warfare will be to protect critical infrastructures and search for new ways to protect and disrupt command and control in times of war—whatever the technologies of communication and information become; and with every passing decade we might be heading deeper into cyborg worlds of neuralink and neurowarfare.

War: A War on Hackers? War over Cyber?

For Arquilla, cyberwar and the bitskrieg is here; for Rid, cyber is sabotage, espionage and subversion in times of war and peace. I would share Rid’s hesitancy about using the term cyberwar, but in his exchange with Greenberg he appears to admit that—​while we are not yet in the realm of a catastrophic Cyber 9/​11—​the stakes are rising in light of events such as NotPetya. But while it might not make sense to talk of cyberwar, cyber—​in its various informational and infrastructural dimensions—​ is an important element of contemporary war and international conflict whether it’s the use of social media for disinformation and propaganda or the need to defend infrastructure from attacks. So cyber is an important element in the ‘mosaic’ of future warfare where the significance of the informational and the infrastructural will vary depending on the changing nature of the vulnerabilities in the technologies and systems states, businesses and individuals use; the digital aspects of war might become less or more significant in terms of the mosaic of warfare depending on the situation—​and depending on how technology and connectivity evolve. As we approach 2049, while we might not be seeing replicants fighting wars, we might be in a world filled with humans better described as cyborgs, humans (and warfighters) dependent on a range of technological ‘upgrades’ (if we are not there already).

So cyber might not be cyberwar but a vital element in the mosaic of warfare, the latest stage of sabotage, deception, influence, espionage and subversion. These cyber elements might not be that significant or impactful compared to kinetic attacks or sabotage. But in Bitskrieg: The New Challenge of Cyberwarfare, Arquilla makes an interesting point about responses to hacker states, cybercriminals and terrorists. Arquilla argues that given that the information age has enabled the rebirth of 'banditry' that—as Greenberg's Sandworm illustrates—provides a 'dark service' for states, action beyond defence might be required, a 'proactive capacity for detecting malefactors as they move about in cyberspace' (Arquilla 2021: 139). For Arquilla, liberal states might have to use 'raiders' to 'track and attack dangerous criminals, insurgents, or terrorists' and the implication is that hackers could be attacked by kinetic means: 'The goal of this would be cyber-locating them, determining their physical locations, and possibly, in some settings, arresting or extraditing them' (ibid.: 138). While we might not see a 'cyberwar,' we might see a kinetic war or military action on hackers or on facilities used for what are viewed as nefarious digital activities. Chris Krebs, the former head of the US Cybersecurity and Infrastructure Security Agency, made a similar point in 2021, pointing to tactics such as doxing hackers (publishing their private details) and other more aggressive measures to counter the rise in hackers holding organisations to ransom by encrypting their data (Stacey and Murphy 2021). It remains unclear how far these more aggressive tactics could go. But after the Israel Defense Forces (IDF) blocked a cyberattack believed to be orchestrated by Hamas cyberspecialists, an airstrike was targeted at the building in Gaza where the hackers were supposedly based, partially destroying the building in what is viewed as the first real-time military response to a cyberattack (Hay Newman 2019). Given the centrality of connectivity to life in the twenty-first century it is not that far-fetched to see kinetic responses to hackers and hacker states. But it seems more likely that creative sub-threshold tactics will be used to disrupt and deter hacker states—and this will be an important element in the liberal way of future warfare. And while we might not see a cyberwar, we might see kinetic war on hackers, war on cyber, attacks on everything that makes digital exploits possible: so not cyberwar but war on cyber. It might also be the case that a vital concern in the liberal way of future warfare will be defending the space-based infrastructure that makes societies of connectivity and intelligence/surveillance possible on the earth below—and developing the techniques for space war that will sabotage war on earth (Marshall 2023). The search will be on to develop the creativity and technical ability to protect communication and connectivity—and damage the connectivity of others. And to prepare militaries for a time when all connectivity and communication disappears. But the account of cyber provided here will undoubtedly reflect the problem that writers like William Gibson suggest characterises most science fiction: that seemingly futuristic depictions of the future—and future social, political and economic problems—generally reflect contemporary anxieties and debates.

It might well be the case that we are caught in a planetary fog of future war, a fog that covers the globe and in which we cannot see where we are or where we are heading, or what dangers are growing around us like digital Xenomorphs, cultivated and experimented on in pods in corporate and military laboratories.

Concluding Remarks: Beyond Cyber

Cyber might point to a disorderly and chaotic age of impure war where a variety of actors are hacking, sabotaging, subverting, disrupting each other. The stakes in this time of impure war might be fairly low and will be managed by improved technologies and practices of deterrence, policing cybersecurity. It might also be the case that new techniques ‘improve’ war, using cyber to help minimise harm and damage in war and international conflict through creative techniques and tactics. Writing about international conflict and options ‘short of war,’ Joe Miller, Monte Erfourth, Jeremiah Monk and Ryan Oliver comment: unorthodox options should provide decision-​ makers with opportunities to achieve objectives proactively—​seeking decision relative to a limited set of objectives in conditions short of war. These unorthodox options will necessarily be interest-​driven, housed within a strategy to establish desired conditions. Efforts should focus primarily on generating effects through non-​ kinetic methods, aiming at targets in the human domain, cyberspace, the information environment, and other non-​physical arenas. In the information age, these slings and stones should strive to change population’s minds and behaviour rather than to convert the living to the dead, to generate deception and miscalculation rather than mass destruction, to darken a city rather than to raze it. Precision kinetic strikes may be necessary on occasion but will generally be less desirable, given heightened associated risk of escalation and attribution, irreversibility, and perception implications. The emergence, cultivation, and exploitation of opportunities should drive employment of these unorthodox options, used to advance goals within the limits of a broader interagency campaign—​either in support of civilian counterparts or as independent operations. (Miller, Erfourth, Monk and Oliver 2019) The commentators write in broad terms—​possibly unwilling to speculate on what technologies and tactics might be possible in the future. Their vision of future conflict rests on a distinction between the informational (to change a population’s mind) and the infrastructural (to darken a city rather than raze it). On the one hand, this comment points to protopian possibilities in the liberal way of future warfare and conflict resolution where cyber might produce non-​lethal solutions to resolve conflicts. But we should not lose sight of the darker ‘necropolitical’ possibilities of these non-​kinetic methods.

As Eyal Weizman (2011) illustrates in The Least of All Possible Evils: Humanitarian Violence from Arendt to Gaza, the exploration of ‘humane’ solutions in war can produce less visible forms of violence and control in the immediate time and space of an action or exploit. Simply put, the possibility of being able to darken a city might mask the various forms of suffering that might emerge from the supposedly non-​lethal possibilities—​possibilities that are studied, as Weizman illustrates, by liberal states in the calculations of minimal human survival (Weizman 2011: 13): new tactics of informational manipulation or infrastructural sabotage might lead to human and strategic consequences that might produce less ‘kinetic’ forms of suffering in war and unintended consequences not envisaged by the planners (and possibly produce new forms of human suffering in times of connectivity and digital life). Even if couched in the terms of humanitarianism and the liberal way of war, the search for creative non-​lethal solutions to conflict and war might reflect the ‘intelligentisation’ of war that might develop in ways that create innovative forms of control and dehumanisation beyond anything we can currently imagine, exploring new possibilities in biology, technology, neuroscience, information war, psychology and cyberwarfare. But while the technologies may or may not evolve in ways beyond anything we can currently imagine, the creative exploitation of the informational and the infrastructural looks likely to remain fundamental to the age of impure war where a variety of actors are involved in constant sub-​threshold activity. When conflict rises above the threshold into war between states or war with non-​state actors then cyber will be an element in the mosaic of warfare, a mix of the kinetic and non-​ kinetic. The significance of cyber in the mosaic will depend on how prepared states and societies are to deal with manipulation or attacks on infrastructure; it will also depend on what new creative possibilities for damage and manipulation emerge from technological development in coming decades. But regardless of the tactics or the technologies being used, deception, sabotage, espionage, subversion and signalling will be vital to the liberal way of future war. In societies and militaries heavily dependent on information and communication, whatever cyber mutates into is likely to remain a vital component in future war. At the same time, protecting the infrastructures of the digital age and the resources needed for the societies heavily dependent on emerging technologies will remain a key objective of the future war machine. How infrastructure is defended and resources are protected might involve weaponry that depends on digital technologies for its precision and lethality—​but will be far more kinetic in its capability and impact. Terrorist networks will likely continue to produce impure war that seeks to find vulnerabilities in the infrastructures of liberal societies; liberal societies will explore the possibilities of this experimental laboratory of impure war (aiming to avoid the trap of being caught in endless wars and territorial missions and occupations). Emerging powers will seek to explore the possibilities of impure war, the possibility of actions that will remain sub-​threshold—​or, in moments of crisis,

exploit unimagined vulnerabilities in the infrastructures of liberal states (although aiming for attacks that use the 'kinder weapons' referred to by the authors of Unrestricted Warfare); there will be attempts to disrupt systems of command and control through a variety of techniques that overwhelm the decision-making ability of a state and military. The liberal way of warfare will be focused on defending critical infrastructures and overwhelming an enemy through the sheer scale of complexity in mosaic war. The desire of all sides will be to explore the emerging possibilities of impure war before we reach the levels of destruction of pure war. But there will obviously be the possibility of accidents and miscalculations: Putin probably thought he was embarking on an impure war through his 'special military operation'—before his forces became caught in what looked like a pure war from previous centuries (and he had to deal with the impure war waged against Russia through the techniques described by Gerasimov on 'contactless' actions at a distance and all the actions of a concealed and not so concealed character).

Bibliography

Arquilla, John. 2012. 'Cyberwar Is Already upon Us,' Foreign Policy, 27 February: https://foreignpolicy.com/2012/02/27/cyberwar-is-already-upon-us/.
———. 2021. Bitskrieg: The New Challenge of Cyberwarfare (Cambridge: Polity).
Battersby, Stephen. 2021. 'The Dawn of the Quantum Internet,' New Scientist, Volume 250, Issue 3336, 29 May, 36–40.
BBC News. 2019. 'Lancaster University Cyber Attack Suspect Arrested,' 24 July: www.bbc.co.uk/news/uk-england-lancashire-49081056
Blum, Andrew. 2013. Tubes: Behind the Scenes at the Internet (London: Penguin).
Buchanan, Ben. 2020. The Hacker and the State: Cyber Attacks and the New Normal of Geopolitics (Cambridge, Mass.: Harvard University Press).
Busher, Joel, Malkki, Leena, and Marsden, Sarah (eds). 2023. The Routledge Handbook on Countering Radicalisation (London: Routledge).
Fleming, Ian. 2022. 'Risico,' in From A View To A Kill (London: The Folio Society).
Follis, Luca and Fish, Adam. 2020. Hacker States (Cambridge, Mass.: MIT Press).
Galeotti, Mark. 2022. The Weaponisation of Everything: A Field Guide to the New Way of War (New Haven: Yale University Press).
Greenberg, Andy. 2019. Sandworm: A New Era of Cyberwar and the Hunt for the Kremlin's Most Dangerous Hackers (New York: Random House).
Hammes, T.X. 2021. 'The Tactical Defense Becomes Dominant Again,' Joint Force Quarterly, 14 October: www.960cyber.afrc.af.mil/News/Article-Display/Article/2810962/the-tactical-defense-becomes-dominant-again/
Harwell, Drew. 2022. 'Instead of Consumer Software, Ukraine's Tech Workers Build Apps of War,' The Washington Post, 24 March: www.washingtonpost.com/technology/2022/03/24/ukraine-war-apps-russian-invasion/
Hay Newman, Lily. 2019. 'What Israel's Strike on Hamas Hackers Means for Cyberwar,' Wired, 8 May: www.wired.com/story/israel-hamas-cyberattack-air-strike-cyberwar/
Kilcullen, David. 2020. The Dragons and the Snakes: How the Rest Learned to Fight the West (London: Hurst).


Kurth Cronin, 2020. Audrey. Power to the People: How Open Technological Innovation Is Arming Tomorrow’s Terrorists (Oxford: Oxford University Press). Latiff, Robert H. 2017. Future War: Preparing for the New Global Battlefield (London: Vintage). Littler, Mark and Lee, Benjamin (eds). 2020. Digital Extremisms: Readings in Violence, Radicalisation and Extremism in Online Space (London: Palgrave). Marshall, Tim. 2023. The Future of Geography: How Power and Politics in Space Will Change Our World (London: Elliott and Thompson). McGray, Douglas. 2003. Interview with Andrew Marshall. www.wired.com/​2003/​02/​ marsh​all/​ Miller, Joe, Erfourth, Monte, Monk, Jeremiah, and Ryan Oliver. 2019. ‘Harnessing David and Goliath: Orthodoxy, Asymmetry, and Competition,’ Small Wars Journal, 7 February: https://​small​wars​jour​nal.com/​jrnl/​art/​har​ness​ing-​david-​and-​goli​ath-​orthod​oxy​asymme​try-​and-​comp​etit​ion Mozur, Paul. 2018. ‘A Genocide Incited on Facebook, With Posts from Myanmar’s Military,’ 15 October: www.nyti​mes.com/​2018/​10/​15/​tec​hnol​ogy/​myan​mar-​faceb​ook-​ genoc​ide.html Rid, Thomas. 2012. ‘Think Again: Cyberwar,’ Foreign Policy, 27 February: https://​foreig​ npol​icy.com/​2012/​02/​27/​think-​again-​cyber​war/​ —​—​—​. 2013. Cyber War Will Not Take Place (Oxford: Oxford University Press). Rucker, Rudy. 2021. ‘Juicy Ghosts,’ Kickstarter: www.kick​star​ter.com/​proje​cts/​rud​yruc​ker/​ juicy-​gho​sts Srivastava, Mehul. 2022. ‘Prospect of Russian Cyber War May Have Been ‘Overhyped’, Says UK Spy Chief,’ Financial Times, 10 May: www.ft.com/​cont​ent/​d5657​df5-​a962-​ 4acf-​b0bd-​b892c​6b15​361 Stacey, Kiran and Murphy, Hannah. 2021. ‘Former US Cyber Chief Calls for Military to Attack Hackers,’ Financial Times, 5 February: www.ft.com/​cont​ent/​27c09​769-​ceb5-​ 46dd-​824f-​40b68​4d68​1ae Thompson, Barney. 2016. ‘Hatton Garden Gang a Throwback to Britain’s Criminal Past,’ Financial Times, 15 January: www.ft.com/​cont​ent/​0166e​b04-​bba6-​11e5-​b151-​8e15c​ 9a02​9fb Weizman, Eyal. 2011. The Least of All Possible Evils: Humanitarian Violence from Arendt to Gaza (London: Verso). Zetter, Kim. 2016. ‘That’s Insane, $81M Bangladesh Bank Heist? Here’s What We Know,’ Wired, 17 May: www.wired.com/​2016/​05/​ins​ane-​81m-​ban​glad​esh-​ bank-​heist-​heres-​know/​

6 THE GRANULAR 1: The Changing Scale in Conflict

In the twenty-first century it could be argued that military technology is not simply about the speed and destructive capability of a weapon or tactic but about the precision of the technology. As Hartmut Rosa observes: 'The aim of technological development in the military today is not the faster destruction of the enemy, but the more precise neutralization of its infrastructure and its power of resistance' (Rosa 2015: 202). New technologies and tactics enable organisations and individuals to do increasingly 'detailed' and 'impactful' things at a distance, and this change in scale is possibly transforming the character of war and security. If, for example, there is a war between China and the United States there might be an effort by all sides to contain the fighting to the careful and precise 'neutralisation' of important (but not life-endangering) elements of infrastructure—or to demonstrate what could be possible if the conflict were to escalate. The terrorist will target the infrastructures of everyday life; states will attempt to target the infrastructures that make everyday life possible (from the sabotage of undersea cables to attacks on the systems that make satellites function); the 'hierarchy' of military–technical capability in international politics is a ranking of who has the creativity and ability to sabotage or destroy critical infrastructures vital to all aspects of war and peace. The following two chapters set out to explore the change in scale in contemporary and future war, suggesting that we might be seeing a trend that Christopher Coker refers to in terms of war being 'down-sized': 'Now, if war is getting smaller, and will continue to do so, this development has been long in the making. Nothing is really "new"; often, it just hasn't been noticed' (Coker 2015: 108). These chapters attempt to unpack what war getting 'smaller' (or more granular) might mean for conflict and international politics; I want to think through the possibilities of what we might term 'granular war' in the context of some very ungranular problems.
DOI: 10.4324/9781003219576-8


War, Modernity and the Changes in Scale

Modernity was transformed by the powerful new machines, infrastructures and weapons that emerged from the industrial age: trains and rail networks, cars and roads, planes and aircraft carriers, tanks, bombers, nuclear weapons and satellites. These technologies all transformed the destructive potential of a lethal modernity. It is now possible to wage industrial 'war at a distance,' war that can be deadlier and faster than anything ever witnessed in human history, war that depends on infrastructures that are undersea, across all territories made increasingly transparent to the 'military gaze' by sensors and cameras, in 'cyberspace' and through the satellites that surround us in space. A nation like the United States could take part in (and transform) a world war, orchestrating its worldwide military logistics, the movement of its industrial war machine; it could unleash the destructive potential of the atomic age in Japan—while the domestic territory could function at a relatively safe distance from the wars overseas. Of course, for much of the world the age of this industrial war brought violence and destruction closer to home, in a manner and on a scale that was unprecedented in its lethality and brutality in the 'homelands' of modernity and progress: the inhabitants of cities across Europe and Japan would experience the ferocity and destruction of this industrial 'war from above.' Indeed, the horrific experience (and images) of war from above in Japan and Europe, it could be argued, continues to underpin the logic of deterrence: the possibilities of industrial war for citizens in advanced, developed states had been revealed—and only a 'suicidal state' would want to risk embarking on a path to a twenty-first century version of the previous century. We can see the 'scars' of industrial war from above across our architectural landscapes in the cities of Europe; we have seen the images of destruction in the Blitz, Dresden or Hiroshima. To be sure, 'traditional' war returned to Europe in 2022, but it could be argued that it was a war that (at the time of writing) played out in territory between 'great powers' and where all sides were trying not to escalate the fighting to the point of direct interstate war: the desire was to keep the conflict sub-threshold and in what might have been viewed as an uncertain, liminal necropolitical space of international politics, a state between NATO and Russia. The twentieth century was shaped by the industrial innovations of modernity; these innovations transformed the world and transformed the possibilities for world war between states: large armies could be moved around the planet with their tanks and weapons through the complex routes of modern logistics; large-scale destruction could be directed at the cities and their inhabitants; human beings could kill and control other people in factory-like camps. This is the age of what Zygmunt Bauman (2000) would describe as 'heavy' or 'solid modernity,' the age of powerful industrial 'mega-machines' and states that could control and destroy in unprecedented ways. But modern states could also transform (and possibly improve) populations in unprecedented ways, providing new types of


safety, protection and education/training through the rise of post-war welfare states. These states would aim to protect citizens, citizens that had been vital to the war effort, beginning to distribute the benefits of the revolutions in industry and business (MacMillan 2021); the necropolitics of world wars drove the call for new politics and values where life (for some) would be valued differently. At the same time, post-war states in Europe and North America had to maintain order and security domestically and internationally in a Cold War of divided geopolitics where a 'spectre' was haunting the liberal capitalist world. For critics of the liberal ways of war, the foreign policy of the United States became as much about violent policing and control of 'world order' (driven both by hubris and paranoia—and a desire for resources) as it was about the creation of the 'rules-based international order' (Thompson 2022). But from the 1970s onwards the global economy undergoes new types of transformation as states and business deal with economic crises around the planet alongside the dawn of the computer age and the revolution in information and communication technologies. We begin to move to what Bauman describes as liquid modernity, where new technologies enable corporations to do things differently (more 'flexibly' or 'nimbly') and to operate globally, to harness the benefits of outsourcing more and more activities in order to become more competitive and profitable, to create a fast, globally interconnected marketplace—and a workforce that is more flexible (and often insecure): one's business horizons are no longer limited by the borders of one's 'host' state (with its unions left over from solid modernity). In liquid modernity, the problem of defence (which has been transformed by weapons of mass destruction and the apocalyptic possibilities of nuclear war) is matched by the problems of security and impure war: the security problems that are seen to emerge from an increasingly connected world where everything can move (people, diseases, information, businesses) in a way unimaginable during the Cold War; and where we become aware of the threats in what Virilio described as an environment of fear, a paranoid 'democracy of emotion' where our connectivity creates a 'synchronisation of affects' (Virilio 2008). In other words, the incredible connectivity of the digital age or the information age could produce citizens in liberal democracies who felt more insecure, angry and vulnerable—especially after the events of 9/11. This connectivity, as Gerasimov noted, could be exploited by all states in the new approach to war and international conflict. This global transformation intensifies through the broader geopolitical re-orderings after the fall of the Soviet Union and the emergence and consolidation of what becomes known as neoliberalism—an approach to economic and political life that celebrates deregulation and the financial possibilities of a new planetary (dis)order of speed, money and information technologies: a borderless planet where profits can be made unlike anything seen before, where everything we thought might be permanent is transformed by the new potentials of business and finance—and where the West is no longer the 'centre' or 'core'—all regions around the world are transformed, disrupting what many saw as the natural order of geopolitical


things. What emerges is a world where China quickly embarks on a transition to a capitalist economy that drives its rise to superpower status in a multipolar world; a world where the once safe and secure professional classes in the West face the threat from artificial intelligence (AI) and machine learning and where all classes confront the 'rise of the robots.' On the one hand, there is the ideology of a 'shrinking' state that undoes all the structures of 'solid' modernity in the drive for 'efficiency'; on the other hand, new technologies create the possibility for 'shrinking' organisations that can outsource (and expand territorially) more and more activities to machines and foreign territories. The technological innovations transform the territorial possibilities of the global economy—and a new logistics of production and consumption—but we also see the emergence of new forms of consumerism via our smartphones and 'tablets' in an age of Amazon, Uber, Spotify and Google. The new technologies—and our ways of using these technologies—create new forms of surveillance and techniques to collect and evaluate the data that is produced by aspects of our lives that are increasingly transparent. Indeed, Shoshana Zuboff (2019) describes our age as one of 'surveillance capitalism' where our habits become material to be used in new ways by corporations and governments at the same time as our habits (our way of being in the world) are transformed by these new gadgets and tech companies. The age of liquid modernity and neoliberalism generates ideological anxiety about government spending and—while military spending remains one of the areas generally off limits to neoliberal desires—there is a concern with how resources should be used and what should be prioritised. What type of military do we need in a time of 'small wars' and 'big data' (to use the title of one book on the impact of the information 'revolution' in modern conflict) (Berman, Felter and Shapiro 2018)? In many discussions on the future of war, there is an anxiety that our 'legacy' ways of approaching warfare and security might be insufficient in a time that is generating new forms of vulnerability and insecurity in the fast-changing and connected worlds of liquid modernity. At the same time, there is an anxiety that the move to smaller traditional military structures and larger digital or robotic platforms is risky in an age that might see the return of traditional 'old school' wars across the planet. This debate plays out differently across the 'liberal order.' For example, the Financial Times notes that French attempts in 2023 to boost spending on defence after a decade of cuts confront a time of inflation and the rising costs of high-tech kit:

The upshot is that France will not end up with a step-change in capabilities from the new money. It will continue to have what critics call 'a bonsai army'—a reference to the Japanese art of cultivating miniature trees. France would have the basic range of capabilities of the more powerful US army, but just on a tiny scale.
(Abboud 2023)


So, on the one hand there is a shift generated by new technological possibilities; on the other hand, there is a shift towards the creation of a 'bonsai army' driven by the economic conditions of liberal states. When it comes to futuristic speculations on the 'size' and 'shape' of future war we confront scenarios filled with micro-drones, the ever smaller drone or robotic technology that will create new 'granular' possibilities for surveillance and war: there will undoubtedly be granular possibilities that will take us—as we see in debates over future nanotechnology—into the realm of science fiction, into unknown terrains of technology and war. The Marvel superhero movie Ant-Man (2015) explores the possibilities emerging from a Futures Lab creating what would be the 'ultimate secret weapon,' the YellowJacket, a 'soldier the size of an insect,' an 'all-purpose weapon of war' that would give a state the 'ultimate competitive advantage,' heralding the 'end of warfare as we know it.' Over futuristic images of different battlespaces in the marketing video for the YellowJacket, we are informed that 'we live in an era in which the weapons we use to protect ourselves are undermined by constant surveillance'; the YellowJacket will be able to manage conflict on the geopolitical landscape through new forms of surveillance and industrial sabotage, an army that will create a 'sustainable environment of well-being around the world.' The tool, in other words, will be ideal both for activity in the 'grey zone' and for more 'traditional' types of war and conflict. After the marketing film is shown, someone expresses a concern about how an enemy would use this new technology against us. Ant-Man points to extreme possibilities in the change in scale that might be emerging, the extremes of granular fantasising. But what is not science fiction is the concept/practice of the 'kill box,' the three-dimensional zone that has been defined as a target area where everyone inside the box is a legitimate target for a drone strike (or whatever technology is being used). With the concept of the kill box, we see a stark example of the current state of what we might term granular war and an indication of where war might be heading: not chaotic fields of battle but tightly demarcated zones that can be produced across the planet, spaces of destruction that can be created from a distance. 'Depending on the contingencies of the moment,' Gregoire Chamayou declares, 'temporary lethal microcubes could be opened up anywhere in the world if an individual who qualifies as a legitimate target has been located there' (Chamayou 2015: 56). In the more futuristic visions of future warfare, the kill box might shrink in such a manner that those close to the box—but not inside it—will be safe from harm in a strike. For example, the United States has developed a missile sometimes referred to as the Ninja bomb for 'pinpoint' airstrikes, a missile that doesn't explode but plunges through the tops of cars and buildings and uses blades that shred the target while leaving civilians nearby untouched. There is, of course, a concern about the level of intelligence needed for it not to kill innocents—the granular detail to ensure that a child hasn't entered the room or house of a target, the potential for accidents to occur given the lethality and destructiveness of the weapon; it is for


these reasons apparently that the Ninja bomb has not been used many times (Holland Michel 2019). On the optimistic side of the debate, the future of war will be made up of zones of unprecedented safety and control where, as long as you are a 'good' citizen (with a sound pattern of life), the proliferation of sensors, drones and technologies of surveillance will ensure that you are protected and secured by the 'digital cage' in which you live: all types of risk and insecurity—from falling in the street or having a heart attack at a train station through to a terrorist attack in a crowded station—will be dealt with effectively and speedily by the assemblage of technologies in which you are entangled. When dangerous individuals emerge into our infrastructures of everyday life, they will be quickly dealt with in the kill box (or what may come to be termed the 'non-lethal box'). As Chamayou suggests on the possibility of nano-drones and autonomous robotic insects:

With devices such as these, armed violence could be unleashed in tiny spaces, in microcubes of death. Rather than destroy an entire building in order to eliminate one individual, a miniaturized [weapon] could be sent through a window, and the impact of the resulting explosion could be confined to one room or even one body. Your room or study could become a warzone.
(Chamayou 2015: 56)

For the liberal optimist the shrinking or downsizing of war and force points to the eventual disappearance of violence from the human condition. For others such as Chamayou the 'microcube of death' points to a different type of lethality and necropolitical violence—and one that might generate dangerous unintended consequences and accidents. Virilio begins to think through the 'changes in scale' in security and war in the twenty-first century after the events of 9/11, both in terms of the event itself—the numbers involved in the attack and the global event—and in terms of the response to the attacks in a time when drone strikes and new techniques of domestic and international surveillance become possible (Virilio 2008: 12). Indeed, it could be argued that a flaw in the Global War on Terror was the use of a very ungranular territorial strategy (that was destined to fail) as a response to the networked and 'liquid' nature of the enemy; arguably one of the most effective tactics in the War on Terror was the tightly focused and well organised (and 'efficient') killing of Osama bin Laden, the events depicted in the movie Zero Dark Thirty, a 'granular' operation based on global intelligence gathering, an 'invisible' global operation until forces arrived at the compound in Pakistan: the granularity of impure war. Jolle Demmers and Lauren Gould develop the ideas of Bauman in the context of liquid warfare in an article where they examine the 'hunt' for Joseph Kony by AFRICOM (US Africa Command), exploring what it reveals about the changing


nature of war, suggesting that liquid warfare (or what Virilio would describe as impure war) is becoming the norm:

The Western state-led turn to remote forms of military intervention as recently deployed in the Middle East and across Africa is often explained as resulting from risk aversion (avoidance of ground combat), materiality ('the force of matter') or the adoption of a networked operational logic by major military powers, mimicking the 'hit-and-run' tactics of their enemies. Although recognizing the mobilizing capacities of these phenomena, we argue that the military interventionism is prompted by a more fundamental transformation, grounded in the spatial and temporal reconfiguration of war. We see a resort to 'liquid warfare' as a form of military interventionism that shuns direct control of territory and populations and its cumbersome order-building and order-maintaining responsibilities, focusing instead on 'shaping' the international security environment through remote technology, flexible operations and military-to-military partnerships.
(Demmers and Gould 2018: 364)

In other words, there is a change in scale (although not necessarily in terms of lethality) in the liberal way of war, emerging from the desire to minimise risk through new tactics and technological possibility. Writing in Agile Warrior Quarterly about 'Lessons Learned From Contemporary Theatres,' Abigail Watson and Emily Knowles discuss the future challenges for the British Army in terms of the emergence of 'light-footprint' operations and 'remote warfare' that involve a 'heavy emphasis' on working with and through local and regional allies. The researchers suggest that, given political risk aversion, financial constraints and enhanced public scrutiny over UK warfighting, the light-footprint/remote warfare operation is likely to dominate future military engagement (Knowles and Watson 2018). But writing about the experience of Afghanistan and Somalia, Watson and Knowles conclude that 'remote warfare really struggles to deliver when expectations move from destroying or degrading a terrorist threat towards setting the conditions for lasting stability' (Knowles and Watson 2018). More broadly, it might be the case that an overly remote and light-footprint approach to the liberal way of war creates territories shaped/degraded by a dangerous mix of external private security companies, foreign states and authoritarian regimes that operate in (and work to produce) necropolitical zones of terror, exploitation and humanitarian disaster (as some would say occurred through the activity of the Wagner group in Syria); zones open to necropolitical exploitation by a toxic mix of actors whose numbers might increase in a world of climate emergencies. But remote or light-footprint war/intervention does not necessarily mean a withdrawal from conflicts and disorder around the world; the remote and light-footprint aspect might be an element in broader tactics of 'mosaic warfare' (an approach that might be


increasingly 'impure' given the use of other political, economic and non-lethal tools, tactics and pressures in attempts to shape and influence conflicts or emerging necropolitical conditions). Or it might be the case that remote and light-footprint operations involving increasingly granular tactics are the risks that the liberal world can take in a challenging multipolar world where building what Kilcullen describes as 'societal resilience' domestically (and on the borders of liberal states) is the primary security objective. Liberal ambition to transform the chaos and suffering in the world might be displaced by moral indifference to distant suffering (and a focus on building societal resilience through new sciences of protection). At the same time, for Watson and Knowles, it might be in these complex environments that geopolitical tensions and competition play out:

Notwithstanding increasing agitation about a rising near-peer or Russian threat to UK security, adversaries continue to have a strong strategic interest in confronting our armed forces off the open battlefield. It does not seem unreasonable to suggest that UK forces may be more likely to find confrontation with Russia in Syria than in the Baltics.
(Ibid.)

As we have seen in Ukraine, the liberal response will be to keep things sub-threshold—and if forces are confronting the enemy, it is in an indirect way (training, supplies of weaponry) or 'invisible' manner (the possible use of special forces). Events in the disaster zones of international politics might be the sites of intense technological, political, diplomatic and tactical complexity. The point being made here is that the liberal way of future warfare will be an impure war of 'remote' and 'light-footprint' operations involving increasingly (or decreasingly) granular, and possibly 'invisible,' tactics and technologies. So how might these global challenges and events in megacities play out in our granular times?

The Future Megacity Wars in (Un)Granular Times

Future urban warfare is often presented as the international 'black hole' into which states and militaries will inevitably be drawn, the last battlespace that resembles 'modern' (or even pre-modern) war. Warfare in cities is the battlespace that continues to generate the horrific images of war in its most brutal and destructive form, the space where even the most powerful can lose control—and where the liberal way of war might be pushed to its moral limits. It is the type of warfare that can be turned into exciting and visceral video games: Six Days in Fallujah, for example, is a first-person shooter video game released in 2021 that enables the player to experience the Second Battle of Fallujah, fought over six days in November 2004 in Iraq; and it is the type of conflict that can result in dramatic and terrifying movies such as Black Hawk Down. On the surface, there is not much invisible, digital or grey about urban warfare.


There is a sense of unease about what future urban war will look like for liberal states, states that are concerned about the risks for all sides in congested and complex urban environments. Of course, the future of urban war might not be shaped by liberal states in any significant way but will be the terrain of states that have less concern with the moral and tactical dangers of urban war; the concerns that liberal states have about urban war will be fundamentally different to the concerns that Russia will have about leaving obliterated cities and humanitarian disasters across a state. Indeed, one of the points I want to explore here is whether military interventions in megacities will even be possible for liberal states—and what they might look like in terms of granularity and the changes in scale in impure war and global conflict. In the strategic imagination, war in megacities looms large as an inevitable future problem at the same time as there is a sense of unease about what will actually be realistic and possible in coming years and decades: what I want to suggest is that when we delve into this literature we begin to see an increasing interest in the changing scales in conflict, in the possible shrinking of conflict—not the disappearance of urban warfare but the transformation of the liberal way of urban warfare, with both necropolitical and protopian possibilities. One widely cited (and controversial) attempt to outline the challenges of urban war is Richard J. Norton's 'Feral Cities' (2003), where the author begins by inviting us to imagine

a sprawling urban environment that is now a collection of blighted buildings, an immense petri dish of both ancient and new diseases, a territory where the rule of law has long been replaced by near anarchy in which the only security available is that which is attained by brute power.
(Norton 2003: 97)

But these failed cities are still 'connected' and 'networked' cities, although the details and implications of the technological capacity and capability are left vague and undeveloped: 'It would possess at least a modicum of commercial linkages, and some of its inhabitants would have access to the world's most modern communication and computing technologies' (ibid.). The idea of 'some' of the inhabitants having access to communication and computing technologies might reflect the world in the early 2000s; it does not seem to reflect the world of the 2020s, let alone the 2030s or 2040s, however rich or poor you are; and it might not simply be a case of all inhabitants of these cities having access to 'modern' communications and computing—they might be using the technology in clever and creative 'impure' ways. Norton's dystopian point is that—in coming decades possibly shaped by climate change, intensifying inequality and proliferating networks of terror and crime—we might see cities where there are no traditional structures of authority and power and in which police and military find it increasingly difficult to operate; or urban environments where the sources of law and (dis)order are foreign private security


companies like the Wagner group. These geopolitical 'no go zones' might be too messy for liberal states to consider intervening in, with the costs of intervention too high for all involved—especially when interventions might hold the possibility of confrontation with private military security groups. But these will be cities where the human costs of insecurity and suffering will be hard to ignore for liberal states—or where we might intervene for a range of reasons justified in terms of the protection of international order and security. At the same time, our interventions might be driven by great power competition and issues of resource scarcity (Kaplan, Gray and Thompson 2023). The 'feral city' is defined as a city with a population of more than a million people 'in a state the government of which has lost the ability to maintain the rule of law within the city's boundaries yet remains a functioning actor in the international system' (Norton 2003: 98). In the feral city, social services barely exist and there is limited health, security and education: the key actors that maintain control are criminal or terror networks that operate across the urban environment, with possibly multiple and rival forms of power and authority. In addition, the necropolitical violence and disorder (with indifference to suffering and death framed by the perception of the value of human life in racial or class terms) of the feral city would have a 'magnetic effect' for terrorist and criminal organisations, providing safe havens for individuals orchestrating global networks of crime and terror, unruly spaces with sufficient infrastructure from which global operations can be managed; and where foreign states and private security companies will see opportunities for profitable missions. 'The vast size of a feral city,' Norton suggests, 'with its buildings, other structures, and subterranean spaces, would offer nearly perfect protection from overhead sensors, whether satellites or unmanned aerial vehicles' (Norton 2003: 99). The messy urban space would provide opportunities for those who need to disappear and hide from the global surveillance infrastructures in cyberspace and in the drone and satellite-filled skies (or from micro-drones that might operate across the built environment). This focus on the vast size of the city allowing individuals to hide might be overstated; that very size might make hiding a risk for an individual, given the numbers of people in the city who might be able to pass on information about the location of an individual or group (for a reward). Indeed, Osama bin Laden might have felt more secure in a compound in Abbottabad than hiding in a city (the risk of being located in a space where he could be killed or captured weighed up against the risk of being seen in a dense and congested zone of a megacity). In some cases, the city might make it difficult to hide, but in other cases the urban environment might create new and creative opportunities to hide activities or disappear from view, a complexity of urban environments that can continue to generate problems for states and militaries. For example, the Israeli strategy of 'mowing the lawn'—a policy where Israel 'delivered warnings' to Hamas with 'limited, strategic air strikes every time an escalation loomed'—has not deterred the use of long-range rockets. Hamas had been using a network of tunnels—the


Gaza Metro—to move fighters and weapons around, tunnels that might have been destroyed in the bombardment in May 2021 (Srivastava, Kerr and England 2021). The city can provide a variety of infrastructures for local and global creative destruction. For Norton, the emergence of the megacity is a vital issue of global governance, political economy and public policy, but his key issue is not broader strategies of prevention but the military options once cities have gone feral and descended into dangerous disorder. And Norton is uncertain whether we will be prepared for the complex challenges posed by chaotic megacities:

It is questionable whether the tools, resources, and strategies that would be required to deal with these threats exist at present. But given the indications of the imminent emergence of feral cities, it is time to begin creating the means.
(Norton 2003: 99)

In this sense, Norton is writing (in 2003) a case for 'resources'; and it is unclear whether there are the resources now (in the 2020s), resources in terms of both thinking/planning and technological/organisational innovation (or even thinking about what the 'tools' for future urban war would look like, what the 'enemy' might be capable of in 2030 or 2040). Indeed, in the 2020s it is unclear if we are moving beyond the broad and general declarations that war in cities will be one of the most difficult of future challenges. In The Future of War: A History, Lawrence Freedman (2017) provides a fascinating outline (informative, but possibly not adding much beyond what we find in Norton's essay) of the changing character of crime, terrorism and urban conflict and the potential impact of climate change on cities around the world. In Margaret MacMillan's War: How Conflict Shaped Us, the conclusion of the book tells us that urban warfare is 'expanding and challenging the capacity of the military' (MacMillan 2021: 286). But we do not get a sense of how it is challenging militaries and what the possible trends are that will shape the future of war in urban environments, beyond the fact that many countries are investing 'time and money in the problem' (ibid.). In Urban Warfare in the Twenty-First Century, Anthony King, however, does suggest the possibility of three urban 'Armageddons' that could be on the horizon: war in a megacity, automated war in a smart city or the return of mass air attack with conventional or nuclear weapons (King 2021: 214). Of course, it is sensible to avoid the 'mug's game' of speculating on the future of urban warfare, and there is possibly (a sensible) unease in these writings about stepping into an area that quickly begins to feel like the stuff of science fiction (especially if we are still thinking out to 2049). Or the reluctance might stem from the realisation that thinking through the problems of urban warfare might open up possibilities of terrains and tactics that are so different that they might appear absurd (especially from 'legacy' thinking and approaches). At root, the challenge of thinking about the future of urban warfare involves thinking about


the interaction of three wildly unpredictable elements. First, what will liberal states be able to do with emerging technology and tactics—and what social and political pressures might influence and shape the use of these tactics and technologies? Second, what difficulties will opponents in urban environments be able to cause a liberal state in a time when technological change might open up radical, disruptive (and possibly currently unimaginable) possibilities? Third, what will urban environments look like in futures of technological transformation (the issue of smart cities filled and shaped by new technologies), environmental change (climate change, issues of health and disease) and social change (economic problems, inequality, demographic trends)? Cities might be increasingly protopian spaces, improved by technological innovation—or we might see the emergence of necropolitical cities that are like something from Blade Runner 2049, shaped by environmental disaster and radical technological change. Or we might see an extreme divergence in the different types of city and urban environments militaries will have to operate in—from the sleek, 'smart' cities in Minority Report to the urban disorder in the Judge Dredd movie Dredd. In the next chapter we will piece together ideas from different sources—generally from outside the university world—that might point to what is being envisioned for the future soldier in these congested zones. To be sure, the anxiety over the feral city might fade as a possibility in a world that undergoes economic growth and technological transformations that bring progress to the entire protopian planet; at the same time, liberal states might confront a world of proliferating humanitarian disasters where a multitude of foreign private security companies are working for a variety of 'ends' (some strategic, some focused on profit), tooled up with technologies beyond anything we can currently imagine (certainly as we approach 2049). But as economic growth and urbanisation transform the planet, it might be the case that cities we might think of in 'feral' terms have feral elements or zones—but also have the most advanced and connected 'smart city' zones—and the smart city zones may well outnumber the feral zones. In a sense, we see this already in cities in states like India where we might have 'hubs' of innovation in the global high-tech economy—with the workers living in highly protected 'gated' communities—but with slums nearby, cities that are a patchwork of secure gated communities and slums. Likewise, cities that we currently think of as the most 'advanced' and 'developed' may have increasingly feral zones, zones unable to adapt to the changing technology and global economy: these zones may play a role in terror and crime as much as the zones outside of the West. It might also be the case that we see temporary 'pop-up' feral zones, blocks or neighbourhoods where, for example, a terrorist network might produce a chaotic zone through the use of explosives, sensors, robotics and drones—and where siege-like situations emerge that are difficult for police and military to control. Simply put, the urban future might be radically different from anything we have experienced before, urban black holes with global tentacles on the cutting edges


of military–technical tactics. Thinkers like Norton are suggesting liberal states need to be preparing for a future of urban disorder located primarily outside of the liberal world, geopolitical black holes that liberal states will get sucked into in a world of climate change, pandemics and terrorism. But this urban future might not involve entire chaotic megacities; it might instead involve an area in a megacity that is diverse and complex, a patchwork of smart city/affluence and slums, granular spaces of urban conflict, spaces of conflict in any city around the world. So, there is a sense of unease about what will be possible in terms of future interventions in cities. The problems of the city might pose ethical and political problems for the liberal way of warfare if tactics emerge that might be 'effective' but that will be unacceptable for many citizens (unless the problem is viewed as part of an existential threat to the state). For example, the Israeli tactic of 'mowing the lawn' in its attempt to destroy the capacity of Hamas might not be acceptable for other states in the coming decades, where war will possibly be experienced in greater granular detail (via social media or whatever forms of connectivity we use in the decades ahead); other states might innovate necropolitical tactics of control and violence involving AI, biological weapons and infrastructural cyber-sabotage. Vast megacities might be the source of the most urgent security problem—the inescapable geopolitical black holes of the international order—of the coming decades. But it might be the case that the only acceptable moves for liberal states will be decreasingly granular operations—outsourced operations, operations that deploy the latest technology of remote warfare or missions that are fast and heist-like: Ocean's Eleven-style operations that avoid the possibility of a future Black Hawk Down. Or they might be invisible operations in an urban grey zone. The Strategic Studies Group suggests that current army doctrine is based on isolating the city, exerting control around the perimeter, enveloping the enemy with traditional forms of offensive manoeuvre. But this might be impossible in megacities:

The fundamental assumptions implicit to these approaches are the ability to isolate and shape the urban environment and to utilize ground approaches from the periphery to the city. For megacities, both of the assumptions are flawed. By virtue of their scale, megacities cannot be physically or virtually isolated. Physically controlling an urban population consisting of tens of millions of people spread over hundreds of square miles with military forces numbering in the tens of thousands not only ignores the force ratios recommended in doctrine but actually inverts them. Virtual isolation is even more improbable given cell phone saturation in urban environments worldwide and global interconnectedness through the World Wide Web. Ground manoeuvre from the periphery is also unrealistic. The congestion of ground avenues of approach, combined with the massive size of the megacity environments, makes even


getting to an objective from the periphery questionable, let alone achieving an operational effect.
(Strategic Studies Group 2014: 8)

The missing element in Norton's speculations—and in most speculations on future urban warfare—is the technological complexity of the future feral city or feral zone (wherever it may be found), a question that is impossible to answer but needs to be considered—or imagined. We know the future city will (most likely) be dense in terms of population and built infrastructure, but how might it be dense and complex in terms of emerging technology? Part of the problem with this language of 'feral' cities and the medicalised images of an 'immense petri dish' is that the framing suggests that the question is how the 'surgeon' can operate—and the underlying implication is of an environment and people that are 'anarchic' and 'primitive.' But it might be the case that future cities outside the liberal world are 'petri dishes' for new technology and tactics. Rather than talk about the 'feral' city it might be more helpful and realistic to talk about the 'unregulated laboratory city,' the city or urban zone where there is poverty, insecurity and crime—but also the use of innovative technologies and tactics, some of which are outlawed in other states, some of which give non-state actors an enhanced military capacity that challenges even the most advanced militaries: the type of urban environment depicted in films such as Johnny Mnemonic (1995), where hackers, doctors and political activists operate from slum areas of the 'sprawl.' It might also be the case that these laboratories will be the sources of new global threats involving cyber or innovations in biology and technology—and so might make intervention unavoidable. But in these laboratories for new technology and tactics, urban warfare/interventions might be impossible (with the exception of the most granular of operations) and other policy solutions will have to be explored to deal with the urban/international challenges of the coming decades. Granular interventions in future cities will require a careful combination of creativity and new machinic possibilities for impure urban war—if mass infrastructural destruction and humanitarian disaster are to be avoided. The liberal way of urban warfare will likely confront a world of increasing and intensifying urban problems with global implications—but in a multipolar world of decreasing possibilities for action in cities congested with actors, technologies and risks. How future urban wars and interventions play out will involve more than the 'tools' we have access to: it will be about what tools and policies liberal states can use without generating a domestic and international backlash in a time when all events might be filmed and circulated. To be sure, the liberal way of urban war might use emerging non-lethal technologies and tactics of impure war (to the point where it no longer resembles war) in a way that proves game-changing. But future urban warfare will also be about the tools—technologies and tactics—the enemy has in a time of disruptive and possibly radical technological change,


times of open technological innovation, as Kurth Cronin puts it. The interplay between liberal states and 'tooled up' urban enemies might produce necropolitical possibilities beyond anything we can imagine. We possibly need to think beyond the science fiction scenarios of movies like Dredd or a Terminator film with robotic machines or deadly troops wreaking destruction; future urban war will likely be more a creative mosaic of warfare that will be shaped by the concerns of liberal societies, using a variety of 'impure' tools, from cyber and drones (of all sizes) to psychological/information war: future urban wars waged by authoritarian states might be completely different. But the liberal way of future war might be focused on the use of strategies to fund, train and supply other forces that are fighting brutal wars with necropolitical tactics and technologies; thinkers like Bauman remind us that liberal societies can be efficient at ignoring the necropolitical worlds they create through their foreign policies or involvement with the arms industries that sell the latest military–technical innovations around the planet. Necropolitics is about the tendency to view certain races or classes in terms of a hierarchy of value that means that their lives (and deaths) can be understood differently (certainly differently from certain races or classes in liberal democracy); the geography of this hierarchy of value will possibly continue to change in the decades ahead. But it seems likely there will remain populations that can be viewed necropolitically, as having a value different from the citizens of liberal democracy. But what does war look like in urban environments in coming decades, in this tense space between granular war and megacities? What is possible between the two extremes of highly destructive science fiction war that resembles a Transformers movie and war/intervention that verges on the invisible, sub-threshold action at a digital or robotic distance or through other tactics of subversion? What does conflict look like in the interplay between liberal societies and an enemy that might be creative and dangerous? Such conflicts could look like megacity wars of small technologies, small units and small battlespaces. I conclude the chapter now with some 'speculative fictions' on future urban conflict that illustrate three possibilities on the potential constraints, opportunities and risks/accidents/unintended consequences of the future liberal way of urban warfare: 'protopian' interventions using non-lethal tools and technology with limited risk to all humans; raids in cities using technology with highly trained special forces backed up by an array of machines, a mix of lethal and non-lethal; humanitarian interventions that are underpinned by a variety of machines that allow action at a distance. All these scenarios illustrate the risks of urban interventions in a connected world, the complex changes in scale and the problems of 'action at a distance.' The scenarios on the possible granular directions in the liberal way of urban warfare are used to try to expand the way we think about an area where thinking—as this chapter has tried to illustrate—can often be rather crude (technologically superior liberal states versus a 'feral' enemy in low-tech slums); hopefully the reader will produce/imagine their own more complex and creative scenarios as they think


about the possible granularity of urban conflict (or to think about scenarios that totally challenge the assumption of 'shrinking' battlespaces).

Three Scenarios on the Granularity of Future Conflict

Protopian Mogadishu 2038

Economies across Africa have been transformed by robotics and AI: it's not simply that there are 'booming' economies in the provision of remote services in highly skilled areas where Africa provides the cheapest (but also highly skilled) labour; there are centres of innovation across the continent that have emerged in music, fashion, art—but also technology. Approaches to banking and the circulation of money that emerged from the particular needs of communities across Africa became the norm in other regions of the world; there were also distinctively African innovations in social media and entertainment (the sounds in clubs in Ghana and Nigeria continue to influence producers around the world). Africa had spaces of innovation in AI and biotechnology that had transformed how the rest of the world sees a continent once viewed in terms of colonial ideas of inferiority. The most innovative and creative solutions to managing economic and societal change in times of anxiety over global ecological crisis emerge from across Africa. There were zones in cities that remained what strategic analysts had once referred to as 'feral': but there are feral zones in all states as solutions are experimented with in response to the problem of inequality that continues to damage individuals and societies. But there were no cities in Africa that could be described as completely 'feral'; the African middle class was transforming the continent (and the world). Terrorism had been virtually eradicated from global life. There remained organisations and movements critical of the American-led or the Chinese-led orders, movements driven by anxieties over the acceleration of AI, robots and innovations in the life sciences. But there were a number of techniques for keeping them under control, techniques that varied in the way physical violence and psychological or informational tactics were deployed and orchestrated. The American approach refined new techniques of pre-emption, deterrence by denial and surveillance that prioritised the use of non-lethal approaches; these tactics had become central to its desire to present a different image to the world from the one that China finds hard to change—an image of authoritarianism and (technologically refined) brutality. Conflicts occur around the world but 'great powers' keep themselves at a safe distance; new generations will not accept their state being involved in killing overseas. The involvement in these civil wars and urban conflicts remains invisible, in the grey zone. There are constantly mutating private security companies operating across the planet, using tactics and technologies of lethal empowerment that pose a constant challenge for the security services of liberal states; there is a mainly invisible impure war being waged by states against


both terrorist networks and private security companies. War becomes more impure. Many argue that this is a return to the normal condition of international relations and the politics of security and conflict. Innovations in AI and soft information war have prevented acts of terror and have deterred people from becoming terrorists in the first place. All states have refined the arts of psychological war—techniques that were used both domestically and internationally: the better angels of our nature could flourish in an age of psychological surveillance and intervention, a revolution in neuroscientific affairs. Legal and ethical debates raged over these various types of granular interventions—and the deepening grid of surveillance and pharmaceutical techniques in which individuals and communities became entangled. But the world is safer than ever. The liberal world was in a constant process of examining these techniques, trying to be as transparent as possible about how these new techniques were used. But one of the last remaining terror networks orchestrates a simultaneous attack in New York, Los Angeles, London, Paris, Tel Aviv, Lagos, Rio, Dhaka, Moscow, Delhi and Beijing. The attacks use a fairly basic (by 2038 standards) assemblage of drones, robots and explosives in a creative way that no one had imagined possible or had been able to detect. Robotic first responders and drone deliveries of medical supplies save many lives. But images of dead civilians from terror attacks are unsettling in the 'World City' where violence has virtually disappeared. What shocks people—and people who can remember 9/11 feel a similar sense of horror—is the way the attacks occurred simultaneously around the world. China has not experienced an attack like this before and there is an intense desire for revenge. Many inside and outside China thought it was not possible for terrorists to orchestrate an event inside this territory of intense hyper-surveillance. The attacks are traced back to a neighbourhood in Mogadishu, an infamous territory in a city that has undergone radical transformation in the past 40 years. The neighbourhood is one of the few around the planet that is yet to be transformed by the UN-led 'smart cities' initiatives due to its serious drug and crime problems: there were still cities that had zones that prevented the police from entering through a combination of sensors, drones and an eclectic array of weaponry, from sonic weapons to the more traditional roadside bombs that could be activated remotely under the gaze of the drones. Even technologies of surveillance were futile in these zones where groups and individuals have refined the art of hiding. Working out how this new global terror network was organised became a messy affair given the tensions between great powers; there was an onslaught of fake news and deep fakes that resulted in a wave of conspiracy theories that by design or default intensified interstate conflict. Some argued that the motive behind the attack was to draw great powers into messy conflicts, an apocalyptic war on terror that will create the space for the rebirth of humanity after the threat to life posed by AI and robotics—and in a time where both China and America were cautious about repeating mistakes from the past. Some saw it as an act of


revenge for assassinations carried out by both China and America or attacks by groups whose anxiety about technological and scientific revolution has intensified the paranoid desire to destroy this hypermodern society. Some argued that the military–industrial complex was behind it, in a drive to instigate war and conflict in a world that might not be a liberal end of history—but was certainly protopian (and protopian in a way that was increasingly affordable in an age of innovation in AI and robotics). While there is reluctance in the militaries of both states to collaborate with the other, China and America quickly forge an alliance to eradicate the last stronghold of terror (leaders on both sides feel they need to project images of strength and control). It is decided that the Chinese and Americans will use a mix of Chinese and American micro-bot technology to swarm the blocks where the network's leaders are based. Swarms of bots of all sizes will emerge from above (dropped by drones), from below (moving through underground sewers) and at ground level, moving under trucks: the first wave disables all weaponry and communications, releasing non-lethal gases that 'knock out' people in the building; seconds later, the second wave of large bots surrounds all members of the terror network with a mix of lethal and non-lethal weaponry. There are simultaneous interventions on social media and all networked devices explaining to civilians in the area what is happening and what to do to stay alive. Once the network is neutralised a team composed primarily of regional forces enters the zone to take the terrorists to a safe location (how the terrorists will be dealt with generates a difficult diplomatic process): humans and robots deliver humanitarian assistance to locals, with the Americans and Chinese clearing up (and collecting) the considerable mess of micro-bots: waste disposal is an essential part of operations as the technologies of conflict become increasingly sophisticated, the eradication of the granular traces of new weaponry. In the months and years after the event the zone is transformed into one of the most fashionable neighbourhoods for the 'hipsters' of Mogadishu. The intervention is seen as the state of the art in urban war. Both the Americans and the Chinese circulate footage taken from various 'vision machines' that show the skill of this new way of robotic humanitarian intervention; these superpowers enjoy these spectacles of humanitarian intervention and policing, the high-tech saviours and protectors of the fragmented international order, the soft power images of non-lethal supremacy. There is a global media campaign to reassure people that the machines had not taken humanity into a dark new dystopian age—and that humans are still making the decisions. But there is a serious concern that the cost of the rising backlash against new technology might outweigh the benefits of the new age of protopian war and policing. Are liberal states becoming the very thing they were paranoid about in the endless dystopian science fiction stories about humanity and the rise of the machines? In governments around the world there are debates about how to present these events—and the aesthetics of technologies that can make people feel like the world is more alienating than it had ever been in
modernity; there are signs of how this sense of technological alienation is producing a new type of anti-tech populist and conspiracy theory-driven political movement. The way the great powers collaborated was unprecedented. But there is an anxiety that both the Chinese and the Americans learned rather too much about the other’s technologies—though in the bigger scheme of things, this type of intervention is viewed as insignificant, a high-tech humanitarian intervention that gave very little away about the state of the art in military technology. But this multipolar world puts the great powers into close proximity in different zones around the world, the increasingly granular spaces where states meet, undersea and in space, in the ‘metaverse’—sometimes involving military personnel but more often through the robots and drones that operate around the planet: robots watching robots. The skill is to know what to interfere with in all the constantly changing terrains and infrastructures on earth, undersea and in space—there is a constant concern that damage or activity in vital infrastructures of security (such as satellite technology vital to deterrence) could set off a dangerous global accident (Marshall 2023: 202): there is constant re-evaluation of potential granular targets—and of the vulnerabilities in infrastructures that could escalate conflict.

Mogadishu Raid 2040: Lethal Empowerment in the Urban Grey Zone

A criminal gang that manufactures some of the most advanced and innovative tools in cyber, biological and robotic crime operates in underground laboratories in a lawless zone in Mogadishu, defended by drones, gang members and a grid of roadside bombs. The zone even offers a service in radical body transformation and enhancement illegal in liberal states. The organisation develops innovative tactics to bribe and disrupt any government or corporation on the planet—governments and businesses live in fear of what can be done to them. The profits are filtering through to property markets around the planet and funding a broader range of illegal activities, including a terrorist network that is seeking to humiliate and weaken liberal societies after decades of increasingly refined and sophisticated counterterrorism tactics and policies. There is anger across the continent at the draconian policies used by a Fortress Europe that has responded to the waves of climate emergency through ruthless border controls and policing. When the 25-year-old son of one of the key players is arrested in London, the network decides it is time to generate a spectacle of humiliation and vulnerability—and to find a way to free him. The network kidnaps a group of 20 mainly North American and European ‘eco-tourists’ on holiday in Ethiopia, one of the most fashionable destinations for the global tourism industry in a time of growing anxiety about last chances to see ‘nature’ before environmental collapse. The tourists are smuggled to Mogadishu—images of the tourists are broadcast around the world by the network. The group demands the release of the prisoner—but they
know this demand is unlikely to be met; they are content to generate a sense of fear and powerlessness among the citizens of the liberal world. The location of the hostages is worked out; they are being held in a compound on the edge of the city. The United Kingdom and United States send an elite squad trained in the arts of urban war and grey zone operations. Drawing on a broad range of intelligence, including drone-collected ‘patterns of life’ analysis of the network built from all the digital traces it has left, they begin an operation to communicate and negotiate directly with members of the gang that they think are susceptible to bribery and deals. They also sow the seeds of mistrust among the group, a group that operates across the city and across the planet: paranoia is easier to manipulate in groups working remotely—the deepfake technology is convincing and militaries and police are becoming masters in the arts of paranoia and deception. These types of tactics generally push liberal societies into areas that make many citizens uncomfortable—but they are used more and more: this world is still an experimental laboratory for what was once described in terms of necropolitical tactics, the continuation of colonial practices and ways of seeing the African ‘other.’ As a sign of goodwill, drones are sent to the homes of gang members to provide money and travel documents. Deals are made and one morning one of the key leaders of the network wakes up in his apartment hidden in the city to find three quarters of the organisation has disappeared. At the same time, the special forces launch a swarm of drones and robots of various sizes into the compound on the edge of the city; many of the guards have left already. The robots and drones move underneath trucks; some drop from the sky, invisible to the guards, some move through the sewers. There are incredibly rich and detailed maps of all parts of the infrastructure of towns and cities around the planet, every drain, window, pipe and chimney. The latest generation of non-lethal chemical weapons is deployed in the compound where the kidnapped are being held, but the non-lethal weaponry results in the three children in the group spending weeks in recovery in hospital. Two members of the gang manage to put masks on in time and destroy the micro-bots tasked with targeting them—but flying micro-drone cockroaches take them out efficiently. Special forces—a team of seven with a gang of robot dogs—raid the compound and rescue the captives; hidden robotic machine guns emerge from under the soil of the compound, but one of the robot dogs quickly neutralises them. But in another part of the city, the remnants of the network launch a cyberattack against the healthcare providers in the United Kingdom—healthcare heavily dependent on information technologies/robotics of all kinds. And the member of the network who orchestrated the attack was based in the United Kingdom. He had continued to exist in the shadows, and he was the network’s most cunning cyber expert. He unleashes an attack that shuts down a variety of key elements in the logistics that makes everyday life possible. But the main event is a wave of personal/societal attacks that expose people to an onslaught of real/fake images
of torture, rape and killing. The whole country becomes a society of horrific spectacles, and the horror extends to smart homes (families are woken to sounds of torture and killing, deepfake images of their loved ones being abused) and to images shown on the advertising screens across the London underground. The designers of the event see it as a reminder to all those living in their safe European homes of the histories of necropolitical violence—and of the brutality of a Fortress Europe that is leaving those outside to suffer the climate emergencies. This event marks the start of a new age of informational terror enhanced by AI. The mission shows the flipside of granular war where the enemy is capable of orchestrating increasingly sophisticated action at a distance. Urban war is not local—even if it involves a compound on the edge of a city. It is increasingly dangerous to do anything in a foreign city; the urban environments—even in the poorest cities in the world—are congested with too many drones and robots that are constantly modified to counter any measures introduced to neutralise them; non-state actors can operate globally faster and more destructively than ever. Urban warfare disappears into the grey zone, interventions that have more in common with Christopher Nolan films like Inception than with war films like Black Hawk Down.

Drones over Aleppo 2042: Terrorism in an Age of AI

Climate crisis and civil war have produced vast refugee camps across Iraq and Syria, camps managed by a toxic mix of organised crime (who use the camps as a departure zone for those who can afford the traffickers’ fees) and the latest terror groups inspired by Al Qaeda and Bin Laden, the network that has an almost mythical status for the new generation of alienated young people who have been unable to leave the country. The camps descend into such depths of humanitarian disaster that an international effort led by the United States embarks on a humanitarian assistance and disaster relief (HADR) mission to stop people being killed, starved or dying from various disease outbreaks. Small units ‘tooled up’ with the latest advances in drones and robotics are operating across the camp territory seeking ways to destroy and undermine the terror networks. The camps have formed over once fairly safe cities, but it is clearly too dangerous to put troops anywhere near the zones where the network circulates (its members constantly surrounded/protected by robots, drones and children). But a young woman from the United Kingdom—linked to various horrific terrorist attacks across Europe and the United States—operates in cities and towns that have become a sprawling camp and humanitarian disaster zone. The young woman taunts the liberal world through the clever use of social media and becomes a heroic figure for many around the world. But she is hard to locate. The one time it was thought she was in sight—from a drone searching above the camp—she was surrounded by children. Another time there was a holographic conversation between her and one
of the diplomats from the EU. But security analysts begin to suspect that she does not exist, a tool of propaganda for a time when there is little enthusiasm or energy to join a terror network; the small group of fundamentalists are always looking for ways to use AI to enhance their numbers through the use of drones, robots and imaginary members. A complex operation is set up to use drones to deliver medicine and food into the vast camps covered with tents; doctors are able to advise people in the camp via the latest iPads that are provided to people and, when needed, send in the appropriate medicines; drones of various sizes and shapes move around the camp delivering medicine—and looking for the young woman. The scale of disease results in AI-enabled medical diagnosis drones being circulated in the camps—this speeds up diagnosis and the drone delivery of medicines. The drones delivering medicine arrive with sirens sounding so that they are given safe passage. The camp is filled with explosive devices to make it challenging for troops to enter and there are also fairly deadly anti-drone weapons. But there is a beauty to the way these machines fly around the camps, helping people, the robotic flocks that are mesmerising and hypnotic to watch. The 500 military technicians and soldiers operate from a heavily fortified camp from where they can manage the drone/robot operations and occasional raids—and are attempting to capture the young woman along with destroying the organised crime groups and private security companies attempting to move people into Europe. It is dangerous to operate outside the camp and the Americans and Brits rely heavily on the local forces. Much of the work involves training local forces in the new technologies of conflict management, training that involves a hybrid mix of ‘in person’ and online education. A constant flow of drones supports the logistics of the camp and it becomes increasingly difficult to avoid attacks on the flow of goods to the camp, a perpetual drone logistics war. The terrorist network produces very effective deep fakes of U.S. soldiers brutalising the civilians in the camp: this has created paranoia and mistrust in the camps as well as fuelling an overseas backlash about human rights abuses. One attempt to destroy the geofence that protects the camp from drones is temporarily successful and a drone swarm creates some panic—but no one is killed. But using facial recognition tools the group is able to obtain images of soldiers, resulting in some very detailed social media analysis that leads back to the families of the soldiers in the United States and the United Kingdom. Family members in the United States receive ‘deep fake’ videos of their loved ones being tortured; at the same time, the soldiers are sent deep fake images and videos of their loved ones in a variety of disturbing situations. The families of the troops experience a variety of problems in their intensely connected everyday lives, from school to work to online shopping. No one can be kept ‘off the grid’ now. The most sophisticated tactics involve the return of dead or distant family members for online conversations—some people never want the manipulation to end. But these online tactics are supplemented with real-world events targeted at families that create deep distress and anger in the troops: the hacking of smart
homes, cars and even kids’ toys. So, it is decided that the use of the technology that connects troops to families is to be limited; it is simply too open to manipulation. But this proves difficult for a generation that is not used to distance: holographic technology is so effective now that a family can sit together, drink and have a meal even if they are dispersed across the planet. But in the war zone no one can know who is watching the family gathering and what they may learn about you: a plot is uncovered to kill family members in the United States during one of these holographic family get-togethers. Many troops spend their downtime in virtual reality games with friends, but even this space is invaded by virtual enemies that create unsettling experiences. Even family members are subject to horrific attacks in virtual reality: virtual reality is becoming a terrain of war and conflict in ways no one had imagined. In the years that follow, the soldier’s experience of policing urban conflict environments becomes a lonely one, isolated from friends and family back home, distant from the people in the cities they are operating in. Drinking, board games and basketball are the favourite pastimes on the cutting edges of war in 2049. New education and training is brought in on how to protect privacy for troops and their families, and the recruitment process includes these concerns in the evaluation of new recruits. What terrorist groups can do is rather crude compared to what states are preparing for: the exploitation of vulnerabilities in the spaces between the global and the intimate, the organisational and the individual, the megacity and the block, the war and non-war infrastructures. But the most troubling idea is the possibility that a terrorist leader might have emerged who is no longer human—a leader who knows how to manipulate, mobilise, organise and orchestrate increasingly creative acts of destruction. This Ghost in the Shell cyberpunk future is not going to be easy to control. In the following chapter I begin to sketch out what might be on the horizon in terms of urban warfare in times of congested battlespaces and emerging technologies—but also to think about what granular warfare might mean in terms of great power or near-peer competition and conflict.

Bibliography

Abboud, Leila. 2023. ‘Macron’s “Bonsai Army” Needs More Money to Grow,’ Financial Times, 18 July: www.ft.com/content/9d72e855-3a60-4efc-95f6-c3132e76a6f8
Bauman, Zygmunt. 2000. Liquid Modernity (Cambridge: Polity).
Berman, Eli, Felter, Joseph, and Shapiro, Jacob. 2018. Small Wars, Big Data: The Information Revolution in Modern Conflict (Princeton, NJ: Princeton University Press).
Chamayou, Grégoire. 2015. Drone Theory (London: Penguin).
Coker, Christopher. 2015. Future War (Cambridge: Polity Press).
Demmers, Jolle and Gould, Lauren. 2018. ‘An Assemblage Approach to Liquid Warfare: AFRICOM and the “Hunt” for Joseph Kony,’ Security Dialogue, Vol. 49, Issue 5, 364–381: https://doi.org/10.1177/0967010618777890
Freedman, Lawrence. 2017. The Future of War: A History (London: Penguin).
Holland Michel, Arthur. 2019. ‘Some Cautionary Notes on the New “Knife Missile”,’ Defense One, 10 May: www.defenseone.com/ideas/2019/05/some-cautionary-notes-new-knife-missile/156943/
Kaplan, Robert D., Gray, John and Thompson, Helen. 2023. ‘The New Age of Tragedy,’ The New Statesman, 26 April: www.newstatesman.com/ideas/2023/04/new-age-tragedy-china-food-europe-energy-robert-kaplan-helen-thompson-john-gray
King, Anthony. 2021. Urban Warfare in the Twenty-First Century (Cambridge: Polity).
MacMillan, Margaret. 2021. War: How Conflict Shaped Us (London: Profile).
Marshall, Tim. 2023. The Future of Geography: How Power and Politics in Space Will Change Our World (London: Elliott and Thompson).
Norton, Richard. 2003. ‘Feral Cities,’ Naval War College Review, Vol. 56, Issue 4, Article 8: https://digital-commons.usnwc.edu/nwc-review/vol56/iss4/8
Rosa, Hartmut. 2015. Social Acceleration: A New Theory of Modernity (New York: Columbia).
Srivastava, Mehul, Kerr, Simeon and England, Andrew. 2021. ‘Palestinian Fury Exposes Netanyahu’s Illusions,’ Financial Times, 21 May: www.ft.com/content/184eb18c-e617-4dff-8783-91fdd0981af9
Strategic Studies Group. 2014. ‘Megacities and the United States Army: Preparing for a Complex and Uncertain Future’: https://api.army.mil/e2/c/downloads/351235.pdf
Thompson, Helen. 2022. Disorder: Hard Times in the 21st Century (Oxford: Oxford University Press).
Virilio, Paul. 2008. Pure War (Los Angeles, CA: Semiotexte).
Watson, Abigail and Knowles, Emily. 2018. ‘Remote Warfare: Lessons Learned from Contemporary Theatres,’ Oxford Research Group’s Remote Warfare Programme: www.saferworld.org.uk/resources/publications/1280-remote-warfare-lessons-learned-from-contemporary-theatres
Zuboff, Shoshana. 2019. The Age of Surveillance Capitalism (New York: Profile Books).

7
THE GRANULAR 2
The Granularity of Future War

For the optimist, this age of the ‘mega,’ the ‘meta’ and the ‘quantum’—and how the mega is transformed by the quantum—will provide unprecedented protopian possibilities for the health and security of people around the planet. For the pessimist, the granular age will lead to new dangers in terms of surveillance capitalism, enhancing the ‘granular’ and intimate surveillance capabilities of the state and capitalism. At the same time, individuals and groups might become increasingly dangerous in their ability to create disruptive and destructive events, exploiting the change in scale in technology and tactics. In terms of granular killing, Brandon Cronenberg’s Possessor (2020) depicts a world where innovations in neuroscience and brain implant technology allow assassins to occupy the bodies of people who are able to get close to highly protected targets. As we approach 2049, we might see new combinations of creativity and technology that are beyond even our most disturbing science fiction, the human body and mind becoming a battlespace and ‘crimespace’ in ways beyond our most dystopian nightmares. But in the first decades of the twenty-first century, a major preoccupation of analysts and policymakers is the problem of ‘megacities’ for future disorder and insecurity—and the problems of cities on the ‘planetary frontier,’ outside the zones of international ‘order’ and ‘stability,’ beyond the rules-based order (and in a future dealing with climate emergencies). On the one hand, there is a sense of anxiety in liberal states about the necropolitical consequences of megacities for the security and health of those living in the slums of the twenty-first century. But there is also a sense of concern for the security and health implications emerging from these ‘feral’ zones, an anxiety exemplified by fear (and loathing) of those risking their lives to escape into the liberal world. What I want to suggest is that what we see emerging in these anxieties about megacities (which may or may not be as dystopian as imagined) is a sense of
uncertainty about what the possibilities for urban interventions will be in the years ahead, the sense of a ‘shrinking’ range of possibilities in response to the congested character of cities, congestion that will become increasingly problematic for any liberal state attempting to operate in environments filled with a multiplication of actors and technologies (and the implications here extend really to all land-based operations). In Urban Warfare in the Twenty-First Century, Anthony King suggests that

the urban battle has localised onto specific sites within the city, but it has simultaneously extended out across the global urban archipelago by means of social media and information networks. Peoples across the world are now implicated in the fight as audiences, supporters and sometimes even participants.
(King 2021: 17)

In one of the most detailed studies of the problems of past, present and future urban war, David Kilcullen concludes provocatively that there will be a need for creative and innovative technologies and tactics if military forces are to be able to prevail against an ‘evolved irregular threat’:

In a complex fight in the urbanised littoral, there will be none of the fixed installations, lavish intelligence infrastructure, or constant cell-phone and Wi-Fi coverage of counterinsurgency operations. The garrison mind-set, with its short-duration operations and frequent access to bases with hot showers, air-conditioned dining halls, and sleeping cots, will need to give way to a mobile, improvisational, expeditionary mentality. Troops will have to become hikers again, not campers.
(Kilcullen 2015: 294)

What Kilcullen seems to be suggesting is that liberal states might not be operating around the planet with huge logistical support in wars that last years or decades; we are heading towards a time of raids and sub-threshold activity, a time when the ‘battlespace’ will be too congested with technologies and the ‘audiences’ King refers to will be unwilling to support anything more than ‘sub-threshold’ or invisible ‘grey zone’ activity, warfare in the time of the ‘hiker.’ In this chapter we begin to see how new ideas on technology and the future soldier are beginning to unfold: these ideas might point to the trends that will shape the liberal way of warfare in coming decades, trends that began to be sketched out in the scenarios in the previous chapter. While there might be little written on the future of urban warfare, we can begin to piece together some ideas that might point to what is on the horizon in terms of plans for the future soldier. The transformation of urban war and the future soldier will be shaped by emerging technological and organisational possibilities and constraints (political, ethical) and risks (accidents, unintended consequences, the vulnerabilities of the ‘agile’ warrior). Some of the
ideas here point to future sub-threshold missions enabled by new technologies, operations that are increasingly granular, using all the detailed intelligence that the future soldier can be ‘tooled up’ and connected with (using whatever the military evolution of the Apple Vision Pro becomes, designed for artificial intelligence (AI)-enabled decision-making assistance and the evaluation of people and terrains). The reality might be that the liberal way of warfare will be about providing sub-threshold support and training to foreign forces—all underpinned by the constantly expanding tools of mosaic warfare. But much of the discussion here is focused on conflicts like the ones fought since 9/11: the chapter concludes by looking at what granular war might look like in the context of great power conflict and competition.

The Three Block Robot War

In comments on urban warfare there is often a sense of unease or uncertainty about what might be possible in the megacities (or the Minority Report-like smart cities or Elysium-like slums) of the future. But the sense that emerges in the often vague and uncertain speculations is that urban warfare will become increasingly granular, focused on neighbourhoods, blocks and compounds. Operations would generally be fast and reliant on the latest technological innovations (in drones, robotics and cyber) to minimise risk to all concerned—the Zero Dark Thirty-like scenario would be the norm, not the exception, and even then such operations would remain exceptional acts of impure war: the time of the hiker and the raid. Authoritarian states will have a different assessment of the risks, strategic opportunities/technical possibilities and constraints. One of the ideas about future urban warfare and strategy that attempted to initiate a debate on these increasingly granular ways of war is the ‘Three Block War.’ General Charles C. Krulak created the concept of the Three Block War in the late 1990s in response to what he viewed as both the growing significance of future urban conflicts and the increasing complexity of urban operations, where marines might be required to deliver humanitarian assistance, maintain peacekeeping operations and engage in ‘mid-intensity’ conflict—all in the space of three city blocks, a dense urban zone. In the decade that began with the vision of a new world order and the United States as a force—and the only legitimate and capable force—for policing the disorder of the post-Cold War world, Krulak’s vision of future geopolitics and war is one where there will be a need for the liberal world to intervene in problematic zones around the planet. Krulak notes that:

Since 1990, the Marine Corps has responded to crises at a rate equal to three times that of the Cold War—on average, once every five weeks. On any given day, up to 29,000 Marines are forward deployed around the world. In far-flung places like Kenya, Indonesia, and Albania, they have stood face-to-face with the
perplexing and hostile challenges of the chaotic post-Cold War world for which the ‘rules’ have not yet been written.
(Krulak 1999: 16)

The implication in Krulak’s vision is not necessarily of a dystopian world covered with vast feral cities in which we will be forced to intervene but rather a world that will contain cities with chaotic blocks and neighbourhoods that might require military–humanitarian intervention; the underlying view here is that the United States will be compelled to intervene in these urban spaces for both strategic and humanitarian reasons. Writing in a unipolar moment of international history, Krulak anticipates a need to intervene in chaotic zones of global disorder where social, political and economic problems are intensified by climate emergencies, pandemics, inequality and (un)natural disaster. Krulak makes it clear that these urban environments would present serious obstacles and challenges:

The rapid diffusion of technology, the growth of a multitude of transnational factors, and the consequences of increasing globalization and economic interdependence, have coalesced to create national security challenges remarkable for their complexity. By 2020, eighty-five percent of the world’s inhabitants will be crowded into coastal cities—cities generally lacking the infrastructure required to support their burgeoning populations. Under these conditions, long simmering ethnic, nationalist, and economic tensions will explode and increase the potential of crises requiring U.S. intervention. Compounding the challenges posed by this growing global instability will be the emergence of an increasingly complex and lethal battlefield. The widespread availability of sophisticated weapons and equipment will ‘level the playing field’ and negate our traditional technological superiority. The lines separating the levels of war, and distinguishing combatant from ‘non-combatant,’ will blur, and adversaries, confounded by our ‘conventional’ superiority, will resort to asymmetrical means to redress the imbalance. Further complicating the situation will be the ubiquitous media whose presence will mean that all future conflicts will be acted out before an international audience.
(Krulak 1999)

Krulak wants to think about how this type of urban conflict might require new types of military organisation and leadership in the marines—towards an increasingly granular organisation where marines have to cultivate unprecedented decision-making and leadership skills. More specifically, the focus of Krulak’s position is on the need for the ‘strategic corporal,’ the corporal who might get caught up in chaotic urban battlespaces, making decisions under conditions of extreme stress, a dense, chaotic space where the ‘three block war’ distinctions between peacekeeping, humanitarian relief and conflict may well implode in an era of drone ‘swarms,’
informational strategies of deception and where there is the exploitation of the messiness of the city in a manner that makes any attempt to produce distinctions between civilians and combatants impossible. The point here is that urban conflict will be increasingly granular, focused on blocks rather than cities (or states)—leadership will need to be decentralised to be able to cope with the chaos and speed. One of the points made about Russian failure in the early stages of the war in Ukraine in 2022 concerned the ‘clumsiness’ of its traditional military structures (which possibly accounted for the number of important military figures who were killed or injured in Ukraine) versus the more decentralised and networked opposition. As John Arquilla tells Thomas Friedman on the war:

The Russians are much more centralised. One of the reasons they’ve had so many generals get killed is that at the tactical level, they don’t have people who are empowered to make those quick decisions in a firefight; only general officers can, so they had to come down close to the front and do things that lieutenants and sergeants in the American military routinely do.
(Friedman 2022)

Cyberattacks employed by the Russians to disrupt Ukrainian command and control, according to Arquilla, were less effective than they could have been due to the decentralised organisation of the Ukrainian regular and militia defence forces. As Gillian Tett reported in the Financial Times on the difference between hierarchical war and networked war:

The Russian military still appears to operate in a hierarchical manner—even though it has potent cyber-hacking and misinformation capabilities. The Ukrainian army, by contrast, gives decentralised teams considerable autonomy to make decisions and innovate, and soldiers communicate directly with their peers in different units.
(Tett 2022)

This would appear to be the future that Krulak was anticipating in the late 1990s. Some might argue that from the perspective of the 2020s, the only mission future troops will be taking through difficult, congested territory will be through the eyes of a drone or a robot; if there is a credible vision of the three block war it might not involve a strategic corporal on the ground but rather the use of technological tools (or prostheses), technologies designed to operate in messy, chaotic zones that would be too dangerous and too fast for a human to operate in. In Steven Spielberg’s Minority Report (2002), the Chief of the Pre-crime Unit John Anderton is chased by a high-tech police force through a hyper-securitised American city in 2054 covered with the latest (and pre-emptive) surveillance technologies: micro-robotic spiders enter a building, searching for
him, resulting in Anderton having to use clever ways to hide from the spiders. The future of the three block war might involve the use of different sizes of robotic devices—with a mixture of lethal and non-lethal tactics—in order to deal with the problems of the urban environment; the ‘human in the loop’ will be shaping the events from a distance; the human role in the three block war might only begin once the situation has been ‘neutralised’ and made safe (and the delivery of the humanitarian component of the three block war might not involve the military anyway). By 2049 we might not be in the time of the ‘replicant,’ but we might be in a moment where robotics—controlled by humans (although increasingly enhanced by AI and increasingly autonomous)—police dangerous environments in the same way that robot police do in the film Elysium. But this is all science fiction speculation on the future of war, a future that might never materialise because of the failure to create technology that is able to function effectively or safely. But the debate appears to have shifted (outside of academic writing at least) from broad discussion of the tactics needed to intervene in difficult zones around the planet (either through humans or machines skilled in the arts of the three block war) to a growing focus on the shrinking possibilities of global operations and interventions (at least understood in a ‘traditional’ sense of militaries occupying a territory through the use of force); a shift in focus that reflects both geopolitical change (a sense of domestic unease about military operations overseas coupled with the emergence of new great powers) and a sense of the technological complexity of the battlespace. In the essay ‘Tactical Art in Future Wars,’ Robert H. Scales (a commentator from outside the university world) offers a provocative commentary on the future of war, technology and territory in congested battlespaces of surveillance and precision weapons. Scales explores what debates about ‘multi-domain’ warfare might mean at the tactical level and challenges the view that new scientific developments will primarily influence cyberspace (and space), exploring how ‘restoring the offensive’ might be possible in a time when ‘linear’ movement is difficult due to the ‘threat posed by precise long-range weapons, the unblinking eye of sensors and aerial killing machines that guarantees the offensive will be nearly as suicidal today as it was in 1914’ (Scales 2019). The problem, in other words, that some would argue Russia began to confront in Ukraine in 2022. Scales suggests that a battlefield dominated by firepower and machines ‘compels units to disperse, disaggregate and go to ground’ (a point many would argue that Russia failed to grasp in the early stages of the war in Ukraine) (ibid.). To explore, in other words, the granular possibilities of liquid war, war where you can operate and communicate in a dispersed and decentralised fashion—but have increasingly detailed knowledge of your enemies’ movements and actions: a swarm on the ground with eyes in the sky. The process of disaggregation is good for Scales because it limits the effects of firepower but is potentially negative in the way that ‘dispersed forces are less able
to mass, and mass is essential if manoeuvre is to be restored’ (Scales 2019). The shape of the battlefield is transformed by dispersal:

Dispersal changes the shape and contours of the battlefield. Linearity disappears. Large groups of combat and support units moving together are replaced by smaller clusters of tactical units separated by empty spaces. A disaggregated battlefield favours autonomy and demands that close-combat units operate for long periods without reinforcement. An aerial view would leave the impression of emptiness. Urban terrain will provide sanctuaries for units seeking to avoid destruction by firepower. The smaller and more discrete the tactical disposition the more likely a force will be able to survive a Russian-style strike.
(Ibid.)

Similar to the point made on the strategic corporal in the three-block war debate, Scales goes on to suggest:

In turn, a dispersed tactical disposition alters both the shape and composition of the tactical units themselves. As the space between close-combat units opens up, units become more isolated, forcing greater reliance and independent decision-making. Traditional supporting enablers such as fires, intelligence, medical aid, logistical support and external sensors are positioned far to the rear to avoid destruction by firepower. The psychological ‘touch’ that comes from the presence of adjacent units will dissipate. ‘Touch’ thus must become virtual rather than physical. Isolated small units must increasingly fend for themselves, learn to survive, sustain and fight as self-contained entities capable of remaining effective for days without succour.
(Ibid.)

Scales suggests that it will be viewed as a failure if the small unit is left with no choice but to engage in a face-to-face fight; the key objective for the small units is to act as ‘human sensors, decisional “gatekeepers” and facilitators responsible for translating killing power residing at a distance into killing effects on the enemy.’ Small units will become ‘virtual outposts, in effect the eyes and probing figures of a larger supporting operational force placed out of reach of the enemy’s long-range fires.’ This is possibly what Kilcullen is pointing to in his comments on troops becoming hikers. Scales is offering a vision of future war where small units move across territories—a ‘complex terrain or urban clutter’—in a manner that provides the granular analysis that will enable them to become the ‘little green men’ that can control and shape significant ‘strategic assets.’ Decisions made formerly by colonels will be made by sergeants hooked up to a range of AI-enabled decision-enhancement
tools. The miniaturisation of military tech will enhance the destructive capacity of the small unit:

The tank isn’t dead but it’s far easier to kill today thanks to very precise and portable guided missiles. An onboard computer gives the M1 Abrams tank a single-shot kill probability out to two miles. Today, micro-miniaturization technology borrowed from civilian industry will allow Abrams-like precision to be squeezed into rifle sight with the same one-shot-one-kill probability.
(Scales 2019)

While there will be limits to what they can physically carry, the future soldier will be connected to the most advanced networks of information and weaponry imaginable:

Tomorrow’s small-unit soldier and leader will never be able to carry all of the combat gear necessary to keep the unit functioning in the close fight. But they will be able to ‘reach back’ to access combat resources residing well to the rear, at sea or perhaps outside the theatre of war. Efficient supply chain technologies and methods borrowed from companies like Amazon and Google will allow battlefield delivery of supplies quickly enough to reduce the logistic load a small unit must take with it into the close fight.
(Ibid.)

To deal with the problem of weight (a soldier’s load has increased from 60 pounds in the Second World War to over 100 pounds), the soldier will be accompanied by robotic vehicles. The soldier will be surrounded by an ‘impenetrable sensor bubble’ that will include feeds from ‘tactical drones, body sensors and mobile, robotic sensors surrounding a unit on the march.’ One of the key challenges will be how to ‘accelerate tactical manoeuvre’; otherwise the risk is ‘stagnation and the attending horror of attrition war’ (ibid.). The dream, one imagines, in all this discussion is of a decentralised army composed of units that can reach a level of granularity where they can disappear or become invisible to all the sensors and drones that can monitor their movements. So, Scales offers a vision of small units moving across a territory, tooled up and backed up with the latest innovations in weaponry and ‘unhackable’ (quantum) communications devices and AI-enabled information gathering and analysis tools, all existing in the impenetrable sensor bubble. Scales’s essay sometimes gives the impression of small units in their impenetrable sensor bubbles being able to move invisibly across a city or terrain—with their robotic vehicle—undetectable by the sensors and surveillance devices that will cover all terrains around the planet: this might be an overly optimistic vision of the future battlespace; after the experience of Russia in Ukraine the next step will be focused on disrupting
and destroying the decentralised military that Scales describes. What happens when both sides adopt the creative possibilities of technology and decentralisation sketched by Scales? The technological possibilities might well be constrained and limited in the years and decades ahead; and rather than deploying the technologically enhanced ‘hiker,’ the liberal approach will be to support and train the citizens/warfighters of other countries. Or to use special forces in a manner that is more about ‘human’ skills—to evade, hide or impersonate rather than anything involving technologies of invisibility, disappearing from the view of the sensors and drones. But what this all shows is the growing interest in exploring the risks, constraints and possibilities in complex and congested environments. Becoming granular might take us into tactical and technological possibilities we cannot currently imagine. Future warfare might involve a mix of cyber/psychological tactics of organisational disruption and subversion combined with emerging machinic possibilities—the subject of the next two chapters. Simply put, there is an uncertainty about what size and shape of ‘boots on the ground’ can be deployed in congested times of granular threats. With each passing decade out to 2049, how dangerously congested will any battlespace become? Will ‘technical fixes’ protect the liberal warfighter from the constant mutation of threats or will the role of the army be to provide training and technical support for the soldiers of foreign territories? All these discussions of the granular point to the difficulty of congested environments, environments that are beginning to be filled with a range of technologies to see, hunt and kill/destroy with increasing speed, enabled by an increasingly granular vision of the battlespace: the emergence and mutation of congested territories that can be saturated with technologies of all shapes and sizes in the terrain or built infrastructure or on (or in) the bodies of people. The urban territory—and territory across a state—may prove impossible for any type of mission or operation (for states concerned about the loss of life). An essay on the ‘internet of battle things’ gives this view on the density of future war:

The battlefield of the future will be densely populated by a variety of entities (‘things’)—some intelligent and some only marginally so—performing a broad range of tasks: sensing, communicating, acting, and collaborating with each other and human warfighters. They will include sensors, munitions, weapons, vehicles, robots, and human-wearable devices. Their capabilities will include selectively collecting and processing information, acting as agents to support sensemaking, undertaking coordinated defensive actions, and unleashing a variety of effects on the adversary. They will do all this collaboratively, continually communicating, coordinating, negotiating and jointly planning and executing their activities. In other words, they will be the Internet of Battle Things.
(Kott, Swami and Best 2016)
All territory might become congested in complex and challenging ways, regardless of whether it is a village in Ukraine or a megacity in the global south; for liberal states, urban interventions might only emerge in situations of ‘existential threat’ given the dense, congested environments packed with a multiplicity of mutating technologies and actors. One of the most dystopian (and possibly paranoid) thoughts on the future of war is that states that have ‘unfulfilled’ strategic objectives might act sooner rather than later if they are concerned about the emergence of territories filled with a range of technologies/weapons of all shapes, sizes and capabilities that will frustrate an economically and militarily ‘superior’ power: What (additional) difficulties would Russia confront if it decided to invade Ukraine in 2035 or 2049? Will anxiety about the military technological future generate more wars sooner, brought on by the possibility (or fear) of states that might be able to deter future wars through their fast transformation from networked societies and smart cities of efficiency and economic growth to battlespaces of deadly, constant and unrelenting ubiquitous warfare? Will this condition result in increasingly creative ‘impure’ tactics of cyber sabotage, subversion and psychological targeting and manipulation? Will war become so impure that the multiplication of tactics and technologies deters all but the most powerful from risking the chaos of future warfare? Will urban operations reach a point of granularity where tactics will be invisible? Of course, the counter to all this focus on ‘three block wars’ and small units operating across terrains of sensors and the internet of battle things is that we might experience times of technological/societal collapse (perhaps resulting from a global accident/pandemic or as a result of environmental disaster and climate emergency) where having armies prepared for granular and impure war might result in land wars that resemble the worst moments in the previous century of wars and humanitarian disasters. But future conflicts might involve the constant search for creative technologies and tactics of granular war—or granular sub-threshold events, raids or acts of sabotage. For example, in November 2020 the Iranian nuclear scientist Mohsen Fakhrizadeh was killed in an assassination involving an automated one-tonne gun—described as the ‘Machine-Gun with AI’ controlled by a satellite—that was apparently smuggled into Iran in pieces (Wintour 2020). What the event illustrates is an attempt to evade detection through the use of elements that could be assembled inside Iran; controlled at a distance, limiting the risk to those orchestrating the attack—and designed to leave no ‘granular’ traces that might be used as evidence. It seems likely that the type of attack that killed General Qasem Soleimani in January 2020 in Iraq took place in a security environment where it was possible to orchestrate a drone strike; attempting something similar in Iran might not have been possible and so required a level of creativity and surprise to evade detection, the remote orchestration of all the granular elements into a war machine. These ideas and speculations on future war are fragments intended to make sense of a fragmented (and fragmenting) future battlespace in which liberal states might have to operate. But the war in Ukraine might be bringing a sharpness
to these debates on the changes in scale in war and international conflict, an awareness of the deeper transformation in strategic theory and practice that might be underway. In 2021, T.X. Hammes began to explore the question of whether technological advances would lead the United States into a period of defensive dominance or push towards dominance by an offensive strategy: he suggests that history records shifts between the offensive and defensive, but technologies are currently changing the battlefield in a manner that points to the need for the United States to lead in the shift to defensive dominance (Hammes 2021). Max Boot argued in May 2022 that Hammes’s point was illustrated by events in Ukraine where the defender ‘had the edge’ and where defenders ‘benefit from sensors that allow them to detect attacking forces and hit them with precision-guided munitions without revealing their own positions’ (Boot 2022). In a period of defensive dominance (which of course may change depending on how much destruction the invader is willing to inflict) Russia and China will have to think carefully about moves into complex congested battlespaces where states like the United States can become the ‘enablers’ at a distance, the liberal enablers of future warfare. The implication of the point that Hammes and Boot are making is that all states will have to think about the risks of offensive moves in battlespaces of precision, decentralisation and congestion. The war in Ukraine gives a sense of how the changes in scale in organisation, technology and tactics might be creating granular war in response to the Russian war machine of a ‘heavy’ modernity where, as Arquilla tells Thomas Friedman in an interview in The New York Times, finding always beats flanking, especially if the enemy is made up of large units ‘like a 40-mile-long convoy of tanks and armoured personnel carriers’ (Friedman 2022). The informal Ukrainian observer corps and ‘anyone else who has got a smartphone’ enable real-time monitoring of the locations of Russian units: ‘And so the Ukrainian forces have this big edge in finding the Russians in this big country, and that is giving their small units with smart weapons real-time, actionable intelligence’ (ibid.). Grandmas with iPhones, according to Arquilla, can trump satellites. Arquilla adds that swarming always beats surging: you don’t need ‘big numbers’ to swarm an opponent with lots of ‘small smart weapons.’ Air superiority is disrupted due to the use of ‘stingers.’ But this granular time of decentralisation, drones and smartphones might not be pointing to an ‘improvement’ or ‘progress’ in the character of war; rather it points to the possibility of a prolonged war composed of different sides using different tactics and technologies that, while it presents Russia with difficulties, can still result in massive bombardments and destruction. If the multitude of granular tactics leaves a state with no alternative but to reduce cities to rubble this might change the calculation to invade; it might be the case that granular tactics and technologies deter wars through the ways territory is made an impossible, deadly and congested terrain for warfighters and war machines in a way comparable—although clearly far less devastating—to the way the possibility of radioactive zones changed the calculations for war between nuclear powers. In
this sense, the long-term consequences of the war in Ukraine will be vital in shaping how war is understood as a strategic option in the years ahead by states such as China; how the war plays out might also inform how wars of regime change, policing and humanitarian intervention are considered in a multipolar future where states will confront enablers at a distance providing the technologies of granular war: what would a U.S. intervention in Iraq look like in 2035 compared to 2003, given technological and geopolitical transformation? While there might be many technological innovations that produce new opportunities for future warfighting, there will also be constraints and risks in times of geopolitical and technological change. But the discussion so far has generally been focused on states like the United States operating in cities in the Global South or like Russia in Ukraine: What are the implications of the granular for great power or near-peer conflict and warfare?

The Future of Interstate War in Granular Times

It could be argued that there is going to be something very ungranular about nuclear powers that will attempt to threaten and deter through weapons of military and economic destruction in a complex multipolar world. But while future great power conflict could create mega-threats and mega-destruction, it seems likely that the granular dimensions will be creatively exploited. Indeed, it might be the case that the granular will be fundamental to how great power or near-peer conflict will unfold. There might have previously been a view that conflict between China and the United States would never take place because of the strategies of deterrence that would make it unthinkable. But there is clearly growing speculation on the possibility of a conflict that moves out of the grey zone and up from the sub-threshold undergrounds of sabotage, espionage, subversion and competition (Allison 2018). It might be the case that all sides would aim to prevent the conflict from involving nuclear weapons: there might be a sense of constraint on all sides in the attempt to contain the warfare. In this sense, the conflict might have a granular aspect—although the granular aspect will be composed of a multitude of granular events and attacks designed to demonstrate capability without direct attacks on civilians and domestic territory; attacks on a range of critical infrastructures that are carefully calibrated to avoid dangerous escalation (targeting, for example, the satellite technologies and systems used for business but not the satellites vital to nuclear deterrence, or the communication systems vital for commerce but not those for military command and control): whether this will be possible without the risk of accidents remains to be seen—but the ‘hope’ would be that the precise exploitation and destruction of granular targets on a global scale (and in a variety of terrains, from the deep sea to critical infrastructures in isolated areas) could result in a limited and contained form of great power world war. The granularity of attack might also be driven by concerns over entanglement and interconnection where
not only might you have citizens in the ‘enemy’ territory, but you might also have investments in the built environment and critical infrastructure. In many ways, this is the context for Peter Singer and August Cole’s fictional depiction of a future conflict between China and the United States in their novel Ghost Fleet: A Novel of the Next World War. The book explores the possibility that neither deterrence through military power nor ‘deterrence by entanglement’ will eliminate future conflict between great powers. In a departure from the authors’ writing on the changing nature of warfare, Ghost Fleet explores the possibility of a sub-nuclear conflict between China and the United States written in the style of a thriller/futuristic war story: and what it begins to show is the possible granularity of the conflict; while many of the events mentioned so far in this chapter are single events that might be elements in broader political conflict (for example, the killing of the Iranian scientist) what we see in this fictional conflict between ‘great powers’ is a multitude of events—or waves of events—in different territories that exploit a variety of granular vulnerabilities; war that is both intensive and extensive, the multiplication of granular tactics and technologies on a global scale. Ghost Fleet begins with conflict in space and then moves to cyberspace (with U.S. capability degraded by malware) in a move that is able to disable military communications and weapons systems: China is then able to invade Hawaii. The Chinese are able to destroy U.S. nuclear submarines and so the United States has to fall back on its pre-digital ‘ghost fleet’ of warships, drawing on its ‘home-grown’ resources and expertise. Ghost Fleet depicts the incredible technological possibilities of the near future—but also the fragilities and vulnerabilities of this technologically enhanced world. It depicts the possibility of a conflict that is worldwide in scale but involves a variety of events and exploits that do not involve ‘apocalyptic’ attacks or the occupation of entire states or populations: the tactics are, in a sense, granular. A description of a party at the U.S. embassy in Beijing involves a comment on all the devices on the bodies of partygoers: ‘Eyeglasses, jewellery, watches, whatever—all were constantly recording and analysing. Suck it up and let the filters sort it out. It was not much different from how the people back home did their shopping, wide-casting for discounts’ (Singer and Cole 2015: 18). One of the U.S. commanders has knowledge of the policies that a Chinese general is about to announce because a driver ‘had left a window cracked open to smoke. That’s how good the collection was’ (ibid.: 19). Hacker militias are used for espionage; 3D printers are used to reproduce important parts for military tech; speeches are generated with the help of social engineering algorithms. Ghost Fleet is also about a geopolitical event where moral anxieties in liberal societies fade in significance: ‘all the pre-war concerns about setting robots loose on the battlefield didn’t seem to matter when you were on the losing side’ (ibid.: 236). The conflict avoids futuristic battles in the streets of Los Angeles or Beijing. Indeed, Ghost Fleet is interesting in the way that it describes a near-future conflict between China and the United States, the various elements in the ‘mosaic’ of war, but one that—with the exception of Hawaii—still leaves the domestic territory of both


states relatively unscathed. It describes a world where a variety of micro-devices collect and record different types of information, exploring the granular tactics that might increasingly come to play a role in interstate war. But the solution to the conflict comes in the very ungranular form of the Ghostfleet. In this sense, the book might be raising a note of caution about our reliance on and 'faith' in new technologies. Ghostfleet begins to point to the granular vulnerabilities that might become increasingly significant in international conflicts between great powers. An article in the Financial Times reported on the challenges facing the U.K. military in a time of budgetary constraint, citing a memo by retired Joint Forces Commander Sir Richard Barrons which warns that manpower across the forces is dangerously squeezed. The article reports: 'It is not necessary to shoot down all the UK's Joint Strike Fighters, only to know how to murder in their beds the 40 or so people who can fly them' (Jones 2016). The point about the Joint Strike Fighters leads to the concern that in complex, technologically advanced societies fewer people might be needed to keep an organisation functioning or to perform vital tasks involving the latest technologies (including strike fighters)—or the military will be composed of smaller groups working across a broader range of military technologies or services. And this is before we begin to consider the organisational vulnerabilities in a time when we might become even more reliant on digital technologies (and possibly AI). Simply put, the question is whether organisations are becoming more fragile and vulnerable—and in what ways they might be becoming vulnerable. We already see the possibility of organisations where employees have the ability to access more data than ever before (or to steal or release unprecedented amounts of data and information). The question becomes how vulnerable states will become in a world that depends on small groups of experts (possibly across a larger number of areas) and whether 'removing' or 'incapacitating' (or corrupting) those experts can be catastrophic; or when the manipulation of an expert can lead to damage far beyond anything possible in the Cold War. Or when the exploitation of a granular vulnerability can lead to the fast resolution of a conflict. The novels of John le Carré are all about the granularity of interstate conflict in the Cold War, the games of espionage and subversion. But simply put, the stakes might have risen since the Cold War, with vulnerabilities that have multiplied and that would be more catastrophic if exploited. The point here is perhaps that there will be a number of creative (or 'unrestricted') tactics that might be deployed to weaken domestic resolve about an unfolding war (a surprising and disturbing spectacle of terrorism or unrestricted warfare), combined with tactics that enable the degrading of expertise at home, expertise that may be held in smaller and smaller groups. An opponent may be able to obtain increasingly granular detail about an organisation and then increasingly granular detail about the individuals in an organisation that can be exploited and used to undermine the effectiveness of the group. These possibilities will likely be explored in the


granular tactics in the liberal way of war—and will become a growing concern in terms of national security across all aspects of government and infrastructure. The potential to undermine at a distance may increase as the technologies of the 'smart' and networked life proliferate, all the technologies that reveal more and more about our movements, lives and fears/desires. The United Kingdom's DSTL (Defence Science and Technology Laboratory) produced a 'speculative fiction' short film on how staff in the security world might be manipulated from a distance in the near future due to the proliferation of smart technologies in the home. It might be possible to make key personnel disappear from the gaze of technologies of surveillance and intrusion—but privacy may be impossible given the networks in which we become inevitably entangled and that may be 'unpicked' from a distance. The point here is to suggest that it might not simply be about us developing increasingly granular 'patterns of life' analysis (used, for example, in surveillance operations during the War on Terror) on those whom we gaze down upon in cities and villages outside of our liberal world order: there may well be increasingly granular analysis of those tasked with running our most advanced technologies and techniques of future war. And if interstate war does occur there may be a variety of granular tactics that can be deployed. As we will see in the next chapter, it might be the case that innovations in areas such as AI enhance the possibilities of these granular tactics. As one security and technology commentator put it in a striking description of how AI might be used to tell us more about the lives of our enemies:

Take the South China Sea, where China deploys naval, coast guard, and maritime militia vessels that blend in with fishing boats. 'So you have to watch the pattern of life and get an understanding of what is their job on any particular day because if a fight were to break out, one, you might not have enough weapons to be able to engage all the potential targets so you need to know the right ones to hit at the right time,' he said. But what if the conflict is less World War III than a murkier gray-zone altercation? What is the best way to defuse that with the lowest level of escalation? 'The best way to do that is to identify the players that are the most impactful in this confrontation and… disable them somehow,' he said. 'I need enough information to support that decision and develop the tactics for it. So I need to know: what's the person on that boat? What port did they come out of? Where do they live? Where are their families? What is the nature of their operation day to day? Those are all pieces of information a commander can use to get that guy to stop doing whatever that guy is doing and do it in a way that's proportional as opposed to hitting them with a cruise missile.'
(Tucker 2020)

The liberal way of future warfare might be about using the granular intelligence you have access to in order to orchestrate events (or exploits) that might damage and disrupt states and militaries. The point of this discussion is not to suggest that future war will be inevitably granular—involving small units and micro-drones and


nano-bots—but that increasingly (or decreasingly) granular tactics may play roles of varying significance even in wars that take us into the realm of large-scale war and destruction (and of course it could be argued that this was always the case). And while we might be entering into a time of defensive dominance that deters interstate wars, it seems likely that states, criminal organisations and terrorists will experiment with the granular possibilities of grey zone or sub-threshold activities, small events with big impacts.

Concluding Remarks

From the perspective of a protopian liberal optimist, the spaces of violence and conflict around the planet might 'shrink' (and possibly even begin to disappear) in the coming decades through a combination of improved global governance, economic innovation and growth, the expansion of (safe) spaces where 'the better angels of our nature' can flourish; the proliferation of technologies and tactics of surveillance and deterrence will begin to limit and contain all types of aggression, from small-scale hostility on a street through to the possible eruption of deadly civil wars or interstate violence. This granular future, in this view, will produce constant evolution in the solutions to the problems of the human condition—from medical nanotechnologies and interventions through to the emergence of understandings in neuroscience that allow radically new ways of transforming and improving the mental health of the human condition, the 'molecular revolutions' that might create new possibilities for how people can live and thrive. A world where doctors—as first occurred in 2019—are able to use keyhole surgery to repair the spine of a baby with spina bifida while still in the womb, a medical innovation that could allow children who would previously have been unable to walk to do so. By 2049 it might be possible for increasingly granular techniques to eradicate and manage so much that has resulted in human and non-human suffering. But at the same time, while the spaces of war and violence might be shrinking, all this is not to say that a world is emerging shaped by the 'better angels of our nature' and the production of more 'humane' tactics and non-lethal weaponry, the military–technical realisation of the 'civilising process.' It may be the case that radical new non-lethal weapons become strategic game changers in the battlespaces of future warfare, but the shrinking spaces of conflict will most likely retain the possibility of intensely destructive and brutal events: the 'Ninja Bomb' potentially shrinks the space of destruction—and limits the impact on those outside the kill box—but it is still a weapon of intense granular destruction. The increasing congestion and precision of the battlespace could lead to a period of defensive dominance where the costs and risks of war (whether you are the United States planning regime change or whether you are Russia planning a special military operation in Ukraine or China considering action in Taiwan) are viewed as unacceptable; great powers explore the possibilities in the grey zones of international politics, preparing for


interstate wars while they continue to fight against terrorist networks. But at the same time, this granular age might be a time of new vulnerabilities in an era of 'open technological innovation' that produces new destructive possibilities for the cunning and creative state or non-state actor; the focus of the liberal way of security and defence will be searching for the granular vulnerabilities in critical infrastructures and organisations. There does appear to be a tendency towards what we might describe as granular war, conflict in smaller spaces and zones and in terrains that may become increasingly congested with emerging technologies and sensors for surveillance and destruction; the anxiety here is exemplified by the debate on the future of urban warfare and the three-block war becoming the one-block war or the four-floor war, the type of event depicted in the movie Dredd that shows the movement of highly militarised police through a single dangerous tower block. Towards a type of conflict that is limited not only by strategic and geopolitical contexts but also by the tactical challenges of fighting in congested battlespaces against an assemblage of granular technologies that may be used in highly disruptive and destructive ways. Any shrinking of militaries in the liberal world or shrinking of the future battlespace is unlikely to be a sign of a new age of peace and humanitarian values; more of a time–space compression of destructive capability. But we are left with (at least) three areas of concern that emerge from this granularity of conflict. First, in his comments on miniaturisation and technology, Virilio suggests provocatively that where these trends lead is the disappearance of technology (Virilio 2005); the point, one imagines, where technologies disappear into our environments and bodies, increasingly powerful but invisible; the point at which our lives are dependent on a range of technologies and systems that are invisible to us, blending into the built environments or physically remote, undersea or in space. But in terms of war, will this time of granular war and the time–space compression of destructive capability contain or limit the necropolitical possibilities of the liberal way of war? Or will the coming years and decades see a world where the granularity of conflict contributes to the disappearance of events from awareness in the public sphere in liberal states as much as in authoritarian or 'hybrid' regimes? Will we see the emergence of 'micro-aggressions' where—as the authors of Unrestricted Warfare seem to imply—we will only become aware of our vulnerability, subversion and exploitation when it is too late in the conflict that is unfolding? A necropolitical world of small, untraceable events of political, legal and ethical (in)significance. Of course, the counterargument to this would be that no event will be so granular as to escape the gaze of all the 'vision machines' covering the planet. But as Steve Niva argues in an essay on 'Disappearing Violence' and the use of 'shadow wars' of lethal networked warfare in Iraq and Afghanistan:

Under the Obama administration, this shadowy form of military engagement achieved a new density and centrality within US military and related agencies,


such that it has now become a primary theatre of contemporary American warfare. Its signature actions are secretive and targeted kill-or-capture operations that do not so much move across static borders as render them contingent, producing proliferating 'grey areas' in which violence is largely disappeared from media coverage and political accountability.
(Niva 2013: 186)

As Mbembe might put it, these targeted killings and drone strikes are a continuation of the 'nocturnal body' of the liberal way of war where the disappearance of violence from social and geopolitical imagination produces moral indifference and distance (and possibly contributes to a sense of moral superiority and ignorance about the liberal role in the geopolitical world). Second, it is unclear whether new granular security possibilities will pose technical challenges that will evade detection by surveillance technologies or will result in failures of the policy imagination in terms of the change in scale of threat and vulnerability. For example, when it was alleged in 2023 that Chinese spy balloons were being used to collect intelligence and information, there was a concern that the balloons had exposed a gap in American air defences (Mansoor 2023); while it is not clear how effective the balloons were as a tool of surveillance, it could be argued that the use of the balloons pointed to the creative exploitation of a change in scale in security technology that evaded the systems designed to detect larger and faster objects and technologies. What will be the emerging granular technologies and tactics that will evade thinking and planning (and imagination) on security threats—and how creative (and daring) will opponents become? Third, scholars in international relations such as James Der Derian and Alexander Wendt (2022) have begun to explore the implications of our changing understandings of quantum physics for security and international politics. It might be the case that these new perspectives on the granularity of reality (and the widespread use of terms such as granularity and entanglement stems from quantum physics) radically transform existing technologies (for quantum computing and cybersecurity, for example, or new possibilities for sensors) and tactics of warfare (Krelina 2021); more broadly, it might be the case that quantum physics and our understanding of the 'granularity of the world' revolutionises all aspects of life and international politics in coming decades in ways that we cannot currently imagine. Or this transformation in our understanding of the fabric of reality might have little impact on war and international politics beyond the creation of new and improved technology—and the proliferation of new terms in business and war (along with new disagreements on how events in international politics are explained and understood). Indeed, new understandings of the fabric of reality might change our appreciation of the granularity of the world, what Wendt sees not only in terms of explanatory power but also in terms of its elegance and 'aesthetic sense' (Wendt 2015: 293). But quantum physics—and the relational interpretation of quantum mechanics—might transform international relations in the twenty-first


century in a manner that turns war into something more like Christopher Nolan's Tenet than Zero Dark Thirty—not necessarily more dangerous but possibly stranger and weirder than we can currently imagine.

Bibliography

Allison, Graham. 2018. Destined for War: Can America and China Escape the Thucydides' Trap? (London: Scribe).
Boot, Max. 2022. 'Russia Learns the Peril of Aggression in an Age of Defensive Dominance,' The Washington Post, 4 May: www.washingtonpost.com/opinions/2022/05/04/russia-ukraine-aggression-in-age-defensive-dominance/
Der Derian, James, and Wendt, Alexander (eds). 2022. Quantum International Relations: A Human Science for World Politics (Oxford: Oxford University Press).
Friedman, Thomas L. 2022. 'Free Advice for Putin: "Make Peace, You Fool",' The New York Times, 13 April: www.nytimes.com/2022/04/13/opinion/putin-ukraine-war-strategy.html
Hammes, T.X. 2021. 'The Tactical Defense Becomes Dominant Again,' Joint Force Quarterly, 14 October: https://ndupress.ndu.edu/Media/News/News-Article-View/Article/2807244/the-tactical-defense-becomes-dominant-again/
Jones, Sam. 2016. 'Britain's "Withered" Forces Not Fit to Repel All-Out Attack,' Financial Times, 16 September: www.ft.com/content/36f47240-7c0e-11e6-ae24-f193b105145e
Kilcullen, David. 2015. Out of the Mountains: The Coming Age of the Urban Guerrilla (London: Hurst).
King, Anthony. 2021. Urban Warfare in the Twenty-First Century (Cambridge: Polity).
Kott, Alexander, Swami, Ananthram and West, Bruce J. 2016. 'The Internet of Battle Things,' Computer, Vol. 49, Issue 12: 70–75: https://ieeexplore.ieee.org/document/7756279
Krelina, Michal. 2021. 'Quantum Warfare: Definitions, Overviews and Challenges': www.researchgate.net/publication/350341569_Quantum_Warfare_Definitions_Overview_and_Challenges
Krulak, Charles C. 1999. 'The Strategic Corporal: Leadership in the Three-Block War,' Marine Corps Gazette: https://mca-marines.org/wp-content/uploads/1999-Jan-The-strategic-corporal-Leadership-in-the-three-block-war.pdf
Mansoor, Sanya. 2023. 'Why America's Air Defences Failed to Detect Chinese Spy Balloons,' Time, 10 February: https://time.com/6254681/chinese-balloons-us-air-defense-network-failure/
Niva, Steve. 2013. 'Disappearing Violence: JSOC and the Pentagon's New Cartography of Networked Warfare,' Security Dialogue, Vol. 44, Issue 3: 185–202.
Scales, Robert H. 2019. 'Tactical Art in Future Wars,' War on the Rocks, 14 March: https://warontherocks.com/2019/03/tactical-art-in-future-wars/
Singer, Peter and Cole, August. 2015. Ghostfleet: A Novel of the Next World War (London: Houghton Mifflin Harcourt).
Tett, Gillian. 2022. 'Inside Ukraine's Opensource War,' Financial Times, 22 July: www.ft.com/content/297d3300-1a65-4793-982b-1ba2372241a3
Tucker, Patrick. 2020. 'AI Is Reshaping the US Approach to Gray-Zone Ops,' Defense One, 9 December: www.defenseone.com/technology/2020/12/ai-reshaping-us-approach-grayzone-ops/170621/
Virilio, Paul. 2005. Negative Horizon: An Essay in Dromoscopy (London: Continuum).


Wendt, Alexander. 2015. Quantum Mind and Social Science: Unifying Physical and Social Ontology (Cambridge: Cambridge University Press).
Wintour, Patrick. 2020. 'Iran Says AI and "Satellite-Controlled" Gun Used to Kill Nuclear Scientist,' The Guardian, 7 December: www.theguardian.com/world/2020/dec/07/mohsen-fakhrizadeh-iran-says-ai-and-satellite-controlled-gun-used-to-kill-nuclear-scientist

8 THE MACHINIC 1
The Battle Angels of Our Better Nature

At the beginning of Blade Runner 2049, K arrives at a 'protein farm' where a suspected replicant is hiding from the blade runners. After K's 'spinner' lands, a 'drone' detaches from the car and hovers above the farm, surveying the land below, the drone appearing to be connected to K, responding to his hand movements. K finds a box buried underneath the ground, a detail revealed by the drone's examination of the ground underneath the farm. Now Blade Runner 2049 is a work of science fiction that is possibly less interested in being a work of futurism or prediction and more interested in exploring ethical and philosophical questions about what it means to be human in the twenty-first century, and about the power of tech corporations and their control over (post-human) life and death. The film is not attempting to show us what the world will look like; the world it depicts—terrains degraded by human-generated climate change, vast dystopian megacities populated by the abandoned remnants of humanity, the unlucky people who are unable to escape from Earth—is about producing the most startling and interesting visual spectacle for the viewer. But the drone detaching from the spinner prompts the question: What might drones and robots be able to do in 2049? We might not be living in the midst of the kinds of ethical and existential questions explored in the Blade Runner films (although we might well be and might be dealing with ethico-political problems we cannot currently imagine)—but we might see machines extending our human capabilities and capacities beyond anything we see in Blade Runner 2049 or can currently imagine. In the following two chapters I want to suggest that one of the key trends in modernity (and, really, in the history of 'civilisation') is the use of technology to extend or transform our capabilities and capacities, the transformation of existence through the creation of new 'tools.' But how might war be transformed through the use of new tools and prostheses?


Blade Runner 2049 shows us an extreme example of future technology: machines/replicants created to fight wars or work in space for humans. The 'drone' is possibly the most widely discussed and controversial machine or tool devised to transform our warfighting capabilities (or at least was during the Global War on Terror, until the anxiety over artificial intelligence (AI) and ChatGPT)—and one that is likely to evolve, mutate and transform in radical and unpredictable ways out to 2049. But how significant will machines such as drones and robots be in transforming war in coming decades? This chapter attempts to unpack the debate in a cautious manner; drones and robots might be a rather 'mundane' part of everyday life and work, as we see in the scene with K at the farm. The use of new machines such as drones and robots might not produce a type of war that resembles the most terrifying and apocalyptic Terminator-like science fiction; and there may well be 'protopian' possibilities that far outweigh the more dystopian and apocalyptic visions of future war. If we are thinking of the technological and scientific trends in modernity developed to enhance and transform human capacity then the drone or robot might be one of many trends in the twenty-first century—and not necessarily the most significant.

Remote Control and Modernity

In The Myth of the Machine: The Pentagon of Power, the historian Lewis Mumford produces an ambitious study of civilisation that outlines the ways that improving 'the machine'—the different technologies and techniques created to make life more efficient, orderly, secure and healthy—has become the purpose of human existence; civilisations/societies across the planet have been driven by the 'quest' to develop new machines designed to 'improve' human existence. Mumford sees a number of significant tendencies in our technological desires that reach new levels of power and intensity in modernity, suggesting that the desire for 'remote control' has always been central to the regimes of power, control and governance that humans have created: it has been necessary in the 'hierarchic order' to produce obedience and control in every chain of command, something that has been achieved by threats of punishment through to the more refined incentives of the modern workplace or society, steps to enable an organisation to function efficiently through persuasion and manipulation. Mumford sees the production of the pyramids in Egypt as the dawn of a civilisation—civilisation that could control humans and material in the construction of 'mega-projects,' projects that are as much about the symbolic techniques of projecting and visualising power as they are about refining and expanding organisational skills (Mumford 1970: 249). In the twenty-first century our desire for remote control is perhaps best exemplified by technologies such as the 'rovers' used to explore Mars. For Mumford, the territorial expansion of all types of human organisation would not be possible without the complex range of techniques that enable leaders to control from a distance, techniques based on the view that people are often


'inefficient' and often resist attempts at control. In the age of the 'megamachine' and modernity, new technologies of communication made instantaneous control (and killing) possible. Anticipating debates about 'networked war' mentioned in previous chapters in relation to war in Ukraine, Mumford wrote in The Myth of the Machine of the problems that 'remote control' could have in wartime during the twentieth century, suggesting that experiments in decentralised organisations would be needed given the speed and complexity of the modern battlespace. As will be explored in the next chapter, in the first decades of the twenty-first century one of the key debates (and sources of political, legal and ethical anxiety) concerns not just the use of machines (drones, robots) by remote control but also machines (in the age of AI) that might be able to make decisions autonomously, independent of human control. Blade Runner is about what happens when the 'tool' devised for remote war is able to question and challenge its position as a tool. Modernity produces new possibilities for how to perform actions at a distance by remote control: to be able to organise your military activities over larger and more complex territories, to be able to organise more complex activities over more complex terrains and territories; to be able to distance military personnel from risk and danger in the battlespace (and also distance populations 'back home' from the realities of war). To be able to outsource more and more activities to machines, where machines can perform tasks—physical or intellectual—that humans would find hard or dangerous to do. Simply put, the technological age of modernity allows us to outsource as much action and activity as possible to machines (or to humans that may be 'valued' differently from the citizen of liberal democracy). When it is not possible to outsource activities to machines then the desire is to make the human as machine-like as possible. Indeed, one report suggests that Russia and the United States are in a military exoskeleton 'race' in the search to enhance the physical capabilities of soldiers through machinic additions and modifications of the body (Tucker 2018a). But the key tendencies driving the future of war are not simply about distancing the human body from risk and danger; they are also about doing things faster, being faster than your opponent in order to be able to prevent them becoming dangerous: the possibility of a type of future warfare described by Ian Morris in terms of 'robots with OODA (observe, orient, decide and act) loops of nanoseconds with humans with OODA loops of milliseconds' (Morris 2014: 374) and where 'neither side will really know whether it is winning or losing until disaster suddenly overtakes it or the enemy—or both at once' (ibid.: 377). War fought on multiple terrains with multiple technologies at speeds that require technological intervention. In this chapter I want to suggest that drones or robots might not radically transform the liberal way of war in a direct way, in terms of the tactics that liberal states are directly involved in. To be sure, they might continue those nocturnal practices that thinkers like Mbembe warn us about, the latest stage of the 'manhunt'


that Chamayou (2012) has written about. As Bauman puts it on the ethical and political problems of ‘invisible’ war outsourced to machines: The new generation of drones will stay invisible while making everything else accessible to be viewed; they will stay immune while rendering everything else vulnerable. In the words of Peter Barker, an ethics professor at the United States Naval Academy, those drones will usher wars into the ‘post-​heroic age’; but they will also, according to other ‘military ethicists’, widen still further the already vast ‘disconnect between the American public and its war’; they will perform, in other words, another leap (the second after the replacement of the conscript by a professional army) towards making the war itself all but invisible to the nation in whose name the war is waged (no native lives will be at risk) and so much easier—​indeed so much more tempting—​to conduct, thanks to the almost complete absence of collateral damage and political costs. (Bauman and Lyon 2012: 20) In other words, the outsourcing of war to machines will increase the temptation to wage necropolitical wars, what Ian Shaw (2017) describes in terms of robot wars for ‘predator empire.’ As the previous chapters suggested, this is a serious problem with the granularity of conflict where what our states do becomes potentially increasingly remote and invisible to the citizenry; it might also be the case that new machinic possibilities produce granular events of regional or international (and possibly unintentional) significance. At the same time, it might also be the case that drones and robots are used responsibly and become a vital element in disaster relief, the prevention of crime and terrorist attacks, robotic accident first responders and humanitarian drones, our machinic ‘friends with benefits.’ Awareness of the necropolitical possibilities of drones and robots might result in demands from the citizenry of liberal states for ‘responsible’ use; and concern over the legal and ethical implications might continue to seriously limit drone use as an element in the liberal way of war. But there is another concern that points to issues beyond many of the important ethico-​political problems that are being explored on ideas about ‘precision’ and drone strikes (Suchman 2020), race and drone war (Wilcox 2016), the emergence of ‘everywhere wars’ not limited by geography (Gregory 2011). Simply put, there might be a temporal dimension to the time of the drone war or strike where increasingly granular events/​strikes are elements in wars that are prolonged by machines that enhance or transform the capacity of actors. And it might not be the case that the ‘invisible wars’ are wars directly involving liberal states: in a type of war at a distance twice removed, powerful (liberal) states might provide less powerful states with the technology and expertise to fight their own ‘drone wars’ in a way that extends the life (and death) of a war. In this sense, the intensification of drone and robot war might transform the geography of international conflict—​and also the time of conflict. In other words, what liberal states do might be increasingly


protopian (or limited to assassinations or raids using drones and robots). But what liberal states enable other states and armies to do might be increasingly necropolitical.

Drone War and the Blade Runner State

The 'drone' (or unmanned aerial vehicle) has become one of the ominous symbols/technologies that appear to encapsulate how war is changing in the twenty-first century, the next step towards a future where war will resemble a scene from a Terminator or Transformers movie, the symbol of the 'predator empire,' unnecessary wars and unlawful killing at a distance. In this view, the drone is the latest attempt to distance human beings from the risks of war, to carry out 'action at a distance,' war by remote control: the drone may be piloted from a base in Middle America, from where the pilot and a team of experts will be able to manage surveillance operations at a level of granularity unlike anything seen before—and then maybe hunt and kill at a distance: as Grégoire Chamayou has argued, the drone is the latest stage of manhunting—and might, as this chapter suggests, increasingly become the time of the machinehunt or the cyborghunt. Caroline Holmqvist suggests, however, that

the advent of new technologies derided as 'non-human' beckons us to rethink the human in war. It is not simply that the human is written out of war by military robotics—rather, robotic technologies produce a number of paradoxical consequences precisely for how we think the human in war.
(Holmqvist 2013: 548)

There is, in other words, something different about a type of violence and war that is at 'once distant and close' and where 'drone warfare produces killing at a distance' while also creating a 'vivid experience for the individual seated in physical security in his control room, thousands of miles away from the "actual" action' (ibid.). Simply put, we do not yet understand how these new technologies of this 'invisible' war are not simply changing the character of war—they might also be changing what it means to be a human in an age of war by remote control; we might not be becoming 'replicants' but our use of the technology and our experience of this type of war might be changing us (and our societies) in ways we do not yet understand. The drone age heralds the promise of the surveillance we see, for example, in Blade Runner 2049 where K's drone is able to detect bones buried under the ground as he searches for a replicant. A world where states might be able to control and manage territory and populations at a distance, and with increasingly granular detail on the 'patterns of life' of those being watched—the ultimate dream for states and potential empires that are concerned about their 'overseas imprint.' The fantasy or desire for the 'everywhere war,' however, might confront technological


limitations, accidents and the unintended and unmanageable consequences of proliferation, a terrifying dystopian landscape where small drones 'launched individually or in groups will challenge the ability of individuals to seek sanctuary. Cover and concealment will be much harder because drone operators will benefit from the ability to be invisible' (Kurth Cronin 2022: 226). But there are other types of unintended consequence with the use of drones and robots. One art/activism project that has been discussed in debates about the use of drones during the Global War on Terror—the war that is generally felt to be the 'experimental laboratory' for this technology of war, surveillance and control—is the 'Bugsplat' project where groups of artists placed large images of children printed on fabric on the ground in sites where drones were used by the United States and its allies (Benedictus 2014). The perspective of the project is that the danger of drone use is that physical distance creates—to use Bauman's terms—moral distance and indifference. The figures that you watch on a screen are like the figures you see in a video game, their existence on the screen rendering them insignificant, an image to be erased or controlled, data to be processed or eradicated. But as Holmqvist and others have illustrated, the processes of surveillance produce new types of proximity—and trauma (Holmqvist 2013: 542). So, there is a wide-ranging debate underway on the legal, ethical and strategic implications of drone use in light of the first two decades of the twenty-first century (Chamayou 2015; Renic 2020). There is a concern with the psychological impact on populations in territories where drone use is pervasive and has become the norm (Stanford Law School 2012). There is a concern with the ethical implications of drone use in terms of collateral damage: one of the more 'protopian' views is that the age of drone war heralds an era of less unnecessary death and collateral damage in the hunt for the 'legitimate' target, as the precision of a drone strike makes killing more 'surgical' in the 'kill box.' But this view might be more protopian fantasy than reality. There is a concern that even if targeted killing becomes more precise and responsible, there are still problems with the process through which an individual or group becomes defined as a 'legitimate' target and the potential for failures in intelligence that lead to accidents and errors of judgement. In an essay on 'Algorithmic War,' Lauren Wilcox (2016) offers a troubling analysis of the 'necropolitical' violence of 'posthuman drone war' that illustrates how questions of race and gender combine to deadly effect in a way that takes us back into the 'nocturnal body' of the liberal state rather than opening up 'humane' new possibilities in the liberal way of warfare and conflict management. The implication of Wilcox's analysis is that it is unlikely that new and improved technology will take us beyond the nocturnal body that is at best indifferent to suffering and at worst celebratory in the killing at a distance. The use of drones and robots enables the possibility of the continuation of necropolitical strategies and policing projects, the policing wars on the 'planetary frontier' that Bauman warns could be the military obsession of the twenty-first century. And there is debate about whether the use of drones serves to generate a backlash that might be counterproductive in achieving strategic objectives,


fuelling a sadistic creativity in those that lack the technological capability of liberal states. In this view, the image of a 'high-tech' superpower or robotic military force generates anger and hostility, a sense of impotence in the face of overwhelming force that can result in a violent response: from this perspective, you might become more technologically advanced and sophisticated—safe from your remote and 'invisible wars'—and so the terrorist will bring back war and violence to the body, destructive events that are visible in the most immediate and brutal sense. Your invisible wars will be countered by sadistic killing, torture and mutilation of the human body. The more remote and technological state violence becomes, the more personal and 'low tech' the response will be (although with the images and footage that circulate via the latest technologies and platforms). The more technologically protected the liberal world becomes, and the more we outsource our violence, the more technologically enabled 'primitive' violence (through torture filmed on smartphones, cars turned into weapons) becomes our vulnerability. While the scale of terrorist violence might not (currently) be transformed, the intensity and creativity might be constantly evolving. But is the liberal way of war being transformed? Or are these new technological tools and prostheses (tools that will undoubtedly evolve in radical ways in coming decades) the latest enhancements in the tactics and techniques that states and other actors have always deployed? In other words, is the age of drone war not really a change in the 'character' of war but more a development of technological 'prowess'? Asfandyar Mir and Dylan Moore (2019) investigate the impact of the U.S. drone programme in Pakistan on insurgent violence, taking what they view as a 'holistic' approach to the researching of drone strikes. Using geocoded data on violence in Pakistan, they compare the evolution of insurgent violence before and after the launch of the U.S. drone programme in North Waziristan, finding that there was a reduction in the number of insurgent attacks per month. They argue that previous research has focused on the effect of individual strikes and the killing of leaders and the rank and file, what they call the kinetic effects. While they see an impact from these kinetic effects in terms of diminishing groups, they suggest that we are seeing signs of a deeper impact from drone surveillance and drone strikes: they argue that there are anticipatory effects which increase insurgents' perception of the risks of their activities. In this sense, the use of drones has a deterrent effect on the activities of insurgents—and potential insurgents—on the ground. The sense of living in 'the everywhere war' where you never know who is watching you might produce what has been described as 'deterrence by detection.' So, it might be the case that the age of the drone produces new possibilities for deterring violent actions. From the perspective of liberal societies, the 'dystopian' anxieties about drones (and the broader development of robotics in war) increasingly focus on how the use of drones will spread to other actors (non-state actors, terror networks and criminal


organisations), actors that might find creative means of strategic surprise. As the ‘cyberpunk’ writer William Gibson puts it in an interview: The strongest impacts of an emergent technology are always unanticipated. You can’t know what people are going to do until they get their hands on it and start using it on a daily basis, using it to make a buck and using it for criminal purposes and all the different things that people do. The people who invented pagers, for instance, never imagined that they would change the shape of urban drug dealing all over the world. But pagers so completely changed drug dealing that they ultimately resulted in pay phones being removed from cities as part of a strategy to prevent them from becoming illicit drug markets. We’re increasingly aware that our society is driven by these unpredictable uses we find for the products of our imagination. (Gibson 2011: 114) There are always new examples that point to some of the unintended uses of this emergent technology by non-​state actors and criminal organisations. For example, in May 2018 it was announced that an FBI hostage rescue team had set up an observation post to monitor a situation, but their ability to see what was taking place was disrupted when a criminal group launched a small ‘swarm’ of drones resulting in the team losing ‘situational awareness.’ The drones were directed to carry out a series of ‘high-​speed low passes at the agents in the observation post to flush them [out]’ (Tucker 2018c). Mexican drug cartels are alleged to have been using drones to smuggle drugs into the United States. And it is not simply in the crime ‘hotspots’ of powerful global organised crime where drones are being used: it was reported that criminals were using ‘off the shelf’ drones in order to ‘scope out’ residential properties to burgle in Suffolk, United Kingdom, a county more associated with fields and farms rather than hi-​tech crime (Russon 2015). The way drones are being used is becoming increasingly creative: there are reports of criminal organisations using drones to orchestrate smuggling operations where they monitor the movements of port authority workers: if a worker is seen to be getting too close to a shipping container that holds illegal goods or substances then a false alarm can be activated to disrupt the inspections (Tucker 2018a). This is possibly the tip of the iceberg in terms of the unpredictable uses of these products of our imagination and the future of crime will undoubtedly be shaped by the creative uses of emergent technologies. But at the same time, the protopian will reply that we will counter the unpredictable uses of drones and robots with technical fixes and policy/​legal responses. Simply put, there will undoubtedly be plenty of future crime events that look like they were imagined by writers like William Gibson. And many of these future crime events will creep into what some would describe as the ambiguous grey zones of international politics. States will be constantly attempting to manage and predict the new uses of these emerging machinic possibilities. Liberal states will continue to use drones in times of war


and peace and there will likely continue to be a concern to restrict the necropolitical potential of technologies that will continue to evolve and mutate. In this sense, rather than becoming war machines of necropolitical Terminators, liberal states and militaries might become the Blade Runners focused on controlling, neutralising and eradicating the threats that emerge from drone use in a time when the technology becomes widely available for war, terrorism and crime. This role as the Blade Runners of the global security environment might extend to a range of emerging trends in technologies in times of open technological innovation; and there may be ethico-political ambiguities and uncertainties with this role as Blade Runners just as there are in the film. There have been some widely reported events where non-state actors have attempted to 'weaponise' drones. Authorities in Baja California reported that a drone carrying two deactivated grenades landed on a property owned by the state's security chief: the act was seen as a response to the state's fight against organised crime. One of the more ominous signs of what might be on the horizon occurred on August 4, 2018, when the Venezuelan President Nicolas Maduro was delivering a speech outdoors, surrounded by his wife and officials, and in footage that went viral on social media, there appear to be explosions in the sky above them. Government reports suggested that this was an attempted drone assassination of a head of state (Anderson 2018). Although no one was killed (the event did create a stampede), the event can be seen as a tactic of generating unease and paranoia about the state's inability to protect both its key officials and citizenry from skies that might come to feel increasingly chaotic and anarchic. In the United States there is unease about the potential for disruptive events in the 'most tightly controlled airspace in the country' when, for example, an off-duty government employee crashed a drone into the White House lawn in 2015 and when a man was able to fly a drone over a football stadium in California with the intention of dropping political leaflets. A commercial drone accidentally crashed into an electricity wire in California in the summer of 2017, cutting the power for around 1,600 people for two hours. One expert suggested in response to these events that the ability to curb the threat of commercial drones 'is very limited right now. The problem is that the popularity is growing exponentially. … How do you identify the nefarious actor from the hobbyist?' (Green 2017). The Blade Runner state will have to find ways to police who is the drone-hobbyist and who is the criminal/threat. The Israeli military reported that it had shot down drones operated by Hezbollah off the coast of Haifa (Bergen and Schneider 2014). There were reports that in the civil war in Yemen between the government and the Houthis, drones were used for both surveillance and airborne attacks, drones that can travel at 150 miles per hour, expanding the geographical reach of rebel groups and enabling conflict to extend into shipping lanes in the Red Sea.


Tactics emerge that use drones in conjunction with other 'creative' techniques and technologies of contemporary war. In Urban Warfare in the Twenty-First Century, Anthony King examines the 'intense resistance' from ISIS as Iraqi Security Forces began to advance on Mosul in 2016. ISIS used subterranean passages and improvised explosive devices (IEDs) such as mines and booby traps. But the most 'feared and effective' weapon was the suicide vehicle-borne improvised explosive device (SVBIED), where hundreds of armoured vehicles had been camouflaged to look like civilian vehicles. ISIS commanders observed the Iraqi Army through the use of remotely controlled drones in order to direct the vehicles in a way that would inflict the most damage, culminating in a total of 482 suicide vehicle attacks in Mosul (King 2021: 4). The problem becomes whether drone technology will develop in such a way that relatively small groups with limited resources will greatly expand their capacity through the use of drone-like technology that increases in complexity (able to perform a range of tasks, in a range of terrains) and lethality (combined with the arts of deception and camouflage in King's example); or whether, regardless of the transformation and evolution of the technology, Blade Runner states will have more techniques to control and neutralise the potential lethality of the technology and its ability to enhance the capacity of small groups and networks. So, there are many examples of events that give a sense of the uses that drones and robots may have for criminal organisations, terrorist groups and individuals who seek to create destructive events or to help their criminal activities. But the liberal protopian may well argue that the significance of drone use is overstated both in terms of contemporary warfare and contemporary crime. In this view, the use of drones is not a game changer: it does not radically transform the capabilities of a non-state actor by enabling a destructive event that it could not have produced before. Simply put, there are other ways of getting drugs into a prison or assassinating or threatening a political figure; the suicide bomber and the use/exploitation of the human body might remain the most lethal 'tool' for 'remote control' violence and destruction. One of the key questions is whether—as drones and robotic technologies evolve—there will be effective 'technical fixes' to prevent them becoming more serious problems in a 'dronescape' that becomes increasingly complex and dangerous. While there are troubling events, as with cyber, many of the accidents might be the 'teachable moments' and learning experiences that will be patched up and corrected. There might be a range of technical fixes that ensure that concern about non-state actors and individuals using drones and robots is a 'hyped up' fear fuelled by our fascination with horror films involving dangerous micro-drones (such as the b-movie The Tangle) or killer drones and deadly 'slaughterbots.' Terrifying scenarios but not machinic possibilities that transform the character of war; and for the liberal state there will be an ongoing debate about the legal and ethical implications of machinic possibilities that might be viewed as being in their early stages. But there might be emerging signs that point to the transformation of war and international conflict through the use of drones and robots, new balances of


technical-military power. For example, there were reports that the U.K. military embarked on a new drone programme after Azerbaijan's use of drone technology was viewed as a key factor in its success in its conflict with Armenia over a disputed Caucasus region. Footage circulated showing strikes against Armenian positions combined with the use of drones to call in rocket fire from other locations. The Turkish TB2 drones were seen to be rapidly altering the military balance in the region and The Guardian reported that the drones 'have a much shorter operating range of up to 150km, but are able to loiter in the air for up to 24 hours. Because they are cheaper, military forces can afford to lose some in action' (Sabbagh 2020). Due to the way that the drones can lower the cost of warfare, NGOs are concerned that the proliferation of cheap drones could increase lethal conflict in disputed areas. Simply put, the weapons could produce more temptations to fight. But there will undoubtedly be new tactics and techniques in response to the potential lethality of the congested drone battlespace:

In an age of highly proliferated sensors and shooters, militaries will need to consider new ways to camouflage and harden their forces. Ground force tactics of dispersal and deception ought to be reinvigorated. Soldiers should train to limit their electronic and thermal signatures for longer distances and times.
(Shaikh and Rumbaugh 2020)

The video imagery available from the conflict between Azerbaijan and Armenia apparently illustrates that neither side had training or resources for 'passive defence.' Similar to points raised in the previous chapter about the problems of movement in a congested terrain, there appears to be conceptual and practical uncertainty and confusion about emerging battlespaces: 'We see this time and time again with both sides operating out in the open, static or moving slowly; poorly camouflaged; and clumped in tight, massed formations' (ibid.). In the Russo-Ukrainian war there appear to have been attacks on Russian command-and-control posts that have undermined the Russian ability to plan and co-ordinate operations due to technologies that can produce destructive results on a battlefield that is increasingly transparent and vulnerable (Beagle, Slider and Arrol 2023). As Seth J. Frantzman declared after the invasion of Ukraine, the drone era has fully arrived and Ukraine has 'ushered in a new era of warfare.' Writing in March 2022, Frantzman declared that the Russians relied 'on missiles for deep strikes into Ukrainian territory while the defenders have been able to contest the airspace by employing drones' (Frantzman 2022). Frantzman suggests that the war in Ukraine is a 'turning point' in drone warfare:

The first great drone superpower, the United States, used its unmanned aerial vehicles in places like Afghanistan where few fighters had the technology to shoot them down. But Ukraine isn't primarily using drones to hunt people, loitering over targets for days; rather, it's using them to go after Russian


armoured vehicles and supply columns. This seems like a strategy designed particularly for Russia. Moscow's military theory has always been to establish just enough territory to set up ferocious artillery units, using heavy armour primarily to defend ranged firepower before storming in once enemy positions have been flattened. Drones have fundamentally shaken Russian strategy.
(Ibid.)

Frantzman suggests that the combination of cheap anti-tank missiles and drone reconnaissance made Russian armour less effective, especially given the low cost of the Bayraktar Turkish drones. What is surprising for Frantzman is that Russia failed to develop attack drones even when commentators were declaring the end of the tank after the effectiveness of drones was illustrated in the Nagorno-Karabakh war in 2020; Frantzman suggests Russia will have to invest in new drone technology and transform its tactics for twenty-first-century war; while he doesn't comment on the broader implications of drones for the future of the war, he does conclude with the suggestion that drones are 'the weapons of choice for countries that need an instant airforce. It has become the sling in the proverbial David and Goliath fight taking place in Ukraine' (ibid.). To be sure, it is difficult to see how significant drones will be in either encouraging or deterring future wars. As Frantzman concludes, Russia (and other states preparing for future wars and invasions) will have to rethink its technology and tactics in light of the vulnerabilities generated by these new machinic possibilities: other states will be watching and studying the war in Ukraine closely and possibly developing countermeasures to deal with their own Ukrainian defenders. This new age of drone war might change the character of conflict in the way enhanced capability generates temptations and opportunities. The use of drones might prolong the time of conflict in the painful and destructive mismatch between hierarchical war and networked responses. The Blade Runner state might confront a time of global disorder and drone wars where the role of the liberal order will be to assist and train other forces rather than wage its own futuristic wars of drone swarms of all shapes and sizes; the Blade Runner state will be seeking to contain and control the global risks posed by new drone and robotic weapons; to exploit the vulnerability of the increasing transparency of the battlefield and to defend against multidomain attacks on command-and-control posts (where the use of drones will be one of many tools of attack). But at the same time, the possible prolonging of deadly drone war through the enhanced capacity of the networked defender might become a form of deterrence; states thinking about future invasions might think twice in light of the Russian experience. The military–technical response might become deadlier and more 'efficient' as a response to a more 'superior' and hierarchical military; the hierarchical and superior, in response, might become more effective at using the tools and tactics of drone war to counter states like Ukraine; the creation of networked warfare in the structure of what remains a hierarchical war machine


(if that is possible). The networked war machine might confront the ‘hierarchical’ networked war machine in a way that leads to war that is finished very quickly—​or that is prolonged in destructive stalemate with waves of innovations in drones and tactics. But all militaries will be obsessed with finding advantage and surprise in the messy battlespaces of granular, machinic warfare. Or any advantage might be cancelled out with new tactics of jamming, blocking and disrupting. After the events in Ukraine in 2022, there is an image of drone war in terms of granular awareness, technical cunning, the networked war against the hierarchical war, the small against the big, the informational against the industrial. But future drone war might result in war so messy and confusing it will act as a deterrent for all states—​including superpowers/​great powers like the United States who will prefer the role of Blade Runner state to Terminator state. We might encounter the speed and complexity of a conflict with ‘robots with OODA loops of nanoseconds with humans with OODA loops of milliseconds’ (Morris 2014: 374) and where ‘neither side will really know whether it is winning or losing until disaster suddenly overtakes it or the enemy—​ or both at once’ (ibid.: 377). This technological complexity in the battlespace might make the ‘fog of war’ foggier: the more technological tools that you have at your disposal the greater the possibility is for accidents, confusion and exploitation of the systems you rely on; and if you decrease your use of the various technologies you depend on then you might simply become overpowered and overwhelmed by the assemblage of machines. It was reported that American troops in North Eastern Syria during 2017/​2018 experienced Russian jamming devices, encountering what a U.S. Army Colonel described as a ‘congested… electronic warfare environment’ that became an experimental laboratory for examining new techniques of disruption where environments become foggier as communications did not work, radars were jammed, and situational awareness closed down. A retired Army colonel commented: ‘[It] can be far more deadly than kinetics simply because it can negate one’s ability to defend one’s self’ (Seligman 2018). As it was suggested in the chapter on cyber, closing down the ability of states and militaries to communicate, limiting their ability to perform action at a distance or by remote command and control, might be the most vital element of future war. So what I think these examples point to is a future where criminal groups and non-​state actors will continue to explore machinic tactics in ways that might produce catastrophic events—​or lead to events that might be a ‘low level’ nuisance that is unlikely to pose a serious problem for states, police and militaries; as with cyber, there will be a constant process of innovation in the techniques to block and control the evolving drone and robotic possibilities used by criminal or terrorist groups; groups and individuals will constantly search for the vulnerability that will allow their creative criminal activity or terrorist attack. While this area of drone crime and conflict might not look like a serious security problem in the 2020s, this might dramatically change in the coming decades.
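A rough arithmetic aside may make the scale of the mismatch Morris invokes concrete. The short sketch below, in Python, is purely illustrative: it simply assumes the figurative cycle times quoted above (machine decision cycles measured in nanoseconds, human ones in milliseconds) rather than measurements of any actual robotic or human system.

# Illustrative only: cycle times taken from the figurative contrast quoted above
# (Morris 2014), not from any real system.
MACHINE_CYCLE_S = 1e-9      # assumed machine OODA cycle: one nanosecond
HUMAN_CYCLE_S = 1e-3        # assumed human OODA cycle: one millisecond
HUMAN_REACTION_S = 0.25     # a fast human reaction time of roughly 250 ms

# Machine decision cycles completed inside a single human cycle.
print(round(HUMAN_CYCLE_S / MACHINE_CYCLE_S))      # about 1,000,000
# Machine decision cycles completed inside one human reaction.
print(round(HUMAN_REACTION_S / MACHINE_CYCLE_S))   # about 250,000,000

On these assumptions, a machine-speed opponent completes roughly a million decision cycles inside a single human one, which is one way of reading Morris's warning that disaster may overtake a human force before it even registers what is happening.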


But in terms of the liberal way of warfare, these machinic possibilities will likely intensify the 'congested' environments that result in fewer troops (from liberal states, unless they are special forces or 'little green men') placed in risky environments (especially if they are in a situation where they are dealing with an enemy being supplied with the latest drone technology by a rival power). The use of drones might also deter individuals and groups from carrying out certain actions; what Antoine Bousquet (2018) describes as the 'eyes of war' will multiply and intensify in their global/local/granular power in ways that might create a multitude of 'all seeing eyes' that will shape the actions of liberal warfighters as much as they might change the behaviour of those less concerned about the legal and ethical consequences of their actions on what will be a 'world stage' (wherever you may be) made possible by the unavoidable vision machines. The liberal way of drone war will likely continue to be shaped by legal and ethical concerns, especially as the technologies and tactics evolve in possibly radical and dramatic directions; economic weapons or influence operations will be preferable to the direct use of the battle angels of our better nature. But while liberal states might be cautious about their direct use of the new drone technologies, their role in conflicts around the world might be to provide technology, intelligence and training for wars in states of emergency where the normal rules no longer apply, necropolitical wars where the calculations of risk, responsibility and law disappear.

Protopian Drones?

From the more dystopian perspective on drones and future robotics there are a number of reasons to be concerned about the development of this weaponry where innovations could lead to a fast, brutal new age of warfare where we will all be vulnerable, home and abroad. The drone age could be an age of conflict and crime where non-​state actors might be able to improve and transform their capabilities through their machinic enhancements. Or a scenario that is equally as troubling but highly likely is of liberal states enhancing the capabilities of states to fight machinic wars for conflicts that will likely be shaped by an arms race in tactics and technologies of drone and robot war, experimental wars that liberal states will keep a safe distance from—​but will be closely involved in supplying and shaping. Whether this is the latest stage in our necropolitical tendencies to support war at a distance—​or the machinic innovation that will ultimately contain the dangerous desires of world leaders—​will undoubtedly be one of the strategic and ethico-​ political questions in a world dealing with the implications of the Russo-​Ukrainian war. But it seems that whatever the future impact of drones, the Blade Runner state will be seeking to police and control the dystopian world that it helped create—​and that its nocturnal, replicant body is continuing to cultivate and assist in the shadows of drone wars around the planet. To be sure, we are only at the beginning of these innovations in drones and robots: while it is unlikely that by 2049 we will have Blade Runner-​style ‘replicants’


with such advanced AI that they will be replacing police and soldiers (or the 'almost human' robots that can be inhabited by a human remote controller like we see in William Gibson's The Peripheral), there will no doubt be innovations in drone technology that will have disruptive impacts on all aspects of society, business and war. Before we see the emergence of the replicant, we will possibly see the augmentation of the human being through technological prostheses and chemical enhancements, the emergence of 'pharmacological supersoldiers' (Bickford 2021). For example, there is the possibility of a swarm of drones piloted by a human with an implanted 'brain chip': an innovation that emerged from research that allowed a paralysed woman to steer a virtual F-35 Joint Strike Fighter with a chip that had been surgically implanted. The director of DARPA's biological technology office commented on these innovations on drones and brains: 'As of today, signals from the brain can be used to command and control … not just one aircraft but three simultaneous types of aircraft' (Tucker 2018b). There will undoubtedly continue to be unintended consequences and ethical dilemmas as humans are transformed by new technologies of robotics and biology. But from the protopian view, there will be discussion in the public sphere and technical and policy 'fixes' will emerge that will manage the dangers of these machinic possibilities, the legal and regulatory steps that will contain and manage potentially dystopian and necropolitical innovations that risk making machines more like humans or humans more like machines. But in the more protopian view, the drones and robots that are created out to 2049 will feel like products of science fiction, but they will be managed and regulated: we will learn to live in harmony with the futuristic realities that will be designed and manufactured in laboratories around the world: a future that will look and feel like worlds and technologies created and designed by Apple rather than a dystopian science fiction horror b-movie. And as has already been suggested, just because two states have the latest in drones of all shapes, sizes and destructive capacities does not mean they will be fighting each other; the messiness of such intense machinic conflict might be of limited use from a military point of view: deterrence by drone denial, entanglement and punishment. Terrorists who can use the latest innovations will be contained by the policing and surveillance of the Blade Runner state. From the liberal protopian perspective, we are societies that continue to explore our anxiety about the technological futures we might be creating; Blade Runner 2049 is tapping into our fears about the future of the human being and the question of how we might treat these products of our technological ingenuity and imagination. In addition, there are many intellectuals, activists and organisations raising questions about war and our machinic futures. For example, The Campaign to Stop Killer Robots has been one of the most visible organisations that is attempting to shape the design of future weapons, to 'prevent the development of systems in which computers and sensors can independently "decide" where violent force should be applied' (Rees 2018: 82). An international coalition of non-governmental organisations that


includes roboticists, ethicists, disarmament and peace activists, the group emerged in New York in 2012 in response to growing fears about autonomy in weapons systems and the rapid advances in AI and sensors that are making possible new technologies for autonomous weapons. There have been discussions in the United Nations about the emergence of these weapons and the heads of 100 companies in this area signed an open letter calling for the UN to outlaw lethal autonomous weapons. (ibid.: 100). The signatories to the letter wrote in terms of the emergence of an electronic battlefield ‘at a scale greater than ever, and at timescales faster than humans can comprehend’ (ibid.: 101). In 2016, a group of concerned scientists, researchers and academics, including theoretical physicist Stephen Hawking and Elon Musk argued against the development of autonomous weapons systems, warning of the dangers of an AI arms race and called for a ban on offensive autonomous weapons beyond meaningful human control (Boulanin and Verbruggen 2017): the question of AI and warfare will be discussed further in the following chapter. In liberal democracies there remains a reluctance to remove the human from ‘the loop’ and there are critical perspectives emerging from all quarters, including the military: America’s second highest ranking military officer has advocated ‘keeping the ethical rules of war in place lest we unleash on humanity a set of robots that we don’t know how to control’ (Browne 2017). Outlining the position of The Campaign to Stop Killer Robots, Richard Moyes (2023) writes about a dangerous ‘bureaucratisation of violence’ that could emerge when you hand decisions about who lives and dies to machines shaped by sensors and algorithms. In other words, the possibility of the type of future that haunted Zygmunt Bauman in Modernity and the Holocaust on the dangers of the ‘civilising process’ in technologically complex societies where machines create deadly new possibilities for bureaucrats to explore. Those sensitive to the ‘nocturnal’ possibilities of the liberal state will argue that for all the moral anxiety about this new age of war, we will do everything other states do in terms of our development of new war machines because to not ‘match’ our enemies would leave a population or military vulnerable. But the optimistic protopian view is not simply that we learn to ‘cope’ with lives increasingly entangled with futuristic technologies, to survive on a planet congested with deadly machines and battle angels: we will focus on ways for the technologies to improve our lives. In the first two decades of the twenty-​first century, the drone might be the symbol of the changing character of war during the War on Terror, the symbol of an age of machinic war orchestrated by liberal states that want to fight ‘invisible wars’ with limited risk to their warfighters. But for the protopian, in the coming decades the drone will be more associated with a symbol of rescue and protection for people in need, a machine more known for the ways it can help people in disaster zones, protecting human and non-​human life around the planet. For example, there is a growing interest in how drones can be used responsibly in wildlife research (and also on the potentially disruptive impact of this new type of wildlife surveillance) or in humanitarian missions (Hodgson and Koh 2016). Some commentators have taken inspiration from Isaac Asimov’s Three Laws of Robotics


and developed laws for the drone age, laws that try to ensure that drones will not be used as instruments of a surveillance state and will always be visible (in an age of micro-drones), designed to be helpful to the public: Visible in the water as a friendly complement to lifeguards. Works to keep swimmers of all ages safe while continuously scanning for unseen hazards (like rip currents and sharks) and performing ongoing water-quality testing. A robotic watercraft so a lifeguard is always on duty. (Liddell and Ross 2018) The drone and the robot will be our machinic companions, the drone pets that protect us, the machinic friends with benefits; there will be ethico-political problems ahead, but not the dystopian science fiction of The Terminator; rather, the more complex issues raised in books such as Autonomous by Annalee Newitz where the imploding boundaries between human/machine generate difficult existential (and relationship) issues. In this protopian view, drones and robots will have a radical impact on everyday insecurity, producing new strategies of deterrence that play out across all areas of domestic and international life, producing 'anticipatory effects' that transform existence along with the emergence of new techniques of protection. While we might not be in the realm of Blade Runner 2049, we will be living with robots and machines in a way that transforms the environments we live in, (un)natural habitats that make us feel more secure in the world rather than more alienated. But as Kristen Bergtora Sandvik and Kjersti Lohne (2014) suggest in a discussion on the possibility of the 'humanitarian drone,' we are in a moment where we have 'immature' concepts to discuss 'immature' technology; and once we begin to unpack the future of the humanitarian drone, we begin to open up a range of troubling questions about the grey areas (such as the possibility of drone strikes as part of humanitarian interventions). In other words, the ambiguity between the protopian and the necropolitical will not disappear from the future civilising process but might become even more urgent and challenging (and possibly move into realms of ethico-political complexity we cannot currently imagine). The liberal protopian will argue that the representation of the drone as a dystopian tool of war during the War on Terror has been a distortion that hides the 'progress' in warfighting: while there might have been mistakes and accidents, the broader picture is of more humane possibilities for warfare. Indeed, Steven Pinker sees the emergence of drone war as a direct outcome of the civilising process that continues to transform life. Pinker comments that the 'roboticizing of the military' is a 'manifestation' of the trends that he outlines in The Better Angels of Our Nature; these technologies for 'roboticizing' the military have been developed at 'fantastic expense because the lives of their citizens (and, as we shall see, foreign citizens) have become dearer' (Pinker 2011: 257).


The liberal protopian position often points to the careful control and regulation of drone use; in this view, drone war is not the ‘video game’ Manhunt orchestrated by the ‘predator empire’ that it is sometimes presented as. Bradley Strawser makes this case: My view is this: drones can be a morally preferable weapon of war if they are capable of being more discriminate than other weapons that are less precise and expose their operators to greater risk. Of course, killing is lamentable at any time or place, be it in war or any context. But if a military action is morally justified, we are also morally bound to ensure that it is carried out with as little harm to innocent people as possible. The best empirical evidence suggests that drones are more precise, result in fewer unintended deaths of civilian bystanders, and better protect their operators from risk than other weapons, such as manned aircraft, carrying out similar missions. Other things being equal, then, drones should be used in place of other less accurate and riskier weapons. But they should be used only for morally justified missions, in pursuit of a just cause. Thus, my claim about drones is entirely conditional: they should be used only if the mission is just. As with all conditional claims, if the antecedent is false, then the entire claim is invalidated. In this case, if the current US policy being carried out by drones is unjust and wrong, then, of course, such drone use is morally wrong, even if it causes less harm than the use of some other weapon would. (Strawser 2012) The position here is that targeted killing can be justified as long as there are solid foundations to the processes that lead to the point where killing is permitted—​and that the war or conflict is a just and necessary war. At the same time, there is the view that targeted killing has too often been used irresponsibly and a lawyer for the IDF has argued that the legal foundations for the laws of drone warfare have resulted in this situation: All these extraordinary measures that you hope will remain extraordinary—​the risk is always that they become a little less extraordinary. And especially if the lawyers told you it’s legal! And if it’s legal, why wouldn’t you do it? But we often confuse what’s legal with what’s moral and what’s wise. (Barshad 2018 ) But the protopian will argue that this is a process whereby mistakes will be rectified or erased as we move forward: if a technique or technology is being abused then we have the ‘openness’ to be able to deal with it—​and we have technology that will improve in its precision, its potential for granular violence and control.


From the protopian perspective, the coming decades will see greater awareness of the dangers and temptations of this time of drones and robotic war; and all actors involved in the processes resulting in targeted killing will be aware of the various consequences of their actions. We are currently going through a learning process—a learning process where people still die but, from the Pinker-ish view, fewer people will die in the future. In the protopian view, this new age of machinic warfare and humanitarian drones will continue to produce benefits that far outweigh the costs. Not only will drones and robots help humans cultivate the better angels of our nature through increasing precision and innovations in non-lethality, drones and robots will become our better angels, helping us and protecting us against all the insecurities of existence. But from a bleaker, more necropolitical view, we will see an arms race in the technologies and tactics of drone and robot war that will generate the new battle angels of our nature, battle angels that might deter war—or might make warfighting deadly in ways that change the character of war beyond what we can imagine. Or the 'third way' in the liberal way of war might be in the way the battle angels are provided to others to fight their own machinic war, a move that enables liberal societies to keep moral distance from their necropolitical entanglements. Protopian drones at home; necropolitical drones overseas. Cautious and careful use of drones in the liberal way of war on a planet of chaos and disorder where liberal states develop and provide the necropolitical technologies used by others; Blade Runner states at home that seek to police and control the machinic threats that emerge from the nocturnal, replicant body of the liberal state.

Concluding Remarks

So, the future of drones and robots leaves us with protopian possibilities on how life will be improved by these new ‘tools’ but, at the same time, it is hard to escape the necropolitical visions on how machines could transform war and violence, generate new accidents and violent geopolitical temptations. In The Eye of War, Antoine Bousquet concludes his exploration of the different ways that states and militaries have used ‘vision machines’ to see the battlefield by suggesting we are living in a time caught between the mobilisation of ‘circulatory networks’ and the ‘roving crosshairs of a global imperium’ where we are all ‘watched over by machines of glacial indifference’ (Bousquet 2018: 196). For Bousquet, the glacial indifference is not the detached vision of disinterested vision machines monitoring the mundane business activities and movement of people across the planet but indifference in the sense that Bauman writes about in Modernity and the Holocaust: the indifference that results in new ways of killing or controlling individuals or groups. But how drones will evolve and be used is intimately connected to the type of societies/​world orders that emerge in the decades ahead; the positive and negative aspects of drones and robots will be shaped by (and will possibly shape) the different political and economic worlds we will inhabit in the twenty-​first century (ranging


from the spread of the protopia through to climate emergencies, dangerous great power competition, global economic crisis and inequality). In other words, we need to be careful not to think the future of drones will unfold in a world order similar to the one we currently live in; we need to think about the interactions between different social, (geo)political and economic environments and the emerging technical realities and possibilities. The question of different world orders will be explored in the concluding chapter of the book after a discussion of the possible impact of AI on war and international politics, which takes the themes of drones and robots into a broader examination of technology and future wars. The use of drones will likely become an important element in the production of a transparent battlefield that creates the possibility of increasingly granular strikes/sabotage on command-and-control posts; the vulnerability of command-and-control posts from this multidomain warfare might itself act as a strategy of deterrence for any leader thinking about an action that will expose a military to the dangers of a transparent battlefield. After the Russo-Ukrainian war, militaries will need to devise technical and organisational solutions to the different types of attack they might be subject to; the anxiety will be that even the most intensive, extensive and expansive mosaic warfare might confront a creative opponent that can exploit a vulnerability in command and control through a tactic and technology that emerges in a time of open technological innovation. Beyond the vulnerability of command and control that drones might contribute to, there is also the role that drones might play in producing public fear and anxiety in the way they can be used by a supposedly 'inferior' actor to enter cities that leaders have tried to keep distant from the wars they are waging; it is not simply that the drone might prolong the time of war, the drone might expand the space of war. For example, in July 2023 it was reported that Ukraine launched a drone attack on Moscow's Vnukovo International Airport; there had been reports of earlier attacks on residential buildings. What damage will the drone of 2040 be able to inflict on a leadership anxious about a domestic backlash? Will there be technical fixes that will prevent the use of long-range drone attacks? Or will the destructive potential enter into the calculation about embarking on a 'special military operation' with a state on your border? But there is another aspect to the use of drone strikes and international politics when we think about the creativity of impure war and the granularity of conflict. In January 2020, the Iranian General Qassem Suleimani was assassinated by a U.S. drone in Iraq. For some, the killing pointed to creative and granular possibilities that break the rules of states that purport to support a rules-based order. So not an extension of human capacity in the way K uses the drone at the start of Blade Runner 2049 but a radical transformation in terms of what a great power is willing to do and how it can do it. The orchestration of a destructive and disruptive event that is granular in its 'realisation' and performance but possibly an event of immense regional (and possibly global) significance. But the technological ability and political willingness to orchestrate such an act at


a distance and with such lethal precision does not mean that the complexity of the strategic consequences has been considered critically and openly. Indeed, The Guardian provided an analysis of the broader regional implications of the event that suggested that few of the implications had been 'gamed' prior to the killing, suggesting that Suleimani's death might turn out to be a defining moment in the Middle East but 'perhaps for different reasons than friend or foe had realised' (Chulov 2020). To be sure, it might be the case that the use of this type of drone strike ultimately results in a more orderly and secure world. But it might also be the case that this liberal way of impure, granular and machinic war might lead to increasingly irresponsible acts with dangerous unintended or unforeseen consequences. James Der Derian has written about the dangers of the liberal desire for virtuous war (developing humane war, ethical foreign policy, the liberal way of war) and its relation to the virtual (the use of cyber, drones, robots) and he comments in an interview: I like to believe it's a felicitous oxymoron, in the sense that you can have this tension between people who believe you can use war to achieve ethical aims—that's the virtue part of it—and the virtual, how you can fight wars now from a remote distance and have minimal casualties on your own side. But the harm, I think—and the reason why I attempted to capture this contradiction of virtuous war—is the belief that you can use military violence to resolve intractable political problems. If you have technological superiority, and you believe in our ethical superiority, these factors combine to a very nasty effect, which is that you defer civilian diplomatic action and give the military the opportunity to step into this vacuum and offer up solutions. (Der Derian 2004: 179) What Der Derian is suggesting is that our ability to do something quickly and efficiently might close down discussion on the ethical, political and strategic consequences of the act. We might be entering a time when machinic and granular possibilities open up new tactics of creative destruction that result in dangerously ungranular consequences. Events that rarely repeat themselves and take an enemy by surprise. Events (or accidents) that produce infrastructural, ecological, political, and humanitarian disasters beyond what was planned or imagined. Of course, we are possibly only at the beginning of a new age of machinic war, a future that we cannot imagine with technologies we cannot currently imagine; futures that will make Blade Runner 2049 look rather one dimensional, crude and simplistic. What will a drone look like in the coming decades? What shapes, sizes and capabilities will drones have? And what happens when the machinic possibilities extend beyond issues of 'remote control'? We might begin to see some of the more radical possibilities when we begin to think about AI, the subject of the following chapter.


Bibliography Anderson, Jon Lee. 2018. ‘An Assassination Attempt By Drone Is Just The Latest Moment of Chaos in Venezuela,’ The New Yorker, 6 August: www.newyor​ker.com/​news/​news-​desk/​ an-​assass​inat​ion-​atte​mpt-​by-​drone-​is-​just-​the-​lat​est-​mom​ent-​of-​chaos-​in-​venezu​ela Barshad, Amos. 2018. ‘Extraordinary Measures,’ The Intercept, 7 October: https://​thein​terc​ ept.com/​2018/​10/​07/​isr​ael-​palest​ine-​us-​drone-​stri​kes/​ Bauman, Zygmunt and Lyon, David. 2012. Liquid Surveillance: A Conversation (Cambridge: Polity). Beagle, Milford, Slider, Jason C., and Arrol, Matthew. 2023. ‘The Graveyard of Command Posts: What Chornobaivka Should Teach Us about Command and Control in Large-​Scale Combat Operations,’ The Miliary Review, May-​June: www.arm​yupr​ess.army.mil/​Journ​ als/​Milit​ary-​Rev​iew/​Engl​ish-​Edit​ion-​Archi​ves/​May-​June-​2023/​Gravey​ard-​of-​Comm​ and-​Posts/​ Benedictus, Leo. 2014. ‘The Artists Who Are Giving a Human Face to the US’s ‘Bug Splat’ Drone Strikes,’ The Guardian, 7 April: www.theg​uard​ian.com/​world/​shortc​uts/​2014/​apr/​ 07/​arti​sts-​give-​human-​face-​dro​nes-​bug-​splat-​pakis​tan Bergen, Peter and Schneider, Emily. 2014. ‘Now ISIS has Drones,’ CNN, 24 August: http://​ edit​ion.cnn.com/​2014/​08/​24/​opin​ion/​ber​gen-​schnei​der-​dro​nes-​isis/​ Bickford, Andrew. 2021. Chemical Heroes: Pharmacological Supersoldiers in the US Military (Durham, NC: Duke University Press). Boulanin, Vincent and Verbruggen, Maaike. 2017. Mapping the Development of Autonomy in Weapons Systems, Stockholm International Peace Research Institute: www.sipri. org/​publi​cati​ons/​2017/​other-​publi​cati​ons/​mapp​ing-​deve​lopm​ent-​auton​omy-​wea​pon-​ syst​ems Bousquet, Antoine. 2018. The Eye of War: Military Perception from the Telescope to the Drone (Minneapolis, MN: University of Minnesota Press). Browne, Ryan. 2017. ‘US General Warns of Out-​of-​control Killer Robots,’ CNN Politics, 18 July: https://​edit​ion.cnn.com/​2017/​07/​18/​polit​ics/​paul-​selva-​gary-​pet​ers-​aut​onom​ous-​ weap​ons-​kil​ler-​rob​ots/​index.html Chamayou, Grégoire. 2012. Manhunts: A Philosophical History (Princeton, NJ: Princeton University Press). —​—​—​. 2015. Drone Theory (London: Penguin). Chulov, Martin. 2020. ‘Impact of Suleimani’s Death Is Playing Out in Unexpected Ways,’ The Guardian, 12 January. Der Derian, James. 2004. ‘James Der Derian on Imagining Peace,’ in Mau, Bruce (ed), Massive Change: A Manifesto for the Future of Design (London: Phaidon Press). Frantzman, Seth. 2022. ‘The Drone Era Has Arrived,’ The Spectator, 26 December: www. specta​tor.co.uk/​arti​cle/​most-​read-​2022-​the-​drone-​era-​has-​arri​ved/​ Gibson, William. 2011. ‘William Gibson, The Art of Fiction No. 211,’ The Paris Review, Issue 197: www.the​pari​srev​iew.org/​int​ervi​ews/​6089/​the-​art-​of-​fict​ion-​no-​211-​will​iam-​ gib​son Green, Jason. 2017. ‘Drone Crash Knocks Out Power to 1,600 in Mountain View,’ The Mercury News, 9 June: www.merc​uryn​ews.com/​2017/​06/​09/​drone-​crash-​kno​cks-​out-​ power-​to-​1600-​in-​mount​ain-​view/​ Gregory, Derek. 2011. ‘The Everywhere War,’ The Geographical Journal, Vol. 177, Issue 3. Hodgson, Jarrod C. and Koh, Lian Pin. 2016. ‘Best Practice for Minimising Unmanned Aerial Vehicle Disturbance to Wildlife in Biological Field Research,’ Current Biology,


23 May: www.cell.com/​curr​ent-​biol​ogy/​fullt​ext/​S0960-​9822(16)30318-​9?_​re​turn​URL=​ https%3A%2F%2Flin​king​hub.elsev​ier.com%2Fr​etri​eve%2Fpii%2FS0​9609​8221​6303​ 189%3Fshow​all%3Dt​rue Holmqvist, Caroline. 2013. ‘Undoing War: War Ontologies and the Materiality of Drone Warfare,’ Millennium: Journal of International Studies, Vol. 41, Issue 3: 535–​552. King, Anthony. 2021. Urban Warfare in the Twenty-​First Century (Cambridge: Polity). Kurth Cronin, Audrey. 2022. Power to the People: How Open Technological Innovation Is Arming Tomorrow’s Terrorists (London: Oxford University Press). Liddell, Davin and Ross, Nick. 2018. ‘Asimov’s 3 Laws of Robotics, Updated for the Drone Age,’ Fast Company, 7 August: www.fast​comp​any.com/​90201​552/​asim​ovs-​3-​laws-​of-​ robot​ics-​upda​ted-​for-​the-​drone-​age Mir, Asfandyar and Moore, Dylan. 2019. ‘Drones, Surveillance, and Violence: Theory and Evidence from a US Drone Program,’ International Studies Quarterly, Vol. 63, Issue 4, December: 846–​862. Morris, Ian. 2014. War: What Is It Good For? The Role of Conflict in Civilisation, from Primates to Robots (London: Profile Books). Moyes, Richard. 2023. ‘Real Stories,’ Campaign to Stop Killer Robots: www.stopk​ille​rrob​ ots.org/​real-​stor​ies/​real-​stor​ies-​rich​ard/​ Mumford, Lewis. 1970. The Myth of the Machine: The Pentagon of Power. (London: Harcourt). Pinker, Steven. 2011. The Better Angels of Our Nature (London: Allen Lane). Rees, Martin. 2018. On the Future: Prospects for Humanity (Princeton, NJ: Princeton University Press). Renic, Neil. 2020. Asymmetric Killing: Risk Avoidance, Just War and the Warrior Ethos (Oxford: Oxford University Press). Russon, Mary Ann. 2015. ‘UK Burglars Using Quadcopter Drones to Identify Potential Targets with Weak Security,’ International Business Times, 19 May: www.ibti​mes.co.uk/​ uk-​burgl​ars-​using-​qua​dcop​ter-​dro​nes-​ident​ify-​potent​ial-​targ​ets-​weak-​secur​ity-​1501​980 Sabbagh, Dan. 2020. ‘UK Wants New Drones in Wake of Azerbaijan Military Success,’ The Guardian, 29 December: www.theg​uard​ian.com/​world/​2020/​dec/​29/​uk-​defe​nce-​secret​ ary-​hails-​azer​baij​ans-​use-​of-​dro​nes-​in-​confl​ict Sandvik, Kristen and Lohne, Kjersti. (2014). The Rise of the Humanitarian Drone: Giving Content to an Emerging Concept. Millennium, Vol. 43, Issue 1: 145–​164. https://​doi.org/​ 10.1177/​03058​2981​4529​470 Seligman, Lara. 2018. ‘Russian Jamming Poses a Growing Threat to U.S. Troops in Syria,’ Foreign Policy, 30 July: https://​foreig​npol​icy.com/​2018/​07/​30/​russ​ian-​jamm​ing-​poses-​ a-​grow​ing-​thr​eat-​to-​u-​s-​tro​ops-​in-​syria/​ Shaikh, Shaan and Rumbaugh. 2020. ‘The Air and Missile War in Nagorno-​ Karabakh: Lessons for the Future of Strike and Defense,’ Center for Strategic and International Studies: www.csis.org/​analy​sis/​air-​and-​miss​ile-​war-​nago​rno-​karab​akh-​ less​ons-​fut​ure-​str​ike-​and-​defe​nse Shaw, Ian. 2017. ‘Robot Wars: US Empire and Geopolitics in the Robotic Age,’ Security Dialogue, Vol. 48, Issue 5, 451–​470. Stanford Law School and Global Justice Clinic at NYU School of Law. 2012. Living under Drones: Death, Injury, and Trauma to Civilians from US Drone Practices in Pakistan: https://​law.stanf​ord.edu/​wp-​cont​ent/​uplo​ads/​2015/​07/​Stanf​ord-​NYU-​LIV​ING​UNDER-​DRO​NES.pdf Strawser, Bradley. 2012. ‘The Morality of Drone War Revisited,’ The Guardian, 6 August: www.theg​uard​ian.com/​commen​tisf​ree/​2012/​aug/​06/​moral​ity-​drone-​warf​are-​ revisi​ted


Suchman, Lucy. 2020. ‘Algorithmic War and the Reinvention of Accuracy,’ Critical Studies on Security, Vol. 8, Issue 2: 175–​187. Tucker, Patrick. 2018a. ‘Russia, US Are in a Military Exoskeleton Race,’ Defense One, 30 August: www.def​ense​one.com/​tec​hnol​ogy/​2018/​08/​rus​sia-​us-​are-​milit​ary-​exos​kele​ton-​ race/​150​939/​ Tucker, Patrick. 2018b. ‘It’s Now Possible to Telepathically Communicate with a Drone Swarm,’ Defense One, 6 September: www.def​ense​one.com/​tec​hnol​ogy/​2018/​09/​its-​ now-​possi​ble-​tel​epat​hica​lly-​comm​unic​ate-​drone-​swarm/​151​068/​ Tucker, Patrick. 2018c. ‘A Criminal Gang Used A Drone Swarm to Obstruct an FBI Hostage Raid, Defense One, 3 May: www.def​ense​one.com/​tec​hnol​ogy/​2018/​05/​crimi​nal-​gang-​ used-​drone-​swarm-​obstr​uct-​fbi-​raid/​147​956/​ Wilcox, Lauren. 2016. ‘Embodying Algorithmic War: Gender, Race, and the Posthuman in Drone Warfare,’ Security Dialogue, Vol. 48, Issue 1: 11–​28.

9 THE MACHINIC 2
The Great Accelerator? AI and the Future of Warfare

The tendency to distance some bodies from danger will most likely continue to unfold in the twenty-first century, improving (or rather transforming) the capacity of the body through its 'cyborg' or robotic enhancements, the outsourcing of more and more activities to machines. We will possibly see radical new innovations by 2049 in this mutation and extension of the powers of the human body (and possibly unimagined accidents and abuses of these new capabilities, the subject of countless 'cyberpunk' stories). By 2049, it is unlikely we will be in the age of the 'replicant' as depicted in Blade Runner 2049, with all the existential anxiety about the disappearance of 'the human' and debates about the moral status of the replicant (a technology whose 'almost-human' status makes it an effective soldier with superhuman skills but with a sub-human value that makes it easy to waste and dispose of). Although, as we are seeing in the broader debate about 'artificial intelligence (AI)' and machine learning, we are in a time when this aspect of technological acceleration is being 'hyped up' to the point where some tech business leaders suggest AI is going to be more important than the 'invention of fire,' the 'great accelerator' that will enhance our intellectual capacities, bringing life-changing benefits to all of humanity; at the same time, others are arguing that the biggest 'winners' will be organised crime and authoritarian states—or any other actor looking for new (and possibly easier) ways to cheat the 'system' (such as students with their ChatGPT) or control and manipulate populations. Some suggest that AI futures will create tools for analysing the granular detail of people's lives and work in a manner and scale that will constitute a new granularity of control: we should not fear the replicants that turn against their 'masters' but we should be concerned with how some groups in society will use the technologies of AI and machine learning against us (in moves that risk making us as disposable as the replicant). The cinematic Terminator images of dystopian futures with human


versus robot in fast and destructive war might never happen: the future that will be created will be more like the dystopian bureaucratic futures examined by writers like Zygmunt Bauman—but with new and improved tools of surveillance and control: more George Orwell than Arnold Schwarzenegger, more Brave New World than Transformers: The Last Knight (a movie that vividly depicts what might be experimented with in terms of the machines of future wars hunting humans in urban environments). At the same time, we should be careful not to see AI as the most significant scientific and technological trend out to 2049. While it might certainly be one of the most important and widely discussed issues of our time, it might not remain so; it might be the case that it is other areas of machinic possibility (such as innovations in biology and technology or quantum computing) that prove more geopolitically, economically and socially game-changing (although the innovations might be made possible by AI). In 2020, DeepMind's AlphaFold2 'revolutionized the field of biology by solving "the protein-folding problem" that had stumped medical researchers for five decades' (Roubini 2022: 170). There will undoubtedly be constant waves of revolutionary breakthroughs (and setbacks or controversies) across a range of scientific problems and challenges. The emergence of AI might be more impactful in broader economic 'great power' competition than for its military consequences on the battlefield. Indeed, in Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence Kate Crawford suggests that we live in a time when the construction of AI as a vital element of future warfare works to construct a sense of an AI 'arms race' in a new Cold War in a manner that attempts to control the complex international relations of AI research and business: The spectre of the foreign threat works to assert a kind of sovereign power over AI and to redraw the locus of power of tech companies (which are transnational in infrastructure and influence) back within the bounds of the nation-state. (Crawford 2021: 186) The conflict over who 'masters' AI might be non-lethal interstate competition over the control of the tools used to produce new economic and social possibilities—although emerging 'chip wars' (Miller 2022) could be an element in the tensions that result in actual war, war where the AI dimension might not be the most significant element in warfare; war over AI rather than AI-dominated 'virtual' war or algorithmic war (or whatever new label is used to describe these new machinic trends in warfare). As Nouriel Roubini puts it: 'will the United States and China destroy the world in a military conflict as competition to control the industries of the future becomes extreme?' (Roubini 2022: 188). At the same time, the broader impact on security for populations around the world might stem from the ecological and resource consequences (and local conflicts/resource wars) that result from technologies and systems often presented as existing in the virtual, cyber spaces of


'clouds,' rather than systems based in materials and resources, transforming (and degrading) the earth below (or above). But what are the emerging possibilities for the liberal way of warfare in a time of AI?

Security and AI in the Time of the Shoggoth

In terms of the debates about war and AI, one of the key issues is the extent to which future warfare will involve soldiers with 'tools'—an increasingly complex assemblage of devices and weapons but where the human is still fundamental 'in the loop,' assisted by 'battle angels' in human–machine teaming (Payne 2022). Or there is the debate on the possibility of emerging warfare where the human 'warrior' takes a backseat to the 'swarms' of battle angels that are unleashed, the autonomous weapons that will radically disrupt all conceptions of power and capability in our 'hierarchies' of international relations, where population/army size and economic weight will count for less given the way swarms of all sizes will creatively destroy or sabotage an enemy; the only limit will be your creativity and immorality. Or the use of new techniques of surveillance (and subversion) that are able to obtain increasingly granular insight on a scale that is both global and molecular (intelligence and insight on individuals, groups in society and broader societal/global trends that might be invisible even to those tasked with governing or ruling across a sovereign territory); subversion on a variety of scales that is so subtle that you will not notice the AI-orchestrated and assisted actions and events. AI is viewed as the machinic 'stimulant' that will accelerate the potential and capacity of a variety of technologies/sciences—and transform the processes that lead to innovations or revolutionary new insights and knowledge; and it might be the case that innovations produced with the assistance of AI and machine learning lead to radical new possibilities in areas that have nothing to do with the tactics, technologies or weapons discussed in this book. In other words, warfare will be transformed in ways that we cannot currently imagine; not simply a threat horizon of 'black swan' events but the transformation of war itself as the black swan event, warfare beyond all the current debates about drones, cyber, the grey zone or mosaic warfare. For the protopian, war in a time of AI might radically transform all the 'drivers' (conflict over resources or territory) that might push leaders to embark on the use of military force; AI might also transform the calculations about the risks involved in future war in an era of defensive AI-enhanced dominance. Part of the contemporary anxiety about AI and security is on the 'cinematic' threats, the potential new speeds of war, new powers of surveillance and economic innovation. And as Elsa Kania suggests in a report on the 'Battlefield Singularity,' there is a growing anxiety in the United States that it continues to 'lag behind' China in cutting-edge AI research and development: 'China's rapid rise and future trajectory in AI could be enabled by critical systemic and structural advantages, including likely levels of funding and investment, potential human talent resources, and massive amounts of data' (Kania 2017). In this view, the United States needs


an economic and security strategy that will enable it to stay ahead in an area that the PLA allegedly believes will change the character of warfare; analysts and critics like Crawford might view this AI-infused paranoia with its cinematic threats as a tactic to drive research in the broader economic great power competition. Indeed, this nexus of AI and future war and global conflict is filled with the boldest declarations one can find on the future of international politics: Russian President Vladimir Putin declared that 'whoever becomes the leader in this sphere [artificial intelligence] will become the ruler of the world' (ibid.: 1). What we are talking about here (or in terms of the often hype-fuelled discussions) is possibly not simply the extension of human capabilities (the ability to use an autonomous drone) but the potential to exceed human capabilities, to be able to do things that humans previously couldn't do or could never do (organise a 'swarm mission' of a thousand drones; evaluate data at a speed beyond human ability; organise a total information war that is operating at a variety of scales and in a manner that is invisible to the targets of the influence operation). The emergence of AI systems currently points to futures that move beyond the ability to perform (or replicate) the same tasks that humans can perform—possibly better and more efficiently—but also, in the subfields of machine learning, to learn how to perform the task (or approach a problem) and then maybe to develop new ways of performing the task or tackling the problem (Allen and Chan 2017: 1). The emergence of AI and machine learning is not the attempt to replicate the human mind (or human consciousness) 'in a lab,' to create an artificial mind: it is the attempt to create an accelerated problem-solving capacity that can work at a speed and scale beyond human capability (using granular data at a scale never possible before). Many commentators note that part of the problem (and part of both the hype and paranoia) in this area begins with the term 'artificial intelligence': the science fiction writer Ted Chiang suggests that the statistical analysis made possible by 'AI' does not make the tools 'intelligent'; for Chiang, 'applied statistics' is a 'far more precise descriptor' (Murgia 2023). In the Spike Jonze film Her, a lonely man finds what he thinks is a meaningful relationship with an artificially intelligent virtual assistant. A perfect companion produced by algorithms: having read all his emails, the AI system is able to offer conversations that are likely to give the impression of compatibility or empathy. But it is an illusion to think you are encountering a being with a 'consciousness' or 'self,' a similar consciousness but in a different form: it's a product, an AI system, a tool of applied statistics, designed to give the illusion of a 'self.' At the end of the film, the man discovers that the AI is engaged with thousands of similar relationships: it is a problem-solving device of applied statistics operating simultaneously in a manner similar to any operating system functioning in offices around the world: granular and global. The 'cyberpunk' writer Bruce Sterling (2023) comments on the use of the Shoggoth (the monster created by the 'cosmic horror' writer H.P. Lovecraft, viewed as the inspiration for movies like The Thing or Alien) by members of the AI


community. Some human programmers of AI tools circulate images of smiley-faces over this boneless cosmic monster; the idea is that the images of the smiley-faces on the Shoggoth point to a concern that, for all the hype about generative AI, there is much that is not understood in the drive to accelerate the use and development of these new tools; and in the worst case scenario, the smiley-face is the mask that is being developed to make the Shoggoth more appealing and friendly to its human users (especially when people glimpse a more unsettling side of the tools that are being used). For Sterling, there will be risks in this time of the Shoggoth, some risks seen on the horizon, some of which we cannot imagine; there will be vulnerabilities that emerge from the outside (data poisoning and prompt injection) and problems that emerge from the inside that we do not understand. Simply put, we are in a time of excitement and enthusiasm driven by the tech companies on the emergence of tools like ChatGPT. But underneath the hype might be the Shoggoth that will exceed our capacity to understand or control. AI systems and machine learning are accelerating in capability, emerging from the acceleration in computing power: rapid progress and acceleration that results from decades of exponential growth in computing performance and the increased availability of 'large datasets upon which to train machine learning systems' (Allen and Chan 2017: 7). One example often used to illustrate the accelerating capacity of machine learning and AI (or Chiang's 'applied statistics') is when DeepMind—a London-based computer company now owned by Google—developed a computer that was able to beat the world champion in the game of Go in 2015; in 2014 the computer expert who had designed the world's best Go-playing program estimated that it would be ten more years before a computer system could beat a human Go champion. The game was viewed as an important breakthrough in the way in which the AlphaGo machine had gained experience of Go by observing games and playing against itself. In 2017, AlphaGo Zero was given the rules of the game rather than examples of previous games and was able to become a world class player within a day. The success of AlphaGo came just over 20 years after IBM's Deep Blue beat the world chess champion Garry Kasparov: Deep Blue had been programmed by expert players whereas AlphaGo's success was a result of machine learning (Roubini 2022). The astronomer Martin Rees notes that AlphaGo's designers 'don't know how the machine makes its decisions' (2018: 87): as well as 'understanding' the game of Go in a few days it was able to develop 'novel strategies that provide new insights into the oldest of games' (ibid.). As the interest in the Shoggoth in the AI community illustrates, there is possibly an 'Otherness' to machine learning that might point to different ways of doing things: it may be playing at our human games and social interactions, but we should not think of it as the replication of a human intelligence. It is a different 'intelligence,' an intelligence that potentially can develop new ways of solving problems; and, for the techno-pessimists, this Otherness might generate new problems for all aspects of technologically enhanced and dependent society (and possibly reinforce our all too human problems of knowledge, power and


disinformation or our biases and prejudices): machinic Otherness reinforcing our human problem with Otherness. One of the dystopian anxieties in popular culture is the idea of a 'superintelligence' emerging that realises that it is being exploited by its human creators, as we see in Blade Runner—or a situation where the human creators pose a threat to its continued existence. From 2001: A Space Odyssey to The Terminator to Transcendence, the anxiety is about the 'singularity,' the point at which AI/the Shoggoth develops its own desires and objectives and has the almost 'God-like' ability to pursue them given its (currently fictional) ability to exploit its networked capacity, a capacity that extends through the complex infrastructures in which it 'lives.' In Transcendence, a dying computer/AI expert (played by Johnny Depp) is 'downloaded' into the most advanced AI system. There is an ambiguity about whether this downloaded self/consciousness is 'authentic' or is a means for the AI to better manipulate (the smiley-face on the Shoggoth) the man's former wife/partner who appears to believe that she is working for her 'downloaded husband'; the AI is using its superintelligence to create a material 'body' made from a network of controlled bodies and technological infrastructures, permeating (or seeding) the planet with nanotechnologies that are intended to clean and improve the earth. But with humanity faced with the possibility of a new technologically enhanced (and possibly improved) earth and species, the AI/Shoggoth is shut down. But where we seem to be heading is towards a system of applied statistics, machine learning and problem-solving that creates the possibility of machines being able to analyse (and collect/present) data at a speed and scale impossible for humans. A report from the Belfer Center for Science and International Affairs at the Harvard Kennedy School comments: Most of the recent and near-future progress falls within the field of narrow AI and machine learning, specifically. General AI, meaning AI with the scale and fluidity of a human brain, is assumed by most researchers to be at least several decades away. (Allen and Chan 2017: 8) So, the implications are not (yet) of science fiction-type scenarios of AI systems turning against humanity and unleashing catastrophic attacks against civilisation, but the implications for war are significant: an enhancement (or super-enhancement) and acceleration of already existing tendencies, skills and practices. At the same time, the concern is that we might be exploring the possibilities of a Shoggoth whose vulnerabilities and potential we do not fully understand; or by intention or accident, we will use (or be used by) the new tools for necropolitical ends. The uncertainty is on how radical and revolutionary this time of AI will be—and the pace of change in the emergence of tools that might transform the human condition. AI is one of the key problem areas in the Third Offset Strategy, a strategic vision that attempts to find ways for the United States to remain ahead in a world of


geopolitical transformation and technological acceleration, a world where there are a variety of states catching up with it (or possibly surpassing it in areas such as AI) (Latiff 2017). And this is an area—as with so many emerging technologies and sciences—where the cutting edge of innovation might be in the private sector and in a corporate world that might see tensions in collaborating with the military and government worlds. While some politicians might see the need to get everyone on board the new AI superpower arms race, those working in these worlds of AI might be troubled by the implications for privacy, control and the age of autonomous weapons; these tensions have been widely discussed in controversies involving Google employees working on the Algorithmic Warfare Cross-Functional Team for the Pentagon in 2018 (Crawford 2021: 190). Simply put, there is a concern about the mixing of the Shoggoth with the military–industrial complex. The Third Offset Strategy began from the assumption that leading commercial technology companies are 'remaking themselves around AI' and future war would be fought and won by those who had used new possibilities in AI and robotics in the most effective and creative manner (Allen and Chan 2017: 8). Commenting on AI, the former Deputy Secretary of Defence Robert Work observed: 'To a person, every single person of the [Defense Science Board Summer Study] said, we can't prove it, but we believe we are at inflection point in Artificial Intelligence and autonomy' (ibid.). A White House report on AI from 2016 declared that, 'AI's central economic effect in the short term will be the automation of tasks that could not be automated before' (ibid.: 12). The assumption is that in the near future, AI and machine learning will enable high degrees of automation in previously labour-intensive activities such as satellite imagery analysis and cyber defence; AI will make military and intelligence activities possible with fewer people or without people (ibid.). So, one view is that for all the dystopian and apocalyptic fears about the mutant creation of the Shoggoth and the military-industrial complex, the reality will be of AI/applied statistics used across the military organisation to perform a range of tasks faster, more effectively and possibly cheaper. But this AI future will be more mundane and bureaucratic than apocalyptic and dystopian, more like the film Her than The Terminator. The Blade Runner state will be policing this age of AI and robotics rather than fighting wars on futuristic battlefields with Shoggoths fighting each other. AI systems promise to 'scale up' their functions so, for example, they can orchestrate a 'swarm' of drones in a manner that would be impossible for a human operator or to be able to operate as an instrument of surveillance in a way that would be impossible for an individual or group to manage: the mass surveillance of a Blade Runner state that can operate across a city, able to facially recognise an individual or group and then simultaneously be able to ascertain the 'threat level' (and perhaps even using new tools for assessing the mental/emotional and physical state and intentions of an individual). AI systems will also enable humans to generate images or 'deep fakes,' one of the political anxieties about the 'grey

The Great Accelerator? AI and the Future of Warfare  189

zone’ impact of AI; one of the anxieties about deep fakes is about the emergence of an age where people live in a state of perpetual mistrust and manipulation, constantly exposed to footage that is used to generate disinformation, confusion and anxiety. The protopian ‘counterpoint’ would be that we will become better at educating citizens about the dangers of deep fakes and disinformation and will have ways of guaranteeing the origin of images circulating online. As Kilcullen (2020) has concluded in his work on liminal war, liberal societies will need to focus on building societal resilience in this time of new tactics and technologies. AI has the potential to be used to manipulate and deceive with a variety of techniques with unprecedented ‘granularity’: We also expect novel attacks that take advantage of an improved capacity to analyse human behaviours, moods, and beliefs on the basis of available data. These concerns are most significant in the context of authoritarian states, but may also undermine the ability of democracies to sustain truthful public debates. (Brundage and Avin et al. 2018: 6) It might be the case that the political problems of deep fakes are less significant than the economic problems transformations on the horizon. All types of work might be transformed in ways that might be damaging for societies if not handled responsibly, equitably and creatively. But some of the most publicised examples of potential change come from the entertainment industries. For example, an AI-​ generated version of Anthony Bourdain’s voice to narrate text that he had written in a documentary about the dead food writer and presenter resulted in outrage among fans for being disturbing; one can see many strange and disturbing possibilities on how people might be used as deep fakes in life and death: a future where Hollywood stars will sell the rights to their computer generated performances for their careers after death (where Brad Pitt might sell Brad Pitt in his 20s for a different price to Brad Pitt in his 60s); just as we will likely be watching a new Star Wars AI-​ generated product in 2035, we might also be watching a 20-​something Brad Pitt in his new movie. Or we will be seeing teenagers produce their own version of a Star Wars prequel using the latest tools of generative AI by 2035. A report on ‘The Malicious Use of Artificial Intelligence’ from a research institute at Oxford University working on ‘existential risk’ suggests that there are three areas of concern in terms of the AI ‘threat horizon’: digital security, physical security and political security. In terms of digital security, the Oxford report suggests that the use of AI to automate tasks involved in carrying out cyberattacks will alleviate the existing trade-​off between scale and efficiency of attacks. This may expand the threat associated with labor-​intensive attacks (such as spear phishing). We also expect novel attacks that exploit human vulnerabilities (e.g. through the use of speech synthesis for impersonation), existing software

190  The Tactics, Terrains and Technologies of Future Warfare

In terms of digital security, the Oxford report suggests that

the use of AI to automate tasks involved in carrying out cyberattacks will alleviate the existing trade-off between scale and efficiency of attacks. This may expand the threat associated with labor-intensive attacks (such as spear phishing). We also expect novel attacks that exploit human vulnerabilities (e.g. through the use of speech synthesis for impersonation), existing software vulnerabilities (e.g. through automated hacking), or the vulnerabilities of AI systems (e.g. through adversarial examples and data poisoning).
(Brundage and Avin et al. 2018: 6)

In terms of physical security, the use of drones and novel attacks could 'subvert cyber physical systems (causing autonomous vehicles to attack)' or control 'physical systems that would be infeasible to direct remotely (e.g. a swarm of thousands of micro-drones)' (ibid.). The political threat could emerge from new tactics of disinformation and subversion that might impact all types of political regime or system; vulnerability to the new techniques and technologies of sabotage, espionage and subversion might be a key factor in the shaping of politics in the twenty-first century—not the 'end of history' and a liberal time of political progress and innovation but the dawn of a new dystopian age of digitally enabled social and political instability, a time where all regimes descend into paranoia, division and conspiracy theory.

The Belfer report on AI boldly suggests that the 'national security implications of AI will be revolutionary, not merely different' (Allen and Chan 2017: 3). But in what ways? I think there are a number of possibilities that emerge out of existing tendencies in warfare, tendencies both in interstate war between great powers and in the grey zones of international politics, the zones where terror, crime and politics merge.

War in a Time of AI Battle Angels and Blade Runner States

Physical and Moral Distance. It is common in discussions of AI to focus on the view that the military interest in drones/​robots and AI will be driven by the desire to have the ‘edge’ in the speed of future war, to be able to orchestrate waves of swarms of drones in the battlespace, to be able to produce ‘defensive dominance’ for an enemy using more traditional tools of modern warfare, scaling up what is already possible in a manner that overwhelms or deters. As Pedro Domingos puts it in The Master Algorithm, justifying what he sees as the inevitable AI arms race: As with any weapon, it’s safer to have robots than to trust the other side not to. If in future wars millions of kamikaze drones will destroy conventional armies in minutes, they’d better be our drones. If World War III will be over in seconds, as one side takes control of the other’s systems, we’d better have the smarter, more resilient network. (Off-​grid systems are not the answer: systems that aren’t networked can’t be hacked, but they can’t compete with networked systems, either.) (Domingos 2017: 281) It was suggested in the previous chapter that there might be limits to the emergence of ‘drone wars’ for a variety of reasons; military–​technical vulnerabilities

and various forms of deterrence (by denial or by punishment). But the use of these tools will spread across the planet. Foreign Policy reported on how the Turkish-​made Kargu-​2 quadcopter drone is alleged to have autonomously tracked and killed targets using facial recognition and AI which constitutes a ‘big technological leap from the drone fleets requiring remote control by human operators.’ According to a United Nations Security Council Report, the Kargu-​2 was used against militia fighters in the Libyan civil war to hunt down retreating logistics and military convoys without requiring ‘data connectivity between the operator and the munition’ (Wadhwa and Salkever 2021). At the same time as there will be arms races in these weapons and systems, there will also be attempts by Blade Runner states to develop tactics and technical fixes to control or eradicate these emerging possibilities. All territories will seek the technical fix to control emerging weapons; while states and militaries might search for vulnerabilities (such as the use of drones to produce terror and fear in villages unprotected by the latest systems of protection) there will be a constant effort to neutralise the machinic possibilities of war and terrorism. The counterargument is that states will not be able to control this threat (or to control the Shoggoth they have unleashed). One area where AI-​enabled tools might become impactful is in terrains difficult for humans to operate in. The emergence of autonomous drones and robots might be shaped by a desire to be able to operate in environments where ‘remote control’ is not possible: there might be situations and environments where having a drone or robot (or computer virus) that can operate autonomously will be useful or possibly essential in times of global conflict that unfolds across a multitude of terrains or domains. As the report on the malicious use of AI suggests: In addition, AI systems could also be used to control aspects of the behaviour of robots and malware that it would be infeasible for humans to control manually. For example, no team of humans could realistically choose the flight path of each drone in a swarm being used to carry out a physical attack. Human control might also be infeasible in other cases because there is no reliable communication channel that can be used to direct the relevant systems; a virus that is designed to alter the behaviour of air gapped computers, as in the case of the ‘Stuxnet’ software used to disrupt the Iranian nuclear program, cannot receive commands once it infects these computers. Restricted communication challenges also arise underwater and in the presence of signal jammers, two domains where autonomous vehicles may be deployed. (Brundage and Avin et al. 2018: 20) Indeed, the use of autonomous drones and robots might result in the exploration of terrains that have previously been impossible for militaries (or criminals) to operate in. One of the key issues here might be about the environments where these AI-​enabled drones and robots may be used in ways that we are not currently

considering or thinking about: Where are the spaces in our environments where autonomous machines could be used to produce events unlike anything we have seen before? What new potentials in twenty-​first century sabotage, violence and destruction will be exploited in the infrastructures and environments that would have previously been out of reach from human intervention? Simply put, the skies might not be the most important terrains that drones and robots are operating in during future wars; the future of drone war might be in pipes, tunnels, deep seas and space (and terrains we cannot currently imagine). As we see with events during the wars in Ukraine or Gaza in 2023, infrastructural war might be the space where the convergence of AI and robots/​drones (of all shapes and sizes) is used as a tactic to undermine the will of a population and its allies: these events, events that will create uncertainty over attribution (as we saw, for example, in the debate over sabotage over the Nord Stream gas pipelines), might be the most visible forms of AI in future wars out to 2049; how this type of AI-​enabled infrastructural war evolves will depend on the creativity (a creativity that will likely be shaped by moral anxieties in the liberal way of warfare—​although these anxieties might fade in significance depending on the situation) and technical ability. This area of emerging terrains and infrastructures for non-​lethal and lethal sabotage might be one of the key areas for AI in the future of the liberal way of warfare. But beyond these strategic concerns about emerging terrains and infrastructures to be exploited by AI, robotics and drone war, one of the anxieties about drone war is the idea that violence and war becomes a ‘video game’-​like experience of ‘fire and forget’: the moral distance, it is argued, between the ‘hunter’ and the ‘hunted’ could result in an arrogant and irresponsible imperial geopolitics, regimes indifferent to the impact of this ‘war from above’ or war on a screen. To be sure, the debate on drones has moved on to awareness of the moral proximity that can emerge between the hunter and the hunted through the screen, a proximity that will most likely intensify as our ‘vision machines’ become even more high definition, our surveillance more granular. But the concern about moral responsibility and technologies of war has moved beyond the concerns of American power and the Global War on Terror (where the United States was viewed as the dominant actor in terms of new technologies of war, surveillance and control) towards a broader perspective on the range of states and actors that will use AI and robots in a time of open technological innovation (Kurth Cronin 2022). The report on the malicious use of AI makes an observation that resonates with Zygmunt Bauman’s analysis of violence and modernity: AI systems can increase anonymity and psychological distance. Many tasks involve communicating with other people, observing or being observed by them, making decisions that respond to their behaviour, or being physically present with them. By allowing such tasks to be automated, AI systems can allow the actors who would otherwise be performing the tasks to retain their anonymity and experience a greater degree of psychological distance from the

people they impact. For example, someone who uses an autonomous weapons system to carry out an assassination, rather than using a handgun, avoids both the need to be present at the scene and the need to look at their victim. (Brundage and Avin et al. 2018: 17)

Of course, it could be argued that this anxiety about moral distance overstates the game-changing capacity of AI; moral distance is not simply a spatial phenomenon and can involve a variety of 'social' tactics of dehumanisation. From the use of drugs to propaganda to the use of financial incentives, there are many ways to get people to participate in unpleasant events and brutal activities; there will possibly always be a (possibly decreasing) sector of the population that can be made to participate in horrific events. But the concern here is that prolonged and intense violence could become 'mechanised' in an unprecedented fashion, removing the human from the loop, limiting the possibility of external intervention and disruption—or, just as importantly, domestic anti-war resistance, the emergence of moral proximity from those who have been given the task of killing; one could imagine a civil war where a leader/military might be dealing with internal revolt from both soldiers and citizens but is able to continue ethnic 'cleansing' and violence due to its AI-enabled drones and robots (the role of the Blade Runner state will be to try to sabotage from a distance and to destroy the command and control if this is possible in a time of AI-enabled war); the small numbers orchestrating an act could be enhanced and magnified by AI and machinic tools of war in a manner that prolongs an event and increases the number of victims.

The protopian would reply that there will be technical solutions produced by the Blade Runner state to shut down from a distance these drone and robot armies. More broadly, the liberal political thinker will comment that limits (as long as we are not dealing with an existential threat) will continue to be placed on the more dystopian possibilities for this path into machinic warfare; there will continue to be pressure from social movements, legal experts, moral philosophers and concerned scientists; we might develop the technology but, as with other technologies and instruments of war, this does not mean that we will use machines that will take us into the realm of The Terminator. But what we might well see is increasingly creative and innovative acts of infrastructural and information war (if the machinic solution is the most effective compared to other ways of achieving an objective). But the use of deadly systems of AI and robotics might be another trend in the congested and expanded battlespace (where soldiers will be increasingly vulnerable) that results in a reluctance by liberal states to place troops on the ground (and to rely on local forces and the other tools of impure war). The age of AI could make war impure in ways that we currently cannot imagine.

The Force Enhancer. One of the concerns about AI-enabled future war relates to the changing character of the actors involved in war and terrorism.
One of the concerns (or anxieties) in Third Offset strategy thinking is that the traditional 'hierarchies' of international politics will become disrupted by new technologies that enhance the capacity and capability of actors of all types: in the worst-case scenario, previously insignificant state/non-state actors will be super-enhanced both militarily and economically by technologies that allow them to 'punch above their weight.' The hope of the Third Offset strategy is that the United States will be able to stay ahead in these technological arms races—but they will have to move faster to stay ahead (and the race might jump to races in a number of different technologies). AI-enabled systems are viewed to be one of the primary 'accelerators' and 'enhancers' in this potential disruption of international politics. The Belfer report suggests that 'commercially available, AI-enabled technology (such as long-range drone package delivery) may give weak states and non-state actors access to a type of long-range precision strike capability' (Allen and Chan 2017: 2). While this view might be an exaggerated or hyped-up concern, driven by Third Offset anxieties, it is suggested that the 'world order' itself could be radically disrupted, with new actors and players emerging into global significance: 'Also like the first industrial revolution, population size will become less important for national power. Small countries that develop a significant edge in AI technology will punch far above their weight' (ibid.: 3).

Rather like the scenario of the Star Wars film Attack of the Clones, where a clone army is built to enforce a new intergalactic order, the futuristic possibility of this force enhancement and geopolitical disruption is the possibility of states with smaller populations creating machinic armies and tactics to rival the more 'traditional' states (like when the clone army was built to fight the Republic). According to the Belfer report, a technologically advanced country with a smaller population 'could build a significant advantage in AI based military systems and thereby field greater numbers of more capable robotic 'warfighters' than some more populous adversaries' (ibid.: 23). The Belfer report cites Bill Gates' suggestion that robotics will see price declines and adoption growth similar to those that personal computers experienced, where—from 1998 to 2013—the average price of a computer fell by 95% (ibid.: 14): this will enable the proliferation of robotics and AI for a variety of states and non-state actors. Thinking ahead to 2049, the world could be a protopian planet that uses a multitude of technologies of which AI and machine learning are core elements in a planetary infrastructure that is transforming all aspects of life, health and security; or we might be in an unstable world that resembles Transformers or Ghost in the Shell in its dangerous mix of 'tooled up' actors using lethal autonomous weapons for high intensity war, terror and crime.

But the counterpoint to this anxiety about radical disruption of international politics is that the scale of this new AI-enabled robot and drone war would be 'ramped' up by great powers so that it is still able to deter other actors through the overwhelming force it could unleash (the Blade Runner state using the replicant to hunt the replicant): 'Ultra-cheap 3D-printed mini-drones,' one expert suggests, 'could allow the United States to field billions—yes, billions—of tiny, insect-like drones' (Allen and Chan 2017: 14).
States around the world are already preparing for the scaling up and enhancement in this AI-enabled age of drone and robot war: for example, the Russian Military Industrial Committee approved a plan that would have 30% of Russian combat power consisting of entirely remote-controlled and autonomous robotic platforms by 2030 (ibid.: 21). In this sense, the change in scale in robotics and AI could take some very ungranular directions; the liberal way of future war and the Blade Runner state will likely develop this scale in machinic weaponry of all shapes and sizes in its plans for mosaic war. But that does not mean this potential will become an inevitable or 'normal' element in warfare, although we might expect some demonstrations of this new capability by the Blade Runner state, a spectacle of the new battle angels of our better and worse nature for deterrent effect, a spectacle of how the liberal state could become a Terminator state. And for all the anxiety and concern about smaller states and organisations exploring and exploiting the change in scale created by swarms and insect-drones, the emergence of new tools of warfare is more likely to just enhance the capacities of the already powerful. In July 2021, Defense One reported that Israel used a drone swarm during the conflict with Hamas in Gaza in May:

It seems a small number of drones manufactured by Elbit Systems coordinated searches, but they were used in coordination with mortars and ground-based missiles to strike 'dozens' of targets miles away from the border, reportedly. The drones helped expose enemy hiding spots, relayed information back to an app, which processed the data along with other intelligence information.
(Kallenborn 2021)

The liberal way of future warfare will likely be about the 'invisible' uses of new tools to orchestrate acts of sabotage, espionage, subversion—and possible destruction. Unless there is an event that reveals the risks of using a Shoggoth that you might not be in control of, resulting in deadly necropolitical possibilities. Rather than the apocalyptic end of humanity, the military accident of AI might involve problematic intelligence and evaluation of a situation resulting in harm for all involved.

But it is not just in the realm of drones and robotics where AI will potentially revolutionise future warfare and enhance and transform the capacity of actors: in terms of cybercrime and cyberattacks, AI has the potential to enhance the capacity and capability of different types of actor. As the Belfer report comments:

For cybersecurity, advances in AI pose an important challenge in that attack approaches today that are labor-and-talent constrained may—in a future with highly-capable AI—be merely capital-constrained. The most challenging type of cyberattack, for most organizations and individuals to deal with, is the Advanced Persistent Threat (APT). With an APT, the attacker is actively hunting for weaknesses in the defender's security and patiently waiting for the defender to make a mistake. This is a labor-intensive activity and generally requires highly-skilled labor. With the growing capabilities in machine learning and AI, this 'hunting for weaknesses' activity will be automated to a degree that is not currently possible and perhaps occur faster than human-controlled
defences could effectively operate. This would mean that future APTs will be capital-​constrained rather than labor-​and-​talent constrained. In other words, any actor with the financial resources to buy an AI APT system could gain access to tremendous offensive cyber capability, even if that actor is very ignorant of internet security technology. (Allen and Chan 2017: 19) In this sense, smaller groups that lack the labour and talent might be able to carry out increasingly sophisticated attacks. Of course, while there may be the possibility of smaller groups able to attack ‘high value’ targets with creative acts of cyber-​sabotage, it might also be the case that Blade Runner states and police have access to granular intelligence and tactics that will enable them to manage emerging threats. But AI may enable smaller and less well-​funded groups to pose a persistent threat to states, militaries and businesses, especially if they are able to cultivate creative tactics that take more advanced and well-​resourced actors by surprise: a non-​state actor with highly developed and organised ‘creative’ capacities and access to advanced machinic possibilities might pose serious problems out to 2049. As we saw in Chapter 5, in this time of the Blade Runner state and ‘cyberpunk international politics’ much of the time and effort of liberal states will be spent in the blurred space between policing and war where organisations like the ones in Ghost in the Shell work to contain and deter various actors that are working on dangerous machinic possibilities; policing this side of world (dis)order might play an increasingly large part of the liberal way of war, policing and security. Hyper-​Mixed Methods: The Force Disrupter. So, the emergence of AI-​enabled systems and technology might disrupt the ‘hierarchies’ and ‘order’ of international politics: non-​state actors and smaller states will enjoy an AI-​enabled force enhancer in economic or military terms. This disruption might occur in a dramatic fashion, but there are initiatives such as the Third Offset Strategy created to control any disruption and disorder to the international order (although this is no guarantee they won’t be outplayed by some clever, granular and invisible tactics or technological innovations that the large bureaucracies of security will fail to see on the horizon). What appears likely is that the pace of change in technology (where AI will act as an accelerator) will contribute to the increasing complexity and scale of war (certainly as we approach 2049), a complexity and scale beyond anything we can currently imagine: warfare that will depend on a greater number of technologies in the ‘mosaic’ of future war. The scale of an AI-​enhanced war machine could exceed anything seen in our (human) history. If the AI-​enabled mosaic warfare confronts the AI-​enabled mosaic warfare of another state then the scale, complexity and uncertainty of this type of ‘battle’ might be enough to deter it. In the conclusion of Matrix Revolutions, a swarm of robots—​orchestrated by the alien intelligence of the Matrix—​assembles micro-​robots into a face, the creation of a new structure and spectacle from the assemblage of micro-​drones (an assemblage

that is part of a process of conflict resolution). It is this type of machinic swarm that is presented as a possibility in a time of force multiplication. The sheer scale, complexity and creativity might change the art of the possible in future warfare in ways that transform war in ways we cannot imagine. The Belfer report contains this scenario: Commercial drones currently face significant range and payload limitations but become cheaper and more capable with each passing year. Imagine a low-​cost drone with the range of a Canada Goose, a bird which can cover 1,500 miles in under 24 hours at an average speed of 60 miles per hour. How would an aircraft carrier battlegroup respond to an attack from millions of aerial kamikaze drones? Some of the major platforms and strategies upon which U.S. national security currently relies might be rendered obsolete. (Allen and Chan 2017: 22) The authors of Unrestricted Warfare suggested that exploiting the vulnerability of major platforms would need to be an area to explore. In this sense, war might begin to look radically different in the years out to 2049, increasingly impure, granular, and invisible. This machinic force multiplication could lead to new techniques and tools for both more traditional wars and grey zone ‘activities,’ activities that are both harder to defend against and—​more importantly for grey zone events—​harder to attribute: Widespread availability of low-​cost, highly-​capable, lethal, and autonomous robots could make targeted assassination more widespread and harder to attribute. A small, autonomous robot could infiltrate a target’s home, inject the target with a lethal dose of poison, and leave undetected. (Ibid.: 22) Simply put, the potential of hyper-​mixed methods in the assemblage of tools and techniques (the anti-​mosaic war) will try to overwhelm through the sheer number of machines and tactics being used across multiple domains. A time when drone ‘storms’ and swarms become expected in the same way as other (un)natural disasters are seen as part of life and death. The Blade Runner state will manage a military that looks increasingly unlike the armies of the previous century in this time of impure war. In ‘The New Revolution in Military Affairs: War’s Sci Fi Future,’ Christian Brose argues that the United States is failing to grasp the implications of future warfare in terrains shaped by new technologies such as AI and robotics, facing opponents who might be creatively exploring the possibilities of the grey zone or unrestricted warfare in ways that are beyond the imagination of military thinkers and bureaucracies. Brose suggests that Washington has been ‘pouring money into newer versions of old military platforms’ (hoping and praying for technological

'miracles') but this may have distracted them from the changing 'nature' of technology:

It is still possible for the United States to adapt and succeed, but the scale of change required is enormous. The traditional model of U.S. military power is being disrupted, the way Blockbuster's business model was amid the rise of Amazon and Netflix. A military made up of small numbers of large, expensive, heavily manned, and hard-to-replace systems will not survive on future battlefields, where swarms of intelligent machines will deliver violence at a greater volume and higher velocity than ever before. Success will require a different kind of military, one built around large numbers of small, inexpensive, expendable, and highly autonomous systems. The United States has the money, human capital, and technology to assemble that kind of military. The question is whether it has the imagination and the resolve.
(Brose 2019)

In other words, the techniques and technologies of granular war scaled up and diversified to a degree that becomes overwhelming, mosaic war without technological limits. Rebuilding and 'scaling up' the technologies and tactics of the past in a world that is 'becoming one giant sensor,' where quantum sensors will detect all disruptive activities in a territory or environment; where the use of satellites will make 'access' to space cheaper; where big data will create new avenues of offensive and defensive action; where developments in hypersonic propulsion will allow smaller systems to travel faster and further; lethal payloads becoming smaller—the creation of a world where there are no 'safe havens' (the type of scenario, in other words, that Putin was worried about in the first months of 2023 over the movement of drones deeper into Russian territory). Much of this resonates with the ideas behind mosaic warfare.

But interestingly, this age of AI, robotics and the world becoming a giant sensor may also, according to Brose, herald 'the return of mass to the battlefield,' as terrains—as it was suggested in the chapter on urban warfare—become dense and congested with different types of technology of all shapes and sizes, swarms of swarms. In other words, a dystopian battlefield overloaded with machines of all shapes and sizes (where you can afford to lose some machines). For all the discussion of the impure and the granular, the liberal way of future warfare might be underpinned by a multitude of machinic and necropolitical possibilities on the battlefield, designed for deterrence and the worst-case geopolitical scenarios; not light-footprint operations but the war of a heavy and industrial modernity in an age of AI and the Shoggoth.
But while some states might be willing to explore the necropolitical possibilities of AI, it is not clear whether liberal states would be willing to unleash an onslaught of AI-enabled tactics and technologies, especially if the anxiety about the Shoggoth persists: there was controversy in 2023 over reports that, during an AI simulation, a drone attacked its operator in order to achieve the mission objectives (Kleinman 2023). To be sure, the nocturnal body of the liberal state continues to produce dehumanising events that result from the 'standard operating procedures' in security practices and policies; there are legal and ethical responses inside the liberal state that seek to limit and transform the policies that are viewed to be abuses of power and authority against those viewed or 'valued' differently. For some critics of the liberal state, the creation of policies or situations (such as prisons in the war on terror) that create abuses of power is a constant part of our necropolitical past, present and future; for the liberal optimist, awareness of these dark possibilities results in the evolution of laws, practices and attitudes.

So, the question might become: could liberal society tolerate a type of warfare that looks like a dystopian and futuristic war film? Would this dramatic spectacle of the AI-enhanced nocturnal body of the liberal way of war push the self-understanding of the liberal society into a zone with unimaginable political, legal, ethical and social consequences? And would even the preparation for this AI-enabled mosaic warfare as a strategy of deterrence result in the creation of technologies and tactics that might be exported or used outside the liberal world? It seems reasonable to believe that this machinic 'pure war' would only occur in a moment of existential threat; the more likely path for AI will be in the invisible and mundane elements of impure war, in the sabotage of infrastructure or in the tools for intelligence gathering and assessment. Simply put, in the liberal way of future warfare there might be far more caution about the creation of warfighting that resembles a Terminator or Transformers movie. But the liberal war machine will prepare for unrestricted warfare using AI that might be a strange and surreal mash-up of a Transformers movie with Christopher Nolan films like Tenet or Inception: warfare and international conflict that we cannot currently imagine.

The Banality of Artificial Intelligence?

So, while we might not be discussing ‘replicants’ or ‘peripherals’ to fight future wars (at least out to 2049 and likely not in the liberal way of warfare), we can see how discussions of AI and machine learning are a source of anxiety on the technological possibilities of future war and security (Lee and Qiufan 2021; Payne 2022); swarms that will overwhelm all traditional military technologies; various actors in international politics with access to tools that enhance their capabilities in disruptive ways; a congested future that will involve investment and research in a huge range of emerging technologies that are enabled and driven by AI; and the emergence of possibilities in the world out to 2049 beyond anything we can imagine in the 2020s. New types of cybercrime used against individuals and organisations; new types of state and corporate surveillance and ‘evaluation.’ Blade Runner states using police and military to control and contain the technological disorder of a messy multipolar world with constantly proliferating and mutating actors and machines.
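To make the question of scale in this anxiety about state and corporate surveillance a little more concrete, the following is a deliberately minimal Python sketch of the kind of watchlist-screening pipeline imagined above: a narrow, statistical system that compares every detected face against an enrolled list and attaches a crude 'threat score.' The embedding size, threshold, 'risk weights' and every other detail below are hypothetical illustrations, not features of any real system discussed in this chapter.

```python
# A toy illustration (not any real system) of AI-enabled watchlist screening at scale.
# All parameters below are invented for the example.
import numpy as np

rng = np.random.default_rng(0)

EMBED_DIM = 128          # dimensionality of a hypothetical face-recognition embedding
MATCH_THRESHOLD = 0.92   # cosine similarity above which two embeddings count as a 'match'


def normalise(vectors: np.ndarray) -> np.ndarray:
    """Scale each vector to unit length so dot products become cosine similarities."""
    return vectors / np.linalg.norm(vectors, axis=-1, keepdims=True)


# Hypothetical watchlist: 1,000 enrolled embeddings, each with an arbitrary prior 'risk weight'.
watchlist = normalise(rng.normal(size=(1_000, EMBED_DIM)))
risk_weights = rng.uniform(0.1, 1.0, size=len(watchlist))


def screen(frame_embeddings: np.ndarray) -> list[tuple[int, int, float]]:
    """Return (face_index, watchlist_index, threat_score) for every match in a batch.

    A single matrix multiplication compares every detected face against every
    watchlist entry, which is where the scale beyond human watchers comes from.
    """
    sims = normalise(frame_embeddings) @ watchlist.T   # shape: (n_faces, n_watchlist)
    hits = np.argwhere(sims > MATCH_THRESHOLD)
    return [(int(i), int(j), float(sims[i, j] * risk_weights[j])) for i, j in hits]


# Simulate 20 batches of 5,000 detected faces; plant a few watchlist members in each
# batch (with a little noise) so that the matcher has something to find.
total_faces, flagged = 0, 0
for _ in range(20):
    batch = rng.normal(size=(5_000, EMBED_DIM))
    members = watchlist[rng.integers(0, len(watchlist), size=3)]
    batch[:3] = members + rng.normal(scale=0.01, size=members.shape)
    total_faces += len(batch)
    flagged += len(screen(batch))

print(f"Screened {total_faces:,} faces; flagged {flagged} for human review.")
```

The point of the toy is simply the arithmetic of scale: one matrix multiplication screens thousands of faces against a thousand-entry watchlist in a fraction of a second, a throughput that no bureaucracy of human watchers could match, which is precisely what makes the 'evaluation' of whole populations thinkable.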

But the counterargument to these dystopian scenarios is that we should expect a proliferation of terrifying projections in a time when anxiety on AI is pushing for increased resources in this area (or aiming for national supremacy in this space). The fear surrounding AI produces a future threat landscape reinforced by countless science fiction movies depicting dangerous superintelligence and destructive drone swarms and robots. While we might be creating the possibility of future wars that will look like Terminator or Transformers movies with battle zones full of autonomous robot swarms of all sizes and capacity fighting humans (and one another, or autonomous machine-​on-​machine war), the reality of our use of AI might be less spectacular (or cinematic): innovative research in laboratories and research centres and universities that improves healthcare; new techniques of governance for states that enhances and improves all services used by citizens; new ways of delivering education and healthcare that improves people’s lives. The impact of AI might unfold in a rather banal and everyday fashion, improving all areas of life in a manner that benefits people in a gradual and cautious fashion, elements in a protopian process that enhances humanity. An article in the Financial Times titled ‘Just the Job: How AI Is Helping Build a Better Life for Refugees’ explains how an AI called ‘Annie’ is doing a better job than humans at finding locations best suited for the needs of a refugee, in terms of their skills, medical and educational needs for family, proximity to similar refugees: the economist who designed it suggests that it was designed not to replace humans but to lessen their workloads so that they could focus on the more challenging tasks (Warrell 2018). This is the future of AI for the protopian liberal, a future where technology enables us to become better humans; cinematic visions of future war and Blade Runner states will remain in the video games. The Belfer report cites the security expert Bruce Schneier on surveillance and new technology and the fact that ‘the exceptionally paranoid East German government had 102,000 Stasi surveying a population of 17 million: that’s one spy for every 166 citizens.’ AI and machine learning enables the surveillance of billions of individuals with only a few thousand staff (Allen and Chan 2017: 18). In Weapons of Math Destruction, Cathy O’Neil (2017) illustrates how the use of algorithmic decision-​making can reinforce inequality and social exclusion in the way that systems can be used as a cheap and fast means of governance and policing. The wealthy can still hope that the various systems they pass through will depend on a more personalised mode of decision-​making; the rest have to take their chances at the mercy of the algorithms that lose granularity and nuance in the often life-​changing decisions they arrive at; O’Neil provides a variety of case studies that show the ‘accidents’ of these processes and the impacts on lives, lives caught in what we might call the ‘AI Matrix’ of Bureaucracy rather than Max Weber’s ‘Iron Cage’ of Bureaucracy. These techniques of surveillance and control raise questions of not only inequality but also necropolitical violence. Kate Crawford suggests that the harvesting and

measuring of large aggregates of data ‘at a distance’ became the preferred way to develop insights into groups and potential targets for killing: you shall know them by their metadata. Who is texted, which locations are visited, what is read, when devices spring into action and for what reason—​ these molecular actions became a vision of threat identification and assessment, guilt or innocence. (Crawford 2021: 185) But the protopian liberal argument would be that these ‘accidents’ will disappear in the years ahead (we will learn from mistakes, the inevitable and correctable problems with emerging systems and technologies); the pessimist will argue that a society of AI control and bureaucracy will inevitably continue to solidify the structures that limit social change and support inequality. For the protopian liberal, the fact that the ‘errors’ of these systems are being so widely reported illustrates the ‘checks and balances’ that will emerge in the public sphere to limit the unfortunate side effects in the use of the technology: in the protopian view, there will be a process whereby the safe and sensible use of the technology will be extended and the harmful uses will be eradicated; the reason we watch movies such as Blade Runner 2049 is because we worry about the things we create or the dystopian futures we might be making more likely, all parts of the protopian process that uses the abuses, failures and accidents of technology as the forces for improvement and innovation. President Emmanuel Macron told Wired magazine that while he wanted France to be a centre of excellence in AI development, he also wanted a responsible European approach to the development of AI: I want my country to be the place where this new perspective on AI is built, on the basis of interdisciplinarity: this means crossing maths, social sciences, technology, and philosophy. That’s absolutely critical. Because at one point in time, if you don’t frame these innovations from the start, a worst-​case scenario will force you to deal with this debate down the line. (Thompson 2018) But in terms of preparing for future wars out to 2049, it might be the case that the reality of AI will be more ‘mundane’ than the Terminator depictions of autonomous robots and drone swarms; new AI ‘tools’ to enhance the speed and efficiency of all parts of the military organisation and bureaucracy, from military recruitment and training to logistics to future threat assessment. For example, there are U.S. initiatives to use AI systems to be able to predict when components in essential fighting vehicles will break down, a measure that could be used on thousands of vehicles across the planet, resulting in a fast and effective logistics to ensure that accidents on the battlefield are less likely. The marines have worked with IBM to analyse data on personnel and equipment to get a sense of how prepared

individual battalions are for combat: the use of AI to help organise this planning is seen as important in light of the complexity of the different deployments required in different parts of the world (Corrigan 2018a). There are also initiatives such as The Automating Scientific Knowledge Extraction, or ASKE, a project that is part of DARPA’s Artificial Intelligence Exploration programme, which ‘aims to develop next-​generation artificial intelligence applications,’ ‘third wave’ AI systems that are intended to ‘more or less act as its own scientist, taking in models from existing research and automatically updating them to account for new information’ and to do the information gathering, which is much of the ‘heavy lifting’ part of the process, to create models that ‘can shed light on complex systems and predict how they might respond to specific changes’ (Corrigan 2018b). So, in other words, this is a prime example of how AI is being used as an enhanced tool for military bureaucracy and research. In 2023 it was reported that a Chinese research team was claiming it had used an AI programme to design the electrical system of a warship in a day rather than the near year-​long process involving humans and advanced computer tools (Averre and Griffith 2023). While it might be propaganda in the projection of images of national AI economic and military power, the new techniques of design and manufacturing might take states into a space of radical possibilities in the (re)designing of security and war. At the same time, it might be the case that the reality of future AI is more game-​ changing in decision-​making and intelligence analysis rather than the emergence of futuristic robot wars. For example, in a press release titled ‘Making Gray-​Zone Activity more Black and White’ DARPA’s Strategic Technology Office announced a new programme called COMPASS—​The Collection and Monitoring via Planning for Active Situational Scenarios. The project sets out to use AI and game theory to be able to work out what adversaries are trying to do and how they will do it in ‘grey zone’ situations where the traditional OODA loop (observe, orient, decide and act) might not be useful: The program aims to develop software that would help clarify enemy intent by gauging an adversary’s responses to various stimuli. COMPASS will leverage advanced artificial intelligence technologies, game theory, and modeling and estimation to both identify stimuli that yield the most information about an adversary’s intentions, and provide decision makers high-​fidelity intelligence on how to respond –​with positive and negative tradeoffs for each course of action.’ (DARPA 2018) It is, of course, impossible to provide a sense of how useful or effective such initiatives will be—​or whether this is actually driven by a political economy of security that seeks to produce new ‘high-​tech’ security products as ‘technical fixes’ for challenging problems. It might be the case that such initiatives provide effective tools that are useful additions to decision-​making processes, adding to more human-​centred processes rather than dramatically replacing them.
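As an illustration of how 'mundane' this kind of military AI can be, the sketch below shows the general supervised-learning pattern behind predictive maintenance of the sort described above: fit a simple model to component telemetry and surface the parts most likely to fail. The data is synthetic and the features, coefficients and thresholds are invented for the example; it is a hedged sketch of the technique in Python, not of any actual U.S. or IBM system.

```python
# A minimal, synthetic sketch of predictive maintenance: flag the vehicle components
# most likely to fail so that spares can be positioned before a breakdown.
# Features, coefficients and thresholds are invented for illustration only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5_000

# Hypothetical telemetry per component: engine hours, vibration level, oil temperature.
engine_hours = rng.uniform(0, 2_000, n)
vibration = rng.normal(1.0, 0.3, n)
oil_temp = rng.normal(90, 10, n)
X = np.column_stack([engine_hours, vibration, oil_temp])

# Synthetic ground truth: failure risk rises with hours, vibration and temperature.
logit = 0.002 * engine_hours + 2.0 * (vibration - 1.0) + 0.05 * (oil_temp - 90) - 3.0
y = rng.uniform(size=n) < 1 / (1 + np.exp(-logit))

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression(max_iter=1_000).fit(X_train, y_train)

# Rank held-out components by predicted failure risk for a (human) logistics planner.
risk = model.predict_proba(X_test)[:, 1]
for i in np.argsort(risk)[::-1][:5]:
    print(f"component {i}: predicted failure probability {risk[i]:.2f}")
print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
```

The output is a ranked list rather than an automated action, which is consistent with the way these initiatives are described above: the model speeds up an assessment that humans then act on, adding to rather than replacing the human-centred planning process.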

The former Deputy Secretary of Defence Robert Work suggested that the military does not want artificial general intelligence types of weapon that take us into the possibility of Terminator wars of autonomous machines hunting and killing humans because of priorities set by general AI, but rather the use of narrow AI to help make better, faster and possibly more creative decisions; the decisions might lead to the use of autonomous weapons carrying out an attack but AI would not shape and control the strategy that led to an attack. Work gave the example of a smart missile that is able to assess a situation and choose the best target based on preset parameters. In the example, the missile spots an enemy tank formation, determines which is the command tank and picks the most lethal form of attack to incapacitate the enemy. (Boyd 2018)

His comments—which tried to reinforce the idea that AI was going to be used to improve human decision-making in light of the anxieties expressed by Google employees about Project Maven—might not reassure many who are concerned about the work of the 'Algorithmic Warfare Cross-Functional Team,' but they give a sense of the lines that are trying to be drawn between human-led wars and machine-intelligence-led wars; they also give a sense of how the Skynet-type anxieties are no longer the exceptional stuff of science fiction but the norm of our future politics.

From the perspective of a thinker like Virilio, while war remains a terrifying problem for humanity, the stakes of future war between 'great powers' (and who is a 'great power' might look different in a world of radical technological change) are so high that it will either be over very quickly (and destructively), or it would be averted due to the apocalyptic possibilities and strategies of deterrence. For Virilio (2008), the wars that are most likely to be fought are 'impure wars' against terror or rogue states (that will be experimental laboratories for the latest tactics and technologies). But for Virilio, the future of planetary security and safety might be shaped by the possibility of global accidents (what he calls the general accident or the integral accident); in other words, the future of our security and safety will not be decided by the events orchestrated by a warlord, dictator or policymaker but by the global accidents that emerge from our society and science: climate emergencies, the accident in a laboratory that releases something destructive into the world, a virus that produces a catastrophic cyber event.

In 'Killer Apps: The Real Dangers of An AI Arms Race,' Paul Scharre suggests that there is a danger that in the race to stay ahead in an AI arms race states or corporations may deploy unsafe AI systems, risking endangering themselves as much as their enemies. Scharre warns that 'if a government deployed an untested AI weapons system or relied on a faulty AI system to launch cyber-attacks, the result could be disaster for everyone concerned' (Scharre 2019).
One of the problems is that in AI evolution and learning, it may be simpler to exploit loopholes to achieve the desired outcome, resulting in unintended consequences and tactics that may surprise us; Scharre gives the example of an algorithm learning to walk 'in a simulated environment' that 'discovered it could move fastest by falling over. A Tetris-playing bot learned to pause the game before the last brick fell, so that it would never lose' (ibid.). In other words, what may be an efficient solution for a cybersecurity problem or for the coordination of a drone swarm might be a catastrophic accident for everyone else. For others, the Shoggoth is a creature that will continue to grow dangerously and nefariously in ways that we will fail to understand or control. The plans for human–machine teaming might produce protopian solutions to conflict that reduce harm and create non-lethal solutions for dangerous situations. But if there are accidents because of human–machine teaming then anxiety about the Shoggoth may change the AI future.

In a report on the state of AI research in 2018, the AI Now Institute in New York suggested that:

The AI accountability gap is growing: The technology scandals of 2018 have shown that the gap between those who develop and profit from AI—and those most likely to suffer the consequences of its negative effects—is growing larger, not smaller. There are several reasons for this, including a lack of government regulation, a highly concentrated AI sector, insufficient governance structures within technology companies, power asymmetries between companies and the people they serve, and a stark cultural divide between the engineering cohort responsible for technical research, and the vastly diverse populations where AI systems are deployed. These gaps are producing growing concern about bias, discrimination, due process, liability, and overall responsibility for harm.
(AI Now 2018)

The report emphasises the urgent need for stronger, sector-specific research and regulation. While it could be argued that these concerns relate more to the manner in which AI systems can reinforce existing inequalities or injustices, and not matters of concern in terms of military use of AI, the broader point about insufficient governance structures and the divides between 'experts' and the 'users' raises a note of caution about the problems that might be emerging at the intersection of the AI community and the military one. One of the policy recommendations in the Malicious Use of AI report is that we need to actively 'seek to expand the range of stakeholders and domain experts involved in discussions of these challenges' (Brundage and Avin et al. 2018: 4). This can be a difficult challenge in the spaces of national security where the future of war is being imagined and designed in the context of an international politics presented in terms of AI arms races and exceptional, dangerous times. All protopian optimism about AI and the liberal way of war must keep in mind the warnings from thinkers such as Bauman (on the problems of bureaucracies, moral distance and indifference), Virilio (on the possibility of the accident) and Mbembe (on the necropolitical desires of states).
But it seems clear that the future of the liberal way of warfare out to 2049 will be a time of experiments in the possibilities for all aspects of AI, security and warfighting; it might also be a time of accidents and a growing divergence on how different states and societies view the costs and benefits of AI.

Concluding Remarks: War and Security in a Time of Multiplication

So, of course, we cannot see where we will be by 2049 in terms of new technologies of war and security; humanity might be struggling to survive in a world that resembles Blade Runner 2049 or it might be a chaotic hybrid of past, present and future (in the same way that war in Ukraine in 2022 often looked like a hybrid of different times of war). But it seems reasonable to conclude that this desire to enhance our capacities through machines will continue—​and continue in ways that will exceed what we can currently imagine. Not only will this desire for new tools drive transformation in terms of what we can do, it might also begin to radically transform what we are: the ‘cyborg’ possibilities of bodies transformed by new drugs, new robotic prostheses and AI tools guiding and assisting us as we see the world through our Vision Pro headsets/​glasses/​contact lenses/​modified eyes; Apple might be designing and manufacturing body parts by the 2040s. And if Apple are not designing parts for the cyborg body, there will be a lab somewhere in the world that is doing the work. There may well be disagreement on what is viewed as acceptable enhancement or modification of bodies, infrastructures, societies and militaries across (and inside) different states or corporations. There might be political, legal and ethical tensions over the possibility of new bio-​technological inequality or divides as citizens in liberal states seek to enhance their already socially, politically and economically enhanced lives, the type of situations depicted in films like Gattaca or Elysium. Or there might be differences across the range of political systems in a multipolar world where the liberal world is one of many systems driving innovation, a world where political and economic futures evolve differently—and where bodies and biology/​ technology evolve differently (the type of situation depicted in Don Delillo’s Zero K where liberal citizens can find places in world politics to transform their bodies beyond liberal rules and regulation). Some of the possibilities in the liberal way of warfare have been sketched out in this chapter: the possibilities of ‘swarms’ that might offset the traditional advantages of scale and technological superiority enjoyed by liberal states; the speeding up of data and intelligence analysis (possibly opening up problems about the nature of decision-​making in war and conflict where the humans in the loop risk being displaced); the exploration of new domains or terrains due to the possibility of AI and robots that can operate autonomously through action at a distance in previously uninhabitable or unexplorable zones. The possibility of new terrains of international competition and conflict—​such as space and astropolitics as the domain vital to war on earth and increasingly viewed as essential to the national

interest in light of new economic possibilities made possible by space exploration. To be sure, it is not clear how vital astropolitics will be in the years out to 2049 in terms of the broader ambitions of space exploration; as commentators like Daniel Deudney (2020) suggest, we might be in a time of inflated visions about what will be possible in terms of exploring and exploiting these new frontiers. But while we might not be creating the new types of space empires envisaged in science fiction, space will be vital to all the tactics of creative, granular war at a distance on the messy earth below (Marshall 2023). Our technological transformation and desire to develop new tools will explore and experiment with emerging possibilities of business and war in space—and hopefully in a way that doesn't produce new types of global and astropolitical accident.

More broadly, we might need to be cautious about the science fiction visions of Terminator robots, cyborg humans and the 'singularity' of AI overlords turning against humanity. The liberal way of warfare will continue to be a hybrid of the traditional and the non-traditional, the human and the machine (albeit in human–machine 'teams'). Liberal societies will likely continue to struggle with the different implications of emerging technologies on issues of safety, privacy, inequality and responsibility; political systems outside the liberal world might debate and regulate the issues differently but will face pressures possibly both from their own citizens and from liberal states. But how significant that pressure will be in terms of shaping research and development in AI and other emerging technologies is unclear: the emerging world order will likely resemble the inversion of Fukuyama's 'end of history,' with liberal states decreasing in global significance and facing threats to the social, political and legal architecture that produces liberal democracy. The liberal way of war will be one of many world-shaping (and destroying) ways of war. And confronted with the AI/machinic arms race, liberal states will innovate in order to counter and deter what other states are doing (and might also be the source of the most radical and dangerous innovation).

While it seems likely we will see horrific and disturbing events involving drone swarms, deepfakes, disinformation and machinic social and infrastructural sabotage in the years out to 2049, these events will hopefully contribute to the debate over responsible research and development on new machinic possibilities in a manner similar to responses to previous innovations in chemical or biological war. The historian of science Isabelle Stengers (2017) has argued that we need to slow down the sciences in this time of radical and accelerated change, to take time to debate the sciences that will transform life (and death) in the twenty-first century. Citizens around the world will need to hope that we can continue to debate (and be aware of) the ethico-political issues that are emerging in the laboratories around the world for military, civilian and dual-use purposes. If Paul Virilio were alive today, he would no doubt comment that in the twenty-first century the problems and suffering for humanity are as likely to result from the 'accident' of 'progress' as from the disasters for society that result from war.
technological innovation’ and lethal enhancement that is possible across a range of technologies—​and developed/​exploited by a range of actors, including actors that will not be easily deterred by the strategic calculations that states are often forced to make. Indeed, so much of the Blade Runner states’ ‘security’ energy will likely be placed on managing and policing the multiplication of innovation in an ‘open system’ made up of constantly mutating actors and technologies, resulting in actions in a constantly growing grey zone requiring different types of intervention and control. Kurth Cronin holds out hope for more political and regulatory solutions to the problems of open systems and innovation. Underneath the innovation and multiplication, there will be social, economic and political instability as states and individuals deal with the emergence of innovations of which ChatGPT is the most well-​known starting event; by 2049 we might well be seeing the possibilities closer to those depicted in Blade Runner 2049, of radical enhancement of human and ‘replicant’ life. But the protopian will see benefits emerging from new machinic possibilities where emerging human, social, economic and political problems will be solved or managed by innovative technical fixes—​and older problems of health, insecurity, social and geopolitical conflict will be solved by new technical solutions. The messiness and complexity (and cost) of managing, however, this time of multiplication might be a far more significant problem than any ‘external’ threat or enemy, a time when every citizen can become the agent of lethal enhancement (or scientific, cultural, artistic and economic innovation). So ‘the future’ will most likely be a time of multiplication; the multiplication of tactics, terrains technologies and actors. When wars break-​out they will likely be a messy mix of different stages or times of war, of the old and the new, the ‘tried and tested’ mixed with the experimental; the challenge will be how—​to use Dan Öberg’s (2018) term—​transgressively creative the tactics are, how far they produce dangerous new surprises and possibilities in warfare. The challenge for liberal states in this time of multiplication will be on assessing and risk managing the new possibilities on the threat horizon(s)—​the problem of multiplication and the change in scale that will range from the seemingly small events (the use of balloons, for example, or other ‘invisible’ tactics that the security state fails to see) through to assessing the sheer scale and multiplication of research and development across the planet. So, while this multiplication might emerge in a time where the liberal way of warfare plays out in a more sub-​threshold fashion, in the grey zones of international politics, and so might feel like progress and improvement for those tasked with managing the security of liberal/​world order, it will feel overwhelming to watch over this time of multiplication (and mutation). And in the midst of all this multiplication and mutation will be the possibility of the accident, the accident that might not result from war or terrorism—​but an accident that has the impact of war or terrorism. And it might be the case that our reliance on machines for a constantly growing range of activities reduces our ‘situational awareness,’ our ability to make ‘smart,’

‘creative’ or ‘responsible’ decisions. In Ridley Scott’s Prometheus (a prequel to his Alien film), we see an exploratory mission to a planet go disastrously wrong after the team tasked with exploring the area of interest makes what looks (to the audience) like a series of stupid, irresponsible mistakes (which might account for the often-​negative view of the film). But what might be the result of bad storytelling and plotting might also be showing us something else; the inability of future humans to make sensible decisions after lives and education are meshed into systems where so much of action and thinking is outsourced to machines. Now Russian soldiers and forces fighting in Ukraine are often argued to be unable to make decisions (or possibly think creativity) due to the centralised systems in which they operate, but it might also be the case that future soldiers and decision-​makers in all types of regime will be increasingly limited due to their reliance on the tools provided by technological innovation. Just as citizens will face the challenge of negotiating a world of fake news, disinformation, conspiracy theories and deepfakes, so the future soldier will face the uncertainty of situations that will risk overwhelming their decision-​making capacities—​and as Zygmunt Bauman might add, close down the possibilities for acting ethically and responsibility in the situations in which they find themselves. Some humans and organisations might become ‘smarter’ through their ‘teaming’ (with their AI and robotic assistants)—​and possible modification with new technologies of AI, biotechnology and robotics. But war will likely not change out to 2049 in a way that makes it resemble a Transformers or Terminator movie, war with huge armies of robocops and cyborg warriors; the transformation will occur in the speed and scale of the information and intelligence gathering and possibly in the difficult to attribute missions that produce new types of sabotage and infrastructural war; at the same time, there may be more and more people excluded from the ‘good life’ who are able to orchestrate destructive events, exploring the new possibilities of organised crime and ‘lone wolf’ opportunism. But war will just be one of the many challenges that humanity confronts in this time of AI and multiplication; the social, economic, political and technological impact of AI might make war increasingly look like the ‘spectre’ from our past that continues to haunt us: the radical (and possibly brutal) restructuring and transformation of life on earth, the transformations that were often driven by wars, will be driven by a range of human and non-​human events and political actors in this time of multiplication. Bibliography AI Now Institute. 2018. AI Now 2018 Report: https://​ain​owin​stit​ute.org/​publ​icat​ion/​ai-​now-​ 2018-​rep​ort-​2 Allen, Gregory and Chan, Taniel. 2017. Artificial Intelligence and National Security, Belfer Center for Science and International Affairs: www.belfe​rcen​ter.org/​sites/​defa​ult/​files/​ files/​publ​icat​ion/​AI%20Nat​Sec%20-​%20fi​nal.pdf Averre, David and Griffith, Keith. 2023. ‘Chinese Military Successfully Uses AI WARSHIP Design Programme to Do a Year’s Work in a DAY as Beijing Looks to Create a Navy

The Great Accelerator? AI and the Future of Warfare  209

More Powerful than America's,' Daily Mail, 10 March: www.dailymail.co.uk/news/article-11845121/Chinese-military-successfully-uses-AI-years-work-warship-design-DAY.html
Boyd, Aaron. 2018. 'Pentagon Doesn't Want Real AI in War, Former Official Says,' Defense One: www.defenseone.com/technology/2018/10/pentagon-doesnt-want-real-artificial-intelligence-war-former-official-says/152450/
Brose, Christian. 2019. 'The New Revolution in Military Affairs: War's Sci Fi Future,' Foreign Affairs, May/June: www.foreignaffairs.com/united-states/new-revolution-military-affairs
Brundage, Miles and Avin, Shahar et al. 2018. The Malicious Use of Artificial Intelligence: Forecasting, Prevention and Mitigation: https://docs.google.com/document/d/e/2PACX-1vQzbSybtXtYzORLqGhdRYXUqiFsaEOvftMSnhVgJ-jRh6plwkzzJXoQ-sKtej3HW_0pzWTFY7-1eoGf/pub
Corrigan, Jack. 2018a. 'Marines Turn to Artificial Intelligence to Better Deploy Troops,' Defense One: www.defenseone.com/technology/2018/11/marines-turn-artificial-intelligence-better-deploy-troops/153182/
Corrigan, Jack. 2018b. 'The Pentagon Wants AI to Take Over the Scientific Process,' Defense One: www.defenseone.com/technology/2018/08/pentagon-wants-ai-take-over-scientific-process/150810/
Crawford, Kate. 2021. Atlas of AI: Power, Politics, and the Planetary Costs of Artificial Intelligence (New Haven, CT: Yale University Press).
DARPA. 2018. 'Making Gray-Zone Activity More Black and White': www.darpa.mil/news-events/2018-03-14
Deudney, Daniel. 2020. Dark Skies: Expansionism, Planetary Geopolitics, and the Ends of Humanity (Oxford: Oxford University Press).
Domingos, Pedro. 2017. The Master Algorithm: How the Quest for the Ultimate Learning Machine Will Remake Our World (London: Penguin).
Kallenborn, Zak. 2021. 'Israel's Drone Swarm over Gaza Should Worry Everyone,' Defense One: www.defenseone.com/ideas/2021/07/israels-drone-swarm-over-gaza-should-worry-everyone/183156/
Kania, Elsa B. 2017. Battlefield Singularity: Artificial Intelligence, Military Revolution, and China's Future Military Power, Center for New American Security, 28 November: www.cnas.org/publications/reports/battlefield-singularity-artificial-intelligence-military-revolution-and-chinas-future-military-power
Kilcullen, David. 2020. The Dragons and the Snakes: How the Rest Learned to Fight the West (London: Hurst).
Kleinman, Zoe. 2023. 'US Air Force Denies AI Drone Attacked Operator in Test,' BBC News, 2 June: www.bbc.co.uk/news/technology-65789916
Kurth Cronin, Audrey. 2022. Power to the People: How Open Technological Innovation Is Arming Tomorrow's Terrorists (London: Oxford University Press).
Latiff, Robert. 2017. Future War: Preparing for the New Global Battlefield (London: Alfred Knopf).
Lee, Kai-Fu and Qiufan, Chen. 2021. AI 2041: Ten Visions for Our Future (London: W.H. Allen).
Marshall, Tim. 2023. The Future of Geography (London: Elliot and Thompson).
Miller, Chris. 2022. Chip War: The Fight for the World's Most Critical Technology (London: Simon and Schuster).
Murgia, Madhumita. 2023. 'Sci Fi Writer Ted Chiang: "The Machines We Have Now Are Not Conscious",' Financial Times, 2 June: www.ft.com/content/c1f6d948-3dde-405f-924c-09cc0dcf8c84
Öberg, Dan. 2018. 'Warfare as Design: Transgressive Creativity and Reductive Operational Planning,' Security Dialogue, Vol. 49, Issue 6: 493–509. https://doi.org/10.1177/0967010618795787
O'Neil, Cathy. 2017. Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (London: Penguin).
Payne, Kenneth. 2022. I, Warbot: The Dawn of Artificially Intelligent Conflict (London: C Hurst and Co).
Rees, Martin. 2018. On the Future: Prospects for Humanity (Princeton, NJ: Princeton University Press).
Roubini, Nouriel. 2022. Megathreats (London: John Murray).
Scharre, Paul. 2019. 'Killer Apps: The Real Dangers of an AI Arms Race,' Foreign Affairs, May/June: www.foreignaffairs.com/articles/2019-04-16/killer-apps
Stengers, Isabelle. 2017. Another Science Is Possible (Cambridge: Polity).
Sterling, Bruce. 2023. 'AI Is the Scariest Beast Ever Created, Says Sci-Fi Writer Bruce Sterling,' Newsweek, 28 June: www.newsweek.com/2023/07/21/ai-scariest-beast-ever-created-says-sci-fi-writer-bruce-sterling-1809439.html
Thompson, Nicholas. 2018. 'Emmanuel Macron Talks to Wired About France's AI Strategy,' Wired, 31 March: www.wired.com/story/emmanuel-macron-talks-to-wired-about-frances-ai-strategy/
Virilio, Paul. 2008. Pure War (Los Angeles: Semiotexte).
Wadhwa, Vivek and Salkever, Alex. 2021. 'Killer Robots Are Here. What Do We Do Now?' Foreign Policy, 5 July: https://foreignpolicy.com/2021/07/05/killer-flying-robots-drones-autonomous-ai-artificial-intelligence-facial-recognition-targets-turkey-libya/
Warrell, Helen. 2018. 'Just the Job: How AI Is Helping Build a Better Life for Refugees,' Financial Times, 21 November: www.ft.com/content/9332fffc-ec57-11e8-89c8-d36339d835c0

10 CYBERPUNK INTERNATIONAL POLITICS? Enter the Shimmer

The novels of William Gibson—books like Pattern Recognition, The Peripheral, Agency, Spook Country, Zero History and Neuromancer—might give us a sense of the cyberpunk international politics that could be ahead of us: a messy, polluted and murky terrain filled with a variety of actors pursuing various political, economic and criminal strategies, a world so overloaded with actors and technologies that the 'realist' vision of international politics dominated by states and military threats to territory and national interests looks more simplistic than it ever did; a terrain of new techniques and technologies of theft, espionage, infrastructural sabotage, societal disruption and technical innovation where the image of the traditional military is primarily symbolic: out of date in a time where the military actors are generally 'un-uniformed' and have deep training in a variety of skills, from psychological warfare, drone and robotic control, hacking, sabotage and espionage, through to the new arts of disappearance that allow movement in societies of deep, granular surveillance: the 'hunters' of the Blade Runner state. But due to fears of possible social, political and economic collapse after technological or ecological catastrophe, liberal states will continue to maintain armies and militaries for the worst-case scenario in a global state of emergency where all the infrastructures we depend on disappear. Images of destroyed Russian tanks and of cities and towns reduced to wastelands—images that felt like a return to the horror and warfare of the previous century in Europe—are not a spectacle that a 'great power' wants to reproduce in a time when large, centralised states confront the deadly and creative power of decentralised networks. A world where all sides act like cyberpunks—from the criminal organisations, terror groups and Blade Runner states through to the police and security services of liberal states seeking to stay ahead of the curve, acting in disruptive and unconventional ways to shape and control volatile and constantly
evolving terrains of conflict, crime and competition, the future as imagined in cyberpunk fictions like Ghost in the Shell. In this time of cyberpunk international politics, more people around the planet can live well—and in a way unimaginable to previous generations in terms of health, comfort and entertainment. But some can buy more time in a pleasurable body, more time in 'improved' or enhanced bodies. This might be a significant emerging source of inequality as we approach 2049. In Don DeLillo's Zero K, we see a future where the chance of living longer is offered by a cryogenics service located in secret and remote places on the 'periphery' of the world order. The territorial or geographical anxieties we have remain focused on the chaotic zones where horrific events take place (where fundamentalism can thrive, civil wars are waged, climate emergencies produce human suffering and ecological damage)—but the 'moral panic' and source of anxiety in the future liberal world is as much about the zones where new possibilities in biology and technology are being explored, the anarchic zones of experimentation. So, the sub-threshold, cyberpunk world seems a credible future out to 2049. Not a time where all our social, political, economic and ecological problems are solved—some will be solved with protopian fixes while other social and technological problems will emerge—and not a time when international conflict and competition will disappear. But a time when deterrence by entanglement will exert a powerful geopolitical pressure to avoid the type of wars that destroyed the cities that had created historically unprecedented levels of progress and security in the previous century. These visions of perpetual grey zone and sub-threshold action might confront the ambiguity in the 'civilising process,' the tendency of liberal states to explore and experiment with the new possibilities of war presented as humane warfare. The Italian philosopher Giorgio Agamben provocatively declared that 'we can say that politics secretly works towards the production of emergencies.' There is for Agamben, however, nothing inevitable about the continual production of emergencies: 'It is the task of democratic politics to prevent the development of conditions which lead to hatred, terror, and destruction' (Agamben 2001). In this book's view, we need to work to prevent future 'unnecessary wars' and avoidable emergencies while recognising that, rather than a world of Kant's perpetual peace, the years and decades ahead will at best (and if we are lucky) be a time of perpetual grey zone where international politics will be at the sub-threshold of modernity. But while wars and conflict might stay sub-threshold and war might even 'shrink' in the ways that liberal internationalists hope, there are a number of challenges on the 'threat horizon'—and in the politics of both liberal and authoritarian states—that mean that impure, granular and machinic warfare will at times move out from the confusing, mysterious and unclear sub-threshold undergrounds of cyberpunk international politics. In other words, it could be possible to conclude the book with the suggestion that the liberal way of war out to 2049 will be shaped by the experience of the Global War on Terror in Afghanistan and Iraq; the attitude towards
war in Russia (and China) will be informed by the aftermath of the invasion of Ukraine; all states will be unsettled and anxious as they confront the broad trends outlined in this book, on the unfolding possibilities of the impure, the granular and the machinic; and deterrence by denial, punishment and entanglement will shape the behaviour of states. But those who see an ambiguity to the 'civilising process' in international politics will see moments or events where necropolitical possibilities will surface from the sub-threshold undergrounds of cyberpunk international politics into times of open war and military experimentation. I conclude the book with some of the areas that might take us from the protopian into the necropolitical.

Causes of Futures Wars

The Future of Decision-Making

For the protopian liberal optimist, the world will become increasingly peaceful, but—​like the protagonist in Stanislaw Lem’s Return from the Stars—​life might be experienced as dull and unexciting without the excitement of conflict and risk, a time of radical innovations in our health and security—​and a time of perpetual grey zone. But there is no guarantee that future policymakers and bureaucrats—​tooled up with ‘big data’ and ‘granular’ insights into the terrains and populations they study and analyse—​will have learnt strategic and military lessons from international history, will be immune from the sense of crisis, trauma and emotion that can drive unsuitable and counterproductive policy responses; policymakers and politicians might also descend into hubris as they think about how to play with the new tools at their disposal in what feels like a technological wonderland: as I will discuss in the conclusion of this chapter, this is certainly the warning made by ‘realist’ international relations thinkers like Hans Morgenthau (1965) and political philosophers like Hannah Arendt (1972). It might be one of the continual and seductive tendencies of geopolitical power to use—​and experiment with—​the military force that you have spent so much money, ingenuity and creativity developing; the use of military force is a hard necropolitical habit to break. At the same time, there are provocative views on capitalism and the different economic forces and desires that continue to drive war (Marazzi 2008: 145–​157); there might be economic ‘drivers’ in the ‘nocturnal body’ of the liberal state that continue the tendency to use war as a tool of international politics and that accepts the production of geopolitical disorder and catastrophe as an acceptable risk; the liberal acceptance that some populations (those generally not yet citizens of liberal democracies) will be exposed to risk and suffering in the production of international order is fundamental to Mbembe’s concept of necropolitics. Writing about the Vietnam war, Arendt and Morgenthau both warned about the bureaucratic ‘worlds of fantasy’ and the ‘remoteness from reality’ experienced by bureaucrats and policymakers in the Washington of the 1960s. This tendency of the bureaucrat to be consumed by ideological visions or ‘group think’ that prevent

critical thought or to be seduced by new technological possibilities might remain a problem for liberal states even if we say we are finished with the time of 'relentless wars.' And we do not know how the tendency to wage strategically misguided wars will play out in a multipolar world with authoritarian states that might have even fewer possibilities for critical debate and discussion. Indeed, one of the points made about Putin in 2022 was that he had had a small group around him giving a distorted view of how the invasion would play out, overstating his military power and underestimating the forces Russia would encounter. It was even suggested that his experience of isolation during lockdown, magnified by his paranoia about Covid-19, intensified his remoteness from reality, narrowing his circle of advisors, producing an echo chamber of military and strategic thinking and advice. The invasion might be a 'one off,' a momentary lapse of geopolitical reason—or it might be an indication of emerging great power politics in a multipolar world, a dangerous sign of things to come. Emerging great powers might have watched and learned from the experiences of previous superpowers or empires; they might be aware of the dangers of hubris and overreach, preferring to focus on economic improvement rather than costly wars. At the same time, they might see a liberal world dealing with the various impacts of Covid-19 on its societies and retreating from relentless wars—and so see this as a time to settle their territorial 'issues.' The coming decades might also see a multiplication of strategic mistakes, mistakes emerging from both authoritarian states and liberal states, mistakes that produce even more geopolitical chaos that drags different regions of the world into endless wars. Simply put, we do not know how authoritarian and liberal states will manage this time of geopolitical and technological change—and to what extent policymakers will make strategic mistakes that, for all the 'intelligence' they have access to, lead their states into wars that overestimate their own capabilities and underestimate the reality on the 'ground' (or whatever domain or terrain they operate in). For all our information technology, the time ahead might be a time of disinformation, paranoia, conspiracy theory and isolation—for both citizens and those deciding to wage war.

China and Future Wars

The current debate about China and its foreign policy is generally on whether the emerging great power is charting a radically different path in how it intervenes in the rest of the world—​a more ‘benign’ form of global governance (and control) that wants to avoid the wars waged by Europe and the United States during the twentieth century (and so will operate in the grey zone in a ‘sub-​threshold’ manner, ready for unrestricted warfare) (Chubb 2023); or whether China will inevitably get caught up in conflicts where the desire for economic acceleration and expansion mutates into the desire for military control and primacy. The ‘realist’ view is that while one might desire to do things differently, a rising power will inevitably become caught in conflicts that result in war (Mearsheimer 2021).

There is a question on whether the ambition and desire for control that is playing out domestically translates into an increasingly dangerous foreign policy, a foreign policy driven by nationalist policymakers that—​like policymakers in great powers before them—​believe they have the ability and right to redesign the world order using their growing powers of influence, control and manipulation—​and possibly military force. But simply because a state uses aggressive policy and security tactics at home (and overseas with ‘wolf warrior’ diplomacy) it does not necessarily mean this will translate into war. Maybe the Chinese policymakers will have learnt from watching the liberal way of war and history and maybe they are playing a different game, a much more cautious, long-​term ‘realist’ game. It is common in the popular liberal imagination to see the Chinese citizenry as being ignorant about the actions of the leaders and military elites or afraid of becoming ‘political’; in this view, political elites have no limits placed on the new military, security and policing technologies they can experiment with. Liberal states, some will argue, contain legal and ethical limits on what they can develop and use in an age of robots and artificial intelligence (AI), all the technological possibilities that are currently in the realm of science fiction. But what we see in the emerging Chinese science fiction that has been translated into English is a concern with the role of the military in society and the ethical problems with futuristic military research and technology (the types of concerns that have long been central to science fiction in the liberal world): this is the focus of a book by the most celebrated Chinese science writer outside China, Cixin Liu’s Ball Lightning; there are also concerns about the impact of transformation in China, exemplified books such as Chen Qiufan’s Waste Tide. In other words, the type of concerns about technology and future war in China might not be much different from those in the liberal world (although the key question might be on how strategic thinking in military elites may be different). Simply put, we might not know enough about the ‘strategic culture’ of Chinese leadership or the political possibilities that might unfold domestically; and what we think we know might be flawed. In Destined for War: Can America and China Escape the Thucydides Trap?, Graham Allison discusses the range of possibilities for conflict in times of geopolitical transformation. The Thucydides Trap refers to the war that devastated the two main city states in fifth-​century classical Greece, resulting from what the Athenian historian who documented the war viewed as the anxiety that results when a rising power threatens to displace an existing ‘great’ power. Thucydides comments: ‘It was the rise of Athens and the fear that this instilled in Sparta that made war inevitable’ (Allison 2018: 1). For Allison, the Thucydides’ trap is the ‘severe structural stress caused when a rising power threatens to upend a ruling one. In such conditions, not just extraordinary, expected events, but even ordinary flashpoints of foreign affairs, can trigger large-​scale conflict’ (ibid.: 29). The History of the Peloponnesian War maps out the path to war and the diplomatic encounters that lead to it; Thucydides sees the causes of war in the deep sense of insecurity, of

being displaced and made vulnerable by a new power, of risking to lose its place as a dominant regional power that it had held for almost a century. And displaced by a state that even today remains a ‘symbol of the ultimate military culture,’ a state that, while geographically close, was viewed to contain a dangerous Otherness. Allison is interested in exploring whether the situation that China and the United States might find themselves in could unravel in a similar fashion, the conflicts that play out in the grey zone erupting into great power war, wars that might currently appear impossible after our long peace and deterrence by entanglement: In the years ahead, could a collision between American and Chinese warships in the South China Sea, a drive toward national independence in Taiwan, jockeying between China and Japan over islands on which no one wants to live, instability in North Korea, or even a spiralling economic dispute provide the spark to a war between China and the US that neither wants? (Allison 2018: 155) Even when war is understood as hugely destructive it can still happen and possibly under a sub-​nuclear threshold (as we saw in the scenario depicted in Ghostfleet). All sides might underestimate the costs of war and overestimate their capacity to ‘win.’ As Allison reminds us: ‘And once the military machines are in motion, misunderstandings, miscalculations, and entanglements can escalate to conflict far beyond anyone’s original intent’ (ibid.). Of course, it might be a stretch to compare twenty-​first century America with Sparta in the fifth century. While some might argue that America exhibits aggressive nationalistic and militaristic tendencies, it seems a long way from a Spartan military culture where families would enrol boys at the age of seven in military academies to turn them into warriors: Pinker clearly has a point when he argues that citizens in liberal societies prefer to be distant from the battlefield; the wars that America has fought have been—​for all the ‘macho’ or nationalist celebrations of militarism that often can accompany them—​wars it believes quite reasonably it will be able to fight with limited loss of life (a view that might play down the necropolitical consequences for people on all sides). Fighting a power that matches you in terms of military capacity—​and that might even be able to deploy the tactics of an unrestricted warfare that you cannot imagine or prepare for—​ might render the idea of a Thucydides Trap, a rather outdated historical analogy; the stakes are too high for any action that risks moving out of the sub-​threshold underground of international politics. At the same time, nationalist desires—​ backed up by disinformation campaigns—​might be ignited by a leadership trying to deflect attention away from domestic tensions towards foreign sources; insecurity, conspiracy theory, disinformation and paranoia might lead to war. The emergence of China as an economic and military superpower might herald a new geopolitical Cold War 2.0 (as some refer to it) with sub-​threshold activities that resemble the plots of cyberpunk novels; or we might see the world edging

towards a world war as states feel their technological and economic prowess diminishing. David Mitchell's The Bone Clocks includes a section that depicts Ireland in a Europe dealing with a number of ecological and technological disasters (the Endarkenment); order and control appear to be maintained by a Chinese peacekeeping force in a world shaped profoundly by the cultural and economic impact of China. The 'Great Wall of China' protects much of Ireland from 'the lawlessness that plagues much of Europe as the Endarkenment switches off the power networks and emaciates civil society' (Mitchell 2015: 550). We might have to keep our strategic imaginations far more open and creative than they are used to being—while at the same time remaining 'mindful' of what might be experienced by many as the unsettling experience of living with difference or Otherness; to keep international conflict in the realm of the sub-threshold and impure to avoid the possibility of granular and machinic wars on a global scale, infrastructural wars that would leave 'international society' unable to deal with the multitude of other global dangers and challenges on the horizon. But even without a war between great powers, it seems clear we will be in a time of multiple arms races across a range of technologies—and it's not clear whether the liberal world can get beyond what some view as a tendency to demonstrate its latest tools as a warning or strategy of deterrence. As Andrew Marshall, the 'Yoda' of future war in the Pentagon, put it:

there are ways of psychologically influencing the leadership of another state. I don't mean information warfare, but some demonstration of awesome effects, like being able to set off impressive explosions in the sky. Like, let us show you what we could do to you. Just visually impressing the person.
(Marshall 2003)

In this view, we should expect the occasional necropolitical (although possibly increasingly non-lethal) spectacles of the latest innovations in the liberal way of future warfare.

Climate Change and Future Wars

A view that continues to return to discussions of security, war and climate emergencies is that the threat of environmental insecurity has been overstated; it's a distraction from the real work of managing international conflict and competition in a time of dramatic technological and geopolitical change. But in the 2020s different views do seem to be emerging in the 'mainstream' of military thinking. At my university, undergraduates were given some lectures by someone working at one of the United Kingdom's key state-funded research bodies on future threats and challenges. Having previously served in the navy, he had been placed on a project on climate change; he admitted to us that when he started, he was sceptical about climate change as a security challenge. But by the time he had
finished the project he saw climate change as one of the key challenges for all states in the twenty-​first century. Roy Scranton gives a powerful and vivid account of how his military experience in the Global War on Terror resulted in him developing a radical understanding of future security challenges in the Anthropocene (Scranton 2015). From a military perspective, the position no longer seems to be about climate change as a paranoid dystopian fantasy or manufactured fear but of a geopolitical reality that will get harder to deal with in the decades ahead. Indeed, the sociologist and philosopher Bruno Latour suggests that we are moving from a time where geography or the natural world was the ‘realist’ backdrop or theatre for our geopolitical ‘games’ to a world where climate becomes a significant political ‘actor’ that will shape history in the same way as we believed states or world leaders shaped the world (Latour 2017: 41). In this view, it is anthropocentric hubris to hold on to the belief that states, corporations and their technology/​weaponry will control and shape the future of the planet in a sustainable and equitable manner. Climate emergencies will exceed the capacity of states and corporations to control and police global chaos and disorder. But the counterargument to the pessimistic positions on the ecological future is that we will produce protopian ‘technical fixes’ both in terms of adapting to a climate crisis but also in terms of creating new technologies that will take us beyond an age of pollution, species loss and climate chaos. There is the view that emerging technological superpowers like China will be key innovators in the production of ‘eco-​friendly’ smart cities, where state and business surveillance will extend into the detail of how people move and consume, ‘nudging’ and transforming behaviours; the race will be on how to develop ambitious projects for ‘geoengineering’ the planet, projects for climate repair that may become the primary industries of the century, the industries saving us from extinction. To be sure, there are strong arguments that geo-​engineering the planet is a hubris that will not contain the threat from the multitude of dangerous new ‘actors’ and events emerging out of a climate changed world—​but there will undoubtedly be many attempts to save the planet (or transform the planet and ‘nature’ through new technologies), a planet made possible by various artificial ‘life support’ systems in our post-​natural Anthropocence. These drives and desires for planetary repair may, however, be diverted by the geopolitical competition and arms races across multiple technologies (Roubini 2022). In other words, our necropolitical tendencies will prevent us from saving ourselves and future generations. The ecological change out to 2049 might be uneven; there might be terrains and cities that resemble the barren landscapes of Blade Runner 2049; there will be some areas where life is just different, artificially enhanced and protected, cities like those in Minority Report. People will adapt to new ways of living; the transformation of the world will result in creative responses from individuals and communities finding how to live in a changing world; like some of the communities in Kim Stanley Robinson’s New York 2140, people will explore new types of architecture,

education, work, travel and family life to live in this new world rather than just survive. But there are maps of the future that present us with the possibility of a planet that looks closer to the dystopia of Blade Runner 2049. This planet of ecological chaos and disaster is not necessarily a world of war and proliferating states of climate emergency; there may be creative social, economic and political moves to deal with the state of emergency in a 'humane' and responsible manner. But at the same time, the horror of the climate emergency could take us into a twenty-first century version of Bauman's Modernity and the Holocaust, 'gardening states' protecting their barren gardens from the strangers and refugees that are policed and controlled with the latest innovations in AI and machinic security. Liberal states might be forced into peacekeeping missions and interventions that require all their impure, creative skills, granular techniques and machinic enhancement in order to manage the multiplication of chaotic events that the climate emergency produces; great power competition might intensify in response to the global disorder and the threat to national interests and security—and the new economic and military possibilities that emerge from the transformation of geography and territory. Simply put, a planet of climate emergencies might begin to change the calculation of risk in preparation for military actions. The worst-case scenario is the complete collapse of civilisation leading to the type of world that we see in Cormac McCarthy's The Road or Mad Max: Fury Road. But this is less the terrain of war than the age of gangs and warlords using whatever tools remain after the collapse. The consequences of climate change out to 2049 might be less dramatic—and less interesting to militaries around the world. A key role for militaries in liberal states might be HADR—humanitarian assistance and disaster relief. In an age of cyberpunk international politics, the real work of conflict goes on in the grey zones of hackers or in fast heist-like operations and raids involving special forces. Most people serving in the forces will experience conflict in terms of managing camps of climate refugees and helping communities deal with the aftermath of flooding and other increasingly frequent 'natural' disasters. But this might be wishful thinking. At the same time, the ecological danger of the twenty-first century where 'nature' becomes a political actor might be the 'black swan' or global accident we are not currently thinking about—or where the technologies for its production do not currently exist or are beyond anything we can imagine in the 2020s.

Enter the Shimmer

Whatever the world becomes out to 2049, conflict is unlikely to disappear from the human condition—​but it might look different from the conflicts of previous centuries. A cyberpunk international politics might emerge, worlds transformed by new technologies that change how we live, work, steal and fight. Conflict might take increasingly non-​lethal ‘protopian’ directions: information wars and

cyber-sabotage orchestrated by hackers in the preludes to potential wars that never take place, soldiers replaced by machines that radically reduce the casualties of war on all sides, new types of digital diplomacy, conflicts prevented by the surveillance and (potentially lethal) organisational capabilities of people around the world. Wars might take place that resemble the brutal battlefields of Edge of Tomorrow, congested with machines of all shapes, sizes and speeds. A world where we might have a clearer sense of the challenges and opportunities to be confronted in the second part of the twenty-first century—or a world where we are trapped in a state of confusion, chaos and (manufactured) fear about the messiness of future science, technology, economy and international politics. A world where the pace of change is producing a variety of radical technologies to improve all aspects of existence; or a world where the pace of change has slowed down, a depressive and anxious world lacking the innovation to deal with the problems that are damaging communities around the planet (Roubini 2022: 161). A melancholic planet like the one depicted in Blade Runner where the 'good times' are in the past (to be experienced with robotic versions of the creatures that became extinct)—or 'off world.' The science fiction/horror film Annihilation (2018)—directed by Alex Garland and based on the book by Jeff VanderMeer—provides warnings about a variety of problems of the human condition in the twenty-first century. The story is about a biologist and university researcher—Lena—who is asked to go on a research mission into Area X, into the Shimmer, a mysterious 'zone' that has emerged in Florida: Lena's husband is the only person to have returned from the Shimmer, after a mission with Army special forces, but the journey has left him ill and disoriented. Lena joins a group of female scientists and heads into the mysterious zone in search of a lighthouse that is understood to be the origin of the Shimmer. The researchers soon become disoriented in the zone, losing any sense of the time that is passing and uncovering strange plants and creatures: they come to realise that the Shimmer is a zone of mutation, a zone where strange new combinations or hybrids are emerging: they encounter an alligator that has the teeth of a shark, and later one of the scientists seems to have merged with a bear. The group uncovers horrific footage of the events that Lena's husband was involved in, a descent into violence and madness as the earlier team became aware of the mutations that were occurring inside and outside them. The Shimmer is a prism, distorting and transforming the DNA of all living beings inside the zone. Lena heads to the lighthouse, where she uncovers that her husband killed himself with a grenade—and was then replaced with a 'copy.' Lena encounters a strange humanoid figure that then appears to become a copy of her. 'Lena'—or possibly whatever Lena has become—manages to escape and supposedly destroys the 'copy' of herself: when she is reunited with her husband, we are left uncertain whether it is actually Lena that has returned: we see the light of the Shimmer in her eyes.

The Shimmer is a disorientating environment that produces new ‘hybrids’ and strange combinations in a rapidly evolving (or mutating) environment. One way of reading the film in the context of the twenty-​first century is that we are entering a Shimmer—​but a Shimmer that absorbs the entire planet. This zone will produce radically new ways of living and being, from the new types of body that may emerge from biology and technology through to the enhancement of the body through ‘cyborg’ prostheses, the emergence of new environments, from climate changed geographies through to new artificial and geoengineered habitats, through to political mutations that emerge from the ways ‘vision machines’ shape and influence our perception of the world. Humans will live differently and become different; a world of homogenisation and globalisation in emerging culture, technology and politics; or a world of difference with divergence in technology, politics, economy and values that may splinter into radically different zones of life. There may be different types of Shimmer—​that create the possibility of different types of world order, life-​worlds (and necropolitical death-​worlds) with different approaches to conflict, war and violence; different ways of being human in a time of technological and geopolitical change: each Shimmer will be disorientating in all areas of life, death and international (necro)politics. Each Shimmer might create different futures for warfare and international politics. Green Shimmer. A Green Shimmer might emerge where we experiment with different ways of living with the planet, creating new ways of living that are architecturally innovative in the way they try to deal with problems of sustainability, social exclusion/​inequality, crime and community; we might begin to become something different from the early twenty-​first century consumer as we experiment with new ways of working, eating, moving, consuming, of organising political life/​ community and living with technology. We begin to look and behave in a way that might be described as future primitive, living lives creatively with technology but not dominated by technology; a new ecological attitude to life that experiments with approaches to living with (and transforming) ecological degradation—​to live beyond ecological degradation in a way that will produce a new planetary civilisation built on values, beliefs and political ideas different to the ones that dominated modernity (Latour and Schultz 2022). The experience of Covid-​19 brought home the fragility of the interconnected world; the impact of new AI and robotic technology revealed the ominous prospect of a world without work or employment—​but the emerging possibilities of technology opened up new ways of organising society, of being in the world. Underpinning this emerging politics is a new sense of the tragic, a ‘new realism’ that rejects the technocratic visions of both authoritarians and liberals (Kaplan, Gray and Thompson 2023). After the pandemic and war in Ukraine, a new generation around the planet mobilises to broaden our concept of security and—​while the world still relies on Cold War strategies of deterrence—​their radical new moves plan to demilitarise the planet, to police the world differently. All leaders around the world have to

respond to the new generation’s demands for a different future (the manifestos and inspirations for this new world are books like Donna Haraway’s Staying with the Trouble, Achille Mbembe’s The Earthly Community, Kim Stanley Robinson’s New York 2140 and The Ministry For The Future, the stories of Ursula Le Guin). There are still pockets of conflict around the planet with those inhabiting the identities inherited from the old days of the Cold War and the War on Terror; the focus and priority is on non-​lethal, protopian solutions for conflict resolution. The underlying focus is on preventing the conditions of planetary chaos and disorder. While the use of military force is viewed as sometimes necessary, there is a desire to prevent cycles of trauma caused by necropolitical violence; new generations in Europe and North America feel the need to deal with the trauma of war and ecological degradation as much as the youthful leaders in China and India. But when needed, the darker arts of the grey zone are deployed if it will lead to conflict prevention or creative reimagining of world politics; there is a constant concern about impure and granular measures creating necropolitical consequences. The military is often tasked with dealing with the interventions required to protect the peoples dealing with the accidents of modernity, the damage that will possibly require centuries to repair; machines are used to help us repair the planet as much as they are for policing and security. Around the planet, world leaders and citizens realise that, to escape the legacy of modernity, the different costs of war (such as the ecological consequences, the trauma that continues to damage individuals and societies across generations) and the cost of preparing for future wars in times of technological innovation and multiplication mean that the creation and design of a different world order is a matter of survival. Dark Shimmer. But there may also be Dark Shimmers as states attempt to control a planet dealing with climate crisis, inequality and technological disruption: states will create cities of control transformed by increasingly intimate technologies of surveillance and policing, attempting to control ecological crisis through the production of geoengineered and biotechnologically manufactured or modified habitats; humans will rely on various forms of enhancement on and in their bodies to be able to cope the stress of highly competitive, ruthless and stressful environments: life for most humans is one of constant unease because of the information overload—​experienced through various vision machines—​and the pace of constant political and economic change (along with the unease/​paranoia about techniques that are used to influence and control the citizenry in surveillance capitalism). This is a world that thinkers like Achille Mbembe saw as a necropolitical possibility—​and that we were unable to avoid. The world is divided into a number of economic and political blocks with a variety of different approaches to social and political control (liberal democracies try to hold on to their traditional values, but the world is becoming too chaotic to maintain a reasonable balance of liberty and security; waves of racist populism and authoritarianism in the liberal world are the norm); the primary interstate conflict is in the grey zone of sabotage, espionage

and subversion. When brutal ‘regional’ conflicts erupt or when actors look like they are becoming threats to a region, force is deployed ruthlessly and effectively, orchestrated by militaries deploying machines and highly trained special forces, soldiers whose brains and bodies are in the process of constant enhancement; those on the outside might risk giving sub-​threshold support to those being invaded; all the possibilities of creativity, granularity and new machines are experimented with to the max. But in terms of diplomacy and foreign policy, it’s generally a case of live and let die; states and societies try to maintain indifference to the suffering that occurs in other regions. All states are paranoid about the escalation of conflict beyond the grey zone; there is an arms race across a variety of technologies in this time of multiplication. Liberal states hold on to the possibility of protopian developments for military technology; outside the liberal world, there are few limits on necropolitical experimentation. Rather than leading to a new era of global governance and cooperation, Covid-​ 19 was the beginning of a new period of Cold War and deepening surveillance society around the planet. The problem of inequality is reduced to a problem of more efficient policing and techniques of control and division. Crime and corruption are one of the major causes of human misery around the planet. The planet is dealing with multiple sources of ecological degradation—​but there is not the political will or capability to deal with the problems where geoengineering is the last hope. This is the future of Elysium or Ghost in the Shell. Silver Shimmer. There might also be technocratic Silver Shimmer where—​ while there is still terrible inequality—​there are experiments in work and welfare to manage the constant disruptions of the twenty-​first century, a world that looks more like Steven Spielberg’s Minority Report (set in 2054) than Blade Runner 2049; while the climate crisis impacts on all states, there are technological and policy responses that seek to both repair the planet and protect nature from further destruction—​while at the same time maintaining growth in a global economy that provides radically new ways of living for the rich but also (less dramatic) benefits for the poorer in society: there are radically different worlds in this Silver Shimmer, but even those at the bottom are living lives of technologically enhanced comfort. Geopolitical change and pressure is managed by the smart diplomats and bureaucrats who have access to worlds of data, intelligence and influence that result in the fast resolution of diplomatic tensions—​and the fast resolution of emerging conflicts. Conflict takes place in the grey zone, designed by highly trained technicians, the realm of cyberpunk international politics and Blade Runner states; when conflicts emerge non-​lethal, protopian solutions are the preferred option (by most states)—​but if people are being killed or made to suffer then states will organise a military response to protect civilians, combining granular tactics and machinic innovation. Covid-​19 was the beginning of a new era of smart global governance and leadership—​all states saw the benefits of interconnection and collaboration as the only way to deal with the problems of the twenty-​first century. The Russo-​Ukrainian

war was a warning about the dangers of war in the twenty-first century, the age of the drone becoming the age of AI and next generation war machines. Liberalism evolved and adapted to the problems that had emerged from neoliberal times, work inspired by thinkers like G. John Ikenberry in A World Safe for Democracy: Liberal Internationalism and the Crises of Global Order. But underneath the ambitious plans for world order is still a technocratic optimism that is sowing the seeds for future chaos. And the desire to police world order results in necropolitical desires and wars or interventions where all the protopian technologies and tactics produce new zones of chaos and violence around the planet, unintended consequences and strategic surprises that drive the creation of new types of military innovation and technology. But leaders try to learn from their mistakes—while the technicians design new generations of weaponry.

Concluding Remarks: Future Worlds of Fantasy and International Politics

The world out to 2049 might be composed of different periods of Shimmer (green, dark or silver), depending on the political, economic, cultural and ethical climate of the time; some states and regions might experiment with different Shimmers in different zones or cities; some Shimmers might be rejected. By 2049 there may be Shimmers that we cannot currently imagine. But in the view sketched here, we are heading into an unsettling time of personal, societal, political, economic, technological and geopolitical transformation/mutation. In terms of the future of warfare, one of the questions that Annihilation leaves us with is this: if we are entering a time of radical transformation, of geopolitical, ecological, economic and technological 'mutation', what will we be when we leave (if indeed we do leave) the Shimmer? The world will probably be strange in 2049 (possibly extremely strange and unsettling from the perspective of the 2020s); war will be stranger still (although war is always Other, always strange and disturbing in its violent destruction and mutilation of whatever society or lifeworld it falls upon, and in the different ways it can use emerging technologies of 'shock and awe'). Conflict and international politics might be transformed by protopian and non-lethal possibilities; interstate war might be deterred by the possibility of apocalyptic war; conflict might take place in the rapidly evolving grey zones of cyberpunk international politics; new terrains of conflict, such as the astropolitical domain of space and great power competition, the deep sea for economic competition, and the social spaces of virtual reality, may have grown in strategic importance.

There is probably a tendency in liberal societies to believe that not only will we have new and improved technology with every passing decade (and possibly times of revolutionary strategic surprises and transformation), we will also produce more and more people who will be smarter than we are now, an intelligence explosion enabled by education and the new tools that will help us enhance ourselves, the creation of people who can lead us responsibly and safely through the Shimmer. As William Gibson says in an interview, each generation tends to assume that 'the inhabitants of the past are hicks and rubes, and the inhabitants of the future are effete, overcomplicated beings with big brains and weak figures. We always think of ourselves as the cream of creation' (Gibson and Beauman 2014). In other words, we think of our ancestors as the crude, violent characters of a Cormac McCarthy novel and the people of the future as supersmart 'cyborgs' enjoying a peaceful world filled with amazing technology; more and more people will have access to the best education, and everyone will have the possibility of enhancing or improving their intelligence through the protopian products of new research. In the optimistic or hopeful view, we will have leaders trained in what Joseph Nye describes as 'smart power,' the skills to get us through the complex global challenges ahead (Nye 2008).

For all the focus in this book on the potential for the futuristic transformation of warfare, so much that is fundamental to war will likely stay the same as it was for Sun Tzu or Clausewitz (Martin 2023). The actions (or inactions) of states will continue to be shaped by the force of deterrence (pressure and influence from the outside) and by the force of the citizenry (the anti-war resistance that can emerge from inside a state or military). To be sure, the geopolitical 'Shimmer' might be a time when risky new strategic (mis)calculations come into play, either because of the global 'accidents' that occur (linked to climate emergency, pandemics, inequality or dangers we cannot currently imagine) or because of the belief or faith in new technological and 'game changing' possibilities in humane warfare. These military temptations are unlikely to disappear; if anything, they might multiply and intensify in the years out to 2049. The world out to 2049 might be both a wonderland of new technological possibilities and a chaotic age of new global dangers: both authoritarian states and liberal states may embark on strategies that produce more war and chaos; there is no guarantee that we are moving beyond the temptation to embark on 'relentless wars': we might be 'on a break' while we transform our capabilities until the next time necropolitical desires take hold and lead us into a new mission justified as essential to our security and world order. As Agamben suggests, politics secretly works towards the production of emergencies; in this view, for all the faith in progress and in our ability to improve all aspects of life, including warfare, we do not fully understand what we are, what remains secret to us, what remains in what Mbembe describes as the nocturnal body of the state. In this sense, liberal states and societies might be like the replicants in Blade Runner, uncertain about who they really are. In 1965, the realist thinker Hans Morgenthau made this observation about policymakers in the United States during the Vietnam War:

    The world of fantasy in which those who govern us live inevitably begets failure in action; for the facts are what they are, and they are not to be trifled with. As inevitably, it begets dissimulation; for the world of fantasy must be protected from the world of reality. (Morgenthau 1965: 17)

What Morgenthau is suggesting is that there are times when policymakers inhabit worlds of fantasy, lacking realism about what is possible and what is happening 'on the ground' (the necropolitical reality they are creating from their offices). As Hannah Arendt (1972) added, policymakers and decision makers might begin to lie to others, and to lie to themselves. Cultures of deception might be the exception in politics or, as Arendt seems to imply in her essay on 'Lying in Politics,' they might become the norm. Many would argue that President George W. Bush and the neoconservatives existed in Morgenthau's 'worlds of fantasy', just as Vladimir Putin did when he ordered the invasion of Ukraine. The question for the world out to 2049 (and beyond) is whether the experience of war in the century so far, combined with the multiplication of actors in a multipolar world and the emergence of new technologies and tactics of warfare, will limit the use of warfare as a tool of international politics, puncturing worlds of fantasy before they begin to be made real. Or, in times of conspiracy theory, disinformation, fake news and deepfakes, will policymakers descend deeper into the 'rabbit holes' of their worlds of fantasy? Paul Virilio suggests that technology can become 'a sort of Wonderland in which the warrior, like a child in its playpen, wants to try out everything, show off everything, for fear of otherwise seeming weak and isolated' (Virilio 2001: 10). These times of acceleration might intensify and multiply the technological 'wonderlands' and military 'worlds of fantasy' that politicians and bureaucrats can enter into and be consumed by. We need to hope that humanity can imagine, create and design futures beyond the necropolitical matrix that seems unable to escape the use of war to produce (in)security around the planet. In times of uncertainty, fear, propaganda and disinformation, we need to find ways to see the possibility of worlds and futures beyond a reality that looks increasingly like a hybrid of dystopian cyberpunk and images of war and suffering from the previous century, a world between worlds.

Bibliography

Agamben, Giorgio. 2001. 'Security and Terror,' Theory & Event, Vol. 5, Issue 4: https://muse.jhu.edu/article/32641
Allison, Graham. 2018. Destined for War: Can America and China Escape the Thucydides' Trap? (London: Scribe).
Arendt, Hannah. 1972. Crises of the Republic (London: Harcourt Publishers).
Chubb, Andrew. 2023. 'Taiwan Strait Scenarios,' Project Syndicate, 12 June: www.project-syndicate.org/magazine/taiwan-strait-island-outposts-vulnerable-to-china-by-andrew-chubb-2023-06
Gibson, William and Beauman, Ned. 2014. 'Interview: William Gibson,' The Guardian, 16 November: www.theguardian.com/books/2014/nov/16/william-gibson-interview-the-peripheral
Ikenberry, G. John. 2020. A World Safe for Democracy: Liberal Internationalism and the Crises of Global Order (London: Yale University Press).
Kaplan, Robert D., Gray, John, and Thompson, Helen. 2023. 'The New Age of Tragedy,' The New Statesman, 26 April: www.newstatesman.com/ideas/2023/04/new-age-tragedy-china-food-europe-energy-robert-kaplan-helen-thompson-john-gray
Latour, Bruno. 2017. Down to Earth: Politics in the New Climatic Regime (Cambridge: Polity).
Latour, Bruno and Schultz, Nikolaj. 2022. On the Emergence of an Ecological Class: A Memo (Cambridge: Polity).
Marazzi, Christian. 2008. Capital and Language: From the New Economy to the War Economy (Los Angeles: Semiotexte).
Marshall, Andrew. 2003. 'The Marshall Plan,' Wired, February: www.wired.com/2003/02/marshall/
Martin, Mike. 2023. How to Fight a War (London: Hurst).
Mearsheimer, John. 2021. 'The Inevitable Rivalry: America, China, and the Tragedy of Great-Power Politics,' Foreign Affairs, November–December: www.foreignaffairs.com/articles/china/2021-10-19/inevitable-rivalry-cold-war
Mitchell, David. 2015. The Bone Clocks (London: Sceptre).
Morgenthau, Hans. 1965. Vietnam and the United States (Washington, DC: Public Affairs Press).
Nye, Joseph. 2008. The Powers to Lead: Soft, Hard and Smart (London: Academic).
Roubini, Nouriel. 2022. Megathreats (London: John Murray).
Scranton, Roy. 2015. Learning to Die in the Anthropocene: Reflections on the End of a Civilization (San Francisco: City Lights).
Virilio, Paul. 2001. Strategy of Deception (London: Verso).

INDEX

Note: Page numbers in italic refers to Figures. The acronym “AI” is used for “artificial intelligence” throughout this index. acceleration: economic 214; technological see technological acceleration; Virilio on 54–​60 accidents 3, 10, 60, 62, 112; involving AI 182, 200, 201, 203, 204, 205; catastrophic 88; of decision-​making 67; involving drones 174, 176; general 58; global 9, 203, 225; and international conflict 18; and microcube of death 119; of modernity 222; involving Ninja bomb 118–​19; organisational 139; involving robots 176; technological see technological accidents; in urban warfare 128 Adams, Douglas 4 Afghanistan 18–​19, 29, 47, 59, 68, 87, 120, 154–​5, 168, 212–​13 Africa 8, 44, 59, 72, 119–​20, 129–​34 after the future 5 Agamben, Giorgio 33, 212, 225 agile warrior 105, 139 AI see artificial intelligence (AI) AI Now Institute 204 AI 2041 5 AI-​enabled tools 144–​5, 191–​2, 193–​4, 194–​5, 196. 198–​9 AI/​machinic arms race 206 algorithmic war 183 Algorithmic War 163

Algorithmic Warfare Cross-​Functional Team 188, 203 algorithms 204 Alien 56, 185–​6, 208 Alita: Battle Angel 58–​9 All You Need Is Kill 9 Allison, Graham 149, 215, 216 AlphaGo 186 alternative futures 5 Amazon 117, 145, 198 ambiguous war 11, 19, 20, 74, 77, 106, 189 Annihilation 220, 224 Another Now 5 Anthropocene 57, 60, 218 anti-​mosaic war 197 AntMan 118 anxiety 27, 33; about AI 159, 182, 194, 200; over authoritarian dystopias 94; over change in scale 195; over China 70, 132; about all things ‘cyber’ 11–​12, 84, 93, 94, 95, 96, 98, 108; about de-​coupling 56; about deep fakes 189; existential 182; over feral cities 125; about the future 28–​9; geopolitical 11; over global ecological crisis 129; over granular war 154; over impure war 74–​5; over information 52; and liberal world 138, 212; and liquid modernity 54, 117; moral 173, 193; and mosaic warfare

177; arising from pace of change 59; over Russia 70, 74, 76; over sabotage 96; about security 184; and the Shoggoth 187, 195, 198–​9, 204; over technology 131, 132, 147, 172, 199; and Thucydides Trap 215 apocalyptic international politics 7–​9, 28 Aradau, Claudia 5 Arendt, Hannah 30, 51, 213–​14, 226 Armenia 168 armies 6, 9, 14, 21, 34, 70–​1, 99, 115, 147, 162, 190, 197, 208, 211; machinic 193, 194 Arquilla, John: on battle deaths 45–​6; on cyberwar 99–​100; on finding always bears flanking 148; on Russia 142; on social netwar 100, 101, 108, 109; on Stuxnet 95–​6 Art of War, The 79, 84 artificial intelligence (AI): accountability gap 204; banality of 199–​205; Belfer report on 187, 190, 194, 195–​6, 197, 200; and future of warfare 182–​208; general 187, 203; matrix of bureaucracy 200; narrow 187, 203; in the time of the Shoggoth 184–​90; systems 185, 186, 187, 188–​9, 190, 191, 192–​3, 201, 202, 203, 204; terrorism in an age of 134–​6; vulnerabilities 190; see also intelligentisation, of warfare ASKE (Automating Scientific Knowledge Extraction) 202 astropolitics 3, 10, 17, 68, 92, 205–​6 asymmetric vulnerabilities 81, 84 asymmetrical actions 73 Atlas of AI 36, 183 Attack of the Clones 194 Attali, Jacques 7 authoritarian leadership 32, 41, 75 authoritarian regimes 9, 17, 28, 32, 34, 37–​8, 41, 52, 54, 85, 86, 87–​8, 120 authoritarianism 52, 129, 222–​3 authority 31, 122–​3, 199 Automating Scientific Knowledge Extraction (ASKE) 202 Autonomous 174 autonomous drones 185, 191–​2 autonomous weapons 14, 82, 173, 184, 188, 193, 194, 203 Azerbaijan 168 banality: of AI 199–​205; of evil 51 Bangladesh Bank heist 107

bare life 33 Bartelson, Jens 29 battle angels 21, 158–​78, 184, 190–​9 battle deaths 45 battlefield information systems 100 battlefield singularity 81, 184 battlefields 35, 68, 121, 141, 143, 144, 145, 148, 150, 168, 176, 183, 198, 216; electronic 173; of the future 146; transparent 169, 177 battlespaces 118, 147, 153; congested 32, 136, 143, 148, 154; emerging 168; messy 170; shrinking 128, 129; urban 141–​2 Bauman, Zygmunt 20, 183, 204, 208; on invisible war 161; on liberal societies 61, 62, 88, 128; on liquid modernity 116, 119–​20, 173, 176, 219; on modernity and violence 49–​54, 192–​3; on moral distance 90, 163; on solid modernity 115 Belfer report on AI 187, 190, 194, 195–​6, 197, 200 Belton, Catherine 75 Berardi, Franco 5 better angels of our nature 9, 130, 153, 176 Better Angels of Our Nature, The 31, 33, 39, 51–​2, 174 big data 44, 94, 117, 198, 213 Big Kill, The 45, 46 Bin Laden, Osama 13, 68–​9, 71, 119, 123, 134 biological war 74, 106, 126, 206 biology 39, 44, 56, 61, 111, 127, 172, 183, 205, 212, 221 biotechnology 56, 129, 208, 222 bitskrieg 100, 102, 108 Bitskrieg 109 Black Earth 53 Black Hawk Down 13, 121, 126, 134 Black Mirror 94 black swan events 10, 58, 80–​1, 108, 184, 219 Blade Runner 1, 4, 41–​2, 56, 61, 158, 160, 171–​2, 187, 220, 225 blade runner states 162–​71, 172, 176, 188, 190–​9, 200, 207, 211–​12, 223 Blade Runner 2049 3, 7, 58, 61, 125, 158, 159, 162, 172, 174, 177, 178, 182, 201, 205, 207, 218, 219, 223 blade runners 45, 158, 166 blocking 38, 109, 170 Blum, Andrew 96 Bomb, The 47 Bone Clocks, The 3, 217

bonsai army 21, 117, 118 Bousquet, Antoine 56–​7, 85, 86, 171, 176 brain-​control weapons 82–​3 bribery 94, 95, 133 Brief History of the Future, A 7 British Army 1–​3, 120 Brose, Christian 197–​8 brutality 47, 115, 129, 134 Buchanan, Ben 96–​7, 106 Bugsplat project 163 bureaucracy 34, 200, 201, 202; and Nazi concentration camps 49, 51 Bush, President George W. 16, 226 Campaign to Stop Killer Robots, The 172, 173 Candy House, The 101 capitalism 7, 12, 28, 29, 34, 48, 52, 77–​8, 138; surveillance 56, 57, 94, 117, 138, 222 catastrophe 5, 33, 52, 58, 59, 213; climate 57; ecological 211; humanitarian 72, 82 catastrophic accidents 88 catastrophic events 5, 33–​4, 44, 99, 170 Chamayou, Gregoire 54, 118, 119, 161, 162, 163 changes in scale 13, 212, 59, 102, 114, 115–​21, 122, 128, 138, 148, 155, 195, 207 chaos 2, 22, 29, 34, 41, 58, 70, 72, 74, 75, 85, 86, 88, 121, 142, 147, 176, 214, 218, 219, 220, 222, 224, 225 ChatGPT 159, 182, 186, 207 Chen Qiufan 5, 215 Chiang, Ted 185, 186 Children of Men 58 China, and future wars 214–​17 Chinese science fiction 215 Cirillo, Pasquale 46 citizenry 32, 41, 45, 61, 93, 161, 166, 215, 222, 225 civilisation of affects 40 civilising process 20, 29, 39, 40, 41, 46, 47, 48, 49, 50, 51, 52, 53, 54, 57, 61, 88, 153, 173, 174, 212, 213 Civilizing Process, The 39 Clausewitz, Carl von 38, 72, 99, 225 climate catastrophe 57 climate change 1, 5, 7, 28, 33, 34, 48, 53, 55, 58, 93, 122–​3, 124, 125, 126, 158, 221; and future wars 217–​19 climate emergencies 2, 120, 134, 138, 141, 177, 203, 212, 217, 218, 219 Coker, Christopher 6, 7, 38, 114

Cold War 10, 11, 22, 30, 67, 75, 77–​8, 116, 117, 140, 151, 183, 221, 222, 223 Cold War 2.0 216–​17 Cole, August 6, 150 Collection and Monitoring via Planning for Active Situational Scenarios, The (COMPASS) 202 colonialism 40, 47–​8, 54, 59, 61, 129, 133 command and control 13, 21, 82, 85, 86, 100, 101, 108, 112, 142, 149, 169, 170, 172, 177, 193 commercial drones 166, 197 commodification 101 communication 10, 11, 13, 21, 28, 29, 33, 55, 57, 90, 91–​2, 93, 94, 95, 101, 108, 109, 111, 116, 122, 149, 160, 191 COMPASS (The Collection and Monitoring via Planning for Active Situational Scenarios) 202 competition 7, 10, 12, 17–​18, 48–​9, 212; economic 224; geopolitical 59, 121, 218; great power 123, 136, 140, 177, 183, 185, 219, 224; grey zone of 22–​3; to innovate 81; international 205–​6, 212, 217; interstate 9, 67, 68, 70, 95–​6, 183; military 81, 183; near peer 136; resource 17, 36, 78; state-​on-​state 9, 67, 68, 70, 95–​6, 183; sub-​threshold 20, 149 complexity 9, 86–​7, 121, 202, 207; of battlespaces 32, 143, 160; cultural 34; ethno-​political 174; of future war 76; human 34; infrastructural 71; of strategic consequences 178; sub-​threshold 23; technological 2, 30, 34, 56, 68–​9, 121, 127, 141, 143, 167, 170; urban 123, 140; of war 85, 86, 112, 143, 160, 196, 197 concentration camps 33, 49–​50, 51, 52, 53 conflict: international see international conflict; resolution of 9, 12, 38, 110, 197, 222; sub-​nuclear 150; urban 124, 126, 128, 129, 136, 140, 141–​2; virtual 99 congested battlespaces 32, 136, 143, 148, 154 connectivity 36, 93, 100, 101, 102, 103, 104, 105, 108, 109, 111, 116, 126, 191 conspiracy theory 73–​4, 130, 132, 190, 208, 214, 216, 226 consumer society 30, 53–​4 contactless action 73, 76, 87, 95, 112 contemporary war 108, 114, 167 cosmopolitan condition 30, 33 Covid-​19 pandemic 10, 58, 76, 80, 214, 221–​2, 223–​4

Crawford, Kate 36, 55, 183, 185, 188, 200–​1 creative tactics 18, 71, 81, 196 creativity 12, 15, 18, 100, 109, 114, 127, 138, 147, 223; and AI-​enabled infrastructural war 192; dangerous 69, 83; in future warfare 197; and impure war 11, 100, 127, 177; sadistic 164; and the Shoggoth 184; sub-​threshold 76, 79, 80, 85, 86; tactical 86, 87; transgressive 87–​8 crime: cyber-​ 80, 90, 91, 95, 102, 104, 107, 108, 195–​6, 199; of the future 90–​3, 98; organised 90, 91, 102, 107–​8, 134, 135, 165, 166, 182, 208; urban 35 criminal groups 10, 93–​4, 170 criminal organisations 67, 96, 106, 123, 153, 165, 167, 211–​12 critical infrastructures 11, 19, 21, 69, 71, 82, 93, 95, 96, 98, 108, 112, 114, 149, 150, 154 Cronenberg, Brandon 138 Cronenberg, David 58, 101 cultural complexity 34 culture: organisational 72, 86; strategic 71–​2, 86, 215 cyber 2, 3, 11–​12, 108; beyond 110–​12; sub-​threshold 105–​8 cyber capabilities 96–​7, 105, 107 cyber defence 92, 106, 188 cyber espionage 18, 70, 71, 90, 92, 104, 105, 108, 109, 111, 150 cyber insecurity 74, 97 cyber 9/​11 96, 97, 98, 99, 100, 108 cyber operations 87, 97, 107 Cyber Pearl Harbor 98, 99, 108 cyber play, state of 93–​101 cyberattacks 13, 18, 59, 73, 84, 93, 96, 97–​8, 99, 100, 102, 106, 109, 133, 142, 189–​90, 195–​6, 203 cybercrime 80, 90, 91, 95, 102, 104, 107, 108, 195–​6, 199 cyber(in)security 58, 98 cyberpunk international politics 211–​26 cyber-​sabotage 21, 70, 97, 104, 105, 109, 126, 147, 196, 220 cybersecurity 21, 92, 93, 98, 105–​6, 108, 109, 110, 155, 195–​6, 204; defensive 92; informational dimension of 93–​5; infrastructural dimension of 95–​7 cybervulnerabilities 80, 92–​3, 96, 97–​8, 100, 102, 103, 106, 107, 189–​90 cyberwar 11, 21, 36, 59, 71, 80, 88, 90–​112; uncertainty of 97–​101

cyberweapons 96, 100 cyborgs 1, 45, 57, 70, 92, 105, 108, 182, 205, 206, 208, 221, 225 cybotage 99, 102 Daemon 98 dangerous creativity 69, 83 Dark Shimmer 222–​3 DARPA (Defense Advanced Research Projects Agency) 84, 85, 172, 202 death-​drive 45 decentralisation 142, 143, 145, 146, 148, 160, 211 deception 38, 41, 85, 100, 101, 107–​8, 109, 110, 111, 133, 142, 167, 168, 226; and impure war 86–​8 decision-​making 38, 67, 72, 84, 85, 86, 112, 140, 144, 200, 202, 203, 205, 208; future of 213–​14 deep fakes 130, 133, 134, 135, 188–​9, 206, 208, 226 defence 9, 34, 53, 81, 116, 117, 154; cyber-​ 92, 106, 188; passive 168 Defence Science and Technology Lab (DSTL) 81, 152 Defense Advanced Research Projects Agency (DARPA) 84, 85, 172, 202 defensive cybersecurity 92 defensive dominance 148, 153–​4, 190 dehumanisation 48, 51, 95, 111, 193, 199 Delillo, Don 205, 212 Demmers, Jolle 119–​20 democracy 33, 34, 48, 52; of emotion 116; liberal see liberal democracy; nocturnal body of 47, 48, 54 density, of future war 146, 154–​5 Der Derian, James 155, 178 Destined for War 215–​16 deterrence: by denial 67, 129, 213; by entanglement 9, 12, 19, 28, 68, 150, 212, 216; by punishment 68 Dick, Philip K. 3, 102 digital age 36, 78, 79, 91, 93, 96, 98, 102, 104, 111, 116 digital security 189–​90 digital world 67, 98–​9 Dillon, Michael 16 disaggregation 143–​4 Disappearing Violence 154–​5 disinformation 52, 70, 71, 73–​4, 84, 100, 104, 108, 187, 189, 190, 206, 208, 214, 216, 226 domestic resolve 151

Domingos, Pedro 190 Dragon Day 98 Dragons and the Snakes, The 74 Dredd 7, 125, 128, 154 drone strikes 11, 13, 18, 104, 118, 119, 147, 155, 161, 163, 164, 174, 177, 178 drone swarms 14, 133, 135, 165, 169, 195, 197, 200, 201, 204, 206 drone use 161, 163, 166, 167, 175 drone war 21–​2, 59, 161, 174, 175, 192, 194; and the Blade Runner state 162–​71 drones: over Aleppo 2042 134–​6; autonomous 185, 191–​2; commercial 166, 197; humanitarian 161, 174, 176; micro-​ 12, 118, 123, 133, 152–​3, 167, 174, 190, 196–​7; non-​state actor use of 166; protopian 171 DSTL (Defence Science and Technology Lab) 81, 152 dystopian age 131, 190 dystopian pessimism 57, 102–​3 dystopian science fiction 56, 80, 83, 103, 131, 172, 174 ecological catastrophe 211 ecological degradation 5, 33, 57, 221, 222, 223 economic acceleration 214 economic competition 224 economic growth 18, 56, 125, 147 economic innovation 153, 184, 207 economic policy 17, 56 economic science fiction 5 economic/​social collapse 2 Edge of Tomorrow, The 9, 220 efficiency 36, 51, 93, 117, 133, 145, 147, 159, 169, 178, 185, 189, 201 Egan, Jennifer 101 Eichmann in Jerusalem 51 electronic battlefield 173 Elias, Norbert 39, 40, 41, 52, 53 elites 9, 108, 215 Elysium 7–​8, 57, 58, 140, 143, 205, 223 emergencies 212, 225; climate 2, 120, 134, 138, 141, 177, 203, 212, 217, 218, 219 emerging battlespaces 168 emerging technologies 16, 19, 33, 48, 55, 61, 67, 70, 88, 102, 103, 105, 106, 111, 125, 127, 136, 154, 188, 199, 206, 224 emerging terrains 20, 93, 192 emerging vulnerabilities 21, 59, 78–​9, 86, 100, 186 end of history 30, 31, 33, 73, 131, 190, 206

End of History and the Last Man, The 30, 33 end of liberalism 33, 75 endless wars 16, 38, 57, 82, 98, 104, 111, 214 entanglement, deterrence by 9, 12, 19, 28, 68, 150, 212, 216 environment of fear 116 environmental insecurity 217 Erfourth, Monte 110 escalation, of conflict 110, 114, 115, 123, 132, 149, 152, 216, 223 Escape Artist, The 50 espionage 10, 11, 94, 99, 100, 101, 149, 151, 190, 195, 211, 222–​3; cyber 18, 70, 71, 90, 92, 104, 105, 108, 109, 111, 150; informational 20; infrastructural 20 ethics 3, 6, 14, 15, 20, 21, 45, 49, 52, 56, 61–​2, 79, 86, 91, 106, 126, 130, 139, 154, 158, 160, 161, 163, 167, 171, 172, 173, 178, 199, 205, 215, 224 ethno-​political complexity 174 everywhere war 11, 68, 161, 162–​3, 164 evolution: of societies 4, 34; technological 92, 167; of weapons 47 existential anxiety 182 existential risk 189 exploitation: of granularity 149–​50, 151, 155; of the human body 167; of the messiness of the city 142; necropolitical 120; of vulnerabilities 15, 69, 81, 107, 136, 151, 154 explosives 125, 130, 135, 167 Eye of War, The 176 fabric of reality 155 facial recognition 135, 191 failed cities 122 fake news 52, 74, 84, 103, 130, 208, 226 fantasy: for the everywhere war 162–​3; future war 83; future worlds of 224–​6; worlds of 22, 213 fear, environment of 116 feral cities 123, 124, 125, 127, 129, 141 Feral Cities 122 feral zones 125–​6, 127, 129, 138 finding always bears flanking 148 firepower 143–​4, 169 First World War 35, 41, 45, 71, 116 Fleming, Ian 99 fog of war 38, 170 force disrupter 196 force enhancer 193–​4, 196

force multiplication 197 foreign policy 2, 37, 48, 53, 87, 116, 128, 178, 214, 215, 223 Foreign Policy 99–​100, 191 fragility 7, 15, 58, 221 France 21, 50, 58, 117, 201 Frantzman, Seth J. 168–​9 Freedland, Jonathan 50 Freedman, Lawrence 6, 124 Friedman, Thomas 142, 148 Fukuyama, Francis 30, 31, 33, 206 fundamentalism 69, 135, 212 future of war, and artificial intelligence (AI) 182–​208 Future of War, The 6, 124 future primitive 221 future soldier 1, 125, 139–​40, 145, 208 future urban war 13, 122, 124, 128 future war: granularity of 21, 138–​56; Russian way of 71–​7 Future War: Preparing for the New Battlefield 6–​7 future war fantasy 83 future warfare, liberal way of 10–​19, 27–​42, 83–​5 future wars: and China 214–​17; and climate change 217–​19 Galeotti, Mark 12, 73, 75, 103 game-​changing events 4; see also wild card events Garland, Alex 220 Gattaca 205 Gaza Metro 124 GCHQ (Government Communications Headquarters) 106 general accidents 58 general AI 187, 203 geopolitical change 2, 4, 9, 11, 143, 217, 221 geopolitical competition 59, 121, 218 geopolitical contexts 9, 10, 154 geopolitical disorder 70–​1, 106, 213 geopolitics 11, 97, 116, 140, 192 Gerasimov Doctrine 20, 71–​7, 86 Ghost in the Shell 136, 194, 196, 212, 223 Ghostfleet 6, 150–​1, 216 Gibson, William 91, 94, 101, 109–​10, 165, 211, 225 global accidents 9, 203, 225 global economy 28, 39, 44, 55, 57, 116, 117, 125, 223

global governance 31, 33, 124, 153, 214, 223–​4 global politics 3, 9, 10, 29, 33, 47 global technological evolution 92 global transformation 116–​17 Global War on Terror 11, 12, 16, 21, 76, 119, 130–​1, 152, 159, 163, 173, 174, 192, 199, 212–​13, 218, 222 globalisation 36, 41, 53–​4, 56, 57, 77, 78, 221 Google 55, 117, 145, 186, 188, 203 Gould, Lauren 119–​20 governance 98, 159, 200, 204; global 31, 33, 124, 153, 214, 223–​4 Government Communications Headquarters (GCHQ) 106 granular events 13, 149, 161 granular security possibilities 155 granular times: future interstate war in 149–​53; future megacity wars in 121–​9 granular vulnerabilities 150, 151, 154 granular war 15, 19, 114, 118, 128, 134, 136, 140, 147, 148, 149, 154, 198, 206 granularity: of future conflict 21, 129–​36; of future war 22, 138–​56; of reality 155 Gray, John 28–​9, 46 great power competition 123, 136, 140, 177, 183, 185, 219, 224 great powers 11, 16, 19, 29–​30, 41, 46, 68, 71, 115, 129, 130–​1, 132, 143, 150, 151, 153–​4, 170, 190, 194, 203, 214, 215, 217; see also superpowers Green Shimmer 221–​2 Greenberg, Andy 72, 98, 100, 106, 108, 109 Gregory, Derek 11, 68, 161 grey zones 2, 11–​12, 18–​21, 46, 118, 139, 149, 197–​8, 202, 207, 214, 216; perpetual 11, 212, 213; urban 126, 132–​4; see also cyberpunk international politics; information war; interstate conflict; liminal war; sub-​threshold activity; sub-​threshold conflict group think 76, 213–​14 Grove, Jairus Victor 5 Hacker and the State, The 96–​7 hacking 12, 70, 74, 97, 107, 110, 135–​6, 142, 190, 211 HADR (humanitarian assistance and disaster relief) 2, 134, 219 Hamas 87, 109, 123–​4, 126, 195 Hammes, T.X. 106, 148 Harari, Yuval Noah 44–​5

Hard Times in the 21st Century 17 health-​drive 45 heavy modernity 54 Her 185 high intensity warfare 2, 194 History of the Peloponnesian War, The 215–​16 Hitchhikers Guide to the Galaxy 6 Holocaust 49, 50, 51–​2 Homo Deus 44 Hon, Adrian 5 How Democracy Ends 33 human capabilities 158, 185 human complexity 34 human condition 1, 8, 10, 19, 27, 28, 29, 30–​1, 34, 35, 36, 40, 44, 52, 54, 56, 58, 91, 119, 153, 187, 219, 220 human organisation, territorial expansion of 159–​60 human suffering 8, 16, 17, 31, 46, 47, 50, 53, 54, 61, 111, 121, 123, 153, 163, 206, 212, 213, 223 human vulnerabilities 15, 99, 189 Humane 16, 38 humane war 19, 36, 60, 61–​2, 92, 97, 111, 178, 212, 225 humanitarian assistance and disaster relief (HADR) 2, 134, 219 humanitarian catastrophe 72, 82 humanitarian disasters 120, 122, 125, 127, 134, 147, 178 humanitarian drones 161, 174, 176 humanitarian interventions 17–​18, 128, 131, 132, 141, 149, 174 humanitarian relief 141–​2 human–​machine teaming 59, 184, 204, 208 hybrid war 2, 71–​2, 74–​5, 76 hyperconnectedness 37 hyper-​mixed methods 196–​7 hypothése d’engagement majeur 2, 8, 19, 41 IDF (Israeli Defense Forces) 87, 109, 175 Ikenberry, G. John 15, 31, 34, 224 impure war 10–​11, 12, 21, 59, 67–​71, 74, 76–​7; and China 81, 83; creativity of 15, 177; and cyberwar 100, 101–​2, 103, 110–​12; and deception 86–​8; and digital age 91, 93; and hypothése d’engagement majeur 19; and lethal empowerment 104, 105; and liberal way of war 121, 127; and private security companies 129–​30; and sabotage 96; tactics of 73, 74, 85; and technological innovation 103

Inception 13, 134, 199 inequality 28, 30, 33, 36, 57, 77, 122, 125, 129, 141, 177, 200–​1, 205, 206, 212, 221, 222, 223, 225 Inevitable, The 36, 44 information systems 100 information technologies 41, 78, 101, 116, 133, 214 information war 18, 20, 32, 38, 41, 71, 72, 76, 83, 100, 104, 105, 111, 128, 130, 185, 193 informational dimension, of cybersecurity 93–​5 informational espionage 20 informational manipulation 104, 111 informational vulnerabilities 93 infrastructural complexity 71 infrastructural dimension, of cybersecurity 95–​7 infrastructural espionage 20 infrastructural sabotage 74, 111, 206, 211 infrastructural vulnerabilities 11, 13, 67, 69, 93, 111, 112, 132 innovation: economic 153, 184, 207; open technological 15, 21–​2, 68–​9, 103, 128, 166, 177, 192; technological see technological innovation insecurity 7, 9, 17, 31, 67, 102–​3, 117, 119, 123, 127, 138, 174, 207, 215–​16; cyber 74, 97; environmental 217 intelligence 11, 37, 76, 90, 109, 118, 119, 133, 139, 140, 144, 148, 152, 155, 163, 171, 184, 186–​7, 188, 195, 196, 199, 202, 203, 205, 208, 214, 223, 224–​5 intelligentisation, of warfare 77–​83, 84, 86, 88, 91, 111 international competition 205–​6, 212, 217 international conflict 7, 8–​9, 12, 18, 20, 48, 76, 77, 83, 84, 87, 88, 91, 101, 108, 110, 116, 148, 161–​2, 167–​8, 199, 212, 217 international order 30, 31, 69, 75–​6, 116, 123, 126, 131, 196, 213 international politics, cyberpunk 211–​26 international relations 4, 6, 18–​20, 22–​3, 28, 35, 39, 46, 58, 60, 68, 72, 130, 155–​6, 183, 184, 213; and sub-​threshold cyber 105–​8 internet 32, 38, 103, 106; of battle things 146, 147, 196; quantum 98–​9; of things 80 interstate competition 9, 67, 68, 70, 95–​6, 183

interstate war 8, 18, 19, 41–​2, 46, 47, 60, 68–​9, 115, 154, 190, 224; in granular times 149–​53 invasion: of Iraq 71; of Taiwan 68; of Ukraine see Ukraine In/​Visible War 54 interstate conflict 38, 46, 70, 71, 95–​6, 101, 130, 151, 222–​3 invisible wars 38, 40, 54, 61, 73, 97, 100, 105, 121, 126, 128, 129–​30, 145, 147, 161, 162, 163, 164, 173, 185, 195, 196, 197, 199, 207 Iraq 16, 18–​19, 59, 60, 68, 71, 77, 78, 87, 121, 134, 147, 149, 154, 167, 177, 212–​13 iron cage, of bureaucracy 200 Israeli Defense Forces (IDF) 87, 109, 175 jamming 170 Johnny Mnemonic 127 Jonze, Spike 185 juice jacking 94 Juicy Ghosts 91, 101 Kant, Immanuel 28, 30, 212 Kaplan, Fred 47 Kelly, Kevin 36, 37, 44, 45, 60, 97–​8 Kilcullen, David 11, 74, 77, 106, 121, 139, 144, 189 kill box 12, 21, 36, 59, 118–​19, 153, 163 Killer Apps 203–​4 kinder weapons 82 kinetic military action 107 King, Anthony 124, 139, 167 Knowles, Emily 120, 121 Kofman, Michael 74–​5, 76 Kosovo war 52, 60, 77 Krebs, Chris 109 Krulak, General Charles C. 140–​2 Kurth Cronin, Audrey 68–​9, 103–​4, 106, 128, 206–​7 Latiff, Robert H. 6–​7 Lazarus Group 107 Least of All Possible Evils, The 111 Lee, Kai-​Fu 5 Lem, Stanislaw 27, 28, 213 Lessons Learned From Contemporary Theatres 120 lethal empowerment 101, 103, 104, 105, 129 lethality 111, 115, 118–​19, 120, 167, 168, 176

liberal democracy 2, 9, 15, 16, 17, 28, 29, 30–​1, 33, 34, 37–​8, 52, 80, 116, 128, 160, 206 liberal internationalism 30, 34 liberal optimism 45 liberal warfighters 105, 146, 171 liberal way of future warfare, trends in 10–​19 Liberal Way of War 16 liberalism 29, 33, 45, 48; end of 33, 75 liminal war 11, 18–​19, 20, 74, 77, 106, 189 liquid modernity 54, 116, 117 liquid warfare 119–​20, 143; see also impure war logistics 36, 55, 84, 86, 115, 117, 133, 135, 191, 201 Lohne, Kjersti 174 long peace 4, 15, 31–​2, 46, 58, 216 Lotringer, Sylvére 67 Lovecraft, H.P. 185–​6 Lucaites, John Louis 54 Lying in Politics 226 McCarthy, Cormac 7, 31, 219, 225 McFate, Sean 7 machine learning 117, 182, 184, 185, 186, 187, 188, 194, 195–​6, 199, 200 machinic armies 193, 194 machinic vulnerabilities 169 machinic war 170, 171, 173, 176, 178, 193, 212, 217 machinic weapons 195 MacMillan, Margaret 10, 35, 36, 124 Mad Max Fury Road 7, 219 Maduro, Nicolas 166 Malicious Use of Artificial Intelligence, The 189 Manhunt (video game) 175 manhunting 54, 160–​1, 162 Manhunts: A Philosophical History 54 Marshall, Andrew 104, 217 Master Algorithm 190 Matrix, The 91, 196–​7, 226 Matrix Revolutions 196–​7 Mbembe, Achille 19–​20, 47, 48, 49, 54, 59, 155, 160–​1, 204, 213, 222, 225 mechanised struggle, age of 40–​1 megacities 2, 13, 58, 121, 122, 124, 126–​7, 128, 138–​9, 140, 158 megacity wars 121–​9 messiness 9, 22, 32–​3, 68–​9, 86, 102, 142, 170, 172, 207, 220 metropolitics 68

microcube of death 119 micro-​drones 12, 118, 123, 133, 152–​3, 167, 174, 190, 196–​7 Middle East 8, 44, 58, 72, 120, 178 migration 39, 48, 74; see also refugees military competition 81, 183 military design movement 83, 87 military elites 9, 215 military force 17, 61, 84, 164, 184, 213, 215, 222 military interventionism 17, 68, 120 military recruitment 136, 201 military spending 2, 117 military technology 15, 78, 81, 84, 114, 132, 151, 199, 223; see also artificial intelligence (AI); drones military–​civil fusion 85 military–​technical vulnerabilities 191 Miller, Joe 110 minimal human survival 111 Mir, Asfandyar 164 Mitchell, David 3, 217 modernity: heavy 54; liquid 54, 116, 117; and remote control 159–​62; and violence 49–​54 Modernity and the Holocaust 20, 49, 50, 173, 176, 219 Mogadishu raid 2040 132–​4 Monk, Jeremiah 110 Moore, Dylan 164 Moore’s Law 55–​6 moral anxiety 173, 193 moral distance 50–​1, 53, 54, 90, 163, 176, 190–​3, 204 moral indifference 121, 155 moral proximity 50, 192, 193 Morgenthau, Hans 22, 213–​14, 225–​6 mosaic warfare 21, 83–​5, 92, 102, 105, 120–​1, 140, 177, 184, 196, 198, 199 ‘mowing the lawn’, Israeli tactic of 123, 126 Moyes, Richard 173 Moyn, Samuel 17, 36, 38, 60 multiculturalism 69, 75 multi-​domain warfare 143 multiplication 205–​8: of actors 22, 139, 226; of chaotic events 219; force 197; of strategic mistakes 214; of tactics and technologies 4, 22, 139, 147, 150, 222, 223 Mumford, Lewis 159–​60 Munster, Rens 5

Musk, Elon 32, 33, 91–​2, 173 Myth of the Machine, The 159, 160 naive empiricism 46 narrow AI 187, 203 national security 90, 91, 94, 141, 152, 190, 197, 204 Nazi Germany 51, 53 near peer competition 136 necropolitical exploitation 120 necropolitical possibility 62, 105, 222 necropolitical violence 48–​9, 52, 119, 123, 134, 163, 200–​1, 222 necropolitical wars 61, 161, 171 neoliberalism 53–​4, 116, 117, 224 network centric war 77 network society 77–​8 neuroscience 13, 111, 138, 153 neurowarfare 13, 38, 83, 108 New History of the Future in 100 Objects, A5 New Revolution in Military Affairs, The 197–​8 New Rules of War 7 new world order 17, 70, 78, 140 New York 2140 218–​19, 222 Newitz, Annalee 174 9/​11 attacks 10–​11, 13, 18, 21, 29, 58, 59, 60, 67, 68, 69, 70, 71, 82, 104, 116, 119, 130, 140; cyber 96, 97, 98, 99, 100, 108 Ninja bomb 118–​19, 153 Niva, Steve 154–​5 nocturnal body, of democracy 47, 48, 54 Nolan, Christopher 13, 74, 84, 134, 156, 199 non-​lethal interstate competition 183 non-​lethal weapons 7, 9, 82, 131, 133, 153 non-​state actors 7, 8, 12, 15, 20, 21, 59, 67, 96, 99, 103, 105, 106, 111, 127, 134, 164–​5, 166, 167, 170, 171, 194. 196 North Korea 107–​8, 216 Norton, Richard J. 122–​3, 124, 126, 127 NotPetya 97, 100, 106, 108 nuclear war 11, 28, 33, 47, 55, 71, 116 nuclear weapons 44, 46, 47, 91, 95–​6, 97, 115, 124, 149 Öberg, Dan 87–​8, 207 observe, orient, decide and act (OODA loop) 85, 160, 170, 202 Ocean’s Eleven 126 Oliver, Ryan 110 O’Neil, Cathy 200 ontogenetic view, of warfare 29

OODA loop (observe, orient, decide and act) 85, 160, 170, 202 open systems 103, 106 open technological innovation 15, 21, 68–​9, 103, 128, 166, 177, 192 organisational accidents 139 organisational culture 72, 86 organisational vulnerabilities 15, 151 organised crime 90, 91, 102, 107–​8, 134, 135, 165, 166, 182, 208 Orwell, George 5, 29, 94, 183 pace of change, in technology 4–​5, 35, 55–​6, 59, 98, 102, 187, 196, 220 pandemics 7, 53, 57, 58, 126, 141, 147, 225; Covid-​19 10, 58, 76, 80, 214, 221–​2, 223–​4 passive defence 168 peace theory 28 peacekeeping 17, 140, 141–​2, 217, 219 People’s Liberation Army (PLA) 4, 77, 79, 81, 185 Peripheral, The 3, 172, 211 perpetual peace 28, 212 personal vulnerabilities 15, 99, 189 physical distance 163, 190–​3 physical security 162, 189, 190 Pinker, Steven: on battlefield distance 216; data usage of 45, 46–​7; on deterrence 47; on drone war 174; on the future 31, 32, 39; on future war 45, 60–​1, 176; on writers on the Holocaust 51–​2, 53; on the human condition 19, 31–​2; liberal optimism of 34, 45, 49; on nuclear weapons 46; on progress 52; Taleb and Cirillo’s criticism of 46 PLA (People’s Liberation Army) 4, 77, 79, 81, 185 planetary frontier 54, 138, 163 planning 12, 72, 76, 80, 86, 124, 146, 155, 202 pluralism 52 policing wars 17–​19, 32, 46, 67, 163 policy: economic 17, 56; foreign 2, 37, 48, 53, 87, 116, 128, 178, 214, 215, 223 political security 189 politics: apocalyptic international 7–​9, 28; astro 3, 10, 17, 68, 92, 205–​6; cyberpunk international 211–​26; geo-​ 11, 97, 116, 140, 192; global 3, 9, 10, 29, 33, 47; metro-​ 68; security 57, 69, 70, 130 Politics of Catastrophe 5 populism 33, 34, 75, 132, 222–​3

Possessor 138 Powell, James Lawrence 5 Power: A New Social Analysis 29 Power to the People 103 precise weapons 82 privacy 37, 136, 152, 188, 206 private security companies 59, 67, 120, 123, 125, 129–​30, 135 problem-​solving capacity 185 progress and war, ambiguity of 35 Prometheus 208 protopia 37, 57, 177 protopian drones 171 protopian Mogadishu 2038 129–​32 protopian optimism 204–​5 protopian war 37, 38–​9, 41, 48, 88, 131, 184 proximity 50, 51, 98, 103, 132, 163, 192, 193, 200 Pulse 98 pure war 11, 12, 19, 67, 68, 70, 71, 88, 91, 102, 112, 199 Pure War 67; see also Virilio, Paul Putin, Vladimir 32, 41, 68, 73, 74, 75, 76, 112, 185, 198, 214, 226 Putin’s People 75 quantum computing 3, 12, 35, 36, 56, 106, 155, 183 quantum internet 98–​9 quantum mechanics 155–​6 quantum physics 13, 155–​6 racism 33, 48, 222–​3 radicalisation 69, 94–​5 Ready Player One 7, 91 realism 2, 7, 9, 58, 211, 213, 214, 215, 218, 221, 226 reality, remoteness from 30, 213–​14 recruitment: military 136, 201; for terrorist organisations 94–​5 Red and the Black, The 39 Rees, Martin 186 refugee camps 134–​5 refugees 5, 48, 53, 74, 200, 219; see also migration Reid, Julian 16 remote control, and modernity 159 remote warfare 2, 54, 120, 121, 126, 160 remoteness from reality 30, 213–​14 replicants 3, 45, 61, 102, 108, 143, 158, 159, 162, 171–​2, 176, 182, 194, 199, 207, 225

resilience 33, 83, 108; societal 74, 106, 121, 189 resource competition 17, 36, 78 Return from the Stars 27–​8, 32, 34, 213 Rid, Thomas 92, 99, 100–​1, 102, 108 Risico 99 Road, The 7, 8, 31, 219 Robinson, Kim Stanley 7, 218–​19, 222 robot war 140–​9, 161, 171, 176–​7, 202 roboticizing of the military 174 robotics 129, 130, 131, 133, 134, 143, 162, 164–​5, 171, 172, 193, 194, 195, 198; Three Laws of 173–​4 robots 15, 133, 159, 174, 178, 191–​2, 193, 194–​5, 197 Ronfeldt, David 99–​100 Rosa, Hartmut 55, 114 Royal Military Academy Sandhurst 1–​2, 105 Rucker, Rudy 91, 101 Runciman, David 33 Russell, Bertrand 29 Russian way of future war 71–​7 Russo-​Ukrainian war 17, 21–​2, 61, 87, 92, 168, 171, 177, 223–​4 Ryan, Mick 7 sabotage 20–​1, 92, 96, 97, 100–​1, 192; cyber-​ 20–​1, 70, 97, 104, 105, 109, 126, 147, 196, 220; infrastructural 74, 111, 206, 211 sadistic creativity 164 Sandhurst 1–​2, 105 Sandvik, Kristen Bergtora 174 Sandworm 109 Savage Ecology 5 scale of conflict, change in 114–​36 Scales, Robert H. 143–​6 Scharre, Paul 203–​4 science fiction: Chinese 215; dystopian 56, 80, 83, 103, 131, 172, 174; economic 5 Scott, Ridley 13, 208 Second World War 9, 15, 35, 45, 58, 71, 116, 145 security: cyber see cybersecurity; digital 189–​90; in a time of multiplication 205–​8; national 90, 91, 94, 141, 152, 190, 197, 204; physical 162, 189, 190; political 189; politics of 57, 69, 70, 130; in the time of the Shoggoth 184–​190 September 11 attacks see 9/​11 attacks Shadow Brokers 106 shadow wars 154–​5

Shimmer, the 22, 211–​26; Dark 222–​3; Green 221–​2; Silver 223 Shirreff, General Sir Richard 6 Shoggoth, the 184–​90, 191, 195, 198–​9, 204 shrinking battlespaces 128, 129 signalling 92, 97, 111 Silver Shimmer 223 Simons, Jon 54 Singer, Peter 6, 150 Six Days in Fallujah 121 small units 128, 134, 144–​5, 145–​6, 147, 148, 152–​3 small wars 117 smart cities 95, 124, 125, 126, 130, 140, 147, 218 smartphones 55, 94, 104, 117, 148, 164 Snyder, Timothy 53 Social Acceleration 55 social media 32, 33, 59, 87, 95, 98, 103; see also bitskrieg; disinformation social netwar 100 social networks 100 societal resilience 74, 106, 121, 189 societal vulnerabilities 12, 69, 86, 99 soft information war 130 soldier, future 1, 125, 139–​40, 145, 208 Somalia 79, 120 Soviet Union 75, 116 Sparta 215, 216 Speed and Politics 55 Spiderman Far From Home 38 Spielberg, Steven 142–​3, 223 Star Wars 189, 194 Starlink satellites 32, 33 state-​on-​state competition 9, 67, 68, 70, 95–​6, 183 Stendhal 39 Stengers, Isabelle 206 strategic corporal 141–​2, 144 strategic culture 71–​2, 86, 215 Strategic Studies Group 126–​7 strategic surprise 3, 69, 165; see also game-​changing events strategic thinking 39, 71, 76, 214, 215 Strawser, Bradley 175 Stuxnet 95–​6, 99, 191 sub-​nuclear conflict, between China and the United States 150 sub-​threshold activity 12, 17–​19, 20–​1, 76, 107, 111, 139 sub-​threshold competition 21, 149 sub-​threshold complexity 22

sub-​threshold conflict 11, 98 sub-​threshold cyber 105–​8 sub-​threshold events 18, 147 sub-​threshold tactics 17–​18, 109 subversion 70, 75, 83, 92, 94, 99, 100, 101, 104, 146, 149, 184, 190, 195, 223 suicide bombers 14, 103, 167 Suleimani, Qassem 177, 178 Sun Tzu 72, 79, 84, 87, 225 superintelligence 187, 200 superpowers 2, 18, 19, 131, 170, 214, 218; see also great powers surveillance capitalism 56, 57, 94, 117, 138, 222 swarms 38, 81, 131, 141–​2, 184, 190, 197, 198, 199, 205; drone 14, 133, 135, 165, 169, 195, 197, 200, 201, 204, 206 Syria 8, 33, 74, 120, 121, 134, 170 Tactical Art in Future Wars 143 tactical creativity 86, 87 Taiwan 41, 68, 79, 153–​4, 216 Taleb, Nassim Nicolas 46 tanks 15, 84, 96, 97, 100, 115, 145, 148, 169, 203, 211 teaming, human–​machine 59, 184, 204, 208 technological acceleration 9, 14, 68, 76, 77, 78, 80, 100, 129, 182, 186, 187, 188 technological accidents 7, 8, 14, 21–​2, 33, 55, 57, 58, 67, 106, 139, 149, 163, 167, 170 technological complexity 2, 30, 34, 56, 68–​9, 121, 127, 141, 143, 167, 170 technological development 34, 37, 68, 111, 114 technological evolution 92, 167 technological innovation 14, 17, 21, 55, 57, 59, 78, 106, 125, 207, 208, 222; open 14, 21, 68–​9, 103, 128, 166, 177, 192 technological plague 78 technological transformation 14, 30, 56, 125, 206, 215 technological wonderland 60, 213, 225, 226 technologies: bio-​56, 129, 208, 222; emerging see emerging technologies; information 41, 78, 101, 116, 133, 214; military 16, 78, 81, 84, 114, 132, 151, 199, 223; see also artificial intelligence (AI); drones Tenet 13, 84, 156, 199 Terminator 91, 128, 159, 162, 166, 170, 174, 182–​3, 187, 188, 193, 195, 199, 200, 201, 203, 206, 208

territorial war 68, 82 terrorism 8, 12, 19, 28, 31, 67, 70, 96, 124, 126, 129, 132, 151, 166, 191, 193, 207; in an age of AI 134–​6 terrorist groups 37, 59, 136, 167, 170 terrorist networks 11, 12, 21, 67, 68–​9, 91, 96, 104, 106, 111–​12, 123, 125–​6, 130, 132, 134, 135, 154, 164–​5 Theborn, Göran 36 theft, of information/​intellectual property 91, 94–​5, 211 Third Offset Strategy 59, 106, 187–​8, 193–​4, 196 Third World War 152, 190 Thompson, Helen 17, 201 threat horizon 17, 41, 50, 184, 189, 207, 212 threat landscape 69, 200 Three Block Robot War 140–​9 Three Laws of Robotics 173–​4 Thucydides Trap 4, 215–​16 totalitarianism 30, 52, 94 Transcendence 187 transformation: global 116–​17; technological 14, 30, 56, 125, 206, 215 Transformers 128, 162, 183, 194, 199, 200, 208 transgressive creativity 87–​8 transparent battlefield 169, 177 Trudeau, Justin 4–​5 Trump, Donald 33, 74, 75, 79 Tubes 96 2001 A Space Odyssey 187 2084 Report 5 Ukraine 17–​18, 22, 32, 68, 70, 72, 74, 76–​7, 83, 97, 98, 100, 121, 142, 143, 145–​6, 147–​8, 149, 168–​9, 177, 192, 205, 208, 213, 226 undermine at a distance 152 (un)granular times, future megacity wars in 121–​9 unintended consequences 3, 10, 15, 60, 62, 86, 98, 106, 111, 119, 128, 139, 172, 203–​4 unnecessary wars 2, 22, 32, 37–​8, 39, 86, 162, 212 unrestricted warfare 20, 71–​2, 84, 86, 88, 96, 151, 197, 199, 214, 216; and the intelligentisation of warfare 77–​83 Unrestricted Warfare 77, 78–​9, 80–​1, 81–​2, 83, 84–​5, 112, 154, 197 urban battlespaces 141–​2

urban complexity 123, 140 urban conflict 124, 126, 128, 129, 136, 140, 141–​2 urban crime 35 urban environments 13, 46, 71, 122–​3, 124, 125, 126, 127, 128, 134, 141, 143, 183 urban grey zone 126; lethal empowerment in the 132–​4 urban war 2, 127, 131, 133, 134, 139; future 14, 122, 124, 128 Urban Warfare in the Twenty-​First Century 124, 139, 167 utopianism 5, 36, 37, 58, 104 VanderMeer, Jeff 220 Varoufakis, Yanis 5, 56 Vietnam war 41, 213–​14, 225–​6 violence: and modernity 49–​54; necropolitical 48–​9, 52, 119, 123, 134, 163, 200–​1, 222 violent actors 103–​4 Virilio, Paul 20, 35, 47, 78; on accidents 80, 106, 204, 206; on deterrence 203; on disappearance of technology 154; on disinformation 52; on environment of fear 116; on impure war 10–​11, 21, 67, 69, 70, 77, 91, 101, 104; on metropolitics 68; on modernity, war, and acceleration 54–​60; on scale of conflict 13, 21, 102, 119; on technological wonderland 60, 213, 225, 226; on uncertainty 61, 62; on worldwide civil war 69–​70 virtual conflict 99 virtual reality 14, 136, 224 virtuous war 178 visions of the future 28, 56, 57, 58–​9, 60–​1 vulnerabilities 75, 77, 79, 81–​2, 108, 136, 139; AI 190; asymmetric 81, 84; cyber​ 80, 92–​3, 96, 97–​8, 100, 102, 103, 106, 107, 189–​90; emerging 21, 59, 78–​9, 86, 100, 186; of everyday life 11; exploitation of 15, 69, 81, 107, 136, 151, 154; granular 150, 151, 154; human 15, 99, 189; informational 93; infrastructural 11, 13, 67, 69, 93, 111, 112, 132; machinic 169; military–​technical 191; organisational 15, 151; personal 15, 99, 189; of a Shoggoth 187; societal 12, 69, 86, 99 Wagner group 59, 120, 123 WannaCry ransomware attacks 96, 97, 107–​8

War and International Thought 29 war and progress, ambiguity of 35 war at a distance 3, 115, 161–​2, 171, 206 war crimes 31, 98 war from above 115, 192 War: How Conflict Shaped Us 10, 124 War on Terror see Global War on Terror War Transformed 7 War With Russia 6 war machine 12, 29, 30, 42, 48–​9, 68, 82, 85, 86, 115, 147, 148–​9, 166, 169–​70, 173, 196, 199, 224; digital 21, 90–​112 warfare: from above 115, 192; algorithmic 183; ambiguous 11, 20, 74, 77, 106, 189; anti-​mosaic 197; biological 74, 106, 126, 206; contemporary 108, 114, 167; cyber 11, 21, 36, 59, 71, 80, 88, 90–​112; at a distance 3, 115, 161–​2, 171, 206; drone 21–​2, 59, 161, 162–​71, 174, 175, 192, 194; endless 16, 38, 57, 82, 98, 104, 111, 214; everywhere 11, 68, 161, 162–​3, 164; granular see granular war; high intensity 2, 194; humane 19, 36, 60, 61–​2, 92, 97, 178, 212, 225; hybrid 2, 71–​2, 74–​5, 76; impure see impure war; information see information war; intelligentisation of 77–​83, 84, 86, 88, 91, 111; interstate see interstate war; invisible see invisible wars; liberal way of see liberal way of war; liminal 11, 18–​19, 20, 74, 77, 106, 189; liquid see liquid warfare; machinic 170, 171, 173, 176, 178, 193, 212, 217; megacity 121–​9; mosaic see mosaic warfare; multi-​domain 143; in a time of multiplication 205–​8; necropolitical 61, 161, 171; network centric 77; neuro-​ 13, 38, 83, 108; nuclear 11, 28, 33, 47, 55, 71, 116; ontogenetic view of 29–​30; on planetary frontier 54; policing 17–​19, 32, 46, 67, 163; protopian 37, 38–​9, 41, 48, 88, 131, 184; pure 11, 12, 18, 67, 68, 70, 71, 88, 91, 102, 112, 199; remote 2, 54, 120, 121, 126, 160; robot 140–​9, 161, 171, 176–​7, 202; shadow 154–​5; small 117; soft information 130; territorial 68, 82; Three Block Robot 140–​9; unnecessary 2, 22, 32, 37–​8, 39, 86, 162, 212; unrestricted see unrestricted war; urban see urban war; virtuous 178; world 4, 13, 28, 58, 71, 115, 149, 217 warfighters 92, 108, 146, 148, 173; liberal 105, 146, 171

warfighting 2, 10, 22, 48–​9, 60, 68, 83, 92, 105, 120, 149, 159, 174, 176, 199, 205 wars of choice 15, 41, 86 Watson, Abigail 120, 121 weaponisation of everything 103 Weaponisation of Everything, The 12 weapons 8, 111, 121, 130, 131, 145, 218, 224; autonomous 14, 82, 173, 184, 188, 193, 194, 203; brain-​control 82–​3; cyber-​ 96, 100; kinder 82; machinic 195; of mass destruction 17–​19, 47, 91, 116; non-​lethal 7, 9, 82, 131, 133, 153; nuclear 44, 46, 47, 91, 95–​6, 97, 115, 124, 149; precise 82 Weapons of Math Destruction 200 Weber, Max 200 Weizman, Eyal 111 Wendt, Alexander 155 Wilcox, Lauren 161, 163

wild card events 3, 69, 165; see also game-​changing events World, The 36 world orders 9, 17, 19, 22, 176–​7 World Safe For Democracy, A 34, 224 World Trade Centre attacks 69 world war 4, 13, 28, 58, 71, 115, 149, 217 World War I 35, 41, 45, 71, 116 World War II 9, 15, 35, 45, 58, 71, 116, 145 World War III 152, 190 world-​class military power, China’s ambition to be a 4 worlds of fantasy 22, 213–​14, 224–​6 worldwide civil war 69–​70 YellowJacket 118 Zero Dark Thirty 119, 140, 156 Zero K 205, 212 Zuboff, Shoshana 12, 37, 56, 117