Nonhuman Witnessing

A series edited by Erin Manning & Brian Massumi

Duke University Press
Durham & London
2024
Nonhuman Witnessing: War, Data, and Ecology after the End of the World

Michael Richardson
© 2024 Duke University Press
All rights reserved
Printed in the United States of America on acid-free paper ∞
Project Editor: Michael Trudeau
Designed by Aimee C. Harrison
Typeset in Minion Pro and ibm Plex Mono by Westchester Publishing Services
Contents

Acknowledgments ix
Introduction: Nonhuman Witnessing 1
One: Witnessing Violence 37
Two: Witnessing Algorithms 80
Three: Witnessing Ecologies 112
Four: Witnessing Absence 150
Coda: Toward a Politics of Nonhuman Witnessing 174
Notes 185
Bibliography 207
Index 229
Acknowledgments
the editors and peer reviewers who guided these texts at a formative stage.
Parts of chapter 1 draw from “How to Witness a Drone Strike,” Digital War
(2022), published in a special issue edited by Olga Boichak and Andrew
Hoskins. What would eventually become sections of chapter 3 appeared in
“Climate Trauma, or the Affects of the Catastrophe to Come,” Environmen-
tal Humanities 10:1 (2018). An earlier version of chapter 4 was published as
“Radical Absence: Encountering Traumatic Affect in Digitally Mediated
Disappearance,” Cultural Studies 32:1 (2018), in a special issue on media and
affect edited by Sarah Cefai.
I am doubly (if not triply) thankful to Sarah Cefai, who orchestrated my
visit to Goldsmiths in 2020, on the cusp of the pandemic, which led to con-
versations that pushed me to resolve some thorny questions with Joanna Zyl-
inska, Matt Fuller, Chris Woods, Ariel Caine, Eyal Weizman, Susan Schuppli,
Shela Sheikh, and Sarah herself.
Conversations large and small, online and off, with a host of interlocu-
tors contributed in large and subtle ways to the ideas in this book or helped
cheerlead it into existence. Thank you Adrian Mackenzie, Amy Gaeta, An-
drew Yip, Anthony McCosker, Ayesha Jehangir, Baden Pailthorpe, Boram
Jeong, Carolyn Pedwell, Casey Boyle, Chad Shomura, Charlotte Farrell,
Chris Agius, Chris O’Neill, Christine Parker, Crystal Abidin, Declan Kuch,
Donovan Schaefer, Edgar Gomez Cruz, Elke Schwarz, Emma Jane, Emma
Quilty, Fleur Johns, Gilbert Caluya, Hagit Keysar, Hannah Buck, Heather
Horst, Hussein Abbass, Jake Goldenfein, James Parker, Janet Chan, Ja-
than Sadowski, Jennifer Terry, Jenny Rice, Jodi Brooks, Joel Stern, Joseph
DeLappe, Joseph Pugliese, Jussi Parikka, Karin Sellberg, Kat Higgins, Kath-
rin Maurer, Kathryn Brimblecombe-Fox, Kynan Tan, Larissa Hjorth, Liam
Grealy, Lilie Chouliaraki, Lisa Slater, Louise Ravelli, Lyria Bennett-Moses,
Mark Andrejevic, Matthew Arthur, Mel Gregg, Michael Balfour, Michal Gi-
voni, Michele Barker, Milissa Dietz, Monique Mann, Nathaniel Rivers, Ned
Rossiter, Niamh Stephenson, Nisha Shah, Olga Boichak, Omar Al-Ghazzy,
Paul Frosh, Poppy de Souza, Rachel Morley, Rowan Wilken, Sarah Truman,
Sonia Qadir, Sukhmani Khorana, Tess Lea, Tim Neale, Tom Sear, Verena
Staub, Willy Blomme, Xan Chacko, Yanai Toister, and all those that I have
inevitably forgotten. Some of the most rewarding conversations took place
with participants in the Drone Cultures symposium that changed incarnation
from in-person to online over the course of 2020 and via the Drone Futures
seminar series that I ran online through the first year of the pandemic. Spe-
cial thanks to Ronak Kapadia, Antoine Bousquet, Jairus Grove, J. D. Schnepf,
Kate Chandler, Mahwish Chishty, Thomas Stubblefield, Caren Kaplan, and
Alex Edney-Browne. Your presentations and conversations (many of which
have continued) all shed new light.
Coauthors and collaborators on other projects also helped me to work
through this one. I am immensely grateful to the legends J. D. Schnepf, Beryl
Pong, Adam Fish, Heather Ford, Kerstin Schankweiler, Amit Pinchevski,
Magdalena Zolkos, Madelene Veber, Anna Jackman, Kyla Allison, and An-
drew Brooks.
At unsw, I’m lucky to work with many wonderful academic and profes-
sional colleagues in the School of the Arts and Media, especially my cocon-
spirators in the Media Futures Hub. One of the great joys of academic life has
been learning from my graduate students Simon Taylor, Meng Xia, Asal
Mahmoodi, Theresa Pham, Rachel Rowe, Bron Miller, Kyla Allison, Maddie
Hichens, Katariina Rahikainen, and Maryam Alavi Nia.
Remarkable comrades in academic life have enriched my world. Deepest
thanks to Anna Gibbs, who set me on the path; Meera Atkinson, my first
academic partner in crime; Magdalena Zolkos, for introducing me to affect;
Andrew Murphie, for opening the worlds of media; Stephanie Springgay,
for wise guidance and late-night bourbon; Rebecca Adelman, for finding
joys in email; Elizabeth Stephens, for lifting up others, me included; Ramaswami Harindrinath, for getting me hired (and much more); Helen Groth, for
generous mentoring; Tessa Lunney, for laughs and lunches and always re-
membering; Nick Richardson, for unconditional cheerleading; Collin Chua,
for knowing what to read; Caren Kaplan, for solidarity and wisdom; Anna
Munster, for the creative energy and collaborative enthusiasm; Tanja Dreher,
for believing that change is possible; Thao Phan, for the unerring capacity
to ask the right questions; and Greg Seigworth, for endless capaciousness.
Special thanks to the artists who generously gave permission to include
their work: Kathryn Brimblecombe-Fox, Baden Pailthorpe, Kynan Tan, Ed-
ward Burtynsky, Grayson Cooke, Mahwish Chishty, and Yhonnie Scarce,
whose stunning Thunder Raining Poison appears on the cover. Particular thanks
to Noor Behram for his striking photograph (figure 1.3), used here under fair
use guidelines and (I hope) in allegiance with the politics of his work.
A book is nothing without readers, and the writing of this one is indebted
to the time, insight, critique, and support of some remarkable ones. The
Book Proposal Club of Astrida Neimanis and Lindsay Kelley helped build
the framework and find the book a home at Duke. Greg Seigworth, Nathan
Snaza, and Tanja Dreher read early, lumpy drafts of the introduction and
made it so much stronger through their generous attention to its unformed
thoughts and suggestions of further reading. Anna Munster read chapter 3
and homed in on its weakest points with unerring precision, which helped
strengthen that chapter greatly.
Research underpinning this book—and especially chapters 1 and 3—was
enormously assisted by Madelene Veber, who has worked with me over the
last four years and is, among other things, brilliant, indispensable, astute, and
creative. I’ve no doubt that Madelene will soon be writing acknowledgments
of her own.
For so many reasons, I am forever in debt to Andrew Brooks and Astrid
Lorange, staunchest pals and wisest of readers. Both read the manuscript top to
bottom and provided such thoughtful feedback and incisive edits. But more,
your fierce politics, love of thought and poetry and language, your convivial
joy in snacks, and your friendship have meant everything. Love you. gffs.
Two anonymous reviewers read both the proposal and eventual manu-
script. Their generous, critical, and constructive engagement with the project
at proposal stage had a transformative effect on the argument, framework, and
analysis. Without them, this book would be much poorer, or not a book at
all. All flaws that remain are mine alone. Thank you.
Brian Massumi and Erin Manning believed in this book from the first
loose description at a Montreal café in the dead of winter. Brian’s work led me
to fall for affect theory, for radical empiricism and process philosophy, and
for the relationality in all things, and you were both so welcoming to a clue-
less graduate student at the Into the Diagram workshop in 2011. I am truly
honored to be published in your series, alongside so many brilliant others.
At Duke University Press, I am immeasurably thankful to Courtney
Berger, whose editorial wisdom and unfailingly warm support for the book
made all the difference, editorial assistants Sandra Korn and Laura Jara-
millo, and Aimee Harrison for the stunning cover design. As all authors
know, a huge amount of unsung work goes into making a book and getting
it to readers. Thank you, Michael Trudeau, James Moore, Emily Lawrence,
Chad Royal, and the rest of the production and marketing crew for your
effort, enthusiasm, attention, and care. Big thanks to Cathy Hannabach
and Morgan Genevieve Blue from Ideas on Fire for stellar work producing
a rich index.
The Thread—Na’ama, Phil, Astrid, Andrew, Nick, Annie, Zoe—your soli-
darity, wisdom, humor, advice, and love are beyond measure.
To name all the family and friends who nurtured this book and its author
is an impossible task. You know who you are. Mum—you taught me to love
learning and to seek justice. Dad—you showed me how to pursue the exact
and true. Daniel—I only wish you were nearer so that we could share more
joys together.
Finally, my deepest thanks and love go to Zoe, Adrian, and Sacha. There
is no one like you, Zoe Horn, so smart, creative, caring, loving, and funny.
You keep me on track, you put things in perspective, you teach me again
every day what matters most, and you always believe. Adrian and Sacha, this
book is for you, for everything you’ve taught me, and for the desperate need
to make new worlds to live and love within.
Introduction: Nonhuman Witnessing
At 6:15 a.m., February 21, 2010, on a deserted stretch of road in the Uruzgan Province of Afghanistan, a convoy of three vehicles slowed to a halt
and figures spilled out, clumping and milling as dawn light filtered through
the mountains. Captured by the Multi-Spectral Targeting System (msts)
slung below the nose of the loitering mq-1 Predator, imagery of the con-
voy streamed across military networks to screens in the United States and
Afghanistan. On the screens, engines and people glowed white against the
gray-black landscape as indistinct heat signatures bled into one another in
the strange aesthetic of forward-looking infrared (flir). Image and control
data flowed through the network, moving between different devices, infra-
structures, and protocols. Connected by a ku-band satellite link to Ramstein
Air Base in Germany, the Predator’s data then traveled down optical fiber
cable under the Atlantic to the Ground Control Station at Creech Air Force
Base outside Las Vegas, Nevada, to image analyst “screeners” in Florida, to
command posts and ground stations across the globe, and to an encrypted
server farm for archiving, where the video and its accompanying metadata
would be logged, recorded, and held for future analysis. Years later, these
time-stamped pixel arrays of ones and zeros likely became part of the vast
video archive used to train machine learning algorithms to replace the labor
of image analysts, a project initiated in partnership with Google and other
tech giants in a sign of strengthening ties between the architects of algorith-
mic enclosure and those of increasingly autonomous warfare.
On that pale morning, one place the video feed failed to reach was the US
Special Forces unit conducting an operation against a local Taliban leader
in nearby Khod. Afghanistan’s weak communications infrastructure and a
reliance on satellite bandwidth meant that the imagery never made it to the
ground, despite being subject to much debate as it was examined by screen-
ers, operators, and commanders. Conducted by radio and military internet
relay chat (mIRC) across discontinuous networks within the operational
apparatus, the debate over what the images showed angled ever more inexo-
rably toward violence as the affective surge toward action cohered with the
indistinction of the drone’s mediations. Alongside the msts, the Predator was
equipped with gilgamesh, a sophisticated eavesdropping system capable of
blanket signal interception of nearby cellphones. Like the image screeners,
analysts combing its data oriented their interpretation toward perceiving the
convoy as a node within an enemy network. On the ground, two dozen men,
women, and children spread prayer rugs on the dirt, while military personnel
on the other side of the planet argued over how to read the varied morpholo-
gies produced by the sensor-network-feed. Framed with military discourse,
these uncertain bodies were swiftly fixed as “military-aged males” and thus
subject to potential elimination.
Prayers complete, the three vehicles continued along the road, veering
away from the Special Forces at Khod in what one of the drone crew inter-
preted as a “flanking” maneuver. The lurking Predator carried only a single
missile, so two Kiowa attack helicopters were scrambled into position and a little after 9 a.m., the convoy hit a treeless stretch of road. Guided by the drone’s laser targeting system, two agm-114 Hellfire missiles launched from the Kiowa helicopters and struck the first and third cars, explosive charges
in each detonating to fragment the shell casing. Metal and flesh tore apart
and fused together. Bodies were everywhere, whole and in pieces. Nasim, a mechanic who survived the blast, later recalled wrecked vehicles, a headless
corpse, another body cut in half. On the full-color video feed that the crew
switched to after the strike, pixels re-presented themselves as women and,
eventually, as children. Later, the Pentagon claimed sixteen dead, including
three children; villagers said twenty-three, including two boys named Daoud
and Murtaza. A swiftly ordered US Department of Defense investigation
traced the tangled lines of communication, the processes of mediation, and
the failures of vision and transmission. Its report ran over two thousand
pages. When eventually released under a Freedom of Information request
filed by the American Civil Liberties Union, the report provided rare insight
into the secretive inner workings of drone warfare. Much attention was paid
to the transcript of communications between the Predator crew and ground
command. Later used to frame both journalistic and scholarly accounts, the transcript distilled the hubris, faith in technology, and tendency toward violence that animates remote war. More than a decade later, the event and
its mediations and remediations remain a critical aperture into the drone
apparatus.1
In all of this, who—or what—bears witness? Human witnesses abound:
the victims and survivors, whose flesh and words bear the scars and carry
the lived truth of hellfire from above; the pilot and sensor operator, the com-
manders, military lawyers, image analysts; the military investigators; the
documentarians and journalists who will tell the story of what happened, and
their audiences across the world; perhaps even the scholars, myself among
them, who turn to this moment to help make sense of remote war. Yet what
of our nonhuman counterparts? There is the ground soaked in blood, the
roadway buckled by the explosive force of two warheads and blackened by
fire, and the dirt and stone of the roadside in a land wracked by war, and
the carbon-rich atmosphere through which the missile and signals travel,
another in the countless processes contributing to the ecological catastro-
phe that consumes the planet. There is the drone itself, not only the aerial
vehicle and its payload of sensors capturing light across the spectrum but
its signals relays, and the complex network of technologies, processes, and
practices that make up the apparatus. And there are, too, the algorithmic
tools for snooping cellphones and scouring video; the data centers sucking
power for cooling and expelling heat for stack upon stack of rack-mounted
computers; the undersea cables that carry military and civilian data alike.
If we extend the assemblage further, we arrive at lithium mines and orbital
satellites, image datasets and environmental sensors, cellphone manufactur-
ers and cloud services.
In most accounts of witnessing, much of this would be excluded alto-
gether, relegated to the status of evidence, or assigned the role of intermedi-
ary, dependent upon a human expert or interpreter. Nonhuman Witnessing
refuses that relegation and instead deepens and widens the scope of wit-
nessing to include the nonhuman. Opening witnessing to the nonhuman
provides deeper, more finely tuned understandings of events for us humans.
But this book goes further, arguing that nonhuman witnessing enables the
communicative relations necessary for an alternative and pluriversal politics,
founded on the capacity of nonhuman entities of all kinds to witness and
through that witnessing compose new ethicopolitical forms. Human wit-
nessing is no longer up to the task of producing the knowledge and forms of
relations necessary to overcome the catastrophic crises within which we find
ourselves. Only through an embrace of nonhuman witnessing can we humans,
if indeed we are still or ever were human, reckon with the world-destroying
crises of war, data, and ecology that now envelop us.
nonhuman witnessing
Nonhuman Witnessing is about what happens when the frame of what
counts as witnessing expands, how more-than-human epistemic communi-
ties might form, and what this might mean for subjectivity, the nature of
justice, and the struggle for more just worlds. I develop nonhuman witness-
ing as an analytical concept that brings nonhuman entities and phenomena
into the space of witnessing and accords them an agency otherwise denied or
limited by witnessing theory to date. This strategic gesture makes room for
excluded knowledges, subjectivities, and experiences within a wider frame-
work of cosmopolitical justice. It does so through an analysis of technolo-
gies, ecologies, events, bodies, materialities, and texts situated in crises of
military, algorithmic, and ecological violence. While witnessing can certainly
occur separate from violence, this book focuses predominantly on instances
of state and corporate violence that occur across a variety of scales, speeds,
temporalities, and intensities. This book understands violence as purposive
harm inflicted on people, animals, environments, and the ecological rela-
tions that make life and nonlife inextricable from one another. Violence
thus determines the possibility, capacity, and nature of life for humans and
nonhumans alike. This instrumentality means that violence is distinct from
mere force and cannot be neatly equated to destruction or death in general. In
the way I use it here, violence captures environmental, ecological, structural,
technological, affective, discursive, and infrastructural forms of instrumental
harm, as well as the directly corporeal and material forms that are most
obvious and widely accepted. One of my central propositions, then, is that
nonhuman witnessing brings more excessive and elusive violence into the
frame of witnessing in ways that human witnessing cannot.
What this book proposes is bold: to unknot witnessing, weave it anew as
inescapably entangled with the nonhuman, and within the warp and weft of
that weaving find a renewed political potential for witnessing after the end
of the world. My argument is that understanding witnessing as bound up
with nonhuman entities and processes provides new and potentially trans-
formative modes of relating to collective crisis and the role of the human
within it. For many on this planet, crisis is neither a new experience nor an
exceptional event but rather forms the condition under which life is lived.
The book takes as its starting point the presumption that contemporary
crises of war, algorithmic enclosure, and ecology are inseparable from the
enduring catastrophe of settler colonialism, whether in their connection to
extractive industries, colonial militarisms, techniques of control developed in
settler states, or the regimes of seeing, knowing, and being that underpin the
European modernity that has spread unevenly, violently, and with varying
degrees of success across the planet. World-ending crises are all too familiar
for First Nations people, who live in what Potawatomi scholar and activist
Kyle Whyte calls “ancestral dystopias,” or present conditions that would
once have been apocalyptic futures.2 But I also want to emphasize that the
subject of History—the figure that enacts and is produced by such structures
of violence and control—is neither an accident nor a universal figure. Sylvia
Wynter calls this figure Man, the Western bourgeois figure “which overrepre-
sents itself as if it were the human itself.”3 This imposed image of the human
as Christian and middle class first emerged in the Renaissance, only to be
amended in biological terms by the sciences of the nineteenth century.4 As
I will argue later in this introduction, it is precisely this figure of Man that
is the unexamined subject of witnessing. Nonhuman Witnessing argues that
this dangerous fiction of Man the Witness cannot hold under the dual pres-
sures of existential catastrophe and its own violent contradictions.
The two terms of the main title thus signal the core theoretical interven-
tions and tensions of the book. By putting witnessing and the nonhuman in
conversation with each other, I aim to dismantle the humanist frame within
which witnessing has been understood until now. This revisioning of witness-
ing contributes to the larger critical and political project of interrogating
fundamental assumptions within the Western tradition and its project of
domination. It also speaks to the necessity of building new methods and
modes of knowing that can grapple with the injurious impacts of algorithmic
enclosure, technowar, and anthropogenic climate change at a time when the
illusion of a cohesive world cannot hold. In doing so, Nonhuman Witnessing
aims to be as generative as it is critical: it is a work of thought in action that
seeks new ways of making theory and building concepts.
As an analytic concept, nonhuman witnessing describes the varied mate-
rial, technical, media-specific and situated relations through which ethicopolitical knowledge, responsibilities and forms are produced in ways that can include but neither require nor privilege human actors. As I define and
elaborate the concept, nonhuman witnessing rests on a vitalist conception
of existence that understands technics, affects, and materialities as registering
and communicating experience in forms that can be deemed witnessing
in their own right, prior to and distinct from any semiotic translation or
interpretation. In this, I am indebted to philosophies of radical empiricism
that stress the movement and relationality through which existence takes
shape and meaning. My own intellectual roots are in the processual vitalism
of Gilles Deleuze, and particularly its incarnation in the heterogenous field
that has come to be known as affect theory. More specifically, my approach to
relationality borrows from Brian Massumi’s theorizing of affect as intensities
of relation between bodies and worlds, whether human or non, corporeal
or technical, dominant or fugitive.5 Parallel to the emphasis throughout on
relationality, this book also approaches nonhuman witnessing with a debt to
media and cultural studies approaches to media and mediation, as well as
witnessing more specifically.
Yet this book is also indebted to encounters with First Nations cosmo-
epistemologies that understand animals, plants, rocks, sky, water, and land
as forms of life with inherent—rather than granted—rights, agencies, and
relations. Academic scholarship all too easily and often adopts an extractiv-
ist approach to such knowledges. As an uninvited settler living and working
on the unceded land of the Bidjigal and Gadigal peoples of the Eora Nation,
in what is now called Sydney, Australia, I engage with these knowledges in
a spirit of study without laying claim to traditions that aren’t mine. I want to
think and inquire with these knowledges, exploring their resonances with the
processual empiricism that anchors my own scholarly standpoint. My aim is
to show how the exclusion of the nonhuman from witnessing derives from
a distinct and narrow approach to both agency and knowledge, a limitation
that is endemic to the dominant strain of European philosophy that insists so
intently on the discrete and unitary over the relational and emergent.
Nonhuman witnessing elevates the status of the other-than-human in
bearing witness, refiguring witnessing as the entanglement of human and
nonhuman entities in the making of knowledge claims. In the air strike that
killed twenty-three civilians in Uruzgan, Afghanistan, claims to knowledge
about what was happening on the ground were animated within the mili-
tary apparatus by the interdependencies of media technics, environmental
conditions, and discursive practices. Violence registers as datalogical and
informational before it is kinetic and lethal: witnessing the event of violence
cannot be isolated to the drone operators or survivors or the infrared sensors,
but rather must be known through the registering of those complex relations
within and between human and nonhuman entities. Nonhuman witnessing
can be identified in ecological, biological, geological, and even chemical
manifestations, but also in technical and aesthetic forms, such as drone sen-
sor assemblages and machine learning algorithms. This means that nonhu-
man witnessing is inseparable from place, time, media, context, and the other
human and nonhuman bodies through and alongside which it takes place.
Against the singular world of the scientific or juridical witness inherited from
the Enlightenment, nonhuman witnessing coheres with what Mario Blaser
and Marisol de la Cadena call “a world of many worlds.”6 This is, then, one
meaning of the temporality of this book’s title: a theory of witnessing for a
world of many worlds, after the end of the illusion that there is only one.
One consequence is that nonhuman witnessing is rife with practical and
conceptual tensions. The very proposition contains within it the irresolvable
paradox of identifying a mode of witnessing that must necessarily exceed
the capacity to “know” inherited from Western epistemologies. Pursuing
witnessing as a relational process, rather than locating it in either the figure
of the witness or the act or object of testimony, raises the problem of which
encounters constitute witnessing. Where, in other words, is the demarcation
between mere registering and the witnessing of an event’s occurrence? How
is the status of witnessing bestowed and under what criteria? Tempting as it
might be to reconcile such tensions or produce checklists of qualification,
seeking to do so risks flattening nonhuman witnessing such that it loses
purchase on the specificity of media, materials, ecologies, technics, and con-
texts. Nor is nonhuman witnessing necessarily virtuous. Just as the soldier
can witness his own slaughter of innocents, so too the algorithmic witness
to drone strikes or environmental violence can be understood as a witness-
perpetrator. As with all witnessing, there is no inherent justice to nonhuman
witnessing. The task at hand is to ask how nonhuman witnessing pries open
conceptual and practical space within how we humans do politics, ethics,
and aesthetics.
As a theory of ethical, political, and epistemic formation, nonhuman
witnessing responds to a twofold crisis in witnessing itself. Its humanist
form cannot reckon with the scale, complexity, intensity, and unknowabil-
ity of technoscientific war, algorithmic enclosure, and planetary ecological
catastrophe. Nor can witnessing hold in the wake of the disruption of “the
human” by ecological, technological, and critical-theoretical change, not least
under the pressure of critiques by Black and First Nations scholarship. Faced
with this crisis of witnessing, we are left with a choice: to reserve witnessing
for human contexts and find new concepts to address and respond to new
crises, or, as this book argues, reconceive witnessing as entangled with the
nonhuman by attending to registrations and relations in the stuff of existence
and experience. Precisely because witnessing is so crucial to human—and
especially Western—knowledge and politics, there is a strategic imperative to revise its vocabulary to analyze, strengthen, and generate transversal relations with the nonhuman that are ethical, political, and communicative,
rather than simply informational or transactional.
This book, then, pursues what nonhuman witnessing is, but also what
nonhuman witnessing does as a concept for crafting knowledge out of which
a more just politics might be formed and fought for. Rather than provide a
detached and abstract theory, it examines the media-specificity of nonhu-
man witnessing across a motley archive: the temporal and spatial scales of
planetary crisis, the traces of nuclear testing on First Nations land, digital
infrastructures that produce traumas in the everyday, deepfakes, scientific
imaging that probes beyond the spectrum of the human sensorium, algo-
rithmic investigative tools, the unprecedented surveillance system that is
the global climate monitoring regime, and remote warfare enacted through
increasingly autonomous drones. It combines close analyses of events,
technologies, and ecologies with cultural studies readings of political and
creative texts. From poetry to video to sculpture to fiction, creative works
play a critical role in this book because they allow me to pursue nonhu-
man witnessing into speculative domains in which aesthetics and worlds
relate to one another in strange, unexpected ways. This approach aims to
show both the media dynamics and cultural consequences of nonhuman
witnessing. In doing so, nonhuman witnessing emerges as a relational theory
for understanding and responding to entangled crises, one that attends to
complexity and difference even as it works across divergent domains and
dizzying scales.
Rather than stitching together a grand theory, these sites reveal the ne-
cessity of capacious, open, situated, and flexible approaches to nonhuman
witnessing. What this book pursues are the resonances, overlaps, and unex-
elude the human entirely. It may well be that witnessing to which we are not
party is happening all the time, but of much greater significance are those
instances of nonhuman witnessing that seem to be addressed in some way
to the human—and through that address insist on both our response and
responsibility.
Each chapter is organized around a double meaning: the witnessing of violence, as well as the violence that can be done by witnessing; the witnessing
performed by algorithms, as well as the need to witness what algorithms do;
witnessing of more-than-human ecologies, as well as ecologies of witnessing;
the witnessing of absence, as well as the absence of witnessing. These doublings
perform the relational dynamics of nonhuman witnessing itself, reflecting
its working as both a critical concept and an emergent phenomenon. But
each chapter also elaborates a distinct operative concept for understanding
the processual modalities of nonhuman witnessing. Chapter 1, “Witnessing
Violence,” critiques the violence of increasingly autonomous warfare as it is
mediated through technology, bodies, and environments, elaborating the
notion of violent mediation as constitutive of martial life. Chapter 2, “Wit-
nessing Algorithms,” pursues machine learning algorithms that produce
techno-affective milieus of witnessing, articulating an account of the ma-
chinic affects that animate relations within and between technics, bodies, and
ecologies. Chapter 3, “Witnessing Ecologies,” attends to naturecultures under
the strain of climate catastrophe and nuclear war, conceptualizing a distinct
form of ecological trauma that ruptures vital relations between human and
nonhuman. Chapter 4, “Witnessing Absence,” conjoins the sites of war, al-
gorithm, and ecology to examine the traumatic absences that circulate in
the quotidian of digital media, developing the concept of radical absence to
show how nonhuman witnessing makes absence intensively present through
nonhuman infrastructures.
Each of these analytic concepts—violent mediation, machinic affect, ecological trauma, and radical absence—explicates aspects of the processual dy-
namics of nonhuman witnessing. But while they intersect with one another
in many ways, they don’t snap neatly together to provide a unified theory
of nonhuman witnessing. These concepts instead name the relational processes that constitute nonhuman witnessing across different contexts. Not
all nonhuman witnessing entails violent mediation or radical absence, for
example, but the former plays a crucial role in war while the latter is vital
to understanding how nonhuman witnessing functions in digital cultures.
Throughout the book, I show how these dynamics converge and diverge in
productive tension with one another, marshalling varied constellations of
them as they obtain to distinct sites of analysis. In the coda, I pull together the
conceptual threads of the book to outline in explicit terms how nonhuman
witnessing enables a more pluriversal politics that foregrounds communica-
tive justice for more-than-human entities and ecologies.
Upending the long history in theory and philosophy of reserving witness-
ing for the human subject, I argue that witnessing is and always has been
nonhuman. Our contemporary conjuncture makes this much easier to see,
precisely because so much of Western ontology and epistemology has been
thrown into crisis. If crises of autonomous war, algorithmic enclosure, and
environmental catastrophe are indeed converging in the contemporary mo-
ment, it is surely in no small part because their roots reach so deep into the
historical ground of militarism, capitalism, and settler colonialism. Nonhu-
man witnessing thus provides purchase on unfolding catastrophic futures,
but also on the catastrophes of the past—and on the potential for radical
hope, historical acts of resistance, and the making and remaking of more
just worlds.
this mess we’re in
communities, such as the Atacan in Chile.8 Whether mining users for data
or land for lithium, Amazon is ruthlessly extractive, the high-tech successor
to the colonial enterprises that coproduced racial capitalism.9 It has both
infiltrated and diverted countless facets of life, and its founder dreams of
extending that rapaciousness to the stars. Yet for all this, Amazon still retains
much of the veneer of techno-utopian solutionism: a frictionless future of
goods, data, and currency flowing through global infrastructures in which
human labor is obscured, if not erased from view altogether. The vision of a
transcendent future built on material waste and human sweat far more than
on computation and abstraction.
To state the obvious: Amazon is neither the architect nor the sole ben-
eficiary of the “modern world system of ‘racial capitalism’ dependent on
slavery, violence, imperialism, and genocide,” as Robin D. G. Kelley describes
the current global regime.10 Nor is it the only exemplar of the convergence
of crises to which this book is addressed. Since the turn of the millennium
and the attacks of 9/11, war and military technologies have undergone dra-
matic transformations, led by the United States but now sweeping across the
globe. Remotely piloted aerial systems, or drones, moved from the margins
to become instruments of killing and transform military strategy. Today, au-
tonomous and semiautonomous drones are used by more than one hundred
nations for surveillance and by a growing subset for lethal violence, backed by
artificial intelligence systems powered by machine learning neural networks
that undertake real-time analysis of impossibly large streams of remote sen-
sor and other data. Remote vehicles are used on and above every type of
terrain, as well as underwater and underground. Algorithmic selection and
targeting systems for drones and other weapons platforms are already here,
with fully autonomous weapons systems already emergent, held back less
by technical capacity than by military, political, and public unease with the
notion of removing human decision making from the act of killing. These
changes have, as Jeremy Packer and Joshua Reeves point out, transformed
“enemy epistemology and enemy production” in line with specific media
logics of “sensation, perception, reason, and comprehension tied to a given
medialogical environment.”11 The media-technological production of en-
emies and knowledge about those enemies is itself inextricable from the
determination that certain populations must be controlled or can be killed,
whether via the debilitating biopolitics that Jasbir Puar calls “the right to
maim” or in the necropolitics of remote warfare with which I opened this
book.12 Consequently, their martial media technics must be read within the
context of enduring colonialism.
This martial transformation has been part and parcel of the wider enclo-
sure of life within computational systems and communications technologies,
whether at the macroscale of public health databases, citizen registers, and
biometric surveillance, or at the personal level with the ubiquitous presence
of social media and smartphones across the planet. Logics of surveillance
and control that have crept into every dimension of social, political, and
economic life are also deeply entwined with histories of anti-Black racism,
and the methods of domination applied during and after slavery in the Americas, as Simone Browne persuasively shows.13 Indebted to wartime initiatives
of the 1940s, decades of Cold War arms racing and antagonistic cultural
politics that legitimated significant military spending in the United States,
and active partnerships between the Defense Advanced Research Projects
Agency (darpa) and what would become Silicon Valley, today’s communica-
tions technologies also bear the legacy of cybernetics and the effort to craft
“infrastructures of sensing and knowing,” as Orit Halpern puts it.14 At the
1970 World Exposition in Osaka, experimental multimedia environments
were built to demonstrate the potential for actualizing cybernetic systems
in urban architecture and planning. Reflecting on the influence of Expo ’70,
Yuriko Furuhata argues that “regulatory mechanisms of policing and surveil-
lance, modeled as multimedia systems and aided by networked communica-
tions, form a much darker and somber counterpart to the types of artistic
multimedia environments that emerged in the 1960s.”15 Japanese architects,
theorists, and multimedia artists played a crucial role in this dynamic, inher-
iting and responding to a different colonial legacy of violence and control.
Algorithmic technologies are now embedded in everything from Ama-
zon’s purchase recommendations to the creation of art, from the mining
of personal and population data to the provision of welfare services to the
structuring of knowledge itself via the search results of Google. But what
Paul Edwards calls the “closed world” of Cold War computation also laid the
infrastructural foundations for the “vast machine” of atmospheric monitor-
ing that allowed anthropogenic climate change to become more visible and
better understood, even as it became both contested and irreversible.16 Today,
ecosystems reel from hotter summers, extreme weather events, failing crops,
rising migration, ocean acidification, and atmospheric pollution, to name
but a handful of the more striking effects. Whether marked in the geology
of the planet or in the biosphere, the sheer scale of ecological crisis (which
is really a set of interlocking crises) is its own catastrophe, leading to deni-
als of scientific knowledge, failures of politics, and global paralysis around
meaningful response.
Each alone would be more than enough to end countless worlds, but
these three crises are also intensifying and accelerating, fueling and fueled
by the insatiable expansion of racial capitalism. Advances in machine learn-
ing have supercharged both algorithmic enclosure and autonomous warfare.
Reliance on mass computing in everything from image recognition to bitcoin
mining has combined with an exponential expansion in digital data stored
in servers and trafficked across networks to produce a huge carbon footprint
for computation. Built into the bedrock of the civilian internet as the host
of everything from ebooks to presidential election campaigns to banking,
those infrastructures have a massive environmental impact in heat gener-
ated and fossil fuel consumed.17 Those same fossil fuels, of course, power
the energy appetite of the US military, the world’s largest carbon polluter.
Institutionally, economically, and ecologically, Amazon and its ilk are deeply
integrated with military apparatuses, especially in the United States where
big tech provides everything from enterprise software to cloud storage to
strategic guidance through bodies such as the Defense Innovation Board,
chaired by ex–Google boss Eric Schmidt. Equivalent dynamics operate at
every level, whether in the shared reliance on remote sensors by militarized
drones, urban surveillance, and environmental monitoring, or the centrality
of extraction to climate change, military industries, and the mining of data.
Despite this tight bind between technology, war, and climate change,
ever-more innovation is proposed as the only solution by self-interested
luminaries such as Bill Gates. In the most basic material sense, these crises
of war, data, and climate and the system of racial capitalism they maintain
and depend on are drawing down the finite resources of the planet. Taken to-
gether, they are both product and perpetrator of violence, whether structural
or infrastructural, environmental or military, algorithmic or interpersonal,
kinetic or slow.18 The very existence of such lists speaks to both the ubiquity
and variety of violence today and its intimacy with crisis as the condition of
life for much of the planet. The explanatory force of nonhuman witnessing
resides in part in its capacity to register and communicate those forms of
violence that might otherwise be rendered invisible.
Galvanizing the language of crisis, as I have done so far, is not without risk.
As Whyte argues, claims of crisis—of food, resources, space, security—have
been frequently used to justify colonialism, both in the larger sense of the set-
tler enterprise and in specific instances such as the corporatization of tribal
governance in the United States as a response to an “emergency” of poverty.19
For Whyte, “crisis epistemologies” produce problematic politics that over-
ride First Nations concerns, such as in the appropriation of tribal lands for
wind farms and other renewable initiatives in response to the exigencies of
the climate crisis. Such epistemologies depend on a conception of crisis as
aberrant and abnormal, a rupture that must be tamed and contained so that
the normal order of things can be restored. Rather than a break from order,
crisis is better understood as a condition of existence. “Crisis is not rupture, it
is fragmentation,” writes Henrik Vigh, “a state of somatic, social or existential
incoherence.” As such, crisis is “not a short-term explosive situation but a
much more durable and persistent circumstance.”20 It is not an event, but the
condition and context of life. Lauren Berlant calls this “crisis ordinariness,”
in which “crisis is not exceptional to history or consciousness but a process
embedded in the ordinary that unfolds in stories about navigating what’s
overwhelming.”21 Thinking about crisis in this way does not require an aban-
donment of the notion of rupture. But crisis as condition does demand that
we see rupture, trauma, violence, dispossession, precarity, and vulnerability
as at once pervasive and unevenly distributed. Crisis doesn’t punctuate time,
so much as shape its passage, lacking any distinct beginning or end, enfold-
ing past and future.
Crisis also enfolds and consumes events, entangles bodies, intensifies
the contexts of their occurrence, and cuts through forms of connection to
impose new (dis)orders. Andrew Murphie calls this catastrophic multiplicity
“a complex storm of feeling, of aspects of world feeling each other in intense,
unexpected and constantly mutating ways.”22 Catastrophic multiplicity inten-
sifies, bewilders, and numbs feeling, which makes thinking with and through
problems difficult, if not impossible.23 Knowledge-making as a collective
endeavor becomes fraught and frayed. This generalized crisis environment
provides fertile conditions for states to harness ontopower, the power to bring
into being. Because ontopower targets life as it stirs into activity, it is a form
of power that both exceeds and precedes the human. Massumi describes it
as the “power to incite and orient emergence that institutes itself into the
pores of the world where life is just stirring, on the verge of being what it
will become, as yet barely there.”24 Ontopower operates at the processual
level of becoming itself. Deploying technoscientific apparatuses of war and
governance, states and other actors seek to harness ontopower in attempts to
preemptively control the future, as in the drone strike ordered in response
to the algorithmic analysis of phone calls and patterns of movement that
produce a “signature” deserving of eradication. But in doing so, ontopower
also produces crises that themselves escape control, through its continual
animation of the forces of state violence, environmental extraction, and
algorithmic control. In this sense, ontopower does not replace biopower or
necropower but rather operates in concert with them.
If I have drawn so many examples from martial contexts, this is because
Nonhuman Witnessing finds its way into data and climate through war. Like
the political theorist Jairus Grove, I take the view that war is a form of life as
much as it is a means of death: terrible, ruinous, and endlessly destructive,
yet also generative and creative. Applied to geopolitics and indeed to everything from racism to capitalism, “war is not a metaphor; it is the intensive
fabric of relations” that form this historical era.25 What is needed is analysis
“characterized by inhuman encounters and deep relational processes across
geographical scales rather than a form of political thinking that relies on discreteness, causality, and an exceptional notion of human agency.”26 Also like Grove, I am committed to decentering human actors, but not doing away with human responsibility for the vast assemblages that continue to cause so
much damage. As concept, practice, and phenomena, nonhuman witness-
ing brings such encounters, processes, and scales into conjunction with the
relational formation of knowledge and subjectivity. But it does so through
committed attention to the processes of mediation that animate and bind
together crises of war, data, and climate.
Lively, temporal, and always in flux, mediation is never foreclosed or
limited in its potential. Media studies scholarship has much to say on media-
tion. Sean Cubitt calls it the “effervescent commonality of human, technical,
and natural processes.”27 For Sarah Kember and Joanna Zylinska, mediation
is crucial to “understanding and articulating our being, becoming with, the
technological world, our emergence and ways of intra-acting with it, as well
as the acts and processes of temporarily stabilizing the world into media,
agents, relations, and networks.”28 In this sense, mediation is always rela-
tional, but it is also necessarily nonhuman: even the witness who speaks
their testimony entails the mediation of air so that wavelengths of sound
can carry from lips to ears. This vitalist understanding of mediation requires
an expansive understanding of media forms, one that sees everything from
clouds to usb drives to the planet itself as media.29 In keeping with the crucial
work of feminist scholars, this approach to mediation is avowedly material.
As Cubitt argues, “Media are finite, in the sense both that, as matter, they are
inevitably tied to physics, especially the dimension of time; and that their
constituent elements—matter and energy, information and entropy, time and
space, but especially the first pair—are finite resources in the closed system
of planet Earth.”30 Crises of war, algorithm and ecology are thus also crises
of media: of an accelerating consumption that only exacerbates all other cri-
ses. In the face of just such a trajectory, Cubitt calls for a renewed and more
differentially attuned mode of communication, one that resists the tendency
to extract information from nature but not speak back to it. Something like
this might be found in the radical empiricist tradition, which Chris Russill
argues offers an alternative intellectual history to communication theory
via William James, John Dewey, and George Herbert Mead that embraces
indeterminacy, incommensurability, and difference.31 Nonhuman witnessing
describes a critical concept and relational practice of a distinct mode of com-
munication, one constituted by an address that demands response but still
embraces opacity. It is a transversal opening onto the workings of violence,
experiences of precarity, and the shattering of epistemologies; an aperture
through which communication might take place in ways that are necessary
for care and justice in the aftermath of ended and ending worlds.
Words that would become this introduction were first written amid bushfires
that ravaged Australia in the summer of 2019 and then labored over in the
long years of the pandemic. Throughout that summer of smoke and ash,
the sun glowed pale red and the density of particulate matter made the air
hazardous to breathe. More than a billion animals died, thousands of homes
were lost, countless habitats erased. Across traditional and social media, in
corridor conversations and at dinner parties, all the talk was about apoca-
lypse, climate change, the failure of normal politics to do much of anything
at all. As the pandemic took hold in early 2020 and then wore on through
the years, life here began to come undone, but the fabric never tore so deeply,
so devastatingly, as it did across much of the globe. With Australia’s borders
closed for well over a year, the sense of an ending world was impossible to
escape, even without the massive loss of life experienced in so many places
and borne so disproportionately by the already vulnerable and precarious.
The very networks of travel and trade that expanded “the world” to fill “the
globe” were now a threat to its continuation. What worlds would remain in
the aftermath?
Living and working on unceded and sovereign Aboriginal lands, I am
enmeshed in ended and ending worlds. Colonial expansion ended the worlds
of First Nations peoples in Australia long ago, beginning with the arrival
of Captain James Cook in 1770 and eighteen years later with the landing of
the First Fleet at Botany Bay, just a few bends of the coast south of my own
home. The lines of my own family are bound up with that dispossession, if not
at the point of a gun then through the construction of buildings, founding of
museums, plying of trade, and service in the military.32 As my forebears settled
this land and built lives and families, the Traditional Owners experienced
massacre, epidemic, dispossession, incarceration, starvation, and the stealing
of children and the breaking of kinship formations.32 That ending of worlds
continues today, even as Aboriginal people endure and resist in powerful,
inspiring, and even beautiful ways. Preoccupations with an apocalypse that is
yet to come have a bitter irony in a place where First Nations have spent two-
and-a-half centuries surviving the end of the world, struggling for new and old
ways of living in this place that always was and always will be Aboriginal land.
After the end of the world: it is a temporality both commonplace and
strange. In Western popular culture, apocalypse has been in the air and on
the screen and page: zombies running amok, asteroid strikes, ai takeovers,
bioengineered crashes, alien invasions. Metaphors of late capitalism, or cli-
mate change, or global migration, these end-times imaginaries are no longer
the preserve of niche subcultures or millenarian religions but at the heart
of the most profitable, most mainstream forms of popular culture. But the
estrangement felt from these imaginings, the lure of catharsis in the fictional
experience of the end of the world, relies on being situated in relation to a
specific telling of history. As Whyte points out, “The hardships many non-
Indigenous people dread most of the climate crisis are ones that Indigenous
peoples have endured already due to different forms of colonialism: eco-
system collapse, species loss, economic crash, drastic relocation, and cul-
tural disintegration.”33 In this sense, the temporal location in the title of the
book—After the End of the World—describes a shifting, situated temporality
that hinges on whose world has ended, to what purpose, and by what hands.
As Nick Estes so succinctly makes clear in describing the impact of the Pick-
Sloan Dam on the Oceti Sakowin peoples of Dakota in the early twentieth
century, “taking away land and water also took away the possibility of a viable
future.”34 Now, that ending of worlds has come to the world enders, the colo-
nizers and empire builders who imagined into being a singular, global world
and made it so with the r ifle, the slave ship, the ledger, and the plantation.
Now, de la Cadena and Blaser write, there “is a new condition: now the colo-
nizers are as threatened as the worlds they displaced and destroyed when
they took over what they called terra nullius.”35 And yet ending worlds don’t always fully end and can be reseeded, as the resilience and endurance of First Nations peoples across the planet makes clear.
Naming this era is no simple matter because to name the problem is also to diagnose it. Since its popularization by the atmospheric biochemist Paul
Crutzen and ecologist Eugene Stoermer in a short article from 2000, the term
Anthropocene has been widely adopted.36 While the label is useful because
it registers the impact of colonialism and industry on the planet’s biological
and geological systems, it also risks universalizing and misdiagnosing the
problem by naming an undifferentiated Anthropos as the causal agent.37 In
this it serves an ideological function: flattening responsibility onto the human
in the broadest sense both hides the histories of extraction, pollution, and
violence through which the planet has been transformed and obscures the
grossly unequal distribution of the spoils. Critics rightly argue that the term
Anthropocene risks occluding the originary violence of settler colonialism,
without which our era of petrocarbons, plastics, terraforming, species loss,
and ocean death might never have been possible at all. Alternatives now
abound, many of which attempt to name precisely distinct causal agents:
Capitalocene, Plantationocene, Eurocene.38 For me, deploying the term An-
thropocene is a necessary strategic decision despite its limitations. Sticking
with the Anthropocene allows me to center the Anthropos, understood as the
form of Man that has driven colonial and capitalist expansion and, crucially,
laid claim to the normative figure of the witness.39 Conceived in this way,
the Anthropocene and Man are co-constitutive. Countering the idea that
the Anthropocene begins with the Industrial Revolution or nuclear bomb,
Heather Davis and Métis scholar Zoe S. Todd argue that “placing the golden
spike at 1610, or from the beginning of the colonial period, names the problem of colonialism as responsible for contemporary environmental crisis.”40
Known as the Columbian Exchange, 1610 marks both the moment when
the exchange of biomatter between Europe and the Americas reshaped
ecosystems and when carbon dioxide levels dropped in the geologic layer
as a consequence of colonial genocide. Dating the Anthropocene in this
way ties it both conceptually and historically to Man, and to the ending of
worlds that is such an essential dimension of settler colonialism and racial
capitalism.
Situating this book after the end of the world is thus a conceptual claim,
as well as a historical one: the world has long since lost any claim to de-
scribe the totality of being. In its place are countless worlds without claim
to universality or unity. One of the ways in which the end of the world finds
hope is in recognizing that the world has always been multiple, a pluriverse
produced by the world-making power of countless knowledge systems. Such
a multiplicity enables what Kathleen Stewart calls worldings, or the “intimate,
compositional process of dwelling in spaces that bears, gestures, gestates,
worlds.”41 Reflecting on war and its aftermaths, Caren Kaplan writes of the
“disturbance of conventions of distance and proximity, the presence of many
pasts and places in what we try to think of as the here and now” that make
“modernity’s everyday aftermaths—the undeclared wars that grieve not only
the present absences but the absent presents—not so much a m atter of ghosts
as multiple worlds that a singular worldview cannot accommodate.”42 The
unruly intensities and haunting disruptions of these martial aftermaths are just
as evident in the wake of ecological violence, technological enclosure, and
colonial dispossession: time, place, space, experience, and thought all resist
linearity, refuse organization, unsettle the unfolding of life.43 As a form of
worlding after the end of the world, nonhuman witnessing is one means of
building a communicative politics that begins with ecological relations and
the inherent agencies of nonhuman things, animals, and places.44
now engage with technics, ecologies, and politics in a testimonial mode that
entangles human and nonhuman actors.
Nonhuman Witnessing conceptualizes and theorizes these developments,
both as a means of making sense of these changes in situ and as a way to connect
them into a larger project of reckoning with crisis, violence, and trauma. It
joins a growing body of critical interventions into the connections between
aesthetics, witnessing, and forensics, prominent among them the legal, artis-
tic, and theoretical works of Eyal Weizman and his research agency Forensic
Architecture, located at Goldsmiths, University of London. Weizman’s Forensic
Architecture theorizes the application of architectural techniques of siting,
sensing, mapping, modeling, and analyzing to the task of uncovering and
communicating “violence at the threshold of detectability.”45 Attending to
material architectures, media objects, and situated testimonies, forensic ar-
chitecture is an operative concept that provides a method for investigation.
How that method articulates with wider transformations is the subject of
Weizman and Matthew Fuller’s Investigative Aesthetics, which explores how
resistant investigations assemble aesthetically to produce what they call an
“investigative commons” to challenge state- and court-sanctioned knowledge
production and counter the post-truth “anti-epistemologies” of misinforma-
tion and disinformation that have undermined trust in shared realities.46
Aesthetics in their terms comprises both sensing and sense-making, and, as
such, is not exclusively human but rather found across all entities in their
relational milieus, as I explore in more detail in chapters 1, 2, and 3, includ-
ing with a close reading of the Forensic Architecture project Triple Chaser.
More closely attuned to the questions of witnessing that occupy this book,
Susan Schuppli’s Material Witness combines reflections on her artistic prac-
tice and work with Forensic Architecture, which draws on archival and eth-
nographic research to develop an account of how matter can obtain standing
as a witness within public fora such as war crimes tribunals. Her material
witnesses are “nonhuman entities and machinic ecologies that archive their
complex interactions with the world, producing ontological transformations
and informatic dispositions that can be forensically decoded and reassembled
back into a history.”47 Material witnesses can express themselves through a
technical sensibility rather than speech per se, but “matter becomes a mate-
rial witness only when the complex histories entangled within objects are
unfolded, transformed into legible formats, and offered up for public consid-
eration and debate.”48 Material witnesses appear throughout this book, but
particularly in chapter 3 when I turn to the material traces of nuclear testing
and their mediation through art.
While Schuppli, Weizman, and Fuller ground their analysis in their own
investigative practices in and beyond the academy, Pugliese’s Biopolitics of the
More-Than-Human shares this book’s imperative to develop an apparatus for
critiquing contemporary warfare and the ruin it wreaks on bodies and ecologies.
Discontented with existing practices of evidentiary analysis, Pugliese calls for
a “forensic ecology” that can “examine the physical remains, in particular, of
more-than-human entities left in the aftermath of the violence and destruc-
tion unleashed in militarized zones of occupation.”49 This is resonant with
the investigation of drone warfare and its violent mediations in chapter 1,
particularly in thinking through the entanglements of technics, bodies, and
ecologies.
Witnessing is also an important subfield of inquiry within media studies,
producing nuanced empirical and theoretical accounts of distinctive modes
and practices of witnessing and testimony. In an influential essay, John Dur-
ham Peters defines witnessing as “responsibility to the event” and points out
that media must wrestle with the “ground of doubt and distrust” that distance
adds to the “veracity gap” inherent to the relay of any testimony.50 Building
on this conception, Paul Frosh and Amit Pinchevski propose the concept of
“media witnessing,” or “witnessing performed in, by and through media” as
essential to contemporary world-making.51 Media witnessing, Lilie Chou-
liaraki argues, is a fraught proposition, veering easily into spectatorship as
distant audiences are presented with atrocity to which they have few or no
avenues of response.52 New witnessing practices emerged in concert with
new media technologies, producing what media studies scholars have vari-
ously called mobile witnessing, citizen-camera witnessing, crowd-sourced
evidence, digital witnessing, witnessing databases, and data witnessing.53
These practices have enabled affected individuals and communities to nar-
rate crises in culturally distinctive ways and to self-represent their witness-
ing, even if they have also produced new expert and intermediary functions
for human rights organizations.54 Throughout Nonhuman Witnessing, this
research provides valuable insights into distinct witnessing practices related
to my lines of inquiry, but also serves as a springboard for thinking past the
limits of the human in ways that I hope will in turn be generative for scholar-
ship in media studies.
The works highlighted in the preceding pages share with mine a com-
mitment to interrogating the shibboleths of testimony, evidence, and
their relation to politics, technology, and justice. But there are also criti-
cal departures. Where Weizman elucidates an existing practice of forensic
architecture, this book theorizes a more expansive, ontoepistemological
reconception of witnessing as an encounter with and response to violence.
Where Fuller and Weizman focus on the theory and process of investiga-
tion as a mechanism for assembling aesthetics, this book attends to how the
sensing and sense-making of aesthetics produces a witnessing relation that
is not dependent upon an investigative team, method, or apparatus. Where
Schuppli insists on contestation within public fora as a condition for material
witnessing, my approach to nonhuman witnessing insists on witnessing as
an experiential relation that can produce contestation but is not dependent
on it for its existence or even politics. Where Pugliese centers the law and its
enmeshment with military power and colonial structures, my concern is with
processes that are distinct from the juridical domain and that fail to appear or cohere
within legal frames. Where media studies research delves into the complex
ensembles of media and human that produce distinct forms of witnessing, it
reserves ethical and political standing for human witnesses, intermediaries,
and audiences and leaves nonhuman agencies largely out of frame. In short,
Nonhuman Witnessing contributes to an active project within critical thought
in which debates over key concepts remain vibrant. And while the forms of
violence and modes of intervention with which all these works are concerned
are largely new, they are also embedded in a long history of transformation
in the forms and practices of witnessing, who counts as a witness, and how
shared knowledge is produced.
In the earliest foundations of the Western legal tradition in Athens and
Rome, the wounded body was considered the most reliable witness, which
meant torture was central to legal proceedings. Who could be tortured in
the name of truth was a matter of importance: the enslaved were often the
subject of torture to provoke truthful testimony, not the powerful and prop-
ertied.55 Witnessing was borne on the body up until the Enlightenment, when
the law of proof emerged in conjunction with the ocular revolution of the
Renaissance and the humanist conception of the dignity of Man.56 In 1846
the United Kingdom abolished the law of the deodand, a relic of old English
jurisprudence holding that an object in motion that had killed a human must
be held to account. Consequently, writes Su Ballard, “where once they were
able to take responsibility for the harm they have caused, now objects are
just another group of silenced witnesses.”57 This sentencing of the memory
of objects to evidence accompanied the modern juridical witness taking
familiar form: structured by norms, ordered in narrative, and verified by
accompanying evidence.58
The figure of the witness thus becomes synonymous with Man, which
meant certain bodies were again excluded: the enslaved, Indigenous and
Black people, and, often, women and the unpropertied. Unable to become
witnesses before the law due to explicit rule or fear of retaliation, their flesh
could be made to speak through violent punishment. Hortense Spillers calls
the flesh that “zero degree of social conceptualization,” left behind in the
“theft of the body” that occurred in transatlantic slavery and Indigenous
dispossession: “a willful and violent (and unimaginable from this distance)
severing of the captive body from its motive will, its active desire.”59 Without
will or body, the enslaved and First Nations were rendered illegible to the
law as persons, figured as property or inhuman objects. As the philosophi-
cal underpinning of imperial and settler colonialism, Man depended on the
construction of Black Africans as the ultimate other, the slave, and the assimi-
lation of all dark-skinned peoples into the category of “native” as the negative
inversion of the imagined normal human.60 As such, they were also denied
witnessing before the law, refused the right to attest to the violence done to
them.61 Thus the humanist figure of the witness fused new notions of the in-
dividual, unitary subject of rights and responsibilities with existing regimes
of humanity and inhumanity. But it also carried the legacies of monotheistic
religion, in which the figure of the witness claims intimacy with the divine.62
While the testimony of preachers figures prominently in American religious
culture, the martyr or blood witness is rooted in the early years of Christianity
and carries through—if in radically different ways—to the present in the
dead of Auschwitz and the suicide bombers of isis.
But the Enlightenment and its rearticulation of Man also produced a
new and divergent form of witnessing, one that emerged in the eighteenth
and especially into the nineteenth century as markedly free from overt ties
to violence and law. With the invention of the scientific method and the
establishment of practices of experimentation and observation, science and
scientists both invented and claimed mastery over the natural world through
the production of knowledge about it. As Lorraine Daston and Peter Galison
catalog, the emergence of a new “epistemic virtue” of scientific objectivity was
a complex process related to transformations in perspective, understandings
of self, and much more.63 Within this framework, the scientist bears witness,
and it is upon their testimony that knowledge builds. Hypothesis, experi-
ment, record, replication, verification, peer review, and scholarly publica-
tion built normative guard rails to ensure objectivity, like the swearing of
an oath in court.64 But the scientific witness depended on a host of erasures.
Women were excluded, as was embodiment, in the invention of an affectless
and cultureless objectivity.65 Haraway writes that this “gentleman-witness”
becomes “the legitimate and authorized ventriloquist for the object world,
adding nothing from his mere opinions, from his biasing embodiment.”66
By constructing expert knowledge divorced from opinion and transcendent
authority alike, the scientist—by default white and male—became endowed
with “the remarkable power to establish facts. He bears witness: he is objec-
tive; he guarantees the clarity and purity of objects.”67 This is the figure of the
witness capable of the “God-trick” of scientific rationality, which claims an
objective, ahistorical, and unbiased viewpoint on the world.68 This modest
witness wins his authority through the performative disavowal of power, and
in doing so entrenches science—new though it is—as the authoritative mode
of apprehending the world. Against the rich multiplicity of worlds that jostled
and warred with one another, this new science and its modest witnesses re-
made the world as a singular, knowable thing, conquered by colonialism and
made profitable by capitalism.
If modern science heightens the power of Man the Witness, then the
roughly concurrent emergence of print and then technical media amplifies
and extends that authority in time and space, even as it enables new forms
and practices of nonhuman witnessing. Media technology had always been
bound up with witnessing—consider Moses, who descends from Mount
Sinai with the word of God engraved in stone—but the advent of modern
communications made bearing witness a form of informational sociality
around which shared truths form. No longer a matter for courts, churches,
and laboratories alone, witnessing through the printing press, telegraph, and
radio imagined nations into being and rendered distant events immedi-
ate. No surprise, then, that media studies has had so much to say about
witnessing. For John Ellis, television had an even more profound effect on
witnessing by placing the viewer in the position of the witness.69 Mass media
made witnessing, as Frosh and Pinchevski put it, a “generalized mode of re-
lating to the world.”70 But this proliferation of media witnessing amplified the
“veracity gap” that must be bridged to grant the media narrative its author-
ity as truth, as John Durham Peters explains.71 Liveness, that new quality of
televisual media, stood in as truth’s guarantor: How could what is unfolding
now before one’s very eyes be anything but truth? Yet liveness is no guarantor
of the complete picture or the reliability of the witness, nor even—as I will
show in chapter 2’s examination of deepfake technologies—of the existence
of the witness. Liveness, like all media coverage of suffering and violence,
can produce spectatorship that dispels action rather than spurs it, present-
ing mere seeing as sufficient response.72 Still, media witnessing is often not
intended to spark action; its purpose is to bind communities around shared
understandings of events, such as the world-shattering nature of the 9/11
attacks for America and much of the West, or the extended intractability
of the covid-19 pandemic. Increasingly, this binding takes place not only
through the consumption of images, but also through actively participating
in their production and circulation.
In both science and media, witnessing serves as a sociotechnical appara-
tus that refracts experiment into authority, reportage into truth, science and
broadcasting into power.73 In the twentieth century, a shift took place from
transcendental knowledge, continuous media, and analogue technologies
to mathematical grids and models, discrete media, and statistical technolo-
gies.74 In The Practice of Light, Cubitt argues that the emergence of technical
media requires and constitutes a transformation in the processes through
which (especially visual) media are produced and in the underlying epistemic
framework.75 Enumeration, probability, and statistical inference and analysis
take hold, backed by mathematical theories of information and markets.
With the arrival of the postwar datalogical turn and the claims to potential
omniscience that flow from a seeming infinitude of information, the “com-
municative objectivity” of the cybernetic revolution documented by Halpern
began to bind both science and governance ever more tightly to networked
systems and screen interfaces. Networked computation applied to a data-
fied world produced a new kind of observer, one who followed the rules of
the new cybernetic order but saw the world through increasingly inhuman
modalities of perception.76 The witness as cyborg, harnessing and harnessed
to new technologies of vision, began to shape how data was presented and
deployed.77 But it also signaled a deeper infiltration and extension of human
perception and action via machine. This technological transformation laid
the foundation for smartphones, drones, remote sensors, and even artificial
intelligence to become instruments of witnessing, even as they transform the
relationship between witnessing and the ground truth against which it is so
often measured.78
What these changes in media and mediation make clear is that witness-
ing is a relational process that probes, exposes, and undoes the limits of
representational modes of knowing and being.79 Rather than reinstantiating
the authority of the unitary subject or even of language, contemporary wit-
nessing exposes the primacy of relations between bodies, events, environ-
ments, worlds, and objects, even if they are obscured, denied, disavowed, or
absent. While testimony might take the form of language or a fixed image,
the experience of witnessing is always affective, occurring in the encounters
through which bodies and worlds emerge within and alongside one another.
Witnessing, writes Kelly Oliver, is “the heart of the circulation of energy
that connects us, and obligates us, to each other.”80 But now witnessing must
reckon with the unraveling of the ontological and epistemological grounds
of knowledge by radical theory on the one hand and the interlocking crises
of the contemporary world on the other.
In an evocative, searching essay on the relation between testimony and
the witness, Michal Givoni writes that rather than an age of testimony, “ours
is an era of becoming a witness, a time in which individuals are called, in
greater numbers and intensity and at a growing rate, to fashion themselves
as witnesses, while their witness position is never guaranteed and their mode
of witnessing is questioned.”81 If becoming-witness is the task set for the
human, then what of the agencies that make up more-than-human worlds?
If we shift the angle with which we approach witnessing and the human, the
scene might be different: Could we not think of witnessing as yet another
pressure applied to the human, another dissolving agent working to undo
the narrowly inscribed figure of knowing and being that has enabled
remarkable advancement but also done terrible, enduring, and world-ending
violence? Or, to put this differently, what if it is not only today’s insistent
presence of the nonhuman that demands a new understanding of witnessing,
but also that witnessing carries within itself an unrevealed history, a constitutive
nonhumanity?
This choice to bring witnessing into conjunction with the “nonhuman”
rather than the more-than-, post-, in-, or even de-human was not easily ar-
rived at. For me, nonhuman emphasizes distinction and difference from the
human, but retains its necessarily entangled relation to the human and thus
asserts the necessity of keeping the human in the frame.82 As Richard Grusin
observes, “The human has always coevolved, coexisted and collaborated
with the nonhuman,” and, as such, “the human is characterized precisely
by this indistinction from the nonhuman.”83 The human is, in this sense,
constitutively dependent on complex relations with the nonhuman. This
relationality is central to moving to conceptualize nonhuman witnessing,
since witnessing itself is a relational practice. But I also find the nonhuman
beneficial because it implies no time before, after, or beyond the human.84
“Nonhuman” thus avoids the potential to read posthuman as an uncritical
desire to move “beyond the human,” as Zakiyyah Iman Jackson puts it, which
can be an impossible endeavor for those never fully afforded the category
of human to begin with, and who might not now wish to receive it, even if
only in passing.85 As Karen Barad points out, attending to the nonhuman
“calls into question the givenness of the differential categories of ‘human’
and ‘nonhuman,’ examining the practices through which these differential
boundaries are stabilized and destabilized.”86 As such, Dana Luciano and
Mel Y. Chen argue that “the nonhuman turn marks, for many critics, not a
venture ‘beyond’ the h uman but a new mode of critical realism, a recognition
that the nature of ‘reality’ itself is changing as power moves away from the
individual.” Doing so has material consequences.87 For Shela Sheikh, “where
care for both human and nonhuman life is at stake, witness collectivities
necessarily entail an expansion beyond the category of the human.”88 This
questioning of categories, boundaries, and differences is not only a matter of
language, but of the affects, materialities, and mediations of forces, bodies,
meanings, experiences, energies, and ecologies.
In this light, nonhuman should not be read as a dismissal of the related
terms outlined here, nor as a disavowal of the species we call human as a key
locus for the struggle for justice. Established practices of witnessing have
stratified distinctions between human and the non through an inability to
give materiality and relationality their due. Zylinska argues that “embracing
nonhuman vision as both a concept and a mode of being in the world will
allow humans to see beyond the humanist limitations of their current phi-
losophies and worldview, to unsee themselves in their godlike positioning
of both everywhere and nowhere, and to become reanchored and reattached
again.”89 As I conceive it, nonhuman witnessing is both a particular form of
perception and something else besides, a communicative form shaped by the
materiality and affectivity of the world as medium: an ethicopolitical mode
of relation for grounding anew how meaning comes to matter in the making
and remaking of worlds. Nonhuman witnessing is not an ahistorical or tran-
scendental concept, but rather the naming of a set of interconnected practices
and processes of witnessing bound up with evolving epistemic frameworks
and forms of mediation.
Nonhuman witnessing is not a free-floating concept but an injunction to
the human to become with and alongside the non in far more attentive and at-
tuned ways. Cubitt argues that fundamentally transformed practices of com-
munication offer “the possibility of changing the conduct of relations between
human beings and nature, and between both of them and the technologies
that so profoundly and multifariously mediate between them.”90 Nonhuman
witnessing is thus a historical process, one that has—I would contend—always
operated in conjunction with human ethics, politics, and meaning-making but
that manifests in new forms, practices, intensities, and dynamics as epistemes
and media technics change through time. Nonhuman witnessing in the contemporary
conjuncture is thus a response to Man the Witness, but exploits,
escapes, and exists beyond the dominance of technical media. Tracing its
occurrence in instances as diverse as edge-computing weapons targeting and
glass-blown art, this book shows how nonhuman witnessing addresses power
as process, not solely biopower or necropower, but the ontopower that brings
becoming within its ambit. As a modality that operates across multiple levels
of sense-, truth-, and world-making, opening witnessing to the nonhuman
takes up the task of producing new communicative aesthetics, ecologies, and
politics in the face of violence and its traumatic aftermaths.
To testify is, in the most basic sense, to insist that something be remem-
bered by someone or something other than the witness. Memory is shared
across species, technics, and materials: it is human and animal recall, but
also information stored in computation, ammonites fossilized in stone, scars
on gumtrees after summer fires. Its politics must be forged; its collectivity
brought into being. One means of making memory collective is witnessing.
Memorials to wars past bear witness, and statues of slave owners, Confed-
erate generals, and colonial “heroes” remind us of the violence that can be
entailed in being called to witness and remember under the normative rule
of empire.91 Memory itself is not normative, but rather attains its ethical or
moral weight through its marshalling to cultural or political ends. Witness-
ing, by contrast, is an ethicopolitical process: it is always and already on the
brink of becoming-political, even if its politics remain latent or geared very
far from justice. Witnessing orients toward the future, even if it reaches back
into the past. This book, then, is not “about” memory, even if memory and
its uncertainties feature often. Instead, I am interested in the registering of
experience that precedes memory, and of the intimate relation between this
witnessing and the violence and trauma to which it so often responds.
For trauma studies in the humanities, the witness to trauma—and to
historical trauma and atrocity in particular—lives with the violent event
written on and through the body, such that the past is in fact never past at all.
Fragments of experience cling to the present and refuse to become memory,
continuing as lived remnants of violence. Testimony exposes the failure of
language, the stuttering of representation, and the shattering of experience at
the heart of trauma.92 Testimony is thus vital and necessary, even as it cannot
ever provide a full accounting of trauma, nor be enough on its own to work
through the traumatic event and reconstitute the subject. This is part of why
trauma theory has had such influence on literary, film, and cultural theory: art
addresses those incidents of history that refuse comprehension, seeking to
overcome the collapse of meaning through aesthetic and imaginative force.
In this sense, trauma theory is unabashedly anthropocentric. It might not
celebrate a classical humanism, but it is dedicated to the human (in)capacity
to speak in the face of that which refuses or resists speech: those traumatic
events that most demand voice are also exactly those that refuse representation.93
If the relation between testimony and traumatic event is necessarily
fractured, then how can the witness testify to historical facts? How can his-
tory even be written?94 This fragmenting of the connection between writing
or speech and the event throws testimony into crisis: witnessing becomes
precisely the urgent task of pursuing the event that will not give itself up to
knowing, whose full scope and meaning always eludes the grasp.95 This neces-
sary failure of witnessing within trauma theory marks the failure of the human:
witnessing signals the limit point of what the human can know of itself and
what it can become.96 Trauma can never appear as itself to the knowing sub-
ject; it can never be known and rendered speakable. Consequently, the human
itself is always bound by this failure to reckon with the traumatic. Witnessing
cannot exceed or extend beyond the human because it is constitutive of an
incapacity for the human to be fully human in the face of trauma. Positioning
both trauma and testimony as operating on the line between human and less-
than-human, as trauma theory does, implies that the nonhuman cannot be
accorded either trauma or testimony. If witnessing enacts the paradox of the
human failure to be fully human, what room is there for the animal, the plant,
the stone scorched by exploding fragments of a Hellfire missile? Yet trauma
escapes the confines of the subject. It can be climatic, atmospheric, collective,
and it can be transmitted between people and across generations. As chap-
ters 3 and 4 argue, trauma can be both affective and ecological. Trauma con-
tinually exceeds the human subject, which means that reading the failure of
witnessing as a falling short of the human cannot hold. This very proposition
is an obscured anthropocentrism that predetermines what witnessing can be.
But all this discussion of testimony and trauma implies an original violence.
While trauma and witnessing are often yoked together by theory,
relations between violence and witnessing are often assumed, unstated, or
unresolved. In part, this is because violence itself is a slippery concept: perva-
sive, elusive, varied, and resistant to neat formulations. But it is also because
witnessing and violence converge and diverge, coming together in some con-
texts but not at all or only thinly in others. Consider the difference between
witnessing police killings and witnessing a volcanic eruption. Both might
involve the destruction of life, but only one constitutes violence as such.
Hannah Arendt makes this distinction clear. “Violence,” she writes, “is dis-
tinguished by its instrumental character,” whereas force describes “the energy
released by physical or social movements.”97 If violence is instrumental, it is
also relational. It might well be that violence is intrinsic to being a body. “The
body implies mortality, vulnerability, agency: the skin and the flesh expose
us to the gaze of o thers,” observes Judith Butler, “but also to touch, and to
violence, and bodies put us at risk of becoming the agency and instrument
of all t hese as well.”98
But violence can be structural, as well as direct and immediate, “exerted
systematically—that is, indirectly—by everyone who belongs to a certain
social order,” as Paul Farmer observes.99 Structural violence resists neat as-
criptions of blame or responsibility. Its effects are diffuse yet deeply harm-
ful, enabling oppression and working to maintain existing hierarchies of
wealth and power.100 Capitalism and colonialism are forms of structural
violence, even if they can also manifest in more kinetic, martial, and im-
mediate forms. This is why Patrick Wolfe describes settler colonialism as a
structure, not an event.101 But other forms of distributed violence feature
in this book: symbolic, discursive, infrastructural, environmental, and
algorithmic violence, for example. Lacking an obvious originating agent,
such violence takes place through institutions, linguistic exclusions, tech-
nocratic programs, extractive industries, and other such assemblages, often
harnessed to state and corporate power but at times filtered through more
ambiguous actors.102
Violence is not only distributed, but also differentially experienced. As
Saidiya Hartman, Hortense Spillers, and other scholars of slavery and Black
life teach us, violence strips away the body and exposes the flesh to injury,
often in diffuse and difficult-to-detect ways that permeate the quotidian.103
Racial violence exemplifies this dynamic because it coalesces the capricious-
ness of law, the exclusionary force of Man, and the harnessing of relation to
produce subjects not governed by the law. Writing on the killing of people
of color in Brazil’s favelas, Denise Ferreira da Silva argues that “raciality im-
mediately justifies the state’s decision to kill” because such “bodies and the
territories they inhabit always-already signify violence.”104 Violence exposes
the vulnerability of the body, but it distributes that vulnerability in radically
unequal ways. To say, then, that the body is defined by its vulnerability to
violence makes a necessarily political claim about who gets to possess a body
to encase their flesh. This is a question rooted in the Enlightenment concep-
tion of the subject, the figure of Man that Wynter ties to European colonial
expansion. Binding witnessing to the human means that who witnesses is
always contested ground—and witnessing itself can be complicit in the le-
gitimation of violence. After all, can the figure denied humanity bear witness
if witnessing belongs to the human? Preceding the body, flesh marked by
violence offers a way outside of Man, a fugitive witnessing enabled through
the generativity of flesh that refuses to give up its vitality and seeks solidarity,
resistance, and joy.
Violence, in other words, is a malleable phenomenon. In war, it can be
mechanized and automated, but also intensely intimate. It can unfold
slowly, as in the degradation of bodies exposed to radiation or the col-
lapse of environments polluted by toxins. “Violence unfolds on different
scales, over different durations, and at different speeds,” writes Weizman. “It
manifests itself in the instantaneous, eruptive force of the incident, evolves
in patterns and repetitions across built-up areas, and then manifests itself
in the slower, incremental degradation of large territories along extended
timescales.”105 Nor are those forms, modalities, intensities, and speeds sepa-
rate from one another. Violence flows between states. Buzzing in the sky
above, the drone generates fear and abiding anxiety, a kind of diffuse and
atmospheric violence, even as its surveillance systems engage in the violence
of datafication, transforming the textures of life into metadata. And then,
when a target is acquired and a missile launched, violence becomes horrify-
ingly kinetic. People living under drones in Afghanistan, Yemen, Gaza, or
Ukraine witness this violence, as do members of the military apparatus from
here is crucial: witnessing is not simply a response to violence, but what
violence destroys. “While trauma undermines subjectivity and witnessing
restores it,” she writes, “the process of witnessing is not reduced to the testi-
mony to trauma.”108 Trauma cannot be the foundation of subjectivity because
such a move could only engender an impoverished political life. Disaggre-
gated from trauma, witnessing forges bonds that exceed any given situation
or singular act of witnessing.
Witnessing is always an open-ended, recursive, and necessarily active
process of becoming. But the important move that Oliver makes is to situ-
ate witnessing within a relational milieu, arguing that the self develops its
capacity as an internal witness through being witnessed by the other; subjectivity
thus emerges from and with social relations. Working
within a psychoanalytic framework, Oliver argues that witnessing is essential
to working-through hostilities that stem from fear and anxiety over differ-
ence. This is a “profoundly ethical operation insofar as it forces us not only
to acknowledge our relations and obligations to others” but to transform
them.109 Working-through connects witnessing to sociality and makes trans-
formations—of love, of justice, of respect—possible. Unsurprisingly, Oliver’s
witnessing is unquestionably human: a process that involves “language and
gestures” and an act of “love” in the face of the other and against the de-
humanizing power of oppression and violence. Witnessing is intrinsically
human such that human subjectivity itself is the “result of a continual process
of witnessing.”110 Objects have no capacity to witness precisely because the
object cannot speak or gesture.
Despite this avowed humanism, Oliver’s account helps elucidate some of
the interventions this book makes in thinking witnessing with the nonhu-
man. First, the rejection of a symbiotic relationship to trauma opens wit-
nessing to world-making in ways that invite richer and more generative
potential while not at all foreclosing the necessity of witnessing in response
to trauma and violence. Second, the insistence on the relationality of wit-
nessing as enacted through address and response provides a way into what
witnessing might be if address and response involve nonhuman animals, ma-
chines, entities, environments, and so on, as long as we understand both
address and response outside their familiar anthropocentric frames. Third,
the conception of relationality as fundamentally biosocial, affective, and en-
ergetic already contains within it a permeability that is almost ecological in
its insistence on complexity and process. Fourth, the notion that witnessing
forges relations that make working-through hostilities to difference possible
offers a way of understanding the dynamism of witnessing and why it makes
transformation possible. Taken together, these four implications offer points
of departure from the human witness and into the unruly domain of nonhu-
man witnessing.
In the painting Theatre of War: Photons Do Not Care (figure I.1), Kathryn
Brimblecombe-Fox depicts the machinic attempt to make planetary environ-
ments subject to martial enclosure. A cluster of drones, networked by fine red
lines, looms over a pale dot in a field of rich blues and reds reminiscent of
scientific visualizations of cosmic evolution. Viewing the painting, we reside
in the cosmic distance, thrown far from any conceivable human perception of
the Earth or its technologies of war. And yet the painting calls for us to attend
figure I.1. Theatre of War: Photons Do Not Care, oil on linen, 92 × 112 cm, Kathryn
Brimblecombe-Fox, 2021. Courtesy of the artist.
to the planetary nature of military technologies, to their growing tendency
to render space-time itself as a site of martial contest. Photons do not care:
these massless particles are the raw stuff of the electromagnetic spectrum,
transcending national boundaries, the human, and the planet itself. And yet
they are also, increasingly, the site of military contestation and intervention,
as autonomous and cyber warfare infuses all other forms of martial conflict.
Military media, networked systems, and algorithmic assemblages all seek
mastery, and in doing so tug us into an age in which the world as target has
given way to the planet as an operative medium for targeting any point on
or above its surface.111 The hand of the artist is evident in the occasional un-
blended brush stroke of oil on linen, and in the uneven stippled dots arranged
into the pixelated drones. These looming silhouettes blur
computational mediation with organic representation, human hand, and
galactic scale. There is no escaping the human, the painting insists, no release
into an existence without responsibility for the crises wrought in the name of
economic growth, colonial expansion, state power, and military supremacy.
The question is what will happen, down on that pale blue dot, to move toward
survival and a new flourishing of life.
If crisis is the political and ecological condition within which much of
the planet lives, the unraveling of the fantasy of a unified, cohesive, and
knowable world offers some potential for more just and equitable futures.
The enmeshed desire of states and other actors to both produce and control
crisis—crisis as a modality of governance that allows for the abrogation of
democratic and other responsibilities—is not solely about discourses, institu-
tions, or even technologies that target individuals and populations, whether
as biopolitical life-in-the-making or necropolitical death-in-waiting. Onto-
power heightens the stakes of contemporary technopolitical power, enabling
states and other actors to target the stirring of life within the bare activity
of existence. Techniques of ontopower seek to direct being as it becomes, to
harness emergence itself to the ends of the already dominant forces of pro-
duction and control. Such are the promises of the algorithmic technologies
of war, governance, culture, and ecology that this book explores, but so too
is there the potential in resistant harnessing of technics and aesthetics, algo-
rithmic and otherwise, to produce new modes of surviving with and living
beyond the World of Man.
Addressing human responsibility for the existential crises within which
we find ourselves—and reckoning with the radically unequal distribution
of both responsibility and the effects of crisis—requires us to hold onto the
human. But this holding onto the human must also undo the blind privilege,
the narrowness of vision, and the closed imagination that undergird an An-
thropos that is bound to Man. Oliver writes that “being together is the chaotic
adventure of subjectivity.”112 This book calls for witnessing as the foundation
of a renewed becoming-together—becoming-environmental, becoming-
machinic, becoming-imperceptible—that coheres not around human subjectivity
but around the chaotic dance of life and nonlife.
Chapter One
witnessing violence
Two mq-9 Reapers confront each other nose to nose, simulated aerial ve-
hicles floating above simulated mountainous country. Light bends across
the mirrored surface of one; the other is gray and black, a digital replica
of its physical counterpart. Interspersed by spinning reflective planes and
suspended in inscrutable contemplation, the two machines seem possessed
of their own needs and desires. What takes place in this communion of mili-
tarized drones? While the drone skinned in military tones and textures is
disconcerting if familiar, the mirrored drone is both alluringly beautiful
and horrifyingly alien, an other-than-human object across which the gaze
slides and fails to stick. Its mirroring offers no clear reflections, but rather
refracts its surroundings into distorted fragments—a nonhuman resurfacing,
the world rendered into the materiality of the drone as it seeks to become
imperceptible. This moment in Australian artist Baden Pailthorpe’s mq-9
Reaper I–III (2014–16) captures something of what makes military drones
fascinating, disturbing, and urgently in need of critical attention. At once
threatening and seductive, the Reaper drone promises an omniscient and
yet nonhuman capacity to perceive, know, and kill, one that sanitizes war by
making it datalogical, computational, and spatially and affectively remote.
For the militarized drone of the last two decades—exemplified by (but far
from limited to) the Predators, Reapers, and Global Hawks operated by the
United States, or the Turkish Bayraktar tb2 used by Ukraine—this capacity
has depended on its near invisibility, its ability to operate untouched from the
atmosphere. As war becomes increasingly autonomous and more centered on
great power conflict, the forms and applications of drones are becoming far
more varied, ubiquitous, and dependent upon artificial intelligence.
Exhibited at Centre Pompidou, Art Basel Hong Kong, and numerous
festivals and galleries, mq-9 Reaper I–III presents drone warfare as violence
enacted through the computational simulation of reality (figure 1.1). Built in
the modeling program Autodesk 3ds Max, Pailthorpe’s project reimagines
key locations within the drone apparatus, suspending them in the air above an environment
that references the mountainous terrain of Afghanistan, over which drone
warfare took its contemporary form. Shipping containers rotate slowly in
the clouds, walls cantilevering open on hydraulics to reveal ground control
station cockpits loaded with the screens, controllers, and interfaces needed to
crew the Reaper and its siblings. Or they open to expose spare living rooms
in which uniformed men perch on beige couches or do jumping jacks, trans-
planting the suburban life that bookends on-base shifts operating drones
from the domestic United States to the atmospheric zone of war. Graphics
are realistic but heightened, surreal simulacra of the computational space
of war, an aesthetic familiar from both video games and the promotional
videos produced by arms manufacturers. Their sterility mimics the rhetoric
of precision and hygiene that accumulates around remote warfare and infuses
the technocratic and corporate discourses that elide the violence inflicted by
lethal strikes.
More than this, the computational materiality of mq-9 Reaper is a stark re-
minder of the layers of simulation, data, modeling, and algorithms connected
by distinct logics and processes that constitute the martial contemporary.
Estranging relations between elements within the drone apparatus while
insisting on the distortions and reflections produced by its operations, Pailt-
horpe lays bare the circulatory, diagrammatic flows of the system by shifting
the locus of agency away from the human and to networked relations. When
soldiers appear on screen to shadowbox and sit at their control stations, they
have also entered the space of the drone and become its subject. Yet in taking
up the toolkit of modeling, computation, and simulation, Pailthorpe know-
ingly enters the epistemic regime of contemporary war and so is bound to
its informational logics and representational modalities even as they come
under scrutiny. How, then, to witness this increasingly autonomous form of
figure 1.1. Still from mq-9 Reaper (III), Baden Pailthorpe, 2016. Courtesy of the
artist.
war? How to grasp the violence its witnessing might do? While Pailthorpe’s
aesthetic intervention makes for an instructive entry point into the entangle-
ment of aesthetics, war, and computation, this chapter is not about drone art
per se.1 Rather, it pursues these questions of witnessing violence by tracing
the violent mediation that is essential to perception, knowledge-making, and
communication in contemporary war.
differences, but as bearer of communication, it also establishes organizational
forms with varying degrees of longevity.”3 While mediation can be transfor-
mative and generative, enabling deep communication and the flourishing
of rich ecologies, it is neither bound by moral standards nor intrinsically ethi-
cal. Mediation is thus not a normative process. With this concept of violent
mediation, I want to distinguish between mediation in general and those
instances in which it animates human desires to control, extract, dominate,
oppress, and kill. Violent mediation is often most evident through technical
systems that subjugate life and nonlife to their ends, but it is also at work in
datafication and computation, and in a host of biogeophysical interactions
instigated by humans to bring ecologies to heel or direct them to human
ends. In this chapter, my focus is on the violent mediations of drone warfare,
enacted through its sociotechnical apparatus. Violent mediation is not ancil-
lary to drone warfare, but constitutive of it.
In this, drone warfare is not an outlier within war more generally but
rather symptomatic of its media saturation. Martial operations are intensely
mediated, bound together through recursive informational flows structured
and organized by media technics. “Military knowledge,” as Packer and Reeves
put it, is primarily “a media problem, as warfare is organized, studied, pre-
pared for, and conducted according to communicative capacities.”4 Military
strategy, logistics, and operations are all determined by media technological
capacity, but also shape those technologies in turn. The necessity of com-
munication across distance produces semaphore, the telegram, satellites,
and the internet, and these then enable naval formations, the coordination
of mass armies, the deployment of missile batteries, and the networking of
the battlefield via tactical drones, wearables, and mapping systems. This co-
constitution of war and media means that human soldiers, pilots, analysts,
and even commanders are increasingly ancillary to the workings of the
systems themselves. If this was already true in the logistics or command-and-
control infrastructures of earlier wars, the intensification and proliferation
of automation marks an acceleration of the removal of human agency. No
longer the essential component in waging war, the human is increasingly
seen as either its most fallible element or its datalogical target. The ballistics
revolution reorganized battlefield perception around wider geographies and
enabled the infliction of violence at considerable distance, while the nuclear
revolution introduced a planetary perception coupled with the potential for
violence at a planetary scale. But the emergent ai revolution is reconfiguring
perception to be everywhere and nowhere, with the capacity for violence so
tightly bound to perception that it too can take place anywhere at any time.
Warfare transforms not only in connection with technological, strategic, or
even political change, but also in concert with epistemic shifts in the foun-
dational frameworks, assumptions, and metaphors of scientific knowledge.5
From its inception, artillery targeting entailed mediation: the selection
of targets, measuring of distances, the translation to maps, the adjustment
of machinery, the firing of the gun. But with the emergence of autonomous
systems of war—exemplified by the adoption, development, augmentation,
and transformation of remotely piloted systems such as drones—mediation
takes on a new complexity founded on the imagined and presumed exclu-
sion of the human from its workings. Wide area motion imagery systems
track areas as large as small cities at high resolution, identifying and follow-
ing targets of potential interest that would be difficult if not impossible for
human analysts to comprehensively account for. As such systems develop
in capacity and autonomy, automated processes of mediation will locate,
select, track, and even execute threats that only exist within the framework
of the system. Military media are thus “constantly producing new enemies,
and new methods of enemy identification stimulate the development of new
weapons technologies designed to kill those newly identified enemies.”6 This
interconnection between media and what Packer and Reeves call “enemy
epistemology” and “enemy production” is not only a question of stabilized
media technologies intersecting with military strategic imperatives. It also
occurs through material processes of mediation, bounded by instrumental
technologies but let loose on the complex terrain of life.
As I theorize it, violent mediation is embedded in a material-ecological
understanding of war and the role of technologies of perception within it. In
this, it shares much with what Antoine Bousquet terms the “martial gaze,”
which aligns “perception and destruction” through “sensing, imaging and
mapping” that encompasses not just the visual but “the entire range of senso-
rial capabilities relevant to the conduct of war.”7 As perception and violence
are increasingly twinned, mediation functions within those apparatuses to
produce violence. Violent mediation is thus intrinsic to the martial gaze. We
might think of violent mediation as the connective tissue of such systems,
constituting sensing at the material level of technical operation but also
stitching sensing into the larger apparatus: the thermal camera of the drone
sensing its environment entails violence within its mediating processes, but
also in the translation from sensing (thermographic camera) to imaging (de-
coding for optical display) to targeting (fixing of the reticule on an agglom-
eration of pixels). Processes of mediation occur within each stage, but also
across them and throughout the kill chain. Attending to violent mediation
thus means focusing on the movement, use, and structuring of information
within the military apparatus, as well as within the elements that compose
it. As with the martial gaze, much of this mediation is not visual—or only
presented visually for the benefit of human actors within the system. Much of
what is violent in such mediation is bound up with the technical processes
of datafication, abstraction, analysis, and instrumentalization that increas-
ingly animate military technologies of perception.
This chapter asks how witnessing might take place through the violent
mediations of the martial gaze, and how t hose mediations—and the corpo-
real, ecological, and affective violence they engender—might be witnessed.
It locates remote and increasingly autonomous warfare as both a driver and
beneficiary of algorithmic enclosure, while recognizing that it simultaneously
responds to and produces ecological crises.8 Nonhuman witnessing provides
an analytic framework for conceiving and excavating the witnessing that
takes place in, by, through, and, crucially, of the drone assemblage. War has al-
ways been a form of life, as Grove maintains, but its emergent contemporary
forms possess a ubiquity, complexity, variability, autonomy, and technicity
unprecedented in human experience. Reckoning with this becoming-war will
require a refiguring of the human relation to it, but also a transformative shift
in what counts as ethical and political claims to knowledge. This chapter thus
lays conceptual foundations for the examinations of algorithms, ecologies,
and absences that follow by showing how violent mediation is constitutively
imbricated with war.
By attending to the nonhuman of witnessing, I am not dismissing or mar-
ginalizing the Afghans, Yemenis, Somalis, Palestinians, Pakistanis, Syrians,
Iraqis, and others who have given and will continue to give testimonies to
reporters and human rights organizations.9 As Madiha Tahir forcefully points
out, “everything is speaking and talking and witnessing and testifying these
days, it seems, except the people whose family members and neighbors
have been blown to bits in this war.”10 Hearing those voices louder and
in more forums is unquestionably a vital task. Factual in orientation and
presented as narrative, many of these testimonies are shaped by the expecta-
tions of human rights conventions and the norms of tribunals and courts.11 Their
very familiarity, their echoing of testimonies of torture or rape or migration,
speaks to the “becoming witness” of international humanitarian politics
in the latter half of the twentieth century.12 Such testimonies intentionally
reinforce the humanist, rights-bearing subject because their very efficacy
and legitimacy depends on recognition by the institutions and conventions
of international humanitarian law, which are themselves interwoven with
neoliberal attempts to develop a moral framework for capitalist relations in
the wake of World War II.13 Yet in doing so they seek to make recognizable
encounters with nonhuman systems of violence—networked, autonomous,
highly technical, and massively distributed in space—that resist the forms of
knowing and speaking available to the eyewitness. There is a tension, then,
between the necessity and possibility of making drone violence legible within
the conventions of human-centered forums, whether international humani-
tarian law or rights discourses more generally. Within such a framework,
drones and their data can only be made evidence, rather than recognized as
witnessing in themselves. That is, human witnessing takes precedence and
priority, relegating the nonhuman to the status of evidence that must be in-
terpreted. While Pugliese provides a powerful case for a counterforensics that
reckons with the more-than-human and Schuppli shows how material wit-
nesses can obtain standing within public and legal fora, this chapter adopts
a strategic agnosticism toward the agencies that animate the drone apparatus
and to the potential for any instance of witnessing taking future shape as
testimony. It refuses to deny potential standing as witness either to the system (the
entire military drone network, for example) or to any given element of such
systems (automated image analysis software, for example), even if they will
be hostile witnesses. And it understands nonhuman witnessing as preceding
the existence of fora for testimony, and so sees witnessing as independent
from such fora. This chapter thus attends to the constitutive entanglement
of human and nonhuman witnessing as a relational process of mediation
through which violence is both registered and enacted on people, places, and
ecologies, no matter whether testimony is ever called for.
In the remainder of this chapter, I examine nonhuman witnessing within
the widening frame of increasingly autonomous martial systems. First, I
consider the multiplying aftermaths of drone violence, attending to the in-
terplay of the survivor testimony, war’s material and cultural traces, and the
way drone sensors and computational systems perform their own nonhu-
man witnessing. As a counterpoint to this bleak vision, I then turn to look
at drone and remote sensor witnessing of Aleppo, Syria, in the aftermath
of war. Moving from the drone war of recent decades to more autonomous
futures, I then examine the violent mediations of augmented sensor systems
in the case of the Agile Condor targeting system, which I read as an instance
of automated media that displaces and disperses witnessing across military
architectures and into the preemptive technics of edge-computing targeting
systems. Finally, the chapter closes with an extended discussion of witness-
ing, autonomy, and the martial future of violent mediation.
martial drone empiricism
temporary war in general and of drone war in particular. The kill box itself
is a mediation: an operative transfiguration of world into media. In taking
up life and refiguring its relation to death, this mediation is constitutively
violent even before it kills, reworking the ontoepistemological status of those
within its ambit from life to not-yet-death. Whether in concert with the kill
box or operating in a less preauthorized context, the kill chain of the drone
is distributed, dispersed, and mobile, producing and responding to emergent
threats actualized within and through the network.
In this chapter, I approach the problem of witnessing (drone) violence by
understanding it in relation and response to the becoming of war, rather than
beginning with an imagined fixity or boundedness to war. Against the idea
that the nature of war is given or known in advance, Antoine Bousquet, Jairus
Grove, and Nisha Shah propose embracing “war’s incessant becoming” such
that “its creativity, mutability and polyvalence” are as central to analysis as its
destruction.20 Their “martial empiricism” references philosophies of radical
empiricism—particularly Whitehead, James, and Deleuze—that resist any
preferential focus on either ontology or epistemology in favor of an open-
ended embrace of experience in all its generative mutability. Martial empiri-
cism orients critique toward the processes, relations, affects, sensations, and
technicities through which war autopoetically emerges. Such an approach
necessarily involves an openness to the incapacity to provide ultimate or de-
finitive answers and demands instead that martial violence be apprehended
"as a process of becoming that is suspended between potentiality and actual-
ity,” in which the task of critique is “scrutinizing the enfolding of intensities,
relations and attributes that give rise to war’s givenness.”21 In the context of
increasingly autonomous warfare, one starting point for a martial empiri-
cism might be the perceptual relations that cohere around the figure of the
drone, itself understood as an unstable and hybrid assemblage through which
knowledge is produced and operationalized to violent ends.
My concern here, however, is less the emergent dynamics of autonomous
warfare as such than how witnessing occurs within this condition of
martial violence, and how nonhuman entities and processes engage and enfold
human experiencing and witnessing. My pursuit of nonhuman witnessing
within this becoming-war takes place through attention to violent media-
tion as a transversal process that both occurs within and connects distinct
formations of martial violence, as well as the bodies, technologies, and situ-
ations that compose them. Attending to violent mediations as processes of
knowledge-making and communicating opens the terrain on which witnessing
can and must take place. As I theorize it here, nonhuman witnessing provides
a mode of inquiry into the tensions between actual and virtual in the flux of
becoming as it is interrupted, redirected, and mutated by martial violence.
Let us begin, then, with the violent mediations that animate the drone war
assemblage by attending first to the shift from optical to datalogical media-
tions. In their first operational incarnation in the skies above Kosovo in the
1990s and then Afghanistan after 9/11, Predator drones were primarily optical
technologies. With full motion video (fmv) and (usually) thermographic
sensors, these drones “produce a special kind of intimacy that consistently
privileges the view of the hunter-killer,” as Derek Gregory puts it in an early and
influential critique of drone violence.22 One operator describes the view from
above as “looking through a soda straw” that cuts context and complexity
and tends to lock focus on whatever stays within its narrow targeting frame.23
Limitations of bandwidth and multiple stages of encoding and decoding
meant that video imagery was often not received by operators at anywhere
close to the high definition in which it was recorded, while the atmospheric
location of the sensors meant that people were principally seen from directly
above or at a very acute angle, dehumanized pixels rather than recognizable
persons. This violent mediation cut, reduced, and blurred complexity in ways
that encouraged the infliction of force: rather than generating uncertainty that
might discourage lethal action, the mediation of events in the world through
the technical apparatus produced degraded information that was read as a
threat within the system. While the perceptual capacity of drone sensors
has advanced in the last decade, the underlying dynamics of using degraded
information to produce threats remains very much in place in contemporary
Reapers, Global Hawks, and similar lethal surveillance platforms.
To make sense of the drone as paradigmatic of a particular strand of contemporary
war, I want to tease out the relational processes that underpin
drone violence and in doing so shift the locus of inquiry from image and
representation to mediation. Drone vision is digital vision, enabled through
sensors that transform light into binary data rather than an analog imprint.
Such vision operates through change and transmission of code, mathematical
arrangements that can be rendered into pixels for display to human operators.
Drone vision is thus operative and actionable, rather than merely representational.24
That is, we can think of the drone assemblage as not only perceiving
but also producing slices of the world upon which operations can be per-
formed. Drones are automated media, oriented toward the future and governed
by a logic of preemption that seeks to define and control threat. “Pre-emption
operates in the register of the urgency of the imminent threat,” writes Mark
Andrejevic.25 Privileging visual representations risks instantiating problem-
atic imaginings of the temporal and spatial dynamics of drone warfare at
the expense of properly grasping its networked, mediated, processual, and
computational logics as a sociotechnical assemblage. Mediation is the per-
formative transformation of a perceptual encounter, one that occurs in time
and exceeds its content. It is a vital process, as well as a technical one: indeed,
its technicity is itself a form of life.
Drone mediations are enmeshed with terrestrial surfaces and substrates,
aerial atmospheres, built environments, multiple spectrums, and corporeal ac-
tivities. Parks calls this vertical mediation: “a process that far exceeds the screen
and involves the capacity to register the dynamism of occurrences within,
upon or in relation to myriad materials, objects, sites, surfaces or bodies on
earth.”26 As mediating technologies, “drones do not simply float above—they
rewrite and re-form life on earth in a most material way,” extending to “where
people move and how they communicate, which buildings stand and which
are destroyed, who shall live and who shall die."27 In the context of war, the
mediations of the drone apparatus are not solely vertical but also violent, and
that violence is bound up with verticality. In perceiving and capturing slices
of existence through its perceptual technics, the drone assemblage is at once
reductive and productive. Reductive, in that it frames and subordinates life
within the narrow aperture, angle, and classificatory mechanisms of milita-
rized knowing. Productive, in that it transforms that life into actionable data
crowded with virtual futures of persistent surveillance, active control, and
even arbitrary death.
Both the soda straw and bandwidth problems spurred technological de-
velopments that marked an important shift in the sensory apparatus of war
and an intensification of its violent and vertical mediations. To counter the
narrow field of view, darpa facilitated a series of wide area motion imagery
(wami) initiatives to equip drones with sensors capable of recording and
analyzing hundreds of city blocks within a single frame.28 In its early forms,
wami promised to capture everything, but in doing so produced an aston-
ishing amount of data. Automated image analysis tools sought to exploit the
totality of the feed, a feat that would require hundreds, if not thousands,
of human analysts working in real time. But bandwidth issues also meant
that wami was difficult to make operational via the ad-hoc satellite, optical
fiber, and wireless relays that compose military network infrastructures.
wami thus produced spatial and temporal expansions in potential capability
and in labor, network, and computational demands. Take the Autonomous
Real-Time Ground Ubiquitous Surveillance Imaging System, or argus-is,
which combined 368 overlapping high-definition sensors into the equivalent
of a 1.8-billion-pixel camera to provide a high-resolution, full-motion video
of up to ten square miles at a ground resolution of six inches per pixel from
an altitude of twenty thousand feet (figure 1.2). As it was hyped in the 2013
pbs documentary Rise of the Drones, analysts would be able to create video
windows, track vehicles, generate 3d models, and access location-specific
archives to compare prior activities and track environmental change.
The volume of data produced by the system was astonishing: up to one
billion gigabytes of data in twenty-four hours running at full capacity. Such
potential perception far outstripped human visual capacities, promising to
transform the world and its inhabitants into actionable data that can be called
up on demand and rolled back and forward through time. But that techno-
logical capacity was never realized in practice due to the massive bandwidth
and computational power required to make the system effective. For wami
to provide its promised ubiquitous surveillance, the problem of getting data
to humans in swiftly actionable form needed to be resolved. The obvious
answer was to reduce the reliance on humans: new systems are thus built
around on-board packages that automatically analyze sensor data for items
of interest and then push a selected subset of data through to human analysts
and operators. These edge computing systems, such as the Agile Condor
pod that I discuss later in this chapter, mark an intensified operative role for
computation, one in which autonomous software systems not only record
and analyze but also present data as actionable, where action can lead to kill-
ing. Mediation here takes on an overtly violent tendency, not simply through
what it excludes or removes but through the lives that it presents as (poten-
tially) requiring the application of lethal violence. As wami, edge computing,
machine vision, photogrammetry, and autonomous targeting and navigating
systems in general show, violent mediation is increasingly complex, distrib-
uted, and thick.29 The identification, selection, targeting, and execution of
people depend upon a growing number of systems and technics involving
increasingly interoperable components, while at the same time becoming
more opaque in their workings. Making remote and increasingly autonomous war
sensible—that is, making it graspable and addressable within the terrain of
politics rather than its irruption into martial conflict—requires finding ways
to witness the workings of these violent mediations. Yet the perceptual op-
erations of violent mediation can themselves produce witnessing: registering
and responding to violence, including their own.
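The logic of that shift, from streaming an unmanageable raw feed to pushing only machine-flagged detections through to human analysts, can be sketched in a few lines of Python. The frame rate, bit depth, and detection routine below are illustrative assumptions rather than ARGUS-IS or Agile Condor specifications; the point is only the order of magnitude of the bandwidth problem and the shape of the edge-computing response.

```python
# Back-of-envelope sketch of the wami bandwidth problem and the
# edge-computing response described above. All parameters are
# illustrative assumptions, not published system specifications.

PIXELS_PER_FRAME = 1.8e9      # roughly a 1.8-gigapixel composite frame
FRAMES_PER_SECOND = 10        # assumed frame rate
BYTES_PER_PIXEL = 1           # assumed 8-bit monochrome capture

raw_rate_gb_s = PIXELS_PER_FRAME * FRAMES_PER_SECOND * BYTES_PER_PIXEL / 1e9
raw_per_day_tb = raw_rate_gb_s * 86_400 / 1e3
print(f"raw sensor output: ~{raw_rate_gb_s:.0f} GB/s, ~{raw_per_day_tb:,.0f} TB/day")

def detect_items_of_interest(frame):
    """Stand-in for on-board automated analysis (hypothetical)."""
    return frame.get("detections", [])

def edge_filter(frames):
    """Forward only the subset of the feed flagged as actionable,
    rather than streaming the full sensor output to human analysts."""
    for frame in frames:
        hits = detect_items_of_interest(frame)
        if hits:
            yield {"frame_id": frame["id"], "detections": hits}

# Example: three frames, only one contains flagged objects.
feed = [
    {"id": 0, "detections": []},
    {"id": 1, "detections": [{"label": "vehicle", "score": 0.91}]},
    {"id": 2, "detections": []},
]
print(list(edge_filter(feed)))
```

Even under conservative assumptions, the raw feed runs to well over a petabyte a day, which is why the work of selection migrates on board the sensor platform itself.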
tenuous aftermaths
Drone warfare seems not to want to produce lasting aftermaths. Drone wars
persist, carried on through the open-ended generation of threat, the low
cost of involvement for aerial powers, and the ease with which they can be
returned to the air above places and populations. This distended temporality
is punctuated both by intense periods and sharp instances of violence and
textured by the ever-present potential of death from above. Wartime, writes
Beryl Pong, “constitutes its own violent, recalcitrant temporality.”30 Living
with drone war means living in enduring aftermaths, troughs of grief and
ruin that follow from drone strikes and shadow operations yet can never
mark an end to wartime. Drone war's aftermaths are rarely spectacular, trans-
lated into narrow idioms that commemorate and reinstantiate a lost, yet
mythical, past. Instead, the aftermaths of drone war are intimate, contested,
and unruly; etched in stones, buildings, gardens, and bodies; seared into the
fabric of communities and cultures.
The photojournalism of Noor Behram captures these entangled effects
of drone violence throughout Waziristan on the Pakistani border with
Afghanistan. Haunted faces of survivors, shattered bodies of victims, broken
homes, and fragments of Hellfire missiles—the people and objects docu-
mented since 2007 by Behram refuse to go unseen.31 Among the many ar-
resting images are those of survivors in the ruins of their homes, cracked
metal from the shaft of a Hellfire held in their hands like the weight of it might
break them all over again. Here is the materiality of remote war, stark matter
that belies claims of surgical precision even as, according to Thomas Stub-
blefield, “these photographs at the same time acknowledge a certain inad-
equacy of (human) narrative in this system of drone vision.” In one potent
image, children stare into the lens, pieces of rubble offered to the camera and
the remnants of buildings (a home, a school?) all around (figure 1.3). Mark
Dorrian argues that the belatedness of the photographs to the act of violence—bodies,
homes, and missiles already destroyed—signals the "violent
cancellation of the possibility of witnessing” in the face of remote war.32 But
I want instead to suggest that these images confront the limits of human wit-
nessing as the Hellfire fragments, ruined homes, and haunted survivors insist
on richly textured, intimate relations shattered by war.33 They both assert the
radical absence of the technical apparatus of the drone on the ground and
insist on that absence as a site of witnessing: its absence is itself a violent
mediation. Against the violent delimitations of the algorithmic systems and
militarized modes of analysis that dehumanize people into targets, homes
into safe houses, and social relations into signs of threat, the material and
affective relations that circulate within and leap from these photographs
manifest the more-than-human wounding and trauma that accompanies
“precision” warfare—and the inability of military infrastructures to reckon
with or even acknowledge its ongoing presence.
Aftermaths such as these almost never disturb Western culture or politics,
held at a distance by an apathy toward the unseen. Drone war persistently
happens over there, despite the ramifications of its racializing technopolitics
for publics at home.34 In her history of war’s aerial aftermaths, Caren Kaplan
calls for close attention to “unpredictable yet repetitive intensities of time
and space, disturbing the singular linear or bounded world that we take for
‘reality’ in Western culture.”35 Such “rogue intensities” are characteristic of
wartime, holding the potential to “disturb the everyday experiences of those
who might otherwise believe that they are unscathed or untouched folds
places and times onto each other while opening up possible affiliations and
historical accountability.”36 Careful attention to the ambivalence, contradiction,
figure 1.3. Photograph from Dande Darpa Khel, August 21, 2009,
by Noor Behram
resistance, and uncertainty that mark the visual history of the aerial view
is crucial. But this same care can be extended beyond the aerial to its ter-
restrial reverberations and, in particular, its material, cultural, and affective
registrations. Drone war’s tenuous aftermaths become more response-able
and address-able when their witnessing is not a human project alone, but
also heard in the discordant strains of nonhuman witnessing.
In operation, drones flicker on the edge of perception. For people living
under drones, encountering them within the visual field is not uncommon, but
neither can sight be relied upon to warn of an operation in progress.37 On
American missions, militarized drones usually fly high enough not to be seen
at all, or to be caught only in the glint of sunlight in de-icing fluid as it slides
across the wings and fuselage of the vehicle. Rain can keep them grounded,
while cloudy weather sometimes means lower flights and greater visibility,
which commanders tend to avoid, keen not to alert the surveilled to
their presence. But the aural presence of drones is far more constant: a whirr
that cuts through the hum of daily life and grinds against the mind. One man
describes the sound of the drones as “a wave of terror” that sweeps through the
community. Another links their buzz both to the permanent affective state of
fear and to the strain placed on communal gatherings. "When we're sitting
together to have a meeting, we’re scared there might be a strike,” he tells the
researchers. “When you can hear the drone circling in the sky, you think it
might strike you. We're always scared. We always have this fear in our head."38
Alex Edney-Browne notes that one Afghan slang term for drones is bnngina, after
the bnng noise that the drones make.39 That buzz works its way into bod-
ies. As Mohammad Kausar, father of three, says, "Drones are always on my
mind. It makes it difficult to sleep. They are like a mosquito. Even when you
don’t see them, you can hear them, you know they are t here.” If the everyday
disruptions and anxieties of life u nder drones is most present in their aural
intrusion, then might witnessing not also take place at this level of ears,
sound, and material vibration?
For Schuppli, this earwitnessing strains the limits of what can count as the
material witness of conflict because it leaves no trace, even if, when "these low-
frequency emissions combine with physical matter, they vibrate the tympanic
membrane of the ear, so that hearing becomes a kind of barometer for read-
ing the atmospheric pressure of drone surveillance on the body public.”40 Yet
while the lack of trace limits the potential of this aural witnessing to enter the
legal domain, we nevertheless need to reckon with its registering in the body
as a critical point of contact in witnessing relations. Aural witnessing entails
bodily mediation in the now, yet what it mediates is the virtuality of future
violence: not simply a warning of potential drone strikes, but an impingement of
the future on the sensorium in the present. Bnngina is the crowding presence
of the aftermath to come, the violent mediation of a possible future.
Witnessing drone warfare from below is as much about making sensible the
enduring, gradual, and uneven violence done to the fabric of life as it is about
registering the spectacular, kinetic violence of the lethal strike. Surviving entails
reworking relations of community and the movements of daily life in counter-
rhythm to the algorithmic operations of intelligence gathering and analysis.
Disruptions to daily life and its communal governance are matters of space
and movement, as well as custom, ritual, and routine. No longer socializing
after dark, no longer holding community gatherings, no longer undertaking
funeral rites: these are restrictions on mobility dictated by the uncertainty of
violence from the air.41 They also reflect intensive, shared learning in response
to drone violence, a communal pedagogy of atmospheric war. That pedagogy
not only entails reorienting daily life away from those activities that the drone
apparatus might mark as threatening—it also involves integrating responsive-
ness to intelligence gathering into daily life such that the potential presence of
the drone reweaves the cultural fabric. This reweaving becomes quite literal in
the incorporation of drone iconography into traditional Afghan war rugs, with
silhouettes of Predators and Reapers replacing the Soviet tanks and Stinger
missiles that found their way into these woven images in the 1980s.
In works by Pakistani American artist Mahwish Chishty, this cultural
imbrication of drone violence takes on a more direct critical dimension.
Trained in miniature painting at the National College of Arts in Lahore,
Chishty turned her attention to drone violence following a visit home in
2011. Combining her training in painting with the ornate folk traditions of
Pakistani truck art, Chishty’s Drone Art Paintings (2011–16) and accompany-
ing installations and video works refigure drone technologies as splendidly
visible, captured in the vibrant color and gold leaf of finely wrought bricolage
against tea-stained backgrounds. Painted in opaque gouache, the works in-
sist, as Ronak Kapadia points out, on the permanent visibility of the drone:
materialized not as technoscientific monstrosity but as contained and owned
figure 1.4. Reaper, Mahwish Chishty, gouache and gold flakes on paper, 2015.
Courtesy of the artist.
figure 1.5. Image from Drone Shadows installation, Mahwish Chishty, 2015.
Courtesy of the artist.
by the body of the artist, the thick pigment of the paint, and the textured
surface of the paper, exemplified by the painting Reaper (figure 1.4).42 These
works are emblematic of what Kapadia calls an “insurgent aesthetics” that
seeks to unsettle the racialized, gendered, and colonial dynamics of empire.
Against the smooth, blank dimensionality of the militarized drone, Ch-
ishty’s paintings segment surfaces into blocks of color, flowers, flag motifs, and
eyes and mouths. For her Drone Shadows (2015) installation, Chishty painted
plastic model kits of Reaper and Predator drones in the bright reds, greens,
and yellows of truck art (figure 1.5). Suspending them in Perspex containers
and using gallery lights to cast shadows, Chishty puts the (in)visibility of drone
warfare in tension with the hypervisibility of the miniatures. In Chishty’s work,
there is an insistence on returning the nonhuman technics of drone warfare
to the embodied scale of craft and paint. In wrestling with how to figure such
nonhuman violence, Chishty undertakes a kind of nonhuman witnessing
in reverse: testifying aesthetically to the possibility and necessity of making
the seemingly invisible technoscientific mechanisms of violence the briefly
tamed object of art.43 This making visible and identifiable is, of course, al-
ways only propositional: an address to imagine otherwise, and to resist the
sanitizing discourse that surrounds and obscures drone violence in practice.
binding existence more tightly to war. People on the ground speculated to
the Stanford and nyu researchers about paid informants, worried about
"chips" and "sims" placed in cars and houses, and complained of eroding
community trust and an atmosphere of paranoia.47 The learned protective
practices of people under the martial gaze testify to an intensive relation to
the potential for death that manifests in the drone apparatus. Movements,
changing cultural patterns and practices, a folding into life of the vagaries
of the algorithm—these are a kind of collective witnessing to the nonhuman
assemblages of signature strikes and their algorithmic architectures of intel-
ligence gathering and objectification.
Surviving drone warfare is, however, as much a matter of chance as anything
else.48 How surveillance analysts and signals intelligence processes capture and
classify bodies, movements, and social relations is deeply contingent. Death,
too, entails the randomized destruction of living bodies into ruined flesh. Algo-
rithmic killing, or “death by metadata” in Pugliese’s formulation, is far from the
technocratic ideal. While the language of surgical strikes and precision warfare
suggests some sanitized form of violence, the reality on the ground is very
different. Lethal strike survivor Idris Farid describes "pieces—body pieces—lying
around” and the effort to “identify the pieces and the body parts” to determine
“the right parts of the body and the right person.”49 Delving into the horrific
violence of an attack on a village in Yemen, Pugliese writes that distinguishing
between animal, child, and adult was often impossible, bodies fused into a
“composite residue of inextricable flesh. The one melts into the other. The one
is buried with the other.”50 While the targeting systems and discursive logic of
drone warfare dehumanize through techniques of gendering and racializing,
its violence strips its victims of any corporeal distinction from other animals.
Reducing the living to “scattered fragments of undifferentiated flesh,” animal
and human bodies become what da Silva calls “no-bodies” and Pugliese labels
“nothing less than generic, anomic, and wholly killable flesh.”51 Even the land
is scarred. As one survivor put it, “The entire place looked as if it was burned
completely,” so much so that “all the stones in the vicinity had become black.”52
This ruination of human, animal, plant, and inanimate entities signals the
limits of a witnessing that centers the human: How can a narrow humanism
account for violence that strikes at the very vitality of more-than-human
ecologies? This enfolding of more-than-human environments with human
flesh demands what Pugliese calls forensic ecology. His vision of a radical fo-
rensics sees testimony as “a relational assemblage of heterogeneous materials
that, collectively, is mobilized to speak an evidentiary truth.”53 While mobili-
zation within a framework of laws typically depends upon a speaking subject,
the registration of violence enacted on the sites of drone strikes constitutes a
form of witnessing that both precedes and exceeds the human. It precedes
the human because the air's mediation of light in the collection of sensing
data and of force in the on-rush of Hellfire missiles is already witnessing ru-
ined flesh, scarred rock, and shattered plant life in the instant of explosion.
It exceeds the human because this witnessing occurs below the threshold of
detectability—in the faint striations of dirt subject to passing shrapnel, in
the misting of viscera, in the ephemerality of heat—and far outside it, too,
in the elusive scale of the drone apparatus itself. Translating such witness-
ing into frameworks of individualized responsibility is impossible, not least
when the drone apparatus itself is so dispersed as to make no one singularly
responsible for any given strike.54 Yet while this combination of fused flesh
and machinic occlusion of responsibility certainly signals the limitations
of rights-based frameworks for dealing with the violence of increasingly
autonomous warfare, it also suggests the necessity of a conceptual means of
dwelling with the thick and messy confluence of forces that produces these
horrors. Such a dwelling-with can only be processual and can only reckon
with the violence of drone war as process. As a process of registration—which
is to say, of the violent event mediated into the more-than-human flesh of the
world—nonhuman witnessing offers critical purchase, insisting on attending
to both the thick knots that bind violence and the tenuous strands of re-
lation that shimmer out of reach within ecologies and technical systems alike.
On the other side of the drone sensor array, aftermaths of violence are me-
diated very differently. In the form that has dominated the last twenty years
of remote warfare, drone sensors display sensing data in visual images on the
screens of operators located in ground control stations far from the battle-
field. Replicated on the terminals of lawyers, commanders, image analysts
and, in certain situations, officers commanding troops on the ground, the
principal lens for drone operations is either optical or thermal full-motion
video overlaid with gis, timestamp, targeting, and other key information.
Parks shows how, with the arrival of the war on terror, media coverage "made
vertical space intelligible to global publics in new ways and powerfully re-
vealed what is at stake in being able to control the vertical field.”55 Media
coverage of the invasions of Afghanistan and then Iraq rendered the aerial
view familiar, training publics to recognize and decode new ways of seeing.56
According to Roger Stahl, drone vision “invited publics to see the drone war
through the very apparatus that prosecuted it,” and in doing so “framed out
those populations who must live and die under this new regime of aerial
occupation,” rendering them vulnerable, invisible, and ungrievable.57 When
drone war does intrude on the mediascape of the United States, the United
Kingdom, Australia, France, Denmark, or elsewhere in the West, it does so
through the existing profusion of screens, stories, images, and mediated
encounters. Drone warfare presents distinct challenges for witnessing that
take place in, by, and through journalistic media.
Visible in YouTube videos of Reaper strikes and part of the visual rhetoric
of films such as Eye in the Sky (2015), the event of a missile strike over-
comes the sensory capacity of the drone: a burst of white, intensities of light
that overwhelm the optical camera and of heat that undo the thermo-
graphic sensor.58 Focalizing infrared radiation through the lens and onto
the microbolometers assembled one-per-pixel into the sensor itself, ther-
mographic cameras have to manage wider wavelengths than their opti-
cal counterparts. For Nicole Starosielski, “the infrared camera is not just
another thermal medium alongside thermostats, sweatboxes, and heat
ray guns: it is a technology whose sensing capacities work to transform
all matter, whether bodies or buildings, into thermal media itself." The im-
ages it produces depend upon the recasting of “the world as a landscape of
infrared reflectors and infrared emitters—as a field of thermal communica-
tion.”59 Sometimes, that field overwhelms the camera’s thermoceptive capac-
ity. When a missile strikes, the combination of limited resolution and intense
heat prevents infrared sensors from doing anything but assigning maximal
intensities—computer vision cannot resolve what it cannot sense. Whether
in optical or infrared, this incapacity to capture the event of the strike means
that drone sensors necessarily repeat the erasure of life at the level of sensor
process. From within the drone apparatus, the aftermath is always obscured
by the destruction itself, the wreckage of buildings and bodies, thick smoke,
and the heat of melted matter. Inhuman vision reveals its inhuman sensoria,
yet what human sensorium would not be shocked and undone by witnessing
such a thing? In the aftermath, sensor operators typically shift to infrared
to identify the movements of bodies and the still-warm flesh of the dead.
Prescribed by the requirement to count all dead as military-aged males, as
threats until proven otherwise, military personnel decipher the aftermath
according to a rubric designed to repeat visceral, material violence in infor-
mational form. This reading of the scene—a kind of brute forensics—is often
yoked to the question of additional strikes. These so-called double taps are
often conducted at a delay intended to flush out further threats, but are far
more likely to kill or wound anyone who rushes to assist at the scene, a fact
that means bystanders often choose to listen to their neighbors die rather
than risk being killed themselves.
Not only are these sensors overwhelmed, but network latency also means
that the drone apparatus can only ever witness on a two- to six-second delay.
Whatever appears on screen does so with the event already in the past, not
quite real-time but still live in the sense that the drone system always experi-
ences liveness on delay. Distance vanishes, but time dilates. Drone systems
intensify this tension between occurrence and technical mediation: an elastic
temporality brimming with violence. Yet this latency also contains within it a
certain necessary trauma, a deferral of the traumatic event into the durational
virtuality of an arrived and arriving future. Produced by the combination of
distance and transcoding between components of the network, this latency is
one temporality of violent mediation, a time in which nonhuman witnessing
takes place in the ambivalent space of the drone apparatus itself. This mode
of nonhuman witnessing has little corporeal immediacy or political valence,
but it is witnessing that registers violence distributed in both time and space.
Seen in this way, the violent mediations of the drone apparatus remind us
that nonhuman witnessing carries no inherent ethics, no necessary tendency
toward justice, only an insistence on the complexity of registering an event
as knowable. For ethics, morality, or justice to enter the frame, the question
has to become one of testimony—of the bearing of witness after the event
of witnessing itself. If the drone apparatus is, in its own ambivalent way, a
witnessing machine, albeit a hostile one, then it is one that must in turn be wit-
nessed. That challenge is amplified by new technologies that augment the
sensory capacity of the drone through on-board advanced computing. But
before turning to one such technology, Agile Condor, I want to first consider
nonhuman witnessing in the aftermath of war in Aleppo, Syria.
witnessing aleppo
While the aerial view of war is rightly associated with surveillance, control,
and violence, remote sensing systems and civic drones can also be harnessed
as witnessing apparatuses for publics and researchers.60 Such uses of sensing
technologies reveal their partial, contested, and contingent nature, as well as
the fraught politics of control that suffuse both atmospheric sensing and digi-
tal infrastructures.61 Aleppo, in Syria, is a case in point. In March 2011,
amid the Arab Spring, prodemocracy protests in Daraa against the regime
of Bashar al-Assad were brutally suppressed. When anti-Assad forces
rebelled across the country, Syria swiftly fell into civil war, which in turn
produced power vacuums in various regions and enabled the Islamic State in
Iraq and Syria (isis) to take root. Fought across four years from 2012 to 2016,
the Battle of Aleppo saw what the United Nations called “crimes of historic
proportions” committed by Syrian, rebel, and international forces, including
via Russian, American, and Turkish air strikes from crewed and uncrewed
aircraft.62 By the time the city was retaken by the Assad regime, some 31,273
civilians were reported dead and numerous culturally significant sites were
destroyed or damaged according to a unesco conservation report, including
the destruction of the Great Mosque and the eleventh-century minaret of the
Umayyad Mosque. Aerial and artillery bombardment ruined roads, homes,
schools, hospitals, and entire neighborhoods, reshaping the city in fundamental
ways and transforming life for its human and nonhuman inhabitants.
Rather than containing the violence, the application of “precision” weap-
ons such as drones and guided missiles seemed only to intensify the destruc-
tion: imagery of Aleppo in 2016 bears a remarkable similarity to that of Berlin
in 1945. Whether a missile was launched from a drone or manned helicopter
is in some ways immaterial to the destruction it causes on the ground: the
dead remain dead, homes remain ruined. But in Aleppo the view from above
has afforded a more ambivalent relation to aerial aftermaths than is usually
the case, a phenomenon revealed in different ways by the Conflict Urbanism:
Aleppo project from the Center for Spatial Research at Columbia University
and drone video by Aleppo Media Center, an antigovernment activist group
responsible for widely shared and republished footage.
Conflict Urbanism uses remote sensing imagery, geolocation data, and
open-source software tools to create an accessible digital platform for track-
ing the city’s wartime aftermaths. As artist, academic, and project lead Laura
Kurgan points out, “while war demolishes, it also reshapes a city, and, how-
ever difficult it is to imagine rebuilding in the midst of a war, Aleppo is being
restructured and will be rebuilt."63 The core of the project is an interactive
map that reveals damage to the city’s urban fabric by layering high-resolution
satellite images with data from unitar’s unosat (the United Nations Satel-
lite Center, run by the United Nations Institute for Training and Research). In
its remediation of satellite imagery into an activist-aesthetic context, Con-
flict Urbanism: Aleppo continues Kurgan's long-standing research-practice
engagement with the politics of remote sensing imagery.64
From the main site hosted by the Center for Spatial Research, users are
able to engage with the city at the neighborhood scale, moving through
time and at different resolutions to track the damage to the city (figure 1.6).
This use of technics to make visible otherwise obscured transformations
to the more-than-human environment of the city succinctly encapsulates
figure 1.6. Image showing areas of intense damage, Conflict Urbanism: Aleppo
capture of multispectral data of the earth, private satellite infrastructures
only take images they are tasked to collect. Users need to purchase satellite
time and specify locations. While the images produced can then be pur-
chased by others, the costs of tasking and purchasing can be prohibitive
for noncommercial or nonstate actors such as human rights organizations.
Depending on the satellite, resolutions down to around 0.25m are available
for public purchase, but for decades the US government limited commercial
resolutions to 0.5m to keep human bodies illegible.65 This can make the work of
conflict monitoring more difficult, obscuring the movement of people but also
the damage to buildings from non-incendiary missiles launched by drones.
Through an experimental approach, the project produced an algorithmic
dataset using open-access satellite images to measure brightness in pixels
between successive images. This stitching together of spatial images across
temporalities allows the tracking of damage done to the city.66 Ground truth
for the project imagery was established via high-resolution satellite imagery,
as well as through the calibration and geographical location necessary to
the operation of remote sensing satellites. But the project also produces
a relational ground truth as images are compared, synthesized, and syn-
chronized.67 By foregrounding how this method is “messy and riddled with
ambiguity,” the project exposes the constructed and frictional nature of such
relational ground truthing. It reveals material, nonhuman traces of the wit-
nessing apparatus itself, a violent mediation within the witnessing of the city’s
destruction, in which low resolution obscures texture and specificity.
Alongside its tracking of human activity, such as the displacement of
people from ruined sections of the city to settlements on its outskirts, the
project also witnesses the complex interplay between urban environment,
violence, media, mobility, and renewal. Rather than focusing tightly on spe-
cific sites of airstrikes, Conflict Urbanism attends to “what surrounds the
circles—the areas contiguous to the damaged sites—in order to ask ques-
tions on an urban scale.”68 Such an approach enables a witnessing of violence
that centers the intentional and incidental destruction of cultural memory,
urban history, and community ecologies. This witnessing exceeds the human
but does not abandon it. By foregrounding the limitations of the platform,
keeping it open to collaboration and development, and directly addressing
issues of data neutrality, the project exemplifies the necessary contingency of
nonhuman witnessing. In Aleppo, urban violence registers its traces in Sch-
uppli’s material witnesses: wood, concrete, steel, glass, and asphalt as much
as in remote sensing systems, or indeed in the testimony of those displaced
residents of the city. In an environment in which people have been driven
from their homes, those nonhuman material witnesses capture something
that the displaced have left behind: the material and affective traces of de-
struction, loss, and absence of life.
Integrated into the online platform are YouTube videos captured on the
ground, what Lilie Chouliaraki and Omar Al-Ghazzi call the “flesh witness-
ing” of digital materials recorded and shared by people in conflict zones.69
These videos capture the angles, color, texture, and immediacy lacking in
the layered sensor data. Among them are drone videos produced by activists
from the Aleppo Media Center. Shot at the now-familiar but still uncanny
vantage of the drone—hovering above or just below rooftop level, moving with in-
human smoothness, footage rendered with an almost too-sharp definition—
this footage mediates the violence of the aftermath. While mainstream media
coverage of Aleppo’s destruction featured drone footage from a range of
sources, including the Russian military, the video shot by the Aleppo Media
Center insists on capturing ruined streets, homes, shops, and squares, and
in doing so both reveals and obscures the violence (figure 1.8). While drone
footage is always imbricated in the militarism of the aerial view, it can none-
theless be deeply affecting. As Kaplan writes: “We absorb these views to
such a degree that they seem to become a part of our bodies, to constitute
a natural way of seeing.”70 This capacity to enfold nonhuman vantages into
the human sensorium speaks to the malleability of our perception, but also
to our cyborg existence, to the always more-than-human nature of human
sensoria and knowledge-making.71
figure 1.8. Still from drone footage, Aleppo Media Center
violent mediation that makes it possible: attending to what is present in such
photographs as participating in an affective production of that which is not.
While a certain material intimacy exists between the film negative photographs
of postwar Berlin and the city and its violence, this task of witness-
ing the absence of violence is complicated by the machinic vision of drone
video in Aleppo. For Azoulay, photographs of spaces in which widespread
and systemic violence took place but is not shown present an injunction to
witness the photographs through the historical knowledge of an absence of
visual evidence. She thus reads "these perforated houses, heaps of torn walls,
empty frames, uprooted doors, piles of rubble—all those elements that used
to be pieces of homes—as the necessary spatial conditions under which a
huge number of women could be transformed into an unprotected popula-
tion prone to violation.”73 Drone imagery from Aleppo shares much with the
photographs analyzed by Azoulay: perforated walls, piles of rubble, blasted
windows, shattered sidewalks, distended roadways. It obscures the 31,273
civilians dead, the many more displaced, and the rape, theft, wounding, and
loss that accompanies such undoing of a city. Unlike the analog photography
of postwar Berlin, machinic vision does not imprint the light of the world
in the gelatin material of the film negative, but rather translates the fleeting
response of the optical sensor directly into pixels, stored as code and only
rendered in visual form for the benefit of the pilot and, later, the audience of
any distributed recordings.
Drone footage of wartime's aftermaths in Aleppo mirrors processually
the violence of aerial war, with its digital targeting systems, guided muni-
tions, and sensor capture of the environment. But it reveals little of those
workings: drone footage of Aleppo is what remains within the machinic
frame but hidden both by the depopulated city and the technics of the sensor
itself. Integrated into the Conflict Urbanism mapping apparatus, this foot-
age both grounds and is grounded by multispectral satellite images. Drone
footage introduces a more-than-human visuality that is nonetheless tied to
line-of-sight operation and the practical constraints of battery life and signal
strength: it returns the aerial view almost to the body and yet also retains a
nonhuman detachment that heightens the witnessing of war’s aftermath.
Within the aftermaths of contemporary war’s violent mediations, witnessing
must pursue the tactile and affective, but also the machinic, technical and
networked architectures of seriality and sensing. Yet the nonhuman percep-
tion of drones and remote sensors is increasingly not only an extension of
human sense-making, but also an augmentation at the level of identification
and decision.
augmenting the drone apparatus
figure 1.9. Agile Condor Operations Concept, Air Force Research Lab
strike, it sets the background conditions for what might be worthy of closer
attention and potential lethal action. It thus exemplifies both the violent
mediation of the drone apparatus and its liminal status between human
operation and lethal autonomy.
More autonomous data processing at the point of perception marks a
qualitative shift in the agential composition of warfare. Autonomous military
systems are not in themselves new—loitering munitions have been used by
Israel since the 1970s; the sage system designed to monitor Soviet nuclear
launches was built in the 1950s—but Agile Condor integrates autonomous
perception into an already complex kill chain, inserting a machinic intel-
ligence that preemptively shapes the fields of possibility for human analysts
and operators. Agile Condor thus constitutes a kind of liminal, nonhuman
witness: it (pre)determines the meaning and significance of objects and
events, presenting them as open to address by the remote warfare system.
Through the operative role of its on-board high-performance computer, the
ai pod siphons off human agency in the name of efficiency. No longer will
human analysts be concerned with discerning the figure of threat against
the ground of life, but only with the array of figures presented as action-
able. In the transcript of the drone strike that opened this book, it becomes
clear that almost two dozen people were killed in no small part because the
figures in view obtained an affective potency divorced from the milieu in
which the convoy moved. That is, mission atmosphere oriented the operators
and everyone else involved toward violence. Agile Condor entrenches this
orientation toward identifying foes and not friends into the milieu itself: a
machinic perpetrator, its witnessing tends toward violence.
In an oft-cited passage of War and Cinema, Paul Virilio writes that “along-
side the ‘war machine,’ there has always existed an ocular (and later optical
and electrooptical) ‘watching machine’ capable of providing soldiers, and
particularly commanders, with a visual perspective on the military action
underway. From the original watchtower through the anchored balloon to
the reconnaissance aircraft and remote-sensing satellites, one and the same
function has been indefinitely repeated, the eye’s function being the function
of a weapon.”75 This mechanization of perception involves “the splitting of
viewpoint, the sharing of perception of the environment between the animate
(the living subject) and the inanimate (the object, the sensing machine).”76
This splitting of perception entails not only the human and lens, but also an
entire technical apparatus that is motorized, electrical, computational, and
increasingly autonomous: what Virilio calls the “logistics of perception.”77
While not on the same order of magnitude as the arrival of networked warfare
itself, edge computing is an important intervention in these logistics because
it yokes the ontopower of perception to the necropolitical capacity to make
die. Unlike the simple reactive relation between sensing and killing found
in an improvised explosive device (ied) or land mine, the ai intermediary
enabled by high-performance edge computing means that deterministic opera-
tions happen at a spatial and temporal remove from human agents.
But while General Atomics president David R. Alexander claims that
Agile Condor’s “ability to autonomously fuse and interpret sensor data to de-
termine targets of interest is at the forefront of unmanned systems technology,”
edge computing is not confined to military applications or even to drones.
In fact, it originates in commercial problems of bandwidth and latency pro-
duced by the move toward cloud computing architectures. Early edge com-
puting can be found in “cloudlets” such as content delivery networks that
cache web data closer to users so that, for example, ads can be served faster
and more responsively, preventing delays in page loading caused by the need
to pull data from distant, centralized data centers. Edge computation now ex-
ists in everything from networked security cameras to automated agricultural
systems, reducing the flow of data to central control points. Against the push
to centralize control via the capacity of networks to distribute information,
edge computing offers the potential to decentralize control while retaining
centralized authority. Such a tendency can only produce ever more radical ab-
sence, as experiences of the world are distributed, remediated, and rendered
computational even as they become operative and immediate. In war, this
fusion of sensing, classifying, and selecting within black-boxed technologies
signals an increasing acceptance of computational agencies on and above the
battlefield, a machinic corollary to the shift of the US military to special forces
operations. Where the military media technologies of the twentieth century
shaped and were shaped by mass, those of the twenty-first are devolved, di-
vidual, and distributed. Like power itself, military media have pushed more
and more computation to the edge of the logistics of perception.78
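A toy version of that commercial pattern makes the point concrete: a hypothetical edge node answers requests from its local cache and reaches back to the distant origin only on a miss. The class, keys, and latency figures below are invented for illustration.

```python
# Toy illustration of the "cloudlet" / content-delivery pattern described
# above: an edge node serves cached responses locally and only reaches
# back to the centralized origin on a cache miss. Latencies are invented.
ORIGIN_LATENCY_MS = 120   # assumed round trip to a distant data center
EDGE_LATENCY_MS = 5       # assumed round trip to a nearby edge node

class EdgeNode:
    def __init__(self, origin):
        self.origin = origin          # centralized store of record
        self.cache = {}               # data held close to users

    def get(self, key):
        if key in self.cache:
            return self.cache[key], EDGE_LATENCY_MS
        value = self.origin[key]      # slow path: fetch from the center
        self.cache[key] = value       # keep a copy at the edge
        return value, EDGE_LATENCY_MS + ORIGIN_LATENCY_MS

origin = {"ad-banner-17": b"...creative bytes..."}   # hypothetical content
edge = EdgeNode(origin)
print(edge.get("ad-banner-17"))   # first request pays the origin round trip
print(edge.get("ad-banner-17"))   # later requests are served at the edge
```

The origin remains the store of record, so authority stays centralized, while the work of serving is pushed outward to wherever users, or sensors, happen to be.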
Artificial intelligence is particularly appealing for dealing with sensor data
because the first action required is to sift for items of interest, something that
machine learning is—in theory, at least—particularly well situated to do. But
standard methods of machine learning analysis require powerful graphics
processing units (gpus), particularly if the system will also learn on the fly.
That means significant power loads and accompanying heat. Consequently,
huge dividends can be achieved through computational techniques—both in
terms of hardware and software—that reduce the need for power, via both
more efficient circuit design and learning systems that only fire when needed.
According to both its marketing material and various technical papers pub-
lished by the development team from src and afrl, Agile Condor uses a
neuromorphic architecture modeled on human neural systems.79 In other
words, its capacity for discriminating perception is intended to mimic neu-
robiology in contrast to typical parallel processing architecture. Both the ibm
TrueNorth and Intel Loihi experimental processors used by the Agile Condor
can be traced to a darpa project called Systems of Neuromorphic Adaptive
Plastic Scalable Electronics (SyNAPSE), launched in 2008 to develop revolu-
tionary new neuromorphic processors and design tools. In contrast to typical
machine learning image analysis that addresses entire images, neuromorphic
systems such as Spiking Neural Network architectures are designed so that
individual “neurons” within the system can fire independently and directly
change the states of other neurons. Because information can be encoded di-
rectly into the signals themselves, spiking networks are not limited to binary
states and can thus produce something closer to the analogue workings of the
brain, more proximate to the early cybernetic dream before it veered toward
an altogether different computational rationality.80 Because these neurons
only work when “spiked,” the network consumes significantly less power and
can autonomously gear up to higher capacity as needed. Neuromorphic sys-
tems such as Agile Condor are prime examples of what Andrejevic calls “au-
tomated media,” or “communication and information technologies that rely
on computerized processes governed by digital code to shape the produc-
tion, distribution, and use of information.”81 Harnessed to the martial gaze,
automated media reveal how, as Bousquet puts it, “the human sensorium has
been slowly and surely directed, mediated, and supplanted in service to the
ultimate imperative of targeting.”82
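Because the argument turns on the difference between frame-based processing and event-driven spiking, a minimal leaky integrate-and-fire neuron may help fix the idea. This is the generic textbook model, not the TrueNorth or Loihi implementation, and its parameters are illustrative: the unit accumulates input, leaks charge over time, and does work only when a threshold is crossed.

```python
# Minimal leaky integrate-and-fire neuron, the generic model behind
# spiking architectures. It emits a spike (does work) only when its
# membrane potential crosses a threshold, the property that lets
# neuromorphic hardware idle at low power. Parameters are illustrative,
# not TrueNorth or Loihi values.

def lif_neuron(inputs, threshold=1.0, leak=0.9):
    """Return a list of 0/1 spikes for a sequence of input currents."""
    potential = 0.0
    spikes = []
    for current in inputs:
        potential = potential * leak + current   # integrate with leak
        if potential >= threshold:
            spikes.append(1)                     # spike: signal downstream units
            potential = 0.0                      # reset after firing
        else:
            spikes.append(0)                     # stay silent, consume little
    return spikes

# A burst of input produces a spike; quiet stretches produce nothing.
print(lif_neuron([0.2, 0.0, 0.9, 0.3, 0.0, 0.0, 1.2]))
```

Because most units stay silent most of the time, power scales with activity rather than with frame rate, which is the efficiency claim reported above.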
As with so much emergent military technology, exactly how Agile Condor
might function in a battlefield context is impossible to ascertain. In a series of
articles published in various ieee forums between 2015 and 2020, the research
team from afrl and src Inc. reveal snippets of insight about the compu-
tational architecture and machine learning techniques used in the system.83
Using a mix of machine learning model types, including spiking neural net-
works and the MobileNet architecture, the researchers demonstrate a bal-
ance between accuracy and efficiency across a series of prototypes built on
ibm and Intel processors. Working with a range of test datasets that include
optical satellite imagery from the United States Geological Survey, various
experiments achieve object recognition accuracy of more than 90 percent,
depending on the specific technical arrangement. A similar accuracy was
maintained using imagery from the Moving and Stationary Target Acquisition
and Recognition (mstar), a joint darpa and afrl program that collected
and processed sar imagery of various military targets. But that dataset, while
public, was produced in 1995 and its resolution has been far exceeded by
contemporary satellite imagery. As such, even though the technical informa-
tion about various chip, processor, and model configurations is interesting,
these publications give no indication of how the Agile Condor targeting
system would work in practice. What data will it be trained on? How will
it be verified and ground-truthed? How are its determinations presented to
operators and analysts? Does it have its own interface or is it integrated into
existing ground control station control systems? What information is made
available back through the system about modeling, probability, and so on?
How much on-the-fly learning is the system capable of executing, and what
quality control mechanisms are in place to verify accuracy or intervene in
the learning process?
With the in-practice workings of the apparatus itself largely foreclosed,
we can turn instead to the promotional materials for an articulation of the
military imaginary that animates Agile Condor. In a two-minute video pro-
duced by src Inc., Agile Condor is presented as a powerful tool for saving
lives and preventing violence.84 Rendered in computer graphics that share
the gritty, lens-flare aesthetic of popular video games such as the Call of Duty
series, a General Atomics Reaper drone takes off from a mountainous air
force base to a dark techno soundtrack. Cruising at night above a dense urban
environment, its sensor system identifies various objects, marking them with
glowing green squares. Then Agile Condor kicks in, automatically analyzing
incoming imagery (figure 1.10). Dramatized as a clichéd array of image feeds
entering the hardened box of the computer itself and headlined in multiple
places with the term “neuromorphic computing,” the Agile Condor swiftly
does its magic and an alert flashes up: threat detected. Cut to a swarthy
figure with an rpg on his shoulder, then a convoy of vehicles, and back to
the aerial view. Now, the convoy vehicles are marked in blue and the threat
in red. The sensor pulls focus onto the threat and zooms in tight, resolving a
high-resolution image that it then runs through a facial recognition system
to obtain a 98 percent match (figure 1.11). Signal streams back to command,
where “Agile Condor with neuromorphics enabled has detected an imminent
threat.” The convoy can now be diverted and helicopters sent to arrest the
would-be assailant.
Hyping the efficacy of the system in producing a swift, bloodless reso-
lution is not unusual for this genre of military technology videos, but the
presentation of its technics is revealing of the imaginaries that animate
figure 1.10. Agile Condor with "neuromorphics enabled," still from "Agile Condor™ High-Performance Embedded Computing Architecture," YouTube video, October 15, 2016
military desires for ai systems such as Agile Condor. Sensor capture, image
analysis, threat determination, geolocation, signals transfer, and operational
actualization are all presented as seamless, frictionless processes. Wide area
surveillance captures data at scale, which is then immediately transduced
into the Agile Condor analytic engine to identify and locate an imminent and
incontrovertible threat. How those analytics take place is obscured: Does the
system first identify a static figure in the dark? Does it map onto the con-
voy? What are the relations between those things? Is it correlating between
different sensor feeds in real time? Once the threat is detected, the capacity
to recognize a face—something not mentioned in the technical papers pub-
lished to date by the Agile Condor team—provides a granular, individualized
level of analysis. This slippage in scale—from the unknown, impenetrable
urban environment to the named identity of an individual—exemplifies
the god trick that animates both the militarized view from above and the
artificial intelligence system. Agile Condor figures as a watchful guardian,
capable of oscillating between scales and presenting immediately actionable
information to a hyperresponsive command center. Despite the immediate
threat of violence, the response is measured and clinical. Precision warfare
performed through automated media promises to facilitate bloodless control.
As is often the case in military promotional materials, the use cases pre-
sented for public consumption veer closer to policing than mass or even
“precision” violence. Nevertheless, we can observe what Andrejevic calls the
“cascading logic of automation” in which “automated data collection leads to
automated data processing, which, in turn, leads to automated response.”85
This cascading logic has an inherent connection to the death drive, exempli-
fied by the development of Lethal Autonomous Weapons, but evident in
technologies such as Agile Condor, which are not only designed to facilitate
the application of lethal force but also to be part of the process of tipping over
the threshold into ever more complete autonomy. More specifically, Agile
Condor can be understood as operating in the mode of preemption, which
“dispenses with the question of causality: it takes as given the events it targets,
relying on comprehensive monitoring and predictive analytics to stop them
in their tracks.”86 Neural network analysis of sensor data is preemptive in
this way, filtering through data streams for sets of image characteristics that
map to particular models. Presenting the correlative outputs of this analysis
works to preempt interpretation, framing everything presented as potentially
actionable. This direct intervention in the becoming-target of people, struc-
tures, and ecologies reveals Agile Condor as an operative expression of what
Massumi calls “ontopower”: the power to bring into being. Agile Condor and
all such autonomous systems do not simply identify targets but produce them
through their violent mediation of the world around them, binding affect and
encounter into the knowledge apparatus of the ai-enabled drone.
Agile Condor points to the existence of a machinic witnessing operat-
ing exclusively within an algorithmic domain inaccessible to the human.
This machinic witnessing occurs alongside the preemptive determinations
that the system makes. Diffracted through the hunt for emergent threat and
within the loop of sensor capture and algorithmic identification, classifica-
tion, filtering, and prioritizing, this mode of relation to the event probes the
limits of witnessing, as the next chapter examines in detail in the context of
learning algorithms. Within the broader milieu of drone warfare, any wit-
nessing that occurs within the Agile Condor system needs to be understood
in relation to its consequences for the witnessing that takes place within the
wider apparatus and in conjunction with its human actors. "Prosthetically
tethered to the war machine,” writes Bousquet, “the combatant’s cognitive
and neurological labors are hitched ever more tightly to cybernetic control
loops, mind and body subsumed into complex assemblages that render the
locus of agency increasingly diffuse and uncertain.”87 This dispersal of agency
throughout the system means that witnessing—as a mode of relation that
binds agencies to events—is also diffused. This diffusion concentrates within
particular pockets of intra-activity, sites of intensity where perceptual trans-
ductions take place, and where determinations are rendered in relation to
the data produced. If preemptive technologies such as Agile Condor seek
to cut through the inefficiencies of symbolic, narrative, and causal analysis,
they also undo the grounds of evidence itself by presenting the (potential)
need for action through an operative frame detached from the complexities
of the world beyond the sensor.88 In this sense, the machinic witnessing at
work within the technical constellation of Agile Condor pod, sensor array,
and aerial drone constitutes a kind of witnessing without evidence. For the
human operators, analysts, and commanders looped into such cybernetic
control systems to varying degrees of intimacy, witnessing is already violently
mediated by the preemptive shaping and techno-authority of the targeting
system. For those operators “seeing” war through the machinic eye of auto-
mated imaging and analysis systems, witnessing drone violence is inescap-
ably nonhuman. Not only because the apparatus mediates what is captured
by its sensors, but also because human witnessing is already preemptively
entangled within the machine vision system.
witnessing autonomy
of killer robots dedicated to understanding their historical origins.”89 Such a
robot historian, DeLanda speculates, would compose a very different history
of their own emergence than a human might, one far more concerned with
how machines shape human evolution toward their own autonomy than
with the agency of humans in assembling them. In the evolution of armies,
"it would see humans as no more than pieces of a larger military-industrial
machine: a war machine.”90 Seeking to trace its own emergence, the historian
of a world of autonomous, weaponized robots would turn not to human
historical witnesses but to instances of machinic, signaletic, energetic, and
elemental witnessing registered in material records and relics, in the transfor-
mation of motors, fuel cells, transponders, mining equipment, the chemical
composition of geologic layers, atmospheres, and oceans. “Order emerges out
of chaos, the robot would notice, only at certain critical points in the flow of
matter and energy,” and so the question for the robot historian might well be
how certain factors cohere within self-organizing processes to tip them over
into evolutionary progression.91
Borrowing from Gilles Deleuze, DeLanda calls this autopoetic coherence
the “machinic phylum,” or the set of self-organizing principles and processes
that share deep mathematical similarities.92 For DeLanda’s putative robot
historian, the notion of a machinic phylum that blurs distinctions between
organic and inorganic life would be deeply appealing: it would suggest an in-
herent yet emergent coherence to the existence of “artificial” intelligence that
is not outside or alien to “nature.” Given how indebted computation is to war,
any account of how robot intelligence emerged would have to center military
technologies: “The moment autonomous weapons begin to select their own
targets, the moment the responsibility of establishing whether a human is
friend or foe is given to the machine, we will have crossed a threshold and
a new era will have begun for the machinic phylum."93 In the three decades
since DeLanda’s book, autonomous systems have proliferated, evolved, and
mutated in startling ways. In this chapter, I have shown how targeting tech-
nologies such as Agile Condor operate on the cusp of autonomy, producing
potential targets within a situation of imagined machinic precision. Yet there
are already autonomous weapons systems that significantly predate the new
typologies built on artificial neural networks and other predictive analytics.
Missile defense shields such as Israel’s Iron Dome operate on predefined
rules to knock out incoming attacks in response to sensor data. Packer and
Reeves point to aerial weapons systems “programmed with a range of po-
tential target criteria” that allows them to “slip between offensive and defen-
sive modes, loitering in an engagement zone until an appropriate target can
be discovered and automatically engaged.”94 Like all revolutions, then, the
seemingly sudden arrival of killer robots—heralded by viral videos of danc-
ing Boston Dynamics humanoids and swarming slaughterbots—has deeper
historical roots. Many of the most autonomous systems today are not found
on killer drones, but in huge guns mounted on naval vessels or on mobile
artillery platforms designed for surface-to-air defense.
While much of the history of early computing flowed from the labs of
darpa and other military agencies to the corporate world, rapid advance-
ment of machine intelligence now largely takes place at Google/Alphabet,
Amazon, Microsoft, Alibaba, Facebook/Meta, Palantir, and the countless
startups striving to join or be bought by the tech giants, or at university
labs, many underwritten by the tech industry.95 ai systems are built to be
transposable from one situation to another, such that machine vision and
navigation techniques developed for autonomous passenger vehicles can be
readily adapted to military contexts. With the infamous Predator already
mothballed and the Reaper slated to be decommissioned, remote warfare
is increasingly characterized by a far more diverse range of vehicles, plat-
forms, and systems. In the swift 2020 war between Armenia and Azerbaijan,
for example, the latter’s autonomous and semiautonomous drones proved
decisive, demonstrating the increasing accessibility of these technologies
for military actors and signaling the capacity of homegrown automated sys-
tems to shift the calculus of war. In Ukraine’s resistance to the 2022 Russian
invasion, creative applications of consumer off-the-shelf drones augmented
the use of large-scale weaponized drones and loitering munitions. At the
same time, an arms race for swarming drone technologies is underway, with
India trumpeting a field test of seventy-five swarming drones in 2021 and
the darpa offset program showcasing mixed ground and aerial swarms
in 2019, stoking fears of a new genre of weapons of mass destruction. While
this diversification means that drones designed for an ever-widening array
of mission types and milieus can be readily found, increasingly critical ques-
tions concern software systems, data collection and analysis, and the opera-
tive processes that enable identifying friends and foes, and targeting those
deemed threats. Like DeLanda’s robot historian, we are now confronted with
the problem of tracing the emergence of such systems, but even more acutely
with the necessity of constructing the means to witness the autonomous violence
they will—and already do—produce.
Reflecting on the necessity for research to understand war in ontological
terms, Caroline Holmqvist calls for greater attention to “what it means to
be a human being living the condition of war."96 Without diminishing the
significance of this question, in the face of increasingly autonomous martial
systems and operations an inseparable concern is what it means to be non-
human in the condition of war. Or, to inflect this slightly differently, making
war sensible for humans means being able to ask how autonomous warfare
systems shape and are shaped by the world-making and knowledge-forming
interplay of humans and nonhumans alike. Like the Reaper or Agile Condor,
such systems are witnessing machines, but also what must be witnessed. I
want instead to ask how nonhuman witnessing invites an alternative ap-
proach to questions of human accountability, responsibility, and intelligibility
in the operation of autonomous war. But the challenge of pursuing martial
empiricism into the realm of emergent military technologies is that so much
remains in the virtual space of speculation and proposition. We can only seek
to move with the machinic turbulence of uncertain becomings that are still
very much in the process of (self-)organizing into the autonomous, machinic
violence of the future.
Within critical discussions of autonomous weapon systems, focus often
centers on the role of human actors within the system. As with so much
debate about ai more generally, problems are framed around the account-
ability of systems to human oversight. In military parlance, this is typically
understood by the position of the human in relation to the "loop" of decision
making that runs from sensing to targeting to firing. If a human is in-the-
loop, they have a deciding role on whether an action will be taken; on-the-loop
they have active oversight and the immediate capacity to intervene; off-the-
loop, the system runs autonomously without direct oversight. Prominent
critics of lethal autonomy, such as the roboticist Noel Sharkey, have proposed
more graduated categories for defining autonomy that center the agency of
human actors, with the aim of delineating high degrees of autonomy that
should be prevented from being strapped to lethal weapons.97 But while these
are important distinctions that support the international legal push to ban
lethal autonomous weapons systems, they operate within a larger tendency
toward the excision of the human from military systems. Military precision,
logistics, organization, and speed all depend on what Packer and Reeves call
“a preventive humanectomy” that promises to reduce friction and boost effi-
cacy by eliminating the weak point in data processing regimes.98 An ultimate
end of the militarization of violent mediation is thus the elimination of the
human from technological systems in any role other than that of a potential target
for violence. Within such systems, the capacity for the human to witness war
narrows to the sharp, brutal end of violence, almost certainly launched from
a significant geographical distance.
Witnessing this becoming-target becomes impossible from within the hu-
manist frame, both because the human is excised and because technoscien-
tific military systems, particularly those underpinned by complex algorithms
or artificial neural networks, are themselves inscrutable to humans. Problems
of black-boxed processes and partiality within knowledge production and
decision making are not unique to algorithms. Rather, as Amoore points
out, algorithms help “illuminate the already present problem of locating a
clearsighted account in a knowable human subject.”99 Knowledge of both self
and other is always partial, yet these limitations of knowledge are buttressed
by culture, politics, ethics, and sociality. Witnessing functions to bridge this
lack, proffering a relationality grounded in the necessity of building shared
knowledges, ways of living, and forms of connection. Reflecting on the feed-
back loops, datafied human associations and actions, and back propagation
mechanisms of machine learning systems in both surgical robots and weapon-
ized drones, Amoore points out the “human in the loop” is an elusive figure:
“The human with a definite article, the human, stands in for a more plural
and indefinite life, where humans who are already multiple generate emer-
gent effects in communion with algorithms."100 Unlike the human witness,
nonhuman witnessing transects these dynamics by refusing the distinctions
that underpin and separate out the human and the machine. Against the
notion that a reasoning human might provide both an ethical decision and
a witnessing account of autonomously executed violence, nonhuman wit-
nessing insists on the incapacity of either human or computer to account
for itself or the other. By starting with entangled relationalities, nonhuman
witnessing addresses violent mediation as an autonomous process that nev-
ertheless must be understood in relation to the human—and the human must
be grasped in its complicity with and resistance to such violent mediations.
My claim is not that understanding certain machinic processes as nonhu-
man witnessing would magically “reveal” or “expose” something new about
those processes. Rather, my contention is that the recognition of nonhuman
witnessing requires new critical understandings of the relations between elements
within systems of autonomous violence, and in doing so insists that we
resist an uncritical return to the figure of the autonomous liberal subject as
the antidote.101 If nonhuman witnessing takes place within autonomous mili-
tary systems through the registering of violent or potentially violent events by
sensors, their transformation into actionable data through machine vision,
and their determination as killable according to a computational matrix
of preemptive predictions, then the nonhuman witnessing of autonomous
military systems must reckon with the violent mediations of witnessing itself.
Within autonomous systems, those violent mediations are always directed
toward the future. Or, rather, they depend on accumulated data from the past
to produce machinic predictions about the future.
Predictive analytics are thus always about the production of futures, or
the preemptive demarcation of certain virtuals as more or less on the verge
of becoming actual. “Threat is from the future,” writes Brian Massumi. “It is
what comes next. Its eventual location and ultimate extent are undefined.
Its nature is open-ended. It is not that it is not: it is not in a way that is never
over.”102 This is the logic of preemption, where, as Andrejevic points out, “the
imminent threat becomes the lens through which a range of risks comes to
be viewed by those with the tools for responding to them."103 Autonomous
military systems, whether weaponized or merely analytic, produce threat in
order to master it and in doing so collapse the future into the present through
the violent mediation of limitless potentiality into actionable probability.
Such systems are ontopowerful because they seek to intervene in becoming
itself, in the emergence of events from the temporal unfolding of existence
within time. While the claim of such systems is for security (of the state and
its citizens) and accuracy (in reducing the loss of life of those becoming-
targets), this masks a necropolitical imperative: the automated determination
of death as a mechanism for the production of power. Lethal autonomous
weapons systems show how technoscientific necropolitics continually pushes
power to the edge of perception, which functionally merges with the limits of
language the thickness of experience, however incompletely, so too might
nonhuman witnessing entail rendering sensible, however inadequately, the
violent mediations of datafication, preemption, and operationalism.
Once again, this pursuit of nonhuman witnessing returns us—seemingly
inevitably—to the human. In a fierce critique of the sociopolitical implica-
tions of algorithmic violence, Peter Asaro writes that in an age of autonomous
weapons we need to ask: "What will it mean to be human? What kind of
society will these systems be defending?”105 Questions of geopolitical power,
of regional and global balances and arms races, are not enough. Algorithmic
warfare leverages the globalized economy, infrastructures, and mobilities
that gird contemporary technocapitalism, which means that these questions
of how we reckon with its knowledge machines and knowledge claims are
not solely the preserve of military strategists or critical theorists. The necro-
and ontopolitics of algorithmic war and contemporary state violence share a
voracious need for embodied targets, human or otherwise, and autonomous
war must be returned to questions of life in material and martial terms, as
well as conceptual ones. Bound up with this task is also an understanding of
the human and machinic labor involved in such systems, a question which I
will take up in the next chapter.
The point is not to grant the political subjectivity of the human witness
to algorithms or killer robots or semiautonomous drones, or to relegate the
human from a central role in the witnessing of war. Recognizing the agency
of nonhuman entities does not equate to granting them citizenship, but non-
human witnessing aims to bring them into the space of political contestation
with their agency intact. Speculating on the future consequences of autono-
mous weapons for the status of the human, Grove asks: "What will a close
encounter with nonhuman intelligence do to force a ‘persisting us’ to rethink
the use to which we have put machines in the pursuit of what we ourselves
have been unwilling to do?”106 Another way to pose this question is to ask
what ethicopolitical status we might afford to self-aware machinic encounters
with the world? How will we think about the forms of knowledge they gener-
ate and the testimonies of unjust use they might compose? In returning to the
human, then, the task at hand is to retain the nonhuman agencies, knowledges,
and relations excavated here, alongside an embodied, situated, and contingent
humanity. In the next chapter, I pursue this challenge in response to the
machine learning algorithms that are increasingly deployed as techniques
of power by states and corporations—but that can also provide openings for
resistance to those very institutions.
chapter two
witnessing algorithms
launched in august 2020, the latest edition of the venerable Microsoft
Flight Simulator video game series offered an open-ended experience of a
world made suddenly inaccessible by covid-19. Unlike its predecessors, ms
Flight Simulator 2020 makes the entire planet its gameplay environment.
In the hyperbole typical of much of the media coverage, New York Times
tech columnist Farhad Manjoo proclaimed that Microsoft had “created a
virtual representation of Earth so realistic that nearly all sense of abstraction
falls away.”1 As a technical achievement, Flight Simulator is certainly impres-
sive. Combining data from OpenStreetMaps and Bing Maps via the Azure
artificial intelligence cloud computing platform, Microsoft created an algo-
rithmic system to assign and render photorealistic 3d imagery of skyscrapers,
homes, trees, oceans, mountains, and so on. This imaging of the world is
not, however, photographic but datalogical: generated algorithmically by a
machine learning model fed vast amounts of map, satellite, photogrammetric,
and other data. It is a machinic imagining of the textures of the world. Like
Google Earth, it is a datalogical attempt at solving the fundamental problem
that plagues the unusable map from Borges’s short story “On Exactitude in
Science,” which in the effort to precisely represent every detail of an empire
grows to the same size as the territory. Rather than indexing its cartography
to the world perceived by human mapmakers, Flight Simulator generates
what its algorithms believe the world to be. Players quickly found numer-
ous strange glitches: a corporate office tower in place of the Washington
Monument, a mashup of vegetation and buildings in the Norwegian town
of Bergen, obelisks in place of palm trees. Far from a utopian rendering of a
world made beautiful yet knowable, Kyle Chayka writes in Slate that Flight
Simulator reminds us that "an automated, unchecked process is warping the
(virtual) world around us, leading to these weird errors and aberrations.”2
Even as Flight Simulator seemed to offer a new algorithmic means of witness-
ing in wonder at the world, its glitches remind us of the necessity of witnessing
those same algorithmic systems. If algorithms are themselves witnessing, mak-
ing knowledge, and forging worlds of their own design, what might it mean to
witness their workings?
The world-making capacity of the algorithm is not readily apparent in its
more common definitions: a step-by-step instruction of how to solve a task;
a recipe; a form of programmed logic; an automated filtering mechanism.
These commonplace accounts fail to get to the heart of things, the operative
processing made possible by the “if . . . then” procedure of the algorithm and
its potentially harmful outcomes.3 In principle, algorithms are abstract processes,
which means they are not dependent on a specific computer language
for their validity. But in practice, algorithms are typically encoded in distinct
computer languages and ecosystems. More than this, though, they are also
inescapably codes in the sense that they unlock certain translations, opera-
tions, or transformations of data.4 We might even think of them as magic
in the sense that the incantation of the algorithm by the software within
which it is packaged enables action to be performed. Like codes and magic,
algorithms conceal their own operations: they remain mysterious, including
to their makers. This inscrutability is particularly the case with the machine
learning algorithms that have become the principal means by which power
is now enacted, maintained, and reproduced in the digital domain.
Machine learning is a technique for the statistical analysis of huge quanti-
ties of data. A machine learner is an algorithmic system in which computer
code learns from data to produce a model that can be deployed on more data.
Machine learning produces models by using algorithmic techniques to look
for patterns in huge amounts of data, then applying those patterns to the
data to become increasingly discerning: able to recognize, differentiate, and
discriminate between elements within the database. Machine learning pow-
ers everything from inbox filtering to Netflix recommendations and it feeds
on the data produced through our interactions with t hose systems. Machine
learning systems and the companies that promote them almost always seek
to obscure both the "free labor" of user interactions and the low-paid labor of
digital pieceworkers on platforms such as Mechanical Turk in an effort to sell
the technical prowess of their “ai” inventions. Machine learning uses layers
of neural networks—arrays of computational nodes that work collaboratively
to build relations between bits of data—to make predictions about the data.
With OpenAI’s ChatGPT, this manifests as the statistical production of text
based on what the model anticipates to be the desired answer to a query.
In military operations, it might mean identifying and prioritizing distinct
threats in a crowded conflict zone. Rather than following a defined sequence
of steps, machine learning models act recursively to build relational functions
that can be applied ever-more accurately and efficiently, provided the learner
is trained and optimized appropriately.5
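In schematic terms, this recursive loop can be sketched in a few lines of code. The network, data, and sizes below are toy stand-ins rather than any deployed system; the point is only the cycle of prediction, error measurement, and adjustment.

```python
# Toy sketch of a supervised machine learner: code plus data produce a
# model that is then deployed on more data. All values are stand-ins.
import torch
from torch import nn

# A tiny two-layer neural network: arrays of nodes building relations
# between bits of data.
model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# Stand-in training data: 1,000 examples, 10 features each, binary labels.
features = torch.randn(1000, 10)
labels = torch.randint(0, 2, (1000,))

for epoch in range(20):          # the recursive loop
    optimizer.zero_grad()
    predictions = model(features)
    loss = loss_fn(predictions, labels)
    loss.backward()              # backpropagation adjusts the relational weights
    optimizer.step()

# The trained model is then applied to data it has never seen.
with torch.no_grad():
    new_example = torch.randn(1, 10)
    print(model(new_example).softmax(dim=1))
```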
But this technicity is not purely technical. As Adrian Mackenzie points
out, there are no machine learning systems without human coders and
humans are also needed to tag objects in the datasets for the supervised
training by which many machines learn.6 In so-called unsupervised learning,
algorithms develop their own data tags, but human effort is still constantly
required to tweak, select, optimize, and monitor training. Jathan Sadowski
calls this “Potemkin ai,” or artificial intelligence that is actually only thinly
computational and largely driven by human labor.7 On top of the obscured
human labor, Sy Taffel shows how computational systems also elide massive
ecological costs of powering and cooling data centers, not to mention mining
rare and common metals or shipping equipment across the globe.8 To bring
machine learning into the language of this book, its models and algorithms
are not alien, purely technical agents wholly separate from the human, but
rather enmeshed with the human and with the more-than-human world.
How machine learners make knowledge matters because they are increas-
ingly pivotal to contemporary finance, logistics, science, governance, national
security, and culture, yet they remain hard to scrutinize, building blocks in
what Frank Pasquale calls the “black box society.”9
Despite their technical veneer, algorithms are shaped and bound by as-
sumptions and values about the world, drawn from the datasets upon which
they are built, the biases of their architects, and the instrumental objectives
of the institutions that use them. These assumptions and values might be
as straightforward as whether to order library books by alphabet or catalog
number, or as outrageously discriminatory as Facebook allowing housing
advertisers to exclude users from target audiences using zip codes and other
proxies for race, class, and religion. Given the colonial entanglements of
modern science, regimes of classification, and the statistical techniques that
underpin contemporary data science and machine learning, the constitutive
violence of many such systems should come as no surprise. Algorithmic
violence, whether in the form of digital redlining or autonomous weap-
ons, is an ethicopolitical problem much more than a technical one.10 As Ed
Finn points out, the algorithm is a crucial site of critical inquiry because it
is “the object at the intersection of computational space, cultural systems
and human cognition.”11 Traceable back to the cybernetic era of computa-
tional research that followed World War II, algorithms were at the center of
a radical transformation that substituted rationality for reason. Within two
decades of the war, as Orit Halpern argues, “the centrality of reason as a tool
to model human behavior, subjectivity, and society had been replaced with
a new set of discourses and methods that made ‘algorithm’ and ‘love’ speak-
able in the same sentence and that explicitly correlated psychotic perspective
with analytic logic.”12
Now deployed across almost every field of human endeavor and inquiry,
algorithms bridge the gap between computation, culture, and thought—but
they are not reducible to any of those domains. According to Taina Bucher,
algorithms are “entangled, multiple, and eventful and, therefore, things that
cannot be understood as being powerful in one way only.”13 Consequently,
“algorithmic systems embody an ensemble of strategies, where power is im-
manent in the field of action and situation in question.”14 Research by Safiya
Noble and others into the oppressive biases of Google and Facebook shows
how supposedly objective systems are inseparable from racism, sexism, and
other socially produced and reproduced structures of domination.15 Gen-
erative ai tools such as Dall-E 2 or Midjourney are no exception, evidenced
by the efforts of their architects to engineer user inquiries rather than resolve
the impossible problem of the underlying data.16 As Nick Seaver argues,
“algorithms are not singular technical objects that enter into many different
cultural interactions, but are rather unstable objects, culturally enacted by the
practices people use to engage with them.”17 Much like a poem, algorithms
are tricky objects to know and often cannot even reveal their own workings.18
Critical research thus attends less to what an algorithm is and more to what
it does.19
In pursuing nonhuman witnessing of, by, and through algorithms, my
focus is on their operative, extractive, and generative qualities, rather than
their computational mechanics. Through a series of investigations into dis-
tinct machine learning systems, I argue that algorithms can engage in a
perceptual process that constitutes nonhuman witnessing, elevating mere
observation into an ethicopolitical plane. In drone warfare, an algorithm
might “see” certain activity, “decide” it is threatening, and “recommend” the
prosecution of violence. My contention is that such algorithmic registering
and translating of worldly phenomena constitutes witnessing because it does
so to violent ends and carries the most immediate traces of that violence. Fa-
cial recognition software is a tool for producing evidence through machinic
witnessing, yet both the data that feeds such systems and the unknowable
neural dynamics that animate them make it so dangerous that facial recog-
nition has been described as akin to plutonium.20 Algorithmic witnessing,
then, often takes place through the enactment of violence, with the algo-
rithm as both witness and perpetrator. At the same time, such algorithms
are themselves entities that must be witnessed—yet by their entangled nature
they resist being broken into consistent elements that can then be rendered
knowable.
This chapter grapples with the competing dynamics of the doubled mean-
ing of its title: algorithms that do witnessing and the witnessing of algorithms
(and what they do). Or, to put this differently, this chapter asks both how
algorithms might be agents of witnessing and how algorithms might be wit-
nessed? Rather than look for machinic relations to events that might be
analogous to human witnessing, this chapter seeks out intensive sites within
human-nonhuman assemblages where machinic affect—technical yet con-
tingent, potential rather than predetermined—enables forms of encounter
that generate a relation of responsibility between event and algorithm. Doing
so requires the bracketing of any ethical imperative to witnessing: algorithmic
witnessing can only ever be grasped within the milieu of the algorithm, an
agency that can only be ascribed ethics or morality through anthropomor-
phic gymnastics. Delving into the machinic affects of witnessing algorithms
will require us to depart further still from the narrow humanistic conception
of witnessing and to insist on the separation of witnessing from testimony.
If algorithmic technologies are now crucial knowledge machines, yoked
to power, and the infliction of state violence, then asking how witnessing
reckons with them and takes place through them requires attending to how
computational processes generate their machinic relations, and how those
relations sustain the power of those systems.
Even as the image increasingly overwhelms the word as the dominant
form of communication, the expansion of technologies that identify and
organize images means that a new form of aggregated, relational perception
is taking hold. Writing on the aggregation of huge numbers of images into
datasets analyzed by machine learning systems, Adrian Mackenzie and Anna
Munster understand these relational processes as “generative technical forces
of experience.”21 They propose the concept of “platform seeing” to describe
an operative mode of perception “produced through the distributed events
and technocultural processes performed by, on and as image collections are
engaged by deep learning assemblages.”22 In their account, “seeing” is not the
act of a singular entity but rather something that takes place across a great
many human, material, and computational agents. Images become subject
to a host of functions: precisely formatted for input to models; labeled, processed,
and used to configure small neural networks onboard smartphones;
moved from the devices of consumers to platforms and their data centers and
back. Through these and other functions, images are transformed from bearers
of indexical relations to elements within operational (image) collections.23
Consequently, the relations between images within the dataset, including the
relations of elements within images to elements within others, become more
important than the images themselves.
Platform seeing is thus the “making operative of the visual by platforms
themselves.”24 This invisual mode of perception takes place outside the do-
main of representation: images no longer take their meaning from things in
the world but rather in relation to the elements and edges of other images.
Crucially, this “operativity cannot be seen by an observing ‘subject’ but rather
is enacted via observation events distributed throughout and across devices,
hardware, human agents and artificial networked architectures such as deep
learning networks."25 Despite the absence of a human subject, these processes
still constitute something called "seeing" precisely because they remain
within the perceptual domain of recognizing and differentiating images. In
this chapter, I make a parallel argument about witnessing: that even with-
out a witnessing “subject” in the unitary humanist sense, witnessing occurs
within and through algorithmic systems. Such witnessing necessarily exists
on a continuum with perceiving and cannot be neatly distinguished from
it. Different contexts, media technics, and human entanglements produce
distinct intensive fields of relation that shift perceiving into the modality of
witnessing. Not all human perception entails witnessing, and so neither does
all perception by the nonhuman agencies of algorithms.
While witnessing rarely figures in discussions of algorithms and artificial
intelligence, terms that appear in witnessing discourses abound: truth, rec-
ognition, memory, transparency, ethics. This is not to suggest an inherent
synchronicity between witnessing and the algorithmic, but rather to point
out that the perception required for both to operate possesses a purposive
dimension. As Amoore writes, “A defining ethical problem of the algorithm
concerns not primarily the power to see, to collect, or to survey a vast data
landscape, but the power to perceive and distill something for action.”26 In
much the same way, witnessing is not reducible to seeing, but is a perceptual
encounter that produces an injunction to action through its configuring
of a particular scene and its coalescing of that scene’s relational dynamics.
Like algorithms, witnessing makes truth claims about the world as well,
and is also prone to oversight, misapprehension, and misstatement.27 Like
algorithms, witnessing is prone to falsity, whether deliberate or accidental.
Their distributed, multiple, contingent, and operative existence means that
algorithms cannot be known or accounted, and yet neither can the human.
It is only ever humans, plural, who can give account, and doing so is always
incomplete. This is why, for Amoore, “algorithms do not bring new problems
of blackboxed opacity and partiality, but they illuminate the already present
problem of locating a clearsighted account in a knowable human subject.”28
Neither human nor algorithm can give an account of itself that is complete
or transparent. An ethics for algorithmic worlds cannot “seek the grounds
of a unified I ” but must instead “dwell uncertainly with the difficulty of a
distributed and composite form of being.”29 This chapter pursues the ques-
tion of what distributed, opaque, and decentered witnessing might look like
within technics of the algorithm—and how such a contingent and multiple
domain might itself be witnessed.
Crucial to that task is tracing what I call machinic affect, or the intensive
relations that bind technical systems to one another and humans to technical
systems. By machinic affect, I mean the capacity to affect and be affected that
occurs within, through, and in contact with nonhuman technics. In keeping
with Félix Guattari’s expansive conception, my own use of “machinic” is not
restricted to the mechanistic but rather refers to the processual assemblage
of elements, objects, concepts, imaginaries, materialities, and so on that
form “machines” through their distinct yet transversal relations. Guattari’s
machines are organic and inorganic, technical and social, material and ab-
stract.30 Machinic affect is not so much indifferent to the flesh as it is promis-
cuous in its adhesive and intensifying properties, such that the corporeality
of the human does not default to center stage.31 Excavating machinic affect
from technical assemblages requires attending to the distinctiveness of indi-
vidual technical objects as they assemble, attenuate, modulate, amplify, and
terminate technical and nontechnical relations. In the context of witnessing,
machinic affect can be applied to understand the relations forged between
witness and event when mediated through screens. But more importantly
and generatively, machinic affect offers an analytic for making visible the
otherwise obscured machinic relations of complex technical systems and
especially learning algorithms.
Machinic affect describes the dynamic intensities of technical systems. As
such, machinic affect is autonomous intensity: owned neither by one body
nor another, but constituting and constituted by them, whether human or
non. Pursuing machinic affect within the media-specificities of algorithmic
systems, I am interested in how the processual empiricism of what Massumi
calls the “virtual” illuminates the relational dynamics of machine learning.
For Massumi, the virtual describes the immanence of potentiality, its passage
from futurity through experience and into pastness. The virtual is what might
arise and what might have been. It is not opposed to the actual, but its under-
side. Affect is “precisely this two-sidedness, the simultaneous participation of
the virtual in the actual and the actual in the virtual, as one arises from and
returns to the other.”32 This chapter is about the necessity of witnessing how
algorithms, particularly machine learning ones, oscillate between actualizing
the virtual and virtualizing the actual.
If algorithmic systems are about taming potential into probability in the
name of emergent ordering of worldly phenomena, we can understand them
in Massumi’s terms as ontopowerful: as technological processes for the mas-
tery of becoming.33 Machine learning systems are constituted by unreason—
even madness—through looping recursivities.34 This nonlinearity, too, finds
much in common with Massumi’s recognition that “intensity would seem
to be associated with nonlinear processes: resonation and feedback that
momentarily suspend the linear progress of the narrative present from past
to future.”35 As well as disassembling and distributing the subject, witnessing
algorithms requires dismantling and dispersing the event in time as it is taken
up and worked upon by algorithmic agencies. This chapter thus excavates
the distinctive dynamics of nonhuman witnessing across four instances of
algorithmic world-making: the false witnessing of deepfakes; the animat-
ing of evidence in Forensic Architecture’s machine learning investigations;
military imaginings of archival and real-time processing of full motion video
imagery from loitering drones; and the witnessing of machine learning processes
in aesthetic interventions into algorithmic systems. Operating with
different learning models and data sources and within very varied contexts,
these examples show the dangers of algorithmic witnessing and the necessity
of witnessing algorithms, but they also suggest the potential of such systems
to work against state and corporate violence.
bearing false testimony: deepfakes
The synthetic media that would become known as “deepfakes” first surfaced
to mainstream attention with a December 2017 article by Samantha Cole in
Vice Media’s tech site Motherboard about a pornographic video that appeared
to feature the actress Gal Gadot having sex with her stepbrother (figure 2.1).
As Cole reported on the tech site, the video was a fake, the clever but decid-
edly imperfect creation of a Reddit user with some basic machine learning
skills and open-access tools downloaded from the code repository GitHub.36
Fake and face-swapped pornography are not new phenomena: cgi porn
is widely available, while photoshopped porn images have been around as
long as the internet and altered nude photographs since the early twentieth
century at least. The difference in the Gadot video was the application of
deep learning techniques to automatically swap one face with another. That
technique gave the Redditor his handle and the new genre a name: deepfakes.
“With hundreds of face images, I can easily generate millions of distorted
images to train the network,” deepfakes told Cole. “After that if I feed the
network someone else’s face, the network will think it’s just another distorted
image and try to make it look like the training face.”37 With so many high-
quality images on which to train the system available online, celebrities like
Gadot are easy targets. But that same ease could readily apply to politicians,
and to voice as well as video. Arriving amid a rising tide of distrust in systems
and institutions, deepfakes seemed to herald a new threat, undermining
democratic processes and cybersecurity and facilitating misinformation and
revenge porn. A cottage industry of deepfake creation and detection sprung
up in response. Deepfakes seemed to enable the bearing of algorithmic false
witness—a problem only complicated by the arrival of more user-friendly
artificial image and video generation tools in the years since.
While t here are several techniques that can be used to generate deepfakes,
the most effective are produced through a form of deep learning neural net-
work called “generative adversarial networks,” or gans. While image recog-
nition algorithms are typically trained using convolutional neural networks
(cnns) that slide filters across images to learn their spatial properties, gans
work by pitting two algorithms against each other in a game of true and
false (figure 2.2). First proposed by ai researchers led by Ian Goodfellow in a
2014 paper, the premise of gans is simple enough: one neural network (the
generator) learns to create images that it then feeds to another network (the
discriminator), which decides if the image is “fake” or “real” compared to its
own training dataset.38 Those results are then fed back into the generator, so
figure 2.1. Still from Gal Gadot deepfake porn
that it can learn from the assessment of the discriminator. What makes the
technique powerful is that both networks are learning at the same time, with
the discriminator learning just enough to get ahead of the generator each
time the quality of its fakes catches up. To produce the Gal Gadot deepfake
with a gan, the generator would be fed the pornographic video while the
discriminator learned from real photos of Gadot. As the generator modified
its video using several image-blurring and blending techniques, proxim-
ity to what the discriminator was learning about Gadot would yield better
and better results until the discriminator could no longer identify the fake
images as fake at all. If the gan was then trained on other video and image
sets, it would get even better at its task. In this way, gans can become highly
accomplished at swapping any face for any other. Versions of this technique
can be applied to specific parts of the face, too, such as the lips, or to audio,
enabling the falsification of someone’s voice to match a script, as in the widely
reported Obama lip-sync demonstration.39
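The adversarial game itself can be rendered schematically. The toy example below trains a generator and discriminator on random vectors rather than faces, so it illustrates only the training dynamic described above, not any deepfake tool; every size and name is an assumption.

```python
# Schematic GAN training loop: a generator learns to fool a discriminator
# that is simultaneously learning to tell real from fake. Toy data only.
import torch
from torch import nn

latent_dim, data_dim, batch = 16, 64, 256
generator = nn.Sequential(nn.Linear(latent_dim, 128), nn.ReLU(),
                          nn.Linear(128, data_dim))
discriminator = nn.Sequential(nn.Linear(data_dim, 128), nn.ReLU(),
                              nn.Linear(128, 1), nn.Sigmoid())
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)
loss_fn = nn.BCELoss()

real_data = torch.randn(batch, data_dim)  # stand-in for "real" training images

for step in range(1000):
    # 1. Train the discriminator to label real data 1 and generated data 0.
    fake_data = generator(torch.randn(batch, latent_dim)).detach()
    d_loss = (loss_fn(discriminator(real_data), torch.ones(batch, 1)) +
              loss_fn(discriminator(fake_data), torch.zeros(batch, 1)))
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2. Feed the discriminator's judgment back to the generator, which
    #    learns to produce outputs the discriminator scores as "real."
    g_loss = loss_fn(discriminator(generator(torch.randn(batch, latent_dim))),
                     torch.ones(batch, 1))
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()
```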
While computer science papers have focused on deepfake creation and
detection, the humanities and social sciences have begun to address a wider
set of questions.40 The most attentive examinations of deepfakes have oc-
curred within porn studies, where the gendered nature of the technology in
practice—more than 99 percent of documented deepfakes feature women
face swapped into pornographic videos—has been documented and exam-
ined in a range of contexts, from revenge porn to communities of practice
to the emergence of “designer” porn.41 Legal scholarship within the United
States has addressed how deepfakes create tension between rights to free speech
and privacy, as well as how they present a potential crisis for the verifi-
cation of evidence presented to court.42 Possible impacts for cybersecu-
rity and information warfare are articulated in more apocalyptic terms.43
But deepfakes also point to the vibrancy of everyday data cultures, and
the experimental, open-source approach to ai and automation literacies
taking place on GitHub, YouTube, and Reddit.44 “Deepfakes are complex
epistemic things," observes Rebecca Uliasz, which "testify to the ongoing
socio-technical value we place on visual accuracy which manifests in our
continued investment in imagistic realism as truthful.”45 As such, deepfakes
can be understood as yet another technological blow to shared epistemic
frameworks, further undermining certainty in image authenticity for both
journalists and publics.46
For witness, a New York–based human rights organization that equips
citizens and activists with video tools and resources, deepfakes and related
forms of synthetic media are an urgent danger because they can amplify or
microtarget the kinds of media disinformation and incitement that spark
massacres, assaults, and political instability. In a report on synthetic media,
witness distills dozens of scenarios into five key problem areas: reality edits,
credible doppelgängers, news remixing, floods of falsehood, and plausible
deniability, in which claims of deepfaking allow bad faith actors to deny
having said or done what a video shows.47 Deepfakes lead to two interrelated
epistemic challenges: “the inability to show that something real is real, and
then the ability to fake something as if it was real.”48 For witness, the in-
ability to prove that something real is real presents the more serious dilemma
because it suggests an existential peril for the mediated processes that are so
essential to contemporary shared realities. This algorithmic false witness risks
placing all mediated witnessing into question. Deepfake false witnessing cuts
and intensifies preexisting risks to individuals and communities by catalyzing
uncertainties within contemporary media ecologies. Over the last few years,
witness has worked with Partnership on ai to develop guidelines for appro-
priate use of synthetic media in human rights contexts. These are necessary
and important practical steps, but the epistemic challenges of deepfakes and
related media forms persist.
By threatening the legitimacy of the image, deepfakes destabilize the very
foundations of media witnessing as a shared means of producing an agreed
reality. Deepfakes emerged in a media witnessing ecology in which power has
shifted from the authority of legacy media to the immediacy of smartphone
and other user-generated content.49 As the necessity of grounding truth
claims becomes more urgent, deepfakes heighten the fallibility of witnessing
in, by, and through media. These are fake images that make truth claims,
even as they undermine the possibility of common epistemic ground.50 In
places with declining trust in government or with significant government
instability and insecurity, deepfakes have the potential to incite violence and
violate human rights. Weaponized deepfakes assembled on the fly from so-
cial media records are one nasty possibility for the future of what Tom Sear
calls “xenowar.”51 If neither still nor moving images can be trusted to bear
the indexical relationship to the world that their authority depends upon,
the potential for any mediated witness to be false threatens to pry open the
fractures already running through any sense of shared reality. With their
emphasis on altering or swapping faces, deepfakes are affect machines even
more than cognitive deceptions. Machinic affect here takes a very recogniz-
able form in the micromovements of faciality described by Deleuze in his
account of the affection image in cinema and by Silvan Tomkins in his theory
of nine discrete relational affects manifested on the face.52 Face, voice, and
gesture are among the most crucial embodied qualities of bearing witness:
deepfakes seek to synthesize both fake and real to affect the viewer. Created
through the intensive interplay of machinic relations, deepfakes are also af-
fect engines when loose in the wild. As false witnessing algorithms, deepfakes
exemplify the inextricability of h uman and non in witnessing assemblages
within today's deeply computational world.
Deepfakes are among the most unsettling instances of the shifting re-
lationship between image and data. “An image that is computational is no
longer strictly concerned with mimesis, nor even with signification,” writes
Steven F. Anderson. “Computational images may serve as interfaces, carriers,
or surface renderings, the real importance of which are their underlying processes
or data structures."53 Deepfakes are syntheses of recorded and gener-
ated images made possible by the encoded nature of both. At the level of code
itself, neither bears any more material relation to the world beyond computa-
tion than the other. Even as the image reaches its zenith in visual culture, the
transfiguration into code that made its domination possible contains within
it the collapse of the authority granted to the image by its seemingly indexi-
cal relation to the world. Ironically enough given their origins in diy porn
communities, deepfakes speak to how “the once voyeuristic gaze of cinema
has given way to power relations articulated through computational systems
rather than through ocular regimes predicated on reflected light and bodies
in space.”54 The false witnessing of deepfakes suggests that contestations over
the meaning of images is moving away from signification and into genera-
tion. For deepfakes and imagery produced by Stable Diffusion or Dall-E 2,
contestation ceases to be about what the video image means and comes to be
about the process of its generation.
This movement from semiosis to process means that the false witnessing
of deepfakes must be contested at an ontological level. While early iterations
had a tendency for glitching and an unsettling uncanny valley-like quality,
advances in the deep learning processes of gans now mean that humans
can typically detect deepfakes about half the time, or at the same rate as ran-
dom chance. Deepfake detection tools that draw on the same kind of deep
learning neural networks have become increasingly important, but their
emergence and growing accuracy has led to an arms race between forgers
and detectors.55 This formation of a new adversarial, nonhuman, and ma-
chinic relationship between witness and interrogator points to yet another
site in which critical debates about culture, politics, ethics, and knowledge
play out without the human in the driver’s seat. A potentially endless game
of deception and unmasking awaits in which witnessing itself becomes the
ground of contestation between adversarial machine learning systems and
where social and political life become the field upon which the consequences
of that struggle play out. Ironically enough, algorithmic false witnessing
heightens computation’s claim as both figure and ground for how knowledge
is produced and contested.
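To make this adversarial dynamic concrete, the sketch below sets out a minimal generator-versus-discriminator training loop of the kind that underlies both deepfake synthesis and deepfake detection. It is an illustrative toy in PyTorch rather than any production system: the network sizes, the flattened 64 x 64 frame format, and the function names are assumptions introduced for the example.

# Minimal sketch of an adversarial (GAN-style) loop: a generator learns to
# produce plausible fakes while a discriminator learns to unmask them.
# Illustrative only; architectures and data are placeholders.
import torch
import torch.nn as nn

IMG_DIM = 64 * 64      # flattened grayscale frames (assumption for the sketch)
LATENT_DIM = 100

generator = nn.Sequential(
    nn.Linear(LATENT_DIM, 256), nn.ReLU(),
    nn.Linear(256, IMG_DIM), nn.Tanh(),
)
discriminator = nn.Sequential(
    nn.Linear(IMG_DIM, 256), nn.LeakyReLU(0.2),
    nn.Linear(256, 1), nn.Sigmoid(),
)

loss_fn = nn.BCELoss()
g_opt = torch.optim.Adam(generator.parameters(), lr=2e-4)
d_opt = torch.optim.Adam(discriminator.parameters(), lr=2e-4)

def training_step(real_batch: torch.Tensor) -> None:
    """One round of the forger/detector contest on a batch of real frames."""
    batch_size = real_batch.size(0)
    real_labels = torch.ones(batch_size, 1)
    fake_labels = torch.zeros(batch_size, 1)

    # 1. Train the discriminator (the "interrogator") on real and generated frames.
    noise = torch.randn(batch_size, LATENT_DIM)
    fakes = generator(noise).detach()
    d_loss = loss_fn(discriminator(real_batch), real_labels) + \
             loss_fn(discriminator(fakes), fake_labels)
    d_opt.zero_grad(); d_loss.backward(); d_opt.step()

    # 2. Train the generator (the "forger") to fool the discriminator.
    noise = torch.randn(batch_size, LATENT_DIM)
    g_loss = loss_fn(discriminator(generator(noise)), real_labels)
    g_opt.zero_grad(); g_loss.backward(); g_opt.step()

# Usage with placeholder data: each call tightens the adversarial knot.
for _ in range(1000):
    training_step(torch.rand(32, IMG_DIM) * 2 - 1)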
between evidence and witnessing: forensic architecture and open-source machine learning
Synthetic media are everywhere, not just in deepfakes. Digital images and
objects that appear to index something in the world but do nothing of the
sort have their roots in video games and online worlds like Second Life.
With the growing appetite for niche machine learning training sets and ar-
tificial environments for testing autonomous machines, synthetic media are
increasingly central to the development of algorithmic systems that make
meaningful decisions or undertake actions in physical environments. Syn-
thetic media are swift to produce and can be tagged as part of the production
process, which reduces costs, delays, and inaccuracies from using people to
tag images or other data.
Microsoft AirSim is a prime example, an environment created in Epic’s
Unreal Engine that can be used to test autonomous vehicles, drones, and
other devices that depend on computer vision for navigation.56 Artificial
environments are useful testing grounds because they are so precisely ma-
nipulable: trees can be bent to a specific wind factor, light adjusted, surface
resistance altered. They are also faster and cheaper places to test and refine
navigation software prior to expensive material prototyping and real-world
testing. In machine learning, building synthetic training sets is now an es-
tablished practice, particularly in instances of limited data availability or lack
of data diversity. For example, the company Synthesis.ai produces synthetic
images of nonwhite people to train various kinds of recognition algorithms.
Synthetic media are valuable in contexts such as armed conflict, where im-
ages might be too few to produce a large enough corpus and too classified
to be released to either digital pieceworkers for tagging or private sector
developers to train algorithms.
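The appeal of such environments for training data lies in programmatic control: scene parameters can be swept and every rendered frame emerges already labeled. The sketch below illustrates that workflow in schematic Python; the SimScene class, its parameters, and the output format are hypothetical stand-ins rather than the API of AirSim or any other real simulator.

# Schematic sketch of synthetic training-data generation: sweep scene
# parameters, render frames, and emit labels as metadata at creation time.
# SimScene and its methods are hypothetical placeholders for a real simulator.
import itertools
import json
import random
from dataclasses import dataclass, asdict

@dataclass
class SceneParams:
    wind_factor: float      # bends vegetation by a controlled amount
    light_intensity: float  # scalar for the virtual sun
    surface: str            # ground material affecting reflectance

class SimScene:
    """Placeholder for a scriptable 3D environment (e.g., a game-engine level)."""
    def render(self, params: SceneParams, object_id: str) -> bytes:
        # A real engine would return a rendered frame here.
        return b"<image bytes>"

def generate_dataset(scene: SimScene, object_id: str, out_manifest: str) -> None:
    winds = [0.0, 0.5, 1.0]
    lights = [0.2, 0.6, 1.0]
    surfaces = ["sand", "asphalt", "grass"]
    records = []
    for w, l, s in itertools.product(winds, lights, surfaces):
        params = SceneParams(wind_factor=w, light_intensity=l, surface=s)
        frame = scene.render(params, object_id)
        # The label and scene conditions are known exactly, so no human tagging
        # is needed: metadata travels with the image from the moment it exists.
        records.append({
            "label": object_id,
            "conditions": asdict(params),
            "image_size_bytes": len(frame),
            "jitter_seed": random.randint(0, 2**16),
        })
    with open(out_manifest, "w") as f:
        json.dump(records, f, indent=2)

generate_dataset(SimScene(), object_id="target-object", out_manifest="manifest.json")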
But what happens when synthetic media are marshaled to do the activist
work of witnessing state and corporate violence? What are we to make of
the proposition that truths about the world might be produced via algo-
rithms trained almost exclusively on synthetic data? This section sketches
answers to these questions through an engagement with Triple Chaser, an
investigative aesthetic project from the UK-based research agency Forensic
Architecture. Founded in 2010 by architect and academic Eyal Weizman and
located at Goldsmiths, University of London, Forensic Architecture invents
investigative techniques using spatial, architectural, and situated methods.
Using aesthetic practice to produce actionable forensic evidence, their work
appears in galleries, courtrooms, and communities. In recent years, they
have begun to use machine learning and synthetic media to overcome a
lack of publicly available images on which to train their machine learning
models and to multiply by several orders of magnitude the effectiveness of
images collected by activists. In contrast to the false witnessing of deepfakes,
these techniques show how algorithms can do the work of a more resistant
and generative witnessing, translated into open-source tools for activists via
well-documented GitHub pages.
Presented at the 2019 Whitney Biennial in New York, Triple Chaser com-
bines photographic images and video with synthetic media to develop a
dataset for a deep learning neural network able to recognize tear gas canisters
used against civilians around the world. It responds to the controversy that
engulfed the biennial following revelations that tear gas manufactured by Sa-
fariland, a company owned by Whitney trustee Warren B. Kanders, was used
against protestors at the US-Mexican border. Public demonstrations and artist
protests erupted, leading to significant negative press coverage across 2018 and
2019. Rather than withdraw, Forensic Architecture submitted an investigative
piece that sought to demonstrate the potential for machine learning to func-
tion as an activist tool. Produced in concert with artist and filmmaker Laura
Poitras, Triple Chaser was presented as an eleven-minute video installation.
Framed by a placard explaining the controversy and Forensic Architecture’s
decision to remain in the exhibition, viewers entered a severe, dark room to
watch a tightly focused account of Safariland, the problem of identifying tear
gas manufacturers, the technical processes employed by the research agency,
and its further applications. Despite initial intransigence, the withdrawal of
eight artists in July 2019 pushed Kanders to resign as vice chairman of the
museum and, later, announce that Safariland would sell off its chemicals
division that produces tear gas and other antidissent weapons. Meanwhile,
Forensic Architecture began to make its codes and image sets available for
open-source download and began applying the same techniques to other
cases, uploading its Mtriage tool and Model Zoo synthetic media database to
the code repository GitHub. A truth-seeking tool trained on synthetic data,
Triple Chaser reveals how machinic affects oscillate between witnessing and
evidence.
In keeping with the established ethos of Forensic Architecture, Triple
Chaser demonstrates how forensics—a practice heavily associated with both
policing and the law—can be turned against the very state agencies that
typically deploy its gaze. As Pugliese points out, “Embedded in the concept
of forensic is a combination of rhetorical, performative, and narratological
techniques” that can be deployed outside courts of law.57 For Weizman, the
fora of forensics is critical: it brings evidence into the domain of contestation
in which politics happens. In his agency’s counterforensic investigation into
Safariland, tear gas deployed by police and security agencies becomes the
subject of interrogation and re-presentation to the public.58 In this making
public, distinctions and overlaps can be traced between different modes of
knowledge-making and address: the production of evidence, the speaking
of testimony, and the witnessing of the audience. But how might we under-
stand the role of the machine learning algorithm itself? And how are we to
conceptualize this synthetic evidence?
Weizman describes the practice of forensic architecture as composing
“evidence assemblages” from “different structures, infrastructures, objects,
environments, actors and incidents.”59 There is an inherent tension between
testimony and evidence that counterforensics as a resistant and activist
practice seeks to harness by making the material speak in its own terms. As
method, forensic architecture seeks a kind of “synthesis between testimony
and evidence” that takes up the lessons of the forensic turn in human rights
investigations to see testimony itself as a material practice as well as a lin-
guistic one.60 Barely detectable traces of violence can be marshaled through
the forensic process to become material witnesses, or evidentiary entities. But
evidence cannot speak for itself: it depends on the human witness. Evidence
and testimony are closely linked notions, not least because both demarcate an
object: speech spoken, matter marked. Testimony can, of course, be entered
into evidence. But something more fundamental is at work in Triple Chaser.
Its machine learning model doesn’t simply register or represent. It is opera-
tive, generating relations between objects in the world and the parameters
of its data. Its technical assemblage precedes both evidence and testimony.
It engages in nonhuman witnessing. Triple Chaser brings the registering of
violations of human rights into an agential domain in which the work of wit-
nessing is necessarily inseparable from the nonhuman, whether in the form
of code, data, or computation.
As development commenced, Triple Chaser faced a challenge: Forensic
Architecture was only able to source a small percentage of the thousands
of images needed to train a machine learning algorithm to recognize the
tear gas canister produced by Safariland. They were, however, able to source
detailed video footage of depleted canisters from activists and even obtained
some material fragments. Borrowing from strategies used by Microsoft,
Nvidia, and others, the canisters captured in this footage could be modeled in
environments built in the Unreal gaming engine, which were then scripted to output thousands of canister
images against backgrounds ranging from abstract patterns to simulated
real-world contexts (figure 2.3). Tagging of these natively digital objects also
sidestepped the labor and error of manual tagging, allowing a training set to
be swiftly built from images created with their metadata attached (figure 2.4).
Using several different machine learning techniques (including transfer
learning, combining synthetic and real images, and reverse discriminators),
investigators were able to train a neural network to identify Safariland tear
gas canisters from a partial image with a high degree of accuracy and with
weighted probabilities. These synthetic evidence assemblages then taught the
algorithm to witness.
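One way to read the training techniques listed above is through a transfer learning workflow: a network pretrained on a generic image corpus is fine-tuned on a mixed synthetic-and-real image set. The PyTorch sketch below is a generic illustration of that pattern, not Forensic Architecture's actual pipeline; the dataset paths, backbone choice, and class count are assumptions.

# Minimal transfer-learning sketch: reuse a pretrained backbone and fine-tune
# its final layer on a small, mixed synthetic-and-real image set.
# Dataset layout and paths are hypothetical; this is not the Triple Chaser code.
import torch
import torch.nn as nn
from torchvision import datasets, models, transforms
from torch.utils.data import ConcatDataset, DataLoader

preprocess = transforms.Compose([
    transforms.Resize((224, 224)),
    transforms.ToTensor(),
])

# Two folders of labeled images: renders from the game engine and activist photos.
synthetic = datasets.ImageFolder("data/synthetic", transform=preprocess)
real = datasets.ImageFolder("data/real", transform=preprocess)
loader = DataLoader(ConcatDataset([synthetic, real]), batch_size=32, shuffle=True)

# Start from ImageNet weights; freeze the convolutional backbone.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for param in model.parameters():
    param.requires_grad = False

# Replace the classifier head: here, canister vs. not-canister (assumed classes).
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

model.train()
for epoch in range(5):
    for images, labels in loader:
        optimizer.zero_grad()
        loss = loss_fn(model(images), labels)
        loss.backward()
        optimizer.step()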
Like most image recognition systems, Triple Chaser deploys a convolu-
tional neural network, or cnn, which learns how to spatially analyze the
pixels of an image. Trained on tagged datasets, cnns slide—convolve—a
series of filters across the surface of an image to produce activation maps that
allow the algorithm to iteratively learn about the spatial arrangements of pix-
els, a process repeated across large sets of images. These activation maps
are passed from one convolution layer to the next, with various techniques
applied to increase accuracy and prevent the spatial scale of the system from
growing out of control. Exactly what happens within each convolutional
layer remains in the algorithmic unknown: it cannot be distilled into repre-
sentational form but rather eludes cognition.61 Machine learning processes
thus exhibit a kind of autonomic, affective capacity to form relations between
objects and build schemas for action from the modulation and mapping of
those relations: machinic affect. Relations between elements vary in intensity,
with the process of learning both producing and identifying intensities that
are autonomous from the elements themselves. It is precisely this that cannot
be “visualized” or “cognized.” Intensive relations assemble elements into new
aggregations; bodies affect and are affected by other bodies. Amoore writes
that algorithms must be understood as “entities whose particular form of
experimental and adventurous rationality incorporates unreason in an in-
tractable and productive knot.”62 Reflecting on economic self-interest and the
false grounds of rational choice, Massumi points out that “rationalities are ap-
paratuses of capture of affectivity.”63 Machine learning works in concomitant
ways. There is an autonomic quality to such algorithmic knowledge-making,
more affective than cognitive. This machinic registering of relations accu-
mulates to make legible otherwise unknown connections between sensory
data, and it does so with the potential (if not intention) for that registering
to make political claims: to function as a kind of witnessing of what might
otherwise go undetected.
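The convolutional process described above can be made concrete with a small sketch. The toy network below, written in PyTorch as a generic illustration rather than the Triple Chaser model itself, shows filters convolving across an image, producing activation maps that pass from one layer to the next, with pooling used to keep the spatial scale from growing out of control; the layer sizes and class count are assumptions.

# Toy convolutional neural network: filters slide across the image, each
# convolution layer passes activation maps to the next, and pooling keeps
# the spatial dimensions in check. Layer sizes are illustrative assumptions.
import torch
import torch.nn as nn

class ToyCNN(nn.Module):
    def __init__(self, num_classes: int = 2):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1),   # 16 filters convolved over RGB input
            nn.ReLU(),
            nn.MaxPool2d(2),                               # halve spatial scale
            nn.Conv2d(16, 32, kernel_size=3, padding=1),   # deeper activation maps
            nn.ReLU(),
            nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 56 * 56, num_classes)  # for 224x224 inputs

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        maps = self.features(x)                  # stacked activation maps
        return self.classifier(maps.flatten(1))  # class scores (e.g., canister / not)

# A single 224x224 RGB image passes through the network.
scores = ToyCNN()(torch.rand(1, 3, 224, 224))
print(scores.shape)  # torch.Size([1, 2])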
figure 2.3. Four variations of synthetic media from Triple Chaser, Forensic
Architecture, 2019. Courtesy of Forensic Architecture.
figure 2.4. Applying weathering and wear effects to synthetic canisters, Forensic
Architecture, 2021. Courtesy of Forensic Architecture.
Underpinning the project is the proposition that social media and other
image platforms contain within them markers of violence that can and
should be revealed. For the machine learning algorithm of Triple Chaser, the
events to which it becomes responsible are themselves computational: ma-
chinic encounters with the imaged mediation of tear gas canisters launched
at protesters, refugees, and migrants. But their computational nature does
not exclude them from witnessing. With so much of the world now either
emergent within or subject to computational systems, the reverse holds true:
the domain of computation and the events that compose it must be brought
within the frame of witnessing. While the standing of such counterforensic
algorithms in the courtroom might—for now—demand an expert human
witness to vouch for their accuracy and explain their processes, witnessing
itself has already taken place long before testimony occurs before the law.
Comparisons can be drawn to the analog photograph, which gradually be-
came a vital mode of witnessing and testimony, not least in contexts of war
and violence.64 Yet, despite its solidity, the photograph is an imperfect wit-
ness. Much that matters resides in what it obscures, or what fails to enter the
frame, as in the nonhuman witnessing of Aleppo’s aftermaths that I examined
in the last chapter. With the photograph giving way to the digital image and
the digital image to the generative algorithm, the ambit of witnessing must
expand. As power is increasingly exercised through and even produced by
algorithmic systems, modes of knowledge-making and contestation predi-
cated on an ocular era must be updated for an age of more overt and complex
machinic affect-ability. Forensic Architecture’s work is also a potent reminder
that nonhuman witnessing is a matter for galleries and activist politics as
much as the courts, providing the aesthetic means for the human to compre-
hend its constitutive entanglement with the non. Even if the law resists the
displacement of the human, art does not.
As Triple Chaser demonstrates, algorithmic witnessing troubles both
relations between witness and evidence and those between witnessing
and event. This machine learning system trained to witness via synthetic
datasets suggests that the linear temporal relation in which evidence—the
photograph, the fragment of tear gas canister—is interpreted by the human
witness cannot or need not hold. Through their capacities for recognition and
discrimination, nonhuman agencies of the machinic system enact the witness-
ing that turns the trace of events into evidence. Witnessing is, in this sense,
a relational diagram that makes possible the composition of relations that in
turn assemble into objects that can be experienced. If witnessing precedes
both evidence and witness, then witnessing forges the witness rather than the
figure of the witness granting witnessing its legitimacy and standing.
While this processual refiguring of witnessing has ramifications for non-
human agencies and contexts beyond the algorithmic, Forensic Architecture’s
movement into this space suggests the strategic potential for an alternative
politics of machine learning. In the four years since the release of Triple
Chaser, Forensic Architecture has extended their use of machine learning
to other investigations, such as identifying Russian tanks in Ukraine.
While I firmly believe that skepticism toward the emancipatory and resistant
potential of machine learning and algorithmic systems more generally
is warranted, there is also a strategic imperative to ask how
such systems can work for people rather than against them. With its tools,
techniques, and synthetic media databases all made open source, Forensic
Architecture aims to democratize the production of evidence through the
proliferation of algorithmic witnessing that works on behalf of ngos, activ-
ists, and oppressed peoples, and against the technopolitical state. This inves-
tigative commons becomes an intensive field for nonhuman witnessing, in
which the entangled agencies of machines and humans work to register and
make addressable otherwise elusive violence.
In June 2018, word spread inside Google that the company was partnering
with the US Department of Defense (DoD) to apply its artificial intelligence
expertise to the identification of objects in drone footage. A week later, the
same news broke on the tech site Gizmodo. Within days, Google had with-
drawn its engagement and released a set of principles for ai development
that precluded working on weapons systems, although with plenty of wiggle
room for other defense and national security applications.65 The controversy
marked a new notoriety for Project Maven, the code name for the Algorith-
mic Warfare Cross-Functional Team (awcft) created in April 2017 by order
of the Deputy Defense Secretary Robert Work. Its stated aim was to “turn
the enormous volume of data available to DoD into actionable intelligence”
with an initial focus on providing “computer vision algorithms for object
detection, classification, and alerts” in full-motion video from drone sys-
tems.66 The awcft had a mandate to “consolidate existing algorithm-based
technology initiatives related to mission areas of the Defense Intelligence
Enterprise, including all initiatives that develop, employ, or field artificial
intelligence, automation, machine learning, deep learning, and computer
vision algorithms.”67 Not only would the team seek partnerships with Silicon
Valley, it would also adopt tech industry development techniques, such as
iterative and parallel prototyping, data labeling, end-user testing, and algo-
rithm training.68 In a reversal of the Pentagon's typical hierarchical, drawn-
out, and multiyear technological development processes, Project Maven
would be agile. It would fail often and learn quickly; move fast and break
things—but with weapons systems.
Military secrecy makes it impossible to determine even the approximate scale
of the data requiring analysis. Media reports suggest that the propor-
tion of drone sensor data currently analyzed by humans represents a tiny
fraction. An article in Wired cites DoD officials claiming that 99 percent of
all drone video has not been reviewed.69 Project Maven boss General John
Shanahan is quoted as saying that twenty analysts working twenty-four hours
a day are able to successfully analyze—exploit, in military lingo—around 6 to
12 percent of imagery from wide-area motion sensors such as the argus-is
persistent surveillance platform discussed in chapter 1. Project Maven aimed
to bring ai analysis to the full-motion video data from the drone platforms
doing much of the surveillance work against isis in Iraq and Syria: the mq-1c
Gray Eagle and the mq-9 Reaper. By February 2017, DoD had decided that
deep learning algorithms should ultimately be able to perform at near-human
levels but recognized that to do so meant working at scale. In its initial scop-
ing, Project Maven was intended to enable several autonomous functions,
including identifying thirty-eight different classes of objects, reverse image
search, counts within bounded boxes and over time, and selective object
tracking. It would integrate with Google Earth, ArcGIS, and other geographic
information systems (gis). Building datasets able to train machine learning
systems would require human tagging of huge amounts of data. According
to media reports, Project Maven outsourced much of this to the piecework
platform Figure Eight (formerly CrowdFlower), providing unclassified and
nonviolent images with instructions to draw and label boxes around various
objects. Combined with classified imagery tagged by internal analysts, this
data could train the convolutional neural network algorithms to identify
and classify objects within video feeds, using iterative training and testing
techniques honed in the tech industry.
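The "drawing and labeling of boxes" that such piecework platforms coordinate typically reduces to simple structured records attached to each frame. The snippet below is a generic, hypothetical example of what one such annotation might look like; the field names and categories are illustrative rather than Project Maven's actual schema.

# Hypothetical bounding-box annotation for a single video frame, of the kind
# produced by human taggers and consumed by a training pipeline.
annotation = {
    "frame_id": "frame_000731",
    "image_width": 1280,
    "image_height": 720,
    "objects": [
        {
            "category": "vehicle",          # illustrative class label
            "bbox": [412, 238, 96, 54],     # x, y, width, height in pixels
            "tagger": "crowd_worker_17",    # anonymized annotator reference
        },
        {
            "category": "building",
            "bbox": [780, 120, 310, 260],
            "tagger": "internal_analyst_03",
        },
    ],
}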
Stored as ones and zeroes demarcating the position and color of pixels
and accompanied by crucial metadata that makes them legible to the com-
putational system, these images are optical only in potential. Unless called
figure 2.5. Still from “OpenAI Plays Hide and Seek . . . and Breaks the Game!,”
OpenAI, 2019
witnessing ecologies
In the communiqué from the Fiftieth Pacific Islands Forum, held in Tuvalu
in August 2019, leaders from the region “reaffirmed climate change as the single
greatest threat to the livelihoods, security, and wellbeing of the peoples of the
Pacific,” but stopped short of calling for significant and immediate action. In
the Kainaki II Declaration that accompanied the formal communiqué, coun-
tries are called to “reflect” on transitioning away from coal rather than banning
its use, “meet or exceed” national emissions reductions rather than creating new
and more ambitious ones, and continue “efforts towards” meeting international
climate-funding promises rather than demanding urgent and ambitious
commitments.1 According to media reports, Australia successfully stymied
efforts for a much bolder declaration, reducing Prime Minister Akilisi Po-
hiva of Tonga to tears and prompting leaders from Fiji, Vanuatu, and Tuvalu
to make heated remarks about their more powerful neighbor. “We came
together in a nation that risks disappearing to the seas, but unfortunately we
settled for the communiqué,” said Fiji’s prime minister, Frank Bainimarama.
“Watered-down climate language has real consequences—like water-logged
homes, schools, communities, and ancestral burial grounds.”2 The ire of
Bainimarama and others was directed primarily at then Australian Prime
Minister Scott Morrison, a man who once triumphantly brandished a lump
of coal in Parliament, only reluctantly accepted the science of climate change,
and stalled progress to limit emissions and develop renewable energies at
every opportunity, achieving the ignominious distinction of Australia rank-
ing dead last among 170 states analyzed in a 2021 un report on climate ac-
tion.3 At the forum, it seemed Morrison wanted every dollar Australia spent
in the Pacific to be recognized, but refused to commit to any action that
might slow the rising seas threatening to swallow Tuvalu and other islands.
Much can be said of events such as this and the warped politics of climate
change, the enduring inequalities that underpin the failure to act by wealthy
nations, and the histories of colonialism, clientelism, and militarism that
shape the present Pacific. Just as the Marshall Islands and other nations in
the Pacific were crucial sites for nuclear testing throughout the Cold War,
so too are they now the canaries in the mineshaft of climate change. Indeed,
Elizabeth DeLoughrey points out that “climate science and nuclear weapons
testing have an intimate relationship,” as the tools and techniques for un-
derstanding the atmosphere developed for war were applied to establishing
carbon baselines and monitoring their change.4 Climate crisis is thus sutured
to “catastrophic ruptures to social and ecological systems” that “have already
been experienced through the violent processes of empire” and continue in
the ongoing, unnamed imperialism of regional geopolitics.5 Climate is itself
increasingly a military problem, securitized by planners in ways that have
little regard for the wellbeing of populations most subject to it.6 When the
Islander leaders of the Pacific juxtapose Australia’s domestic energy pricing
concerns with the erasure of life, culture, and community, it makes clear
that trauma is not registered as an individual experience but as an ecological
phenomenon. Pleas for an acceptance of shared responsibility in the face of
drowning depend on shared witnessing, on opening onto impossible loss,
grief, and ecological trauma.
Among the most widely known evocations of the drowning islands of the
Pacific Ocean are the poems of Kathy Jetñil-Kijiner, a Marshallese spoken word
performance artist and writer. Her poem "Tell Them" includes these lines:
something else at work here beyond recognition that “the Marshallese are
both humans and nonhumans."10 The repeated refrain of the poem's middle
stanzas—“tell them we are . . .”—intermingles human and non, “hollow
hulls” and “wood shavings,” “sweet harmonies” and “styrofoam cups of
koolaid red.”11 Distinctions of status slip away between “skies uncluttered”
and “dusty rubber slippers” as space, place, object, speech, and gesture
become entangled with the "we" of the poem. Here is a complex ecology,
one rendered sensible—able to be felt—through the rhythm and rhetoric of
the poem but not reducible to language. “We are” might also be an assertion
of ontology, of shared being-in and becoming-with the world that is slowly
being drowned. “Tell Them” is an allegorical call for climate justice, but
also an address to the nonhuman entanglements already rupturing in the
refracting wakes of catastrophic pasts and futures. Its witnessing demands
not only response-ability on the part of the state system, but also that the
rich ecologies of the Marshall Islands be recognized as response-able and
address-able. If this poem—and others like it—are calls to witness and acts
of testimony, then their mode of witnessing is nonhuman, animated by the
inextricable entanglements of being, land, living, community, ocean, and
culture.
This chapter coheres around the proposition that one way that ecologi-
cal trauma—complex, mutable, resilient, ephemeral, material, moving,
unsettling—comes into focus is through aesthetic works that undertake the
tentative, always incomplete project of nonhuman witnessing. In pursu-
ing this proposition, I attend to artistic and literary works that examine
ecological trauma through scale as a site of political contention and in
the existential rupturing of nuclear weapons. Violent mediations and ma-
chinic affects animate the martial and capitalist operations, events, and
technologies that concern this chapter, but here I shift my inquiry from
what animates assemblages of catastrophic violence to pursue traumatic
This chapter expands the focus of the book to think with more-than-human
ecologies that encompass land, sky, and water, rather than remaining within
the technocultural domains that have been its principal preoccupation. As
with its predecessors, this chapter travels with the doubling movement of
its title: the witnessing of ecologies and ecologies of witnessing. Understood
as complex systems of interacting and interdependent parts, ecologies are
constituted by relations between elements. Whether wrought in the split-
second fission of a nuclear bomb or the drawn-out temporality of radioac-
tive contamination, ecological violence strikes at the relational composition
of ecologies themselves. Uprooting a verdant tree to clear the way for a new
road is not ecologically violent simply because the tree itself is lost, but
rather because its removal tears at the fabric of the ecology within which it is
webbed. As Cubitt writes: “Ecologies are not networks connecting previously
separate things: Every element of an ecology mediates every other. Life medi-
ates nutrients and sunlight, storing, changing, growing, passing, mutating,
returning.”13 Media theorist Matthew Fuller makes the point that the word
ecology “is one of the most expressive language currently has to indicate
the massive and dynamic interrelation of processes and objects, beings and
things, patterns and matter.”14 But ecologies can also be brutal, particularly
once we extend their conceptual reach into the violent. “Geopolitics, enacted
through global war, is itself a form of life that pursues a savage ecology,” Grove
insists, “radically antagonistic to survival as a collective rather than discrimi-
natory goal.”15 Ecologies are not inherently moral, but are rather inescapably
political on a planet shaped by Man.
Conceiving of media as ecological and ecologies as medial provides a
conceptual apparatus through which to examine, in the context of ecological
violence and its attendant traumas, the communicative mode I am calling
“nonhuman witnessing.” As this chapter argues, nonhuman witnessing can
become a reparative response to ecological trauma, the state of wounded
survival that follows in the wake of ecological violence. But it also responds
to a deeper historical rupture between human and nonhuman, a cleaving of
Man from Nature that is rooted in Platonic and Aristotelian thought and thus
inherent to the ascendance of Anthropos, even before its violent intensifica-
tion as Wynter’s Man, which I discussed in the introduction and will return
to in the coda. “The more humans defined themselves over against nature,”
Cubitt observes, “the more they defined nature over against themselves, in
this way formalizing and enforcing the split between the natural environment
witnessing scale
With its roots in the Latin scala, meaning ladder or stairs, scale refers to de-
fined relations of space, time, or quantity between one thing and another. A
musical scale sets out the relations between one note and a series of others,
while a cartographic scale defines the ratio of distances on a map to those
on the earth. As Timothy Clark writes, scale “enables a calibrated and useful
extrapolation between dimensions.”35 Scale, then, is one means of making
instrumental and practical sense across difference, a means of managing rela-
tions between one thing and another. Scale helps anchor perception in worlds
that extend beyond the perceptual reach of the human sensorium; it enables
one to conceive of entities far bigger or smaller, say, than can be contained
within the human visual field. This is one of the promises of remote sensing:
not only to extend perception to atmospheric or underwater viewpoints, but
also to enable sensing at spatial and temporal scales that exceed the human.
As Fuller points out, “A ‘scale’ is something that operates at one level in what
might be thought of as an infinite zoom, were a camera to be built that
could be sensitized to elements as diverse as practices, institutions, atomi-
cal structures, weather patterns, linguistic formations, protocols, transport
infrastructures, a glance.”36 High-resolution satellite imagery thus not only
enables breadth of perception but also depth through the capacity to zoom
imagery down to the half meter and even smaller. Scale is an epistemologi-
cal tool, a means of organizing the world and its causal relations. It does not
inhere in any given entity but is an imposed relationality between one thing
and another. At the same time, “a scale provides a certain perspectival optic
by which dimensions of relationality and other scales may be 'read.'"37 This
means scale can be intensely political because it constructs relations between
entities and processes and, in doing so, can become bound up with questions
of agency.
Defining our present geologic era as the Anthropocene, argues the post-
colonial historian Dipesh Chakrabarty, shifts the scale at which human agency
operates: "To call human beings geological agents is to scale up our imagi-
nation of the human . . . to attribute to us a force on the same scale as that
released at other times when there has been a mass extinction of species."38
But climate change is not only about happenings at the scale of the planet or
even the capacity of the human to have effects at the planetary scale. Rather,
Clark argues that it involves “an implosion of scales, implicating seemingly
trivial or small actions with enormous stakes” even as disciplinary, ideological,
figure 3.2. Still from “Open Air,” Grayson Cooke collaboration with
Emma Walker, 2018. Courtesy of the artist.
ecological trauma
Rising panic in the West over the “end of the world” often fails to recognize
already existing experiences of ruined lifeworlds. Nor do enough planned
or imagined responses to the climate emergency give heed to the ontologies,
epistemologies, and practical knowledges of those people who lived far more
sustainable lives before and despite settler colonialism. Ecological catastro-
phe has already been experienced by First Nations: the anthropogenic end of
worlds is, all too terribly, nothing new. Through violence to knowledge, land,
and ways of living, as Kyle Powis Whyte argues, “settler colonialism commits
environmental injustice through the violent disruption of human relation-
ships to the environment.”53 Felling forests to graze cattle and grow crops,
introducing invasive species, diverting rivers and flooding valleys, flattening
hills and bifurcating mountains with highways—the list of such disruptions is
endless. Nor, of course, are such ecological traumas confined to the past. En-
vironmental destruction, loss of traditional forms of community, and death
itself all flow from resource extraction, weapons testing and war, plantation
agriculture, and other forms of what Rob Nixon calls the “slow violence” of
late capitalism, inflicted on the poor, oppressed, and dispossessed.54
Ecological trauma describes the injurious and ongoing effects at the level
of experience of the rupturing of relations that compose ecologies as living
and changing assemblages of more-than-human entities and processes. All
traumas target relations, severing encounters or events from the flow of ex-
perience and lodging those fragments in bodies as they go on, affecting and
affected by the world as it unfolds. But ecological trauma can be understood
as trauma that results from the rupturing of the relations that compose an
ecology, rather than those that enmesh a body within its world. Located at
the relational-compositional level of the ecology itself, ecological trauma
echoes collective cultural trauma, but is differentiated by its insistence on
nonhuman entities and the situatedness of all ecologies and their relations.
Like trauma more generally, ecological trauma is found not in the violence
that enacts a rupturing of relations but in how that rupturing carries through
into the f uture. Contaminating the unfolding multiplicities of experience
that animate an ecology with the past, ecological trauma is also haunted by
Yankunytjatjara elder Lester Yami called it a “black mist,” a thick cloud en-
veloping Adnyamathanha country, part of a huge swathe of Aboriginal land
in South Australia used for nuclear testing by Britain from 1953 to 1963.60 He
described his experience to the 1984 Royal Commission into the tests: “A big
bang—a noise like an explosion and later something come in the air . . . [it]
was coming from the south, black-like smoke. I was thinking it might be a
dust storm, but it was quiet, just moving . . . through the trees and above that
again, you know. It was just rolling and moving quietly.”61
Personally authorized by Prime Minister Robert Menzies and conducted
in secret, British nuclear testing in Australia took place on the Montebello
Islands (in 1952 and 1956), at Emu Fields in South Australia where Lester
Yami encountered the black mist (1953), and, most infamously, just to the
south of Emu Fields at Maralinga (1956–1963).62 Emu Fields was a particu-
larly disastrous choice: difficult to access by vehicle and prone to violent dust
more injurious and rupturing than what might be denoted by damage to “en-
vironment” within Western epistemologies. As Tynan continues: “Country
sits at the heart of coming to know and understand relationality as it is the
web that connects h umans to a system of Lore/Law and knowledge that can
never be human-centric.” Country is thus radically at odds with what Aileen
Moreton-Robinson calls the "possessive logics" of white settler sovereignty,
which claim land as property and thus render it always potentially subject to
extraction and violence.72
While the British authorities made efforts to mitigate the effects on white
farmers, the Aboriginal inhabitants of the region were almost entirely ne-
glected. Aboriginal culture, history, lifestyle, and ceremony were not con-
that is just as crucial to the ecology as the electromagnetic forces that hold
neutrons in check are to stability at the atomic scale.
Unstable isotopes are radioactive: they contain an unbalanced combina-
tion of neutrons and protons in their nucleus, which typically means too
many neutrons. By shedding extra energetic particles, these isotopes "decay"
into other particles, becoming more stable and less radioactive but releasing
nuclear radiation in the process. When a nuclear bomb is detonated, radio-
active particles are dispersed by the explosive force, attaching themselves
in turn to other particles. This is nuclear fallout: the irradiated particles of
weapon debris and dust that are carried on the wind, as Death Zephyr reminds
us, before they fall to earth. In their fall, they can attach and deform more
particles and the cells that make up life, such that stones, plants, animals, and
people become carriers of contamination, nonhuman and doomed witnesses
to nuclear catastrophe.
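The temporal reach of fallout follows directly from the mathematics of decay. As a brief aside in standard notation, not drawn from the sources cited here: an isotope with decay constant $\lambda$ diminishes exponentially, and its half-life fixes how long contamination persists.

\[
  N(t) = N_0 e^{-\lambda t},
  \qquad
  t_{1/2} = \frac{\ln 2}{\lambda}.
\]

For example, caesium-137, with a half-life of roughly thirty years, retains about $(1/2)^{70/30} \approx 20$ percent of its activity after seventy years, while plutonium-239's half-life of about 24,000 years makes contamination effectively permanent on human timescales.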
Some of the most devastating effects at Maralinga were not the bombs
themselves, but the “minor tests” involving the detonation of scattered pluto-
nium and other radiation “safety” experiments. Depending on the half-lives
of the isotopes involved, radioactive contamination might be present for
minutes, days, or years.84 Radioactive contamination can have enduring ef-
fects: making soil and water poisonous, producing cancers and miscarriages,
figure 3.10. Missile Park interior, Yhonnie Scarce, 2021. Courtesy of Yhonnie
Scarce and THIS IS NO FANTASY.
years, at least in theory. In practice, isv was difficult to implement and not
always fit for the task; the government department overseeing the process failed
to establish clear criteria; and, surprising no one familiar with the history
of the state's treatment of Indigenous peoples in Australia, once costs grew,
vitrification was abandoned in favor of exhuming and burying the waste.
Glass as a failed medium of remediation testifies to the unyielding nature of
nuclear radiation, but also to the persistent coloniality of settler politics, to
the legacies of who counts as human and who does not. As a byproduct of
nuclear testing and as a failed mechanism for decontamination, vitrification
is a process of mediation: silica into glass, by way of the intense applications
of energy. Its violence is not inherent, but contextual. Through the breath of
the glassblower, vitrified silica becomes intimate and lively: a rich ecology
of country, life, fruit, vegetable and yet still an ecology deeply wounded by
the violence of war and settlement. Glass yams and bush plums distill the
ecological traumas of nuclear testing at Maralinga, the stuff of life rearranged
into the mushroom cloud and its dispersal and memorialized in the tin sheds
of the test sites.
What it means to witness such ecological trauma looks very different
within the accepted bounds of historical witnessing, particularly in the of-
ficial form it took in the Royal Commission into British Nuclear Tests in
Australia, chaired by Jim McClelland. While the commission sought to ac-
count for the health impacts on Aboriginal people and heard the testimony
of Lester Yami and others, its principal focus was the irradiation of Austra-
lian servicemen, the safety precautions implemented by the British, and the
nature of the agreement between Australia and its imperial overlord. From a
cultural standpoint, Yami’s black mist is surely its enduring figure, one that
finds a glassy counterpart in the art of Yhonnie Scarce. Nuclear activism and
public pressure in the 1980s did much to make Maralinga and Emu Fields
visible to the wider Australian public, and in 2009 almost all the lands of the
excision were returned to their Traditional Owners. But the Royal Commis-
sion, the failed cleanups that followed, and the narrow inquiries from various
departments and committees function as stark reminders of the impossibility
of such organs of the settler state working against its fundamental invest-
ments in militarism and the denial of Indigenous sovereignty. Within such
confines, the capacities of witnessing are bound not only by the necessity of
speaking but also by legal norms and parliamentary terms of reference.
A more expansive witnessing must be sought elsewhere, in the poetry of
Indigenous writers such as Oodgeroo Noonuccal, Lionel Fogarty, and Natalie
Harkin, and even in the inventive research of scientists, who have exposed
wounding
In The Logic of Sense, Gilles Deleuze recognizes that futurity resides at the
heart of the event and its relation to human expression. The event is "always
and at the same time something which has just happened and something
which is about to happen; never something which is happening."90 This
ecologies of witnessing
witnessing absence
first absence: the execution of james foley
On August 19, 2014, the Islamic State in Iraq and Syria (isis) uploaded a
video titled “A Message to America” that depicted the beheading of the
kidnapped journalist James Foley. Despite being swiftly pulled by YouTube,
the video and gruesome stills from it circulated on social media, news sites,
web forums, and shock galleries. Shot in crisp high definition, the video was
slickly produced and professionally edited. Deviating from the grainy foot-
age and awkward staging of executions filmed in Afghanistan or Iraq in the
years after 9/11, it had a consciously contemporary aesthetic. After a long
message addressed to President Obama, Foley appears on his knees, dressed
in orange. Behind him is a black-clad and masked executioner, around them
blasted desert and stark sky. The beheading itself lasts only ten seconds;
yet the moment of death is not shown. It occurs off-camera, disappeared
in the digital cut. A knife saws, but there is no blood. There is only the body,
the head. The cuts shown are staged, experts say. Death itself is absent, but
radically so—despite not occurring on camera, it is everywhere in evidence.
Reflecting on the recurrent moment of the cut in photography, Kember and
Zylinska ask what it might mean to cut well, to cut in a way that entails a
vital, creative ethics.1 But what might it mean to cut poorly, to cut the clumsy
cut? For the digital cut to cut out the cutting of the body? In this gruesome
portrait of death without the moment of dying, there is an absence within
an absence—yet one that has a presence in the digital contagion of traumatic
affect. Perhaps the killing was botched, the blow of the sword too weak or at
the wrong angle.
Or perhaps the cut was too bloody, too grotesque. After all, the video’s pur-
pose was not only to incite shock, but also to recruit—to catch the disaffected,
the angry, the alienated and offer purpose through blood and violence.2 This
is a video that aims to traumatize, but also to speak to and through trauma.
As such, it is perhaps best understood as an image of digital war that exempli-
fies, as Andrew Hoskins and Shona Illingworth write, “a shift in the trauma of
civilians from a memory of the past to a perpetual anticipation of the threat
of the future, subjecting increasing numbers of people to unending physical
and psychological incarceration in a traumatizing present.”3 To watch such a
thing must be brutally visceral—but I don't know, I haven't seen it. Like the
deferred moment of death itself, I held back from an active participation in
its affective economy and have encountered it only in stills and secondhand
accounts, mediations of a mediation. Yet my resistance to seeing the video
does not prevent its forcefulness from making its mark: there is an urgent
affectivity in its absence, even now.
Despite its wide circulation, the beheading of James Foley—and later of
Steven Sotloff and others—produced a radical absence. Its absence resided
in the anxiety it engendered, the anxiety of potentially encountering the visual
force of war’s violence. An errant click, the wrong news article, a social media
post that slips through the controls instituted by Twitter or Facebook—to
encounter these videos would be so easily done, a simple digital stumble or
the caprice of an algorithm. Crowding virtuals of affect, accumulating po-
tentials on the verge of becoming actual: an affective-traumatic atmosphere.
Brutal violence had infected the everyday of the digital. Who could say how
or where it had proliferated? The mythology of digital permanence, the no-
tion that whatever words or images of ourselves find their way online stay
there, resonated with the video’s disappearance. It was always potentially
appearing, even when it never arrived. Already testimonial texts that bear
witness to political murder, such videos circulate in search of co-witnesses,
dependent on news values, browsing habits, and algorithmic recommenders.
Less than an hour after take-off on March 8, 2014, somewhere over the South
China Sea, Malaysian Airlines Flight 370 made its last contact with air traf-
fic control at 1:21 a.m. local time. Flying from Kuala Lumpur to Beijing was
meant to take less than six hours, but the Boeing 777 was only seen again
traumatic affect
Media are far more than surfaces on which trauma is inscribed. As Amit
Pinchevski argues, we can think of "the traumatic as something that is
made manifest through media technological rendering,” rather than some-
thing that is simply represented in media.19 If radical absence begins with the
failure of the eyewitness to witness, an epistemological failure to translate
the registering of an event into knowable form, its continued existence as a
forceful absence on the plane of experience depends on more-than-human
processes of mediation. Mediation and trauma both share an uncertain re-
lationship between past and present, between presence and absence, and
between proximity and distance. As such, “media constitute the material con-
ditions for trauma to appear as something that cannot be fully approached
and yet somehow must be."20 At the level of process, technical media contain
within their own constitution the paradoxes that make trauma overwhelm-
ing: media are always entangled with experience, yet also insist on their
separateness. “Media matter,” writes Cubitt, “both in the sense of giving
material specificity to our descriptions of such abstract concepts as society
and environment, and in the sense of the active verb: mediation comes into
being as matter, its mattering constitutes the knowable, experienceable world,
making possible all sensing and being sensed, knowing and being known.”21
In mid-2012, Jessa Moore logged onto Facebook and learned that her friend
Anthony Dowdell had killed himself. She and others began to post memories
and photos, to tag him at restaurants or bars. “Facebook became our memo-
rial,” she said. “We could leave messages for him and each other.” Facebook
became a site of shared mourning, but also a way to keep memories alive—
even as it continually reminded Jessa of her friend’s absence from her life.31
Almost a decade later, Jessa's experience is far from unique, as I and many
others can attest, but her story, told in a widely read article in the Huffington
Post, marks an early incursion into media discourse of death on Facebook
and its weird affects. Estimates suggest that upward of thirty million Face-
book profiles have outlived the people who created them, with around eight
thousand users passing away every day. In 2019, Carl J. Öhman and David
Watson published a statistical projection of the accumulation of profiles from
deceased users, using country and age data scraped from the Facebook api
in conjunction with country mortality rates. Their findings suggest that up to
4.9 billion dead users could populate Facebook by 2100, leading the research-
ers to call for a new, scalable, and sustainable model for preserving the data
of the deceased.32 Already, a microindustry has emerged to manage digital
estates, wrapping up accounts, tracing assets, and passing on data.33 On Face-
book, friends or family access accounts and make them inactive, or provide a
death certificate to Facebook to have the account officially "Memorialized,"
transforming the profile into a commemoration to which existing Facebook
friends can post but which otherwise remains unchanged.34
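Öhman and Watson's projection is, at its core, an accumulation exercise: for each age cohort of current users, apply mortality rates year by year and sum the profiles left behind. The sketch below reproduces that logic in schematic form with invented numbers; the cohort sizes and mortality rates are placeholders, not the study's data.

# Schematic projection of accumulated deceased-user profiles, in the spirit of
# Öhman and Watson (2019). All figures below are invented placeholders.

# Current users by age band (millions) and a flat annual mortality rate per band.
cohorts = {
    (18, 29): {"users": 800.0, "annual_mortality": 0.001},
    (30, 49): {"users": 1100.0, "annual_mortality": 0.003},
    (50, 69): {"users": 600.0, "annual_mortality": 0.012},
    (70, 99): {"users": 200.0, "annual_mortality": 0.060},
}

def project_dead_profiles(cohorts: dict, years: int) -> float:
    """Accumulate deceased profiles over `years`, assuming no new sign-ups."""
    dead_total = 0.0
    living = {band: c["users"] for band, c in cohorts.items()}
    for _ in range(years):
        for band, c in cohorts.items():
            deaths = living[band] * c["annual_mortality"]
            living[band] -= deaths
            dead_total += deaths   # profiles persist after their users die
    return dead_total

print(f"Projected dead profiles after 77 years: {project_dead_profiles(cohorts, 77):.0f} million")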
Others are simply left in place, digital presences that bear no clear marker
of absent life, as if the user has simply stepped away from the computer. Yet
Screen-based media are only one slice of the pervasive digital mediation of con-
temporary life, but their ubiquity means everything from homes and shopping
malls to buses and elevators has been infiltrated by the datalogical. To move
through such spaces is to have our attention demanded and diverted, with
digitized movement and sound calling us into a more temporal relation to
the visual and aural than the static imagery of the past allowed. This demand
for attention is also a bodily experience, from the adrenal surge that redirects
the body in gaming to the haptic signals of smartwatches. Augmented and
virtual reality hold the promise of interrupting our relation to the visual
field, layering data over what we see or replacing our immediate surrounds
entirely. Fantasies of neural link implants hint at a future of screen-body
fusion. Even now, interpersonal interactions slip between online and off, or
take place simultaneously in both domains.39 Smartphones and their ilk have
become what Bernard Stiegler calls “mnemotechnologies,” doing the work of
thinking, remembering, and processing our knowledge of the world.40 What
had been stable categories of causation no longer hold as relations between
objects, humans, and different media become increasingly fluid and relative.41
Even if there are antecedents for the transformative effects of digital net-
works in the long human history of mediations, such as the telegraph's col-
lapsing of distance or cinema's production of new modes of time, there can
be little doubt that recent decades have seen an accelerated evolution in the
imbrication of media technologies and human life. Nonhuman technics and
the human sensorium are increasingly enfolded; affects flow between the cor-
poreal and machinic, intensities surging across surfaces and substrates, mod-
ulating and shaping. In the words of Nigel Thrift, “There is no stable ‘human’
experience because the human sensorium is constantly being re-invented
as the body continually adds parts to itself; therefore, how and what is
On May 24, 2020, the mining giant Rio Tinto detonated two rock shelters
in the Juukan Gorge in the Pilbara region of Western Australia, destroying
sites sacred to the Puutu Kunti Kurrama and Pinikura (pkkp) peoples that
The Juukan Gorge is known to be a place where the spirits of our relatives
who have passed away, even recently, have come to rest. It is a place that
the very, very old people still occupy. Purlykuti has been specifically re-
ferred to by the old people as a place of pardu, which refers to the special
language only spoken during ceremonies in the Pilbara. Our elders state
that it is certain that the spirits are very disturbed, and their living relatives
are also upset at this. This is why Juukan Gorge is important. It is in the
ancient blood of our people and contains their dna. It houses history and
the spirits of ancestors and it anchors the people to this country.56
Their absence would remain unbearably present, even as the cascading after-
math of the blasts brought a rare moment of scrutiny and accountability for
extractive capitalism and its legal and political foundations.
Enabling the destruction were two proximate agents of what Aileen
Moreton-Robinson calls “the possessive logic of patriarchal white sover-
eignty”: the incompetence and negligence of Rio Tinto and the gross dis-
parities of Western Australia's Aboriginal Heritage Act 1972, under which the
destruction of the sites had been approved in 2013.57 On the Rio Tinto side,
the systematic sidelining of heritage reports and Traditional Owner con-
cerns became evident, facilitated—or so it was claimed—by the geographic
distance of the company’s executives in London from its mining activities in
Australia. This absence of communication protocols and heritage manage-
ment practices combined with an institutionalized disdain for traditional
witnessing absence
“All attachments are optimistic,” writes Berlant, and radical absence is itself a
form of attachment, for all the grief and death to which it attends: a witness-
ing relation with what has disappeared, an attachment to what is no longer
present that enables positive change.59 A witnessing of absence in the absence
of witnesses: such an attachment can be animated by traumatic affect yet
still spark a reparative movement—even if small, tentative, and threatened
by the very affectivity of the disappearances from which it might emerge.60
While far from a panacea and by no means a politics in itself, nonhuman
witnessing nevertheless widens the aperture from the human subject to
assemblages of human and nonhuman entities. Witnessing radical absence
is only possible due to the sheer materialities of networked infrastructures,
the algorithms and network protocols that enable the flow of machinic affect.
Witnessing radical absence means attending to those infrastructures, and to
the ecologies they disrupt, the wars they enliven, the extractive industries
they streamline. Witnessing absence in this way makes possible a differ-
ent kind of response to systemic oppression than the voice of the testifying
subject, or even the assembled evidentiary force of Schuppli’s material wit-
nesses. Witnessing absence asks that we hold onto the possibility of witness-
ing in nonnormative ways, working outside the frame of courts and public
contestation. If we accept Berlant’s proposition that all attachments contain
some element of optimism, then an intensive attachment to absence might
well contain within it new forms and dynamics of relation that contain new
toward a politics of nonhuman witnessing
covid-19 has shattered many of the fictions that sustained the global
order, racial capitalism, and the supremacy of Man. These pandemic years
have been a brutal reminder of the nonhuman agencies that impinge upon
and transform us and our ways of living in profound and immeasurable ways.
Writing on the growing intimacies with such nonhuman agencies in sites
such as post-Fukushima Japan, Kath Weston argues that “ecointimacies are
compositional,” born of the “growing conviction that creatures co-constitute
other creatures, infiltrating one another’s very substance, materially and
otherwise.”1 covid-19 is that most intimate of infiltrators, absorbed through
air and breath, accelerated and intensified by the desire to share social
and familial space, but even more so by an economic order that demands the
production and distribution of goods passed through human hands in tightly
packed spaces in which people have no choice but to breathe the same air. Re-
gimes of testing, the continual monitoring for new strains, the (re)instantiation
of borders of all kinds, the clear correlation between changing climate and
new diseases—the pandemic has forced us to confront our entanglements,
both with one another and with the nonhuman in all its technical and ecolog-
ical variety. More just and equitable futures for human life depend not only
on reckoning with covid-19 but also with the enduring crises from which
it is inseparable. Attending to the intimacies and estrangements through
which life is composed is crucial to that task. If the virus teaches us of our
constitutive entanglement within one another, it also insists on difference,
complexity, and the incommensurate opacities through which life coexists.
Produced by new cominglings of human and animal, the novel coronavirus
emerged from the forces of expansion, extraction, and enclosure that actual-
ize the compulsions of capital and its handmaiden the state.2 But even as the
pandemic disrupted the seemingly smooth flow of goods and people across
the globe, it also accelerated the datafication and informationalization of
life at all scales. From the profusion of Zoom meetings to the normalization
of population health surveillance to the redistribution of carbon emissions
away from air travel to data centers and compute resources, the pandemic
has intensified the constitutive contradiction of contemporary life between
the promise of a smooth and knowable World and the collective experience
of disjunctive, agonistic worlds. Collapsing the geopolitical into everyday
life, the stark inequities in access to covid-19 vaccines and treatments across
the globe—not to mention the very different capacity of wealthy nations to
weather the economic storms of lockdowns, deaths, and soaring health care
costs—are in turn reflected in the classist and racist application of restrictions
within polities, backed by police and militaries. Here in Sydney, for example,
armed police and active-duty soldiers were deployed en masse in the diverse
working-class suburbs in the southwest of the city, while residents of the
affluent east and north went largely untroubled. The biopolitics of health
management fused with an incipient necropolitics of militarized policing,
facilitated by the ontopolitical capacities of algorithmic analysis of the feral
transmission of the virus itself.
Politics as we know it is not equipped to deal with the intimacies of the
entangled and incommensurate, just as it is not equipped to reckon with
crises at the planetary scale. “Only a politics rebuilt on aesthetic principles,
that is, by remaking communications,” writes Cubitt, “offers the possibility
of changing the conduct of relations between human beings and nature, and
between both of them and the technologies that so profoundly and multifari-
ously mediate between them.”3 If the neoliberal moment of racial capitalism
has produced a fragmented and ad hoc politics based around the marketiza-
tion and informationalization of life, then an alternative politics must surely
begin with communication within and across difference. As I have argued
throughout this book, nonhuman witnessing is a distinctive communicative
modality, one in which difference is not a problem to be solved but rather
the grounds for flourishing. Many of the nonhuman entities and ecologies
traced in this book lack speech, or lack an inherent verbal or visual language
Many peoples and worlds know deeply the destructive force of the World:
damming rivers and flooding homelands in the name of progress; clearing
bush for farmland; blasting mountain, hill, and stone to extract fossil fuels;
dispossessing peoples and breaking apart families; and severing ties to land,
country, and kin. Even if the World that, like Man, overrepresents itself as
the totality of existence has come to an end as a plausible or coherent notion,
its death throes continue to wrack the planet and life on it in catastrophic
ways. No reckoning has yet been made, despite the urgency. Indebted to the
Zapatista slogan “a world where many worlds fit,” de la Cadena and Blaser
describe “the practice of a world of many worlds, or what we call a plu-
riverse: heterogeneous worldings coming together as a political ecology
of practices, negotiating their difficult being together in heterogeneity.”5 A
pluriversal reconception of coexistence—from World to worlds—is the task
at hand for that great swathe of humanity that has benefited from and maintained the fiction of Man. Pluriversality requires a new “political ontology,”
a “politics of reality” grounded in the presumption of “divergent worldings
constantly coming about through negotiations, enmeshments, crossings, and
interruptions.”6
Pluriversality confronts a dominant politics set sharply against the very
notion of many worlds. This politics “emerged (with science) to make a live-
able universe,” writes de la Cadena, “to control conflict among a single if cul-
turally diversified humanity living in a single scientifically knowable nature.”7
This political field depends on divisions between friend and enemy, as well as
between nature and culture. As de la Cadena argues, “These two antitheses—
between humanity and nature, and between allegedly superior and inferior
humans—declared the gradual extinction of other-than-human beings and
the worlds in which they existed.”8 To engage in politics, one had to be rec-
ognized within the hierarchical domain of Humanity—of Man—and not
assigned to Nature, a form of racialization many First Nations people have
been, and continue to be, subjected to. Pluralizing politics, then, is not simply
a question of inclusion within Man, but is to be found in the very dissolution
of such a notion to begin with. As I argued in the introduction, witnessing
has long operated as a coconspirator with Man, a guarantor to science, law,
religion, and culture of the coherence and cogency of the World. As I have
articulated the concept, nonhuman witnessing aims to break that binding of
the Witness to Man and, with it, Man to World.
This refiguring of witness and witnessing does not facilitate the smooth
aggregation of politics as usual with pluriversality but enables an adver-
sarial pluralism, in which noncontiguous and mutually exclusive worlds
can coexist—even if coexistence requires the end of the World of Man. Co-
existence depends upon contact and relationality, not mutual exclusion.
Incommensurate worlds can only coexist when contact with irreducible
difference is the condition for a relational politics. Attending to the nonhu-
man in witnessing is one way to “slow down reasoning and provoke the kind
of thinking that would enable us to undo, or more accurately, unlearn, the
single ontology of politics,” as de la Cadena puts it.9 Nonhuman witnessing
offers the means to trace how knowledge moves between or is animated
across many worlds in a situation in which media, like all resources, are finite.
Media and mediation hold the potential to generate the connective, com-
municative tissue between worlds. For Cubitt, communication constitutes
the ground of a renewed politics, a politics that reckons with the exclusion
of the nonhuman from the forms foisted on the world through the Enlight-
enment, colonialization, and marketization. To build alternative futures, the
witnessing opacity
how complexity, uncertainty, and the unknowable are erased and elided
through instrumental processes of mediation. Machinic affect names those
relational intensities that animate technoscientific apparatuses, ambivalent
to the human and otherwise relegated to the mere operation of technical
systems. Ecological trauma describes the rippling effects of the rupturing of
relations within more-than-human ecologies, many of which elude human
understanding and can only ever be partially made sensible to the ecological
system itself. Radical absence brings these questions of the incommensurate
into the quotidian experience of the digital and its nonhuman infrastruc-
tures, accounting for encounters with what has been rendered absent yet
remains forcefully present. These analytics thus engage with the necessary
opacity of existence, with the fundamental incapacity for entities to disclose
themselves to one another even when bound in relation.
Here, then, I arrive at a final doubled meaning: witnessing opacity, or the
nonhuman witnessing of opacity, and the opacity of nonhuman witnessing.
Nonhuman witnessing seeks to bring opacity into the space of witnessing, not
as a problem to be resolved but as a site of potential communicative relation.
At the same time, nonhuman witnessing is constituted by its own opacity,
its presence in zones of sensing and sense-making that cannot be decoded
or even identified at all. The dissolution of the human as privileged witness
depends on this potential for withdrawal from anthropocentric epistemology.
Modernity—with its Enlightenment and colonial underpinnings—demands
transparency, as Glissant argues: “This same transparency, in Western His-
tory, predicts that a common truth of Mankind exists and maintains that
what approaches it most closely is action that projects, whereby the world
is realized at the same time that it is caught in the act of its foundation.”12
Opacity works against this “reductive transparency.”13 It is not obscurity but
rather “that which cannot be reduced, which is the most perennial guarantee
of participation and confluence.”14 Opacity emerges with and is the condition
of new and old worlds alike. Opacity does not produce irreconcilable differ-
ence between cultures, languages, or ways of living but rather makes possible
the coexistence of multiplicities within a totality. “Opacities can coexist and
converge, weaving fabrics,” Glissant writes. “To understand these truly one
must focus on the texture of the weave and not on the nature of its compo-
nents.”15 This weave is Relation, or “what the world makes and expresses of
itself.”16 Glissant’s opening to Relation invokes “a poetics that is latent, open,
multilingual in intention, directly in contact with everything possible,” but in
his account is very much tied to human subjectivities and the traumas they
experience, particularly those of slavery’s Middle Passage. Glissant’s right to
tive, to become both response-able and address-able even while holding the
refusal that resides in the right to opacity.
Nonhuman witnessing seeks to bring into being the conditions for an otherwise by producing communicative relations across and within difference
that refuse to override the opaque and the incommensurate. Nonhuman
witnessing is an ecological mode of communication that arises from the fields
of relations that come together in the encounter between human and nonhu-
man, and most intensely so in contexts of violence, domination, and control.
By refusing the supremacy of Man the Witness as the figure through which
events obtain meaning or knowledge is produced, nonhuman witnessing
gives standing to diverse actors and entities, whether people denied humanity
or machinic intelligences or wounded ecologies in the aftermath of war. What
the nonhuman bears witness to might well be ruin, death, and trauma—and
the witness itself might be a perpetrator—but the fundamental implication
of nonhuman witnessing is to remake the human and the witnessing that we
do. Nonhuman witnessing can be mobilized to heal and empower, to bring
to light change in its emergence, and to insist on attending to voices, bodies,
patterns, and materialities denied standing in the present order. Nonhuman
witnessing is not a panacea, but rather a practice of forging relations with the
incommensurate. Its lure is becoming more human through the witnessing
of our constitutive nonhumanity.
The politics of nonhuman witnessing, then, is not one of rights, human
or otherwise. Expanding the domain of rights—granting rights to rivers and
other earth beings, for example—is a worthy enough endeavor but not one
that changes the conditions under which politics takes place. If a machine
were to bear witness as a rights-bearing subject, what rights would obtain to
it and what would their articulation mean for the rights that already accrue
to the “human”? Rights, for all the protections they provide, are part and
parcel of the existing order of racial capitalism and neoliberal governance,
guarantors of h uman privilege and individual autonomy within the epistemic
domain of the Enlightenment. Rather than extending rights that humans have to the nonhuman, the task at hand is to invite nonhuman subjectivities and agencies into the space of politics and, in doing so, seek to recompose
what politics is for the human. Cubitt again: “It is we ourselves who must
become other in order to produce an other world. The correlative is that we
wind the clock back, undo the damage, or raise the dead. But a world of
many worlds does require a communicative modality that reaches toward
the incommensurability of crowding worlds, even as it respects the necessity
of ineradicable difference. Rather than rights or democratic participation,
the politics of nonhuman witnessing concerns the emergent composition of
fields of relations out of which incommensurate collectivities and paradoxi-
cal knowledges might form. The politics of nonhuman witnessing is, in this
sense, an ecological poesis, an attunement to and calibration of the human
and the nonhuman that dwells in and with opacity. It is a politics of and for
the future, even as it provides the means to reimagine the past.
The politics of nonhuman witnessing is a politics of the commons, but
not the commons in a universal, global, or homogenous sense—rather it is
a profusion of commons, bound by their common commitment to neither
begin with nor seek to resolve homogeneity.21 Such a commons can only ever
be emergent and unfixed, since it must compose itself anew in the ongoing antagonisms, negotiations, sympathies, and alliances between worlds.
Commons are necessarily communicative. Nonhuman witnessing offers the
potential for a distinct communicative mode, one that insists not simply
upon communication but on the demand for response and address. Such
terms carry with them a certain anthropocentrism, but in adopting them I
am not returning to narrow notions of speech or recognition. Address and
response form instead a communicative relation and generative aesthetic.
Fuller and Weizman describe the emergence of an investigative commons in
the new collectivities of forensic architecture, open-source investigation, and
distributed human rights research, which in turn draws on the existence of an
aesthetic commons, in which processes of sensing and sense-making fold
into further such processes.22 If nonhuman witnessing animates or emerges
within particular commons, it also does so at the level of aesthetics and in
league with such instrumental investigative modes. But it also exceeds those
deliberate, human interventions, describing too the poesis that can arise in
the strange agonisms and fleeting alliances of machines, ecologies, animals,
and people.
To return to the Pacific Forum that opened chapter 3 of this book, nonhu-
man witnessing might galvanize a commons of islands and oceans, people
and winds, garbage and atmospheric sensing. Nonhuman witnessing would
not paper over the incapacities of speech or the ephemerality of certain agen-
cies but would be alive to what emerges in the intensive connections that can
arise when worlds are anchored, nurtured, and fought for. It is for this rea-
son that I have attended in this book not only to material events and actu-
ally existing technologies, but also to speculative imaginaries and creative
works. Such phenomena, objects, practices, and processes are often not at
all contiguous or willing to reveal their workings. Nor should they be. What
notes
the air 24 [hours a day], seven [days a week], but not when it’s raining. Every
time they are in the air, they can be heard. And because of the noise, we’re
psychologically disturbed—women, men, and children. . . . When there were
no drones, everything was all right. [There was] business, there was no psy-
chological stress and the people did what they could do for a living.” Stanford
Law School and nyu School of Law, “Living under Drones,” 164.
38 Stanford Law School and nyu School of Law, “Living under Drones,” 81.
39 Edney-Browne, “The Psychosocial Effects of Drone Violence,” 1347.
40 Schuppli, Material Witness, 124.
41 Kapadia, Insurgent Aesthetics, 69–72.
42 Kapadia, Insurgent Aesthetics, 5.
43 Chishty’s shadowy drones share some loose affinity with James Bridle’s
identically named and better-known Drone Shadows (2012–18), in which
1:1 outlines of Reaper and Predator drones are painted onto public spaces in
cities such as London and New York, insisting on the return of the presence
of the drone to the places that authorize their deployment. Jennifer Rhee
provides an excellent critique of the limitations of the politics of identification
enacted by Bridle’s works, demonstrating how such works rest on an obscur-
Abraham, Nicolas, and Maria Torok. The Shell and the Kernel. Translated by
Nicholas T. Rand. Vol. 1. Chicago: University of Chicago Press, 1994.
Adler, Jonathan E. “Testimony, Trust, Knowing.” Journal of Philosophy 91, no. 5
(1994): 264–75.
Agamben, Giorgio. Remnants of Auschwitz: The Witness and the Archive. Trans-
lated by Daniel Heller-Roazen. New York: Zone, 2002.
Agostinho, Daniela, Kathrin Maurer, and Kristin Veel. “Introduction to the
Sensorial Experience of the Drone.” Senses and Society 15, no. 3 (Septem-
ber 2020): 251–58.
Ahmed, Sara. The Cultural Politics of Emotion. New York: Routledge, 2004.
Alaimo, Stacy. Bodily Natures: Science, Environment, and the Material Self.
Bloomington: Indiana University Press, 2010.
Alexis-Martin, Becky. “Nuclear Warfare and Weather (Im)Mobilities: From
Mushroom Clouds to Fallout.” In Weather: Spaces, Mobilities and Affects.
New York: Routledge, 2020.
Amazon. “All In: Staying the Course on Our Commitment to Sustainability,”
2020. https://sustainability.aboutamazon.com/environment/sustainable
-operations/carbon-footprint.
Amoore, Louise. “Algorithmic War: Everyday Geographies of the War on Terror.”
Antipode 41, no. 1 (January 2009): 49–69.
Amoore, Louise. Cloud Ethics: Algorithms and the Attributes of Ourselves and
Others. Durham, NC: Duke University Press, 2020.
Amoore, Louise. The Politics of Possibility: Risk and Security Beyond Probability.
Durham, NC: Duke University Press, 2013.
Amoore, Louise, and Rita Raley. “Securing with Algorithms: Knowledge, Deci-
sion, Sovereignty.” Security Dialogue 48, no. 1 (February 2017): 3–10.
Andén-Papadopoulos, Kari. “Citizen Camera-Witnessing: Embodied P olitical
Dissent in the Age of ‘Mediated Mass Self-Communication.’ ” New Media
and Society, 2013, 753–69.
Andén-Papadopoulos, Kari. “Media Witnessing and the ‘Crowd-Sourced Video
Revolution.’ ” Visual Communication 12, no. 3 (August 2013): 341–57.
Anderson, Steve F. Technologies of Vision: The War between Data and Images.
Cambridge, MA: mit Press, 2017.
Andrejevic, Mark. Automated Media. 1st ed. New York: Routledge, 2019.
Angerer, Marie-Luise. Ecology of Affect: Intensive Milieus and Contingent Encoun-
ters. Translated by Gerrit Jackson. Lüneborg: Meson, 2017.
Arendt, Hannah. On Violence. New York: Harcourt, Brace, Jovanovich, 1970.
Arnold, Lorna, and Mark Smith. Britain, Australia and the Bomb: The Nuclear
Tests and Their Aftermath. 2nd ed. London: Palgrave Macmillan, 2006.
Asaro, Peter. “Algorithms of Violence: Critical Social Perspectives on Autono-
mous Weapons.” Social Research: An International Quarterly 86, no. 2
(2019): 20.
Ashton, Chris, Alan Shuster Bruce, Gary Colledge, and Mark Dickinson. “The
Search for mh370.” Journal of Navigation 68 (2015): 1–22.
Atanasoski, Neda, and Kalindi Vora. Surrogate Humanity: Race, Robots, and the
Politics of Technological Futures. Perverse Modernities. Durham, NC: Duke
University Press, 2019.
Atkinson, Meera, and Michael Richardson, eds. Traumatic Affect. Newcastle upon
Tyne: Cambridge Scholars, 2013.
Azoulay, Ariella. “The Natural History of Rape.” Journal of Visual Culture 17, no. 2
(August 2018): 166–76.
Ballard, Su. “ ‘And They Are like Wild Beasts’: Violent Things in the Anthropo-
cene.” Fibreculture 226 (2019). https://thirty.fibreculturejournal.org/fcj-226
-and-they-are-like-wild-beasts-violent-things-in-the-anthropocene/.
Ballengee, Jennifer R. The Wound and the Witness: The Rhetoric of Torture.
New York: State University of New York Press, 2009.
Barad, Karen. “Posthumanist Performativity: Toward an Understanding of How
Matter Comes to Matter.” Signs: Journal of Women in Culture and Society 28,
no. 3 (March 2003): 801–31.
Barnaby, Frank, and Douglas Holdstock. The British Nuclear Weapons Pro-
gramme, 1952–2002. London: Routledge, 2004.
Barnell, Mark, Courtney Raymond, Christopher Capraro, et al. “Agile Condor: A
Scalable High Performance Embedded Computing Architecture.” Waltham,
MA: 2015 ieee High Performance Extreme Computing Conference
(September 2015): 1–5. https://doi.org/10.1109/HPEC.2015.7322447.
Barnell, Mark, Courtney Raymond, Chris Capraro, Darrek Isereau, Chris Cicotta,
and Nathan Stokes. “High-Performance Computing (hpc) and Machine
Learning Demonstrated in Flight Using Agile Condor®.” Waltham, MA: 2018
ieee High Performance Extreme Computing Conference (September 2018):
1–4. https://doi.org/10.1109/HPEC.2018.8547797.
Bashir, Shazad, and Robert D. Crews. Under the Drones: Modern Lives in the
Afghanistan-Pakistan Borderlands. Cambridge, MA: Harvard University
Press, 2012.
Bellanova, Rocco, Kristina Irion, Katja Lindskov Jacobsen, Francesco Ragazzi,
Rune Saugmann, and Lucy Suchman. “Toward a Critique of Algorithmic
Violence.” International Political Sociology 15, no. 1 (2021): 121–50.
Bender, Emily M., Timnit Gebru, Angelina McMillan-Major, and Shmargaret
Shmitchell. “On the Dangers of Stochastic Parrots: Can Language Models
Be Too Big?” In Proceedings of the 2021 acm Conference on Fairness, Ac-
countability, and Transparency, 610–23. FAccT ’21. New York: Association
for Computing Machinery, 2021.
Benjamin, Ruha. Race after Technology: Abolitionist Tools for the New Jim Code.
Medford, MA: Polity, 2019.
Bennett, Jane. Vibrant Matter: A Political Ecology of Things. Durham, NC: Duke
University Press, 2010.
Berlant, Lauren. “The Commons: Infrastructures for Troubling Times.” Environ-
ment and Planning D: Society and Space 34, no. 3 (2016): 393–419.
Berlant, Lauren. Cruel Optimism. Durham, NC: Duke University Press, 2011.
Berlant, Lauren. “Thinking about Feeling Historical.” Emotion, Space and Society
1, no. 1 (2008): 4–9.
Berlant, Lauren, and Jordan Greenwald. “Affect in the End Times: A Conversa-
tion with Lauren Berlant.” Qui Parle 20 (2012): 71–89.
Bhuta, Nehal, Susanne Beck, Robin Geiβ, Hin-Yan Liu, and Claus Kreβ, eds.
Autonomous Weapons Systems: Law, Ethics, Policy. Cambridge: Cambridge
University Press, 2016.
Boltanski, Luc. Distant Suffering: Morality, Media and Politics. Cambridge:
Cambridge University Press, 1999.
Bourke, Latika. “Public Beheading Fears: Tony Abbott Confirms Police Believed
Terrorists Planned ‘Demonstration Killings.’ ” Sydney Morning Herald,
September 18, 2014. https://www.smh.com.au/politics/federal/public
-beheading-fears-tony-abbott-confirms-police-believed-terrorists-planned
-demonstration-killings-20140918-10ilyq.html.
Bousquet, Antoine. The Eye of War: Military Perception from the Telescope to the
Drone. Minneapolis: University of Minnesota Press, 2018.
Bousquet, Antoine. The Scientific Way of Warfare: Order and Chaos on the Battle-
fields of Modernity. New York: Columbia University Press, 2009.
Bousquet, Antoine, Jairus Grove, and Nisha Shah. “Becoming War: Towards a
Martial Empiricism.” Security Dialogue 51, nos. 2–3 (2020): 99–128.
Bräunert, Svea, and Meredith Malone. To See without Being Seen: Contemporary
Art and Drone Warfare. Saint Louis, MO: Mildred Lane Kemper Art
Museum, 2016.
Bremner, Lindsay. “Fluid Ontologies in the Search for mh370.” Journal of the
Indian Ocean Region 11 (2015): 8–29.
Bremner, Lindsay. “Technologies of Uncertainty in the Search for mh370.” In Art
in the Anthropocene: Encounters among Aesthetics, Politics, Environments
and Epistemologies, edited by Heather Davis and Etienne Turpin. London:
Open Humanities, 2015.
Browne, Simone. Dark Matters: On the Surveillance of Blackness. Durham, NC:
Duke University Press, 2015.
Brubaker, Jed R., Gillian R. Hayes, and Paul Dourish. “Beyond the Grave:
Facebook as a Site for the Expansion of Death and Mourning.” Information
Society 29 (2013): 152–63.
Bucher, Taina. If . . . Then: Algorithmic Power and Politics. Oxford Studies in Digi-
tal Politics. New York: Oxford University Press, 2018.
Bulletin of the Atomic Scientists. “Project Maven Brings ai to the Fight against
isis.” December 21, 2017. https://thebulletin.org/2017/12/project-maven
-brings-ai-to-the-fight-against-isis/.
Butler, Judith. Precarious Life: The Powers of Mourning and Violence. New York:
Verso, 2004.
Campt, Tina. Listening to Images. Durham, NC: Duke University Press, 2017.
Carter, Michael. Australian Participants in British Nuclear Tests in Australia. Can-
berra: Department of Veterans’ Affairs, 2006. https://www.dva.gov.au/sites
/default/files/dosimetry_complete_study_1.pdf.
Caruth, Cathy. Unclaimed Experience: Trauma, Narrative, and History. Baltimore:
Johns Hopkins University Press, 1996.
Chakrabarty, Dipesh. “The Climate of History: Four Theses.” Critical Inquiry 35,
no. 2 (Winter 2009): 197–222.
Chamayou, Grégoire. Drone Theory. Translated by Janet Lloyd. London: Penguin,
2015.
Chandler, Katherine. Unmanning: How Humans, Machines and Media Perform
Drone Warfare. War Culture. New Brunswick, NJ: Rutgers University Press,
2020.
Chayka, Kyle. “The Weird Failures of Algorithm-Generated Images.” Slate,
August 28, 2020. https://slate.com/technology/2020/08/uncanniness-of
-algorithmic-style.html.
Chen, Yutian, Aja Huang, Ziyu Wang, Ioannis Antonoglou, Julian Schrittwieser,
David Silver, and Nando de Freitas. “Bayesian Optimization in AlphaGo.”
ArXiv, arXiv:1812.06855 (December 2018). https://arxiv.org/abs/1812.06855v1.
Chesney, Robert, and Danielle Citron. “Deepfakes and the New Disinformation
War: The Coming Age of Post-truth Geopolitics.” Foreign Affairs 98, no. 1
(2019): 147–55.
Chouliaraki, Lilie. “Digital Witnessing in Conflict Zones: The Politics of Reme-
diation.” Information, Communication and Society 18, no. 11 (2015): 1362–77.
Chouliaraki, Lilie. The Spectatorship of Suffering. London: sage, 2006.
Chouliaraki, Lilie, and Omar al-Ghazzi. “Beyond News Verification: Flesh Wit-
nessing and the Significance of Embodiment in Conflict News.” Journalism
23, no. 3 (2022): 649–67.
Chow, Rey. Entanglements, or Transmedial Thinking about Capture. Durham, NC:
Duke University Press, 2012.
Chow, Rey. The Age of the World Target: Self-Referentiality in War, Theory, and
Comparative Work. Durham, NC: Duke University Press, 2006.
Christopoulos, Demetris T., and Galina K. Ustinova. “Urgent Hypothesis on
Plane mh370 Disappearance.” ResearchGate, 2014.
Clark, Timothy. “Scale.” In Theory in the Era of Climate Change. Vol. 1, edited by
Tom Cohen, 148–66. Ann Arbor, MI: Open Humanities, 2012.
Clarke, Melissa. “Australia Shuts down Climate Deal after Discussions Reduce
Tongan pm to Tears.” abc News, August 15, 2019. https://www.abc.net
.au/news/2019-08-15/no-endorsements-come-out-of-tuvalu-declaration
/11419342.
Clery, Daniel. “Six Handshakes, Then Silence.” Science 344 (May 2014): 964–65.
https://www.science.org/doi/10.1126/science.344.6187.964.
Clough, Patricia Ticiento. The Affective Turn: Theorizing the Social. Durham, NC:
Duke University Press, 2007.
Coady, C. A. J. Testimony: A Philosophical Study. Oxford: Oxford University Press,
1994.
Cockburn, Andrew. Kill Chain: Drones and the Rise of High-Tech Assassins.
New York: Verso, 2015.
Cole, Samantha. “ai-Assisted Fake Porn Is Here and We’re All Fucked.” Vice,
December 17, 2017. https://www.vice.com/en_us/article/gydydm/gal-gadot
-fake-ai-porn.
Cook, Megan, Barbara Etschmann, Rahul Ram, Konstantin Ignatyev, Gediminas
Gervinskas, Steven D. Conradson, Susan Cumberland, Vanessa N. L. Wong,
and Joёl Brugger. “The Nature of Pu-Bearing Particles from the Maralinga
Nuclear Testing Site, Australia.” Scientific Reports 11, no. 1 (2021): 10698.
Cowen, Deborah. The Deadly Life of Logistics. Minneapolis: University of Min-
nesota Press, 2014.
Cubitt, Sean. Finite Media: Environmental Implications of Digital Technologies.
Durham, NC: Duke University Press, 2017.
Cubitt, Sean. The Practice of Light: A Genealogy of Visual Technologies from Prints
to Pixels. Leonardo Book Series. Cambridge, MA: mit Press, 2014.
Danchev, Alex. “Bug Splat: The Art of the Drone.” International Affairs 92, no. 3
(2016): 703–13.
Das, Veena, Arthur Kleinman, Mamphela Ramphele, and Pamela Reynolds. Violence and Subjectivity. Berkeley: University of California Press, 2000.
da Silva, Denise Ferreira. “No-Bodies.” Griffith Law Review 18, no. 2 (2009): 212–36.
Daston, Lorraine. “Epistemic Images.” In Vision and Its Instruments: Art, Science,
and Technology in Early Modern Europe, edited by A. Payne, 13–35. Univer-
sity Park: Pennsylvania State University Press, 2015.
Daston, Lorraine, and Peter Galison. Objectivity. New York: Zone, 2007.
Davis, Heather, and Zoe Todd. “On the Importance of a Date, or, Decolonizing
the Anthropocene.” acme: An International Journal for Critical Geographies
16, no. 4 (2017): 761–80.
Davis, Mike. “How a Pandemic Happens: We Knew This Was Coming.” Literary
Hub, May 18, 2020. https://lithub.com/how-a-pandemic-happens-we-knew
-this-was-coming/.
Dawes, James. That the World May Know: Bearing Witness to Atrocity. Cambridge,
MA: Harvard University Press, 2007.
Day, Sophie, and Celia Lury. “New Technologies of the Observer: #BringBack,
Visualization and Disappearance.” Theory, Culture and Society 34, nos. 7–8
(2017): 51–74.
de la Cadena, Marisol. Earth Beings: Ecologies of Practice across Andean Worlds.
Durham, NC: Duke University Press, 2015.
de la Cadena, Marisol. “Indigenous Cosmopolitics in the Andes: Conceptual Re-
flections beyond ‘Politics.’ ” Cultural Anthropology 25, no. 2 (2010): 334–70.
de la Cadena, Marisol. “Uncommoning Nature: Stories from the Anthropo-Not-
Seen.” In Anthropos and the Material, edited by Penny Harvey, Christian Krohn-
Hansen, and Knut G. Nustad, 35–58. Durham, NC: Duke University Press, 2019.
de la Cadena, Marisol, and Mario Blaser, eds. A World of Many Worlds. Durham,
NC: Duke University Press, 2018.
DeLanda, Manuel. War in the Age of Intelligent Machines. New York: Zone, 1991.
Deleuze, Gilles. Cinema 1: The Movement-Image. Translated by Hugh Tomlinson
and Barbara Habberjam. London: Continuum, 2005.
Deleuze, Gilles. “Ethology: Spinoza and Us.” In Incorporations, edited by Jonathan
Crary and Sanford Kwinter, translated by Robert Hurley, 625–33. New York:
Zone, 1992.
Deleuze, Gilles. The Logic of Sense. New York: Columbia University Press, 1990.
Deleuze, Gilles. Pure Immanence: Essays on a Life. Translated by Anne Boyman.
Cambridge, MA: Zone, 2001.
Deleuze, Gilles, and Félix Guattari. A Thousand Plateaus: Capitalism and
Schizophrenia. Translated by Brian Massumi. Minneapolis: University of
Minnesota Press, 1987.
DeLoughrey, Elizabeth M. Allegories of the Anthropocene. Durham, NC: Duke
University Press, 2019.
Demos, T. J. Against the Anthropocene: Visual Culture and Environment Today.
Berlin: Sternberg, 2017.
Dencik, Lina, Arne Hintz, and Jonathan Cable. “Towards Data Justice? The
Ambiguity of Anti-surveillance Resistance in Political Activism.” Big
Data and Society 3, no. 2 (July–December 2016). https://doi.org/10.1177
/2053951716679678.
Department of Defense. “Memorandum for the Establishment of an Algorithmic
Warfare Cross-functional Team (Project Maven),” April 26, 2017. https://
www.govexec.com/media/gbc/docs/pdfs_edit/establishment_of_the_awcft
_project_maven.pdf.
Doane, Mary Ann. The Emergence of Cinematic Time: Modernity, Contingency, the
Archive. Cambridge, MA: Harvard University Press, 2002.
Dorrian, Mark. “Drone Semiosis.” Cabinet, no. 54 (2014): 48–55.
DuBois, Page. Torture and Truth. New York: Routledge, 1991.
Easterling, Keller. Extrastatecraft: The Power of Infrastructure Space. New York:
Verso, 2014.
Edney-Browne, Alex. “The Psychosocial Effects of Drone Violence: Social Isola-
tion, Self-Objectification, and Depoliticization.” Political Psychology 40,
no. 6 (2019): 1341–56.
Edwards, Nelta. “Nuclear Colonialism and the Social Construction of Landscape
in Alaska.” Environmental Justice 4, no. 2 (2011): 109–14.
Edwards, Paul N. The Closed World: Computers and the Politics of Discourse in
Cold War America. Cambridge, MA: mit Press, 1996.
Edwards, Paul N. “Entangled Histories: Climate Science and Nuclear Weapons
Research.” Bulletin of the Atomic Scientists 68, no. 4 (2012): 28–40.
Edwards, Paul N. A Vast Machine: Computer Models, Climate Data, and the Poli-
tics of Global Warming. Cambridge, MA: mit Press, 2010.
Endres, Danielle. “The Most Nuclear-Bombed Place: Ecological Implications of
the US Nuclear Testing Program.” In Tracing Rhetoric and Material Life, ed-
ited by Bridie McGreavy, Justine Wells, George F. McHendry, and Samantha
Senda-Cook, 253–87. London: Palgrave Macmillan, 2018.
Endres, Danielle. “The Rhetoric of Nuclear Colonialism: Rhetorical Exclusion of
American Indian Arguments in the Yucca Mountain Nuclear Waste Siting
Decision.” Communication and Critical/Cultural Studies 6, no. 1 (2009):
39–60.
Escobar, Arturo. Designs for the Pluriverse: Radical Interdependence, Autonomy,
and the Making of Worlds. New Ecologies for the Twenty-First Century.
Durham, NC: Duke University Press, 2018.
Escobar, Arturo. Pluriversal Politics: The Real and the Possible. Latin America in
Translation. Durham, NC: Duke University Press, 2020.
Estes, Nick. Our History Is the Future: Standing Rock versus the Dakota Access Pipeline, and the Long Tradition of Indigenous Resistance. New York: Verso,
2019.
Ettinger, Bracha. The Matrixial Borderspace. Minneapolis: University of Minne-
sota Press, 2006.
Eubanks, Virginia. Automating Inequality: How High-Tech Tools Profile, Police,
and Punish the Poor. New York: St. Martin’s, 2018.
Fallis, Don. “The Epistemic Threat of Deepfakes.” Philosophy and Technology 34,
no. 4 (December 2021): 623–43. https://doi.org/10.1007/s13347-020-00419-2.
Farmer, Paul. “An Anthropology of Structural Violence.” Current Anthropology 45,
no. 3 (2004): 305–25.
Farrier, David. Anthropocene Poetics: Deep Time, Sacrifice Zones, and Extinction.
Minneapolis: University of Minnesota Press, 2019.
Farwell, James P. “The Media Strategy of isis.” Survival 56 (2014): 49–55.
Fassin, Didier. “The Humanitarian Politics of Testimony: Subjectification through
Trauma in the Israeli Palestinian Conflict.” Cultural Anthropology 23, no. 3
(2008): 531–58.
Felman, Shoshana, and Dori Laub. Testimony: Crises of Witnessing in Literature,
Psychoanalysis, and History. New York: Routledge, 1992.
Finn, Ed. What Algorithms Want: Imagination in the Age of Computing. Cambridge,
MA: mit Press, 2017.
Fish, Adam. “Blue Governmentality: Elemental Activism with Conservation
Technologies on Plundered Seas.” Political Geography 93 (2022): 102528.
Frosh, Paul. The Poetics of Digital Media. Cambridge: Polity, 2019.
Frosh, Paul, and Amit Pinchevski. “Introduction: Why Media Witnessing? Why
Now?” In Media Witnessing: Testimony in the Age of Mass Communication,
edited by Paul Frosh and Amit Pinchevski, 1–19. Basingstoke, UK: Palgrave
Macmillan, 2009.
Frosh, Paul, and Amit Pinchevski. Media Witnessing: Testimony in the Age of Mass
Communication. Basingstoke, UK: Palgrave Macmillan, 2009.
Fuller, Matthew. Media Ecologies: Materialist Energies in Art and Technoculture.
Leonardo. Cambridge, MA: mit Press, 2005.
Fuller, Matthew, and Eyal Weizman. Investigative Aesthetics: Conflicts and Com-
mons in the Politics of Truth. New York: Verso, 2021.
Furuhata, Yuriko. Climatic Media: Transpacific Experiments in Atmospheric Con-
trol. Elements. Durham, NC: Duke University Press, 2022.
Furuhata, Yuriko. “Multimedia Environments and Security Operations: Expo ’70
as a Laboratory of Governance.” Grey Room 54 (2014): 56–79.
Gabrys, Jennifer. Program Earth: Environmental Sensing Technology and the Making
of a Computational Planet. Minneapolis: University of Minnesota Press, 2016.
Geoghegan, Bernard Dionysius. “An Ecology of Operations: Vigilance, Radar, and
the Birth of the Computer Screen.” Representations 147, no. 1 (2019): 59–95.
Gerstner, Erik. “Face/Off: DeepFake Face Swaps and Privacy Laws.” Defense
Counsel Journal 87, no. 1 (2020): 1–14.
Gibbs, Anna. “Contagious Feelings: Pauline Hanson and the Epidemiol-
ogy of Affect.” Australian Humanities Review 24 (2001). https://
australianhumanitiesreview.org/2001/12/01/contagious-feelings-pauline
-hanson-and-the-epidemiology-of-affect/.
Gibbs, Anna. “Panic! Affect Contagion, Mimesis and Suggestion in the Social
Field.” Cultural Studies Review 14 (2008): 130–45.
Gibbs, Anna. “Writing and Danger: The Intercorporeality of Affect.” In Creative
Writing: Theory beyond Practice, edited by Tess Brody and Nigel Krauth,
157–67. Tenerife, qld: Post Pressed, 2006.
Gil-Fournier, Abelardo, and Jussi Parikka. “Ground Truth to Fake Geographies:
Machine Vision and Learning in Visual Practices.” ai and Society 36 (2020):
1253–62.
Gilmore, Ruth Wilson. Golden Gulag: Prisons, Surplus, Crisis, and Opposition in
Globalizing California. Berkeley: University of California Press, 2007.
Givoni, Michal. The Care of the Witness: A Contemporary History of Testimony in Cri-
ses. Human Rights in History. Cambridge: Cambridge University Press, 2016.
Givoni, Michal. “Witnessing/Testimony.” Mafte’akh 11, no. 2 (2011): 147–69.
Glissant, Édouard. Poetics of Relation. Translated by Betsy Wing. Ann Arbor:
University of Michigan Press, 1997.
Goodfellow, Ian J., Jean Pouget-Abadie, Mehdi Mirza, Bing Xu, David Warde-
Farley, Sherjil Ozair, Aaron Courville, and Yoshua Bengio. “Generative
Adversarial Networks.” ArXiv, arXiv:1406.2661 (June 2014). https://doi.org
/10.48550/arXiv.1406.2661.
GovernmentCIOMedia. “ai to Help Pentagon Prep for Algorithmic Warfare.”
June 11, 2016. https://governmentciomedia.com/ai-help-pentagon-prep
-algorithmic-warfare.
Graae, Andreas Immanuel, and Kathrin Maurer, eds. Drone Imaginaries: The
Power of Remote Vision. Manchester: Manchester University Press, 2021.
Gray, Jonathan. “Data Witnessing: Attending to Injustice with Data in Amnesty
International’s Decoders Project.” Information, Communication, and Society
22, no. 7 (2019): 971–91.
Grayson, Kyle, and Jocelyn Mawdsley. “Scopic Regimes and the Visual Turn in In-
ternational Relations: Seeing World Politics through the Drone.” European
Journal of International Relations 25, no. 2 (2018).
Greene, Daniel. “Drone Vision.” Surveillance and Society 13, no. 2 (2015): 233–49.
Greenpeace. “Clicking Clean Virginia: The Dirty Energy Powering Data Center
Alley,” February 13, 2019. https://www.greenpeace.org/usa/reports/click
-clean-virginia/.
Gregg, Melissa, and Gregory J. Seigworth. The Affect Theory Reader. Durham,
NC: Duke University Press, 2010.
Gregory, Derek. “From a View to a Kill: Drones and Late Modern War.” Theory,
Culture and Society 28, nos. 7–8 (2011): 188–215.
Gregory, Derek. “Under Afghan Skies (1).” Geographical Imaginations.
March 27, 2020. https://geographicalimaginations.com/2020/03/27/under
-afghan-skies-1/.
Gregory, Derek. “Under Afghan Skies (2).” Geographical Imaginations. April 1, 2020.
https://geographicalimaginations.com/2020/03/31/under-afghan-skies-2/.
Gregory, Derek. “Under Afghan Skies (3).” Geographical Imaginations. April 3, 2020.
https://geographicalimaginations.com/2020/04/03/under-afghan-skies-3/.
Grove, Jairus. “From Geopolitics to Geotechnics: Global Futures in the Shadow
of Automation, Cunning Machines, and Human Speciation.” International
Relations 34, no. 3 (2020): 432–55.
Grove, Jairus Victor. Savage Ecology: War and Geopolitics at the End of the World.
Durham, NC: Duke University Press, 2019.
Grusin, Richard. “Introduction.” In The Nonhuman Turn, edited by Richard
Grusin, vii–xxix. Minneapolis: University of Minnesota Press, 2015.
Grusin, Richard. Premediation: Affect and Mediality after 9/11. New York: Palgrave
Macmillan, 2010.
Guattari, Félix. Chaosmosis: An Ethico-Aesthetic Paradigm. Translated by Paul
Bains and Julian Pefanis. Bloomington: Indiana University Press, 1995.
Guattari, Félix. “On Machines.” Journal of Philosophy and the Visual Arts 6 (1995):
8–12.
Guattari, Félix. The Three Ecologies. New York: Continuum, 2005.
Guerin, Frances, and Roger Hallas. The Image and the Witness: Trauma, Memory
and Visual Culture. London: Wallflower, 2007.
Halpern, Orit. Beautiful Data: A History of Vision and Reason since 1945. Durham,
NC: Duke University Press, 2015.
Haraway, Donna. “Anthropocene, Capitalocene, Plantationocene, Chthulucene:
Making Kin.” Environmental Humanities 6, no. 1 (2015): 159–65.
Haraway, Donna. Modest_Witness@Second_Millennium.FemaleMan_Meets_OncoMouse™: Feminism and Technoscience. New York: Routledge, 1997.
Haraway, Donna. “Situated Knowledges: The Science Question in Feminism and
the Privilege of Partial Perspective.” Feminist Studies 14, no. 3 (1988): 575–99.
Haraway, Donna. Staying with the Trouble: Making Kin in the Chthulucene. Experimental Futures. Durham, NC: Duke University Press, 2016.
Holmqvist, Caroline. “Undoing War: War Ontologies and the Materiality of
Drone Warfare.” Millennium 41, no. 3 (2013): 535–52.
Hoskins, Andrew, and Shona Illingworth. “Inaccessible War: Media, Memory,
Trauma and the Blueprint.” Digital War 1 (2020): 74–82.
Human Rights Watch. “A Wedding That Became a Funeral: US Drone Attack on
Marriage Procession in Yemen.” Human Rights Watch, February 19, 2014.
https://www.hrw.org/report/2014/02/19/wedding-became-funeral/us-drone
-attack-marriage-procession-yemen.
ican: International Campaign to Abolish Nuclear Weapons. “Black Mist: The
Impact of Nuclear Weapons on Australia,” January 2014. https://icanw.org
.au/wp-content/uploads/BlackMist-FINAL-Web.pdf.
Ingram, Haroro J. “Three Traits of the Islamic State’s Information Warfare.” rusi
Journal 159 (2014): 4–11.
Isereau, Darrek, et al. “Utilizing High-Performance Embedded Computing,
Agile Condor, for Intelligent Processing: An Artificial Intelligence Plat-
form for Remotely Piloted Aircraft.” London: 2017 Intelligent Systems
Conference (IntelliSys) (September 2017): 1155–59. https://doi.org/10.1109
/IntelliSys.2017.8324277.
Jackson, Zakiyyah Iman. “Animal: New Directions in the Theorization of Race
and Posthumanism.” Feminist Studies 39, no. 3 (2013): 669–85.
Jackson, Zakiyyah Iman. “Outer Worlds: The Persistence of Race in Movement
‘Beyond the Human.’ ” glq: A Journal of Lesbian and Gay Studies 21, no. 2
(2015): 215–18.
James, C. L. R. The Black Jacobins: Toussaint l’Ouverture and the San Domingo
Revolution. 2nd ed., rev. New York: Vintage, 1989.
Jetñil-Kijiner, Kathy. “Tell Them.” 2011. https://jkijiner.wordpress.com/2011/04/13
/tell-them/.
Johnson, Ted. “To Handle Its Influx of Drone Footage, Military Should Teach ai
to Watch tv.” Wired, November 26, 2017. https://www.wired.com/story/the
-military-should-teach-ai-to-watch-drone-footage/.
Joint Standing Committee on Northern Australia. “Never Again: Interim Report
into the Destruction of Indigenous Heritage Sites at Juukan Gorge.” Can-
berra: Parliament of the Commonwealth of Australia, December 2020.
Joint Standing Committee on Northern Australia. “A Way Forward: Final Report
into the Destruction of Indigenous Heritage Sites at Juukan Gorge.” Can-
berra: Parliament of the Commonwealth of Australia, October 2021.
Jue, Melody. Wild Blue Media: Thinking through Seawater. Elements. Durham,
NC: Duke University Press, 2020.
Jurgenson, Nathan. The Social Photo: On Photography and Social Media. New York:
Verso, 2019.
Kaleem, Jaweed. “Death on Facebook Now Common as ‘Dead Profiles’ Create Vast
Virtual Cemetery.” Huffington Post, October 1, 2015. https://www.huffpost
.com/entry/death-facebook-dead-profiles_n_2245397.
Kapadia, Ronak K. Insurgent Aesthetics: Security and the Queer Life of the Forever
War. Durham, NC: Duke University Press, 2020.
Kaplan, Caren. Aerial Aftermaths: Wartime from Above. Durham, NC: Duke
University Press, 2018.
Kaplan, Caren. “Drone-o-Rama: Troubling the Spatial and Temporal Logics of
Distance Warfare.” In Life in the Age of Drone Warfare, edited by Lisa Parks
and Caren Kaplan, 161–77. Durham, NC: Duke University Press, 2017.
Kaplan, Caren. “Drones and the Image Complex: The Limits of Representation in
the Era of Distance Warfare.” In Mediating the Spatiality of Conflicts: Inter-
national Conference Proceedings, edited by Armina Pilav, Marc Schoonder-
beek, Heidi Sohn, and Aleksandar Staničić, 29–43. Delft, Netherlands: bk,
2020.
Kaplan, E. Ann. Climate Trauma: Foreseeing the Future in Dystopian Film and Fic-
tion. New Brunswick, NJ: Rutgers University Press, 2016.
Kaplan, E. Ann. Trauma Culture: The Politics of Terror and Loss in Media and Literature. New Brunswick, NJ: Rutgers University Press, 2005.
Kelley, Robin D. G. “What Did Cedric Robinson Mean by Racial Capitalism?”
Boston Review, January 12, 2017. https://bostonreview.net/race/robin-d-g
-kelley-what-did-cedric-robinson-mean-racial-capitalism.
Kember, Sarah, and Joanna Zylinska. Life after New Media: Mediation as a Vital
Process. Cambridge, MA: mit Press, 2012.
Keskinen, Niina, Marja Kaunonen, and Anna Liisa Aho. “How Loved Ones Ex-
press Grief after the Death of a Child by Sharing Photographs on Facebook.”
Journal of Loss and Trauma 24, no. 7 (2019): 609–24.
Khalili, Laleh. Sinews of War and Trade: Shipping and Capitalism in the Arabian
Peninsula. New York: Verso, 2020.
Kidder, Stanley Q., and Thomas H. Vonder Haar. Satellite Meteorology: An Intro-
duction. Cambridge, MA: Academic Press, 1995.
Kikerpill, Kristjan. “Choose Your Stars and Studs: The Rise of Deepfake Designer
Porn.” Porn Studies 7, no. 4 (2020): 1–5.
Kirchengast, Tyrone. “Deepfakes and Image Manipulation: Criminalisation and
Control.” Information and Communications Technology Law 29, no. 3 (2020):
308–23.
Kozol, Wendy. Distant Wars Visible: The Ambivalence of Witnessing. Minneapolis:
University of Minnesota Press, 2014.
Kurgan, Laura. Close up at a Distance: Mapping, Technology, and Politics.
New York: Zone, 2013.
Kurgan, Laura. “Conflict Urbanism, Aleppo: Mapping Urban Damage.” Architec-
tural Design 87, no. 1 (2017): 72–77.
LaCapra, Dominick. Representing the Holocaust: History, Theory, Trauma. Ithaca,
NY: Cornell University Press, 1994.
LaCapra, Dominick. Writing History, Writing Trauma. Baltimore: Johns Hopkins
University Press, 2001.
Ladd, Mike. “The Lesser Known History of the Maralinga Nuclear Tests—and
What It’s like to Stand at Ground Zero.” Australian Broadcasting Corpora-
tion, March 24, 2020. https://www.abc.net.au/news/2020-03-24/maralinga
-nuclear-tests-ground-zero-lesser-known-history/11882608.
Langbein, John. Torture and the Law of Proof. Chicago: University of Chicago
Press, 1977.
Lapoujade, David. William James, Empiricism and Pragmatism. Translated by Thomas
LaMarre. Thought in the Act. Durham, NC: Duke University Press, 2020.
Leaver, Tama. “The Social Media Contradiction: Data Mining and Digital Death.”
m/c Journal 16 (2013). https://journal.media-culture.org.au/index.php
/mcjournal/article/view/625.
Liu, Cixin. Death’s End. Bk. 3 of The Three-Body Problem. Translated by Ken Liu.
London: Head of Zeus, 2016.
Luciano, Dana, and Mel Y. Chen. “Has the Queer Ever Been Human?” glq: A
Journal of Lesbian and Gay Studies 21, nos. 2–3 (2015): 183–207.
Lyons, Kate. “Fiji pm Accuses Scott Morrison of ‘Insulting’ and Alienating Pacific
Leaders.” Guardian, August 17, 2019. https://www.theguardian.com/world
/2019/aug/16/fiji-pm-frank-bainimarama-insulting-scott-morrison-rift
-pacific-countries.
Mackenzie, Adrian. Machine Learners: Archaeology of a Data Practice. Cam-
bridge, MA: mit Press, 2017.
Mackenzie, Adrian, and Anna Munster. “Platform Seeing: Image Ensembles and
Their Invisualities.” Theory, Culture and Society 36, no. 5 (2019): 3–22.
Maddocks, Sophie. “‘A Deepfake Porn Plot Intended to Silence Me’: Exploring Con-
tinuities between Pornographic and ‘Political’ Deep Fakes.” Porn Studies (2020): 1–9.
Manjoo, Farhad. “I Tried Microsoft’s Flight Simulator. The Earth Never Seemed
So Real.” New York Times, August 19, 2020. https://www.nytimes.com/2020
/08/19/opinion/microsoft-flight-simulator.html.
Manning, Erin. The Minor Gesture. Durham, NC: Duke University Press, 2016.
Manning, Erin, and Brian Massumi. Thought in the Act: Passages in the Ecology of
Experience. Minneapolis: University of Minnesota Press, 2014.
Maralinga Rehabilitation Technical Advisory Committee. “Rehabilitation of For-
mer Nuclear Test Sites at Emu and Maralinga (Australia).” Canberra: De-
partment of Education, Science, and Training, 2003. https://www.industry
.gov.au/sites/default/files/July%202018/document/pdf/rehabilitation-of
-former-nuclear-test-sites-at-emu-and-maralinga.pdf ?acsf_files_redirect.
Maras, Marie-Helen, and Alex Alexandrou. “Determining Authenticity of Video
Evidence in the Age of Artificial Intelligence and in the Wake of Deepfake
Videos.” International Journal of Evidence and Proof 23, no. 3 (2019): 255–62.
Massumi, Brian. “The Autonomy of Affect.” Cultural Critique 31 (1995): 83–109.
Massumi, Brian. “The Future Birth of the Affective Fact: The Political Ontology of
Threat.” In The Affect Theory Reader, edited by Melissa Gregg and Gregory J.
Seigworth. Durham, NC: Duke University Press, 2010.
Massumi, Brian. Ontopower: War, Powers, and the State of Perception. Durham,
NC: Duke University Press, 2015.
Massumi, Brian. Parables for the Virtual: Movement, Affect, Sensation. Post-
contemporary Interventions. Durham, NC: Duke University Press, 2002.
Maurer, Anaïs. “Snaring the Nuclear Sun: Decolonial Ecologies in Titaua Peu’s
Mutismes: E ‘Ore Te Vāvā.” Contemporary Pacific 32, no. 2 (2020): 371–97.
Mbembe, Achille. Necropolitics. Durham, NC: Duke University Press, 2019.
McCosker, Anthony. “Drone Media: Unruly Systems, Radical Empiricism and
Camera Consciousness.” Culture Machine 16 (2015). https://culturemachine
.net/vol-16-drone-cultures/drone-media/.
McCosker, Anthony. “Drone Vision, Zones of Protest, and the New Camera Con-
sciousness.” Media Fields Journal 9 (2015). http://www.mediafieldsjournal
.org/drone-vision-zones-of-protest/2015/8/21/drone-vision-zones-of-protest
-and-the-new-camera-consciousne.html.
McCosker, Anthony. “Making Sense of Deepfakes: Socializing ai and Building
Data Literacy on GitHub and YouTube.” New Media and Society (May 2022).
https://doi.org/10.1177/14614448221093943.
McCosker, Anthony, and Rowan Wilken. Automating Vision: The Social Impact of
the New Camera Consciousness. New York: Routledge, 2020.
McDonald, Matt. Ecological Security. Cambridge: Cambridge University Press, 2021.
McKittrick, Katherine. Dear Science and Other Stories. Errantries. Durham, NC:
Duke University Press, 2021.
McLelland, J. R. The Report of the Royal Commission into British Nuclear Tests
in Australia. Vol. 1. Parliamentary Paper. Canberra: The Parliament of the
Commonwealth of Australia, 1985. https://parlinfo.aph.gov.au/parlInfo
/download/publications/tabledpapers/HPP032016010928/upload_pdf
/HPP032016010928.pdf.
McNeill, J. R., and Peter Engelke. The Great Acceleration: An Environmental History
of the Anthropocene since 1945. Cambridge, MA: Harvard University Press, 2014.
Michel, Arthur Holland. Eyes in the Sky: The Secret Rise of Gorgon Stare and How
It Will Watch Us All. Boston: Houghton Mifflin Harcourt, 2019.
Miglio. “ai in Unreal Engine: Learning through Virtual Simulations.” Unreal
Engine Blog, April 13, 2018. https://www.unrealengine.com/en-US/tech-blog
/ai-in-unreal-engine-learning-through-virtual-simulations.
Mirzoeff, Nicholas. How to See the World: An Introduction to Images, from Self-
Portraits to Selfies, Maps to Movies, and More. New York: Penguin, 2015.
Mirzoeff, Nicholas. The Right to Look: A Counterhistory of Visuality. Durham,
NC: Duke University Press, 2011.
Mittmann, J. D. “Maralinga: Aboriginal Poison Country.” Agora 25 (2017): 7.
Moore, Jason W., ed. Anthropocene or Capitalocene? Nature, History, and the
Crisis of Capitalism. Oakland, CA: pm, 2016.
Moreton-Robinson, Aileen. The White Possessive. Minneapolis: University of
Minnesota Press, 2015.
Morton, Timothy. Hyperobjects: Philosophy and Ecology after the End of the
World. Minneapolis: University of Minnesota Press, 2013.
Murphie, Andrew. “On Being Affected: Feeling in the Folding of Multiple Catas-
trophes.” Cultural Studies 32, no. 1 (2018): 18–42.
Murphie, Andrew. “World as Medium: A Whiteheadian Media Philosophy.” In
Immediation, edited by Erin Manning, Anna Munster, and Bodil Marie
Stavning Thomsen, 16–46. Detroit: Open Humanities, 2019.
Nagel, Emily van der. “Verifying Images: Deepfakes, Control, and Consent.” Porn
Studies 7, no. 4 (2020): 424–29.
Neale, Timothy, Alex Zahara, and Will Smith. “An Eternal Flame: The Elemen-
tal Governance of Wildfire’s Pasts, Presents and Futures.” Cultural Studies
Review 25, no. 2 (2019): 115–34.
Nixon, Rob. Slow Violence and the Environmentalism of the Poor. Cambridge,
MA: Harvard University Press, 2011.
Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce
Racism. New York: New York University Press, 2018.
Nyong’o, Tavia. “Little Monsters: Race, Sovereignty, and Queer Inhumanism in
Beasts of the Southern Wild.” glq: A Journal of Lesbian and Gay Studies 21,
nos. 2–3 (2015): 249–72.
Offert, Fabian. “Latent Deep Space: Generative Adversarial Networks (gans) in
the Sciences.” Media+Environment 3, no. 2 (2021).
Offert, Fabian, and Thao Phan. “A Sign That Spells: DALL-E 2, Invisual Images
and the Racial Politics of Feature Space.” ArXiv, arXiv:2211.06323 (Octo-
ber 2022). http://arxiv.org/abs/2211.06323.
Öhman, Carl J., and David Watson. “Are the Dead Taking over Facebook? A Big
Data Approach to the Future of Death Online.” Big Data and Society 6, no. 1
(2019). https://doi.org/10.1177/2053951719842540.
Oliver, Kelly. Witnessing: Beyond Recognition. Minneapolis: University of Min-
nesota Press, 2001.
O’Malley, Pat, and Gavin J. D. Smith. “ ‘Smart’ Crime Prevention? Digitization
and Racialized Crime Control in a Smart City.” Theoretical Criminology 26,
no. 1 (2020): 40–56.
OpenAI. “Proximal Policy Optimization.” July 20, 2017. https://openai.com/blog
/openai-baselines-ppo/.
Packer, Jeremy, and Joshua Reeves. “Taking People Out: Drones, Media/Weapons, and
the Coming Humanectomy.” In Life in the Age of Drone Warfare, edited by Lisa
Parks and Caren Kaplan, 261–81. Durham, NC: Duke University Press, 2017.
Packer, Jeremy, and Joshua Reeves. Killer Apps: War, Media, Machine. Durham,
NC: Duke University Press, 2020.
Papailias, Penelope. “Witnessing in the Age of the Database: Viral Memorials, Af-
fective Publics, and the Assemblage of Mourning.” Memory Studies 9, no. 4
(2016): 437–54.
Parks, Lisa. Cultures in Orbit: Satellites and the Televisual. Console-Ing Passions.
Durham, NC: Duke University Press, 2005.
Parks, Lisa. Rethinking Media Coverage: Vertical Mediation and the War on Terror.
New York: Routledge, 2018.
Parks, Lisa, and Caren Kaplan, eds. Life in the Age of Drone Warfare. Durham,
NC: Duke University Press, 2017.
Pasquale, Frank. The Black Box Society: The Secret Algorithms That Control Money
and Information. 1st ed. Cambridge, MA: Harvard University Press, 2015.
Paterson, Thomas, and Lauren Hanley. “Political Warfare in the Digital Age:
Cyber Subversion, Information Operations and ‘Deep Fakes.’ ” Australian
Journal of International Affairs 74, no. 4 (2020): 439–54.
Peters, Edward. Torture. New York: Basil Blackwell, 1985.
Peters, John Durham. The Marvelous Clouds. Chicago: University of Chicago
Press, 2015.
Peters, John Durham. “Witnessing.” Media, Culture and Society 23, no. 6 (2001):
707–23.
Phan, Thao, and Scott Wark. “Racial Formations as Data Formations.” Big Data
and Society 8, no. 2 (2021).
Phan, Thao, and Scott Wark. “What Personalisation Can Do for You! Or: How to
Do Racial Discrimination without ‘Race.’ ” Culture Machine, 2021. https://
culturemachine.net/vol-20-machine-intelligences/what-personalisation-can
-do-for-you-or-how-to-do-racial-discrimination-without-race-thao-phan
-scott-wark/.
Phan, Thao, Jake Goldenfein, Monique Mann, and Declan Kuch, eds. Econo-
mies of Virtue—the Circulation of “Ethics” in ai. Amsterdam: Institute of
Network Cultures, 2022.
Pinchevski, Amit. Transmitted Wounds: Media and the Mediation of Trauma.
New York: Oxford University Press, 2019.
Pong, Beryl. British Literature and Culture in Second World Wartime: For the Dura-
tion. Oxford Mid-century Studies. New York: Oxford University Press, 2020.
Popova, Milena. “Reading out of Context: Pornographic Deepfakes, Celebrity,
and Intimacy.” Porn Studies 7, no. 4 (2019): 1–15.
Puar, Jasbir. “The Cost of Getting Better: Suicide, Sensation, Switchpoints.” glq:
A Journal of Lesbian and Gay Studies 18 (2012): 149–58.
Puar, Jasbir. Right to Maim: Debility, Capacity, Disability. Durham, NC: Duke
University Press, 2017.
Pugliese, Joseph. Biopolitics of the More-Than-Human: Forensic Ecologies of Violence. Durham, NC: Duke University Press, 2020.
Pugliese, Joseph. “Death by Metadata: The Bioinformationalisation of Life and
the Transliteration of Algorithms to Flesh.” In Security, Race, Biopower:
Essays on Technology and Corporeality, edited by Holly Randell-Moon and
Ryan Tippet, 3–20. London: Palgrave Macmillan, 2016.
Pugliese, Joseph. “Drone Casino Mimesis: Telewarfare and Civil Militarization.”
Journal of Sociology 52, no. 3 (2016): 500–21.
Pugliese, Joseph. State Violence and the Execution of Law: Biopolitical Caesurae of
Torture, Black Sites, Drones. New York: Routledge, 2013.
Rae, Maria, Rosa Holman, and Amy Nethery. “Self-Represented Witnessing: The
Use of Social Media by Asylum Seekers in Australia’s Offshore Immigration
Detention Centres.” Media, Culture and Society 40, no. 4 (2018): 479–95.
Rancière, Jacques. Disagreement: Politics and Philosophy. Translated by Julie Rose.
Minneapolis: University of Minnesota Press, 1999.
Ray, Una. “The Myth of Empty Country and the Story of ‘Deadly’ Glass.” Di’van,
no. 9 (2021): 42–55.
Reading, Anna. “Mobile Witnessing: Ethics and the Camera Phone in the ‘War on
Terror.’ ” Globalizations 6, no. 1 (2009): 61–76.
Reiter, Bernd, ed. Constructing the Pluriverse: The Geopolitics of Knowledge.
Durham, NC: Duke University Press, 2018.
Rhee, Jennifer. The Robotic Imaginary: The Human and the Price of Dehumanized
Labor. Minneapolis: University of Minnesota Press, 2018.
Richardson, Michael. “Climate Trauma, or the Affects of the Catastrophe to
Come.” Environmental Humanities 10, no. 1 (2018): 1–19.
Richardson, Michael. “Drone’s-Eye View: Affective Witnessing and Technicities
of Perception.” In Image Testimonies: Witnessing in Times of Social Media,
edited by Kerstin Schankweiler, Verena Straub, and Tobias Wendl, 64–74.
New York: Routledge, 2018.
Richardson, Michael. Gestures of Testimony: Torture, Trauma, and Affect in Literature.
New York: Bloomsbury Academic, 2016.
Richardson, Michael. “There’s Something Going On.” Capacious 1, no. 2 (2018):
149–54.
Richardson, Michael, and Kerstin Schankweiler. “Affective Witnessing.” In Affec-
tive Societies: Key Concepts, edited by Jan Slaby and Christian von Scheve,
166–77. New York: Routledge, 2019.
Richardson, Michael, and Kerstin Schankweiler. “Introduction: Affective Wit-
nessing as Theory and Practice.” Parallax 26, no. 3 (2020): 235–53.
Rini, Regina. “Deepfakes and the Epistemic Backstop.” Philosophers’ Imprint 20,
no. 24 (2020): 1–16.
Ristovska, Sandra. Seeing Human Rights: Video Activism as a Proxy Profession.
Information Policy. Cambridge, MA: mit Press, 2021.
Robinson, Cedric J. Black Marxism: The Making of the Black Radical Tradition.
Chapel Hill: University of North Carolina Press, 2000.
Rosenthal, Caitlin. Accounting for Slavery: Masters and Management. Cambridge,
MA: Harvard University Press, 2019.
Rothe, Delf. “Seeing Like a Satellite: Remote Sensing and the Ontological Politics
of Environmental Security.” Security Dialogue 48, no. 4 (2017): 334–53.
Rouvroy, Antoinette, and Thomas Berns. “Algorithmic Governmentality and
Prospects of Emancipation.” Réseaux 177, no. 1 (2013): 163–96.
Russill, Chris. “Earth Imaging: Photograph, Pixel, Program.” In Ecomedia: Key
Issues, edited by Stephen Rust, Salma Monani, and Sean Cubitt, 228–50.
New York: Routledge, 2015.
Russill, Chris. “Is the Earth a Medium? Situating the Planetary in Media Theory.”
Ctrl-z.Net.Au 7 (2017). http://www.ctrl-z.net.au/articles/issue-7/russill-is-the
-earth-a-medium/.
Russill, Chris. “The Road Not Taken: William James’s Radical Empiricism and
Communication Theory.” Communication Review 8, no. 3 (2005): 277–305.
Sadowski, Jathan. “Potemkin ai.” Real Life, August 6, 2018. https://reallifemag
.com/potemkin-ai/.
Sadowski, Jathan. Too Smart: How Digital Capitalism Is Extracting Data, Control-
ling Our Lives, and Taking over the World. Cambridge, MA: mit Press, 2020.
Sadowski, Jathan. “When Data Is Capital: Datafication, Accumulation, and Ex-
traction.” Big Data and Society 6, no. 1 (2019).
Safransky, Sara. “Geographies of Algorithmic Violence: Redlining the Smart City.”
International Journal of Urban and Regional Research 44, no. 2 (2020): 200–18.
Saldarriaga, Juan Francisco, Laura Kurgan, and Dare Brawley. “Visualizing Conflict:
Possibilities for Urban Research.” Urban Planning 2, no. 1 (2017): 100–107.
Savransky, Martin. Around the Day in Eighty Worlds: Politics of the Pluriverse.
Thought in the Act. Durham, NC: Duke University Press, 2021.
Scannell, Josh. “What Can an Algorithm Do?” dis Magazine. Accessed July 14,
2016. http://dismagazine.com/discussion/72975/josh-scannell-what-can-an
-algorithm-do/.
Scannell, Paddy. Television and the Meaning of Live: An Enquiry into the Human
Situation. Cambridge, UK: Polity, 2014.
Scarce, Yhonnie, Max Delany, and Australian Centre for Contemporary Art. Mis-
sile Park, 2021.
Schaefer, Donovan O. The Evolution of Affect Theory: The Humanities, the
Sciences, and the Study of Power. Cambridge Elements. Cambridge:
Cambridge University Press, 2019.
Schankweiler, Kerstin, Verena Straub, and Tobias Wendl, eds. Image Testimonies:
Witnessing in Times of Social Media. New York: Routledge, 2018.
Schuppli, Susan. Material Witness: Media, Forensics, Evidence. Cambridge, MA:
mit Press, 2020.
Sear, Tom. “Xenowar Dreams of Itself.” Digital War 1 (2020). https://doi.org/10
.1057/s42984-020-00019-6.
Seaver, Nick. “Algorithms as Culture: Some Tactics for the Ethnography of Algo-
rithmic Systems.” Big Data and Society 4, no. 2 (2017).
Seigworth, Gregory J., and Matthew Tiessen. “Mobile Affects, Open Secrets, and
Global Illiquidity: Pockets, Pools, and Plasma.” Theory, Culture and Society
29 (2012): 47–77.
Seltzer, Mark. “Wound Culture: Trauma in the Pathological Public Sphere.”
October 80 (1997): 3–26.
Sheikh, Shela. “The Future of the Witness: Nature, Race and More-than-Human
Environmental Publics.” Kronos 44, no. 1 (2018): 145–62.
Sherwood, Dave. “Inside Lithium Giant sqm’s Struggle to Win Over Indigenous
Communities in Chile’s Atacama.” Reuters, January 15, 2021. https://www
.reuters.com/article/us-chile-lithium-sqm-focus-idUSKBN29K1DB.
Singh, Julietta. Unthinking Mastery: Dehumanism and Decolonial Entanglements.
Durham, NC: Duke University Press, 2018.
Slaughter, Joseph R. Human Rights, Inc.: The World Novel, Narrative Form, and
International Law. New York: Fordham University Press, 2007.
Smith, Joan. Clouds of Deceit: The Deadly Legacy of Britain’s Bomb Tests. London:
Faber, 1985.
Snaza, Nathan. “The Earth Is Not ‘Ours’ to Save.” In Interrogating the Anthropo-
cene: Ecology, Aesthetics, Pedagogy, and the Future in Question, edited by
Jan Jagodzinski, 339–57. Palgrave Studies in Educational Futures. Cham,
Switzerland: Springer International, 2018.
Sontag, Susan. Regarding the Pain of Others. London: Penguin, 2003.
Sparrow, Jeff. “Plastic Sword the Least of asio’s Bungles in ‘Terror Raid.’ ” Crikey,
October 9, 2014. https://www.crikey.com.au/2014/10/09/plastic-sword-the
-least-of-asios-bungles-in-terror-raid/.
Spillers, Hortense J. “Mama’s Baby, Papa’s Maybe: An American Grammar Book.”
Diacritics 17, no. 2 (1987): 65–81.
Springgay, Stephanie, and Sarah E. Truman. Walking Methodologies in a More-
than-Human World: WalkingLab. New York: Routledge, 2018.
src Defense. “Agile Condor™ High-Performance Embedded Computing Archi-
tecture.” YouTube video, 1:57, October 14, 2016. https://www.youtube.com
/watch?v=sc1aOFmb3AI.
Stahl, Roger. Through the Crosshairs: War, Visual Culture, and the Weaponized
Gaze. New Brunswick, NJ: Rutgers University Press, 2018.
Stanford Law School and nyu School of Law. “Living under Drones: Death,
Injury, and Trauma to Civilians from US Drone Practices in Pakistan.”
International Human Rights and Conflict Resolution Clinic at Stanford Law
School and Global Justice Clinic at nyu School of Law, 2012.
Stark, Luke. “Facial Recognition Is the Plutonium of ai.” xrds: Crossroads, the
acm Magazine for Students 25, no. 3 (2019): 50–55.
Starosielski, Nicole. Media Hot and Cold. Elements. Durham, NC: Duke Univer-
sity Press, 2021.
Stewart, Kathleen. “Afterword: Worlding Refrains.” In The Affect Theory Reader,
edited by Melissa Gregg and Gregory J. Seigworth, 339–53. Durham, NC:
Duke University Press, 2010.
Stewart, Kathleen. “Atmospheric Attunements.” Environment and Planning D:
Society and Space 29, no. 3 (2011): 445–53.
Stiegler, Bernard. “Anamnesis and Hypomnesis.” Ars Industrialis, March 7, 2016.
https://arsindustrialis.org/anamnesis-and-hypomnesis.
Stubblefield, Thomas. Drone Art: The Everywhere War as Medium. Oakland, CA:
University of California Press, 2020.
Suchman, Lucy. “Algorithmic Warfare and the Reinvention of Accuracy.” Critical
Studies on Security 8, no. 2 (2020): 175–87.
SyncedReview. “Barack Obama Is the Benchmark for Fake Lip-Sync Videos.” Me-
dium, January 17, 2018. https://medium.com/syncedreview/barack-obama-is
-the-benchmark-for-fake-lip-sync-videos-d85057cb90ac.
Taffel, Sy. “Data and Oil: Metaphor, Materiality, and Metabolic Rifts.” New Media
and Society 25, no. 5 (2021): 980–98.
Tahir, Madiha. “The Distributed Empire of the War on Terror.” Boston Review,
September 9, 2021. https://bostonreview.net/global-justice/madiha-tahir
-war-on-terror-empire-pakistan.
Tahir, Madiha. “The Ground Was Always in Play.” Public Culture 29, no. 1 (2017):
5–16.
Taussig, Michael T. Law in a Lawless Land: Diary of a “Limpieza” in Colombia.
Chicago: University of Chicago Press, 2005.
Taylor, Christopher. Empire of Neglect: The West Indies in the Wake of British Lib-
eralism. Radical Américas. Durham, NC: Duke University Press, 2018.
Taylor, Simon. “Fiction Machines: How Drones Read Oceanic Volumes through
‘Technically-Induced Hallucination.’ ” In Drone Aesthetics: War, Cultures,
Ecologies, edited by Beryl Pong and Michael Richardson. London: Open
Humanities, 2024.
Thrift, Nigel. Non-representational Theory: Space, Politics, Affect. 1st ed. Hoboken,
NJ: Taylor and Francis, 2008.
Tomkins, Silvan. Shame and Its Sisters: A Silvan Tomkins Reader. Durham, NC:
Duke University Press, 1995.
Tsing, Anna Lowenhaupt. “On Nonscalability: The Living World Is Not Amenable to Precision-
Nested Scales.” Common Knowledge 18, no. 3 (2012): 505–24.
Turner, Fred. From Counterculture to Cyberculture: Stewart Brand, the Whole
Earth Network, and the Rise of Digital Utopianism. Chicago: University of
Chicago Press, 2006.
Tynan, Elizabeth. Atomic Thunder: The Maralinga Story. Sydney: NewSouth, 2016.
Tynan, Elizabeth. “Sixty Years on, Maralinga Reminds Us Not to Put Security
over Safety.” Conversation, September 25, 2016. http://theconversation.com
/sixty-years-on-maralinga-reminds-us-not-to-put-security-over-safety
-62441.
Tynan, Lauren. “What Is Relationality? Indigenous Knowledges, Practices, and
Responsibilities with Kin.” Cultural Geographies 28, no. 4 (2021): 597–610.
Uliasz, Rebecca. “On the Truth Claims of Deepfakes: Indexing Images and
Semantic Forensics.” Journal of Media Art Study and Theory 3, no. 1 (2022):
63–84.
un News. “ ‘Crimes of Historic Proportions’ Being Committed in Aleppo, un
Rights Chief Warns.” un News, October 21, 2016. https://news.un.org/en
/story/2016/10/543432-crimes-historic-proportions-being-committed
-aleppo-un-rights-chief-warns.
Valla, Clement. Postcards from Google Earth, 2012. http://www.postcards-from
-google-earth.com/.
Vigh, Henrik. “Crisis and Chronicity: Anthropological Perspectives on Continu-
ous Conflict and Decline.” Ethnos 73, no. 1 (2008): 5–24.
Virilio, Paul. War and Cinema: The Logistics of Perception. New York: Verso, 1989.
Vivian, Bradford. Commonplace Witnessing: Rhetorical Invention, Historical Re-
membrance, and Public Culture. New York: Oxford University Press, 2017.
Wall, Tyler, and Torin Monahan. “Surveillance and Violence from Afar: The Politics
of Drones and Liminal Security-Scapes.” Theoretical Criminology 15, no. 3
(2011): 239–54.
Weizman, Eyal. Forensic Architecture: Violence at the Threshold of Detectability.
New York: Zone, 2017.
Westfall, Sammy. “Australia Ranks Last on Climate Action in U.N. Report.”
Washington Post, July 2, 2021. https://www.washingtonpost.com/world
/2021/07/02/australia-climate-action-un-sustainable-development/.
Weston, Kath. Animate Planet: Making Visceral Sense of Living in a High-Tech, Eco-
logically Damaged World. anima. Durham, NC: Duke University Press, 2017.
Whittaker, Meredith. “The Steep Cost of Capture.” Interactions 28, no. 6 (2021):
50–55.
Whyte, Christopher. “Deepfake News: ai-Enabled Disinformation as a Multi-level
Public Policy Challenge.” Journal of Cyber Policy 5, no. 2 (2020): 199–217.
Whyte, Jessica. The Morals of the Market: Human Rights and the Rise of Neoliber-
alism. New York: Verso, 2019.
Whyte, Kyle. “Against Crisis Epistemology.” In Routledge Handbook of Critical In-
digenous Studies, edited by Brendan Hokowhitu, Aileen Moreton-Robinson,
Linda Tuhiwai Smith, Chris Andersen, and Steve Larkin, 52–64. New York:
Routledge, 2021.
Whyte, Kyle. “Indigenous Science (Fiction) for the Anthropocene: Ancestral Dys-
topias and Fantasies of Climate Change Crises.” Environment and Planning
E: Nature and Space 1, nos. 1–2 (2018): 224–42.
Whyte, Kyle. “Settler Colonialism, Ecology, and Environmental Injustice.” Envi-
ronment and Society 9, no. 1 (2018): 125–44.
Wieviorka, Annette. The Era of the Witness. Ithaca, NY: Cornell University Press,
2006.
Wilcox, Lauren. “Embodying Algorithmic War: Gender, Race, and the Posthu-
man in Drone Warfare.” Security Dialogue 48, no. 1 (2017): 11–28.
Wilken, Rowan, and Julian Thomas. “Vertical Geomediation: The Automation
and Platformization of Photogrammetry.” New Media and Society 24, no. 11
(2022): 2531–47.
Williams, Raymond. Marxism and Literature. Oxford: Oxford University Press, 1977.
Wise, Jeff. “How Crazy Am I to Think I Actually Know Where That Malaysia
Airlines Plane Is?” New York Magazine, February 23, 2015. https://nymag
.com/intelligencer/2015/02/jeff-wise-mh370-theory.html.
witness Media Lab. “Mal-Uses of ai-Generated Synthetic Media and Deepfakes:
Pragmatic Solutions Discovery Convening.” witness, June 11, 2018.
Wolfe, Patrick. “Settler Colonialism and the Elimination of the Native.” Journal of
Genocide Research 8, no. 4 (2006): 387–409.
Woods, Derek. “Scale Critique for the Anthropocene.” Minnesota Review, no. 83
(2014): 133–42.
Wu, Xiaoping, and Martin Montgomery. “Witnessing in Crisis Contexts in the
Social Media Age: The Case of the 2015 Tianjin Blasts on Weibo.” Media,
Culture and Society 42, no. 5 (2020): 675–91.
Wynter, Sylvia. “Unsettling the Coloniality of Being/Power/Truth/Freedom:
Towards the Human, After Man, Its Overrepresentation—an Argument.”
cr: The New Centennial Review 3, no. 3 (2003): 257–337.
Yang, Jun, Peng Gong, Rong Fu, Minghua Zhang, Jingming Chen, Shunlin Liang,
Bing Xu, Jiancheng Shi, and Robert Dickinson. “The Role of Satellite Re-
mote Sensing in Climate Change Studies.” Nature Climate Change 3, no. 10
(2013): 875–83.
Yunkaporta, Tyson. Sand Talk: How Indigenous Thinking Can Save the World.
Melbourne: Text, 2019.
Yusoff, Kathryn. A Billion Black Anthropocenes or None. Minneapolis: University
of Minnesota Press, 2018.
Zylinska, Joanna. Minimal Ethics for the Anthropocene. Ann Arbor, MI: Open
Humanities, 2014.
Zylinska, Joanna. Nonhuman Photography. Cambridge, MA: mit Press, 2017.
Index
Italicized page numbers refer to figures. aesthetic interventions and artwork, 13,
98; about nuclear testing, 21, 115, 138–45;
Aboriginal peoples, 18, 146–47, 205n55; drones in, 34–35, 37–39, 53, 107–10, 191n1,
Aboriginal Heritage Act of 1972 (Aus- 192n9, 193n33, 193n43; ecological trauma
tralia), 168–70; military cooptation of and, 117, 125–29, 138–46, 149, 154; insur-
Aboriginal languages, 202n78; nuclear gent aesthetics, 53; significance to the
weapons testing and, 133–39, 141–42, book, 9, 21, 23; and trauma theory, 29–30.
144–45; Stolen Generation, 187n32. See also individual artworks and artists
See also First Nations peoples; Indig- affect theory, 6–7, 185n5. See also machinic
enous peoples; individual nations and affect; traumatic affect
peoples Afghanistan, 32, 38, 46, 52–53, 55, 57, 150,
Abrar, Hisham, 193n37 192n9; Pakistan border, 50, 101–2, 194n51;
activism, 104, 119, 154, 192n13, 200n27; of Uruzgan Province, 1–3, 7, 67
Aleppo Media Center, 60, 63–64; anti- Agile Condor (src Inc.), 43, 49, 59, 66–74,
nuclear, 137, 144, 202n75; call for cloud 76
ethics, 110; of Forensic Architecture, Air Force Research Laboratory (afrl), 66;
20–21, 87, 93–99, 109, 178; humanitarian Moving and Stationary Target Acquisi-
testimony and, 4, 22, 24, 32, 42–43, 95, tion and Recognition (mstar), 69–70
183, 192n9; of Indigenous peoples, 6, 18, Alaimo, Stacy, 122, 190n86
20–21, 115, 137, 144, 169; prodemocracy al-Asad, Bashar, 59–60
protests in Daraa, 59; protests at US– Aleppo Media Center, 60, 63–64
Mexico border, 94; of witness, 90–91. Aleppo, Syria: aftermath of war in, 43,
See also aesthetic interventions and 59–65, 98, 119; Battle of Aleppo, 60; Con-
artwork flict Urbanism project and, 60–63, 65
Alexander, David R., 68 Atacan community of Chile, 12
Alexis-Martin, Becky, 202n80 Atkinson, Meera, 158
Al-Ghazzi, Omar, 63 Australia, 18, 37, 104, 107, 115; Aboriginal
algorithmic enclosure, 4–6, 8, 11, 118, 192n8; Heritage Act of 1972, 168–70; Atomic
autonomous warfare and, 2, 12–14, 42, 56, Weapons Test Safety Committee
79, 83; ethics and, 84–88, 91–92 (awtsc), 201n63; Brockman 4 iron ore
Algorithmic Warfare Cross-Functional mine, 168; climate crisis and, 17, 112–13,
Team (awcft), 99–100, 102, 109. See also 132, 148; “Digital Earth Australia,” 125;
Project Maven Eora Nation, 7; Juukan Gorge, 167–71,
Althusser, Louis, 190n82 205n55; Maralinga, 133–39, 141–42, 144,
Amazon, 11–12, 14, 75 176; Stolen Generation, 187n32; terrorist
Amoore, Louise, 77, 85–86, 96, 101, 110, raids in, 152; Woomera, 138, 142, 202n78
192n8 author positionality, 7, 17–18
ancestral dystopias, 6 Autonomous Real-Time Ground Ubiq-
Anderson, Stephen F., 92 uitous Surveillance Imaging System
Andrejevic, Mark, 46, 69, 72, 78, 195n86 (argus-i s), 47–48, 100
Angerer, Marie-Luise, 164 Azoulay, Ariella, 64–65
anthropocentrism, 33, 116, 179, 182–83;
atomic bomb and, 202n80; climate Baichwal, Jennifer, 123
change and, 6, 13, 121–23, 126, 128, 130–32, Bainimarama, Frank, 112
149; minimal ethics and, 188n39; naming Ballard, Su, 23
of the Anthropocene, 19, 121–23; trauma Barad, Karen, 27
theory and, 30 Behram, Noor, 50–51, 193n31
Arab Spring, 59 Benjamin, Ruha, 102–3
ArcGIS, 100 Berlant, Lauren, 15, 158, 166–67, 171
Arendt, Hannah, 31 Biden, Joe, 198n65
Armenia–Azerbaijan war (2020), 75 Bidjigal people of Eora Nation, 7
artificial intelligence (ai), 26, 38, 40, 75, biopolitics, 12, 16, 22, 29, 35; covid-19 pan-
110, 180, 199n82; Agile Condor target- demic and, 175–76. See also ontopower
ing system, 43, 49, 59, 66–74, 76; Azure Biopolitics of the More-Than-Human
platform, 80; ChatGPT, 82, 105–6; (Pugliese), 22
convolutional neural networks (cnns), black boxing, 68, 77, 82, 86, 104–5, 110, 156
88, 96, 100; Dall-E 2, 83, 92; deepfakes Blaser, Mario, 7, 18, 146, 176
and, 88, 90–91, 197n39; FAccT confer- “Blue Marble,” 117
ence, 105; Figure Eight, 100; generative bnngina (slang for drones), 52
adversarial networks (gans), 88–90, 92, “Boiling Milk” (Halperin), 127–29, 145
197n40; Google Brain, 88; Google Ethical Borges, Jorge Luis: “On Exactitude in Sci-
ai team, 104, 199n78; human labor and, ence,” 80
76, 82, 85, 100–101; Mechanical Turk, 82; Buolamwini, Joy, 104
Midjourney, 83; Potemkin ai, 82; Project Bousquet, Antoine, 41, 45, 69, 73
Maven, 99–100, 103; Stable Diffusion, Brand, Stewart: Whole Earth Catalog, 118
92; Synthesis.ai, 93. See also machine Bridle, James: Drone Shadows, 193n43
learning Brimblecombe-Fox, Kathryn: Theatre of
Asaro, Peter, 79 War: Photons Do Not Care, 34–35
Atacama salt flats, Chile, 123. See also Ata- British Royal Commission into Nuclear
can community of Chile Testing (1984), 133, 144
Browne, Simone, 13 communicative politics: and nonhuman
Browning, Daniel, 139 witnessing, 3, 8, 20, 116, 175–83
Bucher, Taina, 83 Computer Learns Automation (Tan), 106–10
Burtynsky, Edward: “Clear Cut #3,” 123; computer vision, 39, 58, 77, 93, 99–100,
“Morenci Mine #2,” 124; “Salt Pan #18,” 108–9. See also drone vision; machine
123 vision; surveillance
Butler, Judith, 31 Conflict Urbanism project, 60–63, 65
convolutional neural networks (cnns), 88,
capitalism, 18, 25, 43, 114, 133, 148, 187n25, 96, 100
188n43; algorithms and, 79, 104, 152, 160; Cooke, Grayson: “Open Air,” 125–26
extractive industries and, 12, 31, 130–31, corporate violence, 5, 20, 38, 87, 93, 119.
168–71; racial, 12–14, 19, 174–75, 181. See also state violence; structural violence
See also settler colonialism; slavery covid-19 pandemic (2020), 17, 26, 80, 158,
Caruth, Cathy, 147 170, 174–76
Center for Spatial Research, Columbia Creech Air Force Base (US), 1
University: Conflict Urbanism: Aleppo crisis epistemologies, 14–15
project, 60–61 crisis ordinariness, 15, 158–59
Chakrabarty, Dipesh, 121 CrowdFlower platform, 100
Chamayou, Grégoire, 185n1, 192n19 Crutzen, Paul, 19
Chandler, Kate, 192n14 Cubitt, Sean, 16–17, 39–40, 116–17, 157, 175,
ChatGPT (OpenAI), 82, 105–6 177, 181–82; The Practice of Light, 26, 28,
Chayka, Kyle, 81 189n74
Chen, Mel Y., 28, 190n82 cyborg tendency, 26, 63, 164
Chile, 12, 123
Chishty, Mahwish: Drone Art Paintings, 53; da Silva, Denise Ferreira, 31, 56
Drone Shadows, 53–54, 193n43; Reaper, Daston, Lorraine, 24, 115
53–54 data friction, 12, 71, 76, 200n23
Chouliaraki, Lilie, 22, 63 data justice, 104–5, 110. See also activism
Chow, Rey, 166 Davis, Heather, 19
Christianity, 6, 24–25 Death Zephyr (Scarce), 139, 141
Clark, Timothy, 121, 124 deepfakes, 9, 25, 87–92, 102, 197n41; of
“Clear Cut #3” (Burtynsky), 123 Barack Obama, 197n39; political warfare
climate crisis, 11, 133; artwork about, 123–30; and, 198n43. See also synthetic media
climate trauma, 132; climatic media and, Defense Advanced Research Projects
119; Marshall Islands and, 112–15, 148; Agency (darpa), 13, 47; Moving and
scale of, 13, 121–22, 125; settler colonial- Stationary Target Acquisition and Recog-
ism and, 14–15, 18–19, 110, 113, 131. See also nition (mstar), 69–70; offset program,
ecological trauma 75; role in internet creation, 118; Systems
closed world computation, 13, 156 of Neuromorphic Adaptive Plastic Scal-
Cockburn, Andrew, 185n1 able Electronics (SyNAPSE), 69
Cold War era, 113, 117–18; closed world de la Cadena, Marisol, 7, 18, 146, 176–77, 182
computation and, 13, 156 DeLanda, Manuel, 73–75
Cole, Samantha, 88, 197n41 Deleuze, Gilles, 6, 45, 74, 91, 145–49
Columbian Exchange (1610), 19 DeLoughrey, Elizabeth, 113–14
communicative aesthetics, 29 Dencik, Lina, 104
communicative commons, 115 de Pencier, Nicholas, 123
disappearances. See radical absence edge computing, 29, 49, 68
disinformation, 21, 90. See also deepfakes Edney-Browne, Alex, 52
Doane, Mary Ann, 164 Edwards, Paul N., 13, 156, 200n23
Dodson, Pat, 169 Ellis, John, 25, 189n69
Dorrian, Mark, 50, 193n33 empiricism, 7, 22, 76, 87; radical, 6, 17, 45,
Dowdell, Anthony, 160–61 117, 132
Drone Art Paintings (Chishty), 53 Enlightenment, the, 7, 23–24, 31, 177, 179, 181
drones and drone warfare, 8–9, 14–15, 20, Epic Games: Unreal Engine, 93, 95–96
99, 164, 178, 180; AeroVironment rq-11 Estes, Nick, 18
Raven, 44; Bayraktar tb2, 38; bnngina ethics, 28, 42, 59, 77, 102, 120; algorithms
(slang), 52; Castle Bravo, 137; Datta Khel and, 84–88, 91–92; of care, 129, 147; cloud
airstrike, 50, 101–2; gendering and, 50, ethics, 110; Google Ethical ai team,
56, 102–3, 185n1; Global Hawks, 38, 46; 104, 199n78; humanism and, 23, 32–33;
global increase in, 12; kill boxes and, memory and, 29; minimal, 188n39; video
44–45; life under, 32, 38, 46, 51–52, 55–58, editing and, 151
193n37, 194n51; Operation Kamikazi, 137; Eubanks, Virginia, 104
Predator, 1–3, 38, 44, 46, 53, 75, 193n43; Eye in the Sky, 58
racialization and, 50–51, 56, 102–3, 185n1,
193n43; Reaper, 37–39, 44, 46, 53–54, 58, FAccT conference, 105
66, 70, 75–76, 100, 193n43; search for Facebook (Meta), 75, 151, 155–56; bias and,
Malaysian Airlines Flight 370, 156–57; 82–83, 159; digital death and, 153, 160–62,
spatial dynamics and, 35, 37–38, 47, 50, 165–66, 204n34
52, 57, 59–65, 68, 192n19; temporality and, facial recognition, 11, 70, 72, 84; bias and,
44, 47–49, 59, 61–62, 64, 68, 78, 192n19; 104; deepfakes and, 88, 90–91
unmanning and, 44, 192n14; Uruzgan Farid, Idris, 56
airstrike, 1–3, 7, 67. See also aesthetic Farmer, Paul, 31, 191n99
interventions and artwork; drone vision; Farrier, David, 128
surveillance; violent mediations Federally Administered Tribal Areas
drone vision, 193n24, 193n33, 194n51; war- (fata), Pakistan, 55, 194n44
fare and, 46, 50, 57–59, 61, 65, 109, 194n58. feminist critique, 16, 24–25, 64–65, 117,
See also computer vision; machine vision; 189n66, 197n41
surveillance Finn, Ed, 83
First Nations peoples, 6–8, 20, 24, 158, 177,
Earth imaging, 118–21; “Blue Marble,” 117; 182; crisis epistemologies and, 14–15; min-
“Digital Earth Australia,” 125; “Earthrise,” ing sacred sites of, 167–71, 176; nuclear
117; Google Earth and, 80, 100, 196n2; weapons testing and, 9, 115; settler colo-
Landsat and, 61–62, 115, 125–26 nialism and, 17–18, 131, 136, 138, 188n60.
earwitnessing, 52 See also Aboriginal peoples; Indigenous
ecological trauma, 10, 30, 116–17, 132–33, peoples; individual nations and peoples
149; Aboriginal peoples and, 135, 142, Foley, James, 150–52, 166
144–45; aesthetic interventions and, 117, Forensic Architecture (Goldsmiths, Uni-
125–29, 138–45; First Nations peoples and, versity of London), 20, 87, 102, 178; Triple
115, 131; Pacific Islands and, 113–15, 148; Chaser, 21, 93–99, 109
radical absence and, 154, 158, 167, 170, 179; forensic ecology, 22, 56
wounding and, 145–48. See also trauma; Freedom of Information Act (US), 3
traumatic affect Frosh, Paul, 22, 25, 161
Fuller, Matthew, 116, 120–21; Investigative Hellfire missiles, 2, 30, 50, 57, 66
Aesthetics, 21–23, 183 Hogan, Mél, 161
Furuhata, Yuriko, 13, 119 Holmqvist, Caroline, 75
Hoskins, Andrew, 151
Gabrys, Jennifer, 120, 156
Gadigal people of Eora Nation, 7 Illingworth, Shona, 151
Galison, Peter, 24 imperialism, 80, 156, 192n13; First Nations
Galtung, Johan, 191n99 and, 18, 113, 136, 144, 182; insurgent aes-
gaming. See video games thetics and, 53; violence and, 12, 18, 24, 29,
Gebru, Timnit, 104, 199n78 44, 113, 115, 194n44
gender, 24, 53; deepfakes and, 90, 197n41; Indigenous peoples, 18, 23–24, 146–47,
drone systems and, 50, 56, 102–3, 185n1; 201n63; back-to-the-land movement
sexual violence and, 64–65, 187n32; tar- and, 200n21; genocide and, 11–12, 187n32;
geting of “military-aged males,” 2, 58, 102 nuclear weapons testing and, 136, 144.
General Atomics, 68; Reaper drones, 37–39, See also Aboriginal peoples; First Nations
44, 46, 53, 58, 66, 70, 75–76, 100, 193n43 peoples; individual nations and peoples
generative adversarial networks (gans), Institute of Electrical and Electronics Engi-
88–90, 92, 197n40 neers (ieee), 69, 195n83
geographic information systems (gis), 100 insurgent aesthetics, 53. See also aesthetic
Gerrard, John: “The Farm,” 105 interventions and artwork
gilgamesh, 2, 55 Investigative Aesthetics (Fuller and
GitHub repository, 88, 90, 94 Weizman), 21–23, 183
Givoni, Michal, 27 invisual perception, 85, 101, 103, 108. See also
Glissant, Édouard, 110, 179–80 computer vision
Google (Alphabet), 2, 11, 13–14, 105, 198n65; Iraq, 42; Islamic State in Iraq and Syria
bias and, 83, 104; Brain, 88; digital death (isis) and, 59–60, 100, 103, 150, 152; US
and, 161; Earth, 80, 100, 105, 196n2; invasion of, 57
Ethical ai team, 104, 199n78; machine Iron Dome technology (Israel), 74
learning and, 75, 88, 196n5; PageRank, Islamic State in Iraq and Syria (isis), 59–60,
166; Project Maven, 99–100, 103 100, 103, 150, 152
Gregory, Derek, 46, 185n1
ground truth, 26, 56, 70, 91, 164; aftermath Jackson, Zakiyyah Iman, 27, 189n61
in Aleppo and, 61–62 James, C. L. R., 130
Grove, Jairus, 16, 42, 45, 79, 116, 187n25, James, William, 17, 45, 146, 160
196n106 Jetñil-Kijiner, Kathy, 148; “Tell Them,”
Grusin, Richard, 27, 153 113–14
Guattari, Félix, 86, 120, 147, 197n30 Joint Standing Committee on Northern
Gulf War (1991), 46, 192n19 Australia, 168–70, 205n55
Juukan Gorge sacred sites, 167–71, 205n55
Halperin, Ilana: “Boiling Milk,” 127–29, 145
Halpern, Orit, 13, 26, 83, 189n76 Kainaki II Declaration, 112
Haraway, Donna, 24–25, 117, 189n66 Kanders, Warren B., 94
Harkin, Natalie, 144; “mine and refine this Kapadia, Ronak, 53
float . . . ,” 145 Kaplan, Caren, 20, 44, 50, 63, 192n14
Hartman, Saidiya, 31, 191n103 Kausar, Mohammed, 52
Hayes, Burchell, 168 Kelley, Robin D. G., 12
Kember, Sarah, 16, 151 and, 174; opacity and, 179–80; racializa-
kill boxes, 44–45, 192n19 tion and, 31, 177, 180
Kittler, Friedrich, 189n74 Manjoo, Farhad, 80
Kurgan, Laura, 60 Manning, Erin, 164, 172
Māori peoples, 20
Landsat (nasa), 61–62, 115, 126; “Digital Maralinga, Australia, 133–39, 141–42, 144,
Earth Australia,” 125 176
Lapoujade, David, 159–60 Marshall Islands, 113–14, 137, 148
law, Western, 21, 52, 76, 90, 104, 194n44; martial gaze, 41–42, 44, 56, 69, 103
Aboriginal peoples and, 135–37, 168–70; Massumi, Brian, 78, 159; on ontopower, 15,
extractive capitalism and, 31, 168–70; First 72, 87; theorizing affect, 7, 96, 132, 148,
Nations peoples and, 20, 24, 136, 169–71; 164–65, 167, 185n5
terra nullius doctrine, 136, 202n74; Material Witness (Schuppli), 21
witnessing in, 23, 42–43, 56–57, 98, 144, McCosker, Anthony, 44
178, 189n61 media studies, 16, 25; witnessing subfield,
Liu, Cixin: The Three-Body Problem, 129–30 22–23
liveness, 25, 59, 166 mediation, 26, 98, 120, 175, 177; and absence,
“Living Under Drones” report (Stanford 151–67, 171–73; and deepfakes, 91; defini-
University and New York University), 56, tion, 16; and drones, 2–3, 22, 35, 39–79;
193n37 and ecology, 116–19, 125–28, 132; and ma-
Luciano, Dana, 28, 190n82 chinic affect, 86, 101, 114; and nonhuman
witnessing, 7, 28; and nuclear testing, 21,
machine learning, 68–69, 80–81, 83–84, 119, 114, 144–45; violent, 10, 22, 39–79, 101, 114,
196n5, 198n61; aesthetic interventions 119, 132, 146, 149, 178–79
and, 93–99, 105–9; deepfakes and, 88–89, media witnessing, 25–26, 91, 158, 189n69;
92, 102, 197n39; human labor and, 1–2, 12, definition, 22. See also social media;
77, 79, 82, 96, 100, 106–8. See also artificial violent mediations
intelligence (ai); machinic affect Menzies, Robert, 133, 201n63
machine vision, 26, 49, 73, 75, 164, 178. Meta. See Facebook (Meta)
See also computer vision; drone vision; Microsoft, 75; AirSim, 93; Bing, 105; Flight
surveillance Simulator, 80–81, 105, 109; Planetary
machinic affect, 10, 84, 101–3, 197n31; Computer, 119
Computer Learns Automation and, 109, military-aged males, targeting of, 2, 58, 102
110; deepfakes and, 91; definition, 86–87; “mine and refine this float . . .” (Harkin), 145
ecological trauma and, 114–15, 149, 179, misinformation, 21, 88. See also deepfakes
181; invisual perception and, 108; radical Missile Park (Scarce), 139, 142–43
absence and, 154, 158, 163–64, 166, 171; Mitchell, Margaret, 199n78
Triple Chaser and, 94, 96, 98. See also Mittman, J. D., 135–36
affect; machine learning mnemotechnologies, 163. See also drones
Mackenzie, Adrian, 82, 84–85, 108 and drone warfare
Malaysian Airlines Flight 370 (mh370), Moore, Jessa, 160–62
154–57, 165 “Morenci Mine #2” (Burtynsky), 124
Man, the Witness, 6, 19, 23–25, 28–29, 32, Moreton-Robinson, Aileen, 135, 168, 202n71
35–36, 185n4, 188n60; climate crisis and, Morgan, Nyarri, 135
115–17, 126–27, 149; concept of the World Morrison, Scott, 112–13
and, 176–77, 180–81; covid-19 pandemic Morton, Timothy, 132, 201n56
mq-9 Reaper I–III (Pailthorpe), 37–39 Pacific Islands Forum (2019), 112, 183
Multi-Spectral Targeting System (msts), Packer, Jeremy, 12, 40–41, 74, 76
1–2 Paglen, Trevor, 105
Munster, Anna, 84–85, 108 Pailthorpe, Baden, 191n1; mq-9 Reaper I–III,
Murphie, Andrew, 15, 159, 187n29 37–39
Pakistan, 42, 53, 194n51; Federally Adminis-
nasa: Earth Resources Technology Satellite, tered Tribal Areas (fata) of, 55, 194n44;
126; Landsat, 61–62, 115, 125–26 Waziristan, 50, 101–2
Nature, 116, 147, 177, 201n56 Parks, Lisa, 44, 47, 57
natureculture, 10, 117 Pasquale, Frank, 82
necropolitics, 12, 16, 29, 35, 68, 78, 175. Pax for Peace, 119
See also biopolitics; ontopower Pearl, Daniel, 152
neuromorphic computing, 69–71 Peters, John Durham, 22, 25, 130, 187n29
9/11 attacks (September 11, 2001), 12, 25–26, Phan, Thao, 103
158; aftermath of, 46, 150, 152, 192n8 Pinchevski, Amit, 22, 25, 157, 159
Nixon, Rob, 131 pluralism, 146
Noble, Safiya, 83, 104 pluriversal politics, 3, 11, 19, 110, 153, 176–78,
nonhuman witnessing, definition, 3–11, 180, 182, 205n5
16–17, 23–29 Pohiva, Akilisi, 112
nonhuman witnessing, politics of, 175–84 Poitras, Laura: Triple Chaser, 21, 93–99,
nuclear colonialism, 19, 136–38, 142, 144–45, 109
202n75 police brutality, 30, 152, 158, 169, 175, 187n32
nuclear weapons testing, 67, 118, 202n79; “Polymorphism” (Tan), 105
artworks about, 21, 113–14, 138–45; in Pong, Beryl, 49
Australia, 9, 115, 133–39, 201n63, 202n72, porn studies, 90
202n78; on Marshall Islands, 113–14, 137, posthumanist theory, 27, 190n85
148; “Operation Antler,” 134; “Operation Project Maven (Google and US Department
Buffalo,” 134, 137–38; White Sands Missile of Defense), 99–100, 103
Testing Range (US), 202n80. See also proximal policy optimization (ppo), 107–8
nuclear colonialism Puar, Jasbir, 12, 205n60
Nvidia, 95 Pugliese, Joseph, 23, 43, 56, 94–95, 194n49;
Nyong’o, Tavia, 190n85 Biopolitics of the More-Than-Human, 22
Puutu Kunti Kurrama and Pinikura (pkkp)
Obama, Barack, 90, 150, 197n39 peoples, 167–71, 205n55
Oceti Sakowin peoples of Dakota, 18
Öhman, Carl J., 160 queer critique, 27–28, 104, 190n82, 205n60
Oliver, Kelly, 26–27, 32–33, 36, 101
ontopower, 29, 35, 68, 78–79, 87, 175; defini- race and racialization, 16, 31, 118, 180,
tion, 15–16, 72 190n85; Aboriginal peoples and, 145,
opacities, 86, 100, 175–76, 178–81, 183. 187n32; algorithmic bias and, 82–83,
See also black boxing 103–4, 110–11, 159; drone warfare and,
OpenAI, 107; ChatGPT, 82, 105–6; Two- 50–51, 56, 102–3, 185n1, 193n43; First
Minute Papers YouTube channel, 105–6 Nations peoples and, 169, 177; insurgent
“Open Air” (Cooke and Walker), 125–26 aesthetics and, 53; racial capitalism, 12–14,
open-source tools, 20, 60, 90, 93–94, 183, 19, 174–75, 181; surveillance logics and, 13,
200n27 44. See also settler colonialism; slavery
radical absence, 10, 50, 153–54, 163–67, Shah, Nisha, 45
172–73, 179; digital death and, 153, 160–63, Shanahan, John, 100
165–66, 204n34, 204n39; execution of Sharkey, Noel, 76
James Foley and, 150–52; Malaysian Air- Sheikh, Shela, 28
lines Flight 370 and, 155–57; sacred sites Singh, Julietta, 190n82
and, 168–71; traumatic affect and, 157–60 slavery, 12–13, 23–24, 29, 31, 179–80, 191n103;
radioactivity, 137, 141–42, 176, 202n72, plantations and, 18–19, 130–31. See also
202nn79, 203n83 capitalism
Ramstein Air Base, Germany, 1 social media, 13, 17, 20, 98, 109, 111, 130, 154;
Rancière, Jacques, 176 bias and, 82–83, 159; digital death and, 153,
rape, 42, 64–65 160–62, 165–66, 204n34; disappearance
Ray, Una, 139, 142 of Malaysian Airlines Flight 370 and,
Reeves, Joshua, 12, 40–41, 74, 76 155–57; execution of James Foley and,
Rhee, Jennifer, 193n43 150–52; grief and, 203n34; mining sacred
Rio Tinto, 167–70 sites and, 168, 170–71, 176; traumatic
Rise of the Drones (pbs), 48 affect and, 159–60, 162–63, 166. See also
Rothe, Delf, 119–20 individual platforms
Russill, Chris, 17, 118, 187n29 Sotloff, Steven, 151
spatial dynamics, 16, 20, 25, 88, 93, 96, 114,
Sadowski, Jathan, 82 142; climate crisis and, 9, 118, 120–23,
Safariland, 94–96 126–32; drone warfare and, 35, 37–38,
“Salt Pan #18” (Burtynsky), 123 47, 50, 52, 57, 59–65, 68, 192n19; radical
savage ecology, 116, 120 absence and, 154, 157–59, 162–64, 172, 174
Scannell, Paddy, 156–57 Spiking Neural Networks, 69
Scarce, Yhonnie, 115, 138, 144–45; Death Spillers, Hortense, 24, 31
Zephyr, 139, 141; Missile Park, 139, 142–43; src Inc., 66, 69–70. See also Agile Condor
Thunder Raining Poison, 139–40 Stahl, Roger, 57
Schmidt, Eric, 14, 198n65 Starosielski, Nicole, 58
Schuppli, Susan, 22–23, 43, 52, 63, 171; state violence, 5, 12–16, 20, 31–32, 79, 84, 119,
Material Witness, 21 178. See also drones and drone warfare;
Sear, Tom, 91 settler colonialism; structural violence
Seaver, Nick, 83 Stewart, Kathleen, 19–20, 172
Seltzer, Mark, 158 Stiegler, Bernard, 163
September 11, 2001 (9/11), attacks, 12, 25–26, Stoermer, Eugene, 19
158; aftermath of, 46, 150, 152, 192n8 Stolen Generation, 187n32
settler colonialism, 11, 19–20, 29, 111, 117, structural violence, 5, 14, 31, 61, 190n82,
121, 148, 185n4, 187n25, 188n43; author 191n99. See also settler colonialism;
positionality and, 7, 17–18; environmental slavery; state violence
injustice and, 5, 12, 115, 131; genocide and, structures of feeling, 122, 172, 201n41
19, 169, 187n32; possessive logics of, 39, Stubblefield, Thomas, 50, 191n1, 193n33
135–36, 169–70, 202n74; racist other- Suchman, Lucy, 78
ing and, 24–25, 44, 82–83, 103, 190n82; surveillance, 44, 56, 155–56; Agile Con-
resistance to, 13, 53, 110, 137–39, 142, 145, dor, 43, 49, 59, 66–74, 76; argus-i s,
168–69, 172, 177, 202n75; structural nature 47–48, 100; climate-monitoring systems,
of, 5, 14, 23, 31, 35, 113, 144, 169–71, 194n44 9, 13–14, 20, 113, 119–20, 126, 187n29,
sexual violence, 42, 64–65, 187n32 200n23; covid-19 pandemic and, 174–75;
drones and, 12, 32, 46, 51–52, 59, 194n58; trauma studies, 29–30, 32, 147–48
Earth imaging, 61–62, 80, 100, 115, 117–21, traumatic affect, 151–54, 156–60, 164, 166–67,
125–26, 196n2; gilgamesh, 2, 55; nuclear 171–73; digital death and, 162–63, 165.
monitoring systems, 67, 118; skynet, See also affect theory; trauma
55; wide area motion imagery (wami) Triple Chaser (Poitras and Forensic Archi-
initiatives, 41, 47–49, 71, 100. See also tecture), 21, 93–99, 109
computer vision; drone vision; machine Tsing, Anna, 130
vision Tynan, Lauren, 135, 147
synthetic media, 66, 88, 90–91, 93–99.
See also deepfakes Ukraine, 2022 Russian invasion of, 38, 75
Syria, 42; aftermath of war in Aleppo, Uliasz, Rebecca, 90
43, 59–65, 98, 119; Islamic State in Iraq uncommmons, 206n21
and Syria (isis) and, 59–60, 100, 103, uncommons, 206n21
150, 152 Research (unitar): unosat (United
Nations Satellite Center), 60
Taffel, Sy, 82 US Defense Innovation Board, 14, 198n65
Tahir, Madiha, 42, 194n44 US Department of Defense (DoD), 198n65;
Tan, Kynan: Computer Learns Automation, Defense Intelligence Enterprise, 99–100;
106–10; “Polymorphism,” 105 Project Maven, 99–100, 103; report on
Taussig, Michael, 191n102 Uruzgan airstrike, 2–3
television, 25, 158, 189n69 US Special Forces, 2, 68
“Tell Them” (Jetñil-Kijiner), 113–14
temporality, 16, 18, 87, 98, 147, 187n36; Valla, Clement: Postcards from Google
climate crisis and, 114, 117–23, 125–30, 132, Earth, 196n2
145, 149; drone warfare and, 44, 47–49, video games, 38, 70, 162–63; Microsoft
59, 61–62, 64, 68, 78, 192n19; machine Flight Simulator, 80–81, 105, 109;
learning and, 100, 105, 107–8; of nuclear OpenAI, 106; Second Life, 93; Unigine,
radiation, 116, 133, 141–42, 203n83; radical 105; Unreal Engine, 93, 95–96
absence and, 154–55, 161–64, 172; signifi- Vigh, Henrik, 15
cance to the book, 8–9 violent mediations, 10, 101–2, 178–79;
terra nullius, 18, 136 aftermath of drone warfare and, 2–3,
Theatre of War: Photons Do Not Care 22, 49–50, 52, 57–65; climate crisis and,
(Brimblecombe-Fox), 34–35 114–16, 119, 132, 146, 149; drone systems
“The Farm” (Gerrard), 105 and, 39–49, 67, 72–73, 76–79, 84; radical
Thrift, Nigel, 163, 167 absence and, 151–54, 157–60, 172–73; rape
Three-Body Problem, The (Liu), 129–30 and, 64–65
Thunder Raining Poison (Scarce), 139–40 Virilio, Paul, 67
Todd, Zoe S., 19 von Neumann, John, 118
Tomkins, Silvan, 91
trans-corporeality, 190n86 Walker, Emma: “Open Air,” 125–26
trauma, 9, 15, 59; mythologizing and, Wark, Scott, 103
205n60; used to radicalize, 151; witnessing war rugs, 53, 192n9
in response to, 4, 21, 29, 32–33, 180–82; Watson, David, 160
wounding and, 50, 145–48. See also eco- Waup, Lisa, 142
logical trauma; traumatic affect; trauma Waziristan, Afghanistan–Pakistan border,
studies 50, 101–2
Weizman, Eyal, 22, 32, 93, 95, 102, 119, 183; Woods, Derek, 122
Forensic Architecture, 21; Investigative Woomera, Australia, 138, 142, 202n78
Aesthetics, 21–23, 183. See also Forensic Work, Robert, 99
Architecture worldings, 19–20, 176–77
Weston, Kath, 174 wound culture, 158. See also trauma; violent
Whanganui River, New Zealand, 20 mediations
white settler sovereignty, 135, 137, 168–69, Wynter, Sylvia, 6, 31, 116, 118, 185n4
202n71. See also nuclear colonialism; set-
tler colonialism xenowar, 91. See also deepfakes
Whole Earth Catalog (Brand), 118
Whyte, Jessica, 192n13 Yami, Lester, 133, 135, 139, 144
Whyte, Kyle Powys, 6, 14, 18, 131 YouTube, 58, 90, 156; aftermath in Aleppo
wide area motion imagery (wami) initia- and, 61, 63; Agile Condor videos, 71;
tives, 41, 47–49, 71, 100 execution of James Foley and, 150, 152;
Wilcox, Lauren, 103, 185n1 Two-Minute Papers channel (OpenAI),
Wilken, Rowan, 44 105–6
Williams, Raymond, 201n41 Yusoff, Kathryn, 188n43
witness, 90–91
Wolf, Asher, 104 Zylinska, Joanna, 16, 28, 122, 128, 151; mini-
Wolfe, Patrick, 31, 187n32 mal ethics, 188n39