Provocation
Anti-Computing: Models and Thought Experiments
Speakers:
Jackie Wang, Ramon Amaro, Maya Indira Ganesh, Nora N. Khan
Profile:
Jackie Wang
Jackie Wang is a poet, multimedia artist, and scholar of the history and political economy of prisons and police. She is an assistant professor of American Studies and Ethnicity at the University of Southern California. Wang’s first book, Carceral Capitalism (2018), is a widely cited collection of essays on the racial, economic, political, legal, and technological dimensions of the U.S. carceral state.
Profile:
Ramon Amaro
Ramon Amaro is an engineer, sociologist, and cultural theorist in machine learning and AI research. He received a PhD from Goldsmiths, researching the philosophy of machine learning and the history of racial bias in mathematics. He is an Assistant Professor at University College London, and author of The Black Technical Object: On Machine Learning and the Aspiration of Black Being (2022).
Curatorial Frame (Excerpt):
Ramon Amaro’s and Jackie Wang’s respective practices as thinkers and writers have guided many of us into the break: a break where we can begin to consider the possibility of what Caroline Bassett calls “anti-computing,” the histories of dissent against computational logics, utopias, and imaginaries. In his research, study, and forthcoming book (see bio), Amaro delves into the gaps in the relationship between the black experience, machine learning research, and the racial history of scientific explanation. Wang, through books, poems, performance, and scholarship, traces an understanding of carceral politics and predictive logic towards a politics of disruption, resistance, and liberation. Jackie Wang meets Ramon Amaro in the dream state of co-thinking to discuss Sylvia Wynter, Frantz Fanon, and sociogenic alienation. Together, they offer us ways to understand the condition of being subject to computing.
Soundbite:
“I study the history of prisons and police, looking at the political economy and technological dimension of the U.S. carceral state. A lot of my recent research is on the history of voice surveillance, looking at how World War 2 research happening at Bell Labs evolved into forensic voice identification.”
Jackie Wang, introducing her current research
Takeaway:
Computer vision is widely discussed, but how AI surveils sound is less familiar to the public. In Wang’s telling, ‘voiceprints’ of the incarcerated are extracted from outgoing prison phone calls for what she derides as dubious “fraud prevention.” “It’s actually used as a mechanism of controlling the incarcerated population. And even non-incarcerated people who talk to incarcerated people can be in these databases,” she says, underscoring how far this audio surveillance reaches.
Takeaway:
AI and computational social sorting and disciplinary regimes are not new. They have a long history of “using symbolic languages in order to disrupt not only a sense of motion, and being,” notes Amaro. An important step in engaging contemporary AI and machine learning discourse is to peel back the veneer of novelty that (conveniently) obscures a much longer history of problematic statistical modelling.
Soundbite:
“I have a brother who is incarcerated. I was creeped out when I learned about audio surveillance because we often talk on prison phone lines. The only way you can circumvent them is to have a contraband cellphone.”
Jackie Wang, revealing that voice surveillance and monitoring hit close to home for her and are not just ‘research’
Soundbite:
“Telecom companies take the data from prison calls and create other security products to sell to prisons. They say: ‘we can use this technology to identify criminal activity on these phone lines’ or ‘this phone number is in contact with multiple incarcerated people, maybe there is political organizing going on.’”
Jackie Wang, on the extractive industries built out of datasets collected from incarcerated populations
Precedent:
Coming under increasing scrutiny in recent years are inmate telephone systems (ITS), a privatized sub-sector adjacent to the American prison industry that profits from the phone calls of the incarcerated. The market is essentially a duopoly: two companies, Global Tel Link and Securus Technologies, hold roughly 70% of it, and prison populations and their families pay extortionate rates in order to stay in contact. Beyond deepening the precarity of families with incarcerated members, the price gouging is doubly damaging: it can discourage or limit contact, further compounding the deleterious mental health effects of serving time in prison.

Images: Securus Technologies promotional video
Soundbite:
“This year, Elon Musk was asked about the current state of limitations of autonomous vehicle systems and AI, and he answered ‘nothing is where it is supposed to be.’ I keep returning to that idea when thinking about racial processes and epistemic violence: Care is not where it is supposed to be. Community isn’t where it is supposed to be. Equitability isn’t where it is supposed to be.”
Ramon Amaro, turning Elon Musk’s claims about engineering problems into a prognosis of sweeping cultural maladies
Soundbite:
“If we’re to relate that to Jackie’s intervention: these individuals in the carceral system are not where they’re supposed to be. Because the system itself is designed as an apparatus which is in place to disrupt an authentic sense of being.”
Ramon Amaro, connecting his sentiments with Wang’s research
Precedent:
Much of Wang and Amaro’s discussion plumbs the depths of French psychoanalyst Jacques Lacan’s notion of the mirror stage as interpreted by postcolonial theorist Frantz Fanon. The exchange largely centres on a footnote in Fanon’s 1952 book Black Skin, White Masks, and uses that reading of the mirror stage of development to consider the nature of the contemporary subject. “In Fanon’s analysis of the mirror stage, you have a child who is in front of a mirror and their caretaker is affirming their identification in the mirror. In the Fanonian formulation, the image in the mirror is already loaded with the social and the historical and the cultural … it’s not just a universal process that is devoid of race, essentially,” says Wang, summarizing the analysis.
Soundbite:
“You hear people in the AI ethics space talking about bias in facial recognition. ‘How can we make facial recognition more accurate?’ becomes the framework for thinking about the ethics of facial recognition. I’d be curious to hear about your thinking about legibility and opacity related to this technology.”
Jackie Wang, posing Ramon a question
Soundbite:
“Especially with the carceral system, this attempt to capture that moment. Discretize that moment into a symbolic language. Re-articulate that moment into a prognosis or type of prediction—and then take the audacious step of regurgitating that moment back to us. If we think about the real violence in that step, it’s there in which I’m concerned about what you call the irreducibility of this very personal, individual, and collective experience into that which becomes a technological problem.”
Ramon Amaro, breaking down what happens during data capture
Precedent:
Wang puts her research on carceral capitalism into conversation with the current Big Tech-meets-law-enforcement discourse in “Abolition of the Predictive State,” an essay guest editor Nora N. Khan commissioned for HOLO 3. Tracing a long historical arc of risk modelling and eugenic profiling, she undermines the notion that AI-assisted policing could ever be neutral: “though it is possible to come up with abolitionist applications for data-driven tools—allocating resources, modelling environmental racism, identifying violent cops—we must remain vigilant to the seduction of techno-solutionism,” she writes.

Image: Predictive policing GUI mockup depicting ‘hotspot’ analysis with suggestions for officers to make to residents “to reduce or deter crime” / patent: George O. Mohler, US 8,949,164 B1 (2015)
Soundbite:
“For me, the carceral state is more damning to those that are not locked up than those who are. Because what it really says is the process whereby an individual becomes a normal citizen flows through physical confinement, systemic assignment, assignment to violence, and if you reach the end—because this is recidivism right?—you’re a normal citizen. What does that say about us? Have we actually gone through that process?”
Ramon Amaro, asking the American public to look in the mirror and reflect on the nature of their freedom
Soundbite:
“That’s a major Foucauldian insight. That disciplinary logics don’t begin and end at the prison. It’s embedded in our schooling and in the workplace. You know, the workhouse and the prison were one institution, and those logics are diffused across different social domains.”
Jackie Wang, on widespread social conditioning
Soundbite:
“I wonder if you’ll join me, and I’m going to put my algorithmic glasses on: I want to read this poem, but I want to think about the algorithmic at the same time. And then I have a provocative question for you.”
Ramon Amaro, before reading Wang’s poem “Masochism of the Knees” to her
Precedent:
Masochism of the Knees

Who is the girl forced to kneel on dried chickpeas to atone for the sin of being alive?
In the dream blindfold and bandage are one.
My hands go numb as I carry dried chickpeas.
In my head there is a voice that says “naked forest” and “a tiny photograph that is passed between hands in the dark.”
Why doesn’t the girl on the floor of the world talk?
Because she is a wound on the earth’s hide.
Not mouth.
Do you understand?
Wound, not mouth.


From Wang’s poetry collection The Sunflower Cast a Spell to Save Us From the Void (2021)
Soundbite:
“My provocation is: Are you suggesting that there is then the potentiality of atonement for this type of being in the numbness for what we might call the algorithmic?”
Ramon Amaro, putting Wang’s poem under the microscope
Soundbite:
“Gosh! I’m not sure if I have an answer for you. But it was really uncanny hearing you read the poem. I was thinking: Maybe asking a poet what their poem means is kind of tricky. But as you were reading—and this connects to our Fanon conversation—I realized there was a shift in the landscape of my dream life … I would have these guilt dreams, but you couldn’t really trace the source of the guilt.”
Jackie Wang, delving into the psychology underpinning her poem
Soundbite:
“I’m not necessarily pessimistic about the possibility of creating disruptive technologies, but I think we have to deal with the racial capitalism, homophobia, white supremacy piece of it. I don’t really think you can separate the two.”
Jackie Wang, answering an audience question about the possibility of designing “soft, yielding, reasonable machines”
Soundbite:
“Me asking for a benevolent machine is me taking away my own accountability to make a benevolent life. I think if a machine were to be benevolent we wouldn’t even see it. Because we’ve never seen that before—not on a global scale.”
Ramon Amaro, reminding us that tech will not save us (from ourselves)