HOLO is an editorial and curatorial platform, exploring disciplinary interstices and entangled knowledge as epicentres of critical creative practice, radical imagination, research, and activism.
New York Times tech reporter Kashmir Hill chronicles Clearview AI, the facial recognition company with far-right ties that emerged during the Trump era and whose technology has been at the centre of numerous privacy and civil liberties controversies.
Aram Bartholl bids farewell to his 2010 Google Street View performance 15 Seconds of Fame, after the company updated its severely outdated Berlin image set. In October 2009, the German artist interrupted his coffee break on Borsigstraße to run after a passing Google Street View car, creating the whimsical chase sequence that had been online since the service launched in Germany in 2010. “15 Seconds of Fame turned into almost 15 years,” Bartholl jokes on Instagram. “The work is finally complete.”
“The majority of people aren’t users but subjects of AI. It’s not a matter of individual choice. Most AI determinations that shape our access to resources are behind the scenes in ways we probably don’t even know.”
– Signal Foundation president and AI Now Institute co-founder Meredith Whittaker, discussing AI ethics with Credo AI’s Navrina Singh and the Distributed AI Research Institute’s Alex Hanna at the Bloomberg Technology Summit. “AI is a surveillance technology,” Whittaker insists. “The Venn diagram of AI concerns and privacy concerns is a circle.”
“Projects such as Aadhaar propose a distinction between ‘identity and identification’—the former an amalgamation of social relations and historical processes, and the latter touted as a neutral act of correlating one piece of information to another.”
– Indian writer Arushi Vats, framing the Aadhaar biometric ID system. Drawing from her biography and critical theory, Vats ruminates on living with and resisting the 12-digit unique identity number assigned to every Indian citizen.
“Where Big Data is merely aestheticized, a new court art is created, in whose flickering lights you can ‘talk about e-cars’ with politicians and lobbyists undisturbed, as entrepreneur Frank Thelen enthusiastically posted.”
Celebrating her pioneering “seer-like spaces and live surveillance situations,” the retrospective “Julia Scher: Maximum Security Society” opens at Museum Abteiberg in Mönchengladbach (DE). The “essayistic survey” scans the American artist’s entire oeuvre of power- and gaze-focused works, from Predictive Engineering, her live camera installations iterated at SFMOMA over the years (1993–2016), through Delta (Radio) and Planet Greyhound, both produced for her recent Kunsthalle Gießen exhibition (2022).
“I’m really interested in systems of power, and it often feels like there’s a secret inside these systems, because they function in ways that are almost invisible.”
– Conceptual artist Jill Magid, on how the nuances of power dynamics are often hidden in plain sight. Art that exposes these secrets, she says, “offers a way to slow down these systems and look at them in a different way.”
– Software artist Adam Harvey, warning about the use of Creative Commons licenses. Photos of people shared with the latter “can be freely redistributed in biometric AI and machine learning databases with virtually no legal recourse,” writes Harvey, referencing his 2022 research for the Open Future think tank’s AI_Commons project.
“The Google Street View data set is often stunning and often useful. But as a project, it was a grotesque violation of worldwide privacy norms that absolutely never should have happened.”
– American writer Joanne McNeil, reminding us that between 2007 and 2010, Street View cars also collected emails, passwords, and other private information from WiFi networks in more than 30 countries. “We should never take a project at such a scale at face value,” McNeil warns.
Taipei’s Taiwan Contemporary Culture Lab (C-LAB) opens “The Unrestricted Society,” a group show probing the freedoms brought about by modern technology. Curator Chuang Wei-Tzu gathers works by Memo Akten, Paolo Cirio, Cheng Hsien-Yu, Theresa Schubert, Chang Yung-Ta and others that investigate agency in the age of mass surveillance. Kyriaki Goni’s CGI narrative Not Allowed for Algorithmic Audiences (2021, image), for example, features a smart assistant bent on reconciling humans and machines.
American logistics company UPS begins installing in-truck surveillance cameras. This summer, drivers have reported back-of-truck cargo area temperatures of 49°C, and in a move that made workers bristle, UPS rolled out Lytx telemetry cameras (image), which track GPS and monitor for “behaviours associated with collisions”—not air conditioning. “Whatever its capabilities, the mere presence of the camera has stoked fear and paranoia among my coworkers,” writes driver Matt Leichenger.
An incisive Big Tech critique and counter-control guide that offers texts (by Bruce Sterling, among others) and artistic research to highlight “the overt yet covert ways in which tools and platforms control our lives.”
An excavation of the legacy of sci-fi author Octavia E. Butler, American Artist’s solo exhibition “Shaper of God” opens at REDCAT in downtown Los Angeles. Both artist and author spent their formative years in the Pasadena region, a shared history the artist translates into ruminations on “technology, race, surveillance, identity, and place” that map Southern California sites inspired by Butler’s novels and life: To Acorn (1984) (2022), for example, is a sculpture resembling the city bus stops that Butler would have waited at.
“Enough people purchased the preservative to attempt suicide that the company’s algorithm began suggesting other products that customers frequently bought along with it to aid in such efforts.”
– Megan Twohey & Gabriel J.X. Dance, on a dark twist of Amazon’s ‘frequently bought together’ recommendation engine. The journalists reveal that while eBay and Etsy have stopped carrying a chemical compound linked to scores of suicides, Amazon has dragged its heels.
“Neural networks are anti-fragile. Attacking makes them stronger. So-called adversarial attacks are rarely adversarial in nature. Most often they are used to fortify a neural network.”
– American artist and anti-surveillance researcher Adam Harvey, admitting defeat in the face of AI-powered computer vision systems. In his landmark project CV Dazzle (2010), Harvey famously defeated the CCTV-era Viola–Jones Haar Cascade face detection algorithm with low-cost makeup and hair hacks—a tactic he now deems no longer relevant. “Resistance can only happen at a collective level,” Harvey argues.