Exhibitions, Research, Criticism, Commentary

A chronology of 3,585 references across art, science, technology, and culture
“Even the word cypherpunk, I think it’s at least two-thirds gentrified at this point.”
– Ethereum co-founder Vitalik Buterin, on the difficulty of identifying good-faith actors in crypto. In conversation with Tor Project’s Roger Dingledine at Funding the Commons Buenos Aires, Buterin advises seeking “high integrity people” and following a moral compass when navigating an ecosystem full of opportunists. [quote edited]

“Today Is a Good Day to Discuss Digital Rights” at Espacio Fundación Telefónica Madrid features Aram Bartholl, Eva & Franco Mattes, United Visual Artists, and others, exploring online power dynamics through humour and provocation. Curated by Fundación Telefónica with Domestic Data Streamers, the show’s installations articulate how online actions shape (and erode) rights—from expression and data control to identity and the right to be forgotten.

“Data breaches are a perpetual concern with any data collection. Biometrics magnify that risk because your face cannot be reset, unlike a password or credit card number.”
– Attorney Mario Trujillo, framing the irreversible nature of biometric data collection in his analysis of Amazon Ring’s upcoming ‘Familiar Faces’ feature. The video doorbell will scan everyone who approaches Ring cameras—including people who never consented—and retain untagged faces for up to six months.
“It’s not that these guys are so smart that they’re running circles around Congress. It’s that Congress created the enshittogenic environment and then we got enshittocene.”
– Internet activist and sci-fi writer Cory Doctorow, tracing the rapid decline of user welfare—enshittification—back to the 1998 U.S. Digital Millennium Copyright Act. The law effectively criminalized reverse engineering, Doctorow explains, paving the way for closed, extractive platforms by limiting choice, interoperability, and privacy.
“Projects involving mixers, zero-knowledge proofs, multi-party computation, and other privacy-preserving protocols could face existential legal risk—not for what they do, but for how someone uses them.”
– Kelman PLLC, warning that crypto developer Roman Storm’s conviction for regulatory violations sets a dangerous precedent. Storm was charged after North Korean hackers used his open source Tornado Cash protocol to anonymize (launder) stolen funds; his case blurs the line between “software development and criminal facilitation,” the lawyers argue. [quote edited]
“You’re asking millions of people to submit sensitive information to access legal content. That opens the door to leaks, abuse, and misuse of data.”
– Windscribe CEO Yegor Sak, on the UK’s Online Safety Act, which requires AI-driven age verification via facial recognition, government IDs, or credit card checks. Since the law took effect July 25th, Virtual Private Network (VPN) signups have surged 1,400% as UK citizens resist algorithmic oversight by masking their location.

“Anything you put online can and probably has been scraped,” concludes AI ethics researcher William Agnew after finding thousands of personal documents in a tiny sample of DataComp CommonPool. The massive dataset, used to train image generation models, likely contains hundreds of millions of private photos, IDs, and résumés scraped from the web. As journalist Eileen Guo notes, the findings expose “the original sin of AI systems built off public data—it’s extractive, misleading, and dangerous.”

“As a paid exhibition, visitors invested time and money to support the artist and institution without damaging artworks or disrupting operations. Our photos were taken in good faith, reflecting genuine engagement with the exhibition.”
– Chinese museum visitors, after Singapore-based artist Heman Chong allegedly re-shared their exhibition selfies from UCCA Dune on Instagram, derisively claiming they “used my work as a backdrop for their narcissism.” The group filed a complaint citing privacy violations and public humiliation.
“There is a Latin America testing ground for products. If they are successful, they tend to be deployed in other jurisdictions, oftentimes with additional safeguards, sometimes not.”
– Tech policy researcher Ayden Férdeline, on how Latin America has become a hotbed for workplace surveillance startups. The venture-capital funded bossware technologies include biometric tracking, AI-powered productivity monitoring, and predictive analytics that continuously collect worker data.
“America already has all the technology it needs to build a draconian surveillance society—the conditions for such a dystopia have been falling into place slowly over time, waiting for the right authoritarian to come along and use it to crack down on American privacy and freedom.”
– Ian Bogost and Charlie Warzel, describing the data-authoritarianism taking shape under Donald Trump. “DOGE is the logical end point of the Big Data movement,” they warn of government dataset exfiltration and merging.
“This is the nightmare scenario.”
– California Senator Tom Umberg, on 23andMe’s recent bankruptcy. The California Genetic Privacy Rights Act author and other privacy advocates are sounding the alarm that if the millions of genetic profiles users submitted to the biotech company go up for auction, it will be an unprecedented data privacy disaster.
“If a government comes knocking at Telegram’s door asking for information on a wrongdoer, real or perceived, Telegram doesn’t have the same safety that its peers do. An end-to-end encrypted service can sincerely tell law enforcement that it can’t help them.”
– TechScape columnist Alex Hern, on the arrest of Telegram CEO Pavel Durov. Hern observes that, had Telegram adopted end-to-end encryption in the first place, Durov would probably never have been arrested for hindering French law enforcement when he landed in Paris on August 25th.
“We need acts of translation, and I think artists are supplying images and metaphors that we can use as a common currency. These metaphors are alternatives to those offered by corporations.”
– KW Berlin curator Nadim Samman, on the “profound alienation” at the heart of the current “Poetics of Encryption” exhibition. Channelling a cultural mood in conflict with obscured algorithmic regimes, the included works “express the anxiety, paranoia, and political rage we have around opaque tech.”

Simone C Niquille’s CGI film Beauty and The Beep (2024) premieres at EXPOSED Torino Foto Festival (IT), completing the Dutch artist’s trilogy on cohabitation with computer vision. Following an AI-trained computer model of a ‘smart chair’ trying—struggling—to find a place to sit, the film playfully collages evidence of the modern datafied home: the chair is designed after Bertil, the first IKEA product advertised with synthetic imagery, while the parkour course resembles Boston Dynamics’ model home for robot dogs.

UC San Diego’s Mandeville Art Gallery opens “Bodily Autonomy,” Lauren Lee McCarthy’s largest solo show in the U.S. to date. Curator Ceci Moss brings together two major series of works—Surrogate (2022) and Saliva (2022)—in which the Chinese-American artist examines bio-surveillance through performances, videos, and installations. A newly commissioned Saliva Bar, for example, invites visitors to reflect on data privacy, race, gender, and class as they pertain to genetic material while trading spit samples.

“The growing awareness that unchecked centralization and over-financialization cannot be what ‘crypto is about,’ and new technologies like second-generation privacy solutions and rollups are finally coming to fruition, present us with an opportunity to take things in a different direction.”
– Ethereum co-founder Vitalik Buterin, calling for developers to “make Ethereum cypherpunk again” by focusing on privacy-enhancing tools and public goods—not casinofication. [quote edited]

The third edition of Japan’s Osaka Kansai International Art Festival ponders urban futures with a group exhibition that asks “STREET 3.0: Where Is The Street?” Curators Miwa Kutsuna and Yutaro Midorikawa present works by international artists that hack the city with technology (Aram Bartholl, Simon Weckert, AQV-EIKKKM), calligraphy, or scent. Bartholl’s Dead Drops (2010–), for example, a network of over 1,400 USB flash drives embedded in urban surfaces, invites offline data sharing.

“On the whole, despite the ‘dystopian vibez’ of staring into an orb and letting it scan deeply into your eyeballs, it does seem like specialized hardware systems can do quite a decent job of protecting privacy.”
– Ethereum co-founder Vitalik Buterin, assessing Tools for Humanity’s plan to confirm proof-of-personhood for the global populace by scanning their irises for the Worldcoin project. While he concedes it will probably soon be necessary to distinguish humans from AI, Buterin warns of the triple threat of security vulnerabilities, identity black markets, and overly centralized hardware.

Worldcoin, a proof-of-personhood digital identity system for a future full of AI agents, launches. An initiative of Tools for Humanity (led by OpenAI’s Sam Altman and engineer Alex Blania), it proposes iris-scanning everyone on earth to assign each an anonymized biometric identity—and a related cryptocurrency. Anticipating AI-induced cultural shifts, Altman and Blania claim Worldcoin will let users “prove you are a real and unique person online” and assist in universal basic income (UBI) disbursement.

