1,567 days, 2,405 entries ...

Newsticker, link list, time machine: HOLO.mg/stream logs emerging trajectories in art, science, technology, and culture––every day
“I wanted to have you on and Apple asked us not to do it. They literally said, ‘Please don’t talk to her.’ Like, what is that sensitivity? Why are they so afraid to even have these conversations out in the public sphere?”
– Political comedy titan and returning Daily Show host Jon Stewart, telling U.S. Federal Trade Commission chair and antitrust lawyer Lina Khan about his former bosses interfering in his short-lived current affairs program on Apple TV+. Launched in 2021, The Problem with Jon Stewart was cancelled after only two seasons due to ‘creative differences.’
“By designing a firehose of addictive content [and] displacing physical play and in-person socialising, these companies have rewired childhood and changed human development on an almost unimaginable scale.”
– Social psychologist and The Anxious Generation (2024) author Jonathan Haidt, on how social media giants callously fuel the raging youth mental health crisis. “The companies had done little or no research on the mental health effects of their products on children and adolescents, and they shared no data with researchers studying the health effects.”
“Apple selectively restricts access to the points of connection between third-party apps and the iPhone’s operating system, degrading the functionality of non-Apple apps and accessories.”
– U.S. Attorney General Merrick Garland, describing the iPhone and App Store as an anti-competitive walled garden. “Apple has held a dominant market share not because of its superiority, but because of its unlawful exclusionary behaviour,” he says in a speech announcing sweeping antitrust action against the tech giant.
OUT NOW:
Eleanor Drage & Kerry McInerney
The Good Robot
Building on their eponymous podcast (2021-), Cambridge University researchers Eleanor Drage and Kerry McInerney explore “why technology needs feminism” with leading feminist thinkers, activists, and technologists.
OUT NOW:
Yanis Varoufakis
Technofeudalism
Maverick economist Varoufakis puts Silicon Valley in the crosshairs, arguing Big Tech has supplanted traditional capitalism and hoovered up much of the world’s capital to “construct private cloud fiefdoms and privatize the Internet.”
“I’d build one-of-a-kind VR headsets into big masks from different cultures, sometimes adding lightning bolts and feathers. I wanted the headsets to be vibrant, exciting objects that enriched the real world, too.”
– American computer scientist, author, and VR pioneer Jaron Lanier, reminiscing on the technology’s early days when, contrary to today’s efforts to make them disappear, VR goggles were exciting aesthetic objects unto themselves. “If you’re going to wear a headset, you should be proud of that weird thing on your head!” Lanier writes.
“Be very wary of profit-driven corporations using the AGI patina of mysticism to market centralized tech always ultimately developed in service of growth.”
– Signal Foundation president and AI Now Institute co-founder Meredith Whittaker, on Meta joining the race for artificial general intelligence (AGI). “AGI is a marketing term overlaid with quasi-religious symbolism,” Whittaker warns, reminding her followers that the term AI, too, was coined in 1956 to attract grant money.
“Your own sense of reality becomes increasingly specific to you and your synthetic friends, but this isn’t happening on a neutral plane. Your friends work for giant corporations and are designed to extract as much value from you as possible.”
– American artist Trevor Paglen, at the 37th Chaos Communication Congress (37C3), illustrating a dystopian near-future of corporate AI companionship, where “emotional manipulation will be the name of the game.”
OUT NOW:
Joanne McNeil
Wrong Way
Erudite tech critic Joanne McNeil debuts as a novelist, exploring Silicon Valley hubris, self-driving cars, and the “treacherous gaps between the working and middle classes wrought by the age of AI.”
“Silicon Valley runs on VC hype. VCs require hype to get a return on investment because they need an IPO or an acquisition. You don’t get rich by the technology working, you get rich by people believing it works long enough that one of those two things gets you some money.”
– Signal Foundation president Meredith Whittaker, demystifying the AI revolution at the Washington Post Futurist Summit. “We need to be clear about what we are responding to: ChatGPT is an advertisement—a very expensive advertisement,” Whittaker insists.
David Golumbia
(1963-2023)
Fierce digital culture critic David Golumbia dies after a battle with cancer. Author of The Politics of Bitcoin (2016) and the forthcoming Cyberlibertarianism, the American researcher examined financialization, language, and software. Golumbia was an Associate Professor in the English Department at Virginia Commonwealth University.
OUT NOW:
Sanela Jahić
Under the Calculative Gaze
The paperback adaptation of Jahić’s artistic research shown at Aksioma in early 2023 expands on the entanglement of socially applied technologies, systemic injustices, and creeping authoritarianism. Included: an essay by prominent AI critic Dan McQuillan.
“As U.S. et al. v. Google goes to trial, the echoes of the landmark federal suit against Microsoft, a quarter-century ago, are unmistakable.”
– Tech journalist Steve Lohr, recalling the last major American antitrust trial (1998). Once again “a tech giant is accused of using its overwhelming market power to unfairly cut competitors off from potential customers,” Lohr writes, though he notes Google is not quite as audacious (a Microsoft exec famously planned to “cut off Netscape’s air supply”).
“I’m not opposed to satellite imaging, but I’ve been in quite a few climate meetings where people suggested that if only we had more data and better images we’d finally address the crisis. That’s not true.”
– Canadian tech critic, author, and Tech Won’t Save Us host Paris Marx, pushing against the notion that better tools like Satlas, the Allen Institute for AI’s machine learning-powered forest monitor, will lead to climate action. “Our data has been getting better for decades,” Marx argues, “and emissions have kept rising that whole time.”
OUT NOW:
Tamara Kneese
Death Glitch
Tech ethnographer Kneese draws on interviews with digital afterlife startups, chronic illness bloggers, and transhumanist tinkerers to explore how platform capitalism shapes our perception of mortality.
“In his feud with Zuckerberg, Musk is essentially playing Ric Flair without the charisma.”
– Tech reporter and Platformer founder Casey Newton, parsing the drama around the proposed cage fight between the two Silicon Valley giants in sports-entertainment terms. “As a connoisseur of pro wrestling, I’m quite familiar with the character Musk is playing here: the big talker who can’t back it up in the ring,” writes Newton. “Wrestling promoters have made a lot of money with cowardly heel champions [like Flair] who go to great lengths to avoid having to face their adversaries in combat.” In the real world, the Musk vs Zuck feud moves markets and, Newton argues, deserves a lot more skepticism.
“The majority of people aren’t users but subjects of AI. It’s not a matter of individual choice. Most AI determinations that shape our access to resources are behind the scenes in ways we probably don’t even know.”
– Signal Foundation president and AI Now Institute co-founder Meredith Whittaker, discussing AI ethics with Credo AI’s Navrina Singh and Distributed AI Research Institute’s Alex Hanna at the Bloomberg Technology Summit. “AI is a surveillance technology,” Whittaker insists. “The Venn diagram of AI concerns and privacy concerns is a circle.”
“How do we prevent these language models from scraping our archives? But if they are going to scrape our archives, how do we at least make sure that we’re getting paid for that?”
– New York Times tech columnist Kevin Roose, summarizing the dilemma faced by Reddit, Twitter, and other platforms currently “locking down” their application programming interfaces (APIs) to protect their vast archives (of user-generated content) from being scraped by OpenAI, Google, and other companies developing AI language models.
“Just as we’ve strewn the oceans with plastic trash and filled the atmosphere with carbon dioxide, so we’re about to fill the Internet with blah.”
– Security engineering expert Ross Anderson, on the dangers of model collapse as AI trains on AI-generated content. In a new paper, Anderson and team demonstrate a rapid degenerative AI feedback loop where “true underlying data distribution” is forgotten. “This will make it harder to train newer models by scraping the web,” Anderson warns, “giving an advantage to firms that already did, or that control access to human interfaces at scale.”

Daily discoveries at the nexus of art, science, technology, and culture: Get full access by becoming a HOLO Reader!
  • Perspective: research, long-form analysis, and critical commentary
  • Encounters: in-depth artist profiles and studio visits of pioneers and key innovators
  • Stream: a timeline and news archive with 1,200+ entries and counting
  • Edition: HOLO’s annual collector’s edition that captures the calendar year in print
$40 USD