Matthew Braga is a freelance journalist based in Toronto. A former senior technology reporter for CBC News and editor at Motherboard, he covers technology, science, and culture.
I’ve been watching an increasing number of artists experiment with machine learning in all kinds of interesting ways. The writer Robin Sloan has been writing a novel with suggestions from an algorithm trained on old sci-fi and fantasy novels—a word here, a phrase there, curated fragments plucked from the ether, generated based on what Sloan has written so far. The musician Holly Herndon created her most recent album using musical contributions from a machine learning algorithm trained on a choir of Herndon and her collaborators, resulting in a haunting digital feedback loop that resembles their voices but is a performer all its own. And for days, earlier this spring, I couldn’t stop thinking about the work of artist Cyril Diagne: a prototype of an app that functioned, essentially, as a copy-paste button for the physical world. Diagne took a picture of a dress on a wall, pointed their phone at a laptop, and the dress magically, almost immediately, leapt to the digital canvas on the laptop’s screen.
But there are other experiments that give me pause. OpenAI recently introduced Jukebox, a machine learning algorithm that generates new songs in the style of popular artists—from the Pet Shop Boys to Nicki Minaj—after being trained on libraries of their music (there are some unnerving examples, but a lot of bad ones as well). A musician and machine learning developer offered a thoughtfully detailed critique. In 2018, Christie’s auctioned its first piece of AI-generated art—a vaguely eighteenth-century-looking portrait—which, according to The Verge, was created with “borrowed code” and wrapped in an alluring but ultimately misleading narrative of a self-determined computer creating art. The ethically fraught world of deepfakes looms large, and work that incorporates image and facial recognition algorithms—with all their inherent racial and gender biases, embedded by those who trained them—has so far presented the biggest concerns of all.
Creatively, there’s an abundance of potential uses. But that doesn’t mean moral or ethical considerations can be ignored. I know that it won’t be long before machine learning algorithms enter the realm of the mundane and become just another tool in the toolbox—one that helps artists work more quickly, effectively, and efficiently by handling the dull or repetitive tasks we could scarcely imagine automating before. It’s tempting to think this could help level the playing field in a world that is already uneven, especially when you have artists competing against other, more privileged artists with more money, more time, more comfortable living situations—freeing overworked and underpaid artists to pursue the aspects of their art that are most enjoyable, creative, or personally fulfilling. But at the same time, I think we have to be skeptical of the ways these tools might be used to devalue artists, or to exploit others.
It’s encouraging to see how both Sloan and Herndon have tried to account for this—Sloan, by acting as a curator and interpreter of the algorithm’s output, and Herndon, by only feeding the algorithm data derived from consenting collaborators. If these tools are to be part of the discussion around artist prosperity, artists can’t lose sight of the fact that the tech companies and communities that develop these tools have different imperatives than artists do. The way artists use them should be different, too.