“Adapt or Die”

(This is the third-and-a-half in a series on generative AI content. It’s an expansion of the article from earlier today.)

A perfectly timed blood sacrifice to the ice in exchange for a concussion, crossed with spring break, crossed with my desire to work on books being hampered by AI, means I’ve got ample time right now to stew and to write articles about AI. Buckle up. I’ve got more coming.

Also, you can drop the accusations that my writing sounds like AI; AI was initially trained largely on the work of academics who write like me. Spend five minutes speaking with me on a serious subject, and you’ll walk away with no doubt as to my writing ability when I can edit. The mild change in style to include some section titles is for my own ease; refer back to the concussion.

The Cruel Optimism of AI: Why “Adapt or Die” and “Artists Will Always Be Needed” Are Empty Promises

In the rapidly evolving world of generative AI, two phrases have come to define the prevailing mindset among its most ardent supporters. The first is a cold command: “Adapt or die.” The second is a hollow reassurance: “Human artists will never be obsolete, because their work will always be needed to train AI.” At a glance, these statements seem to offer opposing sentiments—one brutal, the other hopeful. But look closer, and both reflect the same dismissive and exploitative logic: a belief that the creative labor of human beings is only valuable in service of machines, and that the burden of survival rests solely on the shoulders of the displaced.

“Adapt or Die”: A Threat Disguised as Advice

The phrase “adapt or die” has long been used in business and technology circles to justify aggressive disruption, but in the context of generative AI, it takes on a particularly cruel edge. It reframes the loss of livelihoods, careers, and artistic purpose not as a consequence of unchecked technological overreach, but as a failure of individuals to evolve quickly enough. It turns systemic harm into personal weakness. Can’t pivot fast enough? Can’t keep up with every new tool, prompt trick, or licensing change? Then you deserve to be left behind.

This mindset is not just dismissive—it’s deeply dishonest. It ignores the vast power imbalance between those creating and deploying generative AI systems and those whose work is being used—without permission or compensation—to train them. “Adapt” in this context often means submitting to exploitation and becoming an unpaid trainer for the very system that may render your profession economically unviable…without creating anywhere near an equal number of jobs to ensure people don’t starve.  We can’t adapt to starvation.

The Parasitic Promise of Perpetual Use

Just as disturbing is the so-called comfort offered by AI advocates: that human artists and creators “will always be needed,” because AI will continue to rely on new human-made content for training. On the surface, this seems like a concession to the irreplaceable nature of human creativity. But the subtext is far more insidious. It doesn’t celebrate artists as creators—it reduces them to fuel.

This promise positions artists not as valued contributors, but as raw data streams to be scraped, analyzed, and commodified. It’s like telling someone, “Don’t worry, you’re not obsolete—you’re still a useful input.” Their value lies not in their vision or voice, but in their ability to feed a machine that will ultimately overshadow them in the marketplace. This isn’t preservation—it’s exploitation repackaged as praise.

The irony is hard to miss: artists are told their work is worthless in economic terms, that it holds no market value compared to what AI can generate. Yet that very same work is deemed valuable enough to train billion-dollar AI models. In other words, the market is happy to say an artist’s labor is worth $0 while still extracting everything it can from it.

The CEO Mindset Behind the Curtain

Both of these attitudes—“adapt or die” and “we’ll always need your work”—stem from a deeper worldview, one that mirrors the worst traits of corporate leadership. It’s the mindset of the CEO who takes credit for the labor of others, who centralizes profits while decentralizing risk, who insists that the market justifies every action, no matter how exploitative. This logic treats human creativity not as something sacred or singular, but as an input to be streamlined, scaled, and, eventually, replaced.

And it’s not just rhetoric—it’s reshaping the future of creative work. Writers, visual artists, musicians, and educators are already being squeezed out of jobs, told to compete with machines built on their own intellectual and emotional labor. When they speak up, they’re told to innovate or get out of the way.

A Call for Accountability, Not Just Adaptation

What’s missing from these conversations is accountability. Who gets to decide what counts as progress? Who reaps the rewards, and who bears the cost? The narrative around generative AI continues to center those with the most power—tech developers, CEOs, venture capitalists—while pushing artists and cultural workers to the margins, asking them to make peace with their own displacement.

To resist this future, we must reject the false binary of “adapt or die,” and demand more than the backhanded assurance that our work will “always be needed”—as fodder. We need a framework that centers consent, compensation, and dignity. One that sees creators not as training data, but as people with vision, rights, and value.

Progress that erases people is not progress. It’s theft dressed up as innovation. And no one should have to adapt to that.
