(This is the seventh in a series on generative AI content.)
As generative AI continues to evolve, it is unlocking powerful new possibilities for people interested in art, writing, music, and design—especially those who lack the skill, time, or desire to master those crafts in traditional ways. With just a prompt, users can generate portraits, vivid landscapes, catchy choruses, or full-length articles in seconds.
But as this wave of AI-generated content rolls in, an important question has surfaced: What do we call the people behind the prompts?
Many label themselves “artists,” “musicians,” or “writers.” But those titles are deeply tied to hands-on creative labor—to brushstrokes, keystrokes, and countless hours of study and practice. When people use those same titles after a single AI prompt, it creates confusion, frustration, and, in some communities, outright hostility.
Rather than dismissing AI users or gatekeeping creative spaces, it may be time for a reframing. Those who work effectively with AI aren’t necessarily traditional creators. They’re something new: directors, curators, producers, or creative leads. And that’s not a demotion—it’s a different skill set worth recognizing on its own terms.
But even as we explore this new role, we must acknowledge a hard truth: AI-generated output is built on the backs of human creators who never agreed to contribute—and that reality complicates any claim to authorship or innovation. New titles can, at the very least, make room to acknowledge both sides.
The Shift From Maker to Shaper
Traditionally, being an artist or writer meant learning a craft—mastering tools, techniques, and principles through years of practice. But with AI, much of that technical labor can be offloaded to a model trained on massive datasets of human-made work.
In visual art, AI users guide the process by describing scenes, refining styles, and selecting from dozens of generated variations. It’s not unlike a film director shaping a vision through collaboration. The user might not do the “hands-on” work, but they still influence the final result.
This isn’t trivial. Prompting AI effectively requires aesthetic judgment, clear communication, and iteration. But while it is creative, it is not the same as the slow, often physical process of creation that defines traditional artistry.
And importantly, if we begin using terms like “AI director” or “AI curator,” we adopt roles that already carry an implied understanding: that part of the work is built on the labor of others. Just as a film director works with actors, camera operators, and designers, and a museum curator builds meaning through others’ artworks, AI users shape their outputs through a framework built on human-made data. In those established roles, no one assumes the director or curator did it all alone—nor should they here.
Writing With AI: Producer, Not Author
The same applies to writing. Tools like ChatGPT or Claude can now generate articles, poems, and short stories with surprising fluency. The user supplies a concept or prompt, tweaks the output, maybe edits for clarity—and calls it done.
This is closer to producing content than writing it. Writers craft language. They wrestle with structure, revise tone, and shape voice. If you didn’t write the sentences, you’re not the author. At best, you’re the editor-in-chief.
The Authors Guild has issued clear guidelines: if AI was used to generate significant portions of a written work, that must be disclosed. It’s not about exclusion—it’s about transparency, trust, and integrity within creative communities.
And again, if you call yourself a “story producer” or “AI content editor,” the title already implies your work was collaborative. It makes room for the invisible hands that shaped the raw material—and keeps the credit honest.
But Who Built the Machine?
Here’s where the ethical dilemma deepens. Every AI-generated output relies on the training data behind the model—and most of that data comes from real people who didn’t agree to contribute.
Artists, photographers, illustrators, musicians, and writers have had their work scraped, absorbed, and repurposed without credit, compensation, or consent. That work forms the foundation of the polished results AI can now generate in seconds.
So when someone uses AI to create and monetize content—selling books, prints, music, or articles—they’re often profiting from a system built on unacknowledged labor. This isn’t just about accuracy in job titles. It’s about ownership, fairness, and ethical use.
As legal scholar Andres Guadamuz explains, “The process by which artificial intelligence (AI) ‘learns’ to do something, particularly to generate works that emulate human creativity, often relies on having access and analysing large numbers of those works, learning patterns to create its own versions. To do this, the computer program must have copies of works to analyse to produce new results.” But those works are almost never used with authorization.
A New Creative Class—With New Responsibilities
Those who work with AI to generate creative output are not faking expertise—but they are engaging with creativity in a fundamentally different way. They’re not creators in the traditional sense. They’re navigators of a machine’s potential.
This includes designers who generate mockups or templates to guide real-world artists; writers who use AI to brainstorm ideas, then rewrite and reframe the results in their own voice; musicians who experiment with AI-assisted chords or loops as scaffolding for live performance; and developers who fine-tune generative models for specialized creative industries.
This new creative class brings something valuable to the table: agility, imagination, and a willingness to experiment. But with that access comes a need for ethical accountability. Adopting new titles—ones that acknowledge shared labor—helps clarify where creative credit begins and ends.
Why Definitions—and Honesty—Still Matter
Even as we reimagine creative roles, the language we use still matters. Calling someone an “artist” or “author” implies a kind of authorship that doesn’t apply when a machine is doing most of the work. That’s not about gatekeeping—it’s about honoring the truth behind the process.
A curator doesn’t paint, but they shape how we see art. A director doesn’t perform every scene, but their voice is in every frame. AI prompters deserve recognition too—but only when the role they played is described with honesty.
By using accurate titles—director, producer, editor, curator—we communicate something crucial: that others were involved, that human labor was foundational, and that the final product is collaborative by nature.
Collaboration, Not Substitution
AI isn’t inherently unethical. But when it’s used to replace creative labor—especially without acknowledgment—it crosses a line. The future of AI in the arts doesn’t need to be extractive. It can be collaborative. But only if users understand their place in the process—and own it with clarity.
If you’re using AI to shape an idea, experiment with structure, or find your way through a creative block—you’re participating in a new kind of work. Maybe you’re not the artist in the classical sense. But you are something else: a director, a curator, a navigator of borrowed knowledge.
And that’s a role worth owning—honestly.