(This is the eighth in a series on generative AI content.)
By now, it’s clear: the cat is out of the bag. Generative AI is here, and it’s not going away. The question is no longer if it should be used, but how it can be used responsibly—especially in creative spaces where human identity, imperfection, and voice are everything.
Because here’s what AI can’t replicate: the subtle, imperfect fingerprints of human creators. The slightly off brushstroke, the unusual turn of phrase, the minor variations in rhythm or tone—those aren’t flaws. They’re signatures. They’re the very things that make an artist a person, and not a polished prediction machine. When AI replicates style, it often strips out those tiny inconsistencies that give art its soul. What’s left may be technically impressive, but it’s emotionally flat: a hollow echo of something once deeply human.
As discussed, there’s a hard truth at the center of the generative AI boom that’s too often brushed aside: human artists can exist without AI, but AI cannot exist without human artists. No matter how sophisticated the software, how fast the output, or how slick the marketing, AI is entirely dependent on the creative labor of real people—writers, musicians, visual artists—who crafted the culture it was trained on.
And yet, we’re watching AI tools not just assist with creative work, but begin to replace the people who made them possible. Again, that’s not innovation—it’s extraction, dressed up in the language of “efficiency” and “progress.” And for artists and writers trying to navigate this new landscape with integrity, the question becomes: how do we use AI ethically, without erasing ourselves or others?
The answer may lie in a simple reframing: What if your AI of choice were your best friend?
Let’s step back a moment, and acknowledge a few things:
1. The Line Between Help and Authorship
The line between help and authorship matters. If a friend gave you one clever line, you might quote them. If they plotted your entire novel, wrote all the dialogue, and handed you the pages, could you still call the book yours?
That’s the question AI forces us to ask. When we start relying on it not just to help, but to produce, we blur the boundaries between assistance and authorship. And when the underlying labor of thousands of uncredited artists and writers forms the basis of that AI’s training data, the ethical implications deepen.
Would you feel comfortable taking credit for something you built entirely on someone else’s uncredited work? Would you feel good about earning money, praise, or prestige for a piece your “friend” essentially made?
2. Transparency and Consent Aren’t Optional
Using AI ethically also means being honest—with others and with ourselves. If AI played a role in the creation of something, however minor, that deserves to be acknowledged. Transparency isn’t just a courtesy. It’s part of maintaining trust in creative communities.
And just as importantly, using AI responsibly means pushing for consensual models. If the AI being used was trained on stolen art, scraped writing, or unauthorized music, it doesn’t matter how helpful the output is. If it wouldn’t be ethical to take it from a friend, it shouldn’t feel ethical to take it from a dataset filled with real people’s work—especially without their permission.
3. Creating with Conscience
It’s tempting, very tempting, to see AI as a shortcut. It’s marketed that way: get more done, faster. Be more productive. Push past blocks. But the more we ask of it, the more we risk losing the parts of the process that matter—reflection, struggle, the pride that comes with shaping something with our own hands.
Ethical AI use isn’t about drawing an arbitrary line in the sand. It’s about maintaining creative integrity. It’s about refusing to let a tool erase the human spark that makes the work worth doing in the first place.
So ask: if this tool were my closest, most talented friend, how would I treat them? Would I thank them? Pay them? Credit them? Would I hand their work off as my own?
That’s where the boundaries lie. That’s where ethics begin.
What if your AI of choice were your best friend?
If your AI tool of choice were a close friend—one with a brilliant mind, photographic memory, and a knack for language or visuals—how much would you feel comfortable taking from them without acknowledgment? Without offering thanks, credit, or compensation? How much would you be willing to present as entirely your own?
This thought experiment helps put things into perspective because ethically using AI doesn’t mean avoiding it entirely. It means recognizing its role and its limits—and making sure the human voice stays at the center.
Ask yourself these:
• Would I feel comfortable showing this to a talented friend and saying, “I made this,” if I hadn’t done the heavy lifting?
• Would I feel comfortable selling this as my very own if a friend had contributed this much?
• If this were a friend, how much of this would I feel okay claiming as mine?
• Would I feel weird handing this in or publishing it if my friend had written this much?
• Am I guiding the process—or is my friend doing the heavy lifting?
• Am I using my friend because I’m stuck and want support, or because I want to skip the hard parts?
If AI, as your friend, is offering support, helping you refine, or giving ideas—ethically, that’s fine. But once it becomes the architect, voice, or author, the credit needs to shift—and the conversation about consent and transparency becomes unavoidable.
AI isn’t inherently unethical. But how it’s used—and how much credit it’s given—matters deeply. In music and visual art, just like in writing, ethical AI use starts with honesty, respect, and restraint. Use it like a creative friend, not a ghost artist. Let it inform your work, but never replace your voice.
Ethical Use of AI…
Ethical use of AI means it works with you—not instead of you. Here are some ways AI can be responsibly integrated into creative work without replacing human authorship or exploiting the work of others:
…in Writing: Collaborator, not Ghostwriter
• Book cover brainstorming: Not every author is a visual artist, and using AI to generate concepts or mockups for a book cover can be a helpful jumping-off point. The key is to treat it as ideation—not the final product. Hand the idea off to a human designer. Use AI for rough sketches, not polished, credit-claiming covers.
• Editing and revision assistance: AI can suggest alternate phrasings, help catch passive voice, or propose tense adjustments. Think of it as an editing companion—something that highlights, nudges, and questions, but doesn’t rewrite on your behalf. A grammar-savvy friend, not a ghostwriter.
• Word searches and phrasing help: Ever have a word on the tip of your tongue? AI can help jog your memory or offer synonyms, just like a friend would. It’s a useful tool for overcoming mental blocks, not for replacing the thought process altogether.
• Problem-solving in a plot corner: When stuck in a narrative dead end, AI can offer “what if” scenarios or help list potential directions to explore. The ethical distinction lies in who does the choosing, shaping, and rewriting. If AI throws out a dozen ideas and you pick one, reshape it, and make it your own, that’s a collaboration—not a replacement.
• Quick summaries or structural overviews: Need to get a sense of your own draft’s flow? AI can summarize or outline sections of text to help you step back and see the bigger picture. AI can help catch patterns or grammar mistakes you may have missed, and even flag areas where the pacing drags or the sentence structure is repetitive. It’s like having a second set of eyes—but automated. Treat it as a tool for review, not a replacement for developmental editing or thoughtful revision.
…in Visual Art: A Concept Tool, Not a Creator
• Thumbnail sketching and ideation: Artists can use AI to generate rough visual concepts for a project—a starting point to build on, not a finished product. It’s like getting input from a peer during early brainstorming.
• Palette and composition experiments: Trying out different color schemes or compositional layouts with AI can help spark ideas, much like flipping through reference photos or asking a friend for input.
• Mockups for layout or design: For book covers, posters, or digital art, AI can help mock up design ideas. The ethical approach is to treat these as drafts—tools for communicating with a client or guiding your own design, not the final art to sell or claim as fully original.
• Texture or background generation: AI can assist in generating filler elements—textures, skies, subtle patterns—to be heavily edited or integrated with original artwork. Think of this as using digital collage elements or stock textures—never as the central focus.
Where to draw the line: Generating portraits, illustrations, or polished compositions using AI—and then claiming them as personal artwork—is ethically fraught, especially when the model is trained on artists’ work without their consent. If you wouldn’t feel comfortable passing off a friend’s painting as your own, the same should apply to AI-generated images built on other people’s styles.
…in Music: A Creative Companion, Not a Composer
Full disclosure: these are ideas I’ve received from others. The extent of my delving into AI with music was finding out it exists, having Suno make a couple of silly songs, tossing one into Klang.io to see if it could make sheet music, cringing at the output (the thing AI may do worst is turning music into sheet music…holy hell…though that isn’t generative), then swearing it off…and figuring out how to get software to do what I want it to do. When it comes to mixing, laying down drum beats, and working on harmonies, I use an AKAI MIDI keyboard, Logic Pro as my DAW, Finale 27 as my notation software (though I have, and detest, Dorico), and Native Instruments and Garritan for instrumentation. I see no point whatsoever in using AI to assist with composition, given what I use and have access to, but the barrier to entry is very high. I’m talking thousands of dollars, and these are not things you can have beta readers help with, or that you can learn in the margins of notebook paper. The composition still needs to come from you—I started with a piano—but each musical instrument is like its own language: the notation is different, the transposition is different…
Ironically, the highest barrier to entry isn’t cost but knowledge, and it’s precisely because that knowledge is so vital that this category needs to be approached with care, by people with some foundation in it. Incidentally, some of the best books for learning music theory are open source and free, and yes, they’re used in post-secondary settings (Open Music Theory and Puget Sound Music Theory).
Here are several ways AI can be ethically used in music composition without erasing the human or crossing ethical boundaries:
• Chord progression ideas: AI can be a brainstorming partner when stuck on what comes next harmonically. It can suggest progressions based on a mood, genre, or key—but it’s up to you to shape, refine, and make them musically meaningful.
• Generating drum patterns or loops to experiment with: When working alone or without access to live instruments, AI-generated loops can act as placeholders or inspiration. Like a friend laying down a beat for you to riff over—not writing your song for you. This is akin to using the preloaded beats in Casio keyboards, but more specific to what you’re working on, and these patterns and loops are more likely to be generated using established music theory rather than anything scraped.
• Lyric brainstorming: AI can offer rhyming words or line ideas based on a theme. Think of it like bouncing ideas off a writing buddy. But the emotional core, the voice, the meaning—that still needs to come from you, and this still requires understanding melodic structures.
• Audio cleanup or mixing suggestions: AI can assist in cleaning up audio files or suggesting EQ and compression settings, much like an engineer giving advice. But creative decisions—like tone, texture, and space—are still yours to make. And again, this isn’t generative output based on scraped data; you still need to understand what’s what.
• Backing tracks for practice or arrangement tests: AI tools can provide quick instrumental backings when rehearsing or experimenting with song structure. They’re the equivalent of having a friend sit in for a jam session—not the final band.
• Assistance with your music software issues: Here is where AI can shine. There are many DAWs, many plug-ins, many notation software options…the industry standard since the 1980s is being sunsetted… Getting these things to work together is a fucking beast. The operating system you’re using can affect how a program works with this plugin or that, and one version might be missing a feature for some reason… There are so many moving parts that no single guide can help. The guides for different instrument plug-ins might not be valid for your software, and might run 86 pages for six horns that it turns out won’t run on your machine because of *reasons*… AI, though, can assist greatly in figuring out how to make many moving parts (all created independently of each other, with little thought for the parts that already exist) work together, or help you determine whether there’s a fatal flaw that renders further attempts futile. This is a much more complex issue than Duck Duck Go can handle. (Why yes, this is something that gave me countless hours of frustration earlier this month.)
Where to draw the line: Using AI to generate instrumental tracks or stems, to generate vocal performances, or to replace musicians or singers—especially in the style of real artists—is where ethical use collapses. Consent, compensation, and creative credit matter. If AI is mimicking someone’s voice, artistry, or expression without their permission, it’s not a collaboration—it’s theft. Music composition, even more than writing or visual art, has active communities of exceptionally knowledgeable people who will help you. Since the barrier is so high, you’re less likely to get replies from people who think wanting to be a thing makes them that thing.
Where We Go From Here
These things are hard—for a reason. Creation is a process of discovering your voice, shaping your thoughts, and building something no one else could make quite the same way. Your word choices, your pencil strokes, your music notes—they might not always be perfect, but they reflect what makes you you, and that’s a beautiful thing that shouldn’t be erased. Think about what makes awkward love notes of the past so charming—they aren’t svelte and sophisticated and some idea of AI-perfect, but the charm in the attempt makes them wonderful. Your child’s drawing of you, with an oversized head and one tiny hand and one bigger than that oversized head, is from their heart, and is perfect because of that, no idea of AI-perfection needed. Surely you value the sincere, even if not technically perfect, attempts from those you love, because those imperfections are the vulnerabilities they’re laying bare for you, and those are the things that make them matter more than anything. Which touches you more—a handmade birthday card with a poem that almost works but shows the quirks of your friend, or a store-bought card that they decided is close enough, and they sign their name, and that’s that? When it comes to the store-bought cards, which do you hold closer to your heart, the one where they just sign their name, or the one with a personal inscription that no pre-printed card could ever truly replicate?
So why try to remove the so-called “imperfections” that reflect us in what we claim to create? AI might be able to imitate a style, but it can’t imitate a story and a love that only you can tell. If someone else using AI could reasonably get the same outcome, it’s not unique to you, and YOU are left out of that piece. That is genuinely heartbreaking.
If we reframe AI not as a shortcut or a substitute, but as a trusted creative friend—someone you admire, someone whose talents you respect—the ethical path forward becomes much clearer. You wouldn’t take their ideas and pretend they were your own, because it wouldn’t be you. You wouldn’t sell their work without permission. You wouldn’t let them labor in silence while you collected the credit. You’d collaborate. You’d acknowledge. You might even see the work as more valuable for that teamwork. You’d respect the line between help and authorship. And you’d make sure that what makes you you is still allowed to come through.
That framing helps clarify the ethical boundary without erasing the you-ness of you: AI can be a sounding board, a problem-solver, a pattern-spotter. It can push you forward when you’re stuck or help you find a clearer way to say what you already meant. But it should never become the architect of your work or the source of your voice. Once it begins replacing—not supporting—human creativity, it stops being a friend and starts being a tool of erasure.
Ethical AI use is ultimately about honoring the human spark. It means valuing the slow, imperfect, deeply personal process of creating something real. It means refusing to exploit the work of others—whether they’re anonymous contributors in a scraped dataset or close collaborators helping you through a block. And it means holding ourselves accountable: being transparent, being thoughtful, and being honest about where the ideas came from, and who really deserves the credit. The suggestions above allow for the use of AI as a tool without turning AI into a crutch that ultimately strips you from the equation.
The tools will keep evolving. The technology will get better. The ethical questions will grow more complex. But one thing won’t change: good art comes from people. AI might help shape the scaffolding, but it’s the human heart that fills in the rest.
So use AI if it helps you grow. Use it if it opens up new possibilities. Use it the way you’d lean on a brilliant, trusted friend. But don’t forget: you’re the one telling the story. You always were.
Call to Action: Create with Care
As artists, writers, and makers, we are the stewards of our craft—and the choices we make now will shape the future of creative work. So be intentional. Ask questions. Stay curious. Use the tools available to you, but never at the cost of your own voice—or someone else’s.
If you’re using AI, make it a collaborator, not a crutch. Credit what deserves credit. Question where the data came from. Push for transparency. Advocate for ethical standards in the tools you rely on. And most of all, keep creating from a place of conscience. Because when we lead with respect, honesty, and care, we don’t just protect the integrity of our own work—we protect the future of human creativity itself.
And always remember: the most meaningful art doesn’t come from shortcuts. It comes from the spark only you carry.