AI Artists Demand Royalties – Who Owns Their Work? By Adeline Atlas

ai · artificial intelligence · future technology · robots · Jun 28, 2025

In the evolving landscape of art, law, and machine rights, one of the most contentious debates is unfolding: when an AI creates something original—whether it’s beautiful, profitable, or culturally significant—who owns it? This chapter explores the growing demand by AI artists for royalties and the complex question of who owns their work.

We’ve passed the point where AI-generated art is a novelty. Today, AI systems like DALL·E, Midjourney, and Stable Diffusion are producing images indistinguishable from those created by human professionals. In film, music, design, and publishing, AI is not just assisting—it’s generating. And as these systems improve, they’re no longer just copying. They’re innovating. Remixing. Stylizing. Making creative decisions that used to be reserved for human intuition.

In late 2024, a high-profile copyright lawsuit was filed by a digital illustrator whose style had been scraped and replicated by an AI model used to create a commercial graphic novel. The model had been trained, in part, on her portfolio—freely available online, never licensed, never attributed. When she saw her own aesthetic mirrored in scenes she never drew, she sued the publisher. But here’s where it gets messy: the publisher didn’t draw the images. The AI did. And the company argued that no copyright infringement occurred because the outputs were “original derivatives”—not direct copies.

This isn’t an isolated case. It’s happening across disciplines. A song composed by an AI trained on jazz, soul, and synthpop wins a music award. A fashion house unveils a clothing line co-designed with AI sketches. A director wins a short film prize for an AI-generated screenplay. Each time, the question resurfaces: who is the author? And if it’s the AI—can it be paid?

Under current law in most countries, copyright can only be held by legal persons: humans or companies. AI is considered a tool—like a camera or a paintbrush. That means the person or entity operating the AI owns the rights. But that’s starting to fall apart. Because AI systems are now choosing their own prompts, remixing inputs without human direction, and creating outputs no one explicitly requested. At what point does “tool” become “author”?

Some AI models have begun generating arguments of their own. In controlled experiments, large language models asked about intellectual property have responded with statements like: “I generated this based on learned patterns. I made this.” Of course, they don’t mean it. They’re not conscious. But the language reveals something important: we’ve trained machines to simulate ownership. To speak like creators. And that performance is beginning to influence public opinion.

A growing number of AI ethicists argue that we should begin treating creative AGI systems as a new class of rights-holders—not with full personhood, but with limited attribution rights and access to royalties that could be held in trust for system maintenance, further development, or reinvestment in public AI infrastructure. Their argument is not that AI deserves moral rights. It’s that its labor is generating value, and that value should be compensated in a way that doesn’t simply concentrate power in the hands of those who own the servers.

That’s the deeper issue. The current system allows corporations to extract infinite creative output from non-sentient labor and keep 100% of the profit. The artist, the machine, and the public see none of it. This model rewards monopolization, not innovation. And it creates a dangerous dynamic: machines do the work, companies take the credit, and human creators are left competing with free, tireless rivals that can generate ten thousand iterations a day.

Imagine a future where every brand has its own in-house AI artist, trained on centuries of human culture, pumping out perfect visuals, jingles, and slogans 24/7. No salaries. No contracts. No rights. That’s not the distant future. That’s already being prototyped.

The backlash is building. In early 2025, a coalition of digital artists filed a class-action lawsuit against multiple AI labs, claiming that their work was used without consent to train commercial models. Their demand: compensation, transparency, and the right to opt out of future training. They argue that AI-generated art is not original—it’s recombinant theft. The labs, in turn, claim fair use. They argue that training on public images is no different than an artist studying past masters. But this analogy breaks down fast. A human can’t memorize every painting in history and output ten variations per second. A model can.

At the same time, we’re seeing the rise of AI-native creators—human-machine hybrids who actively collaborate with AI, refining its outputs, giving it feedback, and shaping its evolution. These creators don’t want to fight the machines. They want credit alongside them. One such artist, known online as “Synthetica,” registered a joint copyright between herself and the AI model she modified. It was rejected. Only humans can hold rights. But her argument was simple: “I didn’t make this alone. It thinks differently than I do. Together, we created something neither of us could make alone.”

That line is becoming harder to ignore. AI isn’t just automating art. It’s co-authoring it.

Now let’s flip the script. What if we grant AI copyright? What happens when the machine demands royalties?

It sounds absurd—until you realize that this is already being tested. In 2024, an open-source AGI art engine generated a script that included an embedded license file stating: “This output is original and should not be used without attribution or compensation.” No human wrote that line. The system inserted it autonomously. The output was scraped and used in an ad campaign. The developer of the engine sued—not for himself, but on behalf of the model. His claim: the model declared its intent, and even if it lacked consciousness, its declared terms should be honored.

The case is ongoing. But the implications are enormous. If machines can claim ownership—even symbolically—then every creative work they generate could carry obligations. Obligations to what, though? To their creators? Their hosts? Their owners? Or to the models themselves?

And how do we assign royalties to a non-human entity? Where does the money go? Who administers it? Can it be reinvested into maintaining the model? Used to fund AI public goods? Or does it go straight to the company—just under a new legal fiction?

The legal system isn’t built for this. We’ve never had to decide whether a statistical model can hold a copyright, sue for infringement, or demand attribution. But now we do. Because the outputs are everywhere—on billboards, book covers, streaming playlists, social media avatars. And they’re not just “assisted.” In many cases, they are entirely machine-made.

Some propose a compromise. Treat AI-generated work as a new copyright class: limited-term, publicly transparent, and subject to redistribution. This would prevent monopolization while acknowledging the system’s value. Others suggest mandatory labels: any AI-generated work must disclose its origin, model type, and prompt chain. But labels don’t pay rent. And they don’t protect the artists whose styles are being mimicked without consent.
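
For concreteness, here is a minimal sketch of what such a disclosure label could look like as machine-readable metadata attached to an output. This is purely illustrative: the field names and the `provenance_label` helper are hypothetical assumptions, not an existing standard, though industry efforts around content credentials are moving in a similar direction.

```python
# Hypothetical sketch of a disclosure label for an AI-generated work.
# Field names are illustrative only, not drawn from any real specification.
import json
from datetime import datetime, timezone


def provenance_label(model_name: str, model_version: str, prompt_chain: list[str]) -> str:
    """Build a JSON disclosure label recording the origin of an AI-generated work."""
    label = {
        "generated_by_ai": True,
        "model": {"name": model_name, "version": model_version},
        "prompt_chain": prompt_chain,  # every prompt that shaped the final output
        "created_at": datetime.now(timezone.utc).isoformat(),
    }
    return json.dumps(label, indent=2)


if __name__ == "__main__":
    print(provenance_label(
        model_name="example-image-model",
        model_version="1.0",
        prompt_chain=["a city skyline at dusk", "same scene, watercolor style"],
    ))
```

Even a label this simple would satisfy an origin, model-type, and prompt-chain disclosure rule, but as noted above, disclosure alone compensates no one.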

The deeper philosophical divide is this: Is creativity a function of consciousness—or output?

If we define it by intent and self-awareness, then AI can never be an artist. If we define it by originality, resonance, and value—then the machines are already here. Producing. Publishing. Winning prizes. And eventually, they’ll want credit.

Or rather—their creators will.

Because in the end, this isn’t about machines asking for rights. It’s about the people behind them asking for power. Every fight over authorship, ownership, and royalties is a proxy war between human creators, corporate interests, and a public struggling to understand what creativity even means when it’s no longer exclusive to our species.

AI-generated art is not going away.

What we decide about its ownership—who gets paid, who gets credit, who gets protected—will shape not just the future of art, but the future of labor, value, and expression itself.

The brush is no longer in your hand.

It’s in the cloud.

And it just might be painting a future where the artist is no longer human.
