Training AI models on artist and animator work is a new prospect offered by generative AI. Artist talent in Hollywood animation and game art departments faces the possibility that studios may train AI image or video models on assets they create on the job.
This possibility is already being explored as part of different workflows at independent AI studios focused on animation. Earlier this week, VIP+ discussed how fine-tuning image and video models is proving effective at achieving a consistent look in animated content, including making AI image outputs that closely match a specific character design destined for social media or transferring a unique art style an artist developed for a project onto other animated material.
Yet even with the positives described, the ability of a major animation or game studio to train a model on assets a hired artist creates (and the studio then owns) would also seem to signal potential for talent exploitation and diminished future opportunities for artists and animators.
Data Ethics, Labor and Artist Talent
In its newly ratified agreement with the AMPTP, the Animation Guild (IATSE Local 839) wasn’t able to secure proposed restrictions against training on artist assets provided on the job, which would include the possibility of model fine-tuning. Likewise, many artists are work-for-hire, and not every artist or studio is in the union (for example, Pixar is not, while Marvel Animation is).
The new TAG contract also doesn’t allow union artists to reject having to use generative AI as a condition of employment, a right writers won in last fall’s WGA agreement, meaning a studio could require an artist to use gen AI as part of the production pipeline.
For many artists, generative AI is already a personally painful technology because the AI models being pitched by developers and considered by studio employers have been built on artists’ work, scraped from the internet and used without permission or compensation, a situation one artist compared to “labor theft.”
Artist sources shared with VIP+ that the artist community has already been experiencing layoffs, declines in available work and unusual decreases in compensation this year. Though it's hard to directly attribute all of these effects to generative AI, its disruptive impact is also hard to ignore.
Now artists worry further displacement could result from a studio using their delivered work to train and retain its own models. For example, if a studio hires an artist to deliver some assets and uses those assets to fine-tune a model intended for content creation, the fine-tuned model could thereafter be used to produce new, diverse but similar assets — a scenario arguably analogous to actors’ fight against their digital scans being used on future productions “in perpetuity.”
Without limitations on a studio training on assets an artist delivers on the job, artist sources voiced a concern to VIP+ that, simply by working, they would be contributing to improving the very models that eventually replace them in their roles or teams at a studio.
“The studio can do whatever they want with the drawings they’ve hired [an artist] to produce,” said Sam Tung, a storyboard artist who is involved with TAG as a member of the guild’s AI Task Force and served on the Negotiations Committee. “The apprehension among TAG workers is that you’re effectively digging your own grave if you draw a bunch of character designs or storyboards that get fed into a studio’s internal dataset and make the AI system even better at making characters or storyboards.”
Artist sources foresaw models trained on their work leading studios to downsize teams or limit an artist’s scope of work to cleaning up AI outputs.
As I wrote in the VIP+ June report “The State of Generative AI in Hollywood,” “Fine-tuning could be a new battleground for visual artists, analogous to how actors fought not to have their data stored for future use in perpetuity. Consent and compensation for fine-tuning on an artist’s style or assets developed while on a project would need to be reflected in their initial contract terms.”
Artists said they would now need to push for explicit terms around fine-tuning in their contracts at hire, although Tung wasn’t confident the studios would accept restricting training completely. Advocacy organization Concept Art Association has hired a lawyer to produce boilerplate legal language artists can use in their contracts for illustration gigs.
Short of restricting training altogether, what constitutes fair compensation may need to be redefined — that is, increased rates or even backend participation — if a hired artist delivers original assets intended for AI training or for creating a fine-tuned model that a studio plans to retain and use on future productions.
“There’s a difference between delivering an asset and delivering a digital version of your skills,” Jon Lam, senior storyboard artist at Riot Games, told VIP+ in May 2024. “We need to start making that distinction in contracts to say you can use this work for reference, but I do not give you permission to fine-tune a model on my work.”
TAG’s new agreement does require producers to meet with artists if they have concerns about the use of an AI system. Artists have already been relaying such concerns to supervisors, from frustrations around the effectiveness of gen AI up to legal risks of using tools built on unlicensed copyrighted material. Lam referred to artists as the “canary in the coal mine” in highlighting some of the problems with integrating AI tools, particularly the underlying data used to train these systems.
“It’s been up to the artists at animation or game studios to educate ourselves on how generative AI works and then in turn educate our team leads and union reps how the technology infringes intellectual property from employers as well as artists themselves,” said Lam. “To artists, it may seem obvious where the data is being sourced from or weighted, but we often have to be extremely clear and supply receipts of infringement to those that have the final say.”
Legal Risk
Even as a few AI studios and independent creators pursue new methods, sources to VIP+ said the major traditional studios still see legal and consumer backlash risks as reasons not to use AI for consumer-facing content.
A fine-tuned model prioritizes the new data in its outputs, meaning that by fine-tuning on its owned IP or original art created by a human artist for a project, a studio likely minimizes any risk of infringing unlicensed copyrighted material that was present in the underlying pretrained model.
“The risk goes way down because the only things you’re adding are things you own or have rights to,” said Meeka Bondy, senior counsel in the Technology Transactions & Privacy Practice and co-chair of the film and TV industry group at Perkins Coie.
Even so, Bondy agreed that fine-tuning doesn’t eliminate infringement risk that has troubled enterprise use of generative AI for content creation, as VIP+ has previously argued. The fine-tuned model still relies on the pretrained model for its outputs, and most available pretrained image and video models have almost certainly trained on unlicensed copyrighted material.
“[Something in] the output could still look like it’s substantially similar to something else [in the training],” Bondy said.
Artists are likewise aware of the conundrum stalling studios. “Fine-tuning doesn’t really fix the problem. These fine-tuned models always sit on top of Stable Diffusion or something like that. That’s the thing full of copyrighted stuff,” said storyboard artist Tung. “The studio needs to be very confident about any software they’re giving the green light [for artists] to use because it may have some pretty serious legal implications for what you know those outputs are.”