Adobe's Firefly Foundry: The bet on ethically trained AI
Major entertainment companies are building custom generative AI models trained exclusively on their own content libraries, as Adobe partners with Disney, CAA, and UTA to address the industry's copyright anxiety.

Just eighteen months ago, the Writers Guild strike centered on fears that studios would replace human creativity with generic AI output. Now those same studios are racing to build what Adobe calls "commercially safe" models. The shift reflects the technology's perceived inevitability: rather than being sidelined by external tech companies, studios are choosing to own the disruption by creating tools that amplify their existing IP assets.
According to Adobe's blog post, Firefly Foundry enables media companies to create custom models tuned to their unique intellectual property. The platform promises high-fidelity generation of video, audio, and 3D assets while maintaining what the company describes as ethical training practices, meaning the models only learn from content the studios already own.
The partnerships announced include Creative Artists Agency and United Talent Agency, along with studios that IT Brief Australia reports are already using the platform. Walt Disney appears among the early adopters, with B&T noting the company is building private, custom AI models that keep its IP isolated from public models.
The timing aligns with mounting legal pressure on AI companies over training data. OpenAI and Anthropic face multiple lawsuits from publishers and artists claiming unauthorized use of copyrighted material. Adobe's approach promises to sidestep copyright litigation, but the legal reality is murkier. Even training on owned content raises questions about derivative works, fair use boundaries, and the rights of individual contributors whose work becomes part of the training data. More critically, it assumes studios have clear ownership chains for all their content, a risky bet given the complex licensing, acquisition, and collaboration structures that define modern entertainment production.
The technical limitations remain significant and telling. Adobe hasn't disclosed generation speeds, resolution limits, or minimum content requirements. These omissions suggest the technology may require massive content libraries to produce usable results, potentially limiting adoption to only the largest studios with decades of archived material. Adobe has also yet to produce a state-of-the-art model comparable to frontrunners such as Veo and Kling, whose training runs have scraped copyrighted and non-copyrighted materials alike.
The platform appears designed for specific use cases rather than open-ended creativity. Marketing materials, franchise extensions, and pre-visualization seem like obvious applications. This focus reveals Adobe's understanding that custom AI's real value lies in scaling existing IP rather than creating new concepts. For studios, this means potentially infinite variations of established properties: new Marvel scenes for social media, additional Star Wars content for theme parks, or rapid concept art that maintains franchise visual consistency. The business model shifts from creating new IP to maximizing existing IP through AI multiplication.
VFX studios are reportedly among the early adopters, according to Mi3. Visual effects houses already work extensively with studio IP and could use custom models for rapid prototyping or generating background elements that match a film's aesthetic. These houses operate on tight margins and brutal deadlines, so custom AI could dramatically reduce costs for routine tasks like environment generation or crowd simulation, potentially reshaping the economics of post-production.
Adobe's silence on pricing details and minimum data requirements likely reflects a premium positioning strategy. This isn't meant to be accessible technology. The lack of clarity on AI identification metadata is more concerning, suggesting Adobe prioritizes seamless integration over transparency. As studios begin generating marketing content and franchise extensions indistinguishable from human-created material, audiences may lose the ability to distinguish between authentic creative work and AI-generated content.
The real test comes when these custom models start producing content at scale. If a Disney-trained model generates a scene that accidentally resembles a Pixar sequence, who owns what? Adobe's "commercially safe" promise assumes clean boundaries between IP that may not exist in practice, especially given how much entertainment companies license, acquire, and cross-pollinate content. The complexity multiplies when considering that many films involve multiple studios, international co-productions, and licensed elements, creating training datasets that may be legally owned but creatively entangled.