OpenAI’s Sora Faces Hollywood Backlash Over Copyright Controls
OpenAI just dropped a bombshell about its Sora video tool. Content owners can now block their characters from appearing in AI-generated clips. Plus, the company plans to share revenue with studios that play along.
But here’s the real story. Major studios like Disney already opted out before OpenAI even announced these features. That tells you everything about Hollywood’s trust in AI companies right now.
The tensions run deeper than most people realize.
Studios Get Blocking Powers They Already Used
OpenAI rolled out Sora this week as a standalone app. Users can create 10-second videos from text prompts. Sounds simple. But the app lets people generate content using copyrighted characters without asking permission first.
Now CEO Sam Altman promises “more granular control” for rights holders. Studios can block their characters entirely. They can also choose specific ways their intellectual property appears in generated videos.
Disney didn’t wait around for these controls. Sources told Reuters the entertainment giant blocked Sora access to its material before the public launch. That’s a massive vote of no confidence in OpenAI’s copyright approach.
Other major studios remain silent about their plans. However, Disney’s preemptive block suggests broader skepticism across Hollywood about AI-generated content built on protected characters.
The Revenue Share Nobody Asked For
OpenAI wants to pay content owners who allow character generation. Altman admits the company needs to “take some trial and error to figure out” the payment model. Translation: They have no concrete plan yet.
But here’s what really matters. Users are creating far more video content than OpenAI expected. That creates two problems.
First, server costs are exploding. More videos mean higher computational expenses, and OpenAI needs new revenue streams to offset those unplanned costs.
Second, most user-generated content targets niche audiences. That makes monetization tricky since broad advertising won’t work effectively for highly specific content categories.
So OpenAI plans to test various approaches within Sora first. Then they’ll roll out whatever works across their entire product suite. That includes ChatGPT and other tools that might generate copyrighted content.
The timeline? Altman says “soon” but provides zero specifics. For studios considering whether to participate, that vagueness probably doesn’t inspire confidence.
Why Hollywood Stays Skeptical
Copyright tensions between AI companies and content creators keep escalating. Studios invest billions creating characters, storylines, and franchises. Now AI tools let anyone generate similar content in seconds.

OpenAI joins competitors like Meta’s Vibes platform and Google’s text-to-video tools. All these companies face the same fundamental question: How do you compensate creators when AI models train on their work without permission?
The legal framework remains unclear. Courts haven’t definitively ruled on whether training AI models on copyrighted content constitutes fair use. Meanwhile, AI companies rush forward building products that generate derivative works from protected intellectual property.
Studios also worry about brand control. What happens when users create inappropriate content featuring beloved characters? Who takes responsibility when AI-generated videos damage carefully crafted brand images?
These concerns explain why Disney opted out immediately. The studio spent decades building franchises worth billions, and it’s not about to hand that control to an AI company making vague promises about future monetization.
The Real Business Model Problem
Altman’s blog post reveals something interesting. OpenAI didn’t expect the volume of content users would create. That suggests they launched Sora without fully understanding the economics.
Generative video costs significantly more than text generation. Each video requires massive computational resources. Processing 10-second clips at scale burns through server capacity and electricity.
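As a rough illustration of why those costs add up, here is a back-of-envelope sketch of generative-video unit economics. Every number below is hypothetical, chosen only to show the shape of the math; OpenAI has not disclosed its actual per-clip costs or volumes.

```python
# Back-of-envelope unit economics for generative video.
# All parameters are HYPOTHETICAL illustrations, not actual OpenAI figures.

def cost_per_clip(gpu_hours_per_clip: float, gpu_hourly_rate: float) -> float:
    """Raw compute cost of generating one video clip."""
    return gpu_hours_per_clip * gpu_hourly_rate

def daily_compute_bill(clips_per_day: int, cost_each: float) -> float:
    """Total daily GPU spend at a given generation volume."""
    return clips_per_day * cost_each

# Hypothetical assumptions: 0.05 GPU-hours per 10-second clip,
# $2.50/hour for rented GPU capacity, 1 million clips generated per day.
per_clip = cost_per_clip(gpu_hours_per_clip=0.05, gpu_hourly_rate=2.50)
per_day = daily_compute_bill(clips_per_day=1_000_000, cost_each=per_clip)

print(f"Cost per clip:  ${per_clip:.3f}")   # $0.125
print(f"Daily GPU bill: ${per_day:,.0f}")   # $125,000
```

The point isn’t the specific figures; it’s that cost scales linearly with generation volume, so every surge in user enthusiasm lands directly on the infrastructure bill.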
Now OpenAI faces a classic tech trap. They built something users love but didn’t calculate the costs properly. So they need revenue fast to cover expenses. That’s likely driving the sudden announcement about revenue sharing with studios.
But here’s the catch. Studios don’t need OpenAI. They can build their own AI tools or partner with companies that respect intellectual property from the start. Disney already invested heavily in AI research and development.
So why would major entertainment companies share revenue with OpenAI instead of keeping everything in-house? The value proposition seems unclear beyond gaining access to OpenAI’s technology.
Niche Content Creates Monetization Headaches
Altman mentioned that users create content for “niche audiences.” That’s actually a bigger problem than it sounds.
Traditional advertising works at scale. Brands pay premium rates to reach millions of people. But niche content fragments audiences into tiny segments. That makes advertising less valuable per view.
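To make that concrete, compare one mainstream audience against the same total views split across many small segments. The CPM rates here are hypothetical, purely to illustrate the fragmentation effect.

```python
# Hypothetical CPM arithmetic: why fragmented niche audiences earn less.
# CPM = advertiser cost per 1,000 impressions. Rates below are illustrative.

def ad_revenue(views: int, cpm: float) -> float:
    """Revenue from a block of ad impressions at a given CPM."""
    return views / 1000 * cpm

# One mainstream audience: 10M views at a premium $20 CPM.
mainstream = ad_revenue(10_000_000, cpm=20.0)

# The same 10M views split across 100 niche segments, each too small
# to command premium rates, earning an assumed $4 CPM instead.
niche = sum(ad_revenue(100_000, cpm=4.0) for _ in range(100))

print(f"Mainstream: ${mainstream:,.0f}")  # $200,000
print(f"Niche:      ${niche:,.0f}")       # $40,000
```

Same total views, a fraction of the revenue, because the per-view rate collapses once audiences fragment.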
OpenAI probably hoped Sora would generate mainstream content that attracts broad advertising support. Instead, users apparently create highly specific videos that appeal to small communities.
That forces OpenAI toward subscription models or transaction fees instead of advertising revenue. But subscriptions face resistance from users who already pay for ChatGPT. And transaction fees discourage the casual experimentation that makes AI tools spread.
So the company finds itself searching for a business model that actually works. Meanwhile, operational costs keep climbing as more users generate more videos.
What This Means for Creators
The announced controls sound reassuring. But they arrive after the product launched. That backwards approach reflects how AI companies typically operate: Build first, deal with consequences later.

Independent creators face different challenges than major studios. Small artists and designers lack legal teams to enforce copyright protections. They can’t afford to block AI tools that might help them remain relevant.
So while Disney opts out, smaller content creators might feel pressure to opt in. They’ll probably accept whatever revenue sharing OpenAI eventually offers because they have fewer alternatives.
That creates an imbalanced power dynamic. Major studios protect their intellectual property while independent artists subsidize AI development with their work.
OpenAI’s planned “trial and error” approach also benefits larger entities. Studios can afford to wait and see what terms emerge. Individual creators need income now and might accept unfavorable deals early in the process.
The Microsoft Connection Matters
Microsoft backs OpenAI with billions in investment. That partnership gives OpenAI significant financial runway to experiment with business models.
But Microsoft also wants returns on its AI investments. The software giant faces growing questions from investors about when AI products will generate meaningful profits.
So while Altman talks about taking time to figure out monetization, Microsoft’s financial pressure likely drives faster timelines. That could lead to rushed deals with content owners or pricing models that optimize for revenue over fairness.
Microsoft also competes with Google and Meta in AI video generation. That competitive pressure might push OpenAI to prioritize growth over sustainable relationships with content creators.
Trust Defines the Future
OpenAI’s announcement reveals the fundamental issue. AI companies built tools using content they didn’t create or license. Now they’re retrofitting permission systems after facing backlash.
That approach destroys trust. Studios remember being told AI would enhance creativity, not replace human artists or generate unauthorized derivative works. Now they see the same companies that made those promises scrambling to add controls.
Disney’s immediate opt-out demonstrates how badly OpenAI miscalculated. The entertainment industry watched AI development closely. They understood the implications before OpenAI launched consumer products.
So when OpenAI announces better controls and revenue sharing, studios rightfully question whether the company will follow through. Past behavior suggests AI firms prioritize technological advancement over creator compensation.
These trust issues extend beyond Hollywood. Musicians, authors, and visual artists all face similar concerns about AI tools trained on their work without permission. OpenAI’s handling of Sora will influence how other creative industries approach AI partnerships.
Altman promises implementation “soon” for revenue sharing. But “soon” isn’t good enough when studios have already withdrawn support. OpenAI needed these systems before launch, not after losing major content partners.
The damage might already be done. Other AI companies will learn from this mistake and potentially build better relationships with content owners from the start. That could leave OpenAI playing catch-up in the text-to-video market it pioneered.