Can Hollywood Keep AI Out?


Universal is slapping a new line in movie end credits that basically says, “Do not use this film to train AI.” You can spot it in *How to Train Your Dragon*, *Jurassic World Rebirth*, and *The Bad Guys 2*. Think of it like a “No Trespassing” sign for data scrapers; it doesn’t electrify the fence, but it does make “we told you so” a lot easier in court.
One polite sentence can decide who pays to feed Hollywood into AI. If more studios copy this, model makers either license footage (hi, paperwork) or avoid it. That can change training data, raise costs, and speed up the lawsuits that will set the rules. This isn’t a tech upgrade; it’s an ownership move that messes with the AI supply chain.
The goal here is threefold: reserve rights on the record, scare off unlicensed scraping, and push AI companies toward paid deals instead of the classic “we’ll fix it in post after the lawsuit.” In Europe, this also plugs into “text and data mining” rules, where rightsholders can explicitly opt out of having their work mined. Translation: the credits line isn’t just sass; in some places, it’s the legal off-switch.
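A credits card is written for humans; scrapers read config files. For the web-facing version of the same opt-out, here’s a minimal, hypothetical robots.txt sketch using publicly documented AI crawler names. It’s a request, not a lock, which is exactly why the on-the-record legal reservation still matters.

```text
# Hypothetical robots.txt for a studio site. The crawler names below
# are publicly documented; honoring them is voluntary.
User-agent: GPTBot            # OpenAI's training crawler
Disallow: /

User-agent: CCBot             # Common Crawl's crawler
Disallow: /

User-agent: Google-Extended   # Google's AI-training opt-out token
Disallow: /
```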
This isn’t about keeping up with the neighbors; it’s about locking the cookie jar. The worry is generative AI tools (and the look-alikes they spit out) cloning a film’s style without calling payroll. That’s why Universal and Disney sued Midjourney in June over alleged unlicensed use of their IP. The warning in the credits is basically the receipt stapled to the lawsuit: opt-out recorded, proceed at your own risk.
If you make movies, add “no AI training” language to your contracts and screeners, label your files, and keep basic records or watermarks so you can prove misuse. Actors: set clear rules for your face and voice, and get paid if they’re used. Companies: pick AI tools that respect opt-outs and have real licenses; get it in writing, and set a short rule for what staff can upload.
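If “keep basic records” sounds hand-wavy, here’s a minimal sketch in Python (the folder name is hypothetical): it fingerprints every file with a SHA-256 hash and a UTC timestamp, the boring kind of paper trail that makes “prove misuse” possible later.

```python
import datetime
import hashlib
import json
import pathlib

def build_manifest(folder: str, out_file: str = "manifest.json") -> None:
    """Record a SHA-256 fingerprint and a UTC timestamp for every file,
    so you can later show exactly what you had, and when."""
    records = []
    for path in sorted(pathlib.Path(folder).rglob("*")):
        if path.is_file():
            records.append({
                "file": str(path),
                "sha256": hashlib.sha256(path.read_bytes()).hexdigest(),
                "recorded_at": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            })
    pathlib.Path(out_file).write_text(json.dumps(records, indent=2))

# Hypothetical folder of watermarked screeners; swap in your own path.
build_manifest("screeners/")
```

Pair the manifest with visible or invisible watermarks on anything you send out, and you have both the “what” and the “when” if a clone shows up.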
For everyone just grabbing extra butter, you may see fewer obvious movie-style clones in consumer AI, more permission prompts, and (hopefully) clearer rules on who gets paid when culture trains code. Not flashy, but this is how the internet’s fine print gets written.
Does a “do not train” label make you feel protected, annoyed, or meh? Drop a note on how this could hit your job, your team, or someone you know, and whether you’d ever license your work to AI for the right price.
- Matt Masinga
*Disclaimer: The content in this newsletter is for informational purposes only. We do not provide medical, legal, investment, or professional advice. While we do our best to ensure accuracy, some details may evolve over time or be based on third-party sources. Always do your own research and consult professionals before making decisions based on what you read here.*