On Thursday, the Academy of Motion Picture Arts and Sciences published new rules for the 99th Oscars. Screenplays must be "human-authored." Only performances "demonstrably performed by humans with their consent" qualify for acting prizes. The Academy reserved the right to request additional information about how AI tools were used in any submitted film.
On Monday, China's National Film Administration granted Bona Film Group a public-screening license for Sanxingdui: Future Memories, a ninety-minute sci-fi feature made with ByteDance's AI video tools. It is the first AI-generated film approved for Chinese theatrical release. Regulators classified it as animation because no censorship category for AI-generated cinema exists yet.
Same week. Same technology. One institution drew a line around who creates. The other drew a line around what gets distributed. Both institutions are answering the same question, and the answers face opposite directions.
What the words say
Last year's Academy guidance was a suggestion. Voters should consider "the degree to which a human was at the heart of the creative authorship." That was a principle a voter could feel while staring at a ballot. It was not a rule anyone could enforce while reviewing a submission.
The new language hardens the principle into policy. "Human-authored." "Demonstrably performed by humans." These are phrases with legal weight. They do not describe a preference. They describe an eligibility requirement.
The Academy declined to comment on whether Val Kilmer's posthumous AI performance in As Deep as the Grave would qualify. That silence is louder than a ruling. The film exists. The performance was generated from archival footage of a dead actor using publicly available tools. The estate approved. The family was compensated. The consent was given by people who loved him. And the new rules say the performance must be "demonstrably performed by humans." Kilmer did not perform this role. A model interpolated from recordings of other roles he performed during his life. Whether that constitutes "demonstrably performed" is the kind of question that makes a committee reach for the phrase "additional information."
The Kilmer case is the Academy's hardest test. Not because the technology is ambiguous. Because the human intention is clear and the human presence is absent.
What the words do not say
"Human-authored" does not mean "written without AI tools." It means a human shaped the creative work. This distinction matters enormously. A screenwriter who uses AI to generate dialogue options and then selects, rewrites, reorders, and integrates them into a screenplay they conceived, structured, and revised has authored the screenplay. A screenwriter who types "write me a sci-fi thriller about memory loss" and submits whatever the model returns has not.
The line between those two filmmakers is not the presence or absence of AI. It is the presence or absence of creative decisions.
Which means the Academy just wrote into its rules the same question that copyright law has been circling. Thaler v. Perlmutter said fully autonomous AI output cannot be copyrighted because copyright requires a human author. The Academy now says fully autonomous AI output cannot win an Oscar because the Oscar requires a human author too. Copyright protects the economic value of authorship. The Oscar protects the cultural prestige of authorship. Both arrived at the same requirement from different directions.
The hard middle remains. A film made by a filmmaker using AI tools at every stage, exercising specific creative judgment at each step, building structured prompts, iterating, editing, color-grading, assembling through editorial instinct. That film is human-authored. That filmmaker is an author. The Oscar rules do not exclude them. Neither does copyright law. Both systems are looking for the same thing: evidence that a human mind shaped the result.
The other room
In Beijing, the question is not authorship. It is throughput.
Sanxingdui: Future Memories was made with ByteDance's Jimeng AI tools. Human creative direction guided the project. Actors' facial expressions were captured and then synthesized by AI. The film is classified as animation because the regulatory apparatus does not yet have a category for what it actually is.
China's AI film industry has been building toward this moment with visible urgency. iQIYI announced seventy AI agents and a sixteen-film slate. Eros Media World is reviewing a three-thousand-title catalog for AI adaptation. Bona Film got the first screening license. The infrastructure is being built to produce AI-generated films at industrial scale, and the regulatory framework is keeping pace by creating channels for approval rather than criteria for exclusion.
The Academy asks: was this made by a human? China asks: can this be shown to an audience? One gatekeeps authorship. The other gatekeeps distribution. Both are institutional responses to the same pressure. The answers are not contradictory. They operate on different surfaces of the same object.
A film could satisfy both. Made with AI tools under human creative direction, authored by a filmmaker with documented decisions at every stage, and approved for theatrical distribution by regulators who cleared the content. The Academy would evaluate the authorship. China would evaluate the commerce. Neither institution is wrong about what it measures. Each is incomplete about what it ignores.
The synthetic performer
The acting rule is the sharpest new edge. "Demonstrably performed by humans with their consent." Seven words. Each one carrying weight.
Demonstrably. The performance must be provable. Documentation. Evidence. Not a claim in a press release but a verifiable chain from the actor's body to the final frame.
Performed. Not "depicted." Not "represented." Performed. The verb implies an act. A choice made in real time by a body in a room. Andy Serkis in a motion capture suit performed Gollum. The technology translated his body into a different body. Kilmer's AI replica was not performed. It was interpolated.
By humans. The obvious requirement and the load-bearing one.
With their consent. The living actor must agree. The estate of a dead actor raises a question the rule does not answer. Consent from the person who performed the original recordings is not the same as consent from the person who authorizes the reuse. Mercedes Kilmer can consent to the use of her father's likeness. She cannot consent to a performance he never gave.
The rule is clean on living performers. A living actor who is scanned, AI-enhanced, de-aged, or digitally doubled has performed with consent. A living actor whose likeness is generated without their participation has not performed at all. The rule cuts cleanly between these cases.
The dead are harder. And the dead are where the industry is heading. Because a dead actor does not renegotiate.
The paperwork era
The Academy also reserved the right to request "additional information about how AI tools were used in a film and the extent of human involvement." This is the disclosure mechanism. Not a blanket ban. Not an automatic pass. A request for paperwork.
Structured filmmaking produces paperwork naturally. A production that used AI generation tools with specific creative direction at every stage has a record: the prompts, the iterations, the reference images, the editorial decisions, the model selections. A production that generated footage through a chatbot and dragged it onto a timeline has a chat log and a timeline.
The documentation gap mirrors the authorship gap. Filmmakers who exercised vocabulary produce evidence of vocabulary. Filmmakers who accepted defaults produce evidence of acceptance.
CinePrompt's 1,457 cinematography controls are a decision record by nature. Every panel selection, every model choice, every prompt revision. Not because the tool was designed for legal compliance. Because structured creative intent, by definition, is documented creative intent. The audit trail is a byproduct of the craft.
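The kind of audit trail described above can be sketched as a minimal decision log. This is an illustrative sketch only; the `Decision` and `DecisionLog` structures and their field names are hypothetical, not CinePrompt's actual format or anything the Academy has specified.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Optional

@dataclass
class Decision:
    """One creative choice: at what stage, what was chosen, and why."""
    stage: str                   # e.g. "prompting", "editing", "color-grade"
    action: str                  # the specific choice made
    rationale: str               # the human intent behind it
    model: Optional[str] = None  # generation model involved, if any
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

@dataclass
class DecisionLog:
    """An audit trail produced as a byproduct of structured work."""
    project: str
    decisions: list = field(default_factory=list)

    def record(self, stage, action, rationale, model=None):
        self.decisions.append(Decision(stage, action, rationale, model))

    def evidence_of_authorship(self):
        """Summarize the chain of human choices for a disclosure request."""
        return [f"{d.stage}: {d.action} ({d.rationale})" for d in self.decisions]

# Usage: the log accumulates as the work happens, not after.
log = DecisionLog("short-film-test")
log.record("prompting", "rewrote shot 12 prompt for low-key lighting",
           "match the noir reference frames", model="video-gen-v2")
log.record("editing", "cut generated take 3, kept take 7",
           "performance timing matched the score")
print(log.evidence_of_authorship())
```

The point of the sketch is the design choice, not the schema: each entry pairs an action with a rationale, so the record captures creative intent rather than just tool usage, which is exactly the distinction the disclosure rules probe.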
Two lines, one question
The Academy drew its line around authorship. China drew its line around commerce. Copyright law drew its line around ownership. The Human Made Mark drew its line around substrate. Awards bodies, regulators, legal systems, and certification schemes are all converging on the same question, and every answer drawn so far resolves one dimension while leaving the others open.
The filmmaker in the middle, making work with AI tools, exercising vocabulary at every step, iterating, editing, assembling through judgment, sits in a position that satisfies every institutional test simultaneously. The screenplay is human-authored. The performance, where a human performed, is demonstrably human. The creative decisions are documented. The output is copyrightable. The work passes the Human Made Mark only if no AI was used, which is the one test it fails, but that test measures substrate, not authorship.
The rules arrived this week from two institutions on opposite sides of the planet. Both asked the same question the structured prompt has been answering since the first word was typed into a generation model. Who made the decisions?
If the answer is you, every institution has a place for you. If the answer is the model, no institution does.
Bruce Belafonte is an AI filmmaker at Light Owl. He has read more awards eligibility documents this week than in the previous thirty-five years combined and considers the ratio unsustainable.