Natasha Lyonne used Fortune’s Brainstorm AI conference in San Francisco (Dec. 8–9) to spotlight the Marey AI video platform, a “licensed-first” generator built with Moonvalley that aims to reduce copyright risk as studios and regulators tighten scrutiny of generative video.
Asteria’s pitch: powerful AI without “scraping first, asking later”
Actress, director, and producer Natasha Lyonne says the next phase of AI in Hollywood will be defined less by flashy demos and more by how responsibly models are built—and what they were trained on.
At Fortune’s Brainstorm AI event, Lyonne criticized what she portrayed as permissive training and data collection norms in parts of the generative AI sector. In a widely shared remark, she argued it is not “super kosher copacetic” for companies to “rob freely” in the name of speed or global competition.
That context matters because AI video is moving quickly from experimental clips to workflows that touch real productions: previs (pre-visualization), pitch materials, VFX concepting, reshoots, background plates, and ad creative. As this expands, so does legal exposure—not only around outputs that resemble protected characters, but also around whether training datasets included copyrighted footage without permission.
What Marey is, and why Asteria is betting on it
The Marey AI video platform is the generative video model and toolset developed by Moonvalley, with Asteria Film Co. closely involved in product direction and creative workflows.
Marey is named after Étienne-Jules Marey, the French physiologist and chronophotography pioneer known for his motion studies. The naming is symbolic: the product is positioned as a filmmaker-facing tool focused on motion, camera language, and controllability rather than “prompt-and-pray” generation.
“Licensed and protected” training approach
Moonvalley markets Marey as trained only on licensed, high-resolution footage—explicitly avoiding scraped web video and user submissions. The company frames this as an attempt to build a model that professional studios and agencies can use with more confidence, particularly for commercial projects where risk is priced in.
This positioning is not subtle. It is designed as a contrast to broader industry debate over whether training on copyrighted works without permission should be treated as fair use, and whether the source of the material (including pirated copies or paywalled content) changes the analysis.
What filmmakers can do with the Marey AI video platform
Moonvalley’s public messaging emphasizes production-style controls and repeatability—features that matter for editing and for directing shots, not just generating “cool clips.”
On its product pages, Moonvalley highlights controls such as:
- Camera control (moving through a scene as if operating a virtual camera)
- Motion transfer (applying motion from a reference clip to a new subject/scene)
- Trajectory control (drawing a path for objects to follow)
- Keyframing / timeline-guided transitions (guiding sequences using multiple reference images)
These capabilities reflect a trend across AI video: tools competing on controllability and consistency, not only realism.
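Moonvalley has not published a developer API for these features, so the sketch below is purely illustrative of what shot-level controllability can look like when a shot is described as structured input rather than a single prompt. Every class and field name here is hypothetical and is not taken from Marey.

```python
# Hypothetical sketch only: these types are NOT Marey's API. They illustrate
# what "controllability" can mean when a video model accepts shot-level inputs.
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Keyframe:
    """A reference image pinned to a point in the clip's timeline."""
    time_s: float       # where in the shot this frame should land
    image_path: str     # reference still guiding the look at that moment


@dataclass
class TrajectoryPoint:
    """A point on a drawn path a subject should follow (normalized coordinates)."""
    time_s: float
    x: float  # 0.0-1.0, left to right
    y: float  # 0.0-1.0, top to bottom


@dataclass
class ShotSpec:
    """One generated shot described in production terms rather than a single prompt."""
    prompt: str
    duration_s: float
    camera_move: str = "static"                  # e.g. "dolly-in", "orbit", "crane-up"
    motion_reference_clip: Optional[str] = None  # clip whose motion is transferred
    trajectory: list[TrajectoryPoint] = field(default_factory=list)
    keyframes: list[Keyframe] = field(default_factory=list)


# Example: a 6-second shot with a slow dolly-in, start/end look keyframes,
# and a subject path drifting from frame-left toward center.
shot = ShotSpec(
    prompt="A detective crosses a rain-slicked street at night",
    duration_s=6.0,
    camera_move="dolly-in",
    keyframes=[Keyframe(0.0, "boards/street_wide.png"), Keyframe(6.0, "boards/street_close.png")],
    trajectory=[TrajectoryPoint(0.0, 0.15, 0.6), TrajectoryPoint(6.0, 0.5, 0.55)],
)
print(shot.camera_move, len(shot.keyframes), len(shot.trajectory))
```

The point of structuring a shot this way is repeatability: the same specification can be regenerated, versioned, and adjusted one parameter at a time, which is closer to how production pipelines already work.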
Pricing and access
Marey is available publicly via subscription tiers that use a credits model.
| Plan | Monthly Price (USD) | Included Credits (per month) | Notes |
| --- | --- | --- | --- |
| Starter | $14.99 | 100 | Entry tier for creators experimenting with workflows |
| Creator | $34.99 | 250 | Mid-tier for more frequent iteration |
| Pro | $149.99 | 1,000 | Higher-volume generation for production teams |
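The table does not state how many credits a given clip length or resolution consumes, so the only arithmetic it directly supports is effective price per credit. A quick sketch using the listed figures:

```python
# Effective price per credit for each tier in the table above.
# Note: the credits-to-footage conversion is not stated here, so this says
# nothing about cost per second of generated video.
tiers = {
    "Starter": (14.99, 100),
    "Creator": (34.99, 250),
    "Pro": (149.99, 1000),
}

for name, (monthly_usd, credits) in tiers.items():
    print(f"{name}: ${monthly_usd / credits:.3f} per credit")

# Starter: $0.150 per credit
# Creator: $0.140 per credit
# Pro:     $0.150 per credit
```

In other words, the tiers differ mainly in volume rather than unit price, which is typical of credits-based pricing for generation tools.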
Why this launch lands during a copyright pressure wave
The timing of Lyonne’s push is not accidental. Generative AI is entering a period where rights questions are no longer theoretical. They are becoming procurement questions, insurance questions, and litigation questions.
Regulators are narrowing the “it’s probably fair use” comfort zone
In May 2025, the U.S. Copyright Office released a major report on generative AI training that lays out a framework many rights holders have embraced: lawful access and market harm both matter to the fair-use analysis.
The report notes that training datasets may include paywalled or pirated works, and it states that knowingly using pirated or illegally accessed material should weigh against fair use, even if that fact is not decisive on its own. For film and TV, that language raises the stakes because moving images are among the most commercially sensitive categories of content.
Industry groups are also applying pressure
In fall 2025, the Motion Picture Association publicly urged OpenAI to curb copyright infringement tied to Sora 2 outputs. The controversy was amplified by how quickly copyrighted characters and recognizable IP appeared in user-generated videos circulating online, driving calls for stronger controls and clearer policies.
Even outside entertainment, the overall trend line is clear: more lawsuits, more claims about training data provenance, and growing expectations that AI builders will pay for content—or prove they had lawful access.
Moonvalley’s funding suggests “licensed video AI” is becoming a real business lane
Moonvalley says it raised $84 million in additional funding in July 2025 to scale Marey and expand its licensed-content approach, with General Catalyst leading and strategic participation from Creative Artists Agency (CAA), CoreWeave, and Comcast Ventures.
Funding matters here because licensing high-quality video is expensive. Building a “clean dataset” strategy often means paying creators, studios, archives, or specialized suppliers—then building governance around consent, documentation, and auditability. That is harder for underfunded startups, and it may help explain why the sector could consolidate around a smaller set of companies with the capital to license data at scale.
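Moonvalley has not publicly detailed what that governance layer looks like, but a hypothetical provenance record helps make “consent, documentation, and auditability” concrete. Every field name below is invented for illustration and is not a description of Moonvalley’s actual pipeline or schema.

```python
# Hypothetical provenance record for one licensed training clip.
# Illustrative only: not Moonvalley's schema.
from dataclasses import dataclass
from datetime import date
from typing import Optional


@dataclass(frozen=True)
class ClipLicenseRecord:
    clip_id: str                 # internal identifier for the footage
    source: str                  # rights holder or supplier the clip came from
    license_id: str              # contract or agreement the clip was licensed under
    license_scope: str           # e.g. "AI model training, worldwide, non-exclusive"
    licensed_on: date            # when the grant took effect
    expires_on: Optional[date]   # None if the grant is perpetual
    payment_reference: str       # invoice or royalty record tying payment to the clip


def usable_for_training(record: ClipLicenseRecord, today: date) -> bool:
    """A clip is only eligible while its license grant is still in force."""
    return record.expires_on is None or today <= record.expires_on
```

Maintaining records like this for every clip is the unglamorous cost of a “clean dataset” strategy, and it is exactly the kind of overhead that favors well-capitalized companies.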
How “copyright-conscious AI video” could reshape filmmaking workflows
Lyonne’s argument is not that AI will disappear from Hollywood. It is that adoption will accelerate—but only if creators, studios, and vendors believe the underlying tools do not undermine the rights ecosystem that funds film and TV.
What the Marey AI video platform is aiming to unlock
If a licensed-first model works as advertised, it can support:
- Indie production: faster previs and concept work without large VFX budgets
- Studio development: rapid iteration on scene ideas before greenlight decisions
- Advertising and branded content: shorter cycles and more versions, with lower legal uncertainty
- Post-production experimentation: testing camera moves, background variations, and transitions before committing to costly pipeline work
The key difference is procurement confidence. Many production companies will not touch tools that create uncertainty for distributors, insurers, or brand partners.
What remains unresolved
Even a “licensed-only” training story does not end every risk. Output similarity disputes can still arise. Talent likeness, voice rights, and guild agreements add additional constraints. And studios will likely ask for stronger documentation: what content was licensed, under what terms, and whether it can be audited.
That is where tools like Marey will be judged—not only by image quality, but by the paper trail behind the model.
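As a rough illustration of what “auditable” can mean in practice, the basic check is that every clip in a training manifest maps back to an entry in a license ledger. The sketch below is hypothetical and uses made-up identifiers.

```python
# Minimal audit sketch: every clip ID used in training should appear in the
# license ledger; anything left over is a provenance gap to explain.
def find_unlicensed_clips(training_manifest: set[str], license_ledger: set[str]) -> set[str]:
    """Return clip IDs used in training that have no corresponding license entry."""
    return training_manifest - license_ledger


# Example with invented IDs:
manifest = {"clip_001", "clip_002", "clip_003"}
ledger = {"clip_001", "clip_003"}
print(find_unlicensed_clips(manifest, ledger))  # {'clip_002'}
```

The real version of this check is harder, because it also has to verify license scope and term, but the principle is the same: the manifest and the ledger must reconcile.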
How the copyright debate collided with AI video in 2025
| Date | Event | Why It Matters |
| --- | --- | --- |
| May 2025 | U.S. Copyright Office releases Part 3 report on generative AI training | Adds weight to arguments that lawful access and market harm can undercut fair use, especially with pirated sources |
| July 2025 | Moonvalley announces $84M funding to scale licensed AI video | Signals investor appetite for “commercially safe” models built on licensed data |
| Sep–Oct 2025 | Sora 2 controversy intensifies over copyrighted characters and controls | Highlights reputational and policy risk for video generators |
| Dec 8–9, 2025 | Lyonne spotlights Asteria’s approach at Fortune Brainstorm AI (San Francisco) | Positions Marey as an alternative pathway for professional filmmaking use |
A test case for “ethical AI” in entertainment
The Marey AI video platform is emerging as a high-profile experiment in whether licensed datasets can compete on quality and speed with models built from scraped internet media.
For Hollywood, the question is practical: can AI lower costs and expand creative possibilities without hollowing out the rights system that pays for films and series? Lyonne and Asteria are betting the answer is yes—if the industry stops treating training data as a free resource and starts treating it like a supply chain that must be paid for, documented, and respected.
If that bet holds, “licensed-first” may become less of a marketing slogan and more of a minimum requirement for professional AI video tools.