Amazon’s ‘House of David’ Leverages AI for Hundreds of Visual Effects in Season 2

Key Points
- Amazon Prime’s "House of David" used roughly four times as many AI‑generated shots in season 2 as in season 1.
- AI shots rose from "more than 70" in the first season to "between 350 and 400" in the second.
- Showrunner Jon Erwin described AI as a cost‑effective way to create large‑scale battle scenes and landscapes.
- The production stack combined image generators, up‑resolution tools and video generators from Runway, Luma, Google and Adobe, using "10 to 15 core tools".
- Critics called the visual effects "hokey," while faith‑based audiences responded positively.
- SAG‑AFTRA noted a cautious industry approach, emphasizing consent and fair compensation for AI‑generated likenesses.
- Erwin sees AI as a new form of live‑action filmmaking that could lower budgets and enable more projects.

Amazon Prime’s biblical drama “House of David” dramatically increased its use of generative AI in its second season, employing roughly four times as many AI‑generated shots as the first. Showrunner Jon Erwin described the approach as a cost‑effective way to create large‑scale battle scenes and expansive landscapes. The production combined image generators, up‑resolution tools, and video generators from vendors such as Runway, Luma, Google and Adobe, using “10 to 15 core tools.” While some critics decried the visual quality, industry labor groups noted the cautious adoption of AI, emphasizing consent and fair compensation for performers.

AI‑Driven Visuals Transform the Second Season
Amazon Prime’s series “House of David,” which follows the rise of the biblical king, dramatically expanded its use of generative AI for visual effects in the second season. The production moved from “more than 70” AI‑generated shots in the first season to “between 350 and 400” in the follow‑up, a jump described as “four times as much AI.” Showrunner Jon Erwin explained that the technology allowed the team to create large‑scale battle sequences, stone fortresses, fires on hillsides and sweeping mountain vistas without the budget required for traditional VFX.

Toolset and Workflow
Erwin’s team employed a stack of AI tools, categorizing them into three types: image generators, up‑resolution tools, and video generators. By the end of production, they were “using 10 to 15 core tools,” including products from Runway, Luma, Google and Adobe. This combination enabled the creation of fully virtual shots that were integrated with live‑action footage, effectively serving as a digital “puppet” for the story’s visual world.

Industry Reaction and Labor Perspective
Critics such as Variety’s Alison Herman labeled the series “wooden and cheap‑looking” with “hokey special effects,” while evangelical outlets praised its faith‑based storytelling. SAG‑AFTRA, the largest performers’ union, noted that the industry is taking a “cautious approach” to AI, emphasizing informed consent and fair compensation for likenesses. The union reported limited violations and observed that most AI usage currently focuses on editing efficiency rather than replacing performers.

Future Outlook
Erwin views AI as a new form of live‑action filmmaking, arguing that lower production costs could enable more projects and create new jobs. However, other industry voices, such as actress Justine Bateman, caution that AI may not solve employment challenges and could be used primarily to boost profit margins. As AI tools become more integrated, the expectation is that they will blend into standard production pipelines rather than remain a marketing headline.