November 12, 2019 marked the beginning of a quiet revolution in Hollywood: the first episode of The Mandalorian premiered, featuring a technology that would soon redefine how movies are made. Before this moment, creating a galaxy far, far away required actors to stare at green screens on empty soundstages, with their surroundings added weeks or months later by computer artists. The new approach placed actors inside a massive, curved video wall that displayed photorealistic environments in real time, allowing them to see the world they were inhabiting while filming. This innovation, known as StageCraft, was not merely a new tool but a fundamental shift in the relationship between the physical and digital worlds of filmmaking.

The technology was developed by Industrial Light & Magic, the visual effects division of Lucasfilm, to solve a specific problem that had plagued the industry for decades. While shooting the film Rogue One in 2016, cinematographer Greig Fraser ran into two persistent problems with green screens: the difficulty of matching the lighting on actors to digital backgrounds that did not yet exist, and the flat, depthless look the screens produced in-camera. Those frustrations inspired the idea of using large LED screens as a component of the set itself. The idea was developed further by a team including Industrial Light & Magic's Richard Bluff and Rob Bredow, as well as Kim Libreri of Epic Games. When director Jon Favreau began work on the Disney+ series The Mandalorian, ILM found the perfect opportunity: a production with a director prepared to embrace the technology.

The result was a facility known as The Volume, a soundstage where live-action actors and sets were surrounded by large, high-resolution LED video walls. These walls displayed the computer-generated backdrops that would traditionally have been composited in post-production using chroma key screens. During shooting, the production team could realign the background instantly to match the moving camera's position, so the entire CGI backdrop could be manipulated in real time, giving actors and directors immediate feedback on how their scenes would look in the final product.
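The geometry behind that instant realignment is well documented outside ILM: given the tracked camera position and the corners of a wall panel, the renderer builds an off-axis (asymmetric) projection so the image on the wall shows correct parallax from the lens's point of view. The sketch below follows Robert Kooima's generalized perspective projection; the wall dimensions, coordinates, and function name are illustrative assumptions, not StageCraft's actual interface.

```python
import numpy as np

def offaxis_projection(pa, pb, pc, eye, near, far):
    """Off-axis (asymmetric) projection for a planar screen, after Robert
    Kooima's generalized perspective projection. pa, pb, pc are the wall's
    lower-left, lower-right, and upper-left corners in world space; eye is
    the tracked camera position. All names here are illustrative."""
    # Orthonormal basis of the screen plane.
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # screen right
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # screen up
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)                   # screen normal, toward the eye

    # Vectors from the eye to the screen corners, and eye-to-plane distance.
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)

    # Frustum extents projected onto the near plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard OpenGL-style asymmetric frustum matrix.
    P = np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

    # Rotate the world into the screen's basis, then move the eye to the origin.
    R = np.eye(4)
    R[0, :3], R[1, :3], R[2, :3] = vr, vu, vn
    T = np.eye(4)
    T[:3, 3] = -eye
    return P @ R @ T

# Hypothetical 6 m x 3 m wall section, 4 m in front of the tracked camera.
pa = np.array([-3.0, 0.0, -4.0])   # lower left
pb = np.array([ 3.0, 0.0, -4.0])   # lower right
pc = np.array([-3.0, 3.0, -4.0])   # upper left
eye = np.array([0.5, 1.7, 0.0])    # camera position reported by the tracker

# Recomputed every frame as the camera moves, keeping the parallax correct.
print(offaxis_projection(pa, pb, pc, eye, near=0.1, far=1000.0))
```

Because the matrix depends only on the tracked eye position and fixed wall geometry, updating the backdrop amounts to a per-frame matrix rebuild plus a re-render, which is why an engine built to deliver dozens of frames per second was a natural fit.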
The Engine Behind The Illusion
At the heart of StageCraft lies a piece of software originally designed for video games, not cinema. ILM used Epic Games' Unreal Engine, a popular game engine, to handle real-time 3-D rendering of the computer-generated environments. This choice was significant because it allowed complex, interactive worlds to be rendered instantly as the camera moved, rather than after hours of offline processing. Other technology partners in StageCraft include FuseFX, Lux Machina, Profile Studios, Nvidia, and ARRI, each contributing specialized hardware and software to the ecosystem. Integrating these technologies required a level of coordination that had never been attempted before in the film industry.

In September 2020, it was announced that a second permanent volume was being built at Manhattan Beach Studios in Los Angeles, in addition to the first built for The Mandalorian, with completion expected in March 2021. A volume at Pinewood Studios in London, scheduled to open in February 2021, and a larger, custom-built one at Fox Studios Australia followed. These new volumes would be larger, use more LED panels, and offer higher resolution than the original at Manhattan Beach. ILM can also deploy pop-up virtual production configurations in other locations. In November 2021, a volume in Vancouver was announced, planned to open in early 2022.

The technology continued to evolve rapidly: for the second season of The Mandalorian, ILM iterated to StageCraft 2.0, which featured a larger volume as well as more specialized software. One example is Helios, a rendering engine designed by ILM specifically for StageCraft hardware. Helios allowed for even more complex lighting and shadow interactions, making the digital environments far harder to distinguish from real-world locations.

The collaboration between game developers and filmmakers created a new language of visual storytelling, in which the boundaries between pre-production, production, and post-production began to blur. The ability to see the final image on set meant that directors could make creative decisions in the moment, rather than hoping their vision would translate correctly after weeks of digital compositing. This shift empowered filmmakers to take risks and explore visual styles that had previously been too time-consuming or expensive to attempt.
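Helios itself is proprietary, but published virtual-production workflows, including Unreal Engine's own in-camera VFX documentation, describe the general shape of the per-frame loop: read the tracked camera pose, render the patch of wall the physical lens actually sees (the "inner frustum") at full quality, and fill the rest of the wall with a cheaper render that mostly supplies lighting and reflections. The sketch below illustrates that loop conceptually; every function and type name here is a stand-in, not ILM's or Epic's API.

```python
from dataclasses import dataclass

@dataclass
class CameraPose:
    position: tuple[float, float, float]
    rotation: tuple[float, float, float, float]  # quaternion from the tracker

def read_tracker() -> CameraPose:
    # Stand-in for the optical tracking feed (markers on the physical camera).
    return CameraPose((0.5, 1.7, 0.0), (1.0, 0.0, 0.0, 0.0))

def render_inner_frustum(pose: CameraPose) -> str:
    # Full-quality render of the patch of wall the physical lens can see;
    # these are the pixels that end up in the final photographed image.
    return f"inner frustum rendered for camera at {pose.position}"

def render_outer_frustum() -> str:
    # Cheaper render covering the rest of the volume; rarely photographed
    # directly, but it lights the actors and set and fills reflections
    # in helmets, armor, and other shiny props.
    return "outer frustum rendered for lighting and reflections"

def frame() -> tuple[str, str]:
    pose = read_tracker()               # 1. where is the camera right now?
    inner = render_inner_frustum(pose)  # 2. re-project the backdrop for it
    outer = render_outer_frustum()      # 3. keep the ambient light coherent
    return inner, outer                 # 4. drive the LED panels

if __name__ == "__main__":
    for _ in range(3):  # a few iterations of the per-frame loop
        print(frame())
```

The split matters because only the inner frustum must hold up to the scrutiny of the final frame; the outer region can trade resolution for coverage, which is one plausible reason a stage-specific renderer like Helios was worth building.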