- Utilising digital set design to the fullest.
- Developing innovative projects using the XR stage.
- Integrating high-end performance and motion capture technology directly with Unreal Engine.
Today’s Virtual Production processes leverage advances in computing power, VFX expertise and game-engine technology to:
- create environments that interact with the live action, allowing filmmakers to make real-time decisions and changes
- create backgrounds which move in full perspective with respect to the camera’s position, updating in real-time as the camera moves
- allow the production to move from one convincingly detailed and photo-real environment to another without leaving the stage
- allow final shots to be captured in camera, without the need to replace the backgrounds in post-production
In effect, locations from all over the world can be brought to the stage with compelling photo-realism.
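The parallax effect described above comes from rendering the background with an asymmetric frustum that tracks the physical camera, rather than a fixed one. Below is a minimal sketch of that projection math in Python, following the generalized perspective projection commonly used for tracked displays; the wall corner positions and near-plane value are illustrative, not from any particular stage.

```python
from math import sqrt

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])
def norm(a):
    m = sqrt(dot(a, a))
    return tuple(x / m for x in a)

def wall_frustum(pa, pb, pc, eye, near=1.0):
    """Asymmetric (off-axis) frustum for a planar screen.

    pa, pb, pc: lower-left, lower-right, upper-left wall corners (world space).
    eye: tracked camera position. Returns (left, right, bottom, top) extents
    at the near plane; recomputed every frame as the camera moves.
    """
    vr, vu = norm(sub(pb, pa)), norm(sub(pc, pa))
    vn = norm(cross(vr, vu))          # wall normal, pointing toward the camera
    va, vb, vc = sub(pa, eye), sub(pb, eye), sub(pc, eye)
    d = -dot(va, vn)                  # camera-to-wall distance
    return (dot(vr, va) * near / d,   # left
            dot(vr, vb) * near / d,   # right
            dot(vu, va) * near / d,   # bottom
            dot(vu, vc) * near / d)   # top

# A 4 m x 2 m wall two metres in front of the stage origin:
PA, PB, PC = (-2, -1, -2), (2, -1, -2), (-2, 1, -2)
print(wall_frustum(PA, PB, PC, eye=(0, 0, 0)))  # centered camera: symmetric frustum
print(wall_frustum(PA, PB, PC, eye=(1, 0, 0)))  # camera moved right: frustum skews left
```

Feeding the tracked camera position into this each frame is what makes the displayed background shift "in full perspective", so the flat wall reads as a deep environment through the lens.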
Every director and visual effects (VFX) professional defines virtual production slightly differently. At its core, virtual production is modern content creation: an agile process characterized by starting VFX development early and leveraging technology throughout the entire production life cycle. The result is a digitally enhanced process that is flexible, robust, time-saving, and cost-effective at every stage.
Traditional production is highly linear: directors and cinematographers plan scenes using storyboards and shot lists, actors are filmed on physical sets or against green screens, and the work of editors and VFX artists only begins after filming is complete. This one-way procession through pre-production, production, and post-production can encourage negative outcomes such as a “fix-it-in-post” mentality, destructive or duplicative VFX labor, and expensive reshoots.
Virtual production, on the other hand, is iterative and creative. Beginning VFX work in pre-production ensures that digital assets are available for planning and shooting, making it easier to continuously refine the final look and feel throughout the course of production. Virtual production is an expansion of the traditional filmmaking playbook, enabling studios to pursue greater experimentation while keeping time and production costs under control. The core techniques of a virtual production pipeline include:
Visualization
Using 3D VFX assets to visualize and plan a scene.
There are many types of visualization (for example, pitchvis, techvis, and postvis), but the preeminent form is previs: planning a scene in 3D before shooting.
Motion capture (aka Mocap)
Capturing the movements of people to animate VFX assets.
Mocap systems have been used in film since the 1980s, but over time hardware form factors have shrunk and software has become increasingly automated, leading to an increase in the breadth and depth of its use. Excellent motion capture is still technically and artistically challenging, and it remains a critical tool for creating realistic animations of digital humans and creatures.
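Under the hood, a mocap stream is essentially per-frame joint rotations applied to a digital skeleton. A toy forward-kinematics sketch in Python shows the idea for a 2D chain; real rigs are 3D, use quaternions, and have many more joints, and the angles and bone lengths here are purely illustrative.

```python
from math import cos, sin, pi

def fk_chain(angles, lengths, root=(0.0, 0.0)):
    """Forward kinematics for a 2D joint chain.

    angles: per-joint rotations (radians) relative to the parent bone.
    lengths: bone lengths. Returns world-space joint positions, root first.
    Streaming a new frame of angles from a mocap system re-poses the same rig.
    """
    x, y = root
    theta = 0.0
    points = [(x, y)]
    for a, bone_len in zip(angles, lengths):
        theta += a                      # rotations accumulate down the chain
        x += bone_len * cos(theta)
        y += bone_len * sin(theta)
        points.append((x, y))
    return points

# A two-bone "arm": shoulder rotated 90 degrees up, elbow bent 90 degrees back.
print(fk_chain([pi / 2, -pi / 2], [1.0, 1.0]))
```

The point of the sketch is that the captured performance is data, not footage: the same angle stream can drive a human, a creature, or a stylized character.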
Hybrid camera (aka Simulcam, green screen hybrid)
Compositing digital VFX with live-action camera footage in real-time.
Simulcam was originally developed and coined by Weta Digital and James Cameron for the movie Avatar (2009). It is a direct improvement over shooting against a green screen, as visualizing the digital and physical simultaneously with accurate parallax helps directors gain a better spatial understanding of the scene. Actors also benefit from seeing a preliminary view of the visual effects instead of acting against a green wall.
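At the heart of this real-time compositing is the standard "over" operation applied to every pixel of every frame. A minimal straight-alpha sketch in Python follows; production systems run this on the GPU with lens distortion, keying, and color management layered on top, so treat this as the core formula only.

```python
def over(cg_layer, alpha, plate):
    """Straight-alpha 'over' composite of a CG layer onto a live-action plate.

    cg_layer, plate: images as lists of (r, g, b) tuples with channels in [0, 1].
    alpha: per-pixel CG coverage, also in [0, 1] (1.0 = fully opaque CG).
    """
    out = []
    for (r, g, b), a, (pr, pg, pb) in zip(cg_layer, alpha, plate):
        # out = cg * alpha + plate * (1 - alpha), per channel
        out.append((r * a + pr * (1 - a),
                    g * a + pg * (1 - a),
                    b * a + pb * (1 - a)))
    return out

# One green CG pixel at 50% opacity over a mid-grey live-action pixel:
print(over([(0.0, 1.0, 0.0)], [0.5], [(0.5, 0.5, 0.5)]))  # [(0.25, 0.75, 0.25)]
```

Running this fast enough to keep up with the camera is what lets the director see digital characters standing in the physical set through the viewfinder.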
LED wall (aka LED volume)
Replacing shooting against green screens with shooting against LED panels that display final-quality VFX.
LED walls are a natural progression from the technique of 2D video screen projection, and their ability to cast light on actors and props for accurate reflections is a significant benefit in post-production.
So far, virtual production has been driven by creatives. Among the many use cases for virtual production tools, common themes include:
- Improving storytelling: When visualization spans the entire production life cycle and is used to plan scenes and shots, it helps to ensure that the storyline, captured footage, and VFX alike accurately reflect the director’s vision.
- Resolving ambiguity: Historically, actors and directors shooting against a green screen have had to wait until post-production to see their output. Hybrid camera and LED volumes allow creatives and talent to see what they are interacting with and what they are producing in real-time.
- Unlocking possibilities: Photogrammetry and 3D scanning enable virtualization of real-world sets and environments. Inside game engines, filmmakers can reshape mountains, move the sun, and create a 12-hour-long sunset.
Adoption of virtual production can support desirable business outcomes such as reduced costs across the production and a faster time to market for the final product. Analysts note that Hollywood may be reaching the limits of efficiency under the traditional production methodology. The mindset and toolset of virtual production can help studios grow past these limits, supporting not only better creative outcomes but also potentially significant time and cost savings.
Based on our research, reshoots are common on high-budget films and can account for 5% to 20% or more of the final production cost. Although not every story or director is a good fit for LED live-action production, virtualizing sets helps to greatly reduce those costs, as well as travel, transportation, and location costs, all while decreasing on-site risk. VFX costs on a high-budget sci-fi/fantasy film can reach 20% of the total film budget. Shooting against an LED wall also significantly reduces post-production VFX costs such as compositing and rotoscoping, helping filmmakers get ready for test screenings sooner.
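The percentages above can be turned into a rough back-of-the-envelope calculation. In the sketch below, the 10% reshoot share is one point inside the 5% to 20% range cited above, and the fraction of reshoot cost a virtual set avoids is purely an illustrative assumption, not a figure from the source.

```python
def reshoot_exposure(budget, reshoot_share=0.10, avoided_fraction=0.6):
    """Rough reshoot-cost exposure and the share an LED volume might avoid.

    reshoot_share: reshoots as a fraction of final production cost
                   (the text cites 5% to 20% or more; 10% is a midpoint pick).
    avoided_fraction: share of reshoot cost virtualized sets could eliminate
                      (illustrative assumption only).
    """
    cost = budget * reshoot_share
    return cost, cost * avoided_fraction

# A hypothetical $200M tentpole:
cost, saved = reshoot_exposure(200_000_000)
print(f"reshoot exposure ${cost / 1e6:.0f}M, potential savings ${saved / 1e6:.0f}M")
```

Even under conservative assumptions, the exposure is large enough that avoiding a fraction of it can cover significant stage investment.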
Virtual production may also have cost benefits further downstream from principal photography. LED volumes and virtual sets can be used by marketing teams to shoot commercial and VFX assets, which can be reused for sequels, subsequent seasons, and other media.
While reusing digital assets is not impossible today, it is not the norm. Most organizations have many disparate digital versions of the same 3D asset because each asset is tied to an individual show, and even within a given show, production and marketing budgets are siloed. As those barriers fall, shared asset catalogues, collaboration between virtual production studios, and collective pipelines will slowly become the new real-time production standard for in-camera VFX.