How ILM and Epic Games Created Virtual Sets for The Mandalorian's First Season

A diagram of the composition of a hybrid digital/physical shot.
Image: Lucasfilm

VFX and filming are generally two separate parts of the complex process of making a television show or film. You film the live-action stuff, then you send it to the experts to add the visual effects. Well, what if you want to do both at the same time, streaming high-quality backdrops onto a set so the actors can perform against their actual digital surroundings? Maybe try using a game engine.


That’s the solution Lucasfilm came up with for The Mandalorian, anyway, which features a purportedly groundbreaking collaboration between Industrial Light & Magic on the one hand and Epic Games on the other to produce real-time digital sets using a semicircular LED video wall and Epic’s Unreal Engine. This isn’t the first time we’ve heard of Lucasfilm using video game technology to help innovate in digital filmmaking, but a new video released by ILMVFX gives us the most detailed look yet at how the process works.

It’s neat stuff. By rendering environments in Unreal Engine and then streaming them to these massive LED video walls, the VFX teams create backdrops in front of which live sets can be put together, allowing the actors to perform in the actual environment viewers are going to see on screen. This also limits the amount of time needed for after-the-fact VFX and allows for a more director-driven approach, as directorial choices aren’t constrained by the need to edit in VFX later on. Instead, the two arts can adapt to each other in a live setting.

So if you’re wondering why The Mandalorian is so visually engrossing, despite so heavily featuring digital effects, here’s part of the reason. The show returns for a second season this fall.

The video doesn’t go into the more interesting details of how the technology improves workflow. Not only does it provide a real-time photoreal backdrop that extends a set with less effort and cost, it also lights the actors accurately, and the portion of the backdrop seen through the camera lens shifts its perspective in real time to match the camera’s movement.
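That perspective trick can be sketched in miniature. The toy Python function below is my own illustration, not Lucasfilm’s or Epic’s actual code: it assumes a flat LED wall at z = 0 and a camera facing it head-on, and computes which patch of the wall (the camera-facing “inner frustum”) has to be re-rendered from the tracked camera’s point of view as it moves.

```python
import math

def inner_frustum_on_wall(cam_pos, h_fov_deg, v_fov_deg):
    """Project a tracked camera's view frustum onto a flat LED wall
    at z = 0 (camera looks straight at the wall from distance cz).
    Returns (left, right, bottom, top): the wall region that must be
    re-rendered from the camera's perspective this frame."""
    cx, cy, cz = cam_pos
    half_w = cz * math.tan(math.radians(h_fov_deg) / 2)
    half_h = cz * math.tan(math.radians(v_fov_deg) / 2)
    # The region is centered on the camera's position, so as the
    # camera dollies sideways, the re-rendered patch slides with it.
    return (cx - half_w, cx + half_w, cy - half_h, cy + half_h)

# A camera 4 m from the wall with a 90-degree horizontal field of
# view needs an 8 m wide strip of wall rendered from its viewpoint.
left, right, bottom, top = inner_frustum_on_wall((0.0, 1.8, 4.0), 90.0, 60.0)
print(round(right - left, 2))  # 8.0
```

The rest of the wall outside that patch can keep showing a generic view, since it only contributes ambient light and reflections rather than in-camera pixels.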

There are limits to this technology: people in the background aren’t always going to look convincing or seamlessly interactive, and foliage apparently still looks too artificial, but those elements can be handled with more traditional effects work.

What it does do is open up epic-scale possibilities for action and sci-fi/fantasy TV series. It’s not yet affordable for most of them, but that should change in short order.