
The challenge with Virtual Production

Faster, easier, cheaper: For years, studios have promised an approach to filmmaking using LED walls that allows for real-time customization of digital backgrounds, eliminating the need for post-production.

It is the future of filmmaking, as demonstrated by Industrial Light & Magic (ILM) in collaboration with Epic Games through the success story of "The Mandalorian." I am a big fan of these new possibilities and I am excited about every project that utilizes virtual production. As a cinematographer, I have been working with this and similar techniques for the past eight years, and I want to share my experiences here. Because at the end of the day, we cinematographers find ourselves in the studio facing a virtual background, and we should know the limitations of this technology.

Since the publication of the blog post on March 25, 2022, a lot has happened. New projects, as well as new developments and countless discussions with colleagues, DPs, and VPTDs (Virtual Production Technical Directors), have prompted me to add content to the article and clarify certain aspects. Enjoy reading.

What is really behind the technology?

The LED wall can indeed save a production a lot of money, but it heavily depends on what is being shot and the scale of the production. "The Mandalorian" had an estimated budget of 100 million dollars. Since I don't know how much it would have cost to shoot the series conventionally, it's hard to say if Disney saved money here. What is certain is that it represents a good investment in the future. Several seasons and other films are using this LED studio, further developing the technology, and the know-how is growing. In the long run, this can certainly result in significant cost savings.

Is virtual reality the future of filmmaking?

No, I don't believe that an entire Tatort (a German TV series) will ever be shot in front of an LED wall. Or that we would prefer to generate a farmhouse on a computer rather than going to the countryside and shooting on location. But there are many reasons to give this technology a chance.

As early as 2015, I was hired as a DP for a production to find an LED wall solution for the series "Capelli Code." For two years, I tested extensively with a team of creative minds, including the AOTO LED wall that would be used in Matt Reeves' "The Batman" years later. However, we concluded that the LED wall did not meet our requirements and instead opted for an innovative rear-projection solution using Flex-Glass by Screen Innovations.

Left: Film still of Capelli Code. Right: Set-up in the studio with a 10m Flex Glass Screen.

Much time has passed since then. Unreal Engine 5 has been released, now offering more realistic real-time rendered backgrounds compared to back then, and it would undoubtedly lead to more sets being digitally created today. However, there have also been significant advancements in tracking, rotoscoping, and greenscreen processing in recent years, which make the use of greenscreens easier.

These significant innovative steps are what I miss with the LED wall. The LED panels were originally designed for presentations at trade shows and as advertising displays, not for use as screens for virtual backgrounds in film studios. This becomes evident when working with them for an extended period, and unfortunately, the problems we encountered with the LED wall back in 2015 still persist today.

What are the problems and limitations of the technology?

One main problem - in my opinion - is the incredible hype surrounding the new technology. It is indeed an innovative tool for us filmmakers, and I am generally a big fan of the technology. However, because I now have the know-how and can deal with the limitations, I know that one should approach the promotion of the LED wall by studios and manufacturers with the necessary skepticism.

The LED wall and the Unreal Engine offer complete creative freedom on set.

That is the dream of directors and DPs. It is the promise of many LED studios, and even though it would be theoretically possible to change parts or even the entire background on the shoot day through real-time rendering, in practice, it is not a viable option. The Unreal Engine is primarily a game engine and was not designed specifically for virtual production. It runs reliably only when all assets, shaders, and parameters have been optimized for the computers in the studio.

Just because it runs smoothly on a computer at a VFX company does not guarantee the same performance in a VP studio. The nDisplay plugin provided by Unreal Engine distributes the data processing across multiple computers and their GPUs. The files usually need to be optimized for nDisplay and should be thoroughly tested in the studio long before the shoot. The same applies to the tracking system, which varies from studio to studio; the essential genlocking of the camera to avoid artifacts; the adjustment of the frustum based on lens and camera movement; and more. The seamless operation of a VP studio is a highly complex matter that cannot be compared to simply opening a UE file on one's own computer.

Therefore, when dealing with the production company for the studio, advocate for the VP studio and give the Virtual Production Supervisor and their team sufficient time for preparation and optimization of the project. This also means that no new 3D assets should be brought in on the shoot day, and no new levels or project files should be opened. Listen to the studio crew; otherwise, the shoot could end up in a disaster, and then the €30,000.00+ (HALOSTAGE 2022 price) per day for the studio will be the least of your concerns.

Good studios usually provide consultation and support to us DPs free of charge during the preparation phase. Don't hesitate to ask questions, especially if you are working with In-Camera Visual Effects (ICVFX) for the first time. They are as interested as we are in achieving a good end result.

The LED wall provides realistic, better lighting on the actors.

The individual LED panels cannot be bent (although Halo Stage is working on a bendable panel). This results in a stair-step effect in curves. Fortunately, this effect is hardly noticeable with the current LED walls, but it still exists.

Halo Stage in Berlin. Photo: Tom Keller

LED walls are not just for the background. They can also be mounted on the ceiling to illuminate the entire scene. With just a click, you can switch from a daytime scene to a nighttime scene without changing lamps.

In fact, the LED wall reflects light onto props, costumes, and actors, which can be incredibly helpful. For the shiny, silver armor of the Mandalorian, which would have been difficult to shoot in front of a green screen, this is a must-have. Furthermore, anyone who has shot on a green screen knows how important it is to create the right lighting atmosphere. With LED walls, this is very easy, provided that the studio knows what it is doing and maintains the wall.

Dead or defective LEDs do occur from time to time. Just like in any other LED unit, each diode ages over time, losing luminosity and shifting in color. AOTO adjusts the LED panels every three years, and according to HALO STAGE, they keep spare LEDs exposed to the same number of operating hours so that defective units can be replaced without leaving a trace. It is an incredible but necessary effort to ensure a homogeneous image and high quality. In the end, most LED panels become unusable after about 10 years and need to be completely replaced.

What about the quality of light?

Unfortunately, LED walls are not what a DP desires as a light source. They are simple RGB lights with poor color quality. This is because LED panels come from the trade show and advertising industry and were produced for the human eye. To display intense colors, LEDs with a narrow spectrum and a wide gamut are intentionally chosen.

As soon as a camera sensor comes into play and captures this limited color spectrum, problems arise. This issue is inherent to LED lights in general. Professional film light manufacturers like ARRI or DEDOLIGHT address it by incorporating additional white diodes into their RGBW lamps to broaden the color spectrum. However, it is unrealistic to expect an LED wall of SkyPanel quality, given the cost involved. It is worth asking, though, why one would insist on recognized brands for every other shoot and then accept inexpensive Chinese replicas in an expensive LED studio.

All measurements were carried out with the utmost care using a Sekonic C-800 color meter; deviating results under other conditions are possible.

The Color Rendering Index (CRI) describes the color reproduction of an artificial light source - in this case, the LED wall - compared to sunlight. The maximum value is 100 Ra, indicating that there are no color distortions perceivable by our eyes. For home use, a Ra value of 80+ is generally recommended, while a Ra value below 50 is considered inadequate.

However, the Ra value only considers the pastel colors R1 to R8. The saturated colors R9 to R15 are not taken into account. Many consumer lamps today exhibit good, artificially inflated Ra values, as manufacturers make sure that exactly these pastel colors are reproduced accurately during CRI measurement. It is therefore important to also consider the saturated colors and the SSI (Spectral Similarity Index), which was developed specifically for camera sensors and LED lights. I still measure the CRI value today because I am particularly interested in R9 (strong red), R13 (light, yellowish pink), and R15 (Asian skin tone). If these values are poor, even a high SSI value is of no use; if they are good, I verify the measurement with SSI.
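To make the distinction concrete: Ra is simply the arithmetic mean of the pastel indices R1 to R8, so a lamp can post a respectable Ra while failing badly on R9. The sketch below illustrates this with hypothetical R-values, not a real LED-wall measurement:

```python
# Ra is the arithmetic mean of the special indices R1..R8 (pastel samples).
# R9..R15 (saturated samples, including strong red R9) are not part of Ra.
# The R-values below are hypothetical, not a real measurement.

def cri_ra(r_values: dict) -> float:
    """Average of R1..R8, the value quoted on most spec sheets."""
    return sum(r_values[i] for i in range(1, 9)) / 8

def extended_average(r_values: dict) -> float:
    """Average over R1..R15, which also punishes poor saturated colors."""
    return sum(r_values[i] for i in range(1, 16)) / 15

# A plausible narrow-spectrum RGB panel: decent pastels, very poor strong red.
panel = {1: 85, 2: 88, 3: 90, 4: 82, 5: 84, 6: 86, 7: 83, 8: 80,
         9: 15, 10: 55, 11: 60, 12: 40, 13: 78, 14: 90, 15: 70}

print(f"Ra (R1-R8): {cri_ra(panel):.1f}")          # looks acceptable on paper
print(f"R9 (strong red): {panel[9]}")              # disqualifying for skin tones
print(f"R1-R15 average: {extended_average(panel):.1f}")
```

This is exactly why a spec sheet quoting only Ra tells you very little about how skin tones will render on camera.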

When CRI or SSI values are poor, as is generally the case with LED walls, it is impossible to achieve accurate color reproduction of costumes, props, and skin tones without additional lighting. Therefore, one must confront this limitation early on and search for an appropriate look. The color palette is limited, and the missing colors cannot be restored in post-production. Even when capturing the LED wall itself, color correction is necessary. The LEDs need to be adjusted to match the respective camera sensor, which is generally feasible without major issues, and good studios manage to accomplish this for various camera types, as long as the color shift due to the viewing angle is not too significant.

The color shift caused by the viewing angle is most noticeable in the ceiling panels. These panels, often of lower quality, cannot be matched to the LED wall. I argue that it is not coincidental that in the making-of footage from "The Mandalorian," no sky is projected onto the ceiling panels. The color discrepancy and incorrect perspective would simply make it look unconvincing.

Screenshots from: Why 'The Mandalorian' Uses Virtual Sets Over Green Screen | Movies Insider

Can an LED studio shoot work without film lighting?

As we have already established, the color quality is too poor. I have also encountered a director who believed that every situation could be lit using only the Unreal Engine and the LED wall. He failed to understand that the panels can only emit soft light. Even if a sun appears on the screen in the background, an additional hard light source is necessary to illuminate the foreground and achieve a realistic result.

In most LED walls, the individual panels can be removed, allowing a lamp to be shone through the wall. With the HALO STAGE system, this can be done in just a few seconds.

What about the white and black levels?

Initially, it was argued that the individual diodes of the LED wall can be completely turned off, resulting in a pure black at that location. However, since each LED is coated with a white diffuser to scatter the light, the otherwise black wall consists of many white dots that reflect ambient light. In fact, a modern rear-projection screen typically has much better black levels.

Halo Stage in Berlin. Photo: Tom Keller

In these images, the LED wall is completely turned off. The ceiling panels illuminate the set and, unfortunately, also the LED wall. Compared to the black shadows of the car, the LED wall appears light gray. One argument for using an LED wall over rear projection is its brighter white level. To reduce artifacts such as scanlines, it is recommended to run the LEDs at maximum brightness. However, it quickly becomes apparent that, contrary to popular assumption, LEDs do generate heat, and in these quantities it can be significant.

Whenever possible, it is advisable to avoid costumes and props that are dark black and instead lighten shadows in the foreground, which can be darkened again in post-production.

This does not seem to be a problem for a major Disney production, as they typically create costumes and props specifically for the production or use masks to selectively darken the blacks in the background during post-production. In most cases, the problematic background is simply replaced entirely in post-production. In addition to black levels, pixel pitch is often a factor.

What is pixel pitch?

Pixel pitch refers to the distance between individual pixels on the screen. Take the 2.86mm pixel pitch used on "The Mandalorian": the pixels are more than 50 times further apart than on an iPhone 13 Pro. Why does this matter? In practice, it can often be disregarded, for example when the screen is consistently filmed out of focus. However, as soon as you want to carry sharpness from the real plane onto the virtual projection plane, moiré effects inevitably occur.
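The "more than 50 times" figure follows directly from pixel density: pitch is just 25.4mm divided by pixels per inch. A quick sketch, using the iPhone 13 Pro's published 460 ppi:

```python
# Pixel pitch (mm) from pixel density: pitch = 25.4 / ppi.
MM_PER_INCH = 25.4

def pitch_from_ppi(ppi: float) -> float:
    return MM_PER_INCH / ppi

wall_pitch_mm = 2.86                     # pitch cited for "The Mandalorian" volume
iphone_pitch_mm = pitch_from_ppi(460)    # iPhone 13 Pro: 460 ppi

print(f"iPhone pitch: {iphone_pitch_mm:.4f} mm")          # ~0.0552 mm
print(f"Ratio: {wall_pitch_mm / iphone_pitch_mm:.0f}x")   # ~52x
```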

This happens when fewer diodes are captured than the camera has pixels. The problem becomes even worse with higher camera resolution. We already tested a 1.2mm pixel pitch in 2015, and the moiré effect occurred much later and was weaker. These panels were incredibly expensive even back then and continue to be today. That is likely the reason why many LED walls have a much larger pixel pitch than what is currently achievable.
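The condition "fewer diodes captured than the camera has pixels" is at heart a sampling problem, and the risk can be roughly estimated with simple geometry. The sketch below uses a pinhole-camera approximation with illustrative numbers (distance, focal length, and the ALEXA Mini LF's 36.70mm-wide, 4448-photosite sensor), not a studio measurement:

```python
# Rough moiré-risk check: if fewer LED pixels fall inside the frame than the
# sensor has horizontal photosites, the wall is undersampled when in focus.
# Pinhole approximation; all numbers are illustrative assumptions.

def led_pixels_in_frame(distance_m: float, focal_mm: float,
                        sensor_width_mm: float, pitch_mm: float) -> float:
    frame_width_m = distance_m * sensor_width_mm / focal_mm  # width of the wall in frame
    return frame_width_m * 1000 / pitch_mm                   # LED pixels across that width

sensor_px = 4448  # e.g. ARRI ALEXA Mini LF, horizontal photosites (open gate)
led_px = led_pixels_in_frame(distance_m=6, focal_mm=35,
                             sensor_width_mm=36.70, pitch_mm=2.86)

print(f"LED pixels across the frame: {led_px:.0f}")
print("moiré risk" if led_px < sensor_px else "likely safe")
```

With these assumed numbers, only about 2,200 LED pixels span the frame against 4,448 photosites, which is why sharp focus on the wall is so unforgiving.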

Left: LED wall with 7mm pixel pitch. Right: LED wall with 1.2mm pixel pitch / Photos: Tom Keller

In the end, even a 1.2mm pixel pitch is not sufficient. I suspect that's why the issue of pixel pitch will persist for a long time.

A little side note: Even if you keep the background completely out of focus, moiré effects can occur in a reflection. Since studios often use LED walls with a higher pixel pitch for cost reasons, I experienced a moiré effect in the reflection on the car window due to the ceiling panels. This can be resolved by either moving the LED wall further away from the object or covering the LED wall on the ceiling with a diffuser (e.g., WD film or grid) to make the individual pixels no longer visible.

How can reality be connected to the virtual world?

Whether you use an expensive LED wall or a more affordable modern rear projection, the technology allows for incredibly fast work as soon as the actors are separated from the background, for example by placing them in a house, a car, or on a boat, so that no transition to the virtual space is visible in the frame. Small details such as reflections on the car's paint or spill from the screen onto the actors add further realism, and for most of these shots, post-production is actually not needed.

Left: film still with reflection of the screen. Right: Setup in the studio with a 4m Flex Glass Screen.

Furthermore, the technology offers more creative freedom than, for example, shooting on a low-loader on the road. You are also completely independent of weather conditions and can spend an entire day shooting a twilight scene.

However, aligning sets and props with the virtual world on the LED wall is a time-consuming challenge. Thanks to the improved color space, this is somewhat easier with rear projection, but sufficient preparation time needs to be allocated. The problem becomes more significant when using the virtual set as a light source.

Since the LED wall illuminates the studio set, you cannot simply change the color of the virtual floor to match the real floor in the studio. The virtual floor is just as much a light source as the rest of the wall, so changing the color on the wall also changes the color on the set. So, you have to continuously work on matching these transitions. A gradient mask in the Unreal Engine can usually help but should be considered a last resort. In the worst case, the shot may need to be corrected in post-production. It's not surprising that this is often done. Comparing the floor in the behind-the-scenes footage of "The Mandalorian" and the LED wall motorcycle shots below, you can see that neither the snow nor the forest floor blend with the background. Such adjustments need to be made in post-production.

What is the virtual green screen?

The idea is to have only the camera's frame, or even just a small area behind the actors, in green. This area is marked with tracking markers, and thanks to the uniform lighting of the wall, the rest of the LED wall typically displays the environment from the Unreal Engine at a lower resolution. I actually think this is a fantastic approach. Anyone who has worked with green screens knows how important it is to keep the green area as small as possible in order to minimize green spill.

Left: BTS from "The Mandalorian". Right: Still taken from the BTS video "The Mandalorian" | Movies Insider.

With the camera's tracking system, you already have all the necessary data and can reintroduce the background with just a click. Or at least that's what you would think, but here comes the next problem and the reason why tracking markers need to be projected.

Why doesn't live tracking really work?

The problem exists with every tracking system, even the most expensive ones, at least for now. The tracking system, the Unreal Engine, and the other components introduce a delay that can range from a few frames to half a second. So when I start a pan with the tracked camera, it can take up to half a second for the background on the LED wall to move accordingly. Parallax doesn't work, and at best, you can only cut into a movement. This is a serious problem that can be resolved in post-production, as can the moiré effect. But it clearly highlights one thing:
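To put that delay in perspective, latency converts directly into frames of background lag. A minimal sketch using the half-second figure from the text and common production frame rates:

```python
# Convert end-to-end tracking latency into frames of background lag.
def latency_in_frames(latency_s: float, fps: float) -> float:
    return latency_s * fps

for fps in (24, 25, 50):
    print(f"{fps} fps: {latency_in_frames(0.5, fps):.1f} frames behind")
# At 24 fps, a half-second delay means the wall trails the pan by 12 full frames.
```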

Using an LED wall does not necessarily eliminate the need for post-production. And when you look at the advancements in tracking and keying in recent years, it naturally raises the question of whether an LED wall is preferable to a traditional greenscreen. At first glance, a greenscreen shot may seem cheaper, but to achieve results in post-production that are even remotely as good as with an LED wall, you have to dig deep into your pockets.

Do you get better, cleaner audio in the studio?

As a cinematographer, I often fall into the trap of not thinking enough about sound. Therefore, here's a very important point that is rarely discussed: recording usable production audio in the LED studio is a problem. The fans of the LED panels are clearly audible, and even worse are the acoustics created by the hard, reflective wall itself. In some cases, ADR (Automated Dialogue Replacement) will be unavoidable. The smaller the LED wall, the less severe the problem, but it's crucial to consult with the sound technician beforehand to assess the situation.


The new LED wall technology and virtual production as a whole are still in their infancy and will remain so for a while. However, those who, like me, have a desire to explore new paths and are not discouraged by these "problems," will discover a new world of incredible possibilities and perhaps even a new film language.

Left: Studio build in front of a small 4m Flex Glass Screen. Right: The Result in Camera.

The subpar LED studios that are only interested in quick money won't be able to sustain themselves in the coming years and will fade away. On the other hand, those who not only offer the best prices but also treat their clients honestly will endure. Here are some studios I can recommend:

United Kingdom:


MARS VOLUME (no personal experience, but heard good things)


LEDCAVE (no personal experience, but heard good things)


In an upcoming blog post, I will write about my role as cinematographer in the pre-production of the feature film "LILI (A.T)" by Thomas Imbach. It will be the first Swiss feature film shot entirely using ICVFX (In-Camera Visual Effects). I will discuss the tools and plugins we can already use in the Unreal Engine to develop a visual language, break down the film in detail, and light the scenes, as well as the countless tests, ups, and downs we experienced during production.

