
In-Camera Visual Effects

In this blog post, I want to discuss In-Camera Visual Effects (ICVFX) and what to watch out for when using Virtual Production (VP). This article builds on my previous blog post, "The Challenges of Virtual Production". I recommend reading that post first to better understand what is written here.

ICVFX enables visual effects scenes to be shot directly in camera, which is actually the core idea of VP. In the following examples, I will show scenes created with minimal set construction, a small budget, and an even smaller team. I hope to demonstrate that ICVFX and VP can be used in any production, large or small.



VP combines computer-generated virtual worlds with physically constructed sets on stage. Even though most people think of LED volumes when they hear "Virtual Production," LEDs weren't the technological innovation that initiated the VP era. LED panels were available much earlier, and most LED volumes still do not use the latest LED technology due to the high cost. LED volumes are perfect for larger set builds but have their disadvantages, especially when used in small studios where the camera has to be close to the wall.


The real reason Virtual Production became interesting for the film industry was the progress in real-time rendering technology through game engines like Unity, CryEngine, and Unreal Engine (UE). Improvements in camera tracking systems and reduced latency in data transmission also played a significant role. Developments in these areas have paved the way for a new era of filmmaking, and AI will continue to drive this progress.

UE is a powerful platform offering a wide range of creative possibilities. Thanks to it, set scouting can be done conveniently from home, and locations can be digitally designed and modified until they are perfect. In the feature film "THE EXPOSURE", director Thomas Imbach and I completed the entire blocking over two months in front of our computers — he from his studio in Zurich and I from Berlin. This allowed us to be very flexible with our schedules. The entire blocking process could be keyframe-animated and rendered out with simple characters for every department to see. For the VFX artists, it's even better; they can review your digital camera shots in UE and know exactly what will be in the frame, where they have to spend more time on details, and where it's not necessary because it will be out of focus. It makes a lot of sense to start blocking your scenes very early on.

For independent projects with limited budgets, the large free libraries, including the photorealistic Quixel Megascans, are particularly useful. With just €200, you can get complete asset packages with mountains, trees, shrubs, and much more. We used one such asset package for the burning forest scenes in "EMBERS". It was purchased and then combined and redesigned with additional elements by VFX artists Marlon Candeloro and Gianni Piechota.

Unreal Engine Quixel Bridge



When I communicate with my gaffer, Dominic Heim, I go into great detail and discuss lighting units, butterfly cloths, and so on. For those of you who also have a clear idea of how to technically achieve a specific lighting mood, collaboration with VFX artists can be challenging, especially in low-budget productions, as they often use a different approach and vocabulary. Communication problems are less likely when defining the sun's position or weather moods in UE. But VFX artists are generally not accustomed to working on film sets, so they are unfamiliar with the usual terms and practices for lighting setups. This is especially noticeable in complex interior or night scenes.

In principle, you can also set lighting in UE the same way you would on a film set in the real world. The advantage is that lighting units and flags (for which you can use planes in UE) are invisible, which makes using them a lot easier. You don't have to worry about stands or other grip equipment in the frame. The classic method of setting light also helps reduce the typical game-like look of the virtual world.

Once you get used to navigating UE, it doesn't take long to start placing lights yourself and adjusting their parameters. This doesn't replace the need for a VFX artist, but it's much more efficient to navigate UE yourself than to just sit next to a VFX artist and try to discuss lamp positions and orientations within XYZ axes.


Virtual Production, fire scene at Filmstudio Basel.



For the film "EMBERS", in addition to real fire in the studio, we also used the SumoSky from Sumolight. As with LED volumes, each individual LED can be controlled through pixel mapping. In addition to the usual RGB diodes, the SumoSky has two different white diodes, which leads to better color rendition. The system consists of several LED bars and is especially popular as a dynamic sky-lighting solution in major Hollywood productions in the US and England. However, when working with real flames or candles, as we did in the film "THE EXPOSURE", the SumoSky must be gelled because of its limited color temperature range of 3450 K to 15500 K.
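Gel choices like this are easier to reason about in mireds, where shifts add up linearly. Here is a minimal sketch; the candlelight value of roughly 1900 K and the full-CTO figure are my own illustrative assumptions, not values from the production:

```python
def mired(kelvin):
    """Convert a color temperature in kelvin to mireds (micro reciprocal degrees)."""
    return 1_000_000 / kelvin

def mired_shift(source_k, target_k):
    """Positive result = warming shift needed (toward orange/CTO gels)."""
    return mired(target_k) - mired(source_k)

# SumoSky's warm limit (3450 K) vs. candlelight (assumed ~1900 K):
shift = mired_shift(3450, 1900)
print(round(shift))  # 236 mireds of warming needed

# A full CTO gel shifts roughly +159 mireds (nominally 6500 K -> 3200 K),
# so reaching candlelight warmth takes full CTO plus an additional partial CTO.
```

Because mired shifts are additive, you can stack gels and simply sum their rated shifts, which is far more predictable than thinking in kelvin directly.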




Virtual Production is often used with elaborate set builds, but I find it particularly interesting when combined with minimal set construction. In "THE EXPOSURE", which we shot entirely at the Filmstudio Basel, we could often skip set construction and focus on props like chairs, tables, or lamps to connect the real foreground with the digital background. Glass elements like the large door with old glass tiles, and the glass table and water glasses used in "EMBERS", are props that merge the real and virtual world. It's hard, nearly impossible, to get the same result with a green screen. Props were often also scanned for use in the digital world.

Virtual Production Filmstudio Basel, Lili Thomas Imbach


If you don't plan to manipulate the footage in post-production, which should be the goal with ICVFX, experimenting with elements like lens flares, particles, dust, and haze in the air can help merge the two worlds. Even the heat shimmer caused by real fire in the studio is a great effect that adds credibility and atmosphere. For green-screen shoots, these elements should be used with caution: much is possible in post, but it quickly becomes expensive.

Virtual Production Filmstudio Basel, fire scene with DP Tom Keller





In February 2024, I co-taught a Virtual Production course for Focal with cinematographer Nikolaus Summerer and VFX supervisor Juri Stanossek. Nik had shot the Netflix series "1899", for which the LED volume Dark Bay was built in Babelsberg. Because they were working with Unreal Engine 4 (UE4) at the time, he chose ARRI Alfa anamorphics, large-format lenses he developed further with ARRI specifically to avoid the engine's typical game-like look. The characteristic anamorphic bokeh, vignetting, and shallow depth of field helped achieve that.

On a side note, I can highly recommend the "1899" series on Netflix to anyone interested in VP, even though it was unfortunately canceled after the first season. There's currently no better and more impressive VP production in Germany. Nik and his team did a great job.

Since the introduction of Unreal Engine 5 (UE5) in 2022, a lot has improved in terms of more realistic assets and performance optimization, thanks to Nanite. Nonetheless, I still recommend vintage or anamorphic lenses to give the image a less clean look. Another interesting option is 16mm or 8mm Kodak film. The heavy grain would be problematic with green screen but is ideal for ICVFX.

Interestingly, Juri mentioned that in post-production, they now appreciate lens flares and imperfections on green-screen shots, because these characteristics provide a useful reference for recreating the look later. In most cases, however, these features must first be removed to manipulate the image digitally and then re-added as an effect. Juri also shared examples where entire shots were replaced with computer-generated imagery (CGI). Although VFX supervisors are on set, each VFX studio often has its own approach to working with the provided footage, and DPs are usually not involved in these decisions. With VP, it's different. Communication must happen early, giving DPs more control over the final image, which is another reason I appreciate ICVFX and VP so much.




The limitations of UE, as well as those of LED walls and projectors used for rear projection, make slow-motion shots at higher frame rates difficult. Sometimes you can use tricks or green screen techniques to achieve slow-motion effects in post-production. If flicker issues occur, DaVinci Resolve's De-Flicker can be helpful. However, for the time being, you can barely shoot beyond 60 fps in a VP studio.
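Whether a given frame rate flickers depends largely on how the exposure time lines up with the wall's refresh cycle: the exposure should span a whole number of refresh cycles. A rough sketch of that check follows; the 3840 Hz refresh rate and the tolerance are illustrative figures, not a specification of any particular wall:

```python
def exposure_time(fps, shutter_angle=180.0):
    """Exposure time in seconds for a given frame rate and shutter angle."""
    return (shutter_angle / 360.0) / fps

def is_flicker_safe(fps, refresh_hz, shutter_angle=180.0, tolerance=0.01):
    """Heuristic: the exposure should cover a near-integer number of
    refresh cycles, and at least one full cycle."""
    cycles = exposure_time(fps, shutter_angle) * refresh_hz
    return abs(cycles - round(cycles)) < tolerance and round(cycles) >= 1

# Example wall refreshing at 3840 Hz, 180-degree shutter:
print(is_flicker_safe(24, 3840))  # True: exposure of 1/48 s spans 80 full cycles
print(is_flicker_safe(90, 3840))  # False: 21.33 cycles, partial cycle -> banding risk
```

In practice you would also adjust the shutter angle rather than only the frame rate, since a slightly different angle can land the exposure back on a whole number of cycles.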

Virtual Production Filmstudio Basel, cinematographer Tom Keller & actor Christoph Keller in front of the screen



Real-time rendering is a significant advantage of VP because it allows parallax effects. However, there are situations where these effects are not strictly required, such as scenes on the open ocean or in dry wastelands, where parallax plays hardly any role. In these cases, it can be useful to render the scene in optimal quality with UE's Path Tracer and play it back as a video on the wall. The same applies to shots from fast-moving cars, trains, or spaceships. And if you're shooting in slow motion, you can render the image at a higher frame rate than UE manages in real time, creating interesting visual effects.
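The frame-rate arithmetic behind such pre-rendered slow-motion plates is simple: the plate and the camera must run at the same high rate, and the slowdown is that rate divided by the project frame rate. A small sketch, with 24 fps as an assumed project default:

```python
def required_capture_fps(slowdown, project_fps=24):
    """Frame rate the camera (and the pre-rendered plate on the wall)
    must run at to achieve the desired slow-motion factor when the
    footage is played back at project_fps."""
    return slowdown * project_fps

def slowdown_factor(capture_fps, project_fps=24):
    """How many times slower motion appears at playback."""
    return capture_fps / project_fps

# A 5x slow-motion shot in a 24 fps project needs a 120 fps plate,
# beyond what UE can reliably render in real time:
print(required_capture_fps(5))   # 120
print(slowdown_factor(120))      # 5.0
```

The key constraint is that the wall playback and camera stay frame-for-frame in sync at the higher rate; the slow motion only appears once the footage is conformed to the project rate.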


Additionally, you can use camera tracking for projection-mapped videos. This technique requires much less computing power and can often produce a more photorealistic background.
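At its core, camera-tracked projection mapping re-projects the plate through a virtual camera that follows the physical one, so the background shifts correctly as the camera moves. Below is a deliberately minimal pinhole-camera sketch; all names and numbers are illustrative, and real systems of course also handle camera rotation and lens distortion:

```python
def project_point(point, cam_pos, focal_px, cx, cy):
    """Project a 3D point (x, y, z in metres, camera looking down +Z with
    no rotation) through a simple pinhole camera to pixel coordinates."""
    x = point[0] - cam_pos[0]
    y = point[1] - cam_pos[1]
    z = point[2] - cam_pos[2]
    if z <= 0:
        raise ValueError("point is behind the camera")
    u = cx + focal_px * x / z
    v = cy + focal_px * y / z
    return u, v

# A point 1 m to the right and 10 m ahead, focal length of 1000 px,
# image centre of a 1920x1080 frame:
print(project_point((1.0, 0.0, 10.0), (0, 0, 0), 1000, 960, 540))
# (1060.0, 540.0)

# Move the camera 0.5 m right: the projected point slides left,
# which is exactly the parallax cue the tracking provides.
print(project_point((1.0, 0.0, 10.0), (0.5, 0, 0), 1000, 960, 540))
# (1010.0, 540.0)
```

Because the plate is projected onto simple proxy geometry rather than rendered as a full 3D scene, this needs a fraction of the computing power of real-time rendering, which is why the backgrounds can afford to be more photorealistic.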



Virtual Production can also be combined with projects shot on real locations. Scenes can be shot outdoors under specific light and weather conditions that normally don't last long. Additional plates are then shot for later use in the VP studio, buying more time for, say, long dialogue scenes during a sunset. However, this requires careful planning and a lot of experience to ensure that the different shots blend seamlessly.

Virtual Production with Plates

For "EMBERS", Nico Gühlstorf and I waited for the otherwise rare snowfall in the region to shoot the required shots. I served as the double on location. In the studio, we then shot with the actual actor in front of our plates. The cost savings from this approach are enormous.


For the series "CAPELLI CODE", we shot elaborate scenes with a large team at hard-to-reach locations. We focused on all the wider angles and completed the scenes in the studio, where the director, Alex Martin, had more time to work with the actors.

Film crew at a hard-to-reach location

Another example is the film "LA CACHE" by Lionel Baier, where I worked as a technical advisor. The film is set in 1968 in Paris, and because it was too expensive to block off a street and recreate the '60s for an extended period, they shot plates in Paris and used them as the background for a parked car in the studio.


Nonetheless, I think it's important not to see VP solely as a cost-saving measure. With poor planning, it's not. It offers filmmakers, especially writers, new opportunities that would otherwise be unthinkable with relatively small budgets. I also mentioned this in an interview for the Swiss National Television “10vor10” report.


As a closing remark, I want to leave you with this: Every technique has its pros and cons, and you shouldn't stick to one that doesn't make sense for the shot. For ICVFX, the LED volume has its place, as does rear projection or an 83-inch OLED TV. What's important isn't which technique you use but how you use it.

Thanks to everyone involved:

Gabriel Brosteanu, Joachim Budweiser, Marlon Candeloro, Tiziana Cuviello, Claudio Demel, Jan Gubser, Nico Gühlstorf, May Hälg, Dominic Heim, Christoph Keller, Kaija Ledergerber, Alex Martin, Pascal Pendl, Gianni Piechota, Ela Schaich, Roland Siegenthaler, Senso Stampa, Danny Steindorfer, Marcel Stucki, Tobias Sutter, Nadine Trachsel, Nea Trachsel, Klemens Trenkle, Christina Welter

Thanks for the support:

