Virtual Production Lab: Test Shoot
Project Type: Motion Capture | Role: Motion Capture Artist / Director | Year: 2023
About
As part of the Virtual Production Lab organised by the Filmuniversität Babelsberg Konrad Wolf, we tested the possibilities of bringing Metahumans to life on an LED wall.
This was especially interesting as part of the development process of the upcoming virtual production sci-fi miniseries INTERCONNECTION. You can find out more about the project here: Overview
We created a very short scene to test real-time motion capture and different camera perspectives. To portray the two versions of the character, we invited two actresses: Caroline Bröker as the "real" Kira and Katz Köbbert for the performance capture of the "digital" Kira.
My Tasks
To explore how to bring the digital twin of the protagonist Kira to life, I came up with the idea of using real-time motion capture: one actress plays the "digital" Kira, who is projected onto an LED wall, while the actress playing the "real" Kira can interact with her digital twin. This setup promised a very authentic performance. For that, I worked with Codrin Podoleanu on the Xsens suit setup. Its advantage is that it is portable and not bound to a camera setup for tracking, as it uses inertial sensors worn on the performer's body. We connected the data stream to Unreal Engine and combined it with facial capture recorded with an iPhone mounted on a self-made head rig. This data was then applied to a Metahuman. After the setup was built and tested, I focused on directing on set.
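At its core, this setup merges two independent data streams per frame: body joint rotations from the suit and face blendshape weights from the phone. The sketch below is not the actual Unreal/Xsens pipeline (which is wired up through the engine's editor tooling) but a minimal, self-contained illustration of the per-frame merging problem; all names, data shapes, and the skew threshold are invented for the example.

```python
# Illustrative sketch: pairing a body-capture stream with a face-capture
# stream by timestamp, as a real-time mocap pipeline conceptually must.
from dataclasses import dataclass

@dataclass
class Frame:
    """One merged animation frame for the digital character."""
    timestamp: float
    body_rotations: dict  # joint name -> Euler angles, e.g. from the suit stream
    face_weights: dict    # blendshape name -> 0..1 weight, e.g. from the phone

def merge_streams(body_frames, face_frames, max_skew=1 / 30):
    """Pair each body sample with the face sample closest in time.
    Face data drifting more than max_skew seconds is dropped, holding
    a neutral face rather than applying stale data."""
    merged = []
    for t, rotations in body_frames:
        ft, weights = min(face_frames, key=lambda f: abs(f[0] - t))
        if abs(ft - t) > max_skew:
            weights = {}  # too stale: fall back to a neutral face
        merged.append(Frame(t, rotations, weights))
    return merged

# Toy data: two body samples; the face stream stops after one sample.
body = [(0.000, {"head": (0, 10, 0)}), (0.100, {"head": (0, 12, 0)})]
face = [(0.002, {"jawOpen": 0.4}), (0.500, {"jawOpen": 0.9})]
frames = merge_streams(body, face)
print(frames[0].face_weights)  # → {'jawOpen': 0.4}  (in sync, kept)
print(frames[1].face_weights)  # → {}  (closest sample is stale, dropped)
```

The same timestamp-pairing logic is also where the latency issues described under Learnings originate: if either stream lags, the character either freezes or animates with stale data.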
Learnings
In theory – and even in our test environment – everything worked well. Of course, a lot of fine-tuning would have been necessary, and a dedicated hand-tracking system would have been required for better performance capture. However, conducting the shoot in the LED Cave also revealed that while the idea was ahead of its time, it had some fundamental flaws.
From a technical perspective, it became clear that the LED panels weren’t capable of displaying Metahumans at the required level of detail. Too many visual elements were lost, which meant the Metahuman didn’t meet our quality expectations. Additionally, the setup relied heavily on a stable data connection – not only between the Xsens suit and Unreal but also between Unreal and the LED wall’s render nodes. Any connection drop would put the entire shoot on hold, causing significant delays. Before attempting a full production, this system would need thorough testing and optimization.
From a director’s perspective, the interaction between characters wasn’t as seamless or natural as expected. There was a slight delay in movement transmission from the motion capture actress to the digital character, causing the actress playing the "real" Kira to wait after hearing her co-star’s lines. Additionally, the motion capture actress had to perform remotely, relying only on screens for interaction, which made it difficult to connect with the scene and the other actress.
All in all, I still believe this is an exciting and innovative way to combine LED walls and motion capture, but several challenges need to be addressed before applying it in a full production. That’s why, for the final shoot of INTERCONNECTION, we decided to use traditional VFX to digitize the character in post-production.