An Unreal Production Experience
Small and Clever Productions have been shooting a BBC comedy series in Wales, UK, using virtual sets and a green screen studio. Here, Series Producer Phillip Moss gives an insight into this new way of ‘Mixed Reality’ working.
A few months ago, there was a bit of a buzz about a relatively new technology, where complex visual effects for high-end film and TV series were being achieved in-camera rather than through traditional green screen and post-production techniques. Series like The Mandalorian were at the forefront of this new movement. The breakthrough was to generate virtual sets in real time using a ‘game engine’ running on a powerful PC. The virtual sets are then displayed on huge, high-quality LED screens, set up with incredible precision in large studio spaces. Stand the actors in front of these screens and they can be in a desert, a fantasy cityscape and a spaceship all on the same day. The system cleverly moves the backgrounds to stay in perspective with any movement of the camera.
The game engine is Epic Games’ Unreal Engine - this complex and extensive bit of software has been used for many years to build the ‘maps’ for interactive games, including Epic’s best-known title, Fortnite. For all these games, Unreal Engine generates high-quality ‘worlds’ in real time, so it’s also ideal for generating photo-real locations for TV and film. We haven’t got the budget of a Mandalorian, but we’ve employed similar techniques, still using Unreal Engine, to help us make a multi-location comedy series - in the midst of lockdown - here in the UK.
The headline: It's not as easy as it looks. We've learnt so much in the last few months, and the road has been long and confusing, but there are some real advantages in using Unreal to help make short-form comedy.
Green is the colour
It didn't take long to realise that the LED screen route wasn't going to work for us. These screens are still very expensive, and I think quite restrictive in what you can shoot, especially if you want to see a floor in shot or place an actor behind a virtual prop. I'm sure there are ways of doing this with an LED set-up, but we didn't have the space or the budget to contemplate doing it this way.
After a bit of research I found people doing ‘indie’ Virtual Production with green screen and relatively inexpensive kit, and they were getting impressive results. So that’s the route I decided to explore, and now, after several months of planning and testing, we are using this method to shoot a lot of the content for a six-part comedy series.
Our set-up uses an HTC Vive tracker mounted on an Ursa Mini Pro camera. The Vive tracker is a consumer-level product, admittedly using very clever electronics, designed for VR gaming. Users usually wear a headset that lets them look around and travel through a 3D environment which tracks the headset’s movement. Luckily for us, the Vive system can also be used for so-called Mixed Reality Production, where a 3D set in Unreal is tracked to the movement of a ‘puck’. The puck would normally be worn on a player’s arm or leg to provide extra tracking info in a VR game. In our case it is mounted on the camera and sends positional data to the Unreal Engine, which moves the 3D set in sympathy. As the puck is on the camera, if we move the camera, the live pictures and the Unreal world move together. The two separate feeds are combined to form the final composite.
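For the technically curious, the data the puck provides is surprisingly easy to get at. Below is a minimal sketch using the pyopenvr Python bindings to read the tracker’s position - purely illustrative, as in our actual pipeline Unreal reads the tracker directly through its SteamVR plugin.

```python
# Minimal sketch: reading a Vive tracker ("puck") pose via pyopenvr.
# Illustrative only -- in our pipeline Unreal Engine reads the tracker
# itself through its SteamVR plugin; no custom code is involved.
import openvr

vr = openvr.init(openvr.VRApplication_Other)

# Find the first device SteamVR reports as a generic tracker (the puck).
puck_index = None
for i in range(openvr.k_unMaxTrackedDeviceCount):
    if vr.getTrackedDeviceClass(i) == openvr.TrackedDeviceClass_GenericTracker:
        puck_index = i
        break

if puck_index is not None:
    poses = vr.getDeviceToAbsoluteTrackingPose(
        openvr.TrackingUniverseStanding, 0, openvr.k_unMaxTrackedDeviceCount)
    pose = poses[puck_index]
    if pose.bPoseIsValid:
        m = pose.mDeviceToAbsoluteTracking  # 3x4 matrix: rotation + position
        x, y, z = m[0][3], m[1][3], m[2][3]  # position in metres (y is up)
        print(f"puck position: x={x:.3f} y={y:.3f} z={z:.3f}")

openvr.shutdown()
```

The engine consumes a stream of exactly this kind of position and rotation data, many times a second, and drives the virtual camera with it.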
What are we using this technique for?
So, we’re making a sketch show, and there are many different scenes in each programme. We might need to be in a palace, up a mountain and on a boat, all in one sketch. The standard way of doing this - apart from shooting on location, which is totally impractical at the moment and too expensive for us anyway - is to use stock footage or still photos as backgrounds, with the actors filmed against green screen and the two feeds composited together. Traditionally, you have to frame your live-action camera to match the background shot you’ve selected. It’s hard to get this right, and you often end up compromising between the angle you want to shoot and the angle that works with the background shot you have available.
Doing it the Unreal way means we can frame our actors how we want to, and the background changes angle to match. This gives us so much more scope to try interesting camera angles and be confident that the backgrounds to our wide shots and close-ups are exactly what a camera would see if we were on a real set. We can also change the lighting in the virtual set, move walls, take out bits of furniture or move them slightly so that they are framed nicely in the final composite. The 'sets' are available to buy from the Unreal Marketplace and there is a huge variety out there, from apartments and restaurants to whole parks and street scenes. We’ve now started making our own sets too, which means we can do sketches in pretty much any location.
So how does it work technically?
For each shot, you need a real foreground and a virtual background. The real foreground comes from our Ursa Mini Pro camera, pointing at our actors in our green screen studio. The background comes from a ‘virtual’ camera on the set inside Unreal. The two feeds are ‘comped’ together by a separate piece of software called OBS (Open Broadcaster Software). Anything green in the foreground is removed by OBS, leaving everything else from the real camera sitting on top of the background.
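To give a flavour of what the keyer is doing underneath, here’s a toy chroma-key composite in Python with OpenCV - a drastic simplification of OBS’s real keyer, which adds spill suppression and much better edge handling. The file names are hypothetical.

```python
# Toy chroma-key composite: a crude version of what OBS's keyer does.
# Real keyers add spill suppression, edge refinement and colour matching.
import cv2
import numpy as np

foreground = cv2.imread("camera_frame.png")   # hypothetical green-screen frame
background = cv2.imread("unreal_frame.png")   # hypothetical matching Unreal frame

# Key on hue: pixels whose hue sits in the 'green' band become transparent.
hsv = cv2.cvtColor(foreground, cv2.COLOR_BGR2HSV)
lower, upper = np.array([40, 60, 60]), np.array([80, 255, 255])
green_mask = cv2.inRange(hsv, lower, upper)                    # 255 where green
alpha = cv2.GaussianBlur(255 - green_mask, (5, 5), 0) / 255.0  # soften edges

# Composite: foreground over background, weighted by the alpha matte.
alpha = alpha[:, :, np.newaxis]
comp = (foreground * alpha + background * (1.0 - alpha)).astype(np.uint8)
cv2.imwrite("composite.png", comp)
```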
What's it like as a practical production tool?
It's hard to explain just how good it is to shoot in this way, but it's really liberating. A lot of the time it's not really that impressive on screen, but to me, that’s a good thing. The fact that it's not screaming out that it's being used is a sign of a technology becoming a proper workhorse for filmmaking and storytelling, something that interests me much more than perhaps some of the more showy test shots of cars in desert landscapes, or videos pulling out to reveal it's all done with LED panels.
Things that took a long time to understand
The most important thing we’ve learnt to get right when shooting with Unreal backgrounds is to calibrate where the floor is in both the real and unreal worlds. Get that wrong and, even if one angle looks right, nothing tracks when you reposition the camera. We once had a shot looking up a corridor. Everything looked fine until the actors walked away from camera, up the virtual corridor. Then it looked like one of those false-perspective rooms in a museum - our actors had become giants in just a few steps. That was all down to us thinking we knew where the floor was in relation to the camera, when in fact we were almost a metre out.
After lots of attempts to find a reliable way to match the height of the virtual camera to that of our real camera, we’ve found that one of the best ways to do it is with a physical 1 metre by 1 metre wooden frame. We place the frame on our actual green floor, and place a virtual 1m cube on the floor of the Unreal set. We superimpose the camera output over the Unreal output and reposition the scene until the real 1m frame exactly frames the 1m cube. It takes a while, as there are lots of variables and they all affect each other, but once it’s matched properly we know that if we move the camera 235cm forward, the virtual set will move 235cm forward too. Even a few centimetres of error in the camera height between worlds will throw everything out of alignment the moment you move the camera.
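To see why the height matters so much, here’s a back-of-envelope sketch using a simple pinhole-camera model, with hypothetical numbers: if the virtual camera sits too high, the sight line through an actor’s feet meets the virtual floor further away than the real floor, so the actor reads as proportionally taller within the set - exactly our corridor giants.

```python
# Back-of-envelope sketch (hypothetical numbers): why a floor-height
# mismatch turns actors into giants.  With a pinhole camera at height h
# above the floor, the sight line through an actor's feet at distance d
# meets the *virtual* floor at d * (h_virtual / h_real), so the actor
# reads as someone of height H * (h_virtual / h_real) within the set.

def apparent_height(actor_height_m, real_cam_height_m, virtual_cam_height_m):
    """Apparent actor height relative to the virtual set, in metres."""
    return actor_height_m * (virtual_cam_height_m / real_cam_height_m)

actor = 1.8          # metres
real_cam = 1.5       # real camera height above the green floor
virtual_cam = 2.5    # virtual camera height -- almost a metre too high

print(f"{apparent_height(actor, real_cam, virtual_cam):.2f} m")
# -> 3.00 m: our corridor giants.

# Even a small error is visible on screen:
print(f"{apparent_height(actor, 1.50, 1.55):.2f} m")  # 5cm out -> 1.86 m
```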
What kit are we using?
In addition to the Ursa Mini Pro camera, the Vive system and Unreal Engine (which runs on a reasonably powerful PC), the other key bit of kit is a Blackmagic DeckLink card, which handles four feeds of HD video in and out of the computer. We do the chromakeying (removing the green) using the OBS software mentioned earlier. This amazing vision switcher and live keyer, which runs on the same PC as Unreal, is the most powerful piece of free TV software I’ve ever used. OBS lets us nest complex scenes as sources inside other scenes, layer upon layer. That gives us the ability, for example, to shrink our actors and place them in huge wide shots complete with reflections, and even have videos playing within other layered composites.
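As a toy illustration of that layering idea - shrinking a keyed actor, placing them deep in a wide shot and faking a floor reflection - here’s a sketch in the same Python/OpenCV vein. In reality OBS does all of this in real time with nested scenes; the file names and numbers here are made up.

```python
# Toy version of the nested layering we do in OBS: shrink a keyed actor,
# place them in a wide shot, and fake a floor reflection.  File names and
# placements are hypothetical; OBS does this live with nested scenes.
import cv2
import numpy as np

wide = cv2.imread("unreal_wide_shot.png")                    # big virtual set
actor = cv2.imread("actor_rgba.png", cv2.IMREAD_UNCHANGED)   # keyed, with alpha

def paste(dst, src_rgba, x, y, opacity=1.0):
    """Alpha-blend an RGBA layer onto dst, top-left at (x, y).
    Assumes the layer fits entirely inside the destination frame."""
    h, w = src_rgba.shape[:2]
    alpha = (src_rgba[:, :, 3:4] / 255.0) * opacity
    region = dst[y:y+h, x:x+w]
    dst[y:y+h, x:x+w] = (src_rgba[:, :, :3] * alpha +
                         region * (1 - alpha)).astype(np.uint8)

# Layer 1: the actor, scaled to 25% to sit deep in the wide shot.
small = cv2.resize(actor, None, fx=0.25, fy=0.25)
paste(wide, small, x=900, y=400)

# Layer 2: a faint vertical flip below their feet as a floor 'reflection'.
paste(wide, cv2.flip(small, 0), x=900, y=400 + small.shape[0], opacity=0.25)

cv2.imwrite("layered_composite.png", wide)
```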
Two AJA Ki Pro Minis are used to record the HD Unreal and ‘comped’ feeds. This gives us an incredibly versatile set of tools to enable us to see - and review on set - how things will turn out when we eventually re-comp the footage in After Effects for final post-production.
One sketch involved a classic police line-up identity parade, which we had to shoot in several passes so that the actors didn’t have to stand close to each other (for Covid reasons). Using the two AJA recorders in and out of OBS, we could keep layering in the studio and check that people weren’t overlapping each other in the final shot.
Conclusion
After using it for a few weeks now, we can say we have a system which gives us a lot of flexibility and offers great creative opportunities. We’re using this technique in anger on a proper TV series, with a tracker and two base stations which together cost around £500. The big investments were the PC, the DeckLink card and, of course, the large green studio. It wasn’t easy getting to this point, but it has been worth the effort.
In terms of learning how to use Unreal and the Vive tracker, there are some invaluable tutorials out there - Matt Workman, Richard Frantzen, Aiden Wilson, Greg Corden and Pixel Prof take a bow - but even with their help, it's been a long process to work through the pitfalls and understand how Unreal works, as it has its quirks and the virtual production side is all a relatively recent addition. Thanks too to Chris Marshall and Chris McGaughey here in Wales who helped us out with some of the early testing, and to colleagues Jack Browse, Jack Beresford and Laura Kirkup who have spent the winter with me getting all this working.
For someone who has only occasionally done VFX, using this technique to create a large percentage of a series has been a bit of a ride. We feel a bit like pioneers and it's been great fun as a result. The series, called Age of Outrage, will be out in the UK later in the year on BBC One Wales and BBC iPlayer.