St David's Day commission

We’ve just delivered an archive special for the BBC, marking St David’s Day and celebrating some of the greatest singers ever to come out of Wales. Stars of Wales is a one-hour treat of a show, with some fabulous performances from the BBC archives, all involving Welsh singers and the BBC National Orchestra of Wales. From commission to delivery took just over two weeks, showing how it’s great to be Small and Clever - we can just get on with making the show, delivering on time and with high production values.

Some of the footage used dated back to 1999, so we used AI video enhancement software to upscale the original Standard Definition footage to HD - the results are really impressive. For the poorer-quality SD material, we can make it look sharper and more acceptable for today’s larger TVs; for better-quality SD originals, we can make them almost indistinguishable from HD, even on larger screens.

Rebecca Evans on stage at Proms in the Park in 2008, long before HD was the norm for content creation. In the original SD footage, some of the details are just too fine for the limited number of pixels in the image.

The upscaled HD version - the AI software is doing much more than simple sharpening and creates a very convincing new level of detail. Image Copyright BBC.

We did a similar job with the audio, again using the latest AI techniques to enhance the original sound. Nicholas Davies, owner and dubbing mixer at Meteor Sound, did an amazing job to bring out the vocals from the original recordings. All the performances were stereo mixes of the soloist and the orchestra. These are often live mixes from OBs, so the vocals can be a little lower in volume than we’d like. Nick was able to completely isolate the vocals from the orchestra, meaning he could mix more of the vocal back in. On one track featuring Dame Shirley Bassey, the resulting mix is like standing next to Dame Shirley herself; it’s a massive difference compared to the original recording.

These new techniques promise a game-changing shift in the use of archive and we look forward to making more archive shows in this way.

Age of Outrage series ready for transmission

After almost a year of development and production, we’re really pleased to say that Age of Outrage launches on BBC One Wales and iPlayer on Friday 12th November.

 
 

Following on from a successful pilot, the new six-part comedy series features a very talented ensemble cast, performing sketches from a range of equally talented comedy writers. A lot of the content was shot using Virtual Production techniques, with a fair bit of layering to get actors appearing closer to each other than lockdown restrictions allowed at the time we filmed.

 

You can see a breakdown of some of the green screen shots in this video.

 

An Unreal Production Experience

Small and Clever Productions have been shooting a BBC comedy series in Wales, UK, using virtual sets and a green screen studio. Here, Series Producer Phillip Moss gives an insight into this new way of ‘Mixed Reality’ working.

 
Actors Zoe Davies and Zak Ghazi-Torbati on the real set and in the virtual set.

 

A few months ago, there was a bit of a buzz about a relatively new technology, where complex visual effects for high-end film and TV series were being achieved in-camera rather than through traditional green screen and post-production techniques. Series like The Mandalorian were at the forefront of this new movement. The breakthrough was to generate virtual sets in real time using a ‘game engine’ running on a powerful PC. The virtual sets are then displayed on huge, high-quality LED screens, all set up incredibly accurately in large studio spaces. Stand the actors in front of these screens and they can be in a desert, a fantasy cityscape and a spaceship all on the same day. The system cleverly moves the backgrounds to stay in perspective with any movement of the camera.

The game engine is Epic Games’ Unreal Engine - this complex and extensive bit of software has been used for many years to make the ‘maps’ for interactive games, including Epic’s best-known game, Fortnite. For all these games, Unreal Engine generates high-quality ‘worlds’ in real time, so it’s also ideal for generating photo-real locations for TV and film. We haven't got the budget of a Mandalorian, but we’ve employed similar techniques, still using Unreal Engine, to help us make a multi-location comedy series - in the midst of lockdown - here in the UK.

The headline: It's not as easy as it looks. We've learnt so much in the last few months, and the road has been long and confusing, but there are some real advantages in using Unreal to help make short-form comedy.

Green is the colour.
It didn't take long to realise that the LED screen route wasn't going to work for us. These screens are still very expensive, and I think quite restrictive in what you can shoot, especially if you want to see a floor in shot or place an actor behind a virtual prop. I'm sure there are ways of doing this with an LED set-up, but we didn't have the space or the budget to contemplate doing it this way.

After a bit of research I found people doing 'indie' Virtual Production with green screen and relatively inexpensive kit, and they were getting impressive results. So that’s the route I decided to explore, and now, after several months of planning and testing, we are using this method to shoot a lot of the content for a six-part comedy series.

Our set-up uses an HTC Vive tracker mounted on an Ursa Mini Pro camera. The Vive tracker is a consumer-level product, admittedly using very clever electronics, designed for VR gaming. Users usually wear a headset which allows them to look around and travel through a 3D environment that tracks the movement of the headset. Luckily for us, the Vive system can also be used for so-called Mixed Reality Production, where a 3D set in Unreal is tracked to the movement of a ‘puck’. The puck would normally be worn on a player’s arm or leg to provide extra tracking info in a VR game. In our case the puck is mounted on the camera and sends data to the Unreal Engine, which moves the 3D set in sympathy with it. As the puck is on the camera, if we move the camera, then the live pictures and the Unreal world move together. The two separate feeds are combined to form the final composite.
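To make that a little more concrete, here is a minimal conceptual sketch in Python of what the tracking data has to do. This is not Unreal or SteamVR API code - the function names, the puck-to-lens offset and the calibration transform are all illustrative assumptions - but it shows the idea: the live puck pose is combined with a fixed offset to the lens and a one-off studio-to-set calibration to place the virtual camera.

```python
# Conceptual sketch only - not Unreal Engine or SteamVR API code.
# All names and numbers here are illustrative assumptions.
import numpy as np

def pose_to_matrix(position, rotation):
    """Build a 4x4 transform from a 3-vector position and a 3x3 rotation."""
    m = np.eye(4)
    m[:3, :3] = rotation
    m[:3, 3] = position
    return m

# Fixed offset from the puck to the camera's optical centre, measured once
# when the puck is rigged on the camera (these values are made up).
PUCK_TO_LENS = pose_to_matrix(np.array([0.0, -0.05, 0.12]), np.eye(3))

def virtual_camera_transform(puck_position, puck_rotation, studio_to_set):
    """Combine the live puck pose with the puck-to-lens offset and the
    one-off studio-to-virtual-set calibration, giving the transform the
    virtual camera should take so it sees what the real lens sees."""
    puck_pose = pose_to_matrix(puck_position, puck_rotation)
    return studio_to_set @ puck_pose @ PUCK_TO_LENS
```

In our actual rig this all happens inside Unreal itself; the sketch is just the geometry behind ‘the set moves in sympathy with the puck’.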

 
Actor David Constant gets knighted in our studio. The ‘puck’ is the object in the bottom right of the picture, attached to our camera which is out of shot below.

The guide ‘comp’ shot.

 

What are we using this technique for?
So, we're making a sketch show, and there are many different scenes in each programme. We might need to be in a palace, up a mountain and on a boat, all in one sketch. The standard way of doing this - apart from shooting on location, which is totally impractical at the moment and too expensive for us anyway - is to use stock footage or still photos as backgrounds, with the actors filmed against green screen and the two feeds composited together. Traditionally, you have to frame your live action camera to match the background shot you've selected. It's hard to get this right, and you are often forced to compromise between the angle you want to shoot and the angle that works with the background shot you have available.

Doing it the Unreal way means we can frame our actors how we want to, and the background changes angle to match. This gives us so much more scope to try interesting camera angles and be confident that the backgrounds to our wide shots and close-ups are exactly what a camera would see if we were on a real set. We can also change the lighting in the virtual set, move walls, take out bits of furniture or move them slightly so that they are framed nicely in the final composite. The 'sets' are available to buy from the Unreal Marketplace and there is a huge variety out there, from apartments and restaurants to whole parks and street scenes. We’ve now started making our own sets too, which means we can do sketches in pretty much any location.

So how does it work technically?
For each shot, you need a real foreground and a virtual background. The real foreground comes from our Ursa Mini Pro camera, pointing at our actors in our green screen studio. The background comes from a ‘virtual’ camera on the set inside Unreal. The two feeds are 'comped' together by a separate piece of software called OBS (Open Broadcaster Software). So anything green in the foreground is removed by OBS, leaving everything else from the real camera sitting on top of the background.
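As a rough illustration of what that keying step is doing, here is a minimal Python sketch. The simple colour-distance test and the threshold value are illustrative assumptions - OBS's own keyer is far more sophisticated, with spill suppression and soft edges - but the principle is the same: pixels close to the key colour become transparent, and the rest of the foreground sits over the Unreal background.

```python
# Rough sketch of a chroma key composite - purely illustrative, not how OBS
# implements its keyer internally.
import numpy as np

def chroma_key_composite(foreground, background, key=(0, 255, 0), threshold=100.0):
    """foreground and background are HxWx3 uint8 frames of the same size."""
    fg = foreground.astype(np.float32)
    bg = background.astype(np.float32)
    # Distance of each pixel from the key colour; pixels far from green are kept.
    distance = np.linalg.norm(fg - np.array(key, dtype=np.float32), axis=-1)
    alpha = (distance > threshold).astype(np.float32)[..., None]
    composite = alpha * fg + (1.0 - alpha) * bg
    return composite.astype(np.uint8)
```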

 
Unreal Engine on the left, OBS on the right. The pink rectangle with the blue icons above it is the Steam VR interface telling us it is getting a signal from the Vive puck. The main image on the left screen is our overview of the virtual set, while the image on the right-hand screen is the view from a virtual camera placed in the scene.

 

What's it like as a practical production tool?
It's hard to explain just how good it is to shoot in this way, but it's really liberating. A lot of the time it's not really that impressive on screen, but to me, that’s a good thing. The fact that it's not screaming out that it's being used is a sign of a technology becoming a proper workhorse for filmmaking and storytelling, something that interests me much more than perhaps some of the more showy test shots of cars in desert landscapes, or videos pulling out to reveal it's all done with LED panels.

Things that took a long time to understand
The most important thing we've learnt to get right when shooting with Unreal backgrounds is to calibrate where the floor is in both the real and unreal worlds. Get that wrong and, even if you get one angle looking right, nothing tracks when you reposition the camera. We had a situation where we had a shot looking up a corridor. Everything looked fine until the actors walked away from camera, up the virtual corridor. Then it looked like one of those false-perspective rooms in a museum - our actors had become giants in just a few steps. That was all due to us thinking we knew where the floor was in relation to the camera, when in fact we were almost a metre out.

 
David Constant and Zak Ghazi-Torbati in ‘Death Row’ - the prison corridor is being generated by Unreal Engine

 

After lots of attempts to find a reliable way to match the height of the virtual camera to that of our real camera, we've found that one of the best ways to do it is with a physical 1 metre by 1 metre wooden frame. We place the frame on our actual green floor, and also place a 1 m virtual cube on the floor of the Unreal set. We superimpose the camera output over the Unreal output and reposition the scene until the real 1 m frame exactly frames the 1 m cube. It takes a while, as there are lots of variables and they all affect each other, but when we get it matched properly we know that if we move the camera 235 cm forward, the virtual set will move 235 cm forward too. Even a few centimetres' error in the height of the camera between worlds will result in everything going out of alignment as soon as you move the camera.
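Here is a tiny Python sketch of why that calibration matters so much. The function and its numbers are purely illustrative (they are not taken from our rig), but they show the point: with the floors matched, real and virtual camera moves correspond one-to-one, while a height error between the two worlds shifts the virtual viewpoint away from the real lens, so the perspective no longer matches - which is exactly the giant-actors-in-a-corridor problem described above.

```python
# Illustrative sketch of the real-to-virtual camera mapping. The values are
# made up; the point is the effect of a floor-height error between worlds.

def virtual_camera_position(real_position_cm, floor_offset_cm=0.0):
    """Map a real camera position (forward, sideways, height) in cm to the
    virtual set. With a correct calibration the floor offset is zero."""
    forward, sideways, height = real_position_cm
    return (forward, sideways, height + floor_offset_cm)

# Correctly calibrated: a 235 cm track forward is a 235 cm move in Unreal,
# from the same height, so the background perspective stays believable.
print(virtual_camera_position((235.0, 0.0, 160.0)))                         # (235.0, 0.0, 160.0)

# Almost a metre out on the floor height: the virtual camera now views the
# set from the wrong height, and the mismatch becomes obvious as soon as
# the camera or the actors move.
print(virtual_camera_position((235.0, 0.0, 160.0), floor_offset_cm=-90.0))  # (235.0, 0.0, 70.0)
```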

 
In this image, taken directly from the Unreal Engine virtual camera, you can see a couple of our ‘calibration cubes’ in the virtual set. The stripes on the one in the background represent steps of 10 cm in both worlds. We line up one of these cubes exactly with our physical wooden frame and then we know that the real and unreal worlds are matched.

 
 
Zak Ghazi-Torbati as the Barman, producer Phillip Moss on set, and a bear. You can see our home-made wooden calibration frame in shot. I’m wearing M&S slippers to protect the floor, honest.

 
 
The guide composite shot which gives us a live feed on set.

 

What kit are we using?
In addition to the Ursa Mini Pro camera, the Vive system and Unreal Engine (which runs on a reasonably powerful PC), the other bit of kit is a Blackmagic DeckLink card which handles four feeds of HD video in and out of the computer. We do the chromakeying (removing the green) using the OBS switcher software mentioned earlier. This amazing vision switcher and live keyer, which runs on the same PC as Unreal, is the most powerful piece of free TV software I've ever used. OBS can feed complex setups into other complex setups, and these can then become sources within yet more layers. This gives us the ability, for example, to make our actors smaller and put them in huge wide shots complete with reflections, and even have videos playing in other layered composites.
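To give a flavour of that scenes-within-scenes idea, here is a small Python sketch. The frame sizes and the crude nearest-neighbour scaling are illustrative assumptions (OBS does all of this internally, and far better), but the structure is the same: a keyed layer can be scaled down, placed inside a larger composite, and that result can itself become a source in yet another layer.

```python
# Illustrative sketch of nested layering - not how OBS is implemented.
import numpy as np

def scale_layer(layer, new_height, new_width):
    """Very simple nearest-neighbour resize of an HxWx3 frame."""
    ys = np.linspace(0, layer.shape[0] - 1, new_height).astype(int)
    xs = np.linspace(0, layer.shape[1] - 1, new_width).astype(int)
    return layer[ys][:, xs]

def place_layer(canvas, layer, top, left):
    """Paste a layer onto a copy of the canvas at the given position."""
    out = canvas.copy()
    h, w = layer.shape[:2]
    out[top:top + h, left:left + w] = layer
    return out

# A keyed shot of an actor becomes a small element in a huge virtual wide
# shot, and that composite could in turn be layered into another scene.
wide_shot = np.zeros((1080, 1920, 3), dtype=np.uint8)        # stand-in background frame
keyed_actor = np.full((1080, 1920, 3), 200, dtype=np.uint8)  # stand-in keyed foreground
composite = place_layer(wide_shot, scale_layer(keyed_actor, 270, 480), 700, 1200)
```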

 
Crew member Laura Kirkup with the clapperboard, actors Zoe Davies, Kayed Mohamed-Mason and David Constant behind the virtual barrels.

 

Two AJA Ki Pro Minis are used to record the HD Unreal and ‘comped’ feeds. This gives us an incredibly versatile set of tools to enable us to see - and review on set - how things will turn out when we eventually re-comp the footage in After Effects for final post-production.

One sketch involved a classic police line-up identity parade, which we had to shoot in several passes so that the actors didn't have to stand close to each other (for Covid reasons). Using the two AJA recorders in and out of OBS, we could keep layering in the studio and so check that people weren't overlapping each other in the final shot.

 
Several layers combined to produce a final guide shot. This will be worked on more in post-production.

 

Conclusion
After using it for a few weeks now, we can say we have a system which gives us a lot of flexibility and offers great creative opportunities. We're using this technique in anger on a proper TV series, with a tracker and two base stations which together cost around £500. The big investments were the PC, the DeckLink card and of course the large green studio. It wasn't easy getting to this point, but it has been worth the effort.

In terms of learning how to use Unreal and the Vive tracker, there are some invaluable tutorials out there - Matt Workman, Richard Frantzen, Aiden Wilson, Greg Corden and Pixel Prof take a bow - but even with their help, it's been a long process to work through the pitfalls and understand how Unreal works, as it has its quirks and the virtual production side is all a relatively recent addition. Thanks too to Chris Marshall and Chris McGaughey here in Wales who helped us out with some of the early testing, and to colleagues Jack Browse, Jack Beresford and Laura Kirkup who have spent the winter with me getting all this working.

For someone who has only occasionally done VFX, using this technique to create a large percentage of a series has been a bit of a ride. We feel a bit like pioneers and it's been great fun as a result. The series, called Age of Outrage, will be out in the UK later in the year on BBC One Wales and BBC iPlayer.

Age of Outrage - call for writing submissions

 
 

We’re well into the pre-production stage of the sketch series Age of Outrage now, and we’re on the lookout for sketches, sight gags, songs or poems (actually, maybe not poems) which might feature in the new series. If you’re a writer or a budding writer who has a connection with Wales, through birth or long-term residency, we’d like to see your material. If you’re interested, have a read of the brief, and send your sketches to: hello@smallandclever.com

Small and Clever and Tiny and Happy

Just as lots of parents are spending more time with their kids, we’ve delivered a series of comedy shorts to BBC Learning’s Tiny Happy People campaign. Called Awkward Silences, the films are destined for the BBC website and social media platforms and take a light-hearted look at parenting young children.


This series forms part of a major initiative from the BBC aimed at encouraging parents to talk more with their children at an early age. Research shows that this simple interaction has a big impact on the development of language and social skills.

The films were great fun to shoot and edit, as we improvised the dialogue with our team of actors, based on storylines we’d developed with the BBC Learning team in Salford. This unscripted approach of rehearse, improvise, record is something we’ve done many times in the past, including on the BBC Wales Bafta-winning comedy series Scrum 4.

Actors Jalisa Andrews and Elin Pavli-Hinde in ‘Regifting’, one of nine films in the Awkward Silences series

Age of Outrage

Our comedy pilot, working title Serving Suggestion, is now called Age of Outrage and is due to air on BBC One Wales on Friday 21 Feb. We’ve called it a sketch show for the internet age, and at least one of the sketches seems to have struck a chord with audiences out there as it’s already had 17 million views on Facebook (32 million now, as of 15 May). You can see the sketch below (45 million views now, as of 20 Oct) and also watch the full show on BBC iPlayer here.

Epic animation

We’ve just completed our Millennium documentary Epic: Party Like It’s 1999 for BBC Cymru Wales, which included animations and illustrations from students at the University of South Wales. We were very pleased to work with the animation team, and especially with Leonie Sharrock, who is a senior lecturer on the animation courses.

Leonie was a great help in coordinating six separate animations, made by individuals and groups on the course, and all this was achieved within a two-week timescale. You can see an example of the way the animations were used in the clip below.

It was a great experience to work with the animators and we’ve certainly noticed animations being used in a lot of recent documentaries, especially archive-based films like this project. It’s a good way of illustrating stories where no archive footage exists.