MediaCity’s dock10 Head of Production – Virtual studios technology explained


You’re watching BBC’s Match of the Day and the studio camera shot at the end of the highlights VT shows a dark sky full of stars. The camera tilts down to reveal a stylised, branded light fitting, then twists and pans 180 degrees across the studio towards an oversized image of a key football player and their game stats. You might think, ‘wow, how do they do that?’ writes Richard Wormwell, Head of Production Innovation at dock10.

The answer lies in Virtual Studio Technology. As Head of Production Innovation at dock10, I’ve been involved in taking this innovative technology from concept through development to on-air with shows including the BBC’s Euro 2020 coverage, the FIA Gran Turismo World Championships and a favourite of children (and parents!) – BBC Bitesize Daily.

Instead of steel, wood and plastic, a virtual studio set is built out of pixels in a real-time 3D rendering engine. It seamlessly combines real people and objects with computer-generated environments to achieve otherwise impossible sets, so convincing that you cannot tell what is real and what is not. The combination of leading-edge technologies that dock10 uses to achieve these super-realistic virtual environments still amazes me – here’s how we do it.

3D space

Essentially our virtual studios use a traditional green screen set-up but with one important difference: we track the camera positions. However a camera is operated (whether hand-held, on a pedestal, a jib, a rail cam or even a techno-crane), we know with sub-millimetre accuracy where it is within the three-dimensional space of our studio; and at 25 frames per second we accurately read all six degrees of movement (pan, tilt, roll, forward and back, left and right, up and down) as well as the lens focal length and focus depth. We capture all this information through a small, wide-angle motion camera and LED sensor mounted on top of each broadcast camera.
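For readers who like to see the idea concretely, the data a tracking system of this kind might report for every video frame can be sketched as below. This is purely illustrative – the field names and layout are hypothetical, not dock10’s actual protocol:

```python
from dataclasses import dataclass

# Illustrative sketch of one per-frame tracking sample: six degrees of
# movement plus lens data, delivered at the video frame rate (25 fps).
# Field names are hypothetical, not an actual tracking protocol.
@dataclass
class TrackingFrame:
    x: float       # left/right position in metres
    y: float       # up/down position in metres
    z: float       # forward/back position in metres
    pan: float     # rotation about the vertical axis, in degrees
    tilt: float    # rotation about the horizontal axis, in degrees
    roll: float    # rotation about the lens axis, in degrees
    focal_length_mm: float   # current zoom setting
    focus_distance_m: float  # current focus depth

FRAME_RATE = 25                          # samples per second
FRAME_INTERVAL_MS = 1000 / FRAME_RATE    # 40 ms between samples
```

One such sample per camera per frame is enough for the renderer to recreate each camera’s view of the virtual set in lockstep with the real one.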

These motion cameras point at the ceiling and send a constant infrared (IR) signal up to the studio grid, where a series of randomly placed reflective stickers bounce the IR light back down to the motion camera. The camera’s sensor reads the light hitting it and the time it has taken to travel; by a process of triangulation we can then calculate the exact position of the camera in 3D space.
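The triangulation principle can be illustrated with a deliberately simplified 2D example: two markers at known positions on the grid, plus the measured distance to each, pin the camera down to the intersection of two circles. The real system works in 3D with many markers and far more robust maths; all positions and numbers below are made up:

```python
import math

# Simplified 2D triangulation sketch: two reflective markers at known
# positions on the studio grid, and a measured distance from the camera
# to each. The camera sits where the two distance circles intersect.
def locate_camera(m1, m2, r1, r2):
    """Return the intersection point below the grid line through m1 and m2."""
    dx, dy = m2[0] - m1[0], m2[1] - m1[1]
    d = math.hypot(dx, dy)                    # distance between the markers
    a = (r1**2 - r2**2 + d**2) / (2 * d)      # projection onto the m1->m2 axis
    h = math.sqrt(r1**2 - a**2)               # offset perpendicular to that axis
    px, py = m1[0] + a * dx / d, m1[1] + a * dy / d
    # Two candidate intersections; the camera hangs below the grid,
    # so take the lower of the two.
    c1 = (px + h * dy / d, py - h * dx / d)
    c2 = (px - h * dy / d, py + h * dx / d)
    return min(c1, c2, key=lambda p: p[1])

grid_marker_a = (0.0, 0.0)   # known sticker positions on the grid (metres)
grid_marker_b = (4.0, 0.0)
pos = locate_camera(grid_marker_a, grid_marker_b,
                    math.sqrt(10), math.sqrt(18))
print(pos)  # camera roughly 1 m along the grid and 3 m below it
```

With many markers in view at once, the same idea becomes an over-determined solve, which is what gives the system its sub-millimetre accuracy and resilience to the odd blocked sticker.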

dock10 studios.


Virtual sets

Meanwhile, our in-house team of Virtual Set Developers have been working alongside production companies, creative directors, traditional set designers and lighting directors to create the virtual worlds in which the presenters will deliver the shows. We use the latest 3D modelling packages and real-time renderers to create environments that can be as wild and creative as your imagination will allow; being computer generated, they are not restricted by the physical dimensions of the studio space – the Match of the Day virtual set, for example, is set in the centre circle of a 60,000-seater football stadium.

However, no matter how large the virtual space, we also have to consider any physical objects used within it, such as the presenters’ desks and chairs. We model these along with a 3D version of the green screen walls and floor, so we have a full understanding of the virtual and physical limits on the cameras’ and presenters’ movements. We also build all our sets to work in 360 degrees, opening up the possibility for cameras to shoot in every direction.

Rocket boots 

As part of the set design, we also develop augmented objects using computer graphics to alter and enhance what is seen. These might be data-driven graphics delivered as 3D text, virtual pyrotechnics, or a huge video wall for VT playback that saves the cost of hiring expensive LED walls. For BBC Bitesize Daily we built a fully automated 3D robot that was driven by a performance artist wearing a motion-capture suit – we even gave him rocket boots so he could fly around the set!

 


dock10 studios.


Green screen

Once we have a studio lit for a green screen production, a set of broadcast cameras all rigged with camera tracking devices, and a three-dimensional set built to run in a real-time renderer – all we need to do is combine these three elements and run the signals through our broadcast infrastructure. Easy! We start by feeding the broadcast cameras’ tracking data and the virtual environment into a control interface that manages all the video and data signals; this software also allows us to generate the composite (combined) video signal using a keying technique similar to those used in high-end post production. By using a real-time image-based keyer that runs on GPU processing power, we can produce spectacular results with contact shadows, transparent objects and sub-pixel details like hair.

And because each camera tracks all the other cameras, we can generate dynamic ‘3D masks’ so that the cameras can move around the space – even right next to the presenters – without appearing on the finished broadcast. We can also ‘garbage mask’ areas that aren’t lit or covered in green, such as the lighting grid and the ‘fourth wall’, which allows cameras to offer seemingly impossible shots.
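The keying idea itself can be sketched in a few lines: estimate how strongly each camera pixel matches the green screen, turn that into an alpha (transparency) value, then blend the camera image over the rendered virtual set. This toy version is nothing like a production GPU keyer – there’s no spill suppression, sub-pixel edge handling or contact shadows – but it shows the underlying composite:

```python
# Toy chroma-key sketch: composite = foreground*alpha + background*(1-alpha),
# with alpha derived from how "green screen" each camera pixel looks.
# All pixel values are hypothetical RGB tuples in the 0..1 range.

def green_dominance(pixel):
    """How much the green channel exceeds the other two channels."""
    r, g, b = pixel
    return max(0.0, g - max(r, b))

def key_alpha(pixel, threshold=0.3):
    """Fully opaque (1.0) for non-green pixels, transparent (0.0) for screen."""
    return 0.0 if green_dominance(pixel) > threshold else 1.0

def composite(fg_pixel, bg_pixel):
    """Blend a camera pixel over the rendered virtual-set pixel behind it."""
    a = key_alpha(fg_pixel)
    return tuple(f * a + b * (1 - a) for f, b in zip(fg_pixel, bg_pixel))

presenter = (0.8, 0.6, 0.5)  # skin-tone pixel: kept from the camera feed
screen    = (0.1, 0.9, 0.1)  # green-screen pixel: replaced
virtual   = (0.2, 0.2, 0.3)  # rendered virtual-set pixel behind
print(composite(presenter, virtual))  # -> (0.8, 0.6, 0.5)
print(composite(screen, virtual))     # -> (0.2, 0.2, 0.3)
```

A real-time keyer makes this decision per pixel, per camera, 25 times a second on the GPU, which is what allows fine detail like hair and transparency to survive the composite.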

Programme making

Having achieved all that, it’s now just down to the programme making. The gallery crews have ‘standard’ monitor stacks showing each camera’s composite feed, so for them it’s just like producing and directing a traditional show. The camera operators get a view of both the clean and composite feeds on dual monitors so they can see exactly what’s going on; for the talent, we provide slung and floor monitors showing the output of the vision mixer or specific cameras, as well as eyeline monitors so they still have a visual connection when they need to conduct down-the-line interviews. These small eyeline monitors are often replaced with much larger virtual LED screens. Although it can be a bit daunting at first, we’ve found that most people get used to the set-up very quickly and are soon immersed in the space themselves.  

Match of the Day

dock10 was the UK’s first major studio to adopt these next-generation virtual studio technologies, and after a year of R&D, training and recruitment we re-launched Match of the Day for the BBC from our first virtual set back in August 2019. Since then, we’ve gone on to produce hundreds of hours of virtual studio content, putting the system through its paces and proving its value for taking the viewer experience to new levels. As more and more production companies realise the benefits of virtual and augmented production techniques, and the cost and environmental savings these technologies allow, it won’t be a surprise to see more of it hit your screen. The brilliance, though, is that you might not notice.
