Keho Interactive promoted the possibilities of a virtual studio in the Future Zone, an event held in Mediapolis on 31 January 2017. A wide variety of virtual reality headsets and content was on display. The goal of the event was to showcase companies utilizing this futuristic technology and to explore possibilities for collaboration, for example with the Finnish Broadcasting Company YLE.
Combining green screen and motion capture
The virtual studio isn’t a recent invention. In its older form, it has been used for almost three decades in the children’s shows YLE produces. With the addition of modern motion capture cameras and a game engine to render the virtual surroundings, the technique has been taken to a whole new level.
A virtual studio combines a green screen with motion capture. With the green screen, you can create an animated environment and place the person in the studio right in the middle of it. Motion capture cameras track the movements of the broadcast camera and transfer them into the animated world, so the virtual background stays in sync with the shot. Combined, the two techniques enable multi-camera filming in a virtual studio environment.
A virtual studio differs from the traditional green screen technique in two major ways: it enables multi-camera production and allows the cameras to move while filming. You can move around in the virtual environment created in the studio and change filming angles just like in a normal multi-camera production. In traditional green screen production, the camera has to stay still, resulting in a monotonous outcome.
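One concrete requirement behind this is that each virtual camera must mirror its physical counterpart: when the broadcast camera pans or zooms, the engine's camera has to follow. As a minimal sketch (the function name and the full-frame sensor width are illustrative assumptions, not Keho Interactive's actual pipeline), the real lens's focal length can be converted into the virtual camera's field of view with the standard pinhole-camera formula:

```python
import math

def horizontal_fov_deg(focal_length_mm: float, sensor_width_mm: float = 36.0) -> float:
    """Horizontal field of view of a lens, from the pinhole-camera model.

    A full-frame sensor width (36 mm) is assumed by default.
    """
    return math.degrees(2 * math.atan(sensor_width_mm / (2 * focal_length_mm)))

# A 50 mm lens on a full-frame sensor gives roughly a 39.6-degree
# horizontal field of view, which the virtual camera should match:
fov = horizontal_fov_deg(50)
```

Feeding this value to the virtual camera on every frame keeps the rendered background's perspective consistent with the live shot, even mid-zoom.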
Bringing the animation to life with a game engine and mocap technology
Motion capture allows movements in the virtual studio to be tracked very precisely. As the person on camera moves around in the animated environment, shadows follow just like in the real world. Virtual objects and people can be added so smoothly that the viewer will have trouble distinguishing the animated elements from the real ones.
The animated background can be made even more lifelike by adjusting the depth of field. When the camera focuses on the foreground, the background blurs out, which makes the viewing experience more natural for the human eye. The depth of field of the animated background can either be linked to the aperture and focus settings of the real camera or adjusted independently by hand. Both options require further research but could be implemented in the near future.
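Linking the virtual background's blur to the real lens comes down to standard depth-of-field optics. The following is a hedged sketch, not a description of the actual implementation: the thin-lens formulas are textbook optics, while the function and the default circle-of-confusion value are illustrative assumptions. Given the real camera's focal length, f-number, and focus distance, the engine could compute which distances fall outside acceptable sharpness and blur the virtual set accordingly:

```python
def dof_limits(focal_mm: float, f_number: float, subject_mm: float,
               coc_mm: float = 0.03) -> tuple[float, float]:
    """Near and far limits of acceptable sharpness (thin-lens approximation).

    coc_mm is the circle of confusion; 0.03 mm is a common
    full-frame assumption.
    """
    hyperfocal = focal_mm ** 2 / (f_number * coc_mm) + focal_mm
    near = hyperfocal * subject_mm / (hyperfocal + subject_mm - focal_mm)
    # Beyond the hyperfocal distance, everything to infinity is sharp.
    far = (hyperfocal * subject_mm / (hyperfocal - (subject_mm - focal_mm))
           if subject_mm < hyperfocal else float("inf"))
    return near, far

# 50 mm lens at f/2.8, presenter 3 m from the camera:
near, far = dof_limits(50, 2.8, 3000)
# Anything in the virtual set closer than `near` or farther than `far`
# (in mm from the camera) should receive matching blur.
```

Reading the aperture and focus values from the lens in real time and re-running this calculation per frame would implement the "linked" option the text describes.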
Virtual props can be basically anything you can imagine. When building a virtual world, you can play around endlessly with shapes, colors, materials, weather conditions and interactive elements. You can add videos from other sources and bind them to surfaces in the animated world. For example, during an intermission in an ice hockey game, you could place replays and clips from other games onto the walls of the studio. The virtual studio doesn’t even need to resemble a traditional studio, because when working with virtual reality your imagination is the only limit.
The game engine used by Keho Interactive takes the visuals of the animated content to the next level. It enables more realistic rendering of surface materials, more complex animations, and dynamic changes to the weather and the time of day. The game engine can also make animated objects react to the actors’ movements or even to messages sent by viewers.
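That last point, props reacting to actor movements or viewer messages, is in essence an event-routing problem. A minimal Python sketch (the class and event names here are hypothetical; a production version would live inside the game engine's scripting layer, not plain Python):

```python
from collections import defaultdict
from typing import Callable

class VirtualSet:
    """Toy event bus: routes incoming events to animated props.

    Hypothetical illustration only -- real engines ship their own
    event and messaging systems.
    """
    def __init__(self) -> None:
        self._handlers: dict[str, list[Callable[[str], None]]] = defaultdict(list)

    def on(self, event: str, handler: Callable[[str], None]) -> None:
        """Register a prop's reaction to a named event."""
        self._handlers[event].append(handler)

    def emit(self, event: str, payload: str) -> None:
        """Deliver an event (e.g. a viewer message) to all subscribed props."""
        for handler in self._handlers[event]:
            handler(payload)

# An animated mascot reacts whenever a viewer message arrives:
log = []
studio = VirtualSet()
studio.on("viewer_message", lambda text: log.append(f"mascot waves at: {text}"))
studio.emit("viewer_message", "hello from the couch!")
```

The same dispatch pattern covers both triggers mentioned above: the mocap system can emit actor-movement events, and an internet-facing service can emit viewer-message events, with props subscribing to whichever they should react to.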
Engaging The Viewer
Transmedia has lately been a trending topic in TV production, particularly in discussions about collaboration between different media channels. Transmedia combines TV production with other types of media. In these productions, the story is the core of the production, but its output varies from a TV series to a game or a live event. Transmedia is characterized by a broader, richer user experience and by making the viewer part of the phenomenon.
The virtual studio is a great platform for transmedia production. Elements from other media and channels can be imported in real time. In addition, the virtual studio can be outfitted with elements that turn the viewing experience into a game: viewers watching from their TVs at home can participate in the program via the internet and influence what is happening in the studio.
Because it runs on a game engine, the virtual studio is compatible with VR headsets. You can visit the virtual set just by putting on your VR glasses. As VR headsets grow more common, traditional TV production must take them into consideration. This calls for virtual, interactive and gamified productions where the viewer is turned from a passive spectator into an active participant.
Almost Everything is Possible
Motion capture enables animated characters to be performed in real time in the virtual studio. You can have a real person and an animated character standing next to each other, acting like regular news anchors on a live broadcast, even though one of them is an animation.
A virtual studio not only enables adding real people to an animated environment but also makes it possible to bring animated content into the real world. If you are filming outside, you can add animated characters, such as animals or tour guides, or other kinds of foreground graphics. Motion capture technology enables you to mix reality and virtual worlds more authentically.
Together with YLE, we are practicing the use of a virtual studio during 2017, preparing for live productions. Our goal is to get our virtual studio technique working so flawlessly that we can use it to produce live broadcasts. We are now looking for productions that are willing to try out the newest technology. If you have an idea that we could help you produce, contact us! We are happy to answer any questions about the technique and the practical implementation of a virtual studio.
Written by Vilma Linna, translated by Hermanni Ahtiainen