Games engines are transforming virtual sets, and TV will follow

Is the writing on the wall for the traditional virtual set renderer? Traditional virtual set vendors seem to think so. Here’s the view of Gideon Ferber, director, product management and business development, Ross Video Virtual Solution: “Virtual studios reached their peak about ten years ago, but we now see a decline in the market because they are just not good enough.”

And here’s ChyronHego’s managing director of virtual solutions, Oliver Cohen: “Game engines represent the future industry standard.”

The introduction of games engines into broadcast will begin but not end with virtual sets. New entertainment formats mixing physical and virtual objects, characters and people in real time are being developed. Some call this concept interactive mixed reality.

Reality Check Systems (RCS) president Andrew Heimbold says the entrance of game engines into the virtual set space “is one of the most exciting developments I have seen”.

“There is a difference in visual quality; we have to acknowledge that,” agrees Gerhard Lang, chief engineering officer, Vizrt. “While games engines are ahead at this time there are also advances in broadcast engines which will bring the capabilities of the two technologies closer together.”

All eyes will be on Epic Games at NAB. The developer of Unreal Engine – one of the prime games technologies – is hosting its first ever broadcast trade show press conference. SVG Europe expects a flurry of integrations along the lines of NewTek’s partnership with Epic announced late last year.

Vizrt, for example, “has always had a very open system which has made it possible for users to choose an alternate renderer as a plug-in inside Viz,” says Lang. “We will continue that.”

Broadcast and games convergence

Virtual sets were first introduced by Ultimatte in the mid-1990s, based around its chroma-keying technology, and were advanced by companies like Vizrt and Orad using powerful renderers based on character generators.

Competitor systems by Brainstorm, Ross Video and others continued to grow the market. Indeed, virtual sets have become a fixture at many broadcasters, primarily for use in children’s, sports and news programming as a cost-effective use of studio space. In recent years, such systems have been augmented with robotic tracking systems and the ability to overlay augmented reality (AR) graphics and 3D models.

The convergence of linear video production and gaming engines has been taking place over the past several years in an organic fashion. In a quest for more and more realistic rendering, producers are turning to gaming engines to provide state-of-the-art real-time 3D rendering of virtual environments.

“The main driver is to get a much more realistic looking image,” says Ferber. “Viewers can tell when a virtual set is being used. For many broadcasters that is the look and that’s fine, but the major element we set out to solve in adopting a games engine was to create a level of realism unrivalled by any character generator on the market. As much as some vendors can get nice results they will never be up to the level of a games engine which can push realism from day one. That’s something CGs were never meant to do.”

On the other hand, games engines were never designed to work in broadcast. Games engines from Epic Games (Unreal) or Unity are designed to render graphics (polygon counts, textures, specular lighting) as fast as possible and do not natively synchronise with broadcast signals, which must be genlocked to fixed frame rates and SMPTE timecode.
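To make that gap concrete, here is a minimal, illustrative sketch in Python (with invented function names rather than any vendor’s SDK) of the adaptation a broadcast-compliant engine has to make: instead of rendering as fast as possible, each frame is released on a fixed cadence matching the output format, the way a genlocked signal demands.

```python
import time

BROADCAST_FPS = 50.0            # e.g. 1080p50; 59.94 for NTSC-derived formats
FRAME_PERIOD = 1.0 / BROADCAST_FPS

def render_frame(frame_index: int) -> None:
    """Placeholder for the engine's render call (illustrative only)."""
    pass

def run_paced_loop(duration_s: float = 1.0) -> None:
    """Render as a game engine would, but release frames on a fixed
    broadcast cadence instead of 'as fast as possible'."""
    start = time.monotonic()
    frame = 0
    while time.monotonic() - start < duration_s:
        deadline = start + (frame + 1) * FRAME_PERIOD
        render_frame(frame)                 # engine renders the frame
        remaining = deadline - time.monotonic()
        if remaining > 0:
            time.sleep(remaining)           # hold until the genlock-style tick
        # if remaining < 0 the frame was late and would be dropped or repeated
        frame += 1

if __name__ == "__main__":
    run_paced_loop(1.0)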
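```

In a real system the tick would come from house sync hardware rather than a software timer; the point is only that the render loop must be slaved to the broadcast clock.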

“Gaming engines have gone beyond what has been possible with traditional virtual rendering engines,” argues Brian Olson, VP of product management for NewTek. “Things like global illumination, real-time reflections, and real-time shadows are difficult to do with most traditional virtual set products.”

Re-coding Unreal

Ross Video partnered with Oslo-based The Future Group to rewrite the Unreal code to comply with genlock. This software is being sold as Frontier, a new graphics engine. Ross is also packaging other components around this to create a turnkey tool-set for a new Mixed Reality production system which mixes virtual and physical objects, characters or people in real time.

Other components of the Ross solution include a tracking system through which broadcast cameras and robotics communicate with Frontier and drive the software’s virtual cameras. Ross’ UX is the control platform used to drive the whole system.

“Games platforms give you all the editing tools to create your game but no interface to drive production, run events or trigger animations,” says Ferber. “UX brings the games engine into the Ross workflow. Using it we can control multiple engines and cameras and control tracking and data feeds from a single interface.”
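As a rough illustration of the tracking side of such a system, the sketch below (hypothetical field names, not the Ross or Frontier protocol) shows how per-frame pan, tilt, zoom and position data from a physical camera could be copied onto a virtual camera so the rendered set follows the real camera move.

```python
from dataclasses import dataclass

@dataclass
class TrackingSample:
    """One frame of camera tracking data (hypothetical fields)."""
    pan_deg: float
    tilt_deg: float
    zoom_fov_deg: float          # lens field of view derived from the zoom encoder
    x_m: float
    y_m: float
    z_m: float

@dataclass
class VirtualCamera:
    """Minimal virtual camera a render engine would consume."""
    position: tuple
    pan_deg: float = 0.0
    tilt_deg: float = 0.0
    fov_deg: float = 60.0

    def apply(self, sample: TrackingSample) -> None:
        # Copy the physical camera's pose and lens state onto the
        # virtual camera so the rendered set matches the real move.
        self.position = (sample.x_m, sample.y_m, sample.z_m)
        self.pan_deg = sample.pan_deg
        self.tilt_deg = sample.tilt_deg
        self.fov_deg = sample.zoom_fov_deg

# Usage: feed one tracking sample per video frame into the virtual camera.
cam = VirtualCamera(position=(0.0, 0.0, 0.0))
cam.apply(TrackingSample(pan_deg=12.5, tilt_deg=-3.0, zoom_fov_deg=42.0,
                         x_m=1.2, y_m=1.6, z_m=-4.0))
```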

The first example of its use is Lost in Time, a game show produced by FremantleMedia and TFG. Ross is also marketing Frontier to broadcasters.

“We think game engine-driven virtual production systems will be adopted as straight replacements for virtual sets – initially,” says Ferber. “This is the broadcaster’s comfort zone.”

However, it’s hoped that exposure to the technology will lead to more innovative presentations.

“What has not been possible with virtual sets is replication of outdoor scenes,” says Ferber. “With traditional CGs this is almost impossible since they do not reflect light sources [in a naturalistic manner]. Games engines open up endless stylistic and creative opportunities.”

Photoreal environments

Currently, the traditional virtual set technologies are more broadcast-compliant than games engines. “However, when it comes to rendering performance, game engines are a real step ahead – largely due to their connection to the video gaming world,” says Cohen. “In general, games engines scale better in terms of rendering performance and features, simply because the developer community behind Epic Unreal or Unity is much bigger than the support any single vendor could provide.”

RCS’ Heimbold agrees: “Game engines have many more capabilities in terms of photorealism; however, there can be limitations in their integration with broadcast workflows. On the other hand, there are established broadcast CG vendors with extremely refined workflow solutions that do not produce the photorealistic look you can achieve with a modern game engine. Both sides are converging to solve their respective challenges.”

Underpinning this growth is GPU technology from companies like NVIDIA, which continues to increase performance for both sides.

“The question remains as to who will provide the most robust solution that not only looks great but also integrates seamlessly into modern facilities,” says Heimbold.

Cohen says ChyronHego is “looking at this topic very closely” while recognising that the transition to games engines in broadcast will take some time.

“Broadcast compliancy is a key issue for broadcasters. That’s why we see more applications for game engines in cinema and entertainment programmes for now.”

“It’s never been easier to create 3D graphics using standard modelling tools, and powerful 3D virtual set and game engines are yielding incredibly realistic results,” he insists. “For on-camera tracking technology, systems have become very reliable and offer the ability to track studio cameras in any position simply by adding small optical sensors to the top of the cameras.”

Broadcast compliancy

He points to ChyronHego solutions like Filmbox for importing a complete 3D project from 3ds Max or Maya, including lights, geometries, shaders, textures and animations. “Once the project is passed into the virtual set engine, the results are very photo-realistic thanks to major improvements in hardware and software,” says Cohen.
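For illustration only, the sketch below shows the kind of scene description such a complete-project transfer is expected to carry (lights, geometry, shaders, textures and animation clips); the classes are hypothetical and are not ChyronHego’s or Autodesk’s actual import API.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Light:
    name: str
    kind: str            # "point", "spot" or "directional"
    intensity: float

@dataclass
class Material:
    name: str
    shader: str          # e.g. a PBR shader name
    textures: List[str]  # file paths to texture maps

@dataclass
class Mesh:
    name: str
    material: Material
    animation_clips: List[str] = field(default_factory=list)

@dataclass
class ImportedScene:
    """Everything a 'complete project' transfer should bring across."""
    lights: List[Light]
    meshes: List[Mesh]

# Usage: a one-light, one-mesh scene as it might arrive in the set engine.
scene = ImportedScene(
    lights=[Light(name="key", kind="spot", intensity=1.2)],
    meshes=[Mesh(name="desk",
                 material=Material(name="wood", shader="pbr_standard",
                                   textures=["desk_albedo.png"]),
                 animation_clips=["desk_rise"])],
)
```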

“Virtual graphics will start blooming for all sorts of content in the next few years,” he believes. “We will also see a transition from HD-SDI to IP video that will help game engines proliferate. Game engines will continue to become more and more broadcast compliant, and at some point we envision that virtual sets and virtual graphics will run as a cloud service – allowing multiple studio productions without all of the hassles of in-house hardware.”

Vizrt’s Lang doesn’t think photoreal graphics will drive demand. “We’ve been in this business a very long time and we see fashions come and go in waves. Virtual sets have always been a revenue generator for us. We are also seeing a lot of demand for Augmented Reality outdoors and in blending physical with virtual sets seamlessly so the audience really does not notice the difference.”

Game engines bring certain functions to the equation, such as artificial intelligence and the ability to use real-time motion capture data, he adds.

As depth cameras advance and GPUs accelerate the processing of 3D maps interfaced with animation software, the green screen could be removed entirely. Virtual content could then be created outside a studio.
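A minimal sketch of that idea, assuming a camera that delivers a per-pixel depth map alongside the video frame: pixels nearer than a threshold are treated as the subject and everything further away is replaced with the virtual background. The arrays and the threshold below are illustrative and not tied to any particular depth camera.

```python
import numpy as np

def depth_key(frame: np.ndarray, depth_m: np.ndarray,
              background: np.ndarray, max_subject_depth_m: float = 2.5) -> np.ndarray:
    """Composite a subject over a virtual background using a depth map
    instead of a chroma key: pixels nearer than the threshold are kept,
    everything further away is replaced."""
    mask = depth_m < max_subject_depth_m            # True where the subject is
    return np.where(mask[..., None], frame, background)

# Illustrative 4x4 test frame: subject pixels at 1.5 m, background at 6 m.
frame = np.full((4, 4, 3), 200, dtype=np.uint8)
background = np.zeros((4, 4, 3), dtype=np.uint8)
depth = np.full((4, 4), 6.0)
depth[1:3, 1:3] = 1.5
composite = depth_key(frame, depth, background)
```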

“The technology will improve such that the look of a rendered scene and a real background will be indistinguishable,” says Lang. “Ultimately, the use of a virtual set or AR assets means the content has to be meaningful and be achieved in a way that could not [be done] with a real set, otherwise the solution loses its effectiveness.”
