Vince Pace: Data Mining Will Deliver 2D, 3D Viewing ‘Beyond Reality’
By: Adrian Pennington
CAMERON PACE Group (CPG) has signed an exclusive agreement with Dolby Laboratories to develop a means of capturing data about a scene in live or recorded production for use in enhancing the experience of watching 3D content on glasses-free screens.
Speaking at the Dolby booth this week, CPG Co-Chair Vince Pace said, “We’ve shifted from an analogue type of filming to something like data mining, where we can capture pixel density or high dynamic range and many other parameters, process it, and deliver the best possible viewing experience. There is a big opportunity to capture information through the CPG camera and take it right through to the display.”
Co-developed by Dolby and Philips, the Dolby 3D format was initiated to support the creation, delivery, and playback of glasses-free 3D content on TVs, tablets, and smartphones or other mobile devices.
Since autostereoscopic screens vary widely, from tablets to large panels displaying anywhere from eight to 28 views, Dolby believes that the best viewing experience comes when its algorithm renders the content appropriately for each display rather than letting the display extrapolate the 3D itself. The additional metadata would be carried in the signal along with the content.
“The collaboration is concentrated on the methodology just now,” said Pace. “We are not here yet. But this is one step on the road to achieving an experience beyond reality. I am serious about that. We will get to a stage where you will experience every live event and the creativity of the filmmaker in ways we haven’t even imagined. This holds for 2D as much as it does for 3D.
“I don’t want to say,” he continued, “that we are creating a Medusa — a monster — of new technology converging on-set or on location and that there will be a mathematician or Dolby engineer or even a stereographer looking over your shoulder. Quite the reverse. The camera will perform all this data capture for you.”
The Dolby and CPG project will see essential information derived from depth maps and attached to content at the point of capture. A new postproduction process will use the data to augment playback of content over autostereoscopic screens.
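In rough terms, a depth map lets a playback device synthesize additional viewpoints by shifting pixels horizontally in proportion to their depth, a technique known generically as depth-image-based rendering. The sketch below is purely illustrative, not Dolby's actual algorithm; the function name and parameters are invented for the example.

```python
# Illustrative depth-image-based rendering (DIBR): synthesize a new
# viewpoint for one scanline by shifting each pixel horizontally in
# proportion to its depth. Toy sketch only, not Dolby's algorithm.

def synthesize_view(image, depth, baseline):
    """Shift pixels of a one-row grayscale image by baseline * depth.

    image    -- list of pixel values (one scanline)
    depth    -- per-pixel normalized depth in [0, 1] (1 = nearest)
    baseline -- virtual camera offset in pixels at depth 1.0
    """
    width = len(image)
    out = [None] * width
    zbuf = [-1.0] * width  # depth buffer: nearer pixels win conflicts
    for x, (value, d) in enumerate(zip(image, depth)):
        # Nearer pixels (larger d) shift further, creating parallax.
        nx = x + round(baseline * d)
        if 0 <= nx < width and d >= zbuf[nx]:
            out[nx] = value
            zbuf[nx] = d
    # Fill disocclusions (holes) with the nearest filled neighbor.
    last = image[0]
    for x in range(width):
        if out[x] is None:
            out[x] = last
        else:
            last = out[x]
    return out


if __name__ == "__main__":
    scanline = [10, 20, 30, 40, 50]
    depth = [0.0, 0.0, 1.0, 0.0, 0.0]  # middle pixel is nearest
    print(synthesize_view(scanline, depth, baseline=2))
```

A real implementation would operate on full frames, handle sub-pixel shifts, and use far more sophisticated hole filling, but the principle of per-pixel parallax driven by a depth map is the same.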
The Foundry is the first post vendor to integrate the Dolby 3D format into its products, Nuke and Ocula, although the plan is for other developers, such as SGO, maker of Mistika, to participate.
The key aspect that the Dolby 3D format needs to account for is that the depth budget in the current generation of autostereoscopic displays is less than that of stereoscopic displays.
According to Roland Vlaicu, senior director, broadcast imaging, Dolby Laboratories, this limitation will be reduced over time as panel technology advances, but the Dolby 3D format needs to manage the depth budget accordingly.
“There is a need to reduce the amount of depth [parallax] when content is played back to get into the comfort zone of the displays,” he said.
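Conceptually, fitting content into a display's comfort zone amounts to compressing the range of on-screen parallax. The sketch below shows the simplest possible form of this, a linear rescale of per-pixel disparity values; the function and parameter names are hypothetical, and Dolby's actual metadata-driven mapping is certainly more elaborate.

```python
# Hypothetical sketch of depth-budget management: linearly compress
# per-pixel disparity values so the strongest parallax fits within a
# display's comfort zone. Names are illustrative, not Dolby's fields.

def fit_depth_budget(disparities, display_budget):
    """Scale disparities (in pixels) so |disparity| <= display_budget."""
    peak = max(abs(d) for d in disparities)
    if peak <= display_budget:
        return list(disparities)  # already within the comfort zone
    scale = display_budget / peak
    return [d * scale for d in disparities]


if __name__ == "__main__":
    # Content mastered with up to 40 px of parallax, replayed on an
    # autostereoscopic panel that is comfortable only up to 10 px.
    print(fit_depth_budget([-40, -10, 0, 20, 40], display_budget=10))
```

Doing this offline in post is straightforward; computing and applying such a mapping frame by frame during a live broadcast is the harder problem Vlaicu alludes to.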
Although this adds a step to the postproduction workflow, doing so in real time for live events is more challenging. Vlaicu noted, “A lot of work needs to be done to figure out how to generate all of this data in real time.”
Dolby is talking with other 3D-production-system vendors — notably, 3Ality Digital — and plans to open up the format once the metadata parameters and the capture and post workflows have been defined.
“We are creating a format and cooperating exclusively with CPG because of the visionary aspect of their achievements so far,” said Guido Voltolina, GM of the joint project. “This cooperation will develop a format so that other companies can save their creation in that form.”
He said that the first commercial displays optimized with Dolby 3D will be available “sometime next year. I don’t know whether a tablet maker will move faster than a TV maker.”
Getting there requires inexpensive 4K panels fitted with a switchable lenticular layer that will automatically alternate between a 2D and 3D viewing experience.