Fraunhofer advances automated 3D production
German research outfit Fraunhofer HHI has a number of interesting technologies designed to make 3D production as straightforward as 2D, writes Adrian Pennington. The Automated Stereo Production (ASP) system is intended for low-cost live and recorded 3D broadcasts and is a joint venture between Fraunhofer HHI, rig developer P+S Technik, producer KUK Film Production and Zeiss.
It’s essentially a small twin-lens unit with an onboard computer that takes coordinates from, and controls, the glass, removing the need for manual lens set-up and stereo alignment. The concept most closely resembles the Meduza Titan camera.
Zeiss is developing new 2/3-inch zoom lenses for the unit, which is being shown in prototype on the P+S stand at NAB.
Fraunhofer is integrating its Stereoscopic Analyser (STAN), which calculates per-shot parameters such as colour matching and stereo geometry, identifies incorrect settings and adjusts them on the fly.
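The article doesn’t detail how STAN works internally, but the core idea behind automated stereo alignment can be sketched: in a correctly aligned rig, matched feature points in the left and right images differ only horizontally, so any consistent vertical offset signals a geometry error to flag and correct. A minimal illustration in Python – the function names and the median-based estimate are assumptions for illustration, not Fraunhofer’s implementation:

```python
import numpy as np

def estimate_vertical_offset(pts_left, pts_right):
    """Median vertical disparity between matched feature points.

    In a correctly aligned rig, matched points in the left and right
    images should differ only horizontally; a consistent vertical
    offset indicates a misalignment an analyser would flag.
    """
    dy = pts_right[:, 1] - pts_left[:, 1]
    return float(np.median(dy))  # median resists occasional bad matches

def correct_vertical_offset(img_right, offset_px):
    """Shift the right-eye image vertically to cancel the offset."""
    return np.roll(img_right, -int(round(offset_px)), axis=0)
```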
“It can support 2K or 4K and we will have a working product by the end of the year,” says Fraunhofer’s Stefan Gick.
Also from Fraunhofer is a trifocal camera system which enables 2D on-set production and automated stereoscopic post-production. A collaboration with Walt Disney Animation Research and Arri, the three-camera system employs an Arri Alexa as the prime camera with two IndieCamGS2K point-of-view satellite cameras.
In post, footage from the cameras is synchronised by timecode, then fed into an Adobe After Effects plug-in that creates depth maps, which can be used to render a 3D image and to perform other tasks such as colour correction.
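The article doesn’t specify how the plug-in renders 3D from a depth map, but the standard technique is depth-image-based rendering (DIBR): convert depth to per-pixel disparity via the pinhole relation d = f·B/Z, then shift pixels horizontally to synthesise the second eye. A simplified sketch, with illustrative parameter names – real DIBR pipelines also fill the disocclusion holes this naive version leaves:

```python
import numpy as np

def depth_to_disparity(depth_m, focal_px, baseline_px):
    """Pinhole relation: disparity = f * B / Z (in pixels)."""
    return focal_px * baseline_px / depth_m

def render_view(img, disparity_px):
    """Naive DIBR: shift each pixel horizontally by its rounded
    disparity to synthesise the second-eye view."""
    h, w = img.shape[:2]
    out = np.zeros_like(img)
    xs = np.arange(w)
    for y in range(h):
        tx = np.clip(xs + np.round(disparity_px[y]).astype(int), 0, w - 1)
        out[y, tx] = img[y, xs]
    return out
```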
“Instead of shooting 3D on set which is time-consuming to set up, uses bulky equipment and can cause disparities in the image, this technology makes it possible for filmmakers to shoot 2D as normal with a lot of the decision making for 3D shifted to post,” explains Fraunhofer HHI research associate Nicola Gutberlet. “We don’t intend to fully get rid of manual processes for 3D in post but we can reduce it substantially. Special expertise in 3D cameras is not required, eliminating the need for stereographers on set or continual readjustment of camera settings.”
Depth maps are a well-established part of the conventional post-production workflow where they are typically used to generate three-dimensional special effects. However, the use of depth maps as an alternative to the standard production of stereo images is still very much in its infancy.
During post-production, dense depth maps are estimated from the recorded material, enabling the generation of stereo content. Another advantage of the system is that the question of the content’s target system – which could be a 3D cinema or a stereoscopic display – now only needs to be settled during actual post-production.
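One way to see why the target system can be deferred: with depth maps in hand, the stereo disparities can simply be rescaled per display. A common industry rule of thumb – illustrated here, not attributed to Fraunhofer – caps positive (behind-screen) disparity so that on-screen separation never exceeds the viewer’s interocular distance, roughly 65 mm:

```python
def disparity_budget_px(screen_width_m, image_width_px,
                        interocular_m=0.065):
    """Largest behind-screen disparity, in pixels, such that on-screen
    separation never exceeds the viewer's interocular distance."""
    metres_per_px = screen_width_m / image_width_px
    return interocular_m / metres_per_px

def retarget(disparity_px, current_max_px, screen_width_m, image_width_px):
    """Uniformly scale a shot's disparities to fit the display budget."""
    budget = disparity_budget_px(screen_width_m, image_width_px)
    return disparity_px * min(1.0, budget / current_max_px)
```

A 10 m cinema screen at 2K width allows only about 13 px of positive disparity, while a small stereoscopic monitor allows far more – exactly the kind of decision the system shifts into post.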
The trifocal camera will be used to shoot a live-action short film for Disney this summer. If this proof of concept is successful, Disney intends to screen the short in cinemas and apply the technology to a future feature-length production.
Also at NAB, Fraunhofer is demonstrating a prototype camera array of eight compact cameras, each built around a three-megapixel sensor, aimed at visual effects work.
“With the camera array we can calculate the distance between a camera and an object without the need for a greenscreen,” says Siegfried Foessel, head of Fraunhofer’s moving picture technologies department.
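The article doesn’t describe the array’s algorithm, but the geometry behind the claim is stereo triangulation: for calibrated, rectified cameras with known focal length and baseline, depth follows directly from the measured disparity between views, with the array’s eight overlapping viewpoints making that estimate more robust. A minimal sketch with hypothetical numbers:

```python
def depth_from_disparity(disparity_px, focal_px, baseline_m):
    """Triangulation for a rectified camera pair: Z = f * B / d.

    Illustrative values: a 1500 px focal length and a 10 cm baseline
    with a measured 30 px disparity put the object about 5 m away.
    """
    return focal_px * baseline_m / disparity_px
```

With per-pixel distance available, foreground and background can be separated by depth thresholding rather than chroma key – hence no greenscreen.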