International Broadcasting Convention IBC 2014
Spatial movie production – more creative and better value
The days when the camera viewed a scene from a single position are long gone. Nowadays, special effects are in demand, and even more so if they’re in 3D. Researchers in the Spatial-AV project looked at how more creativity can be brought to three-dimensional cinema. Fraunhofer will present the new technological solutions at this year’s International Broadcasting Convention, taking place September 12 to 16 in Amsterdam (Hall 8, Booth B80).
The camera revolves around the main character, who seems to be frozen in the middle of a jump – time seems to stand still for a moment as the camera shows the jumping figure from all sides. What two-dimensional movies can do must not be missing from three-dimensional ones. Of course, the 3D versions have the advantage of being able to pull viewers out of their seats and whisk them away into an alternative fantasy world. But the same principle applies: to make sure that movies bring in enough money, the makers have to keep coming up with new special effects. For three-dimensional films, however, this drives up production costs that are already high to begin with.
3D productions made easy...
What makes 3D production so complex, and therefore costly, is that instead of one camera, the operator must set up and focus two. This is because the left and right eyes see a scene from slightly different angles, and the two cameras imitate this effect. As if that were not demanding enough, the angle of inclination and the distance between the cameras must be adjusted constantly. In the future, camera operators will no longer have to worry about any of this: it will be enough to focus one camera, and everything else will follow automatically. This is made possible by software developed by researchers at the Fraunhofer Institute for Integrated Circuits IIS. "The second camera adopts the focus setting of the first one, and appropriate algorithms ensure that the cameras adjust to one another in an optimum manner," explains Dr. Siegfried Foessel, head of department for Imaging Systems at Fraunhofer IIS. A prototype of the software already exists: the cameras capture 25 frames per second and recalibrate themselves automatically once per second.
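The principle can be sketched roughly as a small control loop: the second camera copies the master camera's focus setting, and the interaxial distance and convergence angle are derived from the focused distance. The rules of thumb and parameter names in the following Python sketch are illustrative assumptions, not the IIS algorithms themselves.

```python
import math
from dataclasses import dataclass

@dataclass
class RigState:
    focus_distance_m: float   # distance the master camera is focused on
    interaxial_mm: float      # spacing between the two lenses
    convergence_deg: float    # toe-in angle of each camera

def follow_master(focus_distance_m: float, max_interaxial_mm: float = 65.0) -> RigState:
    """Derive the slave camera's settings from the master camera's focus distance.

    Rules of thumb used here (assumptions, not the IIS algorithms): the
    interaxial distance is roughly 1/30 of the subject distance, capped at a
    maximum, and both cameras converge on the focused subject.
    """
    interaxial_mm = min(focus_distance_m * 1000.0 / 30.0, max_interaxial_mm)

    # Toe each camera in so that the optical axes cross at the focused distance.
    half_base_m = (interaxial_mm / 1000.0) / 2.0
    convergence_deg = math.degrees(math.atan2(half_base_m, focus_distance_m))

    return RigState(focus_distance_m, interaxial_mm, convergence_deg)

# Example: while shooting at 25 fps, the rig recalibrates once per second as
# the focus pull follows a subject walking toward the camera.
for focus in (4.0, 2.5, 1.2):
    print(follow_master(focus))
```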
"We want to create more opportunities for creativity – for both 2D and 3D productions," says Foessel, "and that includes both picture and sound." The researchers at IIS have therefore joined with their colleagues at the Fraunhofer Institute for Digital Media Technology IDMT, the Fraunhofer Institute for Telecommunications, Heinrich-Hertz-Institut, HHI, and the Fraunhofer Institute for Open Communication Systems FOKUS to look for the right solutions for the film industry. The project will run until the end of 2014 and the final presentation will take place at the IBC trade fair.
Virtual cameras can do more
For more complex special effects, however, two cameras are no longer enough. The IIS researchers have therefore set up a system comprising 16 cameras, which can be expanded even further as required. The trick lies in the software: it uses the 16 camera images to generate depth maps, in which gray tones specify how far the object at each pixel is from the viewer. "We can use this depth map to generate any number of views from the 16 camera views – meaning that we have created a virtual camera, similar to movies that are entirely computer-generated. That gives us a great deal of freedom; we can produce moving shots without having to move the real cameras at all, for example," explains Foessel.
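As a rough sketch of how a depth map turns real views into a virtual one, consider depth-image-based rendering in its simplest form: each pixel is shifted sideways by a disparity that is inversely proportional to its depth. The array layout and disparity rule below are illustrative assumptions, not the IIS multi-camera pipeline.

```python
import numpy as np

def render_virtual_view(image: np.ndarray, depth_m: np.ndarray, baseline_px: float) -> np.ndarray:
    """Warp an H x W x 3 image to a viewpoint shifted sideways by `baseline_px`.

    `depth_m` holds the distance of each pixel in metres (the gray-tone depth
    map converted back to metres); near pixels move further than far ones
    (disparity ~ baseline / depth).
    """
    h, w = depth_m.shape
    virtual = np.zeros_like(image)
    disparity = (baseline_px / np.clip(depth_m, 0.1, None)).astype(int)

    for y in range(h):
        for x in np.argsort(disparity[y]):              # write far pixels first ...
            new_x = min(max(x + disparity[y, x], 0), w - 1)
            virtual[y, new_x] = image[y, x]             # ... so near pixels win collisions
    return virtual

# Disoccluded regions (holes behind foreground objects) stay black here; a real
# pipeline would fill them from the other camera views.
```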
Sound will be three-dimensional
The sound tracks of movies are conventionally recorded and mixed in 5.1 format, meaning that the various sounds are distributed across different channels. Experts refer to this as the channel-based method. If you’ve bagged a good seat in the middle of the movie theater, you will experience spatial sound as intended. But in the seats at the side, the effect is less than perfect: in this case, you will be sitting nearer to one loudspeaker and the sound it produces will therefore seem considerably louder than the others. Wave field synthesis, however – a method realized for the first time by the researchers at the Fraunhofer Institute for Digital Media Technology IDMT – ensures that everyone in the audience hears three-dimensional sound, no matter where they are sitting. "Wave field synthesis is not channel-based, but object-based," explains Dr. Sandra Brix, head of department at IDMT. This means that individual noises, voices, or instruments are recorded as objects in their own right and can be placed as appropriate into the sound scene. This would allow a plane to acoustically "fly over" the audience, for example. To generate this spatial sound impression with individually audible sound objects, a large number of loudspeakers produce an acoustic wave front. This spreads throughout the entire space as a sound field, much like the ripples caused by a stone being dropped in water.
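The object-based idea can be illustrated in a few lines: each sound object carries a position, and the renderer derives a delay and a gain for every loudspeaker from the geometry at playback time. The toy example below stands in for the far more sophisticated wave field synthesis driving functions used at IDMT; the speaker layout and constants are assumptions.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s

def driving_parameters(source_xy, speaker_xy):
    """Delay (seconds) and gain for one loudspeaker reproducing one sound object."""
    distance = math.hypot(source_xy[0] - speaker_xy[0], source_xy[1] - speaker_xy[1])
    delay = distance / SPEED_OF_SOUND      # farther speakers fire later
    gain = 1.0 / max(distance, 0.5)        # simple distance attenuation
    return delay, gain

# A plane "flying over" the audience is just a sound object whose position
# changes over time; the renderer recomputes delay and gain for every speaker.
speakers = [(float(x), 0.0) for x in range(12)]   # assumed line array along one wall
plane_position = (3.0, 8.0)                       # metres, somewhere in the scene

for s in speakers:
    delay, gain = driving_parameters(plane_position, s)
    print(f"speaker at {s}: delay {delay * 1000:.1f} ms, gain {gain:.2f}")
```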
Changes, however, are not just coming to movie theaters; how we experience movies and television on our own sofas is also going to be transformed. During live broadcasts, soccer and concert fans, for example, will be able to freely select the camera perspective, turn full circle, and enjoy a panoramic view of the field and the stands. The "OmniCam360" makes it possible: placed at the edge of the field on the center line, it captures a 360° panoramic view of the whole stadium. "The camera only weighs 15 kg and is no larger than a normal television camera. This enables it to be carried by one person and fixed to a tripod," says Christian Weissig, project manager at the Fraunhofer HHI in Berlin, where the camera was developed. The OmniCam comprises a total of ten cameras, yet there is no need for complex calibration. All you need to remember when operating it is: unpack the camera, plug it in, and start filming. The camera has already proven its mettle in a range of test productions, including with the Vienna Philharmonic and the production of the 2014 FIFA World Cup™ final match. It is now licensed and being marketed.
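Conceptually, a fixed ring of ten cameras makes the geometry static: each camera covers a 36° segment, so assembling the panorama and cutting out a viewer-selected perspective reduces to simple index arithmetic. The sketch below illustrates that idea with assumed resolutions and layout; it is not the OmniCam360 processing chain.

```python
import numpy as np

NUM_CAMERAS = 10
SEGMENT_DEG = 360.0 / NUM_CAMERAS            # angular coverage per camera (36 degrees)

def assemble_panorama(frames: list) -> np.ndarray:
    """Place the per-camera frames (each H x W x 3) side by side.

    Because the mechanical layout is fixed, the mapping from viewing angle to
    camera never changes, which is why no per-shoot calibration is needed.
    """
    assert len(frames) == NUM_CAMERAS
    return np.concatenate(frames, axis=1)    # columns run from 0 to 360 degrees

def extract_view(panorama: np.ndarray, center_deg: float, fov_deg: float = 60.0) -> np.ndarray:
    """Cut a viewer-selected window out of the panorama, wrapping at 360 degrees."""
    h, w, _ = panorama.shape
    px_per_deg = w / 360.0
    start = int(((center_deg - fov_deg / 2.0) % 360.0) * px_per_deg)
    cols = np.arange(start, start + int(fov_deg * px_per_deg)) % w
    return panorama[:, cols]

# Example: ten dummy 720p frames stitched, then a fan "turns" to look at 200 degrees.
frames = [np.zeros((720, 1280, 3), dtype=np.uint8) for _ in range(NUM_CAMERAS)]
view = extract_view(assemble_panorama(frames), center_deg=200.0)
```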
If, however, you would like to produce film material for a dome-shaped screen, additional cameras are needed to point skywards – otherwise a panoramic view will be projected around the edge but there will be a gaping black hole on the ceiling. That’s why the researchers at Fraunhofer FOKUS have developed a special process to merge image streams from individual cameras into a seamless picture in real time. This means that even dome-shaped movie theaters will be able to show live broadcasts in the future.
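One basic building block of such real-time merging is feathering the overlap between neighboring streams so that no seam is visible. The minimal sketch below blends two overlapping frames with linear weights; the overlap width and the weighting scheme are illustrative assumptions rather than the FOKUS method.

```python
import numpy as np

def feather_blend(left: np.ndarray, right: np.ndarray, overlap: int) -> np.ndarray:
    """Join two H x W x 3 frames that share `overlap` columns at the seam."""
    weights = np.linspace(1.0, 0.0, overlap)[None, :, None]   # fade the left frame out
    seam = left[:, -overlap:] * weights + right[:, :overlap] * (1.0 - weights)
    return np.concatenate(
        [left[:, :-overlap], seam.astype(left.dtype), right[:, overlap:]], axis=1
    )

# Applied per frame to each pair of neighboring cameras, this yields one
# seamless picture per video frame.
```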