Video Keeps Lord of the Rings On Track
By Mel Duvall | Posted 2006-01-14

Lord of the Rings director Peter Jackson needed to view what seven film crews were shooting. His solution: push the envelope on videoconferencing technology.
The filming of The Lord of the Rings was groundbreaking in many respects. It was the first time in cinematic history that three films were shot simultaneously. It involved 18 months of filming and another four years of post-production work, including editing and the creation of the film's many digital effects. More than 20,000 extras were cast to play Orcs, goblins, elves and the other inhabitants of Middle-earth, and more than 150 locations throughout the New Zealand countryside would be used in shooting.
Nimmo's involvement with Jackson and the Rings trilogy began in 1999. He was working for Hewlett-Packard as a troubleshooter, and had been called to the newly established production office for the Rings project to assist with the computer setup. While on site, Nimmo discovered that Jackson was searching for a technology specialist who could devise a system to help him both "see" the filming taking place at remote locations and create a computer network that could support the mobile crews.
It was an adventure too grand for Nimmo to pass up. He quit his job with HP and formed 3Foot6, a reference to a Hobbit's height, to take on the communications challenge.
After evaluating options, including fledgling personal computer-based Web camera systems, Nimmo concluded that business-grade videoconferencing systems were the most viable option for creating the communications system Jackson required. They could not only bring together the number of locations Jackson needed to observe simultaneously, but also offered the best picture quality. In 1999, palm-sized Web cameras could at best provide postcard-size video at about 10 frames per second, and were plagued by jumpy pictures and scattered sound due to packet loss over the Internet. Videoconferencing systems, by comparison, could deliver full-screen, 30-frame-per-second, television-quality pictures.
Nimmo then teamed with Asnet Technologies Ltd., an Auckland-based distributor of Polycom videoconferencing systems, including its flagship ViewStation units, to pull the systems together. Asnet general manager Chris Stewart says the team's highest hurdle came in the first six months of the project.
For logistical and bandwidth reasons, the team chose to use satellite communications from Telecom New Zealand. Some filming would be done in remote mountainous regions, without access to phone or Internet lines. The only practical solution was satellite.
The team also sought Internet Protocol (IP) as the transmission format, because the satellite link would be used to send e-mail, digital images and other forms of data such as scene directions, to and from the remote sites. The satellite link Nimmo's crews would establish at each remote location would, in fact, also serve as the main computer network. "We wanted to have one piece of pipe that could bundle video, audio and data, and in order to get the necessary bandwidth, it meant we had to go with IP," Stewart says.
Therein lay the problem.
Polycom's videoconferencing units were programmed to operate over Integrated Services Digital Network (ISDN) telephone circuits, essentially dedicated phone lines, and not the public Internet. In 1999, Polycom was about to introduce Internet-based units, but the systems were still in beta. Working with Polycom's engineers in Austin, Texas, Asnet was able to obtain a beta unit for testing and spent three months working with those engineers to debug the system. Video-over-IP was in its infancy, and there were still challenges in maintaining packet integrity over the Internet that could lead to transmission breakups. The engineers were able to compensate for packet loss by introducing a technique known as error concealment: lost or damaged packets are essentially reconstructed by using, or "borrowing," data from nearby frames.
Eventually they were able to fine-tune the software, using the error concealment technique, to the point where the team felt confident enough to deploy the units in the field.
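The article does not describe how Polycom's engineers implemented error concealment, but the general idea of "borrowing" data from nearby frames can be illustrated with a minimal sketch. The example below assumes a simple temporal-concealment strategy (copying the co-located block from the last good frame) and uses hypothetical frame data; it is not Polycom's actual algorithm.

```python
import numpy as np

def conceal_lost_blocks(frame, prev_frame, lost_mask, block=16):
    """Replace lost 16x16 blocks in `frame` with the co-located
    blocks from the previous decoded frame (temporal concealment)."""
    out = frame.copy()
    h, w = frame.shape[:2]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            if lost_mask[by // block, bx // block]:
                out[by:by + block, bx:bx + block] = \
                    prev_frame[by:by + block, bx:bx + block]
    return out

# Hypothetical demo: a 64x64 grayscale frame with one block lost in transit.
prev = np.full((64, 64), 100, dtype=np.uint8)   # last good frame
cur = np.full((64, 64), 120, dtype=np.uint8)    # current frame
cur[16:32, 16:32] = 0                           # block damaged by packet loss
mask = np.zeros((4, 4), dtype=bool)
mask[1, 1] = True                               # mark that block as lost
repaired = conceal_lost_blocks(cur, prev, mask)
# the damaged region now carries the previous frame's pixels
```

Because adjacent video frames are usually very similar, a borrowed block is far less visible to the viewer than a black or garbled one, which is why the technique masks transmission breakups so effectively.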
During a typical day on the set, Nimmo and his videoconferencing crews would wake at around 2 a.m. to start setting up the videoconferencing units and satellite linkups for a day of shooting. Establishing the satellite link often proved the most difficult part of the operation. The film was shot in rugged terrain, sometimes in the shadows of mountains or deep in valleys, and the satellite dishes needed a clear line of sight westward toward a geostationary satellite over the Tasman Sea. That meant Nimmo would have to find a location where the 10-foot-wide satellite dish would have an unimpeded sight line west, then run military-grade fiber-optic cable, which is about five times the diameter and weight of normal cable, to where a scene was being filmed.
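The pointing problem the crews faced each morning can be quantified with a standard geostationary look-angle calculation: given the station's latitude and longitude and the satellite's longitude, compute the azimuth and elevation the dish must point at, then check whether terrain rises above that elevation along that bearing. The sketch below uses a spherical-Earth model and entirely hypothetical coordinates (a station at 41°S, 175°E and a satellite at 160°E); none of these figures come from the article.

```python
import math

R_EARTH = 6371.0   # mean Earth radius, km (spherical approximation)
R_GEO = 42164.0    # geostationary orbit radius, km

def look_angles(lat_deg, lon_deg, sat_lon_deg):
    """Return (azimuth, elevation) in degrees from a ground station
    to a geostationary satellite, using a spherical-Earth model."""
    lat, lon = math.radians(lat_deg), math.radians(lon_deg)
    slon = math.radians(sat_lon_deg)
    # Earth-centered positions of station and satellite (km)
    sx = R_EARTH * math.cos(lat) * math.cos(lon)
    sy = R_EARTH * math.cos(lat) * math.sin(lon)
    sz = R_EARTH * math.sin(lat)
    gx, gy, gz = R_GEO * math.cos(slon), R_GEO * math.sin(slon), 0.0
    dx, dy, dz = gx - sx, gy - sy, gz - sz
    # Project the station-to-satellite vector onto the local
    # East/North/Up basis at the station
    east = -math.sin(lon) * dx + math.cos(lon) * dy
    north = (-math.sin(lat) * math.cos(lon) * dx
             - math.sin(lat) * math.sin(lon) * dy
             + math.cos(lat) * dz)
    up = (math.cos(lat) * math.cos(lon) * dx
          + math.cos(lat) * math.sin(lon) * dy
          + math.sin(lat) * dz)
    rng = math.sqrt(dx * dx + dy * dy + dz * dz)
    elevation = math.degrees(math.asin(up / rng))
    azimuth = math.degrees(math.atan2(east, north)) % 360.0
    return azimuth, elevation

# Hypothetical example: station at 41 S, 175 E; satellite at 160 E.
az, el = look_angles(-41.0, 175.0, 160.0)
# Any ridge or peak rising above `el` degrees along bearing `az`
# blocks the link, which is why deep valleys were troublesome.
```

From a New Zealand latitude, a satellite to the west over the Tasman Sea sits at a moderate elevation, so even modest ridgelines along the sight line could force the crews to relocate the dish and run cable back to the set.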