Hollywood Confronts the Digital Revolution
There's a wonderful article in the August 11 edition of The Wall Street Journal called "Engineering Blue Skies" (you need an online subscription to read it on The Wall Street Journal's website, but a quick Google search will often turn up articles posted on secondary websites without charge).
The article covers a new trend in Hollywood—that of converting feature films from silver halide to digital, in order to use digital tools to improve the images, and then converting them back to film for theater viewing. For anyone in the printing industry who has lived through the shift to digital imaging, the article is a touch of nostalgia.
As they say, it's déjà vu all over again. We've lived through all of the challenges of the move from analog to digital. Why is it that Hollywood, with its infinitely greater financial resources, didn't call upon a few printers and prepress experts to learn exactly what it could expect? I don't know whether to laugh or cry over the ignorance and naiveté expressed by those interviewed in the article (so I'm doing both!).
The process discussed in the article creates a "digital intermediate," reflecting the fact that there's good old film on both ends of the process ("the medium still generally considered to provide the most reliable quality"). Needless to say, Eastman Kodak plays in this part of the business too, trying to protect its film market while semi-embracing the future. A Kodak subsidiary, Cinesite, is a leading vendor in the digital intermediate market.
The digital intermediate, the article explains, "allows movie makers to reap the benefits of digital technology without using digital cameras, which are still resisted by many makers of big-budget movies."
As was the case for printers and prepress shops a decade ago, Hollywood's problem starts with digital video itself. Hollywood has the same paranoia we used to have about digital photography—it can't possibly be as good as film! The article points out that "not only are directors and cinematographers used to working with film and, in some cases, reluctant to give it up, but the technology for distributing and projecting films digitally is still being developed. Movie distributors and exhibitors haven't yet finalized technology standards. . ."
Ah yes, been there, done that.
The most touching paragraph in the article concerns that old bugaboo, resolution. Gosh, they should have dropped by the IPA technical conferences over the last 20 years if they wanted to find out more than they could ever digest about that subject! The article informs us that "film normally has a resolution of about 4K (4,100 pixels across by 3,000 deep). Most post-production houses convert that film into a computer file by scanning it at a resolution of 2K (2,048 pixels wide by 1,556 deep)." Eeek, half as much (actually, much less than that, but obviously no one is really counting). The article continues, "the lost resolution can't be regained when the file is converted back into film."
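(Run the article's own numbers, by the way: 4,100 by 3,000 is roughly 12.3 million pixels, while 2,048 by 1,556 is about 3.2 million. That's closer to a quarter of the original than a half.)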
Oh my gosh, you mean it's like lost forever?! That sounds tragic.
But no, apparently not tragic. Most people will never notice the difference. The article admits that "while the difference may not be visible to most viewers in a normal theater, it would be obvious in some IMAX theaters."
Yes, the opinions in the printing industry were always shaped by an elite old guard, too. Folks who, with all due respect, really could see the difference between a drum scan and a flatbed scan. But we soon enough learned that most customers couldn't see a difference, and that buyers were decreasingly willing to pay a significant premium to satisfy the tiny minority who could.
And so, in Hollywood, a digital trend is finally emerging: "Almost a hundred films using digital intermediate will appear in movie theaters this year, compared with about a dozen last year," the article points out. Many of this past summer's blockbusters used the technology, including "Pirates of the Caribbean," "Terminator 3," "Seabiscuit" and "S.W.A.T."
But even once you address the issue of "good enough" digital resolution, the detractors throw up yet another barrier: "Still, there are questions about how well digital intermediates will age when compared with film," the article suggests.
"That's a controversy," says Curtis Clark of the American Society of Cinematographers. "The issue is that as technology changes, if you store on some format, will that format become obsolete?"
Please give us a break! Film lasting longer than digital formats? Does anyone really think that it will be impossible to decode JPEG or MPEG a generation from now? Digital bits never degrade, and the same neutron bomb that would destroy all knowledge of how to decode an MPEG movie would quickly melt all of the celluloid in Hollywood.
To laugh or to cry? You decide. I think it's reassuring to realize—perhaps just a little too late—that even Hollywood's most sophisticated filmmakers can get as spooked by new technology as we used to be, half a generation ago.
—Thad McIlroy
About the Author
Thad McIlroy is an electronic publishing consultant and analyst, based at Arcadia House in San Francisco. He welcomes your comments at thad@arcadiahouse.com.