Film and digital can't simply be compared using math. There's much more to the images than numbers.
By Rob Sheppard
Which picture is from film? Does it matter here? Not really. Both photos above work just fine on the printed page. The Minnesota forest scene (top) is from Kodachrome, while the Great Smokies stream (bottom) comes from a 7.1-megapixel Olympus EVOLT E-330. Both images make excellent large prints. You can’t compare film and digital simply by arbitrary use of numbers—the technologies are just too different.
About 15 years ago, digital imaging started to capture the attention of the press, including photography magazines. At that time, a lot of this was gee-whiz stuff, and most photographers saw it mainly as a curiosity or something that might work for scientists and other specialized uses. Many pundits made rash pronouncements about the technology, using all sorts of techniques to compare film and digital, but they nearly all came to the conclusion that it would be a very long time, if ever, before digital image capture could match film. They also said that film would be around for a very long time. Obviously, that has turned out to be wrong.
But the idea of comparing film and digital capture remains. The problem is that the two technologies are compared by math rather than reality, so the comparisons often are dead wrong, which has caused problems for some pros.
The most common evaluation compares 35mm film with digital images. Some people still do this to "prove" the superiority of film over digital, even though no actual tests are done. They use all sorts of formulas to compare film and digital mathematically. Usually, the result is that for digital to equal 35mm film, something around 22 megapixels is needed, though I’ve seen comparisons that go as high as the 30-megapixel range.
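To see where numbers like these come from, here's a minimal sketch of the kind of formula the pundits used. It assumes, purely for illustration, that 35mm film resolves about 80 line pairs per millimeter and that a digital sensor needs two pixels per line pair (the Nyquist criterion) to match it; both figures are assumptions, not measurements.

```python
# A sketch of the classic film-vs-digital megapixel formula:
# sample a 36 x 24mm frame at two pixels per line pair.

def film_equivalent_megapixels(lp_per_mm: float) -> float:
    """Megapixels needed to sample a 36x24mm frame at the given lp/mm."""
    px_per_mm = 2 * lp_per_mm           # Nyquist: two pixels per line pair
    width_px = 36 * px_per_mm           # 35mm frame is 36mm wide...
    height_px = 24 * px_per_mm          # ...and 24mm tall
    return width_px * height_px / 1_000_000

print(round(film_equivalent_megapixels(80), 1))   # ~22 megapixels
```

Push the assumed resolving power toward 95 lp/mm and the answer climbs past 30 megapixels, which is exactly why these paper comparisons varied so widely: every input is an assumption.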
Anyone in the publishing industry who has used a lot of digital images over the past few years quickly discovered this simply wasn’t true. No one did any math, but the results were obvious. You could enlarge digital images onto the printed page much more easily than film, and a 6-megapixel camera would match 35mm except at the largest sizes. Of course, size was only one element of the math formulas, so this gave just a hint of what digital might or might not do.
Personally, I’ve worked with several photographers who started making large prints, 16x24 to 24x36 inches, with stunning results. Even though I had been following the photo industry for quite some time, I was amazed and impressed by the quality of these prints, which came from...6-megapixel cameras. This wasn’t supposed to be possible. Yet here at the OP offices, some of these photos were put up on the walls next to fine prints from 35mm film and, as a whole, they looked very close to the film prints.
Now cameras have gained better sensors, not simply with more megapixels, but with better noise characteristics, better tonal rendition, better color and more. These advances are translating into amazing prints. I’ve seen incredible 3x5-foot prints from 10- to 12-megapixel cameras. These prints come from images shot with careful attention to image quality: shooting from a tripod, proper exposure, careful RAW processing and low ISO settings. They exhibit image quality above and beyond what we ever expected from 35mm film.
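The arithmetic behind those big prints shows why megapixel counts alone mislead. Taking a hypothetical 12-megapixel camera with a roughly 4256-pixel long edge (the exact dimensions vary by model), a 5-foot print works out to a pixels-per-inch figure well below the 300 ppi often cited for fine prints, yet at normal viewing distances the prints still hold up:

```python
# Rough print-resolution math for a big enlargement.
# 4256 px is a hypothetical long-edge dimension for a ~12 MP camera.

def print_ppi(pixels_long_edge: int, print_long_edge_inches: float) -> float:
    """Pixels per inch along the print's long edge."""
    return pixels_long_edge / print_long_edge_inches

print(round(print_ppi(4256, 60)))   # 5-foot edge -> roughly 71 ppi
```

By the numbers, 71 ppi should look terrible; in practice, clean pixels and sensible viewing distance make it work.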
Recently, I scanned some old slides for a book project because I needed the subject matter in them. These were from Kodachrome and Velvia, and were shot using the best technique I could. I used to think some of those images were terrific shots from a quality standpoint. Yet, when I scanned them at high resolutions, I was disappointed. They didn’t match my digital work in sharpness or grain/noise.
So what’s going on here? How could the math used to compare film and digital be so wrong? A number of factors are involved, not the least being noise/grain: pixels simply capture a cleaner, higher-quality image than the light-sensitive grains of film.