Photography is an art form that tends to push the limits of technology. Looking back over the last 25 years, we can see that many of the greatest innovations have served to define our perception of photography. Moving forward, these and other developments may just change what it means to be a photographer.
Autofocus
Twenty-five years ago, in 1985, Minolta launched the first SLR camera system with integrated autofocus. Autofocus had been available prior to that—cameras featuring autofocus were developed in the late 1970s—but 1985 marked the beginning of the wide availability of a feature many photographers take for granted.
To be sure, many photographers were slow to embrace autofocus, and some still prefer the manual approach. But there's no question autofocus has had a huge impact on the way photographers approach a subject, and it has greatly improved the odds of capturing sharp photos. This is especially true for moving subjects, but even for static landscapes there's a benefit to employing autofocus, which over the years has evolved to offer incredible precision, speed and accuracy.
First Digital SLR
In 1991, Kodak released the very first digital SLR, the DCS-100. I don’t think very many photographers quite realized the revolution that was in store, or just how quickly that revolution would take hold. That first digital SLR offered a resolution of “only” 1.3 megapixels, but it still marked a huge turning point in photography that reverberates to this day.
Many parts work together in a modern digital camera, and at the core is the image sensor. Within the sensor there's a variety of high-tech components. To make digital cameras viable for photographers, the resolution of that sensor needs to be adequate to ensure fine detail and relatively large output sizes. The challenge is that for a given sensor size, increasing resolution requires a reduction in the size of individual photosites on the sensor. Smaller photosites mean less light gathered per pixel and, thus, a need for more amplification of the signal, which results in more noise.
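The resolution-versus-photosite-size trade-off is simple geometry, and a rough calculation makes it concrete. The sketch below assumes an illustrative full-frame (36x24mm) sensor and made-up resolutions; the numbers are not specs for any real camera.

```python
# Sketch: for a fixed sensor width, higher resolution means smaller photosites.
# The 36 mm width and the pixel counts below are illustrative assumptions.

def pixel_pitch_um(sensor_width_mm, horizontal_pixels):
    """Approximate center-to-center photosite spacing, in microns."""
    return sensor_width_mm * 1000 / horizontal_pixels

for megapixels, h_pixels in [(6, 3000), (12, 4250), (24, 6000)]:
    pitch = pixel_pitch_um(36, h_pixels)
    # Light gathered scales roughly with photosite area (pitch squared).
    print(f"{megapixels} MP: pitch ~{pitch:.1f} um, relative area ~{pitch**2:.0f}")
```

Because the light a photosite collects scales roughly with its area, quadrupling the pixel count on the same sensor leaves each photosite with about a quarter of the light, which is why more amplification, and more noise, follows.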
Many technological advancements have contributed to the ability of modern image sensors to offer increasingly high resolution while reducing noise levels. One of the key pieces of technology when it comes to reducing noise is the microlens array used in CMOS image sensors, first used around the year 2000. A microlens array consists of a collection of tiny lenses, with (generally) one lens for every individual photosite. This focuses the light striking each photosite so that as much of that light as possible strikes the photodiode rather than the additional components around each photodiode. Without microlens arrays, we wouldn’t be able to capture such high resolutions with relatively low noise levels.
Most image sensors only gather one color value for each pixel, using a colored filter in front of each photosite so that each pixel records either red, green or blue light. Postprocessing, either in-camera or using RAW conversion software, calculates the missing values in order to determine the actual color for each pixel.
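The interpolation step described above can be sketched in a few lines. This is a deliberately naive illustration, averaging same-color neighbors in an assumed RGGB filter layout over a made-up 4x4 mosaic; real RAW converters use far more sophisticated algorithms.

```python
# Minimal demosaicing sketch: each photosite records one color value, and the
# two missing channels are estimated from neighboring photosites.
# The RGGB layout and the RAW values are illustrative assumptions.

RAW = [  # one brightness value per photosite
    [120,  60, 118,  62],
    [ 58,  30,  61,  29],
    [119,  59, 121,  63],
    [ 57,  31,  60,  28],
]

def bayer_color(x, y):
    """Which color an RGGB filter records at (x, y)."""
    if y % 2 == 0:
        return "R" if x % 2 == 0 else "G"
    return "G" if x % 2 == 0 else "B"

def demosaic_pixel(x, y):
    """Estimate full RGB at (x, y) by averaging same-color values in the 3x3 neighborhood."""
    rgb = {}
    for c in "RGB":
        samples = [RAW[j][i]
                   for j in range(max(0, y - 1), min(4, y + 2))
                   for i in range(max(0, x - 1), min(4, x + 2))
                   if bayer_color(i, j) == c]
        rgb[c] = sum(samples) / len(samples)
    return rgb

# (1, 1) is a blue photosite: its R and G values are interpolated from neighbors.
print(demosaic_pixel(1, 1))  # {'R': 119.5, 'G': 59.5, 'B': 30.0}
```

The key point is that two of every pixel's three color values are estimates, which is why demosaicing quality matters so much to the final image.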
Full-Color Capture
In 2002, Foveon (acquired by Sigma Corporation in 2008) released a full-color sensor. This unique sensor has three layers of photosites, much the way color film captures light with multiple light-sensitive layers. The result is an ability to capture color with greater fidelity and finer detail. While a relatively small percentage of digital cameras utilize the Foveon sensor technology, it still represents an important innovation. Canon and Nikon also have patents related to full-color sensors, suggesting potential developments on this front in the future.
Self-Cleaning Sensors
One of the tremendous advantages of today's DSLR systems (and their film-based predecessors) is the ability to change lenses quickly and easily. The trouble is, whenever you change lenses, you're leaving the image sensor exposed to the elements, introducing the potential for dust, moisture or other contaminants to come in contact with it. That can result in dust spots or other blemishes that will appear in every single image you capture until the sensor is cleaned.
Olympus introduced the first DSLR with self-cleaning technology for the image sensor in 2003, and now many camera models include a self-cleaning feature. In most cases, this consists of a high-frequency vibration that literally shakes any dust or other contaminants from the sensor, with an adhesive area that collects the dust that’s shaken loose. While this won’t eliminate all contaminants from the image sensor, it greatly reduces the dust spots and other blemishes you’d find in your images otherwise.
HD Video
For many years, still images and motion pictures required completely different equipment. Video also developed something of a bad reputation among photographers because, when it was available in a digital still camera at all, it was generally captured at very low resolution, producing footage of marginal quality at best.
In 2008, the Nikon D90 became the first DSLR to capture high-definition (HD) video, and now it seems you’d have a difficult time finding a digital SLR that doesn’t capture HD video. This opens up the potential for photographers who may have focused exclusively on still photographs to include motion in their images. Whether that translates into including motion in a single frame that otherwise could have been a still capture or producing your own short film, the ability to capture high-quality HD video with tremendous lens flexibility represents a huge innovation.
Fixed Pellicle Mirror
If you ask a photographer what's the defining feature of an SLR camera, I suspect interchangeable lenses would be the top answer. Very close behind would be a mirror that swings up and out of the way so the image can be captured. That mirror movement has a number of negative consequences, including camera vibration that can affect image sharpness and the inability to see the scene through the viewfinder during exposure.
Sony has recently addressed these issues by releasing two new digital cameras that make use of a pellicle mirror. A pellicle mirror (which had appeared in some film cameras previously, but never in a digital camera) splits the light projected by the lens into two streams. One is reflected upward toward the viewfinder, and the other passes through the mirror to the image sensor. As a result, there's no need for the mirror to move in order to capture an image.
This eliminates camera vibration caused by mirror movement, lets you view the scene through the viewfinder during exposure (even when capturing video) and makes faster continuous-shooting rates possible, among other benefits. While the fixed mirror makes sensor cleaning a bit more challenging and causes some loss of light, the potential benefits are huge for photographers.
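The "some loss of light" is easy to put in familiar terms. The 70/30 split below is an assumed, illustrative ratio, not a published figure for any specific camera, but it shows how a fixed transmission fraction translates into stops of exposure.

```python
# Back-of-the-envelope sketch of the pellicle mirror's exposure cost.
# The 0.70 transmission fraction is an assumption for illustration only.
import math

def stops_lost(fraction_to_sensor):
    """Exposure loss in stops when only this fraction of the light reaches the sensor."""
    return math.log2(1 / fraction_to_sensor)

print(f"{stops_lost(0.70):.2f} stops")  # roughly half a stop
```

Under that assumption, the sensor gives up about half a stop of light, the kind of cost photographers would weigh against the vibration-free, always-visible benefits described above.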
Tim Grey has authored over a dozen books on digital photography and imaging for photographers, including the new Real World Digital Photography, 3rd Edition. He publishes the Digital Darkroom Quarterly print newsletter and the Ask Tim Grey eNewsletter. Details can be found at www.timgrey.com.