Tuesday, March 26, 2013

Seeing The Color

Override your DSLR’s auto features to photograph colors that your eye may not detect

Glow Worm Tunnel, Blue Mountains, Australia. Bill Hatcher manually set his DSLR's white balance to capture the effect of the green-filtered light.

I've been wondering whether the amazing automation in today's cameras has any secondary effects, especially the auto white balance setting. Have you experimented with your camera's white balance, say, by switching from auto to one of the presets, or, better still, to manual control with the color temperature (K, or Kelvin) settings? You can adjust white balance in the camera or, if you shoot RAW files, during post-processing on your computer. Why would you want to override an auto function that seems to work just fine? The answer: to better understand the color produced by different light sources. It may also keep you from missing some wild colors out there that you'd otherwise never notice.
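Under the hood, a white balance preset amounts to scaling the red, green and blue channels by different amounts. Here's a rough sketch of the idea; the preset names mirror common camera menus, but the gain values are assumptions for illustration only, since every camera uses its own calibration:

```python
# Hypothetical per-channel (R, G, B) gains for a few white balance presets.
# Real cameras derive these from calibration data; these numbers are
# illustrative assumptions, not any manufacturer's values.
WB_PRESETS = {
    "daylight": (1.0, 1.0, 1.0),
    "tungsten": (0.6, 1.0, 1.8),   # cools the orange cast of indoor bulbs
    "shade":    (1.4, 1.0, 0.7),   # warms the blue cast of open shade
}

def apply_white_balance(rgb, preset):
    """Scale each channel of an (R, G, B) pixel by the preset's gains."""
    gains = WB_PRESETS[preset]
    return tuple(min(255, round(c * g)) for c, g in zip(rgb, gains))

# A neutral gray pixel shifts warm under the "shade" preset:
neutral_gray = (128, 128, 128)
print(apply_white_balance(neutral_gray, "shade"))
```

This is why shooting RAW is so forgiving: the channel scaling hasn't been baked in yet, so you can pick a different preset, or a specific Kelvin value, after the fact.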

One of the marvels of digital cameras is the auto white balance setting. This control enables the camera to render whites in a photo as white, with no orange, green or blue cast from varying light sources. The human eye makes these corrections so subtly that we're not even aware of it, and auto white balance is calibrated so the camera interprets the color spectrum of a scene much as the eye does. Switching the white balance to one of the presets, like shade, flash or cloudy, overrides how your camera interprets color.
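One common way to approximate what auto white balance does is the "gray-world" assumption: the average color of a scene should come out neutral gray, so each channel is scaled until it does. This is only a sketch of one simple algorithm; actual cameras use far more sophisticated scene analysis:

```python
# "Gray-world" auto white balance: scale each channel so its average
# matches the overall average, pushing the scene's mean color to gray.
# A simplified illustration, not any specific camera's algorithm.
def gray_world_white_balance(pixels):
    """pixels: list of (r, g, b) tuples, values 0-255.
    Returns a white-balanced list of the same length."""
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3.0
    gains = [gray / m if m else 1.0 for m in means]
    return [tuple(min(255.0, p[c] * gains[c]) for c in range(3))
            for p in pixels]

# Two pixels with a warm (orange) cast: red high, blue low.
warm_patch = [(200, 150, 100), (180, 140, 90)]
balanced = gray_world_white_balance(warm_patch)
```

After balancing, the red, green and blue averages of the patch are equal, which is exactly why a preset override is needed when you *want* to keep an unusual cast: left to itself, the automation neutralizes it.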

Way back in the days of film (about 10 years ago), cameras didn't have a white balance setting, and film was balanced to record only one color temperature accurately, for example, daylight. To correct for color shifts caused by the different light we might encounter on a shoot, like the warm orange glow of a candle or the blue cast of open shade, we used filters. We learned about filters in photo school or by observation and the self-taught method of trial and error.

Today's auto white balance is excellent and can compensate for different light sources as quickly as our own eyes. This is both good and bad. The good is that 98% of the time the color balance in a scene is rendered as naturally as possible, and it's instant; we don't have to mess around with a bunch of filters anymore. What's bad is that our photo eye is no longer trained to detect errant colors in a potential scene. That could mean missing the chance to capture a seriously interesting scene because, to your naked eye, it registers as normal, and your camera's auto settings treat it the same way.

In the outdoors, I've come across many examples of unreal colors that auto white balance would render neutral and muted. Most of my early learning about detecting these nearly unseen colors came while I was shooting film: capturing the surreal red glow of Arizona's slot canyons, or the jade-green ice of Alaska's glaciers. These experiences and many others helped me refine my photo eye for seeing color. Today, it's almost second nature that I look to see where the light is coming from. Is it passing through or reflecting off something that has changed the color spectrum of the light source? A quick look around will tell me.
