iPhone 5S and More About User Problems

Apple's announcement of the iPhone 5S yesterday once again brought up my point about solving user problems. One photographic solution in the new iPhone really stood out to me, because we photographers have been fighting the underlying problem for decades (in some cases lifetimes): mixed light.

The Japanese companies have been putting out flash units with higher-than-daylight Kelvin temperatures for quite some time. A brand new Nikon Speedlight tends to be close to 6000K, for example, though it slowly degrades towards 5600K or so. Yet most of the places you'd use a flash have ambient light that is far lower than that, sometimes even lower than 3200K. Thus, if you use flash for almost anything other than daylight fill, you're producing a mixed light situation. The camera companies' response to the problem this creates for a photographer? "Just use filters." Okay, which filter? And do I really need to carry a filter for every possible situation? How do I even know what light temperature I'm dealing with? From the Japanese companies: silence.

Apple's answer is simple: the 5S's True Tone Flash uses two lights, one warm, one cool. The camera first measures the color of the ambient light (akin to doing an Auto white balance), then mixes the two flash outputs to match the color of the existing light. User problem (mostly) solved. The Japanese designers are all doing a "d'oh!" face slap right now. How long before we see two-headed flashes with different color outputs?
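
Out of curiosity, here's what that mixing boils down to. To be clear, Apple hasn't published how True Tone Flash works internally; the LED temperatures and the simple mired-space interpolation below are my own assumptions, sketched out purely for illustration:

    # Toy sketch of dual-LED flash mixing, NOT Apple's actual algorithm.
    # The LED color temperatures here are assumptions for illustration.

    WARM_K = 2700.0   # assumed warm LED color temperature
    COOL_K = 5500.0   # assumed cool LED color temperature

    def mired(kelvin):
        # Mireds (1e6 / K) are the standard scale for blending color
        # temperatures; equal mired steps look roughly perceptually even.
        return 1e6 / kelvin

    def mix_weights(ambient_k):
        # Return (warm, cool) drive levels summing to 1.0 whose blend
        # approximates the measured ambient color temperature.
        ambient_k = max(WARM_K, min(COOL_K, ambient_k))  # clamp to LED range
        m_warm, m_cool = mired(WARM_K), mired(COOL_K)
        cool_share = (m_warm - mired(ambient_k)) / (m_warm - m_cool)
        return (1.0 - cool_share, cool_share)

    # A tungsten-lit room (~3000K) should lean heavily on the warm LED:
    print(mix_weights(3000.0))   # -> roughly (0.80, 0.20)

Note the clamp: a single warm/cool pair can only match light that falls between its two endpoints, which is presumably part of why this is a "mostly" rather than a fully solved problem.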

Some of Apple's moves on the 5S are reminiscent of features that have come along before. For instance, the Auto Image Stabilization is a lot like Nikon's old BSS (Best Shot Selector): the camera takes multiple exposures in order to produce one good shot. But unlike BSS, which just selects what it thinks is the best image of the bunch, Apple actually looks for any motion blur in the shot and uses the multiple images to attempt to remove that blur.
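
Neither Nikon nor Apple documents that pipeline, but the BSS-style selection half is simple enough to sketch. The variance-of-Laplacian score below is just a common stand-in for sharpness, not anything either company has confirmed using:

    # BSS-style "keep the sharpest frame" sketch. Neither Nikon's BSS
    # scoring nor Apple's blur-removal pipeline is public; variance of a
    # Laplacian response is merely a common, simple sharpness proxy.

    def sharpness(gray):
        # gray: a grayscale image as a list of rows of 0-255 pixel values.
        # Motion blur flattens edges, so the Laplacian (a second-derivative
        # edge detector) responds weakly and its variance drops.
        h, w = len(gray), len(gray[0])
        responses = []
        for y in range(1, h - 1):
            for x in range(1, w - 1):
                lap = (gray[y - 1][x] + gray[y + 1][x] +
                       gray[y][x - 1] + gray[y][x + 1] - 4 * gray[y][x])
                responses.append(lap)
        mean = sum(responses) / len(responses)
        return sum((r - mean) ** 2 for r in responses) / len(responses)

    def best_shot(frames):
        # The BSS half: score every frame in the burst, keep the winner.
        return max(frames, key=sharpness)

Apple's extra step, actually fusing the burst to remove blur, is considerably harder, which is probably why we only saw frame selection from the camera makers for so long.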

Even in the video realm Apple is pushing boundaries on interesting user problems. For example, the iPhone 5S will shoot 720P/120. But it's not the 120 fps that's interesting, it's that you can select variable playback, so that part of the scene plays at 120 fps (normal motion) and part plays at 30 fps (4x slow motion).
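
The retiming itself is just frame bookkeeping. Here's a sketch, with made-up names and a hard-coded slow-motion window, of how a player might map a 30 fps output timeline onto the 120 fps source; Apple's editor internals aren't public, so treat this as illustration only:

    # Frame-mapping sketch for variable-speed playback of a 120 fps clip.
    # On a 30 fps output timeline, step 4 source frames at a time for
    # real-time sections and 1 at a time inside the slow-motion window.

    CAPTURE_FPS = 120
    PLAYBACK_FPS = 30
    STEP = CAPTURE_FPS // PLAYBACK_FPS   # 4 source frames per output frame

    def playback_order(total_frames, slow_start, slow_end):
        # Yield source-frame indices for the output timeline. Frames in
        # [slow_start, slow_end) play at 1/4 speed; the rest in real time.
        src = 0
        while src < total_frames:
            yield src
            src += 1 if slow_start <= src < slow_end else STEP

    # One second of capture (120 frames) with frames 40-79 slowed down:
    frames = list(playback_order(120, 40, 80))
    print(len(frames))   # -> 60 output frames, i.e. 2 seconds at 30 fps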

Apple's Jony Ive talks about design that "enhances the user experience." That should be the mantra of any designer, I think: what the user gets should be better in meaningful ways, not just a mindless repeat of what's been done in the past.

Not to say that the iPhone 5S's camera is the be-all, end-all. It clearly isn't. But every time the smart smartphone makers iterate camera features these days, they're coming up with solutions to user photographic problems that the Japanese haven't solved in decades' worth of iteration. Then they wonder why their compact camera business is disappearing.

text and images © Thom Hogan 2013 -- all rights reserved // @bythom on twitter, hashtags #bythom, #gearophile