Gather round the campfire, folks, 'cause I'm about to tell the story of the way the ancestors of the Imaganuit tribe created photos in the not too distant past. After using their cameras, the respected elders all disappeared into their darkhuts and a long time later would come forth with finished images of astonishing beauty, often bigger than their arms could stretch. The rest of us all went to the autohut to drop off the pellets containing our work, and an hour later we would return to find small replicas of our images that were all that we really needed.
Okay, enough channeling the storyteller for the moment, as I don't want anyone to miss my point. Let's fast forward to today. Are we at the same place today as we were with film? I think we are, and that's part of the problem the camera industry is facing.
- Film: automated processors developed and printed most images, typically with negative film (which had wide latitude for exposure errors). A dedicated few either had their own darkroom to develop and print in, or used high-end labs to do the same for them. The primary output, however, was predominantly 4x6 prints.
- Digital: automated JPEG engines demosaic and codify most images. A dedicated few shoot raw and control the demosaic and pixel-creating process much more carefully. The primary output, however, is modest-sized images distributed via the Internet (email, Facebook, Twitter, etc.).
Notice the parallels? I thought so. Only the JPEG engines in cameras aren't yet sending you notices with your finished images that say things like "This image was underexposed and while we've done what we could to fix it, it would be higher quality if you set the proper exposure" or "It does not appear that you have focused on a subject."
I'm getting very close to the point where I'd argue that, if all you ever do is shoot JPEG, you don't need a full frame (FX) camera. Heck, you probably don't even need an APS (DX) camera. The 4x6 print of the digital world is the 1920x1080 HDMI display that sits in many of your rooms. Go ahead, take any camera you've got lying around, take any shot using auto everything set to record a JPEG, then downsize it properly to 1920x1080 and display it on that LCD or plasma panel. I'll bet it looks darned good. Even the ones taken in low light.
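To put that "downsize it properly" step in concrete terms: naive scaling can alias away fine detail, so use a good resampling filter. Here's a minimal sketch of one way to do it, assuming the third-party Pillow library; the function name, quality setting, and use of Lanczos resampling are my own choices for illustration, not anything a camera maker ships.

```python
from PIL import Image  # third-party Pillow library (an assumption)

def downsize_for_hdtv(path_in, path_out, target=(1920, 1080)):
    """Downsize an image to fit a 1920x1080 display, preserving aspect ratio."""
    img = Image.open(path_in)
    # thumbnail() resizes in place to fit within the target box;
    # Lanczos resampling avoids the aliasing of cruder filters.
    img.thumbnail(target, Image.LANCZOS)
    img.save(path_out, "JPEG", quality=90)
```

A 4000x3000 shot, for example, comes out 1440x1080 after this: small enough for the panel, and the resampling is doing a fair bit of the "looks darned good" work.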
I was browsing around the net earlier today and came across someone complaining about the image they took with their iPhone. But the image displayed on the Internet to illustrate the complaint looked…yes, you guessed it, perfectly adequate. Fine, actually. Could I have taken a better image with my Nikon D800 and the perfectly chosen lens, processed really carefully in my kitchen of demosaicing tools? Sure. Would it have looked any better on the Internet? Maybe some might say so, but I'll bet you that we'd be well above the 80/20 rule here: more than 80% wouldn't see any difference.
One of the problems I'm having with writing about cameras these days, and one of the reasons why I split out this site so that I could talk more generally about photography apart from the gadgety and consumerish approach, is that virtually all the cameras on the market today can take good pictures in a wide range of conditions. The question now is: can the camera makers redefine and augment the imaging portions of those cameras in interesting new ways and fix the crummy workflow we've been stuck with ever since the DCF rules were originally standardized?
In the future, folks, no one will need to go to huts to make things happen. Our images will magically spirit themselves from place to place, appearing when and where needed.