I’ve been “color correcting” the images taken with my camera phone for some time. I put ‘correcting’ in quotes because it’s been done by eye: manually tweaking until the image matches my vision, or close enough.
Much of the time the images come out bluish. The phone’s camera software handles white balance automatically, with no manual option, so there’s no way to get an even tone.
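For what it’s worth, the standard automatic fix for a cast like this is something along the lines of gray-world white balancing: scale each channel so their averages agree. Here’s a minimal sketch in plain Python; the function name and the tiny three-pixel “image” are my own illustration, not what the phone actually does.

```python
def gray_world_balance(pixels):
    """Scale R, G and B so each channel's mean matches the overall mean.

    pixels: list of (r, g, b) tuples in 0-255.
    A bluish image has a high blue mean; this scales blue down
    (and red up) until the channel averages agree.
    """
    n = len(pixels)
    means = [sum(p[c] for p in pixels) / n for c in range(3)]
    gray = sum(means) / 3.0
    gains = [gray / m if m else 1.0 for m in means]
    return [tuple(min(255, round(p[c] * gains[c])) for c in range(3))
            for p in pixels]

# A tiny bluish "image": the blue channel runs hot.
cool = [(90, 100, 140), (110, 120, 160), (100, 110, 150)]
balanced = gray_world_balance(cool)
```

The catch, of course, is that gray-world assumes the scene averages to gray, which a forest under a cloudy sky does not, so the automatic result can still drift.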
But beyond the blur pass to reduce JPEG ‘blockiness’ (prior to scaling), and the RGB balancing, I came across a bigger problem. There were superfluous greens in the mix, and I’m not talking grass or moss. The wood-coated metal you see at the bottom of the original image, above (click to enlarge), is supposed to look bluish/purplish to the eye (the way bare wood returns light under a cloudy sky), but the camera renders it turquoise. Similar greenish blues show up in the tree bark.
Part of “correcting” this has been color rotation: moving some of the blue channel into the red to compensate. For this particular scene, it took stepping outside a few times to check the colors against reality. And I know I’ve gone too far when the whites on the trees turn orange…
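The “rotation” is really just a channel mix: bleed a fraction of blue into red to push turquoise-leaning blues back toward purple. A rough sketch of the idea (the 0.15 amount is an arbitrary illustration, not a value I actually used):

```python
def rotate_blue_to_red(pixel, amount=0.15):
    """Move a fraction of the blue channel into red.

    Nudges turquoise-leaning blues back toward blue/purple,
    at the cost of warming the whole image -- overdo it and
    the whites go orange.
    """
    r, g, b = pixel
    shift = round(b * amount)
    return (min(255, r + shift), g, max(0, b - shift))

# A turquoise-ish sample pixel gains red and loses blue.
shifted = rotate_blue_to_red((60, 140, 160))
```

It’s a blunt instrument, which is why it needs checking against the real scene: the same shift that fixes the wood also warms everything else in frame.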
Side note: all processing has been done with GraphicConverter and sometimes ColorIt (for effects and color grain reduction), using the Mac emulator.
Well, it turns out, I was on to something with the color rotation.
A while back, I had noticed that the rods of my quartz heater, when lit, were showing up pinkish. To the human eye, however, electrified quartz glows orange. At first, I thought the camera was picking up ultraviolet light. So on Saturday, I did some testing…
The above image shows not only the pinkish tint, but also that the camera sees the rods visibly lit while my eyes did not; the heater was just starting up. The camera is sensitive to light in ways the human eye is not.
In further testing, I used my glasses as a filter (they have a UV filter). (I haven’t worn them though, in part because of a vein above my right ear—it gets pinched. Not wearing my glasses may have contributed to my vision loss…)
At first it seemed UV pickup really was the problem, since the scene through the lenses came out darker. But upon wearing the glasses to view the new TV, the colors looked better. Now, how could that be? It turns out the glasses also block some of the violet spectrum! No wonder I couldn’t fix the colors on the screen no matter the settings: the blue filter on the screen is letting through violet light! Maybe that part of the spectrum wasn’t anticipated with the LED backlighting…
Back to the camera: curious about what I was dealing with, I looked up what I could find on the camera’s spectral sensitivity, and how close it is (or would be) to human vision. Guess what I found?
First, I couldn’t find the exact camera model for the phone, but it’s a Sony, apparently. The smartphone is a Motorola Luge (2014 model), which is pretty much an enhanced version of the Droid Razr M/XT907, a 2012 model.
Second, spectral sensitivity differs from camera to camera; some are better than others, of course. You can download the 2012 PDF I found, detailing the characteristics of several kinds of cameras, here. (Note: 9.6 MB!)
Third, not only do digital cameras not pick up UV light (to any significant degree, anyway), they also fall off within the violet end of the spectrum. So despite color-correction software (also present in modern digital cameras), any given sensor may not know blue from violet, or may not pick up violet at all. (Try shooting scenes under black light.) I know my camera renders the TV’s violet as plain blue.
Lastly, the human eye does not see color in RGB. And this is important: no camera that samples RGB the way digital sensors do will see what we see, no matter the auto-correction software. Here’s why, and what I’ve come to understand about the human eye.
Cone cells (so named because their tips are cone-shaped) do not respond to neat red, green and blue bands, but to short (S), medium (M) and long (L) wavelength ranges, with peaks closer to violet-blue, green and yellowish-green. Rod cells, which provide most of our low-light vision, cover a narrower range. (Because rod signals would interfere with color processing, they’re effectively shut down at normal light levels.)
The cone ranges overlap heavily, so the visual system works by opponency: it’s in the subtraction of these overlapping cone signals, an antagonistic pairing of channels rather than a direct RGB readout, that a relative color point is computed. At the short end, we get violet partly because the long-range (L) cones have a second sensitivity peak at short wavelengths. That second peak is clipped in the graph above, but you can see L start to rise again toward the left.
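The opponent combination can be sketched numerically. Assuming normalized cone responses (the values and weights below are illustrative, not physiological measurements), a classic opponent model compares cone signals instead of reading them as RGB:

```python
def opponent_channels(l, m, s):
    """Combine cone responses into opponent signals.

    red_green   > 0 -> reddish,  < 0 -> greenish
    blue_yellow > 0 -> bluish,   < 0 -> yellowish
    Weights here are illustrative, not physiological.
    """
    red_green = l - m
    blue_yellow = s - (l + m) / 2
    luminance = l + m
    return red_green, blue_yellow, luminance

# Short-wavelength light: S cones dominate, but the L cones'
# secondary short-wavelength bump adds a touch of red --
# which is why violet reads as blue tinged with red.
rg, by, lum = opponent_channels(l=0.15, m=0.05, s=0.9)
```

Note the asymmetry with a camera: the sensor records three numbers per pixel and stops there, while the eye immediately turns its three numbers into differences.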
To produce one of the tens of millions of colors we can distinguish, over an effective exposure of roughly a fiftieth of a second, takes hundreds to thousands of photon hits. Much of this dense signal processing happens right in the retina; the visual cortex then turns the eye’s signals into perceived colors.
In all, despite our advances in technology, human vision is still far better than the conventional cameras we’ve made. At the very least, a rod cell can respond to a single photon, and rods populate the eye to the count of roughly 125 million. Only modern night-vision gear has improved on our night vision, albeit at lower resolution.
…Well, that’s my 900-word two cents. I know now that no digital camera I’m aware of will do a scene justice, not just my own.
Thanks for reading, and if I’ve gotten anything wrong, feel free to tell me.