No automatic metering or white balance is going to provide a perfect result in every situation.
I don't think +Tom's Guide calling this "a big problem" is particularly fair in this article.
Their video demonstration of white balance shifting when the scene changes (a hand introduced into and removed from the frame) is at best inconclusive: that's pretty much the expected behavior.
If you want to know why a fixed or manual white balance is sometimes required, well, that's it 🙂
Maybe the white balance adjustment could be slower, which would make it less apparent, but that would reduce its effectiveness in other scenes with mixed lighting conditions.
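To make the "slower adjustment" idea concrete, here's a minimal sketch (my own illustrative code, not anything from iOS) of temporal smoothing applied to per-channel white balance gains:

```python
# Sketch of temporal smoothing of white balance gains (illustrative only).
# An exponential moving average slows how fast AWB reacts to scene changes:
# smaller alpha = slower, steadier adaptation; larger alpha = faster response.

def smooth_gains(prev, target, alpha=0.2):
    """Blend the previous per-channel gains toward the new instantaneous
    estimate. prev and target are (r, g, b) gain tuples."""
    return tuple(p + alpha * (t - p) for p, t in zip(prev, target))

gains = (1.0, 1.0, 1.0)
# A hand enters the frame: the instantaneous AWB estimate jumps to warmer
# gains, but the applied gains only drift partway over several frames.
for _ in range(5):
    gains = smooth_gains(gains, (1.4, 1.0, 0.8))
```

With a small alpha, a sudden scene change (like the hand) wouldn't immediately swing the colors, but the same inertia would make the camera react slowly when the light genuinely changes.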
Also, this sentence shows the author doesn't understand how automatic white balance works, stating the exact opposite of reality:
"In reviewing some of our test photos, Apple representatives said that the colors may have shifted as the result of changing content in different photos. But that's not how white balance works. It takes account of the color of light falling on the subjects, not on the assortment of subjects in a photo."
All an AWB algorithm has to work with is the raw pixel values from the sensor, plus some knowledge of what was going on before.
It doesn't know anything about light sources, weather conditions, or light reflections.
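As a concrete illustration of what "only raw pixel values" means, here's a toy version of the classic gray-world heuristic (a common AWB baseline, not Apple's actual algorithm): it simply assumes the scene averages out to neutral gray, so any scene content that skews the average skews the correction.

```python
# Minimal gray-world auto white balance sketch (illustrative only, not any
# phone vendor's real algorithm). It assumes the average color of the scene
# is neutral gray and scales R and B so their averages match G.

def gray_world_awb(pixels):
    """pixels: list of (r, g, b) tuples with values in 0..255.
    Returns white-balanced pixels in the same range."""
    n = len(pixels)
    avg_r = sum(p[0] for p in pixels) / n
    avg_g = sum(p[1] for p in pixels) / n
    avg_b = sum(p[2] for p in pixels) / n
    # Gains that bring the red and blue averages to the green average.
    gain_r = avg_g / avg_r if avg_r else 1.0
    gain_b = avg_g / avg_b if avg_b else 1.0
    return [(min(255.0, r * gain_r), g, min(255.0, b * gain_b))
            for r, g, b in pixels]

# A scene shot under warm light: everything is shifted toward red.
warm_scene = [(200, 150, 100), (180, 140, 90), (220, 160, 110)]
balanced = gray_world_awb(warm_scene)
```

Note that introducing a large skin-toned object into the frame changes the channel averages, and therefore the gains — which is exactly the kind of shift seen in their hand demonstration.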
Lastly, I don't understand why they qualify this other example as a problem when the iPhone 6 once again shows the expected behavior: exposing for the faces first when there are people, and preserving highlights to avoid blowing out the scene otherwise:
which gives much better results than their metering reference, a Galaxy S5: http://media.bestofmicro.com/H/R/455535/original/sean-mike_gs5.png
However, I agree that the S5's auto white balance gets better results here, keeping in mind that both produce a result that's far too blue overall when exposing for the faces.
I'm all for reporting image quality issues, but it works better with reasonable expectations and a proper understanding of how things work, especially when trying to explain that to readers.
Via +Amon RA
#supercurioBlog #color #camera
The iPhone 6 Camera Has a Big Problem: iOS 8
9 thoughts on “Every white balance algorithm has its strengths and weaknesses”
You really love bashing on articles, don't you? 😂
This is the problem when engineers start to blog for non-engineers.
Everything seems like criticism when in fact it's just technical talk about reality and the article…
It's the same outside the technical/engineering side: when historians, anthropologists or sociologists start to talk about something, religious people start to say they are being persecuted.
I wish there were more +François Simond in all areas….
The basis of science is to challenge with more science.
+Solomon Taiwo Lol, it's a pain: when you start to know a domain a little bit, you end up seeing errors almost everywhere.
This article looks like the author wanted to bash the iPhone 6 camera because he was unhappy with some results.
And people wanting to bash Apple and the iPhone happily re-share it, spreading errors.
That's crappy journalism capitalizing on haters.
Maybe it's true that the white balance too often gets overly magenta.
But with the methodology and facts obtained and explained wrong (he goes as far as contradicting Apple's statement when they are perfectly right), the whole article ends up being entirely bogus.
Yes. Same thing on a daily basis. Today I fought with my aunt over some non-certified electrical things in grandma's house.
My uncle just said…
"…Why do you want to spend money on this? The light is on, and that's what matters."…
And I want to change the wires, change the central electrical protectors, use DS and DDS and DPS and grounding, and hufffff…
Well, our world is more complicated than the other ones. lol
I think I'm right in assuming auto white balance generally works like this: take the lightest objects (pixels) in a scene and correct the RGB levels to make them white. In the vast majority of cases this will work, as you almost always have something white somewhere in a photo.
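The heuristic described here is essentially the "white patch" (max-RGB) assumption; a toy sketch, purely illustrative:

```python
# Toy "white patch" (max-RGB) white balance sketch, illustrating the
# heuristic above: assume the brightest value in each channel comes from
# a white object, and scale each channel so that value maps to white.
# Failure mode: if the brightest object isn't actually white (white is
# just another color to the sensor), the whole correction is wrong.

def white_patch_awb(pixels):
    """pixels: list of (r, g, b) tuples with values in 0..255."""
    max_r = max(p[0] for p in pixels)
    max_g = max(p[1] for p in pixels)
    max_b = max(p[2] for p in pixels)
    return [(r * 255.0 / max_r, g * 255.0 / max_g, b * 255.0 / max_b)
            for r, g, b in pixels]

# A white wall lit by warm light, plus a darker object in shadow:
scene = [(240, 210, 170), (120, 105, 85)]
print(white_patch_awb(scene)[0])  # brightest pixel mapped to (255, 255, 255)
```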
+Tim Morley I'm always amazed that white balance algorithms manage to often get things right in the first place 😀
There's not always something white in a scene, and when there is, the algorithm doesn't know for sure whether it's white (white is just another color) or simply brighter than the rest.
This is something I'll be studying in depth and working on soon 🙂
I don't know how many times I've used the dropper feature in ufraw… Nothing beats saying "this is white" – and it's frustrating when a scene does not have any white in it.
For example, you'd be surprised how often there is JUST enough of the color of skin showing through a white shirt that it can throw off a WB algorithm. Try to WB on the shirt and now the whole scene looks screwy.
+Tim Morley I don't know for sure, but I think the algorithms have evolved somewhat. They are supposed to know what the scene is, focus on the proper thing (usually people or another object in the foreground) and show the scene in proper colors overall. After all, nobody likes a blue or other color cast just because the scene was shot under incandescent light bulbs or another light source that isn't natural sunlight.
An intelligent algorithm might decide that since it sees 5 shapes that look like human faces, it should focus on them and use their colour as a reference for human skin. Though it would then have to account for Asian people, African-Americans, etc.
Other than that, it's just a sad article, with misconceptions in it (regardless of whether they were made because the reviewer didn't know better or didn't want to confuse readers).
"The Galaxy S5 camera, in comparison, is slightly recessed. This might block light from hitting the lens from the side, which is often a cause of chromatic aberration." – I always thought that chromatic aberration is caused by different refractive indices for different wavelengths (OK, I had to google that, I wasn't sure about the proper English terms), which is well pronounced when e.g. viewing tree branches in front of a blue sky. And it doesn't have to be purple. Maybe limiting light coming from the sides could slightly lessen the problem, but at the cost of reducing the amount of light hitting the sensor. It could certainly help against the photo becoming Abrams-esque with flares, though.
"It may be that Android 4.2 is better at removing the artifact, or that the design of the camera prevents it." – I didn't know Android did the heavy lifting; actually, everything I've read from +François Simond and Opo seems to suggest the opposite (that it's the work of firmware, special algorithms and apps, a modified camera app, but not the code in the general OS if left unchanged).
The reviewer complains about colour shifting in the park scene (which is actually consistent with what I would expect) while happily ignoring the enormous amount of highlight clipping in the right-hand S5 image.
This is why I mostly can't read IT-related articles from mainstream news sources, and it especially hurts whenever TV reporters try to say anything about IT/hacking/programming. There is usually so much wrong in so few seconds that it isn't funny.
Also: "But that's not how white balance works. It takes account of the color of light falling on the subjects, not on the assortment of subjects in a photo. Regardless of what else is happening in the shot, red peppers should never look purple, and white hands shouldn't turn pink."
I don't know how he imagines that is supposed to work. How could the camera magically see what light falls on the subjects? It only sees the light reflected from subjects, which is already influenced by their colour (the reason photographers use a neutral white reference). The only way it could do what he wants is if the camera had AI able to completely analyze the scene, recognize what is pictured, know its proper colour and shift the white balance accordingly. We are not there yet.
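A tiny numeric sketch of that ambiguity (my own illustrative numbers, modeling each channel as reflectance × illuminant):

```python
# Why the camera can't "see the light falling on subjects": the sensor only
# records reflectance x illuminant per channel. Two very different
# scene/light combinations can yield identical raw pixel values.

def observed(reflectance, illuminant):
    """Per-channel product: what the sensor actually records."""
    return tuple(r * i for r, i in zip(reflectance, illuminant))

# A gray object under warm light...
a = observed((0.5, 0.5, 0.5), (1.0, 0.75, 0.5))
# ...produces the same raw values as a warm-colored object under neutral light.
b = observed((0.5, 0.375, 0.25), (1.0, 1.0, 1.0))
assert a == b  # identical raw data, two incompatible interpretations
```

Since completely different interpretations produce identical raw data, the illuminant is unrecoverable without extra assumptions (gray world, known face colors, and so on).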