After trying the Nexus 6P with HDR+ forced on yesterday, I wanted to evaluate HDR+ Auto in typical situations where you want multiple exposures to recover shadows or highlights.
As you can see from the samples shared here, the result is clear: HDR+ Auto simply doesn't work. In all my attempts today, HDR+ Auto was unable to identify scenes with dynamic range challenges in need of tone mapping. I remember that yesterday HDR+ Auto turned HDR+ on at least once, but I was unable to reproduce that today in real-world scenarios.
– The 6P camera exposure system can easily underexpose a central subject. It's common to need to aid the exposure system by tapping on your subject (hopefully your subject isn't dark-colored).
– HDR+ sometimes increases contrast and reduces the final dynamic range instead of extending it, giving the opposite result to what's desired. It does so unpredictably.
My tip would be to force HDR+ On, since HDR+ Auto doesn't activate it even in the obvious conditions requiring it, unless you need to shoot several images quickly. Then, since the automatic exposure can't be trusted outdoors, it's recommended to tap on subjects to expose for them. While the preview will often look overexposed, HDR+ should usually re-expose the final rendered image and recover highlights in the process.
HDR+ in general needs work to avoid being randomly counterproductive, and HDR+ Auto is useless as it stands.
I walked around Chambéry with two cameras, so that happened. The 6P's HDR+ works better in low light to extend dynamic range, and it also performs great at extending dynamic range to preserve skies.
While the 6P camera performs rather poorly in great lighting due to suboptimal automatic settings and below-average color rendition, it becomes an excellent performer in low light. The new HDR+ computational photography algorithms, working with the large 1/2.3" sensor and f/2.0 aperture lens, are a worthy alternative to OIS for stills... at least compared to the Nexus 5, which as you can see is still capable of perfectly usable shots in most situations.
The 6P camera is an absolute killer for selfies, however. HDR+ here works wonders to expose the face and everything else in the worst conditions. The focus distance is close enough to keep your face sharp and get some background blur. Even in low light, the amount of detail preserved is high: enough to show your skin texture, which is fine in some cases and unflattering in others (in good lighting), where the sharpening will highlight skin imperfections instead.
I made this album because it also demonstrates that color profiling accuracy is crucial for great outdoor shots: our eyes and brains are highly trained to recognize the subtle color tones found in nature; much less so, if at all, in artificial lighting. That's part of why the Nexus 6P camera can be an excellent performer in these situations despite essentially sucking in sunny outdoor natural conditions.
Notes:
– As you can notice, the field of view of the Nexus 6P is wider than the Nexus 5's. That's pretty convenient for architecture and landscape, less suited to shooting people.
– Neither the Nexus 6P nor the Nexus 5 renders bokeh circles very well.
– I had to delete roughly 1/3 of the Nexus 6P shots for being out of focus. It quite often misses focus just slightly in low light, leaving you with a good-looking but slightly blurry picture. Make sure to review and shoot again.
– Unless specified, the exposure is full auto (and sometimes not what I would choose manually).
– The Nexus 6P lacks exposure compensation entirely, while it's available even with HDR+ on the Nexus 5.
Oh and it was also the opportunity to take some pics of my city before leaving for Stockholm 🙂
+Sam Pullen demonstrates in this video the main limitation with Google's current computational photography approach.
Reported by most reviewers as "lag" or "bugs" in the #Nexus 5X and 6P camera app (since HDR+ Auto is the mode activated by default), it has a simple explanation: the time needed to process the multiple exposures behind one HDR+ picture, combined with the limited amount of RAM allowed as buffer, makes it unsuitable for shooting consecutive pictures.
Here's the process that occurs on any Android smartphone from the past few years:
– you launch the camera app
– the viewfinder preview starts: the ISP captures full-resolution readouts from the sensor at 30 FPS and renders them at near display resolution, adjusting automatic exposure and white balance in real time
– you press the shutter button
– the camera ISP hardware takes one of the full-resolution sensor readouts, renders it at full resolution, compresses the output automatically using the hardware JPEG encoder, and offers the compressed file as a buffer to the camera app, which saves it as a file on disk
– this sequence of operations, using almost no CPU at all, can be repeated 4 or more times per second
Instead, here's the process with HDR+:
– until you press the shutter button: identical.
– the camera ISP hardware takes multiple (up to 9*) consecutive readouts from the sensor at 30 FPS, some with positive and negative exposure compensation, renders them, and offers them to the camera application as uncompressed buffers in RAM.
– the camera application takes all these images as input and feeds them to a super-resolution algorithm, which also tunes the local contrast and color balance, compressing or extending the dynamic range locally depending on the analyzed image content.
– the HDR+ algorithm takes from a few hundred milliseconds to several seconds to render a processed image.
– once the HDR+ algorithm is finished, it offers the result as a buffer to the hardware JPEG encoder, which returns a buffer to the camera app, then saved as a file.
– while HDR+ processing runs in the background, you can press the shutter button again to trigger another capture, but only as long as there is enough available memory to store those multiple exposures as uncompressed image buffers in RAM.
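The shutter-blocking behavior in the last step can be sketched as a simple simulation. This is a hypothetical model, not Google's actual implementation: the frame size, frame count, and buffer budget are illustrative numbers, but they show why consecutive HDR+ shots eventually stall.

```python
# Illustrative sketch: each HDR+ shutter press reserves RAM for up to
# 9 uncompressed frames until background processing frees it. Once the
# buffer budget is exhausted, the shutter refuses to fire ("lag").
# All constants are assumptions for the sake of the example.

FRAME_MB = 24           # one uncompressed 12 MP frame, ~2 bytes/pixel
FRAMES_PER_SHOT = 9     # up to 9 exposures per HDR+ capture
BUFFER_BUDGET_MB = 512  # RAM the camera app may use for raw buffers

class HdrPlusQueue:
    def __init__(self):
        self.used_mb = 0

    def press_shutter(self):
        cost = FRAME_MB * FRAMES_PER_SHOT
        if self.used_mb + cost > BUFFER_BUDGET_MB:
            return False  # no RAM left: the shutter appears to lag
        self.used_mb += cost
        return True

    def finish_processing(self):
        # One HDR+ merge finished: its raw frames are released.
        self.used_mb = max(0, self.used_mb - FRAME_MB * FRAMES_PER_SHOT)

q = HdrPlusQueue()
results = [q.press_shutter() for _ in range(4)]
print(results)  # [True, True, False, False] -> only 2 shots fit in RAM
```

With these numbers, each shot costs 216 MB of buffers, so only two captures fit in the 512 MB budget; the third and fourth presses are rejected until a merge completes and frees its frames.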
As you can see from this list of operations, the two modes function rather differently.
– Standard mode doesn't rely on the CPU for much besides synchronizing the preview between the camera and display hardware, then saving the final result to disk.
– HDR+ relies extensively on the RAM and CPU to build (hopefully) better images from many captures.
As a result, both the RAM and CPU are bottlenecks limiting the consecutive shooting capability.
Now you may ask: why isn't HDR a problem on other phones?
There are several explanations:
– Google chose a target that's unsuitable for consecutive image shooting on the Nexus 5X: too many images buffered given the available RAM and computational capability, and too complex processing.
– Google HDR+ implementation is not optimized enough.
– Samsung flagships since the Galaxy S5 render their HDR hardware-accelerated instead of relying on the CPU. Their implementation is efficient enough to process the preview in HDR at 30 FPS, compresses the dynamic range more and better than Google's HDR+, and doesn't slow down shooting either. Samsung's HDR rendering is even available to third-party applications through their Camera SDK, while Google's HDR+ is entirely proprietary.
How can Google improve the situation?
– Reducing the number of captures fed to HDR+ dynamically depending on the load, to avoid stalling and making the photographer miss shots.
– Reverting to 100% hardware-accelerated standard shooting when at least two HDR+ images are processing in the background, instead of preventing the user from shooting. A standard image is better than no image at all. As demonstrated by +Sam Pullen, the current situation generates user frustration.
– Using more hardware acceleration (OpenGL shaders, RenderScript) and less CPU to speed up the HDR+ algorithm and catch up with the competition, improving power efficiency and avoiding further slowdowns during consecutive shooting due to CPU thermal throttling.
* 9 frames for HDR+ was mentioned in a Google blog post last year; it could have changed in the latest camera app.
I wonder how well and how fully the DNG spec is implemented in the reader. Hopefully it doesn't differ much, if at all, from Adobe's reference SDK. I say that because a few key aspects crucial for color calibration and accuracy are missing from the regular Android DNGCreator class introduced in Lollipop.
Today is a big day for Snapseed users! Snapseed 2.1 brings RAW photo editing to your Android device.
Traditionally, shooting and editing RAW photos has been the domain of DSLR cameras and desktop software. But with the RAW capabilities that were added to Android 5 last year, RAW is now becoming important for mobile photography, too.
Snapseed now allows you to edit those RAW photos in the DNG file format right on your mobile device. You can also edit DNGs that were shot on cameras or converted from other RAW formats.
A photo in RAW format preserves all of the original data that was captured by the camera. This allows you to perform edits – such as recovering blown-out highlights – that are impossible with the more commonly used JPEG format. Check out the photos for an example of the details that RAW editing with Snapseed 2.1 can bring out in an image.
In addition to RAW editing, we have made some slight polishes throughout the app to make it easier to navigate, so give it a try!
With sound, no obvious aliasing artifacts, and good detail instead. While I'm looking forward to direct comparisons with the iPhone 6 and 6s, they appear to be by far the best Android phones in this department to date.
Nexus 6P Slow Motion Quality is Great! – Hi Speed Cameras
Now that the Nexus 6P has been getting into the hands of reviewers, you can clearly see that the slow motion mode has great quality. It has audio, same as the iPhone, and the resolution in 240fps mode is on par with, or might be better than, the iPhone 6s.
Most of them are overexposed with large parts of the photo blown out, and the automatic white balance doesn't seem too consistent, with some pics too blue and others too yellow. At least they're in focus. Low light seems to be a winner, however.
Most reviewers don't say whether the pics were taken with or without HDR+. Maybe only HDR+ is alright and standard mode is just poor.
Google has a lot of work to do to fix this camera. I mean, getting the exposure right is the very first thing. You'll see this illustrated in comparisons with the Galaxy S6 or iPhone.
– Comparing still images at 100% zoom for different sensor resolutions instead of normalized resolutions. – Comparing still images taken at a different focal length / field of view.
But still an interesting video from +SuperSaf TV! The +Sony Xperia Z5 does a good job at stabilization, especially in 1080p video. It's a shame that 4K video recording doesn't benefit from the same stabilization quality, although there's still some. As it's digital stabilization only, however, you'll often see some artifacts caused by motion blur from movement within frames.
The Z5 also does a good job at stabilizing the front-facing camera, although at the expense of some artifacts and crop. Is there no such capability on the Galaxy S6?
The Galaxy S6's front and back lenses are very good at dealing with flare, which is not the case with the Z5.
The Z5 color profile and automatic white balance is colder, as usual with +Sony products.
The Galaxy S6 appears to be more reliable and consistent overall than the Z5, which can sometimes get the automatic white balance all wrong, like at 4:57 (it picks a shade of green as the white reference, hence the whole scene turning almost greyscale).
Note: a 1/2.3" sensor in 4:3 format has more surface area than 1/2.3" in 16:9, as this metric characterizes the diagonal, not the light-capturing surface area, which scales with height × width.
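A quick back-of-the-envelope check of that note: for a fixed diagonal (which is what the "1/2.3-inch" optical format names), the area of an a:b rectangle is d²·ab/(a²+b²), so a 4:3 frame gathers more light than a 16:9 one. The 7.7 mm diagonal used below is an approximate figure for a 1/2.3" sensor.

```python
# Compare the light-gathering area of 4:3 vs 16:9 frames sharing the
# same diagonal (the diagonal is what the optical-format label fixes).

def area_for_diagonal(diag, a, b):
    """Area in mm^2 of an a:b rectangle with the given diagonal in mm."""
    scale = diag / (a**2 + b**2) ** 0.5  # side length per aspect unit
    return (a * scale) * (b * scale)

diag_mm = 7.7  # approximate diagonal of a 1/2.3" sensor

area_43 = area_for_diagonal(diag_mm, 4, 3)
area_169 = area_for_diagonal(diag_mm, 16, 9)

print(round(area_43, 2))             # 28.46 mm^2
print(round(area_169, 2))            # 25.33 mm^2
print(round(area_43 / area_169, 3))  # 1.123 -> 4:3 captures ~12% more light
```

So at the same nominal "1/2.3-inch" size, the 4:3 frame has roughly 12% more surface area than the 16:9 one, which is why comparing formats by diagonal alone is misleading.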
"DB> Yea, same sensor (IMX377) and F/2.0 optics. But 6P has more CPU/GPU horsepower so has a few additional features like 240fps slomo (vs 120fps on 5X), Smartburst, and EIS."
If I'm interpreting this correctly, it means the LG +Nexus 5X will have no video stabilization whatsoever. It will need confirmation in testing of course, but that's a huge letdown for at least two reasons:
– the Snapdragon 808 should be capable of software video stabilization. Heck, LG claims to combine optical and electronic image stabilization for better results.
– it's a regression compared to the 2-year-old Nexus 5, which featured an optically stabilized camera module that does a very good job in video mode.
Edit 2: The Moto X Pure has very good electronic image stabilization in video at 1080p30, but seemingly not at 1080p60 or 4K. Example: https://www.youtube.com/watch?v=wYxmstJ5NOI
Google's engineers' answer is misguided, then.