This video demonstrates the main limitation of Google's current computational photography approach.
Most reviewers report it as "lag" or "bugs" in the #Nexus 5X and 6P camera app, since HDR+ is the mode activated by default. There is a simple explanation: the time needed to process the multiple exposures behind one HDR+ picture, combined with the limited amount of RAM allowed for buffering, makes the mode unsuitable for shooting consecutive pictures.
Here's the process that occurs on any Android smartphone from the past few years:
– you launch the camera app
– the viewfinder preview starts: the ISP captures full-resolution readouts from the sensor at 30 FPS and renders them at near display resolution, adjusting automatic exposure and white balance in real time
– you press the shutter button
– the camera ISP hardware takes one of the full-resolution sensor readouts, renders it at full resolution, compresses the output with the hardware JPEG encoder, and hands the compressed buffer to the camera app, which saves it as a file on disk.
– this sequence of operations uses almost no CPU at all and can be repeated four times per second or more.
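The standard path above can be sketched as a toy Python model, with stub functions standing in for the fixed-function hardware stages (all names are illustrative, not the actual Android camera API):

```python
# Toy model of the standard capture path. Each stage is a stub standing in
# for dedicated hardware, which is why the CPU stays nearly idle per shot.

def isp_render(readout):
    # Hardware ISP: renders one full-resolution sensor readout.
    return f"rendered({readout})"

def hw_jpeg_encode(image):
    # Hardware JPEG encoder: compresses without touching the CPU.
    return f"jpeg({image})"

def standard_capture(readout, disk):
    # Whole per-shot path: render -> encode -> save; repeatable ~4x/sec.
    disk.append(hw_jpeg_encode(isp_render(readout)))

disk = []
for shot in ["readout1", "readout2", "readout3"]:
    standard_capture(shot, disk)
print(disk[0])  # jpeg(rendered(readout1))
```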
Instead, here's the process with HDR+:
– until you press the shutter button: identical.
– the camera ISP hardware takes multiple (up to 9*) consecutive readouts from the sensor at 30 FPS, some with positive and some with negative exposure compensation, renders them, and hands them to the camera application as uncompressed buffers in RAM.
– the camera application takes all these images as input and feeds them to a super-resolution algorithm, which also tunes local contrast and color balance, compressing or extending the dynamic range locally depending on the analyzed image content.
– the HDR+ algorithm takes anywhere from a few hundred milliseconds to several seconds to render a processed image.
– once the HDR+ algorithm is finished, it hands the result as a buffer to the hardware JPEG encoder, which returns a compressed buffer to the camera app, which then saves it as a file.
– while HDR+ processing runs in the background, you can press the shutter button again to trigger another capture, but only as long as there is enough available memory to store those multiple exposures as uncompressed image buffers in RAM.
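As a rough illustration of that RAM bottleneck, here is a toy model of the burst buffering constraint. All sizes and budgets below are invented for the example, not Google's real numbers:

```python
# Toy model of the HDR+ RAM bottleneck: each shutter press queues a burst
# of uncompressed frames, and a new press is refused once they no longer
# fit in the buffer budget. All constants are illustrative assumptions.

FRAME_BYTES = 12_300_000 * 2       # ~12.3 MP sensor, ~2 bytes/pixel raw
BURST_FRAMES = 9                   # up to 9 readouts per HDR+ shot
RAM_BUDGET = 512 * 1024 * 1024     # hypothetical buffer allowance

def can_accept_new_burst(bursts_in_flight):
    """A new shutter press is allowed only if another full burst fits."""
    used = bursts_in_flight * BURST_FRAMES * FRAME_BYTES
    return used + BURST_FRAMES * FRAME_BYTES <= RAM_BUDGET

# With these numbers one burst is ~221 MB, so only two fit in 512 MB:
print([can_accept_new_burst(n) for n in range(4)])  # [True, True, False, False]
```

This is why shooting stalls after a couple of presses: the shutter isn't slow, the buffer is full until a background HDR+ render finishes and releases its frames.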
As you can see from this list of operations, the two modes function rather differently.
– Standard mode doesn't rely on the CPU for much besides synchronizing the preview between the camera and the display hardware, then saving the final result as a file on disk.
– HDR+ relies extensively on the RAM and CPU to build (hopefully) better images from many captures.
As a result, both the RAM and CPU are bottlenecks limiting the consecutive shooting capability.
Now you may ask: why isn't HDR a problem on other phones?
There are several explanations:
– Google chose a quality target that's unsuitable for consecutive shooting on the Nexus 5X: too many images in the buffer given the available RAM and computational capability, and too complex processing.
– Google's HDR+ implementation is not optimized enough.
– Samsung flagships since the Galaxy S5 render their HDR hardware-accelerated instead of relying on the CPU. Their implementation is efficient enough to process the preview in HDR at 30 FPS, compresses the dynamic range more and better than Google's HDR+, and doesn't slow down shooting either. Samsung's HDR rendering is even available to third-party applications through their Camera SDK, while Google's HDR+ is entirely proprietary.
How can Google improve the situation?
– Reducing the number of captures directed to HDR+ dynamically, depending on load, to avoid stalling and making the photographer miss shots
– Reverting to 100% hardware-accelerated standard shooting when at least two HDR+ images are processing in the background, instead of preventing the user from shooting. A standard image is better than no image at all; as the video demonstrates, the current situation generates user frustration.
– Using more hardware acceleration (OpenGL shaders, RenderScript) and less CPU to speed up the HDR+ algorithm and catch up with the competition, improving power efficiency and avoiding further slowdowns during consecutive shooting due to CPU thermal throttling.
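The first two suggestions can be sketched as a simple capture policy; the thresholds and names below are hypothetical, not a description of the actual camera app:

```python
# Hypothetical policy combining the two mitigations suggested above:
# shrink the HDR+ burst under memory pressure, and fall back to standard
# hardware capture when too many HDR+ shots are already processing.

MAX_BURST = 9   # the usual HDR+ burst size
MIN_BURST = 3   # smallest burst still worth merging (invented threshold)

def choose_capture_mode(bursts_processing, free_ram_frames):
    # Fall back to the pure-hardware path once two HDR+ shots are in flight,
    # so the shutter never blocks entirely.
    if bursts_processing >= 2:
        return ("standard", 1)
    # Otherwise shrink the burst to what the remaining RAM can hold.
    frames = max(MIN_BURST, min(MAX_BURST, free_ram_frames))
    return ("hdr+", frames)

print(choose_capture_mode(0, 20))  # ('hdr+', 9)
print(choose_capture_mode(1, 5))   # ('hdr+', 5)
print(choose_capture_mode(2, 20))  # ('standard', 1)
```

Under this kind of policy the user always gets a picture: a full HDR+ burst when resources allow, a smaller one under pressure, and a plain hardware capture as a last resort.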
* 9 frames for HDR+ was mentioned in a Google blog post last year; this may have changed in the latest camera app.