The new Camera2 API is brilliant!
It's a complete revolution in what Android Camera apps can do with the camera, bringing terrific new processing capabilities (using various forms of hardware acceleration).
Most of what only vendors' Camera apps could do, using proprietary APIs and sometimes ISP-specific (Image Signal Processor) features, will now be possible for third-party apps.
Think:
– burst capture, and any application built on high-speed burst shots, like HDR or super-resolution.
– RAW saved as DNG
– uncompressed, de-Bayered images to process on the CPU, or on the GPU with GL ES shaders
– color-space conversion from the sensor's native RGB, plus custom contrast curves or tone mapping.
And a lot more: with the L release, Android enters totally new territory for its camera.
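To make the tone-mapping point concrete, here is a hedged sketch of the kind of per-pixel control this enables. Camera2 does expose a contrast-curve tonemap mode (`TONEMAP_MODE_CONTRAST_CURVE`), but the code below is only an illustration of the math: it builds a lookup table applying a simple gamma curve to linear 10-bit sensor values. The class name, the gamma value, and the bit depths are illustrative assumptions, not part of the API.

```java
// Illustration only: a custom tone curve (simple gamma) applied to
// linear 10-bit sensor values -- the kind of curve an app could hand
// to Camera2 via TONEMAP_MODE_CONTRAST_CURVE. Names and constants here
// are hypothetical, chosen for the example.
public class ToneCurveDemo {
    // Build a lookup table mapping 10-bit linear input (0..1023)
    // to 8-bit output (0..255) through a power-law curve.
    static int[] buildLut(double gamma) {
        int[] lut = new int[1024];
        for (int i = 0; i < 1024; i++) {
            double normalized = i / 1023.0;
            lut[i] = (int) Math.round(255.0 * Math.pow(normalized, gamma));
        }
        return lut;
    }

    public static void main(String[] args) {
        // A gamma of 1/2.2 brightens mid-tones, as a display-referred
        // curve typically does.
        int[] lut = buildLut(1.0 / 2.2);
        System.out.println(lut[0] + " " + lut[511] + " " + lut[1023]);
    }
}
```

In a real app, a curve like this would be passed to the capture request as a `TonemapCurve` rather than applied on the CPU; the point is that the shape of the curve is now fully under the app's control.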
+Arthur Brownlee IV is squeeeeeeing like a college girl right now.
I am.
Thank god…now maybe they'll start pairing the better software with a decent camera sensor
Watching the recording of the Camera2 API session, the presenter mentions that the maximum capture rate at 8mp on the Nexus 5 is 30 fps. Provided you can compress and store the images fast enough (maybe via GPU or SoC-specific offloading), would this suggest video capture beyond 1080p is possible?
Recording for reference (@37:54) Google I/O 2014 – Building great multi-media experiences on Android
+Matt Joseph Thanks for the link, I hadn't watched this one yet =)
Yes, with a low-complexity codec it's likely you can already record much higher-resolution video on the Nexus 5, at the expense of a high bitrate, of course.
In the session they showed 30 fps for full 8 MP images on the Nexus 5. Wouldn't this mean 60 fps 4K video?
The problem is that even with a lot of RAM, the buffer will fill up quickly (after a second or two of RAW data). So the barrier to video isn't the capture process, it's the recording process, which involves compressing and storing the data at a comparable rate. If the capture process uses a dedicated portion of the SoC, then it might be easier. Likewise, if the capture process uses the CPU, then compression to H.264 or similar could be accomplished via dedicated SoC extensions (like on Exynos or Snapdragon 800) or GPU offloading.
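The "second or two" claim checks out with some back-of-the-envelope math. The numbers below are assumptions for illustration (an 8 MP Bayer frame stored at 2 bytes per pixel, i.e. a RAW16-style layout, at 30 fps), not figures from the talk:

```java
// Back-of-the-envelope RAW throughput math. All inputs are assumed
// values for illustration, not measurements from the I/O session.
public class RawBufferMath {
    public static void main(String[] args) {
        long bytesPerFrame = 8_000_000L * 2;       // 8 MP at 2 B/px: 16 MB/frame
        long bytesPerSecond = bytesPerFrame * 30;  // at 30 fps: 480 MB/s
        double secondsToFillOneGiB =
                (1024.0 * 1024 * 1024) / bytesPerSecond;
        // Roughly 2 seconds to fill 1 GiB of buffer RAM -- which is why
        // sustained RAW "video" stalls on recording, not capture.
        System.out.printf("Fill 1 GiB in %.1f s%n", secondsToFillOneGiB);
    }
}
```

At ~480 MB/s, even a generous RAM buffer lasts only seconds, so sustained recording has to compress (or offload to dedicated silicon) at roughly the capture rate.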
+Matt Joseph that's a good point. In the talk they also said that for HDR+ they capture 115 MiB in 0.9 seconds, for processing afterwards.