+Tony Northrup demonstrates the Ricoh Theta S in this insightful review. Everyone will find their own use for it, of course, but this is the first video I've seen that shows so well how to use this little tool, with examples for commercial purposes, family memories, or even plain vlogging.
If you find the picture quality sufficient, go ahead! Otherwise it might be wise to wait for a future product that records at higher resolution, with sharper lenses and somewhat better color profiling.
360° video will get substantially better with 4K overall, including recording on devices like this one. As you can observe, because of the dual circular projection, the recording loses detail twice:
– once when transformed geometrically into a 360° video
– once again when projected into a planar or VR view.
Even with 1080p delivery, higher resolution recording will help. I don't know about you but I'll certainly get one at some point ☺
It's amazing how many seconds the effect lasts (and how real the colors appear) as long as you keep looking precisely at the dot.
In case you were wondering whether you can use your eyes as an instrument to calibrate displays, well… let's say they're not the best tool for that 😀
– You can't access the RAW data from the sensors
– There's too much processing in the brain, and even before that at the biological level.
+Sam Pullen demonstrates in this video the main limitation of Google's current computational photography approach.
Reported by most reviewers as "lag" or "bugs" in the #Nexus 5X and 6P camera app (since this is the mode activated by default), it has a simple explanation: the time needed to process the multiple exposures shot as one HDR+ picture, combined with the limited amount of RAM allowed as a buffer, makes it unsuitable for shooting consecutive pictures.
Here's the process that occurs on any Android smartphone from the past few years:
– you launch the camera app
– the viewfinder preview starts: the ISP captures full-resolution readouts from the sensor at 30 FPS and renders them at near display resolution, adjusting exposure and white balance automatically in real time
– you press the shutter button
– the camera ISP hardware takes one of the full-resolution sensor readouts, renders it at full resolution, compresses the output using the hardware JPEG encoder, and hands the compressed file as a buffer to the camera app, which saves it as a file on disk
– this sequence of operations uses almost no CPU at all and can be repeated 4 times per second or more.
Instead, here's the process with HDR+:
– until you press the shutter button: identical
– the camera ISP hardware takes multiple (up to 9*) consecutive readouts from the sensor at 30 FPS, some with positive and negative exposure compensation, renders them, and hands them to the camera application as uncompressed buffers in RAM
– the camera application feeds all these images to a super-resolution algorithm, which also tunes local contrast and color balance, compressing or extending the dynamic range locally depending on the analyzed image content
– the HDR+ algorithm takes from a few hundred milliseconds to several seconds to render a processed image
– once the HDR+ algorithm is finished, it hands the result to the hardware JPEG encoder, which returns a buffer to the camera app, which then saves it as a file
– while HDR+ processing runs in the background, you can press the shutter button again to trigger another capture, but only as long as there is enough free memory to store those multiple exposures as uncompressed image buffers in RAM.
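To get a feel for why RAM becomes the bottleneck, here's a back-of-the-envelope sketch. The frame size, bytes-per-pixel figure, and memory budget are my own assumptions for illustration, not Google's actual numbers:

```python
# Rough estimate of how much RAM one HDR+ burst of uncompressed frames
# occupies, and how many bursts fit in a camera app's memory budget.
# All figures below are assumptions, not published numbers.

SENSOR_MEGAPIXELS = 12.3   # Nexus 5X/6P sensor is ~12.3 MP
BYTES_PER_PIXEL = 1.5      # assuming YUV 4:2:0 intermediate buffers
FRAMES_PER_BURST = 9       # up to 9 frames per the Google blog post

bytes_per_frame = SENSOR_MEGAPIXELS * 1e6 * BYTES_PER_PIXEL
bytes_per_burst = bytes_per_frame * FRAMES_PER_BURST
print(f"One frame:  {bytes_per_frame / 2**20:.0f} MiB")
print(f"One burst:  {bytes_per_burst / 2**20:.0f} MiB")

# With a hypothetical 512 MiB budget for the camera process,
# only a few unprocessed bursts can be held at once:
budget_mib = 512
pending_bursts = int(budget_mib * 2**20 // bytes_per_burst)
print(f"Bursts that fit in {budget_mib} MiB: {pending_bursts}")
```

Under these assumptions a single burst is around 158 MiB of uncompressed data, which makes it obvious why only a handful of shots can be queued before the shutter stalls.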
As you can see from this list of operations, the two modes function rather differently.
– Standard mode doesn't rely on the CPU to do much besides synchronizing the preview between the camera and the display hardware, then saving the final result as a file on disk.
– HDR+ relies extensively on the RAM and CPU to build (hopefully) better images from many captures.
As a result, both the RAM and CPU are bottlenecks limiting the consecutive shooting capability.
Now you may ask: why isn't HDR a problem on other phones?
There are several explanations:
– Google chose a quality target that's unsuitable for consecutive image shooting on the Nexus 5X: too many images held in buffer and too complex processing, given the amount of RAM and computational capability available.
– Google's HDR+ implementation is not optimized enough.
– Samsung flagships since the Galaxy S5 perform their HDR rendering with hardware acceleration instead of relying on the CPU. Their implementation is efficient enough to render the preview in HDR at 30 FPS, compresses the dynamic range more aggressively and more gracefully than Google's HDR+, and doesn't slow down shooting either. Samsung's HDR rendering is even available to third-party applications through their Camera SDK, while Google's HDR+ is entirely proprietary.
How can Google improve the situation?
– Reduce the number of captures fed to HDR+ dynamically, depending on the load, to avoid stalling and making the photographer miss shots
– Revert to 100% hardware-accelerated standard shooting when at least two HDR+ images are processing in the background, instead of preventing the user from shooting. A standard image is better than no image at all. As demonstrated by +Sam Pullen, the current situation generates user frustration.
– Use more hardware acceleration (OpenGL shaders, RenderScript) and less CPU to speed up the HDR+ algorithm and catch up with the competition, improving power efficiency and avoiding further slowdowns during consecutive shooting due to CPU thermal throttling.
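The first two suggestions above could be combined into one simple policy: scale the burst size with memory pressure and fall back to a standard shot under load. A hypothetical sketch (function name, thresholds, and frame size are mine, not anything Google ships):

```python
def choose_burst_size(free_ram_mib: int, pending_bursts: int,
                      max_frames: int = 9, frame_mib: int = 18) -> int:
    """Pick how many frames to capture for the next HDR+ burst.

    Hypothetical policy: shrink the burst when RAM is scarce or when
    several bursts are still being processed, and fall back to a single
    standard frame (no HDR+) instead of blocking the shutter entirely.
    """
    if pending_bursts >= 2:
        return 1  # a standard shot is better than no shot at all
    affordable = free_ram_mib // frame_mib  # frames that fit in free RAM
    return max(1, min(max_frames, affordable))

print(choose_burst_size(free_ram_mib=400, pending_bursts=0))  # 9: full burst
print(choose_burst_size(free_ram_mib=400, pending_bursts=2))  # 1: degrade
print(choose_burst_size(free_ram_mib=90, pending_bursts=0))   # 5: low memory
```

The point isn't the exact numbers; it's that gracefully degrading to a plain capture keeps the shutter responsive instead of frustrating the user.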
* 9 frames for HDR+ was mentioned in a Google blog post last year; it could have changed in the latest camera app.
Following up on his first video where +JerryRigEverything bends a +Nexus 6P, I'm doing the same after having qualified the first one as likely non-representative, since it was done on a phone whose glass was already shattered.
I can't see any particular flaw in the method of this one, and it's done with an educational approach. This phone clearly has a point of vulnerability where demonstrated.
What I don't know is whether, like the iPhone 6 and 6 Plus, the phone can bend in regular use, little by little, and stay bent, or whether this is more about its breaking point under bending.
It's too bad that after the iPhone 6 generation experience, manufacturers still release products with a mechanical weak point such as this one. It shows that +Huawei likely didn't adapt their own stress-test process to take the new elements into consideration, something other manufacturers appear to have done in comparison.
At least, as observed previously, the phone bends/breaks above the battery, and unlike the iPhone 6 it doesn't create a safety hazard in this case.
It is expected that picks of Mohs hardness 7 and above, scratched with some pressure, will leave anything from marks to deep dents in a Gorilla Glass screen.
As soon as +JerryRigEverything does that, he compromises the structure of the whole glass by going through the coating and attacking the compressive stress layer that gives the material its strength.
This is why it is not surprising to see it shatter. It doesn't mean that the glass is particularly fragile. Any similar glass construction will behave about the same once the damage goes past the coating.
I don't get the point of the lighter burn test. Maybe because I don't smoke?
The bend test, however, isn't looking too great. At least it seems to bend above the battery, so that one should be reasonably safe. Edit: I agree the bend test might not be representative due to the prior shattering of the display, which was then unable to contribute to the structural rigidity.
Interesting as well is the component list +HTC chose for the A9, which may allow them to get a better margin on each unit sold than most.
When you think about it, it's the same sensor as in the OnePlus One (and it's hard to complain about its raw performance), not an expensive component, as we've seen it in tons of sub-$200 Chinese phones last year. But they complemented it with a stabilized lens – one that unfortunately makes you motion sick with too much movement – better color profiling than before, and cool, powerful camera features out of the box.
The panel too: likely similar if not identical to what was on the Galaxy S4 two years ago. Likewise, it's found on pretty cheap Chinese phones, such as those from Gionee.
The SoC might cost maybe half of the 810, and a smaller battery won't cost much either… But all that comes with better use of the components than before, which is encouraging indeed for the Taiwanese manufacturer.
The bet on the price might or might not pay off for them. I'm sure that in the US, $499 with a later rebate will enable some sales. The European price point of 599€ is unrealistic, however, and I'm sure it's a technique to make the phone look premium while it will always sell with mail-in or other kinds of heavy rebates at street price.
His initial approach in the video is to reassure people wondering whether the +Nexus 5X they might have just ordered is actually slow. What I got from the video, however, is that the difference in performance between the 5X and the 6P is much larger than I expected – like a one-generation gap in real-world performance.
Watching this new type of video become popular – evaluating how long you wait when using your phone, and its multitasking capabilities – makes me wonder about replicating those tests in an automated way. It might be possible to script all of that and get some quite valuable metrics allowing phones to be compared quickly.
Maybe a few Android publications could team up and fund the development of such an evaluation tool.
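As a starting point for such a tool, Android's activity manager already reports app launch times: `adb shell am start -W <component>` prints a `TotalTime` value in milliseconds. A minimal sketch of a parser for that output – the sample text mirrors the real output format, but the component and the numbers here are made up:

```python
import re
from statistics import median

def parse_total_time(am_start_output: str) -> int:
    """Extract TotalTime (ms) from `adb shell am start -W` output."""
    match = re.search(r"^TotalTime:\s*(\d+)", am_start_output, re.MULTILINE)
    if match is None:
        raise ValueError("no TotalTime in output")
    return int(match.group(1))

# In a real script you'd launch the app repeatedly through adb via
# subprocess and summarize; here we just parse canned sample output.
sample = """Status: ok
Activity: com.example.app/.MainActivity
ThisTime: 487
TotalTime: 512
WaitTime: 530
Complete"""

runs = [512, 498, 530]  # TotalTime from repeated launches (hypothetical)
print(parse_total_time(sample))  # 512
print(median(runs))              # 512
```

Cold-launch time is only one metric, but scripting it this way already gives a repeatable number instead of a stopwatch reading.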
So yeah, why lasers?! You may ask. A cinema projector is supposed to display intergalactic ships, not shoot them down.
The motivation for using lasers comes from the UHD standard and its Rec. 2020 color gamut. The Rec. 2020 colorspace's red, green, and blue primaries are single-wavelength colors. That's how you can render the most intensely saturated colors – more intense and saturated than on any AMOLED or LCD with Quantum Dots.
Because we're still only talking about making colors by adding variable amounts of 3 primaries, the Rec. 2020 gamut doesn't include all colors visible to the eye. For that you would need more than 3 primaries.
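To make the "3 primaries span a triangle" point concrete, here's a sketch that tests whether a CIE 1931 xy chromaticity falls inside the Rec. 2020 triangle. The primary coordinates are the ones from the Rec. 2020 specification; the point-in-triangle test is standard cross-product math, and the 520 nm coordinate is an approximation:

```python
# Rec. 2020 primaries in CIE 1931 xy chromaticity (from the spec):
RED   = (0.708, 0.292)
GREEN = (0.170, 0.797)
BLUE  = (0.131, 0.046)

def _cross(o, a, p):
    """Z component of the cross product (a - o) x (p - o)."""
    return (a[0] - o[0]) * (p[1] - o[1]) - (a[1] - o[1]) * (p[0] - o[0])

def in_rec2020_gamut(p):
    """True if chromaticity p lies inside (or on) the Rec. 2020 triangle."""
    signs = [_cross(RED, GREEN, p), _cross(GREEN, BLUE, p), _cross(BLUE, RED, p)]
    return all(s >= 0 for s in signs) or all(s <= 0 for s in signs)

d65 = (0.3127, 0.3290)           # standard white point
spectral_520nm = (0.074, 0.834)  # approx. single-wavelength green
print(in_rec2020_gamut(d65))            # True: white is inside
print(in_rec2020_gamut(spectral_520nm)) # False: outside even Rec. 2020
```

Even this huge triangle leaves part of the spectral locus outside, which is exactly why covering *all* visible colors would require more than 3 primaries.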
It's possible to speculate that Rec. 2020 will be large enough that we might never go further by adding more single-wavelength primaries to cover even more of the visible spectrum. But who knows, that might become the next marketing argument at some point 😁
TV manufacturers are also preparing laser-backlight LCD units. We're not sure yet if they'll reach the public, due to power-efficiency concerns. The wide-gamut TVs you'll actually be able to buy – covering not the whole Rec. 2020 gamut but a good portion of it – might stay AMOLED or LCD + Quantum Dots.
Between HDR high brightness and full Rec. 2020 support, be sure the future will have everything needed to massage your retinas just right.
"DB> Yea, same sensor (IMX377) and F/2.0 optics. But 6P has more CPU/GPU horsepower so has a few additional features like 240fps slomo (vs 120fps on 5X), Smartburst, and EIS."
If I'm interpreting this correctly, it means the LG +Nexus 5X will have no stabilization in video whatsoever. It will need some confirmation in testing, of course, but that's a huge letdown for at least two reasons:
– the Snapdragon 808 should be capable of software video stabilization. Heck, LG claims to combine optical and electronic image stabilization for better results.
– it's a regression compared to the 2-year-old Nexus 5, which featured an optically stabilized camera module that does a very good job in video mode.
Edit 2: the Moto X Pure has very good electronic image stabilization in video at 1080p30, but apparently not at 1080p60 or 4K. Example: https://www.youtube.com/watch?v=wYxmstJ5NOI
Google's engineers' answer is misguided, then.