Since I've read that the Nexus 6P's only "fast charge" protocol is 5V/3A over a USB-C to USB-C connection, I was curious to find out how it behaves with other chargers.
And since I don't have any fancy equipment right now, this is using only software 🙂
– Using the provided charger and USB-C – USB-C cable:
Takes about 3A as expected
– Using a Samsung 5V/2A charger, which uses the same signaling as Quick Charge 1.0, and the provided USB-A – USB-C cable (which is ridiculously short):
Takes about 2A, close to the maximum for this charger and definitely more than the ~1A max for USB power without any charger signaling.
So it stays reasonably compatible with most equipment dating from the pre-Quick Charge 2.0 era.
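For context, readings like these can be taken in software from the kernel's power-supply sysfs interface. A minimal sketch; the `/sys/class/power_supply/battery/current_now` node and its microamp unit are common on Android kernels, but the exact path, unit, and sign convention vary per device:

```python
def microamps_to_amps(value_ua: int) -> float:
    """Convert a raw current_now reading (microamps) to amps."""
    return value_ua / 1_000_000


def read_charge_current(path: str = "/sys/class/power_supply/battery/current_now") -> float:
    """Read the instantaneous charge/discharge current, in amps.

    Whether charging reads as positive or negative differs per kernel.
    """
    with open(path) as f:
        return microamps_to_amps(int(f.read().strip()))


# Example: a raw reading of 2_950_000 µA is about 2.95 A
print(microamps_to_amps(2_950_000))
```

Polling this node while swapping chargers and cables is enough for rough comparisons like the ones above, without any measurement hardware.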
Either way, I'm not a fan of the solution Google adopted for the Nexus 5X and 6P charging.
The USB-C cables provided are annoyingly short. Since they have to carry 3A they're thick and inflexible.
Choosing a 50% higher current instead of a higher voltage is inefficient: it requires thicker, more expensive, and therefore shorter cables.
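To illustrate why higher current is the expensive choice: resistive loss in the cable grows with the square of the current, so delivering the same 15W at a higher voltage cuts cable losses sharply. A back-of-the-envelope sketch, where the 0.1Ω round-trip cable resistance is just an assumed round number:

```python
def cable_loss_w(power_w: float, voltage_v: float, cable_ohms: float) -> float:
    """Power dissipated in the cable: P_loss = I^2 * R, with I = P / V."""
    current_a = power_w / voltage_v
    return current_a ** 2 * cable_ohms


R = 0.1  # assumed round-trip cable resistance, ohms

print(cable_loss_w(15, 5, R))  # 5 V / 3 A    -> 0.9 W lost in the cable
print(cable_loss_w(15, 9, R))  # 9 V / 1.67 A -> ~0.28 W, about 3x less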
In the end, only 15W to charge a 3450mAh battery is just not fast.
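Rough numbers on why 15W feels slow, assuming a 3.85V nominal cell voltage and ~90% charging efficiency (both assumed figures) and ignoring the current taper at the end of charge:

```python
capacity_ah = 3.450  # Nexus 6P battery: 3450 mAh
nominal_v = 3.85     # assumed nominal Li-ion cell voltage
efficiency = 0.90    # assumed charger-to-cell efficiency

energy_wh = capacity_ah * nominal_v    # ~13.3 Wh stored in the battery
hours = energy_wh / (15 * efficiency)  # at 15 W from the charger
print(round(hours * 60))               # ~59 minutes, best case
```

That is a best-case full charge in about an hour; in practice the taper near 100% stretches it out further.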
USB-C might be the future standard, but as implemented on this year's Nexus devices, it is not a particularly convincing solution compared to Qualcomm Quick Charge 2.0 or 3.0.
#supercurioBlog #charging #Nexus #Nexus6P
In Album Nexus 6P charging simple test
12 thoughts on “How much does the Nexus 6P take from a Quick Charge 1.0 charger?”
I agree with you that higher voltage gives better efficiency (that's why we use very high voltage lines for long distances in the power grid).
But when the charger and phone are not 100% compatible, it's easier to just keep the voltage at the standard 5V and keep the current below the standard 1A (that most chargers can handle).
Anyway, for the time being I'm happy with the current situation because (unlike the Nexus 5) the 6P battery easily lasts me all day. I bought a few cheap 1m USB-A – USB-C cables and I always charge it at night with an old 1A charger. I carry the original fast charger that came with the phone just in case I need it, but I've only used it once. I think that slow charging may degrade the battery less in the long run and extend its useful life.
I've heard the Android Central guys saying that we should be careful when plugging different chargers and cables into the new N6P because the phone may demand more power than the charger can handle, but I've already tested 2 cables and 2 adapters with 2 or 3 old chargers as well as an external battery power bank and have had no issues.
Does the silicon just not exist yet for higher voltage steppings? It's definitely there in the USB PD spec already.
+Miguel Silva what degrades the battery during charging is heat, and it appears to be well managed on the 6P when charging at maximum speed, including while using the phone.
I'm not concerned by premature aging that way.
Slower charge won't hurt of course 😉
Yep, heat seems to be well controlled. I've used the phone recording 4K video with OpenCamera while charging from an external battery power bank at the same time, and the phone temperature never got above 40°C (not sure if it was charging at 1A or 2A, but it was keeping up with demand).
Anyway, at night I have more than enough time to charge slowly, so the battery temperature when charging at 1A should be lower than at 3A; even if it's just slightly lower, that's something 😉
Apart from +Miguel Silva's comment, it seems no one cares about the long-term wear of batteries. That sounds weird to me. I may well be a wear maniac (e.g. I compile even small things in RAM and I don't have a swap partition on my SSDs), but I use the 1A charger with my OnePlus One and I'm very diligent about not going below 20% of charge left and never going over 95%. I've charged it to 100% only a few times in over a year, and I feel sorry for the battery when that happens.
Maybe you change devices quickly and don't worry too much about long-term battery wear. But I want to keep my phone performing well for at least 2 years, and it matters to me if my battery can only withstand half the stress it could when I first bought the phone.
It seems that a few years ago we were more sensitive to this topic: we used to have Android stop charging at 96% by default, and we had BLX (Battery Life eXtender, a patch by Ezekeel, a once-famous XDA developer, which could stop the charge at whatever level you wanted). It was for the Nexus S; maybe +François Simond remembers (we had Voodoo Colors available for that device). I'm confused by our thinking: we are more and more dependent on things that run on batteries, but we tend to care less and less about them.
About quick-charging a battery, this site http://batteryuniversity.com/learn/article/how_to_prolong_lithium_based_batteries explains how higher-current charging gives much better short-term performance but more wear over the long run, while lower-current charging gives worse short-term performance but a much larger number of life cycles.
That site defines "C" as the ratio of charging current (in A) to battery capacity (in Ah). If it's 0.7, you get the best mid-term performance (which is more or less what vendors provide with their default chargers).
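Following that definition, the stock Nexus 6P charger lands not far from the 0.7C figure mentioned; a quick sketch of the C-rate computation:

```python
def c_rate(charge_current_a: float, capacity_ah: float) -> float:
    """C-rate = charging current divided by battery capacity."""
    return charge_current_a / capacity_ah


# Nexus 6P: 3 A stock charging into a 3.45 Ah battery
print(round(c_rate(3.0, 3.45), 2))  # ~0.87 C

# The same battery on an old 1 A charger
print(round(c_rate(1.0, 3.45), 2))  # ~0.29 C
```

So the stock charger sits a bit above 0.7C, while a plain 1A charger is a gentle ~0.3C.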
Using proper USB-C power draw with the proper pull-up/pull-down (PU/PD) resistors is the future-proof and safe way to charge your devices.
USB PD cables/adapters will be the future; after that, you'll just need a single adapter/cable to charge all your USB-C devices!
And USB-C has much more potential: a maximum draw of up to 20V/5A = 100W!
I hope you follow this collection by +Benson Leung (Google staff working on USB-C) for a better understanding of USB-C.
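For reference on those pull-up resistors: in USB Type-C, the source advertises how much current it can supply through the value of the Rp pull-up on the CC line, which the sink reads before drawing power. A sketch of the advertisement table from the Type-C spec, for the case where Rp is pulled up to VBUS (4.75–5.5V):

```python
# USB Type-C source current advertisement via the CC pull-up (Rp),
# for Rp connected to VBUS (4.75-5.5 V), per the Type-C specification.
RP_ADVERTISEMENT = {
    56_000: "Default USB power (500 mA USB 2.0 / 900 mA USB 3.1)",
    22_000: "1.5 A @ 5 V",
    10_000: "3.0 A @ 5 V",
}


def advertised_current(rp_ohms: int) -> str:
    """Map a source's Rp value (ohms) to the current it advertises."""
    return RP_ADVERTISEMENT.get(rp_ohms, "invalid Rp")


print(advertised_current(10_000))  # what a proper 3 A source advertises
```

This is exactly what Benson Leung's cable reviews keep checking: a legacy USB-A to C cable must use the 56kΩ "default" value, not 10kΩ, or the phone will try to pull 3A from a charger that can't supply it.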
15 watts should be enough for anyone. 😉
+Wolfgang Rupprecht Sure, for now. But 25-watt power adapters have already started to show up, so it wouldn't be surprising to see 30W smartphone chargers by the end of 2016 over a pure and proper USB-C to C connection!
+Chandraprakash G pretty sure +Wolfgang Rupprecht's message was a trap 😁
+Riccardo Berto it's probably because in most cases battery wear is difficult to perceive even over two years of smartphone use, while software changes during the same period (power-consuming bugs introduced or fixed, power-hungry but valuable new features) can have far more dramatic results.
In a way, by never allowing yourself to go below the 20% level, you impose from the start the worst-case battery performance loss observed over several years, which might make the approach counterproductive.
Regarding optimization based on the generic information we all read on websites or Wikipedia, it's also worth considering that some of it might be misguided, as is always the case without measurement.
Like, you know: being driven by a belief instead of data.
Batteries continue to evolve in density every year, their aging characteristics might evolve as well.
I remember my first 2A charger getting hot while charging, whereas devices charging at much higher wattage today barely get warm.
These are reasons why I'm not sure we can really compare today with 3-4 years ago.
+Chandraprakash G It was meant as a tongue-in-cheek message in the vein of Bill Gates' claim that "640k of memory should be enough for anyone".
On the other hand, the power inflation we are seeing is a bit disturbing. With each revision we increase the risk of a fire from thermal runaway in the lithium-based battery.
+Wolfgang Rupprecht True, I really hate Lithium based batteries (rare and expensive element).
But there are dozens of practical storage mechanisms evolving rapidly, as the need is imminent with the move to solar PV panels, which need good energy storage for nighttime and for heavier uses too.
Some examples: metal-air batteries, hydrogen-based energy storage, compressed-air energy storage, compact fuel cells, etc.