lxgr 2 days ago
It could of course also be included only for power efficiency without being strictly necessary for optimal charging performance, but cooling performance can vary between cases and with environmental conditions, so I suspect it might be at least partially a functional requirement.
coder543 2 days ago | parent
More fearmongering. Provide proof, or please stop confusing people. This thread was asking for recommendations to understand how to charge their phone. The answer is extremely simple. This is not the thread to dig into irrelevant details. Read the room.

If you want to get technical, the difference in conversion losses between AVS at 13.3V@3A and a non-AVS 15V PDO @ 2.667A is going to be rounding error. Both need to be regulated down to about 3.5V to 4.2V by the phone's internal circuitry. A hypothetical buck converter that is just as efficient at 15V -> 3.5V as it is at 13.3V -> 3.5V would see no difference at all in heat generation. In the real world, such a small difference in input voltages at >1A would likely yield a less than 1% efficiency difference. 1% of 40W is 0.4W, but I repeat: I am saying less than 1%, not 1%.

Based on tests I've seen in the past, previous iPhones could dissipate at least 4W of power continuously, forever, in normal ambient conditions. For a 95% efficient buck converter (which is probably conservative here), the total thermal load from charging at 40W is 2W. Unless the thermal load exceeds 4W, there should be no difference in charging speed potential. 2W + 0.4W would be 2.4W, which is well below the 4W threshold, and the buck converter on a high end smartphone is probably more than 95% efficient, while the difference in input voltages probably yields less than 1% difference in efficiency, so 2.4W is a very conservative number here.

Keep in mind that phones can dissipate significantly more than 4W for brief bursts, and the charge curve on the battery isn't going to allow 40W charging for very long regardless of the thermals, so the phone likely has even more thermal headroom here in the real world. Even a 20V PDO @ 2A -> 3.5V should be perfectly fine here.
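The back-of-envelope arithmetic above can be sketched in a few lines. Note the efficiency figures here are the comment's assumptions (95% buck efficiency, a worst-case 1% penalty for the higher non-AVS input voltage, and a 4W continuous dissipation limit), not measured values:

```python
# Illustrative thermal-budget estimate for 40W charging.
# All constants below are assumptions from the discussion, not measurements.

def conversion_loss_w(power_w: float, efficiency: float) -> float:
    """Heat dissipated by a converter delivering the given power."""
    return power_w * (1.0 - efficiency)

CHARGE_POWER_W = 40.0       # target charging power
BUCK_EFFICIENCY = 0.95      # assumed (likely conservative) buck converter efficiency
AVS_PENALTY = 0.01          # assumed worst-case extra loss: 15V fixed PDO vs 13.3V AVS
DISSIPATION_LIMIT_W = 4.0   # continuous dissipation the phone can reportedly sustain

base_loss = conversion_loss_w(CHARGE_POWER_W, BUCK_EFFICIENCY)   # 2.0 W
worst_case = base_loss + CHARGE_POWER_W * AVS_PENALTY            # 2.4 W

print(f"buck converter loss:        {base_loss:.1f} W")
print(f"worst case without AVS:     {worst_case:.1f} W")
print(f"within thermal headroom:    {worst_case < DISSIPATION_LIMIT_W}")
```

Even with every assumption biased against the fixed PDO, the estimated 2.4W of heat stays comfortably under the 4W continuous dissipation figure, which is the core of the argument.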
I see no evidence that there is even a possibility of AVS making a difference here unless you wrap the phone in real insulation (not a phone case) or stick it in a toaster oven while charging. Talking about how there might be an imperceptible difference under very specific extreme conditions is exactly what I would call fearmongering in a thread where someone was specifically asking for recommendations. It won't matter to anyone, ever, in any real world situation.

This is not the thread to be contrarian for the sake of digging into minutiae. You're welcome to start a thread where that is the goal.

In summary: OP does not need to buy a fancy AVS-enabled charger. They shouldn't avoid such chargers, but there is no reason to seek one out. Unless, that is, you have material proof to the contrary that you want to present, but it seems like you don't.