> chargers and cables are advertised with the wattage they support, but it's really the voltage and current that matters.
Wattage is what you get when you multiply amps and volts. You just said the equivalent of "It's not about the ice cream, but about the ice and the cream".
Also, it arguably does have everything to do with volts - if you put too much voltage into a device, most likely you're going to damage the device, its surroundings, or yourself (e.g. in a house fire).
USB-PD on USB-C negotiates the highest voltage your device and charger will support.
You can buy a 30W USB-PD charger that supports a smaller subset of voltages than a 20W USB-PD one, and in practice it will deliver less power to your device, because the voltages the two support don't line up well.
What each actually supports tends not to be specified or disclosed, and when it is, it takes a lot of digging to figure out.
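To make this concrete, here's a rough sketch of why the headline wattage can mislead. The PDO (power data object) lists below are made up for illustration; real chargers advertise their voltage/current pairs during USB-PD negotiation:

```python
# Hypothetical example: deliverable power depends on which voltages the
# charger and device BOTH support, not on the charger's headline wattage.
# These PDO lists are invented for illustration.

def max_deliverable_watts(charger_pdos, device_voltages):
    """charger_pdos: list of (volts, max_amps) the charger offers.
    device_voltages: set of voltages the device can request."""
    best = 0.0
    for volts, amps in charger_pdos:
        if volts in device_voltages:
            best = max(best, volts * amps)
    return best

# A "30W" charger that only offers 5V and 15V:
charger_30w = [(5.0, 3.0), (15.0, 2.0)]
# A "20W" charger that offers 5V and 9V:
charger_20w = [(5.0, 3.0), (9.0, 2.22)]
# A phone that can only request 5V or 9V:
phone = {5.0, 9.0}

print(max_deliverable_watts(charger_30w, phone))  # 15.0 - only 5V lines up
print(max_deliverable_watts(charger_20w, phone))  # 19.98 - 9V @ 2.22A
```

With these (hypothetical) numbers, the nominally weaker 20W charger actually charges the phone faster, which is exactly the frustration: the box only tells you the best case.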
Yes, and power negotiation does not protect against improper voltage whatsoever. That wasn't my point, despite my addressing negotiation in another comment.
If I put 120 volts through to my MacBook directly over the cable it will invariably damage it. Voltage negotiation doesn't matter here.
Also the original comment was unclear; they've edited to clarify. But of course, I'm the bad guy on HN for not being able to read minds.
E: I realize my GP comment was also unclear. I should have said "excessive" voltage, not improper I suppose. "Improper" in my head meant outside the rated voltage for either end/the cable. I'm not the one being a pedant here.
Out of curiosity: is there any instance where it isn't implemented correctly that actually damaged/fried some electronics? What I've seen is that in the worst case it doesn't charge, or negotiates at a very low voltage and charges slowly.
Your own analogy can be used to explain how you do not fully understand the situation. 10 cups of cream and 1 cup of ice do not make ice cream, even though that technically is ice and cream. Just like a 10V 5A charger is not the same as a 50V 1A charger.
The important part with regards to power delivery isn't simply wattage. If a hypothetical charger can put out 96W at 1V@96A, it's never going to deliver even close to that amount of power to a device that expects 96W at 20.5V@4.7A.
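That mismatch is easy to show with made-up numbers. Both of the chargers below say "96W" on the box, but only one of them can actually power the hypothetical laptop:

```python
# Two "96W" chargers, one hypothetical laptop. Specs are invented for
# illustration; the laptop can only accept the voltage it's designed for.
laptop_input = (20.5, 4.7)   # (volts, amps) the device expects

charger_a = (1.0, 96.0)      # 96W as 1V @ 96A - useless for this laptop
charger_b = (20.5, 4.7)      # 96W as 20.5V @ 4.7A - usable

def usable(charger, device):
    v_c, a_c = charger
    v_d, a_d = device
    # The voltage must match and the charger must supply at least the
    # current the device draws at that voltage.
    return v_c == v_d and a_c >= a_d

print(usable(charger_a, laptop_input))  # False
print(usable(charger_b, laptop_input))  # True
```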
Exactly. I was not just trying to be a jerk. Their analogy was actually a good one, they just used it incorrectly. Ice cream is only ice cream in a certain range of ice-to-cream ratios, just like a charger needs a "close enough" mix of voltage and amps for it to reach useful wattage (depending on the device).
They're not interchangeable. USB-C has power negotiation, but that doesn't mean the devices support those voltages. You still need to understand the voltage ratings on both ends of a cable.
That's exactly what OP is saying - chargers are commonly advertised as being able to supply a particular wattage, but that particular wattage is only attainable if the device being charged supports the maximum voltage the charger is capable of delivering.
OP is complaining that that leaves the true wattage of a given device/charger pair unknowable from the charger's packaging alone without further information as to what voltages and at what currents it can supply on request from the device. It's certainly a valid frustration.
The reverse situation would be the case (cable supports 3A, but not 6A). For the range of voltages supported by USB-PD, the cable only cares about amperage. Copper wire doesn't care much if it's 5V, 50V, or 250V, it'll carry it the same. However, if the size of the wire isn't correct for the amperage, it'll overheat and potentially catch fire.
It’s the end device that cares about voltage and amperage both as the voltage steppers inside have to be ready to handle providing the right voltage to the chips and the amperage has to be right to handle the load.