The default standard power limit is still the same as it ever was on each USB version
Nah, the default power limit started with 100 mA, or 500 mA for “high-power” devices. There are very few devices out there today that limit themselves to those amounts.

It all began with non-spec host ports which just pushed however much current the circuitry could muster, rather than just the required 500 mA. Some had a proprietary way to signal just how much they were willing to push (this is why iPhones used to be very fussy about the charger you plug them into), but most cheap ones didn’t. Then all the device manufacturers started pulling as much current as the host would provide, rather than limiting themselves to 500 mA. USB-BC was mostly an attempt to standardize some of the existing usage, and USB-PD came much later.
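Roughly how that detection looks from the device side, as a sketch (the voltage thresholds and the classify_port helper are illustrative assumptions, not taken from any spec text; the only firm part is that a USB-BC 1.2 dedicated charger shorts D+ to D-):

# Rough sketch of legacy charger detection from the device's point of view.
# Thresholds are illustrative; Apple-style chargers put fixed voltages on
# D+/D- to advertise current, BC 1.2 dedicated chargers tie D+ and D- together.

def classify_port(d_plus_v: float, d_minus_v: float, shorted: bool) -> str:
    """Guess how much current a 5 V port is willing to supply."""
    if shorted:
        # USB-BC 1.2 Dedicated Charging Port: D+ shorted to D-.
        return "DCP, typically up to 1.5 A"
    if abs(d_plus_v - 2.7) < 0.3 and abs(d_minus_v - 2.7) < 0.3:
        return "Apple-style high-current charger (illustrative thresholds)"
    if abs(d_plus_v - 2.0) < 0.3 and abs(d_minus_v - 2.0) < 0.3:
        return "Apple-style 500 mA charger (illustrative thresholds)"
    # No signature found: assume a plain USB 2.0 host port.
    return "SDP: 100 mA default, 500 mA after enumeration"

print(classify_port(0.0, 0.0, shorted=True))   # -> DCP, typically up to 1.5 A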
A USB host being able to provide more current than the device supports isn’t an issue though. A USB device simply won’t draw more than it needs. There’s no danger of dumping 5 A into your 20-year-old mouse because it defaults to a low-power 100 mA device. Even if the port can supply 10 A at 5 V or something silly, the current is limited by the voltage and the load (the mouse).
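To put numbers on that, treating the mouse as a simple ~50 Ω resistive load (an assumed figure purely for illustration; real devices are switching loads):

# A 100 mA mouse on a 5 V rail behaves roughly like a 50 ohm load.
# The port's current *capability* never enters the equation; the load does.
V = 5.0                 # bus voltage, volts
R_mouse = 50.0          # assumed effective load resistance, ohms

I_drawn = V / R_mouse   # Ohm's law: current is set by voltage and load
print(f"{I_drawn:.2f} A")   # 0.10 A, whether the port can source 0.5 A or 10 A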
Well, regardless, the spec only cares about devices drawing more current than the host can supply, and that has always been consistent. Electricity doesn’t really work in a way the host can “push” current; the only way it could do that would be with a higher voltage, which would damage anything not designed for it. But that’s what the USB-PD spec is for: negotiating what voltage to supply, up to 48 V now.
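For a rough idea of the fixed levels PD negotiates (a from-memory sketch, not quoted from the spec; check the USB-PD documents for the authoritative list):

# Nominal USB-PD fixed voltage levels and the max current they're commonly
# paired with (from memory, illustrative only).
pd_levels = [
    (5,  3.0),    # 15 W, the classic USB voltage
    (9,  3.0),    # 27 W
    (15, 3.0),    # 45 W
    (20, 5.0),    # 100 W, needs a 5 A e-marked cable
    (28, 5.0),    # EPR (PD 3.1) levels start here
    (36, 5.0),
    (48, 5.0),    # 240 W max, the 48 V mentioned above
]
for volts, amps in pd_levels:
    print(f"{volts:>2} V x {amps} A = {volts * amps:.0f} W")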
Electricity doesn’t really work in a way the host can “push” current
On a basic level this is precisely how electricity works: a power supply literally pushes electrons by creating a difference in electric field magnitude between two points; or, in other words, by applying an electromotive force to the electrons; or, in other words, by creating a voltage between two points. A load then does something with those electrons that usually creates an opposing electric field, be it heating a wire, spinning a motor, or sustaining a chemical reaction within a battery.

The power produced by the source and released at the load is proportional to (voltage) × (number of electrons being pushed by the supply per unit of time), i.e. voltage times current. That is usually the limiting factor for a power supply: it can hold a steady voltage until it has to push too many electrons, then the voltage starts dropping.
Edit: I see what you mean now. Yeah, for a given voltage, it is the load that determines the current, so there’s no safety issue with this for the load. However there could be issues with the cables. IIRC there was an issue with noise being introduced by higher current draws that meant you couldn’t charge and transfer data at the same time with some cables.
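The cable part is easy to see with a back-of-the-envelope IR-drop calculation (the 0.25 Ω round-trip resistance is just an assumed number for a cheap cable):

# Voltage dropped across the cable itself grows with current draw,
# which is part of why marginal cables misbehave under charging loads.
R_cable = 0.25   # assumed round-trip resistance of a cheap cable, ohms

for I in (0.5, 3.0):                     # amps: old-spec draw vs. fast charging
    v_drop = I * R_cable                 # V = I * R
    p_cable = I ** 2 * R_cable           # heat dissipated in the cable, P = I^2 * R
    print(f"{I} A: {v_drop:.2f} V lost in the cable, {p_cable:.2f} W as heat")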
Well, the original comment was about “pushing more current through than the spec”, and that’s pretty much what we did…