What happens if you use a power supply with more amps than needed?

Usually it's fine, but there are a couple of caveats. The amperage rating is (mostly, but see below!) just a maximum: it states how much current the supply can deliver, and there's no harm in your load drawing less than that.


Firstly, PSUs tend to be less efficient when operated far below their rated capacity. So if you use, say, a 20 amp PSU for a 1 amp load, it will work, but it will lose more energy to waste heat than the same load running on a 2 amp supply would.
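
As a rough, purely illustrative calculation (the efficiency figures here are assumptions, not measured values): waste heat is P_loss = P_out × (1/η − 1). For a 12 V, 1 A load (12 W), an oversized supply that hypothetically only manages η ≈ 60% at that light load dissipates 12 W × (1/0.6 − 1) = 8 W as heat, while a right-sized supply at, say, η ≈ 85% wastes only about 12 W × (1/0.85 − 1) ≈ 2 W for the same job.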


Secondly, some (mostly older) switch-mode power supplies have a minimum amperage rating in addition to the maximum. Designing the supply around a guaranteed minimum output current lets manufacturers simplify the regulation circuitry and save a few cents per unit on parts.


When you turn on such a PSU without an appropriate load connected, it will likely break, and what's worse, while failing it can momentarily produce a voltage far above its rated output, which can fry your load.
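
If you must run such an old supply below its minimum load, the usual workaround is a dummy (bleeder) resistor across the output. As a purely hypothetical example with made-up numbers: for a 12 V output with a stated 0.5 A minimum, you'd need R = V / I_min = 12 V / 0.5 A = 24 Ω, which continuously dissipates P = V × I_min = 6 W, so the resistor should be rated comfortably above 6 W.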


This problem is rare with more recent PSUs, since controller chips with enough "intelligence" to avoid the issue are cheap and plentiful now, and even budget supplies use them. But check the specs just to make sure.
