
I have a set of hardware that needs a power supply of 120 watts. I understand that power is current times voltage: $$P = VI$$ If I were to purchase a 13.8 V, 15 A DC power supply, or a 13.8 V, 50 A DC power supply, what differences should I expect? Or, say, a 13.8 V, 5 A DC supply? At the moment the hardware runs off a running car battery, which sits at about 13.8 volts; the battery is connected to an adapter that outputs the 120 watts the hardware needs.

My understanding so far: multiplying a supply's voltage and current ratings gives the maximum power it can output, so the voltage and current my hardware demands dictate what I should look for in a power supply. I also know that many of these supplies let you adjust the voltage and current limits to suit your needs. Is there anything I am missing when it comes to understanding DC power supplies?

1 Answer


The power supply must provide the voltage that your load requires. Too high a voltage and your load will probably be damaged; too low a voltage and things probably won't work.

With the correct voltage, the power supply must be able to supply at least the current required by your load - a higher current rating is fine, as the load will only draw whatever current it requires.
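To put numbers on it, here is a quick sketch (the 120 W and 13.8 V figures come from your question; the 5/15/50 A ratings are the supplies you listed) showing the current your load actually draws and which ratings cover it:

```python
# Current drawn by a 120 W load at 13.8 V, from P = V * I.
P = 120.0   # watts, load power
V = 13.8    # volts, supply/battery voltage
I = P / V   # amps the load will draw (about 8.7 A)

# A supply's current rating is a maximum it can deliver,
# not a current it forces into the load.
for rating in (5, 15, 50):
    verdict = "OK" if rating >= I else "too small"
    print(f"{rating:>2} A supply: {verdict}")
```

So both the 15 A and 50 A supplies will work identically for this load; the 50 A unit simply has headroom you won't use, while the 5 A unit cannot deliver the roughly 8.7 A required.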

Peter Bennett