I have a device whose output power (watts) is controlled by a software setting. If the software setting is such that you only need to supply 20 amps to the device, is that all you have to supply? The device will output double the power if the software is set that high, which would require an input current of 40 amps.

Transistor
LDS
  • Are you asking about sizing an input breaker? Do you have a datasheet for the device? – vir Apr 11 '22 at 21:07
  • It would be a lot easier to answer the question if you explained what the power supply and the load actually are. For example, if your load is resistive and its value is constant then the power required is given by $ P = I^2R $ where $I$ is the current so doubling the current would quadruple the power required. There's an [Edit] link below your question. – Transistor Apr 11 '22 at 21:21
  • 1
    your question wording is unclear ... it does not really make any sense ... what are you trying to ask? ... If you only need to supply 20 amps to the device is that all you have to supply? – jsotola Apr 11 '22 at 21:29
  • Let me revise my question. We have an outdoor LED display whose output, 9000 nits, is controlled by software. We want to install this display indoors, where the output will be 2000 nits. When we adjust the output to 2000 nits, the input amperage drops dramatically. The available amperage indoors is sufficient to run the display at the lower nit output. We are being told that even though the 2000 nit output requires a lower amperage input (controlled by internal software settings), we need to provide the higher amperage input. I want to know if this is required based on NEC regulations. – LDS Apr 11 '22 at 21:35
  • My question was just a "what is required" question. I do not need any math applied to the answer. – LDS Apr 11 '22 at 21:36
  • Where did the UL comment come from? All I asked was: per NEC code, does the amount of incoming amperage/current need to cover the LED display when the display is at maximum nit output? If the output is being set at 50% via internal software, does NEC code state anything in reference to this question? – LDS Apr 11 '22 at 22:55
  • If the supply is sized to deliver 10 A and your load decides to draw 20 A, then the circuit breaker should trip. Ultimately it becomes an operational decision: if for some reason the display is cranked up and the breaker trips, that may be annoying for the user, as would wanting to increase the brightness at a later stage. Unless there is a specific reason not to rate the supply for the maximum, just go with the maximum. It is just a cost concern. – Kartman Apr 11 '22 at 23:54
  • 1
    Code generally requires that utilization equipment be installed according to the ratings marked on the equipment, not the setting of an adjustment. If you are being told what to do by someone who represents the authority having jurisdiction, you probably need to accept that as the final word. If not, you may be able to convince the AHJ to accept it if the equipment is marked to reflect the limitation and the branch circuit protection is appropriate. –  Apr 12 '22 at 03:29
  • Deleted my "UL" comments in deference to Charles Cowie's better-informed opinion (immediately above). – Solomon Slow Apr 12 '22 at 14:39

2 Answers


9000 nits × 3.426 ≈ 30,834 lumens (taking the display area as roughly one square metre).

An LED display typically outputs about 90 lumens per watt.

Your display power ≈ 30,834 / 90 ≈ 342.6 W.

Assuming a power factor of 0.7, the supply current at 120 V ≈ 342.6 / 0.7 / 120 ≈ 4 A, and at 240 V ≈ 2 A.

At the lower luminance of 2000 nits, the current would be about 1 A at 120 V and 0.5 A at 240 V.

Your display may be conveniently plugged into a 5 A wall socket.
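As a sanity check, here is a minimal sketch of the same arithmetic in Python. The 3.426 lm/nit factor, the 90 lm/W efficacy, the 0.7 power factor, and the ~1 m² display area are assumptions carried over from this answer, not datasheet values:

```python
# Back-of-envelope supply-current estimate for an LED display.
# All constants below are this answer's assumptions, not datasheet values.

def supply_current(nits, lm_per_nit=3.426, lm_per_watt=90.0,
                   power_factor=0.7, volts=120.0):
    """Estimate supply current (A), treating the display as ~1 m^2."""
    lumens = nits * lm_per_nit            # luminous flux
    watts = lumens / lm_per_watt          # LED electrical power
    return watts / power_factor / volts   # line current

for nits in (9000, 2000):
    for volts in (120.0, 240.0):
        amps = supply_current(nits, volts=volts)
        print(f"{nits} nits @ {volts:.0f} V: {amps:.1f} A")
```

Run as-is, this prints roughly 4.1 A and 2.0 A at 9000 nits, and 0.9 A and 0.5 A at 2000 nits, matching the figures above.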

vu2nan

I'm afraid you must size for the maximum load that the software can possibly command.

The equipment has a nameplate stating either VA or Amps. VA is like "watts with benefits". You can convert that to Amps by dividing by nominal voltage (e.g. 120V or 240V). Or vice versa.

You generally need to use the nameplate data, adjusted by NEC 210.19(A)(1) and whatever other parts of NEC apply - often a 125% factor for continuous loads. Many people assume 125% applies to all loads. While that's not quite true, nobody ever got fired for applying it LOL.

So, for instance, if the nameplate says 40 A, then 40 A × 125% = 50 A, and you must size wires and breakers for that.
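A minimal sketch of that sizing arithmetic, assuming the entire load is continuous so the 125% factor applies. The standard breaker sizes are the common ones from NEC 240.6(A); the function names and the 9600 VA nameplate are just for illustration:

```python
# Rough branch-circuit sizing from nameplate data. This assumes the
# entire load is continuous (NEC 210.19(A)(1) 125% factor) and ignores
# conductor type, temperature derates, etc. - illustration only.

STANDARD_BREAKERS = [15, 20, 25, 30, 35, 40, 45, 50, 60, 70, 80, 90, 100]

def nameplate_amps(volt_amps, volts):
    """Convert a VA nameplate rating to amps at nominal voltage."""
    return volt_amps / volts

def minimum_breaker(load_amps, continuous=True):
    """Apply 125% to a continuous load, then round up to a standard size."""
    amps = load_amps * 1.25 if continuous else load_amps
    return next(size for size in STANDARD_BREAKERS if size >= amps)

print(nameplate_amps(9600, 240))   # 40.0 A from a 9600 VA nameplate
print(minimum_breaker(40.0))       # 50, matching the 40 A example above
```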

Surely we must be talking about low voltage DC wiring (to which NEC still applies, but with a lot of asterisks and notch-outs). In that case, you need to look at the nameplate of the DC power supply and base the circuit and wire sizes on that.

Note that NEC 110.2 and 110.3(B) always apply: use approved equipment and follow the instructions and labeling, which were approved with the equipment and define the approved use and the limits of testing.

"If the output is being set at 50% via internal software, does NEC code state anything in reference to this question?"

It's not just NEC - it's UL. UL approves equipment. UL's view is that you must wire for the "worst case, software malfunction, vengeful employee, max commandable" load - unless the software is safety-rated (which is how EVs and EVSEs get to have programmable current ratings).

Remember Stuxnet and the uranium centrifuges that were destroyed by sending bad commands to their controllers? UL will not approve a situation where a hacker could set 1000 houses on fire by sending a worm to a million EVSEs to find which houses have Zinsco or FPE panels.