
The question is as follows: A high-voltage transmission line carries $1000 \mathrm{\,A}$ at $700 \mathrm{\,kV}$ for a distance of $100 \mathrm{\,miles}$. If the resistance per length of the wire is $0.5 \mathrm{\, \Omega/mile}$, what is the power loss due to resistive losses?

I understand that you can get the right answer with $P = I^2R$, but why wouldn't $P = V^2/R$ work? What does the quantity $V^2/R$ physically mean in the context of this problem, such that it is not the way to find the power lost?

userManyNumbers

1 Answer


If you have a wire at 1000 volts (relative to ground) carrying 2 amps, and it has a total resistance of 10 ohms, the power dissipated in the wire is $P = I^2R = 2 \times 2 \times 10 = 40$ watts. The dissipation depends only on the current and the resistance, not on the 1000-volt line voltage.
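
As a cross-check (my own arithmetic, using the answer's numbers rather than the question's): $P = I^2R = (2\,\mathrm{A})^2 \times 10\,\Omega = 40\,\mathrm{W}$, whereas naively plugging the 1000-volt wire-to-ground voltage into $V^2/R$ would give $(1000\,\mathrm{V})^2 / 10\,\Omega = 100\,\mathrm{kW}$, which is not the power dissipated anywhere in this circuit.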

The power dissipation has no relationship to the voltage between the wire and some other reference (such as "ground"), because the "voltage drop" along the wire depends only on the current and the resistance. That drop can be found from $V = IR$: in this example, $V = 2 \times 10 = 20$ volts.
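
Put another way (again my own arithmetic): once you know the 20-volt drop along the wire, $V^2/R$ does give the right answer, $P = (20\,\mathrm{V})^2 / 10\,\Omega = 40\,\mathrm{W}$, exactly matching $I^2R$. The formula only misleads when you substitute the wire-to-ground voltage for $V$, because that is not the voltage across the wire's resistance.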

Thus, "V = IR" is only relevant when the entire "V" value is applied directly across the "R", so that I = V/R.

Hot Licks