The question is as follows: A high-voltage transmission line carries $1000 \mathrm{\,A}$ at $700 \mathrm{\,kV}$ over a distance of $100 \mathrm{\,miles}$. If the resistance per unit length of the wire is $0.5 \mathrm{\, \Omega/mile}$, what is the power lost to resistance?
I understand that you can get the right answer with $P = I^2R$, but why wouldn't $P = V^2/R$ work? What does the quantity $V^2/R$ physically mean in the context of this problem, such that it isn't the right way to find the power lost?
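For reference, here is the arithmetic I have in mind (my own working, so the final figures are my calculation rather than a quoted answer):

$$R = \left(0.5\,\tfrac{\Omega}{\text{mile}}\right)(100\ \text{miles}) = 50\ \Omega, \qquad P = I^2R = (1000\ \text{A})^2(50\ \Omega) = 50\ \text{MW},$$

whereas plugging the line voltage into the other formula gives

$$\frac{V^2}{R} = \frac{(700\ \text{kV})^2}{50\ \Omega} \approx 9.8\ \text{GW},$$

which is a very different number.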