I've been studying the basics of General Relativity, and my question is: does it make sense to say that near a black hole (or any massive body) distances increase, the way the contours on a topographical map get closer and closer near a steep hill? So that an object escaping from near the event horizon actually has to travel thousands of times farther than the 'real' distance to reach a faraway destination. This seems to jibe with the idea that an infalling object (from an outsider's perspective) appears to travel an infinite distance, fading away but never crossing the horizon, and also with the Shapiro time delay. But I want to ask in case this is a false analogy.

And if it is true, how would that "stretched distance" (i.e. all the local distances along the trajectory added together) relate to the distance seen by an outsider? How could the outsider ever say "I am 100 million miles from a black hole" if any test particle would actually travel a much greater distance to traverse it? And how could a distant observer measure the distance to a black hole anyway? How would a distant observer measure a valid distance to any object, if not by sending and receiving a light ray?
1 Answer
You can form $\int ds$ along a radial line at some given Schwarzschild time $t$, and this gives a notion of ruler distance (proper distance). The ruler distance from any finite radius down to the horizon is finite. You can see this also by looking at Flamm's paraboloid: distances along the paraboloid are finite.
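For concreteness, here is that integral written out explicitly (a standard textbook evaluation, added for illustration rather than taken from the answer above; $r_s = 2GM/c^2$ denotes the Schwarzschild radius). The ruler distance from the horizon to a point at radial coordinate $r$ is

$$\Delta s \;=\; \int_{r_s}^{r} \frac{dr'}{\sqrt{1 - r_s/r'}} \;=\; \sqrt{r\,(r - r_s)} \;+\; r_s \ln\!\left(\frac{\sqrt{r} + \sqrt{r - r_s}}{\sqrt{r_s}}\right),$$

which is finite for every finite $r$. For example, at $r = 2r_s$ this gives $\Delta s \approx 2.30\,r_s$: the stretching compared with the coordinate distance $r_s$ is real but modest, nothing like the "thousands of times" of the question's analogy.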
However the Schwarzschild coordinate time required for a particle to move outwards from the horizon is infinite: it diverges as the starting point approaches $r_s$.
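A quick way to see this divergence (again a standard computation added here for illustration, not part of the original answer): for an outgoing radial light ray, setting $ds^2 = 0$ in the Schwarzschild metric gives $c\,dt = dr/(1 - r_s/r)$, so the coordinate time to climb from $r_0$ out to $r$ is

$$c\,\Delta t \;=\; \int_{r_0}^{r} \frac{dr'}{1 - r_s/r'} \;=\; (r - r_0) \;+\; r_s \ln\!\left(\frac{r - r_s}{r_0 - r_s}\right),$$

which diverges logarithmically as $r_0 \to r_s$. The same logarithmic term is what shows up as the Shapiro time delay mentioned in the question.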
Andrew Steane