How can I calculate the proper time elapsed in an accelerating frame of reference until it reaches a given velocity relative to the stationary frame (the frame it was at rest in before the acceleration began)?
For example, I depart from Earth on a spaceship under a constant proper acceleration of 10 m/s². How long do I need to keep accelerating (as measured on my own clock) until an Earthbound observer sees me moving at 0.87c relative to him? Or at 0.99995c? And how much time would elapse in the Earth frame from launch until engine cutoff in each case?
In classical mechanics, an acceleration of 10 m/s² sustained for 1000 seconds gives the same final speed relative to the starting frame as 100 m/s² sustained for 100 seconds. I'm not sure whether the same holds in special relativity: would my speed relative to Earth be the same after accelerating at 10 m/s² for 30 000 000 seconds as after 300 m/s² for 1 000 000 seconds (acceleration and time both measured with the travelling clock)?
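For reference, here is a quick numerical sketch of that comparison using what I understand to be the textbook relativistic-rocket relations for constant proper acceleration, v = c·tanh(aτ/c) and t = (c/a)·sinh(aτ/c) (assuming these are the right formulas to apply; the function names are just illustrative):

```python
import math

C = 299_792_458.0  # speed of light, m/s

def coast_velocity(a, tau):
    """Launch-frame velocity after proper acceleration a (m/s^2)
    sustained for proper time tau (s): v = c*tanh(a*tau/c)."""
    return C * math.tanh(a * tau / C)

def earth_time(a, tau):
    """Elapsed launch-frame (Earth) time for the same burn:
    t = (c/a)*sinh(a*tau/c)."""
    return (C / a) * math.sinh(a * tau / C)

def proper_time_to_reach(a, v):
    """Proper time needed to be seen moving at launch-frame speed v:
    tau = (c/a)*atanh(v/c)."""
    return (C / a) * math.atanh(v / C)

# The two burns from the question have the same product a*tau (3e8 m/s),
# so these formulas predict the same final speed for both:
v1 = coast_velocity(10.0, 30_000_000.0)
v2 = coast_velocity(300.0, 1_000_000.0)
print(v1 / C, v2 / C)

# Proper time at 10 m/s^2 until Earth sees 0.87c, in days:
tau = proper_time_to_reach(10.0, 0.87 * C)
print(tau / 86400, earth_time(10.0, tau) / 86400)
```

If these relations are right, the final speed depends only on the product aτ (the rapidity aτ/c), which would mean the classical equivalence carries over, even though the resulting speed is no longer simply aτ.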
Edit: To keep it simple, I'm only interested in the special-relativity point of view, not in how the acceleration itself might also affect the clock. I believe that at small values such as 1 g, any acceleration-induced time dilation can be ignored.