A $0.25\text{ kg}$ sphere of radius $0.07\text{ m}$ is dropped from rest from the top of a building.
Assuming it does not hit the ground first: when, and after what distance, will it reach terminal velocity?
Air density: $1.225\text{ kg/m}^3$
Drag coefficient: $0.56$
Please explain your thinking process.
The concept that eludes me is how the velocity progresses over time, before reaching terminal velocity, when the acceleration is not constant.
My best guess so far was to calculate the terminal velocity using the drag equation
$F_d = 0.5 \rho C_d A v^2$
and setting it equal to the weight $mg$. Taking the sphere's cross-section as the reference area, $A = \pi r^2 = \pi (0.07)^2 \approx 0.0154\text{ m}^2$, this gives:
$0.25 \cdot 9.8 = 0.5 \cdot 1.225 \cdot 0.56 \cdot 0.0154 \cdot v_t^2$
$v_t = (2.45/0.00528)^{1/2} \approx 21.5\text{ m/s}$
(where:
$\rho$: air density
$C_d$: drag coefficient
$A$: reference area
$v$: velocity relative to the air)
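To double-check the arithmetic, here is a short script (assuming, as above, the sphere's cross-section $\pi r^2$ as the reference area):

```python
import math

m = 0.25      # mass, kg
r = 0.07      # radius, m
g = 9.8       # gravitational acceleration, m/s^2
rho = 1.225   # air density, kg/m^3
Cd = 0.56     # drag coefficient

A = math.pi * r**2                            # cross-sectional reference area, m^2
v_t = math.sqrt(2 * m * g / (rho * Cd * A))   # from mg = 0.5*rho*Cd*A*v_t^2

print(f"A   = {A:.4f} m^2")
print(f"v_t = {v_t:.2f} m/s")
```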
And now I'm not sure where to go next. This is what I've come up with to answer the questions above:
If I know that the initial acceleration (before any drag builds up) is $g = 9.8\text{ m/s}^2$ and the final acceleration (at terminal velocity) is going to be $0$, I calculated an average, constant acceleration of:
$\bar{a} = (a_i + a_f)/2 = (9.8 + 0)/2 = 4.9\text{ m/s}^2$
With that in mind, I divided the calculated terminal velocity by this assumed constant acceleration to get the time:
$t = v_t / \bar{a}$
Given that velocity equals distance over time, I found the distance at which $v_t$ was reached using the average velocity between $0$ (initial) and $v_t$ (terminal), i.e. $v_t/2$:
$d = \bar{v}\,t = (v_t/2)(v_t/\bar{a}) = v_t^2/(2\bar{a})$
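This constant-acceleration shortcut can be sanity-checked numerically. Below is a minimal Euler-integration sketch of the equation of motion $m\,dv/dt = mg - 0.5\rho C_d A v^2$, assuming quadratic drag and the values given above; the $99\%$ cutoff is an arbitrary choice of mine, since terminal velocity is only approached asymptotically and is never exactly reached:

```python
import math

m, r, g = 0.25, 0.07, 9.8      # mass (kg), radius (m), gravity (m/s^2)
rho, Cd = 1.225, 0.56          # air density (kg/m^3), drag coefficient

A = math.pi * r**2             # cross-sectional reference area, m^2
k = 0.5 * rho * Cd * A         # drag constant in F_d = k * v^2
v_t = math.sqrt(m * g / k)     # terminal velocity, where drag balances weight

v = s = t = 0.0
dt = 1e-4                      # time step, s
while v < 0.99 * v_t:          # arbitrary 99% cutoff (v_t is only asymptotic)
    a = g - (k / m) * v * v    # net acceleration: m dv/dt = mg - k v^2
    v += a * dt
    s += v * dt
    t += dt

print(f"v_t ~= {v_t:.2f} m/s; 99% reached after t ~= {t:.2f} s, d ~= {s:.1f} m")
```

The velocity seems to creep up on $v_t$ much more slowly than a constant-acceleration model would predict, which is exactly the behaviour I don't know how to handle analytically.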
Now, I think this is rather simplistic and probably wrong reasoning, so I would love to get some help!