When I practise problems, I come across ideal situations like constant velocities, constant accelerations, etc. But in real situations, objects don't magically gain momentum or acquire an acceleration instantaneously.

Correct me if I'm wrong, but velocity fluctuates because it has a rate of change, i.e., acceleration, and acceleration must also fluctuate because forces do not build up instantaneously. Therefore, each of these quantities has its own rate of change.

So, how many times do I have to repeat this process, i.e., keep differentiating $x(t)$, before a real object (not an idealized particle) reaches a constant rate of change?

Reet Jaiswal

2 Answers

This is really just a mathematics question.

The only functions that will eventually go to $0$ after repeated differentiation are polynomials with a finite number of terms, since those are exactly the functions you get by repeatedly integrating starting from $0$. Any other type of function will not go to $0$ under repeated differentiation.

The number of times you will need to differentiate such a polynomial to get $0$ is one more than its degree (the exponent of its highest-order term).
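As a quick sanity check, here is a minimal sketch using sympy (the cubic is just a made-up position function):

```python
# Repeatedly differentiate a degree-3 polynomial until it vanishes.
# Expect degree + 1 = 4 differentiations.
import sympy as sp

t = sp.symbols('t')
x = 5*t**3 - 2*t**2 + 7*t + 1   # hypothetical x(t), degree 3

count = 0
expr = x
while expr != 0:
    expr = sp.diff(expr, t)
    count += 1

print(count)  # prints 4
```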

BioPhysicist

Infinitely.

In general the real world is noisy. There's a branch of mathematics called stochastic calculus which essentially asks, “can we still do calculus if our functions are non-differentiable?” in order to model this noise. If you study that subfield you will notice that there is a lot of integration going on, and that is because integrated noise can be better behaved than the original noise: little peaks over here can cancel with little troughs over there. The inverse operation does the exact opposite: you had something which looked well-defined, you take a derivative, and now you see much more noise. Take another derivative and you see even more noise; your signal-to-noise ratio goes into the gutter fast.
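A rough numerical illustration of that point (a sketch assuming numpy; the signal and noise level are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1e-3
t = np.arange(0.0, 1.0, dt)
clean = np.sin(2 * np.pi * 5 * t)                   # smooth "true" signal
noisy = clean + 0.01 * rng.standard_normal(t.size)  # small additive noise

# Each finite-difference derivative amplifies the noise relative to the signal.
d1 = np.gradient(noisy, dt)
d2 = np.gradient(d1, dt)

print(np.std(noisy - clean))                                 # ~ 0.01
print(np.std(d1 - np.gradient(clean, dt)))                   # orders of magnitude larger
print(np.std(d2 - np.gradient(np.gradient(clean, dt), dt)))  # worse still
```

The exact numbers depend on the sampling step, but the trend is the point: each derivative makes the noise term dominate more.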

You can see this best in Fourier space. Differentiation acts on the Fourier transform by multiplication by $-2\pi i f$, which, in addition to rotating the global phase, amplifies the high-frequency components. To actually perform such a differentiation the signal is often smoothed beforehand, essentially applying a low-pass filter to the Fourier transform before multiplying by $f$, so that near $f = 0$ the combined factor grows linearly with $f$ but far away it is zero.
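A minimal sketch of that smoothed spectral derivative (numpy assumed; the cutoff frequency and the sign of the $2\pi i f$ factor are choices here, not part of the answer above):

```python
import numpy as np

dt = 1e-3
t = np.arange(0.0, 1.0, dt)
rng = np.random.default_rng(1)
x = np.sin(2 * np.pi * 5 * t) + 0.01 * rng.standard_normal(t.size)

X = np.fft.rfft(x)                        # spectrum of the sampled signal
f = np.fft.rfftfreq(t.size, dt)           # frequency axis in Hz

cutoff = 50.0                             # Hz, hypothetical low-pass cutoff
lowpass = np.where(f < cutoff, 1.0, 0.0)  # crude brick-wall filter

# Multiply by 2*pi*i*f only where the filter passes: near f = 0 the combined
# factor is ~linear in f, far away it is zero, so high-frequency noise is
# not amplified.  (The sign of the factor depends on the FFT convention.)
dX = 2j * np.pi * f * lowpass * X
dx_smooth = np.fft.irfft(dX, n=t.size)    # smoothed derivative estimate
```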

Take a real basketball in flight and start taking derivatives of its position: you will rapidly amplify all the chaos of the turbulent airflow that the ball runs into and then sheds from its speckled-with-grip-studs surface.

Noise, by the way, is not uniformly a bad thing. Mathematicians very often have to deal with all sorts of degeneracies: “what if two columns of this matrix are scalar multiples of each other, then the matrix becomes non-invertible, and then my whole argument falls apart.” Or, “what if my matrix contains Jordan blocks and does not have a diagonalization.” In physics, the few times that we actually care about this we make a big stink about it and call it a law of conservation; for the most part we can assume that this does not happen, that no two things perfectly balance each other, and so forth. So all of our matrices are invertible and all that.

CR Drost