I'm following the text Introduction to Electrodynamics by Griffiths, and I came across the following in an in-text problem:
Sketch the vector function $\mathbf{v} = \frac{\hat{\mathbf{r}}}{r^2}$, and compute its divergence. The answer may surprise you... can you explain it?
Well, the answer did surprise me: the sketch clearly shows a field spreading radially outward (like the field lines of a positive point charge), yet the calculation gives a divergence of zero. What's going wrong?
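For concreteness, the calculation I have in mind uses the divergence formula in spherical coordinates; since $\mathbf{v}$ has only a radial component $v_r = 1/r^2$, only the radial term survives (valid for $r \neq 0$):

$$\nabla \cdot \mathbf{v} = \frac{1}{r^2}\frac{\partial}{\partial r}\left(r^2 v_r\right) = \frac{1}{r^2}\frac{\partial}{\partial r}\left(r^2 \cdot \frac{1}{r^2}\right) = \frac{1}{r^2}\frac{\partial}{\partial r}(1) = 0.$$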
This is the solution I have from a manual, which also doesn't make sense to me:
The answer is that $\nabla \cdot \mathbf{v} = 0$ everywhere except at the origin, but at the origin our calculation is no good, since $r = 0$, and the expression for $\mathbf{v}$ blows up. In fact, $\nabla \cdot \mathbf{v}$ is infinite at that one point, and zero elsewhere.
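If it helps to pin down where I'm stuck: as I understand it, the divergence theorem says the volume integral of $\nabla \cdot \mathbf{v}$ should equal the flux of $\mathbf{v}$ through the bounding surface, and over a sphere of radius $R$ centered at the origin that flux is

$$\oint \mathbf{v} \cdot d\mathbf{a} = \int \left(\frac{\hat{\mathbf{r}}}{R^2}\right) \cdot \left(R^2 \sin\theta \, d\theta \, d\phi \; \hat{\mathbf{r}}\right) = 4\pi,$$

independent of $R$. So a divergence that is "zero everywhere except one point" apparently still has to integrate to $4\pi$, which I don't see how to reconcile.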
Could someone please help me understand the situation? Any help would be appreciated, thanks!
P.S. I understand that this has been asked earlier on Physics SE, but I didn't understand the answers. The one with the most upvotes said:
Pretty sure the question is about $\frac{\hat{r}}{r^2}$, i.e. the electric field around a point charge. Naively the divergence is zero, but properly taking into account the singularity at the origin gives a delta-distribution. (Answer by @genneth)
What is the delta distribution being referred to here?