Quoting from the Feynman Lectures on Physics - Vol I:
The atoms are 1 or $2 \times 10^{-8}\ \rm cm$ in radius. Now $10^{-8}\ \rm cm$ is called an angstrom (just as another name), so we say they are 1 or 2 angstroms (Å) in radius. Another way to remember their size is this: if an apple is magnified to the size of the earth, then the atoms in the apple are approximately the size of the original apple.
How does this hold true?
Let us assume the radius of an average apple is about $6\ \rm cm$ ($0.06\ \rm m$). The radius of the earth is about $6371\ \rm km$ ($6\,371\,000\ \rm m$). Therefore, a magnification of $\frac{6\,371\,000\ \mathrm{m}}{0.06\ \mathrm{m}} \approx 1.06 \times 10^{8}$, i.e., about $10^{8}$ times, is required to magnify an apple to the size of the earth.
If we magnify an atom of radius, say, 1 angstrom ($10^{-10}\ \rm m$) by $1.06 \times 10^{8}$ times, we get only $1.06 \times 10^{-2}\ \rm m$, i.e., about $1.06\ \rm cm$, well short of the apple's $6\ \rm cm$ radius. The atom has not been magnified to the size of the original apple. How does the quoted statement in the book hold true?
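As a quick sanity check, here is the same arithmetic as a short Python sketch; the $6\ \rm cm$ apple radius and the 1 to 2 Å atomic radii are the values assumed above:

```python
# Quick numerical check of the apple-to-earth magnification argument.

apple_radius = 0.06       # m, assumed average apple radius (6 cm)
earth_radius = 6.371e6    # m, mean radius of the earth (~6371 km)

# Magnification needed to blow the apple up to the size of the earth.
magnification = earth_radius / apple_radius
print(f"magnification: {magnification:.3e}")  # ~1.062e+08

# Apply the same magnification to atomic radii of 1 and 2 angstroms.
for atom_radius in (1e-10, 2e-10):  # m
    magnified = atom_radius * magnification
    print(f"{atom_radius:.0e} m atom -> {magnified * 100:.2f} cm")
# Output: roughly 1.06 cm and 2.12 cm, versus the 6 cm apple radius.
```

Even taking the larger 2 Å radius from the quote, the magnified atom comes out to only about $2\ \rm cm$, still a factor of a few smaller than the apple.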