The limit to which you refer is known as the thermodynamic limit in statistical mechanics. It consists of taking the limit of an infinite number of particles ($N\rightarrow \infty$) and infinite volume ($V\rightarrow \infty$) while keeping the density $N/V$ finite.
In a solid, both the electrons and the atomic nuclei contribute to the thermodynamic and elastic quantities, such as Young's modulus. The electronic contributions, which typically aren't negligible, are usually treated quantum mechanically, while the contributions of the nuclei can be treated classically. You can see how the thermodynamic limit plays a role by considering non-interacting electrons in a periodic lattice (I hope you are familiar with quantum mechanics, because that's at the heart of it). If you solve the Schrödinger equation for the Hamiltonian of non-interacting electrons in the potential of the lattice, you will find a relation between the energy $E$ and the wave-number $k$ of the electrons. For instance, in the one-dimensional tight-binding model, you find:
\begin{equation}
E(k)=-2t\cos(ka)
\end{equation}
where $a$ is the separation between atoms in the lattice, and $t$ is the hopping amplitude, which describes the probability amplitude for an electron to tunnel from one atom to a neighboring one.
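To get a feel for this dispersion (a small addition of mine, assuming $t>0$): the band bottom sits at $k=0$ and the band top at $k=\pm\pi/a$,
\begin{equation}
E(0)=-2t, \qquad E(\pm\pi/a)=+2t,
\end{equation}
so the allowed energies span a band of width $4t$.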
In order to fully solve the problem, since the Schrödinger equation is a differential equation, you need to set boundary conditions. The choice of boundary conditions matters for "small" systems, but it plays no significant role for "big" systems (that is, in the thermodynamic limit). Imposing boundary conditions results in the quantization of momentum, and you'll usually find that you have as many allowed quantum states as sites in the lattice (counting one orbital per site and neglecting spin). For periodic boundary conditions, the allowed states are given by:
\begin{equation}
k_n=\frac{2\pi n}{L}
\end{equation}
where $L$ is the length of the system and $n$ is an arbitrary integer. Note that this implies that the electrons can't take any value of the energy, but only those corresponding to some $k_n$ (the energy is quantized). One can notice that the separation in wave-number between two "adjacent" states (that is, the state for a given $n$ and the state for $n+1$) will then be
\begin{equation}
\Delta k=k_{n+1}-k_n=\frac{2\pi}{L}
\end{equation}
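To make the quantization concrete, here is a minimal numerical sketch (my own illustration, not part of the standard derivation; the values $a=t=1$ and $N=10$ sites are arbitrary) that lists the allowed $k_n$, their spacing, and the corresponding tight-binding energies:

```python
import numpy as np

# Arbitrary illustrative parameters (assumptions, not physical values)
N_sites = 10          # number of lattice sites
a = 1.0               # lattice spacing
t = 1.0               # hopping amplitude
L = N_sites * a       # length of the chain

# Periodic boundary conditions: one allowed state per site, n = 0, ..., N-1
n = np.arange(N_sites)
k_n = 2 * np.pi * n / L            # quantized wave-numbers, spaced by 2*pi/L
E_n = -2 * t * np.cos(k_n * a)     # tight-binding dispersion at each k_n

print("Delta k =", 2 * np.pi / L)
print("number of allowed states =", len(k_n), "(equals the number of sites)")
print("allowed energies:", np.round(E_n, 3))
```

If you increase `N_sites`, you can watch the spacing $\Delta k = 2\pi/L$ shrink while the number of states grows, which is exactly the mechanism behind the thermodynamic limit discussed below.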
To count the number of states $N$ per unit length, you can write the following sum:
\begin{equation}
\frac{N}{L}=\frac{1}{L}\sum_{k} 1=\frac{1}{L}\sum_{k} \frac{\Delta k}{\frac{2\pi}{L}}=\frac{1}{2\pi}\sum_{k} \Delta k
\end{equation}
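As a quick consistency check (my own remark): the distinct allowed $k$ span an interval of total width $2\pi/a$ (the first Brillouin zone; shifting $k$ by $2\pi/a$ gives back the same state), so the sum of the spacings is $\sum_k \Delta k = 2\pi/a$ and
\begin{equation}
\frac{N}{L}=\frac{1}{2\pi}\cdot\frac{2\pi}{a}=\frac{1}{a},
\end{equation}
that is, one state per lattice site, consistent with the counting mentioned above.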
Being able to do sums over the states is really useful in solid-state physics for calculating average values and thermodynamic quantities through a statistical-mechanics approach.
Now comes the important part: when you take the thermodynamic limit ($L\rightarrow \infty$), the separation $\Delta k$ becomes infinitely small, which implies that the possible values of the energy become continuous, and that you have to replace the sum over discrete states by an integral over the (now continuous) values of $k$:
\begin{equation}
\sum_{k} \Delta k \rightarrow \int dk
\end{equation}
so:
\begin{equation}
\frac{1}{L}\sum_{k} 1 \rightarrow \frac{1}{2\pi}\int dk
\end{equation}
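If you want to see this replacement happen numerically, here is a short sketch (again my own illustration, with arbitrary units $a=t=1$; the quantity computed, the energy per unit length of the lower half of the band, is just a convenient test case). The discrete sum over allowed states approaches the value given by the integral as $L$ grows:

```python
import numpy as np

a, t = 1.0, 1.0   # arbitrary illustrative units

def energy_per_length_sum(N_sites):
    """(1/L) * sum of E(k_n) over the N/2 lowest-energy allowed states."""
    L = N_sites * a
    k_n = 2 * np.pi * np.arange(N_sites) / L
    E_n = np.sort(-2 * t * np.cos(k_n * a))
    return E_n[: N_sites // 2].sum() / L

# Thermodynamic limit of the same quantity from the integral rule:
# (1/2pi) * integral of -2 t cos(k a) over -pi/(2a) < k < pi/(2a) = -2t/(pi*a)
E_limit = -2 * t / (np.pi * a)

for N_sites in (4, 16, 64, 256, 1024):
    print(N_sites, round(energy_per_length_sum(N_sites), 5), "limit:", round(E_limit, 5))
```

The discrete result drifts toward $-2t/(\pi a)\approx -0.637\,t/a$ as the chain gets longer, which is the sum-to-integral replacement at work.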
If you do the same analysis in the three-dimensional case (by the way, you can do it in $d$ dimensions), you have to replace $L$ by the volume $V$ and more $2\pi$ factors appear. The systematic way of taking this limit is then to replace the sums by integrals in your calculations in the following way:
\begin{equation}
\frac{1}{V}\sum_{\mathbf{k}} \rightarrow \frac{1}{(2\pi)^3}\int d^3k
\end{equation}
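Just for completeness (this is not spelled out above, but it follows from the same counting argument): in $d$ dimensions the rule reads
\begin{equation}
\frac{1}{V}\sum_{\mathbf{k}} \rightarrow \frac{1}{(2\pi)^d}\int d^dk,
\end{equation}
where $V$ is the $d$-dimensional volume (length in 1D, area in 2D, and so on).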
While it is true that in reality you will never have $L\rightarrow \infty$, you can consider that you have entered this regime when $L/a \gg 1$. For two atoms this will never be the case, since $L=a$ by definition (the size $L$ of your system is just the interatomic distance $a$), but given typical interatomic distances, which are on the order of angstroms or nanometers, a sample of a few micrometers can already be treated by this approach.