Given your mathematical background, I believe you will find a mathematical answer more appealing than a physical one.
> For an objective viewer at an arbitrary macro scale, how does one define a set of macrostates to evaluate the entropy for?
To understand micro- and macrostates, we are going to start with the basics - a physical system is any set of physical objects which we have chosen to examine. Every physical system has properties, which we call parameters if they can be quantified mathematically. For example, the pressure of a gas is a parameter because it can be quantified by a number. The positions of the gas's particles are also parameters, since they can be quantified by vectors. However, a gas's colour is not a parameter, because colour is a percept (the closest things to colour which can be called parameters are the wavelengths of light which the gas reflects, but these are still not the colour itself, for different people will perceive them as different colours).
Depending on which set of parameters of a system we choose to examine, we have different parameterisations or descriptions of it. Note that these always describe the same system but represent different aspects of it. Given a parameterisation with $n$ parameters $P = (p_1,\cdots,p_n)$, we call any $n$-tuple of values $(v_1,\cdots,v_n)$ for the parameters $p_1,\cdots,p_n$ a physical state of the system in the parameterisation $P$. The set of all possible physical states in the parameterisation $P$ is known as its phase space, and I will denote it as $S(P)$.
To talk about entropy, we always need two different parameterisations of the same system. Suppose we have two parameterisations $P = (p_1,\cdots, p_n)$ and $P' = (p_1',\cdots,p_m')$ of the same system, such that the phase space of $P$ is bigger than the phase space of $P'$ and there is a surjection $f: S(P) \to S(P')$ from the phase space of $P$ to the phase space of $P'$. This just means that if we know the values of the parameters in $P$, we can calculate the values of the parameters in $P'$. In this case, we call any physical state in $S(P')$ a macrostate and any physical state in $S(P)$ a microstate. The multiplicity $\Omega$ of a given macrostate $M \in S(P')$ is the number of microstates which correspond to it, i.e. the number of physical states $m \in S(P)$ such that $f(m) = M$.
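To make these definitions concrete, here is a minimal sketch with an invented toy system (four coins, not from your question): the microstates are tuples of coin faces, the coarse parameterisation records only the number of heads, and the surjection $f$ and multiplicity $\Omega$ are computed by direct enumeration.

```python
from itertools import product

# Toy system: 4 coins. A microstate in S(P) is a tuple of faces,
# e.g. ('H', 'T', 'H', 'H'); there are 2^4 = 16 of them.
microstates = list(product('HT', repeat=4))

def f(m):
    """The surjection f: S(P) -> S(P'). The coarse parameterisation P'
    records only the number of heads, so each macrostate is an integer."""
    return m.count('H')

def multiplicity(M):
    """Omega(M): the number of microstates m with f(m) = M."""
    return sum(1 for m in microstates if f(m) == M)

print([multiplicity(M) for M in range(5)])  # -> [1, 4, 6, 4, 1]
```

Note that the multiplicities sum to 16: the macrostates partition the phase space $S(P)$ into the preimages of $f$.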
Since, in real physical scenarios, multiplicities can become very large, we introduce the concept of entropy. The entropy $S$ of the physical system described by $P$ and $P'$ is defined as the Boltzmann constant $k_B$ times the natural logarithm of the multiplicity $\Omega$ of the system's current macrostate.
$$S \overset{\text{def}}{=} k_B \cdot \ln \Omega$$
The Boltzmann constant is there for historical reasons, and the logarithm gives us a relatively small number to work with, which is easier than working with multiplicities directly. And... this is basically it - that's all there is to entropy. It is just a relation between two different parameterisations which we choose to describe a physical system. Note that different choices of the two parameterisations $P$ and $P'$ result in different entropies.
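The definition translates directly into a one-line function; the sketch below (the multiplicity values are just illustrative) also shows how the logarithm tames enormous numbers.

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, in J/K

def entropy(omega):
    """S = k_B * ln(Omega) for a macrostate with multiplicity Omega."""
    return k_B * math.log(omega)

# A unique macrostate (Omega = 1) has zero entropy:
print(entropy(1))        # -> 0.0
# Even a huge multiplicity yields a modest logarithm:
print(math.log(10**23))  # roughly 53, instead of a 24-digit number
```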
> Is the micro state always defined in terms of atoms, or sub-atomic particles? Can it be generalized to any scale < macro with the second law still holding?
In the case of gases, $P = (\vec{r}_1,\vec{p}_1,\cdots,\vec{r}_n,\vec{p}_n)$ is taken to be the parameterisation which studies the positions and momenta of all gas particles and $P' = (p,V,T)$ is the parameterisation which studies the pressure, volume and temperature of the gas. The entropy of the gas is thus a measure of how many different configurations of the gas's particles give rise to the specific pressure, volume and temperature that the gas is measured to have. The second law
of thermodynamics applies only in this
specific context - if you change the parameterisations, the entropy value will be different, even if the temperature $T$ is part of one of the parameterisations.
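The dependence on the choice of parameterisation can be demonstrated with a small sketch (again using an invented four-coin system rather than a gas): two different coarse parameterisations of the same microstate space assign different entropies to the very same microstate.

```python
import math
from itertools import product

k_B = 1.380649e-23  # Boltzmann constant, in J/K
microstates = list(product('HT', repeat=4))  # the same S(P) in both cases

# Two different coarse parameterisations P' of the same system:
f_count  = lambda m: m.count('H')      # macrostate = number of heads
f_parity = lambda m: m.count('H') % 2  # macrostate = parity of head count

def entropy_of(m, f):
    """Entropy of the macrostate f(m) that the microstate m belongs to."""
    omega = sum(1 for x in microstates if f(x) == f(m))
    return k_B * math.log(omega)

m = ('H', 'H', 'T', 'T')  # one particular microstate
# Under f_count its macrostate has Omega = 6; under f_parity, Omega = 8.
print(entropy_of(m, f_count) < entropy_of(m, f_parity))  # -> True
```

Same system, same microstate, two different entropy values - which is exactly why the second law must always be read relative to a fixed pair of parameterisations.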