Quantum mechanics is a theory of (noncommutative) probability that is, in my opinion, best understood in Bayesian terms. The information on a system is encoded in the state (the probability) and in its unitary dynamical evolution (equivalently, one can think of the evolution as affecting only the observables, i.e. the random variables, but that is not important for the question at hand).
The act of observing is in some sense "outside" of any probability theory (for the latter simply describes the expected outcomes of such observations, but not the act of observing itself). But let us put that aside for a moment, and focus on measurements.
Measurements by means of instruments, and measurement processes in general, can be understood quite well within the probability theory of quantum mechanics; but one needs to enlarge the system under consideration to include the instrument. The composite "system+instrument" evolves unitarily, while the effective dynamics of the system alone (obtained by tracing out the degrees of freedom of the instrument) is not unitary, and thus irreversible. This is the von Neumann scheme of measurement.
What happens to the noncommutative probability in this measurement scheme is the following. Suppose that the initial state of the system is a pure state (a state of maximal Bayesian information), described by the orthogonal projection $P_{\psi}$ onto the span of the vector $\psi=\sum_{n\in\mathbb{N}}a_n \varphi_n\in H$, where the $\varphi_n$ are the eigenvectors of the random variable $T$ we are measuring, with corresponding eigenvalues $\lambda_n$. The instrument is in the state $P_\Psi$, with $\Psi\in K$. The resulting initial state of the composite "system+instrument" is the tensor product state $P_{\psi\otimes \Psi}$ on $H\otimes K$. The measurement process is given by a unitary evolution $U(t)$ on $H\otimes K$ that entangles $\psi$ and $\Psi$: $U(t)(\psi\otimes\Psi)$ is no longer a simple tensor product.
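As a toy illustration of this entangling step (my own minimal example: the system and the instrument are both taken to be two-level, and the premeasurement interaction is modeled by a CNOT gate, which is just one convenient choice of $U$), one can check numerically that the state after the interaction is no longer a product:

```python
import numpy as np

# Toy premeasurement: psi = a_0 phi_0 + a_1 phi_1, the instrument starts in
# its "ready" state Psi, and U (a CNOT gate) copies the T-eigenbasis label
# of the system onto the instrument.
a = np.array([0.6, 0.8])                 # amplitudes a_n (|a_0|^2 + |a_1|^2 = 1)
Psi = np.array([1.0, 0.0])               # instrument "ready" state
initial = np.kron(a, Psi)                # psi ⊗ Psi: a product state on H ⊗ K

U = np.array([[1, 0, 0, 0],
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

final = U @ initial                      # U(t*) (psi ⊗ Psi)

# A vector in H ⊗ K is a simple tensor product iff its coefficient matrix
# has rank 1; here the rank jumps to 2, so the state is entangled.
print(np.linalg.matrix_rank(initial.reshape(2, 2)))  # -> 1
print(np.linalg.matrix_rank(final.reshape(2, 2)))    # -> 2
```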
At time $t^*$, when the measurement is finished, we trace out the degrees of freedom of the instrument to obtain the effective description of the system after the measurement. This process is "irreversible", and yields for the system the mixed state
$$\sum_{n\in\mathbb{N}}\lvert a_n\rvert^2 P_{\varphi_n}\; .$$
This state is then evolved by means of the unitary dynamics $u(t+t^*)$ of the isolated system (it is no longer interacting with the instrument), and is therefore $\sum_{n\in\mathbb{N}}\lvert a_n\rvert^2 P_{u(t+t^*)\varphi_n}$.
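Continuing the same toy example (repeating the setup so the snippet runs on its own), the partial trace over the instrument indeed returns the diagonal mixed state $\sum_{n}\lvert a_n\rvert^2 P_{\varphi_n}$, and a free evolution that is diagonal in the $T$-eigenbasis leaves it untouched:

```python
import numpy as np

a = np.array([0.6, 0.8])                 # amplitudes a_n of psi in the T-eigenbasis
Psi = np.array([1.0, 0.0])               # instrument "ready" state
U = np.array([[1, 0, 0, 0],              # toy premeasurement interaction (CNOT)
              [0, 1, 0, 0],
              [0, 0, 0, 1],
              [0, 0, 1, 0]], dtype=float)

chi = U @ np.kron(a, Psi)                # state of "system+instrument" at time t*
rho = np.outer(chi, chi.conj())          # the (pure) density matrix on H ⊗ K

# Partial trace over the instrument (the second tensor factor): reshape to
# (system, instrument, system, instrument) and trace the instrument indices.
rho_sys = np.trace(rho.reshape(2, 2, 2, 2), axis1=1, axis2=3)
print(rho_sys)                           # diag(0.36, 0.64) = sum_n |a_n|^2 P_{phi_n}

# A free evolution u that preserves the projections P_{phi_n}
# (here: diagonal in the T-eigenbasis) leaves the mixed state invariant.
u = np.diag(np.exp(1j * np.array([0.3, 1.1])))
print(np.round(u @ rho_sys @ u.conj().T, 10))  # still diag(0.36, 0.64)
```

The CNOT here merely stands in for any premeasurement unitary that correlates the eigenbasis of $T$ with mutually orthogonal pointer states; it is that orthogonality which makes the off-diagonal terms $a_n\overline{a_m}$ disappear under the partial trace.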
It is, in my opinion, misleading to say that we cannot know when a measurement has occurred. The maximal information on the system includes knowing whether, when, and how many times you have put your system in interaction with an instrument (or with another environment, for that matter). That essentially amounts to knowledge of the unitary evolution of the composite "system+instrument(s)(+environment(s))"; and it is no different from the supposed maximal knowledge that we can have in classical mechanics (which is, in fact, the commutative probability theory that emerges from quantum mechanics when noncommutative effects are neglected).
Of course, as in any probability theory, if you have performed $M$ measurements, the correct way of computing the probability of the outcomes $(x_1,\dotsc,x_M)$ is by means of conditional probabilities. Now suppose that the time evolution preserves the projectors $P_{\varphi_n}$ above, i.e.
$$\sum_{n\in\mathbb{N}}\lvert a_n\rvert^2 P_{u(t+t^*)\varphi_n}=\sum_{n\in\mathbb{N}}\lvert a_n\rvert^2 P_{\varphi_n}\; .$$
Then suppose you perform $M$ measurements identical to the one described above. Of course the resulting probability of obtaining the outcomes $(\lambda_1,\dotsc,\lambda_M)$ is zero unless $\lambda_1=\dotsb=\lambda_M=\lambda_n$ for some $n$, and that outcome has probability $\lvert a_n\rvert^2$.
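To spell this out (a standard computation, assuming the projective state update described by the scheme above and using that the evolution between measurements commutes with the $P_{\varphi_n}$), the joint probability of a sequence of outcomes is
$$\mathbb{P}(\lambda_{n_1},\dotsc,\lambda_{n_M})=\bigl\lVert P_{\varphi_{n_M}}\dotsm P_{\varphi_{n_1}}\psi\bigr\rVert^2=\lvert a_{n_1}\rvert^2\,\delta_{n_1 n_2}\dotsm\delta_{n_{M-1}n_M}\;,$$
which vanishes unless all the outcomes coincide, and equals $\lvert a_n\rvert^2$ when they are all $\lambda_n$.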
This is a rather convincing scheme in my opinion, if interpreted (as it should be) as a probability theory. In addition, it does not need to assume any "emergent multiverse" or the like: one simply assumes probabilistic knowledge of the system and of the surrounding environment, essentially including whether, how many times, and how the two interact or are coupled.
References. The existence of measurement processes behaving as described above for any quantum-mechanical observable (with discrete or continuous spectrum) has been proved by Ozawa. This has recently been extended to all (interesting) observables in QFT by Okamura and Ozawa. An introduction to von Neumann's measurement scheme can be found in his book, Mathematical Foundations of Quantum Mechanics.