I'm not a physicist, but I'm trying to understand how light propagates for rendering purposes. A lot of it has made sense so far, but I've stumbled upon something called the "Fundamental Theorem of Radiometry": $$ \dfrac{L_1}{L_2} = \dfrac{n^2_1}{n^2_2} $$ where $L_1, L_2$ are the radiances and $n_1, n_2$ are the refractive indices in medium 1 (e.g. air) and medium 2 (e.g. water) respectively, along a ray. A fuller definition is given in section 2.5.4.2 here: https://dirsig.cis.rit.edu/docs/WaterManual.pdf
Since air has an IoR of $\approx 1$ and water of $\approx 1.34$, this would mean that $$L_2 \approx L_1 \cdot \dfrac{1.34^2}{1^2},$$ so the radiance becomes larger in water. This is also illustrated in Figure 7 of this source: https://www.oceanopticsbook.info/view/light-and-radiometry/visualizing-radiances#x1-87
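Just to make the numbers concrete, here is the arithmetic I have in mind as a small sketch (my own toy calculation, not from the linked sources, and ignoring any Fresnel transmission loss at the surface):

```python
import math

# Sketch: radiance scaling across an air -> water interface per the n^2 law,
# ignoring Fresnel losses at the surface (my own assumption for simplicity).
n_air, n_water = 1.0, 1.34
L_air = 1.0                                   # radiance just above the surface (arbitrary units)
L_water = L_air * (n_water / n_air) ** 2      # radiance just below the surface
print(L_water)                                # ~1.80, i.e. roughly 80% larger in water
```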
My question as a layman would now be: does this also mean a sensor (e.g. a camera) would perceive the incoming light as brighter right below the water surface than right above it?
This seems to contradict my understanding of conservation of energy. I realize it has something to do with the solid angle of the beam decreasing in water, but I don't see how that matters for a single beam travelling in a straight line, or, from the perspective of a sensor, for a measurement over a larger angle covering all incoming light.
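To show what I mean by the solid angle decreasing, here is a toy check I wrote (my own sketch, assuming a flat surface, Snell's law, and no Fresnel losses): a thin cone of directions refracted from air into water has its projected solid angle compressed by $n_1^2/n_2^2$, which is exactly the inverse of the radiance increase above.

```python
import math

# Toy check (my own assumptions): refract a thin annular cone of directions through
# a flat air -> water surface with Snell's law and compare projected solid angles.
n1, n2 = 1.0, 1.34
theta1a, theta1b = math.radians(30.0), math.radians(30.1)   # thin cone in air
theta2a = math.asin(n1 / n2 * math.sin(theta1a))            # refracted angles in water
theta2b = math.asin(n1 / n2 * math.sin(theta1b))

def proj_solid_angle(ta, tb):
    # Projected solid angle of the annulus between polar angles ta and tb:
    # integral of cos(theta) * sin(theta) dtheta dphi = pi * (sin(tb)^2 - sin(ta)^2)
    return math.pi * (math.sin(tb) ** 2 - math.sin(ta) ** 2)

ratio = proj_solid_angle(theta2a, theta2b) / proj_solid_angle(theta1a, theta1b)
print(ratio, (n1 / n2) ** 2)   # both ~0.557: the cone shrinks by n1^2/n2^2,
                               # cancelling the n2^2/n1^2 increase in radiance
```

So the shrinking cone and the growing radiance cancel for the power carried through the surface, but I still don't see what that implies for what a camera right below the surface would actually record.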