I have read somewhere that electromagnetic waves, when refracting, only "appear" to slow down. That is, when the wave goes through a material like glass, it vibrates the electrons in the glass, which generate their own electromagnetic wave; when this is added to the original wave, the sum appears to slow down (i.e. its crests move slower in the glass). However, I believe the original electromagnetic wave itself does not slow down.
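Here is a rough numerical sketch of how I picture that "appears to slow down" argument, just to make my mental model concrete. The thin-slab setup and all the numbers (refractive index, slab thickness, wavelength) are my own assumed values for illustration, not something taken from what I read:

```python
import numpy as np

# Toy model of the "appears to slow down" picture for a very thin slab of glass.
# Assumed values (mine, for illustration only):
c = 3.0e8            # speed of light in vacuum, m/s
n = 1.5              # assumed refractive index of the glass
wavelength = 600e-9  # assumed wavelength of the light, m
dz = 20e-9           # assumed slab thickness, m (much less than a wavelength)
omega = 2 * np.pi * c / wavelength

# One optical period, finely sampled.
t = np.linspace(0, wavelength / c, 200_000)

# Original wave at a point just past the slab (still travelling at c).
E_incident = np.cos(omega * t)

# Small wave re-radiated by the driven electrons, roughly 90 degrees behind
# the incident wave, with amplitude proportional to (n - 1) * dz.
amp = omega * (n - 1) * dz / c
E_scattered = amp * np.sin(omega * t)

# A detector only ever sees the sum of the two.
E_total = E_incident + E_scattered

# The crest of the sum arrives later than the crest of the original wave,
# by about the same amount as if the wave had crossed the slab at c / n.
crest_delay = t[np.argmax(E_total)] - t[np.argmax(E_incident)]
expected_delay = (n - 1) * dz / c
print(f"crest delay from superposition:     {crest_delay:.3e} s")
print(f"extra time if slab crossed at c/n:  {expected_delay:.3e} s")
```

The two printed numbers come out essentially equal, which is why (as I understand it) the crests behave exactly as if the wave moved at c/n inside the glass, even though each individual wave in the sum travels at c.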
So if you place a laser pointer on one side of a piece of glass and a light sensor on the other side, I assume the recorded speed of light should be c.
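To make that concrete, this is the kind of time difference I am asking about; the glass thickness and refractive index below are again just assumed values for illustration:

```python
# Assumed numbers for the laser / glass / detector thought experiment.
c = 3.0e8        # speed of light in vacuum, m/s
n = 1.5          # assumed refractive index of the glass
d_glass = 0.01   # assumed glass thickness, m (1 cm)

t_at_c = d_glass / c           # time to cross 1 cm at c
t_at_c_over_n = n * d_glass / c  # time to cross 1 cm at the usual c/n

print(f"crossing at c:        {t_at_c * 1e12:.1f} ps")
print(f"crossing at c/n:      {t_at_c_over_n * 1e12:.1f} ps")
print(f"extra delay in glass: {(t_at_c_over_n - t_at_c) * 1e12:.1f} ps")
```

So my question is essentially whether a detector would see that extra delay of roughly 17 ps per centimetre of glass, and if it does, how to reconcile that with the idea that the original wave never slows down.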
So I am wondering whether the original electromagnetic wave ever slows down, or whether it always only "appears" to slow down. I feel like the original electromagnetic wave should never slow down, because electric and magnetic fields should always propagate at the speed of light everywhere, but I am not sure whether this reasoning is correct.
Edit: I understand that you could say light slows down if a photon is absorbed by an electron and then re-emitted. However, I was wondering whether the original photon ever actually slows down on its own while travelling through a medium, the way a ball slows down due to friction along its path, or whether the same photon always travels at the speed of light unless it is absorbed by an electron.
For example, in my original setup with the laser and the light detector, I wouldn't consider light to have slowed down if the photons that reach the detector after the glass have somehow taken a non-straight path inside the glass, or are different photons from the ones that entered it.