The receiver maintains its own internal timebase, and some of the unknowns that it needs to solve for are the frequency and phase offsets between that local timebase and "GPS system time" as inferred from the received signals.
Once the receiver has those values, the 1PPS output is generated from that timebase. There is no "direct connection" to either the carrier or the modulation of any of the satellite signals; those signals carry significant Doppler shift in any case.
Low-end receivers use an inexpensive TCXO (temperature-compensated crystal oscillator) to drive the local timebase, but no attempt is made to frequency-lock that oscillator to GPS time. As a result, the 1PPS output has quantization jitter: each pulse can only be placed on an edge of the local clock, so the peak-to-peak jitter is roughly one period of that oscillator (typically 20 to 50 ns peak-to-peak).
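A quick way to see where that number comes from is to model the receiver snapping each ideal 1PPS edge to the next tick of its free-running clock. This is a hypothetical sketch, not any particular receiver's firmware; the 26 MHz clock and 0.1 ppm frequency error are assumed values for illustration.

```python
import math

F_NOM = 26e6        # assumed TCXO nominal frequency, Hz (period ~38.5 ns)
PPM_ERR = 0.1       # assumed fractional frequency error, ppm
T = 1.0 / (F_NOM * (1 + PPM_ERR * 1e-6))   # actual local clock period, s

def pps_error(second):
    """Timing error of one PPS edge: the ideal edge is at t = second,
    but the receiver can only emit it on the next local clock tick."""
    ticks = math.ceil(second / T)          # round up to a clock edge
    return ticks * T - second              # error in seconds, in [0, T)

errors = [pps_error(n) for n in range(1, 3600)]   # one hour of pulses
p2p = max(errors) - min(errors)
print(f"clock period       : {T * 1e9:.1f} ns")
print(f"peak-to-peak jitter: {p2p * 1e9:.1f} ns")  # approaches one period
```

Because the small frequency error makes the quantization phase walk steadily through the clock period, the error traces out a sawtooth whose peak-to-peak amplitude converges on one full period of the local oscillator, which is why a faster TCXO gives a lower-jitter 1PPS.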
Fancier high-precision receivers have the option of frequency-locking (disciplining) the local oscillator to GPS time, eliminating this quantization jitter.
In my applications, I usually use low- to mid-range receivers that do not frequency-lock their timebases. In some of my designs, I don't care about the low-level jitter. In other designs, I have used a second PLL as a jitter filter to create my own low-jitter timebase.
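The jitter-filter idea can be sketched numerically with a first-order digital loop standing in for the clean-up PLL: the filtered timebase's phase is steered toward each noisy 1PPS edge by a small gain, which low-pass filters the sawtooth. This is a toy model, not a real design; the 40 ns peak-to-peak input jitter and the loop gain are assumed values.

```python
import random

random.seed(1)

T_JITTER = 40e-9    # assumed peak-to-peak input jitter, s (one TCXO period)
GAIN = 0.02         # loop gain: smaller = heavier filtering, slower lock

def rms(xs):
    return (sum(x * x for x in xs) / len(xs)) ** 0.5

phase = 0.0                     # filtered timebase's phase error, s
raw, cleaned = [], []
for n in range(20000):
    noisy_edge = random.uniform(0.0, T_JITTER)   # jittery PPS edge error
    phase += GAIN * (noisy_edge - phase)         # steer toward the input
    raw.append(noisy_edge)
    cleaned.append(phase)

settled = cleaned[2000:]        # discard the acquisition transient
print(f"input  jitter (rms): {rms([x - T_JITTER/2 for x in raw]) * 1e9:.2f} ns")
print(f"output jitter (rms): {rms([x - T_JITTER/2 for x in settled]) * 1e9:.2f} ns")
```

The trade-off shown by the gain term is the usual PLL one: a narrower loop bandwidth rejects more of the reference jitter but tracks slow drift of the 1PPS (and of the local oscillator) more sluggishly.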