Consider a non-rotating neutron star emitting perfect blackbody radiation from its surface. Suppose the radiation is redshifted by a factor of 2 for distant observers (a value chosen purely for simplicity; real neutron stars can't be that compact). Then each photon's energy is halved uniformly, so the peak of the spectrum shifts to half its frequency. Because the peak frequency is linearly proportional to the temperature (Wien's displacement law), the effective temperature perceived by distant observers is halved as well. In addition, the surface of the neutron star experiences time dilation by the same factor of 2, which means photons are emitted from the surface at twice the rate at which they arrive at distant observers. Taken together, the effective luminosity of the neutron star should be one quarter (1/4) of the surface luminosity.
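To summarize the bookkeeping above (the notation is mine: $1+z=2$ is the redshift factor, a subscript $\infty$ marks quantities measured by a distant observer, $\dot N$ is the photon emission rate, and $\langle E\rangle$ is the mean photon energy):

$$
E_\infty = \frac{E_{\rm surf}}{1+z},\qquad
T_\infty = \frac{T_{\rm surf}}{1+z},\qquad
\dot N_\infty = \frac{\dot N_{\rm surf}}{1+z}
\quad\Longrightarrow\quad
L_\infty = \dot N_\infty \,\langle E_\infty \rangle
= \frac{L_{\rm surf}}{(1+z)^{2}} = \frac{L_{\rm surf}}{4}.
$$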
However, according to the Stefan-Boltzmann law, the radiated power is proportional to the 4th power of the temperature, which suggests the effective luminosity should be 1/16, not 1/4, of the surface luminosity. Is the discrepancy due to gravitational lensing, which makes the neutron star look bigger than its actual size? Another complicating factor is that a surface below the photon sphere (with a redshift factor of 2, the surface sits at 4/3 of the Schwarzschild radius, inside the photon sphere at 3/2 of the Schwarzschild radius) no longer radiates to infinity according to Lambert's cosine law, because rays emitted at small angles to the surface are bent back down onto it, which reduces the effective luminosity. These calculations are beyond my capability, so I have come here to ask for help.
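To make the tension concrete, here is the consistency condition I get if lensing does nothing but rescale the apparent radius, which I'll write as $R_\infty$ (again my notation, with $R$ the true radius):

$$
\frac{L_\infty}{L_{\rm surf}}
= \left(\frac{R_\infty}{R}\right)^{2}\left(\frac{T_\infty}{T_{\rm surf}}\right)^{4}
= \left(\frac{R_\infty}{R}\right)^{2}\cdot\frac{1}{16}
\overset{!}{=} \frac{1}{4}
\quad\Longrightarrow\quad
R_\infty = 2R = (1+z)\,R.
$$

So the two arguments would agree only if lensing magnifies the apparent radius by exactly the redshift factor. Is that actually the case here, and does the sub-photon-sphere emission correction change the result?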