It seems like two eyes would be enough “wetware” to do interferometry inside the brain. Is there a definite reason why this could not be happening, or some way to test whether it does?
7 Answers
To do interferometry in post-processing after detection of radiation, the detector must be able to record the phase of the radiation. The eye cannot do this: the photochemical reactions that record the radiation are insensitive to phase.
In instrumentation, radio interferometry may be done post-detection because phase-sensitive radio detectors are practical. Optical interferometry is done pre-detection, using mirrors.
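The distinction drawn above can be illustrated numerically. This is a minimal sketch (illustrative numbers, not a model of the eye): a square-law detector records only the time-averaged intensity |E|², so two fields that differ only in phase produce identical detector outputs.

```python
import numpy as np

# A photochemical receptor acts like a square-law detector: it records
# the time-averaged intensity |E|^2, not the field itself.
t = np.linspace(0, 1e-14, 1000)        # ~10 fs, i.e. 6 cycles at 600 THz
f = 6e14                                # optical frequency in Hz (illustrative)

e1 = np.cos(2 * np.pi * f * t)          # field with phase 0
e2 = np.cos(2 * np.pi * f * t + 1.23)   # same field, arbitrarily phase-shifted

i1 = np.mean(e1 ** 2)                   # intensity seen by the detector
i2 = np.mean(e2 ** 2)

# Both averages come out ~0.5: the phase information is simply gone,
# so no post-detection "software" can recover it.
```

Combining the fields *before* detection (as mirrors do in an optical interferometer) is what makes the phase difference visible as an intensity change.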
Despite the other answers denying this, there is another possibility. The Hanbury Brown–Twiss intensity interferometer uses incoherent detectors without a common phase reference. It is conceivable that the brain compares the intensity fluctuations coming from two incoherent detectors; the optical hardware may allow it, though I do not know whether the neurology does.
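A toy sketch of what the HBT scheme actually needs (this is a statistical illustration, not a physiological claim): a shared source fluctuation reaches both detectors, each adding independent noise, and the cross-correlation of the intensity *fluctuations* survives — no optical phase is required anywhere.

```python
import numpy as np

# Hanbury Brown-Twiss in miniature: correlate intensity fluctuations
# from two detectors that share no phase reference. Numbers are
# illustrative (thermal-like exponential intensity statistics).
rng = np.random.default_rng(0)
n = 100_000

common = rng.exponential(1.0, n)            # shared source fluctuation
d1 = common + 0.3 * rng.normal(size=n)      # detector 1: intensity + own noise
d2 = common + 0.3 * rng.normal(size=n)      # detector 2: intensity + own noise

df1 = d1 - d1.mean()                        # fluctuations about the mean
df2 = d2 - d2.mean()
corr = np.mean(df1 * df2) / (df1.std() * df2.std())

# corr comes out close to 1: the fluctuation correlation is all the
# post-detection processing has to measure.
```

Whether neurons could compute such a correlation fast enough is, as the answer says, a separate (and open) question.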
[1] R. Hanbury Brown & R. Q. Twiss, “Interferometry of the Intensity Fluctuations in Light. I. Basic Theory: The Correlation between Photons in Coherent Beams of Radiation”, Proceedings of the Royal Society of London, Series A, Vol. 242, No. 1230 (5 Nov 1957), pp. 300–324. [2] https://en.wikipedia.org/wiki/Intensity_interferometer
You're looking at the wrong sense transducers. Having two ears does allow for forms of interferometry, as the frequencies of auditory signals are in a range that allows for the nervous system to respond to phase differences between signals.
e.g. the relative phase of sound waves from a single source arriving at each ear gives an indication of the azimuth of that source: sound localisation
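The localisation cue can be sketched as a cross-correlation: the lag that best aligns the two ear signals is the interaural delay, which maps to azimuth. Sample rate and delay below are illustrative, not physiological.

```python
import numpy as np

# Estimate the interaural time difference by cross-correlating the
# two ear signals; the best-matching lag is the imposed delay.
fs = 44_100                             # samples per second (illustrative)
t = np.arange(0, 0.05, 1 / fs)
delay_samples = 20                      # ~0.45 ms interaural delay

rng = np.random.default_rng(1)
source = rng.normal(size=t.size)        # broadband sound source
left = source
right = np.roll(source, delay_samples)  # same sound, arriving later

# Lag that maximises the cross-correlation = estimated delay.
lags = np.arange(-50, 51)
xcorr = [np.dot(left, np.roll(right, -k)) for k in lags]
best = int(lags[int(np.argmax(xcorr))])
print(best)   # prints 20: the delay we imposed
```

The auditory brainstem is believed to do something functionally similar with coincidence-detecting neurons, which is exactly the phase sensitivity the eye lacks.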
Meanwhile, hearing two sounds with only very slightly different frequencies, when each is played only in one ear, produces “binaural beats”: the perception of a non-existent sound that has a frequency equal to the difference of the two source frequencies. i.e. this is a sound that “exists” only in the brain, not the world.
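The arithmetic behind beats is the product-to-sum identity: two tones at f1 and f2 sum to a carrier at the mean frequency modulated by an envelope at (f2 − f1)/2, perceived as a beat at f2 − f1. (In *binaural* beats the tones never mix acoustically; the brain supplies the same difference-frequency percept neurally.) A sketch with illustrative frequencies:

```python
import numpy as np

# sin(a) + sin(b) = 2 * sin((a+b)/2) * cos((a-b)/2):
# the sum of two close tones is a carrier times a slow envelope.
fs = 8000
t = np.arange(0, 1.0, 1 / fs)
f1, f2 = 440.0, 444.0

mix = np.sin(2 * np.pi * f1 * t) + np.sin(2 * np.pi * f2 * t)
carrier = np.sin(2 * np.pi * (f1 + f2) / 2 * t)        # 442 Hz tone
envelope = 2 * np.cos(2 * np.pi * (f2 - f1) / 2 * t)   # 2 Hz; |envelope| beats at 4 Hz

assert np.allclose(mix, envelope * carrier)
print(f2 - f1)   # beat frequency in Hz: 4.0
```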
I guess you’re imagining an interferometer like those used for long-baseline radio interferometry. In that case the detectors must sense the relative phase difference between the waves arriving at each detector. Our eyes (like most detectors of visible light) detect only the amplitude of the waves, not the phase. Therefore we can’t use eyes for optical interferometry, no matter what “software” analyses the optic-nerve signals.
As others have mentioned, for the brain to do optical interferometry the eyes would need to capture phase information from the light waves and send it to the brain, where simultaneous signals from each eye could be combined. But the light-sensitive cones and rods in the retina cannot detect phase, and achieving the required synchronisation at optical frequencies via neural signals is unlikely, if not downright impossible.
However, even if our retinas could detect phase, and used some kind of signal time-stamping to permit synchronisation, there's another major problem. The amount of data that the eyes would need to send to the brain would be enormous. As it is, the retina doesn't simply send raw pixel-like data; it has several layers of neural networking that perform significant data reduction on the signals that get transmitted via the optic nerve.
From the Wikipedia article on the retina:
Although there are more than 130 million retinal receptors, there are only approximately 1.2 million fibres (axons) in the optic nerve. So, a large amount of pre-processing is performed within the retina.
[...]
The centre–surround structures are mathematically equivalent to the edge detection algorithms used by computer programmers to extract or enhance the edges in a digital photograph. Thus, the retina performs operations on the image-representing impulses to enhance the edges of objects within its visual field.
Please see that article for further details.
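The centre–surround idea the quote describes can be sketched as a convolution: a kernel with an excitatory centre and inhibitory surround responds strongly at edges and not at all on uniform regions. The kernel values below are illustrative, not measured retinal weights.

```python
import numpy as np

# A centre-surround receptive field as a 3x3 kernel: excitatory centre,
# inhibitory surround. Its weights sum to zero, so flat regions give
# zero response and only edges produce output.
kernel = np.array([[-1, -1, -1],
                   [-1,  8, -1],
                   [-1, -1, -1]], dtype=float)

image = np.zeros((8, 8))
image[:, 4:] = 1.0                      # a vertical step edge

# plain 2-D "valid" convolution
h, w = image.shape
out = np.zeros((h - 2, w - 2))
for i in range(h - 2):
    for j in range(w - 2):
        out[i, j] = np.sum(image[i:i + 3, j:j + 3] * kernel)

# Every row of `out` is nonzero only at the two columns flanking the
# edge -- the same data reduction the retina performs before the
# signal ever reaches the optic nerve.
```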
Wavelengths of visible light are around 400–700 nm and travel at the speed of light, so the optical oscillation period is on the order of femtoseconds; nerve conduction, by contrast, is a few dozen meters per second. The temporal resolution of signals from separate nerves is nowhere near what interferometry would require. With a bit of handwaving, you could consider the effect of different wavelengths of light on the color-sensitive retinal pigments a kind of molecular interferometry, but that operates on lengths commensurate with the wavelengths in question; the distance between the eyes is a vastly different scale.
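The mismatch above is worth putting in numbers. A back-of-envelope sketch (all values are rough, order-of-magnitude figures):

```python
# Timescale mismatch: optical oscillation period vs. neural timing.
c = 3e8                      # speed of light, m/s
wavelength = 400e-9          # violet light, m
optical_period = wavelength / c          # ~1.3e-15 s (femtoseconds)

nerve_speed = 50.0           # fast myelinated axon, m/s (rough)
neural_timing = 1e-3         # ~millisecond-scale neural timing (rough)

gap = neural_timing / optical_period
# gap is ~1e12: neural timing is about a trillion times too coarse
# to track the optical phase that interferometry needs.
```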
I did not check whether the following is actually used, but I believe even one eye should, in principle, suffice for some (primitive?) interferometry. What is (visible-light) interferometry? Roughly speaking, it is the detection of phase differences between, say, two light beams. Imagine two different light beams, each of finite width, incident on one eye. A human can change the focal distance of the eye at will (within some range), so under some conditions they can arrange for the two light beams to overlap on the retina. The intensity registered by the visual receptor cells within the overlap area will then depend on the phase difference between the beams.
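The last sentence is the standard two-beam interference formula, I = I1 + I2 + 2√(I1·I2)·cos(Δφ): even a phase-blind receptor in the overlap region registers a phase-dependent intensity. A sketch assuming unit beam intensities:

```python
import numpy as np

# Intensity in the overlap of two coherent beams as a function of
# their phase difference (unit intensities assumed for illustration).
i1, i2 = 1.0, 1.0
intensities = {}
for dphi in (0.0, np.pi / 2, np.pi):
    intensities[dphi] = i1 + i2 + 2 * np.sqrt(i1 * i2) * np.cos(dphi)

# dphi = 0    -> 4   (fully constructive)
# dphi = pi/2 -> 2   (the beams' intensities simply add)
# dphi = pi   -> ~0  (fully destructive)
```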