
Would two otherwise identical photodiodes (same silicon, same field of view) with different active areas, illuminated by the same ambient source (ambient meaning it fills the entire FOV), produce different magnitudes of output?

Initially it seems to me they would not, since with the same field of view the same number of photons should fall on each. On the other hand, a larger active area means a lower density of photons across the device, so it is not clear to me what the result would be.

Much of the common literature says things like "a larger active area captures more photons" but does not really relate this to field of view. However, datasheets seem to indicate higher photocurrents for larger active areas that can't be accounted for by FOV (especially since the smaller photodiodes tend to have larger FOVs).
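The "larger active area captures more photons" claim from the datasheets can be put into numbers with a minimal sketch. The responsivity and irradiance below are assumed placeholder values, not taken from any datasheet; the point is only the scaling with area:

```python
import math

# Assumed placeholder values -- not from any specific datasheet.
RESPONSIVITY = 0.5   # A/W, a typical ballpark for a silicon photodiode
IRRADIANCE = 100.0   # W/m^2, some fixed ambient light level

def photocurrent(diameter_mm):
    """Photocurrent of a bare diode under uniform irradiance:
    I = responsivity * irradiance * active area."""
    area_m2 = math.pi * (diameter_mm * 1e-3 / 2) ** 2
    return RESPONSIVITY * IRRADIANCE * area_m2

# Doubling the diameter quadruples the area, and hence the photocurrent,
# for the same ambient irradiance.
print(photocurrent(2.0) / photocurrent(1.0))  # ~4.0
```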

DKNguyen
  • Field of view is a property of the optical system in front of the photodiode, whereas the photodiode itself just has an active area and an angular acceptance cone over which photons are absorbed. Maybe you could draw a ray diagram showing the scenario you're thinking of? – user1850479 Jun 10 '22 at 16:33
  • @user1850479 I have a scenario in mind with minimal optics: two bare dies, fully exposed, if that would give them the same field of view. If unenclosed active areas somehow inherently have different FOVs, then add optics to one of them to force it to be the same as the other. Rephrased: you could shine a laser beam with the same incident energy at each photodiode but adjust the beam diameter so all of the light falls onto the active area. – DKNguyen Jun 10 '22 at 16:34
  • Perhaps I should be saying incident energy instead of FOV? I mean... the same acceptance angle does equate to the same incident energy for the same full-view source, correct? I guess I said FOV because I was only somewhat certain of that. – DKNguyen Jun 10 '22 at 16:38
  • Field of view is a property of an optical system (it is the area the lens is imaging onto the sensor), so not the right term for a bare sensor that doesn't have an image. If I understand correctly, you're talking about over/under filling the detector such that the same number of photons hits it either way, but in one case they're concentrated in the center and the other they're spread out over the whole area? In that case, there is no difference, at least as long as energy density is not so extreme that you induce nonlinearity. – user1850479 Jun 10 '22 at 16:40
  • The scenario I am thinking of is having a fully exposed 1mm diameter die, and 2mm diameter die just pointing straight up at the sky. I think this is similar to if you pointed a 1mm diameter laser at the 1mm diameter die and pointed a 2mm diameter laser at the 2mm diameter die where both lasers have the same incident energy (i.e. the 1mm diameter laser would have four times the area power density of the 2mm diameter laser). – DKNguyen Jun 10 '22 at 16:42
  • What I had in mind is the same number of photons hitting the detector either way. Although I was thinking less about overfilling and underfilling than about hitting the entire active area while adjusting the area power density of the laser. I think that's the same as just pointing both detectors at the sky (where they both have the same FOV, with optics to enforce that if necessary). – DKNguyen Jun 10 '22 at 16:43
  • If it's a bare photodiode, all that matters is the number of photons that hit the active area. So a 2 mm diameter collects 4x more sunlight than a 1 mm diameter (assuming no lenses). If you scale the intensity of the source 4x, then you would have identical output in both cases. Usually when literature talks about larger sensors collecting more light, it is talking about a complete system designed around each sensor, so the larger sensor is usually paired with a larger lens, and since the whole system is scaled up its light collection increases. With a bare diode that happens too. – user1850479 Jun 10 '22 at 16:46
  • Hmmmmm. I do see how a bare photodiode with more area just collects more photons because it simply covers more flat area. I guess I am having trouble reconciling that with field of view because initially it would seem like two bare photodiodes with no optics have acceptance angles of nearly 180 degrees (or whatever maximum incident angle the semiconductor will accept light at). – DKNguyen Jun 10 '22 at 16:47
  • I'm having trouble finding the specific equations but I believe increasing the area of the PN junction would increase the dark noise and reduce your SNR/NEP. – vir Jun 10 '22 at 16:50
  • @user1850479 But you do raise a point I had not thought of before...lenses can be of different sizes/areas independent of the FOV they are providing. – DKNguyen Jun 10 '22 at 16:51
  • @vir Yes, i believe that is correct. – DKNguyen Jun 10 '22 at 16:51
  • For FOV α << 180 degrees, α is approximately equal to sensor diameter divided by focal length. So if two sensor/lens systems have the same FOV but sensor 2 is bigger than sensor 1, sensor 2's lens will have a longer focal length and, for the same aperture diameter, a higher f-number, which means less light. Of course this is for imaging optics and it all goes out the window if you're using e.g. parabolic mirrors or other non-imaging concentrators. – vir Jun 10 '22 at 16:57
  • @vir I don't have imaging in mind. Just collecting light and dumping it on the sensor. The image does not need to be preserved. I'll have to look up what aperture and f-number are. – DKNguyen Jun 10 '22 at 16:58
  • I don't have time to type up a huge answer with drawings here, but if you're not making an image (no lens) you have the active area and an acceptance angle (analogous to FOV) which is determined by your mechanical housing and maybe total internal reflection (if any) in the detector. For a distant source like the sun, all rays come in parallel, so the angular acceptance is irrelevant as long as the diode is pointed the right direction. That leaves area, so two diodes gives you double the photons. If you had a source close to the detector (range of angles), then acceptance angle would matter. – user1850479 Jun 10 '22 at 17:09
  • @user1850479 The parallel rays coming from the sun would make sense if it were in space. I believe that on the Earth's surface, with atmospheric scattering, this isn't the case, since you obviously see less of the sky with a narrower FOV, whereas from Earth you will always see the entire sun with almost any FOV as long as you are looking at it. – DKNguyen Jun 10 '22 at 22:55
  • If you can see the sun in the sky then the rays are nearly parallel (0.5 degree angular extent, hence the sun is a bright, relatively small fraction of the much darker sky). If it's overcast enough that you cannot see the sun then the rays will be randomized (the sky looks uniformly white). – user1850479 Jun 10 '22 at 23:09
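The thread's conclusion can be summarized in code: for a bare diode, only the total absorbed optical power matters, not how that power is distributed over the active area. A minimal sketch of both scenarios discussed above, using an assumed placeholder responsivity of 0.5 A/W:

```python
import math

RESPONSIVITY = 0.5  # A/W, an assumed placeholder value for silicon

def photocurrent_from_power(total_power_w):
    """A bare diode's photocurrent depends only on total absorbed power,
    not on the power density across the active area."""
    return RESPONSIVITY * total_power_w

# Laser scenario from the comments: 1 mW into a 1 mm spot on the 1 mm die
# vs the same 1 mW spread into a 2 mm spot on the 2 mm die.
i_small_die = photocurrent_from_power(1e-3)
i_large_die = photocurrent_from_power(1e-3)
print(i_small_die == i_large_die)  # True: same total power, same output

# Sky scenario: the same irradiance (W/m^2) fills both dies, so the
# absorbed power, and hence the photocurrent, scales with active area.
def photocurrent_from_irradiance(irradiance_w_m2, diameter_mm):
    area_m2 = math.pi * (diameter_mm * 1e-3 / 2) ** 2
    return photocurrent_from_power(irradiance_w_m2 * area_m2)

ratio = (photocurrent_from_irradiance(100.0, 2.0)
         / photocurrent_from_irradiance(100.0, 1.0))
print(ratio)  # ~4.0: double the diameter, four times the current
```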

0 Answers