I have a set of experimental data coming from a periodic event. More specifically, these are detections from a single-photon detector, so there is no intensity: I only have the timestamps of the detections. The data also contain some random noise, i.e. accidental counts caused by imperfections of the detector (dark counts).
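To make the question concrete, here is a toy simulation of the kind of data I mean (all the numbers below are made up, purely for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy model of the data: clicks at (roughly) multiples of an unknown
# period, plus uniformly distributed dark counts.  All numbers made up.
T_true = 1e-6        # "true" period: 1 us
jitter = 5e-9        # detector timing jitter: 5 ns
duration = 1e-3      # total acquisition time: 1 ms
p_click = 0.1        # probability of a detection in any given cycle
dark_rate = 1e4      # dark counts per second

n_cycles = int(duration / T_true)
cycle_times = np.arange(n_cycles) * T_true + rng.normal(0.0, jitter, n_cycles)
signal = cycle_times[rng.random(n_cycles) < p_click]  # most cycles give no click
dark = rng.uniform(0.0, duration, rng.poisson(dark_rate * duration))
timestamps = np.sort(np.concatenate([signal, dark]))  # all the counter records
```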
What would be the best way to estimate the temporal period of those events?
I was thinking of some discrete Fourier analysis, but in general this type of algorithm is designed for data with a varying intensity. In my case the "intensity" only varies between 0 and 1.
Moreover, I would have to create "y-axis data" by inserting a "0" for every time value at which there is no event, and the number of such zeros would be huge: as many as the time resolution of the counter allows.
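This is a sketch of what I have in mind (the bin width is just an assumed counter resolution; note also that the strongest FFT peak could turn out to be a harmonic of the true frequency rather than the fundamental):

```python
import numpy as np

def estimate_period_fft(timestamps, duration, bin_width=1e-9):
    # Bin the timestamps at the counter resolution: this produces the long,
    # mostly-zero "y-axis data" described above.
    n_bins = int(np.ceil(duration / bin_width))
    counts, _ = np.histogram(timestamps, bins=n_bins, range=(0.0, duration))
    # FFT of the mean-subtracted counts; subtracting the mean removes the
    # large DC term that would otherwise dominate the spectrum.
    spectrum = np.abs(np.fft.rfft(counts - counts.mean()))
    freqs = np.fft.rfftfreq(n_bins, d=bin_width)
    k = 1 + np.argmax(spectrum[1:])  # skip the residual zero-frequency bin
    return 1.0 / freqs[k]            # caveat: the peak may be a harmonic

# With `timestamps` and `duration` from the snippet above:
# print(f"estimated period: {estimate_period_fft(timestamps, duration):.3e} s")
```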
So, is discrete Fourier analysis a good idea here at all? And if not, what would be a better approach?
Thank you.