Mon, 06/01/2020 - 15:45
I am working through a download of SuperWASP data for an eclipsing binary (V1183 Her). Some of the data are suitable for calculating times of minima. However, I am also finding observations that share the same HJD, as shown below. (HJD, mag, error)
2454579.533, 12.8704, 0.0349
2454579.533, 12.9583, 0.0375
These data have apparently been rounded (the fourth and later decimal places are 0). The duplicates tend to confuse Peranso. Questions: (1) May I safely delete the duplicates? (2) If so, is deleting the one with the larger uncertainty reasonable? (3) Or have I made a mistake in the download and truncated the HJD myself, thus "creating" the duplicates?
Many thanks for any comments, Ed
Indeed there is a rounding problem.
The two observations in the original file are as follows:
2454579.5329 147 12.8704 0.0349
2454579.5333 147 12.9583 0.0375
In any case, deleting observations with large errors is recommended when using SuperWASP data, e.g. errors larger than 0.05 mag.
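If it helps, here is a minimal sketch in pandas of both steps discussed above: dropping points with errors above 0.05 mag, and, for any HJDs that still collide after rounding, keeping the point with the smaller error. The toy values reuse the two duplicated observations from the original post plus one invented high-error point; column names and the third row are my assumptions, not the actual SuperWASP export format.

```python
import pandas as pd

# Toy table mimicking the rounded SuperWASP export (HJD, mag, error).
# The first two rows are the duplicated points from the post; the
# third row is an invented high-error observation for illustration.
df = pd.DataFrame({
    "hjd": [2454579.533, 2454579.533, 2454579.601],
    "mag": [12.8704, 12.9583, 12.9100],
    "err": [0.0349, 0.0375, 0.0600],
})

# 1) Drop observations with large errors (> 0.05 mag), per the advice above.
df = df[df["err"] <= 0.05]

# 2) For duplicate HJDs, keep the observation with the smaller error:
#    sort by error so the best point comes first, then drop later duplicates.
df = (df.sort_values("err")
        .drop_duplicates(subset="hjd", keep="first")
        .sort_values("hjd")
        .reset_index(drop=True))

print(df)  # one row remains: HJD 2454579.533, mag 12.8704, err 0.0349
```

The same two-step filter can then be applied to the full download before loading it into Peranso.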
Thanks Sebastian. That answers the question nicely.