They’re not hypothesizing a measurement device recording those times and then rounding. They’re taking it as a given that this is what actually happened in the world, and asking what errors could arise if we make a series of imprecise measurements and then compare those measurements.
Side note: my HP 3458A can do a small handful of 8.5 digit DCV measurements per second.
>my HP 3458A can do a small handful of 8.5 digit DCV measurements per second.
I thought they had something like 100k samples a second. Is that really sufficient for 8.5-digit precision?
Edit: perhaps the comparison should be confined only to the relative finish between the participants, e.g. finishes within 1 ms of each other should be considered the same (regardless of whether they fall in the same hundredth-of-a-second bucket)
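A minimal sketch of why the two rules differ (the function names and example times are mine; working in integer milliseconds avoids float rounding surprises): two finishes only 1 ms apart can still straddle a hundredth-of-a-second boundary, so the "relative" rule calls them a tie while the bucketing rule does not.

```python
def within_1ms(t1_ms: int, t2_ms: int) -> bool:
    """Relative rule: finishes within 1 ms count as the same."""
    return abs(t1_ms - t2_ms) <= 1

def same_hundredth_bucket(t1_ms: int, t2_ms: int) -> bool:
    """Bucketing rule: same recorded time iff both fall in the
    same 10 ms (hundredth-of-a-second) bucket."""
    return t1_ms // 10 == t2_ms // 10

# 10.119 s vs 10.120 s: 1 ms apart, but on opposite sides of
# the 10.12 s boundary.
a_ms, b_ms = 10_119, 10_120
print(within_1ms(a_ms, b_ms))           # True  -> a tie under the relative rule
print(same_hundredth_bucket(a_ms, b_ms))  # False -> different recorded hundredths
```

The asymmetry runs the other way too: two finishes up to 9 ms apart can share a hundredth bucket, so neither rule strictly implies the other.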
I don’t keep mine in cal (or even powered up all the time), so I can’t claim full volt-nut status here, but my understanding is that the 100K/sec 4.5 digit measurement rate is limited by communications, not by sampling rate. (The 5.5 digit rate is 50K/sec, which implies the 4.5 digit limit is not in the measurement itself but more likely in the GPIB.)
One thing I know with perfect precision: HP/Keysight knows more about this than I do. :)