I was suspicious that there might be some artifact of terminal updating amplifying the effect in that test, so I whipped up a little graphical test:

https://jsfiddle.net/zs8bncxj/1/
I can still get it right pretty much 100% of the time, but the difference does feel a lot more subtle than what I was seeing in iTerm.
(If you change the delay, you need to hit randomize to make it take effect.)
Edit: Also, we should keep in mind that we're not testing just the latency we set in either test; we're testing that delay plus the delay from the rest of the I/O path. So the 100ms we set might be indistinguishable on its own, but by the time we pile everything on, the total is closer to 200ms.
The conclusion is the same as far as UI design is concerned: don't assume you can get away with 100ms; most of that leeway has already been wasted by the OS and hardware.
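For anyone who wants to poke at it without the fiddle, here's a minimal sketch of this kind of test (my own naming and delay values, not the fiddle's actual code): each keypress is echoed visually after a hidden delay drawn from one of two conditions, and you guess which condition is active.

    <input id="key" placeholder="type here">
    <div id="box" style="width:60px;height:60px;background:#ccc"></div>
    <button onclick="randomize()">randomize</button>
    <script>
      // Hypothetical sketch: two hidden conditions, baseline vs. +100ms.
      const delays = [0, 100]; // extra ms of delay per condition
      let condition = Math.random() < 0.5 ? 0 : 1;
      const box = document.getElementById('box');
      document.getElementById('key').addEventListener('keydown', () => {
        // Flash the box after the hidden delay; the perceived latency is
        // keypress-to-flash, on top of whatever the OS and browser add.
        setTimeout(() => {
          box.style.background = '#09f';
          setTimeout(() => { box.style.background = '#ccc'; }, 150);
        }, delays[condition]);
      });
      function randomize() { condition = Math.random() < 0.5 ? 0 : 1; }
    </script>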
Adding step="0.01" to the input allows stepping down in 10ms increments.
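Concretely, assuming the fiddle's delay input is denominated in seconds (which is my reading of the 0.06 mentioned below), that would look like:

    <!-- step="0.01" in seconds = 10ms increments -->
    <input type="number" min="0" max="0.5" step="0.01" value="0.1">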
Personally I can distinguish every time at 0.06.
I find it difficult to believe the rest of the hardware loop has 40ms of latency; it would be difficult for smooth rendering to occur if it did.
I suspect part of this is 'training' yourself, but part is also what the researchers meant. They may very well have meant that above 100ms a delay is jarring and consciously noticed, while below 100ms it isn't perceived as a delay, rather than that it's truly impossible to notice in an A/B test.
I suspect this test would be more difficult if it sometimes did A/A and B/B trials.
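Something like this trial scheduler would do it (a hypothetical sketch; the catch-trial rate is my own choice): half the trials present the same condition twice, so always guessing "different" no longer works.

    // Draw a pair of conditions per trial; sometimes A/A or B/B (catch trials),
    // turning the test into a genuine "same or different?" judgment.
    function nextTrialPair() {
      const first = Math.random() < 0.5 ? 'A' : 'B';
      const same = Math.random() < 0.5; // 50% catch-trial rate (my assumption)
      const second = same ? first : (first === 'A' ? 'B' : 'A');
      return [first, second];
    }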
>I find it difficult to believe the rest of the hardware loop has 40ms of latency; it would be difficult for smooth rendering to occur if it did.
Why would I/O latency have any effect on the smoothness of rendering? Are you sure you aren't thinking of throughput?
Remember: TFA just told us that many keyboards on their own add more than 40ms of latency. I definitely wouldn't have guessed that, so I'm very reluctant to entertain any certainty about the rest of the system having low latency.
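To make the latency/throughput distinction concrete: a display pipeline can keep several frames in flight, so a constant 40ms of end-to-end delay shifts every frame by the same amount without changing the output cadence. A toy sketch:

    // Toy model: frames submitted every 16.7ms, each delayed 40ms end-to-end.
    // The frames still appear 16.7ms apart, so rendering stays smooth;
    // only the keypress-to-photon latency grows.
    const FRAME_MS = 1000 / 60;
    const PIPELINE_LATENCY_MS = 40;
    for (let i = 0; i < 5; i++) {
      const submitted = i * FRAME_MS;
      const displayed = submitted + PIPELINE_LATENCY_MS;
      console.log(`frame ${i}: submitted ${submitted.toFixed(1)}ms, shown ${displayed.toFixed(1)}ms`);
    }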