
The article is claiming that clocks that only show the time truncated to the minute are off by an average of 30 seconds, which just isn't true in practice. When I say or hear 11:30, I assume it can be anywhere between 11:30:00 and 11:30:59. Humans rarely need accuracy to the second beyond this, so clocks intended to be used by humans truncate to the minute without causing problems. To say they are "late" is a real leap.

I mean, you can reduce this to absurdity too: a clock truncating to the second is off by an average of 500 milliseconds! The problem is that no one cares in day-to-day usage, which is what human-readable clocks are made for.
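
A quick simulation makes the arithmetic concrete (a minimal sketch of my own, assuming the true time is uniformly distributed within the displayed unit and the display is read at face value):

    import random

    def avg_truncation_error(unit_seconds, trials=1_000_000):
        """Average error when a time uniformly distributed within the
        display unit is read as the start of that unit (truncation)."""
        return sum(random.uniform(0, unit_seconds) for _ in range(trials)) / trials

    # Truncating to the minute, read as :00 -> roughly 30 s average error
    print(avg_truncation_error(60))   # ~30.0
    # Truncating to the second, read as .000 -> roughly 0.5 s average error
    print(avg_truncation_error(1))    # ~0.5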



> When I say or hear 11:30, I assume it can be anywhere between 11:30:00 and 11:30:59.

And if you cared about seconds-level precision, you would read 11:30 as 11:30:30, and thus your error would average the same 15 seconds you would get from a clock that rounds.

The difference is that when you observe the minute change on a truncating clock, you know it happened exactly at zero seconds, instead of at 30 seconds (or 31, if you apply the same rounding rule to the seconds as well).
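
A small simulation shows both effects (my own sketch, not from the article; it assumes the true second within the minute is uniformly distributed):

    import random

    def average_errors(trials=1_000_000):
        """Compare three ways of reading the same clock face."""
        trunc_as_00 = trunc_as_30 = rounded = 0.0
        for _ in range(trials):
            s = random.uniform(0, 60)      # true seconds past the minute
            trunc_as_00 += s               # truncating clock, read as :00
            trunc_as_30 += abs(s - 30)     # truncating clock, read as :30
            rounded += min(s, 60 - s)      # rounding clock, read at face value
        return trunc_as_00 / trials, trunc_as_30 / trials, rounded / trials

    # Note: the truncating display flips to the next minute at s == 0,
    # while the rounding display flips at s == 30.
    print(average_errors())   # ~ (30.0, 15.0, 15.0)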

It is fair to ask whether rounding would be more useful (it wouldn't be), but the entire article is incoherent and speaks mostly to the author's confusion.



