The normal packet jitter on my connection is between 2 and 20 milliseconds. You could stream games to me with a fixed network delay of 120ms (plus 80ms for input/rendering) and there wouldn't be dropped or late frames.
What's the 99th percentile for that? And what about packet loss?
None of this is new stuff in gamedev; we've been building action games successfully since the days of 28.8 modems and 200ms pings.
Maybe that makes me a bit of an entrenched player (which is a poor position to argue from on HN), but I've yet to see anything fundamental in these technologies that will address the latency issues the way an interpolated client-side simulation can.
I'm working on a game with my own GL engine, and I found that locally even <5ms of jitter is noticeable, if only because the jitter occasionally causes a "frame time overflow", leading to a skipped frame or a frame being displayed twice.
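To illustrate what I mean, here's a rough sketch (the numbers are made up for illustration, not taken from my actual engine): at a 60Hz refresh a frame only has ~16.7ms to be ready, so a few milliseconds of jitter on top of a ~14ms frame is enough to miss vsync, and the previous frame gets shown again.

    // Sketch: how a small amount of jitter at a fixed refresh rate can
    // push a frame past its vsync deadline. All timings are hypothetical.
    #include <cstdio>

    int main() {
        const double refresh_ms    = 1000.0 / 60.0;  // ~16.67 ms vsync interval
        const double base_frame_ms = 14.0;           // nominal frame work time
        const double jitter_ms[]   = {0.0, 1.0, 2.0, 3.0, 4.0, 5.0};

        for (double j : jitter_ms) {
            double total  = base_frame_ms + j;
            bool   missed = total > refresh_ms;      // missed the vsync deadline?
            std::printf("jitter %4.1f ms -> frame ready in %5.2f ms: %s\n",
                        j, total,
                        missed ? "missed vsync, previous frame shown twice"
                               : "on time");
        }
        return 0;
    }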
You set up a fixed delay that is large enough to absorb your jitter. It doesn't matter if one packet takes 80ms and the next takes 105ms when neither is displayed until the 120ms mark. There will be no visible jitter.
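Rough sketch of that idea, a fixed playout delay (de-jitter buffer). This is my own illustration, not any particular engine's protocol: the Packet struct and the 80/105/120ms figures are just the numbers from this thread. Each frame is displayed at its send time plus a fixed 120ms, so network delay variation below that never reaches the screen.

    // Sketch of a fixed playout delay: hold each packet until
    // send_time + playout_delay before displaying it. Timings are hypothetical.
    #include <cstdio>
    #include <queue>

    struct Packet {
        double send_ms;     // sender timestamp
        double arrive_ms;   // actual arrival time (send + network delay)
        int    frame_id;
    };

    int main() {
        const double playout_delay_ms = 120.0;  // fixed delay, larger than worst jitter

        // Simulated arrivals: network delay varies between 80 and 105 ms.
        std::queue<Packet> buffer;
        buffer.push({  0.0,  80.0, 1});   //  80 ms network delay
        buffer.push({ 16.7, 121.7, 2});   // 105 ms network delay
        buffer.push({ 33.3, 118.3, 3});   //  85 ms network delay

        while (!buffer.empty()) {
            Packet p = buffer.front();
            buffer.pop();
            double display_ms = p.send_ms + playout_delay_ms;  // fixed display time
            bool   late       = p.arrive_ms > display_ms;      // late only if delay > 120 ms
            std::printf("frame %d: arrived at %6.1f ms, displayed at %6.1f ms (%s)\n",
                        p.frame_id, p.arrive_ms, display_ms,
                        late ? "LATE" : "on time");
        }
        return 0;
    }

Every frame in that run comes out "on time" because no packet's delay exceeds the 120ms budget; the 25ms of variation between packets simply disappears behind the fixed display schedule.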