
So the point I was making (and a little poorly, on re-reading) is that latency is fine up to a point (about 200ms) as long as it's consistent, with no jitter. Input systems -> rendering fall under that category: you don't have your TV dropping frames or delivering them late. You can adjust for a constant latency and "lead" it, which is why predictive game design works so well in multiplayer.
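
To make the "lead" idea concrete, here's a minimal dead-reckoning sketch (names like Snapshot and predict_position are just illustrative, not from any particular engine). With a constant, known latency you can extrapolate remote state forward by exactly that amount; jitter breaks the trick because the true age of each snapshot keeps changing.

    from dataclasses import dataclass

    @dataclass
    class Snapshot:
        pos: float   # last position reported by the server (1D for brevity)
        vel: float   # velocity at that time, in units per second

    def predict_position(snap: Snapshot, constant_latency: float) -> float:
        # The newest snapshot describes the world as it was roughly
        # constant_latency seconds ago, so "lead" by that amount.
        # With zero jitter this lead is stable; with jitter the true age
        # wobbles and the prediction over- and undershoots frame to frame.
        return snap.pos + snap.vel * constant_latency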

Latency-sensitive players are actually a large part of the market; any action-based multiplayer game falls under that category, which includes FPS games, which make up a large portion of game revenue. About 10% of the market makes 90% of the revenue, so missing certain use cases excludes a large chunk.



The normal packet jitter on my connection is between 2 and 20 milliseconds. You could stream games to me with a fixed network delay of 120ms (plus 80ms for input/rendering) and there wouldn't be dropped or late frames.


> normal packet jitter

What's the 99th percentile for that? How about packet loss?

None of this is new stuff in gamedev; we've been building action games successfully since the days of 28.8 modems and 200ms pings.

Maybe that makes me a bit of an entrenched player (which is a poor position to argue from on HN), but I've yet to see anything fundamental in these technologies that addresses the latency issues the way an interpolated client-side simulation can.
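
For reference, a rough sketch of what I mean by "interpolated client-side simulation" (illustrative names; assumes snapshots carry server timestamps and the client renders a little in the past, on the order of 100ms):

    def interpolate(snapshots, render_time):
        # snapshots: list of (server_timestamp, pos), sorted by timestamp.
        # Find the pair bracketing render_time and blend between them, so
        # network jitter is hidden as long as both snapshots have arrived
        # by the time we render (hence the render-behind offset).
        for (t0, p0), (t1, p1) in zip(snapshots, snapshots[1:]):
            if t0 <= render_time <= t1:
                alpha = (render_time - t0) / (t1 - t0)
                return p0 + (p1 - p0) * alpha
        return snapshots[-1][1]  # no bracket yet; hold the newest value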


Checking this ping I have open, the 99th percentile jitter is less than 20ms, and packet loss is about 0.2%.
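
If anyone wants to check their own connection, a rough sketch of how you might compute those two numbers from a ping log (purely illustrative; "jitter" here means deviation from the best observed RTT, which is one of several common definitions, and real tools report loss and latency spread directly):

    def ping_stats(rtts_ms):
        # rtts_ms: one entry per probe, None for a timed-out probe.
        received = [r for r in rtts_ms if r is not None]
        loss = 1.0 - len(received) / len(rtts_ms)
        best = min(received)
        deviations = sorted(r - best for r in received)
        p99 = deviations[int(0.99 * (len(deviations) - 1))]
        return p99, loss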

It doesn't have to be as good; it just has to reach a certain level. And for a lot of games it works fine to have a bit of lead time on controls.


I'm working on a game with my own GL engine, and I found that locally even <5ms of jitter is noticeable, if only because the jitter occasionally causes a "frame time overflow", leading to a skipped frame or a frame being displayed twice.
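
A toy model of that overflow (hypothetical, not a real swapchain API): at 60Hz the present deadline repeats every ~16.7ms, so a frame that occasionally runs long because of jitter occupies two refresh intervals, even when the average frame time is well under budget.

    import math

    FRAME_BUDGET_MS = 1000.0 / 60.0   # ~16.7ms between presents at 60 Hz

    def vsync_intervals(frame_time_ms: float) -> int:
        # Number of refresh intervals this frame occupies: 1 means it made
        # the deadline, 2 means the previous frame was shown twice and this
        # one is effectively late (the "frame time overflow" above).
        return max(1, math.ceil(frame_time_ms / FRAME_BUDGET_MS))

For example, vsync_intervals over (15.8, 16.1, 18.3, 15.9) gives (1, 1, 2, 1): one doubled frame from a single 18ms spike.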


No, no, that's not what I'm saying.

You set up a fixed delay that is large enough to contain your jitter. It doesn't matter if one packet takes 80ms and the next takes 105ms when you don't display it until the 120ms mark. There will be no visible jitter.
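
In code, that fixed playout delay might look something like this sketch (illustrative names; it assumes each frame carries the sender's timestamp and that sender and receiver clocks are in the same timebase, or that the offset has been estimated):

    import heapq

    PLAYOUT_DELAY = 0.120  # seconds; must exceed the worst-case network delay

    class JitterBuffer:
        def __init__(self):
            self._heap = []
            self._seq = 0  # tie-breaker so frames are never compared directly

        def push(self, frame, send_time):
            # Schedule the frame for send_time + PLAYOUT_DELAY, regardless of
            # how long it actually spent in flight.
            heapq.heappush(self._heap, (send_time + PLAYOUT_DELAY, self._seq, frame))
            self._seq += 1

        def pop_ready(self, now):
            # Frames whose playout time has arrived, oldest first. A packet
            # that took 80ms waits 40ms here; one that took 105ms waits 15ms;
            # both are shown at the same fixed offset from when they were sent.
            ready = []
            while self._heap and self._heap[0][0] <= now:
                ready.append(heapq.heappop(self._heap)[2])
            return ready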



