In addition to input latency, with "real time" computer graphics it's basically impossible to know when your frame will actually be displayed on screen. If this information were available, it would be a lot easier to achieve smooth visuals without having to chase higher and higher frame rates.
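
The closest thing on desktop today is after-the-fact statistics. On Windows, for example, DXGI can report when previously presented frames actually reached the display (a minimal C++ sketch, assuming an existing IDXGISwapChain* named swapChain and vsync'd presents; error handling omitted):

    #include <dxgi.h>

    // Ask the swap chain when earlier frames actually hit the
    // screen. SyncQPCTime is a QueryPerformanceCounter timestamp
    // taken at the refresh that displayed the counted frame.
    DXGI_FRAME_STATISTICS stats = {};
    if (SUCCEEDED(swapChain->GetFrameStatistics(&stats))) {
        // stats.PresentCount     - running count of Present() calls
        // stats.SyncRefreshCount - which refresh displayed it
        // stats.SyncQPCTime      - wall-clock time of that refresh
    }

Note this only tells you when past frames landed, which is exactly the problem: the display time of the frame you're building right now can only be estimated from that history.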



Don’t GPU fences tell you which vsync you’re targeting? Or are you saying you want the actual estimated display time of the frame? I know the latter is definitely available for VR systems, since it’s required there. Having it available for 2D games would probably help marginally in cutting down perceived latency, although I don’t know if you’d be able to feel it.
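
In OpenXR, for instance, the runtime hands out exactly that estimate every frame via xrWaitFrame (a minimal sketch, assuming a created and running XrSession named session):

    #include <openxr/openxr.h>

    // xrWaitFrame throttles the app for frame pacing and returns
    // the runtime's prediction of when this frame will actually
    // light up the panels.
    XrFrameWaitInfo waitInfo = {XR_TYPE_FRAME_WAIT_INFO};
    XrFrameState frameState = {XR_TYPE_FRAME_STATE};
    xrWaitFrame(session, &waitInfo, &frameState);
    // frameState.predictedDisplayTime   - estimated photon time
    // frameState.predictedDisplayPeriod - expected frame interval
    // Poses and animation are meant to be sampled at
    // predictedDisplayTime, not at "now".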


John Carmack has mentioned on Twitter that he spent a lot of time trying to make the frame release rate more consistent, even with purpose-built VR hardware. I think game programmers found it easier to go with the G-Sync/FreeSync approach of presenting whenever the frame is ready instead of trying to hit a fixed vsync deadline.
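
Roughly the difference (a sketch using GLFW; the game work is a placeholder, and it assumes a VRR-capable display and driver):

    #include <GLFW/glfw3.h>

    int main() {
        glfwInit();
        GLFWwindow* window =
            glfwCreateWindow(1280, 720, "vrr", nullptr, nullptr);
        glfwMakeContextCurrent(window);
        // Fixed-deadline pacing would be glfwSwapInterval(1): each
        // swap blocks until the next vsync, so a late frame slips a
        // whole refresh. With a VRR (G-Sync/FreeSync) display you
        // can present immediately and the monitor refreshes when
        // the frame arrives rather than on a fixed schedule.
        glfwSwapInterval(0); // don't wait for a vsync deadline
        while (!glfwWindowShouldClose(window)) {
            // update(...); render(...);  // hypothetical game work
            glfwSwapBuffers(window); // scan-out follows the frame
            glfwPollEvents();
        }
        glfwTerminate();
        return 0;
    }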


Yes, I want the actual estimated display time. That would allow for more accurate interpolation of the rendered state to the moment the frame is actually shown.
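
For example, with a fixed-timestep simulation you could blend the two states bracketing that display time instead of rendering whatever state exists "now" (a self-contained sketch; the one-field State is a stand-in for real game state):

    #include <cstdio>

    struct State { double x; };  // placeholder simulation state

    // Blend the two fixed-timestep states that bracket the
    // estimated display time.
    State interpolate(State prev, State next,
                      double tPrev, double tNext, double displayTime) {
        double alpha = (displayTime - tPrev) / (tNext - tPrev);
        return {prev.x + (next.x - prev.x) * alpha};
    }

    int main() {
        // States at t=0.000s and t=0.016s; frame estimated to
        // reach the screen at t=0.012s.
        State s = interpolate({0.0}, {1.0}, 0.000, 0.016, 0.012);
        printf("interpolated x = %f\n", s.x); // 0.75
        return 0;
    }

Without the real display time you're stuck interpolating to a guess, and any error shows up as judder even at a steady frame rate.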



