
In a modern application on a modern OS, USB events are buffered. You don't get hard guarantees about how quickly they reach your application, but they will get there, albeit often with multiple frames of latency.

On 8/16-bit consoles everything was tied to the video chip. You essentially have to do your CPU work (including polling the controllers) during the vblank interval, a short time window that occurs 60 times per second on NTSC systems. This is essentially a zero-lag arrangement (max latency is about one frame, i.e. 1/60th of a second; average is about half a frame), but if you miss a controller input it's gone forever.
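
To put rough numbers on that, here's a small host-side simulation (just a sketch; the 60 Hz rate and uniformly random press times are my assumptions, and this is not console code) of what once-per-vblank polling does to input latency:

    /* Sketch only: simulate once-per-frame (60 Hz) polling and measure how
     * long a button press waits before the next vblank poll sees it. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void) {
        const double frame = 1.0 / 60.0;   /* NTSC frame period, seconds */
        const int presses = 100000;
        double worst = 0.0, total = 0.0;

        srand(12345);
        for (int i = 0; i < presses; i++) {
            /* Press lands at a random point inside the frame... */
            double t_press = ((double)rand() / RAND_MAX) * frame;
            /* ...and is only observed at the poll at the end of the frame. */
            double latency = frame - t_press;
            total += latency;
            if (latency > worst) worst = latency;
        }

        printf("worst-case latency: %.2f ms (~1 frame)\n", worst * 1000.0);
        printf("average latency:    %.2f ms (~half a frame)\n",
               total / presses * 1000.0);
        return 0;
    }

Running it reports a worst case of about 16.7 ms (one frame) and an average of about 8.3 ms (half a frame).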

I'm not entirely sure how emulators handle this. They could deliver buffered controller inputs to the emulator on successive polls for guaranteed delivery, but then the emulated software would see a lot of inputs with odd timing, which may or may not break game logic (think of a fighting game where you need to enter specific inputs in sequence within specific time windows). So simply dropping inputs may be preferable to delivering a logjam of them.
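
For illustration, a hedged sketch of both strategies (the struct and function names are hypothetical, not taken from any particular emulator): the host side queues every controller state change, and each emulated poll either drains the queue one event at a time or just latches the latest state and drops whatever happened in between:

    /* Sketch only: hypothetical input queue plus two delivery strategies. */
    #include <stdint.h>
    #include <stdio.h>

    #define QUEUE_LEN 64

    typedef struct {
        uint16_t events[QUEUE_LEN];  /* button bitmask per host state change */
        int head, tail;
        uint16_t latest;             /* most recent host controller state */
    } input_queue;

    /* Host side: record every USB state change as it arrives. */
    static void host_push(input_queue *q, uint16_t state) {
        q->latest = state;
        int next = (q->tail + 1) % QUEUE_LEN;
        if (next != q->head) {           /* drop on overflow */
            q->events[q->tail] = state;
            q->tail = next;
        }
    }

    /* Strategy A: guaranteed delivery. Each emulated poll consumes one queued
     * event, so nothing is lost, but a backlog smears inputs across frames. */
    static uint16_t emu_poll_buffered(input_queue *q) {
        if (q->head == q->tail)
            return q->latest;            /* queue empty: repeat current state */
        uint16_t s = q->events[q->head];
        q->head = (q->head + 1) % QUEUE_LEN;
        return s;
    }

    /* Strategy B: drop intermediates. Each emulated poll just latches the most
     * recent host state, keeping timing honest but losing very short taps. */
    static uint16_t emu_poll_latest(input_queue *q) {
        q->head = q->tail;               /* discard anything queued */
        return q->latest;
    }

    int main(void) {
        input_queue q = {0};

        /* A quick tap (press, then release) lands between two emulated polls. */
        host_push(&q, 0x0001);
        host_push(&q, 0x0000);
        printf("buffered poll 1: %d\n", emu_poll_buffered(&q)); /* sees press, a poll late */
        printf("buffered poll 2: %d\n", emu_poll_buffered(&q)); /* sees release */

        host_push(&q, 0x0001);
        host_push(&q, 0x0000);
        printf("latest poll:     %d\n", emu_poll_latest(&q));   /* the tap is gone */
        return 0;
    }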

As you are intuiting, this isn't generally an issue on modern hardware. Emulated games play fine. But, also... if you have a chance to sit down with real hardware connected to a CRT... it feels different. Half a frame of lag on average vs. multiple frames.


