My 5-year-old uses our first iPad, which is over 10 years old now. The keyboard UI from then is still faster than my iPad Pro's keyboard UI because Steve Jobs made that a priority in the stack :)
OK so this is a partial tangent - but I have a washing machine that annoys the hell out of me for this exact same reason (I know washing machine UI is not a hot topic usually but it damn well should be!)
It has a dial on the front to choose settings. Potentially a great idea for jumping straight to the one I want - BUT the damn thing can only detect one step of change about every half second. So if you fast-turn it 4 clicks, it still only moves one step. So you have to stand there, slowly turning it, click, click, click, click....
The dial is the biggest design feature on there. Massive. Right in the middle. Bright lights all around it. But they couldn't even be bothered to make it solve the one problem it was there to solve.
I always think that about modern SLR cameras. They took a series of brilliant, instantly accessible physical dials (around the lens for aperture, on the top for exposure) and replaced them with menus and buttons on a tiny screen. WHY? How is that progress?
I think if someone did a kickstarter for a very simple physical buttoned digital camera it would do very well.
Open it up, take the PCB out, figure out how the dial sensor works (the dial is probably just repeatedly nudging a contact out of the way), and thennn... program an Arduino to be a giant input buffer (that slowly replays), and wire the Arduino between the dial and the rest of the machine. :D
Caveat: said Arduino may need to sit between the rest of the buttons as well, in case inputs must be given in sequence.
Possible benefit: switch the Arduino for something from the ESP family, and you could feed instructions to the machine over Wi-Fi or Bluetooth. (Which could take the form of a dedicated wireless button panel above the machine with macros for frequently-used options.)
(Hmm. I can tell there's some over-optimization in here, but I'm not sure _where_.)
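For fun, here's roughly what that input buffer could look like as an Arduino sketch. It assumes the dial just closes a single contact once per click and that the machine's board reads a similar active-low line; the pin numbers and all the timings are guesses, so treat it as a sketch of the idea rather than something you could wire in as-is.

```cpp
// Rough sketch of the "giant input buffer" idea, not a tested implementation.
// Assumption: the dial pulses one contact per click, and the machine's
// controller watches a similar active-low line but only registers about one
// click per ~500 ms. Pin numbers and timings below are made up.

const int DIAL_IN_PIN     = 2;  // dial contact, read with the internal pull-up
const int MACHINE_OUT_PIN = 3;  // line the machine's controller used to watch

const unsigned long DEBOUNCE_MS   = 20;   // ignore switch bounce on the dial
const unsigned long REPLAY_GAP_MS = 600;  // a bit slower than the machine's ~500 ms limit
const unsigned long PULSE_MS      = 60;   // how long each replayed click is held low

unsigned int pendingClicks = 0;           // clicks seen but not yet replayed
int lastDialState = HIGH;
unsigned long lastEdgeMs   = 0;
unsigned long lastReplayMs = 0;

void setup() {
  pinMode(DIAL_IN_PIN, INPUT_PULLUP);     // contact pulls the pin low when nudged
  pinMode(MACHINE_OUT_PIN, OUTPUT);
  digitalWrite(MACHINE_OUT_PIN, HIGH);    // idle state: no click
}

void loop() {
  unsigned long now = millis();

  // 1. Count clicks as fast as the user can spin the dial.
  int dialState = digitalRead(DIAL_IN_PIN);
  if (dialState != lastDialState) {
    if (dialState == LOW && (now - lastEdgeMs) > DEBOUNCE_MS) {
      pendingClicks++;                    // debounced falling edge = one click
    }
    lastEdgeMs = now;
    lastDialState = dialState;
  }

  // 2. Replay them to the machine no faster than it can digest.
  if (pendingClicks > 0 && (now - lastReplayMs) > REPLAY_GAP_MS) {
    digitalWrite(MACHINE_OUT_PIN, LOW);   // fake one click
    delay(PULSE_MS);
    digitalWrite(MACHINE_OUT_PIN, HIGH);
    pendingClicks--;
    lastReplayMs = millis();
  }
}
```

The point is just that counting and replaying are decoupled: you can spin the dial as fast as you like, and the Arduino meters the clicks back out at whatever rate the machine's board can actually keep up with.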
I’d be pretty surprised, because I feel like the UI in OS X has always had more latency than Windows. Especially noticeable with mouse acceleration, but also throughout the rest of the UI. This was my biggest gripe in switching to OS X.
Yeah. The mouse was always way better in Windows and Linux. Finder browsing files has also always been laggy. Now with the latest macOS versions it's ridiculous: if you take a screenshot, it takes about a minute for the file to appear.
Screenshots appear instantly in macOS, I'm not sure what machine you're on.
Finder browsing files is much faster in macOS; Windows scans the entire directory with Defender every time you click on it before it lets you sort it or interact with it. That can take several minutes on some of my directories.
The mouse tracking model in Windows is somewhat different, and a bit more responsive, but there are multiple third-party extensions for macOS that make its model exactly the same as the Windows model.
Indeed, I remember being particularly impressed by iPad battery life in the early days. I would recharge maybe once a week with light usage. Now I’ve got to plug in nearly every single night.
I charge my nearly 5-year-old Pixel C tablet about once a week. Light usage, mostly some streaming for up to an hour per day. Not sure if your light usage means something else though.
It’s been a while since I tested it (pre-Wayland). The main problems I saw back then were that the window managers tended to do lots of file I/O for things like moving windows or opening menus, so it was critical to tune the I/O scheduler (and NFS home directories were game over), and since rendering was up to the app it was common to have, say, the window chrome respond relatively well while everything inside the window was chunky on the main thread.
I’d hope this is much better now given things like Wayland and the increased use of parallelism.
Linux used to do that out of the box. But there was a series of decisions - "quasi-real-time has too high a maintenance cost", "quasi-real-time isn't even real enough for the real gains", and "our computers are fast now, we don't have to trade so much throughput for latency" - that were all very sensible but left us here.