It’s not imagined though. I use my partner’s phone sometimes, and every time I use it I think it’s broken because the UI jitter is so jarring at 60Hz. Actually, I’m still not convinced her phone isn’t broken. Also, her flashlight resets to the lowest brightness EVERY time it’s cycled.
If the UI jitter on their phone was "so jarring", it's not because it's 60 Hz. It's because the phone's CPU isn't keeping up.
Like, nobody watches a video filmed at 60 fps and then watches their favorite TV show or a motion picture at 24 fps and says "the jitter was so jarring". And that's at less than half the rate we're even talking about! Similarly, even if you can tell the difference between 60 and 120 Hz, it's not jarring. It's not jittery. It's pretty subtle, honestly. You can notice it if you're paying attention, but you'd never in a million years call it "jarring".
I think a lot of people might be confusing 60 Hz with jittery UX that has nothing to do with the display refresh rate. Just because the display operates at a higher refresh rate doesn't mean the CPU is actually refreshing the interface at that rate. And with certain apps, or with whatever is happening in the background, it often isn't.
> Like, nobody watches a video filmed at 60 fps and then watches their favorite TV show or a motion picture at 24 fps and says "the jitter was so jarring". And that's at less than half the rate we're even talking about!
Those have motion blur.
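Rough numbers make the motion-blur point concrete: with film's standard 180-degree shutter, each frame's exposure smears motion across half the frame interval, while a rendered UI frame is an instantaneous sample with no blur at all. A minimal sketch (the helper name and the pixel speed are made up for illustration):

```python
# Why filmed 24 fps doesn't read as jittery: each frame carries motion blur.
# A 180-degree shutter exposes for half the frame interval, smearing a moving
# object across roughly half its per-frame jump. A rendered UI frame is an
# instantaneous sample, so there is no such smear to bridge the gaps.

def blur_length_px(speed_px_per_s: float, fps: float,
                   shutter_deg: float = 180.0) -> float:
    """Length of the motion-blur streak captured in one frame."""
    exposure_s = (shutter_deg / 360.0) / fps
    return speed_px_per_s * exposure_s

# Assumed example: a pan moving across the screen at 1200 px/s.
print(blur_length_px(1200, 24))                   # 25.0 px of blur per film frame
print(blur_length_px(1200, 60, shutter_deg=0.0))  # 0.0 -- a sampled UI frame
```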
> Similarly, even if you can tell the difference between 60 and 120 Hz
I don't know why you're phrasing this in such an oddly doubtful way. Being able to tell the difference between 60hz and 120hz is hardly uncommon. It's quite a large difference, and it's quite well studied.
> If the UI jitter on their phone was "so jarring", it's not because it's 60 Hz. It's because the phone's CPU isn't keeping up.
No, it's not. This isn't about dropped frames or micro-stutters caused by the CPU. It's about _motion clarity_.
You can follow the objects moving around on the screen much better, and the perceived motion is much smoother because there is literally twice the information hitting your eyes.
You can make a simple experiment — just change your current monitor to 30hz and move the mouse around.
Does it _feel_ different? Is the motion less smooth?
It's not because your computer is suddenly struggling to hit half of the frames it was hitting before; it's because you have less _motion information_ hitting your eyes (and the increased input lag; but that's a separate conversation).
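To put rough numbers on "twice the information": at a fixed scroll or cursor speed, each displayed frame jumps half as far at 120 Hz as at 60 Hz, and those smaller jumps are what your eye tracks as smoother motion. A back-of-the-envelope sketch (the helper name and the 2000 px/s flick speed are assumptions for illustration):

```python
# How far an on-screen object jumps between consecutive frames at different
# refresh rates. Smaller per-frame jumps = smoother perceived motion.

def step_per_frame(speed_px_per_s: float, refresh_hz: float) -> float:
    """Distance in pixels an object moves between consecutive frames."""
    return speed_px_per_s / refresh_hz

# Assume a scrolling page or cursor moving at 2000 px/s (a quick flick).
speed = 2000.0
for hz in (30, 60, 120):
    print(f"{hz:>3} Hz: {step_per_frame(speed, hz):.1f} px per frame")
```

Note this has nothing to do with how fast the frames are rendered; it's purely how often the motion is sampled on its way to your eyes.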
60->120fps is less noticeable than 30->60fps; but for many, many people it is absolutely, very clearly noticeable.
> Like, nobody watches a video filmed at 60 fps and then watches their favorite TV show or a motion picture at 24 fps and says "the jitter was so jarring".
People absolutely complain about jitter in 24fps content on high-end displays with fast response times; it is especially noticeable in slow panning shots.
Google "oled 24fps stutter" to see people complaining about this.
It's literally why motion smoothing exists on TVs.
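The stutter has a concrete mechanical cause: 24 doesn't divide 60, so a 60 Hz panel has to hold alternating film frames for 2 and 3 refreshes (3:2 pulldown), which makes slow pans lurch; at 120 Hz every frame gets an even 5-refresh hold. A small sketch of that pacing (hypothetical helper, assumed frame counts):

```python
# Why 24 fps content stutters on fixed-rate displays: the frame times don't
# divide evenly, so frames are held on screen for uneven durations.

def pulldown_pattern(content_fps: int, display_hz: int, frames: int = 8):
    """Number of display refreshes each content frame is held for."""
    holds = []
    shown = 0
    for i in range(1, frames + 1):
        # Content frame i ends at time i / content_fps; count how many
        # display refreshes have elapsed by then.
        end_refresh = int(i * display_hz / content_fps)
        holds.append(end_refresh - shown)
        shown = end_refresh
    return holds

print(pulldown_pattern(24, 60))   # [2, 3, 2, 3, 2, 3, 2, 3] -> uneven, judders
print(pulldown_pattern(24, 120))  # [5, 5, 5, 5, 5, 5, 5, 5] -> even pacing
```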
If you switch from 60hz to 30hz you absolutely notice. I wouldn’t think it’s wrong to say it is jarring.
30hz is still perfectly usable, but you constantly feel as if something is off. Like maybe you have a process running in the background eating all your CPU.
I imagine going from 120hz to 60hz is the same thing. You’d think it should be theoretically indistinguishable, but it’s noticeable.
That's bs. You will immediately notice the difference when going from, say, 120 hz down to 60 hz on a fast gaming PC, even if you're just dragging windows around. Everything feels jarring, to say the least, compared to higher refresh rates, and it has absolutely nothing to do with the CPU. It's because of the refresh rate.
It's the same thing going from 120 hz to 60 hz on a phone while scrolling and swiping.
It's quite interesting, though, that there are people out there who won't notice the huge difference. But hey, at least they don't have to pay a premium for the increased performance of the screen.