Display refresh rate isn't necessarily just about one aspect like flicker or smoothness.
For one thing, it affects the latency from human input to graphics output. Because of how the graphics stack is implemented, especially with modern desktop compositors, there's typically at least 3 frames of latency between input and output. 3/60 is 50ms; 3/165 is 18ms. Whether or not you consciously notice it, the 165Hz display is going to feel more instant when you push a button.
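Quick back-of-the-envelope in Python (assuming a fixed three-frame pipeline, which is just the ballpark above, not a measurement of any particular stack):

    # Rough input-to-photon latency, assuming a fixed 3-frame pipeline
    # (input sampling -> app render -> compositor -> scanout).
    PIPELINE_FRAMES = 3  # ballpark assumption, not a measured value

    for hz in (60, 120, 165, 240):
        frame_ms = 1000.0 / hz
        print(f"{hz:>3} Hz: {frame_ms:5.1f} ms/frame -> ~{PIPELINE_FRAMES * frame_ms:5.1f} ms input-to-photon")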
There's also what's referred to as "judder". When you're watching 24fps video content on a 60Hz display, the frames of video get repeated 3 times, then 2 times, then 3 times, and so on (the classic 3:2 pulldown). This results in a roughly 16ms "judder" from frame to frame. It's subtle, and has been the norm for decades, but it is quantifiably less than ideal. A 165Hz display drops that judder down to about 6ms.
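Same kind of napkin math for the pulldown cadence, assuming each video frame is simply held for a whole number of refreshes:

    # Judder when 24fps content plays on a fixed-rate display, assuming each
    # video frame is held for a whole number of refreshes (e.g. 3:2 pulldown on 60Hz).
    import math

    CONTENT_FPS = 24

    for hz in (60, 120, 165):
        refresh_ms = 1000.0 / hz
        ratio = hz / CONTENT_FPS                  # refreshes per video frame (2.5 on 60Hz)
        short = math.floor(ratio) * refresh_ms    # hold time for the "short" frames
        long = math.ceil(ratio) * refresh_ms      # hold time for the "long" frames
        print(f"{hz:>3} Hz: held {short:4.1f} ms or {long:4.1f} ms -> ~{long - short:.1f} ms judder")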
Another aspect of the refresh rate has to do with the frequency response of the display technology itself, and what many might call "smoothness". Looking back at CRT technology, the image is instantaneous wherever the electron beam is currently pointed. The overall image looks stable due to persistence within the human eye. If you film a CRT, it can look pretty wonky. With a CRT, 30Hz is too slow because pretty much everyone can see the flicker. 60Hz is borderline on a CRT, and I personally can see the flicker in my peripheral vision. Motion looks smooth regardless, though, because all the persistence is in your eyes. With traditional LCDs, the pixels are always on, and they are relatively slow to change; there's persistence in the display itself. So 30Hz doesn't flicker, and all motion looks blurry no matter what. It just sucks, other than being a conveniently flat screen. With modern LCDs and OLED displays, the pixels are still always on, but they are back to being very fast to change. So 30Hz still doesn't flicker, and motion is no longer blurry; instead it looks jerky rather than smooth. At 60Hz things look pretty smooth, but you're still at the limit of some folks' peripheral vision.
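As a rough illustration of the jerkiness part (my own made-up panning example, not tied to any particular display): the jump an object makes between refreshes shrinks as the rate goes up.

    # Back-of-the-envelope: how far a moving object jumps between refreshes.
    # Assumes something panning across a 1920-pixel-wide screen in one second;
    # purely illustrative numbers.
    speed_px_per_s = 1920 / 1.0

    for hz in (30, 60, 165):
        print(f"{hz:>3} Hz: ~{speed_px_per_s / hz:5.1f} px jump per refresh")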
A GPU doesn't have to render frames at the refresh rate. An old frame will be repeated if there isn't a new frame ready yet. If the GPU can't keep up, a 165Hz display effectively becomes an 82.5Hz display, or a 41.25Hz display. There certainly is going to be a power penalty in the GPU circuitry driving the display at a higher rate, but it's marginal vs. the cost of rendering the frames themselves. 82Hz is still luxury compared to 60Hz, in that it's better than good enough for 99% of people.
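And the effective rate under classic vsync, assuming a missed deadline just means the old frame is held for another whole refresh (no variable refresh rate in play):

    # Effective frame rate when the GPU misses the vsync deadline and the old
    # frame gets repeated for another whole refresh (classic vsync, no VRR).
    BASE_HZ = 165

    for refreshes_per_frame in (1, 2, 3, 4):
        print(f"new frame every {refreshes_per_frame} refresh(es): "
              f"~{BASE_HZ / refreshes_per_frame:6.2f} Hz effective")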
What it boils down to is that pushing the refresh rate higher gives the GPU/software more fine-grained control over the display than it would otherwise have. That control allows the software to optimize latency, judder, and smoothness better than the display itself can at a lower refresh rate.