
Well, there is the backlight, but also the power usage of the TCON (timing controller) board, which varies greatly between displays. Usually the 4K ones end up being less efficient. If you get the 4K display, you can of course set the output to 1080p, which would solve the "processing power" end. I think the difference between 1080p and 1440p (or their 16:10 equivalents) at 60Hz would be negligible from a GPU perspective (especially with PSR, Panel Self-Refresh, enabled), but ultimately you'd have to test the two models at different resolution settings to really be able to tell.


> I think the difference between 1080p and 1440p (or their 16:10 equivalents) at 60Hz would be negligible from a GPU perspective

It is close to twice the number of pixels:

1080p = 2,073,600

1440p = 3,686,400

2160p = 8,294,400
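The counts above follow directly from the standard 16:9 pixel dimensions; a quick sketch to verify the arithmetic and the "close to twice" claim:

```python
# Pixel counts for common 16:9 resolutions
resolutions = {
    "1080p": (1920, 1080),
    "1440p": (2560, 1440),
    "2160p": (3840, 2160),
}

for name, (w, h) in resolutions.items():
    print(f"{name}: {w * h:,} pixels")

# 1440p pushes about 1.78x the pixels of 1080p,
# and 2160p exactly 4x.
ratio = (2560 * 1440) / (1920 * 1080)
print(f"1440p / 1080p = {ratio:.2f}")  # 1.78
```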




