Hacker News

On my Samsung ATIV (QHD+), I found that running the display at half the native resolution dramatically increased battery life and didn't seem to look any worse than a normal panel with a ~72 DPI native resolution.



How very odd... is it integrated graphics? I wonder if it's the display or the GPU/VRAM/CPU that consumes fewer resources.


Well... 1920x1080 @ 3 bytes per pixel @ 60Hz ≈ 356MB/s, while 3200x1800 @ 3 bytes per pixel @ 60Hz ≈ 989MB/s. In QHD+ mode your laptop is sending nearly a gigabyte of data to the display... per second. It might not be very power hungry to compute those pixels with a modern GPU, but even simply sending that much data requires a lot more power than sending "only" ~356MB/s.
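The arithmetic above can be checked in a few lines (a quick sketch; it assumes an uncompressed 24-bit RGB stream and reports MiB/s):

```python
# Raw display-link bandwidth for an uncompressed 3-bytes-per-pixel stream.
def link_bandwidth_mib(width, height, bytes_per_pixel=3, refresh_hz=60):
    return width * height * bytes_per_pixel * refresh_hz / 2**20

print(round(link_bandwidth_mib(1920, 1080)))  # ~356 MiB/s at 1080p
print(round(link_bandwidth_mib(3200, 1800)))  # ~989 MiB/s at QHD+
```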


I'm not so sure. Given this random Google hit (a paper on 10Gbps Ethernet NICs), I'm not sure it makes much difference: http://www.cl.cam.ac.uk/~acr31/pubs/sohan-10gbpower.pdf

I couldn't find anything on HDMI/4k, but this seems to suggest it doesn't make much difference (compared to what I assume the backlight and panel draw): http://en.wikipedia.org/wiki/Thunderbolt_(interface)

At any rate, even if we assume it takes four times as much power, and given that phones do just fine with 1080p+ displays, I doubt the signal is the problem. Maybe it's the RAM used for the video frames?

I always assumed it was the screen that consumed the power with HiDPI displays (flipping more pixels) -- but maybe it's something else.


I was always under the impression that the backlight accounts for most of the power a display consumes, which is why LED-backlit displays use so much less power than CCFL ones. That would remain constant with resolution, right? If that's the case, I'd think that the actual GPU work (or the RAM, like you say) is where the power difference was for me.

I could be wrong about how significant the backlight is on an LED display though.

I also thought that modern video cards (as in, from the last 15-20 years) optimize 2D graphics so that regular desktop use doesn't require pushing a full screen of pixels to the GPU on every frame. That would seem to diminish the significance of the actual pixel count at QHD+.
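The dirty-rectangle idea behind that optimization can be sketched roughly (hypothetical numbers; real compositors track per-window damage regions, but the principle is the same):

```python
# Sketch of dirty-rectangle compositing: only changed regions are re-uploaded,
# not the full framebuffer, on an otherwise idle desktop frame.
def bytes_to_upload(dirty_rects, bytes_per_pixel=3):
    # dirty_rects: list of (width, height) regions that changed this frame
    return sum(w * h * bytes_per_pixel for w, h in dirty_rects)

full_frame = bytes_to_upload([(3200, 1800)])   # naive: whole QHD+ screen
cursor_only = bytes_to_upload([(32, 32)])      # e.g. only the cursor moved
print(full_frame // cursor_only)               # thousands of times less data
```

Note this only saves work on the render/upload side; the link to the panel still scans out every pixel each refresh.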

I don't have a good answer.


That's a clever battery hack. I'll be using it on my sp3. Nice.



