
A lot of the computer advances of the last two decades have been about power efficiency: remember, phones are also computers. Laptops and tablets, too.

'Thanks' to limited battery capacity, consumers have been very interested in power efficiency.



I still have an old 3930K, 6 cores at about 3.6 GHz with 16 GB of RAM; it was used as a game server for years but it's not on now. It consumes about 110 W at idle. There is no GPU of note (a GT 710 or something like that); it's mostly the CPU and the lack of power control. A newer desktop with 32 GB of DDR5 and a 9800X3D, by contrast, will idle at 40 W with a lot more drives, a modern GPU, etc. New machines use considerably less power at idle, and if you go back as far as a Pentium 4, those things used 100 W all the time just for the CPU, whether in use or not.

Anything since about 9th-gen Intel Core behaves fairly well, as do all the modern-era Ryzen processors. There are definitely some CPUs in the middle that had a bunch of issues with power management and were slow ramping clock speed up, which was felt on the desktop as latency. For me, a major advance of the past ten generations of CPUs is how much power management behaviour has improved.


How would I measure how many watts my devices use in any given state they could be in?


Plug it into a kill-a-watt or equivalent cheaper clone meter. Read the number off the display.


Eh, your CPU can't stay in, e.g., the high power state (beyond its steady-state thermal envelope) for very long, but you'd still like to know how much power that state consumes.

The kill-a-watt is unlikely to be fast enough, especially with the capacitors in your computer's power supply smoothing out short spikes.


Are you interested in how much energy a certain instruction uses or are you interested in how much power your computer uses while running a certain program?
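For the latter, on Linux you can skip the wall meter and read the CPU's own energy counters (RAPL) around a workload. A minimal sketch in Python, assuming an Intel (or recent AMD) CPU that exposes /sys/class/powercap/intel-rapl:0 and permission to read it (often root); note it only covers the CPU package, not the whole system a kill-a-watt sees:

    import time

    RAPL = "/sys/class/powercap/intel-rapl:0"  # package-level energy domain

    def read_uj(name):
        with open(f"{RAPL}/{name}") as f:
            return int(f.read())

    def measure(workload):
        # The counter wraps at max_energy_range_uj; handle one wraparound.
        max_uj = read_uj("max_energy_range_uj")
        e0, t0 = read_uj("energy_uj"), time.monotonic()
        workload()
        e1, t1 = read_uj("energy_uj"), time.monotonic()
        joules = ((e1 - e0) % max_uj) / 1e6
        return joules, joules / (t1 - t0)

    # Hypothetical workload, just to have something to measure:
    j, w = measure(lambda: sum(i * i for i in range(10_000_000)))
    print(f"{j:.2f} J, {w:.1f} W average package power")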


That's definitely true, but I guess what I mean is that we keep eating up each new efficiency gain with more capability. It's not uncommon for modern PCs to have much larger PSUs than in the past -- it's just that these PCs are doing far _more_ with their power. We could have gone the other way, though: held capability steady while continuing to improve and refine efficiency.

But, to your point, ARM and Apple's M line are really exciting in this regard.


I don't think the desktop CPU and video chip manufacturers got that memo...

But as the top of this thread said, the most unjustified carbon footprint comes from javascript.


No, Intel didn't get it. AMD certainly did. An i7-14700K can draw 253 W to hit 5.4 GHz; a 9800X3D can boost to 5.2 GHz at 160 W. That's pretty close to the top end of desktop CPUs (for home use). As you go down the chain, you'll see huge drops in power usage.

Intel in particular is guilty of replacing its mid- and low-range CPUs with neutered low-power cores to try to claw back the laptop market.


160 W is less than Intel, but still a lot.

And I bet that even with AMD you get 85% of the performance with 50% of the power consumption on any current silicon...

And 85% of the current performance is a lot.

I have an AMD box that I temperature-limited from the BIOS (because I was too lazy to look for where the power limits were). It never uses more than 100 W and it's fast enough.


Have you heard of 'race-to-idle' or 'race-to-sleep'?

Temporarily allowing performance (and thus power) spikes might make your whole system consume less energy, because it can idle more.
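Toy arithmetic with made-up numbers to show why: finish a fixed task fast and then idle, versus run it slowly at lower power, over the same 10-second window. (Whether racing wins in practice depends on how deep the idle state is and how quickly the chip reaches it.)

    IDLE_W = 5.0  # assumed idle draw

    def window_energy(active_w, active_s, window_s=10.0):
        # Energy (joules) over the window: active phase plus idle remainder.
        return active_w * active_s + IDLE_W * (window_s - active_s)

    race  = window_energy(active_w=60.0, active_s=2.0)   # sprint, then idle
    crawl = window_energy(active_w=20.0, active_s=10.0)  # slow and steady
    print(f"race-to-idle: {race:.0f} J, slow-and-steady: {crawl:.0f} J")
    # race-to-idle: 160 J, slow-and-steady: 200 J -- sprinting wins here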


You can't race to idle when two out of three websites go amok with your CPU :)

That's like saying communism works in theory, it's just applied wrong.


None of these high-end CPUs are bursting to 5+ GHz for a regular website's JavaScript. They're running at their normal 40-60 W draw. The massive burst is when you throw video encoding or compilation tasks at them. You can tell, because you need to open a window when it happens.
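If you want to check for yourself, you can watch the clocks live. A sketch, assuming a Linux box exposing cpufreq under sysfs; run it, then open a heavy site versus kicking off a compile and compare:

    import glob, time

    paths = sorted(glob.glob(
        "/sys/devices/system/cpu/cpu*/cpufreq/scaling_cur_freq"))

    while True:  # Ctrl-C to stop
        # scaling_cur_freq is in kHz; print each core in MHz.
        mhz = [int(open(p).read()) / 1000 for p in paths]
        print(" ".join(f"{m:5.0f}" for m in mhz), "MHz")
        time.sleep(1)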


> But as the top of this thread said, the most unjustified carbon footprint comes from javascript.

You think that a programming language has a carbon footprint just from existing? Maybe you could reword your argument.



