I still have an old 3930K, 6 cores at about 3.6 GHz with 16 GB of RAM. It was used as a game server for years but it's not on now. It consumes about 110 W at idle; there's no GPU of note (a GT 710 or something like that), so it's mostly the CPU and the lack of power management. A newer desktop with 32 GB of DDR5 and a 9800X3D, however, will idle at 40 W with a lot more drives and a modern GPU. New machines use considerably less power at idle, and if you go back as far as the Pentium 4, those things burned 100 W for the CPU alone whether it was in use or not.
Anything since about 9th-gen Core behaves fairly well, as do all the modern-era Ryzen processors. There are definitely some CPUs in the middle that had a bunch of power management issues and were slow to ramp clock speed up, which you felt on the desktop as latency. For me, one of the major advances of the past ten or so CPU generations is how much the power management behaviour has improved.
Eh, your CPU can't stay in, e.g., the high power state (past its steady-state thermal envelope) for very long, but you'd still like to know how much power it consumes while it's there.
A Kill A Watt is unlikely to be fast enough, especially with the capacitors in your computer's power supply smoothing out the draw.
Are you interested in how much energy a certain instruction uses or are you interested in how much power your computer uses while running a certain program?
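If it's the latter, one rough way on Linux is a sketch like the following, which samples the package energy counter before and after a command runs (assuming a CPU and kernel that expose RAPL through the powercap path below, and permission to read it):

```python
# Rough sketch: measure package energy consumed while a command runs,
# using the Linux powercap (RAPL) interface. Assumes the files below exist
# and are readable on this system.
import subprocess
import sys
import time

ENERGY_FILE = "/sys/class/powercap/intel-rapl:0/energy_uj"
MAX_FILE = "/sys/class/powercap/intel-rapl:0/max_energy_range_uj"

def read_uj(path):
    with open(path) as f:
        return int(f.read())

start_uj = read_uj(ENERGY_FILE)
start_t = time.monotonic()

subprocess.run(sys.argv[1:])  # e.g. python measure.py ./my_program

end_uj = read_uj(ENERGY_FILE)
elapsed = time.monotonic() - start_t

# The counter wraps around; correct for a single wrap.
delta_uj = end_uj - start_uj
if delta_uj < 0:
    delta_uj += read_uj(MAX_FILE)

joules = delta_uj / 1e6
print(f"package energy: {joules:.2f} J over {elapsed:.2f} s "
      f"(avg {joules / elapsed:.1f} W)")
```

Note this counts everything running on the package during that window, not just your program; on many systems `perf stat -e power/energy-pkg/ <cmd>` reads the same counter for you.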
That's definitely true, but I guess what I mean is that we keep eating up new efficiency gains with more capacity. It's not uncommon for modern PCs to have much larger PSUs than in the past -- it's just that these PCs are doing far _more_ with their power. We could have gone the other direction, though -- held capability at a plateau and kept improving and refining efficiency.
But, to your point, ARM and Apple's M line are really exciting in this regard.
No, Intel didn't get it. AMD certainly did. An i7-14700K can draw 253 W to hit 5.4 GHz; a 9800X3D can boost to 5.2 GHz at 160 W. That's pretty close to the top end of desktop CPUs (for home use). As you go down the chain, you'll see huge drops in power usage.
Intel in particular is guilty of replacing its mid- and low-range CPUs with neutered low-power cores to try to claw back the laptop market.
And I bet that even with AMD you get 85% of the performance with 50% of the power consumption on any current silicon...
And 85% of the current performance is a lot.
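Back-of-the-envelope (treating those 85%/50% figures as an assumption, not a measurement), that trade works out to roughly 1.7x the performance per watt:

```python
# Toy perf-per-watt comparison, assuming the 85% / 50% figures above.
stock_perf, stock_power = 1.00, 1.00        # normalized stock operating point
limited_perf, limited_power = 0.85, 0.50    # power-limited operating point

stock_eff = stock_perf / stock_power
limited_eff = limited_perf / limited_power
print(f"perf/W: stock {stock_eff:.2f}, limited {limited_eff:.2f} "
      f"({limited_eff / stock_eff:.2f}x better)")   # -> 1.70x better
```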
I have an AMD box that I temperature-limited from the BIOS (because I was too lazy to look up where the power limits were). It never uses more than 100 W and it's fast enough.
None of these high-end CPUs are bursting to 5+ GHz for a regular website's JavaScript; they're running at their normal 40-60 W draws. The massive bursts come when you throw video encoding or compilation tasks at them. You can tell, because you need to open a window when it happens.
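If you want to check that for yourself on Linux, a minimal sketch (assuming cpufreq exposes the usual scaling_cur_freq files) is to poll the per-core clocks while the page loads and see whether anything actually boosts:

```python
# Minimal sketch: poll per-core frequencies from cpufreq sysfs for ten
# seconds so you can watch whether the CPU boosts during a workload.
# Assumes Linux with the standard scaling_cur_freq files present.
import glob
import time

FREQ_FILES = sorted(glob.glob(
    "/sys/devices/system/cpu/cpu[0-9]*/cpufreq/scaling_cur_freq"))

for _ in range(10):                 # sample once a second for 10 seconds
    mhz = [int(open(p).read()) / 1000 for p in FREQ_FILES]  # kHz -> MHz
    print(f"max {max(mhz):5.0f} MHz  avg {sum(mhz) / len(mhz):5.0f} MHz")
    time.sleep(1)
```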
'Thanks' to limited battery capacity, consumers have been very interested in power efficiency.