The real problems with Skylake-X are chipset cost, power consumption, shitty partner boards, and TIM. All of these are forgivable given the performance - except the TIM.
Chipset cost will come down in 6-12 months after launch like it always does. This is par for the course; at launch, X370 boards for Ryzen were going for well over $250 as well.
Power consumption is a consequence of AVX512 and the mesh interconnect along with raw core count. Everyone wants higher clocks, more cores, and more functional units. There are no easy efficiency gains anymore, and this is the price - power consumption. This is the "everything and the kitchen sink" processor and it runs hot as a result - but it absolutely crushes everything else on the market. This is no Bulldozer.
Board partners putting insulators on top of their VRMs was going to come to a head sooner or later. This is the natural outgrowth of form over function: RGB LEDs on everything and stylized heatsink designs that insulate the board instead of actually cooling it. The terrible reviews on those boards will sort this problem right out - they're unusable in their current form.
Intel has been cruising for issues with their TIM for years (since Ivy Bridge); this time they finally have a chip that puts out enough heat that they can't ignore it. Intel can get away with making you delid a $200 i5 or a $300 i7, but it's not acceptable on a $1000 processor.
There is still a market for a 6-12C HEDT chip that can hit 5 GHz overclocked. This thing absolutely smokes Ryzen in gaming at stock clocks let alone OC'd - single-thread performance is still a dominant factor in good gaming performance and this chip delivers in spades. Combining its leads in IPC and clocks, it's fully 33% faster than Ryzen in single-thread performance. This is just a brutal amount of performance for gaming. Unfortunately without delidding you're not going to hit good OC clocks given the current TIM. And delidding is a dealbreaker on a $1000 CPU.
TIM is the actual core problem with Skylake-X - everything else will sort itself out. Skylake-X with solder would be a winner, and Intel would be wise to turn the ship as fast as possible. The 6C and 8C versions are priced much more reasonably and will sell great as long as they fix the TIM problem.
Intel claims they have problems with dies cracking, but AMD manages to solder much smaller dies, so IMO Intel just doesn't have a leg to stand on here. This is not something that should be pushed onto the customer with a $1000 processor - you're Chipzilla, figure something out.
Yup. TIM goes on the die to transfer heat to the Integrated Heat Spreader (which is the lid you normally see). Intel has an issue with their thermal paste: it doesn't properly fill the void and doesn't contact the IHS well, so heat just builds up in the die. Check out the chart in the Tom's Hardware review.
Heat just is not getting to the IHS properly on these chips, and heavy overclocking just makes the whole thing worse.
It's really been a slow-burning problem since Ivy Bridge, where Intel switched from soldering the lid to TIM + an adhesive. Solder has long been preferred for its superior heat transmission, but Intel says smaller dies have problems with cracking over time due to thermal cycling. However, AMD has been happily soldering the lid on much smaller Ryzen dies, so apparently it's not all that much of an issue in practice.
Well, you can get away with that on a processor that puts out 50 watts during normal operation. It's been an issue for a while on the unlocked/overclockable SKUs, particularly on the latest 7700Ks, but even an OC'd 7700K only puts out ~100W, so it was relatively manageable. Extreme overclockers could delid and replace the thermal paste with something better (often liquid metal like Conductonaut), which does help performance quite a bit.
But, with the higher TDP of Skylake-X, this has become a pressing issue just for normal operation. Things change when we're talking about a $1000 processor that needs to be delidded to sustain boost at stock settings. That's just not acceptable.
I almost would rather have a bare die at this point. Mounting pressures are no longer insane, so it wouldn't be as terrible an ordeal to mount as Athlon XPs were back in the day (god forbid your screwdriver slip on that bracket - with 50+ pounds of force you are guaranteed to gouge something).
"This thing absolutely smokes Ryzen in gaming at stock clocks let alone OC'd - single-thread performance is still a dominant factor in good gaming performance and this chip delivers in spades."
That's if you are limiting yourself to 1080p... At the resolution I game at (3440x1440) those performance differences disappear very fast. And even at 1080p, 150 vs 180 FPS doesn't matter that much for the majority of people.
The cost of this chip alone is the same as some entire Ryzen builds. There is a point where it financially doesn't make sense (price/performance) even if it's the fastest chip around.
The 6C version is priced directly against Ryzen (~$350), and the 8C version is only modestly more expensive (~$500), and they have a huge lead in gaming performance. That's perfectly affordable for the performance they give - virtually the same prices Ryzen launched at, in fact.
With this much of a lead in single-thread performance (~33%), a 6C Intel is actually outperforming an 8C Ryzen even in multi-thread performance (see the rough math below), and it's stomping it in games because single-thread performance is still so critical.
And the 8C Intels are just 33% faster than Ryzen across the board.
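Back-of-the-envelope sketch of that claim, assuming per-core throughput scales with the ~33% single-thread lead and ignoring SMT, memory bandwidth, and interconnect differences (a toy model, not a benchmark):

    # Toy throughput model: core count x relative per-core speed.
    # Ignores SMT, memory bandwidth, and interconnect effects.
    ryzen_8c = 8 * 1.00   # 8 cores at baseline speed   -> 8.0
    intel_6c = 6 * 1.33   # 6 cores, ~33% faster each   -> ~8.0
    intel_8c = 8 * 1.33   # 8 cores, ~33% faster each   -> ~10.6
    print(ryzen_8c, intel_6c, intel_8c)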
High-refresh gaming requires excellent single-threaded performance regardless of resolution, and 144 Hz is basically the new standard for midrange/high-end gaming builds at this point. A 144 Hz monitor starts at literally $150 and a very nice IPS 144 Hz FreeSync/GSync monitor can be had for $400-600.
It's not just 1080p - CPU single-thread-performance requirements scale with the framerate, it's just easier to hit higher framerates at lower resolutions. So 1080p benchmarks are a "leading indicator" of future high-refresh gaming performance as GPU tech improves and you upgrade in a year or two.
On the flip side, 4K benchmarks really mean almost nothing for CPUs. A Pentium G4560 is within a stone's throw of a 7700K at 4K because everything is GPU-bottlenecked at a very low framerate that virtually any processor can deliver. But that G4560 will fall behind in no time at all as GPU performance continues to improve and its actual performance (or lack thereof) is laid bare.
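As a toy illustration of that bottleneck effect (made-up frame times, just to show the shape of it):

    # Framerate is set by whichever of CPU or GPU takes longer per frame.
    def fps(cpu_ms, gpu_ms):
        return 1000.0 / max(cpu_ms, gpu_ms)

    fast_cpu, slow_cpu = 5.0, 10.0   # ms of CPU work per frame (made up)
    gpu_1080p, gpu_4k  = 6.0, 25.0   # ms of GPU work per frame (made up)

    print(fps(fast_cpu, gpu_1080p))  # ~167 FPS - CPU choice shows up
    print(fps(slow_cpu, gpu_1080p))  # 100 FPS
    print(fps(fast_cpu, gpu_4k))     # 40 FPS - both CPUs look "equal"
    print(fps(slow_cpu, gpu_4k))     # 40 FPS, until the GPU gets faster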
Or I guess put another way for gamers, saving $500 on your CPU lets you go from 1 GPU to 2 GPUs. For a lot of games, that is going to be a pretty big performance win.
It's silly to even look at 1080p for high-end gaming, beyond having some "rule-of-thumb" number to compare performance historically.
I remember I used to advise people to spend half on the monitor, half on the PC system (and out of that, maybe close to half on the GPU). That used to mean ~1000 USD for the monitor(s) and a ~500 USD GPU - for a total system price of ~2000 USD.
Today, most people would probably aim for a lower total cost, but it's still silly to sink a lot of cash into getting a great system, only to have a crappy monitor ruin the experience.
(Another caveat: I'd guess a high-end monitor should be able to survive/remain usable for closer to a decade than the 3-5 years that would be more typical for a PC. Of course, part of the reason for getting a PC system is the possibility of incremental upgrades.)
27" 144 Hz IPS GSync/FreeSync like the XF270HU or XB271HU is the place to aim for a general-purpose monitor right now ($400-600 depending on model and new/refurb). People inevitably are a bit dubious on them at first given the cost, then they try them and agree it's worth every penny.
Unless you want ultrawide that is - but there are some caveats there with game compatibility due to the aspect ratio.
At the desktop level I don't get why people care that much about power consumption? It means you have to dissipate more heat, okay, and that means you can't use a cheap cooler. But AFAIK even an extra 100 W is cheap even in areas with the most expensive electricity, especially when contrasted against productivity, or cigarette breaks, or people sometimes being 20 minutes late to work...
Actually, air-cooling is marginal with these CPUs, even at stock frequencies. There is no air-cooler which allows them to sustain their frequencies under load.
Water-cooling is pretty much required, and an AIO does not cut it. Still, even with water-cooling you can't really overclock these. They are pretty much at their limit out of the factory.
This could conceivably be solved by Intel switching from silicone TIM to e.g. solder, since the Rth(jc) of these CPUs (~0.3 K/W) is much worse than the thermal resistance of a big CPU air cooler (~0.1 K/W).
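To put numbers on it, a minimal sketch using those rough Rth figures (the 250 W load and the solder Rth are my assumptions, not measured values):

    # Steady-state die temperature: T = T_ambient + P * (sum of Rth in the path)
    power_w  = 250    # assumed heavy-load package power, W
    rth_jc   = 0.3    # junction-to-case via Intel's TIM, K/W (rough)
    rth_sink = 0.1    # big air cooler, K/W (rough)

    print(25 + power_w * (rth_jc + rth_sink))  # 125 C -> throttles on air
    print(25 + power_w * (0.1 + rth_sink))     # 75 C if solder got Rth(jc) to ~0.1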
The overclocking problems wouldn't be fixed though; there is no easy fix for a CPU that jumps to 400+ watts.
-
An entirely separate issue is that you need to get the heat out of your office. A human dissipates around 50-100 W; you can imagine that a small office crowded with half a dozen people is not pleasant in the summer.
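Quick arithmetic on that (the per-person and per-PC figures are rough assumptions):

    # Rough office heat load: people plus workstations, all of it
    # ends up as heat in the room. Assumed figures, not measurements.
    people, w_per_person = 6, 75    # humans dissipate ~50-100 W each
    pcs, w_per_pc        = 6, 300   # assumed loaded workstation draw, W

    print(people * w_per_person + pcs * w_per_pc)  # 2250 W - more than a space heater flat out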
I fully agree, and what's more Intel has the performance to back it up. This chip pulls a lot but it's wicked fast, it's a massive step forward in framerates. It combines the minimum-framerate improvements of HEDT/Ryzen with the single-threaded performance of Kaby Lake. Oh yeah and AVX512 too.
For a sense of perspective here, going from a circa-2011 2600K to a current 7700K is a 40% jump in performance, so a lead of this size is roughly equivalent to 4-5 years of gains at Intel's usual tempo - only you also get 10 cores on this platform. This thing is an absolute monster for gaming or other tasks that lean heavily on single-threaded performance.
But the power consumption is really the triggering issue for the problems with shitty partner-boards overheating and the TIM. The TIM is really the showstopper right now.
For me it's purely a noise issue. Less heat to dissipate means less fan noise (Only speaking about home use here. In an office setting the difference would be unnoticeable).
> In an office setting the difference would be unnoticeable
If you have a hall full of developers, having lots of noisy PCs can be annoying.
Where I work, we optimized for more silent PCs, because all these things do add up, and given the perf/watt-ratio you can get out of modern CPUs, there's no reason for people to need to have noisy PCs.
Even at home, optimizing watt usage, even for a desktop build, is not completely without merit. All my future projects are planned to be as fanless as possible. And I know others who do the same. And if Intel can't deliver that, they'll just go buy something ARM-based, like an RPi 3, which these days is getting good enough to actually do production loads.
I'm not even getting close to wanting a system where the CPU alone can draw 400+ watts.
> an RPi 3, which these days is getting good enough to actually do production loads
They really aren't - CPU performance is largely irrelevant given the RPi's architectural weaknesses. USB was never meant as a system bus, and everything has to loop through the kernel stack. Having every single peripheral hanging off a single USB 2.0 bus is crippling for performance. A Pi can't even serve a share at full 100 Mbit speed due to bus contention, let alone do anything more intensive.
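Rough sketch of the bus math (illustrative numbers; the Pi 3's Ethernet is itself a USB device, and real-world USB 2.0 throughput is well below the 480 Mbit/s spec):

    # On a Pi 3 the disk and NIC share one USB 2.0 bus, so every served
    # byte crosses that bus twice: disk -> RAM, then RAM -> NIC.
    usb2_real_mbit = 200   # optimistic real-world USB 2.0 throughput (assumed)
    nic_mbit       = 100   # the Ethernet port itself hangs off USB

    print(min(nic_mbit, usb2_real_mbit / 2))  # 100 Mbit ceiling on paper
    # Protocol overhead and per-packet kernel work on that shared bus
    # drag real SMB/NFS throughput well below even that ceiling.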
It's very similar to one of Apple's more famous goofs, the Performa 5200/6200 with its left-hand/right-hand bus split that forces the CPU to handle everything.
Some of the clone boards have USB 3.0, SATA, gigabit ethernet, etc and are much better performers in practice despite having slower CPUs "on paper". Or there are little mini-PCs using 5-15W laptop processors that are really nice and run x86 distros/binaries.
All of these are at roughly comparable TCOs to a Pi (they include things like AC adapters that must be purchased separately for the Pi). The RPi is a bad choice for server usage.
> They really aren't, CPU performance is really irrelevant given the RPi's architectural weaknesses
Obviously "production loads" is an undefined term and as such we can discuss infinitely back and forth exactly how much these cheap ARM machines can actually handle.
I also didn't mean to single out the RPi 3 as a universal performer, optimal for everything out there.
My point was that I'm seeing an increased amount of people who are happy with what these cheap boards can do, who 10 years ago would have been forced to buy a server of sorts to cover the same needs.
So now they don't buy servers. Instead they buy cheap, tiny and fan-less ARM-based machines and they're perfectly happy. They even think running ARM is cooler than running Intel, so it's something they brag about.
I'm absolutely not saying I'm going to replace my company's server-farm or my dev-computer with these anytime soon, but Intel cannot completely ignore the power-efficiency aspect either if they want to keep their dominance in the market.
I bought my Ryzen chip on its release day. I don't need some X370 board; I got myself a B350M board, which doesn't hold me back at all for my applications. It cost me $100 delivered.
There is no such alternative for Skylake-X - Intel charges you an arm and a leg for its half-decent products.
Overclockers are vocal but I can't imagine they're more than a tiny portion of the market these days? For most use cases modern CPUs are plenty fast enough stock. Businesses won't do something unsupported. Games will be written to run properly on supported chips. Maybe with a lot of effort you can get your games looking slightly nicer, sure, but for how many people is that worth it?
> Unfortunately without delidding you're not going to hit good OC clocks given the current TIM. And delidding is a dealbreaker on a $1000 CPU.
Yeah, absolutely agree. What's sad is there are fanboys defending this, saying it makes direct-die cooling easier through delidding, which saves 1-2 °C over solder, and that it's the best idea Intel has had in recent years.