I think this is the first major layoff announcement at a big, established tech firm I've heard this year that is anywhere near this size. I mean, there have been loads of "couple hundred" employee layoffs, and certainly a lot of hiring slowdown/freezes, but again, haven't seen anything else near this big, and the layoffs previously seem to have been concentrated in unprofitable VC-funded companies. I know Tesla laid off a couple thousand people this summer, that's the closest thing that comes to mind.
Intel certainly has its own unique challenges, but even, for example, AMD had been doing pretty great up until this year, and they also just announced a big shortfall for their 3rd quarter results. Just wondering if this is really the tip of the iceberg for a true, broad retrenchment in tech, after the past 9 months of "Is a recession coming? Is a recession coming?"
Client/consumer CPUs (Ryzen) are just one business for AMD.
The others are graphics (Radeon), datacenter (EPYC), and embedded (Xilinx, PS & Xbox consoles).
The one-year forecast will probably have to be adjusted based on the current unexpected economic conditions, but they will keep growing.
What folks don't appreciate is that AMD is on a roll: a clear roadmap and processes in place to execute like clockwork in tandem with TSMC. Intel can easily bounce back in about 3-4 years if they fix their foundry issues.
If they hadn't cut fab R&D years ago to juice profit margins, they wouldn't be in this situation now. I have no reason to believe they will suddenly start acting intelligently or with any real long-term thinking. Most likely it's just about managing the decline at this point (please Pat, prove me wrong)
It's still not clear that Intel shouldn't just go fabless like their competition. They're going for a hybrid approach right now where some of their own stuff is produced elsewhere. They're trying to get some manufacturers to use their fabs but they're nowhere near the state of the art. They lost that battle to TSMC a long time ago.
Will not happen for National Security reasons. DoD standard policy is that high-criticality components must be able to be made domestically. Or at least it was at some point.
Yes, but: the CHIPS act plans to increase domestic semiconductor manufacturing significantly. In the future, Intel might be able to go fabless with domestic manufacturing partners to satisfy DoD requirements.
“America invented the semiconductor, but today produces about 10 percent of the world’s supply—and none of the most advanced chips. Instead, we rely on East Asia for 75 percent of global production. The CHIPS and Science Act will unlock hundreds of billions more in private sector semiconductor investment across the country, including production essential to national defense and critical sectors.”
Just because Intel would go fabless doesn't mean the fabs themselves would disappear. They could spin them out just like AMD did when they were in trouble.
This. AMD's CPUs are better than ever. Their upcoming GPUs are looking very promising, especially considering how team green turned into a luxury brand. And their support for Linux has been very welcome.
Also, their CPUs with integrated Radeon graphics perform a lot better than the Intel equivalents. You can grab a 5600G or 5700G alone, no external video card needed, and expect to play most recent games, albeit not at maximum detail. For those not obsessed with gaming at the highest possible detail, this translates into saving some serious money.
A search for "5700g gaming" on YouTube returns some examples.
Peloton laid off over 4000. Snap: 1280 (20% of the company). Shopify: 1000 (10%). Groupon: 15%. Salesforce: 1000. Microsoft: 1800. Carvana (while not "big established", still a lot of people): 2500. Tencent: 5500. Alibaba: 9500. ByteDance: 1000. Zillow: 2300.
This definitely isn't really the first major layoff announcement.
Outside of big tech: Credit Suisse laying off 5000, Ford 8000, Telefónica 2700, Société Générale 3700.
A bunch more, but these are the ones measured in thousands or otherwise a significant percentage of a company's workforce.
(By the way, not arguing with you, my point is that this isn't surprising, the writing has been on the wall for the past year, so an Intel layoff isn't a bellwether for things getting bad -- it's a lagging indicator of things already being bad.)
Yeah, understood it's a lagging indicator, but most of those US companies you mention (Peloton, Snap, Shopify, Carvana) are in the "unprofitable VC-funded camp". Salesforce and Microsoft obviously aren't, but their numbers are also much, much smaller and the amounts are a really teeny percentage of their overall workforce. Zillow is a bit of a special case because of their complete f'up with flipping houses. Oracle feels like the only one really comparable to me, but perhaps I'm just showing my personal bias that I'd really wish Oracle would lay off everyone and go under, but I digress...
I guess my main point was that, even with recent layoffs, feels like most of those folks wouldn't have had much difficulty getting snapped up by other companies, especially in engineering (not saying it wasn't disruptive to those involved). But once you start laying off 20,000 here and 10,000 there, you get to the musical chairs point where some folks are going to be left without a chair for some time.
None of the tech companies you mentioned besides Microsoft and Alibaba are Big Tech. Most of them were bad businesses that VCs were able to pawn off on the public markets during the bubble. They were more or less a Ponzi scheme.
> My point is that this isn't surprising, the writing has been on the wall for the past year, so an Intel layoff isn't a bellwether for things getting bad -- it's a lagging indicator of things already being bad.
Yeah. It certainly was not a surprise and the market euphoria was going to all end in tears. Saw that a mile away several months before it happened. [0]
I don't know a lot of the details, but everything I've heard over the last couple of years indicated that AMD was absolutely crushing Intel.
A recent laptop I purchased, as well as the last desktop I put together (~2 years ago), each have Ryzen chips. I forget the details, but in addition to performance issues, didn't Intel CPUs also have some major security vulnerabilities? And weren't they related to speculative-execution optimizations that, when disabled to address the vulnerabilities, led to even worse performance?
So if AMD isn't doing great at the moment either, I can't imagine how hard Intel has been hit. I don't know anyone who is buying or recommending Intel CPUs at the moment.
I don't have any data, but from word of mouth and various posts and videos online, 12th-gen Intel CPUs seem pretty popular for gaming builds. They're winning in benchmarks against AMD's Ryzen 5000 series (as they should, being newer), but are also cheaper. I'll be curious to see how 13th-gen Intel vs. Ryzen 7000 plays out. There are always complicating factors, such as motherboards for Ryzen 7000 being quite pricey for now.
Intel repeated their Prescott (Pentium 4/Pentium D) strategy of completely removing power limits in an effort to beat AMD.
As a result, the Intel processors have TDPs and real world power usages >2.5x that of a comparable Ryzen. Sure, they're winning, but at what cost? The 12900K at 240 watts pulls almost the same power as a 280W 64-core Threadripper.
AMD is responding in kind, with new top-end processors pulling 170W, or higher with their built-in overclocking that pushes the chip to even higher power draws as long as cooling permits. This looks to put them back into the lead, but it's just not a sustainable strategy.
I'm not sure if the actual differences in energy usage are so clear cut. These charts [0], which account for the 12900K spending less time to accomplish tasks than the 5950X, seem to indicate the disparity isn't so terrible. You can always undervolt too; the 13th-gen press release includes charts [1] showing that the 13900K at 65W matches the performance of the 12900K at 241W.
When a 5800X3D offers equal or better gaming performance at 1/2 to 1/3rd the power consumption (e.g. 220W vs 80W), and you pay $0.65 per kWh during peak times, I can only imagine: "Quite a few".
If electricity was $0.65/kWh for me, I'd move out of the country lol. Assuming a more realistic $0.40/kWh (considered very expensive in the US, where most enthusiasts live), 8 hours of gaming a day, and a 200W power limit, you're paying $19/mo. Not bad.
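If you want to check the arithmetic, here's a minimal sketch of that cost math. The wattage, hours, and rates are the assumed numbers from these comments, not measurements:

```python
# Back-of-the-envelope electricity cost for a power-limited gaming PC.
# All inputs are the commenters' assumptions (200 W limit, 8 h/day, $0.40 or $0.65 per kWh).

def monthly_cost(watts: float, hours_per_day: float, price_per_kwh: float, days: int = 30) -> float:
    """Monthly electricity cost in the same currency as price_per_kwh."""
    kwh_per_month = watts / 1000 * hours_per_day * days
    return kwh_per_month * price_per_kwh

print(f"${monthly_cost(200, 8, 0.40):.2f}/mo")  # $19.20 at the 'realistic' US rate
print(f"${monthly_cost(200, 8, 0.65):.2f}/mo")  # $31.20 at the peak rate mentioned above
```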
Don't get me wrong, the 5800x3d is a phenomenal CPU. However like many owners of that CPU, I'm also in the market for a 4090 and intend to use the full 600W power limit with my water loop and outdoor radiator. CPU power consumption is just not an issue for enthusiasts.
Nice, ha ha. I was going to say, look into undervolting. I saw 95% perf at 60% of the power come up, i.e. 270W; that's actually going to be superb (very cool, quiet, still extremely performant).
So maybe configure that as an optional profile for titles not needing maximum juice?
I'm quite keen myself.
The power price I mentioned was for peak times in Norway; I'm in Australia, where it's nowhere near that bad yet (AUD 0.25/kWh for me).
> I saw 95% perf at 60% of the power come up, ie 270w
That's neat, but 5% is a lot when you're spending $1600. Oh and all these measurements you see tossed around are done with air coolers and stock voltages. Of course more power doesn't help if you're limited by thermals. Specifically, you're limited by hotspots you can't see because they're between the on-die temperature sensors.
> maybe configure that as an optional profile for titles not needing maximum juice?
4K gsync monitors are typically limited to 120Hz. If my GPU is capable of running a title at 1200fps, it will idle 90% of the time and consume only ~60W regardless of the 600W power limit. So tuning this is pointless. I'd still get 1/10th the rendering latency compared to a GPU that's only capable of 120fps. Same logic applies to power limiting your GPU which will increase latency and kill frame pacing in exchange for a tiny reduction in electricity bills.
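Putting rough numbers on that, a quick sketch using the same round figures from this comment (a GPU that could render 1200 fps driving a 120 Hz display):

```python
# Idle time and rendering latency for a frame-capped GPU, per the comment's round numbers.
capable_fps = 1200          # what the GPU could render, uncapped (assumed)
display_hz = 120            # refresh-rate cap of the monitor

busy_fraction = display_hz / capable_fps      # 0.10 -> the GPU works 10% of the time
render_ms_fast = 1000 / capable_fps           # ~0.83 ms to render one frame
render_ms_slow = 1000 / display_hz            # ~8.3 ms for a GPU that only manages 120 fps

print(f"GPU busy {busy_fraction:.0%} of the time, idle {1 - busy_fraction:.0%}")
print(f"{render_ms_fast:.2f} ms vs {render_ms_slow:.2f} ms per frame -> "
      f"{render_ms_slow / render_ms_fast:.0f}x lower rendering latency")
```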
> Oh and all these measurements you see tossed around are done with air coolers and stock voltages. Of course more power doesn't help if you're limited by thermals. Specifically, you're limited by hotspots you can't see because they're between the on-die temperature sensors.
This completely misunderstands heat transfer. If the hotspot temp is 75°C even with overclocking, you're not limited by thermals: https://www.youtube.com/watch?v=zc-zwQMV8-s
>4K gsync monitors are typically limited to 120Hz. If my GPU is capable of running a title at 1200fps, it will idle 90% of the time and consume only ~60W regardless of the 600W power limit. So tuning this is pointless. I'd still get 1/10th the rendering latency compared to a GPU that's only capable of 120fps. Same logic applies to power limiting your GPU which will increase latency and kill frame pacing in exchange for a tiny reduction in electricity bills.
This completely misunderstands power consumption and the nonlinear relationship between power and clockspeed. Performing the same work in the same time in short bursts of high clocks and high voltage uses more power than constant low clocks and voltage.
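To sketch that relationship (a rough illustration only: it uses the textbook CMOS dynamic-power relation P ≈ C·V²·f, and the capacitance, voltages, and clocks below are made-up numbers, not measured figures):

```python
# Why "race at high clocks/voltage, then idle" costs more energy than steady low clocks.
# Dynamic power ~ C * V^2 * f; the constants below are purely illustrative.

C = 1.0e-9   # effective switched capacitance (arbitrary illustrative value)

def energy_for_work(freq_ghz: float, volts: float, cycles: float) -> float:
    """Energy (joules) to execute `cycles` clock cycles at a given frequency and voltage."""
    power = C * volts**2 * freq_ghz * 1e9     # dynamic power in watts
    seconds = cycles / (freq_ghz * 1e9)       # time to finish the work
    return power * seconds                    # energy = power * time

work = 1e12  # the same fixed amount of work (cycles) in both cases

burst = energy_for_work(5.5, 1.35, work)   # high clocks need high voltage (assumed pair)
steady = energy_for_work(3.5, 0.95, work)  # lower clocks get by on lower voltage (assumed pair)

print(f"burst  : {burst:.0f} J")
print(f"steady : {steady:.0f} J  (burst uses {burst / steady:.2f}x more energy for the same work)")
```

Note that the frequency terms cancel: for a fixed amount of work, energy scales with V², and since higher clocks demand higher voltage, the burst costs roughly (1.35/0.95)² ≈ 2x the energy in this toy example.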
It matters for SFF builds where people often want the highest performance possible but are severely limited by cooling. Efficiency becomes extremely important.
Traditionally nobody, but the massive power consumption of the RTX 4090 combined with a power-hungry CPU might make people take notice. When your desktop needs a dedicated circuit you have a problem.
My microwave uses 1200 or 1500 watts and does not need a dedicated circuit. Sure, I don’t run it continuously for an hour, but that has nothing to do with whether or not it needs a dedicated circuit.
A 16A circuit allows for about 3.8kW at 240V.
With these ludicrous power requirements, you may in fact need to rethink how you use your power. E.g. if the PC is on the same circuit as a non-heat-pump dryer, you'd have to make a choice between gaming and drying clothes.
Having said that, I saw a reference to a 4090 offering 95% of the perf at 60% of the power usage if undervolted, so that becomes an attractive option now.
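For the circuit-budget point a few comments up, here is a rough sanity check. The 16A at 240V breaker rating is from that comment; the PC and dryer wattages are my own assumptions for illustration:

```python
# Rough circuit-budget check: does a high-end gaming PC plus a dryer fit on one breaker?
breaker_amps = 16
volts = 240
circuit_watts = breaker_amps * volts   # 3840 W available on the circuit

loads_watts = {
    "gaming PC (600 W GPU plus CPU, monitors, etc.)": 1000,  # assumed worst case
    "non-heat-pump dryer": 2400,                             # assumed typical rating
}

total = sum(loads_watts.values())
print(f"circuit capacity: {circuit_watts} W")
print(f"combined load:    {total} W")
print(f"margin:           {circuit_watts - total} W")
```

With those assumed numbers it squeaks by with a few hundred watts of margin, but a beefier heating element or any extra load on the same circuit would tip it over.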
I absolutely love my 5800X3D's insanely low power usage (insane = performance per watt for gaming in simulator titles where it runs circles around Intel).
Of the last-gen stuff, the 5800X3D is arguably among the best bang for the buck, including for gaming builds. The applications where Intel's 12900K/12700K hold a significant advantage over the chip with its 96MB of L3 cache typically aren't applications desperately in need of CPU power. IME it's poorly-optimized or hard-to-optimize software with poor cache locality that most demands speed, and it's in those cases that the X3D delivers.
I was looking for a laptop for work and decent gaming, and I went with an Intel NUC. Everything pointed to Ryzen being far better in most aspects - power, weight, height, and importantly battery life - however, for the price point I was looking at, the faux-no-brand Intel with the 3060 was better, even if the chips do seem to lag behind for similar generations.
AMD also had a lot of those major bugs, but it wasn't reported as much, since the original papers all targeted Intel CPUs (they're the "standard"), and then a couple of months later someone would do it for AMD, but by then it was old news.
All of the attacks are relevant in any environment where you execute code that you would rather not have access to any other information on your system (i.e. where you actually care about privileges).
For example, there have been demos of Spectre in-browser, but it also applies to multi-user environments or simply apps that you didn't grant admin privs...
If you treat your computer security in a DOS/Win95 fashion, then none of them are relevant.
AMD CPUs were vulnerable to fewer of the speculative execution bugs than Intel's, and the performance impact of the mitigations on AMD CPUs was far less than on Intel.
AWS developed two generations of an in-house ARM chip (Graviton), which I think is cheaper than an x86 instance of equivalent power. Cloudflare has also begun switching to ARM, for 57% more performance per watt:
The cloud providers have been buying Intel for years and can't afford to just throw those chips away. AMD is capturing a much larger proportion of data center CPU sales, but you should still expect to see them buying Intel as well, since AMD can't produce enough chips to satisfy all demand. Also, qualifying new platforms is very expensive, so cloud providers will continue to buy already-qualified platforms for as long as the perf/W makes sense.
AWS certainly offers AMD boxes as well (M5a, M6a, T3a, etc.), but nothing like the broad range of Intels, and not even the same kind of offerings as Graviton.
AMD chips had the same branch prediction side-channel vulnerabilities that Intel did. At this point their offerings are pretty similar, with Intel being a bit cheaper but using more electricity.
With the new releases, it seems that the 5800X3D takes the crown for single-threaded tasks and video games, even in Intel's own 13xxx benchmarks. The 7950X is closely trailed by the 5800X3D.
But to be fair, AMD is/was growing while Intel is stagnant and losing market share right now. With massive investments and getting its fab process back on track, I would assume Intel will soon start to grow again (of course, after the end of the recession).
We have to remember that AMD is making money off TSMC's node advantage over Intel, which I assume won't last forever; if TSMC blunders even one node, it could be catastrophic for AMD.
Considering the USA's shift in focus toward the tech war with China, Intel's fabs can only grow faster, or everything can crash.
I am not sure if Intel counts, though. They have had challenges for the last several years during the bull market, which they failed to address; now they are taking advantage of the bear market to cull numbers. If we see someone who did not have these very visible weaknesses cutting numbers, that would be a sign, not Intel. Though I agree Intel is huge, and them laying off people is still a big deal.
I’ve seen a whole lot of stories like this at larger and larger companies. There was one like it about Facebook yesterday. They’re shorting their own stock [predictions] and adding employment insecurity to reinforce that. Big tech companies would love to benefit from a less competitive hiring landscape and the “recession?” narrative fits that well.
Too bad, and so sad, that they'll be just as eager for talent when talent inevitably moves on and finds they're happier where they landed than when they were being used as stock-price pawns.
Anecdata here but: literally every single dev I talk to on a regular basis has told me their company has laid off a sizable amount of staff in the last 3 months. Every single one.
The companies range from startups to enterprise, at practically all stages of funding, but they each told me their funding never materialized or investors gave management ultimatums.
A few were in crypto, two are in fintech, a few more in various b2b tech companies. But they all gave almost identical explanations: management sees a rough economy ahead and tightened their belts accordingly.
I’m not trying to be doom and gloom, I still have a job, but even my partner, who is also a software engineer, just survived a round of lay-offs at their fintech that happened yesterday. They laid off 25% of their company across the board.
That seemed to be the feel in 2008 as well. Lots of companies that were fine chose to have layoffs just because they "saw" a rough economy ahead. Self-fulfilling prophecy, or just trimming the fat…?
Why would “supply chain issues” reduce spending? If a company is having supply chain issues, that by definition means there is more demand than they can supply, meaning people are willing to spend money. What are companies going to do when they get supply if they lay off workers?
Besides, if you work for any tech company, you should be able to throw your resume in the air and get another job. Even if you are a blue-collar worker, there are plenty of companies looking. I have a friend who works in finance for a major manufacturer. He said the company had to be a lot more lax about firing factory workers because they already had a shortage.
And by definition none of them had a profitable business model. The investors were hoping to pawn off their investments to an acquirer or the public markets - ie “the greater fool”. The investors knew they couldn’t find a bigger fool and are left holding the bag.
They were “tightening their belts”; a profitable company doesn't have to worry about that.
Did you read what I wrote at all? Why are you being so combative, lol. My partner’s company is not seeking funding and is profitable. Still had layoffs. Another friend’s company is profitable but did happen to be seeking investment. Still saw lay offs. Why do you insist on looking at this through the lens of funding and refuse to acknowledge that companies see a rough economy ahead, regardless of their funding status?
Even the profitable ones are under pressure to show YoY and QoQ growth. To make that happen, belt-tightening has to happen. So the takeaway is that no one is growing, or expected to grow, in the coming year.
The startup I worked for before my current company sold services to health care systems. It was hit hard by Covid. Hospitals were losing money and not paying for new products or even old ones.
My CTO said specifically “we need everyone we have and we aren’t going to be successful by laying off people. We have a vision and we need you all to help execute it”. They were bought out less than 9 months later for 10x revenue. I had moved on to $BigTech by then after leading their “cloud modernization” efforts.
The profitable tech companies are using it as an excuse to get rid of dead weight. No one is going to come after Cook, Jassy or Nadella for short term revenue misses.
Facebook and Google still have founders who own more than 50% of the voting shares. No one can come after them.
You can say a lot about the big 5. But none of them can be accused of short sightedness.
There is a legendary corporate turnaround after a huge layoff in Intel's history.
I'm not sure I can take this as a harbinger of things to come, and I'm saying this while generally being pessimistic about the economy. I think Intel is a deeply troubled business, and some deep layoffs were a long time coming because Intel was spread really thin across so many businesses.
What positive side of a cycle will cause a resurgence in personal computers in the age of mobile? Servers are increasingly using custom, purpose-built chips. Intel would be better off being a fab.