
I heard from a friend who works for Intel that he doesn't know why he was hired in the first place; his PhD was in a completely different domain, the project's objectives were far removed from his skills, and he told me his entire team was like that. It seems like there's a lot of bloat at this company, and it makes sense that they feel the way forward is layoffs.




Second-hand knowledge: I have a cousin at Intel Oregon. Intel mass-hires PhDs in physics, chemistry, biology, etc., reasoning that a PhD is enough to learn whatever a process engineer needs. Assume 30-40 people hired per cohort and 12 or so cohorts a year. Another curious thing I noticed was that Intel had online multiple-answer tests its engineers had to pass weekly, presumably to track whether they were actually learning on the job. Those tests, though, seemed like rote memorization and easy to cheat.

Overall, my 5,000-ft view was that the culture was very different from FAANG or a Bay Area tech company. If the Bay Area approach is high ownership and high accountability, Intel was much more process-driven and low ownership. They even tracked hours worked for engineers in Oregon.


Yeah, my old (and best) boss at a FAANG had been at Intel. She left because they basically wouldn't promote her as she didn't have a PhD.

I had a similar realization in biotech. I saw a lot of engineering master's grads hired to break down cardboard boxes and document chamber temperature logs. The idea was that they could read and write, and perhaps fit into yet-undetermined roles later.

I think it speaks to common challenges when hiring managers are disconnected from the work, degrees and resumes are worthless, and turnover is difficult.

In many companies, team leads don't have a role in the hiring or firing of the employees working for them.


I knew a guy who got a job with Intel's wearable division. Everything was chaotic, everyone was toxic, and Intel one day lost interest and fired the whole division.

The sad thing is they acquired the Basis smartwatch and destroyed it, leaving only Garmin as a developer of dedicated activity trackers. I considered getting a Basis and, in hindsight, am glad I didn't.


I hear it's division-dependent, but just about every time someone complained about things being toxic at Microsoft, they'd be told that at least it isn't Intel.

I've been thinking of buying Pixelmator Pro recently for photo editing. It seems like a lovely photo editing application. And they have a lifetime license.

But Apple bought the company recently. I worry that whatever made the product great will go away post acquisition. Whether or not Apple keeps working on it at the same level of quality is anyone's guess. Or maybe they'll integrate the best features into their free Photos app and ditch the rest. Or something else entirely.

I can't think of any examples where an acquisition made a product better, but I can think of dozens where the product was killed immediately or suffered a long, slow death.


Maybe some of these will work for you: Minecraft, PayPal, GitHub, Instagram, WhatsApp, LinkedIn, Android, Waze.

With Apple it's harder for me to know. How do former Dark Sky users feel about the Weather app? I think it has all the features? How about Shazam, which I never used before it became an iOS feature? TestFlight retained its identity. Beats by Dre headsets did too, though Beats Music I think became Apple Music in a way.


Some of these are hard comparisons, specifically because what you're describing is really the initial exit. Acquisitions are often funded not so much by the valuation as by an actual plan to make money.

Take Minecraft, for example: the existing established customer base with perpetual licenses was not the justification for buying it. The value Microsoft saw was in things like DLC content and cosmetics, and subscription revenue through server hosting.

From what I have observed, one could say that everything Apple acquires is an acqui-hire first: there's a product they want to ship, and they're trying to find a delivery-focused team to help them with that.

If the company already built a similar product and got it to market, that's great! It means they're getting a team that has delivered successfully and maybe even has a significant head start toward Apple's MVP. It also likely means the team will have a fair bit of autonomy (and will often retain its brand).

DarkSky's product in that light wasn't their app. It was their work on localized weather models and their weather API.

Apple's Weather App doesn't look like DarkSky, but AFAICT you could rebuild the DarkSky app on the WeatherKit REST API (including features like historical weather, and supporting alternative platforms like Android).
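
As a rough sketch of what that could look like in Python (the endpoint path, dataSets names, and response keys follow Apple's published WeatherKit REST API as best I recall; the signed developer token is hand-waved and the placeholder names are mine, so treat this as an assumption-laden illustration, not gospel):

    # Sketch: pulling Dark-Sky-style data from the WeatherKit REST API.
    # Assumes you already hold an ES256-signed developer JWT (generation omitted).
    import requests

    WEATHERKIT_TOKEN = "eyJ..."  # hypothetical placeholder for your signed token

    def fetch_weather(lat: float, lon: float, lang: str = "en") -> dict:
        url = f"https://weatherkit.apple.com/api/v1/weather/{lang}/{lat}/{lon}"
        resp = requests.get(
            url,
            params={"dataSets": "currentWeather,forecastHourly,forecastDaily"},
            headers={"Authorization": f"Bearer {WEATHERKIT_TOKEN}"},
            timeout=10,
        )
        resp.raise_for_status()
        return resp.json()

    if __name__ == "__main__":
        data = fetch_weather(51.5, -0.12)  # London
        print(data["currentWeather"]["temperature"])  # Celsius, per the docs IIRC

Same data, no Apple UI on top, so an Android or web client could present it however it liked (Dark Sky's old hyper-local timeline included).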


With the possible exception of Android (which tbh I have never used) and possibly Minecraft, it's hard to make an argument that any of those acquisitions improved the products. At best they're kept in stasis.

Facebook replaced Instagram with a completely different app, which they then made quite useful to a lot of people. (Sure, I might object to all the strings attached, but there are dimensions along which Instagram was improved – in the same way you can improve a house by bulldozing it and building a dozen single-room flats in its place.)

GitHub has gotten a lot more unstable (GH Actions outages every couple of weeks or so), but it definitely has not been in stasis: the pace of change has been a whole lot higher since the acquisition (and I'd say generally for the better).

It is also hard to argue that they were killed off immediately or slowly died.

You'll find many (including myself) who find Microsoft's purchase of Minecraft to be a huge loss for the game. I'll admit that the acquisition wasn't the calamity people were fearing, but overall it's still been a net negative or stagnation at best.

For starters, they split the community between Bedrock and Java. And while a Minecraft rewrite in C++ was a good idea, it seems they mostly made the split to justify adding heavy monetization for skins and world maps to Bedrock. (Maybe they feared backlash if they did that to the OG Java version?) This monetization seems to have killed people's appetite for hobby-project mods and maps.

Likewise, it's clear that the intended demographic of their marketing has become much younger. From the mob votes and the type of things that go into updates, it seems that what's added is far less deep, and that updates are now more of a social media "Llamas in Minecraft, look how goofy they are!" stunt.

I recently started a 1.7.10 modded world, and was surprised to see just how much stuff was already there. The only newer vanilla things that I found I missed were bees and slime blocks.

Maybe it's nostalgia, but this version feels nicer, like it's cohesive, and respects me as a player more.


Apple is different from Microsoft or Google in that they don’t have a monopoly to subsidize mistakes. They bought things like Logic and Final Cut because they realize that not having high-quality Mac software means that Adobe gets to choose whether professionals in those fields keep buying Macs. I would expect Pixelmator will continue to be developed to compete with Photoshop.

YouTube? Twitch? I don't use either but people sure flock to them.

There are many acquisitions that lead to better products.


I would argue neither is better for it, as a user.

They're more lucrative for creators/streamers and have further reach but the platform experience is noticeably worse.


YouTube was about to get destroyed by an avalanche of lawsuits from the TV/movie industry. Without Google's army of lawyers, they would not have lasted.

True. Android might be another example.

But there are also hundreds of examples of the opposite happening: successful products being bought by a big company and then killed post-acquisition.

We probably won't know which camp Pixelmator will fall into for a few years yet.


You would be arguing wrongly. YouTube today is the largest trove of knowledge accessible to the largest number of people in the world. It also has a lot of false information, but overall it is one of the greatest causes of change in the world.

Would YouTube be the behemoth it is without the plethora of content (some of it high quality)? And if being more lucrative for creators is what got that content, I would argue the platform as a whole is better. You could have the most whizz-bang video platform, but without good content, what good is that?

With the money came the greed, the over-polished mass-market content, and an ecosystem of creators that is now driven mostly by engagement.

Not to mention all the topics that have been soft-banned because one algorithm flags those videos as not monetizable, and the next algorithm decides that only showing or recommending videos that can carry ads results in the most ad revenue.

I don't think YouTube is clearly better or worse than it was before the acquisition, and maybe an independent YouTube would have walked the same path. It is simply a very different platform that got ship-of-Theseus'd.


Eh, it feels like you're looking for the worst.

Follow some channels like Practical Engineering or Veritasium ... both good quality, information dense. Yes, decent production values, but that's not a bad thing at all in my book.


YouTube was acquired in 2006. I do think things like video quality and length have improved since then, although you can argue the ads everywhere are bad UX.

Youtube is an absolutely miserable product compared to where it's been in the past, are you joking?

I immediately thought of Keyhole, Inc, which became Google Earth.

I have Pixelmator Pro & Photomator. They haven't meaningfully changed since Apple's acquisition, and they don't rely on any subscription or online features that could be ruined after the fact. If a future update fucks things over, you don't have to update. Everything runs locally.

Are they still being worked on? Has either product received updates since the acquisition?

I'm tossing up between Pixelmator and Affinity Photo.


Pixelmator was updated at the end of June.

The main changes were integration of Apple's AI stuff and improved VoiceOver support. Nothing earth-shattering but it's still active.


I had the Basis. It was fantastic. I still miss it.

I know someone with a PhD in biochemistry who was hired at Intel from a cancer research lab... I'm sure he sold his chemistry background well but I always thought that was an odd hire. Maybe there are just so few qualified PhDs that they'll happily take folks from adjacent fields?

Most of the senior leadership of Amazon in the early days were a bunch of randos from a formal-credential standpoint: a car mechanic leading AWS engineering, a musician running logistics, a chemical engineer optimizing the network, etc.

Hedge funds also hire physicists and mechanical engineers


Your phrasing _drastically_ undersells the actual relevant background and experience there:

James Hamilton, the “mechanic” … with EE & CS degrees and time at IBM and MS. Dave Clark, the “musician” (undergrad) … and an MBA focused on logistics. Jeff Wilke, the “chemist” … who worked on process optimization at Honeywell and supply chains at Andersen.

So sure, might as well say DeSantis is an SDE intern figuring out software deployments, Vosshall is an amateur aircraft EE, or Marc Brooker is some foreign radar engineer.

Signed, some newspaper dude who was an AWS PE doing edge networking and operations.


Chemical engineers are so good at distributed systems that it is almost a trope at this point. It is their specialty. Their entire discipline is optimizing aggregate throughput in decentralized systems with minimal coordination.

It maps nearly 1:1 onto computer science, but chemical engineering as a discipline has more robust design heuristics that don't really have common equivalents in software, even though they are equally applicable. Chemical engineering is extremely allergic to any brittleness in architecture (there, it's a massive liability), whereas software tends to just accept it because “what's the worst that could happen”.


From the tone of your post, I assume that you are a ChemE who works with CompSci folks. If what you say is true, why haven't ChemEs moved into the space and taken over? Software dev pays much better than ChemE.

Almost all of the chemical engineers I know do work in software, mostly for the money. The skillset translates to computer science relatively seamlessly. Chemical engineering is essentially computer science where you swapped atoms for bits, but far more difficult because there are only distributed systems and the background error rate is always noticeably non-zero.

I studied chemical engineering after I was already working in software, so I did it backward.


Did you study chemical engineering knowing its applicability to software engineering?

Your observation is interesting because early ideas in object oriented design were likewise inspired by biological robustness in the face of a non-zero background error rate (see any of Alan Kay's early writings, and his Turing lecture). I wonder if half of a CS degree shouldn't also involve basic chemeng and bioeng.


Because if you need a systems designer / architect you will look for traditional credentials in the field. It’s the same reason that computer scientists cannot break into pharma despite the fact that they would really fit with the data infrastructure & processing challenges they face.

Ultimately it is all about how strict the hiring pipeline is to the credentials vs potential.


My daughter got her degree in ChemE from a top engineering school, was hired out of college by a big software firm, and is a full-time SWE five years in (by choice).

Because they want to do ChemE rather than CompSci more than they care about their pay?

That sounds surprisingly non-random.

Graph theory originated in chemistry, not computer science.

Musicians know harmonics, and indirectly a lot about cycles and waves.

The good car mechanics I know are scary smart.


You may as well say graph theory had its origins in Ancient Rome when they built the road network.

Most trace it back to Euler, when he considered the problem of the Seven Bridges of Königsberg: https://en.wikipedia.org/wiki/Seven_Bridges_of_K%C3%B6nigsbe...


Any good laboratory chemist can be trained to work in semiconductor research. The tools and jargon are largely similar.

In college I got a job offer from Intel without interviewing. I had applied, the hiring manager reached out and said they'd set up a loop, and it never happened. Then some weeks later I got an offer. Super weird.

Also, I was sorta laid off by the current Intel CEO at my last startup!


By your description it sounds like layoffs should be at the management level for incompetence, not for employees.

What would you do with all of the employees who are currently working in jobs or on projects or with skills not relevant to the company?

Look for mutually beneficial ways forward: reassignment to relevant projects, retraining where necessary, and a generous layoff package for those for whom neither works. Realistically, the vast majority of PhD employees are going to be highly motivated and want to work on something useful just as much as you want them to.

Let them make cool stuff the company can sell, increasing revenue and reversing the decline.

How to draw an owl: draw the owl.

They tried that; it didn't work.

Unless you're a sociopath, you let natural attrition run its course. If their skills weren't relevant when you hired them, then it's your fault. If you changed course after you hired them so that they stopped being relevant, then it's your fault. The only just thing to do is find a way to make their work meaningful until they move on.

No, you give them a fat severance and eat the losses. Maybe 6 months + 1 month per year of tenure, something like that. You break even by the end of the fiscal year, you just gave someone a life-changing amount of money, and they don't have the crushing morale problem of "the work I do is pointless"; they also get to collect unemployment in addition to severance.

If you are honest and generous with people, they aren't mad that you made a mistake and let them go. It's companies that try to give 2 weeks + 1 week per year of severance that are making a mistake, not the entire concept of layoffs.

(Without delving into the systemic reasons that layoffs are inevitable of course. If the system was different, they wouldn't have to happen, but we live in this system at the moment.)


A year or two of salary is generally not life-changing.

It doesn't have to be life-changing to be meaningful. I'd happily take 6 months of severance over working ineffectually on a doomed product.

Imagine you're a middle-tier functionary making $70k a year. Are you telling me $90k in cash after tax, plus unemployment for ~6 months, isn't life-changing? That's a down payment on a house, or your student loans paid off, or $4,000 a year in permanent income for the rest of your life.

If you get severance, generally you don't also get unemployment.

> If you changed course after you hired them so that they stopped being relevant, then it's your fault.

Nobody can predict market conditions or technological advances.

If you don't change course (mission, people), the company will likely fail, and then everyone is out of a job and shareholders, pensioners, and 401k-holding laypeople lose money.

I do think that leadership is not held accountable enough for their mistakes and failures.


Intel's situation is much more the result of bad management than of the output of their current workers. For all intents and purposes, they're effectively doing what they were supposed to do when hired. So the logical conclusion is that Intel workers are the ones who should have the power to fire the entire management and put someone in place to fix the issue, not the other way around.

The output of workers is always a leadership problem, imho.

I disagree that the workers are the ones who should have the power to fire management unless they are shareholders. I think this should (and it does) fall upon the board and the shareholders. If the workers are shareholders, all the better.

Regardless, it's clear the current system needs work.


What a sad waste of talent in that case. A waste that could be mitigated by them finding a more productive way to help society than sticking to a pointless job.

Agree. We lean hard into sunk cost fallacy when it comes to job training.

“If your name is Farmer you're a farmer” mentality, but as a self-selected euphemism. “I trained as a software engineer and that's what I am for 50 years! Dag gubmint trynna terk my herb!”

Service economy role play is the root of brain dead job life we’re all suffering through.


I would argue that should be done at a lot of companies.

What purpose would it serve? Remember, the purpose of a company is not to make good products.

Managers are also employees. Nobody's arguing they should be spared and I'm not sure that you can argue top management at Intel hasn't been let go over the years.

Also, laying off incompetent managers alone won't solve the problem of having hired the wrong people.


I think management has historically argued as such.

Who said they have the "wrong" people? They are doing exactly what they were hired to do.

Wrong for the needs of the company. This isn't an assessment of their worth as an individual or as a professional.

Do they have any clear direction for the future of the company? As it seems they don't, the idea that these workers are the wrong people is completely unfounded.

The whole thread started because of this comment

> I heard from a friend who works for Intel that he doesn't know why he was hired in the first place; his PhD was in a completely different domain, the project's objectives were far removed from his skills, and he told me his entire team was like that. It seems like there's a lot of bloat at this company, and it makes sense that they feel the way forward is layoffs.


But this is just his point of view. Intel was hiring people and training them to do the job it wanted. And if they remained employed, that means they were doing what was expected.

Intel has 108,000 employees.

In comparison:

Nvidia 36,000

AMD 28,000

Qualcomm 49,000

Texas Instruments 34,000

Broadcom 37,000

It is obvious that Intel is ridiculously overstaffed.


TSMC has 83,000 employees. If Intel does everything TSMC & NV do, then they should have something like 83,000 + 36,000 ≈ 120,000 employees?

The scale isn't really comparable. TSMC manufactures 5x more wafers than Intel, and the disparity is getting exponentially worse every year (see the chart at https://thecuberesearch.com/247-special-breaking-analysis-th...). In fact 30% of Intel's own production is outsourced to TSMC.

Sure but that's different than your original point.

The scale also isn’t linear.

The world isn't planning for the 2027 takeover.

Of what?

2027 is the date the CCP has announced for when it'll be militarily ready to invade Taiwan. Whether they actually do so or not is an open question.

Taiwan.

There are other reasons to think it, but 5x wafers on 8x staff seems within the realm of comparable.

My reading of the numbers is that it should be 0.8x staff. I think you're off by an order of magnitude.

I sure am. Whoops.

> If Intel does everything TSMC & NV do, then they should have something like 83000+36000~120000 employees?

TSMC is a fab, not a chip designer. And NV makes GPUs and small-scale SoCs, like the ones in the Nintendo Switch and in automotive (IIRC the Tegra SoC that powered the Switch 1 was literally an automotive chip that they repurposed).

That's quite different from what Intel makes: CPUs that power a lot of the world's compute capacity for laptops, PCs, and servers, wireless chips (Bluetooth + WiFi), their own GPU line...


> IIRC the Tegra SoC that powered the Switch 1 literally was an automotive chip that they repurposed

Tegra was designed for mobile devices like smartphones. The automotive part came later and isn’t particularly relevant. Intel also makes low power SoCs for mobile devices, e.g. Atom.


> Intel also makes low power SoCs for mobile devices, e.g. Atom.

The last time I heard that name was well over a decade ago, for crappy "netbook" devices. Just looked it up: the last Atom CPU was released in 2013, per Wikipedia [1]. They might still make them for embedded computing purposes with very long life cycles, but I have no idea at what volume.

[1] https://en.wikipedia.org/wiki/Intel_Atom


They still make them, they're just not called Atom anymore.

Intel runs their own fabs. NVidia, AMD, Qualcomm outsource chip manufacturing.

Their fabs have been falling well behind TSMC for nearly a decade, and in fact their higher-end stuff is mostly produced by TSMC.

Intel has about as many salaried employees (blue badge) as contract employees (green badge). Only the blue badges are included in the counts.

None of those companies have chip manufacturing.

The only true comparison is TSMC, but it only does chip manufacturing, not chip design/development.

So Nvidia + TSMC would probably be a fair comparison.


TI does

For the GP of this comment, TI mixes running their own fabs with some fabless product lines. A lot of the companies that make analog ICs still own their fabs and TI is one of them.

I know it's apples to oranges but ASML has 44,000 employees, for reference.

With this brilliant logic, you should switch to being a McKinsey consultant and start driving companies into the ground.

Why would someone with a PhD apply for the position if that was the case? Were they hired and then re-tasked once employed?

Because Intel pays well (mid six figures + bonus) and a PhD doesn't even pay minimum wage in most places. They were expressly hired without an overarching goal.

So in other words, that PhD was well worth the effort.

Back in 2012-2014, Intel hired a bunch of “futurists” who were liberal arts majors from the northeastern US. Needless to say, they spewed a bunch of nonsense and were fired years later, but I knew a few, and they were puzzled about why they were hired to begin with.

I remember there being a bunch of Anthropologists that were hired before that, under Genevieve Bell. It wasn't clear to me why they were hired.

It felt to me like the people at the top were clueless, and so were hoping these hires would help give them an idea which direction to steer the ship.


Xerox hired an anthropologist once, Julian E. Orr, and it resulted in a really good book called “Talking about Machines: An Ethnography of a Modern Job”.

Of course, mostly what he found was how out of touch the executives at Xerox were with what their employees were actually doing in practice. The executives thought of the technicians who repaired copiers almost as monkeys who were just supposed to follow a script prepared by the engineers. Meanwhile, the technicians thought of themselves as engineers who needed to understand the machines in order to be successful, so they frequently spent hours reverse engineering the machines and the documentation to work out the underlying principles on which the machines worked. The most successful technicians had both soft skills for dealing with customers and selling upgrades and supplies, as well as engineering skills for diagnosing broken hardware and actually getting it fixed correctly. It seems that none of the sales, engineering, or executive staff at Xerox liked hearing about any of it.


> I remember there being a bunch of Anthropologists that were hired before that, under Genevieve Bell. It wasn't clear to me why they were hired.

Yes, I remember contracting at Intel in 2006, and the anthropologists were at one end of the building we were in. Their area was a lot different from the engineering areas: lots of art, sitting around in circles, etc. I remember asking what was up over there: "Those are the anthropologists".


Bell went on to become Vice-Chancellor at the Australian National University. She has become a very controversial figure there.

https://www.nteu.au/News_Articles/Media_Releases/Staff_lose_...


Having a bunch of people who might at some point generate a few valuable ideas doesn't sound like a bad strategy. Intel is (was?) huge, and their market penetration is enormous. I think Bell Labs did something similar back in the day -- maybe not with the liberal arts, but they certainly left a lot of room for serendipity.

What possible ideas could someone with no expertise in advanced math/physics/chemistry/or other relevant hard science have for the products and services Intel sells?

Who knows. But homogeneous thinking is bad. A bit of variation can spark innovation.

Why not throw a janitor, cook, and bus driver in the mix too then?

Almost every field requires baseline knowledge of certain facts to make one's ideas useful. And we're talking about the most technologically advanced process in the world, using cutting-edge physics and materials science. The baseline here is basically as high as you can get.


I applied to Intel once a long time ago when I was just getting out of school, but when they replied asking for my resume in "Word format" I stopped pursuing it.

I didn't use Word to create my resume and if they can't deal with a PDF that was their problem.


I often look at the PDF metadata when reading CVs. It tells a story all its own. And LaTeX automatically signals virtue to me...
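
For the curious, here's a minimal sketch of the kind of peek I mean, using the pypdf library (the filename is a placeholder and the example Creator/Producer strings are just typical values, not anything official):

    # Minimal sketch: inspect a CV's PDF metadata to see which tool produced it.
    # Requires: pip install pypdf
    from pypdf import PdfReader

    reader = PdfReader("cv.pdf")   # hypothetical filename
    meta = reader.metadata         # may be None if the file has no info dictionary

    if meta:
        print("Author:  ", meta.author)
        print("Creator: ", meta.creator)    # e.g. "LaTeX with hyperref" vs "Microsoft Word"
        print("Producer:", meta.producer)   # e.g. "pdfTeX" vs "Acrobat Distiller"
    else:
        print("No metadata found")

A CV exported from Word, Google Docs, or pdflatex tends to leave a recognizably different fingerprint in those fields.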

That's not at all an unusual request though. Plenty of recruiters want Word. Probably so they can make changes behind your back or some such nonsense. Or, less cynically, so they can more easily copy/paste stuff into their HRM tool.

I've never had a competent company ask me for that. The only ones who did were "old fart" companies. New companies, especially ones with good tech, have always taken my PDF.

> Probably so they can make changes behind your back

Nope, I don't consent to that.

> Or, less cynically, so they can more easily copy/paste stuff into their HRM tool

Their HRM tool should support PDFs if they are competent. They should also be able to read my resume with their own eyes. If not I consider the company not a good fit for me.



