The way I understand it (and my understanding is certainly poor, so I welcome well-supported pushback on it) is that few, if any, components in the food that we in developed countries eat today are actively harmful in themselves (with the caveat outlined below)
The main issue is overconsumption leading to overweight and obesity. Food that’s high in refined sugars and/or saturated fats tends to contribute to this, because it’s palatable and calorie-dense
So in that sense, yes - I believe that as long as your diet is varied enough that you get sufficient intake of all, or at least most, of the essential nutrients, and you don’t eat too much (i.e. in moderation), the ratio of macronutrients doesn’t make a big difference to your health outcome
The crux is that moderation is hard when the food is jam-packed with calories, and it’s so delicious you just want to keep stuffing your face
By volume, most of the food in modern western grocery stores is unnaturally sugary or otherwise calorie-dense.
You have to restrict yourself to produce and a few scant other options if you want to come away with balanced nutrition.
They even advertise cereals as "part of a healthy breakfast", which is a lie under any circumstances, because it's never the healthy part if you eat it long term. (Yes, it could keep you from starving to death in a famine; that still doesn't make it 'healthy'.) Imagine if they could only say "it will keep you from starving, and may significantly contribute to diabetes"
So the only two options you can imagine entail his detractors being irrational and emotional? You cannot comprehend that anyone could have any valid complaint about him and his behavior at all?
Musk has accomplished some remarkable things, by having grand visions, ruthlessly executing on them, and being willing to repeatedly take on a massive amount of risk. If that had been all, I don’t think many people would be decrying him like this. It’s still easy to justify admiring those bits, if you’re so inclined
But he has also done a lot of things that make him unlikable and are harder to justify. He happily whips up massive amounts of hype, regardless of how likely his claims are to actually manifest (which is a large part of why the Tesla stock price is where it’s at). He sucks himself off at every possible turn, and takes dubious personal credit for a lot of things his companies achieve. He is vindictive and has exacted retribution on people with much less power than him (or pouts about it in an undignified fashion when the opponent is too powerful to crush, like the SEC). He has an easily bruised ego and lashes out in a very childish manner (remember the diver he called a pedo on Twitter?). He enters realms he has no expertise in and proclaims to the whole world that he has all the answers. He directly interferes in US and world politics by wielding his wealth and influence, sometimes with disastrous results (it doesn’t help that his political views are usually unsophisticated and immature, especially since he acts so certain of them). Etc., etc.
Basically, he’s a dickhead who thinks he’s the best in the world at everything, and many of his actions are detrimental to both individuals and the world at large. He doesn’t get a free pass for that just because he’s done some impressive things with his businesses
Free speech by foreign governments (or controlled by foreign governments) has never been protected by the US constitution, right?
I do agree that Trump, in both his administrations, has made it starkly clear that the US system of checks and balances is quite impotent against a person or party that doesn’t care to follow the rules, so long as they have enough supporters who also don’t care, or are misled
> Hm, those are all valid, but they're also from the perspective of only caring about external forces. It's as if the work itself is only relevant insofar as we get something out of it.
From the perspective of the organization that pays you to do it, it is? At best there may be another mission that it genuinely cares about, usually there’s only a profit motive (which is also fine). If you want to create software as an end in itself or enjoy the craft without compromise, it usually can’t be in the context of a job or a business
Our job is to produce the best possible outcomes given the constraints we’re faced with, and inform leadership so that they’re aware of the tradeoffs when they make their decisions. Sometimes those decisions are going to be bad, and obviously it’s justified to be frustrated then. Other times they are correct, even when it means compromises on the engineering side. That’s when we have to just suck it up (or go elsewhere)
I still think there’s room for enjoying the work of creating software even under imperfect conditions. Striving for perfection is for hobbies, or the very rare circumstance when it’s justified by the goals of the organization
> From the perspective of the organization that pays you to do it, it is?
I agree. I'm just saying that if you limit yourself to explanations from that perspective, you will miss some of the explanations and be unable to describe real-world phenomena.
> If you want to create software as an end in itself or enjoy the craft without compromise, it usually can’t be in the context of a job or a business
Sure, but that's an extreme position. Maximizing the "ship features that make money" end of the balance doesn't end well in the long term. Maximizing "all code must be perfect" also doesn't end well. There is very much a place for an appropriate amount of craftsmanship, and businesses do better and even make more money in the long run if that isn't choked out. ("Appropriate" varies widely by situation.)
I don’t think anyone is arguing that the basic idea of offshoring was invented after the COVID pandemic… but rather that workplaces and workforces geared towards remote working have made it more feasible and accelerated the process
Is it really hard to understand that the infrastructure for remote work, which I think everyone would agree got a major upgrade during the pandemic, would also make it much easier for companies to outsource software dev work?
Pre-2015 or so, yes, of course there was outsourcing, but it was honestly a major PITA in most cases. Most communication was done in conference calls, with very little group video communication, lots of async chats, etc. Any type of work where you needed a fairly frequent back-and-forth with various team members was rarely outsourced - the type of work that was outsourced was the type more likely to have static requirements.
Now, though, there is basically no difference between working with a colleague who's working from home in the same city and one working from home thousands of miles away (as long as there is good timezone overlap). And that is a change that only happened around the beginning of the pandemic. I've personally seen companies much more willing to outsource because of it, and they're outsourcing a much wider range of work (e.g. brand new dev work that is frequently updated based on usage metrics) than they would have in the past.
>Is it really hard to understand that the infrastructure for remote work, which I think everyone would agree got a major upgrade during the pandemic, would also make it much easier for companies to outsource software dev work?
No, because it wasn't actually upgraded.
Like be honest, the shift into remote work wasn't surrounded by massive tech advances or upgrades. All the tools that existed for remote work had been there, largely in the same fashion and capability, for decades.
So when you say the infrastructure and tooling have improved, you need to be specific, because it's very hard to point to anything around remote work that was fundamentally or notably improved during the pandemic.
It all existed before. It was all used before. If you weren't using it before the pandemic that was by choice, not because it didn't exist.
Everything from our communication software, to developer collaboration tools, to how orgs track and manage their employees all existed well, well before the pandemic.
It was a cultural change -- not a technological one.
> And that is a change that only happened around the beginning of the pandemic
I'm not sure what you're basing this on. Especially as someone that's had to work with peers across the globe for 4+ decades -- the tools have always been there.
> All the tools that existed for remote work had been there, largely in the same fashion and capability, for decades.
Trying to be charitable, but that is just complete nonsense. I managed an offshore team in 2007, and I managed offshore devs in 2022, and the experiences were worlds apart. You're either totally full of shit or you managed teams on some other planet or something.
Detail what changed? Because I can't point to anything. And I've been doing this for 4+ decades. A lot of that remote.
So you're claiming things radically changed, and if such changes caused a huge shift in the workforce it should be pretty easy to give some examples, yeah?
Instead of going "you're full of shit", just answer the question at hand and the one that was given to you multiple times.
The fact that it's so easy to do and you just spent way more effort not doing it is a pretty clear indication you are following a narrative, and not facts.
This is just laughably ridiculous. OK, I'll bite though, even though I can't believe anyone is actually this willfully blind:
1. As the other commenter stated, gigabit internet is now standard, and tons of people throughout the world have bandwidth to their homes that can easily support high quality video conferencing. That just didn't exist 15+ years ago.
2. Group video chats on consumer-grade devices simply didn't exist. Sure, in the mid 00s we had some group video conferencing, but it nearly always required dedicated facilities - people weren't having Zoom meetings with 10 individuals from their laptops.
3. But perhaps most importantly, since the world is now used to doing everything remote, offshore teams are rarely "the odd man out". Right up until around the pandemic, most companies were culturally based around the office, and structures were set up to support in-office collaboration. Now, though, everyone is used to being remote anyway - like my favorite cartoon showing the difference between in-office, remote, and hybrid software devs, except there is no difference, because they're all on Zoom all the time anyway.
I just honestly can't believe that someone who managed remote teams in 2005 thinks it's the same as managing in 2025, and the plethora of advancements in networking and remote conferencing tech easily supports that.
Not OP, but here are a couple of major changes that didn't exist in most companies 20 years ago and that reduced friction from an organizational perspective.
1. Gigabit internet - video call quality is significantly better than it was 10-15 years ago.
2. Zoom/Google Meets - the attention Zoom gave to UX just wasn't matched by any other precursor. Google Meets is a close second.
3. GSuite/O365 - sharing documents across organizations and being able to search for them successfully org wide has gotten much better now due to tools like Glean
4. Slack - most traditional companies didn't adopt Slack until the pandemic. Before that they were primarily leveraging email
There has been a whole decade of evolution in productivity and DevTooling throughout the 2010s, and the COVID WFH period forced most orgs (especially traditional orgs) to adopt a lot of that tooling.
On top of that, a large number of mid-level managers, engineers, PMs, and even VPs are in naturalization limbo, so it's become easy to find people to lead offices back in their home countries while enforcing the same standards as in the US. MS did this in Israel back in the 2010s, and most companies began doing something similar across Eastern Europe and India during the early years of COVID, because most companies legitimately worried it would become a Great Recession level event.
>Gigabit internet - video call quality is significantly better than it was 10-15 years ago.
You don't need gigabit internet for good video quality. We solved this with MPEG-4, specifically H.264.
Most video streaming software still uses H.264 to this day, or some "mimic" of it. That was 15+ years ago, btw.
You only need, like, maybe 100 Mbps. Most definitely less for normal conferences.
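For a rough sense of scale, here's a quick ballpark (my own approximate figures, not numbers from this thread) of typical H.264 conferencing bitrates next to a gigabit link:

```python
# Ballpark H.264 conferencing bitrates (approximate, assumed figures)
# compared against a gigabit connection.
scenarios_mbps = {
    "one 720p30 stream": 1.5,
    "one 1080p30 stream": 3.0,
    "ten 1080p30 streams at once": 30.0,
}
gigabit_mbps = 1000

for name, mbps in scenarios_mbps.items():
    print(f"{name}: ~{mbps:g} Mbps ({mbps / gigabit_mbps:.1%} of a gigabit link)")
```

Even the multi-stream case uses a few percent of a gigabit connection, which is the point: the codec, not the pipe, was the hard part, and that was solved long ago.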
> Zoom/Google Meets - the attention Zoom gave to UX just wasn't matched by any other precursor. Google Meets is a close second.
That's opinion-based, and you're welcome to it. But I'm not sure how it qualifies as any sort of good evidence or reason for companies moving to other countries for their labor force.
>GSuite/O365 - sharing documents across organizations and being able to search for them successfully org wide has gotten much better now due to tools like Glean
Again, you're welcome to your opinions. Sharing documents within organizations across cities, states, and countries has been pretty much a mainstay for 30+ years.
And if you want my honest opinion, the tooling has gotten worse.
> Slack - most traditional companies didn't adopt Slack until the pandemic. Before that they were primarily leveraging email
At this point I'm just having trouble understanding why you think these things are fundamentally game changers.
>and DevTooling throughout the 2010s
Like what?
Again, if those are the best examples you can come up with, they're not really enough, and even worse (in the case of GSuite/O365) they run counter to your point about tooling improvement.
Spotify offers lossless now. But before that the highest quality was 320 kbps Ogg Vorbis, and if you’re able to differentiate between that and lossless even on state of the art equipment under perfect conditions, for the vast majority of songs, you’re an extreme outlier (and in that case, sure - go for the lossless option)
You can also download up to 10,000 songs per device for offline use, which should be enough for a plane ride
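As a rough back-of-envelope (assuming a ~4-minute average track, which is my assumption, not a Spotify figure), a full offline library at that bitrate is also quite manageable in size:

```python
# Storage estimate for 10,000 downloaded songs at 320 kbps.
# The ~4-minute average track length is an assumption for illustration.
kbps = 320
seconds_per_song = 4 * 60
bytes_per_song = kbps * 1000 / 8 * seconds_per_song  # ~9.6 MB per song
total_gb = bytes_per_song * 10_000 / 1e9             # ~96 GB total

print(f"~{bytes_per_song / 1e6:.1f} MB per song, ~{total_gb:.0f} GB for 10,000 songs")
```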
I can see other issues one might have with Spotify, but I don’t really think those are among them. I’ve had it for about 15 years, and I’ve been consistently happy with it for my own use
>if you’re able to differentiate between that and lossless even on state of the art equipment under perfect conditions, for the vast majority of songs, you’re an extreme outlier
Misconception: perfect conditions are what lossy codecs are designed for. You're actually more likely to hear compression artifacts under imperfect conditions that break the assumptions of psychoacoustic masking. Examples include strongly distorted frequency response from poor speakers, accidental comb filtering from room reflections, or even merely listening through a home surround sound system that matrix-decodes a stereo signal into additional channels, thus spatially isolating sounds that were assumed to be masked.
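To make the comb-filtering case concrete, here's a minimal numpy sketch (my own illustration, not from the parent comment): summing a direct signal with a single delayed reflection carves regular notches and peaks into the frequency response, which can expose quantization noise the codec assumed would stay masked:

```python
import numpy as np

# A single reflection arriving `delay_s` seconds after the direct sound acts
# like the filter y[n] = x[n] + a * x[n - d]. Its magnitude response
# |H(f)| = |1 + a * exp(-j*2*pi*f*d/fs)| dips at odd multiples of
# 1/(2*delay_s) Hz -- the classic comb.
fs = 48_000        # sample rate in Hz
delay_s = 0.5e-3   # 0.5 ms reflection (~17 cm of extra path length)
a = 0.8            # reflection level relative to the direct sound
d = round(fs * delay_s)  # delay in samples (24 here)

for f in [250, 500, 1000, 2000, 3000, 4000]:  # notches land at 1 kHz and 3 kHz
    mag = abs(1 + a * np.exp(-2j * np.pi * f * d / fs))
    print(f"{f:5d} Hz: {20 * np.log10(mag):+6.1f} dB")
```

With those (assumed) numbers the response swings by nearly 20 dB between adjacent peaks and notches, exactly the kind of deviation the encoder's masking model never accounted for.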
Not saying you’re wrong in this particular instance, but there are all sorts of areas where we accept that harm will occur at scale (e.g. that 40,000 people per year die in motor-vehicle incidents just in the US). How do we determine what is reasonable to expect?
We require auto manufacturers to include certain safety features in their vehicles, to decrease deaths to a socially acceptable level.
The central ill of centralized web platforms is that the US never mandated customer/content SLAs in regulation, even as their size necessitated that as a social good. (I.e. when they became 'too big for alternatives to be alternatives')
It wouldn't be complicated (a rough sketch in code follows the list):
- If you're a platform (hosting user content) with over X revenue...
- You are required to achieve a minimum SLA for responsiveness
- You are also required to hit minimum correctness / false positive targets
- You are also required to implement and facilitate a third-party arbitration mechanism, by which a certified arbitrator (customer's choice) can process a dispute (also with SLAs for responsiveness)
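Here's roughly how those thresholds might be expressed (a loose sketch; every name and number below is a hypothetical placeholder, not anything from existing or proposed law):

```python
from dataclasses import dataclass

@dataclass
class PlatformSLARule:
    """Hypothetical compliance thresholds for a large user-content platform."""
    revenue_threshold_usd: float     # the "over X revenue" trigger
    max_response_days: int           # responsiveness SLA
    max_false_positive_rate: float   # correctness target
    arbitration_required: bool       # third-party dispute mechanism

def rule_applies(rule: PlatformSLARule, annual_revenue_usd: float) -> bool:
    # The obligations only bind platforms above the revenue trigger.
    return annual_revenue_usd > rule.revenue_threshold_usd

# Placeholder numbers purely for illustration:
rule = PlatformSLARule(
    revenue_threshold_usd=1e9,
    max_response_days=14,
    max_false_positive_rate=0.01,
    arbitration_required=True,
)
print(rule_applies(rule, annual_revenue_usd=5e9))  # True
```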
Google, Meta, Apple, Steam, Amazon, etc. could all be better, more effective platforms if they spent more time and money on resolution.
As-is, they invest what current law requires, and we get the current situation.
> The U.S. government concluded within the past two years that Israel was most likely behind the placement of cellphone surveillance devices that were found near the White House and other sensitive locations around Washington, according to three former senior U.S. officials with knowledge of the matter.
A synecdoche is when you either use a part to represent the whole or, conversely, use the whole to represent a part
I think it’s valid to consider the US government a part of the US. Thus, referring to the US government when saying that the US did something is a synecdoche
> For an extreme example: Harvard's tuition is nominally $60K per year, but for families earning $200K or less it's $0. Many prestigious universities follow similar patterns resulting in a large percentage of students paying no tuition, the middle ground of students paying some fraction, and a small number of students from wealthy families subsidizing everyone else.
As someone from a country (Sweden) that has, to a larger extent, decreased people’s reliance on their families and grown the welfare state instead, it’s weird to think that your parents’ wealth or income should have any impact on things like tuition once you’ve reached the age of majority
Once I finished high school, my parents had nothing to do with my business as far as any institutions were concerned, and vice versa. But uni was tax-funded and free at the point of use. And when they get too old to care for themselves, it will likely be the government supporting them financially, not me (unless I strike it rich first, in which case I suppose they’ll spend their sunset years in style)
There's always this subtext that Europeans solve these problems just by caring more about human values, but the truth usually involves interesting sets of tradeoffs. So in Europe the norm, besides free university, is extensive tracking: in the US, your choice of major is essentially a consumer decision, whereas in many European systems it's fixed at a relatively early age by your performance on things like the Abitur.
I'm not saying the European system is bad. Certainly there's a lot to complain about with a system that asks 18 year olds to make life-defining decisions about both their career and their financial prospects. But the differences do go beyond whether or not you're on the hook for your tuition.
I don't quite understand what you mean by "tracking". Speaking of Germany, since you mentioned the Abitur: yes, your ability to enter certain universities and programs depends on your performance in the Abitur. That is, entering e.g. law or medicine at your chosen university immediately (there is a wait-time multiplier, so you can wait if you don't get in right away) requires a certain grade point average. However, I don't understand how this is different from SAT scores in the US (except for maybe the ability to bypass SAT requirements by being super wealthy, but I'm not sure that would be a good thing). In my experience, kids in the US tend to be obsessed about their university choices much earlier than the ones in Europe.
Also talking about Germany, unless things changed dramatically in the last few years, most natural sciences and engineering degrees don't require a grade point average.
I can mainly speak for Sweden, but basically the answer there actually is "everybody who wants to and meets the minimum requirements (essentially having graduated high school)"
Sweden has higher gross enrolment in tertiary education than the US, and a larger proportion of older students (people who go back later in life to progress their education or change paths)
I’ve heard that in countries like Germany people are often "locked in" by choices they’ve made at an early age. There’s an element of that in Sweden too (more vocationally-focused high school programs may not give you all the courses you need to enter every university program), but that is not too onerous to overcome if you change your mind later (you can do "foundational studies" to bridge the gap, or just sit exams to prove that you’re qualified)
Edit: but it’s maybe also to your point that universities have limited seats, just like everywhere. Maybe your high school grades or score at the equivalent of the SAT aren’t high enough to study mathematics at the top-rated institution even if you’re qualified, because there are too many people ahead of you. But you will be able to go to uni somewhere to study something