What AI is going to wipe out is white collar jobs where people sleepwalk through the working day and carelessly half ass every task. In 2025, we can get LLMs to do that for us. Unfortunately, the kind of executive who thinks AI is a legitimate replacement for actual work does not recognize the difference. I expect to see the more credulous CEOs dynamiting their companies as a result. Whether the rest of us can survive this remains to be seen. The CEOs will be fine, of course.


> What AI is going to wipe out is white collar jobs where people sleepwalk through the working day and carelessly half ass every task.

The only reason this existed in the first place is because measuring performance is extremely difficult, and becomes more difficult the more complex a person's job is.

AI won't fix that. So even if you eliminate 50% of your employees, you won't be eliminating the bottom 50%. At best, and probably what happens on average, your choices are about as good as random choice. So you end up with the same proportion of shitty workers as you had before. At worst, you actively select the poorest workers because you have some shitty metrics, which happens more often than we'd all like to think.


There's a connection to the return to office mandates here: the managers who don't see how anyone can work at home are the ones who've never done anything but yap in the office for a living, so they don't understand how sitting somewhere quiet and just thinking counts as work or delivers value for the company. It's a critical failure to appreciate that different people do different things for the business.


That is a hugely simplistic take that tells me you've never managed people or coordinated work across many people. I mean, I'm more productive individually at home too, and so probably are all the folks on my team. But we don't always work independently from each other, at which point having some days in common is a massive booster.


There is a spectrum: at one extremity is mandatory in-office presence every day; at the other is a fully-remote business. For any given individual, and for any given team, the approach needs to be placed on that spectrum according to what it is that that individual or team does. I'm not arguing in favour of any position on that spectrum; I'm arguing against blanket mandates that don't involve any consideration for what individuals in the business do.


I haven't worked in the US, and I have not yet worked in a company where such employees exist. Some are slower, some are faster or more efficient or productive - but they're all, every one of them, under the pressure of too many tasks assigned to them, and it's always obvious that more personnel is needed but budget (supposedly) precludes it.

So, what you're describing is a mythical situation for me. But - US corporations are fabulously rich, or perhaps I should say highly-valued, and there are lots of investors to throw money at things I guess, so maybe that actually happens.


No, it's the same in the US, too. I don't know what these mythical companies are where people are saying 50% of the workforce does nothing, but I've never seen such a place. Everywhere I've ever worked had way more projects to get done than people available to do them. Everyone was working at capacity.


> What AI is going to wipe out is white collar jobs where people sleepwalk through the working day and carelessly half ass every task.

Note that AI wipes out the jobs, but not the tasks themselves. So if that's true, as a consumer, expect more sleepwalked, half-assed products, just created by AI.


CEOs will be fine until their customers disappear. Are the AIs going to click ads and buy iPhones?


AIs are great at generating bullshit, so if your job involves generating bullshit, you're probably on the chopping block.

I just wish that instead of getting more efficient at generating bullshit, we could just eliminate the bullshit.


> AIs are great at generating bullshit, so if your job involves generating bullshit, you're probably on the chopping block.

That covers the majority of sales, advertising and marketing work. Unfortunately, replacing people with AI there will only make things worse for everyone.


Some of the best applications of LLMs I've seen are for reducing bullshit. My goal for creating AI products is to let us act more like humans and less like oxen. I know it's idealistic, but I need to act with some goal.


Nah - those people have the bandwidth/time to justify their value, in my experience. They are also usually the people managing the productive ones.

It's the people who are constantly working, too busy to be seen, producing output and keeping the lights on - the ones who don't have time for the "games" - that AI is going for. Their jobs are easier to define since they are productive and do "something", so it's easy to market AI products for these use cases. After all, these people are usually not the ones in charge of the purse strings in most organisations, for better or worse.


I think it’s actually going to save those people. They can vibe code themselves just enough output to survive where before they did next to nothing. In relative terms, they’ll get a much much higher productivity boost from AI than the already high-performing Staff engineer.

Management will be thrilled.


Yea


[flagged]


> It's a perversion of the free market

We can, together, overcome such challenges when we accept that "The purpose of a system is what it does".


There's a "purpose of a system", but there's also a purpose which we want that system to serve, and which prompts us to correct the system should it deviate from the goals we set for it.


That is a simplistic idea that I am scared has spread far and wide.

A system is a tool; it does have a use/purpose in the simplistic sense. But how we use the tool is ultimately the crux of the issue, for we can use that hammer to build houses or tear them down, to build concentration camps or simply to injure someone directly.

No, the purpose of a tool/system is generally determined by the guiding philosophy of the user or society. Unfortunately, society has replaced its philosophy (at least in America) with the economic system of capitalism; i.e. capitalism for capitalism's sake.


So you think the free market should serve social ends?


Thanks for saying it out loud. As part of my job I meet a lot of people who think the same way you do, and they aren't willing to say it out loud.

It's about protecting your work, even if an LLM can do it better.

The only way an LLM can devalue your work is if it can do it better than you. And I don't just mean quality, I mean as a function of cost/quality/time.

Anyway, we can be enemies, I don't care - I've been getting rid of roles that aren't useful anymore as much as I can. I do care that it affects them personally, but I do want them to be doing something more useful for us all, whatever that may be.


lol “I do care, but not enough to actually care”


Caring doesn't mean that you stop everything you're doing to address someone's needs. That would be a pretty binary world, and maybe a convenient way to look at motives when you don't want nuance.

Caring about climate change doesn't mean you need to spend your entire life planting trees instead of doing what you're doing.


Consulting companies like the Big 4 where this happens most are bigger/stronger than ever (primarily due to AI related consulting). Try again.


What makes you think productive work is what consulting companies are selling? They're there for laundering accountability. When you bring in consultants to roll out your corporate AI strategy, and it all falls apart in a few years, you can say, "we were following best practices, nobody could have anticipated X," where X is whatever failure mode ultimately tanks the AI strategy.


Do you think that it's possible in principle to have a better or worse corporate AI strategy? I do, and because I do, it seems clear that companies paying top dollar are doing so because they expect a better one. There's no reason to pay KPMG's rates if all you need is a fall guy.

Most criticisms I see of management consulting seem to come from the perspective, which I get the sense you subscribe to, that management strategy is broadly fake so there's no underlying thing for the consultants to do better or worse on. I don't think that's right, but I'm never sure how to bridge the gap. It'd be like someone telling me that software architecture is fake and only code is real.


I'm willing to believe that one can be better or worse at management, and that in principle somebody could coach you on how to get better.

That said, how would we measure if our KPMG engagement worked or not? There's no control group company, so any comparison will have to be statistical or vibes-based. If there is a large enough sample size this can work: I'm sure there is somebody out there who can prove management consulting works for dentist practices in mid-size US cities or whatever, though any well-connected group that discovers this information can probably make more money by just doing a rollup of them. This actually seems to be happening in many industries of this kind. Why consult on how to be a more profitable auto repair business when you can do a leveraged buyout of 30 of them, make them all more profitable, and pocket that insight yourself? I can understand if you're a poorly-connected individual and short on capital, but the big consulting firms are made up entirely of well-connected people who rub elbows with rich people all day.

Fundamentally, there will never be enough data to prove that IBM engaging McKinsey on AI in 2025 will have made any difference in IBM's bottom line. There's only one IBM and only one 2025!


The fall guy market is very sensitive to credentials. "I hired Joey Blows from Juice-My-AI" just doesn't have that CYA shield of approval.


Given that "design patterns" as a concept basically doesn't exist outside of Java and a few other languages no one actually uses, I'm apt to believe that "software architecture is fake and only code is real".


Design patterns (as in commonly re-used designs that solve commonly encountered problems) exist in every language used enough to have commonly encountered problems. Gang-of-Four style named design patterns are mostly a Java thing, and repeatedly lead to the terrible outcome of (hopefully junior) developers trying to find a problem on which to use the design pattern they just learned about.
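
To make that concrete, here's a minimal, made-up sketch of what the GoF "Strategy" pattern tends to look like in a language with first-class functions - the re-used design is there, it just doesn't need the named class hierarchy:

    # Hypothetical example: a "Strategy" in Python is usually just a
    # function passed as an argument, not an interface plus subclasses.

    def total(prices, discount):          # 'discount' is the strategy
        return sum(discount(p) for p in prices)

    def no_discount(p):
        return p

    def half_off(p):
        return p / 2

    print(total([10, 20, 30], no_discount))  # 60
    print(total([10, 20, 30], half_off))     # 30.0

Same design the GoF book names; it's just invisible because the language absorbs it.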


You hire consultants so you can cut staff and quality, but the CEOs were already going to do that.


Consulting companies don't sell productive advice. They sell management insurance.


I think this is the kind of logic you wind up with when you start with the assumption that the Big 4 tell the truth about absolutely everything all the time.



