
To give a brief example of this -- a colleague recently asked why an exec had listened to my argument but not theirs, despite us "saying the same thing". I explained that my argument contained actual impacts: actual delays, actual costs, an actual timeline for when the impact would occur -- rather than a nebulous "there will be problems".

Everyone comes to execs with hypothetical problems that sound like people dressing up minor issues -- unless you give specific details, justifications, etc., execs aren't going to parse them properly.

This would be one case where a person asking an LLM for help is not even aware of the information they lack about the person they're trying to talk to.

We could define expertise this way: the knowledge/skill you need to formulate problems (and questions) from a vague or unknown starting point.

Under that definition, it becomes clear why LLMs "in the large" pose problems.



I don't know. Predicting delays, costs and timelines is notoriously hard unless it's something you've done the exact same way many times already. In physical work, like installing external insulation on a building, a contractor can fairly easily predict the time required: they've done similar buildings over the past several years, so it's just multiplying an area by a time average, and they know the lead time for materials by checking the shipping estimate on the website they order from.

Developing software is very different, and many nontechnical execs still refuse to understand that, so clever engineers learn to make up numbers, because that makes them look good.

Realistically, you simply came across as more competent, and the exec compressed all that talk about the details into "this guy is quite serious about not recommending going this way -- whatever their true reason and gut feel, it's probably okay to go their way; they're a good asset to the company, and I trust that someone who can talk like this can steer things to success". The other guy came across as "this guy seems to be actively hiding his true reasons, and is kind of vague and unconfident -- perhaps lazy, or a general Debbie Downer. I see no reason to defer to him."


I think there's an element of that -- I also said it's about trust and credibility. However, in this case it was partly about helping the exec see the decision and be aware that a decision needed to be made -- basically scaffolding the decision-making process for the exec.

It's kinda annoying for decision-makers to be presented with what sounds like venting. This is something I've done before, in much worse ways actually -- even venting in a first-introduction handshake meeting. But I've learned how to boil that down into decision-making.

I still find it very annoying how people are generally unwilling to explore your thinking out loud with you, and instead want to close it down to "what's the impact?" "what's the decision?" -- so I sympathise a lot with people who can't do this well.

I often need to air unformulated concerns, and it's a PITA to have people say "well, there's no impact to that", etc. Yeah -- that isn't how experts think. Experts need to figure out how to even formulate mental models of all possible impacts, not just the ones you care about.

This is a common source of frustration between people whose job is to build (mental, technical, ...) models and people whose job is to manage social systems.


I think nontechnical execs have a mental model of technical expertise where there's some big rule-book lookup table that you learned in college and that allows you to make precise, quantified, authoritative statements about things.

But of course the buck has to stop somewhere. By being definitive, you as the expert also give ammo to the exec. Maybe they already wanted to go that certain way, and now they can point to you and your mumbo jumbo as the solid reasoning. Kind of like how consultants are used.


Ooh, there's the first positive thing to come out of this whole LLM business. Can we replace overpaid consultants who contribute nothing with sweet whispers from an executive chatbot?


In light of this, it's interesting that a lot of arguments for LLMs discuss them largely in terms of what they "can" do, not what they're actually used for.



