
> LLMs do not think

Yep.

More seriously, you described a great example of one of the challenges we haven't addressed: LLM output masquerades as a thoughtful work product and wastes people's time (or worse: tanks a project, hurts people, etc.).

Now my job reviewing work is even harder because bad work has fewer warning signs to pick up on. Ugh.

I hope your workplace developed a policy around LLM use that addresses the incident you described. Unfortunately, I think most places just ignore stuff like this in the faux scramble to "not be left behind".



It's even worse than you suggest, for the following reason. The rare employee who cares enough to read through an entire report is more likely to encounter false information, which they will take as fact (not knowing that an LLM produced the report, or unaware that LLMs produce garbage). The lazy employees will be unaffected.



