
... Am I the only one thinking all those contortions to get something usable are completely mental? All to get something potentially completely wrong in a subtle way?

These LLMs not only suck up megawatts of power and TFLOPS of compute, they also consume heaps of brain power - and all that for what, in the end? What betterment?


