
> LLMs are anything but consistent

Depends on how you're holding them, doesn't it? Set temperature=0.0 and you get very consistent responses, given consistent requests.
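A minimal sketch of why that holds, using a toy softmax sampler rather than any particular provider's API: at temperature 0, sampling collapses to a greedy argmax, so the same logits always produce the same token.

    import math
    import random

    def sample_token(logits, temperature):
        """Pick a token index from raw logits at a given temperature.

        temperature == 0 degenerates to greedy argmax, so repeated calls
        with the same logits always return the same token.
        """
        if temperature == 0:
            return max(range(len(logits)), key=lambda i: logits[i])
        # Scale logits by temperature, then softmax into probabilities.
        scaled = [l / temperature for l in logits]
        m = max(scaled)
        exps = [math.exp(s - m) for s in scaled]
        total = sum(exps)
        probs = [e / total for e in exps]
        return random.choices(range(len(logits)), weights=probs, k=1)[0]

    logits = [2.0, 1.5, 0.3]
    print({sample_token(logits, 0.0) for _ in range(100)})  # always {0}
    print({sample_token(logits, 1.0) for _ in range(100)})  # typically {0, 1, 2}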



Does the article mention the temperature? I didn't see it.


With 38,000 trials you have a pretty good idea of what the sampling space is, I'd bet.


I didn't see that either; you were the one who brought up consistency.



