
> It’s only going to take one bad suggestion that leaves someone in a dangerous situation

I feel like if one bad suggestion can leave somebody in a dangerous situation, many other things must have failed first: informing oneself of the general condition of the roads in a given place and the current season, having a fallback plan in case digital navigation fails or a road is unexpectedly closed, etc.



You expect the human to do all the actual work of planning the trip, but leave only the interesting parts to the LLM?


If you are doing all that, why do you need an AI?



