
So who is liable if the AI makes severe mistakes?

Not that MS has been held accountable for all the security problems they've had recently.




Who is liable if a (non-CNC) machine tool goes off-kilter and kills someone? Possibly the manufacturer of the tool. Possibly the operator. Possibly the company that owned the tool as well as the one who manufactured it.

That is why we have courts. It's not necessarily a great solution, but the alternative would be abandoning all potentially dangerous mechanisms (heck, we wouldn't even have spears, wooden clubs, or sharpened rocks).


Probably the person who pushed the changes live without review


This implies that changes need to be pushed live without review in order for an LLM to make a mistake. Which is obviously very much not true.


There will be far more automated AI code than experienced reviewers can handle, and with each generation there will be fewer reviewers if most development gets automated.



