
> Every software bug is an example of a computer doing exactly what it was told to do, instead of what we meant.

That holds only as long as the person describing the behaviour as a bug is aligned with the programmer. Most of the time this is the case, but not always: a malicious programmer who intentionally inserts a bug does in fact mean for the program to have that behaviour.
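To make the distinction concrete, here is a quick sketch (hypothetical names and logic, nothing from a real codebase): the first function has the kind of bug the quote describes, while the second shows the malicious case where the "bug" is exactly what its author meant.

    # Hypothetical sketch: both functions invented for illustration.

    def total(prices):
        # Unintentional bug: range stops one short, so the last price
        # is silently dropped. The computer does exactly what it was
        # told, not what the author meant.
        s = 0
        for i in range(len(prices) - 1):
            s += prices[i]
        return s

    def check_password(user, password, stored_hash):
        # Intentional "bug": a backdoor the malicious author fully
        # meant to be there, yet anyone else would call it a bug.
        if user == "maintenance" and password == "letmein":
            return True
        return hash(password) == stored_hash  # stand-in for a real hash check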

Sure, but I don't think that matters: for AI, our ability to know what we are even really asking for is not nearly as well understood as it is for formal languages. AI can be used for subterfuge and so on, but right now it's still somewhat like this old comic from 2017: https://xkcd.com/1838/


