I can't see L5 happening without, amongst other things, the human ability to reason about things/concepts we've never encountered before (à la consciousness).

Highway driving is the most constrained normal driving problem, and it is solvable in 99% of cases. But there are so many things that can happen in most other driving situations that make me think model-based approaches (Tesla) are doomed to fail... Go ahead, train a classifier for every situation you can think of - I guarantee you've missed many things.

Elon tweeted the other day that FSD is "coming soon". Either I'm totally wrong about this - and of course I hope I am - or Karpathy and the dev team should be tempering his expectations.

That isn't to say there isn't immense value in L2/L3; there totally is. But I think that solving driving (being able to handle any situation a human can) is pretty much the same thing as solving intelligence generally.

Musk, for all his genuine achievements, is also a bullshitter. If/when his FSD prediction fails to come to pass, I expect that he will just say "Well, when I said FSD, what I really meant is..."


That is already happening. Go visit the Tesla forums, and you will hear a lot of fans trying to tone down expectations for FSD, redefining what that term means. I would have thought "full self-driving" was pretty self-explanatory, but from things I've recently read, the goalposts have already shifted significantly.
