
Yes, but ...

The argument that self-driving cars should be allowed on public roads as long as they are statistically as safe as human drivers (on average) seems valid, but of course none of these cars have AGI... They perform well in the anticipated and simulated conditions they were trained on (as long as they have the necessary sensors, e.g. Waymo's lidar, to read the environment reliably), but they will not perform well in emergency or unanticipated conditions they were not trained on. Even outside of emergencies, Waymos still sometimes need to "phone home" for remote assistance in deciding what to do.

So, yes, they are out there, and perhaps as safe on average as a human (I'd be interested to see a breakdown of the stats), but I'd not personally be comfortable riding in one since I'm not senile, drunk, a teenager, a hothead, distracted (using a phone while driving), etc. - not part of the classes that drag the human safety stats down. I'd also not trust a Tesla, where penny-pinching, or just arrogant stupidity, has resulted in a sensor-poor design liable to failure modes like running into parked trucks.



  I'd not personally be comfortable riding in one since I'm not senile, drunk, a teenager, a hothead, distracted (using a phone while driving), etc. - not part of the classes that drag the human safety stats down.
The challenge is that most people think they’re better than average drivers.


I'm not sure what the "challenge" is there, but it's certainly true in terms of human psychology.

My point was that if you are part of one of these accident-prone groups, you are certainly worse than average, and are probably safer (both for yourself and for everyone around you) in a Waymo. However, if you are an intelligent, non-impaired, experienced driver, then maybe not - and almost certainly not if we're talking about emergency and dangerous situations, which is where it really matters.
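
To make the "dragging the average down" arithmetic concrete, here is a toy sketch with made-up numbers (the rates and group sizes are illustrative assumptions, not real statistics, and it assumes each group drives a proportional share of total miles):

  # Toy numbers only: suppose 10% of drivers are "high-risk" (impaired,
  # distracted, etc.) and crash at 5x the rate of the careful 90%, while
  # the fleet-wide average is 1.0 crash per million miles.
  overall = 1.0           # crashes per million miles, all drivers combined
  risky_share = 0.10      # fraction of miles driven by the high-risk group
  risk_multiplier = 5.0   # high-risk crash rate relative to careful drivers

  # overall = (1 - risky_share) * careful + risky_share * (risk_multiplier * careful)
  careful_rate = overall / ((1 - risky_share) + risky_share * risk_multiplier)
  risky_rate = risk_multiplier * careful_rate

  print(f"careful drivers:   {careful_rate:.2f} crashes per million miles")  # ~0.71
  print(f"high-risk drivers: {risky_rate:.2f} crashes per million miles")    # ~3.57

With those (made-up) numbers, a robot car that merely matches the 1.0 fleet-wide average is still roughly 40% worse than the careful majority it would be replacing.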


How can you know if you're a good driver in an emergency situation? We don't exactly get a lot of practice.


Sure, you don't know how well any specific driver is going to react in an emergency situation, and some are going to be far worse than others (e.g. panicking, or not thinking quickly enough), but a human has the advantage of general intelligence, and therefore does NOT have to rely on having had practice at the specific circumstance they find themselves in.

A recent example: a few weeks ago I was following another car in making a turn down a side road, when suddenly that car stopped dead (for no externally apparent reason) and started backing up fast, about to hit me. I immediately hit my horn and prepared to back up myself to get out of the way, since it was obvious to me - as a human - that the driver didn't realize I was there, and without intervention would hit me.

Driving away, I watched the car in my rear-view mirror and saw it pull a U-turn to get back out of the side road, making it apparent why it had stopped. I learned something, but of course the driverless car is incapable of learning, certainly has no theory of mind, and would behave the same as last time - good or bad - if something similar happened again.


In my view, as long as companies don't want to be held liable for an accident, they shouldn't be on the roads. They need to be extremely confident, to the point of putting their money where their mouths are. That's true "safety".

That's the main difference with a human driver. If I take an Uber and we crash, that driver is liable. Waymo would fight tooth and nail to blame anything else.


Mercedes is doing this for specific places and conditions.


Well, it depends on the details. I'd trust a Waymo as much as an Uber but I'm pretty skeptical of the Tesla stuff they are launching in Austin.



