The comments are a pile-on of criticism and "how can this be legal?" The real question is what percentage of self-driving hours lead to a situation like this: is it high enough to show the system is unsafe compared to unassisted driving, or is this video statistically an outlier?
Someone on here[1] already compiled a list of 7 incidents in this video of a ~30-minute drive. 3 of these look like they could have resulted in a severe crash or even a fatality if the driver hadn't intervened (the train, proceeding into an intersection on a solid red light, ignoring pedestrians). That's a pretty bad batting average - most people don't nearly crash 3 times in half an hour of driving.
It should be banned unless there is a way to know ahead of time that it is trustworthy within its intended operating envelope, and that operating envelope should be clearly and unambiguously explicable to a layperson.
Neither of those conditions holds. I'd appreciate it if people would stop using statistics to give a free pass to an unproven system being tested with a fundamentally unsafe methodology and minimal experimental controls.
No, I was just trying to work out what you were saying.
I do, however, think it's wrong. Why? Because human error is already enough to deal with; now couple that with AI error. It makes society, crash investigation, and road design far more complicated.
Maybe if we banned human drivers entirely and optimized roads for autonomous cars, it would be a good idea. I just can't see these cars not needing supervision for a very long time.
I doubt people are going to buy cars that might kill them by making "mistakes". We'll see, but I think it's going to be a hard sell. People can easily accept their own mistakes. A computer's, though?
It doesn't matter how statistically safe it is. Putting software this unreliable on the streets causes crashes that drivers wouldn't. The sudden swerving, the phantom braking, the absolutely batshit insane behavior, the frankly dangerous "lidar is for scrubs, we're only going to use cameras" design decision, and Autopilot turning itself off ever so slightly before a crash so that Tesla's logs can say it wasn't on: all of these are reasons it should be off the roads and Tesla fined hard.
It doesn't matter, because of the way it behaves in that small percentage of cases where it acts like a blue-collar worker on a cocaine bender on a Friday night. It doesn't slam itself into a tree like a drunk driver in the middle of the night; it's so unpredictably bad that I half expect to see reports of it trying to take a train line because it's a shortcut.