This scares the crap out of me. I know it's an unpopular opinion, but I hope the government regulates the hell out of self driving cars. I want really strong robustness of autonomous systems. The difference between working in 99.9% of conditions and 99.99999% is huge. I'd hate to see corners being cut in the race to get there first, and I fully expect that's what will happen.
Ready for car-hacking deaths? Get ready... Autonomous systems won't be good enough to detect threats and maneuver as well as attentive or lucky drivers for at least 5-10 years. I'm not sacrificing my steering wheel for at least 12...
> Autonomous systems won't be good enough to detect threats
That reminds me of the worry about hacking pacemakers. If someone wants to kill you there are much easier ways to do it. A rock to the head works well without much technology involved.
A rock to the head is easy to detect and prevent. There are witnesses. Pacemakers require people to get closer. An increase in computers has led to an increase in hacks; following that logic, an increase in hackable vehicles will lead to an increase in vehicles being hacked.
When people race to make something happen and omit security, it puts users at risk.
To clarify: the part of my argument you quoted has to do with user intervention, such as when another vehicle comes out of nowhere and the system doesn't know how to respond.
Does it scare you more, or less, than normal humans driving cars? There is approximately 1 fatality per 100 million miles driven in the United States, resulting in about 30,000 fatalities per year. If autonomous vehicles could reduce that to, say, 15,000 fatalities, is there any value in saving 15,000 lives?
Serious question - is it better, from an ethical perspective, to have 30,000 people die at the hands of humans, or 15,000 people die in autonomous accidents? Elon Musk is betting on #2 from an ethical perspective. I know a lot of people prefer #1.
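The arithmetic behind those figures checks out; here's a rough sketch (the 1-per-100-million-miles rate and 30,000 total are from the comment above, everything else just follows from them):

```python
# Back-of-the-envelope check of the fatality figures cited above.
fatalities_per_mile = 1 / 100_000_000   # ~1 fatality per 100 million miles
annual_fatalities = 30_000              # approximate US total per year

# Implied total miles driven in the US per year: ~3 trillion.
annual_miles = annual_fatalities / fatalities_per_mile

# If autonomous vehicles halved the per-mile fatality rate: ~15,000.
av_fatalities = annual_miles * (fatalities_per_mile / 2)

print(f"Implied annual miles driven: {annual_miles:.1e}")
print(f"Fatalities at half the rate: {av_fatalities:.0f}")
```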
To be clearer, it's not the concept of self driving cars itself that scares me. It's the aggressive timeline that sounds like it was dictated by marketing and strategic goals rather than technical feasibility and engineering estimates.
Technically, self-driving cars are about a year out from being better than human drivers under normal highway driving conditions. If they continue on that curve of progress, then by (my hand-wavy estimate) 2018 you should see micro-mapped urban driving scenarios (similar to the Google car in the Bay Area) capable of responding to traffic lights and construction signals; by 2019 you should see exception-based systems (construction pylons, road flares); by 2020, responding to traffic police and hand waves; and 2021 fills in the gaps that I'm not thinking of.
So, in theory, by the end of 2021, within the Bay Area (at least the South Bay: think the Mountain View/Cupertino area), Ford should be able to start deploying, on a trial basis, automated fleets of cars that will, more safely than humans, pick up and deliver people under good driving conditions, during the day, over very micro-mapped routes.
Once they get that nailed, they should be able to start expanding to other neighborhoods in the Bay Area, until by 2028-2030 an automated car should be able to pick up and deliver you anywhere in the SF Bay Area that has presumably been micro-mapped, while at the same time providing pretty decent automated highway driving experiences for good portions of California.
Based on those lessons (and probably in parallel with the California experience), other countries/states/cities will likewise be micro-mapping their road systems and creating "automated-capable areas" that will, over time, merge, until eventually you should be able to get, over well-known roads and in non-inclement weather, from pretty much anywhere in the United States to anywhere else, more safely than you would with a human. My prediction is 2050 at the latest, and 2040 if there is aggressive investment. (The actual date probably lands somewhere between those two.)
Honestly, human drivers are so horrible, it's hard to believe that, with just a few hundred billion dollars of investment over the next fifteen years, we won't be able to come up with something better.
100% - 99.9% is a failure rate of 1 in 1,000.
In how many of the last 1000 car rides did you have a fender bender or accident? That's 1-3 years depending on how often you drive. I'd say 99.9% would be very close to an improvement over the current situation.
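To put numbers on the 99.9% vs. 99.99999% gap from the top comment, here's a quick sketch (treating each ride as an independent trial with a fixed per-ride success rate, which is an illustrative simplification, not a real reliability model):

```python
# Probability of at least one failure over N independent rides,
# for a given per-ride success rate (illustrative simplification).
def p_any_failure(success_rate: float, rides: int) -> float:
    return 1 - success_rate ** rides

rides = 1000  # roughly 1-3 years of driving, per the comment above

print(p_any_failure(0.999, rides))      # ~0.63: a failure is more likely than not
print(p_any_failure(0.9999999, rides))  # ~0.0001: a failure is very unlikely
```

So over the same 1,000 rides, the difference between the two reliability levels is roughly "probably fails at least once" versus "almost certainly never fails."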
I would be more sympathetic to calls for strict regulation if the current situation weren't so alarmingly suboptimal.
He said 'conditions' not number of rides. Why would the number of rides matter, especially independently of the length of each ride? The location, weather and traffic conditions should matter much more.
> "That means there's going to be no steering wheel. There's going to be no gas pedal. There's going to be no brake pedal."
I'm very skeptical of the claim that self-driving cars that work well enough to be fully autonomous are within reach. How do you get them to respond well to really uncommon but dangerous scenarios? Will they pull over safely when they hear an ambulance? If a police officer is directing traffic at an intersection, will they be able to read the hand signals? Will they know to slow down when a ball rolls across the street?
I think the semi-autonomous approach is more promising. The car could automatically take control and pull over if the driver is fishtailing, or drive to the hospital if the driver is incapacitated.
The scenarios you are describing are incredibly common and, compared to real edge scenarios, trivial to develop for (which isn't to say they are trivial to develop for in absolute terms). More challenging are things like: How do you respond to a homeless guy standing on the street directing traffic into an alley? To flooding that leaves 6 inches of standing water obscuring all traffic lane markings? 18 inches of standing water? 24 inches? How do you respond to a road covered in snow (12, 18, 24 inches)? To a road covered in snow with tire prints that seem to go into the oncoming lane? To a road covered in snow with what appears to be a small 24-inch snow drift in your lane?
Etc, Etc..
I think the answer to all these questions is, "Initial versions of automated vehicles will only perform under ideal road scenarios. During snow, flooding, or other inclement weather, they will not be dispatched. Wait for future iterations in 2030, 2040, 2050, etc."
This is why Google is training their cars now in the real world to handle all of these kinds of things. Here's a TED Talk [0] from 2015 where the head of their self-driving unit makes the case for full autonomy over semi-autonomy: the semi-autonomous system is good enough that people trust it, but not good enough to be trusted. Kind of like what happened when the Tesla driver went underneath the truck that wasn't properly detected.
I usually try to write a constructive comment... But this...
This made me laugh out loud. I'm sure they can build these cars, but as for being road ready? Good luck. As for being accepted by society? Good luck. Finally, after the 20th or 30th death, do you really think they won't be regulated to extinction?
I'd figure there are a lot of ways you can damage, disable, or steal a driverless car, or simply vandalize it out of boredom or inability to sympathize with the owner, so much so that these cars probably wouldn't want to service many not-well-off neighborhoods.
So, serious question: are driverless cars such as these going to represent a new front in dividing the rich and the poor? If they are touted as a benefit to public transportation, the poorest need public transportation the most.
Driverless cars will mean cars can use both definitions of redlining.
There's not a lot of weasel room in this promise, since it includes no accessible controls. And semi-autonomous cars are not really useful for ride hailing in any meaningful way.