
That's a pretty weak argument regardless of your stance on self driving cars. Student driver records aren't meaningful because we don't have enough data to make a judgement. We have lots of data on self driving cars. There are other ways to cast doubt on self driving car records but this isn't one of them.


Tesla's self driving cars are students that are under the constant supervision of their teachers.


This shouldn't be downvoted to light grey. The analogy is excellent.

I presume this comment is talking about student drivers where the teacher has an override steering wheel. So the student gets into accidents at a lower than average rate because every time they get close to doing so the teacher takes over.


1) Most student hours aren't in override steering wheel cars

2) For this analogy to make sense, it needs to be an average driver, not a driving teacher

3) Unless you're claiming AI and human drivers are uniquely suited to different types of driving, then, if the effect you're claiming is true, you would expect the accident rate during the human/teacher driving hours to be much worse than average, because those hours miss out on all of the "easy" miles. So far, there's no evidence of this.


That's just a bad analogy though. If a student driver accumulates a million miles of driving with no accidents, they're probably a fine driver even if the teacher was in there the whole time. Conversely, you don't become a safer driver just because a driving instructor decides to sit in the back seat of your car tomorrow.


Not if the student gives control back to the teacher every time there is a problem the student can't handle. The real metric is "dangerous/complicated driving situations mastered", and "miles driven" is only a stand-in for it. A bad stand-in, if the student can opt out of the few miles that contain the dangerous bits.


If we're really pushing this analogy, however, then you would expect the teacher to have a much higher accident rate than the average driver, because you're claiming the teacher only does the hard stuff, or, at the very least, misses out on the majority of the easy stuff (assuming, for comparability, that the teacher is a driver of average competence).

Specifically, if you're claiming only 99 out of every 100 miles are easy and carry no chance of a crash, then a human who only drives the 1 in 100 miles that are hard should have a 100x higher per-mile crash rate than a human who drives all 100. They should probably have an even worse crash rate because of the switching cost of suddenly taking control, unless you want to make the weird argument that suddenly taking control of an autopilot car is safer.
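
To make that arithmetic concrete, here's a rough back-of-the-envelope sketch (the 99/100 split, the mile count, and the crash count are all hypothetical numbers for illustration, not real data):

    # Hypothetical split: 99 of every 100 miles are "easy" with zero crash
    # risk, so every crash happens in the 1 hard mile per 100.
    total_miles = 100_000_000      # made-up total fleet miles
    hard_fraction = 0.01           # the 1-in-100 hard miles from the claim
    crashes = 50                   # made-up crash count, all on hard miles

    # Driver A drives everything; driver B only ever drives the hard miles.
    rate_all_miles = crashes / total_miles
    rate_hard_only = crashes / (total_miles * hard_fraction)

    print(rate_hard_only / rate_all_miles)  # -> 100.0, i.e. 100x the per-mile crash rate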

The Tesla report says autopilot experiences a crash every 4 million miles. With autopilot disengaged, it's every 2 million miles. The baseline national average is a crash every 0.5 million miles.

I can't find the perfect statistics, but one study suggests uneducated people are 4x more likely to die in a car crash, so let's allow some generous rounding and say that, normalized for wealth, Tesla drivers not actively using autopilot are at levels comparable to the average driver (1 crash per 2 million miles).

Unless Tesla drivers are phenomenal emergency handlers, it's difficult to explain how the non-autopilot crash rate could be so low while also claiming Tesla is hiding the true crash rate of its autopilot features by pushing difficult miles to human drivers, because those human drivers are seeing normal crash rates on what you claim is a much more difficult set of miles.
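
As a sanity check on that, you can put the reported per-mile figures side by side (these are just the numbers quoted above, converted to crashes per million miles; nothing here is new data):

    # Miles per crash as quoted above, converted to crashes per million miles.
    miles_per_crash = {
        "autopilot engaged": 4_000_000,
        "autopilot disengaged": 2_000_000,
        "national average": 500_000,
    }
    for label, mpc in miles_per_crash.items():
        print(f"{label}: {1e6 / mpc:.2f} crashes per million miles")

    # Prints 0.25, 0.50 and 2.00 respectively. If autopilot were handing the
    # hard miles to humans, the disengaged miles would be enriched with hard
    # miles and should show a rate well above the Tesla-driver baseline, not
    # one that still beats the national average by 4x.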

It's possible (probable) that the autopilot would experience a higher crash rate if it were not allowed to call in a human. But if you ask more generally whether autopilot is reducing the total number of accidents those drivers would otherwise experience, I'd say 'probably'.


It means they are probably fine to drive the courses they have been driving. However, with self-driving we know that those miles aren't necessarily representative of what the cars will be asked to drive. It's also not a particularly relevant comparison to human drivers unless we normalize a bit for road/driving conditions.


You are artificially adding in the detail that the student only drives limited courses, to imply there might be conditions unfamiliar to a self driving car where we shouldn't trust its performance.

That is a better point, but also just a different point from the one originally made.



