Hacker News

This reads like a hit piece. Cars crash; that's obvious. Are Robotaxis crashing above or below human-driver rates?


I've been asking for independent analysis for years now. The data is there. Yet all the headlines come from people with an obvious bias, and this is the first headline I've seen where there's no evidence the data was actually examined.

There are many ways to "lie with statistics". Comparing against all human drivers, including those driving in weather that self-driving cars aren't allowed to operate in, for example. There are many others, and I want some deep analysis to know how they are doing; so far I've not seen it.


> I've been asking for independent analysis for years now. The data is there

Independent analysis would be great, but Tesla has been very withholding, and even deceptive, with its data.

Compare to https://waymo.com/safety/impact where anyone can download the data.


The biggest clue is that Tesla still needs a human supervisor in the car. They aren't doing that for show; it's a tacit admission that the tech isn't there yet.


The rates stated are about 10x higher than humans, and also far higher than Waymo.


well, we're comparing waymo to tesla here, and tesla is crashing a lot more than waymo: 4 crashes in 250k miles versus like 10 in nearly 50 million?


From this article, Tesla crashes about 50% more often. But it's hard to compare when one has a human safety driver and the other does not.

> the report finds that Tesla Robotaxis crash approximately once every 62,500 miles. Waymo vehicles, which have been involved in 1,267 crashes since the service went live, crash approximately every 98,600 miles. And, again, Waymo does not have human safety monitors inside its vehicles, unlike Tesla's Robotaxis.

https://mashable.com/article/tesla-robotaxis-with-human-safe...
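The "50% more often" figure can be sanity-checked from the per-mile numbers quoted above (a quick sketch; the article appears to round the ratio down):

```python
# Miles per crash, as quoted in the article.
tesla_miles_per_crash = 62_500
waymo_miles_per_crash = 98_600

# Crashes per mile are the reciprocals, so the ratio of miles-per-crash
# tells you how much more often Tesla crashes per mile driven.
ratio = waymo_miles_per_crash / tesla_miles_per_crash
print(f"Tesla crashes {ratio:.2f}x as often per mile as Waymo")
```

That comes out to roughly 1.58x, i.e. closer to 58% more often than 50%.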


how is it hard to compare? the company with a safety driver has more crashes. the comparison is easily made in that it makes it even worse


I think you also need to consider blame and severity, rather than raw crash numbers.


Tesla response:

Everything is caused by human safety drivers making mistakes. It's never the AI.


hmm, seems like the 1,267 crashes include general interruptions (like people hitting the car)


Above human rates for sure. In the 90s in my country, the accident rate was 5 per million kilometers (so 5 per roughly 621,371 miles, or about one crash every 124,000 miles), and the rate has come down since.

Basically they are crashing at the same rate as 18-25 year olds in the 90s, in France, when we could still drink like 3 glasses and drive.
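Translating that 1990s figure into the per-mile units used elsewhere in this thread (a rough unit conversion; the 5-per-million-km number is the commenter's recollection, not a verified statistic):

```python
# Commenter's recalled figure: 5 accidents per million km in 1990s France.
crashes_per_million_km = 5
KM_PER_MILE = 1.609344

# One million km expressed in miles, then miles driven per crash.
miles_per_million_km = 1_000_000 / KM_PER_MILE
miles_per_crash = miles_per_million_km / crashes_per_million_km
print(f"one crash every {miles_per_crash:,.0f} miles")
```

That works out to one crash every ~124,000 miles, which makes the quoted Tesla rate of one crash per 62,500 miles about twice the 1990s all-driver rate.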


You just made me think of our PSA.

Was it "un verre, ça va, deux verres, ça va, trois verres, bonjour les dégâts" ("one glass is fine, two glasses are fine, three glasses, hello damage")? Something like that.

Edit: looks like we didn't have "deux verres", maybe.


Driverless Teslas are the hit pieces. Hitting people. Ayooo.

Seriously though, Tesla has an extensive history of irresponsibly selling "autopilot", which killed a ton of people, because they don't take safety seriously. Waymo hasn't.


Can you expand on a "ton" of people? The best source I've found is this Wikipedia article:

https://en.wikipedia.org/wiki/List_of_Tesla_Autopilot_crashe...

But I suspect it isn't comprehensive. It's hard to get good data on this for a variety of reasons.


And are the crashes leading to fatalities at a higher or lower rate than human drivers?


This is the key. The known instances I've seen are very minor taps / fender benders. Not great, but not fatal accidents.


Accountability is a pretty big issue, I think. We've accepted, for better or worse, a certain level of human-caused crashes for 100 years or so. If machines take the wheel they have to be an order of magnitude (or more) better.



