Tesla crash driver posted videos of himself riding without hands on wheel (theguardian.com)
29 points by benrmatthews on May 15, 2021 | hide | past | favorite | 43 comments


See also "Tesla owner who 'drives' from back seat got arrested, then did it again": https://arstechnica.com/tech-policy/2021/05/tesla-owner-jail...


I know I shouldn't laugh, since these are serious crimes, but that article is hilarious.

> When asked if he purchased a new Tesla after the previous one was impounded he said, "Yeah, I'm rich as [expletive]. I'm very rich."

> "I feel safer back here than I do up there," Sharma also told KTVU from the right-rear passenger seat.

> "I just got out of jail. I already got [another] Tesla. You feel me, I'm rich like that. I came out of the pandemic a fucking millionaire, bitch."

Is this what aristocrats would once call "new money"?


Remember 27 days ago when Hacker News was flooded with comments insisting that the underlying problem was the "Autopilot" name? The argument, as best I can tell, is that the name of their cruise control system somehow lures innocent, everyday consumers into having deadly high-speed crashes on short culs-de-sac.

Then Consumer Reports published a video showing how to defeat Tesla's multiple safety protections by intentionally misleading the vehicle's driver occupancy sensors. Tesla was blamed for having sensors that couldn't detect intentional misuse.

Turns out that Occam's Razor holds: this whole saga was nothing more than a moron doing stupid things in a high performance vehicle. It's a shame that so many people had such wildly different assumptions because it happened to be an electric car made by a company whose CEO is an eccentric nerd and twitter troll.


I don't know. If they'd just called it Co-Pilot or Wingman it'd still be cool, but also more accurate. I thought the big thing was bringing an electric car to market; this feature got overblown and has become dangerous.


> got overblown

Musk was very intentional about promoting Tesla’s Autopilot as the equivalent of L4 self-driving (if I remember right), even getting into fights with people who questioned the validity of the claim.


Citation? If your description of Elon's statement is accurate, and he ever described the contemporaneous shipping Autopilot system as the "equivalent of self-driving L4", then I would change my mind on this matter immediately.


I'm not sure if this is enough for you to change your mind, but I remembered his claims from last year, and they were virtually as exaggerated:

"U.S. electric vehicle maker Tesla is “very close” to achieving level 5 autonomous driving technology, CEO Elon Musk said on Thursday, referring to the capability to navigate roads without any driver input.

“I’m extremely confident that level 5, or essentially complete autonomy, will happen, and I think will happen very quickly,” Musk said in remarks made via a video message at the opening of Shanghai’s annual World Artificial Intelligence Conference (WAIC).

“I remain confident that we will have the basic functionality for level 5 autonomy complete this year.”

Obviously not exactly the same as making a definite statement about the capacity of current cars, but we all know what part of the message sticks with the public.

https://venturebeat.com/2020/07/09/elon-musk-says-tesla-is-v...


Elon being utterly delusional around the timeframes of future Tesla product releases isn't interesting to me, as I don't think that's ever been disputed. But I would change my mind if Musk made an egregiously misleading statement about currently shipping technology. This is something frequently attributed to him—but always in vague terms. I've never actually seen it.


How are those more accurate? Colloquially, to be "on autopilot" describes how one can mindlessly perform familiar actions which don't require novel thought. For example, walking home from the train station. I'm sure plenty of people have been injured or killed while walking around "on autopilot."

In the context of a software feature, "autopilot" implies a computer control system which is capable of maintaining rudimentary ongoing control of the vehicle.

Whereas a co-pilot is a whole other pilot. Surely that's worse, implying equivalence to a human sitting next to you, with comparable levels of training and skill, intuitively capable of taking over at any time. I wonder what would happen if Ford developed a system and called it "Co-pilot". We might never know.


“Autopilot” is literally an airplane term and refers to systems that can take off, cruise, and land the plane even in many adverse conditions without any human input.


You are describing features available on a small minority of aircraft with autopilot features. The vast majority of aviation autopilot systems only control heading and basic flight stability.

Literally the opening paragraph of Wikipedia:

”An autopilot is a system used to control the trajectory of an aircraft, marine craft or spacecraft without requiring constant manual control by a human operator. Autopilots do not replace human operators. Instead, the autopilot assists the operator's control of the vehicle, allowing the operator to focus on broader aspects of operations (for example, monitoring the trajectory, weather and on-board systems).”

https://en.wikipedia.org/wiki/Autopilot


> The argument, as best I can tell, is that the name of their cruise control system somehow lures innocent, everyday consumers into having deadly high-speed crashes on short culs-de-sac

Musk has been hyping Tesla's "full self driving" capability for years (see xvector's comment from a couple of months ago - https://news.ycombinator.com/item?id=26519357). Musk does it so much that people believe the system can do more than it really can.

Musk says this with a megaphone to his 54.7 million followers: https://www.theverge.com/2019/4/22/18510828/tesla-elon-musk-...

But then Tesla privately tells regulators that "full self driving" is only level 2: https://www.thedrive.com/tech/39647/tesla-admits-current-ful...

And then Tesla privately says that Musk's comments don't match engineering reality: https://www.theverge.com/2021/5/7/22424592/tesla-elon-musk-a...

At what point does "eccentric nerd" really just mean "huckster"?

The most responsible thing Musk can do here is to very publicly admit that autopilot can't do what he's been selling. Or will his eccentricity prevent that from happening?


I don't think anyone disagrees that Elon's forward-looking statements about future autonomous driving capabilities have been consistently and egregiously optimistic. But forward-looking statements about future product timelines aren't relevant to what customers believe Autopilot's capabilities are today.

It is strange to claim that the autopilot feature "can't do what he's been selling" by pointing to statements containing qualifiers such as "I think we are less than two years away" and "I think probably by end of next year".


What would your average person think "full self driving" means? And what does Tesla's "full self driving" actually do?


I'm not talking about the "full self driving" optional feature. Personally I agree with criticism of the FSD product name and I think it's an inappropriate and misleading name. But that's not relevant here, because the car did not have the "full self driving" feature.

This is actually very important, because it means the customer explicitly declined Tesla's repeated prompts to add the "full self driving" option to their vehicle. This customer would therefore have been acutely aware that "full self driving" was a feature their car did not have, and thus that the car could not be capable of any such full self-driving.


It had "full self driving". He said so himself:

https://insideevs.com/news/507038/nhtsa-investigate-tesla-cr...


It is amusing that this comment thread came to an abrupt end once you were contradicted about what is "actually very important".


"Another man was seriously injured when the electric vehicle hit him as he was helping the truck driver out of the wreck."


This is the real crime here.


> A message seeking comment from his wife was not returned.

That made me spit out my coffee. And not because I was laughing.

Journalists have to be one of the most tone-deaf and least sensitive groups of people on this planet. And I thought I was bad.

If you feel like you have to write an article like that mentioning people by their full names, by all means, fine. But maybe leave a grieving wife the fuck alone - at least for a few months. Find a family member that is a few steps removed if you can't contain yourself.


It's one thing to ask her for a comment, it's another to publish that she refused/ignored the request. That's usually reserved for asking an executive about a scandal at their company, not a grieving wife.


Thankfully so far these reckless sociopaths have only done self-owns. Tesla is in for a reckoning when an innocent gets killed.


In this case:

> Another man was seriously injured when the electric vehicle hit him as he was helping the truck driver out of the wreck.


AI has gotten really good but will it ever develop the necessary imagination to be able to handle all edge cases? If we ever do reach that point, we should require all cars to be AI-driven only so that bad drivers don’t injure themselves or others when they get into these kinds of situations. Until then, something needs to be done before too many people kill themselves by thinking that autonomous cars are more capable than they actually are.


If Tesla used Lidar for their emergency braking I doubt they'd be driving straight into stationary objects. But Elon is adamantly against Lidar and wants vision only.

Then again, other car companies have emergency braking without Lidar (just radar/camera like Tesla) that doesn't seem to crash into stationary objects, so it sounds like Tesla's software is just not as good at emergency braking.


I'm sure it will never be perfect. But I think, at the very, very least, we should be well past the point where cars hit heavy, stationary objects in the middle of the road, assuming sufficient stopping distance.


This is poor quality thinking.

The problem is that the AI is held to an impossible standard of near perfection, yet we happily accept that human drivers kill themselves and others in droves.

The right question is whether self driving is better than the human driver.


Not sure I fully follow your comment. Who is the reckless sociopath here?


I am sure he means the driver.


Elon Musk plus the drivers who do things like sit in the backseat or go to sleep.


I'd guess Elon Musk and Tesla stock owners.

Tesla has sold people on these cars being self driving, and this is the natural consequence of that sales pitch


Let's keep things in perspective. Even though it's reckless and illegal, the accident rate is lower than that of a human driver. I'd spend my energy on people looking at their phones while driving.

"we registered one accident for every 4.19 million miles driven in which drivers had Autopilot engaged. For those driving without Autopilot but with our active safety features, we registered one accident for every 2.05 million miles driven."


Tesla is a master of framing. The miles driven without Autopilot include situations where you can't use Autopilot anyway. The fair comparison would be miles where Autopilot could be used but isn't versus miles where Autopilot is being used.
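To make the selection effect concrete, here's a toy calculation (all numbers are invented for illustration, not Tesla's actual figures) showing how Autopilot can look safer in aggregate even if it adds zero safety on the roads where it's actually engaged:

```python
# Invented toy numbers: Autopilot is only engaged on highways,
# and highways are inherently much safer per mile than city streets.
highway_miles_ap = 3.0e6      # highway miles driven WITH Autopilot
highway_miles_manual = 1.0e6  # highway miles driven manually
city_miles_manual = 3.0e6     # city miles (Autopilot unavailable)

# Same underlying accident rate on highways either way, i.e. in this
# toy world Autopilot provides no mile-for-mile safety benefit at all.
highway_rate = 0.25  # accidents per million highway miles
city_rate = 1.00     # accidents per million city miles

ap_accidents = (highway_miles_ap / 1e6) * highway_rate
manual_accidents = ((highway_miles_manual / 1e6) * highway_rate
                    + (city_miles_manual / 1e6) * city_rate)

ap_rate = ap_accidents / (highway_miles_ap / 1e6)
manual_rate = manual_accidents / (
    (highway_miles_manual + city_miles_manual) / 1e6)

print(f"Autopilot: {ap_rate:.2f} accidents per million miles")   # 0.25
print(f"Manual:    {manual_rate:.2f} accidents per million miles")  # 0.81
# Autopilot looks roughly 3x safer in the aggregate statistic,
# despite being exactly as safe mile-for-mile on the same roads.
```

Comparing aggregate rates across different road mixes is the framing trick; the honest comparison holds the road mix constant.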


Ya, my immediate thought upon reading that was, you could rewrite that sentence to say: "There were fewer accidents while the autopilot did the easy stuff than when humans were doing all the difficult stuff."


We don't actually know if it's safer, since most of those 4.19 million miles will have been driven by people paying at least some attention while Autopilot was engaged, not by the type of person reading a newspaper in the back seat while the autopilot does its thing.


The driver in all of these fatalities was the same, err, robot, and if any other individual showed a propensity for running amok and killing people, they would lose their licence to drive, have wildly increased insurance premiums, and eventually go to jail and be kept there. What it boils down to is that the Tesla robot has been given a kill permit, and that this is the thin edge of the automated-killing-of-humans wedge. Think "area denial system".


This just boils down to another tesla-tard being arrogantly confident in a very imperfect system that, while improving, does fail, with very high consequences. There is a reason the automated driving system requires attentiveness and proper hand/foot placement, much less being in an entirely different seat.


Um... "The driver [...] posted social media videos of himself riding in the vehicle without his hands on the wheel or foot on the pedal."

That's... what autopilot does. What's the allegation here? That he used autopilot correctly and filmed it?

More to the point, where's the allegation that autopilot was at fault here? The only detail I can find is that he hit an existing overturned vehicle in the middle of the night. That's actually a routine kind of accident for human drivers, who aren't prepared to see stationary objects in the road. Add to that "2:30am" and I think it's pretty clear we'd all view this as a terrible but unavoidable tragedy in any other vehicle.

Again, what's the actual story here? I know what they want us to think based on the headline and lede. I just don't see that in the text of the article. Someone help me out here.


> That's... what autopilot does.

Actually not true. From their website [0]:

“Autopilot and Full Self-Driving Capability are intended for use with a fully attentive driver, who has their hands on the wheel and is prepared to take over at any moment.”

[0] https://www.tesla.com/support/autopilot


Can someone tell me what the point of autopilot is if you are required to actually be fully in control of the vehicle at all times regardless?


Gosh dang it Bobby, the point is to sell more cars!


I am going to take that as a "no".


> That's... what autopilot does. What's the allegation here? That he used autopilot correctly and filmed it?

https://www.tesla.com/support/autopilot

> Before enabling Autopilot, the driver first needs to agree to “keep your hands on the steering wheel at all times” and to always “maintain control and responsibility for your vehicle.” Subsequently, every time the driver engages Autopilot, they are shown a visual reminder to “keep your hands on the wheel."



