
I’m confused how this is possible. I have a 2020 Tesla, and you have to turn the wheel every couple of minutes while Autopilot is on; otherwise it will just turn off after beeping a ton.


Most of Tesla’s ass-covering mechanisms can be circumvented easily. The steering wheel nag in my Model X is easily fooled by any kind of asymmetric weight on the wheel. There are other mechanisms, like seat belt and seat weight monitoring, but these can be easily circumvented as well.

To Tesla’s credit, they have reportedly started using gaze tracking on cars equipped with a cabin-facing camera, which is much harder to circumvent. Virtually every other automaker selling driver-assistance systems to the public uses gaze tracking.
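To make that concrete, here's a toy sketch in Python of why a torque-only hands-on-wheel check is so easy to defeat with a clip-on weight while a gaze check is not. The thresholds and function names are invented for illustration and have nothing to do with Tesla's actual firmware:

    TORQUE_THRESHOLD_NM = 0.3   # hypothetical: any torque above this counts as "hands on"

    def torque_based_check(read_wheel_torque_nm) -> bool:
        """Naive hands-on check: is *any* torque being applied to the wheel?
        A weight clipped to one side of the wheel applies a constant
        gravitational torque, so this passes with nobody driving."""
        return abs(read_wheel_torque_nm()) > TORQUE_THRESHOLD_NM

    def gaze_based_check(eyes_on_road_fraction: float) -> bool:
        """Camera-based check: was the driver looking at the road for most
        of the monitoring window? A static weight can't fake this."""
        return eyes_on_road_fraction > 0.7  # hypothetical attention threshold

    # A clip-on weight producing a constant ~0.5 Nm of torque defeats the first check forever.
    fake_weight_torque = lambda: 0.5
    print(torque_based_check(fake_weight_torque))       # True, even with an empty driver's seat
    print(gaze_based_check(eyes_on_road_fraction=0.0))  # False: no driver gaze detected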


If you're actively circumventing your car's safety features to use it in out-of-spec ways, it's hard to argue that an accident is anything but your own fault.

It's like pulling high negative G in a Cessna 172 and then blaming the manufacturer when the wings fall off.


It is the user's fault,

but to be honest, when left to its own devices the car drove at high speed into a tree. It's not a subtle obstacle. If the car can't figure out what is going on in this scenario, that's not a good look.

Driving autonomously is hard, but not running into static obstacles seems like one of the more basic things.


If you turn off Autopilot and put a brick on the accelerator, the Tesla may also drive into a tree! Or any car made in the last 100 years, for that matter.


> when left to its own devices the car drove at high speed into a tree.

Was it "left to its own devices" though?

You can't put the car in drive from the back seat.

Autosteer is limited to 5MPH over the speed limit unless you are on a divided highway.

The car seems to have accelerated faster than it normally accelerates under autopilot.

Something doesn't add up here.


Was it confirmed that autopilot was on? Hopefully it was wirelessly reporting its black box info since it was burned so badly.


What if the Cessna salespeople and marketing were all doublespeak about high-G flying (allegedly the salespeople on the floor go way beyond doublespeak sometimes)? Yeah, the pilot would be to blame, but in that case Cessna would also be to blame.


If you actively circumvented multiple mechanisms that prevented you from making those maneuvers, then yes, you bear full responsibility.


> What if the Cessna salespeople and marketing were all doublespeak about high-G flying

The driver deliberately set out to sabotage mechanisms which are there to protect you.

A more apt comparison would be if the pilot disabled the stall warning on the aircraft and then attempted low-altitude maneuvers at the threshold of the aircraft's stall speed.

If someone deliberately disables safety features, they're left holding the bag.


You have to actively ignore multiple safety warnings and deliberately bypass failsafes.

The big thing I have to wonder about is how on earth they got the car going so fast on residential streets. Autopilot is hard-coded to limit you to 5MPH over the speed limit. I know you can push the gas down with a brick or something stupid like that, but doing that disables the car's ability to slow down or stop automatically, which is a pretty fundamental part of what Autopilot does and how it operates.

So many people look at this as if the driver pushed a simple bypass button to get around a safety feature, but getting a Tesla going fast enough to wrap itself around a tree on a residential street is not simple at all.


Tools to fool Tesla's minimal hands-on detection have been available for years. Amazon even sells them[1]. Anyone buying them should have their license suspended as a precautionary measure.

[1] https://www.amazon.com/QCKJ-Steering-Autopilot-Assisted-Auxi...


It is perfectly possible to nudge the steering wheel from the passenger side.


Tesla has not thought of incorporating a sensor to ensure that someone is actually behind the wheel and not at the side of the wheel? I would find that unlikely.


Well, you'd be mistaken. They only use the torque sensor in the steering wheel. There have been news articles about Tesla engineers asking for a better system, and Musk's explicit response was that technology like eye tracking would be ineffective.


Autopilot will also disable if you unbuckle your seatbelt or lift your butt off the seat.


It's easy to fool, and it appears these two folks were doing so deliberately.


Totally agree with you. I own a Tesla and I am definitely not a big fan of how they communicate or treat their customers (customer service is just horrible). But in the present case, this is not a Tesla issue, just an issue of two irresponsible people. The car would have beeped thousands of times before crashing (very loud beeps for not applying force on the steering wheel, plus other loud beeps from obstacle detection as the car went off the road toward a tree).


Yeah, it seems to me that this sort of safety measure only has to be strong enough to force people to go out of their way to circumvent it: anyone who circumvents it has ipso facto displayed malicious intent and should be liable for any damage caused.


I was thinking the same... the driver had to be up to some kind of shenanigans. The front-end damage from the tree didn't look too bad. Maybe the driver was drunk and fled, or got in the back seat to avoid arrest?


> or got in the back seat to avoid arrest

and then purposely burned to death to really sell the story? that's dedication!


Right... there are a few scenarios where it could have happened, but I highly doubt he would have gone to the back seat if the car was engulfed in flames. E.g., he was really drunk, the fire didn't start until later, he buckled himself in to wait for the cops, dozed off, and succumbed to smoke inhalation. You really don't think one of the first things a drunk Tesla driver who'd just been in an accident would think to do is blame the automated driver?


Reach over and turn the wheel from the passenger side?

This one sounds like an ID10T error.



