Same, and I wonder if it'll share a chapter with Tesla's prematurely shipped autonomous features, the related deaths, and the dishonest messaging surrounding them.
Considering the MCAS[0] failures were largely software defects in an autonomous system, it seems plausible to me...
In the case of Tesla, they make the argument that the safety features have saved far more people than they have injured or killed. Inexplicably, there is no federal agency that actually checks these claims.
In this case I don't see how you could make the same argument: clearly the new planes were not safer than the old ones, or they wouldn't have been grounded.
Based on my experience with Tesla’s autonomous driving software, there is no planet on which that’s true. It is quite literally worse than anyone I can imagine driving unless they’re high, drunk, texting, and doing 30+ mph over the speed limit. I’ve come across an unfortunate number of drunk drivers in my time driving on this earth, and I can say I’ve never once seen or heard of a drunk slamming on their brakes at 70 mph because of a tree shadow across the road. I’ve personally experienced it, and seen someone else’s Tesla do it, on multiple occasions.
I can anticipate and avoid a drunk driver in many situations. I’d never expect that behavior out of any human driver.
I had a Tesla with FSD which I never used again after it pulled out in front of another car on the motorway. Only me grabbing control and pulling us back into lane prevented us being rear-ended. My inferior human reactions were enough for that. Even the regular autopilot lane-keeping was useless on anything except nice neat roads with painted lanes, and it did several frightening phantom emergency stops, luckily all at low speed.
That said, my current MG4 is just as bad with emergency lane-keeping, but with the added entertainment of it being on by default and needing to be disabled on every trip. Level 5 self-driving won't happen for decades.
The good news is that you did what you were supposed to do. The bad news is that it would have been your responsibility if you hadn't: "FSD" doesn't absolve you of your responsibility to be in control of your vehicle, it merely lets your mind wander away from the task.
I'm glad you're safe, and honestly you were set up to fail (hence the GP's point). Thank you for taking the responsible choice and not putting yourself in the same dangerous situation again.
I don't know about the legalities, but we have two different scenarios. First, the car in front stops: the driver is responsible for braking; FSD may or may not help, but the driver is responsible. Second, the car is going straight and decides to randomly change lane or speed up/down: the driver may be able to take mitigating steps, but the car is responsible. The second case is more akin to a car accelerating when the brake is pressed.
And yet one sees people talking the feature up all day on the internet, as if it's safer than human drivers. When I see such comments, I seriously question whether they've ever driven a car and come across a Tesla in "self-driving" mode.
Just to add to the conversation a bit more:
Humans give away their intentions in more ways than turn signals; as a human, you subconsciously know what another driver is trying to do just by watching. It's difficult to explain, and you've got to have driven a car to understand.
These self-driving vehicles aren't even close to mimicking that aspect, leaving us humans on the road a bit more clueless about what's going on.
Fair, but humans sometimes hit the accelerator rather than the brake when they have to do an emergency stop, which is an error you would not expect a self-driving car to make.
I haven't driven a Tesla, although I've been in an Uber where the driver used FSD and it didn't do anything crazy. I don't think anecdotes on the internet are going to settle this; an independent body needs to work with insurers etc. to track statistics and determine which driver assistance features are helping and which are hurting.
> I’d never expect that behavior out of any human driver.
People will slam on the brakes when they see an animal run out on the road. It's generally considered better driving practice to not do that, as you risk rear end collisions, but that behaviour from humans does exist.
What are you talking about, Tesla doesn't ship any autonomous features. Driver assist features do not make a car autonomous. Tesla has always plainly said you must monitor the car at all times autopilot is engaged, the exact same way that turning on cruise control in any other car doesn't make the car autonomous even though the car is managing the speed.
Quite right old chap! Surely no reasonable consumer would interpret terms like "Autopilot" or "Full Self Driving" to mean that the car autonomously drives itself!
> What are you talking about, Tesla doesn't ship any autonomous features. Driver assist features do not make a car autonomous. Tesla has always plainly said you must monitor the car at all times autopilot is engaged, the exact same way that turning on cruise control in any other car doesn't make the car autonomous even though the car is managing the speed.
What are you talking about?
What Tesla markets and sells falls within the SAE driving automation levels, just not Level 5.
Nothing I wrote spoke to anything Tesla sells being L5 (or any particular level at all).
Edit: On the topic of MCAS, it strikes me as a system roughly akin to SAE L2/L3, overriding pilot inputs to counter a perceived stall risk. This, AIUI, is well within the autonomy space Teslas operate in...
[0] https://en.wikipedia.org/wiki/Maneuvering_Characteristics_Au...