Study: Tesla Autopilot misleading, overestimated more than similar technology (mercurynews.com)
107 points by pseudolus on June 21, 2019 | 154 comments


So I take it it's NOT okay to take your hands off the wheel with Tesla Autopilot? I am not familiar with the technology.

The Tesla page on it (https://www.tesla.com/autopilot) takes a while to get to describing what it does, but then says, among other things:

> Autopilot enables your car to steer, accelerate and brake automatically within its lane.

And:

> With the new Tesla Vision cameras, sensors and computing power, your Tesla will navigate tighter, more complex roads.

So it's steering... but you're supposed to leave your hands on the wheel anyway? But not actually steering, just resting there just in case? I'm not sure how realistic this is in terms of human behavior. To their credit, the page, at least as of this moment, also says:

> Current Autopilot features require active driver supervision and do not make the vehicle autonomous.

Reading just the web page, if you asked me "So, do you need to leave your hands on the wheel when autopilot is engaged on the freeway?", I'd be unsure. Does "active driver supervision" mean "hands on the wheel"?

In general, it is very hard for humans to keep paying close attention to something they aren't actively controlling, "just in case." I'm not sure there's a way for this technology to be safe, if that's what it depends on to be safe. (Not unique to Tesla.)


As a regular AutoPilot user, I can confirm it does require active supervision. You need to be looking out the window as it drives you down the highway.

It detects minute levels of torque on the wheel, so resting a hand on the wheel is enough to keep it happy. If it doesn't detect any torque on the wheel it prompts you, and you can touch the wheel or move one of the scrollwheels to let it know you're paying attention.
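(For the curious, here's a toy sketch of how a torque-based presence check like this might work. Every threshold, timing, and name below is invented for illustration; Tesla's actual logic is not public.)

    # Hypothetical torque-based "hands on wheel" monitor.
    # Thresholds and timings are made up, not Tesla's real values.
    import time

    TORQUE_THRESHOLD_NM = 0.1   # assumed: any minute torque counts as a hand on the wheel
    PROMPT_AFTER_S = 30.0       # assumed: seconds of no torque before prompting

    def monitor(read_torque_nm, prompt_driver):
        last_input = time.monotonic()
        while True:
            if abs(read_torque_nm()) > TORQUE_THRESHOLD_NM:
                last_input = time.monotonic()   # a small tug resets the timer
            elif time.monotonic() - last_input > PROMPT_AFTER_S:
                prompt_driver()                 # flash, then beep, then disengage
                last_input = time.monotonic()
            time.sleep(0.05)                    # poll at 20 Hz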

> In general, it is very hard for humans to keep paying close attention to something they aren't involved in actually actively controlling

I hear this a lot without citations, and I would guess it’s not specifically studied in terms of Tesla AutoPilot.

But what I can say is that AutoPilot takes the strain away from driving and allows me to focus on higher level planning and situational awareness. I watch the cars around me, I look ahead down the road and check mirrors more frequently. AutoPilot makes driving significantly more pleasant while allowing me to be more aware of potential risks around me.

I have no doubt some people choose to disengage. But it is by no means difficult or boring to stay engaged in monitoring the drive.

On the contrary, while I’ve many times “zoned out” while driving as simply maintaining position in lane is extremely tedious, I’ve never zoned out while on AutoPilot, because watching the absurd things that the drivers around you are doing never gets old.


I would even say sometimes it checks too much. I've driven, actively looking ahead at the road, and then it will beep because I ignored the prompt on the dashboard (below the wheel). The prompt is a white gradient at the top of the dashboard that looks like glare on the dash.

I don't know how these people fall asleep behind the wheel unless they've done something that mimics an active hand on the wheel. Maybe with the Model 3, which has a driver-facing camera, they will enable it and check the driver's eyes.


I have had this happen too, but it only goes to emphasize just how much we tune out during long drives. My use of autopilot is to have the equivalent of a back seat driver or co-pilot who is just a bit paranoid. Zoning out is something the AP system won't do but drivers can do.

I look at it this way: how many of you have been driving on a long trip only to glance in the rear view mirror and see a bridge behind you that you do not truly remember driving under?

I find AP reduces the stress of the long drive while adding safety. I do not allow it to change lanes on its own, nor do I use navigate-on-autopilot. Many cars have adaptive cruise control, the type that keeps a distance from other vehicles, and some have lane keeping assist, which does as the name suggests; AP is all of that with active steering. Plus the car can drive in some of the worst conditions that are tiresome for human drivers, like at night in the rain; even the headlight glare of oncoming cars does not faze it.

About the article: all those things they asked whether people felt safe(r) doing with autopilot, well, they actually are safer. People do them already with plain cruise control and regular driving, with no overriding system doing the active driving. Do I suggest being inattentive? No, but to suggest it isn't safer with AP is silly. However, saying AP replaces a human driver is just as silly.


> I look at it this way: how many of you have been driving on a long trip only to glance in the rear view mirror and see a bridge behind you that you do not truly remember driving under?

Attention and memory formation/retention are not the same thing. The mind tends to discard detailed memories of routine tasks that have been mastered. If I ask you to tie your laces, would you remember which hand touched the laces first? It doesn't mean you weren't paying attention - only that it wasn't memorable.


That's an interesting thought. I will say that non-memorable is not the same as non-important. "Did I drive under a bridge" isn't the same as "did I leave the oven turned on" or "did I lock the front door".


> I watch the cars around me, I look ahead down the road and check mirrors more frequently.

I haven't tried Tesla autopilot yet so I don't know how much easier it makes driving, but if steering[1] keeps you from doing these things ... should you be on the road at all? If you're too tired to both steer and maintain situational awareness you should probably take a rest :/

[1] steering, more like keeping the wheel steady on the freeway. Afaik Tesla can’t autopilot on the twisties yet anyway


Cognitive load is a real thing.

People have a finite ability to multitask, especially with similar actions. The only possible approach to driving is to not pay attention to most things as you focus on something specific and then cycle through each activity. Further, when you are forced to pay attention to something specific you must spend less time focusing on other things.


Babysitting Autopilot that can unpredictably steer me into a concrete barrier would be more demanding than driving a car that is more predictable.


Driving a car that is more predictable, which you then drive into the barricade, or get rear-ended in, or side-swipe someone with on the exit? The average driver is more deadly than any Tesla on Autopilot, by the stats.


Which stats? The only statistic I'm aware of Tesla claiming is that Tesla Autopilot is safer than all other road vehicles combined.


Teslas are really expensive to repair, which impacts insurance costs. But some companies have self-driving car discounts, such as: https://www.joinroot.com/blog/tesladiscount/

This suggests a net benefit exists relative to the base model without self-driving capabilities.


One big problem I continue to see is that the behaviour you get used to can change with a software update and I think that's dangerous. People can naturally let down their guard when things become routine and may be ill prepared to deal with an unexpected change in behaviour from the autopilot.


If you don’t know, you’ll learn pretty quickly when you buy one. If you don’t keep your hands on the wheel, it will flash at you, then beep at you, then stop the car.

In reality, once you are used to it, it’s actually quite easy to keep a hand on the wheel while using Autopilot.


It's not about how easy it is to put your hand on the wheel so that it can detect a bit of force (you can use a bottle of water to do that).

The difficult part is to stay focused when doing that. If you spent 30 minutes driving to work on a highway, and your car was able to drive itself every day for 2 months, how likely are you to take any reasonable action in the 3rd month when something unexpected happens?


I think this undersells how easy it is to disengage attention when you are actively driving. The task of driving (especially your commute route) is very mundane.


The first instinct when you are moving above 50 kph on a highway and you see trouble is to hit the brakes. I'm not sure you can forget that. Braking will pull you out of AP mode.


So the problem is the human attention span? Cars without any always-on computer aids such as automatic braking must be really dangerous then.


Driving a car is absolutely the most dangerous activity most people regularly participate in. I don't think you're saying anything controversial there.


Right so any time a computer can take over is a good thing because computers don’t get bored.


Not always.

The computer focuses better, but isn’t yet as capable. When things get dicey and the computer abruptly hands control back to a zoned-out human, they won’t have the same situational awareness as if they had been driving all along. OTOH, driving also tires humans out, which leads to errors and accidents too.

Figuring out the right balance is non-trivial.


Maybe. It may be the case as other commenters in this thread have said that the automation actually lets them prepare for unexpected events more than normal because it handles the mundane steering and speed adjustments.


That seems very, very unlikely to me.

I can imagine that being able to relax on the highway makes subsequent high-stress city driving a little more bearable, but people are really bad at going from zoned-out to action mode quickly.


I'm challenging the assumption that driving aids cause drivers to "zone out" any more than normal driving. Driving automation has been common for decades in the form of cruise control. I don't see anyone making the claim that using it makes you "zone out". Auto Pilot is just an evolution of this that handles the steering too. Drivers still need to drive and be engaged.


That doesn't follow, because making the human supervise the computer as it takes over rather than actively participating increases the risk of them getting bored and losing attention.


Do we know that though? Is a distracted auto pilot driver more dangerous than the average person in a car without any aids?


> In reality, once you are used to it, it’s actually quite easy to keep a hand on the wheel while using Autopilot.

...while you do something other than pay attention. Like take a nap or watch a movie (just search YouTube for Tesla nap).


I drive on AutoPilot every day. The shit I see other drivers doing on the highway blows my mind.

Yesterday a guy was trimming his toes. Bare foot up on the dash and just going at it. Can’t. Unsee.

Approximately half of drivers are holding a cellphone on their ear, or yawning with a look of utter exhaustion on their face, occasionally rocking out to some great tunes, which is fun when they catch you watching.

None of it is OK, but it is very clearly a universal issue much bigger than AutoPilot.


100% agree. Almost as bad for me was a recent conversation with a Tesla owner who said he drives his Model 3 to bars instead of taking Uber since it can take him home when he's drunk. The level to which people are abusing the system is phenomenal.


I fail to see how this isn’t a step forward, though. Given that there will always be a subset of people who drink at bars who will drive themselves home, then the availability of things like Autopilot can only help to reduce overall the number of accidents that occur as a result.


> Yesterday a guy was trimming his toes. Bare foot up on the dash and just going at it. Can’t. Unsee.

I had that guy as a Lyft driver once.


> So it's steering... but you're supposed to leave your hands on the wheel anyway?

I do something like that with cruise control. I enable it but I keep my foot resting on the accelerator just in case I have to brake suddenly. I'm afraid that if I keep my foot somewhere else, when my reflex kicks in, I won't hit the brake pedal.


> So it's steering... but you're supposed to leave your hands on the wheel anyway? But not actually steering, just resting there just in case? I'm not sure how realistic this is in terms of human behavior.

I have an Acura, so no autopilot, but it allows my car to steer, accelerate, and brake automatically within its lane. Also, if I don't apply control to the steering wheel (let alone have my hands off of it) for more than N seconds, it will warn me that it will disengage. The value is then more safety (the car keeps the lane by itself if you don't bother steering for some reason), which does reduce some fatigue, I guess.


I was worried about the “diffusion of responsibility” issue, so I rented one on Turo for the day and drove around 200 miles on freeways in Northern California. The user interface is really good, you learn pretty quickly the right amount of force to hold the wheel - then you can let it “autosteer”, but taking control is instantaneous, you just make the movement you want and the car disengages autosteer instantly. I probably took over from the car around 10 times that day, mainly in the left-hand lane with the left edge of the freeway marked with 2-foot-high white concrete barriers, it felt like it wanted to go into them. But that might just have been the car trying to drive “properly” in the center of the lane, while I tend to hug the right side in those situations to have a greater margin of error on the left.

It definitely reduces the cognitive load, letting you drive better in my opinion. But there isn’t an idiot-sensor in the car, so like every technology, there will be people who abuse it. The interesting question is whether those same idiots would be safer in a regular car, or if, on balance, the “autopilot+idiot” combination is safer - that seems like an empirical question to me.

Even at the current level, it’s a great advance on a regular car (I have no idea how it compares to other expensive cars with various assistive technologies, I usually buy super cheap cars...). I also think that if fully autonomous driving is technically possible, then Tesla will be the one to do it - they have several hundred thousand drivers giving them training data in all sorts of weird edge cases every day.

It really is worth watching the whole of the recent investor event at Tesla and then deciding what you think of their technology: https://www.youtube.com/watch?v=Ucp0TTmvqOE


> in the left-hand lane with the left edge of the freeway marked with 2-foot-high white concrete barriers, it felt like it wanted to go into them. But that might just have been the car trying to drive “properly” in the center of the lane, while I tend to hug the right side in those situations

I do this too. It is actually really good at holding the lane center, probably better than I am unless I'm really focused. But, it is more serene about the barrier than I am so sometimes I'll flinch away and end up driving again.

I was really impressed in the heavy rains we had a few weeks ago. At night in the storm I could barely make out the lanes, but the car had a solid picture of not just my lane but the adjacent lanes as well. Whatever filtering they are doing is impressive.


Tesla is betting they can do full self driving without LIDAR. It isn’t clear that this bet will pan out.


That just goes to prove what a misnomer Tesla’s “autopilot” is, since its core features are just lame assist.


The same thing is true with an airplane autopilot, which is an even lamer assist. It actually isn’t a misnomer at all, if we go by what autopilots have actually done historically.


Ahh the old Tesla and orange trick https://youtu.be/rOST5S06F-o


IMO stuff like this should be subject to the same safe disclosure procedures as security vulnerabilities. You know a bunch of idiots are going to try this after watching this video and endanger other drivers in the process.


Similar thing people do with Passats with lane assist: https://www.youtube.com/watch?v=QV_FcG5NvUU


All very amusing, except that it no longer works.


It's OK but they have to tell you not to.


It’s “Ok” until you’re involved in a fatal accident and Tesla blames you for not having your hands on the wheel. Doesn’t matter that Elon himself has demoed it hands off, each of the autopilot fatalities was the fault of the driver.


Did you know there are 1.5 million auto-related deaths per year around the world? And in 7 years, around 23 associated with Teslas? And none of the autopilot accidents affected other people?


How many times does it have to be repeated, those stats compare apples and oranges. Luxury, modern, new vehicles driven by statistically more wealthy drivers have a much lower accident rate than most cars. When corrected for these factors Tesla’s safety rating is abysmal.

None of the autopilot accidents affected other people? I guess if by affected you mean fatalities, but 2 of them smashed into tractor trailers, so that definitely affected others. There are multiple reports of Teslas on autopilot running into stationary rescue vehicles as well, thank goodness they haven’t been fatal.


"Luxury, modern, new vehicles driven by statistically more wealthy drivers".

As far as risk aversion goes, I would say this group and Tesla owners are near opposites. Add in the completely different torque and braking behavior, and a completely different cockpit experience.


Correct. And yes, it is really dangerous and has led to countless unnecessary crashes and several deaths.

Humans are not capable of suddenly being required to take control of a vehicle with a split second's notice.

Hopefully the laws mature enough to prevent these from being road legal.


If Tesla's autopilot were available to all vehicles for use only on highways (which is where most Tesla owners use autopilot anyway), the number of highway deaths due to inattentive drivers causing rear-ending accidents would drop to near zero overnight. It is so vastly superior to distracted humans on a highway that it is a "seatbelt" type revolution in terms of human safety.

But Tesla doesn't seem to push this specific use-case-restricted angle of it; Musk is focused on full self-driving.


That is nothing unique to Tesla.

If they were serious about full self driving there wouldn't be any reason to bother with autopilot.


I'm amazed that you can agree it would drop rear-endings to near zero, and then immediately say there is no reason to bother with it.

Do you think a level 4/5 system would be done already if they hadn't made a level 2?


That's not what I said.

You can have automatic braking without having such an unsafe system.

Other brands do...

It isn't ethical, not that I expect them to care. But I don't think a sane company would play so loosely with people's lives when, once they launch something with full automation, they will be fully dependent on complete trust.


"Cruise Control" doesn't mean I can ignore the ~3200lbs of sports car I drive, just like "Tesla Autopilot" doesn't mean drivers can ignore the ~5000lbs of battery under them.

As an American who spends considerable time driving cars in various performance events (even some fun wheel-to-wheel endurance stuff), I think I can safely estimate at least ~25% of drivers on the road today probably shouldn't be. Tesla is making progress, and regardless of whether they survive these various courts of public opinion, they're helping to lay the groundwork for safer roads by forcing the industry to integrate more and more automated safety systems.

There will be problems. Computers will fail, mistakes will be made, and people will die, but I'm confident the break-even point on those human costs is already pretty far behind us.*

*That said, we already have onerous and terrible auto regulations in this country. If we (as a society) want to force companies to be more cautious in this area, well, it's entirely within our regulatory structure's power to do so.


>Tesla is making progress, and regardless of whether they survive these various courts of public opinion, they're helping to lay the groundwork for safer roads by forcing the industry to integrate more and more automated safety systems.

They're also wilfully misrepresenting the quality and safety of their technology to their customers, presenting intentionally misleading statistics to the media about it, and attempting to suppress all criticism by insisting it's part of some big oil / short seller conspiracy.


If you go to certain subreddits or other similar forums, you will see that there is a tremendous amount of misinformation associated with Tesla, Tesla Auto Pilot, driver assist technology, and Full Self Driving technology in general. I largely blame the absurdly irresponsible misinformation campaign spearheaded by Tesla and Elon personally. It's awful for the public's understanding of a nascent and potentially game-changing technology to start off on such a terrible note.

Some of the most common misconceptions I've seen include:

1. Level 5 Full Self Driving tech is right around the corner, with government regulation being the biggest show stopper. Elon promised "feature complete" of FSD by end of this year, and claims that within 6-12 months of that full level 5 self driving will be ready to ship to consumers.

2. Tesla is unique in their usage of computer vision in their FSD stack, and the rest of the industry only uses LIDAR. In reality almost everything Tesla does, Waymo and others also do, and very likely do better; on top of that, they use LIDAR.

3. Since companies like Waymo employ pre-mapping and LIDAR, what they are building is supposedly a completely hardcoded solution to driving with a very low degree of versatility and flexibility. I've seen Tesla fans suggesting that Waymo handles stop signs by hardcoding the location of each stop sign in the map.

4. Because Auto Pilot, a level 2 driver assist system, is being shipped in production cars and is racking up millions of miles, Tesla supposedly does not need any actual comprehensive testing of their FSD stack. I've seen many Tesla fans arguing this is why Tesla has done more testing and gathered more "data" than companies like Waymo, when in reality Tesla's FSD system has seen almost ZERO actual testing anywhere, which puts them far behind competitors.

5. OTA updates make Auto Pilot safer than competitors. In reality OTA very often enables less stringent QA and less rigorous regression testing. The death of the Mountain View driver at the 85/101 on-ramp was the result of a terrible AP regression that failed to handle concrete barriers that it used to handle.

6. Auto Pilot is already safer than a human driver because it has fewer accidents/mile than the national average. This one is especially egregious, since Tesla compared active AP safety data against all motor vehicle accident data, which includes motorcycles and cars of all ages, driven by drivers of all ages, employing safety tech (or lack thereof) of all levels, driving in all conditions. Whereas active AP data is gathered from Tesla cars that are newer than 5 years old, driven by older, wealthier drivers, employing crucial safety tech such as automatic emergency braking, and only engaged during favorable conditions (mostly highway driving, clear weather, etc.). When compared against data from cars with AEB, Auto Pilot has a very comparable safety record.
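(To make this last point concrete, here is a toy calculation with entirely made-up numbers showing how a raw per-mile comparison across mismatched populations can flatter AP:)

    # Made-up numbers illustrating the confounding in point 6.
    # New cars with AEB, driven on highways in good weather, crash less
    # per mile regardless of who makes them.
    fleets = [
        ("Autopilot miles",     1e9,  300),   # hypothetical: new cars, highway, clear weather
        ("All vehicles",        3e12, 6e6),   # hypothetical: every car, every condition
        ("Comparable AEB cars", 5e11, 2e5),   # hypothetical: new cars with AEB
    ]
    for name, miles, crashes in fleets:
        print(f"{name}: {crashes / miles * 1e6:.2f} crashes per million miles")
    # Against "all vehicles" AP looks roughly 7x safer; against the
    # comparable AEB fleet the gap nearly disappears.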

Elon has always had a tendency to be fast and loose with facts/predictions/promises, and usually I view them as entertaining at best and disappointing at worst, but with safety-critical features like this, what he's doing makes me feel borderline angry. A few major mistakes will set back the industry by years, and as of now, Tesla is literally the only driver assist system that has a fatality count. I honestly cannot understand how any engineer with a good conscience can still work on AP/FSD at Tesla and sleep well at night.


Your statements seem to have quite a slant as well.

> 3. [..] I've seen Tesla fans suggesting that Waymo handles stop signs by hardcoding the location of each stop sign in the map.

I'm sure you'll see uninformed individuals behaving uninformed. But in my experience this does not represent the average of e.g. the r/tesla community. It's surprising to have you specifically mention Waymo, since it was mostly others (especially German car makers) who usually bring up how important high precision maps (or more recently "5G", while we're mentioning buzzwords) are for self driving. My first Google result right now was from Lyft [0]. I can totally understand how people less informed on the topic can get that wrong.

> 4. [..] when in reality Tesla's FSD system has seen almost ZERO actual testing anywhere

What? The usual argument I remember is Tesla outsourcing the testing to its customers and the many good reasons against it. But how can you in good faith represent the Tesla AP as untested after it has driven millions of miles? Or are you specifically talking about whatever final L5 FSD solution Tesla might come up with one day? Since this obviously cannot be tested yet, since it doesn't exist.

> 5. [..] The death of the Mountain View driver at the 85/101 on-ramp was the result of a terrible AP regression that failed to handle concrete barriers that it used to handle.

Can you source that being caused by a regression? The best I've found [1] mentions someone else saying it got worse in a comparable spot, and I also remember reading that the killed driver had specifically complained to Tesla about the location where the crash happened, but not whether AP behavior there had gotten better in between.

But yeah, a lot of Tesla's communication around that case was horrible, and I also don't feel they do enough to make their fans and customers aware of the technology's current limits.

[0] https://medium.com/@LyftLevel5/https-medium-com-lyftlevel5-r...

[1] https://arstechnica.com/cars/2019/03/dashcam-video-shows-tes...


>Or are you specifically talking about whatever final L5 FSD solution Tesla might come up with one day? Since this obviously cannot be tested yet, since it doesn't exist.

According to Elon they will reach feature complete by the end of this year, but somehow that's being achieved without any real world testing and validation, and they are supposed to be shipping to customers by next year.


> Tesla is literally the only driver assist system that has a fatality count

Uber had a fatal self-driving accident.


Uber isn't selling their self-driving system to anyone, precisely because they know it's not safe and reliable yet.


...and, goalposts moved.


While you make some interesting points, you are playing fast and loose with the difference between predictions and promises.

Elon says he thinks development cars will have FSD by the end of the year and you take it as a "promise" that they will have it and a "claim" that it will then be in production cars shortly thereafter.

He couched his predictions with a lot of caveats that you are ignoring while you engage in disingenuously calling the predictions "promises" and "claims."


“I think we will be ‘feature-complete’ on full self-driving this year, meaning the car will be able to find you in a parking lot, pick you up, take you all the way to your destination without an intervention this year,” Musk said during a podcast interview with the money management firm ARK Invest, which is a Tesla investor. “I am certain of that. That is not a question mark.”


Yep. Not a promise by any understanding of the word. He’s just saying what he thinks.


He's not some random person. He's the CEO and driving force behind a company that is investing billions in getting to FSD.

If he "thinks" it is achievable with that short timeframe, he must have a project plan with milestones being predictably achieved in accordance with that schedule.

If instead there's no such plan, or the progress is not in line with prior internal expectations, he doesn't "think" it is achievable, he wishes it, and it is unlikely to be achieved by that time.

Whether it is a promise or not isn't the point!


He said very clearly what he thinks is achievable. Full self driving in dev vehicles. No argument there. The OP was extrapolating this to other levels. What exactly do you have an issue with?


Tesla recently did an accidents/mile comparison of their own cars with Autopilot engaged vs not engaged and found a noticeable decrease in accident rates for Autopilot driving compared to manual driving.

Some of that difference could be accounted for in the types of miles customers choose to activate autopilot on vs drive manually, but it's certainly a positive sign.


I would say virtually all of the difference is because autopilot is engaged when driving is easy and humans intervene often. Look at the forums and you’ll see tons of stories of shadow braking and lane merging problems.


Births in hospitals have, on average, more complications than births at home. Not because births at home are inherently better, but rather because potentially complicated births are done at hospitals to be able to better care for mother and child.

People use Tesla's cruise control in inherently safer situations. A proper comparison would compare identical roads, and would be done by a neutral body not involved with Tesla.


I can confirm I have seen Elon and Tesla Marketing do 'horrible' things. It's one of the least ethical mainstream companies.

If I could warn anyone: I've found our fellow 'highly educated' nerds fall for marketing, similar to how we see the uneducated masses fall for populist politicians. I believe this is because these 'nerds' don't work in automotive and have a limited understanding of 2019 automotive technology.


People also have limited reading and listening skills. They overlook context, caveats, and weasel words and their brains add in levels of certainty and promises that simply aren’t there in the original statements. Blame the people doing the misinterpretation, not the people who made very clear but nuanced statements about possible future developments.


> they're helping to lay the groundwork for safer roads by forcing the industry to integrate more and more automated safety systems.

They focus more on driver-aid systems that encourage not paying attention. Their automated safety systems aren't anything special compared to what most manufacturers offer.


I disagree.

Which other vehicles on the road actively steer your vehicle away from an imminent collision when it's safe to do so?

Tesla's blind spot monitoring doesn't just detect what's in your blind spot, like other vehicles, it also detects whether a vehicle is slowing down or accelerating into the "blind spot" adjacent to your vehicle.

Their autopilot and automated emergency braking don't just brake when the vehicle in front of you slams on its brakes; they also slow down when someone in front of that vehicle hits the brakes.

There's a lot of intelligent engineering there, and a lot of improvements that are pushing the industry to be safer and more responsible. There's a lot of negative press out there, but much of it is not based on fair statistics.


My Model 3 did exactly that this morning. I was in the act of changing lanes, expecting the car behind me in the target lane to either maintain speed or slow to avoid hitting me. Instead it sped up, my car alerted me, and as I glanced in the side view to see the car approaching, the steering wheel gently tugged me (us??) back into our lane, avoiding collision. I’ve never felt so grateful to an object before.


A similar situation happened to me a few days ago; my Model 3 braked on its own to avoid a collision.


Whereas, I presume, in your previous non-Tesla cars you were regularly getting into pileups?


How reductive. Let's try looking at it as getting closer to accidents before managing to fix the issue. So now with this system, one-in-a-million odds of that type of accident might become one-in-ten-million odds.


My point, ill-made, is this is anecdotal data and therefore useless in evaluating anything.


To be better understood, you could try directly stating your point instead of making a snarky remark that requires interpretation. :) as far as anecdotes, that’s all I have to offer on this topic. Feel free to ignore them if they don’t help you in evaluating something.


Uhh, my Audi did exactly that. Blind spot monitoring was adaptive to the speed differential. My current car uses brake light detection to assist with braking. These are not features unique to Tesla.


A Tesla has a radar with 160m range, plus an entire camera vision system with something like 70 TFLOPs of computing power. Comparing that to simple brake light detection is disingenuous. Did your car also autonomously steer away from the oncoming vehicle?


> Their automated safety systems aren't anything special compared to what most manufacturers offer.

When I'm trying to figure out which of two really vocal, opposed camps might be more right on some issue, I look for the really hyperbolic statements that are easily refuted but still repeated.

2016: A Tesla's Forward Collision Warning System warns the driver about an impending collision he can't even see yet and starts applying the brakes. https://techcrunch.com/2016/12/28/watch-teslas-autopilot-sys...

2016: A Tesla automatically alerts the driver and swerves to the right to avoid a collision. https://www.inverse.com/article/14393-tesla-autopilot-avoidi...

2018: A Tesla's automatic emergency braking system makes it the only car in an intersection that isn't hit by someone running a red light. https://insideevs.com/news/342781/watch-this-tesla-model-3-s...

2019: A Tesla gets rear-ended and then swerves around the vehicle in front of it and pulls safely off to the right. https://electrek.co/2019/05/06/tesla-autopilot-maneuver-avoi...

2019: A Tesla avoids a stopped vehicle on a busy road in icy conditions with extremely poor visibility. https://www.teslarati.com/tesla-model-3-automatic-emergency-...

There are so many different reports like this that searching for them is boring.

So which manufacturers do you have in mind when you say that "most" of them are offering comparable safety systems?


You'll find that indeed most manufacturers offer AEB, forward and rear collision avoidance, many of them offering side collision detection. Subaru, Volvo, Toyota, Audi, Mercedes, for ones I know for a fact. I'm more unsure as to why you think other manufacturers don't do these things.


Someone who hasn't bought a new car in the last couple of years probably wouldn't be aware of these features. Also for some reason car dealers, at least in the US where I've been buying cars recently, don't seem to highlight these features too much. Perhaps because they're not a way for the dealer to make more money. It can also be quite hard in my experience to get a straight story on exactly what the safety capabilities of a given car are. E.g. will it just warn of an impending collision or activate the brakes? Does it drive the steering system on lane departure or just warn? When buying a car recently for my new driver son to drive I had to pretty much try driving it into things to verify the features were implemented as expected and in one case (active steering on lane departure) I didn't know the car had the feature until after taking delivery.


I think a lot of car sales people are tech-averse and simply not interested in, or prepared for dealing with the faster consumer cycles of cars.

I recently did a test-drive and the guy maneuvered the car out of the dealer lot by driving backwards and looking out the rear window, completely ignoring the fantastic surround camera and sensor view that popped up on the huge display in the car... Great job selling that feature, buddy.


One reason is that other manufacturers are extremely conservative when it comes to the marketing of these capabilities, in comparison to Tesla who is overpromising a lot.

Another reason is that Teslas are "cool", and get a lot of press in non-automotive circles, which means that a lot of people think they were first with a lot of technology, which simply isn't true.

And a possible third reason is that quite a lot of tech people have been researching and toying with the idea of getting a Tesla, but those same people have never researched similar vehicles from other manufacturers, so they have no idea how they compare.

I had a 2014 Mercedes E-class; its self-driving capabilities were on par with a 2014 Tesla's. However, Tesla offers over-the-air updates of their vehicles, so existing Teslas got better while my vehicle stayed the same. That's a point they should get credit for and something they actually do better than most other manufacturers.


Individual reports are not data.

https://www.iihs.org/topics/advanced-driver-assistance

"study found that systems with forward collision warning and automatic braking cut rear-end crashes in half, while forward collision warning alone reduces them by 27 percent. The autobrake systems also greatly reduce rear-end crashes involving injury."


"Autopilot" is short for "automatic pilot" which is defined as "a device for keeping an aircraft on a set course without the intervention of the pilot."

If their "Autopilot" needs any sort of human supervision, either it is faulty or Tesla's marketing is misleading. Either way, in case of an accident, it's Tesla's fault: the name of the feature strongly implies that no human intervention is necessary, and talking about it with many pilot friends, they all agree.


>Tesla also pointed out: “... If IIHS is opposed to the name ‘Autopilot,’ presumably they are equally opposed to the name ‘Automobile.’ ”

Seriously? This reads like internet trolls. But it's from a public company.


From the CEO who called a hero rescue diver “Pedo guy”


>> If IIHS is opposed to the name ‘Autopilot,’ presumably they are equally opposed to the name ‘Automobile.

My best attempt to understand this cryptic comment is as saying that Tesla intends "Autopilot" to mean a system for "piloting" an "automobile", unrelated to the common use of the word in aviation, where it's an "automated pilot(ing system)".

Even if that is the case, they should be aware of the potential for confusion with "automated pilot" and for the risk that this confusion entails.


The paragraph above it clearly states that the users are trained on how to engage and disengage the system. If consumers have a good understanding of the system and still feel that it is better than the competition, how can it be an issue of mislabelling?


Your comment is very misleading. You are referring to a statement by a Tesla spokesperson:

> “Tesla provides owners with clear guidance on how to properly use Autopilot, as well as in-car instructions before they use the system and while the feature is in use,” a Tesla spokeswoman said Friday. “If a Tesla vehicle detects that a driver is not engaged while Autopilot is in use, the driver is prohibited from using it for that drive.”


So are they not providing the training? And if they are not, I would accept that my comments are misleading.


The percentage of drivers who think it's safe to nap is small but nonetheless frightening. What are these people thinking?


They are thinking the same thing as people who drive drunk. People can be idiots.

You have to actively disable safety measures in order to get the car to drive for any kind of extended period without applying force to the wheel.

Anyone caught doing this intentionally should be cited for impaired driving same as a DUI.

Then there are the people who legitimately fall asleep at the wheel and AutoPilot saves their lives, which is a different matter.


> They are thinking the same thing as people who drive drunk.

They're worse. People who drive drunk don't necessarily make a fully conscious decision to do that. Their judgement is already impaired.

But if they think that sleeping with autopilot on is reasonable, that's 100% on them. It's something they decided ahead of time.


Lots of studies show that people who are sleep deprived are just as bad with judgement.


That's a different question though. The people were asked if they considered it safe in general. They were not in a situation when they were sleep deprived (hopefully).


Yeah, the first time I drove one, I was holding the wheel at the bottom, with two hands, but not tight enough. The car gave me 3 warnings, then disabled autopilot for the rest of the journey. I had to get off the freeway, stop, get out and lock the car, then it would let me use autopilot again...


Your mistake was that it detects rotational force, not your grip.

Rotating the wheel just a nudge with one finger is enough to let it know you're there. You can also rotate the volume or cruise speed adjustment wheels -- sometimes I will tweak one up and down a click (meaning you end up with no change in setting) instead of rotating the wheel.


This is the percentage of all drivers who think it's safe to sleep with Tesla autopilot, not just Tesla drivers. People who actually own a Tesla will have seen the warning messages and will be familiar with the limitations of autopilot. The title of the study is very misleading.


well, they did name it 'autopilot'


I see the use of the "autopilot" name being brought up a lot. Thinking about what an autopilot in an airplane does, labeling Tesla's system autopilot is actually pretty apt.

It’s unfortunate that there’s a disconnect between what autopilot means colloquially vs what it actually does on a plane.


> what an autopilot in an airplane does

...Flies the plane safely with zero input from the pilot (hands off the yoke) indefinitely? Lands the plane on its own [1]? I'm well aware that autopilot systems need input directions for desired altitude and air speed, but the semantic distinction people try to draw between "real" autopilot and Tesla's autopilot capabilities has never made sense to me.

[1] https://en.wikipedia.org/wiki/Autoland


> the semantic distinction people try to draw between "real" autopilot and Tesla's autopilot capabilities has never made sense to me.

A typical airliner autopilot flies along a list of 3D waypoints and manages airspeed to avoid falling from the sky or overstressing the airframe. It's a fairly trivial system, since it does not have to find its way around a complex, constantly changing environment like a car does. It will happily fly into mountains, other airplanes, or dangerous environments like thunderstorms.

All decision-making depends on pilots. The autopilot itself is nothing more than a simple cruise control that relieves pilots from things like manually maintaining a constant altitude over six hours on a transatlantic flight. Any hobbyist with a Raspberry Pi can build a similar GPS-based waypoint-following and speed-scheduling system for a car (see the sketch below), but it's useless for road traffic, because roads are not straight, long, empty stretches that can be navigated by driving from one waypoint directly to another waypoint 100 miles away.
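(A rough sketch of that kind of trivial waypoint follower, to show how little is involved; everything here is invented for illustration and assumes positions on a flat local plane.)

    # Toy GPS waypoint follower: steer toward the bearing of the next
    # waypoint. Adequate at cruise altitude; useless on a real road.
    import math

    def bearing(cur, target):
        # angle from current (x, y) position to target, in radians
        return math.atan2(target[1] - cur[1], target[0] - cur[0])

    def steer_command(heading, cur, waypoints, capture_radius=50.0, gain=1.0):
        if not waypoints:
            return 0.0                        # course complete: hold heading
        if math.dist(cur, waypoints[0]) < capture_radius:
            waypoints.pop(0)                  # waypoint reached, move to the next
            return steer_command(heading, cur, waypoints, capture_radius, gain)
        error = bearing(cur, waypoints[0]) - heading
        error = (error + math.pi) % (2 * math.pi) - math.pi   # wrap to [-pi, pi]
        return gain * error                   # proportional steering command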

Car autopilot must be able to make decisions (follow the road, react to obstacles) to be useful, and that makes it fundamentally more complex than any existing airliner autopilot.


Real plane autopilot is basically lane-following.

Tesla autopilot is theoretically capable of indefinite highway driving and emergency 'landings' (on the shoulder).

Pretty close in actual capability. The difference is the environment.


I think what you're missing is that Tesla's Autopilot is not done. It's clearly labeled "Beta" when you enable it in the car.

The goal of Tesla's Autopilot is to take you from your garage to your parking space with zero input. Therefore it's exactly the same as a plane's which takes you from runway to runway.

Just because it's a work-in-progress doesn't mean it's named wrong.


>Safely with zero input... indefinitely

It is very possible to hit people in the air if you aren’t paying attention

>Lands the plane on its own

Autolanding is not autopilot, it’s autolanding.


You can safely take a nap while flying a plane on autopilot, the device is more than capable of navigating a course with no human supervision for extended periods of time... which is not possible with the same technology on the ground, with all the trees and stray children running around.

So while the technology may be the same, calling it "autopilot" is not; in the one case it can run on auto, in the other it can not.


From Tesla's Autopilot page, during the first 5 seconds of their video demo[1]:

> The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself.

[1] https://www.tesla.com/autopilot


But that's their full-self driving software that hasn't been released yet.


That's a strange claim without any evidence to back it up.

It's the first video on their "Autopilot" page. There's absolutely no mention of features that don't exist yet in the video.

You have to scroll all the way to the bottom of the page for any mention of full self-driving. I'd believe your claim if the video was under the headline of "Full Self-driving", but it isn't.


I guess the title doesn't show up.

When you pull up the video on Vimeo, the title is

"Autopilot Full Self-Driving Hardware (Neighborhood Short)"


"Hardware."


Do you think pilots sleep while the passenger jet you're riding in is on 'autopilot'?


This is just as snarky as your comment: Do you think car drivers are getting the same years of training on a single vehicle before they can even co-pilot it?

How autopilot actually works in airplanes is unimportant, the public perception and Tesla's intentionally ambiguous play with the word is what counts.


Yes, absolutely [1].

[1] https://www.bbc.com/news/uk-24296544


They don't "sleep" as in regularly take planned naps due to trust in autopilot. They fall asleep, causing danger to passengers. Same thing happens on the road to drivers without autopilot - they just rarely get to talk about it afterwards.


People are more likely to fall asleep if they think they have a safety net. Also, autopilot automatically disengages when no input is detected on the steering wheel so if you fall asleep it’s going to turn off.


I don't think the general public understands what autopilot is and what it does.

The perceived definition is far from the real definition.


It's called marketing, and good marketing at that.


My impression is that most non-pilots think that autopilot is a lot more sophisticated than it really is.

People hear things like "planes can basically land themselves" which is sort-of-true, and they think the planes can actually land themselves when the reality is that the pilots need to set up ILS properly, make sure the engine thrust is at a reasonable level, do the proper communication with Air Traffic Control, and only then (maybe) can the plane land itself.

Note: I'm not actually a pilot, so my details on landing with ILS might be a bit off.


Flying a plane in a straight line is much simpler than driving on a road. There is a reason why most aircraft accidents happen during landing or takeoff, once at cruise altitude there is very little that could suddenly go wrong.

That's why a relatively simple mechanism can autopilot a plane, but the same mechanism will crash a car in an instant.


I don't think the general public makes fine distinctions like that.

(for instance, I couldn't help but think the "sex" Bill Clinton was grilled about could not have led to pregnancy)


The difference is the jet often has 2 pilots, so they can take turns napping.


So the obvious question is: how many uneventful naps do we need to take before we declare that it actually is safe?
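(There's a standard back-of-the-envelope answer, the "rule of three": after n trials with zero failures, the 95% upper confidence bound on the per-trial failure rate is roughly 3/n. A quick illustration:)

    # Rule of three: n uneventful naps with zero crashes bounds the
    # per-nap crash probability at roughly 3/n (95% confidence).
    for n in (100, 10_000, 1_000_000):
        print(f"{n:>9} naps, 0 crashes -> rate < {3 / n:.6f} per nap")
    # Even a million uneventful naps only bounds the risk at ~3 per
    # million naps, so uneventful anecdotes are weak evidence of safety.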


Nonconsensual social-darwinism, sadly


I know that the name is a catchy one, but 'Autopilot' is a troublesome name.

I think a better way forward would be to do something fun with the 'Autopilot' name, e.g.:

"Zanzibar (formerly known as Autopilot)"

Nobody need know what "Zanzibar" actually is other than some place a long way from California. You could easily talk about your Tesla to a non-owner saying 'yeah, it's got Zanzibar...' or to a fellow owner 'did you spec the Zanzibar level 5 option?'

I picked the name 'Zanzibar' randomly here, it could be any place rather than any thing. If Autopilot was rebranded as 'celery' then someone would expect that in the glovebox. A brand name that means nothing and can be nebulous would be best.


GM has the right name with Super Cruise. It works just as well and in some cases better than Tesla Autopilot, but the perception of the public is that it's worse because it doesn't have the inflated name of 'Autopilot'.

* Now that I look at it again, it is actually more 'auto' than Autopilot, since they do not require your hands to be on the wheel, but they do track your eyes for attention to the road.


I mean that's a perfect choice as long as your marketing strategy isn't to hope that people pay extra for the feature thinking that it's FSD. The fact that Tesla still hasn't done something like this is proof that they're kind of relying on people to think that autopilot is something that automatically pilots the vehicle, which it doesn't.


I'm confused why there is such a dissonance on the definition of "autopilot." Most people are fully aware that commercial planes have at least one human pilot in addition to an "autopilot," and I think the passengers on the plane would be very unwilling to fly if the human pilots weren't present. "chauffeur" seems like it would be misleading for partial self-driving software, but "autopilot" seems spot on: helpful, but not good enough to replace the human.


Yes but it is a battle they have lost. Sometimes common sense does not prevail. Particularly in the media.


While it's true that with current roads the self driving car software solution may not converge to an acceptable solution, it's also true that as the software gets better municipalities will start to modify their streets and roads to assist the software. At that point the combined solution will very likely converge to a "better than human" result.


"Survey participants were asked questions including whether they thought it was safe to take their hands off the steering wheel"

I think this is misleading; Autopilot never claimed it's safe enough to take your hands off the steering wheel. To this extent, of course, there is no difference from other solutions.


What's misleading? If they think it, they think it, and the intentional implications of advertising are just as important as the words it uses.


Tesla itself used to only mandate that you touched the steering wheel every quarter hour.



Reminds me of the YT videos of people caught sleeping in the drivers' seats of their Teslas on their way to work. It's a behavior that should be possible but isn't yet assured to be safe enough to be allowed. #Affluenza


It isn’t safe at all, please please no one ever do this. People have died on autopilot and sleep may have been involved. Just because your car won’t kill you for the first hour of sleep does not mean that it won’t at some point over the next few years. Auto accidents are actually pretty rare things to individuals and it’s very easy to normalize extremely unsafe behavior.


It's off-putting when a technology promises too much but cannot deliver without constant monitoring. You feel like you're training an autistic taxi driver.


Tesla says not to take your hands off the wheel, has built safeguards around this, and has made it clear the system is not at that level. The fact that people still do so is not Tesla's fault.


Weird, when I go to Tesla's Autopilot page[1], I'm presented with a video demo where the driver has his hands off of the wheel. The video has the following caption:

> The person in the driver's seat is only there for legal reasons. He is not doing anything. The car is driving itself.

[1] https://www.tesla.com/autopilot


That's to show FSD, a different feature. If you were to ask, they have been very clear about what it does now and what it should do in the future.


That's a stretch. The video is directly under the headline of "Autopilot". Absolutely nothing on the page suggests that the video is demonstrating features that don't exist yet.

Both the Center For Auto Safety and Consumer Watchdog[1] have filed claims of deceptive advertising with the FTC because of this Autopilot video. It's really reaching to claim that the video is demonstrating anything other than Autopilot.

[1] https://www.consumerwatchdog.org/news-story/consumer-groups-...


The video on that page makes no mention of FSD. The video is separated by approximately 7 content blocks from any mention of FSD.

I could completely understand a reasonable person thinking this was a demonstration of Autopilot, given it’s the primary content block on the /autopilot landing page.


There are plenty of video interviews where Musk himself shows using auto pilot with his hands off the wheel.


It's even on their official sites demo video! As shared in another comment https://www.tesla.com/autopilot


It clearly states that the functionality in the video is coming in a future update. It's a video showing what the Model 3 hardware is capable of once the software and regulation catch up.


No it doesn't.

On the Model S page one feature is called "Autopilot: Future of Driving".

The Autopilot page then leads with a still shot of a driver not touching the wheel. If you press play it says "He is not doing anything. The car is driving itself".

Sure, about 2/3 page lengths further down, hidden in the middle of a paragraph, it caveats it. But unless you read the fine-print, it's advertising hands free driving.


> once the software and regulation catches up.

What regulation needs to catch up?


There is none.


Every single automaker demos their lane-keeping technology with hands off the wheel.


Show me.




