
I don't get this. Isn't it nightmarish since no one has any idea when it will veer into some random object?


Users of Autopilot probably have faith in it, although in some cases (like the driver who ran into the concrete divider) that faith has cost them their lives.

I guess your mindset is different from the grandparent poster's. He thinks Autopilot is as reliable as a human chauffeur; to you (and probably to me too) it's like having a driver you can't communicate with, who might suffer an epileptic fit and fatally yank the wheel at any point, so you have to keep paying attention to his driving.

Interestingly, this mindset difference also exists within Tesla: the marketing department promotes one version, and the legal department states the other...


I hear a lot of this from people who've never tried Autopilot, and yet everyone who has tried Autopilot loves it.

My best guess here is that "babysitting an epileptic chauffeur" may sound scary, but in practice it's a lot less stressful than actually driving. Not having to handle lane-keeping and safe following distance yourself is probably a lot nicer than you might expect – you still have to monitor them, but that might even be easier when you're not forced to focus on the mechanical part of it.


> everyone who has tried Autopilot loves it.

I would like to bet the guy who hit the concrete barrier would disagree... sadly he's dead.


I'm not closing my eyes and going to sleep. I'm paying attention; I just don't have to work the gas/brake/steering wheel every single second.

MIT just did a study and found that the warnings and driver vigilance in Autopilot were sufficient to maintain safe driving.

https://hcai.mit.edu/human-side-of-tesla-autopilot/


It doesn’t do that.

And you're still driving – you just don't micromanage trivia (which, it turns out, is pretty exhausting). That frees up cognitive capacity to observe the road ahead at a higher level. You now see things you could have missed when focusing on low-level steering, especially when tired. You notice a weirdly behaving car ahead of time and take over. You see the road getting a bit more complicated, or congested, ahead and take over. If traffic is too heavy for Tesla's cautious lane changes or overtaking, you take over.


"when"?

It'd only be nightmarish if the driver (and passengers) had no faith in Autopilot.

.. in which case, I doubt the road-trip would be driven entirely on autopilot.


Not at all. The only time they ever "veer" is when the lane markings are lacking, which isn't very common. It is especially uncommon on the long, boring stretches of interstate that seem to drag on forever when driving.


It's extremely common to have worn-out lane markings in the snow belt. New paint only lasts one winter before the plows scrape off most of the reflective beads. Driving in rain in the dark then becomes a guessing game no machine can handle.



