Hacker News

I own a Model 3 with FSD. One of the key realizations I had after a few months with it is that self-driving at the moment is like supervising a newbie teenage driver: you have to stay alert and watchful, but you don’t have direct control. When you are teaching a new driver, it’s worth it.

Honestly, it’s a lot less work to just do it yourself.

This is one of those engineering situations where the 0.0001% edge cases matter, and could lead to fatalities. I don’t know if any of the implementations are up to the mark. FSD isn’t.




>> I don’t know if any of the implementations are up to the mark. FSD isn’t.

Having talked to engineers in the industry over the years, I feel confident to say they are not.

It's a problem where the base functionality - following a well-marked lane - is actually f-ing easy, but every nonideality beyond that gets harder and harder. So people look at the base case and get all impressed. It really takes AGI to do it properly. Management looks at each failure as if "just one more fix" is all they need.
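To illustrate how thin the "easy" base case is: once perception hands you a lateral offset and heading error relative to the lane centre, keeping the car centred is a few lines of control. A toy sketch - the gains, names, and sign conventions here are made up for illustration, not from any real system:

```python
# Toy lane-centering controller. Real systems estimate these two inputs
# from camera lane lines; producing them reliably in every condition is
# the hard part the comment is pointing at.

def steer_command(lateral_offset_m, heading_error_rad,
                  k_offset=0.1, k_heading=1.0, max_steer=0.5):
    """Steering angle (rad): steer back toward the lane centre, clamped.

    Convention (illustrative): positive offset = right of centre,
    negative steering = turn left.
    """
    cmd = -(k_offset * lateral_offset_m + k_heading * heading_error_rad)
    return max(-max_steer, min(max_steer, cmd))

# Drifted 0.5 m right of centre, pointing straight down the lane:
print(steer_command(0.5, 0.0))  # -0.05, i.e. steer slightly left
```

The point of the sketch is what it leaves out: faded paint, snow, construction zones, alligators. The controller is trivial; estimating `lateral_offset_m` everywhere is the part that needs something close to AGI.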


I went to a conference focused on self-driving cars back in 2019. Even back then, the best talk was from a professor in Florida, whose main point was that you probably don't have an alligator crossing the road in your training data. The point being, you really, really, actually do need nearly an AGI to drive safely. The entire world is out there!


Not alligators, but FSD handling family of deer in the road here: https://youtu.be/fpoXr_z_6a4?t=636


Tesla probably does!


I have a Mach E and have used Blue Cruise.

That’s not how I would put it. I don’t feel unsafe with it. It doesn’t feel like a new driver at all; I trust it to do its job. I haven’t used FSD, but from what I’ve heard it sounds less trustworthy to me (mostly because it tries to do more in more situations; BC is very tightly controlled).

But its job is to stay in the lane and keep a good speed even in traffic.

When using BC, your job is just like with less advanced lane centering + cruise control: maintain situational awareness of what’s coming up and what other cars are doing. Additionally, I keep my hands on the wheel anyway to take over when needed, even though BC is hands-free.

I do find it a benefit. But it’s “just” a lane keep + radar CC system that’s a bit more advanced. Just like with every single other system on the market, you can’t take your eyes off the road. Ford enforces it. If it was still active, they couldn’t have been looking away for too long.


It annoyingly hugs the right side of the lane for some reason, especially when you're activating BC. Once it goes far enough right it works well. There's a few spots on my 20 minute commute that it likes to disengage because they repainted lines and did a shitty job but that isn't strictly the car's fault.


My car does that too, BC 1.0.

Newer versions, 1.2 and up, are supposed to be far better. Even moving over a bit in the lane when next to a big truck. That may be 1.3.

Everyone will be getting upgraded to 1.4. We’re just waiting for the OTA update. It’s overdue, but anyone who has followed BC so far knows that’s just par for the course.

History for those unfamiliar: the Mach E was shipped before BlueCruise was ready. It took 9 months or so before there was an update available at the dealer for it.

Eventually newer cars came with 1.2. Older cars were promised 1.2, but for various reasons it got delayed and a recall issue took priority.

Then they released 1.3 on new deliveries, and the promise became everyone gets 1.3.

That got delayed a bit and they were ahead on 1.4 so Ford decided to just move everyone to 1.4 as it was done early. But they hit problems on the rollout beta test for existing vehicles. Between that and seeming to give priority to the recent updates to work with Tesla superchargers, 1.4 missed the promised date of Q1.

It’ll come. And it’s supposed to be a very big improvement over 1.0, according to those who have tested it. But Mach E owners who follow the forums know how unreliable Ford software update dates are.


A Mach E slammed into a stopped car at full speed, killing the driver of the stopped car. If that doesn't give you pause, I don't know what will.


I never drove one of these, but I always wonder if we're already past the "way too old a person behind the wheel" point, where I'd rather see those people in a car driven by a computer than have them in control.

It becomes quite a discussion when a really horrible accident happens here in Germany, but since this population group loves to go voting and Germany is a car nation, it dies down pretty fast when the topic of mandatory tests for elderly people comes up.

It's the same problem with those who just got the license, of course, too.


The latest version 12 FSD beta seems significantly better, to the extent that experience with previous versions is less relevant. There's no real reason to expect these systems to get worse; they'll only improve. Today's videos versus those of even just a year or two ago show huge progress.

Not better than doing it yourself today in general, but it will be in more and more situations. https://www.youtube.com/watch?v=fpoXr_z_6a4 is illustrative of the current level of ability. A good example where the self-driving compares favorably with poor human driving is at 10:15 in the video.

There are some pretty major gaps (for example, limited ability to reverse and read signs), but I wouldn't bet on them not being resolved in the next five years.


I've tried FSD a handful of times during this month's beta and, while it's way better than when I tried it for a month a while back, it's at the point where I can very much see why the NTSB/etc are concerned about people trusting it too much.

It's generally really impressive and then suddenly very wrong. As in, it navigated a four-way stop with a pedestrian in the crosswalk flawlessly. And then, later, it was lining up for a left turn into a parking lot and decided that the oncoming truck wasn't relevant, so it spun the steering wheel very quickly and lurched forward until I mashed the brake.

I'm very hopeful that we can replace the mistake-prone meatbags that drive cars now, but I'm incredibly concerned with people on other forums saying that they trust it completely.


I also tried it out during this month's trial, and while it is convenient in certain scenarios, I wouldn't consider it production-ready.

If you treat it as pre-release and truly understand the risks, then it functions adequately. As a side note, I am pleased with the progress being made in this space; when I worked in that specific industry years back, it was nowhere near the functionality you have now.


It's clearly not at the point of reliability where it should be used without close supervision, but the progress is undeniable.

See 29:14 in the video I linked for an example of it behaving potentially dangerously, in what the video author describes as a regression in its behaviour and a similar situation to what you describe.


Will they only improve? Or will they get better at most things while old previously handled edge cases sneak back in without warning? There are so many situations that it needs to handle 100% correctly.


Iirc Tesla has a giant internal validation sets of edge-case-related disengagements they test against using a simulator whenever they’re gonna deploy a new version.
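The approach the comment describes can be sketched in miniature: keep a library of recorded edge-case disengagements, replay each one through the candidate software in simulation, and block the release if a scenario that used to pass now fails. Everything below - the names, the planner stub, the scenarios - is illustrative, not Tesla's actual tooling:

```python
# Hypothetical sketch of scenario-replay regression gating: a release
# is blocked if any previously-passing recorded scenario now fails.

from dataclasses import dataclass
from typing import Callable, List

@dataclass
class Scenario:
    name: str            # e.g. a recorded disengagement, replayed in sim
    passed_before: bool  # did the previous release handle it?

def replay(scenario: Scenario, planner: Callable[[str], bool]) -> bool:
    """Run one recorded scenario through the candidate planner in simulation."""
    return planner(scenario.name)

def gate_release(scenarios: List[Scenario],
                 planner: Callable[[str], bool]) -> List[str]:
    """Names of regressions: scenarios that used to pass but now fail."""
    return [s.name for s in scenarios
            if s.passed_before and not replay(s, planner)]

# Toy candidate planner that mishandles one old edge case:
flaky_planner = lambda name: name != "alligator_crossing"

suite = [Scenario("alligator_crossing", True),
         Scenario("deer_family_crossing", True),
         Scenario("unmarked_construction_zone", False)]

print(gate_release(suite, flaky_planner))  # ['alligator_crossing'] -> block deploy
```

The useful property of such a gate is exactly the worry raised upthread: old edge cases can't silently sneak back in, because the suite only grows and any previously-passing scenario that fails blocks the deployment.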


That’s good to hear. It’s a huge problem space, and seems like it’d be rather difficult to test completely.


Even with normal cruise control and adaptive cruise control, when there is a lot of traffic, it’s easier to turn it off and take control. All these features seem great when no one else is on the road, to give the driver a bit of a rolling break, but that’s about it.


That's funny, I only ever use adaptive cruise when I'm in a lot of traffic on the highway. At slow speeds it's not going to do anything and I'm pretty much just going to follow the car in front at low speed. I use lane keeping the rest of the time and it really reduces fatigue.


My issue in traffic is that if someone gets between me and the car in front of me, the car hits the brakes to make some room. I’ve had it slow down from 50 mph, and from 75. It’s dangerous: if the person behind me isn’t paying attention, they could easily hit me. If I ever have it on in traffic, my finger is on the switch to turn it off in case someone merges in front of me.

Of course, I use the max follow distance, so I have time to react if something happens and the car doesn’t do its job. I know some people who use the closest distance, which I guess would solve the merge issue, but would stress me out and if something happened, I’d be screwed.


Not sure what kind of car you have, but that’s the way my wife’s car works too — aggressive braking to keep the gap. My Tesla is the only car I’ve used adaptive cruise control where it doesn’t freak out and brake when you get cut off. Instead, if the car that pulled in ahead of you is traveling at a higher speed, it just lets the gap grow naturally.

Now, of course I’ve still had it brake at shadows on the road, so it’s not perfect, but sometimes it works like I’d expect.


A VW GTI was where I had it happen a lot. I don’t have it anymore. I assume my new car works the same way, but after the VW I haven’t given it the chance.

If someone merges while accelerating, the cruise control generally handled it ok. However, most of the time people merge, then speed up. During that first part, the car freaks out and slams on the brakes, and keeps on that course of action even as the car ahead speeds up. With the lead car speeding up, and mine going 50 when the new gap is correct, it ends up growing even bigger, so I then need to catch up. It makes me look like I can’t drive, lol.


And at slow speed in traffic jams, it plays accordion. When the car in front accelerates, it has a delay; when it decelerates, it has a delay… then it slams the brakes. If there’s one thing an automated system should be good at, it’s keeping a constant distance from the car in front, “constant” with the twist that it should be proportional to the speed.
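A constant time-gap policy is the standard way to get the "proportional to speed" behaviour asked for here: the desired gap is a standstill margin plus a fixed time headway times your own speed. A minimal sketch, with illustrative gains rather than anything from a production system:

```python
# Minimal constant time-gap car following: desired gap grows with speed,
# and a simple proportional-damped law on gap error and closing rate
# avoids the accordion effect. All parameters are illustrative.

def follow_accel(gap_m, own_speed_mps, lead_speed_mps,
                 headway_s=1.5, standstill_m=2.0,
                 kp=0.4, kd=0.8, a_max=2.0, a_min=-4.0):
    """Commanded acceleration (m/s^2), clamped to comfort limits."""
    desired_gap = standstill_m + headway_s * own_speed_mps
    gap_error = gap_m - desired_gap              # positive -> too far back
    closing_rate = lead_speed_mps - own_speed_mps  # negative -> closing in
    a = kp * gap_error + kd * closing_rate
    return max(a_min, min(a_max, a))

# Stop-and-go example: lead car 10 m ahead pulls away at 5 m/s while we
# are stopped, so the controller accelerates promptly instead of lagging.
print(follow_accel(gap_m=10.0, own_speed_mps=0.0, lead_speed_mps=5.0))  # 2.0 (clamped)
```

The damping term (`kd`) on the closing rate is what suppresses the delayed slam-the-brakes oscillation: the controller reacts to relative speed as soon as the lead car changes pace, not only after the gap error has grown.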


Oh, haha, this is a miscommunication. I meant stop-and-go traffic when I said traffic, haha.


I have only found one practical application for adaptive cruise control so far: when driving in tunnels where speed cameras strictly and vigorously enforce speed limits, hence 99% of the drivers behave and adaptive cruise control is safe to use.

Outside the tunnels, other drivers' erratic behaviour makes adaptive cruise not very useful, or too monotonous on long stretches of state and interstate highways, to the point that it lulls me to sleep.


Yeah, there is a lot of variation in interpretation here. I'm farther along the spectrum still: I'm on FSD essentially 100% of the time in highway environments now because I absolutely trust the car's attentiveness and precision more than my own.

Think back on how many times you've started a lane change and aborted because there was a car there, how many times you've looked down to navigate and hit the rumble strip, how many times you've had to rely on ABS to avoid hitting the car you didn't notice stopping in front of you, etc... Not once in almost three years and 40k miles in my Model Y has it done something like this. That's not to say it's never going to, but 100%, for sure, it's better than I am.


Think back on how many times you've started a lane change and aborted because there was a car there, how many times you've looked down to navigate and hit the rumble strip, how many times you've had to rely on ABS to avoid hitting the car you didn't notice stopping in front of you, etc.

I don't want to criticize your driving, but these are not common occurrences for me (i.e. the answer to most is 'never'), and I would be quite concerned if they were. Most are solved by maintaining awareness around you (including checking blind spots) and follow distance ahead of you.


This is BS. It takes a couple of seconds to check a blind spot. Same with most other things you look at: navigation, radio, whatever. The rest of traffic doesn’t wait around while you are focused on that one thing.


I think his point was that the time to check the lane over is before you initiate a lane change, not after. About the only reason to initiate a lane change without pre-checking where you're going is if there's an emergency (like the incident we're discussing) and then it's questionable whether it's better to do the lane change anyway...


Let's just say I've been astoundingly impressed at the quality of drivers on this forum. Come on, everyone stomps on their brakes for unexpected stuff, or has to jump back into the lane when it wasn't as clear as you thought[1]. You just code it, like almost all drivers do, as "someone else's fault", so it doesn't get filed away in your brain as something automation can fix. But it is, and I'm telling you straight up that it does.

[1] Seriously, you see this happen around you hourly (or frankly more often) in urban traffic. You really think none of those drivers count?


I think the answer for all this is one, and it's because a tire from the person ahead of me popped and he started swerving left, into my lane.

And I'm a bad driver (or an extremely careful/stressed one if you want to be charitable). All this should almost never happen.


Opinion: most of the replies to this comment are missing the bigger picture: FSD is trustworthy enough for highway WITH supervision.

I also use FSD on highways, and often on streets too. I find it’s very good when you adjust expectations. You still need to watch and be aware; but it’s absolutely less draining than driving yourself.

Excluding turns, I almost never interrupt FSD.

On turns, it’s not confident enough to go. Hoping v12 helps, but I expect a few more years to fully solve them. Even without turns it’s 100% worth it: I’m more attentive over the span of the drive with the car handling the basics.


Jesus Christ.

If _any_ of those things happen to you regularly, please do something about changing that before you kill someone.


Agreed. I've been driving for 30 years and I have never hit a rumble strip I didn't expect to, or had ABS activate. That is not normal!


There are two kinds of drivers: drivers who know they're fallible, and fallible drivers who don't. Maybe you're an outlier who doesn't make these mistakes, and if so I applaud your skill. But most likely you're just like the rest of us, and have the added handicap of self-censoring your mistakes as "someone else's fault". Data filtering is a hell of a drug, as it were.


I’m someone that checks my side mirror when I make a turn just to collect data on whether my tire touches the line.

So I agree with OP. If you consider mistakes like “not checking your blind spot and aborting a lane change” as a part of normal driving, I would have to say that you are not a good driver.

You don’t get to a point where you become proud of your work without keeping track of your mistakes and improving on them.


Well said. I try to be 100% attentive (spoiler: am not perfect) and rather than losing interest in “boring” driving situations I play a game of trying to guess what other drivers will do, and being very conscious of whether I am right.

It’s mostly a way to maintain alertness, but it’s also helped me discover patterns. Like, on city streets, if someone pulls out in front of you unreasonably and makes you get on the brakes, odds are good they are only going a short distance and will turn, likely necessitating braking.

But yeah, just throwing up hands and declaring that everyone probably makes the mistake is no way to improve, at anything.


I’ve had ABS activate like twice in my entire life. Agree with the person you’re responding to: if ABS activates regularly when you’re driving then you may not be a safe driver.


I think for this person FSD might actually be an improvement.


Your inappropriate mockery notwithstanding, that was exactly my point. I'm happy you agree.


Yes, I do agree. The hope is that full-self-driving systems will help people who should really not be on the road, negligent drivers, the elderly, those with impaired capacity from drugs and alcohol, etc.


ABS regularly engages if you drive anywhere icy.

It's not a skill issue if the ground friction is basically 0.


There's only one kind of bad driver: one who's convinced all their mistakes are normal reasonable mistakes that everyone makes occasionally.


Those things you listed are not "oopsie daisy" mistakes everyone makes, those are situations that arise from negligent driving.


I'm not a _good_ driver - the rumble strip one will happen to me about once a year - but I haven't had an "ABS prevented a crash" in well over a decade.


I love my adaptive cruise during traffic; frankly I think that's where it provides the most value. Stop-and-go is the most annoying thing. Having the car do it for me is fantastic.


This is why I consider adaptive cruise to be close enough to FSD. Steadying the wheel is the easy part, I just want the speed taken care of as that’s the stressful part for me.

That said, I’ve only had rentals. I drive a 2005 Tacoma that doesn’t even have cruise control.


Yeah I don't understand it either. They're features that seemed aimed directly at one customer: the guy who is already watching YouTube and texting while going down the interstate and just keeping a vague idea of where the lines and cars are.


With an LLM in there (if that even), one wonders if each software version deployed to a driverless / level [can't remember] car should literally have to pass a driving test.

I say this because in line with what you say about teenagers is that the driving test is about being safe-enough to not kill someone, not being a good driver.


> I don’t know if any of the implementations are up to the mark. FSD isn’t.

It is very clearly called "Beta" software.

You can agree or disagree if Tesla should be allowed to put that out there, but I don't think it should come as any surprise that software labelled as beta is not 100% flawless.



Gmail was labeled “beta” for like 15 years. It doesn’t mean anything.

And if it did, so what? You’re dealing with people’s lives. No one will care what Greek letter they used to describe it.


So get angry at the regulators, they are allowing it.


It is, in fact, possible to have an ill opinion of multiple bad actors in a system.

"Don't hate the player, hate the game" elides the choice to not play.


I am. I don’t think ADAS system should just be able to be blindly unleashed on the roads without some sort of standard testing for minimum safety/non dangerousness.

But above I was just arguing slapping a word on the end doesn’t release you from responsibility.


Listening to you, any flaws or incidents involving FSD are anyone's fault but Tesla's. People who don't understand 'beta', regulators, and so on.

Mind you, that's Tesla's belief too.



