
How do you drive into a solid wall if you have lidar, though? To say nothing of predictions: the object is where it is at that moment. You don't need to predict where it is now, because you already know where it is.



You can't drive if you only use the current "frame" of data as the basis for your decision. Imagine driving on the highway, a comfortable distance behind a lead vehicle.

Without a prediction that the blob of sensor data in front of you is going to continue moving forward at highway speed, the planning software would want to slam on the brakes. That motion prediction enables the planning software to know that the space in front of your vehicle will be unoccupied by the time you reach it.
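A minimal sketch of why that prediction matters, with toy numbers and a constant-velocity assumption (nothing like a real planner):

```python
# Toy illustration: why a planner needs motion prediction to keep following
# a lead vehicle instead of panic-braking. All numbers are made up.

def gap_after_horizon(gap_now_m, ego_speed_mps, lead_speed_mps, horizon_s):
    """Predicted gap to the lead vehicle horizon_s seconds from now,
    assuming both vehicles hold constant velocity."""
    return gap_now_m + lead_speed_mps * horizon_s - ego_speed_mps * horizon_s

# Highway following: 40 m gap, both cars at ~29 m/s (about 65 mph).
print(gap_after_horizon(40.0, 29.0, 29.0, 2.0))  # 40.0 -> space stays open, no braking needed

# If the planner instead treats the lead car as static (lead speed 0),
# the same 40 m gap appears to close in under 1.5 s and it slams the brakes.
print(gap_after_horizon(40.0, 29.0, 0.0, 2.0))   # -18.0 -> predicted collision
```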

A similar prediction error was the reason Cruise rear-ended the bendy bus in SF a while back. The software segmented the front and rear halves of the bus as two separate entities rather than a connected one, and mispredicted the motion of the rear half of the bus.


> That motion prediction enables the planning software to know that the space in front of your vehicle will be unoccupied by the time you reach it.

I think we're all on the same page about this part, but what's confusing and hilarious is: why would the correct answer ever be to drive into an unmoving object?

If they had tried to avoid the truck, swerved, and hit a different vehicle, there would be no confusion here. But the self-driving algorithm is effectively committing suicide (kamikaze). That's novel.

My guess is that the self-driving car was not able to recognize the truck until it was very close, and the sudden appearance of the truck was interpreted by the algorithm as the truck moving very fast. And the best answer in that case would be to let the truck pass (basically, do what the Waymo did).

But that means the lidar information about the shape not moving is being deprioritized in favor of the recognized object being calculated to be moving fast, a situation which could only really occur if a speeding vehicle had plowed through a stationary object.

Fantastic solution for avoiding a situation like this -> https://www.youtube.com/watch?v=BbjjlvOxDYk

But a bad solution for avoiding a stationary object.


Who said it was an unmoving object? Maybe I missed something in the story, but I got the sense that this happened in motion. The towed truck would have essentially been moving forward at an angle, hanging across the lane boundary.


It wasn’t a solid wall, it was another vehicle.

The brakes don’t respond immediately - you need to be able to detect that a collision is imminent several seconds before it actually occurs.

This means you have to also successfully exclude all the scenarios where you are very close to another car, but a collision is not imminent because the car will be out of the way by the time you get there.

Yes, at some point before impact the Waymo probably figured out that it was about to collide. But not soon enough to do anything about it.
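A toy time-to-collision check makes both halves of that concrete (the threshold and numbers below are invented):

```python
# Toy sketch: a simple time-to-collision (TTC) trigger. It has to fire
# seconds ahead of impact, yet stay quiet when a nearby car is keeping pace.

def time_to_collision_s(gap_m, ego_speed_mps, lead_speed_mps):
    closing_speed = ego_speed_mps - lead_speed_mps
    if closing_speed <= 0:
        return float("inf")  # not closing on the object, no collision predicted
    return gap_m / closing_speed

BRAKE_IF_TTC_BELOW_S = 3.0  # has to cover actuation latency plus stopping time

# Close behind a car moving with traffic: 15 m gap, both at 29 m/s.
print(time_to_collision_s(15.0, 29.0, 29.0))  # inf -> no emergency braking

# The same 15 m gap to something effectively stationary:
print(time_to_collision_s(15.0, 29.0, 1.0))   # ~0.5 s -> far too late to do anything
```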


My brakes respond immediately, for all intents and purposes, even as a human with a reaction time. And I'm at fault if the person in front of me stops and I don't have the stopping distance to avoid a collision.

I get that self-driving software is difficult. But there's no excuse for this type of accident.


That might be true for the simple case of following within a lane, although you only have to drive around to realize most drivers do not leave adequate following distance at all times, so it isn't a pure physics problem. And a good driver isn't watching only the car in front, but also the brake lights of the cars ahead of it, to help anticipate the lead car's likely actions.

But take an even slightly more complex example: you're on a two lane roadway and the car in the other lane changes into your lane, leaving inadequate stopping distance for you. You brake as hard as you safely can (maybe you have a too-close follower, too), but still there will be a few seconds when you could not, in fact, avert a collision if for some reason the car in front braked.
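Rough numbers for that window of exposure, assuming ordinary braking physics (all figures here are illustrative, not from any real incident):

```python
# Back-of-the-envelope: gap you need at highway speed vs. what a cut-in can leave you.
# Assumes ~1.5 s reaction time and ~7 m/s^2 deceleration; both are rough guesses.

def stopping_distance_m(speed_mps, reaction_s=1.5, decel_mps2=7.0):
    """Distance travelled during the reaction time plus the braking distance."""
    return speed_mps * reaction_s + speed_mps ** 2 / (2.0 * decel_mps2)

highway_speed_mps = 29.0  # about 65 mph
print(stopping_distance_m(highway_speed_mps))  # ~104 m to come to a full stop

# A car cutting in might leave you a gap of only ~15 m. Until the gap is
# re-established, you cannot stop in time if the car ahead stops abruptly.
gap_after_cut_in_m = 15.0
print(gap_after_cut_in_m < stopping_distance_m(highway_speed_mps))  # True -> exposed
```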

I have no idea what the legal situation would be: is it their fault if the crash happens within 3 seconds, but yours if it happens after you've had time to re-establish your needed stopping distance and failed to?

Honestly, even in the simple one-lane case, I doubt you can slam on your brakes on the interstate for no reason and then expect to avoid any liability for the crash by blaming your follower for following too closely.

Driving has a bunch of rules, then an awful lot of common sense and social interaction on top of them to make things actually work.


A car changing lanes does indeed remove stopping distance. But that's also something human drivers are naturally more capable of understanding than a Waymo. It shouldn't have mattered where the vehicle was on the road: any human can predict whether a weirdly loaded vehicle making a turn has a chance of invading their lane and/or stopping distance. It's a complex problem for sure, but that also shows you need absolute proof that the software is able to generalise the problem, especially if you want self-driving cars to respect flow of traffic over stopping distance.

Even if your software is as good as it can be, I doubt you'll be able to get them to recognise how to resolve deadlocks, which would also mean severely hindering emergency vehicles.


I don't think anyone is excusing Waymo or saying that an accident is acceptable in this situation--it's just an interesting engineering problem to speculate about, and people are trying to figure out what caused it to fail.


This is why autonomous vehicles need a hard cap on the speed limit they are driving at.

If the max speed is 35 mph, that allows a good braking system to respond and stop safely based on LIDAR info 99% of the time.
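Back-of-the-envelope check on that, with assumed (not vendor-published) latency and deceleration figures:

```python
# Rough sketch of why a 35 mph cap makes LIDAR-triggered stops much more forgiving.
# Latency and deceleration values below are assumptions, not measured specs.

def stop_distance_m(speed_mps, system_latency_s=0.5, decel_mps2=7.0):
    """Distance covered during detection/actuation latency plus braking distance."""
    return speed_mps * system_latency_s + speed_mps ** 2 / (2.0 * decel_mps2)

MPH_TO_MPS = 0.44704
for mph in (35, 65):
    print(f"{mph} mph -> ~{stop_distance_m(mph * MPH_TO_MPS):.0f} m to stop")
# 35 mph -> ~25 m, 65 mph -> ~75 m: the lower cap keeps the full stop well
# inside typical LIDAR range and urban sightlines.
```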


Yeah but then people aren’t going to want to take them because they are slow.


The average speed of a commuting car is around 23 mph once you account for stops and red lights. 35 mph is only a hindrance if you commute by freeway every day, which many people don't.

People will take them if they're priced right and they can do things in the car without having to pay attention to the road.


But autonomous vehicles tend to perform best on freeways; that's where they are most likely to be allowed.


You can't drive more than a few mph unless you're reacting based on the expected future, rather than the present.

It's why it's so difficult to do (actually), and the ability to do it well is just as much about the risk appetite of whoever is responsible as anything else, because knowing whether a car is likely to pull out at the light into traffic, or how likely someone is to be hiding in a bush, is really hard. But that is what humans deal with all the time while driving.

No one can actually know the future, and predicting the future is fundamentally risky. And knowing when to hold 'em and when to fold 'em is really more of an AGI-type thing.


In self-driving, you are making predictions about where the object is right now based on the synthesis of data from your sensors (and often filtering information from past estimates of the object position). These might be high-precision, high-accuracy predictions, but they're predictions nonetheless.
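As a rough illustration of what that filtering looks like, here's a toy constant-velocity tracker (an alpha-beta-style blend, vastly simpler than a real perception stack; every name and number is invented):

```python
# Toy alpha-beta-style tracker: even the object's "current" position is an
# estimate rolled forward from noisy past measurements, not a direct reading.

class ConstantVelocityTracker:
    def __init__(self, position, velocity=0.0, blend=0.3):
        self.position = position  # current best estimate of where the object is
        self.velocity = velocity
        self.blend = blend        # how much each new measurement is trusted

    def predict(self, dt):
        """Roll the estimate forward to the current timestamp."""
        self.position += self.velocity * dt
        return self.position

    def update(self, measured_position, dt):
        """Blend a new, noisy measurement into the rolled-forward estimate."""
        predicted = self.predict(dt)
        innovation = measured_position - predicted
        self.position = predicted + self.blend * innovation
        self.velocity += self.blend * innovation / dt

tracker = ConstantVelocityTracker(position=40.0)
for z in (40.9, 42.1, 42.8, 44.2):  # noisy range readings at 1 Hz
    tracker.update(z, dt=1.0)
print(round(tracker.position, 1), round(tracker.velocity, 2))  # ~43.4 m, ~1.4 m/s
```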

(It's been quite some years since I worked on vision-based self-driving, so my experience is non-zero but also quite dated.)



