Hacker News | Mageek's comments

To be clear, both are fiction, it’s just that one is fantasy and the other is set in modern reality.


Hmmm, I don't believe so? Zen is at least portrayed in its conclusion as a true, autobiographical story (with, obviously, dollops of philosophical musings).



I'm sure you meant that both are non-fiction.


The video doesn't show how the car got into this situation.


Yes. Is this a failed left turn where the Waymo became trapped in the intersection? Did something keep the Waymo from exiting via a left turn? There's a brief glimpse of a large pickup truck stopped in the intersection. A few more seconds of video before or after would help a lot. Maybe Waymo will post video from their side.


Yes, there could be actual good reasons for this. Maybe a car illegally turned left in front of the Waymo and the Waymo did a sharp turn to avoid a collision.


Rust has a mechanism for enforcing memory safety. That is great for deployed applications, but it can be annoying when you're still exploring / mutating your code to figure out the right shape of things.


I have never had the experience that being precise about what I mean slowed me down; if anything, it was the opposite.


Because sometimes you don't know precisely what you mean. If you don't already know the shape of your solution, the 'safety' features restrain you.
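
As a concrete toy example of the kind of restructuring the borrow checker can force mid-exploration (a generic sketch, not anyone's real project):

    fn main() {
        let mut scores = vec![3, 7, 2, 9];

        // First exploratory draft: append a bonus entry for every high score
        // while walking the vec. The borrow checker rejects it (E0502),
        // because `iter()` holds a shared borrow while `push` needs a mutable one:
        //
        //     for s in scores.iter() {
        //         if *s > 5 { scores.push(*s + 1); }
        //     }
        //
        // So you restructure: collect first, mutate after.
        let bonuses: Vec<i32> = scores.iter().filter(|&&s| s > 5).map(|s| s + 1).collect();
        scores.extend(bonuses);

        println!("{:?}", scores); // [3, 7, 2, 9, 8, 10]
    }

The final shape is arguably cleaner, but being forced into it before you even know what the program should do is the friction being described here.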


The article says it: “We determined that due to the persistent orientation mismatch of the towed pickup truck and tow truck combination, the Waymo AV incorrectly predicted the future motion of the towed vehicle.” It was detected, but it predicted the truck would move in a way that it didn’t end up moving.


I don't find that acceptable in any way. No human driver is going to do that, and by that I mean no human driver is going to drive into something just because it moved in a way they didn't expect. They're going to slam on the brakes, and the only way the collision still happens is if momentum is too high.

I understand we have to have explanations or we can't fix these systems, but it's just as important to understand that this should never have happened even WITH the described failure.

If I had to guess, there's code to avoid stopping at every little thing, and that code took precedence (otherwise rides would not be enjoyable). And I get the competing interests here, but there must be a comparison to humans when these incidents happen.


> no human driver is going to drive into something just because it moved in a way they didn't expect

I would actually put money on this being the cause of most crashes involving multiple moving cars. Hell, a friend of mine got into an accident two weeks ago where they T-boned somebody who turned onto a median when they didn't expect it.


Which is dealt with by the explanation about momentum.

If your friend saw them and never attempted to stop, then your friend needs to lose their license.


> no human driver is going to drive into something just because it moved in a way they didn't expect.

This is literally the cause of almost every human accident.

Imagine you're driving. There's a car in front of you, also driving, at the same speed as you. Do you immediately slam on the brakes? No, because you EXPECT them to keep driving. That is how driving works.

If, suddenly, they do something unexpected, like slamming on the brakes, that might cause an accident. Because... they moved in an unexpected way.

I honestly can't even figure out what you meant to say.


Eh? Distraction is the main cause of human accidents, not incorrect prediction of the motion of objects.

https://injured.ca/5-top-causes-of-car-accidents-in-ontario/


Here, let me complete that quote for you:

> they're going to slam on the brakes and the only way that's [hitting the other vehicle] going to happen is if momentum is too high.

There's a difference between hitting something and driving into something.


> I don't find that acceptable in any way

Well obviously even Waymo agrees, given that they're recalling vehicles to mitigate the issue.


If I have to choose between driving next to the nitwit texting or the software that might get tripped up in really unusual situations, I’m going with the software.


"no human driver"? Really? Ever? Are you willing to bet on that assertion? Even if the human driver downs a bottle of vodka before driving?


Well, if we're going to play that game, what if the human suffers a medical emergency!

or what if they're driving a bus and they have to keep above 60 mph!

---

I guess it's my fault, I didn't list every single contingency possible...


Well yeah, it is. This is an edge case in self-driving cars, same as it could be an edge case for humans.


The main point probably still stands: the software should function at least as well as a capable, non-impaired human…


How do you drive into a solid wall if you have lidar, though? To say nothing of predictions, the object is where it's at at that moment. You don't need to predict where it's at now, because you already know where it's at.


You can't drive if you only use the current "frame" of data as the basis for your decision. Imagine driving on the highway, a comfortable distance behind a lead vehicle.

The planning software would want to slam on the brakes without predicting that the blob of sensor data in front of you is going to continue moving forward at highway speeds. That motion prediction enables the planning software to know that the space in front of your vehicle will be unoccupied by the time you reach it.

A similar prediction error was the reason Cruise rear-ended the bendy bus in SF a while back. It segmented the front and rear halves of the bus as two separate entities rather than a connected one, and mispredicted the motion of the rear half of the bus.
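
To make the point concrete, here is a minimal constant-velocity sketch (toy numbers, nothing from Waymo or Cruise): the same sensor snapshot is either fine or a collision course depending entirely on the predicted motion of the blob ahead.

    // Toy constant-velocity prediction: how does the gap to the lead vehicle
    // evolve over the next couple of seconds?
    fn predicted_gap(gap_m: f64, lead_speed_mps: f64, own_speed_mps: f64, horizon_s: f64) -> f64 {
        // Both vehicles advance for `horizon_s`; the gap changes by the speed difference.
        gap_m + (lead_speed_mps - own_speed_mps) * horizon_s
    }

    fn main() {
        // 30 m behind a lead car predicted to keep matching our 31 m/s:
        println!("{:.1}", predicted_gap(30.0, 31.0, 31.0, 2.0)); // 30.0 -> keep driving
        // Same snapshot, but the prediction says the lead is stopped:
        println!("{:.1}", predicted_gap(30.0, 0.0, 31.0, 2.0));  // -32.0 -> brake now
    }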


> That motion prediction enables the planning software to know that the space in front of your vehicle will be unoccupied by the time you reach it.

I think we're all on the same page about this part, but what's confusing and hilarious is why the correct answer would ever be to drive into an unmoving object.

If they had tried to avoid the truck, swerved, and hit a different vehicle, there would be no confusion here. But the self-driving algorithm is effectively committing suicide (kamikaze). That's novel.

My guess is that the self-driving car was not able to recognize the truck until it was very close, and the sudden appearance of the truck was interpreted by the algorithm as if the truck were moving very fast. And the best answer in that case would be to let the truck pass (basically do what the Waymo did).

But that means the lidar information about the shape not moving is being deprioritized in favor of the recognized object being calculated to move fast, a situation which could only really occur if a speeding vehicle had plowed through a stationary object.

Fantastic solution for avoiding a situation like this -> https://www.youtube.com/watch?v=BbjjlvOxDYk

But a bad solution for avoiding a stationary object.


Who said it was an unmoving object? Maybe I missed something in the story, but I got the sense that this happened in motion. The towed truck would have essentially been moving forward at an angle, hanging across the lane boundary.


It wasn’t a solid wall, it was another vehicle.

The brakes don't respond immediately; you need to be able to detect that a collision is imminent several seconds before it actually occurs.

This means you have to also successfully exclude all the scenarios where you are very close to another car, but a collision is not imminent because the car will be out of the way by the time you get there.

Yes, at some point before impact the Waymo probably figured out that it was about to collide. But not soon enough to do anything about it.


My brakes respond immediately, for all intents and purposes, given human reaction time. I'm at fault if the person in front of me stops and I don't have the stopping distance to avoid a collision.

I get that self-driving software is difficult. But there's no excuse for this type of accident.


That might be true for the simple case of following within a lane, although you only have to drive around to realize most drivers do not leave adequate following distance at all times, so it's not a pure physics problem. A good driver also isn't watching only the car in front, but the brake lights of the cars ahead of that one as well, to help anticipate the lead car's likely actions.

But take an even slightly more complex example: you're on a two lane roadway and the car in the other lane changes into your lane, leaving inadequate stopping distance for you. You brake as hard as you safely can (maybe you have a too-close follower, too), but still there will be a few seconds when you could not, in fact, avert a collision if for some reason the car in front braked.

I have no idea what the legal situation would be: is it their fault if the crash happens within 3 seconds, but yours if it happens after you've had time but failed to re-establish your needed stopping distance?

Honestly even in the simple one lane case, I doubt you can slam your brakes on the interstate for no reason then expect to avoid any liability for the crash, blaming your follower for following too close.

Driving has a bunch of rules, then an awful lot of common sense and social interaction on top of them to make things actually work.


A car changing lanes does indeed remove stopping distance, but that's also something human drivers are naturally more capable of understanding than Waymo. It shouldn't matter where a vehicle is on the road: any human is able to predict whether a weirdly loaded vehicle making a turn has a chance of invading their lane and/or stopping distance. It's a complex problem for sure, but that also shows you need absolute proof that the software is able to generalise the problem, especially if you want self-driving cars to respect flow of traffic over stopping distance.

Even if your software is as good as it can be, I doubt you'll be able to get it to recognise how to resolve deadlocks, which would also severely hinder emergency vehicles.


I don't think anyone is excusing Waymo or saying that an accident is acceptable in this situation; it's just an interesting engineering problem to speculate about, and people are trying to figure out what caused it to fail.


This is why autonomous vehicles need a hard cap on the speed they drive at.

If the max speed is 35 mph, a good braking system can respond to what the LIDAR sees by stopping safely 99% of the time.
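
Rough back-of-envelope supporting the 35 mph figure (illustrative delay and deceleration values, not anything from a real AV stack):

    // Stopping distance = distance covered during the sensing/decision delay
    // plus kinetic braking distance v^2 / (2a). Assumed numbers: 1 s delay,
    // 7 m/s^2 deceleration (dry pavement); 35 mph ~= 15.6 m/s, 65 mph ~= 29 m/s.
    fn stopping_distance_m(speed_mps: f64, delay_s: f64, decel_mps2: f64) -> f64 {
        speed_mps * delay_s + speed_mps * speed_mps / (2.0 * decel_mps2)
    }

    fn main() {
        println!("{:.0} m", stopping_distance_m(15.6, 1.0, 7.0)); // ~33 m at 35 mph
        println!("{:.0} m", stopping_distance_m(29.0, 1.0, 7.0)); // ~89 m at 65 mph
    }

Roughly 33 m at 35 mph versus roughly 89 m at 65 mph, since braking distance grows with the square of speed.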


Yeah, but then people aren't going to want to take them because they are slow.


The average speed of a commuting car is around 23 mph once you account for stops and red lights. 35 mph is only a hindrance if you commute by freeway every day, which many people don't.

People will take them if they're priced right and they can do things in the car without having to pay attention to the road.


But autonomous vehicles tend to perform best on freeways; that's where they are most likely to be allowed.


You can't drive more than a few mph unless you're reacting to the expected future, rather than the current state.

It's why this is so difficult to do (actually), and the ability to do it well is just as much about the risk appetite of whoever is responsible as anything else, because knowing whether a car is likely to pull out into traffic at the light, or how likely someone is to be hiding in a bush, is really hard. But that is what humans deal with all the time while driving.

Because no one can actually know the future, and predicting the future is fundamentally risky. And knowing when to hold 'em and when to fold 'em is really more of an AGI-type thing.


In self-driving, you are making predictions about where the object is right now based on the synthesis of data from your sensors (and often filtering in information from past estimates of the object's position). These might be high-precision, high-accuracy predictions, but they're predictions nonetheless.

(It's been quite some years since I worked on vision-based self-driving, so my experience is non-zero but also quite dated.)
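
A minimal sketch of the "filtering past estimates" idea, assuming nothing about any particular stack (real systems use Kalman-style filters; this is just the blend-new-measurement-with-old-estimate core of it):

    // Blend each new measurement with the running estimate instead of trusting
    // either alone. Gain and numbers are made up for illustration.
    struct PositionFilter {
        estimate_m: f64,
        gain: f64, // 0.0 = ignore measurements, 1.0 = ignore history
    }

    impl PositionFilter {
        fn update(&mut self, measured_m: f64) -> f64 {
            self.estimate_m += self.gain * (measured_m - self.estimate_m);
            self.estimate_m
        }
    }

    fn main() {
        let mut f = PositionFilter { estimate_m: 10.0, gain: 0.3 };
        // A noisy jump to 14 m only nudges the estimate toward it:
        println!("{:.2}", f.update(14.0)); // 11.20
        println!("{:.2}", f.update(13.0)); // 11.74
    }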


My mom was hit by a driver when she was biking and fell with her arm in front of the wheel. The driver then decided to pull forward and drove over her arm. So humans don’t really solve that problem.


NHTSA Standing General Order crash rates are a mandated and publicly available data source. That's what the study is based on.


Another very good blog overview here (not mine): https://ianthehenry.com/posts/delaunay/

I love DCELs and have been tinkering with them in my own side projects lately. Very cool data structure!
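
For anyone who hasn't met the structure before, a bare-bones half-edge/DCEL layout looks roughly like this (standard textbook field names, not taken from the linked post):

    // Index-based half-edge records; `face_loop` walks one face boundary.
    struct HalfEdge {
        origin: usize, // vertex this half-edge leaves from
        twin: usize,   // opposite half-edge of the same undirected edge
        next: usize,   // next half-edge around the incident face
        prev: usize,   // previous half-edge around the same face
        face: usize,   // face lying to the left of this half-edge
    }

    struct Dcel {
        half_edges: Vec<HalfEdge>,
    }

    impl Dcel {
        fn face_loop(&self, start: usize) -> Vec<usize> {
            let mut out = vec![start];
            let mut e = self.half_edges[start].next;
            while e != start {
                out.push(e);
                e = self.half_edges[e].next;
            }
            out
        }
    }

    fn main() {
        // One triangular face, inner loop 0 -> 1 -> 2 (twins 3..5 on the outer
        // loop are referenced by index but omitted here for brevity).
        let dcel = Dcel {
            half_edges: vec![
                HalfEdge { origin: 0, twin: 3, next: 1, prev: 2, face: 0 },
                HalfEdge { origin: 1, twin: 4, next: 2, prev: 0, face: 0 },
                HalfEdge { origin: 2, twin: 5, next: 0, prev: 1, face: 0 },
            ],
        };
        println!("{:?}", dcel.face_loop(0)); // [0, 1, 2]
    }

The twin/next/prev pointers are what make local edits (edge flips in Delaunay triangulation, face splits, etc.) constant-time.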


Isn’t Waymo already operating a full service in Phoenix? Haven’t heard of many issues from there.


I highly recommend the course “Computer, Enhance!” by Casey Muratori on Substack for those interested in how CPUs / assembly work. You get to decode machine code, simulate instructions, learn how the stack works, etc. It is really well-paced, gives you plenty of space to figure things out in your own way (with reference material if you need it), and helped me get several “aha!” moments that solidified how things work.
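
For a taste of the decoding part, this is the flavor of the early exercises: pulling a register-to-register 8086 MOV apart by hand. This is my own minimal sketch, not course material, and it only handles the register-mode (mod = 11) case:

    // Decode the register-to-register form of 8086 MOV: opcode 100010dw,
    // then a mod/reg/rm byte.
    const REGS_W1: [&str; 8] = ["ax", "cx", "dx", "bx", "sp", "bp", "si", "di"];
    const REGS_W0: [&str; 8] = ["al", "cl", "dl", "bl", "ah", "ch", "dh", "bh"];

    fn decode_mov_rr(b0: u8, b1: u8) -> Option<String> {
        if b0 & 0b1111_1100 != 0b1000_1000 {
            return None; // not the MOV reg<->r/m opcode
        }
        if b1 >> 6 != 0b11 {
            return None; // memory operands not handled in this sketch
        }
        let w = b0 & 1;        // word (16-bit) vs byte registers
        let d = (b0 >> 1) & 1; // does the reg field name the destination?
        let names = if w == 1 { REGS_W1 } else { REGS_W0 };
        let reg = names[((b1 >> 3) & 0b111) as usize];
        let rm = names[(b1 & 0b111) as usize];
        let (dst, src) = if d == 1 { (reg, rm) } else { (rm, reg) };
        Some(format!("mov {}, {}", dst, src))
    }

    fn main() {
        println!("{}", decode_mov_rr(0x89, 0xD9).unwrap()); // mov cx, bx
    }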


I recently started Casey Muratori's new class, “Computer, Enhance!”, which teaches performance-aware programming.

I also try to complete a small personal project every month and write about it.

