Hacker News | new | past | comments | ask | show | jobs | submit | login

Uber's own tech blog describes its use of lidar to generate a wireframe of each object and track it. Lighting should be irrelevant to the vehicle, while it is essential for a human driver.

From the below article: “Last fall, Uber officials showing off their vehicles on Mill Avenue said their radar and lidar were able to detect objects, including jaywalkers, as far as 100 yards away and avoid collisions.”

Something went really wrong here, and I look forward to the release of the rest of the data. I’m wondering why the woman, who had a large enough silhouette, wasn’t detected by the software.

This is a better source: https://www.google.com/amp/s/amp.azcentral.com/story/4476480...



I suspect she was picked up by the sensors and categorised as an object. These are absolute fundamentals, and I can't fathom the car working at all if it can't do this. If it was being driven with a limited set of sensors or a malfunctioning sensor, hoo boy the NTSB will have a few things to say.

I reckon it's more likely that this was due to a software issue where she was flagged as "not a collision concern" or similar: either because she was in the other lane of a 2-lane road, or because her path wasn't predicted accurately due to the odd returns received from a bicycle in profile at that range (it may have thrown off their regular pedestrian detection). In any case a terrible shame.
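To make the speculation above concrete, here is a minimal, entirely hypothetical sketch of the kind of "collision concern" gating being described: an object is only flagged if its predicted path crosses the ego lane within a fixed planning horizon. This is not Uber's actual pipeline; all names, thresholds, and the constant-velocity prediction are invented for illustration.

```python
# Hypothetical sketch of a "collision concern" filter. An object is
# flagged only if a constant-velocity extrapolation of its track puts
# it inside the ego lane, still ahead of the vehicle, within the
# planning horizon. All parameters are assumptions, not real values.
from dataclasses import dataclass

@dataclass
class TrackedObject:
    x: float   # lateral offset from ego lane centre (m); + is left
    y: float   # longitudinal distance ahead of ego (m)
    vx: float  # lateral velocity (m/s)
    vy: float  # longitudinal velocity relative to ego (m/s)

LANE_HALF_WIDTH = 1.8  # assumed lane half-width (m)
HORIZON_S = 3.0        # assumed planning horizon (s)

def is_collision_concern(obj: TrackedObject, dt: float = 0.1) -> bool:
    """Flag the object if its constant-velocity prediction enters the
    ego lane while still ahead of the vehicle within the horizon."""
    t = 0.0
    while t <= HORIZON_S:
        x = obj.x + obj.vx * t
        y = obj.y + obj.vy * t
        if abs(x) <= LANE_HALF_WIDTH and y > 0:
            return True
        t += dt
    return False
```

Note the failure mode this illustrates: a pedestrian crossing from the adjacent lane (say `x = -4.5`, `vx = +1.5` m/s) only enters the ego lane near the end of the horizon, so a slightly shorter horizon, or a lateral velocity misestimated because the bicycle's profile confused the classifier, would suppress the flag entirely.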



