These cars are driving on public roads, so the question is not whether the user is informed but whether they are compliant. It doesn't take many ignored beeps to learn that a particular customer is not using the feature as intended, and to deactivate it for the safety of others. Offer to turn it back on after they watch a training video at the dealer and sign some papers; escalate to a mandatory counselling session the second time; third strike, and the feature is permanently off. Put it in the TOS so that asshole drivers can't sue and win in court. The customer should not be king when the safety of others is on the line.
To add to your comment: GM's SuperCruise semi-autonomous mode uses a real-time head and eye-tracking system to ensure the driver is alert. I don't know if it's based on DL, but it's available in a production vehicle.
On top of that, I think it is absolutely rational to expect autonomous cars to perform at the current state of the art. We tolerate certain kinds of bad human drivers, like beginners, because there is hardly an alternative. A self-driving car with the driving skills of a beginner would be completely unacceptable if the state of the art has skills comparable to, say, somebody with a few years of experience.
We don't really know yet what the state of the art is for autonomous vehicles. Until we have gathered more data from different companies, we can't say much.
This accident might have been a fluke, it might have been caused by bad engineering, it might have been caused by many things. We can only compare once we have a sizeable sample of incidents, or a long enough time of non-incidents that we're confident the system works well.
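To put a rough number on "long enough time of non-incidents": by the statistical rule of three, zero events in N trials puts the 95% upper confidence bound on the event rate at about 3/N. Assuming a human fatality rate of roughly 1 per 100 million vehicle miles (my figure, not from this thread), a quick sketch:

```python
# Rough sketch: how many fatality-free miles before we could claim an AV
# fleet is at least as safe as human drivers?
# Assumptions (mine, not from the thread):
#   - human fatality rate ~ 1 per 100 million vehicle miles
#   - zero fatalities observed so far in the AV fleet
#   - rule of three: with 0 events in N trials, the 95% upper
#     confidence bound on the event rate is about 3 / N

HUMAN_FATALITY_RATE = 1 / 100_000_000  # fatalities per mile (approximate)

def miles_needed_for_95ci(target_rate: float) -> float:
    """Fatality-free miles needed so the 95% upper confidence bound
    on the fleet's fatality rate falls below target_rate."""
    return 3 / target_rate

print(f"{miles_needed_for_95ci(HUMAN_FATALITY_RATE):,.0f} miles")
# -> 300,000,000: on the order of 300 million fatality-free miles
```

That's hundreds of millions of miles before we could even claim parity with humans, which is why fleet mileage and disengagement data matter so much here.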
And every software update could change something about it.
And since human vision is better than a dashcam's, this car's emergency-assist system, the safety driver, could probably have caught this as well, had they been paying attention to the road.
Which is something production-grade technology can ensure with head and eye-tracking. Some non-autonomous cars have this feature to stop people from dozing off, and good semi-autonomous systems, like GM's SuperCruise, will refuse to enter or stay in self-driving mode unless the driver is paying attention.
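For illustration, the gating logic can be as simple as a watchdog over the eye-tracking signal. This is not GM's actual implementation; the thresholds and the `eyes_on_road()` sensor call are hypothetical:

```python
import time

# Hypothetical attention gate for a semi-autonomous mode, in the spirit
# of systems like SuperCruise. Not GM's actual logic; the thresholds and
# the eyes_on_road() call are made up for illustration.

WARN_AFTER_S = 3.0        # warn if eyes have been off the road this long
DISENGAGE_AFTER_S = 6.0   # hand control back to the driver after this long

def attention_gate(eyes_on_road, warn, disengage):
    """Poll an eye-tracking signal; warn, then disengage, on inattention."""
    last_attentive = time.monotonic()
    while True:
        now = time.monotonic()
        if eyes_on_road():
            last_attentive = now          # driver is looking at the road
        elif now - last_attentive > DISENGAGE_AFTER_S:
            disengage()                   # refuse to stay in self-driving mode
            return
        elif now - last_attentive > WARN_AFTER_S:
            warn()                        # escalating beeps / seat vibration
        time.sleep(0.1)                   # poll at ~10 Hz
```

The point is that the car never has to trust the driver's word: the mode stays available only while the measured attention signal stays fresh.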
You're the only person who bothered to point this out. The car's camera doesn't even show the picture a human would see, let alone what the million other sensors on the vehicle would see.
I think the point GP was making is that an attentive human driver may have been able to see the pedestrian much earlier than the point at which she becomes visible in the video.
To add to this: the human visual system adapts to darkness. Consider that the victim, who was evidently crossing the street there as part of her routine, had likely done so many times before, and of all the vehicles that could have hit her, the one that did happened to be one of Uber's self-driving vehicles with a clearly inattentive driver behind the wheel. A driver paying attention to the road would have at least hit the brakes well before impact.
Intermittent street lights reduce that adaptation, though, and add glare. The human visual range for scotopic ('dark-adapted') and mesopic (twilight) vision spans about four orders of magnitude of luminance (cd/m²) that the retina can perceive simultaneously, from zero to saturation, without adaptation (pupil dilation). Dark-to-light adaptation is very rapid and happens in fractions of a second; light-to-dark adaptation takes minutes.
The eye adapts to the mean light level across the larger FOV (not the fovea alone), which is why instrument clusters in cars need to be dimly lit: so they don't disturb this adaptation. Exterior light sources like headlights and street lights further influence adaptation, and veiling glare can let bright sources overshadow smaller luminance signals, pushing them out of the range the eye is adapted for.
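To put numbers on the veiling-glare point: the classic Stiles-Holladay approximation estimates the equivalent veiling luminance from a glare source as roughly 10·E/θ² cd/m², where E is the illuminance at the eye in lux and θ is the angle between the glare source and the line of sight in degrees. A quick sketch with invented scene values:

```python
# Stiles-Holladay approximation for disability (veiling) glare:
#   L_veil ≈ 10 * E / theta**2   [cd/m²; E in lux at the eye, theta in degrees]
# The scene numbers below are invented for illustration only.

def veiling_luminance(e_lux: float, theta_deg: float) -> float:
    return 10.0 * e_lux / theta_deg**2

pedestrian_L = 0.1    # cd/m²: dimly lit pedestrian
background_L = 0.02   # cd/m²: dark road surface behind her

# A street light a few degrees off the line of sight:
L_v = veiling_luminance(e_lux=1.0, theta_deg=5.0)   # -> 0.4 cd/m²

# Weber contrast, with the veiling luminance added over the whole scene:
c_clean = (pedestrian_L - background_L) / background_L
c_glare = (pedestrian_L - background_L) / (background_L + L_v)
print(f"veiling luminance: {L_v:.2f} cd/m²")
print(f"contrast without glare: {c_clean:.1f}, with glare: {c_glare:.2f}")
# contrast drops from 4.0 to ~0.19
```

Even a modest street light a few degrees off the line of sight can swamp the small luminance difference between a dark-clad pedestrian and the road surface.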