That is speeding, and the car was also inattentive to potential hazards outside its field of vision. Normal human drivers slow down when they see a wild lady on the median looking like she might dart into the road. Normal people slow down when there are kids playing with a ball on the sidewalk. Do these cars do that? Do they even drive the speed limit?
edit: Also accidentally killing someone with your car because of minor speeding or inattentiveness usually results in a misdemeanor vehicular manslaughter charge. Who is getting that charge here?
If I'm imagining this scenario correctly, I've experienced it tons of times and never seen traffic slow down. It's a road with a median and a person attempting to cross the road has temporarily stopped in the median waiting for a break in traffic so they can cross. Maybe I've just lived in 2 crazy states, or maybe I'm not imagining this scenario correctly, but I can't recall ever having seen traffic slow down for this situation.
It's possible to believe that the sensors/AI failed (no braking) while also acknowledging that the vehicle/driver is not at fault for the death. I suspect that may ultimately be the case here.
Where have you lived? Here in DC you regularly have to slow down because tourists and homeless people will blithely jump into the street. Same thing in Philadelphia or New York.
You might also rest your foot on the brake in anticipation.
But neither attention nor anticipatory control is a concept that really exists in these autonomous systems. They're "always ready". (Yes, "attention" can exist in perception neural networks, but I'm not sure that has much to do with keeping an eye out for a risky event.)
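To be concrete about what "attention" means in that sense: it's a learned weighting over features, not a driver deciding to watch a particular hazard. A toy sketch of scaled dot-product attention in Python/NumPy, with every feature value invented purely for illustration:

```python
# Toy scaled dot-product attention over per-object feature vectors.
# All numbers are made up; real perception stacks are far more
# involved. The point: the "attention" weights are learned saliency,
# not a deliberate choice to keep an eye on a specific pedestrian.
import numpy as np

def attention(query, keys, values):
    """softmax(K.q / sqrt(d)) used as weights over the values."""
    d = query.shape[-1]
    scores = keys @ query / np.sqrt(d)   # similarity score per object
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()             # softmax over objects
    return weights, weights @ values

# Hypothetical feature vectors for three detected objects
# (say: a car, a pedestrian on the median, a parked bike).
keys = np.array([[0.9, 0.1],
                 [0.4, 0.8],
                 [0.1, 0.2]])
values = keys.copy()
query = np.array([0.5, 0.9])  # what the network currently "looks for"

weights, _ = attention(query, keys, values)
print(weights)  # sums to 1, but a high weight is not human vigilance
```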
> Normal people slow down when there are kids playing with a ball on the sidewalk.
The driving-exam manual I had to read before getting my license mentioned this exact scenario, i.e. if the driver sees kids playing on the sidewalk close to the road, they should slow down. It was even included among the questions given at the written exam itself.
But, then again, the authorities over here in Europe are more attentive when handing out driving licenses to people who are about to handle 2-ton pieces of metal inside populated areas. In this case it seems like the "US driving mentality" (only the driver counts, pedestrians be damned) has also affected the actual engineers who have programmed these "AI" cars.
I found this based on the photo of the accident location in this article [2]. I went to Tempe, AZ on Google Maps, entered the name of the building in the background of the photo ("First Solar"), and from there it was pretty easy to find the photo location. Then it was just a matter of going backward until the speed limit sign for that road in that direction was found.
But you see, this is a computerized car. Whereas a human is subject to gauge-reading error, there's really no excuse for a machine, especially a machine that has our lives in its hands.
Driving 38 mph vs 35 mph is hardly an unsafe difference...
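For scale, here's a quick back-of-the-envelope sketch in Python; the ~7 m/s² braking deceleration is an assumed dry-pavement figure, not anything from the article, and reaction time is ignored:

```python
# Rough comparison of braking distances at 35 mph vs 38 mph.
# Assumes a constant ~7 m/s^2 deceleration (typical dry pavement)
# and ignores reaction time, so real stopping distances are longer.

MPH_TO_MS = 0.44704  # metres per second per mile per hour
DECEL = 7.0          # assumed braking deceleration, m/s^2

def braking_distance_m(speed_mph: float) -> float:
    """Distance to stop from speed_mph: d = v^2 / (2a)."""
    v = speed_mph * MPH_TO_MS
    return v ** 2 / (2 * DECEL)

d35 = braking_distance_m(35)  # ~17.5 m
d38 = braking_distance_m(38)  # ~20.6 m
print(f"35 mph: {d35:.1f} m, 38 mph: {d38:.1f} m "
      f"({(d38 / d35 - 1) * 100:.0f}% longer)")
```

Because of the v² term, the faster car is still doing roughly 15 mph at the point where the 35 mph car has already come to a stop.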
These are machines. They should obey the speed limit.
For people learning to drive -- the first step is learning how to obey the rules. The second step is learning which rules can be bent. Let's focus on step one at the moment.
Where I'm from, at least, the first thing you learn when learning to drive is that if you go the speed limit, you are going too slow and will irritate everyone around you.
Driving the speed limit is frequently unsafe, e.g. in adverse weather conditions or unusual street conditions (like when passing an adjacent block party with lots of adults and children milling about).
The speed limit is the maximum speed limit in ideal conditions.
Driving the speed limit is frequently unsafe. I drive on Highway 401 in Ontario frequently, where the speed limit is 100 km/h, but traffic flows at 120 km/h or above, with faster traffic going more like 130 km/h. Going the speed limit means you are going 20 km/h slower than the speed of traffic, and 30 km/h slower than faster-moving cars, which can definitely be dangerous since accidents are more often caused by speed differential rather than absolute speed.
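To put a number on the differential point, here's a minimal sketch in Python; the 50 m following gap is an illustrative assumption, not a measured value:

```python
# How long until a faster car closes the gap to a slower one ahead.
# The closing time depends only on the speed differential, not on
# absolute speed. The 50 m gap below is an illustrative assumption.

GAP_M = 50.0  # assumed gap to the vehicle ahead, in metres

def seconds_to_close(own_kmh: float, ahead_kmh: float,
                     gap_m: float = GAP_M) -> float:
    """Seconds until the gap closes, both speeds in km/h."""
    closing_ms = (own_kmh - ahead_kmh) / 3.6  # closing speed, m/s
    return gap_m / closing_ms

# A 130 km/h car coming up on one doing the 100 km/h limit:
print(f"{seconds_to_close(130, 100):.1f} s")  # ~6.0 s to react
# The same car coming up on one moving with traffic at 120 km/h:
print(f"{seconds_to_close(130, 120):.1f} s")  # ~18.0 s to react
```

Same absolute speeds, but tripling the differential cuts the reaction window by a factor of three, which is the sense in which the slow car, not the fast one, creates the hazard.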
Not long ago, I seem to remember people listing the lightning-fast speed at which an automated car could drive as just one of the many blessings the technology would gift us. Now we want them to drive the speed limit? Boring.
The promise of autonomous cars is that they will be better drivers than human drivers.
That autonomous car, by going 3 mph above the speed limit, already failed at the most fundamental level: following the rules.
And it was not even an obscure or ambiguous rule at that, as I assume that there are signs prominently displaying the legal speed limit. But even if there aren't, the rule book covers those situations in an unambiguous way (or at least it does in my country).
Accidents are excusable. Shit happens.
But deliberately failing to follow the rules? That cannot be tolerated.