There's a great paper[1] I read about the human components of complex technical systems, which argues that one of the roles of the human is to take the blame when the system as a whole fails. This does real, valuable work for the companies involved and helps them avoid having to answer the most uncomfortable questions.
[1] Moral Crumple Zones: Cautionary Tales in Human-Robot Interaction by Madeline Clare Elish
Well, I am talking about all crashes, not just car-pedestrian crashes. For example, Teslas crash into other cars and then the company blames the human driver.
The flip side to this is that every system involving software seems to inevitably devolve into a situation where the human is no longer expected to be responsible.
Oh, you floored it while the car was pointed at a wall? It has cameras; why didn't the car disable the go pedal?
This is happening more and more with cars, and it seems inevitable that it will happen in other spaces as well, as software is expected to protect us from ourselves.