I'm not saying that braking is magic and solves all problems. I'm saying that braking is morally sufficient in the real world. If hitting something is unavoidable, you can hit whatever is directly in front of you while braking as hard as possible. It's good enough.
Not for a machine you can file a lawsuit over, arguing "it had a decision it could actually make and chose to kill my son." See how that changes things? It's no longer "an accident."
No you can't. Humans don't have anywhere near the reaction time these self-driving cars do. The cars can process far more than we can, and they never get distracted like we do.
If a self-driving car kills someone and it had the choice/option to do something else, then we have a problem that needs to be dealt with. This isn't a human being or human error.
Humans can panic and decide to swerve, and that decision takes them only a fraction of a second longer than it takes a self-driving car. Someone willing to argue that the car had a choice and made the wrong one could just as easily argue that the human had a choice and made the wrong one.
You might personally think, "oh, the machine is being rational and intentional, that's different from a human." But a lot of people will treat a human the exact same way: that split-second panic gets judged as if the driver had all the time in the world to consider the optimal choice.
The legal system already sees and handles this type of lawsuit.