The NTSB got rid of this kind of thinking a long time ago when it comes to aviation accidents. If you think someone is "woefully negligent at their job" for modifying an unbraced if statement, what about pilots who fly their airliner into the ground? Yet aviation accident investigations focus not on blaming the people who erred, but on how to prevent the same errors from recurring, knowing that humans make mistakes.
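For anyone who hasn't seen why this is such an easy mistake to make, here's a minimal, hypothetical C sketch (not drawn from any specific incident): the original author guarded a single statement, so they skipped the braces, and a later edit adds a line that looks like part of the body but actually runs unconditionally.

    #include <stdio.h>

    /* Hypothetical access check, for illustration only. */
    int check_access(int authenticated)
    {
        int err = 0;

        if (!authenticated)
            err = -1;
            /* Added later, intended to run only on failure.
             * Without braces it is NOT part of the if body, so it
             * runs on every call, despite the matching indentation.
             * An added "return err;" here would be worse still: it
             * would skip every check that follows. */
            printf("access denied\n");

        /* ... further checks would go here ... */
        return err;
    }

    int main(void)
    {
        /* Prints "access denied" even for an authenticated caller. */
        printf("authenticated caller -> %d\n", check_access(1));
        return 0;
    }

The code compiles cleanly and reviews as plausible at a glance, which is exactly why this class of slip keeps happening to otherwise careful people.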
Instead of making ourselves feel better by talking about "a few bad apples", if we really want to avoid things going wrong in the future, we need to work with human behavior instead of against it.
Sometimes it really is a few bad apples, though -- it's a delicate balance.
Somewhere between "the pilot got onto the plane, intoxicated, and proceeded to fly it across the country" and "a major, unforeseeable control outage led to a midair collision between two aircraft following normal procedures", there exists a reasonably competent, responsible person, capable of performing their job duties as expected.
The key is being reasonable in what we expect from people. We can't all be superheroes, but people aren't mechanical robots with the wits of children, either.
But here's the thing - you don't weed out the bad apples by drastic punishment alone. You weed them out by designing your processes so the bad apples get caught, and then simply ejecting them from the profession.
And it's not the bad apples that are the problem. You spot a drunk captain easily. It's the mistakes that are the problem. The worst aircraft accident in history? Caused by one of the most experienced and responsible pilots in the industry, as the result of a long chain of failures along the way.
And these kinds of problems are fixed by systemic change, not by punishment. Tenerife (the accident mentioned above) led to a whole slew of regulation changes. It led to an entirely new approach to leadership and decision making - crew resource management.
And that's what keeps the airline industry fairly safe, not just imposing penalties.
It has nothing to do with "having the wits of children". It has everything to do with accepting that we all make mistakes. Sometimes silly ones. And that we need to put systems in place to prevent them before they happen.
I never meant to say there's no such thing as negligence. Of course there is.
But it's also true that the threat of punishment works poorly as a deterrent even in that case: someone who boards the plane intoxicated already believes they won't get caught, so harsher penalties change little. And the same processes that catch honest mistakes also work to catch less innocent ones.
But would you say it makes more sense to put extra burdens on the majority of good apples to guard against the few bad apples?
If we accept this as just an honest mistake, then can you name anything a programmer could do that you would consider just plain negligent? We are never going to be able to catch 100% of all flaws.
At least to me, it seems that if the people writing sensitive code had a bit more "on the line", they'd be more inclined to be careful. It may slow down output, it may cost more for development, but for really, really important stuff? I think that's worth the cost.
So you are OK with slowing down output by forcing people to be more careful out of fear, but not by putting in place a system that makes mistakes less likely?
The fundamental fact is: you can't change human nature, no matter how much you may wish you could. One day, the bad apple may be you, and thinking you are somehow immune to making mistakes is a very dangerous delusion.