I bet this would be virtually impossible to prove, and career suicide for anyone who tried it.
These aren't mustache-twirling villains who distribute memos that say "Let's ignore safety issues to get this approved faster". They really believe they're doing what's best for all involved. We'll save everyone time and money, and make it easier for pilots. How is that not a good thing? We have no reason to believe safety will be compromised.
Predicting the safety implications of design decisions, years in the future, is not an easy task. If the AOA sensors (I think?) were a tiny bit more reliable, we'd never have seen a problem, and the MAX program would be considered a great success in efficiency.
I'm sure we've all worked for managers who made decisions we disagreed with, but couldn't prove they were making the wrong one.
However, they did actively choose not to include redundancy and the cockpit status indicators that any common-sense failure mode analysis would have called for out of the box. That decision alone is enough to bring this to court, and as a result the internal communications around those decisions will be explored.
I am not in aviation, but if they added redundant sensors and a new indicator, with an accompanying change to the flight manual for how to react to said indicator, wouldn't that have run contrary to the goal of "no new training/certification required"?
There are likely document retention policies that limit how long the relevant emails and text messages are kept. There could still be Word files or other documents capturing those discussions floating around, though.
Hopefully whoever is investigating this is acting fast to acquire the emails before they automatically get wiped.
From what I've read, aviation systems are assigned a severity rating for their potential failures, and this system's assigned rating was not severe enough to require redundancy. I'll update my comment if I can find a reference.
>These aren't mustache-twirling villains who distribute memos that say "Let's ignore safety issues to get this approved faster". They really believe they're doing what's best for all involved. We'll save everyone time and money, and make it easier for pilots. How is that not a good thing? We have no reason to believe safety will be compromised.
That is an incredibly naive reading of the situation. For starters, "no reason to believe" is not an acceptable rationale here; this is a highly regulated process for good reason, one that requires testing and verification.
Sure, it could be true, but the far more likely motivation is something along the lines of Dieselgate. And yes, it can be proven that managers made bad decisions that incurred legal liability.
Naive? I worked at Boeing for a couple years, and was on a software team where I was regularly asked to do things which flew in the face of the known best practices of the industry. (My team is not to blame for this. It wasn't for the 737MAX, it wasn't avionics, and the project was cancelled long before it was at all usable.)
It's not as "highly regulated" as you might want to believe. They talk a good game about CMMI but if you try to improve something they remind you that CMMI is only about process, not quality. Hurry up and ship something (deadlines!), and if it's not perfect we'll find it in test.
Given that the company has this culture, I find it much more plausible that this is to blame for their product issues. I don't need to hypothesize a big evil conspiracy to explain bad software.
In fact, that's true of almost every software organization. The James Bond joke [1] fell flat precisely because you don't need a James Bond villain to get buggy software. It's what you get by default.
I think that's what the parent poster meant by "a true whistleblower program with WITSEC level provisions for protection and monetary support would help cut this down."
A lot of people in a lot of industries might come forward if they didn't have to commit career suicide to do it.
Sure, but you can't guess who might be able to prove it. Are you going to offer WITSEC to every engineer who happens to disagree with their manager? Half the programmers I've ever worked with were annoyed by management and thought they were being asked to implement terrible decisions.