> Said another way, engineers now need to be technical and legal experts in the business domain?
Consider something like the 737 MAX debacle – did the programmers writing the MCAS code actually have enough aviation domain knowledge and understanding of where the component fit in the overall system to realise it was a threat to people's lives?
I don't know, but my guess is the most likely answer is "No".
From my limited information, the MCAS code primarily caused problems in combination with incorrect readings from a damaged sensor. Of course one could argue that this is an engineering failure because the MCAS failed to account for wrong sensor input, but when you consider the legal implications of an MCAS fallback, there is actually not much that can be done on the software side.
The MCAS is an optional component that reduces certification and training costs. It is definitely possible to fly the plane without accidents even with a disabled MCAS. So why can't the MCAS be turned off automatically when sensors fail? Because that changes the classification of the plane and therefore requires pilots to be certified on a new machine and receive new flight training for both MCAS and no-MCAS modes.
If the software engineer were under a Hippocratic oath, then he would have to refuse to build the MCAS entirely. Not because the idea of an MCAS is inherently unethical; no, he would have to refuse because the company he works for wants to use the MCAS for an unethical purpose (namely, to operate and hide the existence of MCAS even when it is unsafe to do so).
This is basically a reverse audit, but the software engineer has no authority to conduct such an audit, and even if he were allowed to, the business has no obligation to give him the information necessary to determine whether the MCAS will be used unethically.
> he works for wants to use the MCAS for an unethical purpose (namely, to operate and hide the existence of MCAS even when it is unsafe to do so).
You think a programmer, handed a spec and asked to implement it, can be expected to know that their employer (or the employer's customer) wants to use it for a "non ethical purpose"?
Again, I can't know for sure, but I doubt the programmers who wrote MCAS (who most likely didn't even work for Boeing, but rather for some subcontractor) actually knew, or could have known, how the code fit into Boeing's larger purposes.
With all of that being said, were these software engineers (probably subcontractors) even given access to actual MCAS readouts, or, more likely, only simulations of expected readouts? These people probably didn't account for this type of faulty readout because the simulator never produced that type of fault.
Most development in these large companies is so compartmentalized that it's next to impossible to see the whole structure from a software engineer's perspective. You need to be at a management level to understand how most of the pieces really come together, which is the only place where one of these "oaths" might have an influence. At that point, however, the selection is so goal-oriented that I doubt people would take that oath.
Generally, there is somebody (typically in software assurance or systems engineering) who is supposed to ensure the fidelity of the simulator. Additionally, the hazard analysis or failure modes and effects analysis should trace to specific test cases.
Of course, there are all kinds of pressures that make these fall through the cracks. I vaguely remember an article stating that, in the case of MCAS, some of these documents were not up to date.
What the actual hazard analysis showed is that Boeing did not have the technical insight at the right level.
The HA listed MCAS as "hazardous" rather than "catastrophic", meaning those in charge of that process document did not realize MCAS had the ability to bring down the airplane. I know it's tempting to armchair-quarterback this, but let's assume they should have realized this hazard.
To your point, maybe the programmer doesn't have the systems knowledge to make those calls, but the process is predicated on somebody having both the technical acumen and the responsibility for those decisions. That process broke down, though.
> You think a programmer, handed a spec and asked to implement it, can be expected to know that their employer (or the employer's customer) wants to use it for a "non ethical purpose"?
No, imtringued does not.
imtringued wrote:
> This is basically a reverse audit, but the software engineer has no authority to conduct such an audit, and even if he were allowed to, the business has no obligation to give him the information necessary to determine whether the MCAS will be used unethically.
imtringued is saying that it would be impossible for a software engineer to determine whether what they were asked to do was ethical or not.
> If the software engineer were under a Hippocratic oath, then he would have to refuse to build the MCAS entirely. Not because the idea of an MCAS is inherently unethical; no, he would have to refuse because the company he works for wants to use the MCAS for an unethical purpose (namely, to operate and hide the existence of MCAS even when it is unsafe to do so).
For one, it's not going to be clear from the request that the MCAS would be used in unethical ways.
Since the Hippocratic oath is the argument here, how many software developers would want to work in a system closer to that of physicians? A national cartel controlling membership and licensure: tough luck if you want to hire more developers, because the supply is artificially limited. Mandatory academic training: goodbye, self-taught developers. Follow-on training at a fifth or less of attending pay: I know residents in specialties where attendings start at $500k a year, and the residents are making $60k. Brutal shift work: residents easily work 70-80 hours a week. Toxic leadership: I've heard horror stories of residents being forced to lie on ACGME forms regarding their hours under penalty of being fired outright from their residency slot, which would make it nearly impossible to get a job as a physician (mainly because you'd have to apply to a different residency program and explain your termination).
I know they're not suggesting bringing the entire medical education & training structure over to tech workers, and everyone here likes to think that they're brilliant and changing lives every day but most of us are just throwing shitcode JS into a computer for 3-4 hours a day for an ad tech company and not much more. The comparison falls apart pretty quickly.
>actually not much that can be done on the software side.
This isn't exactly true. There are mitigations (both software and non-software) that are expected depending on the hazard analysis. One of the items discovered is that Boeing mischaracterized the MCAS hazard (it should have had a "catastrophic" hazard class). In addition, they didn't appear to follow their own process for the dual inputs required even at the lower severity class assigned. The "optional" part of MCAS was the secondary sensor input to the software.
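To make the "dual inputs" point concrete, here is a minimal sketch of what such a requirement can look like in code. This is not real avionics software and not Boeing's actual logic; every name, threshold, and trim value is hypothetical, chosen only to illustrate the fail-safe pattern of inhibiting an automated function when redundant sensors disagree:

```python
# Illustrative sketch only: NOT real avionics code. It shows the general
# shape of a dual-sensor sanity check, where an automated function is
# inhibited (fails safe) when redundant inputs disagree.

AOA_DISAGREE_THRESHOLD_DEG = 5.0  # hypothetical disagreement limit


def mcas_command(aoa_left_deg: float, aoa_right_deg: float,
                 aoa_trigger_deg: float = 10.0) -> float:
    """Return a nose-down trim command, or 0.0 if inhibited.

    The command is inhibited whenever the two angle-of-attack sensors
    disagree by more than a threshold, instead of trusting one side.
    """
    if abs(aoa_left_deg - aoa_right_deg) > AOA_DISAGREE_THRESHOLD_DEG:
        return 0.0  # sensors disagree: inhibit automatic trim
    aoa = (aoa_left_deg + aoa_right_deg) / 2.0
    if aoa > aoa_trigger_deg:
        return -0.27  # hypothetical trim increment (degrees of stabilizer)
    return 0.0
```

The point of the sketch is that the cross-check itself is computationally trivial; as the comments above note, the obstacle to "just disable it when sensors fail" was regulatory and commercial, not technical.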
>The MCAS is an optional component that reduces certification and training costs
No. The MCAS was a "necessary" component for pitch stability - without it, a 737 MAX in a pitch-up attitude would, in the absence of correcting inputs, pitch up further and further until it stalled. Without it, the airframe is uncertifiable, full stop.
I'm certain that's not correct; everything I've read on it says MCAS was specifically a software modifier put in place to make the plane respond substantially the same as a regular 737 without the larger engines, in order to avoid additional training for all 737 pilots worldwide.
Most aircraft, in a "pitch up attitude" will increase their angle of attack as thrust is applied. The issue was that the MAX would do so in a more radical way than the regular 737 did, and so the software was put in place to limit that so it flew like a regular 737 as far as the pilots could see.
Conceptually, MCAS wasn't a bad idea. The execution and using it as a replacement for training and not informing pilots of the flight characteristics changes between the models was stupid.
Although, to be fair, my summary wasn't entirely accurate: it wasn't that the MAX was outright dynamically unstable with no control input, as I described, but rather that it was not stable enough to produce a monotonic increase in stick force as AOA increases, which can make the combined system of pilot plus flight dynamics unstable, since the pilot relies on stick force as an indicator.
> Most aircraft, in a "pitch up attitude" will increase their angle of attack as thrust is applied
This is both incorrect and irrelevant. Most aircraft will climb when power is applied, but will not change their AOA unless the thrust axis is off-center. To a first approximation, power controls climb rate, and stick input changes AOA. Change in behavior under different power settings has little to do with the problem with the MAX. The problem with the MAX is that at high angles of attack - i.e., when the stick is held back, causing the air to meet the wing (and the engines) at a steeper angle - the engines, which are mounted further forward, start producing lift of their own and produce a pitch-up moment. This means that the further the pilot pulls the stick back, the less hard they have to pull. This is a dangerous inversion that increases the control order of the system, as it breaks the usual assumption that a given stick force will result in a given AOA, more or less.
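In rough terms, the stick-force argument above can be sketched as follows (the notation is mine, not from any certification document):

```latex
% Normally, the pull force F needed to hold an angle of attack \alpha
% rises monotonically:
\[
  \frac{dF}{d\alpha} > 0 .
\]
% On the MAX, the forward-mounted nacelles generate extra lift at high
% \alpha, adding a pitch-up moment M_n(\alpha) that effectively reduces
% the force the pilot must hold:
\[
  F_{\text{eff}}(\alpha) = F_{\text{aero}}(\alpha) - k\,M_n(\alpha),
  \qquad \frac{dM_n}{d\alpha} > 0 .
\]
% If M_n grows quickly enough at high \alpha, the gradient
% dF_eff/d\alpha flattens or goes negative, breaking the force-feel
% cue the pilot relies on. MCAS applies nose-down stabilizer trim in
% that regime so the effective gradient stays positive.
```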
Right -- it didn't give the exact same feedback to the pilot that the regular 737 did, which was why MCAS was created. The aircraft is no more or less unstable than a regular 737.
The original 737 does exactly the same thing the Max does with respect to producing a pitch-up moment -- as does nearly every other aircraft. It's just not nearly as pronounced as the Max is.
I'm sorry, none of that is correct. Did you read my link? The aircraft doesn't meet FAA regulations without the MCAS.
>The Boeing 737 MAX MCAS system is there ONLY to meet the FAA longitudinal stability requirements as specified in FAR Section 25.173, and in particular part (c) which mandates the "stick force vs speed curve", and also FAR Section 25.203 — "Stall characteristics".
A doctor refusing to do a procedure because he worries for his patient is seen as a good guy doing what is right, he is in his opinion saving a life. In addition he is trying to avoid the massive cost of a medical malpractice suit.
A software developer refusing a job because it does not meet his ethical parameters is just an unemployed software engineer.
I think one of the issues is that the domain of software is vast. One developer may be working on a basic CRUD app while the next is working on safety-critical code for a vessel going to Mars.
There are definitely areas where the prudent thing for a developer to do is raise a dissenting opinion, if not halt work. What seems lacking are clear industry consensus standards to back up that decision.
If a doctor says no, it's because of legal liability and risk to licence.
For the same reason, it's harder to hire some rando budget doctor, because the field is gatekept by the requirement of a licence, and by liability.
You can't magically make engineering the same without the same conditions. Either add barriers to entry that see my pay rise, or keep cheap programmers with no liability.
Engineering already has these barriers, they just aren’t required or enforced.
I’ve never been on a project that requires a software product stamped by a licensed engineer. NCEES dropped the software license because there was so little demand, compared to, say, civil engineers who consider a license a rite of passage to career growth.
You need a licence to practise medicine; if there is something called a "license" for engineering but it isn't required for practice, then it's not the same. If we have the same barriers but they're not enforced, then we don't have the same barriers.
That’s not quite right. You actually do need a state license in the US to practice engineering for the public except for a few basic instances:
1) you work for the federal government
2) you work under a licensed engineer
3) you work under an industrial exemption
There are differences depending on state. There has been a more concerted effort to remove #3 recently, due both to political reasons and to the technical issues in this thread. Most people performing engineering work under an industrial exemption don't actually realize it. Again, this differs state to state. For example, in some states you can't start a business with "engineering" in the name unless a certain percentage of owners/principals hold an engineering license.[1]
What actually happens is a conflation of terms in common parlance. "Engineer" and "engineer" are not necessarily the same. For example, a computer engineer may work under an industrial exemption (due to working in a manufacturing setting) while a software engineer does not. Legally, an "Engineer" claims an explicit responsibility to public safety.[2] Apropos of the headline article, there is a distinction with this difference.
From my experience, software engineers don't rotate nearly as often in traditional cyclical and defense businesses (auto, aerospace) as they do in the FAANG technology sector. There are a lot of grey beards who weren't so grey when they started.
That's completely the wrong comparison and context.
The reason MCAS came about was because management wanted to try to fudge a larger engine into an outdated design created for a different purpose rather than do the engineering and certification necessary for the new requirements and to update the system.
Management wanted to save money. Of course the engineering leadership did not want to fudge something; they wanted to do proper engineering. But the people in charge just wanted to save money, and the engineering leadership could do nothing otherwise, even though they knew that just making the engine larger and compensating did not make sense from an engineering standpoint.
By the time it got to the MCAS, that was far down the line of the decision to not do proper engineering.
Which demonstrates the irrelevancy of a Hippocratic oath for engineers: the complexity of the system is such that no single engineer understands it all, and thus no single engineer is (or feels) responsible.
Blaming it on management is also irrelevant, since management merely takes the advice of engineers and makes financial/business trade-offs to maximize profit. If the engineers cannot tell that the MCAS system could fail this way (due to the complexity), management will not question it.
If the engineering leadership said "this is not a good idea" and management said "it saves money," then engineering failed to convince management, and management didn't have the understanding that it was a bad idea.
Is it now no one's fault?
If management merely takes the advice of engineers (and others who specialize in the things that they do not), and they choose to ignore it because they do not understand the things they do not specialize in, then I believe it's reasonable to assume that management is more at fault than engineering (though I'm not sure there's a situation here where any party is faultless).
A Hippocratic oath for engineers is irrelevant, correct. But management does not take the advice of engineers. As I said, the engineers wanted to do proper engineering, but management wanted to save a buck, so they instructed engineering to fudge a cheap solution.
I'm not saying engineers taking an oath helps; it's about the executives. Maybe something about the thread structure implies that, but I actually only read the comment above.