No engineer had the power to veto this move by the company, just like no engineer had veto power over the Challenger launch or any of the other numerous tragedies we all discuss in our ethics classes. Attempts were made to prevent the tragedies, but nobody in charge listened. The fault lies with the person who has final authority over the decision.
If I make a mistake that is later discovered, then as long as I attempt to correct the problem, I'm absolved of fault. A miscalculation is an accident. Knowingly deploying a faulty device due to a miscalculation is malfeasance. I may have made the calculation, but I never made the decision to let it kill people.
I think where we disagree is the level of responsibility that comes with "final authority". If an engineer knowingly designs in a failure, they should be held responsible, as should management. I don't think the responsibilities of engineers and management are mutually exclusive.
I don't know how NASA handled dissenting opinions in the Challenger days, but my understanding is that they have grown into more of a risk-informed decision-making culture. I do think they now have engineers who can stop work and force management to formally accept that risk. I don't know if Boeing has similar processes.
This isn't a miscalculation scenario. This is knowingly not following their own processes. At least according to the hazard analysis reported by the Seattle Times, their own processes said there should have been a redundant sensor by default. Not as optional equipment, but as the default configuration. The engineer may not have the final say, but they do have a responsibility to drive that risk decision. If they know they are doing something dangerous or unethical, in my (perhaps unpopular) opinion they have a responsibility to voice that or leave. I say that as somebody who has been pressured in these types of decisions in an aerospace domain.
Challenger was somewhat different. Challenger stemmed from a lack of temperature data, meaning they didn't have a good probabilistic rationale for either a GO or NO-GO decision. In hindsight, the safe bet was to wait for conditions that fell within the known data. As I understand the Boeing case, they identified the hazard but didn't follow their own procedures to mitigate it. That isn't really a case of a "known unknown" but rather a "known known" that they didn't mitigate. I do think the engineers are culpable to a certain extent there.
Edit:
>I may have made the calculation, but I never made the decision to let it kill people.
I think this may go along with our differing opinions on the standards of the profession. In my view, there is a certain level of expected competence one should be held accountable for. That’s why doctors can be sued for negligence even if it was an “honest mistake.”
> If they know they are doing something dangerous or unethical, in my (perhaps unpopular) opinion they have a responsibility to voice that or leave.
How do you know they didn't?
They will keep leaving, and management will keep firing people, until such time as they get people who will go along with their decision.
Management has final say.
> Challenger was from a lack of temperature data
No, they had data alright. They had data from previous flights showing O-ring degradation. There were two O-rings, and on at least one flight the seal on the first O-ring was breached completely and erosion had started on the second. NASA's own procedures would not allow a launch if they were routinely exceeding their safety margins, so those procedures got changed.
On the day of the fateful launch, engineers responsible for the boosters raised concerns due to the low temperatures. Those were escalated to a very high level in NASA. They were ultimately overruled.
Maybe they did, but the point I was countering was that they don't have responsibility, not that they have to stay. I.e., I disagree that you can ethically both shed responsibility and keep doing the job knowing it's being done in an unsafe manner. If the end state is that they only hire people who agree, it stands to reason both the engineers and the managers share culpability. For those engineers who hold licenses, that becomes a backbone-stiffening measure: not only do you approve said design, management needs you to approve it in order for the design to be legal.
>They had data from previous flights showing O-ring degradation.
This isn't the same as saying they had data on O-ring reliability at the low temperatures on the day of the catastrophic failure. IIRC, the unique condition was that the launch was occurring at previously unencountered launch temperatures. "Raising concerns" isn't the same as saying you have incontrovertible evidence; that was the main crux of the decision. There will always be people raising technical concerns on these programs. Without good, relevant data, schedule risk outweighed the unquantifiable technical risk. It's been a while since I read the report, so maybe I'm wrong on this.
IMO, Columbia is a better analogy. It was a known out-of-spec condition they decided not to mitigate because they were lulled into complacency.
He did his job, but that doesn't mean he had the final say. I've read enough interviews to know he carried a heavy weight with him until his death because he felt he should have pushed back more. I think there's some confusion that I'm advocating an engineer must stop all risky actions at any cost. That's not what I'm saying. I'm saying that if an engineer doesn't bring up a risk because "the boss doesn't want to hear it," that's willful negligence. Fighting for a position and being overruled is different than meekly rolling over.
If you believe engineering is a public trust profession, you owe it to the public to at least do due diligence. My issue is with the people in this thread saying "it's all management's fault" and deflecting any responsibility from the engineers. The engineers are the technical authority for management. We should strive to make sure management understands the technical risk; if they do and proceed anyway, I think the engineers have done their job. That's what I think happened with Challenger. That's different from placating management because you're afraid for your job, or plowing forward knowing a design will put people at risk.
If the bar were to get every program engineer to give a GO, there would never be another launch. There are engineers who don't trust aircraft that have flown for decades, in part because we are bad at judging overall systemic risk. NASA has since instituted formal dissenting-opinion processes and distinct technical authorities that allow risks to be raised and formally acknowledged without grinding the process to a halt.