
>Engineers are almost never to blame in these situations

I strongly disagree with this. Engineering is a profession of public trust, even when operating under an industrial exemption to circumvent licensure. We need to hold ourselves accountable to those standards, or else it risks becoming a vocation devoid of responsibility and accountability, much like the C-suite positions that people rail against in this thread. I don't think society would cut a doctor or lawyer slack for 'just doing as they're told', and I think engineers occupy the same area of public trust.

Good organizations should have dissenting opinion processes. The advantage is that it gives a way to hold management accountable (e.g., they must formally acknowledge that risk). Sure, using that process may come with professional risk, but that should come with the territory of a position of public trust.



No engineer had the power to veto this move by the company, just like no engineer had veto power over the Challenger launch or any of the other numerous tragedies we all discuss in our ethics classes. Attempts were made to prevent the tragedies, but nobody in charge listened. The fault lies with the person who has final authority over the decision.

If I make a mistake which is discovered, then as long as I attempt to correct the problem, I'm absolved of fault. A miscalculation is an accident. Knowingly deploying a device that is faulty due to a miscalculation is malfeasance. I may have made the calculation, but I never made the decision to let it kill people.


I think where we disagree is the level of responsibility for "final authority". If an engineer knowingly designs a failure, they should be held responsible as should the management. I don't think those two responsibilities of engineers and management are mutually exclusive.

I don't know how NASA handled dissenting opinions in the Challenger days, but my understanding is they have grown into more of a risk-informed decision making culture. I do think they now have engineers who can stop work and force management to formally accept that risk. I don't know if Boeing has similar processes.

This isn't a miscalculation scenario. This is knowingly not following their own processes. At least according to the hazard analysis reported by the Seattle Times, their own processes said there should have been a redundant sensor by default. Not as optional equipment, but as the default configuration. The engineer may not have the final say, but they do have a responsibility to drive that risk decision. If they know they are doing something dangerous or unethical, in my (perhaps unpopular) opinion they have a responsibility to voice that or leave. I say that as somebody who has been pressured in these types of decisions in an aerospace domain.

Challenger was somewhat different. Challenger stemmed from a lack of temperature data, meaning they didn't have a good probabilistic rationale for either a GO or NO-GO decision. In hindsight, the safe bet was to wait for conditions that fell within the known data. As I understand the Boeing case, they identified the hazard but didn't follow their own procedures to mitigate it. That isn't really a case of a "known unknown" but rather a "known known" that they didn't mitigate. I do think the engineers are culpable to a certain extent there.

Edit: >I may have made the calculation, but I never made the decision to let it kill people.

I think this may go along with our differing opinions on the standards of the profession. In my view, there is a certain level of expected competence one should be held accountable for. That's why doctors can be sued for negligence even if it was an "honest mistake."


> If they know they are doing something dangerous or unethical, in my (perhaps unpopular) opinion they have a responsibility to voice that or leave.

How do you know they didn't?

They will keep leaving, and management will keep firing people, until such time as they get people who will go along with their decision.

Management has final say.

> Challenger was from a lack of temperature data

No, they had data all right. They had data from previous flights showing O-ring degradation. There were two O-rings, and on at least one flight the seal on the first O-ring was completely breached and erosion had started on the second. NASA's own procedures would not have allowed launches if they were routinely exceeding their safety margins, so those procedures got changed.

On the day of the fateful launch, engineers responsible for the boosters raised concerns due to the low temperatures. Those were escalated to a very high level in NASA. They were ultimately overruled.


>How do you know they didn't?

Maybe they did, but the point I was countering was that they don't have responsibility, not that they have to stay. I.e., I disagree that you can ethically shed responsibility while staying in the job and knowingly working in an unsafe manner. If the end state is that they only hire people who agree, it stands to reason both the engineers and managers share culpability. For those engineers who hold licenses, that becomes a backbone-stiffening measure. Not only do you approve said design, management needs you to in order for the design to be legal.

>They had data from previous flights showing O-ring degradation.

This isn't the same as saying they had data on O-ring reliability at the low launch temperatures of the day of the catastrophic failure. IIRC, the unique condition was that the launch occurred at previously unencountered launch temperatures. "Raising concerns" isn't the same as having incontrovertible evidence; that was the main crux of the decision. There will always be people raising technical concerns on these programs. Without good relevant data, the schedule risk outweighed the unquantifiable technical risk. It's been a while since I read the report, so maybe I'm wrong on this.

IMO, Columbia is a better analogy. It was a known out-of-spec condition they decided not to mitigate because they were lulled into complacency.


The engineer for the O-ring did say no-go (the first time they had done so); management at the company that made the O-ring overruled them.

See: Challenger: A Rush To Launch https://www.youtube.com/watch?v=2FehGJQlOf0


He did his job, but that doesn't mean he had final say. I've read enough interviews to know he carried a heavy weight with him until his death because he felt he should have pushed back more. I think there's some confusion that I'm advocating an engineer must stop all risky actions at any cost. That's not what I'm saying. I'm saying if an engineer doesn't bring up a risk because "the boss doesn't want to hear it," that's willful negligence. Fighting for a position and being overruled is different from meekly rolling over.

If you believe engineering is a public trust profession, you owe it to the public to at least do due diligence. My issue is the people in this thread saying "it's all management's fault" and deflecting any responsibility from the engineers. The engineers are the technical authority for management. We should strive to make sure management understands the technical risk; if they do and proceed anyway, I think the engineers have done their job. That's what I think happened with Challenger. That's different from placating management because you're afraid for your job, or plowing forward knowing a design will put people at risk.

If the bar was to get every program engineer to give a GO, there would never be another launch. There are engineers who don't trust aircraft that have flown for decades in part because we are bad at judging overall systemic risk. NASA has since instituted formal dissenting opinion processes and distinct technical authorities to allow risks to be raised and formally acknowledged without grinding the process to a halt.


"Boisjoly and four other engineers tried desperately to convince management that the launch should be scrubbed. They warned that the cold would cause rubber o-rings to become brittle and fail, allowing hot gases to leak at the joints. Morton Thiokol and NASA managers dismissed the arguments, and decided to go ahead with the launch.

The next day, Challenger lifted off from its pad at Kennedy Space Center. At first, the launch appeared normal, but 73 seconds into flight the shuttle exploded, sending fragments arcing across the bright Florida sky. In view of a horrified public, including thousands of schoolchildren watching a live broadcast, seven astronauts plummeted to their deaths in the Atlantic ocean."

You might want to read the full article:

https://whistleblowing.us/2012/02/remembering-roger-m-boisjo...

As for Congress blaming "the engineers."

Virtually all American whistleblowers lose their jobs and are placed on a banned list. They are branded traitors and conveniently become sexual predators, rapists, and pedophiles. From the ensuing manhunt, you'd think the whistleblower were the head of a terrorist organization. It's straight to prison if they are caught, and a constant looking over their shoulders if they escape the country.

But "management", the guys who actually did something illegal? Life goes on uninterrupted,... retire and become an amazon director.

Knowing the risk involved, how can anyone expect any rational engineer to leak stuff?

If not a single Boeing executive goes to prison for this, then this exercise is a sham, looking for helpless scapegoats to salvage the company's reputation and reduce its loss of plane orders.


>Knowing the risk involved, how can anyone expect any rational engineer to leak stuff?

Because that is the ethical mandate of the profession. "Profession" as in the root of the word: to profess a vow. That consideration doesn't come with the caveat of "as long as it's convenient to your career."

I get that it's not easy. I just don't like how people will give themselves ethical loopholes in a professional realm but not a personal one. As I stated in a different comment, I don't think Challenger is the same as this Boeing case. In any event, NASA has instituted procedures to try to empower engineers to hold managers accountable.


> That consideration doesn’t come with the caveat of “as long as it’s convenient to your career.”

When losing your career potentially means losing your home, losing your health insurance, and putting your family out on the street, it's a lot more than just "an inconvenience" to your career.

What needs to change is that there need to be more stringent protections in place for engineers who blow the whistle on this kind of stuff, protections that prevent companies from effectively destroying the lives of those who come forward.


>it's a lot more than just "an inconvenience" to your career.

Luckily, we're talking about rare cases. Would you extend the same to a doctor who puts a patient at risk because he needs to make a mortgage payment?

Maybe I'm unreasonable, but I think if someone isn't willing to hold an ethical line, they shouldn't be in a career of public trust. There's no shame in pursuing other professions; there should be shame in undermining the public trust because you aren't willing to hold that line.


> In any event, NASA has instituted procedures to try to empower engineers to hold managers accountable

And yet, Columbia happened. Again, concerns from engineers were dismissed.


Yep. And EVA-23 almost became another catastrophe 10 years later.

I'm not claiming they are perfect, but they do strive to be. Columbia prompted more changes, including a completely separate safety technical authority. This is a separate signatory who must approve operational decisions and (in theory) doesn't face the same schedule pressure. I'm still somewhat skeptical whether this will prevent further incidents, because many of them are rooted in humans' inability to understand risk in a statistical manner.


public trust is the same as the honor system is the same as no system.


It depends. I think it can be driven informally or formally. Do you think other public trust professions (doctors, lawyers, judges) are on equally weak foundations?

Informally, the profession has the onus to create a culture of accountability, and each individual has the responsibility to uphold it. One way you do the opposite is for people to claim "engineers are almost never to blame." Maybe this is a third rail to bring up at this moment, but you can see how much culture matters when bad policing surfaces. Would you say "beat cops can't be blamed, it's only the chief's fault"?

Formally, you can lean on licensure requirements. Requiring an official "approval" from a licensed engineer (and the accountability that goes with it) helps ensure the public trust oath isn't just a rubber stamp.



