Have you ever worked with large-scale, historically grown systems? More likely than not, the people who actively made the wrong decisions (or who, through inaction, allowed an initially working system to degrade) are long gone.
Dilution of responsibility is a real problem, but harsh punishment of everybody who, under some (your?) metric, made an error is definitely not the solution.
If you're truly interested in this (and not just angry and looking for somebody to take it out on, even though the root cause here is probably systemic, which is not the same thing as "inevitable"):
The aviation industry, with its culture of failure analysis and prevention, is a great place to start, in my view. Our industry has a lot to learn from it.
"Large scale, historically grown systems" get like that because there wasn't sufficient accountability in the first place. Things like this don't happen in the aviation industry because there are regulatory bodies holding people accountable, something almost entirely missing from the software industry. That was the original point made in this thread.
> Things like this don't happen in the aviation industry because there are regulatory bodies holding people accountable
It's important to note that this has been largely diluted in the US via regulatory capture, leading to things like the 737 Max disasters, so looking at the aviation industry of a few decades ago is probably the way to go.