Say I wrote software to control a gamma ray knife. It's perfectly safe: it always does the right thing and shuts down properly when it detects a weird condition.
Compromising it would simply be a matter of changing a few bytes in the executable, or replacing the executable with another one.
This seems so obvious to me that I think you may have non-standard definitions of either safety or security.
That AVR can still be manipulated. If your definition of safety includes preventing in-person attacks on the data storage, then you pretty much need armed guards.
If that's the standard, then no wonder "software safety is near non-existent".
Ah, there's the non-standard definition. Safety means that the system performs as designed while the design invariants hold. Security means someone malicious can't change the invariants.
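To make the distinction concrete, here's a minimal sketch (hypothetical names, not any real device's firmware): the dose check is a *safety* mechanism, enforcing a design invariant by failing safe; the checksum is a *security* mechanism, detecting that someone changed a few bytes in the executable.

```python
import hashlib

class GammaKnifeController:
    """Toy model. Safety: behave correctly while design invariants hold.
    Security: detect tampering with the code that defines those invariants."""

    MAX_DOSE = 10.0  # design invariant: never exceed this dose

    def __init__(self, firmware: bytes):
        self.firmware = firmware
        # Security: record a digest of the "executable" so later
        # byte-level tampering can be detected.
        self.expected_digest = hashlib.sha256(firmware).hexdigest()
        self.running = True

    def deliver(self, dose: float) -> str:
        # Safety: on any weird condition, shut down rather than proceed.
        if not self.running:
            return "offline"
        if dose < 0 or dose > self.MAX_DOSE:
            self.running = False  # fail safe
            return "shutdown"
        return "ok"

    def verify_integrity(self) -> bool:
        # Security: has someone changed bytes in the executable?
        return hashlib.sha256(self.firmware).hexdigest() == self.expected_digest

ctrl = GammaKnifeController(b"\x00\x01\x02")
print(ctrl.deliver(5.0))         # within invariants -> "ok"
print(ctrl.deliver(99.0))        # invariant violated -> "shutdown"
print(ctrl.verify_integrity())   # True: untampered
ctrl.firmware = b"\xff\x01\x02"  # attacker flips a byte
print(ctrl.verify_integrity())   # False: tampering detected
```

The point of the thread stands in the sketch too: the safety logic is only as good as the bytes it's compiled into, and nothing in software alone stops an in-person attacker from rewriting them.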
That's not what it's about. If someone calls you "non-standard", you challenge them to identify those standards. If you call me wrong, at least give it hands and feets.
> If you call me wrong, at least give it hands and feets.
\|/          \|/
  \          /
   You're wrong!
      |    |
     ^^^  ^^^
Sorry, couldn't help myself. There's an obscure Polish joke it made me think of (punchline being, thankfully you didn't ask for it to "hold its shit together").