This is a common lay-person approach to "strengthening" cryptography, but it is not as effective as simply using a stronger algorithm.
For example, SHA256 and SHA512 require approximately the same number of "cycles per byte" when using modern CPU instruction sets. So 2× SHA256 is 2× slower than 1× SHA512. On some processor models, 1× SHA512 is faster than 1× SHA256!
Of course, things aren't always this simple, but you get the idea.
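To see this on your own machine, here's a rough timing sketch using Python's hashlib. The numbers will vary by CPU model and whether hardware SHA extensions are available; treat it as an illustration, not a benchmark:

```python
import hashlib
import time

data = b"x" * (16 * 1024 * 1024)  # 16 MiB of input

def time_hash(name, rounds=5):
    # Best-of-N wall-clock time for one pass over `data`
    best = float("inf")
    for _ in range(rounds):
        start = time.perf_counter()
        hashlib.new(name, data).digest()
        best = min(best, time.perf_counter() - start)
    return best

t256 = time_hash("sha256")
t512 = time_hash("sha512")
print(f"sha256: {t256:.4f}s  sha512: {t512:.4f}s")
# Hashing with SHA-256 twice costs roughly 2 * t256,
# which is rarely cheaper than a single SHA-512 pass.
```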
Similarly, for some classes of attacks, cracking 2× SHA256 is only 2× as much work as cracking 1× SHA256. For those same attacks, cracking SHA512 is (2^256)× more work, which takes the difficulty increase from "slightly more" to "absolutely impossible in this physical universe".
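The arithmetic behind that comparison, using brute-force preimage search as the (deliberately simplified) attack model:

```python
# Brute-force preimage cost, measured in hash evaluations (illustrative model).
cost_sha256 = 2**256             # one SHA-256
cost_double_sha256 = 2 * 2**256  # chaining at most doubles the work per guess
cost_sha512 = 2**512             # one SHA-512

# Chaining buys a factor of 2; the bigger hash buys a factor of 2^256.
print(cost_double_sha256 // cost_sha256)  # 2
print(cost_sha512 // cost_sha256)         # 2**256
```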
If the hash function does its job of mixing its input, this is no different from applying just the final hash.
(With the technical caveat that doing at least two serial hashes does improve the properties of the hash function by defending against length extension attacks. That's why things like HMAC do multiple chained hashes. But it does zilch for collision or preimage resistance.)
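A minimal sketch of that caveat, putting the "hash the hash" construction next to Python's hmac module (the key and message here are made up for illustration):

```python
import hashlib
import hmac

key = b"secret-key"   # hypothetical key
msg = b"hello world"  # hypothetical message

# Plain hash of key || msg: vulnerable to length extension if used as a MAC.
naive = hashlib.sha256(key + msg).hexdigest()

# "Hash the hash": the outer hash hides the inner chaining state,
# so an attacker can no longer extend the message.
double = hashlib.sha256(hashlib.sha256(key + msg).digest()).hexdigest()

# HMAC is the standard construction: two chained hashes with derived keys.
mac = hmac.new(key, msg, hashlib.sha256).hexdigest()

print(naive, double, mac, sep="\n")
```

Note that both `double` and `mac` defeat length extension, but neither has better collision resistance than a single SHA-256.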
    message + hash1(message) + hash2(message)
    message + hash1(hash1(message))
To my lay understanding, it would provide multiple chained validation steps, but I’m curious if there are any obvious flaws with this model.
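Those two constructions could be sketched like this, with hash1 and hash2 chosen arbitrarily as SHA-256 and SHA-512 for illustration:

```python
import hashlib

def hash1(b: bytes) -> bytes:
    return hashlib.sha256(b).digest()

def hash2(b: bytes) -> bytes:
    return hashlib.sha512(b).digest()

message = b"some message"

# Variant 1: append two independent hashes of the message
tagged_pair = message + hash1(message) + hash2(message)

# Variant 2: append a nested hash of the message
tagged_nested = message + hash1(hash1(message))

# Both only verify integrity, not authenticity: anyone who alters the
# message can simply recompute the appended hashes, so neither variant
# is stronger than the strongest single hash it uses.
```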