That is about getting faster, not more secure. Right?

I mean, I get that it is kind of nice to be able to verify that multiple independent sets of folks can build the same thing and compare results. I'm curious whether there is any analysis of how much this actually helps, though. For instance, it does not guarantee that there are no malicious changes in the codebase, which I would think is far more likely to be the problem.
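For what it's worth, the verification step itself is simple. Here is a minimal sketch in Python, assuming each independent builder hashes their own artifact and the digests are compared afterwards (the file paths are just illustrative):

    import hashlib
    import sys

    def sha256_of(path):
        # Hash the artifact in chunks so large binaries
        # don't have to fit in memory.
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Each builder runs this on their own output; the digests match
    # only if the build is bit-for-bit reproducible.
    if __name__ == "__main__":
        digests = {path: sha256_of(path) for path in sys.argv[1:]}
        for path, digest in digests.items():
            print(digest, path)
        print("MATCH" if len(set(digests.values())) == 1 else "MISMATCH")

Of course, matching digests only prove everyone built the same source; they say nothing about whether that source itself is trustworthy, which is exactly the gap above.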

To that end, the entire browser war is ultimately counter to security. As the vendors add more and more features, there are more and more places for malicious changes to hide: not just in "mistakes," but in features that could be misused. I feel that "deterministic builds" don't stem this much. I'd love to be shown how/why I'm wrong.



Depending on the browser component model, could you hash individual binary components? Over time, "base" components could stabilize in the same manner as long-term-support Linux kernels, with surgical patches for security fixes. I don't know how practical this would be with the component models for Firefox and Chromium.
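As a rough sketch of what that per-component check could look like, assuming a pinned manifest of digests for the stabilized "base" components (the component names and truncated digests below are made up for illustration):

    import hashlib
    import sys

    def sha256_of(path):
        h = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(1 << 20), b""):
                h.update(chunk)
        return h.hexdigest()

    # Pinned digests for "base" components that rarely change; a new
    # build only needs fresh review for components whose hashes differ.
    # (Hypothetical component names and truncated digests.)
    MANIFEST = {
        "libnetwork.so": "ab12...",
        "librender.so": "cd34...",
    }

    def changed_components(build_dir):
        return [name for name, expected in MANIFEST.items()
                if sha256_of(build_dir + "/" + name) != expected]

    if __name__ == "__main__":
        diffs = changed_components(sys.argv[1])
        print("base components unchanged" if not diffs
              else "needs review: " + ", ".join(diffs))

The appeal is the same as with LTS kernels: the bulk of the binary stays pinned, and reviewers only have to look at the components that actually changed.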


Sounds somewhat reasonable, though some long-term-support Linux installations were still vulnerable to Heartbleed.

And especially with how far-reaching some features of modern browsers are, the attack surface is growing rather large.



