Brendan wrote:"Can be reviewed by the public" is irrelevant when the public can't understand the source code in the first place, and if they could they've got better things to do than waste several years verifying a version of something when a new version is released each month.
Replace "the public" with "a vastly larger group of security professionals, from all the companies that rely on the software rather than just the one that produced it." Keeping the software independent of any one country is also important.
Brendan wrote: Now; read this and think about it for a while.
That's never been a real threat in practice. It's a very useful thought experiment, but there's no realistic way for something as heavily relied upon and code-reviewed as GCC, Clang, or the Linux kernel to quietly gain the ability to recognize its own source and re-inject a backdoor into the binaries it produces without someone noticing.
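For anyone who hasn't read it: the scenario being discussed (assuming the link is the classic self-propagating compiler backdoor) boils down to a build tool that pattern-matches the source it's handed and splices a payload in. Here's a deliberately toy sketch of just that pattern; the marker string, the fake compile() step, and the "login" source are all invented for illustration, and no real compiler works like this:

[code]
/*
 * Toy sketch of the compiler-backdoor thought experiment discussed above.
 * Everything here is made up for illustration; no real compiler does this.
 */
#include <stdio.h>
#include <string.h>

/* Crude recognizer: does this source look like the target program? */
static int looks_like_login(const char *src)
{
    return strstr(src, "check_password(") != NULL;  /* hypothetical marker */
}

/* The line the trojaned "compiler" splices in. */
static const char *backdoor =
    "\n    if (strcmp(pw, \"letmein\") == 0) return 1;  /* injected */";

/* Stand-in for the real compiler back end: just show what would be built. */
static void compile(const char *src)
{
    printf("--- compiling ---\n%s\n", src);
}

int main(void)
{
    /* Pretend this is the (perfectly clean) source handed to the compiler. */
    const char *src =
        "int check_password(const char *pw) {\n"
        "    return verify_hash(pw, stored_hash);\n"
        "}\n";

    if (looks_like_login(src)) {
        /* Splice the backdoor in right after the function's opening brace. */
        const char *brace = strchr(strstr(src, "check_password("), '{');
        int head = (int)(brace - src) + 1;
        char patched[1024];
        snprintf(patched, sizeof patched, "%.*s%s%s",
                 head, src, backdoor, src + head);
        compile(patched);
    } else {
        compile(src);  /* anything unrecognized is compiled untouched */
    }
    return 0;
}
[/code]

The second half of the thought experiment is that the same recognizer also targets the compiler's own source, re-inserting itself on every rebuild so the published source stays clean. That's exactly why it stays a thought experiment for toolchains like GCC and Clang: they're rebuilt constantly, by different people, with different compilers, which makes that kind of self-propagation very hard to keep hidden.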
The argument that most users won't read or understand the source, and the argument that most users just download binaries anyway, are straw men; they aren't really why open source matters. What matters is that major projects like kernels and encryption libraries have several groups supporting and relying on them that don't necessarily trust each other. That creates much stronger incentives for security than Random Corporation A, which can quietly cave to a government's demands with no good way for outsiders to find out.
This is not to say proprietary software is evil. It's harder to bootstrap open source software when nobody is paying you, or when the project doesn't really fit this model; people do need to be paid for their work somehow. But security does push things toward an open source model: even Apple publishes source for parts of its platform, and outsiders do look at it (although there's much less assurance there that the published source matches the shipped binaries).
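On that last point: the usual way open source projects close the "does the source match the binary" gap is reproducible builds; rebuild the published source with a pinned toolchain and check that the result is bit-for-bit identical to what was shipped. Here's a minimal sketch of just the comparison step (the file names are hypothetical, and in practice the hard part is making the rebuild deterministic, not this diff):

[code]
/*
 * Minimal sketch: compare a locally rebuilt binary against a shipped one,
 * byte for byte.  File names are hypothetical.
 */
#include <stdio.h>

static int files_identical(const char *a, const char *b)
{
    FILE *fa = fopen(a, "rb");
    FILE *fb = fopen(b, "rb");
    int same = (fa && fb);

    while (same) {
        int ca = fgetc(fa);
        int cb = fgetc(fb);
        if (ca != cb) same = 0;             /* mismatch, or one ended early */
        if (ca == EOF || cb == EOF) break;  /* both ended together: still same */
    }
    if (fa) fclose(fa);
    if (fb) fclose(fb);
    return same;
}

int main(void)
{
    const char *shipped = "vendor-release.bin";  /* hypothetical */
    const char *rebuilt = "local-rebuild.bin";   /* hypothetical */

    if (files_identical(shipped, rebuilt))
        printf("binaries match: the shipped build corresponds to the source\n");
    else
        printf("binaries differ: the shipped binary was not reproduced\n");
    return 0;
}
[/code]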