Once again you are talking about programmers in general and not security researchers.
Really, have a look at what it takes to write binary code, let alone reverse engineer complicated code that somebody else wrote.
I have had a look. I’ve also solved some simple crackmes and such. I’m definitely not competent, but to find a well-hidden backdoor you have to examine behavior, which requires certain skills, and then you have to look at the executable code. Having the source is good, of course, but less so if the backdoor is deliberately made to look like normal code.
I agree with Linus’ statement though:
I think I’m mistaken on that attribution; OpenBSD’s Theo de Raadt is more likely the author.
I stand by my opinion that it’s a bad look for a privacy- and security-oriented piece of software to restrict non-“experts” from inspecting the very code that is supposed to ensure that privacy and security.
Yes, I agree that it’s better when the source is available. But if you overvalue the effect, the result can be worse. Take Linux again: plenty of people use thousands of pieces of FOSS software and trust the resulting system far more than Windows. If the level of caution were exactly the same, one could say Linux is safer. But we know that people sometimes do things on Linux they wouldn’t do on Windows, because they overvalue the effect of it being FOSS. It’s FOSS, but you’d still better not store ten years of home video unencrypted on a laptop you carry around, that sort of thing.
Yes, because they constitute a significant portion of the eyes traditionally involved in verifying software. You can let a potentially cherry-picked group of researchers do the verification on behalf of the user base, but that hinges on a “trust me bro” basis. I appreciate that you’ve looked into the process in practice, but please understand that these pieces of software are anything but simple. Also, if a state actor were to deliberately implant an exploit, it wouldn’t necessarily be obvious at all, even with the source code available; they are state-backed security researchers at the top of their game themselves. Even higher-tier consumer-grade malware won’t execute in a virtualized environment, precisely to avoid being detected. It won’t compromise a system when unnecessary, and an exploit might only be triggered when absolutely required, again to avoid suspicion.
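To be fair, there is one verification step that doesn’t require expert eyes at all: checking that the artifact you downloaded matches a published digest, so at least everyone is auditing the same bytes. A minimal sketch in Python (the file and digest here are made up for illustration; and note this only proves you got the same bytes as everyone else, not that those bytes are benign):

```python
import hashlib

def sha256_of(path, chunk=1 << 16):
    """Stream a file in chunks and return its SHA-256 hex digest."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        while blob := f.read(chunk):
            h.update(blob)
    return h.hexdigest()

# Hypothetical example: create a small file, then verify it against
# a digest computed independently (standing in for a published one).
with open("demo.bin", "wb") as f:
    f.write(b"hello")

expected = hashlib.sha256(b"hello").hexdigest()
print(sha256_of("demo.bin") == expected)  # → True
```

A backdoored release would of course ship with a matching digest for its own bytes, which is exactly why the deeper verification still has to happen somewhere, by someone.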
I fully agree with the last paragraph though, and believe there is an overreliance on digital systems overall. With FOSS, you have to rely on many, many different contributors handling maintenance, packaging, and distribution in good faith; and sometimes all it takes is one compromised package for the whole system to be compromised. Even so, I’m more comfortable knowing that the majority of the software I run on my machines is open source than relying on a single entity like Microsoft, which has an abysmal track record with respect to privacy, and which operates in the dark. Of course you could block access to Microsoft’s servers with network filtering, but it’s not just that aspect; it’s also not having to deal with Microsoft’s increasingly restricted experience, which primarily serves their perverse dark patterns. I do believe people should handle sensitive files with care. For instance: put Tails on a live USB, keep it off the internet, put the files on an encrypted drive, physically disconnect the drive, and store it somewhere safe.