Another thing that might have made this hack harder: if SolarWinds were distributed as source code and each client built it themselves, with their own options (though Ken Thompson's old back-doored C-compiler "thought experiment" may not be as much of a thought experiment anymore).
Moreover, achieving the hack was likely costly given the effort involved, and the benefit only appeared once the compromised SolarWinds binary was distributed. You can reduce the payoff of such a hack by not having every information-critical enterprise running the same binary blob.
If SolarWinds were distributed as source to hundreds of companies, many might not bother diffing the source against the previous version, but it seems plausible that at least a few would look, especially given that these are corporations following deployment procedures.
The build process itself could spit out the diffs at the end, for example.
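To make the idea concrete, a pre-build step like this could emit a reviewable diff of the whole source tree against the last release before anything is compiled. This is a minimal sketch, not SolarWinds' actual build system; the function name and tree paths are hypothetical:

```python
import difflib
from pathlib import Path

def source_diff(old_tree: str, new_tree: str) -> list[str]:
    """Return a unified diff of every file that changed between two
    source trees, for review before the build proceeds."""
    old_root, new_root = Path(old_tree), Path(new_tree)
    # Collect relative paths of all files in each tree.
    old_files = {p.relative_to(old_root) for p in old_root.rglob("*") if p.is_file()}
    new_files = {p.relative_to(new_root) for p in new_root.rglob("*") if p.is_file()}
    report: list[str] = []
    for rel in sorted(old_files | new_files, key=str):
        # Missing files diff against an empty list (added/removed files).
        old_lines = (old_root / rel).read_text().splitlines(keepends=True) if rel in old_files else []
        new_lines = (new_root / rel).read_text().splitlines(keepends=True) if rel in new_files else []
        report += difflib.unified_diff(old_lines, new_lines,
                                       fromfile=f"a/{rel}", tofile=f"b/{rel}")
    return report
```

A build script could refuse to proceed (or at least require sign-off) when the returned diff is non-empty, which is the "build process spits out the diffs" idea in comment form.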
Well, you still have a 'tragedy of the commons' situation: if, say, 10,000 users are all counting on at least one of them to inspect the source, but none has a commercial incentive to do so, then nobody does.
How does a user of open source ever know or measure exactly how well the source code has been scrutinized?
I keep getting this "open source doesn't guarantee there's no problem" claim, and I'd agree open source doesn't guarantee that. But it seems clear that the routes which have allowed malware to be pushed automatically are fundamental, guaranteed problems, while malware in source code is a potential problem with an obvious answer.