
Yeah, that works as long as the trusted group aren't themselves coerced into signing targeted malware, or don't choose to do so.



That's really no different to the developers choosing to write malicious code into their own product. A little paranoia about the storage and transport of executable data is understandable, but if you cannot even trust the company's employees not to compromise their own software, then you shouldn't be using their products in the first place.

Where your distrust of keys would have more merit is in the storage and strength of said keys. E.g. if they don't have a strong passphrase and are stored on an NFS/CIFS share, then one could argue that they're no more secure than a bespoke build script.
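
To make that concrete, here's a minimal sketch (Python with the `cryptography` package; the passphrase and filename are just placeholders) of keeping a signing key encrypted at rest instead of leaving the raw key sitting on a share:

    from cryptography.hazmat.primitives import serialization
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # Generate a release-signing key and encrypt it at rest with a passphrase,
    # rather than dropping the plaintext key onto a shared filesystem.
    private_key = ed25519.Ed25519PrivateKey.generate()

    pem = private_key.private_bytes(
        encoding=serialization.Encoding.PEM,
        format=serialization.PrivateFormat.PKCS8,
        encryption_algorithm=serialization.BestAvailableEncryption(
            b"use-a-long-passphrase-here"),
    )

    with open("release-signing-key.pem", "wb") as f:
        f.write(pem)

An HSM or a dedicated signing service is better still; the point is just that a passphrase-less key on an NFS share gives you none of this.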


If we believed this argument, there would be no need for Certificate Transparency, which understands that CAs can create certs for any domain -- they've got the keys to do it -- but mistrusts their signatures by default and trusts them only to the extent that their signatures (update bundles, in the software analogy) are logged and seen by everyone at the same time.

I think we can do the same thing for software. Why not try?
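
A rough sketch of what the client-side check could look like (hypothetical; fetch_log_entries stands in for whatever public, append-only log API you'd actually use, a la Rekor):

    import hashlib

    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric import ed25519

    # Trust a release only if (a) the vendor's signature verifies AND
    # (b) the artifact's hash is visible in a public transparency log,
    # i.e. everyone else is being offered the same bytes.
    def verify_release(artifact: bytes, signature: bytes,
                       vendor_key: ed25519.Ed25519PublicKey,
                       fetch_log_entries) -> bool:
        try:
            vendor_key.verify(signature, artifact)
        except InvalidSignature:
            return False
        digest = hashlib.sha256(artifact).hexdigest()
        return digest in fetch_log_entries()  # refuse unlogged builds

The signature alone proves who signed it; the log is what stops a quietly targeted build from being signed and shipped to just one victim.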


I get what you're saying, but my point is that if you're running executable software from a particular company, then you have to trust their staff with regards to deliberate malicious intent. Ergo, if you don't trust the software signers not to deliberately (whether voluntarily or via coercion) sign malicious software, then you equally cannot trust the developers not to write malicious code in the first place, which would then get signed anyway, with or without the certificate holder's knowledge. If one cannot even trust the developers not to inject malware into their own software, then you might as well give up on that company.

Yes, there are tools one can run to protect against the above, e.g. application firewalls, sandboxing, network firewalls, etc. But at some point you have to trust that Microsoft Office / Firefox / your favourite Linux distro / whatever was built honestly from the outset, because software - even with the source code available, in the case of OSS - is far too complex to reliably vet before running in production.

The issues with storage and transport (IPFS, HTTPS, etc.) are a different matter, because those protections are against external attacks rather than corruption within the company itself. This is where software signing might fall short: not because of disreputable people within the business, but because of negligence (e.g. private signing keys not being stored securely, so attackers can inject malware into the software and then sign it themselves).

So I'm not against criticisms of software signing; I just don't agree with your points regarding the motives of the signers. Simply put, if you cannot trust key people within a business to write and release software honestly (negligence aside), then you should not be installing or running their software to begin with.





