
I think it would be fine to have a code signing certificate ensure that the signer controls a certain DNS name.

I'm fine with "this installer has been signed by somebody who owns imagemagick.org".



Will you notice if the software is signed with a certificate for "imagemaqick.com" or "imagemagik.com" or "imagemagick-developers.com"?
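For what it's worth, the lookalikes above are close enough to the real name that even a plain edit-distance check flags them. A quick sketch (the domain list is from the comment above; the distance function is an ordinary Levenshtein implementation, nothing specific to any real tool):

```python
# Toy check: how close are the typosquats to the real project name?
# Plain dynamic-programming Levenshtein distance, no third-party libraries.

def levenshtein(a: str, b: str) -> int:
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        cur = [i]
        for j, cb in enumerate(b, 1):
            cur.append(min(prev[j] + 1,                   # deletion
                           cur[j - 1] + 1,                # insertion
                           prev[j - 1] + (ca != cb)))     # substitution
        prev = cur
    return prev[-1]

real = "imagemagick"
for squat in ["imagemaqick", "imagemagik", "imagemagick-developers"]:
    print(squat, levenshtein(real, squat))
# imagemaqick and imagemagik are each a single edit away (distance 1);
# imagemagick-developers is distance 11 but contains the real name verbatim.
```

The last case shows why edit distance alone isn't enough: "imagemagick-developers.com" is nowhere near the original string, yet reads as perfectly plausible to a human.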


About as well as I'd notice if the software were signed by:

- Benjamin Olafsson

- Imagemagick Solutions Gmbh

- Imagemaqick LLC

- FutureSoft Inc

It's very common for software to be developed by a company whose name bears no resemblance to the product itself. It's also common for small commercial or open source projects to be signed by an individual developer in their name. Am I going to verify any of these? In practice, no. If I wanted to verify the company name I'd visit the website I downloaded from and check the footer's copyright notice. Unless the website itself is EV validated (almost never) we're back to DV with extra steps.

The only time I've ever looked into the signer was when I downloaded obvious malware from a fake version of GnuCash's website, which I found through a Google ad. The malware was signed with a certificate from a Taiwanese hardware company.


Hopefully Windows will remember that I downloaded the file from imagemagic.com and check that the certificate matches the place I downloaded it from...

Although... As long as downloads are always provided from the official domain via HTTPS, and the OS can keep track of that, I don't really see why the executable itself needs to be signed...
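Windows actually records part of this already: files downloaded by a browser get a `Zone.Identifier` alternate data stream (the "Mark of the Web") that can include the download URL. A sketch of checking that recorded host against an expected domain — the stream text below is a representative example keyed to the domains in this thread, and `download_host` is a hypothetical helper, not an OS API:

```python
# Sketch: parse a Windows Zone.Identifier ("Mark of the Web") stream and
# check whether the recorded download host matches the expected domain.
# On Windows the stream lives at <file>:Zone.Identifier; here we just
# parse its INI-style text content directly.
from configparser import ConfigParser
from urllib.parse import urlparse

ZONE_IDENTIFIER = """\
[ZoneTransfer]
ZoneId=3
HostUrl=https://imagemagic.com/ImageMagick-installer.exe
"""

def download_host(stream_text: str) -> str:
    cp = ConfigParser()
    cp.read_string(stream_text)
    return urlparse(cp["ZoneTransfer"]["HostUrl"]).hostname

host = download_host(ZONE_IDENTIFIER)
print(host)                       # imagemagic.com
print(host == "imagemagick.org")  # False -- the typosquat would be caught
```

Of course this only helps if the OS actually enforces the comparison; today the Mark of the Web mostly just triggers the SmartScreen/"unblock" prompt rather than a domain-vs-certificate check.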


> imagemagick.org

> imagemagic.com

That's a good way to illustrate the GP's point.

> As long as downloads are always provided from the official domain via HTTPS

You are conflating control over the public website with control over the build/signing infrastructure. A good defense-in-depth strategy means that a compromise of one should not lead to an automatic compromise of the other.


Unless they use a CDN with a hostname like imagemagick-14.akamai.net.


Most OS software is downloaded from code repositories like GitHub or FossHub to save on networking costs. Not to mention that CDNs are often used: even when the link is on the software's website, the file itself will often not be "coming from" that website.


> Most OS software is downloaded from code repositories like GitHub or FossHub to save on networking costs

[X] Doubt

Do not underestimate the sheer number of people jamming software names or descriptions into Google and getting their wares from the likes of Softpedia.


Same difference, though: that's the equivalent of a CDN or FossHub. It's definitely not the author of the software.


Yes, but most people aren't. It also significantly reduces the usefulness of code signing for the vast majority. And your justification for that is that it personally wouldn't be a big deal to you, someone who has an abnormal understanding of the technologies at play.


>It also significantly reduces the usefulness of code signing for the vast majority

I'd argue that code signing for the average person has zero utility on Windows, and negative utility on macOS.

I really don't think anybody understands or even cares what a certificate means, and the only practical outcome is that sometimes they get scary messages when the app they're installing didn't pay MS for a license.


macOS by default doesn't run unsigned or incorrectly signed apps, period. Only Apple can hand out certificates and certificates are at the very least associated with payment info (though sometimes they want more, DUNS number or whatever). Signed application bundles remove many attack vectors. The primary remaining vectors are:

1. A malicious entity can sign up for a developer account.

2. A non-malicious entity's certificate can be compromised.

(1) does not seem to happen often; if it does, Apple can revoke the certificate. They can also increase the burden of proof for creating a developer account if it becomes more common.

(2) happens occasionally, and Apple can revoke the key. But in general there is a strong incentive for developers to properly protect their signing keys, because Apple could ban them in the case of repeated issues.

Code signing substantially increases platform security, and as a 16-year macOS user I would not want to go back to pre-signing days, when you could never be sure whether an application bundle was compromised unless you verified the archive/disk image with GnuPG. But that opens a big can of worms (WoT, etc.).


> macOS by default doesn't run unsigned or incorrectly signed apps, period.

It kind of depends on what you mean by "default", but you can always right-click an app in Finder and select "Open". When you get the scary "unsigned app" pop-up, there will be an extra option to run it anyway, allowing you to run unsigned binaries without making any settings changes to the OS.

That said, I largely agree with the rest of your comment. I do think, as a developer, their stapling stuff is way more onerous than the plain code signing. It basically puts Apple in the position to reject your app in the same way they reject apps in the store, even though there is no store involved.
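As a toy model, the Gatekeeper behavior described in this subthread boils down to a very small decision table (pure illustration; the real policy lives in Gatekeeper/spctl and also covers notarization, quarantine flags, and revocation):

```python
# Toy model of the launch decision described above: signed apps run,
# unsigned apps are blocked on a normal double-click launch, but the
# right-click -> "Open" override lets the user run them anyway without
# changing any OS settings.
def gatekeeper_allows(signed: bool, user_override: bool = False) -> bool:
    if signed:
        return True
    return user_override  # right-click -> Open in Finder

print(gatekeeper_allows(signed=True))                       # True
print(gatekeeper_allows(signed=False))                      # False
print(gatekeeper_allows(signed=False, user_override=True))  # True
```

The point of the model is the third line: "doesn't run unsigned apps, period" and "you can always run it anyway" are both true, depending on whether you count the per-app override as a settings change.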


You can only do that if you are an admin on your system or enter admin credentials. I don't know, is that still the default?


Maybe? It has never asked for credentials when I do that, but I’ve also never not been an admin on my computer (which is default), so I don’t know.


Why is it any less useful than some entity name? You can be pretty sure that google.com is controlled by Google, and if the domain on the app is g00gle.ru, that's going to fool exactly the same people as if the scammer's company name was Googel.


Because domain names are cheap and can be purchased in bulk.

TLS certificates are for encryption. You don't care if the endpoint is "bad" in some way, only that you are communicating with it securely.

Code signing certificates are to make bans stick. Totally different purpose.


Companies aren't cheap, but they aren't exactly expensive either. A couple of weeks ago I registered a company in Estonia; it only cost me the €265 state fee. A code signing certificate is another, what, €500 on top of that? Certainly more expensive than a $10 domain with a free certificate, but still a reasonable cost for e.g. a targeted attack.

There's another catch: you either have to register the company in your own name, or find somebody to own it for you. I don't think the latter would be a big problem though: there has been a lot of news about shady fintech startups in the Baltics lately, and many of them were in fact registered in the names of random people looking for some quick cash.

Now, if I see something like MicroSoft-Inc OÜ (EE) in the app signature, I would probably get a bit suspicious. But if it's a less known brand? Who knows!


Sure, but once you start registering companies to run your shady schemes, you're in the process of transforming from a bad bad actor to a legitimate bad actor. Stop trying to steal bank credentials, and switch your spyware to pulling things that help to target ads, "optimize for engagement", or "streamline business", and suddenly the entire system starts working for you. You can then work in the clear, and use all the new security tools - like HSTS, DoH, certificate pinning, and code signing - to prevent your victims from protecting themselves from you.


Yeah and now that person who you paid to register it in your name rats you out to the police. Also, there are a lot of laws you can end up breaking with severe penalties in the course of trying to hide your identity for company registration purposes (it enters the world of anti-money laundering).

It's harder than it sounds, which is why malware authors prefer to steal keys than set up fake companies. Hence the new hardware requirements.


> Yeah and now that person who you paid to register it in your name rats you out to the police.

But they don't know who you are, because you're a scammer who is lying to everyone including them.

> Also, there are a lot of laws you can end up breaking with severe penalties in the course of trying to hide your identity for company registration purposes (it enters the world of anti-money laundering).

So are credit card fraud and CFAA violations, which are the things this is ostensibly meant to prevent. Criminals don't follow laws.

> It's harder than it sounds, which is why malware authors prefer to steal keys than set up fake companies. Hence the new hardware requirements.

It's not actually that hard, it's just that stealing keys is really easy. And it's not obvious that the new hardware requirements are going to do any good, because that type of consumer hardware has been consistently riddled with vulnerabilities -- Intel essentially gave up on SGX because they couldn't make it work.


So you're going to find these people anonymously, now? How? You'll have to ask a lot of people before someone agrees to actually set up a fake company for you, you'll also have to find ways to pay them without losing your anonymity, and they will have to explain at some point to the tax authority where this unexpected income came from and what the company they've set up actually does.

If it was that easy to hide your tracks police would never catch criminals, but they do. Every time you increase the complexity of the scheme the chance for mistakes goes up.

> that type of consumer hardware has been consistently riddled with vulnerabilities

USB signing devices aren't really consumer hardware, are they? I don't recall vulns in HSMs being a major source of leaks previously, but I'm sure the game will move there sooner or later.

> Intel essentially gave up on SGX because they couldn't make it work

Intel are selling SGX today, have built new features on top of it, it works fine and all their competitors have been investing heavily into catching up.



