
Um, Firefox does the same sorts of things.

A) Want to use unsigned addons to play around, or tweak your existing ones[1]? Nope, even if you allow them in about:config, you can't; you have to use a special development version and update it separately.

B) Want to customize your keyboard shortcuts? Sorry, that's not safe; you have to use a crippled API that won't take effect until a given page loads (a rough sketch of that workaround is at the end of this comment).

C) Want to control your setup? Sorry, their "studies" feature is owned by the marketing team and can make arbitrary changes to the app on the fly, all the way up to its cryptographic infrastructure! (As we learned in the recent add-on mishap [2].)

I use Firefox, but they're not obviously better on these matters.

[1] Like, I don't know, if you made a big deal about underprivileged groups getting into coding, and actually wanted to take the concept seriously rather than just get some photogenic token to write hello-world for a photo op.

[2] https://news.ycombinator.com/item?id=19826827
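
For the curious, here's roughly what the point-B "workaround" looks like: a WebExtension content script that intercepts keys itself. This is a minimal sketch (the shortcut, message name, and background handler are made up for illustration), and its core limitation is exactly the complaint above - it only runs once a page has loaded, and never on about:* or other privileged pages:

    // content-script.ts - hypothetical shortcut remap via a content script.
    // Runs only after a page loads, and never on about:* pages, which is
    // why remapping shortcuts this way always feels half-broken.
    declare const browser: {
      runtime: { sendMessage(msg: unknown): Promise<unknown> };
    };

    document.addEventListener(
      "keydown",
      (e: KeyboardEvent) => {
        // Hypothetical remap: swallow Ctrl+J and let a background script
        // decide what to do with it (whether the browser actually yields
        // its own shortcut to preventDefault varies by shortcut).
        if (e.ctrlKey && !e.shiftKey && e.key === "j") {
          e.preventDefault();
          e.stopPropagation();
          browser.runtime.sendMessage({ cmd: "ctrl-j-pressed" });
        }
      },
      true // capture phase, so we see the event before the page does
    );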



> A) Want to use unsigned addons to play around, or tweak your existing ones[1]? Nope, even if you allow them in about:config, you can't; you have to use a special development version and update it separately.

Signatures were mandated for add-ons because there was a plague of malware (or near-malware, such as third party anti-virus software) silently installing add-ons that broke Firefox in various ways. If the end-user could disable that requirement then the malware could do that too. The only way for Mozilla to address the issue was to hard-code the signature requirements. I don't think asking power-users to use Nightly/"unbranded" versions of Firefox to load unsigned extensions is something you should count against them.
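
For reference, the pref behind this still exists; it's just ignored outside those special builds. In Nightly, Developer Edition, or an unbranded build, opting out is one line in the profile's user.js (or the same toggle in about:config):

    // user.js - honored only by Nightly, Developer Edition, and the
    // unbranded builds; release and beta Firefox ignore this pref.
    user_pref("xpinstall.signatures.required", false);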


Yes, I think that was actually a good choice. It's a bit of an inconvenience for those of us who want to install at our own risk, and we have to get used to a different colored icon, but it's too easy to get a user to go into about:config, change a setting, and then forget about it. With Firefox Developer Edition they at least have to do a big download, install it, and go into about:config, and there's a chance they'll one day notice it and think "I'm not a developer. Why do I have Firefox Developer Edition?"


No, there wasn't. No malware was being spread en masse this way. It was an entirely fictitious threat, as it would have required convincing the user to enter a confusing part of the app (about:config) and correctly change a setting.

I can accept that there might have been a malware problem when signatures weren't the default requirement, but that's not the change I'm talking about. I'm talking about them ignoring a user's explicit preference that they're okay with unsigned add-ons.

Remember, Chrome allows you to turn off this setting by just flipping a switch into dev mode. Where's the torrent of compromised Chrome browsers from this vector?

>I don't think asking power-users to use Nightly/"unbranded" versions of Firefox to load unsigned extensions is something you should count against them.

It is when that version doesn't get the same updates and has to be maintained separately.


Hi, I work on a security team that hunts for malware. Malicious extensions are a huge threat - this totally happens for Chrome in particular, and I've even seen malware package old versions of browsers to get around modern defenses.


Someone distributing a different product to get around your current product's vulnerabilities is outside this threat model.


No it isn't, there isn't any specified threat model anyway.

What we're talking about is whether malicious extensions are something attackers want to use. Having to package an entire browser is a win - it's super noisy and means there's a huge binary to lug around.


>No it isn't, there isn't any specified threat model anyway.

Well, yeah, in the sense that Mozilla people don't really think through what threat model they're protecting against here.

>What we're talking about is whether malicious extensions are something attackers want to use. Having to package an entire browser is a win - it's super noisy and means there's a huge binary to lug around.

The vast majority of that benefit comes from the default requirement for code to be signed, not from the barely measurable fraction of users who knowingly disable this protection and then get pwned.


I think you're missing my point, so let's specify a bit more of a threat model.

Attacker has code execution on your system and wants to maintain persistence and exfiltrate sensitive browser data. Sounds reasonable for Mozilla - at least, it's not totally nuts of them to consider this in their threat model.

One avenue, and a popular one, is to then sideload a malicious extension. An attacker who can disable the extension check can do this easily. An attacker who can't has to resort to other means - packaging a separate payload to host the extension.

Does that sound reasonable? I don't want to argue, just to explain my perspective on this issue based on the attacks I have seen.
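
To make "sideload" concrete: one quick triage check when hunting is to read a profile's extensions.json and flag anything that wasn't installed into the profile itself. Rough sketch in TypeScript (Node) - the field names are from memory, so treat the file layout as an assumption rather than gospel:

    // list-sideloaded.ts - rough triage sketch, not production tooling.
    // Assumes extensions.json has an "addons" array whose entries carry
    // "id" and "location" fields; the layout may have drifted over time.
    import { readFileSync } from "fs";
    import { join } from "path";

    interface AddonRecord {
      id: string;
      location?: string; // e.g. "app-profile" for add-ons the user installed
    }

    function listSideloaded(profileDir: string): AddonRecord[] {
      const raw = readFileSync(join(profileDir, "extensions.json"), "utf8");
      const addons: AddonRecord[] = JSON.parse(raw).addons ?? [];
      // Anything not installed into the profile itself came from a system
      // directory, the Windows registry, or some other sideload path.
      return addons.filter((a) => a.location && a.location !== "app-profile");
    }

    // Usage: node list-sideloaded.js /path/to/firefox/profile
    console.log(listSideloaded(process.argv[2] ?? "."));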


I'm referring to native malware abusing admin privileges to install extensions without the user's consent, not users deliberately installing malware extensions themselves. This was particularly bad in the XP/Vista era (I know I had to remove some rogue extensions from my relatives' computers during that time). If the signature check were a flag, the malware would just disable it at the same time it installed the extension (remember, it has admin privileges, so no user interaction would be required).

Additionally, keep in mind that when the signature requirement was added there were still XUL/XPCOM extensions, which could hook the browser much deeper than Chrome-style extensions and wreak much more havoc.


>I'm referring to native malware abusing admin privileges to install extensions without the user's consent, not users deliberately installing malware extensions themselves.

Ah, so a vector that mandatory code signing doesn't protect against.


Signatures absolutely do protect the user in this scenario. With mandatory signatures you can't get (obvious) malware into the browser without having it first approved by Mozilla (who should reject it upon review).


Okay I misunderstood what you meant by native malware.

But if your threat model is that extensions can be added without the user's consent, then that is the vulnerability you should fix. And it still wouldn't justify blocking a user who is aware of the risk and chooses to disable that layer of default protection.


Firefox leaves a lot to be desired, but is still miles better than Chrome.


Those aren't your only two options.


Might as well be.

Everything else is based on Chrome/Chromium.

Nothing else lets me customize the way Firefox does, either. As a security-conscious person, this matters.


Firefox forks.


Well, there's pretty much only Safari as the remaining option (does it even have a non-Mac version?)


At least when Firefox has a misstep, its scope is a lot smaller.

Taking a swipe at adblocking and obscuring where you're "logged into the browser" are, IMHO, much bigger deals.


You raise very good points. The shortcuts thing has made me miserable for years. Surely the coders of Mozilla appreciate a good set of shortcuts, like in whatever IDE they write their code in!


It kinda drives me nuts that Ctrl+num switches to that tab number on Windows, but on Linux it's Alt+num.


Cmd+num on Mac :)


Didn't Mozilla promise to publish a post-mortem on the add-on thing two weeks ago? Did that ever happen?

Not the CTO post during the issue; the follow-up that was supposed to be more technical.



No, that's the post that _promised_ a post-mortem:

> We’ll be running a formal post-mortem next week and will publish the list of changes we intend to make



