Hacker News

That is a totally different issue. One thing is reviewing what's on your store. The other is restricting people to only installing what's on their store.

One is happening on their property. The other is happening on the user's property. They also want to exert control over the latter, which is causing these problems.




It's the difference between, on the one hand, some spyware or adware dropping an add-on into the right location in the Firefox profile directory and then either accepting the confirmation dialogs for side-loading or social-engineering the user into accepting them; and on the other hand, the same spyware having to actually patch the Firefox binary, or exploit it, to get the same behavior, since the binary has signature verification baked in.

It's a big annoyance for me too, because I use private extensions extensively for some business tooling (deployed to our own users), but I had to switch to a different mechanism because I don't want to deal with verification. It sucks for me, but I totally understand why they are doing it.
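For what it's worth, one such mechanism is Firefox's enterprise policies file (policies.json in the installation's distribution directory). A minimal sketch; the extension ID and URL here are placeholders, not real values:

```json
{
  "policies": {
    "ExtensionSettings": {
      "my-private-addon@example.com": {
        "installation_mode": "force_installed",
        "install_url": "https://intranet.example.com/my-private-addon.xpi"
      }
    }
  }
}
```

Note that, as far as I know, release builds still require the XPI to be signed even when force-installed this way, so this automates deployment rather than bypassing verification.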


I am aware of Mozilla's stated rationale (and I disagree with the implementation); I was just pointing out that patrolling an add-on store is orthogonal.

And the decision to not even let users add additional signing root keys is yet another axis of the decision space that was totally neglected.


The problem is that it's essentially just another dialog or control to be scripted or socially engineered around. You can make it more onerous, because it's basically a one-time action that most users will never perform, but to what degree is adding a signing key before install different from a dialog asking whether you really want to install this third-party extension?

Until you have different access levels and can both restrict users/programs from running at the base level required to do this, and condition users to recognize when increased access is being requested, I don't see this problem going away. Windows UAC is basically what we're talking about, and it still took users years to understand it and not just always click Allow (if that's even true!), and that's a system shared across all of Windows.

I think the only sane way to accomplish this that would work for users and not cause enterprise admins to totally shun Firefox would be to piggyback on Windows' certificate store and register certificates for Firefox's use only (I assume this is possible; I'm pretty sure you can do something similar for app-to-app communication, such as MSSQL encrypted connections). But then you also have to handle the same mechanisms on Linux and OS X, and now we're seeing it's a much more complex undertaking.
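If I recall correctly, Firefox did eventually grow a narrow version of this: an enterprise policy that imports roots from the OS certificate store. It only applies to TLS trust, not to add-on signing roots, but it shows the plumbing exists. A sketch of the policies.json fragment:

```json
{
  "policies": {
    "Certificates": {
      "ImportEnterpriseRoots": true
    }
  }
}
```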

Protecting users from their own stupidity is a tough and mostly unsolved problem. Punting and spending their own resources to provide the safest solution they can, even if it's annoying for a more technically literate subset of users, is a solution I can understand and respect, even if it's problematic for me, because it puts the safety and security of the majority of users over a simpler solution that would be worse overall.


UAC is already in place; piggyback on that. Root on Linux, and whatever OS X does to protect application packages.

And I really dislike the "can be socially engineered around" bludgeon. Taken to its final conclusion, you would have to lock systems down and give users no freedom at all, because they could be convinced to do something bad with enough effort.


If taken to its final conclusion

And yet you don't take the "freedom" argument to its final conclusion, which is that you must grant Freedom -1. That's the freedom for anyone, anywhere, to run any software on your hardware, at any time and for any purpose.

And this is only half-joking: all access restrictions, even ones as basic as filesystem permissions, impinge on freedom in some fashion. You can work around them, of course, but you can work around Firefox's extension signing (and Apple's app signing, and lots and lots of other systems that people insist are objectively reducing "freedom"). Which means that to be consistent, you either have to be against even those basic access-control mechanisms, or you have to compromise on absolute "freedom" and begin arguing about how difficult or complex a workaround can be before you personally would rule out allowing a system to require it.


That would go a long way to solving the problem once and for all. It would be nice if Mozilla would provide the ability to install private signing keys to the browser.


I would disagree that this is a different issue; one leads to the other. Mozilla wants to make sure that any add-on the user can possibly install is something they can trust, including add-ons that other applications install (e.g., AV toolbars).

If you don't like that, you can use the unbranded version of Firefox or the Developer Edition.
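Concretely, in those builds (and, if I remember right, in Nightly and ESR as well) signature enforcement can be toggled via a preference; release builds ignore it. A sketch:

```
// about:config — honored only in unbranded / Developer Edition / Nightly / ESR builds;
// release Firefox ignores this preference entirely
xpinstall.signatures.required = false
```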



