It's shocking to me that people who ostensibly know something about software development are comfortable making statements like "it would never be able to leak because Apple can simply write a (bug-free, unexploitable, perfectly secure) set of checks locking it to the device".
It's shocking to me that people who ostensibly know something about software development take Apple at their word in this case.
Apple already has the backdoor.
Are you willing to guarantee that Apple will never lose control over their signing keys, giving whoever acquires them the ability to end-run the security of a locked device and install software that you, the device owner, are sandboxed from inspecting?
> Are you willing to guarantee that Apple will never lose control over their signing keys, giving whoever acquires them the ability to end-run the security of a locked device and install software that you, the device owner, are sandboxed from inspecting?
No. But this doesn't mean I don't think there's ALSO harm in the FBI or any government agency being able to demand companies build tools that expand the use and usability of that backdoor to parties beyond the company holding the key.
It sucks that Apple has a one ring when it comes to iOS security. It's incredibly dangerous if a government can require them to wield that one ring for arbitrary purposes via a contortion of the All Writs Act. And it's just plain stupid for software professionals to base their opinions on a belief that anyone is capable of writing an unexploitable check for device identity.
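To make that last point concrete: a "check locking it to the device" is ultimately just a conditional inside the signed image. Here's a minimal sketch, purely illustrative -- the identifiers, field names, and structure are my assumptions, not Apple's actual code:

    # Hypothetical sketch of a "locked to one device" check -- illustrative only,
    # not Apple's code. All identifiers here are made up.
    import hashlib
    import hmac

    TARGET_DEVICE_ID = "EXAMPLE-DEVICE-ID-0001"  # baked into this one build

    def device_lock_check(reported_device_id: str) -> bool:
        # The entire "lock" reduces to a comparison against an identifier the
        # device reports about itself. Anyone holding the signing key can sign
        # an identical build with this check removed or pointed at a different
        # ID, and any bug in how the ID is read or compared is a bypass.
        return hmac.compare_digest(
            hashlib.sha256(reported_device_id.encode()).digest(),
            hashlib.sha256(TARGET_DEVICE_ID.encode()).digest(),
        )

    def boot(reported_device_id: str) -> None:
        if not device_lock_check(reported_device_id):
            raise SystemExit("refusing to run: wrong device")
        print("booting the unlocked image")  # brute-force tooling runs from here

    if __name__ == "__main__":
        boot("EXAMPLE-DEVICE-ID-0001")

The guarantee rests entirely on the key holder never signing a variant without that conditional, and on the conditional itself being free of bugs -- which is exactly the assumption being criticized.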
The FBI or any other government agency has always been able to demand companies do exactly this.
It was the height of naivety to think otherwise; it's not like we lack historical examples of what happens when a small number of companies make themselves the linchpin of trust and security.
Prior to this event, I had no idea that this generation of programmers seriously thought they could centralize so much information and control into their own hands, and somehow keep it out of the government's hands when they eventually came knocking.
Even if Apple wins this argument, they'll have to keep winning every argument, every network intrusion, every espionage attempt, forever. This particular argument is pointless; the high-value-single-point-of-failure security model is fundamentally flawed.
So we should what? Throw up our hands and not protect anyone who needs a cell phone? Conclude that everyone who isn't running RMS' Lemote Yeelong is fucked, and throw them to the wolves?
It seems obvious to me that we have to take the world as it is; yes, centralization of security is bad. Yes, we should fight to get away from this centralization of power in companies like Apple.
But as it stands now, it's incredibly important to support Apple's fight against this dramatic expansion of the All Writs Act's powers. The fight isn't "pointless", it's the exact opposite -- the security and privacy of hundreds of millions of people in the world, today, rests on the success of fights like these.
How much better it would be if we were all running GNU/Debian Mobile on our OpenPhones is completely irrelevant. That's not the world we live in, and better, open solutions are going to take years, if not decades, to work toward.
We are never going to get to that world if Apple loses fights like these. We already have legislators working to make even the security offered by iOS today, for all its flawed dependence on Apple, illegal. Once these privacy and security rights are gone, that's the new normal, and open, truly securable phones won't even be legal to manufacture in the first place.
In addition to lobbying against the government in this case, we should loudly criticize Apple for putting this backdoor in their design!
Let's invert the use of the "ticking bomb" propaganda: say we've got a phone with data that can prevent an imminent attack. What person is going to say we shouldn't unlock the phone, because it would set a bad precedent? I'm a steadfast believer in the idea that computing devices should be private extensions of one's mind, but I would still say it's stupid to not hack into such a device if it can be done!
If you want a device that guarantees individual privacy against the manufacturer/USG, it has to be designed for such. You can't build an insecure system and then expect an A for effort.
This is not a dramatic expansion of the AWA's power. This is exactly what the AWA is meant to do. The few carve-outs from the AWA's authority go much further -- that's exactly what CALEA is, for example, and it has been upheld by the courts repeatedly.
Your references to the GPL nonsense set up a false dichotomy; all we need is ownership rights over the electronics we buy, like we've had -- even on Apple platforms -- for decades.
Supporting Apple now just ensures that when Apple does fall -- whether it's an expansion of CALEA, or espionage, or just a shift in their business priorities -- we may never know about it, and the impact on privacy/security is likely to be much worse.
If Apple loses this battle, there's no real impact to the risk profile. Apple already had the backdoor. The only change is that we're forced to be honest about that backdoor's existence, and start thinking real hard about how to avoid this kind of centralization of power, and its inevitable use, going forward.
In this case, the corporation is the threat, and corporate statism is probably the worst possible end game.
The state isn't really the problem, though -- centralization of authority is. The DoJ issue is basically moot; this is a lawful request made under public scrutiny and judicial review.
Whereas Apple could just change their business priorities at any point, and never even has to tell us.
The risk footprint is rather limited at this point; that much I trust. Each step in the sequence required to arrive at a compromised version of iOS that behaves the way the FBI wants is a step that increases the risk footprint for Apple and everyone with an iOS device. We could argue about the size of this expansion, but we can all agree it is non-zero. And by definition, trust is lost in proportion to that increased risk footprint.
I think that's an unreasonable burden on any company, and on its users as well. This isn't just limited to Apple: any company signing code is at risk of being asked to apply digital signatures to the code equivalent of malware, and to the free-speech equivalent of falsehood. No.
Your argument is no different than the arguments that claim crypto backdoors can be kept secure. The problem is the existence of the backdoor, not the processes or politics that ostensibly will prevent its abuse.
Apple could ship encrypted, backdoored binaries under an NSL gag order tomorrow (they might not even know it themselves), and we'd never notice, because we can't even introspect the device. In a few years, the federal government could extend CALEA to cover Apple, and there'd be little we could do, because we can't override Apple's control over the software.
The security model is flawed; it requires Apple to fight and win every argument, every battle, every espionage attempt, in our favor, forever. The longer we propagate this security myth that putting absolute trust in the hands of the few is a viable security model, the worse things will be when it fails.
In the meantime, complying with this legal request doesn't meaningfully move the risk needle. The risk already existed. All it does is force Apple to admit that they hold a backdoor -- something they are obviously loath to do, as noted by the US Attorney when she was forced to submit an additional court filing responding to Apple's public, calculated attempt to frame the public debate before even responding to the court.
I disagree with the characterization that there's already a backdoor. The fact that there's something of a black box in Apple's source, build, and signing process -- through which a backdoor could be injected -- is not proof that a backdoor exists.
However, I agree that their security model has a weakness: it requires them to keep fighting against sovereigns, not just the U.S. government, for all time. That's a problem, and I'm sure they're coming to terms with what that means, as are other companies, users, and even governments. Historically Apple has been a closed-hardware company; it's difficult to imagine they'll shed that anytime soon, and if that's true there'll always be something of a black box involved.
But they could still alter the OS and firmware to require the unlock code before any OS or firmware update, and to erase all keys on the phone first if one can't be provided. Short of unknown backdoors, that obviates the current government request that Apple change the software. A law could conceivably prevent them from shipping such an OS or firmware update. So the next step is making the user passcode stronger and its hash algorithm much more computationally expensive. Even if there's a backdoor in the future, getting into the equipment, for friend or foe alike, probably becomes too expensive within any reasonable time frame.
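For a rough sense of scale on that last point, here's a back-of-envelope sketch. The per-guess cost is an assumed number, not Apple's actual key-derivation parameters; the point is only how passcode length and a deliberately slow derivation step multiply brute-force cost:

    # Back-of-envelope sketch: worst-case time to exhaust a passcode space,
    # assuming each guess must pay the cost of a slow key-derivation function.
    # The 80 ms per guess is an assumption for illustration, not a measured figure.
    SECONDS_PER_GUESS = 0.08

    def worst_case_years(alphabet_size: int, length: int,
                         seconds_per_guess: float = SECONDS_PER_GUESS) -> float:
        keyspace = alphabet_size ** length
        return keyspace * seconds_per_guess / (60 * 60 * 24 * 365)

    for description, alphabet, length in [
        ("4-digit PIN", 10, 4),
        ("6-digit PIN", 10, 6),
        ("10-char alphanumeric passphrase", 62, 10),
    ]:
        print(f"{description}: ~{worst_case_years(alphabet, length):.2e} years")

Under those assumptions, a 4-digit PIN falls in minutes even with a slow derivation function, a 6-digit PIN in about a day, while a 10-character alphanumeric passphrase pushes the worst case past a billion years. That's why strengthening the passcode and its derivation cost does more for the user than trusting the update path.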
But if you're stuck on open hardware being the end goal, I'd probably agree with that, even though I think Apple will go to great lengths to avoid that.