If it is malice - the order for Apple to help should be reversed. If it is incompetence then that bolsters the argument that any backdoor will lead to leaks and unauthorized access to the backdoor, causing more harm to everyone.
Indeed. The best way to not have information stolen is to not have it in the first place, and in particular not to leave it with someone proven to be incapable of securing it. http://www.softwar.net/stupid.html
>If it is incompetence then that bolsters the argument that any backdoor will lead to leaks and unauthorized access to the backdoor, causing more harm to everyone.
The backdoor the FBI wants to exploit cannot be leaked. They are specifically requesting it be limited to the specific phone.
If it were possible to take an apple update, edit it, and then apply to other phones, they wouldn't need Apple's help.
If you want to talk about "enablers of authoritarians", consider that Apple has built a system in which they are the sole authority, that is opaque to all introspection by third parties, and against which Apple -- and only Apple -- must protect their users against all comers, for all time.
In addition to which, our security relies on Apple themselves never changing their business priorities and choosing to exploit their position of absolute authority.
The fact that they have that authority is why the government can compel them to do anything in the first place.
If you want to talk technical specifics, then no, your assessment is incorrect. The Apple ID password was changed, but that doesn't affect the on-device keying.
PINs are the weakest link in the iPhone crypto chain. Apple strengthened that link through non-cryptographic means: tamper-resistant key derivation software that runs either on the main CPU or, in later devices, on the secure enclave CPU.
That software enforces a limited number of retries, essentially strengthening the PIN. However, Apple also retained the ability to subvert the owner's lock on the device and install new key derivation code that does not include those security features; this applies both to the 5c and to later devices with a secure enclave.
If Apple hadn't retained that backdoor, the FBI would have nothing to ask for. Apple has, however, and has consistently made themselves the sole authority and gatekeeper of these devices.
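For concreteness, the retry enforcement I'm describing amounts to something like the following (a toy Python sketch, not Apple's code; the 10-attempt erase and the escalating delays reflect the documented behavior, while the key derivation, salt, and bookkeeping here are invented for illustration):

    import hashlib, hmac, time

    MAX_ATTEMPTS = 10                      # iOS erases the data keys after 10 failures (if enabled)
    DELAYS = [0, 0, 0, 0, 60, 300, 900, 900, 900, 3600]   # escalating delays in seconds (illustrative)

    def derive_key(pin: str) -> bytes:
        # Stand-in for the real derivation, which is entangled with a device-unique
        # hardware key and deliberately slow. The salt here is a placeholder.
        return hashlib.pbkdf2_hmac("sha256", pin.encode(), b"device-unique-salt", 100_000)

    def try_unlock(pin: str, stored_verifier: bytes, state: dict) -> bool:
        if state["failures"] >= MAX_ATTEMPTS:
            state["wiped"] = True          # the real device discards its keys; the data is gone
            raise RuntimeError("data keys erased")
        time.sleep(DELAYS[state["failures"]])             # rate-limit guessing
        if hmac.compare_digest(derive_key(pin), stored_verifier):
            state["failures"] = 0
            return True                                   # unlocks the class keys
        state["failures"] += 1
        return False

The FBI's request is, in effect, a signed build in which the wipe check and the sleep are removed.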
Thank you for explaining. I'm still not clear on what the FBI's major malfunction is.
How would relaxing the tamper-resistant key protection help here? One needs the PIN to reimage the device. Chicken & egg. Creating a one-off OS image can't help without first having the PIN.
And the goal is to get the data, not crack the phone. Why can't the FBI use the backups? And what do they hope to find that they don't already know (by other means)?
Just sounds like CYA to me. The more I learn about this silliness, the less plausible the FBI's narrative becomes. The FBI screwed up and is now just looking for a scapegoat.
---
Authoritarian already has a widely recognized definition.
> How would relaxing the tamper-resistant key protection help here? One needs the PIN to reimage the device. Chicken & egg. Creating a one-off OS image can't help without first having the PIN.
The weak link is that a PIN can be cracked very quickly, in hours or days. The search space just isn't very large.
The only thing preventing the FBI from doing so is the Apple-signed iOS code that erases data keys after too many unsuccessful retries.
So, if Apple uses their privileged backdoor to disable that check, the FBI can brute force the encryption key by trying as many PIN combinations as they like.
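To put rough numbers on "hours or days" (a back-of-the-envelope sketch; the ~80 ms per guess is the figure Apple cites for the hardware-entangled key derivation, the rest is arithmetic):

    # Brute-force time once the retry limit and delays are gone, assuming ~80 ms
    # per guess for the hardware-entangled key derivation.
    SECONDS_PER_GUESS = 0.08

    for digits in (4, 6):
        keyspace = 10 ** digits
        worst_case = keyspace * SECONDS_PER_GUESS        # seconds
        print(f"{digits}-digit PIN: {keyspace:,} candidates, worst case "
              f"{worst_case / 3600:.1f} h ({worst_case / 86400:.2f} days)")

    # 4-digit PIN: 10,000 candidates, worst case 0.2 h (0.01 days)
    # 6-digit PIN: 1,000,000 candidates, worst case 22.2 h (0.93 days)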
In effect, this means Apple already has the cryptographic backdoor necessary to own any PIN-protected iPhone in the world.
That's small potatoes, though -- they can also install new software on locked devices, and push modified updates to applications distributed through the AppStore. After all, apps are re-signed with Apple's signing key, discarding the original software authors' signatures.
When you factor in bitcode (in which Apple compiles the actual binaries server-side), application authors can't even verify that distributed binaries match what they uploaded, and the use of a relatively high-level bitcode allows Apple to much more easily patch/rewrite significant portions of the application.
In other words, Apple built a system in which they have almost absolute authority over every iPhone, and due to strict platform DRM, there's almost zero transparency into their use of it.
> Authoritarian already has a widely recognized definition.
"adj. Characterized by or favoring absolute obedience to authority, as against individual freedom: an authoritarian regime."
Can you install software on your iPhone that pre-empts Apple's authority over the device?
Can you install software without Apple's approval?
Can you prevent Apple from installing whatever software they like on your iPhone, including software that implements CALEA-compliant real-time surveillance?
The answer to all three is "no", and why I think this absolutely fits the "authoritarian" definition.
You can, of course, use a different vendor's phone. The situation there will be roughly the same. Eventually, if nothing else changes, we'll see CALEA expand to cover smart phones in the same way it expanded to cover internet traffic once the ISPs were sufficiently consolidated. The vendors' authority over the devices makes this easy.
Why can’t it be leaked? The software needs to be created and installed on the device. Even if it stays entirely in Apple’s hands, there’s no guarantee it will never be leaked.
It wouldn’t need editing. It’s intended to disable the timeout when brute forcing passwords. It’s incredibly dangerous software to even exist. And also insanely valuable. Even more incentive for someone to leak it, even at Apple.
The software would only disable those features on that specific device, which would be hard coded. Even if you moved the software to another device, it wouldn't work. Even if you had the source code, and modified to work on a different device or all devices, you wouldn't be able to do it unless Apple signed the modified software as well.
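Concretely, the idea is that the signed image would carry a check along these lines (a toy sketch with a made-up identifier, not a description of Apple's actual build process):

    import sys

    # Hypothetical identifier, hard-coded at build time; the whole image is then
    # signed with Apple's key, so changing it means re-signing.
    AUTHORIZED_DEVICE_ID = "F2LNJEXAMPLE00"

    def read_device_id() -> str:
        # Stand-in for reading whatever hardware identifier the build is keyed to.
        return "F2LNJEXAMPLE00"

    def disable_retry_limit_and_delays() -> None:
        pass                               # placeholder for the relaxed unlock policy

    def main() -> None:
        if read_device_id() != AUTHORIZED_DEVICE_ID:
            sys.exit("refusing to run: this image is locked to a different device")
        disable_retry_limit_and_delays()

    if __name__ == "__main__":
        main()

Whether the identifier read there can actually be spoofed is a separate question.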
It's not clear to me, at least, that the phone identifier they would be hard-coding into the build is actually designed to be a cryptographically secure and unalterable identifier protected by the secure enclave. The 5c has no secure enclave anyway, so how is this ID secured?
If you can reprogram or electronically intercept and alter the ID as it is read by the firmware, the backdoor build could be run on any phone.
For example, if it is tied to the UDID, then UDID = SHA1(serial + ECID + wifiMac + bluetoothMac). Here's an article where Apple says the ECID is alterable through the BPP (baseband processor) [1], so it is perhaps exploitable by connecting to a BSE and hacking the BPP via LTE vulnerabilities. The serial number, WiFi and Bluetooth MACs can all be altered as well. So I'm not convinced UDID-locked builds cannot be worked around by a motivated adversary.
Heck, finding a SHA1 hash collision by altering only the most easily set MAC addresses is computationally feasible and costs less than $1 million!
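For reference, that derivation spelled out (a minimal sketch assuming the classic UDID formula quoted above; the values are made up, and the exact encoding/concatenation Apple used is an assumption):

    import hashlib

    def classic_udid(serial: str, ecid: str, wifi_mac: str, bt_mac: str) -> str:
        # UDID = SHA1(serial + ECID + wifiMac + bluetoothMac), per the formula above.
        # Every input is readable, and arguably alterable, with enough hardware access.
        return hashlib.sha1((serial + ecid + wifi_mac + bt_mac).encode()).hexdigest()

    # Entirely made-up values, just to show the shape of the computation:
    print(classic_udid("C39XXXXXXXXX", "000000123456",
                       "aa:bb:cc:dd:ee:01", "aa:bb:cc:dd:ee:02"))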
It's shocking to me that people who ostensibly know something about software development are comfortable making statements like "it would never be able to leak because Apple can simply write a (bug-free, unexploitable, perfectly secure) set of checks locking it to the device"
It's shocking to me that people who ostensibly know something about software development take Apple at their word in this case.
Apple already has the backdoor.
Are you willing to guarantee that Apple will never lose control over their signing keys, giving whoever acquires them the ability to end-run the security of a locked device and install software that you, the device owner, are sandboxed from inspecting?
> Are you willing to guarantee that Apple will never lose control over their signing keys, giving whoever acquires them the ability to end-run the security of a locked device and install software that you, the device owner, are sandboxed from inspecting?
No. But this doesn't mean I don't think there's ALSO harm in the FBI or any government agency being able to demand companies build tools that expand the use and usability of that backdoor to parties beyond the company holding the key.
It sucks that Apple has a one ring when it comes to iOS security. It's incredibly dangerous if a government can require them to wield that one ring for arbitrary purposes via a contortion of the All Writs Act. And it's just plain stupid for software professionals to base their opinions on a belief that anyone is capable of writing an unexploitable check for device identity.
The FBI or any other government agency has always been able to demand companies do exactly this.
It was the height of naivety to think otherwise; it's not like we lack historical examples of what happens when a small number of companies make themselves the linchpin of trust/security.
Prior to this event, I had no idea that this generation of programmers seriously thought they could centralize so much information and control into their own hands, and somehow keep it out of the government's hands when they eventually came knocking.
Even if Apple wins this argument, they'll have to keep winning every argument, every network intrusion, every espionage attempt, forever. This particular argument is pointless; the high-value-single-point-of-failure security model is fundamentally flawed.
So we should what? Throw up our hands and not protect anyone who needs a cell phone? Conclude that everyone who isn't running RMS' Lemote Yeelong is fucked, and throw them to the wolves?
It seems obvious to me that we have to take the world as it is; yes, centralization of security is bad. Yes, we should fight to get away from this centralization of power in companies like Apple.
But as it stands now, it's incredibly important to support Apple's fight against this dramatic expansion of the All Writs Act's powers. The fight isn't "pointless", it's the exact opposite -- the security and privacy of hundreds of millions of people in the world, today, rests on the success of fights like these.
How much better it would be if we were all running Gnu/Debian Mobile on our OpenPhones is completely irrelevant. That's not the world we live in and better, open solutions are going to take years and decades to work toward.
We are never going to get to that world if Apple loses fights like these. We already have legislators working to make even the security offered by iOS today, for all its flawed dependence on Apple, illegal. Once these privacy and security rights are gone, that's the new normal, and open, truly securable phones won't even be legal to manufacture in the first place.
In addition to lobbying against the government in this case, we should loudly criticize Apple for putting this backdoor in their design!
Let's invert the use of the "ticking bomb" propaganda - say we've got a phone with data that can prevent an imminent attack. What person is going to say we shouldn't unlock the phone, because it would set a bad precedent? I'm a steadfast believer in the idea that computing devices should be private extensions of one's mind, but I would still say it's stupid to not hack into such a device if it can be done!
If you want a device that guarantees individual privacy against the manufacturer/USG, it has to be designed for such. You can't build an insecure system and then expect an A for effort.
This is not a dramatic expansion of the AWA's power. This is exactly what the AWA is meant to do. The few carve-outs of the authority of the AWA go much farther -- that's exactly what CALEA is, for example, and it has been held up by the courts repeatedly.
Your references to the GPL nonsense are a false dichotomy; all we need is ownership rights over the electronics we buy, like we've had -- even on Apple platforms -- for decades.
Supporting Apple now just ensures that when Apple does fall -- whether it's an expansion of CALEA, or espionage, or just a shift in their business priorities -- we may never know about it, and the impact on privacy/security is likely to be much worse.
If Apple loses this battle, there's no real impact to the risk profile. Apple already had the backdoor. The only change is that we're forced to be honest about that backdoor's existence, and start thinking real hard about how to avoid this kind of centralization of power, and its inevitable use, going forward.
In this case, the corporation is the threat, and corporate statism is probably the worst possible end game.
The state isn't really the problem, though -- centralization of authority is. The DoJ issue is basically moot; this is a lawful request made under public scrutiny and judicial review.
Whereas Apple could just change their business priorities at any point, and never even has to tell us.
The risk footprint is rather limited at this point, that much I trust. Each step in the sequence required to arrive at a compromised version of iOS that behaves the way FBI wants, is a step that increases the risk footprint for Apple and everyone with an iOS device. We could argue the size of this expansion, but we can all agree it is non-zero. And by definition trust is lost in proportion to that increased risk footprint.
I think that's an unreasonable burden on any company, and on users as well. This isn't just limited to Apple. Any company signing code is at risk of being asked to apply digital signatures to the code equivalent of malware, and to the free speech equivalent of falsehood. No.
Your argument is no different than the arguments that claim crypto backdoors can be kept secure. The problem is the existence of the backdoor, not the processes or politics that ostensibly will prevent its abuse.
Apple could ship encrypted backdoored binaries under an NSL gag order tomorrow, might not even know it themselves, and we'd never notice because we can't even introspect the device. In a few years, the federal government could extend CALEA to cover Apple, and there'd be little we could do because we can't override Apple's control over the software.
The security model is flawed; it requires Apple to fight and win every argument, every battle, every espionage attempt, in our favor, forever. The longer we propagate this security myth that putting absolute trust in the hands of the few is a viable security model, the worse things will be when it fails.
In the meantime, complying with this legal request doesn't meaningfully move the risk needle. The risk already existed. All it does is force Apple to admit that they hold a backdoor -- something they obviously are loath to do, as noted by the US Attorney when she was forced to submit an additional court filing responding to Apple's public, calculated attempt to define the public debate before even responding to the court.
I disagree with the characterization that there's already a backdoor. The fact that there's something of a black box in Apple's source, compilation, and signing process through which a backdoor could be injected is not proof of a backdoor.
However, I agree that their security model has a weakness, which is that it requires them to keep fighting against sovereigns, not just the U.S. government, for all time. That's a problem, and I'm sure they're coming to terms with what it means, as are other companies and even users and governments. Historically Apple has been a closed-hardware company; it's difficult to imagine they'll shed that anytime soon, and if that's true there will always be something of a black box involved.
But they could still alter the OS and firmware to require an unlock code for OS or firmware updates, and to erase all keys on the phone first if one can't be provided. Short of unknown backdoors, that obviates the current government request that Apple change the software. A law could possibly prevent them from shipping such an OS or firmware update. So the next step is making the user passcode stronger, and its hash algorithm much more computationally expensive. Even if there's a backdoor in the future, getting into the equipment would probably be too expensive for friend or foe alike within any reasonable time frame.
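To give a rough sense of what "much more computationally expensive" buys (a toy sketch; the iteration counts and the PBKDF2 choice are my assumptions for illustration, not Apple's actual construction):

    import hashlib, time

    def per_guess_cost(iterations: int) -> float:
        # Time a single PBKDF2-HMAC-SHA256 derivation at the given iteration count.
        start = time.perf_counter()
        hashlib.pbkdf2_hmac("sha256", b"000000", b"per-device-salt", iterations)
        return time.perf_counter() - start

    for iterations in (100_000, 1_000_000, 10_000_000):
        cost = per_guess_cost(iterations)
        six_digit_days = (10 ** 6) * cost / 86_400       # exhaust a 6-digit PIN
        print(f"{iterations:>10,} iterations: {cost * 1000:7.1f} ms/guess, "
              f"~{six_digit_days:.1f} days for a full 6-digit sweep")

    # A longer alphanumeric passcode multiplies the keyspace on top of whatever
    # per-guess cost the derivation imposes, which is where the real margin is.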
But if you're stuck on open hardware being the end goal, I'd probably agree with that, even though I think Apple will go to great lengths to avoid that.
That's a good point. The unique device ID is baked into the hardware, but it doesn't look like it can be read directly, so it might not even be possible to put logic based on that ID into the firmware anyway.
Even if the original software was built to work on only one device, if it's ever leaked it provides a roadmap for targeted attacks, which may apply similar workarounds through other means than the update path.
It could be leaked but if properly designed it just wouldn't work on other devices.
If you mean that Apple could leak it, that isn't a real risk. There is already the risk that the key Apple uses to sign updates leaks, and if that leaks, anyone can write the modified software.
Apple should just write the modified software to only work on that specific iPhone (by serial number).
The software already exists. You just have to lightly modify the existing software to turn off security features. The problem is that we don't have Apple's key.
The problem is, Apple has to fight this now because once they've done this once, they're in a losing position when the government wants it done 800 more times.
Right now, Apple can argue undue burden. Someone needs to sit down, nop out a bunch of security measures in an older branch of iOS, add boot and installation tests that lock it down to a particular serial number in a way that isn't vulnerable to any easy spoofing, test it all, and finally sign it.
If they do all this now, the second time the FBI shows up at the door Apple can't decide to then start arguing undue burden. Any government lawyer could win the argument that Apple already did all the heavy lifting, and that merely changing the serial code checked for could obviously not now constitute an undue burden on the company.
Once they've started down this road it's just a slow frog boil of "obviously not undue burden" small changes to "here's a list of 500,000 potential terrorists whose data we may need to access. Push an OTA update to them that has bypassable security"
>The problem is, Apple has to fight this now because once they've done this once, they're in a losing position when the government wants it done 800 more times.
Since each of those 800 times will come only after a court has issued a legal warrant, that is actually a good thing.
This is the technical aspect that I think isn't getting nearly enough attention. While I agree with Cook that this will set a precedent where governments, both the US and others, begin making these requests more frequently, it's not actually creating a master key in this case. Apple would have to cooperate in each individual case to apply this software to another device, assuming they code in the restriction that it's only applicable to this device.
This still leaves them the option of removing their ability to do so in the future, by requiring the passcode to update the firmware on both the device itself and the secure enclave.
Someone replied and then deleted (which is fine). IIRC they said something like "a/this court order isn't creating a master key." (my words, summarizing from memory).
Which is technically true. But consider each event a black box: you throw a targeted phone in, and it comes out unlocked. Whether that happens through unique effort for each case, a general capability developed and archived by Apple, or even an actual backdoor/master key developed by Apple out of exasperation at the expense of being legally compelled in each individual case and the others to come, the effect is the same: the phone is reliably opened.