Extracting Qualcomm's KeyMaster Keys – Breaking Android Full Disk Encryption (bits-please.blogspot.com)
341 points by laginimaineb on June 30, 2016 | 98 comments



Full disk encryption (FDE) is a UX issue, not a technical one. You don't need a secure cryptographic processor, but the UX sucks without one.

A simple, working, FDE setup would be something like LUKS running at boot:

    1. Turn on phone
    2. Phone loads up initial bootstrap OS
    3. Phone prompts user for master key
    4. Master key is used to unlock volume
    5. Regular OS boot continues
If the master key has enough entropy, brute forcing it becomes impossible. The phone won't "disable" as there's no self-destructing component (i.e. "secure crypto chip") but that doesn't mean it can be cracked. Boil as many oceans as you'd like, you're not going to brute force 256 bits of entropy.
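
To make steps 3 and 4 concrete, here's a minimal sketch in Python. (Real LUKS wraps a separate volume master key in key slots rather than using the passphrase-derived key directly, but the idea is the same; the KDF parameters are illustrative.)

    import hashlib
    import os

    def derive_unlock_key(passphrase: str, salt: bytes) -> bytes:
        # Memory-hard KDF so each guess is expensive; parameters are illustrative.
        return hashlib.scrypt(passphrase.encode(), salt=salt,
                              n=2**14, r=8, p=1, dklen=32)

    # At setup: pick a random salt, derive the key, store a verifier next to the volume.
    salt = os.urandom(16)
    key = derive_unlock_key("correct horse battery staple", salt)
    verifier = hashlib.sha256(key).digest()

    # At boot (steps 3-4): re-derive from whatever the user types, check, then unlock.
    attempt = derive_unlock_key(input("Master passphrase: "), salt)
    if hashlib.sha256(attempt).digest() == verifier:
        print("unlocking volume with key", attempt.hex())
    else:
        print("wrong passphrase")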

The UX problem is that the master key is a PITA to enter if it's long enough to be cryptographically secure. That's what a crypto chip is supposed to solve: a limited number of attempts with a shorter passphrase.


On the other hand, phones do not need to be rebooted often. And if I'm upgrading my OS the stupid thing is going to make me wait 20 minutes while it precompiles every app, so screw it I don't care if I have to spend 40 seconds inputting a password.


For what it's worth, the "optimizing apps" thing is going away in Android N.

https://developer.android.com/preview/api-overview.html#jit_...


I do find it somewhat amusing that Android went from Dalvik precompiled to Dalvik JIT, to ART precompiled, to ART JIT.


It's actually a rather logical set of steps; it follows the same pattern as elsewhere, probably for the same underlying reasons.

Compiled once is easier to develop, prove correct, and validate. Just-in-time offers the convenience of deferring compilation, but increases storage requirements and can decrease initial runtime performance because code is compiled as it runs. If you have spare cores, this starts to make more sense. If you have a mechanism for profiling and automatically tuning the optimization of frequently versus infrequently used codepaths, then JIT also starts to have payoffs (this is where human hours spent making a smarter compiler pay off across many applications that don't warrant the human hours of optimizing directly).


Another interesting aspect of the evolution of the Dalvik and ART runtimes is that they are runtimes for (mostly) battery-powered devices. A JIT compiler was an obvious next step, but less obvious is that it had to be battery conscious. It was designed to JIT as little code as possible while attaining the greatest possible performance gain. That's not how you would design a JIT compiler for server Java!


Dalvik interpreted was the first step. You could sort of call it "precompiled" if you call dex a precompiler.


Also, the system update install process will happen while the phone is in use, and only require a reboot after installation is finished. The whole process will be much faster.

Not only that, but Android N will be able to boot all the way to the lock screen before asking for the password, and even support functionality like phone calls, limited notifications, or alarms (so if your phone reboots in the night you'll still get woken up in the morning).


Only for new devices that support it


Right, but it may reboot at the most inopportune moments, like after the battery runs out, when you may not have access to the 20-character master password that is impossible to remember.


Use Diceware and you will have no problem remembering a secure passphrase.

https://en.wikipedia.org/wiki/Diceware


How about a QR code on a business card kept in your wallet? Scan the QR code (with your phone's camera) to input the long passphrase (or even better, just random bits).

There could still be a short pin/passphrase in addition to that. Keepass allows for something similar by having a local file + passphrase to unlock a password vault.

Backups of the QR code could be as easy as photocopying it!
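
For the curious, generating such a card is only a few lines. A sketch in Python, assuming the third-party qrcode package is installed; the "passphrase" here is just 256 random bits encoded as text so it could also be typed by hand in a pinch:

    import secrets
    import qrcode  # third-party package; assumed available

    # 256 bits of randomness, URL-safe text so it can be typed manually if scanning fails.
    passphrase = secrets.token_urlsafe(32)

    # Render as a QR code image you can print on a card, laminate, and photocopy for backup.
    qrcode.make(passphrase).save("fde-key.png")
    print("Printed key (keep a copy somewhere safe):", passphrase)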


If you're putting it in your wallet, just use plaintext.


> If you're putting it in your wallet, just use plaintext.

That has the same issue of not being easy to input into the phone. The QR code can be scanned easily.

Sure, OCR of plain text is possible, but something more standardized like a QR code would be better.


I'd rather just type than fuss with getting a QR code to scan in possibly-bad lighting. I just tested a password of 30 random lowercase letters, which is total overkill for security, and it took less than 30 seconds to type into my phone. Doing that perhaps once a month, or once you find a charger after your battery dies, is no problem.
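
Back-of-the-envelope, 30 random lowercase letters really is overkill:

    import math

    # Each random lowercase letter adds log2(26) ≈ 4.7 bits of entropy.
    bits = 30 * math.log2(26)
    print(f"{bits:.0f} bits")  # ~141 bits, past the usual 128-bit symmetric security target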


> I'd rather just type than fuss with getting a QR code to scan in possibly-bad lighting.

Nothing says you can't do either, or even both[1]. The idea of using a QR Code is so that mom/dad/grandpa/grandma can use a long password to unlock their phone without having to type it in.

Bonus points if the QR Code itself is in the shape of a key!

[1]: Which would be "something you have" + "something you know".


A ring or bracelet with an NFC tag in it. Well, a ring is probably too small for a proper NFC tag, but either way: NFC jewellery.


They absolutely make rings w/NFC tags. At least men's rings, I assume they may do it for women's too.


Really? That is something I have dreamed about for a long time. A small object on the finger is the least obtrusive form factor and becomes part of the self. But unlike a fingerprint, a ring can be replaced if it's somehow compromised.


e.g. http://amzn.com/B00UTEHH8K claims "Lock and Unlock Your Smart Phone or Tablet Quickly, Hide the App When You Leave Your Smart Phone Alone"


Neat. Does Android support NFC for the boot-up FDE decrypt?


I doubt it. But you might not want it to, depending on the threats you want to secure against. If you had extremely valuable secrets, someone could just execute you and use your NFC jewelry to unlock those secrets.


Or rip off your fingers if you use your fingerprints to unlock, or just drug you with scopolamine to unlock it.


I'm looking for normal corporate security here, not fighting the KGB.


Phones spend most of their lives powered on. You need to be required to enter a similarly good password to unlock it while the phone is running, or disk encryption has accomplished nothing at all.


Not true at all. Any attack against the encrypted data itself will be stopped. Any attack that requires restarting the phone to load different software will be stopped. Having a secure boot password and also a software-enforced pin to unlock accomplishes a great deal. And as a bonus, when a security measure is not grossly inconvenient you get more people using it.


The key is in memory. How many attackers are prepared to read the flash chip but not the RAM?


In theory it should be easy to keep the key from leaving the CPU's cache.


Sometimes collection agencies rotate their phone numbers when they call you so you can't block them; turning the phone off is the only way to stop the calls.


You could turn off notifications, or use airplane mode.


The other UX problem is you need to make sure that someone doesn't replace the initial bootstrap OS with something else that captures the master key.

So, that's where you need the secure cryptographic processor and the trusted environment, etc.


Wait, Android doesn't use LUKS?

Argh. Just give me a stinkin' smartphone that runs Linux. These are solved problems, people.


Last I checked, it does. I'm pretty sure, for instance, that the Nexus 5X relies on standard LUKS for FDE.

What may be protected through other means is the key used to encrypt that filesystem. The user enters a pin or a pattern to unlock said key on boot, and that key is used to initiate LUKS.

Maybe that's what they have cracked here?


Isn't Android running a really old Linux because they can't rebase? So they have to cherry pick... or has that stopped now?


No, but they tend to stick to the LTS releases and the kernel tends to not get a version update for the life of the device.

So it doesn't track the latest, but it's also not "really old" in that upstream Linux still considers it a supported version.


No, it's actually very old. This is from my Nexus 5X, running an Android N preview build:

    $ uname -a
    Linux localhost 3.10.73-gef93fe9 #1 SMP PREEMPT Fri May 20 20:48:13 UTC 2016 aarch64 Android
There was a technical reason for pinning it at this version (also known as the "Android kernel"), but I don't remember what that specific reason was.


Jolla / Sailfish ?


You also want the passphrase to be easily changed, separately from the master key (because changing the master key is slow and resource-intensive).
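
That's the key-slot / envelope pattern: encrypt the data under a random master key, and only re-wrap that master key when the passphrase changes. A rough sketch, assuming the third-party cryptography package (LUKS does the same thing with its own key-slot format):

    import hashlib, os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def kek_from(passphrase: str, salt: bytes) -> bytes:
        # Stretch the passphrase into a key-encryption key.
        return hashlib.scrypt(passphrase.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)

    def wrap(master_key: bytes, passphrase: str):
        salt, nonce = os.urandom(16), os.urandom(12)
        return salt, nonce, AESGCM(kek_from(passphrase, salt)).encrypt(nonce, master_key, None)

    def unwrap(blob, passphrase: str) -> bytes:
        salt, nonce, wrapped = blob
        return AESGCM(kek_from(passphrase, salt)).decrypt(nonce, wrapped, None)

    master_key = AESGCM.generate_key(bit_length=256)  # encrypts the actual data; never changes
    blob = wrap(master_key, "old passphrase")
    # Changing the passphrase: unwrap with the old one, re-wrap with the new one.
    # The bulk data encrypted under master_key is never touched, so it's fast.
    blob = wrap(unwrap(blob, "old passphrase"), "new passphrase")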

Even if you have a strong passphrase, there's some probability of leaking some number of bits of it via side-channels every time it's entered (e.g. surveillance cameras, fingerprints on the screen, shoulder surfing, vulnerable code, TEMPEST, etc). Plus, people often keep a backup copy of their passphrases (unique, strong passphrases are hard to remember), so there's also a cumulative risk of the backup leaking over time, as well.

Long-term confidentiality is just surprisingly hard in the real world.


>Full disk encryption (FDE) is a UX issue, not a technical one.

Most of security is pretty much a UX issue. Otherwise the century-old one-time pad would fit all needs.


And with a crypto chip it's much easier to share the device. Think of family or classroom/work devices.


QR code maybe? I think that might work...


So if I understood correctly, there are 5 requirements for such a system to be secure:

  1: secure/unmodifiable cryptographic processor
  2: with unremovable rate limiting
  3: and exclusive access to a hardware key
  
  4: cryptographic processor has the only function of encrypting user data based on
  5: hardware key and a user supplied pin/key
Errors done by Qualcomm:

  Violated 3: Hardware key not exclusively readable by cryptographic processor
  Violated 5: Encryption based on derived key
Anything I overlooked?

(edited: formatting)
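
To illustrate requirements 4 and 5: the intended shape is roughly the sketch below, where the only thing the secure processor ever returns is a KDF output over the hardware key and the user's credential, never the hardware key itself. (Illustrative Python only, not Qualcomm's actual construction.)

    import hashlib, hmac

    # Requirement 3: HARDWARE_KEY must only ever exist inside the secure processor.
    # The article's finding is that TrustZone software could obtain the key material
    # meant to provide this binding, so the derivation can be replayed off-device.
    HARDWARE_KEY = b"\x00" * 32  # placeholder; fused into silicon on real hardware

    def fde_key(user_pin: str, salt: bytes) -> bytes:
        # Requirement 5: mix a stretched user secret with the hardware key.
        stretched = hashlib.scrypt(user_pin.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)
        # Binding to HARDWARE_KEY forces brute force to run on this device,
        # where the secure processor can also rate-limit attempts (requirement 2).
        return hmac.new(HARDWARE_KEY, stretched, hashlib.sha256).digest()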


Since the security of Android depends on hardware and OEM software not under Google's control, depending on FDE is apparently pointless. I guarantee Google really wants to build their own branded phones with their own secure Android version and gain Apple's advantages in building secure systems because you own everything.


This vulnerability does not make FDE pointless per se. It means that your security depends completely on the complexity of your passphrase (like the iPhone prior to the introduction of Secure Enclave). Which is still fairly bad given the typical passcode complexity on phones.


> Since the security of Android depends on hardware and OEM software not under Google's control, depending on FDE is apparently pointless.

For 99% of the users, this is complete nonsense. Why do we encrypt the phone? Are we afraid of the NSA or whatever agency cracking our phones? No! The reason most of us encrypt the phone is to prevent others from posting our pictures or reading our mail.

When somebody else finds my phone, or steals it, the four-digit PIN is enough protection. Then again, I don't use a four-digit PIN but something longer, and from six digits up it's really strong enough for manual unlocking.

People who have the means to read contents from the memory chips probably also have the means to use other measures to get what they want.


There are levels of threat and protection between those you mentioned. As the threat from people with access to "other measures" increases, you might at least want to make them work for it and have to go through proper channels.


It runs the 2-factor app that secures my online accounts, which may or may not include banking.


Businesses typically have rules about securing data; that covers a lot of users.


Are you guaranteeing that Nexus phones exist, or that Google will introduce a second line of phones?


Google don't manufacture the Nexus.


But they had a chance (and didn't) to bring that all in house with the Motorola buy, no?


Google wanted and kept most of the patents.


The (thinly sourced) rumor is that Google wants to build a new phone from scratch, a separate project from Nexus


For this to work they only need one little chip that can do what the secure enclave does. They can then require that chip in Android phones. That's a lot cheaper and easier. Then they need to control the manufacturing of this chip, not outsource it to China.


If only they had an OEM to do this, with a plant in Texas....


ARA?


Nope, that's the modular phone. This is talking about a standard phone manufactured by Google.


> I guarantee Google really wants to build their own branded phones

Considering they just sold for scrap an entire manufacturing pipeline to do exactly that, I'd say they really don't. If they do, they're going about it in the silliest and most roundabout way.


So out of curiosity, as someone who has a nontrivial passcode on his Android device that people constantly mock, how many characters are we talking about to be safe?

I have known for a long time that 4-digit numeric PINs are stupid. Sadly the San Bernardino case, for all the wrong reasons, taught me all the alternative auth methods are just as risky.

Should I be worried? I don't know. But as a long-time Android enthusiast and power user who does not use Google Play on his phone and relies on restrictive permission customization like XPrivacy, I am about to just give up and keep a newish iPhone for secure stuff and a knock-around Android for the cool open source dev I aspire to with F-Droid.


A friend saw me entering my longish, non-trivial passphrase into my android phone and commented that I must not have many friends if I have to enter this passphrase every time to use my phone.

For a moment I was torn between being proud of my passphrase and sad at my lack of friends.


What's sad is your friend's superficial criteria for judging sociability.


You bring up an interesting UX issue.

If I need to be constantly available to my friends, "number of seconds to a text reply" is something that I need to minimize.

Longer passphrases prevent me from sending quick replies.


Some of my more sophisticated friends take this a step further and taunt me for not securely hiding notifications and buying a smart watch.

This confuses me because Bluetooth over the air does not strike me as a secure mechanism (is it just Bluetooth? Ironically there's scant detail, unless my Google-fu is off, which is likely at this hour for me).

http://www.wareable.com/android-wear/android-wear-hack-alert...

And do I need yet another smart device? The mobile-first movement, with shinier crap moving away from standard protocols, leads me to believe I have been passively perpetuating the biggest wart on computer science's labors. I am kind of embarrassed, a few years in and late to the party, that I bought an Android phone at all.


It doesn't really break the encryption, as long as the password is strong enough to prevent brute forcing.

Relying on a weak password and a "trusted computing" mechanism like this one from Qualcomm to prevent an attacker with physical access from brute forcing it is not really advisable.

Using such a mechanism at all has downsides since it means that you lose the data if the mechanism or the entire device stops working, while otherwise you could simply move the SSD/flash/storage unit to a new identical device and have it just work as long as you type the correct password.


But how many Android users do you know that actually have a strong password? (Speaking of which: Google's decision to not allow strong encryption passwords together with short screen unlock PINs/patterns is not helping here. It's actually possible, but they don't expose the user interface to change them separately, probably for usability reasons.)

Also, while I would agree as far as PCs are concerned, moving the eMMC to a new phone's mainboard is probably also not a likely scenario.


> moving the eMMC to a new phone's mainboard is probably also not a likely scenario.

Data recovery is, however, which is why I don't think one-sided promotion of ubiquitous hardware-locked-FDE is a good idea --- it becomes a tradeoff between others getting access to, and you losing access to, your data. Is it more important that this data be accessible by no one but you even if it means you might also lose access to it, or that you not lose access, even if it means others can access it? Hardware-locked FDE is suited only to the former case.


As someone who deals with hardware-backed encryption in laptops, it's a huge pain.

Modern phones are worse. I was at a training and happened to sit with computer forensicators, and they complained about the new eMMC systems. Forensics on mobile devices was a piece of cake prior to that.

I suspect the eMMC thing is meant to make this difficult on purpose.

So, so much for that ...


Precisely - if you're relying on no one being able to reverse engineer the "secure enclave", you're already in deep trouble.


Fascinating article, the more I learn about crypto and security the more obvious it becomes how hard this stuff is to get right.


You know, I'm not really all that certain that it's intellectually difficult to get crypto right; it feels more likely that it's organisationally and process-wise difficult to get crypto right.

What I mean is that folks want to take an existing product and layer crypto atop it, when really they need to start from the crypto and build a product over it. They want to do things which are mathematically impossible (e.g. let you read data, but not let you relay that data to someone else). They want to hand the crypto to a junior developer instead of someone who knows what he's doing.

Crypto's not hard to do; organisations are bad at doing crypto.


> You know, I'm not really all that certain that it's intellectually difficult to get crypto right; it feels more likely that it's organisationally and process-wise difficult to get crypto right.

Replace "crypto" by just about any software engineering endeavor, and what you say is still correct.


Spot on, with one minor nitpick: crypto is not hard to use (as in, relying on existing libraries/standards/practices) - doing, as in writing your own crypto, otoh...



I think that might not be such a bad thing after all, since the flaws make it harder for companies to secure their products against the user, as they are apt to do. Something to ponder: would you rather live in a world of "perfect security" where encryption like this is unbreakable but so is DRM and other mechanisms of corporate/government control; or something closer to what we currently have, where "flaws" in encryption are periodically discovered but things like iOS jailbreaking and Android rooting still exist along with their associated freedoms?

Personally, I lean towards the latter.

https://www.gnu.org/philosophy/right-to-read.en.html

http://boingboing.net/2012/01/10/lockdown.html

http://boingboing.net/2012/08/23/civilwar.html


I'd rather live in the unbreakable world because then the oppression is out in the open / forced.

People expect stuff to just turn up on The Pirate Bay, extraction tools, etc.


There is no getting it "right", just less wrong; aka "perfect" security does not exist.


"perfect" security does not exist.

For that matter, I think what "perfect security" could look like is the world in Orwell's 1984, which is definitely not something I'd want existing either.


Even that was not perfect.


Isn't that still an open question?


Maybe in the sense that one can't ever really prove it to be "perfect".


Unfortunately, it seems as though fixing the issue is not simple, and might require hardware changes.

So any Android phone currently on the market is basically unfixable?


Just don't use a password so weak it can be brute-forced for your FDE and you're fine. Treat it like most existing software-only solutions such as TrueCrypt, not like a magic black box that can make a 4-digit number into a secure password. Even if the hardware were "secure", it still wouldn't be advisable to rely on it, since a successful reverse engineering of the hardware could still result in a brute-force attack on your key. And while that may be high effort and expensive, it's certainly not impossible. Never trust a half measure; this sort of TPM/Secure Enclave approach is a half measure.


I believe so.


Well, hold on... there are several issues. One is that there is an exploit allowing arbitrary code execution in the TrustZone. This immediate problem should be fixed.

The second is that allowing firmware updates to TrustZone code means that Google could push an update which allows them to retrieve keys. Such updates should only be allowed if the user accepts them. How the user is supposed to know the difference between a security fix and Google trying to break into their phone is the real issue.

But iOS has this same issue. If you accept a firmware update from Apple, maybe it includes a PIN logger.


You're right. Let's take a second to go over the issues:

1. The arbitrary code-execution in TZ has already been privately disclosed and fixed.

2. As for the second issue - I would argue that it's not an issue at all. TZ should be updated (otherwise how would they fix the TZ code-exec vulnerabilities?).

However, in this case gaining code execution in the TZ kernel directly leads to the disclosure of the keys which are meant to bind the KDF to your device. This is in direct contrast with Apple's KDF. The key here is that software shouldn't be trusted, by design.

3. As for the last issue, I would argue that this is just a clever form of social engineering. After all, who's to say they didn't just swap the phone with a dummy phone to make you enter the correct password?

However, in Android's case, you wouldn't even need to cooperate. OEMs could simply flash the new TZ code, extract the KeyMaster keys, and bruteforce the key. All without having any help from you.
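
To give a sense of scale: once those binding keys are out, the whole search runs offline on whatever hardware the attacker likes. A toy sketch of why short PINs then fall quickly (the KDF below is a stand-in, not the real Android derivation; the bruteforce script linked elsewhere in the thread works against the real scheme):

    import hashlib, hmac
    from itertools import product

    def candidate_key(pin: str, binding_key: bytes, salt: bytes) -> bytes:
        # Stand-in KDF: stretched PIN mixed with the (now extracted) device-binding key.
        stretched = hashlib.scrypt(pin.encode(), salt=salt, n=2**14, r=8, p=1, dklen=32)
        return hmac.new(binding_key, stretched, hashlib.sha256).digest()

    def bruteforce_pin(verifier: bytes, binding_key: bytes, salt: bytes):
        # A 4-digit PIN is only 10,000 candidates; without on-device rate limiting,
        # even a deliberately slow KDF can't save a keyspace that small.
        for digits in product("0123456789", repeat=4):
            pin = "".join(digits)
            if hashlib.sha256(candidate_key(pin, binding_key, salt)).digest() == verifier:
                return pin
        return None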


The section on extracting the key mentions the "Widevine DRM application" and "privilege escalation from QSEE".

Could the problem be fixed if the manufacturer used TrustZone exclusively for key storage/encrypt/decrypt operations, eliminating the support for DRM and trustlets?


I would use a longer password. But currently the unlock and the encryption password are always the same. That's one of the issues I would like to see changed.

I would also like to have more fine-grained rules on when I can unlock with a fingerprint, when with a PIN, and when I should be forced to enter the encryption password.

Additionally, I would like to use a U2F NFC token on my keychain as a second factor for unlock (if I have not touched the phone for X amount of time).


You can make it use long FDE passwords and a pattern by hackery. Some sqlite fiddling is necessary. See XDA.


Would love to see the same analysis on Samsung hardware


I'll definitely try and get around to it sometime soon. However, I wouldn't be surprised if the situation is the same... After all, the KeyMaster module was initially only meant to keep encryption keys on the device, not to safeguard FDE.


Sorry for an extremely nooby question, but your comment makes me wonder:

Are Snapdragon and Exynos architectures that alike? I mean, do you have any preliminary idea how easy it would be to apply your Snapdragon method to an Exynos SoC?


Actually, I would imagine they are nothing like one another.

However, Exynos uses a TrustZone TEE implementation to implement the KeyMaster module, just like Qualcomm's Snapdragon. This means that unless that TEE implementation uses a hardware key which is not software accessible, the same issue would be present. Of course, reverse-engineering a new TEE implementation would take a long time... But it sounds like it could be fun.


Thank you.

You should be aware that people like you are unsung heroes and I'd like you to know that many of us who are busy struggling to stay afloat are feeling better knowing that people like you are doing the work you do.

Have that in mind. I wish you well.


If anyone is interested: where the author gives a brief overview of the iOS security measures, I have written a more in-depth review that is an easy read and gives you a good overview of the security architecture used. I'm happy to hear feedback if you enjoy it. The blog can be found at: https://woumn.wordpress.com/2016/05/02/security-principles-i...


If your device is rooted, you can use a long password for encryption and a pattern, or even no lock, for the lockscreen.

I have set it up this way; it needed some sqlite hackery. The XDA developers forum has the appropriate info.


Could someone explain how and why it uses "0x1337" in order to validate the key?

Is that magic?

https://github.com/laginimaineb/android_fde_bruteforce/blob/...


Isn't that just m = (m^e mod n)^d mod n? So the value itself does not matter, you could use 42 instead of 0x1337.
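
In toy numbers (the real modulus is of course much larger, which is why 0x1337 fits there; here 42 is used instead, as suggested):

    # Toy RSA key: p=61, q=53 => n=3233, phi=3120, e=17, d=2753 (e*d ≡ 1 mod phi).
    n, e, d = 3233, 17, 2753

    m = 42                            # would be 0x1337 with a real-sized modulus
    signature = pow(m, d, n)          # "sign" with the private exponent
    assert pow(signature, e, n) == m  # verifying recovers m, whatever m was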


https://www.appmobi.com

the only way to fly



