That's good timing, given the discussion yesterday on a similar topic:
https://news.ycombinator.com/item?id=12254960
There are a couple of things in the article I'm not sure Matt got quite right.
WhatsApp does let you compare key fingerprints, believe it or not. At least you can scan QR codes to check. I don't know if doing that triggers a key change warning in future.
End-to-end encryption doesn't seem to affect whether law enforcement can look backwards in time or not. Simply not logging message content is sufficient to prevent this: even before integrating the Signal protocol, WhatsApp couldn't provide law enforcement with message content prior to a tap being requested, because they didn't log message content at all (or so they say). Introducing E2E crypto in the style of WhatsApp solves only one specific threat model as far as I can tell: an attacker capable of hacking your datacenter to the extent that they can siphon off and log messages by themselves without you noticing, but not also capable of doing a key switcheroo. That would be a strange but possible kind of hack. Note that this assumes users aren't storing their device keys and comparing them by hand, and that the attacker can't influence the code that gets shipped.
He assumes the user can detect key mismatches. Even if users can compare keys, this assumes their client does what they think it does. It's noted in another comment here, but all it takes to undo this assumption is getting Google or Apple to push a dummy binary to the specific devices of interest, one that claims things are encrypted even when they aren't.
You wouldn't need to deploy threshold crypto 'at scale' for the proposed scheme to work. Some schemes like Shoup threshold RSA result in a normal public key:
http://www.shoup.net/papers/thsig.pdf
So the only part that's non standard is the software for working with the shares to decrypt, which only has to work and exist between the various agencies.
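To make the idea concrete, here's a toy sketch of threshold-style RSA decryption. This is not Shoup's actual scheme (which adds verifiable partial signatures and proper t-of-n sharing); it's the simplest possible version, an additive 2-of-2 split of a textbook RSA private exponent with tiny demo parameters. The point it illustrates is the one above: the public key looks completely ordinary, and only the decryption side needs special software.

```python
# Toy 2-of-2 threshold RSA via additive sharing of the private exponent d.
# Textbook RSA with tiny demo parameters -- for illustration only.
p, q = 61, 53
n = p * q                      # 3233, the ordinary public modulus
phi = (p - 1) * (q - 1)        # 3120
e, d = 17, 2753                # e*d = 1 (mod phi)

# A dealer splits the private exponent: d = d1 + d2 (mod phi).
# Neither share alone reveals d.
d1 = 1000
d2 = (d - d1) % phi

m = 65
c = pow(m, e, n)               # senders encrypt under the normal public key (n, e)

# Each share-holder computes a partial decryption independently...
partial1 = pow(c, d1, n)
partial2 = pow(c, d2, n)

# ...and combining the partials recovers the plaintext: c^(d1+d2) = c^d (mod n).
recovered = (partial1 * partial2) % n
print(recovered)               # 65
```

Nothing on the encryption side changes at all, which is what makes "deploying at scale" a non-issue: only the agencies holding shares need the combining software.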
But I'm not actually even sure you need special threshold crypto schemes. I guess you could also take the session key(s) and encrypt them with key 1, then encrypt that value with key 2, etc, to build up an onion of encryptions. The various participants then have to pass around the value in the same order hard-coded into the software to get it back. The advantage of this approach is you can use ordinary HSMs to protect the keys, i.e. the hardware itself enforces that the private key may never leave the hardware unless it's being cloned to another HSM.
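A minimal sketch of that onion construction. To keep it self-contained I've used a toy XOR stream cipher built from SHA-256; in reality each layer would be applied inside an HSM with its own key, and the agency key names here are made up:

```python
import hashlib

def keystream_xor(key: bytes, data: bytes) -> bytes:
    """Toy symmetric cipher: XOR with a SHA-256 counter keystream.
    Encryption and decryption are the same operation."""
    stream = bytearray()
    counter = 0
    while len(stream) < len(data):
        stream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(a ^ b for a, b in zip(data, stream))

session_key = b"0123456789abcdef"   # the key that actually encrypted the message
agency_keys = [b"agency-A-key", b"agency-B-key", b"agency-C-key"]  # hypothetical

# Build the onion: wrap the session key under key 1, then key 2, then key 3.
blob = session_key
for k in agency_keys:
    blob = keystream_xor(k, blob)

# To recover it, the participants peel the layers in the reverse of the
# hard-coded order. (With this toy XOR cipher the layers happen to commute;
# a real block cipher would strictly enforce the ordering.)
for k in reversed(agency_keys):
    blob = keystream_xor(k, blob)

print(blob == session_key)          # True
```

No single key-holder can short-circuit the process: each layer is opaque until every participant before it in the peeling order has applied their key.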
But these are all minor details. The point Matt makes is well made, which is that you can build backdoors into cryptographic systems, and the reasons people don't want to do this are primarily political rather than technical. I continue to be concerned that the tech community may be about to burn its credibility with the mainstream population for no reason by claiming this stuff is impossible to do or is completely unthinkable, when it's actually not. Opinion polling showed that there was no general consensus behind Apple's refusal to unlock the phone in the FBI case: many people don't support the tech industry's absolutist position here (perhaps because they don't understand the potential for mass surveillance).
Moreover, governments will generally not accept an answer of "you are imperfect, thus you should not have the law enforcement capability you want". Lawmakers understand and accept that civil servants will make mistakes or be openly abusive, and generally only want to control the level of error/abuse, not eliminate it. Certainly the sorts of positions the Obama administration is looking for would accommodate key revocation procedures if the government agencies in question somehow did screw up and their private key leaked out of their HSMs. I suspect they'd happily agree to temporarily losing the capability so that system integrity could be restored, if there was a procedure for regaining their access once a neutral third party had re-audited the relevant offices. This sort of detail isn't where lawmakers are at: they think in broad strokes rather than the details of procedures.
> End-to-end encryption doesn't seem to impact whether law enforcement can look backwards in time or not.
I think he was referring to the fact that the Signal protocol has perfect forward secrecy -- if you break the key today, all previous communications are still secure because they used different keys (the key is updated using the Axolotl ratchet).
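The core of that forward-secrecy property can be shown with a minimal hash ratchet. This is a simplification of the real Axolotl/Double Ratchet (which also mixes fresh Diffie-Hellman outputs into the chain): each message key is derived from the current chain key, and the chain key is then advanced through a one-way function, so stealing today's chain key doesn't let you recover yesterday's message keys.

```python
import hashlib

def ratchet_step(chain_key: bytes):
    """Derive a message key, then advance the chain key one-way.
    Distinct domain-separation bytes keep the two derivations independent."""
    message_key = hashlib.sha256(chain_key + b"\x01").digest()
    next_chain_key = hashlib.sha256(chain_key + b"\x02").digest()
    return message_key, next_chain_key

ck = hashlib.sha256(b"initial shared secret").digest()
message_keys = []
for _ in range(3):
    mk, ck = ratchet_step(ck)
    message_keys.append(mk)

# An attacker who compromises `ck` *now* can derive future keys, but
# recovering any earlier message key would require inverting SHA-256.
assert len(set(message_keys)) == 3   # every message used a distinct key
```

This is why "breaking the key today" buys nothing retroactively: past traffic was encrypted under keys that no longer exist anywhere and can't be re-derived from the current state.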