"We kill people based on metadata." - former head of the National Security Agency Gen. Michael Hayden
"Metadata absolutely tells you everything about somebody's life. If you have enough metadata, you don't really need content." - former NSA General Counsel Stewart Baker
OK, so we have another well-written article explaining how attempts to demonize encryption and put it back in the bottle are misguided, and how the spokespeople for the authoritarians and the intelligence community are being disingenuous on cable TV.
But most of us already knew this. So what I'm asking is:
* Are these well-reasoned words actually affecting policy?
I think this article is great, but something did stand out for me.
I'm a huge supporter of universal end-to-end encryption, but Feinstein's point is making me feel some cognitive dissonance:
>"I think with a court order, with good justification, all of that can be prevented."
There are cases where I would want law enforcement to be able to read encrypted communications in an emergency situation, with a valid court order. If someone is being held hostage, for example. Of course I don't want intelligence agencies having this same access; just very specific requests during exigent circumstances, with judge approval. A real judge in a real court, not a secret FISA court.
But to do that you need some kind of key escrow already set up with the government, and if you have that, there's nothing stopping law enforcement and intelligence agencies from spying on whatever they want, whenever they want.
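To make the escrow point concrete, here's a minimal, hypothetical sketch (Python, using the `cryptography` library) of what key escrow looks like structurally: the message's session key gets wrapped once for the recipient and once for an escrow authority. The parties and message here are made up for illustration; this isn't a description of any actual proposal.

```python
import os

from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Hypothetical parties: the intended recipient and a government escrow authority.
recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
escrow_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# Encrypt the message with a fresh symmetric session key.
session_key = AESGCM.generate_key(bit_length=256)
nonce = os.urandom(12)
ciphertext = AESGCM(session_key).encrypt(nonce, b"meet at the usual place", None)

oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# The session key is wrapped once for the intended recipient...
wrapped_for_recipient = recipient_key.public_key().encrypt(session_key, oaep)
# ...and once for the escrow authority, which can then open THIS and every
# other message sent under the scheme.
wrapped_for_escrow = escrow_key.public_key().encrypt(session_key, oaep)

# Escrow-side decryption: no cooperation from sender or recipient required.
recovered = escrow_key.decrypt(wrapped_for_escrow, oaep)
assert AESGCM(recovered).decrypt(nonce, ciphertext, None) == b"meet at the usual place"
```

The structural problem is visible right there: the escrow key is a single secret whose holder (or thief) can unlock everything, so "only with a court order" is a policy promise layered on top, not a technical property of the system.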
Right now this isn't a huge problem, since a lot of people still communicate in plaintext or use services that are encrypted but logged/intercepted at a central point (e.g., Skype). But eventually more and more things will move to end-to-end encryption.
I wish that with a court order, we could read the thoughts of a suspect, too... But that's not possible, either.
Remember that a police force doesn't need infinite, perfect access to all of a target's communications to catch them. Think about the case we're describing: we have enough information to convince a judge that this person is a suspect. In order to get evidence to convict, you can:
* hack into their devices to retrieve stored data or install malicious software like keyloggers
* bug their place of living and work for video, audio, and WiFi capture
* interview/interrogate them, and everyone they associate with
* search their place of living and work
* place tracking devices
* have a human or drone tail them and watch their every move
* record all the metadata from their Internet communication
* record the contents of any unencrypted communication
We actually have all the access we need, particularly since we're allowed to hack into the target's devices. The only thing we CAN'T do is include the target in bulk communication-capture schemes. We actually have to pick targets and spend resources on them.
To be honest, I might be OK with that practical limit even WITHOUT a warrant requirement. If a police force believes a suspect is worth spending its limited resources on, the suspicion probably has some substance. If adding someone to a watch list is free, they'll do it to everybody.
Given that its original purpose was to discourage the authorities from torturing or jailing you in order to compel you to give evidence against yourself, one could argue that the Fifth would need new justification in a future where thoughts were easily read.
The right way to handle this is to recognize that sometimes, technology that has all of the magical properties that you want simply does not exist, or cannot exist. Secure, global key escrow is one of those things.
I think universal end-to-end encryption is an all-or-nothing proposition. You can't sell it by saying, "It's secure, except when it isn't." It's also a slippery slope to establish a key escrow for the government; will it be a crime to encrypt using a key that isn't in escrow?
Encryption can facilitate evil, but it also protects against evil. Universal adoption is hampered by obstacles including:
1. It's hard to do.
2. It's hard to understand, even if the tools become more user-friendly.
3. Most people will share a private key with a stranger when asked.
The right way to handle this is to encourage people to use strong encryption and to acknowledge that it can be safe from eavesdropping yet still subject to implementation weaknesses, or to participants revealing the information in other ways.
The right way to handle it is to enforce laws in meatspace. To create CP, for example, you need to abduct and exploit children, and produce content. This leaves a trail of money, places, and witnesses. Bear down on that instead of snooping.
It could improve the effectiveness of law enforcement to force them to focus on meatspace crime rather than playing with their shiny computer toys.
I'm of the same frame of mind. There was an article from earlier this year about the suspect in a child-porn case who successfully encrypted his machine, which has completely prevented prosecution:
As tools become better, is the price we pay for strong encryption that criminals who are mildly technical can get away with their crimes? That's hard to stomach if it's the 'right' answer.
It's really tricky. I don't want people to be forced to self-incriminate and give up their password. I don't want encryption to be banned. I don't want people to only be permitted to use government-approved encryption with escrow. I don't even want intelligence agencies to research vulnerabilities in existing cryptographic algorithms and implementations, because they have no reason not to use those findings for dragnet surveillance.
But obviously, I don't want CP or terrorism suspects to evade detection and prosecution because of encrypted drives or communications.
If the government forces the big American tech companies to adopt weaker encryption technologies, what's to stop terrorists from rolling their own communication app? Or new tech companies from deploying apps from countries where strong encryption is still legal? Would developers need to become familiar with government approved algorithms?
Separate from the debate on whether this is a good idea, I'd love to see a proposal for how a global encryption ban could actually be implemented.
Forcing the terrorists/criminals to roll their own encryption would be a huge win for both law enforcement/foreign intelligence agencies and the general public. When non-cryptographers write their own encryption software, the likelihood that it's insecure goes up, and if they're off using Mujahideen Secrets v5.0 instead of WhatsApp, their communications would stick out like a sore thumb instead of blending in with regular users. When the terrorists/etc. don't blend in, the FBI/NSA/etc. don't have to sift through mainstream communications channels to figure out which people are bad guys and which are just regular people.
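As a hypothetical illustration of the kind of mistake that tends to creep into home-rolled crypto (not a claim about any particular tool), here's a short Python sketch using the `cryptography` library: AES in ECB mode "works", but identical plaintext blocks encrypt to identical ciphertext blocks, so message structure leaks to anyone watching the wire.

```python
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

# A common roll-your-own pitfall: ECB mode with a static key and no nonce.
key = bytes(16)                       # even a random key wouldn't fix this
plaintext = b"ATTACK AT DAWN!!" * 2   # two identical 16-byte blocks

encryptor = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
ciphertext = encryptor.update(plaintext) + encryptor.finalize()

# The two ciphertext blocks are identical, revealing that the plaintext repeats.
assert ciphertext[:16] == ciphertext[16:32]
```

Mainstream messengers avoid this class of bug with authenticated modes, unique nonces, and heavily reviewed protocols, exactly the review a one-off tool written by non-cryptographers rarely gets.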
Copying the 2002 strategies of Anonymous doesn't make you a hacker; it makes you a loser. Stop using hashtags; this isn't Twitter. It's clear you love being addressed as a terrorist organization, but you're just some script kiddie in a basement who's about to have his internet unplugged.
Where a website is hosted has very little to do with what you can and cannot report to the FBI, and whether or not they decide to turn something into a criminal investigation requires that there be a crime in the first place.
Posting dumb comments in a forum is not a crime (and it probably shouldn't be, or youtube.com would be out of an audience).
Associating oneself with a terrorist group is not just a dumb comment; the claim is considered a national-security concern, which can ultimately turn into a crime. Spreading hate messages is a crime. Spreading false information about a cyber attack is a crime.
HN is responsible for reporting crimes if a crime has occurred. Because HN is hosted on U.S. soil, and given that this is a "dumb" claim supporting ISIS, when HN reports it to the authorities, the report would go to the FBI.
We cannot take dumb cyber crime lightly. You do not know who the next attacker is. We have had US citizens fleeing to Syria to join ISIS, and given the recent attack in California, the fear of a domestic attack is not imaginary.