we had the choice to make - either architect the world's most secure encryption system on the planet, so secure that CertiVox cannot see your data, or spend £500,000 building a backdoor into the system
So, just like Lavabit as 'moxie keeps pointing out[1][2], it wasn't actually secure. Still, I like the principled stand they took.
There is no actually secure - that's the problem. The best you can do is secure against specific threat models. Up until recently, most people didn't necessarily view government intrusion as a particularly credible threat, so didn't spend the extra time/effort/money mitigating against it.
One of the best things to come out of all these revelations, in my opinion, is a revised view of what threats we should consider which were previously dismissed as paranoid ramblings.
What I don't understand is why anyone trusted businesses (such as CertiVox and Lavabit) to keep their emails secure?
Because they didn't consider "because terrorism" to be a security threat that could penetrate privacy and property laws. Lavabit has proven that the Third-Party Doctrine means that once you give data to a business, you're giving it to the government.
Me personally? Because I don't actually need secure email (and so didn't use Lavabit or CertiVox).
My point is that if I did want secure email, I wouldn't trust a company whose email system architecture meant that they could read my email.
A personal email server might be a good solution, but then you have to maintain it. It seems as though it should be possible for a company to build an email system (and offer it as a service to customers) whereby they _couldn't_ read users' email.
This seems like a good thing. (And as a nice side-effect, the government can't then issue them with a warrant to read your email. Although that's not to say they can't read your email in other ways.)
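To sketch what I mean (just an illustration in Python, assuming the PyNaCl library; the names and message are made up, not any particular provider's design): if key generation and encryption happen on the users' own devices, the provider only ever stores and relays ciphertext, so there's nothing useful for it to hand over.

    # Minimal end-to-end sketch: the provider never sees keys or plaintext.
    # Assumes PyNaCl; purely illustrative, not a full design.
    from nacl.public import PrivateKey, SealedBox

    # Generated and kept on the recipient's device, never uploaded.
    recipient_key = PrivateKey.generate()

    # The sender encrypts against the recipient's *public* key before upload.
    ciphertext = SealedBox(recipient_key.public_key).encrypt(b"meet at noon")

    # The provider stores and forwards only `ciphertext`; without the
    # recipient's private key it cannot decrypt, so a warrant served on the
    # provider yields nothing readable.
    assert SealedBox(recipient_key).decrypt(ciphertext) == b"meet at noon"

The hard part, of course, is key management and usability on the client side, not the crypto itself.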
I'm not aware of any companies providing double-blind encrypted email services, but they may be out there. Certainly they would eventually be accused of providing harbor to terrorists and other unsavories. At best, it sidesteps the problem of the lack of legal privacy protections when using a service provider of any kind.
You can trust an established business with a reputation to lose, not to defecate where it eats.
The upset in the threat model is that you can no longer trust that a business is free to choose according to self interest. You have to assume the government will be force-feeding it laxatives.
There are some details of the legislation in question here[1]. It allows the UK to monitor "in the interests of the economic well-being of the United Kingdom" which seems a little broad!
It would be interesting to know whether this warrant targeted all users or a specific subset.
I wonder how they decide whether to issue a warrant or just break into the site in question. A warrant could imply that they are unable to attack the provider, or that they want to have a chilling effect.
RIPA was strongly opposed by the IT and communications community during its 'consultation' period, but was passed largely intact. It was so intrusive a proposal that even people like me who often say 'I'll write and complain' actually did write a letter.
Predictions of its misuse have been borne out, particularly as the authorized users of communication data were never enumerated or restricted. So we have the situation today where local Councils use RIPA to obtain communication data regarding individual citizens' activities.
> ..local Councils use RIPA to obtain communication data...
It's worth noting that legislation introduced during 2012 limited the circumstances under which councils could use RIPA powers [1] and required judicial approval for the exercise of those powers [2].
I think it's ironic that those who are in favour of personal liberties are often on the left of the political spectrum whereas, in this instance, it was the Tories who rolled back aspects of surveillance legislation that was introduced by Labour.
I think that the left/right distinction is silly myself.
You're right, it's totally ridiculous. The whole thing is rooted in shit that dates back to before the French Revolution[1], and there's no real connection whatsoever between the "left" and "right" of the French Assembly and modern political views.
How absurd is it? Just read this:
There is general consensus that the Left includes progressives, communists, social-liberals, greens, social-democrats, socialists, democratic-socialists, civil-libertarians (as in "social-libertarians"; not to be confused with the right's "economic-libertarians"), secularists, and anarchists, and that the Right includes conservatives, reactionaries, neoconservatives, capitalists, neoliberals, economic-libertarians (not to be confused with the left's "civil-libertarians"), social-authoritarians, monarchists, theocrats, nationalists, Nazis (including neo-Nazis) and fascists.
It takes anybody with even half a brain about .0002 seconds to realize that any sort of ontology that lumps "neoconservatives" and "economic libertarians" (which basically corresponds with "US style libertarianism") in the same bin, or that lumps "anarchists" with "democratic socialists", or that lumps pretty much any kind of "libertarian" along with "monarchists" and "theocrats", is totally useless.
In modern terms, there is no "right" and there is no "left". There are just people haphazardly throwing around archaic labels because they either A. have a vague notion that the label confers some sort of appealing "vibe" and want to leverage it to advance their cause, or B. have a vague notion that the label will be seen as a pejorative and want to use it to put their opposition in a bad light.
>> I think it's ironic that those who are in favour of personal liberties are often on the left of the political spectrum whereas, in this instance, it was the Tories who rolled back aspects of surveillance legislation that was introduced by Labour.
Those on what passes for the 'left' of UK politics spent over a decade stripping rights and liberties away as far as they could in the name of protecting people.
I don't personally think it's much of a left-right issue. It's another axis.
Meanwhile, the Tories have rolled back legal aid, which means the poor can't get decent lawyers any more.
Non-Brits might be amused to know that this Tory government wanted to outsource poor people's legal representation to a haulage company. Yeah, you read that right.
I'm not the GP poster, but I'd suggest that one of the few policy areas the Tories and Lib Dems found that they could readily agree on was civil liberties.
One of the first things the new government did was start the wheels turning on what became the Protection of Freedoms Act 2012. Whether you think that law went far enough or not, I suspect most here would say it took steps in the right direction on numerous issues where the previous New Labour administration had eroded rights and liberties. A few of them were big-headline, fundamental issues, but there were quite a few relatively minor concerns addressed as well, such as the excessive use of surveillance by local authorities that we're talking about here.
Given that coalition government is not the norm in the UK, I suspect politically it was more important that this was something the two parties could agree on as an anchor than anything else, though no doubt many of the incoming Lib Dem MPs and at least some of the Tories were no fans of the previous situation anyway.
> A warrant could imply that they are unable to attack the provider
This is the institution which allowed U-boats to sink British ships so not to even hint that the ENIGMA cipher was cracked. I'm sure they're utterly incompetent at keeping secrets, and anyone on the internet can deduce their most critical SIGINT capabilities simply by observing how they deal with civil warrants in minor cases.
> This is the institution which allowed U-boats to sink British ships so not to even hint that the ENIGMA cipher was cracked. I'm sure they're utterly incompetent at keeping secrets
Easy call to make 70 years after the fact, I guess.
Well the Snowden revelations suggest that yes this kind of capability will become public. We exist in a space that is bounded by technological opportunities and limitations which agencies are just as subject to as everyone else.
A warrant might just mean that they're asking for the information legally, so that they can use it, rather than relying on their illegally gathered hacked information.
A warrant might mean that they do not ever illegally hack sites, and that they only ever obey the law.
RIPA is problematic law, and it's nice to see discussion of RIPA cases involving GCHQ.
Not to dismiss the problems with RIPA and GCHQ (I certainly think it needs better oversight and I'd prefer better law), but RIPA has also been abused quite a lot by many other organisations, for things that US readers may find really awful.
Local education authorities spy on parents to ensure they're actually in the catchment area of the school they want to send their children to, for example. Or councils spy on people applying for parking permits to make sure those people live in the area and aren't just selling the permits on. Having that activity regulated is important, but RIPA is less regulation and more encouragement.
> A warrant might just mean that they're asking for the information legally, so that they can use it, rather than relying on their illegally gathered hacked information.
Note that the "warrant", in this case, required that CertiVox hand over the key(s) to the encrypted data. It's possible that the authorities (not necessarily GCHQ itself, as NTAC provides assistance to other law enforcement and intelligence agencies) had already intercepted or seized the data but were unable to read it because it was encrypted.
"The House will notice that the Bill restricts the activities of the SIS and GCHQ for safeguarding the economic well-being of the country to the acts or intentions of persons outside the United Kingdom. The agencies may not and do not get involved in domestic economic, commercial or financial affairs."
So it's not quite as broad-reaching as it reads at first glance. It is generally a problem, though, that laws need to be written broadly enough to cover circumstances which might not have been considered at the time of writing, so that new laws don't need to be written all the time, yet they can't be so overly broad as to be unworkable - which is why the judiciary has its role in deciding how to interpret the laws and whether or not they ought to apply in a given circumstance.
Also, one can argue that it's a good thing in general that our intelligence and security services' activities are regulated by laws like this (instead of operating in grey areas beyond the law).
And, by the way, activities carried out by the UK's intelligence and security services OUTSIDE the UK are also subject to RIPA.
In a case of "Just because you're paranoid / Don't mean they're not after you", Harold Wilson, who was paranoid about lots of things but particularly about being bugged by MI5, was in No 10 at a time when there were bugs in the Cabinet Room and the PM's study - whether the bugs were actually used or not isn't known.
Wow, it shows a lot of integrity to close the product instead of keeping it up in a compromised state to comply with the warrant. We've seen other providers here in the US even change functionality to retain keys used in the web clients of secure email services at the behest of government orders.
This does mean that the UK is now on the list, along with the US, of places where no credible crypto startup is possible, though.
> This does mean that the UK is now on the list, along with the US, of places where no credible crypto startup is possible, though.
I can think of very few developed countries that, when the rubber hits the road, will let you do what you want. Taking measures to aid official police or court investigations is simply an implied obligation in most countries with developed legal systems. Very few countries will tolerate service providers whose raison d'etre is "we won't cooperate with the authorities" except in limited situations like off-shore banking when the primary purpose is to hide assets or information from people in other countries.
This isn't specific to crypto or police investigations either. Say you want to start an accounting firm that guarantees it will never share your records if subpoenaed in a civil lawsuit. This would never fly in the U.S., not now or one hundred years ago, and while I'm not super familiar with European law, I can't imagine it would fly in any western European country either.
I think it falls into four groups for communications providers:
1) Laws like CALEA, which (if applicable) require a provider to develop and expose backdoors to the government in advance of a request
2) Building systems where an operator doesn't have access, but where a court order can compel changes to the system, including ultimately shutting it down.
3) Being able to build a system where the operator doesn't have access to data and, if no data can be turned over in response to a request, continues operating; any data you do have must still be turned over.
4) No requirement to cooperate, or something equivalent to 4th/5th Amendment protections for the end user being extended to service providers.
I used to think the US was #3. I don't believe #4 exists anywhere, at least outside specific kinds of data (medical, legal). The US is at least #2 now, and might actually be #1 in more and more domains.
GCHQ are both better and worse than the NSA. I would argue that they are, if anything, even less restrained than the NSA in overbroad surveillance. However, it is less clear whether GCHQ have actually broken the rules (and they certainly haven't broken a constitution), which the NSA may have done.
My view is that everyone (in the world) should be pissed off at both the NSA and GCHQ, and that Americans should be additionally angry that the NSA seems to have broken the limits placed on them in terms of domestic surveillance.
For those unaware it may be worth pointing out that the UK constitution has a principle of parliamentary sovereignty [1] - basically parliament can pass/amend/remove any law and no current law can bind future parliaments (they can always amend/remove it). Fundamentally if they say it's legal it is, so rules can be changed at any time such that GCHQ isn't technically breaking any rules.
The article also mentions this company's new product where additional security is possible because decryption requires data kept only by the customer. Perhaps that is the new way to go?
"where rather than hold the data, it is split in two so CertiVox has one half and the user has the other, and law enforcement would need both to access the data."
By data here, do they mean the encryption key? If so, I'm not sure what splitting it in two does, as you presumably have to join the halves together somewhere, and GCHQ would presumably just attack wherever the complete key is available.
Defining some terms: Alice, Bob, the company, and GCHQ (Eve).
Alice sends Bob stuff, via the company.
GCHQ sends RIPA requests to the company, which means that GCHQ get Alice's communications. Neither Alice nor Bob are aware.
So, a new scheme where Alice and the company need keys to decrypt means that GCHQ send RIPA requests to the company, and to Alice. Now Alice, and the company, are aware of the RIPA request. Bob isn't. But at least Alice can stop sending things to Bob.
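I don't know what construction CertiVox actually have in mind, but even a toy 2-of-2 XOR split shows why splitting helps: the half the company holds is, on its own, statistically independent of the key, so a notice served only on the company gets nothing readable, and one served on Alice as well at least makes her aware of it. A rough Python sketch (purely illustrative; a real design would use something like Shamir secret sharing or threshold cryptography):

    import os

    def split(key: bytes) -> tuple:
        # One share for the provider, one for the customer; neither alone
        # reveals anything about the key.
        company_share = os.urandom(len(key))
        customer_share = bytes(a ^ b for a, b in zip(key, company_share))
        return company_share, customer_share

    def combine(share_a: bytes, share_b: bytes) -> bytes:
        return bytes(a ^ b for a, b in zip(share_a, share_b))

    key = os.urandom(32)
    company_share, customer_share = split(key)
    assert combine(company_share, customer_share) == key

As the parent points out, though, the shares still have to be combined somewhere to decrypt, and that combination point remains the obvious place to attack.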
I don't know anything about the implementation, but I would assume that the idea is to avoid having the service provider be a single point where complete failure can occur. Perhaps this new system offloads that risk to the customer.
But then I get confused: what service is the service provider providing if the customer still needs to carry around half of 'the data'?
The headline attached to this submission is sensationalist and misleading. GCHQ did not force CertiVox to shut down PrivateSky. CertiVox decided to close PrivateSky after they were served with a notice issued under section 49 of the Regulation of Investigatory Powers Act (RIPA) which required that they hand over the key(s) required to decrypt one (or more) of their customer's data.
I disagree. The headline is reasonable because you can be put in a position in which you're not forced to do something, but it's extremely unreasonable not to do that thing. They weren't forced to close down per se, but their other options were extremely infeasible or reprehensible.
Arguably it is a problem they created by the architecture they chose, in which this is a fatal flaw of design that was almost certainly something that could be anticipated.
I do agree, but historically we've been able to write software in an environment we didn't foresee being hostile. We've been too trusting and blasé when it comes to privacy and security. For example, look at the original SMTP implementation: the designers didn't take into account there might be governments or other bad actors that would snoop wholesale on such messages as they bounced around the infrastructure. If the original designers had known, the specification would have undoubtedly been different.
It's also hard to create a system that's private and secure, and it's vastly less convenient to use which increases friction and drives down adoption. PGP, for example, is very good but extremely inconvenient to use all the time.
As an outsider, I think I understand the rationale behind their design decisions. Just saying it should have been more carefully created with privacy in mind glosses over the many complexities and difficulties that design goal entails.
We, as creators of software, have problems that are more political than technical, no matter where we are in the world.
Your interpretation is extremely naïve. I could hold a gun to your head, for example. You wouldn't be forced per se to meet my demands, but you probably would, because it would be extremely unreasonable not to. In the same way, the consequences for not complying were: an extremely expensive and possibly unaffordable rewrite, going to jail for a few years, or giving up their customers' data in violation of their principles. The GCHQ order effectively put a gun to their head.
Sounds like these services actually shutting down is a good thing for the end users, since they weren't securely designed in the first place! They even admit in the article that they'd have had to properly design their system to make email unreadable by their staff, and they chose to shutter their service rather than figure out how to make that work and still make money.
> I could hold a gun to your head, for example. You wouldn't be forced per se to meet my demands, but you probably would because it would be extremely unreasonable not to.
GCHQ never demanded that CertiVox shut down their service. That is the bottom line and, no matter what sort of ridiculous "gun against your head" rationale you come up with or how firmly you plug your ears while shouting "LALALA!" at the top of your voice, you cannot refute that fact.
EDIT: Actually, why don't we see what the CEO of CertiVox has to say on the matter?
"The headline strongly infers our friends at GCHQ “forced” us to take PrivateSky down. That’s hogwash."
> So if I put a gun to your head, and you give me money, you haven't been forced to, right?
Wrong. I HAVE been forced. You have threatened me with death if I don't comply with your demand.
GCHQ never demanded that CertiVox shut down their service. They (or, rather, the Home Secretary) demanded that they hand over the key(s) required to decrypt one (or more) of CertiVox's customer's data.
If CertiVox took it upon themselves to then shut down the PrivateSky service, that is their choice. But it is a lie to claim that GCHQ forced them to.
CertiVox could claim that they "felt they had no option but to shut down the service" or that they "could no longer, in all conscience, continue to offer and market a service that" blah blah blah, but the fact is that they always knew that PrivateSky was vulnerable to section 49 notices.
RIPA's been on the statute books for over a decade and section 49 has been in force since 2007. CertiVox launched PrivateSky in 2011, which means that they either built a service that they knew was vulnerable to RIPA disclosure requirements or they didn't know about RIPA. I don't believe that they didn't know about RIPA.
It doesn't matter what special snowflake rationale you come up with about what's reasonable or unreasonable. The simple fact of the matter is that the headline is misleading. GCHQ did not force PrivateSky to shut down.
EDIT: Even the CEO of CertiVox refutes the claim that GCHQ forced the closure of PrivateSky!
"The headline strongly infers our friends at GCHQ “forced” us to take PrivateSky down. That’s hogwash."
That doesn't appear to be the whole story. They claim to have needed to re-engineer their system to have access to keys, effectively making it a different service than it had been. And that means a demand for keys can effectively make services where users' keys are not accessible to the operator impermissible.
"or spend £500,000 building a backdoor into the system"
A nice round number... They had a good motive to protect their customers and close the service, so why start talking about this kind of ridiculous development cost, which sounds like it was pulled out of the air?
I'd at least be more sympathetic if they stuck to the facts. I could even support their cause by posting and tweeting that this is terrible. As it is, I just feel like telling them to cut the BS, even though they apparently were treated wrongly and intimidated into shutting down their business.
Since a comment claiming to come from the CEO of PrivateSky is dead, it seems worth re-posting so that more people can read it :-
"
This is Brian, the CEO. Yea, it was a nice round number out of the air. But it probably wasn't far off the mark.
For all the other comments, below is a blog post on the matter which is going to go live shortly:
With the story about our PrivateSky takedown now public, I want to take the opportunity to clarify a few points in various articles that have appeared since yesterday covering the story.
The headline strongly infers our friends at GCHQ “forced” us to take PrivateSky down. That’s hogwash. In fact, the headline contradicts the article, which becomes clear as you read it.
Secondly, a very important point wasn’t printed. GCHQ couldn’t, by law, request a blanket back door on the system. There are a very rigid set of controls that mean only specific individuals can come under surveillance. The legal request for such surveillance has a due process that must be stridently followed. At no time did I or anyone at CertiVox talk about CertiVox in relation to any RIPA warrant, only the generic process by which these warrants are served.
By saying “our friends at GCHQ”, there is no facetiousness intended. The team at CertiVox have the upmost respect for the folks we interacted with at GCHQ. They took the due process I outlined in the previous point very seriously. We found that as an organisation, and every individual involved there, were as worried about a breach of public trust as much as we are.
Finally, I believe very strongly the following should be a larger part of the public discourse of these subjects. What everyone needs to understand is that every developed democracy in the world, even where privacy rights are enshrined to the maximum efficacy by statute, has laws on the books that mandate that Internet Service Providers have facilities to work with law enforcement for the purposes of legal intercept, to enforce public safety and security.
Being L.I. capable is a very important set features and functions that must be in place for any credible, commercial service on the Internet. In endeavouring to make PrivateSky as secure as possible, we overlooked this critical requirement when we built PrivateSky.
When CertiVox positioned PrivateSky as the easiest to use and most secure encrypted messaging service, we really had two significant points of differentiation. First, even though we held the root encryption keys to the system, it was architected in such as way that it would have been all but impossible for our internal staff to snoop on our customer’s communications, or for the service to leak any of our customer’s data. Secondly, our possession of the root keys, and our use of identity based encryption, made the system incredibly easy to use. For the user, there were no private or public keys to manage, every workflow was handled for the user in an easy to grasp pure HTML5 interface, no hardware or software required, just an HTML5 browser.
We boxed ourselves into a feature set and market position that when called upon to comply with legal statues, we simply had no alternative but to shut the service down. We built it, but we couldn’t host it.
Why? Because as you can probably surmise, there is an inherent impedance mismatch between being able to host a commercial communications service that gives the upmost in privacy to its users, against any breach, whilst at the same time being able to operate safely within the confines of the law as it is on the books in most countries on the planet.
Is this wrong? Actually, I don’t think it is. This may be an unpopular viewpoint, but I cannot argue against having a well regulated legal intercept function as being necessary to have in place for a society that prizes law and order and the safety of its citizens. This is speaking as someone who lived in NYC during 9/11.
In summary, it’s the abuse of the communications interception in the Snowden revelations that has everyone up in arms, as so it should. But that’s not what happened with PrivateSky.
"
Just confirmation of another vendor who said no to government intrusion on their customers' data.
I'll say it again, since it bears repeating: how many more companies that we won't hear about are complying and handing their keys to the government so it can track people?
Makes me wary of using any "encrypted" email service right now.
I like the idea of obfuscating the location of the containers of the data as well as the content.
Shred, encrypt, disperse. You send a token to the recipient and they are able to use that to build location requests for the containers. Once they have the parts, they can assemble and decrypt.
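Something like this, perhaps. A very rough Python sketch of the idea (everything here is hypothetical: the "stores" are stand-in dicts for separate hosts, and the token-derived locations and key are just placeholders, not a vetted design):

    import hashlib
    import os
    from nacl.secret import SecretBox   # assumes PyNaCl

    CHUNK = 4096

    def shred_encrypt_disperse(data, stores):
        # The token is the only thing sent to the recipient.
        token = os.urandom(32)
        box = SecretBox(hashlib.sha256(token + b"content-key").digest())
        chunks = [data[i:i + CHUNK] for i in range(0, len(data), CHUNK)]
        for idx, chunk in enumerate(chunks):
            # Container locations are derived from the token, so they look
            # random to anyone who doesn't hold it.
            location = hashlib.sha256(token + idx.to_bytes(4, "big")).hexdigest()
            stores[idx % len(stores)][location] = box.encrypt(chunk)
        return token, len(chunks)

    def fetch_assemble_decrypt(token, stores, n_chunks):
        box = SecretBox(hashlib.sha256(token + b"content-key").digest())
        parts = []
        for idx in range(n_chunks):
            location = hashlib.sha256(token + idx.to_bytes(4, "big")).hexdigest()
            parts.append(box.decrypt(stores[idx % len(stores)][location]))
        return b"".join(parts)

    stores = [dict(), dict(), dict()]   # stand-ins for separate hosts
    token, n = shred_encrypt_disperse(b"secret report " * 500, stores)
    assert fetch_assemble_decrypt(token, stores, n) == b"secret report " * 500

The open questions are how the token itself gets to the recipient without being intercepted, and who keeps the mapping from token to chunk count.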
[1] https://news.ycombinator.com/item?id=6672442
[2] http://www.thoughtcrime.org/blog/lavabit-critique/