Open Letter From UK Security Researchers (bristolcrypto.blogspot.co.uk)
114 points by austengary on Sept 16, 2013 | 21 comments



These guys join a chorus of academic voices calling for the same. We've seen similar statements from Professors Rogaway and Green in the past few weeks on HN.

I think we need a legal solution, but if the tech community refuses to participate in corrupting the technology, it puts the spooks in a worse position. I'd sign a pledge saying I will never work on any apparatus (software, data centers, anything at all) that is used for mass surveillance. That's pretty easy for me to do, though... the money government throws at some companies and people for this work is more tempting than the One Ring.

http://www.cs.ucdavis.edu/~rogaway/politics/surveillance.pdf

http://bits.blogs.nytimes.com/2013/09/10/government-announce...

More like this couldn't hurt.


> I'd sign a pledge saying I will never work on any apparatus (software, data centers, anything at all) that are used for mass surveillance.

Anybody that ever worked on Hadoop could not sign such a pledge.

Anybody that ever worked on Linux could not sign such a pledge.

Anybody that ever worked on Solr could not sign such a pledge.

And so on. You get the idea: it is not possible to contribute to open source projects with applications in large-scale data storage, data mining, general-purpose operating systems and so on without, as a side effect, making the apparatus of mass surveillance possible.

The only safeguards that will really work are very strict legal ones, transparency and accountability. As long as the last two are not present, the first is meaningless (which is the situation we are currently in).


I think you know what I mean, but I'm sure the pledge could be worded to avoid this sort of conundrum.

If I make something that gets co-opted for use in surveillance, fine. But if, for example, I accept a grant from the NSA or GCHQ to improve Hadoop in some specific way, I've crossed a line.

I'm not going to say anyone who ever worked on designing a computer is complicit, but I don't think it's unreasonable to avoid working for companies that supply tons of computers for surveillance. Like I said, it's a tough sell to people who have families to support, but I don't think it's crazy to quit working for certain companies based on the revenue they get from the NSA and related agencies.

There was a time in my life that I would have contracted for the NSA or Booz Allen, I'm fairly sure. Given the revelations, and also given my current financial circumstances, I would not today.


I agree with this, and would take it even further. I don't have a problem with certain types of surveillance software, the kinds that create jobs for many developers.

Software that blocks access to adult material for school networks, spam filtering software, any sort of contextual advertising software, web analytics, all of these have many traits of, or are, surveillance software, but I allow myself to be monitored by them all, knowingly.

I don't see a problem with schools and offices filtering inappropriate material, reading email for certain purposes, tracking users, etc, as long as users are aware of these practices and are willing to trade that element of their freedom for the service being provided. The only practical way forward for any anti-surveillance movement is to focus on legislation, and on tools to detect (and, under appropriate circumstances, evade) surveillance - e.g. I might use Tor at home, but would allow my work email to be read if required.


You could always add a 'don't use this for evil' clause to your licences… http://wonko.com/post/jsmin-isnt-welcome-on-google-code

(This is not a serious suggestion)


I'm still surprised the Data Protection Act hasn't been mentioned in the UK in relation to those issues. Basically it says in Schedule 1, Part 1 (The principles), point 8:

> Personal data shall not be transferred to a country or territory outside the European Economic Area unless that country or territory ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data.

Which people usually assume includes the US. Well... I guess you could argue otherwise now. It has been demonstrated that even encrypted data is not protected in transit to the other side. Can we start invoking the DPA every time UK personal data is stored in AWS or similar hosting?


Which is funny, since visiting many UK government sites now involves sending information about your use of the site to a non-EU corporation, because someone let some young hip web devs loose on them and they stuck Google Analytics everywhere.

Technically there is some kind of agreement about data security, but it's absolutely worthless. If the NSA comes knocking on Google's door and asks for complete information about UK citizens' interactions with the UK government's web presence, then Google is going to cough up everything. What choice do they have? Larry Page isn't going to martyr himself over what would be a totally legal NSA request.


The DPA doesn't apply because there is an exemption for National Security.


I was under the impression that it's about the national security of the UK, not just any country. (The section of the Act covering exemptions lists national security in subsection (1).)

> (2) Subject to subsection (4), a certificate signed by a Minister of the Crown certifying that exemption from all or any of the provisions mentioned in subsection (1) is or at any time was required for the purpose there mentioned in respect of any personal data shall be conclusive evidence of that fact.

Also, such a certificate is not secret and can be challenged. IANAL, so correct me if I got this wrong.


Interesting to see this because at least one of these guys used to work for GCHQ.


There's some connection with the recent NIST/NSA row, especially since the same trading of researchers occurs between those two.


The University of Bristol has an entire maths institute funded by GCHQ. There's only so much use pure maths researchers can be to intelligence services.


[deleted]


The math in encryption is fine, however the governments have been inserting themselves into every nook and cranny outside of encryption. When they control the communications equipment and they have inserted themselves into the CAs then we're completely helpless.

The only way to win the game is not to play it -- don't rely on CAs, and assume HTTPS is plaintext. We may be able to help ourselves, but the general populace is helpless.

This is why we appeal to the governments that are attacking us: the general populace cannot protect themselves. We may be able to protect ourselves but we shouldn't be so selfish.


Can we remake the Internet by encrypting every packet with ECC? Something like DJB's CurveCP [1] to replace TCP, but with some user-friendly improvements (seeing proper names and links, not just strings of random characters). DJB said the performance overhead of doing that is only 15 percent [2], which seems well worth it to me.

I feel that to be really protected against the NSA in the future we'll need something like that, along with starting to demand open source firmware from all hardware vendors. In the meantime we can use ECDHE to encrypt all sessions.

[1] http://curvecp.org/

[2] http://www.youtube.com/watch?v=K8EGA834Nok&feature=youtu.be&...
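
The primitive CurveCP builds on is Curve25519 Diffie-Hellman. Here's a toy, pure-Python sketch of that key agreement following the RFC 7748 Montgomery ladder, but operating on Python ints rather than the spec's little-endian 32-byte strings; the variable names are mine, and it is not constant-time, so this is illustration only — never use it in place of a real library like NaCl.

```python
# Toy Curve25519 (X25519-style) key agreement, per the RFC 7748
# Montgomery ladder. Integer-based and NOT constant-time: a sketch
# of the math, not a usable implementation.
import secrets

P = 2**255 - 19   # the Curve25519 prime
A24 = 121665      # (486662 - 2) / 4, the curve constant

def clamp(k: int) -> int:
    """Clamp a scalar as RFC 7748 prescribes."""
    k &= ~7                  # clear the 3 low bits (kill the cofactor)
    k &= (1 << 255) - 1      # clear bit 255
    k |= 1 << 254            # set bit 254
    return k

def x25519(k: int, u: int) -> int:
    """Multiply the point with u-coordinate u by the scalar k."""
    k = clamp(k)
    x1 = u % P
    x2, z2, x3, z3, swap = 1, 0, x1, 1, 0
    for t in reversed(range(255)):
        k_t = (k >> t) & 1
        if swap ^ k_t:
            x2, x3, z2, z3 = x3, x2, z3, z2
        swap = k_t
        a, b = (x2 + z2) % P, (x2 - z2) % P
        aa, bb = a * a % P, b * b % P
        e = (aa - bb) % P
        c, d = (x3 + z3) % P, (x3 - z3) % P
        da, cb = d * a % P, c * b % P
        sm, df = (da + cb) % P, (da - cb) % P
        x3 = sm * sm % P
        z3 = x1 * df % P * df % P
        x2 = aa * bb % P
        z2 = e * (aa + A24 * e) % P
    if swap:
        x2, z2 = x3, z3
    return x2 * pow(z2, P - 2, P) % P   # x2 / z2 mod P

BASE_U = 9  # standard Curve25519 base point u-coordinate

# Ephemeral Diffie-Hellman: each side publishes x25519(secret, 9)
# and combines its own secret with the peer's public value.
alice_secret = secrets.randbits(256)
bob_secret = secrets.randbits(256)
alice_public = x25519(alice_secret, BASE_U)
bob_public = x25519(bob_secret, BASE_U)
assert x25519(alice_secret, bob_public) == x25519(bob_secret, alice_public)
```

Both sides arrive at the same shared secret, which is then used to key a symmetric cipher for the actual packets — that per-packet encryption is where DJB's quoted overhead figure comes in.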


Be careful with what ECC algorithm you use.

Bruce Schneier recommends against ECC in [1]:

  > Prefer conventional discrete-log-based systems
  > over elliptic-curve systems; the latter have
  > constants that the NSA influences when they can.
and clarifies in [2]:

  > I no longer trust the constants. I believe the
  > NSA has manipulated them through their relationships
  > with industry.
[1]: http://www.theguardian.com/world/2013/sep/05/nsa-how-to-rema...

[2]: https://www.schneier.com/blog/archives/2013/09/the_nsa_is_br...


First, Schneier has a weird track record with ECC. I think he may be alone among "well-known" cryptographers in his distrust for ECC, which goes back over a decade.

Second, the CurveCP "constants" aren't NIST derived; they're Bernstein's Curve25519, which is derived transparently from first-principles math.

Third, there are standardized NIST curves (over binary fields) that are also derived transparently from first-principles.

Fourth, the curves that aren't totally transparent are still derived from a SHA hash of a random string, per the method in IEEE 1363 (which in 1363's context makes perfect sense, since you can't really generate "fresh" curves for applications from first principles without everyone ending up with the same curves). The backdoor scenarios here are... convoluted.
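
A simplified sketch of that "hash a seed" idea: derive the curve coefficient from a published seed, so anyone can re-run the derivation and confirm the parameters weren't chosen freely. The real FIPS 186 / IEEE 1363 procedure maps the hash onto the coefficient b with more steps; the seed string, toy prime, and helper name below are all illustrative.

```python
# Simplified "verifiably random" curve generation: hash a public
# seed (with a counter) into candidate b values for the curve
# y^2 = x^3 + ax + b until the curve is non-singular. Toy field;
# real standards use e.g. the P-256 prime and extra checks.
import hashlib

p = 2**61 - 1   # toy prime field
a = p - 3       # a = -3 mod p, the conventional choice

def derive_b(seed: bytes) -> int:
    """Derive b deterministically from SHA-1(seed || counter)."""
    counter = 0
    while True:
        h = hashlib.sha1(seed + counter.to_bytes(4, "big")).digest()
        b = int.from_bytes(h, "big") % p
        # non-singular iff the discriminant 4a^3 + 27b^2 != 0 mod p
        if (4 * pow(a, 3, p) + 27 * b * b) % p != 0:
            return b
        counter += 1

b = derive_b(b"nothing-up-my-sleeve")
# Anyone re-running the derivation from the published seed gets the
# same b; the generator's only freedom is the choice of seed.
assert b == derive_b(b"nothing-up-my-sleeve")
```

The residual worry is exactly the convoluted scenario above: an attacker who secretly knew of a rare class of weak curves could grind through seeds until the hash happened to land on one, since the seeds themselves are unexplained random strings.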

The more you learn about the situation with NIST ECC, the less likely an overt backdoor seems. Maybe academia has missed something big and all of ECC is broken; if that's the case, I think you should kiss conventional IFP and DLP crypto goodbye too.


CurveCP uses Curve25519, not a NIST curve. Schneier's comments on ECC apply only to NIST curves such as P-256, which were influenced by the NSA. Curve25519 was developed independently by a guy who sued the US government for the right to export crypto (djb).


Good news: CurveCP already supports the user-friendly feature you describe, via CNAMEs. You make a CNAME from www.example.com to $pubkey.example.com, and users never see $pubkey.


The sad truth is that they will never fully, truthfully come clean, due to "National Security". At this point the NSA views this as an information leak and is trying to "plug" the drainage. They are in damage control mode. Unfortunately, now that the word is out, "potential enemies" will be looking for these holes in places they did not look before, just because of the hints that they might exist - and probably do.


Really? Only the professors deserve their own line?


I agree that looks rather silly, but it's also worth noting that "Professor" in the UK means something rather different from what it means in the US. In a UK academic department only the top researchers have the title of Professor; other full-time teaching and research staff are Lecturers, Senior Lecturers or Readers.



