Encryption export restrictions are an absolute joke. I can go read a Wikipedia article right now that details the inner workings of most block ciphers in depth. They're still fining people for exporting anything over 64 bits (which is most things), even though almost every individual, let alone government, has access to the technical knowledge to reproduce such encryption at home.
Seems like this fine is about them exporting a custom Linux distro with all the normal encryption libraries still in place (e.g. AES-256).
Why do I get the strong sense that this fine isn't really about exporting encryption and is really about Wind River failing to place backdoors into this equipment? Because frankly that makes a lot more sense than what it appears to be on the surface.
Wind River got fined for "something" to do with encryption. Now you can either take the government at their word and assume that that thing is just exporting AES-256 within a standard OS, or look a little deeper and wonder if it was punishment for something else Wind River did or didn't do.
>Encryption export restrictions are an absolute joke.
I don't think you give BIS & NSA the credit they deserve. Trust one who speaks from experience. The restrictions may serve a useful purpose, and if you can't see it, that doesn't mean they are the ones missing the point.
Having myself had to pass export control for one of my apps, and so having had the opportunity to pore over these regulations in great detail, I came to the conclusion that the actual purpose of the crypto export regulation is not to control export of the technology. That sort of restriction is impossible to implement, as you noted, and it's also quite unnecessary.
The actual purpose of the law is to prevent export of expertise. Implementing a crypto system that is secure end-to-end is pretty much impossible without being an expert, and even then it's exceptionally hard. The best chances belong to a team of experts with a track record of fruitful collaboration and a rich history in a particular area. Unlike bits and pieces of code, cohesive teams of experts are relatively easy to keep track of.
The way this control regime works is that the entirety of the regulation revolves around two questions: 1) Are you supplying a turn-key end-to-end system to the client? 2) Are you supplying services to set up a complete, tailored system for the client? If the answer is "yes" to either question, that raises a red flag for the BIS/NSA, and they want to see what's going on there. Otherwise it's just a minor bureaucratic hoop to jump through. You are certainly welcome to export a bunch of crypto code, and they even have a special open-source exception to the paperwork requirements.
So, you may disagree with the goals of this regulation, but it's certainly not the joke you make it out to be.
> 1) Are you supplying a turn-key end-to-end system to the client?
> So, you may disagree with the goals of this regulation, but it's certainly not the joke you make it out to be.
It remains a joke when you consider how many really good open-source implementations are in the wild already. Think about the crypto in Debian and OpenBSD, including GPG, LUKS, OpenVPN, OTR, and OpenSSL/LibreSSL with their support in Apache, nginx, etc. Even for Android, the best end-to-end encryption software (security-wise, not interface-wise) is open source. Let's face it, we won this crypto "war" long ago. (This is not to say we don't have other problems to deal with, regarding the immense power the NSA etc. have.)
And even if all sufficiently secure software were closed source, as long as it is reasonably widespread, it would be easy to pirate the software instead of importing it from the US.
> 2) Are you supplying services to set up a complete, tailored system for the client?
What kind of systems do you have in mind? AFAIK, the case referred to in the article does not fall into this category.
These are all components, not systems. A system would be something akin to a full-bank installation with all the computers, cabling, routers, crypto protocols, key generation/escrow/rotation/destruction, fall-back procedures, firewalls, hotglued USB ports, etc.
You can use those components to build a system like this, but you have to be an expert to do it. This is why component export is restricted: all they want is to look into your design to see if you are an expert capable of designing and implementing a secure system. What happens if they take an interest in you is beyond my experience. I know, it speaks poorly of my crypto skills, huh?.. :) I imagine they would start looking into the identity of the client(s), and see if they are connected to embargoed entities. Or something...
Fair point. I didn't recognize you had such a high-level view of "system". But this raises the question of whether one can really call these laws "cryptography export restrictions". Because, sure, cryptography is involved, yet the restrictions only apply to (or are only meaningful in regard to) the procedures involved in secure IT systems in general. I'd argue that this expertise is somewhat independent of cryptography, since you can swap the cryptography implementation for any other and the procedures would stay the same.
Even better, leave the cryptography out of the package entirely and just include an "apt-get install" line or a list of open source projects you have to install. Instructions for "cabling, routers, crypto protocols, key generation/escrow/rotation/destruction, fall-back procedures, firewalls, hotglued USB ports, etc." aren't cryptography in themselves, are they?
First, it adds more moving parts, making the system more likely to contain a hole, which might be just enough for the NSA. Second, consider that the typical buyer is a bureaucracy, and they are either buying a crypto system or they are not. The upgrade might be trivial for you, but a bureaucrat has no way of knowing that, so he has to play by the rules.
I think the key to the scale of the word "system" is that it's however big it needs to be to establish actual security. You can have perfectly good crypto, but if you're relying on certificate authorities, you may be safe from street hackers, but as far as the NSA is concerned it's wide open.
It's just reading tea leaves, of course. I imagine the NSA is not enjoying reading thousands of applications from iOS app devs like me, so if they keep doing it, they must be getting something for their effort.
BTW, other countries can apply the same rules against US companies, and some are starting to.
China has started to impose a lot more rules and regulations regarding the import of products with any kind of crypto.
They are becoming a bigger IT/mobile product market than the US.
I can see, in 2-3 years, as soon as homegrown Chinese mobile SoCs are mature, they will start to require Intel/Qualcomm/Apple to disclose everything they do at the chip/software level related to crypto before any next-gen products are allowed to be sold in the Chinese market.
The choice for Apple/Qualcomm/Intel will be to do what the Chinese government wants or lose access to 30-50% of their customers on principle - like Google did a few years ago.
It's not illegal to export the system; you just have to get BIS/NSA permission to do so.
I see a few possible explanations for why they allowed it: 1) the NSA was satisfied with their intercept capability (likely around key generation/exchange being based on the cellular network/SMS), or 2) they decided that the technology was a commodity and it's best to have Apple dominate it (where they can still attack the data center or use legal means) rather than cede it to a foreign power.
Whilst the ability to implement secure systems is certainly not commonplace, the United States doesn't have a monopoly on encryption and security expertise.
If the USG wanted to coerce Wind River into backdooring vxWorks, they have better ways to do it than a newsworthy enforcement action regarding cryptography exports.
Export controls for crypto are stupid and counterproductive, though; I agree with you there.
> I can go read a Wikipedia article right now that heavily details the inner workings of most block ciphers.
Yes, you can. And from that, you can go implement those ciphers. And the NSA is probably OK with that, because it's really hard to implement crypto correctly. If I understand correctly, more crypto systems are broken because of implementation flaws than because of protocol flaws. So if you have to implement it yourself, odds are that you will make a mistake that makes it insecure, especially if you aren't an expert on crypto.
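To make that concrete: one of the classic implementation flaws has nothing to do with the cipher itself. A minimal Python sketch (the key and function names are made up for illustration):

    import hashlib
    import hmac

    KEY = b"server-secret"  # illustrative key, not a real secret

    def tag(message: bytes) -> bytes:
        return hmac.new(KEY, message, hashlib.sha256).digest()

    def verify_broken(message: bytes, candidate: bytes) -> bool:
        # BUG: == bails out at the first mismatching byte, so how long
        # the comparison takes leaks how much of the tag is correct.
        # An attacker can forge a valid tag byte by byte from timing.
        return tag(message) == candidate

    def verify_better(message: bytes, candidate: bytes) -> bool:
        # compare_digest takes time independent of where inputs differ.
        return hmac.compare_digest(tag(message), candidate)

Both functions compute exactly the same MAC; only the comparison differs, and that's the difference between broken and OK.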
> And from that, you can go implement those ciphers [...] If I understand correctly, more crypto systems are broken because of implementation flaws than because of protocol flaws.
That is comparing flawed homemade implementations with highly professional, flawless implementations that have highly professional backdoors built in, like the RSA products.
>So if you have to implement it yourself, odds are that you will make a mistake that makes it insecure, especially if you aren't an expert on crypto.
It of course depends on who you are, and whether somebody would go to the lengths of custom-cracking your custom implementation versus buying an "off-the-shelf" crack/backdoor if you use off-the-shelf software.
I know this is a common belief, but I'm not completely sure it's true. I mean, crypto is an algorithm, and there are reference test vectors for the most common ones, so you can be pretty sure you got the basics right - something like the known-answer check below.
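For what it's worth, here's what that sanity check looks like - a sketch assuming the third-party pyca/cryptography package, checking against the AES-128 known-answer vector from FIPS-197 Appendix C.1 (the single-block ECB use is only for the test, not something you'd ship):

    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key = bytes.fromhex("000102030405060708090a0b0c0d0e0f")
    plaintext = bytes.fromhex("00112233445566778899aabbccddeeff")
    expected = bytes.fromhex("69c4e0d86a7b0430d8cdb78070b4c55a")

    # Encrypt one block and compare with the published ciphertext.
    enc = Cipher(algorithms.AES(key), modes.ECB()).encryptor()
    assert enc.update(plaintext) + enc.finalize() == expected

Passing a known-answer test tells you the core transform is right; it says nothing about modes, padding, key handling, or side channels.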
Now sure, you need to worry about other things, like the entropy of random number generators and so on, but this stuff isn't exactly rocket science.
So does experience help? Sure. And maybe you need to be in the upper quartile to get it right, but programming is programming, and yes, it's going to take some time and effort, but clearly it's doable...
Now I'm not recommending you do your own - I'm just not sure I buy the idea that it's impossible for mere mortals to pull it off...
Then again, maybe I'm just an NSA employee wanting to make you think it's safe...
You should take a look at the [encryption] tag on stack overflow. There is a lot of really depressing stuff there.
Implementing low-level cryptographic algorithms is hard - I'd say near impossible to get right - if you apply high standards like side-channel resistance. There are a lot of use cases, though, where side-channel resistance is not necessary to be secure "enough" against the slightly-above-average attacker. But the next level to get wrong is using the low-level crypto correctly. E.g. by assuming encrypting data is enough to protect the content, until you find out about CBC padding oracles and how your service involuntarily decrypts the sensitive data for the attacker (ASP.NET: [1]). Or you realize that your ad-hoc message authentication code does not protect your API at all, because MD5, SHA-1, and the SHA-2 family are all subject to length extension attacks (Flickr: [2]).
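The Flickr-style mistake fits in a few lines. A minimal Python sketch (key and message are made up): the ad-hoc construction looks completely reasonable and is completely broken:

    import hashlib
    import hmac

    key = b"secret-key"               # illustrative values only
    msg = b"user=alice&role=user"

    # Ad-hoc MAC: sha256(key || msg). Because SHA-256 is a
    # Merkle-Damgaard hash, anyone who sees this tag can compute a
    # valid tag for msg + padding + "&role=admin" WITHOUT the key -
    # that's the length extension attack.
    bad_tag = hashlib.sha256(key + msg).hexdigest()

    # HMAC's nested construction is what defeats length extension.
    good_tag = hmac.new(key, msg, hashlib.sha256).hexdigest()

The two one-liners differ by a single function call, and nothing about the broken one looks wrong until you know the attack.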
The subtleties are endless: in RSA, you can calculate the `e`th root over the real numbers if `e` and the message are small; RSA has its own padding problems [3]; stream ciphers provide no authenticity whatsoever, as you can just XOR the ciphertext with any value to manipulate the plaintext; password hashing is not at all about what most people believe it to be; comparing passwords should be done using a PAKE [4] (otherwise you can brute-force passwords, like in WPA)... And then there is stuff that even the experts get wrong - think TLS Triple Handshake [5].
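The stream cipher point in particular is worth seeing once, because it surprises people. A sketch using AES-CTR from pyca/cryptography (the message is made up); note the attacker never touches the key, only the ciphertext:

    import os
    from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

    key, nonce = os.urandom(32), os.urandom(16)
    plaintext = b"pay alice $0001"

    enc = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
    ct = enc.update(plaintext) + enc.finalize()

    # CTR is plaintext XOR keystream, so flipping ciphertext bits
    # flips the same plaintext bits: turn "$0001" into "$9001".
    tampered = bytearray(ct)
    tampered[11] ^= ord("0") ^ ord("9")

    dec = Cipher(algorithms.AES(key), modes.CTR(nonce)).decryptor()
    print(dec.update(bytes(tampered)) + dec.finalize())  # b"pay alice $9001"

This is why "encrypted" without "authenticated" buys you much less than it sounds like it should.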
You can learn about this kind of stuff by reading the right books (I like to recommend "Cryptography Engineering" [6]) and keeping an eye open for vulnerabilities and how they work, so it is not obscure or hidden knowledge. But building cryptographic systems involves a lot more work and expert knowledge than the average programmer in the upper quartile imagines. And, crucially, you won't know when you get it wrong.
Sure, crypto is hard, but I don't think that for the last 20 years or so we have been living in a world where the five countries (and one specially-administered territory) mentioned in the article as having received the software (China, Hong Kong, Russia, Israel, South Africa, and South Korea) would have significantly more trouble than a private company in the U.S. producing secure cryptographic software based on known algorithms and protocols. The only consequence of export restrictions on crypto, beyond the things that live in the classified/military world, is reducing the competitiveness of American software firms and decreasing economic output by having every piece of security software developed twice: once in the U.S. and once in, say, Switzerland or India or whichever country doesn't have insane export restrictions on crypto.
And those references are either well-studied by the NSA to learn the implementation failures, or they've actively worked on the reference to insert a weakness.
Dude. $750K for Intel. They spend more per day on psychological counseling for their chip geek employees who think they're living in the Matrix. Ain't gonna force them to backdoor nothin'... ain't gonna be punishment either for not doing a backdoor job when they were told nicely to do a nice backdoor job.
Sure, my tinfoil is as good as anybody's... REYNOLDS WRAP HEAVY DUTY... best there is... but this seems to be a clear case of some bureaucrat applying the rules he or she has been given to the case he or she has been given... Nothing else...
> Encryption export restrictions are an absolute joke. I can go read a Wikipedia article right now that heavily details the inner workings of most block ciphers.
You are right. Encryption export restrictions are a joke. Not because of Wikipedia, however, but rather because of open source software.
It's trivial to find instructions on how to build OpenSSL from source, especially for a foreign government with all the power and money they have. This makes the law in question nonsensical in theory and ineffective in practice.
As a public policy matter, and if I had to guess, these controls have more to do with retarding the flow of any high-end technology to the USG blacklist. The policy goal isn't to prevent organizations on the blacklist from being able to deploy RSA encryption, but rather to prevent them from sourcing technology of any sort from US companies.
Blacklisted organizations can obviously still source RSA, along with whatever platform they want to run it on, but it's presumably more expensive for them to do so.
I'm not even sure the object is to make it significantly more expensive, at least not in the financial sense. If I were a securocrat, I'd be monitoring entities forbidden from purchasing this stuff from the US in order to compile intelligence on them and their vendors - do they prefer to use TOR, or just hit a supplier in .xyz domain, or does a particular individual reliably purchase a plane ticket after a legal rejection?
This could also be one of those things where everyone involved recognizes that the policy is incoherent, but any time someone makes a serious move towards reforming it, they're informed by DoD or DHS that aspects of the policy have convenient knock-on effects that they don't want to eliminate.
If the policy isn't actively harming industry (and beyond optics it may not really be doing that much direct harm), it may seem like poor risk/reward to change it.
> is really about Wind River failing to place backdoors into this equipment?
I also felt some bad vibes from this story. There are probably hundreds of US-based companies exporting embedded crypto, but how many are exporting an RTOS for the aerospace and defense industries to China and Russia?
Careful: vxWorks applications to military/industrial work might be what got BIS's attention, but that's not what vxWorks is "for"; it's just a very popular RTOS.
The amount in question doesn't make sense for that. Nor does the voluntary disclosure. 750k isn't a scary and punishing fine when you have Intel backing you.
This could be just meant as a message to other firms. Intel might even be getting the money back through routes like tax breaks and just be playing along.
This idiocy also gives you the feeling that the Federal government is on such a roll that every agency that ever lost a battle on security wants to come back and re-win it.