Old standards like RSA and ECDSA are based on the presumed classical hardness of the discrete log / hidden subgroup problem, which is known to be quantum-easy. New ones are based on lattice shortest-vector problems, which are presumed to be both classically and quantumly hard. But there's a risk that they turn out to be easy for both.
Is there support in the standards for combining the strength of old and new methods, so that a successful attack requires breaking both types of problems?
For other security mechanisms, like PKI, things are more complicated (and inefficient).
And one can argue that even if in theory the above gives you a better security margin, the whole system becomes more complicated, and it may be practically less secure because of the additional moving parts. That is why there is no unanimous consensus: agencies in Europe recommend it, but the NSA does not.
Finally, note that the 3rd standard (SLH-DSA) is PQC but is based on old and well-understood primitives (SHA2/SHA3), so it can arguably be used by itself.
Yep, that's the approach being taken by companies like Cloudflare[^cloudflare] and Google[^google], but notably the NSA recommends _against_ that approach[^nsa] (though the cybersecurity community at large thinks that may be because using hybrids now is disadvantageous to them breaking communications in the future).
I think you can mostly achieve this by using the ciphertext from the quantum algorithm as the plaintext to the classical algorithm (or vice versa).
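A minimal sketch of that nesting idea. The toy SHA-256 counter-mode stream cipher below is a stand-in for illustration only, not real crypto, and the "pq" / "classical" labels are just illustrative names:

```python
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    # Toy SHA-256 counter-mode keystream; a stand-in for a real cipher.
    out = bytearray()
    ctr = 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(8, "big")).digest()
        ctr += 1
    return bytes(out[:n])

def toy_encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)
    stream = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, stream))

def toy_decrypt(key: bytes, ciphertext: bytes) -> bytes:
    nonce, body = ciphertext[:16], ciphertext[16:]
    stream = _keystream(key, nonce, len(body))
    return bytes(a ^ b for a, b in zip(body, stream))

# Nesting: the classical layer encrypts the PQ layer's ciphertext,
# so an attacker must break both layers to reach the plaintext.
pq_key = secrets.token_bytes(32)
classical_key = secrets.token_bytes(32)
message = b"attack at dawn"

nested = toy_encrypt(classical_key, toy_encrypt(pq_key, message))
recovered = toy_decrypt(pq_key, toy_decrypt(classical_key, nested))
```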
I wouldn’t trust any solution that combines the algos in some more "clever" way, as the whole point, like you say, is to guard against the risk of unproven new algorithms.
Please do not advocate rolling your own crypto - there are tons of pitfalls in nesting encryption algorithms like what you've just suggested. Besides, symmetric encryption algorithms aren't the biggest target of quantum computers, and in modern systems like TLS we really only use asymmetric encryption to exchange a symmetric key, and there are safe algorithms implemented by people who know what they're doing already available in most libraries.
To expand on this a little bit: TLS and other cryptosystems usually work by using an asymmetric algorithm (X25519, Kyber etc) to create a shared key, and then using an appropriate symmetric cipher mode (AES-GCM, ChaCha20/Poly1305 etc) to encrypt the actual data. The symmetric part is not known to have any special weakness to quantum attack. There is a possibility of small speedups due to Grover, but if you use a 256-bit key then those are irrelevant.
So for a hybrid classical/PQ system you're not necessarily looking to combine a whole classical and post-quantum encryption system: you just want to combine the key exchanges, since those are the part where the security is more in doubt, and then you don't have to redesign the symmetric layer, which would be more disruptive to the whole protocol. Usually the combination is done more or less by running both key exchanges, then hashing both their transcripts and the derived keys together to create a final symmetric key.
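The combining step might look roughly like this. This is a sketch using a minimal single-block HKDF-SHA256; real protocols like TLS 1.3 use their own key schedules, and the function names here are illustrative:

```python
import hashlib
import hmac

def hkdf_sha256(salt: bytes, ikm: bytes, info: bytes, length: int = 32) -> bytes:
    # Minimal HKDF (RFC 5869), limited to a single 32-byte expand block.
    prk = hmac.new(salt, ikm, hashlib.sha256).digest()
    return hmac.new(prk, info + b"\x01", hashlib.sha256).digest()[:length]

def combine_key_exchanges(classical_ss: bytes, pq_ss: bytes, transcript: bytes) -> bytes:
    # Concatenate both shared secrets as input keying material, and bind the
    # handshake transcript so the key depends on every public message sent.
    return hkdf_sha256(b"\x00" * 32, classical_ss + pq_ss, transcript)

# An attacker who recovers only one of the two shared secrets
# still can't predict the final key.
key = combine_key_exchanges(b"A" * 32, b"B" * 32, b"handshake transcript")
```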
There has been considerable discussion on mailing lists on exactly what is the most appropriate way to hash everything together. For example, X-Wing https://eprint.iacr.org/2024/039.pdf skips hashing the Kyber part, because Kyber internally verifies the integrity of its ciphertexts. But this proposal has taken some flak (Turbolaser fire?) for being a premature optimization, and for not generalizing as well to other hypothetical combinations.
What are these pitfalls? My ignorance of the field is vast, but [opens mouth in preparation for feet] if nesting encryption weakens it doesn’t that imply that to decipher someone’s private message we should just encrypt it a few more times?
If you're just talking about basic encryption achieving no properties other than security against passive attack, then just nesting is probably fine. But in more complex systems, things end up being fairly subtle, so cryptographers aim for very specific security properties for the building blocks, and then try to combine these in a way that can be proved secure, at least in some attack model. This reduces the likelihood of a complex attack like the TLS triple handshake attack, where the protocol looks fine intuitively but can be broken by some weird pattern of forwarding messages between multiple parties.
So for example, nesting does not preserve IND-CCA security ("indistinguishability under chosen ciphertext attack"). Suppose you set ciphertext = outer_encrypt(outer_key, inner_encrypt(inner_key, data)). If the outer encryption system is broken, then an attacker can strip the outer layer and re-encrypt it. This will result in a different ciphertext, because if either layer is aiming for IND-CCA security, the encryption is necessarily randomized. Being able to modify ciphertexts in this way violates the IND-CCA-security goal. Then if another part of the system is designed assuming that the cipher is IND-CCA-secure, then its security is now at risk.
The attack surface is even broader if the system supports multiple combinations of ciphers, where an attacker might strip off one cipher layer and replace it with a different one.
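To make the strip-and-re-encrypt attack concrete, here's a sketch with a toy randomized cipher (a stand-in, not real crypto; plaintexts capped at 32 bytes for brevity). An attacker who has broken the outer layer can mint a fresh, valid ciphertext for the same plaintext, which is exactly what IND-CCA forbids:

```python
import hashlib
import secrets

def toy_encrypt(key: bytes, pt: bytes) -> bytes:
    # Randomized toy cipher: fresh nonce + SHA-256-derived keystream.
    nonce = secrets.token_bytes(16)
    stream = hashlib.sha256(key + nonce).digest()[: len(pt)]
    return nonce + bytes(a ^ b for a, b in zip(pt, stream))

def toy_decrypt(key: bytes, ct: bytes) -> bytes:
    nonce, body = ct[:16], ct[16:]
    stream = hashlib.sha256(key + nonce).digest()[: len(body)]
    return bytes(a ^ b for a, b in zip(body, stream))

inner_key = secrets.token_bytes(32)
outer_key = secrets.token_bytes(32)  # assume the attacker has recovered this
data = b"secret"

ct = toy_encrypt(outer_key, toy_encrypt(inner_key, data))

# Attack: strip the outer layer, then re-encrypt it with fresh randomness.
forged = toy_encrypt(outer_key, toy_decrypt(outer_key, ct))
```

`forged` differs from `ct` (fresh nonce) yet decrypts to the same plaintext, so the nested construction is malleable even though the inner layer was never touched.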
Plenty. You can get oracles showing up in many weird places, for one. If you want double encryption, run AES-256-GCM TLS over WireGuard (which uses ChaCha20). You need to key both with hybrid post-quantum crypto, though.
FWIW I was trying to convey the idea of "not rolling your own", in that instead you could just encrypt twice by using established implementations instead of rolling your own.
But as you and less_less rightly point out, it's not at all that simple for the majority of cases we usually mean (or at least wish for) when we say "encryption".
We did an updated version of the Cryptographic Right Answers for PQC. Given how quick the field is moving, I'm sure we're going to have to edit it a bunch :)
I have no idea how one person could have this much sophistication in their evening/weekend work on their dots: but for an example of some hard core BQP as daily driver:
The IBM/NSA teams discovering differential cryptanalysis and sitting on it until Biham and Shamir independently discovered it a decade later is a better example.
And even then, they did use that knowledge to make DES more resistant to it rather than simply ignoring it. It wasn’t made public because at the time (early-to-mid ’70s) it would’ve made most other block ciphers very breakable. DES wasn’t broken until the late ’90s.
CryptoAG was about backdooring an implementation not an algorithm or finding a novel attack.
The fact that the NSA continually tries to backdoor implementations suggests that they likely do not have some magic method of breaking encryption that doesn't rely on brute force or on weakening specific components in the implementation of cryptographic systems.
I know the difference, and I'm aware that the Crypto AG scandal is not the same thing as not selecting better curves or withholding new techniques or exploits.
On the other hand, the end result is the same: lower security. It's irrelevant whether you backdoor an implementation (it was actually weakening the algorithm itself), design something with backdoors, tap dark fibers or lobby politicians for escrow laws.
It's about the spirit of the motivation, not how you achieve it.
Not Kyber-512, because of questionable long-term security. I'd say the strongest one your application can tolerate (a level 5 if you can tolerate it, otherwise a level 3), but other people have other priorities.
Edit: but don't use a level 5 alone if you can use a level 3 hybrid. The risk's still too high to use these algorithms alone.
> High security level. This system has a 2^128 security target; breaking it has similar difficulty to breaking NIST P-256, RSA with ~3000-bit keys, strong 128-bit block ciphers, etc.