I think you can mostly achieve this by using the ciphertext from the post-quantum algorithm as the plaintext for the classical algorithm (or vice versa).
I wouldn’t trust any solution that combines the algos in some more "clever" way, as the whole point, like you say, is to guard against the risk of unproven new algorithms.
Please do not advocate rolling your own crypto — there are tons of pitfalls in nesting encryption algorithms the way you've just suggested. Besides, symmetric encryption isn't the biggest target of quantum computers. In modern systems like TLS, asymmetric encryption is really only used to exchange a symmetric key, and there are safe algorithms, implemented by people who know what they're doing, already available in most libraries.
To expand on this a little bit: TLS and other cryptosystems usually work by using an asymmetric algorithm (X25519, Kyber, etc.) to create a shared key, and then using an appropriate symmetric cipher mode (AES-GCM, ChaCha20-Poly1305, etc.) to encrypt the actual data. The symmetric part is not known to have any special weakness to quantum attack. Grover's algorithm offers at most a quadratic speedup, which effectively halves the key length, so if you use a 256-bit key that's irrelevant.
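To put rough numbers on the Grover point (a back-of-envelope sketch, not a statement about any concrete quantum machine): a Grover search over a k-bit keyspace takes on the order of 2^(k/2) operations, so a 256-bit key still leaves about 2^128 work.

```python
import math

# Grover search over a k-bit keyspace takes roughly sqrt(2**k) quantum
# operations. For a 256-bit key that is still 2**128 work, which is
# far beyond any plausible attacker's budget.
grover_ops = math.isqrt(2**256)
assert grover_ops == 2**128
```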
So for a hybrid classical/PQ system you're not necessarily looking to combine a whole classical and a whole post-quantum encryption system: you just want to combine the key exchanges, since those are the part whose security is in doubt. That way you don't have to redesign the symmetric layer, which would be far more disruptive to the whole protocol. Usually the combination is done more or less by running both key exchanges, then hashing their transcripts and the derived keys together to create a final symmetric key.
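A minimal sketch of that combiner idea, using only stdlib HMAC/SHA-256 to implement HKDF (RFC 5869). The function names and labels here are made up for illustration — real hybrid schemes fix the exact input ordering and domain-separation labels, so don't treat this as any standardized construction:

```python
import hashlib
import hmac
import os

def hkdf_extract(salt: bytes, ikm: bytes) -> bytes:
    """HKDF-Extract (RFC 5869) with SHA-256."""
    return hmac.new(salt, ikm, hashlib.sha256).digest()

def hkdf_expand(prk: bytes, info: bytes, length: int = 32) -> bytes:
    """HKDF-Expand (RFC 5869) with SHA-256."""
    okm, block, counter = b"", b"", 1
    while len(okm) < length:
        block = hmac.new(prk, block + info + bytes([counter]),
                         hashlib.sha256).digest()
        okm += block
        counter += 1
    return okm[:length]

def combine_shared_secrets(classical_ss: bytes, pq_ss: bytes,
                           classical_transcript: bytes,
                           pq_transcript: bytes) -> bytes:
    """Hypothetical hybrid combiner: feed both shared secrets and both
    key-exchange transcripts through one KDF. The output stays secret
    as long as at least one of the input secrets does."""
    ikm = classical_ss + pq_ss
    transcript_hash = hashlib.sha256(classical_transcript + pq_transcript).digest()
    prk = hkdf_extract(salt=b"hybrid-kex-demo-v1", ikm=ikm)
    return hkdf_expand(prk, info=b"session key" + transcript_hash, length=32)

# Stand-ins for real X25519 / ML-KEM outputs:
session_key = combine_shared_secrets(os.urandom(32), os.urandom(32),
                                     b"classical transcript bytes",
                                     b"pq transcript bytes")
assert len(session_key) == 32
```

The point of hashing the transcripts in, alongside the raw secrets, is to bind the derived key to the exact public values both sides saw, which is one of the details the mailing-list debates below are about.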
There has been considerable discussion on mailing lists about exactly what the most appropriate way to hash everything together is. For example, X-Wing (https://eprint.iacr.org/2024/039.pdf) skips hashing the Kyber ciphertext, because Kyber internally verifies the integrity of its ciphertexts. But this proposal has taken some flak (Turbolaser fire?) for being a premature optimization, and for not generalizing as well to other hypothetical combinations.
What are these pitfalls? My ignorance of the field is vast, but [opens mouth in preparation for feet] if nesting encryption weakens it, doesn’t that imply that to decipher someone’s private message we should just encrypt it a few more times?
If you're just talking about basic encryption achieving no properties other than security against passive attack, then just nesting is probably fine. But in more complex systems, things end up being fairly subtle, so cryptographers aim for very specific security properties for the building blocks, and then try to combine these in a way that can be proved secure, at least in some attack model. This reduces the likelihood of a complex attack like the TLS triple handshake attack, where the protocol looks fine intuitively but can be broken by some weird pattern of forwarding messages between multiple parties.
So for example, nesting does not preserve IND-CCA security ("indistinguishability under chosen ciphertext attack"). Suppose you set ciphertext = outer_encrypt(outer_key, inner_encrypt(inner_key, data)). If the outer encryption system is broken, an attacker can strip the outer layer and re-encrypt it. This produces a different ciphertext, because if either layer is aiming for IND-CCA security, the encryption is necessarily randomized. Being able to mint a fresh valid ciphertext for the same plaintext in this way violates the IND-CCA security goal, so if another part of the system is designed assuming the cipher is IND-CCA-secure, its security is now at risk.
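A toy demonstration of that stripping attack. The "cipher" here is a deliberately simplified randomized stream construction (SHA-256 in counter mode with a random nonce) standing in for a real scheme — it's for illustrating malleability only, not for actual use:

```python
import hashlib
import os

def keystream(key: bytes, nonce: bytes, n: int) -> bytes:
    """Toy keystream: SHA-256(key || nonce || counter) blocks."""
    out, ctr = b"", 0
    while len(out) < n:
        out += hashlib.sha256(key + nonce + ctr.to_bytes(4, "big")).digest()
        ctr += 1
    return out[:n]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """Randomized encryption: a fresh nonce per call, so two
    encryptions of the same plaintext differ."""
    nonce = os.urandom(16)
    ks = keystream(key, nonce, len(plaintext))
    return nonce + bytes(a ^ b for a, b in zip(plaintext, ks))

def decrypt(key: bytes, ciphertext: bytes) -> bytes:
    nonce, body = ciphertext[:16], ciphertext[16:]
    ks = keystream(key, nonce, len(body))
    return bytes(a ^ b for a, b in zip(body, ks))

inner_key, outer_key = os.urandom(32), os.urandom(32)
msg = b"attack at dawn"

# Nested encryption: outer(inner(msg))
ct = encrypt(outer_key, encrypt(inner_key, msg))

# An attacker who has broken the outer layer strips it and re-encrypts
# the untouched inner ciphertext, producing a *different* valid
# ciphertext for the same plaintext -- exactly what IND-CCA forbids.
forged = encrypt(outer_key, decrypt(outer_key, ct))
assert forged != ct
assert decrypt(inner_key, decrypt(outer_key, forged)) == msg
```

Note the attacker never learns inner_key or the plaintext; being able to re-randomize the ciphertext is already enough to break the IND-CCA goal for the combined scheme.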
The attack surface is even broader if the system supports multiple combinations of ciphers, where an attacker might strip off one cipher layer and replace it with a different one.
Plenty. For one, oracles can show up in many weird places. If you want double encryption, run TLS with AES-256-GCM over WireGuard (which uses ChaCha20-Poly1305). You'd need to key both with hybrid post-quantum crypto, though.
FWIW I was trying to convey the idea of "not rolling your own": rather than writing anything yourself, you could just encrypt twice using established implementations.
But as you and less_less rightly point out, it's not at all that simple for the majority of cases we usually mean (or at least wish for) when we say "encryption".