High-Fidelity Quantum Logic Gates (aps.org)
64 points by gk1 on Aug 6, 2016 | 10 comments



"Oxford team achieves a quantum logic gate with record-breaking 99.9% precision, reaching the benchmark required to build a quantum computer"

https://www.reddit.com/r/science/comments/4wghmo/oxford_team...


I find it interesting that 99.9% (i.e. a failure rate of 1/1,000) is considered "good".

On a traditional CPU, a wrong answer even 1 time in 1,000,000,000 would probably result in a recall.


Because of the way physics works, you'll never get a 100% reliable outcome, but there is work showing that once gate errors are below a certain threshold, all errors can be corrected by the algorithm so that the final result is effectively 100% correct (the threshold theorem; apparently error rates of up to 3% can be tolerated in some settings). The end result is perfect, you just have to get there differently.

https://en.wikipedia.org/wiki/Quantum_error_correction
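A rough back-of-the-envelope sketch of what the threshold theorem buys you, assuming an illustrative 1% threshold (the real number depends heavily on the code and error model, and is not from this paper): under code concatenation the physical error rate gets suppressed doubly exponentially in the number of levels.

    # Illustrative threshold-theorem arithmetic (not the actual Oxford result):
    # below the threshold p_th, L levels of concatenation suppress the error as
    #   p_logical ~ p_th * (p / p_th) ** (2 ** L)
    p_th = 1e-2   # assumed threshold of the code (illustrative round number)
    p = 1e-3      # physical gate error rate (the 99.9% fidelity in the article)

    for level in range(4):
        p_logical = p_th * (p / p_th) ** (2 ** level)
        print(f"concatenation level {level}: logical error rate ~ {p_logical:.1e}")

With these numbers the logical error rate drops from 1e-3 to roughly 1e-10 after three levels, at the cost of many more physical qubits and gates per logical operation.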

Disclosure: did my PhD (DPhil rather) in that group a while back (under Andrew Steane, who did research into error correction). Awesome to see their work on HN, congrats all around! :)


The 99% threshold is the fidelity required for error correction protocols to work. So imagine a thousand of these 99.9%-fidelity gates with some error correction routine on top, so that the net effect is one very reliable gate, where "very" depends on how many of these unreliable gates you use. But below 99% fidelity none of this works, and so the whole scheme (quantum computers) can't get off the ground.


Any constant bound suffices to make error correction work. It's just that the number of extra bits that you need goes up.
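For classical bits that intuition is easy to see with a repetition code and majority vote; here's a minimal sketch with purely illustrative numbers (any per-bit error rate below 0.5 can be suppressed as far as you like by spending more copies):

    # Failure probability of majority vote over n noisy copies of a bit.
    from math import comb

    def majority_vote_failure(p, n):
        """Probability that more than half of n copies are flipped."""
        return sum(comb(n, k) * p**k * (1 - p)**(n - k)
                   for k in range((n // 2) + 1, n + 1))

    p = 0.01  # per-copy error rate (illustrative)
    for n in (1, 3, 5, 9, 17):
        print(f"{n:2d} copies: failure probability ~ {majority_vote_failure(p, n):.2e}")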


Unfortunately this is not at all true for quantum error correction protocols. In the quantum world, if the gates are below the fidelity threshold, error correction actually makes things worse. Source: me doing a PhD on the subject at the moment.


Before the invention of the transistor, people also thought that classical computers would require layers of error correction to function reliably. From [1]:

> The practical problems that concerned von Neumann and the designers of the EDVAC in 1945 were the reliability of vacuum tubes and the main memory stored with mercury delay lines. [...] The invention of the transistor, integrated circuit, and error-correcting codes make von Neumann's concerns seem quaint today.

1: http://queue.acm.org/detail.cfm?id=2756508


Quantum is different.

Classical computers are constantly snapping physical signals to discrete states. This snapping is itself a kind of error correction, and it is almost the only one we use when processing (as opposed to storing or transmitting) information.

Quantum is different: even if the computer's basis states are discrete, we work on superpositions, which are continuous. Just like with an analogue computer, the small errors in those continuous amplitudes keep growing no matter how good the individual gates get.

Theorists tell us that the way around this is to have ancillary qubits that you "snap" (measure) as you go along. Done correctly, using proper error-correction codes, this can clean up the real computational qubits. But that means advanced error correction is a fundamental part of the processing.
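A minimal sketch of that idea using the textbook 3-qubit bit-flip code (not whatever codes the Oxford group actually uses): the data stays in a continuous superposition, but the two parity checks that an ancilla measurement would "snap" give a discrete syndrome that tells you which flip to undo, without ever measuring the encoded amplitudes. Here the syndrome is simply read off the simulated state, standing in for a real ancilla measurement.

    import numpy as np

    def encode(alpha, beta):
        """Encode alpha|0> + beta|1> as alpha|000> + beta|111> (8-dim statevector)."""
        state = np.zeros(8, dtype=complex)
        state[0b000] = alpha
        state[0b111] = beta
        return state

    def apply_bit_flip(state, qubit):
        """Flip the given qubit (0 = leftmost) on every basis state."""
        flipped = np.zeros_like(state)
        for idx in range(8):
            flipped[idx ^ (1 << (2 - qubit))] = state[idx]
        return flipped

    def syndrome(state):
        """Parities Z0Z1 and Z1Z2, read from a basis state in the support."""
        idx = int(np.argmax(np.abs(state)))   # both support states share these parities
        bits = [(idx >> (2 - q)) & 1 for q in range(3)]
        return bits[0] ^ bits[1], bits[1] ^ bits[2]

    alpha, beta = 0.6, 0.8                              # an arbitrary superposition
    state = encode(alpha, beta)
    state = apply_bit_flip(state, np.random.randint(3))  # one unknown bit-flip error

    s01, s12 = syndrome(state)
    correct_on = {(1, 0): 0, (1, 1): 1, (0, 1): 2}      # syndrome -> faulty qubit
    if (s01, s12) in correct_on:
        state = apply_bit_flip(state, correct_on[(s01, s12)])

    print(np.allclose(state, encode(alpha, beta)))       # True: superposition recovered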


It's a good thing of course, but these are trapped-ion qubits, so this is not exactly something that can freely scale or be widely manufactured. It is, in short, probably not the future of quantum computing, just another important step on the way.


Disclaimer: I used to work on trapped-ion systems.

I hear this a lot about trapped-ion qubits, but in their current state, no technology is scalable. The usual suspects are superconducting, quantum dot, or diamond NV-based qubits, but each technology has its own scalability problems. Superconducting qubits suffer from either requiring massive (vacuum tube sized) cavities or a ton of crosstalk. Good luck isolating superconducting LC circuits from one another on one circuit board - a reason why these are limited to only a few qubits. Quantum dots have pretty awful decoherence issues and I'm not sure they can be implanted deterministically. Moreover, I don't think people have demonstrated anything other than photon-mediated entanglement for them, which is not particularly scalable. NVs have the same implantation and entanglement problems, though at least they can be used at room temperature. Superconducting and quantum dot technologies require million-dollar dilution refrigerators and large amounts of (expensive!) helium-3.

Obviously, these technologies also have their advantages over trapped-ion qubits. But trapped-ion proponents also have their own roadmaps to scalability (surface traps created using lithography).



