
I find it interesting that 99.9% (i.e. a failure rate of 1/1000) is considered "good".

On a traditional CPU, a wrong answer once in 1,000,000,000 operations would probably result in a recall.



Because of the way physics works you'll never get a perfect outcome, but there's work showing that once the physical error rate is below a certain threshold, all errors can be corrected algorithmically so that the final result is exactly right (the threshold theorem; apparently error rates of up to 3% can be tolerated in some settings). The end result is perfect, you just have to get there differently.
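To make the threshold idea concrete, here's a toy calculation (not a real QEC simulation) using the standard heuristic that a distance-d code suppresses errors like (p/p_th)^((d+1)/2); the 1% threshold value is a made-up round number for illustration:

```python
# Toy illustration of the threshold theorem's scaling behaviour.
# Assumption: logical error rate ~ (p / p_th)^((d+1)/2), a common heuristic
# for distance-d codes; p_th = 0.01 (1%) is an illustrative round number.

def logical_error_rate(p, p_th=0.01, d=3):
    """Heuristic logical error rate for a distance-d code at physical error p."""
    return (p / p_th) ** ((d + 1) // 2)

# Below threshold (p = 0.1% < 1%), increasing the code distance
# suppresses the logical error rate exponentially:
for d in (3, 5, 7):
    print(d, logical_error_rate(0.001, d=d))
```

The point is qualitative: below threshold, spending more qubits (larger d) buys exponentially better reliability; above threshold it buys nothing.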

https://en.wikipedia.org/wiki/Quantum_error_correction

Disclosure: did my PhD (DPhil rather) in that group a while back (under Andrew Steane, he did research into error correction), awesome to see their work on HN, congrats all around! :)


The ~99% threshold is the gate fidelity required for error correction protocols to work. So imagine a thousand of these 99.9%-fidelity gates with some error correction routine on top, such that the net effect is one very reliable gate, where "very" depends on how many of the unreliable gates you use. But below 99% fidelity none of this stuff works, and so the whole scheme (quantum computers) can't get off the ground.


Any constant bound suffices to make error correction work. It's just that the number of extra bits that you need goes up.
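This is exactly the classical picture: with a repetition code, any constant physical error rate below 1/2 can be driven down by spending more bits. A minimal sketch (the function name and parameters are mine, for illustration):

```python
# Classical repetition code: encode one bit as n copies, decode by majority
# vote. Shows that any constant error rate p < 1/2 suffices; reliability
# improves as you add bits, at the cost of those extra bits.
from math import comb

def majority_error(p, n):
    """Probability that a majority of n independent copies (n odd) get flipped,
    i.e. the logical error rate after majority-vote decoding."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k)
               for k in range(n // 2 + 1, n + 1))

# At p = 10%: more copies -> lower logical error.
for n in (1, 3, 5, 9):
    print(n, majority_error(0.1, n))
```

At p = 0.1, three copies already drop the logical error from 10% to 2.8%, and it keeps falling as n grows.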


Unfortunately this is not at all true for quantum error correction protocols. In the quantum world, error correction makes things worse once gate fidelity is below the error threshold. Source: me, doing a PhD on the subject at the moment.


Before the invention of the transistor, people also thought that classical computers would require layers of error correction to function reliably. From [1]:

> The practical problems that concerned von Neumann and the designers of the EDVAC in 1945 were the reliability of vacuum tubes and the main memory stored with mercury delay lines. [...] The invention of the transistor, integrated circuit, and error-correcting codes make von Neumann's concerns seem quaint today.

1: http://queue.acm.org/detail.cfm?id=2756508


Quantum is different.

Classical computers are constantly snapping physical signals to discrete states. This snapping is itself a kind of error correction, and it is almost the only one we use when processing (as opposed to storing or transmitting) information.

Quantum is different. Even if a computer's states are discrete, we work on superpositions, which are continuous. Just like with an analogue computer, these errors keep growing no matter how good the individual gates get.

Theorists tell us that the way around this is to have ancillary qubits that you "snap" (measure) as you go along. Done correctly, using proper ECC algorithms, this cleans up the real computational qubits. But that means advanced ECC is a fundamental part of the processing.
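The "snapping the ancillas" idea can be sketched with the 3-qubit bit-flip code. For pure bit-flip errors the syndrome measurements behave classically, so this toy models the codeword as three bits; in the real code, the parities Z1Z2 and Z2Z3 are measured via ancilla qubits without collapsing the encoded superposition:

```python
# Toy sketch of syndrome extraction for the 3-qubit bit-flip code.
# Restricted to bit-flip errors, which is why a classical model suffices here.

def correct(bits):
    """Measure the two parity checks ("snap" the ancillas) and undo one flip."""
    s1 = bits[0] ^ bits[1]   # ancilla 1: parity of qubits 0 and 1
    s2 = bits[1] ^ bits[2]   # ancilla 2: parity of qubits 1 and 2
    # The syndrome (s1, s2) pinpoints a single flipped qubit without
    # revealing the encoded value itself.
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get((s1, s2))
    if flip is not None:
        bits[flip] ^= 1
    return bits

print(correct([0, 1, 0]))  # single flip on the middle qubit -> [0, 0, 0]
```

Note the ancillas only ever learn parities, never the data itself, which is what lets measurement coexist with superposition; any single bit flip is caught and undone.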



