Before the invention of the transistor, people also thought that classical computers would require layers of error correction to function reliably. From [1]:
> The practical problems that concerned von Neumann and the designers of the EDVAC in 1945 were the reliability of vacuum tubes and the main memory stored with mercury delay lines. [...] The invention of the transistor, integrated circuit, and error-correcting codes make von Neumann's concerns seem quaint today.
Classical computers are constantly snapping physical signals to discrete states. This snapping is itself a kind of error correction, and it is almost the only kind we use when processing (as opposed to storing or transmitting) information.
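For intuition, here is a minimal Python sketch of the snapping a logic gate performs. The voltage levels and threshold are invented for illustration; real logic families define their own levels:

```python
# A minimal sketch of how digital logic "snaps" an analog voltage back
# to a discrete level at every stage. The 5 V levels are illustrative.

def snap(voltage, v_high=5.0, threshold=2.5):
    """Restore a noisy analog voltage to the nearest logic level."""
    return v_high if voltage >= threshold else 0.0

# A signal that has drifted to 3.7 V is restored to a clean 5.0 V,
# so the noise does not propagate to the next gate.
print(snap(3.7))  # 5.0
print(snap(1.2))  # 0.0
```

Because every gate restores the signal, small analog errors die at each stage instead of compounding.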
Quantum computing is different. Even if a quantum computer's states are discrete, we operate on superpositions, whose amplitudes are continuous, so gate errors are continuous too and there is no level to snap back to. Just as with an analogue computer, these errors keep accumulating no matter how good the individual gates get.
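A toy numerical sketch (my own illustration, not from [1], with a made-up per-gate error magnitude) shows the drift. Each gate is meant to be the identity but over-rotates the state by a tiny random angle, and with nothing to snap to, the deviation grows with circuit depth:

```python
# Simulate a qubit as a real 2-vector and apply many nominally-identity
# gates, each with a small random rotation error. Fidelity with the
# intended state decays as the continuous errors accumulate.
import numpy as np

rng = np.random.default_rng(0)
state = np.array([1.0, 0.0])   # start in |0>
ideal = state.copy()           # the state a perfect circuit would keep

for _ in range(1000):          # 1000 gates deep
    eps = rng.normal(0, 1e-2)  # ~1% angle error per gate (invented)
    c, s = np.cos(eps), np.sin(eps)
    state = np.array([[c, -s], [s, c]]) @ state  # slightly wrong gate

# Fidelity drifts below 1 and keeps falling with depth; no gate ever
# restores the state the way a logic gate restores a voltage.
print(abs(ideal @ state) ** 2)
```

Making the gates better shrinks `eps`, but the random walk in angle still grows without bound as the circuit gets deeper.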
Theorists tell us that the way around this is to add ancillary qubits that you "snap" (measure) as you go along. Done correctly, using proper error-correcting codes, this cleans up the real computational qubits without destroying the data they carry. But that means sophisticated error correction is a fundamental part of the processing itself.
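A minimal sketch of the idea, using the classical shadow of the 3-qubit bit-flip code: the parity checks here play the role of the ancilla measurements, which reveal where an error sits without reading out the encoded value. This is illustrative only; real quantum codes act on amplitudes and must also handle phase errors:

```python
# Classical picture of the 3-qubit bit-flip code: a logical bit is
# stored redundantly, and pairwise parities (the "syndrome") locate a
# single flipped bit without ever looking at the logical value itself.

def encode(bit):
    return [bit, bit, bit]             # logical bit -> 3 physical bits

def syndrome(q):
    # In a quantum code these parities are measured via ancilla qubits,
    # which collapses the error but leaves the encoded data untouched.
    return (q[0] ^ q[1], q[1] ^ q[2])

def correct(q):
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome(q))
    if flip is not None:
        q[flip] ^= 1                   # undo the identified bit flip
    return q

codeword = encode(1)
codeword[0] ^= 1                       # one physical bit gets flipped
print(correct(codeword))               # -> [1, 1, 1], error removed
```

The "snapping" happens at the syndrome measurement: a continuous drift is projected onto "no error" or "this qubit flipped", which a discrete correction can then undo.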
[1]: http://queue.acm.org/detail.cfm?id=2756508