Reading through the paper in (3) above. If I understand the text on page 26 correctly, you predict that quantum computers will not be more efficient than classical computers:
"The class of problems that can be solved efficiently by quantum computers should be identical to the class of problems that can be solved efficiently by classical computers: More precisely, we predict in this appropriately coarse-grained case that P=BQP, where P and BQP denote the complexity classes of polynomial time and bounded error quantum polynomial time, respectively."
And:
"In other words, in order to maintain a causal invariant representation, the observer must perform a sufficient level of coarse-graining to ensure that any apparent advantage obtained through the use of a quantum computer over a classical one is effectively lost."
Am I missing something fundamental (most probably)? Are you predicting that quantum computers will not be able to, for example, factor RSA keys much faster than today's non-quantum machines?
I'm not sure if that is the case or not, but if that is the prediction then they are in good company.
Nobel laureate Gerard 't Hooft also has a cellular automaton theory and one of his conclusions is: "If engineers ever succeed in making such quantum computers, it seems to me that the CAT is falsified; no classical theory can explain quantum mechanics." By "such quantum computers" he means computers that can run Shor's algorithm. "...but factoring a number with millions of digits into its prime factors will not be possible – unless fundamentally improved classical algorithms turn out to exist."
I believe most of these theories essentially suggest that the error correction required to produce a precise and reliable answer from more qubits will grow faster than the computing power added by those qubits.
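For context on what that trade-off looks like in standard quantum error correction theory: under the threshold theorem the overhead grows only polynomially in the code distance, so these theories are in effect denying that estimate. Here is a rough, illustrative sketch using textbook surface-code approximations (the `(p/p_th)^((d+1)/2)` logical-error scaling and the ~2d² physical qubits per logical qubit are standard back-of-the-envelope numbers, not tied to any specific device):

```python
def surface_code_overhead(p_phys, p_target, p_th=1e-2):
    # Textbook surface-code approximations (illustrative only):
    #   logical error rate            ~ (p_phys / p_th) ** ((d + 1) / 2)
    #   physical qubits per logical   ~ 2 * d**2
    ratio = p_phys / p_th  # must be < 1, i.e. the hardware is below threshold
    d = 3
    while ratio ** ((d + 1) / 2) > p_target:
        d += 2  # surface-code distance is odd
    return d, 2 * d * d

# Tightening the target error rate a millionfold only grows the
# per-logical-qubit overhead polynomially in the code distance.
d1, q1 = surface_code_overhead(1e-3, 1e-6)
d2, q2 = surface_code_overhead(1e-3, 1e-12)
```

The claim above, that error correction outpaces the added qubits, amounts to denying this polynomial scaling.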
> no classical theory can explain quantum mechanics
Not sure I can agree with 't Hooft on this. A GR-based theory with closed timelike curves can easily have particles travelling through time to interfere with themselves, and thus reproduce quantum phenomena like the double-slit experiment. There's been some work on this:
In your last sentence, you compare future quantum computers to “today’s” non-quantum computers, which might be a false dichotomy.
[warning: uninformed tangent]
A more optimistic interpretation could be that quantum & non-quantum machines will be similar because we have huge leaps to make in non-quantum computer architecture.
This is strictly a theoretical thought experiment for me, but it has always intrigued me that quantum computers sort of model the problem itself in the hardware circuit & shove a bunch of qubits through it.
In digital computers, we mostly model Boolean logical structures & then, in software, translate our problem into that Boolean logic. This translation into discrete steps places a limit on the theoretical efficiency.
However, perhaps there is room in analog computing hardware to more closely model specific types of optimization problems & then shove a bunch of electrons through it (shouldn’t the electrons follow the path of least resistance?).
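There is actually a precise sense in which the electrons do follow the path of least resistance: by Thomson's principle, the steady-state currents in a resistive network are exactly the ones that minimize total dissipated power, so the circuit "solves" a small quadratic minimization for free. A toy sketch in plain Python (the resistor and current values are arbitrary, chosen just to illustrate):

```python
def dissipation(i1, i_total, r1, r2):
    # Power dissipated when current i1 takes branch R1 and the rest takes R2.
    i2 = i_total - i1
    return i1 ** 2 * r1 + i2 ** 2 * r2

def physical_split(i_total, r1, r2):
    # The split Kirchhoff's laws actually produce (current divider rule).
    return i_total * r2 / (r1 + r2)

# Brute-force the minimum-dissipation split and compare with the physics.
r1, r2, i_total = 1.0, 3.0, 1.0
best_i1 = min((k / 1000 for k in range(1001)),
              key=lambda i1: dissipation(i1, i_total, r1, r2))
# best_i1 lands on physical_split(i_total, r1, r2) = 0.75: the network's
# natural current distribution is the optimum of the dissipation function.
```

So for this narrow class of quadratic problems, the "analog optimizer" intuition is literally true; the hard part is mapping more interesting optimization problems onto such hardware.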
> In your last sentence, you compare future quantum computers to “today’s” non-quantum computers, which might be a false dichotomy.
Ah, good point.
Though I was thinking more of Shor's algorithm and Grover's algorithm, which tell us the theoretical performance that could be achieved with quantum computers. Normally these are described as showing the speedup a possible quantum computer would provide relative to non-quantum computers.
So, when reading the Wolfram Model paper I cited, I read the statement regarding quantum computers as dismissing the possibility of building quantum computers capable of realising Shor's and Grover's algorithms.
But one could of course read it the flip-side way: that there are algorithms out there to be discovered that achieve the same lower-bound complexities on non-quantum machines.
Considering that the Wolfram Model is all about graphs and cellular automata, the statement should probably be read as based not on a RAM complexity model, but on something like PRAM, which accounts for parallelism.
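For concreteness, the speedups in question are easy to put rough numbers on. A back-of-the-envelope sketch (the GNFS expression is the standard heuristic cost estimate for classical factoring; constants and lower-order terms are dropped everywhere, so these are scaling comparisons, not predictions):

```python
from math import ceil, exp, log, pi, sqrt

def grover_queries(n):
    # Grover: ~ (pi/4) * sqrt(N) oracle queries, vs ~ N/2 classically on average.
    return ceil(pi / 4 * sqrt(n))

def gnfs_ops(bits):
    # Heuristic classical factoring cost (General Number Field Sieve):
    # exp((64/9)^(1/3) * (ln n)^(1/3) * (ln ln n)^(2/3)), constants dropped.
    ln_n = bits * log(2)
    return exp((64 / 9) ** (1 / 3) * ln_n ** (1 / 3) * log(ln_n) ** (2 / 3))

def shor_ops(bits):
    # Shor: polynomial, roughly O(bits^3) gate operations with textbook arithmetic.
    return bits ** 3

# For a 2048-bit RSA modulus, gnfs_ops(2048) dwarfs shor_ops(2048) by many
# orders of magnitude -- this is the gap a P=BQP-style claim has to close.
```

Reading the paper's statement the flip-side way would mean a classical algorithm exists that closes the GNFS-to-polynomial gap.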
> However, perhaps there is room in analog computing hardware to more closely model specific types of optimization problems & then shove a bunch of electrons through it (shouldn’t the electrons follow the path of least resistance?).

What you are describing is an analog computer or circuit. These definitely exist; I had to build a circuit to model the physics of a bouncing ball in a circuits class in college. However, I don't know how often analog computers are used in professional/practical applications these days. Here is some more info: https://spectrum.ieee.org/computing/hardware/not-your-father...
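The bouncing-ball circuit mentioned above has a simple digital counterpart that shows what the analog integrators are doing: integrate y'' = -g and flip (and damp) the velocity on each floor contact. A minimal sketch (all parameters arbitrary, chosen for illustration):

```python
def bounce(y0=1.0, v0=0.0, g=9.81, restitution=0.8, dt=1e-3, t_end=3.0):
    # Digital stand-in for the analog circuit: Euler-integrate y'' = -g,
    # inverting and damping the velocity at each floor contact.
    y, v, trace = y0, v0, []
    for _ in range(int(t_end / dt)):
        v -= g * dt
        y += v * dt
        if y < 0:                  # floor contact
            y = 0.0
            v = -restitution * v   # lose energy on each bounce
        trace.append(y)
    return trace
```

Each rebound peaks lower than the last, just like the decaying waveform on the analog circuit's output; the circuit simply does the same integration continuously instead of in discrete steps.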
I haven't checked the context, but the P=BQP prediction indicates a flaw in either their framework or the way they draw this conclusion. We know for a provable fact that some problems (not necessarily interesting or useful ones) can be solved exponentially faster on a quantum computer than on a classical one, at least relative to an oracle: the Deutsch-Jozsa algorithm (or its generalization, Simon's algorithm) demonstrates this.
We know problems can be solved faster on a "quantum computer" as the term is defined; we don't know that the version of QM that our reality runs on actually allows us to create such a computer, or at least scale it up to a demonstrably (not provably! Demonstrably!) superclassical level. That's what the Quantum Supremacy debate is about.
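The Deutsch-Jozsa separation is small enough to sanity-check with a classical state-vector simulation: one phase-oracle "query" made in superposition decides constant-vs-balanced, where a deterministic classical algorithm needs 2^(n-1)+1 queries in the worst case. A plain-Python sketch:

```python
from math import sqrt

def deutsch_jozsa_is_constant(f, n):
    # f: {0, ..., 2^n - 1} -> {0, 1}, promised to be constant or balanced.
    N = 2 ** n
    # H^n on |0...0> gives the uniform superposition.
    amps = [1 / sqrt(N)] * N
    # One phase-oracle query: multiply the amplitude of |x> by (-1)^f(x).
    amps = [a * (-1) ** f(x) for x, a in enumerate(amps)]
    # After the final H^n, the amplitude of |0...0> is sum(amps)/sqrt(N):
    # magnitude 1 if f is constant, exactly 0 if f is balanced.
    amp0 = sum(amps) / sqrt(N)
    return abs(amp0) > 0.5
```

For example, `deutsch_jozsa_is_constant(lambda x: 1, 4)` is `True`, while the parity function `lambda x: bin(x).count('1') % 2` is balanced and gives `False`. Of course, the simulation itself takes exponential classical time, which is exactly the point.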