First off, this is not a "quantum computer" as defined by the field. Instead it is a "quantum annealer" that relies on the quantum adiabatic theorem, though it's pretty clear it is impossible for these spin devices to remain completely adiabatic.
They mostly target combinatorial optimization problems, and largely they cannot do anything faster than a classical (non-quantum) computer (i.e. a single laptop).
I read that future versions may embed classical chips inside the D-Wave black box. Benchmarks at that point will be sort of silly.
I've taken courses in quantum mechanics and quantum computing, and I put in the effort to try and actually develop some intuition as to how they work, but I still don't have the slightest clue as to how you could build the actual physical substrate for performing those kinds of computations.
Then again, I don't feel like I understand classical computers all the way down to the metal, even after taking courses that supposedly explained that. No matter how much I learn, my mental model of real-world computing is not all that different from what it would be if I had simply decided to believe that, at the bottom, code just runs on magic.
Highly recommend reading Code: The Hidden Language of Computer Hardware and Software. The author basically starts from how a telegraph works and extrapolates modern computing from that starting point. Great read, and it helped me with exactly what you're referring to: understanding classical computers down to the metal.
Have you built an adder out of logic gates? For me that was a good exercise in bridging the gap between "piece of electronics that I understand" and "component that does computation".
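In case anyone wants to try that exercise in software first, here's a sketch of a half adder built from nothing but NAND gates (my own toy example, not from the parent comment):

```python
# A half adder from NAND gates alone: NAND is universal, so XOR and AND
# can both be composed from it, and those two give you sum and carry.
def nand(a, b):
    return 1 - (a & b)

def xor(a, b):
    # XOR from four NANDs
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def half_adder(a, b):
    s = xor(a, b)                          # sum bit
    carry = nand(nand(a, b), nand(a, b))   # AND from two NANDs
    return s, carry
```

Chain two of these (plus an OR for the carries) and you have a full adder; chain full adders and you have the ripple-carry adder inside every ALU.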
Quantum computation is the same. You figure out how to build gates (there are dozens of competing ideas), and then it's just plugging them together in the right way.
> Quantum computation is the same. You figure out how to build gates (there are dozens of competing ideas), and then it's just plugging them together in the right way.
Not exactly. Quantum computers are a kind of analog computer based on quantum mechanics, and there are several different models for doing quantum computation --- only one of them is based on gates.
Annealers such as D-Wave aren't based on quantum gates.
Instead, they slowly anneal the system to its ground state (while avoiding getting stuck in states corresponding to local minima in energy). The usefulness of this lies in the fact that calculating the ground state of a system is often an extremely difficult problem, and that in certain cases you can map your NP problem onto such a ground-state calculation.
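A classical caricature of that idea is easy to write down: *simulated* (thermal) annealing of a tiny Ising chain toward its ground state. To be clear, this is only a sketch of the optimization picture, with made-up couplings; D-Wave anneals a quantum Hamiltonian, which is a different physical process:

```python
import math
import random

# Anneal a 5-spin Ising chain with energy H = -sum_i J[i] * s[i] * s[i+1]
# toward its ground state, cooling the temperature as we go.
J = [1, 1, -1, 1]                       # hypothetical couplings

def energy(s):
    return -sum(J[i] * s[i] * s[i + 1] for i in range(len(J)))

random.seed(0)                          # deterministic toy run
s = [random.choice([-1, 1]) for _ in range(5)]
for step in range(2000):
    T = max(0.01, 2.0 * (1 - step / 2000))   # cooling schedule
    i = random.randrange(5)
    before = energy(s)
    s[i] = -s[i]                             # propose a single spin flip
    dE = energy(s) - before
    if dE > 0 and random.random() > math.exp(-dE / T):
        s[i] = -s[i]                         # reject uphill move
# The true ground-state energy of this chain is -4 (one anti-aligned bond
# at J[2]); occasionally accepting uphill moves is what lets the search
# escape local minima on the way there.
```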
Most of the models I've seen have been gate-based; there are certainly a wide variety of physical implementations for the same logical gates. You're right that the D-Wave system (if it's real) is different, but are there really that many other non-gate models?
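For what it's worth, the gate model everyone in this subthread is referring to is easy to sketch numerically: gates are just unitary matrices, and "plugging them together" is matrix multiplication. A toy NumPy example (my own, independent of any physical implementation), building a Bell state from a Hadamard and a CNOT:

```python
import numpy as np

# Gates as unitary matrices; wiring them together = matrix products.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.array([1.0, 0.0, 0.0, 0.0])          # two qubits in |00>
state = CNOT @ np.kron(H, np.eye(2)) @ state    # H on qubit 0, then CNOT
# state is now the entangled Bell state (|00> + |11>) / sqrt(2)
```

The hard part, of course, is realizing those unitaries in physical hardware with low enough error rates, which is where the dozens of competing implementations come in.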
> I still don't have the slightest clue as to how you could build the actual physical substrate for performing those kinds of computations.
I suppose a useful starting point for quantum intuition would be the double-slit experiment[1]. There's a huge amount of material written about that, so there's more chance of gaining intuition there than by diving straight into quantum computation. Another fun experiment to try and understand is the quantum bomb-detector[2].
To gain intuition about classical computation and its physical implementation, you could consider the billiard ball computer[3]. As well as being intuitive in terms of Newtonian physics, it's also reversible, which is an important principle in quantum computers.
Now, consider what would happen if we used 'quantum billiard balls' which, like the photons in the double slit and bomb detector experiments, can be in superpositions and interfere with themselves; yet can still bounce off each other to implement logic gates (photons don't bounce like that, since they're bosons, but you get the idea).
We can use the principles of the billiard ball computer to encode our computation as the paths taken by objects (the balls) through space (our circuit). We can use the principles of quantum mechanics to allow objects to 'take many paths at once', and hence perform many computations at once. That's why quantum computation seems appealing for combinatorial problems.
We can use the interference effects seen in the double slit experiment to combine the results of these computations, such that the results combine constructively for the correct answer and destructively for incorrect answers (ie. which wire/optical fibre coming out of the circuit should contain a '1' signal and which a '0' signal). By taking a measurement after this interference has taken place, we can make the probability of getting the correct answer very high.
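That constructive/destructive combination is exactly what Deutsch's algorithm does, and it's small enough to simulate in a few lines. (This is the standard textbook example rather than anything from the comments above, written with a phase oracle: |x> -> (-1)^f(x) |x>.) One evaluation of f in superposition, then interference routes all the amplitude to one outcome:

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate

def deutsch(f):
    # Decide whether f: {0,1} -> {0,1} is constant or balanced
    # with a single quantum evaluation of f.
    state = H @ np.array([1.0, 0.0])                    # superpose both inputs
    state = np.array([(-1) ** f(0), (-1) ** f(1)]) * state  # query both paths at once
    state = H @ state                                   # interfere the paths
    return "constant" if abs(state[0]) > 0.5 else "balanced"

deutsch(lambda x: 0)   # -> "constant"
deutsch(lambda x: x)   # -> "balanced"
```

Classically you'd need two evaluations of f to tell these cases apart; here the two computation paths cancel or reinforce so that one measurement suffices.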
Anything that has an energy can be turned into a temperature by dividing by the Boltzmann constant k_b (units: Joules/Kelvin).
The meaning of that temperature can be hard to interpret, though. Temperatures make the most sense when you are thinking about thermodynamics, not when you consider a single isolated particle.
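As a quick sketch of that conversion (the constants are standard SI values, the 1 eV energy is just an example):

```python
k_B = 1.380649e-23        # Boltzmann constant, J/K (exact SI value)
eV = 1.602176634e-19      # one electron-volt in joules

T = eV / k_B              # "temperature" of a 1 eV energy scale
# about 1.16e4 K -- the familiar rule of thumb that 1 eV ~ 11,600 K
```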