
Put another way, suppose you have some linear dynamical differential equation in n variables that you solve somehow. Then, take that solution and expand it in some set of basis functions (e.g. a Fourier series). You wouldn't throw your hands up in the air and say "wow that's so complex, look at all those infinite terms in the solution!". The complexity isn't really there; it just appears to be there because you've chosen to expand your solution in a basis that makes it appear really complex. Similarly, in the MWI we see something that looks complex simply because we've chosen to expand the solution in a set of states that makes sense to us (state1 = particle at location 1, state2 = particle at location 2, ...).
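To make that concrete, here's a toy Python sketch (my own illustrative example; the ODE, interval, and threshold are arbitrary choices): solve dx/dt = -x, whose solution is pinned down by just the initial value and the decay rate, then expand it in a Fourier basis and count how many coefficients look "significant".

    # Toy example: the solution of dx/dt = -x is exp(-t), fully described
    # by two numbers (initial value and decay rate). Expanded in a Fourier
    # basis on [0, 2*pi] it nevertheless has many non-negligible terms.
    import numpy as np

    t = np.linspace(0.0, 2.0 * np.pi, 4096, endpoint=False)
    x = np.exp(-t)                       # exact solution, x(0) = 1

    coeffs = np.fft.rfft(x) / len(t)     # discrete Fourier coefficients
    print(np.sum(np.abs(coeffs) > 1e-6), "coefficients above 1e-6")
    # Lots of "significant" coefficients, but the information content is
    # still just (x(0), decay rate): the apparent complexity is an
    # artifact of the chosen basis.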


With the Fourier example, there is a constant amount of information in the system, and so the apparent complexity in the Fourier basis representation is an illusion.

Is that the case with MWI? Is there a constant amount of information at time t and t+1? Note that I see a fundamental equivalence between information and entropy (of the computational sort), and so an exponential growth of computation required to get from t to t+1 is an inescapable theoretical burden.

To put it a different way, MWI seems to reify possibility. But the space of possibilities grows exponentially in time, and so the theoretical entities grow exponentially.


Yes, there is a constant amount of information in the system. In fact, that's part of the beauty of MWI in contrast with Copenhagen. In MWI, the state at any point in time can be used to reconstruct the state at any other time. However, because of the collapse, that's not the case for Copenhagen. In other words, measurement in Copenhagen actually destroys information. As far as computational complexity goes, the same happens in classical mechanics. Start with 10^23 particles far apart but moving toward each other. Simulating the first second is simple, but once they get close together, it gets hard, with the computational complexity growing as time progresses (or, alternatively, the error growing for fixed computational resources).
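As a toy illustration of that last point (my own sketch with made-up numbers, not anything specific to QM): two particles attracting each other under an inverse-square force, integrated with a fixed time step. The energy drift, used here as a crude stand-in for simulation error, stays tiny while they're far apart and blows up at close approach.

    # Toy 1D example: two unit-mass particles with an inverse-square
    # attraction, integrated with a fixed-step scheme. Energy drift is
    # used as a rough proxy for simulation error.
    def energy(x1, x2, v1, v2):
        return 0.5 * (v1 ** 2 + v2 ** 2) - 1.0 / abs(x2 - x1)

    x1, x2, v1, v2 = -10.0, 10.0, 0.0, 0.0     # start far apart, at rest
    dt, e0, drift = 0.01, energy(-10.0, 10.0, 0.0, 0.0), []
    for _ in range(20000):
        r = x2 - x1
        a = 1.0 / (r * abs(r))                 # acceleration on particle 1
        v1, v2 = v1 + a * dt, v2 - a * dt
        x1, x2 = x1 + v1 * dt, x2 + v2 * dt
        drift.append(abs(energy(x1, x2, v1, v2) - e0))

    print("error after 100 steps:", drift[99])
    print("worst error (near close approach):", max(drift))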


I still don't follow. There is a constant amount of information as input into the system, but (from my understanding) the "bookkeeping" costs grow exponentially with time. This is different from the classical case, where the complexity is linear with respect to time. A quick Google search says that simulating quantum mechanics is NP-hard, which backs up this take. This bookkeeping is an implicit theoretical posit of a QM formalism. We can think of different ways to cash out this bookkeeping as different flavors of MWI, but we shouldn't hide this cost behind the nice formalism.

Comparing MWI to collapse interpretations, collapse fares better on this bookkeeping, since collapse puts an upper limit on the amount of quantum bookkeeping required. MWI has an exponentially growing, unbounded bookkeeping cost.
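For a sense of scale (my own back-of-the-envelope sketch, not specific to any interpretation): explicitly storing the joint state of n entangled two-level systems takes 2^n complex amplitudes, so the classical bookkeeping doubles with every additional system.

    # Back-of-the-envelope: explicit storage for the joint state of n
    # two-level systems is 2**n complex amplitudes (16 bytes each for
    # double-precision complex), doubling with every extra system.
    for n in (1, 10, 20, 30, 40):
        amplitudes = 2 ** n
        gigabytes = amplitudes * 16 / 1e9
        print(f"n = {n:2d}: {amplitudes:>15,d} amplitudes (~{gigabytes:,.3f} GB)")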


Yes, that's right, but that has to do with entanglement in QM and is not specific to MWI. In classical mechanics, a system of n particles is specified by 3n different functions of time: the three coordinates for each of the n particles. The complexity in terms of, e.g., memory then scales linearly with the number of particles.

In QM, by contrast, we have entanglement, which essentially means that we can't describe one particle separately from all the other particles (if we could, then QM would be just as "easy" to solve as classical mechanics). Instead of 3n functions of time, we have a single function of 3n variables (plus time). The complexity of such a function does not scale linearly with n (imagine, e.g., a Fourier series in one variable vs. one in two variables).
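To put rough numbers on that (my own illustration; the grid resolution and particle counts are arbitrary assumptions):

    # Rough comparison (arbitrary resolution): classically, n particles are
    # 3n coordinate functions of time; with entanglement, the state is one
    # wavefunction over 3n spatial variables, i.e. m**(3n) grid values per
    # time step if each dimension is sampled at m points.
    m = 32                                   # grid points per dimension (assumed)
    for n in (1, 2, 3, 5):
        classical = 3 * n                    # numbers per time step, classically
        quantum = m ** (3 * n)               # numbers per time step, with entanglement
        print(f"n = {n}: classical {classical}, quantum {quantum:.2e}")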

So, you're right that QM is an exponentially harder problem to solve compared to classical mechanics, but this is because of entanglement and has nothing to do with Copenhagen vs MWI.



