Mar 17 2017
 

Recently, I answered a question on Quora on the possibility that we live in a computer simulation.

Apparently, this is a hot topic. The other day, there was an essay on it by Sabine Hossenfelder.

I agree with Sabine’s main conclusion, as well as her point that “the programmer did it” is no explanation at all: it is just a modern version of mythology.

I also share her frustration, for instance, when she reacts to the nonsense from Stephen Wolfram about a “whole civilization” “down at the Planck scale”.

Sabine makes a point that discretization of spacetime might conflict with special relativity. I wonder if the folks behind doubly special relativity might be inclined to offer a thought or two on this topic.

In any case, I have another reason why I believe we cannot possibly live in a computer simulation.

My argument hinges on an unproven conjecture: my assumption that scalable quantum computing is not really possible, the threshold theorem notwithstanding. Most supporters of quantum computing believe, of course, that the threshold theorem is precisely what makes scalable quantum computing possible: if the error rate of an error-correcting quantum computer can be kept below a certain threshold, it can emulate an arbitrary-precision quantum computer to any desired accuracy.

But I think this is precisely why the threshold will never be reached. One of these days, someone will prove a beautiful theorem showing that no large-scale quantum computer can ever be operated with its error rate below that threshold; hence scalable quantum computing is just not possible.
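
To make that promise concrete, here is a minimal numerical sketch of the textbook concatenated-code scaling behind the threshold theorem (the threshold and physical error rates below are assumed, purely illustrative values, not real hardware numbers):

```python
# What the threshold theorem promises, using the textbook concatenated-code
# scaling p_L = p_th * (p / p_th) ** (2 ** k).
# Both rates below are illustrative assumptions, not measured hardware figures.

p_th = 1e-4      # assumed threshold error rate per physical gate
p_phys = 5e-5    # assumed physical error rate, below the threshold

def logical_error_rate(p, k):
    """Logical error rate after k levels of concatenation."""
    return p_th * (p / p_th) ** (2 ** k)

for k in range(5):
    print(f"level {k}: logical error rate ~ {logical_error_rate(p_phys, k):.2e}")

# Below threshold, the logical error rate falls doubly exponentially with k:
# that is the theorem. My conjecture is simply that physical hardware can
# never be built and kept below p_th at scale.
```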

Now what does this have to do with us living in a simulation? Countless experiments show that we live in a fundamentally quantum world. Contrary to popular belief (and many misguided popularizations), this does not mean a discretization at the quantum level. What it does mean is that even otherwise discrete quantities (e.g., the two spin states of an electron) are governed by continuum variables (the phase of the wavefunction).
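
To see what I mean, take a single electron spin: any measurement yields one of two discrete outcomes, yet the state itself is parametrized by continuous angles. A minimal sketch in plain numpy (the specific angles are arbitrary):

```python
import numpy as np

# A single electron spin: two discrete basis states ("up" and "down"),
# yet the state is parametrized by *continuous* angles theta and phi.
def spin_state(theta, phi):
    """|psi> = cos(theta/2)|up> + exp(i*phi)*sin(theta/2)|down>."""
    return np.array([np.cos(theta / 2),
                     np.exp(1j * phi) * np.sin(theta / 2)])

# The relative phase phi can take any real value: a continuum variable
# hiding behind an apparently two-valued, discrete quantity.
psi = spin_state(theta=np.pi / 3, phi=0.1234567)  # arbitrary illustrative angles
print(psi, "norm =", abs(np.vdot(psi, psi)))
```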

This is precisely what makes a quantum computer powerful: like an analog computer, it can run certain algorithms more efficiently than a digital computer, because whereas a digital computer operates on a countable set of discrete digits, a quantum or analog computer operates with the uncountably infinite set of states offered by continuum variables.

Of course a conventional analog computer is very inaccurate, so nobody has ever seriously proposed using one to factor 1000-digit numbers.

This quantum world in which we live, with its richer structure, can be simulated only inefficiently using a digital computer. If that weren’t the case, we could use a digital computer to simulate a quantum computer and get on with it. But this means that if the world is a simulation, it cannot be a simulation running on a digital computer. The computer that runs the world has to be a quantum computer.
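
Just how inefficiently? A back-of-the-envelope sketch, whose only assumptions are that the digital simulator stores the full state vector and spends 16 bytes per double-precision complex amplitude:

```python
# Memory needed merely to *store* the state vector of n two-state quantum
# systems on a digital computer: 2**n complex amplitudes.
BYTES_PER_AMPLITUDE = 16  # double-precision complex number

for n in (30, 50, 100, 300):
    amplitudes = 2 ** n
    print(f"{n:3d} qubits: 2^{n} amplitudes, ~{amplitudes * BYTES_PER_AMPLITUDE:.2e} bytes")

# Around 50 qubits this already exceeds the RAM of the largest supercomputers;
# a few hundred qubits exceed the number of atoms in the observable Universe.
# And that is just storage, before a single operation is simulated.
```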

But if quantum computers do not exist… well, then they cannot simulate the world, can they?

Two further points about this argument. First, it is purely mathematical: I am offering a mathematical line of reasoning that no quantum universe can be a simulated universe. It is not a limitation of technology, but a (presumed) mathematical truth.

Second, there is the oft-proposed counterargument that perhaps the simulation is set up so that we do not get to see the discrepancies caused by an inefficient simulation; i.e., the programmer cheats and erases the glitches from our simulated minds. But I don’t see how that could work either. For this to work, the algorithms employed by the simulation must anticipate not only all the possible ways in which we could ascertain the true nature of the world, but also assess all the consequences of altering our states of mind. I think it quickly becomes evident that this really cannot be done without, well, simulating the world correctly, which is what we were trying to avoid… so no, I do not think it is possible.

Of course, if tomorrow someone announces that they have beaten the error-correction threshold and full-scale, scalable quantum computing is now a reality, my argument goes down the drain. But frankly, I do not expect that to happen.

 Posted at 11:34 pm

  4 Responses to “Deja vu”

  1. Do we know for certain that we live in a quantum world, and not just a digital simulation of one? That is, do we have any practical or theoretical basis for saying that it is not possible for a digital simulation to be good enough to have fooled us, so far at least?

  2. Actually, yes, we have pretty strong reasons to believe that we do not live in a digital simulation. Specific quantum effects, including the two-slit experiment and the Aharonov-Bohm effect, not to mention the whole decoherence business that hinders quantum computing, simply would not exist in a self-consistent manner in a low-fidelity digital simulation. And a high-fidelity simulation (i.e., one that is good enough to fool us) is not possible on a digital computer.

  3. Why is it that a good-enough sim is not possible? I can recall reading from several independent and seemingly credible sources that a digital computer, while not capable of perfectly simulating an analog system, could do so to any desired degree of precision, and we do not seem to be anywhere remotely close to the theoretical performance limits of digital computing. Is there some kind of qualitative difference operating in this matter that simply can’t be smoothed over?

  4. So then, let’s play a game. Everybody is after quantum computers, right? That’s because quantum computers, using the rules of the quantum world, can do things that digital computers cannot, namely run certain algorithms efficiently (in polynomial time).

    But wait, silly us… let’s just use a digital computer to simulate our quantum computer! After all, we can do it… and it doesn’t have to be perfect, so long as it reaches the desired precision! So why are we wasting all this time trying to discover a way to build a scalable quantum architecture, instead of just running a digital simulation on a supercomputer?

    But you see, that’s it. Even a “good enough” simulation will not run in polynomial time. In other words, even to simulate a relatively small part of our quantum world (like, say, a single raindrop!) with “good enough” precision, you’d end up using all the resources of our entire Universe and then some, and still not finish the calculation until eons pass. That’s because a^x (for any a > 1) eventually exceeds x^n, no matter how close a is to 1 and no matter how large n is… and it typically overtakes x^n long before the value of x (e.g., the number of particles in your simulation) becomes particularly large. (See the quick numerical check below.)

    This is the qualitative difference. Simulating a quantum world, even imperfectly, is (as far as anyone can tell) exponentially hard for a digital computer. (And then there is the fact that it is hard to tell what “good enough” means, when a quantum interferometric experiment can be sensitive to arbitrarily small errors.)
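
Here is that quick numerical check of the growth claim above (the base a and the polynomial degree n are arbitrary illustrative choices, nothing more):

```python
# Exponential vs. polynomial growth: a**x overtakes x**n for good
# surprisingly early, even with a base barely above 1 and a high degree.
a, n = 1.1, 10   # illustrative values

x = 2            # start past x = 1, where both sides are trivially small
while a ** x <= x ** n:
    x += 1
print(f"{a}^x exceeds x^{n} for all x >= {x}")  # x = 686 for these values

# If x counts the particles being simulated, the exponential side is what
# a classical (digital) simulation pays; no polynomial amount of extra
# hardware keeps up for long.
```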