Quantum physics seems to be the ultimate proof that the universe is a simulation.
The universe, intuitively, seems to be analog and continuous. That "feels" right to us. But quantum physics shows that it is actually discrete, and that is exactly how computer simulations work! They use very small time steps to make things appear continuous. We know that below certain time scales, things become essentially random, which is consistent with a computer simulation: you can't accurately simulate something that happens in less than one "frame" of time. There is a whole area of mathematics that deals with how to make simulations work accurately[wikipedia.org] given the limitation of discrete time steps.
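To make the "frame of time" point concrete, here is a toy Python sketch (the step size and frequencies are made up, nothing physical): anything that oscillates faster than one time step either vanishes or masquerades as something slower.

```python
import math

def sample_wave(frequency_hz, dt, steps):
    """Sample a sine wave at a fixed simulation time step (one "frame")."""
    return [math.sin(2 * math.pi * frequency_hz * n * dt) for n in range(steps)]

dt = 0.01                        # one "frame": 10 ms, i.e. 100 frames per second
slow = sample_wave(5, dt, 8)     # 5 Hz: comfortably slower than the frame rate
fast = sample_wave(105, dt, 8)   # 105 Hz: oscillates faster than one frame

print([f"{x:+.2f}" for x in slow])
print([f"{x:+.2f}" for x in fast])  # prints the same values as the 5 Hz wave
```

Sampled at 100 frames per second, the 105 Hz wave produces the same readings as the 5 Hz one, so an observer inside the simulation has no way to tell them apart.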
The same happens with physical sizes. Below the Planck scale[wikipedia.org] the universe starts to break down and become random. This is exactly how things would work if the universe were using binary arithmetic. Suppose that every particle in the universe has a coordinate. You can represent its position over a vast scale, but only with limited accuracy. The Planck scale is that limit, and it indirectly tells us how many bits are in the coordinate field of each particle. When we try to measure the position of something accurately, we find that the position becomes random. And if you try to measure its speed at finer resolution than one "frame" of time, it becomes less accurate. Worse yet: the only way we can measure the position or speed of a simulated particle is by comparing it to another simulated particle, which introduces yet more error. We are ultimately limited by the accuracy of the simulation.
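Here is a back-of-the-envelope Python sketch of that coordinate-field idea. The Planck length and the rough size of the observable universe are real figures; the assumption that the simulator stores each position as a whole number of Planck lengths is the speculative part.

```python
import math

# Toy model: each coordinate is stored as a whole number of Planck lengths.
PLANCK_LENGTH = 1.616e-35      # metres
UNIVERSE_DIAMETER = 8.8e26     # metres, rough diameter of the observable universe

# Bits needed for one axis to span the observable universe at Planck resolution.
bits_needed = math.ceil(math.log2(UNIVERSE_DIAMETER / PLANCK_LENGTH))
print(f"bits per axis: {bits_needed}")          # about 206

# A narrower field trades range for resolution: 64 bits at Planck resolution
# reaches less than a femtometre either side of the origin.
reach_64_bit = (2 ** 63) * PLANCK_LENGTH
print(f"64-bit reach: {reach_64_bit:.3e} m")    # roughly 1.5e-16 m
```

In this toy model, roughly 200 bits per axis would give every particle a Planck-resolution coordinate across the observable universe, which is exactly the "vast scale, limited accuracy" trade-off described above.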
One side benefit of this is that we have an awesome source of statistically predictable randomness. Quantum computers are actually using the randomness of the simulator to take advantage of CPU cycles that are "outside" of our universe. Within the simulator, we can only build a computer that is so fast. But if we find a way to tap into the computing power of the simulator, say by exploiting the side effects of one of its built-in functions, then we can compute a result faster than anything we can do ourselves. It is like calling into "native code" while we are running in interpreted bytecode.
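The "native code" analogy is easy to demo inside Python itself (this says nothing about actual quantum hardware; it is just the interpreter-versus-native comparison): the same work runs much faster when handed to a built-in implemented in C.

```python
import time

# "Bytecode": add up the numbers one step at a time inside the interpreter.
def interpreted_sum(values):
    total = 0
    for v in values:
        total += v
    return total

values = list(range(10_000_000))

start = time.perf_counter()
interpreted_sum(values)
slow = time.perf_counter() - start

# "Native call": the built-in sum() does the same loop in C, below the bytecode.
start = time.perf_counter()
sum(values)
fast = time.perf_counter() - start

print(f"interpreted loop: {slow:.3f} s")
print(f"built-in sum():   {fast:.3f} s")   # typically several times faster
```

Same computation, same machine; the only difference is whether the loop runs inside the interpreter or one level below it.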
Another indication that we are in a simulation is that quantum physics shows us that wave functions collapse when we observe them. That makes sense: why should the universal simulator waste time calculating quantities that are not currently being measured? Imagine a vast number of inputs, a vast number of calculations that produce outputs, and a smaller number of observers of those outputs. You can easily optimize away anything that is not being observed. But we found a way to notice the side effects of not calculating certain values. It's like a side-channel attack on an encryption algorithm: you can tell how much of a password is correct, without ever seeing the output, just by measuring how long the check took or how much power the computer consumed.
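As a toy sketch of both ideas at once, here is a "lazy universe" in Python: a value is only computed the first time someone observes it, and the time an observation takes leaks whether that work had already been done. (The class, the sleep, and the hash are all invented for illustration.)

```python
import time

class LazyUniverse:
    """Toy model: a value is only calculated when somebody observes it."""

    def __init__(self):
        self._cache = {}

    def observe(self, coordinate):
        # Expensive "physics" runs only the first time a coordinate is observed.
        if coordinate not in self._cache:
            time.sleep(0.05)                       # stand-in for a costly calculation
            self._cache[coordinate] = hash(coordinate) % 1000
        return self._cache[coordinate]

def timed_observe(universe, coordinate):
    start = time.perf_counter()
    value = universe.observe(coordinate)
    return value, time.perf_counter() - start

universe = LazyUniverse()
_, first = timed_observe(universe, (1, 2, 3))   # slow: the value had to be computed
_, second = timed_observe(universe, (1, 2, 3))  # fast: it was already cached

# The timing gap is the "side channel": it reveals whether the simulator had
# already bothered to calculate this value.
print(f"first observation:  {first * 1000:.1f} ms")
print(f"second observation: {second * 1000:.1f} ms")
```

The first call pays for the calculation and the second returns almost instantly; that gap is the kind of observable side effect of an optimization the paragraph above is describing.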
I wonder whether the designers of the simulator didn't anticipate that we could notice these kinds of side effects, or whether they are simply too difficult to fix. Either way, we are seeing the side effects of some of the shortcuts and optimizations.
Perhaps one day one of the programmers will look over at their printer and find a little note from someone way down here inside the simulation. If you could hack a few words outside of the system, what would they be?