In the Journal of Holography Applications in Physics, a quiet paper recently landed that attempts to do what decades of science fiction and stoner philosophy could not: pull the plug on the Matrix. While the tech industry’s elite spent the last decade arguing over whether we are all just subroutines in a post-human play-pen, a team led by Dr. Mir Faizal at the University of British Columbia’s Okanagan campus decided to check the math. Their conclusion is a cold shower for the simulation crowd: the fundamental structure of reality is logically incompatible with being a computer program.
For those who have followed the "Simulation Hypothesis" since Nick Bostrom first formalised it in 2003, the debate has always felt more like a secular religion than a technical inquiry. It posits that as computing power grows, any advanced civilisation will eventually run high-fidelity simulations of their ancestors. Statistically, the theory goes, there would be millions of simulated universes and only one "base" reality, making it overwhelmingly likely that we are the ones living on a hard drive. It is a neat, terrifyingly logical trick that has captured the imagination of everyone from Elon Musk to the creators of ChatGPT. But Faizal and his international collaborators, including the well-known physicist Lawrence M. Krauss, argue that this logic relies on a fundamental misunderstanding of what a computer actually is.
The Gödel trap for digital architects
The core of the researchers' argument rests on Kurt Gödel’s incompleteness theorems, a pillar of 20th-century mathematics that essentially functions as a "No Entry" sign for absolute logic. Gödel proved that in any sufficiently complex mathematical system, there will always be statements that are true but cannot be proven using the rules of that system. If the universe were a simulation, it would necessarily be governed by an algorithm—a finite set of computational rules. However, Faizal’s team points out that physical reality, particularly when viewed through the lens of quantum gravity, exhibits properties that are non-algorithmic.
To simulate a universe, you need a complete set of rules that can account for every possible state and interaction. But if Gödel is correct, no such complete set of rules can exist for a system as complex as our reality. There is a gap between what the "code" can describe and what actually happens. The researchers refer to this as "non-algorithmic understanding." It is the idea that the universe functions on a level of complexity that cannot be reduced to a series of 1s and 0s, or even the complex qubits of a quantum computer. If the universe requires non-algorithmic processes to function, then by definition, a computer—which is a purely algorithmic machine—cannot host it.
This is not just a philosophical disagreement; it is a hardware problem. Classical computers are Turing machines, and the Church-Turing thesis holds that such machines can calculate anything that is computable at all. The UBC paper suggests that the laws of physics are not, in fact, computable in the way we assume. We can simulate the trajectory of a rocket or the heat of a star because these are approximations. But simulating the emergence of spacetime itself from the "information" of quantum gravity requires a type of processing that exceeds the logical limits of any system built on programmed rules.
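Turing's halting problem makes the same limit concrete: assume a universal "will this program halt?" oracle exists, and you can write a program that defeats it. A minimal Python sketch of the diagonal argument (the `halts` function here is hypothetical, and that is the point: the argument shows no such function can exist):

```python
# Sketch of Turing's diagonal argument. `halts` is a hypothetical
# oracle; the contradiction below shows it cannot be implemented.

def halts(program, argument):
    """Hypothetical oracle: True iff program(argument) ever halts."""
    raise NotImplementedError("No total halting decider can exist.")

def contrarian(program):
    # Do the opposite of whatever the oracle predicts about
    # running `program` on its own source.
    if halts(program, program):
        while True:          # loop forever if predicted to halt
            pass
    return "halted"          # halt if predicted to loop forever

# Feed `contrarian` to itself: if halts(contrarian, contrarian) were
# True, contrarian would loop forever; if False, it would halt.
# Either way the oracle is wrong, so no such oracle exists.
```

Gödel's incompleteness theorems and the halting problem are two faces of the same barrier: any fixed system of rules leaves some true statements, or program behaviours, that it cannot settle from within.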
The high cost of digital twins
While the theoretical physics community debates the non-algorithmic nature of reality, the European Union is currently betting billions of euros that we can, at least, simulate parts of it. The "Destination Earth" (DestinE) initiative is a flagship project aimed at creating a "digital twin" of the planet to monitor climate change and extreme weather events. It is a massive procurement exercise involving the European Space Agency (ESA) and the European Centre for Medium-Range Weather Forecasts (ECMWF). The project relies on the assumption that if you throw enough petaflops of computing power at a problem, you can recreate the Earth’s systems with perfect fidelity.
However, Faizal’s findings suggest a looming ceiling for these ambitions. If reality is fundamentally non-algorithmic, then every simulation—no matter how many GPU clusters in Bonn or Bologna we throw at it—will eventually hit a wall of "irreducible complexity." We are already seeing this in weather forecasting, where the gap between a model and the actual atmosphere is not just a matter of more data, but of chaotic variables that may be mathematically impossible to "pre-calculate." Brussels may be funding the most sophisticated mirror ever built, but the UBC research suggests the mirror can never truly become the thing it reflects.
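That gap between model and atmosphere has a textbook illustration: chaotic systems amplify any error in the initial conditions exponentially. The logistic map below is not a weather model, merely the smallest system that shows why "more data" cannot rescue a long-range forecast once the measurement is off by even one part in a billion:

```python
# Toy illustration of sensitive dependence on initial conditions:
# the logistic map x -> r*x*(1-x) is fully chaotic at r = 4.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.400000000)   # the "true" initial state
b = logistic_trajectory(0.400000001)   # same state, measured 1e-9 off

# Early on the two runs agree to roughly seven decimal places, but
# the 1e-9 error is roughly doubled each iteration, so by step ~40
# the two simulated futures are completely decorrelated.
drift = [abs(x - y) for x, y in zip(a, b)]
print(drift[5], drift[50])
```

The same arithmetic is why ensemble forecasts fan out after about two weeks: the model is not short of compute, it is short of infinitely precise initial conditions.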
In Germany, where the semiconductor supply chain is often viewed through the lens of industrial sovereignty, the idea that the universe is not a computer is strangely comforting. If the world were a simulation, the most powerful entity would be whoever owns the server farm. Under the current trajectory of chip manufacturing, that would likely be a corporate entity in Santa Clara or a state-backed fab in Hsinchu. By proving that the universe is not a program, the mathematics effectively restores the "physics" of the real world—resource-intensive, messy, and fundamentally uncontrollable—to its primary position.
Can information exist without code?
One of the more nuanced points in the paper involves the role of information. Modern physics, especially the holographic principle mentioned in the journal’s title, suggests that the universe is made of information. This has often been used as evidence for the simulation theory: if everything is just information, surely it’s just software? The researchers argue this is a category error. Information in a physical sense—the states of quantum particles—is not the same as digital information stored in a database.
This puts a different spin on the "quantum advantage" that companies like IQM in Finland or Pasqal in France are chasing. These machines are not attempts to simulate reality wholesale; they exploit the "weirdness" of quantum mechanics, the superposition and entanglement that classical computers cannot reproduce efficiently, to perform specific tasks. But even a quantum computer is a system of logic. It still operates within the bounds of what can be mathematically structured, which places it on the same side of the Gödelian wall as its classical cousins.
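In fact, a classical program can simulate a small quantum register exactly; the obstacle is cost, not logic, since an n-qubit state needs 2^n complex amplitudes. A toy statevector sketch (plain Python, no quantum SDK assumed) makes the point that quantum "weirdness" is still rule-governed linear algebra:

```python
import math

# Classical simulation of a single qubit. An n-qubit register would
# need 2**n amplitudes, which is why this approach scales so badly,
# but nothing about the logic itself is non-algorithmic.

H = [[1 / math.sqrt(2),  1 / math.sqrt(2)],
     [1 / math.sqrt(2), -1 / math.sqrt(2)]]   # Hadamard gate

def apply(gate, state):
    return [sum(gate[i][j] * state[j] for j in range(len(state)))
            for i in range(len(gate))]

state = [1.0, 0.0]           # the qubit starts in |0>
state = apply(H, state)      # superposition (|0> + |1>) / sqrt(2)

probs = [abs(amp) ** 2 for amp in state]
print(probs)                 # ≈ [0.5, 0.5]
```

Applying the Hadamard a second time returns the qubit deterministically to |0>: the superposition was never magic, just a vector obeying fixed rules.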
The end of the Silicon Valley religion
The simulation hypothesis has always been popular because it provides a sense of order. If we are in a simulation, then there is a creator, a purpose, and perhaps even a "debug" mode. It turns the terrifying vastness of the cosmos into something familiar: a product. It is the ultimate expression of the tech industry’s hubris—the belief that everything, from the birth of a star to the feeling of a first kiss, is eventually reducible to a patentable process.
The tech billionaires may continue to fund research looking for "glitches in the Matrix" or ways to "break out" of the simulation, but they are likely chasing ghosts. The universe isn't failing to load its textures; it’s just operating on a level of mathematical complexity that doesn't care about our binary logic. It is a blow to the ego of the programmer, but a victory for the physicist. The universe is not a computer, and that is exactly why it works.
Europe has spent decades trying to build the perfect model of the world. It turns out the only way to truly understand the universe is to live in it, rather than trying to compile it. The research is a reminder that while you can simulate the rain, you can never quite get the computer wet.