Melvin Vopson’s research at the University of Portsmouth began not with a philosophy degree, but with the specific, irritating way that SARS-CoV-2 mutations behave. While the rest of the world was looking for a vaccine, Vopson was looking at the information content of the virus. He noticed something that defied the messy, chaotic expectations of biological evolution: the virus’s physical information entropy was not increasing. It was decreasing. In the world of classical thermodynamics, systems tend toward disorder. In Vopson’s data, the universe looked less like a wild forest and more like a software update being optimized for a smaller hard drive.
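The entropy Vopson tracks is, at bottom, Shannon's information entropy: the average number of bits needed per symbol of a sequence. A minimal sketch of the calculation, using made-up nucleotide strings rather than any actual SARS-CoV-2 data:

```python
import math
from collections import Counter

def shannon_entropy(sequence: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p_i * log2(p_i))."""
    counts = Counter(sequence)
    n = len(sequence)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Illustrative only: a uniform 4-letter sequence carries the maximal
# 2 bits per symbol; a biased one carries less. A mutation that skews
# the symbol distribution can therefore lower the entropy.
print(shannon_entropy("ACGT" * 25))      # 2.0 bits/symbol
print(shannon_entropy("AAAAACGT" * 25))  # ~1.55 bits/symbol
```

The function names and sequences here are illustrative; Vopson's published analysis works with real genomic data and a physical (mass-energy) reading of the same quantity.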
This observation led to what Vopson calls the Second Law of Infodynamics. It is a provocative, perhaps even heresy-adjacent, proposition that information entropy in any system must remain constant or decrease over time. If this sounds like the opposite of the Second Law of Thermodynamics, that’s because it is. But for those currently obsessed with the idea that we are living inside a vast, computational construct, Vopson’s law is the smoking gun. It suggests that the universe is governed by a mandate to minimize information—a process any software engineer in Berlin or Silicon Valley would recognize as data compression.
The thermodynamic tax on existence
The argument for a simulated reality usually suffers from a lack of physical evidence, drifting instead into the realm of late-night dorm room speculation. Vopson, however, anchors his theory in the Landauer principle. Established in the 1960s, Rolf Landauer’s principle posits that erasing a single bit of information releases a tiny, measurable amount of heat. It is the bridge between the abstract world of bits and the physical world of joules. In a European context, where the energy consumption of data centres in Frankfurt and Dublin is now a matter of national security and industrial policy, the Landauer principle is no longer a theoretical curiosity. It is a line item in a budget.
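The Landauer bound is simple enough to put in a budget spreadsheet: erasing one bit dissipates at least k_B · T · ln 2 joules. A back-of-envelope sketch (the temperature and data volumes are illustrative):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K (exact by 2019 SI definition)

def landauer_limit_joules(temp_kelvin: float) -> float:
    """Minimum heat dissipated by erasing one bit at a given temperature."""
    return K_B * temp_kelvin * math.log(2)

room_temp = 300.0  # K, roughly the ambient of a Frankfurt data centre
per_bit = landauer_limit_joules(room_temp)
print(f"Minimum heat per erased bit at {room_temp} K: {per_bit:.3e} J")

# Scale it up: wiping a terabyte (8e12 bits) costs ~2.3e-8 J at the
# theoretical floor. Real hardware dissipates many orders of magnitude
# more, which is why the limit matters as a floor, not a forecast.
print(f"Per terabyte erased: {per_bit * 8e12:.3e} J")
```

At roughly 2.9 × 10⁻²¹ J per bit, the bound is far below what today's silicon pays, but it is non-zero, and that is the point of the "line item" framing.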
If information has mass and energy—a hypothesis Vopson is currently trying to test—then the entire universe could be seen as a data-management exercise. The symmetry we see in nature, from the hexagonal lattices of snowflakes to the spiralling arms of galaxies, could be interpreted not as 'beauty,' but as an efficiency measure. Symmetry is easier to code. It requires less data to describe a circle than a jagged, irregular rock. To the simulation proponents, the fact that our universe follows elegant mathematical laws is not a miracle; it is a sign of a developer trying to save on overhead.
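The "symmetry compresses" intuition can be checked directly: a regular pattern shrinks under a general-purpose compressor, while irregular noise barely does. A toy demonstration, with zlib standing in for the hypothetical cosmic encoder:

```python
import random
import zlib

random.seed(42)  # deterministic "rock" for reproducibility

# A 'symmetric' object: one motif repeated, like a hexagonal lattice.
lattice = b"hexagon" * 1000          # 7000 bytes

# An 'irregular rock': the same length, but random bytes.
rock = bytes(random.randrange(256) for _ in range(len(lattice)))

# The lattice is described by stating the motif once plus a repeat
# count; the rock has no shorter description than itself.
print(len(zlib.compress(lattice)))   # a few dozen bytes
print(len(zlib.compress(rock)))      # roughly the original 7000 bytes
```

This is Kolmogorov-complexity reasoning in miniature: the regular object has a short program that generates it, the irregular one does not.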
The mathematical wall at UBC Okanagan
While the 'Infodynamics' camp looks at the elegance of the universe and sees code, a group of physicists at the University of British Columbia Okanagan has recently arrived at the opposite conclusion using the very tool the simulation theory relies on: mathematics. Their research, published late in 2025, addresses the 'Sign Problem' in quantum Monte Carlo simulations. This is not a philosophical disagreement; it is a hard wall in computational complexity that suggests the universe is simply too difficult to fake.
The UBC Okanagan team demonstrated that as the complexity of a quantum system increases—specifically systems involving many interacting particles—the computational resources required to simulate them grow exponentially. To simulate even a few hundred atoms with perfect accuracy, you would need a computer larger than the observable universe. This is the 'Sign Problem.' It is a mathematical glitch that occurs when trying to calculate the probability of quantum states, where positive and negative terms cancel each other out in a way that requires infinite precision to resolve.
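The scaling wall is easy to make concrete. An exact state vector for n two-level quantum particles holds 2ⁿ complex amplitudes; at 16 bytes per amplitude, the memory requirement outruns any conceivable machine long before "a few hundred" particles. A back-of-envelope sketch (not the UBC team's actual model):

```python
# Exact state vector of n spin-1/2 particles: 2**n complex amplitudes.
BYTES_PER_AMPLITUDE = 16  # one complex128 value

def memory_bytes(n_particles: int) -> int:
    """Memory needed to store the full quantum state exactly."""
    return (2 ** n_particles) * BYTES_PER_AMPLITUDE

for n in (30, 50, 300):
    print(f"{n:>3} particles -> {memory_bytes(n):.3e} bytes")

# 30 particles fits on a workstation (~17 GB); 50 needs ~18 petabytes;
# 300 exceeds 10**91 bytes -- more than the ~10**80 atoms in the
# observable universe, even at one byte per atom.
```

Monte Carlo methods exist precisely to dodge this exponential cost by sampling, and the sign problem is the place where the sampling trick itself breaks down: the cancelling positive and negative terms force the sample count back up exponentially.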
For the universe to be a simulation, the 'hardware' running it would have to bypass the very laws of complexity that we observe within the simulation. If the 'Simulators' are using a shortcut to get around the Sign Problem, we should see evidence of those shortcuts—numerical 'jitter' or approximations in the subatomic world. So far, the deeper we look into the quantum foam, the more 'real' it appears. The math doesn't show a shortcut; it shows a system of such staggering, un-optimised complexity that any sane engineer would have abandoned the project in the prototyping phase.
The European industrial reality of digital ghosts
The fascination with simulation theory often mirrors our own technological anxieties. In Germany, the push for 'Sovereign Tech' and the massive subsidies for Intel’s Magdeburg plant or TSMC in Dresden are driven by the reality that we are increasingly dependent on the silicon wafer. When we start to view the universe as a simulation, we are essentially projecting our current industrial era onto the cosmos. Just as the Victorians saw the universe as a giant clockwork mechanism, we see it as a server rack.
However, the 'Information is Physical' hypothesis has implications that go far beyond 'The Matrix.' If Vopson is right about the Second Law of Infodynamics, it would change how we approach everything from semiconductor design to genomic sequencing. If systems naturally tend toward information compression, we might be fighting the tide by trying to build ever-larger, more 'noisy' data models. The European Union's obsession with the 'Green Twin'—digitising the economy to save energy—assumes that the digital version of reality is cheaper to maintain than the physical one. Physics, specifically the Landauer limit, suggests there is a floor to that efficiency.
Why we prefer the simulation
The debate between the Portsmouth 'compressors' and the Okanagan 'realists' reveals a curious tension in modern science. We are increasingly uncomfortable with a universe that is 'just' matter and energy. Matter is heavy, expensive, and subject to the slow decay of time. Information, by contrast, feels eternal and portable. The simulation theory is, in many ways, a secular theology for the data-driven age. It offers the promise of an 'Outside,' a creator (even if that creator is just a bored teenager in a higher dimension), and a reason for the mathematical order of the world.
But the UBC Okanagan findings serve as a cold shower. They suggest that reality is not a cheap trick. The 'Sign Problem' is a testament to the sheer, unbridled grit of the physical world. It tells us that the universe is not taking the easy path. It is calculating every single interaction, every single quantum fluctuation, in real-time, with no apparent regard for the 'memory cost.' It is an incredibly inefficient way to run a reality, which is exactly why it is likely real. A simulation would have crashed long ago under the weight of its own subatomic detail.
As we continue to pour billions into quantum computing and AI, we are essentially trying to build our own 'mini-simulations.' We are discovering, through the lens of the European Chips Act and the skyrocketing costs of electricity, that information is not free. Whether we are a compressed file in a cosmic hard drive or a messy, authentic collection of atoms, the tax remains the same. The universe doesn't have a GPU, and it doesn't seem to care about our storage limits. We are living in a reality that is far too complex to be anything other than itself. That is either a comfort or a terrifying realization of our own isolation.
Europe has the engineers to build the sensors that will eventually prove or disprove the mass of a bit. It just hasn't decided if it wants to live in the answer.