Melvin Vopson was staring at the way information behaves when he noticed a pattern that shouldn't exist in a chaotic, organic universe. In our physical world, things generally get messier over time—the famous law of entropy. But in the world of digital information, things tend to do the opposite. They compress. They optimize. They shed the redundant and keep only what is necessary to function. Vopson, a physicist at the University of Portsmouth, realized that the universe appears to be doing exactly that: zipping its own files to save on space.
This isn’t just a philosophical shower thought. Vopson has formalized this observation into what he calls the Second Law of Infodynamics. It suggests that the information entropy of systems doesn’t just fluctuate; it trends toward a minimum over time. In a world of cold, hard physics, this looks less like the messy sprawl of evolution and more like the elegant code of a developer trying to keep a massive program from crashing the server. If the universe is trying to save on processing power, it implies there is a processor somewhere running the show.
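In rough symbols, the contrast looks like this (a sketch following the notation of Vopson's 2022 paper, where H(X) is the Shannon entropy of the system's states and N counts the information-bearing states; treat the exact constants as his convention rather than settled physics):

```latex
% Second Law of Thermodynamics: physical entropy never decreases
\frac{dS_{\mathrm{phys}}}{dt} \geq 0
% Vopson's Second Law of Infodynamics: information entropy stays flat or falls
\frac{dS_{\mathrm{info}}}{dt} \leq 0,
\qquad S_{\mathrm{info}} = k_B \, N \, H(X) \ln 2
```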
The universe’s obsession with zip-files
Most of us think of information as something humans invented—bits on a hard drive or words in a book. But to a physicist, information is a physical property. It’s the state of every particle, the spin of every electron, the specific configuration that makes a hydrogen atom different from a piece of toast. Usually, the Second Law of Thermodynamics tells us that the universe is headed toward a state of maximum disorder. Your coffee gets cold, your car rusts, and the stars eventually burn out.
Vopson’s proposal flips the script. He argues that the information entropy of systems stays constant or decreases over time. Signs of this efficiency are everywhere. Look at symmetry in nature: from the hexagonal perfection of a snowflake to the mirrored halves of a butterfly’s wings. Why does the universe love symmetry? Vopson argues it’s because symmetry is the ultimate data-saving hack. It is much easier to store the code for one half of a face and tell the system to "mirror" it than it is to render a completely unique, asymmetrical mess.
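A toy sketch makes the "symmetry is compression" point concrete. The scheme below is ours, not Vopson's: it stores one half of a mirror-symmetric pattern plus a one-byte marker, which is all a renderer would need to reconstruct the whole.

```python
# Encode a mirror-symmetric pattern by keeping one half plus a marker byte.
def encode(row: bytes) -> bytes:
    n = len(row)
    if n % 2 == 0 and row[: n // 2] == row[n // 2:][::-1]:
        return b"M" + row[: n // 2]   # 'M' = mirror the stored half
    return b"R" + row                 # 'R' = raw, no symmetry to exploit

def decode(blob: bytes) -> bytes:
    tag, payload = blob[:1], blob[1:]
    return payload + payload[::-1] if tag == b"M" else payload

wing = b"ABCCBA"                      # symmetric, like a butterfly's wing pair
assert decode(encode(wing)) == wing   # lossless round trip
print(len(wing), "->", len(encode(wing)))   # 6 -> 4 bytes
```

The saving scales with the pattern: a perfectly symmetric shape costs roughly half the storage of an asymmetric one, plus one byte of bookkeeping.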
This creates a massive tension with our traditional understanding of reality. If the universe is a natural, spontaneous occurrence, it has no reason to be efficient. Nature is usually a profligate spender of energy and space. But if we are living inside a simulation, efficiency is the only way the system survives. Every bit of redundant data removed is a bit of memory freed up. We aren't just living in a universe; we might be living in a highly optimized piece of software.
Computing the uncomputable
Not everyone sees optimization. Researchers at the University of British Columbia Okanagan approached the question from the opposite direction, focusing on the sheer amount of computational power required to simulate the quantum interactions of just a few hundred electrons. Because quantum particles exist in a blur of multiple states at once (a property called superposition), the amount of data needed to track them grows exponentially. To simulate even a small cluster of atoms with perfect fidelity, you would need a computer larger than the observable universe itself. It isn't a matter of building a better Mac Pro; it's a matter of fundamental physics.
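The scaling argument is easy to reproduce on the back of an envelope. The numbers below are standard ballpark figures (16 bytes per complex amplitude, roughly 10^80 atoms in the observable universe), not the UBC team's exact inputs:

```python
# An n-particle quantum state over two-level systems needs 2**n complex
# amplitudes. Compare the bytes required against the ~10**80 atoms in the
# observable universe, a crude stand-in for "a universe-sized computer".
ATOMS_IN_UNIVERSE = 10**80
BYTES_PER_AMPLITUDE = 16   # two 64-bit floats per complex number

for n in (10, 50, 100, 300):
    bytes_needed = 2**n * BYTES_PER_AMPLITUDE
    verdict = "fits" if bytes_needed < ATOMS_IN_UNIVERSE else "exceeds the atom count"
    print(f"{n:>3} particles: ~10^{len(str(bytes_needed)) - 1} bytes ({verdict})")
```

At 300 particles the storage alone outstrips the atom count by more than ten orders of magnitude, which is the flavor of the objection: perfect fidelity does not scale.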
This creates a deadlock between two schools of thought. Vopson sees the "code" and the optimization as evidence of a creator or a programmer. The UBC team sees the sheer complexity of the physics as proof that no computer could ever handle the load. The debate hinges on a single, nagging question: does the simulation have to be perfect? If you’re playing a video game, the computer doesn't render the entire world at once—it only renders what you’re looking at. This is a concept called frustum culling, and some physicists argue the universe does the exact same thing at the quantum level.
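For the curious, frustum culling is simple enough to sketch in a few lines. This 2-D toy (the names and scene are ours) renders only what falls inside the camera's viewing cone:

```python
import math

def in_view(camera_angle: float, fov: float, cam_xy, obj_xy) -> bool:
    """Render an object only if it falls inside the camera's field of view."""
    dx, dy = obj_xy[0] - cam_xy[0], obj_xy[1] - cam_xy[1]
    angle_to_obj = math.atan2(dy, dx)
    # Smallest signed difference between the two angles, wrapped to [-pi, pi].
    diff = (angle_to_obj - camera_angle + math.pi) % (2 * math.pi) - math.pi
    return abs(diff) <= fov / 2

world = {"tree": (5, 1), "rock": (-3, 4), "star": (0, 10)}
camera, facing, fov = (0, 0), 0.0, math.radians(90)

# Only objects inside the 90-degree cone get "rendered"; the rest are skipped.
visible = {name for name, pos in world.items() if in_view(facing, fov, camera, pos)}
print(visible)  # {'tree'}: the rock and the star are outside the frustum
```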
The DNA storage problem
Vopson’s most provocative claim involves the very building blocks of life. He suggests that DNA isn't just a biological blueprint, but a highly sophisticated information storage system that follows the laws of infodynamics. By analyzing the genetic sequences of viruses and organisms, he found that their information entropy decreases over time as they mutate. They aren't just evolving; they are optimizing their code.
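The measurement itself is ordinary Shannon entropy. Here is a toy version of the calculation, with made-up sequences standing in for successive variants (Vopson's actual studies ran this kind of analysis on real genomes such as SARS-CoV-2):

```python
import math
from collections import Counter

def shannon_entropy(seq: str) -> float:
    """Shannon entropy in bits per symbol: H = -sum(p * log2(p))."""
    n = len(seq)
    return -sum((c / n) * math.log2(c / n) for c in Counter(seq).values())

ancestor = "ATGCATGCCGTAGCTAAT"   # fairly balanced nucleotide mix
mutant   = "ATATATATGCGCATATAT"   # mutations have skewed the composition

print(f"ancestor: {shannon_entropy(ancestor):.3f} bits/symbol")
print(f"mutant:   {shannon_entropy(mutant):.3f} bits/symbol")
# In the infodynamics reading, the second number trending below the first
# is the signature of optimization rather than random drift.
```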
This challenges the standard Darwinian view of random mutation. If mutations were truly random, we would expect a chaotic drift in information content. Instead, Vopson sees a trend toward data compression. It’s as if the biological world is trying to fit as much functional complexity as possible into the smallest possible genetic footprint. To a skeptic, this sounds like a digital ghost in the machine. To a biologist, it’s a radical rethinking of how life maintains its integrity across billions of years.
Critics, however, are quick to point out that Vopson might be mistaking the map for the territory. Just because we can describe the universe using information theory doesn’t mean the universe *is* information. We described the universe as a clockwork mechanism in the 18th century because that was our most advanced technology. Now that we have the internet and AI, we see the universe as a computer. It’s a classic case of human projection—we see what we know.
Why the Fermi Paradox points to a glitch
If we are in a simulation, it might finally explain why the skies are so quiet. The Fermi Paradox—the contradiction between the high probability of alien life and the total lack of evidence for it—has haunted astronomers for decades. If the universe is a simulation designed for humanity, or a specific experiment focused on Earth, the "programmers" wouldn't bother rendering other civilizations. They would be unnecessary background noise that eats up processing power.
This is often called the "Planetarium Hypothesis." It suggests that the stars we see are just a high-resolution backdrop, a shell around our solar system that gives the illusion of a vast, empty void. In this scenario, we don't see aliens because they aren't in the script. The universe feels infinite not because it is, but because it’s programmed to look that way whenever we point a telescope at the sky.
But even the best simulations have bugs. Some theorists point to the weirdness of quantum mechanics as the ultimate "glitch." The fact that particles don't have a definite position until they are observed—the Observer Effect—looks suspiciously like a computer only rendering an object when a player enters the room. Why waste energy calculating the position of every subatomic particle in the heart of a star if no one is there to check the results? The universe only becomes "real" when we look at it, a trick that would save a simulation an unimaginable amount of power.
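In software terms, the analogy is plain lazy evaluation: don't compute a value until someone asks for it. A minimal sketch, purely illustrative, with no claim that physics actually works this way:

```python
import random

class LazyParticle:
    """Holds no definite position until something asks for one."""
    def __init__(self):
        self._position = None   # nothing computed, nothing stored

    @property
    def position(self):
        if self._position is None:            # first observation:
            self._position = random.random()  # "collapse" to a value
        return self._position                 # later reads agree with it

# A star's worth of particles costs almost nothing while unobserved...
core = [LazyParticle() for _ in range(1_000_000)]
# ...and the simulation only pays for the one particle we actually look at.
print(core[42].position)
```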
The cost of the 'Sim' theory
The philosophical trade-off of believing in a simulation is steep. If we accept Vopson’s infodynamics as proof, we have to reckon with the fact that our reality is derivative. We are a sub-process. This leads to Nick Bostrom’s famous trilemma: either all civilizations go extinct before they can build simulations, they choose not to run them, or we are almost certainly living in one. If even one civilization eventually gains the power to run a "high-fidelity ancestor simulation," they would likely run them by the millions. Statistically, that would mean there is only one "real" world and millions of fake ones. The odds of us being in the real one would be millions to one against.
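The final step is one line of arithmetic. With illustrative counts (the million is ours, not Bostrom's):

```python
# One base reality versus many ancestor simulations: a randomly placed
# observer is almost certainly inside one of the copies.
real_worlds = 1
simulations = 1_000_000   # illustrative count

p_real = real_worlds / (real_worlds + simulations)
print(f"P(base reality) ≈ {p_real:.6%}")   # ≈ 0.000100%
```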
However, the UBC Okanagan math offers a glimmer of hope for those who find the simulation idea depressing. Their argument relies on the idea of "quantum complexity," which suggests that nature is far more intricate than any digital approximation could ever be. There is a richness to the physical world, a chaotic, incompressible depth, that no amount of code can mimic. According to their calculations, the universe isn't optimized; it's actually incredibly, beautifully inefficient at the quantum scale.
We are left with two competing versions of reality. One is an elegant, optimized program where even your DNA is shedding excess bits to stay lean. The other is a physical powerhouse so complex that it defies any attempt to be simulated. Vopson is currently looking for the "smoking gun"—an experiment that would involve erasing information from a particle to see if it loses mass. If information has mass, as he predicts, the simulation theory moves from the realm of philosophy to the laboratory. Until then, we are stuck staring at the pixels, wondering if they go all the way down.
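That prediction comes with a concrete number attached. Vopson's mass-energy-information equivalence assigns every stored bit a temperature-dependent mass, m = k_B T ln(2) / c^2, which is straightforward to evaluate (the constants are standard; the formula is his published conjecture, not accepted physics):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K
C = 299_792_458      # speed of light, m/s

def bit_mass(temperature_kelvin: float) -> float:
    """Predicted mass of one bit of information: m = k_B * T * ln(2) / c**2."""
    return K_B * temperature_kelvin * math.log(2) / C**2

print(f"{bit_mass(300):.3e} kg per bit at room temperature")
# ≈ 3.2e-38 kg: far too small to weigh directly, which is why the proposed
# test erases bits and looks for the missing mass-energy instead.
```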