David Wolpert and Carlo Rovelli have spent a significant portion of their careers staring at the same mathematical wall, and in their latest collaboration, they have decided to point out that the wall is actually a mirror. The paper, recently published in the journal Entropy, does not offer the kind of comforting breakthrough that lands on the front page of a mainstream broadsheet. Instead, it identifies a structural failure in the way we think about the past. By interrogating the Boltzmann brain paradox, the authors suggest that our entire perception of history—and the multi-billion-euro research infrastructure built upon it—might be resting on a logical loop that we have simply agreed to ignore.
The core of the issue is a decades-old statistical nightmare. Ludwig Boltzmann, the father of statistical mechanics, famously showed that entropy—the measure of disorder—overwhelmingly tends to increase. This gives us the arrow of time: eggs crack, they don’t un-crack. However, the underlying laws of physics are time-symmetric. If you watch a movie of a single atom bouncing around, you cannot tell if the film is playing forward or backward. This creates a statistical paradox: it is mathematically more likely for a fully formed brain, complete with false memories of a life in Berlin or Cologne, to spontaneously fluctuate out of cosmic chaos than it is for the entire universe to have started in the impossibly low-entropy state required by the Big Bang.
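The asymmetry can be made concrete with Boltzmann's fluctuation formula, which says the probability of a spontaneous entropy dip of size ΔS below equilibrium scales as exp(−ΔS/k_B). The sketch below uses toy order-of-magnitude entropy values that are illustrative assumptions, not figures from the paper:

```python
import math

# Boltzmann's fluctuation formula: P(dip of Delta_S) ~ exp(-Delta_S / k_B).
# Working in natural units (k_B = 1), compare two fluctuations.
# Both entropy figures below are rough illustrative assumptions.
dS_brain    = 1e45   # assumed order of magnitude for a brain-sized fluctuation
dS_universe = 1e104  # assumed order of magnitude for a whole low-entropy universe

# The raw probabilities underflow any float, so work with log10-probabilities.
log10_p_brain    = -dS_brain / math.log(10)
log10_p_universe = -dS_universe / math.log(10)

# The brain is absurdly unlikely, but the universe is unimaginably MORE unlikely.
print(f"log10 P(brain)    ~ {log10_p_brain:.2e}")
print(f"log10 P(universe) ~ {log10_p_universe:.2e}")
print(f"P(brain) / P(universe) ~ 10^{log10_p_brain - log10_p_universe:.2e}")
```

Whatever the exact exponents, the ratio is what matters: a small fluctuation always dominates a large one, which is why the lone hallucinating brain beats the Big Bang in a naive probability count.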
The High Cost of Fixing the Past
For most working physicists, the Boltzmann brain is treated as a nuisance rather than a threat—the academic equivalent of a software bug that is patched with a clumsy workaround known as the "Past Hypothesis." This hypothesis asserts, by fiat, that the universe began in an extremely orderly state. If you accept this, the Boltzmann brains go away, and our memories of yesterday’s lunch become reliable data points. But Rovelli, Scharnhorst, and Wolpert argue that this fix is less of a solution and more of a bureaucratic sleight of hand. They have identified what they call the "entropy conjecture," a framework revealing that many arguments for the reliability of memory are fundamentally circular. We use our memories to prove the past was low-entropy, then use that low-entropy past to prove our memories are real.
This is not merely a philosophical debate for the faculty lounge. It touches on the very reliability of empirical data in high-stakes environments, from quantum cryptography to the calibration of deep-space sensors. If we cannot rigorously distinguish between a signal that records a real event and a statistical fluctuation that merely looks like one, the foundations of precision measurement begin to soften. In the European context, where the Horizon Europe programme pours billions into quantum hardware and high-precision sensors, the question of what constitutes "ground truth" in a noisy system is a matter of industrial strategy.
The Circularity Problem in European Labs
The research, conducted partly under the auspices of the Santa Fe Institute but carrying the distinct, sceptical imprint of European theoretical physics, highlights a tension in how we fund science. In Brussels, the focus is increasingly on "technology readiness levels" (TRLs). We want quantum computers that can break encryption or simulate new catalysts for the green transition. But Rovelli and Wolpert’s work suggests that we are still building these machines on a foundation of shaky assumptions about how information is preserved over time.
One of the more cutting observations in the study involves the choice of "fixed points" in time. When a physicist calculates the probability of an event, they must decide which variables are given. If you fix the present state of the universe as your only known data point, the math almost inevitably leads back to the Boltzmann brain scenario: you are a lone mind in a void, hallucinating a history. To avoid this, you must fix a second point in the distant past. The study points out that physics itself provides no manual for which points to fix. It is a subjective choice masquerading as a physical law. This choice is what allows us to trust the data coming out of a semiconductor fab or a particle accelerator, but the new analysis suggests we have been using the output to justify the input for far too long.
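The fixed-point argument can be demonstrated with a toy reversible system. The sketch below uses the Ehrenfest urn model (my choice of illustration, not the paper's): conditioning only on an improbable present state yields a past that is, on average, just as close to equilibrium as the future, while fixing a second, low-entropy point restores a one-way arrow of time.

```python
import random

random.seed(0)
N = 50            # balls in the Ehrenfest urn model (toy illustration)
STEPS = 200_000

def step(k):
    """One time-reversible move: a uniformly chosen ball hops to the other urn."""
    return k - 1 if random.random() < k / N else k + 1

# A long trajectory started in equilibrium.
traj = [N // 2]
for _ in range(STEPS):
    traj.append(step(traj[-1]))

# Fix ONLY the present: find moments of unusually low entropy (large imbalance)
# and measure the average imbalance D steps before and after each one.
D, THRESH = 30, 10
hits = [t for t in range(D, STEPS - D) if abs(traj[t] - N / 2) >= THRESH]
before = sum(abs(traj[t - D] - N / 2) for t in hits) / len(hits)
now    = sum(abs(traj[t] - N / 2) for t in hits) / len(hits)
after  = sum(abs(traj[t + D] - N / 2) for t in hits) / len(hits)
# Time symmetry: entropy is higher on BOTH sides of the fluctuation, so the
# most probable "history" of a low-entropy present is itself a fluctuation.
print(f"mean imbalance: before={before:.1f}, at fluctuation={now:.1f}, after={after:.1f}")

# Now fix a SECOND point: a "Past Hypothesis" start in the lowest-entropy state.
cold = [0]
for _ in range(500):
    cold.append(step(cold[-1]))
print(f"imbalance with a fixed low-entropy past: {abs(cold[0] - N/2):.0f} -> {abs(cold[-1] - N/2):.0f}")
```

With only the present fixed, the imbalance before and after the fluctuation is statistically indistinguishable; with the cold start fixed as well, entropy relaxes monotonically toward equilibrium, which is the behaviour we normally call "history."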
Why Engineering Realities Might Save the Paradox
The engineering trade-off here is one of computational complexity versus physical reality. If we were to actually account for the possibility of random fluctuations in every data set, our models would become too heavy to run. We assume the past is real because it is computationally efficient to do so. In the semiconductor industry, specifically in the development of next-generation EUV lithography, we rely on the temporal stability of physical laws to print circuits at the nanometre scale. If the past were truly as fluid as the Boltzmann math suggests, the concept of a "reproducible experiment" would vanish.
European industrial policy, particularly the Chips Act, is predicated on the idea that we can master the physical world through increasingly precise control of entropy. We spend years cooling quantum bits to near absolute zero to prevent "noise." But Wolpert and Rovelli are asking a deeper question: what if the noise is the default, and our signal is the anomaly? This perspective shift is uncomfortable for an industrial complex that views nature as something to be managed by a spreadsheet. It suggests that our sense of progress—the idea that we are moving from a known past to a predictable future—is a narrative we’ve constructed to keep the math from breaking.
The Sceptical Path Forward
In the corridors of the European Research Council, where Rovelli’s influence remains significant, this work signals a pivot back toward foundational questioning. At a time when European science is often pressured to justify its existence through immediate commercial application, this paper is a reminder that the most basic questions—like why we remember things—remain essentially unanswered. The circularity discovered by Wolpert and his colleagues suggests that we have been taking a shortcut through the most difficult part of the forest, assuming we knew the way home because we recognised the trees.
Ultimately, the work suggests that our trust in history is a pragmatic choice, not a mathematical certainty. It is a necessary fiction that allows us to build bridges, launch satellites, and fund research cycles. We will continue to invest in the future as if the past were a solid, immutable record, mostly because the alternative makes it impossible to fill out a grant application. It is progress, of course, but it is the kind of progress that suggests we should be much more careful about what we claim to know for certain. Europe will keep building the sensors; it just might start questioning the history they record.