The simulation hypothesis and quantum mechanics

The simulation hypothesis is a theory put forward in 2003 by the philosopher Nick Bostrom. It is more than just a typical skeptical hypothesis (like Zhuangzi's butterfly dream) since it is in fact a metaphysical theory (as argued by Chalmers).
This hypothesis states simply that most of mankind, including ourselves (properly qualified as "minds" or "observers"), are in fact computer-simulated beings and NOT the biological beings we think we are.
This hypothesis provides neat solutions to many problems in quantum mechanics, cosmology and consciousness.
Although this argument is completely naturalistic, it also has far-reaching consequences in the philosophy of religion (free will, the problem of evil, the moral imperative, the nature of the creator, etc.).
However, our interest in this hypothesis lies in the potential physical links which it establishes between quantum physics, time and consciousness.
This is the topic of discussion here in the coming days and weeks.
.............................................
 https://badisydri.blogspot.com/2019/07/blog-post_57.html


........................................................
The nucleons (and other hadrons) can be built virtually/numerically by simulating the gauge theory of quantum chromodynamics (QCD) on the lattice. This is already a fact today, as every practicing theoretical physicist should know.
The nucleons (the proton and the neutron), which account for 99% of the mass of the luminous matter in the Universe, are fully quantum particles, yet in their simulation we use only classical computers.
Is there a difference between the material nucleon and the simulated nucleon?
The answer is no, as shown by the results of Dürr et al. (arXiv:0906.3599) in the attached figure.
QCD is the fundamental theory of the strong interactions in the Universe, and the lattice theory which approximates these strong interactions (and which is characterized by a lattice spacing a and a finite volume V) is assumed/believed to enjoy a continuum limit reproducing the actual interactions found in Nature.
Thus, the simulated nucleon lives in a discretized, finite spacetime known as the lattice. This is actually a necessary step in order for QCD to be well behaved both in the UV (ultraviolet) and IR (infrared) regimes in the numerical calculations.
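As a toy illustration of what taking the continuum limit means in practice, the following minimal Python sketch extrapolates a lattice observable to a -> 0 assuming the leading discretization errors are of order a^2; the numbers are invented purely for illustration and merely stand in for actual Monte Carlo measurements.

```python
import numpy as np

# Hypothetical measurements of an observable O(a) at several lattice
# spacings a (in fm). These numbers are invented for illustration; real
# values would come from lattice QCD Monte Carlo runs.
a = np.array([0.125, 0.085, 0.065])   # lattice spacings (fm)
O = np.array([0.971, 0.986, 0.992])   # measured observable (arbitrary units)

# Leading discretization errors are typically of order a^2, so we fit
# O(a) = O_cont + c * a^2 and read off the a -> 0 (continuum) value.
c, O_cont = np.polyfit(a**2, O, 1)    # linear fit in a^2
print(f"continuum limit O(a->0) ~ {O_cont:.4f}, slope c ~ {c:.3f}")
```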
Furthermore, the laws by which this nucleon or its state evolves in time are those of quantum chromodynamics, which is a generalization of electrodynamics to the dynamics of three charges called colors, hence the name chromodynamics.
The lattice is also required to be Euclidean, which means that effectively there is no time (or, more precisely, time is Wick rotated). Under Wick rotation the path-integral weight exp(iS), where S is the action, is turned into a Boltzmann weight exp(-S_E), where the Euclidean action S_E plays the role of an energy, while the path integral (or superposition principle, which entails interference between paths) becomes essentially a sum over probabilities for the different paths, which can then be simulated using Monte Carlo methods.
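Explicitly, for a single particle in a potential V, the Wick rotation t \to -i\tau turns the action
S = \int dt \, [ (m/2)(dx/dt)^2 - V(x) ]
into iS = -S_E with the (positive) Euclidean action
S_E = \int d\tau \, [ (m/2)(dx/d\tau)^2 + V(x) ],
so that the oscillatory weight exp(iS) becomes the real Boltzmann weight exp(-S_E).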
In summary, quantum mechanics in these simulations is effectively approximated by statistical mechanics.
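To make this concrete, here is a minimal sketch (a single quantum particle, the harmonic oscillator, rather than full lattice QCD, but the logic is the same): Metropolis Monte Carlo sampling of paths with the Boltzmann weight exp(-S_E) on a Euclidean time lattice.

```python
import numpy as np

rng = np.random.default_rng(0)

# Euclidean lattice for a 1D harmonic oscillator (m = omega = hbar = 1).
# After Wick rotation the weight exp(iS) becomes the Boltzmann weight
# exp(-S_E), which can be sampled by ordinary Metropolis Monte Carlo.
N, a, delta = 100, 0.5, 1.0   # lattice sites, lattice spacing, proposal width
x = np.zeros(N)               # initial path (cold start)

def local_action(x, j, xj):
    """Terms of the Euclidean action involving site j (periodic lattice)."""
    xl, xr = x[(j - 1) % N], x[(j + 1) % N]
    kinetic = ((xj - xl) ** 2 + (xr - xj) ** 2) / (2.0 * a)
    potential = a * 0.5 * xj ** 2
    return kinetic + potential

def sweep(x):
    """One Metropolis sweep; the iteration count is the Monte Carlo time."""
    for j in range(N):
        xnew = x[j] + delta * (2.0 * rng.random() - 1.0)
        dS = local_action(x, j, xnew) - local_action(x, j, x[j])
        if dS < 0 or rng.random() < np.exp(-dS):
            x[j] = xnew

for _ in range(500):          # thermalization sweeps
    sweep(x)
samples = []
for _ in range(2000):         # measurement sweeps
    sweep(x)
    samples.append(np.mean(x ** 2))
# Continuum answer is <x^2> = 1/2; the lattice value differs by O(a^2) artifacts.
print("lattice <x^2> ~", np.mean(samples))
```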
Yet, miraculously, this works and we get correctly simulated nucleons with only minor differences from the real nucleon.
Thus, the annulment of time by rotating to Euclidean signature, together with the consideration of integrals over paths (which emphasizes the particle aspect), removes the measurement problem, or so it seems.
Nevertheless, in some sense the real time for the simulated nucleons is the simulation time, which is the Monte Carlo time of the repetitive iterations of the numerical algorithm.
Once we have simulated the proton/neutron fully as it exists in Nature, we can ask about the simulation of other particles, of the weak interactions, and of gravity as a perturbation of the flat metric propagating in the background spacetime.
And although these envisaged simulations are all performed on a lattice approximation of spacetime we can ask about simulations of spacetime itself, i.e. of quantum gravity which is already possible within matrix models of string theory and noncommutative geometry.
The lattice approximation typically breaks Lorentz and rotational invariance, and the lattice spacing can be identified with the order of the rotational anisotropy observed in cosmic rays (see arXiv:1210.1847).
But in matrix models and noncommutative geometry a discretized spacetime is possible without explicitly breaking rotational symmetry, which can thus only be broken spontaneously.
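The simplest standard example is the fuzzy sphere of noncommutative geometry (invoked here only as an illustration): the coordinates of the sphere are replaced by N x N matrices proportional to SU(2) generators, so space becomes a finite set of degrees of freedom while rotational symmetry remains exact. A minimal sketch:

```python
import numpy as np

def su2_generators(N):
    """Spin-j generators of SU(2), j = (N-1)/2, as N x N matrices."""
    j = (N - 1) / 2.0
    m = j - np.arange(N)                 # magnetic quantum numbers j, ..., -j
    L3 = np.diag(m)
    Lp = np.zeros((N, N))                # raising operator L+
    for k in range(1, N):
        Lp[k - 1, k] = np.sqrt(j * (j + 1) - m[k] * (m[k] + 1))
    L1 = (Lp + Lp.T) / 2.0
    L2 = (Lp - Lp.T) / 2.0j
    return L1, L2, L3

N, r = 4, 1.0                            # matrix size and sphere radius
X = [2.0 * r / np.sqrt(N**2 - 1) * Li for Li in su2_generators(N)]

# The three matrix coordinates satisfy X1^2 + X2^2 + X3^2 = r^2 * identity:
# a sphere with finitely many degrees of freedom (N^2 of them) on which
# rotations still act exactly, by SU(2) conjugation of the matrices.
radius2 = sum(Xi @ Xi for Xi in X)
print(np.allclose(radius2, r**2 * np.eye(N)))   # True
```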
After we have simulated matter, spacetime and the Universe we can also ask about simulated cells and living organisms until we reach simulations of perceptions, consciousness and mental phenomena. But this is a quite different and more involved story.
All these simulations are assumed to be performed by a classical computer, but the whole logic is untouched if we consider quantum computers, which are more natural simulators of the quantum Universe. (Consciousness and related phenomena, however, are more naturally simulated by classical computers since they are classical, by virtue of the fact that all observers in quantum mechanics are necessarily classical, as emphasized by Bohr, who seems to have been influenced by Kant in his thesis that our understanding of the world is itself constrained by our faculty of understanding.)




.................................................
If the simulation hypothesis is correct and we are living in a simulation then the consequences for quantum physics and cosmology are profound and far reaching.
First, the demarcation between (physical) systems and (conscious) observers, and the implicit dualism found in the Copenhagen interpretation, become wholly consistent (as pointed out by Chalmers: "Cartesian dualism is not quite so outlandish and conceptually problematic as tends to be supposed").
The simulated Universe (which is naturally discrete and finite) then follows a mathematical algorithm which implements a fundamental theory of everything (still unknown), in the same way that a simulated proton follows a mathematical algorithm which implements the fundamental theory of the strong interactions known as quantum chromodynamics or QCD.
The Copenhagen interpretation is an interpretation, imposed by human observers (who are themselves simulated), of the laws of quantum mechanics. These observers see the unitary evolution of the state vector, the superposition principle, the statistical Born rule, non-locality and quantum indeterminism as describing very well all physical systems. These laws are faithfully or approximately (on a lattice or through a matrix model) encoded into the environment of the simulation.
These simulated conscious observers are like players in a giant virtual-reality game, and what they observe in the simulation is subject to another numerical law, which is the rendering of the content of the simulated environment.
This rendering of the content is what appears to the conscious observer as the law of the collapse of the wave function, where reality is finally experienced.
Therefore the simulation consists of two types of computations: the unitary (internal) computation of physical reality, including physical observers, and the non-unitary rendering of the simulated physical reality to the conscious observer (player), whose experience of this physical reality is what we call consciousness.
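A minimal toy sketch of these two computations (an illustration under our own simplifying assumptions, not a claim about how an actual simulation would be organized): a single qubit evolves unitarily, and "rendering" is a Born-rule sample followed by collapse.

```python
import numpy as np

rng = np.random.default_rng(1)

psi = np.array([1.0, 0.0], dtype=complex)   # qubit starts in |0>

def evolve(psi, theta):
    """Unitary (internal) computation: a rotation of the state vector."""
    U = np.array([[np.cos(theta), -np.sin(theta)],
                  [np.sin(theta),  np.cos(theta)]], dtype=complex)
    return U @ psi

def render(psi):
    """Non-unitary rendering: Born-rule sample, then collapse."""
    probs = np.abs(psi) ** 2
    outcome = rng.choice([0, 1], p=probs / probs.sum())
    collapsed = np.zeros(2, dtype=complex)
    collapsed[outcome] = 1.0
    return outcome, collapsed

psi = evolve(psi, np.pi / 4)    # unitary evolution into (|0> + |1>)/sqrt(2)
outcome, psi = render(psi)      # what the player/observer actually experiences
print("rendered outcome:", outcome)
```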
In all purely quantum systems the simulated reality is not computed (not even the unitary part) until the observer or player seeks the experience or observation. This is simply due to the high cost of the calculations and to the limited resources available to the simulator. This is then the statement that "the electron is not there until observed" (Bell's theorem).
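In programming terms this is simply lazy (on-demand) evaluation with caching; a toy sketch, where all names are hypothetical:

```python
from functools import lru_cache

# Lazy (on-demand) evaluation: the expensive computation of a region of the
# simulated environment runs only when an observer first looks at it, and the
# result is then cached. All names here are hypothetical illustrations.
@lru_cache(maxsize=None)
def compute_region(region_id):
    print(f"expensive computation of region {region_id} runs now")
    return f"state of region {region_id}"

observation = compute_region(42)   # first look triggers the computation
observation = compute_region(42)   # second look reuses the cached result
```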


But in classical systems (which act like weakly coupled systems in the sense that effectively we can set \hbar=0) the numerical calculation is low cost, and therefore the computer performs the unitary calculation well before the observation or rendering time. So the moon is really out there even when we are not looking at it.
The above definition of physical reality, as the connection between the rendering of the simulated environment to the player and the collapse of the wave function seen by the observer, is based implicitly on the assumption of finite computational resources and the requirement of low computational complexity (see arXiv:1703.00058).
Second, the simulation hypothesis also accommodates the many-worlds in a straightforward way. If we are living in a simulated reality it is easy to imagine that the being running the simulation has also computed the other parallel branches of the world. In fact this is only natural from the simulator's point of view. This being is the super-observer of the many-worlds formalism who observes the global structure of reality and not the local one associated with Copenhagen. Naturally, this being can herself be simulated, and the simulation hypothesis, if true, tells us that she certainly is.
Third, the Copenhagen interpretation under the guise of the simulation hypothesis is really nothing else but the Wigner interpretation (solipsism is no longer an issue). The simulation hypothesis therefore provides a unification scheme between the three interpretations: Copenhagen, Wigner and many-worlds.
Chalmers, David (January 1990). "How Cartesian Dualism Might Have Been True".
