Are we living in a simulated universe? – Part 2

Mar 13, 2014


THEY ARE SERIOUSLY ASKING:
Are we living in a simulated universe?

By Bradley Bartholomew
February 2014

There is a basic concept in experimental physics called Lorentz symmetry. Essentially, the experimental results should not reflect the orientation or framework of the experimental apparatus. When a cubic space-time lattice is used to simulate electrons and muons, at some point it becomes necessary to introduce a specific operator into the equations to fine-tune away the ‘lattice spacing’ artifacts. This operator has to do with recovering Lorentz symmetry in the lattice calculations. However, as the lattice spacing becomes vanishingly small compared with the scales of the system, Lorentz symmetry is recovered without the need to introduce this operator. They therefore calculate an approximate upper bound on the lattice spacing, below which the artifacts will not be observed. So they conclude that this breaking of rotational symmetry, if they were to observe it out there in the real physical world, “would be a solid indicator of an underlying space-time grid, although not the only one.” They go on to point out that “another scenario that gives rise to rotational invariance violation involves the introduction of an external background with a preferred direction”.

It just so happens that the upper limit for high-energy cosmic rays corresponds roughly with their upper limit for lattice spacings in their simulations, and “therefore, the lattice spacing used in the lattice simulation of the universe must be b ≲ 10⁻¹² fm in order for the GZK cut off to be present or for the lattice spacing to provide the cut off in the cosmic ray spectrum.” These cosmic rays are high-energy charged particles (normally the nuclei of atoms, but they can also be high-energy electrons, positrons and other subatomic particles) that come from outer space and strike the Earth’s atmosphere from all directions. The Greisen–Zatsepin–Kuzmin (GZK) limit is a theoretical limit on their upper energy, brought about by their interaction with the cosmic microwave background (CMB) over long distances. Because they are charged particles they are deflected by the Earth’s magnetic field, which could have considerable consequences for their rotational symmetry. A magnetic field would flip the spin of the particle, which would presumably change its ‘handedness’. Indeed the entire universe is said to have a magnetic field. The question arises whether the universe’s magnetic field would provide the other scenario they mention that gives rise to rotational invariance violation, namely ‘the introduction of an external background with a preferred direction’.
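To get a feel for these numbers, here is a back-of-the-envelope sketch of my own (not taken from their paper). It assumes only that a lattice with spacing b cannot represent energies above roughly πħc/b, and compares that cut-off with the observed GZK energy of about 5 × 10¹⁹ eV (5 × 10¹⁰ GeV):

```python
# Back-of-the-envelope check: what energy cut-off does a given lattice spacing imply?
# Assumption (mine, for illustration): a lattice with spacing b cannot represent
# energies above roughly pi * hbar * c / b.
import math

HBARC_GEV_FM = 0.1973    # hbar*c in GeV*fm (standard value)
GZK_CUTOFF_GEV = 5e10    # observed GZK cut-off, roughly 5e19 eV

def lattice_cutoff_gev(spacing_fm: float) -> float:
    """Maximum energy (in GeV) representable on a lattice with the given spacing (in fm)."""
    return math.pi * HBARC_GEV_FM / spacing_fm

for b in (1e-10, 1e-11, 1e-12):   # candidate lattice spacings in fm
    e_max = lattice_cutoff_gev(b)
    relation = "above" if e_max > GZK_CUTOFF_GEV else "below"
    print(f"b = {b:.0e} fm  ->  lattice cut-off ~ {e_max:.1e} GeV ({relation} the GZK energy)")
```

On this crude estimate, a spacing around 10⁻¹¹ to 10⁻¹² fm is roughly where the lattice cut-off and the GZK cut-off begin to coincide, which gives some intuition for the figure quoted above.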

The researchers go on to say: “The most striking feature of the scenario in which the lattice provides the cut off to the cosmic ray spectrum is that the angular distribution of the highest energy components would exhibit the cubic symmetry in the rest frame of the lattice, deviating significantly from isotropy. For smaller lattice spacings, the cubic distribution would be less significant, and the GZK mechanism would increasingly dominate the high energy structure. It may be the case that more advanced simulations will be performed with non-cubic lattices. The results obtained for cubic lattices indicate that the symmetries of the non-cubic lattices should be imprinted, at some level, on the high energy cosmic ray spectrum.” Presumably this means that if these rays were exhibiting isotropy they would be arriving from all possible directions, and if that were found not to be the case then it might be because they are simulated on a cubic framework. I can think of at least three other factors it might be due to: the fact that the universe is one gigantic magnetic field; the fact that the cosmic rays might be coming from specific directions in the first place, such as supernova explosions; and the fact that they might have passed through galactic magnetic fields which bent their paths in a consistent fashion.
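As a toy illustration of what “deviating significantly from isotropy” would mean (my own sketch, nothing from the paper), one can generate arrival directions on the sky and compute a simple cubic-anisotropy statistic such as the mean of x⁴ + y⁴ + z⁴ over the unit direction vectors, which is about 0.6 for an isotropic sky and approaches 1 if arrivals cluster along the axes of a cube:

```python
# Toy illustration: a cubic-symmetry statistic for cosmic-ray arrival directions.
# <x^4 + y^4 + z^4> is ~0.6 for isotropic directions and approaches 1.0 if the
# directions line up with the axes of a cubic lattice. Purely illustrative.
import numpy as np

rng = np.random.default_rng(0)

def cubic_statistic(directions: np.ndarray) -> float:
    """Mean of x^4 + y^4 + z^4 over unit vectors (one per row of `directions`)."""
    return float(np.mean(np.sum(directions**4, axis=1)))

# Isotropic arrival directions: normalised Gaussian vectors.
iso = rng.normal(size=(100_000, 3))
iso /= np.linalg.norm(iso, axis=1, keepdims=True)

# 'Lattice-aligned' arrivals: random choices among the six axis directions,
# smeared by a little noise, then normalised.
axes = np.vstack([np.eye(3), -np.eye(3)])
aligned = axes[rng.integers(0, 6, size=100_000)] + 0.1 * rng.normal(size=(100_000, 3))
aligned /= np.linalg.norm(aligned, axis=1, keepdims=True)

print("isotropic sky :", cubic_statistic(iso))      # ~0.6
print("axis-aligned  :", cubic_statistic(aligned))  # close to 1
```

A real analysis would of course have to disentangle any such signal from the magnetic-field and source-direction effects mentioned above.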

What can be said about this argument? The worst that can be said about these researchers is that they have a severe case of wanting to leap tall buildings in a single bound before they can even crawl. Leaving aside for the moment that much mystery still surrounds even the source of the cosmic rays, can they really be suggesting that the posthumans, with the most powerful quantum computer imaginable, would have used a lattice space-time foundation to simulate our universe of the same scale they themselves use with a conventional computer to simulate ‘femto-sized universes’? Their simulations are about the size of maybe two or three protons (several fermis), and yet they are dreaming about simulating our entire universe. In addition, it is evident from their argument that their femto-sized simulation is far from satisfactory. They say that they have successfully simulated the strong nuclear force, subject to this rotational symmetry breaking problem, and the electromagnetic force, but they are having insurmountable difficulties with the weak nuclear force and quantum gravity.

Actually it is quite likely that the universe is simulated on a space-time lattice framework, but even a non-physicist can deduce that the lattice spacing is likely to be of the order of the Planck length, which sets the lower limit to any ‘physical’ distance. The real issue is whether it is even remotely likely that the simulation has been done by our posthuman demigod descendants way, way, way in the future. Referring to an article by the Oxford philosopher Nick Bostrom, they say: “Extrapolations to the distant futurity of trends in the growth of high-performance computing (HPC) have led philosophers to question – in a logically compelling way – whether the universe that we currently inhabit is a numerical simulation performed by our distant descendants”. (italics are mine) And they conclude their article by saying that because it has been so easy in their simulations to correct these rotational symmetry breaking artifacts and preserve chiral symmetry (‘handedness’), it is ‘unlikely’ that any but the earliest simulations would have had patches applied. They lament the fact that these patches or ‘improvements’ would likely be effective in masking their ability to probe the simulation possibility.

Rather than try to pick holes in their argument, let’s try to take an overall view. For a start, these are simulated scientists claiming to be able to detect (and to suspect the authenticity of) the very latticelike structure in the external world upon which they themselves were simulated. It might help if these scientists went back and reviewed some elementary computer theory. This is the way Seth Lloyd, Professor of Mechanical Engineering at MIT, explains it: “Gödel showed that the capacity for self-reference leads automatically to paradoxes in logic; the British mathematician Alan Turing showed that self-reference leads to uncomputability in computers.” So if you try to give pseudo-physical computer-generated robots the capacity to ‘deconstruct’ themselves, the whole simulation would crash. In any event, any such latticelike structure is only mathematical in nature. It’s just factored into the image of the results. It’s like the real biological posthuman scientists being able to see the discrete gaps in their real physical world imposed by Planck’s constant. They can’t do it no matter how godlike they have become. The posthumans do not have to simulate EVERY SINGLE COSMIC RAY THAT HAS BEEN BUZZING AROUND IN THE UNIVERSE FOR 14 BILLION YEARS. All they have to simulate is the mathematical data that becomes our ENTIRE knowledge about these cosmic rays.

The argument advanced by these simulated scientists is purely mathematical and says nothing about what they are actually seeing. And after they do their experiments and get the ‘results’ they can actually see, of course these results will likewise be purely mathematical. To think the posthumans might have to intervene with a patch in the software because we are about to discover that we are virtual might make for a good episode of Star Trek, but from a philosophical point of view it is childish and delusional. The posthumans can tell us anything in mathematics about the foundations of these images we are seeing and we have to believe it. It’s my guess that far into the future, in some real physical world, some real biological posthuman demigod scientists are having a good bellylaugh right NOW. What could be funnier than a thinking toy that you have made actually coming out with some extremely complex mathematical argument that it is a toy?

All jokes aside, it’s not possible for a simulated being to see anything other than the mathematical foundation of the ‘external’ environment they are ‘seeing’. And the only reason why we can actually ‘see’ the mathematics is because the posthuman demigods have ‘enabled’ that in the simulation program. They simply generate an image in mathspeakese on the cortex of the brains of these scientists, of results that are thoroughly consistent with this immensely large number of cosmic rays being ‘real’.

The possibility that we are using computer-generated mathspeakese to address the question of whether we are simulated suggests maybe we should take another look at the mathematics. What about Heisenberg’s uncertainty principle, for example? We can never know all there is to know about a physical system in order to make a perfect simulation of it. Generally speaking, the uncertainty principle states that the more certain we are of the value of some physical quantity, the less certain we become of a complementary quantity. It applies to all aspects of an elementary particle. If we know its exact position, we can know nothing about its momentum. If we know its exact energy value, we can know nothing about the time that has elapsed. If we know its spin about the vertical axis, we can know nothing about its spin about a sideways axis. Essentially we can only know exactly 50% of the observables, and of the other complementary 50% we are totally ignorant, and these can include both position and time, which are crucial coordinates in any space-time lattice used as the framework for a simulation. Heisenberg’s uncertainty becomes a fundamental and insurmountable barrier to our having complete knowledge about the system. In addition to this there is the measurement problem. The math tells us that any measurement will tend to disturb the system we are measuring. So no matter how much computer savvy our descendants develop in the future, they themselves will hit this barrier as well. Ergo they will never actually be able to make a perfect simulation of something as modest as a chemical reaction in a test tube, let alone a perfect simulation of the entire universe.
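For reference, the textbook form of the position–momentum and energy–time relations the paragraph alludes to is:

```latex
\Delta x \,\Delta p \;\ge\; \frac{\hbar}{2}, \qquad \Delta E \,\Delta t \;\ge\; \frac{\hbar}{2}
```

The product of the paired uncertainties can never fall below ħ/2, which is the precise sense in which pinning down one member of a complementary pair exactly leaves the other completely undetermined.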

Let’s look at some other examples of our cherished mathematics as well. Maybe the math the posthumans use tells them with certainty that a particle ‘exists’ in a certain state, not some vague probability that if we make an observation we will find a particle in a certain state. The space in the real physical world where the posthumans live may have straight up-and-down Cartesian coordinates, or at least nice solid Riemann curvature they can actually ‘see’, not some weird mathematical hypothetical curved simulated space that we ‘think’ we live in. The posthumans can tell us anything in mathspeakese and we are programmed to believe it. What about these constants of Nature, these ad hoc numbers we have to stick into our equations to make them work? These could actually be the patches that they have had to apply to our simulation.

This raises a very interesting question. What if probability theory, the uncertainty principle, Einstein’s space-time, the measurement problem and the constants of Nature are merely the mathematical laws of Nature that have been encoded into our simulation? The real biological scientists in their real physical universe will evidently have the Theory of Everything. In their world they can point to any particular particle and tell you exactly where it is and exactly the speed at which it is travelling. They can tell you exactly the change in energy as well as the exact time that has elapsed. They can measure any quantity they like without collapsing a probability wave function. They don’t have to pluck bogus numbers out of the air to make their equations work. They can therefore do a perfect simulation of their real physical universe, but in the simulation they take care to impose upon us a more limited knowledge of the laws of Nature. Indeed it would be very prudent of them to do so, otherwise our simulated world could overtake their real world, and we would become their masters. Where’s Mr Spock when you need him? We need to get someone’s advice on this. Where’s my TV guide, dammit!
