Notes - The Fabric of Reality

June 9, 2025

Chapter 1: The Theory of Everything

David Deutsch, born in Haifa, Israel, and educated at Cambridge and Oxford Universities, is a member of the Quantum Computation and Cryptography Research Group at the Clarendon Laboratory, Oxford University. He laid the foundations for quantum computation and is an authority on the theory of parallel universes. His book, The Fabric of Reality, is praised for its "refreshingly oblique, provocative insights" and its insistence that quantum mechanics should be taken as an explanation for how the world truly works. Richard Dawkins describes Deutsch as a "deeply knowledgeable professional physicist" who is the "most eloquent spokesman" for the Many Universes interpretation, making it coherent with epistemology, biological evolution, and computation theory. The book is considered "unique" and "multiversal" for its argument that innumerable universes exist alongside the perceived "real" one. Paul Davies notes Deutsch as "one of Britain's most original thinkers" who challenges traditional notions of reality by interweaving physics, biology, computing, and philosophy.

The central motivation for the worldview presented in the book is the belief that modern science provides profoundly deep theories about the structure of reality that should be taken seriously as explanations, not merely as pragmatic tools. These theories are often counter-intuitive, leading to attempts to avoid their implications through ad hoc modifications or reinterpretations. The book investigates what reality would be like if these theories were true.

Understanding, rather than merely knowing facts, is central to the author's argument. Understanding isn't about memorizing every piece of information (e.g., all beetle species or planetary data), but about grasping the right concepts, explanations, and theories. A single, comprehensible theory, like Einstein's general theory of relativity, can encompass an "infinity of indigestible facts" and predict phenomena "in principle," even if practical computation is infeasible. The practical advantage of a formula is its ability to predict future observations and correct for errors in past data, but knowing a formula doesn't equate to understanding; understanding comes from explanation. For example, the true value of the general theory of relativity is that it reveals and explains the curvature of space and time, not that its predictions are slightly more accurate than Newton's theory's. Scientific explanation, in general, explains our experience by referring to an underlying, unexperienced reality, and its most valuable attribute is that it explains the "fabric of reality itself".

Some philosophers and scientists, known as instrumentalists, disparage explanation, viewing a scientific theory solely as an "instrument" for making predictions, with its entire content lying in predictive formulae. To them, explanations are "mere psychological props". Steven Weinberg, for instance, once suggested it doesn't matter whether predictions are ascribed to gravitational fields or space-time curvature. However, Deutsch argues that the ascribed cause matters, not just for theoretical physicists, but for practical applications. An oracle providing perfect predictions but no explanations would be of "strictly limited utility" because it couldn't help design, troubleshoot, or improve anything; it would only predict experimental outcomes if one already knew which experiments to ask about. Even in weather forecasting, explanations enhance imperfect predictions by allowing judgment of reliability and deduction of further relevant information. Positivism, an extreme form of instrumentalism, claims that all statements not describing or predicting observations are meaningless, a doctrine that is itself meaningless by its own criterion. The author asserts that prediction is part of the method of science, not its purpose. Theories are proposed to explain phenomena, and experimental tests are used to falsify rivals. The "overwhelming majority of theories are rejected because they contain bad explanations," not because they fail experimental tests, as illustrated by the untested "grass cure" for the common cold. The true purpose of science is to explain the world. These explanations often refer to unobserved entities like atoms, forces, and laws of nature, which are "part of the very fabric of reality". Explanations can also cover intrinsically unpredictable things, such as why a fair roulette wheel is unpredictable.

The common belief that the growth of knowledge makes understanding everything more difficult, leading to specialization (e.g., physics splitting into astrophysics, particle physics), is challenged. The author posits that a new theory, though explaining more and being more accurate, can also be easier to understand, making old theories redundant (e.g., Copernicus's heliocentric theory over Ptolemy's geocentric system, or Arabic numerals over Roman numerals). New theories can also unify existing ones, like Faraday and Maxwell's theory of electromagnetism. Better explanations in one field can improve understanding in others, making knowledge "structurally more amenable to being understood".

The distinction between understanding and "mere knowing" is crucial: understanding is about "why" and the "inner workings of things," while mere knowing is about "what". Understanding relates to laws of nature, coherence, elegance, and simplicity, and is a unique "higher function of the human mind and brain," dependent on creative thought. One can understand something without explicitly knowing it, as deep, general explanations cover unfamiliar situations. For example, understanding general relativity means one can deduce planetary motions given facts, rather than having every detail pre-memorized. However, applying existing theories to explain new phenomena (like quasars) can still constitute "genuinely new understanding" requiring creative thought, unlike merely calculating an orbit. While the stock of specific theories is growing, they are continually "demoted" as their understanding is subsumed by fewer, deeper, and more general theories.

Modern theories deduce specific applications (e.g., wall thickness) from deep explanations of materials and structures, which are general enough to apply to novel situations. While specialization exists (e.g., in architecture), a "deepening, unifying tendency" is also at work, as new ideas supersede, simplify, or unify existing ones, and extend understanding into previously unknown areas. In medicine, deeper biochemical explanations are replacing specific rules of thumb, leading to more general concepts and diminishing the role of specialists. The author's thesis is that depth is winning over breadth, meaning we are moving towards a state where one person could understand "everything that is understood". This does not imply soon understanding "everything there is" but rather the comprehensibility of what is understood, which depends on the fabric of reality being highly unified. Eventually, theories will become "so general, deep and integrated" that they form a single "Theory of Everything" that encompasses all known explanations for the whole of reality.

This "Theory of Everything" (TOE) is distinct from the particle physicists' "theory of everything," which would unify basic forces and particles but would not explain much beyond subatomic interactions. The particle physicists' term is motivated by the mistaken view that science is primarily reductionist, explaining things by analyzing them into components. Reductionism leads to a hierarchy of sciences from physics up to biology and psychology. However, higher-level sciences exist due to emergence, where "high-level simplicity 'emerges' from low-level complexity". Emergent phenomena are those with comprehensible facts not simply deducible from lower-level theories, such as life, thought, and computation. The opposite, holism (explanations only in terms of higher-level systems), is considered an even greater error. The author contends that scientific knowledge consists of explanations at every level, many of which are autonomous or explain low-level phenomena using high-level theories (e.g., a copper atom in a statue explained by concepts of war and tradition). These high-level explanations are not "second-class citizens". The truly privileged theories are those with the deepest explanations, not those at a particular scale of size or complexity.

Deutsch identifies four main strands that constitute our current understanding of the fabric of reality:

  1. Quantum theory: The deepest theory in physics, providing the explanatory and formal framework for modern physics.
  2. The theory of evolution: Primarily the evolution of living organisms.
  3. Epistemology: The theory of knowledge and how it's created.
  4. The theory of computation: About computers and what they can and cannot compute in principle.

These four seemingly independent subjects have such deep connections that understanding one requires understanding the others. Together, they form a "coherent explanatory structure" that is the "first real Theory of Everything," making understanding universal. The book aims to persuade readers that the fabric of reality is unified and comprehensible, leading towards a state where a single person could understand "everything that is understood".

Chapter 2: Shadows

This chapter introduces core concepts of quantum physics through experiments involving light and shadows, arguing that these seemingly simple phenomena lead to the extraordinary conclusion of parallel universes.

Imagine an electric torch in a dark, absorbent room. While intense light beams pass through each other without interaction, light itself is not infinitely divisible; it comes in discrete "lumps" or "atoms" called photons. In a faint light beam the individual photons are no dimmer than in a bright one; they are simply sparser. This property, where physical quantities appear only in discrete sizes, is called quantization.
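
As a standard way of stating this (the relation is textbook physics, not spelled out in the notes), light of frequency $\nu$ delivers its energy only in whole lumps of size $E = h\nu$, where $h$ is Planck's constant; dimming a beam reduces the number of lumps arriving per second, not the size of each lump.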

When light passes through small holes, it "frays" and spreads out, creating intricate patterns of concentric rings and colors, known as interference. If a barrier has multiple slits, opening additional slits can make a previously bright area dark, demonstrating that something passing through the added slits "interferes" with the light from the others. This interfering entity behaves exactly like light, being reflected, transmitted, or blocked by what affects light.
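
The standard two-slit formula (textbook optics, not quoted from the book) makes the "bright area turning dark" effect concrete. For two narrow slits a distance $d$ apart, illuminated by light of wavelength $\lambda$, the intensity on a distant screen at angle $\theta$ is approximately

    $$I(\theta) \propto \cos^2\!\left(\frac{\pi d \sin\theta}{\lambda}\right)$$

(ignoring the single-slit diffraction envelope). The intensity vanishes wherever $d\sin\theta$ is an odd multiple of $\lambda/2$, so opening the second slit creates dark fringes at places that were illuminated when only one slit was open.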

The critical experiment involves performing these interference tests with only one photon at a time. Counter-intuitively, the interference pattern still forms. This leads to inescapable conclusions: when a single photon passes through one slit, something interferes with it that has passed through the other slits, and these interfering entities behave exactly like photons, except they "cannot be seen". Deutsch calls these unseen entities "shadow photons," in contrast to the "tangible photons" that are detected. Each tangible photon is accompanied by a vast "retinue of shadow photons," meaning there are many more shadow photons than tangible ones (at least a trillion in a typical lab setup).

This interference phenomenon is not unique to photons; it occurs for every type of particle (e.g., shadow neutrons, shadow electrons). This leads to the profound implication that reality is far bigger and mostly invisible, with our observable universe being just the "merest tip of the iceberg".

The tangible particles we observe collectively form a "universe" because they interact and are directly detectable. The crucial insight is that shadow particles are also partitioned among themselves in the same way, forming a "huge number of parallel universes". Each parallel universe is similar in composition to our tangible one, obeys the same laws of physics, but has particles in different positions. The traditional term "universe" now refers only to the tangible portion, while "multiverse" is coined for "physical reality as a whole". The existence of the multiverse is directly demonstrated by single-particle interference experiments. The stopping of shadow photons by opaque barriers occurs because they interact with shadow atoms in shadow barriers within their own parallel universes. Thus, parallel universes interact weakly only through interference phenomena.

Deutsch laments that the multiverse concept is a minority view among physicists, attributing this to those who are satisfied with "mere prediction" and lack a strong desire for deeper understanding, aligning with instrumentalist or positivist philosophies. He argues that pragmatic physicists' suggestions (e.g., that photons "behave as if" invisible entities exist) are incoherent, because the observed behavior requires something real to be passing through the other slits.

He points out that Hugh Everett was the first to clearly understand that quantum theory describes a multiverse. The debate among physicists about whether quantum theory forces acceptance of parallel universes is "wrong-headed". Instead, the observed single-particle interference phenomena themselves tell us that parallel universes exist, even without deep theories. Therefore, "the quantum theory of parallel universes is not the problem, it is the solution"; it is the only tenable explanation of this reality.

The chapter concludes by severing the temporary distinction between "tangible" and "shadow". From an objective viewpoint, there are not two kinds of photons or universes; tangibility is relative to a given observer. All copies of an object or observer in the multiverse are equally real and none has a privileged position; if one "feels subjectively" tangible, all others feel the same about themselves.

Chapter 3: Problem-solving

This chapter reflects on the nature of scientific reasoning, particularly how momentous conclusions (like parallel universes) can arise from observations of "ever smaller physical effects". The author questions what justifies these inferences, especially since all empirical evidence is indirect, reaching us as "patterns of weak electric current trickling through our own brains".

A critical point is that these inferences are not a matter of logical deduction. Solipsism – the theory that only one mind exists and external reality is a dream – cannot be logically disproved from any observational evidence. This implies that one can logically deduce nothing about reality from observations. Therefore, when observations "rule out" a theory (like a single universe or Newtonian physics), it doesn't mean "disproving" it in a logical sense.

The chapter then critiques inductivism, the common-sense theory of scientific knowledge. Inductivism posits that theories are discovered by "extrapolating" observations and justified by repeated observations. The author argues that inductivism is "profoundly false". Theories are not mere generalizations from observations; for example, the parallel universe theory wasn't induced from observing multiple universes one by one. Repeating observations doesn't necessarily convince us of theories, which are explanations, not just predictions. Bertrand Russell's "chicken story" illustrates that repeated observations cannot justify predictions, as future events can contradict past patterns. More fundamentally, extrapolating observations is impossible without an existing explanatory framework; the same observations can be "extrapolated" to diametrically opposite predictions based on different explanations (e.g., a farmer's increased feeding could mean benevolence or fattening for slaughter). Thus, inductivism is "false, root and branch".

Instead, Deutsch advocates an explanation-centered theory of knowledge, focusing on how explanations arise and are justified. Karl Popper's theory regards science as a problem-solving process. It begins not with observations, but with an existing theory that seems inadequate, thus constituting a "problem".

The problem-solving process involves several stages:

  1. Problem: An inadequate existing theory or surprising observations.
  2. Conjectured Solutions: New, tentative theories are proposed to solve the problem.
  3. Criticism: These conjectures are examined and compared to find the "best explanations". Experimental testing is a crucial part of this criticism, where conflicting predictions from rival theories are tested, and false theories are abandoned.
  4. Replacement of Erroneous Theories: If a theory fails criticism, it is abandoned; if a new theory is adopted, the problem-solving is tentatively successful.

A correct prediction doesn't confirm an explanation; only an incorrect prediction makes an explanation unsatisfactory. Untestable theories are rejected because they cannot explain why events happen in a particular way. Science prefers theories that offer "more detailed explanations" as they expose themselves to more criticism and leave less unexplained. Most scientific criticism directly targets explanations, with experimental testing being an indirect but powerful method. Many theories are rejected without testing for being "bad explanations" that add unexplained assertions (e.g., the "grass cure" for the common cold).

The process is not strictly linear, involving "repeated backtracking" and reformulation of problems and solutions. This dynamic, competitive process, where theories are subjected to "variation and selection," resembles biological evolution. Popper called this an evolutionary epistemology. However, a key difference is that in human problem-solving, the creation of new conjectures is purposeful and knowledge-laden, unlike random biological mutations, and involves argument, which is absent in biological evolution. Both depend on the survival of "objective knowledge" (adaptation).

The justification for inferences drawn from observations is that we never draw inferences from observations alone; observations become significant when they reveal deficiencies in contending explanations. We choose a theory because arguments (only a few of which are observational) show its explanations to be superior to rivals. This problem- and explanation-based view of science resolves the "problem of induction" because there's no mystery in tentatively accepting the best available explanation.

Chapter 4: Criteria for Reality

The chapter explores how we determine what is real, drawing on Galileo's conflict with the Church over the heliocentric theory. Both Galileo and the Church were realists, believing in an external physical universe that affects our senses. Galileo's "dangerous way of thinking" was his belief that reliable knowledge of universal, mathematical laws was accessible through systematic experimental testing, taking precedence over common sense and religious doctrine. The Church accepted the heliocentric theory for predictions but denied its truth about reality, highlighting a fundamental disagreement about the explanatory nature of scientific knowledge.

The author argues that a Popperian defense (science as problem-solving) is insufficient against such skepticism, as problem-solving occurs within the human mind. The Inquisition's theory (geocentric view) is shown to be a "bad explanation" because it could only explain planetary motions by faithfully mimicking the heliocentric theory's predictions. This means it was "understood only in terms of a different cosmology, the heliocentric cosmology that it contradicts but faithfully mimics". Such a theory is indefensible because it makes its rival necessary for its own explanation.

The Inquisition's theory, like solipsism and its variants (e.g., creationism claiming recent creation with misleading evidence, behaviorism denying inner mental processes), draws an "arbitrary boundary" beyond which human reason cannot access reality. These theories see scientific rationality as a "mere game". They seek an "ultimate source of justification" outside this boundary, like divine revelation or direct experience.

The author refutes solipsism with a "philosophical joke" about a professor defending solipsism to students. The core argument is that if something (like "dream-people" or external reality) behaves as if it is independent and complex, it is independent and real. If a solipsist attributes these independent behaviors to their "unconscious mind," they are merely redefining "myself" to include a vast, complex, autonomous structure that behaves exactly as if it were external reality. Thus, solipsism is "realism disguised and weighed down by additional unnecessary assumptions". Similarly, positivism is indefensible as it claims its own meaninglessness.

The idea of a hierarchy of reliability (mathematical > scientific > philosophical) is dismissed as a "reductionist mistake". Explanations are justified by their "superior ability, relative to rival explanations, to solve the problems they address," not by their origin or means of derivation. Some philosophical arguments are "far more compelling than any scientific argument," and mathematical arguments derive their reliability from underlying physical and philosophical theories, meaning no knowledge yields absolute certainty.

The central criterion for reality used in science is articulated through Dr. Johnson's famous refutation of Bishop Berkeley's solipsism by kicking a rock. Dr. Johnson's criterion: "If it can kick back, it exists". This means if an object affects us in ways that require independent explanation, it is real. Galileo's observations of light from planets "kicked back" his retina, allowing him to infer the reality of heliocentric motions. Similarly, shadow photons "kick back" by interfering with visible photons, thus proving their existence. Postulating invisible "angels" instead of "invisible photons" is a worse explanation because there's no independent reason for angels to behave like photons. The imperceptibility of phenomena like Earth's motion or parallel universes doesn't negate their reality; they are "perceptible" through scientific instruments.

The complexity and autonomy of an entity's behavior are key evidence for its reality. Simpler explanations are preferred, especially those that account for complexity. This leads to a refined Dr. Johnson's criterion: "If, according to the simplest explanation, an entity is complex and autonomous, then that entity is real". Applying computational complexity theory, this means: "If a substantial amount of computation would be required to give us the illusion that a certain entity is real, then that entity is real". Denying the existence of shadow photons by proposing "action at a distance" is untenable, as calculating such an action would require the same computational effort as working out the history of myriad shadow photons. The irreducible complexity of this story necessitates their existence.

Galileo's discovery wasn't just about reliable scientific reasoning; it was a fact about physical reality itself: that it is "science-friendly" and "saturated with evidence". This evidence is openly accessible and consistent across different locations and times (e.g., astronomical evidence throughout the multiverse). Physical reality exhibits self-similarity, where patterns are endlessly repeated across space and universes (e.g., planets moving in ellipses, made of the same elements). Reality contains not only evidence but also the means (human minds, artifacts) of understanding it: "there are mathematical symbols in physical reality", in the sense that laws and explanations are physically embodied in our brains and artifacts. This physical resemblance between the parts of reality that embody our theories and the aspects of reality they describe is what constitutes knowledge.

Chapter 5: Virtual Reality

The theory of computation is not merely abstract mathematics; computers are physical objects, and computations are physical processes governed by the laws of physics. The concept of universality in computation, meaning a universal computer can mimic any other, is significant because universal computers can be built and can compute the behavior of physical and abstract entities, demonstrating a "self-similarity of physical reality".

Virtual reality (VR) might seem to support arguments against the reliability of senses, as a flight simulator can "kick back" realistically even for non-existent objects. However, this doesn't invalidate Dr. Johnson's criterion. The computer generating the simulation is a real physical object, and its calculations are external, complex, and autonomous, thus passing the test for reality. The fact that it's simulating a non-existent object is irrelevant to refuting solipsism; as long as its response is complex and autonomous, it indicates something real outside the observer.

While VR can "pretend" to be objects obeying "false laws of physics," it's not a sign of human perceptual limitation. Instead, the existence of virtual reality is a fundamental property of the multiverse at large, which is "essential for both realism and science" and "makes science possible".

The ultimate limits of VR involve determining what environments can, "in principle," be artificially rendered, ignoring transient technological limitations but considering logical and physical principles. A virtual-reality generator provides the user with experiences of a specified environment, which are external to the user's mind. Its repertoire is the set of environments it can render.

The laws of physics impose no limit on the range and accuracy of image generators. Any human sensation or sequence of sensations can, in principle, be artificially rendered ("feelies" that engage all senses). Perfect accuracy means the rendering is indistinguishable from the intended sensation by the user.

Interactive VR is key, as the generator composes images based on continuous input from the user's actions. This requires sensors (e.g., nerve-impulse detectors), image generators (e.g., nerve-stimulation devices), and a controlling computer. Once the finite, species-specific engineering problems of connecting to the brain are solved, the main challenge shifts to programming the computer to render various environments. The accuracy of a rendered environment is a logical property of the program, depending on how the environment would respond to every possible action of the user, not just the actions actually taken.

Accurately rendering a physically possible environment requires understanding its physics. The program in a VR generator embodies a general, predictive theory of the rendered environment's behavior. Conversely, discovering the physics of an environment involves creating a VR rendering of it. When a computer describes an eclipse, for example, the words evoke an "interactive" likeness in the reader's mind, a "general method of creating many different images" that can be tested; thus, science and VR rendering of physically possible environments are "two terms denoting the same activity".
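
As a minimal sketch of that idea (the component names here are hypothetical, not Deutsch's; this only illustrates a program embodying a general, predictive theory of the rendered environment), an interactive rendering loop might look like this:

    # Illustrative sketch of an interactive virtual-reality loop (Python).
    # "sensors" and "image_generator" stand in for the nerve-impulse detectors
    # and nerve-stimulation devices mentioned in the notes.

    class RenderedEnvironment:
        """Holds the environment's state plus a predictive theory of its behaviour."""

        def __init__(self, initial_state, theory):
            self.state = initial_state
            # The theory must specify what the environment would do in response
            # to any possible user action, not just the actions actually taken.
            self.theory = theory

        def step(self, user_action, dt):
            self.state = self.theory(self.state, user_action, dt)
            return self.state

    def run(environment, sensors, image_generator, dt=0.01):
        """Sense the user's actions, evolve the environment, render the result."""
        while True:
            action = sensors.read()               # detect what the user does
            state = environment.step(action, dt)  # predict the environment's response
            image_generator.render(state)         # feed the result back to the user's senses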

Every VR generator, regardless of its program, is always rendering some physically possible environment (e.g., the flight simulator itself). Physically impossible environments can only be rendered if they are not perceptibly different from physically possible ones; "the fiction is always an interpretation in the mind of the beholder".

The concept of VR extends beyond physical simulations: mathematics is a form of virtual reality. When we imagine abstract mathematical entities, we are imagining an environment whose "physics" embodies their properties. Imagination itself is a straightforward form of virtual reality. More profoundly, our "direct" experience of the world through our senses is also virtual reality, generated by our unconscious minds using inborn and learned theories to interpret sensory data. Realists believe in an objective external reality, but we "never experience that reality directly". All our external experience and knowledge (even of non-physical worlds like logic or art) is encoded as programs for the brain's VR generator. Thus, all reasoning and thinking are forms of VR. Biologically, VR rendering of their environment is the characteristic means by which human beings survive.

Chapter 6: Universality and the Limits of Computation

The chapter asks about the ultimate scope of virtual reality: could a universal VR generator render any logically possible environment? Physical laws impose drastic restrictions: programs must be quantized, consist of a finite number of symbols, and be executed step by step in discrete operations. So, although there are infinitely many possible programs, there remain infinitely many logically possible environments that cannot be rendered, which Deutsch calls Cantgotu environments.
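
A compact version of the diagonal argument behind this claim (paraphrased; the details follow the standard Cantor-style construction): because each program is a finite string over a finite alphabet, the programs of a given generator can be listed as $P_1, P_2, P_3, \dots$. Define an environment $E^*$ which, in its $n$-th subjective minute, behaves differently from whatever the environment rendered by $P_n$ does in its $n$-th minute. Then $E^*$ differs from every renderable environment in at least one minute, so no program on the list renders it, and varying the construction yields infinitely many such Cantgotu environments.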

The feasibility of a universal VR generator depends on the existence of a universal computer – a single machine capable of calculating anything that can be calculated. This leads to a stronger, physical version of the Church-Turing conjecture, which Deutsch calls the Turing principle: it is possible to build a universal virtual-reality generator, whose repertoire includes every physically possible environment. This principle implies a self-similarity in reality where the laws of physics can be embodied in other physical objects (the knower) and that knowledge-creating processes (science) are physically possible. In essence, the laws of physics "mandate their own comprehensibility". The physical possibility of building a universal VR generator means such machines "must actually be built in some universes" within the multiverse.

The author explains that our own external experiences are a form of virtual reality generated by our brains, and these renderings are always inaccurate. However, these inaccuracies are the starting point for scientific reasoning, driving us to solve problems about reality. Even if we were trapped in a VR programmed with "wrong laws of physics," we could still learn about the external reality. This is because scientists would seek explanations for the rendered environment's entities, and for most VRs, these entities are designed in the external reality, thus their explanation lies outside the VR. The rules of an internal environment (like a chess game) can contain "fossil evidence" of an external, evolutionary history (e.g., complex chess moves point to rule evolution). To forever fool inhabitants, a rendered environment would have to be "self-contained as regards explanations," which is likely impossible for any part of reality less than the whole. Reasoning from one's own existence (anthropic reasoning) and seeking explanations can reveal external reality.

Chapter 7: A Conversation About Justification (or 'David and the Crypto-inductivist')

This chapter addresses the persistent "problem of induction" in philosophy, which, despite Popper's claim to have solved it, remains contentious. The problem is: how can we justify any conclusion about the future based on past evidence, given the invalidity of inductive reasoning? Many philosophers are "crypto-inductivists," worrying about this problem even if they don't explicitly use inductive methods.

Through a dialogue with a "Crypto-inductivist," Deutsch explains the Popperian methodology of justification. Deutsch asserts that reliance on a theory (e.g., gravity, predicting a fall from a tower) is justified, not by past experimental evidence alone, as an "infinity of theories" are consistent with past outcomes. Instead, justification comes from argument. Experiments play a "pivotal role" by refuting rival theories. Popper defines "rival theories" as actual contenders in a rational controversy, not all logically possible theories. Corroboration is not merely confirming instances, but the experimental refutation of rival theories.

The "Floater" scenario (a theory that David Deutsch floats if he jumps from a tower) illustrates that a theory can be consistent with past observations but still be a "bad explanation". Such a theory is rejected because it introduces an "unexplained qualification" that "spoils the explanation of gravity" without solving any problem. The problem lies in the substance of the theory, not its linguistic form; a language can be invented to make the "floater" theory seem unqualified, but its inherent defect of being an unexplained modification remains. Languages themselves embody implicit theories, and their structure reflects the "current state of the speakers' problem-situation". A theory that postulates an anomaly without explaining it is rejected because "any postulate which solves no problem is to be rejected".

This objective distinction between good and bad explanations provides the solution to the problem of induction: we justify relying on a theory because its explanations are better than any known rivals, not because we assume the future will resemble the past. The "explanation principle" (the criterion for rejecting problem-solving postulates) does not imply a "principle of induction" that could be used to select theories. Our theories simply assert things about the future, and we find out in what respects the future resembles the past after we have the theory.

The principles of rationality themselves are justified by argument; for instance, laws of deduction are justified because no explanation is improved by replacing them. Logical reasoning, like scientific reasoning, is a physical and fallible process. The Turing principle (discussed in Chapter 6) is a fundamental law of physics that explains how the universe creates knowledge about itself. Even if the Turing principle cannot be definitively "justified" as true, its validity does not detract from the justification of scientific theories, as it does not make their explanations worse.

Finally, the dialogue clarifies that arguments and explanations are not like mathematical proofs that proceed from fixed axioms; they "start in the middle" and improve through criticism, including of their own initial "axioms". Argument's purpose is to solve problems, not to be justified by an ultimate source.

Chapter 8: The Significance of Life

The chapter challenges the prevailing modern scientific view that life is theoretically insignificant and a mere side-effect of physics and chemistry. Historically, Aristotle believed life was fundamental due to unique animate matter and its dominance on Earth, while modern science, especially after Darwin, viewed life as a "remote offshoot" of physics. School definitions of life (e.g., motion, respiration) were seen as descriptive and inaccurate, failing to capture life's essence.

Modern biology defines life not by an "essence" but by molecular replicators. A replicator is "any entity that causes certain environments to copy it". Examples include computer viruses and memes (human ideas like jokes). All Earth life is based on genes (DNA molecules) which are replicators that direct the manufacturing of proteins and the creation of new organisms.

Adaptation is defined as "the degree to which the replicator contributes causally to its own replication in that environment". This requires considering not just what a replicator does, but what a "vast number of other objects, most of which do not exist," would do in various environments. This concept of adaptation, understood through Dawkins's "The Selfish Gene," shows organisms as part of the environment of genes.

The author argues that life is not theoretically derivative or physically insignificant. Genes, while molecules, are explained by "complex programs embodied in them," which are fundamentally different from how inanimate matter is explained. Furthermore, life is a form of virtual-reality generation. Living cells executing gene programs are "exquisitely accurate, interactive control" systems that aim to replicate genes by manufacturing complex environments (organisms). The "intention" of genes to replicate can be inferred from Darwin's theory; they succeed by actively "rendering" an environment that ensures their continued existence. Life is fundamentally about knowledge; an entity is adapted if it "embodies knowledge that causes the niche to keep that knowledge in existence". This connects to the Turing principle, which states the possibility of embodying physical laws in VR programs. In fact, all existing or future VR programs are "direct or indirect effects of life". Life is presumably the necessary means by which virtual reality was first realized in nature.

Life's physical impact is far from insignificant. The future of the Sun, for example, depends on the knowledge our descendants acquire and how they apply it (e.g., controlling the Sun or emigrating). Predicting the future of stars and galaxies requires making assumptions about the distribution and behavior of intelligent life, showing that "the gross physical development of the universe" depends on it. Underestimating life's impact is a "parochial" mistake, as the vast future of the universe offers "plenty of scope for life to affect and, in the long run, to dominate everything that happens". Life achieves its effects not through mass or energy, but by being "more knowledgeable". Knowledge is thus "at least as significant as any other physical quantity" in its effect on physical processes. Examples include human efforts in building the Great Wall of China and plants creating the Earth's oxygen.

The multiverse perspective reveals a "physically special" property of knowledge-bearing matter. A single cosmic ray hitting DNA causes a range of different mutations in different universes, highlighting the trans-universe structure. While genes and non-genes may look identical in one universe, in the multiverse, genes are "patterns that extend across many universes," forming a "crystal in the multiverse". Non-genes are just "irregularities" across universes. The largest-scale regular structures across the multiverse exist where knowledge-bearing matter (brains, DNA) has evolved. Thus, knowledge is a fundamental physical quantity, and the phenomenon of life is only slightly less so.

Chapter 9: Quantum Computers

This chapter delves into quantum computers and their profound implications for understanding reality, especially in the context of the multiverse. The laws of physics allow for continuous improvements in theories and computational tractability in discovering them, implying that the "fabric of reality must be, as it were, layered, for easy self-access". This suggests that the Turing principle, which states the possibility of universal computation, implies that universal virtual-reality generators can be built efficiently, not requiring "impracticably large resources" for simple renderings.

The concept of intractability is introduced with the example of factoring large numbers: classical computers take an infeasible amount of time (millions of years for 250-digit numbers), unlike trivial multiplication. This intractability highlights a computational limit for classical physics.
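
A toy illustration of this asymmetry (not an example from the book; the obvious trial-division method below stands in for the far better, but still super-polynomial, classical factoring algorithms):

    # Toy illustration (Python) of the multiply-easy / factor-hard asymmetry.
    # Real cryptographic moduli have hundreds of digits; the primes here are tiny.

    def multiply(p, q):
        # Multiplication is trivial even for numbers with hundreds of digits.
        return p * q

    def trial_division_factor(n):
        # The obvious factoring method: try every candidate divisor up to sqrt(n).
        # The number of candidates grows roughly like 10**(digits/2), which is why
        # this approach becomes hopeless long before n reaches 250 digits.
        d = 2
        while d * d <= n:
            if n % d == 0:
                return d, n // d
            d += 1
        return n, 1  # n itself is prime

    if __name__ == "__main__":
        p, q = 1299709, 15485863          # two known primes
        n = multiply(p, q)                # effectively instantaneous
        print(trial_division_factor(n))   # already vastly slower than the multiplication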

The "butterfly effect" (small changes causing large unpredictable outcomes) based on classical chaos theory is described as not holding true in any single universe, because perfect determinism is not a feature of reality. Instead, quantum mechanics predicts that groups of universes, initially nearly identical (e.g., a butterfly's wings up vs. down), remain nearly identical as groups but differentiate internally over time.

Quantum theory, though usually presented as probabilistic, predicts a single, definite outcome shared by all universes in many experiments, even when the universes differed at intermediate stages. This is seen in interferometers, where two versions of a photon in different universes interfere, causing both versions to take the same exit route and making all universes identical at the end of the experiment. This demonstrates that there is a "lot more happening in a quantum-mechanical environment" than meets the eye. The complexity of these quantum computations, in which the number of different histories increases exponentially with the number of interacting particles, further supports the existence of the multiverse by Dr. Johnson's criterion.
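
The textbook calculation behind the interferometer claim (standard quantum optics, not quoted from the book): idealize the interferometer as two balanced beam splitters applied in succession, ignoring mirror phases, with each beam splitter acting as the unitary

    $$B = \frac{1}{\sqrt{2}}\begin{pmatrix} 1 & i \\ i & 1 \end{pmatrix}, \qquad BB = \begin{pmatrix} 0 & i \\ i & 0 \end{pmatrix}.$$

A photon entering port 1 is split by the first beam splitter into two versions, one in each arm (one in each universe, in the book's language); after the second beam splitter it emerges from port 2 with certainty, up to an overall phase, so every universe ends the experiment with the photon at the same exit.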

The multiverse is described as a continuum of universes, not a discrete set. Each type of universe is present as a "certain tiny but non-zero proportion of the multiverse," and these proportions change continuously under quantum-mechanical laws. While transitions appear continuous objectively, they remain subjectively discontinuous from the perspective of an individual universe, but this is a "limitation of the diagram, and not a real feature".

The most significant aspect of quantum computation for reality is Shor's algorithm, which allows a quantum computer to factorize large numbers efficiently. This has "great significance for cryptography," as widely used public-key encryption (like RSA) relies on the intractability of factorization for classical computers. Deutsch poses a challenge to single-universe proponents: "explain how Shor's algorithm works". If a number is factorized using roughly 10^500 times the computational resources that can be seen to be present in the visible universe, "where was the computation performed?". This points to the vast computational resources of the multiverse. Quantum cryptography, which relies on manipulating individual photons for secure communication, currently has limited range but is an area of ongoing research.
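
To make the dependence on factoring concrete, here is a toy RSA key setup (illustrative only, with numbers far too small to be secure; not an example from the book). Publishing n = p*q is safe only while factoring n is intractable, which is precisely the assumption Shor's algorithm would break:

    # Toy RSA in Python (requires Python 3.8+ for pow(e, -1, phi)).
    p, q = 61, 53                 # secret primes
    n = p * q                     # public modulus: 3233
    phi = (p - 1) * (q - 1)       # 3120, computable only if you know p and q
    e = 17                        # public exponent, chosen coprime to phi
    d = pow(e, -1, phi)           # private exponent: modular inverse of e mod phi

    message = 65
    ciphertext = pow(message, e, n)    # anyone can encrypt with the public key (e, n)
    recovered = pow(ciphertext, d, n)  # only the holder of d can decrypt
    assert recovered == message

    # An attacker who factors n obtains p and q, hence phi, hence the private key:
    d_attacker = pow(e, -1, (p - 1) * (q - 1))
    assert d_attacker == d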

Deutsch emphasizes that the fundamental importance of quantum computation lies not in its technological practicality or when it will be built, but in what studying it theoretically reveals about the deep connections between the laws of physics, universality, and other "strands of explanation" of the fabric of reality.

Chapter 10: The Nature of Mathematics

This chapter addresses the existence and nature of abstract, non-physical entities, particularly in mathematics, and their place in the fabric of reality. The author asks how we understand these entities and which of them require attributing an "independent existence". Applying Dr. Johnson's criterion ("if it can kick back, it exists"), if an abstraction "kicks back in a complex, autonomous way," it is real. For example, the natural numbers (1, 2, 3...) and prime numbers, while defined abstractly, reveal unexpected properties and intricate structures through mathematical proofs, akin to "kicking back" and requiring deep explanations.
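
A classic illustration of this "kicking back" (a standard proof, not one quoted in the notes): suppose the primes formed a finite list $p_1, p_2, \dots, p_k$. The number $N = p_1 p_2 \cdots p_k + 1$ leaves remainder 1 on division by each $p_i$, so any prime factor of $N$ is a prime missing from the list, a contradiction. The bare definition of "prime" thus forces an unexpected, objective consequence on us, one that had to be discovered rather than stipulated.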

The traditional view of mathematics holds that proofs make no reference to the physical world, can be performed in the mind, and yield absolute certainty. This contrasts with science, where observation is the arbiter, yielding only tentative knowledge. Deutsch asserts that the idea of mathematical certainty is a myth.

Historically, Plato believed in a realm of "Forms" where perfect abstract entities like numbers and circles exist, with the physical world being merely their imperfect "shadows". He argued that our knowledge of these perfect forms comes from inborn memories, providing absolute certainty that experience cannot. This idea, that mathematical knowledge comes from a "special" source (now called mathematical intuition) that confers absolute certainty, is still "accepted uncritically by virtually all mathematicians".

Controversies in mathematics, such as the use of infinite numbers, led to efforts to formalize proof theory. Intuitionism, an "extreme conservative strategy" advocated by Luitzen Egbertus Jan Brouwer, restricted mathematical reasoning to only "unchallengeably self-evident" aspects, denying the existence of infinite entities and even rejecting the law of the excluded middle. Deutsch argues that intuitionism is the mathematical expression of solipsism, a retreat into an inner world that ultimately fails to explain much of what it purports to encompass, introducing "unexplained complications".

David Hilbert proposed to establish certainty in mathematics through consistency, by defining a complete, finite, self-consistent set of inference rules that could prove their own consistency. However, Kurt Gödel's incompleteness theorems (1931) definitively refuted Hilbert's program:

  1. No set of rules capable of validating ordinary arithmetic proofs can prove its own consistency.
  2. For any consistent set of rules, there exist valid proof methods that those rules fail to designate as valid. Gödel showed how to construct a proposition that is true but can neither be proved nor disproved using those rules.

Gödel's results imply that there will never be a fixed method for determining mathematical truth or generating new knowledge. Progress in mathematics will always depend on creativity, inventing new proof types, and validating them through explanations based on improving understanding. Gödel's own proofs were "self-evidently valid" only if one first understood their accompanying explanation. Thus, explanation plays the same paramount role in pure mathematics as in science. Proof and observation are merely means to check our explanations.

Roger Penrose argues that the human mind's ability to grasp new, infallible mathematical proofs (despite Gödel's theorems) is incompatible with the Turing principle, suggesting the brain cannot be fully understood as a computer. Deutsch disagrees, noting that mathematicians often have conflicting intuitions and that there's no inherent "bell" that rings for a truly valid proof. Even if Penrose's proposed theory were true, it wouldn't guarantee certainty in mathematical knowledge.

Mathematical knowledge is derivative and depends on our knowledge of physics. Our understanding of perfect circles from diagrams depends on theories about the physical resemblance, which are not certain. Symbolic proofs, like diagrams, are physical objects (ink on paper) whose reliability relies on our theories of their physical behavior mimicking abstract entities. Virtual reality can render perfect Euclidean circles, allowing interactive experience and proof, demonstrating that even imperfect renderings can yield reliable knowledge if the imperfections are understood. Our imagination is this kind of VR rendering.

Proof is a physical process, a type of computation. No one can guarantee that a previously valid proof won't one day reveal a profound misconception based on a hidden assumption. Proof theory is not a branch of mathematics; it is a science about ensuring physical processes correctly mimic abstract entities. Gödel's theorems are about these physical processes of proof, not pure logic. The intuition that proofs consist of finitely many steps is a physical requirement, not a logical one, and classical physics would not have enforced it.

Relationships between abstract entities are "absolutely necessary truths," forming the subject-matter of mathematics, but this does not mean our knowledge of them is certain. The objective of mathematics is mathematical explanation, not certainty or even truth. Mathematical knowledge works because some of our physical knowledge is equally reliable, allowing us to understand which physical objects can model abstract ones. Inborn intuitions, like the eye's false theory of yellow light, do not confer special authority on knowledge.

Therefore, the fabric of reality is unified; mathematical knowledge is not more certain. Mathematical entities are real (complex, autonomous, objective, independent of physics), and physics provides a "narrow window" to this realm, allowing us to gain knowledge of only the "infinitesimal minority" of truths that can be rendered in virtual reality. However, "incomprehensible mathematical entities" exist too, as they appear "inextricably in our explanations of the comprehensible ones".

Chapter 11: Time: The First Quantum Concept

The common-sense theory of time, which posits that time "flows" and our consciousness "moves" through it (as depicted by a moving "present moment" in grammatical diagrams), is fundamentally "nonsensical" and "factually inaccurate". Logically, a "moving present" would imply time changing within time, which is incoherent. Objectively, no moment is privileged as "now," just as no position is privileged as "here". We do not experience time flowing; rather, we experience differences between present perceptions and memories of past perceptions, which we incorrectly interpret as a moving present. Nothing can "move along the line" of time, as time itself is the framework. This inherent psychological difficulty arises from confusing the motion of our attention across a static representation of time with an impossible motion through real moments. Common sense holds two incompatible concepts: a moving present and a sequence of unchanging moments. We invoke the former for causes and effects, and the latter for describing when events happen.

In Newtonian physics, time is a continuum of unchanging "snapshots" of space, forming a static, unchangeable "block" called spacetime. Einstein's relativity made this four-dimensional geometrical interpretation indispensable, as observers in different frames of reference slice spacetime differently into "moments," yet construct identical spacetimes. The "block universe" concept means reality is "frozen" in this single four-dimensional block. In spacetime, nothing truly moves relative to itself; change is merely differences between snapshots.

The common-sense notion of cause and effect, where we have free will to affect the future but not the past, faces a challenge in spacetime physics. Counter-factual conditionals (e.g., "if Faraday had died in 1830...") have no meaning in spacetime physics, because non-existent events have no aftermaths. This is a "traditional paradox" rooted in the "false physics of spacetime".

The quantum concept of time, however, resolves this paradox within the framework of the multiverse. The multiverse is a "much bigger and more diverse entity" than spacetime. In the multiverse, variants of our universe actually exist where counterfactual events (like Faraday dying in 1830) occurred, making counterfactual statements objectively meaningful. An event X causes Y in our universe if both occur, but in most multiverse variants where X doesn't happen, Y doesn't either.

In the multiverse, there is no fundamental demarcation between snapshots of other times and snapshots of other universes. "Other times are just special cases of other universes". There is no overarching external time framework. Any snapshot in the multiverse is present in an "infinity of copies," so we speak of proportions, not numerical counts, of universes. The subjective probabilistic nature of quantum predictions (e.g., a coin toss) arises from the differentiation of identical observer copies across universes into groups experiencing different outcomes.

Classical physical laws are only an approximation, valid in regions where snapshots fall into mutually determining chains; the approximation breaks down when the snapshots are examined in more detail, or far away in time or across the multiverse, or in extreme conditions such as near black holes or the Big Crunch. The multiverse is a "complex, multi-dimensional jigsaw puzzle" rather than a linear sequence.

Within this multiverse, the common-sense concept of cause and effect makes perfect sense, as variants exist and obey deterministic laws. The conversion of subjective possibilities into actualities (e.g., a coin toss outcome) also makes sense, as observer copies differentiate, and each experiences only one outcome as "actual" in their universe, even though all outcomes are objectively "actualities" in the multiverse.

In summary, time is not a sequence of moments, nor does it flow. Our intuitions about time are broadly true but could not be coherently expressed in classical physics; they only make sense in quantum physics, where "time was a quantum concept all along". We exist in multiple versions across universes (which are "moments"). All moments and the entire multiverse are physically real.

Chapter 12: Time Travel

This chapter explores time travel, distinguishing between future-directed and past-directed journeys. Future-directed time travel is less problematic and conceptually easier, achievable through subjective time compression (like extended comas or brain slowing techniques) or Einstein's time dilation from high-speed travel.
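
The special-relativistic relation behind the time-dilation remark (standard physics, not derived in the notes): a traveller moving at speed $v$ relative to Earth ages by

    $$\Delta t_{\text{traveller}} = \Delta t_{\text{Earth}}\,\sqrt{1 - \frac{v^2}{c^2}},$$

so at $v = 0.999c$ only about 1/22 of the Earth-elapsed time passes for the traveller, who thereby arrives in Earth's future having aged comparatively little.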

The more complex issue is past-directed time travel, particularly its rendering in virtual reality (VR). A universal VR generator could perfectly render any past environment, including specific historical moments like ancient Rome. For an accurate rendering of time travel, the VR must be interactive, responding to the user's actions as the real past environment would, including interacting with historical figures.

This interactivity creates a "paradoxical" challenge because the user plays a "unique double, or multiple, role": the VR must render, within the past environment the user is currently inhabiting, copies of the user arriving from the future. In a non-interactive VR, recordings of past events could be played back, showing the user's past self non-reactively. This would in effect be a rendering of parallel universes.

In interactive mode, the VR needs to "know" what the user's future self will do to render it accurately. The crucial insight, however, is that if the user chooses not to follow their past self's actions (e.g., by speaking to their past self), the rendered copies will fail the authenticity test. This leads to the realization that the VR is rendering a "little multiverse," with multiple records of what happened based on different choices and interactions.

This leads to the understanding that in physical time travel, if one were to go to the past, they would not necessarily return to the exact same past snapshot they experienced. Interacting with the past means changing it, which means entering a different snapshot (i.e., a different parallel universe). The Grandfather Paradox (e.g., traveling back in time to prevent one's own birth) is not a logical inconsistency in the multiverse view; it simply means that different choices lead to different universes where copies of the traveler take different paths. Free will is maintained because the VR (or the multiverse) encompasses all possible reactions, and the user's choice determines which universe they are in. Thus, time travel is within the repertoire of a universal VR generator and is physically possible and non-paradoxical in quantum theory.

Time machines would not provide access to times before their creation. Visitors from the future could not know our future, only the future of their universe, whose past was identical to ours. They could warn us of disasters or bring news, but we would still be free to make our own choices, and the outcome is not guaranteed to be the same.

The knowledge paradox of time travel (e.g., a time traveler giving Shakespeare the text of Hamlet, implying the literature was never originally created) is discussed. This paradox, though logically consistent, "profoundly contradicts our understanding of where knowledge comes from," which is through creative, step-by-step evolutionary processes. The multiverse view resolves this: the time traveler comes from a different universe than the Shakespeare who originally wrote the plays. Knowledge and adaptation are always created incrementally through creativity or evolution, never "out of nothing".

Ultimately, time travel, if achieved, would reveal itself as a powerful computational resource, sharing computational work across parallel universes. Knowledge creation and biological evolution are physically significant processes that create "trans-universe structure" by making universes alike. The ability of a time machine to receive information from computations in other universes demonstrates how the multiverse itself can be seen as a "gigantic computation".

The study of time travel, even theoretically, highlights the deep connections between the four main strands: quantum physics (multiverse), computation (VR, time travel as computation), and epistemology/evolution (constraints on knowledge creation).

Chapter 13: The Four Strands

This chapter examines the remarkable, and often paradoxical, way the four main strands of explanation (quantum physics, evolution, epistemology, and computation) have been treated in their respective intellectual histories: simultaneously accepted in practice yet ignored or rejected as fundamental explanations of reality.

The author critiques Thomas Kuhn's paradigm theory, which suggests that scientists are blinded by their prevailing worldview and resist fundamental change, with scientific revolutions occurring through sociological battles rather than objective merit. Deutsch argues Kuhn is mistaken; while attachment to old ideas exists, it's a danger to be avoided, not an inherent blindness. Kuhn's denial of objective improvement in successive scientific explanations is demonstrably false (e.g., our ability to fly is objectively superior to ancient dreams). Furthermore, Kuhn's view misrepresents the collaborative and critical nature of scientific research, where even junior researchers can challenge leading experts without appeals to authority. Scientific rationality prevails during research discourse, distinct from typical human social interactions.

The chapter then details, for each of the four strands in turn, this paradoxical pattern of acceptance in practice alongside rejection as a fundamental explanation.

The persistent "explanatory gaps" in each individual theory leave critics "haunted", keep proponents (Popper, Turing, Everett, Dawkins) "constantly on the defensive against obsolete theories," and thereby hinder further fundamental progress.

Deutsch argues that none of the four strands can be properly understood independently of the other three. Taken individually, each has "explanatory gaps" that can make it seem "narrow, inhuman and pessimistic". For example, Turing's theory of computation might seem to leave no room for consciousness or free will. However, the author suggests that the unification of computation with quantum physics (and the other strands) is essential for a fundamental understanding of consciousness; knowledge, understood in quantum theory as complexity spanning many universes, can provide a basis for this. The multiverse view also clears Turing's theory of the charge that it is an obstacle to understanding human attributes such as free will, showing that it fits naturally into a multiverse context. Similarly, the "explanatory gap" in Darwinism regarding the origin of complex adaptations can be filled by linking it to quantum theory (the storage and replication of information) and to the Turing principle (the universality of computation as it relates to evolution). And the growth of knowledge in Popperian epistemology, being a physical process, must itself be explained through quantum physics, the Turing principle, and evolution.

When these four theories are taken together as a unified explanation of the fabric of reality, their individual "unfortunate property" is reversed. This unified worldview is "fundamentally optimistic," placing human minds, explanation, and understanding at the center of the physical universe and human purposes.

Chapter 14: The Ends of the Universe

The final chapter explores the implications of the unified worldview for the ultimate future of the universe and of humanity. Deutsch notes that physics has historically absorbed other fields (astronomy, chemistry, the study of time). However, his proposed fabric of reality is not just fundamental physics: emergent principles such as the Turing principle are physical laws at an emergent level, applying directly to complex machines and only consequentially to subatomic objects. Similarly, the principles of epistemology and evolution are emergent physical laws describing the structure of the multiverse. This implies that the true unified "Theory of Everything" will integrate all four strands.

The author states that viewing quantum physics as "swallowing" other strands is a narrow, reductionist perspective. Each strand is rich enough to form a worldview on its own, but these are "narrow, even misleading perspectives". The new synthesis of all four strands has a unique character, unlike any single one. The perceived "coldness" or "pointlessness" of individual strands (e.g., Stephen Hawking calling humans "chemical scum," Steven Weinberg seeing the universe as "pointless") is addressed by the unified view: human knowledge is a fundamental physical quantity influencing the "gross behaviour of our planet, star and galaxy". Knowledge creation and adaptations are the emergence of self-similarity mandated by the Turing principle. The unified explanatory structure is non-hierarchical, with principles in each strand being emergent from and yet explaining the others.

The chapter then introduces Frank Tipler's omega-point theory as an example of a theory belonging irreducibly to all four strands. The starting point is the Turing principle: a universal virtual-reality generator requires potentially unlimited resources. Existing cosmological models (like the Big Crunch or ever-expanding universes) seem to limit total computation, violating the Turing principle. However, Tipler discovered a class of cosmological models where, despite the universe being finite in space and time, the memory capacity, computational steps, and energy supply are all unlimited due to the extreme violence and gravitational shearing forces near the Big Crunch. This end-point is the omega point.

Deutsch defends this "extrapolation" to infinity against inductivist skepticism, arguing that the Turing principle is our best theory of computation, and any other assumption would "spoil good explanations of what is happening here and now". The oscillations of space leading to the omega point are unstable and require continual "steering" by manipulating the gravitational field. This steering necessitates the "continual creation of new knowledge," implying the survival of "intelligent entities" until the end of the universe. These intelligences, running as computer programs at ever-increasing physical speeds, will experience "subjectively infinite time," giving them "every incentive" to manage resources and prepare for their "open, infinite future". These intelligences could be our "intellectual descendants," requiring the transfer of human minds into "more robust hardware" infinitely many times. Tipler also posits deadlines for colonizing the galaxy and universe to achieve the omega point.
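To make "subjectively infinite time" within a finite physical duration concrete, here is a schematic illustration under an assumed geometric speed-up (this is not Tipler's actual general-relativistic calculation, only the underlying arithmetic): suppose the n-th computational step takes physical time

    t_n = T \cdot 2^{-n}.

Then the total physical time consumed is

    \sum_{n=1}^{\infty} t_n = T \sum_{n=1}^{\infty} 2^{-n} = T < \infty,

while the number of steps executed, and hence subjective time measured in steps, grows without bound. The omega-point models are claimed to realize this kind of unbounded speed-up physically, with the required energy drawn from the gravitational shearing of the collapsing universe.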

Deutsch agrees with Tipler's implicit choice that the Turing principle holds in "all universes". He notes that the comprehensibility of reality (an epistemological principle) can be inferred from the Turing principle, showcasing the unified theory's explanatory power.

Tipler's "quasi-religious interpretation" of the omega point, identifying its omniscient, omnipotent society of people as "God," has prevented the underlying scientific theory from being taken seriously. Deutsch clarifies that this "omniscient" means infinite knowledge created before the end, not possessed by an entity at the end. Tipler's further claims, like the resurrection of the dead via VR renderings of all physically possible past lives into a "heaven," are "informed speculation" built on plausible but not scientifically certain assumptions. Predicting the motives of these far-future intelligences is unreliable, as future knowledge cannot be predicted.

The author then turns to morality and aesthetics, arguing that they too are amenable to objective explanation within this unified worldview. Traditional attempts to define morality in terms of usefulness (evolutionary or utilitarian) fail because humans choose and change their preferences for moral reasons, and those reasons cannot themselves be reduced to usefulness. The non-hierarchical explanatory structure of the fabric of reality allows moral values to exist objectively by playing a role in emergent explanations. For example, "human rights" could be explained as promoting the growth of knowledge (an epistemological benefit), making morality a form of "emergent usefulness". Similarly, "artistic value" or "beauty" can be understood as "emergent design," related to improving the criteria of design themselves. If ethics and aesthetics are compatible with this worldview in that way, then beauty and rightness must be as objective as scientific or mathematical truth, created through conjecture and rational criticism.

Ultimately, the omega-point theory implies that the universe will literally consist of "intelligent thought-processes". However, thought is problem-solving, involving "rival conjectures, errors, criticism, refutation and backtracking". So, even at the omega point, knowledge will be "riddled with errors," and its culture will be "discordant yet progressive," not a monolithic entity. This unified worldview, based on the four strands, is presented as the "natural," "conservative" view in the current state of knowledge, guiding future progress.