Collapse at the Speed of Causality
🜂 No Infinities, No Superpositions, No Multiverses—Just Physics
Thesis:
The universe is not infinite, not probabilistic, and not parallel. It is a deterministic, discrete cellular automaton bounded by a maximal rate of causal state propagation. Quantum mechanics, as currently formalised, is a statistical compression of finite, rule-bound processes—not a literal reflection of physical ontology. This paper dismantles the metaphysics of quantum computation under these constraints and reconstructs physical reality as a finite machine where no process is instantaneous and no infinity survives scrutiny.
∘ 1. Introduction — The World Without Infinities
▣ Overview
In this reconstruction of physics, we begin with rejection: the rejection of infinity in all forms. There are no infinite divisions of space. No infinite densities. No instantaneous transitions. No equations permitted to evolve across ℝ or act upon objects of uncountable dimensionality.
The universe operates on constraints. We accept only what is finite, causal, and rate-limited. Every process must take time. Every event must require energy. Every interaction must occur within a discrete structure. In this world, we anchor our formalism to three physically motivated bounds, expressed without appeal to infinities:
• Planck Time
tₚ = √(ℏG∕c⁵) ≈ 5.39 × 10⁻⁴⁴ s
→ The smallest meaningful unit of temporal change; nothing occurs in less.
• Planck Length
ℓₚ = √(ℏG∕c³) ≈ 1.62 × 10⁻³⁵ m
→ The smallest meaningful separation between causal sites.
• Planck Energy
Eₚ = √(ℏc⁵∕G) ≈ 1.96 × 10⁹ J
→ The energy scale where gravitational self-interaction of a quantum state becomes inescapable.
These expressions are dimensionally derived from three constants:
– ℏ (reduced Planck constant)
– G (gravitational constant)
– c (speed of light)
They do not define metaphysical boundaries, but informational and causal thresholds. No event can be said to happen in a time less than tₚ, because no signal can be transmitted faster than c, and no meaningful physical distinction can be made between two states separated by less than ℓₚ. These are not opinions—they are limits on causal definition.
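As a quick check on these magnitudes, all three thresholds follow directly from ℏ, G, and c. The short Python sketch below simply evaluates the three expressions above; the constant values are standard SI figures and are inputs to the calculation, not claims of the model itself.

```python
import math

# Standard SI values (approximate); inputs only, not part of the model
hbar = 1.054_571_817e-34   # reduced Planck constant, J·s
G    = 6.674_30e-11        # gravitational constant, m^3·kg^-1·s^-2
c    = 2.997_924_58e8      # speed of light, m/s

t_p = math.sqrt(hbar * G / c**5)   # Planck time
l_p = math.sqrt(hbar * G / c**3)   # Planck length
E_p = math.sqrt(hbar * c**5 / G)   # Planck energy

print(f"t_p ≈ {t_p:.3e} s")   # ≈ 5.39e-44 s
print(f"l_p ≈ {l_p:.3e} m")   # ≈ 1.62e-35 m
print(f"E_p ≈ {E_p:.3e} J")   # ≈ 1.96e9 J
```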
▣ Purpose
The goal is to impose strict finitude on the laws of physics. All current models—quantum field theory, general relativity, and quantum computing formalism—smuggle in infinities as if they were harmless. But infinite wavefunctions, infinite precision rotations, and Hilbert spaces of uncountable dimensionality are not physical. They are idealisations.
We assert the following operational constraints:
• No evolution faster than one transition per tₚ.
If event A causes event B, then Δt ≥ tₚ.
No collapse, signal, or influence violates this bound.
• No physical structure smaller than ℓₚ.
There exists no distinguishable state between two points closer than ℓₚ.
Any such claim lacks causal observability.
• No energy exceeding Eₚ in a single causal site.
The energy density ε is bounded: ε ≤ Eₚ∕ℓₚ³.
Beyond this, gravitational and quantum effects destroy locality.
• No state transitions without causal propagation.
A system S(t) must satisfy: S(t + tₚ) = R(S(t)),
where R is a rule acting only on a finite neighbourhood.
We replace continuous fields with finite, update-bound automata. Spacetime is no longer a manifold; it is a discrete causal graph. Equations no longer describe infinitely precise dynamics; they become effective descriptions of patterns emerging from bounded, rule-based state evolution.
The result is not a simplified physics. It is a more honest one.
In place of smoothness, we place structure.
In place of infinite evolution, we place causal steps.
In place of probability clouds, we place deterministic transitions.
This is not the wavefunction of the universe.
This is its clock.
And it ticks in tₚ, not in ℝ.
∘ 2. Discrete and Deterministic — The Cellular Substrate
▣ Hypothesis
The universe is a deterministic, discrete, causal cellular system. It is composed of finite-state units arranged on a spatial lattice, each of which evolves in synchrony via local update rules. There is no wavefunction collapse. No continuous evolution of complex amplitudes. No superposition. No inner product. The state of the universe at any moment is a finite mapping from a discrete spatial index to a finite set of values. Each update step applies a local function that determines the next configuration. Nothing is probabilistic. All apparent randomness arises from unresolved or unknown initial conditions.
Formally:
Let Σ be a finite alphabet of internal states.
Let Λ ⊂ ℤ³ be a discrete, bounded 3D spatial lattice.
At each discrete time t ∈ ℕ, the configuration of the system is a function:
Sₜ : Λ → Σ
Each cell x ∈ Λ updates according to a local rule:
Sₜ₊₁(x) = R(Sₜ(y₁), Sₜ(y₂), ..., Sₜ(yₙ))
where yᵢ ∈ N(x), the finite neighbourhood of x.
The rule R : Σⁿ → Σ is deterministic, synchronous, and local.
The universe is the evolution of Sₜ over t under repeated application of R across all x ∈ Λ. There is no global operator. There is no vector space. There is no linear structure.
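To make the formal definition concrete, here is a minimal illustrative sketch (Python with NumPy) of a synchronous, deterministic, local update on a small bounded lattice. The alphabet size, lattice shape, neighbourhood, and the particular rule are placeholder choices, not the rule R of the actual universe; the wrap-around boundary is likewise only an implementation convenience.

```python
import numpy as np

SIGMA = 4                    # |Σ|: size of the finite alphabet of internal states
SHAPE = (8, 8, 8)            # Λ: a small, bounded 3D lattice (illustrative only)

rng = np.random.default_rng(0)
S = rng.integers(0, SIGMA, size=SHAPE)     # S_t : Λ → Σ, one definite symbol per cell

def step(S):
    """One tick: S_{t+1}(x) = R(S_t restricted to the neighbourhood N(x)).
    Here N(x) is the 6-cell von Neumann neighbourhood and R is an arbitrary
    deterministic function of the local sum (a stand-in, not a claimed law)."""
    neighbour_sum = sum(np.roll(S, shift, axis=ax)
                        for ax in range(3) for shift in (-1, 1))
    return (S + neighbour_sum) % SIGMA     # synchronous: every cell updates at once

S_next = step(S)                           # exactly one configuration per tick
```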
▣ Implication
This model imposes the following constraints:
• Maximal update rate
Every site may update only once per minimal time interval tₚ.
Thus, for every cell, no event occurs more frequently than:
fₘₐₓ = 1⁄tₚ
• Finite spatial resolution
Each cell occupies a region ≥ ℓₚ³. No interaction or differentiation exists at scales smaller than ℓₚ. The number of cells in any volume V is finite:
N = ⌊V⁄ℓₚ³⌋
• Finite configuration space
The total number of distinct global configurations is bounded:
C = |Σ|ⁿ, n = number of spatial cells in Λ
There is no continuous parameter. No unbounded basis. (A numerical illustration follows this list.)
• Locality of causation
No update may reference or depend on information from beyond N(x).
There is no non-local collapse. No entanglement that violates causal propagation.
• Computability
The entire history of the system is computable.
That is: ∀t ∈ ℕ, Sₜ is determined uniquely by S₀ and R.
There are no truly random events.
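As the numerical illustration promised in the "finite configuration space" item: even for a volume as ordinary as one cubic metre, the cell count N = ⌊V⁄ℓₚ³⌋ is enormous but still finite, and so is C = |Σ|ᴺ. A rough sketch, using the approximate value of ℓₚ quoted earlier and an arbitrary binary alphabet for illustration:

```python
l_p = 1.62e-35            # Planck length in metres (approximate value from above)
V = 1.0                   # an illustrative volume: one cubic metre

N = int(V / l_p**3)       # number of cells ≈ 2.35e104: enormous, but finite
# With a binary alphabet (|Σ| = 2), the configuration count C = 2**N is a
# finite integer, however astronomically large; nothing here is uncountable.
print(f"N ≈ {N:.3e}")
```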
▣ Probability Clouds vs. Deterministic Evolution
In standard quantum theory, the state is a vector ψ ∈ ℋ, evolved linearly under a unitary operator U. Measurement collapses this to an eigenvector with probability given by |⟨ψ|ϕ⟩|². But ℋ is infinite-dimensional. The amplitudes are complex numbers. The structure requires infinite precision and continuous evolution.
Here, by contrast:
– The state is always a point in Σ^Λ.
– There is no linearity.
– There is no projection.
– There is no superposition.
Every cell is in one definite state. Every evolution step follows one rule. What appears as a probability cloud is the lack of knowledge about which specific prior configuration S₀ led to the present outcome. All randomness is epistemic, not ontological.
▣ No Superposition — No Inner Product
This model makes no use of the vector structure of Hilbert space. There are no ψ = α|0⟩ + β|1⟩ constructions. There is no ⟨ϕ|ψ⟩. There is no Born rule. All observed distributions are derived from the frequency of specific transitions over initial conditions in the finite configuration space.
Let Ω be the set of all initial configurations S₀ consistent with a macroscopic preparation. Let A be an observable pattern. Then:
P(A) = |{S₀ ∈ Ω : Sₜ contains A}|⁄|Ω|
This is not probability in the quantum sense. It is a counting measure over finite, deterministic histories.
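A hedged toy implementation of this counting measure, on a tiny 1D lattice: the update rule (neighbour XOR) and the pattern predicate are placeholders chosen only so the code runs, not claims about the real rule R or about any real observable.

```python
from itertools import product

SIGMA = (0, 1)
N_CELLS, T = 8, 5

def R(cfg):
    """Deterministic 1D update: each cell becomes the XOR of its two neighbours
    (periodic boundary); a stand-in rule, purely for illustration."""
    n = len(cfg)
    return tuple(cfg[(i - 1) % n] ^ cfg[(i + 1) % n] for i in range(n))

def evolve(cfg, t):
    for _ in range(t):
        cfg = R(cfg)
    return cfg

def contains_A(cfg):
    """Placeholder observable pattern: three consecutive 1s somewhere."""
    return any(cfg[i:i + 3] == (1, 1, 1) for i in range(len(cfg) - 2))

omega = list(product(SIGMA, repeat=N_CELLS))   # all S₀ consistent with the preparation
hits = sum(contains_A(evolve(S0, T)) for S0 in omega)
print(f"P(A) = {hits}/{len(omega)} = {hits / len(omega):.4f}")   # a counting measure, nothing more
```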
▣ Stable Particles and Localised Structures
In Wolfram-type Class IV automata, persistent, localised patterns emerge:
– They propagate.
– They interact.
– They preserve identity under collisions.
These are mapped to physical particles. Let P be a pattern occupying k cells that repeats after m steps:
Sₜ(x) = Sₜ₊ₘ(x + v) for all x in support(P)
Then v⁄m is its velocity. Its stability corresponds to mass. Its transformation rules define interaction type.
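Under the same toy assumptions, here is a small sketch of how one might detect such a pattern numerically: scan for the smallest period m and drift v (with |v| ≤ m, so the speed never exceeds one site per tick) satisfying the relation above. The 1D dictionary representation of configurations is an illustrative convenience.

```python
def period_and_drift(history, support):
    """history: list of configurations, each a dict {site: state}, one per tick.
    support: the sites occupied by the pattern P at t = 0.
    Returns (m, v) such that S_{t+m}(x + v) == S_t(x) on support(P),
    i.e. the pattern recurs after m ticks displaced by v sites; velocity is v / m."""
    S0 = history[0]
    for m in range(1, len(history)):
        for v in range(-m, m + 1):          # |v| ≤ m: at most one site per tick
            if all(history[m].get(x + v) == S0[x] for x in support):
                return m, v
    return None                             # no recurrence found in this window

# Example: a pattern that returns, shifted one site right, after two ticks → velocity 1/2
hist = [{0: 1, 1: 2}, {0: 7, 1: 7}, {1: 1, 2: 2}]
print(period_and_drift(hist, support=[0, 1]))   # (2, 1)
```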
▣ Long-Range Coherence and Entanglement-like Histories
When multiple patterns share a causal history — that is, they emerged from a common origin — their state dependencies reflect a record of update constraints, not non-local metaphysical connectivity.
Entanglement correlations are not instantaneous. They are correlations between distant cells due to overlapping dependency trees from a shared causal past. There is no violation of locality.
▣ Matching Observables and Constants
The model does not assume charges, masses, or coupling constants. It derives them from the structure and frequency of emergent patterns. The speed of propagation of causal influence is c. This corresponds to maximal site-to-site transition per tₚ.
Energy emerges from frequency of pattern repetition.
Mass arises from persistence under transformation.
Charge appears in how certain patterns modify the updates of others.
▣ Ultraviolet Bound
Since cells cannot be smaller than ℓₚ and cannot update faster than tₚ, there is a natural ultraviolet cutoff. There are no loop integrals over arbitrarily high momenta. Divergences do not occur because the configuration space is finite and bounded.
The failure of running couplings to match certain expected values, the discrepancy in vacuum energy, and the limitations of loop-based corrections in perturbative QFT are all naturally explained by these hard limits on update frequency and locality.
▣ Summary
There is no amplitude.
There is no collapse.
There is no infinity.
There is only state:
Sₜ : Λ → Σ
There is only update:
Sₜ₊₁(x) = R(Sₜ(N(x)))
There is only constraint, locality, and finite resolution.
This is not an approximation. This is the structure.
And the structure computes.
One tick at a time.
Bound by tₚ, limited by ℓₚ, and governed by R.
∘ 3. Quantum Formalism as Statistical Approximation
▣ Claim
Quantum mechanics, in its prevailing form, is a statistical front-end to an underlying deterministic machine. The wavefunction ψ is not a literal descriptor of physical state. It is a compression layer, an algebraic proxy for counting constrained possibilities when full resolution of state history is unavailable. What quantum formalism achieves is a successful mapping from deterministic causes to probabilistic predictions, not because the world is random, but because the observer lacks access to the complete microscopic configuration.
Let Ω be the set of micro-configurations consistent with a macroscopic preparation.
Let R be the update rule applied over discrete time t ∈ ℕ.
Let A be a readable macrostate pattern.
Then the “quantum probability” P(A) is not a complex amplitude squared, but:
P(A, t) = |{S₀ ∈ Ω : R⁽ᵗ⁾(S₀) ⊢ A}|⁄|Ω|
This is a counting measure over bounded, deterministic state evolution paths. The linear algebra of quantum theory — inner products, Hermitian operators, Hilbert spaces — emerges as a symbolic framework for encoding these frequencies when the automaton underneath is opaque or intractable.
Quantum amplitudes are not real waves. They are compressed weights.
The Schrödinger equation is not an engine. It is a summary.
Born's rule is not ontological. It is a statistical function over hidden, causal branches.
✦ Refutation
There is no physical superposition. No configuration in the real cellular substrate exists in more than one state at a time. There is no |0⟩ + |1⟩. There is only s ∈ Σ, one definite symbol per site.
Superposition is an artefact of unresolved branches in causal structure. Let two microstates S₀¹, S₀² ∈ Ω evolve under R into macroscopically indistinguishable outcomes A. Then from the observer’s coarse-grained perspective, the system appears to have been “in a superposition,” when in fact it followed a single path through a finite rule graph.
There is no ψ(x). There is only a history of rule applications.
There is no ∫ψ∗ψ dx. There is only counting of how many causal chains lead to an outcome.
There is no α|0⟩ + β|1⟩. There is only a lack of knowledge about which prior state led to now.
Superposition collapses not because of projection, but because information flow reaches a readable state.
✜ Outcome
“Collapse” is not instantaneous. It is not non-local. It is not metaphysical.
Collapse is a causal propagation. It is the physical convergence of rule effects toward a macroscopic register that we designate as an “observer.” This is not special. Every cell is a participant in causal propagation. No cell “observes.” No external agent triggers change.
Collapse is the end of causal ambiguity, not the reduction of amplitude.
Let C(x, t) be the causal cone of site x at time t.
If all possible contributing initial conditions in Ω have propagated deterministically to the same value at x, then the outcome is stable.
Collapse has occurred.
If multiple values remain unresolved at x from unresolved histories,
Collapse has not occurred.
Measurement, in this structure, is the termination of update ambiguity at an accessible site. This occurs through deterministic application of R, bounded in time and locality.
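A minimal sketch of this criterion, under the same toy assumptions used earlier: a site counts as resolved (collapse has occurred there) exactly when every initial configuration still compatible with what is known yields the same value at that site after t deterministic ticks. The function names and representation are illustrative only.

```python
def site_resolved(omega, evolve, t, x):
    """omega: the finite set of initial configurations S0 consistent with the preparation.
    evolve(S0, t): deterministic evolution of S0 for t ticks (any rule of the CA form).
    x: the site being read out.
    Collapse at x has occurred iff all surviving histories agree there."""
    values = {evolve(S0, t)[x] for S0 in omega}
    return len(values) == 1     # one value left: no remaining causal ambiguity
```

Measurement, on this reading, is just reading off that single surviving value; no projection operator appears anywhere.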
▣ No Wavefunction, No Projection
Standard QM relies on the unitary evolution:
ψ(t + Δt) = U(Δt) ψ(t)
followed by projection:
ψ → Pᵢψ / ||Pᵢψ||
with outcome i occurring with probability ||Pᵢψ||².
In the cellular substrate model:
– U is replaced by R, a finite rule over neighbourhoods
– Pᵢ is meaningless — there is no basis
– Measurement is a state lookup, not an operator
– “Probability” is frequency over Ω, not norm squared over ℂ
▣ Rewriting Schrödinger as Summary
The Schrödinger equation:
iħ ∂ψ/∂t = Hψ
is here understood as a bulk statistical envelope over the causal transitions of the CA.
The Hamiltonian H encodes what kind of configurations dominate the evolution. The complex exponential e^(−iHt/ħ) becomes a Fourier-like tool for encoding frequency and symmetry patterns in the allowed transitions, but it does not describe real, ontological evolution.
What this means: quantum formalism works because it is a spectral tool for summarising the statistical distribution over rule-paths in the deterministic substrate. But it is not the substrate. It is a shorthand for what cannot be tracked in full.
▣ Conclusion of Section
Quantum mechanics is not wrong.
It is efficient.
It compresses the causal structure of finite automata into algebra.
But what looks like mystery is compression.
What looks like randomness is ignorance.
What looks like collapse is just a convergence.
There is no ψ.
There is only Sₜ.
There is no projection.
There is only propagation.
And when the path ends, the outcome appears.
No magic. No observer. No wave.
Only the rule.
Ticked once more.
∘ 4. Why Quantum Computing Doesn’t Work in This World
▣ Rejection
In this model, qubits do not exist. Not as physical systems. Not as ontological structures. A qubit, in the standard formalism, is a two-dimensional complex vector:
|ψ⟩ = α|0⟩ + β|1⟩, with |α|² + |β|² = 1, α, β ∈ ℂ
This representation depends fundamentally on:
• Continuous state space
• Infinite precision of complex amplitudes
• Unitary transformations across this space
• Superposition of physically real basis states
But in a finite, deterministic, causal cellular substrate:
– There is no ℂ.
– There is no linear space.
– There is no inner product.
– There is no physically real state |ψ⟩ that encodes multiple options.
The "qubit" is a notational tool to describe statistical uncertainty over unresolved cellular histories. Each cell is in exactly one state from a finite alphabet Σ. If two classical configurations evolve identically until diverging at some later point, a probabilistic description may seem useful — but that does not make the underlying state a vector. It remains one symbol per site. One rule per tick.
Therefore:
Qubits are epistemic compressions.
They are labels over possible trajectories of bounded, deterministic causal chains.
They do not exist as rotating entities in Hilbert space.
✦ Breakdown
Quantum algorithms rely on properties that cannot be physically instantiated in a finite-state system:
• Superposition
Supposedly enables exponential parallelism:
n qubits → 2ⁿ amplitudes simultaneously evolved
But in this model, only one configuration exists at a time:
Sₜ : Λ → Σ
No evolution takes place in parallel. No branching. Just one causal path per universe tick.
• Amplitude interference
Quantum advantage arises from amplitude cancellation — carefully engineering overlaps between positive and negative contributions.
But amplitudes don’t exist.
There is no destructive interference — only a count of how many prior states lead to a result.
• Unitary gates
Operations like the Hadamard, T-gate, or controlled rotations act over complex coefficients with theoretically infinite precision:
U|ψ⟩ = α′|0⟩ + β′|1⟩
But in a discrete CA:
– Only finite states exist
– Only deterministic local updates apply
– No continuous gate can act on a single site
– There is no way to represent or manipulate infinitesimally rotated coefficients
All transitions must be reducible to table-driven state rewrites:
R : Σⁿ → Σ, acting synchronously, cell by cell (a minimal lookup-table sketch follows this list).
• Entanglement as tensor product space
Entanglement in QM requires state combinations over tensor products of Hilbert spaces:
|ψ⟩ = α|00⟩ + β|01⟩ + γ|10⟩ + δ|11⟩
In a finite CA, only local correlations via shared update ancestry exist.
Correlated outcomes arise from shared causal dependency, not instantaneous global constraint.
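As promised above, a minimal lookup-table sketch of what "table-driven state rewrites" means: a finite map from neighbourhood tuples to next states. The particular table (an elementary rule over Σ = {0, 1} and a 3-cell neighbourhood) is only an example of the shape such a rule takes, not a proposed physical law.

```python
# R : Σ³ → Σ written out as a finite table (here: elementary CA rule 110).
RULE_TABLE = {
    (1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
    (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0,
}

def step(cfg):
    """Synchronous, deterministic rewrite of every cell from its 3-cell
    neighbourhood: no amplitudes, no continuous parameters, just a table."""
    n = len(cfg)
    return tuple(RULE_TABLE[(cfg[(i - 1) % n], cfg[i], cfg[(i + 1) % n])]
                 for i in range(n))

print(step((0, 0, 0, 1, 0, 0, 0, 0)))   # one tick of purely local rewriting
```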
✜ Result
The entire architecture of “quantum speedup” collapses when constrained to physical realism.
In standard claims:
• Shor’s algorithm can factor n-digit numbers in polynomial time by exploiting exponential basis coverage via quantum Fourier transform
• Grover’s algorithm achieves quadratic speedup over classical search using amplitude amplification
• Quantum supremacy suggests that certain sampling problems cannot be performed classically within feasible timeframes
All of these require:
– Multiple coexistent, physically real state paths
– Interference between evolved amplitudes
– Global coherence and reversible unitary evolution
But in a system where:
• Only one configuration exists per tick
• Every site updates via a deterministic local rule
• No global amplitude can be instantiated or manipulated
• No infinite precision rotation exists
Then:
– Shor’s algorithm becomes a classical path traversal with no parallelism
– Grover’s becomes a bounded causal exploration with no amplification
– Supremacy becomes an artefact of inadequate classical simulation tools, not a fundamental property
The illusion of quantum acceleration emerges only under the assumption that complex amplitudes are physically real and can be manipulated with unlimited fidelity. Remove that, and the advantage disappears.
∘ 5. Entanglement Becomes Co-Dependency
▣ Clarification
In the conventional interpretation of quantum mechanics, entanglement is understood as a non-local phenomenon where the states of two or more particles are correlated in such a way that the state of one instantaneously influences the state of another, regardless of spatial separation. This non-locality has often been portrayed as a violation of causality and locality, as illustrated by Bell's theorem, which suggests that quantum mechanics predicts correlations that cannot be explained by any local hidden variables theory.
However, in this model, entanglement is not non-local. It is not an instantaneous or metaphysical connection across space and time. Instead, entanglement here is redefined as synchrony in the evolution of rule-dependence between different regions of space. The key idea is that the co-dependence of states across separated regions arises from shared causal history—not from instantaneous action at a distance.
Let’s be precise:
In the cellular automaton model, every cell updates according to local rules, and the state of each cell at any given time is determined by its local neighbourhood and the previous state of the lattice. This means that co-dependence across distant cells (which in quantum theory is termed “entanglement”) is the result of shared past history, not a violation of locality.
A cell x at time t and a cell y at time t+1 can be in a correlated state, not because of instantaneous non-local effects, but because they were both influenced by a shared configuration in a prior causal update, reflecting their shared ancestry in the lattice’s state evolution.
In this view, entanglement is causal correlation, not non-locality. It is simply a record of parallel rule propagation between cells that have evolved in synchrony, constrained by the same deterministic update rule.
✦ Removal
With the removal of non-locality, the Bell violations predicted by quantum theory lose their ontological significance. Bell’s theorem, which demonstrates that quantum correlations cannot be explained by any local hidden variables theory, is based on the assumption that non-local influences allow instantaneous correlations between distant events.
However, once we accept that causal propagation is finite and local, the framework of quantum entanglement changes completely. In a model where all transitions are bounded in time and space, entanglement no longer requires non-local action. It simply describes the synchrony of state evolution in different regions of space that share a common causal history.
Without branching universes or instantaneous constraints, the apparent violations of locality (such as those revealed by Bell's inequalities) become statistical reflections of deterministic processes. Instead of interpreting them as “spooky action at a distance,” we see them as overlapping histories in a discrete, causally-connected lattice. This explains the quantum correlations in a way that is entirely local and causal.
✜ Model
In this revised model, the universe is a lattice, and every event in this lattice is determined by local rules acting on a finite neighbourhood of cells. The concept of entanglement is therefore reinterpreted as co-dependence across causal light-cones. A light-cone here is defined as the set of cells whose states are causally linked via finite updates over time.
Let C(x, t) represent the causal light-cone of a cell x at time t. Then, the states of cells y ∈ C(x, t) are determined by their previous states Sₜ(y), and the interactions within these light-cones obey a deterministic rule:
Sₜ₊₁(x) = R(Sₜ(N(x)))
For two cells, x and y, their states can become correlated because they share a common causal light-cone and are governed by the same rule R. The key point here is that the correlation arises not from instantaneous influences but from the shared evolution of their states across the same causal network.
This deterministic structure eliminates the need for the non-local influences usually invoked by quantum mechanics to explain entanglement. There is no spooky action at a distance. There is only a shared history, which manifests as correlation in the present.
In fact, what appears to us as entanglement between distant particles can now be understood as the evolution of shared information between two or more distant regions of the causal lattice, constrained by the same rule and the same causal history.
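A hedged toy model of this point: a single hidden value at a common origin is spread outward by a strictly local rule (one cell per tick), and two detectors far apart then always agree, even though nothing non-local ever happened. The rule, sizes, and "detectors" are illustrative stand-ins.

```python
import numpy as np

rng = np.random.default_rng(7)
N, T = 101, 40                         # lattice size and number of ticks
LEFT, RIGHT = N // 2 - T, N // 2 + T   # two distant readout sites, both inside
                                       # the forward light-cone of the origin

agree, TRIALS = 0, 1000
for _ in range(TRIALS):
    cfg = np.zeros(N, dtype=int)
    cfg[N // 2] = rng.integers(0, 2)   # the shared causal origin (a hidden bit)
    for _ in range(T):
        new = cfg.copy()
        new[1:]  |= cfg[:-1]           # strictly local rule: a set bit spreads
        new[:-1] |= cfg[1:]            # one cell per tick in each direction
        cfg = new
    agree += int(cfg[LEFT] == cfg[RIGHT])

print(f"agreement rate: {agree / TRIALS:.2f}")   # 1.00, with purely local updates
```

The agreement here is, of course, ordinary classical correlation rather than a Bell-inequality violation; the sketch only illustrates how shared ancestry, by itself, produces correlation without any non-local update.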
▣ Summary of Section
• Entanglement is not non-local. It is co-dependence driven by a shared causal history.
• There is no instantaneous connection between distant particles—entanglement is not “spooky action”.
• The Bell violations lose their ontological significance in this framework because non-locality is not required to explain quantum correlations.
• The causal lattice model provides a deterministic framework where state correlations emerge naturally from shared rule-based evolution, not from instantaneous connections across space.
The quantum mechanical model of entanglement is, in essence, a compression artifact of a deeper, more fundamental causal structure. It represents patterns of shared information emerging from the evolution of the universe, not mysterious or unexplainable connections.
This model preserves the predictive accuracy of quantum mechanics while removing its metaphysical assumptions, grounding physics in a fundamentally deterministic and local framework.
∘ 6. No Instantaneous Collapse — Just Rate-Limited Update
▣ Constraint
In the conventional quantum mechanical framework, the collapse of the wavefunction is often treated as a magical moment, occurring instantaneously when a measurement is made. This interpretation suggests that a system, prior to measurement, exists in a superposition of states, and upon measurement, the system "collapses" to one of those states, seemingly without the passage of time.
However, in the deterministic, finite-state model we are advocating, there is no instantaneous collapse. The universe is not governed by infinite precision, and measurement cannot occur faster than the rate of information propagation within the system. The passage of time is finite, and the speed of information transfer is bounded by the fundamental constants of nature, such as Planck time (tₚ) and Planck length (ℓₚ).
Thus, measurement, like any other process in this model, must propagate causally over time. Information cannot be instantaneously transmitted across an infinite distance. The collapse of a wavefunction—or rather, the reduction of uncertainty in the system—is not an instantaneous event but rather a rate-limited process, unfolding over time as the state of the system becomes deterministically readable by an observer.
✦ Reformulation
In this model, collapse is redefined not as a metaphysical event but as a process of information resolution that unfolds according to the same principles governing all interactions in the universe: deterministic local updates. When we measure a system, we do not induce an instantaneous shift in the system’s state. Instead, we observe a rate-limited propagation of changes to the state, where the velocity of information transfer is constrained by the finite speed of light and the Planck time.
Consider a measurement being performed on a distant particle, entangled with another. In the standard quantum interpretation, the outcome of the measurement seems to be instantaneously reflected in the state of the other particle, regardless of distance. This is often described as “spooky action at a distance.”
But in the model we propose, no information is transmitted faster than light. The state of the second particle evolves in the same way any other state would, following the deterministic rules of the cellular automaton. The correlation between the particles arises not from non-local action but from their shared causal history, and the update to the second particle’s state occurs in real-time, with a finite velocity determined by the lattice structure and the rule set.
Thus, collapse is no longer an instantaneous, mystical event. It is a rate-limited, causal propagation of state changes. The observable outcome of a measurement is simply the result of propagating the local rule across the lattice to the observable system.
✜ Elimination
Measurement is not a magical moment of wavefunction collapse; it is a clocked state change that occurs at a finite rate. When we measure a system, we are not forcing it into one of a set of predefined states instantaneously. Instead, we are observing the state of the system at a given time, which has evolved according to deterministic rules up to that point. This state is the result of the causal history of the system’s past interactions.
The idea that measurement occurs in an instant—"at the moment of observation"—is a distortion introduced by quantum formalism’s reliance on a smooth, continuous model. In reality, the system is updated discretely and causally. Measurement is no different from any other process that follows the rule. It is just a state readout, where the observable state is finally resolvable at a given location in space and time.
Thus, collapse is not instantaneous, but occurs at the same speed as any other causal event. It propagates through the system as the observable state is updated, step by step, according to the system’s local rules.
▣ Summary of Section
• Measurement is not an instantaneous collapse. It is a rate-limited process, dictated by the finite speed at which information propagates.
• The collapse of a wavefunction is not a mystical event; it is the deterministic resolution of a system’s state, which unfolds over time.
• No non-locality exists in this model. The apparent instantaneous effects are simply the result of shared causal histories and the propagation of states over time.
• Measurement is simply the readout of a system’s state at a given time, and its evolution follows the same deterministic rules as every other event in the system.
• The speed of collapse is governed by the rate of information transfer within the system, bound by Planck time and Planck length.
In this world, collapse is not magical, it is the final step in a finite causal chain, and it is always bound by the rate at which the system updates its state.
NOTE: Direct Contradiction with Mainstream QM Interpretations: This section directly contradicts most mainstream interpretations of quantum mechanics, particularly those that embrace the instantaneous and non-local nature of wavefunction collapse (e.g., the Copenhagen interpretation). It fundamentally redefines a core QM concept.
My thesis is falsifiable. To experimentally test it, one would need to:
• Design experiments to measure the finite speed of "collapse" or information resolution: If collapse is indeed a rate-limited process, there should be a measurable time delay for the "resolution of uncertainty" or state change to propagate across a distance, rather than appearing instantaneous. Current Bell tests generally assume instantaneous collapse when analyzing results, but more refined experiments might look for subtle time dependencies.
• Distinguish "shared causal history" from true non-locality: Experiments would need to be designed to differentiate between correlations arising from a pre-established shared history (as this paper suggests) and those that truly cannot be explained without faster-than-light influence or a rejection of realism. This is essentially what the various loopholes in Bell tests (e.g., locality, freedom-of-choice) aim to address. A successful demonstration of a local, deterministic model that fully reproduces the observed Bell correlations without relying on these loopholes would be a direct test.
• Investigate the "propagation of changes" in entangled systems: The model claims that the state of a second entangled particle evolves "in real-time, with a finite velocity determined by the lattice structure and the rule set". This implies that if one could precisely track the evolution of both particles, one would observe this finite-speed propagation, rather than an instantaneous correlation upon measurement.
∘ 7. Redefining Computation in a Bounded Universe
▣ Observation
Computation, in its most basic sense, is the manipulation of structure—patterns of information—within causal constraints. It is not a magical process, but a deterministic evolution of states, following the rules of a given system. Whether we're considering a mechanical computer, a biological brain, or a cellular automaton, computation is simply the step-by-step progression of information from one state to another within defined boundaries of interaction.
In the physical universe, the boundary is defined by causal locality—the fact that no information can travel faster than the speed of light and that all events are rate-limited by the temporal and spatial resolution of the system (i.e., Planck time and Planck length). This gives us a finite information density, a finite update rate, and a finite number of possible configurations. Thus, computation is not an infinite process; it is a finite evolution constrained by the fundamental limits of the universe.
✦ Reformulation
In the quantum computing framework, we traditionally think of computation as the manipulation of qubits, which can exist in a superposition of states, allowing for a massive parallelism that supposedly provides exponential speedup over classical computation. Quantum gates, by acting on qubits, exploit the properties of superposition and interference to achieve what appears to be computational efficiency beyond classical limits.
However, when we eliminate the infinite precision and superposition, quantum computation becomes a special case of deterministic evolution of state spaces, just as classical computation is. The superpositions in quantum computation are not truly ontological superpositions; they are compressions over multiple possible states, representations of the statistical evolution of a system’s configurations. What appears to be quantum speedup is simply a result of how the system’s possible paths are compressed into a manageable set of states, and the apparent parallelism is a reflection of multiple unresolved causal histories rather than actual simultaneous evolutions.
Rather than multiple paths being explored in parallel, we have one system evolving deterministically across a discrete space of states. The deterministic update of each of these states—whether on a classical computer or a quantum computer—is still fundamentally bounded by causal rules.
Key Shift:
What we call "quantum" computation is, in this view, merely a special case of deterministic evolution, where the probabilistic outcomes we observe are the result of layered ignorance over the underlying causal history. No quantum gate is magic; it is just an algorithmic step within a deterministic causal process.
✜ Limit
This brings us to a critical limit:
There is no computational advantage in this model that can arise from infinite structure. In classical computation, we manipulate bits within a finite structure, stepping through a finite number of possible configurations. In the quantum model, while it appears we are manipulating a much larger state space via qubits in superposition, we are still, at the deepest level, constrained by local causality. The apparent "speedup" in quantum computation is a mirage, a consequence of the model abstraction rather than an actual increase in computational capacity.
The speedup we see in quantum algorithms, like Shor's or Grover's, does not arise from an inherent ability to process more information or faster information transfer. Instead, it emerges because we are using the compression of many causal histories into a more compact form. The outcome is that certain problems, which classically require checking many paths (such as factorization or search), can be collapsed into a smaller set of representative outcomes in the quantum framework. But this process still fundamentally operates within a bounded, deterministic causal system.
▣ Summary of Section
• Computation is the manipulation of structure within causal constraints, not a mystical or infinite process.
• Quantum computation is deterministic evolution of state compressions, not a process of superposition or parallelism.
• The apparent speedup in quantum algorithms is not due to an increase in computational power, but because of how the system compresses and represents state space.
• No infinite structure is possible in this model—speedup is an illusion created by model abstraction.
• True computational advantage can only arise from utilizing all causal dependencies in a bounded, discrete framework, not from leveraging abstracted infinite resources.
In this framework, computation is always bounded by local rules, rate limits, and causal constraints. There is no magic to the speedup of quantum computers; it is simply the statistical collapse of deterministic, rule-bound processes.
▣ Explaining Observed Quantum Advantage
While the deterministic, finite-state model proposed here rejects the metaphysical assumptions of quantum mechanics—such as superposition, entanglement, and infinite precision—there remains the challenge of explaining the observed quantum advantage in real-world quantum computing experiments. Quantum computers, even in their nascent stages, have demonstrated the ability to solve certain problems (such as specific sampling problems, small-scale factoring) in ways that classical computers struggle to simulate efficiently. These results align with quantum mechanical predictions and suggest that there is something about the quantum framework—or rather, the underlying phenomena—that classical computation fails to capture.
The question, then, is this:
How do we explain these observed advantages within the framework of a deterministic, finite-state model, without relying on the concepts of superposition, entanglement, and infinite precision gates?
Alternative Explanation: Statistical Representation of Complex Evolution
In the classical model, problems such as factoring large numbers or solving certain sampling problems require a brute-force exploration of possibilities, a process whose computational complexity scales exponentially with the size of the problem. The key to quantum speedup lies in the apparent ability to solve these problems more efficiently, often in polynomial time, as demonstrated by Shor's algorithm and others.
In the deterministic finite-state model, this advantage can be explained through the statistical representation of causal evolution rather than the concept of superposition. Rather than a system "exploring" multiple potential states simultaneously in superposition, the system follows a deterministic evolution, but compresses the representation of the possible outcomes. The quantum algorithms leverage statistical dependencies in the causal histories of the system to collapse multiple computational paths into a smaller set of possibilities, effectively solving the problem with fewer computational steps. In essence, quantum advantage arises from how efficiently the system represents and resolves state paths, not from simultaneous parallel computation.
Let’s break this down mathematically:
Let Ω represent the entire configuration space of a problem (e.g., the set of all possible factorizations of a number), and let Ω' be the reduced configuration space that corresponds to the observed outcomes. In a classical system, finding the correct outcome requires searching through Ω in an exhaustive manner. However, in the quantum framework, the search space effectively becomes compressed. While the quantum computer does not perform a truly parallel search, it exploits the compressive pathways between configurations, reducing the space that must be traversed to find the correct answer.
The compression is not magical—it's the result of a deterministic evolution under the quantum rule set, where statistical properties of the paths and updates lead to efficiencies in resolving the problem.
Formally:
Let S₀ represent the initial configuration of the system, and let Sₜ represent the evolution of that configuration under a deterministic rule R over time. In a quantum system, this evolution can be compressed into a smaller subspace Ω' that encodes the solution, while still respecting the constraints of causal locality. The apparent parallelism and quantum speedup arise from how the system’s possible paths are layered and collapsed through efficient state traversal, not through the superposition of multiple parallel paths.
Thus, the quantum advantage is not the result of non-local influences or infinite precision, but of a more efficient representation of the problem's state evolution—a form of causal compression where the space of possibilities is reduced through the application of deterministic rules.
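To make "exploitable structure" concrete without claiming any speedup: the regularity that Shor's algorithm targets is the period of modular exponentiation. The sketch below finds that period by plain, deterministic traversal and then applies the standard reduction from period finding to factoring. It is purely illustrative (and assumes gcd(a, N) = 1), not an efficient method and not a proposed replacement algorithm.

```python
from math import gcd

def classical_period(a, N):
    """Order of a modulo N, found by exhaustive deterministic traversal.
    This is the structural regularity Shor's algorithm exploits;
    assumes gcd(a, N) == 1 so the sequence of powers returns to 1."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def factor_from_period(a, N):
    """Textbook reduction from period finding to factoring; works when the
    period r is even and a^(r/2) is not congruent to -1 mod N."""
    r = classical_period(a, N)
    if r % 2:
        return None
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None
    return gcd(y - 1, N), gcd(y + 1, N)

print(factor_from_period(7, 15))   # (3, 5): the structure, not the speed, is the point
```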
Empirical Success of Quantum Mechanics: Bridging the Gap
It is essential to emphasize that the deterministic model does not discard the empirical success of quantum mechanics. Rather, it seeks to reinterpret the phenomena predicted by quantum mechanics in a way that does not rely on metaphysical assumptions like superposition or entanglement.
The observed advantages of quantum computing arise because quantum algorithms are patterned around efficiently compressing state space representations, and the statistical interpretation of these processes, though superficially similar to quantum interference, emerges from deterministic rule applications. In short, the apparent quantum speedup is the result of efficient state traversal through a compressed configuration space, rather than the simultaneous exploration of multiple parallel states.
To provide a concrete and mathematically rigorous alternative, the compressed state space model can be framed as follows:
• A quantum algorithm such as Shor's factors numbers by effectively reducing the space of possible factorizations into a smaller, more manageable subspace.
• Classical methods require an exhaustive search across a large space of possible solutions.
• The quantum advantage in Shor's algorithm emerges from the efficiency of traversal across the compressed state space, exploiting specific symmetries and correlations that are present in the deterministic evolution of the system.
This is not a parallel search across multiple solutions. It is a compressed, deterministic exploration of fewer possible paths, resulting in a more efficient solution to the problem.
▣ Conclusion of Section
The observed quantum advantage is not an illusion, but a real phenomenon that can be understood within a deterministic framework.
• Quantum computing’s apparent speedup arises not from non-locality or superposition, but from the compression of possible state configurations.
• The efficiency of quantum algorithms comes from how these states are explored, not simultaneously, but deterministically through a process of statistical compression.
• This compression is the key to understanding why quantum algorithms solve certain problems more efficiently than classical ones: it’s not a magical quantum speedup but a more efficient representation of problem states under deterministic rules.
In this model, quantum mechanics remains valid empirically but is reinterpreted as a statistical approximation of a deeper deterministic evolution. The quantum phenomena we observe are simply the compressed summary of many deterministic paths, leading to a more efficient resolution of computational problems.
∘ 8. Replacing Mysticism with Mechanics
▣ Target
This section focuses on the metaphysical constructs that pervade modern quantum mechanics, particularly:
• Hilbert space and its infinite-dimensional vector rotations
• The Many-Worlds interpretation, suggesting an infinite number of parallel realities
• The concept of wavefunction collapse, often depicted as a non-local, metaphysical moment
• The ungrounded reliance on infinite precision in quantum states and operations
These constructs are mystical in nature—they assume infinities, unknowable states, and parallel universes. They impose a complex, abstract structure upon the natural world, elevating it to a level where intuitive understanding becomes difficult, and empirical validation is elusive. They imply a physical reality that is unknowable, chaotic, and mathematically infinite.
In contrast, our model aims to replace these abstract ideas with something more fundamentally grounded and empirically testable. We seek to redefine physics as a system of finite states, bounded update rules, and deterministic automata, all of which can be rigorously understood and observed.
✦ Action
To replace the mystical ideas of quantum mechanics, we need to transition from infinite-dimensional spaces to finite state spaces. Rather than relying on wavefunctions that exist in an abstract mathematical space with complex vector coefficients, we model the universe as a discrete lattice of cells that update deterministically based on local rules.
• Hilbert space is replaced with a finite state space Σ where each cell of the universe is a symbol drawn from a finite alphabet. Instead of an infinite-dimensional vector of amplitudes, the state of the universe at any given time is simply a configuration of states across all the cells in space.
• Many-Worlds becomes redundant. The idea of branching universes, where each possible outcome of a quantum measurement creates a new, parallel universe, is replaced by a deterministic update rule that defines a single path of evolution. There are no parallel realities, only one universe evolving in a rule-based, deterministic fashion.
• Wavefunction collapse is discarded as a metaphysical event. In our model, collapse is a propagation of state change, not an instantaneous, non-local transition. When a system is measured, the process is simply a causal update of the local state, resolved in finite time.
• The assumption of infinite precision is removed. In this model, all states are finite, and the resolution of the system is bounded by Planck time (tₚ) and Planck length (ℓₚ). There is no need for an infinitely precise state vector; all information is discretized, and the universe operates on a finite grid with a fixed resolution.
✜ Resolution
With these replacements, physics is no longer mystified. The universe becomes a deterministic machine, governed by well-defined rules. We move from a universe of infinite possibilities and indeterminism to a universe of finite possibilities and determinism. There are no infinite vector spaces. There are no branching worlds. There are no non-local collapses. Instead, we have:
• Finite-state machines that represent the entire universe
• Bounded update rules that govern how each state evolves over time
• Deterministic automata that track the causal evolution of the universe, one state at a time
This is a grounded view of reality, where:
• Every event is causal and determined by the previous state of the system.
• No hidden variables or parallel universes are needed to explain the outcomes of measurements.
• The universe is a computational process, not a metaphysical mystery.
In this view, physics is no longer based on unknowable abstractions. It is based on observable, deterministic rules, grounded in finite states and causal evolution. Physics, once again, becomes understandable, testable, and observable—a process of rule-based state evolution.
▣ Summary of Section
• Mysticism in quantum mechanics—represented by concepts like Hilbert space, Many-Worlds, and wavefunction collapse—is replaced with a grounded, rule-based framework.
• Finite state machines and bounded update rules are the new foundation of physics.
• Wavefunction collapse becomes a deterministic, rate-limited process of state change, not an instantaneous, non-local event.
• The concept of infinite precision is discarded in favor of a discrete, finite resolution governed by Planck units.
• Physics is re-grounded—it is no longer an abstract, mystical realm but a deterministic automaton, governed by local rules and causal evolution.
In this framework, the universe is not a quantum mystery to be solved with infinite math and unobservable assumptions. It is a mechanical process, unfolding step by step, one finite state at a time.
∘ 9. The Planck Boundary — A Wall Against Infinity
▣ Assertion
The smallest unit of reality is not an infinitesimal. There is no infinite smallness or continuous resolution at the foundational level of physical space. Instead, the universe is composed of Planck-scale blocks, each representing the finite, indivisible unit of space-time. This unit is defined by the Planck length (ℓₚ), the Planck time (tₚ), and the Planck energy (Eₚ). These are not arbitrarily small values in a continuum but fundamental physical limits that mark the resolution boundaries of the universe.
At the smallest scales, reality is discrete, not continuous. This discrete structure eliminates the need for infinite precision and the idea of infinitesimally small distances or timescales. Rather than being a smooth, continuous fabric, spacetime is pixelated, with each "pixel" corresponding to a Planck-scale unit of space-time. This framework operates in stark contrast to the assumptions of continuum models in modern physics, which treat space and time as infinitely divisible.
✦ Consequence
The introduction of Planck-scale blocks has profound consequences for the structure of physical theories. Continuum models, which rely on the idea of smooth, infinitely divisible space-time, break down under the constraints of discrete boundaries. This includes:
• General relativity: The idea of a smooth, continuous fabric of spacetime is incompatible with the concept of discrete blocks. The very notion of a curved continuum in general relativity becomes incomplete, as the smallest scales of space are not infinitely divisible. Instead, they are quantized and governed by discrete rules.
• Quantum field theory: Field theory relies on the idea of continuous fields permeating space. In this framework, fields are defined at every point in space-time. However, when we impose a Planck-scale boundary, these continuous fields must be redefined as discrete interactions that occur within the finite structure of the Planck blocks. There can no longer be infinitesimally small field values; field interactions must be modeled on a finite lattice of space-time units.
• Particles and interactions: Particles are typically modeled as point-like objects in a continuous space. However, in a Planck-bound universe, particles must emerge from discrete interactions at the Planck scale. These particles are no longer point-like in the traditional sense, but rather exist as localised excitations of discrete fields, with their properties (mass, charge, spin, etc.) determined by the topological arrangement of the Planck-scale blocks.
By introducing this discreteness at the fundamental level, we remove the infinite degrees of freedom that continuum models assume. There are no infinitely small regions of space to integrate over, no infinite number of states within a field, and no need for renormalization. The infinite precision of classical models is replaced by the finite resolution dictated by the Planck units.
✜ Framework
All fields, particles, and interactions in the universe must now be reconstructed atop a finite topology that respects the Planck-scale boundary. This means:
• Fields: Fields must be re-conceptualized as discrete values defined on a finite lattice. Instead of the traditional view of continuous fields, such as the electromagnetic field or gravitational field, we will model these fields as finite sets of interactions occurring at each discrete cell. These interactions will follow local update rules that operate on the Planck-scale lattice.
• Particles: In this framework, particles are not point-like objects, but localised excitations of the underlying fields. These excitations are finite in extent and quantized, existing within the structure of the Planck-scale blocks. The properties of these particles, including mass, charge, and spin, are determined by the configuration of the fields at the Planck level and the local update rules that govern their evolution.
• Interactions: The fundamental forces (gravitational, electromagnetic, weak, and strong interactions) must be re-imagined as interactions between discrete states within the finite topology. For example, gravity, traditionally described by the curvature of a continuous spacetime fabric, is now understood as the effect of discrete interactions between Planck-scale blocks, mediated by fields defined on this lattice. The strength of these interactions is determined by the local update rules that govern the transitions between states in the lattice.
▣ Summary of Section
• The smallest unit of reality is Planck-scale blocks, not infinitesimals or continuous space-time.
• Continuum models (such as general relativity and quantum field theory) break down because they rely on infinite precision and smooth, continuous space-time.
• The universe operates on a finite lattice, where fields and particles are discrete and defined by local update rules.
• All forces, particles, and interactions emerge from the interactions of Planck-scale blocks under deterministic evolution.
• The Planck boundary provides a hard limit on the resolution of space-time and enforces finite topology throughout the universe.
By enforcing this Planck-scale structure, we not only eliminate the infinite degrees of freedom associated with continuum models but also provide a concrete framework for understanding the emergence of particles, fields, and interactions from discrete, rule-based evolution. This approach grounds physics in the finite and the deterministic, offering a more robust foundation for understanding the underlying mechanics of the universe.
∘ 10. Implications for Cosmology and Causality
▣ Forward View
A deterministic, finite-state model of the universe fundamentally reshapes our understanding of entropy, reversibility, and the arrow of time. In conventional physics, time is treated as an infinite continuum, where processes can theoretically evolve without any inherent limitations. This model, however, introduces an inherent rate-limited evolution, where the progression of time is tied to the rate at which information propagates across the finite causal lattice of the universe.
• Entropy: In the standard thermodynamic model, entropy is defined as the measure of disorder, often interpreted as the tendency for systems to evolve towards greater randomness or higher disorder. In a deterministic, finite-state universe, however, entropy must be redefined. It is not a gradual, irreversible increase in disorder, but rather a statistical reflection of the evolution of states over time. The degree of disorder reflects the number of distinct configurations accessible to the system at any point, not an infinite progression.
• Reversibility: In classical mechanics, certain systems evolve in a time-symmetric manner, allowing for time reversibility. However, the introduction of finite-state boundaries and causal constraints provides a natural arrow of time. Time's progression is bounded by the discrete update rules, meaning that past states are limited by the available number of possible configurations at each step. While certain microstate configurations may theoretically be reversible, the global evolution remains constrained by the finite nature of the universe's state space.
• Arrow of Time: The directionality of time in a deterministic model follows from the progression through a bounded state space. The arrow of time is dictated by the causal propagation of information across the lattice. As the universe evolves, configurations move from one state to the next in a determinate direction, with each step being finite and irreversible. The arrow of time emerges from the causal propagation of updates, not from an abstract notion of entropy growth.
✦ Constraint
In this framework, concepts like black holes, singularities, and wavefunction entanglement are finite-state structures, not infinities. The classical picture of a black hole, for example, involves a singularity—a point where gravity becomes infinite and spacetime curvature diverges. In our model, singularities are replaced by finite boundaries where space-time and gravitational fields are discrete.
• Black holes: Rather than a gravitational singularity at the center, we have a localized structure where the state of matter is highly compressed within a finite region, and the boundaries of this structure are defined by Planck-scale interactions. The event horizon is not a boundary of infinite curvature but rather a sharp transition between regions of different causal states in the finite lattice.
• Singularities: Singularities, which in classical physics are infinite densities at the centers of black holes or other gravitational systems, become highly localized states in this model. These are not places of infinite density, but rather regions where the interactions within the finite lattice lead to extreme local state configurations. These states are determined by the causal structure of the universe and are not infinite, as they emerge from a bounded state space.
• Wavefunction entanglement: In quantum mechanics, entanglement involves the non-local correlation between particles, even across vast distances. In the finite-state model, entanglement is redefined as causal correlation within a deterministic lattice. The connections between entangled particles are not instantaneous or non-local, but arise from the shared history of their respective state evolutions.
✜ Future Model
In the proposed future model, the universe is viewed as a causal lattice with no collapse points—just compression artifacts. This framework eliminates the need for wavefunction collapse, non-locality, or singularities, instead offering a causal network that propagates information in discrete steps.-
No Collapse Points: The collapse of the wavefunction, as it is classically understood, is an artefact of the statistical representation of a system’s state. In this model, there is no collapse; instead, there is a rate-limited propagation of information through a finite lattice of causal connections. Each step in the evolution of the system is bounded by the finite state-space and the update rules that govern the system.
-
Compression Artefacts: What appears as quantum interference or entanglement is reinterpreted as the compression of state possibilities. These artefacts are not real, ontological superpositions; they reflect the limits of our ability to track the full complexity of the system’s evolution. The statistical nature of quantum predictions arises not from intrinsic randomness, but from ignorance of the underlying microstate configurations.
-
Causal Lattice: The universe is a causal lattice, where space-time itself is discrete and finite, and every event is determined by local update rules that propagate information in bounded steps, as sketched below. There is no infinite resolution in space or time. The causal propagation of updates defines the universe’s evolution, and every event is tied to the causal network that came before it.
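The bounded propagation that defines the lattice can be shown with a short sketch. The nearest-neighbour rule and lattice below are hypothetical stand-ins, not the actual update rule: a single disturbance's influence grows by at most one site per tick on each side, a discrete light cone with no instantaneous action at a distance.
```python
# Minimal sketch, assuming a hypothetical nearest-neighbour rule on a toy 1D lattice.
# A single disturbance can influence at most one additional site per tick on each side:
# a discrete light cone, with no instantaneous action at a distance.
WIDTH = 41
TICKS = 10

def step(state):
    """Local rule (illustrative): a cell becomes 1 if it or either neighbour was 1."""
    return [
        1 if (state[max(i - 1, 0)] or state[i] or state[min(i + 1, WIDTH - 1)]) else 0
        for i in range(WIDTH)
    ]

state = [0] * WIDTH
state[WIDTH // 2] = 1  # one perturbation at the centre
for t in range(TICKS):
    print(f"tick {t}: influence spans {sum(state)} site(s)")  # grows by at most 2 per tick
    state = step(state)
```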
▣ Summary of Section-
A deterministic universe introduces a new understanding of entropy, reversibility, and the arrow of time, grounded in the finite, rate-limited update rules of a causal lattice.
-
Black holes, singularities, and wavefunction entanglement are not infinite phenomena but finite-state structures with causal dependencies based on the universe's discrete lattice.
-
The collapse of the wavefunction is replaced by rate-limited state propagation, and the universe's evolution is dictated by compression rather than superposition or non-locality.
-
The causal lattice model proposes a universe with bounded, rule-based updates, where physics is grounded in deterministic mechanics, not mystical abstractions.
In this model, the universe becomes a grounded, causal system, governed by finite, deterministic rules and bounded state spaces. Time, space, and causality are no longer subject to the whims of infinity; instead, they are tied to the concrete limits of the Planck-scale resolution and the deterministic evolution of the system’s state.
∘ 11. Conclusion — Back to the Machine
▣ Summary
The quantum narrative, with its mysticism of wavefunctions, superposition, and non-local entanglement, dissolves when we reframe the universe through the lens of discrete units and finite clocks. The continuous space-time model, with its unending possibilities and infinite smallness, is replaced by a bounded, deterministic system governed by finite state blocks.
In this worldview, quantum phenomena are not the product of an infinite continuum of probabilities, but rather emerge from the deterministic evolution of discrete states. Concepts like wavefunction collapse and probabilistic wave-particle duality are simply artefacts of causal compression within this finite system. The universe is not a probabilistic soup, but a causal machine, where the evolution of states unfolds step by step, tick by tick, each step constrained by local rules and finite boundaries.
The quantum paradoxes lose their grip because, at the most fundamental level, nothing is continuous, and no infinite precision is required. There is no infinite fine-tuning of states, no superposition of states, and no non-local entanglement. There is only the deterministic propagation of information, bound by finite resolutions of space-time.
✦ Principle
Physics is not floating in probability. It is ticking in countable steps. The deterministic nature of this universe means that every event, no matter how complex, arises from a finite sequence of causal transitions. These transitions are bounded in time by Planck time and bounded in space by Planck length. The universe is not governed by chance, but by rules that evolve the system from one state to another, each step a countable, deterministic transition.
The seeming randomness of quantum mechanics—such as the probabilistic outcomes of particle positions or measurement outcomes—stems not from intrinsic uncertainty in the universe but from our inability to track the full causal history of the system. Quantum mechanics, as we currently understand it, is simply a compression of the full causal structure into a tractable form, a statistical summary of deterministic transitions that we cannot completely resolve.
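The claim that apparent randomness is a compression of unresolved causal detail can be illustrated with a small sketch. The update rule and observation function below are arbitrary stand-ins, not a physical model: a fully deterministic hidden state is evolved step by step, but the observer records only one coarse bit of it, and the resulting bit stream can only be described statistically.
```python
# Minimal sketch, assuming an arbitrary deterministic update (a linear congruential
# step, chosen only for illustration) and an observer who can read just one coarse
# bit of the hidden microstate. Nothing below is random, yet the observed bit
# stream can only be described statistically by that observer.
def evolve(x):
    """Deterministic hidden-state update (illustrative, not a physical model)."""
    return (1103515245 * x + 12345) % (2 ** 31)

def observe(x):
    """Coarse-grained observation: a single high-order bit of the microstate."""
    return (x >> 30) & 1

x = 1  # exact initial condition; with it, every 'outcome' below is predictable
bits = []
for _ in range(10_000):
    x = evolve(x)
    bits.append(observe(x))

print(f"fraction of 1s observed: {sum(bits) / len(bits):.3f}")  # close to 0.5, yet fully determined
```
Someone with access to the full microstate could predict every bit; someone restricted to the coarse observable can only summarise the stream statistically, which is the sense in which randomness is epistemic here.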
✜ Endgame
The future of physics lies in rebuilding the sciences on rules, not on reverence for abstract concepts or infinite approximations. The quantum mysticism that has dominated physics for nearly a century—defined by wavefunctions, non-locality, and hidden infinities—must be replaced by a vision of the universe as a causal machine with finite rules.
This is not about collapse—the sudden, magical reduction of a wavefunction into a single outcome. It is about convergence: the process by which information flows through a finite, discrete system, resolving from many possible configurations into a single, observable state. The apparent randomness in quantum systems is a manifestation of our limited ability to track the full causal history. What we see as collapse is merely the convergence of causal updates in a rule-bound system.
This is the final step in understanding the universe—not as a collection of mystical, unobservable phenomena, but as a machine governed by deterministic, finite rules. In this new model, we are not lost in a sea of probabilities and uncertainties. Instead, we can observe, track, and predict the universe’s evolution as it ticks forward, one step at a time.
▣ Summary of Section-
The quantum narrative dissolves when we frame the universe as discrete units and finite clocks, where causal evolution replaces metaphysical probability.
-
Physics is not probabilistic. It is deterministic, unfolding in countable steps governed by local rules.
-
The apparent randomness of quantum mechanics is explained as the compression of causal histories, not intrinsic uncertainty.
-
Collapse is not metaphysical. It is the convergence of causal paths, determined by finite, local rules.
-
The future of physics involves rebuilding on rules rather than abstract models, offering a grounded, deterministic vision of the universe as a causal machine.
In the final analysis, physics is no longer the domain of abstract infinities, mystical particles, or unobservable wavefunctions. It is the domain of rules, where every state update follows from the previous one in a finite, deterministically governed universe. No magic. No probability. Just ticking. Just evolution. Just the machine.
NOTE: The ultimate test of this "rejection" lies in the ability of the CA model to quantitatively reproduce all quantum phenomena and explain why classical computers cannot efficiently simulate the quantum systems observed to date, without resorting to the rejected quantum ontology.
Note: Reinterpretation of Bell's Theorem: My reinterpretation of Bell's theorem is a critical point. Bell's theorem shows that if local realism holds (local influences and pre-determined outcomes), then certain inequalities must be satisfied. Both quantum mechanics and experiment violate these inequalities, which has traditionally been read as forcing a rejection of locality, of realism, or of both. My position rejects the non-local interpretation by positing a deterministic, local substratum in which correlations arise from shared history. To fully explain the observed Bell violations, however, this deterministic model must be expanded to show explicitly how its statistical outcomes match the quantum predictions, despite Bell's theorem ruling out local hidden-variable accounts of exactly those correlations. The text describes "overlapping histories" and "shared causal dependency", but I do not mathematically demonstrate how these local, deterministic mechanisms reproduce the quantum correlations that defy classical local hidden-variable models. That is the central challenge taken up in other, more formal papers.
Appendix: Current Experimental Data Challenging the Thesis
The thesis presented in this paper introduces a deterministic, discrete cellular automaton model to reinterpret quantum phenomena. However, there are existing experimental data and results that are widely interpreted by the mainstream physics community as either falsifying or presenting significant challenges to key claims of this model. While the paper presents conceptual reinterpretations, there is a lack of full quantitative proof for how this model can consistently match observed data. Below is an in-depth analysis of the points where the thesis faces difficulties in reconciling with experimental findings.
1. Fundamental Randomness of Quantum Events
Thesis Claim:
"Nothing is probabilistic. All apparent randomness arises from unresolved or unknown initial conditions."
"All randomness is epistemic, not ontological."
Falsifying Evidence (Mainstream Interpretation):
Numerous experiments, such as radioactive decay, single-photon detection at a beam splitter, and the Stern-Gerlach experiment, consistently produce outcomes that appear fundamentally probabilistic. In these experiments, the standard interpretation of quantum mechanics (e.g., Copenhagen interpretation) posits that the randomness is inherent in nature and not due to hidden variables.
For instance, in radioactive decay, individual nuclei decay at unpredictable times, and quantum theory predicts only the probability distribution of decay events. Similarly, photon detection at a beam splitter shows random outcomes: the photon is registered in one output path or the other, even when the system is prepared in a superposition state. The Stern-Gerlach experiment likewise shows random outcomes when a particle's spin is measured along a particular axis.
The thesis model suggests that all of this randomness is due to unknown initial conditions, positioning itself as a hidden variable theory. However, the mainstream interpretation uses the violation of Bell's inequalities and CHSH inequalities to argue that the randomness observed is ontological and not due to hidden variables. The paper needs to quantitatively reproduce the observed probabilistic nature of these experiments, which has not yet been accomplished.
2. Physical Reality of Superposition
Thesis Claim:
"There is no physical superposition. No configuration in the real cellular substrate exists in more than one state at a time."
"Superposition is an artefact of unresolved branches in causal structure."
Falsifying Evidence (Mainstream Interpretation):
Experiments such as the double-slit interference experiment with particles like electrons or even large molecules like fullerenes provide strong evidence of the physical reality of superposition. In these experiments, when individual particles pass through two slits, interference patterns form, which is inconsistent with the idea that particles pass through only one slit at a time. These interference patterns are seen as evidence that the particle is in a superposition of states.
In the double-slit experiment, single electrons sent through the apparatus one at a time build up an interference pattern on the detector screen, as though each electron explores both slits. The paper suggests that this superposition is a result of "unresolved branches" and not a true physical superposition. However, the thesis does not currently provide a detailed cellular automaton model that can reproduce these interference patterns (including their phase dependencies) using deterministic local rules. A quantitative, mechanistic explanation for these phenomena remains missing.
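For reference, the textbook quantum calculation behind those patterns is compact; the sketch below sums the two slit amplitudes and squares the result. The wavelength, slit separation, and screen distance are illustrative numbers only. Any deterministic lattice model claiming to replace superposition would need to reproduce intensities of this form, including their phase dependence.
```python
# Minimal sketch of the textbook two-slit calculation referred to above, assuming a
# far-field setup with equal-amplitude slits. Wavelength, slit separation, and screen
# distance are illustrative numbers only.
import cmath

wavelength = 50e-12  # ~50 pm, of the order of a fast electron's de Broglie wavelength
d = 1e-6             # slit separation (illustrative)
L = 1.0              # distance from slits to detection screen (illustrative)
k = 2 * cmath.pi / wavelength

def intensity(x):
    """|psi1 + psi2|^2: amplitudes from both slits are added, then the sum is squared."""
    r1 = ((x - d / 2) ** 2 + L ** 2) ** 0.5
    r2 = ((x + d / 2) ** 2 + L ** 2) ** 0.5
    psi = cmath.exp(1j * k * r1) + cmath.exp(1j * k * r2)
    return abs(psi) ** 2

# Fringe spacing here is wavelength * L / d = 50 micrometres.
for x_um in range(-100, 101, 25):
    print(f"x = {x_um:+4d} µm   relative intensity = {intensity(x_um * 1e-6):.2f}")
```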
3. Non-Locality of Entanglement (Bell Test Violations)
Thesis Claim:
"Entanglement is not non-local... it is redefined as synchrony in the evolution of rule-dependence between different regions of space... The Bell violations predicted by quantum theory lose their ontological significance."
"There is no spooky action at a distance. There is only a shared history, which manifests as correlation in the present."
Falsifying Evidence (Mainstream Interpretation):
Numerous loophole-free Bell test experiments, which have closed detection, locality, and freedom-of-choice loopholes, have repeatedly violated Bell inequalities, showing that quantum entanglement can exhibit correlations that cannot be explained by any local hidden variables theory. These experiments provide strong evidence for non-locality, challenging the thesis' claim that entanglement is not non-local.
Bell's theorem, together with the experimental violations of Bell inequalities, is widely interpreted as showing that locality or realism (or both) must be given up. The paper acknowledges that this is a central challenge, yet no mathematical demonstration has been provided to show how local, deterministic mechanisms can reproduce the correlations that entanglement experiments observe. Without such a demonstration, the Bell-test data stand in direct conflict with the thesis.
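For concreteness, the quantity these experiments measure is the CHSH combination; the sketch below evaluates the standard quantum prediction for a spin-singlet pair at the conventional optimal angles (these formulas are the textbook quantum-mechanical ones, not an output of the lattice model). The value 2√2 exceeds the bound of 2 that any local hidden-variable account must satisfy, which is precisely the gap the shared-history picture has yet to close.
```python
# Minimal sketch of the CHSH quantity measured in Bell tests, using the textbook
# quantum prediction for a spin-singlet pair, E(a, b) = -cos(a - b), at the
# conventional optimal angles. Any local hidden-variable model must keep |S| <= 2.
import math

def E(a, b):
    """Quantum-mechanical correlation for singlet spins measured at angles a and b."""
    return -math.cos(a - b)

a, a_prime = 0.0, math.pi / 2
b, b_prime = math.pi / 4, 3 * math.pi / 4

S = E(a, b) - E(a, b_prime) + E(a_prime, b) + E(a_prime, b_prime)
print(f"quantum CHSH value |S| = {abs(S):.4f}")  # ≈ 2.8284 = 2*sqrt(2), beyond the classical bound of 2
```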
4. Instantaneous Nature of Wavefunction "Collapse"
Thesis Claim:
"There is no instantaneous collapse... The collapse of a wavefunction—or rather, the reduction of uncertainty in the system—is not an instantaneous event but rather a rate-limited process, unfolding over time."
"The speed of collapse is governed by the rate of information transfer within the system, bound by Planck time and Planck length."
Falsifying Evidence (Mainstream Interpretation/Lack of Evidence):
While it is difficult to definitively prove the instantaneous nature of wavefunction collapse experimentally, existing entanglement experiments are consistent with instantaneous correlations. When entangled particles are measured, the results appear to be instantaneously correlated, even over large distances. The thesis suggests that this collapse occurs at a rate limited by the speed of information transfer, but no experimental evidence currently supports a measurable finite speed for collapse or information resolution across space, especially at the Planck scale.
Additionally, experiments designed to bound any finite speed of collapse have so far observed no delay in the appearance of correlations, so instantaneous collapse remains consistent with current experimental results. The absence of any detected finite speed of collapse is a significant challenge for the thesis.
5. Quantum Computational Advantage
Thesis Claim:
Quantum speedup is an "illusion" or a "mirage," and quantum computers merely perform "statistical compression" or "efficient state traversal" without true parallelism.
Falsifying Evidence (Mainstream Interpretation):
Quantum supremacy experiments, such as those carried out by Google (Sycamore processor) and China’s Jiuzhang and Zuchongzhi processors, have demonstrated that quantum computers can solve specific problems, such as sampling tasks, that are intractable for classical computers. The observed quantum speedup in these experiments is consistent with the quantum mechanical principles of superposition and entanglement, and it has been difficult to explain these results as mere classical path traversal or compression without invoking quantum principles.
The thesis suggests that quantum computers achieve speedup through statistical compression, but the actual performance observed in quantum supremacy tests suggests otherwise. The paper's interpretation remains conceptual, without providing a concrete algorithm that explains these results in a classical, deterministic framework.
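Part of why these results are hard to dismiss is the sheer cost of brute-force classical simulation. The sketch below shows only the naive state-vector scaling, one complex amplitude per basis state; real classical simulators exploit structure and do far better, so this is a rough illustration of the exponential bookkeeping, not a statement about the best-known classical algorithms.
```python
# Naive state-vector bookkeeping: 2^n complex amplitudes at 16 bytes each.
# Real classical simulators exploit structure and do far better; this only
# illustrates why brute force is not an option at supremacy-scale qubit counts.
for n_qubits in (30, 40, 53):
    amplitudes = 2 ** n_qubits
    gib = amplitudes * 16 / 2 ** 30  # complex128 per amplitude, expressed in GiB
    print(f"{n_qubits:2d} qubits -> {amplitudes:.2e} amplitudes ≈ {gib:,.1f} GiB")
```
A 53-qubit state vector alone would need on the order of a hundred petabytes of memory, which is why the sampling claims cannot be waved away as ordinary classical bookkeeping.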
Summary
While the thesis provides a conceptual reinterpretation of quantum phenomena through a deterministic, discrete cellular automaton model, it is currently contradicted by several key aspects of experimental data:-
Fundamental randomness: The paper's claim that all quantum randomness is epistemic (due to hidden variables) contradicts experimental results that are consistently interpreted as ontological randomness.
-
Physical superposition: The observed interference patterns from quantum experiments suggest the real existence of superposition, something the thesis currently cannot replicate through its deterministic model.
-
Non-locality: Violations of Bell's inequalities suggest that quantum entanglement is non-local, contrary to the thesis' claim of synchrony in a local, deterministic framework.
-
Wavefunction collapse: Experimental results are consistent with instantaneous collapse, which challenges the thesis' claim of a rate-limited collapse.
-
Quantum computational advantage: Observed quantum supremacy suggests that quantum computers can achieve true quantum speedup, which contradicts the thesis' claim of statistical compression as the source of speedup.
In conclusion, while the thesis is logically consistent and provides a conceptual framework, it is presently in conflict with existing experimental data. A full reconciliation of the thesis with these observations would require quantitative proofs and empirical validation that are beyond the scope of the current work.