The Bunkum Mystification of Quantum Mechanics by Non-Physicists

This is not exactly on-topic but I think it may be interesting.

It seems there has been a research program going on for a while to determine whether the use of complex numbers (which is pretty weird in a physical theory) is essential to quantum mechanics, or whether they could be dispensed with, using only nice good ol' real numbers instead.

Incredible as it may seem, physicists have devised actual real-world experiments to find out (*), and it seems that, indeed, quantum mechanics does need complex numbers (and is therefore fundamentally, utterly weird).

Imaginary numbers might seem like unicorns and goblins — interesting but irrelevant to reality.

But for describing matter at its roots, imaginary numbers turn out to be essential. They seem to be woven into the fabric of quantum mechanics, the math describing the realm of molecules, atoms and subatomic particles. A theory obeying the rules of quantum physics needs imaginary numbers to describe the real world, two new experiments suggest.
https://www.sciencenews.org/article/quantum-physics-imaginary-numbers-math-reality


(*) an article which underpins the theoretical framework for the experiments has just been published in Nature; if you can access the journal it is here, and it has also been preprinted on arXiv here.
 
it seems that, indeed, quantum mechanics does need complex numbers (and is therefore fundamentally, utterly weird).
Things get even weirder with the extensions of complex numbers: quaternions, which combine real numbers with three different kinds of "imaginary" numbers, and octonions, which have seven kinds of "imaginary" numbers. Quaternions can be linked to special relativity. In relativity, the three dimensions of space are given one sign, while time is given the other. If you square all of the quantities, you get three negative numbers for space and one positive number for time. In that way, time and space are able to trade off for each other as happens in relativity (travel faster in space according to an observer, and you travel slower in time according to that observer). Meanwhile some physicists are trying to crack the code on matter and energy with octonions, with mixed success. Here's a fascinating article on one of the pioneers in this field, Cohl Furey, and her work with octonions: https://www.quantamagazine.org/the-octonion-math-that-could-underpin-physics-20180720/
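The sign pattern described above (three minuses for space, one plus for time) can be made concrete with a few lines of code. This is my own illustrative sketch, not from the linked article: a bare-bones Hamilton product showing that each of the three quaternion units squares to -1, alongside the spacetime interval with signature (+, -, -, -).

```python
# Minimal quaternion sketch (illustrative): the three imaginary units
# i, j, k each square to -1, mirroring the three minus signs in the
# spacetime interval s^2 = t^2 - x^2 - y^2 - z^2 mentioned above.

def qmul(a, b):
    """Hamilton product of quaternions given as (w, x, y, z) tuples."""
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return (w1*w2 - x1*x2 - y1*y2 - z1*z2,
            w1*x2 + x1*w2 + y1*z2 - z1*y2,
            w1*y2 - x1*z2 + y1*w2 + z1*x2,
            w1*z2 + x1*y2 - y1*x2 + z1*w2)

i, j, k = (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)
assert qmul(i, i) == (-1, 0, 0, 0)   # i^2 = -1
assert qmul(j, j) == (-1, 0, 0, 0)   # j^2 = -1
assert qmul(k, k) == (-1, 0, 0, 0)   # k^2 = -1
assert qmul(i, j) == k               # ij = k (multiplication is non-commutative)

def interval(t, x, y, z):
    """Spacetime interval with signature (+, -, -, -)."""
    return t*t - x*x - y*y - z*z
```

The non-commutativity (ij = k but ji = -k) is exactly what distinguishes quaternions from complex numbers, and octonions go one step further by dropping associativity as well.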

I'm just a caveman, but my hunch is that the final theory of physics will employ none of the above kinds of numbers. Instead, it will require a new system of discrete numbers, of which the reals/complex/quaternions/octonions are continuous idealizations — and without which, no one will crack this nut.
 
Saw this only now. Pardon for the delay.

Hmm, looks like gravity was a well-chosen example after all:
Article:
Newton's law of gravitation, the linchpin of his new cosmology, broke with explanatory conventions of natural philosophy, first for apparently proposing action at a distance, but more generally for not providing "true", physical causes. The argument for his System of the World (Principia, Book III) was based on phenomena, not reasoned first principles. This was viewed (mainly on the continent) as insufficient for proper natural philosophy.

Newton's law of universal gravitation was a realist theory intended to explain how the universe actually works whether or not an observer is observing it or interpreting it (bold added):

Article:
Newton held a realist reading of scientific theory as based upon inference from facts and observation, and his gravitational-theory (or NGT) as deduced from observed phenomena and Kepler's laws. Duhem criticises this realist approach to scientific-theory and NGT in particular, claiming empirical evidence cannot force theory-adoption.


According to NGT, every particle in the universe actually attracts every other particle, independent of observers. As the citation above demonstrates (extracted from a 2013 article on the instrumentalism/positivism which you and markus appear to subscribe to in your view of science), whether science is instrumentalist/positivist or realist is still very much an ongoing debate within the Philosophy of Science. Outside QM (within physics and other natural sciences) and certain social sciences, most scientists, in my subjective experience, tend to be realist rather than positivist in their orientation. They're usually interested in real causes underlying real phenomena. However, I admit I have not come across any surveys on the matter taken by a credible cross-section of the world's scientists across all disciplines. Such a survey could shed some light on whether my subjective experience corresponds to reality or not (pun intended).

The NGT does not offer a "true" physical cause in the conventional sense of a cause (as understood during Newton's time), whereby a physical event is explained by a localized cause or a set of localized causes immediately preceding the phenomenon along a chronological causal chain, without any strange action-at-a-distance. The law of gravitation proposed by the NGT, which causes attraction between particles, does not operate earlier in time than the actual motion of the particles, or locally. It is simultaneous and universal. Hence, the NGT posits a non-standard physical cause -- a physical law. But under the PoSR -- in other words, under a realist philosophy of science -- it would still qualify as a physical cause: a sufficient reason for why the particles of the universe, according to Newton, behave in a certain way. Just not the kind of cause that occurs along a time dimension. In other words, for Newton the NGT describes a real cause; he did not intend his theory merely to provide a mathematical formula or a calculational device for predicting observation outcomes (unlike QM, which is just that), even though that is also what it does, with a limited measure of success.
 

Source: https://www.youtube.com/watch?v=7oWip00iXbo&t=15s

A fascinating exploration of Jacob Barandes' theory of Stochastic-Quantum Correspondence which, essentially, reduces the wave function living in Hilbert space with all its attendant oddities (including entanglement and superposition) to a classical stochastic process involving definite particles moving along trajectories in spacetime.

Philosophers of Physics like Barandes at Harvard and Tim Maudlin at NYU, amongst a few others, are capable of discussing quantum mechanics with appropriate philosophical depth and scientific accuracy, demystifying popular misconceptions regarding the 'strangenesses' involved. Many science publications, media platforms and podcasts, including the likes of Alex O'Connor, rely on science popularizers who are not immersed experts on the topic, no matter how eloquent and lucid. The odd implications of the Hilbert-space wave function, when read as literal descriptions of reality, are happily albeit sloppily and ignorantly adopted as great candidates to account for consciousness, which strikes us as equally odd and mysterious. Proponents of panpsychism, a trending philosophy of mind and reality, are some of the most uncritical mystifiers of quantum mechanics, and among the sloppiest readers of its formalism.

As regards the OP, both Barandes and Maudlin are ever quick to remind us that the 'oddities' of quantum mechanics are mostly artifacts of its mathematical formalism -- especially the wave function. The wave function was historically devised, and has been primarily employed, for predicting measurement outcomes such as those established through the double-slit experiment. The wave function -- a complex-valued function in Hilbert space evolving under a differential equation (the Schrödinger equation) -- is a mathematical tool for prediction rather than a description of quantum reality. The problems and weirdnesses arise when you try to employ such mathematical tools as explanations of ontological reality, and when you epistemologically equate them with non-instrumentalist/realist theories (most other physical theories, including General Relativity) that actually attempt said explanation. The latter are what Maudlin would call proper scientific theories: attempts to understand the universe whilst also producing accurate empirical predictions that give them credence.

Empirically (i.e. in the actual observational scientific sense), in the double-slit experiment only discrete particles locatable in spacetime are ever observed (dots on a screen, etc.). This fact casts reasonable doubt on any strong/dogmatic claim that these particles couldn't have existed in spacetime before measurement (appearing on the screen, etc.) or after, and that they amount to 'collapses' of some complex-valued mathematical object (the wave function, with all its attendant inferences like superposition) floating in the universe.

In fact, wave function collapse (whether by measurement, gravity, subjective observer, randomly fluctuating field, pick your favourite interpretation) is itself a realist theory which views the wave function as a physical description of reality. And yet, it's a theory rather than an observed fact. Hence the very question of 'what produces the strange wave function collapse', often asked by science popularizers and even by some physicists, falsely assumes the wave function collapse as a physical fact. The De Broglie-Bohm, the Many-Worlds and Barandes' Stochastic-Quantum Correspondence interpretations, to mention a few, reject collapse theories entirely.

Indeed, collapse theories commit what philosopher William Frankena would call 'the definist fallacy', whereby 'one property is without explanation defined in terms of another property with which it is not clearly synonymous'. In other words, while the wave function is described as a complex-valued function, the measured particle is described in terms of a definite-valued position or momentum. Yet despite being clearly non-synonymous and even linguistically incompatible, in collapse theories the latter description is what the former turns into, almost magically, without explanation. As a brute fact.

As to superposition, Barandes describes articulately in The Stochastic-Quantum Correspondence, September 2023:

"From the standpoint of the stochastic-quantum correspondence, which gives an alternative formulation of quantum theory, the fact that these amplitudes 'interfere' with each other does not mean that they all physically occur in some sort of literal superposition, or that the system simultaneously takes all such paths in reality, but is merely an artifact of the indivisible dynamics of the underlying generalized stochastic system."
 
"From the standpoint of the stochastic-quantum correspondence, which gives an alternative formulation of quantum theory, the fact that these amplitudes 'interfere' with each other does not mean that they all physically occur in some sort of literal superposition, or that the system simultaneously takes all such paths in reality, but is merely an artifact of the indivisible dynamics of the underlying generalized stochastic system."
That the system simultaneously takes all such paths in reality is quite easily demonstrated with a laser, a mirror, and a diffraction grating. (Summary - you get a reflection from somewhere that you aren't pointing the laser beam: by *reducing* the classically-impossible, but quantum-theoretically-possible reflections you *create* a visible reflection. Less is quite literally more.) This is straight out of the Feynman lectures from the early 60s. And it's entirely predictable; it's just the principle of least action. There's no reason to introduce anything stochastic into the discussion.
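The grating effect described above can be reproduced numerically. This is my own toy model, not taken from the Feynman Lectures: each path contributes a unit phasor exp(i*phase) with a quadratic path-length phase (as near a mirror edge, far from the specular point). Summing all of them, the rapidly rotating phases nearly cancel; filtering out the "wrong-way" phasors, as a grating does, leaves a much larger net amplitude.

```python
# Toy path-sum sketch (my own illustration): far from the
# stationary-phase ("least action") point, phasors from neighbouring
# paths rotate rapidly and cancel; removing half of them, as a
# diffraction grating does, creates a large net amplitude.

import math

dx = 0.01
xs = [5 + n * dx for n in range(1000)]   # paths far from the specular point
# quadratic path-length phase x^2 (toy model of a mirror edge)
amps = [complex(math.cos(x * x), math.sin(x * x)) for x in xs]

# Summing ALL path amplitudes: near-total cancellation.
full = sum(amps) * dx

# Grating: keep only the paths whose phasor points "forward"
# (real part, i.e. cos(phase), positive) -> a large net amplitude.
grating = sum(a for a in amps if a.real > 0) * dx

print(abs(full), abs(grating))   # the filtered sum dominates
```

The filtered magnitude comes out over an order of magnitude larger than the full sum, which is the "less is more" point: removing amplitudes creates the visible reflection.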
 
"From the standpoint of the stochastic-quantum correspondence, which gives an alternative formulation of quantum theory, the fact that these amplitudes 'interfere' with each other does not mean that they all physically occur in some sort of literal superposition, or that the system simultaneously takes all such paths in reality, but is merely an artifact of the indivisible dynamics of the underlying generalized stochastic system."
I can't say I'm sure, but haven't all 'underlying systems', stochastic or not, been ruled out by the experiments on Bell's inequality?
 
I can't say I'm sure, but haven't all 'underlying systems', stochastic or not, been ruled out by the experiments on Bell's inequality?

Article:
"Bell's original nonlocality theorem, as formulated and proved in 1964 [71], only addressed the case of a deterministic hidden-variables theory. Specifically, Bell showed that if one assumes that a theory's hidden variables uniquely determine measurement outcomes, and if one further assumes that the hidden variables are local in the sense that measurement results should not depend on the settings of faraway measuring devices, then one arrives at an inequality that is expressly violated by quantum theory."
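The inequality referred to in the quoted passage is easiest to see in its CHSH form. This is a standard textbook illustration (my addition, not from the quoted paper): local deterministic hidden-variable models obey |S| <= 2, while the quantum singlet-state correlation E(a, b) = -cos(a - b) reaches 2*sqrt(2) at suitably chosen angles.

```python
# CHSH sketch: any local deterministic hidden-variable model is
# bounded by |S| <= 2, but the quantum singlet correlation violates
# that bound, reaching |S| = 2*sqrt(2).

import math

def E(a, b):
    """Quantum correlation for spin measurements on a singlet pair."""
    return -math.cos(a - b)

a, ap = 0.0, math.pi / 2                # Alice's two settings
b, bp = math.pi / 4, 3 * math.pi / 4    # Bob's two settings

S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S))   # 2*sqrt(2) ~ 2.828 > 2: no local deterministic model fits
```

As the quote notes, this rules out *local deterministic* hidden variables; Barandes' point is that his stochastic systems are not of that kind.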
 
That the system simultaneously takes all such paths in reality is quite easily demonstrated with a laser, a mirror, and a diffraction grating. (Summary - you get a reflection from somewhere that you aren't pointing the laser beam: by *reducing* the classically-impossible, but quantum-theoretically-possible reflections you *create* a visible reflection. Less is quite literally more.) This is straight out of the Feynman lectures from the early 60s. And it's entirely predictable; it's just the principle of least action. There's no reason to introduce anything stochastic into the discussion.

(1) As Barandes puts it, even "textbook quantum theory is committed to the existence of measurement settings and definite measurement outcomes that end up behaving precisely as a (highly incomplete) set of stochastically evolving hidden variables."

(2) As David Albert would put it, the wave function itself fails to predict anything appearing as dots on the screen, as it is complex-valued and does not even mathematically concern itself with definite positions. Only once the Born rule is applied to the wave function does it predict a wave-like interference pattern formed by the dots, which appear on the screen stochastically according to the probability distribution rendered by the Born rule.

(3) If the purport of quantum theory were merely the prediction of measurement outcomes in certain known experiments, rather than the explanation of physical reality, then Barandes' reduction of it to indivisible stochastic processes would be unnecessary.
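Point (2) can be sketched in a few lines. The geometry and units below are my own toy assumptions, not Albert's: complex amplitudes from the two slits are summed, and only the Born rule |psi|^2 turns that complex sum into the probability of a dot landing at each screen position, after which the dots arrive stochastically.

```python
# Double-slit sketch (toy units, equal slit amplitudes assumed): the
# complex amplitudes from the two slits are summed, and the Born rule
# |psi|^2 converts that sum into the probability distribution from
# which individual dots are drawn stochastically.

import math, random

k, d, L = 50.0, 1.0, 100.0   # wavenumber, slit spacing, screen distance

def amplitude(x):
    """Sum of unit-magnitude complex amplitudes from the two slits."""
    r1 = math.hypot(L, x - d / 2)   # path length from slit 1
    r2 = math.hypot(L, x + d / 2)   # path length from slit 2
    return complex(math.cos(k * r1), math.sin(k * r1)) + \
           complex(math.cos(k * r2), math.sin(k * r2))

xs = [i * 0.1 for i in range(-100, 101)]          # screen positions
probs = [abs(amplitude(x)) ** 2 for x in xs]      # Born rule
total = sum(probs)
probs = [p / total for p in probs]                # normalize

# Dots appear one at a time, stochastically, per this distribution:
dot = random.choices(xs, weights=probs, k=1)[0]
```

The interference fringes live entirely in `probs`; the complex amplitudes themselves never predict a dot, which is Albert's point.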

To fully understand his argument, I highly recommend studying Barandes' full paper (not long) preceding these conclusions (bold added):

Article:
This paper has shown that one can reconstruct the mathematical formalism and all the empirical predictions of quantum theory using simpler, more physically transparent axioms than the standard Dirac-von Neumann axioms. Rather than postulating Hilbert spaces and their ingredients from the beginning, one instead posits a physical model, called a generalized stochastic system, based on trajectories in configuration spaces following generically indivisible stochastic dynamics. The stochastic-quantum correspondence then connects generalized stochastic systems with quantum systems in a fundamental way, showing that every quantum system can be viewed as the Hilbert-space representation of an underlying generalized stochastic system.

This perspective deflates some of the most mysterious features of quantum theory.
In particular, one sees that density matrices, wave functions, and all the other appurtenances of Hilbert spaces, while highly useful, are merely gauge variables. These appurtenances should therefore not be assigned direct physical meanings or treated as though they directly represent physical objects, any more than Lagrangians or Hamilton's principal functions directly represent physical objects.

Superposition is then not a literal smearing of physical objects, but is merely a mathematical artifact of catching a system in the middle of an indivisible stochastic process, as represented using a Hilbert-space formulation and wave functions.
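The "indivisible stochastic dynamics" in the quoted passage can be given a concrete toy illustration (my own construction, not an example from the paper): a 2x2 matrix can be a perfectly valid stochastic transition matrix over a whole time interval while its matrix square root, the would-be half-interval transition matrix, is not a stochastic matrix at all, so the dynamics cannot be divided into valid stochastic sub-steps.

```python
# Toy indivisibility sketch: the stochastic matrix G below has no
# stochastic "half step" -- its principal square root has complex
# entries, so the process cannot be divided into valid sub-steps.

import cmath

# Column-stochastic transition matrix for the full time step
# (each column is a probability distribution):
G = [[0.1, 0.9],
     [0.9, 0.1]]

# G is symmetric with eigenvectors (1,1)/sqrt(2) and (1,-1)/sqrt(2)
# and eigenvalues 1.0 and -0.8; its principal square root is built
# from the square roots of those eigenvalues:
s1, s2 = cmath.sqrt(1.0), cmath.sqrt(-0.8)
half = [[(s1 + s2) / 2, (s1 - s2) / 2],
        [(s1 - s2) / 2, (s1 + s2) / 2]]

# The half-step "transition matrix" has complex entries, so it cannot
# be stochastic (real, nonnegative, columns summing to 1):
print(half[0][0])
```

The negative eigenvalue is what blocks the division; sampling such a process only at the endpoints of its indivisible intervals is, on Barandes' account, what the Hilbert-space formalism is implicitly doing.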
 