
The fantasy behind Sabine Hossenfelder’s superdeterminism

Debating | Physics

In today’s mid-week nugget, our Executive Director critiques physicist Sabine Hossenfelder’s proposed ‘superdeterminism,’ which aims to account for the theoretical difficulties of quantum measurement without departing from physicalist metaphysical assumptions.

In an introductory short video by Essentia Foundation (see below), the cumulative evidence from the foundations of physics that contradicts metaphysical physicalism—namely, the notion that physical entities have absolute, standalone existence—was reviewed. The relevant technical literature is linked in the video's description.

The argument goes as follows: if physical entities did have standalone existence, then their properties should be simply revealed by measurement. They should have whatever properties they have regardless of whether they are measured or what is measured about them. Measuring a table shouldn’t create its weight or length, but simply reveal what its weight or length already were immediately prior to measurement—or so physicalism presupposes.

As it turns out, however, when measurements are made on entangled quantum particles—the building blocks of nature out of which everything else, including tables, is constructed, according to physicalism—the measurement outcome from one particle depends on what is measured about the other. The choice of what to measure on the first particle determines what the second is, so their physical properties couldn’t have had existence prior to the measurement. Instead, what we see is that physical properties are the result of the measurement itself, not pre-existing realities merely revealed by measurement.
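
The correlations at issue here are the ones bounded by Bell's inequality, in its CHSH form. The short Python sketch below is a minimal illustration of my own (the measurement angles and the toy local model are standard textbook choices, not drawn from the papers under discussion): it compares the quantum mechanical prediction for entangled spin-half pairs with a simple local hidden-variable model, in which each particle carries pre-existing properties independent of what is measured on the other side.

```python
# Illustrative sketch (standard textbook CHSH setup): any local
# hidden-variable model satisfies |S| <= 2, while quantum mechanics
# predicts up to 2*sqrt(2) for entangled spin-1/2 pairs.
import math
import random

def E_quantum(a, b):
    # Singlet-state correlation between detector angles a and b.
    return -math.cos(a - b)

def E_local(a, b, n=200_000, seed=0):
    # A toy local hidden-variable model: each pair carries a shared
    # hidden angle lam, and each side's outcome depends only on its
    # own setting and lam. Monte-Carlo estimate of the correlation.
    rng = random.Random(seed)
    total = 0
    for _ in range(n):
        lam = rng.uniform(0, 2 * math.pi)
        side_a = 1 if math.cos(a - lam) >= 0 else -1
        side_b = -1 if math.cos(b - lam) >= 0 else 1
        total += side_a * side_b
    return total / n

def chsh(E):
    # CHSH quantity S for the standard settings a, a' and b, b'.
    a, a2, b, b2 = 0.0, math.pi / 2, math.pi / 4, 3 * math.pi / 4
    return abs(E(a, b) - E(a, b2) + E(a2, b) + E(a2, b2))

print(chsh(E_quantum))  # ~2.828, i.e. 2*sqrt(2): violates the classical bound
print(chsh(E_local))    # ~2.0: the local model saturates, but cannot exceed, 2
```

The arithmetic makes the essay's point visible: the quantum correlation depends jointly on both settings, through the difference a − b, so the statistics on one side cannot be fixed independently of what is measured on the other; no assignment of pre-existing local values pushes S above 2.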

What this tells us is that physical entities aren’t fundamental in nature. Instead, they are merely the appearance or representation, upon observation, of a deeper layer of nature that is, by definition, non-physical.

Here is a metaphor to clarify this point. The physical world is akin to what is displayed on the dials of an airplane’s dashboard: the dials only display something when a measurement on the world outside is performed by the airplane’s sensors. If nothing is measured, then nothing is displayed on the dials. This, of course, doesn’t mean that there is no world outside! Surely there is one; it’s just not what the dials display. Instead, the world is what is measured in order for the dials to display something. The short video below illustrates this metaphor.

Similarly, the experimental results from foundations of physics don’t mean that there is no world outside; surely there is one! It’s just that the world, as it is in itself, prior to measurement, is not physical; for physicality is what is displayed on our own internal dashboard of dials, which we call perception. The ‘sensors’ are our five sense organs: eyes, nose, ears, tongue and skin. Physics is, essentially, a science of perception—that is, a science of the dials—as renowned physicist Andrei Linde, who developed cosmic inflation theory, once observed [1, p. 12]:

Let us remember that our knowledge of the world begins not with matter but with perceptions. I know for sure that my pain exists, my ‘green’ exists, and my ‘sweet’ exists … everything else is a theory. Later we find out that our perceptions obey some laws, which can be most conveniently formulated if we assume that there is some underlying reality beyond our perceptions. This model of material world obeying laws of physics is so successful that soon we forget about our starting point and say that matter is the only reality, and perceptions are only helpful for its description.

Physicist and science communicator Sabine Hossenfelder, however, takes a different view regarding the meaning of the experimental results in question, as discussed in two draft papers [2, 3] and a recent short video:

Let me start my commentary on Hossenfelder’s material by confessing that I sympathize with her work and tend to like her incisive, provocative style. Her voice opposes, and slows, the descent of physics into fantasy informed by notions of beauty rather than by predictive models grounded in hard-nosed empirical observation. As someone disappointed by the failure of supersymmetry, which inspired my younger years at CERN, I realize how important it is to keep our eyes on the proverbial empirical ball. However, despite Hossenfelder’s zeal to stay true to empiricism, she may now have failed it for the sake of safeguarding a physicalist metaphysical commitment.

Hossenfelder’s view is that the experimental results in question can be accounted for by ‘superdeterminism.’ The idea is that the particles must have some hidden properties that we know nothing about. These hidden properties presumably take part in a complex causal chain that encompasses the settings of the detectors used to make measurements. In other words, the detectors’ settings somehow influence something hidden about the particles measured. And since the choice of what to measure is necessarily reflected on those settings, the measurement results depend on that choice. She summarizes superdeterminism thus: “What a quantum particle does depends on what measurement will take place.”

If one posits that quantum particles are nature’s fundamental building blocks, then Hossenfelder’s summary above is strictly incorrect: we define quantum particles in terms of their properties, and their properties depend on what measurement will take place. Therefore, the accurate formulation would be that “what a quantum particle [is] depends on what measurement will take place,” not what it does. Formulated this way—i.e. correctly—the statement starts to look a little less intuitive than Hossenfelder makes it out to be. After all, how can what a particle is depend on what is measured about it? Shouldn’t measurement simply reveal what the particle already was, just as it does in the case of a table? How can the particle change what it is merely because the detector is set up in a different way?

But let us be charitable towards Hossenfelder. According to Quantum Field Theory, there are no such things as particles; the latter are merely metaphors for particular patterns of excitation of an underlying quantum field. And one can reasonably state that the quantum field does those excitations, for excitations are behaviors of the field. Therefore, “what the quantum [field] does depends on what measurement will take place.” That’s accurate enough and seems, at first, to restore plausibility to Hossenfelder’s argument.

The problem is that she is asking us to imagine the existence of things for which we have no direct empirical evidence; after all, they are “hidden.” She is also asking us to grant these invisible things some very specific and non-trivial capabilities: they must somehow—neither Hossenfelder nor anyone else has ever specified how—change, in some very particular ways, in response to the settings of the detector. Mind you, detectors are designed precisely to minimize disturbances to the state of what is measured. Hossenfelder’s imagined hidden variables would have to somehow overcome this barrier as well.

Let us use a metaphor to illustrate what we are being asked to believe. When I take a picture of some celestial body in the night sky—say, the moon—I can set my camera in a variety of different ways. I can, for instance, set aperture and exposure time to a variety of different values. What Hossenfelder is saying, in the context of this metaphor, is that there is some hidden and mysterious something about the moon that changes in response to what aperture or exposure I set on my camera. What the moon does up there in the sky somehow—we’re not told how—depends on how I set my camera here on the ground. This is superdeterminism in a nutshell and you be the judge of its plausibility.

Be that as it may, appealing to hidden variables is inevitably an appeal to a vague, undefined unknown; and not just any vague unknown, but one capable of non-trivial interactions with its environment. The same criticism Hossenfelder levels against, for instance, superstring theory can be leveled, verbatim, against hidden variables: we’re appealing to imaginary entities for which there is no direct empirical evidence. To put it in plain English, we have no reason to believe in this fantastic invisible stuff, except to save physicalism.

Hossenfelder could argue that the peculiarities of quantum mechanical measurements—the very problem at hand—are themselves evidence of hidden variables. But this, obviously, would beg the question entirely: quantum measurements can only be construed as evidence of hidden variables if one presupposes that hidden variables are responsible for them to begin with. In precisely the same way, only if I presuppose the existence of the—equally hidden—Flying Spaghetti Monster, who moves the celestial bodies around their orbits using His invisible noodly appendages, can the movements of the celestial bodies be construed as evidence of the existence of the Flying Spaghetti Monster.

In one of her papers [2], Hossenfelder speculates about a possible type of experiment that could one day substantiate superdeterminism. The proposal is to make multiple series of measurements on a quantum system, each based on the same initial conditions. If the series are determined by the system’s initial conditions, as superdeterminism postulates, we should see time-correlations across the different series that deviate from quantum mechanical predictions. The obvious problem, however, is that to reproduce the system’s initial state one needs to reproduce the initial values of the postulated hidden variables as well. But Hossenfelder has no idea what the hidden variables are, so she can’t control for their initial states and the whole exercise is pointless. To her credit, she admits as much in her paper. She then proceeds to speculate about some scenarios under which we could, perhaps, still derive some kind of indication from the experiment, even without being able to control its conditions. But the idea is so loose, vague and imprecise as to be useless.

Indeed, Hossenfelder’s proposed experiment has a critical and fairly obvious flaw: it cannot falsify superdeterminism. Therefore, it’s not a valid experiment, for we’ve known since Popper that valid experiments motivated by a hypothesis must be able to falsify that hypothesis. More specifically, if Hossenfelder’s experiment shows little time-correlation between the distinct series of measurements, she can always (a) say that the series were not carried out in sufficiently rapid succession, so the initial state drifted; or (b) say that there aren’t enough samples in each measurement series to find the correlations. The problem is that (a) and (b) pull against each other: long series with many samples cannot be run in rapid succession, while series run in rapid succession must each contain fewer samples. So the experiment is, by construction, incapable of falsifying hidden variables.

In conclusion, no, hidden variables have no empirical substantiation, neither in practice nor in principle; neither directly nor indirectly.

You see, I would like to say that hidden variables are just imaginary theoretical entities meant to rescue physicalist assumptions from the relentless clutches of experimental results. But even that would be saying too much; for proper imaginary entities entailed by proper scientific theories are explicitly and coherently defined. For instance, we knew what the Higgs boson should look like before we succeeded in measuring its footprints; we knew what to look for, and thus we found it. But hidden variables aren’t defined in terms of what they are supposed to be; instead, they are defined merely in terms of what they need to do in order for physical properties to have standalone existence. If I were tasked with looking for hidden variables—just as I once was tasked with looking for the Higgs boson—I wouldn’t even know how to begin, because Hossenfelder does not tell us what they are supposed to be. She is just furiously waving her hands and saying, “there has to be something (I have no clue what) that somehow (I have no clue how) does what I need it to do so I can continue to believe in a physicalist metaphysics.”

This is akin to the medieval notion of ‘effluvium,’ an imaginary elastic material that supposedly connected—invisibly—chaff to amber rods. Effluvium was meant to account for what we today understand to be electrostatic attraction, a field phenomenon. Medieval scholars observed that chaff somehow clung to amber rods when the latter were rubbed. Therefore—since they had no notion of fields—they figured that there had to be some material connecting chaff to rod through direct contact, right? After all, everything that happens in nature happens through direct material contact, right? Never mind that such material was invisible (hidden!), couldn’t be felt with the fingers, couldn’t be cut or measured directly, and that no one had the faintest idea what it was supposed to be, beyond defining it in terms of what it allegedly did to chaff; it just had to be there.

Hidden variables are Hossenfelder’s effluvium: there must be some mysterious invisible something that somehow does what needs to be done for us to think of physical entities as having standalone existence, right? Because measurable physical entities are all that exists and, as such, must have standalone existence… right?

On a more technical note, Hossenfelder bases her entire discussion on Bell’s inequalities but conspicuously fails to mention Leggett’s inequalities [4], a 21st-century extension of Bell’s work, more relevant to the points in contention, which separates the hypotheses of physical realism and locality so they can be tested independently. Neither does she address the experimental work done to verify Leggett’s inequalities, which later refuted physical realism rather specifically [5, 6]. Even more recent experiments have also demonstrated that physical quantities aren’t absolute, but contextual (i.e. relative, or ‘relational’) [7, 8], thereby contradicting superdeterminism. By now, a broad class of hidden variables has been refuted by experiments [9], which Hossenfelder doesn’t comment on at all.

Instead, she bases her case on the notion that the opponents of hidden variables dislike the latter simply because the hypothesis supposedly contradicts free will. While I am sure that there are physicists emotionally committed to free will, refuting their emotional commitments does not validate superdeterminism. Suggesting that it does is a straw man. I, for one, am on record stating that free will is a red herring [10] and don’t base my case against superdeterminism on it at all. I don’t need to.

We have to guard against turning the particular metaphysical assumptions of our time into an unfalsifiable system, one that accommodates anomalies through arbitrary, fantastical appeals to unknowns and vague, promissory hand waving. When the anomalies that contradicted Ptolemaic and Copernican astronomy—according to which the celestial bodies move in perfectly circular orbits—began to be observed, adherents came up with fantastical ‘epicycles’—circles moving on other circles—to accommodate the anomalies. The cumbersome, tortuous character of the resulting models should have been enough to force them to take a step back and contemplate their dilemma in a more intellectually honest manner. But subjective attachment to a particular set of metaphysical assumptions didn’t allow them to do so.

Today, as anomalies accumulate in various branches of science against the metaphysical assumptions of physicalism, we would do well to avoid a humiliating repetition of the epicycles affair. Yet what we are now witnessing, instead, is hypotheses being put forward, with a straight face, that fly in the face of any honest notion of reason and plausibility. The epicycles look benign and reasonable in comparison to the willingness of some 21st-century theoreticians to believe in fantasy. I shall comment more broadly on this peculiar phenomenon—a harbinger of paradigm changes—in my next essay for this magazine.


The Science of Consciousness: Panel discussion, first day

Debating | Neuroscience | 2022-01-30

Closing the first day of the 2021 ‘The Science of Consciousness’ conference, Dr. Bernardo Kastrup, Prof. dr. Heleen Slagter, Dr. Steve Taylor and Prof. dr. Henk Barendregt take questions and debate.

Announcing ‘The Science of Consciousness’ online conference, 2021

Debating | Consciousness

Every year Essentia Foundation organizes an online conference featuring some of the world’s leading scholars, scientists and academics, on a topic relevant to ontological idealism. This year, we are delighted to focus on The Science of Consciousness, in a very special edition of the conference organized by Prof. dr. Sarah Durston. We’re even more delighted to count as our partners, this time, the Sentience and Science Foundation and the Institute for Advanced Study of the University of Amsterdam.

One of the greatest questions we face is the nature of consciousness. Currently, the most widely accepted scientific framework is materialism: the idea that the world around us and everything in it, including ourselves, arises from the interplay of underlying material substances. Within this framework, consciousness is the result of brain processes. Yet, materialist neuroscience has not yet been able to pinpoint consciousness in the brain. Furthermore, the materialist framework does not explain the phenomenon of subjective experience and leaves no room for human values, including meaning, compassion and humanity. At this conference, we will explore different metaphysical frameworks that include materialism, but also alternatives such as idealism, which may contradict or complement materialism.

The Conference will take place on November 2nd and 4th 2021, online, from 2:00 to 5:00 PM CET. Each session will include presentations by renowned speakers and a panel session, with opportunity for questions from, and debate with, the audience.


November 2nd

14:00     Does the evidence indicate that experience is (generated by) brain activity?
                Dr. Bernardo Kastrup, author and director of Essentia Foundation

14:35     Consciousness: flexibility, risk factor, wisdom
               Prof.dr. Henk Barendregt, emeritus professor of mathematical logic, Radboud University

15:10     The predictive mind
               Prof.dr. Heleen Slagter, professor of brain, cognition and plasticity, Free University Amsterdam

15:45     An introduction to Panspiritism: How Fundamental Consciousness Becomes Individual Consciousness
               Dr. Steve Taylor, author and lecturer at Leeds Beckett University

16:20     Panel Discussion

17:00     End

November 4th

14:00     Emergence in the Universe & the Human Mind
               Prof.dr. Erik Verlinde, professor of theoretical physics, University of Amsterdam

14:35     Blobs of order: being in between below above
               Dr. Esmee Geerken, Arts Science fellow, UvA Institute for Advanced Study

15:10     Higher Dimensions of Consciousness?
               Dr. Jacob Jolij, author and lecturer at Groningen University

15:45     Consciousness as relational
               Dr. Iain McGilchrist, author, psychiatrist and former Oxford literary scholar

16:20     Panel Discussion

17:00     End


We will be publishing the videos of the conference over the next few weeks. After each publication, we will link the video to the respective agenda entry above.