On the thermal interpretation of quantum mechanics

+ 5 like - 1 dislike
16092 views

I got an email from Gerard 't Hooft concerning my thermal interpretation of quantum mechanics (see my Thermal Interpretation FAQ, Chapter 10 of my quantum mechanics book, or the section Foundations independent of measurements from my theoretical physics FAQ). He asked me the following questions:

[Concerning the "thermodynamic interpretation":] all I could find about that is that it neatly follows Copenhagen in "dodging the question" about what is really going on. All you have to do is work with density matrices if you don't like wave functions.
Fine, but how then do space and time develop curvature? Do we have to dodge the question of what that curvature really is?

I want to make models of Planckian scale physics. For that, I need to know what space and time are like. Writing curvature in terms of non-commuting operators produces problems for me that I want cleared out of the way.

Could you explain in concise terms what your thermodynamic interpretation is, and why you prefer that? Don't you also need an infinite universe of universes? What happens when a measurement is done?

asked Oct 27, 2014 in Theoretical Physics by Arnold Neumaier (15,787 points) [ revision history ]
edited Dec 19, 2014 by dimension10

4 Answers

+ 5 like - 0 dislike

I was recently introduced to Neumaier's interesting and deep new interpretation of QM by Thomas Klimpel (via SE). It appears to me that it is quite well developed and, on the surface, similar to and meshing with soliton theory, which goes back decades, although a definitive history seems to remain to be written.

Neumaier refers to a "hydrodynamical" interpretation, but at heart it is a fluid dynamical system. There are recent, very major advances in this theory by Anderson and Brady. I am surprised nobody has cited them yet with respect to the Neumaier interpretation; I highly recommend reading them in depth.

The near-obvious interpretation, which it seems no one has pointed out much, is that space itself is the fluid! This interpretation seems to date back to Majorana as well.

It is worth listing the many famous and influential quantum scientists who regarded, or intuited, QM as likely, or at least conceivably, having some kind of realistic "interpretation" other than the Copenhagen interpretation, and further even pointing to some "deeper theory". Many of these views are generally downplayed, deemphasized, or narrowly cited in typical historical accounts, if they are referred to at all.

  • Einstein
  • De Broglie
  • Schroedinger
  • Bohm
  • Bell
  • 't Hooft

't Hooft asks specifically: what happens during a measurement? This is the zen question of the 20th century, continuing into the 21st. An analogy could be made to something like a sound microphone hooked up to a digitizer. This is a purely classical system, but it exhibits a wave/particle duality: the least significant bit of the binary samples of sounds taken by the microphone follows a photon-like law, converting analog, wave-based sound energy into digital counting. This analogy can likely be pushed much further with real mathematical and experimental analysis (one can of course anticipate much skepticism in this regard, but I invite further dialog and investigation); a toy sketch follows below.
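
Here is a minimal toy sketch of the digitizer analogy (my own illustration, with made-up parameters; not a quantitative model): a continuous waveform is sampled and quantized, and the discrete quantization events play the role of "counts" extracted from a purely classical wave.

    import numpy as np

    # Toy sketch of the microphone/digitizer analogy (illustrative only):
    # a continuous waveform is sampled, quantized to n_bits, and discrete
    # quantization-level crossings play the role of "counts".
    def quantization_events(amplitude, n_bits=8, n_samples=10_000):
        t = np.linspace(0.0, 100.0, n_samples)
        analog = amplitude * np.sin(2 * np.pi * t)   # classical "sound" wave
        levels = 2 ** n_bits
        # Map the analog range [-1, 1] onto integer levels 0 .. levels-1.
        digital = np.round((analog + 1.0) / 2.0 * (levels - 1)).astype(int)
        # Each change of the integer output between successive samples is
        # one discrete "event" extracted from the continuous input.
        return int(np.abs(np.diff(digital)).sum())

    for a in (0.01, 0.1, 0.5, 1.0):
        print(f"amplitude {a:5.2f} -> {quantization_events(a)} discrete events")

The point of the sketch is only that a smooth analog input yields a discrete event count whose rate grows with the wave's amplitude; whether the analogy carries further, as conjectured above, is an open question.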

The following references support the soliton theory of reality and mesh broadly and compatibly with Neumaier's fluid dynamical ideas and interpretation. My personal conjecture is that, along these lines, classical experiments, analogies, and correspondences beyond those cited below can continue to be devised and explored (along with leveraging the full power of all the other classic attack techniques and directions, such as mathematics, philosophy, etc.) to help reveal an ulterior quantum mechanical reality; I encourage and urge others with similar aims to join in this pioneering research project/program.

answered Dec 19, 2014 by vzn (80 points) [ revision history ]

@Vzn,

Thank you for the useful and provocative material, which indeed gives excellent food for thought. After reading it I did some further searching and came across this page: http://iopscience.iop.org/1742-6596/504/1

It includes the proceedings of the EmQM13 conference (Emergent Quantum Mechanics 2013, 3–6 October 2013, Vienna), which include interesting material about QM foundations research that might complement the papers you recommended above nicely.

[EDIT: I updated the link to point to the main page with the actual proceedings].

Ciao,
Paolo

Thanks! FYI, some further elaboration and discussion with TK on the microphone/sound digitizer and the new QM points of view/interpretations continued in the SE physics chat room: toy models of QM.

Update: I had a long discussion with TK et al. in the chat room and have meanwhile turned up many other relevant, even striking references; I recently wrote all this up and summarized it in a large new synthesis: superclassical/emergent QM, recent developments, rough outline/overview/leads.

New highlights: Born's rule and nonlocality measurements in classical statistical systems, proposed experiments to measure the reality of the quantum wavefunction (Cavalcanti et al.), experimentally measured liquid helium anomalies possibly related to soliton theory, a survey of related physics.SE questions, etc.

+ 3 like - 0 dislike

The main novelty in the thermal interpretation is to regard what is usually called an ensemble mean as in fact an observable value. Thus quantum mechanics is at the same time a classical and a quantum theory; the natural hidden variables are Wigner-type multi-correlation functions.

For example, the classical hydrodynamic equations are obtained by looking only at field values (projecting the remainder away), and the Boltzmann equation is obtained by looking only at local 2-point correlators (projecting the remainder away). Adding local multi-momentum fields accounting for multiparticle collisions adds observables that are more and more difficult to observe, and if all multi-momentum fields are present, we have the full gamut of hidden (but in fact not so hidden) variables. Their dynamics is described by quantum field theory; in fact, by generalized Wigner transforms of the Wightman n-point functions.
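
For orientation, the ordinary single-particle Wigner transform, of which the multi-correlation versions just mentioned are generalizations, reads (up to conventions)
$$W(x,p)=\frac{1}{2\pi\hbar}\int dy\; e^{-ipy/\hbar}\,\Big\langle x+\tfrac{y}{2}\Big|\,\rho\,\Big|x-\tfrac{y}{2}\Big\rangle,$$
recasting the density matrix $\rho$ as a quasi-classical phase-space density; the multi-momentum fields above play the analogous role for the Wightman n-point functions.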

Clearly the field values in hydrodynamics are observable in a classical sense, though not very accurately. On the other hand, in a quantum field theory, where operators are space-time dependent, it is impossible to obtain a true ensemble, as we cannot repeat experiments at the same space-time position. Thus in quantum field theory, ensembles are purely fictitious.

A quantum theory of gravity must account for the quantum mechanics of large bodies, such as a star. But the interpretation problems are already apparent for much smaller bodies, such as a glass of water. Each particular glass of water in equilibrium is a single quantum system, but all the observables we customarily ascribe to it according to classical mechanics are true observables of the system, observable in the single instance, without having to postulate ensembles in the statistical sense.

The thermal interpretation extends this to all quantum phenomena. A measurement is simply the recording of field values of some classical apparatus, described as part of the dissipative quantum system consisting of the apparatus and the system of interest, with the environment already projected out. This ensures that measurement values are definite and irreversible. Moreover, traditional analyses of observed quantum measurement processes imply that this interpretation of measurements gives a correct account of the observation of microscopic systems (though the traditional measurement terminology there is very different).

Regarding physics at the Planck scale, I think that spacetime is most likely a classical 4-dimensional manifold. There is a single universe whose states are given by a density matrix, a semidefinite trace class operator on some (most likely nonseparable) Hilbert space. Whatever we observe are classical values of fields (though usually called expectation values or correlation functions) depending on a space-time point and zero or more momenta. In particular, gravity (and hence curvature) is just another field, as this is the way it appears on the observable level. This is both fully local and fully compatible with the thermal interpretation.

Why do I prefer this interpretation? Once one has gotten rid of the traditional brainwashing about quantum observables and quantum measurements, one can see that everything corresponds to how we actually observe things: by looking at extended objects and their (in essence hydrodynamic) variables.

How these extended objects respond to the microscopic systems under study cannot be part of the foundations but must be seen as the result of an analysis in terms of quantum statistical mechanics. Thus claiming that we observed a discrete particle with spin when we in fact observed a blob on a screen is something that needs to be explained rather than postulated. (In the thermal interpretation, this is instead interpreted as the observation of a continuous field by means of an apparatus that only allows discrete responses with an intensity-dependent rate.) It turns out that the connection becomes stochastic due to the nature of the quantum dynamics rather than due to an intrinsic randomness in quantum mechanics.

answered Oct 27, 2014 by Arnold Neumaier (15,787 points) [ revision history ]
edited Oct 27, 2014 by Arnold Neumaier

Do you see this approach to measurement in other interpretations of QM? I've often thought that we construct QM from measurements, not that we construct measurements from QM (which is a significant part of what I take from your account here). An aspect that you don't address here is that we typically engineer material objects and their electronic environment so that they undergo thermodynamic transitions under more delicate changes of their environment than would likely occur without human intervention.

I presume you saw Padmanabhan's arXiv:1410.6285v1 [gr-qc] 23 Oct 2014, "Emergent Gravity Paradigm: Recent Progress", which I take to be similar enough in approach to be of interest. There are numerous others pursuing this kind of approach, of course.

I think one might pay some attention to the different symmetry groups of quantum fluctuations and of thermal fluctuations, respectively 3+1-dimensional Poincaré and 3-dimensional spatial Euclidean+1-dimensional temporal translations, when talking about thermodynamics.

... it is impossible to obtain a true ensemble, as we cannot repeat experiments at the same space-time position.

And is it really necessary to repeat them at the same space-time position?

I can't help noting that the "Thermodynamic Interpretation" is not an interpretation at all, but a prescription. Yes, I know how to calculate the most likely results of any experiment if someone gives me the Hamiltonian. But what is it that actually happened? That's the dodged question. In most quantum theories, the "quantum state of the universe (note added)" is a wave function defined in the universe of all universes. That's too large for me, particularly now that I know that, with some more effort, one can construct models that live in a single universe roughly resembling ours.

@Physics_Doctor, I think it may be, unless we can identify the whole dynamical configuration relative to one measurement in space-time with the whole dynamical configuration relative to another. How detailed the identification can be determines how well we can consider two events to be part of the same ensemble.

In most quantum theories, the “quantum state” is a wave function defined in the universe of all universes.

It is not so. We apply some boundary conditions to the wave function, and those boundary conditions are approximate solutions of the QM equations involving also the boundary bodies, roughly speaking. No wave function can be "extended" farther than the boundary bodies.

Peter, the very notions of time and space are inclusive by their definitions. They imply many, many experiments to fill the time and space axes with points.

@gthooft, experimental evidence about the Planck length scale currently underdetermines our models at such small scales. In time perhaps that will change, but even if the precision of our measurements increases enough to model events at the Planck scale with some confidence, we would then have to ask what "actually happened" at scales of \(10^{-100}\) meters, say. As such, we would seem always, at sufficiently small scales, to be confined to statistical models. At small scales, there might be processes with characteristic velocities of, say, \(10^{30}\) meters/second or more, but operating incoherently enough that we cannot send messages using them. There might be a world of such incoherence and complexity at small scales that a tree would seem simple indeed.

Such ideas, those above at least, are mere speculations. We can't say much about small scales until we have more experimental evidence. Even if we can construct mechanical models that are as empirically adequate as QM, it seems that at the leading edge of our theorizing we can more usefully and reliably construct statistical models than mechanical or other models.

@Peter Morgan, I'm afraid I do not agree with you. As for numbers, black hole holography strongly suggests that the square of $10^{-35}$ meters is the smallest space for a single bit of info, and the good old speed of light ($3\times 10^8$ m/s) is the speed limit. Also, if you take all known symmetry principles into account (GR, local Lorentz and what have you), our Planck length models seem to be overdetermined rather than underdetermined - I frankly don't regard string theories as rigorous enough to serve as useful models. This is why I think we don't have to wait for experiments to tell us what it might be that's happening at Planck scales.

Do you see this approach to measurement in other interpretations of QM? - no. It was born out of my dissatisfaction with all existing approaches.

Padmanabhan's arXiv:1410.6285: I don't see the similarity. The thermal interpretation says that what are usually considered to be ensemble expectations of fields are (as in nonequilibrium thermodynamics) actually the true observables. I saw no trace of this in P's paper.

To measure an ensemble expectation $\langle \Phi(t,x)\rangle$ one has to measure $\Phi(t,x)$ multiple times and take averages. But this is impossible, since each repetition happens either at a different time or at a different location.

Yes, it is impossible, but is it really necessary?

Practice shows that ensemble expectations obtained in laboratories are similar despite being taken at different points in time and space.

@Physics_Doctor, measurements may be similar enough FAPP in commonplace circumstances, but if the gravitational or other background is varying faster than measurement events are happening, that may make attributing multiple events to the same ensemble tenuous.

@gthooft, I do not see overdetermination if one takes an empirical, effective field approach. Whether we think theory is overdetermined or underdetermined by experiment would then seem to be theory-dependent. If one is certain that the world is determined by a finite number of parameters, then I would agree that theory is overdetermined by experiment insofar as there are unlimited experimental results, but although I admit the possibility that there are only finitely many parameters I see only philosophical and emotional reasons for such certainty.

In most quantum theories, the “quantum state of the universe (note added)” is a wave function defined in the universe of all universes. That’s too large for me, particularly now that I know that, with some more effort, one can construct models that live in a single universe roughly resembling ours.

[I added details on quantum gravity.]

1. Ignoring gravity for the moment, the (one and only) universe is the collection of the fields of the standard model. The (deterministic) state of the universe is a density matrix in the corresponding Hilbert space (or rather Hilbert bundle over spacetime). Equivalently, it is a numerical assignment of all gauge-invariant field correlators satisfying the Wightman axioms (apart from Poincaré invariance, which would characterize the empty universe).

What ''actually happens'' is fully encoded in the space-time dependence of these correlators. Nothing is dodged - this is a fully explicit dynamics, though somewhat obscured through the difficulties of getting nonperturbative solutions.

Much of what is currently done in quantum field theory (of the standard model) is about aspects of possible universes. Each of these is given by a different state (aka density matrix, aka collection of correlation functions). One only considers highly idealized universes with very simple, computationally tractable states (e.g., universes containing only a few massive particles, or universes in global equilibrium), since the true state of the universe is obviously far too complex, as it encodes everything we know and will ever know.

But deriving hydrodynamics from quantum field theory amounts to obtaining, in the local equilibrium approximation, equations of motion for the most accessible kind of thermal observables, the field values (aka field expectations, identified by statistical mechanics as the thermodynamic observables). Clearly, hydrodynamics already handles most of the observables in the universe (including gravitation), which is very well described by a hydrodynamic picture.

Where the hydrodynamic description is not adequate, one has as the next lower level kinetic theory, which comprises a far larger array of observables, namely kinetic fields depending on space-time position and a momentum vector; the momentum average gives the hydrodynamic fields. In a semiclassical limit, the corresponding assumption of microlocal equilibrium leads to Boltzmann-type equations. In the full quantum version, this leads instead to Kadanoff-Baym equations, which are apparently sufficient to calculate much of the quantum dynamics of heavy ion collisions.
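
For reference, the simplest such Boltzmann-type equation is the standard transport equation for the one-particle phase-space density $f(t,x,p)$ (the simplest kinetic field) in an external force field $F$:
$$\partial_t f+\frac{p}{m}\cdot\nabla_x f+F\cdot\nabla_p f=C[f],$$
where $C[f]$ is the collision term; momentum averages of $f$ reproduce the hydrodynamic fields of the previous paragraph.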

Both in the hydrodynamic case and in the kinetic case we have a classical dynamics of deterministic fields. In the kinetic case, the stochastic scattering behavior of the particles (whose existence is questionable at sufficiently high resolution) naturally turns into a deterministic behavior of the corresponding kinetic fields. The existence of the latter is verifiable, e.g., on the level of semiconductor design, and needs no change as one goes to higher resolution, except for more accurate approximations to the dynamics.

The thermal interpretation just adds multi-kinetic fields depending on a space-time position and an arbitrary number of momentum vectors, but these still have a well-defined dynamics on a corresponding Hilbert bundle over space-time, fully equivalent to the quantum field description, of which they are obtainable by a generalized Wigner transform. In terms of statistical mechanics, the dynamics is a relativistic version of a Wigner-transformed BBGKY hierarchy. 

The complexity is increased, but in a very natural way (just a simple extension of what is already necessary classically) and not by introducing artificial objects like many worlds or pilot waves. Moreover, the contact to the classical world is obvious through traditional statistical mechanics and thermodynamics. Everything done there immediately translates into insight into the ''hidden'' variables.

2. An extension to gravity is philosophically immediate, though the construction of a consistent quantum gravity theory may not be trivial. However, the Hilbert bundle setting has a natural general covariant extension, as all thermal fields (hydrodynamic, kinetic, and multi-kinetic) are local in space and time.

For quantum gravity, one just needs to add a spin-2 field for gravity. I do not understand why people treat the nonrenormalizability as an inconsistency to be avoided. Nonrenormalizability just means that one has to embed the theory into one with countably many parameters rather than one with few parameters, so the consistency level is the same. Countably many parameters are not a mathematical difficulty: already a power series has countably many parameters, but we handle them routinely and utilize only a few of them in computations. The same can be done - and is in fact being done, e.g., in Burgess, Quantum Gravity in Everyday Life - for nonrenormalizable field theories such as canonical quantum gravity.
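
Schematically (a standard effective-field-theory expansion in the spirit of the Burgess reference; the specific operators displayed are only illustrative), the countably many parameters arise as coefficients of higher-dimension operators,
$$\mathcal{L}_{\rm eff}=\frac{M_P^2}{2}R+c_1R^2+c_2R_{\mu\nu}R^{\mu\nu}+\sum_{n\ge 1}\frac{c_n'}{M_P^{2n}}\,O_{4+2n}(R,\nabla R,\dots),$$
where the operator $O_{4+2n}$ has mass dimension $4+2n$; at energies $E\ll M_P$ it contributes only at relative order $(E/M_P)^{2n}$, so any finite experimental accuracy involves only finitely many of the coefficients.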

Note that in a Hilbert bundle of multi-kinetic fields, the problem of observables that plagues traditional quantum gravity is absent, as an observer at space-time position $x$ may in principle know the values of all hydrodynamic, kinetic, and multi-kinetic fields at $x$ (and whatever the observer may extrapolate about other space-time positions based on the known dynamics).

Practice shows that ensemble expectations obtained in laboratories are similar despite being taken at different points in time and space.

But this is because one takes ensemble expectations of observables with very few degrees of freedom, which can often be identically prepared. However, faking an expectation of a quantum field by means of observations at different points in space and/or time amounts to a smearing of the field rather than a determination of the ensemble mean. One can see this already classically.
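
Concretely, in the classical case: averaging a single realization of a random field $\phi$ over $N$ nearby space-time points gives
$$\frac{1}{N}\sum_{i=1}^N\phi(x_i)\;\approx\;\int dx\,w(x)\,\phi(x)$$
for some effective weight $w$, i.e., a smeared value of that one realization, not the ensemble mean $\langle\phi(x)\rangle=\int D\phi\,P[\phi]\,\phi(x)$; the two agree only under additional ergodicity or homogeneity assumptions.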

@Arnold Neumaier, so in your opinion, how do you characterize the electron in the hydrogen atom according to your interpretation?

In its ground state, the electron in a hydrogen atom is a spherical charge cloud around the nucleus, with a charge density equal to the charge multiplied by what is usually called the probability density.
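
In formulas, with Bohr radius $a_0$ (the standard textbook ground state):
$$\psi_{100}(r)=\frac{1}{\sqrt{\pi a_0^3}}\,e^{-r/a_0},\qquad \rho(r)=-e\,|\psi_{100}(r)|^2=-\frac{e}{\pi a_0^3}\,e^{-2r/a_0},$$
a spherically symmetric charge density concentrated within a few Bohr radii of the nucleus.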

For more details and references, please read ''How do atoms and molecules look like?" and ''Does an atom mostly consist of empty space?'' from Chapter A6 ''The structure of physical objects'', and ''What is an electron?'' and ''The shape of photons and electrons'' from Chapter B2 ''Photons and Electrons" of my theoretical physics FAQ.

However, faking an expectation of a quantum field by means of observations at different points in space and/or time amounts to a smearing of the field rather than a determination of the ensemble mean. One can see this already classically. 

I agree - if the field is space-time dependent (smoothly on average), then such experiments are not about local field properties. But then such experiments will hardly be described by a local field theory. Is your "interpretation" a different theory (more general or more exact) than QM?

My thermal interpretation interprets orthodox quantum mechanics in field theory form, without any alteration. Thus it is the same theory.

But all concepts, computations, and experiments are interpreted in a different language inspired by thermodynamics and its derivation from statistical mechanics, hence ''thermal interpretation''. This language conveys different intuitions and thereby avoids the standard paradoxes of quantum mechanics. In particular, there is no particle concept; observable particles are just localized lumps of matter and/or energy.

The argument you discussed shows that ensembles of quantum fields are in principle unobservable, hence operationally meaningless. This is intrinsic to the field concept, and serves as an argument for why ensembles of fields cannot have an operational statistical interpretation, hence must be treated as purely formal mathematical objects. (That the mathematics of probability theory can be used is no argument for the probabilities being caused by randomness. Rather it is like the use of probability theory in number theory, where everything is fully deterministic but arguments about averages are often couched in probabilistic terms.)

But since ensemble means encode all that can be said about a quantum system, they must have a different operational interpretation - namely, the one given to them by the thermal interpretation, in accordance with the thermodynamics of statistical mechanics.

This argument holds for all fields, no matter whether a locality property holds. In particular, it also holds in any nonrelativistic field theory (i.e., second-quantized multiparticle theory as considered in nonequilibrium statistical mechanics), which is always dynamically nonlocal in space since the potential energy term in the second-quantized Hamiltonian is a double integral over two space coordinates.

Arnold, I got confused by your FAQ because I did not find any certainty there. Besides, there is an erroneous passage:

''quantum mechanics specifies the probability of finding an electron at position x relative to the nucleus. This probability is determined by $|\psi(x)|^2$, where $\psi(x)$ is the wave function of the electron given by Schroedinger's equation. The product of $-e$ and $|\psi(x)|^2$ is usually interpreted as charge density, because the electrons in an atom move so fast that the forces they exert on other charges are essentially equal to the forces caused by a static charge distribution $-e|\psi(x)|^2$.''

It is the other way around - to "see" such a static picture, the projectile velocity should be much higher than $v_0$. Then the first Born approximation deals solely with the unperturbed $\psi$ squared.

If the projectile is slow, it inevitably polarizes the atom, and at each distance $R$ to the atom the atomic wave function will be $R$- and $Z_{proj.}$-dependent.

I think you are joking about the "spherical charge cloud" around the nucleus.

You'll find certainty nowhere except in mathematics, and even there only if you accept the assumptions.

There is no error in what you quote from ''How do atoms and molecules look like?''. Indeed, I gave a number of references in which highly respected experimentalists interpret the electron density of a molecule as a charge density. This is consistent with my thermal interpretation.

And there is no question that the density of an electron in the ground state is spherical.

This has nothing to do with shooting projectiles at the atom. That was done by Rutherford, who concluded that atoms are nearly empty. But it only means that electrons don't scatter much.

Those experimentalists who deal with X-rays extrapolate too much!

I do not say that the "negative charge cloud" in the ground state is not spherical. I meant the presence of a "positive charge cloud" too, which is determined by the same wave function.

It was funny - when I was reporting this "positive charge cloud" picture at a seminar in 1985, people could not understand why the nucleus should rotate around the atomic center of mass if the electron cloud (configuration) is absolutely spherically symmetric.

And, of course, those "clouds" are only meaningful when they are such in calculations. In particular, in the first Born approximation for scattering problems.

+ 2 like - 0 dislike

Without having to introduce any change in the formal apparatus of quantum physics, the deterministic dynamics of the complete collection of quantum mechanical expectations constructible from quantum fields, when restricted to the set of macroscopically relevant ones, already gives rise to all the stochastic features observed in practice.

I now (as of February 28, 2019) have three detailed papers on this: 

Foundations of quantum physics I. A critique of the tradition,
Foundations of quantum physics II. The thermal interpretation,
Foundations of quantum physics III. Measurement,

and a dummy paper containing just the abstracts of these (plus a fourth one not yet quite finished). 

answered Mar 1, 2019 by Arnold Neumaier (15,787 points) [ revision history ]
edited Mar 1, 2019 by Arnold Neumaier
+ 1 like - 0 dislike

I found a very simple way to present the basics: instead of interpreting an expectation as a concept meaningful only under frequent repetition under similar conditions, I interpret it for a single system in the following way, consistent with the practice of thermal statistical mechanics, with the Ehrenfest theorem in quantum mechanics, and with the obvious need to ascribe an approximate position to particles created in the lab even though they are not in a position eigenstate (which doesn't exist).

The basic thermal interpretation rule says:

Upon measuring a Hermitian operator $A$, the measured result will be approximately $\bar A=\langle A\rangle$ with an uncertainty at least of the order of $\sigma_A=\sqrt{\langle(A−\bar A)^2\rangle}$. If the measurement can be sufficiently often repeated (on an object with the same or sufficiently similar state) then $\sigma_A$ will be a lower bound on the standard deviation of the measurement results.
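
As a simple illustration (a standard two-level example): for a qubit in the state $|\psi\rangle=\cos\theta\,|{\uparrow}\rangle+\sin\theta\,|{\downarrow}\rangle$ and $A=\sigma_z$,
$$\bar A=\cos 2\theta,\qquad \sigma_A=\sqrt{\langle\sigma_z^2\rangle-\bar A^2}=|\sin 2\theta|,$$
so a single microscopic measurement carries an uncertainty of order one unless the state is close to an eigenstate. For a macroscopic average $A=\frac{1}{N}\sum_i\sigma_z^{(i)}$ over $N$ independent qubits, $\sigma_A$ shrinks like $1/\sqrt N$, recovering a sharp classical value.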

Compared to the Born rule (which follows in special cases), this completely changes the ontology: the interpretation now applies to a single system, has a good classical limit for macroscopic observables, and obviates the quantum-classical Heisenberg cut. Thus the main problems in the interpretation of quantum mechanics are neatly resolved without the need to introduce a more fundamental classical description.

Recent additional material on the thermal interpretation of quantum mechanics is referenced in my Thermal Interpretation FAQ.

answered May 26, 2016 by Arnold Neumaier (15,787 points) [ revision history ]

Thanks @ArnoldNeumaier for recommending (in the Thermal Interpretation FAQ) L. Sklar's book "Physics and Chance", which seems excellent at first glance. Sklar's most recent book also seems good. I am trying to understand the thermal interpretation; questions forthcoming.

Note - once again I posted this comment as an answer. Yes, one should be more careful; we are all attention-challenged these days, but I guess the user interface could be improved to make it clear when one is posting an answer and when a comment. How about asking the user to confirm? "Please confirm that you want to post this as an answer/comment. If you wanted to post a comment/answer instead, please follow this link."

@GiulioPrisco: You may post your (useful) suggestion about the answer box on meta, and it will be considered by our system developer @polarkernel.

I am late to this, but maybe I may still say something.

What you seem to give above (and in your FAQ) is an account of what it means operationally to have a probabilistic theory of nature.

(The point you highlight about probabilities not necessarily pertaining to an ensemble is widely and well appreciated, people like to refer to this as Bayesian as opposed to frequentist probability.)

So up to this point this "interpretation" seems uncontroversial and widely accepted, it just makes explicit what it means to have a probabilistic theory.

On your FAQ you continue by saying

There is no longer a physical reason to question the existence of the state of the whole universe, even though all its details may be unknown forever. Measuring all observables or finding its exact state is already out of the question for a small macroscopic quantum system such as a piece of metal. Thus, as for a metal, one must be content with describing the state of the universe approximately.

This sounds like you are imagining that there is, or could in principle be, a non-probabilistic exact state of the universe, and that the business about probabilities in quantum physics is only about the practical impossibility of ever determining this state exactly, with vanishing uncertainty.

But this is the subtle point. Theorems like the Kochen-Specker theorem or Bell's inequalities are usually read as saying that precisely this cannot be the case, that the probabilities in quantum physics are intrinsic and not just a coarse-graining of a fundamentally non-probabilistic information.

Maybe you disagree with this reading of these theorems. But then it would be good to explain why. In the present state of your writeup (I read the FAQ entry and chunks of the slides linked to there) I see a good discussion of the practice of probabilistic theories of nature, but no actual interpretation of quantum mechanics and no dealing with this core fact: that Kochen-Specker and Bell say, or so it seems, that the probabilities in quantum physics are intrinsic, and not just due to coarse-graining in-principle non-probabilistic information.

@UrsSchreiber:  The 2003 paper is just the germ of the thermal interpretation, with a lot of the later insights lacking.

The thermal interpretation has no hidden variables - only the stuff that figures in standard statistical mechanics courses. Everything is completely orthodox except for the definition of what it means to take a measurement. Kochen-Specker applies only to von Neumann's definition of measurement - but the latter is an idealization in contradiction to actual macroscopic measurement practice, and hence in need of repair.

In the thermal interpretation, the meaning of measurement is taken from the indisputable fact that we measure macroscopic variables only once and get a definite value, in combination with the equally indisputable fact from statistical mechanics that what is measured in nonequilibrium thermodynamics is an ensemble expectation value.

With this interpretation one can calculate everything as it was always computed, without a single alteration, and still one does not have to shut up, since everything has a valid realistic interpretation.

@UrsSchreiber: It is called the thermal interpretation - not thermodynamic. Thermal just refers to the fact that everything we measure is measured in a thermal environment for which statistical thermodynamics is relevant. 

Ensemble expectation values $\bar A=\langle A\rangle$ have two different interpretations, in both cases without any subjectivity involved.

Either as an uncertain value, measured once only, with intrinsic uncertainty given by the ensemble standard deviation.

Or as the limit of a mean value obtained from an (imagined) infinite repetition of measurements of identically prepared systems, and approximately used for means of finitely many actual measurements.

The latter interpretation is the traditional one in the microscopic quantum mechanics of tiny systems, while the former is never spelled out explicitly (except by me) but is implicit in all statistical mechanics of large systems. It applies universally but is informative only when the uncertainty is small compared to the value itself - a situation which is the case for all macroscopic quantities, but also when a microscopic system is in an eigenstate of the quantity measured.

Both interpretations are fully compatible and provide a smooth continuum between microscopic quantum measurements and macroscopic classical measurements. 

Statistical thermodynamics has nothing to do with Bayesian thinking. It was invented by Gibbs in the 19th century, more than 50 years before Jaynes introduced the (in my opinion misguided) subjective Bayesian interpretation. Expectations in statistical thermodynamics are exclusively ensemble expectations: The treatment in my online book never refers in the slightest way to anything Bayesian, except in the section where I show that the subjective view produces complete nonsense unless its subjective assumptions happen to agree with the objective situation.
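
For definiteness, these ensemble expectations are the standard Gibbs ones,
$$\langle A\rangle=\mathrm{Tr}(\rho A),\qquad \rho=\frac{e^{-\beta H}}{\mathrm{Tr}\,e^{-\beta H}},$$
which the thermal interpretation reads as the actual (uncertain) value of $A$ in the single system at hand, with no subjective ingredient.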

What is subjectively called ignorance is objectively just an approximation, in exactly the same sense as our ignorance about all decimals of $\pi$ leads us to work in practice successfully with the approximation $\pi\approx 3.14159$.

''Realistic'' just means that there is no mysterious measurement process that produces measurement results out of nothing, i.e., unrelated (except stochastically) to the terms figuring in the mathematical model.

Kochen-Specker, Bell, etc. all assume that measurements are exact observations of discrete eigenvalues, an idealized assumption that is plainly wrong. The thermal interpretation holds instead that when you want to claim on the level of the mathematical model that you measured the spin of a particle, you need to argue with mathematical rigor that the spot on the screen actually measured is in fact a measurement of the spin of the particle. This means that you must set up the mathematical machinery that proves that the sequence of macroscopic measurements (readings of a sequence of expectations) actually made correlates in a precise sense with what happens, in the mathematical model, to the system measured. Thus you have to solve a complicated problem in statistical mechanics. This is done (though with limited rigor) in the work by Allahverdyan et al. mentioned earlier.

The problem is analogous to the problem in classical mechanics of finding out what one subsystem (the ''observer'') of a big system can ''know'' about another subsystem (the ''observed''), in the sense that it can be computed (in principle) from the values of the quantities available by inspection to the observer (in a point particle model, the positions and momenta of the observer's particles). To postulate that, just because we are looking at a classical system, the observer knows the observed variables exactly is clearly a drastic, only approximately correct simplification. The same kind of drastic simplification is made when postulating Born's rule instead of investigating what a quantum observer can ''know'' about an observed microscopic quantum system.

Therefore getting a negative logical conclusion of Kochen-Specker or Bell type (from the assumption that Born's rule holds exactly and universally) does not mean anything for the true dynamics in which the simplified assumptions are only approximately valid. 

This is enough proof that one can disregard what Bell, Kochen & Specker, etc. say. 

Note also that the question of what a quantum observer can ''know'' about an observed microscopic quantum system can (and must) be answered independently of any postulates about the results of measurements. Instead, one has to define, on the level of the mathematical model, under which condition something defined mathematically in terms of observer variables only deserves to be called a measurement of something defined mathematically in terms of variables of the observed system only. Then one must prove results about the correlations of these two somethings that justify assigning the name ''measurement'' to their relationship.

This is not easy - not even in the case of a classical mechanical system. But only that can settle the issue.


Thanks for the responses, @ArnoldNeumaier. It sounds very interesting, but I am still unclear about the details of what your proposal says.

How literal is "thermodynamic" in "thermodynamic interpretation" meant to be? Is the proposal that quantum physics is precisely that, the thermodynamics of an unknown classical hidden variable theory?

More concretely, regarding the interpretation of expectation values:

When a state applied to an observable (observable quantity) produces a number between 0 and 1, what is your suggestion for how to read this number?

It seems you said you do not want to interpret it 1) as an average over realizations, nor 2) as a subjective measure of ignorance. So it's 3) what?

At the same time, you say it is to be thought of in the same way as expectations in thermodynamics. But expectations in thermodynamics are a combination of ensemble averages and Bayesian ignorance measures: on the one hand, we only know the configuration of the thermodynamical system to some approximation (that's our ignorance), and on the other hand, in short periods of time the system traces out many points in its spatial configuration space, which is why any finite-time observation sees an ensemble average (a single system, but many configurations in time).

In the article I was looking at, you went a long way toward formalizing concepts. Can you formalize your interpretation so that one may see as a theorem that there is a "realistic" interpretation of quantum physics which evades the assumptions in Kochen-Specker and Bell? What is the definition of "realistic" that will make this work?

@ArnoldNeumaier, thanks. So you claim that all established discussion of realism in quantum physics is wrong on the basis of being overly idealized, and that the actual situation is more complicated. That sounds plausible and interesting to me. But can we turn this into a positive statement? It might be that you are right about the measurement process being substantially more complicated than existing approximations acknowledge, but that with a more sophisticated description one would still find that a realistic interpretation is untenable. How do we tell? Can we formalize it and make it a theorem? A theorem like Kochen-Specker/Bell, but now with modified (improved!) assumptions allowing the opposite outcome?

By the way, thanks for amplifying Whittle. I suppose his point of view comes down to that of modern "quantum probability", but it's good to see this careful classical account.
