
  How deterministic are large open quantum systems (e.g. with humans)?

+ 6 like - 0 dislike
5258 views

I hope to formulate the question, "How much quantum indeterminism is present in human-scale phenomena?" or "How much does the wavefunction of a human branch over time?" without delving into interpretational questions.

Consider some large system modeled as an open quantum system -- say, a person in a room, where the walls of the room interact in a boring way with some environment.  Begin with a pure initial state describing some comprehensible configuration. (Maybe the person is sitting down.)  Generically, the system will be in a highly mixed state after some time.  Both normal human experience and the study of decoherence suggest that this state will be a mixture of orthogonal pure states that describe classical-like configurations.  Call these configurations branches.

How much does a pure state of the system branch over human time scales?  There will soon be many (many) orthogonal branches with distinct microscopic details.  But to what extent will probabilities be spread over macroscopically (and noticeably) different branches?

The answer seems impossible to quantify by calculating anything directly.  Simple reasoning already demonstrates the answer must depend significantly on the initial state.  If the initial state describes a person in deep sleep, then after a few seconds there will be no high-probability branches in which the person is awake.  But if the person has an alarm clock triggered by nuclear decay, there may well be 50/50 branching in which the person is awake or asleep.  (Note that you couldn't get such branching with a regular alarm clock.)
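To make the 50/50 claim concrete, here is a minimal toy sketch (purely illustrative; the two-level "person" stands in for a macroscopic configuration): tracing out the decay leaves the person in an equal mixture of two macroscopically distinct branches, while a reliable clock leaves a single branch.

```python
# Toy model of the decay-triggered alarm (illustrative labels only).
# Basis: decay qubit in {not decayed, decayed}, person in {asleep, awake}.
import numpy as np

not_decayed, decayed = np.array([1.0, 0.0]), np.array([0.0, 1.0])
asleep, awake = np.array([1.0, 0.0]), np.array([0.0, 1.0])

# Decay-triggered alarm: the person's macrostate is correlated with the decay.
psi = (np.kron(not_decayed, asleep) + np.kron(decayed, awake)) / np.sqrt(2)

# Reduced state of the person: trace out the decay degree of freedom.
rho = np.outer(psi, psi).reshape(2, 2, 2, 2)   # indices (d, p, d', p')
rho_person = np.trace(rho, axis1=0, axis2=2)
print(rho_person)   # diag(0.5, 0.5): two equal-weight macroscopic branches

# A reliable conventional alarm gives a product state instead,
# e.g. np.kron(decayed, awake): the person's reduced state is pure,
# i.e. a single branch and no 50/50 split.
```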

The decay-triggered alarm clock demonstrates that the presence/absence of noticeably different branches depends on what's in the room.  What if two people are talking in a sparse room (with no decay-triggered alarms)?  Will there be different branches of non-negligible probability in which the people say different things?  And after how long?

Although I'm sympathetic to objections, I believe these questions are somewhat well-formulated within most conventional accounts of quantum mechanics.  That is, most physicists believe it is hypothetically possible to model a large system in this way -- perhaps using non-relativistic quantum mechanics or the Standard Model -- and to calculate branches, where the probabilities of these branches correspond to observation. 

Revision in response to comments:

In response to the helpful comments, let me now pose a more general and careful version of this question. 

I start with a widespread assumption: that it's hypothetically possible to perform some quantum simulation of a real-life macroscopic system, perhaps at the atomic level or quantum field-theoretic level. I'm assuming the simulation yields (perhaps probabilistic) predictions about macroscopic behavior and that these predictions match observation.  The initial condition for the simulation should somehow represent a classical state, as well-specified as possible.  I haven't been specific about what sort of "quantum simulation" this is -- whether we start with a pure/mixed state, whether we use an open system, etc.  But if the assumptions seem plausible, fill in the details however you like.  Make it a simple macroscopic system. I'm deliberately asking about a hypothetical simulation/model (rather than "the actual system") to separate this question from interpretational issues, if possible.

Now here's the question: Are my assumptions okay?  If so, what happens to the probability distributions for macroscopic observables as you run the simulation in time?  Does the simulation predict a single classical configuration?  (I don't think so.)  Does it predict some distribution over different classical configurations?  (What does the distribution look like?)  And what's the evidence?
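To fix ideas about what such a "quantum simulation" could look like in the simplest possible setting, here is a toy sketch (a small Lindblad model with hypothetical parameters, in no way a model of a real room): a localized pure state is evolved while the environment monitors a pointer observable, and one tracks the resulting probability distribution over pointer values.

```python
# Toy "quantum simulation" (hypothetical parameters, not a model of a real
# room): a small system with a random Hamiltonian, evolved under a Lindblad
# equation in which the environment monitors a pointer observable X.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
d = 16                                        # toy Hilbert-space dimension
H = rng.normal(size=(d, d)); H = (H + H.T)/2  # random Hermitian Hamiltonian
X = np.diag(np.arange(d, dtype=float))        # "macroscopic" pointer observable
gamma = 0.1                                   # monitoring/decoherence rate

def rhs(t, y):
    # drho/dt = -i[H, rho] + gamma * (X rho X - {X^2, rho}/2)
    rho = y.reshape(d, d)
    drho = -1j*(H @ rho - rho @ H) \
           + gamma*(X @ rho @ X - 0.5*(X @ X @ rho + rho @ X @ X))
    return drho.ravel()

rho0 = np.zeros((d, d), dtype=complex)
rho0[0, 0] = 1.0                              # localized pure initial state
sol = solve_ivp(rhs, (0.0, 5.0), rho0.ravel(), t_eval=[5.0])
p = np.real(np.diag(sol.y[:, -1].reshape(d, d)))
print(p.round(3))   # probabilities spread over several pointer values:
                    # a distribution over configurations, not a single outcome
```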

asked Oct 8, 2015 in Theoretical Physics by Daniel Ranard (30 points) [ revision history ]
edited Oct 12, 2015 by Daniel Ranard

The crux of your question appears to be, how to quantify "amount of quantum indeterminism" or "amount of branching" for *anything*. A human being is an excessively complicated place to start for such questions.

Sure Mitchell, but he explicitly says that the human is just an example.  The important point is that it's a macroscopic system, so one can take a thermodynamic limit.  I'm sure he'd be happy with answers that only address macroscopic non-human systems.

2 Answers

+ 4 like - 0 dislike

On human time scales, objects of human size can never be assumed to be in a pure state.

Indeed, environmental decoherence immediately (on human time scales) transforms a pure state into a state well described by a density matrix of the form $e^{-S/k}$, where $S$ has the form of a 1-particle operator in the asymptotic free Fock space plus an integral over the energy density. Such a state is well described by nonequilibrium thermodynamics, which is why the latter describes everyday phenomena.
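One way to make this concrete is the standard generalized grand canonical (local-equilibrium) form, e.g. as in Balian's treatment of nonequilibrium statistical mechanics; the display below is an illustrative reconstruction, not necessarily the exact operator intended here:

$$\rho = e^{-S/k}, \qquad S = \int d^3x\,\beta(x)\Big(\hat h(x) - \mu(x)\,\hat n(x)\Big) + \text{const},$$

where $\beta(x)$ and $\mu(x)$ are local inverse-temperature and chemical-potential fields, and $\hat h(x)$, $\hat n(x)$ are the energy and number densities.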

On the other hand, nonequilibrium thermodynamics is in general chaotic, hence for practical purposes as nondeterministic as quantum mechanics, unless the system is sufficiently close to equilibrium.

answered Oct 8, 2015 by Arnold Neumaier (15,787 points) [ no revision ]

Do you think there exists some mixed state that models the system and is compatible with a classical description like "the person is sitting, etc."?  Then you can ask the same questions I've asked about further branching of an initially mixed state.  Or do you think there's no quantum state description of the room (pure or mixed) compatible with a given classical description?

@dranard: The mixed states appropriate for macroscopic systems do not branch (unless interacting strongly with a pure microscopic quantum system, as in the Schroedinger's cat Gedankenexperiment). They are characterized by a generalized grand canonical ensemble whose degrees of freedom are classical fields. These describe situations like ''the person is sitting'' and evolve according to the rules of nonequilibrium thermodynamics.

@ArnoldNeumaier.  Thanks.  If I understand correctly, you're saying a big system only branches when it's sensitively coupled to a microscopic system, which only happens when someone performs a quantum physics experiment explicitly configured to noticeably register microscopic branching.  In other words, you trust that "nonequilibrium thermodynamics" (or whatever you're calling the approximate theory) works perfectly to predict the macroscopic details of a configuration unless someone is purposely doing an experiment to break the approximation. 

If that's what you mean, it's a fun idea (a MWI person would say the first major branching took place in the 20th century!), but I'm skeptical that there isn't also branching in normal situations that don't involve deliberate quantum experiments. 

If parts of the system are classically chaotic, you might expect the classical configuration to be sensitive to microscopic quantum states.  When you ask about humans, the question becomes complicated.  You might expect that on a small scale, the constituent parts of an organism are chaotic and highly sensitive to microscopic conditions, but the miracle of biology is to remove this sensitivity and create predictable coarse-grained behavior.

@dranard: ''you're saying a big system only branches when it's sensitively coupled to a microscopic system'' Yes. This is because a macroscopic system can never be prepared in a nearly pure state - the preparation machinery would have to be far bigger than the size of the universe, and cannot be maintained in such a state because of severe environmentally induced decoherence. (In the attempts to construct quantum computers it is already very demanding to prepare and maintain a fairly small number of discrete subsystems embedded in a macroscopic system in a pure state - a toy task compared to what is needed to keep a macroscopic state pure.)

''If parts of the system are classically chaotic, you might expect the classical configuration to be sensitive to microscopic quantum states.'' No - if there is classical chaos it will mask any noncollective quantum behavior.

@ArnoldNeumaier I really appreciate the posts.  Any chance you could give a reference/explanation for the statement "the mixed states appropriate for macroscopic systems do not branch [except in very special cases]"? Maybe I missed it.

@dranard: I do not have a reference. But branching in the Everett sense is strictly speaking a property of wave functions, hence of pure states only. To give the concept of branching a meaning in the case of mixtures one must therefore consider systems that interact strongly with a system in which a pure state has been prepared, so that one can say the mixture branches when the pure system branches. The latter is possible only for very tiny systems - in practice with either one continuous degree of freedom or a few discrete degrees of freedom.

For describing how nonequilibrium thermodynamics captures the macroscopically accessible part of quantum systems see, e.g., books and papers by Roger Balian.

@ArnoldNeumaier Re "nonequilibrium thermodynamics is in general chaotic, hence for practical purposes as nondeterministic as quantum mechanics, unless the system is sufficiently close to equilibrium."

In other words, quantum physics is non-deterministic in principle, and classical (non-quantum) physics is non-deterministic in practice. But quantum non-determinism always gets amplified to the macroscopic level (e.g. the alarm clock triggered by nuclear decay mentioned in the OP), so it would seem that large systems are also non-deterministic in-principle, even if they are usually described by classical physics.

@GiulioPrisco: Only very specially prepared quantum events get amplified, namely those that are strongly coupled to a classical switch. Most quantum events remain completely obscure, drowned in classical noise. A world dominated by quantum effects would look very different from ours.

@ArnoldNeumaier re "Only very specially prepared quantum events get amplified..." - But the chain can be long and unknown (and also unknowable in practice). Imagine a nucleus that decays or doesn't decay, and a chemical reaction that happens or doesn't happen depending on that, ... , until an observable macroscopic consequence ("a classical switch") is reached. My guess is that this happens all the time and makes large systems unpredictable in-principle. Doesn't Prigogine imply something like that in "From Being to Becoming"?

@GiulioPrisco: If the chain is so long that no one recognizes it, it will go unnoticed, and the final event will be regarded as one of the many possible low probability events that happen occasionally but rarely - such as the collapse of a bridge.

One doesn't notice anything specifically quantum in such an event unless the chain is fairly direct and intended.

@ArnoldNeumaier re "If the chain is so long that no one recognizes it, it will go unnoticed..." - OK, but the chain is still there regardless of whether it gets noticed or not. My point is that the amplification of random quantum events makes macroscale physics non-deterministic in-principle, not only in-practice.

@GiulioPrisco: macroscale physics is already classically non-deterministic in-principle, since chaoticity means that predicting the future over a finite but long time requires initial information comprising far more digits than there are atoms in the universe. A finite being in a classical universe is in principle unable to collect this information to the required accuracy.
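As a back-of-envelope illustration of the precision claim (illustrative numbers only): with Lyapunov exponent $\lambda$, keeping the prediction error below $\Delta$ out to time $T$ requires knowing the initial condition to $\delta \approx \Delta e^{-\lambda T}$, i.e. about $\lambda T/\ln 10$ extra decimal digits per prediction horizon.

```python
# Back-of-envelope precision requirement for chaotic prediction
# (illustrative numbers): delta ~ Delta * exp(-lam * T).
import numpy as np

lam = 1.0      # Lyapunov exponent, 1/s (hypothetical)
Delta = 1e-3   # tolerable prediction error
for T in (10, 100, 1000):   # prediction horizons in seconds
    digits = (lam * T - np.log(Delta)) / np.log(10)   # decimal digits needed
    print(f"T = {T:4d} s: initial condition needed to ~{digits:.0f} decimal digits")
```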

@ArnoldNeumaier - that sounds like "non-deterministic in practice" rather than "non-deterministic in principle" to me.

If a given finite being with given finite resources is unable to predict the evolution of a system for a time longer than T, another more powerful finite being could predict the evolution of the system for a longer time, and yet another even more powerful finite being could predict the evolution of the system for an even longer time, and so forth.

What I am trying to get at is: is chaotic behavior of classical systems just a result of having finite computational resources, or is it more fundamental? Quantum physics seems "cleaner" because non-determinism is built into the principles.

@GiulioPrisco: But your chain of finite beings must stop because of the limited size of the universe accessible to any being. Thus there is a barrier in principle.

Concerning "cleaner": The notion of fundamental randomness is on logical grounds intrinsically meaningless, i.e., one cannot in principle give an executable operational definition of its meaning. I therefore believe that quantum randomness is not fundamental but a consequence of a not yet found highly chaotic deterministic description. (I don't count Bohmian mechanics as such a description, since it is not applicable to relativistic quantum fields.)

@ArnoldNeumaier re "I therefore believe that quantum randomness is not fundamental but a consequence of a not yet found highly chaotic deterministic description." - very interesting, thanks. Is this the "Thermal Interpretation" mentioned on your website? What should I read first?

re "The notion of fundamental randomness is on logical grounds intrinsically meaningless. I.e., one cannot in principle give an executable operational definition of its meaning." - How about I flip a coin next time I have to make a decision? (OK flipping a coin is chaotically deterministic, but I am sure you see what I mean). What's intrinsically meaningless in the notion of fundamental randomness?

@ArnoldNeumaier - by the way Arnold, your website is great. Thanks for curating it.

@GiulioPrisco: What's intrinsically meaningless in the notion of fundamental randomness?

This is off-topic here, but if you formulate an appropriate question about fundamental randomness in a separate thread, I'd argue it there.

My thermal interpretation is not directly related to determinism; it is only a first step towards a possible deterministic description. To understand it, you could start with my slides ''Optical models for quantum mechanics'', linked from my profile page.

@ArnoldNeumaier - Thanks for the link. I will think of how to formulate an appropriate question about fundamental randomness and open a new thread.

+ 4 like - 0 dislike

Your question is central to my research interests, in the sense that completing that research would necessarily let me give you a precise, unambiguous answer.  So I can only give you an imprecise, hand-wavy one.  I'll write down the punchline, then work backwards.

Punchline: The instantaneous rate of branching, as measured in entropy/time (e.g., bits/s), is given by the sum of all positive Lyapunov exponents for all non-thermalized degrees of freedom. 
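As a toy check of the classical ingredient in this punchline, one can estimate the Lyapunov exponent of a chaotic map numerically and read off the implied entropy rate in bits per step (a minimal sketch; the logistic map at $r=4$ has $\lambda = \ln 2$ exactly):

```python
# Estimate the Lyapunov exponent of the logistic map x -> r x (1-x)
# and convert it to an entropy/branching rate in bits per step.
import numpy as np

def lyapunov_logistic(r=4.0, x0=0.2, n=100_000, burn=1_000):
    x, acc = x0, 0.0
    for i in range(n + burn):
        if i >= burn:
            # average of log |f'(x)| along the orbit (guard against log 0)
            acc += np.log(max(abs(r * (1.0 - 2.0 * x)), 1e-300))
        x = r * x * (1.0 - x)
    return acc / n

lam = lyapunov_logistic()
print(f"Lyapunov exponent ~ {lam:.4f} nats/step (exact: ln 2 = {np.log(2):.4f})")
print(f"implied branching rate ~ {lam / np.log(2):.3f} bits/step")
```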

Most of the vagueness in this claim comes from defining/identifying the degrees of freedom that have thermalized, and dealing with cases of partial/incomplete thermalization; these problems exist classically.

Elaboration: Your question postulates that the macroscopic system starts in a quantum state corresponding to some comprehensible classical configuration, i.e., the system is initially in a quantum state whose Wigner function is localized around some classical point in phase space.  The Lyapunov exponents (units: inverse time) are a set of local quantities, each associated with a particular orthogonal direction in phase space.  They give the rate at which local trajectories diverge, and they (and their associated directions) vary from point to point in phase space. 

Lyapunov exponents are defined by the linearized dynamics around a point, and therefore they are constant on scales smaller than the scale set by the third derivative of the potential.  (Perfectly linear dynamics are governed by a quadratic Hamiltonian and hence a vanishing third derivative.)  So if your Wigner function for the relevant degree of freedom is confined to a region smaller than this scale, it has a single well-defined set of Lyapunov exponents.

On the other hand, the Wigner function for degrees of freedom that are completely thermalized is confined only to the submanifold associated with the values of conserved quantities like the energy; within this submanifold, the Wigner function is spread over scales larger than the linearization neighborhood and hence is not quasi-classical.

As mentioned above, I don't know how to think about degrees of freedom that are neither fully thermalized nor confined within the linear neighborhoods.

Argument: We want to associate (a) the rate at which nearby classical trajectories diverge with (b) the production of quantum entanglement entropy.  The close relationship between these two has been shown in a bunch of toy models.   For instance, see the many nice cites in the introduction of

Asplund and Berenstein, "Entanglement entropy converges to classical entropy around periodic orbits", (2015).  arXiv:1503.04857.  

and especially this paper by my former advisor:

Zurek and Paz,"Quantum chaos: a decoherent definition", Physica D 83, 300 (1995). arXiv:quant-ph/9502029.

The very crude picture is as follows. An initially pure quantum state with Wigner function localized around a classical point in phase space will spread to much larger phase-space scales at a rate given by the Lyapunov exponent.  The couplings between systems and environments are smooth functions of the phase space coordinates (i.e., environments monitor/measure some combination of the system's x's and p's, but not arbitrary superpositions thereof), and the decoherence rate between two values of a coordinate is an increasing function of the difference. Once the Wigner function is spread over a sufficient distance in phase space, it will start to decohere into an incoherent mixture of branches, each of which is localized in phase space. See, for instance, Fig. 1 in Zurek & Paz.

Hence, the rate of trajectory divergence gives the rate of branching.
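A crude numerical cartoon of this picture (toy parameters, purely illustrative): a Gaussian of initial width $\sigma_0$ stretched by an unstable flow has $\sigma(t) = \sigma_0 e^{\lambda t}$; once $\sigma(t)$ exceeds the decoherence length $\ell$, one expects roughly $\sigma(t)/\ell$ quasi-classical branches, so the entropy grows like $\lambda t$.

```python
# Cartoon: phase-space spreading at rate lam, branching once the spread
# exceeds the decoherence length (all parameters hypothetical).
import numpy as np

lam, sigma0, ell = 2.0, 1e-3, 1e-2   # Lyapunov rate, initial width, decoherence length
for t in np.arange(0.0, 5.0, 1.0):
    sigma = sigma0 * np.exp(lam * t)          # stretched Gaussian width
    n_branches = max(sigma / ell, 1.0)        # rough count of localized branches
    entropy_bits = np.log2(n_branches)        # grows ~ lam * t once sigma > ell
    print(f"t={t:.0f}: spread={sigma:.2e}, ~branches={n_branches:8.1f}, "
          f"entropy ~ {entropy_bits:5.2f} bits")
```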

With regard to this:

But if the person has an alarm clock triggered by nuclear decay, there may well be 50/50 branching in which the person is awake or asleep.  (Note that you couldn't get such branching with a regular alarm clock.)

It's true you couldn't get such branching with a highly reliable deterministic alarm clock, but you could dispense with the nuclear decay by measuring any macroscopic chaotic degree of freedom on timescales longer than the associated Lyapunov time constant.  In particular, measuring thermal fluctuations of just about anything should be sufficient.

One more thing (rather controversial): The reason this question was so hard to even formulate is twofold:

  1. No one has a good definition of what a branch is, nor of how to extract predictions for macroscopic observations directly from a general unitarily evolving wavefunction of the universe.  (My preferred formulation of this is Kent's set selection problem in the consistent histories framework.)
  2. Branching is intimately connected to the process of thermalization.  Although some recent progress in non-equilibrium thermodynamics has been made for systems near equilibrium (especially the Crooks fluctuation theorem and related work), folks are still very confused about the process of thermalization even classically.  See, for instance, the amazingly open question of deriving Fourier's law from microscopic first principles, a very special case!
answered Oct 19, 2015 by Jess Riedel (220 points) [ revision history ]
edited Oct 19, 2015 by Jess Riedel

I blogged this answer.




