
  Information conservation during quantum measurement

+ 5 like - 0 dislike
2838 views

Consider the following experiment. I take a spin-$\frac{1}{2}$ particle and make a $\sigma_x$ measurement (measure the spin in the $x$ direction), then make a $\sigma_y$ measurement, then another $\sigma_x$ one, then $\sigma_y$, and so on for $n$ measurements. The formalism of quantum mechanics tells us that the outcomes of these measurements will be random and independent. I now have a string of completely random bits, of length $n$. Briefly, my question is where does the information in this string come from?
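The experiment is easy to simulate. The following sketch (Python with NumPy, an illustration I am adding rather than anything from the original discussion) applies the Born rule and projective collapse at each step; the resulting bit string behaves like a fair, independent coin-flip sequence:

```python
import numpy as np

rng = np.random.default_rng(0)

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

def measure(state, op, rng):
    """Projective measurement of `op` on `state`: returns (outcome, collapsed state)."""
    vals, vecs = np.linalg.eigh(op)
    amps = vecs.conj().T @ state        # amplitudes in the eigenbasis of op
    probs = np.abs(amps) ** 2
    k = rng.choice(len(vals), p=probs)  # Born rule
    return vals[k], vecs[:, k]          # collapse onto the chosen eigenvector

state = np.array([1.0, 0.0], dtype=complex)  # start in |up_z>
bits = []
for i in range(1000):
    op = sx if i % 2 == 0 else sy            # alternate sigma_x / sigma_y
    outcome, state = measure(state, op, rng)
    bits.append(int(outcome > 0))

print(sum(bits) / len(bits))  # close to 0.5: the bits look like fair coin flips
```

Because the eigenstates of $\sigma_x$ and $\sigma_y$ are mutually unbiased, each measurement after the first gives each outcome with probability exactly $\frac{1}{2}$, independently of the history.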

The obvious answer is "quantum measurements are fundamentally indeterministic and it's simply a law of physics that you get a random string when you do that". The problem I have with this is that it can be shown that unitary evolution of quantum systems conserves von Neumann entropy, just as Hamiltonian evolution of a classical system conserves Shannon entropy. In the classical case this can be interpreted as "no process can create or destroy information on the microscopic level." It seems like the same should be true for the quantum case as well, but this seems hard to reconcile with the existence of "true" randomness in quantum measurement, which does seem to create information.

It's clear that there are some interpretations for which this isn't a problem. In particular, for a no-collapse interpretation the Universe just ends up in a superposition of $2^n$ states, each containing an observer looking at a different output string.

But I'm not a big fan of no-collapse interpretations, so I'm wondering how other quantum interpretations cope with this. In particular, in the "standard" interpretation (by which I mean the one that people adhere to when they say quantum mechanics doesn't need an interpretation), how is the indeterminacy of measurement reconciled with the conservation of von Neumann entropy? Is there an interpretation other than no-collapse that can solve it particularly well?

addendum

It seems worth summarising my current thinking on this, and having another go at making clear what I'm really asking.

I want to start by talking about the classical case, because only then can I make it clear where the analogy seems to break down. Let's consider a classical system that can take on one of $n$ discrete states (microstates). Since I don't initially know which state the system is in, I model the system with a probability distribution.

The system evolves over time. We model this by taking the vector $p$ of probabilities and multiplying it by a matrix $T$ at each time step, i.e. $p_{t+1} = Tp_t$. The discrete analogue of Hamiltonian dynamics turns out to be the assumption that $T$ is a permutation matrix, i.e. it has exactly one 1 in each row and column, and all its other entries are 0. (Note that permutation matrices are a subset of unitary matrices.) It turns out that, under this assumption, the Gibbs entropy (aka Shannon entropy) $H(p)$ does not change over time.
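This conservation is easy to check numerically. A minimal sketch (Python with NumPy, added here only as an illustration): apply a random permutation matrix repeatedly and verify that $H(p)$ is invariant.

```python
import numpy as np

def shannon_entropy(p):
    """Shannon/Gibbs entropy in bits, ignoring zero-probability states."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

rng = np.random.default_rng(1)
n = 5
T = np.eye(n)[rng.permutation(n)]  # a random n-by-n permutation matrix
p = rng.dirichlet(np.ones(n))      # a random probability vector

H0 = shannon_entropy(p)
for _ in range(10):
    p = T @ p                      # p_{t+1} = T p_t
# A permutation only relabels the microstates, so the entropy cannot change.
assert np.isclose(shannon_entropy(p), H0)
```

A generic stochastic matrix (one that mixes states rather than relabelling them) would instead increase the entropy toward the uniform distribution.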

(It's also worth mentioning, as an aside, that instead of representing $p$ as a vector, I could choose to represent it as a diagonal matrix $P$, with $P_{ii}=p_i$. It then looks a lot like the density matrix formalism, with $P$ playing the role of $\rho$ and $T$ being equivalent to unitary evolution.)

Now let's say I make a measurement of the system. We'll assume that I don't disturb the system when I do this. For example, let's say the system has two states, and that initially I have no idea which of them the system is in, so $p=(\frac{1}{2},\frac{1}{2})$. After my measurement I know what state the system is in, so $p$ will become either $(1,0)$ or $(0,1)$ with equal probability. I have gained one bit of information about the system, and $H(p)$ has reduced by one bit. In the classical case these will always be equal, unless the system interacts with some other system whose state I don't precisely know (such as, for example, a heat bath).
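The bookkeeping in that example can be written out explicitly. In this sketch (again an added illustration, not part of the original question) the expected information gained from a perfect, non-disturbing classical measurement equals the expected decrease in $H(p)$:

```python
import numpy as np

def H(p):
    """Shannon entropy in bits."""
    p = np.asarray(p, float)
    p = p[p > 0]
    return -(p * np.log2(p)).sum()

p = np.array([0.5, 0.5])
prior_H = H(p)  # 1 bit of uncertainty before the measurement

# A perfect, non-disturbing measurement: outcome i occurs with probability p[i],
# after which the distribution collapses to the i-th basis vector.
posteriors = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
expected_posterior_H = sum(p[i] * H(posteriors[i]) for i in range(2))  # 0 bits

info_gained = prior_H - expected_posterior_H
print(info_gained)  # 1.0: information gained equals the entropy decrease
```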

Seen from this point of view, the change in von Neumann entropy when a quantum measurement is performed is not surprising. If entropy just represents a lack of information about a system then of course it should decrease when we get some information. In the comments below, where I refer to "subjective collapse" interpretations, I mean interpretations that try to interpret the "collapse" of a wavefunction as analogous to the "collapse" of the classical probability distribution as described above, and the von Neumann entropy as analogous to the Gibbs entropy. These are also called "$\psi$-epistemic" interpretations.

But there's a problem, which is this: in the experiment described at the beginning of this question, I'm getting one bit of information with every measurement, but the von Neumann entropy is remaining constant (at zero) instead of decreasing by one bit each time. In the classical case, "the total information I have gained about the system" + "uncertainty I have about the system" is constant, whereas in the quantum case it can increase. This is disturbing, and I suppose what I really want to know is whether there's any known interpretation in which this "extra" information is accounted for somehow (e.g. perhaps it could come from thermal degrees of freedom in the measuring apparatus), or in which it can be shown that something other than the von Neumann entropy plays a role analogous to the Gibbs entropy.
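The mismatch described above can be made concrete numerically. In this sketch (an added illustration using the same alternating-measurement experiment), the post-measurement state is pure at every step, so $S(\rho)=0$ throughout, even though every collapse hands us a fresh random bit:

```python
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) in bits, ignoring numerically-zero eigenvalues."""
    vals = np.linalg.eigvalsh(rho)
    vals = vals[vals > 1e-12]
    return -np.sum(vals * np.log2(vals)) + 0.0

sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)

psi = np.array([1.0, 0.0], dtype=complex)  # pure state |up_z>
rng = np.random.default_rng(2)

for i in range(10):
    rho = np.outer(psi, psi.conj())
    # The state is pure before every measurement, so S(rho) = 0.
    assert np.isclose(von_neumann_entropy(rho), 0.0)
    op = sx if i % 2 == 0 else sy
    vals, vecs = np.linalg.eigh(op)
    probs = np.abs(vecs.conj().T @ psi) ** 2
    k = rng.choice(2, p=probs)  # one random bit extracted...
    psi = vecs[:, k]            # ...yet the collapsed state is again pure
```

Ten bits come out, but "information gained" + "remaining uncertainty" is not conserved: the entropy never had the ten bits to give.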

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Nathaniel
asked Apr 14, 2012 in Theoretical Physics by Nathaniel (495 points) [ no revision ]
Cross-posted to theoreticalphysics.stackexchange.com/q/1164/189

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Qmechanic

3 Answers

+ 4 like - 0 dislike

Two things:

  1. The quantum version of Liouville's theorem is the fact that time evolution is a unitary operator, i.e. it preserves amplitudes.

  2. Measurement is not mysterious. The conceptual problem is asking what is classical. Measurements, like all interactions, are quantum --- you treat the system as quantum, the measurement apparatus as quantum, and let them interact. The overall system + apparatus evolves unitarily. Now, the measurement apparatus does not perceive itself in a superposition --- that would be logically nonsense, so from the apparatus' point of view the system is "collapsed" to some state, which correlates with the state of apparatus. Procedurally, we model this by tracing over the degrees of freedom in the apparatus, and get a density matrix for the system.
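The tracing-out step in point 2 can be sketched in a few lines (Python with NumPy, my added illustration of the standard textbook calculation, not genneth's own code). A pre-measurement interaction entangles system and apparatus, and tracing over the apparatus leaves the system in a classically mixed density matrix:

```python
import numpy as np

# System qubit in an equal superposition; apparatus pointer starts in |0>.
alpha, beta = 1 / np.sqrt(2), 1 / np.sqrt(2)

# Pre-measurement (CNOT-like) interaction:
# (a|0> + b|1>) |0>  ->  a|00> + b|11>
joint = np.zeros(4, dtype=complex)
joint[0] = alpha  # |0>_sys |0>_app
joint[3] = beta   # |1>_sys |1>_app

# Joint density matrix, indexed as rho[sys_i, app_k, sys_j, app_l]
rho_joint = np.outer(joint, joint.conj()).reshape(2, 2, 2, 2)

# Partial trace over the apparatus: sum over the shared apparatus index.
rho_sys = np.einsum('ikjk->ij', rho_joint)

print(np.round(rho_sys.real, 3))
# Diagonal entries 0.5, off-diagonal entries 0: the coherences are gone,
# and the system looks like a classical mixture of the two outcomes.
```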

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user genneth
answered Apr 14, 2012 by genneth (565 points) [ no revision ]
I will endorse genneth and post my answer as a comment only. ... Quantum measurements are fundamentally indeterministic and it's simply a law of physics that you get a random string when you do that. Liouville's theorem only applies to models of classical statistical physics, see en.wikipedia.org/wiki/Liouville%27s_theorem_(Hamiltonian) ... Our world isn't classical so it doesn't have to obey Liouville's theorem. The quantum extension only says that the evolution is unitary i.e. the total probability is 1 for any initial state but not that randomness is prohibited.

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Luboš Motl
@genneth would you say your point 2 is different from endorsing a no-collapse interpretation? It sounds to me as if you're saying the overall system + apparatus (+ observer) is in a superposition of states in which different measurement results have occurred - thus the wavefunction never collapses but only appears to because the superposed observers are independent of one another. I can see very clearly that what I described is unproblematic under such an interpretation, but I was asking how other interpretations deal with it.

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Nathaniel
@LubošMotl I understand the quantum analogue of Liouville's theorem. It implies that the von Neumann entropy is conserved just as much as the classical Liouville's theorem implies the conservation of Shannon entropy. It's this that I have trouble reconciling with the presence of stochasticity.

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Nathaniel
I've edited the question to make that point clearer. (I shouldn't have referred to unitary evolution as "Liouville's theorem" - I just have a mental habit of thinking of them as the same thing.)

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Nathaniel
+ 2 like - 0 dislike

Although your statement that

it can be shown that unitary evolution of quantum systems conserves von Neumann entropy,

is indeed true, in quantum theory and in particular in any "collapse" interpretation the evolution of a quantum system is governed by such unitary evolution except when the system is being measured; at those times the system undergoes a nonunitary projective collapse that does not conserve von Neumann entropy and is therefore free to tamper with the information content of the world. (I'm not sure one can argue a projective measurement either creates or destroys information. What I do think though is that it certainly makes the world more (or at least not less) entropic: if we collapse without knowing the answer, there's one more thing to find out about the world.)

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Emilio Pisanty
answered Apr 17, 2012 by Emilio Pisanty (520 points) [ no revision ]
Most voted comments
We do observe v.N. entropy changes, but only when we perform measurements. (The situation you describe is a bit different as it involves ignoring one bit of information in order to create one bit of entropy - this is fine, because ignoring information also creates entropy in a classical system.) If objective collapses were spontaneously happening in non-measurement scenarios we would observe spontaneous decreases in v.N. entropy in isolated systems (though perhaps they'd need to be of a certain size and/or complexity), but we don't see this.

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Nathaniel
Yes, I agree that this is just exactly the measurement problem. (This is clearer to me now than when I wrote the question.) I guess I was just interested in how the various interpretations of QM attempt to deal with the measurement problem, and whether looking it in this way sheds any light on whether their strategies are viable.

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Nathaniel
This discussion was somewhat helpful, so I've awarded the bounty to you.

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Nathaniel
why the downvotes? please comment...

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Emilio Pisanty
I don't know who downvoted you (it wasn't me). But having re-read your answer I now see what you mean about spontaneous collapses increasing the v.N. entropy - you're saying that a spontaneous collapse would be equivalent to making a measurement but not looking at the answer. It's quite an interesting observation, and I'm not sure why I didn't get it before.

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Nathaniel
Most recent comments
Thank you for the answer. This is of course a valid route to take. But something more is still needed to tell us exactly when such collapses occur, otherwise you run into problems with Schrödinger's cat, Wigner's friend, etc. More seriously, though, if one part of a quantum system interacts with another in such a way as for it to count as a measurement, the von Neumann entropy should spontaneously change - and we never observe that. That's why I think this argument is a serious problem for objective-collapse interpretations. (Subjective collapse ones are OK though.)

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Nathaniel
I think your question then becomes exactly the (still very much open!) measurement problem. When exactly does/should unitary evolution give way to projective measurement? How exactly should one draw the Copenhagen-interpretation line between microscopic and macroscopic objects? Can one even draw it at all? As I see it, objective-collapse interpretations are nowhere near a position to answer these questions, and subjective collapse interpretations (i.e. many worlds?) are arguably just hiding it. (cont'd)

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Emilio Pisanty
+ 0 like - 0 dislike

In the "standard" interpretation, unitary evolution happens only when there is no observation.

The presence of the observer makes the evolution non-unitary so that non-deterministic processes may occur.

This makes any interpretation with collapse asymmetric between different persons. A certain distinguished person (the observer) can cause the collapse and others cannot. This means that there is a living person on Earth with special physical properties, the ability to make the wavefunction collapse. Who that person is could in principle be determined by physical means (although this is quite difficult, because it requires good isolation, currently impossible at the temperatures at which people can live).

Any interpretation that postulates observer-independent ("objective") collapse, equality of all people and so on is plainly wrong.

This post imported from StackExchange Physics at 2014-04-08 05:14 (UCT), posted by SE-user Anixx
answered Apr 19, 2012 by Anixx (30 points) [ no revision ]

user contributions licensed under cc by-sa 3.0 with attribution required
