  About Boltzmann H-theorem

+ 4 like - 0 dislike
2893 views

What are the assumptions behind the Boltzmann H-theorem? One can apparently derive it just from the unitarity of quantum mechanics, so it should hold quite generally. Does it imply that a closed system will always thermalize eventually? Does it apply to many-body localized states?

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Mr. Gentleman
asked Jan 31, 2014 in Theoretical Physics by Mr. Gentleman (270 points) [ no revision ]
Could you specifically reference the derivation you mention? The assumptions should be in there. But just off the top of my head, you're going to have to assume some specific initial conditions (they have to be "typical"). The way Boltzmann did it was with the Stosszahlansatz, but more modern derivations use more explicit assumptions on the initial conditions.

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Raskolnikov
Boltzmann's H-theorem is not for unitary quantum mechanics, but for classical (nearly ideal) gases. For the quantum mechanical H-theorem, see von Neumann's work from 1929. For an English translation, see arxiv.org/abs/1003.2133

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Zoltan Zimboras

2 Answers

+ 6 like - 0 dislike

I would like to share my thoughts and questions on the issue. The Boltzmann H-theorem based on classical mechanics is well discussed in the literature; there the irreversibility comes from his assumption of molecular chaos, which cannot be justified from the underlying dynamical equations. Here I will try to say something about the quantum H-theorem. The point I want to make is that, although the H-theorem can seemingly be derived from unitarity, the true entropy increase in fact comes from the non-unitary part of quantum mechanics. Let me first recap the derivation using unitarity $^{1,2}$.

H theorem as a consequence of unitarity

Denote by $P_k$ the probability of a particle appearing in the state $|k\rangle$, and by $A_{kl}$ the transition rate from state $|l\rangle$ to state $|k\rangle$. Then the master equation reads

$$\frac{dP_{k}}{dt}=\sum_{l}\left(A_{kl}P_{l}-A_{lk}P_{k}\right)=\sum_{l\neq k}\left(A_{kl}P_{l}-A_{lk}P_{k}\right)\cdots\cdots(1).$$ Then we take the time derivative of the entropy

$$S=-\sum_k P_k\ln P_k\cdots\cdots(2),$$

we obtain

$$\frac{dS}{dt}=-\sum_k\frac{dP_k}{dt}\left(1+\ln P_k\right)\cdots\cdots(3).$$ Together with (1) we have $$\frac{dS}{dt}=-\sum_{kl}\left\{(1+\ln P_k)A_{kl}P_{l}-(1+\ln P_k)A_{lk}P_{k}\right\}\cdots\cdots(4).$$ For the second term let us interchange the dummy indices $k$ and $l$; we get $$\frac{dS}{dt}=\sum_{kl}(\ln P_l-\ln P_k)A_{kl}P_l\cdots\cdots(5).$$ Now using the inequality $(\ln P_l-\ln P_k)P_l\geq P_l- P_k$, we obtain

$$\frac{dS}{dt}\geq \sum_{kl}(P_l-P_k)A_{kl}= \sum_{kl}P_l(A_{kl}-A_{lk})=\sum_{l}P_l\Big\{\sum_{k}(A_{kl}-A_{lk})\Big\}\cdots\cdots(6)$$

Now unitarity ensures that $\sum_{k}A_{kl}$ and $\sum_{k}A_{lk}$ are both $0$, because, as transition rates, $$\sum_{k}A_{kl}=\frac{d}{dt}\sum_{k}|\langle k|S|l\rangle|^2=\frac{d}{dt}\sum_{k}\langle l|S^{\dagger}|k\rangle\langle k|S|l\rangle=\frac{d}{dt}\langle l|S^{\dagger}S|l\rangle=\frac{d}{dt}\langle l|l\rangle=0\cdots\cdots(7),$$ and similarly $\sum_{k}A_{lk}=\frac{d}{dt}\langle l|SS^{\dagger}|l\rangle=0$, where $S$ is the unitary time-evolution operator describing the system. This is nothing but the statement that the total probability of transition from a given state into all states (and into a given state from all states) must be $1$. Clearly (6) and (7) imply the H-theorem: $$\frac{dS}{dt}\geq 0.$$
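As a quick sanity check of (1)-(7), here is a minimal numerical sketch in Python. It assumes a randomly chosen symmetric rate matrix, which trivially satisfies the condition $\sum_k A_{kl}=\sum_k A_{lk}$ that unitarity supplies in (7), and integrates the master equation (1) while monitoring the entropy (2).

```python
import numpy as np

# Minimal sketch: integrate the master equation (1) for a randomly chosen
# symmetric rate matrix A (so that sum_k A_{kl} = sum_k A_{lk}, the condition
# unitarity supplies in (7)) and check that the entropy (2) never decreases.

rng = np.random.default_rng(0)
n = 6                                    # number of basis states |k>
A = rng.random((n, n))
A = 0.5 * (A + A.T)                      # symmetric rates, A_{kl} = A_{lk}
np.fill_diagonal(A, 0.0)                 # only l != k terms enter eq. (1)

P = rng.random(n)
P /= P.sum()                             # normalized initial probabilities P_k

def entropy(P):
    """S = -sum_k P_k ln P_k, i.e. formula (2)."""
    P = P[P > 0]
    return -np.sum(P * np.log(P))

dt, S_prev = 1e-3, entropy(P)
for _ in range(5000):
    dP = A @ P - A.sum(axis=0) * P       # eq. (1): gain minus loss
    P = P + dt * dP                      # simple Euler step
    S = entropy(P)
    assert S >= S_prev - 1e-9            # H-theorem: dS/dt >= 0
    S_prev = S

print("final P (tends to uniform):", np.round(P, 3))
print("final S vs ln(n):", S, np.log(n))
```

The distribution relaxes towards the uniform one and $S$ approaches $\ln n$, exactly the behaviour the inequality above predicts.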

Where does the irreversibility come from?

Now we are in a position to confront Loschmidt's paradox, analogously to its classical version: there are many unitary, time-reversible quantum mechanical systems, so if we have just derived the H-theorem using unitarity alone, how can it be reconciled with the time-reversibility of the underlying dynamics?

What sneaked into the above derivation?

The crucial thing to notice is that, in the quantum regime, the definition of entropy in equation (2) is inherently ill-defined: the value of the entropy in (2) depends on the basis we choose to describe the system!

Consider a two-level system with two choices of orthonormal basis, $\{|1\rangle, |2\rangle\}$ and $\{|a\rangle, |b\rangle\}$, related by $$|1\rangle=\frac{1}{\sqrt2}(|a\rangle+|b\rangle),\qquad|2\rangle=\frac{1}{\sqrt2}(|a\rangle-|b\rangle).$$ Suppose the system is in the state $|1\rangle$. The entropy formula then gives $S=0$ in the first basis, since the system has a 100% chance to be found in $|1\rangle$, while in the second basis it gives $S=\ln2$, since the system has a 50%-50% chance to be found in either $|a\rangle$ or $|b\rangle$.
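A short numerical illustration of this basis dependence (a sketch; the matrix $U$ below simply encodes the basis change written above):

```python
import numpy as np

# Entropy formula (2) applied to the same pure state |1> in two different bases.
ket1 = np.array([1.0, 0.0])                     # |1> in the {|1>,|2>} basis

# <a|1>, <b|1> follow from |1> = (|a>+|b>)/sqrt(2), |2> = (|a>-|b>)/sqrt(2)
U = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2.0)      # rows: <a|, <b| in the {|1>,|2>} basis

def shannon_entropy(amplitudes):
    """S = -sum_k P_k ln P_k with P_k = |<k|psi>|^2."""
    P = np.abs(amplitudes) ** 2
    P = P[P > 0]
    return -np.sum(P * np.log(P))

print(shannon_entropy(ket1))      # 0.0, in the {|1>,|2>} basis
print(shannon_entropy(U @ ket1))  # ln 2 ~ 0.693, in the {|a>,|b>} basis
```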

Now we may argue that it is one thing to say the system is in $\frac{1}{\sqrt2}(|a\rangle+|b\rangle)$ and has the potential to end up in $|a\rangle$ or $|b\rangle$ with 50%-50% probability after a measurement, and quite another to say that such a transition has actually been realized by some measurement. The two situations must be described differently. Looking back at our derivation, it is not hard to see what we really did: after a basis state evolves into a superposition of the basis states, we assumed that transitions to the original basis states had actually happened, rather than the system simply remaining in that superposition; and, as just explained, the original definition of entropy is not capable of describing the latter situation.

A more plausible definition of quantum entropy is the von Neumann entropy, which is basis-independent; in this description, the entropy of a unitarily evolving system is constant in time, while a (projective) measurement can increase it.
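To make the contrast concrete, here is a small sketch; the Hamiltonian and evolution time below are arbitrary choices for illustration. The von Neumann entropy of the pure state $|1\rangle\langle 1|$ stays zero under unitary evolution, but jumps to $\ln 2$ after an unread projective measurement in the $\{|a\rangle,|b\rangle\}$ basis.

```python
import numpy as np
from scipy.linalg import expm

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho ln rho), from the eigenvalues of rho (basis-independent)."""
    p = np.linalg.eigvalsh(rho)
    p = p[p > 1e-12]
    return -np.sum(p * np.log(p))

ket1 = np.array([1.0, 0.0])
rho = np.outer(ket1, ket1)                        # pure state |1><1|, S = 0

H = np.array([[0.3,  0.7],
              [0.7, -0.2]])                       # some Hermitian Hamiltonian
U = expm(-1j * H * 2.5)                           # unitary evolution for a while
rho_t = U @ rho @ U.conj().T
print(von_neumann_entropy(rho), von_neumann_entropy(rho_t))   # both ~ 0

# Unread projective measurement in the {|a>,|b>} basis: coherences are erased.
P_a = 0.5 * np.array([[1.0,  1.0], [ 1.0, 1.0]])  # |a><a|
P_b = 0.5 * np.array([[1.0, -1.0], [-1.0, 1.0]])  # |b><b|
rho_meas = P_a @ rho @ P_a + P_b @ rho @ P_b
print(von_neumann_entropy(rho_meas))              # ln 2 ~ 0.693
```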

Based on the above comparison, we see that the irreversibility really enters as an assumption: the assumption that a measurement/decoherence has happened. And since a (projective) measurement is a non-unitary, irreversible process, there is no paradox anymore.

My own question on the issue is: what should we make of the fact that the von Neumann entropy is constant in time? Does it mean it is incapable of describing a closed system evolving from non-equilibrium to equilibrium, or should we reverse the argument and say that any non-equilibrium-to-equilibrium evolution must be described by some non-unitary process?


1. Rephrased from Section 3.6 of The Quantum Theory of Fields, Vol. 1, by S. Weinberg.

2. If I remember correctly (about which I'm not quite confident), such a derivation was first given by Pauli, and he correctly spotted the origin of irreversibility, which he called the "random phase assumption".

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Jia Yiyang
answered Feb 8, 2014 by Jia Yiyang (2,640 points) [ no revision ]
The von Neumann entropy is not the same thing as thermodynamic entropy, for the same reason the information entropy $\int -\rho\ln \rho\, dq\,dp$ is not thermodynamic entropy in classical statistical physics. Both von Neumann and information entropy are constant for isolated Hamiltonian systems. Thermodynamic (Clausius) entropy has to be derived from statistical physics differently: it is proportional to the logarithm of the accessible phase-space volume (classical statistical physics) or to the logarithm of the number of accessible microstates (quantum statistical physics).

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Ján Lalinský
The latter entropies are defined only for equilibrium states. There is no law saying that the entropy has to be defined for non-equilibrium states or that it has to increase in time.

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Ján Lalinský
@JánLalinský: thanks for pointing out my misunderstanding. So in the quantum case what would be the proper definition of thermodynamic entropy? Or what would be a basis-independent way of counting microstates?

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Jia Yiyang
I am not sure about this, but the closest thing is the following: let the system be isolated with energy somewhere in $\langle E,E+\Delta E\rangle$ (the interval being large enough to contain zillions of Hamiltonian eigenstates). Define the statistical entropy as $S = k_B \ln ( D(E)\Delta E )$, where $D(E)$ is the density of Hamiltonian eigenstates on the energy line. This is the statistical entropy of the microcanonical ensemble, and for macro-systems (such as an ideal gas with many particles) it can be treated as a quantity similar to the thermodynamic entropy.

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Ján Lalinský
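A toy numerical version of the counting prescription in the comment above (an illustration only: a random-matrix "Hamiltonian" rather than any realistic system, with $k_B=1$):

```python
import numpy as np

# Count the Hamiltonian eigenstates in an energy shell [E, E + dE] and set
# S = ln(number of accessible microstates), i.e. S = k_B ln(D(E) dE) with k_B = 1.

rng = np.random.default_rng(1)
dim = 2000
M = rng.normal(size=(dim, dim))
H = (M + M.T) / np.sqrt(2 * dim)            # GOE-like toy Hamiltonian
energies = np.linalg.eigvalsh(H)

E, dE = 0.0, 0.1                             # the energy shell [E, E + dE]
n_states = np.count_nonzero((energies >= E) & (energies < E + dE))
print(n_states, np.log(n_states))            # D(E)*dE and the entropy S
```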
@JánLalinský: ok, I'll try to look into this, thanks.

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Jia Yiyang
@JiaYiyang: Thank you for sharing your thoughts and for filling in the derivation, which is indeed what I meant. I agree with you that the derivation missed the basis dependence, and your argument is quite good. One more point on the Boltzmann entropy (proportional to the logarithm of the number of microscopic states), however: my opinion is that this is the quantity that is maximized in thermal equilibrium, and once equilibrium is reached, ergodicity indicates the entropy can also be written as $-\mathrm{tr}(\rho\ln\rho)$.

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Mr. Gentleman
+ 0 like - 0 dislike

That calculation has restrictions, but one in particular should be mentioned: the master equation is supposed to be connected to this particular entropy, yet it is not necessarily; the master equation can be connected to a more general entropic form, and that is a fundamental idea for a more complete proof.

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user Maike Afs
answered Jul 13, 2016 by Maike Afs (0 points) [ no revision ]
Wouldn't that be better as a comment?

This post imported from StackExchange Physics at 2016-08-04 12:34 (UTC), posted by SE-user MAFIA36790

user contributions licensed under cc by-sa 3.0 with attribution required
