Necessary and sufficient conditions for an isolated dynamical system to approach thermal equilibrium automatically

+ 3 like - 0 dislike
2834 views

Given an isolated $N$-particle dynamical system with only two-body interactions, that is,
$$H=\sum_{i=1}^N\frac{\mathbf{p}_i^2}{2m}+\sum_{i<j}V(\mathbf{r}_i-\mathbf{r}_j).$$

In the thermodynamic limit, that is $N\gg 1$ and $N/V=\text{const}$, it seems that not every two-body interaction makes the system approach thermal equilibrium automatically. For example, if the interaction is an inverse-square attractive force, we know the system cannot approach thermal equilibrium.

Although there is [Boltzmann's H-theorem](https://en.wikipedia.org/wiki/H-theorem) to derive the second law of thermodynamics, it relies on the [Boltzmann equation](https://en.wikipedia.org/wiki/Boltzmann_equation), which is derived from [Liouville's equation](https://en.wikipedia.org/wiki/Liouville's_equation) in the approximation of low density and short-range interactions.
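For reference, the quantity in the H-theorem is, in its standard textbook form with $f(\mathbf{r},\mathbf{p},t)$ denoting the one-particle distribution,
$$H(t)=\int f(\mathbf{r},\mathbf{p},t)\,\ln f(\mathbf{r},\mathbf{p},t)\,\mathrm{d}^3r\,\mathrm{d}^3p,\qquad \frac{\mathrm{d}H}{\mathrm{d}t}\le 0,$$
where the inequality relies on the molecular-chaos assumption (Stosszahlansatz) that enters the Boltzmann collision term.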

My question:

1. Does this mean that any isolated system with low density and short-range interactions can approach thermal equilibrium automatically? If not, what is a counterexample?

2. For an isolated system with long-range interactions or high density, what are the necessary and sufficient conditions for it to approach thermal equilibrium automatically? What about the Coulomb interaction?

3. How can one prove rigorously that a purely self-gravitating system cannot approach equilibrium? I have only heard the hand-waving argument that gravity makes matter clump, but I have never seen a rigorous proof.

I know there is a maximum-entropy postulate for the microcanonical ensemble; I just want to find the range of applicability of this postulate. I have always been curious about the above questions, but I have never seen them discussed in any statistical mechanics textbook. You may also point me to literature in which I can find the answer.

asked Jan 19, 2017 in Theoretical Physics by Alienware (185 points) [ no revision ]
retagged Jan 19, 2017 by Alienware

1. A counterexample: a dilute but ordered monatomic gas, in which some atoms are aligned along one axis and move with the same velocity, while the other atoms do the same along another axis, so that the atoms never collide.

2. Isolated systems are difficult to prepare and are thus too artificial. One has to add a small amount of "openness", and the exchange with the environment will bring the necessary chaos.

The other examples (pure gravity, etc.) are also too artificial. As long as they are observable (otherwise you cannot say they are "purely gravitational"), they radiate and interact with the environment, so the openness is implicitly present. Considering an open system (or a subsystem of a bigger system) changes the framework and makes it easier to prove that thermal equilibrium is reached.

Vladimir, your setting in example 1 is not preserved by the dynamics, hence it probably starts mixing. Your comment 2 is fully appropriate.

Arnold, a short-range interaction potential can be made negligible when a dilute gas is considered or when a rigid-ball model is used. Thus one can avoid collisions. In any case, in such a system the mixing time is larger than the naive "collision time" $\tau$ estimated as the average relative distance $d$ divided by the average relative velocity $v_{\rm av}$.
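As a rough numerical illustration of this time scale (the numbers are generic room-temperature gas values assumed only for the estimate): with a number density $n\sim 2.5\times 10^{25}\ \mathrm{m^{-3}}$ the average spacing is $d\sim n^{-1/3}\approx 3\ \mathrm{nm}$, and for thermal speeds of a few hundred $\mathrm{m/s}$ the naive collision time is $\tau = d/v_{\rm av}\sim 10^{-11}\ \mathrm{s}$, while the actual mixing time can be far longer.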

Indeed, one must assume that the two lines don't intersect and that the interaction vanishes exactly at distances larger than some proper fraction of the distance between the lines. You should post your comment (made a bit more precise) as an answer!

1 Answer

+ 2 like - 1 dislike
  1. Boltzmann's H-theorem in its original form is flawed, as you can easily see from the Loschmidt and related paradoxes. The fact that the gas has a low density and the interactions are short-range has nothing to do directly with the thermalization of the gas, but rather with the possibility of restricting your theoretical analysis to only two particles interacting at once (rather than three, four, or even the whole gas).

    Modern arguments for thermodynamical postulates prefer to invoke the ergodic hypothesis, which roughly says that only the total energy of the system is conserved by the evolution of the gas as a whole, and that otherwise the system thoroughly explores all the possibilities allowed by this constraint. There is overwhelming anecdotal evidence that virtually any classical many-particle system with generic initial conditions has this property. Ergodicity leads to the conclusion that if you look at the system for an infinite time and average your observations, you will get thermodynamics (a formal statement of this time-average property is sketched after this list).

    This so far is fairly rigorous, but the question is how well this limit converges, i.e., what real physics we get in real observations when our measurements typically last fractions of a second. In full rigor and generality, this question is simply unanswered. Since low-density systems with short-range interactions allow the two-particle simplification indicated above, these are typically the systems for which most rigorous results are proven. (This also addresses 2.)
     
  2. Coulomb forces: In principle, the range of the Coulomb force generated by a single particle is infinite, similarly to the gravitational force. In practice, however, physical systems tend to have zero net charge. Consequently, the Coulomb charge of an individual particle is screened by particles of opposite charge flying around, and the true range of the force can be estimated by the Debye length (see the sketch after this list). In many cases your gas is even made of neutral atoms rather than charged ions, and then the electromagnetic force becomes a very weak and short-ranged residue.
     
  3. I wouldn't say that a gravitational system cannot reach thermodynamical equilibrium. In many particular cases it can, but this equilibrium will be vastly different in its properties from well-known thermodynamical systems such as the ideal gas. For instance, the system will not be homogeneous; local pressure and temperature may increase with decreasing energy of the system (see the virial-theorem sketch after this list); there is no lower bound on the energy; and so on. As such, these systems are unstable once we introduce effects such as radiation of energy out of the system, and this leads to a gradual gravitational collapse. This is why black-hole space-times (i.e., the whole thing, with all of the infinite space around the black hole) are considered the only examples of truly self-consistent thermodynamical equilibria in relativity.

    The people who come to mind first as having worked on self-gravitating thermodynamics are Donald Lynden-Bell and Joseph Katz; here and here are some nice reviews by them.
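For concreteness, the standard textbook relations behind the three points above are the following (a minimal sketch with generic symbols, assuming an idealized setting in each case). For point 1, ergodicity on the energy shell means that for an observable $A$ the time average reproduces the microcanonical average,
$$\lim_{T\to\infty}\frac{1}{T}\int_0^T A\big(q(t),p(t)\big)\,\mathrm{d}t=\frac{1}{\Omega(E)}\int_{H=E}A\,\mathrm{d}\mu\equiv\langle A\rangle_{\rm mc}.$$
For point 2, the single-species estimate of the screening scale is the Debye length
$$\lambda_D=\sqrt{\frac{\varepsilon_0 k_B T}{n e^2}},$$
with $n$ the charge-carrier density; beyond $\lambda_D$ the field of an individual charge is exponentially suppressed. For point 3, the virial theorem for a bound self-gravitating system gives $2\langle K\rangle+\langle U\rangle=0$, hence $E=\langle K\rangle+\langle U\rangle=-\langle K\rangle$; removing energy (e.g. by radiation) increases $\langle K\rangle$ and thus the kinetic temperature, which is the negative heat capacity alluded to above.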
answered Jan 20, 2017 by Void (1,645 points) [ revision history ]
edited Jan 20, 2017 by Void

''There is overwhelming anecdotal evidence that virtually any many-particle system with generic initial conditions has this property.'' - No, rather there is overwhelming evidence for the contrary. Otherwise there would be no stable molecules, which are proof of additional conservation laws at ordinary energies, at least over human time scales. If there is any ergodicity in an $n$-particle system, it is irrelevant on these time scales.

@ArnoldNeumaier For quantum systems the question becomes more difficult, I do agree. But for classical systems I am very much convinced of that statement when infinite time scales are involved.

As for finite intervals, there are nice examples, such as the stickiness of phase-space structures known from low-dimensional classical chaos, which document very well that the convergence to ergodicity, in the sense of spending equal average time in all of the available phase space, may be very weird, counterintuitive, and ugly (even though it is ultimately mathematically proven in these cases). I believe I have made appropriate reservations in this respect without having to go into detail.
(I made an edit to the answer to talk only about the classical case.)

@ArnoldNeumaier Thinking about the case of the molecule, I actually do not see how this is an argument. A stable molecule is (ideally) a ground state of an ensemble of particles. Giving it a little bit of extra energy above the ground state and assuming ergodicity does not make it fall apart; it may only make it show up with some excited electron, which might result in a photon emission relaxing it back to the ground state. Of course, once the photon leaves the system, ergodicity is violated.

But breaking the molecule up means increasing its total energy, and this has nothing to do with ergodicity, except perhaps when we go to the level of the whole gas. But on the level of the whole gas we see that the typical collision energies are much smaller than molecular binding energies. In typical organic molecules the bond-dissociation energies are $\sim 1\ \mathrm{eV}$, whereas at room temperature the typical collision energies are $\sim 10^{-2}\ \mathrm{eV}$, and a bond dissociation is thus simply not very probable even under the full-fledged ergodic hypothesis. I.e., the fact that we know gases with stable molecules at our given conditions is no contradiction to the ergodic hypothesis; it can all be well argued for in terms of probability.
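As a quick sanity check of these orders of magnitude (using standard values assumed here only for illustration): at $T=300\ \mathrm{K}$ one has $k_B T\approx 0.026\ \mathrm{eV}$, so the Boltzmann factor for supplying a $1\ \mathrm{eV}$ bond-dissociation energy in a single thermal fluctuation is of order
$$e^{-1\ \mathrm{eV}/k_B T}\approx e^{-39}\sim 10^{-17},$$
which is why thermally driven dissociation is negligible at room temperature even if full ergodicity is assumed.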

For dilute monatomic gases you may perhaps be right, but your claim is ''for virtually any classical many-particle system with generic initial conditions''. Thus it should cover all fluids and all macromolecular materials. Glassy behavior is ample demonstration of non-ergodicity in practice, as is the molecular dynamics of proteins.

Note also that one can discuss molecule formation classically (though the comparison with experiment is then poor): bound states are just periodic solutions. Your argument about breaking a molecule is valid if the molecule is alone in the world, but classical reactions of the form AB + C $\to$ A + BC are perfectly allowed at fixed energy. The problem with ergodicity in multiparticle systems is that in many such systems local changes are (sometimes extremely) slow and the total energy can be distributed quite unevenly.

Thus ergodicity is a poor explanation for thermodynamics. In his famous 1902 book, Gibbs didn't even mention it once!

@ArnoldNeumaier Glassy behaviour is difficult to understand, but it does not negate the ergodic hypothesis; it can be hypothesized to be caused by extremely long time scales over which one has to average to see ergodic behavior in practice. I do agree that there is no guarantee that the convergence of the ergodic limit is fast enough to be of importance in all systems universally. However, as long as there is no satisfactory model explaining this behavior as a result of true mathematical non-ergodicity, it cannot be given as a counterexample to ergodicity in the strict mathematical meaning of the word.

As for the molecules: of course the stability of molecules is guaranteed by the existence of a ground state, and they would be unstable in the classical world, similarly to the atom (by the way, quantum bound states do not correspond only to regular/quasi-periodic orbits but also to bound chaotic orbits). If a molecule has isoenergetic reconfigurations, then these will of course also exist in a gas in thermodynamical equilibrium. I.e., if isoenergetic compounds exist, the molecule is not strictly stable and reaches chemical equilibrium with the rest of its variants (and this is nothing new).



I am sorry, but citing Gibbs on this matter is a bad choice. Statistical stationarity is not a sufficient condition for thermodynamics, and the statistical physics built by Gibbs is all built around the conjecture that the "correct" stationary equilibrium distribution is $\sim \exp(-\beta E_{\rm tot})$. However, any distribution $\sim f(\exp(-\beta E_{\rm tot}))$ is in statistical equilibrium.
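The stationarity claim in the last sentence is a one-line consequence of Liouville's equation (a minimal sketch, for any smooth function $f$ of the Hamiltonian):
$$\rho=f(H)\quad\Longrightarrow\quad\frac{\partial\rho}{\partial t}=-\{\rho,H\}=-f'(H)\,\{H,H\}=0,$$
so any phase-space distribution that depends on the state only through conserved quantities is statistically stationary, not just $\exp(-\beta E_{\rm tot})$.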

Either way, there is little argument as to why a system should come into statistical stationarity. The fact that we assert that it does is a very strong postulate, not so different from the ergodic hypothesis. Why should a system based on reversible mechanics not oscillate in its probability distribution and never relax? In fact, we know that an isolated system starting from generic initial conditions never reaches statistical stationarity; it may only reach it in a time-averaged sense!

The ergodic hypothesis gives you the possibility to build a robust path to why exactly a system reaches (averaged) statistical stationarity, and why the effective distribution is specifically $\sim \exp(-\beta E)$. In fact, there is no approach known to me which is able to do that without invoking ergodicity at some point. (If you do not agree, please give a concrete, to-the-point reference.)
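The second step mentioned here is usually argued via the standard reservoir expansion (a sketch under the usual textbook assumptions of a small subsystem weakly coupled to the rest of the gas, not a derivation specific to this discussion): if the total system is treated microcanonically and the subsystem carries energy $E_s$, then
$$P(E_s)\propto\Omega_{\rm res}(E_{\rm tot}-E_s)=e^{S_{\rm res}(E_{\rm tot}-E_s)/k_B}\approx e^{S_{\rm res}(E_{\rm tot})/k_B}\,e^{-E_s/(k_B T)},\qquad\frac{1}{T}=\frac{\partial S_{\rm res}}{\partial E},$$
which is the canonical weight $\sim\exp(-\beta E_s)$ for the subsystem.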

I voted your answer down - not because you argued that, assuming ergodicity, thermal equilibrium can be deduced - but because you argued that ''There is overwhelming anecdotal evidence that virtually any many-particle system with generic initial conditions has this property.'' The latter is simply wrong. Possibly there is no other known argument than assuming ergodicity, but in this case the problem is simply unsolved, and perhaps classically unsolvable if answering the question requires quantum arguments.

In the quantum case, there are methods independent of ergodicity (which does not even have a well-defined meaning in the quantum case); see, e.g., the book by Calzetta and Hu.
