  Boltzmann Distribution for Dissipative Systems

+ 6 like - 0 dislike
26 views

Consider a classical non-dissipative system, with dynamics described by Hamiltonian $\mathcal H(\mathbf{q},\mathbf{p})$ on a $2n$-dimensional phase space. If the system is in equilibrium with a thermal bath at inverse temperature $\beta$, we know that the phase-space probability density of the system is given by

\begin{equation} \rho(\mathbf{q},\mathbf{p}) = \frac{e^{-\beta \mathcal{H}(\mathbf{q},\mathbf{p})}}{\int d^n\mathbf{q}~d^n\mathbf{p}~e^{-\beta \mathcal{H}(\mathbf{q},\mathbf{p})}}, \tag{1} \end{equation}

which can be derived, for example, by maximizing the Gibbs entropy under a constraint on the average energy, assumed equal to the average of the Hamiltonian (see here for example).
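Explicitly, a minimal sketch of that maximization (with $\langle\mathcal{H}\rangle$ fixed and constants such as $k_B$ suppressed): maximize the Gibbs entropy $S[\rho] = -\int d^n\mathbf{q}~d^n\mathbf{p}~\rho\ln\rho$ subject to normalization and $\int d^n\mathbf{q}~d^n\mathbf{p}~\rho\,\mathcal{H} = U$. With Lagrange multipliers $\lambda$ and $\beta$, setting the functional derivative to zero gives

$$-\ln\rho - 1 - \lambda - \beta\,\mathcal{H}(\mathbf{q},\mathbf{p}) = 0 \quad\Longrightarrow\quad \rho(\mathbf{q},\mathbf{p}) \propto e^{-\beta\,\mathcal{H}(\mathbf{q},\mathbf{p})},$$

and normalization yields $(1)$, with $\beta$ fixed by the constraint $\langle\mathcal{H}\rangle = U$.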

Now for many dissipative systems, we know that the dynamics may still be described by a (possibly time-dependent) Hamiltonian $\mathcal{H}(\mathbf{q},\mathbf{p},t)$, which is generally not equal to the total energy $E(\mathbf{q},\mathbf{p})$. What will the phase-space probability density be for such a system in thermal equilibrium?

My thought was that in principle, the derivation of $(1)$ by maximizing entropy (with $\langle E(\mathbf{q},\mathbf{p})\rangle$ held constant) should work just as well, with the only difference being that we should use the energy function $E(\mathbf{q},\mathbf{p})$ instead of the Hamiltonian. In other words $$\rho(\mathbf{q},\mathbf{p}) = \frac{e^{-\beta E(\mathbf{q},\mathbf{p})}}{\int d^n\mathbf{q}~d^n\mathbf{p}~e^{-\beta E(\mathbf{q},\mathbf{p})}}.$$ But I'm not entirely sure of this, because if the relevant quantity is always the total energy, why do most textbooks (that I've seen) put so much emphasis on using the Hamiltonian?
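For concreteness, one example of the kind of dissipative system I have in mind (used purely as an illustration) is the Caldirola-Kanai description of the damped harmonic oscillator,

$$\mathcal{H}(q,p,t) = e^{-\gamma t}\frac{p^2}{2m} + e^{\gamma t}\,\frac{1}{2}m\omega^2 q^2,$$

whose equations of motion give $\ddot{q} + \gamma\dot{q} + \omega^2 q = 0$, while the physical energy, written in terms of the canonical momentum $p = m e^{\gamma t}\dot{q}$, is

$$E = \frac{1}{2}m\dot{q}^2 + \frac{1}{2}m\omega^2 q^2 = e^{-2\gamma t}\frac{p^2}{2m} + \frac{1}{2}m\omega^2 q^2 \neq \mathcal{H}.$$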

This post imported from StackExchange Physics at 2025-01-22 10:53 (UTC), posted by SE-user Sahand Tabatabaei
asked Feb 2, 2021 in Theoretical Physics by Sahand Tabatabaei (30 points) [ no revision ]

1 Answer

+ 5 like - 0 dislike

Allow me to give some context to this answer.

For some decades statistical mechanics was taught in an axiomatic (which can also mean dogmatic) way: in such-and-such situation you use such-and-such distribution/ensemble. I'm sure it's still taught that way in many courses. But at least since the 1950s (with Brillouin, Jaynes, and Kac too in my opinion) it became clear that its axioms were actually consequences of a logical and rational approach – nothing more than probabilistic inference. After all, this was also the starting point of Boltzmann, Maxwell, Gibbs (which I find somewhat dogmatic here and there), Einstein.

And the probabilistic inference allows us to use any combination of macroscopic and microscopic information. This understanding led to at least two kinds of development.


First, for systems in macroscopic equilibrium, additional pieces of macroscopic information are now used to construct ensemble distributions. For example, if we know not only the average total energy (under repetitions of the macroscopic preparation) but also its variance, this additional piece of information can also be used and appears as an extra parameter in the distribution. The ensemble thus obtained is called the Gaussian ensemble. Depending on the value of the additional parameter, which reflects the observed variance, the microcanonical and canonical ensembles are recovered as special cases. For a review see for example
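Schematically (the precise parametrization differs between references), maximizing the Gibbs entropy with both $\langle\mathcal{H}\rangle$ and $\langle\mathcal{H}^2\rangle$ fixed, i.e. with the energy variance as an extra constraint, gives

$$\rho(\mathbf{q},\mathbf{p}) \propto \exp\!\big[-\beta\,\mathcal{H}(\mathbf{q},\mathbf{p}) - \gamma\,\mathcal{H}(\mathbf{q},\mathbf{p})^2\big];$$

$\gamma \to 0$ recovers the canonical ensemble, while large $\gamma$ concentrates the distribution on a narrow energy shell, approaching the microcanonical case.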

This is just an example. You can search the literature for "angular-momentum ensemble", "pressure ensemble", "evaporative ensemble"... The possibilities are endless. These ensembles are absolutely needed for particular systems, such as small-size or non-extensive systems, or for particular situations such as phase transitions.

The probabilistic-inference rationale behind this was explained very clearly by Jaynes; see for example

Another review focusing on more recent applications is


Second, outside of equilibrium we can use time-varying macroscopic information, and also microscopic information such as the Hamiltonian, if it's known. This leads to time-dependent ensembles, where the time dependence comes from the Hamiltonian or from the ensemble parameters, which are now functions of time, say $\beta(t)$, since they reflect time-varying macroscopic quantities. Roughly speaking, we're using Gibbs's approach on the space of trajectories rather than on state space. The approach – again it's nothing other than probabilistic inference – was rationally summarized by Jaynes, see for example (these two are the main references for your question):

(other papers by Jaynes at https://bayes.wustl.edu/etj/node1.html) and also

And many applications of this development are appearing, for example (most of them should also be on arXiv):

Most interesting is that the resulting equations and results mirror those of non-equilibrium thermodynamics developed from first principles (without microscopic considerations) – for which see for example Astarita's brilliant text:

In fact it's even possible to include spatially varying macroscopic information in this approach, as explained in Jaynes's paper above. For particular cases of this kind of derivation see for example

The parameters in the resulting distributions have space- and time-dependence, for example $\beta(x,t)$ – just like the temperature field in continuum thermomechanics. Indeed also in this case there appear amazing parallels with general, non-equilibrium thermomechanics – for which see for example
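Schematically, and with notation chosen here purely for illustration (not taken from those references), such a distribution has a local-equilibrium form like

$$\rho \propto \exp\!\Big[-\!\int\! d^3x\;\beta(\mathbf{x},t)\,\big(\varepsilon(\mathbf{x}) - \mu(\mathbf{x},t)\,n(\mathbf{x})\big)\Big],$$

where $\varepsilon(\mathbf{x})$ and $n(\mathbf{x})$ are the microscopic energy and particle-number densities, and $\beta(\mathbf{x},t)$, $\mu(\mathbf{x},t)$ play the role of local inverse-temperature and chemical-potential fields.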


Coming finally to your question, the answer is that if our system is in thermal equilibrium then by definition its total energy is macroscopically constant in time. Then the ensemble distribution you write, with $E(q,p)$, is indeed appropriate for some predictive purposes, no matter what the Hamiltonian is. About time dependence and ensembles see Jaynes's paper "Inferential scattering" above, especially sections 4–7.

But we can also construct a sharper, time-dependent distribution by using the Hamiltonian evolution (time-dependent or not). In short, we use the Hamiltonian to propagate macroscopic information from each time to all other times. See the paper Macroscopic prediction above, especially sections 4–5.
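In equations, and only as a sketch of what "propagating the information" means here: construct the maximum-entropy distribution $\rho_{t_0}$ from the data at time $t_0$, then evolve it with the Liouville equation,

$$\frac{\partial\rho}{\partial t} + \{\rho,\mathcal{H}\} = 0 \quad\Longrightarrow\quad \rho(\mathbf{q},\mathbf{p},t) = \rho_{t_0}\big(\mathbf{q}_0(\mathbf{q},\mathbf{p},t),\,\mathbf{p}_0(\mathbf{q},\mathbf{p},t)\big),$$

where $(\mathbf{q}_0,\mathbf{p}_0)$ is the phase-space point that the (possibly time-dependent) Hamiltonian flow carries to $(\mathbf{q},\mathbf{p})$ between $t_0$ and $t$.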

There's no "correct" ensemble distribution: it depends on what we want to use it for (microscopic predictions? prediction of other macroscopic quantities?).

Outside of equilibrium, if we have time-dependent macroscopic measurements of the energy $E(t)$, we can build a time-dependent canonical distribution with a time-dependent parameter $\beta(t)$. Again we can also obtain an even sharper time-varying distribution by using the Hamiltonian.
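Schematically,

$$\rho_t(\mathbf{q},\mathbf{p}) = \frac{e^{-\beta(t)\,E(\mathbf{q},\mathbf{p})}}{\int d^n\mathbf{q}~d^n\mathbf{p}~e^{-\beta(t)\,E(\mathbf{q},\mathbf{p})}}, \qquad \beta(t)\ \text{chosen so that}\ \langle E\rangle_{\rho_t} = E(t).$$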


Finally, it may be useful to point out that macroscopic dissipation can (and usually does) occur even if microscopically we have a time-independent Hamiltonian. The ideal gas, for example, is a (macroscopically) dissipative system, for entropic reasons. See for example
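One standard illustration (not necessarily the one discussed in the reference above): in the free expansion of an ideal gas from volume $V_i$ to $V_f$ the total energy is unchanged and the microscopic Hamiltonian is time-independent, yet

$$\Delta S = N k_B \ln\frac{V_f}{V_i} > 0,$$

so the process is macroscopically irreversible for purely entropic reasons.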

This post imported from StackExchange Physics at 2025-01-22 10:54 (UTC), posted by SE-user pglpm
answered Feb 3, 2021 by pglpm (590 points) [ no revision ]
This answer is just awesome! Thank you!

This post imported from StackExchange Physics at 2025-01-22 10:54 (UTC), posted by SE-user Sahand Tabatabaei
@SahandTabatabaei Thank you! Your intuition in your question was spot-on. I recommend the papers by Jaynes, those are really awesome.

This post imported from StackExchange Physics at 2025-01-22 10:54 (UTC), posted by SE-user pglpm

user contributions licensed under cc by-sa 3.0 with attribution required
