  Does entropy measure extractable work?

+ 18 like - 0 dislike
2441 views

Entropy has two definitions, which come from two different branches of science: thermodynamics and information theory. Yet the two are thought to agree. Is that true?

Entropy, as seen from information theory, measures our ignorance about a system. Entropy, as seen from thermodynamics, measures the amount of extractable work. One is epistemological, the other is ontological. Yet the two can agree, if entropy really measures the amount of extractable work given one's knowledge of the system.

But is this true? Does Shannon's entropy, computed on the probability distribution of a physical system, express the amount of work we can obtain from it? How do increases in our knowledge increase the amount of extractable work? Is Landauer's principle powerful enough to establish the connection?

asked Sep 16, 2011 in Theoretical Physics by Javier Rodriguez Lag (315 points)
I find the dramatically different answers to this question a bit baffling. I also thought the two kinds of entropy were different, but I liked Joe's answer and upvoted it. Would it be possible for someone to reconcile the different answers, or point out why one (or more) of them is wrong?

One reconciliation is that "work" has multiple meanings. Typically the first meaning learned is thermodynamic, but (as Dirac demonstrated) a more general meaning amounts to: *Given a class of dynamical processes, 'work' is any mathematically natural measure of what those processes accomplish.*

You probably want to say that the thermodynamic entropy tells you how much of the energy is *not* available for work! I'll also go with Matt's answer that this only makes sense if the state is in thermal equilibrium, whereas Shannon entropy is more general.

@Aaron: Actually, the reconciliation is that I was answering yes to whether or not the two entropy definitions agree. I didn't mean that entropy measures extractable work.

3 Answers

+ 16 like - 0 dislike

UPDATE: Below I am answering yes to the first question in the post (are the two kinds of entropy the same up to a constant?). This has led to some confusion, as both Matt and John gave answers saying "the answer is no"; however, I believe they are referring to the title, "Does entropy measure extractable work?". Although the author uses the two interchangeably, the question itself contains a false premise: namely, that physical entropy is a measure of extractable work ("Entropy, as seen from thermodynamics, measures the amount of extractable work"). This is simply not true in general, as Matt's counterexample shows, though it can be true in certain special cases. A concrete case is a ball placed anywhere on a level surface: if the ball is placed randomly, no more work can be extracted than if its location is known.


The answer is yes, the two kinds of entropy are the same up to a constant factor of $k_B \log 2$ (which is also the origin of Landauer's principle). One way to see this is via Maxwell's demon, in particular via the Szilard engine, an idealised heat engine that uses a single-particle gas. You introduce a partition, which effectively divides the gas into two regions, only one of which contains the particle. Now, if you know which side of the partition the particle is on, you can use the pressure difference to extract work; if you don't, you can't, since you don't know which way the partition will be pushed.
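
To put a number on this, here is a back-of-envelope sketch in Python (added for illustration, not part of the original answer): the quasi-static isothermal expansion of a one-particle ideal gas from $V/2$ to $V$ yields exactly $k_B T \ln 2$ of work.

    import numpy as np

    k_B = 1.380649e-23   # Boltzmann constant, J/K
    T = 300.0            # temperature, K

    # Quasi-static isothermal expansion of a single-particle gas from V/2 to V:
    # W = \int_{V/2}^{V} (k_B T / V') dV' = k_B T ln 2
    W = k_B * T * np.log(2.0)
    print(f"Work per cycle with the particle's side known: {W:.3e} J")  # ~2.87e-21 J

    # Landauer's principle: erasing the one-bit measurement record costs at
    # least k_B T ln 2, so no net work is gained over a complete cycle.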

The connection with information theory comes in when we measure which side of the partition the particle is on. The register which holds our measurement result gains a certain amount of entropy (and hence information), but having this information decreases the entropy of the gas. Hence you can convert back and forth between information entropy and physical entropy.

There is a fairly extensive literature on the matter, so instead of trying to give you a list, I'll point you to a review article on Maxwell's demon and information theory from a few years ago: arXiv:0707.3400.

answered Sep 16, 2011 by Joe Fitzsimons (3,575 points)
Yes, I see. But, staying with the Szilard engine example, I'd like to go further. If I know which side the particle is on, I can extract $kT\ln 2$ of work. If I don't know where the particle is, I can get no work. But what if I have partial knowledge? Is Shannon's entropy still the answer?

@Javier: Yes, you can use exactly the same trick, but use a measurement that returns 0 if the particle is in the left most compartment, and 1 with probability $p$ if the particle is in the rightmost compartment and 0 otherwise, and proceed from there. Have a look at the paper, there's lots of good stuff in it.
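
As a numeric illustration of this comment (added here, not part of the original thread), the average work extractable with such a partial measurement can be bounded by $k_B T \ln 2 \cdot I(X;M)$, where $I(X;M)$ is the mutual information in bits between the particle's side $X$ and the outcome $M$; this is the Sagawa–Ueda form of the second law for feedback control. A sketch:

    import numpy as np

    def h2(q):
        """Binary Shannon entropy in bits."""
        if q <= 0.0 or q >= 1.0:
            return 0.0
        return -q * np.log2(q) - (1 - q) * np.log2(1 - q)

    def max_work(p, T=300.0, k_B=1.380649e-23):
        """Work bound for the measurement described above: outcome 0 if the
        particle is on the left; if on the right, outcome 1 with probability p."""
        # I(X;M) = H(M) - H(M|X), with P(M=1) = p/2 and H(M|X=left) = 0
        I = h2(p / 2) - 0.5 * h2(p)         # mutual information, bits
        return k_B * T * np.log(2.0) * I    # W <= k_B T ln 2 per bit

    for p in (0.0, 0.5, 1.0):
        print(f"p = {p}: W_max = {max_work(p):.3e} J")
    # p = 1 recovers the full k_B T ln 2 of the Szilard engine; p = 0 gives nothing.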

More recent results here http://arxiv.org/abs/0908.0424 and here http://arxiv.org/abs/1009.1630

+ 9 like - 0 dislike

The answer is no. Consider a system that has a degenerate ground state, such that the density matrix is a mixture of two ground-state eigenstates. This has nonzero Shannon entropy, but you can't extract any work from it. More generally, thermodynamic entropy is not really well-defined for nonequilibrium systems, but Shannon entropy is.
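
A minimal sketch of this counterexample (added for illustration, and assuming the standard identification of maximum extractable work with the drop in nonequilibrium free energy $F(\rho) = \langle E \rangle - T S(\rho)$ relative to the Gibbs state): for two degenerate levels the equal mixture is itself the Gibbs state, so the Shannon entropy is a full bit yet the extractable work is zero.

    import numpy as np

    k_B, T = 1.0, 1.0                  # natural units

    def shannon(p):
        """Shannon entropy in nats (with 0 log 0 = 0)."""
        p = p[p > 0]
        return -np.sum(p * np.log(p))

    def gibbs(E):
        """Thermal (Gibbs) distribution over energy levels E."""
        w = np.exp(-E / (k_B * T))
        return w / w.sum()

    def free_energy(p, E):
        """Nonequilibrium free energy <E> - T S."""
        return np.dot(p, E) - k_B * T * shannon(p)

    E = np.array([0.0, 0.0])           # two degenerate ground states
    rho = np.array([0.5, 0.5])         # equal mixture of the two

    print(shannon(rho) / np.log(2))    # 1.0 bit of Shannon entropy
    print(free_energy(rho, E) - free_energy(gibbs(E), E))  # 0.0: no extractable work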

My own take is that thermodynamic entropy and Shannon entropy are two conceptually distinct things. They happen to coincide in a wide variety of circumstances, but not always. It is not even clear to me whether the cases in which they happen to coincide in classical and quantum theory are necessary coincidences. It may be possible to come up with a well-defined physical theory in which they never coincide, e.g. in the convex set framework for generalized probabilistic theories studied in the quantum foundations community.

answered Oct 5, 2011 by Matt Leifer (130 points)
A neat counterexample, Matt! I was thinking a similar thing when reading the question.

@Matt: Good answer, and you get +1 from me, but I do not understand the claim that thermodynamic entropy is not well defined for non-equilibrium systems. It's given by $-k_B \sum_i p_i\log p_i$, where $p_i$ is the probability of the $i^{\text{th}}$ microstate. This is obviously defined for non-equilibrium and equilibrium systems alike.

Actually I guess the issue is clouded by the existence of alternative non-equivalent definitions of entropy.

+ 5 like - 0 dislike

Because the word "work" has multiple meanings, the answer in general is "no".

Typically the first meaning taught to students is thermodynamic, but (as Dirac demonstrated) a generalized meaning (which includes thermodynamic work as a particular case) amounts to:

Given a class of dynamical processes, 'work' is any potential function that naturally describes what that class accomplishes.

Specifically, in the context of isotope separation, Dirac established that the work potential $\mathcal{V}_c$ naturally associated to an isotope concentration $c'$ is given by

$\displaystyle\qquad \mathcal{V}_c(c') = (2 c' - 1) \log\left[\frac{c'}{1-c'}\right]$

or equivalently, for spin polarization $p' = 2 c' - 1$, the Dirac value function $\mathcal{V}_p$ associated to separative transport of spin polarization is

$\displaystyle\qquad\mathcal{V}_p(p') = \mathcal{V}_c(c')\big|_{c' = (1+p')/2} = p' \log\left[\frac{1+p'}{1-p'}\right]$

The key point is that Dirac's value function is not proportional to an entropy difference: $\mathcal{V}_p(p')$ grows without bound as $p' \to 1$, while per-mole entropy ranges over a finite interval.
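
To make the contrast concrete, here is a small numeric sketch (added for illustration, not part of the original answer): $\mathcal{V}_p(p')$ diverges as $p' \to 1$, while the per-particle mixing-entropy drop saturates at $\log 2$.

    import numpy as np

    def V_p(p):
        """Dirac value function for spin polarization p, 0 <= p < 1."""
        return p * np.log((1 + p) / (1 - p))

    def mix_entropy(c):
        """Per-particle mixing entropy (nats) at concentration c."""
        return -(c * np.log(c) + (1 - c) * np.log(1 - c))

    for p in (0.1, 0.5, 0.9, 0.99, 0.999):
        c = (1 + p) / 2                     # polarization back to concentration
        drop = np.log(2) - mix_entropy(c)   # entropy drop, bounded by ln 2
        print(f"p' = {p}: V_p = {V_p(p):7.3f}, entropy drop = {drop:.3f}")
    # V_p grows without bound, while the entropy drop never exceeds ln 2.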

Moreover, the Dirac work associated to separation cannot be reversed to return mechanical work, since the separation process is entropically irreversible. Nonetheless, Dirac work has substantial economic value, and in fact defines the unit of value of a global market in separative work units (SWUs, pronounced "swooz").

For a derivation of the Dirac work function, see Dirac's own (unpublished) technical note "Theory of the separation of isotopes by statistical methods" (circa 1940), which appears in his Collected Works, or alternatively Donald Olander's survey article "Technical Basis of the Gas Centrifuge" (1972), or in general any textbook on isotope separation.

answered Oct 5, 2011 by John Sidles (485 points)
