PhysicsOverflow is a next-generation academic platform for physicists and astronomers, including a community peer review system and a postgraduate-level discussion forum analogous to MathOverflow.


  An entropy of the Wigner function

+ 17 like - 0 dislike
4249 views

Is there an entropy that one can use for the Wigner quasi-probability distribution? (In the sense of a phase-space probability distribution, not just the von Neumann entropy.)

One cannot simply use $-\int W(q,p) \ln\left[ W(q,p) \right] dq\, dp$, as the Wigner function is not everywhere non-negative (it takes negative values for some states).

The motivation behind the question is the following:

The paper I. Białynicki-Birula, J. Mycielski, Uncertainty relations for information entropy in wave mechanics (Comm. Math. Phys. 1975) (or here) contains a derivation of an uncertainty principle based on an information entropy: $$-\int |\psi(q)|^2 \ln\left[|\psi(q)|^2\right]dq-\int |\tilde{\psi}(p)|^2 \ln\left[|\tilde{\psi}(p)|^2\right]dp\geq1+\ln\pi.$$ One of the consequences of the above relation is Heisenberg's uncertainty principle. However, the entropic version also works in more general settings (e.g. on a ring, with the uncertainty relation between position and angular momentum).

As $|\psi(q)|^2=\int W(q,p)dp$ and $|\tilde{\psi}(p)|^2=\int W(q,p)dq$, and in the separable case (i.e. a Gaussian wave function) the Wigner function is just a product of the probability densities in position and in momentum, it is tempting to search for an entropy-like functional fulfilling the following relation: $$1+\ln\pi\leq\text{some_entropy}\left[ W\right]\leq -\int |\psi(q)|^2 \ln\left[|\psi(q)|^2\right]dq-\int |\tilde{\psi}(p)|^2 \ln\left[|\tilde{\psi}(p)|^2\right]dp.$$
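As a quick numerical sanity check (my own sketch, not part of the original question; Python, assuming $\hbar=1$ and the harmonic-oscillator ground state, whose position and momentum densities coincide), the entropic bound above is saturated by a Gaussian:

```python
import numpy as np

# Ground state of the harmonic oscillator (hbar = 1):
# |psi(q)|^2 = exp(-q^2)/sqrt(pi); the momentum density is identical by symmetry.
q = np.linspace(-10.0, 10.0, 4001)
dq = q[1] - q[0]
rho_q = np.exp(-q**2) / np.sqrt(np.pi)

# Differential Shannon entropy (in nats) of the position density
S_q = -np.sum(rho_q * np.log(rho_q)) * dq
total = 2.0 * S_q  # S_q + S_p, equal by symmetry

print(total)              # ~2.1447
print(1 + np.log(np.pi))  # lower bound, ~2.1447: saturated
```

The Gaussian is the unique minimizer here, which is what motivates looking for a phase-space functional squeezed between the bound and the sum of marginal entropies.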

This post has been migrated from (A51.SE)
asked Sep 29, 2011 in Theoretical Physics by Piotr Migdal (1,260 points) [ no revision ]
retagged Mar 7, 2014 by dimension10
I think that the right formula is obtained from the naive $-\int W\ln W$ by replacing the product (between the two factors, and inside the logarithm, when e.g. Taylor-expanded) by the star-product relevant for quantum mechanics. In other words, you first calculate the density operator $\rho$ from the Wigner distribution and then calculate $-{\rm Tr}\, \rho \ln \rho$ out of it.

@LubošMotl: $-\mathrm{Tr}\,\rho \ln \rho$ is the von Neumann entropy, so it is just zero for pure states.

In the Gaussian case, the entropic uncertainty relation is saturated (at least when the state is pure and "squeezed" in the canonical directions). In that case, $\text{some_entropy}[W]=1+\ln \pi$. Of course, there is still room for optimization in the choice of direction, but it is not very interesting. What application do you have in mind?

@FrédéricGrosshans: For the (squeezed) Gaussian case it is simple. When it comes to applications - well, for now I don't have any particular one in mind, besides making a generalization. Maybe the exact form written above is a blind path and one needs to try playing with $W^2$ (which is just proportional to $\text{Tr}[\rho^2]$) or work with deconvolving the Husimi Q distribution (which is non-negative, and whose entropy is bounded from below by the von Neumann entropy plus a constant).

Would a complex-valued entropy be bad? I forgot the exact requirements on the Shannon entropy, but if you don't require a maximum of the entropy but just a stationary point, you could _maybe_ still obtain a useful equation of motion, and the inequality you mention is perhaps still valid if one considers the absolute value. Also, what about $-\int \left(\int W(q,p) dp\right) \ln\left[\int W(q,p) dp\right] dq$ itself?

@Piotr: yes, the von Neumann entropy for pure states is zero. Does it contradict anything I wrote?

@LubošMotl: It's not self-contradicting. However, I have no idea how to use it for my problem as it does not fulfill the last equation.

The entropy-like functional you describe is interesting because it tells us the uncertainty if we make measurements of position and momentum. If you want to find a functional with a smaller value, you need to consider a wider class of measurements. In the phase-space picture it is natural to consider measurements of conjugate pairs of quadratures that are simply rotations of $p$ and $q$. As such, I would consider the minimization of the entropy functional over all Gaussian operations. This could give a smaller value, but clearly not zero.

@Piotr, I see. So let me guess that you won't find any entropy depending on the full distribution in a 2-dimensional way that satisfies your last inequality. Why do you expect such a thing to exist? The vanishing entropy of a pure state is a "real" thing. Maybe if you just add $1+\ln\pi$ to the von Neumann entropy, it would work?


2 Answers

+ 9 like - 0 dislike

The Wigner function is simply a particular representation of a quantum state, and so it only has an entropy in so far as the state does. One can ask whether there are any entropic quantities that have an elegant representation in terms of the Wigner function, and there may well be such quantities. Indeed, the linear entropy $1-\mathrm{tr}(\rho^{2})$, where $\mathrm{tr}(\rho^{2}) \propto \int W (p,q)^{2}\,dq\,dp$, has a neat form. However, like the von Neumann entropy, this will give zero for a pure state! You already stated that you would like an entropy that does not always give zero, so that you can make interesting statements like the above uncertainty relation. However, uncertainty relations crop up when you sum two or more entropy quantities.
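For concreteness, the purity relation can be checked numerically (my own sketch, not from the original answer; with $\hbar=1$ the proportionality becomes the identity $\mathrm{tr}(\rho^2)=2\pi\int W^2\,dq\,dp$) for the oscillator ground state:

```python
import numpy as np

# Phase-space grid (hbar = 1)
x = np.arange(-8.0, 8.0, 0.05)
Q, P = np.meshgrid(x, x)
dA = 0.05**2  # area element dq*dp

# Wigner function of the harmonic-oscillator ground state
W = np.exp(-(Q**2 + P**2)) / np.pi

purity = 2.0 * np.pi * np.sum(W**2) * dA  # tr(rho^2)
linear_entropy = 1.0 - purity

print(purity)          # ~1.0 for a pure state
print(linear_entropy)  # ~0.0, illustrating why this entropy vanishes here
```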

I will expand on my remarks a bit more formally. The classical entropies, like the Shannon entropy, are defined on probability distributions. We can define a quantum mechanical entropy by choosing a measurement that yields a probability distribution over outcomes. For an observable $M$ with eigenvalues $\lambda_{j}$ and projectors $P_{j}$ onto the corresponding subspaces, we can define a probability distribution $X_{M}(\rho)=\{ x_{1}, x_{2},\ldots, x_{j},\ldots \}$ where $x_{j}=\mathrm{tr} ( \rho P_{j} )$. Now we can convert this into an entropy by classical means, such as taking the Shannon entropy $S( X_{M}(\rho))$. If the state is an eigenstate of the measurement basis then the entropy will be zero.
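A toy illustration of this construction (a sketch of my own, using a qubit instead of a continuous variable; the state and bases are chosen only for the example): the measurement entropy $S(X_M(\rho))$ vanishes when the state is an eigenstate of the measured observable, and is maximal in a mutually unbiased basis.

```python
import numpy as np

def measurement_entropy(rho, projectors):
    """Shannon entropy (nats) of the outcome probabilities x_j = tr(rho P_j)."""
    p = np.array([np.real(np.trace(rho @ P)) for P in projectors])
    p = p[p > 1e-12]  # drop zero-probability outcomes (0*log 0 = 0)
    return float(-np.sum(p * np.log(p)))

# Pure state |+> = (|0> + |1>)/sqrt(2)
plus = np.array([1.0, 1.0]) / np.sqrt(2)
minus = np.array([1.0, -1.0]) / np.sqrt(2)
rho = np.outer(plus, plus)

Z_basis = [np.diag([1.0, 0.0]), np.diag([0.0, 1.0])]      # sigma_z eigenbasis
X_basis = [np.outer(plus, plus), np.outer(minus, minus)]  # sigma_x eigenbasis

print(measurement_entropy(rho, Z_basis))  # ln 2: maximally uncertain outcome
print(measurement_entropy(rho, X_basis))  # ~0: eigenstate of the measurement
```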

How does the von Neumann entropy fit into this picture? Well, an equivalent definition to the usual one is the following: $S_{vonN}(\rho)=\min \{ S( X_{M}(\rho)) \mid M=M^{\dagger} \}$, which is simply the minimum possible measurement entropy.

Where do uncertainty relations come in? Well, to have an uncertainty relation we must have two measurement observables that do not commute. If they have no common eigenstates, an uncertainty relation follows by simply adding the two entropies. The inequality you have cited is simply a lower bound on $S_{P}(\rho)+S_{X}(\rho)$, the sum of the position and momentum uncertainties. As noted in the comments, this inequality can be saturated, and so there is no hope of improving on it.

My opinion is that it is meaningless to ask for an entropy outside of a measurement context, and so this is what you need to decide on first. If it really is just position and momentum you're interested in, then I think the cited inequality says all there is!

This is my first attempt at an answer on stack exchange so I hope it is useful!

answered Oct 1, 2011 by Earl (405 points) [ no revision ]
Welcome to TP.SE, Earl. Good to see you here.

@Earl: Thanks. A similar treatment (i.e. $\int W^2$) appears in http://arxiv.org/abs/quant-ph/0203102. When it comes to 'meaninglessness' - well, I don't require this 'entropy' to be physical - just to give a lower bound for the sum of the entropies of the position and momentum densities.

To extend this answer, we remark that the Wigner representation of a quantum state is not unique. Thus if we associate an entropy measure to Wigner representations, and are given a state, and wish to assign an entropy to that state in a physically and mathematically natural way, then (it seems to me) we have to maximize (or minimize?) that entropy measure over all possible Wigner representations of the given state (a task that perhaps is not computationally easy).

@John Sidles I didn't realize that Wigner representations were not unique! Do you have a reference for any examples of this?

Earl, as I recall (perhaps imperfectly) non-unique Wigner representations can be regarded as the large-$j$ limit (for spin-$j$ Hilbert spaces) of non-unique $P$-representations. Perelomov's text Generalized Coherent States and Their Applications (1986) discusses this topic (as I recall).

Helpful indeed, thanks. This is the first no-nonsense and really accessible explanation I have encountered of the relation between the Shannon entropy and the uncertainty relation. (This is just a subjective comment based on my limited knowledge of the field.)

@John Sidles: the Wigner representation is indeed unique (it is a bijective mapping from states to functions on phase space). I should add that Piotr asked another question [here](http://theoreticalphysics.stackexchange.com/q/245/493) which addresses the non-uniqueness of people's _preference_ for quasi-probability functions.

+ 4 like - 0 dislike

The entropy you are looking for may be what is known as the Wehrl entropy. You obtain it by replacing the Wigner function $W_\psi(q,p)$ in $-\int W \log W$ by the Husimi function, which is the convolution of the Wigner function with a Gaussian, $$ H_\psi(q,p) = \frac{1}{\pi} \int W_\psi(q',p') \exp\left( -(q-q')^2 -(p-p')^2 \right) \mathrm{d}q' \mathrm{d} p' . $$ (I've set $\hbar=1$.) Since the Husimi function is non-negative, $$ S_\psi := - \int H_\psi(q,p) \log H_\psi(q,p) \mathrm{d}q \, \mathrm{d} p $$ is well-defined. A good starting point may be Gnutzmann & Zyczkowski: Rényi-Wehrl entropies as measures of localization in phase space, J. Phys. A 34 10123.
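A numerical sketch of this construction (my own, with $\hbar=1$ and the kernel normalization exactly as in the convolution formula above): for the oscillator ground state the convolution can be done with FFTs, and under this particular normalization the Wehrl entropy of the coherent state evaluates to $1+\ln 2\pi$.

```python
import numpy as np

# Phase-space grid (hbar = 1); both Gaussians decay fast, so wrap-around
# in the circular FFT convolution is negligible.
x = np.arange(-8.0, 8.0, 0.05)
dA = 0.05**2
Q, P = np.meshgrid(x, x)

W = np.exp(-(Q**2 + P**2)) / np.pi       # Wigner function of the ground state
kernel = np.exp(-(Q**2 + P**2)) / np.pi  # Gaussian kernel from the formula above

# Husimi function H = W convolved with the kernel, as a discrete FFT convolution;
# ifftshift moves the kernel's centre (x = 0) to index 0.
H = np.fft.ifft2(np.fft.fft2(W) * np.fft.fft2(np.fft.ifftshift(kernel))).real * dA

mask = H > 0  # guard against tiny negative values from FFT round-off
S_wehrl = -np.sum(H[mask] * np.log(H[mask])) * dA

print(S_wehrl)                # ~2.8379
print(1 + np.log(2 * np.pi)) # analytic coherent-state value in these units
```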

answered Oct 6, 2011 by Stefan (40 points) [ no revision ]
Thanks Stefan. Actually, I know the approach but wanted to go a step further. From the Wehrl entropy one can obtain an entropic uncertainty relation for _smeared_ densities in position and momentum (i.e. $\frac{1}{\sqrt{\pi}}\int |\psi(q')|^2 \exp(-(q-q')^2)dq'$ and $\frac{1}{\sqrt{\pi}}\int |\tilde{\psi}(p')|^2 \exp(-(p-p')^2)dp'$, respectively). Unfortunately, such a smeared uncertainty principle is weaker than the usual one _and_ I don't know if there is a way to derive the non-smeared uncertainty principle from the Husimi Q function.


user contributions licensed under cc by-sa 3.0 with attribution required
