
  Information Retrieval

+ 13 like - 0 dislike
2000 views

This question is motivated by the issue of information retrieval from black holes, but it is essentially a question about quantum information.

It is widely believed (in certain circles) that the information about the history of black hole formation is continuously leaked out as the black hole evaporates, and is encoded in the Hawking radiation quanta. This way, no information is lost once the black hole has completely evaporated, and furthermore one does not need to postulate some implausible way in which a macroscopic amount of information is stored in an increasingly microscopic object, only to be somehow released at the very last stage of the evaporation process.

To make this more quantitative, I am wondering whether similar questions have been asked in the quantum information community. Suppose you have an encrypted message of a certain length which is revealed to you over time, and you are trying to decode it. Obviously, the more time passes, the more information you can have about the message, and once you reach a time which scales with the size of the message you expect to gain access to the full information. I'm looking for some more quantitative knowledge about this question. For example:

For a generic decoding, is it known how much time you need in order to gain access to a finite fraction of the information? Are there universal results about the asymptotics of such a process (in the spirit of "critical exponents")?

Are there bounds on optimal encodings whose purpose is to delay the release of a finite fraction of the information to later and later times? In particular, can there be an encoding which releases a finite fraction of the information only at the very last stages of the decoding?

I realize this is probably a hopelessly pedestrian question; pointing me towards some review literature might be the way to go in that case.

Edit: Thanks everyone for your answers; they were all useful in different ways. The question was vague enough not to have a single correct answer, but I'll choose Peter's because he somehow managed to read my mind (though there was no way to tell from the way the question was phrased). I hope to ask more precise questions on related topics in the future.

asked Sep 25, 2011 in Theoretical Physics by Moshe (2,405 points) [ no revision ]
retagged Apr 19, 2014 by dimension10
Dear Moshe, I am very confused by such questions about "getting most of the information by many measurements". You surely agree that for a single physical system, you can't measure the wave function, don't you? The wave function is not an observable: this terminology makes it easy to see that the statement is equivalent to "the wave function can't be measured". A wave function in the $2^N$-dimensional Hilbert space may be measured by at most $N$ independent binary measurements, so you get $N$ classical bits as a result. Because you could have measured with respect to any basis of the complex space, they were qubits.

What I want to say is that if some information is encoded in the eigenvalue of an operator $L$ acting on your system and if you measure any observable $M$ that doesn't commute with $L$, the information about $L$ is immediately compromised ("by a collapse") and you will never be able to get it back. So if you measure different things than those you want to know, you're irreversibly damaging the information content. The potential (probabilistic) information about any observable was there to start with but measurements destroy it.

Lubos, of course you cannot know the state of the quantum system, but there are other, more relevant measures of what you mean by information retrieval. See for example the paper by Hayden and Preskill (black holes as mirrors).

Edited to remove any reference to measurements, which is of course an unnecessarily loaded term in this context.


3 Answers

+ 12 like - 0 dislike

You've clearly read the paper of Hayden and Preskill, Black holes as mirrors: quantum information in random subsystems, since you mention it in the comments. Another quantum information phenomenon related to your question is "information locking" (see this and other papers), where you can arrange things so that all the information in a quantum system of $n$ bits is essentially inaccessible until you get the last $\log n$ bits. There are several papers about "locking" information or entanglement, and I suspect some of your questions may be answered in one of them.
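As a one-qubit cartoon of the locking idea (a toy sketch of my own, not the construction of the locking papers, which concerns accessible information for long messages): encode a message bit $m$ in either the $Z$ basis or the $X$ basis according to a hidden one-bit key $k$. Without the key, the Holevo information about $m$ is only about $0.40$ bits; once the single key bit is revealed, the conditional states are orthogonal and the full message bit becomes recoverable.

```python
# Toy illustration (my own assumed setup): a message bit encoded in the Z or X
# basis depending on a hidden key bit, showing how one extra bit "unlocks" the rest.
import numpy as np

def entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho)."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]                      # drop numerical zeros
    return float(-np.sum(ev * np.log2(ev)))

def proj(v):
    return np.outer(v, v)

z = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]                 # |0>, |1>
x = [np.array([1.0, 1.0]) / np.sqrt(2),
     np.array([1.0, -1.0]) / np.sqrt(2)]                         # |+>, |->

# Without the key, the state carrying message m is an equal mixture over k.
rho_m = {m: 0.5 * proj(z[m]) + 0.5 * proj(x[m]) for m in (0, 1)}
rho_avg = 0.5 * rho_m[0] + 0.5 * rho_m[1]                        # = I/2

chi_no_key = entropy(rho_avg) - 0.5 * (entropy(rho_m[0]) + entropy(rho_m[1]))
print(chi_no_key)   # ~0.40 bits of Holevo information about m without the key

# With the key known, the conditional states ({|0>,|1>} or {|+>,|->}) are
# orthogonal, so the full message bit is recoverable.
```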

answered Sep 26, 2011 by Peter Shor (790 points) [ no revision ]
Indeed, the question is in large part motivated by having heard about "locking" at some point (though not under that name). Now I know where to read about it. This is very useful, thanks.

+ 9 like - 0 dislike

To start out, I think it is important to clarify one issue. If we think of many quantum messages over which the information is spread, we can ask two questions that both seem consistent with your phrasing: 1) If I measure each message as I receive it, what is the information gained from that message, or 2) If I store the quantum messages, how much information can be extracted from the first $n-1$ of them, versus the first $n$ of them?

These problems are not the same, and can have very different answers depending on the encoding. Indeed, quantum messages sent over two or more zero-capacity channels can contain non-zero information (see arXiv:0807.4935 for example). We've only known about weirdness like this for 3 years, so it's worth keeping in mind that older papers might invoke additivity conjectures which have recently been proven false.

As regards the black hole information problem, it is (2) that seems the most relevant. The Holevo information gives an obvious way to bound the information contained in the system, and to calculate an upper bound on the rate of information leakage. This can probably be improved in the black hole setting, since there is no way to introduce additional randomness (which would essentially mean creating information).

The Holevo information is given by $\chi = S(\rho) - \sum_i p_i S(\rho_i)$, where $\rho = \sum_i p_i \rho_i$ is the density matrix for the system, $\rho_i$ is the density matrix used to encode a particular message which occurs with probability $p_i$, and $S(\rho) = -\mbox{Tr}(\rho \log_2 \rho)$ is the von Neumann entropy. The mutual information between the encoded information and the measurement outcomes is bounded from above by this quantity (see the link in Frédéric Grosshans' answer for a more detailed description of Holevo's theorem). Thus, for a given encoding, you can calculate the Holevo information as a function of the number of messages received, which gives you the kind of thing you are looking for.
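As a purely illustrative numerical sketch of this formula (the two-state qubit ensemble below is an arbitrary choice of mine, not anything specific to the black hole problem), $\chi$ can be evaluated directly from the eigenvalues of the density matrices:

```python
# Illustrative sketch (assumed example ensemble): compute the Holevo quantity
# chi = S(rho) - sum_i p_i S(rho_i) for a small ensemble of qubit states.
import numpy as np

def von_neumann_entropy(rho):
    """S(rho) = -Tr(rho log2 rho), computed from the eigenvalues of rho."""
    evals = np.linalg.eigvalsh(rho)
    evals = evals[evals > 1e-12]             # drop numerical zeros
    return float(-np.sum(evals * np.log2(evals)))

def holevo_chi(probs, states):
    """Holevo information of the ensemble {p_i, rho_i}."""
    rho = sum(p * s for p, s in zip(probs, states))
    return von_neumann_entropy(rho) - sum(
        p * von_neumann_entropy(s) for p, s in zip(probs, states))

# Example: the pure states |0> and |+> sent with equal probability.
ket0 = np.array([1.0, 0.0])
ketp = np.array([1.0, 1.0]) / np.sqrt(2)
probs = [0.5, 0.5]
states = [np.outer(ket0, ket0), np.outer(ketp, ketp)]

print(holevo_chi(probs, states))   # ~0.60 bits, strictly below 1 bit
```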

You also ask: "For a generic decoding, is it known how much time you need in order to gain access to a finite fraction of the information? Are there universal results about the asymptotics of such a process (in the spirit of 'critical exponents')?"

The answer to this is that there is a trivial upper bound on the time of infinity, and a lower bound which comes from Holevo's bound: you gain at most $n$ bits of information per $n$ qubits received. Given more information about the process, better bounds are of course probably possible.

The reason for the infinite upper bound is as follows: if you encode quantum information with an error-correcting code which can detect errors on up to $d$ sites, then it is impossible to obtain the correct result for a measurement on the encoded information with probability better than guessing, as long as the measurement is restricted to fewer than $d$ sites.

Now, it is easy to construct codes with $d$ arbitrarily large as long as the encoding can be made even larger, and hence we get the infinite upper bound.

You can use the same trick to make pretty much any distribution you want, as long as it respects the lower bound given by Holevo information (which is tight for some encodings).
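To make the delayed-release point concrete in the simplest possible setting, here is a toy sketch of my own (a classical one-time pad written as density matrices, not a distance-$d$ quantum code): a message bit $m$ is stored as the pair $(m \oplus k, k)$ with a uniformly random key bit $k$. Either share alone is maximally mixed, so its Holevo information about $m$ is zero, while both shares together determine $m$ perfectly.

```python
# Toy sketch (my own assumed example): an encoding that reveals nothing until
# the last share arrives.  Message bit m is encoded as the basis-state mixture
# over (m XOR k, k) with a uniformly random key bit k.
import numpy as np

def ket(bits):
    """Computational-basis state |b1 b2 ... bn> as a vector."""
    v = np.array([1.0])
    for b in bits:
        v = np.kron(v, np.eye(2)[b])
    return v

def mixture(bit_strings, probs):
    """Classical mixture of computational-basis states as a density matrix."""
    return sum(p * np.outer(ket(b), ket(b)) for b, p in zip(bit_strings, probs))

def trace_out_second(rho):
    """Reduced state of the first qubit of a two-qubit density matrix."""
    return np.einsum('ikjk->ij', rho.reshape(2, 2, 2, 2))

# Encoding of message m: equal mixture over the key k of |m^k, k>.
rho_m = {m: mixture([(m ^ k, k) for k in (0, 1)], [0.5, 0.5]) for m in (0, 1)}

# One share only: the reduced states for m = 0 and m = 1 are identical (I/2),
# so a single share carries zero information about the message.
print(np.allclose(trace_out_second(rho_m[0]), trace_out_second(rho_m[1])))  # True

# Both shares: the two encodings have orthogonal supports, hence are perfectly
# distinguishable -- all the information arrives with the final share.
print(np.allclose(rho_m[0] @ rho_m[1], np.zeros((4, 4))))  # True
```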

answered Sep 26, 2011 by Joe Fitzsimons (3,575 points) [ no revision ]
+ 8 like - 0 dislike

There are many bounds, for many different definitions of "information contained in a quantum state". The recent review by Mark M. Wilde, From Classical to Quantum Shannon Theory, arXiv:1106.1445, is a good reference for these concepts. I suppose the notions of quantum typicality (chapter 14) could easily be adapted to your problem.

I would look at your black hole as a composite system, divided into two parts: the black hole itself and the already-released Hawking radiation. You would then study the quantum mutual information between the two (or the classical mutual information, if you insist on making measurements).
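As a minimal numerical sketch of that suggestion (my own illustration, on a toy two-qubit state rather than anything black-hole specific), the quantum mutual information $I(A{:}B) = S(A) + S(B) - S(AB)$ can be computed directly from the reduced density matrices:

```python
# Minimal sketch (assumed toy example): quantum mutual information
# I(A:B) = S(A) + S(B) - S(AB) of a bipartite state, evaluated for a Bell pair.
import numpy as np

def entropy(rho):
    """Von Neumann entropy S(rho) = -Tr(rho log2 rho)."""
    ev = np.linalg.eigvalsh(rho)
    ev = ev[ev > 1e-12]
    return float(-np.sum(ev * np.log2(ev)))

def mutual_information(rho_ab, dim_a, dim_b):
    """I(A:B) for a density matrix on a dim_a x dim_b bipartite system."""
    r = rho_ab.reshape(dim_a, dim_b, dim_a, dim_b)
    rho_a = np.einsum('ikjk->ij', r)         # trace out B
    rho_b = np.einsum('kikj->ij', r)         # trace out A
    return entropy(rho_a) + entropy(rho_b) - entropy(rho_ab)

# Bell state (|00> + |11>)/sqrt(2): maximal correlations, I(A:B) = 2 bits.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)
rho = np.outer(bell, bell)
print(mutual_information(rho, 2, 2))   # -> 2.0
```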

answered Sep 25, 2011 by Frédéric Grosshans (250 points) [ no revision ]
