Where does deleted information go?

+ 9 like - 0 dislike
15081 views

I've heard that, in classical and quantum mechanics, the law of conservation of information holds.

I always wonder where my deleted files and folders go on my computer. They must still be somewhere, I think. Can anyone, in principle, recover them even if I have overwritten my hard drive?

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user luming
asked May 1, 2014 in Theoretical Physics by BaBQ (95 points) [ no revision ]
Consider watching this video by VSauce. It might help you out: youtube.com/watch?v=G5s4-Kak49o

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user mathh
If information is conserved, then the information existed before it was on your computer too! :D I love this stuff!

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user Adam

4 Answers

+ 9 like - 0 dislike

Short Answer

The information is contained in the heat given off by erasing the information. Landauer's principle states that erasing information in a computation, being a thermodynamically irreversible process, must give off heat proportional to the amount of information erased in order to satisfy the second law of thermodynamics. The emitted information is hopelessly scrambled, though, and recovering the original information is impossible in practice. This scrambling of information is what increasing entropy really means in plain English. Charles H. Bennett and Rolf Landauer developed the theory of the thermodynamics of computation; the main results are presented in The thermodynamics of computation—a review.
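To put a number on this, here is a back-of-the-envelope sketch in Python (the room temperature and the 1 TB drive size are assumed figures, purely for illustration) of the minimum heat the Landauer limit demands for an erasure:

```python
import math

# Landauer's principle: erasing one bit must dissipate at least k_B * T * ln 2 of heat.
k_B = 1.380649e-23          # Boltzmann constant, J/K
T = 300.0                   # assumed room temperature, K

heat_per_bit = k_B * T * math.log(2)    # minimum heat per erased bit, joules
bits = 1e12 * 8                         # assumed example: wiping a 1 TB drive

print(f"minimum heat per bit : {heat_per_bit:.3e} J")         # ~2.9e-21 J
print(f"wiping 1 TB, at best : {heat_per_bit * bits:.3e} J")  # ~2.3e-8 J
```

The point is not the tiny number (real drives dissipate vastly more than this lower bound) but that the bound is strictly positive: the erased bits must reappear as heat.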

Background

Erasure of information and the associated irreversibility are macroscopic/thermodynamic phenomena. At the microscopic level everything is reversible and all information is always preserved, at least according to the currently accepted physical theories, though this has been questioned by notable people such as Penrose and, I think, also by Prigogine. Reversibility of the basic physical laws follows from Liouville's theorem for classical mechanics and from the unitarity of the time evolution operator for quantum mechanics. Reversibility implies the conservation of information, since time reversal can then reconstruct any seemingly lost information in a reversible system. The apparent conflict between macroscopic irreversibility and microscopic reversibility is known as Loschmidt's paradox, though it is not actually a paradox.
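In symbols, for the quantum case (a standard statement, not specific to this answer): unitary evolution preserves overlaps, so two distinguishable states can never merge and no information is destroyed:

$$|\langle\phi(t)|\psi(t)\rangle| = |\langle\phi(0)|U^\dagger(t)\,U(t)|\psi(0)\rangle| = |\langle\phi(0)|\psi(0)\rangle|.$$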

In my understanding it is sensitivity to initial conditions, the butterfly effect, that reconciles macroscopic irreversibility with microscopic reversibility. Suppose time reverses while you are scrambling an egg. The egg should then just unscramble, like in a film running backwards. However, the slightest perturbation, say hitting a single molecule with a photon, will start a chain reaction, as that molecule will collide with different molecules than it otherwise would have. Those will in turn have different interactions than they otherwise would have, and so on. The trajectory of the perturbed system will diverge exponentially from the original time-reversed trajectory. At the macroscopic level the unscrambling will initially continue, but a region of rescrambling will start to grow from where the photon struck and swallow the whole system, leaving a completely scrambled egg.

This shows that time-reversed states of non-equilibrium systems are statistically very special: their trajectories are extremely unstable and impossible to prepare in practice. The slightest perturbation of a time-reversed non-equilibrium system causes the second law of thermodynamics to kick back in.
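A minimal numerical sketch of this reconciliation (Arnold's cat map is my stand-in for the egg; the grid size and points are arbitrary assumptions): the dynamics is exactly invertible, so "time reversal" reconstructs the past perfectly, yet a one-cell perturbation diverges exponentially.

```python
# Arnold's cat map on an N x N grid: exactly invertible ("information is conserved"),
# yet chaotic, so nearby trajectories separate exponentially (the butterfly effect).
N = 1009   # assumed grid size

def step(p):
    x, y = p
    return ((2 * x + y) % N, (x + y) % N)    # forward map

def unstep(p):
    x, y = p
    return ((x - y) % N, (2 * y - x) % N)    # exact inverse: "time reversal"

def sep(p, q):
    # separation on the periodic grid (max norm)
    return max(min(abs(a - b), N - abs(a - b)) for a, b in zip(p, q))

start = (123, 456)
state = start
for _ in range(20):
    state = step(state)       # scramble
for _ in range(20):
    state = unstep(state)     # unscramble by exact reversal
print("exact reversal recovers the start:", state == start)   # True

a, b = start, (start[0] + 1, start[1])   # perturb by a single grid cell
for n in range(1, 9):
    a, b = step(a), step(b)
    print(f"step {n}: separation = {sep(a, b)}")   # grows ~2.6x per step, then saturates
```

Exact reversal reconstructs the past perfectly; the perturbed copy does not, which is the cat-map analogue of the photon restarting the scrambling.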

The above thought experiment also illustrates the Boltzmann brain paradox, in that it makes it seem that a partially scrambled egg is more likely to arise from the spontaneous unscrambling of a completely scrambled egg than from the breaking of an intact one: if trajectories leading to an intact egg in the future are extremely unstable, then by reversibility, so must be trajectories originating from one in the past. Therefore the vast majority of possible past histories leading to a partially scrambled state must do so via spontaneous unscrambling. This problem is not yet satisfactorily resolved, particularly its cosmological implications, as can be seen by searching arXiv and Google Scholar.

Nothing in this depends on any non-classical effects.

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user Daniel Mahler
answered May 1, 2014 by Daniel Mahler (255 points) [ no revision ]
Most voted comments
@WetSavannaAnimalakaRodVance I can't tell you right now. I only have a very vague memory of this statement. I will check that.

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user luming
@luming I have added a paragraph explaining my understanding of how irreversibility arises at the macroscopic level.

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user Daniel Mahler
@DanielMahler in the future could you condense your editing so as to keep the total number of edits down?

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user David Z
@DavidZ Does it cause a problem? Is there a way to save edits without publishing them?

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user Daniel Mahler
@DanielMahler yes, every edit bumps the question to the top of the front page, taking attention from other questions, and also it makes it harder to parse the revision history when someone wants to do that. There's no way to save edits on the site, but you can use another program if you really want to; all I'm really saying is to wait until you have something major to fix, and then go through and make all the improvements you can find all in one go, rather than changing little things as you think of them.

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user David Z
Most recent comments
Thank you very much for your detailed and insightful answer. Can you explain why coarse-graining of phase space explains the irreversibility of macroscopic processes? I thought that coarse-graining was invented by hand because humans do not know the exact trajectory in phase space, whereas mother nature knows it well.

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user luming
@luming Coarse graining was originally invented exactly as you say; however, the idea finds rigorous grounding in the separability of quantum Hilbert state spaces, as explained by Emilio Pisanty's answer to this question here. So, even though there is an infinite number of possible quantum states, they are countably infinite, with nonzero energy steps in between. So, the more information you want to encode in matter, the hotter you need to make it. I am not sure coarse graining, though, explains ....

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user WetSavannaAnimal aka Rod Vance
+ 3 like - 0 dislike

Deleting data only allows other data to be saved where that data was previously stored. In other words, deletion is not what it seems; it just frees up the space that was otherwise occupied. If you save something on top of this space (overwrite it), then the old data becomes more difficult to recover with HD recovery tools. So yes, it may be possible to recover data even after it has been overwritten, but the probability of recovering it decreases each time the space is overwritten.
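A toy sketch of the point being made here (all names and the block layout are invented for illustration; no real filesystem works exactly like this): "deletion" merely forgets the name and marks the blocks free, so the bytes survive until something reuses them.

```python
# Toy model of file "deletion": removing a file only marks its blocks free;
# the bytes stay on disk until new data happens to reuse those blocks.
class ToyDisk:
    def __init__(self, n_blocks):
        self.blocks = [b""] * n_blocks
        self.free = list(range(n_blocks))   # stack of free block numbers
        self.files = {}                     # filename -> list of block numbers

    def write(self, name, chunks):
        idx = [self.free.pop() for _ in chunks]
        for i, data in zip(idx, chunks):
            self.blocks[i] = data
        self.files[name] = idx

    def delete(self, name):
        # "Deletion": forget the name, mark the blocks reusable. Bytes untouched.
        self.free.extend(self.files.pop(name))

    def scavenge(self):
        # What a recovery tool does: read the raw blocks, directory or no directory.
        return [b for b in self.blocks if b]

disk = ToyDisk(8)
disk.write("secrets.txt", [b"my pass", b"word"])
disk.delete("secrets.txt")
print(disk.scavenge())                # [b'word', b'my pass'] -- fully recoverable
disk.write("new.bin", [b"\x00" * 4])  # reuses a freed block, destroying part of the data
print(disk.scavenge())                # only a fragment of the old file survives now
```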

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user user3138766
answered May 1, 2014 by user3138766 (30 points) [ no revision ]
I think that you've addressed the side of the question that was not asked! We've seen that happen before.

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user dotancohen
On the contrary, this is the only correct answer. Data in modern computer systems aren't ever deleted in the sense described above. The amount of energy expended in deletion depends on what you delete. On a disk, it will be the equivalent of writing some new bits that mark the old data as no longer visible to the disk operating system. In RAM, it's more complicated because volatile memory is refreshed periodically and therefore calculating energy expended depends on refresh rate and refresh schemes. In either case, I suspect the energy to write a 1 bit is different than that to write a 0 bit.

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user ssamuel
+ 3 like - 0 dislike

To add to Daniel Mahler's excellent answer, and referring to the same reference: Charles Bennett, "The Thermodynamics of Computation: A Review", Int. J. Theor. Phys. 21, No. 12, 1982.

A simple summary of Daniel's answer, and also of your question "where has the information gone?", is: after deletion, the information is encoded in the physical states of the matter (the quantum states of "stuff") that makes up the computer and its environment. As Daniel says, physics at a microscopic level is reversible, so you can imagine the deletion process as a "movie" (albeit one with a stupendously complicated plot) in which the changes of state in the deleted memory chips influence the matter around the system so that the latter changes state subtly. Nature does not forget how it got into its state at any time; more formally, the state of the World is a one-to-one (bijective) function of its state at any other time. So you can, in principle, now run your movie backwards and watch the configurations of the matter around the memory chips restore their former values.

In the paper that Daniel and I cite, Bennett invents perfectly reversible mechanical gates ("billiard ball computers") whose state can be polled without the expenditure of energy, and then uses such mechanical gates to thought-experimentally study the Szilard engine and to show that Landauer's limit arises not from the cost of finding out a system's state (as Szilard had originally assumed) but from the need to continually "forget" former states of the engine.

Probing this idea more carefully, as is also done in Bennett's paper: one can indeed conceive of simple, non-biological finite state machines that realise the Maxwell Daemon (this has been done in the laboratory! see the end of this answer), and as the Daemon converts heat to work, it must record a sequence of bits describing which side of the Daemon's door (or of the engine's piston, for an equivalent discussion of the Szilard engine) the molecules were on. For a finite-memory machine, one eventually needs to erase the memory so that the machine can keep working.

However, "information" ultimately is not abstract; it needs to be "written in some kind of ink", you might say, and that ink is the states of physical systems. The fundamental laws of physics are reversible, so that one can in principle compute any former state of a system from full knowledge of any future state; nothing gets lost. So, if the finite state machine's memory is erased, the information encoded in that memory must show up, recorded somehow, as changes in the states of the physical system making up and surrounding the physical memory.

So now those physical states behave like a memory: eventually they can encode no more information, and the increased thermodynamic entropy of that physical system must be thrown out of the system, with the work expenditure required by the second law, before the Daemon can keep working. The need for this work is begotten of the need to erase information, and is the ultimate justification for Landauer's principle.
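The bookkeeping, in one line (standard textbook figures, not specific to Bennett's paper): the Szilard engine extracts at most \(k_B T\ln 2\) of work per recorded bit, and Landauer erasure of that bit costs at least the same, so over a full cycle the Daemon can at best break even:

$$W_{\text{extracted}} \le k_B T \ln 2 \le W_{\text{erasure}} \quad\Longrightarrow\quad W_{\text{net}} = W_{\text{extracted}} - W_{\text{erasure}} \le 0.$$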

Going back to your computer: you can even do some back-of-the-envelope calculations as to what would happen if you could reversibly store every step of a calculation in a bid to get around the Landauer limit; see Emilio Pisanty's bounding of the information content of the human brain. You end up with your head blasting off in a scene that evokes a Dr Who regeneration (after Christopher Eccleston), with ion beams streaming from the neck! Also note, however, that even reversible computers need to erase information to initialise their states at the beginning of any computation. The uninitialised states need to be encoded in the states of physical systems too during the power-up process.

Beyond heads blowing off from stupendously thermalised matter, there is also the Bekenstein bound from the field of black hole thermodynamics (see the Wikipedia page of that name), which is the maximum amount of information that can be encoded in a region of space with radius $R$ containing mass-energy $E$:

$$I\leq \frac{2\,\pi\,R\,E}{\hbar\,c\,\log 2}$$

where $I$ is the number of bits contained in the quantum states of that region of space. This bound was derived from a thought experiment wherein Bekenstein imagined lowering objects into black holes (see this question), deducing the above bound by assuming that the second law of thermodynamics holds. It works out to about $10^{42}$ bits to specify the full quantum state of an average-sized human brain. This is to be compared with estimates of the Earth's total computer storage capacity, which is variously reckoned to be of the order of $10^{23}$ bits (see the Wikipedia "Zettabyte" page, for instance) as of writing (2013).
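As a sanity check on that $10^{42}$ figure, here is the arithmetic for the bound above (the brain mass and radius are my assumed round numbers):

```python
import math

# Bekenstein bound: I <= 2*pi*R*E / (hbar * c * ln 2), in bits.
hbar = 1.054571817e-34    # reduced Planck constant, J s
c = 2.99792458e8          # speed of light, m/s

m = 1.5                   # kg -- assumed round figure for a human brain's mass
R = 0.1                   # m  -- assumed round figure for its radius
E = m * c**2              # rest mass-energy, J

I = 2 * math.pi * R * E / (hbar * c * math.log(2))
print(f"I <= {I:.2e} bits")    # ~3.9e+42 bits, i.e. of order 10^42
```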

So, ultimately, a computer erasing its memory throughout its computations, yet sundered from the rest of the Universe, would eventually fill up all the information-encoding capacity of any finite region of space.

You might also like to browse a couple of articles I wrote on my website

Information is Physical: Landauer’s Principle and Information Soaking Capacity of Physical Systems

and

Free Energies: What Does a Physical Chemist Mean when He/She Talks of Needing Work to Throw Excess Entropy Out of a Reaction?

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user WetSavannaAnimal aka Rod Vance
answered May 1, 2014 by WetSavannaAnimal (485 points) [ no revision ]
I really appreciate that you know so much and that you are sharing it. With so much material, though, there is a chance of the OP missing the significance of some important points. I think it would be helpful if you could highlight the important points at the end, under a suitable heading, separated from the rest. It is a pleasure to read your answers; thank you.

This post imported from StackExchange Physics at 2014-05-25 06:51 (UCT), posted by SE-user Godparticle
+ 0 like - 3 dislike

Conservation of information in quantum mechanics is as much of a hypocrisy as it is in classical mechanics. It is not conserved in the same sense as energy, charge, or momentum. When sages like Hawking and Penrose discuss whether “information” survives destruction in a spacetime singularity, they mean something utterly different from the concept familiar to an information engineer.

Reconciling conservation laws with quantum uncertainty is not an easy thing, but energy, charge, and momentum are observable. We can speculate about quantum states with wider or narrower uncertainty in specific observables, but we can measure them. Which quantities are observable in quantum mechanics, and how do they behave? The classification of physical quantities as intensive and extensive is well known: does a quantity behave like an average value over many parts of space, or does it accumulate? Let us borrow this traditional terminology and speak of space-intensive and space-extensive quantities, because the same line of thought can be applied to the alternative quantum realities generated by a quantum measurement instead of to parts of 3-dimensional space. Observable quantities must be multiverse-intensive. This does not mean arithmetic mean values (in the sense of either probability theory or QFT), but rather that measurement of an observable with an uncertain value separates its possible values into different branches of the future, and does not create completely new values.

Suppose we had a particle with an initially uncertain momentum, and after an experiment this original momentum became more certain. There are possibly other outcomes, with other values of the momentum, but the different “versions” of the experimenter can't communicate; each of them sees its own value (or range of values) of the original momentum. They do not split the original momentum into parts in the sense of addition; they split the multiple values of it that were possible in the initial quantum state, possibly with different probabilities. Thus, momentum is multiverse-intensive, although space-extensive. That's why we can observe it consistently.

Now, suppose we had a state of a quantum computer (that supposedly contained some “information”) and performed a measurement on it. Was the “information” conserved? Yes and no, depending on the point of view. Quantum mechanics now has several “versions” of the experimenter in superposition (so it was conserved), whereas each “version” sees only its own piece of the original state (so it was not). Why can't they communicate? A quantum informatist would say that decoherence happened. So, the “information” is conserved for quantum mechanics, but a large part of it… disappeared for us, the observers. We can't observe this quantity consistently, and there is no practical sense in discussing its conservation.

Okay, what do the aforementioned quantum sages actually mean? They discuss whether these quantum states, with their treacherous superposition, remain reliable in situations where spacetime itself is destroyed. They might or might not; either way, it changes little in the problems of an overwritten disk drive or, say, a human death.

answered Aug 21, 2014 by Incnis Mrsi (-15 points) [ revision history ]
edited Aug 21, 2014 by Incnis Mrsi

user contributions licensed under cc by-sa 3.0 with attribution required
