To add to Daniel Maher's excellent answer, and referring to the same reference: Charles Bennett, "The Thermodynamics of Computation: A Review", Int. J. Theor. Phys. 21, No. 12, 1982.
A simple summary of Daniel's answer, and of your question "where has the information gone?", is this: after deletion, the information is encoded in the *physical states* of the matter (the quantum states of "stuff") that makes up the computer and its environment. As Daniel says, physics at a microscopic level is reversible, so you can imagine the deletion process as a "movie" (albeit one with a stupendously complicated plot) wherein the changes of state in the deleted memory chips influence the matter around the system so that the latter changes state subtly. Nature never forgets how it got into its current state - or, more formally, the state of the World is a one-to-one (bijective) function of its state at any other time. So you can, in principle, run your movie backwards and watch the configurations of the matter around the memory chips restore the chips' former values.
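To make the "run the movie backwards" idea concrete, here is a minimal sketch in Python (a made-up toy dynamics for illustration, not a physical model): the update rule is a bijection on the state space, so every step can be exactly undone.

```python
# Toy reversible dynamics: each operation below is a bijection on
# 64-bit states, so the composite "frame" step() is too, and nothing
# about the past is ever lost. (Hypothetical toy map, not physics.)
MASK = (1 << 64) - 1
GOLDEN = 0x9E3779B97F4A7C15

def step(x):
    """One forward frame of the movie."""
    x = (x + GOLDEN) & MASK   # addition mod 2^64 is invertible
    x ^= x >> 33              # xor-shift by >= half the word width
    return x                  # is its own inverse

def unstep(x):
    """The same frame run backwards: exactly undoes step()."""
    x ^= x >> 33
    return (x - GOLDEN) & MASK

state = 2013
for _ in range(1000):   # run the movie forward ...
    state = step(state)
for _ in range(1000):   # ... then backwards: Nature has not forgotten
    state = unstep(state)
assert state == 2013
```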
In the paper that Daniel and I cite, Bennett invents perfectly reversible mechanical gates ("billiard ball computers") whose state can be polled without the expenditure of energy, and then uses such mechanical gates to study the Szilard engine thought-experimentally and to show that Landauer's limit arises not from the cost of finding out a system's state (as Szilard had originally assumed) but from the need to continually "forget" former states of the engine.
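For a concrete taste of such a gate, here is a sketch of the standard Fredkin (controlled-swap) gate, which the billiard-ball model realises - this is an illustration of the general idea, not Bennett's full construction:

```python
from itertools import product

def fredkin(c, a, b):
    """Fredkin (controlled-swap) gate: if c == 1, swap a and b.
    It is reversible (its own inverse) and conservative (it preserves
    the number of 1 bits, just as collisions preserve billiard balls)."""
    return (c, b, a) if c else (c, a, b)

states = list(product((0, 1), repeat=3))
assert len({fredkin(*s) for s in states}) == 8            # a bijection
assert all(fredkin(*fredkin(*s)) == s for s in states)    # self-inverse

# Ordinary logic without erasure: feeding (x, y, 0) puts x AND y on the
# third wire; the other outputs are the "garbage" that keeps the gate
# reversible -- exactly the bits a Daemon must eventually pay to forget.
for x, y in product((0, 1), repeat=2):
    assert fredkin(x, y, 0)[2] == (x & y)
```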
Probing this idea more carefully, as Bennett does in his paper: one can indeed conceive of simple, non-biological finite state machines that realise the Maxwell Daemon - this has even been done in the laboratory (see at the end) - and, as the Daemon converts heat to work, it must record a sequence of bits describing which side of the Daemon's door (or the engine's piston, for the equivalent discussion of the Szilard engine) the molecules were on. For a finite-memory machine, one eventually needs to erase the memory so that the machine can keep working.

However, "information" is ultimately not abstract - it needs to be "written in some kind of ink", you might say - and that ink is the states of physical systems. The fundamental laws of physics are reversible, so one can in principle compute any former state of a system from full knowledge of any future state - nothing gets lost. So, if the finite state machine's memory is erased, the information encoded in that memory must show up, recorded somehow, as changes in the states of the physical systems making up and surrounding the physical memory. Those physical states now behave like a memory themselves: eventually they can encode no more information, and the increased thermodynamic entropy of that physical system must be thrown out of the system, at the work expenditure required by the Second Law, before the Daemon can keep working. The need for this work is begotten of the need to erase information, and it is the ultimate justification for Landauer's principle.
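To put a number on that work, here is a quick sketch of Landauer's formula $E \geq k_B\,T\,\ln 2$ per erased bit (the constants are standard; the one-gigabyte example is my own illustration):

```python
from math import log

k_B = 1.380649e-23        # Boltzmann constant, J/K
T = 300.0                 # room temperature, K

E_bit = k_B * T * log(2)  # Landauer limit per erased bit
print(f"per bit at {T:.0f} K: {E_bit:.3e} J")   # ~2.87e-21 J

E_GB = E_bit * 8e9        # erasing one gigabyte (8e9 bits)
print(f"per gigabyte:      {E_GB:.3e} J")       # ~2.3e-11 J
```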
Going back to your computer: you can even do some back-of-the-envelope calculations as to what would happen if you could reversibly store every step of a calculation in a bid to get around the Landauer limit - see Emilio Pisanty's bounding of the information content of the human brain. You end up with your head blasting off in a scene that evokes a Dr Who regeneration (after Christopher Eccleston) with ion beams streaming from the neck! Note, too, however, that even reversible computers need to erase information to initialise their states at the beginning of any computation: the uninitialised states must likewise be encoded in the states of physical systems during the power-up process.
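For flavour, a toy follow-on to the sketch above (my own back-of-the-envelope using Landauer's formula, not Emilio Pisanty's Bekenstein-style argument): the minimum heat released by erasing the roughly $10^{42}$ bits quoted below for a brain's full quantum state.

```python
from math import log

k_B, T = 1.380649e-23, 300.0   # J/K, K
E_bit = k_B * T * log(2)       # ~2.87e-21 J per erased bit

N = 1e42                       # ~ bits in a brain's full quantum state
E = E_bit * N                  # ~2.9e21 J of heat, at the very least
print(f"{E:.1e} J ~= {E / 4.184e15:.0f} megatons of TNT")  # ~7e5 megatons
```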
Beyond heads blowing off from stupendously thermalised matter, there is also the Bekenstein bound from the field of black hole thermodynamics (see the Wiki page with this name), which is the maximum amount of information that can be encoded in a region of space of radius $R$ containing mass-energy $E$. It is:
$$I\leq \frac{2\,\pi\,R\,E}{\hbar\,c\,\log 2}$$
where $I$ is the number of bits contained in the quantum states of that region of space. This bound was derived through a thought experiment wherein Bekenstein imagined lowering objects into black holes (see this question), and then deduced by assuming that the second law of thermodynamics holds. It works out to about $10^{42}$ bits to specify the full quantum state of an average-sized human brain. This is to be compared with estimates of the Earth's total computer storage capacity, variously reckoned to be of the order of $10^{23}$ bits as of this writing in 2013 (see the Wikipedia "Zettabyte" page, for instance).
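Plugging rough numbers into the bound (a sketch; the brain's radius and mass here are order-of-magnitude guesses, not measured values):

```python
from math import pi, log

hbar = 1.054571817e-34   # reduced Planck constant, J s
c = 2.99792458e8         # speed of light, m/s

def bekenstein_bits(R, M):
    """Maximum bits encodable in a sphere of radius R (m) containing
    mass-energy E = M c^2 (M in kg): I <= 2 pi R E / (hbar c ln 2)."""
    return 2 * pi * R * (M * c**2) / (hbar * c * log(2))

# Assumed order-of-magnitude figures for a human brain:
print(f"{bekenstein_bits(R=0.1, M=1.5):.1e} bits")   # ~4e42 bits
# versus ~1e23 bits for the Earth's total storage (zettabyte scale)
```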
So, ultimately, a computer that erases its memory throughout its computations, yet is sundered from the rest of the Universe, would eventually fill up all the information-encoding capacity of any finite region of space.
You might also like to browse a couple of articles I wrote on my website:
Information is Physical: Landauer’s Principle and Information Soaking Capacity of Physical Systems
and
Free Energies: What Does a Physical Chemist Mean when He/She Talks of Needing Work to Throw Excess Entropy Out of a Reaction?