The answer is given by the covariant entropy bound (CEB), also referred to as the Bousso bound after Raphael Bousso, who first proposed it. The CEB sounds very similar to the holographic principle (HP), in that both relate the dynamics of a system to what happens on its boundary, but the similarity ends there.
The HP suggests that the physics (specifically supergravity, or SUGRA) in a $d$-dimensional spacetime can be mapped to the physics of a conformal field theory living on its $(d-1)$-dimensional boundary.
The CEB is more along the lines of the Bekenstein-Hawking entropy formula, which says that the entropy of a black hole is proportional to the area of its horizon:

$$S = \frac{kA}{4}$$

(with $A$ measured in Planck units).
To cut a long story short, the maximum information that you can store in $1\,\mathrm{cc} = 10^{-6}\,\mathrm{m}^3$ of space is proportional to the area of its boundary. For a uniform spherical volume, that area is (up to factors of order one):

$$A \sim V^{2/3} = 10^{-4}\,\mathrm{m}^2$$
Therefore the maximum information (number of bits) you can store is approximately:

$$S \sim \frac{A}{A_{pl}}$$

where $A_{pl}$ is the Planck area, $\sim 10^{-70}\,\mathrm{m}^2$. For our 1 cc volume this gives $S_{max} \sim 10^{66}$ bits.
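As a quick sanity check, the arithmetic above can be reproduced numerically; the values below are just the same order-of-magnitude figures used in the text, not precise physical constants:

```python
import math

# Order-of-magnitude inputs from the estimate above (illustrative values)
V = 1e-6          # volume: 1 cc in m^3
A = V ** (2 / 3)  # boundary area of a uniform spherical volume, ~1e-4 m^2
A_pl = 1e-70      # Planck area in m^2 (order of magnitude)

# Covariant entropy bound estimate: S ~ A / A_pl
S_max = A / A_pl
print(f"A ~ {A:.0e} m^2, S_max ~ {S_max:.0e} bits")  # ~1e66 bits
```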
Of course, this is a rough order-of-magnitude estimate, but it lies in the general ballpark and gives you an idea of the limit in question. As you can see, we still have decades, if not centuries, before our technology can saturate this bound!
Cheers,
Edit: Thanks to @mark for pointing out that $1\,\mathrm{cc} = 10^{-6}\,\mathrm{m}^3$ and not $10^{-9}\,\mathrm{m}^3$. Since $S \propto A \sim V^{2/3}$, this changes the final result by two orders of magnitude.
On Entropy and Planck Area
In response to @david's observations in the comments let me elaborate on two issues.
Planck Area: From LQG (loop quantum gravity), and also string theory, we know that geometric observables such as area and volume are quantized in any theory of gravity. This result holds at the kinematical level and is independent of the actual dynamics. The quantum of area, as one would expect, is of order $\ell_{pl}^2$, where $\ell_{pl}$ is the Planck length. In quantum gravity the dynamical entities are precisely these area elements, to each of which one associates a spin variable $j$, where generally $j = 1/2$ (the lowest nontrivial rep of $SU(2)$). Each spin can carry a single qubit of information. Thus it is natural to associate a single unit of information with each Planck area.
Entropy as a measure of Information: There is a great misunderstanding in the physics community regarding the relationship between entropy $S$, usually described as a measure of disorder, and useful information $I$, such as that stored on a chip, an abacus, or any other device. In fact, they are one and the same. I remember being laughed out of a physics chat room once for saying this, so I don't expect anyone to take it at face value.
But think about this for a second (or two). What is entropy?
$$S = k_B \ln(N)$$
where $k_B$ is Boltzmann's constant and $N$ is the number of accessible microstates of the system. For a gas in a box, for example, $N$ is the number of different ways to distribute the molecules in a given volume. If we could actually use a gas chamber as an information storage device, each of these configurations would correspond to a unit of memory. Or consider a spin chain with $m$ spins. Each spin can take two (classical) values, $\pm 1/2$. Using a spin to represent a bit, we see that a spin chain of length $m$ can encode $2^m$ different numbers. The corresponding entropy is:
$$S \sim \ln(2^m) = m \ln(2) \propto \text{number of bits}$$
since we have identified each spin with a bit (more precisely, a qubit). Therefore we can safely say that the entropy of a system is proportional to the number of bits required to describe it, and hence to its storage capacity.
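The spin-chain counting above is easy to verify directly; the chain length $m = 8$ below is an arbitrary illustrative choice:

```python
import math

# Spin chain with m spins: each spin holds one (qu)bit
m = 8
N = 2 ** m              # number of distinct configurations, 2^m

# Entropy in units of k_B: S = ln(N) = m ln 2
S = math.log(N)

# Dividing by ln 2 recovers the number of bits, i.e. m itself
bits = S / math.log(2)
print(f"N = {N}, S = {S:.3f} k_B, bits = {bits:.0f}")
```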
This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346