
Maximum theoretical data density

+ 10 like - 0 dislike
5866 views

Our ability to store data on or in physical media continues to grow, with the maximum amount of data you can store in a given volume increasing exponentially from year to year. Storage devices keep getting smaller, and their capacity keeps getting bigger.

This can't continue forever, though, I would imagine. "Things" can only get so small; but what about information? How small can a single bit of information be?

Put another way: given a limited physical space -- say 1 cubic centimeter -- and without assuming more dimensions than we currently have access to, what is the maximum amount of information that can be stored in that space? At what point does the exponential growth of storage density come to such a conclusive and final halt that we have no reason to even attempt to increase it further?

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user tylerl
asked Dec 27, 2010 in Theoretical Physics by tylerl (50 points) [ no revision ]
this is a great question having to do with Bousso's covariant entropy bound - see my answer

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
a hydrogen atom has infinitely many energy eigenstates...

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Mark Eichenlaub
@MarkEichenlaub But surely the higher and higher energy eigenstates fill up more and more space: IIRC there is no bound on the eigenstate "size" as you go higher in energy.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user WetSavannaAnimal aka Rod Vance

3 Answers

+ 9 like - 0 dislike

The answer is given by the covariant entropy bound (CEB) also referred to as the Bousso bound after Raphael Bousso who first suggested it. The CEB sounds very similar to the Holographic principle (HP) in that both relate the dynamics of a system to what happens on its boundary, but the similarity ends there.

The HP suggests that the physics (specifically supergravity, or SUGRA) in a $d$-dimensional spacetime can be mapped to the physics of a conformal field theory living on its $(d-1)$-dimensional boundary.

The CEB is more along the lines of the Bekenstein bound which says that the entropy of a black hole is proportional to the area of its horizon:

$$ S = \frac{k_B A}{4 l_{pl}^2} $$

To cut a long story short, the maximum information that you can store in $1\,\text{cc} = 10^{-6}\,\text{m}^3$ of space is proportional to the area of its boundary. For a uniform spherical volume, that area is roughly:

$$ A \sim V^{2/3} = 10^{-4}\,\text{m}^2 $$

Therefore the maximum information (# of bits) you can store is approximately given by:

$$ S \sim \frac{A}{A_{pl}} $$

where $A_{pl}$ is the Planck area, $\sim 10^{-70}\,\text{m}^2$. For our $1\,\text{cc}$ volume this gives $S_{max} \sim 10^{66}$ bits.

Of course, this is a rough order-of-magnitude estimate, but it lies in the general ballpark and gives you an idea of the limit that you are talking about. As you can see, we still have decades if not centuries before our technology can saturate this bound!
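The arithmetic above is easy to check with a few lines of Python. The only inputs are the Planck-length value and the $A \sim V^{2/3}$ area estimate, and the result is order-of-magnitude only:

```python
# Order-of-magnitude check of the holographic bound for 1 cc of space.
l_planck = 1.616e-35            # Planck length in metres (CODATA value)
A_planck = l_planck ** 2        # Planck area, ~2.6e-70 m^2

V = 1e-6                        # 1 cubic centimetre expressed in m^3
A = V ** (2.0 / 3.0)            # boundary-area scale, 1e-4 m^2

S_max = A / A_planck            # maximum number of bits, up to O(1) factors
print(f"S_max ~ {S_max:.1e} bits")   # roughly 10^66 bits
```

This gives a few times $10^{65}$, consistent with the quoted $S_{max} \sim 10^{66}$ once the unfixed $O(1)$ factors are allowed for.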

Cheers,

Edit: Thanks to @mark for pointing out that $1 cc = 10^{-6} m^3$ and not $10^{-9} m^3$. Changes final result by three orders of magnitude.

On Entropy and Planck Area

In response to @david's observations in the comments let me elaborate on two issues.

  1. Planck Area: From LQG (loop quantum gravity), and also string theory, we know that geometric observables such as area and volume are quantized in any theory of gravity. This result is at the kinematical level and is independent of what the actual dynamics are. The quantum of area, as one would expect, is of the order of $\sim l_{pl}^2$, where $l_{pl}$ is the Planck length. In quantum gravity the dynamical entities are precisely these area elements, to which one associates a spin variable $j$, where generally $j = 1/2$ (the lowest representation of SU(2)). Each spin can carry a single qubit of information. Thus it is natural to associate the Planck areas with a single unit of information.

  2. Entropy as a measure of Information: There is a great misunderstanding in the physics community regarding the relationship between entropy $S$ - usually described as a measure of disorder - and useful information $I$, such as that stored on a chip, an abacus, or any other device. However, they are one and the same. I remember being laughed out of a physics chat room once for saying this, so I don't expect anyone to take this at face value.

But think about this for a second (or two). What is entropy?

$$ S = k_B \ln(N) $$

where $k_B$ is Boltzmann's constant and $N$ is the number of microscopic degrees of freedom of the system. For a gas in a box, for example, $N$ corresponds to the number of different ways to distribute the molecules in a given volume. If we were able to actually use a gas chamber as an information storage device, then each one of these configurations would correspond to a unit of memory. Or consider a spin chain with $m$ spins. Each spin can take two (classical) values, $\pm 1/2$. Using a spin to represent a bit, we see that a spin chain of length $m$ can encode $2^m$ different numbers. What is the corresponding entropy?

$$ S \sim \ln(2^m) = m \ln(2) \sim \textrm{number of bits} $$

since we have identified each spin with a bit (more precisely qubit). Therefore we can safely say that the entropy of a system is proportional to the number of bits required to describe the system and hence to its storage capacity.
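The spin-chain identification above can be verified numerically. This sketch (with $k_B = 1$, so entropy is dimensionless) just confirms that $S = \ln(2^m)$ recovers the bit count $m$:

```python
import math

# Entropy of an m-spin chain vs. its storage capacity (k_B = 1).
m = 64                          # number of spins, i.e. bits
N = 2 ** m                      # number of distinct configurations, 2^64
S = math.log(N)                 # entropy ln(N) = m ln(2), in nats
bits = S / math.log(2)          # dividing by ln(2) converts nats to bits
print(round(bits))              # 64
```

So entropy in units of $\ln 2$ is exactly the number of bits, which is why the two notions can be used interchangeably up to constant factors.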

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
answered Dec 27, 2010 by Deepak Vaid (1,985 points) [ no revision ]
typo: should be $1cc = 10^{-6}m^3$, $A = 10^{-4}m^2$

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Mark Eichenlaub
I've heard this a few times, might as well ask now. What if you take a volume $V_2$ that lies inside your volume $V_1$ such that $A_2 > A_1$. Which one would be able to hold more information?

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Bruce Connor
@space_cadet: This has the makings of a great answer; my one (hopefully constructive) criticism is that you don't really explain why the proportionality constant $S/A$ is related to $A_{pl}$. Of course a full proof would be overkill, but I think it'd help to include a few words on the significance of the Planck area in this argument, for people who aren't familiar with it. Also I'd rather see a different symbol used instead of $S$ in your last equation, since entropy doesn't quite measure the number of bits of information. (I know it's just a constant factor difference, it just looks weird)

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user David Z
Thank you @mark and @david for your comments. I hope the edit resolves the questions you had. As for what @bruce pointed out (nice one btw) I'm still processing and will come back if I have a response.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
@Bruce: $V_2$ obviously; the whole point of holography is that volume doesn't matter at all, only area does :-) Of course, I am not sure to what degree this has been proved (as in calculated microscopically) for generic surfaces (not smooth even?) rather than for horizons of quite generic BH.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Marek
This is a nice answer, but I have to wonder, how much of this information can actually be exploited. It's clear that you have given absolute lower bound on that information. But in reality, we wouldn't be able to modify and read bits from BH's horizon. So I guess a bigger lower bound should exist. Or are you suggesting that all of that holographic information can somehow be managed, even in principle?

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Marek
@marek - good point (two comments up). I was thinking along similar lines. Interestingly this line of reasoning sheds light on the geometric nature of the entropy bound, so is worth pursuing in greater detail.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
@marek - this is not a lower bound. It is an upper bound. It determines the maximum amount of information that you can store in a given region. Or am I misunderstanding you? Secondly, I'm not suggesting anything about how such information can be managed. That is a separate question that will lead us to consider limits on information processing and transfer as opposed to storage. There is an interesting article on this by JB Pendry available here

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
@space_cadet: sorry, I reversed the order, I did mean to write upper. And by the way, no, it's not a separate question. If you want to store your information, it's because you want it safe and recover it later. Throwing things into BH doesn't count as storing information in my opinion (at the very least, that information will fly away as Hawking radiation sooner or later) :-) But thanks for the link.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Marek
@space_cadet: the answer is incomplete. All that it is saying is that you can only store information on the surface area. It makes sense (you need to have a surface to be able to retrieve the information). On the other hand the question asks about the volume. So the question now is: how much surface area can you fit in a given volume? en.wikipedia.org/wiki/Menger_sponge

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Sklivvz
In The Pleasure of Finding Things Out, Richard Feynman also speculates on the limits of information density. He doesn't go as far as the Planck length, as far as I can recall, presumably because that is very far off technologically, even nowadays. Interesting lecture.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Raskolnikov
@raskolnikov - Feynman was one of the original proponents of quantum computation. There is an article by him from around ~ 1982 on this topic. @Sklivvz - the information that can be stored on a black hole horizon or other surface, is the maximum information that can be stored within any region (i.e. volume) bounded by that surface. At high enough energies, space-time becomes 2+1 dimensional and the notion of volume disappears.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
There are problems with a surface-area definition of information: 1) it makes information capacity observer-dependent (for an observer beneath the named surface the information capacity will be much different); 2) it is not additive and depends on the division we choose for the space: the sum of many 1 cm^3 volumes can evidently store more information than one region of the same total volume, and both less than a volume that encompasses them; 3) it does not account for differences between quantum and classical information; 4) it is not reversible: for an observer, the entropy of the universe turns out to be limited

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Anixx
"It makes information capacity observer-dependent ..." I'm not sure what you mean by that. "For an observer under the named surface the information capacity will be much different." I'm sorry, I don't understand this. Do you mean that an observer within the bounded region will see something different from an observer on the outside? "It is not additive and depends on the division we choose for the space ..." There is a restriction on the sorts of surfaces which saturate the entropy bound. These are generally minimal or "extremal" surfaces, which are unique for a given spacetime.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
"It does not account for differences between quantum and classical information ..." Again, I don't understand the point you're trying to make. Please elaborate on what these differences are and how holography fails to account for them. "4) It is not reversible: for an observer the entropy of the universe turns out to be limited ..." Is that one criticism or two? Holography actually ensures unitarity and hence (microscopic) reversibility of a quantum gravity theory. See Lubos' answer to question 5407. Also, I don't see how the entropy of the universe being a finite quantity is undesirable.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user user346
In the mathematical discipline of information theory, the entropy of a message is the expected value of the information contained in that message. The formulas are the same, so it shouldn't be surprising that entropy is a measure of information content in physical systems as well.

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Nick Alger
Is the answer also true shortly after the big bang?

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user jjcale
+ 2 like - 0 dislike

OK then: let's say that, for a given volume and nano-molecular data-retrieval technology, and assuming you want the data safe, retrievable, and made of long-term-stable atoms, what is the maximum amount of data that can usefully be stored?

So firstly we need 1/2 of the total volume to be used for a single molecular layer of your chosen molecule; this will be the "platter" for our "hard drive".

Onto this you place the atoms that represent bits, so the total number of bits is your volume divided by the volume of your chosen molecule or element, divided by 2.

But with molecular storage you could use different molecules and have, for example:

No molecule = 0
Gold = 1
Platinum = 2
Silver = 3

Then you have four states (2 bits) per site without much loss in size; throw in some carbon-12 and carbon-13 and you're up to six states, find some more stable elements and you're up to eight states (3 bits) per site, and so on.

Of course, data retrieval would be terribly slow, but for long-term, small-size storage you're talking quadrillions of bits per cm^3.
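A back-of-the-envelope version of this scheme can be sketched in Python. The ~0.5 nm site spacing is an assumption for illustration, not a number from the answer above:

```python
import math

# Rough molecular-storage estimate; spacing and state count are assumptions.
site_spacing = 0.5e-9                    # ~0.5 nm between storage sites (assumed)
site_volume = site_spacing ** 3          # volume per site, 1.25e-28 m^3

usable_volume = 0.5e-6                   # half of 1 cm^3, per the "platter" rule

sites = usable_volume / site_volume      # number of storage sites, ~4e21
bits_per_site = math.log2(4)             # 4 states (none/Au/Pt/Ag) -> 2 bits
total_bits = sites * bits_per_site
print(f"{total_bits:.0e} bits")          # far above "quadrillions" (10^15)
```

Under these assumed numbers the capacity comes out well above quadrillions of bits, while remaining a vanishing fraction of the holographic bound of $\sim 10^{66}$ bits from the accepted answer.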

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user david mcgowan
answered Sep 7, 2012 by david mcgowan (20 points) [ no revision ]
Is it more than the bound of $10^{66}$ bits per $1\,\text{cc}$ that Mr. Bekenstein provided (two years ago)? If not, then what is the point?

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Val
+ 0 like - 0 dislike

I'm not a physicist, but I do know computer science, and keep up with the basics of physics, so let me give another answer for this:

We don't know yet. As long as there are smaller things that can be found, changed, and observed, we can use them to store information.

For example, if a new quantum property is found which can be in state A or state B, that's a new bit. If that's present in every billion atoms of something, that's a billion more bits of data. If we then learn to manipulate that property into two additional states (say, right-way-out and inside-out), then we've added yet another bit per site, squaring the total number of configurations.

So, the problem is that we're still learning what matter and spacetime are made of. Until we come up with a provably correct, unified theory, we don't know how many varying things there are within any material. Given that each additional bit per site squares the total number of configurations, it's fairly useless to give "ballpark" figures until we know more. So it's probably better to give something like Moore's Law: a prediction that we'll double the storage every so often, until we run out of new discoveries/technologies.
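The capacity-growth claim can be made concrete: with $n$ sites, going from 2 to 4 distinguishable states per site squares the number of addressable configurations, since $4^n = (2^n)^2$. A tiny sketch:

```python
# Configuration count as a function of states per site (illustrative).
n = 10                          # number of storage sites

two_state = 2 ** n              # 2 states/site -> 1024 configurations
four_state = 4 ** n             # 4 states/site -> 1048576 configurations

# Doubling the states per site squares the configuration count:
assert four_state == two_state ** 2
print(two_state, four_state)    # 1024 1048576
```

Note that adding a single extra state (2 to 3 per site) multiplies capacity by $(3/2)^n$ rather than squaring it; the squaring happens each time the per-site state count doubles.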

This post imported from StackExchange Physics at 2014-04-01 16:27 (UCT), posted by SE-user Lee
answered Sep 30, 2013 by Lee (0 points) [ no revision ]

user contributions licensed under cc by-sa 3.0 with attribution required
