38

What is the most efficient way to store data that is currently hypothesized? Is it theoretically possible to store more than one bit per atom?

Until Floris' comment, I hadn't considered that efficiency is dependent on what you're trying to optimize. When I wrote the question, I had matter efficiency in mind. Are there other orthogonal dimensions that one may want to optimize?

And to be clear, by matter efficiency, I mean representing the most data (presumably bits) with the least amount of matter.

Tyler
  • 499
  • 4
  • 7

5 Answers

59

It sounds as though you may be groping for the Bekenstein bound (see the Wikipedia page of this name) from the field of Black Hole Thermodynamics. This bound postulates that the total maximum information storage capacity (in bits) for a spherical region of space of radius $R$ containing total energy $E$ is:

$$I\leq \frac{2\,\pi\,R\,E}{\hbar\,c\,\log 2}\tag{1}$$

where $I$ is the number of bits contained in quantum states of that region of space. If you try to shove too much energy into a region of space, a Schwarzschild horizon and a black hole will form, so the Bekenstein bound implies a maximum information storage capacity for a region of space independent of $E$; the limit is reached when $E$ is so high that $R$ becomes the Schwarzschild radius (the radius of the event horizon) of that black hole. From this notion we get:

$$E\leq \frac{R\,c^4}{2\,G}\tag{2}$$

to prevent a black hole forming, with equality holding at the threshold of creating a horizon. From (1) and (2) we find:

$$I\leq \frac{\pi\,R^2\,c^3}{\hbar\,G\,\log\,2}\tag{3}$$

which is indeed the entropy of a black hole in bits: this is Hawking's famous formula $I = A/(4\,\log 2)$ bits, where $A$ is the area of the black hole's Schwarzschild horizon, expressed in Planck units. Bekenstein derived these bounds by postulating that the second law of thermodynamics stays true for systems containing black holes, and then showing that the second law can be made "safe" only if these bounds hold; otherwise, one can imagine thought experiments that would violate the second law by throwing things into black holes. More on the grounding of the bounds can be found on the Scholarpedia page for the Bekenstein bound.

One gets truly colossal storage capacities from these formulas. For a region of space of 5 centimetres radius (the size of a tennis ball), we get $4.3\times10^{67}$ bits from (3). This is to be compared with the estimated total storage of Earth's computer systems of about $10^{23}$ bits in 2013 (see the Wikipedia zettabyte page). A one-and-a-half-kilogram human brain can store about $3\times 10^{42}$ bits, and the mass of Earth roughly $10^{75}$ bits. These last two are more indicative of "normal" matter, because the tennis ball example assumes we've packed in so much energy that a black hole is about to form; the tennis ball would have to be made of ultracompressed matter like neutron star material.

From the human brain example, let's assume we have $(1500/12)\times 10^{24}$ atoms (roughly Avogadro's number times the number of moles of carbon in that mass). The informational content worked out above would then amount to roughly $10^{16}$ bits per atom.
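
For concreteness, here is a minimal Python sketch reproducing the figures quoted above from equations (1) and (3). The constants are standard CODATA values; the ${\sim}8\,\mathrm{cm}$ brain radius is an assumption of mine for illustration (the text above doesn't state one).

```python
import math

hbar = 1.054571817e-34   # reduced Planck constant, J s
c    = 2.99792458e8      # speed of light, m/s
G    = 6.67430e-11       # gravitational constant, m^3 kg^-1 s^-2
N_A  = 6.02214076e23     # Avogadro's number, mol^-1

def holographic_bound_bits(R):
    """Eq. (3): max bits for a sphere of radius R at the black-hole limit."""
    return math.pi * R**2 * c**3 / (hbar * G * math.log(2))

def bekenstein_bound_bits(R, E):
    """Eq. (1): max bits for a sphere of radius R containing energy E."""
    return 2 * math.pi * R * E / (hbar * c * math.log(2))

# Tennis ball: R = 5 cm at the threshold of forming a horizon.
print(f"tennis ball: {holographic_bound_bits(0.05):.1e} bits")   # ~4.3e67

# Human brain: m = 1.5 kg, E = m c^2, radius assumed to be ~8 cm.
brain_bits = bekenstein_bound_bits(0.08, 1.5 * c**2)
print(f"brain:       {brain_bits:.1e} bits")                     # ~3e42

# Bits per atom, treating the brain as ~125 mol of carbon-12.
n_atoms = (1500 / 12) * N_A
print(f"per atom:    {brain_bits / n_atoms:.1e} bits/atom")      # ~4e16
```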

None of these bounds speaks to the actual realisation of data storage. But theoretically it would be trivial to store more than one bit per atom: choose an element with, say, three or four stable isotopes, and line the atoms up in a lattice. You code your data by placing the appropriate isotope at each given position in the lattice, and retrieve your bits by reading which isotope is present at each position. For example, silicon has three stable isotopes: code your message in a silicon lattice like this, and your storage is $\log_2 3 \approx 1.58$ bits per atom.
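
Here is a toy Python sketch of that coding scheme, with the lattice modelled simply as a list of isotope labels: the data is treated as one big integer written in base 3, one trit per atom.

```python
import math

ISOTOPES = ["Si-28", "Si-29", "Si-30"]  # one base-3 digit per lattice site

def encode(data: bytes) -> list[str]:
    """Turn a byte string into a sequence of isotope labels."""
    n = int.from_bytes(data, "big")
    sites = []
    while n:
        n, digit = divmod(n, 3)
        sites.append(ISOTOPES[digit])
    return sites

def decode(sites: list[str], length: int) -> bytes:
    """Read the isotopes back off the lattice and rebuild the bytes."""
    n = 0
    for label in reversed(sites):
        n = 3 * n + ISOTOPES.index(label)
    return n.to_bytes(length, "big")

msg = b"hi"
lattice = encode(msg)
assert decode(lattice, len(msg)) == msg
print(len(lattice), "atoms for", 8 * len(msg), "bits:",
      8 * len(msg) / len(lattice), "bits/atom vs", math.log2(3))
```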


Edit in answer to question by OP:

"since this is, as far as I can tell, relativistic/macro-scale physics, is there room for significant change when/if quantum physics is incorporated? (I.e. will this likely stay the same or change when the unifying theory is discovered? Or is it entirely independent of the unifying theory problem?)"

Yes, it is macro-scale physics, but it will not improve when quantum effects are incorporated IF the second law of thermodynamics applies to black hole systems, and my feeling is that many physicists who study quantum gravity believe it does. Macroscopic ensembles of quantum systems still heed the second law when you measure the entropy of a mixed state with the von Neumann entropy, which is the quantum extension of the Gibbs entropy. And if you're talking about the second law, you are always talking about ensemble / large-system behaviour: entropy fluctuates up and down, negligibly for macro systems but significantly for systems of small numbers of quantum particles. If you think about it, though, it is the macro behaviour that is probably most interesting to you, because you want to know how much information is stored on average per quantum particle.

As I understand it, a great deal of quantum gravity theory is grounded on the assumption that black hole systems do indeed follow the second law. In causal set theory, for example, the assumed "atoms" of spacetime causally influence one another, and you of course have pairs of these atoms that are entangled (causally influence one another) but which lie on either side of the Schwarzschild horizon: one of the pair is inside the black hole and therefore cannot be probed from the outside, whilst the other pair member is in our universe. It is entangled with, and thus causally linked to, the pair member inside the black hole which we cannot see. The outside-horizon pair member observable in our universe therefore has "hidden" state variables, encoded in the state of the pair member inside the horizon, that add to its von Neumann entropy as we would perceive it outside the horizon. This is why causal set theory predicts an entropy proportional to the horizon area (the famous Hawking equation $S = k\,A/4$): it is the area that is proportional to the number of such pairs straddling the horizon.


Links with Jerry Schirmer's Answer after discussions with Users Charles and user27542; see also Charles's question "How big is an excited hydrogen atom?"

Jerry Schirmer correctly (IMO) states that one can theoretically encode an infinite number of bits in the eigenstates of an excited hydrogen atom; this is of course if we can measure energy infinitely precisely and tell the states apart. Since the spacing between neighbouring energy levels varies as $1/n^3$, where $n$ is the principal quantum number, we'd need to be willing to wait longer and longer to read our code as we try to cram more and more data into our hydrogen atom. Even if we are willing to do this, the coding scheme does not even come close to violating the Bekenstein bound, because the size of the higher-energy-state orbitals increases, theoretically without bound, with the principal quantum number.

I calculate the mean radius $\bar{r}$ of an orbital with principal quantum number $n$ in my answer here, and the answer is $\bar{r}\approx n\,(n+\frac{3}{2})\,a \sim n^2$. Also, the angular momentum quantum numbers are bounded by $\ell \in\{0,\,1,\,\cdots,\,n-1\}$ and $m\in\{-\ell,\,-\ell+1,\,\cdots,\,\ell-1,\,\ell\}$; therefore the total number of eigenstates with principal quantum number $n$ is $1+3+5+\cdots+(2n-1) = n^2$, and so the total number $N(n)$ of energy eigenstates with principal quantum number $n$ or less is $N(n)=1^2+2^2+\cdots+n^2 \approx n^3/3$. So $\bar{r}\propto n^2$ and $N \propto n^3$, thus $N\propto \bar{r}^{3/2}$ and $I = \log_2 N \approx A + \frac{3}{2}\,\log_2\bar{r}$, where $I$ is the encodable information content in bits and $A$ is a constant.
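
A quick numerical check of this scaling in Python (the constant $A$ shows up as the roughly fixed gap between the two printed columns):

```python
import math

def states_up_to(n):
    """Number of hydrogen eigenstates with principal quantum number <= n:
    each level n' contributes (n')^2 distinct (l, m) combinations."""
    return sum(k * k for k in range(1, n + 1))   # ~ n^3 / 3

for n in (10, 100, 1000):
    N = states_up_to(n)
    r_mean = n * (n + 1.5)          # mean orbital radius, in Bohr radii
    print(f"n={n:5d}  N={N:.2e}  I={math.log2(N):6.1f} bits  "
          f"(3/2)log2(r)={1.5 * math.log2(r_mean):6.1f}")
# The two columns differ by a near-constant offset A, i.e.
# I ≈ A + (3/2) log2(r_mean): capacity grows only logarithmically in bits
# gained per orders-of-magnitude growth in the atom's size.
```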

Selene Routley
  • 90,184
  • 7
  • 198
  • 428
22

Assuming infinite precision in measurement, an infinite number of bits can be stored in a single atom.

Take the information you want to store, encode it into a string, and then calculate the Gödel number of the string. Call that number $n$. Then excite a hydrogen atom to exactly the $n^{\rm th}$ energy level.

In practice, the properties of a real hydrogen atom will make this completely impractical, of course, but we're just talking pure theory.
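
For what it's worth, here is a toy Python sketch of just the encoding step (the atom-exciting step is left to the experimentalists). It uses the classic Gödel scheme $n = \prod_i p_i^{a_i}$, where $p_i$ is the $i$-th prime and $a_i$ is the code of the $i$-th symbol.

```python
def primes():
    """Naive unbounded prime generator; fine for short strings."""
    found = []
    k = 2
    while True:
        if all(k % p for p in found):
            found.append(k)
            yield k
        k += 1

def godel_number(s: str) -> int:
    """Map a string to a single integer: prime(i) ** code(symbol_i)."""
    n = 1
    for p, ch in zip(primes(), s):
        n *= p ** ord(ch)
    return n

n = godel_number("hi")          # 2**104 * 3**105
print(n.bit_length(), "bits' worth of energy level:", n)
```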

2

Actually, theorists have worked out the limits of the "ultimate computer", assuming that black holes exist. The basic procedure for estimating the maximum amount of information that can be stored in a given amount of space comes down to how much information you can store at the smallest possible scales, as you keep concentrating matter and energy to code for bits. Information and entropy are related, so the ultimate limit is believed to be the entropy of a black hole: as you keep miniaturising and the information density grows, the system will eventually collapse into a black hole. See for instance this article for a popular-science viewpoint: http://www.nytimes.com/library/national/science/090500sci-physics-computers.html

1

I think that, in theory, the smallest scale at which we can store information is one particle, or one Planck length; that's the limit of quantum theory. Maybe we could store information in a sheet of something divided into slots of one square Planck length each, with the state of the particle in each slot being one bit of information.
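
Tying this picture back to the accepted answer: if we tile the surface of the 5 cm tennis-ball sphere from that answer with one-Planck-length squares and apply the Hawking formula $I = A/(4\,\log 2)$ quoted there (with $A$ in Planck areas), a quick back-of-the-envelope Python check reproduces the same ${\sim}4.3\times 10^{67}$ bits:

```python
import math

hbar, c, G = 1.054571817e-34, 2.99792458e8, 6.67430e-11
l_planck = math.sqrt(hbar * G / c**3)        # ~1.6e-35 m

R = 0.05                                     # tennis-ball radius, m
cells = 4 * math.pi * R**2 / l_planck**2     # Planck-area "slots" on surface
print(f"{cells:.1e} slots -> {cells / (4 * math.log(2)):.1e} bits")  # ~4.3e67
```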

Thaina
  • 948
-1

I can also think of a method that can, in principle, store an unbounded amount of data in a couple of objects/atoms/whatever, depending on how precisely you can measure. Just measure the rotation of an object relative to another object on the same axis. Simple example: take two disks that can spin on the same axis, each with an indicator marking which side is up. The angle between these two indicators is between 0 and 360 degrees, and depending on how accurately you can measure it, you can store data this way.
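
A small Python sketch of the capacity this gives: with an angular measurement resolution of $\delta$ degrees (the microdegree figure below is purely an assumption), one disk pair stores $\log_2(360/\delta)$ bits.

```python
import math

def capacity_bits(delta_deg: float) -> float:
    """Distinguishable angles -> bits, for measurement resolution delta."""
    return math.log2(360.0 / delta_deg)

def encode(value: int, delta_deg: float) -> float:
    """Map an integer onto an angle, one resolution step per value."""
    assert 0 <= value < 360.0 / delta_deg
    return value * delta_deg

def decode(angle_deg: float, delta_deg: float) -> int:
    return round(angle_deg / delta_deg)

delta = 1e-6                          # assumed microdegree resolution
print(f"{capacity_bits(delta):.1f} bits per disk pair")   # ~28.4 bits
assert decode(encode(123456, delta), delta) == 123456
```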

Ivo
  • 99