Real numbers and black holes

A real number is a number you can know to as much precision as you want. \(1.0\) is a real number, as is \( \pi \) and \( \sqrt{2} \). It's convenient that we have symbols like \(\sqrt{2}\) for talking about certain real numbers, like the positive solution to \( x^2 = 2 \). Most of the others are too long to write down, since apart from integers and fractions, real numbers are basically just infinite decimal expansions.

There are compact, symbolic ways of representing the number \( \pi \), and nothing stops us, in principle, from knowing as many digits of \( \pi \) as we want. But the more digits we write down, the more space we need to store them. If we wanted all the digits, we'd need all the space in the world, and then some.

Someone online asked if it was meaningful to say, "a particle is at \( \sqrt{2} \) meters in the \( x \) direction", that is, to label the position with a real number, or if the "information density" would create a black hole. (The context was the paper Struggles with the Continuum by John Baez.)

If you don't see why "information density" even enters the picture, consider this. Start by compiling every sentence ever written, such as "It was the best of times, it was the worst of times..." and then translate them into one long binary string, like \( .0110110111...101 \). If I could place a particle anywhere on the real number line, I could choose to place it at \( x = .0110110111...101 \) cm. Then, when I wanted to retrieve those sentences, I could just measure the position of the particle. It would be a pretty impressive feat of memory storage, if that could indeed be done. All the information you could ever want, stored in a single tiny particle.
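To make the scheme concrete, here's a minimal Python sketch (the function names are mine, purely for illustration) that packs a sentence into a binary fraction and then "measures" it back out:

```python
def text_to_position(text: str) -> str:
    """Encode text as a binary fraction .b1b2b3... in [0, 1)."""
    bits = "".join(f"{byte:08b}" for byte in text.encode("utf-8"))
    return "." + bits

def position_to_text(position: str) -> str:
    """Recover the text by reading off the binary digits of the position."""
    bits = position.lstrip(".")
    data = bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8))
    return data.decode("utf-8")

x = text_to_position("It was the best of times")
print(x[:17])               # .0100100101110100  ('I', then 't')
print(position_to_text(x))  # It was the best of times
```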

Is there a law that prevents us from storing an unlimited amount of information this way? Certainly in the real world, the uncertainty principle applies, and the more you tried to squeeze a particle into a tiny space, the faster it would wiggle. You couldn't pin its speed to zero without also messing with its position. But what about a hypothetical classical world, with point particles and zero velocities and infinite-precision measurements? Well, it still feels to me like you're breaking some kind of rule by putting a particle at exactly the point \( x = \sqrt{2} \), but can I make this feeling precise?

Information and energy

First, there's a surprisingly deep connection between information, energy, and temperature. A good starting point is Charles Bennett's work on information and energy (or Szilard's). To see how information can be used as a certain kind of "fuel," consider their argument.

Imagine a classical, zero-size particle in a box at temperature \( T \). For simplicity assume that the box lives on the interval \([0, 1]\).

At an instant, we're going to measure the particle's position to arbitrary precision and represent it in base two (just zeros and ones) with the following recipe. If the particle is in the left half of the box, the first digit is a zero: \( x = .0 \); if it's in the right half, a one. Then we look inside that sub-box and repeat: if the particle is in the right half of the sub-box, we append a one, \( x = .01 \), and so on.

We end up with a representation of the position of the particle as a string of \( N \) bits: $$ x = .\underbrace{010110111001...1}_\textrm{N digits} $$
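Here's a toy Python version of that bisection recipe (my own illustration, assuming a box on \([0, 1)\) and an idealized, noiseless measurement):

```python
def measure_bits(x: float, n: int) -> str:
    """Bisect the box [0, 1) n times, recording 0 when the particle
    is in the left half and 1 when it's in the right half."""
    bits = []
    lo, hi = 0.0, 1.0
    for _ in range(n):
        mid = (lo + hi) / 2
        if x < mid:            # particle in the left sub-box
            bits.append("0")
            hi = mid
        else:                  # particle in the right sub-box
            bits.append("1")
            lo = mid
    return "." + "".join(bits)

print(measure_bits(2 ** 0.5 - 1, 10))  # .0110101000
```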

How much work can we extract from this system? It turns out to be related to the amount of information we have about the system. With the particle's position known, quickly insert two membranes around the particle that are only a distance \(.000...1 \) apart (so if we know the position to a precision of 1 micron, we trap the particle between two membranes separated by 1 micron). Then we allow the particle to isothermally expand against the membranes and double the distance between them. In the process, we can extract work \( W \) equal to

$$ W = \int p \, dV = \int \frac{kT}{V} \, dV = kT \ln \frac{V_2}{V_1} = kT \ln 2 $$

This erases one bit of information about the position in exchange for \( kT \ln 2 \) units of work done (this is Szilard's result). Since we have \( N \) bits of information to work with, we can do this \( N \) times. In this classical world, \(N\) can be made arbitrarily large.
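For a sense of scale, here's a quick Python sketch of that bookkeeping. The room-temperature value and the number of bits are arbitrary choices:

```python
import math

k_B = 1.380649e-23   # Boltzmann constant, J/K (exact, 2019 SI)
T = 300.0            # room temperature in kelvin (arbitrary choice)

work_per_bit = k_B * T * math.log(2)    # Szilard: kT ln 2 per bit erased
print(f"{work_per_bit:.3e} J per bit")  # ~2.871e-21 J

# With N bits of position information, the total extractable work is
# N * kT ln 2 -- and in the classical story N is unbounded.
N = 1_000_000
print(f"{N * work_per_bit:.3e} J from {N} bits")  # ~2.871e-15 J
```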

To extract that energy, you have to continuously re-heat the particle, with the bath adding energy back each time you want \( kT \ln 2 \) units of work done. Remember that the work is done isothermally, so the particle's energy is actually constant throughout this process. Its energy density, the kind of thing that would make a black hole, doesn't change.

What's really happening is that if we consider the positional information to be continuous, then the particle's positional entropy is basically infinite. We can reduce this entropy (locally, for the particle) with a measurement, and use that lowered entropy to extract as much isothermal work as we want. This is an example of how you can use information to extract work from a heat bath.

Safe from black holes?

As we've said above, even though the energy density of the particle is constant, its entropy is proportional to \(N\), where \(N\) is the number of bits of precision to which we can know its position. If \( N \) can be made arbitrarily large, we hit a problem.

It turns out that there's a bound on the amount of entropy you can pack into a small region before it becomes a black hole. In Planck units,

$$ S_\text{matter} \lesssim A, $$

where \( A \) is the area of the surface enclosing the region.

Why? It turns out we have lots of good reasons to think that the entropy of a black hole is proportional to the area of its horizon. This has an interesting consequence. If you could store more entropy in a lump of matter than in an equally-sized black hole, then by squeezing the matter into a black hole of that size, you would lower the entropy of the universe, violating the second law. So the entropy in any region must be less than that of an equally-sized black hole; pack in any more, and the region becomes one.
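To put a number on this, here's a back-of-the-envelope Python sketch using the Bekenstein–Hawking form \( S = A / 4\ell_P^2 \) (entropy in units of \( k_B \), with \( \ell_P \) the Planck length); the 1 cm radius is an arbitrary choice:

```python
import math

l_P = 1.616255e-35   # Planck length in meters
r = 0.01             # radius of the region in meters (arbitrary example)

A = 4 * math.pi * r ** 2     # area of the enclosing sphere
S = A / (4 * l_P ** 2)       # Bekenstein-Hawking entropy, in units of k_B
bits = S / math.log(2)       # convert nats to bits

print(f"S ~ {S:.2e} k_B, or ~{bits:.2e} bits")  # ~10^66 bits for 1 cm
```

An enormous number, but a finite one.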

In other words, I think in a classical world (with black holes) the information density would create black holes. This makes an interesting case for quantum mechanics, which comes in and blurs some of the particle detail, putting an upper bound on the entropy of a particle.

The quantum mechanical entropy

As mentioned above, quantum mechanics sort of rescues the situation by putting a limit on how much you can know about the position of a particle. The position is instead described by a complex probability amplitude \( \psi(x) \), whose squared magnitude \( |\psi(x)|^2 \) gives a probability density.

In fact, in quantum mechanics, you can compute the differential entropy of a particle's wavefunction \( \psi(x) \) explicitly with

$$ S = - \int dx \left| \psi \right|^2 \ln \left| \psi \right|^2 $$

and this is finite. But this doesn't totally get us away from the problem with the continuum. In passing to the limit of a continuous \( \psi(x) \) (basically, saying we can measure the probability on a scale as fine as we want), we've actually thrown away an infinite amount of entropy. In fact, that part of the entropy corresponds to the logarithm of the spacing of our mesh, \(\epsilon\),

$$ S_\textrm{mesh} \sim - \ln(\epsilon). $$
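Both claims are easy to check numerically for a Gaussian wavepacket. In the sketch below (NumPy; the width \( \sigma \) is an arbitrary choice), the differential entropy matches the known closed form \( \tfrac{1}{2}\ln(2\pi e \sigma^2) \), while the discrete entropy of the binned distribution grows like \( -\ln(\epsilon) \) as the mesh is refined:

```python
import numpy as np

sigma = 1.5   # width of the Gaussian wavepacket (arbitrary choice)

# Differential entropy: S = -integral of |psi|^2 ln |psi|^2, on a fine grid
x = np.linspace(-20, 20, 200_001)
p = np.exp(-x**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))  # |psi|^2
S = -np.trapz(p * np.log(p), x)
print(S, 0.5 * np.log(2 * np.pi * np.e * sigma**2))  # both ~1.824 nats

# Mesh entropy: bin the same distribution with spacing eps and watch the
# discrete Shannon entropy grow like S - ln(eps) as the mesh is refined
for eps in [0.1, 0.01, 0.001]:
    centers = np.arange(-20, 20, eps) + eps / 2
    prob = eps * np.exp(-centers**2 / (2 * sigma**2)) / (sigma * np.sqrt(2 * np.pi))
    S_discrete = -np.sum(prob * np.log(prob))
    print(f"eps = {eps}: S_discrete = {S_discrete:.3f}, "
          f"S - ln(eps) = {S - np.log(eps):.3f}")
```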

Does that mean there's a limit to how precisely we can measure the wavefunction, \(\psi(x)\)? It's something to think about.

As a final thought: the linked post above shows how the entropy increases as you dial up the energy state of a hydrogen atom. The large \( n \) states of the hydrogen atom are called Rydberg states. For a sufficiently large value of \( n \), how does the entropy scale? What about the entropy density? Is it possible to hit the maximum entropy bound, and create a black hole? Maybe we'll follow that up in another post.