What is conservation of information? (physics)
June 20, 2018 4:58 PM   Subscribe

I've been watching the PBS Space Time series on black holes and quantum mechanics. They're now talking about how black holes violate the principle of conservation of (quantum) information. I'm really having a difficult time wrapping my brain around what is meant by information, why it needs to be conserved, and what it means to not be conserved.

A bit of background. I have a master's degree in physics (or will in two weeks) [please for the love of zeus don't tell me that I should already understand this by now. that's not helpful]. I've taken introductory classes in relativity, quantum mechanics, statistical mechanics, and cosmology. I have furiously googled the answer to this question, but the answers all seem either too technical or too vague for me to grasp. See here, here, here and here, for instance.

So, when physicists talk about information, what do they mean?

What is an example of information *not* being conserved?

Why is it important that information be conserved?

In the first video I linked, the host claims that mass falling into a black hole is then turned into essentially blackbody radiation, and that's a violation of conservation of information. How is this any different from mass at the center of a star being turned into blackbody radiation?

I am not an intuitively mathematical person, so answers that take the form of examples, analogies, stories, or images are appreciated and preferred over answers that contain a wall of equations (which is exactly why I didn't post this on stackexchange).
posted by runcibleshaw to Science & Nature (10 answers total) 6 users marked this as a favorite
 
I'm pretty sure this is closely related to entropy-type stuff -- information that passes the event horizon is destroyed, akin to the bits representing that information being scrambled, i.e., an increase in entropy.
posted by Maecenas at 5:19 PM on June 20, 2018


It's above my understanding, but does this help? Black Hole Firewall
posted by willnot at 5:25 PM on June 20, 2018 [1 favorite]


Try the first lecture in Leonard Susskind's series on Statistical Mechanics. It's long but since it's meant for non-majors you may be able to skip around. I don't think that it's always covered in Statistical Mechanics courses the way Susskind covers it here.
posted by muddgirl at 5:42 PM on June 20, 2018 [3 favorites]


It's a necessary consequence of all of the other conservation laws more than a physical thing, though as with many concepts in physics, you can flip it on its head, call information the fundamental thing, and be equally correct as far as the math is concerned. Even when such things aren't "real," the exercise often provokes insight.
posted by wierdo at 6:01 PM on June 20, 2018


Response by poster: I did watch that Susskind lecture during my googling. He just kind of insists that information is conserved, without giving a good reason other than that it must be that way. He says that one state must be preceded by exactly one state and followed by one state. So two states can't lead directly to a single state. But I don't know what that has to do with information.
posted by runcibleshaw at 6:13 PM on June 20, 2018


What state you came from and what state you are going to is the "information." You can't have both Green and Blue lead to Red because if you measure your present state and it is Red, you don't know your prior state, you can't return to it, and so the system is not reversible (reversibility is a topic in classical mechanics). By going from Green to Red in that system, knowledge about your prior state has been lost, so information has not been conserved.
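To make muddgirl's point concrete, here's a toy Python sketch (the three-color update rule is invented purely for illustration):

```python
# A toy "law of physics": a rule mapping each state to its next state.
# Two states (Green and Blue) both lead to Red, so the rule is not reversible.
step = {"Green": "Red", "Blue": "Red", "Red": "Green"}

# Running forward is unambiguous: every state has exactly one successor.
assert step["Green"] == "Red"

# Running backward fails: Red has two possible predecessors, so if you
# measure Red now, the knowledge of where you came from is gone.
predecessors = {s: [p for p in step if step[p] == s] for s in step.values()}
print(predecessors["Red"])  # ['Green', 'Blue'] -- can't tell which one you came from
```

In a reversible system, every state would have exactly one predecessor, and you could run the clock backwards with no ambiguity; that unambiguous history is the "information" being conserved.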
posted by muddgirl at 6:35 PM on June 20, 2018 [4 favorites]


When physicists say "information", what they're usually talking about is time-reversal invariance. This means that no matter how a system evolves in time, it's possible to "run the clock backwards" and reverse-engineer what state it started out in. In other words, there's never a case where you "lose" information about what state the system was in originally; you can always figure out the initial state from the final state.

In statistical mechanics, we often effectively lose this information because there are too many degrees of freedom to keep track of. For example, if I speak the word "tachyon" into this room, the sound waves will bounce around in the room for a little bit; and as long as this is happening, I could look at the sound waves at that moment, reverse-engineer their motion, and figure out that the waves had emanated from my mouth some time earlier. But eventually, the sound waves will break down into seemingly random motions of the air molecules in the room. Technically, the information is still there; I'd just need to know the motions of all 10^20-something air molecules in this room to figure out what word I had spoken some time earlier.
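A small Python sketch of that idea, with a made-up reversible "scrambling" rule standing in for the air molecules (the rotate-and-XOR step is purely illustrative):

```python
# Reversible "dynamics": rotate the bytes of a message and XOR with a fixed key.
# After many steps the message looks like noise, but because every step is
# invertible we can run the clock backwards and recover it exactly.
def step(state, key=0b10110010):
    state = state[1:] + state[:1]          # rotate left (a permutation: invertible)
    return bytes(b ^ key for b in state)   # XOR with the key (self-inverse)

def unstep(state, key=0b10110010):
    state = bytes(b ^ key for b in state)  # undo the XOR first
    return state[-1:] + state[:-1]         # then undo the rotation

msg = b"tachyon"
state = msg
for _ in range(1000):
    state = step(state)    # scrambled: unreadable, like the thermalized sound waves
for _ in range(1000):
    state = unstep(state)
print(state)               # b'tachyon' -- the information was there all along
```

The scrambled bytes look random, but since every step has an inverse, no information is ever actually destroyed; it's just hopelessly hard to read off in practice.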

Black holes add an additional wrinkle to this. If I speak the word "tachyon" into a box of gas, and then throw that box of gas into a black hole, then there is no way, even in principle, to extract that information. On the other hand, the conservation of information in a quantum mechanical sense is related to an idea called unitarity, which among other things implies that the sum of the probabilities of observing the system in each of its possible states is equal to one (i.e., the norm of the wave function is constant.) This contradiction places quantum mechanics and black holes in tension with each other; it's usually called the information paradox. Many physicists like to think about this problem because it could potentially provide hints towards a quantum theory of gravity.
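A minimal sketch of what unitarity buys you, in plain Python (the Hadamard-like map is a standard textbook example; the specific amplitudes are arbitrary):

```python
import math

# A unitary map (a Hadamard-like rotation) acting on a two-state amplitude pair.
def hadamard(a, b):
    s = 1 / math.sqrt(2)
    return s * (a + b), s * (a - b)

a, b = 0.6, 0.8j                    # |a|^2 + |b|^2 = 0.36 + 0.64 = 1
a2, b2 = hadamard(a, b)
total = abs(a2)**2 + abs(b2)**2
print(round(total, 10))             # 1.0 -- probabilities still sum to one

# A non-unitary "evolution" (just damping one amplitude) breaks this:
a3, b3 = a, 0.5 * b
print(round(abs(a3)**2 + abs(b3)**2, 10))  # 0.52 -- no longer a valid distribution
```

Unitary evolution preserves the norm of the state no matter what, which is why information loss into a black hole is in tension with quantum mechanics rather than just inconvenient.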
posted by Johnny Assay at 6:57 PM on June 20, 2018 [17 favorites]


Response by poster: I think I might need a live-in physicist who can explain things to me like I'm Amelia Bedelia to figure this one out, y'all.
posted by runcibleshaw at 7:11 PM on June 20, 2018


I am nowhere near as qualified as you in physics, and I'm hardly qualified to talk about math either, but the one thing I would suggest is that you consider reading a bit about information theory (if you already have done this, sorry, but I noticed it wasn't mentioned in your question). It is a field of math, so things can get a bit equation heavy if you go deep into the subject, but the basic concepts are fairly intuitive.

So in "pure math" information theory, information is measured in bits. Like the bits in a computer. Just a simple two state value. If that seems strange, then think of a bit not as the state of a transistor, but simply as the answer to a yes or no question. So if I ask someone, "Did you eat a sandwich for lunch?" they can encode their answer in just one bit of information, since it is a single yes or no question. For another example, in the game of 20 questions, one player is trying to guess what the other player is thinking of with at most 20 bits of information. Again, sorry if this is way too basic since I'm not sure what you know about this topic. Basically, my point is that bits are often discussed in the specific context of computers, so it can be easy to overlook how they can actually be used to describe almost anything (with a sufficiently large number of bits).

So, when physicists talk about information, what do they mean?

Following up on the previous paragraph, my understanding is that when physicists talk about information they are quite literally talking about the same information theoretic idea of information, but applied specifically to a physical system. So physical information is just a large number of yes/no questions describing a state of a physical system (for systems that are larger than quantum scale). If you ask enough yes/no questions about a system, you can eventually get a sufficiently detailed description to do things like solve equations.

Since many classical equations are reversible (sorry if this is wrong; I think that's the case, but as I said my physics knowledge is not great), the information needed for such equations must not be lost -- otherwise, how would you be able to work backwards in time using them? If information were lost, you would have to travel back in time to retrieve it before you could solve the equation! Since such time travel is not necessary in the case of simple classical systems, at least some information in the system must have been conserved. There is probably a similar argument to be made about gaining classical information, but my overall point is that reversibility necessarily requires conservation of information (see also Johnny Assay's comment).

Quantum mechanics, as usual, screws up this picture quite significantly. At quantum scales, you can't really ask discrete yes or no questions without making a measurement, and frankly I'm not entirely sure how measurements in QM interact with physical information. But you can still ask a different sort of question without making a measurement (I'm really abusing the question metaphor here).

Specifically, rather than asking directly for a yes or a no answer to a question about a quantum system, you can ask for the probability distribution of a yes or a no answer. So as an answer, instead of receiving a discrete 0 bit or 1 bit, you get a linear combination of a 0 state and a 1 state (the states being the basis vectors of a 2D vector space, since QM loves linear algebra), where the coefficients of each state are probability amplitudes corresponding to a 0 measurement or a 1 measurement if you were to observe the system being described. Again, this is basically just encoding the probabilities of a yes or a no answer using vectors, much like how other pure quantum states are represented if my understanding is correct.
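A tiny Python sketch of that encoding (the amplitudes 3/5 and 4i/5 are arbitrary, chosen so the squared magnitudes sum to one):

```python
# A qubit as described above: a pair of complex amplitudes attached to the
# "no" (0) and "yes" (1) answers. The probability of each measurement
# outcome is the squared magnitude of its amplitude.
alpha = 3 / 5            # amplitude for measuring 0 ("no")
beta = (4 / 5) * 1j      # amplitude for measuring 1 ("yes") -- phases are allowed

p_no = abs(alpha) ** 2
p_yes = abs(beta) ** 2
print(round(p_no, 10), round(p_yes, 10))  # 0.36 0.64
print(round(p_no + p_yes, 10))            # 1.0 -- a proper probability distribution
```

Note that the amplitudes carry more information than the probabilities alone (the complex phase matters when states interfere), which is part of what makes a qubit richer than a classical coin flip.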

These probabilistic answers are quantum information, rather than classical information, and their unit is called a qubit (a quantum bit). My very vague understanding of why physicists believe that quantum information is conserved is that losing or gaining quantum information messes with unitarity (again see Johnny Assay's comment), which seems to be a widely used mathematical principle in quantum mechanics. In particular, the summing-to-one property of unitarity mentioned in that comment is important in the information theoretic interpretation of a qubit. That sum, in the case of a qubit, is the probability of a yes measurement added to the probability of a no measurement. If the sum of those probabilities is not one, then the qubit no longer represents a proper probability distribution over a yes/no answer, which makes it not very useful as a representation of information. So unitarity ensures that qubits represent meaningful quantum information, which is why it is so closely related to the idea of information conservation.

To wrap up this long and unfocused answer, my understanding is that conservation of information is not actually a fully "proven" physical principle, but it happens to play nicely with the math used in a bunch of different areas of physics, so some physicists accept it or at least rely on it as something that might be true. Also, I suppose the idea of being able to use the exact same set of yes or no questions to completely describe a system at any point in time might be appealing on a philosophical level. Again, take everything I've written with many large grains of salt, as there are almost certainly factual errors and definitely some oversimplified explanations. If there is one thing you take from this wall of text, it is that I recommend you look into the basic ideas of information theory; you will probably see quite intuitively how its concepts relate to physics, which should help you understand conservation of information. (Hopefully you don't already know all about this stuff, since I have no other suggestions).
posted by jv776 at 8:21 PM on June 20, 2018 [4 favorites]


The Feynman Lectures on Computation includes a chapter on the physics of information, with very clear explanations that should be accessible to anyone with a physics background. If I remember correctly, he doesn't talk directly about conservation of information, but does go in depth into closely related topics like reversible computation, which I think would provide a good foundation.
posted by mbrubeck at 9:23 AM on June 23, 2018


This thread is closed to new comments.