Why is entropy necessary for there to be memory?
April 21, 2011 2:06 PM Subscribe
In the From Eternity to Here episode of the Science Talk podcast, Sean Carroll says that without entropy there could be no memory. Could you provide an explanation of why that is to somebody who's very much a layperson when it comes to thermodynamics?
He says in the podcast that one of the laws of thermodynamics is that entropy increases with time, and that this is what gives time its arrow. I don't understand why one implies the other. I feel like he skipped a step without showing his work, because I can't connect the fact that entropy increases over time to the idea that we can remember the past but can't remember the future.
Entropy increases in a closed system as time goes forward. Roughly this means that disorder increases in a closed system as time goes forward.
As an example, think of watching a normal video vs. one of those trick videos recorded with everyone walking backwards and then played forwards to make you think it is real. But you can instinctively recognize the impossible parts -- a stack of papers scattered in the air that magically pops into someone's hand, an explosion of flower petals that collapses back into a flower, a puddle of water that draws itself together and forms an ice cube. You instinctively know that disordered things don't spontaneously become ordered again. This is the arrow of time. You can easily recognize the difference between a video played forwards and backwards because of the direction from ordered to disordered.
posted by JackFlash at 3:16 PM on April 21, 2011
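JackFlash's "ordered vs. disordered" intuition can be put into numbers with a toy microstate count. This is a sketch of my own, not from the thread; the 100-particle model and the choice of Boltzmann's constant k = 1 are arbitrary:

```python
import math

# Model a tiny "gas" as 100 two-state particles (heads = left half of the
# box, tails = right half). A macrostate is "how many are on the left";
# a microstate is one specific arrangement of the individual particles.
n = 100
all_left = math.comb(n, n)        # ways to have every particle on the left: 1
half_left = math.comb(n, n // 2)  # ways to be evenly mixed: about 1e29

# Boltzmann entropy S = ln(W), taking Boltzmann's constant k = 1:
entropy_gap = math.log(half_left) - math.log(all_left)
print(all_left, half_left, entropy_gap)  # 1, ~1.0e29, ~66.8
```

A randomly jostled system wanders into mixed macrostates simply because there are astronomically more of them, which is why the backwards videos look impossible: the "un-mixing" direction would require hitting one arrangement out of ~10^29.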
Most laws of physics don't care which way you run time. This is known as "time reversal symmetry"; see here for more. (Of course it's only CPT that is conjectured to be true at this point, but this is not important here.)
But clearly the world does: glass does not unshatter, the coffee that is now cold on my desk is not going to get hot again on its own, and I can only remember the past.
Why? Second law of thermodynamics.
posted by nat at 5:04 PM on April 21, 2011
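The time-reversal symmetry nat mentions can be checked numerically. A minimal sketch (the spring system, step size, and function names are my own choices): integrate a frictionless mass on a spring with velocity Verlet, which is a time-symmetric scheme, then flip the velocity and run the same law again; the system retraces its path to round-off.

```python
def verlet(x, v, dt, steps, accel):
    """Velocity-Verlet integration; the scheme is time-symmetric, so
    flipping the velocity and re-running retraces the trajectory."""
    a = accel(x)
    for _ in range(steps):
        x += v * dt + 0.5 * a * dt * dt
        a_new = accel(x)
        v += 0.5 * (a + a_new) * dt
        a = a_new
    return x, v

# A frictionless mass on a spring, with k/m = 1, so acceleration = -x.
spring = lambda x: -x

x0, v0 = 1.0, 0.0
x1, v1 = verlet(x0, v0, dt=0.01, steps=5000, accel=spring)   # run "forward"
x2, v2 = verlet(x1, -v1, dt=0.01, steps=5000, accel=spring)  # flip v, rerun
print(abs(x2 - x0), abs(v2 + v0))  # both within round-off of zero
```

Nothing in the microscopic law prefers a direction; the one-way behavior of shattering glass and cooling coffee only appears when you coarse-grain over many degrees of freedom.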
Entropy basically means that things will keep going until they are stopped, and keep sitting until they are shoved. This is water seeking its own level, chemical reactions proceeding until completion, planets orbiting a star (at least on a short time scale). Entropy only goes in one direction, and that direction is toward the most stable state.
Put a boat in the water and over time, it will rust away. If you put a rusty boat in the water, it will not under any circumstances un-rust.
So, information theory says that in order to store information (bits on a chip, neurons in a brain), you have to change something, and then it has to want to basically stay that way. If entropy didn't work the way it does, the proteins in your neurons or the silicon on the chip would tend to uncombine into their more volatile forms.
posted by gjc at 6:43 PM on April 21, 2011
No. That is not at all what it means.
posted by secretseasons at 7:01 PM on April 21, 2011
In my professional opinion as an armchair reader of many popular physics books, the original statement makes no sense to me.
If the time arrow didn't exist, then time would be just like all the other dimensions. Which would mean there would be no memory I guess, because time would be symmetrical. I don't have some ability to only see things in one direction, why should it be different for time?
But I suspect life itself wouldn't exist in a universe without a time arrow. One of the main distinguishing characteristics of life is that it is capable of collecting energy to build low-entropy structures in our increasing-entropy universe. How would that even make sense if everything was in some constant medium-entropy state?
posted by miyabo at 9:53 PM on April 21, 2011
Sean's argument is that, without an arrow of time, you'd have no reason to suspect that your current mental state had anything to do with actual events that had occurred to your physical body at some other moment in its existence (talking without referencing time is damn hard). At one moment, you might have a mental state that implied you had just eaten breakfast, but since there's no arrow of time, and no increase of entropy in one direction, the "next" moment you might find yourself eating that breakfast you remembered, or eating dinner, or not existing, or turning into a newt. "Life" would be just a collection of moments with no narrative flow, like a movie cut up into stills and then moved around out of order.
Basically, he's saying that "things had lower entropy in the past" is how we DEFINE the past. If we evolved in a Universe where entropy increased to "the future," we'd remember things that "hadn't happened" because the state of our brains would be connected to things that happened in "the future" rather than "in the past."
It's a reasonable argument, though one might point out it's hard to test. However, if the arrow of time reversed today, how would you notice?
IAAP, IANSC (I am a physicist, I am not Sean Carroll). Or Scott Adams, for that matter.
posted by physicsmatt at 7:24 AM on April 22, 2011 [2 favorites]
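The "movie cut up into stills and moved around out of order" picture can be made concrete with a toy diffusion model of my own (all parameters made up): photograph a spreading gas at a few moments, shuffle the photos, and sorting them by coarse-grained entropy recovers the chronological order -- entropy is doing exactly the work of a timestamp.

```python
import math
import random

def coarse_entropy(positions, n_cells):
    """Plug-in Shannon entropy of the cell-occupancy distribution."""
    counts = [0] * n_cells
    for x in positions:
        counts[x] += 1
    n = len(positions)
    return -sum(c / n * math.log(c / n) for c in counts if c)

def film_strip(snap_times=(0, 50, 200, 800), n_particles=300, n_cells=120, seed=3):
    """Random-walk a 'gas' that starts in one corner of a 1-D box and
    photograph it at the given times; returns (time, entropy) pairs."""
    rng = random.Random(seed)
    pos = [0] * n_particles
    frames, t = [], 0
    for target in snap_times:
        while t < target:
            for i in range(n_particles):
                pos[i] = min(n_cells - 1, max(0, pos[i] + rng.choice((-1, 1))))
            t += 1
        frames.append((target, coarse_entropy(pos, n_cells)))
    return frames

frames = film_strip()
stills = frames[:]
random.Random(7).shuffle(stills)  # cut the film into stills, out of order
# Sorting the shuffled stills by entropy puts them back in time order:
recovered = [t for t, s in sorted(stills, key=lambda f: f[1])]
print(recovered)
```

This is the sense in which "lower entropy" defines "earlier": given only the stills, the entropy gradient is the only arrow available.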
Here's wikipedia's take on entropy and the arrow of time.
My basic, and possibly flawed, interpretation follows from the fact that in a closed system entropy will always increase. (This is the Second Law of Thermodynamics you referenced.) It is this movement toward ever greater entropy that is referred to as the arrow of time. So how does this connect to memory?
First off, let's get into how entropy enables our perception of time. We don't live in a closed system - or, I should say, we observe many open systems in our environment. Entropy is increasing or decreasing in infinite ways around us. We see constant change.
Which is good, since our perception of time is contingent on detecting change. That's what separates one moment from the next; if we could detect no difference in our environment how can we say time is passing? So entropy is our constant, irrefutable clock that tells us something is different. Our brains can then process (and store) these moments in our memory. We can look back on the "then" that is different than the "now."
posted by m@f at 2:32 PM on April 21, 2011 [1 favorite]