Can someone please explain the Second Law of Thermodynamics to me, with examples?
February 5, 2008 8:25 AM

Can someone please explain the Second Law of Thermodynamics to me, with examples?

I'm reading The User Illusion: Cutting Consciousness Down to Size, trying to understand information theory and entropy. For this I need to understand the second law of thermodynamics, which I understand to be:

"Energy spontaneously tends to flow only from being concentrated in one place to becoming diffused or dispersed and spread out."

And an example of this is that a rock will always fall if you let go of it at a great height, or that high pressure will always leak into low pressure.

But this is where I get confused: surely that is Newton's theory of cause and effect in action, not the second law.

So my question really is: what is the difference between the second law of thermodynamics and Newton's law of cause and effect?

posted by complience to Science & Nature (29 answers total) 3 users marked this as a favorite
The rock is not really a good example. It falls because a force is applied to it, which happens to be gravity. The rock will rise if you apply an upward force to it. Those are Newtonian effects, not Thermodynamics. But the second law would say, if that rock is hot, and you put it in ice water, heat will flow from the hot rock into the ice water until equilibrium is established.
posted by beagle at 8:33 AM on February 5, 2008


I think the "spontaneously" in your definition is the point of difference. Spreading out, without cause, is a quality of energy.

Your high pressure example is better than your rock example.

Fill your bathtub with hot water. Climb in. Wait a couple of hours. Where did all the hot go? It spread out into you, into the air in the room, into the air outside your house, into the atmosphere, into space.
posted by notyou at 8:36 AM on February 5, 2008 [1 favorite]

The Three Laws of Thermodynamics, as explained by my high school physics teacher who probably stole it from somewhere: You can't win, you can't break even, and you can't quit playing.
posted by BitterOldPunk at 8:38 AM on February 5, 2008 [5 favorites]

But the second law would say, if that rock is hot, and you put it in ice water, heat will flow from the hot rock into the ice water until equilibrium is established.

With this in mind, can you explain to me why the rock and the ice water move into equilibrium by temperature?
posted by biffa at 8:41 AM on February 5, 2008

Simple explanations of the laws of thermodynamics (cribbed from here):

1st law: You cannot win (that is, you cannot get something for nothing, because matter and energy are conserved).

2nd law: You cannot break even (you cannot return to the same energy state, because there is always an increase in disorder; entropy always increases).

3rd law: You cannot get out of the game (because absolute zero is unattainable).

In other words, the second law says entropy increases, and I like Wikipedia's definition of entropy here: "entropy is the unavailability of a system's energy to do work." It's not really limited to position, like in your example.

You might want to consider whether the problem lies in a perfectly reasonable confusion about entropy, which happens to be a very confusing subject.

Your specific question seems directed at entropy in the information theory context, and for that it might be worth delving into Shannon entropy. There's an example there that might shed some light on your question.
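As a quick illustration of Shannon entropy (my own toy numbers, nothing from the book): a fair coin carries a full bit of entropy per flip, while a predictable, biased coin carries less.

```python
import math

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p)), skipping impossible outcomes."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

print(shannon_entropy([0.5, 0.5]))  # fair coin: 1.0 bit per flip
print(shannon_entropy([0.9, 0.1]))  # biased coin: about 0.47 bits per flip
```

The biased coin is more "ordered" (more predictable) and correspondingly carries less entropy, which is the same ordered-versus-disordered intuition that runs through the thermodynamic answers below.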

As a side note, I am disturbed by how many first page Google search results for "explanation of second law of thermodynamics" go to pages about evolution or intelligent design.
posted by Pastabagel at 8:49 AM on February 5, 2008 [1 favorite]

Re: your rock example

Try to wrap your mind around this: There is no such thing as "cold." There is only heat (energy) and the lack thereof. That's how our teacher explained it. There is also no such thing as "dark" - just light (energy) and the lack thereof. In the same way (as our teacher liked to say), nothing "sucks" - there is only pressure and lack of pressure, with high pressure "pushing" into areas of low pressure.

Energy flows from the hot rock (full of energy) into the cold water (less energy, but still a lot of heat - well above absolute zero).

Only energy, not the lack of energy, can flow. You can imagine a beam of heat or light (microwave beam, flashlight, or laser), but what about a beam of dark? A beam of cold? That doesn't make sense.
posted by mamessner at 8:50 AM on February 5, 2008 [3 favorites]

It's all about probability. All it says is that systems tend to move into more probable states.

If I had a line of dice all with the number 6 face up, and step by step I picked one up, threw it, and replaced it in line with whatever value it landed on, the fact that you end up with a disordered random sequence is essentially the second law at work on a macroscopic scale.

Now, on the microscopic scale you not only have things like the positions of particles, but also the distribution of energy in your system. Rather than worrying about those details, you talk about them in terms of macroscopic properties like temperature and entropy and formulate the same rules in this new language, which is what ends up making it seem a lot more complicated than it really is.

In the language of statistical mechanics, the precise positions and states of all your particles (or the precise ordering of the numbers on your dice) is called the microstate. Generally, we worry day to day about the macrostate - the pressure and temperature of the gas, or the total number of 1s, 2s, 3s and so on of our dice (ignoring the ordering). The second law says basically that you move to the macrostate with the most microstates within it. Going back to the dice, there's only one microstate with all the dice showing a 6, so that's unlikely. There's an awful lot where the numbers of each value showing are about equal, so those macrostates are more likely.

Entropy is directly related to the number of microstates in the macrostate. If you have some number of such microstates W, then the entropy S is
S=k ln W

Because thermodynamics came about to talk about heat transfer in steam engines or what have you, the language is a bit obfuscating, but fundamentally all it is saying is that probable stuff happens. It's taken so seriously as a law because of the scale of the systems involved. You're not actually rolling a short line of dice; you've got bazillions of bazillions of bazillions of atoms jostling about, and the odds of getting anything unlikely are dramatically smaller as a result.
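The dice picture above can be checked in a few lines (a toy sketch, with Boltzmann's k set to 1): count the orderings of the dice (microstates) consistent with a given tally of faces (the macrostate), then take S = k ln W.

```python
import math
from math import factorial

def microstates(counts):
    """Number of orderings of the dice consistent with a tally of faces: n! / (c1! * c2! * ...)."""
    w = factorial(sum(counts))
    for c in counts:
        w //= factorial(c)
    return w

def entropy(counts, k=1.0):
    """Boltzmann's S = k ln W, with k = 1 for illustration."""
    return k * math.log(microstates(counts))

all_sixes = [0, 0, 0, 0, 0, 6]  # six dice all showing 6: exactly one ordering
even_mix = [1, 1, 1, 1, 1, 1]   # one die showing each face: 6! = 720 orderings
print(microstates(all_sixes), microstates(even_mix))  # 1 720
```

The evenly mixed macrostate has 720 times as many microstates, so random rolling drifts toward it, and that head start only grows with more dice.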
posted by edd at 8:53 AM on February 5, 2008 [2 favorites]

I would default to Flanders and Swann's First and Second Law song (as opposed to the "Glory Glory, dear old Thermo" I learned several years ago...) here

"The second law of thermodynamics
Heat cannot of itself pass from one body to a hotter body (x 2)
Heat won't pass from a cooler to a hotter (x 2)
You can try it if you like but you far better notter (x 2)
'Cause the cold in the cooler will be hotter as a ruler (x 2)
Because the hotter body's heat will pass through the cooler"

Also, Lord Kelvin's statement of the 2nd law of thermo, "It is impossible to convert heat completely into work," helped me out greatly in undergrad. As a side note, if you think the 2nd law is trippy, you shouldn't ever look into exergy.

The rock falling isn't really an example of entropy "at work", if you will. It's, as you stated, an F=m*a thing. The force of gravity acting on a mass results in an acceleration of the rock, or some such business. The pressure is an equilibrium thing, which has been mentioned earlier.
posted by conradjones at 8:56 AM on February 5, 2008 [1 favorite]

Fill your bathtub with hot water. Climb in. Wait a couple of hours. Where did all the hot go? It spread out into you, into the air in the room, into the air outside your house, into the atmosphere, into space.
posted by notyou at 11:36 AM on February 5

The corollary to this is to think about how to get all that dissipated heat back into the bathwater, i.e., how could the interconnected molecules of the water, your body, the tub, the air, etc. collide and interact in such a way as to restore the heat to the bath water?
posted by Pastabagel at 8:58 AM on February 5, 2008

Imagine a pool table with no friction, so that when a ball starts moving it never stops. Now imagine the table is full of slow-moving balls. Obviously some will be moving faster than others at any given time, but the overall average speed will be constant because they never slow down.

Now imagine we put a bunch of fast-moving balls onto the table. They will collide with the slower-moving ones and speed them up. The average speed will end up somewhere between that of the slower balls and the faster balls, depending on how many there are from each group.

If the pool balls are molecules, then the average speed is the temperature, and the tendency of the system to go towards its most likely state is entropy.
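This mixing can be sketched with a deliberately crude toy model (not real billiard dynamics): pick two balls at random and let them share their energy equally.

```python
import random

random.seed(0)
balls = [10.0] * 50 + [2.0] * 50  # 50 "fast" balls and 50 "slow" balls

for _ in range(100_000):
    # Crude collision: two random balls end up with equal shares of their combined energy.
    i, j = random.randrange(100), random.randrange(100)
    balls[i] = balls[j] = (balls[i] + balls[j]) / 2

print(sum(balls) / len(balls))  # energy is conserved, so the average stays at 6.0
print(max(balls) - min(balls))  # the spread collapses: every ball ends up near 6
```

The total energy never changes; the only thing that happens is that the difference between fast and slow disappears, which is the equilibration the second law describes.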
posted by euphorb at 9:02 AM on February 5, 2008 [1 favorite]

euphorb's explanation doesn't really get to the core of it, as it's exactly the second law itself that makes the fast balls speed up the slow ones, and slow down themselves. So it seems like a bit of a circular argument.

I think it might be more useful to point out that just as the balls end up spacing themselves out roughly evenly over the table just by chance, the energy spreads itself roughly evenly amongst the balls just by chance.
posted by edd at 9:07 AM on February 5, 2008

The basic idea of the second law is that highly ordered states are a lot less likely to happen than less ordered states. For example, suppose I had a box with a low wall dividing the bottom in half. I drop four marbles into the box, put the lid on, shake it up, and then open the lid and see where the marbles have settled. Let's assume for simplicity that each individual marble has an equal probability of landing on either side. Then probability theory tells me that I'll end up with all the marbles concentrated on (say) the left-hand side one time out of sixteen, while I'll end up with two marbles on each side six times out of sixteen.

Roughly speaking, this is because to get a highly "ordered" state (with all marbles on one side), all the individual probabilities have to line up; while if only one choice gets messed up, the resulting configuration is immediately less ordered.

The Second Law of Thermodynamics is basically this principle, writ large. While in theory it would be possible to predict the individual motions of the 10,000,000,000,000,000,000,000,000 gas molecules in my office right now, given knowledge of all their initial velocities and positions, in practical terms it would be impossible to get all this information — and besides, we don't really care about where molecule number 2,076,276,255,591,174,265,927,232 is at any given time. Rather, what we do is we assume that the internal dynamics of a system causes it to sample all of its "internal microstates" with equal probability. (In fancy words, this is called the "ergodic hypothesis".) We then ignore the micro-properties of the system that we don't care about. What we then observe, when we ignore the fine properties of the system, is that the dynamics will always tend to drive the system towards more "disordered" states: since the disordered states are more likely, a given system will tend to evolve from an ordered state to a disordered state, and evolve from a given disordered state to another disordered state that looks much the same to us.
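The marble counts above are just binomial coefficients, which a few lines of Python (3.8+ for `math.comb`) will confirm:

```python
from math import comb

n = 4            # marbles, each equally likely to land on either side
total = 2 ** n   # 16 equally likely microstates

for k in range(n + 1):
    # comb(n, k) microstates put exactly k marbles on the left-hand side
    print(f"{k} on the left: {comb(n, k)}/{total}")
```

All marbles on one side is the lone 1/16 case, while the even split claims 6/16, matching the numbers quoted above.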
posted by Johnny Assay at 9:08 AM on February 5, 2008

The second law says that any differences in temperature, pressure, or density will tend to even out. Entropy is a measure of how far this equalization process has progressed. It may make more sense to think in terms of motion. The molecules are moving around more actively in hot objects, and will transfer that motion to cooler objects until the motions are equalized. Molecules that are moving with equal energy have no differential energy to transfer.
posted by weapons-grade pandemonium at 9:10 AM on February 5, 2008 [1 favorite]

Aaaaand edd said pretty much the same thing as me. Gotta type faster.
posted by Johnny Assay at 9:10 AM on February 5, 2008

Let's define a couple of things. Entropy can be defined in several ways. The basic definition has to do with the number of available states of a system, i.e., how many different ways some amount of energy can be distributed in an object that can hold smaller bits of energy in different ways. (It actually goes as the natural logarithm, because of the large number of states that usually exist.) This can be rewritten using the definition of temperature to get the usual definition from chemistry, which relates the amount of heat change per total temperature to the amount of entropy change. This is then a measure of the amount of energy needed to reverse an irreversible process. There are related definitions of entropy for information theory, where the states are not physical but relate to the entry and loss of information.

The second law of thermodynamics can also be stated in several ways. A basic formulation is Lord Kelvin's: it is impossible to convert heat completely into work. This says that when you store energy in a collection of smaller states and remove it in aggregate, some of the energy is rendered useless for performing actions. This is the historical basis for the second law, which stems from Carnot's work on the efficiency of steam and heat engines. Carnot found that there is a physical limit on efficiency when trying to use a temperature difference to perform work (a process that occurs in diverse systems, including the engine in your car). Some of your confusion may stem from the extension of this law to other types of phenomena, such as biological systems. These other statements of the law treat the components of their system the way a physicist treats the states in his system, i.e., they ignore the finer details and make a statement about global behavior.
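Carnot's limit has a famously simple form: the best possible efficiency of a heat engine running between a hot and a cold reservoir is 1 - T_cold/T_hot (temperatures in kelvin). A sketch with illustrative numbers:

```python
def carnot_efficiency(t_hot, t_cold):
    """Upper bound on the fraction of heat a cyclic engine can convert into work."""
    return 1 - t_cold / t_hot

# Illustrative numbers: combustion at ~1000 K, exhaust dumped at ~300 K.
print(carnot_efficiency(1000.0, 300.0))  # 0.7: at most 70% of the heat can become work
```

This is Kelvin's statement in number form: since the cold reservoir can never be at absolute zero, the efficiency never reaches 1, so heat can never be completely converted into work.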

on preview, Johnny Assay says it well.
posted by apathy0o0 at 9:11 AM on February 5, 2008

Another way to think of this is that a system will randomly walk through its state space (considering every microscopic arrangement of particles, etc., as a distinct state). Each macroscopic state — something that we, as human-sized observers, would think of as a distinct state — corresponds to a vast number of different, but basically indistinguishable, microscopic states. But the macro-states which are "high entropy" (thermal equilibrium, e.g.) correspond to vastly more microscopic states than the states which we consider "low entropy". There are far more ways for the gas to fill the entire box than for it to fill only half the box. So as the system randomly walks through its microscopic state space, without any regard for entropy, the odds are very good that at any given time you'll find it in a "high entropy" state, simply because there are so immensely many more of those than "low entropy" states. If you start a system out in a low-entropy state, then come back later and look at it, odds are extremely good that it will have blindly wandered out of the low-entropy oasis into the trackless high-entropy desert that surrounds it. This is another way of saying that entropy increases over time.

See also the cosmicvariance post linked from this earlier MeFi fpp.
posted by hattifattener at 9:24 AM on February 5, 2008 [1 favorite]

The Three Laws of Thermodynamics, as explained by my high school physics teacher who probably stole it from somewhere: You can't win, you can't break even, and you can't quit playing.

I believe it was Asimov who first said that. [citation needed]
posted by L. Fitzgerald Sjoberg at 9:32 AM on February 5, 2008

Two people are in rooms separated by a wall with a slot at the top. One room is full of potatoes, and each room has a guy who hates potatoes. The guy with all the potatoes can grab one and throw it over the wall, but the other guy has to chase each one down to throw it back. Eventually both sides will have about the same number of potatoes; equilibrium happens on its own.
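This is essentially the Ehrenfest urn model, and it's easy to simulate (a toy sketch with made-up numbers): each step, a potato chosen at random gets thrown over the wall, so the side with more potatoes is proportionally more likely to lose one.

```python
import random

random.seed(1)
N = 100
left = N  # all the potatoes start on one side

for _ in range(100_000):
    # A random potato gets hurled over the wall; the fuller side
    # is proportionally more likely to lose one.
    if random.randrange(N) < left:
        left -= 1
    else:
        left += 1

print(left)  # hovers near N/2 = 50, with random fluctuations
```

The count never locks at exactly 50; it wanders around it, which is what equilibrium looks like up close.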
posted by StickyCarpet at 9:37 AM on February 5, 2008 [1 favorite]

what is the difference between the second law of thermodynamics and newtons law of cause and effect?

None, really, if you define Newton's cause and effect that broadly. Generally, Newton's laws are used for macro-scale physical interactions: rocks, planets, etc. The laws of thermodynamics describe the exact same physical interactions, except in regards to heat. I think classically you would be corrected if you said "it's all the same thing," because when we talk about Newton we talk about his specific work and findings. I don't believe Newton was really working with heat transfer and entropy, but yes, at some point it's all physics. That is to say, it is all atoms bumping around and being affected by the forces of the universe (gravity, electromagnetism, etc.).

I think it's worth pointing out that your question is less a physics question and more an epistemology question.
posted by damn dirty ape at 10:07 AM on February 5, 2008

So to bring it back to your original question, "what is the difference between the second law of thermodynamics and newtons law of cause and effect?".

In a closed system, Newton's laws apply to the behavior of individual things within that system (motion) and the second law describes the behavior of the whole system (heat and temperature).
posted by euphorb at 10:10 AM on February 5, 2008

An aside: Lambert's sites are well-intentioned, but there are deficiencies in calling entropy a measure of "energy dispersal."

The most fundamental definition of entropy that I know of is S = -k Σ(p_i ln p_i), where p_i is the probability of the system being in microstate i (a microstate is an assignment of a quantum state to each particle that is compatible with macroscopic observables like temperature and pressure). If all microstates are equally probable, this reduces to the familiar S = k ln W, where W is the number of microstates. The best description that I've heard of the 2nd law is that entropy (i.e., the number of accessible microstates) tends to increase, and that entropy is maximized when a system is at equilibrium.
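The two definitions agree whenever the microstates are equally probable, which a few lines will confirm (W = 8 is an arbitrary illustrative choice):

```python
import math

k = 1.380649e-23  # Boltzmann constant, J/K

def gibbs_entropy(probs):
    """S = -k * sum(p_i * ln(p_i)) over microstate probabilities."""
    return -k * sum(p * math.log(p) for p in probs if p > 0)

W = 8
uniform = [1 / W] * W
print(gibbs_entropy(uniform))  # equals k * ln(W) when all microstates are equally likely
print(k * math.log(W))
```

For non-uniform probabilities the general formula gives a smaller value, which is why equilibrium (equal probabilities) is the entropy maximum.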

Consider two rings of pure material spinning in opposite directions at a very low temperature, arbitrarily close to absolute zero. The system velocity and angular momentum are zero; the only important number is the rotational speed. There are very few possible microstates (tending to one as we approach absolute zero) that are compatible with the system, because random atomic motion is nearly eliminated due to the low temperature. Each atom is pretty much limited to following its circular path with essentially no thermal vibration. The entropy is very low, almost zero.

If the rings are now brought into contact, they will eventually slow each other to a stop by friction. Now the rotational speed is zero and the material is hotter, say at some temperature T well above absolute zero. There is now a huge number of possible microstates, because the random thermal energy could be apportioned to the particles in an enormous number of combinations without us ever knowing the difference. (It doesn't matter whether atom #45,391,567,958,... is going fast and #198,562,994,261,... is going slow or vice versa, as long as the energies add up to put the bulk material at temperature T.)

And this is where I have a problem with Lambert's promotion of "energy dispersal." The energy isn't more dispersed after we connect the rings. The energy didn't go anywhere; the system is closed. Neither has the energy spread out (the distribution of energy among the particles did, but Lambert isn't this precise). The average energy of the particles is still the same. I think the dispersal definition falls short here, while the microstates definition explains the spontaneity of the process with no problems.
posted by Mapes at 10:13 AM on February 5, 2008 [1 favorite]

As I understand it and how it applies to consciousness---you can never get more organized---systems will always spread into chaos. Like how smells will disperse in a closed space---you can't contain it once you let it go. I see cause and effect as a different issue.
posted by hulahulagirl at 10:24 AM on February 5, 2008

While in theory it would be possible to predict the individual motions of the 10,000,000,000,000,000,000,000,000 gas molecules in my office right now, given knowledge of all their initial velocities and positions, in practical terms it would be impossible to get all this information . . .

In theory as in practice it is not possible to know the initial velocities and positions at the same time.
posted by Neiltupper at 1:23 PM on February 5, 2008

In theory as in practice it is not possible to know the initial velocities and positions at the same time.

Momentums, not velocities.

not that it relates to the question...just sayin'....
posted by cabingirl at 2:10 PM on February 5, 2008

Man, you guys make it hard.

Frictionless pool table. Balls bouncing around, as per euphorb.

Now put another one next to it, join them magically together, and take out the dividing wall. What do you think the balls will do?

That's an analogy for putting a cold thing next to a hot thing. Second law in action.

Now take a pool table with 10 bouncing balls and put it next to a pool table with 2 bouncing balls. Does it seem likely that when you take the wall out, the balls will move from the 2-ball table to the 10-ball table, or vice versa? If the two balls move to the first table without any of the ten balls moving to the second, congratulations--you've just decreased the entropy of the universe. But it's hardly very likely, is it? Not likely at all -- and in real systems, the number of "balls" is pretty close to immeasurable.
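You can make "not likely at all" quantitative: if each ball is equally likely to end up on either table, the chance of finding all n of them on one specified table is (1/2)^n, and for molecule-sized n even the logarithm of that probability is astronomical (illustrative counts):

```python
import math

def log10_prob_all_on_one_side(n):
    """log10 of the chance that all n independent balls sit on one specified side."""
    return -n * math.log10(2)

print(log10_prob_all_on_one_side(12))        # about -3.6: one chance in ~4000
print(log10_prob_all_on_one_side(6.022e23))  # about -1.8e23: never happens in practice
```

Twelve balls all on one side is merely unlikely; a mole of molecules all on one side has a probability with roughly 10^23 zeros after the decimal point, which is why the second law looks like an iron rule.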

For followup, read up on Maxwell's Demon. He's my kind of guy.
posted by darksasami at 2:25 PM on February 5, 2008

I think it's worth noting that classical thermodynamics was already a highly developed theory in the early 1800s, before even the existence of atoms and molecules was generally accepted by scientists (Einstein's 1905 paper on Brownian motion was instrumental in converting the remaining skeptics). The whole theory can be derived very satisfyingly from the laws of thermodynamics regarded as first principles (justified by a combination of reason and observation), the way Newtonian dynamics is derived from Newton's laws, or even Euclidean geometry from Euclid's postulates.

I've never gotten around to really trying to read it, but several of my friends swear by A. B. Pippard's Elements of Classical Thermodynamics for a particularly elegant exposition of this kind. From the Amazon reviews:

'Dr Pippard's book, whilst paying adequate attention to technique, is particularly to be recommended for providing the reader with an understanding of thermodynamics. Many awkward points, which are glossed over in other treatises, are discussed clearly and comprehensively ... anyone with a rudimentary knowledge of thermodynamics cannot fail to derive benefit and stimulus from its pages.' Philosophical Magazine

'If there possibly exist students who have at one time felt thermodynamics to be a somewhat dry and uninspiring subject, this book is to be recommended to them for refreshment.' American Institute of Physics

'There can be no hesitation in recommending this book to all undergraduates and postgraduates interested in thermodynamics, and many users of more advanced thermodynamics might well find pleasure in a study of this well-written account.' Nature

Speaking of Newton's laws and the second law of classical thermodynamics, there is at least one very fundamental incompatibility between them. Newton's laws are perfectly happy if you run the film backwards; that doesn't violate them, so they cannot give you a way to account for the direction of time. But running the film backward directly violates the second law, and you will often see entropy referred to as time's arrow.

By the way, I think a 1747 aphorism of Samuel Johnson's captures a great deal of the essence of the second law, and points to the weird dark clouds of moral implication that seem to have trailed it from its coal-age birth: all change is of itself an evil. Johnson was talking about change in language for the purposes of his dictionary rather than heat flow and useful work, but perhaps that only brings us back to the mysterious and deep connections between thermodynamics and information theory-- and ethics.
posted by jamjam at 7:09 PM on February 5, 2008

In theory as in practice it is not possible to know the initial velocities and positions at the same time.

Yeah, yeah, I knew somebody was going to bring up the quantum thing. To an extremely good approximation, the gas molecules in your typical office act just like little billiard balls — the quantum effects are several orders of magnitude smaller than anything we care about, especially when we're averaging over the "microstates" of the system. So this cavil doesn't really address my main point.

For some thermodynamic systems, of course, it is necessary to take quantum effects into account; this usually happens when they're a whole hell of a lot colder than any temperatures we normally experience. (The best-known example of this is the Bose-Einstein condensate.) But even when we have to take quantum mechanics into account, the basic principle remains the same: we're more likely to see something happen if that observation corresponds to a high-entropy macrostate of the system — a state that has a large number of microstates corresponding to it — and a low-entropy macrostate will still tend to evolve to a high-entropy macrostate.
posted by Johnny Assay at 12:17 PM on February 6, 2008

Just posted this related question.

And complience, you spelled “entropy” wrong in the tag name of this post. You might add “secondlawofthermodynamics” too.
posted by XMLicious at 4:55 PM on February 6, 2008

Response by poster: Wow, fantastic replies, great reading. Thanks to everyone.

From what I'm gathering (and please tell me if I'm wrong):

The 2nd law of thermodynamics is simply this: "energy moves about."

& of course we all know that when stuff moves, Newton gets involved.
posted by complience at 4:28 PM on February 8, 2008
