

Help me Understand Boltzmann's entropy formula: S = k log W?
July 30, 2009 6:54 AM   Subscribe


The 2nd Law of Thermodynamics is a concept that always holds me at bay. Sometimes I think I get it... then realise I don't.

I'm reading Hans Christian von Baeyer's book "Warmth Disperses and Time Passes". It's a great book on the subject, but I'm struggling with how it tries to explain that entropy is the logarithm of probability.

From page 106


Whenever you multiply two integers, the numbers of their respective digits add.

eg: 60 x 600 = 36,000

so two digits plus three digits equals five digits (the rule sometimes misses by one digit, as in 3x3 = 9, but that's a negligible error in view of the vastness of the number of molecules in a gas.)

So Boltzmann made the bold, inspired guess that entropy equals the number of digits of the corresponding probability



Can someone explain this in even simpler terms?
posted by complience to Science & Nature (16 answers total) 4 users marked this as a favorite
 
Someone recommended this book to me once, but I couldn't find it. Now I'm glad, because boy is that an unhelpful explanation.

Roughly speaking, the log (base 10) of a number tells you how many digits it has (less 1).

log 1 = 0
log 10 = 1
log 100 = 2
log 1000 = 3

etc. What's really going on is powers.

1 = 10^0
10 = 10^1
100 = 10^2
1000 = 10^3

The log is just telling you what power you have to raise 10 to in order to get your number. Logs are very helpful in many areas, especially when you are dealing with a very large span of sizes, because they compress it down to something manageable. There are other useful features too.
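If it helps to see this numerically, here's a rough Python sketch of the same relationship (nothing Boltzmann-specific, just digit counts versus log base 10):

```python
import math

# Compare each number's digit count with its base-10 logarithm.
for n in [1, 9, 60, 600, 36_000, 999_999]:
    digits = len(str(n))                  # how many digits n has
    log10 = math.log10(n)                 # the power 10 must be raised to
    print(f"{n:>7}: digits = {digits}, log10 = {log10:.3f}, floor(log10) + 1 = {math.floor(log10) + 1}")
```

The last column always matches the digit count, which is why "number of digits" is a serviceable stand-in for the logarithm when the numbers get huge.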
posted by DU at 7:03 AM on July 30, 2009


(Or maybe I'm explaining too much log and not enough entropy.)
posted by DU at 7:08 AM on July 30, 2009


More simple terms? Erm... well no but...

The W is the number of microstates in the macrostate in question. A microstate is the exact arrangement of everything, broadly speaking, while the macrostate is the collection of states with broadly the same properties. In thermodynamics these will be things like pressure and temperature and so on.

If I have ten coins I might have as a macrostate the number of heads, and the microstate is the precise ordering of heads and tails.
HHHHHHHHHH is the only microstate with the ten-head macrostate, so W is 1.
HHTTHTHTTH is one of many microstates with the five-head macrostate, so W is much larger.

All the microstates are equally likely, but as there are many more microstates that fit the five head macrostate the probability is larger - it's W divided by the number of microstates. So W directly gives you the probability of the macrostate. It's more precisely defined than my example suggests, but the gist of it is the same.
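Here's a quick Python sketch of exactly that counting for ten coins (purely illustrative; the multiplicities are just binomial coefficients):

```python
from itertools import product
from collections import Counter
from math import comb

# Enumerate every microstate (exact H/T sequence) of ten coin flips,
# then group them into macrostates by the number of heads.
microstates = list(product("HT", repeat=10))          # 2**10 = 1024 sequences
W = Counter(seq.count("H") for seq in microstates)    # multiplicity of each macrostate

for heads in range(11):
    prob = W[heads] / len(microstates)                # each microstate equally likely
    print(f"{heads:>2} heads: W = {W[heads]:>3} (= C(10, {heads}) = {comb(10, heads)}), P = {prob:.4f}")
```

The ten-head macrostate really does have W = 1, while the five-head macrostate has W = 252, so it is 252 times as likely.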

log W is then the log of the probability - as your text says it's roughly like the number of digits - but I wouldn't say it the way they did as probabilities are always scaled between 0 and 1 and the 'number of digits' is therefore confusing. W is scaled up from the probability though, so it makes more sense.

Looked at this way, the 2nd law simply says W will increase - if I start flipping coins I will move to a more probable sequence from a less probable one. That's all it is. If you do something, you'll probably end up in a probable state.
posted by edd at 7:09 AM on July 30, 2009 [2 favorites]


It's worth remarking that, in a sense, the logarithm that shows up is a bit of a red herring. As edd says, rising entropy simply means that you're moving from a less probable state to a more probable one. You can describe this in terms of W, the number of microstates, or S = k_B log(W), the entropy; since log(x) is an increasing function, they increase with one another.

I believe that the reason that we use the entropy instead of the multiplicity of states is partially because entropy was a measure of something historically; it turns out these are the same thing.
posted by vernondalhart at 7:14 AM on July 30, 2009


Any configuration that involves more microstates is more likely to be observed (all microstates being equally probable). From there, it's a short conceptual jump to the Second Law (total entropy tends to increase, and all spontaneous processes increase entropy). At this point, though, we could connect the entropy concept to microstates in a variety of ways; it's only necessary that entropy increases monotonically with the number of microstates. So why use a logarithm?

The easiest way to see why the function connecting S and W has to be a logarithm is to consider two equivalent amounts of an ideal gas. Assume each has entropy S. Now for entropy to be useful as an extensive state variable like volume or energy, it should be additive; it's convenient if the total entropy in the system is 2S. So put the containers together and remove a partition. Gas atoms are indistinguishable, so the system looks just as when the containers were separated. Thus, the entropy should still be 2S.

If you compare the possible microstates before and after the partition is lifted, however, you'll see that they've increased multiplicatively (you can prove this to yourself with pennies or buttons). To reconcile the multiplicative combination of microstate counts (W) with the desired additive property of a state variable (S), we must have S proportional to log(W).
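Written out as a functional equation (the standard textbook framing of this step, not a quote from the book):

```latex
\begin{align*}
W_{12} &= W_1 W_2   && \text{(independent subsystems: microstate counts multiply)} \\
S_{12} &= S_1 + S_2 && \text{(entropy is extensive: it should add)} \\
\Rightarrow\quad S(W_1 W_2) &= S(W_1) + S(W_2), \\
\text{and for monotonic } S \text{ this forces}\quad S(W) &= k \log W .
\end{align*}
```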
posted by Mapes at 7:52 AM on July 30, 2009 [3 favorites]


Mapes: Gas atoms are indistinguishable, so the system looks just as when the containers were separated. Thus, the entropy should still be 2S.

That's not right; the entropy of the system with the missing partition should be much higher. When talking about gasses, the macrostates in question are not the number of molecules in a given part, but things like temperature, pressure. And there are many many states that don't have the two gasses split 50-50 between the two halves for which the macroscopic variables will be the same.
posted by vernondalhart at 8:00 AM on July 30, 2009


It might be helpful in thinking about entropy to realize that the log in its definition is not just a numerical convenience for dealing with large numbers, but is essential for certain of its properties to come out right. For example, imagine a black box whose observable properties (temperature, pressure, etc.) completely determine its unobservable "internal" state (heads or tails of the coins within, positions of gas molecules, etc.) Then there is only one microstate consistent with the macrostate. This means W = 1. This means log W = 0 and the black box has exactly zero entropy. If there is no uncertainty left after we make our macroscopic measurements, the system is in a state of maximum order.

A second essential property of logs is that they add when you bring two (distinguishable) systems together. Imagine one system with N possible internal states and another with M internal states. Assume that they don't interact. We are free in our mental accounting to consider them as two separate systems or as two non-interacting parts of a single system. Do we get the same answer for the entropy in both cases? Separated, one system has k log N entropy and the other has k log M. Considered together, there are N x M states because there is no interaction. The entropy of this combined system is k log (N x M) = k log N + k log M. Entropy adds, as it should!
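Both properties are easy to sanity-check numerically; here's a tiny Python sketch (k is set to 1 purely for illustration):

```python
from math import log, isclose

k = 1.0   # Boltzmann's constant, set to 1 here just for illustration

def entropy(W: int) -> float:
    """S = k log W for a system with W equally likely microstates."""
    return k * log(W)

# One microstate -> zero entropy (a completely determined "black box").
print(entropy(1))                                        # 0.0

# Two non-interacting systems with N and M states have N*M joint states,
# and their entropies simply add.
N, M = 1024, 252
print(isclose(entropy(N * M), entropy(N) + entropy(M)))  # True
```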

I've always found that notions of probability make entropy more confusing than it already is. I find it clearer as just a count of the possible internal states on an appropriate scale.

On preview, Mapes is making my second point, but as vernondalhart observes, the counting is problematic.
posted by drdanger at 8:07 AM on July 30, 2009 [1 favorite]


I believe that the reason that we use the entropy instead of the multiplicity of states is partially because entropy was a measure of something historically; it turns out these are the same thing.

you mean dS = 'dQ'/T? thermodynamics isn't an obsolete science: you can define an entropy without recourse to Boltzmann's formula.

Also, Boltzmann took the secret of his formula to his grave...
posted by geos at 8:13 AM on July 30, 2009


Vernondalhart: That's not right; the entropy of the system with the missing partition should be much higher.

Read about the Gibbs paradox. When two identical gases are mixed, the entropy is additive, as I wrote, because the gas atoms are indistinguishable. If you still think I'm wrong, please give a reference.
posted by Mapes at 8:14 AM on July 30, 2009


And there are many many states that don't have the two gasses split 50-50 between the two halves for which the macroscopic variables will be the same.

Example? It seems to me that if there are different numbers of the same type of gas molecules in two containers, then at least one of the pressure, temperature, or volume must be different between the two containers (although possibly beneath the limits of our detection if the difference in numbers is very very small).
posted by DevilsAdvocate at 8:37 AM on July 30, 2009


Well, colour me wrong then. That's counterintuitive, but interesting.
posted by vernondalhart at 8:42 AM on July 30, 2009


I think what vernondalhart was getting at with his "historically" comment is that entropy was a known, studied, and measured quantity well before it was linked to microstates or probability. So in one sense, the logarithm is there because that's what makes the math work out right and matches observation.
posted by DevilsAdvocate at 9:08 AM on July 30, 2009


As much as I love arguments over what the exact nature of entropy actually is...

What I want to understand is the mathematical reasoning Boltzmann used to theorise that entropy can be linked to probability.


Whenever you multiply two integers, the numbers of their respective digits add.


I've never come across this before, although the book suggests it's basic high-school math.


When I attempt my own examples, the majority of them fail to conform to it.

eg:
12*3=36 (fail)
17*4=68 (fail)
88*5=440 (pass)
23*2=46 (fail)
161*45=7245 (fail)

I get the feeling I'm missing something really basic.
posted by complience at 1:27 AM on July 31, 2009


To be precise, when you multiply two non-zero integers, the number of digits in the product is either a) equal to the sum of the number of digits in the two numbers being multiplied, or b) one less than that sum.

For a mathematical treatment, consider that an n-digit number is one that is at least 10^(n-1) and less than 10^n. (E.g., five-digit numbers go from 10000, or 10^4, to 99999, or 10^5 - 1.)

So suppose you have an m-digit number, x, and an n-digit number, y. Thus you have:

10^(m-1) ≤ x < 10^m
10^(n-1) ≤ y < 10^n

So when you multiply x and y, you get:

10^(m+n-2) ≤ x*y < 10^(m+n)

Meaning that x*y has either m+n or m+n-1 digits.
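A quick brute-force check of that bound (just an illustration with small numbers; the inequality above shows it holds in general):

```python
# The product of an m-digit and an n-digit number always has
# either m + n or m + n - 1 digits.
def digits(x: int) -> int:
    return len(str(x))

ok = all(
    digits(x * y) in (digits(x) + digits(y), digits(x) + digits(y) - 1)
    for x in range(1, 1000)
    for y in range(1, 1000)
)
print(ok)   # True
```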

Probably what von Baeyer was trying to do was to explain logarithms in a way that was accessible to readers who weren't familiar with the concept. The number of digits in a positive integer is an approximation to the base-ten logarithm of that number (to be precise, the number of digits is one more than the integral part of the base-ten logarithm of the number). But the point is that logarithms do strictly add when two numbers are multiplied: If x*y=z, then log x + log y = log z, and Boltzmann would have just expressed his formula in terms of the logarithm without thinking about the "number of digits," as would most people who use logarithms with any frequency. The mess about the number of digits adding (which is never off by more than one) is just von Baeyer's rather awkward way of trying to convey the concept to readers who aren't familiar with logarithms.
posted by DevilsAdvocate at 3:44 AM on July 31, 2009


Sorry, my equations got mucked up in the HTML. They should read:

10^(m-1) ≤ x < 10^m
10^(n-1) ≤ y < 10^n

10^(m+n-2) ≤ x*y < 10^(m+n)
posted by DevilsAdvocate at 3:48 AM on July 31, 2009


Hey, I probably suggested that book. Glad you're finding it engaging.

The problem with your examples is that the number of digits is too small. Try multiplying some ten-digit numbers by some fifteen-digit numbers: the answer will have twenty-four or twenty-five digits. There are nine times as many numbers with twenty-five digits as with twenty-four, so you'll usually get it right. But you won't be able to do this with your ten-digit pocket calculator.

In thermodynamics you encounter numbers that act differently from what you learned in ordinary arithmetic. There are ordinary numbers (like, say, 23) that change when you add or multiply them with other ordinary numbers. There are "large" numbers (like, say, 10^23). Large numbers don't change when you add ordinary numbers: 10^23 + 23 = 10^23. This is an approximation, but it's an approximation good to twenty-one decimal places, which is better than any experiment in the history of physical science. You have to be a little careful if you stumble on a problem like 10^23 + 23 - 10^23 = 23, where the large numbers cancel exactly, but otherwise it's a good rule.

With multiplicities for thermodynamical systems you encounter "very large" numbers like, say, 10^(10^23). You can write down large numbers (10^23 is 100,000,000,000,000,000,000,000) but you would die of old age writing down 10^(10^23). Very large numbers have the surprising property that they don't change when multiplied by large numbers. For example,
10^23 × 10^(10^23) = 10^(23 + 10^23) = 10^(10^23).
These very large numbers turn up in multiplicities thanks to factorials in probability, and you have to redevelop your intuition a bit to work with them.

If you have a thermodynamic system with N particles, the volume, energy, mass, and so on — the "extensive" quantities — are all proportional to N. The entropy is also extensive. The number of shufflings of N particles is something like N! ≈ N^N; Boltzmann realized that the logarithm of this quantity is approximately proportional to N, as required for the entropy.
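To put rough numbers on that last step, here's a small Python sketch using math.lgamma to get ln(N!) without ever writing out the "very large" number N! itself (natural logs here; switching bases only changes the constant out front):

```python
from math import lgamma, log

# ln(N!) computed via lgamma(N + 1), compared with Stirling's estimate
# N ln N - N.  The log of a "very large" number like N! is only a
# "large" number, growing roughly like N (times a slowly varying ln N).
for N in (10, 1_000, 10**6, 10**23):
    exact = lgamma(N + 1)            # ln(N!)
    stirling = N * log(N) - N
    print(f"N = {N:.1e}: ln(N!) = {exact:.4g}, N ln N - N = {stirling:.4g}")
```

Taking the log turns an unwriteable multiplicity into a quantity that grows with the size of the system, which is just what an extensive entropy needs.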

Congratulations on sticking with this for so long. If you graduate to a textbook with problems, I enjoyed Schroeder's.
posted by fantabulous timewaster at 11:17 PM on August 1, 2009 [1 favorite]

