In Part 6 I introduced the notion of entropy in the context of heat engines, defining it as d*S* = d*Q*/*T*. The great Ludwig Boltzmann gave us an equivalent, *statistical* notion of entropy, establishing it as a measure of *disorder*.
Imagine a gas
in a chamber (labelled A in the left part of the figure below), separated by a
partition from another chamber (B) of the same volume. This second chamber is
empty to start with. If the partition disappears, the molecules of the gas start
moving into the right half of the enlarged chamber, and soon the gas occupies
the entire (doubled) volume uniformly.

Suppose the chamber contains *n* molecules of the gas. Before the partition is removed, all the molecules are in the left half of the enlarged chamber. So the probability of finding any given molecule in the left half is 100%, and it is zero for the right half. After the partition has been removed, there is only a 50% chance of finding that molecule in the left half, and a 50% chance of finding it in the right half. It is like tossing a coin: we associate 'heads' with finding the molecule in the left half, and 'tails' with finding it in the right half. In both cases the chance is 50%, or ½.
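Here is a quick Python sketch of my own (not part of the original article) that makes the coin-toss analogy concrete: a molecule dropped at a uniformly random position in the doubled chamber lands in the left half about half the time.

```python
import random

def fraction_in_left_half(trials: int = 100_000) -> float:
    """Drop a molecule at a uniformly random position in the doubled
    chamber (coordinate 0 to 2) and count how often it lands in the
    left half (coordinate < 1)."""
    hits = sum(1 for _ in range(trials) if random.uniform(0.0, 2.0) < 1.0)
    return hits / trials

print(fraction_in_left_half())  # ~0.5, i.e. a 50% chance, like a coin toss
```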

Next let us ask the question: what is the probability that all the *n* molecules of the gas will ever occupy the left half of the chamber again? This probability is the same as that of flipping a coin *n* times and finding 'heads' every time, namely (½)^{n}.
Considering the fact that usually *n* is a very large number (typically of the order of Avogadro's number, i.e. ~10^{23}), the answer is very close to zero. In other words, the free expansion of the gas is practically an *irreversible* process. On removal of the partition, the gas has *spontaneously* gone into a state of greater *disorder*, and it cannot spontaneously go back to the initial state of order (or rather, of lesser disorder).
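Just how close to zero? The number (½)^{n} underflows any floating-point type for realistic *n*, so this little sketch of mine works with its base-10 logarithm instead:

```python
import math

def log10_prob_all_left(n: int) -> float:
    """Base-10 logarithm of (1/2)**n, the probability that all n
    molecules sit in the left half at the same time."""
    return -n * math.log10(2)

for n in (10, 100, 6 * 10**23):  # the last is roughly Avogadro's number
    print(f"n = {n:.1e}: probability = 10^({log10_prob_all_left(n):.3g})")
```

Already for *n* = 100 the probability is about 10^{-30}; for a mole of gas it is 10 raised to about minus 1.8 × 10^{23}, a zero for all practical purposes.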
Why do we say
'greater disorder'? Because the probability of finding any specified molecule
at any location in the left half of the chamber is now only half its earlier
value. Here 'order' means that there is a 100% chance that a thing is where we
expect it to be, so 50% chance means a state of less order, or greater disorder.

So, intuitively we have no trouble agreeing that, left to themselves (with no inputs from the outside), things are more likely to tend towards a state of greater disorder. This is all that the second law of thermodynamics says: *if we have an ISOLATED system, then, with the passage of time, it can only go towards a state of greater disorder on its own, and not towards a state of lesser disorder.* This happens because, as illustrated by the free-expansion-of-gas example above, a more disordered state is *more probable*.
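'More probable' can be counted directly. In the sketch below (my own toy example, with *n* = 100 molecules), the number of microstates with *k* molecules on the left is the binomial coefficient C(*n*, *k*), so the evenly spread (most disordered) split turns out to be roughly 10^{29} times more probable than the all-on-the-left (ordered) one.

```python
from math import comb

def prob_k_left(n: int, k: int) -> float:
    """Probability that exactly k of the n molecules are in the left
    half: comb(n, k) such microstates out of 2**n equally likely ones."""
    return comb(n, k) / 2**n

n = 100
for k in (0, 25, 50):
    print(f"{k:3d} of {n} molecules on the left: P = {prob_k_left(n, k):.3e}")
```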
How much has the disorder of the gas increased on free expansion to twice the volume? Consider any molecule of the gas. After the expansion, there are twice as many positions at which the molecule may be found. And at any instant of time, for any such position, there are twice as many positions at which a second molecule may be found, so that the total number of possibilities for the two molecules is now 2^{2}. Thus, for *n* molecules, there are 2^{n} times as many ways in which the gas can fill the chamber after the free expansion. We say that, in the double-sized chamber, the gas has 2^{n} times as many '*accessible states*', or '*microstates*'. [This is identical to the missing-information idea I explained in Part 21.]
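A brute-force enumeration (a toy sketch of mine, feasible only for tiny *n*) confirms the counting argument: each molecule independently picks the left or the right half, so the number of arrangements gains a factor of exactly 2^{n}.

```python
from itertools import product

def count_arrangements(n: int) -> int:
    """Count all left/right ('L'/'R') assignments for n molecules."""
    return sum(1 for _ in product("LR", repeat=n))

for n in (1, 2, 3, 10):
    print(f"n = {n:2d}: factor gained = {count_arrangements(n)} (= 2^{n})")
```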
The symbol *W* is normally used for the number of microstates accessible to a system under consideration. This number went up by a factor of 2^{n} when the gas expanded to twice the volume. So, one way of quantifying the degree of disorder is to say that entropy *S*, a measure of disorder, is proportional to *W*; i.e., *S* ~ *W*.
Boltzmann did something even better than that for quantifying disorder. He defined entropy *S* as proportional to the *logarithm* of *W*; i.e. *S* ~ log *W*.
In his honour, the constant of proportionality (*k*_{B}) is now called the Boltzmann constant. Thus entropy is defined by the famous equation *S* = *k*_{B} log_{2} *W* (or just *S* = *k* log *W*).
To see the merit of introducing the logarithm in the definition of entropy, let us apply it to calculate the increase of entropy when the gas expands to twice the volume. Since *W* has gone up by a factor of 2^{n}, the entropy has gone up by *k* log 2^{n} = *nk* log 2; i.e., *S* ~ *n*. This makes sense. Introduction of the logarithm in the definition of entropy makes it, like energy or mass, a property *proportional* to the number of molecules in the system. Such properties are described as having the *additivity* feature. If the number of molecules in the gas is, say, doubled from *n* to 2*n*, it makes sense that the defined entropy also doubles in a linearly proportionate fashion.
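A worked number shows both the formula and the additivity in action. This sketch is my own; it uses the natural logarithm and the SI value of *k*_{B} (choosing a different base for the logarithm would only rescale the constant):

```python
import math

K_B = 1.380649e-23   # Boltzmann constant in J/K (exact SI value)
N_A = 6.02214076e23  # Avogadro's number (exact SI value)

def delta_s_free_expansion(n: int) -> float:
    """Entropy increase on free expansion to twice the volume:
    dS = k_B * ln(2**n) = n * k_B * ln 2.  The logarithm is taken
    analytically, since 2**n itself is astronomically large."""
    return n * K_B * math.log(2)

n = int(N_A)  # one mole of gas
print(f"1 mole : dS = {delta_s_free_expansion(n):.3f} J/K")      # ~5.763 J/K
print(f"2 moles: dS = {delta_s_free_expansion(2 * n):.3f} J/K")  # exactly double
```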
The introduction of log *W*, instead of *W*, in the definition of entropy was done for the same reasons as those for defining missing information *I* (cf. Part 21). In fact, *I* = *S*.

Also, the original (thermodynamic) and the later (statistical-mechanical) formulations of entropy are equivalent. In the former, the entropy of an isolated system increases because the system gains stability by obliterating thermal gradients. In the latter, entropy increases (and information is lost) because the spontaneous obliteration of concentration gradients (and the ensuing, more stable state) is the most likely thing to happen. Concentration gradients get obliterated spontaneously because that takes the system towards a state of equilibrium and stability.

For an
isolated system, maximum stability, maximum entropy, and maximum probability
all go together.