Statistical Ensembles

1.1 Principle of Statistical Physics and Ensembles

1.1.1
Statistical systems are complex systems. The systems are so complex that we cannot obtain all the information needed to completely characterize them. For example, a liter of gas may contain $10^{22}$ atoms. To completely characterize such a system (or more precisely, a state of such a system), we need to know the three components of the velocity of each atom and the three components of the position of each atom. It is impossible to obtain $6\times 10^{22}$ real numbers to completely characterize the gas.

However, not knowing all the information needed to characterize the gas does not prevent us from developing a theory of the gas. This is because we are only interested in some average properties of the gas, such as its pressure, volume, and temperature. Those properties do not depend on every little detail of each atom, so not knowing everything about the atoms does not prevent us from calculating them. This is the kind of problem addressed by statistical physics. In statistical physics we try to understand the properties of a complex system without knowing all the information about the system. This is possible since the properties that we are interested in do not depend on all the details of the system.
1.1.2
In statistical physics there is only one principle: all possible states appear with equal probability. Let us explain what we mean by this statement. Suppose we know certain quantities, such as the pressure, the total energy, etc., of a complex system, but those quantities do not characterize the system completely. This means that the system has a number of states for which those quantities take the same values. Thus, even after knowing the values of those quantities, we still do not know which of those possible states the system is actually in. Then, according to the principle of statistical physics, we say that all the possible states are equally likely.
1.1.3
But the system can only be in one state at a given time. What do we mean by "all the possible states are equally likely"? There are two points of view. In the first point of view, we may imagine we have many copies of the system, all characterized by the same set of quantities, such as total energy, pressure, etc., but each copy may be in a different possible state. Then "equally likely" means that each possible state appears the same number of times among the copies of the system. The collection of copies of the system is called an ensemble. We have to have an ensemble to even define the probabilities. Under the first interpretation, statistical physics is a science that deals with ensembles, rather than with individual systems.

The second point of view applies only to the situation where the environment of the system is independent of time. In this case we interpret "equally likely" as meaning that all the possible states appear for the same amount of time during a long period of time. The second point of view is related to the first point of view if we view the system at different times as the different copies of the system.

The two points of view may not be equivalent. The two points of view are equivalent only when the system can visit all the possible states, many times, during a long period of time. This is the ergodicity hypothesis.

Not all systems are ergodic. For a non-ergodic system, statistical physics applies only to its ensemble. For an ergodic system, statistical physics also applies to the time average of the system. The sketch below illustrates the difference.
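As a toy illustration (a sketch; the eight-state model and the observable are made up for illustration, not taken from the text), the following compares the ensemble average of an observable with its time average under an ergodic dynamics that visits every state and a non-ergodic dynamics trapped in a subset of states:

```python
# A toy illustration of the two points of view. An ergodic dynamics
# visits all states; a non-ergodic one is trapped in a subset. Only in
# the ergodic case does the time average agree with the ensemble
# (equal-probability) average.
import random

states = list(range(8))

def observable(s):
    return s % 3  # an arbitrary observable of the state

def time_average(allowed, steps=100_000):
    total = 0.0
    for _ in range(steps):
        s = random.choice(allowed)  # dynamics: hop among allowed states
        total += observable(s)
    return total / steps

ensemble_avg = sum(observable(s) for s in states) / len(states)
print("ensemble average:         ", ensemble_avg)
print("ergodic time average:     ", time_average(states))       # ~ agrees
print("non-ergodic time average: ", time_average(states[:4]))   # differs
```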
1.2 Microcanonical ensemble
A microcanonical ensemble is an ensemble formed by isolated systems. All the systems in the ensemble have the same energy (and possibly some other properties). Here, by "same energy" we really mean that all systems have an energy which lies within a small window between $E$ and $E + \Delta E$.
1.2.1
Let us first study a simple example: $N$ spins in a magnetic field. The energy of an up-spin is $E = \epsilon_0/2$ and that of a down-spin is $E = -\epsilon_0/2$.

We would like to ask: how many states are there with a total energy $E$?

The total energy is given by $E = M\frac{\epsilon_0}{2} - (N - M)\frac{\epsilon_0}{2}$, where $M$ is the number of up-spins. So the states with a total energy $E$ are the states with $M = \frac{E}{\epsilon_0} + \frac{N}{2}$ up-spins. But how many states are there with $M$ up-spins? The answer is $C_N^M = \frac{N!}{M!(N-M)!}$. (Here $C_N^M$ is the number of ways to pick $M$ objects from $N$ objects.) So the number of states with a total energy $E$ is
$$\Omega(E) = C_N^{\frac{E}{\epsilon_0} + \frac{N}{2}}.$$
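As a sanity check, the following sketch (brute-force enumeration; $N = 8$ is an arbitrary choice) verifies the counting $\Omega(E) = C_N^M$ for a small system:

```python
# Brute-force check of Omega(E) = C_N^M with M = E/eps0 + N/2 for a
# small spin system.
from itertools import product
from math import comb

N, eps0 = 8, 1.0
for M in range(N + 1):
    E = M * eps0 / 2 - (N - M) * eps0 / 2  # total energy with M up-spins
    # enumerate all 2^N configurations and count those with M up-spins
    count = sum(1 for c in product((+1, -1), repeat=N) if c.count(+1) == M)
    assert count == comb(N, M)
    print(f"M = {M}, E = {E:+.1f}: Omega(E) = {count}")
```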
After obtaining the number of states as a function of the total energy $E$, we can define the entropy of the system: the entropy is $k_B$ times the log of the number of states:
$$S(E) = k_B \ln \Omega(E), \tag{1.2.1}$$
where $k_B = 1.3807\times 10^{-16}\,\mathrm{erg/K} = 8.617343(15)\times 10^{-5}\,\mathrm{eV/K}$ is the Boltzmann constant. (The Boltzmann constant is a conversion factor between energy and temperature: $1\,\mathrm{K} = 1.3807\times 10^{-16}\,\mathrm{erg} = 8.617343(15)\times 10^{-5}\,\mathrm{eV}$, or $1\,\mathrm{eV} = 11605\,\mathrm{K}$. We will introduce the definition of temperature shortly.)
For a microcanonical ensemble, the entropy is a function of the energy $E$. So for our $N$-spin system, the entropy is
$$S(E) = k_B \ln C_N^{\frac{E}{\epsilon_0} + \frac{N}{2}}, \tag{1.2.2}$$
which can be evaluated with the Stirling formula
$$\ln(n!) = n\ln n - n + \frac{1}{2}\ln(2\pi n) + O(1/n). \tag{1.2.3}$$
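A quick numerical check of Eq. (1.2.3) (a sketch; `lgamma(n + 1)` gives the exact $\ln(n!)$):

```python
# Numerical check of the Stirling formula, Eq. (1.2.3).
from math import lgamma, log, pi

for n in (10, 100, 1000):
    exact = lgamma(n + 1)
    stirling = n * log(n) - n + 0.5 * log(2 * pi * n)
    print(f"n = {n:4d}: ln(n!) = {exact:12.4f}, Stirling = {stirling:12.4f}")
```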
Figure 1.1: The entropy per spin, $S(E)/N$, as a function of $E/E_0 = \epsilon/\epsilon_0$, the average energy per spin. The maximum entropy of a spin-1/2 spin is $k_B\ln(2) = 0.69314718056\,k_B$.
Thus (see Fig. 1.1)
$$\frac{1}{k_B}S(E) = \ln C_N^M \approx N\ln N - M\ln M - (N-M)\ln(N-M)$$
$$= -M\ln\Big(\frac{M}{N}\Big) - (N-M)\ln\Big(\frac{N-M}{N}\Big) = -N\big(f_\uparrow \ln f_\uparrow + f_\downarrow \ln f_\downarrow\big), \tag{1.2.4}$$
where $f_\uparrow \equiv \frac{M}{N}$ (or $f_\downarrow \equiv 1 - \frac{M}{N}$) is the probability for a spin to be up (or down). Since $E = M\frac{\epsilon_0}{2} - (N-M)\frac{\epsilon_0}{2}$, we have $f_\uparrow = \frac12 + \frac{E}{E_0}$ and $f_\downarrow = \frac12 - \frac{E}{E_0}$, where $E_0 = N\epsilon_0$. Thus
$$\frac{1}{k_B}S(E) = -N\Big[\Big(\frac12 + \frac{E}{E_0}\Big)\ln\Big(\frac12 + \frac{E}{E_0}\Big) + \Big(\frac12 - \frac{E}{E_0}\Big)\ln\Big(\frac12 - \frac{E}{E_0}\Big)\Big]. \tag{1.2.5}$$
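The following sketch evaluates Eq. (1.2.5) in units of $k_B$; the maximum, at $E = 0$, is $\ln 2 \approx 0.6931$, as in Fig. 1.1:

```python
# Evaluating the entropy per spin, Eq. (1.2.5), in units of k_B.
from math import log

def s_per_spin(x):  # x = E/E0, with -1/2 < x < 1/2
    f_up, f_dn = 0.5 + x, 0.5 - x
    return -(f_up * log(f_up) + f_dn * log(f_dn))

for x in (-0.4, -0.2, 0.0, 0.2, 0.4):
    print(f"E/E0 = {x:+.1f}: S/(N k_B) = {s_per_spin(x):.4f}")
```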
1.2.2 Concept of temperature
To introduce the concept of temperature, let us put two systems of spins together. System 1 has $N_1$ spins and system 2 has $N_2$ spins. Let $E^{(0)}_{1,2}$ be the energies of the two systems at the beginning. The total energy is $E = E^{(0)}_1 + E^{(0)}_2$. If we allow the two systems to exchange energy, then the spins in the two systems may flip up and down, and sample all the possible states with total energy $E$. We would now like to ask: what is the probability for system 1 to have a new energy $E_1$? Certainly, system 2 will have an energy $E_2 = E - E_1$ when system 1 has the new energy $E_1$.

The number of states with system 1 having an energy $E_1$ and system 2 having an energy $E_2 = E - E_1$ is given by $\Omega_{tot}(E_1) = \Omega_1(E_1)\Omega_2(E_2)$, or, in terms of entropies,
$$\Omega_{tot}(E_1) = e^{\frac{1}{k_B}S_1(E_1) + \frac{1}{k_B}S_2(E - E_1)}. \tag{1.2.7}$$
Since all the possible states are equally likely, the probability for system 1 to have an energy $E_1$ is
$$P(E_1) \propto e^{\frac{1}{k_B}S_1(E_1) + \frac{1}{k_B}S_2(E - E_1)}. \tag{1.2.8}$$
From Fig. 1.2, we see that when $N \to \infty$, $P(E_1)$ is almost a $\delta$-function. We can almost say for sure that the energy of system 1 has the value $\bar E_1$ that maximizes the total entropy $S_1(E_1) + S_2(E - E_1)$, or
$$S_1'(\bar E_1) = S_2'(E - \bar E_1). \tag{1.2.9}$$
Figure 1.2: For a system of $N_1$ spins and a system of $N_2$ spins with total energy $E$, we plot the probability $P(E_1)$ for the $N_1$-spin system to have an energy $E_1$. Here $N_2 = 2N_1$ and $N_1 = 10,\,100,\,1000,\,10000$. The total energy $E$ is chosen to be $-N_1\epsilon_0$. $P(E_1)$ reaches its maximum when $E_1 = E/3$.
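The sharpening of $P(E_1)$ with system size can be checked directly (a sketch; energies are measured in units of $\epsilon_0$, with $N_2 = 2N_1$ and total energy $E = -N_1\epsilon_0$ as in Fig. 1.2):

```python
# Checking how P(E1) ~ Omega1(E1) * Omega2(E - E1) sharpens with
# system size, as in Fig. 1.2.
from math import comb

def peak_and_width(N1):
    N2, E_tot = 2 * N1, -N1
    weights = {}
    for M1 in range(N1 + 1):        # M1 = number of up-spins in system 1
        E1 = M1 - N1 / 2            # energy of system 1
        M2 = E_tot - E1 + N2 / 2    # up-spins system 2 must then have
        if M2.is_integer() and 0 <= M2 <= N2:
            weights[E1] = comb(N1, M1) * comb(N2, int(M2))
    Z = sum(weights.values())
    mean = sum(E1 * w for E1, w in weights.items()) / Z
    var = sum((E1 - mean) ** 2 * w for E1, w in weights.items()) / Z
    return mean / abs(E_tot), var ** 0.5 / abs(E_tot)

for N1 in (10, 100, 1000):
    peak, width = peak_and_width(N1)
    print(f"N1 = {N1:4d}: <E1>/|E| = {peak:+.4f}, relative width = {width:.4f}")
```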
Figure 1.3: The relation between the temperature $T$ (and the inverse temperature $\beta$) and the average energy per spin $\epsilon$.
If $E^{(0)}_1$ at the beginning is not equal to $\bar E_1$, then after we bring the two spin systems together, $E_1$ will shift from $E^{(0)}_1$ to $\bar E_1$. We see that Eq. (1.2.9) is a condition for equilibrium. It is also the maximum-entropy condition. We have derived the second law of thermodynamics: as an isolated system approaches the equilibrium state, its entropy always increases (if we define the entropy as in Eq. (1.2.1)).
If we define the temperature as
$$\frac{1}{T} = k_B\beta = \frac{\partial S(E)}{\partial E}, \tag{1.2.10}$$
then the equilibrium condition Eq. (1.2.9) becomes
$$T_1 = T_2. \tag{1.2.11}$$
For our $N$-spin system, Eq. (1.2.5) gives (see Fig. 1.3)
$$\frac{1}{T} = k_B\beta = \frac{k_B}{\epsilon_0}\ln\Big(\frac{\frac12 E_0 - E}{\frac12 E_0 + E}\Big) = \frac{k_B}{\epsilon_0}\ln\Big(\frac{f_\downarrow}{f_\uparrow}\Big). \tag{1.2.12}$$
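As a numerical consistency check (a sketch, in units where $k_B = \epsilon_0 = 1$), the finite-difference derivative of Eq. (1.2.5) reproduces Eq. (1.2.12):

```python
# Finite-difference check of Eq. (1.2.12): dS/dE computed from
# Eq. (1.2.5) should match ln(f_down / f_up).
from math import log

N = 100_000

def S(E):  # Eq. (1.2.5) with E0 = N * eps0
    f_up, f_dn = 0.5 + E / N, 0.5 - E / N
    return -N * (f_up * log(f_up) + f_dn * log(f_dn))

E, dE = -20_000.0, 1.0
one_over_T = (S(E + dE) - S(E - dE)) / (2 * dE)
f_up, f_dn = 0.5 + E / N, 0.5 - E / N
print(f"dS/dE         = {one_over_T:.6f}")
print(f"ln(f_dn/f_up) = {log(f_dn / f_up):.6f}")
```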
[Figure 1.4: the measured magnetic susceptibility $\chi$ (emu/mole Cu) as a function of temperature $T$ (K), compared with the Curie law.]
1.2.3 Curie's law
Eq. (1.2.12) relates the temperature $T$ to the probability distribution of the spins, $f_\uparrow$ and $f_\downarrow$. We find
$$\frac{f_\uparrow}{f_\downarrow} = e^{-\epsilon_0/k_BT} \tag{1.2.13}$$
or
$$f_\uparrow = \frac{1}{e^{\epsilon_0/k_BT} + 1}, \qquad f_\downarrow = \frac{e^{\epsilon_0/k_BT}}{e^{\epsilon_0/k_BT} + 1}.$$
Note that Eq. (1.2.13) is nothing but the Boltzmann distribution. So from the equal-probability principle and the definition of temperature, Eq. (1.2.10), we can derive the Boltzmann distribution.

For spins with energy splitting $\epsilon_0 = g\mu_B B$ in a magnetic field $B$, the magnetization is
$$M = \frac{Ng\mu_B}{2}\big(f_\downarrow - f_\uparrow\big) = \frac{Ng\mu_B}{2}\tanh\frac{\epsilon_0}{2k_BT} = \frac{Ng\mu_B}{2}\tanh\frac{g\mu_B B}{2k_BT}.$$
For $B \ll \frac{k_BT}{g\mu_B}$, we have
$$M = \frac{g^2\mu_B^2 N}{4k_BT}\,B.$$
We find the magnetic susceptibility
$$\chi = \frac{g^2\mu_B^2 N}{4k_BT},$$
which is inversely proportional to the temperature. This is the Curie law plotted in the figure above.
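A small sketch comparing the full magnetization with the Curie-law approximation (units with $g\mu_B = k_B = N = 1$ and the small field $B$ are assumptions for illustration):

```python
# Magnetization versus temperature and the Curie-law limit.
from math import tanh

B = 0.01
for T in (0.01, 0.1, 1.0, 10.0):
    exact = 0.5 * tanh(B / (2 * T))  # M = (N g mu_B / 2) tanh(g mu_B B / 2 k_B T)
    curie = B / (4 * T)              # M = (g^2 mu_B^2 N / 4 k_B T) B
    print(f"T = {T:5.2f}: M = {exact:.6f}, Curie law = {curie:.6f}")
```

At high temperature the two agree; at low temperature (where $g\mu_B B \ll k_B T$ fails) the Curie law overestimates $M$.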
1.2.4 Properties of entropy
Writing Eq. (1.2.5) as
$$\frac{1}{k_B}S(E) = -N\Big[\Big(\frac12 + \frac{\epsilon}{\epsilon_0}\Big)\ln\Big(\frac12 + \frac{\epsilon}{\epsilon_0}\Big) + \Big(\frac12 - \frac{\epsilon}{\epsilon_0}\Big)\ln\Big(\frac12 - \frac{\epsilon}{\epsilon_0}\Big)\Big], \tag{1.2.14}$$
where $\epsilon = E/N$ is the average energy per spin, we see that the entropy is proportional to $N$, the size of the system. Thus $S$ is an extensive quantity. In contrast, $\epsilon$, as the average energy per spin, is an intensive quantity. The total energy $E$ is an extensive quantity and the temperature $T$ is an intensive quantity.
[Figure 1.5: (a) two isolated spin systems with energies $E_1$ and $E_2$; (b) the two systems joined together, with total energy $2E = E_1 + E_2$ and energy exchange allowed; (c) the two systems after equilibrium is reached, each with energy $E$.]

If we join the two systems and allow them to exchange energy (Fig. 1.5b), the total number of states with total energy $2E$ is
$$\Omega_{tot} = \sum_{E_1'}\Omega(E_1')\,\Omega(2E - E_1'). \tag{1.2.15}$$
From the definition
$$S(E, \Delta E) = k_B\ln(\text{number of states with energy between } E \text{ and } E + \Delta E),$$
we see that the entropy also depends on the energy window $\Delta E$. However, in the thermodynamical limit $N \to \infty$, such a dependence can be dropped and we can regard $S$ as a function of $E$ only. To see this, suppose we enlarge the energy window by a factor $c$:
$$S(E, c\,\Delta E) = k_B\ln\big(\text{number of states between } E \text{ and } E + c\,\Delta E\big) = k_B\ln\big[c\times(\text{number of states between } E \text{ and } E + \Delta E)\big] = S(E, \Delta E) + k_B\ln c. \tag{1.2.16}$$
The correction $k_B\ln c$ is of order $1$, negligible compared with $S \sim N$ in the thermodynamical limit.
Since $\Omega_{tot} = \sum_{E_1'}\Omega(E_1')\Omega(2E - E_1') > \Omega(E_1)\Omega(E_2)$, it is clear that the equilibrium state (Fig. 1.5b) has a higher entropy than the initial state (Fig. 1.5a) (see Fig. 1.6). Thus reaching equilibrium always increases entropy (the second law of thermodynamics).
After the two systems reach equilibrium, we now forbid the energy exchange. The total number of states is then reduced to
$$\bar\Omega = \Omega(E)\,\Omega(E). \tag{1.2.17}$$
We would like to show that $\ln\big[\Omega(E)\Omega(E)\big] = \ln\Omega_{tot}$ in the thermodynamical limit, i.e. that the system in Fig. 1.5b and the system in Fig. 1.5c have the same entropy. As $\Omega(E)\Omega(E)$ is the maximum term of $\Omega(E_1')\Omega(2E - E_1')$, we find $\bar\Omega > \Omega_{tot}/2N$, where $2N$ is the number of possible distinct values of $E_1'$. We also have $\bar\Omega < \Omega_{tot}$. Thus
$$\ln\Omega_{tot} - \ln(2N) < \ln\bar\Omega < \ln\Omega_{tot}, \tag{1.2.18}$$
or
$$S_{\text{Fig. 1.5c}} = S_{\text{Fig. 1.5b}} + O(\ln N). \tag{1.2.19}$$
Since the entropies themselves are of order $N$, the difference of order $\ln N$ can be neglected in the thermodynamical limit.
More generally, if the state $i$ appears in an ensemble with probability $f_i$, the entropy is
$$S = -k_B\sum_{i=1}^{\Omega} f_i\ln f_i, \tag{1.2.20}$$
where $\Omega$ is the total number of states. If all states have the same probability to appear, we have $f_i = 1/\Omega$ and
$$S = -k_B\sum_{i=1}^{\Omega} f_i\ln f_i = -k_B\sum_{i=1}^{\Omega}\frac{1}{\Omega}\ln\frac{1}{\Omega} = k_B\ln\Omega. \tag{1.2.21}$$
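A quick check of Eqs. (1.2.20)-(1.2.21) (a sketch with $k_B = 1$; the skewed distribution is an arbitrary example):

```python
# Eq. (1.2.20) reduces to S = k_B ln(Omega) for equal probabilities,
# and gives a smaller entropy for any non-uniform distribution.
from math import log

def entropy(probs):  # Eq. (1.2.20) with k_B = 1
    return -sum(f * log(f) for f in probs if f > 0)

Omega = 8
uniform = [1 / Omega] * Omega
skewed = [0.5] + [0.5 / (Omega - 1)] * (Omega - 1)
print("uniform:", entropy(uniform), " (ln Omega =", log(Omega), ")")
print("skewed :", entropy(skewed), " (smaller, as expected)")
```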
Why, then, does the entropy increase at all? If the time evolution is deterministic, the $\Omega(E_1)\Omega(E_2)$ initial states of Fig. 1.5a evolve into only $\Omega(E_1)\Omega(E_2)$ possible final states. The system in Fig. 1.5b has $\Omega_{tot}$ states with energy $2E$, but among those possible states only $\Omega(E_1)\Omega(E_2)$ of them are the final states evolved from the system in Fig. 1.5a. However, we have no clue about which are the $\Omega(E_1)\Omega(E_2)$ possible final states. We have lost that information. We only know the total energy of the system, and we only know that the state can be in one of the $\Omega_{tot}$ states. This is how the entropy gets increased.
Figure 1.7: The lines represent possible states. The thick lines represent states that actually appear in the ensembles. (a) represents an irreversible process, while (b) represents a reversible process.
The evolution from Fig. 1.5a to Fig. 1.5b is also represented in Fig. 1.7a. Fig. 1.7b represents a reversible (or adiabatic) evolution, say, one caused by a change in $\epsilon_0$. We see that reversible (or adiabatic) processes do not change the entropy, since the number of possible states is not changed.
1.3

Let us calculate the entropy of a classical ideal gas of $N$ particles of mass $m$ in a volume $V$. The number of states with energy less than $E$ is
$$N_<(E) = \frac{1}{h^{3N}}\int_{\sum p_i^2/2m < E} d^{3N}q\, d^{3N}p = \frac{V^N}{h^{3N}}\int_{\sum p_i^2 < 2mE} d^{3N}p.$$
The number of states with energy between $E$ and $E + \Delta E$ is then
$$\Omega(E) = N_<(E + \Delta E) - N_<(E) = \frac{V^N S_{3N}\,(\sqrt{2mE})^{3N-2}\,m}{h^{3N}}\,\Delta E, \tag{1.3.1}$$
where $S_n$ is the surface area of the unit sphere in $n$ dimensions.
To obtain $S_n$, we note that
$$\int d^n x\, e^{-x^2} = S_n\int_0^\infty dr\, r^{n-1} e^{-r^2} = \frac{S_n}{2}\int_0^\infty dr^2\,(r^2)^{(n-2)/2} e^{-r^2} = \frac{S_n}{2}\Gamma(n/2) = \pi^{n/2}. \tag{1.3.2}$$
We find that
$$S_n = \frac{2\pi^{n/2}}{\Gamma(n/2)}. \tag{1.3.3}$$
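Eq. (1.3.3) can be checked against the familiar low-dimensional cases (a sketch):

```python
# Checking Eq. (1.3.3): the circumference of the unit circle (n = 2),
# the area of the unit sphere (n = 3), and n = 4.
from math import gamma, pi

def S(n):
    return 2 * pi ** (n / 2) / gamma(n / 2)  # Eq. (1.3.3)

print("S_2 =", S(2), " expect 2*pi   =", 2 * pi)
print("S_3 =", S(3), " expect 4*pi   =", 4 * pi)
print("S_4 =", S(4), " expect 2*pi^2 =", 2 * pi ** 2)
```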
Taking the logarithm of Eq. (1.3.1) and using the Stirling formula for $\Gamma(3N/2)$, we find
$$\frac{1}{k_B}S(E) = N\ln V + 3N\ln\frac{(2\pi m)^{1/2}}{h} + \frac{3N}{2}\ln E - \frac{3N}{2}\ln\frac{3N}{2} + \frac{3N}{2} + \ln\Delta E + O(\ln N)$$
$$= N\ln N + N\ln\frac{v(2\pi m\epsilon)^{3/2}}{h^3} + N\Big(\frac32\ln\frac23 + \frac32\Big) + \ln\Delta E + O(\ln N), \tag{1.3.4}$$
where $v = V/N$ is the volume per particle and $\epsilon = E/N$ is the average energy per particle.

A big problem: the entropy is NOT extensive, due to the $N\ln N$ term. We need to use a concept from quantum physics: identical particles. For identical particles,
$$N_<(E) = \frac{1}{h^{3N}\,N!}\int_{\sum p_i^2/2m < E} d^{3N}q\, d^{3N}p. \tag{1.3.5}$$
Using $\ln N! = N\ln N - N$, we find
$$\frac{1}{k_B}S(E) = N\ln\frac{v(2\pi m\epsilon)^{3/2}}{h^3} + N\Big(\frac32\ln\frac23 + \frac52\Big). \tag{1.3.6}$$
For identical particles, the entropy is extensive. The entropy per particle, $s = S/N$, is given by
$$\frac{1}{k_B}s = \ln\frac{v(2\pi m\epsilon)^{3/2}}{h^3} + \frac32\ln\frac23 + \frac52 = \ln\frac{v}{\lambda^3} + \frac32\ln\frac23 + \frac52, \tag{1.3.7}$$
where $\lambda = h/(2\pi m\epsilon)^{1/2}$. The meaning of these quantities: $\epsilon$ is the average energy per particle; $(2\pi m\epsilon)^{1/2}$ is the corresponding momentum; $\lambda = h/(2\pi m\epsilon)^{1/2}$ is the corresponding wavelength; and $v/\lambda^3$ is the number of wave packets that can be fitted into the volume per particle.

Classical gas: $v/\lambda^3 \gg 1$.
Quantum gas: $v/\lambda^3 \lesssim 1$.

(Question: is air at room temperature a quantum gas or a classical gas? The sketch below gives an estimate.)
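A rough estimate (a sketch; the molecular mass of N2 and atmospheric pressure are assumed values, not from the text; we take $\epsilon \sim \frac32 k_B T$):

```python
# Estimating v / lambda^3 for air at room temperature.
from math import pi

h = 6.626e-34    # Planck constant, J s
kB = 1.381e-23   # Boltzmann constant, J / K
m = 4.65e-26     # mass of an N2 molecule, kg (assumption)
T = 300.0        # room temperature, K
P = 1.013e5      # atmospheric pressure, Pa (assumption)

eps = 1.5 * kB * T                   # average energy per particle
lam = h / (2 * pi * m * eps) ** 0.5  # lambda = h / (2 pi m eps)^(1/2)
v = kB * T / P                       # volume per particle, from P V = N k_B T
print(f"lambda     = {lam:.3e} m")
print(f"v          = {v:.3e} m^3")
print(f"v/lambda^3 = {v / lam ** 3:.3e}  (>> 1: classical)")
```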
Thermodynamical function $E(S, V, N)$

From
$$\epsilon = \frac{h^2}{2\pi m\,v^{2/3}}\,e^{2s/3k_B}$$
(dropping the constant terms in Eq. (1.3.7)), we get
$$E(S, V, N) = N\,\frac{h^2 N^{2/3}}{2\pi m\,V^{2/3}}\,e^{2S/3Nk_B}. \tag{1.3.8}$$
The temperature is
$$T = \Big(\frac{\partial E}{\partial S}\Big)_V = \frac{2}{3Nk_B}\,N\,\frac{h^2N^{2/3}}{2\pi m\,V^{2/3}}\,e^{2S/3Nk_B} = \frac{2E}{3Nk_B}. \tag{1.3.9}$$
The pressure is
$$P = -\Big(\frac{\partial E}{\partial V}\Big)_S = \frac{2}{3V}\,N\,\frac{h^2N^{2/3}}{2\pi m\,V^{2/3}}\,e^{2S/3Nk_B} = \frac{2E}{3V}. \tag{1.3.10}$$
Combining the two, we obtain the equation of state
$$PV = Nk_BT. \tag{1.3.11}$$
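The chain from $E(S,V,N)$ to the equation of state can be verified numerically (a sketch, in arbitrary units with $h^2/2\pi m = k_B = 1$ and made-up values of $N$, $V$, $S$):

```python
# Differentiate E(S, V, N) of Eq. (1.3.8) numerically and verify
# Eqs. (1.3.9)-(1.3.11): P V = N k_B T.
from math import exp

N, V, S, d = 100.0, 50.0, 120.0, 1e-6

def E(S, V):  # Eq. (1.3.8) with h^2 / (2 pi m) = k_B = 1
    return N * N ** (2 / 3) / V ** (2 / 3) * exp(2 * S / (3 * N))

T = (E(S + d, V) - E(S - d, V)) / (2 * d)   # T = (dE/dS)_V
P = -(E(S, V + d) - E(S, V - d)) / (2 * d)  # P = -(dE/dV)_S
print(f"P*V     = {P * V:.6f}")
print(f"N*k_B*T = {N * T:.6f}")
```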