Statistical Ensembles

1.1 Principle of Statistical Physics and Ensembles
Key points:
All possible states appear with an equal probability.
Statistical systems are complex systems. We do not know all the information that is needed to completely characterize such systems. For example, a liter of gas may contain 10^22 atoms. To completely characterize such a system we would need to know the three components of the velocity and the three components of the position of each atom. It is impossible to obtain the 6 × 10^22 real numbers needed to completely characterize the gas.
However, not knowing all the information needed to characterize a gas does not prevent us from developing a theory of gases. This is because we are only interested in some average properties of the gas, such as the pressure, volume, and temperature. Those properties do not depend on every little detail of each atom, so not knowing everything about the atoms does not prevent us from calculating them. This is the kind of problem facing statistical physics. In statistical physics we try to understand the properties of a complex system without knowing all the information about the system. This is possible since the properties we are interested in do not depend on the details of the system.
In statistical physics there is only one principle: all possible states appear with an equal probability. Let us explain what we mean by this statement. Suppose we know certain properties of a complex system, but those properties do not characterize the system completely. That means the system has a number of states that all have the same properties. Thus even after knowing those properties, we still do not know, among those possible states, which state the system is in. According to the principle of statistical physics, we say that all the possible states are equally likely.

But the system can only be in one state at a given time. What do we mean by "all the possible states are equally likely"? There are two points of view. In the first point of view, we imagine that we have many copies of the system, all with the same properties, but each copy may be in a different possible state. Then "equally likely" means that each possible state appears the same number of times among the copies of the system. The collection of copies of the system is called an ensemble. We have to have an ensemble to even define the probabilities. Under the first interpretation, statistical physics is a science that deals with ensembles, rather than individual systems.
The second point of view applies only to situations where the properties of the system are independent of time. In this case we interpret "equally likely" as meaning that all the possible states appear for the same amount of time during a long period of time. The second point of view is related to the first if we view the system at different times as the different copies of the system. However, the two points of view are equivalent only when the system can visit all the possible states, many times, during the long period of time. This is the ergodic hypothesis. Not all systems are ergodic. In this class, we will take the first point of view: we regard statistical physics as a theory for ensembles, and we will apply the theory of ensembles to individual systems, assuming the systems are ergodic.
1.2 Microcanonical ensemble
A microcanonical ensemble is an ensemble formed by isolated systems. All the systems in the ensemble have the same energy (and possibly some other properties). Here by "same energy" we really mean that all systems have an energy which lies within a small window between E and E + ΔE.
1.2.1 Entropy of a spin-1/2 system

Consider a system of N spin-1/2 spins, where a spin pointing up has energy −ε0 and a spin pointing down has energy +ε0. The number of states with M up-spins is

    Ω = C_N^M,                                                  (1.2.1)

where

    C_N^M = N! / (M! (N − M)!)                                  (1.2.2)

Such a state has energy

    E = (N − 2M) ε0                                             (1.2.3)

The entropy is defined as

    S(E) = kB ln Ω(E)                                           (1.2.4)
When n is large,

    ln(n!) = (n + 1/2) ln(n + 1) − (n + 1) + (1/2) ln(2π) + ...  (1.2.5)

Thus, to leading order in N,

    S(E)/kB ≈ N ln N − M ln M − (N − M) ln(N − M)
            = −M ln(M/N) − (N − M) ln((N − M)/N)
            = N (−f↑ ln f↑ − f↓ ln f↓)                          (1.2.6)
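As a quick numerical check of the Stirling estimate (1.2.6), the sketch below compares the exact ln C_N^M (computed via the log-Gamma function) with N(−f↑ ln f↑ − f↓ ln f↓); the values of N and M are arbitrary illustrative choices:

```python
import math

def ln_binomial(N, M):
    # Exact ln C_N^M via log-Gamma: ln N! - ln M! - ln (N-M)!
    return math.lgamma(N + 1) - math.lgamma(M + 1) - math.lgamma(N - M + 1)

def mixing_entropy(N, M):
    # Leading Stirling approximation, Eq. (1.2.6):
    # N ln N - M ln M - (N-M) ln(N-M) = N(-f_up ln f_up - f_down ln f_down)
    f_up = M / N
    f_down = 1 - f_up
    return N * (-f_up * math.log(f_up) - f_down * math.log(f_down))

N, M = 10**6, 3 * 10**5
exact = ln_binomial(N, M)
approx = mixing_entropy(N, M)
# The two agree up to corrections of order ln N, i.e. relative error ~ (ln N)/N.
print(exact, approx, abs(exact - approx) / exact)
```

The dropped corrections are of order ln N, which is negligible compared with the extensive O(N) entropy.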
Figure 1.1: The entropy per spin, S(E)/N, as a function of E/E0 = ε/ε0, the average energy per spin. The maximum entropy of a spin-1/2 spin is kB ln(2) = 0.69314718056 kB.
where f↑ (or f↓) is the probability for a spin to be up (or down). Using f↑ = M/N = 1/2 − E/(2E0), where E0 = N ε0, we find

    S(E)/kB = −N [ (1/2 + E/(2E0)) ln(1/2 + E/(2E0)) + (1/2 − E/(2E0)) ln(1/2 − E/(2E0)) ]   (1.2.7)

and, in terms of the average energy per spin ε = E/N (so that ε/ε0 = E/E0),

    s(ε)/kB = −(1/2 + ε/(2ε0)) ln(1/2 + ε/(2ε0)) − (1/2 − ε/(2ε0)) ln(1/2 − ε/(2ε0))        (1.2.8)
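The shape of the curve in Fig. 1.1 can be checked numerically. Below is a minimal sketch of the entropy per spin, in units where kB = ε0 = 1 (an illustrative choice):

```python
import math

def s_over_kB(eps, eps0=1.0):
    # Entropy per spin s/kB for a spin-1/2 system, with eps/eps0 in [-1, 1].
    f_up = 0.5 - eps / (2 * eps0)
    f_down = 0.5 + eps / (2 * eps0)
    total = 0.0
    for f in (f_up, f_down):
        if f > 0:                      # f ln f -> 0 as f -> 0
            total -= f * math.log(f)
    return total

# s/kB is maximal at eps = 0, where it equals ln 2,
# and vanishes at the fully polarized edges eps = -eps0 and eps = +eps0.
print(s_over_kB(0.0))
print(s_over_kB(-1.0), s_over_kB(1.0))
```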
1.2.2 Concept of temperature
To introduce the concept of temperature, let us put two systems of spins together. System 1 has N1 spins and system 2 has N2 spins. Let E1,2 be the energies of the two systems at the beginning. The total energy is E = E1 + E2. If we allow the two systems to exchange energy, then the spins in the two systems may wander around and sample all the possible states with total energy E. The question is: what is the probability for system 1 to have an energy E1?
The number of states with system 1 having an energy E1 is

    Ω(E1) = e^{S1(E1)/kB} e^{S2(E−E1)/kB}                       (1.2.9)
Every possible state is equally probable, so the probability for system 1 to have an energy E1 is

    P(E1) ∝ e^{S1(E1)/kB} e^{S2(E−E1)/kB}                       (1.2.10)
From Fig. 1.2, we see that when N → ∞, P(E1) is almost like a δ-function. We can say for sure that the energy of system 1 takes the value Ē1 that maximizes the total entropy S1(E1) + S2(E − E1), or

    S1′(Ē1) = S2′(E − Ē1)                                       (1.2.11)

If the energy of system 1 at the beginning is not equal to Ē1, then after we bring the two spin systems together, the energy of system 1 will shift from its initial value to Ē1. We see that Eq. (1.2.11) is a condition for equilibrium. It is also a maximum-entropy condition. We have derived the second law of thermodynamics: as an isolated system approaches the equilibrium state, its entropy always increases (if we define the entropy as in Eq. (1.2.4)).
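The sharpening of P(E1) with system size, shown in Fig. 1.2, can be demonstrated directly for the spin model. In the sketch below (an illustrative construction), fixing the total energy at E = −N1 ε0 is equivalent to fixing the total number of up-spins at K = 2N1, and P(E1) becomes a product of binomial coefficients:

```python
import math

def ln_C(n, k):
    return math.lgamma(n + 1) - math.lgamma(k + 1) - math.lgamma(n - k + 1)

def peak_width(N1):
    # Two spin systems, N2 = 2*N1, exchanging energy at fixed total energy
    # E = -N1*eps0, i.e. fixed total number of up-spins K = 2*N1
    # (using E1 = (N1 - 2*M1)*eps0).  P(M1) is proportional to
    # C(N1, M1) * C(N2, K - M1).
    N2, K = 2 * N1, 2 * N1
    lnP = [ln_C(N1, M1) + ln_C(N2, K - M1) for M1 in range(0, N1 + 1)]
    m = max(lnP)
    P = [math.exp(x - m) for x in lnP]
    Z = sum(P)
    P = [p / Z for p in P]
    peak = P.index(max(P))                        # most probable M1
    width = sum(1 for p in P if p > max(P) / 2)   # crude half-maximum width
    return peak, width

for N1 in (10, 100, 1000):
    peak, width = peak_width(N1)
    # The peak sits near M1 = 2*N1/3 (i.e. E1 = E/3), and the relative
    # width shrinks like 1/sqrt(N1): P(E1) approaches a delta function.
    print(N1, peak, peak / N1, width / N1)
```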
If we define the temperature as

    1/T = kB β = ∂S(E)/∂E                                       (1.2.12)
Figure 1.2: For a system of N1 spins and a system of N2 spins with total energy E, we plot the probability P(E1) for the N1-spin system to have an energy E1. Here N2 = 2N1 and N1 = 10, 100, 1000, 10000. E is chosen to be −N1 ε0. P(E1) reaches its maximum when E1 = E/3.
Figure 1.3: The relation between the temperature T (and the inverse temperature β) and the average energy per spin ε.
then the equilibrium condition Eq. (1.2.11) becomes
T1 = T2
(1.2.13)
For the spin-1/2 system, Eqs. (1.2.8) and (1.2.12) give

    1/T = kB β = (kB/(2ε0)) ln[ (ε0 − ε)/(ε0 + ε) ]             (1.2.14)
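The closed form (1.2.14) can be checked against the definition (1.2.12) by differentiating the spin entropy (1.2.7) numerically. A sketch, in units with kB = ε0 = 1 (an illustrative choice):

```python
import math

def S_over_kB(E, N, eps0=1.0):
    # S/kB from Eq. (1.2.7) for N spin-1/2 spins, with E0 = N*eps0.
    E0 = N * eps0
    s = 0.0
    for f in (0.5 + E / (2 * E0), 0.5 - E / (2 * E0)):
        s -= f * math.log(f)
    return N * s

def inv_T_analytic(eps, eps0=1.0, kB=1.0):
    # Eq. (1.2.14): 1/T = (kB / (2 eps0)) ln[(eps0 - eps)/(eps0 + eps)]
    return (kB / (2 * eps0)) * math.log((eps0 - eps) / (eps0 + eps))

N, kB = 1000, 1.0
eps = -0.4                     # average energy per spin (illustrative value)
E, dE = N * eps, 1e-4
# 1/T = dS/dE via a symmetric finite difference, Eq. (1.2.12)
inv_T_numeric = kB * (S_over_kB(E + dE, N) - S_over_kB(E - dE, N)) / (2 * dE)
print(inv_T_numeric, inv_T_analytic(eps))
```

Note that for ε < 0 the temperature is positive, and T → ∞ as ε → 0, consistent with Fig. 1.3.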
1.2.3 Curie's law

For a spin-1/2 system in a magnetic field B, ε0 = gμB B/2. The total magnetic energy is −MB, where M is the magnetic moment, so the energy per spin is ε = −MB/N. From Eq. (1.2.14), we find a relation between the B-field-induced magnetic moment M and the temperature T:

    1/T = (kB/(g μB B)) ln[ (g μB N + 2M)/(g μB N − 2M) ]
[Figure: the measured magnetic susceptibility χ (emu/mole Cu) as a function of temperature T (K), compared with the Curie law.]
When kB T ≫ g μB B, we have M ≪ g μB N, and

    M = g² μB² N B / (4 kB T)

We find the magnetic susceptibility

    χ = M/B = g² μB² N / (4 kB T)
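The relation between M and T above can be inverted in closed form: with ε0 = g μB B/2 it gives M = (g μB N/2) tanh(g μB B/(2 kB T)), which reduces to the Curie form at high temperature. A sketch comparing the two, in units where kB = μB = 1 and with arbitrary illustrative values of g, B, N:

```python
import math

# Illustrative parameter values (units with kB = muB = 1).
g, muB, B, N, kB = 2.0, 1.0, 0.01, 1.0, 1.0

def M_exact(T):
    # Inverting 1/T = (kB/(g muB B)) ln[(g muB N + 2M)/(g muB N - 2M)]
    # gives M = (g muB N / 2) * tanh(g muB B / (2 kB T)).
    return 0.5 * g * muB * N * math.tanh(g * muB * B / (2 * kB * T))

def M_curie(T):
    # High-temperature (kB T >> g muB B) Curie form: M = g^2 muB^2 N B / (4 kB T)
    return g**2 * muB**2 * N * B / (4 * kB * T)

for T in (0.1, 1.0, 10.0):      # all satisfy kB*T >> g*muB*B = 0.02
    print(T, M_exact(T), M_curie(T))
```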
1.2.4 Properties of entropy
From

    S(E)/kB = N [ −(1/2 + ε/(2ε0)) ln(1/2 + ε/(2ε0)) − (1/2 − ε/(2ε0)) ln(1/2 − ε/(2ε0)) ]   (1.2.15)

we see that the entropy is proportional to N, the size of the system. Thus S is an extensive quantity. In contrast, ε, the average energy per spin, is an intensive quantity. The total energy E is an extensive quantity and the temperature T is an intensive quantity.
Entropy and energy window

From the definition of entropy

    S(E, ΔE) = kB ln(number of states with energy between E and E + ΔE)          (1.2.16)

we see that the entropy also depends on the energy window ΔE. However, in the thermodynamical limit N → ∞, such a dependence can be dropped and we can regard S as a function of E only. To see this, we consider

    S(E, λΔE) = kB ln(number of states with energy between E and E + λΔE)
              = kB ln[ λ × (number of states with energy between E and E + ΔE) ]
              = S(E, ΔE) + kB ln λ                                               (1.2.17)

Since S is of order N while kB ln λ is of order 1, the dependence on the window can be neglected in the thermodynamical limit.
Figure 1.6: The total number of states Ω(E1)Ω(2Ē − E1) of the combined system with total energy 2Ē, as a function of the system-1 energy E1. The levels Ω(E1)Ω(E2) and Ω(Ē)Ω(Ē) are marked.
Additive property of entropy

Consider two systems, both with N spins. The first system has total energy E1 and the second E2. The first system has Ω1 = Ω(E1) possible states and the second Ω2 = Ω(E2) possible states.

If we put the two systems together, but forbid any exchange of energy between them (see Fig. 1.5a), then the combined system will have Ω = Ω1 Ω2 possible states. The entropy of the combined system, S = kB ln Ω, is the sum of the entropies of the subsystems:

    S = S1 + S2                                                 (1.2.18)
If we allow the two systems to exchange energy, the two systems will reach an equilibrium state. The subsystems will have the same average energy Ē = (E1 + E2)/2 in the equilibrium state, and the equilibrium state of the combined system will have a total energy 2Ē. The number of possible states becomes

    Ω̃ = Σ_{E1} Ω(E1) Ω(2Ē − E1).

Since the sum contains the term Ω(E1)Ω(E2) = Ω, it is clear that Ω̃ > Ω, and the equilibrium state has a higher entropy (see Fig. 1.6). Thus reaching equilibrium always increases entropy (the second law of thermodynamics).
After the two systems reach equilibrium, we now forbid the energy exchange. The total number of states is then reduced to Ω(Ē)Ω(Ē). We would like to show that ln[Ω(Ē)Ω(Ē)] = ln Ω̃ in the thermodynamical limit, i.e. that the system in Fig. 1.5b and the system in Fig. 1.5c have the same entropy. Since Ω(Ē)Ω(Ē) is the largest term in the sum Ω̃ = Σ_{E1} Ω(E1)Ω(2Ē − E1), and the sum contains fewer than 2N terms, we find

    Ω̃ > Ω(Ē)Ω(Ē) > Ω̃/(2N)                                     (1.2.19)

or

    ln Ω̃ > ln[Ω(Ē)Ω(Ē)] > ln Ω̃ − ln(2N)                        (1.2.20)
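For the spin model this counting can be done exactly with binomial coefficients: fixing the total number of up-spins plays the role of fixing the total energy, and the sum Ω̃ collapses to a single binomial by the Vandermonde identity. A small sketch with arbitrary illustrative sizes:

```python
import math

def C(n, k):
    return math.comb(n, k)

N = 50
M1, M2 = 10, 30            # up-spin numbers fixing E1 != E2 (illustrative)
K = M1 + M2                # total up-spins, fixed by the total energy

Omega = C(N, M1) * C(N, M2)            # energy exchange forbidden
Omega_tilde = C(2 * N, K)              # energy exchange allowed
largest = C(N, K // 2) ** 2            # largest term: both subsystems at Ebar

# Vandermonde identity: C(2N, K) equals the sum over all energy splittings.
assert Omega_tilde == sum(C(N, m) * C(N, K - m) for m in range(0, K + 1))

print(Omega_tilde > largest > Omega)              # equilibrium raises entropy
print(math.log(Omega_tilde) - math.log(largest))  # O(ln N), negligible vs S ~ N
```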
Figure 1.7: The lines represent possible states. The thick lines represent the states that actually appear in the ensembles.
Since the two entropies differ by less than kB ln(2N) while S is of order N, in the large-N limit we can regard them as equal.

From Fig. 1.6 we also see that as the system goes from Fig. 1.5a to the equilibrium state, Fig. 1.5b or Fig. 1.5c, the entropy of the system is maximized. In other words, the equilibrium state has the maximum entropy.
Reversible and irreversible processes

The system in Fig. 1.5b evolved from that in Fig. 1.5a. Thus there are only Ω(E1)Ω(E2) possible initial states, and there will be only Ω(E1)Ω(E2) possible final states. Although the system in Fig. 1.5b has Ω̃ states with energy 2Ē, it will only be in one of the Ω(E1)Ω(E2) possible final states. But we have no clue about which are the Ω(E1)Ω(E2) possible final states. We have lost that information. We only know the total energy of the system, so we only know that the state can be in one of the Ω̃ states. This is how the entropy gets increased.

The evolution from Fig. 1.5a to Fig. 1.5b is also represented in Fig. 1.7a. Fig. 1.7b represents a reversible (or adiabatic) evolution, say, caused by a change in ε0. We see that reversible (or adiabatic) processes do not change the entropy, since the number of possible states is not changed.
1.3 Classical ideal gas
Each degree of freedom is described by a point in phase space (q, p). A particle has three degrees of freedom and its state is described by (x, px, y, py, z, pz).

Consider an N-particle system. How many states have total energy below E? The answer is infinity. We need quantum physics to get a sensible result. Each state of one degree of freedom occupies a finite area ΔqΔp = h in its two-dimensional phase space. For the N-particle system, the phase space is 6N-dimensional, and each h^{3N} volume of the 6N-dimensional phase space corresponds to one state. Thus the number of states with total energy below E is given by
    N<(E) = (1/h^{3N}) ∫_{Σᵢ pᵢ²/2m < E} d^{3N}q d^{3N}p = V^N S_{3N} (2mE)^{3N/2} / (3N h^{3N})   (1.3.1)
where S_n is the total solid angle in n dimensions and ∫₀^R S_n r^{n−1} dr = S_n R^n/n is the volume of an n-dimensional ball of radius R. The number of states between E and E + ΔE is

    Ω(E) = N<(E + ΔE) − N<(E) = [ V^N S_{3N} (2m)^{3N/2} E^{(3N−2)/2} / (2 h^{3N}) ] ΔE       (1.3.2)
To obtain S_n, we note

    π^{n/2} = ∫ d^n x e^{−x²} = ∫₀^∞ S_n r^{n−1} e^{−r²} dr
            = (1/2) S_n ∫₀^∞ (r²)^{(n−2)/2} e^{−r²} d(r²) = (1/2) S_n Γ(n/2)                   (1.3.3)

We find that

    S_n = 2 π^{n/2} / Γ(n/2)                                                                   (1.3.4)
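The formula for S_n can be checked against the familiar low-dimensional values:

```python
import math

def S_n(n):
    # Total solid angle in n dimensions, Eq. (1.3.4): S_n = 2 pi^(n/2) / Gamma(n/2)
    return 2 * math.pi ** (n / 2) / math.gamma(n / 2)

# Known values: S_2 = 2*pi (circumference of the unit circle),
# S_3 = 4*pi (surface of the unit sphere), and ball volume V_n = S_n / n.
print(S_n(2), 2 * math.pi)
print(S_n(3), 4 * math.pi)
print(S_n(4) / 4, math.pi ** 2 / 2)   # volume of the unit 4-ball
```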
From Eqs. (1.3.2) and (1.3.4), dropping terms of order ln N,

    ln Ω ≈ N ln V + 3N ln((2πm)^{1/2}/h) − (3N/2) ln(3N/2) + 3N/2 + (3N/2) ln E
         = N ln N + N ln[ v (2πm)^{3/2}/h³ ] + N ( (3/2) ln(2ε/3) + 3/2 )                      (1.3.5)

where v = V/N is the volume per particle and ε = E/N is the average energy per particle.
A big problem: the entropy is NOT extensive, due to the N ln N term. We need to use a concept from quantum physics: identical particles. For identical particles,

    N<(E) = (1/(h^{3N} N!)) ∫_{Σᵢ pᵢ²/2m < E} d^{3N}q d^{3N}p                                  (1.3.6)

Using ln N! ≈ N ln N − N, we find

    S(E)/kB = N ln[ v (2πm)^{3/2}/h³ ] + N ( (3/2) ln(2ε/3) + 5/2 )                            (1.3.7)
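Extensivity can be verified numerically: computing S/N from the exact count (1.3.6) (with the N! included, via log-Gamma) at fixed v and ε, the result approaches the N-independent value (1.3.7) as N grows. A sketch in units with h = m = kB = 1 (an illustrative choice):

```python
import math

v, eps = 2.0, 1.5    # volume and energy per particle (illustrative values)

def s_asymptotic():
    # Eq. (1.3.7) per particle:
    # s/kB = ln[v (2 pi m)^(3/2) / h^3] + (3/2) ln(2 eps / 3) + 5/2
    return math.log(v * (2 * math.pi) ** 1.5) + 1.5 * math.log(2 * eps / 3) + 2.5

def s_exact(N):
    # Exact per-particle entropy from the count of states below E, Eq. (1.3.6):
    # S/kB = ln[ V^N S_3N (2mE)^(3N/2) / (3N h^(3N) N!) ]
    V, E = N * v, N * eps
    lnS3N = math.log(2) + (1.5 * N) * math.log(math.pi) - math.lgamma(1.5 * N)
    lnN_less = (N * math.log(V) + lnS3N + (1.5 * N) * math.log(2 * E)
                - math.log(3 * N) - math.lgamma(N + 1))
    return lnN_less / N

for N in (10, 100, 10000):
    # The per-particle entropy converges to the N-independent asymptotic value.
    print(N, s_exact(N), s_asymptotic())
```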
For identical particles, the entropy is extensive. The entropy per particle, s, is given by

    s/kB = S/(N kB) = ln[ v (2πm)^{3/2}/h³ ] + (3/2) ln(2ε/3) + 5/2
         = ln[ v (4πmε/3)^{3/2}/h³ ] + 5/2 ∼ ln(v/λ³)                                          (1.3.8)

Meaning: ε is the average energy per particle, (2mε)^{1/2} is the corresponding momentum, and λ = h/(2mε)^{1/2} is the corresponding wavelength. Then v/λ³ is the number of wave packets that can be fitted into the volume per particle.
Inverting Eq. (1.3.8), and dropping a numerical factor of order one (which will not affect the equation of state below), we get

    ε ∼ (h²/(2m v^{2/3})) e^{2s/(3kB)}

or

    E(S, V, N) = N (h² N^{2/3}/(2m V^{2/3})) e^{2S/(3N kB)}                                    (1.3.9)
The temperature is

    T = (∂E/∂S)_V = (2/(3kB)) (h² N^{2/3}/(2m V^{2/3})) e^{2S/(3N kB)}                         (1.3.10)

The pressure is

    P = −(∂E/∂V)_S = (2N/(3V)) (h² N^{2/3}/(2m V^{2/3})) e^{2S/(3N kB)}                        (1.3.11)

We obtain the equation of state

    P V = N kB T                                                                               (1.3.12)
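The steps above can be checked by differentiating Eq. (1.3.9) numerically; in units with h = m = kB = 1 (an illustrative choice), the products P V and N kB T agree:

```python
import math

h, m, kB = 1.0, 1.0, 1.0   # illustrative unit choice

def E(S, V, N):
    # Eq. (1.3.9): E = N * (h^2 N^(2/3) / (2 m V^(2/3))) * exp(2S / (3 N kB))
    return (N * h**2 * N ** (2 / 3) / (2 * m * V ** (2 / 3))
            * math.exp(2 * S / (3 * N * kB)))

S, V, N = 50.0, 30.0, 20   # arbitrary illustrative state
d = 1e-6
T = (E(S + d, V, N) - E(S - d, V, N)) / (2 * d)     # T = (dE/dS)_V
P = -(E(S, V + d, N) - E(S, V - d, N)) / (2 * d)    # P = -(dE/dV)_S
print(P * V, N * kB * T)   # the two sides of P V = N kB T
```

Note that the dropped numerical prefactor in Eq. (1.3.9) only rescales E, T, and P together, so it cancels from the equation of state.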