
5

Entropy

5.1. Clausius' theorem


So far, we have only discussed cycles in which the system
exchanges heat at two temperatures only. For heat engines based on
such cycles we have, according to Carnot's theorem,
$$\eta \le \eta_{\mathrm{rev}},$$

where $\eta_{\mathrm{rev}}$ is the efficiency of a reversible engine operating between the
same temperatures. Substituting for the efficiencies, $\eta = 1 - Q_2/Q_1$ and
$\eta_{\mathrm{rev}} = 1 - T_2/T_1$,

$$\frac{Q_2}{Q_1} \ge \frac{T_2}{T_1}$$

by the definition of thermodynamic temperature. Therefore,

$$\frac{Q_1}{T_1} - \frac{Q_2}{T_2} \le 0.$$

Taking the heat entering the system as positive, we may write this¹

$$\frac{Q_1}{T_1} + \frac{Q_2}{T_2} \le 0. \tag{5.1}$$
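As a numerical illustration of (5.1) (reservoir temperatures and heats are assumed values, not from the text), one can check that the sum vanishes for a reversible engine and is negative for any less efficient one:

```python
# Check of (5.1) for a two-reservoir engine; temperatures and heats are
# illustrative assumptions, with heat entering the system counted positive.
T1, T2 = 500.0, 300.0      # reservoir temperatures, K
eta_rev = 1 - T2 / T1      # Carnot efficiency, 0.4 here

def reservoir_sum(eta, Q1=1000.0):
    """Q1/T1 + Q2/T2 for an engine of efficiency eta absorbing Q1 at T1."""
    Q2 = -(1 - eta) * Q1   # heat rejected by the engine enters it negatively
    return Q1 / T1 + Q2 / T2

print(reservoir_sum(eta_rev))   # ≈ 0: equality in (5.1) for the reversible engine
print(reservoir_sum(0.25))      # < 0: the inequality strengthens as efficiency drops
```

Note how the sum becomes more negative the further the efficiency falls below the Carnot value, matching the footnote's remark.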

We shall now prove a corresponding result for general cyclic processes


of any degree of complexity. In particular, there will be no restrictions
on the number of degrees of freedom of the system nor on the tem-
perature at which it may exchange heat with its surroundings.
¹ It should be noted that the inequality becomes stronger as the engine
becomes less efficient and less reversible. We shall later see how this comes
about in a more general way.

Downloaded from https://www.cambridge.org/core. Chinese University of Hong Kong, on 25 Jan 2022 at 19:22:29, subject to the Cambridge Core
terms of use, available at https://www.cambridge.org/core/terms. https://doi.org/10.1017/CBO9781139167703.006

To make the system execute the cycle, an appropriate series of


adjustments has to be made to its parameters (involving work) and at each
stage the appropriate amount of heat has to be supplied. The cycle itself
may be irreversible, but we may supply the heat reversibly by operating
a minute Carnot engine between the system and a large reservoir at
constant temperature (Fig. 5.1). By ensuring that the Carnot engine is
in thermal equilibrium with the reservoir or system during the transfer
of heat, no irreversibility is involved there. The series of processes
followed by the Carnot engine C is as follows. All are reversible.
(a) C is at T₀.
(b) C is compressed (or expanded) adiabatically until its temperature
is T.
(c) C is placed in contact with the system and absorbs or supplies
heat by an isothermal change at T.
(d) C is expanded (or compressed) adiabatically until its temperature
is T₀.
(e) C is placed in contact with the reservoir and compressed (or
expanded) isothermally at T₀ until it regains its original state.
In this way, the complex engine executes its cycle in infinitesimal steps
and no assumptions are made about the uniqueness of its adiabatics nor
about whether the working substance can depart from the specified
cycle.²
Fig. 5.1. Proof of Clausius' theorem.

A misleading method which is often used to prove Clausius' theorem is to


superimpose on the general cycle a mesh of adiabatics and isotherms so as to
subdivide it into infinitesimal cycles to which (5.1) is applied. There are two
objections to this. In the first place, the argument depends on the assumption
that the system could exist in all of the states involved in the subdivision.
Clearly, this might not be true. The second, and more serious objection is
that we have not yet proved the existence of unique adiabatic surfaces for
systems of more than two variables. Thus the result would not be of general
validity.


If the heat supplied to the working substance at T in one journey of
the Carnot engine is $\mathrm{d}Q$, the corresponding heat absorbed from the
reservoir is

$$\mathrm{d}Q_0 = T_0\,\frac{\mathrm{d}Q}{T}.$$

Hence, the heat absorbed from the reservoir in one complete cycle of
the complex engine is

$$Q_0 = T_0 \oint \frac{\mathrm{d}Q}{T} \le 0$$

by the Kelvin statement of the second law. However, T₀ is necessarily
positive, and therefore,

$$\oint \frac{\mathrm{d}Q}{T} \le 0 \quad \text{for any cycle.} \tag{5.2}$$

If the complex cycle were reversible we could have executed it in the
opposite direction and derived the result

$$\oint \frac{\mathrm{d}Q}{T} \ge 0. \tag{5.3}$$

But (5.2) applies to any cycle and therefore necessarily to reversible
cycles in particular. Hence, if the cycle is reversible both (5.2) and (5.3)
must be satisfied, giving

$$\oint \frac{\mathrm{d}Q}{T} = 0 \quad \text{for a reversible cycle.} \tag{5.4}$$

The two results (5.2) and (5.4) together form Clausius' theorem, which
may be stated formally as follows:

For any closed cycle, $\oint \dfrac{\mathrm{d}Q}{T} \le 0$, where the equality necessarily
holds for a reversible cycle.

Clausius' inequality is very important, for our whole treatment of


irreversible processes will follow from it.
It is important to be clear about the significance of T in the above
results. In an irreversible cycle the various parts of the system might
not always be in equilibrium with one another and, in particular, there
might be temperature differences, making it impossible to define a
temperature for the system as a whole. In the proof, T is the temperature
of the Carnot engine as the heat is transferred across the boundary of
the system. Thus the T appearing in the integrals is the temperature at
which heat is supplied to the system. Only if the source of heat is in


thermal equilibrium with the system as a whole does it become the


temperature of the system also.

5.2. Entropy
We now define a new variable, the entropy S, by the relation

$$\mathrm{d}S = \frac{\mathrm{d}Q}{T}$$

for an infinitesimal reversible change. To emphasize that the equality
holds for reversible changes only, the definition of S is written

$$\mathrm{d}S = \frac{\mathrm{d}Q_{\mathrm{rev}}}{T}. \tag{5.5}$$

Then for a finite reversible change of state, the change in entropy is
given by

$$\Delta S = \int \frac{\mathrm{d}Q_{\mathrm{rev}}}{T}. \tag{5.6}$$
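As a worked instance of (5.6) (a sketch, not from the text: an ideal gas is assumed, for which dU = 0 along an isotherm, so the reversible heat is $\mathrm{d}Q_{\mathrm{rev}} = p\,\mathrm{d}V$):

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

def delta_S_isothermal(n, V1, V2):
    """ΔS for a reversible isothermal ideal-gas expansion.

    Along an isotherm dU = 0, so dQ_rev = p dV = nRT dV/V, and (5.6)
    integrates to nR ln(V2/V1); T cancels out of the result.
    """
    return n * R * math.log(V2 / V1)

print(delta_S_isothermal(1.0, 1.0, 2.0))  # ≈ 5.76 J/K for a doubling of volume
```

A compression (V2 < V1) gives a negative ΔS: entropy leaves the gas with the heat extracted.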

We shall now show that entropy is a function of state.

Proof that S is a function of state. Construct any reversible cycle and


select any two states A and B on it (Fig. 5.2). Clausius' theorem states

$$0 = \oint_{ACBDA} \frac{\mathrm{d}Q}{T} = \oint_{ACBDA} \mathrm{d}S$$

Fig. 5.2. Proof that entropy is a function of state.


from the definition of S, all processes being reversible;

$$\oint_{ACBDA} \mathrm{d}S = \int_{ACB} \mathrm{d}S + \int_{BDA} \mathrm{d}S.$$

Therefore,

$$\int_{ACB} \mathrm{d}S = -\int_{BDA} \mathrm{d}S = \int_{ADB} \mathrm{d}S = S_B - S_A.$$

If the path via D is kept fixed and the path via C varied, we see that

$$\int_{ACB} \mathrm{d}S = S_B - S_A$$

always takes the same value for any reversible path from A to B. Hence,
apart from an arbitrary additive constant, S must be uniquely defined
for every state of the system: i.e., S is a function of state.
Since S is a function of state, dS must be a perfect differential (i.e.,
it is uniquely defined for any given change of state and can therefore
always be integrated). But we have defined S by the equation

$$\mathrm{d}S = \frac{\mathrm{d}Q_{\mathrm{rev}}}{T},$$

where $\mathrm{d}Q$ is not a perfect differential. Thus we have discovered that
there is an integrating factor for $\mathrm{d}Q_{\mathrm{rev}}$, namely, 1/T.
It also follows immediately that adiabatics exist and are unique for
systems of any number of degrees of freedom, for they are simply the
surfaces of constant entropy, the isentropes. This deals with the point
we had to leave in section 4.7.
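For a system of two variables the integrating factor can be exhibited explicitly. A sketch for one mole of a perfect gas (assuming $C_V$ depends on T only):

```latex
% dQ_rev is not a perfect differential: the cross-derivatives differ,
\mathrm{d}Q_{\mathrm{rev}} = C_V\,\mathrm{d}T + p\,\mathrm{d}V,
\qquad
\left(\frac{\partial C_V}{\partial V}\right)_T = 0
\;\neq\;
\left(\frac{\partial p}{\partial T}\right)_V = \frac{R}{V};
% but dividing through by T restores exactness:
\mathrm{d}S = \frac{\mathrm{d}Q_{\mathrm{rev}}}{T}
            = C_V\,\frac{\mathrm{d}T}{T} + R\,\frac{\mathrm{d}V}{V},
\qquad
\frac{\partial}{\partial V}\!\left(\frac{C_V}{T}\right) = 0
= \frac{\partial}{\partial T}\!\left(\frac{R}{V}\right).
```

The cross-derivatives agree for dS but not for $\mathrm{d}Q_{\mathrm{rev}}$, which is precisely the statement that 1/T is an integrating factor.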

5.3. Entropy in irreversible changes


Since entropy is a function of state, the change in entropy
accompanying a given change of state must always be the same, however
the change of state occurs. Only when the change takes place reversibly,
however, is the entropy change related to the heat transfer by the
equation

$$\Delta S = \int \frac{\mathrm{d}Q}{T},$$

for we imposed the condition of reversibility in the initial definition.
What is the relationship between entropy change and heat transfer in
irreversible processes?


Consider an irreversible change, A → B. Construct any reversible path
R between A and B, thus forming an irreversible cycle ABRA (Fig.
5.3). For the irreversible cycle Clausius' theorem gives

$$\oint_{ABRA} \frac{\mathrm{d}Q}{T} \le 0.$$

Taking the integral in two parts,

$$\int_{A\,\mathrm{irr}}^{B} \frac{\mathrm{d}Q}{T} + \int_{B\,\mathrm{rev}}^{A} \frac{\mathrm{d}Q}{T} \le 0,$$

i.e.,

$$\int_{A\,\mathrm{irr}}^{B} \frac{\mathrm{d}Q}{T} \le \int_{A\,\mathrm{rev}}^{B} \frac{\mathrm{d}Q}{T}.$$

But

$$\int_{A\,\mathrm{rev}}^{B} \frac{\mathrm{d}Q}{T} = S_B - S_A$$

by definition of entropy. Thus

$$\int_{A\,\mathrm{irr}}^{B} \frac{\mathrm{d}Q}{T} \le S_B - S_A,$$

or,

$$\mathrm{d}S \ge \frac{\mathrm{d}Q}{T}$$
Fig. 5.3. Determination of the behaviour of entropy in an irreversible change.


for a differential irreversible change. Thus, we have the general result


$$\mathrm{d}S \ge \frac{\mathrm{d}Q}{T} \tag{5.7}$$
for any infinitesimal change where the equality necessarily applies if the
change is reversible. Again, T is the temperature at which the heat is
supplied to the system. Only when the source of heat is in thermal
equilibrium with the system as a whole does it become the temperature
of the system also.
Equation (5.7) is extremely important. It contains all the information
required for dealing with efficiency and irreversibility in thermal pro-
cesses. It may therefore be thought of as the focal point of the second
law since it is through it that the objectives of the second law are realized.
For a system which is thermally isolated (or completely isolated)
$\mathrm{d}Q = 0$. Applying (5.7) we see that $\mathrm{d}S \ge 0$. This general result is known
as the law of increase of entropy which may be stated formally as:
• The entropy of an isolated system cannot decrease.
A particular application of this law is that it may be used to determine the
equilibrium configuration of an isolated system. In approaching
equilibrium the entropy of the system can only increase. Therefore, the
final equilibrium configuration is that for which the entropy is as large as
possible. Later, when we come to discuss the interpretation of entropy, we
shall see how this principle may be applied.
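As a concrete check of (5.7), consider the Joule free expansion of an ideal gas into vacuum. A minimal numerical sketch (one mole assumed; the entropy change is evaluated along a reversible isothermal path between the same end states, since S is a function of state):

```python
import math

R = 8.314  # gas constant, J K^-1 mol^-1

# Joule free expansion of one mole of ideal gas into vacuum, doubling its
# volume. The container is isolated, so no heat crosses the boundary:
dQ_over_T = 0.0

# dS evaluated along a reversible isotherm between the same end states:
dS = R * math.log(2.0)   # ≈ 5.76 J/K

# (5.7): dS >= dQ/T, with strict inequality for this irreversible change.
print(dS > dQ_over_T)
```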
It should be noted that the law of increase of entropy provides a
natural direction to the time sequence of natural events. Within the
mechanistic framework of Newtonian mechanics all processes are revers-
ible in time. (The equations remain unaltered in form on replacing t by
−t.) Why then is there the inevitable sequence to events, the so called
'arrow of time'? Thermodynamics does not answer this problem but it
does provide a new insight. The natural direction of events is that in
which entropy increases. All changes are therefore part of the irreversible
progress towards universal equilibrium. Thus, the arrow of time results
from there not being thermodynamic equilibrium throughout the uni-
verse. As long as temperature differences or density differences exist
natural evolution will continue and events will be directed forwards
towards equilibrium.

5.4. The entropy form of the first law


From the first law we were able to deduce the existence of
internal energy U, a function of state. For any change of state, however
it occurs, the change in U is given by equation (3.3), namely:

$$\mathrm{d}U = \mathrm{d}Q + \mathrm{d}W,$$

where dQ and dW are not differentials of functions of state and are


therefore not individually defined for a given change of state. To separate
the contributions to U from heat and work the constraints on the system
have to be known so that the path of the change may be found. If the
change takes place reversibly, the work done may be expressed in terms
of the system's parameters of state in the form £ X{ dxi9 and only when
the path is known can this be integrated. Thus, taking a simple fluid as
our model, we have
$$\mathrm{d}U = \mathrm{d}Q + \mathrm{d}W \quad \text{always} \tag{5.8}$$
$$\mathrm{d}W = -p\,\mathrm{d}V \quad \text{for reversible changes} \tag{5.9}$$

and we have defined entropy such that

$$\mathrm{d}Q = T\,\mathrm{d}S \quad \text{for reversible changes.} \tag{5.10}$$

Substituting (5.9) and (5.10) in (5.8) we obtain

$$\mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V \quad \text{for reversible changes.} \tag{5.11}$$
However, in this equation all the variables are functions of state so that
all the differentials are perfect. As a result, integration of this equation
must be independent of the path of integration and the equation may
be applied to any change of state, however accomplished. To use the
equation we only require that initial and final states be defined and that
there is some reversible path between them. To find the change in
internal energy accompanying an irreversible change we choose any
convenient reversible path between initial and final states and integrate
(5.11) along it. Thus, by expressing dU in terms of state functions only,
we have
$$\mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V \quad \text{always.} \tag{5.12}$$
Changes in any state function may be calculated by a convenient revers-
ible path in the same way.
For irreversible changes, the equalities (5.9) and (5.10) do not hold.
We have already shown that in this case (5.10) becomes the inequality
$\mathrm{d}Q < T\,\mathrm{d}S$, so that for (5.12) to remain true $\mathrm{d}W > -p\,\mathrm{d}V$. This is what
one would expect. In the presence of irreversibility (when there is friction,
for example), the total work done is greater than that which would be
required to effect the same change in volume of the system without the
irreversibility.
The general form of the first law is thus
$$\mathrm{d}U = T\,\mathrm{d}S + \sum_i X_i\,\mathrm{d}x_i \tag{5.13}$$

where $X_i$ and $x_i$ are the intensive variables and their conjugate extensive
variables. It is clear from its definition, that entropy is an extensive
variable so that from the form of (5.13) thermodynamic temperature


must be its corresponding intensive variable. The term T dS is thus


entirely similar to the work terms and may be grouped with them. This
gives the first law in a condensed form:

$$\mathrm{d}U = \sum_i X_i\,\mathrm{d}x_i \tag{5.14}$$

where the summation necessarily includes the term $T\,\mathrm{d}S$ which is
relevant to all systems.

5.5. Entropy and the degradation of energy


The work that can be extracted from a system in an infinitesimal
change of state is $\mathrm{d}W = \mathrm{d}Q - \mathrm{d}U$. We have shown that $\mathrm{d}Q$ is related to
the entropy change by $\mathrm{d}Q \le T_0\,\mathrm{d}S$, where $T_0$ is the temperature at which
the heat is supplied, so that $\mathrm{d}W$ must satisfy the inequality
$\mathrm{d}W \le T_0\,\mathrm{d}S - \mathrm{d}U$. Thus, for a given change of state (so that $\mathrm{d}U$ and $\mathrm{d}S$ are
fixed), the maximum amount of work is extracted from the system when
the equality applies; that is, when the change is reversible. In this case,
the total entropy change of the system and its surroundings is zero,³ for,
in any process involving reversible exchange of heat with the surroundings,
$\mathrm{d}S_{\text{system}} = -\mathrm{d}S_{\text{surroundings}}$; whereas, in an irreversible change, the
entropy change of the surroundings (assuming no irreversibility there)
is $\mathrm{d}S_0 = -\mathrm{d}Q/T_0$, while that of the system satisfies the inequality
$\mathrm{d}S > \mathrm{d}Q/T_0$. In this case, the entropy of the universe may increase, and, if
it does, we are able to extract less work from the system than would
have been the case if the same change had been made reversibly. Thus,
associated with the increase of entropy is the 'loss' of some energy which
could have been used for work. Clearly, this energy does not vanish,
for this would violate the first law, but rather it takes a form from which
it may be converted into work with less efficiency than previously. The
energy becomes degraded in that it is less useful for work. We may
illustrate this by a simple example.
Consider two bodies 1 and 2 which are at temperatures $T_1$ and $T_2$.
Suppose that $T_1 > T_2$. Then if we connect the bodies together by a
thermal resistance and allow a small quantity of heat q to flow, the total
change of entropy is

$$\Delta S = \Delta S_1 + \Delta S_2 = q\left(\frac{1}{T_2} - \frac{1}{T_1}\right) > 0, \quad \text{while } T_1 > T_2.$$
Thus, the entropy increases and will continue to increase as long as the
heat flows, bringing the bodies towards equilibrium.
³ In this kind of situation, one often speaks of the entropy of the universe as
being conserved. This is simply a convenient way of lumping together the
system and its surroundings.


Now suppose that instead of allowing q to flow from 1 to 2 we used
it to operate a Carnot engine from which to obtain mechanical work.
Let us suppose that $T_0$ is the temperature of the coldest reservoir we
have to hand for use with the Carnot engine. Then, by extracting q from
1 we could have obtained work

$$W_1 = q\left(1 - \frac{T_0}{T_1}\right).$$

If, however, we first allow q to flow from 1 to 2 and then use it to
operate the Carnot engine we only obtain

$$W_2 = q\left(1 - \frac{T_0}{T_2}\right).$$

Thus, in the course of the irreversible heat conduction the energy has
become degraded to the extent that the useful work we may obtain from
it has been decreased by

$$W_1 - W_2 = qT_0\left(\frac{1}{T_2} - \frac{1}{T_1}\right) = T_0\,\Delta S.$$
The increase in entropy in an irreversible change is thus a measure


of the extent to which energy becomes degraded in that change. Con-
versely, in order to extract the maximum amount of useful work from
a system or set of systems, changes must be performed in a reversible
manner so that total entropy (entropy of the system and its surroundings)
is conserved.
It is worth pointing out that if the two bodies in the above illustration
were allowed to reach thermal equilibrium (a) by heat conduction and
(b) by operating a Carnot engine between them and extracting work,
the final equilibrium temperatures would be different in the two cases.
In the first, $U_1 + U_2$ is conserved and the final temperature is

$$T^{(U)} = \frac{C_1 T_1 + C_2 T_2}{C_1 + C_2},$$

where the Cs are the thermal capacities, which, for simplicity, we have
taken to be constants. In the second case, $S_1 + S_2$ is conserved and
$W = -\Delta(U_1 + U_2)$. In the isentropic process, the final temperature is
given by

$$T^{(S)} = T_1^{C_1/(C_1+C_2)}\,T_2^{C_2/(C_1+C_2)} \le T^{(U)}. \tag{5.15}$$

The difference in the final temperature corresponds to the lower value


for the total internal energy which results from work having been done.
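A short numerical sketch of (5.15) (heat capacities and temperatures assumed): the conduction case gives the weighted arithmetic mean of the temperatures, the isentropic case the weighted geometric mean, and the internal-energy difference is the work extracted:

```python
# Two bodies with constant heat capacities brought to a common temperature
# (a) irreversibly by conduction, (b) reversibly via a Carnot engine.
# All numbers are illustrative assumptions.
C1, C2 = 100.0, 200.0   # thermal capacities, J K^-1
T1, T2 = 400.0, 300.0   # initial temperatures, K

# (a) U1 + U2 conserved: weighted arithmetic mean.
T_U = (C1 * T1 + C2 * T2) / (C1 + C2)
# (b) S1 + S2 conserved: weighted geometric mean, eq. (5.15).
T_S = T1 ** (C1 / (C1 + C2)) * T2 ** (C2 / (C1 + C2))

# Work extracted in the reversible case is the internal-energy deficit:
W = C1 * (T1 - T_S) + C2 * (T2 - T_S)

print(T_U, T_S, W)   # T_S < T_U and W > 0
```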

5.6. Entropy and order


We have shown that the equilibrium state of an isolated system
is that for which the entropy takes on its maximum value, so that in


terms of macroscopic variables the maximization of entropy is the


condition for determining the equilibrium configuration. An alternative
approach would be to apply probability theory at the microscopic level
to the various possible configurations of the system and to seek that
configuration whose probability is greatest. This is the method of the
discipline known as Statistical Mechanics or Statistical Thermodynamics
(see Rosser, 1982, or Kittel and Kroemer, 1980). The exact definition
of the statistical probability of a particular macroscopic state, for which
we shall use the symbol g, is outside the scope of this book, but its
relationship to entropy is so important in making it possible to link
macroscopic and microscopic properties that some discussion of it is
essential.
In seeking the most probable configuration of a system we are, in
fact, seeking the configuration of the greatest disorder permitted by the
constraints to which the system is subjected. A configuration which
requires particular conditions of order (such as that no molecules should
be in a particular region of space), is clearly less likely to occur spon-
taneously than one in which no conditions are specified. Thus the most
probable configuration, the equilibrium configuration, is that in which
the disorder is as great as possible. The statistical probability of a
particular configuration is therefore a measure of its disorder. Without
involving ourselves in the exact definition of g we may illustrate its
connection with disorder by taking a simple example.
Consider a fixed mass of gas in a container. We divide the container
into two equal parts, A and B, and consider the probability that the
molecules will all be in one half. The probability that a particular
molecule will be in A is clearly ½. The probability of finding two particular
molecules in A at the same time is ½ × ½. Extending the argument to all
N molecules, the probability that all molecules will be in A at any
particular time is (½)^N. We may therefore compare the statistical probability
that all the molecules are in A, $g_A$, with that for the molecules to
occur randomly throughout the whole box, $g_{A+B}$:

$$\frac{g_A}{g_{A+B}} = \left(\frac{1}{2}\right)^N. \tag{5.16}$$

(If the box contains 1 mol of the gas, we have $N = 6 \times 10^{23}$ and we see
that the chance of finding all the gas in one half of the box is about 1
in $10^{1.8\times10^{23}}$. This would occur spontaneously about once in $10^{1.8\times10^{23}}$
universes: a rare event.)
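The order of magnitude quoted in the parenthesis can be checked directly (N taken as 6 × 10²³; the probability itself underflows floating point, so we work with its logarithm):

```python
import math

# Chance that all N molecules of one mole occupy half the box, eq. (5.16):
# p = (1/2)**N is far below floating-point range, so compute log10(p).
N = 6.0e23
log10_p = N * math.log10(0.5)

print(log10_p)   # ≈ -1.8e23, i.e. p ~ 1 in 10**(1.8e23)
```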
This simple illustration demonstrates the connection between statis-
tical weight and disorder. We have shown that for equilibrium the


macroscopic quantity entropy must be maximized and how the corres-


ponding microscopic condition is the maximization of g, which is related
to the disorder in the system. Can we arrive at an explicit connection
between entropy and order? We may see what this might be by consider-
ing two systems, 1 and 2. Entropy is an extensive variable (section 5.4),
so that the total entropy of the two systems taken together is

$$S_{1+2} = S_1 + S_2. \tag{5.17}$$

The probability of finding the systems simultaneously in particular
configurations we specify for them is the product of the probabilities for
each system alone:

$$g_{1+2} = g_1 g_2. \tag{5.18}$$
Clearly, (5.17) and (5.18) are satisfied simultaneously if

$$S = k \ln g,$$

where k is a constant. We may prove that this is necessarily the form
of the relation as follows.
Suppose

$$S = f(g).$$

Then, according to (5.17) and (5.18),

$$f(g_1 g_2) = f(g_1) + f(g_2).$$

Differentiating twice, with respect to first $g_1$ and then $g_2$,

$$g_2 f'(g_1 g_2) = f'(g_1), \qquad f'(g_1 g_2) + g_1 g_2 f''(g_1 g_2) = 0,$$

or,

$$f'(g) + g f''(g) = 0.$$

Integrating,

$$\ln f'(g) = -\ln g + \text{constant},$$

or

$$f'(g) = k/g,$$

where k is a constant. Therefore,

$$S = k \ln g + S_0,$$

where $S_0$ is the constant of integration which it is convenient to take as
zero, corresponding to a statistical probability of unity for a completely
ordered state. Thus we have proved that the relation between the entropy
and the statistical probability is

$$S = k \ln g. \tag{5.19}$$


This is the important Boltzmann relation which links classical thermody-


namics with the microscopic properties of a system. We may show that
k is Boltzmann's constant, R/NA, by considering again the perfect gas
contained in a box. We calculate the difference in entropy between the
state in which the gas is entirely in one half of the box, and that in which
it is uniformly distributed throughout the box. We do this by first
imagining that the gas is constrained to one half of the box by a partition
and that the partition is then punctured to allow the gas to fill the whole
box. In the (irreversible) expansion $\mathrm{d}Q = \mathrm{d}W = 0$. Therefore,

$$\mathrm{d}U = T\,\mathrm{d}S - p\,\mathrm{d}V = 0.$$

We may now choose a convenient reversible path by which to evaluate
the terms in the latter equation since all these are functions of state.
Using the perfect gas law, equation (8.10), and considering one mole,

$$\frac{p}{T} = \frac{R}{V},$$

giving

$$\Delta S = \int \frac{p}{T}\,\mathrm{d}V = R \int_{V_1}^{V_2} \frac{\mathrm{d}V}{V} = R \ln\frac{V_2}{V_1} = R \ln 2 = N_A k \ln 2,$$

where k is Boltzmann's constant and $N_A$ is the Avogadro constant.
Comparing with (5.16) and (5.19) we see that k in (5.19) is indeed
Boltzmann's constant.
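The comparison can be made numerically (constants to the precision quoted here): the thermodynamic route gives R ln 2 per mole, while the statistical route gives $N_A k \ln 2$, and the two agree exactly when $k = R/N_A$:

```python
import math

R = 8.314        # gas constant, J K^-1 mol^-1
N_A = 6.022e23   # Avogadro constant, mol^-1
k = R / N_A      # Boltzmann's constant, ≈ 1.38e-23 J K^-1

# Thermodynamic route (reversible isothermal path): ΔS = R ln 2 per mole.
dS_thermo = R * math.log(2)
# Statistical route, eqs. (5.16) and (5.19): ΔS = k ln 2**N_A = N_A k ln 2.
dS_stat = N_A * k * math.log(2)

print(dS_thermo, dS_stat)  # both ≈ 5.76 J K^-1
```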
Thus the entropy of a system is a measure of the disorder within it.
This now makes it possible to interpret the degradation of energy
discussed in the previous section. If energy is to be extracted from a
system as efficiently as possible, that energy should be stored in an
ordered form. A mechanical storage device such as a spring is ideal but
thermal energy is also useful, particularly if the temperature is high, for
T is the intensive variable coupled with S. When energy is degraded in
an irreversible change it takes a less ordered form. This is obvious in
the case of mechanical friction where ordered mechanical energy is
dissipated as the disordered molecular motions of heat; but it applies
also to the flow of heat down a temperature gradient where the non-
equilibrium ordering of thermal energy, corresponding to the existence
of the temperature difference, is reduced.
The direct relationship of entropy to disorder is extremely important
in providing a link between macroscopic variables and microscopic


processes. We shall illustrate this relationship with a few simple


examples.

5.6.1. Heat capacities


Thermal energy is stored in a solid in the thermal motions of
its atoms, and, if it is a metal, of its electrons also. The equations
governing the motions of the atoms and electrons vary little with tem-
perature, but the extent of the thermal motions increases as the tem-
perature rises. The greater the thermal motions, the greater will be the
microscopic disorder in the system, and the greater the entropy, the
change in entropy being brought about by the heat which flows into the
body as the temperature is raised. Thus, common heat capacities are
associated with the gradual increase in disorder which accompanies a
rise in temperature.
Now the heat capacities may be written in terms of entropy derivatives.
For example,

$$C_V = T\left(\frac{\partial S}{\partial T}\right)_V.$$

We may use this relationship to calculate how the entropy of a solid
varies with temperature.

The heat capacity of an insulating solid follows the Debye law, according
to which, at low temperatures, $C_V \propto T^3$, and at high temperatures,
$C_V$ is constant (in agreement with Dulong and Petit's law). Thus, in the
low temperature limit, S increases as $T^3$, and, at high temperatures, S
varies as $\ln T$. In the case of a metal, the electronic contribution to the
heat capacity is proportional to temperature so that the electronic
contribution to the entropy is also proportional to T.
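A small numerical check of the low-temperature result (the Debye coefficient a is an assumed, purely illustrative value): integrating $C_V/T$ with $C_V = aT^3$ reproduces the closed form $S = aT^3/3$:

```python
# Low-temperature entropy of an insulating solid from the Debye law
# C_V = a*T**3, so S(T) = ∫0..T (C_V/T') dT' = a*T**3/3.
a = 2.0e-4   # J K^-4, assumed illustrative coefficient
T = 10.0     # K

# Midpoint-rule numerical integration of C_V/T' = a*T'**2 as a check:
steps = 100_000
dT = T / steps
S_numeric = sum(a * ((i + 0.5) * dT) ** 2 * dT for i in range(steps))
S_exact = a * T ** 3 / 3

print(S_numeric, S_exact)
```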

5.6.2. Heat capacity anomalies


We would normally expect the thermal motions of the atoms
of a material to increase smoothly as the temperature rises. In some
substances it is found that superimposed on the smoothly varying back-
ground heat capacity is an extra contribution which occurs at a particular
temperature in the form of a relatively narrow peak. Such behaviour is
known as a heat capacity anomaly. The rapid rise in entropy associated
with the heat capacity anomaly indicates that some microscopic change
in order is occurring, and the magnitude of the entropy rise can be used
as a guide to what the microscopic changes might be. At temperatures
below the anomaly some aspect of the system must be ordered and above


the anomaly disordered. Such a change is therefore known as an order-


disorder transition.
A high temperature example of a heat capacity anomaly is to be found
in β-brass, the 50/50 copper-zinc alloy. At about 460 °C there is a large
peak in the heat capacity (Fig. 5.4), indicating a local change of order.
Subtracting the background one obtains the anomalous contribution to
the heat capacity, c′. Integration of c′ yields the increase in energy and
integration of c′/T yields the increase in entropy associated with the
change of order. For 1 mol the entropy change is close to
$N_A k \ln 2 = 5.8\ \mathrm{J\,K^{-1}}$, suggesting a twofold change in order per pair of copper and
zinc atoms. The explanation, which has been confirmed for similar alloys
by X-ray studies,⁴ is that in the low temperature form the copper and
zinc atoms are arranged in a regular array whereas in the high temperature
form they are randomly distributed on the lattice sites. The
crystal structure of β-brass is body-centred cubic so that the ordered
array corresponds to, say, copper atoms at the corners of the cubes and
zinc atoms at the centres. The probability, in the disordered structure,
of finding a particular kind of atom at a particular lattice site is clearly
Fig. 5.4. The specific heat capacity of β-brass, plotted against t/°C (Moser, 1936).

4
The X-ray measurements cannot be done on β-brass. The difference between
the relative atomic masses of copper and zinc is so small that their scattering
power for X-rays is very similar and it is not possible to distinguish the two
kinds of atom.

½, since there are equal numbers of copper and zinc atoms; whence we
have for the change of entropy:

ΔS = k ln(g_disordered / g_ordered) = k ln 2^{N_A} = R ln 2.
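As a quick numerical check of the figure quoted above (using the standard SI value of the gas constant):

```python
import math

R = 8.314  # molar gas constant, J K^-1 mol^-1

# Each Cu-Zn pair has a twofold choice of order, so for one mole
# g_disordered / g_ordered = 2**N_A and ΔS = k ln 2**N_A = R ln 2.
dS = R * math.log(2)
print(f"{dS:.2f} J/K per mole")  # ≈ 5.76, close to the measured ~5.8 J K^-1
```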

Another example of an order-disorder transition is to be found in the
low temperature behaviour of paramagnetic salts. Here, the heat capacity
anomaly, which is usually within a few kelvins of absolute zero, results
from a change in magnetic order. Paramagnetism is always associated
with the presence of microscopic magnetic dipoles which may be aligned
by an external field to produce a net total magnetization. Even in the
absence of an applied field, however, the different orientations possible
for the dipoles have slightly different energies as a result of their interac-
tions with one another and with the crystal lattice in which they are
situated. At low temperatures, they will all occupy the lowest available
levels and the material will be magnetically ordered. In many cases, the
ordered state corresponds to the parallel alignment of ferromagnetism
with its large net magnetization, but this is not the only form of ordering
which occurs (see Rosenberg, 1975; ch. 12). As the temperature is
raised, the dipoles become excited into the higher levels, and at high
temperatures, they become randomly distributed among the orienta-
tions. Thus, over the range of temperatures where the ordering sets in
there is an extra contribution to the heat capacity deriving from the
changing magnetic order. The anomaly in chromium potassium alum is
shown in Fig. 5.5. Here there are four possible orientations for the
dipoles,5 so that the probability of finding a particular dipole in a specified
orientation is ¼. The change in entropy per element in proceeding from
the ordered to the disordered state is therefore k ln 4, and the entropy
change for one mole, k ln 4^{N_A} = R ln 4 = 1.39R. Measurements to the
lowest temperatures indicate that the entropy change associated with
the ordering is indeed of this order (de Klerk et al., 1949).
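The quoted value 1.39R can be verified in the same way (standard value of R assumed):

```python
import math

R = 8.314  # molar gas constant, J K^-1 mol^-1

# Four possible orientations per Cr3+ dipole:
# ΔS per mole = k ln 4**N_A = R ln 4
dS = R * math.log(4)
print(dS, dS / R)  # ≈ 11.53 J K^-1 mol^-1, i.e. 1.39 R
```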
It should be noted that in the above examples the magnitudes of the
changes in entropy are similar, but the transitions occur at very different
temperatures. The entropy change, of course, is simply dependent on
the change of order and plays no part in determining the transition
temperature. The latter is determined by energy considerations, being
that temperature at which thermal energy becomes comparable with
the energy associated with the ordering process. This gives for the
transition temperature, T ≈ ε/k, where ε is the energy required to
remove one element from the ordered state. (That is, to exchange a
copper atom with a zinc atom in the ordered brass or to disalign a
dipole in the ordered paramagnetic salt.) The corresponding energies in
the examples above are about 6 × 10⁻²¹ J ≈ 40 meV and 1 × 10⁻²⁴ J ≈
7 μeV.6

5
The magnetic moment is associated with the Cr3+ ion, which at low
temperatures behaves as if it were in a 4S state with J = S = 3/2 and
g = 2 (see de Klerk, 1956).
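The order-of-magnitude estimate T ≈ ε/k can be checked directly for both examples (SI value of k; the energies are those quoted above):

```python
k = 1.381e-23  # Boltzmann constant, J K^-1

# beta-brass: ε ≈ 6e-21 J per exchanged atom pair
T_brass = 6e-21 / k   # ≈ 430 K, same order as the observed peak at 460 °C (733 K)

# chromium potassium alum: ε ≈ 1e-24 J per dipole
T_salt = 1e-24 / k    # ≈ 0.07 K, same order as the anomaly observed below ~0.3 K

print(T_brass, T_salt)
```

Only the orders of magnitude are meaningful here, since ε itself is known only roughly.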

Fig. 5.5. The low temperature heat capacity of chromium potassium
alum, plotted as C/R against T/K (Bleaney, 1950; and de Klerk et al., 1949).

Fig. 5.6. Entropy near a first order change of phase; the entropy jump at the transition is ΔS = L/T.

6
It is often convenient to express energies of atomic-sized systems in
electronvolts. It is useful to remember that k = 1.38 × 10⁻²³ J K⁻¹ ≈
86.2 μeV K⁻¹.


5.6.3. Latent heats


In the examples we have discussed above, we have associated
anomalous heat capacities with the gradual change of entropy associated
with gradual change of order as temperature changes. Latent heats
correspond to a sudden change in order associated with a first order
phase change7 such as the melting of a solid or vaporization of a liquid
(Fig. 5.6).
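As an illustration of the entropy jump ΔS = L/T in Fig. 5.6, consider a familiar first order transition; the numbers for the melting of ice are standard tabulated values, not data from this chapter:

```python
R = 8.314          # molar gas constant, J K^-1 mol^-1
L_fus = 6.01e3     # J mol^-1, latent heat of fusion of water (tabulated value)
T_m = 273.15       # K, melting point

# Entropy jump at the first order transition
dS = L_fus / T_m
print(dS, dS / R)  # ≈ 22 J K^-1 mol^-1, about 2.6 R
```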
We may make a very crude estimate of the entropy change associated
with vaporization. If we think of the molecules in the liquid as moving
about freely, in a gas-like manner, but as being restricted to a much
smaller volume than when they are in the vapour phase, then the ratio
of the statistical probabilities of finding any one molecule in the large
volume available in the vapour rather than in the small volume available
in the liquid is simply equal to the ratio of the available volumes (cf.
the illustration at the beginning of section 5.6 where the volumes are
equal). Thus, the ratio of the statistical probabilities of the vapour and
liquid configurations for all N_A molecules of one mole is

g_vapour / g_liquid = (V_vapour / V_liquid)^{N_A} = (ρ_liquid / ρ_vapour)^{N_A}.

For many substances, the density ratio is about 10³. This gives the
entropy change associated with vaporization as

ΔS = k ln (10³)^{N_A} = R ln 10³ ≈ 7R.
Relating this to the latent heat, we have

L = T_b ΔS ≈ 7RT_b,

where T_b is the boiling point. This corresponds to Trouton's rule, found
empirically, which states that for non-associated liquids,

L/T_b ≈ 10.5R ≈ 88 J K⁻¹ mol⁻¹.
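A sketch of the comparison, assuming the round-number density ratio of 10³ used above and taking Trouton's empirical constant as roughly 10.5R (≈ 88 J K⁻¹ mol⁻¹):

```python
import math

R = 8.314  # molar gas constant, J K^-1 mol^-1

# Volume-ratio estimate: ΔS = R ln 10^3
dS_est = R * math.log(1e3)   # ≈ 57 J K^-1 mol^-1, about 6.9 R

# Trouton's empirical value for non-associated liquids (assumed ~10.5 R)
dS_trouton = 10.5 * R        # ≈ 87 J K^-1 mol^-1

print(dS_est, dS_trouton)    # the crude estimate falls short, as the text notes
```

The shortfall of the estimate relative to the empirical value is consistent with the argument below that the finite molecular volume is being neglected.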
The restriction to non-associated liquids is necessary, because when
association takes place a new degree of order is introduced in the liquid
giving rise to a further contribution to the entropy change. The weakness
in the argument leading to the estimate of L is that it ignores the finite
volume of the molecules which causes a much greater restriction of
molecular motion in the liquid than corresponds simply to the reduction
in available volume per molecule. This leads to an underestimate of L.

7
Change of phase will be discussed in detail in chapter 10.


5.6.4. Change in order by deformation


In the three examples above, change in order was brought about
by changing the temperature of the system. Change in order is also
generally brought about when work is done on a system under conditions
where heat may be exchanged with the surroundings. (If heat could not
be exchanged, the entropy would be invariant under reversible changes,
so that the statistical order would also be invariant. Irreversible work,
of course, always brings about an increase in entropy and a decrease in
order.) An illuminating example is to be found by contrasting the effects
of mechanical deformation in different kinds of solid:
If a metal wire is stretched adiabatically, it cools. If rubber is stretched
adiabatically, the temperature rises.
This is easily understood in terms of their microscopic properties:
The metal of the wire consists of many small crystallites in each of
which the atoms are arranged in a regular lattice. When the wire is
stretched, each crystallite is distorted and loses some of its symmetry.
(For example, the lattice might be distorted from cubic to tetragonal.)
The loss in symmetry is a loss of order and corresponds to an increase
in entropy. If the distortion were performed under isothermal conditions,
heat would be absorbed corresponding to the increase in entropy. When
performed (reversibly) under adiabatic conditions, the total entropy must
be constant. Then, in order to permit the entropy increase required by
the reduction of crystal symmetry, entropy (heat) has to be supplied
from some other aspect of the system itself. It comes from the thermal
motions of the material, so the temperature falls. The connection is
represented in terms of thermodynamic coefficients:

(∂T/∂x)_S = −(∂T/∂S)_x (∂S/∂x)_T.
The first term on the right is related to a principal heat capacity and is
always positive. The second term on the right is also positive, so the
temperature falls.
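The triple product identity behind this relation can be checked numerically for a hypothetical model entropy S(T, x) = C ln T + a x; the form and constants here are purely illustrative, not a real equation of state for a wire:

```python
import math

# Hypothetical model entropy for a stretched wire: S(T, x) = C ln T + a x,
# with C, a > 0 (illustrative values only).
C, a = 2.0, 0.3

def S(T, x):
    return C * math.log(T) + a * x

# Along the adiabat S = S0 through (T = 300, x = 0): T(x) = exp((S0 - a*x)/C)
S0 = S(300.0, 0.0)

def T_adiabat(x):
    return math.exp((S0 - a * x) / C)

# Left side: (dT/dx)_S by central finite difference at x = 0
h = 1e-6
lhs = (T_adiabat(h) - T_adiabat(-h)) / (2 * h)

# Right side: -(dT/dS)_x (dS/dx)_T, with (dT/dS)_x = T/C and (dS/dx)_T = a
rhs = -(300.0 / C) * a

assert abs(lhs - rhs) < 1e-3  # both sides agree: stretching at constant S cools
```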
The molecular arrangement in rubber, on the other hand, is very
different from that of a crystal. Rubber consists of long organic molecules
which are normally tangled together in a random manner. When the
rubber is stretched these long molecules tend to align along the direction
of extension and the order increases. Therefore, in this case, when the
material is stretched isentropically the temperature rises.

