Chapter-1-2 Pchem Nmsu

The document discusses early models of physical chemistry concepts that depicted heat and work in terms of measurable quantities like the displacement of a piston. It introduces the concepts of enthalpy and entropy through examples involving heat engines and the conversion of heat into work. It then discusses assumptions and distributions related to ideal gas behavior, specifically the Maxwell-Boltzmann distribution describing the probability of molecular velocities and its relationship to molecular mass and temperature. Finally, it derives the ideal gas law from Newton's second law by considering the force exerted by gas molecules colliding with the walls of a container.

Chapter 1 – Classical Physics Concepts

The "Stupid Piston" Model

Physical Chemistry is the field of chemistry that describes (and/or measures) the chemical world based upon its characteristics. Physically, our perception of the "nature" of substances is restricted to what we can measure. What we can measure is distance, time, and perhaps the sensation of heat.

Therefore, early models depicted concepts in these "measurable" quantities. For example, heat was a change that caused a gas to increase in volume (volume is just distance in three directions), thus driving a piston upwards. Work was also measured in terms of piston displacement. If the displacement went toward the center of the earth ("down"), it was defined as positive work (for the system). If the piston went up, it was negative work (for the system).
And we define the first law of thermodynamics:

ΔE = q + w

The heat engine – if the engine were perfect, then the q applied to drive the piston up would be quantitatively converted into w, the work that turns the belt. Note that the negative work (for the system) is positive work for the surroundings (the belt).

But it is noted that some of the q energy needed goes into heating the piston before the gas inside gets heated and begins to expand. We define a new variable. So, q(total needed) is defined thus:

q(total) = ΔE + PΔV = ΔH

for constant P. And we have invented enthalpy.


But of course some q is lost to vibrations in the machine, q to the surroundings, etc. The ΔE described in the previous slide can be confusing. Is it the ΔE including this mysterious loss of q to the surroundings, or not? So new terms, Gibbs free energy and entropy, were introduced to help keep things clear.

We define the heat lost to the surroundings as:

q(lost) = TΔS, i.e., ΔS = q(lost)/T

And we have invented entropy. And we define the total, all-inclusive change in the free energy as ΔG, and so:

ΔG = ΔH − TΔS    And the world was relieved

The second law of thermodynamics states: the entropy of the universe increases in any spontaneous process.

This is perhaps easiest to understand in terms of the heat engine, or in terms of driving your car. You all know that we put way too much gas into our car – and unfortunately not all that gas is converted into the wheels spinning, taking us places. Open the hood of your car after driving and you will find the engine is hot. That is because some of your precious gas (at $2–4/gal!) has been converted into S. If the engine were "reversible" then ΔS could equal zero. But such a thing does not exist.
Ideal Gases and the Maxwell-Boltzmann Distribution

Assumptions:
You will find in P Chem, the first thing we do when we try to describe some system, is
we make lots of simplifying assumptions. If we didn’t do this the system would be too
complex to describe mathematically. Surprisingly, this approach works better than you
might think. And, it works quite well for the description of the behavior of gases.

1. The gas molecules are independent, randomly moving particles.


2. When a gas molecule hits the side of a container (wall) it loses no energy to the
wall. I.e. the collisions are fully elastic.
3. Since the motions are random, the velocities of the gas molecules obey a Gaussian distribution.

A Gaussian distribution describes the number of times you would get "heads" when tossing a coin. Gaussian distributions obey an exponential function.
$$f(h)=\sqrt{\frac{2}{\pi N}}\;e^{-2(h-\langle h\rangle)^2/N}$$

where h is the number of head flips, N is the total number of flips, and ⟨h⟩ is the mean of the head flips.
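As a sanity check on this formula, here is a short Python sketch (not from the slides) comparing the exact binomial coin-toss probabilities with the Gaussian approximation for N = 100 flips:

```python
import math

def f(h, N):
    """Gaussian approximation to the coin-toss distribution."""
    mean = N / 2  # <h> for a fair coin
    return math.sqrt(2 / (math.pi * N)) * math.exp(-2 * (h - mean) ** 2 / N)

def binomial(h, N):
    """Exact probability of h heads in N fair-coin flips."""
    return math.comb(N, h) * 0.5 ** N

# For N = 100 flips the Gaussian closely tracks the exact binomial result.
N = 100
for h in (40, 50, 60):
    print(h, round(binomial(h, N), 5), round(f(h, N), 5))
```

The agreement is already good at N = 100 and improves as N grows.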
Maxwell now assumed the velocity of the particles can be described by a similar Gaussian-like equation for the velocity vector.

Now, the probability of finding a gas molecule with an x-velocity between v1 and v2 is just found by integrating the function that describes the speed over those boundaries. (And we also have "A", a normalization constant.)

In the interest of time we will not go through the derivation, but the final Maxwell-Boltzmann distribution for molecular speeds is:

$$f(v)=4\pi\left(\frac{M}{2\pi RT}\right)^{3/2}v^2\,e^{-Mv^2/2RT}$$

(The figure shows the effect of the increased mass of nitrogen, compared to helium, on the speed distribution.)

There is an integration in the derivation, where the "a" in the integration rule is a = M/2RT, with M = MW = molecular weight.
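To make the nitrogen-versus-helium comparison concrete, a small Python sketch (T = 298 K is an assumed value) evaluating the speed distribution above:

```python
import math

R = 8.314  # gas constant, J mol^-1 K^-1

def mb_speed(v, M, T):
    """Maxwell-Boltzmann speed distribution; M in kg/mol, v in m/s."""
    a = M / (2 * R * T)
    return 4 * math.pi * (a / math.pi) ** 1.5 * v**2 * math.exp(-a * v**2)

def most_probable_speed(M, T):
    """Peak of the distribution, v_mp = sqrt(2RT/M)."""
    return math.sqrt(2 * R * T / M)

T = 298.0  # assumed room temperature
for name, M in [("He", 0.004), ("N2", 0.028)]:
    # crude normalization check: sum f(v)*dv from 0 to 10000 m/s
    total = sum(mb_speed(v, M, T) for v in range(0, 10000))
    print(name, round(most_probable_speed(M, T)), round(total, 3))
```

N2's larger molar mass gives a lower most-probable speed and a narrower, taller peak, as in the figure.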

We are probably more used to thinking of the Maxwell-Boltzmann relationship in terms


of energy, and not speeds. We also tend to think of this relationship in terms of energy
ratios, not a probability. So let’s do that.
We can convert the speed into the molar energy thus: the total energy of the gas molecule is its kinetic energy, so E_m = ½Mv². (Note: m, the mass of one molecule, is MW/N_A.)

The ratio of the number of particles in one level over the number of particles in another
level is proportional to the ratio of their probabilities.
Let P_a = probability of the "a" outcome and P_b = probability of the "b" outcome, and suppose we have N = 100 particles. Then the ratio of the particles in the "a" event state to the "b" event state is:

$$\left(\frac{P_a N}{P_b N}\right)=\left(\frac{P_a}{P_b}\right)$$

Reminder:

It is better to derive the Boltzmann distribution using statistical mechanics concepts – something we will do later in the course.

The Boltzmann distribution (note – in terms of velocities):

$$F(\text{state})\propto e^{-E/kT}$$

So let's take that ratio (note – now in terms of energies):

$$\frac{P_2}{P_1}=\frac{4\pi\left(\frac{M}{2\pi RT}\right)^{1/2}\left(\frac{E_{m,2}}{\pi RT}\right)e^{-E_{m,2}/RT}}{4\pi\left(\frac{M}{2\pi RT}\right)^{1/2}\left(\frac{E_{m,1}}{\pi RT}\right)e^{-E_{m,1}/RT}}=\frac{E_{m,2}}{E_{m,1}}\,e^{-(E_{m,2}-E_{m,1})/RT}$$

I'm not quite there, but at least you can see that the relationship of the number of molecules between two energy levels is an exponential function.
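The punchline, that level populations follow an exponential in the energy gap, can be sketched directly (the numbers here are illustrative, not from the slides):

```python
import math

k_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_ratio(E2, E1, T):
    """Population ratio N2/N1 for two single states separated in energy (J)."""
    return math.exp(-(E2 - E1) / (k_B * T))

# Two states split by exactly one k_B*T are populated in a ratio of e^-1.
print(boltzmann_ratio(k_B * 300, 0.0, 300))
```

Doubling the gap squares the ratio, which is the hallmark of exponential behavior.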
The Ideal Gas Law
We will derive the ideal gas law from F = ma. First, let's think of the particle in only one dimension, x.

The area of one wall is A (the y,z wall), so that the volume of the box is xA. We need to find the force/area to relate to pressure (remembering that the ideal gas law relates pressure to other variables). If a particle travels from one end of the box and back again, its change in velocity is −2v_x.

The F on the wall is the point of interest. F on the ball is positive; F on the wall is negative.

This fishy result comes from the change in the momentum: Δp = Δ(mv) = mΔv. The change in momentum when the ball hits the wall is Δp = p_f − p_i. The momentum is mv before it hits the wall and −mv after it hits the wall, so p_f − p_i = −mv − mv = −2mv. That is momentum. Now just consider the velocity – the change will likewise be −2v.
The Ideal Gas Law
To find a Δt element, we realize that when the particle underwent the −2v_x velocity change it also traveled 2a in distance (where a is the box length along x), so Δt = 2a/v_x. I.e., to figure out how long it took to get somewhere, just divide the distance you traveled by how fast you were going: 200 miles to Albuquerque / 75 mph = 2.67 hr.

Now plug this in for Δt:

$$F=\frac{m\,\Delta v}{\Delta t}=\frac{m(2v_x)}{2a/v_x}=\frac{mv_x^2}{a}$$

This is the force of one molecule. We get the average force of the entire sample by multiplying by the number of molecules:

$$F=\frac{Nm\langle v_x^2\rangle}{a}$$

(Typo in book – it should be just one N.)
The Ideal Gas Law
Up to now we have just been considering the x direction – what about the velocity in all dimensions?

$$\langle v^2\rangle=\langle v_x^2\rangle+\langle v_y^2\rangle+\langle v_z^2\rangle$$

In fact, because the velocity in all dimensions is the same, ⟨v_x²⟩ = ⟨v_y²⟩ = ⟨v_z²⟩, so we have ⟨v_x²⟩ = ⟨v²⟩/3. And we must accept this equation:

$$\langle v^2\rangle=\frac{3RT}{M}$$

So we have ⟨v_x²⟩ = RT/M.
The Ideal Gas Law

Now plug this back into our modified F = ma equation.

Note: N = number of molecules = n (number of moles) × N_A (Avogadro's number), and M (molecular weight) = N_A (Avogadro's number) × m (mass of one molecule). Then:

$$F=\frac{Nm\langle v_x^2\rangle}{a}=\frac{nN_Am\langle v_x^2\rangle}{a}=\frac{nM\langle v_x^2\rangle}{a}=\frac{nMRT}{Ma}=\frac{nRT}{a}$$
The Ideal Gas Law

Finally, we remember the definition of pressure as force divided by area, and we plug in for F, the force. With wall area A = b×c and box volume V = abc:

$$P=\frac{F}{A}=\frac{nRT/a}{bc}=\frac{nRT}{abc}=\frac{nRT}{V}$$

So from F = ma we arrive at PV = nRT. ✓
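The whole chain of substitutions can be checked numerically; in this sketch the box dimensions, gas, and temperature are assumed values for illustration:

```python
R = 8.314        # gas constant, J mol^-1 K^-1
N_A = 6.022e23   # Avogadro's number, mol^-1

# Assumed, illustrative system: 1 mol of N2 at 298 K in a 0.1 x 0.2 x 0.3 m box.
n, M, T = 1.0, 0.028, 298.0
a = 0.1          # box length along x, m
A = 0.2 * 0.3    # wall area b*c, m^2
V = a * A        # box volume, m^3

m = M / N_A                  # mass of one molecule
vx2 = R * T / M              # <v_x^2> = <v^2>/3 = RT/M
F = n * N_A * m * vx2 / a    # average force on the y,z wall
P_kinetic = F / A            # pressure from the kinetic derivation
P_ideal = n * R * T / V      # pressure from PV = nRT
print(P_kinetic, P_ideal)    # the two agree
```

By construction the kinetic-theory pressure reproduces nRT/V, which is the point of the derivation.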
Chapter 2, an “Intro” to the Quantum world,
so we can begin to understand the Statistical
Mechanics world

We have all heard that matter exists in a quantized world if the system or "domain" is small enough – i.e., atoms, photons, etc.

But what are the criteria for observing quantized behavior, and what are the parameters quantized?

The de Broglie wavelength


helps to define when
quantized behavior is
observed or not.

de Broglie wavelength

And since p = mv, another version of the de Broglie wavelength is:

$$\lambda_{dB}=\frac{h}{p}$$
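A quick numerical comparison (the particle speeds are assumed, illustrative values) shows why atoms behave quantum mechanically while baseballs do not:

```python
def de_broglie(m, v, h=6.626e-34):
    """lambda_dB = h / p with p = m*v (SI units)."""
    return h / (m * v)

# An electron moving at 1e6 m/s: wavelength comparable to atomic sizes.
print(de_broglie(9.109e-31, 1e6))   # ~7e-10 m -> quantized behavior
# A 0.145 kg baseball at 40 m/s: wavelength immeasurably small.
print(de_broglie(0.145, 40.0))      # ~1e-34 m -> classical behavior
```

The ~24 orders of magnitude between the two wavelengths is the whole story of the next few slides.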
Towards the classical continuous energy world

We use quantum mechanics when λ_dB is comparable to the size of the domain, but classical mechanics whenever λ_dB is much smaller than the domain. At very high energies there are so many available microstates that we can't distinguish individual quantum states in the laboratory.

Another way to say this is the Correspondence Principle, which states: quantum mechanics approaches classical mechanics as we increase the particle's kinetic energy, mass, or domain.
Energy levels available to a particle in the 3-D “particle in a box”
Domain: think of the size of distances inside an atom, versus distances when you throw a ball.

Note—

If m is very small (like a particle, or atom) then λ_dB is very large – and we observe quantized behavior. The wavelength is on the order of the size of the domain.

If m is very large (like a baseball) then λ_dB is very small – and we observe "continuous" energy behavior. The wavelength is so small we cannot observe it with our coarse abilities.

If v is very large – i.e., a very fast-moving particle – then λ_dB is very small, and we observe "continuous" energy behavior. Think of this as the particle having lots of energy; it will therefore be in the "continuous" energy regime in the plot on the previous slide.

If v is very small – i.e., a slowly moving particle – then λ_dB is very large, and we observe quantized energy behavior. Think of this as the particle having very little energy; it will therefore be in the quantized energy regime in the plot on the previous slide.
The Particle in a Three-Dimensional Box

Energy is quantized.

The particle in a three-dimensional box – the system consists of a small particle (an atom or molecule, for example) trapped in a rectangular box, free to move anywhere along x, y, or z as long as it stays inside the box. The box has sides of length a along x, b along y, and c along z. These dimensions of the box define the domain of the system, the distance over which the particle can travel. The possible energy values for the particle are then (QM Eqs. 2.40 to 2.42):

$$E_{n_xn_yn_z}=\frac{h^2}{8m}\left(\frac{n_x^2}{a^2}+\frac{n_y^2}{b^2}+\frac{n_z^2}{c^2}\right)\approx\frac{h^2}{8mV^{2/3}}\,n^2=\varepsilon_0 n^2$$

(ℏ = h/2π; see next slide.) You can only do this if abc ≈ a³ = V; then a = V^{1/3} and a² = V^{2/3}.

m is the particle mass, V is the volume of the box, and n_x, n_y, and n_z are quantum numbers restricted to integer values greater than 0. We assume the box is nearly a cube, so a, b, and c can be treated as roughly equal. The constants are combined into a single parameter ε₀ and n_x² + n_y² + n_z² into a single term n². This ε₀ is DIFFERENT from the ε on the next slide.

The state with the lowest possible energy is always called the ground state,
corresponding to the quantum numbers (nx,ny,nz) = (1,1, 1). Any other state
is an excited state.
The Particle in a Three-Dimensional Box
The degeneracy of particle "1" in a 3D particle in the box: a particle can take on various values of the n_x, n_y, n_z quantum numbers yet have the same energy – the states are therefore degenerate.

We know the energy levels for a 3D particle in the box are:

$$\varepsilon_{n_xn_yn_z}=\frac{h^2}{8mV^{2/3}}\,n^2$$

Then particles with the same n² = n_x² + n_y² + n_z² value will have the same energy – they will be degenerate.

Example: let a particle have n_x, n_y, n_z values of (3,3,3) or (5,1,1). For all of these possible combinations n² = 27:

(n_x², n_y², n_z²) = (9, 9, 9), (25, 1, 1), (1, 25, 1), (1, 1, 25)

Therefore the energy degeneracy for the particle is g = 4: there are 4 ways to make n² = 27. As you go up in n values the degeneracy becomes very large and approaches the general equation above. The equation above is really a representation of a degeneracy density.
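The degeneracy count can be verified by brute force; a short Python sketch enumerating the quantum-number triples:

```python
from itertools import product

def degeneracy(n_squared, n_max=20):
    """Count (nx, ny, nz) triples of positive integers with
    nx^2 + ny^2 + nz^2 == n_squared."""
    return sum(1 for nx, ny, nz in product(range(1, n_max + 1), repeat=3)
               if nx * nx + ny * ny + nz * nz == n_squared)

print(degeneracy(27))  # 4: (3,3,3), (5,1,1), (1,5,1), (1,1,5)
```

The same function shows how quickly g grows with n², which is the "degeneracy density" idea from the slide.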
Quantum States of Atoms

Most simple case: a 1-electron atom – i.e., 1 e⁻ and 1–2 protons. The energies of the electron in a one-electron atom, such as H or He⁺, are given by:

(Note: the particle in the box and the 1-electron atom are NOT the same thing – the energy dependence on n is very different, n² versus 1/n².)

The energy depends on a single quantum number,


n, which may take on only integer values greater
than zero.
The quantum state of the electron in the atom also depends
on the quantum numbers l (which determines the overall
angular momentum in the electron's orbital motion), ml
(which determines the orientation of that orbital motion),
and ms (which determines the orientation of the spin
angular momentum).
When we add more electrons to the atom, the energy
ceases to obey a simple
algebraic expression.
Permittivity of free space: ε0 = 8.854 187 817... × 10−12 F·m−1
Molecular Structure - or, how molecules behave

When we transfer energy into a molecule, there are now four


major forms of motion, or degrees of freedom, in which the
energy of the molecule can be stored:

• 1. electronic (motion of the electrons relative to the nuclei)

• 2. vibrational (motion of the nuclei relative to each other)

• 3. rotational (motion of the nuclei around the molecular center


of mass)

• 4. translational (motion of the molecular center of mass)


Electronic States
The quantum states of the electrons in molecules do not have the simple
form found for the one-electron atom.

We can measure the relative energies of those states in the laboratory by


spectroscopy (uv/vis). Excite the molecule from its ground electronic
state into one of its lowest excited electronic states by applying
electromagnetic radiation. Each photon has an energy given by Planck's law (he based this upon empirical observations):

E = hν = hc/λ

(We've been using this already.)

where ν is the frequency of the radiation, λ its wavelength, and c the speed of light, equal to 2.998 × 10⁸ m s⁻¹. The energies needed to excite a molecule into a higher electronic state typically correspond to ultraviolet or visible radiation.
Vibrational States
If we take away the electronic (i.e., electrons) contribution to the energy, the
remaining contributions come from motions of the nuclei (i.e., protons and
neutrons).
If there are N atoms in the molecule, then there are 3N degrees of freedom (each
nucleus can move along the x, y, and z axes).
We can assign three of those degrees of freedom to the motion of the center of mass. These are translational degrees of freedom. (For two bodies, the internal motion is described by the reduced mass μ.)

For a linear molecule, we then need two angles to specify


the orientation, and for a non-linear molecule we need
three angles. These are rotational degrees of freedom.
Everything else: That leaves 3N – 6 different vibrational
coordinates for non-linear molecules and 3N – 5 for linear
molecules.
$$\frac{1}{\mu}=\sum_{i=1}^{n}\frac{1}{x_i}=\frac{1}{x_1}+\frac{1}{x_2}+\frac{1}{x_3}+\cdots+\frac{1}{x_n}$$

Reduced mass, general case, where x_i is the mass of the ith atom.
Vibrational States Cont’d

For each vibrational coordinate (i.e., 3N−6 or 3N−5) we define a vibrational constant ω_e (J) that depends on the masses of the atoms moving and the rigidity of the chemical bond. ω_e decreases as the reduced mass μ of the moving atoms goes up, and increases with the force constant k (which measures the rigidity of the bond).
The vibrational motion is
modeled after a spring

The quantum states for vibrational motion then have vibrational energies (approximately) equal to

E_v ≈ ω_e (v + ½)

(To get this you solve the Schrödinger equation for the harmonic oscillator.) Here the vibrational quantum number v can be any integer 0 or greater. The transition energy from the ground to the lowest excited vibrational state is roughly one vibrational constant ω_e and corresponds to the photon energy of infrared radiation. Therefore, most vibrational spectroscopy is carried out at infrared wavelengths. Note the lowest vibrational energy IS NOT zero.
Rotational States

Non-linear molecules have three rotational coordinates, and linear molecules


have two.

For a linear molecule, the rotational energies are given by

E_J = B J(J+1)

B is the rotational constant of the molecule (it results from the quantum mechanical description), and the rotational quantum number J can be any integer 0 or greater. The value of the rotational constant decreases as the molecule gets larger and more massive. (Think about this in terms of the rotational energy spacing for a heavier molecule versus a lighter molecule.) Note the lowest rotational energy state CAN BE zero.

The degeneracy of the rotational states corresponds to different orientations of


the rotational motion, and for linear molecules g = 2J + 1. (non-linear is also
2J+1, but there is a different “B” value for each contributing rotation)
Bulk Properties – some definitions

• the system (or sample), which is the thing we want to study;


• the surroundings, which consist of everything outside the system;
• the boundary, which separates the system and surroundings, and
• the universe, which is everything: the system, the surroundings, and
the boundary in between.

Extensive and Intensive Parameters

Handy teaching device – the "box in a box": is the value different between the big box and the little box?

Extensive parameters (obtained by summing together the contributions from all the molecules in the system):
• N, the total number of molecules in the system, or alternatively n = N/N_A, the total number of moles. (N different between the boxes? Yes)
• V, the total space occupied by the system. (V? Yes)
• E, the sum of the translational, rotational, vibrational, and electronic energies of the system, measured relative to the system's ground state. (E? Yes) (Called U in some texts. BUT, not to be confused with V, the potential energy.)

Intensive parameters (obtained by averaging the contributions from the molecules):
• P, the pressure, which is the average force per unit area exerted by the molecules on their surroundings. (P? No)
• ρ, the number density, which is the average number of molecules per unit volume, N/V, and which is related to the mass density ρ_m by ρ_m = ρM/N_A, where M is the molar mass and N_A is Avogadro's number. (ρ? No)
• T, the temperature. (T? No) Temperature is the weirdo – it is not an "average" of anything.
Principal Assumptions (for statistical mechanics)

• Chemically identical molecules share the same physics


• Macroscopic variables are continuous variables (not quantized!)
• Measured properties reflect the ensemble average
Consider the figure representing all the possible states of a small system. (The four columns of the figure correspond to the four translational states described below.)

Three Fluorine atoms are trapped in a space with only


enough room for 4 atoms, and the total energy is
sufficient for only 1 atom to be in an excited
electronic state (shown by the dark blue ball). The
other 2 atoms are in their ground states.
All atoms are in their ground translational states (that
means they can’t move around in the box).

The translational states are quantized in a box this


small, so there are only 4 translational wavefunctions
(physical placements of the atoms in the box)
available to the 3 atoms, corresponding to the
4 columns.
For each of these 4 translational states, the energy of the excited electronic state may be added to any of the 3 atoms, resulting in 3 rows for each column and a total of 12 ensemble states.

These are 12 distinct microstates, each with unique


microscopic properties, but they yield identical
values for the macroscopic variables E (the total
energy of the system), V (the volume), and N (the
total number of particles).

What is the average pressure for the system on the


top wall only of the container?

If each atom exerts a pressure p₀ on the walls next to it, the average pressure against the top wall in the diagram is 1.5p₀. That's because half of the ensemble states have one molecule near the top wall (exerting a pressure of p₀) while the other half of the states put two molecules against the top wall (with a pressure of 2p₀), and the average over the microstates is 1.5p₀. We have reported a "macroscopic" property – i.e., an ensemble average.

With W = 12 and k_B = 1.38 × 10⁻²³ m² kg s⁻² K⁻¹:

S = k_B ln Ω = (1.38 × 10⁻²³ m² kg s⁻² K⁻¹) ln(12) = 3.43 × 10⁻²³ J/K
The Boltzmann Entropy (statistical mechanics)
We define the Boltzmann entropy S of the system to be proportional to the logarithm of
the total number of microstates W of the system:

So for the previous example, what is W? 12.

The Boltzmann entropy is the rigorous definition for the entropy, working under any
circumstances.
These circumstances are dictated by the conditions of the microcanonical ensemble: fixed
values of E, V, and N . In this ensemble, W is the total number of microstates that have the
same energy; so under these conditions the ensemble size W is a new name for the
degeneracy of states (g in Section 2.1) of our N -particle system.
However, we use W to represent the total number of microstates in the ensemble
(effectively the degeneracy of the entire system), whereas we will continue to use g for the
degeneracy of quantum states for some particular individual particle.
The entropy measures one property of the system, straightforward in concept:

The entropy S counts the total number of distinct microstates


possible for the system. That's all it does.
It turns out there actually is a formal definition of temperature in terms of energy, even
though there may be no “break down” of C or K degrees into further units.
We introduce the definition of the temperature (Eqtn 2.24):

$$T=\left(\frac{\partial E}{\partial S}\right)_{V,N}$$
Now we can manipulate a few terms relating W and S. Plugging in for S:

$$T=\left(\frac{\partial E}{\partial S}\right)_{V,N}\;\Rightarrow\;\frac{1}{T}=\left(\frac{\partial S}{\partial E}\right)_{V,N}\;\Rightarrow\;\frac{1}{T}=\left(\frac{\partial (k_B\ln\Omega)}{\partial E}\right)_{V,N}\;\Rightarrow\;\frac{1}{k_BT}=\left(\frac{\partial \ln\Omega}{\partial E}\right)_{V,N}$$

The change in the natural log of W with respect to energy is equal to the inverse of k_BT. This may be useful later.
The total number of potential microstates W can be thought of in
another manner.
Suppose you have 6 atoms, frozen in space so you can distinguish them individually. The
number of potential ways these 6 atoms can be written with the labels 1 through 6 is N!,
where N=6. In other words: 6! = 720.

However, suppose we have 4 different electronic quantum states. Suppose the number of atoms in the 1st quantum state is N₁, in the second N₂, in the third N₃, and in the fourth N₄. Let 3 atoms be in state 1, thus N₁ = 3. Let 2 atoms be in state 2, thus N₂ = 2. Let 1 atom be in state 3, thus N₃ = 1. Finally, let 0 atoms be in state 4, thus N₄ = 0. But we still have our labels, 1,2,3,4,5,6. Now let's place our atoms.

These are all equivalent in terms of total energy, because the atoms involved are still in their same quantum states even though we have switched the ordering a bit:

Atoms in state 1    Atoms in state 2    Atoms in state 3    Atoms in state 4
123                 45                  6                   empty
132                 45                  6                   empty
123                 54                  6                   empty

But these are not equivalent, because atoms have been switched to different quantum states:

162                 45                  3                   empty
461                 32                  5                   empty

These indistinguishable states do not actually contribute to our W (because we will never know they are there), and we need to get rid of them somehow. We do this by dividing the "raw" W number by the factorial of the number of particles in each state. This is a "new" definition of W:

$$\Omega=\frac{N!}{N_1!\,N_2!\,\cdots\,N_k!}=\frac{N!}{\prod_{i=1}^{k}N_i!}$$

For our particular example:

$$\Omega=\frac{6!}{(3!)(2!)(1!)(0!)}=60$$
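This count is easy to verify in Python:

```python
from math import factorial

def omega(counts):
    """Omega = N! / (N1! N2! ... Nk!) for a list of occupation numbers."""
    N = sum(counts)
    w = factorial(N)
    for Ni in counts:
        w //= factorial(Ni)
    return w

print(omega([3, 2, 1, 0]))  # 60, matching the 6-atom example
```

The all-in-one-state case gives Ω = 1, the fully spread-out case the maximum, which previews the entropy discussion.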
The Partition Function
***The partition function can be said to accomplish one task: to obtain an effective count of the states available to the system, weighing in the likelihood of each possible state.****
This would be our "W".

The single-particle partition function q appears in P(ε), the probability that any given molecule has energy ε; the canonical ensemble partition function Q appears in P(i), the probability that any given ensemble is in the state i. The partition function normalizes P(ε) or P(i), ensuring that if we examine all possible energy levels, we have a 100% chance of finding the system at one of those energies or states.
A rougher, qualitative way to think of the partition function is this: Q (or q) counts the number
of states that the system can easily access. Available – because you must have enough energy
to populate the W states. Some of these states will not be realistically available because of
energy limitations.
Trivia of the day: a canonical form of a mathematical object is a standard way of
presenting that object as a mathematical expression. So, we have defined the
canonical ensemble partition function in the previous slide.
Example Page 85 in Thermo text

You discover a molecular system having the energy levels and degeneracies

Evaluate the partition function, and calculate P(ε) for each of the four lowest energy levels.
See next slide for explanation

There are 2 particles, so they are equally dividing the available energy between them.
I blatantly snagged this
from the web, using the
search term
“probability of multiple
events”

This is similar to our idea that the degeneracy (energy degeneracy) of two particles in the
same “particle in the box” is equal to the degeneracy for one particle, squared. I.e., g1 is the
degeneracy of the first particle (the number of possible states for that first particle). The
probability of two particles in the g1 state would then just be g1 X g1 = g12. The “assumption”
we are making is that the second particle will also have available to it g1 degenerate states (in
reality it is g1-1). This is not too far off as long as g1 is a “large” number.
Strategy: This problem only deals
with relative (rather than absolute)
populations among the possible
states, so we don’t need to
calculate the partition function (i.e.
they drop out when you take the
ratio)

(Read the energies directly from the graph; q(T) is the partition function.)
We recall from some distant bad memory of general chemistry: g = 2J + 1.

state     g    E (cm⁻¹)   E/k_B (K)   e^(−E/k_BT)   g·e^(−E/k_BT)
3P J=0    1    0          0           1.00          1.00
3P J=1    3    49         71          0.87          2.6
3P J=2    5    131        188         0.69          3.4
1D J=2    5    15300      22000       0.00          0.0

Here e^(−E/k_BT) gives the relative population per quantum state, and g·e^(−E/k_BT) gives the relative population per energy level.

a. The energy level that has the highest population overall is 3P J=2.
b. The energy level that has the highest population in each of its individual quantum states is 3P J=0.
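The table columns can be reproduced with a few lines of Python, using the wavenumber form of k_B and the temperature of 500 K assumed in the worked conversion that follows:

```python
import math

k_B_cm = 0.697  # Boltzmann constant in cm^-1 K^-1 (wavenumber form)

def populations(levels, T):
    """Per-state and per-level relative populations for (g, E_cm) levels."""
    out = []
    for g, E in levels:
        per_state = math.exp(-E / (k_B_cm * T))  # e^(-E/kT)
        out.append((per_state, g * per_state))   # and g * e^(-E/kT)
    return out

# (g, E in cm^-1) for the four levels in the table, at T = 500 K
levels = [(1, 0), (3, 49), (5, 131), (5, 15300)]
for per_state, per_level in populations(levels, 500.0):
    print(round(per_state, 2), round(per_level, 1))
```

Working in cm⁻¹ throughout avoids the joule conversion done by hand on the next slide; both routes give the same Boltzmann factors.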
For the E = 49 cm⁻¹ case (with k_B = 1.38 × 10⁻²³ J K⁻¹):

Wave number: energy given in terms of cm⁻¹ is a wavenumber form of energy; you must convert this to joules. With ω_e = ν̃ = 1/λ:

$$\lambda=\frac{1}{\tilde{\nu}}=\frac{1}{49\ \mathrm{cm}^{-1}}=0.02\ \mathrm{cm}\times\frac{1\ \mathrm{m}}{100\ \mathrm{cm}}=2\times10^{-4}\ \mathrm{m}$$

$$E=h\nu=h\frac{c}{\lambda}=(6.62\times10^{-34}\ \mathrm{J\,s})\,\frac{3\times10^{8}\ \mathrm{m\,s}^{-1}}{2\times10^{-4}\ \mathrm{m}}=9.93\times10^{-22}\ \mathrm{J}$$

$$e^{-E/k_BT}=e^{-9.93\times10^{-22}\ \mathrm{J}/(1.38\times10^{-23}\ \mathrm{J\,K}^{-1}\times 500\ \mathrm{K})}=e^{-0.144}=0.87$$

$$g\,e^{-E/k_BT}=3(0.87)=2.60$$

Recall: the number of molecules at a particular energy varies as g(E)e^(−E/k_BT). If we start near absolute zero, only the ground state (J = 0, where E = 0) will be substantially populated, so the ratio P(J=0)/P(J=1) will be greater than 1. The degeneracy of the CO rotational levels, g_rot = 2J + 1, rises as the energy increases. Therefore, at high temperatures, where e^(−E/k_BT) ≈ e⁰ = 1, the ratio P(J=0)/P(J=1) will be approximately g_rot(J=0)/g_rot(J=1) = 1/3:

$$\frac{P(J=0)}{P(J=1)}=\frac{g_0\,e^{-E/k_BT}}{g_1\,e^{-E/k_BT}}\approx\frac{g_0\,e^{0}}{g_1\,e^{0}}=\frac{1}{3}$$

Because all molecular rotational constants are small compared to typical thermal energies k_BT, we're usually in the high-temperature limit, and the ratio stays close to 1/3 until we drop the temperature a lot. Therefore, we only need to compare values for the canonical distribution at two points, such as J=0 and J=1.
But here they are asking for a SPECIFIC ratio. Given that the probability of finding a particle in the rotational microstate J is (using the wavenumber form of the Boltzmann constant, k_B = 0.697 cm⁻¹ K⁻¹, and B = 1.93 cm⁻¹):

$$P(J)=\frac{(2J+1)\,e^{-BJ(J+1)/k_BT}}{q(T)}$$

$$P(J=0)=\frac{(1)e^{0}}{q(T)}\qquad P(J=1)=\frac{(3)\,e^{-(1.93\ \mathrm{cm}^{-1}\cdot 2)/(0.697\ \mathrm{cm}^{-1}\mathrm{K}^{-1}\cdot T)}}{q(T)}=\frac{3e^{-5.54/T}}{q(T)}$$

Set the ratio equal to 1/2 and solve for T:

$$\frac{P(J=0)}{P(J=1)}=\frac{e^{0}\,q(T)}{3e^{-5.54/T}\,q(T)}=\frac{1}{3e^{-5.54/T}}=\frac{1}{2}$$

$$e^{-5.54/T}=\frac{2}{3}\;\Rightarrow\;-\frac{5.54}{T}=\ln\!\left(\frac{2}{3}\right)\;\Rightarrow\;T=\frac{-5.54}{\ln(2/3)}=13.7\ \mathrm{K}$$
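The final solve can be confirmed numerically (q(T) cancels in the ratio, so it never needs to be evaluated):

```python
import math

B = 1.93        # CO rotational constant from the slide, cm^-1
k_B_cm = 0.697  # Boltzmann constant in wavenumber form, cm^-1 K^-1

def ratio_J0_J1(T):
    """P(J=0)/P(J=1) for a linear rotor; q(T) cancels in the ratio."""
    return 1.0 / (3.0 * math.exp(-2 * B / (k_B_cm * T)))

# Solve ratio = 1/2 analytically: T = -(2B/k_B) / ln(2/3)
T = -(2 * B / k_B_cm) / math.log(2 / 3)
print(round(T, 1), round(ratio_J0_J1(T), 3))  # 13.7 0.5
```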
