Quantum Chemistry II: Math Introduction
Albeiro Restrepo
May 27, 2009
Contents
1 Operator matrix representation
1 Operator matrix representation
Operating with O on one of the vectors of {êi} will result in another vector of V, say ~vi, which in turn can be written as a linear combination of the vectors of {êi} (including êi itself):
\[
O\,\hat{e}_i = \vec{v}_i = \sum_{j=1}^{n} O_{ji}\,\hat{e}_j \qquad (1)
\]
where the Oji, the n expansion coefficients, are numbers (real or complex); the label j is the summation index, i.e., it runs over the contribution from each individual basis vector to ~vi, while i labels the vector that was acted upon with O. We now tackle the issue of the nature of the Oji. Let's take the scalar product of equation 1 with any other basis vector (êk, for instance):
\[
\hat{e}_k \cdot (O\hat{e}_i) = \hat{e}_k \cdot \vec{v}_i = \hat{e}_k \cdot \sum_{j=1}^{n} O_{ji}\,\hat{e}_j = \sum_{j=1}^{n} O_{ji}\,\hat{e}_k \cdot \hat{e}_j = \sum_{j=1}^{n} O_{ji}\,\delta_{kj} = O_{ki} \qquad (2)
\]
where I used the orthonormalization condition of the basis vectors, êk · êj = δkj, and the fact that out of the n terms in the sum only the one for which j = k survives the action of Kronecker's delta; all the others vanish. Equation 2 tells us that the expansion coefficients are not arbitrary: they are fixed by the choice of basis set. For example, in the basis {|α⟩}, the expansion coefficients are given by êα · Oêβ = Ωαβ, which are clearly different numbers from the Oji. In Dirac's notation we have:
\[
O|i\rangle = |v_i\rangle = \sum_{j=1}^{n} O_{ji}\,|j\rangle
\]
\[
\langle k|O|i\rangle = \langle k|\Bigl(\sum_{j=1}^{n} O_{ji}\,|j\rangle\Bigr) = \sum_{j=1}^{n} O_{ji}\,\langle k|j\rangle = \sum_{j=1}^{n} O_{ji}\,\delta_{kj} = O_{ki}
\;\Longrightarrow\; O_{ki} = \langle k|O|i\rangle
\]
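As a quick worked example (mine, not from the notes), take n = 2 and let O be the counterclockwise rotation by 90° in the plane, so that Oê1 = ê2 and Oê2 = −ê1; the four expansion coefficients then follow directly from Oki = êk · Oêi:

\[
O_{11} = \hat{e}_1\cdot\hat{e}_2 = 0, \qquad
O_{21} = \hat{e}_2\cdot\hat{e}_2 = 1, \qquad
O_{12} = -\hat{e}_1\cdot\hat{e}_1 = -1, \qquad
O_{22} = -\hat{e}_2\cdot\hat{e}_1 = 0 .
\]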
For a set containing n elements there are a total of n × n = n² possible binary combinations; thus we have a total of n² expansion coefficients obtained from Oki = ⟨k|O|i⟩ = êk · Oêi. If we write them in a square array and label rows and columns, we have O, the matrix representation of the operator O in the basis {|i⟩} = {êi}:
\[
\mathbf{O} =
\begin{pmatrix}
O_{11} & O_{12} & \cdots & O_{1n}\\
O_{21} & O_{22} & \cdots & O_{2n}\\
\vdots & \vdots & \ddots & \vdots\\
O_{n1} & O_{n2} & \cdots & O_{nn}
\end{pmatrix}
\]
\[
\mathbf{O} =
\begin{pmatrix}
\hat{e}_1 \cdot O\hat{e}_1 & \hat{e}_1 \cdot O\hat{e}_2 & \cdots & \hat{e}_1 \cdot O\hat{e}_n\\
\hat{e}_2 \cdot O\hat{e}_1 & \hat{e}_2 \cdot O\hat{e}_2 & \cdots & \hat{e}_2 \cdot O\hat{e}_n\\
\vdots & \vdots & \ddots & \vdots\\
\hat{e}_n \cdot O\hat{e}_1 & \hat{e}_n \cdot O\hat{e}_2 & \cdots & \hat{e}_n \cdot O\hat{e}_n
\end{pmatrix}
=
\begin{pmatrix}
\langle 1|O|1\rangle & \langle 1|O|2\rangle & \cdots & \langle 1|O|n\rangle\\
\langle 2|O|1\rangle & \langle 2|O|2\rangle & \cdots & \langle 2|O|n\rangle\\
\vdots & \vdots & \ddots & \vdots\\
\langle n|O|1\rangle & \langle n|O|2\rangle & \cdots & \langle n|O|n\rangle
\end{pmatrix}
\]
Similarly, in the basis {|α⟩} the same operator is represented by the matrix Ω:

\[
\mathbf{\Omega} =
\begin{pmatrix}
\Omega_{\alpha\alpha} & \Omega_{\alpha\beta} & \cdots & \Omega_{\alpha\eta}\\
\Omega_{\beta\alpha} & \Omega_{\beta\beta} & \cdots & \Omega_{\beta\eta}\\
\vdots & \vdots & \ddots & \vdots\\
\Omega_{\eta\alpha} & \Omega_{\eta\beta} & \cdots & \Omega_{\eta\eta}
\end{pmatrix}
\]
\[
\mathbf{\Omega} =
\begin{pmatrix}
\hat{e}_\alpha \cdot O\hat{e}_\alpha & \hat{e}_\alpha \cdot O\hat{e}_\beta & \cdots & \hat{e}_\alpha \cdot O\hat{e}_\eta\\
\hat{e}_\beta \cdot O\hat{e}_\alpha & \hat{e}_\beta \cdot O\hat{e}_\beta & \cdots & \hat{e}_\beta \cdot O\hat{e}_\eta\\
\vdots & \vdots & \ddots & \vdots\\
\hat{e}_\eta \cdot O\hat{e}_\alpha & \hat{e}_\eta \cdot O\hat{e}_\beta & \cdots & \hat{e}_\eta \cdot O\hat{e}_\eta
\end{pmatrix}
=
\begin{pmatrix}
\langle\alpha|O|\alpha\rangle & \langle\alpha|O|\beta\rangle & \cdots & \langle\alpha|O|\eta\rangle\\
\langle\beta|O|\alpha\rangle & \langle\beta|O|\beta\rangle & \cdots & \langle\beta|O|\eta\rangle\\
\vdots & \vdots & \ddots & \vdots\\
\langle\eta|O|\alpha\rangle & \langle\eta|O|\beta\rangle & \cdots & \langle\eta|O|\eta\rangle
\end{pmatrix}
\]
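To make the construction concrete, here is a minimal NumPy sketch (not part of the original notes) that builds the matrix representation of one fixed linear operator on a three-dimensional real space, first in the standard basis {êi} and then in a second orthonormal basis obtained by a rotation; the specific operator and the 30° rotation are arbitrary illustrative assumptions. The two arrays contain different numbers, Oki versus Ωαβ, yet represent the same operator.

\begin{verbatim}
import numpy as np

# A fixed linear operator on a 3-dimensional real space, given here by the
# array that implements its action x -> O(x); the entries are arbitrary,
# purely illustrative choices.
M = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])

def O_action(x):
    return M @ x

# Standard orthonormal basis {e_i}: the columns of the identity matrix.
E = np.eye(3)

# A second orthonormal basis {e_alpha}: the standard basis rotated by 30
# degrees about the third axis (any orthogonal matrix would do).
t = np.radians(30.0)
A = np.array([[np.cos(t), -np.sin(t), 0.0],
              [np.sin(t),  np.cos(t), 0.0],
              [0.0,        0.0,       1.0]])

def matrix_rep(action, basis):
    """O_ki = e_k . (O e_i), with the basis vectors given as columns."""
    n = basis.shape[1]
    rep = np.empty((n, n))
    for i in range(n):
        v = action(basis[:, i])          # the operator acting on basis vector i
        for k in range(n):
            rep[k, i] = basis[:, k] @ v  # project back onto basis vector k
    return rep

O_mat     = matrix_rep(O_action, E)      # representation in {e_i}: recovers M
Omega_mat = matrix_rep(O_action, A)      # representation in {e_alpha}
print(O_mat)
print(Omega_mat)                         # same operator, different numbers
\end{verbatim}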
2 Matrix algebra ↔ operator algebra isomorphism

Let ~a be an arbitrary vector of V; operating upon ~a with O produces ~b, another vector in V. Both ~a and ~b can be written as linear combinations of the n vectors of some basis {|i⟩} = {êi} for V:
\[
O\vec{a} = \vec{b}, \qquad \vec{a} = \sum_{i=1}^{n} a_i\,\hat{e}_i, \qquad \vec{b} = \sum_{j=1}^{n} b_j\,\hat{e}_j
\]
\[
O|a\rangle = |b\rangle, \qquad |a\rangle = \sum_{i=1}^{n} a_i\,|i\rangle, \qquad |b\rangle = \sum_{j=1}^{n} b_j\,|j\rangle
\]
Let’s explore the relationship between the sets of coefficients {ai} and {bj}:
\[
\langle k|b\rangle = \langle k|O|a\rangle = \sum_{j=1}^{n} b_j\,\langle k|j\rangle = \sum_{j=1}^{n} b_j\,\delta_{kj} = b_k
\]
\[
\langle k|O|a\rangle = \sum_{i=1}^{n} a_i\,\langle k|O|i\rangle = \sum_{i=1}^{n} a_i\,O_{ki} = \sum_{i=1}^{n} O_{ki}\,a_i
\;\Longrightarrow\; \sum_{i=1}^{n} O_{ki}\,a_i = b_k
\]
which is exactly the definition of the matrix product between O, the n × n matrix representation of O, and a, the n × 1 matrix representation of ~a, to produce b, the n × 1 matrix representation of ~b; all matrix representations are taken in the {|i⟩} = {êi} basis. We started by operating with O on any vector of V to produce another vector of V, then concluded that matrix multiplication of the matrix representations of the involved vectors and operator yields exactly the same result, hence explicitly showing the isomorphism between operator algebra and the algebra of matrix representations:

\[
O\vec{a} = \vec{b} \;\Longleftrightarrow\; \mathbf{O}\,\mathbf{a} = \mathbf{b}
\]
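The following short NumPy check, again with an arbitrarily chosen operator and vector (my illustrative assumptions, not an example from the notes), shows the two sides of the isomorphism giving the same coefficients: acting with the operator on ~a directly, and multiplying the matrix representation O by the coefficient column a.

\begin{verbatim}
import numpy as np

# Illustrative operator O on a 3-dimensional space and an arbitrary vector a.
M = np.array([[2.0, 1.0, 0.0],
              [0.0, 3.0, 1.0],
              [1.0, 0.0, 1.0]])
a_vec = np.array([1.0, -2.0, 0.5])

# Operator algebra: act on the vector directly, b = O a.
b_vec = M @ a_vec

# Matrix algebra: build the matrix representation O_ki = e_k . (O e_i) and the
# coefficient column a_i = e_i . a in the standard orthonormal basis, then
# form b_k = sum_i O_ki a_i as an ordinary matrix product.
E = np.eye(3)
O_rep = np.array([[E[:, k] @ (M @ E[:, i]) for i in range(3)] for k in range(3)])
a_col = np.array([E[:, i] @ a_vec for i in range(3)])
b_col = O_rep @ a_col

print(np.allclose(b_vec, b_col))   # True: both routes give the same coefficients
\end{verbatim}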
\[
|b\rangle = \sum_{j=1}^{n} b_j\,|j\rangle \;\Longrightarrow\; \langle i|b\rangle = \sum_{j=1}^{n} b_j\,\langle i|j\rangle = \sum_{j=1}^{n} b_j\,\delta_{ij} = b_i
\]
\[
\langle b| = \sum_{j=1}^{n} b_j^{*}\,\langle j| \;\Longrightarrow\; \langle b|i\rangle = \sum_{j=1}^{n} b_j^{*}\,\langle j|i\rangle = \sum_{j=1}^{n} b_j^{*}\,\delta_{ji} = b_i^{*}
\]
Since ⟨i|b⟩ = bi and ⟨b|i⟩ = bi*, we can say that ⟨i|b⟩ = ⟨b|i⟩*, or, more generally,

\[
\langle a|O|b\rangle = \langle a|c\rangle = \langle c|a\rangle^{*} = \left\langle b\left|O^{\dagger}\right|a\right\rangle^{*} = \langle b|O|a\rangle^{*} \qquad (5)
\]

where |c⟩ ≡ O|b⟩ and the last equality holds when O is Hermitian, i.e., O† = O.
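A minimal numerical sketch of equation 5, using randomly generated complex kets and an operator of my own choosing (none of this is from the notes): the equality ⟨a|O|b⟩ = ⟨b|O†|a⟩* holds for any operator, while the final equality of equation 5 is checked with a Hermitian operator.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)

# Random complex kets and a random complex operator (illustrative only).
a = rng.normal(size=3) + 1j * rng.normal(size=3)
b = rng.normal(size=3) + 1j * rng.normal(size=3)
O = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))

def bra(v):
    return v.conj()                          # <v| as the conjugate of |v>

lhs = bra(a) @ (O @ b)                       # <a|O|b>
rhs = (bra(b) @ (O.conj().T @ a)).conj()     # <b|O^dagger|a>*
print(np.isclose(lhs, rhs))                  # True for any operator O

# The last step of equation 5, <a|O|b> = <b|O|a>*, needs a Hermitian operator:
H = O + O.conj().T                           # H^dagger = H
print(np.isclose(bra(a) @ (H @ b), (bra(b) @ (H @ a)).conj()))   # True
\end{verbatim}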
4 Complete basis sets

Expanding an arbitrary ket |a⟩ in the basis {|i⟩} and using ai = ⟨i|a⟩,

\[
|a\rangle = \sum_{i=1}^{n} a_i\,|i\rangle = \sum_{i=1}^{n} |i\rangle\,a_i = \sum_{i=1}^{n} |i\rangle\langle i|a\rangle = \left(\sum_{i=1}^{n} |i\rangle\langle i|\right)|a\rangle
\]
Whatever is inside the parentheses leaves |a⟩ unchanged; therefore it must equal unity:
\[
\sum_{i=1}^{n} |i\rangle\langle i| = 1 \qquad (6)
\]
The left side of equation 6 is known as the dyadic of the vector space and is a testament to the completeness of the basis set: every vector of the basis must be included in the sum.
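To see the dyadic at work numerically, here is a small sketch (my own illustration, with an orthonormal basis generated by a QR decomposition) showing that the sum of the dyads |i⟩⟨i| reproduces the identity and therefore leaves any ket unchanged.

\begin{verbatim}
import numpy as np

rng = np.random.default_rng(1)

# An orthonormal basis of a 3-dimensional complex space: the columns of a
# unitary matrix obtained from a QR decomposition of a random complex matrix.
Q, _ = np.linalg.qr(rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3)))

# Sum of the dyads |i><i|: outer product of each basis column with its conjugate.
dyadic = sum(np.outer(Q[:, i], Q[:, i].conj()) for i in range(3))

print(np.allclose(dyadic, np.eye(3)))    # True: the sum is the identity matrix

# Acting with the dyadic on an arbitrary ket leaves the ket unchanged:
a = rng.normal(size=3) + 1j * rng.normal(size=3)
print(np.allclose(dyadic @ a, a))        # True
\end{verbatim}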
Very important note: For simplicity, I chose to write 1 for the sum in equation 6 because it resembles the multiplicative identity; however, it must be clear that the sum is not a number, as numbers are not being added! The fact is, the sum in equation 6 comprises an entirely new and until now unknown (at least for most of you) kind of vector multiplication. So far you have heard of the scalar (dot, inner) product ~a · ~b and the cross product ~a × ~b; dot products yield a number, while cross products yield a vector of the same dimension as the vectors being multiplied. Let's now focus on the matrix representation of vectors (bras and kets) of the space in question:
\[
|a\rangle = a_1|1\rangle + a_2|2\rangle + \cdots + a_n|n\rangle \;\leftrightarrow\; \mathbf{a}_{n\times 1} =
\begin{pmatrix} a_1\\ a_2\\ \vdots\\ a_n \end{pmatrix}
\]
\[
\langle a| = a_1^{*}\langle 1| + a_2^{*}\langle 2| + \cdots + a_n^{*}\langle n| \;\leftrightarrow\; \mathbf{a}^{\dagger}_{1\times n} =
\begin{pmatrix} a_1^{*} & a_2^{*} & \cdots & a_n^{*} \end{pmatrix}
\]
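In NumPy the same picture can be made explicit by storing the ket as an n × 1 array and obtaining the bra as its conjugate transpose; the three components below are arbitrary illustrative numbers, not taken from the notes.

\begin{verbatim}
import numpy as np

# |a> as an n x 1 column matrix (illustrative components a_1, a_2, a_3).
a_ket = np.array([[1.0 + 2.0j],
                  [0.5 - 1.0j],
                  [3.0 + 0.0j]])
a_bra = a_ket.conj().T                 # <a| = (a_1*, a_2*, a_3*), a 1 x n row

# <a|a> is the matrix product (1 x n)(n x 1) -> 1 x 1, i.e. a single number:
print((a_bra @ a_ket)[0, 0])           # sum_i |a_i|^2

# |a><a| is the product (n x 1)(1 x n) -> n x n: an operator (a dyad), not a number.
print((a_ket @ a_bra).shape)           # (3, 3)
\end{verbatim}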
To illustrate the power of a complete basis set and of the dyadic of the vector space, let's work out the matrix representation of the sequential application of two operators. We already know the result: if there is an isomorphism between operator algebra and matrix algebra, then we had better obtain the product of the matrix representations of the operators. In other words, we will prove the isomorphism AB ↔ AB. Take the sequential application AB to be equal to the application of some other operator C, and begin with the matrix representation of C: