2019spring LA WK13 TUE v3

Chapter 5.5 discusses complex matrices, including definitions of complex conjugates, conjugate transposes, and properties of Hermitian matrices. It covers concepts such as inner products, orthogonality, and projections, along with important theorems like the Schwarz and Triangle inequalities. The chapter also introduces the Gram-Schmidt algorithm and QR factorization for converting linearly independent vectors into orthonormal sets.


Chapter 5.5 Complex Matrices

Week 13 / TUE

Spring 2019

1 / 31
Definition
The complex conjugate of a matrix is obtained by taking the complex
conjugate of each entry:

$$\begin{bmatrix} a_{11} & \cdots & a_{1n} \\ \vdots & \ddots & \vdots \\ a_{m1} & \cdots & a_{mn} \end{bmatrix}^* := \begin{bmatrix} a_{11}^* & \cdots & a_{1n}^* \\ \vdots & \ddots & \vdots \\ a_{m1}^* & \cdots & a_{mn}^* \end{bmatrix}.$$

Example

$$\begin{bmatrix} 2i & 1 \\ 1+i & 0 \\ 2-3i & -i \end{bmatrix}^* = \begin{bmatrix} -2i & 1 \\ 1-i & 0 \\ 2+3i & i \end{bmatrix}$$
2 / 31
Definition
The conjugate transpose (or Hermitian transpose) of a matrix A is defined by

$$A^H := (A^*)^T.$$

Note
Complex conjugation and transposition commute:

$$A^H = (A^*)^T = (A^T)^*$$

If A is real, then $A^H = A^T$.

Example

$$\begin{bmatrix} 2i & 1 \\ 1+i & 0 \\ 2-3i & -i \end{bmatrix}^H = \begin{bmatrix} -2i & 1-i & 2+3i \\ 1 & 0 & i \end{bmatrix}$$

3 / 31
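The definition translates directly into code. A minimal pure-Python sketch over lists of complex numbers (the name `conj_transpose` and the sample matrix layout are our own, not from the slides):

```python
def conj_transpose(A):
    """Return A^H: complex-conjugate every entry, then transpose."""
    m, n = len(A), len(A[0])
    return [[A[i][j].conjugate() for i in range(m)] for j in range(n)]

# The 3x2 example from the slide; its conjugate transpose is 2x3.
A = [[2j, 1],
     [1 + 1j, 0],
     [2 - 3j, -1j]]

AH = conj_transpose(A)
# AH == [[-2j, 1 - 1j, 2 + 3j],
#        [1, 0, 1j]]
```

Applying `conj_transpose` twice recovers the original matrix, which is exercise 1 on the next slide.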
Exercise
Prove the following:
1. $(A^H)^H = A$
2. $(AB)^H = B^H A^H$
3. $(A^H)^{-1} = (A^{-1})^H$
4. $\det A^H = (\det A)^*$
5. $(e^A)^H = e^{A^H}$

4 / 31
Definition
Let A be an m × n complex matrix.
- The subspace of $\mathbb{C}^m$ spanned by the columns of A is called the
  column space of A, denoted by $C(A)$.
- The row space of A is defined by
  $$R(A) := C(A^H) \subseteq \mathbb{C}^n.$$

Example

$$C\left(\begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \end{bmatrix}\right) = \left\langle \begin{bmatrix} a_1 \\ a_2 \end{bmatrix}, \begin{bmatrix} b_1 \\ b_2 \end{bmatrix}, \begin{bmatrix} c_1 \\ c_2 \end{bmatrix} \right\rangle$$

$$R\left(\begin{bmatrix} a_1 & b_1 & c_1 \\ a_2 & b_2 & c_2 \end{bmatrix}\right) = \left\langle \begin{bmatrix} a_1^* \\ b_1^* \\ c_1^* \end{bmatrix}, \begin{bmatrix} a_2^* \\ b_2^* \\ c_2^* \end{bmatrix} \right\rangle$$

5 / 31
Remark
Let A be an m × n complex matrix. We still have

$$\dim R(A) = \dim C(A) = \operatorname{rank} A = \#(\text{pivots}).$$

Also,
$$R(A) \oplus N(A) = \mathbb{C}^n \quad \text{and} \quad C(A) \oplus N(A^H) = \mathbb{C}^m,$$
so
$$\operatorname{rank} A + \dim N(A) = n \quad \text{and} \quad \operatorname{rank} A + \dim N(A^H) = m.$$

6 / 31
Definition
The norm of a vector in $\mathbb{C}^n$ is defined by

$$\left\| \begin{bmatrix} z_1 \\ \vdots \\ z_n \end{bmatrix} \right\| := \sqrt{|z_1|^2 + \cdots + |z_n|^2}.$$

If $\|x\| = 1$, then x is called a unit vector.

Example

$$\left\| \begin{bmatrix} 1 \\ 2i \end{bmatrix} \right\| = \sqrt{|1|^2 + |2i|^2} = \sqrt{1 + 4} = \sqrt{5}$$

Note
- $\|x\| \in \mathbb{R}$
- $\|x\| \ge 0$
- $\|x\| = 0 \iff x = 0$
7 / 31
Definition
The inner product of two vectors $x, y \in \mathbb{C}^n$ is defined by

$$\langle x|y \rangle := x^H y \in \mathbb{C}.$$

Example

$$\left\langle \begin{bmatrix} a \\ b \\ c \end{bmatrix} \,\middle|\, \begin{bmatrix} x \\ y \\ z \end{bmatrix} \right\rangle = \begin{bmatrix} a^* & b^* & c^* \end{bmatrix} \begin{bmatrix} x \\ y \\ z \end{bmatrix} = a^* x + b^* y + c^* z$$

Note
1. $\langle x|x \rangle = \|x\|^2 \ge 0$
2. $\langle y|x \rangle = \langle x|y \rangle^*$

Exercise
Show that
$$\langle x|Ay \rangle = \langle A^H x|y \rangle.$$
8 / 31
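The inner product and norm can be sketched in a few lines of pure Python; the function names `inner` and `norm` and the sample vectors are our own:

```python
import math

def inner(x, y):
    """<x|y> = x^H y: the left argument is conjugated."""
    return sum(a.conjugate() * b for a, b in zip(x, y))

def norm(x):
    """||x|| = sqrt(<x|x>); <x|x> is always real and nonnegative."""
    return math.sqrt(inner(x, x).real)

x = [1, 2j]
y = [1 + 1j, 3]

# The slide-7 example: ||(1, 2i)|| = sqrt(1 + 4) = sqrt(5).
# Swapping the arguments conjugates the result: <y|x> = <x|y>*.
```

Note that the convention here conjugates the *left* argument, matching the slides; some texts conjugate the right argument instead.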
Exercise
Prove the following:
1. $\langle 0|x \rangle = \langle x|0 \rangle = 0$
2. $|\langle x|y \rangle| = |\langle y|x \rangle|$
3. $\langle x|y \rangle \langle y|x \rangle = |\langle x|y \rangle|^2$
4. $\langle x|y \rangle + \langle y|x \rangle = 2 \operatorname{Re} \langle x|y \rangle$
5. $\langle x|y \rangle - \langle y|x \rangle = 2i \operatorname{Im} \langle x|y \rangle$
6. $\langle x|y \rangle = 0 \iff \langle y|x \rangle = 0$
7. $\langle x|x \rangle = 0 \iff x = 0$
8. $\langle x+y|z \rangle = \langle x|z \rangle + \langle y|z \rangle$
9. $\langle x|y+z \rangle = \langle x|y \rangle + \langle x|z \rangle$
10. $\langle x|cy \rangle = c \langle x|y \rangle$ for all $c \in \mathbb{C}$
11. $\langle cx|y \rangle = c^* \langle x|y \rangle$ for all $c \in \mathbb{C}$
9 / 31
Theorem (Schwarz Inequality)
$|\langle x|y \rangle| \le \|x\| \, \|y\|$ for all $x, y \in \mathbb{C}^n$.

Proof
It is clearly true if $y = 0$, so assume that $y \ne 0$. Then

$$0 \le \left\| x - \frac{\langle y|x \rangle}{\langle y|y \rangle} y \right\|^2 = \left\langle x - \frac{\langle y|x \rangle}{\langle y|y \rangle} y \,\middle|\, x - \frac{\langle y|x \rangle}{\langle y|y \rangle} y \right\rangle$$

$$= \langle x|x \rangle - \frac{\langle x|y \rangle}{\langle y|y \rangle} \langle y|x \rangle - \frac{\langle y|x \rangle}{\langle y|y \rangle} \langle x|y \rangle + \frac{|\langle x|y \rangle|^2}{\langle y|y \rangle^2} \langle y|y \rangle$$

$$= \|x\|^2 - 2\,\frac{|\langle x|y \rangle|^2}{\|y\|^2} + \frac{|\langle x|y \rangle|^2}{\|y\|^4}\,\|y\|^2 = \|x\|^2 - \frac{|\langle x|y \rangle|^2}{\|y\|^2},$$

so $|\langle x|y \rangle|^2 \le \|x\|^2 \|y\|^2$, hence $|\langle x|y \rangle| \le \|x\| \, \|y\|$.

10 / 31
Theorem (Triangle Inequality)
$\|x+y\| \le \|x\| + \|y\|$ for all $x, y \in \mathbb{C}^n$.

Proof

$$\|x+y\|^2 = \langle x+y|x+y \rangle$$

$$= \|x\|^2 + \|y\|^2 + \langle x|y \rangle + \langle y|x \rangle$$

$$= \|x\|^2 + \|y\|^2 + 2\operatorname{Re}\langle x|y \rangle$$

$$\le \|x\|^2 + \|y\|^2 + 2|\langle x|y \rangle|$$

$$\le \|x\|^2 + \|y\|^2 + 2\,\|x\|\,\|y\| \qquad \text{(Schwarz Inequality)}$$

$$= (\|x\| + \|y\|)^2$$
11 / 31
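Both inequalities can be spot-checked numerically; the helper functions and the sample vectors below are our own:

```python
import math

def inner(x, y):
    # <x|y> = x^H y
    return sum(a.conjugate() * b for a, b in zip(x, y))

def norm(x):
    return math.sqrt(inner(x, x).real)

x = [1 + 2j, -1j, 3]
y = [2, 1 - 1j, -1 + 1j]
s = [a + b for a, b in zip(x, y)]   # the vector x + y

# Schwarz:  |<x|y>| <= ||x|| ||y||
# Triangle: ||x + y|| <= ||x|| + ||y||
```

Equality in Schwarz would require x and y to be parallel, which these are not, so both inequalities hold strictly here.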
Definition
If
$$\langle x|y \rangle = 0,$$
then x and y are said to be orthogonal, and we write
$$x \perp y.$$

Remark
Many orthogonality properties of real vectors continue to hold for complex
vectors. We list a few of them below:
- $x \perp y_1, \ \ldots, \ x \perp y_k \implies x \perp \langle y_1, \ldots, y_k \rangle$
- If $\{b_1, \ldots, b_n\}$ is a basis of $\mathbb{C}^n$, then
  $x \perp b_1, \ \ldots, \ x \perp b_n \implies x = 0$.
- Nonzero mutually orthogonal vectors are linearly independent.

12 / 31
Pythagorean Theorem
If $x \perp y$, then
$$\|x+y\|^2 = \|x\|^2 + \|y\|^2.$$

Proof

$$\|x+y\|^2 = \langle x+y|x+y \rangle = \|x\|^2 + \|y\|^2 + \langle x|y \rangle + \langle y|x \rangle = \|x\|^2 + \|y\|^2,$$

since $\langle x|y \rangle = \langle y|x \rangle = 0$.

13 / 31
Definition
The orthogonal complement of a subspace $U \subseteq \mathbb{C}^n$ is defined by

$$U^\perp := \{x \in \mathbb{C}^n \mid x \perp u \text{ for all } u \in U\}.$$

Remark
Many properties of $U^\perp$ continue to hold. We list a few of them below:
- $(U^\perp)^\perp = U$
- $U \cap U^\perp = 0$
- $U \oplus U^\perp = \mathbb{C}^n$

Exercise
Show that
$$N(A) = R(A)^\perp \quad \text{and} \quad N(A^H) = C(A)^\perp.$$

14 / 31
Definition
If
$$A^H = A,$$
then A is called a Hermitian matrix.

Note
If A is Hermitian, then
$$\langle x|Ay \rangle = \langle Ax|y \rangle.$$
A real matrix is symmetric if and only if it is Hermitian.

Charles Hermite (1822–1901) was a French mathematician.


15 / 31
Theorem
The eigenvalues of a Hermitian matrix are real.

Proof
Let A be a Hermitian matrix, and v an eigenvector of A associated with an
eigenvalue $\lambda$. Then
$$\langle v|Av \rangle = \langle v|\lambda v \rangle = \lambda \langle v|v \rangle.$$
Taking the complex conjugate and using $A^H = A$ gives
$$\lambda \langle v|v \rangle = \langle v|Av \rangle = \langle Av|v \rangle = \langle v|Av \rangle^* = \lambda^* \langle v|v \rangle.$$
Since $v \ne 0$, $\langle v|v \rangle \ne 0$. Hence, we can divide by $\langle v|v \rangle$, yielding
$$\lambda = \lambda^*,$$
i.e., $\lambda \in \mathbb{R}$.

Remark
In quantum mechanics, observable quantities, which are real, are
represented by Hermitian operators.
16 / 31
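For a 2×2 Hermitian matrix the theorem can be spot-checked by hand via the characteristic polynomial $\lambda^2 - (\operatorname{tr} A)\lambda + \det A = 0$; the specific matrix below is our own example, not from the slides:

```python
import cmath

A = [[2, 1 - 1j],
     [1 + 1j, 3]]       # Hermitian: A^H = A

tr = A[0][0] + A[1][1]                         # trace = 5
det = A[0][0] * A[1][1] - A[0][1] * A[1][0]    # det = 6 - |1-i|^2 = 4
disc = tr * tr - 4 * det                       # = (a-d)^2 + 4|b|^2 >= 0

lam1 = (tr + cmath.sqrt(disc)) / 2
lam2 = (tr - cmath.sqrt(disc)) / 2
# Both roots are real: lam1 = 4, lam2 = 1.
```

The discriminant $(a-d)^2 + 4|b|^2$ is nonnegative for every Hermitian 2×2 matrix, which is exactly why both eigenvalues come out real.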
Theorem
Eigenvectors of a Hermitian matrix associated with distinct eigenvalues
are mutually orthogonal.

Proof
Let A be a Hermitian matrix, and u, v eigenvectors of A associated with
distinct eigenvalues $\lambda, \mu \in \mathbb{C}$, respectively. Since A is Hermitian,

$$\mu \langle u|v \rangle = \langle u|\mu v \rangle = \langle u|Av \rangle = \langle Au|v \rangle = \langle \lambda u|v \rangle = \lambda^* \langle u|v \rangle = \lambda \langle u|v \rangle.$$

If $\langle u|v \rangle \ne 0$, then $\mu = \lambda$, a contradiction. Hence, $\langle u|v \rangle = 0$.

Note
Recall that eigenvectors of a matrix associated with distinct eigenvalues
are linearly independent. However, they are not necessarily mutually
orthogonal.

17 / 31
Projections
Let A be an n × n complex matrix. One can show that a linear
transformation
$$\mathbb{C}^n \to \mathbb{C}^n, \quad x \mapsto Ax$$
is a projection if and only if A is Hermitian and idempotent, i.e.,

$$A^H = A \quad \text{and} \quad A^2 = A.$$

The proof is almost identical to the real version (WK6/TUE).

18 / 31
Projection onto a Line
If $u \in \mathbb{C}^n$ is a nonzero vector, then

$$p_{\langle u \rangle} = \frac{u u^H}{u^H u}.$$

Note
If u is a unit vector, then
$$p_{\langle u \rangle} = u u^H.$$

Note
For a unit vector u and all $v \in \mathbb{C}^n$,
$$p_{\langle u \rangle} v = u u^H v = \langle u|v \rangle u.$$
19 / 31
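The rank-one projection $u u^H / (u^H u)$ is easy to build and to check for idempotence in code; `proj_line`, `matvec`, and the sample vectors are our own names:

```python
def proj_line(u):
    """Projection matrix onto span{u}: (u u^H) / (u^H u)."""
    uu = sum(abs(a) ** 2 for a in u)      # u^H u = ||u||^2
    return [[a * b.conjugate() / uu for b in u] for a in u]

def matvec(P, v):
    return [sum(P[i][j] * v[j] for j in range(len(v))) for i in range(len(P))]

u = [1, 1j]
P = proj_line(u)
v = [2, 3]

# Projecting twice changes nothing: P(Pv) = Pv (idempotence),
# and P is Hermitian, so x -> Px really is a projection.
pv = matvec(P, v)
ppv = matvec(P, pv)
```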
Projection onto a Subspace
If $u_1, \ldots, u_k$ is a mutually orthogonal basis of a subspace $U \subseteq \mathbb{C}^n$, then

$$p_U = p_{\langle u_1 \rangle} + \cdots + p_{\langle u_k \rangle}.$$

Note
If $u_1, \ldots, u_k$ is an orthonormal basis of U, then

$$p_U = u_1 u_1^H + \cdots + u_k u_k^H.$$

Note
If $u_1, \ldots, u_n$ is an orthonormal basis of $\mathbb{C}^n$, then for all $v \in \mathbb{C}^n$,

$$v = \langle u_1|v \rangle u_1 + \cdots + \langle u_n|v \rangle u_n.$$
20 / 31
Gram-Schmidt
The Gram-Schmidt algorithm converts linearly independent complex vectors
$v_1, \ldots, v_k$ into orthonormal vectors $u_1, \ldots, u_k$:

$$u_1 = \frac{v_1}{\|v_1\|}$$

$$\tilde{v}_2 = v_2 - \langle u_1|v_2 \rangle u_1, \qquad u_2 = \frac{\tilde{v}_2}{\|\tilde{v}_2\|}$$

$$\tilde{v}_3 = v_3 - \langle u_1|v_3 \rangle u_1 - \langle u_2|v_3 \rangle u_2, \qquad u_3 = \frac{\tilde{v}_3}{\|\tilde{v}_3\|}$$

$$\tilde{v}_4 = v_4 - \langle u_1|v_4 \rangle u_1 - \langle u_2|v_4 \rangle u_2 - \langle u_3|v_4 \rangle u_3, \qquad u_4 = \frac{\tilde{v}_4}{\|\tilde{v}_4\|}$$

$$\vdots$$
21 / 31
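The algorithm above translates line by line into code. A minimal pure-Python sketch over lists of complex numbers (function names are our own; in practice a library QR routine would be preferred for numerical stability):

```python
import math

def inner(x, y):
    return sum(a.conjugate() * b for a, b in zip(x, y))

def gram_schmidt(vs):
    """Orthonormalize linearly independent complex vectors."""
    us = []
    for v in vs:
        w = list(v)
        # Subtract the projections onto the already-built u's
        # (modified Gram-Schmidt: project the running remainder w).
        for u in us:
            c = inner(u, w)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        nrm = math.sqrt(inner(w, w).real)   # ||v~||
        us.append([wi / nrm for wi in w])
    return us

vs = [[1, 1j, 0], [1, 0, 1], [0, 0, 1]]   # linearly independent
us = gram_schmidt(vs)                      # orthonormal
```

The test below checks the defining property: $\langle u_i|u_j \rangle = \delta_{ij}$.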
QR Factorization

$$v_1 = \|v_1\| u_1$$

$$v_2 = \langle u_1|v_2 \rangle u_1 + \|\tilde{v}_2\| u_2$$

$$v_3 = \langle u_1|v_3 \rangle u_1 + \langle u_2|v_3 \rangle u_2 + \|\tilde{v}_3\| u_3$$

$$\vdots$$

$$v_n = \langle u_1|v_n \rangle u_1 + \langle u_2|v_n \rangle u_2 + \cdots + \langle u_{n-1}|v_n \rangle u_{n-1} + \|\tilde{v}_n\| u_n$$

$$\begin{bmatrix} | & & | \\ v_1 & \cdots & v_n \\ | & & | \end{bmatrix} = \begin{bmatrix} | & & | \\ u_1 & \cdots & u_n \\ | & & | \end{bmatrix} \begin{bmatrix} \|v_1\| & \langle u_1|v_2 \rangle & \langle u_1|v_3 \rangle & \cdots & \langle u_1|v_n \rangle \\ 0 & \|\tilde{v}_2\| & \langle u_2|v_3 \rangle & \cdots & \langle u_2|v_n \rangle \\ 0 & 0 & \|\tilde{v}_3\| & \cdots & \langle u_3|v_n \rangle \\ \vdots & \vdots & \vdots & \ddots & \vdots \\ 0 & 0 & 0 & \cdots & \|\tilde{v}_n\| \end{bmatrix} =: QR$$
22 / 31
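Since the u's are orthonormal, every entry of R is just $r_{ij} = \langle u_i|v_j \rangle$ for $i \le j$ (the diagonal entry $\langle u_j|v_j \rangle$ equals $\|\tilde{v}_j\|$), so A = QR can be verified by reconstructing each column. A sketch with our own helper names:

```python
import math

def inner(x, y):
    return sum(a.conjugate() * b for a, b in zip(x, y))

def gram_schmidt(vs):
    us = []
    for v in vs:
        w = list(v)
        for u in us:
            c = inner(u, w)
            w = [wi - c * ui for wi, ui in zip(w, u)]
        nrm = math.sqrt(inner(w, w).real)
        us.append([wi / nrm for wi in w])
    return us

vs = [[1, 1j, 0], [1, 0, 1], [0, 0, 1]]   # columns of A
us = gram_schmidt(vs)                      # columns of Q
n = len(vs)

# Upper-triangular R with r_ij = <u_i|v_j>  (r_jj = ||v~_j||).
R = [[inner(us[i], vs[j]) if i <= j else 0 for j in range(n)] for i in range(n)]

# Reconstruct each column: v_j = sum_i R[i][j] u_i.
recon = [[sum(R[i][j] * us[i][k] for i in range(n)) for k in range(n)] for j in range(n)]
```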
Definition
If
$$A^H = A^{-1},$$
then A is called a unitary matrix.

Note
If U is unitary, then
$$\langle Ux|Uy \rangle = \langle x|y \rangle.$$
A real matrix is orthogonal if and only if it is unitary.

Exercise
Show that for a square matrix A, the following are equivalent (TFAE):
i. A is unitary.
ii. The columns of A are orthonormal.
iii. The rows of A are orthonormal.
23 / 31
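A concrete complex unitary matrix, with a numerical check that it preserves the inner product and has orthonormal columns (the example matrix and names are our own):

```python
import math

def inner(x, y):
    return sum(a.conjugate() * b for a, b in zip(x, y))

def matvec(A, v):
    return [sum(A[i][j] * v[j] for j in range(len(v))) for i in range(len(A))]

s = 1 / math.sqrt(2)
U = [[s, s * 1j],
     [s * 1j, s]]          # unitary: U^H U = I

x = [1, 2 - 1j]
y = [1j, 3]

# <Ux|Uy> = <x|y>: multiplying by U preserves the inner product.
lhs = inner(matvec(U, x), matvec(U, y))
rhs = inner(x, y)

cols = [[U[i][j] for i in range(2)] for j in range(2)]
```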
Exercise
Show that if U is unitary, then
$$|\det U| = 1.$$

Definition
If
$$A^H = -A,$$
then A is called a skew-Hermitian matrix.

Note
A real matrix is skew-symmetric if and only if it is skew-Hermitian.
If A is Hermitian, then iA is skew-Hermitian.

Exercise
Show that if A is skew-Hermitian, then $e^A$ is unitary.
24 / 31
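The last exercise can be spot-checked numerically with a truncated power series for $e^A$; the helpers and the example matrix are our own, and a truncated series is adequate here only because the matrix is small with small entries:

```python
def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)] for i in range(n)]

def expm(A, terms=30):
    """e^A by truncated power series: I + A + A^2/2! + ..."""
    n = len(A)
    S = [[1.0 if i == j else 0.0 for j in range(n)] for i in range(n)]  # partial sum
    T = [row[:] for row in S]                                           # term A^k / k!
    for k in range(1, terms):
        T = [[t / k for t in row] for row in matmul(T, A)]
        S = [[s + t for s, t in zip(rs, rt)] for rs, rt in zip(S, T)]
    return S

A = [[1j, 1],
     [-1, 1j]]    # skew-Hermitian: A^H = -A

E = expm(A)
EH = [[E[i][j].conjugate() for i in range(2)] for j in range(2)]
P = matmul(EH, E)   # (e^A)^H e^A, approximately the identity
```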
Theorem
Let A be an n × n complex matrix. A linear transformation

$$\mathbb{C}^n \to \mathbb{C}^n, \quad x \mapsto Ax$$

is an isometry if and only if A is unitary.

Proof
If A is unitary, then

$$\langle Au|Av \rangle = \langle u|A^H A v \rangle = \langle u|v \rangle.$$

Hence, if v = u, then

$$\|u\|^2 = \langle u|u \rangle = \langle Au|Au \rangle = \|Au\|^2,$$

so
$$\|u\| = \|Au\|.$$
Conversely, suppose that A is an isometry. Then
25 / 31
Proof (cont'd.)

$$\langle u|A^H A v \rangle = \langle Au|Av \rangle,$$

and by the polarization identity together with the isometry assumption,

$$2\operatorname{Re}\langle Au|Av \rangle = \|Au + Av\|^2 - \|Au\|^2 - \|Av\|^2 = \|A(u+v)\|^2 - \|Au\|^2 - \|Av\|^2$$

$$= \|u+v\|^2 - \|u\|^2 - \|v\|^2 = 2\operatorname{Re}\langle u|v \rangle;$$

applying the same identity to the pair $(iu, v)$ gives equality of the imaginary parts as well, so $\langle Au|Av \rangle = \langle u|v \rangle$. Taking $u = e_i$ and $v = e_j$,

$$(A^H A)_{ij} = \langle e_i|A^H A e_j \rangle = \langle e_i|e_j \rangle.$$

It follows that
$$A^H A = I,$$
i.e., A is unitary.
26 / 31
Example
The rotation matrix
$$A = \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}$$
is an isometry, hence unitary.

$$\det(A - \lambda I) = \det \begin{bmatrix} \cos\theta - \lambda & -\sin\theta \\ \sin\theta & \cos\theta - \lambda \end{bmatrix}$$

$$= (\cos\theta - \lambda)(\cos\theta - \lambda) + \sin^2\theta$$

$$= \lambda^2 - 2\cos\theta\,\lambda + 1 = (\lambda - e^{i\theta})(\lambda - e^{-i\theta}),$$

so the eigenvalues are $\lambda = e^{\pm i\theta}$. Note that

$$|e^{\pm i\theta}| = 1,$$

as expected.
27 / 31
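The factorization of the characteristic polynomial can be spot-checked for an arbitrary angle (the angle 0.7 below is our own arbitrary choice):

```python
import cmath
import math

theta = 0.7   # an arbitrary angle
eigs = (cmath.exp(1j * theta), cmath.exp(-1j * theta))

# Each lam = e^{+-i theta} is a root of lam^2 - 2 cos(theta) lam + 1,
# and each has modulus 1, as the next theorem predicts.
vals = [lam ** 2 - 2 * math.cos(theta) * lam + 1 for lam in eigs]
```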
Theorem
If $\lambda$ is an eigenvalue of a unitary matrix, then $|\lambda| = 1$.

Proof
Let U be a unitary matrix, and v an eigenvector of U associated with an
eigenvalue $\lambda$. Since U is an isometry,

$$\|v\| = \|Uv\| = \|\lambda v\| = |\lambda| \, \|v\|.$$

Since $\|v\| \ne 0$, we can divide by $\|v\|$, giving $|\lambda| = 1$.

Remark
It follows that every eigenvalue of a unitary matrix is of the form

$$e^{i\theta} = \cos\theta + i\sin\theta.$$

The name unitary originates from this fact.
28 / 31
Theorem
Eigenvectors of a unitary matrix belonging to distinct eigenvalues are
mutually orthogonal.

Proof
Let U be a unitary matrix, and u, v eigenvectors of U associated with
distinct eigenvalues $\lambda, \mu \in \mathbb{C}$, respectively. Then

$$\lambda^* \mu \langle u|v \rangle = \langle \lambda u|\mu v \rangle = \langle Uu|Uv \rangle = \langle u|v \rangle.$$

If $\langle u|v \rangle \ne 0$, then we can divide by $\langle u|v \rangle$, giving

$$\lambda^* \mu = 1.$$

Since $|\lambda| = 1$, multiplying both sides by $\lambda$ gives

$$\mu = |\lambda|^2 \mu = \lambda \lambda^* \mu = \lambda,$$

a contradiction. Hence, $\langle u|v \rangle = 0$, i.e., $u \perp v$.
29 / 31
Note

              R                             C
  Symmetric   (A^T = A)         Hermitian   (A^H = A)
  Orthogonal  (A^T = A^{-1})    Unitary     (A^H = A^{-1})
30 / 31
Exercise
Prove the following:
1. If U and V are unitary, then UV is unitary.
2. If U is unitary, then $U^{-1}$ is unitary.
3. If A is Hermitian and U is unitary, then $U^{-1} A U$ is Hermitian.

Spectral Theorem
If A is an n × n Hermitian matrix, then there exists a basis of $\mathbb{C}^n$
consisting of orthonormal eigenvectors of A.

Remark
We will prove this theorem in the next lecture. The theorem implies that
every Hermitian matrix is unitarily diagonalizable, i.e.,

$$U^H A U = \Lambda,$$

where U is a unitary matrix whose columns are orthonormal eigenvectors
of A, and $\Lambda$ is a diagonal matrix of the (real) eigenvalues of A.
31 / 31
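For a small Hermitian matrix the spectral theorem can be verified concretely. Here the 2×2 example, its eigenvalues, and its orthonormal eigenvectors were worked out by hand (the matrix is our own, not from the slides), and the code merely confirms that $U^H A U$ is diagonal:

```python
import math

def matmul(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

A = [[2, 1 - 1j],
     [1 + 1j, 3]]       # Hermitian; eigenvalues 4 and 1 (found by hand)

# Orthonormal eigenvectors, as the columns of U:
#   (1-i, 2)/sqrt(6)  for lambda = 4,   (1-i, -1)/sqrt(3)  for lambda = 1.
U = [[(1 - 1j) / math.sqrt(6), (1 - 1j) / math.sqrt(3)],
     [2 / math.sqrt(6), -1 / math.sqrt(3)]]
UH = [[U[i][j].conjugate() for i in range(2)] for j in range(2)]

L = matmul(UH, matmul(A, U))   # = diag(4, 1), a real diagonal matrix
```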
