SP351 HW6: Nikhil Vasan September 2021

The document provides examples of applying the Gram-Schmidt process to orthogonalize sets of vectors. It also verifies properties of orthogonal matrices and eigenvectors/eigenvalues. Specifically, it shows: 1) Applying Gram-Schmidt to 3 sets of vectors to obtain orthogonal vectors 2) Verifying the Cauchy-Schwarz inequality for 2 sets of vectors 3) Showing a matrix C is orthogonal and its eigenvectors are orthogonal 4) Proving properties about orthogonal matrices and symmetric/anti-symmetric matrices 5) Finding the eigenvalues and eigenvectors of a 2x2 matrix.


SP351 HW6

Nikhil Vasan
September 2021

1a. Given each set of basis vectors, use Gram-Schmidt orthogonalization to find a set of orthogonal vectors.
a. A = (0, 2, 0, 0), B = (3, −4, 0, 0), C = (1, 2, 3, 4)

\[ e_1 = \frac{A}{|A|} = (0, 1, 0, 0) \tag{1} \]
\[ e_2 = \frac{B - (e_1 \cdot B)e_1}{|B - (e_1 \cdot B)e_1|} = (1, 0, 0, 0) \tag{2} \]
\[ e_3 = \frac{C - (C \cdot e_1)e_1 - (C \cdot e_2)e_2}{|C - (C \cdot e_1)e_1 - (C \cdot e_2)e_2|} = \left(0, 0, \frac{3}{\sqrt{25}}, \frac{4}{\sqrt{25}}\right) \tag{3} \]
\[ = \left(0, 0, \frac{3}{5}, \frac{4}{5}\right) \tag{4} \]
As illustrated above, the Gram-Schmidt orthogonalization process iteratively subtracts from each vector its components along the previously computed unit vectors, then normalizes what remains.
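The procedure can be sketched in a few lines of Python; the helper below is an illustration (not part of the assignment) and reproduces part (a):

```python
from math import sqrt

def gram_schmidt(vectors):
    """Orthonormalize a list of vectors: subtract from each vector its
    components along the previous unit vectors, then normalize."""
    basis = []
    for v in vectors:
        # Subtract the projection of v onto each unit vector found so far.
        for e in basis:
            proj = sum(ei * vi for ei, vi in zip(e, v))
            v = [vi - proj * ei for vi, ei in zip(v, e)]
        norm = sqrt(sum(vi * vi for vi in v))
        basis.append([vi / norm for vi in v])
    return basis

e1, e2, e3 = gram_schmidt([[0, 2, 0, 0], [3, -4, 0, 0], [1, 2, 3, 4]])
print(e1)  # [0.0, 1.0, 0.0, 0.0]
print(e2)  # [1.0, 0.0, 0.0, 0.0]
print(e3)  # [0.0, 0.0, 0.6, 0.8]
```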
b. A = (0, 0, 0, 7), B = (2, 0, 0, 5), C = (3, 1, 1, 4)

\[ e_1 = \frac{A}{|A|} = (0, 0, 0, 1) \tag{5} \]
\[ e_2 = \frac{B - (e_1 \cdot B)e_1}{|B - (e_1 \cdot B)e_1|} = (1, 0, 0, 0) \tag{6} \]
\[ e_3 = \frac{C - (C \cdot e_1)e_1 - (C \cdot e_2)e_2}{|C - (C \cdot e_1)e_1 - (C \cdot e_2)e_2|} = \left(0, \frac{1}{\sqrt{2}}, \frac{1}{\sqrt{2}}, 0\right) \tag{7} \]
c. A = (6, 0, 0, 0), B = (1, 0, 2, 0), C = (4, 1, 9, 2)
\[ e_1 = \frac{A}{|A|} = (1, 0, 0, 0) \tag{8} \]
\[ e_2 = \frac{B - (e_1 \cdot B)e_1}{|B - (e_1 \cdot B)e_1|} = (0, 0, 1, 0) \tag{9} \]
\[ e_3 = \frac{C - (C \cdot e_1)e_1 - (C \cdot e_2)e_2}{|C - (C \cdot e_1)e_1 - (C \cdot e_2)e_2|} = \left(0, \frac{1}{\sqrt{5}}, 0, \frac{2}{\sqrt{5}}\right) \tag{10} \]
1b. Find the norms of A, B and demonstrate that they obey the Cauchy-Schwarz inequality, that is, |A \cdot B| \le |A||B|.
a. A = (3 + i, 1, 2 − i, −5i, i + 1), B = (2i, 4 − 3i, 1 + i, 3i, 1).
\[ |A| = \sqrt{\textstyle\sum_{i=1}^{5} A_i^* A_i} \tag{11} \]

We note that because A, B \in \mathbb{C}^5, we must use the Hermitian inner product rather than the real dot product: we take the complex conjugate of one vector and multiply term by term, so the norm is the square root of the sum of the squared moduli of the components.
\[ |A| = \sqrt{|3+i|^2 + |1|^2 + |2-i|^2 + |-5i|^2 + |i+1|^2} = \sqrt{10 + 1 + 5 + 25 + 2} = \sqrt{43} \tag{12} \]
We do the same for B.
\[ |B| = \sqrt{|2i|^2 + |4-3i|^2 + |1+i|^2 + |3i|^2 + |1|^2} = \sqrt{4 + 25 + 2 + 9 + 1} = \sqrt{41} \tag{13} \]
Let us check the Cauchy-Schwarz inequality:

\[ |A \cdot B| = |(3-i)(2i) + (1)(4-3i) + (2+i)(1+i) + (5i)(3i) + (1-i)(1)| \tag{14} \]
\[ = |6i + 2 + 4 - 3i + 1 + 3i - 15 + 1 - i| = |-7 + 5i| = \sqrt{74} \tag{15} \]
As we can see, \sqrt{74} < \sqrt{(41)(43)}, and the inequality holds.

b. A = (2, 2i − 3, 1 + i, 5i, i − 2), B = (5i − 2, 1, 3 + i, 2i, 4)


\[ |A| = \sqrt{|2|^2 + |2i-3|^2 + |1+i|^2 + |5i|^2 + |i-2|^2} = \sqrt{4 + 13 + 2 + 25 + 5} = \sqrt{49} = 7 \tag{16} \]
\[ |B| = \sqrt{|5i-2|^2 + |1|^2 + |3+i|^2 + |2i|^2 + |4|^2} = \sqrt{29 + 1 + 10 + 4 + 16} = \sqrt{60} \tag{17} \]

\[ |A^* \cdot B| = |(2)(5i-2) + (-2i-3)(1) + (1-i)(3+i) + (-5i)(2i) + (-i-2)(4)| \tag{18} \]
\[ = |10i - 4 - 3 - 2i + 4 - 2i + 10 - 8 - 4i| = |-1 + 2i| = \sqrt{5} \tag{19} \]

Again we see that our inequality holds.
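As a numerical sanity check (an illustration with helper names of our choosing, not part of the solution), Python's built-in complex type can confirm the inequality for both pairs; for part (a) we use the five-component B that the norm computation employs:

```python
from math import sqrt

def norm(v):
    """Hermitian norm: square root of the sum of squared moduli."""
    return sqrt(sum(abs(z) ** 2 for z in v))

def inner(a, b):
    """Hermitian inner product: sum over conj(a_i) * b_i."""
    return sum(z.conjugate() * w for z, w in zip(a, b))

# Part a.
A = [3 + 1j, 1, 2 - 1j, -5j, 1j + 1]
B = [2j, 4 - 3j, 1 + 1j, 3j, 1]
assert abs(inner(A, B)) <= norm(A) * norm(B)   # Cauchy-Schwarz holds

# Part b.
A2 = [2, 2j - 3, 1 + 1j, 5j, 1j - 2]
B2 = [5j - 2, 1, 3 + 1j, 2j, 4]
print(round(norm(A2), 6))                      # 7.0
assert abs(inner(A2, B2)) <= norm(A2) * norm(B2)
```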

2. Verify that the matrix
\[ C = \begin{pmatrix} \frac{1}{\sqrt{5}} & \frac{-2}{\sqrt{5}} \\ \frac{2}{\sqrt{5}} & \frac{1}{\sqrt{5}} \end{pmatrix} \tag{20} \]
is orthogonal, and that the eigenvectors A = \begin{pmatrix} \frac{1}{\sqrt{5}} \\ \frac{2}{\sqrt{5}} \end{pmatrix}, B = \begin{pmatrix} \frac{-2}{\sqrt{5}} \\ \frac{1}{\sqrt{5}} \end{pmatrix} are orthogonal. Let us first determine whether the eigenvectors are orthogonal.
\[ A \cdot B = \left(\frac{1}{\sqrt{5}}\right)\left(\frac{-2}{\sqrt{5}}\right) + \left(\frac{2}{\sqrt{5}}\right)\left(\frac{1}{\sqrt{5}}\right) = 0 \tag{21} \]
Furthermore, we notice something interesting about the magnitude of these vectors:
\[ A \cdot A = \frac{1}{5}(1^2 + 2^2) = 1 \tag{22} \]
Interesting! Let's see if this holds for B:
\[ B \cdot B = \frac{1}{5}((-2)^2 + 1^2) = 1 \tag{23} \]
We also notice something interesting about the matrix C:
\[ C = \begin{pmatrix} A & B \end{pmatrix} \tag{24} \]
C is a matrix whose columns form an orthonormal basis. It must be an orthogonal matrix! Let us test that.
\[ C^T C = \begin{pmatrix} A^T \\ B^T \end{pmatrix} \begin{pmatrix} A & B \end{pmatrix} = \begin{pmatrix} A^T A & A^T B \\ B^T A & B^T B \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \tag{25} \]
C is an orthogonal matrix, as C^T C = I.
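A quick numerical check of these facts (an illustrative sketch, not part of the solution):

```python
from math import sqrt

s = 1 / sqrt(5)
C = [[s, -2 * s],
     [2 * s, s]]

# Entry (i, j) of C^T C is the dot product of columns i and j of C;
# for an orthogonal matrix this should be the identity (up to rounding).
CtC = [[sum(C[k][i] * C[k][j] for k in range(2)) for j in range(2)]
       for i in range(2)]

# Columns A and B of C.
A = [C[0][0], C[1][0]]
B = [C[0][1], C[1][1]]
print(sum(a * b for a, b in zip(A, B)))   # 0.0 -- the columns are orthogonal
```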
2b. Verify the following statements.
a. If C is orthogonal and M is symmetric, prove that C^{-1}MC is symmetric.
Proof. We use direct proof. By hypothesis, C is orthogonal, that is, CC^T = I, so C^T = C^{-1} and C^{-1}MC = C^TMC. By the properties of matrix transposition, (C^TMC)^T = C^TM^T(C^T)^T = C^TMC = C^{-1}MC, where M^T = M because M is symmetric. Hence C^{-1}MC equals its own transpose and is symmetric.

b. If C is orthogonal and M is anti-symmetric, show that CMC^{-1} is anti-symmetric.
Proof. We use direct proof. By hypothesis, M is anti-symmetric, that is, M^T = -M, and C^{-1} = C^T as before. Therefore, (CMC^{-1})^T = (CMC^T)^T = CM^TC^T = -CMC^T = -CMC^{-1}. Therefore, CMC^{-1} is anti-symmetric.
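Both statements can be spot-checked numerically; the rotation matrix below is just one convenient choice of orthogonal C, and the helper names are ours:

```python
from math import cos, sin

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def transpose(X):
    return [[X[j][i] for j in range(2)] for i in range(2)]

# A rotation matrix is a convenient concrete orthogonal C (an illustrative
# choice -- any orthogonal matrix would do).
t = 0.7
C = [[cos(t), -sin(t)], [sin(t), cos(t)]]
Ct = transpose(C)   # C^T = C^{-1} for an orthogonal matrix

M_sym = [[1.0, 2.0], [2.0, 5.0]]    # symmetric: M^T = M
M_anti = [[0.0, 3.0], [-3.0, 0.0]]  # anti-symmetric: M^T = -M

S = matmul(Ct, matmul(M_sym, C))    # C^{-1} M C, should stay symmetric
A = matmul(C, matmul(M_anti, Ct))   # C M C^{-1}, should stay anti-symmetric

print(abs(S[0][1] - S[1][0]) < 1e-12)   # True
print(abs(A[0][1] + A[1][0]) < 1e-12)   # True
```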

3a. Find the eigenvalues and eigenvectors of the following matrix.
\[ M = \begin{pmatrix} 3 & -2 \\ -2 & 0 \end{pmatrix} \]
The eigenvectors of this matrix satisfy
\[ M\vec{v} = \lambda\vec{v} = \lambda I\vec{v} \tag{27} \]
As such,
\[ (M - \lambda I)\vec{v} = 0 \tag{28} \]
\vec{v} is a non-trivial vector, so \vec{v} \in N[M - \lambda I]; because the null space of M - \lambda I is non-trivial, \det(M - \lambda I) = 0. Evaluating our determinant we obtain a polynomial in \lambda.

\[ \begin{vmatrix} 3-\lambda & -2 \\ -2 & -\lambda \end{vmatrix} = -(3-\lambda)\lambda - 4 = \lambda^2 - 3\lambda - 4 = (\lambda+1)(\lambda-4) = 0 \tag{29} \]

Our constraints are satisfied when \lambda = -1, 4. We can substitute these values of \lambda into our earlier equation to obtain our eigenvectors.
\[ \begin{pmatrix} 3-(-1) & -2 \\ -2 & 0-(-1) \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \tag{30} \]
Let us create a system of equations and see how v_1 and v_2 are related.
\[ 0 = 4v_1 - 2v_2 \tag{31} \]
\[ 0 = -2v_1 + v_2 \tag{32} \]
Both equations give v_2 = 2v_1. Therefore any vector of the form \alpha\begin{pmatrix} 1 \\ 2 \end{pmatrix} is an eigenvector of this matrix with eigenvalue -1. Let us solve when \lambda = 4.
     
\[ \begin{pmatrix} 3-4 & -2 \\ -2 & 0-4 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \tag{33} \]
Let us create a system of equations and see how v_1 and v_2 are related.
\[ 0 = -v_1 - 2v_2 \tag{34} \]
\[ 0 = -2v_1 - 4v_2 \tag{35} \]
Solving as before, the eigenvector with eigenvalue 4 is any vector of the form \alpha\begin{pmatrix} 2 \\ -1 \end{pmatrix}.
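Each claimed eigenpair can be verified directly by checking M v = λ v (an illustrative check, not required by the problem):

```python
M = [[3, -2],
     [-2, 0]]

def matvec(M, v):
    """Multiply a matrix (list of rows) by a column vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# Check M v = lambda v for each eigenpair found above.
for lam, v in ((-1, [1, 2]), (4, [2, -1])):
    print(matvec(M, v) == [lam * x for x in v])   # prints True twice
```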
 
\[ M = \begin{pmatrix} 2 & 0 & 2 \\ 0 & 2 & 0 \\ 2 & 0 & -1 \end{pmatrix} \]
As in 3a, we can solve for the eigenvalues of this matrix.
\[ \begin{vmatrix} 2-\lambda & 0 & 2 \\ 0 & 2-\lambda & 0 \\ 2 & 0 & -1-\lambda \end{vmatrix} = (2-\lambda)^2(-1-\lambda) - 4(2-\lambda) = (2-\lambda)\big((2-\lambda)(-1-\lambda) - 4\big) \tag{36} \]
\[ = (2-\lambda)(\lambda^2 - \lambda - 2 - 4) = (2-\lambda)(\lambda^2 - \lambda - 6) = 0 \tag{37} \]
\[ = (2-\lambda)(\lambda-3)(\lambda+2) = 0 \tag{38} \]

Therefore our eigenvalues are \lambda = 2, 3, -2. We follow our previous method in finding the eigenvectors. For \lambda = 2:
\[ \begin{pmatrix} 0 & 0 & 2 \\ 0 & 0 & 0 \\ 2 & 0 & -3 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \tag{39} \]
We notice that v_1 = v_3 = 0; however, v_2 can be any value we would like it to be. As such, any multiple of \begin{pmatrix} 0 \\ 1 \\ 0 \end{pmatrix} is an eigenvector with eigenvalue \lambda = 2.
Now for \lambda = 3:
\[ \begin{pmatrix} -1 & 0 & 2 \\ 0 & -1 & 0 \\ 2 & 0 & -4 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \tag{40} \]
We solve this system of equations to obtain our eigenvector.
\[ 0 = -v_1 + 2v_3 \tag{41} \]
\[ 0 = -v_2 \tag{42} \]
\[ 0 = 2v_1 - 4v_3 \tag{43} \]
v_2 = 0, as indicated by the second row. However, we notice that equations (41) and (43) are multiples of each other. As such, v_1 = 2v_3, and any multiple of the vector \begin{pmatrix} 2 \\ 0 \\ 1 \end{pmatrix} is an eigenvector with eigenvalue \lambda = 3.
Finally we solve for our last eigenvalue, \lambda = -2.
\[ \begin{pmatrix} 4 & 0 & 2 \\ 0 & 4 & 0 \\ 2 & 0 & 1 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \\ v_3 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \tag{44} \]
A similar situation to the previous eigenvalue ensues: v_2 = 0 and the first and third rows both give v_3 = -2v_1, so any multiple of the vector \begin{pmatrix} 1 \\ 0 \\ -2 \end{pmatrix} is an eigenvector with \lambda = -2 as an eigenvalue.
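As before, each eigenpair can be checked directly against M v = λ v (illustrative only; integer arithmetic makes the check exact):

```python
M = [[2, 0, 2],
     [0, 2, 0],
     [2, 0, -1]]

def matvec(M, v):
    """Multiply a matrix (list of rows) by a column vector."""
    return [sum(m * x for m, x in zip(row, v)) for row in M]

# Each claimed eigenpair should satisfy M v = lambda v exactly.
pairs = ((2, [0, 1, 0]), (3, [2, 0, 1]), (-2, [1, 0, -2]))
for lam, v in pairs:
    print(matvec(M, v) == [lam * x for x in v])   # True for each pair
```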

4a. Diagonalize the Pauli spin matrix \sigma_y = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix}. That is, find the matrix D = C^{-1}\sigma_y C.
We notice that \sigma_y is Hermitian; as such, its eigenvalues are real and its eigenvectors are orthogonal, and if we normalize our eigenvectors we will produce a unitary matrix with our eigenvectors as its column vectors.
\[ \begin{vmatrix} -\lambda & -i \\ i & -\lambda \end{vmatrix} = \lambda^2 - 1 = 0 \tag{45} \]

We find that \lambda = \pm 1. Let us find the eigenvectors, starting with \lambda = 1.
\[ \begin{pmatrix} -1 & -i \\ i & -1 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \tag{46} \]
We solve this system to obtain our first eigenvector:
\[ -v_1 - iv_2 = 0 \tag{47} \]
\[ iv_1 - v_2 = 0 \tag{48} \]
Both equations reduce to
\[ v_1 = -iv_2 \tag{49} \]
Choosing v_2 = 1 gives
\[ v_1 = -i \tag{50} \]
so our eigenvector with eigenvalue \lambda = 1 is \begin{pmatrix} -i \\ 1 \end{pmatrix}. Let us now solve for \lambda = -1.
    
\[ \begin{pmatrix} 1 & -i \\ i & 1 \end{pmatrix} \begin{pmatrix} v_1 \\ v_2 \end{pmatrix} = \begin{pmatrix} 0 \\ 0 \end{pmatrix} \tag{51} \]
Let us solve for our second eigenvector:
\[ v_1 - iv_2 = 0 \tag{52} \]
\[ iv_1 + v_2 = 0 \tag{53} \]
Both equations reduce to
\[ v_1 = iv_2 \tag{54} \]

Our eigenvector with eigenvalue \lambda = -1 is \begin{pmatrix} i \\ 1 \end{pmatrix}. Let us define the matrix U = \frac{1}{\sqrt{2}}\begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix}. We note that U is unitary:
\[ U^\dagger U = \frac{1}{2} \begin{pmatrix} -i & 1 \\ i & 1 \end{pmatrix} \begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} \tag{56} \]
     
\[ \sigma_y U = \frac{1}{\sqrt{2}} \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} \begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix} = \frac{1}{\sqrt{2}} \begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix} \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix} \tag{57} \]
We multiply by U^{-1} from the left and obtain
\[ U^{-1}\sigma_y U = \frac{1}{2} \begin{pmatrix} -i & 1 \\ i & 1 \end{pmatrix} \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} \begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix} = \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix} \tag{58} \]

Equation (57) comes from the fact that, because the columns of U are eigenvectors of \sigma_y, \sigma_y U is equivalent to U times a matrix that multiplies each column by a scalar, which is what the matrix D = \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix} does. Equation (58) follows after we multiply by U^{-1} = U^\dagger.
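The whole diagonalization can be verified numerically with Python's complex arithmetic (a sketch with our own helper names):

```python
from math import sqrt

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

s = 1 / sqrt(2)
U = [[s * 1j, -s * 1j],
     [s, s]]
U_dag = [[-s * 1j, s],
         [s * 1j, s]]          # conjugate transpose; equals U^{-1} since U is unitary
sigma_y = [[0, -1j], [1j, 0]]

I2 = matmul(U_dag, U)                   # should be ~ identity
D = matmul(U_dag, matmul(sigma_y, U))   # should be ~ diag(-1, 1)
```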
−1 −1
b. What are U^{-1}\sigma_x U and U^{-1}\sigma_z U?
      
\[ U^{-1}\sigma_x U = \frac{1}{2} \begin{pmatrix} -i & 1 \\ i & 1 \end{pmatrix} \begin{pmatrix} 0 & 1 \\ 1 & 0 \end{pmatrix} \begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix} = \frac{1}{2} \begin{pmatrix} -i & 1 \\ i & 1 \end{pmatrix} \begin{pmatrix} 1 & 1 \\ i & -i \end{pmatrix} \tag{59} \]
\[ = \frac{1}{2} \begin{pmatrix} 0 & -2i \\ 2i & 0 \end{pmatrix} = \begin{pmatrix} 0 & -i \\ i & 0 \end{pmatrix} \tag{60} \]
We notice that
\[ U^{-1}\sigma_x U = \sigma_y \tag{61} \]
Let us attempt \sigma_z:
\[ U^{-1}\sigma_z U = \frac{1}{2} \begin{pmatrix} -i & 1 \\ i & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & -1 \end{pmatrix} \begin{pmatrix} i & -i \\ 1 & 1 \end{pmatrix} = \frac{1}{2} \begin{pmatrix} -i & 1 \\ i & 1 \end{pmatrix} \begin{pmatrix} i & -i \\ -1 & -1 \end{pmatrix} \tag{62} \]
\[ = \frac{1}{2} \begin{pmatrix} 0 & -2 \\ -2 & 0 \end{pmatrix} = -\sigma_x \tag{63} \]
\[ U^{-1}\sigma_z U = -\sigma_x \tag{64} \]
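These two identities can be confirmed numerically as well (again an illustrative sketch):

```python
from math import sqrt

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

s = 1 / sqrt(2)
U = [[s * 1j, -s * 1j],
     [s, s]]
U_inv = [[-s * 1j, s],
         [s * 1j, s]]          # U^{-1} = U^dagger for unitary U
sigma_x = [[0, 1], [1, 0]]
sigma_y = [[0, -1j], [1j, 0]]
sigma_z = [[1, 0], [0, -1]]

conj_x = matmul(U_inv, matmul(sigma_x, U))   # expected ~ sigma_y
conj_z = matmul(U_inv, matmul(sigma_z, U))   # expected ~ -sigma_x
```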
