Test 2A 2022 Solutions

The document contains solutions to 4 questions: 1. It reads the QR factorisation of a matrix off the Maple output and computes the projections onto the subspace W and its orthogonal complement. 2. It defines the orthogonal complement of a subspace S and proves that any vector lying in both S and S^⊥ must be the zero vector. 3. It evaluates a linear transformation defined via a determinant and states what it means for that map to be linear. 4. It gives a counterexample showing that the map taking a matrix to its determinant is not linear.


1 Q1

Q1a
The first line of the Maple output computes the QR factorisation. The columns of the matrix Q form an orthonormal basis for the subspace W:

u1 = (1/√7, 2/√7, 1/√7, −1/√7)   and   u2 = (3/√14, 0, −√14/7, 1/√14).
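As a quick numerical check (a numpy sketch, not part of the original Maple session), one can confirm that u1 and u2 are indeed orthonormal:

    import numpy as np

    # The two basis vectors read off from the columns of Q in Q1a.
    u1 = np.array([1, 2, 1, -1]) / np.sqrt(7)
    u2 = np.array([3 / np.sqrt(14), 0, -np.sqrt(14) / 7, 1 / np.sqrt(14)])

    # Unit length and mutual orthogonality (up to floating-point rounding).
    print(np.dot(u1, u1))  # -> 1.0
    print(np.dot(u2, u2))  # -> 1.0
    print(np.dot(u1, u2))  # -> 0.0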

Q1b
The matrix of the projection is QQ^T. The second line of the Maple output computes this matrix. The product QQ^T e1 gives the first column of this matrix and the product QQ^T e2 gives the second column. Hence, the answer is the sum of the first and second columns of QQ^T:

P_W(e1 + e2) = (15/14, 6/7, 0, −3/14).
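Equivalently, since {u1, u2} is an orthonormal basis of W, the same vector can be obtained directly from the basis, without assembling the full matrix QQ^T:

P_W(e1 + e2) = ⟨e1 + e2, u1⟩ u1 + ⟨e1 + e2, u2⟩ u2 = (3/√7) u1 + (3/√14) u2 = (15/14, 6/7, 0, −3/14).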

Q1c
We know that P_W + P_{W⊥} = I, where I is the identity map. Hence, we find

P_{W⊥}(e1 + e2) = (e1 + e2) − P_W(e1 + e2)
⇒ P_{W⊥}(e1 + e2) = (−1/14, 1/7, 0, 3/14).
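Both projections can also be double-checked numerically (again a numpy sketch, independent of the original Maple worksheet):

    import numpy as np

    # Assemble Q from the orthonormal basis of W found in Q1a.
    u1 = np.array([1, 2, 1, -1]) / np.sqrt(7)
    u2 = np.array([3 / np.sqrt(14), 0, -np.sqrt(14) / 7, 1 / np.sqrt(14)])
    Q = np.column_stack([u1, u2])

    v = np.array([1.0, 1.0, 0.0, 0.0])  # e1 + e2
    P_W = Q @ Q.T                       # projection matrix onto W
    print(P_W @ v)        # -> [ 15/14, 6/7, 0, -3/14 ]
    print(v - P_W @ v)    # -> [ -1/14, 1/7, 0, 3/14 ], the projection onto W-perp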

2 Q2

Q2a
The orthogonal complement S^⊥ is defined as follows:

S^⊥ = { x ∈ R^n : x · y = 0 for all y ∈ S }.

Q2b
Since x ∈ S^⊥, the vector x is perpendicular to every element of S. Since x ∈ S, the vector x is perpendicular to itself:

x · x = 0.

Write x = (x1, x2, ..., xn). We know that x · x = x1^2 + x2^2 + ... + xn^2. Hence, since xj ∈ R for j = 1, ..., n,

x · x = 0 ⇒ x1^2 + x2^2 + ... + xn^2 = 0
⇒ x1 = x2 = ... = xn = 0
⇒ x = 0.

QED

3 Q3

Q3a
Using cofactor expansion along the second column (matrices written row by row), we compute

T(2e1) = det([[a1, 2, b1], [a2, 0, b2], [a3, 0, b3]]) = −2 det([[a2, b2], [a3, b3]]) = −2 a2 b3 + 2 a3 b2.

Q3b
It means that

T(λx + µy) = λ T(x) + µ T(y)   for all λ, µ ∈ C and all x, y ∈ C^3.
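Assuming, as the computation in Q3a suggests, that T(x) = det([a x b]) with fixed columns a, b ∈ C^3, both the value T(2e1) and the linearity property can be verified symbolically. The sympy sketch below is not part of the original solution, and the helper T and symbol names are introduced only for illustration:

    import sympy as sp

    a1, a2, a3, b1, b2, b3 = sp.symbols('a1 a2 a3 b1 b2 b3')
    x1, x2, x3, y1, y2, y3, lam, mu = sp.symbols('x1 x2 x3 y1 y2 y3 lam mu')

    # Assumed form of the map from Q3a: the argument is the middle column.
    def T(x):
        return sp.Matrix([[a1, x[0], b1],
                          [a2, x[1], b2],
                          [a3, x[2], b3]]).det()

    # Q3a: T(2e1) should equal -2*a2*b3 + 2*a3*b2.
    print(sp.expand(T([2, 0, 0])))   # -> -2*a2*b3 + 2*a3*b2 (up to term order)

    # Q3b: T(lam*x + mu*y) - (lam*T(x) + mu*T(y)) should vanish identically.
    x = [x1, x2, x3]
    y = [y1, y2, y3]
    diff = T([lam*xi + mu*yi for xi, yi in zip(x, y)]) - (lam*T(x) + mu*T(y))
    print(sp.simplify(diff))         # -> 0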

4 Q4

Q4a
One possible choice is as follows (matrices written row by row):

A = [[1, 0, 0], [0, 1, 0], [0, 0, 0]],   B = [[0, 0, 0], [0, 0, 0], [0, 0, 1]]   and   A + B = I3

⇒ det A = 0, det B = 0 and det(A + B) = 1 ≠ 0 = det A + det B.
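A quick numerical check of this counterexample (a small numpy sketch, not part of the original solution):

    import numpy as np

    # The counterexample from Q4a.
    A = np.diag([1.0, 1.0, 0.0])
    B = np.diag([0.0, 0.0, 1.0])

    print(np.linalg.det(A))       # -> 0.0
    print(np.linalg.det(B))       # -> 0.0
    print(np.linalg.det(A + B))   # -> 1.0, while det(A) + det(B) = 0.0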

Q4b
The above example shows that this map is NOT linear, since det(A + B) ≠ det A + det B.
