Linear Algebra Question Paper

This document contains an in-semester exam for a linear algebra and optimization course. The exam has 4 questions: 1) determining which sets associated with a vector space form a vector space, 2) finding a linear transformation between vector spaces, 3) showing that a matrix whose eigenvalues are all zero is diagonalizable if and only if it is the zero matrix, and 4) properties of a linear transformation that is a projection operator. Worked solutions to all questions are included.

Uploaded by Shripad Bhat

SC505 - Linear Algebra & Optimization

Autumn 2019/20
In-Sem 1, 4th September 2019
Duration: 90 minutes. Total Marks: 20

Note: You may use any result/example derived/stated/discussed in class without proof.

1. Let (V, +, ·) be a finite dimensional real vector space. Which of the following sets S associated
with V, with corresponding addition and scalar multiplication operations (+S , ·S ) defined below,
form a vector space? Verify only (i) closure with respect to addition and scalar multiplication, (ii)
existence and uniqueness of the additive identity and additive inverses, and (iii) distributivity of
scalar multiplication over addition. In case any one of these is violated, do not check the rest. [3+3]

(a) S = {W ⊂ V | W is a subspace of V } is the set of all subspaces of V, +S is the usual sum of
subspaces, and ·S is the scalar multiplication defined as:

∀ a ∈ R, ∀W ∈ S, a ·S W := { a · w | ∀w ∈ W }.

(b) Let the addition of any vector v ∈ V and any subspace W of V be defined as v + W := {v +
w | ∀w ∈ W }, and denoted as vW . Let U be a fixed subspace of V, and let S = {vU | ∀v ∈ V }
on which addition and scalar multiplication are defined as follows:

∀ pU , qU ∈ S, pU +S qU := ( p + q)U
∀ a ∈ R, ∀ pU ∈ S, a ·S pU := ( a · p)U .

2. If possible, find a linear transformation T : R3 → R2 such that T ((−2, 1, 0)) = (2, 4), T ((1, −3, 2)) =
(1, −2) and T ((−3, −1, 2)) = (−4, 6). Justify. [3]

3. Let A be a square matrix such that all its eigenvalues are zero. Show that A is diagonalizable if
and only if A = 0, the zero matrix. [4]

4. Let V be a finite dimensional vector space, and T ∈ L(V ) such that T² = T. Show that [3+2+2]

(a) V = R( T ) ⊕ N ( T ).
(b) T is a projection on R( T ) along N ( T ), i.e., ∀v ∈ R( T ), Tv = v, and ∀v ∈ N ( T ), Tv = θ.
(c) T is diagonalizable. Compute its diagonal representation.


Solutions
1. Sets S associated with a finite dimensional vector space (V, +, ·).

(a) S = {W ⊂ V | W is a subspace of V } is the set of all subspaces of V.
Note that the subspace {θ } is the additive identity. In checking whether (S, +S ) is an abelian
group, specifically the existence of additive inverses, notice that for any subspaces U, W ∈ S,
U ⊂ U + W and W ⊂ U + W, so U + W = {θ } would force U = W = {θ }. Thus the only
subspace with an additive inverse is {θ }. Hence, S is not a vector space.
(b) S = {vU | ∀v ∈ V }.

i. Additive closure: ∀vU , wU ∈ S, vU +S wU = (v + w)U ∈ S, since v + w ∈ V.
ii. Scalar multiplication closure: ∀ a ∈ R, ∀vU ∈ S, a ·S vU = ( av)U ∈ S, since av ∈ V.
iii. Additive commutativity: ∀vU , wU ∈ S, vU +S wU = (v + w)U = (w + v)U = wU +S vU .
iv. Additive associativity: ∀vU , wU , qU ∈ S, vU +S (wU +S qU ) = vU +S (w + q)U = (v +
w + q)U = ((v + w) + q)U = (v + w)U +S qU = (vU +S wU ) +S qU .
v. Additive identity: Let Θ := θU = U. Note that ∀vU ∈ S, vU +S U = (v + θ )U = vU . In
fact, Θ = uU , for any u ∈ U.
vi. Additive inverse: Let vU ∈ S. Then vU +S (−v)U = (v + (−v))U = Θ.
vii. Multiplicative identity: ∀vU ∈ S, 1 ·S vU = (1 · v)U = vU .
viii. Distributivity: ∀vU , wU ∈ S, ∀ a ∈ R, a ·S (vU +S wU ) = a ·S (v + w)U = ( a · (v + w))U =
( a · v + a · w)U = ( a · v)U +S ( a · w)U = a ·S vU +S a ·S wU .
ix. Distributivity: ∀ a, b ∈ R, ∀vU ∈ S, ( a + b) ·S vU = (( a + b) · v)U = ( a · v + b · v)U =
( a · v)U +S (b · v)U = a ·S vU +S b ·S vU .
Thus S is a vector space; it is called the quotient space of V by U (denoted V/U).
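The well-definedness of +S and ·S on cosets can be checked numerically. The following sketch (my own assumed setup, not part of the exam) takes V = R² and U = span{(1, 1)}, uses the fact that pU = qU iff p − q ∈ U, and verifies that the operations do not depend on the chosen representatives:

```python
import numpy as np

# Assumed setup: V = R^2, U = span{(1, 1)}, so S = V/U.
u_dir = np.array([1.0, 1.0])

def in_U(x):
    # x ∈ U iff x is a scalar multiple of u_dir, i.e. det[x | u_dir] = 0
    return abs(np.linalg.det(np.column_stack([x, u_dir]))) < 1e-9

p, q = np.array([2.0, 0.0]), np.array([0.0, 3.0])
p2, q2 = p + 5 * u_dir, q - 2 * u_dir   # other representatives of pU and qU
assert in_U(p2 - p) and in_U(q2 - q)    # same cosets

# +S and ·S give the same coset regardless of representative:
assert in_U((p2 + q2) - (p + q))        # (p' + q')U = (p + q)U
assert in_U(3 * p2 - 3 * p)             # (3 · p')U = (3 · p)U
print("coset addition and scalar multiplication are well-defined")
```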

2. Let c1 , c2 , c3 be the three input vectors on which T is defined. Note that if the three vectors were LI,
they could be mapped to arbitrary elements by a linear transformation, and thus such a linear
transformation would exist. In this case, 2c1 + c2 − c3 = θ, implying that the set {c1 , c2 , c3 } is not
LI. Let the corresponding outputs be denoted a1 , a2 , a3 . However, 2a1 + a2 − a3 = 2(2, 4) + (1, −2) − (−4, 6) = (9, 0) ≠ θ,
so any linear T would have to satisfy θ = T (θ ) = T (2c1 + c2 − c3 ) = 2a1 + a2 − a3 ≠ θ;
therefore such a linear transformation is not possible.
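The dependency argument above is easy to verify numerically; a minimal check with numpy (numpy assumed, not used in the exam itself):

```python
import numpy as np

c1, c2, c3 = np.array([-2, 1, 0]), np.array([1, -3, 2]), np.array([-3, -1, 2])
a1, a2, a3 = np.array([2, 4]), np.array([1, -2]), np.array([-4, 6])

# The inputs satisfy 2c1 + c2 - c3 = θ ...
assert np.array_equal(2 * c1 + c2 - c3, np.zeros(3, dtype=int))
# ... but the prescribed outputs violate the same relation:
print(2 * a1 + a2 - a3)  # [9 0], nonzero, so no linear T exists
```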

3. ⇒ Since A is diagonalizable with all eigenvalues zero, the diagonal matrix Λ in A = MΛM−1 is
the zero matrix, and thus A = MΛM−1 = 0.
⇐ Given that A ∈ Rn×n is the zero matrix, Rn = N ( A) = N ( A − 0 · In ), where In denotes the
n × n identity matrix. Thus any non-zero vector in Rn is an eigenvector of A with eigenvalue 0.
Let β be an arbitrary choice of basis of Rn , and let M be the matrix with vectors from β represented
in the standard basis in the columns. Then AM = MΛ, with Λ = 0, or A = M0M−1 . Thus A is
diagonalizable with all eigenvalues being zero.
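A concrete illustration of the forward direction (my own assumed example, not from the exam): A = [[0, 1], [0, 0]] has both eigenvalues zero but A ≠ 0, so by the result it must fail to be diagonalizable, and indeed its only eigenspace is one-dimensional:

```python
import numpy as np

A = np.array([[0.0, 1.0], [0.0, 0.0]])
# Both eigenvalues are zero even though A is not the zero matrix
assert np.allclose(np.linalg.eigvals(A), 0)

# The eigenspace N(A - 0·I) = N(A) has dimension 2 - rank(A) = 1 < 2,
# so no eigenbasis of R^2 exists and A is not diagonalizable.
assert np.linalg.matrix_rank(A) == 1
print("dim N(A) =", 2 - np.linalg.matrix_rank(A))
```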

4. Projection operator:

(a) V = R( T ) ⊕ N ( T ).
Let u ∈ R( T ) ∩ N ( T ). Then ∃v ∈ V with Tv = u, and Tu = θ. Now u = Tv = T²v = Tu = θ.
Therefore R( T ) ∩ N ( T ) = {θ }.
Let v ∈ V be arbitrary, and set w := Tv ∈ R( T ). Applying T to both sides gives T²v = Tw,
i.e., Tw = Tv = w (using T² = T). Thus T (v − w) = Tv − Tw = θ, so n := v − w ∈ N ( T ).
Hence v = w + n with w = Tv ∈ R( T ) and n ∈ N ( T ), proving
V = R( T ) ⊕ N ( T ).

(b) T is a projection on R( T ) along N ( T ).
Let v ∈ R( T ). Then ∃u ∈ V with Tu = v, and Tv = T²u = Tu = v. Also, by
definition, ∀n ∈ N ( T ), Tn = θ. Thus T is a projection on R( T ) along N ( T ).
(c) T is diagonalizable. Let β = { β 1 , . . . , β p } be a basis of R( T ) and let γ = {γ1 , . . . , γq } be a
basis of N ( T ). Since V = R( T ) ⊕ N ( T ), α = β ∪ γ is a basis of V. Since ∀v ∈ R( T ), Tv = v,
R( T ) is the eigenspace of eigenvalue 1, and N ( T ) is the eigenspace of eigenvalue 0. Thus α
is an eigenbasis of T, and the diagonal representation of T, [ T ]αα contains p ones followed by
q zeros on the diagonal.
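The structure in parts (a)-(c) can be illustrated numerically. The sketch below (an assumed example with p = 2, q = 1, not taken from the exam) builds an idempotent T by conjugating diag(1, 1, 0) with an invertible change-of-basis matrix M, then checks T² = T, Tv = v on R( T ), and the eigenvalue pattern of the diagonal representation:

```python
import numpy as np

D = np.diag([1.0, 1.0, 0.0])            # p = 2 ones, q = 1 zero
M = np.array([[1.0, 1.0, 0.0],          # assumed invertible basis change
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])
T = M @ D @ np.linalg.inv(M)            # idempotent by construction

assert np.allclose(T @ T, T)            # T^2 = T
w = T @ np.array([1.0, 2.0, 3.0])       # w ∈ R(T)
assert np.allclose(T @ w, w)            # Tv = v for v ∈ R(T)
# Eigenvalues are p ones and q zeros, matching the diagonal representation
assert np.allclose(np.sort(np.real(np.linalg.eigvals(T))), [0.0, 1.0, 1.0])
print("eigenvalues:", np.sort(np.real(np.linalg.eigvals(T))))
```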
