
MATH 413 Homework 2

David Chung
September 25, 2014
1. Is it possible that vectors v1, v2, v3 are linearly dependent, but the vectors w1 = v1 + v2, w2 = v2 + v3, and w3 = v3 + v1 are linearly independent?
Since v1, v2, v3 are linearly dependent, one of them is a linear combination of the other two; relabeling if necessary, suppose

    v1 = αv2 + βv3

with α, β being scalars. Then

    w1 = v1 + v2 = αv2 + βv3 + v2 = (α + 1)v2 + βv3
    w2 = v2 + v3
    w3 = v3 + v1 = v3 + αv2 + βv3 = αv2 + (β + 1)v3

Thus w1, w2, w3 are all linear combinations of v2, v3. Three vectors lying in the span of the two vectors v2, v3 cannot be linearly independent, so w1, w2, w3 must be linearly dependent. Hence this is not possible.
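As a quick numerical sanity check, the vectors below are an illustrative choice of a dependent triple (with α = 2, β = -1); the rank computation confirms that w1, w2, w3 come out linearly dependent as well:

    import numpy as np

    # Illustrative choice: v1 = 2*v2 - v3, so v1, v2, v3 are linearly dependent
    v2 = np.array([1.0, 0.0, 0.0])
    v3 = np.array([0.0, 1.0, 0.0])
    v1 = 2 * v2 - v3

    w1, w2, w3 = v1 + v2, v2 + v3, v3 + v1

    # Rank < 3 means the triple is linearly dependent
    print(np.linalg.matrix_rank(np.column_stack([v1, v2, v3])))  # 2
    print(np.linalg.matrix_rank(np.column_stack([w1, w2, w3])))  # 2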
2. Let u ∈ span(S1). Then u = α1u1 + · · · + αnun for some u1, . . . , un ∈ S1 and scalars α1, . . . , αn ∈ R. But if ui ∈ S1, then ui ∈ S2, so the right-hand side is also an element of span(S2). This implies u ∈ span(S2), and hence span(S1) ⊆ span(S2). For the second part, suppose span(S1) = V. By the first part span(S1) ⊆ span(S2), so V ⊆ span(S2). Since span(S2) ⊆ V as well, span(S2) = V.
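As an illustrative example (the sets below are my own choice, not taken from the problem), take S1 ⊆ S2 in R²; a rank check shows both spans are all of R², consistent with the conclusion:

    import numpy as np

    # Hypothetical example: S1 = {e1, e2}, S2 = S1 together with e1 + e2
    S1 = [np.array([1.0, 0.0]), np.array([0.0, 1.0])]
    S2 = S1 + [np.array([1.0, 1.0])]

    # span(S1) is already R^2, and enlarging the set cannot shrink the span
    print(np.linalg.matrix_rank(np.column_stack(S1)))  # 2, so span(S1) = R^2
    print(np.linalg.matrix_rank(np.column_stack(S2)))  # 2, so span(S2) = R^2 as well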

3a.
    [ 1  0  0 ]
    [ 0  1  0 ]
    [ 0  0  0 ]

3b.
    [ 1  0  0 ]
    [ 0  1  0 ]
    [ 0  0  1 ]

3c.
    [ cos(30°)  -sin(30°)  0 ]     [ √3/2  -1/2    0 ]
    [ sin(30°)   cos(30°)  0 ]  =  [ 1/2    √3/2   0 ]
    [ 0          0         1 ]     [ 0      0      1 ]
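A brief numerical check of 3c (this just re-evaluates the matrix above at 30°):

    import numpy as np

    theta = np.radians(30)
    R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                  [np.sin(theta),  np.cos(theta), 0.0],
                  [0.0,            0.0,           1.0]])

    expected = np.array([[np.sqrt(3) / 2, -1 / 2,         0.0],
                         [1 / 2,          np.sqrt(3) / 2, 0.0],
                         [0.0,            0.0,            1.0]])

    print(np.allclose(R, expected))  # True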




4. Let

    A = [ 1  0 ]    and    B = [ 0  0 ]
        [ 0  0 ]               [ 1  1 ]

Then

    AB = [ 1  0 ] [ 0  0 ]  =  [ 1·0 + 0·1   1·0 + 0·1 ]  =  [ 0  0 ]
         [ 0  0 ] [ 1  1 ]     [ 0·0 + 0·1   0·0 + 0·1 ]     [ 0  0 ]

    BA = [ 0  0 ] [ 1  0 ]  =  [ 0·1 + 0·0   0·0 + 0·0 ]  =  [ 0  0 ]
         [ 1  1 ] [ 0  0 ]     [ 1·1 + 1·0   1·0 + 1·0 ]     [ 1  0 ]

so AB = 0 but BA ≠ 0; in particular AB ≠ BA.
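A direct numerical check of these two products (the matrices are exactly A and B above):

    import numpy as np

    A = np.array([[1.0, 0.0], [0.0, 0.0]])
    B = np.array([[0.0, 0.0], [1.0, 1.0]])

    print(A @ B)  # [[0. 0.] [0. 0.]] -- the zero matrix
    print(B @ A)  # [[0. 0.] [1. 0.]] -- not the zero matrix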



5a. Suppose

    A = [ a  b ]    and    x = [ s ]
        [ c  d ]               [ t ]

where x ≠ 0 and Ax = 0. Then

    Ax = [ as + bt ]  =  [ 0 ]
         [ cs + dt ]     [ 0 ]
So a = -bt/s, b = -as/t, c = -dt/s, and d = -cs/t. If we let α = -t/s and β = -s/t, then we get a = αb, b = βa, c = αd, and d = βc. These divisions only make sense when s ≠ 0 and t ≠ 0; since x ≠ 0, at least one of s, t is nonzero, but possibly only one of them. If t = 0 (so s ≠ 0), then a = c = 0, and if s = 0 (so t ≠ 0), then b = d = 0.
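A small numerical illustration of these relations (the singular matrix and null vector below are my own illustrative choice):

    import numpy as np

    # Hypothetical example: A is singular and x = (s, t) is a nonzero null vector
    A = np.array([[2.0, 1.0], [4.0, 2.0]])
    s, t = 1.0, -2.0
    x = np.array([s, t])
    print(A @ x)  # [0. 0.]

    a, b, c, d = A[0, 0], A[0, 1], A[1, 0], A[1, 1]
    alpha, beta = -t / s, -s / t
    print(a == alpha * b, c == alpha * d)  # True True
    print(b == beta * a, d == beta * c)    # True True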


 
So we let

    A^T = [ a  c ]    and    y = [ u ]
          [ b  d ]               [ v ]

where y ≠ 0. Then, using a = αb and c = αd from above,

    A^T y = [ a  c ] [ u ]  =  [ au + cv ]  =  [ α(bu + dv) ]
            [ b  d ] [ v ]     [ bu + dv ]     [  bu + dv   ]

Now we can see that the first entry is just a scalar multiple of the second entry. So we can try to solve for either u or v. Let's try solving for u.
    bu + dv = 0
    bu = -dv
    u = -dv/b

The formula u = -dv/b requires b ≠ 0. If b = 0, we can instead take v = 0 and any u ≠ 0, and then bu + dv = 0 automatically. Otherwise, let v be anything other than 0, say v = 1, so that u = -d/b. Then

    [ α(bu + dv) ]     [ α(b(-d/b) + d·1) ]     [ α(-d + d) ]     [ 0 ]
    [  bu + dv   ]  =  [   b(-d/b) + d·1  ]  =  [  -d + d   ]  =  [ 0 ]

so A^T y = 0 with y ≠ 0.

Without loss of generality, the same argument handles the remaining cases as well.
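A numerical illustration of this construction (the matrix below is a hypothetical singular example; y is built with v = 1 and u = -d/b as above):

    import numpy as np

    # Hypothetical singular A (as in 5a) with b nonzero
    A = np.array([[2.0, 1.0], [4.0, 2.0]])
    a, b, c, d = A[0, 0], A[0, 1], A[1, 0], A[1, 1]

    # Recipe from above: v = 1, u = -d/b
    y = np.array([-d / b, 1.0])
    print(A.T @ y)  # [0. 0.], so A^T y = 0 with y != 0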

5b. We want to show that Ax = 0 ⟺ A^T A x = 0. The leftward direction (A^T A x = 0 implies Ax = 0) can be broken down into two cases: the first is where A^T is invertible, the other is where A^T is not invertible.
Case 1 (A^T A x = 0 ⟹ Ax = 0, A^T invertible). Suppose A^T A x = 0 and assume, for contradiction, that Ax ≠ 0. Let B = Ax. Then

    A^T B = 0
    (A^T)⁻¹ (A^T B) = (A^T)⁻¹ 0
    ((A^T)⁻¹ A^T) B = 0
    B = 0

which contradicts B = Ax ≠ 0. Hence Ax = 0.

Case 2 (A^T A x = 0 ⟹ Ax = 0, A^T not invertible). Then ad = bc (the determinant is zero). Writing A^T A x = 0 out componentwise gives

    (a² + c²)x1 + (ab + cd)x2 = 0
    (ab + cd)x1 + (b² + d²)x2 = 0

so, solving the first equation for x1 and the second for x2,

    x1 = -(ab + cd)/(a² + c²) · x2
    x2 = -(ab + cd)/(b² + d²) · x1

Plug the expression for x1 into ax1 + bx2:

    ax1 + bx2 = -a(ab + cd)/(a² + c²) · x2 + bx2
              = -(a²b + acd)/(a² + c²) · x2 + bx2
              = -(a²b + c²b)/(a² + c²) · x2 + bx2        (using acd = c(ad) = c(bc) = c²b)
              = -b(a² + c²)/(a² + c²) · x2 + bx2
              = -bx2 + bx2
              = 0

Now plug the expression for x2 into cx1 + dx2:

    cx1 + dx2 = cx1 - d(ab + cd)/(b² + d²) · x1
              = cx1 - (abd + cd²)/(b² + d²) · x1
              = cx1 - (b²c + cd²)/(b² + d²) · x1         (using abd = b(ad) = b(bc) = b²c)
              = cx1 - c(b² + d²)/(b² + d²) · x1
              = cx1 - cx1
              = 0

So both components of Ax are zero, i.e. Ax = 0.

For the rightward direction (Ax = 0 ⟹ A^T A x = 0):

    Ax = 0
    A^T (Ax) = A^T · 0
    (A^T A) x = 0

Thus, Ax = 0 ⟺ A^T A x = 0.
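As a final numerical sanity check (the matrices below are illustrative choices, not part of the proof), a null vector of A is also a null vector of A^T A, while for an invertible A neither product vanishes:

    import numpy as np

    # Singular example: a null vector of A is also a null vector of A^T A
    A = np.array([[2.0, 1.0], [4.0, 2.0]])
    x = np.array([1.0, -2.0])
    print(A @ x)        # [0. 0.]
    print(A.T @ A @ x)  # [0. 0.]

    # Invertible example: neither A x nor A^T A x is zero for this nonzero x
    A = np.array([[1.0, 2.0], [0.0, 1.0]])
    print(A @ x)        # [-3. -2.]
    print(A.T @ A @ x)  # [-3. -8.]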
