
Math 271: Mathematical Methods Tridip Ray

Semester I, 2024-25 ISI, Delhi

Homework 2 (Class Test on 02 September)

1. Consider a system of m simultaneous linear equations in n unknowns, Ax = c, where

$$
A_{m \times n} = \begin{pmatrix}
a_{11} & a_{12} & \cdots & a_{1n} \\
a_{21} & a_{22} & \cdots & a_{2n} \\
\vdots & \vdots & \ddots & \vdots \\
a_{m1} & a_{m2} & \cdots & a_{mn}
\end{pmatrix}, \qquad
x_{n \times 1} = \begin{pmatrix} x_1 \\ x_2 \\ \vdots \\ x_n \end{pmatrix}, \qquad
c_{m \times 1} = \begin{pmatrix} c_1 \\ c_2 \\ \vdots \\ c_m \end{pmatrix}.
$$

Prove that the system of equations will have a solution for every choice of right-hand side (c_1, c_2, ..., c_m) if and only if

rank(A) = number of rows of A.
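As a quick numerical illustration of the claim (a hypothetical numpy sketch, not part of the required proof): when rank(A) equals the number of rows, every right-hand side tried is solvable.

```python
import numpy as np

# Hypothetical 2x3 matrix with full row rank: rank(A) = 2 = number of rows.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 3.0]])
assert np.linalg.matrix_rank(A) == A.shape[0]

# With full row rank, Ax = c should be solvable for every right-hand side c.
rng = np.random.default_rng(0)
for _ in range(5):
    c = rng.normal(size=A.shape[0])
    x, *_ = np.linalg.lstsq(A, c, rcond=None)  # least-squares solve
    assert np.allclose(A @ x, c)               # an exact solution was found
```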

2. For the same system of m simultaneous linear equations in n unknowns, Ax = c, as in Problem 1 above, suppose that the number of equations < the number of unknowns. Prove that

(a) Ax = 0 has infinitely many solutions;

(b) for any given c, Ax = c has 0 or infinitely many solutions;

(c) if rank(A) = number of equations, Ax = c has infinitely many solutions for every choice of right-hand side (c_1, c_2, ..., c_m).
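A hedged numpy sketch of parts (a) and (c) for a hypothetical wide system (2 equations, 3 unknowns): the null space supplies infinitely many homogeneous solutions, and with full row rank every c admits a one-parameter family of solutions.

```python
import numpy as np

# Hypothetical wide system: 2 equations, 3 unknowns, rank = number of equations.
A = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0]])
m, n = A.shape
assert np.linalg.matrix_rank(A) == m and m < n

# (a) Ax = 0: the last row of Vt spans the (1-dimensional) null space,
# so every scalar multiple of z solves the homogeneous system.
_, _, Vt = np.linalg.svd(A)
z = Vt[-1]
assert np.allclose(A @ z, 0)

# (c) one particular solution plus any multiple of z also solves Ax = c.
c = np.array([2.0, 3.0])
x0, *_ = np.linalg.lstsq(A, c, rcond=None)
for t in (0.0, 1.0, -5.0):
    assert np.allclose(A @ (x0 + t * z), c)
```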

3. Consider a system of m simultaneous linear equations in n unknowns, Ax = c, as in the above two problems.

(a) Prove that the system of equations must have either no solution, one solution, or infinitely many solutions.

(b) Prove that the system of equations will have at most one solution for every choice of right-hand side (c_1, c_2, ..., c_m) if and only if

rank(A) = number of columns of A.

(c) Prove that the system of equations has one and only one solution for every choice of right-hand side (c_1, c_2, ..., c_m) if and only if

number of rows of A = number of columns of A = rank(A).

A general linear model will have m equations in n variables:

$$
\begin{aligned}
a_{11} x_1 + a_{12} x_2 + \cdots + a_{1n} x_n &= c_1, \\
&\;\;\vdots \\
a_{m1} x_1 + a_{m2} x_2 + \cdots + a_{mn} x_n &= c_m.
\end{aligned} \tag{1}
$$

The variables whose values are determined by the system of equations (1) are called
endogenous variables. On the other hand, the variables whose values are determined
outside of system (1) are called exogenous variables. The division of the n variables
into endogenous and exogenous variables will be successful only if, after choosing values
for the exogenous variables and plugging them into system (1), one can then uniquely
solve the system for the endogenous variables.

(d) Let x_1, x_2, ..., x_k and x_{k+1}, x_{k+2}, ..., x_n be a partition of the n variables in (1) into endogenous and exogenous variables, respectively. Provide, with clear explanations, the necessary and sufficient conditions so that there is, for each choice of values x_{k+1}^0, x_{k+2}^0, ..., x_n^0 for the exogenous variables, a unique set of values x_1^0, x_2^0, ..., x_k^0 for the endogenous variables which solves (1).
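For intuition (a hypothetical toy model, assuming numpy): collect the coefficients on the endogenous variables into a block B and those on the exogenous variables into a block C; unique solvability for every choice of exogenous values then amounts to Problem 3(c) applied to B.

```python
import numpy as np

# Toy model with endogenous (x1, x2) and exogenous (x3):
#   1*x1 + 2*x2 + 1*x3 = c1
#   3*x1 + 1*x2 - 1*x3 = c2
B = np.array([[1.0, 2.0],
              [3.0, 1.0]])    # coefficients on endogenous variables
C = np.array([[1.0],
              [-1.0]])        # coefficients on exogenous variables
c = np.array([4.0, 2.0])

# Unique solvability for every (c, exogenous values) needs B square
# with full rank, i.e. rows = columns = rank (Problem 3(c) applied to B).
assert B.shape[0] == B.shape[1] == np.linalg.matrix_rank(B)

x_exo = np.array([1.5])                     # an arbitrary exogenous choice
x_endo = np.linalg.solve(B, c - C @ x_exo)  # the unique endogenous values
assert np.allclose(B @ x_endo + C @ x_exo, c)
```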

4. Let A be an m × n matrix, and let c be a vector in R^m. Prove that exactly one of the following two alternatives holds:

Either the system of equations Ax = c has a solution,

or, the system of equations yA = 0 and yc = 1 has a solution.
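A numerical illustration of the second alternative (a hypothetical example, assuming numpy): if the two rows of A coincide but the entries of c differ, Ax = c is infeasible, and a suitable y certifies this.

```python
import numpy as np

# Inconsistent system: equal rows of A but different right-hand sides.
A = np.array([[1.0, 1.0],
              [1.0, 1.0]])
c = np.array([1.0, 2.0])

# Second alternative: y = (-1, 1) satisfies yA = 0 and yc = 1.
y = np.array([-1.0, 1.0])
assert np.allclose(y @ A, 0)
assert np.isclose(y @ c, 1.0)

# And indeed no exact solution exists: the best fit leaves a residual.
x, *_ = np.linalg.lstsq(A, c, rcond=None)
assert not np.allclose(A @ x, c)
```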

5. Consider the matrix

$$
A = \begin{pmatrix} 2 & \sqrt{2} \\ \sqrt{2} & 1 \end{pmatrix}.
$$

(a) Find the eigenvalues and the normalized eigenvectors of A.

(b) Use this example to verify the Spectral Decomposition Theorem, that is, matrix A can be decomposed into a matrix L consisting of its eigenvalues on the diagonal and the matrices B and B^T consisting of its eigenvectors.
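A numerical check (assuming numpy, and reading the matrix as reconstructed above): `eigh` returns the eigenvalues together with orthonormal eigenvectors, and the decomposition reproduces A.

```python
import numpy as np

# The matrix from Problem 5 (as reconstructed here).
A = np.array([[2.0,          np.sqrt(2.0)],
              [np.sqrt(2.0), 1.0         ]])

evals, B = np.linalg.eigh(A)   # columns of B: normalized eigenvectors
L = np.diag(evals)

assert np.allclose(np.sort(evals), [0.0, 3.0])   # eigenvalues 0 and 3
assert np.allclose(B.T @ B, np.eye(2))           # B^{-1} = B^T
assert np.allclose(B @ L @ B.T, A)               # spectral decomposition
assert np.allclose(B.T @ A @ B, L)               # equivalently B^T A B = L
```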

6. We have so far studied the Spectral Decomposition Theorem for symmetric matrices with distinct eigenvalues. The theorem can be extended to accommodate non-distinct eigenvalues as follows.

Let A be an n × n symmetric matrix with eigenvalues λ_1, λ_2, ..., λ_n. In this listing, an eigenvalue is repeated a number of times equal to its multiplicity. Thus, if one eigenvalue has multiplicity k, there will be k eigenvalues with the same numerical value.

Theorem (Spectral Decomposition):
Let A be an n × n symmetric matrix with eigenvalues λ_1, λ_2, ..., λ_n. Even if A has multiple eigenvalues, there exists a nonsingular matrix B whose columns y^1, y^2, ..., y^n are eigenvectors of A such that

(i) y^1, y^2, ..., y^n are mutually orthogonal to each other,

(ii) B^{-1} = B^T, and

(iii) B^T A B = L, where L is the diagonal matrix with the eigenvalues of A (λ_1, λ_2, ..., λ_n) on its diagonal.

Question:
Let A be an n × n symmetric matrix. Show that if 0 is an eigenvalue of A of multiplicity k, then rank(A) = n − k.
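A hedged numerical check of the claim (assuming numpy): construct a hypothetical 3 × 3 symmetric matrix whose eigenvalue 0 has multiplicity k = 2 and compare its rank with n − k.

```python
import numpy as np

# Build A = B diag(3, 0, 0) B^T with B orthogonal, so 0 is an eigenvalue
# of multiplicity k = 2 for this n = 3 symmetric matrix.
rng = np.random.default_rng(1)
B, _ = np.linalg.qr(rng.normal(size=(3, 3)))
A = B @ np.diag([3.0, 0.0, 0.0]) @ B.T

n, k = 3, 2
assert np.allclose(A, A.T)                 # symmetric
assert np.linalg.matrix_rank(A) == n - k   # rank(A) = n - k
```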

7. In this problem you will prove, using the method of induction, the following theorem:

Let λ_1, λ_2, ..., λ_k be k distinct eigenvalues of the n × n matrix A. Let x^1, x^2, ..., x^k be the corresponding eigenvectors. Then, x^1, x^2, ..., x^k are linearly independent vectors. [Note that A may not be a symmetric matrix.]

(a) Initial step: Prove that the theorem is true for k = 2.

(b) Inductive step: Define the inductive step carefully and then prove it to complete the proof of the theorem.
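A small numerical instance of the theorem (assuming numpy; the matrix is hypothetical and deliberately nonsymmetric): distinct eigenvalues come with a full-rank, hence linearly independent, set of eigenvectors.

```python
import numpy as np

# A nonsymmetric matrix with distinct eigenvalues 2 and 5.
A = np.array([[2.0, 1.0],
              [0.0, 5.0]])

evals, X = np.linalg.eig(A)   # columns of X are eigenvectors

assert len(np.unique(np.round(evals, 8))) == len(evals)  # distinct eigenvalues
assert np.linalg.matrix_rank(X) == A.shape[0]            # independent columns
```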

8. Theorem: Let A be a symmetric n × n matrix. A is negative definite if and only if all its n leading principal minors alternate in sign, starting with negative. (That is, the r-th leading principal minor, |A_r|, r = 1, 2, ..., n, has the same sign as (−1)^r.)
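Before the proof, a numerical sanity check of the statement (assuming numpy; the matrix is a hypothetical negative definite example built as −(M^T M + I)).

```python
import numpy as np

# A hypothetical negative definite matrix: -(M^T M + I) for random M.
rng = np.random.default_rng(3)
M = rng.normal(size=(3, 3))
A = -(M.T @ M + np.eye(3))
assert np.all(np.linalg.eigvalsh(A) < 0)   # negative definite

# Its leading principal minors alternate in sign, starting negative.
for r in range(1, 4):
    minor = np.linalg.det(A[:r, :r])
    assert np.sign(minor) == (-1) ** r
```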

In the following steps we will prove this theorem. The proof has two major ingredients: the principle of induction and the theory of partitioned matrices. First, a brief introduction to the theory of partitioned matrices.

Partitioned Matrices: Let A be an m × n matrix. A submatrix of A is a matrix formed by discarding some entire rows and/or columns of A. A partitioned matrix is a matrix which has been partitioned into submatrices by horizontal and/or vertical lines which extend along entire rows or columns of A. For example,

$$
A = \begin{pmatrix}
a_{11} & a_{12} & a_{13} & a_{14} & a_{15} & a_{16} \\
a_{21} & a_{22} & a_{23} & a_{24} & a_{25} & a_{26} \\
a_{31} & a_{32} & a_{33} & a_{34} & a_{35} & a_{36}
\end{pmatrix},
$$

which we can write as

$$
A = \begin{pmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \end{pmatrix}.
$$

Each submatrix A_{ij} is called a block of A.

Suppose that A and B are two m × n matrices which are partitioned in the same way, that is,

$$
A = \begin{pmatrix} A_{11} & A_{12} & A_{13} \\ A_{21} & A_{22} & A_{23} \end{pmatrix}
\quad \text{and} \quad
B = \begin{pmatrix} B_{11} & B_{12} & B_{13} \\ B_{21} & B_{22} & B_{23} \end{pmatrix},
$$

where A_{11} and B_{11} have the same dimensions, A_{12} and B_{12} have the same dimensions, and so on. Then A and B can be added as if the blocks are scalar entries:

$$
A + B = \begin{pmatrix}
A_{11}+B_{11} & A_{12}+B_{12} & A_{13}+B_{13} \\
A_{21}+B_{21} & A_{22}+B_{22} & A_{23}+B_{23}
\end{pmatrix}.
$$

Similarly, two partitioned matrices A and B can be multiplied, treating the blocks as scalars, provided the blocks are of sizes such that the matrix multiplication of blocks can be done. For example, if

$$
A = \begin{pmatrix} A_{11} & A_{12} \\ A_{21} & A_{22} \end{pmatrix}
\quad \text{and} \quad
B = \begin{pmatrix} B_{11} & B_{12} & B_{13} \\ B_{21} & B_{22} & B_{23} \end{pmatrix},
$$

then

$$
AB = \begin{pmatrix}
A_{11}B_{11}+A_{12}B_{21} & A_{11}B_{12}+A_{12}B_{22} & A_{11}B_{13}+A_{12}B_{23} \\
A_{21}B_{11}+A_{22}B_{21} & A_{21}B_{12}+A_{22}B_{22} & A_{21}B_{13}+A_{22}B_{23}
\end{pmatrix},
$$

so long as the various matrix products A_{ij}B_{jk} can be formed. For example, A_{11} must have as many columns as B_{11} has rows, and so on.
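The block-multiplication rule can be checked numerically (a hypothetical sketch assuming numpy, using a compatible 2 × 2 grid of blocks): multiplying block-by-block, treating blocks as scalars, agrees with the ordinary product.

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.normal(size=(4, 4))   # partitioned into four 2x2 blocks
B = rng.normal(size=(4, 3))   # partitioned compatibly

A11, A12, A21, A22 = A[:2, :2], A[:2, 2:], A[2:, :2], A[2:, 2:]
B11, B12, B21, B22 = B[:2, :2], B[:2, 2:], B[2:, :2], B[2:, 2:]

# Block-by-block product, treating blocks as scalar entries.
top    = np.hstack([A11 @ B11 + A12 @ B21, A11 @ B12 + A12 @ B22])
bottom = np.hstack([A21 @ B11 + A22 @ B21, A21 @ B12 + A22 @ B22])
assert np.allclose(np.vstack([top, bottom]), A @ B)
```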

Next we need two lemmas.

– Lemma 1: If A is a positive or negative definite matrix, then A is nonsingular.

– Lemma 2: Suppose that A is a symmetric matrix and that Q is a nonsingular matrix. Then, Q^T A Q is a symmetric matrix, and A is positive (negative) definite if and only if Q^T A Q is positive (negative) definite.

Now we proceed to prove the theorem by using induction on the size n of A. The result is trivially true for 1 × 1 matrices. It is straightforward to verify the theorem (you do not have to do it) directly for 2 × 2 symmetric matrices by completing the square in the corresponding quadratic form on R^2:

$$
f(x_1, x_2) = (x_1, x_2) \begin{pmatrix} a & b \\ b & c \end{pmatrix} \begin{pmatrix} x_1 \\ x_2 \end{pmatrix}
= a x_1^2 + 2b x_1 x_2 + c x_2^2.
$$

So we will suppose that the theorem is true for n × n matrices and prove it to be true for (n + 1) × (n + 1) matrices.

Let A be an (n + 1) × (n + 1) symmetric matrix. Write A_j for the j × j leading principal submatrix of A for j = 1, 2, ..., n + 1. By the inductive hypothesis the theorem is true for n × n matrices. In part (a) we will prove that if the sign of |A_r| is the same as (−1)^r, r = 1, 2, ..., n + 1, then A is negative definite. In part (b) we will prove the converse: A is negative definite implies that the sign of |A_r| is the same as (−1)^r, r = 1, 2, ..., n + 1.

(a) The inductive hypothesis is given, and assume that the sign of |A_r| is the same as (−1)^r, r = 1, 2, ..., n + 1.

(i) Argue that A_n is invertible.

– Partition A as

$$
A = \begin{pmatrix} A_n & a \\ a^T & a_{n+1,n+1} \end{pmatrix},
\qquad \text{where } a = \begin{pmatrix} a_{1,n+1} \\ \vdots \\ a_{n,n+1} \end{pmatrix}.
$$

Let d = a_{n+1,n+1} − a^T (A_n)^{-1} a, let I_n denote the n × n identity matrix, and let 0_n denote the n × 1 column vector of all 0s.

(ii) Verify that

$$
A = \begin{pmatrix} I_n & 0_n \\ (A_n^{-1} a)^T & 1 \end{pmatrix}
\begin{pmatrix} A_n & 0_n \\ 0_n^T & d \end{pmatrix}
\begin{pmatrix} I_n & A_n^{-1} a \\ 0_n^T & 1 \end{pmatrix}
\equiv Q^T B Q.
$$

(iii) Show that |A| = d |A_n|, and argue that d < 0.

(iv) Let X be an arbitrary (n + 1)-vector. Write $X = \begin{pmatrix} x \\ x_{n+1} \end{pmatrix}$, where x is an n-vector. Argue that X^T B X = x^T A_n x + d (x_{n+1})^2 < 0.

(v) Conclude that A is negative definite.
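Steps (ii) and (iii) can be sanity-checked numerically (a hypothetical 3 × 3 example assuming numpy, so A_n is the 2 × 2 leading block): the factorization A = Q^T B Q holds, and |A| = d |A_n|.

```python
import numpy as np

# A hypothetical 3x3 symmetric matrix with An (its 2x2 leading block) invertible.
rng = np.random.default_rng(4)
M = rng.normal(size=(3, 3))
A = -(M.T @ M + np.eye(3))

An, a = A[:2, :2], A[:2, 2]
d = A[2, 2] - a @ np.linalg.solve(An, a)   # d = a_{n+1,n+1} - a^T An^{-1} a

Q = np.block([[np.eye(2),        np.linalg.solve(An, a)[:, None]],
              [np.zeros((1, 2)), np.ones((1, 1))]])
B = np.block([[An,               np.zeros((2, 1))],
              [np.zeros((1, 2)), np.array([[d]])]])

assert np.allclose(Q.T @ B @ Q, A)                          # step (ii)
assert np.isclose(np.linalg.det(A), d * np.linalg.det(An))  # step (iii)
```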

(b) The inductive hypothesis is given, and assume that A is negative definite. (Note that A is (n + 1) × (n + 1).)

(i) Prove that A_n is negative definite.

(ii) Argue that the sign of |A_r| is the same as (−1)^r, r = 1, 2, ..., n.

– So we need to prove only that the sign of the determinant of A itself is (−1)^{n+1}.

– Since A_n is invertible, we can once again write A as Q^T B Q as in part (a)(ii) and conclude that |A| = d |A_n| still holds.

(iii) Argue that B is negative definite.

(iv) Choose X suitably in part (a)(iv) to show that d < 0.

(v) Conclude that the sign of |A| is (−1)^{n+1}, that is, the sign of |A_r| is the same as (−1)^r, r = 1, 2, ..., n + 1.
