
Chapter I

Techniques Related to
Matrix Calculations

Dr. Hasnaoui Abir


École Supérieure en Génie Électrique et
Energétique d'Oran
I. Systems of Linear Equations

Linear systems of equations are fundamental concepts in mathematics, possessing


extensive applications across diverse fields. They play a pivotal role in modeling and
controlling dynamic systems, particularly in areas such as robotics and automation.

A single linear equation in n variables x1, x2, ..., xn is an equation of the form:

a1x1 + a2x2 + ··· + anxn = b

where a1, a2, ..., an and b are constant real or complex numbers.

The constant ai is called the coefficient of xi and b is called the constant term of the
equation.
A system of linear equations is a finite collection of linear equations in the same variables. For instance, a linear system of m equations in n variables x1, x2, ..., xn can be written as:

a11x1 + a12x2 + ··· + a1nxn = b1
a21x1 + a22x2 + ··· + a2nxn = b2
...
am1x1 + am2x2 + ··· + amnxn = bm

A linear system is called a homogeneous linear system if: b1 = b2 = ··· = bm = 0.

A system that does not have any solution is called an inconsistent system.

The linear system can be written in matrix form as AX = B, where A = (aij) is the m×n coefficient matrix, X is the column vector of unknowns, and B is the column vector of constant terms.

If A is an invertible square matrix of order n, then the system of equations whose matrix representation is AX = B has a unique solution: X = A⁻¹B.
If A is not invertible, the system has either infinitely many solutions or no solution at all.
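As a numerical sketch of the unique-solution case, NumPy can be used (the matrix A and vector B below are illustrative values, not taken from the text):

```python
import numpy as np

# An invertible 2x2 coefficient matrix and a right-hand side (illustrative values)
A = np.array([[2.0, 1.0],
              [1.0, 3.0]])
B = np.array([5.0, 10.0])

# Preferred in practice: solve AX = B directly (more stable than forming A^-1)
X = np.linalg.solve(A, B)

# Equivalent in exact arithmetic: X = A^-1 B
X_via_inverse = np.linalg.inv(A) @ B

assert np.allclose(X, X_via_inverse)
assert np.allclose(A @ X, B)  # the solution satisfies the system
```

Note that `np.linalg.solve` is preferred over explicitly computing the inverse, even though both express X = A⁻¹B.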
II. Basics of Matrix Calculations

1. Matrix Operations

Matrix operations are fundamental mathematical procedures that allow us to manipulate matrices; they include addition, subtraction, and multiplication.

Addition and Subtraction

Only matrices with the same dimensions can be added or subtracted; the sum and difference are taken element by element.
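A minimal NumPy sketch of element-wise addition and subtraction (the matrices are illustrative):

```python
import numpy as np

A = np.array([[1, 2],
              [3, 4]])
B = np.array([[5, 6],
              [7, 8]])

# Addition and subtraction require identical shapes
assert A.shape == B.shape

S = A + B   # element-wise sum
D = A - B   # element-wise difference
```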

Multiplication

If A is an m×n matrix and B is an n×k matrix, their product AB is an m×k matrix.
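The dimension rule can be checked with a small NumPy sketch (illustrative values; the inner dimensions must match):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6]])   # 2x3 matrix
B = np.array([[1, 0],
              [0, 1],
              [1, 1]])      # 3x2 matrix

# Inner dimensions (3 and 3) match, so the product is defined
P = A @ B                   # result is a 2x2 matrix
assert P.shape == (2, 2)
```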

2. Matrix transposition

The transpose of a matrix A, denoted Aᵀ, is formed by interchanging the rows and columns of A:
(Aᵀ)ij = aji
Properties:
(Aᵀ)ᵀ = A
(A ± B)ᵀ = Aᵀ ± Bᵀ
(cA)ᵀ = cAᵀ
(AB)ᵀ = BᵀAᵀ
If A is symmetric: Aᵀ = A
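These properties can be verified numerically with NumPy on random matrices (an illustrative check, not part of the original text):

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 4))
B = rng.standard_normal((4, 2))

assert np.allclose(A.T.T, A)              # (A^T)^T = A
assert np.allclose((2.0 * A).T, 2.0 * A.T)  # (cA)^T = c A^T
assert np.allclose((A @ B).T, B.T @ A.T)  # (AB)^T = B^T A^T, note the reversed order
```

The reversal of order in (AB)ᵀ = BᵀAᵀ is the property most often misremembered, which is why it is worth checking.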

3. Conjugate Transpose:

The conjugate transpose A* of an n×m complex matrix A is the m×n matrix obtained by transposing A and taking the complex conjugate of each entry (the complex conjugate of a + ib being a − ib).
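In NumPy the conjugate transpose can be formed by chaining `.conj()` and `.T` (an illustrative sketch):

```python
import numpy as np

A = np.array([[1 + 2j, 3 - 1j],
              [0 + 1j, 2 + 0j]])

# Conjugate transpose (also called the Hermitian transpose)
A_star = A.conj().T

# Entry (i, j) of A* is the conjugate of entry (j, i) of A
assert A_star[0, 1] == (0 + 1j).conjugate()
```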

4. Matrix Trace:

The trace of a square matrix A is defined to be the sum of the elements on its main diagonal. The trace is defined only for square (n × n) matrices.
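A quick NumPy illustration (the matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])

tr = np.trace(A)   # sum of the main diagonal: 1 + 5 + 9
```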

5. Invertible Matrices

The matrix M (n×n) is invertible if, and only if, there exists a matrix M⁻¹ such that MM⁻¹ = M⁻¹M = In, where M⁻¹ is the inverse of M and In is the n × n identity matrix.
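The defining identity MM⁻¹ = M⁻¹M = In can be checked numerically (the matrix M below is an illustrative invertible example):

```python
import numpy as np

M = np.array([[4.0, 7.0],
              [2.0, 6.0]])   # det(M) = 4*6 - 7*2 = 10, so M is invertible

M_inv = np.linalg.inv(M)
I2 = np.eye(2)

# Both products must give the identity matrix
assert np.allclose(M @ M_inv, I2)
assert np.allclose(M_inv @ M, I2)
```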
6. Special Matrices

Null Matrix

It is a matrix whose all elements are zeros. A zero matrix can be a square matrix, or it can also
have an unequal number of rows and columns.

Diagonal Matrix

It is a square matrix A with aij=0 when i≠j.

Identity Matrix

The n×n identity matrix has ones on its diagonal and zeros elsewhere. It is square, diagonal

and symmetric.

Triangular Matrix

A square matrix A is upper triangular if all its elements below the principal diagonal are zero.

A square matrix A is lower triangular if all its elements above the principal diagonal are zero.
Symmetric and Antisymmetric Matrices

Let A = (aij) be a square matrix:

If AT = A, A is a symmetric matrix;
If AT = -A, A is an antisymmetric (skew-symmetric) matrix.

Normal matrix

A complex square matrix A is normal if it commutes with its conjugate transpose A*.

A*A=AA*
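As an illustrative check: any Hermitian matrix (A* = A) is automatically normal, so it must commute with its conjugate transpose (the matrix below is an assumption for the example):

```python
import numpy as np

# A Hermitian matrix (equal to its own conjugate transpose), hence normal
A = np.array([[2 + 0j, 1 - 1j],
              [1 + 1j, 3 + 0j]])

A_star = A.conj().T
assert np.allclose(A_star, A)            # Hermitian
assert np.allclose(A_star @ A, A @ A_star)   # normal: A*A = AA*
```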

Unitary and Orthogonal Matrix

A square matrix A of order n is called unitary if and only if:

AAT=ATA=In

A square matrix A of order n is called orthogonal if and only if:

AA*=A*A=In
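A rotation matrix is a standard example of an orthogonal matrix, and a scaled complex matrix serves as a unitary example (both matrices below are illustrative choices):

```python
import numpy as np

theta = np.pi / 3
Q = np.array([[np.cos(theta), -np.sin(theta)],
              [np.sin(theta),  np.cos(theta)]])   # real rotation: orthogonal

U = (1 / np.sqrt(2)) * np.array([[1,   1],
                                 [1j, -1j]])      # complex: unitary

I2 = np.eye(2)
assert np.allclose(Q.T @ Q, I2)          # orthogonal: Q^T Q = I
assert np.allclose(U.conj().T @ U, I2)   # unitary: U* U = I
```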
Idempotent matrix

It is a matrix which, when multiplied by itself, equals itself.

A2=A

Exp: A = [3 -6; 1 -2] satisfies A² = A.
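The example from the text can be verified directly in NumPy:

```python
import numpy as np

A = np.array([[3, -6],
              [1, -2]])

# Idempotent: A squared returns A itself
assert np.array_equal(A @ A, A)
```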

Nilpotent matrix

It is a square matrix A such that Aᵏ = 0 for some positive integer k.
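A standard small example (an illustrative choice, not from the text) is a strictly upper triangular matrix, which is nilpotent with k = 2:

```python
import numpy as np

A = np.array([[0, 1],
              [0, 0]])

A2 = A @ A
# A is nonzero, but A^2 is the zero matrix: nilpotent with k = 2
assert np.array_equal(A2, np.zeros((2, 2)))
```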

Singular Matrix

A square matrix A is said to be singular if det A = 0 (also written |A| = 0); in that case A⁻¹ is not defined, i.e., A is non-invertible.

7. Eigenvalue and Eigenvectors

Eigenvalue

Let A be a square matrix. An eigenvalue of A is a number λ such that subtracting λ from each element of the main diagonal of A transforms A into a singular matrix.
λ is an eigenvalue of A if and only if:

det(A − λIn) = 0

Eigenvectors

When λ is an eigenvalue of A, a nonzero vector X such that (A − λIn)X = 0 is called an eigenvector of A associated with the eigenvalue λ.
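Both definitions can be illustrated with NumPy's eigensolver (the diagonal matrix A is an illustrative choice, so its eigenvalues are simply its diagonal entries):

```python
import numpy as np

A = np.array([[2.0, 0.0],
              [0.0, 3.0]])

eigenvalues, eigenvectors = np.linalg.eig(A)

# Each column of `eigenvectors` is an eigenvector for the matching eigenvalue:
# it satisfies (A - lambda*I) v = 0, equivalently A v = lambda v,
# and det(A - lambda*I) = 0 confirms A - lambda*I is singular.
for lam, v in zip(eigenvalues, eigenvectors.T):
    assert np.allclose(A @ v, lam * v)
    assert np.isclose(np.linalg.det(A - lam * np.eye(2)), 0.0)
```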
