
Course: ELL 701 - Mathematical Methods in Control
Instructor: M. Nabi

Basic Linear Algebra: Definitions and Concepts-I

This is for your first understanding of the concepts, and many definitions
are deliberately kept non-rigorous. Some of these are also taken from the
references. You are encouraged to look into the references and other sources
for mathematically rigorous definitions.

VECTOR SPACE

The following defines the notion of a vector space V where K is the field of
scalars.
Definition:

Let V be a nonempty set with two operations:

1. Vector addition: This assigns to any u, v ∈ V a sum u + v in V.

2. Scalar multiplication: This assigns to any u ∈ V, k ∈ K a product
ku ∈ V.

Then V is called a vector space (over the field K) if the following axioms
hold for any vectors u, v, w ∈ V:

1. (u + v) + w = u + (v + w)
2. There is a vector in V, denoted by 0 and called the zero vector, such
that, for any u ∈ V, u + 0 = u. The 0 vector is also called the
additive identity.
3. For each u ∈ V, there is a vector in V, denoted by −u and called the
negative of u, such that u + (−u) = (−u) + u = 0.

Figure 1: Vector Space.


4. u + v = v + u
5. k(u + v) = ku + kv, for any scalar k ∈ K
6. (a + b)u = au + bu, for any scalars a, b ∈ K
7. (ab)u = a(bu), for any scalars a, b ∈ K
8. 1u = u, for the unit scalar 1 ∈ K

In Fig. 1, u and v are two vectors in the set S. For S to be a vector space,
u + v also has to be in S.
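The eight axioms above can be spot-checked numerically for ℝ³ with numpy; this is only an illustrative sketch for particular sample vectors and scalars, not a proof that ℝ³ is a vector space.

```python
import numpy as np

# Sample vectors u, v, w in R^3 and scalars a, b in R.
u, v, w = np.array([1., 2., 3.]), np.array([-1., 0., 4.]), np.array([2., 2., 2.])
a, b = 3.0, -0.5
zero = np.zeros(3)

assert np.allclose((u + v) + w, u + (v + w))    # 1. associativity
assert np.allclose(u + zero, u)                 # 2. additive identity
assert np.allclose(u + (-u), zero)              # 3. additive inverse
assert np.allclose(u + v, v + u)                # 4. commutativity
assert np.allclose(a * (u + v), a * u + a * v)  # 5. distributivity over vectors
assert np.allclose((a + b) * u, a * u + b * u)  # 6. distributivity over scalars
assert np.allclose((a * b) * u, a * (b * u))    # 7. scalar associativity
assert np.allclose(1.0 * u, u)                  # 8. unit scalar
```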

SUBSPACES

Definition: Let V be a vector space over a field K and let W be a subset
of V. Then W is a subspace of V if W is itself a vector space over K with
respect to the operations of vector addition and scalar multiplication on V.

Theorem: Suppose W is a subset of a vector space V. Then W is a
subspace of V if the following two conditions hold:

1. The zero vector 0 belongs to W.
2. For every u, v ∈ W, k ∈ K:
   the sum u + v ∈ W, and
   the multiple ku ∈ W.

Figure 2: Vector Subspace.
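As a sketch of the two-condition subspace test, take W = {x ∈ ℝ³ : x₁ + x₂ + x₃ = 0}, a plane through the origin (the membership function `in_W` and the sample vectors are chosen here for illustration):

```python
import numpy as np

# W is the plane x1 + x2 + x3 = 0 in R^3.
def in_W(x):
    return np.isclose(x.sum(), 0.0)

u = np.array([1., -1., 0.])
v = np.array([2., 3., -5.])
k = 7.0

assert in_W(np.zeros(3))  # condition 1: the zero vector lies in W
assert in_W(u) and in_W(v)
assert in_W(u + v)        # condition 2: closed under addition
assert in_W(k * u)        # condition 2: closed under scalar multiples
```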

LINEAR DEPENDENCE AND INDEPENDENCE

Let V be a vector space over a field K. The following defines the notion of
linear dependence and independence of vectors over K. (One usually suppresses mentioning K when the field is understood.) This concept plays an
essential role in the theory of linear algebra and in mathematics in general.
Definition: We say that the vectors v1, v2, ..., vm in V are linearly
dependent if there exist scalars a1, a2, ..., am in K, not all of them 0,
such that

a1 v1 + a2 v2 + ... + am vm = 0
Otherwise, we say that the vectors are linearly independent.
The above definition may be restated as follows. Consider the vector
equation
x1 v1 + x2 v2 + .... + xm vm = 0

(1)

where the xi are unknown scalars. This equation always has the zero
solution x1 = 0, x2 = 0, ..., xm = 0. Suppose this is the only solution, that
is, suppose we can show that

x1 v1 + x2 v2 + ... + xm vm = 0 implies x1 = 0, x2 = 0, ..., xm = 0
Then the vectors v1 , v2 , ..., vm are linearly independent. On the other
hand, suppose the equation (1) has a nonzero solution; then the vectors are
linearly dependent.
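Equation (1) has only the zero solution exactly when the matrix whose columns are v1, ..., vm has rank m, so numpy's `matrix_rank` gives a quick numerical independence test (a floating-point sketch, not a symbolic proof):

```python
import numpy as np

def linearly_independent(*vectors):
    # Stack the vectors as columns; full column rank means only the
    # zero solution to x1 v1 + ... + xm vm = 0 exists.
    A = np.column_stack(vectors)
    return np.linalg.matrix_rank(A) == len(vectors)

v1, v2, v3 = np.array([1., 0., 0.]), np.array([0., 1., 0.]), np.array([1., 1., 0.])
assert linearly_independent(v1, v2)          # independent pair
assert not linearly_independent(v1, v2, v3)  # v3 = v1 + v2, so dependent
```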

Figure 3: Linear dependence and independence.

BASIS, DIMENSION AND SPAN

Definition A: A set S = {u1, u2, ..., un} of vectors is a basis of V if the
following two properties hold:

1. S is a linearly independent set of vectors.
2. S spans V.

Definition B: A set S = {u1, u2, ..., un} of vectors is a basis of V if every
v ∈ V can be written uniquely as a linear combination of the basis vectors.
The following is a fundamental result in linear algebra.
Let V be a vector space such that one basis has m elements and another
basis has n elements. Then m = n.
A vector space V is said to be of finite dimension n, or n-dimensional,
if it has a basis with n elements; we write

dim V = n
Dimension: The dimension of a vector space V is defined to be dim V =
number of vectors in any basis for V = number of vectors in any minimal
spanning set for V = number of vectors in any maximal independent subset
of V.
Span: Given a set of vectors S = {u1, u2, u3, ..., up}, their span, ⟨S⟩, is
the set of all possible linear combinations of u1, u2, u3, ..., up. Symbolically,

⟨S⟩ = {α1 u1 + α2 u2 + α3 u3 + ... + αp up | αi ∈ ℝ, 1 ≤ i ≤ p}

Figure 4: Span
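Since dim ⟨S⟩ equals the number of vectors in any maximal independent subset of S, it can be computed as the rank of the matrix whose columns are the ui; a small numpy sketch with hand-picked vectors:

```python
import numpy as np

# Three vectors in R^3, one of which is redundant.
S = [np.array([1., 2., 3.]),
     np.array([2., 4., 6.]),   # a scalar multiple of the first vector
     np.array([0., 1., 1.])]

# dim <S> = rank of the matrix whose columns are the vectors of S.
dim_span = np.linalg.matrix_rank(np.column_stack(S))
assert dim_span == 2  # S spans only a 2-dimensional subspace of R^3
```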

LINEAR MAPPINGS (LINEAR TRANSFORMATIONS)

Definition: Let V and U be vector spaces over the same field K. A


mapping F : V → U is called a linear mapping or linear transformation if it
satisfies the following two conditions:

1. For any vectors u, v ∈ V, F(u + v) = F(u) + F(v)
2. For any scalar k ∈ K and vector v ∈ V, F(kv) = kF(v)

Fig. 5 shows the basic concept of a linear transformation, where L is the
operator which transforms, for example, cu to L(cu).
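Every matrix A defines a linear mapping F(v) = Av, and the two defining conditions can be spot-checked numerically (the matrix and sample inputs below are arbitrary choices for illustration):

```python
import numpy as np

A = np.array([[1., 2.], [0., 3.], [4., -1.]])  # F : R^2 -> R^3
F = lambda v: A @ v

u, v, k = np.array([1., -2.]), np.array([0.5, 4.]), 2.5
assert np.allclose(F(u + v), F(u) + F(v))  # condition 1: additivity
assert np.allclose(F(k * v), k * F(v))     # condition 2: homogeneity
```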

Important Subspaces

Consider a linear function f mapping ℝⁿ into ℝᵐ. Its range,
R(f) = {f(x) | x ∈ ℝⁿ} ⊆ ℝᵐ, is the set of all images f(x) as x varies
freely over ℝⁿ.

The range of every linear function f : ℝⁿ → ℝᵐ is a subspace of ℝᵐ,
and every subspace of ℝᵐ is the range of some linear function. For this
reason, subspaces of ℝᵐ are sometimes called linear spaces.

Range Spaces: The range of a matrix A ∈ ℝ^{m×n} is defined to be the
subspace R(A) of ℝᵐ that is generated by the range of f(x) = Ax. That

Figure 5: Linear Transformation


is, R(A) = {Ax | x ∈ ℝⁿ} ⊆ ℝᵐ. Similarly, the range of Aᵀ is the
subspace of ℝⁿ defined by R(Aᵀ) = {Aᵀy | y ∈ ℝᵐ} ⊆ ℝⁿ. Because
R(A) is the set of all images of vectors x ∈ ℝⁿ under transformation by A,
some people call R(A) the image space of A.
Column and Row Spaces: For A ∈ ℝ^{m×n}, the following statements
are true.

1. R(A) = the space spanned by the columns of A (column space).
2. R(Aᵀ) = the space spanned by the rows of A (row space).
3. b ∈ R(A) ⇔ b = Ax for some x.
4. a ∈ R(Aᵀ) ⇔ aᵀ = yᵀA for some yᵀ.
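Statement 3 gives a practical membership test: b lies in R(A) exactly when appending b as an extra column does not increase the rank. A numpy sketch (the helper name `in_range` is ours, not standard):

```python
import numpy as np

def in_range(A, b):
    # b is a combination of the columns of A iff rank([A | b]) == rank(A).
    return np.linalg.matrix_rank(np.column_stack([A, b])) == np.linalg.matrix_rank(A)

A = np.array([[1., 0.], [0., 1.], [1., 1.]])
assert in_range(A, np.array([2., 3., 5.]))      # 2*col1 + 3*col2
assert not in_range(A, np.array([0., 0., 1.]))  # no combination of the columns
```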
Nullspace (Kernel): For an m × n matrix A, the set
N(A) = {x ∈ ℝ^{n×1} | Ax = 0} ⊆ ℝⁿ is called the nullspace of A. In other
words, N(A) is simply the set of all solutions to the homogeneous system
Ax = 0.

The set N(Aᵀ) = {y ∈ ℝ^{m×1} | Aᵀy = 0} ⊆ ℝᵐ is called the left-hand
nullspace of A because N(Aᵀ) is the set of all solutions to the left-hand
homogeneous system yᵀA = 0ᵀ.
Zero Nullspace: If A is an m × n matrix, then

1. N(A) = {0} if and only if rank(A) = n;
2. N(Aᵀ) = {0} if and only if rank(A) = m.
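One common numerical way to obtain a basis for N(A) is from the SVD: the right singular vectors belonging to (numerically) zero singular values span the nullspace. A sketch, with the tolerance `tol` chosen by us:

```python
import numpy as np

def nullspace(A, tol=1e-10):
    # Right singular vectors past the numerical rank span N(A).
    _, s, Vt = np.linalg.svd(A)
    rank = int((s > tol).sum())
    return Vt[rank:].T  # columns form a basis of N(A)

A = np.array([[1., 2., 3.],
              [2., 4., 6.]])   # rank 1, so N(A) is 2-dimensional
N = nullspace(A)
assert N.shape == (3, 2)
assert np.allclose(A @ N, 0.0)  # every column solves Ax = 0
```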
Rank: The rank of a linear mapping T : ℝⁿ → ℝᵐ is the dimension of
Image(T). The rank of a matrix A is the dimension of Col(A). Of course,
this implies that T and the matrix of T have the same rank.

By the rank-nullity theorem, dim Col(A) + dim Ker(A) = n, the number
of columns of A.
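The rank-nullity relation can be verified numerically for a sample matrix (chosen here so that one row is the sum of the other two):

```python
import numpy as np

A = np.array([[1., 2., 0., 1.],
              [0., 1., 1., 0.],
              [1., 3., 1., 1.]])  # 3 x 4; third row = first + second, so rank 2

rank = np.linalg.matrix_rank(A)           # dim Col(A)
_, s, _ = np.linalg.svd(A)
nullity = A.shape[1] - int((s > 1e-10).sum())  # dim Ker(A)
assert rank + nullity == A.shape[1]       # equals n = 4, the number of columns
```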

INNER PRODUCT : (For Real Fields)

Definition: Let V be a real vector space. Suppose to each pair of vectors


u, v ∈ V there is assigned a real number, denoted by ⟨u, v⟩. This function
is called a (real) inner product on V if it satisfies the following axioms:

1. Bilinearity: ⟨au1 + bu2, v⟩ = a⟨u1, v⟩ + b⟨u2, v⟩
2. Symmetry: ⟨u, v⟩ = ⟨v, u⟩
3. Positive Definiteness: ⟨u, u⟩ ≥ 0 for all u, and ⟨u, u⟩ = 0 if and only
if u = 0

The vector space V equipped with an inner product (i.e., one for whose
members an inner product has been defined) is called a (real) inner product
space.
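The standard dot product on ℝⁿ is the prototypical real inner product; the three axioms can be spot-checked for sample vectors (again only a numerical sketch):

```python
import numpy as np

inner = lambda u, v: float(np.dot(u, v))  # standard inner product on R^n

u1, u2, v = np.array([1., 2.]), np.array([0., -1.]), np.array([3., 4.])
a, b = 2.0, -1.5
# 1. bilinearity in the first argument
assert np.isclose(inner(a*u1 + b*u2, v), a*inner(u1, v) + b*inner(u2, v))
# 2. symmetry
assert np.isclose(inner(u1, v), inner(v, u1))
# 3. positive definiteness
assert inner(u1, u1) > 0 and inner(np.zeros(2), np.zeros(2)) == 0
```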

VECTOR NORM

Definition: Let V be a real or complex vector space. Suppose to each


v ∈ V there is assigned a real number, denoted by ‖v‖. This function ‖·‖ is
called a norm on V if it satisfies the following axioms:

1. ‖v‖ ≥ 0; and ‖v‖ = 0 if and only if v = 0
2. ‖kv‖ = |k| ‖v‖
3. ‖u + v‖ ≤ ‖u‖ + ‖v‖ (the triangle inequality)

A vector space V with a norm is called a normed vector space.


1-Norm: The 1-norm of a vector x ∈ ℂⁿ is defined as

‖x‖_1 = Σ_{i=1}^{n} |x_i|

2-Norm: The 2-norm of a vector x ∈ ℂⁿ is defined as

1. ‖x‖_2 = (Σ_{i=1}^{n} |x_i|²)^{1/2} = √(xᵀx) where x ∈ ℝⁿ
2. ‖x‖_2 = (Σ_{i=1}^{n} |x_i|²)^{1/2} = √(x*x) where x ∈ ℂⁿ

∞-Norm: The ∞-norm of a vector x ∈ ℂⁿ is defined as

‖x‖_∞ = max{|x_i| : i = 1, 2, ..., n}

p-Norm: The p-norm of a vector x ∈ ℂⁿ is defined as

‖x‖_p = (Σ_{i=1}^{n} |x_i|^p)^{1/p}
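All four vector norms are available through `numpy.linalg.norm` via its `ord` argument; a quick check against the definitions for one sample vector:

```python
import numpy as np

x = np.array([3., -4., 0.])
assert np.isclose(np.linalg.norm(x, 1), 7.0)       # 1-norm: |3| + |-4| + |0|
assert np.isclose(np.linalg.norm(x, 2), 5.0)       # 2-norm: sqrt(9 + 16)
assert np.isclose(np.linalg.norm(x, np.inf), 4.0)  # inf-norm: max |x_i|
assert np.isclose(np.linalg.norm(x, 3),
                  (27 + 64) ** (1 / 3))            # p-norm with p = 3
```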

MATRIX NORM

A matrix norm is a function ‖·‖ from the set of all complex matrices (of
all finite orders) into ℝ that satisfies the following properties.

1. ‖A‖ ≥ 0; and ‖A‖ = 0 ⇔ A = 0
2. ‖kA‖ = |k| ‖A‖ for all scalars k
3. ‖A + B‖ ≤ ‖A‖ + ‖B‖

Induced Matrix Norms: A vector norm that is defined on ℂᵖ for
p = m, n induces a matrix norm on ℂ^{m×n} by setting

‖A‖ = max_{x ≠ 0} ‖Ax‖/‖x‖ = max_{‖x‖=1} ‖Ax‖, for A ∈ ℂ^{m×n}, x ∈ ℂ^{n×1}.

Matrix 2-Norm: The matrix norm induced by the Euclidean vector
norm is

‖A‖_2 = max_{x ≠ 0} ‖Ax‖_2/‖x‖_2 = max_{‖x‖_2 = 1} ‖Ax‖_2 = √(λ_max),

where λ_max is the largest eigenvalue of AᵀA.

Matrix 1-Norm: The matrix norm induced by the vector 1-norm is
defined as

‖A‖_1 = max_{x ≠ 0} ‖Ax‖_1/‖x‖_1 = max_{‖x‖_1 = 1} ‖Ax‖_1 = max_j Σ_i |a_ij|

= the largest absolute column sum.

Matrix ∞-Norm: The matrix norm induced by the vector ∞-norm
is defined as

‖A‖_∞ = max_{x ≠ 0} ‖Ax‖_∞/‖x‖_∞ = max_{‖x‖_∞ = 1} ‖Ax‖_∞ = max_i Σ_j |a_ij|

= the largest absolute row sum.
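`numpy.linalg.norm` also computes these induced matrix norms, so the column-sum, spectral, and row-sum characterisations can be checked against each other for a sample matrix:

```python
import numpy as np

A = np.array([[1., -2.],
              [3.,  4.]])

# 1-norm: largest absolute column sum (here |{-2, 4}| sums to 6).
assert np.isclose(np.linalg.norm(A, 1), max(abs(A).sum(axis=0)))
# inf-norm: largest absolute row sum (here |3| + |4| = 7).
assert np.isclose(np.linalg.norm(A, np.inf), max(abs(A).sum(axis=1)))
# 2-norm: square root of the largest eigenvalue of A^T A.
lam_max = max(np.linalg.eigvalsh(A.T @ A))
assert np.isclose(np.linalg.norm(A, 2), np.sqrt(lam_max))
```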


ORTHOGONAL SETS AND BASES

Consider a set S = {u1, u2, ..., ur} of nonzero vectors in an inner product
space V. S is called orthogonal if each pair of vectors in S is orthogonal,
and S is called orthonormal if S is orthogonal and each vector in S has unit
length. That is:

1. Orthogonal: ⟨ui, uj⟩ = 0 for i ≠ j.
2. Orthonormal: ⟨ui, uj⟩ = 0 for i ≠ j, and ⟨ui, ui⟩ = 1 for each i.
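Equivalently, S is orthonormal exactly when its Gram matrix of inner products [⟨ui, uj⟩] is the identity. As a sketch, numpy's QR factorisation is one way to produce such a set from arbitrary independent columns:

```python
import numpy as np

A = np.array([[1., 1.],
              [1., 0.],
              [0., 1.]])
Q, _ = np.linalg.qr(A)  # columns of Q form an orthonormal set

gram = Q.T @ Q          # entry (i, j) is <u_i, u_j>
assert np.allclose(gram, np.eye(2))  # identity Gram matrix <=> orthonormal
```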
