
Unit-4

Vector Spaces- Vector Space, Subspace, Linear Combination, Linear Independence, Basis, Dimension,
Finding a Basis of a Vector Space, Coordinates, Change of Basis

Inner Product Spaces- Inner Product, Length, Orthogonal Vectors, Triangle Inequality, Cauchy-Schwarz
Inequality, Orthonormal (Orthogonal) Basis, Gram-Schmidt Process.

A Vector Space is a fundamental concept in linear algebra, representing a collection of vectors where
operations like addition and scalar multiplication are defined and satisfy specific rules.
Definition
A Vector Space (also called a Linear Space) over a field F is a set V equipped with two operations:
1. Vector Addition: A rule to add two vectors u,v ∈ V to produce another vector u + v ∈ V.
2. Scalar Multiplication: A rule to multiply a vector v ∈ V by a scalar c ∈ F , producing another vector
c⋅v ∈ V.
Properties of a Vector Space:
To qualify as a vector space, the following properties must hold (a small numerical check is sketched after the list):
1. Closure
Addition: u + v ∈ V for all u, v ∈ V.
Scalar Multiplication: c⋅v ∈ V for all c ∈ F and v ∈ V.
2. Associativity
Vector addition is associative: u + (v + w) = (u + v) + w for all u, v, w ∈ V.
3. Commutativity
Vector addition is commutative: u + v = v + u for all u, v ∈ V.
4. Identity Element
There exists a zero vector 0 ∈ V such that v + 0 = v for all v ∈ V.
5. Inverse Element
For every vector v ∈ V, there exists a vector −v ∈ V such that v + (−v) = 0.
6. Distributivity
c⋅(u+v) = c⋅u + c⋅v for all c ∈ F and u , v ∈ V.
(c+d)⋅v =c⋅v + d⋅v for all c, d ∈ F and v ∈ V.
7. Associativity of Scalar Multiplication
c⋅(d⋅v) = (c⋅d)⋅v for all c, d ∈ F and v ∈ V.
8. Identity of Scalar Multiplication
1⋅v = v for all v ∈ V, where 1 is the multiplicative identity in F.
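
As a concrete illustration (a sketch, not part of the definition above), the following Python/NumPy snippet checks several of these axioms numerically for a few arbitrarily chosen sample vectors and scalars in R^2 with the usual component-wise operations.

# Sketch: numerically checking some vector space axioms for R^2.
import numpy as np

u = np.array([1.0, 2.0])
v = np.array([3.0, -1.0])
w = np.array([0.5, 4.0])
c, d = 2.0, -3.0
zero = np.zeros(2)

assert np.allclose(u + v, v + u)                  # commutativity of addition
assert np.allclose(u + (v + w), (u + v) + w)      # associativity of addition
assert np.allclose(v + zero, v)                   # additive identity
assert np.allclose(v + (-v), zero)                # additive inverse
assert np.allclose(c * (u + v), c * u + c * v)    # distributivity over vector addition
assert np.allclose((c + d) * v, c * v + d * v)    # distributivity over scalar addition
assert np.allclose(c * (d * v), (c * d) * v)      # associativity of scalar multiplication
assert np.allclose(1.0 * v, v)                    # identity of scalar multiplication
print("All checked axioms hold for these sample vectors in R^2.")

Passing these checks for sample vectors illustrates the axioms; the general proof that R^2 is a vector space follows from the properties of real-number arithmetic.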
Applications
Vector spaces are foundational in:
• Physics (e.g., force vectors, electric fields)
• Computer science (e.g., graphics, machine learning)
• Engineering (e.g., signal processing, control systems)

Subspace
A subspace is a subset of a vector space that is itself a vector space under the same operations of vector
addition and scalar multiplication as the larger space.
Definition
Let V be a vector space over a field F. A subset W⊆V is called a subspace of V if W satisfies the following
three conditions:
1. Zero Vector: 0 ∈ W, where 0 is the zero vector of V.
2. Closed Under Addition: If u, v ∈ W then u + v ∈ W.
3. Closed Under Scalar Multiplication: If u ∈ W and c ∈ F, then c⋅u ∈ W.
If W satisfies these conditions, it inherits all vector space properties, making it a subspace of V.
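
As an illustrative sketch (the set W below is an assumed example, not from the notes), the following Python/NumPy snippet checks the three subspace conditions for W = {(x, y, 0) : x, y ∈ R} as a subset of R^3. Finite samples only illustrate closure; they do not prove it in general.

# Sketch: checking the subspace conditions for W = {(x, y, 0)} inside R^3.
import numpy as np

def in_W(v):
    # Membership test for W: the third component must be zero.
    return np.isclose(v[2], 0.0)

zero = np.zeros(3)
u = np.array([1.0, 2.0, 0.0])   # sample element of W
v = np.array([-3.0, 0.5, 0.0])  # sample element of W
c = 4.0

print(in_W(zero))     # 1. the zero vector lies in W
print(in_W(u + v))    # 2. closed under addition (for these samples)
print(in_W(c * u))    # 3. closed under scalar multiplication (for these samples)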
Properties of a Subspace
1. Non-Empty: A subspace must contain at least the zero vector 0.
2. Closed Operations: For any u, v ∈ W and c ∈ F:
o u+ v ∈ W
o c⋅u ∈ W
3. Subset: W⊆V, meaning every element of W is also in V.
Applications of Subspaces
1. Linear Systems: The solution set of a homogeneous system of linear equations forms a subspace.
2. Geometric Interpretation: Subspaces represent lines, planes, and hyperplanes passing through the
origin in geometry.
3. Signal Processing: Subspaces are used in Fourier transforms and wavelet transforms.
Linear Combination
A linear combination of a set of vectors is a vector formed by adding together scalar multiples of those
vectors.
Definition
Let V be a vector space over a field F, and let {v1,v2,…,vn} be a set of vectors in V. A linear combination of
these vectors is any vector of the form:
v = c1v1 + c2v2 + ⋯ + cnvn
where c1,c2,…,cn are scalars in F, and v1,v2,…,vn are vectors in V.
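
A minimal numerical sketch (with assumed example vectors and scalars) of forming such a linear combination in R^3:

# Sketch: the linear combination v = c1*v1 + c2*v2 + c3*v3 in R^3.
import numpy as np

v1 = np.array([1.0, 0.0, 2.0])
v2 = np.array([0.0, 1.0, -1.0])
v3 = np.array([3.0, 1.0, 0.0])
c1, c2, c3 = 2.0, -1.0, 0.5

v = c1 * v1 + c2 * v2 + c3 * v3
print(v)   # the resulting vector, here [3.5, -0.5, 5.0]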

Applications of Linear Combinations


1. Solving Systems of Linear Equations: Representing solutions as combinations of basis vectors.
2. Geometry: Describing lines, planes, and higher-dimensional spaces.
3. Computer Graphics: Transforming and scaling objects using vector spaces.
4. Machine Learning: Representing data in high-dimensional spaces and reducing dimensions.
Linear Independence
In linear algebra, a set of vectors is said to be linearly independent if no vector in the set can be expressed
as a linear combination of the other vectors. Conversely, if at least one vector in the set can be expressed as
a linear combination of the others, the set is linearly dependent.
Definition
A set of vectors {v1,v2,…,vn} in a vector space V is linearly independent if the following equation has only
the trivial solution:
c1v1 + c2v2 + ⋯ + cnvn = 0,
where 0 is the zero vector in V and c1,c2,…,cn are scalars.
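
One common way to test this in practice is to compare the rank of the matrix whose columns are the given vectors with the number of vectors: the set is linearly independent exactly when the two agree. The Python/NumPy sketch below (with assumed example vectors) uses this idea.

# Sketch: testing linear independence via the rank of the column matrix.
import numpy as np

v1 = np.array([1.0, 0.0, 1.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = np.array([1.0, 1.0, 2.0])   # v3 = v1 + v2, so the set is dependent

A = np.column_stack([v1, v2, v3])
rank = np.linalg.matrix_rank(A)
print("independent" if rank == A.shape[1] else "dependent")   # prints "dependent"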
Applications
1. Basis of a Vector Space: A basis is a linearly independent set of vectors that spans the entire vector
space.
2. Dimension: The dimension of a vector space is the number of vectors in any basis, equivalently the maximum number of linearly independent vectors it contains.
3. Systems of Equations: Solving linear systems often involves determining the linear independence of
rows or columns.
4. Eigenvectors: Eigenvectors corresponding to distinct eigenvalues are linearly independent.

Coordinates and Change of Basis
In linear algebra, the coordinates of a vector are its description relative to a specific basis. A change of basis converts the representation of a vector from one basis to another.
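
As a sketch under assumed example bases, the Python/NumPy snippet below finds the coordinates of a vector v relative to a basis B by solving B·c = v, and then converts those coordinates to a second basis B2 using the change-of-basis matrix P = B2⁻¹B.

# Sketch: coordinates relative to a basis and change of basis in R^2.
import numpy as np

# Columns of B are the basis vectors b1 = (1, 1), b2 = (1, -1).
B  = np.array([[1.0,  1.0],
               [1.0, -1.0]])
# A second basis: b1' = (2, 0), b2' = (0, 3).
B2 = np.array([[2.0, 0.0],
               [0.0, 3.0]])

v = np.array([3.0, 1.0])

c  = np.linalg.solve(B, v)     # coordinates of v in basis B:  v = c[0]*b1 + c[1]*b2
c2 = np.linalg.solve(B2, v)    # coordinates of the same v in basis B2

# Change-of-basis matrix taking B-coordinates to B2-coordinates: P = B2^{-1} B.
P = np.linalg.solve(B2, B)
print(np.allclose(P @ c, c2))  # True: P converts B-coordinates into B2-coordinates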
