Udacity Session 10

The document introduces key concepts in linear algebra, including: Vectors, linear combinations, linear dependence/independence, linear transformations, and matrices. Vectors can be added or multiplied by scalars. A linear combination is a sum of vectors multiplied by scalars. Vectors are linearly dependent if one can be written as a linear combination of the others. A linear transformation is a function that preserves vector addition and scalar multiplication. Matrices are arrays of numbers used to represent linear transformations, with rules for addition and multiplication.

Uploaded by

mahmoud samir
Copyright
© All Rights Reserved

Welcome

Recap

● Introduction
○ What is Linear Algebra?
○ What is it really about?
○ Why Linear Algebra?
○ Use cases of Linear Algebra in AI Programming.
● Vectors
○ Definition
○ Transpose
○ Vector Addition
○ Scalar-Vector Multiplication
Agenda

● Linear combinations
● Linear dependency / independency
● Linear Transformation
● Matrices
Linear Combinations

● As you’ve probably noticed, the two fundamental operations of interest in linear algebra are addition and scalar multiplication.
● In particular, linear transformations are exactly the functions that preserve addition and scalar multiplication.
● We often want to combine addition and scalar multiplication, so let’s define a new term that describes this, called a “Linear Combination”.
Linear Combinations
Linear Combinations

Linear Combinations

● You can notice that the result vector “[0, 3]”, let’s call it b, is a combination of the two vectors “[2, -1]” (v1) and “[-1, 2]” (v2).
● Then
● b = constant_v1 * v1 + constant_v2 * v2
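This combination can be computed directly; a minimal sketch in plain Python (the helper name `linear_combination` and the coefficients 1 and 2 are illustrative choices, not from the slides):

```python
# A sketch of forming a linear combination c1*v1 + c2*v2 with plain lists.
def linear_combination(coeffs, vectors):
    """Return the componentwise sum of c * v over paired coefficients/vectors."""
    dim = len(vectors[0])
    result = [0] * dim
    for c, v in zip(coeffs, vectors):
        for i in range(dim):
            result[i] += c * v[i]
    return result

v1 = [2, -1]
v2 = [-1, 2]
b = linear_combination([1, 2], [v1, v2])  # 1*v1 + 2*v2
print(b)  # [0, 3]
```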
Linear Combinations

● Let’s say we have vectors
○ b = ( 3, 5 )
○ v1 = ( 1, 0 )
○ v2 = ( 0, 1 )
● We’re trying to decide whether ‘b’ is a linear combination of ‘v1’ and ‘v2’. According to the definition, that means we’re deciding whether there are scalars ‘c1’ and ‘c2’ such that
○ b = c1*v1 + c2*v2
Linear Combinations

● In that example you can just look at the vectors and decide ‘c1 = 3’ and ‘c2 = 5’, so that “b = 3 * v1 + 5 * v2”
● But you can’t always “SEE” the solution, can you?
● Mathematicians call the above “Solution by Inspection”, and it doesn’t always work, so we need a systematic mathematical approach.
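The inspection result above can be confirmed numerically; a minimal sketch in plain Python:

```python
# Checking the "solution by inspection": does b equal 3*v1 + 5*v2?
v1 = (1, 0)
v2 = (0, 1)
b = (3, 5)

c1, c2 = 3, 5
combo = tuple(c1 * x + c2 * y for x, y in zip(v1, v2))
print(combo == b)  # True
```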
Linear Combinations

● Let’s decide whether ‘b’ is a linear combination of ‘v1’ and ‘v2’.
○ b = (1, 14, -9)
○ v1 = (1, 4, 1)
○ v2 = (2, 3, 7)
● Then, let’s write out the equation and see how we can solve it.
○ c1 * (1, 4, 1) + c2 * (2, 3, 7) = (1, 14, -9)
Linear Combinations
Linear Combinations

● There are lots of methods to solve the above equation
○ Gauss-Jordan method
○ Matrix Inversion
○ …
● Matrix Inversion will be covered in a later session
● We will get back and solve that equation in a later session.
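Ahead of that session, here is a hedged sketch of one direct check in plain Python: solve the first two component equations for c1 and c2 (a 2×2 Cramer's rule), then verify the remaining component. This is an illustrative approach, not the method the course will teach:

```python
# Is b a linear combination of v1 and v2? Solve the first two component
# equations for c1, c2, then check the third component.
v1 = (1, 4, 1)
v2 = (2, 3, 7)
b = (1, 14, -9)

det = v1[0] * v2[1] - v2[0] * v1[1]        # 1*3 - 2*4 = -5
c1 = (b[0] * v2[1] - v2[0] * b[1]) / det   # (3 - 28) / -5 = 5.0
c2 = (v1[0] * b[1] - b[0] * v1[1]) / det   # (14 - 4) / -5 = -2.0

# b is a combination iff every component of c1*v1 + c2*v2 matches b.
ok = all(abs(c1 * x + c2 * y - t) < 1e-9 for x, y, t in zip(v1, v2, b))
print(c1, c2, ok)  # 5.0 -2.0 True
```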
Linear dependence / independence

● Consider vectors v1, v2, v3, …, vn.
● An equation “c1 v1 + c2 v2 + …… + cn vn = 0” is called a linear relation among these vectors.
● If at least one of the constants “ci” is non-zero, we call this a non-trivial solution.
● A trivial solution is: c1 = c2 = c3 = ….. = cn = 0.
Linear dependence / independence

● Vectors v1, v2, …, vn are linearly independent if there are no non-trivial linear relations among them; that is, v1, v2, …, vn are linearly independent if the only way to express “0” as a linear combination “c1 v1 + …. + cn vn” is to have “c1 = c2 = c3 …. = cn = 0”
Linear dependence / independence
Linear dependence / independence

● In the above example, we can find the combination
○ 0 = -2 * v1 + v2
● The solution “c1 = -2” and “c2 = 1” is a non-trivial solution according to the definition.
● That means v1 and v2 are linearly dependent.
Linear dependence / independence
Linear dependence / independence

● For the above example we could only find the trivial solution, where “c1 = c2 = 0”
● That means v1 and v2 are linearly independent
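For two 2-D vectors there is a quick dependence test; a minimal sketch in plain Python (the example vectors here are illustrative, since the slides' vectors appear only in the figures: the dependent pair mirrors the relation v2 = 2 * v1 from the earlier example):

```python
# Two 2-D vectors are linearly dependent exactly when the 2x2 determinant
# of the matrix [v1 v2] is zero; non-zero determinant means independent.
def independent_2d(v1, v2):
    return v1[0] * v2[1] - v1[1] * v2[0] != 0

print(independent_2d((1, 2), (2, 4)))  # False: v2 = 2*v1, so dependent
print(independent_2d((1, 0), (0, 1)))  # True: only the trivial relation
```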
Linear Transformation

● Many branches of mathematics are concerned with studying functions with particular properties.
● For example, single variable calculus is largely concerned with studying functions of one variable that are differentiable.
● In linear algebra, the sort of function that we study is called a linear
transformation, and the goal of this handout is to explain what a linear
transformation is.
Linear Transformation

● If you’ve studied multivariable calculus, you’ve studied functions with different types of inputs and outputs.
● For example, you’ve studied functions like f(x, y) = x^2 + y^2, where the
input is a pair of numbers (x and y) and the output is one number.
Linear Transformation

● You’ve also studied parametric curves like r(t) = [ cos(t), sin(t), t ] , where
the input (t) is a number and the output is a vector.
● In linear algebra, we’ll study functions where the input and output are both
vectors; such functions are often also called transformations. (A linear
transformation is a special kind of transformation, as we’ll explain soon.)
● Our functions will often be named T (for “transformation”).
Linear Transformation
Linear Transformation Visualization

Consider this Transformation.


Linear Transformation Visualization

● From the definition of this function we can see two examples.


Linear Transformation Visualization
Transformation Visualization
Transformation Visualization
Transformation Visualization
Linear Transformation

● A linear transformation is a function satisfying the following two properties.
○ T(x+y) = T(x) + T(y) “T preserves addition”
○ T(k x) = k T(x), where ‘k’ is a scalar value “T preserves scalar multiplication”
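These two properties can be checked numerically for a candidate transformation; a minimal sketch in plain Python, using an illustrative 2×2 matrix map (the matrix entries and test vectors are assumptions for the example, not from the slides):

```python
# Check both linearity properties for T(x) = A x, with an example 2x2 matrix A.
def T(x):
    a, b, c, d = 2, -1, 0, 3  # entries of an illustrative matrix A
    return (a * x[0] + b * x[1], c * x[0] + d * x[1])

x, y, k = (1, 2), (3, -1), 4

# Property 1: T preserves addition.
x_plus_y = tuple(p + q for p, q in zip(x, y))
lhs1 = T(x_plus_y)
rhs1 = tuple(p + q for p, q in zip(T(x), T(y)))

# Property 2: T preserves scalar multiplication.
lhs2 = T(tuple(k * p for p in x))
rhs2 = tuple(k * p for p in T(x))

print(lhs1 == rhs1 and lhs2 == rhs2)  # True
```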
Linear Transformation
Linear Transformation
Linear Transformation
Matrices

● A matrix is any rectangular array of numbers.
● If the array has n rows and m columns, then it is an n × m matrix.
● The numbers n and m are called the dimensions of the matrix.
● We will usually denote matrices with capital letters, like A, B, etc., although we will sometimes use lower case letters for one dimensional matrices (i.e. 1 × m or n × 1 matrices).
Matrices

● One dimensional matrices are often called vectors, as in row vector for a 1 × m matrix or column vector for an n × 1 matrix, but we are going to use the word “vector” to refer to something different in Part II. We will use the notation Aij to refer to the number in the i-th row and j-th column.
Matrices
Matrices

● A00 = a
● A22 =
● A02 =
● A01 =
● A20 =
Matrices

● There are a number of useful operations on matrices.
● Some of them are pretty obvious.
● For instance, you can add any two n × m matrices by simply adding the corresponding entries. We will use A+B to denote the sum of matrices formed in this way:
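Entrywise addition can be sketched in a few lines of plain Python (the helper name `mat_add` and the example matrices are illustrative):

```python
# Entrywise sum of two matrices of the same shape, stored as lists of rows.
def mat_add(A, B):
    return [[a + b for a, b in zip(row_a, row_b)]
            for row_a, row_b in zip(A, B)]

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
print(mat_add(A, B))  # [[6, 8], [10, 12]]
```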
Matrices
Matrices Multiplication

● If you have an n × k matrix, A, and a k × m matrix, B, then you can matrix multiply them together to form an n × m matrix denoted AB. (We sometimes use A.B for the matrix product if that helps to make formulae clearer.) The matrix product is one of the most fundamental matrix operations and it is important to understand how it works in detail. It may seem unnatural at first sight and we will learn where it comes from later but, for the moment, it is best to treat it as something new to learn and just get used to it. The first thing to remember is how the matrix dimensions work.
Matrices Multiplication
Matrices Multiplication

● Note two consequences of this. Just because you can form the matrix product AB does not mean that you can form the product BA. Indeed, you should be able to see that the products AB and BA only both make sense when A and B are square matrices: they have the same number of rows as columns.
Matrices Multiplication

● To explain how matrix multiplication works, we are going to first do it in the special case when n = m = 1. In this case we have a 1 × k matrix, A, multiplied by a k × 1 matrix, B. According to the rule for dimensions, the result should be a 1 × 1 matrix. This has just one entry. What is the entry? You get it by multiplying corresponding terms together and adding the results:
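This 1 × k times k × 1 rule can be sketched directly in plain Python (the helper name `row_times_col` is illustrative):

```python
# A 1 x k row times a k x 1 column: multiply corresponding entries and add.
def row_times_col(row, col):
    return sum(r * c for r, c in zip(row, col))

print(row_times_col([1, 2, 3], [4, 5, 6]))  # 1*4 + 2*5 + 3*6 = 32
```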
Matrices Multiplication
Matrices Multiplication

● Once you know how to multiply one dimensional matrices, it is easy to multiply any two matrices. If A is an n × k matrix and B is a k × m matrix, then the ij-th element of AB is given by taking the i-th row of A, which is a 1 × k matrix, and the j-th column of B, which is a k × 1 matrix, and multiplying them together just as in the one dimensional case. Schematically, this looks as shown in Figure 1. It can be helpful to arrange the matrices in this way if you are multiplying matrices by hand.
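The full rule can be sketched in plain Python: entry (i, j) of AB is row i of A times column j of B (the helper name `mat_mul` and the example matrices are illustrative):

```python
# General matrix product: an n x k matrix A times a k x m matrix B
# gives an n x m matrix whose (i, j) entry is row i of A dot column j of B.
def mat_mul(A, B):
    n, k, m = len(A), len(B), len(B[0])
    assert len(A[0]) == k, "inner dimensions must match"
    return [[sum(A[i][p] * B[p][j] for p in range(k)) for j in range(m)]
            for i in range(n)]

A = [[1, 2, 3],
     [4, 5, 6]]        # 2 x 3
B = [[7, 8],
     [9, 10],
     [11, 12]]         # 3 x 2
print(mat_mul(A, B))   # [[58, 64], [139, 154]] -- a 2 x 2 result
```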
Matrices Multiplication
Matrices Multiplication
Matrices Multiplication

Matrix multiplication is not commutative.
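Even for square matrices, AB and BA can differ; a minimal sketch in plain Python with an illustrative 2×2 pair:

```python
# AB != BA in general, shown with two 2x2 matrices.
def mat_mul2(A, B):
    return [[sum(A[i][p] * B[p][j] for p in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 1], [0, 1]]
B = [[1, 0], [1, 1]]
print(mat_mul2(A, B))  # [[2, 1], [1, 1]]
print(mat_mul2(B, A))  # [[1, 1], [1, 2]]
```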


Exercise

● Matrix multiplication
Extra materials

● Linear dependence
● Linear combinations
● Linear Transformation
● Matrices
