
Linear Algebra Notes

This document is a comprehensive formula sheet for linear algebra, covering key concepts such as matrices, systems of linear equations, determinants, inverses, vectors, linear independence, eigenvalues, diagonalization, orthogonality, and linear transformations. It includes definitions, operations, and properties related to each topic. The information is structured into sections for easy reference.


Linear Algebra Formula Sheet

1. Matrices and Matrix Operations

●​ Matrix Addition:​

○ Add corresponding entries: A + B = [a_ij + b_ij]

●​ Scalar Multiplication:​

○​ c * A = [c * a_ij]​

●​ Matrix Multiplication:​

○​ AB = C where c_ij = sum over k of a_ik * b_kj​

●​ Identity Matrix:​

○​ I_n is an n by n matrix with 1’s on the diagonal and 0’s elsewhere​

●​ Transpose:​

○​ A^T = flip rows and columns of A​
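The operations above can be sketched in plain Python with list-of-lists matrices (the helper names here are mine, not part of the sheet):

```python
def mat_add(A, B):
    # A + B = [a_ij + b_ij]: add corresponding entries
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def scalar_mul(c, A):
    # c * A = [c * a_ij]
    return [[c * a for a in row] for row in A]

def mat_mul(A, B):
    # (AB)_ij = sum over k of a_ik * b_kj
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

def transpose(A):
    # A^T: rows become columns
    return [list(col) for col in zip(*A)]

def identity(n):
    # I_n: 1's on the diagonal, 0's elsewhere
    return [[1 if i == j else 0 for j in range(n)] for i in range(n)]

C = mat_mul([[1, 2], [3, 4]], [[5, 6], [7, 8]])  # [[19, 22], [43, 50]]
```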

2. Systems of Linear Equations

●​ System represented as:​

○​ A * x = b​

●​ Augmented matrix:​

○​ [A | b]​

●​ Row Operations:​

○​ Swap rows​
○​ Multiply a row by a nonzero scalar​

○​ Add a multiple of one row to another​

●​ Row-Echelon Form (REF):​

○​ Leading entries are 1’s and stair-step down​

●​ Reduced Row-Echelon Form (RREF):​

○ Same as REF, but with zeros above each leading 1 as well (REF already has zeros below)

●​ Gaussian Elimination:​

○​ Use row operations to reduce to REF​

●​ Gauss-Jordan Elimination:​

○​ Use row operations to reduce to RREF​
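A minimal sketch of Gauss-Jordan elimination on the augmented matrix [A | b], assuming every pivot is nonzero so no row swaps are needed (the function name is mine):

```python
def gauss_jordan(aug):
    # aug is the augmented matrix [A | b] as a list of rows
    n = len(aug)
    for i in range(n):
        # Scale row i so its pivot becomes a leading 1
        pivot = aug[i][i]
        aug[i] = [x / pivot for x in aug[i]]
        # Add multiples of row i to clear the pivot column elsewhere
        for r in range(n):
            if r != i:
                factor = aug[r][i]
                aug[r] = [x - factor * y for x, y in zip(aug[r], aug[i])]
    return [row[-1] for row in aug]  # the solution column of the RREF

# Solve: x + 2y = 5, 3x + 4y = 11
solution = gauss_jordan([[1.0, 2.0, 5.0], [3.0, 4.0, 11.0]])
```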

3. Determinants

●​ 2x2 Matrix:​

○ For A = [a, b; c, d]: det A = ad - bc

●​ 3x3 Matrix (Rule of Sarrus or cofactor expansion):​

○​ det A = a11 * (minor of a11) - a12 * (minor of a12) + a13 * (minor of a13)​

●​ Properties:​

○​ det(AB) = det A * det B​

○​ det(A^T) = det A​

○​ If two rows are equal or a row is a multiple of another, det A = 0​

○​ If A is invertible, det A is not 0​
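The 2x2 formula and the product property det(AB) = det A * det B can be checked on a small example (plain Python, 2x2 only; helper names are mine):

```python
def det2(A):
    # For A = [a, b; c, d]: det A = a*d - b*c
    return A[0][0] * A[1][1] - A[0][1] * A[1][0]

def mat_mul2(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

A = [[1, 2], [3, 4]]   # det = 1*4 - 2*3 = -2
B = [[2, 0], [1, 3]]   # det = 2*3 - 0*1 = 6
```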


4. Inverse of a Matrix

●​ 2x2 Matrix:​

○ For A = [a, b; c, d]: A^(-1) = (1 / det A) * [d, -b; -c, a]

●​ General formula:​

○​ A^(-1) = (1 / det A) * adj(A)​

○​ adj(A) = transpose of cofactor matrix​

●​ Inverse exists only if det A is not 0​
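The 2x2 inverse formula as a sketch, returning None when det A = 0 (singular, so no inverse exists):

```python
def inverse2(A):
    # For A = [a, b; c, d]: A^(-1) = (1/det A) * [d, -b; -c, a]
    (a, b), (c, d) = A
    det = a * d - b * c
    if det == 0:
        return None  # singular: inverse exists only if det A != 0
    return [[d / det, -b / det], [-c / det, a / det]]
```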

5. Vectors and Vector Spaces

●​ Vector addition:​

○​ u + v = [u1 + v1, u2 + v2, …, un + vn]​

●​ Scalar multiplication:​

○​ c * v = [c * v1, …, c * vn]​

●​ Dot product:​

○​ u dot v = u1 * v1 + u2 * v2 + … + un * vn​

●​ Length or norm of vector v:​

○​ ||v|| = square root of (v1^2 + v2^2 + … + vn^2)​

●​ Unit vector:​

○​ v divided by ||v||​

●​ Angle between vectors:​

○​ cos(theta) = (u dot v) / (||u|| * ||v||)​

●​ Orthogonality:​
○​ u and v are orthogonal if u dot v = 0​
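The vector formulas above, sketched in plain Python (function names are mine):

```python
import math

def dot(u, v):
    # u . v = u1*v1 + u2*v2 + ... + un*vn
    return sum(a * b for a, b in zip(u, v))

def norm(v):
    # ||v|| = sqrt(v1^2 + ... + vn^2)
    return math.sqrt(sum(x * x for x in v))

def unit(v):
    # v / ||v||
    n = norm(v)
    return [x / n for x in v]

def angle(u, v):
    # cos(theta) = (u . v) / (||u|| * ||v||)
    return math.acos(dot(u, v) / (norm(u) * norm(v)))
```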

6. Linear Independence and Span

●​ Vectors are linearly independent if the only solution to​

○ c1*v1 + c2*v2 + … + cn*vn = 0 is c1 = c2 = … = cn = 0

●​ The span of vectors v1, …, vn is the set of all linear combinations of them​
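One quick check (a sketch of a special case, not the general method): for n vectors in R^n, place them as columns of a matrix; they are linearly independent exactly when its determinant is nonzero. For two vectors in R^2:

```python
def independent_2d(v1, v2):
    # Columns [v1 v2] form an invertible matrix iff det != 0
    det = v1[0] * v2[1] - v1[1] * v2[0]
    return det != 0
```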

7. Basis and Dimension

●​ Basis:​

○​ A linearly independent set of vectors that spans a space​

●​ Dimension:​

○​ Number of vectors in a basis​

8. Eigenvalues and Eigenvectors

●​ A * v = lambda * v​

○​ lambda is the eigenvalue, v is the eigenvector​

●​ Find eigenvalues:​

○​ det(A - lambda * I) = 0​

●​ Find eigenvectors:​

○​ Solve (A - lambda * I) * v = 0​
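
For a 2x2 matrix, det(A - lambda*I) = 0 expands to lambda^2 - trace(A)*lambda + det(A) = 0, which the quadratic formula solves directly. A sketch assuming real eigenvalues (the function name and example matrix are mine):

```python
import math

def eigenvalues2(A):
    # Characteristic equation: lambda^2 - (a + d)*lambda + (ad - bc) = 0
    (a, b), (c, d) = A
    tr, det = a + d, a * d - b * c
    disc = math.sqrt(tr * tr - 4 * det)  # assumes real eigenvalues
    return sorted([(tr - disc) / 2, (tr + disc) / 2])

vals = eigenvalues2([[4, 1], [2, 3]])  # [2.0, 5.0]
```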
9. Diagonalization

●​ A = PDP^(-1)​

○​ D is diagonal matrix of eigenvalues​

○​ P is matrix of eigenvectors​

●​ A is diagonalizable if it has n linearly independent eigenvectors​
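A = PDP^(-1) can be verified on a concrete 2x2 example (an assumed example, not from the sheet): A = [[4, 1], [2, 3]] has eigenvalues 5 and 2 with eigenvectors [1, 1] and [1, -2].

```python
def mul2(A, B):
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

def inv2(P):
    (a, b), (c, d) = P
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

P = [[1, 1], [1, -2]]   # columns are the eigenvectors
D = [[5, 0], [0, 2]]    # eigenvalues on the diagonal
A = mul2(mul2(P, D), inv2(P))  # should reconstruct [[4, 1], [2, 3]]
```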

10. Orthogonality and Projections

●​ Projection of v onto u:​

○ proj_u(v) = ((v dot u) / (u dot u)) * u

●​ Orthogonal set:​

○​ All vectors are pairwise orthogonal​

●​ Orthonormal set:​

○​ Orthogonal and all vectors have length 1​

●​ Gram-Schmidt Process:​

○ Converts a linearly independent set of vectors into an orthonormal basis for the subspace they span
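A sketch of the process: subtract from each vector its projections onto the previously built orthogonal vectors, then normalize. Assumes the inputs are linearly independent (function names are mine):

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def gram_schmidt(vectors):
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            # Subtract proj_u(v) = ((v . u) / (u . u)) * u
            coeff = dot(v, u) / dot(u, u)
            w = [wi - coeff * ui for wi, ui in zip(w, u)]
        basis.append(w)
    # Normalize each vector to length 1 for an orthonormal set
    return [[x / math.sqrt(dot(w, w)) for x in w] for w in basis]

q = gram_schmidt([[3, 1], [2, 2]])
```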

11. Linear Transformations

●​ A transformation T is linear if:​

○​ T(u + v) = T(u) + T(v)​

○​ T(c * u) = c * T(u)​

●​ Matrix of T:​
○​ T(x) = A * x​

●​ Standard matrix:​

○​ Columns are T applied to standard basis vectors​
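The standard matrix can be built exactly as described: apply T to each standard basis vector and use the results as columns. A sketch for the assumed example T(x, y) = (x + 2y, 3y):

```python
def T(v):
    # An example linear transformation (assumed, not from the sheet)
    x, y = v
    return [x + 2 * y, 3 * y]

# Columns of the standard matrix are T(e1) and T(e2)
e1, e2 = [1, 0], [0, 1]
A = [[T(e1)[i], T(e2)[i]] for i in range(2)]  # [[1, 2], [0, 3]]

def apply(A, v):
    # T(x) = A * x
    return [sum(A[i][j] * v[j] for j in range(2)) for i in range(2)]
```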
