Linear algebra
Linear algebra, mathematical discipline that deals with vectors and matrices and, more generally, with vector spaces and linear transformations. Unlike other parts of mathematics that are frequently invigorated by new ideas and unsolved problems, linear algebra is very well understood. Its value lies in its many applications, from mathematical physics to modern algebra and coding theory.
Vectors and vector spaces
Linear algebra usually starts with the study of vectors, which are understood as quantities
having both magnitude and direction. Vectors lend themselves readily to physical applications.
For example, consider a solid object that is free to move in any direction. When two forces act
at the same time on this object, they produce a combined effect that is the same as a single
force. To picture this, represent the two forces v and w as arrows; the direction of each arrow
gives the direction of the force, and its length gives the magnitude of the force. The single
force that results from combining v and w is called their sum, written v + w. Pictured this way, v + w corresponds to the diagonal of the parallelogram formed from adjacent sides represented by v and w.
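As a concrete sketch (not from the original article; Python with NumPy is assumed here), adding the two arrows componentwise gives exactly this parallelogram diagonal:

import numpy as np

# Two forces represented as arrows (vectors) in the plane.
v = np.array([3.0, 1.0])
w = np.array([1.0, 2.0])

# Their combined effect is the componentwise sum: the diagonal
# of the parallelogram with sides v and w.
s = v + w
print(s)                                   # [4. 3.]

# Length of the arrow (magnitude of the resultant force) and its direction.
print(np.linalg.norm(s))                   # 5.0
print(np.degrees(np.arctan2(s[1], s[0])))  # ~36.87 degrees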
Linear transformations and matrices
Vector spaces are one of the two main ingredients of linear algebra, the other being linear
transformations (or “operators” in the parlance of physicists). Linear transformations are
functions that send, or “map,” one vector to another vector. The simplest example of a linear
transformation sends each vector to c times itself, where c is some constant. Thus, every
vector remains in the same direction, but all lengths are multiplied by c. Another example is a
rotation, which leaves all lengths the same but alters the directions of the vectors. Linear refers
to the fact that the transformation preserves vector addition and scalar multiplication. This
means that if T is a linear transformation sending a vector v to T(v), then for any vectors v and
w, and any scalar c, the transformation must satisfy the properties T(v + w) = T(v) + T(w) and
T(cv) = cT(v).
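A minimal sketch (Python with NumPy; the example vectors are arbitrary) that checks these two defining properties for the scaling and rotation transformations just described:

import numpy as np

def scale_by_2(x):
    # Sends each vector to 2 times itself: same direction, doubled length.
    return 2.0 * x

def rotate_90(x):
    # Rotates a plane vector 90 degrees counterclockwise: same length.
    return np.array([-x[1], x[0]])

v = np.array([1.0, 2.0])
w = np.array([3.0, -1.0])
c = 4.0

for T in (scale_by_2, rotate_90):
    # T(v + w) = T(v) + T(w)  and  T(cv) = cT(v)
    assert np.allclose(T(v + w), T(v) + T(w))
    assert np.allclose(T(c * v), c * T(v))
print("both properties hold for both transformations")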
When linear transformations are represented by matrices, the product of two matrices shows the result of doing one transformation followed by another
(from right to left), and if the transformations are done in reverse order the result is usually
different. Thus, the product of two matrices depends on the order of multiplication; if S and T
are square matrices (matrices with the same number of rows as columns) of the same size,
then ST and TS are rarely equal. The matrix for a given transformation is found using
coordinates. For example, in two dimensions a linear transformation T can be completely
determined simply by knowing its effect on any two vectors v and w that have different
directions. Their transformations T(v) and T(w) are given by two coordinates; therefore, only
four coordinates, two for T(v) and two for T(w), are needed to specify T. These four
coordinates are arranged in a 2-by-2 matrix. In three dimensions three vectors u, v, and w are
needed, and to specify T(u), T(v), and T(w) one needs three coordinates for each. This results
in a 3-by-3 matrix.
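A sketch of both points in Python with NumPy (the particular transformations, a 90-degree rotation and a horizontal stretch, are illustrative choices): the images of two basis vectors become the columns of the matrix, and the two orders of multiplication give different products:

import numpy as np

def rotate_90(x):
    # A 90-degree counterclockwise rotation in the plane.
    return np.array([-x[1], x[0]])

# The matrix for T is built from its effect on two vectors with
# different directions; here, the standard basis e1 = (1, 0), e2 = (0, 1).
# T(e1) and T(e2) supply the two columns: four coordinates in all.
e1 = np.array([1.0, 0.0])
e2 = np.array([0.0, 1.0])
T = np.column_stack([rotate_90(e1), rotate_90(e2)])
print(T)        # [[ 0. -1.]
                #  [ 1.  0.]]

# ST means "apply T first, then S" (right to left), and order matters:
S = np.array([[2.0, 0.0],   # stretch the x-direction by 2
              [0.0, 1.0]])
print(S @ T)    # [[ 0. -2.]
                #  [ 1.  0.]]
print(T @ S)    # [[ 0. -1.]
                #  [ 2.  0.]]   not equal to S @ T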
Eigenvectors
When studying linear transformations, it is extremely useful to find nonzero vectors whose
direction is left unchanged by the transformation. These are called eigenvectors (also known
as characteristic vectors). If v is an eigenvector for the linear transformation T, then T(v) = λv
for some scalar λ. This scalar is called an eigenvalue. The eigenvalue of greatest absolute value, along with its associated eigenvector, has special significance for many physical applications. This is because whatever process the linear transformation represents often acts repeatedly, feeding the output of each application back in as the input of the next, which results in almost every (nonzero) starting vector converging on the eigenvector associated with the largest eigenvalue, rescaled at each step by a power of the eigenvalue. In other words, the long-term behaviour of the system is determined by its
eigenvectors.
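This repeated-application behaviour is exactly what the power-iteration method exploits. A minimal sketch in Python with NumPy (the matrix and starting vector are illustrative choices, not from the article):

import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])    # eigenvalues 3 and 1

v = np.array([1.0, 0.0])      # almost any nonzero starting vector works
for _ in range(50):
    v = A @ v                  # feed the output back into the transformation
    v /= np.linalg.norm(v)     # rescale to undo the growth by the eigenvalue

# v converges to the eigenvector for the eigenvalue of greatest
# absolute value, and the Rayleigh quotient recovers that eigenvalue.
print(v)          # ~[0.7071 0.7071]
print(v @ A @ v)  # ~3.0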
Finding the eigenvectors and eigenvalues for a linear transformation is often done using
matrix algebra, first developed in the mid-19th century by the English mathematician Arthur
Cayley. His work formed the foundation for modern linear algebra.