Lecture3_LinearOperators
Linear operators
Linear algebra reminder
Dual spaces
• Let’s introduce an antilinear functional $f$ on a vector space $V$: a map $f: V \to \mathbb{C}$ satisfying $f(\alpha u + \beta v) = \alpha^* f(u) + \beta^* f(v)$. The set of all such functionals forms the dual space $V^*$.
“Complete” inner-product vector spaces are known as “Hilbert” spaces. Such spaces are self-dual: $V$ and $V^*$ are
actually the same space from the mathematical point of view.
• The inner product provides the basic example: for a fixed vector $v$, the map $u \mapsto (u, v)$ is an antilinear functional. The Riesz representation theorem states that in a Hilbert space every such (continuous) functional can be expressed as the inner product with some vector from the space itself.
Bra-ket notation
• Dirac’s bra-ket notation is mathematically meaningful if we identify the kets $|v\rangle$ with
the elements of the dual space of antilinear functionals, while the bras $\langle u|$ are
identified with the vectors of the vector space $V$ itself.
• In other words, the object $\langle u|v\rangle$ is a number (generally complex) that is the result of the
action of the functional $|v\rangle$ from the dual space on the vector $u$ from the space $V$.
• If the representation theorem holds, then there is a one-to-one
correspondence between the elements of $V$ and $V^*$, and the object $\langle u|v\rangle$ is the same as the
inner product $(u, v)$.
• Physicists almost always ignore that distinction: both bras and
kets are identified with vectors from $V$, while $\langle u|v\rangle$ is identified with the inner
product $(u, v)$, $v$ being the vector whose existence is assured by the representation
theorem. I will (hopefully) make clear when the distinction is important,
when it is not, and how to make it.
Examples of Dual spaces
• For the 3D coordinate space the Riesz theorem holds, so the dual space is also 3-
dimensional. If we represent a vector as a column of 3 numbers,
it is customary to represent the corresponding vector in the dual
space as a row of (complex-conjugated) numbers.
• Further, we can view the inner product as a product of a vector
from the dual space with a vector from the vector space, and use matrix
multiplication rules to perform the calculation. The space: columns.
The dual space: rows (conjugated).
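The row-times-column rule above can be checked numerically. A minimal sketch with numpy (the specific vectors are hypothetical examples, not from the lecture):

```python
import numpy as np

# A vector in 3D complex space, represented as a column of 3 numbers.
v = np.array([[1.0 + 2.0j], [0.0], [3.0j]])   # shape (3, 1): column
u = np.array([[2.0], [1.0j], [1.0]])          # another column vector

# The corresponding dual-space vector: a row of conjugated components.
u_dual = u.conj().T                           # shape (1, 3): row

# The inner product (u, v) is then ordinary matrix multiplication
# of the row (dual vector) with the column (vector).
inner = (u_dual @ v).item()

# Same number via numpy's built-in vdot, which conjugates its first argument.
assert np.isclose(inner, np.vdot(u, v))
```

Note that the dual vector is the conjugate transpose, so $(u, v) = (v, u)^*$ holds automatically under matrix multiplication.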
Linear Operators
• An operator is a rule (mapping) that transforms one vector into another, $\hat{A}: V \to W$ (not necessarily the same vector space). A linear operator satisfies $\hat{A}(\alpha u + \beta v) = \alpha \hat{A}u + \beta \hat{A}v$.
The sum and the product of operators are defined by their action: $(\hat{A}+\hat{B})v = \hat{A}v + \hat{B}v$ and $(\hat{A}\hat{B})v = \hat{A}(\hat{B}v)$.
Let us also define the inverse operator: $\hat{A}^{-1}\hat{A} = \hat{A}\hat{A}^{-1} = \hat{I}$, where $\hat{I}$ is the identity operator ($\hat{I}v = v$).
Note 1: the inverse does not always exist for a given operator…
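Both the definition of the inverse and Note 1 are easy to illustrate numerically; the matrices below are hypothetical examples:

```python
import numpy as np

# An invertible operator on a 2D space (hypothetical example matrix).
A = np.array([[2.0, 1.0],
              [1.0, 1.0]])
A_inv = np.linalg.inv(A)

# By definition, A^{-1} A = A A^{-1} = I, the identity operator.
I = np.eye(2)
assert np.allclose(A_inv @ A, I)
assert np.allclose(A @ A_inv, I)

# Note 1 in action: a projection onto the x-axis loses information,
# so it has no inverse -- its matrix is singular.
P = np.array([[1.0, 0.0],
              [0.0, 0.0]])
try:
    np.linalg.inv(P)
except np.linalg.LinAlgError:
    pass  # the inverse does not exist
```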
Note 2: You might have noted that all linear operators defined on a given vector space in turn form a linear space
themselves... With some additional properties (a product of elements), such a space is called an algebra.
Note 3: nowhere has commutativity of operators been mentioned; in general two linear operators do not commute: $\hat{A}\hat{B} \neq \hat{B}\hat{A}$.
The operator that tells “how badly” two operators fail to commute is called the commutator: $[\hat{A},\hat{B}] = \hat{A}\hat{B} - \hat{B}\hat{A}$.
Properties of commutators:
$[\hat{A},\hat{B}] = -[\hat{B},\hat{A}]$ (antisymmetry)
$[\hat{A},\hat{B}+\hat{C}] = [\hat{A},\hat{B}] + [\hat{A},\hat{C}]$ (linearity)
$[\hat{A},\hat{B}\hat{C}] = [\hat{A},\hat{B}]\hat{C} + \hat{B}[\hat{A},\hat{C}]$ (product rule)
$[\hat{A},[\hat{B},\hat{C}]] + [\hat{B},[\hat{C},\hat{A}]] + [\hat{C},[\hat{A},\hat{B}]] = 0$ (Jacobi identity)
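These properties can be verified on a concrete non-commuting pair. A sketch using the Pauli matrices as the example operators (a standard choice, not taken from this lecture):

```python
import numpy as np

def commutator(A, B):
    """[A, B] = AB - BA: measures how badly A and B fail to commute."""
    return A @ B - B @ A

# Pauli matrices: a standard pair of non-commuting operators.
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sy = np.array([[0, -1j], [1j, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)

# They do not commute: [sx, sy] = 2i*sz.
assert np.allclose(commutator(sx, sy), 2j * sz)

# Antisymmetry: [A, B] = -[B, A].
assert np.allclose(commutator(sy, sx), -commutator(sx, sy))

# Product rule: [A, BC] = [A, B]C + B[A, C].
lhs = commutator(sx, sy @ sz)
rhs = commutator(sx, sy) @ sz + sy @ commutator(sx, sz)
assert np.allclose(lhs, rhs)
```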
“Components” of the operator
(Matrix representation)
• Let’s consider $\hat{A}$ acting on an arbitrary vector $v = \sum_j v_j e_j$ expanded over a basis $\{e_j\}$: by linearity, $\hat{A}v = \sum_j v_j (\hat{A}e_j)$.
So to define a linear operator, we only need to know what it does to the basis vectors; then we
know how it acts on the entire vector space.
Expanding the transformed basis vectors over the same basis, $\hat{A}e_j = \sum_i A_{ij} e_i$, defines the matrix element of
the operator: $A_{ij} = \langle e_i|\hat{A}|e_j\rangle$ (for an orthonormal basis).
Thus for any vector now: $(\hat{A}v)_i = \sum_j A_{ij} v_j$.
The ket vector $\hat{A}v$ can be found by multiplying the matrix representing the operator onto
the column of components representing the vector.
The sum and the product of operators correspond to the sum and the product of the corresponding
matrices (show this!).
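A numerical check of these correspondences, with two hypothetical example matrices in the standard basis (column $j$ holds the components of $\hat{A}e_j$):

```python
import numpy as np

# Two hypothetical operators on a 3D space, given by their matrices A_ij.
A = np.array([[1.0, 2.0, 0.0],
              [0.0, 1.0, 1.0],
              [3.0, 0.0, 1.0]])
B = np.array([[0.0, 1.0, 0.0],
              [1.0, 0.0, 0.0],
              [0.0, 0.0, 2.0]])
v = np.array([1.0, -1.0, 2.0])

# (A + B)v = Av + Bv: the sum of operators is the sum of matrices.
assert np.allclose((A + B) @ v, A @ v + B @ v)

# (AB)v = A(Bv): the product of operators is the product of matrices.
assert np.allclose((A @ B) @ v, A @ (B @ v))

# The matrix element A_ij can be read off as e_i . (A e_j):
e = np.eye(3)
assert np.isclose(e[0] @ (A @ e[1]), A[0, 1])
```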
Examples:
• Rotation of a vector in 2D by an angle $\phi$:
$R(\phi) = \begin{pmatrix} \cos\phi & -\sin\phi \\ \sin\phi & \cos\phi \end{pmatrix}$
• Is that a linear operator?
• Eigenproblem: $\hat{A}v = \lambda v$, where the number $\lambda$ is the eigenvalue and $v \neq 0$ is the eigenvector.
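The rotation example and its eigenproblem can be explored numerically. A sketch (the angle and test vectors are arbitrary choices); note that the rotation has no real eigenvectors, but it does have complex ones, with eigenvalues $e^{\pm i\phi}$:

```python
import numpy as np

phi = 0.3  # an arbitrary rotation angle

# 2D rotation matrix R(phi).
R = np.array([[np.cos(phi), -np.sin(phi)],
              [np.sin(phi),  np.cos(phi)]])

# Linearity check: R(a*u + b*v) = a*R(u) + b*R(v).
u, v = np.array([1.0, 2.0]), np.array([-1.0, 0.5])
a, b = 2.0, -3.0
assert np.allclose(R @ (a * u + b * v), a * (R @ u) + b * (R @ v))

# Eigenproblem R w = lam w: every real vector gets rotated, so the
# eigenvectors are complex; the eigenvalues are e^{+i phi} and e^{-i phi}.
lam, w = np.linalg.eig(R)
for i in range(2):
    assert np.allclose(R @ w[:, i], lam[i] * w[:, i])
expected = np.array([np.exp(-1j * phi), np.exp(1j * phi)])
assert np.allclose(np.sort_complex(lam), np.sort_complex(expected))
```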