Understanding Rank 2 Tensors

A Rank 2 tensor, also called a matrix, holds the magnitude at the intersection point of two directions. It represents a 2D space, such as a 3x3 pixel image, where each pixel is identified by its row and column indices. Higher ranked tensors allow the representation of more complex structures by adding more indices to uniquely identify each element.

Welcome back.

Rank 0 tensors gave us the ability to look at magnitudes.


Rank 1 tensors gave us the ability to look
at magnitudes for features.
It allowed us to have a set
of values to describe our object.
But now that we have that
what does going up another rank get us?
Do you remember that I said,
"Lists are a good way to think of rank?"
Each time we go up a rank
we get a list of the previous rank to help us better
describe our object.
A Rank 2 tensor is also called a matrix,
which is a combination of vectors.
It holds the magnitude
at the intersection point of two directions.
One example of this is a black and white image.
Looking at this square,
we can see it as 3 pixels by 3 pixels,
which means its shape is 3 by 3,
represented as (3, 3).
We now have a list of lists.
The image itself measures 2 distinct directions.
First is the outer list position
then the inner list position
represented as rows and columns, respectively.
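The list-of-lists idea can be sketched in plain Python. This is just an illustration with made-up pixel values, not code from the course:

```python
# A 3 by 3 black and white image as a list of lists.
# The outer list position is the row,
# the inner list position is the column.
image = [
    [0, 255, 0],
    [255, 0, 255],
    [0, 255, 0],
]

rows = len(image)      # length of the outer list
cols = len(image[0])   # length of each inner list
print((rows, cols))    # the shape: (3, 3)
```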
The image represents the various points
in space next to each other,
and that nearness actually conveys part
of the meaning for the data.
The TensorFlow documentation says that the rank
of a tensor is the number of indices required
to uniquely select each element of the tensor.
This applies to our lower ranked tensors as well,
but it becomes easier to demonstrate here.
To find the value of the bottom right pixel,
we have to specify both the row and the column.
The value at row 3, column 3 is the brightness,
or magnitude, of the pixel specified
by the combination of the row and column vectors.
Importantly, you need to specify both
of the directions to find the magnitude of this point.
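Here is a quick sketch of that lookup, again with illustrative values. Note that "row 3, column 3" in everyday counting becomes index `[2][2]` in Python's zero-based indexing:

```python
image = [
    [10, 20, 30],
    [40, 50, 60],
    [70, 80, 90],
]

# Row 3, column 3 in everyday counting is [2][2]
# in zero-based indexing: both directions are required.
bottom_right = image[2][2]
print(bottom_right)  # 90
```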
With a Rank 0 tensor, there are no indices to traverse.
You can identify every value in the tensor uniquely
with no movement and no direction at all.
With a Rank 1 tensor, you have to travel along 1 axis,
which I'll call A, to find each value.
This translates to the features
which would be A1, A2, A3, et cetera.
At Rank 2, you need to traverse 2 directions
which we'll call A and B for the row and column.
This gives us A1B1, A1B2, and A1B3 for the top row,
then A2B1, A2B2, and A2B3 for the second row and so on.
Rank is all about how you keep track
of your position within the overall tensor
structure as you're accessing specific data.
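The "number of indices" definition of rank can be shown directly. A sketch, using plain Python values to stand in for tensors:

```python
rank0 = 7                  # no indices needed to reach the value
rank1 = [7, 8, 9]          # one index: rank1[i]
rank2 = [[1, 2], [3, 4]]   # two indices: rank2[i][j]

# Mapping the lesson's A/B labels to zero-based indices:
# A1 -> rank1[0], A2 -> rank1[1], ...
# A1B1 -> rank2[0][0], A2B1 -> rank2[1][0], and so on.
print(rank1[1])     # 8
print(rank2[1][0])  # 3
```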
That was pretty heavy
so let's return to our virtual world to help it make sense.
Our Rank 2 tensor is a plane containing
various magnitudes at each point.
Here we have a Rank 2 tensor shaped
as 3 by 3.
I'm again using the number of blocks to show the magnitude.
We have 9 different colors each for a different feature
of the object we're trying to represent.
Each feature has its own unique magnitude,
but importantly, it also has neighbors.
These neighbors can provide context but don't have to.
A Rank 1 tensor also has neighbors.
In the case of our bicycle example, it doesn't matter
if price is the first or last component in the vector.
However, if we were using a Rank 1 tensor
to represent a sequence such as measurements taken one
after another, then that context matters.
We'll talk a lot more about context
when we look at convolutions.
Looking back to our Rank 1 tensor, a Rank 2 looks a lot
like a list of Rank 1's, and that's pretty accurate.
Importantly, this Rank 2 tensor is not a list
of Rank 1 tensors that describe different objects.
This is a list of Rank 1 tensors
that all describe the same object.
We'll see lists of same shaped tensors
that describe different objects later
when we talk about batches.
Did you catch that?
This will mean that collecting our data
into batches will increase our rank by 1.
This will become really important
as we're building and debugging models.
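The batching idea can be sketched the same way. Assume two small Rank 2 "images" with made-up values; collecting them into a batch produces a Rank 3 structure:

```python
# Two separate 2 by 2 "images": each is Rank 2.
img_a = [[1, 2], [3, 4]]
img_b = [[5, 6], [7, 8]]

# Collecting them into a batch gives a Rank 3 structure:
# one extra index is now needed to pick an element,
# so the rank has gone up by 1.
batch = [img_a, img_b]

# Indices are now [which image][row][column].
print(batch[1][0][1])  # 6
```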
We're starting to see a pattern emerge here.
In the next lesson,
we'll explore Rank 3 tensors to see
if our pattern continues.
