
Computer Organization

Chapter 5
Large and Fast: Exploiting Memory Hierarchy

Reading:
Textbook Section 5.1-5.5, 5.8
Basics of Cache
Cache was the name chosen to represent the level of the
memory hierarchy between the processor and main memory
in the first commercial computer to have this extra level
Small and fast memory
Stores the subset of instructions and data currently being accessed
Used to reduce average access time to memory
Caches exploit temporal locality by keeping recently accessed data
closer to the processor
Caches exploit spatial locality by moving blocks consisting of multiple
contiguous words closer to the processor
The goal is to achieve
Memory access speed close to that of the cache
While balancing the cost of the overall memory system
Basics of Cache
Example: Request data item Xn
A very simple cache in which each processor request is one word and the blocks also consist of a single word
How do we know whether a data item is in the cache?
How do we find it?
If each word can go to exactly one place in the cache, then it is easy to find the word
Direct Mapped Cache
Direct-mapped cache: a cache structure in which
each memory location is mapped to exactly one
location in the cache.
Assign the cache location based on the address of the
word in memory
Typical mapping between block addresses and cache
locations for direct-mapped cache:
Direct Mapped Cache
If the number of blocks in the cache is a power of 2, we can
directly use the low-order log2(number of blocks in cache) bits
of the block address to find a block in cache
In this case, the lower 3 bits of the block address are used to find a block in cache:
1ten (00001two) maps to 1ten (001two)
29ten (11101two) maps to 5ten (101two)
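The mapping above can be sketched in Python (a hypothetical helper, not from the slides, assuming the block address fits in the machine word):

```python
def cache_index(block_addr, num_blocks):
    """Low-order log2(num_blocks) bits of the block address select the cache block."""
    assert num_blocks & (num_blocks - 1) == 0, "number of blocks must be a power of 2"
    return block_addr % num_blocks  # same as keeping the low-order bits

# 8-block cache: the index is the lower 3 bits of the block address
print(cache_index(1, 8))   # 1  (00001 -> 001)
print(cache_index(29, 8))  # 5  (11101 -> 101)
```

Because the block count is a power of 2, the modulo is just a bit mask, which is why hardware can compute the index with no arithmetic.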
Tags
How do we know which particular block is stored in
a cache location?
Tag: A field in a table used for a memory hierarchy that
contains the address information required to identify
whether the associated block in the hierarchy corresponds
to a requested word
Only the upper portion of the address is needed
We need only the upper 2 of the 5 address bits in the tag
The lower 3-bit index field of the address selects the block in cache
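For the 5-bit addresses in this example, the tag/index split can be sketched as follows (a hypothetical helper for illustration):

```python
def split_address(block_addr, index_bits=3):
    """Split a block address into (tag, index): low bits select the block, high bits are the tag."""
    index = block_addr & ((1 << index_bits) - 1)  # lower 3 bits
    tag = block_addr >> index_bits                # remaining upper bits
    return tag, index

print(split_address(0b10110))  # (2, 6): tag 10two, index 110two
print(split_address(0b11101))  # (3, 5): tag 11two, index 101two
```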
Valid Bits
What if there is no data in a location? We also need
a way to recognize that a cache block does not have
valid information
Valid bit: a field in the tables of a memory
hierarchy that indicates that the associated block
in the hierarchy contains valid data
1 = present
0 = not present
Initially 0
Cache Example
8-blocks, 1 word per block, direct mapped
Initial state

Index V Tag Data
000 N
001 N
010 N
011 N
100 N
101 N
110 N
111 N
Cache Example
Word addr Binary addr Hit/miss Cache block
22 10 110 Miss 110

Index V Tag Data
000 N
001 N
010 N
011 N
100 N
101 N
110 Y 10 Mem[10110]
111 N
Cache Example
Word addr Binary addr Hit/miss Cache block
26 11 010 Miss 010

Index V Tag Data
000 N
001 N
010 Y 11 Mem[11010]
011 N
100 N
101 N
110 Y 10 Mem[10110]
111 N
Cache Example
Word addr Binary addr Hit/miss Cache block
22 10 110 Hit 110
26 11 010 Hit 010

Index V Tag Data
000 N
001 N
010 Y 11 Mem[11010]
011 N
100 N
101 N
110 Y 10 Mem[10110]
111 N
Cache Example
Word addr Binary addr Hit/miss Cache block
16 10 000 Miss 000
3 00 011 Miss 011
16 10 000 Hit 000
Index V Tag Data
000 Y 10 Mem[10000]
001 N
010 Y 11 Mem[11010]
011 Y 00 Mem[00011]
100 N
101 N
110 Y 10 Mem[10110]
111 N
Cache Example
Word addr Binary addr Hit/miss Cache block
18 10 010 Miss 010

Index V Tag Data
000 Y 10 Mem[10000]
001 N
010 Y 10 Mem[10010]
011 Y 00 Mem[00011]
100 N
101 N
110 Y 10 Mem[10110]
111 N
Cache Example
Summary of eight memory references to an empty
eight-block cache
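The eight-reference walkthrough can be reproduced with a small simulator (a sketch under the slides' assumptions: 5-bit word addresses, an 8-block direct-mapped cache, one word per block; the function name is made up for illustration):

```python
def simulate(addresses, num_blocks=8, index_bits=3):
    """Direct-mapped cache, one word per block: index = low bits, tag = high bits."""
    cache = [None] * num_blocks  # each entry holds a tag once valid, else None
    results = []
    for addr in addresses:
        index = addr & (num_blocks - 1)   # low-order 3 bits of the address
        tag = addr >> index_bits          # remaining upper bits
        if cache[index] == tag:
            results.append("Hit")
        else:
            results.append("Miss")
            cache[index] = tag            # allocate the block on a miss
    return results

refs = [22, 26, 22, 26, 16, 3, 16, 18]
print(simulate(refs))
# ['Miss', 'Miss', 'Hit', 'Hit', 'Miss', 'Miss', 'Hit', 'Miss']
```

Note the final reference (18, binary 10 010) evicts address 26 from block 010 because both map to the same index but carry different tags.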
Cache Example with 64-bit address

A referenced address is
divided into
A tag field, which is
used to compare with
the value of the tag
field of the cache
A cache index, which is
used to select the block
Cache Example with 64-bit address
The total number of bits needed for a cache is a
function of the cache size and the address size,
because the cache includes both the storage for the
data and the tags
Three parts of cache:
Data
Tags
Valid bits
Cache Example with 64-bit address
For the following situation:
64-bit addresses
A direct-mapped cache
The cache size is 2^n blocks, so n bits are used for the index
The block size is 2^m words (2^(m+2) bytes), so m bits are used
for the word within the block, and two bits are used for
the byte part of the address
The size of the tag field is 64 - (n + m + 2) bits


Cache Example with 64-bit address
The total number of bits in a direct-mapped cache
is:
2^n × (block size + tag size + valid field size)
Block size is 2^m words (2^m × 32 bits)
The number of bits in such a cache is
2^n × (2^m × 32 + (64 - n - m - 2) + 1)
= 2^n × (2^m × 32 + 63 - n - m)
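This formula can be checked with a short helper (a sketch; the function name is hypothetical):

```python
def total_cache_bits(n, m, addr_bits=64):
    """Total bits in a direct-mapped cache with 2^n blocks of 2^m 32-bit words each."""
    block_bits = (2 ** m) * 32               # data storage per block
    tag_bits = addr_bits - n - m - 2         # upper address bits kept as the tag
    valid_bits = 1                           # one valid bit per block
    return (2 ** n) * (block_bits + tag_bits + valid_bits)

# 16 KiB of data in 4-word blocks with 64-bit addresses: n = 10, m = 2
print(total_cache_bits(10, 2))  # 183296
```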
Example: Compute Bits in Cache
How many total bits are required for a direct-
mapped cache with 16 KiB of data and 4-word
blocks, assuming a 64-bit address?
16 KiB = 4096 (2^12) words
Block size = 4 (2^2) words
1024 (2^10) blocks
Size of the tag field is 64 - 10 - 2 - 2 = 50 bits
Complete cache size is:
2^10 × (4 × 32 + (64 - 10 - 2 - 2) + 1)
= 2^10 × 179 = 183,296 bits
1 KiB = 1 kibibyte = 2^10 bytes
