
CPSC 313: Computer Hardware

and Operating Systems


Unit 3: Caching
Eviction Policies

CPSC 313 1
Administration
• Quizzes:
o Quiz 2 is done (thank you!)
o Quiz 2 retakes still ongoing
o We have one more week than usual before Quiz 3!
• Labs:
o Lab 5: Due soon. It is big! If you have not made substantial progress already,
start now, and be realistic about your goals!
o Lab 6: Out shortly

CPSC 313 2
Today
• Learning Objectives:
  o Analyze different cache eviction policies:
    - Belady’s algorithm
    - Least recently used (LRU)
    - Least frequently used (LFU)
• Reading: Section 6.4

CPSC 313 3
A Short Digression
When we say a cache size is 32 KBytes:
1. What data is included in the 32 KByte cache size?
2. How big is a KByte?

CPSC 313 4
Digression 1: What does the cache size include?
Our teeny-tiny cache has:
• 8-byte cache lines
• 4 entries
• Total size: 32 bytes
Although we have to store the TAG value for each line, it is not counted as part of the (useful) cache size (a concrete worked example follows the diagram below).

[Diagram: the cache drawn as a 4 × 8 grid: rows Line 0 through Line 3, columns byte offsets 0 through 7 within each 8-byte line]
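To make that concrete (the address width and the mapping are assumptions for illustration, not given on the slide): with 16-bit addresses and a direct-mapped organization, each 8-byte line needs 3 offset bits and the 4 entries need 2 index bits, leaving 16 - 3 - 2 = 11 tag bits per line. The hardware would therefore store the 32 bytes of data plus 4 × 11 = 44 tag bits (and valid bits), but only the 32 bytes of data count toward the quoted cache size.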

CPSC 313 5
Digression: How much is a Kbyte?
(Or how Margo, Patrice, Norm, Steve, … are dinosaurs)
So is a KByte 1000 or 1024 bytes?
• The official ISO standard says:
  o 1 kilobyte (KB) = 1000 bytes
  o 1 kibibyte (KiB) = 1024 bytes

Historically, in most of computer science, a KByte was 1024 bytes. We will be using that definition in this course.

Is Reto a dinosaur?
CPSC 313 6
He seems fancier and newer 😁
So in this course KByte, MByte, and GByte are the power-of-2 versions of those units. For example, a 32 KByte cache holds 32 × 1024 = 32,768 bytes of data.

The “official” name for the power-of-2 unit is the kibibyte, or KiB (“kilo binary byte”).

(Not so for GHz or GIPS, which stay powers of 10. End digression.)
CPSC 313 7
Associativity Summary

A k-way set associative cache with n total slots for lines:
o Has n/k sets, each with k slots
o Each cache line in memory can go in one and only one set (see the index-computation sketch below)
o That means:
  - A direct-mapped cache is "1-way set associative"!
  - Each set is effectively "its own fully-associative cache" for "its own part of memory"
  - If a set is full, we need a way to decide what to evict when we bring a new line in
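A minimal C sketch of the mapping described above (the function and parameter names are mine, not from the slides):

#include <stdio.h>

/* n_slots total line slots, k ways per set => n_slots/k sets. The set index
 * comes from the line's block number, so every memory line maps to exactly
 * one set; within that set it may occupy any of the k slots. */
unsigned set_index(unsigned long addr, unsigned line_size,
                   unsigned n_slots, unsigned k) {
    unsigned n_sets = n_slots / k;
    unsigned long block = addr / line_size;   /* which memory block (line) */
    return (unsigned)(block % n_sets);
}

int main(void) {
    /* The teeny-tiny cache from the earlier slide: 8-byte lines, 4 slots. */
    printf("direct-mapped (k=1): address 0x28 -> set %u\n",
           set_index(0x28, 8, 4, 1));   /* block 5 of memory, 4 sets -> set 1 */
    printf("fully assoc.  (k=4): address 0x28 -> set %u\n",
           set_index(0x28, 8, 4, 4));   /* 1 set -> set 0 */
    return 0;
}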

CPSC 313 8
What happens when a set fills up?
• Each set has a limited capacity.
• When the application wants another item that belongs in a full set…
• Caching the new item requires evicting some other item.
• What item do I evict?
• We need an eviction policy (or replacement policy)
• Why didn’t we need one for direct-mapped caches?
• The options available differ between hardware and software caches.

CPSC 313 9
Eviction in Hardware
• In hardware caches, you frequently have only a few choices:
  o Direct mapped: exactly 1 choice!
  o N-way set associative (2, 4, 8, … or even 3, 5, 6, 7, …!): N choices
• It is typically too expensive to build fancy replacement algorithms into HW caches, so we often use a random replacement policy (sketched below).
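A rough sketch of how little work random replacement needs (real hardware would use a cheap pseudo-random source such as a small LFSR rather than a library call; this C fragment is just for illustration):

#include <stdlib.h>

/* Pick a victim uniformly at random among the k ways of a set. */
int random_victim(int k) {
    return rand() % k;
}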

CPSC 313 10
Eviction in Software
• In a perfect world, we’d like to evict the item that is least valuable.
• In the real world, we don’t know what that item is.
• Practically all eviction policies used by software caches try to approximate this ideal. Examples:
  o LRU: Least-Recently-Used – find the item that has been unused the longest and get rid of that.
  o LFU: Least-Frequently-Used – find the item that has been used least frequently and get rid of that.
  o Clock: used in virtual memory systems to approximate LRU (you’ll learn more about that later this semester).
  o Something tuned to known access patterns.
  o Cool live-updating, real-access-pattern-based, learned policies (not in HW!)
CPSC 313 11
An Ideal Eviction Policy: Belady’s Algorithm
• Always evict the item that we use farthest in the future.
• Assume we have 2 slots in our cache.
• Assume fully associative
• We access cache blocks in the following order:
A, A, C, A, C, B, B, B, A, A, B, C

Cache so far: A, C
Compulsory misses: I I
Capacity misses:
Hits: I I I
CPSC 313 12
An Ideal Eviction Policy: Belady’s Algorithm
• Always evict the item that we use farthest in the future.
• Assume we have 2 slots in our cache.
• Assume fully associative
• We access cache blocks in the following order:
A, A, C, A, C, B, B, B, A, A, B, C

Cache at the end: A, C (B evicted on the final access)
Compulsory misses: I I I
Capacity misses: I
Hits: I I I I I I I I
CPSC 313 13
An Ideal Eviction Policy: Belady’s Algorithm
• Always evict the item that we use farthest in the future.
• Assume we have 2 slots in our cache.
• Assume fully associative
• We access cache blocks in the following order:
A, A, C, A, C, B, B, B, A, A, B, C
Hit rate: 8/12
Cache at the end: A, C
Compulsory misses: I I I
Capacity misses: I
Hits: I I I I I I I I
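A minimal C sketch (mine, not from the slides) that replays this exact trace through a 2-slot fully associative cache with Belady’s victim choice; it reproduces the tallies above (8 hits, 3 compulsory misses, 1 capacity miss):

#include <stdio.h>
#include <string.h>

#define SLOTS 2

int main(void) {
    const char trace[] = "AACACBBBAABC";   /* the access sequence from the slide */
    int n = (int)strlen(trace);
    char cache[SLOTS];                     /* only cache[0..used-1] are valid */
    int used = 0, hits = 0, compulsory = 0, capacity = 0;
    int seen[256] = {0};                   /* blocks that have ever been loaded */

    for (int i = 0; i < n; i++) {
        char blk = trace[i];
        int hit = 0;
        for (int s = 0; s < used; s++)
            if (cache[s] == blk) { hit = 1; break; }
        if (hit) { hits++; continue; }

        if (seen[(unsigned char)blk]) capacity++; else compulsory++;
        seen[(unsigned char)blk] = 1;

        if (used < SLOTS) {                /* free slot: no eviction needed */
            cache[used++] = blk;
            continue;
        }
        /* Belady: evict the resident block whose next use is farthest in the
         * future (or that is never used again). */
        int victim = 0, farthest = -1;
        for (int s = 0; s < SLOTS; s++) {
            int next = n;                  /* n means "never used again" */
            for (int j = i + 1; j < n; j++)
                if (trace[j] == cache[s]) { next = j; break; }
            if (next > farthest) { farthest = next; victim = s; }
        }
        cache[victim] = blk;
    }
    printf("hits=%d compulsory=%d capacity=%d hit rate=%d/%d\n",
           hits, compulsory, capacity, hits, n);
    return 0;
}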
CPSC 313 14
Approximating Belady: LRU
• Evict the item that was used farthest in the past.
• Assume we have 2 slots in our cache.
• Assume fully associative
• We access cache blocks in the following order:
A, A, C, A, C, B, B, B, A, A, B, C

Cache so far: A, C
Compulsory misses: I I
Capacity misses:
Hits: I I I
CPSC 313 15
Approximating Belady: LRU
• Evict the item that was used farthest in the past.
• Assume we have 2 slots in our cache.
• Assume fully associative
• We access cache blocks in the following order:
A, A, C, A, C, B, B, B, A, A, B, C

Cache so far: B, C (A evicted)
Compulsory misses: I I I
Capacity misses:
Hits: I I I I I
CPSC 313 16
Approximating Belady: LRU
• Evict the item that was used farthest in the past.
• Assume we have 2 slots in our cache.
• Assume fully associative
• We access cache blocks in the following order:
A, A, C, A, C, B, B, B, A, A, B, C
Hit rate: 7/12
Cache at the end: B, C (A evicted on the final access)
Compulsory misses: I I I
Capacity misses: I I
Hits: I I I I I I I
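The same little simulator sketched under Belady’s algorithm can model LRU by changing only the victim-selection step. A possible C sketch (it assumes the simulator also keeps last_use[s], the trace index of the most recent access to the block in slot s, set when the slot is filled and refreshed on every hit; that array is my addition, not something from the slides):

/* LRU: return the slot whose block was touched farthest in the past. */
int lru_victim(const int last_use[], int slots) {
    int victim = 0;
    for (int s = 1; s < slots; s++)
        if (last_use[s] < last_use[victim])
            victim = s;
    return victim;
}

Running the trace above this way reproduces the tallies on this slide: 7 hits, 3 compulsory misses, 2 capacity misses.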
CPSC 313 17
Least Frequently Used: LFU
• LFU = Least frequently used: Keep track of how many times each item is used and replace the one with the smallest frequency count.
• Assume we have 2 slots in our cache.
• Assume fully associative
• We access cache blocks in the following order:
A, A, C, A, C, B, B, B, A, A, B, C

Compulsory misses:
Capacity misses:
Hits:
CPSC 313 18
Least Frequently Used: LFU
 LFU = Least frequently used: Keep track of how many times each
item is used and replace the one with the smallest frequency count.

A, A, C, A, C, B, B, B, A, A, B, C

Cache so far: A(3), B(1) (C evicted)
Compulsory misses: I I I
Capacity misses:
Hits: I I I
CPSC 313 19
Least Frequently Used: LFU
• LFU = Least frequently used: Keep track of how many times each
item is used and replace the one with the smallest frequency count.

A, A, C, A, C, B, B, B, A, A, B, C

Cache so far: A (count 3 → 4 → 5), B (count 2 → 3 → 4)
Compulsory misses: I I I
Capacity misses:
Hits: I I I I I I I I
CPSC 313 20
Least Frequently Used: LFU
• LFU = Least frequently used: Keep track of how many times each
item is used and replace the one with the smallest frequency count.

A, A, C, A, C, B, B, B, A, A, B, C

Hit rate: 8/12


Cache at the end: A(5), C(1) (B evicted on the final access)
Compulsory misses: I I I
Capacity misses: I
Hits: I I I I I I I I
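Again only the victim-selection step of the earlier simulator sketch changes. A possible C sketch (it assumes the simulator keeps count[s], the number of accesses to the block currently in slot s, set to 1 when the slot is filled, incremented on every hit, and discarded when the block is evicted, matching the trace above; that array is my addition, not something from the slides):

/* LFU: return the slot whose block has the smallest frequency count. */
int lfu_victim(const int count[], int slots) {
    int victim = 0;
    for (int s = 1; s < slots; s++)
        if (count[s] < count[victim])
            victim = s;
    return victim;
}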
CPSC 313 21
There is no Perfect Policy
• LRU is frequently good
• LFU is sometimes better
• The amount of metadata space and work needed can be important!
• For anything but Belady, we can construct an adversarial workload
that can be pretty devastating. (And Belady requires knowing the future.)
• Assume a 2 slot, fully associative cache with LRU replacement
• Using only 3 different data items, produce an ordering whose hit rate is 0!

CPSC 313 22
Now you try!
• As always, work together! Talk, learn, get help!

CPSC 313 23
