
Cache Memory Simulation:

Evaluating Replacement Policies


FIFO, LRU, and Random Replacement Policies
Darsh Rajput (12202130501011)
Harsh Patel (12202130501018)
G H Patel College of Engineering & Technology
Introduction
What is Cache Memory?
• Cache memory is a small, high-speed storage layer
between the CPU and main memory.
• It stores frequently accessed data to improve system
performance.
Objective of the Project:
• Simulate and evaluate different cache replacement
policies (FIFO, LRU, and Random).
• Understand the impact of these policies on cache
performance (hit and miss rates).
Cache Replacement Policies
FIFO (First-In-First-Out):
• Replaces the cache block that has been in the cache the longest.
• Simple to implement but may not always yield the best
performance.
LRU (Least Recently Used):
• Replaces the cache block that has not been accessed for the
longest time.
• Often provides better performance than FIFO.
Random Replacement:
• Randomly chooses a cache block to replace when a miss occurs.
• Simplest to implement, but may perform worse than FIFO
and LRU.
Problem Statement
• Goal: Simulate and evaluate the performance
of cache replacement policies.
• Metrics: Number of cache hits, cache misses,
and hit rate.
• Objective: Compare how each policy performs
under similar conditions.
Cache Simulation Logic
• Memory Access Simulation:
• Random memory addresses are generated for a
given number of accesses.
• The cache is checked for a hit or miss, and the
cache replacement policy is applied.
• Policies Handled:
• FIFO: Replaces the oldest block.
• LRU: Replaces the least recently used block.
• Random: Replaces a random block.
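The access loop above can be sketched as a small self-contained simulation. The sketch below uses a fully associative cache with the Random policy for brevity; `CACHE_SIZE`, `BLOCK_SIZE`, `MEM_RANGE`, and the name `simulate_random` are illustrative choices, not taken from the project code:

```c
#include <stdlib.h>

#define CACHE_SIZE   8
#define BLOCK_SIZE   4
#define NUM_ACCESSES 1000
#define MEM_RANGE    1024   /* addresses drawn from [0, MEM_RANGE) */

typedef struct { int valid; unsigned tag; } CacheBlock;

/* Runs the access loop with the Random policy and returns the hit count. */
int simulate_random(unsigned seed) {
    CacheBlock cache[CACHE_SIZE] = {0};
    int hits = 0;
    srand(seed);                            /* fixed seed: reproducible run */
    for (int n = 0; n < NUM_ACCESSES; n++) {
        unsigned tag = (unsigned)(rand() % MEM_RANGE) / BLOCK_SIZE;
        int hit = 0;
        for (int i = 0; i < CACHE_SIZE; i++)
            if (cache[i].valid && cache[i].tag == tag) { hit = 1; break; }
        if (hit) {
            hits++;
        } else {                            /* miss: evict a random block */
            int v = rand() % CACHE_SIZE;
            cache[v].valid = 1;
            cache[v].tag = tag;
        }
    }
    return hits;
}
```

Because the seed is fixed, repeated runs produce the same hit count, which makes policies easy to compare under identical access sequences.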
Cache Access Function
Cache Access Workflow:
1. Calculate block address from the memory
address.
2. Check for a cache hit: If the tag matches and
the block is valid, it's a hit.
3. Cache Miss: Use the chosen policy (FIFO, LRU,
Random) to replace the block.
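The three steps above can be sketched as one function for a small fully associative cache using LRU. The struct fields mirror the ones described later in the code overview; `CACHE_SIZE`, `BLOCK_SIZE`, and the name `cache_access` are assumptions for illustration:

```c
#define CACHE_SIZE 4
#define BLOCK_SIZE 4

typedef struct {
    int valid;        /* 1 if the block holds data */
    unsigned tag;     /* which memory block is stored */
    int last_used;    /* logical timestamp for LRU */
} CacheBlock;

/* Returns 1 on a hit, 0 on a miss; on a miss the victim is replaced. */
int cache_access(CacheBlock cache[], unsigned addr, int *time) {
    unsigned tag = addr / BLOCK_SIZE;       /* 1. block address from memory address */

    /* 2. hit check: a valid block with a matching tag */
    for (int i = 0; i < CACHE_SIZE; i++) {
        if (cache[i].valid && cache[i].tag == tag) {
            cache[i].last_used = (*time)++; /* refresh recency */
            return 1;
        }
    }

    /* 3. miss: fill an empty slot if any, otherwise evict the LRU block */
    int victim = -1;
    for (int i = 0; i < CACHE_SIZE; i++)
        if (!cache[i].valid) { victim = i; break; }
    if (victim < 0) {
        victim = 0;
        for (int i = 1; i < CACHE_SIZE; i++)
            if (cache[i].last_used < cache[victim].last_used)
                victim = i;
    }
    cache[victim].valid = 1;
    cache[victim].tag = tag;
    cache[victim].last_used = (*time)++;
    return 0;
}
```

Preferring an empty slot before evicting keeps cold-start misses from displacing freshly loaded blocks.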
Simulating FIFO, LRU, and Random Policies

• FIFO Policy:
• Replaces the block that was loaded into the cache
earliest, tracked by insertion order.
• Simple but may not be the most efficient.
• LRU Policy:
• Replaces the cache block that was accessed the least
recently.
• Requires tracking the time of last access.
• Random Policy:
• Randomly replaces a cache block, which may not always
be optimal.
Performance Evaluation

• Metrics to Evaluate:
Cache Hits: When the requested data is found in the cache.
Cache Misses: When the requested data is not in the cache.
Hit Rate: The percentage of hits out of the total memory
accesses.
Formula: Hit Rate = (Cache Hits / Total Accesses) × 100
Simulation Results:
Shows how the hit and miss rates differ for FIFO, LRU,
and Random.
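The hit-rate formula above can be expressed as a small helper; the name `hit_rate` is illustrative:

```c
/* Hit rate as a percentage of total accesses; guards against zero total. */
double hit_rate(int hits, int total) {
    return total > 0 ? 100.0 * hits / total : 0.0;
}
```

For the sample results on the next slide, `hit_rate(250, 1000)` gives 25% and `hit_rate(400, 1000)` gives 40%.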
Example Simulation Results
• Results for a Simulation with 1000 Accesses:
• FIFO:
• Hits: 250
• Misses: 750
• Hit Rate: 25%
• LRU:
• Hits: 400
• Misses: 600
• Hit Rate: 40%
• Random:
• Hits: 300
• Misses: 700
• Hit Rate: 30%
• Conclusion: LRU performed better than FIFO and Random in this
simulation.
Code Overview
Code Explanation:
Cache Structure:
• Holds the valid flag, tag, and last_used time.
Replacement Policy Simulation:
• Simple logic to simulate FIFO, LRU, and Random
replacements.
Simulation Function:
• Runs the simulation with random memory accesses and
prints hit/miss statistics.
Code Snippet Example:
if (policy == LRU) {
    // Find the least recently used cache block,
    // preferring an empty (invalid) block before evicting a live one
    int lru_index = 0;
    for (int i = 0; i < CACHE_SIZE; i++) {
        if (!cache[i].valid) { lru_index = i; break; }
        if (cache[i].last_used < cache[lru_index].last_used)
            lru_index = i;
    }
    cache[lru_index].valid = 1;
    cache[lru_index].tag = tag;
    cache[lru_index].last_used = (*time)++;
}
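For comparison with the LRU branch above, the FIFO and Random policies might be sketched as victim-selection helpers. The `inserted` field, the helper names, and the `CACHE_SIZE` value are assumptions for illustration, not taken from the project code:

```c
#include <stdlib.h>

#define CACHE_SIZE 4

typedef struct {
    int valid;
    unsigned tag;
    int inserted;   /* insertion timestamp, used only by FIFO */
} CacheBlock;

/* FIFO: evict the block that was inserted earliest. */
int fifo_victim(const CacheBlock cache[]) {
    int oldest = 0;
    for (int i = 1; i < CACHE_SIZE; i++)
        if (cache[i].inserted < cache[oldest].inserted)
            oldest = i;
    return oldest;
}

/* Random: every block is an equally likely victim. */
int random_victim(void) {
    return rand() % CACHE_SIZE;
}
```

Note that FIFO tracks when a block entered the cache, while LRU tracks when it was last touched; the two differ only after a block is re-accessed.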
Future Enhancements
Enhancements to Consider:
• Implement multi-level cache (L1, L2, L3 caches).
• Compare policies with different cache sizes and block sizes.
• Simulate cache associativity (direct-mapped, set-associative,
etc.).
• Explore more advanced policies like LFU (Least Frequently
Used).
Conclusion

• Summary:
• This project simulates the performance of different
cache replacement policies: FIFO, LRU, and Random.
• In this simulation, LRU performed best, followed by
Random, while FIFO gave the lowest hit rate.
• Learning Outcomes:
• Gained insights into how cache replacement policies
affect system performance.
• Learned the importance of selecting the right cache
policy for system optimization.
