Cache Memory Simulation Presentation
Cache Replacement Policies
• FIFO Policy:
• Replaces the cache block that has been in the cache the
longest (first in, first out).
• Simple to implement, but may not be the most efficient (a
victim-selection sketch for all three policies follows this list).
• LRU Policy:
• Replaces the cache block that was accessed the least
recently.
• Requires tracking the time of last access.
• Random Policy:
• Randomly replaces a cache block, which may not always
be optimal.
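Policy Selection Sketch:
The three policies differ only in how a victim block is chosen. The sketch below is illustrative, not the project's exact code: the CacheLine fields mirror the cache structure described later in this deck, and the inserted_at counter used for FIFO is an added assumption.

#include <stdlib.h>

#define CACHE_SIZE 16           /* illustrative cache size */

typedef struct {
    int valid;                  /* 1 if the block holds valid data      */
    unsigned long tag;          /* tag of the cached address            */
    unsigned long last_used;    /* access timestamp, used by LRU        */
    unsigned long inserted_at;  /* fill timestamp, used by FIFO (assumed) */
} CacheLine;

/* FIFO: evict the block that was filled earliest. */
int fifo_victim(const CacheLine cache[]) {
    int victim = 0;
    for (int i = 1; i < CACHE_SIZE; i++)
        if (cache[i].inserted_at < cache[victim].inserted_at)
            victim = i;
    return victim;
}

/* LRU: evict the block whose last access is oldest. */
int lru_victim(const CacheLine cache[]) {
    int victim = 0;
    for (int i = 1; i < CACHE_SIZE; i++)
        if (cache[i].last_used < cache[victim].last_used)
            victim = i;
    return victim;
}

/* Random: evict any block with equal probability. */
int random_victim(void) {
    return rand() % CACHE_SIZE;
}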
Performance Evaluation
• Metrics to Evaluate:
Cache Hits: When the requested data is found in the cache.
Cache Misses: When the requested data is not in the cache.
Hit Rate: The percentage of hits out of the total memory accesses.
Formula: Hit Rate = (Cache Hits / Total Accesses) × 100
(a short worked example follows this slide).
• Simulation Results:
Show how the hit and miss rates differ for FIFO, LRU, and Random.
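Hit Rate Example:
A minimal worked example of the formula above, using the LRU counts from the next slide (the code is a sketch, not project output):

#include <stdio.h>

int main(void) {
    int hits = 400, misses = 600;            /* illustrative counts           */
    int total = hits + misses;               /* total memory accesses         */
    double hit_rate = 100.0 * hits / total;  /* (hits / total accesses) x 100 */
    printf("Hits: %d  Misses: %d  Hit rate: %.1f%%\n", hits, misses, hit_rate);
    return 0;
}

Output: Hits: 400  Misses: 600  Hit rate: 40.0%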
Example Simulation Results
• Results for a Simulation with 1000 Accesses:
• FIFO:
• Hits: 250
• Misses: 750
• Hit Rate: 25%
• LRU:
• Hits: 400
• Misses: 600
• Hit Rate: 40%
• Random:
• Hits: 300
• Misses: 700
• Hit Rate: 30%
• Conclusion: LRU performed better than FIFO and Random in this
simulation.
Code Overview
Code Explanation:
Cache Structure:
• Holds the valid flag, tag, and last_used time (a struct sketch
follows this list).
Replacement Policy Simulation:
• Simple logic to simulate FIFO, LRU, and Random replacements.
Simulation Function:
• Runs the simulation with random memory accesses and prints
hit/miss statistics (a fuller driver sketch follows the code
snippet below).
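A structure of this kind might look as follows (a minimal sketch: the field names come from the description above, while the types and the CACHE_SIZE value are assumptions):

#define CACHE_SIZE 16              /* illustrative size */

typedef struct {
    int valid;                     /* 1 if the block holds valid data     */
    unsigned long tag;             /* tag bits of the cached address      */
    unsigned long last_used;       /* timestamp of the most recent access */
} CacheBlock;

CacheBlock cache[CACHE_SIZE];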
Code Snippet Example:
if (policy == LRU) {
    // Find the least recently used cache block
    int lru_index = 0;
    for (int i = 1; i < CACHE_SIZE; i++) {
        if (cache[i].last_used < cache[lru_index].last_used) {
            lru_index = i;
        }
    }
    // Replace that block with the new tag and refresh its timestamp
    cache[lru_index].valid = 1;
    cache[lru_index].tag = tag;
    cache[lru_index].last_used = (*time)++;
}
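Driver Sketch:
To show how the pieces described above could fit together, here is a hedged sketch of a complete simulation driver. It is not the project's actual code: the fully associative lookup, the pick_victim() helper, and constants such as NUM_ACCESSES are illustrative assumptions; pick_victim() chooses randomly for brevity, and the FIFO or LRU selection loops sketched earlier can be swapped in.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define CACHE_SIZE   16
#define NUM_ACCESSES 1000

typedef struct {
    int valid;
    unsigned long tag;
    unsigned long last_used;
} CacheLine;

/* Placeholder victim selection; replace with the FIFO or LRU loops. */
static int pick_victim(const CacheLine cache[]) {
    (void)cache;                     /* unused by the random placeholder */
    return rand() % CACHE_SIZE;
}

int main(void) {
    CacheLine cache[CACHE_SIZE] = {0};
    unsigned long time_counter = 0;
    int hits = 0, misses = 0;

    srand((unsigned)time(NULL));

    for (int n = 0; n < NUM_ACCESSES; n++) {
        unsigned long tag = (unsigned long)(rand() % 64);  /* random access pattern */
        int hit = 0;

        /* Fully associative lookup: scan every block for a matching tag. */
        for (int i = 0; i < CACHE_SIZE; i++) {
            if (cache[i].valid && cache[i].tag == tag) {
                cache[i].last_used = time_counter++;       /* refresh on hit */
                hit = 1;
                break;
            }
        }

        if (hit) {
            hits++;
        } else {
            misses++;
            int victim = pick_victim(cache);               /* choose a block to evict */
            cache[victim].valid = 1;
            cache[victim].tag = tag;
            cache[victim].last_used = time_counter++;
        }
    }

    printf("Hits: %d  Misses: %d  Hit rate: %.1f%%\n",
           hits, misses, 100.0 * hits / NUM_ACCESSES);
    return 0;
}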
Future Enhancements
Enhancements to Consider:
• Implement multi-level cache (L1, L2, L3 caches).
• Compare policies with different cache sizes and block sizes.
• Simulate cache associativity (direct-mapped, set-associative,
etc.); an address-split sketch follows this list.
• Explore more advanced policies like LFU (Least Frequently
Used).
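Address-Split Sketch:
For the set-associative enhancement above, a minimal sketch of how an address could be split into tag, set index, and block offset (the bit widths and names are illustrative assumptions, not part of the current project):

#include <stdint.h>

#define BLOCK_BITS 6    /* 64-byte blocks: low 6 bits are the byte offset */
#define SET_BITS   4    /* 16 sets: next 4 bits select the set            */

typedef struct {
    uint32_t tag;       /* remaining high bits identify the block */
    uint32_t set;       /* index of the set the block maps to     */
    uint32_t offset;    /* byte offset within the block           */
} AddressParts;

/* Split a 32-bit address into tag, set index, and block offset. */
AddressParts split_address(uint32_t addr) {
    AddressParts p;
    p.offset = addr & ((1u << BLOCK_BITS) - 1);
    p.set    = (addr >> BLOCK_BITS) & ((1u << SET_BITS) - 1);
    p.tag    = addr >> (BLOCK_BITS + SET_BITS);
    return p;
}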
Conclusion
• Summary:
• This project simulates the performance of different
cache replacement policies: FIFO, LRU, and Random.
• LRU generally performs best, followed by Random,
while FIFO typically results in the lowest hit rate.
• Learning Outcomes:
• Gained insights into how cache replacement policies
affect system performance.
• Learned the importance of selecting the right cache
policy for system optimization.