Cache Memory With Associative Memory 2.2.5
Associative Cache
A type of CACHE designed to solve the problem of cache CONTENTION that plagues the
DIRECT MAPPED CACHE. In a fully associative cache, a data block from any memory
address may be stored in any CACHE LINE, and the whole address is used as the cache
TAG. Hence, when looking for a match, all the tags must be compared simultaneously with
the requested address, which demands expensive extra hardware. However, contention is
avoided completely, as no block need ever be flushed unless the whole cache is full, in
which case the least recently used block may be chosen for replacement.
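The lookup just described can be sketched in Python (an illustrative software model of my own, not part of the original text; in real hardware the tag comparisons happen in parallel rather than in a loop):

```python
# Sketch of a fully associative cache: any block may occupy any line,
# the whole address serves as the tag, and the least recently used
# line is evicted when the cache is full.
class FullyAssociativeCache:
    def __init__(self, num_lines):
        self.num_lines = num_lines
        self.lines = {}    # tag -> data; membership test models tag compare
        self.order = []    # least recently used tag first

    def access(self, address, data=None):
        tag = address              # whole address is the tag
        if tag in self.lines:      # "compare all tags simultaneously"
            self.order.remove(tag)
            self.order.append(tag) # mark as most recently used
            return True            # hit
        if len(self.lines) >= self.num_lines:
            victim = self.order.pop(0)   # evict least recently used line
            del self.lines[victim]
        self.lines[tag] = data
        self.order.append(tag)
        return False               # miss

cache = FullyAssociativeCache(num_lines=2)
cache.access(0x100)   # miss (cold)
cache.access(0x200)   # miss
cache.access(0x100)   # hit: the block can sit in any line
```

Note that contention never forces an eviction here: a block is only flushed once every line is occupied.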
A set-associative cache is a compromise solution in which the cache lines are divided into
sets, and the middle bits of a block's address determine which set the block will be stored
in; within each set the cache remains fully associative. A cache that has two lines per set is
called two-way set-associative and requires only two tag comparisons per access, which
reduces the extra hardware required. A DIRECT MAPPED CACHE can be thought of as
one-way set-associative, while a fully associative cache is n-way set-associative, where n is
the total number of cache lines. Finding the right balance between associativity and total
cache capacity for a particular processor is a fine art; current CPUs variously employ
2-way, 4-way and 8-way designs.
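The address split used by a set-associative cache can be sketched as follows (the line size and set count here are illustrative assumptions, not values from the text):

```python
# Splitting an address into tag, set index, and byte offset.
# Low bits select the byte within the line, the middle bits select
# the set, and the remaining high bits form the tag.
LINE_SIZE = 32   # bytes per cache line (assumed for illustration)
NUM_SETS = 4     # number of sets (assumed for illustration)

def split_address(addr):
    offset = addr % LINE_SIZE                   # byte within the line
    set_index = (addr // LINE_SIZE) % NUM_SETS  # middle bits pick the set
    tag = addr // (LINE_SIZE * NUM_SETS)        # high bits form the tag
    return tag, set_index, offset

print(split_address(0x1040))  # (32, 2, 0)
```

Two addresses that differ only in their tags map to the same set, so in a two-way set-associative cache both blocks can be resident at once, which is exactly the contention case a direct mapped cache cannot handle.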
REPLACEMENT ALGORITHMS OF CACHE MEMORY
Replacement algorithms are used when there is no available space in the cache in which to
place a new block. Four of the most common cache replacement algorithms are described below:
Least Recently Used (LRU):
The LRU algorithm selects for replacement the item that has been least recently used by the
CPU.
First-In-First-Out (FIFO):
The FIFO algorithm selects for replacement the item that has been in the cache for the
longest time.
Least Frequently Used (LFU):
The LFU algorithm selects for replacement the item that has been least frequently used by
the CPU.
Random:
The random algorithm selects for replacement an item chosen at random, without regard to
how recently or how often it has been used; it is simple and cheap to implement in hardware.
In direct mapping, no replacement algorithm is needed: each memory block maps to exactly
one cache line, so an incoming block simply overwrites whatever that line currently holds.
Belady's Anomaly: for some cache replacement algorithms (notably FIFO), the page-fault or
miss rate can increase as the number of allocated frames increases.
Example: consider the reference sequence 7, 0, 1, 2, 0, 3, 0, 4, 2, 3 with a cache memory of 4 lines.
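A small FIFO simulation makes the anomaly concrete. Under FIFO, the ten-reference sequence above produces fewer misses with 4 lines than with 3, so it does not by itself exhibit the anomaly; this sketch (with names of my own choosing) therefore uses the classic demonstration sequence:

```python
# FIFO replacement simulation illustrating Belady's anomaly:
# with this reference sequence, adding a cache line INCREASES misses.
from collections import deque

def fifo_misses(refs, num_lines):
    cache, misses = deque(), 0
    for block in refs:
        if block not in cache:
            misses += 1
            if len(cache) >= num_lines:
                cache.popleft()   # evict the oldest resident block
            cache.append(block)
    return misses

refs = [1, 2, 3, 4, 1, 2, 5, 1, 2, 3, 4, 5]
print(fifo_misses(refs, 3))  # 9 misses with 3 lines
print(fifo_misses(refs, 4))  # 10 misses with 4 lines: more lines, more misses
```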
The LFU algorithm uses a counter to keep track of how often an entry is accessed; the entry
with the lowest count is removed first. This method is not used very often, as it does not
account for an item that had an initially high access rate and then was not accessed for a
long time.
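The counter-based behaviour just described can be sketched as follows (an illustrative model; the tie-breaking rule among equal counts is an arbitrary assumption of this sketch):

```python
# Minimal LFU replacement simulation: each resident block keeps an
# access counter, and the block with the lowest count is evicted.
def simulate_lfu(refs, capacity):
    counts, cache, misses = {}, set(), 0
    for block in refs:
        if block in cache:
            counts[block] += 1
        else:
            misses += 1
            if len(cache) >= capacity:
                victim = min(cache, key=lambda b: counts[b])
                cache.remove(victim)   # evict the least frequently used
            cache.add(block)
            counts[block] = 1          # counter restarts on re-insertion

    return misses

# Block 1's high early count keeps it resident even after it stops
# being referenced -- the weakness described in the text.
print(simulate_lfu([1, 1, 1, 2, 3], 2))  # 3 misses
```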
References
Reference Books:
Hayes, J.P., “Computer Architecture and Organization”, Third Edition.
Mano, M., “Computer System Architecture”, Third Edition, Prentice Hall.
Stallings, W., “Computer Organization and Architecture”, Eighth Edition, Pearson Education.
Text Books:
Carpinelli, J.D., “Computer Systems Organization & Architecture”, Fourth Edition, Addison Wesley.
Patterson and Hennessy, “Computer Architecture”, Fifth Edition, Morgan Kaufmann.
Other References
What is Associative Cache? - Computer Notes (ecomputernotes.com)
http://www.eazynotes.com/notes/computer-system-architecture/slides/cache-memory.pdf
https://searchstorage.techtarget.com/definition/cache-algorithm
https://www.includehelp.com/cso/types-of-cache-replacement-policies.aspx