
Cache Memory

A Summarized Description
Muhammad Hasan Sarosh, 28th of April, 2014


Cache is a memory placed between the CPU and the main memory. It is fast, but low in capacity and expensive. It is used to speed up the transfer of data to the processor, which ultimately results in faster processing.
The idea of a cache was first introduced in 1965 by Sir Maurice Vincent Wilkes (26 June 1913 – 29 November 2010), then a professor at Cambridge, in an article titled "Slave Memories and Dynamic Storage Allocation", published in IEEE Transactions on Electronic Computers, April 1965. He referred to cache memory as "slave memory".
Cache memory keeps recently used data. Processors most often request data from the same block repeatedly (a pattern known as locality of reference). Keeping a copy of the block in cache memory therefore saves time: the data is transferred to the processor from the cache in less time than it would take to read it from main memory.
When the processor requests data, the cache is checked to see whether the data is present. If the required data is found in cache memory, it is transferred to the CPU one word at a time; this scenario is termed a cache hit. If the required data is not found in cache memory, it is termed a cache miss: the whole block containing the required data is read from main memory into the cache, and only the required words of data are transferred to the processor. The incoming block may replace the least recently used (LRU) block, which keeps the cache stocked with the most recently used data.
The transfer of data between the cache and the processor is faster than the transfer between the cache and main memory. Memory locations within the cache are accessed by the random access method, i.e. every location can be accessed directly and no sequential searching is required; this is part of what makes cache memory so fast.
Besides a single cache memory, multi-level cache memory is also used. In such a design, a primary cache that transfers data directly to the processor, referred to as the L1 cache, reads data from a secondary cache, referred to as the L2 cache, rather than from main memory. The L2 cache, in turn, reads data from main memory and transfers it to the L1 cache rather than directly to the processor; it also keeps a copy of that data. The L2 cache is slower than the L1 cache, but it can store more data.
