CMTS Handout 4
Memory
Memory is a collection of storage cells together with the circuits needed to transfer
information to and from them. In other words, computer memory is the storage space in a
computer where the data to be processed and the instructions required for processing are kept.
4.1 RAM
RAM (Random Access Memory) is memory that loses its contents when the computer is switched
off; it is volatile. Instructions and data can be loaded into this memory, and the kind of
memory used for holding programs and data being executed is RAM. RAM differs from ROM in that
it can be both read and written; it is considered volatile storage because, unlike ROM, the
contents of RAM are lost when the power is turned off. Common RAM sizes are 1 GB, 2 GB, 4 GB,
8 GB, and larger (in powers of two).
Cache Memory
Cache memory is a very high-speed semiconductor memory that can speed up the CPU. It
acts as a buffer between the CPU and main memory and holds the parts of data and
programs that are most frequently used by the CPU. These parts are transferred from the
disk to cache memory by the operating system, from where the CPU can access them.
We will see more on cache memory later in this chapter.
Secondary Memory
This type of memory is also known as external or non-volatile memory. It is slower than
main memory and is used for storing data and information permanently. The CPU does not
access these memories directly; instead, they are accessed via input/output routines.
Contents of secondary memory are first transferred to main memory, and only then can the
CPU access them. Examples: hard disk, flash drive, CD-ROM, DVD, etc. We will see them in
detail in chapter six.
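As a rough illustration in Python (the file name data.txt is hypothetical and used only for
this example), an ordinary file read is exactly such an input/output routine: the bytes are
copied from secondary storage into main memory, and only the in-memory copy is processed:

    # Sketch: the CPU cannot work on data while it sits on the hard disk.
    # An I/O routine (here, open/read) first copies it into main memory.
    with open("data.txt", "rb") as f:      # read from secondary storage
        contents = f.read()                # contents now live in main memory (RAM)

    word_count = len(contents.split())     # the CPU processes the in-memory copy
    print(word_count)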
Cache Memory
It is a special high-speed storage mechanism. It can be either a reserved section of main
memory or an independent high-speed storage device. Two types of caching are commonly
used in personal computers: memory caching and disk caching. A memory cache, sometimes
called a cache store or RAM cache, is a portion of memory made of high-speed static RAM
(SRAM) instead of the slower and cheaper dynamic RAM (DRAM) used for main memory.
Memory caching is effective because most programs access the same data or instructions over
and over. By keeping as much of this information as possible in SRAM, the computer avoids
accessing the slower DRAM.
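As a rough sketch of this idea in Python (the names fast_sram and slow_dram, the cache size,
and the addresses are all invented for illustration), a small fast cache is checked first,
and only on a miss does the program fall back to the larger, slower main memory:

    # Illustrative model of memory caching: a small, fast cache (standing in
    # for SRAM) is checked before the large, slow main memory (standing in
    # for DRAM).
    slow_dram = {addr: addr * 2 for addr in range(1024)}   # pretend main memory
    fast_sram = {}                                         # small cache, starts empty
    CACHE_CAPACITY = 8

    def read(addr):
        if addr in fast_sram:                  # cache hit: avoid the slow DRAM access
            return fast_sram[addr]
        value = slow_dram[addr]                # cache miss: go to slow main memory
        if len(fast_sram) >= CACHE_CAPACITY:
            fast_sram.pop(next(iter(fast_sram)))   # evict the oldest entry
        fast_sram[addr] = value                # keep a copy for future accesses
        return value

    # Programs tend to reuse the same addresses, so repeated reads are served
    # from the cache after the first access.
    for _ in range(3):
        for addr in (5, 6, 7):
            read(addr)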
Some memory caches are built into the architecture of microprocessors. The Intel 80486
microprocessor, for example, contains an 8K memory cache, and the Pentium has a 16K cache.
Such internal caches are often called Level 1 (L1) caches. Most modern PCs also come with
external cache memory located on the motherboard, called a Level 2 (L2) cache. These
caches sit between the CPU and the DRAM. Like L1 caches, L2 caches are composed of SRAM,
but they are much larger. Disk caching works on the same principle as memory caching, but
instead of using high-speed SRAM, a disk cache uses conventional main memory. The most
recently accessed data from the disk (as well as adjacent sectors) is stored in a memory buffer.
When a program needs to access data from the disk, it first checks the disk cache to see if the
data is there. Disk caching can dramatically improve the performance of applications, because
accessing a byte of data in RAM can be thousands of times faster than accessing a byte on a
hard disk.
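A minimal Python sketch of this idea (the pretend disk, the sector numbers, and the
read-ahead amount are all invented for illustration) keeps recently read sectors and their
neighbours in an ordinary in-memory dictionary:

    # Illustrative model of disk caching: recently read sectors (and adjacent
    # ones) are kept in a buffer in ordinary main memory, so repeated reads
    # can skip the much slower disk.
    disk = {sector: f"data-{sector}" for sector in range(100)}  # pretend disk
    disk_cache = {}                                             # buffer in RAM
    READ_AHEAD = 2                                              # also fetch neighbours

    def read_sector(sector):
        if sector in disk_cache:                 # found in the disk cache
            return disk_cache[sector]
        for s in range(sector, min(sector + READ_AHEAD + 1, len(disk))):
            disk_cache[s] = disk[s]              # copy sector and neighbours to RAM
        return disk_cache[sector]

    read_sector(10)      # first read goes to the (slow) disk
    read_sector(11)      # already in the buffer thanks to read-ahead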
When data is found in the cache, it is called a cache hit; when it is not, it is a cache
miss. The effectiveness of a cache is judged by its hit rate, the fraction of accesses that
are hits. Many cache systems use a technique known as smart caching, in which the system can
recognize certain types of frequently used data.
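For example, if a cache served 950 out of 1000 accesses (figures invented for illustration),
its hit rate would be 95%:

    # Hit rate = hits / total accesses.
    hits = 950
    misses = 50
    hit_rate = hits / (hits + misses)
    print(f"hit rate = {hit_rate:.0%}")   # prints: hit rate = 95%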
Advantages
The advantages of cache memory are as follows:
Cache memory is faster than main memory.
It has a shorter access time than main memory.
It stores program instructions that can be executed within a short period of time.
It stores data for temporary use.
Disadvantages
The disadvantages of cache memory are as follows:
Cache memory has limited capacity.
It is very expensive.