Week 1

Reading: Computer Components and Functions
a. System software: Software used to manage computer resources such as the
CPU, memory, and I/O devices falls under the category of system software.
Examples are operating systems, editors, compilers, and assemblers.

b. Application software: Software used to perform general or specific tasks
inside a computer. It can therefore be categorized into general-purpose and
specific-purpose application software.

Computer Language:

A computer is a digital device whose processor understands instructions
written in binary code, i.e., any instruction or number is expressed using
only combinations of 0s and 1s. Software written directly in binary code is
said to use Machine Level Language.

In general, it is quite difficult to write software programs in binary code.
So, software developers use a mnemonic-based language called Assembly
Language. Assembly language lets programmers write instructions in a simpler
form, which is then converted into machine language by a special tool called
the Assembler.

Nowadays, we use High-Level Languages to develop software; these read much
like English. Some examples of high-level languages are C, C++, and Java.
Before execution by the computer, a high-level language program is converted
to Assembly Language using a special tool called the Compiler.
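As a small illustration (shown in Python rather than actual machine code), the same value can be viewed at the machine level as a pattern of 0s and 1s:

```python
# A processor ultimately sees every value as a pattern of bits (0s and 1s).
value = 13
print(bin(value))              # '0b1101' - the machine-level view

# Reconstructing the value from its bits: 1*8 + 1*4 + 0*2 + 1*1 = 13
reconstructed = int("1101", 2)
print(reconstructed)           # 13
```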

Levels of Transformation in Computer Systems:

Before a computer can solve a problem, different transformations happen
across multiple levels, as shown below:

● Problems are generally specified in natural languages, like English,
French, German, or other spoken languages that humans use to converse.
Unfortunately, these are not languages that computers understand.
● Hence, the first transformation happens when we convert our problem
statement into an Algorithm. An algorithm is a step-by-step procedure
that a computer can carry out.
● Once the algorithm is selected, the next step is to translate it into a
computer program using a high-level language, since high-level code is
simpler to write and easier to debug. Also, because high-level languages
are machine independent, the same program can be compiled for different
machines.
● Using compilers, the next transformation happens when a software
program written using a particular programming language, is converted
into machine instructions that the processor can understand.
● The next transformation level involves the Microarchitecture, the
hardware design that implements the computer’s Instruction Set Architecture (ISA).
● Once the microarchitecture of a computer is designed, we need to
implement each sub-part of this design. Each element of the
microarchitecture gets implemented using simple logic circuits, which
comprise the basic building blocks called the devices.
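A minimal sketch of the first few transformation levels, using Python as a stand-in for a high-level language (the problem statement and algorithm appear as comments):

```python
# Problem (natural language): "Add two numbers and report their sum."
# Algorithm (step-by-step):  1) read a and b  2) compute a + b  3) output the result

def add(a, b):
    # High-level language version: machine independent. A compiler (or, for
    # Python, an interpreter) translates this into processor instructions.
    return a + b

print(add(2, 3))  # 5
```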

History of Computers:

Charles Babbage is widely regarded as the inventor of the first computer:
his Analytical Engine originated the concept of a programmable machine. He
was born in December 1791 and died in October 1871 at the age of 79. He is
well-known as a mathematician, philosopher, engineer, and inventor, but most
importantly as the father of the computer. Here is the generation-wise
development of computers.

Generation – 0 (1623 – 1945): Mechanical Calculating Machines

● Wilhelm Schickard – “Calculating Clock”
● Blaise Pascal – “Pascaline”
● “Lightning Portable Adder” and “Addometer”

Generation – 1 (1943 – 1953): Vacuum Tube Computers – EDVAC, ENIAC
(implemented using vacuum tubes)

Generation – 2 (1954 – 1965): IBM 7000 and DEC PDP-1
(implemented using transistors)

Generation – 3 (1965 – 1980): IBM 360
(used integrated circuits)

Generation – 4 (1980 onwards): Apple I, Apple II, IBM PC
(based on VLSI technology)

Advances in Semiconductor Technology:

Based on the number of devices that an integrated circuit (IC) can
accommodate, chips can be classified as follows:

Technology – Number of Devices on a Chip

● Small-scale integration (SSI): 1 – 100 devices
● Medium-scale integration (MSI): 100 – 3,000 devices
● Large-scale integration (LSI): 3,000 – 100,000 devices
● Very large-scale integration (VLSI): 100,000 – 100,000,000 devices
● Ultra large-scale integration (ULSI): over 100,000,000 devices

Little Computer 3 (LC-3):

The Little Computer 3 (LC-3) was developed jointly by Yale N. Patt at the
University of Texas at Austin and Sanjay J. Patel at the University of
Illinois at Urbana–Champaign. It includes the most important features of
well-known commercial computer systems. The instruction set of the LC-3 has
only 15 instructions, each identified by a unique 4-bit opcode. These
represent the different types of arithmetic, logic, and control operations
that the LC-3 can perform.
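As a sketch of how those unique codes work (Python is used here only to illustrate the bit layout; the field positions follow the standard LC-3 encoding, in which each instruction is a 16-bit word whose top 4 bits hold the opcode):

```python
# Example LC-3 word: ADD R0, R1, R2 encodes as 0x1042 (opcode 0001 = ADD).
instruction = 0x1042

opcode = (instruction >> 12) & 0xF   # bits 15-12: operation code
dr     = (instruction >> 9) & 0x7    # bits 11-9:  destination register
sr1    = (instruction >> 6) & 0x7    # bits 8-6:   first source register
sr2    = instruction & 0x7           # bits 2-0:   second source register

print(opcode, dr, sr1, sr2)          # 1 0 1 2 -> ADD R0, R1, R2
```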

● LC-3 Architecture:

The three important parts of LC-3 are memory, the Input and Output System,
and the Control and Processing Unit (CPU). The Processor Bus or the System
Bus connects the different parts, transferring information to and from each
component. Here, the CPU consists of the following key blocks:

1) Finite State Machine (FSM),

2) Arithmetic and Logic Unit (ALU), and

3) Register File or the Register Set.


● LC-3 Simulator:

To write and execute programs for the LC-3, we do not need to purchase any
hardware-based LC-3 computer. An LC-3 simulator is available for Windows and
Linux machines, as well as a web version. Every instruction stored in the
LC-3 memory can be specified by:

1) Giving the value of that instruction and storing it in the correct memory
location.

2) Providing the raw instruction code in text form.

3) Writing assembly code in a text editor and loading it into the simulator.

Common Terms Used for Measurement

● Kilo- (K) = 1 thousand = 10^3 (or 2^10)
● Mega- (M) = 1 million = 10^6 (or 2^20)
● Giga- (G) = 1 billion = 10^9 (or 2^30)
● Tera- (T) = 1 trillion = 10^12 (or 2^40)
● Peta- (P) = 1 quadrillion = 10^15 (or 2^50)
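Each prefix carries two values because memory hardware is organized in powers of two; a quick check in Python (the percentage below is simply the computed ratio for the Giga prefix):

```python
# "Kilo" is 10**3 in decimal (SI) usage, but 2**10 when sizing memory.
decimal_kilo = 10 ** 3   # 1000
binary_kilo  = 2 ** 10   # 1024
print(decimal_kilo, binary_kilo)

# The gap widens with each prefix: 2**30 bytes is about 7.4% more than 10**9.
print(2 ** 30 / 10 ** 9)
```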

Measurement of Speed
The base unit for the measurement of speed is the Hertz (Hz), which stands
for one clock cycle per second and is also the general unit of frequency.
So, when we measure the speed of operation of any component in the computer
system, we look at the number of processing cycles the component can execute
per second.
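For example, the time one cycle takes is the reciprocal of the clock frequency (a hypothetical 2 GHz clock is assumed below):

```python
clock_hz = 2 * 10 ** 9        # hypothetical 2 GHz clock: 2 billion cycles/second
cycle_time_s = 1 / clock_hz   # seconds per cycle is the reciprocal of frequency
print(cycle_time_s * 1e9)     # 0.5 (nanoseconds per cycle)
```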

Measurement of Storage

The basic unit of storage in a computer system is the byte. A byte consists
of 8 bits, where each bit can take one of two values, 0 or 1. Kilobytes,
megabytes, and gigabytes are larger units used to measure storage capacity.
The main memory in a computer system, also called RAM (Random Access
Memory), has historically been measured in megabytes and is now commonly
measured in gigabytes. Disk storage, on the other hand, is measured in
gigabytes for small systems and in terabytes for large systems.
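A short sketch of these units in Python (the 4 KiB block size below is just an illustrative value):

```python
# Storage units build on the byte (8 bits), usually in powers of two.
kib = 2 ** 10   # kibibyte: 1,024 bytes
mib = 2 ** 20   # mebibyte: 1,048,576 bytes
gib = 2 ** 30   # gibibyte: 1,073,741,824 bytes

# Example: how many 4 KiB blocks fit in 16 MiB of memory?
blocks = (16 * mib) // (4 * kib)
print(blocks)   # 4096
```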

Measurement of Time

● Milli- (m) = 1 thousandth = 10^-3
● Micro- (μ) = 1 millionth = 10^-6
● Nano- (n) = 1 billionth = 10^-9
● Pico- (p) = 1 trillionth = 10^-12
● Femto- (f) = 1 quadrillionth = 10^-15

Reading Summary:

In this reading, you have learned the following:

● The need for a computer system


● Hardware and software components of a computer
● The levels of abstraction in computing systems
● The most significant events and concepts in the history of computers
● The high-level architecture of the LC-3 computer and how to use the LC-3
Simulator
