
Study Session 1.1: Historical Background of the Computer System
Section and Subsection Headings
Introduction
1.0 Specific Learning Outcomes
2.0 Main Content
2.1 Historical background of computer
2.2 Definition of the computer system
2.3 Generations of computer
3.0 Summary and Conclusion
4.0 Self-Assessment Questions
5.0 Additional Activities (Videos, Animations & out of Class Activities)
6.0 Reference/Further Reading

Introduction
This study session is designed to give you an insight into, and a recap of, your previous knowledge from Introduction to Computer (COM 101). The history and generations of computers, as well as the types of computers, will be discussed in this study session.

1.0 Specific Learning Outcomes


At the end of this study session, you should be able to:
i. Explain and define what a computer is.
ii. Distinguish between the generations of computers.
iii. List and explain the characteristics of a computer.

2.0 Main Content


2.1 Historical background of computer
The history of computers starts out about 2000 years ago in Babylonia (Mesopotamia), at the
birth of the abacus, a wooden rack holding two horizontal wires with beads strung on them. Blaise
Pascal is usually credited for building the first digital computer in 1642. It added numbers
entered with dials and was made to help his father, a tax collector. The basic principle of his
calculator is still used today in water meters and modern-day odometers. Instead of having a
carriage wheel turn the gear, he made each ten-teeth wheel accessible to be turned directly by a
person's hand (later inventors added keys and a crank), with the result that when the wheels were
turned in the proper sequences, a series of numbers was entered and a cumulative sum was
obtained. The gear train supplied a mechanical answer equal to the answer that is obtained by using
arithmetic. This first mechanical calculator, called the Pascaline, had several disadvantages.
Although it did offer a substantial improvement over manual calculations, only Pascal himself
could repair the device and it cost more than the people it replaced! In addition, the first signs of
technophobia emerged with mathematicians fearing the loss of their jobs due to progress. Contrary
to Pascal, Leibniz (1646-1716) successfully introduced a calculator onto the market. It is designed
in 1673 but it takes until 1694 to complete. The calculator can add, subtract, multiply, and divide.
Wheels are placed at right angles which could be displaced by a special stepping mechanism.
The speed of calculation for multiplication or division was acceptable. But like the Pascaline, this
calculator required that the operator using the device had to understand how to turn the wheels and
know the way of performing calculations with the calculator. Charles Babbage, an English
mechanical engineer and polymath, originated the concept of a programmable computer.
Considered the "father of the computer", he conceptualized and invented the first mechanical
computer in the early 19th century. After working on his revolutionary difference engine, designed
to aid in navigational calculations, in 1833 he realized that a much more general design, an
Analytical Engine, was possible. A step towards automated computing was the development of
punched cards, which were first successfully used with computers in 1890 by Herman Hollerith
and James Powers, who worked for the U.S. Census Bureau. They developed devices that could
read the information that had been punched into the cards automatically, without human help.
Because of this, reading errors were reduced dramatically, work flow increased, and, most
importantly, stacks of punched cards could be used as easily accessible memory of almost
unlimited size. Furthermore, different problems could be stored on different stacks of cards and
accessed when needed. These advantages were seen by commercial companies and soon led to the
development of improved punch-card using computers created by International Business Machines
(IBM), Remington (yes, the same people that make shavers), Burroughs, and other corporations.
These computers used electromechanical devices in which electrical power provided mechanical
motion -- like turning the wheels of an adding machine. Such systems included features to:
i. Feed in a specified number of cards automatically
ii. Add, multiply, and sort
iii. Feed out cards with punched results
The start of World War II produced a large need for computer capacity, especially for the
military. New weapons were made for which trajectory tables and other essential data were
needed. In 1942, John P. Eckert, John W. Mauchly, and their associates at the Moore School of Electrical Engineering at the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC (Electronic Numerical Integrator and Computer). The size of ENIAC's numerical "word" was 10 decimal digits, and it could
multiply two of these numbers at a rate of 300 per second, by finding the value of each product
from a multiplication
table stored in its memory. ENIAC was therefore about 1,000 times faster than the previous
generation of relay computers. ENIAC used 18,000 vacuum tubes, occupied about 1,800 square feet of floor space, and consumed about 180,000 watts of electrical power. It had punched-card I/O, 1
multiplier, 1 divider/square rooter, and 20 adders using decimal ring counters, which served as
adders and also as quick-access (.0002 seconds) read-write register storage. The executable
instructions making up a program were embodied in the separate "units" of ENIAC, which were
plugged together to form a "route" for the flow of information.
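To make the idea of multiplying by table lookup more concrete, here is a minimal sketch in modern Python (an illustration of the general technique only, not of ENIAC's actual circuitry; the function and table are invented for this example):

# Illustrative sketch only: multiplying a multi-digit number by a single digit
# using a precomputed digit-by-digit product table, loosely analogous to how
# ENIAC looked up partial products instead of computing them by repeated addition.

PRODUCT_TABLE = [[a * b for b in range(10)] for a in range(10)]  # 0*0 .. 9*9

def multiply_by_digit(number: int, digit: int) -> int:
    """Multiply `number` by one decimal digit using table lookups plus carries."""
    result, carry, place = 0, 0, 1
    for ch in reversed(str(number)):
        partial = PRODUCT_TABLE[int(ch)][digit] + carry   # table lookup, not multiplication
        result += (partial % 10) * place
        carry = partial // 10
        place *= 10
    return result + carry * place

print(multiply_by_digit(1234, 7))   # 8638

Each digit-by-digit product is read from the stored table rather than recomputed, which is the sense in which ENIAC "found the value of each product from a multiplication table stored in its memory".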
Early in the 50’s two important engineering discoveries changed the image of the electronic -
computer field, from one of fast but unreliable hardware to an image of relatively high reliability
and even more capability. These discoveries were the magnetic core memory and the Transistor -
Circuit Element. These technical discoveries quickly found their way into new models of digital
computers. RAM capacities increased from 8,000 to 64,000 words in commercially available
machines by the 1960s, with access times of 2 to 3 ms (milliseconds). These machines were very
expensive to purchase or even to rent and were particularly expensive to operate because of the
cost of expanding programming. Such computers were mostly found in large computer centers
operated by industry, government, and private laboratories - staffed with many programmers and
support personnel. This situation led to modes of operation enabling the sharing of the high
potential available. Many companies, such as Apple Computer and Radio Shack, introduced very
successful PC’s in the 1970's, encouraged in part by a fad in computer (video) games. In the 1980's
some friction occurred in the crowded PC field, with Apple and IBM keeping strong. In the
manufacturing of semiconductor chips, the Intel and Motorola Corporations were very competitive
into the 1980s, although Japanese firms were making strong economic advances, especially in the
area of memory chips. By the late 1980s, some personal computers were run by microprocessors
that, handling 32 bits of data at a time, could process about 4,000,000 instructions per second.

2.2 Definition of Computer System


A computer is an electronic device, operating under the control of instructions stored in its own
memory, that can accept data, process the data according to specified rules, produce results, and
store the results for future use. Within a few short decades, computers have revolutionized the world and become indispensable in every sphere of human life. In education,
computers are used for tasks such as writing papers, searching for articles, sending email and
participating in online classes. At work, people use computers to analyze data, make presentations,
conduct business transactions, communicate with customers and co-workers, control machines in
manufacturing facilities, and do many other things. At home, people use computers for tasks such
as paying bills, online shopping, communicating with friends and family and playing computer
games. Computers can do a wide variety of things because they can be programmed. This means
that computers are not designed to do just one job, but to do any job based on their programs.
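The definition above can be illustrated with a very small program. The sketch below (in Python; the grading rule and the file name are invented purely for illustration) accepts data, processes it according to a stored rule, produces results, and stores them for future use:

# Minimal illustration of the input -> process -> output -> store cycle
# from the definition above. The rule applied here (grading scores) and the
# output file name are illustrative assumptions, not part of the study text.

def process(scores):
    """Apply a specified rule: mark each score as PASS or FAIL."""
    return [(s, "PASS" if s >= 50 else "FAIL") for s in scores]

data = [35, 67, 80, 49]          # accept data (input)
results = process(data)          # process the data according to specified rules

for score, grade in results:     # produce results (output)
    print(score, grade)

with open("results.txt", "w") as f:   # store the results for future use
    for score, grade in results:
        f.write(f"{score},{grade}\n")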

2.3 Generations of Computers


In 1946, the first successful electronic computer, called ENIAC, was developed; it was the starting point of the modern generations of computers.
2.3.1 First Generation (1945-1956)
With the onset of World War II, governments sought to develop computers to exploit their potential
strategic importance. This increased funding for computer development projects hastened technical
progress. By 1941 German engineer Konrad Zuse had developed a computer, the Z3, to design
airplanes and missiles. The Allied forces, however, made greater strides in developing powerful
computers. In 1943, the British completed a secret code-breaking computer called Colossus to
decode German messages. The Colossus’s impact on the development of the computer industry was
rather limited for two important reasons. First, Colossus was not a general-purpose computer; it
was only designed to decode secret messages. Second, the existence of the machine was kept secret
until decades after the war. American efforts produced a broader achievement. Howard H. Aiken,
a Harvard engineer working with IBM, succeeded in producing an electronic calculator by 1944.
The purpose of the computer was to create ballistic charts for the U.S. Navy. It was about half as
long as a football field and contained about 500 miles of wiring. The Harvard-IBM Automatic Sequence Controlled Calculator, or Mark I for short, was an electromechanical relay computer. It used
electromagnetic signals to move mechanical parts. The machine was slow (taking 3-5 seconds per
calculation) and inflexible (in that sequences of calculations could not change); but it could perform
basic arithmetic as well as more complex equations. Another computer development spurred by the
war was the Electronic Numerical Integrator and Computer (ENIAC), produced by a partnership
between the U.S. government and the University of Pennsylvania. Consisting of 18,000 vacuum
tubes, 70,000 resistors and 5 million soldered joints, the computer was massive and consumed 160 kilowatts of electrical power. ENIAC was developed by John Presper Eckert and
John W. Mauchly, and unlike the Colossus and Mark I, the machine was a general-purpose
computer that was 1,000 times faster than Mark I. In the mid-1940’s John von Neumann joined the
University of Pennsylvania team, initiating concepts in computer design that remained central to
computer engineering for 40 years. Figure below shows the ENIAC computer.

Fig 1: the ENIAC Computer

Von Neumann designed the Electronic Discrete Variable Automatic Computer (EDVAC) in 1945
with a memory to hold both a stored program and data. This stored-program technique, as well as the "conditional control transfer" that allowed the computer to be stopped at any point and then resumed, allowed for greater versatility in computer programming. The key element of the von
Neumann architecture was the central processing unit, which allowed all computer functions to be
coordinated through a single source. In 1951, Remington
Rand built the Universal Automatic Computer I (UNIVAC I), which became one of the first
commercially available computers to take advantage of these advances.
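A minimal sketch can make the stored-program idea concrete. In the illustrative Python simulation below (the three-instruction machine is invented for this example and is not any historical design), instructions sit in memory alongside data, and a single control loop fetches and executes them one at a time:

# Minimal sketch of the stored-program idea: instructions and data live in the
# same memory and one control loop fetches, decodes and executes them in turn.
# The tiny instruction set below is invented for illustration only.

memory = [
    ("LOAD", 7),      # put the constant 7 in the accumulator
    ("ADD", 5),       # add the constant 5
    ("PRINT", None),  # output the accumulator
    ("HALT", None),   # stop the machine
]

acc = 0
pc = 0                               # program counter
while True:
    op, arg = memory[pc]             # fetch and decode
    pc += 1
    if op == "LOAD":
        acc = arg
    elif op == "ADD":
        acc += arg
    elif op == "PRINT":
        print(acc)                   # prints 12
    elif op == "HALT":
        break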

2.3.2 Second Generation (1954 – 1962)


The second generation saw several important developments at all levels of computer system
design, from the technology used to build the basic circuits to the programming languages used to
write scientific applications. Electronic switches in this era were based on discrete diode and
transistor technology with a switching time of approximately 0.3 microseconds. The first machines
to be built with this technology include TRADIC at Bell Laboratories in 1954 and TX-0 at MIT’s
Lincoln Laboratory. Memory technology was based on magnetic cores which could be accessed
in random order, as opposed to mercury delay lines, in which data was stored as an acoustic wave
that passed sequentially through the medium and could be accessed only when the data moved by
the I/O interface. Important innovations in computer architecture included index registers for
controlling loops and floating-point units for calculations based on real numbers. Prior to this, accessing successive elements in an array was quite tedious and often involved writing self-
modifying code (programs which modified themselves as they ran; at the time viewed as a
powerful application of the principle that programs and data were fundamentally the same, this
practice is now frowned upon as extremely hard to debug and is impossible in most high-level
languages). Floating point operations were performed by libraries of software routines in early
computers, but were done in hardware in second generation machines.
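For illustration, the loop below (written in modern Python rather than period assembly code, with made-up data) shows the pattern that hardware index registers made convenient: successive array elements are reached by updating an index, with no need for self-modifying code:

# Illustrative only: stepping through an array with an index variable,
# the pattern that hardware index registers made cheap on second-generation
# machines (earlier code often had to modify its own instructions instead).

values = [3.5, 2.0, 7.25, 1.5]   # made-up data for the example

total = 0.0
i = 0                            # plays the role of an index register
while i < len(values):
    total += values[i]           # element address = base + i; no self-modifying code
    i += 1

print(total)                     # 14.25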
Second generation computers replaced machine language with assembly language, allowing
abbreviated programming codes to replace long and difficult binary codes. Throughout the early
1960’s, there were a number of commercially successful second-generation computers used in
business, universities and government from companies such as Burroughs, Control Data,
Honeywell, IBM, Sperry-Rand and others. These second-generation computers were also of solid-
state design and contained transistors in place of vacuum tubes. They also contained all the
components associated with the modern-day computer: printers, tape storage, disk storage,
memory, operating systems, and stored programs. One important example was the IBM 1401,
which was universally accepted throughout industry, and is considered by many to be the Model
T of the computer industry. By 1965, most large businesses routinely processed financial information
using second generation computers. It was the stored program and programming language that
gave the computers the flexibility to finally be cost effective and productive for business use. More
sophisticated high-level languages such as COBOL (Common Business-Oriented Language) and
FORTRAN (Formula Translator) came into use. These languages replaced cryptic binary machine
code with words, sentences, and mathematical formulas, making it much easier to program a
computer. New types of careers such as programmer, analyst and computer systems expert, and
the entire software industry began with second generation computers.
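As a small illustration of what "mathematical formulas as code" means, the snippet below expresses a compound-interest formula in a modern high-level language (Python is used here as a stand-in for FORTRAN or COBOL; the figures are invented for the example):

# Illustration of a high-level "mathematical formula as code", in the spirit of
# FORTRAN-era languages but written in Python. The compound-interest example
# is an assumption made for this sketch, not taken from the study text.

principal = 1000.00   # starting amount
rate = 0.05           # 5% annual interest
years = 3

amount = principal * (1 + rate) ** years
print(round(amount, 2))   # about 1157.63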

2.3.3 Third Generation Computers (1964-1971)


Despite the improvement provided by transistors over the vacuum tube, they still generated a
great deal of heat, which damaged sensitive internal parts of a computer. The quartz rock
eliminated this problem. Jack Kilby, an engineer with Texas Instruments, developed the
integrated circuit (IC) in 1958. The IC combined three electronic components onto a small silicon
disc, which was made from quartz. Scientists later managed to fit even more components on a
single chip, called a semiconductor. As a result, computers became ever smaller as more
components were squeezed onto the chip.
The third generation brought huge gains in computational power. Innovations in this era include
the use of integrated circuits, or ICs (semiconductor devices with several transistors built into
one physical component), semiconductor memories starting to be used instead of magnetic cores,
microprogramming as a technique for efficiently designing complex processors, the coming of
age of pipelining and other forms of parallel processing, and the introduction of operating
systems and time-sharing.

Fig 2: A vacuum tube, a resistor and an integrated circuit


2.3.4 Fourth Generation (1972 – 1984)
The next generation of computer systems saw the use of large-scale integration (LSI –
1000 devices per chip) and very large-scale integration (VLSI – 100,000 devices per
chip) in the construction of computing elements. At this scale entire processors could fit onto a single chip, and for simple systems the entire computer (processor, main memory, and I/O controllers) could fit on one chip. Gate delays dropped to about 1 ns per gate.
After the integrated circuits, the only place to go was down in size, that is, Large Scale
Integration (LSI) could fit hundreds of components onto one chip. By the 1980’s, Very Large-
Scale Integration (VLSI) squeezed hundreds of thousands of components onto a chip. Ultra-
large-Scale Integration (ULSI) increased that number into the millions. It also increased their
power, efficiency and reliability.
The Intel 4004 chip, developed in 1971, took the integrated circuit one step further by locating
all the components of a computer (central processing unit, memory, and input and output
controls) on a minuscule chip. Unlike the previous integrated circuit that must be manufactured
to fit a special purpose, now one microprocessor could be manufactured and then programmed
to meet any number of demands. Soon everyday household items such as microwave ovens,
television sets and automobiles with electronic fuel injection incorporated microprocessors.
Such condensed power allowed everyday people to harness the power of computers. They were
no longer developed exclusively for large business or government contracts.

2.3.5 Fifth Generation (1984-1990)


This generation of computers introduced machines with hundreds of processors that could all be
working on different parts of a single program. The scale of integration in semiconductors enabled
building chips with a million components, and semiconductor memories became standard on all
computers. Many advances in the science of computer design and technology emerged such as
parallel processing, which replaces von Neumann’s single central processing unit design, with a
system harnessing the power of many processors to work as one. The superconductor technology,
which allows the flow of electricity with little or no resistance, greatly improved the speed of
information flow. Computer networks, both wide area network (WAN) and local area network
(LAN) technology developed rapidly and single-user workstations also became popular.
The fifth-generation computers also introduced the use of artificial intelligence (AI). For example,
expert systems assist doctors in making diagnoses by applying the problem-solving steps a doctor
might use in assessing a patient’s needs.
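The idea behind such expert systems can be sketched as a small set of if-then rules. The toy example below (in Python; the rules and symptoms are invented for illustration, are not medical advice, and are not taken from any real diagnostic system) reports the conclusion of every rule whose conditions are all present in the supplied facts:

# Toy rule-based "expert system" sketch, invented purely to illustrate encoding
# problem-solving steps as if-then rules.

RULES = [
    ({"fever", "cough"}, "possible flu"),
    ({"sneezing", "runny nose"}, "possible common cold"),
    ({"headache", "stiff neck", "fever"}, "refer for urgent examination"),
]

def diagnose(symptoms):
    """Return the conclusions of every rule whose conditions are all present."""
    findings = [conclusion for conditions, conclusion in RULES
                if conditions <= symptoms]
    return findings or ["no rule matched; gather more information"]

print(diagnose({"fever", "cough", "fatigue"}))   # ['possible flu']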

2.3.6 Sixth Generation (1990 and beyond)


Most of the developments in computer systems since 1990 were improvements over established
systems. This generation brought about gains in parallel computing in both the hardware and in
improved understanding of how to develop algorithms to exploit parallel architectures.
Workstation technology continued to improve, with processor designs now using a combination
of RISC, pipelining, and parallel processing. Wide area networks, network bandwidth and speed
of operation and networking capabilities have kept developing tremendously. Personal
computers (PCs) now operate with gigahertz processors, multi-gigabyte disks, gigabytes of RAM, color printers, high-resolution graphic monitors, stereo sound cards and graphical user interfaces. Thousands of software packages (operating systems and application software) exist today. This generation also brought microcontroller technology: microcontrollers are embedded inside other devices (often consumer products) so that they can control the features or actions of the product. They work as small computers inside devices and now serve as essential components in most machines.

2.3.7 Emerging Generation (Quantum Computers)


Conventional digital computers store information using bits represented by 0s or 1s. Quantum
computers use quantum bits, or qubits, to encode information as 0s, 1s, or both at the same time.
This superposition of states and other quantum mechanical phenomena of entanglement and
tunneling enables quantum computers to manipulate enormous combinations of states at once.
A qubit can be thought of as an imaginary sphere. Whereas a classical bit can be in two states
- at either of the two poles of the sphere - a qubit can be any point on the sphere. This means a
computer using these bits can store a huge amount of information using less energy than a
classical computer. D-Wave Systems developed an integrated quantum computer system running
on a 128-qubit processor in 2011. D-Wave Systems launched the D-Wave Two (512-qubit
quantum computer) in 2013 and D-Wave 2X (1000-qubit quantum computer) in 2015. In January
2017, D-Wave Systems launched the D-Wave 2000Q quantum computer (2000 qubit). Figure 3
shows the D-Wave 2000Q quantum computer.
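The superposition idea can be made concrete with a few lines of linear algebra. The sketch below (Python with NumPy, purely illustrative of the mathematics; it does not reflect how D-Wave or any real quantum hardware is programmed) represents a qubit as a two-component complex vector and derives measurement probabilities from the squared amplitudes:

# Minimal single-qubit simulation, for illustration only: just the linear
# algebra of superposition, not a model of any real quantum computer.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)   # state |0>
ket1 = np.array([0, 1], dtype=complex)   # state |1>

# A Hadamard gate puts |0> into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0

probs = np.abs(psi) ** 2                 # squared amplitudes give probabilities
print(probs)                             # [0.5 0.5] -> equally likely 0 or 1

# Sampling a measurement outcome collapses the superposition to one bit value.
outcome = np.random.choice([0, 1], p=probs)
print(outcome)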

In-text Question
1. What are third-generation computers made up of?
Answer
Third-generation computers are built from integrated circuits (ICs).

3.0 Summary and Conclusion


The history of computers starts out about 2000 years ago in Babylonia (Mesopotamia), at
the birth of the abacus, a wooden rack holding two horizontal wires with beads strung on
them. Blaise Pascal is usually credited for building the first digital computer in 1642. The
start of World War II produced a large need for computer capacity, especially for the
military. New weapons were made for which trajectory tables and other essential data were
needed. In 1942, John P. Eckert, John W. Mauchly, and their associates at the Moore School of Electrical Engineering at the University of Pennsylvania decided to build a high-speed electronic computer to do the job. This machine became known as ENIAC (Electronic Numerical Integrator and Computer). The size of ENIAC's numerical "word" was 10
decimal digits, and it could multiply two of these numbers at a rate of 300 per second, by
finding the value of each product from a multiplication table stored in its memory. ENIAC
was therefore about 1,000 times faster than the previous generation of relay computers.
ENIAC used 18,000 vacuum tubes, about 1,800 square feet of floor space, and consumed
about 180,000 watts of electrical power. It had punched card I/O, 1 multiplier, 1
divider/square rooter, and 20 adders using decimal ring counters, which served as adders
and also as quick-access (.0002 seconds) read-write register storage.
A computer is an electronic device, operating under the control of instructions stored in
its own memory, that can accept data, process the data according to specified rules, produce
results, and store the results for future use. Within a few short decades, computers have revolutionized the world and become indispensable in every sphere of human life. In education, computers are used for tasks such as writing papers, searching for
articles, sending email and participating in online classes. At work, people use computers
to analyze data, make presentations, conduct business transactions, communicate with
customers and co-workers, control machines in manufacturing facilities, and do many other
things.
4.0 Self-Assessment Questions
1. Define a computer
2. Distinguish between second-generation and fourth-generation computers.
3. Who is the father of computer?

5.0 Additional Activities (Videos, Animations & out of Class Activities)


https://spin.atomicobject.com/2016/07/31/eniac-programmers/
https://www.power-and-beyond.com/quantum-computers-how-do-they-work-and-what-might-we-expect-of-them-a-940315/

6.0 Reference/Further Reading


http://egov.uok.edu.in/elearningug/tutorials/5786_2_2016_161115130838.pdf
https://itcoursenotes.webs.com/IT%20Lec%201%20History%20of%20Computers.pdf
