History of Computers
The history of computers dates back to the earliest times, when people began using tools for computation. The whole history of computing can be divided into periods based on the technology used in computing devices.
Generations of computers describe the history of computers in terms of evolving technologies. With each new generation, computer circuitry and parts have been miniaturized, processing speed has increased, memory has grown larger, and usability and reliability have improved.
Note that the timeline specified for each generation is tentative and not definite. The generations are
actually based on evolving chip technology rather than any particular time frame.
Zeroth Generation
Probably the first computing device was the abacus, used by the Chinese as early as the fifth century BC for the systematic calculation of arithmetic operations.
In 1642, Blaise Pascal, a French scientist, invented an adding machine called Pascal's calculator, which represented each digit by the position of a gear.
Gottfried Wilhelm (von) Leibniz became one of the most prolific inventors in the field of mechanical
calculators. While working on adding automatic multiplication and division to Pascal's calculator, he
was the first to describe a pinwheel calculator in 1685 and invented the Leibniz wheel, used in the
arithmometer, the first mass-produced mechanical calculator. He also refined the binary number system,
which is the foundation of all digital computers. In 1804 Joseph Marie Jacquard, a French inventor,
devised a loom that used punched cards to direct the weaving pattern.
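As a quick illustration of the binary system Leibniz refined (a modern Python sketch, not period notation), every number can be written using only the digits 0 and 1, which map naturally onto the two states of an electronic switch:

    # Decimal 13 in binary: 13 = 8 + 4 + 1, which is 1101 in base 2
    n = 13
    print(bin(n))          # prints '0b1101'
    print(int("1101", 2))  # prints 13, converting binary back to decimal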
Charles Babbage designed several computing devices. In 1822 he designed the Difference Engine to calculate life tables (statistics of life expectancy) for the insurance business. This work led to the Analytical Engine, which he designed in 1833 and which provided a basis for the modern computer.
At the end of the 19th century, Herman Hollerith and James Powers designed a data processing machine for processing census data in the USA. Hollerith developed codes for representing both alphabetic and numeric data by punching holes in cards, along with a device for reading that data into the machine.
George Stibitz constructed an early automatic relay computer, the Complex Number Calculator, at Bell Telephone Laboratories in New York, beginning in 1939. Another major development was the Harvard Mark I, completed in 1944, which used electromagnetic relays.
First Generation
Computers of the first generation used vacuum tubes as the basic components for memory and for the circuitry of the central processing unit. The first general-purpose electronic computer, ENIAC (Electronic Numerical Integrator and Computer), was completed in 1945 and unveiled in 1946 at the University of Pennsylvania, USA. It used vacuum tubes as switching devices. These machines were very large, consumed a great deal of power, and emitted so much heat that they had to be housed in large air-conditioned rooms. Among other things, ENIAC was used to study the feasibility of thermonuclear weapons, to compute ballistic artillery firing data and engine thermal ignition, and, elsewhere, for weather prediction. A case in point was the US Army's need for machines that could compute artillery firing tables quickly: existing methods took almost two days per table, whereas the new machine computed the same data in seconds. Fortunately or unfortunately, it became available only after the end of World War II, in 1946.
The UNIVAC (Universal Automatic Computer), also built by engineers John W. Mauchly and J. Presper Eckert, was the first computer of the era designed for commercial rather than military use. It handled both alphabetic and numeric data well and was used by the US Census Bureau to enumerate the general population. It was later used to process payrolls, records, and company sales, and it even predicted the result of the 1952 presidential election.
First generation computers were the first general-purpose, truly digital computers. They arrived in time to replace the electromechanical systems, which were far too slow for the tasks assigned to them. During this period, computer programming was done mainly in machine language, and the user had to be both an electronics expert and a programmer to use the computer for any task.
Examples of first generation computers:
1. ENIAC (1946)
2. EDSAC (1949)
3. EDVAC (1950)
4. UNIVAC-1 (1951)
Besides thousands of resistors and capacitors, these computers could contain upwards of 17,000 vacuum tubes, which meant computer installations covered entire rooms!
Input and output were done using punched cards, magnetic drums, typewriters, and punched-card readers. Initially, technicians perforated the cards by hand; later this was done by machine.
Interfacing with first generation systems was done using plugboards and machine language. Technicians wired up electrical circuits by connecting numerous cables to plugboards. They then slotted the specified punched cards into the machine and waited hours for some form of computation, hoping that every one of the thousands of vacuum tubes lasted the distance; otherwise they had to go through the whole procedure again.
Notable development:
Around the same time, John von Neumann introduced the concept of the stored program, and one of the first stored-program digital computers, EDSAC (Electronic Delay Storage Automatic Calculator), ran its first program in 1949.
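To make the stored-program idea concrete: instructions live in the same memory as data, so the machine can fetch and execute them one after another, and a program can be loaded or changed as easily as data. Here is a toy sketch in Python (an invented three-instruction machine, not EDSAC's actual instruction set):

    # Toy stored-program machine: 'memory' holds the program itself,
    # and a simple loop fetches and executes each instruction in turn.
    memory = [
        ("LOAD", 5),      # put 5 into the accumulator
        ("ADD", 7),       # add 7 to the accumulator
        ("PRINT", None),  # output the accumulator
    ]
    accumulator = 0
    for opcode, operand in memory:
        if opcode == "LOAD":
            accumulator = operand
        elif opcode == "ADD":
            accumulator += operand
        elif opcode == "PRINT":
            print(accumulator)  # prints 12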
Second Generation
Second generation computers used transistors, which were far more reliable, easier to handle and maintain, and required much less power than vacuum tubes. Transistors replaced vacuum tubes in computers during this period. Magnetic cores were used to construct large random-access memories, and magnetic disk storage was also developed during this period.
Commercial applications developed rapidly and dominated computer use by the mid-1960s. This period also witnessed the development of high-level languages (such as FORTRAN, COBOL, ALGOL, and SNOBOL) and of operating systems. Computers used multiprogramming and batch-processing operating systems.
Second generation computers saw advances in data input and output procedures. Initially, these processes were similar to those of the last first generation models. They were tedious because they involved multiple personnel carrying punched cards from room to room.
To speed up the process, the batch system was designed and implemented. It involved collecting multiple jobs on punched cards and transferring them onto a single magnetic tape using a fairly small and inexpensive system such as the IBM 1401. Processing, on the other hand, was done on a more powerful system such as the IBM 7094.
When data manipulation was complete, the results were transferred back to a magnetic tape. To do this efficiently, early monitor programs such as IBM's Fortran Monitor System for the 7094 were used; these were the forerunners of the operating system software to come. Using a smaller system again, say the IBM 1401, the output data was then printed or punched out from the tape.
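A minimal sketch of that batch workflow, in modern Python purely for illustration (the job names are invented, not historical data):

    # Illustrative simulation of second generation batch processing:
    # a cheap machine gathers card jobs onto tape, a powerful machine
    # processes the whole tape in one batch, and output goes back to tape.
    punched_card_jobs = ["payroll", "inventory", "census tabulation"]  # hypothetical jobs

    # Step 1: small system (e.g. an IBM 1401) copies jobs from cards to an input tape
    input_tape = list(punched_card_jobs)

    # Step 2: big system (e.g. an IBM 7094) runs the entire batch unattended
    output_tape = ["results of " + job for job in input_tape]

    # Step 3: small system prints the output tape offline
    for record in output_tape:
        print(record)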
This advancement was driven largely by the upgrade from restrictive binary machine code to languages that fully supported symbolic and alphanumeric coding. Programmers could now write in assembly languages and in high-level languages such as FORTRAN, COBOL, SNOBOL, and, from 1964, BASIC.
Examples of second generation computers:
• IBM-7000
• CDC 3000 series
• UNIVAC 1107
• IBM-7094
• MARK III
• Honeywell 400
Third Generation
The period of the third generation was roughly 1965-1971. Third generation computers used integrated circuits (ICs) in place of transistors; a single IC packs many transistors, resistors, and capacitors along with the associated circuitry.
The IC was invented by Jack Kilby. This development made computers smaller, more reliable, and more efficient. In this generation, remote processing, time-sharing, and multiprogramming operating systems were used, along with high-level languages (FORTRAN-II to FORTRAN-IV, COBOL, PASCAL, PL/1, BASIC, ALGOL-68, etc.).
Examples of third generation computers:
• IBM-360 series
• Honeywell-6000 series
• PDP (Programmed Data Processor)
• IBM-370/168
• TDC-316
Fourth Generation
Fourth generation computers, built on Very Large Scale Integration (VLSI) microprocessor chips, became more powerful, compact, reliable, and affordable. As a result, they gave rise to the personal computer (PC) revolution. In this generation, time-sharing, real-time networks, and distributed operating systems were used, along with high-level languages like C, C++, dBASE, etc.
Examples of fourth generation computers:
• DEC 10
• STAR 1000
• PDP 11
• CRAY-1 (supercomputer)
• CRAY X-MP (supercomputer)
Fifth Generation
In the fifth generation, VLSI technology gave way to ULSI (Ultra Large Scale Integration) technology, resulting in microprocessor chips with millions of electronic components.
This generation is based on parallel-processing hardware and AI (Artificial Intelligence) software. AI is a branch of computer science concerned with the means and methods of making computers think like human beings. High-level languages such as C, C++, Java, .NET, etc., are used in this generation.
Fifth generation computers can understand spoken words and imitate human reasoning, and they can respond to their surroundings using different types of sensors. Scientists are constantly working to increase the processing power of computers and to create machines with genuine intelligence through advanced programming and technologies. IBM's Watson, which defeated human champions on the quiz show Jeopardy! in 2011, is one example. Advances in modern technologies will continue to revolutionize the computer in the future.
AI includes:
• Robotics
• Neural Networks
• Game Playing
• Development of expert systems to make decisions in real-life situations
• Natural language understanding and generation
Other key developments of this generation include:
• ULSI technology
• Development of soft artificial intelligence
• Development of Natural language processing
• Advancement in Parallel Processing
• Advancement in Superconductor technology
• More user-friendly interfaces with multimedia features
• Availability of very powerful and compact computers at cheaper rates
Some computer types of this generation are:
• Desktop
• Laptop
• NoteBook
• UltraBook
• ChromeBook