Computer History: Classification of Generations of Computers
Generations of Computers
Updated on February 5, 2019
Alfred Amuno
Alfred is a long-time teacher and computer enthusiast who works with and
troubleshoots a wide range of computing devices.
The generations of computers illustrate the miniaturization of transistors and integrated circuit chips over the years
Generations of computers describe the history of computing in terms of evolving technologies. With
each new generation, computer circuitry, size, and parts have been miniaturized, processing power and
speed have multiplied, memory has grown larger, and usability and reliability have improved.
Note that the timeline specified for each generation is approximate rather than definitive. The generations are
actually based on evolving chip technology rather than on any particular time frame.
The five generations of computers are characterized by the processing technologies through which
electrical current flows, listed below:
First generation: vacuum tubes
Second generation: transistors
Third generation: integrated circuits
Fourth generation: microprocessors
Fifth generation: artificial intelligence
A case in point was the U.S. Army's need for machines capable of computing artillery firing
tables fast enough; existing methods took almost two days. When completed, the new machines
computed this table data in seconds. Fortunately or unfortunately, they became available only after
the end of World War II, in 1946.
The first generation of computers used vacuum tubes for amplification and switching. The
tubes were sealed glass containers about the size of light bulbs. The vacuum inside the sealed glass allowed
current to flow, without wires, from the heated filament to a metal plate. Because there were no moving
parts in the system, this flow could amplify current and enable the computer to carry out its assigned tasks.
Vacuum tubes also acted as switches, turning circuits on and off.
Besides boasting thousands of resistors and capacitors, these computers used as many as
17,000 or more vacuum tubes, which meant computer installations covered entire rooms!
Input and output were done using punch cards, magnetic drums, typewriters, and punch card readers.
Initially, technicians perforated the cards by hand; later this was done by the computers themselves.
Interfacing with first-generation systems was done using plugboards and machine language. The
technicians wired up electrical circuits by connecting numerous cables to plugboards.
They then slotted the specified punched cards into the machine and waited, sometimes for hours, for some form of
computation, hoping every one of the thousands of vacuum tubes lasted the distance. If one failed,
they had to go through the whole procedure again.
A record machine plugboard for IBM 1401 | Source
These machines were intended for low-level operations, and programming was done using only the
binary digits 0 and 1. The systems could solve only one problem at a time. Assembly language
and operating system software were nonexistent.
One of the most outstanding computers of this era was the ENIAC (Electronic Numerical Integrator
and Computer), designed and built by engineers John W. Mauchly and J. Presper Eckert
of the University of Pennsylvania. Its assembly was done by a team of fifty men.
It was 1,000 times faster than the electromechanical computers that preceded it, but it was slow to
reprogram.
Among other things, the ENIAC was used to study the feasibility of thermonuclear weaponry, the firing
of ballistic artillery, and engine thermal ignition, and, elsewhere, for weather prediction.
The left side of The ENIAC computer | Source
These systems were enormous, occupied entire rooms, and consumed a great deal of electric power,
which made them generate unbearable heat.
Compared with the more than 17,000 vacuum tubes in the ENIAC, the UNIVAC I used just over 5,000 vacuum
tubes. It was also half the size of its predecessor, and 46 units were eventually sold.
UNIVAC as exhibited in the Vienna Technical Museum | Source
Just like vacuum tubes, transistors are switches or electronic gates used to amplify or control
current, or to switch electric signals on and off. They are called semiconductors because they are made of
elements whose conductivity lies between that of conductors and insulators.
The transistor was invented at Bell Laboratories in 1947 by scientists William Shockley,
John Bardeen, and Walter Brattain, but it did not see widespread use in computers until the mid-1950s.
Second-generation computers saw advancements in data input and output procedures. Initially, these
processes were similar to those of the last first-generation models. They were tedious because they
involved multiple personnel carrying punched cards from room to room.
To speed up the process, the batch system was devised and implemented. It involved collecting
multiple jobs onto punched cards and transferring them to a single magnetic tape using a
fairly small and inexpensive system. The IBM 1401 was one such computer. Processing, on the
other hand, was done using a more powerful system like the IBM 7094.
When data manipulation was complete, the files were transferred back to a magnetic tape. To do
this efficiently, IBM's operating system for the 7094 and the Fortran Monitor System were used.
These were the harbingers of the operating system software to come.
Using a smaller system again, say the IBM 1401, the data was then printed out or punched onto cards as
output.
IBM 1401 computer with one circuit card access drawer opened, on display at the Computer History
Museum. | Source
Besides the development of operating system software, other commercial applications were also
hitting the 'shelves'. This was largely due to the move away from restrictive, binary-based
machine code to languages that fully supported symbolic and alphanumeric coding. Programmers
could now write in assembly language and in high-level languages like FORTRAN, COBOL, SNOBOL, and,
from 1964, BASIC.
Used transistors
Faster and more reliable than first generation systems
Were slightly smaller, cheaper, faster
Generated heat, though a little less than first-generation systems
Still relied on punch cards and printouts for input/output
Allowed assembly and high-level languages
Stored data in magnetic media
Were still costly
Needed air conditioning
Introduced assembly language and operating system software
Operator's console for IBM 7094 at the Computer History Museum | Source
Early mainframes and supercomputers were among the machines that took advantage of
transistors. The UNIVAC LARC mainframe from Sperry Rand (1960), the IBM 7030 Stretch
supercomputer (1961), and the CDC 6600 supercomputer (1963) were examples of these systems.
IBM-7000
CDC 3000 series
UNIVAC 1107
IBM-7094
MARK III
Honeywell 400
Gordon Moore, who in 1965 had predicted that the number of transistors on a chip would double every year, revised this projection ten years later, in 1975, to a doubling roughly every two years.
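As a rough editorial illustration of what that revised rate implies (a sketch, not a figure from the original article), a doubling every two years means the transistor count N after t years is

    N(t) = N_0 * 2^(t / 2)

so over a single decade the count grows by a factor of 2^(10/2) = 2^5 = 32.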
The IC sought to eliminate the cumbersome procedures that went into designing transistor circuitry.
The manual interconnection of capacitors, diodes, and rectifiers in transistor circuits was time-consuming
and not completely reliable.
Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor independently arrived at
the integrated circuit in 1958 and 1959, respectively. Kilby built his IC on germanium,
whereas Noyce built his on a silicon chip.
Among the first systems to use the IC was the IBM 360, which was packed with enough muscle to handle both
commercial and scientific assignments.
Almost all electronic devices today use some form of integrated circuits placed on printed circuit
boards.
The IC circuitry aside, interaction with computers improved. Instead of punched cards and printouts,
keyboards and better input peripherals were used to enter data, which was displayed as output
on visual display units.
Computers now used operating system software to manage hardware and resources. This
allowed systems to run several applications at a time, because a central program
monitored memory allocation.
Computers became accessible to a mass audience because of their smaller size and fairer cost.
This generation also ushered in the concept of 'computer family' which challenged manufacturers to
come up with computer components that were compatible with other systems.
Used ICs
Used parallel processing
Were slightly smaller, cheaper, faster
Used motherboards
Data was input using keyboards
Output was visualized on the monitors
Used operating systems, thus permitting multitasking
Simplified programming languages, e.g. BASIC
The next generation of mainframes and supercomputers took advantage of integrated circuits (ICs).
The Scientific Data Systems Sigma 7 mainframe (1966), the IBM 360 (1964), and the CDC 8600
supercomputer (1969) were examples of these systems.
Other examples of third generation computers:
IBM-360
Programmed Data Processor (PDP)
IBM-370
In November 1971, Intel, through its engineers Ted Hoff, Federico Faggin, and Stan Mazor,
introduced the world's first single-chip microprocessor, the Intel 4004. It boasted 2,300 transistors
and measured 1/8 inch by 1/16 inch.
What in the first generation filled an entire room could now fit in the palm of a hand.
On its own, the new microchip was as powerful as the ENIAC computer of 1946, and it merged most
of the functions that drive a computer, such as the central processing unit, memory, and input and output
controls.
The Intel C4004 microprocessor initiated the 4th computer generation | Source
The Xerox Alto, arguably the first PC, from 1973. It was powered by the SN74S181N ALU chip from Texas
Instruments | Source
Challenged by the Xerox Alto, serious work began in 1974 when Intel came up with a general-purpose
8-bit microprocessor it named the 8080. Intel then asked Gary Kildall, a consultant, to
write an operating system for its new chip. This led to the disk-based operating system
known as Control Program for Microcomputers (CP/M).
In 1981, International Business Machines introduced its first computer for the home, which ran on an
Intel 8088 processor. It was known as the IBM PC, with PC standing for personal computer. IBM partnered
with Bill Gates, who bought a disk operating system from Seattle Computer Products and had it
distributed with IBM's new computer.
The IBM PC architecture became the de facto market standard model, which other PC makers
emulated.
Apple, under Steve Jobs, changed the software game when it released the Apple Macintosh
computer with an improved GUI (graphical user interface) in 1984, using interface ideas learned
from Xerox PARC.
Remember that both Control Program for Microcomputers (CP/M) and the Disk Operating System (DOS) were
command-line operating systems that required the user to interface with the computer using the
keyboard.
The Apple Macintosh of 1984 | Source
Following the success of Apple's GUI, Microsoft integrated a shell version of Windows into its
DOS offering in 1985. Windows was used this way for the next ten years, until it was reinvented as
Windows 95, a true operating system complete with all the right utilities.
While software became commonplace and corporations began charging money for it, a new
movement of programmers emerged in 1991. Led by Linus Torvalds, they pioneered a free, open-source
operating system project called Linux.
Besides Linux, other open-source operating systems and free software were distributed to cater to
office, networking, and home computers.
Examples of open source and free software:
Ubuntu OS
Mozilla Firefox browser
Open Office
MySQL
VLC media player
Through the 1980s and 2000s, personal computers, and desktops in particular, became
commonplace. They were cheap and were installed in offices, schools, and homes. Software that ran on
these computers also became readily available for little money or for free.
Desktops
All-in-one
Laptops
Workstations
Nettops
Tablets
Smartphones
A desktop computer
Soon, microprocessors moved out of the preserve of desktop computers and into other platforms in
businesses and homes. First came the laptop, followed by tablets, smartphones, consoles,
embedded systems, and smart cards, popularized by the need to use the internet while on the move.
The proliferation of mobile computing devices soon challenged the dominance of desktops. According
to comScore's March 2017 publication Mobile's Hierarchy of Needs, mobile devices accounted for
60% of all digital minutes worldwide.
IBM z9 (2005), z10 (2008) and z13 (2015) are examples of mainframes.
Cray-1 (1975), Fujitsu K (2011), Titan (2013), and Sunway TaihuLight (2016) are examples of
supercomputers.
The Cray-1 supercomputer of 1975 | Source
The fifth generation is designed to improve human and machine interaction by harnessing human
intelligence and taking advantage of the vast amounts of data that have accumulated since the dawn of the
digital age.
It is viewed as a cyber-physical system and arises from the theory, concepts, and implementation of
artificial intelligence (AI) and machine learning (ML). AI and ML are not the same, but the terms are often
used interchangeably to mean the science of crafting devices and programs that are intelligent enough
to interact with humans, other computers, the environment, and programs, by mining big data to
achieve set goals.
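As a minimal illustration of the 'learning from data' idea described above (an editorial sketch in Python, with made-up data, not tied to any specific AI project named in this article), the tiny program below 'learns' from a handful of labeled examples and classifies a new input by finding its nearest neighbour:

    # Minimal sketch of "learning from data": a 1-nearest-neighbour classifier.
    # The data points and labels are invented purely for illustration.
    import math

    # Each training example: (hours of daylight, average temperature) -> season label
    training_data = [
        ((8.0, 2.0), "winter"),
        ((9.0, 4.0), "winter"),
        ((15.0, 22.0), "summer"),
        ((16.0, 25.0), "summer"),
    ]

    def distance(a, b):
        # Euclidean distance between two feature vectors
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

    def predict(sample):
        # "Learning" here is simply remembering past examples and
        # answering with the label of the closest one.
        closest = min(training_data, key=lambda item: distance(item[0], sample))
        return closest[1]

    print(predict((14.5, 20.0)))  # prints "summer"

Real AI and ML systems use far more sophisticated models and vastly larger data sets, but the basic pattern is the same: past examples shape how the program responds to new input.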
The proliferation of computing devices that can self-learn, respond, and interact in normal and perhaps
novel ways, based on acquired experience and their environment, has also given momentum to the
Internet of Things (IoT) concept.
At their peak, and with the right algorithms, computers will probably exhibit and process quite high
levels of deep learning, from which humans, too, can learn.
Many AI projects are already being implemented while others are still in developmental stages.
Pioneers in accelerating AI include Google, Amazon, Microsoft, Apple, Facebook and Tesla.
The initial implementations can now be seen in smart home devices, which are meant to automate and
integrate household activities through audio and visual devices, and in self-driving cars, which are already
gracing the roads.
Coral (red) version of the Google Home Mini smart speaker | Source
The larger goal of AI is to enable devices to:
Quantum computing
Parallel processing
Ongoing AI projects: