Computer
A broad range of industrial and consumer products use computers as control systems,
including simple special-purpose devices like microwave ovens and remote controls,
and factory devices like industrial robots. Computers are at the core of general-
purpose devices such as personal computers and mobile devices such as smartphones.
Computers power the Internet, which links billions of computers and users.
Early computers were meant to be used only for calculations. Simple manual
instruments like the abacus have aided people in doing calculations since ancient
times. Early in the Industrial Revolution, some mechanical devices were built to
automate long, tedious tasks, such as guiding patterns for looms. More
sophisticated electrical machines did specialized analog calculations in the early
20th century. The first digital electronic calculating machines were developed
during World War II, both electromechanical and using thermionic valves. The first
semiconductor transistors in the late 1940s were followed by the silicon-based
MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in the
late 1950s, leading to the microprocessor and the microcomputer revolution in the
1970s. The speed, power, and versatility of computers have been increasing
dramatically ever since then, with transistor counts increasing at a rapid pace
(Moore's law noted that counts doubled every two years), leading to the Digital
Revolution during the late 20th and early 21st centuries.
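As a purely arithmetic illustration of that doubling rule (the starting figure of 2,300 is illustrative, roughly the transistor count of an early-1970s microprocessor, and the loop is not a historical dataset), a few lines of Python show how quickly such growth compounds:

# Moore's law as stated above: transistor counts double roughly every two years.
# Starting from an illustrative 2,300 transistors in 1971, the doubling rule alone
# yields a count in the tens of billions by the early 2020s.
count = 2300
for year in range(1971, 2022, 2):
    print(year, f"{count:,}")
    count *= 2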
Etymology
A human computer, with microscope and calculator, 1952
It was not until the mid-20th century that the word acquired its modern definition;
according to the Oxford English Dictionary, the first known use of the word
computer was in a different sense, in a 1613 book called The Yong Mans Gleanings by
the English writer Richard Brathwait: "I haue [sic] read the truest computer of
Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy
dayes into a short number." This usage of the term referred to a human computer, a
person who carried out calculations or computations. The word continued to have the
same meaning until the middle of the 20th century. During the latter part of this
period, women were often hired as computers because they could be paid less than
their male counterparts.[1] By 1943, most human computers were women.[2]
The Online Etymology Dictionary gives the first attested use of computer in the
1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". It
dates the use of the term to mean "'calculating machine' (of any type)" to 1897,
and the "modern use" of the term, meaning 'programmable digital electronic
computer', to "1945 under this name; [in a] theoretical [sense] from 1937, as
Turing machine".[3] The name has remained, although modern computers are capable
of many higher-level functions.
History
Main articles: History of computing and History of computing hardware
For a chronological guide, see Timeline of computing.
Pre-20th century
The Chinese suanpan (算盘). The number represented on this abacus is 6,302,715,408.
The abacus was initially used for arithmetic tasks. The Roman abacus was developed
from devices used in Babylonia as early as 2400 BCE. Since then, many other forms
of reckoning boards or tables have been invented. In a medieval European counting
house, a checkered cloth would be placed on a table, and markers moved around on it
according to certain rules, as an aid to calculating sums of money.[5]
The Antikythera mechanism, dating back to ancient Greece circa 200–80 BCE, is an
early analog computing device.
The Antikythera mechanism is believed to be the earliest known mechanical analog
computer, according to Derek J. de Solla Price.[6] It was designed to calculate
astronomical positions. It was discovered in 1901 in the Antikythera wreck off the
Greek island of Antikythera, between Kythera and Crete, and has been dated to
c. 100 BCE. Devices of comparable complexity to the Antikythera
mechanism would not reappear until the fourteenth century.[7]
The planimeter was a manual instrument to calculate the area of a closed figure by
tracing over it with a mechanical linkage.
A slide rule
The slide rule was invented around 1620–1630, by the English clergyman William
Oughtred, shortly after the publication of the concept of the logarithm. It is a
hand-operated analog computer for doing multiplication and division. As slide rule
development progressed, added scales provided reciprocals, squares and square
roots, cubes and cube roots, as well as transcendental functions such as logarithms
and exponentials, circular and hyperbolic trigonometry and other functions. Slide
rules with special scales are still used for quick performance of routine
calculations, such as the E6B circular slide rule used for time and distance
calculations on light aircraft.
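The principle behind the slide rule is the logarithm identity log(xy) = log x + log y: adding the scale lengths that represent two numbers corresponds to multiplying them. The following minimal Python sketch is illustrative only, not a description of any particular slide rule:

import math

def slide_rule_multiply(x, y):
    # A slide rule adds lengths proportional to log10(x) and log10(y);
    # the product is read where the combined length falls on the scale.
    combined_length = math.log10(x) + math.log10(y)
    return 10 ** combined_length

print(slide_rule_multiply(2.5, 4.0))  # ~10.0, up to floating-point rounding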
In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a
series of advanced analog machines that could find the real and complex roots of
polynomials;[17][18][19][20] his results were published in 1901 by the Paris
Academy of Sciences.[21]
First computer
Charles Babbage
Babbage's Analytical Engine was about a century ahead of its time. All the parts for his machine
had to be made by hand – this was a major problem for a device with thousands of
parts. Eventually, the project was dissolved with the decision of the British
Government to cease funding. Babbage's failure to complete the analytical engine
can be chiefly attributed to political and financial difficulties as well as his
desire to develop an increasingly sophisticated computer and to move ahead faster
than anyone else could follow. Nevertheless, his son, Henry Babbage, completed a
simplified version of the analytical engine's computing unit (the mill) in 1888. He
gave a successful demonstration of its use in computing tables in 1906.
Analog computers
Main article: Analog computer
The art of mechanical analog computing reached its zenith with the differential
analyzer, completed in 1931 by Vannevar Bush at MIT.[35] By the 1950s, the success
of digital electronic computers had spelled the end for most analog computing
machines, but analog computers remained in use in some specialized applications
such as education (the slide rule) and aircraft (control systems).
Digital computers
Electromechanical
Claude Shannon's 1937 master's thesis laid the foundations of digital computing:
his insight that Boolean algebra could be applied to the analysis and synthesis of
switching circuits is the basic concept that underlies all electronic digital
computers.[36][37]
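As an illustrative sketch, and not anything taken from Shannon's thesis, a one-bit half adder shows the kind of correspondence he described: the outputs of a switching circuit are Boolean functions of its inputs.

def half_adder(a: bool, b: bool):
    # The sum bit is XOR of the inputs and the carry bit is AND of the inputs:
    # pure Boolean expressions of the sort Shannon showed switching circuits realize.
    return (a != b), (a and b)

for a in (False, True):
    for b in (False, True):
        s, c = half_adder(a, b)
        print(int(a), int(b), "->", int(s), int(c))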
By 1938, the United States Navy had developed the Torpedo Data Computer, an
electromechanical analog computer for submarines that used trigonometry to solve
the problem of firing a torpedo at a moving target. During World War II, similar
devices were developed in other countries.[38]
Replica of Konrad Zuse's Z3, the first fully automatic, digital (electromechanical)
computer
Early digital computers were electromechanical; electric switches drove mechanical
relays to perform the calculation. These devices had a low operating speed and were
eventually superseded by much faster all-electric computers, originally using
vacuum tubes. The Z2, created by German engineer Konrad Zuse in 1939 in Berlin, was
one of the earliest examples of an electromechanical relay computer.[39]
Zuse's next computer, the Z4, became the world's first commercial computer; after
initial delay due to the Second World War, it was completed in 1950 and delivered
to the ETH Zurich.[48] The computer was manufactured by Zuse's own company, Zuse
KG, which was founded in Berlin in 1941 as the first company with the sole purpose
of developing computers.[48] The Z4 served as the inspiration for the
construction of the ERMETH, the first Swiss computer and one of the first in
Europe.[49]
ENIAC was the first electronic, Turing-complete device, and performed ballistics
trajectory calculations for the United States Army.
The ENIAC[59] (Electronic Numerical Integrator and Computer) was the first
electronic programmable computer built in the U.S. Although the ENIAC was similar
to the Colossus, it was much faster, more flexible, and it was Turing-complete.
Like the Colossus, a "program" on the ENIAC was defined by the states of its patch
cables and switches, a far cry from the stored program electronic machines that
came later. Once a program was written, it had to be mechanically set into the
machine with manual resetting of plugs and switches. The programmers of the ENIAC
were six women, often known collectively as the "ENIAC girls".[60][61]
It combined the high speed of electronics with the ability to be programmed for
many complex problems. It could add or subtract 5,000 times a second, a thousand
times faster than any other machine. It also had modules to multiply, divide, and
take square roots. High-speed memory was limited to 20 words (about 80 bytes).
Built under the direction of John Mauchly and J. Presper Eckert at the University
of Pennsylvania, ENIAC's development and construction lasted from 1943 to full
operation at the end of 1945. The machine was huge, weighing 30 tons, using 200
kilowatts of electric power, and containing over 18,000 vacuum tubes, 1,500 relays,
and hundreds of thousands of resistors, capacitors, and inductors.[62]
Modern computers
Concept of modern computer
The principle of the modern computer was proposed by Alan Turing in his seminal
1936 paper,[63] On Computable Numbers. Turing proposed a simple device that he
called "Universal Computing machine" and that is now known as a universal Turing
machine. He proved that such a machine is capable of computing anything that is
computable by executing instructions (a program) stored on tape, allowing the
machine to be programmable. The fundamental concept of Turing's design is the
stored program, where all the instructions for computing are stored in memory. Von
Neumann acknowledged that the central concept of the modern computer was due to
this paper.[64] Turing machines are to this day a central object of study in the
theory of computation. Except for the limitations imposed by their finite memory stores,
modern computers are said to be Turing-complete, which is to say, they have
algorithm execution capability equivalent to a universal Turing machine.
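A minimal Python sketch, using a made-up transition table rather than anything from Turing's paper, shows how little machinery the model needs: a tape, a read/write head, a current state, and a transition function.

# Minimal Turing machine simulator (illustrative sketch).
# The transition table is hypothetical: it flips 0s and 1s and halts on a blank.
table = {
    ("scan", "0"): ("scan", "1", 1),   # (state, symbol) -> (next state, write, head move)
    ("scan", "1"): ("scan", "0", 1),
    ("scan", "_"): ("halt", "_", 0),
}

def run(tape, state="scan", head=0):
    cells = dict(enumerate(tape))      # sparse tape; "_" is the blank symbol
    while state != "halt":
        state, write, move = table[(state, cells.get(head, "_"))]
        cells[head] = write
        head += move
    return "".join(cells[i] for i in sorted(cells))

print(run("0110_"))  # -> 1001_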
Stored programs
Main article: Stored-program computer
Three tall racks containing electronic circuit boards
A section of the reconstructed Manchester Baby, the first electronic stored-program
computer
Early computing machines had fixed programs. Changing a machine's function required
rewiring and restructuring the machine.[52] This changed with the proposal of the
stored-program computer. A stored-program computer includes by design an
instruction set and can store in memory a set of instructions (a program) that
details the computation. The theoretical basis for the stored-program computer was
laid out by Alan Turing in his 1936 paper. In 1945, Turing joined the National
Physical Laboratory and began work on developing an electronic stored-program
digital computer. His 1945 report "Proposed Electronic Calculator" was the first
specification for such a device. John von Neumann at the University of Pennsylvania
also circulated his First Draft of a Report on the EDVAC in 1945.[34]
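The following toy sketch illustrates the idea; its three-instruction set is invented for illustration and is not drawn from the ACE or EDVAC designs. Instructions and data sit in the same memory, and a fetch-decode-execute loop steps through them.

# Toy stored-program machine (illustrative; the instruction set is hypothetical).
# Program and data share one memory; a program counter fetches the next instruction.
memory = [
    ("LOAD", 7),     # 0: acc = memory[7]
    ("ADD", 8),      # 1: acc = acc + memory[8]
    ("STORE", 9),    # 2: memory[9] = acc
    ("HALT", None),  # 3: stop
    None, None, None,
    2, 3, 0,         # 7 and 8 hold operands; 9 receives the result
]

acc, pc = 0, 0
while True:
    op, addr = memory[pc]   # fetch
    pc += 1
    if op == "LOAD":        # decode and execute
        acc = memory[addr]
    elif op == "ADD":
        acc += memory[addr]
    elif op == "STORE":
        memory[addr] = acc
    elif op == "HALT":
        break

print(memory[9])  # 5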
The Manchester Baby was the world's first stored-program computer. It was built at
the University of Manchester in England by Frederic C. Williams, Tom Kilburn and
Geoff Tootill, and ran its first program on 21 June 1948.[65] It was designed as a
testbed for the Williams tube, the first random-access digital storage device.[66]
Although the computer was described as "small and primitive" by a 1998
retrospective, it was the first working machine to contain all of the elements
essential to a modern electronic computer.[67] As soon as the Baby had demonstrated
the feasibility of its design, a project began at the university to develop it into
a practically useful computer, the Manchester Mark 1.
The Mark 1 in turn quickly became the prototype for the Ferranti Mark 1, the
world's first commercially available general-purpose computer.[68] Built by
Ferranti, it was delivered to the University of Manchester in February 1951. At
least seven of these later machines were delivered between 1953 and 1957, one of
them to Shell labs in Amsterdam.[69] In October 1947 the directors of British
catering company J. Lyons & Company decided to take an active role in promoting the
commercial development of computers. Lyons's LEO I computer, modelled closely on
the Cambridge EDSAC of 1949, became operational in April 1951[70] and ran the
world's first routine office computer job.
Transistors
Main articles: Transistor and History of the transistor
Further information: Transistor computer and MOSFET
MOSFET (MOS transistor), showing gate (G), body (B), source (S) and drain (D)
terminals. The gate is separated from the body by an insulating layer (pink).
The metal–oxide–semiconductor field-effect transistor (MOSFET), also known as the MOS
transistor, was invented at Bell Labs between 1955 and 1960[77][78][79][80][81][82]
and was the first truly compact transistor that could be miniaturized and mass-
produced for a wide range of uses.[73] With its high scalability,[83] and much
lower power consumption and higher density than bipolar junction transistors,[84]
the MOSFET made it possible to build high-density integrated circuits.[85][86] In
addition to data processing, it also enabled the practical use of MOS transistors
as memory cell storage elements, leading to the development of MOS semiconductor
memory, which replaced earlier magnetic-core memory in computers. The MOSFET led to
the microcomputer revolution,[87] and became the driving force behind the computer
revolution.[88][89] The MOSFET is the most widely used transistor in computers,[90]
[91] and is the fundamental building block of digital electronics.[92]
Integrated circuits
Main articles: Integrated circuit and Invention of the integrated circuit
Further information: Planar process and Microprocessor
The first working ICs were invented by Jack Kilby at Texas Instruments and Robert
Noyce at Fairchild Semiconductor.[94] Kilby recorded his initial ideas concerning
the integrated circuit in July 1958, successfully demonstrating the first working
integrated example on 12 September 1958.[95] In his patent application of 6
February 1959, Kilby described his new device as "a body of semiconductor
material ... wherein all the components of the electronic circuit are completely
integrated".[96][97] However, Kilby's invention was a hybrid integrated circuit
(hybrid IC), rather than a monolithic integrated circuit (IC) chip.[98] Kilby's IC
had external wire connections, which made it difficult to mass-produce.[99]
Noyce also came up with his own idea of an integrated circuit half a year later
than Kilby.[100] Noyce's invention was the first true monolithic IC chip.[101][99]
His chip solved many practical problems that Kilby's had not. Produced at Fairchild
Semiconductor, it was made of silicon, whereas Kilby's chip was made of germanium.
Noyce's monolithic IC was fabricated using the planar process, developed by his
colleague Jean Hoerni in early 1959. In turn, the planar process was based on Carl
Frosch and Lincoln Derick's work on semiconductor surface passivation by silicon
dioxide.[102][103][104][105][106][107]
Systems on a chip (SoCs) are complete computers on a microchip (or chip) the size of
a coin.[117] They may or may not have integrated RAM and flash memory. If not
integrated, the RAM is usually placed directly above (known as package on package)
or below (on the opposite side of the circuit board) the SoC, and the flash memory
is usually placed right next to the SoC. This is done to improve data transfer
speeds, as the data signals do not have to travel long distances. Since ENIAC in
1945, computers have advanced enormously, with modern SoCs (such as the Snapdragon
865) being the size of a coin while also being hundreds of thousands of times more
powerful than ENIAC, integrating billions of transistors, and consuming only a few
watts of power.
Mobile computers
The first mobile computers were heavy and ran from mains power. The 50 lb (23 kg)
IBM 5100 was an early example. Later portables such as the Osborne 1 and Compaq
Portable were considerably lighter but still needed to be plugged in. The first
laptops, such as the Grid Compass, removed this requirement by incorporating
batteries – and with the continued miniaturization of computing resources and
advancements in portable battery life, portable computers grew in popularity in the
2000s.[118] The same developments allowed manufacturers to integrate computing
resources into cellular mobile phones by the early 2000s.
These smartphones and tablets run on a variety of operating systems and have
recently become the dominant computing devices on the market.[119] They are powered
by systems on a chip (SoCs), which are complete computers on a microchip the size
of a coin.[117]