Computer
A broad range of industrial and consumer products use computers as control systems,
including simple special-purpose devices like microwave ovens and remote controls,
and factory devices like industrial robots. Computers are at the core of general-purpose
devices such as personal computers and mobile devices such as smartphones.
Computers power the Internet, which links billions of computers and users.[citation needed]
Early computers were meant to be used only for calculations. Simple manual
instruments like the abacus have aided people in doing calculations since ancient times.
Early in the Industrial Revolution, some mechanical devices were built to automate long,
tedious tasks, such as guiding patterns for looms. More sophisticated electrical
machines did specialized analog calculations in the early 20th century. The
first digital calculating machines were developed during World War II, at first
electromechanical and then electronic, using thermionic valves. The
first semiconductor transistors in the late 1940s were followed by the silicon-
based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in
the late 1950s, leading to the microprocessor and the microcomputer revolution in the
1970s. The speed, power, and versatility of computers have been increasing
dramatically ever since then, with transistor counts increasing at a rapid pace (Moore's
law noted that counts doubled every two years), leading to the Digital Revolution during
the late 20th and early 21st centuries.[citation needed]
Etymology
A human computer, with microscope and
calculator, 1952
It was not until the mid-20th century that the word acquired its modern definition;
according to the Oxford English Dictionary, the first known use of the
word computer was in a different sense, in a 1613 book called The Yong Mans
Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest computer
of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes
into a short number." This usage of the term referred to a human computer, a person
who carried out calculations or computations. The word continued to have the same
meaning until the middle of the 20th century. During the latter part of this period, women
were often hired as computers because they could be paid less than their male
counterparts.[1] By 1943, most human computers were women.[2]
The Online Etymology Dictionary gives the first attested use of computer in the 1640s,
meaning 'one who calculates'; this is an "agent noun from compute (v.)". The same
source dates the use of the term to mean "'calculating machine' (of any type)" from
1897, and its "modern use" to mean 'programmable digital electronic computer' from
"1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine".[3] The name
has remained, although modern computers are capable of many higher-level functions.
History
Main articles: History of computing and History of computing hardware
For a chronological guide, see Timeline of computing.
Pre-20th century
The Ishango bone, a bone tool dating back to prehistoric Africa
Devices have been used to aid computation for thousands of years, mostly using one-
to-one correspondence with fingers. The earliest counting device was most likely a form
of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi
(clay spheres, cones, etc.) which represented counts of items, likely livestock or grains,
sealed in hollow unbaked clay containers.[a][4] The use of counting rods is one example.
The planimeter is a manual instrument for calculating the area of a closed figure by
tracing over it with a mechanical linkage.
A slide rule
The slide rule was invented around 1620–1630 by the English clergyman William
Oughtred, shortly after the publication of the concept of the logarithm. It is a hand-
operated analog computer for doing multiplication and division. As slide rule
development progressed, added scales provided reciprocals, squares and square roots,
cubes and cube roots, as well as transcendental functions such as logarithms and
exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with
special scales are still used for quick performance of routine calculations, such as
the E6B circular slide rule used for time and distance calculations on light aircraft.
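The principle behind the slide rule is that logarithms turn multiplication into addition:
its scales are ruled so that distance is proportional to log x, so sliding one scale along
another adds logarithms and thereby multiplies numbers. A minimal sketch of the idea in
Python (the numbers are arbitrary examples):

    import math

    # A slide rule multiplies by adding lengths proportional to logarithms:
    # log(x * y) = log(x) + log(y), so adding the logs and exponentiating
    # back recovers the product.
    x, y = 2.0, 8.0
    product = 10 ** (math.log10(x) + math.log10(y))
    print(product)  # ~16.0, the answer that would be read off the scale

Division works the same way in reverse, by subtracting lengths.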
In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a
series of advanced analog machines that could solve real and complex roots
of polynomials;[17][18][19][20] his findings were published in 1901 by the Paris Academy of
Sciences.[21]
First computer
Charles Babbage
Charles Babbage announced his difference engine in 1822, in a paper to the Royal
Astronomical Society titled "Note on the application of machinery to the computation of
astronomical and mathematical tables".[23] The engine was designed to aid in
navigational calculations. After working on the difference engine, in 1833 he realized
that a much more general design, an analytical engine, was possible. The input of
programs and data was to be provided
to the machine via punched cards, a method being used at the time to direct
mechanical looms such as the Jacquard loom. For output, the machine would have
a printer, a curve plotter and a bell. The machine would also be able to punch numbers
onto cards to be read in later. The engine would incorporate an arithmetic logic
unit, control flow in the form of conditional branching and loops, and integrated memory,
making it the first design for a general-purpose computer that could be described in
modern terms as Turing-complete.[24][25]
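The difference engine's approach to table-making can be sketched briefly: for a
polynomial of degree n, the n-th differences are constant, so every new table entry
follows from additions alone, exactly the operation the engine mechanized. A minimal
illustration in Python (the quadratic chosen here is arbitrary):

    # Method of finite differences, the principle behind the difference
    # engine: tabulate p(x) = x^2 + x + 41 using only repeated addition.
    value = 41        # p(0)
    first_diff = 2    # p(1) - p(0)
    second_diff = 2   # constant for any quadratic

    for x in range(10):
        print(x, value)            # x and p(x)
        value += first_diff        # next table entry by addition alone
        first_diff += second_diff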
The machine was about a century ahead of its time. All the parts for his machine had to
be made by hand – this was a major problem for a device with thousands of parts.
Eventually, the project was dissolved with the decision of the British Government to
cease funding. Babbage's failure to complete the analytical engine can be chiefly
attributed to political and financial difficulties as well as his desire to develop an
increasingly sophisticated computer and to move ahead faster than anyone else could
follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the
analytical engine's computing unit (the mill) in 1888. He gave a successful
demonstration of its use in computing tables in 1906.
In his Essays on Automatics (1914), Torres Quevedo presented the design of a machine
capable of calculating formulas like a^x(y − z)^2, for a sequence of sets of values. The
whole machine was to be
controlled by a read-only program, which was complete with provisions for conditional
branching. He also introduced the idea of floating-point arithmetic.[26][27][28] In 1920, to
celebrate the 100th anniversary of the invention of the arithmometer, Torres presented
in Paris the Electromechanical Arithmometer, which allowed a user to input arithmetic
problems through a keyboard, and computed and printed the
results,[29][30][31][32] demonstrating the feasibility of an electromechanical analytical engine.[33]
Analog computers
Main article: Analog computer
Sir William Thomson's third tide-predicting machine design,
1879–81
During the first half of the 20th century, many scientific computing needs were met by
increasingly sophisticated analog computers, which used a direct mechanical or
electrical model of the problem as a basis for computation. However, these were not
programmable and generally lacked the versatility and accuracy of modern digital
computers.[34] The first modern analog computer was a tide-predicting machine, invented
by Sir William Thomson (later to become Lord Kelvin) in 1872. The differential analyser,
a mechanical analog computer designed to solve differential equations by integration
using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson,
the elder brother of the more famous Sir William Thomson.[16]
The art of mechanical analog computing reached its zenith with the differential analyzer,
built by H. L. Hazen and Vannevar Bush at MIT starting in 1927. This built on the
mechanical integrators of James Thomson and the torque amplifiers invented by H. W.
Nieman. A dozen of these devices were built before their obsolescence became
obvious.[citation needed] By the 1950s, the success of digital electronic computers had spelled
the end for most analog computing machines, but analog computers remained in use
during the 1950s in some specialized applications such as education (slide rule) and
aircraft (control systems).[citation needed]
Digital computers
Electromechanical
Claude Shannon's 1937 master's thesis laid the foundations of digital computing, with
his insight of applying Boolean algebra to the analysis and synthesis of switching
circuits being the basic concept which underlies all electronic digital computers.[35][36]
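Shannon's correspondence maps wiring directly onto Boolean algebra: contacts in series
behave as AND, contacts in parallel as OR, and a normally-closed contact as NOT, so
switching circuits can be analyzed, and Boolean expressions synthesized, mechanically.
A small illustrative sketch (the staircase-light circuit is a standard textbook example,
not one from Shannon's thesis):

    # Shannon's correspondence: series = AND, parallel = OR,
    # normally-closed contact = NOT.
    def series(a, b):            # current flows only if both contacts close
        return a and b

    def parallel(a, b):          # current flows if either branch closes
        return a or b

    def normally_closed(a):      # conducts when the switch is NOT thrown
        return not a

    # A two-switch staircase light (an XOR circuit) built from the primitives:
    def staircase(top, bottom):
        return parallel(series(top, normally_closed(bottom)),
                        series(normally_closed(top), bottom))

    for t in (False, True):
        for b in (False, True):
            print(t, b, staircase(t, b))  # lamp toggles when either switch flips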
By 1938, the United States Navy had developed an electromechanical analog computer
small enough to use aboard a submarine. This was the Torpedo Data Computer, which
used trigonometry to solve the problem of firing a torpedo at a moving target.[citation needed]
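In its simplest form, the geometry the Torpedo Data Computer solved continuously
reduces to an intercept triangle: torpedo and target run for the same time, so the law of
sines gives the lead angle. A simplified sketch (the speeds and angle are invented for
illustration; the real instrument also tracked range, own-ship motion, and gyro settings):

    import math

    # Simplified torpedo intercept: both torpedo and target travel for the
    # same time, so the law of sines on the resulting triangle gives
    #   sin(lead) = (target_speed / torpedo_speed) * sin(track_angle).
    def lead_angle_deg(target_speed, torpedo_speed, track_angle_deg):
        ratio = (target_speed / torpedo_speed) * math.sin(math.radians(track_angle_deg))
        return math.degrees(math.asin(ratio))

    # Example: a 20-knot target crossing at 70 degrees, a 45-knot torpedo.
    print(round(lead_angle_deg(20.0, 45.0, 70.0), 1))  # ~24.7 degrees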
During World War II similar devices were developed in other countries as well.[citation needed]
Replica of Konrad Zuse's Z3, the first fully automatic, digital
(electromechanical) computer
Early digital computers were electromechanical; electric switches drove mechanical
relays to perform the calculation. These devices had a low operating speed and were
eventually superseded by much faster all-electric computers, originally using vacuum
tubes. The Z2, created by German engineer Konrad Zuse in 1939 in Berlin, was one of
the earliest examples of an electromechanical relay computer.[37] In 1941, Zuse followed
it with the Z3, the world's first working fully automatic, digital (electromechanical)
computer.
Zuse's later Z4 became the world's first commercial computer; after an initial
delay due to the Second World War, it was completed in 1950 and delivered to the ETH
Zurich.[46] The computer was manufactured by Zuse's own company, Zuse KG, founded
in Berlin in 1941 as the first company devoted solely to developing
computers.[46] The Z4 served as the inspiration for the construction of
the ERMETH, the first Swiss computer and one of the first in Europe.[47]
Vacuum tubes and digital electronic circuits
Purely electronic circuit elements soon replaced their mechanical and
electromechanical equivalents, at the same time that digital calculation replaced analog.
The engineer Tommy Flowers, working at the Post Office Research Station in London in
the 1930s, began to explore the possible use of electronics for the telephone exchange.
Experimental equipment that he built in 1934 went into operation five years later,
converting a portion of the telephone exchange network into an electronic data
processing system, using thousands of vacuum tubes.[34] In the US, John Vincent
Atanasoff and Clifford E. Berry of Iowa State University developed and tested
the Atanasoff–Berry Computer (ABC) in 1942,[48] the first "automatic electronic digital
computer".[49] This design was also all-electronic and used about 300 vacuum tubes,
with capacitors fixed in a mechanically rotating drum for memory.[50]