Computer

A computer is a programmable machine that performs arithmetic and logical operations, with modern systems including hardware, software, and peripherals. The evolution of computers spans from early manual devices like the abacus to sophisticated digital machines, with significant advancements occurring during the 20th century, particularly with the development of the microprocessor. The term 'computer' originally referred to human calculators before transitioning to describe electronic devices capable of complex computations.


A computer is a machine that can be programmed to automatically carry out sequences
of arithmetic or logical operations (computation). Modern digital electronic computers
can perform generic sets of operations known as programs. These programs enable
computers to perform a wide range of tasks. The term computer system may refer to a
nominally complete computer that includes the hardware, operating system, software,
and peripheral equipment needed and used for full operation; or to a group of computers
that are linked and function together, such as a computer network or computer cluster.

A broad range of industrial and consumer products use computers as control systems,
including simple special-purpose devices like microwave ovens and remote controls,
and factory devices like industrial robots. Computers are at the core of general-purpose
devices such as personal computers and mobile devices such as smartphones.
Computers power the Internet, which links billions of computers and users.[citation needed]

Early computers were meant to be used only for calculations. Simple manual
instruments like the abacus have aided people in doing calculations since ancient times.
Early in the Industrial Revolution, some mechanical devices were built to automate long,
tedious tasks, such as guiding patterns for looms. More sophisticated electrical
machines did specialized analog calculations in the early 20th century. The
first digital electronic calculating machines were developed during World War II,
both electromechanical and using thermionic valves. The
first semiconductor transistors in the late 1940s were followed by the silicon-
based MOSFET (MOS transistor) and monolithic integrated circuit chip technologies in
the late 1950s, leading to the microprocessor and the microcomputer revolution in the
1970s. The speed, power, and versatility of computers have been increasing
dramatically ever since then, with transistor counts increasing at a rapid pace (Moore's
law noted that counts doubled every two years), leading to the Digital Revolution during
the late 20th and early 21st centuries.[citation needed]
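The doubling described by Moore's law compounds quickly, as a small sketch can show. This is an illustration only; the starting figure of 2,300 transistors (the count commonly cited for the Intel 4004 of 1971) is an assumed baseline, and the function name is invented here.

```python
# Rough illustration of Moore's law: transistor counts doubling every two
# years from an assumed 1971 baseline of 2,300 (the Intel 4004's count).
def transistor_count(year, base_year=1971, base_count=2300):
    """Estimate transistor count assuming one doubling per two years."""
    doublings = (year - base_year) / 2
    return base_count * 2 ** doublings

# Ten doublings over twenty years multiplies the count by 1,024.
print(round(transistor_count(1991)))  # prints 2355200
```

Real counts did not track this curve exactly, but the exponential trend held for decades.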

Conventionally, a modern computer consists of at least one processing element,
typically a central processing unit (CPU) in the form of a microprocessor, together with
some type of computer memory, typically semiconductor memory chips. The processing
element carries out arithmetic and logical operations, and a sequencing and control unit
can change the order of operations in response to stored information. Peripheral
devices include input devices (keyboards, mice, joysticks, etc.), output devices
(monitors, printers, etc.), and input/output devices that perform both functions
(e.g. touchscreens). Peripheral devices allow information to be retrieved from an
external source, and they enable the results of operations to be saved and
retrieved.[citation needed]
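The interplay of a processing element, memory, and a sequencing unit that can change the order of operations can be sketched in a few lines. This is a toy model, not any real instruction set; the instruction names and the `run` function are invented for illustration.

```python
# Toy stored-program machine: memory holds instructions, an accumulator
# holds the arithmetic result, and a conditional jump lets stored
# information change the order of operations.
def run(program):
    """Execute a list of (opcode, argument) pairs; return the accumulator."""
    acc = 0   # arithmetic/logic result register
    pc = 0    # program counter (sequencing and control)
    while pc < len(program):
        op, arg = program[pc]
        if op == "LOAD":
            acc = arg
        elif op == "ADD":
            acc += arg
        elif op == "JGZ":          # conditional branch: jump if acc > 0
            if acc > 0:
                pc = arg
                continue
        pc += 1
    return acc

# Count down from 3 by looping back to the ADD instruction until acc is 0.
print(run([("LOAD", 3), ("ADD", -1), ("JGZ", 1)]))  # prints 0
```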

Etymology
A human computer, with microscope and
calculator, 1952
It was not until the mid-20th century that the word acquired its modern definition;
according to the Oxford English Dictionary, the first known use of the
word computer was in a different sense, in a 1613 book called The Yong Mans
Gleanings by the English writer Richard Brathwait: "I haue [sic] read the truest computer
of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth thy dayes
into a short number." This usage of the term referred to a human computer, a person
who carried out calculations or computations. The word continued to have the same
meaning until the middle of the 20th century. During the latter part of this period, women
were often hired as computers because they could be paid less than their male
counterparts.[1] By 1943, most human computers were women.[2]

The Online Etymology Dictionary gives the first attested use of computer in the 1640s,
meaning 'one who calculates'; this is an "agent noun from compute (v.)". The same
dictionary states that the use of the term to mean "'calculating machine' (of any type) is
from 1897", and that the "modern use" of the term, meaning 'programmable digital
electronic computer', dates from "1945 under this name; [in a] theoretical [sense] from
1937, as Turing machine".[3] The name has remained, although modern computers are
capable of many higher-level functions.

History
Main articles: History of computing and History of computing hardware
For a chronological guide, see Timeline of computing.
Pre-20th century
The Ishango bone, a bone tool dating back to prehistoric Africa
Devices have been used to aid computation for thousands of years, mostly using one-
to-one correspondence with fingers. The earliest counting device was most likely a form
of tally stick. Later record keeping aids throughout the Fertile Crescent included calculi
(clay spheres, cones, etc.) which represented counts of items, likely livestock or grains,
sealed in hollow unbaked clay containers.[a][4] The use of counting rods is one example.

The Chinese suanpan (算盘). The number represented on this abacus is 6,302,715,408.
The abacus was initially used for arithmetic tasks. The Roman abacus was developed
from devices used in Babylonia as early as 2400 BCE. Since then, many other forms of
reckoning boards or tables have been invented. In a medieval European counting
house, a checkered cloth would be placed on a table, and markers moved around on it
according to certain rules, as an aid to calculating sums of money.[5]
The Antikythera mechanism, dating back to ancient Greece circa
200–80 BCE, is an early analog computing device.
The Antikythera mechanism is believed to be the earliest known mechanical analog
computer, according to Derek J. de Solla Price.[6] It was designed to calculate
astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek
island of Antikythera, between Kythera and Crete, and has been dated to
c. 100 BCE. Devices of comparable complexity to the Antikythera
mechanism would not reappear until the fourteenth century.[7]

Many mechanical aids to calculation and measurement were constructed for
astronomical and navigation use. The planisphere was a star chart invented by Abū
Rayhān al-Bīrūnī in the early 11th century.[8] The astrolabe was invented in
the Hellenistic world in either the 1st or 2nd centuries BCE and is often attributed
to Hipparchus. A combination of the planisphere and dioptra, the astrolabe was
effectively an analog computer capable of working out several different kinds of
problems in spherical astronomy. An astrolabe incorporating a
mechanical calendar computer[9][10] and gear-wheels was invented by Abi Bakr
of Isfahan, Persia in 1235.[11] Abū Rayhān al-Bīrūnī invented the first mechanical
geared lunisolar calendar astrolabe,[12] an early fixed-wired knowledge processing
machine[13] with a gear train and gear-wheels,[14] c. 1000 AD.

The sector, a calculating instrument used for solving problems in
proportion, trigonometry, multiplication and division, and for various functions, such as
squares and cube roots, was developed in the late 16th century and found application in
gunnery, surveying and navigation.

The planimeter was a manual instrument to calculate the area of a closed figure by
tracing over it with a mechanical linkage.

A slide rule
The slide rule was invented around 1620–1630, by the English clergyman William
Oughtred, shortly after the publication of the concept of the logarithm. It is a hand-
operated analog computer for doing multiplication and division. As slide rule
development progressed, added scales provided reciprocals, squares and square roots,
cubes and cube roots, as well as transcendental functions such as logarithms and
exponentials, circular and hyperbolic trigonometry and other functions. Slide rules with
special scales are still used for quick performance of routine calculations, such as
the E6B circular slide rule used for time and distance calculations on light aircraft.
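The slide rule works because adding lengths on logarithmic scales multiplies the underlying numbers: log(a) + log(b) = log(a·b). A brief sketch of that principle (the function name here is invented for illustration):

```python
import math

# The slide rule's principle: sliding one logarithmic scale along another
# adds log(a) and log(b), and the result is read off as a * b.
def slide_rule_multiply(a, b):
    """Multiply by adding logarithms, as a slide rule does mechanically."""
    return 10 ** (math.log10(a) + math.log10(b))

print(round(slide_rule_multiply(6, 7), 6))  # prints 42.0
```

A physical slide rule performs the same addition with sliding scales, which is why its precision is limited to the two or three digits a user can read off the scale.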

In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll
(automaton) that could write holding a quill pen. By switching the number and order of
its internal wheels different letters, and hence different messages, could be produced. In
effect, it could be mechanically "programmed" to read instructions. Along with two other
complex machines, the doll is at the Musée d'Art et d'Histoire of Neuchâtel, Switzerland,
and still operates.[15]

In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual
Calendar machine, which through a system of pulleys and cylinders could predict
the perpetual calendar for every year from 0 CE (that is, 1 BCE) to 4000 CE, keeping
track of leap years and varying day length. The tide-predicting machine invented by the
Scottish scientist Sir William Thomson in 1872 was of great utility to navigation in
shallow waters. It used a system of pulleys and wires to automatically calculate
predicted tide levels for a set period at a particular location.

The differential analyser, a mechanical analog computer designed to solve differential
equations by integration, used wheel-and-disc mechanisms to perform the integration.
In 1876, Sir William Thomson had already discussed the possible construction of such
calculators, but he had been stymied by the limited output torque of the ball-and-disk
integrators.[16] In a differential analyzer, the output of one integrator drove the input of the
next integrator, or a graphing output. The torque amplifier was the advance that allowed
these machines to work. Starting in the 1920s, Vannevar Bush and others developed
mechanical differential analyzers.

In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a
series of advanced analog machines that could solve real and complex roots
of polynomials,[17][18][19][20] which were published in 1901 by the Paris Academy of
Sciences.[21]

First computer
Charles Babbage

A diagram of a portion of Babbage's Difference engine

The Difference Engine Number 2 at the Intellectual Ventures laboratory in Seattle

Charles Babbage, an English mechanical engineer and polymath, originated the
concept of a programmable computer. Considered the "father of the computer",[22] he
conceptualized and invented the first mechanical computer in the early 19th century.

After working on his difference engine, he announced his invention in 1822 in a paper to
the Royal Astronomical Society, titled "Note on the application of machinery to the
computation of astronomical and mathematical tables".[23] The difference engine was
designed to aid in navigational calculations. In 1833, he realized that a much more general design,
an analytical engine, was possible. The input of programs and data was to be provided
to the machine via punched cards, a method being used at the time to direct
mechanical looms such as the Jacquard loom. For output, the machine would have
a printer, a curve plotter and a bell. The machine would also be able to punch numbers
onto cards to be read in later. The engine would incorporate an arithmetic logic
unit, control flow in the form of conditional branching and loops, and integrated memory,
making it the first design for a general-purpose computer that could be described in
modern terms as Turing-complete.[24][25]

The machine was about a century ahead of its time. All the parts for his machine had to
be made by hand – this was a major problem for a device with thousands of parts.
Eventually, the project was dissolved with the decision of the British Government to
cease funding. Babbage's failure to complete the analytical engine can be chiefly
attributed to political and financial difficulties as well as his desire to develop an
increasingly sophisticated computer and to move ahead faster than anyone else could
follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the
analytical engine's computing unit (the mill) in 1888. He gave a successful
demonstration of its use in computing tables in 1906.

Electromechanical calculating machine

Electro-mechanical calculator (1920) by Leonardo Torres Quevedo.
In his work Essays on Automatics published in 1914, Leonardo Torres Quevedo wrote a
brief history of Babbage's efforts at constructing a mechanical Difference Engine and
Analytical Engine. The paper contains a design of a machine capable of calculating
formulas for a sequence of sets of values. The whole machine was to be
controlled by a read-only program, which was complete with provisions for conditional
branching. He also introduced the idea of floating-point arithmetic.[26][27][28] In 1920, to
celebrate the 100th anniversary of the invention of the arithmometer, Torres presented
in Paris the Electromechanical Arithmometer, which allowed a user to input arithmetic
problems through a keyboard, and computed and printed the
results,[29][30][31][32] demonstrating the feasibility of an electromechanical analytical engine.[33]
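The floating-point idea Torres introduced can be illustrated in miniature: store a number as a fixed-size mantissa plus an exponent, so one representation covers a huge range of magnitudes. The decimal decomposition below is only a sketch of the concept, not Torres's scheme; the function name is invented here.

```python
import math

# Floating-point in miniature: a number becomes a fixed-precision
# mantissa and a power-of-ten exponent, x ≈ mantissa * 10**exponent.
def to_float(x, digits=4):
    """Decompose x into (mantissa, exponent) with `digits` significant digits."""
    exponent = math.floor(math.log10(abs(x)))
    mantissa = round(x / 10 ** exponent, digits - 1)
    return mantissa, exponent

print(to_float(299792458))  # prints (2.998, 8)
```

The same trade is made by modern binary floating-point hardware: a little precision is sacrificed for enormous dynamic range.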

Analog computers
Main article: Analog computer
Sir William Thomson's third tide-predicting machine design,
1879–81
During the first half of the 20th century, many scientific computing needs were met by
increasingly sophisticated analog computers, which used a direct mechanical or
electrical model of the problem as a basis for computation. However, these were not
programmable and generally lacked the versatility and accuracy of modern digital
computers.[34] The first modern analog computer was a tide-predicting machine, invented
by Sir William Thomson (later to become Lord Kelvin) in 1872. The differential analyser,
a mechanical analog computer designed to solve differential equations by integration
using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson,
the elder brother of the more famous Sir William Thomson.[16]

The art of mechanical analog computing reached its zenith with the differential analyzer,
built by H. L. Hazen and Vannevar Bush at MIT starting in 1927. This built on the
mechanical integrators of James Thomson and the torque amplifiers invented by H. W.
Nieman. A dozen of these devices were built before their obsolescence became
obvious.[citation needed] By the 1950s, the success of digital electronic computers had spelled
the end for most analog computing machines, but analog computers remained in use
during the 1950s in some specialized applications such as education (slide rule) and
aircraft (control systems).[citation needed]

Digital computers
Electromechanical
Claude Shannon's 1937 master's thesis laid the foundations of digital computing, with
his insight of applying Boolean algebra to the analysis and synthesis of switching
circuits being the basic concept which underlies all electronic digital computers.[35][36]
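Shannon's insight was that switches in series behave like Boolean AND and switches in parallel like Boolean OR, so circuits of relays can be analyzed and designed with Boolean algebra. A minimal sketch of that correspondence (function names are invented for illustration):

```python
# Shannon's correspondence: True = switch closed (current can flow).
def series(a, b):
    return a and b    # current flows only if both switches are closed (AND)

def parallel(a, b):
    return a or b     # current flows if either switch is closed (OR)

# A small circuit: s1 in series with the parallel pair (s2, s3),
# i.e. the Boolean expression s1 AND (s2 OR s3).
def circuit(s1, s2, s3):
    return series(s1, parallel(s2, s3))

print(circuit(True, False, True))   # prints True
```

Because any Boolean expression corresponds to such a network, simplifying the algebra simplifies the physical circuit, which is exactly what made the thesis so influential.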

By 1938, the United States Navy had developed an electromechanical analog computer
small enough to use aboard a submarine. This was the Torpedo Data Computer, which
used trigonometry to solve the problem of firing a torpedo at a moving target.[citation needed]
During World War II similar devices were developed in other countries as well.[citation needed]
Replica of Konrad Zuse's Z3, the first fully automatic, digital
(electromechanical) computer
Early digital computers were electromechanical; electric switches drove mechanical
relays to perform the calculation. These devices had a low operating speed and were
eventually superseded by much faster all-electric computers, originally using vacuum
tubes. The Z2, created by German engineer Konrad Zuse in 1939 in Berlin, was one of
the earliest examples of an electromechanical relay computer.[37]

Konrad Zuse, inventor of the modern computer[38][39]


In 1941, Zuse followed his earlier machine up with the Z3, the world's first working
electromechanical programmable, fully automatic digital computer.[40][41] The Z3 was built
with 2000 relays, implementing a 22-bit word length that operated at a clock
frequency of about 5–10 Hz.[42] Program code was supplied on punched film while data
could be stored in 64 words of memory or supplied from the keyboard. It was quite
similar to modern machines in some respects, pioneering numerous advances such
as floating-point numbers. Rather than the harder-to-implement decimal system (used
in Charles Babbage's earlier design), using a binary system meant that Zuse's
machines were easier to build and potentially more reliable, given the technologies
available at that time.[43] The Z3 was not itself a universal computer but could be
extended to be Turing complete.[44][45]
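Binary suited relay machines like the Z3 because each relay has just two states, on or off, so a word is simply a row of relays, whereas decimal would require ten distinguishable states per digit. A sketch of that representation (the function name is invented; the 22-bit word length is the Z3's, per the text above):

```python
# A machine word as a row of two-state relays: True = energized.
def to_relays(n, word_length=22):    # the Z3 used a 22-bit word
    """Represent non-negative integer n as relay states, high bit first."""
    return [bool((n >> i) & 1) for i in reversed(range(word_length))]

bits = to_relays(5)                  # 5 in binary is 101
print(bits[-4:])                     # prints [False, True, False, True]
```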

Zuse's next computer, the Z4, became the world's first commercial computer; after initial
delay due to the Second World War, it was completed in 1950 and delivered to the ETH
Zurich.[46] The computer was manufactured by Zuse's own company, Zuse KG, which
was founded in 1941 as the first company with the sole purpose of developing
computers in Berlin.[46] The Z4 served as the inspiration for the construction of
the ERMETH, the first Swiss computer and one of the first in Europe.[47]
Vacuum tubes and digital electronic circuits
Purely electronic circuit elements soon replaced their mechanical and
electromechanical equivalents, at the same time that digital calculation replaced analog.
The engineer Tommy Flowers, working at the Post Office Research Station in London in
the 1930s, began to explore the possible use of electronics for the telephone exchange.
Experimental equipment that he built in 1934 went into operation five years later,
converting a portion of the telephone exchange network into an electronic data
processing system, using thousands of vacuum tubes.[34] In the US, John Vincent
Atanasoff and Clifford E. Berry of Iowa State University developed and tested
the Atanasoff–Berry Computer (ABC) in 1942,[48] the first "automatic electronic digital
computer".[49] This design was also all-electronic and used about 300 vacuum tubes,
with capacitors fixed in a mechanically rotating drum for memory.[50]
