
Computer

From Wikipedia, the free encyclopedia


For other uses, see Computer (disambiguation).
Computers and computing devices from different eras—left to right, top to bottom:
Early vacuum tube computer (ENIAC)
Mainframe computer (IBM System/360)
Smartphone (LYF Water 2)
Desktop computer (IBM ThinkCentre S50 with monitor)
Video game console (Nintendo GameCube)
Supercomputer (IBM Summit)
A computer is a machine that can be programmed to automatically carry out sequences of arithmetic
or logical operations (computation). Modern digital electronic computers can perform generic sets of
operations known as programs. These programs enable computers to perform a wide range of tasks.
The term computer system may refer to a nominally complete computer that includes the hardware,
operating system, software, and peripheral equipment needed and used for full operation; or to a
group of computers that are linked and function together, such as a computer network or computer
cluster.

A broad range of industrial and consumer products use computers as control systems, including
simple special-purpose devices like microwave ovens and remote controls, and factory devices like
industrial robots. Computers are at the core of general-purpose devices such as personal computers
and mobile devices such as smartphones. Computers power the Internet, which links billions of
computers and users.[citation needed]

Early computers were meant to be used only for calculations. Simple manual instruments like the
abacus have aided people in doing calculations since ancient times. Early in the Industrial Revolution,
some mechanical devices were built to automate long, tedious tasks, such as guiding patterns for
looms. More sophisticated electrical machines did specialized analog calculations in the early 20th
century. The first digital electronic calculating machines were developed during World War II, both
electromechanical and using thermionic valves. The first semiconductor transistors in the late 1940s
were followed by the silicon-based MOSFET (MOS transistor) and monolithic integrated circuit chip
technologies in the late 1950s, leading to the microprocessor and the microcomputer revolution in
the 1970s. The speed, power, and versatility of computers have been increasing dramatically ever
since then, with transistor counts increasing at a rapid pace (Moore's law noted that counts doubled
every two years), leading to the Digital Revolution during the late 20th and early 21st centuries.
[citation needed]

Conventionally, a modern computer consists of at least one processing element, typically a central
processing unit (CPU) in the form of a microprocessor, together with some type of computer memory,
typically semiconductor memory chips. The processing element carries out arithmetic and logical
operations, and a sequencing and control unit can change the order of operations in response to
stored information. Peripheral devices include input devices (keyboards, mice, joysticks, etc.), output
devices (monitors, printers, etc.), and input/output devices that perform both functions (e.g.
touchscreens). Peripheral devices allow information to be retrieved from an external source, and they
enable the results of operations to be saved and retrieved.[citation needed]
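
The stored-program principle described in this paragraph can be sketched in a few lines of code. The following Python fragment is illustrative only; its instruction names (LOAD, ADD, STORE, JUMP_IF_NEG, HALT) are invented for the example rather than drawn from any real processor's instruction set:

```python
# A minimal sketch of the fetch-decode-execute cycle: a processing element
# reads instructions from memory, and a conditional jump lets stored
# information change the order of operations.

def run(program, memory):
    acc = 0              # accumulator: holds intermediate arithmetic results
    pc = 0               # program counter: the sequencing/control state
    while pc < len(program):
        op, arg = program[pc]
        pc += 1
        if op == "LOAD":           # acc <- memory[arg]
            acc = memory[arg]
        elif op == "ADD":          # acc <- acc + memory[arg]
            acc += memory[arg]
        elif op == "STORE":        # memory[arg] <- acc
            memory[arg] = acc
        elif op == "JUMP_IF_NEG":  # branch: change the order of operations
            if acc < 0:
                pc = arg
        elif op == "HALT":
            break
    return memory

# Example program: add memory cells 0 and 1, store the sum in cell 2.
print(run([("LOAD", 0), ("ADD", 1), ("STORE", 2), ("HALT", None)],
          [20, 22, 0]))  # -> [20, 22, 42]
```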

Etymology

A human computer, with microscope and calculator, 1952


It was not until the mid-20th century that the word acquired its modern definition; according to the
Oxford English Dictionary, the first known use of the word computer was in a different sense, in a
1613 book called The Yong Mans Gleanings by the English writer Richard Brathwait: "I haue [sic] read
the truest computer of Times, and the best Arithmetician that euer [sic] breathed, and he reduceth
thy dayes into a short number." This usage of the term referred to a human computer, a person who
carried out calculations or computations. The word continued to have the same meaning until the
middle of the 20th century. During the latter part of this period, women were often hired as
computers because they could be paid less than their male counterparts.[1] By 1943, most human
computers were women.[2]

The Online Etymology Dictionary gives the first attested use of computer in the 1640s, meaning 'one who calculates'; this is an "agent noun from compute (v.)". The same dictionary dates the use of the term to mean "'calculating machine' (of any type)" to 1897, and its "modern use", meaning 'programmable digital electronic computer', to "1945 under this name; [in a] theoretical [sense] from 1937, as Turing machine".[3] The name has remained, although modern computers are capable of many higher-level functions.

History
Main articles: History of computing and History of computing hardware
For a chronological guide, see Timeline of computing.
Pre-20th century

The Ishango bone, a bone tool dating back to prehistoric Africa


Devices have been used to aid computation for thousands of years, mostly using one-to-one
correspondence with fingers. The earliest counting device was most likely a form of tally stick. Later
record keeping aids throughout the Fertile Crescent included calculi (clay spheres, cones, etc.) which
represented counts of items, likely livestock or grains, sealed in hollow unbaked clay containers.[a][4]
The use of counting rods is one example.

The Chinese suanpan (算盘). The number represented on this abacus is 6,302,715,408.
The abacus was initially used for arithmetic tasks. The Roman abacus was developed from devices
used in Babylonia as early as 2400 BCE. Since then, many other forms of reckoning boards or tables
have been invented. In a medieval European counting house, a checkered cloth would be placed on a
table, and markers moved around on it according to certain rules, as an aid to calculating sums of
money.[5]

The Antikythera mechanism, dating back to ancient Greece circa 150–100 BCE, is an early analog
computing device.
The Antikythera mechanism is believed to be the earliest known mechanical analog computer,
according to Derek J. de Solla Price.[6] It was designed to calculate astronomical positions. It was
discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and
Crete, and has been dated to c. 100 BCE. Devices of comparable complexity to the
Antikythera mechanism would not reappear until the fourteenth century.[7]

Many mechanical aids to calculation and measurement were constructed for astronomical and
navigation use. The planisphere was a star chart invented by Abū Rayhān al-Bīrūnī in the early 11th
century.[8] The astrolabe was invented in the Hellenistic world in either the 1st or 2nd centuries BCE
and is often attributed to Hipparchus. A combination of the planisphere and dioptra, the astrolabe
was effectively an analog computer capable of working out several different kinds of problems in
spherical astronomy. An astrolabe incorporating a mechanical calendar computer[9][10] and gear-
wheels was invented by Abi Bakr of Isfahan, Persia in 1235.[11] Abū Rayhān al-Bīrūnī invented the first
mechanical geared lunisolar calendar astrolabe,[12] an early fixed-wired knowledge processing
machine[13] with a gear train and gear-wheels,[14] c. 1000 AD.

The sector, a calculating instrument used for solving problems in proportion, trigonometry,
multiplication and division, and for various functions, such as squares and cube roots, was developed
in the late 16th century and found application in gunnery, surveying and navigation.

The planimeter was a manual instrument to calculate the area of a closed figure by tracing over it
with a mechanical linkage.
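
The planimeter's trick, accumulating a quantity as the boundary is traced, has a simple discrete analogue: for a polygonal boundary the same bookkeeping is the shoelace formula. A minimal Python sketch, assuming the vertices are given in order around the figure:

```python
# Illustrative only: a discrete analogue of what a planimeter measures.
# Tracing a closed boundary and accumulating cross terms yields the
# enclosed area; for a polygon this is the shoelace formula.

def traced_area(points):
    """Area of a closed polygon given its boundary vertices in order."""
    area = 0.0
    n = len(points)
    for i in range(n):
        x0, y0 = points[i]
        x1, y1 = points[(i + 1) % n]  # wrap around to close the figure
        area += (x0 * y1 - x1 * y0) / 2.0
    return abs(area)

# A 3-by-2 rectangle traced counter-clockwise has area 6.
print(traced_area([(0, 0), (3, 0), (3, 2), (0, 2)]))  # -> 6.0
```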
A slide rule
The slide rule was invented around 1620–1630, by the English clergyman William Oughtred, shortly
after the publication of the concept of the logarithm. It is a hand-operated analog computer for doing
multiplication and division. As slide rule development progressed, added scales provided reciprocals,
squares and square roots, cubes and cube roots, as well as transcendental functions such as
logarithms and exponentials, circular and hyperbolic trigonometry and other functions. Slide rules
with special scales are still used for quick performance of routine calculations, such as the E6B circular
slide rule used for time and distance calculations on light aircraft.
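
The slide rule's operating principle rests on one identity: because log(ab) = log a + log b, placing two logarithmic scales end to end multiplies numbers by adding lengths. A small numerical sketch of that principle (the function names here are ours, for illustration):

```python
# The slide rule reduced multiplication and division to addition and
# subtraction of logarithmic lengths; this mimics that numerically.
import math

def slide_rule_multiply(a, b):
    # physical analogue: lay log(a) and log(b) end to end, read the result
    return math.exp(math.log(a) + math.log(b))

def slide_rule_divide(a, b):
    return math.exp(math.log(a) - math.log(b))

print(slide_rule_multiply(2.0, 3.0))  # -> ~6.0 (analog-style precision)
print(slide_rule_divide(10.0, 4.0))   # -> 2.5
```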

In the 1770s, Pierre Jaquet-Droz, a Swiss watchmaker, built a mechanical doll (automaton) that could
write holding a quill pen. By switching the number and order of its internal wheels, different letters, and hence different messages, could be produced. In effect, it could be mechanically "programmed"
to read instructions. Along with two other complex machines, the doll is at the Musée d'Art et
d'Histoire of Neuchâtel, Switzerland, and still operates.[15]

In 1831–1835, mathematician and engineer Giovanni Plana devised a Perpetual Calendar machine,
which through a system of pulleys and cylinders could predict the perpetual calendar for every year
from 0 CE (that is, 1 BCE) to 4000 CE, keeping track of leap years and varying day lengths. The tide-
predicting machine invented by the Scottish scientist Sir William Thomson in 1872 was of great utility
to navigation in shallow waters. It used a system of pulleys and wires to automatically calculate
predicted tide levels for a set period at a particular location.
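
The principle behind Thomson's machine, summing periodic tidal constituents that the pulleys and wires added mechanically, can be sketched numerically. The amplitudes, speeds, and phases below are invented for illustration and are not real harbour data:

```python
# Sketch of a harmonic tide predictor: the tide at a location is
# approximated as a sum of cosine constituents, which Thomson's machine
# summed mechanically.
import math

constituents = [
    # (amplitude in metres, angular speed in radians/hour, phase in radians)
    (1.20, 2 * math.pi / 12.42, 0.3),  # principal lunar semidiurnal (M2-like)
    (0.40, 2 * math.pi / 12.00, 1.1),  # principal solar semidiurnal (S2-like)
    (0.15, 2 * math.pi / 25.82, 2.0),  # lunar diurnal (O1-like)
]

def tide_height(t_hours):
    """Predicted tide level (metres above mean) at time t."""
    return sum(a * math.cos(w * t_hours + p) for a, w, p in constituents)

for t in range(0, 25, 6):
    print(f"t = {t:2d} h: {tide_height(t):+.2f} m")
```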

The differential analyser, a mechanical analog computer designed to solve differential equations by
integration, used wheel-and-disc mechanisms to perform the integration. In 1876, Sir William
Thomson had already discussed the possible construction of such calculators, but he had been
stymied by the limited output torque of the ball-and-disk integrators.[16] In a differential analyzer,
the output of one integrator drove the input of the next integrator, or a graphing output. The torque
amplifier was the advance that allowed these machines to work. Starting in the 1920s, Vannevar Bush
and others developed mechanical differential analyzers.
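
The integrator chain can be imitated numerically: each wheel-and-disc integrator accumulates its input over time, and coupling the output of one to the input of the next solves a differential equation. A minimal sketch solving y'' = -y (exact solution cos t) with two chained integrators:

```python
# Two chained numerical integrators, in the spirit of a differential
# analyser: feeding -y back as y'' closes the loop that solves y'' = -y.

dt = 0.001
y, dy = 1.0, 0.0         # initial conditions: y(0) = 1, y'(0) = 0
t = 0.0
while t < 3.14159:       # integrate over roughly half a period
    ddy = -y             # the "shaft coupling": -y drives the first integrator
    dy += ddy * dt       # first integrator: integrates y'' to get y'
    y += dy * dt         # second integrator: integrates y' to get y
    t += dt

print(y)                 # close to cos(pi) = -1
```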

In the 1890s, the Spanish engineer Leonardo Torres Quevedo began to develop a series of advanced
analog machines that could solve real and complex roots of polynomials,[17][18][19][20] which were
published in 1901 by the Paris Academy of Sciences.[21]

First computer

Charles Babbage

A diagram of a portion of Babbage's Difference engine

The Difference Engine Number 2 at the Intellectual Ventures laboratory in Seattle


Charles Babbage, an English mechanical engineer and polymath, originated the concept of a
programmable computer. Considered the "father of the computer",[22] he conceptualized and
invented the first mechanical computer in the early 19th century.

After working on his difference engine, which he designed to aid in navigational calculations, he announced his invention in 1822 in a paper to the Royal Astronomical Society, titled "Note on the application of machinery to the computation of astronomical and mathematical tables".[23] In 1833, he realized that a much more general design, an analytical engine, was possible. The input of
programs and data was to be provided to the machine via punched cards, a method being used at the
time to direct mechanical looms such as the Jacquard loom. For output, the machine would have a
printer, a curve plotter and a bell. The machine would also be able to punch numbers onto cards to be
read in later. The engine would incorporate an arithmetic logic unit, control flow in the form of
conditional branching and loops, and integrated memory, making it the first design for a general-
purpose computer that could be described in modern terms as Turing-complete.[24][25]

The machine was about a century ahead of its time. All the parts for his machine had to be made by
hand – this was a major problem for a device with thousands of parts. Eventually, the project was
dissolved with the decision of the British Government to cease funding. Babbage's failure to complete
the analytical engine can be chiefly attributed to political and financial difficulties as well as his desire
to develop an increasingly sophisticated computer and to move ahead faster than anyone else could
follow. Nevertheless, his son, Henry Babbage, completed a simplified version of the analytical engine's
computing unit (the mill) in 1888. He gave a successful demonstration of its use in computing tables in
1906.

Electromechanical calculating machine

Electro-mechanical calculator (1920) by Leonardo Torres Quevedo.


In his work Essays on Automatics published in 1914, Leonardo Torres Quevedo wrote a brief history of Babbage's efforts at constructing a mechanical Difference Engine and Analytical Engine. The paper contains a design of a machine capable of calculating formulas like $a^{x}(y-z)^{2}$, for a sequence of sets of values. The whole machine was to be controlled by a read-only program,
which was complete with provisions for conditional branching. He also introduced the idea of floating-
point arithmetic.[26][27][28] In 1920, to celebrate the 100th anniversary of the invention of the
arithmometer, Torres presented in Paris the Electromechanical Arithmometer, which allowed a user
to input arithmetic problems through a keyboard, and computed and printed the results,[29][30][31]
[32] demonstrating the feasibility of an electromechanical analytical engine.[33]
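
Taking the formula above as a^x(y-z)^2 (as reconstructed here), the machine's intended behaviour, evaluating one fixed formula for a sequence of sets of input values under an unchanging read-only program, can be sketched as follows; the value sets are invented for illustration:

```python
# Illustrative rendering of the machine described in Essays on Automatics:
# one fixed formula, applied automatically to a sequence of value sets.

def evaluate(a, x, y, z):
    return a ** x * (y - z) ** 2

value_sets = [(2, 3, 7, 4), (10, 2, 5, 1), (3, 1, 2, 9)]
for a, x, y, z in value_sets:
    print(f"a={a}, x={x}, y={y}, z={z} -> {evaluate(a, x, y, z)}")
# 2**3 * (7-4)**2 = 72, 10**2 * (5-1)**2 = 1600, 3 * (2-9)**2 = 147
```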

Analog computers
Main article: Analog computer

Sir William Thomson's third tide-predicting machine design, 1879–81


During the first half of the 20th century, many scientific computing needs were met by increasingly
sophisticated analog computers, which used a direct mechanical or electrical model of the problem as
a basis for computation. However, these were not programmable and generally lacked the versatility
and accuracy of modern digital computers.[34] The first modern analog computer was a tide-
predicting machine, invented by Sir William Thomson (later to become Lord Kelvin) in 1872. The
differential analyser, a mechanical analog computer designed to solve differential equations by
integration using wheel-and-disc mechanisms, was conceptualized in 1876 by James Thomson, the
elder brother of the more famous Sir William Thomson.[16]
