
History of computing

The history of computing is longer than the history of computing hardware and modern computing technology and includes the history of methods intended for pen and paper or for chalk and slate, with or without the aid of tables.

Concrete devices
Digital computing is intimately tied to the
representation of numbers.[1] But long
before abstractions like the number arose,
there were mathematical concepts to
serve the purposes of civilization. These
concepts are implicit in concrete practices
such as:

one-to-one correspondence,[2] a rule to count how many items, say on a tally stick, eventually abstracted into numbers;
comparison to a standard,[3] a method for assuming reproducibility in a measurement, for example, the number of coins;
the 3-4-5 right triangle was a device for assuring a right angle, using ropes with 12 evenly spaced knots, for example (the arithmetic behind this is sketched below).[4]
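The 12-knot rope works because of the converse of the Pythagorean theorem; a quick check of the arithmetic:

$$3^2 + 4^2 = 9 + 16 = 25 = 5^2,$$

so a closed loop of 12 equal segments laid out with sides of 3, 4 and 5 knots satisfies $a^2 + b^2 = c^2$ and therefore encloses a right angle between its two shorter sides.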

Numbers
Eventually, the concept of numbers
became concrete and familiar enough for
counting to arise, at times with sing-song
mnemonics to teach sequences to others.
All known human languages, except the Pirahã language, have words for at least
"one" and "two", and even some animals
like the blackbird can distinguish a
surprising number of items.[5]
Advances in the numeral system and
mathematical notation eventually led to
the discovery of mathematical operations
such as addition, subtraction,
multiplication, division, squaring, square
root, and so forth. In time the operations were formalized, and concepts about them became understood well enough to be stated formally, and even proven. See, for example, Euclid's
algorithm for finding the greatest common
divisor of two numbers.
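In modern notation Euclid's procedure fits in a few lines. A minimal sketch in Python, using the familiar repeated-remainder form of the algorithm:

```python
def gcd(a, b):
    """Euclid's algorithm: repeatedly replace (a, b) with (b, a mod b)
    until the remainder is zero; the last nonzero value is the GCD."""
    while b:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # -> 21
```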

By the High Middle Ages, the positional Hindu–Arabic numeral system had reached Europe, which allowed for systematic computation of numbers. During this period, the representation of a calculation on paper allowed calculation of mathematical expressions, and the tabulation of mathematical functions such as the square root and the common logarithm (for use in multiplication and division) and the trigonometric functions. By the time of Isaac Newton's research, paper or vellum was an important computing resource, and even in our present time, researchers like Enrico Fermi would cover random scraps of paper with calculations, to satisfy their curiosity about an equation.[6] Even into the period of programmable calculators, Richard Feynman would unhesitatingly compute by hand any steps which overflowed the memory of the calculators, just to learn the answer.

Early computation
The earliest known tool for use in computation is the Sumerian abacus, which is thought to have been invented in Babylon c. 2700–2300 BC. It was originally used with lines drawn in sand and pebbles. Abaci of a more modern design are still used as calculation tools today. The abacus was the first known computer and the most advanced system of calculation of its time, preceding Greek methods by 2,000 years.

In c. 1050–771 BC, the south-pointing chariot was invented in ancient China. It was the first known geared mechanism to use a differential gear, which was later used in analog computers. The Chinese also invented a more sophisticated abacus from around the 2nd century BC, known as the Chinese abacus.

In the 5th century BC in ancient India, the grammarian Pāṇini formulated the grammar of Sanskrit in 3,959 rules known as the Ashtadhyayi, which was highly systematized and technical. Pāṇini used metarules, transformations and recursions.[7]

In the 3rd century BC, Archimedes used the mechanical principle of balance (see Archimedes Palimpsest § Mathematical content) to calculate mathematical problems, such as the number of grains of sand in the universe (The Sand Reckoner), which also required a recursive notation for numbers (e.g., the myriad myriad).

Around 200 BC the development of gears had made it possible to create devices in which the positions of wheels would correspond to positions of astronomical objects. By about 100 AD, Hero of Alexandria had described an odometer-like device that could be driven automatically and could effectively count in digital form.[8] But it was not until the 1600s that mechanical devices for digital computation appear to have actually been built.

The Antikythera mechanism is believed to be the earliest known mechanical analog computer.[9] It was designed to calculate astronomical positions. It was discovered in 1901 in the Antikythera wreck off the Greek island of Antikythera, between Kythera and Crete, and has been dated to circa 100 BC.

Mechanical analog computer devices appeared again a thousand years later in the medieval Islamic world and were developed by Muslim astronomers, such as the mechanical geared astrolabe by Abū Rayhān al-Bīrūnī,[10] and the torquetum by Jabir ibn Aflah.[11] According to Simon Singh, Muslim mathematicians also made important advances in cryptography, such as the development of cryptanalysis and frequency analysis by Alkindus.[12][13] Programmable machines were also invented by Muslim engineers, such as the automatic flute player by the Banū Mūsā brothers,[14] and Al-Jazari's humanoid robots and castle clock, which is considered to be the first programmable analog computer.[15]

During the Middle Ages, several European philosophers made attempts to produce analog computer devices. Influenced by the Arabs and Scholasticism, Majorcan philosopher Ramon Llull (1232–1315) devoted a great part of his life to defining and designing several logical machines that, by combining simple and undeniable philosophical truths, could produce all possible knowledge. These machines were never actually built, as they were more of a thought experiment to produce new knowledge in systematic ways; although they could make simple logical operations, they still needed a human being for the interpretation of results. Moreover, they lacked a versatile architecture, each machine serving only very concrete purposes. In spite of this, Llull's work had a strong influence on Gottfried Leibniz (early 18th century), who developed his ideas further and built several calculating tools using them.

Indeed, when John Napier discovered logarithms for computational purposes in the early 17th century, there followed a period of considerable progress by inventors and scientists in making calculating tools. The apex of this early era of formal computing can be seen in the difference engine and its successor the analytical engine (which was never completely constructed but was designed in detail), both by Charles Babbage. The analytical engine combined concepts from his work and that of others to create a device that, if constructed as designed, would have possessed many properties of a modern electronic computer. These properties include such features as an internal "scratch memory" equivalent to RAM, multiple forms of output including a bell, a graph-plotter, and a simple printer, and a programmable input-output "hard" memory of punch cards which it could modify as well as read. The key advancement which Babbage's devices possessed beyond those created before them was that each component of the device was independent of the rest of the machine, much like the components of a modern electronic computer. This was a fundamental shift in thought; previous computational devices served only a single purpose and had to be, at best, disassembled and reconfigured to solve a new problem. Babbage's devices could be reprogrammed to solve new problems by the entry of new data, and could act upon previous calculations within the same series of instructions. Ada Lovelace took this concept one step further, by creating a program for the analytical engine to calculate Bernoulli numbers, a complex calculation requiring a recursive algorithm. This is considered to be the first example of a true computer program, a series of instructions that act upon data not known in full until the program is run.
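Lovelace's program was laid out as a table of engine operations rather than as modern source code, but the recursive character of the calculation can be illustrated with the standard recurrence for the Bernoulli numbers. A minimal sketch in Python, offered as an illustration rather than a transcription of her Note G:

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Bernoulli numbers B_0..B_n via the recurrence
    sum_{k=0}^{m} C(m+1, k) * B_k = 0 for m >= 1 (so B_1 = -1/2)."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        # Each B_m is defined in terms of all earlier values, which is what
        # made the calculation a natural showcase for a machine that could loop.
        B.append(-sum(comb(m + 1, k) * B[k] for k in range(m)) / (m + 1))
    return B

print(bernoulli(8))  # B_2 = 1/6, B_4 = -1/30, B_6 = 1/42, B_8 = -1/30
```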

Several examples of analog computation survived into recent times. A planimeter is a device which does integrals, using distance as the analog quantity. Until the 1980s, HVAC systems used air both as the analog quantity and the controlling element. Unlike modern digital computers, analog computers are not very flexible, and need to be reconfigured (i.e., reprogrammed) manually to switch them from working on one problem to another. Analog computers had an advantage over early digital computers in that they could be used to solve complex problems using behavioral analogues, while the earliest attempts at digital computers were quite limited.

A Smith Chart is a well-known nomogram.

Since computers were rare in this era, the solutions were often hard-coded into paper forms such as nomograms,[16] which could then produce analog solutions to these problems, such as the distribution of pressures and temperatures in a heating system.
Digital electronic computers

The “brain” [computer] may one day come down to our level [of the common people] and help with our income-tax and book-keeping calculations. But this is speculation and there is no sign of it so far.

— British newspaper The Star in a June 1949 news article about the EDSAC computer, long before the era of personal computers.[17]

None of the early computational devices were really computers in the modern sense, and it took considerable advancement in mathematics and theory before the first modern computers could be designed.

The first recorded idea of using digital electronics for computing was the 1931 paper "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-Williams.[18] From 1934 to 1936, NEC engineer Akira Nakashima published a series of papers introducing switching circuit theory, using digital electronics for Boolean algebraic operations,[19][20][21] influencing Claude Shannon's seminal 1938 paper "A Symbolic Analysis of Relay and Switching Circuits".[22]

The 1937 Atanasoff–Berry computer design was the first digital electronic computer, though it was not programmable. The Z3 computer, built by German inventor Konrad Zuse in 1941, was the first working programmable, fully automatic computing machine.

Alan Turing modeled computation in terms of a one-dimensional storage tape, leading to the idea of the Turing machine and Turing-complete programming systems.
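The tape-and-table model is easy to sketch in code. Below is a minimal, hypothetical Turing machine simulator in Python; the transition table is an example program that increments a binary number, and none of the names are taken from Turing's paper:

```python
def run(tape, transitions, state="start", head=0, blank="_"):
    """Run a Turing machine: a finite-state control reads one tape cell,
    writes a symbol, moves left or right, and changes state until it halts."""
    cells = dict(enumerate(tape))
    while state != "halt":
        symbol = cells.get(head, blank)
        write, move, state = transitions[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    lo, hi = min(cells), max(cells)
    return "".join(cells.get(i, blank) for i in range(lo, hi + 1)).strip(blank)

# Example program: increment a binary number (scan right, then carry leftward).
transitions = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "L", "halt"),
    ("carry", "_"): ("1", "L", "halt"),
}
print(run("1011", transitions))  # -> 1100
```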

During World War II, ballistics computing was done by women, who were hired as "computers". The term "computer" remained one that referred mostly to women (now seen as "operator") until 1945, after which it took on the modern definition of machinery it presently holds.[23]

The ENIAC (Electronic Numerical Integrator And Computer) was the first electronic general-purpose computer, announced to the public in 1946. It was Turing-complete, digital, and capable of being reprogrammed to solve a full range of computing problems. Women implemented the programming for machines like the ENIAC, and men created the hardware.[23]

The Manchester Baby was the first electronic stored-program computer. It was built at the Victoria University of Manchester by Frederic C. Williams, Tom Kilburn and Geoff Tootill, and ran its first program on 21 June 1948.[24] The first stored-program transistor computer was the ETL Mark III, developed by Japan's Electrotechnical Laboratory[25][26][27] from 1954[28] to 1956.[26]

In 1954, 95% of computers in service were being used for engineering and scientific purposes.[29]

The microprocessor was introduced with the Intel 4004. It began with the "Busicom Project"[30] as Masatoshi Shima's three-chip CPU design in 1968,[31][30] before Sharp's Tadashi Sasaki conceived of a single-chip CPU design, which he discussed with Busicom and Intel in 1968.[32] The Intel 4004 was then developed as a single-chip microprocessor from 1969 to 1970, led by Intel's Marcian Hoff and Federico Faggin and Busicom's Masatoshi Shima.[30] The microprocessor led to the development of microcomputers, and the microcomputer revolution.

Most early microprocessors, such as the Intel 8008 and Intel 8080, were 8-bit. Texas Instruments released the first fully 16-bit microprocessor, the TMS9900 processor, in June 1976.[33] They used the microprocessor in the TI-99/4 and TI-99/4A computers.

The 1980s brought significant advances with microprocessors that greatly impacted the fields of engineering and other sciences. The Motorola 68000 microprocessor had a processing speed that was far superior to the other microprocessors being used at the time. This newer, faster microprocessor allowed the microcomputers that followed it to do considerably more computing. This was evident in the 1983 release of the Apple Lisa. The Lisa was the first personal computer with a graphical user interface (GUI) that was sold commercially. It ran on the Motorola 68000 CPU and used both dual floppy disk drives and a 5 MB hard drive for storage. The machine also had 1 MB of RAM, which allowed software to run from disk without continually rereading the disk.[34] After the failure of the Lisa in terms of sales, Apple released its first Macintosh computer, still running on the Motorola 68000 microprocessor, but with only 128 KB of RAM, one floppy drive, and no hard drive in order to lower the price.

In the late 1980s and early 1990s, computers became more useful for actual computational purposes. In 1989, Apple released the Macintosh Portable; it weighed 7.3 kg (16 lb) and was extremely expensive, costing US$7,300. At launch it was one of the most powerful laptops available, but due to the price and weight it was not met with great success, and it was discontinued only two years later. That same year Intel introduced the Touchstone Delta supercomputer, which had 512 microprocessors. This technological advancement was very significant, as it was used as a model for some of the fastest multi-processor systems in the world. It was even used as a prototype by Caltech researchers, who used the model for projects like real-time processing of satellite images and simulating molecular models for various fields of research.

Navigation and astronomy

Starting with known special cases, the calculation of logarithms and trigonometric functions can be performed by looking up numbers in a mathematical table, and interpolating between known cases. For small enough differences, this linear operation was accurate enough for use in navigation and astronomy in the Age of Exploration. The uses of interpolation have thrived in the past 500 years: by the twentieth century Leslie Comrie and W. J. Eckert systematized the use of interpolation in tables of numbers for punch card calculation.
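As a sketch of the table-plus-interpolation procedure, the following Python example builds a hypothetical four-figure sine table at one-degree steps and interpolates linearly between adjacent entries; the table and function names are illustrative, not taken from any historical source:

```python
import math
from bisect import bisect_left

# A hypothetical four-figure sine table at 1-degree steps, 0..90 degrees.
angles = list(range(0, 91))
sines = [round(math.sin(math.radians(d)), 4) for d in angles]

def sin_from_table(deg):
    """Look up sin(deg) by linear interpolation between tabulated values."""
    i = bisect_left(angles, deg)
    if i < len(angles) and angles[i] == deg:
        return sines[i]                                  # exact table entry
    x0, x1 = angles[i - 1], angles[i]
    y0, y1 = sines[i - 1], sines[i]
    return y0 + (y1 - y0) * (deg - x0) / (x1 - x0)       # linear interpolation

print(sin_from_table(28.4))                    # interpolated value
print(round(math.sin(math.radians(28.4)), 6))  # for comparison
```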

Weather prediction

The numerical solution of differential equations, notably the Navier–Stokes equations, was an important stimulus to computing, with Lewis Fry Richardson's numerical approach to solving differential equations. The first computerised weather forecast was performed in 1950 by a team composed of American meteorologists Jule Charney, Philip Thompson, Larry Gates, and Norwegian meteorologist Ragnar Fjørtoft, applied mathematician John von Neumann, and ENIAC programmer Klara Dan von Neumann.[35][36][37] To this day, some of the most powerful computer systems on Earth are used for weather forecasts.
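Richardson's idea was to replace the continuous equations with finite differences advanced step by step. As a loose illustration of that style of calculation, and not the barotropic vorticity scheme actually run on ENIAC in 1950, here is an explicit finite-difference step for the one-dimensional heat equation:

```python
# Explicit finite-difference update for du/dt = alpha * d2u/dx2 on a 1-D grid.
def heat_step(u, alpha=0.1, dx=1.0, dt=1.0):
    new = u[:]                      # boundary values are held fixed
    for i in range(1, len(u) - 1):
        new[i] = u[i] + alpha * dt / dx**2 * (u[i - 1] - 2 * u[i] + u[i + 1])
    return new

u = [0.0] * 10 + [100.0] + [0.0] * 10   # an initial "hot spot" in the middle
for _ in range(50):                      # march forward in time
    u = heat_step(u)
print([round(x, 1) for x in u])          # the spot has diffused outward
```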

Symbolic computations
By the late 1960s, computer systems
could perform symbolic algebraic
manipulations well enough to pass
college-level calculus courses.
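Such systems worked on exactly this kind of problem: differentiating and integrating symbolic expressions rather than crunching numbers. A small modern illustration, using the SymPy library rather than a 1960s system:

```python
import sympy as sp

x = sp.Symbol("x")
expr = x**2 * sp.sin(x)

print(sp.diff(expr, x))               # x**2*cos(x) + 2*x*sin(x)
print(sp.integrate(expr, x))          # -x**2*cos(x) + 2*x*sin(x) + 2*cos(x)
print(sp.limit(sp.sin(x) / x, x, 0))  # 1
```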

See also
Algorithm
Charles Babbage Institute – research center for the history of computing at the University of Minnesota
Computing timelines category
History of software
IT History Society
List of mathematicians
List of pioneers in computer science
Timeline of quantum computing

References
1. "Digital Computing - Dictionary
definition of Digital Computing |
Encyclopedia.com: FREE online
dictionary" . www.encyclopedia.com.
Retrieved 2017-09-11.
2. "One-to-One Correspondence: 0.5" .
Victoria Department of Education and
Early Childhood Development.
Archived from the original on 20
November 2012.
3. Ifrah, Georges (2000), The Universal
History of Numbers: From prehistory
to the invention of the computer., John
Wiley and Sons, p. 48, ISBN 0-471-
39340-1
4. Weisstein, Eric W. "3, 4, 5 Triangle". mathworld.wolfram.com. Retrieved 2017-09-11.
5. Konrad Lorenz (1961). King Solomon's
Ring. Translated by Marjorie Kerr
Wilson. London: Methuen. ISBN 0-416-
53860-6.
6. "DIY: Enrico Fermi's Back of the
Envelope Calculations" .
7. Sinha, A. C. (1978). "On the status of
recursive rules in transformational
grammar". Lingua. 44 (2–3): 169.
doi:10.1016/0024-3841(78)90076-1 .
8. Wolfram, Stephen (2002). A New Kind
of Science. Wolfram Media, Inc.
p. 1107. ISBN 1-57955-008-8.
9. The Antikythera Mechanism Research
Project , The Antikythera Mechanism
Research Project. Retrieved 2007-07-
01
10. "Islam, Knowledge, and Science" .
University of Southern California.
Archived from the original on 2008-
01-19. Retrieved 2008-01-22.
11. Lorch, R. P. (1976), "The Astronomical
Instruments of Jabir ibn Aflah and the
Torquetum", Centaurus, 20 (1): 11–34,
Bibcode:1976Cent...20...11L ,
doi:10.1111/j.1600-
0498.1976.tb00214.x
12. Simon Singh, The Code Book, pp. 14-
20
13. "Al-Kindi, Cryptgraphy, Codebreaking
and Ciphers" . Retrieved 2007-01-12.
14. Koetsier, Teun (2001), "On the prehistory of programmable machines: musical automata, looms, calculators", Mechanism and Machine Theory, Elsevier, 36 (5): 589–603, doi:10.1016/S0094-114X(01)00005-2.
15. Ancient Discoveries, Episode 11:
Ancient Robots , History Channel,
archived from the original on March
1, 2014, retrieved 2008-09-06
16. Steinhaus, H. (1999). Mathematical
Snapshots (3rd ed.). New York: Dover.
pp. 92–95, p. 301.
17. [1]
18. Wynn-Williams, C. E. (July 2, 1931),
"The Use of Thyratrons for High Speed
Automatic Counting of Physical
Phenomena", Proceedings of the
Royal Society A, 132 (819): 295–310,
Bibcode:1931RSPSA.132..295W ,
doi:10.1098/rspa.1931.0102
19. History of Research on Switching
Theory in Japan , IEEJ Transactions
on Fundamentals and Materials, Vol.
124 (2004) No. 8, pp. 720-726,
Institute of Electrical Engineers of
Japan
20. Switching Theory/Relay Circuit
Network Theory/Theory of Logical
Mathematics , IPSJ Computer
Museum, Information Processing
Society of Japan
21. Radomir S. Stanković, Jaakko Astola
(2008), Reprints from the Early Days
of Information Sciences: TICSP Series
On the Contributions of Akira
Nakashima to Switching Theory ,
TICSP Series #40, Tampere
International Center for Signal
Processing, Tampere University of
Technology
22. Stanković, Radomir S.; Astola, Jaakko
T.; Karpovsky, Mark G. "Some
Historical Remarks on Switching
Theory" (PDF). Tampere International
Center for Signal Processing, Tampere
University of Technology.
CiteSeerX 10.1.1.66.1248 .
23. Light, Jennifer S. (July 1999). "When
Computers Were Women". Technology
and Culture. 40: 455–483.
24. Enticknap, Nicholas (Summer 1998),
"Computing's Golden Jubilee" ,
Resurrection, The Computer
Conservation Society (20), ISSN 0958-
7403 , retrieved 19 April 2008
25. Early Computers , Information
Processing Society of Japan
26. 【Electrotechnical Laboratory】 ETL
Mark III Transistor-Based Computer ,
Information Processing Society of
Japan
27. Early Computers: Brief History ,
Information Processing Society of
Japan
28. Martin Fransman (1993), The Market
and Beyond: Cooperation and
Competition in Information
Technology, page 19 , Cambridge
University Press
29. Ensmenger, Nathan (2010). The
Computer Boys Take Over. p. 58.
ISBN 978-0-262-05093-7.
30. Federico Faggin, The Making of the
First Microprocessor , IEEE Solid-State
Circuits Magazine, Winter 2009, IEEE
Xplore
31. Nigel Tout. "The Busicom 141-PF
calculator and the Intel 4004
microprocessor" . Retrieved
November 15, 2009.
32. Aspray, William (1994-05-25). "Oral-
History: Tadashi Sasaki" . Interview
#211 for the Center for the History of
Electrical Engineering. The Institute of
Electrical and Electronics Engineers,
Inc. Retrieved 2013-01-02.
33. Conner, Stuart. "Stuart's TM 990
Series 16-bit Microcomputer
Modules" . www.stuartconner.me.uk.
Retrieved 2017-09-05.
34. "Computers | Timeline of Computer
History | Computer History Museum" .
www.computerhistory.org. Retrieved
2017-09-05.
35. Charney, Fjörtoft and von Neumann,
1950, Numerical Integration of the
Barotropic Vorticity Equation Tellus, 2,
237-254
36. Witman, Sarah (16 June 2017). "Meet
the Computer Scientist You Should
Thank For Your Smartphone's Weather
App" . Smithsonian. Retrieved 22 July
2017.
37. Edwards, Paul N. (2010). A Vast
Machine: Computer Models, Climate
Data, and the Politics of Global
Warming . The MIT Press. ISBN 978-
0262013925.

Important women and their contributions

Ada Lovelace wrote the addendum to Babbage's Analytical Engine, detailing, in poetic style, the first computer algorithm: a description of exactly how the Analytical Engine should have worked based on its design.
Grace Murray Hopper was a pioneer of computing. She worked alongside Howard H. Aiken on IBM's Harvard Mark I. Hopper also came up with the term "debugging".
Hedy Lamarr invented a "frequency hopping" technology that was used by the Navy during World War II to control torpedoes via radio signals. The same technology is used today in Bluetooth and Wi-Fi signals.
Frances Elizabeth "Betty" Holberton invented "breakpoints", pauses put into lines of computer code to help programmers detect, troubleshoot, and solve problems.
Frances Elizabeth "Fran" Allen did pioneering work on optimizing compilers and became the first woman to receive the Turing Award.
Karen Spärck Jones was responsible for "inverse document frequency", a concept that is most commonly used by search engines.
Margaret Hamilton was the director of the Software Engineering Division at MIT, which developed on-board flight software for the Apollo space missions.
Barbara Liskov developed the "Liskov Substitution Principle".
Radia Perlman invented the "Spanning Tree Protocol", a key network protocol used in Ethernet networks.

External links
The History of Computing by J.A.N. Lee
"Things that Count: the rise and fall of
calculators"
The History of Computing Project
SIG on Computers, Information and
Society of the Society for the History of
Technology
The Modern History of Computing
Cringely's "Triumph of the Nerds"
Top 25 Days in Computing History
A Chronology of Digital Computing
Machines (to 1952) by Mark Brader
Bitsavers , an effort to capture, salvage,
and archive historical computer
software and manuals from
minicomputers and mainframes of the
1950s, 60s, 70s, and 80s
Cyberhistory (2001) by Keith Falloon.
UWA digital thesis repository.
Arithmometre.org , The reference about
Thomas de Colmar's arithmometers
Yahoo Computers and History
"All-Magnetic Logic Computer" .
Timeline of Innovations. SRI
International. Developed at SRI
International in 1961
Stephen White's excellent computer history site (the above article is a modified version of his work, used with permission)
Soviet Digital Electronics Museum - a
big collection of Soviet calculators,
computers, computer mice and other
devices
Logarithmic timeline of greatest
breakthroughs since start of computing
era in 1623 by Jürgen Schmidhuber,
from "The New AI: General & Sound &
Relevant for Physics, In B. Goertzel and
C. Pennachin, eds.: Artificial General
Intelligence, p. 175-198, 2006."
IEEE computer history timeline
Konrad Zuse, inventor of first working
programmable digital computer by
Jürgen Schmidhuber
The Moore School Lectures and the
British Lead in Stored Program Computer
Development (1946–1953) , article from
Virtual Travelog
MIT STS.035 — The History of
Computing (Spring 2004) from MIT
OpenCourseWare for undergraduate
level
Key Resources in the History of
Computing
Italian computer database of brands
Computer History - a collection of
articles by Bob Bemer
On the status of recursive rules in
transformational grammar (subscription
required)

YouTube video comparing 1980s home computers to 2010s technology
A visual timeline of the development of
computers since COLOSSUS' inception
in 1943
History of Computing Visualization
Computer Histories - An introductory
course on the history of computing

British history links

Resurrection Bulletin of the Computer Conservation Society (UK) 1990–2006
The story of the Manchester Mark I (archive), 50th Anniversary website at the University of Manchester
Richmond Arabian History of Computing Group Linking the Gulf and Europe
