
History of Computing
Definition of a Computer - before 1935, a computer was a person who performed arithmetic
calculations. Between 1935 and 1945, the definition referred to a machine rather than a person. The
modern definition is based on von Neumann's concepts: a device that accepts input, processes
data, stores data, and produces output.

We have gone from the vacuum tube to the transistor, to the microchip. Then the microchip started
talking to the modem. Now we exchange text, sound, photos and movies in a digital environment.

Computing milestones and machine evolution:

14th C. - Abacus - an instrument for performing calculations by sliding counters along rods or in
grooves (graphic: Merriam Webster Collegiate Dictionary http://www.m-w.com/mw/art/abacus.htm)

17th C. - Slide rule - a manual device used for calculation that consists, in its simplest form, of a
ruler and a movable middle piece graduated with similar logarithmic scales (picture
from The Museum of HP Calculators)
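
A quick way to see the principle: a slide rule adds lengths proportional to logarithms, so multiplication becomes addition. A minimal Python sketch of the idea (the function name is just for illustration):

```python
import math

def slide_rule_multiply(a, b):
    """Multiply two positive numbers the way a slide rule does:
    add their logarithms, then take the antilog."""
    return math.exp(math.log(a) + math.log(b))

print(slide_rule_multiply(3.0, 7.0))  # ~21.0, to the precision of the "scales"
```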

1642 - Pascaline - a mechanical calculator built by Blaise Pascal, a 17th century mathematician,
for whom the Pascal computer programming language was named.

1804 - Jacquard loom - a loom programmed with punched cards invented by Joseph Marie
Jacquard

ca 1850 - Difference Engine, Analytical Engine - Charles Babbage and Ada Byron (see her
picture). Babbage's 1837 description of the Analytical Engine, a hand-cranked mechanical
digital computer, anticipated virtually every aspect of present-day computers. It wasn't until over
100 years later that another all-purpose computer was conceived. Sketch of the Engine and
notes by Ada Byron King, Countess of Lovelace.

1939-1942 - Atanasoff-Berry Computer - built at Iowa State by Prof. John V. Atanasoff and
graduate student Clifford Berry. It represented several "firsts" in computing, including a binary
system of arithmetic, parallel processing, regenerative memory, separation of memory and
computing functions, and more. It weighed 750 lbs. and had a memory storage of 3,000 bits
(0.4K). It recorded numbers by scorching marks into cards as it worked through a problem. See
diagram.
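
To see what a binary system of arithmetic means in practice, here is a small illustrative Python sketch that adds two binary numbers digit by digit, carrying in base 2 the way binary hardware does (a toy function, not the ABC's actual logic):

```python
def binary_add(a_bits, b_bits):
    """Add two binary numbers given as strings, e.g. '1011' and '110',
    carrying digit by digit in base 2."""
    n = max(len(a_bits), len(b_bits))
    a, b = a_bits.zfill(n), b_bits.zfill(n)
    carry, out = 0, []
    for x, y in zip(reversed(a), reversed(b)):
        total = int(x) + int(y) + carry
        out.append(str(total % 2))   # digit for this position
        carry = total // 2           # carry into the next position
    if carry:
        out.append("1")
    return "".join(reversed(out))

print(binary_add("1011", "110"))  # '10001' (11 + 6 = 17)
```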

1940s - Colossus - a vacuum tube computing machine used at Bletchley Park to break German
ciphers during WW II, helping to turn the tide of the war. In the summer of 1939, a small group of
scholars became codebreakers, working at Bletchley Park in England. This group of pioneering
codebreakers helped shorten the war and changed the course of history. See the Bletchley Park
Web site and its history. See more information on Codes and Ciphers in the Second World War
at Tony Sales' site.

1946 - ENIAC - World's first electronic, large-scale, general-purpose computer, built by Mauchly
and Eckert, and activated at the University of Pennsylvania in 1946. ENIAC has been recreated
on a modern computer chip; see an explanation of ENIAC on a Chip by the Moore School of
Electrical Engineering, University of Pennsylvania. The ENIAC was a 30-ton machine that
measured 50 x 30 feet. It contained 19,000 vacuum tubes, 6,000 switches, and could add 5,000
numbers in a second, a remarkable accomplishment at the time. A reprogrammable machine,
the ENIAC performed initial calculations for the H-bomb. It was also used to prepare artillery
shell trajectory tables and perform other military and scientific calculations. Since there was no
software to reprogram the computer, people had to rewire it to get it to perform different
functions. The human programmers had to read wiring diagrams and know what each switch
did. J. Presper Eckert, Jr. and John W. Mauchly drew on Atanasoff's work to create the ENIAC,
the Electronic Numerical Integrator and Computer.
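
Trajectory tables like ENIAC's were produced by stepwise numerical integration. A minimal sketch of the idea in Python, using simple Euler steps and ignoring air resistance (the numbers are illustrative, not ENIAC's actual firing-table model):

```python
import math

def trajectory(v0, angle_deg, dt=0.01, g=9.81):
    """Step a projectile forward in time (Euler's method), the kind of
    repetitive calculation ENIAC performed for firing tables."""
    vx = v0 * math.cos(math.radians(angle_deg))
    vy = v0 * math.sin(math.radians(angle_deg))
    x = y = t = 0.0
    while y >= 0.0:          # integrate until the shell lands
        x += vx * dt
        vy -= g * dt
        y += vy * dt
        t += dt
    return x, t              # range (m) and time of flight (s)

rng, tof = trajectory(v0=300.0, angle_deg=45.0)
print(f"range ~{rng:.0f} m, flight time ~{tof:.1f} s")
```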

1951-1959 - vacuum tube based technology. Vacuum tubes are electronic devices consisting
of a glass or steel vacuum envelope and two or more electrodes between which electrons can
move freely. The first commercial computers, such as the UNIVAC and IBM 701, used vacuum
tubes.

1950s-1960s - UNIVAC - "punch card technology." The first commercially successful computer,
introduced in 1951 by Remington Rand. Over 40 systems were sold. Its memory was made of
mercury-filled acoustic delay lines that held 1,000 12-digit numbers. It used magnetic tapes that
stored 1MB of data at a density of 128 cpi. UNIVAC became synonymous with computer (for a
while). See UNIVAC photo. See UNIVAC flow chart.

1960 - IBM 1620 - See photos at The Computer History Museum.

1960-1968 - transistor based technology. The transistor, invented in 1947 by Dr. John Bardeen,
Dr. Walter Brattain, and Dr. William Shockley, almost completely replaced the vacuum tube
because of its reduced cost, weight, and power consumption and its higher reliability. See an
explanation and diagram of a transistor and what the first transistor looked like. The transistor
switches between a conducting state (switched 'on', full current flow) and an insulating state
(switched 'off', no current flow).
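
Because each transistor acts as an on/off switch, all of digital logic can be expressed as functions of 1s and 0s. A small illustrative sketch (not any particular circuit):

```python
def nand(a, b):
    """A NAND gate: output 0 only when both inputs are 'on'."""
    return 0 if (a and b) else 1

# Every other gate can be built from NAND alone:
def not_(a):
    return nand(a, a)

def and_(a, b):
    return not_(nand(a, b))

def or_(a, b):
    return nand(not_(a), not_(b))

print(and_(1, 1), or_(1, 0), not_(1))  # 1 1 0
```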

1969 - The Internet, originally the ARPAnet (Advanced Research Projects Agency network),
began as a military computer network.

1969-1977 - integrated circuit (IC) based technology. The first integrated circuit was
demonstrated by Texas Instruments inventor Jack Kilby in 1958. It was 7/16" wide and
contained two transistors. Examples of early integrated circuit technology: the Intel 4004, the
DEC PDP-8, and the CRAY 1 (1976) - a 75MHz, 64-bit machine with a peak speed of 160
megaflops (millions of floating-point operations per second), the world's fastest processor at
that time. Now circuits may contain hundreds of thousands of transistors on a small piece of
material, which has revolutionized computing. Here is a diagram of a modern integrated circuit,
known as a chip.

1976 - CRAY 1 - Designed by Seymour Cray, a 75MHz, 64-bit machine with a peak speed of
160 megaflops (millions of floating-point operations per second), the world's fastest processor
at that time.

1976 - Apples/MACs - The Apple was designed by Steve Wozniak and Steve Jobs. Apple
popularized the "windows"-type graphical interface and the computer mouse. Like modern
computers, early Apples had a peripheral keyboard and mouse, and a floppy disk drive; the
Macintosh, which replaced the Apple, used 3.5" disks. See a picture of the Apple III (1980-1985).

1978 to 1986 - large scale integration (LSI); the Alto - an early workstation with a mouse; the
Apple, designed by Steve Wozniak and Steve Jobs, which popularized the "windows"-type
graphical interface and the computer mouse. See Apple/MACs evolution over time. The PC and
clone market began to expand, creating the first mass market for desktop computers.

1986 to today - the age of networked computing, the Internet, and the WWW.

1990 - Tim Berners-Lee invented the networked hypertext system called the World Wide Web.

1992 - Bill Gates' Microsoft Corp. released Windows 3.1, an operating system that made IBM
and IBM-compatible PCs more user-friendly by integrating a graphical user interface into the
software. In replacing the old Windows command-line system, however, Microsoft created a
program similar to the Macintosh operating system. Apple sued for copyright infringement, but
Microsoft prevailed. Windows 3.1 went to Win 95, then Win 98, now Windows XP.... (There are
other OSs, of course, but Windows is the dominant OS today. MACs, by Apple, and Linux still
have faithful followings.)

1995 - large commercial Internet service providers (ISPs), such as MCI, Sprint, AOL and
UUNET, began offering service to large numbers of customers.

1996 - Personal Digital Assistants (such as the Palm Pilot) became available to consumers.
They can do numeric calculations, play games and music, and download information from the
Internet. See How Stuff Works for a history and details.


Pioneer computer scientists

Charles Babbage (1792-1871) - Difference Engine, Analytical Engine. Ada Byron, daughter of the poet
Lord Byron, worked with him. His 1837 description of the Analytical Engine, a mechanical digital
computer, anticipated virtually every aspect of present-day computers. Sketch of the Engine and notes
by Ada Byron King, Countess of Lovelace.

Alan Turing -- (1912-1954). British codebreaker. Worked on the Colossus (code-breaking machine,
precursor to the computer) and the ACE (Automatic Computing Engine). Noted for many brilliant ideas,
Turing is perhaps best remembered for the concepts of the Turing Test for artificial intelligence and the
Turing Machine, an abstract model of computation. The Turing Test is the "acid test" of true artificial
intelligence, as defined by the English scientist Alan Turing. In the 1940s, he said "a machine has
artificial intelligence when there is no discernible difference between the conversation generated by
the machine and that of an intelligent person." Turing was instrumental in breaking the German
Enigma code during WWII with his Bombe computing machine. The Enigma was a machine used by
the Germans to create encrypted messages. See Turing's Treatise on Enigma.
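
The Turing Machine itself is simple enough to simulate in a few lines: a tape of symbols, a read/write head, and a table of state-transition rules. A minimal illustrative simulator (the rule set below is a toy example, not from Turing's paper):

```python
def run_turing_machine(tape, rules, state="start", max_steps=1000):
    """Simulate a Turing Machine. `rules` maps (state, symbol) to
    (new_state, symbol_to_write, head_move), with '_' as the blank."""
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        state, cells[head], move = rules[(state, cells.get(head, "_"))]
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells))

# Toy rule set: flip every bit, then halt at the first blank.
rules = {
    ("start", "0"): ("start", "1", "R"),
    ("start", "1"): ("start", "0", "R"),
    ("start", "_"): ("halt", "_", "R"),
}
print(run_turing_machine("1011", rules))  # 0100_
```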

See an explanation of "The Turing Test": Oppy, Graham and Dowe, David, "The Turing Test", The
Stanford Encyclopedia of Philosophy (Summer 2003 Edition), Edward N. Zalta (ed.),
<http://plato.stanford.edu/archives/sum2003/entries/turing-test/>.

Pictures of the Enigma machine that the Germans used to create encrypted messages.

More Information about the Enigma machine.

J. von Neumann -- (1903-1957). A child prodigy in mathematics, he authored a landmark paper
explaining how programs could be stored as data. (Unlike ENIAC, which had to be rewired to be
reprogrammed.) Virtually all computers today, from toys to supercomputers costing millions of dollars,
are variations on the computer architecture that John von Neumann created on the foundation of
Alan Turing's work in the 1940s. It included three components used by most computers today: a CPU;
a slow-to-access storage area, like a hard drive; and fast-access memory (RAM).

The machine stored instructions as binary values (creating the stored program concept) and executed
instructions sequentially - the processor fetched instructions one at a time and processed them. One
instruction is analyzed, data is processed, the next instruction is analyzed, etc. Today "von Neumann
architecture" often refers to the sequential nature of computers based on this model. See another von
Neumann source.
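
The fetch-and-execute cycle described above fits in a few lines of code: instructions and data share one memory, and the processor steps through them one at a time. A toy sketch (the instruction set is invented for illustration):

```python
def run(memory):
    """A toy von Neumann machine: program and data live in the same
    memory, and the CPU fetches one instruction at a time."""
    acc, pc = 0, 0                     # accumulator and program counter
    while True:
        op, arg = memory[pc]           # fetch the next instruction
        pc += 1
        if op == "LOAD":               # then execute it
            acc = memory[arg]
        elif op == "ADD":
            acc += memory[arg]
        elif op == "STORE":
            memory[arg] = acc
        elif op == "HALT":
            return memory

# Cells 0-3 hold instructions; cells 4-6 hold data. Computes 2 + 3.
memory = {
    0: ("LOAD", 4), 1: ("ADD", 5), 2: ("STORE", 6), 3: ("HALT", 0),
    4: 2, 5: 3, 6: 0,
}
print(run(memory)[6])  # 5
```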

John V. Atanasoff -- (1904-1995) - one of the contenders, along with Konrad Zuse, H. Edward
Roberts, and others, as the inventor of the first computer. His vacuum-tube device had limited
capabilities and did not have a central processing unit. It was not programmable, but could solve
differential equations using binary arithmetic. Atanasoff's Computer.

J. Presper Eckert, Jr. and John W. Mauchly completed the first programmed general-purpose electronic
digital computer in 1946. They drew on Atanasoff's work to create the ENIAC, the Electronic Numerical
Integrator and Computer. In 1973 a patent lawsuit resulted in John V. Atanasoff's being legally declared
the inventor. Though Atanasoff got legal status for his achievement, many historians still credit
J. Presper Eckert, Jr. and John W. Mauchly as the founding fathers of the modern computer. Eckert
and Mauchly formed the first computer company in 1946. Eckert received 87 patents. They introduced
the first modern binary computer with the Binary Automatic Computer (BINAC), which stored
information on magnetic tape rather than punched cards. Their UNIVAC I was built for the U.S. Census
Bureau. Their company was acquired by Remington Rand, which merged into the Sperry Rand
Corp. and then into Unisys Corp. Eckert retired from Unisys in 1989.

Konrad Zuse -- (1910-1995). A German engineer who, during WW II, designed mechanical and
electromechanical computers. Zuse's Z1, his contender for the first freely programmable computer,
contained all the basic components of a modern computer (control unit, memory, micro sequences,
etc.). Because of the scarcity of material during WW II, Zuse used discarded movie film as his punched
media. Like a modern computer, it was adaptable for different purposes and used on/off relay switches,
a binary system of 1s and 0s (on = 1, off = 0). Completed in 1938, it was destroyed in the bombardment
of Berlin in WW II, along with the construction plans. In 1986, Zuse reconstructed the Z1.

H. Edward Roberts -- developed the MITS Altair 8800 in 1975. The Altair is considered by some to be
the first microcomputer (personal computer). The MITS Altair 8800 was based on a 2 MHz Intel 8080
chip, with 256 bytes of RAM standard. It was developed a year before the first Apple, by Steve Wozniak
and Steve Jobs, came out. Paul Allen and Bill Gates (then a student at Harvard) wrote a scaled-down
version of the BASIC programming language to run on the Altair, which was the beginning of Microsoft.

See details about the MITS Altair 8800 at the Computer Museum of America (http://www.computer-
museum.org/collections/mits8800.html)


We can't talk about computers without mentioning:

The Birth of the Internet

The Internet, originally the ARPAnet (Advanced Research Projects Agency network), began as a
military computer network in 1969. This network was an experimental project of the U.S. Department
of Defense Advanced Research Projects Agency (DARPA). Other government agencies and
universities created internal networks based on the ARPAnet model. The catalyst for the Internet today
was provided by the National Science Foundation (NSF). Rather than have a physical communications
connection from each institution to a supercomputing center, the NSF began a "chain" of connections
in which institutions would be connected to their "neighbor" computing centers, which all tied into
central supercomputing centers. This beginning expanded to a global network of computer networks,
which allows computers all over the world to communicate with one another and share information
stored at various computer "servers," either on a local computer or a computer located anywhere in
the world. In 1986 came the birth of the National Science Foundation Network (NSFNET), which
connected scientists across the country to five supercomputer centers. Universities were early users
of the Internet. In 1992, the Internet was still primarily used by researchers and academics. In 1995,
large commercial Internet service providers (ISPs), such as MCI, Sprint, AOL and UUNET, began
offering service to large numbers of customers.

The Internet now links thousands of computer networks, reaching people all over the world. See this
Atlas of Cyberspaces for graphical images of networks in cyberspace.

Since traffic on the Internet has become so heavy, some of the scientific and academic institutions that
formed the original Internet developed a new global network called Internet 2. Known as the Abilene
Project, and running on fast fiber-optic cable, it officially opened for business in February 1999 at a
ceremony in Washington, D.C.

George Mason University is one of 150 universities in the United States working on the Internet 2
project with industry through the University Corporation for Advanced Internet Development (UCAID)
to improve the functionality and capabilities of the Internet. The network runs at 2.4 gigabits per
second, roughly 45,000 times faster than a 56K modem.

The Birth of the WWW

1990 - Tim Berners-Lee, currently the director of the World Wide Web Consortium, the coordinating
body for Web development, invented the World Wide Web. He occupies the 3Com Founders chair at
the MIT Laboratory for Computer Science. The WWW was originally conceived and developed for
high-energy physics collaborations, which require instantaneous information sharing between
physicists working in different universities and institutes all over the world. Now the WWW is used by
people all over the world, children and adults, for personal, commercial, and academic uses. Berners-
Lee and Robert Cailliau wrote the first WWW client and server software, defining Web addresses
(URLs), the hypertext transfer protocol (HTTP) and hypertext markup language (HTML). Here is Tim
Berners-Lee's original proposal to persuade CERN management to initiate a global hypertext system,
which Berners-Lee called "Mesh" before he decided on the name "World Wide Web" while writing the
code in 1990. In December 1993, Berners-Lee and Cailliau, along with Marc Andreessen and Eric
Bina of NCSA, shared the Association for Computing Machinery (ACM) Software System Award for
developing the World-Wide Web. The graphical Web browser, Mosaic, evolved into Netscape.

The WWW is based on the hypertext transfer protocol (HTTP).

What is hypertext, anyway?
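
Hypertext is text containing links to other documents, and HTTP is the protocol a browser uses to fetch each page. A minimal sketch of one such request in Python (the host and path are illustrative):

```python
import http.client

# One HTTP round trip: ask a server for a hypertext page.
conn = http.client.HTTPConnection("example.com")  # illustrative host
conn.request("GET", "/")                          # the verb and path a browser sends
response = conn.getresponse()
html = response.read().decode()                   # HTML: text with <a href="..."> links
print(response.status, html[:80])
conn.close()
```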

See CERN's overview of the WWW (What it is and its progress).

The ease of using the World Wide Web has made it easier for people to connect with one another,
overcoming the obstacles of time and space. This networking has spawned numerous virtual
communities and cybercultures. See this list of resources on cybercultures. The WWW has also
become a convenient way to buy and sell services and goods.

The Internet and WWW do not come without ethical and legal ramifications, such as copyright
infringement, computer spying and hacking, computer viruses, fraud, and privacy issues. See links to
computer Ethics, Laws, Privacy Issues. Also see Internet copyright resources.

What's next? - something interesting to ponder: Nanotechnology - K. Eric Drexler is the founding
father of nanotechnology, the idea of using individual atoms and molecules to build living and
mechanical "things" in miniature factories. His vision is that if scientists can engineer DNA on a
molecular level, why can't we build machines out of atoms and program them to build more machines?
The requirement for low cost creates an interest in these "self-replicating manufacturing systems,"
studied by von Neumann in the 1940s. These "nanorobots," programmed by miniature computers
smaller than the human cell, could go through the bloodstream curing diseases, performing surgery,
etc. If this technology comes about, the barriers between engineered and living systems may be
broken. Researchers at various institutions and organizations, like NASA and Xerox, are working on
this technology.

See The NanoAge website.


Some of the Many Women Pioneers in Computing:

Ada Byron King, Countess of Lovelace (1815-1852) - Portrait. Daughter of the British poet Lord Byron.
Ada was a mathematician and wrote extensive notes on Charles Babbage's calculating machine and
suggested how the engine might calculate Bernoulli numbers. This plan is now regarded as the first
"computer program." Sketch of the Engine and notes by Ada Byron King, Countess of Lovelace. A
software language developed by the U.S. Department of Defense was named "Ada" in her honor in
1979.
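
For a sense of what Ada's plan computed: Bernoulli numbers follow a simple recurrence, sketched here in Python with exact fractions (a modern formulation, of course, not her Analytical Engine program):

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n using the recurrence
    sum_{j=0}^{m} C(m+1, j) * B_j = 0  for m >= 1."""
    B = [Fraction(1)]
    for m in range(1, n + 1):
        s = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-s / (m + 1))   # solve the recurrence for B_m
    return B

print([str(b) for b in bernoulli(6)])  # ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42']
```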

Edith Clarke (1883-1959) - At MIT, in June 1919, Clarke received the first Electrical Engineering
degree awarded to a woman. She developed and disseminated mathematical methods that simplified
calculations and reduced the time spent solving problems in the design and operation of electrical
power systems.

Grace Murray Hopper (1906-1992) - Hopper earned an MA in 1930 and a Ph.D. in 1934 in
Mathematics from Yale University. She retired from the Navy in 1967 with the rank of Rear Admiral.
Hopper created a compiler system that translated mathematical code into machine language. Under
her direction, later versions of the compiler became the forerunners of modern programming
languages. She pioneered the integration of English into programs with FLOW-MATIC. Hopper
received the Computer Sciences "Man of The Year Award" in 1969. She became the first woman
inducted as a Distinguished Fellow of the British Computer Society in 1973. The term "bug," an error
or defect in software that causes a program to malfunction, originated, according to computer folklore,
when Grace and her team found a dead moth that had been "zapped" by a relay and caused the
device to fail.
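
In spirit, a compiler like Hopper's turns an English-like statement into low-level instructions. A tiny illustrative translator (the statement syntax is FLOW-MATIC-flavored and the opcodes are invented):

```python
import re

def compile_statement(line):
    """Translate a statement like 'ADD A TO B GIVING C' into
    toy machine instructions. Syntax and opcodes are invented."""
    m = re.fullmatch(r"ADD (\w+) TO (\w+) GIVING (\w+)", line)
    if not m:
        raise SyntaxError(line)
    a, b, c = m.groups()
    return [f"LOAD {a}", f"ADD {b}", f"STORE {c}"]

for instruction in compile_statement("ADD PRICE TO TAX GIVING TOTAL"):
    print(instruction)   # LOAD PRICE / ADD TAX / STORE TOTAL
```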

Erna Hoover - invented a computerized switching system for telephone traffic. For this achievement,
she was awarded the first software patent ever issued (Patent #3,623,007, on Nov. 23, 1971). She was
the first female supervisor of a technical department (at Bell Labs).

Kay McNulty Mauchly Antonelli and Alice Burks - made calculations for tables of firing and bombing
trajectories, as part of the war effort. This work prompted the development, in 1946, of the ENIAC, the
world's first electronic digital computer.

Adele Goldstine - assisted in the creation of the ENIAC and wrote the manual for using it.

Joan Margaret Winters - scientific programmer in SLAC Computing Services at the Stanford Linear
Accelerator Center, among other achievements.

Alexandra Illmer Forsythe (1918-1980) - During the 1960s and 1970s, she co-authored a series of
textbooks on computer science, including the first computer science textbook.

Evelyn Boyd Granville - was one of the first African American women to earn a Ph.D. in Mathematics.
During her career, she developed computer programs that were used for trajectory analysis in the
Mercury Project (the first U.S. manned mission in space) and in the Apollo Project (which sent U.S.
astronauts to the moon).

See Past Notable Women of Computing for more details.


Resources:

Timelines:
Computer History Time Line
Timeline of Events in Computer History
Computer Museums/History Sites:
Generations Through the History of Computing - Take a tour of the companies and computers that
have led us to where we are today.
Triumph of the Nerds Online - companion to the PBS series
The Machine that Changed the World - companion Web site to the video series (the book that
accompanied the series: The Dream Machine: Exploring the Computer Age. Jon Palfreman and
Doron Swade. BBC Books, London, 1991)
The Ada Project
UVA Computer Museum
VA Tech History of Computers

List of links

Online Cyber Resources and Scholarship | Bibliography of Cyberbooks / Theory

updated Nov 2010

