
CTU-San Francisco Campus

CompTech 122 – Computer Systems


Topic 1: History of Computers (Creation of Computers)

HISTORY OF COMPUTERS – PAST TO PRESENT


(Creation of Computers)
PART I
Humans
The first computers were people! That is, electronic computers (and the earlier mechanical
computers) were given this name because they performed the work that had previously
been assigned to people. "Computer" was originally a job title: it was used to describe those
human beings (predominantly women) whose job it was to perform the repetitive
calculations required to compute such things as navigational tables, tide charts, and
planetary positions for astronomical almanacs.

Abacus
The abacus was an early aid for mathematical computations such as addition, subtraction, multiplication, and division. Its only value is that it aids the memory of the human performing the calculation. The oldest surviving abacus was used in 300 B.C. by the Babylonians. The Cranmer abacus is still in common use by individuals who are blind.

Logarithms
In 1617 a Scotsman named John Napier invented logarithms, a technique that allows multiplication to be performed via addition. Napier also devised an alternative to tables: a set of carved ivory rods, now called Napier's Bones, that reduced multiplication to a sequence of simpler steps.
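
The key property is that log(a x b) = log(a) + log(b): look up the logarithms of the two numbers, add them, and look the sum back up to recover the product. A minimal Python sketch of the idea (here the math module stands in for Napier's printed tables):

    import math

    a, b = 123.0, 456.0
    log_sum = math.log10(a) + math.log10(b)   # one addition replaces the multiplication
    product = 10 ** log_sum                   # "look up" the antilogarithm
    print(round(product))                     # 56088, the same as 123 * 456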

Slide Rule
Napier's invention led directly to the slide rule, first built in England in 1632 and still in use
in the 1960's by the NASA engineers of the Mercury, Gemini, and Apollo programs which
landed men on the moon.

Calculating Machine
Leonardo da Vinci (1452-1519) made drawings of gear-driven calculating machines but apparently never built any.

A Leonardo da Vinci drawing showing gears arranged for computing

Calculating Clock
The first gear-driven calculating machine to actually be built was probably the calculating
clock, so named by its inventor, the German professor Wilhelm Schickard in 1623.

Pascaline
In 1642 Blaise Pascal, at age 19, invented the Pascaline as an aid for his father, who was a tax collector. Pascal was a child prodigy. At the age of 12, he was discovered doing his version of Euclid's thirty-second proposition on the kitchen floor. Pascal went on to invent probability theory, the hydraulic press, and the syringe. Shown below is an 8-digit version of the Pascaline, and two views of a 6-digit version.
PART II
Stepped Reckoner
Just a few years after Pascal, the German Gottfried Wilhelm Leibniz (co-inventor with
Newton of calculus) managed to build a four-function (addition, subtraction, multiplication,
and division) calculator that he called the stepped reckoner. Leibniz was the first to
advocate use of the binary number system which is fundamental to the operation of
modern computers.

Punched Cards
In 1801 the Frenchman Joseph Marie Jacquard invented a power loom that could base its weave (and hence the design on the fabric) upon a pattern automatically read from punched wooden cards, held together in a long row by rope. Descendants of these punched cards have been in use ever since (remember the "hanging chad" from the Florida presidential ballots of the year 2000?).

Jacquard's Loom showing the threads and the punched cards

This tapestry was woven by a Jacquard loom


Jacquard's technology was a real boon to mill owners, but put many loom operators out of
work. Angry mobs smashed Jacquard looms and once attacked Jacquard himself. History is full
of examples of labor unrest following technological innovation yet most studies show that,
overall, technology has actually increased the number of jobs.
Difference Engine
By 1822 the English mathematician Charles Babbage was proposing a steam-driven
calculating machine the size of a room, which he called the Difference Engine. This
machine would be able to compute tables of numbers, such as logarithm tables. He
obtained government funding for this project due to the importance of numeric
tables in ocean navigation. By promoting their commercial and military navies, the
British government had managed to become the earth's greatest empire. But
construction of Babbage's Difference Engine proved exceedingly difficult and the
project soon became the most expensive government-funded project up to that
point in English history. Ten years later the device was still nowhere near complete,
acrimony abounded between all involved, and funding dried up. The device was
never finished.
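
The "difference" in the name refers to the method of finite differences, which lets a machine tabulate polynomial values using nothing but repeated addition. A small Python sketch of the idea, here producing a table of squares (the choice of f(n) = n x n is just an illustration):

    # Tabulate f(n) = n*n with additions only: for a quadratic,
    # the second difference is a constant (here, 2).
    value, first_diff, second_diff = 0, 1, 2   # f(0), f(1)-f(0), constant
    for n in range(10):
        print(n, value)                        # prints 0 0, 1 1, 2 4, 3 9, ...
        value += first_diff                    # next table entry
        first_diff += second_diff              # update the running difference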

Analytic Engine
Babbage was not deterred, and by then was on to his next brainstorm, which he
called the Analytic Engine. This device, large as a house and powered by 6 steam
engines, would be more general purpose in nature because it would be
programmable, thanks to the punched card technology of Jacquard. But it was
Babbage who made an important intellectual leap regarding the punched cards.

Furthermore, Babbage realized that punched paper could be employed as a storage mechanism, holding computed numbers for future
reference. Because of the connection to the Jacquard loom, Babbage called the two main parts of his Analytic Engine the "Store" and
the "Mill", as both terms are used in the weaving industry. The Store was where numbers were held and the Mill was where they were
"woven" into new results. In a modern computer these same parts are called the memory unit and the central processing unit (CPU).

Babbage befriended Ada Byron, the daughter of the famous poet Lord Byron (Ada would later become the Countess of Lovelace by marriage). Though she was only 19, she was fascinated by Babbage's ideas, and through letters and meetings with Babbage she learned enough about the design of the Analytic Engine to begin fashioning programs for the still-unbuilt machine. While Babbage refused to publish his knowledge for another 30 years, Ada wrote a series of "Notes" wherein she detailed sequences of instructions she had prepared for the Analytic Engine. The Analytic Engine remained unbuilt (the British government refused to get involved with this one), but Ada earned her spot in history as the first computer programmer. Ada invented the subroutine and was the first to recognize the importance of looping. Babbage himself went on to invent the modern postal system, cowcatchers on trains, and the ophthalmoscope, which is still used today to examine the eye.
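
In present-day terms, a subroutine is a named block of instructions written once and reused wherever it is needed, and a loop repeats a set of instructions instead of writing them out again. A tiny modern Python illustration of both ideas (Ada's own notes were written for the Analytic Engine, not in any modern language):

    def factorial(n):                  # a subroutine: defined once, called as needed
        result = 1
        for i in range(2, n + 1):      # a loop: repeat the multiply step
            result *= i
        return result

    for n in range(1, 6):
        print(n, factorial(n))         # 1 1, 2 2, 3 6, 4 24, 5 120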

Hollerith Desk (Herman Hollerith)


The next breakthrough occurred in America. The U.S. Constitution states that a census should
be taken of all U.S. citizens every 10 years in order to determine the representation of the states
in Congress. While the very first census of 1790 had only required 9 months, by 1880 the U.S.
population had grown so much that the count for the 1880 census took 7.5 years. Automation
was clearly needed for the next census. The census bureau offered a prize for an inventor to help with the 1890 census, and this prize was won by Herman Hollerith, who proposed and then successfully adapted Jacquard's punched cards for the purpose of computation.

Hollerith's invention, known as the Hollerith desk, consisted of a card reader which sensed the
holes in the cards, a gear driven mechanism which could count (using Pascal's mechanism which
we still see in car odometers), and a large wall of dial indicators (a car speedometer is a dial
indicator) to display the results of the count.

An operator working at a Hollerith desk


The patterns on Jacquard's cards were determined when a tapestry was designed and then were
not changed. Today, we would call this a read-only form of information storage. Hollerith had
the insight to convert punched cards to what is today called a read/write technology. While
riding a train, he observed that the conductor didn't merely punch each ticket, but rather
punched a particular pattern of holes whose positions indicated the approximate height,
weight, eye color, etc. of the ticket owner.

Hollerith's technique was successful and the 1890 census was completed in only 3 years at a
savings of 5 million dollars. Interesting aside: the reason that a person who removes
inappropriate content from a book or movie is called a censor, as is a person who conducts a
census, is that in Roman society the public official called the "censor" had both of these jobs.

Hollerith built a company, the Tabulating Machine Company which, after a few buyouts,
eventually became International Business Machines, known today as IBM. IBM grew rapidly and
punched cards became ubiquitous. Your gas bill would arrive each month with a punch card you
had to return with your payment. This punch card recorded the particulars of your account:
your name, address, gas usage, etc.

PART III
Harvard Mark I
IBM continued to develop mechanical calculators for sale to businesses to help with financial accounting and inventory accounting.
One characteristic of both financial accounting and inventory accounting is that although you need to subtract, you don't need
negative numbers and you really don't have to multiply since multiplication can be accomplished via repeated addition.
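
The multiplication point is easy to see: 4 x 7 is just 7 added to itself 4 times, so an adding machine can multiply if you are willing to wait. A one-function Python sketch:

    def multiply(a, b):
        """Multiply two non-negative integers using only addition."""
        total = 0
        for _ in range(a):     # add b to the running total, a times
            total += b
        return total

    print(multiply(4, 7))      # 28, the same as 4 * 7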

World War II
But the U.S. military desired a mechanical calculator more optimized for scientific computation. By World War II the U.S. had
battleships that could lob shells weighing as much as a small car over distances up to 25 miles. Physicists could write the equations
that described how atmospheric drag, wind, gravity, muzzle velocity, etc. would determine the trajectory of the shell. But solving such
equations was extremely laborious. This was the work performed by the human computers. Their results would be published in ballistic
"firing tables" published in gunnery manuals. During World War II the U.S. military scoured the country looking for (generally female)
math majors to hire for the job of computing these tables.

But not enough humans could be found to keep up with the need for new tables. Sometimes artillery pieces had to be delivered to
the battlefield without the necessary firing tables and this meant they were close to useless because they couldn't be aimed properly.
Faced with this situation, the U.S. military was willing to invest in even harebrained schemes to automate this type of computation.

One early success was the Harvard Mark I computer which was built as a partnership between Harvard and IBM in 1944. This was the
first programmable digital computer made in the U.S. But it was not a purely electronic computer. Instead the Mark I was constructed
out of switches, relays, rotating shafts, and clutches. The machine weighed 5 tons, incorporated 500 miles of wire, was 8 feet tall and
51 feet long, and had a 50 ft rotating shaft running its length, turned by a 5 horsepower electric motor. The Mark I ran non-stop for
15 years, sounding like a roomful of ladies knitting. To appreciate the scale of this machine note the four typewriters in the foreground
of the following photo.

The Harvard Mark I: an electro-mechanical computer


Here's a close-up of one of the Mark I's four paper tape readers. A paper tape was an improvement over a box of punched cards, as anyone who has ever dropped -- and thus shuffled -- his "stack" knows.

One of the four paper tape readers on the Harvard Mark I (you can observe the punched paper roll emerging from the bottom)

Computer Bug
One of the primary programmers for the Mark I was a woman, Grace Hopper.
Hopper found the first computer "bug": a dead moth that had gotten into the
Mark I and whose wings were blocking the reading of the holes in the paper
tape. The word "bug" had been used to describe a defect since at least 1889
but Hopper is credited with coining the word "debugging" to describe the
work to eliminate program faults.

The first computer bug [photo © 2002 IEEE]

Grace Hopper
In 1953 Grace Hopper invented the first high-level language, "Flow-matic". This language eventually became COBOL, which was the
language most affected by the infamous Y2K problem. A high-level language is designed to be more understandable by humans than
is the binary language understood by the computing machinery. A high-level language is worthless without a program -- known as a
compiler -- to translate it into the binary language of the computer and hence Grace Hopper also constructed the world's first compiler.
Grace remained active as a Rear Admiral in the Navy Reserves until she was 79 (another record).
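
To make the division of labor concrete, here is a toy Python sketch of what a compiler does: it turns one human-readable assignment into a list of low-level instructions. The three-instruction machine and its opcode names are invented purely for illustration; a real compiler emits actual binary machine code.

    def compile_assignment(target, left, op, right):
        """Translate 'target = left <op> right' into toy machine instructions."""
        return [
            ("LOAD", left),       # fetch the first operand
            (op.upper(), right),  # apply the operation (e.g., MUL)
            ("STORE", target),    # write the result back to memory
        ]

    for instruction in compile_assignment("circumference", "pi", "mul", "diameter"):
        print(instruction)        # ('LOAD', 'pi'), ('MUL', 'diameter'), ('STORE', 'circumference')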

The Mark I operated on numbers that were 23 digits wide. It could add or subtract two of these numbers in three-tenths of a second,
multiply them in four seconds, and divide them in ten seconds. Forty-five years later computers could perform an addition in a billionth
of a second! Even though the Mark I had three quarters of a million components, it could only store 72 numbers! Today, home
computers can store 30 million numbers in RAM and another 10 billion numbers on their hard disk. Today, a number can be pulled
from RAM after a delay of only a few billionths of a second, and from a hard disk after a delay of only a few thousandths of a second.

Howard Aiken
On a humorous note, the principal designer of the Mark I, Howard Aiken of Harvard,
estimated in 1947 that six electronic digital computers would be sufficient to satisfy
the computing needs of the entire United States. IBM had commissioned this study to
determine whether it should bother developing this new invention into one of its
standard products (up until then computers were one-of-a-kind items built by special
arrangement). Aiken's prediction wasn't actually so bad as there were very few
institutions (principally, the government and military) that could afford the cost of
what was called a computer in 1947. He just didn't foresee the micro-electronics revolution that would eventually shrink something like the room-sized IBM Stretch computer of 1959 onto a single sliver of silicon.

Apple Computers
The original Apple Computer, also known retroactively as the Apple I or Apple-1, is a personal computer released by the Apple Computer Company (now Apple Inc.) in 1976. It was designed and hand-built by Steve Wozniak. Wozniak's friend Steve Jobs had the idea of selling the computer. The Apple I was Apple's first product; to finance its creation, Jobs sold his only means of transportation, a VW van, and Wozniak sold his HP-65 calculator for $500. It was demonstrated in July 1976 at the Homebrew Computer Club in Palo Alto, California. Home computers of 1976 such as this Apple I sold for only $600.

Integrated Circuit
The microelectronics revolution is what allowed the amount of hand-crafted wiring
seen in the prior photo to be mass-produced as an integrated circuit, which is a small sliver of silicon the size of your thumbnail.
The primary advantage of an integrated circuit is not that the transistors (switches) are minuscule (that's the secondary advantage), but rather that millions of transistors
can be created and interconnected in a mass-production process. All the elements on
the integrated circuit are fabricated simultaneously via a small number (maybe 12) of
optical masks that define the geometry of each layer. This speeds up the process of
fabricating the computer -- and hence reduces its cost -- just as Gutenberg's printing
press sped up the fabrication of books and thereby made them affordable to all.

The IBM Stretch computer of 1959 needed its 33 foot length to hold the 150,000
transistors it contained. These transistors were tremendously smaller than the
vacuum tubes they replaced, but they were still individual elements requiring
individual assembly. By the early 1980s this many transistors could be simultaneously
fabricated on an integrated circuit. Today's Pentium 4 microprocessor contains
42,000,000 transistors in this same thumbnail sized piece of silicon.

The DEC PDP-12


It's humorous to remember that in between the Stretch machine (which would be
called a mainframe today) and the Apple I (a desktop computer) there was an entire
industry segment referred to as mini-computers such as the following PDP-12
computer of 1969:

Atanasoff-Berry Computer
The Atanasoff-Berry Computer One of the earliest attempts to build an all-electronic
(that is, no gears, cams, belts, shafts, etc.) digital computer occurred in 1937 by J. V.
Atanasoff, a professor of physics and mathematics at Iowa State University. By 1941
he and his graduate student, Clifford Berry, had succeeded in building a machine that
could solve 29 simultaneous equations with 29 unknowns. This machine was the first
to store data as a charge on a capacitor, which is how today's computers store
information in their main memory (DRAM or dynamic RAM). As far as its inventors
were aware, it was also the first to employ binary arithmetic. However, the machine
was not programmable, it lacked a conditional branch, its design was appropriate for
only one type of mathematical problem, and it was not further pursued after World
War II.
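
Solving simultaneous equations is exactly the kind of job the Atanasoff-Berry machine mechanized (it worked through the equations by elimination). A hand-sized Python sketch of the same idea, with 2 equations in 2 unknowns rather than the machine's 29:

    # Solve  2x + 3y = 8
    #        1x + 2y = 5   by Gaussian elimination.
    a = [[2.0, 3.0, 8.0],
         [1.0, 2.0, 5.0]]              # each row: coefficients and right-hand side

    factor = a[1][0] / a[0][0]         # eliminate x from the second equation
    a[1] = [a[1][j] - factor * a[0][j] for j in range(3)]

    y = a[1][2] / a[1][1]              # back-substitute
    x = (a[0][2] - a[0][1] * y) / a[0][0]
    print(x, y)                        # 1.0 2.0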

Colossus
Another candidate for granddaddy of the modern computer was Colossus, built
during World War II by Britain for the purpose of breaking the cryptographic codes
used by Germany. Britain led the world in designing and building electronic machines
dedicated to code breaking, and was routinely able to read coded German radio transmissions. But Colossus was definitely not a general purpose, reprogrammable
machine.

The Harvard Mark I, the Atanasoff-Berry computer, and the British Colossus all made important contributions. American and British
computer pioneers were still arguing over who was first to do what, when in 1965 the work of the German Konrad Zuse was published
for the first time in English. Zuse had built a sequence of general purpose computers in Nazi Germany. The first, the Z1, was built
between 1936 and 1938 in the parlor of his parents' home.

Zuse’s Computers
Zuse's third machine, the Z3, built in 1941, was probably the first operational,
general-purpose, programmable (that is, software controlled) digital computer.
Without knowledge of any calculating machine inventors since Leibniz (who lived
in the 1600's), Zuse reinvented Babbage's concept of programming and decided on
his own to employ binary representation for numbers (Babbage had advocated
decimal). The Z3 was destroyed by an Allied bombing raid. The Z1 and Z2 met the
same fate and the Z4 survived only because Zuse hauled it in a wagon up into the
mountains. Zuse's accomplishments are all the more incredible given the context
of the material and manpower shortages in Germany during World War II. Zuse
couldn't even obtain paper tape so he had to make his own by punching holes in
discarded movie film. Because these machines were unknown outside Germany,
they did not influence the path of computing in America. But their architecture is
identical to that still in use today: an arithmetic unit to do the calculations, a
memory for storing numbers, a control system to supervise operations, and input
and output devices to connect to the external world. Zuse also invented what might
be the first high-level computer language, "Plankalkul", though it too was unknown outside Germany.

The Zuse Z1 in its residential setting

PART IV
ENIAC (Electronic Numerical Integrator and Calculator)
The title of forefather of today's all-electronic digital computers is usually awarded
to ENIAC, which stood for Electronic Numerical Integrator and Calculator. ENIAC was
built at the University of Pennsylvania between 1943 and 1945 by two professors,
John Mauchly and the 24 year old J. Presper Eckert, who got funding from the war
department after promising they could build a machine that would replace all the
"computers", meaning the women who were employed calculating the firing tables
for the army's artillery guns.

ENIAC filled a 20 by 40 foot room, weighed 30 tons, and used more than 18,000
vacuum tubes. Like the Mark I, ENIAC employed paper card readers obtained from
IBM (these were a regular product for IBM, as they were a long established part of
business accounting machines, IBM's forte).

When operating, the ENIAC was silent but you knew it was on as the 18,000 vacuum tubes each generated waste heat like a light bulb
and all this heat (174,000 watts of heat) meant that the computer could only be operated in a specially designed room with its own
heavy-duty air conditioning system. Only the left half of ENIAC is visible in the first picture; the right half was basically a mirror image of what's visible.

ENIAC (note that it wasn't even given the name "computer", since "computers" were people) [U.S. Army photo]

To reprogram the ENIAC you had to rearrange the patch cords that you can
observe on the left in the prior photo, and the settings of 3000 switches that you
can observe on the right. To program a modern computer, you type out a program
with statements like Circumference = 3.14 * diameter. To perform this
computation on ENIAC you had to rearrange a large number of patch cords and
then locate three particular knobs on that vast wall of knobs and set them to 3, 1,
and 4.
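
For comparison, that entire ENIAC setup collapses to a couple of lines in a modern language. A minimal Python sketch (the diameter value here is made up):

    diameter = 10.0
    circumference = 3.14 * diameter    # the statement from the text
    print(circumference)               # 31.4 (up to floating-point rounding)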

Reprogramming ENIAC involved a hike [U.S. Army photo]

Once the army agreed to fund ENIAC, Mauchly and Eckert worked around the
clock, seven days a week, hoping to complete the machine in time to
contribute to the war. Their war-time effort was so intense that most days
they ate all 3 meals in the company of the army Captain who was their liaison
with their military sponsors. They were allowed a small staff but soon
observed that they could hire only the most junior members of the University
of Pennsylvania staff because the more experienced faculty members knew
that their proposed machine would never work.

One of the most obvious problems was that the design would require all 18,000 vacuum tubes to work simultaneously. Even with 18,000 vacuum tubes, ENIAC could
only hold 20 numbers at a time. However, thanks to the elimination of moving
parts it ran much faster than the Mark I: a multiplication that required 6 seconds
on the Mark I could be performed on ENIAC in 2.8 thousandths of a second.
ENIAC's basic clock speed was 100,000 cycles per second. Today's home
computers employ clock speeds of 1,000,000,000 cycles per second. Built with
$500,000 from the U.S. Army, ENIAC's first task was to compute whether or not
it was possible to build a hydrogen bomb (the atomic bomb was completed
during the war and hence is older than ENIAC). The very first problem run on
ENIAC required only 20 seconds and was checked against an answer obtained
after forty hours of work with a mechanical calculator. After chewing on half a
million punch cards for six weeks, ENIAC did humanity no favor when it declared
the hydrogen bomb feasible. This first ENIAC program remains classified even
today.

Once ENIAC was finished and proved worthy of the cost of its development, its
designers set about to eliminate the obnoxious fact that reprogramming the
computer required a physical modification of all the patch cords and switches. It
took days to change ENIAC's program. Eckert and Mauchly next teamed up with the mathematician John von Neumann to design EDVAC, which pioneered the stored-program concept.

After ENIAC and EDVAC came other computers with humorous names such as
ILLIAC, JOHNNIAC, and, of course, MANIAC. ILLIAC was built at the University of Illinois at Urbana-Champaign.

JOHNNIAC was a reference to John von Neumann, who was unquestionably a genius. At age 6 he could tell jokes in classical Greek. By
8 he was doing calculus. He could recite books he had read years earlier word for word. He could read a page of the phone directory
and then recite it backwards. On one occasion it took von Neumann only 6 minutes to solve a problem in his head that another
professor had spent hours on using a mechanical calculator. Von Neumann is perhaps most famous as the man who worked out the
complicated method needed to detonate an atomic bomb.

Today, one of the most notable characteristics of a computer is the fact that its
ability to be reprogrammed allows it to contribute to a wide variety of endeavors,
such as the following completely unrelated fields:
• the creation of special effects for movies,
• the compression of music to allow more minutes of music to fit within the
limited memory of an MP3 player,
• the observation of car tire rotation to detect and prevent skids in an anti-lock braking system (ABS),
• the analysis of the writing style in Shakespeare's work with the goal of
proving whether a single individual really was responsible for all these
pieces.

By the end of the 1950's computers were no longer one-of-a-kind hand built devices
owned only by universities and government research labs. Eckert and Mauchly left
the University of Pennsylvania over a dispute about who owned the patents for their
invention. They decided to set up their own company. Their first product was the
famous UNIVAC computer, the first commercial (that is, mass produced) computer.

ENIAC was unquestionably the origin of the U.S. commercial computer industry, but
its inventors, Mauchly and Eckert, never achieved fortune from their work and their
company fell into financial problems and was sold at a loss. By 1955 IBM was selling
more computers than UNIVAC and by the 1960's the group of eight companies selling
computers was known as "IBM and the seven dwarfs". IBM grew so dominant that
the federal government pursued anti-trust proceedings against them from 1969 to
1982 (notice the pace of our country's legal system).

In IBM's case, the pivotal decision was their own: hiring an unknown but aggressive firm called Microsoft to provide the software for their
personal computer (PC). This lucrative contract allowed Microsoft to grow so dominant that by the year 2000 their market
capitalization (the total value of their stock) was twice that of IBM and they were convicted in Federal Court of running an illegal
monopoly.

Teletype
The Teletype was the standard mechanism used to interact with a time-sharing computer. A teletype was a motorized typewriter that could transmit your keystrokes to the mainframe and then print the
computer's response on its roll of paper. You typed a single line of text, hit the carriage return button,
and waited for the teletype to begin noisily printing the computer's response (at a whopping 10
characters per second). On the left-hand side of the teletype in the prior picture you can observe a paper
tape reader and writer (i.e., puncher). Here's a close-up of paper tape:

Binary numbers
After observing the holes in paper tape it is perhaps obvious why all computers use binary numbers to represent data: a binary bit
(that is, one digit of a binary number) can only have the value of 0 or 1 (just as a decimal digit can only have a value of 0 through 9).
Something which can only take two states is very easy to manufacture, control, and sense. In the case of paper tape, the hole has
either been punched or it has not. Electro-mechanical computers such as the Mark I used relays to represent data because a relay
(which is just a motor driven switch) can only be open or closed.
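
To see the correspondence, here is a short Python sketch that writes a number out the way a punched tape or a row of relays would hold it, as a string of 0s and 1s:

    def to_binary(n):
        """Return the binary digits of a non-negative integer as a string."""
        if n == 0:
            return "0"
        bits = ""
        while n > 0:
            bits = str(n % 2) + bits   # each remainder is one bit: hole punched or not
            n //= 2
        return bits

    print(to_binary(13))   # 1101  (8 + 4 + 0 + 1)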

Paper tape has a long history as well. It was first used as an information storage medium
by Sir Charles Wheatstone, who used it to store Morse code that was arriving via the newly
invented telegraph (incidentally, Wheatstone was also the inventor of the accordion).

Punch Machine
The alternative to time-sharing was batch mode processing, where the computer gives its full attention to your program. In exchange for getting the computer's full attention at run time, you had to agree to prepare your program off-line on a key punch machine which
generated punch cards. An IBM Key Punch machine which operates like a typewriter
except it produces punched cards rather than a printed sheet of paper.

The original IBM Personal Computer (PC)


The transformation from room-sized machines to the personal computer was a result of the invention of the microprocessor. A microprocessor
(uP) is a computer that is fabricated on an integrated circuit (IC). Computers had been
around for 20 years before the first microprocessor was developed at Intel in 1971. The
micro in the name microprocessor refers to the physical size. Intel didn't invent the
electronic computer. But they were the first to succeed in cramming an entire computer
on a single chip (IC). Intel was started in 1968 and initially produced only semiconductor
memory (Intel invented both the DRAM and the EPROM, two memory technologies that
are still going strong today). In 1969 they were approached by Busicom, a Japanese
manufacturer of high performance calculators (these were typewriter-sized units; the first shirt-pocket-sized scientific calculator was the Hewlett-Packard HP35, introduced in 1972).

Busicom
Busicom wanted Intel to produce 12 custom calculator chips: one chip dedicated to the
keyboard, another chip dedicated to the display, another for the printer, etc. But
integrated circuits were (and are) expensive to design and this approach would have
required Busicom to bear the full expense of developing 12 new chips since these 12 chips would only be of use to them.

A typical Busicom desk calculator

But a new Intel employee (Ted Hoff) convinced Busicom to instead accept a general purpose computer chip which,
like all computers, could be reprogrammed for many different tasks (like controlling a
keyboard, a display, a printer, etc.).
PART V
First Microprocessor
Thus was born the Intel 4004, the first microprocessor (uP). The 4004 consisted of 2300 transistors and was clocked at 108 kHz (i.e., 108,000 times per second). Compare this to the 42 million transistors and the 2 GHz clock rate (i.e., 2,000,000,000 times per second) used in a Pentium 4. One of Intel's 4004 chips still functions aboard the Pioneer 10 spacecraft, which is now the man-made object farthest from the earth. Curiously, Busicom went bankrupt and never ended up using the ground-breaking microprocessor.

MITS Altair 8800, the first PC
Intel followed the 4004 with the 8008 and 8080. Intel priced the 8080 microprocessor at $360, as an insult to IBM's famous 360 mainframe which cost millions of dollars. The 8080 was employed in the MITS Altair computer, which was the world's first personal computer (PC). It was personal all right: you had to build it yourself from a kit of parts that arrived in the mail. This kit didn't even include an enclosure, and that is the reason the unit shown below doesn't match the picture on the magazine cover.

Steven Paul "Steve" Jobs (Apple/MAC)


Steven Paul "Steve" Jobs (/ˈdʒɒbz/; February 24, 1955 – October 5, 2011) was an
American entrepreneur. He is best known as the co-founder, chairman, and CEO of Apple
Inc. Through Apple, he was widely recognized as a charismatic pioneer of the personal
computer revolution and for his influential career in the computer and consumer
electronics fields. Jobs also co-founded and served as chief executive of Pixar Animation
Studios; he became a member of the board of directors of The Walt Disney Company in
2006, when Disney acquired Pixar.

In the late 1970s, Apple co-founder Steve Wozniak engineered one of the first
commercially successful lines of personal computers, the Apple II series. Jobs was among
the first to see the commercial potential of Xerox PARC's mouse-driven graphical user
interface, which led to the creation of the Apple Lisa and, one year later, the Macintosh.
He also played a role in introducing the LaserWriter, one of the first widely available laser
printers to the market.

William Henry "Bill" Gates III (Microsoft Windows)


A Harvard freshman by the name of Bill Gates decided to drop out of college so he could concentrate all his time writing programs for this computer (the MITS Altair). This early experience put Bill Gates in the right place at the right time once IBM decided to standardize on the Intel microprocessors for their line of PCs in 1981. The Intel Pentium 4 used in today's PCs is
still compatible with the Intel 8088 used in IBM's first PC.

William Henry "Bill" Gates III (born October 28, 1955) is an American business magnate and philanthropist. Gates is the former chief executive and current chairman of Microsoft, the world's largest personal-computer software company, which he co-founded with Paul Allen. He is consistently
ranked among the world's wealthiest people and was the wealthiest overall from 1995 to
2009, excluding 2008, when he was ranked third; in 2011 he was the wealthiest American
and the second wealthiest person.

Linus Benedict Torvalds


Linus Benedict Torvalds (Swedish: [ˈliːn.ɵs ˈtuːr.valds]; born December 28, 1969) is a Finnish-American software engineer and hacker who was the principal force behind the development of the Linux kernel. He later became the chief architect of the Linux kernel, and now acts as the project's coordinator. He also created the revision control system Git. He was honored, along with Shinya Yamanaka, with the 2012 Millennium Technology Prize by the Technology Academy Finland "in recognition of his creation of a new open-source operating system for computers leading to the widely used Linux kernel".

Present Modern Computers


A computer is a general-purpose device that can be programmed to carry out a finite set of arithmetic or logical operations. Since a sequence of operations can be readily changed, the computer can solve more than one kind of problem. Conventionally, a computer consists of at least one processing element, typically a central processing unit (CPU), and some form of memory. The processing element carries out arithmetic and logic operations, and a sequencing and control unit can change the order of operations based on stored information. Modern computers based on integrated circuits are millions to billions of times more capable than the early machines, and occupy a fraction of the space. Simple computers are small enough to fit into mobile devices, and mobile computers can be powered by small batteries. Personal computers in their various forms are icons of the Information Age and are what most people think of as "computers". However, the embedded computers found in many devices, from MP3 players to fighter aircraft and from toys to industrial robots, are the most numerous.

Reference websites:
http://www.computersciencelab.com/ComputerHistory/History.htm
http://www.computersciencelab.com/ComputerHistory/HistoryPt2.htm
http://www.computersciencelab.com/ComputerHistory/HistoryPt3.htm
http://www.computersciencelab.com/ComputerHistory/HistoryPt4.htm
