When Was The First Computer Invented
The Turing machine was first proposed by Alan Turing in 1936 and became the foundation for
theories about computing and computers. The machine was a device that printed symbols on
paper tape in a manner that emulated a person following a series of logical instructions. Without
these fundamentals, we wouldn't have the computers we use today.
The Colossus was the first electronic programmable computer. Developed by Tommy Flowers and
first demonstrated in December 1943, the Colossus was created to help British code breakers
read encrypted German messages.
The ABC, short for Atanasoff-Berry Computer, began development in 1937 under Professor John
Vincent Atanasoff and graduate student Cliff Berry. Its development continued until 1942 at
Iowa State College (now Iowa State University).
The ABC was an electronic computer that used more than 300 vacuum tubes for digital
computation, including binary math and Boolean logic. It had no CPU and was not
programmable. On October 19, 1973, US federal judge Earl R. Larson signed his decision
that the ENIAC patent held by J. Presper Eckert and John Mauchly was invalid, naming Atanasoff
the inventor of the electronic digital computer.
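The binary math the ABC performed can be built entirely out of Boolean logic. As a rough sketch of the principle (this is an illustration of binary addition via logic gates, not a model of the ABC's actual vacuum-tube circuitry):

```python
# A one-bit full adder built from Boolean operations -- illustrative only,
# not a reconstruction of the ABC's hardware.
def full_adder(a, b, carry_in):
    """Add two bits plus a carry, returning (sum_bit, carry_out)."""
    sum_bit = a ^ b ^ carry_in                   # XOR produces the sum bit
    carry_out = (a & b) | (carry_in & (a ^ b))   # AND/OR propagate the carry
    return sum_bit, carry_out

def add_binary(x, y, width=8):
    """Add two integers by chaining full adders, one per bit position."""
    carry = 0
    result = 0
    for i in range(width):
        bit, carry = full_adder((x >> i) & 1, (y >> i) & 1, carry)
        result |= bit << i
    return result
```

For example, `add_binary(23, 42)` chains eight one-bit adders to produce 65, using nothing but AND, OR, and XOR, which is the sense in which binary arithmetic reduces to Boolean logic.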
The ENIAC was invented by J. Presper Eckert and John Mauchly at the University of
Pennsylvania; construction began in 1943 and was not completed until 1946. The machine
occupied about 1,800 square feet, used about 18,000 vacuum tubes, and weighed almost 50 tons.
Although the judge ruled that the ABC was the first digital computer, many still consider the
ENIAC to be the first digital computer because it was fully functional.
The Small-Scale Experimental Machine (SSEM), nicknamed "Baby," ran the world's first stored
program in June 1948 and is also said to have run the first graphical computer game.
Around the same time, the Manchester Mark 1 was another computer that could run stored
programs. Built at the Victoria University of Manchester, the first version of the Mark 1
became operational in April 1949. On June 16 and 17 of that year, the Mark 1 ran a program
searching for Mersenne primes for nine hours without error.
The first computer company, the Electronic Controls Company, was founded in 1949 by
J. Presper Eckert and John Mauchly, the same individuals who helped create the ENIAC.
The company was later renamed the Eckert-Mauchly Computer Corporation (EMCC) and
released a series of mainframe computers under the UNIVAC name.
First delivered to the United States government in 1950, the ERA 1101 (later sold as the
UNIVAC 1101) is considered to be the first computer capable of storing and running a program
from memory.
In 1942, Konrad Zuse began working on the Z4, which later became the first commercial
computer when it was sold to Eduard Stiefel, a mathematician at the Swiss Federal Institute of
Technology Zurich, on July 12, 1950.
On April 7, 1953, IBM publicly introduced the 701, its first commercial scientific computer.
On March 8, 1955, MIT introduced the Whirlwind machine, a revolutionary computer that was
the first digital computer with magnetic-core RAM and real-time graphics.
In 1960, Digital Equipment Corporation released its first of many PDP computers, the PDP-1.
In 1964, the first desktop computer, the Programma 101, was unveiled to the public at the New
York World's Fair. It was invented by Pier Giorgio Perotto and manufactured by Olivetti. About
44,000 Programma 101 computers were sold, each with a price tag of $3,200.
In 1968, Hewlett-Packard began marketing the HP 9100A, considered to be the first
mass-marketed desktop computer.
Although it was never sold commercially, the first workstation is considered to be the Xerox
Alto, introduced in 1973. The Alto was revolutionary for its time, combining a fully functional
computer, display, and mouse, and it operated much like many computers today, using windows,
menus, and icons as an interface to its operating system. Many of the computer's capabilities
were first demonstrated in The Mother of All Demos by Douglas Engelbart on December 9, 1968.
Intel introduced the first microprocessor, the Intel 4004, on November 15, 1971.
The Vietnamese-French engineer André Truong Trong Thi, along with François Gernelle,
developed the Micral computer in 1973. Considered the first microcomputer, it used the
Intel 8008 processor and was the first commercial computer sold fully assembled rather than
as a kit. It originally sold for $1,750.
The first personal computer, however, is considered by many to be the KENBAK-1, which was
first introduced for $750 in 1971. The computer relied on a series of switches for inputting
data and turned a series of lights on and off for output.
The IBM 5100, the first portable computer, was released in September 1975. The
computer weighed 55 pounds and had a five-inch CRT display, a tape drive, a 1.9 MHz PALM
processor, and 64 KB of RAM. Pictured is an ad for the IBM 5100 from a November 1975
issue of Scientific American.
The first truly portable computer, or laptop, is considered to be the Osborne I, which was
released in April 1981 and developed by Adam Osborne. The Osborne I weighed 24.5 pounds, had
a 5-inch display, 64 KB of memory, and two 5 1/4" floppy drives, ran the CP/M 2.2 operating
system, included a modem, and cost US$1,795.
The IBM PC Division (PCD) later released the IBM Portable in 1984, its first portable computer,
which weighed in at 30 pounds. In 1986, IBM PCD announced its first laptop computer,
the PC Convertible, weighing 12 pounds. Finally, in 1994, IBM introduced the ThinkPad
775CD, the first notebook with an integrated CD-ROM drive.
The Apple I (Apple 1) was the first Apple computer and originally sold for $666.66. The
computer kit was developed by Steve Wozniak in 1976 and contained a 6502 8-bit processor and
4 KB of memory, expandable to 8 KB or 48 KB using expansion cards. Although the Apple I
had a fully assembled circuit board, the kit still required a power supply, display, keyboard,
and case to be operational. Below is a picture of an Apple I from an advertisement by Apple.
IBM introduced its first personal computer, the IBM PC, in 1981. The computer was
code-named, and is still sometimes referred to as, the Acorn. It had an 8088 processor and
16 KB of memory, expandable to 256 KB, and used MS-DOS.
In 1992, Tandy Radio Shack became one of the first companies to release a computer based on
the MPC standard with its introduction of the M2500 XL/2 and M4020 SX computers.
In 1947, Bell Labs developed the transistor. Transistors were capable of
performing many of the same tasks as vacuum tubes but were only a fraction of the size.
The first transistor-based computer was produced in 1959. Transistors were not only
smaller, enabling computer size to be reduced, but they were also faster, more reliable, and
consumed less electricity.
The other main improvement of this period was the development of computer
languages. Assembler (or symbolic) languages allowed programmers to
specify instructions in words (albeit very cryptic ones), which were then translated into a
form that the machines could understand (typically a series of 0s and 1s: binary
code). Higher-level languages also came into being during this period. Whereas
assembler languages had a one-to-one correspondence between their symbols and actual
machine functions, higher-level language commands often represent complex sequences
of machine codes. Two higher-level languages developed during this period, FORTRAN and
COBOL, are still in use today, though in a much more developed form.
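The one-to-one versus one-to-many distinction above can be sketched with a toy machine. In this hypothetical example (the mnemonics, opcodes, and instruction format are all invented for illustration), each assembler mnemonic translates into exactly one machine word, while a single higher-level statement expands into several:

```python
# A toy instruction set -- mnemonics and opcodes invented for illustration.
OPCODES = {"LOAD": 0x01, "ADD": 0x02, "STORE": 0x03}

def assemble(program):
    """Assembler: translate mnemonics one-for-one into machine words.
    Each (mnemonic, operand) pair becomes exactly one 16-bit word."""
    return [(OPCODES[op] << 8) | operand for op, operand in program]

def compile_add(dest, src1, src2):
    """A 'higher-level' statement like dest = src1 + src2 expands into a
    sequence of several machine instructions, not just one."""
    return assemble([("LOAD", src1), ("ADD", src2), ("STORE", dest)])
```

Here `assemble` mirrors a symbolic assembler (one word out per mnemonic in), while `compile_add(2, 0, 1)` shows the higher-level case: one statement yields three machine words.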
In 1958 the first integrated circuit (IC) was developed, and by the mid-1960s complete
circuits of hundreds of components could be placed on a single silicon chip 2 or 3 mm square.
Computers using these ICs soon replaced transistor-based machines. Again, one of the
major advantages was size: computers became more powerful and at the same
time much smaller and cheaper, and thus accessible to a much larger
audience. An added advantage of smaller size is that electrical signals have much shorter
distances to travel, so the speed of computers increased.
Another feature of this period is that computer software became much more powerful and
flexible, and for the first time more than one program could share the computer's
resources at the same time (multitasking). The majority of programming languages used
today are often referred to as 3GLs (third-generation languages), even though some of
them originated during the second generation.
The boundary between the third and fourth generations is not very clear-cut. Most
of the developments since the mid-1960s can be seen as part of a continuum of gradual
miniaturisation. In 1970 large-scale integration was achieved, where the equivalent of
thousands of integrated circuits were crammed onto a single silicon chip. This
development again increased computer performance (especially reliability and speed)
whilst reducing computer size and cost. Around this time the first complete general-
purpose microprocessor became available on a single chip. In 1975 Very Large Scale
Integration (VLSI) took the process one step further: complete computer central
processors could now be built into one chip. The microcomputer was born. Such chips
are far more powerful than ENIAC, yet are only about 1 cm square, whilst ENIAC filled a
large building.
During this period fourth-generation languages (4GLs) have come into existence.
Such languages are a step further removed from the computer hardware in that they use
language much like natural language. Many database languages can be described as
4GLs. They are generally much easier to learn than 3GLs.
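SQL, a common database language, illustrates the contrast: where a 3GL spells out how to loop over records, a 4GL-style query simply states what result is wanted. A minimal sketch using Python's built-in sqlite3 module (the table and data are made up for illustration):

```python
import sqlite3

# Hypothetical example data, invented for illustration.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE staff (name TEXT, dept TEXT)")
db.executemany("INSERT INTO staff VALUES (?, ?)",
               [("Ada", "maths"), ("Grace", "navy"), ("Alan", "maths")])

# 3GL style: an explicit loop describing HOW to filter each record.
maths_3gl = []
for name, dept in db.execute("SELECT name, dept FROM staff"):
    if dept == "maths":
        maths_3gl.append(name)

# 4GL style: a declarative query stating WHAT is wanted;
# the database engine decides how to find it.
maths_4gl = [row[0] for row in
             db.execute("SELECT name FROM staff WHERE dept = 'maths'")]

assert maths_3gl == maths_4gl  # both approaches yield the same names
```

The declarative query is the easier of the two to read and write, which is the sense in which 4GLs are generally easier to learn than 3GLs.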
The "fifth generation" of computers was defined by the Japanese government in 1980,
when it unveiled an optimistic ten-year plan to produce the next generation of
computers. This was an interesting plan for two reasons. Firstly, it was not at all clear
what the fourth generation was, or even whether the third generation had finished yet.
Secondly, it was an attempt to define a generation of computers before they had come
into existence. The main requirements of the 5G machines were that they incorporate the
features of artificial intelligence, expert systems, and natural language. The goal
was to produce machines capable of performing tasks in ways similar to humans,
capable of learning, and capable of interacting with humans in natural language,
preferably using both speech input (speech recognition) and speech output (speech
synthesis). Such goals are obviously of interest to linguists and speech scientists, as
natural language and speech processing are key components of the definition. As you
may have guessed, this goal has not yet been fully realized, although significant progress
has been made towards various aspects of these goals.