History of Computer

The document outlines the history of computers, beginning with the need for faster census calculations in the 1880s and the invention of punch-card systems. It highlights key developments, including the creation of the first personal computers, the introduction of programming languages like COBOL, and the evolution of user interfaces. The timeline culminates with advancements in mobile computing and the emergence of social media platforms in the 21st century.


The computer was not born for entertainment or email but out of a need to solve a serious number-crunching crisis. By 1880, the U.S. population had grown so large that it took more than seven years to tabulate the U.S. Census results. The government sought a faster way to get the job done, giving rise to punch-card-based computers that took up entire rooms.
In 1801, in France, Joseph Marie Jacquard invents a loom that uses punched wooden cards to automatically weave fabric designs. Early computers would use similar punch cards.
In 1890, Herman Hollerith designs a punch-card system to tabulate the 1890 census, accomplishing the task in just three years and saving the government $5 million. He establishes a company that would ultimately become IBM (International Business Machines).
In the 1830s, Charles Babbage designs a machine named the Analytical Engine, intended to calculate the numerical value or values of any formula or function for which the mathematician can indicate the method of solution.
In 1936, Alan Turing presents the notion of a universal machine, called the Turing machine, capable of computing anything that is computable. The central concept of the modern computer was based on his ideas.
In 1941, John Vincent Atanasoff and his graduate student, Clifford Berry, design a computer that can solve 29 equations simultaneously. This marks the first time a computer is able to store information in its main memory.
In 1943-1944, two University of Pennsylvania professors, John Mauchly and J. Presper Eckert, build the Electronic Numerical Integrator and Computer (ENIAC).
In 1947, William Shockley, John Bardeen and Walter Brattain of Bell Laboratories invent the transistor. They discovered how to make an electric switch with solid materials and no need for a vacuum.
In 1953, Grace Hopper develops one of the first computer languages, which eventually evolves into COBOL.

COBOL (Common Business-Oriented Language) is a high-level programming language for business applications. It was the first popular language designed to be operating-system-agnostic and is still in use in many financial and business applications today. COBOL was designed for business computer programs in industries such as finance and human resources.
In 1964, Douglas Engelbart shows a prototype of the modern computer, with a mouse and a graphical user interface (GUI). This marks the evolution of the computer from a specialized machine for scientists and mathematicians to technology that is more accessible to the general public.
In 1970, the newly formed Intel unveils the Intel 1103, the first Dynamic Random Access Memory (DRAM) chip.
Intel’s initial products were memory chips, including the world’s first metal-oxide-semiconductor memory chip, the 1101, which did not sell well. However, its sibling, the 1103, a one-kilobit dynamic random access memory (DRAM) chip, was successful and the first chip to store a significant amount of information.
In 1971, Alan Shugart leads a team of IBM engineers who invent the “floppy disk,” allowing data to be shared among computers.

A floppy disk is a magnetic storage medium for computer systems. It is composed of a thin, flexible magnetic disk sealed in a square carrier. To read and write data from a floppy disk, a computer system must have a floppy disk drive (FDD).
In 1975, the January issue of Popular Electronics magazine features the Altair 8800, described as the “world’s first minicomputer kit to rival commercial models.” Two “computer geeks,” Paul Allen and Bill Gates, offer to write software for the Altair using the new BASIC language. The Altair 8800 was one of the first computers available for personal use.
In 1976, Steve Jobs and Steve Wozniak start Apple Computer on April Fool’s Day and roll out the Apple I, the first computer with a single circuit board, according to Stanford University.
In 1977, Radio Shack’s initial production run of the TRS-80 was just 3,000 units. It sold like crazy. For the first time, non-geeks could write programs and make a computer do what they wished.
In 1977, Jobs and Wozniak incorporate Apple and show the Apple II at the first West Coast Computer Faire. It offers color graphics and incorporates an audio cassette drive for storage.
In 1981, the first IBM personal computer, code-named “Acorn,” is introduced. It uses Microsoft’s MS-DOS operating system. It has an Intel chip, two floppy disk drives and an optional color monitor. It also popularizes the term PC.
In 1983, Apple’s Lisa is the first personal computer with a GUI. It also features drop-down menus and icons. It flops but eventually evolves into the Macintosh. The Gavilan SC is the first portable computer with the familiar flip form factor and the first to be marketed as a “laptop.”
In 1985, Microsoft announces Windows, according to Encyclopedia Britannica. This was the company’s response to Apple’s GUI. Commodore unveils the Amiga 1000, which features advanced audio and video capabilities.
In 1990, Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, develops HyperText Markup Language (HTML), giving rise to the World Wide Web.

HTML is a computer language devised to allow website creation. These websites can then be viewed by anyone else connected to the internet. It is relatively easy to learn, with the basics being accessible to most people in one sitting, and quite powerful in what it allows you to create.
In 1993, the Pentium microprocessor advances the use of graphics and music on PCs.

A microprocessor is the controlling unit of a microcomputer, fabricated on a small chip, capable of performing ALU (Arithmetic Logic Unit) operations and communicating with the other devices connected to it.
In 1996, Sergey Brin and Larry Page develop the Google search engine at Stanford University.
In 1999, the term Wi-Fi becomes part of the computing language and users begin connecting to the Internet without wires.
In 2004, Mozilla’s Firefox 1.0 challenges Microsoft’s Internet Explorer, the dominant Web browser. Facebook, a social networking site, launches.
In 2005, YouTube, a video-sharing service, is founded. Google acquires Android, a Linux-based mobile phone operating system.
In 2006, Apple introduces the MacBook Pro, its first Intel-based, dual-core mobile computer, as well as an Intel-based iMac.
In 2009, Microsoft launches Windows 7, which offers the ability to pin applications to the taskbar and advances in touch and handwriting recognition, among other features.
In 2010, Apple unveils the iPad, changing the way consumers view media and jumpstarting the dormant tablet computer segment.
In 2011, Google releases the Chromebook, a laptop that runs Google Chrome OS.
In 2012, Facebook reaches 1 billion users on October 4.
In 2015, Apple releases the Apple Watch. Microsoft releases Windows 10.
In 2016, the first reprogrammable quantum computer is created. “Until now, there hasn’t been any quantum-computing platform that had the capability to program new algorithms into their system. They’re usually each tailored to attack a particular algorithm,” said study lead author Shantanu Debnath, a quantum physicist and optical engineer at the University of Maryland, College Park.