A Brief History of the Computer
Computers originated many centuries ago, taking on many forms before they became the
familiar devices we commonly use today. In fact, most discourse on innovation and creation
features the evolution of the computer as one of the greatest and most revolutionary
achievements in human history. Needless to say, our reliable digital companion came a long
way before it became one of the most powerful technologies in the world.
Early history
As suggested by its name, the computer started as a mathematical tool for performing
manual calculations, or simply to “calculate” or to “count.” It began as the simple abacus, a
rectangular tool with metal rods and beads representing figures and numbers, which assisted
merchants as far back as 1100 BCE. The need for easier and more convenient methods of
calculating paved the path toward better “calculators” such as Gunter's scale and the
slide rule.
The first calculator capable of performing mathematical functions mechanically was
invented by Blaise Pascal, a French mathematician and philosopher, in the 1640s. Called the
“Pascaline,” it could only perform addition and subtraction. It was not until the 1670s that
mechanized multiplication became possible, when Gottfried Wilhelm von Leibniz, a German
mathematician and philosopher, invented the Step Reckoner. Reliable division arrived in the
1820s, when Charles Xavier Thomas de Colmar built the Arithmometer, the first commercially
successful four-function calculator.
“Before the true power of computing could be realized, therefore, the naive view of
calculation had to be overcome” (Britannica, 2023). In the following years, the focus on
arithmetic functions began to shift toward other purposes such as industry, business, and
research. At this point, the term “computer” began to expand its definition, no longer
restricted to mere calculation.
In the field of textile manufacturing, Joseph-Marie Jacquard, a French weaver, broke away
from repetitive machine functions by creating the Jacquard loom in the early 1800s, whose
weaving patterns were guided by cards punched with holes. The loom showed that machines
could recognize patterns, a form of language not necessarily expressed in words, which proved
that they could be programmed to follow a set of commands (Britannica, 2023).
19th century
During the 1830s, Charles Babbage, an English mathematician and inventor, created the
Difference Engine, a complex calculator designed to tabulate polynomial functions by the
method of finite differences while storing intermediate results. However, due to financial
constraints, development of the machine ceased a few years later. Despite this, Babbage
continued to refine his ideas, eventually conceiving the improved Analytical Engine: a
steam-driven “general-purpose, fully program-controlled, automatic digital mechanical
computer” (Britannica, 2023). It resembled the modern computer in many ways, with an
equivalent of the central processing unit, data storage, a card reader, and a printer. It was
designed to perform far more complex functions than its predecessors, departing from the
strictly sequential operation of earlier machines, as it was supposedly capable of
independently branching through its instructions. Unfortunately, this marvelous idea was
never built, owing to feasibility issues and a lack of financial support from the British
government.
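To illustrate the principle the Difference Engine mechanized, the following is a minimal
Python sketch of the method of finite differences. Once the leading differences of a
polynomial are seeded, every further table value can be produced by addition alone, which is
exactly the repetitive operation the engine performed with gears and wheels. The polynomial
x² + x + 41 is chosen here purely for illustration.

```python
# Method of finite differences: tabulating f(x) = x^2 + x + 41
# using only addition, as the Difference Engine did mechanically.

def seed_differences(values):
    """Derive the leading entry of each difference column
    from the first few directly computed values."""
    columns = [list(values)]
    while len(columns[-1]) > 1:
        row = columns[-1]
        columns.append([row[i + 1] - row[i] for i in range(len(row) - 1)])
    return [col[0] for col in columns]

def tabulate(initial, count):
    """Extend the table using addition only, one value per step."""
    cols = list(initial)
    out = []
    for _ in range(count):
        out.append(cols[0])
        for i in range(len(cols) - 1):
            cols[i] += cols[i + 1]  # each column absorbs the one below
    return out

f = lambda x: x * x + x + 41
seed = seed_differences([f(x) for x in range(3)])  # degree 2 -> 3 seed values
print(tabulate(seed, 8))         # [41, 43, 47, 53, 61, 71, 83, 97]
print([f(x) for x in range(8)])  # matches the direct computation
```

Note that `tabulate` never multiplies: a second-degree polynomial needs only two running
additions per value, which is what made a purely mechanical implementation practical.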
An English mathematician and writer by the name of Ada Lovelace translated and annotated a
paper on the Analytical Engine, pointing out its main distinctions from existing calculating
machines. Fascinated by the machine's ability to go beyond the bounds of arithmetic, Lovelace
used the calculation of Bernoulli numbers as an example in her notes on the subject, which
historians later regarded as the first computer program, hailing her as the first
programmer in history (Britannica, 2015).
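As a modern nod to Lovelace's note, here is a short Python sketch that computes Bernoulli
numbers with the classical recurrence. It is an illustrative reconstruction only, not her
actual program, which was expressed as a table of engine operations.

```python
from fractions import Fraction
from math import comb

def bernoulli(n):
    """Return B_0 .. B_n as exact fractions via the classical
    recurrence: sum over j < m+1 of C(m + 1, j) * B_j = 0 for m >= 1."""
    B = [Fraction(1)]  # B_0 = 1
    for m in range(1, n + 1):
        acc = sum(comb(m + 1, j) * B[j] for j in range(m))
        B.append(-acc / (m + 1))
    return B

print([str(b) for b in bernoulli(8)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```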
Computers became widely used in business by the late 19th and early 20th centuries as machine
manufacturers, such as the Computing-Tabulating-Recording (CTR) Company and the Burroughs
Adding Machine Company, entered the market to sell commercial-purpose calculators. Because of
this, research and development aimed at better computers accelerated early in the 20th century.
In the 1930s, Vannevar Bush, an American electrical engineer at the Massachusetts
Institute of Technology (MIT), developed the first modern analog computer, called the
differential analyzer, which could solve the differential equations commonly encountered in
engineering (Britannica, 2023).
Beyond engineering, computers also entered other fields such as physics. Harvard
professor Howard Hathaway Aiken built a 15-meter electromechanical computer called the
Harvard Mark I in the 1940s. Based on Babbage's Analytical Engine, the machine was one of
the first automatic large-scale calculators, easily programmable to execute high-speed
calculations. The Mark I was used during World War II for studying magnetic fields (“Howard
Aiken,” 2023). Eventually, Aiken developed other similar machines, with the fourth one, the
Mark IV, being electronic. Other electronic computers, such as the Atanasoff-Berry Computer
and the Electronic Numerical Integrator and Computer (ENIAC), which served more general
purposes than earlier calculators, were also developed during this period.
Many early calculating machines found military use, as computer development coincided with
World War II. During this time, a German civil engineer by the name of Konrad Zuse developed
a series of computers that are widely recognized in history. His first creation, the Z1, was
the first binary digital computer, which, unlike earlier models, used binary logic in its
functions. His later machine, the Z3, was the world's first functional program-controlled
computer, used for scientific and engineering calculations (“Key contributions of Konrad
Zuse,” n.d.).
At this point in time, computers still generally did not meet the modern definition of a
powerful, multipurpose machine. The modern computer would not be designed until a paper
titled “Preliminary Discussion of the Logical Design of an Electronic Computing Instrument”
was published in 1946. One of its authors, John von Neumann, a Hungarian-American
mathematician and computer scientist, contributed knowledge that paved the way for the rise
of computer programming languages in later years, as well as for the development of the
Electronic Discrete Variable Automatic Computer (EDVAC), one of the first stored-program
machines (“John von Neumann,” n.d.).
The late 20th century became a significant period in computer history due to major
technological advancements and the rise of the modern computer. What used to be huge metal
machines gradually shrank into the smaller, portable devices that we call personal
computers today.
The IBM PC
The International Business Machines Corporation (IBM) introduced the IBM PC Model
5150 in 1981, a business-use microcomputer that set the standard for personal computers. Its
most notable features were a boxy monitor with a monochromatic display, two floppy disk
drives, and an 83-key keyboard (“IBM PC 5150: Everything you need to know,” 2022).
Apple’s Lisa
In 1983, Apple introduced one of the first personal computers to feature a graphical user
interface, the “Local Integrated Software Architecture,” or simply Lisa. The graphical user
interface (GUI), the ancestor of the windows, icons, and menus of today's devices, allows
users to perform tasks and interact with the machine in a more practical, convenient, and
visual fashion, an approach later popularized by Apple's Macintosh.
References
Gregersen, E. (2015, December 10). Ada Lovelace: The first computer programmer. Encyclopedia
Britannica. https://www.britannica.com/story/ada-lovelace-the-first-computer-programmer
Key contributions of Konrad Zuse to the history of computer design and software. (n.d.).
History of Information. https://www.historyofinformation.com/detail.php?id=613
Rebecca. (2022, November 30). IBM PC 5150: Everything you need to know. History-Computer.
https://history-computer.com/ibm-pc-5150-guide/