Module 1 – Computing and Its History
PROGRAM OUTCOMES
a. articulate and discuss the latest developments in the specific field of practice;
b. effectively communicate orally and in writing using both English and Filipino;
c. work effectively and independently in multi-disciplinary and multi-cultural teams;
d. act in recognition of professional, social, and ethical responsibility;
e. preserve and promote "Filipino historical and cultural heritage";
f. perform the basic functions of management such as planning, organizing, staffing, directing and
controlling;
g. apply the basic concepts that underlie each of the functional areas of business (marketing, finance,
human resource management, production and operations management, information technology, and
strategic management) and employ these concepts in various business situations;
h. select the proper decision-making tools to critically, analytically and creatively solve problems and
drive results;
i. express oneself clearly and communicate effectively with stakeholders both in oral and written forms;
j. apply information and communication technology (ICT) skills as required by the business
environment;
k. work effectively with other stakeholders and manage conflict in the workplace;
l. plan and implement business-related activities;
m. demonstrate corporate citizenship and social responsibility;
n. exercise high personal moral and ethical standards;
o. analyze the business environment for strategic direction;
p. prepare operational plans;
q. innovate business ideas based on emerging industry;
r. manage a strategic business unit for economic sustainability;
s. conduct business research; and
t. participate in various types of employment, development activities, and public discourse,
particularly in response to the needs of the communities one serves.
INTRODUCTION
The history of computing is longer than the history of computing hardware and modern computing
technology and includes the history of methods intended for pen and paper or for chalk and slate, with or
without the aid of tables.
In this module, you will learn about the earliest computing devices and the advantages and
disadvantages of manual and mechanical computing devices, as well as the generations of computers
that followed. At the end of this module, you should be able to:
a. discuss the different forms and types of computers that emerged throughout history;
b. identify the different generations of computers and their purposes;
c. describe the improvement in the product development process;
d. explain the efficiency advantage of each generation.
PRE-ASSESSMENT
3. What was the most important development for the miniaturization of computers?
a. the transistor
b. the vacuum tube
c. the integrated circuit
d. the ENIAC
7. What invention set the stage for the 2nd generation of computers?
a. The Vacuum Tube
b. The electric switch
c. The incredible edible egg
d. The transistor
LESSON MAP
This map shows an overview of the generations of computers: the vacuum tube (1st generation),
the transistor (2nd generation), integrated circuits (3rd generation), the microprocessor
(4th generation), and artificial intelligence (5th generation).
CORE CONTENTS
EXPLORE:
Generations of Computer
The history of computer development is often discussed in terms of the different generations of
computing devices. Each generation of computer is characterized by a major technological development
that fundamentally changed the way computers operate, resulting in increasingly smaller, cheaper, more
powerful, more efficient and more reliable devices.
Many of the inventions and discoveries that contributed to the modern computer era do not neatly fit
into these strict categories. Students should not interpret these dates as strict historical boundaries.
The first computers used vacuum tubes for circuitry and magnetic drums for memory, and were
often enormous, taking up entire rooms. They were very expensive to operate and in addition to using a
great deal of electricity, generated a lot of heat, which was often the cause of malfunctions.
First generation computers relied on machine language to perform operations, and they could only
solve one problem at a time. Input was based on punched cards and paper tape, and output was
displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The
UNIVAC was the first commercial computer delivered to a business client, the U.S. Census Bureau,
in 1951.
Transistors replaced vacuum tubes and ushered in the second generation of computers. The
transistor was invented in 1947 but did not see widespread use in computers until the late 1950s. The
transistor was a vast improvement over the vacuum tube, allowing computers to become smaller, faster,
cheaper, more energy efficient and more reliable than their first-generation predecessors.
Second-generation computers still relied on punched cards for input and printouts for output.
Second-generation computers moved from cryptic binary machine language to symbolic, or assembly,
languages, which allowed programmers to specify instructions in words. High-level programming
languages were also being developed at this time, such as early versions of COBOL and FORTRAN.
These were also the first computers that stored their instructions in their memory, which moved from a
magnetic drum to magnetic core technology. The first computers of this generation were developed for
the atomic energy industry.
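To make the jump from binary machine language to symbolic assembly and high-level languages concrete, here is a minimal sketch. The machine-code and assembly fragments in the comments are illustrative only, modeled on a hypothetical accumulator machine rather than any specific historical instruction set, and the runnable high-level equivalent is shown in C rather than early COBOL or FORTRAN.

```c
#include <stdio.h>

int main(void) {
    /* First generation -- machine language: raw binary patterns,
       e.g. on a hypothetical accumulator machine:
           00000001 00001010    (load the value 10)
           00000010 00000101    (add the value 5)                 */

    /* Second generation -- symbolic assembly: the same instructions
       written in words instead of binary:
           LOAD  10
           ADD   5
           STORE SUM                                              */

    /* High-level language: one statement expresses the whole idea
       (early FORTRAN would write something like SUM = 10 + 5).   */
    int sum = 10 + 5;
    printf("sum = %d\n", sum);  /* prints: sum = 15 */
    return 0;
}
```

The point is the progression: the same computation becomes easier to read and write at each level of abstraction.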
The development of the integrated circuit was the hallmark of the third generation of computers.
Transistors were miniaturized and placed on silicon chips, called semiconductors, which drastically
increased the speed and efficiency of computers. Instead of punched cards and printouts, users
interacted with third generation computers through keyboards and monitors and interfaced with an
operating system, which allowed the device to run many different applications at one time with a central
program that monitored the memory.
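As a rough illustration of that "central program", here is a minimal sketch in C of the multiprogramming idea: a supervisor loop hands each job one small slice of work in turn, so several jobs appear to run at once on a single processor. The job names and work amounts are invented for illustration; this is not any historical operating system.

```c
#include <stdio.h>

int main(void) {
    /* Hypothetical jobs resident in memory (illustrative only). */
    const char *jobs[] = {"payroll", "inventory", "billing"};
    int remaining[]    = {3, 5, 2};  /* work units left per job */
    int active         = 3;         /* jobs not yet finished    */

    /* The "supervisor": cycle through the jobs, granting one
       time slice each, until every job has finished.           */
    while (active > 0) {
        for (int i = 0; i < 3; i++) {
            if (remaining[i] > 0) {
                remaining[i]--;
                printf("time slice -> %s (%d units left)\n",
                       jobs[i], remaining[i]);
                if (remaining[i] == 0)
                    active--;
            }
        }
    }
    return 0;
}
```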
Computers for the first time became accessible to a mass audience because they were smaller
and cheaper than their predecessors.
This revolution can be summed up in one word: Intel. The chip-maker developed the Intel 4004 chip
in 1971, which positioned all computer components (CPU, memory, input/output controls) onto a single
chip. What filled a room in the 1940s now fit in the palm of the hand. The Intel chip housed thousands of
integrated circuits. The year 1981 saw the first computer specifically designed for home use, introduced
by IBM, and 1984 saw Apple introduce the Macintosh. Microprocessors even moved beyond the realm of
computers and into an increasing number of everyday products.
The increased power of these small computers meant they could be linked together to form networks,
which ultimately led to the development, birth and rapid evolution of the Internet. Other major advances
during this period were the graphical user interface (GUI), the mouse and, more recently, the
astounding advances in laptop capability and hand-held devices.
Intel Corporation is the largest semiconductor manufacturer in the world, with major facilities in the
United States, Europe, and Asia. Intel has changed the world dramatically since it was founded in 1968;
the company invented the microprocessor, the 'computer on a chip' that made possible the first handheld
calculators and personal computers (PCs). By the early 21st century, Intel's microprocessors were found
in more than 80 percent of PCs worldwide.
Key Dates:
1968: Robert Noyce & Gordon Moore incorporate N M Electronics, which is soon
renamed Intel Corp.
1971: Intel introduces the world's first microprocessor and goes public.
1980: IBM chooses the Intel microprocessor for the first personal computer.
1999: Intel debuts the Pentium III and is added to the Dow Jones Industrial Average.
[Figures: Ethernet networks; a router]
Intel remained competitive through a combination of clever marketing, well-supported research and
development, superior manufacturing proficiency, a vital corporate culture, legal proficiency, and an
ongoing alliance with software giant Microsoft Corporation, often referred to as "Wintel."
Intel Trademark
The name "Moore Noyce" was already trademarked by a hotel chain, so the two founders decided
upon the name "Intel" for their new company, a shortened version of "Integrated Electronics". However,
the rights to the name had to be bought from a company called Intelco first.
AMD Is Born
From its conception in 1969, AMD focused on producing microprocessors and similar computer
components. Initially, it merely licensed processor designs from other companies like Fairchild
Semiconductor. Although it started producing other PC components developed entirely in-house early on
as well, AMD wouldn't produce a processor it designed itself for several years.
In 1975, AMD created its first two non-licensed processor products. Technically, its AM2900 wasn't
a processor; rather, it was a series of components used to build a 4-bit modular processor. It also produced
the AM9080, which was a reverse-engineered clone of Intel's 8080 8-bit microprocessor.
AMD's entry into the x86 processor market began in the early 1980s following an agreement
between IBM and Intel. At the time, IBM was one of the largest computer manufacturers in the world and
quite possibly the single largest producer of computer products. IBM was deliberating on several different
processor designs to use in its upcoming products when it entered into negotiations with Intel. If Intel won
the contract, it would secure a massive order for the company's processors for use inside of IBM-compatible
PCs.
IBM was concerned, however, that the sheer number of processors that it needed would exceed
the production capabilities of any single manufacturer, so it required Intel to license its technology to third-
party manufacturers to ensure sufficient total volume. Intel, not wanting to lose the contract with IBM to a
competitor, agreed to IBM's terms in 1981.
Following the agreement, AMD began producing licensed identical clones of Intel's 8086 processors
in 1982.
In 1985, Intel released its first 32-bit x86 processor design, the 80386. AMD planned to release its
variation, the AM386, not long after, but Intel held it up in court. Intel claimed that its cross-licensing
agreement permitted AMD to produce copies of only the 80286 and older processor designs, but AMD
argued that the contract permitted it to create clones of the 80386 and future x86 derivatives, as well. After
years of legal battles, the courts sided with AMD, and the company was able to release its AM386 in 1991.
Although the AM386 is an 80386 clone, AMD released AM386 processors with clock speeds up to
40 MHz, whereas Intel's 80386 topped out at 33 MHz. This gave AMD a performance advantage, and as
it used the same socket and platform as the 80386, it gave customers an upgrade path for their aging
systems.
Fifth generation computing devices, based on artificial intelligence, are still in development,
though there are some applications, such as voice recognition, that are being used today. The use of
parallel processing and superconductors is helping to make artificial intelligence a reality. Quantum
computation and molecular and nanotechnology will radically change the face of computers in years to
come. The goal of fifth-generation computing is to develop devices that respond to natural language
input and are capable of learning and self-organization.
EXPLAIN
Activity 2: Explanation
1. What has been the greatest and most profound effect of the generations of computers on society?
____________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
____________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
____________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
4. Explain why technology is considered a set of knowledge and skills that enables humans to design
and build objects.
____________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
_____________________________________________________________________
Activity No. 3
1. _____________________________________________________
2. _____________________________________________________
3. _____________________________________________________
4. _____________________________________________________
5. _____________________________________________________
EVALUATE
Activity No. 4
1. What were some disadvantages of first-generation computers? You may choose more than one answer.
TOPIC SUMMARY
• First Generation
In the late 1940s and early 1950s, computers (EDSAC, UNIVAC I, etc.) used vacuum tubes for
their digital logic and liquid mercury memories for storage.
• Second Generation
In the late 1950s, transistors replaced vacuum tubes, and magnetic cores were used for memory
(IBM 1401, Honeywell 800). Size was reduced and reliability was significantly improved.
• Third Generation
In the mid-1960s, computers used the first integrated circuits (IBM 360, CDC 6400) and the first
operating systems and database management systems. Although most processing was still batch
oriented using punch cards and magnetic tapes, online systems were being developed. This was the
era of mainframes and minicomputers, essentially large centralized computers and small departmental
computers.
• Fourth Generation
The mid to late-1970s spawned the microprocessor and personal computer, introducing distributed
processing and office automation. Word processing, query languages, report writers and spreadsheets
put large numbers of people in touch with the computer for the first time.
The 21st century ushered in the fifth generation, which increasingly delivers various forms of artificial
intelligence (AI). More sophisticated search and natural language recognition are features that users
recognize, but software that improves its functionality by learning on its own will change just about
everything.
• Intel was an early developer of SRAM and DRAM memory chips, which represented the majority of its
business until the early 1980s. Intel created the first commercial microprocessor chip in 1971, but it was
not until the success of the personal computer (PC) that this became its primary business. Intel’s
research objective is to introduce a new microarchitecture every two years. During the 1990s, Intel’s
investment in new microprocessor designs fostered the rapid growth of the PC industry. During this
period Intel became the dominant supplier of microprocessors for PCs, and was known for aggressive
and sometimes controversial tactics in defense of its market position, as well as a struggle with Microsoft
for control over the direction of the PC industry. In addition to its work in semiconductors, Intel has
begun research in electrical transmission and generation.
• (Advanced Micro Devices, Inc., Sunnyvale, CA, www.amd.com) A major manufacturer of
semiconductor devices including x86-compatible CPUs, embedded processors, flash memories,
programmable logic devices and networking chips. Founded in 1969 by Jerry Sanders and seven
friends, AMD's first product was a 4-bit shift register. During the 1970s, it entered the memory business,
and after reverse engineering the popular 8080 CPU, the microprocessor market as well.
POST-ASSESSMENT
a. 1920-1939
b. 1940-1959
c. 1960-1965
d. 1966-1971

a. 3
b. 4
c. 5
d. 6

3. What was the most important development for the miniaturization of computers?
a. the transistor
b. the vacuum tube
c. the integrated circuit
d. the ENIAC

a. 4
b.
c. 6
d. 8
e. 12

a. 3rd
b. 4th
c. 5th
d. 6th

7. What invention set the stage for the 2nd generation of computers?
a. The Vacuum Tube
b. The electric switch
c. The incredible edible egg
d. The transistor
REFERENCES
• https://www.encyclopedia.com/computing/news-wires-white-papers-and-books/generations-computers
• https://www.tutorialspoint.com/basics_of_computer_science/basics_of_computer_science_generations.htm
• https://www.pcmag.com/encyclopedia/term/computer-generations
• https://www.newworldencyclopedia.org/entry/Intel_Corporation