
COMPUTER ASSIGNMENT

QNo:1) Write a short note on the history of the computer from start to present.

ANS:
First Generation: Vacuum Tubes (1940-1956)
The first computer systems used vacuum tubes for circuitry
and magnetic drums for memory, and were often enormous,
taking up entire rooms. These computers were very expensive to
operate and in addition to using a great deal of electricity, the first
computers generated a lot of heat, which was often the cause of
malfunctions.
First-generation computers relied on machine language, the
lowest-level programming language understood by computers, to
perform operations, and they could only solve one problem at a
time. It would take operators days or even weeks to set up a new
problem. Input was based on punched cards and paper tape, and
output was displayed on printouts.
The UNIVAC and ENIAC computers are examples of first-generation computing devices. The UNIVAC was the first commercially produced computer; its first unit was delivered to the U.S. Census Bureau in 1951.

Second Generation: Transistors (1956-1963)


The world would see transistors replace vacuum tubes in the
second generation of computers. The transistor was invented at
Bell Labs in 1947 but did not see widespread use in computers
until the late 1950s. 
The transistor was far superior to the vacuum tube, allowing
computers to become smaller, faster, cheaper, more energy-
efficient and more reliable than their first-generation
predecessors. Though the transistor still generated a great deal of heat, which could damage the computer, it was a vast improvement over the vacuum tube. Second-generation
computers still relied on punched cards for input and printouts for
output.
From Binary to Assembly
Second-generation computers moved from cryptic binary machine
language to symbolic, or assembly, languages, which allowed
programmers to specify instructions in words. High-level
programming languages were also being developed at this time,
such as early versions of COBOL and FORTRAN. These were
also the first computers that stored their instructions in their
memory, which moved from a magnetic drum to magnetic core
technology.
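To make the move from binary machine code to symbolic assembly concrete, here is a minimal sketch of the translation step an assembler performs. The mnemonics and opcodes below are invented for illustration and do not correspond to any historical machine:

# A toy assembler: word-like mnemonics are translated into the
# numeric opcodes that the machine actually executes.
# This instruction set is hypothetical, purely for illustration.
OPCODES = {"LOAD": 0b0001, "ADD": 0b0010, "STORE": 0b0011, "HALT": 0b1111}

def assemble(source):
    """Translate symbolic assembly lines into 8-bit machine words."""
    words = []
    for line in source.strip().splitlines():
        parts = line.split()
        mnemonic = parts[0]
        operand = int(parts[1]) if len(parts) > 1 else 0
        # Pack a 4-bit opcode and a 4-bit operand into one byte.
        words.append((OPCODES[mnemonic] << 4) | operand)
    return words

program = """LOAD 5
ADD 3
STORE 7
HALT"""
print([f"{w:08b}" for w in assemble(program)])
# prints ['00010101', '00100011', '00110111', '11110000']

A first-generation programmer had to write the bit patterns on the last line directly; a second-generation programmer could write the word-like program above and let the assembler do the translation.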
The first computers of this generation were developed for the
atomic energy industry.
Third Generation: Integrated Circuits (1964-1971)
The development of the integrated circuit was the hallmark of the
third generation of computers. Transistors were miniaturized and
placed on silicon chips made of semiconductor material, which drastically
increased the speed and efficiency of computers.
Instead of punched cards and printouts, users interacted with third-generation computers
through keyboards and monitors and interfaced with an operating
system, which allowed the device to run many
different applications at one time with a central program that
monitored the memory. Computers for the first time became
accessible to a mass audience because they were smaller and
cheaper than their predecessors.
Did You Know...? An integrated circuit (IC) is a small electronic
device made out of a semiconductor material. The first integrated
circuit was developed independently in the late 1950s by Jack Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor.
Fourth Generation: Microprocessors (1971-Present)
The microprocessor brought the fourth generation of computers,
as thousands of integrated circuits were built onto a single silicon
chip. What in the first generation filled an entire room could now fit
in the palm of the hand. The Intel 4004 chip, developed in 1971,
located all the components of the computer—from the central
processing unit and memory to input/output controls—on a single
chip.
In 1981 IBM introduced its first computer for the home user, and
in 1984 Apple introduced the Macintosh. Microprocessors also moved out of the realm of desktop computers and into many other areas of life as more and more everyday products began to use them.
As these small computers became more powerful, they could be
linked together to form networks, which eventually led to the
development of the Internet. Fourth-generation computers also
saw the development of GUIs, the mouse and handheld devices.

Intel's first microprocessor, the 4004, was conceived by Ted Hoff and Stanley Mazor. (Image source: Intel Timeline, PDF)
Fifth Generation: Artificial Intelligence (Present and Beyond)
Fifth-generation computing devices, based on artificial
intelligence, are still in development, though there are some
applications, such as voice recognition, that are being used today.
The use of parallel processing and superconductors is helping to
make artificial intelligence a reality.
Quantum computation and molecular and nanotechnology will
radically change the face of computers in years to come. The goal
of fifth-generation computing is to develop devices that respond
to natural language input and are capable of learning and self-
organization.

QNo:2) How is hardware different from software?

ANS:
HARDWARE vs SOFTWARE

1) Hardware: the devices that are required to store and execute (or run) the software.
Software: a collection of instructions that enables a user to interact with the computer. Software is a program that enables a computer to perform a specific task, as opposed to the physical components of the system (hardware).

2) Hardware: input, storage, processing, control, and output devices.
Software: system software, programming software, and application software.

3) Hardware: serves as the delivery system for software solutions. The hardware of a computer is infrequently changed, in comparison with software and data, which are “soft” in the sense that they are readily created, modified, or erased on the computer.
Software: performs the specific task you need to complete. Software is generally not needed for the hardware to perform its basic-level tasks, such as turning on and responding to input.

4) Hardware examples: CD-ROM, monitor, printer, video card, scanners, label makers.
Software examples: QuickBooks, Adobe Acrobat, Google Chrome, Microsoft Word.

5) Hardware: starts functioning once software is loaded.
Software: is installed on hardware in order to deliver its set of instructions.

6) Hardware: failure is random; hardware does show an increasing failure rate in the last stage of its life.
Software: failure is systematic; software does not have an increasing failure rate.

7) Hardware: wears out over time.
Software: does not wear out over time.

8) Hardware: is physical in nature.
Software: is logical in nature.

QNo:3) Which unit of the computer is responsible for performing operations? Why is it so important to the operation of the computer?
ANS:
A central processing unit (CPU), also called a central processor or main processor, is the electronic circuitry within a computer that carries out the instructions of a computer program by performing the basic arithmetic, logic, control, and input/output (I/O) operations specified by the instructions. It is important because, without the CPU, a computer is nothing more than a box of components that cannot run any program.
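As a rough illustration of what "carrying out instructions" means, the sketch below simulates the fetch-decode-execute cycle that a CPU repeats for every instruction. The tiny instruction set is invented for this example and does not model any real processor:

# A toy CPU loop: fetch an instruction, decode its opcode,
# execute it, and move on to the next one.
# The instruction set is hypothetical, for illustration only.
def run(program):
    acc = 0  # accumulator register (holds arithmetic results)
    pc = 0   # program counter (index of the next instruction)
    while pc < len(program):
        op, arg = program[pc]  # fetch
        pc += 1
        if op == "LOAD":       # decode and execute
            acc = arg          # arithmetic: load a value
        elif op == "ADD":
            acc += arg         # arithmetic: add to the accumulator
        elif op == "PRINT":
            print(acc)         # stand-in for an output (I/O) operation
        elif op == "HALT":
            break              # control: stop execution
    return acc

# Compute 2 + 3 and print the result (5).
run([("LOAD", 2), ("ADD", 3), ("PRINT", None), ("HALT", None)])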
QNo:4) Describe at least four applications, or uses, of computers other than those discussed in lab.
ANS:
The applications of computers are as follows:
1) They are used in factories and industries.
2) They are used for marketing.
3) They are used for coding and programming.
4) They are used for communication.
5) They are used in research centres.
QNo:5) What is the binary number system?
ANS:
The binary number system is a numbering system that represents numeric values using two unique digits (0 and 1). Most computing devices use binary numbering to represent the voltage states of electronic circuits (like an on/off switch), treating zero voltage as off (0) and a positive voltage as on (1).
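As a worked example, decimal 13 is 1101 in binary, because 13 = 8 + 4 + 1 = 2^3 + 2^2 + 2^0. The short Python snippet below (added here for illustration) converts in both directions, once with the language's built-in helpers and once by hand:

# Decimal 13 to binary using Python's built-in conversion.
n = 13
print(bin(n))  # prints '0b1101'

# The same conversion by hand: repeatedly divide by 2 and
# collect the remainders from least to most significant bit.
bits = []
m = n
while m > 0:
    bits.append(str(m % 2))
    m //= 2
print("".join(reversed(bits)))  # prints '1101'

# And back again: int() parses a string in a given base.
print(int("1101", 2))  # prints 13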

QNo:6) Differentiate between the intelligence of humans and computers.
ANS:

HUMAN vs COMPUTER

1) Humans process information more slowly. Computers process information faster.
2) Humans may be objective. Computers are highly objective.
3) Humans may be less accurate. Computers are more accurate.
4) Humans can easily adapt to changes. Computers cannot adapt to changes well.
5) Humans have excellent social skills. Computers have only average social skills.
6) Humans have self-awareness. Computers are still working toward self-awareness.
7) Humans excel at innovation. Computers excel at optimization.
8) Humans can easily multitask. Computers cannot multitask well.
