History of computers and their types
Created by: Ibrahim Qahtan Adnan
Supervisor: Dr. Solav Ali

The history of the computer and the stages of its development

The origin of the word "computer" dates back to the sixteenth century, when the term referred to a person employed to perform certain mathematical calculations. It remained associated with humans until the end of the nineteenth century, when it came to refer to machines capable of performing mathematical operations. The first computer appeared in 1833 AD, when Charles Babbage, who is considered the godfather of the computer, designed a mechanical computer intended for general-purpose use.
First generation computers
The period that coincided with the outbreak of World War II was of great importance in the development of computers. In 1938 AD, the German engineer Konrad Zuse built the first programmable binary computer in history, which he called the Z1. One year later, the American physicist John Atanasoff and the engineer Clifford Berry built the Atanasoff-Berry Computer (ABC), an electronic computer that used more than 300 vacuum tubes to perform digital calculations and arithmetic and logical operations. The year 1943 witnessed the start of work on the first general-purpose computer, ENIAC (Electronic Numerical Integrator and Computer), which was completed in 1946. ENIAC and its predecessors form the first generation of computers, which lasted until the end of the 1950s. These machines were characterized by slow operation, large size, and high cost, because they used vacuum tubes as the basic components of both the central processing unit and the memory unit, along with magnetic and paper tapes as input and output devices.
Second generation computers
The first generation of computers was based on vacuum tubes; growing commercial interest in computing and the adoption of the transistor as the basic manufacturing component led to a new generation of machines known as the second generation. The Transac S-2000, manufactured by Philco Corporation in 1958, was one of the first computers built with transistors, and IBM followed by using transistors in its IBM 7090 computers. Second-generation computers relied primarily on hard disks and magnetic tapes to store data, and they were programmed in languages such as COBOL, which is oriented toward business applications, and FORTRAN, which was used in scientific and commercial fields. The second generation lasted from 1959 to 1965 AD; the use of transistors reduced manufacturing costs and size, increased speed and overall performance compared with first-generation computers, and lowered power consumption.
Third generation computers
The emergence of the third generation of computers is due to the integrated circuit, abbreviated as IC, which was invented by Robert Noyce and Jack Kilby between 1958 and 1959 AD. Third-generation devices represented the first steps toward the computers in use today. The IBM-360 is considered the most important third-generation computer, and IBM spent approximately US$5 billion producing this series. The IBM-360 was used for many operations requiring fast data processing, such as weather forecasting, astronomy, space exploration, and other specialized scientific fields. The third generation lasted until 1971 AD; its devices were distinguished by high speed, small size, and good efficiency compared with second-generation hardware, and high-level programming languages came into wide use.
Fourth generation computers
The invention of the microprocessor led to the emergence of the fourth generation of computers; the Intel 8008 processor, released in 1972 AD, was among the earliest microprocessors. Microprocessors dramatically reduced the cost of producing computers, contributing to the emergence of personal computers (PCs), portable computers, and mobile phones. The Altair 8800, IBM 5100, and Micral are examples of early fourth-generation machines, all built around the microprocessor, which serves as the central processing unit in today's computers.
Fifth generation computers
Fifth-generation devices appeared in 2010 AD. Computers of this generation depend on artificial intelligence: they can interact with natural-language input and possess the ability to learn and self-organize, giving them intelligence that resembles human intelligence to some extent. IBM's Watson computer is one of the most famous examples of fifth-generation devices.