Computing - What Does It Mean?

The document discusses the history and definition of computing. It provides context that computing involves both the study of algorithms and the development of hardware and software. Major computing disciplines are also listed. The document then gives a brief overview of the history of computing technologies and concepts.

Computing is any goal-oriented activity requiring, benefiting from, or creating computing machinery.[1] It includes the study and experimentation of algorithmic processes, and the development of both hardware and software. Computing has scientific, engineering, mathematical, technological, and social aspects. Major computing disciplines include computer engineering, computer science, cybersecurity, data science, information systems, information technology, digital art, and software engineering.[2]

The term computing is also synonymous with counting and calculating. In earlier times, it was used in
reference to the action performed by mechanical computing machines, and before that, to human
computers.[3]

[Image: ENIAC, the first programmable general-purpose electronic digital computer]
History

Main article: History of computing

For a chronological guide, see Timeline of computing.

The history of computing is longer than the history of computing hardware and includes the history of methods intended for pen and paper (or for chalk and slate), with or without the aid of tables. Computing is intimately tied to the representation of numbers, though mathematical concepts necessary for computing existed before numeral systems. The earliest known tool for use in computation is the abacus, thought to have been invented in Babylon between 2700 and 2300 BC. Abaci of a more modern design are still used as calculation tools today.

The first recorded proposal for using digital electronics in computing was the 1931 paper "The Use of Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-Williams.[4] Claude Shannon's 1938 paper "A Symbolic Analysis of Relay and Switching Circuits" then introduced the idea of using electronics for Boolean algebraic operations.

The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first working transistor, the point-contact transistor, in 1947.[5][6] In 1953, the University of Manchester built the first transistorized computer, the Transistor Computer.[7] However, early junction transistors were relatively bulky devices that were difficult to mass-produce, which limited them to a number of specialised applications.[8] The metal–oxide–semiconductor field-effect transistor (MOSFET, or MOS transistor) was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959.[9][10] The MOSFET made it possible to build high-density integrated circuits,[11][12] leading to what is known as the computer revolution or microcomputer revolution.
