Computing - What Does It Mean?
Computing includes the study and experimentation of algorithmic processes and the development of
both hardware and software.[1] Computing has scientific, engineering, mathematical, technological and
social aspects. Major computing disciplines include computer engineering, computer
science, cybersecurity, data science, information systems, information technology, digital
art and software engineering.[2]
The term computing is also synonymous with counting and calculating. In earlier times, it was used in
reference to the action performed by mechanical computing machines, and before that, to human
computers.[3]
History
The history of computing is longer than the history of computing hardware and includes the history of
methods intended for pen and paper (or for chalk and slate) with or without the aid of tables.
Computing is intimately tied to the representation of numbers, though mathematical concepts
necessary for computing existed before numeral systems. The earliest known tool for use in
computation is the abacus, which is thought to have been invented in Babylon between 2700 and 2300
BC. Abaci of a more modern design are still used as calculation tools today.
The first recorded proposal for using digital electronics in computing was the 1931 paper "The Use of
Thyratrons for High Speed Automatic Counting of Physical Phenomena" by C. E. Wynn-Williams.
[4] Claude Shannon's 1938 paper "A Symbolic Analysis of Relay and Switching Circuits" then introduced
the idea of using electronics for Boolean algebraic operations.
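As a minimal illustration of that idea (a sketch for this article, not drawn from Shannon's paper itself): switches wired in series conduct only when all are closed, behaving like logical AND, while switches wired in parallel conduct when any is closed, behaving like logical OR, so networks of relays can evaluate Boolean expressions. A short Python model of this correspondence:

    # Illustrative sketch: modelling relay circuits as Boolean functions,
    # in the spirit of Shannon's 1938 analysis. A closed switch is True
    # (current flows); an open switch is False.

    def series(*switches):
        # Switches in series conduct only if every switch is closed: logical AND.
        return all(switches)

    def parallel(*switches):
        # Switches in parallel conduct if any switch is closed: logical OR.
        return any(switches)

    def circuit(a, b, c):
        # A circuit that conducts when a is closed and either b or c is closed,
        # i.e. the Boolean expression a AND (b OR c).
        return series(a, parallel(b, c))

    print(circuit(True, False, True))   # True  - current flows
    print(circuit(False, True, True))   # False - the series switch a is open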
The concept of a field-effect transistor was proposed by Julius Edgar Lilienfeld in 1925. John
Bardeen and Walter Brattain, while working under William Shockley at Bell Labs, built the first
working transistor, the point-contact transistor, in 1947.[5][6] In 1953, the University of
Manchester built the first transistorized computer, the Transistor Computer.[7] However, early junction
transistors were relatively bulky devices that were difficult to mass-produce, which limited them to a
number of specialised applications.[8] The metal–oxide–semiconductor field-effect transistor (MOSFET, or MOS
transistor) was invented by Mohamed Atalla and Dawon Kahng at Bell Labs in 1959.[9][10] The MOSFET
made it possible to build high-density integrated circuits,[11][12] leading to what is known as
the computer revolution or microcomputer revolution.