Computer
Early computers were meant to be used only for calculations. Simple manual instruments like the
abacus have aided people in doing calculations since ancient times. Early in the Industrial
Revolution, some mechanical devices were built to automate long, tedious tasks, such as guiding
patterns for looms. More sophisticated electrical machines did specialized analog calculations in
the early 20th century. The first digital electronic calculating machines were developed during
World War II, some electromechanical and others using thermionic valves. The first semiconductor
transistors in the late 1940s were followed by the silicon-based MOSFET (MOS transistor) and
monolithic integrated circuit chip technologies in the late 1950s, leading to the microprocessor
and the microcomputer revolution in the 1970s. The speed, power, and versatility of computers
have been increasing dramatically ever since then, with transistor counts increasing at a rapid
pace (Moore's law observed that counts doubled roughly every two years), leading to the Digital Revolution
during the late 20th and early 21st centuries.