2400 BC Abacus

The document provides a history of computing from 2400 BC to 1994 AD. Some key events and inventions include the abacus in 2400 BC, the binary number system in 300 BC, Charles Babbage's mechanical computer designs in the 1800s, the first general purpose electronic computer (ENIAC) in 1945, the first commercial computer (UNIVAC I) in 1951, the invention of programming languages like FORTRAN and COBOL in the 1950s, the first microprocessor and personal computers in the 1970s, and the creation of the World Wide Web in 1990. The timeline shows the gradual progression from mechanical to electronic digital computers and networks over thousands of years.

Uploaded by Jerome Regala
© Attribution Non-Commercial (BY-NC)

6/21/2009 3:03:00 AM

History
2400 BC Abacus: The abacus, the first known calculator, is invented in Babylonia
500 BC Panini: Panini introduces the forerunner to modern formal language theory
300 BC Pingala: Pingala invents the binary number system
87 BC Antikythera Mechanism: The Antikythera Mechanism is built in Rhodes to track the movement of the stars
60 AD Heron of Alexandria: Heron of Alexandria invents machines which follow a series of instructions
724 Liang Ling-Can: Liang Ling-Can invents the first fully mechanical clock
1492 Leonardo da Vinci: Drawings by Leonardo da Vinci depict inventions such as flying machines, including a helicopter, the first mechanical calculator and one of the first programmable robots
1614 John Napier: John Napier invents a system of moveable rods (Napier's Rods), based on logarithms, able to multiply, divide and calculate square and cube roots
1622 William Oughtred: William Oughtred develops the slide rule
1623 Calculating Clock: The Calculating Clock is invented by Wilhelm Schickard
1642 Blaise Pascal: Blaise Pascal invents the "Pascaline", a mechanical adding machine
1671 Gottfried Leibniz: Gottfried Leibniz, known as one of the founding fathers of calculus, designs a mechanical calculating machine
1801 Joseph-Marie Jacquard: Joseph-Marie Jacquard invents an automatic loom controlled by punched cards
1820 Arithmometer: The Arithmometer, the first mass-produced calculator, is invented by Charles Xavier Thomas de Colmar
1822 Charles Babbage: Charles Babbage designs his first mechanical computer
1834 Analytical Engine: The Analytical Engine is designed by Charles Babbage
1835 Morse code: Samuel Morse invents Morse code
1848 Boolean algebra: Boolean algebra is invented by George Boole
1853 Tabulating Machine: Per Georg Scheutz and his son Edvard invent the Tabulating Machine
1869 William Stanley Jevons: William Stanley Jevons designs a practical logic machine
1878 Ramon Verea: Ramon Verea invents a fast calculator with an internal multiplication table
1880 Alexander Graham Bell: Alexander Graham Bell invents the Photophone, which transmits sound on a beam of light (Bell had patented the telephone in 1876)

1884 Comptometer: The Comptometer, an invention of Dorr E. Felt, is operated by pressing keys
1890 Herman Hollerith: Herman Hollerith invents a counting machine which increments mechanical counters
1895 Guglielmo Marconi: Guglielmo Marconi transmits the first radio signals
1906 Lee De Forest: Lee De Forest invents the triode vacuum tube
1911 IBM: IBM's predecessor company is formed on June 15, 1911
1924 John Logie Baird: An electromechanical television system is invented by John Logie Baird
1937 Alan Turing: Alan Turing develops the concept of a theoretical computing machine
1938 Konrad Zuse: Konrad Zuse creates the Z1 Computer, a binary digital computer using punch tape
1939 George Stibitz: George Stibitz develops the Complex Number Calculator, a foundation for digital computers
1939 Hewlett-Packard: William Hewlett and David Packard start Hewlett-Packard
1939 John Vincent Atanasoff and Clifford Berry: John Vincent Atanasoff and Clifford Berry develop the ABC (Atanasoff-Berry Computer) prototype
1943 Enigma: The German military relies on the Enigma encryption machine
1943 Colossus: The code-breaking machine Colossus is built at Bletchley Park, building on Alan Turing's work
1944 Howard Aiken & Grace Hopper: Howard Aiken and Grace Hopper design the MARK series of computers at Harvard University
1945 ENIAC: John Presper Eckert & John W. Mauchly develop the ENIAC (Electronic Numerical Integrator and Computer)
1945 Computer bug: The term "computer bug" is first used by Grace Hopper
1946 F.C. Williams: F.C. Williams develops his cathode-ray tube (CRT) storage device, the forerunner to random-access memory (RAM)

1947 Pilot ACE: Donald Watts Davies joins Alan Turing to build the fastest digital computer in England at the time, the Pilot ACE
1947 The transistor: John Bardeen, Walter Brattain and William Shockley invent the transistor at Bell Labs
1947 Douglas Engelbart: Douglas Engelbart theorises on interactive computing with keyboard and screen display instead of punchcards
1948 Andrew Donald Booth: Andrew Donald Booth invents magnetic drum memory
1948 Frederic Calland Williams & Tom Kilburn: Frederic Calland Williams & Tom Kilburn develop the SSEM ("Small-Scale Experimental Machine"), a digital computer with CRT storage, soon nicknamed the "Baby"
1949 Claude Shannon: Claude Shannon builds the first machine that plays chess
1949 Howard Aiken: Howard Aiken develops the Harvard Mark III

1950 Hideo Yamachito: The first electronic computer in Japan is created by Hideo Yamachito
1950 Alan Turing: Alan Turing publishes his paper "Computing Machinery and Intelligence", which introduces the Turing Test
1951 UNIVAC: UNIVAC I (UNIVersal Automatic Computer I), the first commercial computer made in the United States, is introduced, designed principally by John Presper Eckert & John W. Mauchly
1951 EDVAC: The EDVAC (Electronic Discrete Variable Automatic Computer) begins performing basic tasks; unlike the ENIAC, it is binary rather than decimal
1953 IBM 701: The IBM 701 becomes available and a total of 19 are sold to the scientific community
1954 John Backus & IBM: John Backus & IBM develop the FORTRAN programming language
1955 Bell Labs: Bell Labs introduces its first transistor computer
1956 Optical fiber: Optical fiber imaging is developed by Basil Hirschowitz, C. Wilbur Peters, and Lawrence E. Curtiss
1957 Sputnik: Sputnik I and Sputnik II are launched by the Soviet Union

1958 ARPA & NASA: ARPA (Advanced Research Projects Agency) and NASA are formed
1958 Silicon chip: The first integrated circuit, or silicon chip, is produced in the US by Jack Kilby and, independently, Robert Noyce
1959 Paul Baran: Paul Baran theorises on the "survivability of communication systems under nuclear attack", digital technology and symbiosis between humans and machines
1960 COBOL: The Common Business-Oriented Language (COBOL) programming language is invented

1962 Spacewar: The first computer game, Spacewar!, is invented by Steve Russell and MIT
1963 The computer mouse: Douglas Engelbart invents and patents the first computer mouse (nicknamed the mouse because the tail came out the end)
1963 ASCII: The American Standard Code for Information Interchange (ASCII) is developed to standardize data exchange among computers
1964 Word processor: IBM introduces the first word processor
1964 BASIC: John Kemeny and Thomas Kurtz develop the Beginner's All-purpose Symbolic Instruction Code (BASIC)
1967 Floppy disk: IBM creates the first floppy disk
1969 Seymour Cray: Seymour Cray develops the CDC 7600 supercomputer
1969 Gary Starkweather: Gary Starkweather invents the laser printer while working at Xerox
1969 ARPANET: The U.S. Department of Defense sets up the Advanced Research Projects Agency Network (ARPANET), the first building block of today's Internet, originally intended as a computer network that could withstand any type of disaster

1970 RAM: Intel introduces the world's first commercially available dynamic RAM (random-access memory) chip, the 1103; the first microprocessor, the Intel 4004, follows in 1971

1972 Pong: Atari releases Pong, the first commercially successful video game
1972 The CD: Optical digital recording, a forerunner of the compact disc, is patented in the United States

1973 Ethernet: Robert Metcalfe and David Boggs create Ethernet, a local-area network (LAN) protocol
1973 Personal computer: The Xerox Alto minicomputer is a landmark step in the development of personal computers
1973 Gateways: Vint Cerf and Bob Kahn develop gateway routing computers to negotiate between the various national networks

1974 SQL: IBM develops SEQUEL (Structured English Query Language), now known as SQL
1974 WYSIWYG: Charles Simonyi coins the term WYSIWYG (What You See Is What You Get) to describe the ability to display a file or document exactly as it will be printed or viewed

1975 Altair: MITS releases the Altair 8800, the first commercially successful personal computer, in kit form
1975 Microsoft: The Microsoft Corporation is founded on April 4, 1975 by Bill Gates and Paul Allen to develop and sell BASIC interpreters for the Altair 8800
1976 Apple: Apple Computer is founded by Steve Wozniak and Steve Jobs
1977 Apple II: Apple's Apple II, the first personal computer with color graphics, is demonstrated
1977 MODEM: Ward Christensen writes the program "MODEM", allowing two microcomputers to exchange files over a phone line
1978 Magnetic tape: Magnetic tape storage continues to develop in the US
1980 DOS: IBM hires Paul Allen and Bill Gates to create an operating system for a new PC; they buy the rights to a simple operating system from Seattle Computer Products and use it as a template to develop DOS
1982 Commodore 64: The Commodore 64 goes on to become the best-selling computer model of all time
1984 Apple Macintosh: Apple introduces the Macintosh with mouse and window interface

1984 Cyberspace: William Gibson coins the word "cyberspace" when he publishes Neuromancer
1990 The World Wide Web: Tim Berners-Lee and Robert Cailliau propose a hypertext system, the start of the World Wide Web
1990 Microsoft & IBM: Microsoft and IBM stop working together to develop operating systems
1991 The World Wide Web: The World Wide Web is launched to the public on August 6, 1991
1993 Web servers: At the beginning of the year, only 50 World Wide Web servers are known to exist
1994 W3C: The World Wide Web Consortium is founded by Tim Berners-Lee to help develop common protocols for the evolution of the World Wide Web
1994 Yahoo!: Yahoo! is created in April 1994

A Brief History of Computers and Networks, Part I

Webster's Dictionary defines "computer" as any programmable electronic device that can store, retrieve, and process data. The basic idea of computing develops in the 1200s when a Muslim cleric proposes solving problems with a series of written procedures. As early as the 1640s mechanical calculators are manufactured for sale. Records exist of earlier machines, but Blaise Pascal invents the first commercial calculator, a hand-powered adding machine. Although attempts to multiply mechanically were made by Gottfried Leibniz in the 1670s, the first true multiplying calculator appears in Germany shortly before the American Revolution. In 1801 a Frenchman, Joseph-Marie Jacquard, builds a loom that weaves by reading punched holes stored on small sheets of hardwood. These plates are then inserted into the loom, which reads (retrieves) the pattern and creates (processes) the weave. Powered by water, this "machine" came 140 years before the development of the modern computer. Shortly after the first mass-produced calculator (1820), Charles Babbage begins his lifelong quest for a programmable machine. Although Babbage was a poor communicator and record-keeper, his designs are sufficiently developed by 1842 that Ada Lovelace translates and annotates a paper on the Analytical Engine, describing how it could be programmed. She is generally regarded as the first programmer. Twelve years later George Boole, while professor of mathematics at Queen's College, Cork, writes An Investigation of the Laws of Thought (1854), and is generally recognized as the father of computer science.

The 1890 census is tabulated on punch cards similar to the ones used 90 years earlier to create weaves. Developed by Herman Hollerith, the system uses electric power (nonmechanical). The Hollerith Tabulating Company is a forerunner of today's IBM. Just prior to the introduction of Hollerith's machine the first printing calculator is introduced. In 1892 William Burroughs, a sickly ex-teller, introduces a commercially successful printing calculator. Although hand-powered at first, Burroughs quickly introduces an electrically powered model. In 1925, unaware of the work of Charles Babbage, Vannevar Bush of MIT builds a machine he calls the differential analyzer. Using a set of gears and shafts, much like Babbage, the machine can handle simple calculus problems, but accuracy is a problem.

The period from 1935 through 1952 gets murky with claims and counterclaims of who invents what and when. Part of the problem lies in the international situation that makes much of the research secret. Other problems include poor record-keeping, deception and lack of definition. In 1935, Konrad Zuse, a German construction engineer, builds a mechanical calculator to handle the math involved in his profession. Shortly after completion, Zuse starts on a programmable mechanical computer, the Z1, which he completes in 1938. John Vincent Atanasoff begins work on a digital computer in 1936 in the basement of the Physics building on the campus of Iowa State. A graduate student, Clifford (John) Berry, assists. The "ABC" is designed to solve linear equations common in physics. It displays some early features of later computers, including electronic calculation. He shows it to others in 1939 and leaves the patent application with attorneys for the school when he leaves for a job in Washington during World War II. Unimpressed, the school never files, and the ABC is cannibalized by students. The Enigma, a complex mechanical encoder, is used by the Germans, who believe it to be unbreakable.
Several people involved, most notably Alan Turing, conceive machines to handle the problem, but none are technically feasible. Turing proposes a "Universal Machine" capable of "computing" any algorithm in 1937. That same year George Stibitz creates his Model K (for "kitchen"), a conglomeration of otherwise useless and leftover material, to solve complex calculations. He improves the design while working at Bell Labs and on September 11, 1940, Stibitz uses a teletype machine at Dartmouth College in New Hampshire to transmit a problem to his Complex Number Calculator in New York and receives the results. It is the first example of a network. First in Poland, and later in Great Britain and the United States, the Enigma code is broken. Information gained by this shortens the war. To break the code, the British, building on Turing's cryptanalytic work, build the Colossus Mark I. The existence of this machine is a closely guarded secret of the British Government until 1970. The United States Navy, aided to some extent by the British, builds a machine capable of breaking not only the German code but the Japanese code as well.

In 1943 development begins in earnest on the Electronic Numerical Integrator And Computer (ENIAC) at the University of Pennsylvania. Designed by John Mauchly and J. Presper Eckert of the Moore School, they get help from John von Neumann and others. In 1944, the Harvard Mark I is introduced. Based on a series of proposals from Howard Aiken in the late 1930s, the Mark I computes complex tables for the U.S. Navy. It uses a paper tape to store instructions, and Aiken hires Grace Hopper ("Amazing Grace") as one of three programmers working on the machine. Thomas J. Watson Sr. plays a pivotal role involving his company, IBM, in the machine's development. Early in 1945, with the Mark I stopped for repairs, Hopper notices a moth in one of the relays, possibly causing the problem. From this day on, Hopper refers to fixing the system as "debugging". The same year von Neumann proposes the concept of a "stored program" in a paper that is never officially published.

Work completes on ENIAC in 1946. Although only three years old, the machine is woefully behind in technology, but the inventors opt to continue while working on a more modern machine, the EDVAC. Programming ENIAC requires it to be rewired; a later version eliminates this problem. To make the machine appear more impressive to reporters during its unveiling, a team member (possibly Eckert) puts translucent spheres (halved ping-pong balls) over the lights. The US patent office will later recognize this as the first computer. The next year scientists employed by Bell Labs complete work on the transistor (John Bardeen, Walter Brattain and William Shockley receive the Nobel Prize in Physics for it in 1956), and by 1948 teams around the world work on a "stored program" machine. The first, nicknamed "Baby", is a prototype of a much larger machine under construction in Britain and is shown in June 1948. The impetus over the next five years for advances in computers comes mostly from the government and military.
UNIVAC, delivered in 1951 to the Census Bureau, results in a tremendous financial loss to its manufacturer, Remington-Rand. The next year Grace Hopper, now an employee of that company, proposes "reusable software": code segments that could be extracted and assembled according to instructions in a "higher level language." The concept of compiling is born. Hopper would revise this concept over the next twenty years, and her ideas would become an integral part of all modern computers. CBS uses one of the 46 UNIVAC computers produced to predict the outcome of the 1952 Presidential Election. They do not air the prediction for three hours because they do not trust the machine. IBM introduces the 701 the following year. It is the first commercially successful computer. In 1956 FORTRAN is introduced (proposed in 1954, it takes nearly three years to develop the compiler). Two additional languages, LISP and COBOL, are added in 1957 and 1958. Other early languages include ALGOL and BASIC. Although never widely used, ALGOL is the basis for many of today's languages.

With the introduction of Control Data's CDC 1604 in 1958, the first transistor-powered computer, a new age dawns. Brilliant scientist Seymour Cray heads the development team. This year integrated circuits are introduced by two men, Jack Kilby and Robert Noyce, working independently. The second network is developed at MIT. Over the next three years computers begin affecting the day-to-day lives of most Americans. The addition of MICR characters at the bottom of checks is common. In 1961 Fairchild Semiconductor introduces the first commercial integrated circuit. Within ten years all computers use these instead of the transistor. Formerly building-sized computers are now room-sized, and are considerably more powerful. The following year the Atlas becomes operational, displaying many of the features that make today's systems so powerful, including virtual memory, pipelined instruction execution and paging. Designed at the University of Manchester, some of the people who developed Colossus twenty years earlier make contributions. On April 7, 1964, IBM introduces the System/360. While a technical marvel, the main feature of this machine is business oriented: IBM guarantees the "upward compatibility" of the system, reducing the risk that a business would invest in outdated technology. Dartmouth College, where the first network was demonstrated 25 years earlier, moves to the forefront of the "computer age" with the introduction of TSS (Time Share System), a crude (by today's standards) networking system. It is the first wide area network. In three years Randy Golden, President and Founder of Golden Ink, would begin working on this network. Within a year MIT returns to the top of the intellectual computer community with the introduction of a greatly refined network that features shared resources and uses the first minicomputer (DEC's PDP-8) to manage telephone lines. Bell Labs and GE play major roles in its design.
In 1969 Bell Labs, unhappy with the direction of the MIT project, leaves and develops its own operating system, UNIX. One of the many precursors to today's Internet, ARPANet, is quietly launched. Alan Kay, who will later become a designer for Apple, proposes the "personal computer." Also in 1969, unhappy with Fairchild Semiconductor, a group of technicians begins discussing forming their own company. This company, formed the next year, would be known as Intel. The movie Colossus: The Forbin Project has a supercomputer as the villain. The next year, The Computer Wore Tennis Shoes is the first feature-length movie with the word "computer" in the title. In 1971, Texas Instruments introduces the first "pocket calculator." It weighs 2.5 pounds. With the country embroiled in a crisis of confidence known as Watergate, in 1973 a little-publicized judicial decision takes the patent for the computer away from Mauchly and Eckert and awards it to Atanasoff. Xerox introduces the mouse. Proposals are made for the first local area networks.

In 1975 the first personal computer is marketed in kit form. The Altair features 256 bytes of memory. Bill Gates, with others, writes a BASIC interpreter for the machine. The next year Apple begins to market PCs, also in kit form, to which buyers add their own monitor and keyboard. The earliest RISC platforms become stable. In 1976, Queen Elizabeth goes on-line with the first royal email message. During the next few years the personal computer explodes on the American scene. Microsoft, Apple and many smaller PC-related companies form (and some die). By 1977 stores begin to sell PCs. Continuing today, companies strive to reduce the size and price of PCs while increasing capacity. Entering the fray, IBM introduces its PC in 1981 (it is actually IBM's second attempt, but the first failed miserably). Time selects the computer as its Machine of the Year for 1982. Tron, a computer-generated special-effects extravaganza, is released the same year.

Hardware
Hardware is a general term that refers to the physical artifacts of a technology. It may also mean the physical components of a computer system, in the form of computer hardware. Hardware historically meant the metal parts and fittings that were used to make wooden products stronger, more functional, longer lasting and easier to fabricate or assemble.

Modern hardware stores typically sell equipment such as keys, locks, hinges, latches, corners, handles, wire, chains, plumbing supplies, tools, utensils, cutlery and machine parts, especially when they are made of metal. In a more colloquial sense, hardware can refer to military equipment, such as tanks, aircraft, ships or munitions; in the case of vehicles, such equipment may instead be referred to as armour. In slang, the term can also refer to trophies and other physical representations of awards.

Software
Computer software, or just software, is a general term used to describe a collection of computer programs, procedures and documentation that perform some task on a computer system. The term includes:
Application software, such as word processors, which performs productive tasks for users.
Firmware, which is software resident in electrically programmable memory devices on mainboards or other integrated hardware carriers.
Middleware, which controls and co-ordinates distributed systems.
System software, such as operating systems, which interfaces with hardware to provide the necessary services for application software.

Software testing is a domain independent of development and programming. It consists of various methods to test and declare a software product fit for use before it is launched for an individual or a group. Modern testers conduct many tests of functionality, performance and appearance, using tools such as QTP and LoadRunner and techniques such as black-box testing, to check the developed code against a checklist of requirements. ISTQB is a certification in demand for engineers who want to pursue a career in testing.[2] Testware is an umbrella term for all utilities and application software that serve in combination for testing a software package but do not necessarily contribute to operational purposes; as such, testware is not a standing configuration but merely a working environment for application software or subsets thereof. Software includes websites, programs, video games, etc. that are coded in programming languages like C and C++. "Software" is sometimes used in a broader context to mean anything which is not hardware but which is used with hardware, such as film, tapes and records.

NI History
From the beginning of modern nursing, data from standardized patient records were seen as a potentially powerful resource for assessing and improving the quality of care. As nursing informatics began to evolve in the second half of the 20th century, the lack of standards for language and data limited the functionality and usefulness of early applications. In response, nurses developed standardized languages, but until the turn of the century, neither they nor anyone else understood the attributes required to achieve computability and semantic interoperability. Collaboration across disciplines and national boundaries has led to the development of standards that meet these requirements, opening the way for powerful information tools. Many challenges remain, however. Realizing the potential of nurses to transform and improve health care and outcomes through informatics will require fundamental changes in individuals, organizations, and systems. Nurses are developing and applying informatics methods and tools to discover knowledge and improve health from the molecular to the global level and are seeking the collective wisdom of interdisciplinary and interorganizational collaboration to effect the necessary changes. NOTE: Although this article focuses on nursing informatics in the United States, nurses around the world have made substantial contributions to the field. This article alludes to a few of those advances, but a comprehensive description is beyond the scope of the present work.
