Computer Science Short Notes
I INTRODUCTION
Computer Science, study of the theory, experimentation, and engineering that form
the basis for the design and use of computers—devices that automatically process information.
Computer science traces its roots to work done by English mathematician Charles Babbage,
who first proposed a programmable mechanical calculator in 1837. Until the advent of
electronic digital computers in the 1940s, computer science was not generally distinguished as
being separate from mathematics and engineering. Since then it has sprouted numerous
branches of specialized research. Early work in the field focused on automating the process of
making calculations for use in science and engineering.
Scientists and engineers developed theoretical models of computation that enabled them to
analyze how efficient different approaches were in performing various calculations. Computer
science overlapped considerably during this time with the branch of mathematics known as
numerical analysis, which examines the accuracy and precision of calculations (see ENIAC;
UNIVAC).
As the use of computers expanded between the 1950s and the 1970s, the focus of computer
science broadened to include the development of operating systems—computer programs that
provide a useful interface between a computer and a user.
During this time, computer scientists were also experimenting with new applications and
computer designs, creating the first computer networks, and exploring relationships between
computation and thought. In the 1970s chip manufacturers began to mass-produce the
microprocessor—the electronic circuitry that serves as the main information processing center
in a computer.
This new technology revolutionized the computer industry by dramatically reducing the cost of
building computers and greatly increasing their processing speed. The microprocessor made
possible the advent of the personal computer, which resulted in an explosion in the use of
computer applications. Between the early 1970s and 1980s, computer science rapidly
expanded in an effort to develop new applications for personal computers and to drive the
technological advances in the computing industry. Much of the earlier research that had been
done began to reach the public through personal computers, which derived most of their early
software from existing research. Computer scientists continue to expand the frontiers of
computer and information systems by pioneering the designs of more complex, reliable, and
powerful computers; enabling networks of computers to exchange information efficiently; and
seeking ways to make computers behave intelligently. As computers have become an
integral part of modern society, computer scientists strive to solve new problems and invent
better methods of solving current ones.
The goals of computer science range from finding ways to better educate people in the
use of existing computers to highly speculative research into technologies and approaches
that may not be viable for decades. Underlying all of these specific goals is the desire to better
the human condition today and in the future through the improved use of information.
Computer science combines theory and experiment: a researcher develops a theory, designs
computer hardware and software based on that theory, and experimentally tests it. An
example of such research is the development of new software engineering
tools that are then evaluated in actual use. In other cases, experimentation may result in new
theory, such as the discovery that an artificial neural network exhibits behavior similar to
neurons in living brains. Some observers once considered experimentation in computer science
unnecessary because the outcome of experiments should be known in advance. But when
computer systems and their interactions with the natural world become sufficiently complex,
unforeseen behaviors can result. Experimentation and the traditional scientific method are
thus key parts of computer science. The major branches of the field are software development,
computer architecture (hardware), human-computer interfacing (the design of the most
efficient ways for humans to use computers), and artificial intelligence (the attempt to make
computers behave intelligently). Software development is concerned with creating computer
programs that perform efficiently. Computer architecture is concerned with developing optimal
hardware for specific computational needs. The areas of artificial intelligence (AI) and human-
computer interfacing often involve the development of both software and hardware to solve
specific problems.
A Software Development
In developing computer software, computer scientists and engineers study various
areas and techniques of software design, such as the best types of programming languages
and algorithms (see below) to use in specific programs, and how to efficiently store and retrieve
information. Software designers must consider many factors when developing a program. Often, program
performance in one area must be sacrificed for the sake of the general performance of the
software. For instance, since computers have only a limited amount of memory, software
designers must limit the number of features they include in a program so that it will not
require more memory than the system it is designed for can supply.
In the branch known as software engineering, computer scientists
and engineers study methods and tools that facilitate the efficient development of correct,
reliable, and robust computer programs. Research in this branch of computer science
considers all the phases of the software life cycle, which begins with a formal problem
specification, and progresses to the design of a solution, its implementation as a program,
testing of the program, and program maintenance. Software engineers develop software tools
that automate or simplify parts of this process. For example, tools can help to manage the many
components of a large program that is being developed by a team of programmers.
Algorithms and data structures are the building blocks of computer programs. An
algorithm is a precise step-by-step procedure for solving a problem within a finite time and
using a finite amount of memory. Common algorithms include searching a collection of data,
sorting data, and numerical operations such as matrix multiplication. Data structures are
patterns for organizing information, and often represent relationships between data values.
Some common data structures are called lists, arrays, records, stacks, queues, and trees.
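To make these ideas concrete, the following short C++ sketch (written for these notes; the function name and data are invented for the example) pairs a simple data structure, a sorted array, with a classic algorithm, binary search, which repeatedly halves the portion of the array that could contain the target:

#include <iostream>
#include <vector>

// Binary search: return the index of target in the sorted vector v,
// or -1 if it is absent. Each step halves the remaining search range,
// so the loop runs only about log2(n) times for n elements.
int binarySearch(const std::vector<int>& v, int target) {
    int low = 0, high = (int)v.size() - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;
        if (v[mid] == target) return mid;
        if (v[mid] < target) low = mid + 1;   // discard the lower half
        else high = mid - 1;                  // discard the upper half
    }
    return -1;
}

int main() {
    std::vector<int> data = {2, 3, 5, 7, 11, 13, 17};   // a sorted array
    std::cout << binarySearch(data, 11) << std::endl;   // prints 4
    return 0;
}

A linear scan of the same array might examine every element, while binary search examines only a handful; quantifying such differences is exactly what the study of algorithmic complexity, described next, is about.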
Computer scientists continue to develop new algorithms and data structures to solve
new problems and improve the efficiency of existing programs. One area of theoretical
research is called algorithmic complexity. Computer scientists in this field seek to develop
techniques for determining the inherent efficiency of algorithms with respect to one another.
Another area of theoretical research called computability theory seeks to identify the inherent
limits of computation.
A programming language is a notation for writing the instructions that a computer will carry
out. Natural languages such as English are ambiguous—meaning that their grammatical
structure and vocabulary can be interpreted in multiple ways—so they are not suited for
programming. Instead, simple and unambiguous artificial languages are used. Computer
scientists study ways of making programming languages more expressive, thereby simplifying
programming. A program written in any of these languages must be
translated into machine language (the actual instructions that the computer follows).
Computer scientists also develop better translation algorithms that produce more efficient
programs. A database is an organized collection of information. Computer scientists study
ways to organize the data efficiently, prevent access by unauthorized users, and improve
access speed. They are also interested in
developing techniques to compress the data, so that more can be stored in the same amount
of memory. Databases are sometimes distributed over multiple computers that update the
data simultaneously, which can lead to inconsistency in the stored information. To address this
problem, computer scientists also study ways of preventing inconsistency without reducing
access speed.
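As a small, hedged illustration of the compression idea (the scheme and names are chosen for this example; real database systems use far more sophisticated methods), the C++ sketch below applies run-length encoding, which replaces a run of repeated characters with a count followed by one copy of the character:

#include <iostream>
#include <string>

// Run-length encoding: "aaaabbbcc" becomes "4a3b2c".
// Data containing long runs of repeated values shrinks substantially.
std::string runLengthEncode(const std::string& input) {
    std::string output;
    for (size_t i = 0; i < input.size(); ) {
        size_t run = 1;
        while (i + run < input.size() && input[i + run] == input[i]) {
            ++run;   // count how far the repeated character extends
        }
        output += std::to_string(run) + input[i];
        i += run;
    }
    return output;
}

int main() {
    std::cout << runLengthEncode("aaaabbbcc") << std::endl;   // 4a3b2c
    return 0;
}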
Information retrieval is concerned with locating data in collections that are not clearly
organized, such as a file of newspaper articles. Computer scientists develop algorithms for
creating indexes of the data. Once the information is indexed, techniques developed for
databases can be used to organize it. Data mining is a closely related field in which a large
body of information is analyzed to identify patterns. For example, mining the sales records
from a grocery store could identify shopping patterns to help guide the store in stocking its
shelves.
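A minimal sketch of the indexing idea (identifiers invented for this example): an inverted index maps each word to the list of documents containing it, so a query becomes a single lookup rather than a scan of every article.

#include <iostream>
#include <map>
#include <sstream>
#include <string>
#include <vector>

int main() {
    std::vector<std::string> docs = {
        "computers process information",
        "networks connect computers"
    };
    // Build the inverted index: word -> numbers of the documents containing it.
    std::map<std::string, std::vector<int>> index;
    for (int id = 0; id < (int)docs.size(); ++id) {
        std::istringstream words(docs[id]);
        std::string w;
        while (words >> w) index[w].push_back(id);
    }
    // Answer a query by lookup instead of scanning every document.
    for (int id : index["computers"]) {
        std::cout << "found in document " << id << std::endl;   // 0 and 1
    }
    return 0;
}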
Operating systems are programs that control the overall functioning of a computer.
They provide the user interface, place programs into the computer’s memory and cause it to
execute them, control the computer’s input and output devices, manage the computer’s
resources such as its disk space, protect the computer from unauthorized use, and keep stored
data secure. Computer scientists are interested in making operating systems easier to use,
more secure, and more efficient by developing new user interface designs, designing new
mechanisms that allow data to be shared while preventing access to sensitive data, and
developing algorithms that make more effective use of the computer’s time and memory.
Numerical computation involves performing complex mathematical
calculations, often on large sets of data or with high precision. Because many of these
computations may take days or months to execute, computer scientists are interested in
making the calculations as efficient as possible. They also explore ways to increase the
numerical precision of computations, which can have such effects as improving the accuracy of
a weather forecast. The goals of improving efficiency and precision often conflict, with greater
precision tending to reduce efficiency. Symbolic computation involves the manipulation of
nonnumeric symbols, such as characters, words, drawings, algebraic expressions, encrypted
data (data coded to prevent
unauthorized access), and the parts of data structures that represent relationships between
values (see Encryption). One unifying property of symbolic programs is that they often lack the
regular patterns of processing found in many numerical computations.
B Computer Architecture
Computer architecture is the design and analysis of new computer systems. Computer
architects study ways of improving computers by increasing their speed, storage capacity, and
reliability, and by reducing their cost and power consumption. Computer architects develop
both software and hardware models to analyze the performance of existing and proposed
computer designs, then use this analysis to guide development of new computers. They are
often involved with the engineering of a new computer because the accuracy of their models
depends on the design of the computer’s circuitry. Many computer architects are interested in
developing computers that are specialized for particular applications such as image
processing. Tailoring the
computer architecture to specific tasks often yields higher performance, lower cost, or both.
C Artificial Intelligence
Artificial intelligence (AI) research seeks to enable computers and machines to mimic human
intelligence and sensory processing ability, and models human behavior with computers to
improve our understanding of intelligence. The many branches of AI research include machine
reasoning, natural language understanding, speech recognition, computer vision, and artificial
neural networks.
A key technique developed in the study of artificial intelligence is to specify a problem as a set
of states, some of which are solutions, and then search for solution states. For example, in
chess, each move creates a new state. If a computer searched the states resulting from all
possible sequences of moves, it could identify those that win the game. However, the number
of states associated with many problems (such as the possible number of moves needed to
win a chess game) is so vast that exhaustively searching them is impractical. The search
process can be improved through the use of heuristics—rules that are specific to a given
problem and can therefore help guide the search. For example, a chess heuristic might
indicate that when a move results in checkmate, there is no point in examining alternate
moves.
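The sketch below is a toy C++ illustration of this approach (the puzzle is invented for these notes and is far simpler than chess): each state is a number, the moves from a state are "add 1" and "double," and the solution states are those equal to a target. The heuristic is that a state larger than the target can never lead to a solution, so it is pruned rather than searched further.

#include <iostream>

// Depth-first search of a toy state space. Returns the fewest moves
// needed to turn value v into target, or -1 if none within the limit.
// Heuristic: no move decreases v, so any state above the target is
// pruned immediately instead of being expanded.
int search(int v, int target, int depth, int limit) {
    if (v == target) return depth;                 // a solution state
    if (v > target || depth == limit) return -1;   // heuristic cutoff
    int best = -1;
    int moves[2] = {v + 1, v * 2};                 // the two successor states
    for (int next : moves) {
        int result = search(next, target, depth + 1, limit);
        if (result != -1 && (best == -1 || result < best)) best = result;
    }
    return best;
}

int main() {
    // Reaching 10 from 1 takes four moves, e.g. 1 -> 2 -> 4 -> 5 -> 10.
    std::cout << search(1, 10, 0, 10) << std::endl;   // prints 4
    return 0;
}

Without the pruning rule the search would expand many useless states; with it, whole branches of the state space are never visited, which is precisely the benefit heuristics provide.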
D Robotics
Another area of computer science that has found wide practical use is robotics—the design and
development of computer-controlled mechanical devices. Robots range in complexity from
toys to automated factory assembly lines, and relieve humans from tedious, repetitive, or
dangerous tasks. Robots are also employed where requirements of speed, precision,
consistency, or cleanliness exceed what humans can accomplish. Roboticists—scientists
involved in the field of robotics—study the many aspects of controlling robots. These aspects
include modeling the robot’s physical properties, modeling its environment, planning its
actions, directing its mechanisms efficiently, using sensors to provide feedback to the
controlling program, and ensuring the safety of its behavior. They also study ways of
simplifying the creation of control programs. One area of research seeks to provide robots with
more of the dexterity and adaptability of humans, and is closely associated with AI.
E Human-Computer Interfacing
Human-computer interfaces provide the means for people to use computers. An example of a
human-computer interface is the keyboard, which lets humans enter commands into a
computer and enter text into a specific application. The diversity of research into human-
computer interfacing corresponds to the diversity of users and applications, and includes the
development of new interfaces and the evaluation of their effectiveness. Examples include
improving computer access for people with
disabilities, simplifying program use, developing three-dimensional input and output devices
for virtual reality, improving handwriting and speech recognition, and developing heads-up
displays for aircraft instruments in which critical information such as speed, altitude, and
heading is displayed on a screen in front of the pilot’s window. One area of research, called
visualization, is concerned with graphically presenting large amounts of data so that people
can readily comprehend them.
V OTHER DISCIPLINES
Because computer science grew out of mathematics and electrical engineering, it retains many close
connections to those disciplines. Theoretical computer science draws many of its approaches
from mathematics and logic. Research in numerical computation overlaps with mathematics
research in numerical analysis. Computer architects work closely with the electrical engineers
who design the circuits of new computers.
Beyond these historical connections, there are strong ties between AI research and
psychology, since both fields study the nature of intelligence, and robotics connects computer
science with both mechanical engineering and physiology.
Computer science also has indirect relationships with virtually all disciplines that use
computers. Applications developed in other fields often involve collaboration with computer
scientists, who contribute their knowledge of algorithms, data structures, software
engineering, and existing technology. In return, the computer scientists have the opportunity
to observe novel applications of computers, from which they gain a deeper insight into their
use. These relationships make computer science a highly interdisciplinary field of study.
Contributed By:
Charles C. Weems
Computer
I INTRODUCTION
Computer, machine that performs tasks, such as calculations or electronic
communication, under the control of a set of instructions called a program. Programs usually
reside within the computer and are retrieved and processed by the computer’s electronics. The
program results are stored or routed to output devices, such as video display monitors or
printers. Computers perform a wide variety of activities reliably, accurately, and quickly.
II USES OF COMPUTERS
People use computers in many ways. In business, computers track inventories with bar
codes and scanners, check the credit status of customers, and transfer funds electronically. In
homes, tiny computers embedded in the electronic circuitry of most appliances control the
indoor temperature, operate home security systems, tell the time, and turn videocassette
recorders (VCRs) on and off. Computers in automobiles regulate the flow of fuel, thereby
increasing gas mileage, and are used in anti-theft systems. Computers also entertain, creating
digitized sound on stereo systems or computer-animated features from a digitally encoded
laser disc. Computer programs, or applications, exist to aid every level of education, from
programs that teach simple addition or sentence construction to programs that teach
advanced calculus. Educators use computers to track grades and communicate with students;
with computer-controlled projection units, they can add graphics, sound, and animation to
their presentations (see Computer-Aided Instruction). Computers are used extensively in
scientific research to solve mathematical problems, display complicated data, or model
systems that are too costly or impractical to build, such as testing the air flow around the next
generation of space shuttles. The military employs computers in sophisticated communications to
encode and unscramble messages, and to keep track of personnel and supplies.
III HOW COMPUTERS WORK
The physical components of a computer are known as hardware. Computer
hardware includes the memory that stores data and program instructions; the central
processing unit (CPU) that carries out program instructions; the input devices, such as a
keyboard or mouse, that allow the user to communicate with the computer; the output
devices, such as printers and video display monitors, that enable the computer to present
information to the user; and buses (hardware lines or wires) that connect these and other
computer components. The programs that run the computer are called software. Software
generally is designed to perform a particular type of task—for example, to control the arm of a
robot to weld a car’s body, to write a letter, to display and modify a photograph, or to direct
the general workings of the computer.
A Operating System
When a computer is turned on, it searches for instructions in its memory. These instructions
tell the computer how to start up. Usually, one of the first sets of these instructions is a special
program called the operating system, which is the software that makes the computer work. It
prompts the user (or other machines) for input and commands, reports the results of these
commands and other operations, stores and manages data, and controls the sequence of the
software and hardware actions. When the user requests that a program run, the operating
system loads the program in the computer’s memory and runs the program. Popular operating
systems, such as Microsoft Windows and the Macintosh system (Mac OS), have graphical user
interfaces (GUIs), which use tiny pictures, or icons, to represent various files and commands. To
access these files or commands, the user clicks the mouse on the icon or presses a
combination of keys on the keyboard. Some operating systems allow the user to carry out
several tasks at once, running multiple programs simultaneously.
B Computer Memory
To process information electronically, data are stored in a computer in the form of
binary digits, or bits, each having two possible representations (0 or 1). If a second bit is added
to a single bit of information, the number of combinations doubles, resulting in four
possible combinations: 00, 01, 10, or 11. A third bit added to this two-bit representation again
doubles the number of combinations, resulting in eight possibilities: 000, 001, 010, 011, 100,
101, 110, or 111. Each time a bit is added, the number of possible patterns is doubled. Eight
bits is called a byte; a byte has 256 possible combinations of 0s and 1s (see also Expanded
Memory; Extended Memory). A byte is a useful quantity for storing information because it provides enough
possible patterns to represent the entire alphabet, in lower and upper cases, as well as
numeric digits, punctuation marks, and several character-sized graphics symbols, including
non-English characters such as π . A byte also can be interpreted as a pattern that represents
a number between 0 and 255. A kilobyte—1,024 bytes—can store about 1,000 characters; a
megabyte can store about 1 million characters; a gigabyte can store about 1 billion characters;
and a terabyte can store about 1 trillion characters. Computer programmers usually decide
how a given byte should be interpreted—that is, as a single character, a character within a
string of text, a single number, or part of a larger number. Numbers, in turn, can represent
anything from simple quantities to codes for letters, colors, or sounds.
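The doubling rule and the different interpretations of a byte are easy to verify with a short program; this C++ sketch was written for these notes and is purely illustrative:

#include <iostream>

int main() {
    // Each added bit doubles the number of possible patterns: 2 to the n.
    for (int bits = 1; bits <= 8; ++bits) {
        std::cout << bits << " bits -> " << (1 << bits)
                  << " patterns" << std::endl;
    }
    // The same byte can be read as a number or as a character code.
    unsigned char byte = 65;
    std::cout << (int)byte << " interpreted as a character: "
              << byte << std::endl;   // prints 65 and A
    return 0;
}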
The physical memory of a computer is either random access memory (RAM), which can
be read or changed by the user or computer, or read-only memory (ROM), which can be read
by the computer but not altered in any way. One way to store memory is within the circuitry of
the computer, usually in tiny computer chips that hold millions of bytes of information. The
memory within these computer chips is RAM. Memory also can be stored outside the circuitry
of the computer on external storage devices, such as magnetic floppy disks, which can store
about 2 megabytes of information; hard drives, which can store gigabytes of information;
compact discs (CDs), which can store up to 680 megabytes of information; and digital video
discs (DVDs), which can store 8.5 gigabytes of information. A single CD can store nearly as
much information as several hundred floppy disks, and some DVDs can hold more than 12
times as much information as a CD.
C The Bus
The bus enables the components in a computer, such as the CPU and the memory
circuits, to communicate as program instructions are being carried out. The bus is usually a flat
cable with numerous parallel wires. Each wire can carry one bit, so the bus can transmit many
bits along the cable at the same time. For example, a 16-bit bus, with 16 parallel wires, allows
the simultaneous transmission of 16 bits (2 bytes) of information from one component to
another. Early computer designs utilized a single or very few buses. Modern designs typically
use many buses, some of them specialized to carry particular forms of data, such as graphics.
D Input Devices
Input devices, such as a keyboard or mouse, permit the computer user to
communicate with the computer. Other input devices include a joystick, a rodlike device often
used by people who play computer games; a scanner, which converts images such as
photographs into digital images that the computer can manipulate; a touch panel, which
senses the placement of a user’s finger and can be used to execute commands or access files;
and a microphone, used to input sounds such as the human voice which can activate computer
commands in conjunction with voice recognition software. “Tablet” computers are being
developed that allow users to interact with their screens using a penlike device.
E The Central Processing Unit
Information entered through an input device or retrieved from memory travels along
the bus to the central processing unit (CPU), which is the part of the computer that translates
commands and runs programs. The CPU is a microprocessor chip—that is, a single piece of
silicon containing millions of microscopic electrical components. Information is
stored in a CPU memory location called a register. Registers can be thought of as the CPU’s
tiny scratchpad, temporarily storing instructions or data. When a program is running, one
special register called the program counter keeps track of which program instruction comes
next by maintaining the memory location of the next program instruction to be executed. The
CPU’s control unit coordinates and times the CPU’s functions, and it uses the program counter
to locate and retrieve the next instruction from memory.
In a typical sequence, the CPU locates the next instruction in the appropriate memory
device. The instruction then travels along the bus from the computer’s memory to the CPU,
where it is stored in a special instruction register. Meanwhile, the program counter changes—
usually increasing a small amount—so that it contains the location of the instruction that will
be executed next. The current instruction is analyzed by a decoder, which determines what the
instruction will do. Any data the instruction needs are retrieved via the bus and placed in the
CPU’s registers. The CPU executes the instruction, and the results are stored in another
register or copied to specific memory locations via a bus. This entire sequence of steps is
called an instruction cycle. Often, more than one instruction may be processed at a time,
each at a different stage in its instruction cycle. This is called pipeline processing.
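A minimal sketch of the fetch-decode-execute cycle just described, using a made-up three-instruction machine (the opcodes and names are invented for illustration, not taken from any real CPU): a program counter selects each instruction, a decoder dispatches on its opcode, and a register holds intermediate results.

#include <iostream>
#include <vector>

// A made-up machine with three opcodes acting on a single register.
enum Opcode { LOAD, ADD, PRINT };
struct Instruction { Opcode op; int operand; };

int main() {
    // The "program" held in memory: load 5, add 7, print the result.
    std::vector<Instruction> memory = { {LOAD, 5}, {ADD, 7}, {PRINT, 0} };
    int accumulator = 0;   // a register holding intermediate results
    int pc = 0;            // the program counter

    while (pc < (int)memory.size()) {
        Instruction inst = memory[pc];   // fetch the next instruction
        ++pc;                            // counter now points past it
        switch (inst.op) {               // decode, then execute
            case LOAD:  accumulator = inst.operand;  break;
            case ADD:   accumulator += inst.operand; break;
            case PRINT: std::cout << accumulator << std::endl; break;   // 12
        }
    }
    return 0;
}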
F Output Devices
Once the CPU has executed the program instruction, the program may request that the
information be communicated to an output device, such as a video display monitor or a flat
liquid crystal display. Other output devices are printers, overhead projectors, videocassette
recorders (VCRs), and speakers.
IV PROGRAMMING LANGUAGES
Programming languages contain the series of commands that create software. A CPU
has a limited set of instructions known as machine code that it is capable of understanding.
The CPU can understand only this language. All other programming languages must be
translated into machine code before the computer can carry them out, but they are generally
easier to use. These other languages are slower because the language must be translated first
so that the computer can understand it. The translation can lead to code that may be less
efficient than code written directly in machine language.
A Machine Language
Computer programs that can be run directly by a computer’s operating system consist of
instructions known as machine code. These instructions are specific to the individual computer’s CPU and associated
hardware; for example, Intel Pentium and Power PC microprocessor chips each have different
machine languages and require different sets of codes to perform the same task. Machine
code instructions are few in number (roughly 20 to 200, depending on the computer and the
CPU). Typical instructions are for copying data from a memory location or for adding the
contents of two memory locations (usually registers in the CPU). Complex tasks require a
sequence of these simple instructions. Machine code instructions are binary—that is,
sequences of bits (0s and 1s). Because these sequences are long strings of 0s and 1s and are
usually not easy to understand, computer instructions usually are not written in machine code.
Instead, programmers write code in an assembly language or a high-level language.
B Assembly Language
Assembly language uses easy-to-remember commands that are more understandable
to programmers than machine-language commands. Each machine-language instruction has
an equivalent command in assembly language. For example, in one Intel assembly language,
the statement “MOV A, B” instructs the computer to copy data from location A to location B.
The same instruction in machine code is a string of 16 0s and 1s. Once an assembly-language
program is written, it is converted to a machine-language program by another program called
an assembler.
Assembly language is fast and powerful because of its close correspondence with machine
language, but it is harder to use than high-level languages, and
different CPUs use different machine languages and therefore require different
assembly-language programs as well. Assembly language is sometimes inserted into a high-level
language program to carry out specific hardware tasks or to speed up parts of the high-level
program that are executed frequently.
C High-Level Languages
High-level languages were developed because of the difficulty of programming using
assembly languages. High-level languages are easier to use than machine and assembly
languages because their commands are closer to natural human language. In addition, these
languages are not CPU-specific. Instead, they contain general commands that work on
different CPUs. For example, a programmer writing in the high-level C++ programming
language who wants to display a greeting need include only the following command:
cout << "Hello, World!" << endl;
This command directs the computer’s CPU to display the greeting, and it will work no
matter what type of CPU the computer uses. When this statement is executed, the text that
appears between the quotes will be displayed. Although the “cout” and “endl” parts of the
above statement appear cryptic, programmers quickly become accustomed to their meanings.
For example, “cout” sends the greeting message to the “standard output” (usually the
computer user’s screen) and “endl” tells the computer (when using the C++
language) to begin a new line after displaying the message. Like assembly-language
instructions, high-level-language instructions must also be translated. This is the task of a special program
called a compiler. A compiler turns a high-level program into a CPU-specific machine language.
For example, a programmer may write a program in a high-level language such as C++ or Java
and then prepare it for different machines, such as a Sun Microsystems work station or a
personal computer (PC), using compilers designed for those machines. This simplifies the
programmer’s task and makes the software more portable to different users and machines.
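For example, the same source file might be compiled on two different systems with whichever compiler each provides (the file name here is hypothetical):

g++ hello.cpp -o hello
clang++ hello.cpp -o hello

Each compiler produces machine code for its own CPU and operating system, while the C++ source file itself remains unchanged.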
V FLOW-MATIC
American naval officer and mathematician Grace Murray Hopper helped develop the first
commercially oriented high-level language, FLOW-MATIC, in the 1950s. Hopper is also
credited with inventing the term bug, which indicates a computer malfunction; in 1945 she
discovered a hardware failure in the Mark II computer caused by a moth trapped between its
mechanical relays. She documented the event in her laboratory notebook, and the term
eventually came to represent any computer error, including one based strictly on incorrect
instructions in software. Hopper taped the moth into her notebook and wrote, “First actual
case of bug being found.”
VI FORTRAN
From 1954 to 1958 American computer scientist John Backus of International Business
Machines, Inc. (IBM) developed Fortran, an acronym for Formula Translation. It became a
standard programming language because it could process mathematical formulas. Fortran and
its variants are still in use today.
VII BASIC
Hungarian-American mathematician John Kemeny and American mathematician
Thomas Kurtz at Dartmouth College in Hanover, New Hampshire, developed BASIC (Beginner’s
All-purpose Symbolic Instruction Code) in 1964. The language was easier to learn than its
predecessors and became popular due to its friendly, interactive nature and its inclusion on
early personal computers. Unlike languages that require all their instructions to be translated
into machine code first, BASIC is turned into machine language line by line as the program
runs. BASIC commands typify high-level languages because of their simplicity and their
closeness to natural human language. For example, a program that divides a number in half
can be written as
10 INPUT "ENTER A NUMBER"; X
20 Y=X/2
30 PRINT Y
The numbers that precede each line are chosen by the programmer to indicate the
sequence of the commands. The first line prints "ENTER A NUMBER" on the computer screen
followed by a question mark to prompt the user to type in the number labeled “X.” In the next
line, that number is divided by two and stored as “Y.” In the third line, the result of the
operation is displayed on the computer screen. Even though BASIC is rarely used today, this
simple program demonstrates how data are stored and manipulated in most high-level
programming languages.
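For comparison, a sketch of the same program in C++ (written for these notes) shows the same pattern of prompting, storing, and manipulating data in another high-level language:

#include <iostream>

int main() {
    double x;
    std::cout << "ENTER A NUMBER? ";   // prompt the user
    std::cin >> x;                     // store the input as x
    double y = x / 2;                  // divide the number in half
    std::cout << y << std::endl;       // display the result
    return 0;
}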
VIII OTHER HIGH-LEVEL LANGUAGES
Other high-level languages in use today include C, C++,
Visual Basic, and Java. Some languages, such as the “markup languages” known as HTML,
XML, and their variants, are intended to display data, graphics, and media selections,
especially for users of the World Wide Web; they are often not considered
traditional high-level languages. Object-oriented languages, such as C++ and Java, enable a
programmer to think in terms of collections of cooperating objects instead of lists of
commands. Objects, such as a circle, have properties
such as the radius of the circle and the command that draws it on the computer screen.
Classes of objects can inherit features from other classes of objects. For example, a class
defining squares can inherit features such as right angles from a class defining rectangles. This
set of programming classes simplifies the programmer’s task, resulting in more “reusable”
computer code. Reusable code allows a programmer to use code that has already been
designed, written, and tested. This makes the programmer’s task easier, and it results in more
reliable software.
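A brief C++ sketch of this inheritance idea (the class names are chosen for the example): a Square class inherits its area computation from a Rectangle class and merely constrains the sides to be equal.

#include <iostream>

// A class defining rectangles: a width, a height, and an area.
class Rectangle {
public:
    Rectangle(double w, double h) : width(w), height(h) {}
    double area() const { return width * height; }   // reusable code
protected:
    double width, height;
};

// A square is a rectangle whose sides are equal; it inherits the
// area() computation from Rectangle rather than redefining it.
class Square : public Rectangle {
public:
    explicit Square(double side) : Rectangle(side, side) {}
};

int main() {
    Square s(3.0);
    std::cout << s.area() << std::endl;   // prints 9, via inherited code
    return 0;
}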
IX TYPES OF COMPUTERS
A Digital and Analog
Computers can be either digital or analog. Virtually all modern computers are digital.
Digital refers to the processes in computers that manipulate binary numbers (0s or 1s), which
represent switches that are turned on or off by electrical current. A bit can have the value 0 or
the value 1, but nothing in between 0 and 1. Analog refers to circuits or numerical values that
have a continuous range. Both 0 and 1 can be represented by analog computers, but so can
any value in between.
A desk lamp can serve as an example of the difference between analog and digital. If the lamp
has a simple on/off switch, then the lamp system is digital, because the lamp either produces
light at a given moment or it does not. If a dimmer replaces the on/off switch, then the lamp is
analog, because the amount of light can vary continuously from on to off and all intensities in
between.
Analog computer systems were the first type to be produced. A popular analog
computer used in the 20th century was the slide rule. To perform calculations with a slide rule,
the user slides a narrow, gauged wooden strip inside a rulerlike holder. Because the sliding is
continuous and there is no mechanism to stop at any exact values, the slide rule is analog.
New interest has been shown recently in analog computers, particularly in areas such as
neural networks. These are specialized computer designs that attempt to mimic neurons of the
brain. They can be built to respond to continuous electrical signals. Most modern computers,
however, are digital machines whose components have a finite number of states—for example,
the 0 or 1, or on or off bits. These bits can be combined to denote information such as
numbers, letters, and images. The smallest computers are embedded within
the circuitry of appliances, such as televisions and wristwatches. These computers are typically
preprogrammed for a specific task, such as tuning to a particular television frequency,
delivering doses of medicine, or keeping accurate time. They generally are “hard-wired”—that
is, their programs are represented as circuits that cannot be reprogrammed. Programmable
computers vary enormously in their computational power, speed,
memory, and physical size. Some small computers can be held in one hand and are called
personal digital assistants (PDAs). They are used as notepads, scheduling systems, and
address books; if equipped with a cellular phone, they can connect to worldwide computer
networks to exchange information regardless of location. Hand-held game devices are also
small, special-purpose computers.
Portable laptop and notebook computers and desktop PCs are typically used in
businesses and at home to communicate on computer networks, for word processing, to track
finances, and for entertainment. They have large amounts of internal memory to store
hundreds of programs and documents. They are equipped with a keyboard; a mouse, trackball,
or other pointing device; and a video display monitor or liquid crystal display (LCD) to display
information. Laptop and notebook computers usually have hardware and software similar to
PCs, but they are more compact and have flat, lightweight LCDs instead of television-like video
display monitors. Most sources consider the terms “laptop” and “notebook” synonymous.
Workstations are similar to personal computers but have greater memory and more extensive
mathematical abilities, and they are connected to other workstations or personal computers to
exchange data. They are typically found in scientific, industrial, and business environments—
especially financial ones, such as stock exchanges—that require complex and fast
computations.
Mainframe computers have more memory, speed, and capabilities than workstations
and are usually shared by multiple users through a series of interconnected computers. They
control businesses and industrial facilities and are used for scientific research. The most
powerful computers, called supercomputers, perform the most complex and rapid
calculations, such as those used to create weather predictions. Large businesses, scientific
institutions, and the military use them. Some supercomputers have many sets of CPUs. These
computers break a task into small pieces, and each CPU processes a portion of the task to
increase overall speed and efficiency. Such computers are called parallel processors. As
computers have increased in sophistication, the boundaries between the various types have
become less rigid. The performance of various tasks and types of computing have also moved
from one type of computer to another. For example, networked PCs can work together on a
single task in a form of parallel processing.
X NETWORKS
Computers can communicate with other computers through a series of connections
and associated hardware called a network. The advantage of a network is that data can be
exchanged rapidly, and software and hardware resources, such as hard-disk space or printers,
can be shared. Networks also allow remote use of a computer by a user who cannot physically
access it.
One type of network, a local area network (LAN), consists of several PCs or
workstations connected to a special computer called a server, often within the same building
or office complex. The server stores and manages programs and data. A server often contains
all of a networked group’s data and enables LAN workstations or PCs to be set up without large
storage capabilities. In this scenario, each PC may have “local” memory (for example, a hard
drive) specific to itself, but the bulk of storage resides on the server. This reduces the cost of
the workstation or PC because less expensive computers can be purchased, and it simplifies
the maintenance of software because the software resides only on the server rather than on
each individual machine. Mainframe computers may also be
connected to PCs, workstations, or terminals that have no computational abilities of their own.
These “dumb” terminals are used only to enter data into, or receive output from, the central
computer.
Wide area networks (WANs) are networks that span large geographical areas.
Computers can connect to these networks to use facilities in another city or country. For
example, a person in Los Angeles can browse through the computerized archives of the Library
of Congress in Washington, D.C. The largest WAN is the Internet, a global consortium of
networks linked by common communication protocols (protocols are
standards that enable computers to communicate with each other). The Internet is a
mammoth resource of data, programs, and utilities. American computer scientist Vinton Cerf
was largely responsible for creating the Internet in 1973 as part of a United States
Department of Defense research project. The
development of Internet technology was later turned over to private, government, and scientific
agencies.
agencies. The World Wide Web, developed in the 1980s by British physicist Timothy Berners-
Lee, is a system of information resources accessed primarily through the Internet. Users can
obtain a variety of information in the form of text, graphics, sounds, or video. These data are
extensively cross-indexed, enabling users to browse (transfer their attention from one
information site to another) via buttons, highlighted text, or sophisticated searching software
known as search engines.
XI HISTORY
A Beginnings
The history of computing began with an analog machine. In 1623 German scientist
Wilhelm Schikard invented a machine that used 11 complete and 6 incomplete sprocketed
wheels that could add, and with the aid of logarithm tables, multiply and divide.
French philosopher, mathematician, and physicist Blaise Pascal invented a machine in 1642
that added and subtracted, automatically carrying and borrowing digits from column to
column. Pascal built 50 copies of his machine, but most served as curiosities in parlors of the
wealthy. In 1804 French weaver Joseph-Marie Jacquard invented a specialized
type of computer: a silk loom. Jacquard’s loom used punched cards to program patterns that
helped the loom create woven fabrics. Although Jacquard was rewarded and admired by
French emperor Napoleon I for his work, he fled for his life from the city of Lyon pursued by
weavers who feared their jobs were in jeopardy due to Jacquard’s invention. The loom
prevailed, however: When Jacquard died, more than 30,000 of his looms existed in Lyon. The
looms are still used today, especially in the manufacture of fine furniture fabrics.
C Precursor to the Modern Computer
Another early mechanical computer was the Difference Engine, designed in the early
1820s by British mathematician and scientist Charles Babbage. Although never completed by
Babbage, the Difference Engine was intended to be a machine with a 20-decimal capacity that
could solve mathematical problems. Babbage also made plans for another machine, the
Analytical Engine, considered the mechanical precursor of the modern computer. The
Analytical Engine was designed to perform all arithmetic operations efficiently; however,
Babbage’s lack of political skills kept him from obtaining the approval and funds to build it.
Augusta Ada Byron, countess of Lovelace, was a personal friend and student of
Babbage. She was the daughter of the famous poet Lord Byron and one of only a few women
mathematicians of her time. She prepared extensive notes concerning Babbage’s ideas and
the Analytical Engine. Lovelace’s conceptual programs for the machine led to the naming of a
programming language (Ada) in her honor. Although the Analytical Engine was never built, its
key concepts, such as the capacity to store instructions, the use of punched cards as a
primitive memory, and the ability to print, can be found in many modern computers.
American inventor Herman Hollerith
combined the use of punched cards with devices that created and electronically read the
cards. Hollerith’s tabulator was used for the 1890 U.S. census, and it made the computational
time three to four times shorter than the time previously needed for hand counts. Hollerith’s
Tabulating Machine Company eventually merged with two other companies to form the
Computing-Tabulating-Recording Company, which in 1924 changed its name to International
Business Machines Corporation (IBM).
In 1936 British mathematician Alan Turing proposed the idea of a machine that could process
equations without human direction. The machine (now known as a Turing machine) resembled
an automatic typewriter that used symbols for math and logic instead of letters. Turing
intended the device to be a “universal machine” that could be used to duplicate or represent
the function of any other existing machine. Turing’s machine was the theoretical precursor to
the modern digital computer. The Turing machine model is still used by modern computational
theorists.
In the 1930s American mathematician Howard Aiken developed the Mark I calculating
machine, which was built by IBM. This calculating machine used electromechanical relays;
Aiken's later machines
used vacuum tubes and solid-state transistors (tiny electrical switches) to manipulate the
binary numbers. Aiken also introduced computers to universities by establishing the first
computer science program, at Harvard University. He
obsessively mistrusted the concept of storing a program within the computer, insisting that the
integrity of the machine could be maintained only through a strict separation of program
instructions from data. His computer had to read instructions from punched cards, which could
be stored away from the computer. He also urged the National Bureau of Standards not to
support the development of computers, insisting that there would never be a need for more
than a few of them. Hungarian-American
mathematician John von Neumann developed one of the first computers used to solve
problems in mathematics, meteorology, economics, and hydrodynamics. Von Neumann's 1945
design for the Electronic Discrete Variable Automatic Computer (EDVAC)—in stark contrast to
the designs of Aiken, his contemporary—was the first electronic computer design to
incorporate a program stored entirely within its memory. This machine led to several others,
some with clever names like ILLIAC, JOHNNIAC, and MANIAC.
American physicist John Mauchly proposed the electronic digital computer called ENIAC, the
Electronic Numerical Integrator And Computer. He helped build it along with American
engineer John Presper Eckert, Jr., at the Moore School of Engineering at the University of
Pennsylvania in Philadelphia. ENIAC was operational in 1945 and introduced to the public in
1946. It is regarded as the first successful, general digital computer. It occupied 167 sq m
(1,800 sq ft), weighed more than 27,000 kg (60,000 lb), and contained more than 18,000
vacuum tubes. Roughly 2,000 of the computer’s vacuum tubes were replaced each month by a
team of six technicians. Many of ENIAC’s first tasks were for military purposes, such as
calculating ballistic firing tables and designing atomic weapons. Since ENIAC was initially not a
stored-program machine, it had to be rewired for each new task.
Eckert and Mauchly eventually formed their own company, which was then bought by
Remington Rand. They produced the Universal Automatic Computer (UNIVAC), which was used for a
broader variety of commercial applications. The first UNIVAC was delivered to the United
States Census Bureau in 1951.
Between 1937 and 1939, while teaching at Iowa State College, American physicist John Vincent
Atanasoff built a prototype computing device called the Atanasoff-Berry Computer, or ABC,
with the help of his assistant, Clifford Berry. Atanasoff developed the concepts that were later
used in the design of the ENIAC. Atanasoff’s device was the first computer to separate data
processing from memory, but it is not clear whether a functional version was ever built.
Atanasoff did not receive credit for his contributions until 1973, when a lawsuit regarding the
patent on ENIAC was settled. In 1948, at Bell Telephone Laboratories, American physicists
Walter Houser Brattain, John
Bardeen, and William Bradford Shockley developed the transistor, a device that can act as an
electric switch. The transistor had a tremendous impact on computer design, replacing costly,
energy-inefficient, and unreliable vacuum tubes.
In the late 1960s integrated circuits (tiny transistors and other electrical components arranged
on a single chip of silicon) replaced individual transistors in computers. Integrated circuits
resulted from the simultaneous, independent work of Jack Kilby at Texas Instruments and
Robert Noyce of the Fairchild Semiconductor Corporation in the late 1950s. As integrated
circuits became miniaturized, more components could be designed into a single computer
circuit. In the 1970s refinements in integrated circuit technology led to the development of the
modern microprocessor, an integrated circuit containing a computer's entire CPU on a single chip.
Manufacturers used integrated circuit technology to build smaller and cheaper computers. The
first of these so-called personal computers (PCs)—the Altair 8800—appeared in 1975, sold by
Micro Instrumentation Telemetry Systems (MITS). The Altair used an 8-bit Intel 8080
microprocessor, had 256 bytes of RAM, received input through switches on the front panel, and
displayed output on rows of light-emitting diodes (LEDs). Refinements in the PC continued with
the inclusion of video displays, better storage devices, and CPUs with more computational
abilities. Graphical user interfaces were first designed by the Xerox Corporation, then later
used successfully by Apple Inc. Today the development of sophisticated operating systems
such as Windows, the Mac OS, and Linux enables computer users to run programs and
manipulate data in ways that were previously unimaginable.
Several researchers claim the “record” for the largest single calculation ever performed. One
large single calculation was accomplished by physicists at IBM in 1995. They solved one million
trillion mathematical subproblems by continuously running 448 computers for two years. Their
calculation concerned a subatomic particle called a
glueball. Japan, Italy, and the United States are collaborating to develop new supercomputers
for such large-scale calculations.
In 1996 IBM challenged Garry Kasparov, the reigning world chess champion, to a chess match
with a supercomputer called Deep Blue. The computer had the ability to compute more than
100 million chess positions per second. In a 1997 rematch Deep Blue defeated Kasparov,
becoming the first computer to win a match against a reigning world chess champion with
regulation time controls. Many experts predict these types of parallel processing machines will
soon surpass human chess playing ability, and some speculate that massive calculating power
will one day replace intelligence. Deep Blue serves as a prototype for future computers that
will be required to solve complex problems. At issue, however, is whether a computer can be
developed with the ability to learn to solve problems on its own, rather than one programmed
in advance to follow fixed rules.
In 1965 semiconductor pioneer Gordon Moore predicted that the number of transistors
contained on a computer chip would double every year. This is now known as Moore’s Law,
and it has proven to be somewhat accurate. The number of transistors and the computational
power of microprocessors has roughly doubled every couple of years, as circuit components
continue to shrink in size and become faster, cheaper, and more versatile.
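The arithmetic behind such repeated doubling is dramatic, as the following C++ sketch shows (written for these notes; the starting figure roughly matches the first microprocessors of the early 1970s, but the numbers are illustrative only):

#include <iostream>

int main() {
    // Illustrative only: start near the transistor count of the earliest
    // microprocessors and double every two years, per Moore's Law.
    double transistors = 2300;
    for (int year = 1971; year <= 2001; year += 2) {
        std::cout << year << ": " << (long long)transistors
                  << " transistors" << std::endl;
        transistors *= 2;   // one doubling per two-year step
    }
    return 0;
}

Fifteen doublings multiply the starting count by more than 30,000, which is why exponential growth so quickly outpaces intuition.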
With their increasing power and versatility, computers simplify day-to-day life. Unfortunately,
as computer use becomes more widespread, so do the opportunities for misuse. Computer
hackers—people who illegally gain access to computer systems—often violate privacy and can
tamper with or destroy records. Programs called viruses or worms can replicate and spread
from computer to computer, erasing information or causing malfunctions. Other people
have used computers to electronically embezzle funds and alter credit histories (see Computer
Security). New ethical issues also have arisen, such as how to regulate material on the Internet
and the World Wide Web. Long-standing issues, such as privacy and freedom of expression,
are being reexamined in light of the digital revolution. Individuals, companies, and
governments continue to grapple with these questions.
Computers will become more advanced and they will also become easier to use. Improved
speech recognition will make the operation of a computer easier. Virtual reality, the technology
of interacting with a computer using all of the human senses, will also contribute to better
human-computer interfaces. Standardized three-dimensional graphics languages—for
example, Virtual Reality Modeling Language (VRML)—are currently in use or are being
developed.
Other, exotic models of computation are being developed, including biological computing that
uses living organisms, molecular computing that uses molecules with particular properties, and
computing that uses deoxyribonucleic acid (DNA), the basic unit of heredity, to store data and
carry out operations. These are examples of possible future computational platforms that, so
far, are limited in abilities or are strictly theoretical. Scientists investigate them because of the
physical limitations of miniaturizing circuits embedded in silicon. There are also limitations
on how fast such circuits can operate.
Intriguing breakthroughs occurred in the area of quantum computing in the late 1990s.
Researchers used chloroform molecules (a
combination of chlorine and hydrogen atoms) and a variation of a medical procedure called
magnetic resonance imaging (MRI) to compute at a molecular level. Scientists use a branch of
physics called quantum mechanics, which describes the behavior of subatomic particles
(particles that make up atoms), as the basis for quantum computing. Quantum computers may
one day be thousands to millions of times faster than current computers, because they take
advantage of the laws that govern the behavior of subatomic particles. These laws allow
quantum computers to examine all possible answers to a query simultaneously. Future uses of
quantum computers could include code breaking (see cryptography) and large database
queries. Theorists of chemistry, computer science, mathematics, and physics are now working
together to develop practical quantum computers.
Communications between computer users and networks will benefit from new technologies
such as broadband communication systems that can carry significantly more data faster or
more conveniently to and from the vast interconnected databases that continue to grow in
size and number.
Software
Software refers to computer programs: instructions that cause the hardware, the machines, to do
work. Software as a whole can be divided into a number of categories based on the types of
work done by programs. The two primary software categories are operating systems (system
software), which control the workings of the computer, and application software, which
addresses the multitude of tasks for which people use computers. System software thus
handles such essential, but often invisible, chores as maintaining disk files and managing the
screen, whereas application software performs word processing, database management, and
the like. Two additional categories that are neither system nor application software, although
they contain elements of both, are network software, which enables groups of computers to
communicate, and language software, which provides programmers with the tools they need
to write programs.
In addition to these task-based categories, several types of software are described based on
their method of distribution. These include the so-called canned programs or packaged
software developed and sold primarily through retail outlets; freeware and public-domain
software, which is made available without cost by its developer; shareware, which is similar to
freeware but usually carries a small fee for those who like the program; and the infamous
vaporware, which is software that either does not reach the market or appears much later than
promised.
System
system, any collection of component elements that work together to perform a task. In
computer science, system can refer to a hardware system
consisting of a microprocessor and allied chips and circuitry, plus an input device (keyboard,
mouse, disk drive), an output device (monitor, disk drive), and any peripheral devices (printer,
modem). Within this hardware system is an operating system, often called system software,
which is an essential set of programs that manage hardware and data files and work with
application programs. External to the computer, system also refers to any collection or
combination of programs, procedures, data, and equipment utilized in processing information.
Hardware
I INTRODUCTION
Hardware, the physical equipment of a computer,
consists of the components that can be physically handled. The function of these components
is typically divided into three main categories: input, output, and storage. Components in these
categories connect to the computer's central processing unit
(CPU), the electronic circuitry that provides the computational ability and control of the
computer.
Software, on the other hand, is the set of instructions a computer uses to manipulate data,
such as a word-processing program or a video game. These programs are usually stored and
transferred via the computer's hardware to and from the CPU. Software also governs how the
hardware is utilized; for example, how information is retrieved from a storage device. The
interaction between the input and output hardware is controlled by software called the Basic
Input Output System (BIOS). Although microprocessors are still technically considered to be
hardware, portions of their
function are also associated with computer software. Since microprocessors have both
hardware and software aspects, they are therefore often referred to as firmware.
II INPUT HARDWARE
Input hardware consists of external devices—that is, components outside of the computer’s
CPU—that provide information and instructions to the computer. A light pen is a stylus with a
light-sensitive tip that is used to draw directly on a computer’s video screen or to select
information on the screen by pressing a clip in the light pen or by pressing the light pen
against the surface of the screen. The pen contains light sensors that identify which portion of
the screen it is passed over. A mouse is a pointing device designed to be gripped by one hand.
It has a detection device
(usually a ball, a light-emitting diode [LED], or a low-powered laser) on the bottom that enables
the user to control the motion of an on-screen pointer, or cursor, by moving the mouse on a
flat surface. As the device moves across the surface, the cursor moves across the screen. To
select items or choose commands on the screen, the user presses a button on the mouse. A
joystick is a pointing device composed of a lever that moves in multiple directions to navigate
a cursor or other graphical object on a computer screen.
A keyboard is a typewriter-like device that allows the user to type in text and commands to the
computer. Some keyboards have special function keys or integrated pointing devices, such as
a trackball or touch-sensitive regions that let the user’s finger motions move an on-screen
cursor.
Touch-screen displays, which are video displays with a special touch-sensitive surface, are also
becoming popular with personal electronic devices—examples include the Apple iPhone and
Nintendo DS video game system. Touch-screen displays are also becoming common in
everyday use. Examples include ticket kiosks in airports and automated teller machines (ATM).
An optical scanner uses light-sensing equipment to convert images such as a picture or text
into electronic signals that can be manipulated by a computer. For example, a photograph can
be scanned into a computer and then included in a text document created on that computer.
The two most common scanner types are the flatbed scanner, which is similar to an office
photocopier, and the handheld scanner, which is passed manually across the image to be
processed.
A microphone is a device for converting sound into signals that can then be stored,
manipulated, and played back by the computer. A voice recognition module is a device that
converts spoken words into information that the computer can recognize and process.
A modem, which stands for modulator-demodulator, is a device that connects a computer to a
telephone line or cable television network and allows information to be transmitted to or
received from another computer. Each computer that sends or receives information must be
connected to a modem. The digital signal sent from one computer is converted by the modem
into an analog signal, which is then transmitted by telephone lines or television cables to the
receiving modem, which converts the signal back into a digital signal that the receiving
A network interface card (NIC) allows the computer to access a local area network (LAN)
through either a specialized cable similar to a telephone line or through a wireless (Wi-Fi)
connection. The vast majority of LANs connect through the Ethernet standard, which was
introduced in 1983.
III OUTPUT HARDWARE
Output hardware consists of internal and external devices that transfer information from the
computer’s CPU to the computer user. Graphics adapters, which are either an add-on card
(known as a video card) or circuitry built into the motherboard, transmit the visual
information generated by the computer to an external display. Displays commonly take one of
two forms: a video screen with a cathode-ray tube (CRT) or a video screen with a liquid crystal
display (LCD). A CRT-based screen, or monitor, looks similar to a television set. Information
from the CPU is displayed using a beam of electrons that scans a phosphorescent surface that
emits light and creates images. An LCD-based screen displays visual information on a flatter
and smaller screen than a CRT-based video monitor. Laptop computers use LCD screens for
their displays.
Printers take text and images from a computer and print them on paper. Dot-matrix printers use
tiny wires to impact upon an inked ribbon to form characters. Laser printers employ beams of
light to draw images on a drum that then picks up fine black particles called toner. The toner is
fused to a page to produce an image. Inkjet printers fire droplets of ink onto a page to form
characters and images.
Computers can also output audio via a specialized chip on the motherboard or an add-on card
called a sound card. Users can attach speakers or headphones to an output port to hear the
audio produced by the computer. Many modern sound cards allow users to create music and
record digital audio.
IV STORAGE HARDWARE
Storage hardware provides permanent storage of information and programs for retrieval by the
computer. The two main types of storage devices are disk drives and memory. There are
several types of disk drives: hard, floppy, magneto-optical, magnetic tape, and compact. Hard
disk drives store information in magnetic particles embedded in a disk. Usually a permanent
part of the computer, hard disk drives can store large amounts of information and retrieve that
information very quickly. Floppy disk drives also store information in magnetic particles
embedded in removable disks that may be floppy or rigid. Floppy disks store less information
than a hard disk drive and retrieve the information at a much slower rate. While many
computers still include a floppy disk drive, the technology has been gradually phased out in
favor of newer, higher-capacity media.
Magneto-optical disk drives store information on removable disks that are sensitive to both
laser light and magnetic fields. They can store up to 9.1 gigabytes (GB) of data, but they have
slightly slower retrieval speeds than hard drives. They are much more rugged than
floppy disks, making them ideal for data backups. However, the introduction of newer media
that is both less expensive and able to store more data has made magneto-optical drives
obsolete.
Magnetic tape drives use magnetic tape similar to the tape used in VCR cassettes. Tape drives
have a very slow read/write time, but have a very high capacity; in fact, their capacity is
second only to hard disk drives. Tape drives are mainly used to back up data.
Compact disc drives store information on pits burned into the surface of a disc of reflective
material (see CD-ROM). CD-ROMs can store up to 737 megabytes (MB) of data. A Compact
Disc-Recordable (CD-R) or Compact Disc-ReWritable (CD-RW) drive can record data onto a
specialized disc, but only the CD-RW standard allows users to change the data stored on the
disc. A digital versatile disc (DVD) looks and works like a CD-ROM but can store up to 17.1 GB
of data on a single disc. Like CD-ROMs, there are specialized versions of DVDs, such as DVD-
Recordable (DVD-R) and DVD-ReWritable (DVD-RW), that can have data written onto them by
the user. More recently Sony Electronics developed a DVD technology called Blu-ray, which has a much higher storage capacity than a standard DVD.
Memory refers to the computer chips that store information for quick retrieval by the CPU.
Random access memory (RAM) is used to store the information and instructions that operate
the computer's programs. Typically, programs are transferred from storage on a disk drive to
RAM. RAM is also known as volatile memory because the information within the computer
chips is lost when power to the computer is turned off. Read-only memory (ROM) contains
critical information and software that must be permanently available for computer operation,
such as the operating system that directs the computer's actions from start up to shut down.
ROM is called nonvolatile memory because the memory chips do not lose their information when power to the computer is turned off.
A more recent development is solid-state RAM. Unlike standard RAM, solid-state RAM can retain information even when there is no power supply. Flash drives are removable storage
devices that utilize solid-state RAM to store information for long periods of time. Solid-state
drives (SSD) have also been introduced as a potential replacement for hard disk drives. SSDs
have faster access speeds than hard disks and have no moving parts. However, they are quite
expensive and do not have the ability to store as much data as a hard disk. Solid-state RAM
technology is also used in memory cards for digital media devices, such as digital cameras and
media players.
Some devices serve more than one purpose. For example, floppy disks may also be used as
input devices if they contain information to be used and processed by the computer user. In
addition, they can be used as output devices if the user wants to store the results of
computations on them.
V HARDWARE CONNECTIONS
All of the internal and external devices of a computer must be connected so that they can communicate and interact. A bus provides a common interconnected system composed of a group of wires
or circuitry that coordinates and moves information between the internal parts of a computer.
A computer bus consists of two channels, one that the CPU uses to locate data, called the
address bus, and another to send the data to that address, called the data bus. A bus is
characterized by two features: how much information it can manipulate at one time, called the
bus width, and how quickly it can transfer these data. In today’s computers, a series of buses
work together to communicate between the various internal and external devices.
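To make bus width and transfer speed concrete, the following C sketch computes a bus’s theoretical peak throughput from those two figures. The 32-bit width and 33-MHz clock used here are example values matching classic PCI; they are assumptions for illustration, not a description of any particular machine.

#include <stdio.h>

int main(void) {
    /* Bus width: how much information the bus moves in one transfer. */
    double width_bits = 32.0;
    /* Bus clock: how many transfers the bus completes per second. */
    double clock_hz = 33e6;

    /* Peak throughput is (width in bytes) x (transfers per second). */
    double bytes_per_sec = (width_bits / 8.0) * clock_hz;
    printf("Theoretical peak: %.0f MB/s\n", bytes_per_sec / 1e6);  /* ~132 MB/s */
    return 0;
}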
A Internal Connections
Expansion, or add-on, cards use one of three bus types to interface with the computer. The
Peripheral Component Interconnect (PCI) is the standard expansion card bus used in most
computers. The Accelerated Graphics Port (AGP) bus was developed to create a high-speed
interface with the CPU that bypassed the PCI bus. This bus was specifically designed for
modern video cards, which require a large amount of bandwidth to communicate with the CPU.
A newer version of PCI called PCI Express (PCIe) was designed to replace both PCI and AGP as the standard interface for expansion cards.
Internal storage devices use one of three separate standards to connect to the bus: parallel AT
attachment (PATA), serial AT attachment (SATA), or small computer system interface (SCSI).
The term AT refers to the IBM AT computer, first released in 1984. The PATA and SCSI
standards were first introduced in 1986; the SATA standard was introduced in 2002 as a
replacement for the PATA standard. The SCSI standard is mainly used in servers or high-end
systems.
For most of the history of the personal computer, external and internal devices have
communicated with each other through parallel connections. However, given the limitations of parallel connections at high speeds, engineers have increasingly moved to serial connections, since these have greater data transfer rates, as well as more reliability.
A serial connection is a wire or set of wires used to transfer information from the CPU to an
external device such as a mouse, keyboard, modem, scanner, and some types of printers. This
type of connection transfers only one piece of data at a time. The advantage of a serial connection is that it provides reliable communication over longer distances. A parallel connection is a set of wires that transfers several bits of data simultaneously. Most scanners and printers use this type of connection. A parallel connection
is much faster than a serial connection, but it is limited to shorter distances between the CPU and the attached device.
The best way to see the difference between parallel and serial connections is to imagine the
differences between a freeway and a high-speed train line. The freeway is the parallel
connection—lots of lanes for cars. However, as more cars are put onto the freeway, each individual car travels more slowly, which means more lanes have to be built at high cost if the cars
are to travel at high speed. The train line is the serial connection; it consists of two tracks and
can only take two trains at a time. However, these trains do not need to deal with traffic and can travel at consistently high speed.
As CPU speeds increased and engineers increased the speed of the parallel connections to
keep up, the main problem of parallel connections—maintaining data integrity at high speed—
became more evident. Engineers began to look at serial connections as a possible solution to
the problem. This led to the development of both SATA and PCI Express, which, by using serial
connections, provide high data transfer rates with fewer wires and no loss of data integrity.
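The serial-versus-parallel distinction can also be sketched in code. The two C functions below are hypothetical stand-ins for hardware lines, not any real interface: one presents all eight bits of a byte on eight wires in a single clock tick (parallel), while the other shifts the same byte out one bit per tick over a single wire (serial).

#include <stdio.h>

/* Parallel: all 8 bits of the byte appear on 8 wires in one clock tick. */
static void send_parallel(unsigned char byte) {
    printf("parallel, tick 0:");
    for (int wire = 7; wire >= 0; wire--)
        printf(" %d", (byte >> wire) & 1);
    printf("\n");
}

/* Serial: the same byte shifted out one bit per clock tick on one wire. */
static void send_serial(unsigned char byte) {
    for (int tick = 0; tick < 8; tick++)
        printf("serial, tick %d: %d\n", tick, (byte >> (7 - tick)) & 1);
}

int main(void) {
    send_parallel(0xA5);  /* one tick, eight wires */
    send_serial(0xA5);    /* eight ticks, one wire */
    return 0;
}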
B External Connections
The oldest external connections used by computers were the serial and parallel ports. These
were included on the original IBM PC from 1981. Originally designed as an interface to connect
computer to computer, the serial port was eventually used with various devices, including
modems, mice, keyboards, scanners, and some types of printers. Parallel ports were mainly
used with printers, but some scanners and external drives used the parallel port.
The Universal Serial Bus (USB) interface was developed to replace both the serial and parallel
ports as the standard for connecting external devices. Developed by a group of companies
including Microsoft, Intel, and IBM, the USB standard was first introduced in 1995. Besides
transferring data to and from the computer, USB can also provide a small amount of power,
eliminating the need for external power cables for most peripherals. The USB 2.0 standard,
which came into general usage in 2002, drastically improved the data transfer rate.
A competing standard to USB was developed at the same time by Apple and Texas
Instruments. Officially called IEEE 1394, it is more commonly called FireWire. It is capable of
transferring data at a higher rate than the original USB standard and became the standard
interface for multimedia hardware, such as video cameras. But Apple’s royalty rate and the
introduction of USB 2.0—as well as the fact that Intel, one of the companies behind USB, is
responsible for most motherboards and chipsets in use—meant that FireWire was unlikely to
become the standard peripheral interface for PCs. Today most computers have both USB and FireWire ports.
Wireless devices have also become commonplace with computers. The initial wireless interface
used was infrared (IR), the same technology used in remote controls. However, this interface
required that the device have a direct line of sight to the IR sensor so that the data could be
transferred. It also had a high power requirement. Most modern wireless devices use radio
frequency (RF) signals to communicate to the computer. One of the most common wireless
standards used today is Bluetooth. It uses the same frequencies as the Wi-Fi standard used for
wireless LANs.
Personal Computer
I INTRODUCTION
Personal Computer (PC), computer in the form of a desktop or laptop device designed for use
by a single person. PCs function using a display monitor and a keyboard. Since their
introduction in the 1980s, PCs have become powerful and extremely versatile tools that have
revolutionized how people work, learn, communicate, and find entertainment. Many
households in the United States now have PCs, thanks to affordable prices and software that
has made PCs easy to use without special computer expertise. Personal computers are also a
crucial component of information technology (IT) and play a key role in modern economies
worldwide.
The usefulness and capabilities of personal computers can be greatly enhanced by connection
to the Internet and World Wide Web, as well as to smaller networks that link to local computers
or databases. Personal computers can also be used to access content stored on compact discs
(CDs) or digital versatile discs (DVDs), and to transfer files to personal media devices and
video players.
Personal computers are sometimes called microcomputers or micros. Powerful PCs designed
for professional or technical use are known as workstations. Other names that reflect different
roles for PCs include home computers and small-business computers. The PC is generally
larger and more powerful than handheld computers, including personal digital assistants (PDAs). The different types of equipment that make a computer function are known as hardware; the programmed instructions that make a computer work are known as software.
A Types of Hardware
PCs are built around electronic circuitry called a microprocessor, such as the central processing unit (CPU), which directs logical and arithmetical functions and executes computer programs. The
CPU is located on a motherboard with other chips. A PC also has electronic memory known as
random access memory (RAM) to temporarily store programs and data. A basic component of
most PCs is a disk drive, commonly in the form of a hard disk or hard drive. A hard disk is a
magnetic storage device in the form of a disk or disks that rotate. The magnetically stored
information is read or modified using a drive head that scans the surface of the disk.
Removable storage devices—such as floppy drives, compact disc (CD-ROM) and digital
versatile disc (DVD) drives, and additional hard drives—can be used to permanently store as
well as access programs and data. PCs may have CD or DVD “burners” that allow users to
write or rewrite data onto recordable discs. Other external devices to transfer and store files
include memory sticks and flash drives, small solid-state devices that do not have internal
moving parts.
Cards are printed circuit boards that can be plugged into a PC to provide additional functions
such as recording or playing video or audio, or enhancing graphics (see Graphics Card).
A PC user enters information and commands with a keyboard or with a pointing device such as
a mouse. A joystick may be used for computer games or other tasks. Information from the PC
is displayed on a video monitor or on a liquid crystal display (LCD) video screen. Accessories such as speakers and headphones reproduce sound, and documents can be printed on laser, dot-matrix, or inkjet printers. The various components of
the computer system are physically attached to the PC through the bus. Some PCs have
wireless systems that use infrared or radio waves to link to the mouse, the keyboard, or other
components.
PCs can be linked to the Internet or to other networks through a phone line and a modem (a device that permits transmission of digital signals). Wireless links to the Internet and networks operate through a radio modem. Modems also are used to link computers directly to one another.
B Types of Software
PCs are run by software called the operating system. Widely used operating systems include
Microsoft’s Windows, Apple’s Mac OS, and Linux. Other types of software called applications
allow the user to perform a wide variety of tasks such as word processing, using spreadsheets, or browsing the Web. Other software, such as utilities and drivers, can be either essential or optional to the functioning of the computer. Drivers help operate keyboards, printers, and other attached devices.
Most PCs use software to run a screen display called a graphical user interface (GUI). A GUI
allows a user to open and move files, work with applications, and perform other tasks by pointing at and clicking on icons, menus, and other graphical objects.
In addition to text files, PCs can store digital multimedia files such as photographs, audio
recordings, and video. These media files are usually in compressed digital formats such as
JPEG for photographs, MP3 for audio, and MPEG for video.
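Formats such as these are commonly recognized by characteristic “magic” bytes at the start of a file. The C sketch below guesses a file’s format from those leading bytes. The signatures shown—FF D8 FF for JPEG, the letters ID3 for a tagged MP3, and the 00 00 01 start code for MPEG streams—are real but not exhaustive (an MP3 without an ID3 tag, for example, would not match), so this is illustrative rather than authoritative.

#include <stdio.h>
#include <string.h>

/* Return a best guess at the media format based on leading bytes. */
static const char *guess_format(const unsigned char *b, size_t n) {
    if (n >= 3 && b[0] == 0xFF && b[1] == 0xD8 && b[2] == 0xFF)
        return "JPEG image";
    if (n >= 3 && memcmp(b, "ID3", 3) == 0)
        return "MP3 audio (ID3 tag)";
    if (n >= 3 && b[0] == 0x00 && b[1] == 0x00 && b[2] == 0x01)
        return "MPEG video stream";
    return "unknown";
}

int main(int argc, char **argv) {
    if (argc < 2) { fprintf(stderr, "usage: %s file\n", argv[0]); return 1; }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }
    unsigned char buf[4];
    size_t n = fread(buf, 1, sizeof buf, f);
    fclose(f);
    printf("%s: %s\n", argv[1], guess_format(buf, n));
    return 0;
}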
The wide variety of tasks that PCs can perform, in conjunction with the PC’s role as a portal to the Internet and World Wide Web, has had profound effects on how people conduct their lives at home and at work.
In the home, PCs can help with balancing the family checkbook, keeping track of finances and
investments, and filing taxes, as well as preserving family documents for easy access or
indexing recipes. PCs are also a recreational device for playing computer games, watching
videos with webcasting, downloading music, saving photographs, or cataloging records and
books. Together with the Internet, PCs are a link to social contacts through electronic mail (e-
mail), text-messaging, personal Web pages, blogs, and chat groups. PCs can also allow quick
and convenient access to news and sports information on the World Wide Web, as well as
consumer information. Shopping from home over the Internet with a PC generates billions of dollars in sales each year.
PCs can greatly improve productivity in the workplace, allowing people to collaborate on tasks
from different locations and easily share documents and information. Many people with a PC at
home are able to telecommute, working from home over the Internet. Laptop PCs with wireless
connections to the Internet allow people to work in virtually any environment when away from
the office. PCs can help people to be self-employed. Special software can make running a small
business from home much easier. PCs can also assist artists, writers, and musicians with their
creative work, or allow anyone to make their own musical mixes at home. Medical care has
been improved and costs have been reduced by transferring medical records into electronic form.
PCs have become an essential tool in education at all levels, from grammar school to
university. Many school children are given laptop computers to help with schoolwork and
homework. Classrooms of all kinds commonly use PCs. Many public libraries make PCs
available to members of the public. The Internet and World Wide Web provide access to vast stores of educational and reference material. Distance education is a growing service, allowing people to take classes and work on degrees at their convenience over the Internet.
PCs can also be adapted to help people with disabilities, using special devices and software.
Special keyboards, cursors that translate head movements, or accessories such as foot mice
can allow people with limited physical movement to use a PC. PCs can also allow people with visual or other impairments to work and communicate, aided by speech-recognition software that allows spoken commands to work a PC or for e-mail
and text to be read aloud. Text display can also be magnified for individuals with low vision.
The first true modern computers were developed during World War II (1939-1945) and used
vacuum tubes. These early computers were the size of houses and as expensive as
battleships, but they had none of the computational power or ease of use that are common in
modern PCs. More powerful mainframe computers were developed in the 1950s and 1960s,
but needed entire rooms and large amounts of electrical power to operate.
A major step toward the modern PC came in the 1960s when a group of researchers at the
Stanford Research Institute (SRI) in California began to explore ways for people to interact
more easily with computers. The SRI team developed the first computer mouse and other
innovations that would be refined and improved in the 1970s by researchers at the Xerox PARC
(Palo Alto Research Center). The PARC team developed an experimental PC design in 1973
called Alto, which was the first computer to have a graphical user interface (GUI).
Two crucial hardware developments would help make the SRI vision of computers practical.
The development of integrated circuits and microprocessors enabled computer makers to combine the essential elements of a
computer onto tiny silicon computer chips, thereby increasing computer performance and
decreasing cost.
The integrated circuit, or IC, was developed in 1959 and permitted the miniaturization of
computer-memory circuits. The microprocessor first appeared in 1971 with the Intel 4004,
created by Intel Corporation, and was originally designed to be the computing and logical
processor of calculators and watches. The microprocessor reduced the size of a computer’s CPU to a single silicon chip.
Because a CPU calculates, performs logical operations, contains operating instructions, and
manages data flows, the potential existed for developing a separate system that could function
as a complete microcomputer. The first such desktop-size system specifically designed for
personal use appeared in 1974; it was offered by Micro Instrumentation Telemetry Systems
(MITS). The owners of the system were then encouraged by the editor of Popular Electronics
magazine to create and sell a mail-order computer kit through the magazine.
The Altair 8800 is considered to be the first commercial PC. The Altair was built from a kit and
programmed by using switches. Information from the computer was displayed by light-emitting
diodes on the front panel of the machine. The Altair appeared on the cover of Popular
Electronics magazine in January 1975 and inspired many computer enthusiasts who would
later establish companies to produce computer hardware and software. The computer retailed for about $400 in kit form. The demand for the microcomputer kit was immediate, unexpected, and totally overwhelming. Scores of small firms responded to this demand by producing computers for the new market. The first major electronics firm to manufacture and sell personal
computers, Tandy Corporation (Radio Shack), introduced its model in 1977. It quickly
dominated the field, because of the combination of two attractive features: a keyboard and a
display terminal using a cathode-ray tube (CRT). It was also popular because it could be
programmed and the user was able to store information by means of cassette tape.
American computer designers Steven Jobs and Stephen Wozniak created the Apple II in 1977.
The Apple II was one of the first PCs to incorporate a color video display and a keyboard that
made the computer easy to use. Jobs and Wozniak incorporated Apple Computer Inc. the same
year. Some of the new features they introduced into their own microcomputers were expanded
memory, inexpensive disk-drive programs and data storage, and color graphics. Apple
Computer went on to become the fastest-growing company in U.S. business history. Its rapid
growth inspired a large number of similar microcomputer manufacturers to enter the field.
Before the end of the decade, the market for personal computers had become clearly defined.
In 1981 IBM introduced its own microcomputer model, the IBM PC. Although it did not make
use of the most recent computer technology, the IBM PC was a milestone in this burgeoning
field. It proved that the PC industry was more than a current fad, and that the PC was in fact a
necessary tool for the business community. The PC’s use of a 16-bit microprocessor initiated
the development of faster and more powerful microcomputers, and its use of an operating
system that was available to all other computer makers led to what was effectively a
standardization of the industry. The design of the IBM PC and its clones soon became the PC
standard, and an operating system developed by Microsoft Corporation became the dominant operating system for PCs.
A graphical user interface (GUI)—a visually appealing way to represent computer commands and data on the screen—first reached the market in 1983 when Apple introduced the Lisa, but the
new user interface did not gain widespread notice until 1984 with the introduction of the Apple
Macintosh. The Macintosh GUI combined icons (pictures that represent files or programs) with
windows (boxes that each contain an open file or program). A pointing device known as a
mouse controlled information on the screen. Inspired by earlier work of computer scientists at
Xerox Corporation, the Macintosh user interface made computers easy and fun to use and spurred the development of new storage technologies. A powerful 32-bit computer capable of running
advanced multiuser operating systems at high speeds appeared in the mid-1980s. This type of machine put enough computing power on an office desktop to serve all small businesses and most medium-size
businesses.
During the 1990s the price of personal computers came down at the same time that computer
chips became more powerful. The most important innovations, however, occurred with the PC
operating system software. Apple’s Macintosh computer had been the first to provide a
graphical user interface, but the computers remained relatively expensive. Microsoft
Corporation’s Windows software came preinstalled on IBM PCs and clones, which were
generally less expensive than Macintosh. Microsoft also designed its software to allow
individual computers to easily communicate and share files through networks in an office
environment. The introduction of the Windows operating systems, which had GUI systems
similar to Apple’s, helped make Microsoft the dominant provider of PC software for business and home users.
PCs in the form of portable notebook computers also emerged in the 1990s. These PCs could
be carried in a briefcase or backpack and could be powered with a battery or plugged in. The
first portable computers had been introduced at the end of the 1980s. The true laptop
computers came in the early 1990s with Apple’s Powerbook and IBM’s ThinkPad.
Despite its spectacular success in the software market, Microsoft was initially slow to
understand the importance of the Internet, which had been developed for government and
academic use in the 1960s and 1970s, and the World Wide Web, developed in the late 1980s.
The ability to access the Internet and the growing World Wide Web greatly enhanced the
usefulness of the PC, giving it enormous potential educational, commercial, and entertainment
value. In 1994 Netscape became the first browser designed to make the Internet and the
World Wide Web user friendly, similar to how a GUI makes using a PC much simpler. The
success of Netscape prompted Microsoft to develop its own Web browser called Internet
Explorer, released in 1995. Explorer was then included with the preinstalled Windows software
on PCs sold to consumers. This “bundling” of the Explorer browser was controversial and led to a major antitrust lawsuit against Microsoft.
Connecting PCs to the Internet had unanticipated consequences. PCs were vulnerable to
malicious software designed to damage files or computer hardware. Other types of software
programs could force a PC to send out e-mail messages or store files, or allow access to
existing files and software as well as track a user’s keystrokes and Internet activity without the
user's knowledge. Computer viruses and other malicious programs could be easily sent over
the Internet using e-mail or by secretly downloading files from Web pages a user visited.
Microsoft’s software was a particular target and may have been vulnerable in part because its
platforms and applications had been developed to allow computers to easily share files.
Since the late 1990s computer security has become a major concern. PC users can install
firewalls to block unwanted access or downloads over the Internet. They can also subscribe to
services that periodically scan personal computers for viruses and malicious software and
remove them. Operating-system software has also been designed to improve security.
PCs continue to improve in power and versatility. The growing use of 64-bit processors and
higher-speed chips in PCs, in combination with broadband access to the Internet, greatly enhances the PC’s ability to deliver media such as motion pictures and video, as well as games and interactive features.
The increasing use of computers to view and access media may be a further step toward the
merger of television and computer technology that has been predicted by some experts since
the 1990s.
Install
Install, in computer science, to set up and prepare for operation. Operating systems and
application programs commonly include a disk-based installation program that does most of
the work of setting up the program to work with the computer, printer, and other devices.
Often, such a program is capable of checking for devices attached to the system, requesting
the user to choose from sets of options, creating a place for itself on a hard disk, and
modifying system startup files if necessary. Installation can also pertain to the transfer of one
of a limited number of copies of a program to a hard drive or a floppy disk from a copy-
protected program disk (because the normal method of copying the program has been disabled).
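As a rough illustration of the steps described above, the C sketch below “installs” a program by creating a copy of it in a destination directory and then appending an entry to a startup file. Every name and path here (myapp.exe, C:/Programs, startup.cfg) is hypothetical, and a real installation program does far more checking than this.

#include <stdio.h>

/* Copy src to dst byte by byte; return 0 on success. */
static int copy_file(const char *src, const char *dst) {
    FILE *in = fopen(src, "rb");
    if (!in) return -1;
    FILE *out = fopen(dst, "wb");
    if (!out) { fclose(in); return -1; }
    int c;
    while ((c = fgetc(in)) != EOF)
        fputc(c, out);
    fclose(in);
    fclose(out);
    return 0;
}

int main(void) {
    /* Step 1: create a place for the program on the hard disk (paths assumed). */
    if (copy_file("myapp.exe", "C:/Programs/myapp.exe") != 0) {
        fprintf(stderr, "install failed: could not copy program\n");
        return 1;
    }
    /* Step 2: modify a (hypothetical) startup file if necessary. */
    FILE *startup = fopen("C:/Programs/startup.cfg", "a");
    if (startup) {
        fprintf(startup, "run=C:/Programs/myapp.exe\n");
        fclose(startup);
    }
    printf("installation complete\n");
    return 0;
}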
Windows
Windows, in computer science, personal computer operating system sold by Microsoft Corporation that allows users to enter commands with a point-and-click device, such as a
mouse, instead of a keyboard. An operating system is a set of programs that control the basic
functions of a computer. The Windows operating system provides users with a graphical user
interface (GUI), which allows them to manipulate small pictures, called icons, on the computer
screen to issue commands. Windows is the most widely used operating system in the world. It comes preinstalled on most new personal computers.
The Windows GUI is designed to be a natural, or intuitive, work environment for the user. With
Windows, the user can move a cursor around on the computer screen with a mouse. By
pointing the cursor at icons and clicking buttons on the mouse, the user can issue commands
to the computer to perform an action, such as starting a program, accessing a data file, or
copying a data file. Other commands can be reached through pull-down or click-on menu
items. The computer displays the active area in which the user is working as a window on the
computer screen. The currently active window may overlap with other previously active
windows that remain open on the screen. This type of GUI is said to include WIMP features: windows, icons, menus, and pointing device.
Computer scientists at the Xerox Corporation’s Palo Alto Research Center (PARC) invented the
GUI concept in the early 1970s, but this innovation was not an immediate commercial success.
In 1983 Apple Computer featured a GUI in its Lisa computer. This GUI was updated and improved in the Apple Macintosh, introduced in 1984.
Microsoft began its development of a GUI in 1983 as an extension of its MS-DOS operating
system. Microsoft’s Windows version 1.0 first appeared in 1985. In this version, the windows
were tiled, or presented next to each other rather than overlapping. Windows version 2.0,
introduced in 1987, was designed to resemble IBM’s OS/2 Presentation Manager, another GUI
operating system. Windows version 2.0 included the overlapping window feature. The more
powerful version 3.0 of Windows, introduced in 1990, and subsequent versions 3.1 and 3.11
rapidly made Windows the market leader in operating systems for personal computers, in part
because it was prepackaged on new personal computers. It also became the favored platform for software developers. In 1993 Microsoft introduced Windows NT, an operating system aimed at businesses. This system offers 32-bit multitasking, which gives a computer the ability to run several programs
simultaneously, or in parallel, at high speed. This operating system competes with IBM’s OS/2
as a platform for the intensive, high-end, networked computing environments found in many
businesses.
In 1995 Microsoft released a new version of Windows for personal computers called Windows
95. Windows 95 had a sleeker and simpler GUI than previous versions. It also offered 32-bit
processing, efficient multitasking, network connections, and Internet access. Windows 98, released in 1998, offered further refinements and integrated Internet capabilities.
In 1996 Microsoft debuted Windows CE, a scaled-down version of the Microsoft Windows
platform designed for use with handheld personal computers. Windows 2000, released at the
end of 1999, combined Windows NT technology with the Windows 98 graphical user interface.
In 2000 a special edition of Windows known as Windows Millennium Edition, or Windows ME,
provided a more stable version of the Windows 98 interface. In 2001 Microsoft released a new
operating system known as Windows XP, the company’s first operating system for consumers that was not based on MS-DOS. Operating systems that compete with Windows include OS/2 Warp from IBM (see OS/2), and UNIX and its variations, such as Linux.
UNIX
UNIX, in computer science, a multiuser, multitasking operating system. A very powerful operating system, UNIX is written in the C language and can be installed on virtually any type of computer.
UNIX was originally developed by Ken Thompson and Dennis Ritchie at AT&T Bell Laboratories
in 1969 for use on minicomputers. In the early 1970s, many universities, research institutions,
and companies began to expand on and improve UNIX. These efforts resulted in two main
versions: BSD UNIX, a version developed at the University of California at Berkeley, and System V, the version developed and licensed by AT&T. Many companies developed and marketed their own versions of UNIX in subsequent years.
Variations of UNIX include AIX, a version of UNIX adapted by IBM to run on RISC-based
workstations; A/UX, a graphical version for the Apple Macintosh; XENIX OS, developed by
Microsoft Corporation for 16-bit microprocessors; SunOS, adapted and distributed by Sun
Microsystems, Inc.; Mach, a UNIX-compatible operating system for the NeXT computer; and
Linux, developed by Finnish computer engineer Linus Torvalds with collaborators worldwide.
Microsoft Corporation
I INTRODUCTION
Microsoft Corporation, the largest company in the world dedicated to creating computer
software. Microsoft develops and sells a wide variety of software products to businesses and
consumers and has subsidiary offices in more than 60 countries. The company’s operating
systems for personal computers are the most widely used in the world. Microsoft has its headquarters in Redmond, Washington.
Microsoft’s other well-known products include Word, a word processor; Excel, a spreadsheet
program; Access, a database program; and PowerPoint, a program for making business
presentations. These programs are sold separately and as part of Office, an integrated
software suite. The company also makes software applications for a wide variety of server
products for businesses. Microsoft’s Internet Explorer (IE) allows users to browse the World
Wide Web. Microsoft produces the Xbox game console and software games that run on the
console. Among the company’s other products are reference applications; financial software;
programming languages for software developers; input devices, such as pointing devices and
keyboards; software for personal digital assistants (PDAs) and cellular telephones; handwriting-
recognition software; software for creating Web pages; and computer-related books.
Microsoft operates the Microsoft Network (MSN), a collection of news, travel, financial,
entertainment, and information Web sites. Microsoft and NBC Universal jointly operate the
MSNBC Web site, the most popular all-news site on the Internet.
II FOUNDING
Microsoft was founded in 1975 by William H. Gates III and Paul Allen. The pair had teamed up
in high school through their hobby of programming on the original PDP-10 computer from the
Digital Equipment Corporation. In 1975 Popular Electronics magazine featured a cover story
about the Altair 8800, the first personal computer (PC). The article inspired Gates and Allen to
develop a version of the BASIC programming language for the Altair. They licensed the
software to Micro Instrumentation and Telemetry Systems (MITS), the Altair’s manufacturer,
and formed Microsoft (originally Micro-soft) in Albuquerque, New Mexico, to develop versions of BASIC for other computer companies.
Microsoft’s early customers included fledgling hardware firms such as Apple Inc., maker of the
Apple II computer; Commodore, maker of the PET computer; and Tandy Corporation, maker of
the Radio Shack TRS-80 computer. In 1977 Microsoft shipped its second language product,
Microsoft Fortran, and it soon released versions of BASIC for the 8080 and 8086
microprocessors.
III MS-DOS
In 1979 Gates and Allen moved the company to Bellevue, Washington, a suburb of their
hometown of Seattle. (The company moved to its current headquarters in Redmond in 1986.)
In 1980 International Business Machines Corporation (IBM) chose Microsoft to write the
operating system for the IBM PC personal computer, to be introduced the following year. Under
time pressure, Microsoft purchased 86-DOS (developed by programmer Tim Paterson and
originally called QDOS for Quick and Dirty Operating System) from a small company called
Seattle Computer Products for $50,000, modified it, and renamed it MS-DOS (Microsoft Disk
Operating System).
As part of its contract with IBM, Microsoft was permitted to license the operating system to
other companies. By 1984 Microsoft had licensed MS-DOS to 200 personal computer
manufacturers, making MS-DOS the standard operating system for PCs and driving Microsoft’s
enormous growth in the 1980s. Allen left the company in 1983 but remained on its board of directors.
IV APPLICATION SOFTWARE
As sales of MS-DOS took off, Microsoft began to develop business applications for personal
computers. In 1982 it released Multiplan, a spreadsheet program, and the following year it
released a word-processing program, Microsoft Word. In 1984 Microsoft was one of the few
established software companies to develop application software for the Macintosh, a personal
computer developed by Apple Computer, Inc. Microsoft’s early support for the Macintosh
resulted in tremendous success for its Macintosh application software, including Word, Excel,
and Works (an integrated software suite). Multiplan for MS-DOS, however, faltered against the popular Lotus 1-2-3 spreadsheet program.
V WINDOWS
In 1985 Microsoft released Windows, an operating system that extended the features of MS-
DOS and employed a graphical user interface. Windows 2.0, released in 1987, improved
performance and offered a new visual appearance. In 1990 Microsoft released a more powerful
version, Windows 3.0, which was followed by Windows 3.1 and 3.11. These versions, which
came preinstalled on most new personal computers, rapidly became the most widely used
operating systems. In 1990 Microsoft became the first personal-computer software company to exceed $1 billion in annual sales.
As Microsoft’s dominance grew in the market for personal-computer operating systems, the
company was accused of monopolistic business practices. In 1990 the Federal Trade
Commission (FTC) began investigating Microsoft for alleged anticompetitive practices, but it
was unable to reach a decision and dropped the case. The United States Department of Justice then took over the investigation.
In 1991 Microsoft and IBM ended a decade of collaboration when they went separate ways on
the next generation of operating systems for PCs. IBM chose to pursue the OS/2 operating
system (first released in 1987), which until then had been a joint venture with Microsoft.
Microsoft chose to evolve its Windows operating system into increasingly powerful systems. In
1993 Apple lost a copyright-infringement lawsuit against Microsoft that claimed Windows
illegally copied the design of the Macintosh’s graphical interface. An appellate court later upheld the ruling.
In 1993 Microsoft released Windows NT, an operating system for business environments. In
1994 the company and the Justice Department reached an agreement that called for Microsoft
to change the way its operating system software was sold and licensed to computer
manufacturers. In 1995 the company released Windows 95, which featured a simplified
interface, multitasking, and other improvements. An estimated 7 million copies of Windows 95 were sold worldwide within weeks of its release.
In the mid-1990s Microsoft began to expand into the media, entertainment, and
communications industries, launching MSN in 1995 and the MSNBC cable channel and Web site
in 1996. In late 2005, however, Microsoft and NBC dissolved their joint operation of the cable
channel, with NBC assuming full control. The two companies continued their 50-50 ownership
of the MSNBC Web site. In 1996 Microsoft introduced Windows CE, an operating system for
handheld personal digital assistants (PDAs). In 1997 Microsoft paid $425 million to acquire WebTV Networks, a maker of devices that connect television sets to the Internet. That same year Microsoft invested $1 billion in Comcast Corporation, a U.S. cable-television company, as part of an effort to expand the availability of high-speed connections to the Internet.
In June 1998 Microsoft released Windows 98, which featured integrated Internet capabilities. In
the following month Gates appointed Steve Ballmer, executive vice president of Microsoft, as
the company’s president, giving him supervision of most day-to-day business operations of the
company. Gates retained the title of chairman and chief executive officer (CEO).
In 1999 Microsoft paid $5 billion to telecommunications company AT&T Corp. to use Microsoft’s
Windows CE operating system in devices designed to provide consumers with integrated cable
television, telephone, and high-speed Internet services. Also in 1999, the company released
Windows 2000, the latest version of the Windows NT operating system. In January 2000 Gates
transferred his title of CEO to Ballmer. While retaining the position of chairman, Gates also
took on the title of chief software architect to focus on the development of new products and
technologies.
In 2001 Microsoft released a new operating system known as Windows XP, the company’s first
operating system for consumers that was not based on MS-DOS. The same year the company
also released Xbox, its first venture into video-game consoles. Microsoft announced a new
business strategy in 2001 known as .Net (pronounced dot-net). The strategy sought to enable
a variety of hardware devices, from PCs to PDAs to cell phones, to communicate with each
other via the Internet, while also automating many computer functions. Confusion over the
term .Net led to the adoption of the slogan “seamless computing” in 2003.
Other major business developments in the early 21st century included new versions of the
Microsoft Network and the development with several major computer manufacturers of the
Tablet PC, a laptop computer that featured handwriting-recognition software and a wireless
connection to the Internet. In 2003 the company began to focus on “trustworthy computing,”
requiring its programmers to improve their skills in protecting software from malicious hacker
attacks in the form of computer viruses and worms. In 2004 Microsoft sold its innovative online
In November 2005 Microsoft unveiled its new-generation video game console, the Xbox 360
(see Electronic Games). The new device went beyond gaming, providing consumers with the
ability to store and play audio, video, and photo files. The same month Gates and the newly
named chief technology officer, Ray Ozzie, announced a new Web services initiative providing
software services on the Internet accessible from any browser. The initial components,
Windows Live and Office Live, represented a move away from packaged software.
In June 2006 Gates announced that he would begin transitioning from a full-time role at
Microsoft to a full-time role at the Bill & Melinda Gates Foundation. Gates planned to have only
a part-time role at Microsoft by July 2008, though he would retain the title of chairman and
continue to advise the company on key business developments. As part of the transition, he transferred his title of chief software architect to Ray Ozzie.
In November 2006 Microsoft released Vista, its first new operating system since Windows XP
was introduced in 2001. The long-anticipated system was first made available to businesses
only. A consumer version was released in January 2007. The new system won generally
favorable reviews for its improved graphics, search capabilities, and security protections.
VI ANTITRUST CASES
In late 1997 the Justice Department accused Microsoft of violating the 1994 agreement by
requiring computer manufacturers that installed Windows 95 to also include Internet Explorer,
Microsoft’s software for browsing the Internet. The government contended that Microsoft was
illegally taking advantage of its power in the market for computer operating systems to gain
control of the market for Internet browsers. In response, Microsoft argued that it should have
the right to enhance the functionality of Windows by integrating Internet-related features into
Also in late 1997, computer company Sun Microsystems sued Microsoft, alleging that it had
breached a contract for use of Sun’s Java universal programming language by introducing
Windows-only enhancements. In November 1998 a federal district court ruled against Microsoft
on an injunction filed by Sun earlier that year. The injunction forced Microsoft to revise its
software to meet Sun’s Java compatibility standards. The two companies settled the case in
2001, with Microsoft agreeing to pay Sun $20 million for limited use of Java. However, in 2002
Sun filed an antitrust suit seeking $1 billion in damages against Microsoft after Microsoft dropped Java from Windows XP.
Microsoft temporarily settled with the Justice Department in its antitrust case in early 1998 by
agreeing to allow personal computer manufacturers to offer a version of Windows 95 that did
not include access to Internet Explorer. However, in May 1998 the Justice Department and 20
states filed broad antitrust suits charging Microsoft with engaging in anticompetitive conduct.
The suits sought to force Microsoft to offer Windows without Internet Explorer or to include Netscape’s competing Navigator browser with Windows.
The federal antitrust trial against Microsoft began in October 1998. Executives from Netscape,
Sun, and several other computer software and hardware companies testified regarding their
business deals with Microsoft. In November 1999 federal district court judge Thomas Penfield
Jackson issued his findings of fact in the antitrust case, in which he declared that Microsoft had
a monopoly in the market for personal computer operating systems. In 2000 Jackson ruled that
the company had violated antitrust laws by engaging in tactics that discouraged competition.
He ordered Microsoft to be split into two companies: one for operating systems and another for
all other businesses, including its Office software suite. He also imposed a number of interim
restrictions on the company’s business practices. The judge put these penalties on hold while Microsoft appealed the ruling.
In June 2001 an appeals court upheld Jackson’s findings that Microsoft had monopoly power
and that the company used anticompetitive business practices to protect its Windows
monopoly. However, the appeals court threw out the trial court’s ruling that Microsoft had
illegally integrated Internet Explorer into Windows, returning the issue to a lower court for
review under a different legal standard. The appeals court also reversed Jackson’s order to
break up the company, in part because of the judge’s failure to hold a proper hearing on the
remedy and in part because of comments he made to reporters outside the courtroom about
the merits of the case. The court found that Jackson’s comments were improper because they
created the appearance of bias, even though the court found no evidence of actual bias. The
appeals court ordered that the case be assigned to a different district court judge to reconsider the remedy.
The case was assigned to Judge Colleen Kollar-Kotelly, who urged both parties to reach a
settlement. In November 2001 Microsoft announced a settlement with the Justice Department
and nine of the states. Key provisions included requiring Microsoft to reveal technical
information about the Windows operating system to competitors so that software applications
known as middleware would be compatible with Windows, while also enabling personal computer manufacturers to hide access to certain Microsoft programs. A computer manufacturer could therefore remove access to Internet Explorer and enable access to a competing browser. The settlement also established a committee to monitor Microsoft’s compliance with the settlement. However, nine other states and the District of Columbia
refused to accept the agreement and pressed for harsher remedies. (Of the original 20 states,
South Carolina and New Mexico dropped out of the case before the settlement was reached.)
In early 2002 Kollar-Kotelly held hearings to review the terms of the settlement and to consider the harsher remedies sought by the dissenting states.
In November 2002 Kollar-Kotelly approved most of the provisions of the settlement and
rejected nearly all of the harsher remedies proposed by the dissenting parties. However, the
judge amended the settlement by extending the remedies regarding middleware to server
applications and by specifying that the compliance committee should be made up of at least
three outside members of Microsoft’s board of directors, who would be held responsible for ensuring Microsoft’s compliance with the settlement.
In March 2004 the European Commission (EC), the highest administrative body of the European
Union (EU), ended a five-year-long investigation of antitrust charges brought against Microsoft
by finding that the company was an abusive monopolist. The inquiry was initiated by a
complaint filed in 1998 by Sun Microsystems, which charged that Sun’s server software could
not work adequately with Windows because Microsoft withheld information about Windows
source code.
The European Commission fined Microsoft €497 million (about $613 million at exchange rates
in March 2004), a record fine at the time. The commission required Microsoft to produce two
versions of the Windows operating system for the European market, including one without the
Windows Media Player, the application that plays audio and video. It also required the
company to share Windows code with competitors who make network server computer
products. European Union Competition Commissioner Mario Monti said bundling the Windows
Media Player with the operating system gave Microsoft an unfair advantage over competitors,
such as RealNetworks, a Seattle-based company that joined the complaint against Microsoft.
In finding that the practice of bundling was anticompetitive, the EC ruling went beyond the
terms of the settlement with the U.S. Justice Department, which allowed Microsoft to continue
bundling the Internet Explorer browser with Windows. Monti resisted a settlement, saying that
he wanted to establish a legal precedent. Although Microsoft paid the fine into an escrow
account, it sought a stay of the ruling’s antibundling provisions from the EU’s Court of First
Instance while Microsoft appealed the decision. In December 2004, however, the president of
the Court of First Instance denied Microsoft’s request for a stay, and Microsoft was forced to
begin offering in the European market a version of Windows without its media player. In July
2006 the European Commission fined Microsoft an additional $357 million for failing to comply
with its ruling. In September 2007 the Court of First Instance ruled on Microsoft’s appeal,
upholding almost all of the EC’s findings. Microsoft subsequently announced that it would not
pursue any further appeals and would comply with an additional European Commission ruling requiring it to license technical information to competitors on reasonable terms.
In February 2008 the European Commission again fined Microsoft, this time for €899
million (U.S.$1.35 billion) for charging competitors “unreasonable” fees for access to technical
information that it was required to reveal under terms of the original 2004 antitrust ruling. At
the time, the fine was the largest ever levied by the EC against a corporation.
In April 2004 Microsoft and Sun announced that they had reached a settlement in their
ongoing legal disputes and also planned to collaborate with each other on a variety of
technology issues. Under the terms of the unexpected and unprecedented settlement,
Microsoft agreed to pay Sun nearly $2 billion—$700 million to settle the antitrust suit, $900
million to resolve patent issues, and an upfront royalty payment of $350 million for use of
Sun’s technologies. In return Sun agreed to drop its antitrust case and to make royalty payments to Microsoft for use of its technologies. In a statement Sun also said that the objectives it had pursued in the EU case against Microsoft had been achieved. The chief executive officers of the two companies said the settlement was driven in large part by customers who wanted the two companies to work together to solve
technology issues. Most companies use a mixture of hardware and software systems, and
interoperability is key for them, both men noted. The two companies agreed to work together
to improve the compatibility of Microsoft’s .Net platform and Sun’s Java technologies.
In July 2005 Microsoft reached a settlement with IBM. Microsoft agreed to pay IBM $775 million
in cash and $75 million in software to resolve claims arising from the federal district court’s
antitrust findings in 2000. Although IBM did not file suit against Microsoft, the antitrust findings
laid the basis for litigation, and as a result Microsoft pursued a settlement without admitting
liability.
In October 2005 Microsoft reached terms with RealNetworks in a deal valued at $761 million.
The agreement settled the last major private lawsuit brought against the company as a result
of the federal government’s antitrust case. In total Microsoft has paid $7.2 billion in settling
lawsuits and fines stemming from antitrust issues, including a European Commission fine of €497 million (about $613 million).
System V
System V, in computer science, the version of the UNIX system provided by AT&T and others.
It is both a standard, which is principally controlled by AT&T, and a set of commercial products
supplied by many vendors. Individual releases are numbered—for example, System V.4
indicates release 4.
Allocate
Allocate, in relation to computers, to reserve memory for use by a program. Programs often
need certain system resources such as memory or disk space, and they request them as
needed from the operating system. The process of responding to a request for a resource is
called allocation. The two basic types of allocation are static allocation, in which memory is set
aside when the program starts and remains allocated while the program is running, and
dynamic allocation, in which memory is allocated and deallocated while the program is
running. Dynamic memory allocation is done either explicitly, by issuing a memory allocation
request, or implicitly, when control passes to various subroutines (blocks) in a program. See
also Computer.
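A minimal C sketch of the two styles just described: a statically allocated buffer that is set aside for the program’s entire run, and a dynamically allocated buffer explicitly requested from the system with malloc and released with free. The buffer sizes are arbitrary example values.

#include <stdio.h>
#include <stdlib.h>

/* Static allocation: reserved when the program starts, held until it exits. */
static int static_buffer[256];

int main(void) {
    /* Dynamic allocation: an explicit request made while the program runs. */
    int *dynamic_buffer = malloc(256 * sizeof *dynamic_buffer);
    if (dynamic_buffer == NULL) {
        fprintf(stderr, "allocation request failed\n");
        return 1;
    }

    static_buffer[0] = 1;
    dynamic_buffer[0] = 2;
    printf("%d %d\n", static_buffer[0], dynamic_buffer[0]);

    free(dynamic_buffer);  /* explicit deallocation while still running */
    return 0;
}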
BLOCK
Block, in relation to computers, literally, a group of similar things—usually bytes of storage or
data, or segments of text. The word block is used in many contexts, so its exact definition
varies with the type of item referenced. In programming, a block is a section of random access memory temporarily assigned to a program.
In disk storage, a block is a collection of consecutive bytes of data that are read from or written to the disk as a unit.
In applications, a block is a segment of text that can be selected and acted upon as a whole.
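The disk-storage sense of the term can be illustrated with a short C sketch that reads a file one fixed-size block at a time rather than byte by byte. The 512-byte block size used here is a typical example, not a requirement.

#include <stdio.h>

#define BLOCK_SIZE 512  /* example block size in bytes */

int main(int argc, char **argv) {
    if (argc < 2) { fprintf(stderr, "usage: %s file\n", argv[0]); return 1; }
    FILE *f = fopen(argv[1], "rb");
    if (!f) { perror("fopen"); return 1; }

    unsigned char block[BLOCK_SIZE];
    size_t n, total = 0, blocks = 0;
    /* Each fread call transfers up to one whole block as a unit. */
    while ((n = fread(block, 1, BLOCK_SIZE, f)) > 0) {
        total += n;
        blocks++;
    }
    fclose(f);
    printf("%zu bytes in %zu block reads\n", total, blocks);
    return 0;
}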
Computer Boot
Boot, in computer science, as a verb, to start up a computer. As a noun, the process of starting
or resetting a computer. A boot can be “cold,” as when the machine is first turned on, or
“warm,” as when the user presses the Ctrl-Alt-Del combination (IBM) or chooses Restart from
the Special menu (Apple). See also Cold Boot; Warm Boot.
Cold Boot
Cold Boot, in computer science, a startup process that begins with turning on the computer's
power. Typically, a cold boot involves some basic hardware checking by the system followed
by loading of the operating system from disk into memory. This can be done by pressing on
the keyboard Ctrl+Alt+Del, a three-key combination used with IBM and compatible
+Delete) causes a “warm boot“—the computer restarts, but does not go through all of the
internal checks involved when power to the system is switched on (“cold boot“). This particular
key combination is generally considered to have been chosen because the keys are widely
separated on the keyboard and difficult to press inadvertently at the same time.
Warm Boot
Warm Boot, in computer science, a system restart that does not involve turning on the power
and waiting for the computer to check itself and its devices. A warm boot typically means
loading or reloading the computer's operating system. On IBM and compatible personal
computers, a warm boot is accomplished by using the Ctrl-Alt-Del key sequence. On Apple
Macintosh computers, a warm boot can be requested with the Restart command on the Special
menu.
Conversion (computer)
Conversion (computer), in relation to computers, the process of changing from one form or
format to another; where information is concerned, a changeover that affects form but not
substance. Many types of conversion are carried out in work with computers. Among them are
File conversion: Changing a file from one format to another—for example, converting a word-
processed document from the format used by one program to the format used by another.
Another, more detailed, type of file conversion involves changing character coding from one
standard to another, as in converting EBCDIC characters (used primarily with mainframe computers) to ASCII characters. Both software and hardware products have been designed to perform such conversions. See also ASCII; EBCDIC.
Data conversion: Changing the way information is represented—for example, changing binary representation to decimal or hexadecimal. Sometimes such a conversion is done by a person, as when a user looks up hexadecimal equivalents in a published table. Other times, data conversions are carried out by a program (a short sketch in C follows this list).
Media conversion: Changing the storage medium from one form to another—for example, from
disk to tape or from 3.5-inch Apple Macintosh diskette to 5.25-inch MS-DOS diskette. Media
conversions of the Macintosh-to-MS-DOS type usually require both hardware and a conversion
program.
Software conversion: Changing or moving a program designed to run on one computer to run
on another. Usually this involves detailed (professional) work on the program itself.
System conversion: Changing from one operating system to another—for example, from MS-
DOS to UNIX or OS/2. Conversion, especially with a hard-disk system, can include backing up existing files, installing the new operating system, and restoring the files afterward.
Hardware conversion: Changing all or part of a computer system to work with new or different
devices. Hardware conversion related to microcomputers covers a wide range of changes, from adding expansion cards or memory to replacing the machine’s central circuitry.
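The short sketch promised above, in C: the program prints one example value in decimal, hexadecimal, and binary form, showing that a data conversion changes only the representation, never the underlying value.

#include <stdio.h>

int main(void) {
    unsigned int value = 0xB6;  /* an arbitrary example value */

    printf("decimal:     %u\n", value);  /* 182 */
    printf("hexadecimal: %X\n", value);  /* B6 */

    printf("binary:      ");
    for (int bit = 7; bit >= 0; bit--)   /* prints 10110110 */
        putchar(((value >> bit) & 1) ? '1' : '0');
    putchar('\n');
    return 0;
}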
Virus (computer)
I INTRODUCTION
Virus (computer), a self-duplicating computer program that spreads from computer to computer, interfering with data and software. Just as biological viruses infect people, spreading
from person to person, computer viruses infect personal computers (PCs) and servers, the
computers that control access to a network of computers. Some viruses are mere annoyances,
but others can do serious damage. Viruses can delete or change files, steal important
information, load and run unwanted applications, send documents via electronic mail (e-mail),
or even cripple a machine’s operating system (OS), the basic software that runs the computer.
II HOW INFECTIONS OCCUR
A virus can infect a computer in a number of ways. It can arrive on a floppy disk or inside an e-
mail message. It can piggyback on files downloaded from the World Wide Web or from an
Internet service used to share music and movies. Or it can exploit flaws in the way computers
exchange data over a network. So-called blended-threat viruses spread via multiple methods
at the same time. Some blended-threat viruses, for instance, spread via e-mail but also propagate by exploiting flaws in network software.
Traditionally, even if a virus found its way onto a computer, it could not actually infect the
machine—or propagate to other machines—unless the user was somehow fooled into
executing the virus by opening it and running it just as one would run a legitimate program.
But a new breed of computer virus can infect machines and spread to others entirely on its
own. Simply by connecting a computer to a network, the computer owner runs the risk of
infection. Because the Internet connects computers around the world, viruses can spread from one end of the globe to the other in a matter of minutes.
III TYPES OF VIRUSES
There are many categories of viruses, including parasitic or file viruses, bootstrap-sector,
multipartite, macro, and script viruses. Then there are so-called computer worms, which have the ability to spread on their own without attaching to a host program, as described below.
Parasitic or file viruses infect executable files or programs in the computer. These files are
often identified by the extension .exe in the name of the computer file. File viruses leave the
contents of the host program unchanged but attach to the host in such a way that the virus
code is run first. These viruses can be either direct-action or resident. A direct-action virus
selects one or more programs to infect each time it is executed. A resident virus hides in the
computer's memory and infects a particular program when that program is executed.
Bootstrap-sector viruses reside on the first portion of the hard disk or floppy disk, known as the
boot sector. These viruses replace either the programs that store information about the disk's
contents or the programs that start the computer. Typically, these viruses spread by means of infected floppy disks or other removable media.
Multipartite viruses combine the abilities of the parasitic and the bootstrap-sector viruses, and
so are able to infect either files or boot sectors. These types of viruses can spread if a computer is started from an infected disk or if an infected program is run.
Other viruses infect programs that contain powerful macro languages (programming
languages that let the user create new features and utilities). These viruses, called macro
viruses, are written in macro languages and automatically execute when the legitimate
program is opened.
Script viruses are written in script programming languages, such as VBScript (Visual Basic
Script) and JavaScript. These script languages can be seen as a special kind of macro language
and are even more powerful because most are closely related to the operating system
environment. The 'ILOVEYOU' virus, which appeared in 2000 and infected an estimated 1 in 5 personal computers, is a well-known example of a script virus.
Strictly speaking, a computer virus is always a program that attaches itself to some other
program. But computer virus has become a blanket term that also refers to computer worms.
A worm operates entirely on its own, without ever attaching itself to another program.
Typically, a worm spreads over e-mail and through other ways that computers exchange
information over a network. In this way, a worm not only wreaks havoc on machines, but also
clogs network connections and slows network traffic, so that it takes an excessively long time to download data or send e-mail.
IV ANTI-VIRAL TACTICS
A Preparation and Prevention
Computer users can prepare for a viral infection by creating backups of legitimate original
software and data files regularly so that the computer system can be restored if necessary.
Viral infection can be prevented by obtaining software from legitimate sources or by using a
quarantined computer—that is, a computer not connected to any network—to test new
software. In addition, users should regularly install operating system (OS) patches, software updates
that mend the sort of flaws, or holes, in the OS often exploited by viruses. Patches can be
downloaded from the Web site of the operating system’s developer. However, the best
prevention may be the installation of current and well-designed antiviral software. Such
software can prevent a viral infection and thereby help stop its spread.
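As a concrete example of the backup habit described above, the following Python sketch copies a user's data directory into a date-stamped folder. The documents and backups paths are placeholders invented for the example.

import shutil
from datetime import date
from pathlib import Path

# Hypothetical locations; adjust for the system being protected.
SOURCE = Path.home() / "documents"
BACKUP_ROOT = Path.home() / "backups"

def make_backup() -> Path:
    """Copy the data directory to a date-stamped backup folder."""
    destination = BACKUP_ROOT / ("documents-" + date.today().isoformat())
    # copytree raises an error rather than overwrite an existing backup.
    shutil.copytree(SOURCE, destination)
    return destination

if __name__ == "__main__":
    print("Backed up to", make_backup())

Run daily (for example, from a scheduler), this produces the series of restore points that recovery from an infection depends on.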
B Virus Detection
Several types of antiviral software can be used to detect the presence of a virus. Scanning
software can recognize the characteristics of a virus's computer code and look for these
characteristics in the computer's files. Because new viruses must be analyzed as they appear,
scanning software must be updated periodically to be effective. Other scanners search for
common features of viral programs and are usually less reliable. Most antiviral software uses
both on-demand and on-access scanners. On-demand scanners are launched only when the user activates them; because they usually detect a virus only after an infection has occurred, they are considered the reactive part of an antivirus package. On-access scanners, by contrast, run constantly in the background, invisible to the user, checking files for viruses as they are opened; they are considered the proactive part of the package.
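The following Python sketch illustrates the signature-scanning idea in its simplest on-demand form. The byte patterns here are placeholders invented for the example; a real scanner matches files against a vendor's regularly updated signature database and uses far more efficient search algorithms.

from pathlib import Path

# Placeholder byte patterns standing in for real virus signatures,
# which a vendor would ship in a frequently updated database.
SIGNATURES = {
    "ExampleVirus.A": b"\xde\xad\xbe\xef",
    "ExampleVirus.B": b"EVIL-MARKER",
}

def scan_file(path: Path) -> list[str]:
    """Return the names of any known signatures found in one file."""
    data = path.read_bytes()
    return [name for name, pattern in SIGNATURES.items() if pattern in data]

def on_demand_scan(root: Path) -> None:
    """Walk a directory tree and report matches, as an on-demand scanner would."""
    for path in root.rglob("*"):
        if path.is_file():
            for name in scan_file(path):
                print(path, "matched", name)

An on-access scanner performs essentially the same matching but is triggered by the operating system each time a file is opened, rather than by the user.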
Antivirus software is usually sold as a package of separate programs, each performing a different function; installed together, they provide broad, though never complete, protection against viruses. Within
most antiviral packages, several methods are used to detect viruses. Checksumming, for
example, uses mathematical calculations to compare the state of executable programs before and after they are run. If the checksum has not changed, the program has most likely not been infected. Checksumming software can detect an infection only after it has occurred, however, and because the technique is dated and some viruses can evade it, checksumming is rarely used on its own today.
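A minimal sketch of the checksumming idea, in Python: record a digest for each executable while it is trusted, then recompute and compare later. The bin directory is a placeholder, and SHA-256 is used simply as a convenient modern checksum; the article does not name a particular algorithm.

import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """Return a SHA-256 digest of the file's current contents."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

# Record known-good checksums once, while the programs are still trusted.
# "bin" is a placeholder for wherever the executables actually live.
baseline = {path: checksum(path) for path in Path("bin").glob("*.exe")}

def is_unmodified(path: Path) -> bool:
    """True if the file still matches its recorded baseline checksum."""
    return checksum(path) == baseline.get(path)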
Most antivirus packages also use heuristics (rules of thumb that give good, but not guaranteed, results) to detect new viruses. This technology observes a program's behavior and evaluates how closely it resembles a virus, relying on experience with previous viruses to predict the likelihood that a suspicious program is in fact a new virus.
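As a rough illustration of heuristic scoring (not any vendor's actual method), the Python sketch below sums weights for a few suspicious byte patterns; the traits, weights, and threshold are all assumptions made up for the example.

from pathlib import Path

# Invented traits and weights, purely for illustration; real heuristic
# engines weigh many features learned from previously seen viruses.
SUSPICIOUS_TRAITS = {
    b"format c:": 0.6,           # tries to wipe a drive
    b"WriteProcessMemory": 0.3,  # name of a code-injection API
    b"CreateRemoteThread": 0.3,
}
THRESHOLD = 0.5

def heuristic_score(path: Path) -> float:
    """Sum the weights of all suspicious traits present in the file."""
    data = path.read_bytes()
    return sum(weight for trait, weight in SUSPICIOUS_TRAITS.items()
               if trait in data)

def looks_viral(path: Path) -> bool:
    """Flag the file if its score crosses the detection threshold."""
    return heuristic_score(path) >= THRESHOLD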
Other types of antiviral software include monitoring software and integrity-shell software. Monitoring software watches for potentially damaging viral activities, such as overwriting computer files or reformatting the computer's hard drive. Integrity-shell software establishes layers through which any command to run a program must pass. Checksumming is performed automatically within the integrity shell, and an infected program, if detected, is not allowed to run.
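The integrity-shell idea can be sketched as a wrapper that every program launch must pass through. In this hypothetical Python example, the TRUSTED table of known-good digests is a placeholder; a real integrity shell would also have to protect that table itself from tampering.

import hashlib
import subprocess
import sys
from pathlib import Path

# Placeholder table of known-good digests, recorded while the
# programs were still trusted.
TRUSTED = {
    "example.exe": "<known-good sha256 digest>",
}

def run_checked(program: str) -> None:
    """Launch a program only if its checksum still matches the record."""
    digest = hashlib.sha256(Path(program).read_bytes()).hexdigest()
    if TRUSTED.get(Path(program).name) != digest:
        sys.exit(program + ": checksum mismatch, refusing to run")
    subprocess.run([program], check=False)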
C Containment and Recovery
Once a viral infection has been detected, it can be contained by immediately isolating
computers on networks, halting the exchange of files, and using only write-protected disks. In
order for a computer system to recover from a viral infection, the virus must first be
eliminated. Some antivirus software attempts to remove detected viruses, but sometimes with
unsatisfactory results. More reliable results are obtained by turning off the infected computer;
restarting it from a write-protected floppy disk; deleting infected files and replacing them with
legitimate files from backup disks; and erasing any viruses on the boot sector.
V VIRAL STRATEGIES
The authors of viruses have several strategies to circumvent antivirus software and to
propagate their creations more effectively. So-called polymorphic viruses make variations in
the copies of themselves to elude detection by scanning software. A stealth virus hides from
the operating system when the system checks the location where the virus resides, by forging
results that would be expected from an uninfected system. A so-called fast-infector virus
infects not only programs that are executed but also those that are merely accessed. As a
result, running antiviral scanning software on a computer infected by such a virus can infect
every program on the computer. A so-called slow-infector virus infects files only when the files
are modified, so that it appears to checksumming software that the modification was legitimate. A so-called sparse-infector virus infects only occasionally; for example, it may infect every tenth program executed. This strategy makes it more difficult to detect the
virus.
By using combinations of several virus-writing methods, virus authors can create more
complex new viruses, and many virus authors quickly adopt new technologies as they appear. The antivirus industry must move just as rapidly, updating its antiviral software to detect and eliminate each new threat.
There are other harmful computer programs that can be part of a virus but are not considered
viruses because they do not have the ability to replicate. These programs fall into three
categories: Trojan horses, logic bombs, and deliberately harmful or malicious software programs that run within a Web browser (an application program, such as Internet Explorer or Netscape, that displays Web sites).
A Trojan horse is a program that pretends to be something else. A Trojan horse may appear to
be something interesting and harmless, such as a game, but when it runs it may have harmful
effects. The term comes from the ancient Greek legend of the wooden horse used to trick the defenders of Troy, a story told in Homer’s Odyssey and Virgil’s Aeneid.
A logic bomb infects a computer’s memory, but unlike a virus, it does not replicate itself. A
logic bomb delivers its instructions when it is triggered by a specific condition, such as when a particular date or time is reached or a particular user action occurs. Once triggered, a logic bomb may erase a hard drive or delete certain files.
Malicious software programs that run within a Web browser often appear in Java applets and
ActiveX controls. Although these applets and controls improve the usefulness of Web sites,
they also increase a vandal’s ability to interfere with unprotected systems. Because those
controls and applets require that certain components be downloaded to a user’s personal
computer (PC), activating an applet or control might actually download malicious code.
A History
In 1949 Hungarian American mathematician John von Neumann, at the Institute for Advanced
Study in Princeton, New Jersey, proposed that it was theoretically possible for a computer
program to replicate. This theory was tested in the 1950s at Bell Laboratories when a game
called Core Wars was developed, in which players created tiny computer programs that attacked, erased, and tried to propagate on an opponent’s system.
In 1983 American electrical engineer Fred Cohen, at the time a graduate student, coined the
term virus to describe a self-replicating computer program. In 1985 the first Trojan horses appeared, disguised as seemingly useful programs and games.
The so-called Brain virus appeared in 1986 and spread worldwide by 1987. In 1988 two new viruses appeared: Stone, an early bootstrap-sector virus, and the Internet worm, which crossed the United States overnight via computer network. The Dark Avenger virus, the first fast infector, appeared in 1989.
Computer viruses grew more sophisticated in the 1990s. In 1995 the first macro language
virus, WinWord Concept, was created. In 1999 the Melissa macro virus, spread by e-mail,
disabled e-mail servers around the world for several hours, and in some cases several days.
Regarded by some as the most prolific virus ever, Melissa cost corporations millions of dollars in lost productivity and cleanup expenses.
The VBS_LOVELETTER script virus, also known as the Love Bug and the ILOVEYOU virus,
unseated Melissa as the world's most prevalent and costly virus when it struck in May 2000. By
the time the outbreak was finally brought under control, losses were estimated at U.S.$10
billion, and the Love Bug is said to have infected 1 in every 5 PCs worldwide.
The year 2003 was a particularly bad year for computer viruses and worms. First, the Blaster
worm infected more than 10 million machines worldwide by exploiting a flaw in Microsoft’s
Windows operating system. A machine that lacked the appropriate patch could be infected
simply by connecting to the Internet. Then, the SoBig worm infected millions more machines in
an attempt to convert systems into networking relays capable of sending massive amounts of
junk e-mail known as spam. SoBig spread via e-mail, and before the outbreak was 24 hours
old, MessageLabs, a popular e-mail filtering company, captured more than a million SoBig
messages and called it the fastest-spreading virus in history. In January 2004, however, the
MyDoom virus set a new record, spreading even faster than SoBig and, by most accounts, causing even greater disruption and expense.
Contributed By:
Eddy Willems
Application
Application, in computer science, a program designed to assist in the performance of a certain type of work. An application thus differs from an operating system (which runs the computer itself). Some applications focus on a single task, such as word processing; others, called integrated software, offer somewhat less power but include several applications, such as a word processor, a spreadsheet, and a database program.
Windowing Environment
Windowing Environment, in computer science, an operating system or shell that presents the
user with specially delineated areas of the screen called windows. Each window can act independently, as if it were a separate display, and most windowing environments allow windows to be resized and moved around on the display. The Apple Macintosh Finder,
Microsoft Windows, and the OS/2 Presentation Manager are all examples of windowing
environments.
RAM
RAM, in computer science, acronym for random access memory; semiconductor-based memory that can be read and written by the microprocessor or other hardware devices. The
storage locations can be accessed in any order. Note that the various types of ROM memory
are capable of random access. The term RAM, however, is generally understood to refer to
volatile memory, which can be written as well as read. See also Computer; EPROM; PROM.