MIS Chapter Three
3. INFORMATION TECHNOLOGY
3.1. Introduction
All computers are systems of input, processing, output, storage, and control components. In this section, we discuss the history, trends, applications, and some basic concepts of the many types of computer systems in use today, along with the hardware and software components of a computer, people, procedures, communication technologies, and database management.
At the dawn of the human concept of numbers, humans used their fingers and toes
to perform basic mathematical activities. Then our ancestors realized that by using
some objects to represent digits, they could perform computations beyond the
limited scope of their own fingers and toes. Shells, chicken bones, or any number
of objects could have been used, but the fact that the word calculate is derived
from calculus, the Latin word for “small stone,” suggests that pebbles or beads
were arranged to form the familiar abacus, arguably the first human-made
computing device. By manipulating the beads, it was possible with some skill and
practice to make rapid calculations.
The ENIAC (Electronic Numerical Integrator and Computer) was the first
electronic digital computer. It was completed in 1946 at the Moore School of
Electrical Engineering of the University of Pennsylvania. With no moving parts,
ENIAC was programmable and had the capability to store problem calculations using its roughly 18,000 vacuum tubes. A computer that uses vacuum-tube technology is called a first-generation computer. The principal drawbacks of ENIAC were its size and limited processing ability: it occupied more than 1,500 square feet of floor space and could process only one program or problem at a time.
In the late 1950s, transistors were invented and quickly replaced the thousands of
vacuum tubes used in electronic computers. A transistor-based computer could per-
form 200,000–250,000 calculations per second. The transistorized computer
represents the second generation of computers.
It was not until the mid-1960s that the third generation of computers came into
being. These were characterized by solid-state technology and integrated circuitry
coupled with extreme miniaturization.
After this brief history, the next important point is processing speed. Different units are used to measure the speed of computer systems. Early computer processing speeds were measured in milliseconds (thousandths of a second) and microseconds (millionths of a second). Now computers operate in the nanosecond (billionth of a second) range, with picosecond (trillionth of a second) speeds being attained by some computers.
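As a quick illustration (the loop below is a sketch; the unit definitions are those given above), each of these units implies a corresponding number of operations per second:

```python
# Illustrative sketch: operations per second implied by each cycle time
# named in the text. Figures are unit conversions, not benchmark results.
units = {
    "millisecond": 1e-3,   # thousandth of a second
    "microsecond": 1e-6,   # millionth of a second
    "nanosecond":  1e-9,   # billionth of a second
    "picosecond":  1e-12,  # trillionth of a second
}

for name, seconds in units.items():
    print(f"one operation per {name} = {1 / seconds:,.0f} operations per second")
```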
Peripherals is the generic name given to all input, output, and secondary storage
devices that are part of a computer system but are not part of the CPU. The major
types of peripherals and media that can be part of a computer system are discussed
as follows.
Input technologies now provide a more natural user interface for computer users.
You can enter data and commands directly and easily into a computer system
through pointing devices like electronic mice and touch pads and with
technologies like optical scanning, handwriting recognition, and voice
recognition.
Output technologies
Storage technologies
Data and information must be stored until needed using a variety of storage
methods. For example, many people and organizations still rely on paper
documents stored in filing cabinets as a major form of storage media. However,
other computer users are more likely to depend on the memory circuits and secondary storage devices of computer systems to meet their storage requirements.
Progress in very-large-scale integration (VLSI), which packs millions of memory
circuit elements on tiny semi-conductor memory chips (primary storage), is
responsible for continuing increases in the main-memory capacity of computers.
Secondary storage capacities are also escalating into the billions and trillions of
characters, due to advances in magnetic (floppy disk and hard disk drive) and
optical media.
Software is the general term for various kinds of programs used to operate and
manipulate computers and their peripheral devices. One common way of
describing hardware and software is to say that software can be thought of as the
variable part of a computer and hardware as the invariable part. There are many
types and categories of software. The two major categories of software are
application software and system software.
System software consists of programs that manage and support a computer system
and its information processing activities. We can group system software into two major categories: system management programs and system development programs.
Application trends: Toward the pervasive use of the Internet, enterprise intranets,
and interorganizational extranets to support electronic business and commerce,
enterprise collaboration, and strategic advantage in local and global markets.
Network Topologies
An effective information system provides users with timely, accurate, and relevant
information. This information is stored in computer files. When the files are
properly arranged and maintained, users can easily access and retrieve the
information they need.
Database technology can cut through many of the problems created by traditional
file organization. A more rigorous definition of a database is a collection of data
organized to serve many applications efficiently by centralizing the data and
minimizing redundant data. Rather than storing data in separate files for each
application, data are stored so as to appear to users as being stored in only one location. A single database serves multiple applications.
A computer system organizes data in a hierarchy that starts with bits and bytes and
progresses to fields, records, files, and databases. A bit represents the smallest unit of data a computer can handle.
A group of bits, called a byte, represents a single character, which can be a letter, a number, or another symbol. A grouping of characters into a word, a group of words, or a complete number (such as a person's name or age) is called a field. A group of related fields, such as the student's name, the course taken, the date, and the grade, makes up a record. A group of records of the same type is called a file. A group of related files makes up a database.
Every record in a file should contain at least one field that uniquely identifies that
record so that the record can be retrieved, updated or sorted. This identifier field is
called a key field.
A computer system organizes data in a hierarchy that starts with the bit, which
represents either a 0 or a 1. Bits can be grouped to form a byte to represent one
character, number, or symbol. Bytes can be grouped to form a field, and related
fields can be grouped to form a record. Related records can be collected to form a
file and related files can be organized into a database.
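As a minimal sketch of this hierarchy (every name and value below is invented for illustration, not taken from the text), the same structure can be written out in Python:

```python
# Hypothetical example of the data hierarchy: field -> record -> file -> database.

# A field: a grouping of characters into a word or a complete number.
student_name = "A. Student"

# A record: a group of related fields describing one entity.
record = {
    "student_id": "S001",        # key field: uniquely identifies this record
    "student_name": student_name,
    "course": "IS 101",
    "date": "2024-12-15",
    "grade": "A",
}

# A file: a group of records of the same type.
grade_file = [record]

# A database: a group of related files.
database = {"GRADES": grade_file}

# The key field lets a record be retrieved, updated, or sorted.
wanted = next(r for r in database["GRADES"] if r["student_id"] == "S001")
print(wanted["grade"])  # -> A
```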
(A figure showing a sample order record is referenced here but not reproduced.) The record describes the entity called ORDER and its attributes; the specific values for order number, order date, item number, quantity, and amount for this particular order are an instance of that record.
The direct file access method is used with direct file organization. This method employs a key field to locate the physical address of a record. The process is accomplished using a mathematical formula called a transform algorithm to translate the key field directly into the record's physical storage location. This access method is most appropriate for applications in which individual records must be located directly and rapidly for immediate processing, only a few records in the file need to be retrieved at one time, and the required records are found in no particular sequence, e.g., an online hotel reservation system.
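One widely used transform algorithm is division-remainder hashing; the sketch below assumes a numeric key field and an invented file size, and is meant only to illustrate translating a key directly into a storage address:

```python
# Illustrative transform algorithm: division-remainder hashing.
# FILE_SIZE and the sample key are hypothetical.
FILE_SIZE = 997  # number of physical storage locations (a prime works well)

def transform(key: int) -> int:
    """Translate a key field directly into a physical storage address."""
    return key % FILE_SIZE

# A reservation keyed by its confirmation number can be located directly,
# with no sequential search through the file.
print(transform(480912))  # address of the record with this key
```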
In the company as a whole, this process of independent application development led to multiple master files created, maintained, and operated by separate divisions or departments. The traditional file
environment is a way of collecting and maintaining data in an organization that
leads to each functional area or division creating and maintaining its own data files
and programs.
Under this file environment, there is no central listing of data files, data elements
or definition of data. The organization is collecting the same information on far too
many documents. The resulting problems are data redundancy, program data
dependence, inflexibility, poor data security, and inability to share data among
applications.
Data redundancy is the presence of duplicate data in multiple data files; it occurs when different divisions, functional areas, and groups in an organization independently collect the same piece of information.
Program-data dependence is the tight relationship between data stored in files and the specific programs
required to update and maintain those files. Every computer program has to
describe the location and nature of the data with which it works. These data
declarations can be longer than the substantive part of the program. In a traditional
file environment, any change in data requires a change in all of the programs that
access the data.
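A small sketch can make this concrete. The record layout below is invented; the point is that the program itself hard-codes the location and nature of the data, so a layout change breaks every such program:

```python
# Hypothetical fixed-width payroll record, as a traditional file program
# might describe it:
#   columns 0-5   employee id
#   columns 6-25  employee name
#   columns 26-33 gross pay, stored in cents
line = "000123Jane Q. Employee    00415075"

employee_id = line[0:6]
name = line[6:26].strip()
gross_pay = int(line[26:34]) / 100

print(employee_id, name, gross_pay)  # -> 000123 Jane Q. Employee 4150.75

# If the file layout changes (say, the name field is widened), every program
# containing these hard-coded positions must be found and rewritten.
```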
A traditional file system can deliver routine scheduled reports after extensive
programming efforts, but it cannot deliver ad hoc reports or respond to
unanticipated information requirements in a timely fashion. The information required by ad hoc requests is "somewhere in the system" but is too expensive to
retrieve. Several programmers would have to work for weeks to put together the
required data items in a new file.
The DBMS acts as an interface between application programs and the physical data files. When the application program calls for a data item such as gross pay, the DBMS finds this item in the database and presents it to the application program. Using traditional data files, the programmer would have to define the data and then tell the computer where they are. The following figure illustrates the elements of a database management system.
(Figure not reproduced; among the elements shown is a data dictionary.)
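As a hedged sketch (the table and column names are invented, and SQLite merely stands in for a DBMS), the program names the item it wants, such as gross pay, and the DBMS finds it:

```python
# Sketch of the DBMS as an interface between a program and the physical data.
# Table and column names are hypothetical; SQLite stands in for the DBMS.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE payroll (employee_id TEXT, gross_pay REAL)")
db.execute("INSERT INTO payroll VALUES ('000123', 4150.75)")

# The program asks for the data item by name; no record layouts, slice
# positions, or file locations appear in the application code.
row = db.execute(
    "SELECT gross_pay FROM payroll WHERE employee_id = '000123'"
).fetchone()
print(row[0])  # -> 4150.75
```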
Program-data dependence can be reduced by separating the logical view of data from its physical arrangement.
Designing Databases
There are alternative ways of organizing data and representing relationships among data in a database. Conventional DBMSs use one of three principal logical
database models for keeping track of entities, attributes, and relationships. The
three principal logical database models are hierarchical, network, and relational.
Each logical model has certain processing advantages and certain business
disadvantages.
The hierarchical data model presents data to users in a treelike structure. Within each record, data elements are organized into pieces of records called segments. To the user, each record looks like an organization chart with one top-level segment called the root. An upper segment is connected logically to a lower segment in a parent-child relationship; a parent segment can have more than one child, but a child can have only one parent. Consider the following figure.
(Figure not reproduced; it shows a hierarchical record with child segments such as Pension.)
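A minimal sketch of the tree structure (segment names are illustrative, echoing the Pension segment from the figure):

```python
# Hypothetical hierarchical record: each child segment has exactly one parent.
tree = {
    "Employee": {              # the root segment
        "Compensation": {},
        "Benefits": {          # an upper segment linked to lower segments
            "Pension": {},     # each child belongs to one parent only
            "Health": {},
        },
    }
}

def walk(node, depth=0):
    """Print the segments as the user sees them: an organization chart."""
    for segment, children in node.items():
        print("  " * depth + segment)
        walk(children, depth + 1)

walk(tree)
```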
The network data model is a variation of the hierarchical data model. Indeed, databases can be translated from hierarchical to network and vice versa in order to optimize processing speed and convenience. Whereas hierarchical structures depict only one-to-many relationships, network structures depict data logically in many-to-many relationships. In other words, parents can have multiple "children," and a child can have more than one parent. A typical many-to-many relationship in which a network DBMS excels in performance is the student-course relationship (see the following figure).
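A minimal sketch of such a many-to-many relationship (all names invented):

```python
# Hypothetical student-course data: a course has many students, and a student
# takes many courses -- the many-to-many shape the network model handles.
course_students = {
    "IS 101": ["Abebe", "Sara"],
    "ACCT 201": ["Sara", "Kebede"],
}

# Inverting the mapping shows each "child" (student) with several
# "parents" (courses), which a strict hierarchy cannot express.
student_courses = {}
for course, students in course_students.items():
    for student in students:
        student_courses.setdefault(student, []).append(course)

print(student_courses["Sara"])  # -> ['IS 101', 'ACCT 201']
```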
The relational data model, the most recent of these three database models,
overcomes some of the limitations of the other two models. The relational model
represents all data in the database as simple two-dimensional tables called relations. The tables appear similar to flat files, but information in more than one file can be easily extracted and combined. Sometimes the tables are referred to as files. Consider the following figure.
(Figure: a table (relation) whose columns (fields) are Order Number, Order Date, Delivery Date, Part, Amount, and Order Total.)
Hierarchical and network structures have several disadvantages. All of the access
paths, directories, and indices must be specified in advance. Once specified, they
are not easily changed without a major programming effort. Therefore, these
designs have low flexibility.
The strengths of relational DBMSs are great flexibility in regard to ad hoc queries, the power to combine information from different sources, and simplicity of design and maintenance. The main weakness of relational DBMSs is their relatively low processing efficiency.
These systems are somewhat slower because they typically require many accesses
to the data stored on disk to carry out the select, join, and project commands.
Selecting one part number from among millions, one record at a time, can take a
long time. Of course, the database can be indexed and "tuned" to speed up pre-specified queries. Relational systems do not have the large number of pointers carried by hierarchical systems.
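The select, join, and project operations named above can be sketched in SQL (the tables and data are invented; SQLite stands in for a relational DBMS):

```python
# Sketch of the three basic relational operations: select rows, join two
# tables on a shared field, and project (keep) only the columns requested.
# All table names and data are hypothetical.
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE orders (order_number TEXT, part_number TEXT, order_total REAL);
    CREATE TABLE parts  (part_number TEXT, part_name TEXT);
    INSERT INTO orders VALUES ('1634', '152', 144.50);
    INSERT INTO parts  VALUES ('152', 'Door latch');
""")

rows = db.execute("""
    SELECT o.order_number, p.part_name, o.order_total   -- project
    FROM orders AS o
    JOIN parts AS p ON o.part_number = p.part_number    -- join
    WHERE o.order_number = '1634'                       -- select
""").fetchall()
print(rows)  # -> [('1634', 'Door latch', 144.5)]
```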
Much more is required for the development of database systems than simply
selecting a logical database model. Indeed, this selection may be among the last decisions. The database is an organizational discipline, a method, rather than a tool
or technology. It requires organizational and conceptual change.
Without management support and understanding, database efforts fail. The critical
elements in a database environment are (1) data administration, (2) data planning
and modeling methodology, (3) database technology and management, and (4)
users.
1) Data Administration
Database systems require that the organization recognize the strategic role of
information and begin actively to manage and plan for information as a corporate
resource. This means that the organization must develop a data administration
function with the power to define information requirements for the entire company
and with direct access to senior management.
Data administration is responsible for the specific policies and procedures through
which data can be managed as an organizational resource. These responsibilities
include developing information policy, planning for data, overseeing logical
database design and data dictionary development, and monitoring the usage of data
by information system specialists and end-user groups.
Because the organizational interests served by the DBMS are much broader than
those in the traditional file environment, the organization requires enterprise-wide
planning for data. Enterprise analysis, which addresses the information
requirements of the entire organization (as opposed to the requirements of
individual applications), is needed to develop databases. The purpose of enterprise
analysis is to identify the key entities, attributes, and relationships that constitute the
organization's data.
Databases require new software and a new staff specially trained in DBMS techniques, as well as new management structures. Most corporations develop a
database design and management group within the corporate information system
division that is responsible for the more technical and operational aspects of
managing data. The functions it performs are called database administration. This
group does the following: