
Notes For Finals

This document provides an overview of auditing database systems. It discusses the two approaches to data management: flat-file and database models. The flat-file approach can lead to data redundancy, storage problems, updating issues, and currency problems. A database management system (DBMS) using the database approach can overcome these issues. The key elements of a database environment are the DBMS, users, database administrator, physical database, and DBMS models. Common DBMS models include hierarchical, network, and relational models. The document also briefly discusses the systems development life cycle and application controls.


CHAPTER 4

AUDITING DATABASE SYSTEMS

Two general approaches to data management:

1. flat-file model
2. database model

1. FLAT-FILE APPROACH
• flat files are data files that contain records with no structured relationships to other files.
• most often associated with so-called legacy systems.

Data redundancy - replication of essentially the same data in multiple files.

It contributes to three significant problems in the flat-file environment:

a. data storage
b. data updating
c. currency of information

A fourth flat-file problem, task-data dependency, is not specifically caused by data redundancy.

2. DATABASE APPROACH

database management system (DBMS) - controls access to the data resource.
• a special software system that is programmed to know which data elements each user is
authorized to access.

The traditional problems associated with the flat-file approach may be overcome through:
a. Elimination of Data Storage Problem
b. Elimination of Data Update Problem
c. Elimination of Currency Problem
d. Elimination of Task-Data Dependency Problem

KEY ELEMENTS OF THE DATABASE ENVIRONMENT:


1. database management system (DBMS)
2. users
3. database administrator
4. physical database
5. DBMS models
1. DATABASE MANAGEMENT SYSTEM
• central element of the database approach.

TYPICAL FEATURES:
a. Program development - contains application development software that users and programmers employ to create applications that access the database.
b. Backup and recovery - makes backup copies of the physical database.
c. Database usage reporting - captures statistics on what data are being used.
d. Database access - permits authorized user access, both formal and informal, to the database.

three software modules that facilitate this task:


1) Data definition language (DDL)
2) Data Manipulation Language
3) Query Language

1) Data Definition Language


• a programming language used to define the database to the DBMS.
• identifies the names and the relationships of all data elements, records, and files that
constitute the database.

This definition has three levels, called database views:


a. physical internal view
b. conceptual view (schema)
c. user view (subschema)

A. PHYSICAL INTERNAL VIEW - presents the physical arrangement of records in the database.

- the lowest level of representation, one step removed from the physical database.
- describes the structures of data records, the linkages between files, and the
physical arrangement and sequence of records in a file.

B. CONCEPTUAL VIEW/LOGICAL VIEW (SCHEMA) - describes the entire database.


C. EXTERNAL VIEW/USER VIEW (SUBSCHEMA) - defines the user’s section of the
database—the portion that an individual user is authorized to access.
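To make the DDL idea concrete before moving on, here is a minimal sketch using Python's built-in sqlite3 module: the CREATE TABLE statement is the DDL that names the data elements and their types for one record type. The table and column names are illustrative assumptions, not taken from the text.

    import sqlite3

    # Hypothetical sketch: the DDL (here, SQL's CREATE TABLE) defines the
    # CUSTOMER record type and its data attributes to the DBMS.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE customer (
            cust_id      INTEGER PRIMARY KEY,   -- key field
            cust_name    TEXT NOT NULL,         -- data attribute
            acct_balance REAL DEFAULT 0.0       -- data attribute
        )
    """)
    conn.commit()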
2. USERS
users access the database in two ways:
1. Formal Access: Application Interfaces
2. Informal Access: Query Language

FORMAL ACCESS:
2) Data Manipulation Language
• proprietary programming language that a particular DBMS uses to retrieve, process, and
store data.
DBMS Operation - illustrates how the DBMS and user applications work together.
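A minimal sketch of formal access, assuming a DML-style interface exposed through Python's sqlite3 module; the table, column names, and values are illustrative assumptions:

    import sqlite3

    # Hypothetical sketch: an application program issues DML statements to
    # store and update data without knowing how records are physically stored.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customer (cust_id INTEGER PRIMARY KEY, cust_name TEXT, acct_balance REAL)")
    conn.execute("INSERT INTO customer VALUES (?, ?, ?)", (1001, "ABC Supply", 2500.00))        # store
    conn.execute("UPDATE customer SET acct_balance = acct_balance - 500 WHERE cust_id = 1001")  # process
    conn.commit()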

INFORMAL ACCESS:
3) Query Language
• a query is an ad hoc access methodology for extracting information from a database.
• Structured Query Language (SQL) - standard query language for both mainframe and
microcomputer DBMSs.
▪ a fourth-generation, nonprocedural language.
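A short sketch of informal (ad hoc) access: a one-off SQL query is posed directly rather than through a pre-written application program. It is run here through Python's sqlite3 module; the sample data are illustrative assumptions.

    import sqlite3

    # Hypothetical sketch of an ad hoc query against a small customer table.
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customer (cust_id INTEGER, cust_name TEXT, acct_balance REAL)")
    conn.executemany("INSERT INTO customer VALUES (?, ?, ?)",
                     [(1, "ABC Supply", 2500.0), (2, "XYZ Corp", 150.0)])
    for row in conn.execute("SELECT cust_name, acct_balance FROM customer WHERE acct_balance > 1000"):
        print(row)   # ('ABC Supply', 2500.0)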

3. DATABASE ADMINISTRATOR
• is responsible for managing the database resource. The sharing of a common database
by multiple users requires organization, coordination, rules, and guidelines to protect the
integrity of the database.

Functions of the Database Administrator:

DATABASE PLANNING: develop database strategy; define database environment; develop data requirements; develop data dictionary.
DESIGN: database schema; database subschemas; internal view of database; database controls.
IMPLEMENTATION: access policy; security controls; test procedures; programming standards.
OPERATIONS AND MAINTENANCE: evaluate database performance; reorganize database; review standards and procedures.
CHANGE AND GROWTH: plan for change and growth; evaluate new technology.

Another important function of the DBA is the creation and maintenance of the data dictionary.
The data dictionary describes every data element in the database.
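A small, hypothetical sketch of what one data dictionary entry might capture, represented here as a Python dictionary; the attribute names and values are illustrative only.

    # Hypothetical data dictionary entry for a single data element.
    data_dictionary_entry = {
        "data_element": "acct_balance",
        "description": "current balance of the customer account",
        "data_type": "numeric, 2 decimal places",
        "source": "cash receipts and sales order systems",
        "used_by": ["billing", "credit check", "aged receivables report"],
        "authorized_users": ["AR clerk", "credit manager"],
    }

    print(data_dictionary_entry["data_element"], "-", data_dictionary_entry["description"])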
4. PHYSICAL DATABASE
• lowest level of the database and the only level that exists in physical form.
Data Structures - the bricks and mortar of the database.
- allow records to be located, stored, and retrieved, and enable movement from
one record to another.

two fundamental components:


1. organization method
2. access method

1. Data Organization - the way records are physically arranged on the secondary storage
device. This may be either sequential or random.
➢ sequential files are stored in contiguous locations that occupy a specified area of
disk space.
➢ random files are stored without regard for their physical relationship to other
records of the same file.
2. Data Access Methods – the technique used to locate records and to navigate through the
database.
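A rough sketch contrasting the two ideas in Python: a sequential lookup scans records in stored order, while a direct (hashed) lookup computes a storage address from the key. The record layout and the simple modulo hash are illustrative assumptions, not a specific access method from the text.

    # Illustrative records keyed by account number.
    records = {1001: "Jones", 2044: "Smith", 3950: "Lee"}

    # Sequential access: examine records one after another in stored order.
    def sequential_lookup(key):
        for k, name in records.items():
            if k == key:
                return name
        return None

    # Direct (hashed) access: compute a bucket address from the key itself.
    N_BUCKETS = 7
    buckets = [[] for _ in range(N_BUCKETS)]
    for k, name in records.items():
        buckets[k % N_BUCKETS].append((k, name))   # k % N_BUCKETS acts as the hash

    def hashed_lookup(key):
        for k, name in buckets[key % N_BUCKETS]:
            if k == key:
                return name
        return None

    print(sequential_lookup(2044), hashed_lookup(2044))   # Smith Smith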

5. DBMS MODELS
Database Terminologies:
• Data Attribute/Field - a single item of data, such as customer’s name, account balance,
or address.
• Entity - a database representation of an individual resource, event, or agent about which
we choose to collect data.
• Record Type (Table or File) - a grouping of the data attributes that logically define an entity.
• Database - set of record types that an organization needs to support its business
processes.
• Associations - Record types that constitute a database exist in relation to other record
types.
Three basic record associations are:
a. one-to-one
b. one-to-many
c. many-to-many

THE HIERARCHICAL MODEL


• earliest database management systems

Information Management System (IMS) is the most prevalent example of a hierarchical database.
▪ constructed of sets that describe the relationship
between two linked files. Each set contains a parent and
a child.
▪ Siblings - Files at the same level with the same parent.
▪ This structure is also called a tree structure.

Root - highest level in the tree.


Leaf - lowest file in a particular branch.
Navigational Databases - The hierarchical data model is called a navigational database because traversing
the files requires following a predefined path.

THE NETWORK MODEL


• navigational database with explicit linkages between records and files.
THE RELATIONAL MODEL
• portrays data in two-dimensional tables, with attributes (data fields) forming the columns.
• Tuple - a row in the table, formed by the intersecting columns.
- a normalized array of data that is similar, but not precisely equivalent, to a record in a flat-file
system.
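As a concrete sketch of relations, tuples, and a one-to-many association, here are two small tables represented with Python tuples; the table contents and field names are illustrative assumptions.

    # CUSTOMER relation: each row is a tuple; the columns are attributes.
    customers = [
        (1001, "ABC Supply", 2500.00),   # (cust_id, cust_name, acct_balance)
        (1002, "XYZ Corp",    150.00),
    ]

    # SALES_INVOICE relation: carrying cust_id in each invoice tuple links every
    # invoice back to one customer, a one-to-many (customer-to-invoices) association.
    invoices = [
        (90001, 1001, 700.00),   # (invoice_num, cust_id, amount)
        (90002, 1001, 300.00),
        (90003, 1002, 150.00),
    ]

    # Join the two relations on the common attribute (cust_id).
    for inv_num, cust_id, amount in invoices:
        cust_name = next(name for cid, name, _ in customers if cid == cust_id)
        print(inv_num, cust_name, amount)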

CHAPTER 5

SYSTEMS DEVELOPMENT AND PROGRAM CHANGE ACTIVITIES


SYSTEMS DEVELOPMENT LIFE CYCLE - AN 8-PHASE PROCESS:
PHASE I: Systems Planning - to link individual system projects or applications to the strategic objectives
of the firm.
PHASE II: Systems analysis - a two-step process involving first a survey of the current system and then an
analysis of the user’s needs.
PHASE III: Conceptual Systems Design - to produce several alternative conceptual systems that satisfy
the system requirements identified during systems analysis.
PHASE IV: System Evaluation and Selection - optimization process that seeks to identify the best system.
PHASE V: Detailed Design - to produce a detailed description of the proposed system that satisfies
the system requirements identified during systems analysis.
PHASE VI: Application Programming and Testing - to select a programming language from among the
various languages available and suitable to the application.
PHASE VII: System Implementation - database structures are created and populated with data.
PHASE VIII: Systems maintenance - formal process by which application programs undergo changes to
accommodate changes in user needs.
CHAPTER 7

COMPUTER-ASSISTED AUDIT TOOLS AND TECHNIQUES

APPLICATION CONTROLS - are programmed procedures designed to deal with potential exposures that
threaten specific applications.
Three broad categories:
1. input controls
2. processing controls
3. output controls

1. Input controls - designed to ensure that transactions entering the system are valid, accurate, and
complete.
Data input procedures can be either:
A. source document-triggered (batch)
B. direct input (real time)

Source document input requires human involvement and is prone to clerical errors.

Classes of Input Control:


• Source document controls
• Data coding controls
• Batch controls
• Validation controls
• Input error correction
• Generalized data input systems

➢ SOURCE DOCUMENT CONTROLS

Control Procedures:
a. Use Pre-numbered Source Documents - come prenumbered from the printer with a unique
sequential number on each document.
b. Use Source Documents in Sequence - distributed to the users and used in sequence.
c. Periodically Audit Source Documents - identify missing source documents.

Data Coding Controls - are checks on the integrity of data codes used in processing.

Three types of errors:


1. transcription errors - three classes:
a. Addition - an extra digit or character is added to the code.
b. Truncation - a digit or character is removed from the end of a code.
c. Substitution - replacement of one digit in a code with another.
2. single transposition errors - occur when two adjacent digits are reversed.
3. multiple transposition errors - occur when nonadjacent digits are transposed.

Check Digits - a method for detecting data coding errors.


• a control digit (or digits) added to the code when it is originally assigned that allows the integrity
of the code to be established during subsequent processing.

Check-digit technique (e.g., modulus 11):
1. Assign weights to the digits of the code.
2. Sum the products.
3. Divide the sum by the modulus.
4. Subtract the remainder from the modulus; the result is the check digit.
5. Add the check digit to the original code.
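A minimal sketch of those steps in Python, assuming modulus 11 with weights 5-4-3-2 applied to a four-digit code (the specific weights and the sample code are illustrative assumptions):

    def add_check_digit(code, weights=(5, 4, 3, 2), modulus=11):
        # 1-2. Assign weights to each digit and sum the products.
        total = sum(int(d) * w for d, w in zip(code, weights))
        # 3-4. Divide by the modulus and subtract the remainder from the modulus.
        check = (modulus - (total % modulus)) % modulus
        # Note: a remainder of 1 yields a two-digit result (10); real modulus 11
        # schemes handle that case specially (e.g., ISBN-10 writes it as "X").
        # 5. Append the check digit to the original code.
        return code + str(check)

    print(add_check_digit("5372"))   # "53724" (products 25+12+21+4 = 62; 62 mod 11 = 7; 11 - 7 = 4)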

Batch controls - effective method of managing high volumes of transaction data through a system.
Two documents:
1. batch transmittal sheet
2. batch control log

Hash totals - a simple control technique that uses nonfinancial data to keep track of the records in a
batch.
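A minimal sketch of batch control totals in Python; the field names and the choice of account number as the hash-total field are assumptions for illustration:

    # Hypothetical batch of transactions.
    batch = [
        {"account_no": 10231, "amount": 150.00},
        {"account_no": 20442, "amount":  75.50},
        {"account_no": 30953, "amount": 320.00},
    ]

    record_count    = len(batch)                           # control total: number of records
    financial_total = sum(t["amount"] for t in batch)      # control total: dollar value of the batch
    hash_total      = sum(t["account_no"] for t in batch)  # nonfinancial total with no intrinsic meaning

    # The same totals are recomputed at each processing stage and compared with
    # the figures recorded on the batch transmittal sheet / batch control log.
    print(record_count, financial_total, hash_total)       # 3 545.5 61626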

Validation Controls - are intended to detect errors in transaction data before the data are processed.
Three Levels of Input Validation Controls:
1. Field interrogation - programmed procedures that examine the characteristics of the data in the field.
types of field interrogation:
a. Missing data checks - examine the contents of a field for the presence of blank spaces.
b. Numeric-alphabetic data checks - determine whether the correct form of data is in a field.
c. Zero-value checks - used to verify that certain fields are filled with zeros.
d. Limit checks - determine if the value in the field exceeds an authorized limit.
e. Range checks - assign upper and lower limits to acceptable data values.
f. Validity checks - compare actual values in a field against known acceptable values
g. Check digit controls - identify keystroke errors in key fields by testing the internal validity
of the code.
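A hedged sketch of a few field interrogation checks in Python; the field names, the 80-hour limit, and the set of valid pay codes are illustrative assumptions:

    VALID_PAY_CODES = {"H", "S", "C"}     # assumed set of acceptable values

    def field_checks(record):
        errors = []
        # Missing data check: the field must not be blank.
        if not str(record.get("employee_id", "")).strip():
            errors.append("missing data check: employee_id is blank")
        # Numeric-alphabetic check, then a limit check on the same field.
        if not str(record.get("hours", "")).replace(".", "", 1).isdigit():
            errors.append("numeric-alphabetic check: hours is not numeric")
        elif float(record["hours"]) > 80:
            errors.append("limit check: hours exceeds the authorized limit of 80")
        # Validity check: compare against known acceptable values.
        if record.get("pay_code") not in VALID_PAY_CODES:
            errors.append("validity check: unknown pay code")
        return errors

    print(field_checks({"employee_id": "E100", "hours": "95", "pay_code": "Z"}))
    # ['limit check: hours exceeds the authorized limit of 80', 'validity check: unknown pay code']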

2. Record interrogation - validate the entire record by examining the interrelationship of its field
values.
Typical Tests:
a. Reasonableness checks - determine if a value in one field, which has already passed a
limit check and a range check, is reasonable when considered along with other data fields
in the record.
b. Sign checks are tests to see if the sign of a field is correct for the type of record being
processed.
c. Sequence checks are used to determine if a record is out of order.
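A short sketch of record-level checks in Python; the field names and the rules used (salaried employees report no overtime, credit memos carry negative amounts) are illustrative assumptions:

    def record_checks(record, previous_key=None):
        errors = []
        # Reasonableness check: a salaried ("S") record should not also report overtime.
        if record.get("pay_code") == "S" and record.get("overtime_hours", 0) > 0:
            errors.append("reasonableness check failed")
        # Sign check: a credit memo is expected to carry a negative amount.
        if record.get("type") == "credit_memo" and record.get("amount", 0) >= 0:
            errors.append("sign check failed")
        # Sequence check: records in the batch must arrive in ascending key order.
        if previous_key is not None and record["key"] <= previous_key:
            errors.append("sequence check failed: record out of order")
        return errors

    print(record_checks({"pay_code": "S", "overtime_hours": 5, "key": 120}, previous_key=130))
    # ['reasonableness check failed', 'sequence check failed: record out of order']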

3. File interrogation - ensure that the correct file is being processed by the system.
*Master files - contain permanent records of the firm.

a. Internal label checks - verify that the file processed is the one the program is actually
calling for.
b. Version checks - verify that the version of the file being processed is correct.
c. Expiration date check - prevents a file from being deleted before it expires.

Input Error Correction - when errors are detected in a batch, they must be corrected and the records
resubmitted for reprocessing.
Three common error handling techniques:
(1) correct immediately
(2) create an error file
(3) reject the entire batch.
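A minimal sketch of the second technique (create an error file): valid records continue through processing while records that fail validation are written to an error file for later correction and resubmission. The file name, field names, and validation rule are assumptions.

    import json

    def process_batch(batch, validate):
        good, rejects = [], []
        for record in batch:
            (good if validate(record) else rejects).append(record)
        # Rejected records go to an error file; after correction they re-enter
        # the system at the data input stage for reprocessing.
        with open("error_file.json", "w") as f:
            json.dump(rejects, f)
        return good

    clean = process_batch(
        [{"acct": 1001, "amount": 50.0}, {"acct": None, "amount": 75.0}],
        validate=lambda r: r["acct"] is not None,
    )
    print(len(clean), "records accepted")   # 1 records accepted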

OUTPUT CONTROLS - ensure that system output is not lost, misdirected, or corrupted and that privacy is
not violated.
Examples of privacy exposures:
o disclosure of trade secrets
o patents pending
o marketing research results
o patient medical records

Output Spooling – applications direct their output to a magnetic disk file rather than directly to the printer.
backlog - a backlog of output can create a bottleneck, which adversely affects the throughput of the system.

Control testing techniques - provide information about the accuracy and completeness of an application’s
processes.
Two General Approaches:
(1) the black box (around the computer) approach
(2) the white box (through the computer) approach.

1. BLACK-BOX APPROACH - does not rely on a detailed knowledge of the application’s internal logic.
2. WHITE-BOX APPROACH - relies on an in-depth understanding of the internal logic of the
application being tested.

common types of tests of controls:


a. Authenticity tests
b. Accuracy tests
c. Completeness tests
d. Redundancy tests
e. Access tests
f. Audit trail tests
g. Rounding error tests
Rounding programs are particularly susceptible to salami frauds.
Salami frauds tend to affect a large number of victims, but the harm to each is immaterial.

COMPUTER-AIDED AUDIT TOOLS AND TECHNIQUES (CAATTs)

Five CAATT approaches:


1. test data method - used to establish application integrity (a sketch follows this list).
2. base case system evaluation - a variant of the test data method in which the set of test data used is comprehensive.
3. tracing - electronic walkthrough of the application’s internal logic.
4. integrated test facility - automated technique that enables the auditor to test an application’s
logic and controls during its normal operation.
5. parallel simulation – auditor writes a program that simulates key features or processes of the
application under review.
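A hedged sketch of the test data method: the auditor prepares transactions with predetermined expected results, runs them through the application, and compares actual output with the expectations. Here application_under_test is a placeholder with an assumed discount rule, not a real audited program.

    def application_under_test(txn):
        # Placeholder for the production program's logic: apply a 10% discount
        # on amounts over 1,000 (an assumed rule).
        discount = 0.10 if txn["amount"] > 1000 else 0.0
        return round(txn["amount"] * (1 - discount), 2)

    # Auditor-prepared test transactions with predetermined expected results.
    test_data = [
        ({"amount": 500.00},  500.00),
        ({"amount": 2000.00}, 1800.00),
    ]

    for txn, expected in test_data:
        actual = application_under_test(txn)
        status = "OK" if actual == expected else "EXCEPTION"
        print(txn, "expected", expected, "got", actual, status)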

CHAPTER 11

ENTERPRISE RESOURCE PLANNING (ERP)


ERP systems - are multiple module software packages that evolved primarily from traditional
manufacturing resource planning (MRP II) systems.

ERP Core Applications - applications that operationally support the day-to-day activities of the business.
Core applications are also called online transaction processing (OLTP) applications.

Online analytical processing (OLAP) - includes decision support, modeling, information retrieval, ad hoc
reporting/analysis, and what-if analysis.
• data warehouse - database constructed for quick searching, retrieval, ad hoc queries, and ease
of use.

RISKS ASSOCIATED WITH ERP IMPLEMENTATION:

A. Big Bang Approach - the more ambitious and riskier of the two. Organizations attempt to switch
operations from their old legacy systems to the new system in a single event that implements the
ERP across the entire company.
B. Phased-In Approach - a popular alternative. It is particularly suited to diversified organizations
whose units do not share common processes and data.
