Manual of Information Technology Audit
IT Audit Manual
Volume III: Audit Programmes for Specific Applications
Evolution of ERP
Features of ERP
Components of ERP
1.5 ERP solutions are usually divided into many sub-systems, such as Sales and
Marketing, Master Scheduling, Material Requirement Planning, Capacity
Requirement Planning, Bill of Materials, Purchasing, Manufacturing (including
Shop Floor Control), Accounts Payable/Receivable, Logistics, Asset Management
and Financial Accounting.
Benefits of ERP
There may be some overlap between this checklist and the Guidelines for Systems
Under Development. Where an organization is clearly taking an SDLC approach
to adopting an ERP application, the following programmes can be supplemented
by those guidelines.
• task breakdown
• budgeting of time and resources
• milestones
• checkpoints
• approvals
• is complete and current
• provides for participation by the affected user
department (owner/sponsor) management in the
definition and authorisation of a development,
implementation or modification project
• specifies the basis on which staff members are
assigned to projects
• defines responsibilities and authorities of project
team members
• provides for the creation of a clear written
statement defining the nature and scope of the
project before work on the project begins
• provides for an initial project definition
document which includes a clear statement of
the nature and scope of the project
No Item KD Reference
A. Inventory System Requirements
(Needs determination function)
1. Does the system record customer/consuming agency demand
and replenishment lead-time data for a period, analyze the
data for anomalies, and compute demand and lead time
forecasts on a regular basis? Computer-generated forecasts
generally should be changed only when information is
available to the manager that is not available to the automated
system
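Item 1's forecasting requirement can be sketched as follows. The exponential-smoothing approach, the function names, and the 2-sigma anomaly rule are illustrative assumptions, not the manual's prescription:

```python
# Illustrative sketch of how an inventory system might compute a demand
# forecast and flag anomalous periods for the manager's review, as item 1
# envisages. The smoothing constant and 2-sigma rule are assumptions.
from statistics import mean, pstdev

def demand_forecast(history, alpha=0.3):
    """Single exponential smoothing over recorded period demand."""
    forecast = history[0]
    for observed in history[1:]:
        forecast = alpha * observed + (1 - alpha) * forecast
    return forecast

def flag_anomalies(history, threshold=2.0):
    """Flag observations more than `threshold` standard deviations
    from the mean; these should be reviewed before forecasting."""
    mu, sigma = mean(history), pstdev(history)
    if sigma == 0:
        return []
    return [x for x in history if abs(x - mu) / sigma > threshold]

demand = [100, 110, 95, 105, 400, 98]   # 400 looks like a data anomaly
suspect = flag_anomalies(demand)        # -> [400]
cleaned = [x for x in demand if x not in suspect]
next_period = demand_forecast(cleaned)
```

This mirrors the checklist's point that computer-generated forecasts should only be overridden when the manager has information the automated system lacks: the system surfaces the anomaly, and the override decision stays with the manager.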
2. Does the system compute and routinely update the
ordering costs, which may vary depending on the
methods of procurement and other factors? The ordering
costs should include costs of:
o Reviewing the stock position;
o Preparing the purchase request;
o Selecting the supplier;
o Receiving, inspecting, and placing the material in storage; and
o Paying the vendor.
3. Does the system estimate and routinely update the per
unit inventory holding cost, which is an estimate of the
cost to hold each additional unit of inventory? Its primary
elements are storage space, obsolescence, interest on
inventory investment, and inventory shrinkage (due to
deterioration, theft, damage, etc.).
4. Does the system re-compute the Economic Order
Quantity (EOQ) on a regular basis using the demand
forecast, ordering cost, inventory holding cost, and unit
cost of the material? In lieu of the EOQ, any other
optimum order quantity calculation may be used,
provided that it is based
on sound principles and it minimizes total cost, including
the ordering and inventory holding costs.
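The EOQ computation item 4 refers to is the classic Wilson formula, sketched below with purely illustrative figures:

```python
# Classic Wilson EOQ: D = annual demand, S = cost per order,
# H = annual holding cost per unit. Figures are illustrative only.
from math import sqrt

def eoq(annual_demand, ordering_cost, unit_holding_cost):
    """Order quantity minimising the sum of annual ordering cost and
    annual inventory holding cost: EOQ = sqrt(2 * D * S / H)."""
    return sqrt(2 * annual_demand * ordering_cost / unit_holding_cost)

# e.g. 12,000 units a year, Rs 500 per order, Rs 30 to hold a unit a year
quantity = eoq(12_000, 500, 30)   # about 632 units per order
```

As the checklist notes, any other optimum order quantity calculation is acceptable provided it is built on the same inputs (demand forecast, ordering cost, holding cost, unit cost) and minimises total cost.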
5. Does the system compute the reorder point level on a
regular basis, considering stock available for issue,
backorders, quantities on hold, and quantities due for
delivery?
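A hedged sketch of the reorder-point logic in item 5; the netting convention for the stock position is one common approach, not necessarily the manual's:

```python
# Reorder-point sketch per item 5. Names and the netting convention
# are illustrative assumptions.
def reorder_point(daily_demand, lead_time_days, safety_stock=0):
    """Demand expected during replenishment lead time plus a buffer."""
    return daily_demand * lead_time_days + safety_stock

def stock_position(on_hand, due_in, backorders, on_hold):
    """Stock available for issue, netting the elements item 5 lists:
    quantities due for delivery add, backorders and holds subtract."""
    return on_hand + due_in - backorders - on_hold

rop = reorder_point(daily_demand=40, lead_time_days=10, safety_stock=100)
position = stock_position(on_hand=300, due_in=150, backorders=20, on_hold=10)
should_order = position <= rop   # reorder when the position reaches the ROP
```
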
6. Does the system provide information on current inventories
and historical usage to be used in capacity planning?
7. Does the system establish overall production/purchase targets
necessary to fill customer/consuming agency orders and meet
operating schedules?
8. Does the system support predefined inspection plans and
quality standards?
54. Does the system provide financial information in the
appropriate format and method to other financial
management systems used by the agency?
55. Does the inventory system accept cost and other appropriate
information from the agency's cost system if the agency has a
separate cost accounting system to support cost accumulation
by work elements, such as job order, activities, products,
etc.?
56. Does the system track accumulated costs, which should
include the value of direct materials, direct labour, and
overhead where applicable (including standard costs and
rates, if applicable) for work in process? Percentage of
completion information should be used to value work in
process.
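One simplified reading of item 56's valuation rule, for illustration only (applying overhead on labour is an assumption; agencies use various overhead bases):

```python
# Work-in-process valuation sketch per item 56: accumulated cost
# (direct materials + direct labour + applied overhead), with the
# percentage of completion used to value partially finished work.
# Overhead applied as a rate on direct labour is an assumption.
def wip_value(direct_materials, direct_labour, overhead_rate, pct_complete):
    applied_overhead = direct_labour * overhead_rate
    accumulated_cost = direct_materials + direct_labour + applied_overhead
    return accumulated_cost * pct_complete

value = wip_value(direct_materials=10_000, direct_labour=5_000,
                  overhead_rate=0.5, pct_complete=0.6)   # 10,500.0
```
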
57. Does the system provide features to record unit costs and
prices of products and services?
58. Does the system transfer work in process to finished goods
for inventory categorization and accounting purposes?
59. Does the system identify the intended location of the item
and track its movement from the point of initial receipt to its
final destination?
60. Does the system record identifiers, quantities, condition,
location, and other elements necessary to establish physical
control?
61. Does the system classify inventory items by commodity class
or type to meet agency needs for management and control?
D. Inventory System Requirements
(Inventory disposition function)
62. Does the system record changes in the location of the
inventory items and the associated changes in the person or
organization responsible for stewardship of the item?
63. Does the system record the value and quantities of items in
transit from one location to another?
64. Does the system generate the appropriate financial
transactions if the financial category needs to be changed to
"held for repair" or "excess, obsolete, or unserviceable?"
65. Does the system verify that the customer/consuming agency
order is received from an eligible customer/consuming
agency who is authorized to use the system and order the
items?
66. Does the system ensure that inventory items issued are
limited to available funds provided by the
customer/consuming agency?
78. Does the system provide the core financial system with the
data necessary to establish the receivable and support
subsequent administration of the receivables management
and collection process?
79. Does the system decrease the inventory of finished goods
account and increase the cost of goods sold account by the
amount at which the inventory is valued?
80. Does the system record revenue and the appropriate
offsetting account (e.g., cash, receivables, or advances) in the
amount for which the inventory items are sold/issued (price)
in case of inventory of finished goods?
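Items 79 and 80 describe a pair of postings made together when finished goods are sold or issued. A hedged sketch, with illustrative account names and figures:

```python
# Paired postings per items 79-80: relieve inventory at carrying
# value, recognise revenue at price. Account names are illustrative.
ledger = {
    "Inventory - Finished Goods": 50_000,
    "Cost of Goods Sold": 0,
    "Receivables": 0,
    "Revenue": 0,
}

def post_sale(ledger, cost, price):
    ledger["Inventory - Finished Goods"] -= cost   # item 79: decrease inventory
    ledger["Cost of Goods Sold"] += cost           # item 79: increase COGS
    ledger["Receivables"] += price                 # item 80: offsetting account
    ledger["Revenue"] += price                     # item 80: record revenue

post_sale(ledger, cost=8_000, price=11_000)
```

Note that cost (item 79) and price (item 80) are independent amounts: inventory is relieved at its carrying value while revenue is recorded at the sale/issue price.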
81. Does the system record the value of items issued from
storage or shipped to customer/consuming agency for which
title does not pass to the customer/consuming agency until a
subsequent event occurs?
82. Does the system decrease the quantity of the inventory item
on hand by the number of items sent to the disposal
organization?
83. Does the system record confirmation of receipt of the items
by the disposal organization?
84. Does the system transfer balances between financial
categories, for example, from "inventory held for sale/issue"
to "excess, obsolete, and unserviceable inventory?"
85. Does the system account for the proceeds resulting from
disposition of inventory items as scrap?
86. Does the system provide the following types of
management information?
o Stock availability?
o Customer/consuming agency order?
o Inventory turnover?
o Stock usage?
o Losses?
o Disposals?
E. Inventory System Requirements
(Program planning and monitoring function)
87. Does the system establish price computation models or
formulas to be used in the Bill Calculation activity? Pricing
models for entities are usually based on costs incurred, but
may be based on other factors, such as specific norms and
regulations for the agency, utility, and condition.
88. Does the system provide methods to support pricing by
groupings or commodities of items?
97. Does the system track actual and standard cost variances for
materials (price and usage), labour (rate and efficiency), and
overhead (actual and applied) when a standard cost method is
used?
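The material variances item 97 mentions, under the usual standard-costing conventions (labour rate and efficiency variances follow the same pattern with rates and hours). Here a positive result is an unfavourable variance; sign conventions vary by agency:

```python
# Standard-costing material variances per item 97. Figures and the
# sign convention (positive = unfavourable) are illustrative.
def material_price_variance(std_price, actual_price, actual_qty):
    """Paying more/less than standard for the quantity actually used."""
    return (actual_price - std_price) * actual_qty

def material_usage_variance(std_price, std_qty, actual_qty):
    """Using more/less than standard quantity, priced at standard."""
    return (actual_qty - std_qty) * std_price

pv = material_price_variance(std_price=10, actual_price=12, actual_qty=110)
uv = material_usage_variance(std_price=10, std_qty=100, actual_qty=110)
# pv = 220 (unfavourable price), uv = 100 (unfavourable usage)
```
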
98. Does the system record reasons for significant deviations
between standard and actual costs?
99. Does the system provide capabilities to support adjustments
of rates and dispositions of variances by performing periodic
allocations?
100. Does the system match costs and revenues within the periods
they were incurred or realized to identify gains or losses from
sales/issual?
101. Does the system support analysis of operations on an annual
basis to determine if revenues are sufficient to cover the costs
of the entire inventory program?
102. Does the system provide sufficient transaction audit trails to
support balances of inventory shown on the agency's general
ledger and changes in those balances?
103. Does the system maintain the documentation supporting
inventory transactions until it is audited for accuracy and
approved by external auditors, but not for less than 3 years?
Retention may be longer when
(a) required by regulations,
(b) there is a possibility of legal action involving the
inventories, or
(c) contract terms or modifications require longer retention.
104. Does the system provide the following types of
management information?
(a) Cost per rupee of sales/issual?
(b) Operations cost?
Introduction
1. In November 2003, all the Chief Secretaries and the Expenditure Secretary in the
Central Government were requested to involve audit in the various phases of system
development. Consequent to that, several State Governments have requested the local
Accountant General to be involved in the development of specific systems. These
guidelines are issued to assist field offices in effectively and constructively
auditing the process of system development in the auditee units.
2. The guidelines and programmes are comprehensive, relate to all the phases of the
system development life cycle, and are most useful when applied in the context of
development of large, capital-intensive systems. Not all of them will be applicable
to every system; the field offices, and particularly the officer associated with the
audit of the system development process, should decide which ones apply.
3. System development has become critical to government departments and
organizations hoping to improve governance and the delivery of services to their
citizens and clients by investing in large software applications. Yet expensive
application development projects often fail to deliver on their promises. Government
departments and organizations can reduce the risk of such failures by adopting a
structured approach, such as a System Development Life Cycle (SDLC) Methodology,
to guide themselves and the developers.
4. Definition of System Development Life Cycle (SDLC) Methodology
System Development Life Cycle Methodology (SDLC) is defined as, “a
structured approach that divides an information systems development project
into distinct stages which follow sequentially and contain key decision points and
sign-offs. This permits an ordered evaluation of the problem to be solved, an
ordered design and development process, and an ordered implementation of the
solution. A final stage allows for management feedback and control through a
post-installation evaluation”.
5. In practice, however, the process of development of any system tends to take an
unstructured path. Problems that might not have been anticipated earlier may compel
the system development team to retrace some of the route already trodden. Other
methodologies, like Rapid Application Development, Joint Application Design
Methodology, Soft Systems Methodology etc., are often combined with the structured
system development process.
6. Audit must take great care to associate itself only with systems where a
development methodology is distinctly discernible. If the methodology adopted is
purely ad hoc, without any clear structure and adequate documentation, it would be
extremely risky to offer any comments on such systems.
7. When Audit is involved in the System Development Phase of Information
Technology systems, it should not take up the role of a consultant or that of an
internal auditor, but should remain a keen observer whose findings are reported
from time to time.
Checklist (PI)
PI.1. Review the business and ensure a formal business case exists for the project.
PI.2. Ensure that a Project Initiation document exists and it has the approval of the
competent authority.
PI.3 If yes, then verify that the Project Initiation document contains at least the following
features:
PI.4. Ensure that the Project Initiation document has been reviewed and approved by the
competent authority.
PI.5. Ensure that an appropriate project organization has been outlined in the Project
Initiation documentation. Determine by examining the Project Initiation document
that:
• Project Team members and representatives and their responsibilities have been
named, including:
Project Director/Manager
User Manager/Director
Technical Representatives
User Functional Representatives
• Steering Committee/Sign Off Authority has been established and they have been
delegated requisite powers.
• Evaluate the background and qualifications of project members for their
assignment to specific project tasks.
PI.6. Ensure that the user department management has appointed personnel from its
department to participate in the project.
PI.7. Verify that the Project Manager or one of the team members is responsible to ensure
the complete and accurate accumulation of project costs.
PI.8. Determine from the Project Initiation document that a work plan, including target
dates and resource requirements has been prepared. It delineates the manner in which
each phase of the development process (the preparation of feasibility study,
requirement definition, system design etc) is to be approved prior to proceeding to the
next phase of the project (programming, system testing, transaction testing, parallel
testing etc.).
PI.9. Verify that the target dates indicated in the work plan are in keeping with the resource
requirements outlined and any constraints involved.
FS.1 Ensure that steps have been taken by the project team to identify and consult all
affected parties?
FS.2 Ensure that a Technological Feasibility Study has been prepared and documented.
• Is the proposed technology feasible, considering the technical sophistication
existing or available through the organization?
FS.3 Review the technology feasibility report to see if it has adequately addressed:
• Hardware needs and availability.
• System software needs and availability.
• Communications hardware and software needs and availability.
• Valid time constraints in the user department's information requirements and
the manner of satisfying them.
• Operational feasibility (compliance with information architecture, e.g. whether
the new project fits into the current mix of hardware, software, and communications).
FS.4 Verify that there is a consensus among user departments and designers concerning the
technological aspects of the system's configuration.
FS.5 Determine the organizational capability to manage the related technologies and to
decide whether the technologies should be developed or bought, operated in-house or
out, and maintained in-house or out.
FS.6 Has a User Requirements document been prepared and released? Does it include the
following expression of need in terms of the organization's mission:
• A description of the current function.
• Analysis of the deficiencies of the current function.
• Resources expended on the current function.
• Volume of work produced with the current function, including peak processing
performance and projected growth.
• Internal control and security requirements.
• Justification for improvement and changes.
• Scope and objectives of proposed system.
• Alternative solutions to solving the need.
• Relationships with other systems.
• Relationships with long-range plans and other information resource management
initiatives.
FS.7 Has the accuracy and completeness of user requirements been acknowledged by the
appropriate level of user, and by Data Processing management?
FS.8 Has the User Requirements document been reviewed by the Steering Committee/Sign
off Authorities?
• Have they signified acceptance of the need to continue the project? Note any
conditional acceptance for follow-up in later stages.
FS.9 Confirm, if possible, with independent sources the reliability and track record of the
recommended hardware and software.
FS.10 Confirm whether there is a plan to address Intellectual Property issues, including
the ownership of source code, where development of customised software is being
outsourced.
FS.11 Check whether a Cost/Benefit document has been prepared and released? Are all costs
identified as operating or capital?
FS.12 Ensure that the analysis of the project costs and benefits was prepared to evaluate the
economic feasibility of each alternative; check that
• the assumptions and constraints in the cost/benefit analysis are reasonable
• the user and system costs cover all stages of the life cycle
• the estimated costs for each alternative include hardware and software
enhancements needed to support that alternative
• the estimated costs for each alternative include the cost of security and internal
controls, data preparation and entry, file conversion, testing, parallel operations,
acceptance, and related costs
• the basis of estimation and computation of costs is reasonable
• there is a consensus among end users, designers, and implementers concerning
system costs, benefits, and contractual agreements
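One common basis for the economic comparison of alternatives that FS.12 calls for is the net present value of each alternative's life-cycle cash flows. The discount rate, alternative names, and figures below are purely illustrative:

```python
# Illustrative NPV comparison of project alternatives for a
# cost/benefit analysis. Rate and cash flows are assumptions.
def npv(rate, cashflows):
    """NPV of per-period net cash flows (benefits minus costs),
    with the period-0 outlay first."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

alternatives = {
    "in-house build":   [-500_000, 150_000, 180_000, 200_000, 200_000],
    "packaged product": [-350_000, 120_000, 140_000, 150_000, 150_000],
}
best = max(alternatives, key=lambda name: npv(0.10, alternatives[name]))
```

An auditor reviewing such an analysis would check exactly the points in the list above: whether the rate and cash-flow assumptions are reasonable and whether every life-cycle cost (conversion, testing, security, parallel running) is in the outlay figures.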
FS.13 Ensure that the analysis of the project costs and benefits takes into consideration the
impact on human resources. Verify that estimated costs for each alternative includes:
• training,
• redeployment of staff,
• ergonomic issues.
FS.14 Check whether the accuracy and completeness of the cost/benefit analysis and
acceptance of the recommended alternative has been acknowledged by the appropriate
level of user and by Data Processing management.
FS.15 Has the Cost/Benefit document been reviewed by the Steering Committee/Sign off
Authorities?
• Have they signified acceptance of the recommended alternative and the need to
continue the project? Note any conditional acceptance for follow-up in later
stages.
FS.16 Check whether the users of an appropriate level and Data Processing management
have acknowledged that the analysis of processing alternatives is accurate and
complete, and agree with the recommendations.
FS.17 Check whether steps have been taken by the project team to identify and consult
all affected parties.
FS.18 Does the Project documentation show that the skills of the staff employed on the
project meet the requirements specified in the Personnel Skills Summary?
FS.19 Has a Steering Committee Meeting Schedule document been prepared and released to
all interested parties, including EDP and user management?
FS.20 Review the minutes of the Committee meetings and note the following:
• that EDP and user management were represented at each meeting, and
• that meetings were held regularly.
FS.22 Check whether the Feasibility Study identifies the need for a System Processing
Controls Specifications or similar document?
FS.23 Determine that a statement of the level of security, privacy and accessibility needed
for system's data conforms to the government Acts, and that the statement is included
with the documentation to be reviewed by the Steering Committee/Sign off
Authorities.
Audit objective: To ensure that the general design of the system expands on
the findings of the feasibility study, produces a functional description of
manual and EDP processes, and devises an overall system design that can
be used to obtain a commitment for Detailed Design Stage.
SD.1 Check whether the organization has adopted any system development methodology and
framework to ensure that a process is in place that appropriately addresses all system
design issues (i.e. input, processing, output, internal controls, security, disaster
recovery, response time, reporting, change control etc.)
SD.2 Check whether a Systems Specifications document has been prepared and released?
Verify that it contains at least the following specifications/features:
• system objectives and scope
• general system concept and design considerations
• chart showing function structure in terms of processes
• logical data flow diagram showing flow among processes and data stores at the
data element level
• process descriptions, including complete and detailed definitions of processes
for all business cases involved
• appropriate audit trails and controls built into the system
• volumes, timings, highs and lows, and quality specified for inputs, outputs,
and data stores
• service levels: a complete description of performance requirements, to be used
in later stages to confirm the technical feasibility and resource requirements
of the system
SD.3 Check that the accuracy and completeness of system specifications has been
acknowledged by the appropriate level of user and by Data Processing management.
SD.4 Ensure that the System Specifications document has been reviewed by the Steering
Committee/Sign off Authorities. Have they signified acceptance of the need to
continue the project? Note any conditional acceptance for follow-up in later stages.
SD.5 Ensure that the data dictionary/ directory has been prepared or updated to contain the
system specifications.
SD.6 Ensure that the skills of the staff being employed on the project (as Team Members or
Steering Committee/Sign off Authority members) continue to meet the requirements
envisaged in the Project Initiation report or Feasibility Report.
SD.7 Has a General Design Stage Status document been prepared and released? Verify that
it contains at least the following:
• actual resources used to date, compared to planned, with reasons for variance
• actual milestones achieved to date, compared to planned, with reasons for
variance
• a roadmap for the Detailed Design Stage, including the following activities:
SD.9 Verify that the updated budget and schedule is in keeping with the updated
cost/benefit analysis.
SD.10 Verify the updated cost/benefit analysis against the cost/benefit analysis from the
previous stage and from source documents.
SD.11 Determine that the updated cost/benefit analysis has taken into consideration the
human resource impact requirements.
SD.12 Check the accuracy and completeness of the General Design Stage Status document
and agreement with it has been acknowledged by the appropriate level of user and by
Data Processing management.
SD.13 Ensure that the General Design Stage Status document has been reviewed by the
Steering Committee/Sign off Authorities and have they signified acceptance of it?
SD.14 Ensure that a System Processing Controls Specifications or similar document has
been prepared and released. Verify that it addresses at least the following issues:
Completeness
• Ensuring that all data are initially recorded and identified.
• Control should be established close to the source of the transaction.
• Output should be reconciled to input.
• Ensuring that corrections for all identified errors are re-entered into the system.
• The timing of input submissions and output distribution should be properly
coordinated with processing.
• Procedures are needed to ensure that output reports are independently reviewed
for completeness and form.
Accuracy
• Procedures should exist to prevent errors in the preparation of input or source
data, and to detect and correct any significant errors that do occur.
• Procedures should exist to prevent errors arising when data are converted to
machine processable form, and to detect and correct any significant errors that
do occur.
• There should be procedures to ensure that data are transmitted accurately to the
computer centre.
• Procedures should ensure that only valid files are used.
• Controls must ensure that the accuracy of data is maintained during processing.
• Procedures should ensure that program computations are performed correctly.
• There should be a system of control over the physical operations of the computer
system.
DD.1 Has a Detailed Systems Design document been prepared and released? Verify that it
covers at least the following:
• system flow and description, by function
• data dictionary
• system files
• system inputs, including design of forms and video screens
• system outputs, including design of forms, reports and video screens
• system interface specifications
• system software specifications
• hardware specifications
• communications specifications
• system management utility specifications
• audit, control, and security specifications
• conversion specifications
• file requirements for at least the following files, structured as per system
and user requirements and the organization's data dictionary rules: master,
transaction, command, programme, control, table, report, print, log,
transmission
• common processing module specifications
• input control and output control issues, such as:
o Does the application include control features to help ensure that only
specifically authorised persons can input transaction and master data into
the system, such as an access control matrix and logical access controls
(including passwords and biometrics), depending on the security needs of
the organization?
o Do audit trails and controls provide the possibility of protecting users
against discovery and misuse of their identity by other end users (e.g. by
offering anonymity, pseudonymity, unlinkability or unobservability) without
jeopardising the system's security?
o Do input routines record the user ID, logon details etc. that permit
authorised persons to identify the end user responsible for that element?
o Are controls in place to ensure that all items entered can be accounted for,
such as having the system automatically attach a sequential number to each
item?
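The sequential-numbering control in the last point can be sketched simply: if the system attaches a serial number to every item entered, gaps in the sequence reveal items that cannot be accounted for. Names and serials below are illustrative:

```python
# Completeness check over system-assigned sequential numbers: gaps in
# the sequence indicate unaccounted-for input items. Illustrative only.
def sequence_gaps(serials):
    present = set(serials)
    lo, hi = min(present), max(present)
    return [n for n in range(lo, hi + 1) if n not in present]

gaps = sequence_gaps([1001, 1002, 1004, 1006])   # -> [1003, 1005]
```
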
DD.3 Review flow charts, decision tables, or narratives to assess the reasonableness of
program logic incorporated in applications.
DD.4 Check whether the accuracy and completeness of Detailed System Design
specifications has been acknowledged by the appropriate level of user and Data
Processing management.
DD.5 Check that the Detailed System Design document has been reviewed by the Steering
Committee/Sign off Authorities? Have they signified acceptance? Note any
conditional acceptance for follow-up in later stages.
DD.6 Check whether a program and system test plan has been developed and released?
Verify that it covers at least the following both for program and system testing, and
for volume and operational testing:
• overview of the software to be tested, including vendor software and conversion
software and the work environment it operates in
• test schedule
• materials and supplies including equipment, software, storage facilities,
documentation, test input, sample output, and special forms
• training requirements
• list of user requirements to be tested
• list of operational requirements to be tested
• overview of test progression
• description of the test to be performed on each requirement, including the type
of input to be used, the method for recording results, constraints such as
equipment or personnel availability, evaluation criteria and any data
manipulation required for reporting purposes
DD.8 Check whether the accuracy and completeness of the Test Plan has been
acknowledged by the appropriate level of user and by Data Processing management.
DD.9 Check that the Test Plan document has been reviewed by the Steering
Committee/Sign off Authorities?
DD.10 Check that all of the items in the User Requirements document are being tested.
Appropriate tests may include walk-throughs, simulations and prototypes. Ensure that
each module program, interrelated subsystem and the system as a whole are tested.
DD.13 Has a Detailed Design Stage Status document been prepared and released. Verify
that the status document contains at least the following:
• actual resources used to date, compared to planned, with reasons for variance
• actual milestones achieved to date, compared to planned, with reasons for
variance
• a roadmap for the Implementation Stage, including the following activities:
o designing the structures, logic, and flow of each system component
o designing all databases and files
o estimating system performance and resource requirements and confirming
that service levels will be met
o designing conversion tools
o coding and testing programs
o purchasing and testing vendor software
• a preliminary plan for the Installation Stage, including reference to the following:
o conversion of files
o training
o instruction manuals
o redeployment of staff
• updated budget and reasons for any changes
• updated schedule and reasons for any changes
• updated cost/benefit analysis
• recommendation to continue or discontinue the project
DD.14 Verify actual resource use in source documents.
DD.15 Verify that the updated budget and schedule are in keeping with the updated
cost/benefit analysis.
DD.16 Verify the updated cost/benefit analysis against the cost/benefit analysis from the
previous stage and from source documents.
DD.17 Verify that the updated cost benefit analysis takes into consideration the human
resource impact requirements.
DD.18 Verify the accuracy and completeness of the Detailed Design Stage Status document
and agreement with it has been acknowledged by the appropriate level of user and by
Data Processing management.
DD.19 Verify that the Detailed Design Stage Status document has been reviewed by the
Steering Committee/Sign off Authorities and have they signified an acceptance of it?
DD.22 To ensure that the system will operate efficiently and effectively check that the
control techniques to satisfy the requirements outlined in the system management
controls specification document have been included for testing in the test plan.
DD.23 Verify that the test plan addresses the control requirements outlined in the System
Management Control Specifications document
This stage creates all computer programs, forms, manuals and training material
needed for an operational system. Detailed program logic will be designed and
application software coded. User, operations and training manuals will be
finalized and should cover, where appropriate:
• data capture
• data validation
• system audit trails and controls
• verification of analysis report
• computer operating instructions
• back-up and re-run procedures
• security procedures
All aspects of the system, including program logic and operational procedures,
should be thoroughly tested. All procedures required for the installation of the
system should be defined and scheduled.
IM.1 Check that all manuals and other outputs required have been completed before
installation begins.
IM.2 Determine that the following have been prepared:
• conversion tools
• user manuals
• conversion manuals
• training manuals
• operations manuals
• program and systems documentation.
IM.3 Verify that the user manual has at least the following specifications/ features:
• Overview of the system and the environment
This stage converts the system to operational status. The work includes
converting existing files (if any) or creating the initial information base, training
all personnel involved with the system (user and EDP), and instituting control
and operational procedures through pilot or parallel run phase-in. All
documentation from previous phases should be finalized. Conversion and
installation procedures should be reviewed and tested. The project manager
should issue a formal Project Completion Notice for approval.
IN.1. Whether there is a formal SDLC methodology in place for system installation,
including but not limited to a phased approach covering training, performance
sizing, a conversion plan, testing of individual programs, groups of programmes
and the total system, a parallel test plan, acceptance testing, security testing,
operational testing, change controls, implementation, and post-implementation
review and modification.
IN.2. Whether the accuracy, completeness, and authenticity of the files created by
conversion are ensured through the use of appropriate control techniques.
IN.3. Review the conversion plan before it is executed, referring to the List of Minimum
System Processing Controls.
IN.4. Verify that control techniques are being included in the conversion process to satisfy
all control concerns.
Note: This is an extremely critical process. No doubt about the integrity of the data
in the new files should be tolerated; control techniques, such as one-to-one checks,
may have to be used.
IN.5. Verify that the conversion was carried out according to plan.
IN.6. Verify that training was carried out according to the schedule prepared in the
Implementation stage and that any variations have been agreed to by user
management.
IN.7. Have installations been carried out according to the schedule prepared in the
Implementation stage, and have any variations been agreed to by user management?
IN.8. Has user acceptance been formally agreed to, as appropriate, according to schedule?
For example, if stand-alone processing locations are being installed on an independent
basis, each location should sign-off its acceptance of the system.
IN.9. Ensure that all vendor-provided system software installation passwords were
changed at the time of installation.
IN.10. Has an Installation Stage Status document been prepared and released?
Verify that it contains at least the following:
• actual resources used to date, compared to plan, with reasons for variance
• actual milestones achieved to date, with reasons for variance
• updated budget and reasons for any changes
• updated schedule and reasons for any changes
• updated cost/benefit analysis
IN.11. Verify actual resource use in source documents.
IN.12. Verify that the updated budget and schedule are in keeping with the updated
cost/benefit analysis.
Work during this stage consists of examining the project performance and
system performance against the original project documentation of system
cost/benefit and project cost and time schedules. A period of settling in is
normally allowed between Installation and Post-Installation audit. The audit
team could be changed at this point to maximize objectivity. Thus, project
reviews are important after system installation to assess the success of the
systems development process and to identify any differences in control design
and control operation.
PO.1. Whether a formal post-installation review has been undertaken and the results
reported to management.
PO.2. Has a Post-Installation report or similar document been prepared?
Verify that it contains the following:
• documentation of the system's actual achievements
• comparison of those achievements against the original objectives
• recommendations for improvements
• actual resource use, compared to the original plan, with reasons for variance
• actual milestones achieved, compared to the original plan, with reasons for variance
• updated cost/benefit analysis
PO.3. Verify actual resource use in source documents.
PO.4. Verify the updated cost/benefit analysis against source documents.
PO.5. Determine that the updated cost/benefit analysis has taken into consideration the
human resource impact requirements.
PO.6. Confirm that the organization continues to have the necessary resources to manage the
project successfully.
PO.7. Have the needs of the business and/or end users changed?
PO.8. Do documented procedures exist for controlling all documentation?
Ensure that:
• Change control is a formal procedure for both the user and the development
groups
• Change control logs ensure all changes shown were resolved
• The user is satisfied with the turnaround of change requests (timeliness and cost)
• Changes were made as documented
• Current documentation reflects the changed environment
• The change process is being monitored for improvements in acknowledgement,
response time, response effectiveness and user satisfaction with the process
• Test that, for a sample of changes, the following have been approved by
management:
Request for change
Specification of change
Access to source program
Programmer completion of change
Request to move change into test environment
Completion of acceptance testing
Request for move into production
• Overall and specific security impact has been determined and accepted
• Distribution process has been developed
• Test the review of change control documentation for inclusion of:
Date of requested change
Person(s) requesting
Approval for change request
Approval of change made (IT function)
Approval of change made (users)
Documentation update date
Move date into production
Quality assurance sign off of change
Acceptance by operation
• Ensure that:
Code check-in and checkout procedures for changes exist
Change control logs ensure that all changes on the log were resolved to user
satisfaction
Users are aware of and understand the need for formal change control procedures
A staff enforcement process ensures compliance with change control procedures
Documentation confirms that each request for system change has been approved and
prioritised by the management of the affected users and the service provider
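The approval checks above lend themselves to a simple automated test over a sample of change records. The sketch below is hypothetical: the field names merely mirror the checklist items and do not come from any particular change-management tool.

```python
# Hypothetical field names derived from the checklist items above.
REQUIRED_FIELDS = [
    "date_requested", "requested_by", "request_approval",
    "it_approval", "user_approval", "doc_update_date",
    "production_move_date", "qa_signoff", "operations_acceptance",
]

def missing_fields(change_record):
    # Return the checklist fields that are absent or empty in one record.
    return [f for f in REQUIRED_FIELDS if not change_record.get(f)]

# A two-record sample: one complete, one incomplete.
sample = [
    {f: "ok" for f in REQUIRED_FIELDS},
    {"date_requested": "2004-01-15", "requested_by": "clerk"},
]
# Exceptions report: record index -> list of missing checklist fields.
exceptions = {i: missing_fields(r)
              for i, r in enumerate(sample) if missing_fields(r)}
```

In practice the sample would be drawn from the change log, and every entry in `exceptions` would be followed up with the service provider.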
PO.17. Do preventive maintenance schedules have any negative impact on critical or
sensitive applications, and is maintenance being scheduled during peak workload
periods?
Literature consulted includes the COBIT (Control Objectives for Information and related Technology)
guidelines and publications of the Treasury Board of Canada Secretariat.
Appendix
Give the project 3 points for each "yes" answer. Give the project partial credit if you feel that
is most accurate—for example, give it 2 points for "probably" and 1 point for "kind of, but
not really." If the project is in the early stages, answer the questions based on the project
plans. If the project is well underway, answer the questions based on what has actually
happened on the project. The section following the test explains how to interpret the score.
Requirements
1. ____ Does the project have a clear, unambiguous vision statement or mission
statement?
2. ____ Do all team members believe the vision is realistic?
3. ____ Does the project have a business case that details the business benefit and how the
benefit will be measured?
4. ____ Does the project have a user interface prototype that realistically and vividly
demonstrates the functionality that the actual system will have?
5. ____ Does the project have a detailed, written specification of what the software is
supposed to do?
6. ____ Did the project team interview people who will actually use the software (end
users) early in the project and continue to involve them throughout the project?
7. ____ Does the project have a detailed, written Software Development Plan?
8. ____ Does the project’s task list include creation of an installation program, conversion
of data from previous versions of the system, integration with third-party
software, meetings with the customer, and other "minor" tasks?
9. ____ Were the schedule and budget estimates officially updated at the end of the most
recently completed phase?
10. ____ Does the project have detailed, written architecture and design documents?
11. ____ Does the project have a detailed, written Quality Assurance Plan that requires
design and code reviews in addition to system testing?
12. ____ Does the project have a detailed Staged Delivery Plan for the software, which
describes the stages in which the software will be implemented and delivered?
13. ____ Does the project’s plan include time for holidays, vacation days, sick days, and
ongoing training, and are resources allocated at less than 100 percent?
14. ____ Was the project plan, including the schedule, approved by the development team,
the quality assurance team, and the technical writing team—in other words, the
people responsible for doing the work?
Scoring Guidelines
Score (Rating) Comments
90+ (Outstanding): A project with this score is virtually guaranteed to succeed in all respects,
meeting its schedule, budget, quality, and other targets. Such a project is fully "self-actualized."
80–89 (Excellent): A project at this level is performing much better than average. Such a project
has a high probability of delivering its software close to its schedule, budget, and quality targets.
40–59 (Fair): A project with this score will likely experience high stress and shaky team dynamics,
and the software will ultimately be delivered with less functionality than desired, at greater cost
and with a longer schedule.
< 40 (At Risk): A project with this score has significant weaknesses in the major areas of
requirements, planning, project control, risk management, and personnel. The primary concern of a
project in this category should be whether it will finish at all.
This quick diagnostic material is adapted, with thanks, from the Survival Guide website at
www.construx.com/survivalguide/. This material is copyright © 1993–1998 Steven C. McConnell.
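The scoring rules can be sketched in a few lines of Python. This is only an illustration of the table above; the 60–79 bands are not reproduced in this extract, so the function returns a placeholder for them.

```python
def total_score(answers):
    # answers: per-question points (3 = yes, 2 = probably,
    # 1 = kind of but not really, 0 = no).
    return sum(answers)

def band(total):
    # Bands follow the scoring table above; the 60-79 rows are not
    # shown in this extract, so they fall through to a placeholder.
    if total >= 90:
        return "Outstanding"
    if total >= 80:
        return "Excellent"
    if 40 <= total <= 59:
        return "Fair"
    if total < 40:
        return "At Risk"
    return "(band not shown in this extract)"
```

For example, a project answering "yes" to every question scores 3 points per question and lands in the highest band it qualifies for.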
• Vision – does the Government have an overall vision and road map for the
development of e-governance?
• E-Governance Framework – is there a proper e-governance framework,
covering legislation, regulations, standards and infrastructure for supporting the
delivery of e-government services?
• Delivery Model – has the Government designed/adapted a model for the
delivery of e-government services and transforming the existing method of
providing services?
• Organisational Structure – has the Government put in place an appropriate
organisational framework for planning, implementing and managing e-
governance?
• Digital Divide – how does the Government plan to bridge the digital divide
in the country/State and ensure that every citizen has affordable access to
computers and e-enabled services?
Detailed plans
ORGANISATIONAL ISSUES
FUNDING
SECURITY ISSUES
HUMAN RESOURCES
LEGAL ISSUES
QUALITY OF SERVICE
DATA PRIVACY
Some of the important icons in the MS Access Queries module are given below:
• an icon to view the data after designing a query
• an icon to run a query
• an icon for summing/profiling functions
Query - 1
Let us join ‘Bundl_SM’ table (contains records for each Major Head Bundle
corresponding to each of the Treasury Accounts in Jharkhand) and ‘MJH_M’ table
(Major Head Master) and see if there are any ‘Major Head Codes’ in ‘Bundl_SM’
table, that are not available in ‘MJH_M’ table.
• Open Access
• Go to Queries Module
• Select New on the Menu Bar
• Select Design View and press OK. You will see all the tables available in
the database in the Show Table dialog box
• Select Bundl_SM and press Add (You can also double-click the name of
the table to select it). You will see the Bundl_SM table on the screen
• Next select MJH_M and press Add. You will see the MJH_M table on the
screen
• Press Close
• Click on MJH_CD in Bundl_SM table and pull it over the MJH_CD in
MJH_M table. You will see a link / join between the two tables on the field
‘MJH_CD’.
Note: There are three types of joins in Access viz.
• An inner join, which is the default join and includes only rows where the
joined fields from both tables are equal.
• A left outer join which includes all records from the first table (in this case,
the Bundl_SM table) and only those records from the second table (in this
case, the MJH_M table) where the joined fields are equal.
• A right outer join which includes all records from the second table (in this
case, the MJH_M table) and only those records from the first table (in this
case, the Bundl_SM table) where the joined fields are equal.
• Double-click the link between the two tables. You will see a small box
containing the Join Properties. This box gives the details of the two tables
and the columns/fields on which the two tables are joined. Select option 2,
whereby all the records from the Bundl_SM table will be included, along with
only those records from the MJH_M table where the joined field (MJH_CD) is
the same.
• Next, place the required fields from the table into the query. For this,
select the required fields in the Bundl_SM table, then drag and drop them
onto the field row of the query grid. Alternatively, you can select the
required fields from within the query area by clicking on the field row.
• Select the required fields from the second table as well, and run the query.
Query - 2
Profile of List of Payments (LOP) and Cash Accounts (CAC) for the whole year.
• Open Access → Queries → New → Design View
• In the Show Table box, select BUNDL_SM and press Add →Close.
• Drag and drop LOP_CAC, G_PAYMT from BUNDL_SM on to the Query
panel
• Press the Summing Function icon (∑)
• Select Group By under LOP_CAC
• Select Sum under G_PAYMT
• Run the Query
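In SQL terms, Query-2 is a plain GROUP BY with a SUM aggregate. The sketch below reproduces it in SQLite with invented sample figures; only the table and field names (BUNDL_SM, LOP_CAC, G_PAYMT) come from the text.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE BUNDL_SM (LOP_CAC TEXT, G_PAYMT REAL)")
conn.executemany("INSERT INTO BUNDL_SM VALUES (?, ?)",
                 [("LOP", 500.0), ("LOP", 250.0), ("CAC", 1000.0)])

# Group By under LOP_CAC and Sum under G_PAYMT, as chosen in the
# Access query grid after pressing the Summing Function icon.
profile = conn.execute(
    "SELECT LOP_CAC, SUM(G_PAYMT) FROM BUNDL_SM"
    " GROUP BY LOP_CAC ORDER BY LOP_CAC").fetchall()
```

The result is one row per LOP_CAC value with its yearly total, which can then be compared against the compiled accounts.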
These are only a few samples of the basic types of queries that can be run using MS
Access. Numerous queries can be run to view, change and analyse data using different
parameters, as well as to build forms, reports and data access pages. The results of
these queries can also be exported to MS Excel to produce graphs and charts for
inclusion in audit reports.
Bibliography
________________________________________________