Introduction
Transaction = an event that initiates either a manual or computer-based activity.
Transaction processing systems (TPS) = used to process everyday transactions that produce large amounts
of data, for example in sales outlets or retail stores.
A TPS minimises the organisation’s costs by reducing the data that must be handled.
There are 2 types of transaction processing: Batch (time delay) and Real-time (immediate).
Batch Processing
Batch Processing = collects data as a group (a batch) and processes it all at once at a later time.
It has a time delay. E.g. clearance of cheques.
It is carried out by large organisations using a mainframe¹ or mid-range² computer.
It involves a large batch of an identical data type, such as payroll or stock information.
Batch programs are often run at night when there is less demand for the information system.
1 A mainframe computer is a central computer for a large number of users. It is more powerful than a midrange computer and often
has thousands of terminals connected to it. Mainframe applications include payroll computations, accounting and airline seat
reservations.
2 A midrange is a central computer that performs the processing for a number of users working at terminals. A terminal is an
input/output device (usually a keyboard or screen). Midrange computers are typically used for accounting, database management
and specific industry applications.
The Advantages:
- Can be run as a regularly scheduled event or when enough data has been collected.
- Can be run as a fully automated process, without the need for a human operator.
→ Atomicity – ensures that all of the steps involved in a transaction are completed successfully as a group. If any
step fails, no other step should be completed.⁴
Main disadvantages (of real-time processing):
- Expense
- More hardware and software to install.
Below is a diagrammatic representation of real-time processing: [diagram not reproduced]
3 Example: if an airline ticket agent starts to reserve the last seat on a flight, then another agent cannot tell another passenger that the
seat is available.
4 Example: a banking transaction may involve 2 steps: withdrawing money from a cheque account & transferring it into a savings
account. If the first step (withdrawal) succeeds, then the second step (transfer) must also succeed; otherwise the entire transaction is
abandoned. This ensures the withdrawal is never recorded without the matching transfer.
- More computer operators are required in real time as operations are not centralised. Real time is more
difficult to maintain.
Data Validation
Data Validation = used to check the entry of transaction data.
Involves ensuring transactions are correct & have been accurately stored.
Involves:
→ Transaction Initiation
- Used in real time to eliminate a number of possible errors.
- Used to acknowledge that a TP monitor is ready to receive the transaction data.
- Some TPSs add an entry time to the transaction data to trace data if lost.
→ Field Checking
- Occurs when the transaction data is entered into a database.
- Data is organised into files, records, fields and characters.
- Data validation is carried out by checking the fields, using a range check, list check, type check or
check digit.
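A minimal Python sketch of field checking on a sales record, assuming hypothetical field names (quantity, store, product_code), a made-up store list and a modulus-10 (Luhn-style) check digit scheme:

    # A minimal sketch of field checking; the field names, limits and store
    # list are assumptions for illustration, not taken from the notes.
    VALID_STORES = {"Chatswood", "Penrith", "Hornsby"}   # list check values (assumed)

    def check_digit_ok(code: str) -> bool:
        """Modulus-10 (Luhn-style) check digit, as one possible scheme."""
        digits = [int(c) for c in code if c.isdigit()]
        if len(digits) != len(code):
            return False
        total = 0
        for i, d in enumerate(reversed(digits)):
            if i % 2 == 1:
                d *= 2
                if d > 9:
                    d -= 9
            total += d
        return total % 10 == 0

    def validate_transaction(record: dict) -> list[str]:
        errors = []
        # Type check: quantity must be an integer.
        if not isinstance(record.get("quantity"), int):
            errors.append("quantity must be an integer")
        # Range check: quantity between 1 and 999.
        elif not 1 <= record["quantity"] <= 999:
            errors.append("quantity out of range 1-999")
        # List check: store must be one of the known stores.
        if record.get("store") not in VALID_STORES:
            errors.append("unknown store")
        # Check digit: product code must pass the modulus check.
        if not check_digit_ok(str(record.get("product_code", ""))):
            errors.append("product code fails check digit")
        return errors

    print(validate_transaction({"quantity": 3, "store": "Penrith", "product_code": "79927398713"}))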
8 The central computer is usually a mainframe or mid-range computer that does the processing for the entire chain.
System flow chart for point of sale system: [flow chart not reproduced]
Types of database organisation:
a) Hierarchical database: organises data in a series of levels. It uses a top-down structure consisting of nodes and
branches.
b) Network database: organises data as a series of nodes linked by branches. Each node can have many branches
& each lower level node may be linked to more than one higher node.
c) Relational database: organises data as a series of tables. Relationships are built between the tables to provide a
flexible way of manipulating & combining data.
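A minimal sketch of the relational approach, assuming two illustrative tables (customer and sale) linked by a foreign key; the table and column names are invented for the example:

    # A minimal sketch of a relational database using Python's sqlite3 module.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT)")
    conn.execute("""CREATE TABLE sale (
                        id INTEGER PRIMARY KEY,
                        customer_id INTEGER REFERENCES customer(id),
                        product TEXT,
                        amount REAL)""")
    conn.execute("INSERT INTO customer VALUES (1, 'A. Citizen')")
    conn.execute("INSERT INTO sale VALUES (1, 1, 'Widget', 19.95)")

    # The relationship between the tables lets data be combined flexibly with a join.
    for row in conn.execute("""SELECT customer.name, sale.product, sale.amount
                               FROM sale JOIN customer ON sale.customer_id = customer.id"""):
        print(row)   # ('A. Citizen', 'Widget', 19.95)
    conn.close()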
Short Transactions
Keeping transactions short enables the entire transaction to be processed quickly, which improves
concurrency.
User interaction during transaction processing is avoided, as it slows down the system.
Real-Time backup
Characterised by continuous operations with downtime kept to a minimum.
Backups are scheduled during times of low activity.
High Normalisation
Redundant information is kept to a minimum – increasing the speed of updates & improving concurrency.
Reducing redundancy also speeds up backups, as there is less data.
Transaction File
Collection of transaction records.
Data in the transaction file is used to update the master file.
The transaction file also provides an audit trail and history for the organisation.
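A minimal sketch of a batch update of a master file from a transaction file, assuming CSV files with hypothetical column names (product_code, quantity, change):

    # A minimal sketch of applying a batch of transaction records to a master
    # file of stock levels. File layout and column names are assumptions.
    import csv

    def update_master(master_path: str, transaction_path: str) -> None:
        # Load the master file (product_code, quantity) into memory.
        with open(master_path, newline="") as f:
            master = {row["product_code"]: int(row["quantity"]) for row in csv.DictReader(f)}

        # Apply every transaction in the batch: negative changes are sales,
        # positive changes are deliveries.
        with open(transaction_path, newline="") as f:
            for row in csv.DictReader(f):
                master[row["product_code"]] = master.get(row["product_code"], 0) + int(row["change"])

        # Write the updated master file back out.
        with open(master_path, "w", newline="") as f:
            writer = csv.DictWriter(f, fieldnames=["product_code", "quantity"])
            writer.writeheader()
            for code, qty in master.items():
                writer.writerow({"product_code": code, "quantity": qty})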
Report File
Contains data that has been formatted for presentation to a user.
Work File
Temporary file used by the system during processing.
Program File
Contains instructions for the processing of data.
Created using a programming language such as Visual Basic or C++.
Data Warehousing
Data warehouse = a database that collects information from different data sources.
Data gathered in real time can be used for analysis if stored in a data warehouse.
The volume of data is a problem when analysing a database that is continually being updated. The solution is to
have periodic downloads of data into a separate database set up especially for analysis – the data warehouse.
The process of using analysis tools to find patterns and trends in that data is data mining.
This means that a ‘snapshot’ of the transaction database can be collected at any time and is on hand for
analysing trends, as well as the performance of the database itself.
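A minimal sketch of loading a periodic snapshot into a data warehouse, assuming SQLite databases and a hypothetical operational table named sales:

    # A minimal sketch of copying a dated snapshot of the operational data
    # into a separate warehouse database. Database and table names are assumed.
    import sqlite3, datetime

    def snapshot_sales(tps_db: str = "tps.db", warehouse_db: str = "warehouse.db") -> None:
        src = sqlite3.connect(tps_db)
        dst = sqlite3.connect(warehouse_db)
        dst.execute("""CREATE TABLE IF NOT EXISTS sales_history
                       (snapshot_date TEXT, product TEXT, quantity INTEGER, amount REAL)""")
        today = datetime.date.today().isoformat()
        # Copy the current state of the operational sales table into the warehouse,
        # stamped with the snapshot date so it becomes read-only historical data.
        for product, quantity, amount in src.execute("SELECT product, quantity, amount FROM sales"):
            dst.execute("INSERT INTO sales_history VALUES (?, ?, ?, ?)",
                        (today, product, quantity, amount))
        dst.commit()
        src.close()
        dst.close()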
Historical
Real-time systems represent current values only.
They do not show inventory as it was at some time in the past.
Querying stock inventory a moment later may return a different response.
However, data stored in a data warehouse is accurate for a specific moment in time, as it represents its
historical information and cannot change.
Data warehouse stores a series of snapshots of an organisation’s operational data generated over a long period
of time.
Read-only
After data has been moved to the data warehouse, it does not change unless the data is incorrect.
As it represents a particular point in time, it must never be updated.
The only operations that occur are loading and querying data.
Backup procedures
- Organisations are now dependent on their TPS.
- Well-designed backup and recovery procedures minimise disruptions when the TPS goes down.
- A backup is another copy of data that can be used to rebuild the system; if the system goes down, the recovery
process rebuilds it.
- The success of backup and recovery depends on implementing appropriate procedures.
- In general, the more valuable the data the:
i. More often the files should be backed up
ii. More copies should be made
iii. Greater number of locations where the backups should be stored.
Partial backups
A full backup of all files is made from time to time; between full backups, partial backups can be:
i. Differential – copies of all files that have changed since the last full backup.
ii. Incremental – copies of all files that have changed since the most recent backup of any kind.
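A minimal sketch of selecting files for a partial backup, where the only difference between differential and incremental is the reference time used; the paths and flat directory layout are assumptions:

    # A minimal sketch of a partial backup: copy only files modified after a
    # reference time. Paths and directory structure are assumed.
    import os, shutil

    def partial_backup(source_dir: str, backup_dir: str, reference_time: float) -> None:
        os.makedirs(backup_dir, exist_ok=True)
        for name in os.listdir(source_dir):
            path = os.path.join(source_dir, name)
            # Only copy files modified after the reference time.
            if os.path.isfile(path) and os.path.getmtime(path) > reference_time:
                shutil.copy2(path, os.path.join(backup_dir, name))

    # Differential: reference_time = time of the last FULL backup.
    # Incremental:  reference_time = time of the last backup of any kind.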
Backup media
CDs, magnetic tape, etc.
Reliability, cost and file size will determine the method used.
Magnetic Tape
Can store large quantities of data inexpensively.
Magnetic tape is a long, thin strip of plastic coated with a thin layer of magnetic material.
The tape is wound on two reels inside a cartridge.
Disadvantage – access is sequential, so finding a particular record is slow.
Backup Software
Designed to manage the copying of selected files to backup media, such as tapes; it generally offers automated
scheduling as well as options for full, differential or incremental backups.
Checkpoint
The DBMS periodically suspends all processing to synchronise its files and journals.
Transactions in progress are completed and the journals updated. The system is then in a ‘quiet state’ and the
database is synchronised with the transaction logs.
The DBMS then writes a special record to the transaction file – a ‘checkpoint record’ – which contains the information
necessary to restart the system. Checkpoints are taken frequently, so when failures occur it is possible to resume
processing from the most recent checkpoint.
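A minimal sketch of checkpoint records in a simple line-per-entry journal file; the record format is invented for the example:

    # A minimal sketch of checkpointing a journal file, assuming one record
    # per line. The record format here is illustrative only.
    import time

    def write_checkpoint(journal_path: str) -> None:
        # Once the database and journal are in a 'quiet state', append a
        # checkpoint record containing the information needed to restart.
        with open(journal_path, "a") as journal:
            journal.write(f"CHECKPOINT {time.time()}\n")

    def entries_since_last_checkpoint(journal_path: str) -> list[str]:
        # On recovery, only journal entries after the most recent checkpoint
        # need to be reprocessed.
        pending = []
        with open(journal_path) as journal:
            for line in journal:
                if line.startswith("CHECKPOINT"):
                    pending = []      # everything before this point is already safe
                else:
                    pending.append(line.rstrip("\n"))
        return pending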
Recovery manager
Restores database to a correct condition & restarts transaction processing.
Updating in a batch
Updating in a batch is used when transactions are recorded on paper or stored on magnetic tape.
Transactions are collected & updated in a batch when it is convenient or economical to process them.
Historically, updating in a batch was the only feasible method when transaction details were stored on punch
cards or magnetic tape; the technology for immediate processing did not exist.
Updating in a batch requires a secondary storage medium that can store large amounts of data inexpensively –
usually magnetic tape.
Updating online (in real time) requires:
Hardware – large-capacity secondary storage (such as hard drives) with direct-access files, so that
response time is quick (several seconds or less).
Software – must enable online work by multiple simultaneous users, as well as a simple user-friendly interface,
since many participants will use it & often have only brief on-the-job training.
Real Time processing = immediate processing of data. Provides instant confirmation of a transaction.
Involves a large number of users simultaneously.
>> Hardware
Includes MICR readers, ATMs and barcode readers.
Barcode readers
Used to collect product information at point of sale.
Often use laser to read barcodes.
Product information (description, price and code) is held on a central computer linked to the POS terminal.
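A minimal sketch of the POS lookup, assuming the product table held on the central computer is a simple in-memory dictionary with made-up barcodes:

    # A minimal sketch of a point-of-sale lookup: the barcode read at the
    # terminal is matched against product information on the central computer.
    # The product table and barcodes are assumptions for illustration.
    PRODUCTS = {
        "9300601234567": {"description": "Milk 2L", "price": 3.10},
        "9310072001234": {"description": "Bread",   "price": 2.50},
    }

    def scan(barcode: str) -> str:
        product = PRODUCTS.get(barcode)
        if product is None:
            return "Unknown product - enter code manually"
        return f"{product['description']}  ${product['price']:.2f}"

    print(scan("9300601234567"))   # Milk 2L  $3.10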
12 Example: people using an ATM generate transaction data by entering their debit card numbers & typing their requests on a
keyboard.
>> Forms
Form = document used to collect data from a person.
Once a form is completed, it is processed in batch or real time.
Forms are designed to limit responses & minimise data entry errors by providing a good format for the user.
3 types of forms:
Paper - written on paper & batch processed later.
Online/on-screen – completed for data entry to populate databases, and can be processed in real time if
needed. E.g. Commonwealth Bank website. User can view, enter & change data any time. Well-designed
form provides information explaining the required data.
Forms can minimise data entry errors by automatically filling in previously stored data, such as a
customer’s address, once the user has entered the customer’s name (see the sketch after this list).
Web forms – used when purchasing on the internet. The user’s details become data in a database.
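A minimal sketch of the autofill idea mentioned above, assuming a hypothetical store of customer details keyed by name:

    # A minimal sketch of on-screen form autofill: once the customer's name is
    # entered, previously stored details populate the remaining fields.
    # The customer records are assumptions for illustration.
    CUSTOMERS = {
        "A. Citizen": {"address": "1 Example St, Sydney", "phone": "9999 0000"},
    }

    def autofill(form: dict) -> dict:
        stored = CUSTOMERS.get(form.get("name", ""))
        if stored:
            # Fill in any field the user has not already typed.
            for field, value in stored.items():
                form.setdefault(field, value)
        return form

    print(autofill({"name": "A. Citizen"}))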
Analysing Data
Output from a TPS is input to other types of information systems, such as decision support systems and
management information systems.
Data Mining = used in DSSs to find relationships and patterns in the data stored in a database.
Sorts through data & finds connections between data – allowing for better decision making.
Data models using ‘what-if’ scenarios are used where trends are not clear or change unpredictably, so that a
decision can be made.
Models follow mathematical principles based on independent variables which are governed by inputs.
13 Example: a business uses a TPS to process its sales transactions. It uses a DSS to periodically summarise its sales data by date, region
and product. This summary information is stored in a separate database to be analysed by senior management. To make decisions,
management needs to see trends quickly by querying the data.
Summary reports – generally combine data, showing totals over different areas or times for tactical &
strategic planning.
Exception reports - contain information which is outside normal range and is used to alert management to
some situation which is either unfavourable (like a critical shortage of some component) or may require
special handling (like one area suddenly having large numbers of late payments).
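A minimal sketch of a summary report and an exception report produced from TPS output, assuming made-up sales records and a 30-day overdue threshold:

    # A minimal sketch: totals by region (summary report) and late payments
    # outside the normal range (exception report). Data and threshold assumed.
    from collections import defaultdict

    sales = [
        {"region": "North", "amount": 1200.0, "days_overdue": 0},
        {"region": "North", "amount": 800.0,  "days_overdue": 45},
        {"region": "South", "amount": 650.0,  "days_overdue": 10},
    ]

    # Summary report: combine data to show totals by region.
    totals = defaultdict(float)
    for sale in sales:
        totals[sale["region"]] += sale["amount"]
    print("Sales by region:", dict(totals))

    # Exception report: only records outside the normal range (overdue > 30 days).
    late = [s for s in sales if s["days_overdue"] > 30]
    print("Late payments needing attention:", late)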
Bias
Bias = systematic inaccuracy due to methods used in collecting, processing & presenting data. Bias means
data is unfairly skewed or gives too much weight to a particular result.
Data gathered from TPS can be presented in tables in a biased way.
Generally an issue during the processing and presentation stages.
Deliberate bias = ethical issue.
Importance of data
Organisations need procedures to ensure data is secure, accurate and valid.
>>Data Security
Data Security = involves a series of safeguards to protect data.
Data is under threat of being:
- Stolen
- Destroyed
- Maliciously modified
14 Example: the POS terminal performs tasks once performed by people in a manual transaction system, such as memorising the prices of products.
At the lowest security level, passwords and biometric methods such as fingerprints and retinal scans keep
unskilled people from accessing data.
Atomicity occurs when all of the steps involved in a transaction are completed successfully as a group. If
any step fails, no other step should be completed and the transaction is abandoned. If some operations succeed
& others fail, there is no atomicity.
Consistency occurs when a transaction successfully transforms the system and the database from one valid
state to another. Consistency in a TPS stems from the correct application programming, such as always
debiting & crediting the same amount.
Isolation occurs if a transaction is processed concurrently with other transactions and still behaves as if it
were the only transaction executing in the system. Transactions must not interfere with each other’s database
updates.
Durability occurs when the changes made by a completed transaction are permanent and survive any later
system failure.
15 Example: if the price of a product has been entered incorrectly into the database, then customers buying that product will be charged the wrong price.
These ACID properties guarantee that a transaction is never incomplete, the data is never inconsistent,
concurrent transactions are independent, and the effects of a transaction are permanent.
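A minimal sketch of atomicity using Python's sqlite3 module, where the withdrawal and the deposit are committed or rolled back together; the account names, balances and transfer amount are assumptions:

    # A minimal sketch of atomicity: the two steps of a transfer succeed or
    # fail together. Accounts and amounts are assumed for illustration.
    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE account (name TEXT PRIMARY KEY, balance REAL)")
    conn.executemany("INSERT INTO account VALUES (?, ?)",
                     [("cheque", 500.0), ("savings", 100.0)])
    conn.commit()

    def transfer(amount: float) -> None:
        try:
            with conn:   # opens a transaction; commits on success, rolls back on error
                conn.execute("UPDATE account SET balance = balance - ? WHERE name = 'cheque'", (amount,))
                cur = conn.execute("SELECT balance FROM account WHERE name = 'cheque'")
                if cur.fetchone()[0] < 0:
                    raise ValueError("insufficient funds")   # forces a rollback
                conn.execute("UPDATE account SET balance = balance + ? WHERE name = 'savings'", (amount,))
        except ValueError:
            pass   # neither step is recorded - the whole transaction was abandoned

    transfer(900.0)   # fails: both the withdrawal and the deposit are rolled back
    print(list(conn.execute("SELECT * FROM account")))   # balances unchanged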
Management should not depend solely on the output from the TPS, but try to make decisions & plans
incorporating outputs from both management information and decision support systems.
People in management have created false transaction data to promote their careers, which shows that the results of a TPS are not always correct. ∴ There should be less dependence on TPS output alone.