Learning Unit 3
Utilising databases
You have read extensively from various perspectives about data in AIN1501 and the previous study units. According to Taniar and Rahayu (2022), data is the largest commodity nowadays. On 6 May 2017, The Economist (https://www.economist.com/leaders/2017/05/06/the-worlds-most-valuable-resource-is-no-longer-oil-but-data) published an article, "The world's most valuable resource is no longer oil, but data", highlighting that data is the most valuable resource. Subsequently, the term "data is the new oil" was coined. The article mentioned the giants that deal with data, such as Google, Amazon, Apple, Facebook and Microsoft. Oil is useful only if it is used to fuel an engine, whether a vehicle engine, an aircraft engine or a manufacturing engine. An engine is designed to convert one form of energy into mechanical energy: it burns fuel to produce mechanical power. If data is the new oil, it should then be the fuel for data engines, as it will only be useful if it fuels a data engine. A data engine, like any other engine, produces the power that enables an information system to move and operate. As the efficient data and information storage and retrieval needs of organisations increased over the years, the use of databases to manage data and information also increased. Databases are widely used in business nowadays, and their utilisation can differ between organisations. To process massive data sets and make data analysis efficient, data needs to be organised in a way that ensures efficient storage and access, using different data management technologies: file management systems, database management systems (hierarchical, network-based and relational), data warehouses and SQL (UNISA 2022).
3.1 Introduction
Organisations today know that information technology is essential not only for daily
operations but also for gaining strategic advantage in the marketplace. The importance of
information technology means that information security has also become important.
Breaches in information security can result in litigation, financial losses, damage to brands,
loss of customer confidence, loss of business partner confidence, and can even cause the
organisation to go out of business. As we consider all these issues as a whole, we see how
critically important it is for information security professionals to have strong business
management and organisational skills. Information security professionals must also
communicate with entire user communities to raise their awareness of information security
issues through training and education, thereby promoting a culture attuned to information
security. They must also work with business managers and the user community during risk
assessment (UNISA 2022). Anwar, Panjaitan and Supriati (2021) mention that since the
advent of the digital revolution, information systems have been defined and put into
practice. Thus, Database Administrators (DBAs) must be better aware of the procedures
used to preserve business data, as well as the standards and regulations that may be
applied to the data. In this regard, many businesses want the analytical process to take as
little time as feasible. Therefore, it is critical for businesses to have the ability to execute
analysis and report production from information systems in an effective, efficient, and
integrated manner.
In the previous study unit, we learnt about the database environment, its components
and the terminology used in a relational database. In this study unit, we will briefly discuss
the utilisation of databases in accounting and auditing. We will also look at the use of
databases in data warehouses, data marts, data mining and online analytical processing.
After working through this study unit, you should be able to do the following:
• Describe the importance of database management systems for accountants and auditors.
• Distinguish the key differences between OLAP and data mining.
The process of preparing data for the data warehouse, which may involve data cleaning, filtering, extraction, integration and aggregation, amongst others, is known as the extract, transform and load (ETL) process. Extracting the data from the data warehouse is done by Online Analytical Processing (OLAP), which then presents the retrieved data to a Business Intelligence (BI) tool for producing reports and charts.
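As a simple illustration of the load step, the following SQL sketch extracts rows from a hypothetical operational table (SalesOrders), cleans and aggregates them, and loads the result into a warehouse fact table. The table and column names, and the assumption that TimeDim carries a CalendarDate column, are illustrative only; in practice, dedicated ETL tools schedule, validate and log such loads.

-- Illustrative extract, transform and load (ETL) in one SQL statement.
-- SalesOrders is a hypothetical operational table; SalesFact and TimeDim are
-- hypothetical warehouse tables in a star schema.
insert into SalesFact (TimeID, ProductNo, LocationID, Total_Sales)
select
    d.TimeID,                                       -- transform: map order dates to the time dimension
    o.ProductNo,
    o.LocationID,
    sum(o.Quantity * o.UnitPrice) as Total_Sales    -- transform: aggregate order lines into one fact row
from SalesOrders o
     join TimeDim d on d.CalendarDate = o.OrderDate
where o.Status = 'COMPLETED'                        -- clean: exclude cancelled or incomplete orders
  and o.Quantity > 0                                -- clean: filter out obvious data errors
group by d.TimeID, o.ProductNo, o.LocationID;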
The figure below shows the entire journey of data in various forms and formats. It starts
from an operational database (or transactional database) which is the backbone of any
information system.
Finance departments as well as accounting and auditing firms use accounting software to
record financial transactions. Most financial software uses databases to store and retrieve
financial data. For example, Pastel Partner software, which you will learn more about in
topic 7, uses a relational database to store data. We have learnt more about the different
available accounting information systems (AISs) in AIN1501 study unit 14.
Inadequate Accounting Information System (AIS) security increases the opportunity for
manipulation, falsification or alteration of accounting records. Thus, the accounting
profession recognises the need for increased security over AIS. Security of information has
become a major concern to all types of businesses, as the technological advancements on hot topic lists represent fundamental changes to business practices, with significant system
security implications. The elevated importance of security stems from the recognition that
inadequate security over a system precludes any assurance that an AIS will produce
reliable information to meet internal and external reporting requirements (UNISA 2022).
Generally speaking, and as a security philosophy, system security involves risk assessment
and counter-measure implementation to ensure that such systems will operate, function
correctly and be safe from attack by internal and external adversaries. Central to proper
security is that all stakeholders must understand the general need for security and specific
potential threats faced by the organisation. The system security topics of information security and control, disaster recovery, and high availability and resiliency of systems are both philosophical and financial. System security is often viewed in a manner similar to physical security: buy it once and use it forever. Unfortunately, like physical
security, obsolete policies, procedures and technologies leave systems extremely
vulnerable to external and internal attacks. Most stakeholders find it difficult to accept the
need for constant spending on system security when it is difficult to quantify the benefits.
Even when benefits can be quantified, unenlightened stakeholders may still question the
need for continuous spending in the system security area. In many cases, education can
overcome this philosophical barrier. Unfortunately, often only severe losses from a security
breakdown will prompt appropriate, albeit late action. The benefits of system security can be
calculated and quantified from a loss exposure perspective. Once each system is identified
and prioritised in terms of sustaining daily operations and the dollar amount calculated for
the upper cost limits, the cost of the security system program or upgrade must be compared
to the upper cost limits of system failure. Before undertaking such tasks, the system’s
vulnerabilities in terms of passwords, firewalls, data encryption, and employees must be
understood. The greatest threat to computer security is unauthorised access to data or
equipment. There are five basic threats to security: persons outside the organisation – 5%;
natural disasters – 8%; disgruntled employees – 10%; dishonest employees – 10%, and
human error – 67% (UNISA 2022).
Furthermore, public accounting firms are now equipped to help companies secure their
networks and work alongside companies to improve their current security systems. These
practices are growing rapidly as more companies seek help in protecting their information.
Market demand should also increase as companies start externally reporting on their
cybersecurity risk management efforts and obtaining assurance for the reporting (e.g., using
the AICPA Framework). Accordingly, beyond the need to protect organisational data, recent
academic research suggests numerous benefits of cybersecurity disclosures for client firms.
Two of the key benefits are that cybersecurity disclosures can be informative to investors
and can help mitigate the negative impacts of a subsequent breach. However, despite the
recommendations of regulatory bodies and the findings of recent research, firms often fail to
provide disclosures regarding cyber issues.
In their study, Tan and Low (2019) examined the prediction that blockchain technology will
transform accounting and the accounting profession because transactions recorded on a
blockchain can be aggregated into financial statements and confirmed as true and accurate.
They argued that in a blockchain-based AIS, accountants will no longer be the central
authority but will remain the preparer of financial reports required by regulations. However,
they will continue to influence policies such as the choice and accreditation of validators and
serve as validators of last resort. Using the three-tier architecture of the AIS, their study
addressed the gap in the literature about how characteristics of blockchain technology can
influence the implementation of a blockchain-based AIS, with related implications for the
accounting profession. Thus, they argue that blockchain technology affects the database
engine of the accounting information system (AIS) through digitisation of the current paper-
based validation process. However, audit evidence still needs to be gathered for rendering
of an audit opinion in a blockchain-based AIS. Digitisation of the validation process reduces the error rate and lowers the cost of vouching and tracing, while the immutability of blockchain data reduces the incentive and opportunities for fraud. Therefore, a blockchain-based AIS
alone does not guarantee that financial reports are true and fair. Lower error rates and
reduced incentives for accounting fraud in a blockchain-based AIS are expected to improve
audit quality. However, this prediction will need to be empirically tested when blockchain-
based AIS become available.
Yu, Lin and Tang (2018), in their study shedding light on the potential application of
blockchain technology in financial accounting and its possible impacts, established that
blockchain, as a decentralised ledger technology with its characteristics of being
transparent, secure, permanent and immutable, has been applied in many fields such as
cryptocurrency, equity financing, and corporate governance. However, the blockchain
technology is still in the experimental stage and several problems have to be solved,
including limited data processing capacity, information confidentiality, and regulatory
difficulties. They argued that in the short run the public blockchain could be used as a
platform for firms to voluntarily disclose information. In the long run, the application could
effectively reduce errors in disclosure and earnings management, increase the quality of
accounting information and mitigate information asymmetry.
Schmitz and Leoni (2019) posit that blockchain is a distributed ledger technology expected
to have significant impacts on the accounting and auditing profession. Their study,
applicable and timely for both accounting and auditing scholars and practitioners, explored
blockchain technology and its main implications for the accounting and auditing profession.
The research question it addressed was: What are the major themes emerging from
academic research and professional reports and websites debating blockchain technology
in the accounting and auditing context? A literature review of academic literature and
professional reports and websites was performed to identify a taxonomy of emerging
themes. They found that the most discussed themes in scholarly works and professional
sources are governance, transparency and trust issues in the blockchain ecosystem,
blockchain-enabled continuous audits, smart contract applications and the paradigmatic
shift in accountants' and auditors' roles.
According to Smith and Castonguay (2020), blockchain technology has been a disruptive
force in currency, supply chain, and information sharing practices across a variety of
industries. Its usage has only recently expanded into assurance and financial reporting.
Their study explored blockchain's impact in these areas and provided guidance for
organisations and auditors utilising blockchain by addressing financial data integrity issues,
financial reporting risks, and implications for external auditors and firms' corporate
governance practices. Organisations utilising blockchain must adapt their policies and
procedures regarding internal controls and counterparty risk assessment to address
increasing regulation of the distribution of financial data, while their audit committees must
be prepared to address these challenges leading up to financial statement preparation.
External auditors need to assess blockchain implementation as a financial reporting risk and
balance the potentially more reliable and timelier audit evidence obtained from blockchain-
based reporting systems against the related increase in internal control testing.
According to Eaton, Grenier and Layman (2019), at a time when data breaches are common
headlines and companies are making massive investments in cybersecurity risk
management and reporting, the accounting firms are in a unique position to help. An
analysis of recent major data breaches creates an opportunity to learn how companies'
security systems are compromised and demonstrate how public accounting firms can assist
in those areas. Specifically, the role of accountants should be considered in all stages of effective cybersecurity risk management: risk identification and measurement, control system design and testing, external reporting, and independent assurance. It is
important to note that accounting firms in their advisory and/or assurance capacities utilise
multidisciplinary teams comprising traditional accountants who are also trained in
IT/cybersecurity, working alongside IT/cybersecurity specialists who may not have an
accounting background to enhance cybersecurity efforts in the reporting and assurance
stages.
The AICPA has been looking to the future on a broad basis, resulting in the CPA Horizons 2025 report. Relative to technology, the report suggests the importance of technology as follows:
• CPAs must stay current with, embrace and exploit technology for their benefit.
Historically, the audit has performed an attest function to determine the reliability of financial information presented in printed financial statements. This is expanding to
include the following:
– non-financial information not measured in monetary units (e.g., accountants might help determine occupancy rates for hotels or apartment complexes)
– use of information technology to create or summarise information from
databases
The variety of opportunities within accounting was confirmed by the reports of the AICPA, the Institute of Management Accountants (IMA) and the Big Five (at the time) public accounting firms. Practitioners surveyed reported that accounting graduates would need to be able to provide services in the areas of financial analysis, financial planning, financial reporting, strategic consulting and systems consulting.
According to Anwar, Panjaitan and Supriati (2021), several research studies on database
auditing have been conducted. Some of them include theories that assist the auditing
process. Database auditing is a capability for searching the use of database authority and
resources at a high level. It is a crucial part of database security and governance
requirements, and it is critical to conduct a database audit to detect malicious activities,
maintain data quality, and optimise system performance. Database auditing is thus an
option for investigating transactions that occur and is a vital part of database security and
government regulatory compliance. However, business applications lose track of the
company's business operations due to a lack of data audits. The audit trails created by data
operations allow DBAs to perform periodic examinations of access patterns and data updates in
the Database Management System (DBMS). One of the most serious issues in information
security is database auditing. Historical or temporal database data is required to develop
audits to track operations and types of activities through time. A database audit trail is a
generalised recording of "who did what to whom, when, and in what sequence." This
information is to be used to satisfy system integrity, recovery, auditing, and security
requirements of advanced integrated database/data communication systems. In this
process, it is important to know what information must be retained in the audit trail to permit recovery and auditing later, as well as a scheme for organising the contents of the audit trail to provide the required functions at minimum overhead. In this regard, auditing technologies and methodologies are continually changing to catch up with business data processing
methods. For instance, the introduction of computers in business forced the creation of
Electronic Data Processing (EDP) auditing. Databases and distributed computing
substantively changed audit risks and necessitated the utilisation of essential new audit
tools. The advent of the internet, the consequent internetworking of applications, and the
progressive electronisation of many corporate processes have accelerated the trend and
the demand for new, more timely assurance processes (UNISA 2022).
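As a minimal sketch of how such an audit trail could be captured inside the DBMS itself, the following example records "who did what to which record, and when" every time a row in a sensitive table is updated. The table names (GLEntry, GLEntry_Audit), columns and trigger are hypothetical, and the syntax assumes a PostgreSQL-style DBMS; other DBMSs provide equivalent auditing or trigger facilities.

-- Hypothetical audit trail table: who changed which general-ledger entry,
-- what the amount was before and after, and when the change happened.
create table GLEntry_Audit (
    audit_id    bigserial primary key,
    entry_id    integer      not null,   -- key of the GLEntry row that was changed
    action      varchar(10)  not null,   -- only updates are audited in this sketch
    changed_by  varchar(100) not null,   -- database user who made the change
    changed_at  timestamp    not null default current_timestamp,
    old_amount  numeric(15,2),
    new_amount  numeric(15,2)
);

-- Trigger function that writes one audit row for every update to the hypothetical
-- GLEntry table; inserts and deletes could be audited in the same way.
create or replace function audit_glentry() returns trigger as $$
begin
    insert into GLEntry_Audit (entry_id, action, changed_by, old_amount, new_amount)
    values (old.entry_id, tg_op, current_user, old.amount, new.amount);
    return new;
end;
$$ language plpgsql;

create trigger trg_audit_glentry
after update on GLEntry
for each row execute function audit_glentry();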
As technology develops, the use of electronic audit files in audits has increased. These
audit software applications use databases to store and retrieve data. Examples of audit
software that uses databases to store data are CaseWare and CCH TeamMate (UNISA
2022).
Groomer and Murthy (2018) demonstrated an approach to address the unique control and
security concerns in database environments by using audit modules embedded into
application programs. Embedded audit modules (EAM) are sections of code built into
application programs that capture information of audit significance on a continuous basis.
The implementation of EAMs is presented using INGRES, a relational database
management system. An interface which enables the auditor to access audit-related
information stored in the database is also presented.
The Task Force on Assurance Services has identified continuous auditing as a service that
should be offered. Continuous auditing is significantly different from an annual financial
statement audit. A recent research report produced by the CICA defines a continuous audit
as “a methodology that enables independent auditors to provide written assurance on a
subject matter using a series of auditors’ reports issued simultaneously with, or a short
period of time after, the occurrence of events underlying the subject matter”. However,
continuous auditing would present significant technical hurdles. Therefore, as real-time accounting and electronic data interchange become more widespread, computer-assisted audit techniques (CAATs) are becoming even more necessary (UNISA 2022). The demand for timely and forward-looking information hints that
the continuous audit will eventually replace the traditional audit report on year‐end results.
In addition, in the future, the entire concept of audit will change to a loose set of assurance
services, some of which will be statutory in nature. Many management processes
progressively rely on this future infrastructure. Four main issues distinguish assurance
processes from other management support functions: data structures, independent review,
the nature of analytics, and the nature of alarms. The data structures tend to focus on
cross-process metrics and time-series evaluation data. Its analytics focus is on cross-process integrity. E-Schwabe, for example, continuously monitors all trades and filters some
for tighter scrutiny by internal auditors. Its alarms are independently delivered to auditors
(and other parties) and are defined, reviewed, and tested by these assurance professionals.
Vasarhelyi and Halper (2002) propose that continuous assurance (CA) is therefore an aggregate of objectively
provided assurance services, derived from continuous online management information
structures — the objective of which is to improve the accuracy of corporate information
processes. These same services may also provide different forms of attestation including
point-in-time, evergreen, and continuous. The evolving field of continuous assurance centres on monitoring corporate IT systems (legacy, middleware and internet) for a series of real-time administrative processes (such as cash management or receivables management), high-level corporate metrics (key performance indicators) and other processes.
Anwar, Panjaitan and Supriati (2021) suggest that auditing is a process that involves
monitoring and recording activities from a user's database where an audit trail is the output
of the auditing process. Every database action that is audited creates an audit trail of the
information changes performed when auditing is enabled. The audit trail's contents contain
records that detail what occurred to the database. Each DBMS has its own restrictions in
terms of the number of records or event records it can manage. Database auditing,
according to Meg Coffin Murray, may be used to determine who accessed the database,
what actions were performed, and what data was modified. Auditing activities and database
access can aid in identifying and resolving database security concerns. Because auditing
analyses the record of activities, procedures, and behaviour of organisations or individuals,
it plays a critical role in ensuring compliance with the rules. The ability to follow changes in
the data trail, what modification actions were performed, and when they occurred using
historical data is one of the keys to successful auditing. Historical data may be modelled
relationally in databases using a variety of approaches, including distinct tables for historical
records, transaction logs, and multidimensional data.
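A minimal sketch of such a historical query, reusing the hypothetical GLEntry_Audit table from the earlier trigger example and a made-up entry number, reconstructs the trail of changes to a single record in the order in which they occurred:

-- Who changed general-ledger entry 4711, what was changed, and in what sequence?
select changed_at, changed_by, action, old_amount, new_amount
from GLEntry_Audit
where entry_id = 4711
order by changed_at;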
Many auditors also use Microsoft Access or IDEA data analysis software to interrogate
their clients’ operational and financial data. These interrogations may find anomalies,
exceptions and trends in datasets obtained from clients, which then helps them to perform
sound quality risk-based audits.
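As a hedged illustration of the kind of interrogation such tools perform (the table and column names below are hypothetical), a simple SQL query can flag possible duplicate supplier payments for follow-up:

-- Possible duplicate payments: same supplier, same invoice number and same amount
-- captured more than once. Each group returned is an exception for the auditor to
-- investigate, not proof of error or fraud.
select SupplierID, InvoiceNo, Amount, count(*) as TimesCaptured
from SupplierPayments
group by SupplierID, InvoiceNo, Amount
having count(*) > 1
order by TimesCaptured desc;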
Many organisations have multiple databases such as a database for financial information,
operational information, marketing and so forth. Because these databases are not linked, it
becomes difficult and extremely cumbersome to analyse data when it is needed from more
than one database. Some organisations also have massive databases. Running queries on
these huge databases requires a great deal of processing, which may affect operations
owing to the slowness of response times. Data warehouses have been created to overcome
these problems.
Companies over the last couple of decades have done more logging and data capture with
the advent of computers with database capabilities. Many have found that this data could be quite useful, if only the information were available for statistical analyses. Both the
warehouses and the marts store information about clients, demographics, interactions, and
transactions which is not limited to commercial gains but can be applied in any number of
fields (from astronomy to zoology—anything that can be measured and has volumes of
data). For example, a transaction log history would keep information like: "Joe X withdrew
$50 each weekend at 9 am Saturday morning" or "most reliable visual astronomical
observations were before sunrise". Of course, like the last case, the conclusions are fairly
obvious.
The increasing popularity of intranets and the internet itself, has given rise to repositories of
data and engines that can search for correlates for internal uses and "sellable" information
(e.g., "what kind of people watch what kind of television shows during what times of the
day"). The database model types normally used for data warehouses are relational or
multidimensional (UNISA 2022).
As you can imagine, data warehouses are massive because they contain data from various
databases. Running queries on the data warehouse can be painstakingly slow because of
the size of the data warehouse.
According to Fleckenstein and Fellows (2018), data warehouses were traditionally built to
reflect “snapshots” of the enterprise-level operational environment over time. In other
words, a certain amount of operational data was recorded at a particular point in time and
stored in a data warehouse. Originally, such snapshots were typically taken monthly.
Today, they are often taken multiple times per day. Data warehouses provide a history of
the operational environment suitable for trend analysis. This allows analysts and business
executives to plan based on recent trends. Answers to questions such as how revenue or
costs have evolved over time and the ability to “slice and dice” such data are typical
functions asked of a data warehouse.
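A minimal sketch of such a trend query follows, assuming a hypothetical RevenueSnapshot table in the warehouse that stores one aggregated row per month (with SnapshotMonth held as a 'YYYY-MM' value); real warehouses would typically answer this through their dimensional model or a BI tool.

-- Month-on-month revenue and margin trend from warehouse snapshots (illustrative only).
select SnapshotMonth,
       sum(Revenue)              as TotalRevenue,
       sum(Cost)                 as TotalCost,
       sum(Revenue) - sum(Cost)  as GrossMargin
from RevenueSnapshot
where SnapshotMonth between '2022-01' and '2023-12'
group by SnapshotMonth
order by SnapshotMonth;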
A data mart is a smaller data warehouse extracted from the main data
warehouse and contains specific related data extracted for a specific
organisational user group such as the finance department or the marketing
department (UNISA 2022).
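The sketch below shows one way a finance data mart might be carved out of the warehouse. The table names are hypothetical and the syntax assumes a DBMS that supports CREATE TABLE ... AS SELECT; the point is simply that the mart is a smaller, pre-joined and pre-aggregated subset that finance users can query quickly.

-- Build a finance data mart containing only the finance-related subset of the warehouse.
create table FinanceMart as
select t.Year, t.Month, a.AccountName, l.Country,
       sum(f.Amount) as TotalAmount
from FinanceFact f
     join TimeDim     t on f.TimeID     = t.TimeID
     join AccountDim  a on f.AccountNo  = a.AccountNo
     join LocationDim l on f.LocationID = l.LocationID
group by t.Year, t.Month, a.AccountName, l.Country;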
The use of data marts makes running queries much quicker than running queries on the
full data warehouse. In response to the unnatural homogeneity and sheer data collection
problems of data warehouses, data marts tried to cut down the database by focusing on
topics or specific subjects. Focusing on more specific topics helped structure the data in a
more intuitive way and made the information more accessible. The collections would still be
gathered from other sources, including warehouses and other data marts. Lastly, marts
were easier to compartmentalise so that off-the-shelf solutions could be sold. The classical
transaction database is not able to do analytical processing, because
• transactional databases contain only raw data, and thus, the processing speed will
be considerably slower
• transactional databases are not designed for queries, reports and analyses.
• transactional databases are inconsistent in the way that they represent information
(UNISA 2022).
Hurst, Liu, Maxson, Permar, Boulware and Goldstein (2021) state that the development of an electronic health records (EHR) datamart to support clinical and population health research is necessary and that EHR systems represent an important research data source. This type
of data is highly complex and can be difficult to access. Typically, EHR data are stored in
an enterprise data warehouse (EDW) along with a number of other data sources such as
billing and claims data, laboratory tracking systems, and scheduling data that underlie
health system operations. These data warehouses require significant expertise and time to
navigate, and access is typically restricted to a small number of individuals to manage
privacy and legal concerns associated with access to large amounts of protected health
information (PHI). One way to make EHR data more accessible and actionable for research
purposes is to organise it into smaller relational databases, referred to as datamarts. These
datamarts are typically organised under Common Data Models (CDMs). CDMs, such as
those used by the National Patient-Centered Clinical Research Network (PCORnet) and/or
the Observational Medical Outcomes Partnership (OMOP), comprise a set of rules for how
to turn raw EHR data into simpler data models. These efforts have stimulated a significant
number of retrospective analyses and innovative multicentre clinical trials.
Data mining often produces impressive quantifiable benefits across a broad range of
industries in a wide variety of applications. Data mining yields firm numbers that can make
the case not only for data mining, but for your whole data warehouse effort. For example, a
large wireless company dramatically increased their profitability using data mining. Faced
with a high churn rate (percentage of customers leaving), 40 per cent of the customer base
still using analogue as opposed to digital services, and a low monthly minutes usage that
resulted in an average revenue per user of less than $50, they turned to data mining. If they
could keep and upgrade more customers, the potential payback was significant. They might
otherwise lose 700,000 customers per month, at an annual replacement cost of $360
million! The data consisted of hundreds of fields, with approximately one third coming from
call detail records. Using SPSS's Clementine to mine the data on a Teradata platform, they
built a series of models that scored customers on their likelihood to leave and succeeded in
finding sets of rules that would predict customer behaviour. They confirmed the wisdom of
delivering the right offer at the right time, which meant talking directly to customers as well
as sending customised direct mail. To succeed, they needed several coordinated teams to
work together (UNISA 2022).
Lee (2017) states that irrespective of the communities, there is one point of consensus
about data mining. Data mining is a field of study about discovering useful summary
information from data. As innocent as it may look, there are important questions behind
discovering useful summary information. The following is an approach that can be followed
as a starting point.
These analyses can be used in decision making (including strategic decisions), forecasts,
predictive modelling, fraud detection, risk management and so on. The data sets used in
data mining are usually a data warehouse or a data mart, but data mining may also be
performed on source databases. For example, an insurance company using data mining
on its motorcar claims could uncover the fact that red cars with drivers younger than 25
years are more likely to be involved in an accident. The company could use this information
to correctly price insurance premiums for red motorcar drivers 25 years and younger.
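The finding in the insurance example could be surfaced by an exploratory query of the kind sketched below. The table and column names are hypothetical, and real data mining would apply statistical or machine-learning models to far more variables; the query simply illustrates the type of pattern being sought.

-- Claim frequency by car colour and driver age band (illustrative only).
select c.CarColour,
       case when c.DriverAge < 25 then 'Under 25' else '25 and over' end as AgeBand,
       count(cl.ClaimID)                              as NumberOfClaims,
       count(distinct c.PolicyID)                     as NumberOfPolicies,
       count(cl.ClaimID) * 1.0 / count(distinct c.PolicyID) as ClaimsPerPolicy
from MotorPolicies c
     left join MotorClaims cl on cl.PolicyID = c.PolicyID
group by c.CarColour,
         case when c.DriverAge < 25 then 'Under 25' else '25 and over' end
order by ClaimsPerPolicy desc;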
Examples of data mining software include RapidMiner, SAS Enterprise Miner, IBM SPSS Modeler and Orange.
3.8 Online analytical processing (OLAP)
As an example, consider the following question: How many pairs of red shoes were sold per
month to persons aged 20 to 30 years? The multiple dimensions used in the query were
product type (shoes), product colour (red), time period (month) and age (20–30 years).
Because these data sets are usually multidimensional, they are stored in a multidimensional
database, although some OLAP software is also compatible with relational databases.
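Expressed against a relational star schema, the red-shoes question could be answered with a query like the following sketch. The fact and dimension tables (SalesFact, ProductDim, TimeDim, CustomerDim) and their columns are hypothetical, chosen to match the dimensions named in the question.

-- How many pairs of red shoes were sold per month to customers aged 20 to 30?
select t.Year, t.Month, sum(s.Quantity) as PairsSold
from SalesFact s
     join ProductDim  p on s.ProductNo  = p.ProductNo
     join TimeDim     t on s.TimeID     = t.TimeID
     join CustomerDim c on s.CustomerID = c.CustomerID
where p.ProductType   = 'Shoes'
  and p.ProductColour = 'Red'
  and c.Age between 20 and 30
group by t.Year, t.Month
order by t.Year, t.Month;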
OLAP is used in business intelligence, budgeting, forecasting, management reporting
and so on. IBM Cognos Business Intelligence and Oracle Database OLAP Option are
examples of OLAP software.
According to Taniar and Rahayu (2022), once a data warehouse is created, the next step is to use it. Using a data warehouse means extracting data from it for further data analysis. The query to extract data from the data warehouse is an Online Analytical Processing (OLAP) tool. OLAP is implemented using SQL. Because it uses SQL commands to retrieve data from the data warehouse, the result is a table and the data is in a relational table format. In short, a data warehouse is a collection of tables; OLAP queries the tables and the results are also in a table format. The following is an example of an OLAP query to retrieve total sales for dresses and belts in 2022 and 2023 in South Africa. This OLAP query uses GROUP BY CUBE, which returns not only the total sales specified in the WHERE clause but also the respective subtotals and the grand total:
select
    T.Year, P.ProductName,
    sum(S.Total_Sales) as TotalSales
from
    SalesFact S,
    TimeDim T,
    ProductDim P,
    LocationDim L
where S.TimeID = T.TimeID
    and S.ProductNo = P.ProductNo
    and S.LocationID = L.LocationID
    and T.Year in (2022, 2023)
    and P.ProductName in ('Dresses', 'Belts')
    and L.Country = 'South_Africa'
group by cube (T.Year, P.ProductName);
The results of the above query are shown below as the OLAP raw results:

Year    ProductName    TotalSales
2022    Dresses        R2 252 600
2022    Belts          R  675 760
2022                   R2 928 360
2023    Dresses        R1 684 548
2023    Belts          R1 357 179
2023                   R3 041 727
        Dresses        R3 937 148
        Belts          R2 032 939
                       R5 970 087

Table 3.1: OLAP raw results (adapted from Taniar & Rahayu 2022).
The blank cells indicate a sub-total. For example, the row containing 2022 with an empty product name shows the total sales for 2022 (the sub-total for 2022), whereas the row with an empty year followed by Dresses shows the total sales for dresses (for the years 2022 and 2023). The last row in the results is the grand total: total sales of dresses and belts in both years. Note that the results contain only the data that satisfies the SQL query, and there is no fancy formatting. The formatting itself is not part of OLAP. Using SQL commands, OLAP retrieves raw data which can later be transformed using any Business Intelligence (BI) tool. So the focus is on the data, as the retrieved data is the most important part; the BI tool is for further presentation and visualisation. The data retrieved by the SQL command, as shown in Table 3.1, can later be formatted in a number of ways depending on the use of the data in the business, as well as the features available in the BI tools. For example, the data can be shown in a matrix-like format, such as Table 3.2:
              2022          2023          Total
Dresses       R2 252 600    R1 684 548    R3 937 148
Belts         R  675 760    R1 357 179    R2 032 939
Total         R2 928 360    R3 041 727    R5 970 087
Table 3.2: Results in matrix format (adapted from Taniar & Rahayu 2022).
In this matrix format the respective sub-totals are shown more clearly. The data can also be shown in various graphs. The presentation and visualisation are not the focus of OLAP. OLAP only retrieves the required data, that is, the raw data. BI tools which receive the data can present it in any form: reports, graphs, dashboards, etc. Some BI tools have complex features, whereas others may be simple but adequate for the business. For example, Microsoft Excel is often deemed adequate for presenting some basic graphs, and R also has some visualisation features.
Figure 1.4: Table visualisation of the data
OLAP is the foundation of data analytics, as it is able to retrieve the required data from the data warehouse, which is then passed on for further analysis. The data retrieved by OLAP is raw data. This raw data needs to be presented in a suitable and attractive format for management to be able to understand various aspects of the organisation. This then becomes Business Intelligence (BI), which takes the raw data from OLAP and creates various reports, graphs and other data tools for presentation (Taniar & Rahayu 2022).
3.14 Summary
In this study unit, we briefly examined how databases are used in accounting and auditing.
We also discussed data warehouses, data marts, data mining and OLAP.
The next topic deals with how to develop and create spreadsheets to solve problems in a
business and accounting context using appropriate formats, formulas and functions. We
will also gain an understanding of the risks and controls associated with spreadsheets.
Activity 3.1
For a company that has just been established, e.g., Clothing Store, which
form of data utilisation might add more value to the company – data mining or
OLAP? Share your view on the Discussion Forum with reasons or examples
to support your answer.
Go to Discussion Forum 3.1 and discuss your findings with your fellow students.
References
Anwar, M.R., Panjaitan, R. & Supriati, R. (2021). Implementation Of Database Auditing By
Synchronization DBMS. Int. J. Cyber IT Serv. Manag, 1(2), pp. 197-205.
Eaton, T.V., Grenier, J.H. & Layman, D. (2019). Accounting and cybersecurity risk
management. Current Issues in Auditing, 13(2), pp. C1-C9.
Groomer, S.M. & Murthy, U.S. (2018). Continuous Auditing of Database Applications: An Embedded Audit Module Approach. In Continuous Auditing. Emerald Publishing Limited.
Hurst, J.H., Liu, Y., Maxson, P.J., Permar, S.R., Boulware, L.E. & Goldstein, B.A. (2021).
Development of an electronic health records datamart to support clinical and population
health research. Journal of Clinical and Translational Science, 5(1).
Dean, T., Lee-Post, A. & Hapke, H. (2017). Universal Design for Learning in Teaching
Large Lecture Classes. Journal of Marketing Education, 39(1), 5–16. https://0-doi-
org.oasis.unisa.ac.za/10.1177/0273475316662104
Saeidi, H., Prasad, G.V.B. & Saremi, H. (2014). The Role of Accountants in Relation to
Accounting Information Systems and Difference between Users of AIS and Users of
Accounting. Vol 4 [11] October 2015: 115-123.
Schmitz, J. & Leoni, G. (2019). Accounting and auditing at the time of blockchain
technology: a research agenda. Australian Accounting Review, 29(2), pp.331-342.
Smith, S.S. & Castonguay, J.J. (2020). Blockchain and accounting governance: Emerging
issues and considerations for accounting and assurance professionals. Journal of Emerging
Technologies in Accounting, 17(1), pp.119-131.
Tan, B.S. & Low, K.Y. (2019). Blockchain as the database engine in the accounting
system. Australian Accounting Review, 29(2), pp.312-318.
Taniar, D. & Rahayu, W. (2022). Data Warehousing and Analytics: Fueling the Data Engine.
Springer Nature.
Vasarhelyi, M.A. & Halper, F.B. (2002). Concepts in continuous assurance. Researching
accounting as an information systems discipline, pp.257-271.
Yeo, D. (2021). Is Data Mining Merely Hype? In Data Management (pp. 777-782). Auerbach
Publications