
BP Reader 2 INF3012S 2024 v1

Purpose of Reader ................................................................................................................ 1


BP Measures (Reporting) and Monitoring ............................................................................. 2
Internal and External Measure (Harmon 2014 & 2019) ...................................................... 2
Leading and Lagging Indicators (Harmon 2014 & 2019) .................................................... 3
Balanced Scorecard (BSC)................................................................................................ 4
Problems with the BSC ...................................................................................................... 4
References ........................................................................................................................ 5
BP Risks, Controls and Compliance...................................................................................... 6
Risk Defined (Parente, 2018) ............................................................................................ 6
Risk Management ............................................................................................................. 6
Risk Identification .............................................................................................................. 6
Risk Analysis ..................................................................................................................... 7
Risk Responses and Risk Attitudes ................................................................................... 8
The Business Risk Model (Price & Smith, 2000) ................................................................ 9
Achieving Compliance, Process and Controls Management (Sadiq et al. 2007) .............. 10
Business Compliance (Schumm et al., 2010) .................................................................. 12
References ...................................................................................................................... 13
BP Technology.................................................................................................................... 14
BPMS (van der Aalst, 2013) ............................................................................................ 14
BPMS (Harmon, 2015) .................................................................................................... 15
Business Process Software Tools (Harmon, 2016) .......................................................... 16
BPMS (Harmon, 2018) .................................................................................................... 17
Business Rules Management Systems (Vashisth et al., 2019) ........................................ 17
Market Guide for Intelligent BPMS (Srivastava et al., 2020) ............................................ 18
Context-Aware Business Process Management (BPM) (vom Brocke et al., 2021) ........... 19
Process Mining (Robledo, 2018)...................................................................................... 21
Process Discovery Review (Augusto, 2018) .................................................................... 22
Scalable process discovery and conformance checking (Leemens et al., 2018) .............. 23
How companies can prepare for the coming “AI-first” world (Davenport & Mittal, 2023) ... 24
Process Mining meets Artificial Intelligence and Machine Learning (Veit et al., 2017) ..... 24
References ...................................................................................................................... 25

Purpose of Reader
These excerpts from academic articles are updated annually and have been compiled for this course so that students have access to recent research and multiple perspectives on the topics covered. This also saves cost, as no textbook is prescribed. For copyright reasons, please do not distribute this reader.

1|Page ©2024 UCT, All Rights Reserved


BP Measures (Reporting) and Monitoring

Internal and External Measure (Harmon 2014 & 2019)


External measures (measures from outside) tell you about the results achieved by a process
or value chain. Internal data (measures from inside) tell you about how the process is
working, but they don’t tell you if the process is satisfying its stakeholders—be they
customers or shareholders. Ultimately, we judge the success or failure of a process by
external results. In the case of a value chain, those results may be from entities external to
the entire organization, as customers are (see Figure 5.2).

If we are focused on the organization, then the customer is outside the organization.
We can apply this same concept inside an organization, if we simply regard any
process that receives another process’s outputs as its customer. Thus, in Figure 5.3, we see
that processes can be both the supplier of one process and the customer of another.
In this case Process D has two external customers, Processes E and F. Before the manager of Process D considers examining whatever internal measures are used to evaluate Process D, he or she should be sure that Process D's outputs are satisfying its customers, Process E and Process F. The logic here is the same as it is at the enterprise level. It doesn't make any sense to decrease the cost or to increase the productivity of Process D if, as a result, the process is no longer able to deliver the products or services it provides to Processes E and F. Once the external measures are defined and it's clear that Process D can consistently meet its external commitments, then, while keeping its external measures constant, the process manager should focus on improving internal measures.

Examples of external measures we might want to examine include:
• Income measures
• Measures of customer satisfaction
• Market growth measures
• Stockholder satisfaction or other external measures of the stock market's confidence in what the company is doing
Examples of internal measures include:
• Efficiency and effectiveness of specific functions or subprocesses
• Costs of producing the product or service
• Quality of internal outputs
It’s usually easier to define or measure internal metrics than to measure external
results. Moreover, most functional units tend to focus on internal measures. In fact, as we
will see in a moment, one often focuses on internal measures because they are leading
indicators and provide managers with valuable information. Ultimately, however, to



effectively evaluate the performance of an organization, you must focus on the
external measures. Once you “lock down” the external measures, then you can begin
to focus on improving your internal measures, confident that any efficiency you achieve
will result in a real benefit to the organization. If you fail to lock down the external measures
first, however, you run the risk that you will improve internal efficiency or reduce production
costs at the expense of customer satisfaction, market growth, or the organization’s share
price.

Leading and Lagging Indicators (Harmon 2014 & 2019)


Another way to think about metrics and measures is to focus on whether they measure
something that can suggest action, or whether they simply report on a situation that one
can do nothing about. This focus is on using performance measures to help managers
make decisions. Leading indicators are measures that report on situations that are causally
related to outcomes that you desire. Lagging indicators describe situations that can’t
be changed.
Imagine you are a sales manager for Widgets, Inc. The executive board adopts a strategy
that calls for the expansion of Widget’s presence in the market. This is translated into a
specific goal: the company will increase its sales by 15% each quarter of the year. You can
wait till the end of the quarter and then determine how many Widgets you sold. That
measure, however, is a lagging indicator. Once the quarter is over, you won’t be able to do
anything about the number of sales you made during the quarter. You’ll know if you achieved
your goal or not, but you won’t be in any position to change the results. Now let us assume
you have been tracking your Widget sales for some time and know that about 10% of your
leads normally result in qualified prospects, and that your salespeople can typically arrange
calls with half of the qualified prospects. You also know that your salespeople sell Widgets to
20% of the customers they call on.

Figure 5.4 illustrates the Widget sales cycle we just described. If you know that your
salespeople are scheduled to make 100 sales calls this quarter, you can predict that you will
be making about 20 sales. Thus, sales calls scheduled is a leading indicator of successful
sales. It comes rather late in the sales cycle, however, and may not give you much time to
make corrections. The best leading indicator, in this case, would be to track leads. A quick
calculation shows that you get one sale for each 100 leads. Or, to look at it a little differently,
to increase your sales by 15 in a quarter, you will need to get 1500 more leads. If you track
leads per month, you will know at the end of the first month in the quarter if you are on track.
If you aren’t, you will need to sharply increase the effectiveness of your lead-generation
process in the second month or you will be unlikely to meet your goal.
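The funnel arithmetic above can be sketched in a few lines of Python. The conversion rates are the hypothetical Widget figures from the text; everything else is illustrative:

```python
# Hypothetical Widget sales-funnel conversion rates from the text.
LEAD_TO_PROSPECT = 0.10   # 10% of leads become qualified prospects
PROSPECT_TO_CALL = 0.50   # calls are arranged with half of qualified prospects
CALL_TO_SALE = 0.20       # 20% of sales calls end in a sale

def predicted_sales_from_calls(calls: int) -> float:
    """Sales calls scheduled: a leading indicator, but a late one."""
    return calls * CALL_TO_SALE

def predicted_sales_from_leads(leads: int) -> float:
    """Leads: an earlier leading indicator (about one sale per 100 leads)."""
    return leads * LEAD_TO_PROSPECT * PROSPECT_TO_CALL * CALL_TO_SALE

def leads_needed(extra_sales: int) -> int:
    """Leads required to add a given number of sales."""
    per_lead = LEAD_TO_PROSPECT * PROSPECT_TO_CALL * CALL_TO_SALE
    return round(extra_sales / per_lead)

print(predicted_sales_from_calls(100))  # 20.0
print(leads_needed(15))                 # 1500
```

Tracking `leads` monthly gives the manager time to correct course; tracking only quarterly sales (the lagging indicator) does not.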

As a generalization, whenever possible it is good to monitor leading indicators that provide


managers with the ability to take corrective action. Ultimately, of course, you are also going



to want to know exactly how many sales you made in the quarter, so you will end up
measuring both leads and sales, but the leading indicator will be more useful to the process
manager who wants to use the measure to help achieve his or her goals.

Balanced Scorecard (BSC)


The Balanced Scorecard approach, developed by Robert S. Kaplan and David P. Norton,
insists that management track four different types of measures: financial measures,
customer measures, internal business (process) measures, and innovation and
learning measures. Using the Balanced Scorecard approach, an organization identifies
corporate objectives within each of the four categories, and then aligns the management
hierarchy by assigning each manager his or her own scorecard with more specific objectives
in each of the four categories. Properly used, the system focuses every manager on a
balanced set of performance measures.
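As a rough sketch of that cascading structure (class and objective names are illustrative, not from Kaplan and Norton), a scorecard can be modelled as objectives grouped under the four perspectives, with each manager owning one:

```python
# Minimal sketch: a scorecard groups objectives under the four BSC perspectives.
from dataclasses import dataclass, field

PERSPECTIVES = ("financial", "customer", "internal business",
                "innovation and learning")

@dataclass
class Scorecard:
    owner: str
    # Each perspective starts with an empty objective list.
    objectives: dict = field(
        default_factory=lambda: {p: [] for p in PERSPECTIVES})

    def add(self, perspective: str, objective: str) -> None:
        if perspective not in PERSPECTIVES:
            raise ValueError(f"unknown perspective: {perspective}")
        self.objectives[perspective].append(objective)

# Corporate objectives cascade into more specific managerial scorecards.
corporate = Scorecard("corporate")
corporate.add("financial", "improve quarterly cash flow")
corporate.add("customer", "become preferred supplier to key accounts")
```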

Another way of looking at the measures is as follows:
• Financial Measures: How Do We Look to Shareholders?
• Customer Measures: How Do Customers See Us?
• Internal Business Measures: What Must We Excel At?
• Innovation and Learning Measures: Can We Continue to Improve and Create Value?

Figure 5.5 illustrates a scorecard of a hypothetical company, Electronic Circuits Inc. (ECI), discussed in Kaplan and Norton's Jan/Feb 1992 article. (Note that, as we use the terms measure and objective, the phrases Kaplan and Norton show on this figure are really just goal statements.)
Problems with the BSC


Some of the common problems observed with the BSC are listed below.
1. Provides a framework for performance management, but no practical methodology for the system required to implement it in practice
2. Oriented toward historic and current events; it does not provide the ability to focus on predicted future performance
3. Lack of incorporation of feedback loops between measures
4. Measures are equally weighted, which is not a realistic reflection of business priorities
5. The hierarchical model (strategy maps) reinforces the thinking that financial measures are the most important
6. Used to support and entrench functional specialization:
• Marketing gets the customer perspective
• Accounting gets the financial perspective
• Operations gets the process perspective
7. Overlooks the significance of value chains, which are critical to business processes

References
Harmon, P. (2014). Business process change: a business process management guide for managers
and process professionals.
Harmon, P. (2019). Business process change: a business process management guide for managers
and process professionals. Morgan Kaufmann.



BP Risks, Controls and Compliance

Risk Defined (Parente, 2018)


A risk is an uncertain event or condition which, if it occurs, has a positive or negative effect on at least one objective. A risk is characterized by the properties of probability and impact. Probability is the likelihood of a risk occurring; it is the possibility of a project objective not being met using the current project plan. Impact is the consequence of a risk occurring. Risk exposure is calculated by multiplying a risk's probability of occurring by its impact. It is important to understand the distinction between a risk and a problem/issue: a risk is an event that may occur in the future, whereas a problem or issue is something which has already occurred.
The Risk Management Process. The risk management process includes the following: identification, assessment, response planning, execution, monitoring, documentation and communication. The focus of risk identification is the discovery of potential risks to the project, distinguishing any uncertain event which may positively or negatively affect the attainment of project objectives.
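The exposure calculation (probability × impact) can be sketched as follows; the risks and their figures are made up for illustration:

```python
# Risk exposure = probability of occurrence × impact (here, impact in dollars).
def risk_exposure(probability: float, impact: float) -> float:
    return probability * impact

# Illustrative register: name -> (probability, monetary impact).
risks = {"server outage": (0.3, 100_000), "vendor delay": (0.6, 20_000)}

# Rank risks by exposure, highest first.
for name, (p, i) in sorted(risks.items(),
                           key=lambda kv: risk_exposure(*kv[1]),
                           reverse=True):
    print(f"{name}: exposure {risk_exposure(p, i):,.0f}")
```

Note that a lower-probability risk ("server outage") can still carry the larger exposure because of its impact.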

Risk Management
According to Sadgrove (2016) there have been three ages of risk management. While the first age focused only on hazards, businesses now focus on hazards as well as opportunities, and risk is now controlled rather than simply insured. In this course we focus on hazards, or negative risk, and its controls. Risk management is becoming more important because legislation is getting tougher and insurance is becoming more expensive and harder to obtain.

Figure X Three ages of risk management (Sadgrove, 2016)


In terms of the relationship between Process and Risk Management (Rosemann & Zur
Muehlen, 2005):
• risk is an inherent property of every business process and techniques are needed to
identify, represent and analyse business process risks.
• The absence of such techniques is a concern because both operational risk mitigation
and legal compliance depend on the sufficient identification of corporate risk.

Risk Identification
Risk identification involves clearly identifying various threats (risks) and opportunities. The
causes and effects of each risk must be understood so effective responses can be made.
Risk identification is an important step in risk assessment, to determine what could cause a
potential loss, and to gain insight into how and why the loss might happen (Wei et al., 2016).
A threat is the potential to harm assets. The combination of a threat and a vulnerability is called a threat–vulnerability pair. An organisation should identify its assets, threats, vulnerabilities, and risks (Stallings & Brown, 2015). A risk identification process is outlined in Figure 11.
Figure 11 - Risk Identification Process (ISACA, 2015, p.22): Identify Assets → Identify Threats → Identify Existing Controls → Identify Vulnerabilities → Identify Consequences.
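The identification steps of Figure 11 can be sketched as a simple data structure; the asset, threats, vulnerabilities, and controls below are made-up examples:

```python
# Illustrative asset register: each asset lists its identified threats,
# vulnerabilities, and existing controls (all example data).
assets = {
    "customer database": {
        "threats": ["SQL injection", "insider theft"],
        "vulnerabilities": ["unpatched DBMS", "broad access rights"],
        "existing_controls": ["nightly backups"],
    },
}

def threat_vulnerability_pairs(asset_info: dict) -> list:
    """Enumerate threat–vulnerability pairs for one asset."""
    return [(t, v)
            for t in asset_info["threats"]
            for v in asset_info["vulnerabilities"]]

for asset, info in assets.items():
    for pair in threat_vulnerability_pairs(info):
        print(asset, "->", pair)
```

Each printed pair is a candidate risk whose consequences would then be assessed against the existing controls.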

Figure 12: The Gartner Business Risk Model (Proctor & Smith, 2018).

Figure 12 is a list of leading indicators of business risks. Executive management teams struggle to make effective use of risk management because they fail to understand the relationship between business processes and risks. The leading risk indicator (LRI) catalogue intends to provide guidance for executives and risk managers to build organisation-specific lists that factor in risks tied to performance (Proctor & Smith, 2018).
The business aspects include:
• Demand Management. All the actionable activities involved with generating demand
for the products and services offered by the organization.
• Supply Management. All the actionable activities directly involved with supplying the
products and services offered by the organization.
• Support Services. All other actionable activities involved with supporting the
organization. These services operate within organizations by providing services to
internal clients. They operate on business principles and provide internal services at
a cost and quality that are acceptable to their clients when assessed against
alternatives.

Risk Analysis
Risk analysis answers two basic questions: “What is the likelihood of a particular risk occurring?” and “What is the impact if a particular risk occurs?” Risks can be analysed qualitatively or quantitatively. Qualitative risk analysis is the process of prioritizing risks for subsequent further analysis or action by assessing and combining their probability (likelihood) and impact, as shown in Figure 13. Risk is viewed not just in terms of financial impact and probability, but also subjective criteria such as health and safety impact, reputational impact, vulnerability, and speed of onset (Curtis and Carey, 2012). For example, an impact scale of 5 (Extreme) would be defined as:
• Financial loss of $X million or more
• International long-term negative media coverage; game-changing loss of market share
• Significant prosecution and fines, litigation including class actions, incarceration of
leadership
• Significant injuries or fatalities to employees or third parties, such as customers or vendors
• Multiple senior leaders leave

Figure 13: Risk Analysis - Illustrative Impact Scale (Curtis and Carey, 2012).

A likelihood scale of 5 (Frequent) would be defined as up to once in 2 years, almost certain, 90% or higher chance of occurrence over the life of the asset. After plotting on the heat map, risks
are then ranked from highest to lowest in terms of risk level. These rankings may then be
adjusted based on other considerations such as vulnerability, speed of onset, or detailed
knowledge of the nature of the impact (Curtis and Carey, 2012).
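The qualitative ranking step can be sketched as follows; the 1–5 scales follow the text, but the risks and scores are illustrative:

```python
# Qualitative risk analysis sketch: risk level = likelihood × impact,
# both scored on a 1–5 scale, then ranked highest to lowest.
risks = [
    # (name, likelihood 1–5, impact 1–5) — example scores only
    ("data breach", 2, 5),
    ("late supplier", 4, 2),
    ("key staff leave", 3, 3),
]

ranked = sorted(risks, key=lambda r: r[1] * r[2], reverse=True)
for name, likelihood, impact in ranked:
    print(f"{name}: level {likelihood * impact}")
```

These rankings would then be adjusted for vulnerability, speed of onset, or detailed knowledge of the impact, as the text notes.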

Risk Responses and Risk Attitudes


A rash attitude to risk can lead to ultimate disaster while excessive caution can lead to
missed opportunities. Some companies are happy to accept new ventures and risky
acquisitions. Young companies often take big risks while mature businesses want to protect
their gains. Risk appetite is often a reflection of the CEO’s outlook (Sadgrove, 2016).
After identifying and quantifying risks, management must decide how to respond. The main
strategies are (Marchewka, 2010):
1. Risk avoidance: eliminating a specific threat or risk, usually by eliminating its causes or taking a different course of action (change plan/scope). The risk is avoided completely through the withdrawal of a planned or existing activity when the identified risks and associated costs are considered too high.
2. Risk acceptance: accepting the consequences should a risk occur, with contingency strategies planned (contingency). Allow (assume) the burden of loss or benefit to remain without (further) mitigation.
3. Risk transference: shifting the consequence of a risk and responsibility for its
management to a third party (insurance, fixed price contract). Decision to share
certain risks with external parties that can effectively manage the particular risk.
4. Risk mitigation: reducing the impact of a risk event by reducing the probability of its
occurrence (build in redundancy). An action is taken to reduce the impact, negative
consequences, or both, of a risk through the use of controls.
5. Exploit: maximize probability and/or impact of opportunity.
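One illustrative way to connect these strategies to the probability/impact assessment is a simple decision rule. The thresholds below are assumptions for the sketch, not from the source; real choices reflect an organization's risk appetite, as Sadgrove notes:

```python
# Illustrative decision rule only: map probability and impact (0–1 scales)
# to a default negative-risk response. Thresholds are assumed, not canonical.
def default_response(probability: float, impact: float) -> str:
    if probability > 0.7 and impact > 0.7:
        return "avoid"      # withdraw from the activity entirely
    if impact > 0.7:
        return "transfer"   # e.g. insurance, fixed-price contract
    if probability > 0.7:
        return "mitigate"   # reduce probability, e.g. build in redundancy
    return "accept"         # plan a contingency and carry the risk

print(default_response(0.3, 0.9))  # transfer
```

A risk-seeking CEO might push every threshold upward; a cautious one, downward.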

The Business Risk Model (Price & Smith, 2000)


Organizations consciously identify the business processes that help them fulfill their
objectives. Organizations divide their business processes into two categories: core business
processes and internal service processes. Core business processes are those that an entity
uses to develop, produce, sell, and distribute its products and services. Internal service
processes provide appropriate resources to the other business processes. One of the core
business processes of libraries and research institutions is the acquisition and management
of research collection items.
A core process must have proper management controls to reduce those risks that threaten
the institution's ability to meet its objectives. Two risks that threaten these objectives are not
acquiring the right materials and not properly maintaining those materials. For each business
process that is critical to the execution of business strategies, management controls should
provide assurance that the best people are selected to own processes and control process
risks.
Management must establish clear objectives against which the process owners can measure
their performance. Process owners are encouraged to assess their business risks
continuously and to build cost-effective controls into the process to ensure that business risks
are held to an acceptable level. Finally, process owners are held accountable for process
performance, process risks, and the quality of the process controls. Therefore, monitoring
business risks and controls is often an additional process-owner responsibility.
In Step 1, the process owner defines the process control objectives. An organization's
control objectives can be related to its operations, its financial reporting, or its compliance with
laws and regulations. The control objectives that are relevant in this report are the operation
objectives. Operation objectives relate to achievement of the organization's mission—the
fundamental reason for its existence. A clear set of operational objectives and strategies
provides the focal point toward which the organization will commit substantial resources.
In Step 2, the process owner assesses business risk at the process level. After an
organization has defined the objectives its process-level controls should achieve, the process
owner must determine what controls are needed to achieve those objectives. This
determination is based largely on anticipated business risk. Business risk is determined by
understanding the internal and external factors that may affect the achievement of the process
objectives. For example, if one of the objectives of a research institution's operational process
is to negotiate acceptable prices for collection items, external factors such as inflation, supply
and demand for the product, and competitors' actions may affect the degree of risk in achieving
the objective. The mechanisms an institution builds into its procurement process to alert it to
these events and enable it to respond favorably to them are examples of internal factors that
affect business risk.
In determining the magnitude of business risk, management must estimate both the
significance of the risk and the likelihood of its occurrence. For example, a potential risk that
would not have a significant effect on the operations of the process and that has a low



likelihood of occurrence generally does not warrant considerable attention. Management
should recognize that some degree of risk will always exist, because resources are always
limited and all internal control systems possess inherent limitations.
In Step 3, the process owner designs and implements appropriate and effective
controls for the process on the basis of the risk-assessment results in Step 2. Controls
usually involve two elements: a policy to establish what should be done and procedures to
carry out the policy. Controls serve as mechanisms for reducing business risk.
Because every organization has its own objectives and strategies, there will be differences in
process controls among organizations. Even when organizations have similar objectives,
process controls are likely to differ, because each organization has its own managerial style
and culture. These differences influence the degree and type of business risks that similar
institutions may face. The process owner should consider these differences when designing
and implementing controls.
In Step 4, the process owner measures the performance of his or her processes. Each
process owner should design quantifiable measures that can be used to assess whether the
process is operating effectively. These measures, which are commonly referred to as key
performance indicators, detect weaknesses in controls and changes in external conditions that
are not reduced by process controls. The process owner should investigate unexpected
results or unusual trends that may indicate that the organization's objectives are not being
achieved. In the procurement process example, where the objective was to negotiate
acceptable prices for collection acquisitions, the process owner might establish acceptable
ranges of prices for certain types of collections on the basis of the average of prices for those
items over a period of time. The process owner would be alerted to a possible control failure
if the price of an item fell outside these ranges.
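The procurement KPI described above can be sketched as follows. The 20% band width and the price history are assumptions for illustration; the text only says ranges are based on average prices over time:

```python
# Sketch of the Step 4 KPI: flag a possible control failure when an item's
# price falls outside an acceptable band around the historical average.
def acceptable_range(history: list, band: float = 0.20) -> tuple:
    """Return (low, high) bounds as average ± band (band is an assumption)."""
    avg = sum(history) / len(history)
    return avg * (1 - band), avg * (1 + band)

def flag_outlier(history: list, new_price: float) -> bool:
    """True means: investigate — the objective may not be achieved."""
    lo, hi = acceptable_range(history)
    return not (lo <= new_price <= hi)

history = [100, 95, 105, 110, 90]   # past prices for one collection type
print(flag_outlier(history, 160))   # True: outside the band, investigate
print(flag_outlier(history, 102))   # False: within the band
```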
Step 5 requires the implementation of a process to monitor process control activities.
This is an ongoing activity because internal control systems and the control environment
change over time. New management may step in, information systems may be upgraded, or
new personnel may need to be trained in the control policies and procedures. Monitoring
ensures that internal control continues to operate effectively through all these changes.
Examples of ongoing monitoring activities include the following:
• Communications from external parties either corroborate internally generated
information or indicate problems. For example, customers implicitly corroborate billing
data by paying their invoices. Customer complaints, by contrast, may signal billing
system deficiencies.
• Supervisory activities provide oversight of control functions and identification of
deficiencies. For example, review activities serving as a control over the accuracy and
completeness of cataloging record entries are routinely supervised. Alternatively,
duties of individuals are segregated so that employees serve as checks on each other.
This deters fraud because it inhibits the ability of a staff member to conceal suspect
activities.
• Data recorded by information systems are compared with physical assets. Inventories
of research materials are examined and counted periodically. The counts are
compared with accounting records, and differences are investigated.
• Operations personnel are requested to state whether certain control procedures, such
as reconciling specified physical amounts to recorded amounts of items in their
process, are regularly performed. Management or internal audit personnel may verify
such statements.
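The third monitoring activity above, comparing recorded data with physical counts, can be sketched in a few lines; the inventory figures are made up:

```python
# Sketch of ongoing monitoring: compare recorded quantities with periodic
# physical counts and surface the differences for investigation.
recorded = {"rare maps": 120, "manuscripts": 75, "photographs": 300}
counted  = {"rare maps": 120, "manuscripts": 73, "photographs": 301}

differences = {item: counted[item] - recorded[item]
               for item in recorded
               if counted.get(item) != recorded[item]}

print(differences)  # items whose physical counts differ from the records
```

Any non-empty result here is the trigger for the investigation the text describes.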

Achieving Compliance, Process and Controls Management (Sadiq et al. 2007)


Essentially, compliance is ensuring that business processes, operations and practices are in accordance with a prescribed and/or agreed set of norms. A recent report [4] identifies the gap between management focus on compliance-related issues and IT's lack of ability to implement the critical policies and procedures.
Currently there are two main approaches towards achieving compliance.



• First is retrospective reporting, wherein traditional audits are conducted for “after-the-fact” detection, often through manual checks by expensive consultants.
• A second and more recent approach is to provide some level of automation through automated detection. The bulk of existing software solutions for compliance follow this approach. The proposed solutions hook into a variety of enterprise system components (e.g. SAP HR, LDAP Directory, Groupware, etc.) and generate audit reports against hard-coded checks performed on the requisite system. A major issue with the above approaches (in varying degrees of impact) is the lack of sustainability.
• We believe that a sustainable approach for achieving compliance should
fundamentally have a preventative focus. As such, we envisage an approach that
provides the capability to capture compliance requirements through a generic
requirements modeling framework, and subsequently facilitate the propagation of
these requirements into business process models and enterprise applications, thus
achieving compliance by design.
Control activities are the policies and procedures the organization uses to ensure that
necessary actions are taken to minimize risks associated with achieving its objectives.
Controls have various objectives and may be applied at various organizational and functional
levels.
• Preventive controls focus on preventing an error or irregularity.
• Detective controls focus on identifying when an error or irregularity has occurred.
• Corrective controls focus on recovering from, repairing the damage from, or minimizing the cost of an error or irregularity.
General controls can be grouped as follows:
• Organizational or Personnel Controls
• Documentation Controls
• Asset Accountability Controls
• Management Practice Controls
• Information Center Operations Controls
• Authorization Controls
• Access Controls
Physical controls include:
• security over the assets themselves, limiting access to the assets to only authorized people and periodically reconciling the list of authorized people, and periodically reconciling the quantities on hand with the quantities recorded in the organization’s records.
Process Management and Controls Management are inter-related.



Business Compliance (Schumm et al., 2010)
Most compliance requirements originate from rather generic compliance documents. Compliance requirements may emerge from different sources and can take various forms. They may originate from legislation and regulatory bodies (such as Sarbanes-Oxley and Basel II), standards and codes of practice (such as ISO 9001) and/or business partner contracts. These documents can be ambiguous, and thus it is difficult to decide what exactly has to be changed in a business process in order to ensure its compliance with these requirements. Therefore, an appropriate model for capturing and specifying compliance requirements is needed. In particular, since some parts of such documents may not be relevant for a given process, this model needs to describe compliance requirements and correlate them with the business processes that must conform to them. Furthermore, since legislation and regulations tend to change over time, a link to the compliance source should be preserved. The conceptual model is depicted in Fig. 1.
A Compliance requirement is a constraint or assertion that results from the interpretation of the compliance sources, such as laws, regulations, policies, standards, contracts, etc. Failure to meet these requirements increases the likelihood of a compliance risk materializing, which in turn might impair the organization’s business model, reputation and financial condition. To mitigate these risks and ensure that compliance requirements are satisfied, an organization defines controls. A control describes the restraining or directing influence to check, verify or enforce rules to satisfy one or more compliance requirements. A Compliance rule is an operative definition of a compliance requirement which formally describes a control. A Compliance fragment is a connected process structure that can be used as a reusable building block for ensuring a faster and more consistent specification and integration of compliance into a process. Compliance fragments can be used to implement a compliance rule in terms of activities and control structures. A Compliance target is a generic specification, such as a business process or a compliance fragment, which is a target of compliance requirements. A user (compliance or business expert) can issue a compliance request to check whether a set of compliance targets conforms to a set of applicable compliance requirements. The purpose of a compliance request is to identify if and how a process can or should be changed to make it (more) compliant.

Figure 1. Conceptual model for compliance management

Table 1 gives an example of a compliance requirement regarding the appropriate segregation
of duties on the loan origination process.
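Such a segregation-of-duties requirement can be expressed as an executable compliance rule. The sketch below is an illustration only: the event record fields (case, activity, performer) and the activity names are hypothetical, not taken from the reader. It flags loan cases in which one person performed both conflicting activities.

```python
# Sketch of a segregation-of-duties compliance rule over event records.
# Field names and activity names are illustrative assumptions.

def violates_segregation(events, activity_a, activity_b):
    """Return case ids where the same performer did both conflicting activities."""
    performers = {}  # (case, activity) -> set of performers
    for e in events:
        performers.setdefault((e["case"], e["activity"]), set()).add(e["performer"])
    violations = []
    for case in sorted({e["case"] for e in events}):
        did_a = performers.get((case, activity_a), set())
        did_b = performers.get((case, activity_b), set())
        if did_a & did_b:  # overlap means one person performed both duties
            violations.append(case)
    return violations

events = [
    {"case": "L1", "activity": "originate loan", "performer": "alice"},
    {"case": "L1", "activity": "approve loan",   "performer": "bob"},
    {"case": "L2", "activity": "originate loan", "performer": "carol"},
    {"case": "L2", "activity": "approve loan",   "performer": "carol"},  # violation
]

print(violates_segregation(events, "originate loan", "approve loan"))  # ['L2']
```

A real compliance fragment would embed such a check into the process model itself, for example as a control activity before loan approval.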

References
Curtis, P., & Carey, M. (2012). Risk assessment in practice. Committee of Sponsoring Organizations
of the Treadway Commission. https://www.coso.org/Documents/COSO-ERM-Risk-Assessment-in-Practice-Thought-Paper-October-2012.pdf
ISACA. (2015). Certified in Risk and Information Systems Control (CRISC) Review Manual (Vol. 5).
Rolling Meadows, Illinois, USA: ISACA.
Marchewka, J. T. (2010). Information Technology Project Management (3rd ed.). Wiley & Sons, Asia.
Parente, S. (2018). Risk management made easy. PM World Journal. https://pmworldlibrary.net/wp-content/uploads/2018/06/pmwj71-Jun2018-Parente-risk-management-made-easy-umd-conference-paper.pdf
Price, L., & Smith, A. (2000). Managing Cultural Assets from a Business Perspective. Council on
Library and Information Resources, Washington, DC. https://www.clir.org/pubs/reports/pub90/appendix1.html
Proctor, P., & Smith, M. (2018). The Gartner Business Risk Model: A Framework for Integrating Risk
and Performance. Gartner, G00314696.
Rittenberg, L., & Martens, F. (2012). Understanding and Communicating Risk Appetite. COSO.
https://www.coso.org/Documents/ERM-Understanding-and-Communicating-Risk-Appetite.pdf
Rosemann, M., & Zur Muehlen, M. (2005). Integrating risks in business process models. ACIS 2005
Proceedings, 50.
Sadgrove, K. (2016). The Complete Guide to Business Risk Management. Routledge.
Sadiq, S., Governatori, G., & Namiri, K. (2007, September). Modeling control objectives for business
process compliance. In International Conference on Business Process Management (pp. 149-164).
Springer Berlin Heidelberg.
Schumm, D., Turetken, O., Kokash, N., Elgammal, A., Leymann, F., & Van Den Heuvel, W. J. (2010,
July). Business process compliance through reusable units of compliant processes.
In International Conference on Web Engineering (pp. 325-337). Springer Berlin Heidelberg.
Stallings, W., & Brown, L. (2015). Computer Security: Principles and Practice. Pearson, Boston.
Wei, Y. C., Wu, W. C., & Chu, Y. C. (2018). Performance evaluation of the recommendation
mechanism of information security risk identification. Neurocomputing, 279, 48-53.

BP Technology

BPMS (van der Aalst, 2013)


Figure 3 sketches the emergence of BPM systems [BPMS] and their role in the overall
information system architecture. Initially, information systems were developed from scratch;
that is, everything had to be programmed, even storing and retrieving data. Soon people
realized that many information systems had similar requirements with respect to data
management. Therefore, this generic functionality was subcontracted to a database
management (DBM) system. Later, generic functionality related to user interaction (forms,
buttons, graphs, etc.) was subcontracted to tools that can automatically generate user
interfaces. The trend to subcontract recurring functionality to generic tools continued in
different areas. BPM systems can be seen in this context: a BPM system takes care of
process-related aspects. BPM systems can be seen as an extension of Workflow
Management (WFM). BPM systems provide much broader support, for example, by
supporting simulation, business process intelligence, case management, and so forth.
However, compared to the database market, the BPM market is much more diverse and
there is no consensus on notations and core capabilities. This is not a surprise, as process
management is much more challenging than data management. However, WFM/BPM
technology is often hidden inside other systems. For example, ERP systems like SAP and
Oracle provide workflow engines. Many other platforms include workflow-like functionality.
For example, integration and application infrastructure software such as IBM’s WebSphere
and the Cordys Business Operations Platform (BOP) provides extensive process support.
Service-Oriented Computing (SOC) has had an incredible impact on the architecture of
process enactment infrastructures. The key idea of service orientation is to subcontract work
to specialized services in a loosely coupled fashion. In SOC, functionality provided by
business applications is encapsulated within web services, that is, software components
described at a semantic level, which can be invoked by application programs or by other
services through a stack of Internet standards including HTTP, XML, SOAP, WSDL, and UDDI
[102–107]. Once deployed, web services provided by various organizations can be
interconnected in order to implement business collaborations, leading to composite web
services. In a Service-Oriented Architecture (SOA) services are interacting, for example, by
exchanging messages. By combining basic services more complex services can be created
[103, 107].
Orchestration is concerned with the composition of services seen from the viewpoint of a
single service (the “spider in the web”). Choreography is concerned with the composition of
services seen from a global viewpoint focusing on the common and complementary
observable behavior. Choreography is particularly relevant in a setting where there is no
single coordinator. The terms orchestration and choreography describe two aspects of
integrating services to create end-to-end business processes. The two terms overlap
somewhat and their distinction has been heavily discussed over the last decade. SOC and
SOA can be used to realize process enactment infrastructures. Processes may implement
services and, in turn, may use existing services. All modern BPM/WFM systems provide
facilities to expose defined processes as services and to implement activities in a process by
simply calling other services.
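To illustrate orchestration in the sense described above, the sketch below composes two services under a single coordinator. The service names and logic are invented for illustration; a real orchestrator would invoke remote web services (e.g., over SOAP or REST) rather than local functions.

```python
# Sketch of service orchestration: one coordinator ("spider in the web")
# controls the sequence of service invocations. Service names are hypothetical.

def check_credit(customer):
    # Placeholder credit-check service: reject only a blocked customer.
    return customer != "blocked"

def book_order(customer, item):
    # Placeholder booking service: return a confirmation record.
    return {"customer": customer, "item": item, "status": "booked"}

def order_process(customer, item):
    """The composite service: orchestrates the two basic services in sequence."""
    if not check_credit(customer):       # step 1: invoke the credit service
        return {"customer": customer, "item": item, "status": "rejected"}
    return book_order(customer, item)    # step 2: invoke the booking service

print(order_process("acme", "nails"))
```

In a choreography, by contrast, there would be no `order_process` coordinator: each service would react to messages from its peers according to a globally agreed protocol.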

BPMS (Harmon, 2015)


A major change has occurred in this decade. Business people have realized that IT is no
longer a support service but an integral element in the company’s strategy. IT managers, for
their part, have decided to stop focusing on technology and support as such, and to focus,
instead, on how they help implement business processes. In essence, the description of the
goals and workings of business processes has emerged as the common language that both
business executives and IT managers speak. This reorientation has, in turn, led to a
sweeping reconsideration of how IT supports business managers and to the development of
integrated packages of business process management software suites. Software tools that,
a decade ago, would have been described as workflow, business intelligence, rules engines,
or enterprise application integration tools are now being integrated together and spoken of
as BPMS products (Khan 2004).
In 2003, Howard Smith and Peter Fingar wrote Business Process Management as a
clarion call for companies to develop and use BPMS products to automate and manage their
business processes. Smith and Fingar envisioned a world in which business managers
would be able to glance at computer screens and see how their business processes were
performing, and then, as needed, modify their processes to respond better to the evolving
business situation. In other words, BPMS was to be a new type of software – a layer of
software that sat on top of other software and managed all the people and software
elements required to control major business processes. It is worth stepping back and asking
to what degree that vision has been realized.
With a few exceptions, the BPMS software market has not evolved from scratch. Instead, the
BPMS vendors were already in existence, offering workflow, documentation, rules engines,
enterprise application integration (EAI), business intelligence (BI), or even ERP applications.
Vendors from each of these older software domains have rushed to modify and expand their
software products to incorporate capabilities associated with an evolving idea of what a
BPMS product might include. Thus, workflow vendors have added EAI and vice versa. Most
vendors have added a rule capability and incorporated BI (zur Muehlen 2004).
If there is a major difference between today’s “BPMS” applications and EAI or workflow
applications that would have been built in 2000, it lies in the fact that today’s EAI and
workflow systems are built to take advantage of the Internet and, increasingly, a Service
Oriented Architecture (SOA). Elementary SOA projects can be done without reference to
BPM, but sophisticated SOA projects, to be of value to the company, must be integrated with
a deep understanding of the organization’s business processes. Indeed, it is the emphasis

on SOA, and the role that SOA infrastructure plays in the thinking of the leading platform
vendors, that explains their growing support for BPM and BPMS.
The new emphasis on BPMS and SOA, as the two sides of the same coin, is a mixed
blessing for the BPM community. It has attracted the interest of the platform vendors and
driven their commitment. At the same time, it has led them to emphasize the more technical
aspects of BPMS and make discussions of BPMS sound more and more like discussions of
enterprise integration. BPM and BPMS need not get lost when the discussion turns to SOA,
but they often do (Inaganti 2007). Or, more correctly, they get relegated to a very secondary
role. As in too many IT discussions in the past, SOA developers are inclined to simply ask
the business people for “their requirements” and then move on to the serious and complex
work involved in creating the infrastructure environment.

Business Process Software Tools (Harmon, 2016)


Business Process Modeling Tools (BP Modeling Tools)
Business Process Modeling tools are designed not only to define and document business
processes, but also to store information about the processes so that they can be easily
updated and maintained.
Companies that move beyond isolated process change efforts and decide to define
enterprise-wide process architectures almost always shift to one of these tools. They are
more difficult to learn but the benefits they provide far outweigh the effort required.
Organization Modeling Tools
Many of the BP Modeling tools include features that allow users to create models of their
organization. In essence, these models are very high-level views of how the organization
interacts with its environment, what value chains and major business processes it supports,
and how high-level processes are aligned with various types of enterprise resources. Many
of the BP Modeling tools include these capabilities and some tools specialize in
Organization Modeling.
Business Process Simulation Tools
Most BP Modeling tools include simulation capabilities. In addition, there are some tools that
are especially designed for more demanding simulation work. Most BP Modeling teams turn
to specialists to undertake simulation studies, and those specialists often prefer the more
sophisticated Simulation Tools.
Business Process Management Suites or Systems (BPMS Tools)
These tools combine process modeling with runtime execution. In essence, they combine
features previously found in workflow and EAI (Enterprise Application Integration) products.
In some cases the tools also incorporate Rule Management and Process Monitoring
capabilities. These tools are newer and are just beginning to gain a foothold in most
companies. In the long run, they promise to help companies create a process layer between
those who define and manage processes and the software resources used to implement
processes.
BPM Applications
In essence, BPM Suites are tools that one uses to create BPM Applications. A BPM
Application is an application that is used to manage all of the people and software systems
used to implement a specific process. Whenever the organization is called upon to execute
a specific process, it relies on the BPM Application to manage the execution. In a few years,
as BPMS becomes more widely used, we expect to see BPM Applications offered with
BPMS built in. Conversely, we expect ERP and CRM vendors to offer BPM Applications
especially designed to integrate with their current ERP or CRM modules. A BPMS is only a
tool for building a BPM Application. A BPM Application is an application designed to execute
a specific process, with BPMS built in to enable managers to modify the application as
needed.
Business Process Monitoring Tools
Most BPMS tools offer some process monitoring capabilities. They tend, for example, to
provide information about process events to the process supervisors. Other BPMS tools,
and more sophisticated monitoring tools, combine data from specific processes with
information derived from other sources in a Data Warehouse and then use simulation
techniques or Business Intelligence (BI or Data Mining) techniques to extract patterns from
the data and to report information to executives via
Executive Dashboards in something close to real-time. These tools are sometimes called
Business Activity Monitoring (BAM) tools.
Rule Management Tools
Most BP Modeling tools allow analysts to identify and save business rules. Most BPMS tools
incorporate rule management tools that at least allow for the identification of business rules
used in specific business processes. In some cases the Rule Management tools can be
used to actually analyze business rules at runtime and generate or suggest decisions using
logical inferencing techniques.
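To make the runtime idea concrete, here is a minimal sketch of data-driven (forward-chaining) rule evaluation. The rules and fact names are invented for illustration; production rule engines add rule authoring, governance and far more efficient matching algorithms.

```python
# Sketch of naive forward-chaining over if-then business rules.
# Rule contents and fact names are illustrative assumptions.

rules = [
    ({"gold_customer"}, "discount_eligible"),              # if gold customer, eligible
    ({"order_over_1000"}, "manager_review"),               # large orders need review
    ({"discount_eligible", "order_over_1000"}, "apply_discount"),
]

def infer(facts, rules):
    """Repeatedly fire rules whose conditions hold until no new facts appear."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"gold_customer", "order_over_1000"}, rules))
```

Note that `apply_discount` is derived only after `discount_eligible` has been inferred in an earlier pass, which is exactly the kind of chained conclusion a rule engine produces at runtime.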

BPMS (Harmon, 2018)


Early BPM tools were basically modeling tools that included some monitoring and some
management capabilities. As time has passed, these tools have become much more
complex, adding business rules, database management capabilities, and more extensive
real-time process change capabilities. With all the additional features, some of them only
poorly integrated, the tools have often been hard to understand and use. There has also
been a lot of consolidation in the market. Today, there is, in effect, a dual market with a few
larger vendors offering very powerful tools, and another group of vendors offering simpler
tools that are easier for beginners to use. At the same time, the rapidly evolving IT market
puts everyone under stress to upgrade their tools to accommodate new technologies, like
cloud computing, Robotic Process Automation and other Artificial Intelligence (AI) features.
New vendors keep appearing even as older vendors are acquired and new generations of
popular tools keep being offered.

Business Rules Management Systems (Vashisth et al., 2019)


Business rules management system (BRMS) — A traditional BRMS is the predecessor to
the DMS. A BRMS provides a collection of design time and runtime software that enables an
enterprise to explicitly define, analyze, audit and maintain a wide variety of business rules.
BRMSs include a business rule engine (BRE) or code generator that provides an execution
mechanism for managed rules, optionally using AI techniques such as goal-driven or data-
driven reasoning. BRMSs are evolving into DMSs to support the growing use of analytics in
operational decision-making processes, and to leverage new design methodologies that are
centered on decision modeling.
Decision management suite — A DMS is a collection of design time and runtime software
that enables an enterprise to explicitly define, analyze, audit and maintain a wide variety of
decision-making software services, including business rules and various types of analytics.
Increasing interest in predictive analytics such as machine learning (ML) for operational
decision making has precipitated the evolution of the BRMS to the DMS. DMSs generally
have more support for decision modeling than BRMSs.
Alternative Software
The common alternative to a DMS is to develop the business logic for decision making using
conventional application development or business process management (BPM) tools.
Simple decisions can be easily implemented through standard, procedural programming
languages such as Java. For larger sets of rules, some companies build their own custom
rule engines using soft-coding constructs such as database tables, to hold decision
parameters. This enables them to dynamically modify some aspects of decision logic without
recompiling and redeploying the application. However, this does not provide the business-
friendly decision authoring experience, governance, monitoring or documentation provided
by a DMS. Custom “homegrown” rule engines are also less practical for decisions that
require a sequence of multiple steps, such as multiple rulesets or one or more analytical
scoring steps. Maintaining a custom rule engine can also be burdensome over time,
especially if the original developers are no longer available. Moreover, if decision logic
changes frequently, this approach can be less efficient than using a DMS. Complex
decisions with multiple steps can be hard to design and document in a low-code tool unless

it is used with a DMS or at least a decision modeling tool. Application development projects
that use an intelligent business process management suite (iBPMS) typically do not need a
DMS product. Depending on the specific product, a typical iBPMS includes most or all of the
features of a DMS.
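The soft-coding approach mentioned above can be as simple as keeping decision parameters in a table that the application reads at runtime. The sketch below is hypothetical (the parameter values and the in-memory table standing in for a database table are invented), but it shows why the logic can change without recompiling or redeploying:

```python
# Sketch of soft-coded decision logic: thresholds live in a data table
# rather than in code. Values are invented; a real system would read the
# table from a database.

credit_limits = [
    # (minimum score, limit) rows, ordered from strictest to most generous
    (750, 20000),
    (650, 10000),
    (0, 2000),
]

def decide_limit(score, table):
    """Return the limit for the first row whose minimum score is met."""
    for min_score, limit in table:
        if score >= min_score:
            return limit
    return 0

print(decide_limit(700, credit_limits))  # 10000
# Updating the table (e.g., via a database) changes the decision at runtime:
credit_limits[1] = (650, 12000)
print(decide_limit(700, credit_limits))  # 12000
```

What the table lookup cannot give, as the text notes, is business-friendly authoring, governance, monitoring or multi-step decision sequences; those are the capabilities a DMS adds.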

Market Guide for Intelligent BPMS (Srivastava et al., 2020)


Gartner defines the intelligent business process management suite (iBPMS) market as the
group of vendors offering licensed software that supports the full cycle of business process
and decision discovery, analysis, design, implementation, execution, monitoring and
optimization. An iBPMS offering consolidates process discovery, modelling, integration
services, decision management, process orchestration and choreography, and advanced
analytics. The iBPMS market primarily caters to IT leaders, enterprise architects, process
architects, process analysts and process owners for both optimization and transformation of
business processes. These platforms contain a solid foundation of tools for orchestrating,
choreographing and automating end-to-end business processes and tasks within those
processes.
The promise of iBPMS platforms is to enable management of the full life cycle of complex,
long-running and sometimes unstructured business processes that cut across organizational
boundaries. All iBPMS platforms provide process orchestration, choreography and
modelling, and support model-driven development. Developer personas can include citizen,
business unit and professional developers, as well as specialist developers for data and
decision design, integration, event processing/streaming analytics and predictive
analytics/ML. Based on client interactions, secondary research and a vendor survey, we
have identified the key capabilities desired in iBPMS platforms (see Figure 1).

The market for iBPMS software is mature, and has a number of entrenched vendors facing
competition from vendors in adjacent markets such as low-code, RPA, process mining and
iPaaS. As per Gartner’s estimates, in 2019, the business process management suite market
was $2.8 billion in size. By 2024, Gartner estimates this market to reach $2.9 billion. Due to
economic turbulence resulting from the COVID-19 pandemic, there is a clear shift toward
demand for iBPMS providers who have agility, are cost-effective, and enable citizen
developers and project teams to manage their broken business processes.
Increasing Focus on Low-Code Automation. Numerous vendors within this market have
shifted their focus to provide low-code automation. Going beyond a focus on incremental
optimization of business operations, vendors are focusing on enabling organizations to
rapidly build model-driven applications to automate business processes. iBPMS vendors are
adding capabilities such as graphical UIs (to define data), process models and prebuilt UI
components (to create forms and applications), and providing prebuilt integration flows to
seamlessly connect to different applications. For many vendors, this remains marketing
messaging rather than a real capability (see Magic Quadrant for Enterprise Low-Code
Application Platforms).
Addition of Process Mining and RPA Capabilities. Vendors in this market are
increasingly adding process mining and RPA capabilities to their platforms, either natively or
through integration with specialist vendors. For RPA, iBPMS vendors are also adding
capabilities to orchestrate bots from different RPA providers (see Magic Quadrant for
Robotic Process Automation and Market Guide for Process Mining).

Context-Aware Business Process Management (BPM) (vom Brocke et al., 2021)

The BPM Context Matrix is presented in Figure 1.

Variability is expressed as the degree to which a process can or should respond to internal
and external dynamics (Feldman & Pentland, 2003; Mertens & Recker, 2020). We observed
that some process groups need variability (e.g., R&D processes, which differ according to
the goal, timeline, and people involved). Other processes, such as those prevailing in Audit
and Finance, should not be variable at all.

Frequency reflects how often the process is carried out (Lillrank, 2003). We observed that
some processes are performed often, and others are performed once per month or year.
Process executions may be more similar when they occur often (Goh & Pentland, 2019), and
some processes need to follow a specific sequence of steps. Audit and finance processes,
for example, need to conform to some defined standard, in contrast to R&D processes, which
tend to occur rather rarely but usually deviate from detailed guidelines and standards.

Combining these two dimensions (variability and frequency), we developed a 4-quadrant
matrix. We refer to this as the BPM Context Matrix. Each quadrant represents a process
cluster, recognizing processes with a specific set of requirements to be managed
successfully. We have assigned intuitive names to these process clusters (as shown in
Figure 1): Performance, Innovation, Reliability, and Agility.

• Performance Cluster: Processes occurring with high frequency and low variability. This
cluster is about processes which are performed very often (high frequency). Each

performance should proceed in more or less the same way (low variability). Consider a
production process. Ideally, the outcome of such a process is always the same, and the
way of production usually does not change. As an example, we can consider the
production of nails, which are identical throughout the same batch, fulfill the same
function, and therefore have a low variability. However, such a production process
occurs frequently.
• Innovation Cluster: Processes occurring with low frequency and high variability.
Processes that belong to the Innovation Cluster require a high degree of creativity. Much
of what happens in these processes cannot be anticipated or prescribed. These
processes occur rather rarely (low frequency). However, if such innovation processes
are executed, they usually run differently after each iteration (high variability). An
example of this are R&D processes. Since the outcome of such processes is usually
uncertain and not clear in detail from the beginning, they exhibit a high degree of
variability. However, the frequency with which such processes are performed is rather
low.
• Reliability Cluster: Processes occurring with low frequency and low variability. This
cluster is about processes which are performed very rarely (low frequency). When they
are performed, however, the execution should be more or less the same (low variability).
Consider the preparation of a tax return. This process is typically structured in the
same way and is usually carried out once a year. Consistency and reliability are key, not
only for reasons of compliance but also to ensure that information is integrated when it is
needed. Since tax returns usually have to be filed once a year (low frequency) and are
usually always done in the same way (low variability), this type of process can be
assigned to the Reliability Cluster.
• Agility Cluster: Processes occurring with high frequency and high variability. In the Agility
Cluster, we find processes that run frequently (high frequency) and, at the same time,
exhibit a strong potential to deviate across process executions (high variability). These
processes are knowledge-intensive and draw on the experience and knowledge of those
who are involved in the processes (Badakhshan et al., 2019). Process execution is
characterized by improvisation. We assume that we often have to deal with complex
issues in the Agility Cluster. One example is the talent acquisition process. The way in
which new employees are acquired may be similar in its basic steps, but the exact
implementation varies depending on the applicant (the talent) and the open position.
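The two dimensions combine into the four clusters mechanically; a small sketch makes the mapping explicit. The binary high/low coding is an assumed simplification of the matrix, since in practice frequency and variability are matters of degree.

```python
# Sketch of the BPM Context Matrix as a lookup: a process's (frequency,
# variability) rating determines its cluster. The high/low coding is assumed.

def context_cluster(frequency, variability):
    """frequency and variability are each 'high' or 'low'."""
    clusters = {
        ("high", "low"): "Performance",
        ("low", "high"): "Innovation",
        ("low", "low"): "Reliability",
        ("high", "high"): "Agility",
    }
    return clusters[(frequency, variability)]

print(context_cluster("high", "low"))   # Performance (e.g., nail production)
print(context_cluster("low", "high"))   # Innovation (e.g., R&D)
print(context_cluster("low", "low"))    # Reliability (e.g., tax return)
print(context_cluster("high", "high"))  # Agility (e.g., talent acquisition)
```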

The BPM Context Matrix can also be used to inform and guide the selection of relevant digital
technologies, such as process mining and robotic process automation (RPA).

• To give a few examples, process mining has been identified as a promising means to
advance companies' process management approaches (Grisold et al., 2021), but many
organizations find it challenging to identify value-adding application areas in which to start.
Using the BPM Context Matrix, it becomes obvious that high-frequency processes are the
ones that allow for meaningful results from process mining, as these processes provide a
sufficient amount of digital trace data to be analyzed. Further, the cluster particularly
interesting for process mining is the Agility Cluster: if we have sufficient data, we can
pinpoint the high variability of process executions. According to our study, managing
processes in the Agility Cluster should actually aim at “challenging” the variability, that is,
investigating whether the variability creates value. Guiding questions can be: Is it
necessary? Is it value-adding? Or is the variability avoidable and preventable?
• To give another example: Robotic Process Automation (RPA) can have immediate
implications for processes associated with the performance cluster. Given the high
frequency and low variability, standardization and automation are management
imperatives in this cluster in order to make processes more efficient and effective.
Hence, RPA can be particularly useful to automate recurrent steps in the performance

cluster (van der Aalst et al., 2018). In the innovation cluster, on the other hand,
standardization and automation might actually limit people's capabilities in finding
solutions to new problems. Hence, we do not consider it necessary to document detailed
steps of a process belonging to the Innovation Cluster. This would also restrict the
process users in their creative work. An example can be a product design process where
designers take new actions which respond to the specific needs of a given project
(Seidel et al., 2010). Support can be provided by means of project management or
messaging systems, which afford knowledge sharing and process transparency. For the
reliability cluster, it is key to “reinforce” the desired process performance, as such processes
occur fairly seldom but, when they do occur, standard procedures need to be followed. Here,
for instance, workflows and templates can guide process performers.

Process Mining (Robledo, 2018)

Process Mining is a process analysis method that aims to discover, monitor and improve
real processes (not assumed processes) by extracting knowledge from the event logs
readily available in an organization's current information systems. It goes beyond the pure
presentation of key process data by recognizing the contextual relationships of processes
and presenting them in the form of graphical analyses, in order to diagnose problems
and suggest improvements in the quality of process models. With Process Mining it is
possible to detect or diagnose problems based on facts rather than conjecture or
intuition. Process mining confronts event data (observed behavior) with process models
(hand-made or automatically discovered). Through this pairing of event data and process
models, it is possible to check compliance, detect deviations, predict delays, support
decision making and recommend process redesigns.

Gartner began reporting on process mining in 2008 as “Automated Business Process
Discovery” (ABPD). The Institute of Electrical and Electronics Engineers (IEEE) Task
Force on Process Mining, established in 2009, published a Manifesto in late 2011 to
promote process mining as a new tool to improve the (re)design, control and support of
business operating processes. Around the same time, one of the fathers of process
mining, Professor Wil van der Aalst, published the first book on process mining in 2011,
since updated under the title “Process Mining: Data Science in Action”, published by
Springer. The application of Process Mining in an organization offers the
following capabilities:
• Automated discovery of process models, exceptions and instances of processes
(cases) together with basic frequencies and statistics.
• Automated discovery and analysis of customer interactions, as well as alignment with
internal processes.

• Understanding of different perspectives on operations, not just a process
perspective.
• Monitoring of key performance indicators using dashboards in real time.
• Compliance verification capabilities and gap analysis.
• Predictive analysis, prescriptive analysis, scenario testing and simulation with
contextual data.
• Improvement of existing or previous process models using additional data from
saved records.
• Data preparation and data cleansing support.
• Combination of different process models that interact with each other in a single
process mining panel.
• Support for the visualization of how processes contribute to business value (such as
business operating models) — contextualization of processes.
• Effective cooperation between Business and IT.
• Standardization of business processes.
• Improvement of operational excellence by optimizing processes.

Process mining exploits the information recorded in event logs to perform an analysis of the
real process afterwards. There are three main types of process mining:
1. Discovery, which takes an event log and produces a process model without using
any prior information, only with the help of Process Mining algorithms.
2. Conformance, where the event records (real processes) and the corresponding
process models (ideal, predefined processes, e.g., in BPMN) are compared and the
resulting coincidences or differences are identified, in order to diagnose deviations
or inefficiencies between the business process derived from the log and the ideal
process.
3. Enhancement (extension), where the process models are adapted and improved
according to the data of the real process.
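The discovery type above can be illustrated with a minimal sketch. Real discovery algorithms (e.g. the Alpha algorithm or the Inductive Miner) produce Petri nets or BPMN models; the simplest discovery artefact is a directly-follows graph, shown here over a hypothetical event log with invented activity names:

```python
from collections import Counter

# Hypothetical event log: each trace is the ordered sequence of
# activities executed for one case (e.g. one customer order).
event_log = [
    ["register", "check", "approve", "ship"],
    ["register", "check", "reject"],
    ["register", "check", "approve", "ship"],
]

def discover_dfg(traces):
    """Discover a directly-follows graph: count how often activity a
    is immediately followed by activity b across all traces."""
    dfg = Counter()
    for trace in traces:
        for a, b in zip(trace, trace[1:]):
            dfg[(a, b)] += 1
    return dfg

dfg = discover_dfg(event_log)
print(dfg[("register", "check")])  # 3: every case starts register -> check
print(dfg[("check", "approve")])   # 2: two of the three cases were approved
```

The frequencies attached to each arc are the "basic frequencies and statistics" that discovery tools report alongside the model; full algorithms then convert such a graph into a sound process model.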

Process Discovery Review (Augusto, 2018)


Modern information systems maintain detailed trails of the business processes they support,
including records of key process execution events, such as the creation of a case or the
execution of a task within an ongoing case. Process mining techniques allow analysts to
extract insights about the actual performance of a process from collections of such event
records, also known as event logs [1]. In this context, an event log consists of a set of traces,
each trace itself consisting of the sequence of events related to a given case. One of the
most widely investigated process mining operations is automated process discovery. An
automated process discovery method takes as input an event log, and produces as output a
business process model that captures the control flow relations between tasks that are
observed in or implied by the event log. In order to be useful, such automatically discovered
process models must accurately reflect the behavior recorded in or implied by the log.
Regarding the modeling languages of the discovered process model, we notice that Petri
nets is the predominant one. However, more recently, we have seen the appearance of
methods that produce models in BPMN, a language that is more practically-oriented and
less technical than Petri nets. This denotes a shift in the target audience of these methods,
from data scientists to practitioners, such as business analysts and decision managers.
Other technical languages employed, besides Petri nets, include Causal nets, State
machines and simple Directed Acyclic Graphs.
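The event-log structure described above (a set of traces, each trace the time-ordered sequence of events for one case) can be made concrete with a small sketch, using invented case ids and activities; real logs are typically stored in the XES format and carry many more attributes per event:

```python
# Hypothetical raw event records as an information system might log them:
# (case id, activity, timestamp). An event log groups these into traces,
# one ordered sequence of events per case.
events = [
    ("order-2", "check",    "2024-03-01T10:05"),
    ("order-1", "register", "2024-03-01T09:00"),
    ("order-2", "register", "2024-03-01T09:30"),
    ("order-1", "check",    "2024-03-01T09:40"),
]

def build_event_log(events):
    """Group raw events by case id into traces, ordered by timestamp."""
    traces = {}
    for case, activity, ts in events:
        traces.setdefault(case, []).append((ts, activity))
    # Sort each trace by timestamp, keeping only the activity names.
    return {case: [a for _, a in sorted(evs)] for case, evs in traces.items()}

log = build_event_log(events)
print(log["order-1"])  # ['register', 'check']
```

This grouping step is the input-preparation stage that every automated discovery method assumes has already happened.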

Scalable process discovery and conformance checking (Leemans et al., 2018)


Considerable amounts of data are collected and stored by organisations nowadays. For
instance, ERP systems log business transaction events, high-tech systems such as X-ray
machines record software and hardware events, and web servers log page visits. Typically,
each action of a user executed with the system, e.g. a customer filling in a form or a machine
being switched on, can be recorded by the system as an event; all events related to the same
process execution, e.g. a customer order or an X-ray diagnosis, are grouped in a trace
(ordered by their time); an event log contains all recorded traces of the system. Process
mining aims to extract information from such event logs, for instance social networks,
business process models, compliance to rules and regulations, and performance information
(e.g. bottlenecks) [46]. In this paper, we
focus on two process mining challenges: process discovery and conformance checking.
Figure 1 shows the context of these two challenges: a real-life business process (a system)
is running, and the executed process steps are recorded in an event log.
In process discovery, one assumes that the inner workings of the system are unknown to the
analyst and cannot be obtained otherwise. Therefore, process discovery aims to learn a
process model from an event log, which describes the system as it actually happened (in
contrast to what is assumed to have happened) [50]. Two main challenges exist in process
discovery: first, one would like to learn an easy-to-understand model that captures the actual
behaviour. Second, the model should have a proper formal interpretation, i.e. have well-
defined behavioural semantics and be free of deadlocks and other anomalies (be sound)
[28]. Few existing algorithms solve both challenges together.
In contrast, conformance checking studies the differences between a process model and
reality. We distinguish two types of conformance checking. First, the model can be
compared with a log. Such log conformance checking can provide insight into the real
behaviour of an organisation, by highlighting which traces deviate from the model, and
where in the model deviations occur [50]. Second, the model can be compared with a model
of the system (but only if such a model is available). Model conformance checking can be
used to highlight differences between different snapshots of a process or to verify that a
process model conforms to a design made earlier [19]. Moreover, model conformance
checking can be used to evaluate discovery algorithms by choosing a system model and
quantifying the similarity between this chosen model and the models discovered by
discovery algorithms.
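The log conformance checking described above can be sketched minimally. Production tools use token replay or alignments against a Petri net; here the "to-be" model is simplified to a hypothetical set of allowed directly-follows relations plus allowed start and end activities:

```python
# Hypothetical "to-be" model: allowed directly-follows relations,
# plus the activities a case may start or end with.
ALLOWED = {("register", "check"), ("check", "approve"),
           ("check", "reject"), ("approve", "ship")}
START, END = {"register"}, {"ship", "reject"}

def check_trace(trace):
    """Return a list of deviations of one recorded trace from the model."""
    deviations = []
    if trace[0] not in START:
        deviations.append(f"unexpected start: {trace[0]}")
    if trace[-1] not in END:
        deviations.append(f"unexpected end: {trace[-1]}")
    for a, b in zip(trace, trace[1:]):
        if (a, b) not in ALLOWED:
            deviations.append(f"disallowed step: {a} -> {b}")
    return deviations

print(check_trace(["register", "check", "approve", "ship"]))  # []
print(check_trace(["register", "approve", "ship"]))
# ['disallowed step: register -> approve']  (the check was skipped)
```

A conforming trace returns no deviations; a deviating one pinpoints where in the model the deviation occurs, which is exactly the insight log conformance checking is meant to provide.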

How companies can prepare for the coming “AI-first” world (Davenport &
Mittal, 2023)
Some of the most successful and technologically advanced organizations in the world have
declared their intention to be all-in on artificial intelligence – “AI fueled.” Google described it
as “a world that is AI-first, where computing becomes universally available—at home, at
work, in the car or on the go—and interacting ... becomes much more natural and intuitive,
and above all, more intelligent.” Companies aiming to be AI-intensive in a variety of
industries share the goal of intuitive technology and pervasive intelligence and are applying
those objectives in their sectors, such as financial services, manufacturing or health care.
To achieve substantial value from AI, executives should consider deploying AI tools
systematically across every key function and enterprise operation to support new business
process designs and data-driven decision-making. Likewise, AI should drive new product
and service offerings and business models. Today, using AI in this aggressive fashion can
confer industry leadership. Eventually, it may become simply table stakes for survival.

Many companies are using rules-based robotic process automation (RPA) to automate back-
office structured workflows, but increasing numbers are combining RPA with machine
learning to enhance their decision-making. Virtual reality and other forms of simulations,
digital twins and metaverses are technologies that employ various forms of AI and are likely
to become more widely adopted in the future.

In contrast to the business process reengineering movement of the early 1990s, where
headcount reduction was the primary driver, the emphasis in the era of AI is on
augmentation. While many have predicted that AI would replace humans, AI-powered
companies see the primary goal as discovering how to get the best out of both by
redesigning jobs, reskilling workers and becoming more efficient and effective in the
process. The closest connection between traditional process improvement and AI is rules-
based robotic process automation (RPA). For example, retirement and financial services
firm, Voya, has embedded an automation center of excellence within its Continuous
Improvement Center, which generally uses Lean and Six Sigma methods. A few companies
have effectively combined process reengineering and forms of AI other than RPA. DBS
Bank, for example, used AI to enable major process improvements in its anti-money-
laundering (AML) efforts, as well as in its customer centers in India and Singapore. For
companies exploring how AI can enable dramatic improvements in business
processes, a new AI-powered technology called “process mining” takes much of the
detailed work out of process improvement. It is catching on rapidly in many process-oriented
companies.

Process Mining meets Artificial Intelligence and Machine Learning (Veit et al.,
2017)
Process mining is a technique for the reconstruction, analysis, and improvement of business
processes using recorded event data from transactional IT systems [1,2,3]. Applied process
mining analysis usually starts with the explorative investigation of the process model
reproduced from the raw event data. This manual discovery aims to identify common
process deviations, undesired patterns, and sources of inefficiencies [2]. To better
understand business operations, the examined process data is often extended by additional
information beyond the event data (e.g. the net value or quantity of a purchase order). This
extended data model enables users to conduct advanced analyses to get deeper insights
into the process (e.g. creating an OLAP table including different dimensions and KPIs
derived from the raw process data). The findings from the discovery phase trigger process
improvements such as reductions in throughput times and manual work, the elimination of
inefficiencies, or the enhancement of process quality [5,6].
This explorative mining approach is a powerful tool but requires the user to manually
investigate the process data and also to have an ex-ante hypothesis on where to shine the
light in order to see where processes work well or poorly. Hence, the process mining user is
the driving force, who actively learns from the data. The new Proactive Insights Engine (PI)
overcomes this user-centric procedure through the combination of process mining and
machine learning capabilities. This new technology provides highly smart and fully
automated insights into business processes. PI automatically analyzes processes, uncovers
hidden problems, and reveals prescriptive recommendations on how to improve them in real
time. It understands workflows and draws conclusions from them, automatically conducts
research on root causes of process violations, and provides recommendations for action.
Therefore, PI enlarges previous process mining solutions by going from an explorative
process discovery to an intelligent and fully automated process analysis. PI covers four main
components:
PI Conformance. Compares the actual ‘as-is’ process with the documented ‘to-be’ process
(conformance checking). It automatically identifies the highest priority issues and their root
causes. Based on these findings, users can take immediate action. Existing process models
can be imported (e.g. as BPMN file) to test their conformance. Alternatively, users can
create and edit process models directly in the application using the built-in Business Process
Modeler. The conformance checker automatically reveals a list of process violations, drills
them down to their root causes, and makes suggestions for how to fix them. The software
develops these recommendations based on the process data model and a combination of
statistics and usage data via indicators for statistical significance on the one hand and for
relevance from a business perspective on the other hand.
PI Machine Learning. Natively integrates sophisticated machine learning and statistical
algorithms into all Celonis analyses, with full support for the R scripting language.
There are two ways for executing R-statements in the application. RCALL returns exactly
one value for each line of input. RAGG operates on groups and returns one value for each
aggregation. The ability to integrate any R library or statement into Celonis allows users
to apply advanced prediction techniques and machine learning algorithms to their
process analysis. Historic process data and the findings from process discovery serve as an
input to create predictions of the future. For example, it can be predicted how ongoing cases
will flow through the process until their completion (predictive process monitoring).
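The idea behind such predictive process monitoring can be sketched very naively: estimate, from hypothetical historical traces, which activity most often follows each activity, then extend an ongoing case until a known end activity is reached. Real engines such as PI use far richer machine-learning models over many more features:

```python
from collections import Counter, defaultdict

# Hypothetical completed traces used as training data.
history = [
    ["register", "check", "approve", "ship"],
    ["register", "check", "approve", "ship"],
    ["register", "check", "reject"],
]
END = {"ship", "reject"}  # activities that terminate a case

# Count, for each activity, how often each other activity follows it.
successors = defaultdict(Counter)
for trace in history:
    for a, b in zip(trace, trace[1:]):
        successors[a][b] += 1

def predict_completion(prefix, max_steps=10):
    """Naive predictive monitoring: extend an ongoing case with the
    most frequent next activity until an end activity is reached."""
    path = list(prefix)
    while path[-1] not in END and len(path) < len(prefix) + max_steps:
        path.append(successors[path[-1]].most_common(1)[0][0])
    return path

print(predict_completion(["register", "check"]))
# ['register', 'check', 'approve', 'ship']
```

Because "approve" follows "check" in two of the three historical cases, the sketch predicts the ongoing case will be approved and shipped; a probabilistic model would instead return a distribution over possible completions.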
PI Social. Adds a further dimension, the social aspect of processes, to process mining.
Social maps process data to different teams and organizations to show how they interact
with each other. It identifies critical roles within the process, workload imbalances, and other
team inefficiencies. The visualization of the network of social process interactions uncovers
inefficiencies in organizational structures and the interactions among people involved in the
process.
PI Companion. Integrates the Celonis application into standard business applications. It
acts as a ‘process advisor’ and identifies recommendations at the time when critical
business decisions are made. This enables users to conduct process analysis while the
process is being executed, rather than analyzing processes after their completion. The new
certified addon interacts with SAP systems and supports decisions by using relevant data
from historical transactions. For example, users can check which vendor had the fastest
delivery record last month or customers’ payment behavior for well-grounded decisions on
payment terms.

References
Augusto, A., Conforti, R., Dumas, M., La Rosa, M., Maggi, F. M., Marrella, A., ... & Soo, A. (2018).
Automated discovery of process models from event logs: Review and benchmark. IEEE
Transactions on Knowledge and Data Engineering.
Davenport, T. H., & Mittal, N. (2023). How companies can prepare for the coming “AI-first” world.
Strategy & Leadership, 51(1), 26-30.
Harmon, P. (2016). The state of business process management 2016. Business Process Trends.
http://www.bptrends.com/bptrends-surveys/
Harmon, P. (2015). The scope and evolution of business process management. In Handbook on
business process management 1. Springer, Berlin, Heidelberg.
Harmon, P. (2018). The state of business process management 2018. Business Process Trends.
http://www.bptrends.com/bptrends-surveys/
Leemans, S. J., Fahland, D., & Van der Aalst, W. M. (2018). Scalable process discovery and
conformance checking. Software & Systems Modeling, 17(2), 599-631.
Robledo, P. (2018). Process Mining plays an essential role in Digital Transformation. Medium
Corporation. Available at: https://medium.com/@pedrorobledobpm/process-mining-plays-an-
essential-role-in-digital-transformation-384839236bbe
Srivastava, T., et al. (2020). Market Guide for Intelligent Business Process Management Suites.
Gartner G00732544.
Vashisth, S., Schulte, W. R., Clark, W., & Vincent, P. (2019). Innovation Tech Insight for Decision
Management. Gartner G00433988.
Van der Aalst, W. M. (2013). Business process management: A comprehensive survey. ISRN
Software Engineering, Vol. 2013.
Veit, F., Geyer-Klingeberg, J., Madrzak, J., Haug, M., & Thomson, J. (2017). The Proactive Insights
Engine: Process Mining meets Machine Learning and Artificial Intelligence. In BPM (Demos).
Vom Brocke, J., Weber, M., & Grisold, T. (2021). Class Notes: The BPM Context Matrix – A Framework
for Context-Aware Business Process Management (BPM). BP Trends.
https://www.bptrends.com/class-notes-the-bpm-context-matrix-a-framework-for-context-
aware-business-process-management-bpm/
